Jan 20 08:16:00 np0005588920 kernel: Linux version 5.14.0-661.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-69.el9) #1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026
Jan 20 08:16:00 np0005588920 kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Jan 20 08:16:00 np0005588920 kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 20 08:16:00 np0005588920 kernel: BIOS-provided physical RAM map:
Jan 20 08:16:00 np0005588920 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Jan 20 08:16:00 np0005588920 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Jan 20 08:16:00 np0005588920 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Jan 20 08:16:00 np0005588920 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Jan 20 08:16:00 np0005588920 kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Jan 20 08:16:00 np0005588920 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Jan 20 08:16:00 np0005588920 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Jan 20 08:16:00 np0005588920 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Jan 20 08:16:00 np0005588920 kernel: NX (Execute Disable) protection: active
Jan 20 08:16:00 np0005588920 kernel: APIC: Static calls initialized
Jan 20 08:16:00 np0005588920 kernel: SMBIOS 2.8 present.
Jan 20 08:16:00 np0005588920 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Jan 20 08:16:00 np0005588920 kernel: Hypervisor detected: KVM
Jan 20 08:16:00 np0005588920 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Jan 20 08:16:00 np0005588920 kernel: kvm-clock: using sched offset of 4026325199 cycles
Jan 20 08:16:00 np0005588920 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Jan 20 08:16:00 np0005588920 kernel: tsc: Detected 2800.000 MHz processor
Jan 20 08:16:00 np0005588920 kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Jan 20 08:16:00 np0005588920 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Jan 20 08:16:00 np0005588920 kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Jan 20 08:16:00 np0005588920 kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Jan 20 08:16:00 np0005588920 kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Jan 20 08:16:00 np0005588920 kernel: Using GB pages for direct mapping
Jan 20 08:16:00 np0005588920 kernel: RAMDISK: [mem 0x2d426000-0x32a0afff]
Jan 20 08:16:00 np0005588920 kernel: ACPI: Early table checksum verification disabled
Jan 20 08:16:00 np0005588920 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Jan 20 08:16:00 np0005588920 kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 20 08:16:00 np0005588920 kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 20 08:16:00 np0005588920 kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 20 08:16:00 np0005588920 kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Jan 20 08:16:00 np0005588920 kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 20 08:16:00 np0005588920 kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 20 08:16:00 np0005588920 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Jan 20 08:16:00 np0005588920 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Jan 20 08:16:00 np0005588920 kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Jan 20 08:16:00 np0005588920 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Jan 20 08:16:00 np0005588920 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Jan 20 08:16:00 np0005588920 kernel: No NUMA configuration found
Jan 20 08:16:00 np0005588920 kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Jan 20 08:16:00 np0005588920 kernel: NODE_DATA(0) allocated [mem 0x23ffd3000-0x23fffdfff]
Jan 20 08:16:00 np0005588920 kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Jan 20 08:16:00 np0005588920 kernel: Zone ranges:
Jan 20 08:16:00 np0005588920 kernel:  DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Jan 20 08:16:00 np0005588920 kernel:  DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Jan 20 08:16:00 np0005588920 kernel:  Normal   [mem 0x0000000100000000-0x000000023fffffff]
Jan 20 08:16:00 np0005588920 kernel:  Device   empty
Jan 20 08:16:00 np0005588920 kernel: Movable zone start for each node
Jan 20 08:16:00 np0005588920 kernel: Early memory node ranges
Jan 20 08:16:00 np0005588920 kernel:  node   0: [mem 0x0000000000001000-0x000000000009efff]
Jan 20 08:16:00 np0005588920 kernel:  node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Jan 20 08:16:00 np0005588920 kernel:  node   0: [mem 0x0000000100000000-0x000000023fffffff]
Jan 20 08:16:00 np0005588920 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Jan 20 08:16:00 np0005588920 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 20 08:16:00 np0005588920 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Jan 20 08:16:00 np0005588920 kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Jan 20 08:16:00 np0005588920 kernel: ACPI: PM-Timer IO Port: 0x608
Jan 20 08:16:00 np0005588920 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Jan 20 08:16:00 np0005588920 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Jan 20 08:16:00 np0005588920 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Jan 20 08:16:00 np0005588920 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Jan 20 08:16:00 np0005588920 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Jan 20 08:16:00 np0005588920 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Jan 20 08:16:00 np0005588920 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Jan 20 08:16:00 np0005588920 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Jan 20 08:16:00 np0005588920 kernel: TSC deadline timer available
Jan 20 08:16:00 np0005588920 kernel: CPU topo: Max. logical packages:   8
Jan 20 08:16:00 np0005588920 kernel: CPU topo: Max. logical dies:       8
Jan 20 08:16:00 np0005588920 kernel: CPU topo: Max. dies per package:   1
Jan 20 08:16:00 np0005588920 kernel: CPU topo: Max. threads per core:   1
Jan 20 08:16:00 np0005588920 kernel: CPU topo: Num. cores per package:     1
Jan 20 08:16:00 np0005588920 kernel: CPU topo: Num. threads per package:   1
Jan 20 08:16:00 np0005588920 kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Jan 20 08:16:00 np0005588920 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Jan 20 08:16:00 np0005588920 kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Jan 20 08:16:00 np0005588920 kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Jan 20 08:16:00 np0005588920 kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Jan 20 08:16:00 np0005588920 kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Jan 20 08:16:00 np0005588920 kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Jan 20 08:16:00 np0005588920 kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Jan 20 08:16:00 np0005588920 kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Jan 20 08:16:00 np0005588920 kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Jan 20 08:16:00 np0005588920 kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Jan 20 08:16:00 np0005588920 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Jan 20 08:16:00 np0005588920 kernel: Booting paravirtualized kernel on KVM
Jan 20 08:16:00 np0005588920 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Jan 20 08:16:00 np0005588920 kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Jan 20 08:16:00 np0005588920 kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Jan 20 08:16:00 np0005588920 kernel: kvm-guest: PV spinlocks disabled, no host support
Jan 20 08:16:00 np0005588920 kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 20 08:16:00 np0005588920 kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64", will be passed to user space.
Jan 20 08:16:00 np0005588920 kernel: random: crng init done
Jan 20 08:16:00 np0005588920 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Jan 20 08:16:00 np0005588920 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Jan 20 08:16:00 np0005588920 kernel: Fallback order for Node 0: 0 
Jan 20 08:16:00 np0005588920 kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Jan 20 08:16:00 np0005588920 kernel: Policy zone: Normal
Jan 20 08:16:00 np0005588920 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 20 08:16:00 np0005588920 kernel: software IO TLB: area num 8.
Jan 20 08:16:00 np0005588920 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Jan 20 08:16:00 np0005588920 kernel: ftrace: allocating 49417 entries in 194 pages
Jan 20 08:16:00 np0005588920 kernel: ftrace: allocated 194 pages with 3 groups
Jan 20 08:16:00 np0005588920 kernel: Dynamic Preempt: voluntary
Jan 20 08:16:00 np0005588920 kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 20 08:16:00 np0005588920 kernel: rcu: 	RCU event tracing is enabled.
Jan 20 08:16:00 np0005588920 kernel: rcu: 	RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Jan 20 08:16:00 np0005588920 kernel: 	Trampoline variant of Tasks RCU enabled.
Jan 20 08:16:00 np0005588920 kernel: 	Rude variant of Tasks RCU enabled.
Jan 20 08:16:00 np0005588920 kernel: 	Tracing variant of Tasks RCU enabled.
Jan 20 08:16:00 np0005588920 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 20 08:16:00 np0005588920 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Jan 20 08:16:00 np0005588920 kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 20 08:16:00 np0005588920 kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 20 08:16:00 np0005588920 kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 20 08:16:00 np0005588920 kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Jan 20 08:16:00 np0005588920 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jan 20 08:16:00 np0005588920 kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Jan 20 08:16:00 np0005588920 kernel: Console: colour VGA+ 80x25
Jan 20 08:16:00 np0005588920 kernel: printk: console [ttyS0] enabled
Jan 20 08:16:00 np0005588920 kernel: ACPI: Core revision 20230331
Jan 20 08:16:00 np0005588920 kernel: APIC: Switch to symmetric I/O mode setup
Jan 20 08:16:00 np0005588920 kernel: x2apic enabled
Jan 20 08:16:00 np0005588920 kernel: APIC: Switched APIC routing to: physical x2apic
Jan 20 08:16:00 np0005588920 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Jan 20 08:16:00 np0005588920 kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Jan 20 08:16:00 np0005588920 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Jan 20 08:16:00 np0005588920 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Jan 20 08:16:00 np0005588920 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Jan 20 08:16:00 np0005588920 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Jan 20 08:16:00 np0005588920 kernel: Spectre V2 : Mitigation: Retpolines
Jan 20 08:16:00 np0005588920 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Jan 20 08:16:00 np0005588920 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Jan 20 08:16:00 np0005588920 kernel: RETBleed: Mitigation: untrained return thunk
Jan 20 08:16:00 np0005588920 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Jan 20 08:16:00 np0005588920 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Jan 20 08:16:00 np0005588920 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Jan 20 08:16:00 np0005588920 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Jan 20 08:16:00 np0005588920 kernel: x86/bugs: return thunk changed
Jan 20 08:16:00 np0005588920 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Jan 20 08:16:00 np0005588920 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Jan 20 08:16:00 np0005588920 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Jan 20 08:16:00 np0005588920 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Jan 20 08:16:00 np0005588920 kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Jan 20 08:16:00 np0005588920 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Jan 20 08:16:00 np0005588920 kernel: Freeing SMP alternatives memory: 40K
Jan 20 08:16:00 np0005588920 kernel: pid_max: default: 32768 minimum: 301
Jan 20 08:16:00 np0005588920 kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Jan 20 08:16:00 np0005588920 kernel: landlock: Up and running.
Jan 20 08:16:00 np0005588920 kernel: Yama: becoming mindful.
Jan 20 08:16:00 np0005588920 kernel: SELinux:  Initializing.
Jan 20 08:16:00 np0005588920 kernel: LSM support for eBPF active
Jan 20 08:16:00 np0005588920 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 20 08:16:00 np0005588920 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 20 08:16:00 np0005588920 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Jan 20 08:16:00 np0005588920 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Jan 20 08:16:00 np0005588920 kernel: ... version:                0
Jan 20 08:16:00 np0005588920 kernel: ... bit width:              48
Jan 20 08:16:00 np0005588920 kernel: ... generic registers:      6
Jan 20 08:16:00 np0005588920 kernel: ... value mask:             0000ffffffffffff
Jan 20 08:16:00 np0005588920 kernel: ... max period:             00007fffffffffff
Jan 20 08:16:00 np0005588920 kernel: ... fixed-purpose events:   0
Jan 20 08:16:00 np0005588920 kernel: ... event mask:             000000000000003f
Jan 20 08:16:00 np0005588920 kernel: signal: max sigframe size: 1776
Jan 20 08:16:00 np0005588920 kernel: rcu: Hierarchical SRCU implementation.
Jan 20 08:16:00 np0005588920 kernel: rcu: 	Max phase no-delay instances is 400.
Jan 20 08:16:00 np0005588920 kernel: smp: Bringing up secondary CPUs ...
Jan 20 08:16:00 np0005588920 kernel: smpboot: x86: Booting SMP configuration:
Jan 20 08:16:00 np0005588920 kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Jan 20 08:16:00 np0005588920 kernel: smp: Brought up 1 node, 8 CPUs
Jan 20 08:16:00 np0005588920 kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Jan 20 08:16:00 np0005588920 kernel: node 0 deferred pages initialised in 5ms
Jan 20 08:16:00 np0005588920 kernel: Memory: 7763956K/8388068K available (16384K kernel code, 5797K rwdata, 13916K rodata, 4200K init, 7192K bss, 618364K reserved, 0K cma-reserved)
Jan 20 08:16:00 np0005588920 kernel: devtmpfs: initialized
Jan 20 08:16:00 np0005588920 kernel: x86/mm: Memory block size: 128MB
Jan 20 08:16:00 np0005588920 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jan 20 08:16:00 np0005588920 kernel: futex hash table entries: 2048 (131072 bytes on 1 NUMA nodes, total 128 KiB, linear).
Jan 20 08:16:00 np0005588920 kernel: pinctrl core: initialized pinctrl subsystem
Jan 20 08:16:00 np0005588920 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jan 20 08:16:00 np0005588920 kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Jan 20 08:16:00 np0005588920 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Jan 20 08:16:00 np0005588920 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Jan 20 08:16:00 np0005588920 kernel: audit: initializing netlink subsys (disabled)
Jan 20 08:16:00 np0005588920 kernel: audit: type=2000 audit(1768914958.497:1): state=initialized audit_enabled=0 res=1
Jan 20 08:16:00 np0005588920 kernel: thermal_sys: Registered thermal governor 'fair_share'
Jan 20 08:16:00 np0005588920 kernel: thermal_sys: Registered thermal governor 'step_wise'
Jan 20 08:16:00 np0005588920 kernel: thermal_sys: Registered thermal governor 'user_space'
Jan 20 08:16:00 np0005588920 kernel: cpuidle: using governor menu
Jan 20 08:16:00 np0005588920 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jan 20 08:16:00 np0005588920 kernel: PCI: Using configuration type 1 for base access
Jan 20 08:16:00 np0005588920 kernel: PCI: Using configuration type 1 for extended access
Jan 20 08:16:00 np0005588920 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Jan 20 08:16:00 np0005588920 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jan 20 08:16:00 np0005588920 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Jan 20 08:16:00 np0005588920 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jan 20 08:16:00 np0005588920 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Jan 20 08:16:00 np0005588920 kernel: Demotion targets for Node 0: null
Jan 20 08:16:00 np0005588920 kernel: cryptd: max_cpu_qlen set to 1000
Jan 20 08:16:00 np0005588920 kernel: ACPI: Added _OSI(Module Device)
Jan 20 08:16:00 np0005588920 kernel: ACPI: Added _OSI(Processor Device)
Jan 20 08:16:00 np0005588920 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jan 20 08:16:00 np0005588920 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jan 20 08:16:00 np0005588920 kernel: ACPI: Interpreter enabled
Jan 20 08:16:00 np0005588920 kernel: ACPI: PM: (supports S0 S3 S4 S5)
Jan 20 08:16:00 np0005588920 kernel: ACPI: Using IOAPIC for interrupt routing
Jan 20 08:16:00 np0005588920 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Jan 20 08:16:00 np0005588920 kernel: PCI: Using E820 reservations for host bridge windows
Jan 20 08:16:00 np0005588920 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Jan 20 08:16:00 np0005588920 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jan 20 08:16:00 np0005588920 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Jan 20 08:16:00 np0005588920 kernel: acpiphp: Slot [3] registered
Jan 20 08:16:00 np0005588920 kernel: acpiphp: Slot [4] registered
Jan 20 08:16:00 np0005588920 kernel: acpiphp: Slot [5] registered
Jan 20 08:16:00 np0005588920 kernel: acpiphp: Slot [6] registered
Jan 20 08:16:00 np0005588920 kernel: acpiphp: Slot [7] registered
Jan 20 08:16:00 np0005588920 kernel: acpiphp: Slot [8] registered
Jan 20 08:16:00 np0005588920 kernel: acpiphp: Slot [9] registered
Jan 20 08:16:00 np0005588920 kernel: acpiphp: Slot [10] registered
Jan 20 08:16:00 np0005588920 kernel: acpiphp: Slot [11] registered
Jan 20 08:16:00 np0005588920 kernel: acpiphp: Slot [12] registered
Jan 20 08:16:00 np0005588920 kernel: acpiphp: Slot [13] registered
Jan 20 08:16:00 np0005588920 kernel: acpiphp: Slot [14] registered
Jan 20 08:16:00 np0005588920 kernel: acpiphp: Slot [15] registered
Jan 20 08:16:00 np0005588920 kernel: acpiphp: Slot [16] registered
Jan 20 08:16:00 np0005588920 kernel: acpiphp: Slot [17] registered
Jan 20 08:16:00 np0005588920 kernel: acpiphp: Slot [18] registered
Jan 20 08:16:00 np0005588920 kernel: acpiphp: Slot [19] registered
Jan 20 08:16:00 np0005588920 kernel: acpiphp: Slot [20] registered
Jan 20 08:16:00 np0005588920 kernel: acpiphp: Slot [21] registered
Jan 20 08:16:00 np0005588920 kernel: acpiphp: Slot [22] registered
Jan 20 08:16:00 np0005588920 kernel: acpiphp: Slot [23] registered
Jan 20 08:16:00 np0005588920 kernel: acpiphp: Slot [24] registered
Jan 20 08:16:00 np0005588920 kernel: acpiphp: Slot [25] registered
Jan 20 08:16:00 np0005588920 kernel: acpiphp: Slot [26] registered
Jan 20 08:16:00 np0005588920 kernel: acpiphp: Slot [27] registered
Jan 20 08:16:00 np0005588920 kernel: acpiphp: Slot [28] registered
Jan 20 08:16:00 np0005588920 kernel: acpiphp: Slot [29] registered
Jan 20 08:16:00 np0005588920 kernel: acpiphp: Slot [30] registered
Jan 20 08:16:00 np0005588920 kernel: acpiphp: Slot [31] registered
Jan 20 08:16:00 np0005588920 kernel: PCI host bridge to bus 0000:00
Jan 20 08:16:00 np0005588920 kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Jan 20 08:16:00 np0005588920 kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Jan 20 08:16:00 np0005588920 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Jan 20 08:16:00 np0005588920 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Jan 20 08:16:00 np0005588920 kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Jan 20 08:16:00 np0005588920 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jan 20 08:16:00 np0005588920 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Jan 20 08:16:00 np0005588920 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Jan 20 08:16:00 np0005588920 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Jan 20 08:16:00 np0005588920 kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Jan 20 08:16:00 np0005588920 kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Jan 20 08:16:00 np0005588920 kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Jan 20 08:16:00 np0005588920 kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Jan 20 08:16:00 np0005588920 kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Jan 20 08:16:00 np0005588920 kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Jan 20 08:16:00 np0005588920 kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Jan 20 08:16:00 np0005588920 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Jan 20 08:16:00 np0005588920 kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Jan 20 08:16:00 np0005588920 kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Jan 20 08:16:00 np0005588920 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Jan 20 08:16:00 np0005588920 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Jan 20 08:16:00 np0005588920 kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Jan 20 08:16:00 np0005588920 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Jan 20 08:16:00 np0005588920 kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Jan 20 08:16:00 np0005588920 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Jan 20 08:16:00 np0005588920 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 20 08:16:00 np0005588920 kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Jan 20 08:16:00 np0005588920 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Jan 20 08:16:00 np0005588920 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Jan 20 08:16:00 np0005588920 kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Jan 20 08:16:00 np0005588920 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Jan 20 08:16:00 np0005588920 kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Jan 20 08:16:00 np0005588920 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Jan 20 08:16:00 np0005588920 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Jan 20 08:16:00 np0005588920 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Jan 20 08:16:00 np0005588920 kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Jan 20 08:16:00 np0005588920 kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Jan 20 08:16:00 np0005588920 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Jan 20 08:16:00 np0005588920 kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Jan 20 08:16:00 np0005588920 kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Jan 20 08:16:00 np0005588920 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Jan 20 08:16:00 np0005588920 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Jan 20 08:16:00 np0005588920 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Jan 20 08:16:00 np0005588920 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Jan 20 08:16:00 np0005588920 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Jan 20 08:16:00 np0005588920 kernel: iommu: Default domain type: Translated
Jan 20 08:16:00 np0005588920 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Jan 20 08:16:00 np0005588920 kernel: SCSI subsystem initialized
Jan 20 08:16:00 np0005588920 kernel: ACPI: bus type USB registered
Jan 20 08:16:00 np0005588920 kernel: usbcore: registered new interface driver usbfs
Jan 20 08:16:00 np0005588920 kernel: usbcore: registered new interface driver hub
Jan 20 08:16:00 np0005588920 kernel: usbcore: registered new device driver usb
Jan 20 08:16:00 np0005588920 kernel: pps_core: LinuxPPS API ver. 1 registered
Jan 20 08:16:00 np0005588920 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Jan 20 08:16:00 np0005588920 kernel: PTP clock support registered
Jan 20 08:16:00 np0005588920 kernel: EDAC MC: Ver: 3.0.0
Jan 20 08:16:00 np0005588920 kernel: NetLabel: Initializing
Jan 20 08:16:00 np0005588920 kernel: NetLabel:  domain hash size = 128
Jan 20 08:16:00 np0005588920 kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Jan 20 08:16:00 np0005588920 kernel: NetLabel:  unlabeled traffic allowed by default
Jan 20 08:16:00 np0005588920 kernel: PCI: Using ACPI for IRQ routing
Jan 20 08:16:00 np0005588920 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Jan 20 08:16:00 np0005588920 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Jan 20 08:16:00 np0005588920 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Jan 20 08:16:00 np0005588920 kernel: vgaarb: loaded
Jan 20 08:16:00 np0005588920 kernel: clocksource: Switched to clocksource kvm-clock
Jan 20 08:16:00 np0005588920 kernel: VFS: Disk quotas dquot_6.6.0
Jan 20 08:16:00 np0005588920 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Jan 20 08:16:00 np0005588920 kernel: pnp: PnP ACPI init
Jan 20 08:16:00 np0005588920 kernel: pnp: PnP ACPI: found 5 devices
Jan 20 08:16:00 np0005588920 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Jan 20 08:16:00 np0005588920 kernel: NET: Registered PF_INET protocol family
Jan 20 08:16:00 np0005588920 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Jan 20 08:16:00 np0005588920 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Jan 20 08:16:00 np0005588920 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Jan 20 08:16:00 np0005588920 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Jan 20 08:16:00 np0005588920 kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Jan 20 08:16:00 np0005588920 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Jan 20 08:16:00 np0005588920 kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Jan 20 08:16:00 np0005588920 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 20 08:16:00 np0005588920 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 20 08:16:00 np0005588920 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Jan 20 08:16:00 np0005588920 kernel: NET: Registered PF_XDP protocol family
Jan 20 08:16:00 np0005588920 kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Jan 20 08:16:00 np0005588920 kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Jan 20 08:16:00 np0005588920 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Jan 20 08:16:00 np0005588920 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Jan 20 08:16:00 np0005588920 kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Jan 20 08:16:00 np0005588920 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Jan 20 08:16:00 np0005588920 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Jan 20 08:16:00 np0005588920 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Jan 20 08:16:00 np0005588920 kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 139971 usecs
Jan 20 08:16:00 np0005588920 kernel: PCI: CLS 0 bytes, default 64
Jan 20 08:16:00 np0005588920 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Jan 20 08:16:00 np0005588920 kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Jan 20 08:16:00 np0005588920 kernel: ACPI: bus type thunderbolt registered
Jan 20 08:16:00 np0005588920 kernel: Trying to unpack rootfs image as initramfs...
Jan 20 08:16:00 np0005588920 kernel: Initialise system trusted keyrings
Jan 20 08:16:00 np0005588920 kernel: Key type blacklist registered
Jan 20 08:16:00 np0005588920 kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Jan 20 08:16:00 np0005588920 kernel: zbud: loaded
Jan 20 08:16:00 np0005588920 kernel: integrity: Platform Keyring initialized
Jan 20 08:16:00 np0005588920 kernel: integrity: Machine keyring initialized
Jan 20 08:16:00 np0005588920 kernel: Freeing initrd memory: 87956K
Jan 20 08:16:00 np0005588920 kernel: NET: Registered PF_ALG protocol family
Jan 20 08:16:00 np0005588920 kernel: xor: automatically using best checksumming function   avx       
Jan 20 08:16:00 np0005588920 kernel: Key type asymmetric registered
Jan 20 08:16:00 np0005588920 kernel: Asymmetric key parser 'x509' registered
Jan 20 08:16:00 np0005588920 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Jan 20 08:16:00 np0005588920 kernel: io scheduler mq-deadline registered
Jan 20 08:16:00 np0005588920 kernel: io scheduler kyber registered
Jan 20 08:16:00 np0005588920 kernel: io scheduler bfq registered
Jan 20 08:16:00 np0005588920 kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Jan 20 08:16:00 np0005588920 kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Jan 20 08:16:00 np0005588920 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Jan 20 08:16:00 np0005588920 kernel: ACPI: button: Power Button [PWRF]
Jan 20 08:16:00 np0005588920 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Jan 20 08:16:00 np0005588920 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Jan 20 08:16:00 np0005588920 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Jan 20 08:16:00 np0005588920 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Jan 20 08:16:00 np0005588920 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Jan 20 08:16:00 np0005588920 kernel: Non-volatile memory driver v1.3
Jan 20 08:16:00 np0005588920 kernel: rdac: device handler registered
Jan 20 08:16:00 np0005588920 kernel: hp_sw: device handler registered
Jan 20 08:16:00 np0005588920 kernel: emc: device handler registered
Jan 20 08:16:00 np0005588920 kernel: alua: device handler registered
Jan 20 08:16:00 np0005588920 kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Jan 20 08:16:00 np0005588920 kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Jan 20 08:16:00 np0005588920 kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Jan 20 08:16:00 np0005588920 kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Jan 20 08:16:00 np0005588920 kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Jan 20 08:16:00 np0005588920 kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Jan 20 08:16:00 np0005588920 kernel: usb usb1: Product: UHCI Host Controller
Jan 20 08:16:00 np0005588920 kernel: usb usb1: Manufacturer: Linux 5.14.0-661.el9.x86_64 uhci_hcd
Jan 20 08:16:00 np0005588920 kernel: usb usb1: SerialNumber: 0000:00:01.2
Jan 20 08:16:00 np0005588920 kernel: hub 1-0:1.0: USB hub found
Jan 20 08:16:00 np0005588920 kernel: hub 1-0:1.0: 2 ports detected
Jan 20 08:16:00 np0005588920 kernel: usbcore: registered new interface driver usbserial_generic
Jan 20 08:16:00 np0005588920 kernel: usbserial: USB Serial support registered for generic
Jan 20 08:16:00 np0005588920 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Jan 20 08:16:00 np0005588920 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Jan 20 08:16:00 np0005588920 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Jan 20 08:16:00 np0005588920 kernel: mousedev: PS/2 mouse device common for all mice
Jan 20 08:16:00 np0005588920 kernel: rtc_cmos 00:04: RTC can wake from S4
Jan 20 08:16:00 np0005588920 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Jan 20 08:16:00 np0005588920 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Jan 20 08:16:00 np0005588920 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Jan 20 08:16:00 np0005588920 kernel: rtc_cmos 00:04: registered as rtc0
Jan 20 08:16:00 np0005588920 kernel: rtc_cmos 00:04: setting system clock to 2026-01-20T13:15:59 UTC (1768914959)
Jan 20 08:16:00 np0005588920 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Jan 20 08:16:00 np0005588920 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Jan 20 08:16:00 np0005588920 kernel: hid: raw HID events driver (C) Jiri Kosina
Jan 20 08:16:00 np0005588920 kernel: usbcore: registered new interface driver usbhid
Jan 20 08:16:00 np0005588920 kernel: usbhid: USB HID core driver
Jan 20 08:16:00 np0005588920 kernel: drop_monitor: Initializing network drop monitor service
Jan 20 08:16:00 np0005588920 kernel: Initializing XFRM netlink socket
Jan 20 08:16:00 np0005588920 kernel: NET: Registered PF_INET6 protocol family
Jan 20 08:16:00 np0005588920 kernel: Segment Routing with IPv6
Jan 20 08:16:00 np0005588920 kernel: NET: Registered PF_PACKET protocol family
Jan 20 08:16:00 np0005588920 kernel: mpls_gso: MPLS GSO support
Jan 20 08:16:00 np0005588920 kernel: IPI shorthand broadcast: enabled
Jan 20 08:16:00 np0005588920 kernel: AVX2 version of gcm_enc/dec engaged.
Jan 20 08:16:00 np0005588920 kernel: AES CTR mode by8 optimization enabled
Jan 20 08:16:00 np0005588920 kernel: sched_clock: Marking stable (1518008250, 197224360)->(1856926850, -141694240)
Jan 20 08:16:00 np0005588920 kernel: registered taskstats version 1
Jan 20 08:16:00 np0005588920 kernel: Loading compiled-in X.509 certificates
Jan 20 08:16:00 np0005588920 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 04453f216699002fd63185eeab832de990bee6d7'
Jan 20 08:16:00 np0005588920 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Jan 20 08:16:00 np0005588920 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Jan 20 08:16:00 np0005588920 kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Jan 20 08:16:00 np0005588920 kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Jan 20 08:16:00 np0005588920 kernel: Demotion targets for Node 0: null
Jan 20 08:16:00 np0005588920 kernel: page_owner is disabled
Jan 20 08:16:00 np0005588920 kernel: Key type .fscrypt registered
Jan 20 08:16:00 np0005588920 kernel: Key type fscrypt-provisioning registered
Jan 20 08:16:00 np0005588920 kernel: Key type big_key registered
Jan 20 08:16:00 np0005588920 kernel: Key type encrypted registered
Jan 20 08:16:00 np0005588920 kernel: ima: No TPM chip found, activating TPM-bypass!
Jan 20 08:16:00 np0005588920 kernel: Loading compiled-in module X.509 certificates
Jan 20 08:16:00 np0005588920 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 04453f216699002fd63185eeab832de990bee6d7'
Jan 20 08:16:00 np0005588920 kernel: ima: Allocated hash algorithm: sha256
Jan 20 08:16:00 np0005588920 kernel: ima: No architecture policies found
Jan 20 08:16:00 np0005588920 kernel: evm: Initialising EVM extended attributes:
Jan 20 08:16:00 np0005588920 kernel: evm: security.selinux
Jan 20 08:16:00 np0005588920 kernel: evm: security.SMACK64 (disabled)
Jan 20 08:16:00 np0005588920 kernel: evm: security.SMACK64EXEC (disabled)
Jan 20 08:16:00 np0005588920 kernel: evm: security.SMACK64TRANSMUTE (disabled)
Jan 20 08:16:00 np0005588920 kernel: evm: security.SMACK64MMAP (disabled)
Jan 20 08:16:00 np0005588920 kernel: evm: security.apparmor (disabled)
Jan 20 08:16:00 np0005588920 kernel: evm: security.ima
Jan 20 08:16:00 np0005588920 kernel: evm: security.capability
Jan 20 08:16:00 np0005588920 kernel: evm: HMAC attrs: 0x1
Jan 20 08:16:00 np0005588920 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Jan 20 08:16:00 np0005588920 kernel: Running certificate verification RSA selftest
Jan 20 08:16:00 np0005588920 kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Jan 20 08:16:00 np0005588920 kernel: Running certificate verification ECDSA selftest
Jan 20 08:16:00 np0005588920 kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Jan 20 08:16:00 np0005588920 kernel: clk: Disabling unused clocks
Jan 20 08:16:00 np0005588920 kernel: Freeing unused decrypted memory: 2028K
Jan 20 08:16:00 np0005588920 kernel: Freeing unused kernel image (initmem) memory: 4200K
Jan 20 08:16:00 np0005588920 kernel: Write protecting the kernel read-only data: 30720k
Jan 20 08:16:00 np0005588920 kernel: Freeing unused kernel image (rodata/data gap) memory: 420K
Jan 20 08:16:00 np0005588920 kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Jan 20 08:16:00 np0005588920 kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Jan 20 08:16:00 np0005588920 kernel: usb 1-1: Product: QEMU USB Tablet
Jan 20 08:16:00 np0005588920 kernel: usb 1-1: Manufacturer: QEMU
Jan 20 08:16:00 np0005588920 kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Jan 20 08:16:00 np0005588920 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Jan 20 08:16:00 np0005588920 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Jan 20 08:16:00 np0005588920 kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Jan 20 08:16:00 np0005588920 kernel: Run /init as init process
Jan 20 08:16:00 np0005588920 systemd: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 20 08:16:00 np0005588920 systemd: Detected virtualization kvm.
Jan 20 08:16:00 np0005588920 systemd: Detected architecture x86-64.
Jan 20 08:16:00 np0005588920 systemd: Running in initrd.
Jan 20 08:16:00 np0005588920 systemd: No hostname configured, using default hostname.
Jan 20 08:16:00 np0005588920 systemd: Hostname set to <localhost>.
Jan 20 08:16:00 np0005588920 systemd: Initializing machine ID from VM UUID.
Jan 20 08:16:00 np0005588920 systemd: Queued start job for default target Initrd Default Target.
Jan 20 08:16:00 np0005588920 systemd: Started Dispatch Password Requests to Console Directory Watch.
Jan 20 08:16:00 np0005588920 systemd: Reached target Local Encrypted Volumes.
Jan 20 08:16:00 np0005588920 systemd: Reached target Initrd /usr File System.
Jan 20 08:16:00 np0005588920 systemd: Reached target Local File Systems.
Jan 20 08:16:00 np0005588920 systemd: Reached target Path Units.
Jan 20 08:16:00 np0005588920 systemd: Reached target Slice Units.
Jan 20 08:16:00 np0005588920 systemd: Reached target Swaps.
Jan 20 08:16:00 np0005588920 systemd: Reached target Timer Units.
Jan 20 08:16:00 np0005588920 systemd: Listening on D-Bus System Message Bus Socket.
Jan 20 08:16:00 np0005588920 systemd: Listening on Journal Socket (/dev/log).
Jan 20 08:16:00 np0005588920 systemd: Listening on Journal Socket.
Jan 20 08:16:00 np0005588920 systemd: Listening on udev Control Socket.
Jan 20 08:16:00 np0005588920 systemd: Listening on udev Kernel Socket.
Jan 20 08:16:00 np0005588920 systemd: Reached target Socket Units.
Jan 20 08:16:00 np0005588920 systemd: Starting Create List of Static Device Nodes...
Jan 20 08:16:00 np0005588920 systemd: Starting Journal Service...
Jan 20 08:16:00 np0005588920 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 20 08:16:00 np0005588920 systemd: Starting Apply Kernel Variables...
Jan 20 08:16:00 np0005588920 systemd: Starting Create System Users...
Jan 20 08:16:00 np0005588920 systemd: Starting Setup Virtual Console...
Jan 20 08:16:00 np0005588920 systemd: Finished Create List of Static Device Nodes.
Jan 20 08:16:00 np0005588920 systemd: Finished Apply Kernel Variables.
Jan 20 08:16:00 np0005588920 systemd-journald[304]: Journal started
Jan 20 08:16:00 np0005588920 systemd-journald[304]: Runtime Journal (/run/log/journal/1190ec402b8943588dec733c5829fbed) is 8.0M, max 153.6M, 145.6M free.
Jan 20 08:16:00 np0005588920 systemd: Started Journal Service.
Jan 20 08:16:00 np0005588920 systemd-sysusers[309]: Creating group 'users' with GID 100.
Jan 20 08:16:00 np0005588920 systemd-sysusers[309]: Creating group 'dbus' with GID 81.
Jan 20 08:16:00 np0005588920 systemd-sysusers[309]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Jan 20 08:16:00 np0005588920 systemd[1]: Finished Create System Users.
Jan 20 08:16:00 np0005588920 systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 20 08:16:00 np0005588920 systemd[1]: Starting Create Volatile Files and Directories...
Jan 20 08:16:00 np0005588920 systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 20 08:16:00 np0005588920 systemd[1]: Finished Create Volatile Files and Directories.
Jan 20 08:16:00 np0005588920 systemd[1]: Finished Setup Virtual Console.
Jan 20 08:16:00 np0005588920 systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Jan 20 08:16:00 np0005588920 systemd[1]: Starting dracut cmdline hook...
Jan 20 08:16:00 np0005588920 dracut-cmdline[323]: dracut-9 dracut-057-102.git20250818.el9
Jan 20 08:16:00 np0005588920 dracut-cmdline[323]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 20 08:16:00 np0005588920 systemd[1]: Finished dracut cmdline hook.
Jan 20 08:16:00 np0005588920 systemd[1]: Starting dracut pre-udev hook...
Jan 20 08:16:00 np0005588920 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Jan 20 08:16:00 np0005588920 kernel: device-mapper: uevent: version 1.0.3
Jan 20 08:16:00 np0005588920 kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Jan 20 08:16:00 np0005588920 kernel: RPC: Registered named UNIX socket transport module.
Jan 20 08:16:00 np0005588920 kernel: RPC: Registered udp transport module.
Jan 20 08:16:00 np0005588920 kernel: RPC: Registered tcp transport module.
Jan 20 08:16:00 np0005588920 kernel: RPC: Registered tcp-with-tls transport module.
Jan 20 08:16:00 np0005588920 kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Jan 20 08:16:00 np0005588920 rpc.statd[439]: Version 2.5.4 starting
Jan 20 08:16:00 np0005588920 rpc.statd[439]: Initializing NSM state
Jan 20 08:16:00 np0005588920 rpc.idmapd[444]: Setting log level to 0
Jan 20 08:16:00 np0005588920 systemd[1]: Finished dracut pre-udev hook.
Jan 20 08:16:00 np0005588920 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 20 08:16:00 np0005588920 systemd-udevd[457]: Using default interface naming scheme 'rhel-9.0'.
Jan 20 08:16:00 np0005588920 systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 20 08:16:00 np0005588920 systemd[1]: Starting dracut pre-trigger hook...
Jan 20 08:16:01 np0005588920 systemd[1]: Finished dracut pre-trigger hook.
Jan 20 08:16:01 np0005588920 systemd[1]: Starting Coldplug All udev Devices...
Jan 20 08:16:01 np0005588920 systemd[1]: Created slice Slice /system/modprobe.
Jan 20 08:16:01 np0005588920 systemd[1]: Starting Load Kernel Module configfs...
Jan 20 08:16:01 np0005588920 systemd[1]: Finished Coldplug All udev Devices.
Jan 20 08:16:01 np0005588920 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 20 08:16:01 np0005588920 systemd[1]: Finished Load Kernel Module configfs.
Jan 20 08:16:01 np0005588920 systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 20 08:16:01 np0005588920 systemd[1]: Reached target Network.
Jan 20 08:16:01 np0005588920 systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 20 08:16:01 np0005588920 systemd[1]: Starting dracut initqueue hook...
Jan 20 08:16:01 np0005588920 kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Jan 20 08:16:01 np0005588920 kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Jan 20 08:16:01 np0005588920 systemd[1]: Mounting Kernel Configuration File System...
Jan 20 08:16:01 np0005588920 kernel: vda: vda1
Jan 20 08:16:01 np0005588920 systemd[1]: Mounted Kernel Configuration File System.
Jan 20 08:16:01 np0005588920 systemd[1]: Reached target System Initialization.
Jan 20 08:16:01 np0005588920 systemd[1]: Reached target Basic System.
Jan 20 08:16:01 np0005588920 kernel: scsi host0: ata_piix
Jan 20 08:16:01 np0005588920 kernel: scsi host1: ata_piix
Jan 20 08:16:01 np0005588920 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Jan 20 08:16:01 np0005588920 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Jan 20 08:16:01 np0005588920 systemd-udevd[483]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 08:16:01 np0005588920 systemd[1]: Found device /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40.
Jan 20 08:16:01 np0005588920 systemd[1]: Reached target Initrd Root Device.
Jan 20 08:16:01 np0005588920 kernel: ata1: found unknown device (class 0)
Jan 20 08:16:01 np0005588920 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Jan 20 08:16:01 np0005588920 kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Jan 20 08:16:01 np0005588920 kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Jan 20 08:16:01 np0005588920 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Jan 20 08:16:01 np0005588920 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Jan 20 08:16:01 np0005588920 systemd[1]: Finished dracut initqueue hook.
Jan 20 08:16:01 np0005588920 systemd[1]: Reached target Preparation for Remote File Systems.
Jan 20 08:16:01 np0005588920 systemd[1]: Reached target Remote Encrypted Volumes.
Jan 20 08:16:01 np0005588920 systemd[1]: Reached target Remote File Systems.
Jan 20 08:16:01 np0005588920 systemd[1]: Starting dracut pre-mount hook...
Jan 20 08:16:01 np0005588920 systemd[1]: Finished dracut pre-mount hook.
Jan 20 08:16:01 np0005588920 systemd[1]: Starting File System Check on /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40...
Jan 20 08:16:01 np0005588920 systemd-fsck[551]: /usr/sbin/fsck.xfs: XFS file system.
Jan 20 08:16:01 np0005588920 systemd[1]: Finished File System Check on /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40.
Jan 20 08:16:01 np0005588920 systemd[1]: Mounting /sysroot...
Jan 20 08:16:02 np0005588920 kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Jan 20 08:16:02 np0005588920 kernel: XFS (vda1): Mounting V5 Filesystem 22ac9141-3960-4912-b20e-19fc8a328d40
Jan 20 08:16:02 np0005588920 kernel: XFS (vda1): Ending clean mount
Jan 20 08:16:02 np0005588920 systemd[1]: Mounted /sysroot.
Jan 20 08:16:02 np0005588920 systemd[1]: Reached target Initrd Root File System.
Jan 20 08:16:02 np0005588920 systemd[1]: Starting Mountpoints Configured in the Real Root...
Jan 20 08:16:02 np0005588920 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Jan 20 08:16:02 np0005588920 systemd[1]: Finished Mountpoints Configured in the Real Root.
Jan 20 08:16:02 np0005588920 systemd[1]: Reached target Initrd File Systems.
Jan 20 08:16:02 np0005588920 systemd[1]: Reached target Initrd Default Target.
Jan 20 08:16:02 np0005588920 systemd[1]: Starting dracut mount hook...
Jan 20 08:16:02 np0005588920 systemd[1]: Finished dracut mount hook.
Jan 20 08:16:02 np0005588920 systemd[1]: Starting dracut pre-pivot and cleanup hook...
Jan 20 08:16:02 np0005588920 rpc.idmapd[444]: exiting on signal 15
Jan 20 08:16:02 np0005588920 systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Jan 20 08:16:02 np0005588920 systemd[1]: Finished dracut pre-pivot and cleanup hook.
Jan 20 08:16:02 np0005588920 systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Jan 20 08:16:02 np0005588920 systemd[1]: Stopped target Network.
Jan 20 08:16:02 np0005588920 systemd[1]: Stopped target Remote Encrypted Volumes.
Jan 20 08:16:02 np0005588920 systemd[1]: Stopped target Timer Units.
Jan 20 08:16:02 np0005588920 systemd[1]: dbus.socket: Deactivated successfully.
Jan 20 08:16:02 np0005588920 systemd[1]: Closed D-Bus System Message Bus Socket.
Jan 20 08:16:02 np0005588920 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Jan 20 08:16:02 np0005588920 systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Jan 20 08:16:02 np0005588920 systemd[1]: Stopped target Initrd Default Target.
Jan 20 08:16:02 np0005588920 systemd[1]: Stopped target Basic System.
Jan 20 08:16:02 np0005588920 systemd[1]: Stopped target Initrd Root Device.
Jan 20 08:16:02 np0005588920 systemd[1]: Stopped target Initrd /usr File System.
Jan 20 08:16:02 np0005588920 systemd[1]: Stopped target Path Units.
Jan 20 08:16:02 np0005588920 systemd[1]: Stopped target Remote File Systems.
Jan 20 08:16:02 np0005588920 systemd[1]: Stopped target Preparation for Remote File Systems.
Jan 20 08:16:02 np0005588920 systemd[1]: Stopped target Slice Units.
Jan 20 08:16:02 np0005588920 systemd[1]: Stopped target Socket Units.
Jan 20 08:16:02 np0005588920 systemd[1]: Stopped target System Initialization.
Jan 20 08:16:02 np0005588920 systemd[1]: Stopped target Local File Systems.
Jan 20 08:16:02 np0005588920 systemd[1]: Stopped target Swaps.
Jan 20 08:16:02 np0005588920 systemd[1]: dracut-mount.service: Deactivated successfully.
Jan 20 08:16:02 np0005588920 systemd[1]: Stopped dracut mount hook.
Jan 20 08:16:02 np0005588920 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Jan 20 08:16:02 np0005588920 systemd[1]: Stopped dracut pre-mount hook.
Jan 20 08:16:02 np0005588920 systemd[1]: Stopped target Local Encrypted Volumes.
Jan 20 08:16:02 np0005588920 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Jan 20 08:16:02 np0005588920 systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Jan 20 08:16:02 np0005588920 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Jan 20 08:16:02 np0005588920 systemd[1]: Stopped dracut initqueue hook.
Jan 20 08:16:02 np0005588920 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 20 08:16:02 np0005588920 systemd[1]: Stopped Apply Kernel Variables.
Jan 20 08:16:02 np0005588920 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Jan 20 08:16:02 np0005588920 systemd[1]: Stopped Create Volatile Files and Directories.
Jan 20 08:16:02 np0005588920 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Jan 20 08:16:02 np0005588920 systemd[1]: Stopped Coldplug All udev Devices.
Jan 20 08:16:02 np0005588920 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Jan 20 08:16:02 np0005588920 systemd[1]: Stopped dracut pre-trigger hook.
Jan 20 08:16:02 np0005588920 systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Jan 20 08:16:02 np0005588920 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 20 08:16:02 np0005588920 systemd[1]: Stopped Setup Virtual Console.
Jan 20 08:16:02 np0005588920 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Jan 20 08:16:02 np0005588920 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 20 08:16:02 np0005588920 systemd[1]: systemd-udevd.service: Deactivated successfully.
Jan 20 08:16:02 np0005588920 systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Jan 20 08:16:02 np0005588920 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Jan 20 08:16:02 np0005588920 systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Jan 20 08:16:02 np0005588920 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Jan 20 08:16:02 np0005588920 systemd[1]: Closed udev Control Socket.
Jan 20 08:16:02 np0005588920 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Jan 20 08:16:02 np0005588920 systemd[1]: Closed udev Kernel Socket.
Jan 20 08:16:02 np0005588920 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Jan 20 08:16:02 np0005588920 systemd[1]: Stopped dracut pre-udev hook.
Jan 20 08:16:02 np0005588920 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Jan 20 08:16:02 np0005588920 systemd[1]: Stopped dracut cmdline hook.
Jan 20 08:16:02 np0005588920 systemd[1]: Starting Cleanup udev Database...
Jan 20 08:16:02 np0005588920 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jan 20 08:16:02 np0005588920 systemd[1]: Stopped Create Static Device Nodes in /dev.
Jan 20 08:16:02 np0005588920 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Jan 20 08:16:02 np0005588920 systemd[1]: Stopped Create List of Static Device Nodes.
Jan 20 08:16:02 np0005588920 systemd[1]: systemd-sysusers.service: Deactivated successfully.
Jan 20 08:16:02 np0005588920 systemd[1]: Stopped Create System Users.
Jan 20 08:16:02 np0005588920 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Jan 20 08:16:02 np0005588920 systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Jan 20 08:16:02 np0005588920 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Jan 20 08:16:02 np0005588920 systemd[1]: Finished Cleanup udev Database.
Jan 20 08:16:02 np0005588920 systemd[1]: Reached target Switch Root.
Jan 20 08:16:02 np0005588920 systemd[1]: Starting Switch Root...
Jan 20 08:16:02 np0005588920 systemd[1]: Switching root.
Jan 20 08:16:02 np0005588920 systemd-journald[304]: Journal stopped
Jan 20 08:16:03 np0005588920 systemd-journald: Received SIGTERM from PID 1 (systemd).
Jan 20 08:16:03 np0005588920 kernel: audit: type=1404 audit(1768914962.575:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Jan 20 08:16:03 np0005588920 kernel: SELinux:  policy capability network_peer_controls=1
Jan 20 08:16:03 np0005588920 kernel: SELinux:  policy capability open_perms=1
Jan 20 08:16:03 np0005588920 kernel: SELinux:  policy capability extended_socket_class=1
Jan 20 08:16:03 np0005588920 kernel: SELinux:  policy capability always_check_network=0
Jan 20 08:16:03 np0005588920 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 20 08:16:03 np0005588920 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 20 08:16:03 np0005588920 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 20 08:16:03 np0005588920 kernel: audit: type=1403 audit(1768914962.702:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Jan 20 08:16:03 np0005588920 systemd: Successfully loaded SELinux policy in 129.304ms.
Jan 20 08:16:03 np0005588920 systemd: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 29.799ms.
Jan 20 08:16:03 np0005588920 systemd: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 20 08:16:03 np0005588920 systemd: Detected virtualization kvm.
Jan 20 08:16:03 np0005588920 systemd: Detected architecture x86-64.
Jan 20 08:16:03 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:16:03 np0005588920 systemd: initrd-switch-root.service: Deactivated successfully.
Jan 20 08:16:03 np0005588920 systemd: Stopped Switch Root.
Jan 20 08:16:03 np0005588920 systemd: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Jan 20 08:16:03 np0005588920 systemd: Created slice Slice /system/getty.
Jan 20 08:16:03 np0005588920 systemd: Created slice Slice /system/serial-getty.
Jan 20 08:16:03 np0005588920 systemd: Created slice Slice /system/sshd-keygen.
Jan 20 08:16:03 np0005588920 systemd: Created slice User and Session Slice.
Jan 20 08:16:03 np0005588920 systemd: Started Dispatch Password Requests to Console Directory Watch.
Jan 20 08:16:03 np0005588920 systemd: Started Forward Password Requests to Wall Directory Watch.
Jan 20 08:16:03 np0005588920 systemd: Set up automount Arbitrary Executable File Formats File System Automount Point.
Jan 20 08:16:03 np0005588920 systemd: Reached target Local Encrypted Volumes.
Jan 20 08:16:03 np0005588920 systemd: Stopped target Switch Root.
Jan 20 08:16:03 np0005588920 systemd: Stopped target Initrd File Systems.
Jan 20 08:16:03 np0005588920 systemd: Stopped target Initrd Root File System.
Jan 20 08:16:03 np0005588920 systemd: Reached target Local Integrity Protected Volumes.
Jan 20 08:16:03 np0005588920 systemd: Reached target Path Units.
Jan 20 08:16:03 np0005588920 systemd: Reached target rpc_pipefs.target.
Jan 20 08:16:03 np0005588920 systemd: Reached target Slice Units.
Jan 20 08:16:03 np0005588920 systemd: Reached target Swaps.
Jan 20 08:16:03 np0005588920 systemd: Reached target Local Verity Protected Volumes.
Jan 20 08:16:03 np0005588920 systemd: Listening on RPCbind Server Activation Socket.
Jan 20 08:16:03 np0005588920 systemd: Reached target RPC Port Mapper.
Jan 20 08:16:03 np0005588920 systemd: Listening on Process Core Dump Socket.
Jan 20 08:16:03 np0005588920 systemd: Listening on initctl Compatibility Named Pipe.
Jan 20 08:16:03 np0005588920 systemd: Listening on udev Control Socket.
Jan 20 08:16:03 np0005588920 systemd: Listening on udev Kernel Socket.
Jan 20 08:16:03 np0005588920 systemd: Mounting Huge Pages File System...
Jan 20 08:16:03 np0005588920 systemd: Mounting POSIX Message Queue File System...
Jan 20 08:16:03 np0005588920 systemd: Mounting Kernel Debug File System...
Jan 20 08:16:03 np0005588920 systemd: Mounting Kernel Trace File System...
Jan 20 08:16:03 np0005588920 systemd: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 20 08:16:03 np0005588920 systemd: Starting Create List of Static Device Nodes...
Jan 20 08:16:03 np0005588920 systemd: Starting Load Kernel Module configfs...
Jan 20 08:16:03 np0005588920 systemd: Starting Load Kernel Module drm...
Jan 20 08:16:03 np0005588920 systemd: Starting Load Kernel Module efi_pstore...
Jan 20 08:16:03 np0005588920 systemd: Starting Load Kernel Module fuse...
Jan 20 08:16:03 np0005588920 systemd: Starting Read and set NIS domainname from /etc/sysconfig/network...
Jan 20 08:16:03 np0005588920 systemd: systemd-fsck-root.service: Deactivated successfully.
Jan 20 08:16:03 np0005588920 systemd: Stopped File System Check on Root Device.
Jan 20 08:16:03 np0005588920 systemd: Stopped Journal Service.
Jan 20 08:16:03 np0005588920 systemd: Starting Journal Service...
Jan 20 08:16:03 np0005588920 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 20 08:16:03 np0005588920 systemd: Starting Generate network units from Kernel command line...
Jan 20 08:16:03 np0005588920 systemd: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 20 08:16:03 np0005588920 systemd: Starting Remount Root and Kernel File Systems...
Jan 20 08:16:03 np0005588920 systemd: Repartition Root Disk was skipped because no trigger condition checks were met.
Jan 20 08:16:03 np0005588920 systemd: Starting Apply Kernel Variables...
Jan 20 08:16:03 np0005588920 kernel: fuse: init (API version 7.37)
Jan 20 08:16:03 np0005588920 systemd: Starting Coldplug All udev Devices...
Jan 20 08:16:03 np0005588920 kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Jan 20 08:16:03 np0005588920 systemd: Mounted Huge Pages File System.
Jan 20 08:16:03 np0005588920 systemd-journald[675]: Journal started
Jan 20 08:16:03 np0005588920 systemd-journald[675]: Runtime Journal (/run/log/journal/85ac68c10a6e7ae08ceb898dbdca0cb5) is 8.0M, max 153.6M, 145.6M free.
Jan 20 08:16:03 np0005588920 systemd[1]: Queued start job for default target Multi-User System.
Jan 20 08:16:03 np0005588920 systemd[1]: systemd-journald.service: Deactivated successfully.
Jan 20 08:16:03 np0005588920 systemd: Started Journal Service.
Jan 20 08:16:03 np0005588920 systemd[1]: Mounted POSIX Message Queue File System.
Jan 20 08:16:03 np0005588920 systemd[1]: Mounted Kernel Debug File System.
Jan 20 08:16:03 np0005588920 systemd[1]: Mounted Kernel Trace File System.
Jan 20 08:16:03 np0005588920 systemd[1]: Finished Create List of Static Device Nodes.
Jan 20 08:16:03 np0005588920 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 20 08:16:03 np0005588920 systemd[1]: Finished Load Kernel Module configfs.
Jan 20 08:16:03 np0005588920 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jan 20 08:16:03 np0005588920 systemd[1]: Finished Load Kernel Module efi_pstore.
Jan 20 08:16:03 np0005588920 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Jan 20 08:16:03 np0005588920 systemd[1]: Finished Load Kernel Module fuse.
Jan 20 08:16:03 np0005588920 systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Jan 20 08:16:03 np0005588920 systemd[1]: Finished Generate network units from Kernel command line.
Jan 20 08:16:03 np0005588920 systemd[1]: Finished Remount Root and Kernel File Systems.
Jan 20 08:16:03 np0005588920 systemd[1]: Finished Apply Kernel Variables.
Jan 20 08:16:03 np0005588920 systemd[1]: Mounting FUSE Control File System...
Jan 20 08:16:03 np0005588920 systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 20 08:16:03 np0005588920 systemd[1]: Starting Rebuild Hardware Database...
Jan 20 08:16:03 np0005588920 systemd[1]: Starting Flush Journal to Persistent Storage...
Jan 20 08:16:03 np0005588920 kernel: ACPI: bus type drm_connector registered
Jan 20 08:16:03 np0005588920 systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jan 20 08:16:03 np0005588920 systemd[1]: Starting Load/Save OS Random Seed...
Jan 20 08:16:03 np0005588920 systemd[1]: Starting Create System Users...
Jan 20 08:16:03 np0005588920 systemd-journald[675]: Runtime Journal (/run/log/journal/85ac68c10a6e7ae08ceb898dbdca0cb5) is 8.0M, max 153.6M, 145.6M free.
Jan 20 08:16:03 np0005588920 systemd-journald[675]: Received client request to flush runtime journal.
Jan 20 08:16:03 np0005588920 systemd[1]: modprobe@drm.service: Deactivated successfully.
Jan 20 08:16:03 np0005588920 systemd[1]: Finished Load Kernel Module drm.
Jan 20 08:16:03 np0005588920 systemd[1]: Mounted FUSE Control File System.
Jan 20 08:16:03 np0005588920 systemd[1]: Finished Flush Journal to Persistent Storage.
Jan 20 08:16:03 np0005588920 systemd[1]: Finished Load/Save OS Random Seed.
Jan 20 08:16:03 np0005588920 systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 20 08:16:03 np0005588920 systemd[1]: Finished Create System Users.
Jan 20 08:16:03 np0005588920 systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 20 08:16:03 np0005588920 systemd[1]: Finished Coldplug All udev Devices.
Jan 20 08:16:03 np0005588920 systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 20 08:16:03 np0005588920 systemd[1]: Reached target Preparation for Local File Systems.
Jan 20 08:16:03 np0005588920 systemd[1]: Reached target Local File Systems.
Jan 20 08:16:03 np0005588920 systemd[1]: Starting Rebuild Dynamic Linker Cache...
Jan 20 08:16:03 np0005588920 systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Jan 20 08:16:03 np0005588920 systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jan 20 08:16:03 np0005588920 systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Jan 20 08:16:03 np0005588920 systemd[1]: Starting Automatic Boot Loader Update...
Jan 20 08:16:03 np0005588920 systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Jan 20 08:16:03 np0005588920 systemd[1]: Starting Create Volatile Files and Directories...
Jan 20 08:16:03 np0005588920 bootctl[692]: Couldn't find EFI system partition, skipping.
Jan 20 08:16:03 np0005588920 systemd[1]: Finished Automatic Boot Loader Update.
Jan 20 08:16:03 np0005588920 systemd[1]: Finished Create Volatile Files and Directories.
Jan 20 08:16:03 np0005588920 systemd[1]: Starting Security Auditing Service...
Jan 20 08:16:03 np0005588920 systemd[1]: Starting RPC Bind...
Jan 20 08:16:03 np0005588920 systemd[1]: Starting Rebuild Journal Catalog...
Jan 20 08:16:03 np0005588920 auditd[698]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Jan 20 08:16:03 np0005588920 auditd[698]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Jan 20 08:16:03 np0005588920 systemd[1]: Started RPC Bind.
Jan 20 08:16:03 np0005588920 systemd[1]: Finished Rebuild Journal Catalog.
Jan 20 08:16:03 np0005588920 augenrules[703]: /sbin/augenrules: No change
Jan 20 08:16:03 np0005588920 augenrules[719]: No rules
Jan 20 08:16:03 np0005588920 augenrules[719]: enabled 1
Jan 20 08:16:03 np0005588920 augenrules[719]: failure 1
Jan 20 08:16:03 np0005588920 augenrules[719]: pid 698
Jan 20 08:16:03 np0005588920 augenrules[719]: rate_limit 0
Jan 20 08:16:03 np0005588920 augenrules[719]: backlog_limit 8192
Jan 20 08:16:03 np0005588920 augenrules[719]: lost 0
Jan 20 08:16:03 np0005588920 augenrules[719]: backlog 0
Jan 20 08:16:03 np0005588920 augenrules[719]: backlog_wait_time 60000
Jan 20 08:16:03 np0005588920 augenrules[719]: backlog_wait_time_actual 0
Jan 20 08:16:03 np0005588920 systemd[1]: Started Security Auditing Service.
Jan 20 08:16:03 np0005588920 systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Jan 20 08:16:03 np0005588920 systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Jan 20 08:16:03 np0005588920 systemd[1]: Finished Rebuild Dynamic Linker Cache.
Jan 20 08:16:03 np0005588920 systemd[1]: Finished Rebuild Hardware Database.
Jan 20 08:16:03 np0005588920 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 20 08:16:03 np0005588920 systemd[1]: Starting Update is Completed...
Jan 20 08:16:03 np0005588920 systemd[1]: Finished Update is Completed.
Jan 20 08:16:03 np0005588920 systemd-udevd[727]: Using default interface naming scheme 'rhel-9.0'.
Jan 20 08:16:03 np0005588920 systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 20 08:16:03 np0005588920 systemd[1]: Reached target System Initialization.
Jan 20 08:16:03 np0005588920 systemd[1]: Started dnf makecache --timer.
Jan 20 08:16:03 np0005588920 systemd[1]: Started Daily rotation of log files.
Jan 20 08:16:03 np0005588920 systemd[1]: Started Daily Cleanup of Temporary Directories.
Jan 20 08:16:03 np0005588920 systemd[1]: Reached target Timer Units.
Jan 20 08:16:03 np0005588920 systemd[1]: Listening on D-Bus System Message Bus Socket.
Jan 20 08:16:03 np0005588920 systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Jan 20 08:16:03 np0005588920 systemd[1]: Reached target Socket Units.
Jan 20 08:16:03 np0005588920 systemd[1]: Starting D-Bus System Message Bus...
Jan 20 08:16:03 np0005588920 systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 20 08:16:03 np0005588920 systemd[1]: Starting Load Kernel Module configfs...
Jan 20 08:16:03 np0005588920 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 20 08:16:03 np0005588920 systemd[1]: Finished Load Kernel Module configfs.
Jan 20 08:16:03 np0005588920 systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Jan 20 08:16:03 np0005588920 systemd[1]: Started D-Bus System Message Bus.
Jan 20 08:16:03 np0005588920 systemd[1]: Reached target Basic System.
Jan 20 08:16:03 np0005588920 dbus-broker-lau[735]: Ready
Jan 20 08:16:03 np0005588920 systemd-udevd[744]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 08:16:03 np0005588920 systemd[1]: Starting NTP client/server...
Jan 20 08:16:03 np0005588920 systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Jan 20 08:16:03 np0005588920 systemd[1]: Starting Restore /run/initramfs on shutdown...
Jan 20 08:16:03 np0005588920 systemd[1]: Starting IPv4 firewall with iptables...
Jan 20 08:16:03 np0005588920 systemd[1]: Started irqbalance daemon.
Jan 20 08:16:04 np0005588920 systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Jan 20 08:16:04 np0005588920 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 20 08:16:04 np0005588920 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 20 08:16:04 np0005588920 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 20 08:16:04 np0005588920 systemd[1]: Reached target sshd-keygen.target.
Jan 20 08:16:04 np0005588920 systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Jan 20 08:16:04 np0005588920 systemd[1]: Reached target User and Group Name Lookups.
Jan 20 08:16:04 np0005588920 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Jan 20 08:16:04 np0005588920 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Jan 20 08:16:04 np0005588920 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Jan 20 08:16:04 np0005588920 systemd[1]: Starting User Login Management...
Jan 20 08:16:04 np0005588920 systemd[1]: Finished Restore /run/initramfs on shutdown.
Jan 20 08:16:04 np0005588920 kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Jan 20 08:16:04 np0005588920 chronyd[790]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 20 08:16:04 np0005588920 chronyd[790]: Loaded 0 symmetric keys
Jan 20 08:16:04 np0005588920 chronyd[790]: Using right/UTC timezone to obtain leap second data
Jan 20 08:16:04 np0005588920 chronyd[790]: Loaded seccomp filter (level 2)
Jan 20 08:16:04 np0005588920 systemd[1]: Started NTP client/server.
Jan 20 08:16:04 np0005588920 systemd-logind[783]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 20 08:16:04 np0005588920 systemd-logind[783]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 20 08:16:04 np0005588920 systemd-logind[783]: New seat seat0.
Jan 20 08:16:04 np0005588920 systemd[1]: Started User Login Management.
Jan 20 08:16:04 np0005588920 kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Jan 20 08:16:04 np0005588920 kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Jan 20 08:16:04 np0005588920 kernel: kvm_amd: TSC scaling supported
Jan 20 08:16:04 np0005588920 kernel: kvm_amd: Nested Virtualization enabled
Jan 20 08:16:04 np0005588920 kernel: kvm_amd: Nested Paging enabled
Jan 20 08:16:04 np0005588920 kernel: kvm_amd: LBR virtualization supported
Jan 20 08:16:04 np0005588920 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Jan 20 08:16:04 np0005588920 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Jan 20 08:16:04 np0005588920 kernel: Console: switching to colour dummy device 80x25
Jan 20 08:16:04 np0005588920 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Jan 20 08:16:04 np0005588920 kernel: [drm] features: -context_init
Jan 20 08:16:04 np0005588920 kernel: [drm] number of scanouts: 1
Jan 20 08:16:04 np0005588920 kernel: [drm] number of cap sets: 0
Jan 20 08:16:04 np0005588920 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Jan 20 08:16:04 np0005588920 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Jan 20 08:16:04 np0005588920 kernel: Console: switching to colour frame buffer device 128x48
Jan 20 08:16:04 np0005588920 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Jan 20 08:16:04 np0005588920 iptables.init[775]: iptables: Applying firewall rules: [  OK  ]
Jan 20 08:16:04 np0005588920 systemd[1]: Finished IPv4 firewall with iptables.
Jan 20 08:16:04 np0005588920 cloud-init[838]: Cloud-init v. 24.4-8.el9 running 'init-local' at Tue, 20 Jan 2026 13:16:04 +0000. Up 6.47 seconds.
Jan 20 08:16:04 np0005588920 systemd[1]: run-cloud\x2dinit-tmp-tmpfrbkwjlg.mount: Deactivated successfully.
Jan 20 08:16:04 np0005588920 systemd[1]: Starting Hostname Service...
Jan 20 08:16:04 np0005588920 systemd[1]: Started Hostname Service.
Jan 20 08:16:04 np0005588920 systemd-hostnamed[852]: Hostname set to <np0005588920.novalocal> (static)
Jan 20 08:16:04 np0005588920 systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Jan 20 08:16:04 np0005588920 systemd[1]: Reached target Preparation for Network.
Jan 20 08:16:04 np0005588920 systemd[1]: Starting Network Manager...
Jan 20 08:16:05 np0005588920 NetworkManager[856]: <info>  [1768914965.0417] NetworkManager (version 1.54.3-2.el9) is starting... (boot:2aa8a071-ad9f-49e1-8122-1241f1c0c9d5)
Jan 20 08:16:05 np0005588920 NetworkManager[856]: <info>  [1768914965.0421] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 20 08:16:05 np0005588920 NetworkManager[856]: <info>  [1768914965.0490] manager[0x561966946000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 20 08:16:05 np0005588920 NetworkManager[856]: <info>  [1768914965.0535] hostname: hostname: using hostnamed
Jan 20 08:16:05 np0005588920 NetworkManager[856]: <info>  [1768914965.0535] hostname: static hostname changed from (none) to "np0005588920.novalocal"
Jan 20 08:16:05 np0005588920 NetworkManager[856]: <info>  [1768914965.0540] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 20 08:16:05 np0005588920 NetworkManager[856]: <info>  [1768914965.0643] manager[0x561966946000]: rfkill: Wi-Fi hardware radio set enabled
Jan 20 08:16:05 np0005588920 NetworkManager[856]: <info>  [1768914965.0644] manager[0x561966946000]: rfkill: WWAN hardware radio set enabled
Jan 20 08:16:05 np0005588920 NetworkManager[856]: <info>  [1768914965.0679] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 20 08:16:05 np0005588920 NetworkManager[856]: <info>  [1768914965.0698] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 20 08:16:05 np0005588920 NetworkManager[856]: <info>  [1768914965.0700] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 20 08:16:05 np0005588920 NetworkManager[856]: <info>  [1768914965.0700] manager: Networking is enabled by state file
Jan 20 08:16:05 np0005588920 NetworkManager[856]: <info>  [1768914965.0702] settings: Loaded settings plugin: keyfile (internal)
Jan 20 08:16:05 np0005588920 NetworkManager[856]: <info>  [1768914965.0714] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 20 08:16:05 np0005588920 NetworkManager[856]: <info>  [1768914965.0735] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 20 08:16:05 np0005588920 NetworkManager[856]: <info>  [1768914965.0744] dhcp: init: Using DHCP client 'internal'
Jan 20 08:16:05 np0005588920 NetworkManager[856]: <info>  [1768914965.0746] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 20 08:16:05 np0005588920 systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Jan 20 08:16:05 np0005588920 NetworkManager[856]: <info>  [1768914965.0758] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 08:16:05 np0005588920 NetworkManager[856]: <info>  [1768914965.0773] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 20 08:16:05 np0005588920 NetworkManager[856]: <info>  [1768914965.0782] device (lo): Activation: starting connection 'lo' (31acf0d1-56f7-43b5-844e-a0d1773d997d)
Jan 20 08:16:05 np0005588920 NetworkManager[856]: <info>  [1768914965.0790] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 20 08:16:05 np0005588920 NetworkManager[856]: <info>  [1768914965.0793] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 20 08:16:05 np0005588920 NetworkManager[856]: <info>  [1768914965.0825] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 20 08:16:05 np0005588920 NetworkManager[856]: <info>  [1768914965.0830] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 20 08:16:05 np0005588920 NetworkManager[856]: <info>  [1768914965.0833] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 20 08:16:05 np0005588920 NetworkManager[856]: <info>  [1768914965.0836] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 20 08:16:05 np0005588920 NetworkManager[856]: <info>  [1768914965.0838] device (eth0): carrier: link connected
Jan 20 08:16:05 np0005588920 NetworkManager[856]: <info>  [1768914965.0843] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 20 08:16:05 np0005588920 NetworkManager[856]: <info>  [1768914965.0850] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 20 08:16:05 np0005588920 NetworkManager[856]: <info>  [1768914965.0857] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 20 08:16:05 np0005588920 NetworkManager[856]: <info>  [1768914965.0862] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 20 08:16:05 np0005588920 NetworkManager[856]: <info>  [1768914965.0862] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 20 08:16:05 np0005588920 NetworkManager[856]: <info>  [1768914965.0865] manager: NetworkManager state is now CONNECTING
Jan 20 08:16:05 np0005588920 NetworkManager[856]: <info>  [1768914965.0867] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 20 08:16:05 np0005588920 NetworkManager[856]: <info>  [1768914965.0875] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 20 08:16:05 np0005588920 NetworkManager[856]: <info>  [1768914965.0879] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 20 08:16:05 np0005588920 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 20 08:16:05 np0005588920 systemd[1]: Started Network Manager.
Jan 20 08:16:05 np0005588920 systemd[1]: Reached target Network.
Jan 20 08:16:05 np0005588920 systemd[1]: Starting Network Manager Wait Online...
Jan 20 08:16:05 np0005588920 systemd[1]: Starting GSSAPI Proxy Daemon...
Jan 20 08:16:05 np0005588920 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 20 08:16:05 np0005588920 NetworkManager[856]: <info>  [1768914965.1126] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 20 08:16:05 np0005588920 NetworkManager[856]: <info>  [1768914965.1131] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 20 08:16:05 np0005588920 NetworkManager[856]: <info>  [1768914965.1139] device (lo): Activation: successful, device activated.
Jan 20 08:16:05 np0005588920 systemd[1]: Started GSSAPI Proxy Daemon.
Jan 20 08:16:05 np0005588920 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 20 08:16:05 np0005588920 systemd[1]: Reached target NFS client services.
Jan 20 08:16:05 np0005588920 systemd[1]: Reached target Preparation for Remote File Systems.
Jan 20 08:16:05 np0005588920 systemd[1]: Reached target Remote File Systems.
Jan 20 08:16:05 np0005588920 systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 20 08:16:06 np0005588920 NetworkManager[856]: <info>  [1768914966.9547] dhcp4 (eth0): state changed new lease, address=38.102.83.30
Jan 20 08:16:06 np0005588920 NetworkManager[856]: <info>  [1768914966.9558] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 20 08:16:06 np0005588920 NetworkManager[856]: <info>  [1768914966.9582] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 20 08:16:06 np0005588920 NetworkManager[856]: <info>  [1768914966.9618] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 20 08:16:06 np0005588920 NetworkManager[856]: <info>  [1768914966.9620] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 20 08:16:06 np0005588920 NetworkManager[856]: <info>  [1768914966.9623] manager: NetworkManager state is now CONNECTED_SITE
Jan 20 08:16:06 np0005588920 NetworkManager[856]: <info>  [1768914966.9625] device (eth0): Activation: successful, device activated.
Jan 20 08:16:06 np0005588920 NetworkManager[856]: <info>  [1768914966.9630] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 20 08:16:06 np0005588920 NetworkManager[856]: <info>  [1768914966.9633] manager: startup complete
Jan 20 08:16:06 np0005588920 systemd[1]: Finished Network Manager Wait Online.
Jan 20 08:16:07 np0005588920 systemd[1]: Starting Cloud-init: Network Stage...
Jan 20 08:16:07 np0005588920 cloud-init[919]: Cloud-init v. 24.4-8.el9 running 'init' at Tue, 20 Jan 2026 13:16:07 +0000. Up 9.30 seconds.
Jan 20 08:16:07 np0005588920 cloud-init[919]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Jan 20 08:16:07 np0005588920 cloud-init[919]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 20 08:16:07 np0005588920 cloud-init[919]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Jan 20 08:16:07 np0005588920 cloud-init[919]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 20 08:16:07 np0005588920 cloud-init[919]: ci-info: |  eth0  | True |         38.102.83.30         | 255.255.255.0 | global | fa:16:3e:48:28:5e |
Jan 20 08:16:07 np0005588920 cloud-init[919]: ci-info: |  eth0  | True | fe80::f816:3eff:fe48:285e/64 |       .       |  link  | fa:16:3e:48:28:5e |
Jan 20 08:16:07 np0005588920 cloud-init[919]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Jan 20 08:16:07 np0005588920 cloud-init[919]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Jan 20 08:16:07 np0005588920 cloud-init[919]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 20 08:16:07 np0005588920 cloud-init[919]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Jan 20 08:16:07 np0005588920 cloud-init[919]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 20 08:16:07 np0005588920 cloud-init[919]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Jan 20 08:16:07 np0005588920 cloud-init[919]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 20 08:16:07 np0005588920 cloud-init[919]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Jan 20 08:16:07 np0005588920 cloud-init[919]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Jan 20 08:16:07 np0005588920 cloud-init[919]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Jan 20 08:16:07 np0005588920 cloud-init[919]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 20 08:16:07 np0005588920 cloud-init[919]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Jan 20 08:16:07 np0005588920 cloud-init[919]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 20 08:16:07 np0005588920 cloud-init[919]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Jan 20 08:16:07 np0005588920 cloud-init[919]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 20 08:16:07 np0005588920 cloud-init[919]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Jan 20 08:16:07 np0005588920 cloud-init[919]: ci-info: |   3   |    local    |    ::   |    eth0   |   U   |
Jan 20 08:16:07 np0005588920 cloud-init[919]: ci-info: |   4   |  multicast  |    ::   |    eth0   |   U   |
Jan 20 08:16:07 np0005588920 cloud-init[919]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 20 08:16:11 np0005588920 cloud-init[919]: Generating public/private rsa key pair.
Jan 20 08:16:11 np0005588920 cloud-init[919]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Jan 20 08:16:11 np0005588920 cloud-init[919]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Jan 20 08:16:11 np0005588920 cloud-init[919]: The key fingerprint is:
Jan 20 08:16:11 np0005588920 cloud-init[919]: SHA256:94sDRMRJjs+2Hx+9frgLwc/Cxv7kB+YUku3PKGqKHUU root@np0005588920.novalocal
Jan 20 08:16:11 np0005588920 cloud-init[919]: The key's randomart image is:
Jan 20 08:16:11 np0005588920 cloud-init[919]: +---[RSA 3072]----+
Jan 20 08:16:11 np0005588920 cloud-init[919]: |       +o.       |
Jan 20 08:16:11 np0005588920 cloud-init[919]: |       o+        |
Jan 20 08:16:11 np0005588920 cloud-init[919]: |      ... E  o   |
Jan 20 08:16:11 np0005588920 cloud-init[919]: |       o.. .o o  |
Jan 20 08:16:11 np0005588920 cloud-init[919]: |       .S o oo . |
Jan 20 08:16:11 np0005588920 cloud-init[919]: |       ..+ + ==  |
Jan 20 08:16:11 np0005588920 cloud-init[919]: |        o.. O+=* |
Jan 20 08:16:11 np0005588920 cloud-init[919]: |       o oo*.B+.=|
Jan 20 08:16:11 np0005588920 cloud-init[919]: |      . oo+o++B= |
Jan 20 08:16:11 np0005588920 cloud-init[919]: +----[SHA256]-----+
Jan 20 08:16:11 np0005588920 cloud-init[919]: Generating public/private ecdsa key pair.
Jan 20 08:16:11 np0005588920 cloud-init[919]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Jan 20 08:16:11 np0005588920 cloud-init[919]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Jan 20 08:16:11 np0005588920 cloud-init[919]: The key fingerprint is:
Jan 20 08:16:11 np0005588920 cloud-init[919]: SHA256:5SPRUaLYYTfWwiidlPOGfckBtQCWgnhoKPWXmhPqjak root@np0005588920.novalocal
Jan 20 08:16:11 np0005588920 cloud-init[919]: The key's randomart image is:
Jan 20 08:16:11 np0005588920 cloud-init[919]: +---[ECDSA 256]---+
Jan 20 08:16:11 np0005588920 cloud-init[919]: | o.o . o*BO=o    |
Jan 20 08:16:11 np0005588920 cloud-init[919]: |o +.o o*B*o=o.   |
Jan 20 08:16:11 np0005588920 cloud-init[919]: |.. .o +o+=oo.o   |
Jan 20 08:16:11 np0005588920 cloud-init[919]: |   . =  .++ +    |
Jan 20 08:16:11 np0005588920 cloud-init[919]: |  . +   S.o.     |
Jan 20 08:16:11 np0005588920 cloud-init[919]: | . + .   . .     |
Jan 20 08:16:11 np0005588920 cloud-init[919]: |  + .            |
Jan 20 08:16:11 np0005588920 cloud-init[919]: | .               |
Jan 20 08:16:11 np0005588920 cloud-init[919]: |E                |
Jan 20 08:16:11 np0005588920 cloud-init[919]: +----[SHA256]-----+
Jan 20 08:16:11 np0005588920 cloud-init[919]: Generating public/private ed25519 key pair.
Jan 20 08:16:11 np0005588920 cloud-init[919]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Jan 20 08:16:11 np0005588920 cloud-init[919]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Jan 20 08:16:11 np0005588920 cloud-init[919]: The key fingerprint is:
Jan 20 08:16:11 np0005588920 cloud-init[919]: SHA256:7lDcUafa3sX4QyU0+NvwLoJTbCcmhOYyod1Wl1flZt8 root@np0005588920.novalocal
Jan 20 08:16:11 np0005588920 cloud-init[919]: The key's randomart image is:
Jan 20 08:16:11 np0005588920 cloud-init[919]: +--[ED25519 256]--+
Jan 20 08:16:11 np0005588920 cloud-init[919]: |            ..+ o|
Jan 20 08:16:11 np0005588920 cloud-init[919]: |           ..+ o.|
Jan 20 08:16:11 np0005588920 cloud-init[919]: |         .. .o..=|
Jan 20 08:16:11 np0005588920 cloud-init[919]: |      ..o.o+o +*+|
Jan 20 08:16:11 np0005588920 cloud-init[919]: |     o =Sooo...*E|
Jan 20 08:16:11 np0005588920 cloud-init[919]: |    . +o+ ..*.+oo|
Jan 20 08:16:11 np0005588920 cloud-init[919]: |      .+.  *.o.o.|
Jan 20 08:16:11 np0005588920 cloud-init[919]: |       o  o . . o|
Jan 20 08:16:11 np0005588920 cloud-init[919]: |        .  . . . |
Jan 20 08:16:11 np0005588920 cloud-init[919]: +----[SHA256]-----+
Jan 20 08:16:11 np0005588920 systemd[1]: Finished Cloud-init: Network Stage.
Jan 20 08:16:11 np0005588920 systemd[1]: Reached target Cloud-config availability.
Jan 20 08:16:11 np0005588920 sm-notify[1003]: Version 2.5.4 starting
Jan 20 08:16:11 np0005588920 systemd[1]: Reached target Network is Online.
Jan 20 08:16:11 np0005588920 systemd[1]: Starting Cloud-init: Config Stage...
Jan 20 08:16:11 np0005588920 systemd[1]: Starting Crash recovery kernel arming...
Jan 20 08:16:11 np0005588920 systemd[1]: Starting Notify NFS peers of a restart...
Jan 20 08:16:11 np0005588920 systemd[1]: Starting System Logging Service...
Jan 20 08:16:11 np0005588920 systemd[1]: Starting OpenSSH server daemon...
Jan 20 08:16:11 np0005588920 systemd[1]: Starting Permit User Sessions...
Jan 20 08:16:11 np0005588920 systemd[1]: Started Notify NFS peers of a restart.
Jan 20 08:16:11 np0005588920 systemd[1]: Finished Permit User Sessions.
Jan 20 08:16:11 np0005588920 systemd[1]: Started Command Scheduler.
Jan 20 08:16:11 np0005588920 systemd[1]: Started Getty on tty1.
Jan 20 08:16:11 np0005588920 systemd[1]: Started Serial Getty on ttyS0.
Jan 20 08:16:11 np0005588920 systemd[1]: Reached target Login Prompts.
Jan 20 08:16:11 np0005588920 systemd[1]: Started OpenSSH server daemon.
Jan 20 08:16:11 np0005588920 rsyslogd[1004]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1004" x-info="https://www.rsyslog.com"] start
Jan 20 08:16:11 np0005588920 rsyslogd[1004]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Jan 20 08:16:11 np0005588920 systemd[1]: Started System Logging Service.
Jan 20 08:16:11 np0005588920 systemd[1]: Reached target Multi-User System.
Jan 20 08:16:11 np0005588920 systemd[1]: Starting Record Runlevel Change in UTMP...
Jan 20 08:16:11 np0005588920 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Jan 20 08:16:11 np0005588920 systemd[1]: Finished Record Runlevel Change in UTMP.
Jan 20 08:16:11 np0005588920 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 20 08:16:11 np0005588920 kdumpctl[1017]: kdump: No kdump initial ramdisk found.
Jan 20 08:16:11 np0005588920 kdumpctl[1017]: kdump: Rebuilding /boot/initramfs-5.14.0-661.el9.x86_64kdump.img
Jan 20 08:16:12 np0005588920 cloud-init[1131]: Cloud-init v. 24.4-8.el9 running 'modules:config' at Tue, 20 Jan 2026 13:16:12 +0000. Up 14.02 seconds.
Jan 20 08:16:12 np0005588920 systemd[1]: Finished Cloud-init: Config Stage.
Jan 20 08:16:12 np0005588920 systemd[1]: Starting Cloud-init: Final Stage...
Jan 20 08:16:12 np0005588920 dracut[1265]: dracut-057-102.git20250818.el9
Jan 20 08:16:12 np0005588920 cloud-init[1282]: Cloud-init v. 24.4-8.el9 running 'modules:final' at Tue, 20 Jan 2026 13:16:12 +0000. Up 14.40 seconds.
Jan 20 08:16:12 np0005588920 cloud-init[1285]: #############################################################
Jan 20 08:16:12 np0005588920 cloud-init[1286]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Jan 20 08:16:12 np0005588920 cloud-init[1289]: 256 SHA256:5SPRUaLYYTfWwiidlPOGfckBtQCWgnhoKPWXmhPqjak root@np0005588920.novalocal (ECDSA)
Jan 20 08:16:12 np0005588920 cloud-init[1294]: 256 SHA256:7lDcUafa3sX4QyU0+NvwLoJTbCcmhOYyod1Wl1flZt8 root@np0005588920.novalocal (ED25519)
Jan 20 08:16:12 np0005588920 cloud-init[1299]: 3072 SHA256:94sDRMRJjs+2Hx+9frgLwc/Cxv7kB+YUku3PKGqKHUU root@np0005588920.novalocal (RSA)
Jan 20 08:16:12 np0005588920 cloud-init[1305]: -----END SSH HOST KEY FINGERPRINTS-----
Jan 20 08:16:12 np0005588920 cloud-init[1307]: #############################################################
Jan 20 08:16:12 np0005588920 dracut[1268]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-661.el9.x86_64kdump.img 5.14.0-661.el9.x86_64
Jan 20 08:16:12 np0005588920 cloud-init[1282]: Cloud-init v. 24.4-8.el9 finished at Tue, 20 Jan 2026 13:16:12 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 14.58 seconds
Jan 20 08:16:12 np0005588920 systemd[1]: Finished Cloud-init: Final Stage.
Jan 20 08:16:12 np0005588920 systemd[1]: Reached target Cloud-init target.
Jan 20 08:16:12 np0005588920 chronyd[790]: Selected source 149.56.19.163 (2.centos.pool.ntp.org)
Jan 20 08:16:12 np0005588920 chronyd[790]: System clock TAI offset set to 37 seconds
Jan 20 08:16:13 np0005588920 dracut[1268]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Jan 20 08:16:13 np0005588920 dracut[1268]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Jan 20 08:16:13 np0005588920 dracut[1268]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Jan 20 08:16:13 np0005588920 dracut[1268]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 20 08:16:13 np0005588920 dracut[1268]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 20 08:16:13 np0005588920 dracut[1268]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 20 08:16:13 np0005588920 dracut[1268]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 20 08:16:13 np0005588920 dracut[1268]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 20 08:16:13 np0005588920 dracut[1268]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 20 08:16:13 np0005588920 dracut[1268]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 20 08:16:13 np0005588920 dracut[1268]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 20 08:16:13 np0005588920 dracut[1268]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 20 08:16:13 np0005588920 dracut[1268]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 20 08:16:13 np0005588920 dracut[1268]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 20 08:16:13 np0005588920 dracut[1268]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 20 08:16:13 np0005588920 dracut[1268]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 20 08:16:13 np0005588920 dracut[1268]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 20 08:16:13 np0005588920 dracut[1268]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 20 08:16:13 np0005588920 dracut[1268]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 20 08:16:13 np0005588920 dracut[1268]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 20 08:16:13 np0005588920 dracut[1268]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 20 08:16:13 np0005588920 dracut[1268]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 20 08:16:13 np0005588920 dracut[1268]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 20 08:16:13 np0005588920 dracut[1268]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 20 08:16:13 np0005588920 dracut[1268]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 20 08:16:13 np0005588920 dracut[1268]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 20 08:16:13 np0005588920 dracut[1268]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 20 08:16:13 np0005588920 dracut[1268]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 20 08:16:13 np0005588920 dracut[1268]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Jan 20 08:16:13 np0005588920 dracut[1268]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 20 08:16:13 np0005588920 dracut[1268]: memstrack is not available
Jan 20 08:16:13 np0005588920 dracut[1268]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 20 08:16:13 np0005588920 dracut[1268]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 20 08:16:13 np0005588920 dracut[1268]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 20 08:16:13 np0005588920 dracut[1268]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 20 08:16:13 np0005588920 dracut[1268]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 20 08:16:13 np0005588920 dracut[1268]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 20 08:16:13 np0005588920 dracut[1268]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 20 08:16:13 np0005588920 dracut[1268]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 20 08:16:13 np0005588920 dracut[1268]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 20 08:16:13 np0005588920 dracut[1268]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 20 08:16:13 np0005588920 dracut[1268]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 20 08:16:13 np0005588920 dracut[1268]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 20 08:16:13 np0005588920 dracut[1268]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 20 08:16:13 np0005588920 dracut[1268]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 20 08:16:13 np0005588920 dracut[1268]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 20 08:16:13 np0005588920 dracut[1268]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 20 08:16:13 np0005588920 dracut[1268]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 20 08:16:13 np0005588920 dracut[1268]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 20 08:16:13 np0005588920 dracut[1268]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 20 08:16:13 np0005588920 dracut[1268]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 20 08:16:13 np0005588920 dracut[1268]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 20 08:16:13 np0005588920 dracut[1268]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 20 08:16:14 np0005588920 dracut[1268]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 20 08:16:14 np0005588920 dracut[1268]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 20 08:16:14 np0005588920 dracut[1268]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 20 08:16:14 np0005588920 irqbalance[779]: Cannot change IRQ 25 affinity: Operation not permitted
Jan 20 08:16:14 np0005588920 irqbalance[779]: IRQ 25 affinity is now unmanaged
Jan 20 08:16:14 np0005588920 irqbalance[779]: Cannot change IRQ 31 affinity: Operation not permitted
Jan 20 08:16:14 np0005588920 irqbalance[779]: IRQ 31 affinity is now unmanaged
Jan 20 08:16:14 np0005588920 irqbalance[779]: Cannot change IRQ 28 affinity: Operation not permitted
Jan 20 08:16:14 np0005588920 irqbalance[779]: IRQ 28 affinity is now unmanaged
Jan 20 08:16:14 np0005588920 irqbalance[779]: Cannot change IRQ 32 affinity: Operation not permitted
Jan 20 08:16:14 np0005588920 irqbalance[779]: IRQ 32 affinity is now unmanaged
Jan 20 08:16:14 np0005588920 irqbalance[779]: Cannot change IRQ 30 affinity: Operation not permitted
Jan 20 08:16:14 np0005588920 irqbalance[779]: IRQ 30 affinity is now unmanaged
Jan 20 08:16:14 np0005588920 irqbalance[779]: Cannot change IRQ 29 affinity: Operation not permitted
Jan 20 08:16:14 np0005588920 irqbalance[779]: IRQ 29 affinity is now unmanaged
Jan 20 08:16:14 np0005588920 dracut[1268]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 20 08:16:14 np0005588920 dracut[1268]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 20 08:16:14 np0005588920 dracut[1268]: memstrack is not available
Jan 20 08:16:14 np0005588920 dracut[1268]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 20 08:16:14 np0005588920 dracut[1268]: *** Including module: systemd ***
Jan 20 08:16:14 np0005588920 dracut[1268]: *** Including module: fips ***
Jan 20 08:16:14 np0005588920 dracut[1268]: *** Including module: systemd-initrd ***
Jan 20 08:16:14 np0005588920 dracut[1268]: *** Including module: i18n ***
Jan 20 08:16:15 np0005588920 dracut[1268]: *** Including module: drm ***
Jan 20 08:16:15 np0005588920 dracut[1268]: *** Including module: prefixdevname ***
Jan 20 08:16:15 np0005588920 dracut[1268]: *** Including module: kernel-modules ***
Jan 20 08:16:15 np0005588920 kernel: block vda: the capability attribute has been deprecated.
Jan 20 08:16:16 np0005588920 dracut[1268]: *** Including module: kernel-modules-extra ***
Jan 20 08:16:16 np0005588920 dracut[1268]: *** Including module: qemu ***
Jan 20 08:16:16 np0005588920 dracut[1268]: *** Including module: fstab-sys ***
Jan 20 08:16:16 np0005588920 dracut[1268]: *** Including module: rootfs-block ***
Jan 20 08:16:16 np0005588920 dracut[1268]: *** Including module: terminfo ***
Jan 20 08:16:16 np0005588920 dracut[1268]: *** Including module: udev-rules ***
Jan 20 08:16:17 np0005588920 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 20 08:16:17 np0005588920 dracut[1268]: Skipping udev rule: 91-permissions.rules
Jan 20 08:16:17 np0005588920 dracut[1268]: Skipping udev rule: 80-drivers-modprobe.rules
Jan 20 08:16:17 np0005588920 dracut[1268]: *** Including module: virtiofs ***
Jan 20 08:16:17 np0005588920 dracut[1268]: *** Including module: dracut-systemd ***
Jan 20 08:16:17 np0005588920 dracut[1268]: *** Including module: usrmount ***
Jan 20 08:16:17 np0005588920 dracut[1268]: *** Including module: base ***
Jan 20 08:16:17 np0005588920 dracut[1268]: *** Including module: fs-lib ***
Jan 20 08:16:17 np0005588920 dracut[1268]: *** Including module: kdumpbase ***
Jan 20 08:16:18 np0005588920 dracut[1268]: *** Including module: microcode_ctl-fw_dir_override ***
Jan 20 08:16:18 np0005588920 dracut[1268]:  microcode_ctl module: mangling fw_dir
Jan 20 08:16:18 np0005588920 dracut[1268]:    microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Jan 20 08:16:18 np0005588920 dracut[1268]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Jan 20 08:16:18 np0005588920 dracut[1268]:    microcode_ctl: configuration "intel" is ignored
Jan 20 08:16:18 np0005588920 dracut[1268]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Jan 20 08:16:18 np0005588920 dracut[1268]:    microcode_ctl: configuration "intel-06-2d-07" is ignored
Jan 20 08:16:18 np0005588920 dracut[1268]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Jan 20 08:16:18 np0005588920 dracut[1268]:    microcode_ctl: configuration "intel-06-4e-03" is ignored
Jan 20 08:16:18 np0005588920 dracut[1268]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Jan 20 08:16:18 np0005588920 dracut[1268]:    microcode_ctl: configuration "intel-06-4f-01" is ignored
Jan 20 08:16:18 np0005588920 dracut[1268]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Jan 20 08:16:18 np0005588920 dracut[1268]:    microcode_ctl: configuration "intel-06-55-04" is ignored
Jan 20 08:16:18 np0005588920 dracut[1268]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Jan 20 08:16:18 np0005588920 dracut[1268]:    microcode_ctl: configuration "intel-06-5e-03" is ignored
Jan 20 08:16:18 np0005588920 dracut[1268]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Jan 20 08:16:18 np0005588920 dracut[1268]:    microcode_ctl: configuration "intel-06-8c-01" is ignored
Jan 20 08:16:18 np0005588920 dracut[1268]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Jan 20 08:16:18 np0005588920 dracut[1268]:    microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Jan 20 08:16:18 np0005588920 dracut[1268]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Jan 20 08:16:18 np0005588920 dracut[1268]:    microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Jan 20 08:16:18 np0005588920 dracut[1268]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Jan 20 08:16:18 np0005588920 dracut[1268]:    microcode_ctl: configuration "intel-06-8f-08" is ignored
Jan 20 08:16:18 np0005588920 dracut[1268]:    microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Jan 20 08:16:18 np0005588920 dracut[1268]: *** Including module: openssl ***
Jan 20 08:16:18 np0005588920 dracut[1268]: *** Including module: shutdown ***
Jan 20 08:16:18 np0005588920 dracut[1268]: *** Including module: squash ***
Jan 20 08:16:18 np0005588920 dracut[1268]: *** Including modules done ***
Jan 20 08:16:18 np0005588920 dracut[1268]: *** Installing kernel module dependencies ***
Jan 20 08:16:19 np0005588920 dracut[1268]: *** Installing kernel module dependencies done ***
Jan 20 08:16:19 np0005588920 dracut[1268]: *** Resolving executable dependencies ***
Jan 20 08:16:21 np0005588920 dracut[1268]: *** Resolving executable dependencies done ***
Jan 20 08:16:21 np0005588920 dracut[1268]: *** Generating early-microcode cpio image ***
Jan 20 08:16:21 np0005588920 dracut[1268]: *** Store current command line parameters ***
Jan 20 08:16:21 np0005588920 dracut[1268]: Stored kernel commandline:
Jan 20 08:16:21 np0005588920 dracut[1268]: No dracut internal kernel commandline stored in the initramfs
Jan 20 08:16:21 np0005588920 dracut[1268]: *** Install squash loader ***
Jan 20 08:16:22 np0005588920 dracut[1268]: *** Squashing the files inside the initramfs ***
Jan 20 08:16:24 np0005588920 dracut[1268]: *** Squashing the files inside the initramfs done ***
Jan 20 08:16:24 np0005588920 dracut[1268]: *** Creating image file '/boot/initramfs-5.14.0-661.el9.x86_64kdump.img' ***
Jan 20 08:16:24 np0005588920 dracut[1268]: *** Hardlinking files ***
Jan 20 08:16:24 np0005588920 dracut[1268]: *** Hardlinking files done ***
Jan 20 08:16:24 np0005588920 dracut[1268]: *** Creating initramfs image file '/boot/initramfs-5.14.0-661.el9.x86_64kdump.img' done ***
Jan 20 08:16:25 np0005588920 kdumpctl[1017]: kdump: kexec: loaded kdump kernel
Jan 20 08:16:25 np0005588920 kdumpctl[1017]: kdump: Starting kdump: [OK]
Jan 20 08:16:25 np0005588920 systemd[1]: Finished Crash recovery kernel arming.
Jan 20 08:16:25 np0005588920 systemd[1]: Startup finished in 1.890s (kernel) + 2.650s (initrd) + 23.125s (userspace) = 27.666s.
Jan 20 08:16:35 np0005588920 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 20 08:16:35 np0005588920 systemd[1]: Created slice User Slice of UID 1000.
Jan 20 08:16:35 np0005588920 systemd[1]: Starting User Runtime Directory /run/user/1000...
Jan 20 08:16:35 np0005588920 systemd-logind[783]: New session 1 of user zuul.
Jan 20 08:16:35 np0005588920 systemd[1]: Finished User Runtime Directory /run/user/1000.
Jan 20 08:16:35 np0005588920 systemd[1]: Starting User Manager for UID 1000...
Jan 20 08:16:35 np0005588920 systemd[4309]: Queued start job for default target Main User Target.
Jan 20 08:16:35 np0005588920 systemd[4309]: Created slice User Application Slice.
Jan 20 08:16:35 np0005588920 systemd[4309]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 20 08:16:35 np0005588920 systemd[4309]: Started Daily Cleanup of User's Temporary Directories.
Jan 20 08:16:35 np0005588920 systemd[4309]: Reached target Paths.
Jan 20 08:16:35 np0005588920 systemd[4309]: Reached target Timers.
Jan 20 08:16:35 np0005588920 systemd[4309]: Starting D-Bus User Message Bus Socket...
Jan 20 08:16:35 np0005588920 systemd[4309]: Starting Create User's Volatile Files and Directories...
Jan 20 08:16:35 np0005588920 systemd[4309]: Listening on D-Bus User Message Bus Socket.
Jan 20 08:16:35 np0005588920 systemd[4309]: Finished Create User's Volatile Files and Directories.
Jan 20 08:16:35 np0005588920 systemd[4309]: Reached target Sockets.
Jan 20 08:16:35 np0005588920 systemd[4309]: Reached target Basic System.
Jan 20 08:16:35 np0005588920 systemd[4309]: Reached target Main User Target.
Jan 20 08:16:35 np0005588920 systemd[4309]: Startup finished in 137ms.
Jan 20 08:16:35 np0005588920 systemd[1]: Started User Manager for UID 1000.
Jan 20 08:16:35 np0005588920 systemd[1]: Started Session 1 of User zuul.
Jan 20 08:16:36 np0005588920 python3[4391]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 08:16:41 np0005588920 python3[4419]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 08:16:47 np0005588920 python3[4477]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 08:16:48 np0005588920 python3[4517]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Jan 20 08:16:51 np0005588920 python3[4543]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC/GVaHfRonG9ohfZBeZsfGsPAY5Ua/gRcfFAsYYpV+pfGGgyLPk7GpRkk4pr+e8jNRtdcfblMAicASH+5mJlHBm4eUbFYKtcwEXZXv6pyuCU3Ecns8qj50vHni0ryqqxTyg09WqOLv2u9xctOgas5b8y8tPl7bs2/uwlGFud/NxTxRMamezw0jUgKB9f6nJj6TiaAzomayQwqBx0/0kk8Cc6o4JsrOc92YyIsAjs+grfO5gO6MLYaAFWaCv28+Yvj3G37RUIAILUpORm4vyFNvxLGV+iIKd8ZYqqV6cczJ2tM7MGlfjYz9lTXL7WHkY2Knel8HDycvHH85Ydujv3gyD8d/m+dy4VHhMoU3HR1Syxx5e1GxOjU6NV7ZtEMjYtqE6zUdCNY1zXUU4uGxxPK7dF2Zzx5ODWpS7ssrJVRsLzDPf1YiIyi/g3OHzO95EzucQchqJsVh3MJI8D/C2CjI432eipKKcQAYY9sD9/mpPwBqI0PKwfSGTpsps60NwhM= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:16:51 np0005588920 python3[4567]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:16:52 np0005588920 python3[4666]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 20 08:16:52 np0005588920 python3[4737]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1768915011.7325706-253-253938782777303/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=db0b5efc11684c95b5a6c3da9b48c4c5_id_rsa follow=False checksum=3ee7ffdf9f2bde9aa4c9d676d061c45199023a01 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:16:53 np0005588920 python3[4860]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 20 08:16:53 np0005588920 python3[4931]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1768915013.282561-308-257306104744870/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=db0b5efc11684c95b5a6c3da9b48c4c5_id_rsa.pub follow=False checksum=c665db3a39036994c79fbfd6a268cbf34e365958 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:16:55 np0005588920 python3[4979]: ansible-ping Invoked with data=pong
Jan 20 08:16:56 np0005588920 python3[5003]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 08:16:58 np0005588920 python3[5061]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Jan 20 08:16:59 np0005588920 python3[5093]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:16:59 np0005588920 python3[5117]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:17:00 np0005588920 python3[5141]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:17:00 np0005588920 python3[5165]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:17:00 np0005588920 python3[5189]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:17:00 np0005588920 python3[5213]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:17:02 np0005588920 python3[5239]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:17:03 np0005588920 python3[5317]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 20 08:17:03 np0005588920 python3[5390]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1768915022.9566448-34-241495592573161/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:17:04 np0005588920 python3[5438]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:17:04 np0005588920 python3[5462]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:17:05 np0005588920 python3[5486]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:17:05 np0005588920 python3[5510]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:17:05 np0005588920 python3[5534]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:17:05 np0005588920 python3[5558]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:17:06 np0005588920 python3[5582]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:17:06 np0005588920 python3[5606]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:17:06 np0005588920 python3[5630]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:17:06 np0005588920 python3[5654]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:17:07 np0005588920 python3[5678]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:17:07 np0005588920 python3[5702]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:17:07 np0005588920 python3[5726]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:17:08 np0005588920 python3[5750]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:17:08 np0005588920 python3[5774]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:17:08 np0005588920 python3[5798]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:17:08 np0005588920 python3[5822]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:17:09 np0005588920 python3[5846]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:17:09 np0005588920 python3[5870]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:17:09 np0005588920 python3[5894]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:17:10 np0005588920 python3[5918]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:17:10 np0005588920 python3[5942]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:17:10 np0005588920 python3[5966]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:17:10 np0005588920 python3[5990]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:17:11 np0005588920 python3[6014]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:17:11 np0005588920 python3[6038]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:17:14 np0005588920 python3[6064]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 20 08:17:14 np0005588920 systemd[1]: Starting Time & Date Service...
Jan 20 08:17:14 np0005588920 systemd[1]: Started Time & Date Service.
Jan 20 08:17:14 np0005588920 systemd-timedated[6066]: Changed time zone to 'UTC' (UTC).
Jan 20 08:17:14 np0005588920 python3[6095]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:17:15 np0005588920 python3[6171]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 20 08:17:15 np0005588920 python3[6242]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1768915035.0869997-254-238946005259657/source _original_basename=tmpzh4jpehd follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:17:16 np0005588920 python3[6342]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 20 08:17:16 np0005588920 python3[6413]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1768915036.0098412-304-164864728297512/source _original_basename=tmp0v0hynep follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:17:17 np0005588920 python3[6515]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 20 08:17:17 np0005588920 python3[6588]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1768915037.2362134-385-40747515029885/source _original_basename=tmpdq6ucx6g follow=False checksum=08e43183db11613368a518f4478807de9b29fcbd backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:17:18 np0005588920 python3[6636]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:17:18 np0005588920 python3[6662]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:17:19 np0005588920 python3[6742]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 20 08:17:19 np0005588920 python3[6815]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1768915038.9879289-454-261337668269477/source _original_basename=tmpeztlo0bq follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:17:20 np0005588920 python3[6866]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163efc-24cc-d383-642d-00000000001f-1-compute2 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:17:20 np0005588920 python3[6894]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163efc-24cc-d383-642d-000000000020-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Jan 20 08:17:22 np0005588920 python3[6922]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:17:44 np0005588920 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 20 08:17:49 np0005588920 python3[6950]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:18:47 np0005588920 systemd[4309]: Starting Mark boot as successful...
Jan 20 08:18:47 np0005588920 systemd[4309]: Finished Mark boot as successful.
Jan 20 08:18:49 np0005588920 systemd-logind[783]: Session 1 logged out. Waiting for processes to exit.
Jan 20 08:19:21 np0005588920 kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 20 08:19:21 np0005588920 kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Jan 20 08:19:21 np0005588920 kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Jan 20 08:19:21 np0005588920 kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Jan 20 08:19:21 np0005588920 kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Jan 20 08:19:21 np0005588920 kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Jan 20 08:19:21 np0005588920 kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Jan 20 08:19:21 np0005588920 kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Jan 20 08:19:21 np0005588920 kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Jan 20 08:19:21 np0005588920 kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Jan 20 08:19:21 np0005588920 NetworkManager[856]: <info>  [1768915161.9845] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 20 08:19:21 np0005588920 systemd-udevd[6952]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 08:19:22 np0005588920 NetworkManager[856]: <info>  [1768915162.0142] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 20 08:19:22 np0005588920 NetworkManager[856]: <info>  [1768915162.0171] settings: (eth1): created default wired connection 'Wired connection 1'
Jan 20 08:19:22 np0005588920 NetworkManager[856]: <info>  [1768915162.0174] device (eth1): carrier: link connected
Jan 20 08:19:22 np0005588920 NetworkManager[856]: <info>  [1768915162.0176] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 20 08:19:22 np0005588920 NetworkManager[856]: <info>  [1768915162.0181] policy: auto-activating connection 'Wired connection 1' (12674d35-716a-3518-802b-aa82435f07ea)
Jan 20 08:19:22 np0005588920 NetworkManager[856]: <info>  [1768915162.0185] device (eth1): Activation: starting connection 'Wired connection 1' (12674d35-716a-3518-802b-aa82435f07ea)
Jan 20 08:19:22 np0005588920 NetworkManager[856]: <info>  [1768915162.0185] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 20 08:19:22 np0005588920 NetworkManager[856]: <info>  [1768915162.0188] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 20 08:19:22 np0005588920 NetworkManager[856]: <info>  [1768915162.0191] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 20 08:19:22 np0005588920 NetworkManager[856]: <info>  [1768915162.0195] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 20 08:19:22 np0005588920 systemd-logind[783]: New session 3 of user zuul.
Jan 20 08:19:22 np0005588920 systemd[1]: Started Session 3 of User zuul.
Jan 20 08:19:23 np0005588920 python3[6983]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163efc-24cc-d6d1-215a-0000000001f6-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:19:33 np0005588920 python3[7063]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 20 08:19:33 np0005588920 python3[7136]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1768915172.8480334-206-40532671658359/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=3df73a188f7c4e43ba837403bc427009e7fe9fc2 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:19:34 np0005588920 python3[7186]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 20 08:19:34 np0005588920 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 20 08:19:34 np0005588920 systemd[1]: Stopped Network Manager Wait Online.
Jan 20 08:19:34 np0005588920 systemd[1]: Stopping Network Manager Wait Online...
Jan 20 08:19:34 np0005588920 NetworkManager[856]: <info>  [1768915174.2873] caught SIGTERM, shutting down normally.
Jan 20 08:19:34 np0005588920 NetworkManager[856]: <info>  [1768915174.2885] dhcp4 (eth0): canceled DHCP transaction
Jan 20 08:19:34 np0005588920 NetworkManager[856]: <info>  [1768915174.2885] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 20 08:19:34 np0005588920 NetworkManager[856]: <info>  [1768915174.2885] dhcp4 (eth0): state changed no lease
Jan 20 08:19:34 np0005588920 systemd[1]: Stopping Network Manager...
Jan 20 08:19:34 np0005588920 NetworkManager[856]: <info>  [1768915174.2890] manager: NetworkManager state is now CONNECTING
Jan 20 08:19:34 np0005588920 NetworkManager[856]: <info>  [1768915174.3092] dhcp4 (eth1): canceled DHCP transaction
Jan 20 08:19:34 np0005588920 NetworkManager[856]: <info>  [1768915174.3093] dhcp4 (eth1): state changed no lease
Jan 20 08:19:34 np0005588920 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 20 08:19:34 np0005588920 NetworkManager[856]: <info>  [1768915174.3191] exiting (success)
Jan 20 08:19:34 np0005588920 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 20 08:19:34 np0005588920 systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 20 08:19:34 np0005588920 systemd[1]: Stopped Network Manager.
Jan 20 08:19:34 np0005588920 systemd[1]: NetworkManager.service: Consumed 1.460s CPU time, 9.9M memory peak.
Jan 20 08:19:34 np0005588920 systemd[1]: Starting Network Manager...
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.3984] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:2aa8a071-ad9f-49e1-8122-1241f1c0c9d5)
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.3987] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.4072] manager[0x5644f1e9d000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 20 08:19:34 np0005588920 systemd[1]: Starting Hostname Service...
Jan 20 08:19:34 np0005588920 systemd[1]: Started Hostname Service.
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.5233] hostname: hostname: using hostnamed
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.5234] hostname: static hostname changed from (none) to "np0005588920.novalocal"
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.5245] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.5252] manager[0x5644f1e9d000]: rfkill: Wi-Fi hardware radio set enabled
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.5253] manager[0x5644f1e9d000]: rfkill: WWAN hardware radio set enabled
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.5297] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.5298] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.5299] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.5299] manager: Networking is enabled by state file
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.5303] settings: Loaded settings plugin: keyfile (internal)
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.5308] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.5353] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.5368] dhcp: init: Using DHCP client 'internal'
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.5373] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.5380] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.5387] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.5399] device (lo): Activation: starting connection 'lo' (31acf0d1-56f7-43b5-844e-a0d1773d997d)
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.5410] device (eth0): carrier: link connected
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.5417] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.5425] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.5426] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.5436] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.5445] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.5454] device (eth1): carrier: link connected
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.5460] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.5467] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (12674d35-716a-3518-802b-aa82435f07ea) (indicated)
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.5468] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.5476] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.5489] device (eth1): Activation: starting connection 'Wired connection 1' (12674d35-716a-3518-802b-aa82435f07ea)
Jan 20 08:19:34 np0005588920 systemd[1]: Started Network Manager.
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.5500] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.5509] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.5515] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.5525] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.5528] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.5535] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.5539] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.5545] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.5550] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.5563] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.5568] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.5581] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.5586] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.5615] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.5623] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.5633] device (lo): Activation: successful, device activated.
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.5645] dhcp4 (eth0): state changed new lease, address=38.102.83.30
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.5658] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 20 08:19:34 np0005588920 systemd[1]: Starting Network Manager Wait Online...
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.5739] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.5767] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.5769] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.5775] manager: NetworkManager state is now CONNECTED_SITE
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.5779] device (eth0): Activation: successful, device activated.
Jan 20 08:19:34 np0005588920 NetworkManager[7197]: <info>  [1768915174.5787] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 20 08:19:34 np0005588920 python3[7271]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163efc-24cc-d6d1-215a-0000000000d3-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:19:44 np0005588920 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 20 08:20:04 np0005588920 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 20 08:20:20 np0005588920 NetworkManager[7197]: <info>  [1768915220.0327] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 20 08:20:20 np0005588920 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 20 08:20:20 np0005588920 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 20 08:20:20 np0005588920 NetworkManager[7197]: <info>  [1768915220.0684] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 20 08:20:20 np0005588920 NetworkManager[7197]: <info>  [1768915220.0687] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 20 08:20:20 np0005588920 NetworkManager[7197]: <info>  [1768915220.0698] device (eth1): Activation: successful, device activated.
Jan 20 08:20:20 np0005588920 NetworkManager[7197]: <info>  [1768915220.0709] manager: startup complete
Jan 20 08:20:20 np0005588920 NetworkManager[7197]: <info>  [1768915220.0711] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Jan 20 08:20:20 np0005588920 NetworkManager[7197]: <warn>  [1768915220.0718] device (eth1): Activation: failed for connection 'Wired connection 1'
Jan 20 08:20:20 np0005588920 NetworkManager[7197]: <info>  [1768915220.0731] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Jan 20 08:20:20 np0005588920 systemd[1]: Finished Network Manager Wait Online.
Jan 20 08:20:20 np0005588920 NetworkManager[7197]: <info>  [1768915220.0873] dhcp4 (eth1): canceled DHCP transaction
Jan 20 08:20:20 np0005588920 NetworkManager[7197]: <info>  [1768915220.0874] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 20 08:20:20 np0005588920 NetworkManager[7197]: <info>  [1768915220.0874] dhcp4 (eth1): state changed no lease
Jan 20 08:20:20 np0005588920 NetworkManager[7197]: <info>  [1768915220.0901] policy: auto-activating connection 'ci-private-network' (3d69abfd-ede9-58d4-966b-b715c8218c1f)
Jan 20 08:20:20 np0005588920 NetworkManager[7197]: <info>  [1768915220.0910] device (eth1): Activation: starting connection 'ci-private-network' (3d69abfd-ede9-58d4-966b-b715c8218c1f)
Jan 20 08:20:20 np0005588920 NetworkManager[7197]: <info>  [1768915220.0912] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 20 08:20:20 np0005588920 NetworkManager[7197]: <info>  [1768915220.0920] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 20 08:20:20 np0005588920 NetworkManager[7197]: <info>  [1768915220.0933] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 20 08:20:20 np0005588920 NetworkManager[7197]: <info>  [1768915220.0949] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 20 08:20:20 np0005588920 NetworkManager[7197]: <info>  [1768915220.1011] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 20 08:20:20 np0005588920 NetworkManager[7197]: <info>  [1768915220.1015] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 20 08:20:20 np0005588920 NetworkManager[7197]: <info>  [1768915220.1029] device (eth1): Activation: successful, device activated.
Jan 20 08:20:30 np0005588920 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 20 08:20:34 np0005588920 systemd[1]: session-3.scope: Deactivated successfully.
Jan 20 08:20:34 np0005588920 systemd[1]: session-3.scope: Consumed 1.826s CPU time.
Jan 20 08:20:35 np0005588920 systemd-logind[783]: Session 3 logged out. Waiting for processes to exit.
Jan 20 08:20:35 np0005588920 systemd-logind[783]: Removed session 3.
Jan 20 08:20:53 np0005588920 systemd-logind[783]: New session 4 of user zuul.
Jan 20 08:20:53 np0005588920 systemd[1]: Started Session 4 of User zuul.
Jan 20 08:20:53 np0005588920 python3[7380]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 20 08:20:53 np0005588920 python3[7453]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768915253.2160122-373-79759045082816/source _original_basename=tmpl075vdvz follow=False checksum=a06d82404ae9ae38c6111e54a4021096121ff7ef backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:20:56 np0005588920 systemd[1]: session-4.scope: Deactivated successfully.
Jan 20 08:20:56 np0005588920 systemd-logind[783]: Session 4 logged out. Waiting for processes to exit.
Jan 20 08:20:56 np0005588920 systemd-logind[783]: Removed session 4.
Jan 20 08:21:47 np0005588920 systemd[4309]: Created slice User Background Tasks Slice.
Jan 20 08:21:47 np0005588920 systemd[4309]: Starting Cleanup of User's Temporary Files and Directories...
Jan 20 08:21:47 np0005588920 systemd[4309]: Finished Cleanup of User's Temporary Files and Directories.
Jan 20 08:26:22 np0005588920 systemd-logind[783]: New session 5 of user zuul.
Jan 20 08:26:22 np0005588920 systemd[1]: Started Session 5 of User zuul.
Jan 20 08:26:22 np0005588920 python3[7514]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163efc-24cc-67c0-97af-000000000ca4-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:26:23 np0005588920 python3[7542]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:26:23 np0005588920 python3[7569]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:26:23 np0005588920 python3[7595]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:26:23 np0005588920 python3[7621]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:26:24 np0005588920 python3[7647]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:26:24 np0005588920 python3[7725]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 20 08:26:25 np0005588920 python3[7798]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768915584.613151-367-119465264417294/source _original_basename=tmp04ut3gdk follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:26:26 np0005588920 python3[7848]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 20 08:26:26 np0005588920 systemd[1]: Reloading.
Jan 20 08:26:26 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:26:28 np0005588920 python3[7905]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Jan 20 08:26:28 np0005588920 python3[7931]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:26:29 np0005588920 python3[7959]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:26:29 np0005588920 python3[7987]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:26:29 np0005588920 python3[8015]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:26:30 np0005588920 python3[8042]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163efc-24cc-67c0-97af-000000000cab-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:26:30 np0005588920 python3[8072]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jan 20 08:26:34 np0005588920 systemd[1]: session-5.scope: Deactivated successfully.
Jan 20 08:26:34 np0005588920 systemd[1]: session-5.scope: Consumed 4.583s CPU time.
Jan 20 08:26:34 np0005588920 systemd-logind[783]: Session 5 logged out. Waiting for processes to exit.
Jan 20 08:26:34 np0005588920 systemd-logind[783]: Removed session 5.
Jan 20 08:26:36 np0005588920 systemd-logind[783]: New session 6 of user zuul.
Jan 20 08:26:36 np0005588920 systemd[1]: Started Session 6 of User zuul.
Jan 20 08:26:36 np0005588920 python3[8105]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jan 20 08:26:42 np0005588920 setsebool[8144]: The virt_use_nfs policy boolean was changed to 1 by root
Jan 20 08:26:42 np0005588920 setsebool[8144]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Jan 20 08:26:56 np0005588920 kernel: SELinux:  Converting 385 SID table entries...
Jan 20 08:26:56 np0005588920 kernel: SELinux:  policy capability network_peer_controls=1
Jan 20 08:26:56 np0005588920 kernel: SELinux:  policy capability open_perms=1
Jan 20 08:26:56 np0005588920 kernel: SELinux:  policy capability extended_socket_class=1
Jan 20 08:26:56 np0005588920 kernel: SELinux:  policy capability always_check_network=0
Jan 20 08:26:56 np0005588920 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 20 08:26:56 np0005588920 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 20 08:26:56 np0005588920 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 20 08:27:06 np0005588920 kernel: SELinux:  Converting 388 SID table entries...
Jan 20 08:27:06 np0005588920 kernel: SELinux:  policy capability network_peer_controls=1
Jan 20 08:27:06 np0005588920 kernel: SELinux:  policy capability open_perms=1
Jan 20 08:27:06 np0005588920 kernel: SELinux:  policy capability extended_socket_class=1
Jan 20 08:27:06 np0005588920 kernel: SELinux:  policy capability always_check_network=0
Jan 20 08:27:06 np0005588920 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 20 08:27:06 np0005588920 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 20 08:27:06 np0005588920 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 20 08:27:25 np0005588920 dbus-broker-launch[768]: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 20 08:27:25 np0005588920 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 20 08:27:25 np0005588920 systemd[1]: Starting man-db-cache-update.service...
Jan 20 08:27:25 np0005588920 systemd[1]: Reloading.
Jan 20 08:27:25 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:27:25 np0005588920 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 20 08:27:30 np0005588920 python3[12839]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"#012 _uses_shell=True zuul_log_id=fa163efc-24cc-45b7-a25c-00000000000c-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:27:31 np0005588920 kernel: evm: overlay not supported
Jan 20 08:27:31 np0005588920 systemd[4309]: Starting D-Bus User Message Bus...
Jan 20 08:27:31 np0005588920 dbus-broker-launch[13900]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Jan 20 08:27:31 np0005588920 dbus-broker-launch[13900]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Jan 20 08:27:31 np0005588920 systemd[4309]: Started D-Bus User Message Bus.
Jan 20 08:27:31 np0005588920 dbus-broker-lau[13900]: Ready
Jan 20 08:27:32 np0005588920 systemd[4309]: selinux: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 20 08:27:32 np0005588920 systemd[4309]: Created slice Slice /user.
Jan 20 08:27:32 np0005588920 systemd[4309]: podman-13690.scope: unit configures an IP firewall, but not running as root.
Jan 20 08:27:32 np0005588920 systemd[4309]: (This warning is only shown for the first unit using IP firewalling.)
Jan 20 08:27:32 np0005588920 systemd[4309]: Started podman-13690.scope.
Jan 20 08:27:32 np0005588920 systemd[4309]: Started podman-pause-f7c82347.scope.
Jan 20 08:27:32 np0005588920 python3[14017]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]#012location = "38.102.83.233:5001"#012insecure = true path=/etc/containers/registries.conf block=[[registry]]#012location = "38.102.83.233:5001"#012insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:27:32 np0005588920 python3[14017]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Jan 20 08:27:33 np0005588920 systemd[1]: session-6.scope: Deactivated successfully.
Jan 20 08:27:33 np0005588920 systemd[1]: session-6.scope: Consumed 47.902s CPU time.
Jan 20 08:27:33 np0005588920 systemd-logind[783]: Session 6 logged out. Waiting for processes to exit.
Jan 20 08:27:33 np0005588920 systemd-logind[783]: Removed session 6.
Jan 20 08:27:58 np0005588920 systemd-logind[783]: New session 7 of user zuul.
Jan 20 08:27:58 np0005588920 systemd[1]: Started Session 7 of User zuul.
Jan 20 08:27:58 np0005588920 python3[23053]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCxDwznlRTnwVs4thRw4BwvCKpJwxh+kvQMz22TwtomAycQeWyoRQIrmHPtGJrxAVwBD4Z4rIf2j/gxUloZVamc= zuul@np0005588917.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:27:59 np0005588920 python3[23220]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCxDwznlRTnwVs4thRw4BwvCKpJwxh+kvQMz22TwtomAycQeWyoRQIrmHPtGJrxAVwBD4Z4rIf2j/gxUloZVamc= zuul@np0005588917.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:28:00 np0005588920 python3[23577]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005588920.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Jan 20 08:28:00 np0005588920 python3[23844]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCxDwznlRTnwVs4thRw4BwvCKpJwxh+kvQMz22TwtomAycQeWyoRQIrmHPtGJrxAVwBD4Z4rIf2j/gxUloZVamc= zuul@np0005588917.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:28:01 np0005588920 python3[24066]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 20 08:28:01 np0005588920 python3[24291]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1768915681.0519648-170-80924639533268/source _original_basename=tmprxouqzz2 follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:28:02 np0005588920 python3[24615]: ansible-ansible.builtin.hostname Invoked with name=compute-2 use=systemd
Jan 20 08:28:02 np0005588920 systemd[1]: Starting Hostname Service...
Jan 20 08:28:02 np0005588920 systemd[1]: Started Hostname Service.
Jan 20 08:28:02 np0005588920 systemd-hostnamed[24735]: Changed pretty hostname to 'compute-2'
Jan 20 08:28:02 np0005588920 systemd-hostnamed[24735]: Hostname set to <compute-2> (static)
Jan 20 08:28:02 np0005588920 NetworkManager[7197]: <info>  [1768915682.9249] hostname: static hostname changed from "np0005588920.novalocal" to "compute-2"
Jan 20 08:28:02 np0005588920 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 20 08:28:02 np0005588920 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 20 08:28:03 np0005588920 systemd[1]: session-7.scope: Deactivated successfully.
Jan 20 08:28:03 np0005588920 systemd[1]: session-7.scope: Consumed 2.595s CPU time.
Jan 20 08:28:03 np0005588920 systemd-logind[783]: Session 7 logged out. Waiting for processes to exit.
Jan 20 08:28:03 np0005588920 systemd-logind[783]: Removed session 7.
Jan 20 08:28:12 np0005588920 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 20 08:28:14 np0005588920 irqbalance[779]: Cannot change IRQ 27 affinity: Operation not permitted
Jan 20 08:28:14 np0005588920 irqbalance[779]: IRQ 27 affinity is now unmanaged
Jan 20 08:28:19 np0005588920 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 20 08:28:19 np0005588920 systemd[1]: Finished man-db-cache-update.service.
Jan 20 08:28:19 np0005588920 systemd[1]: man-db-cache-update.service: Consumed 1min 4.169s CPU time.
Jan 20 08:28:19 np0005588920 systemd[1]: run-rd08babcdb4714c789810d199ad56b42b.service: Deactivated successfully.
Jan 20 08:28:32 np0005588920 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 20 08:31:09 np0005588920 systemd[1]: Starting Cleanup of Temporary Directories...
Jan 20 08:31:09 np0005588920 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Jan 20 08:31:09 np0005588920 systemd[1]: Finished Cleanup of Temporary Directories.
Jan 20 08:31:09 np0005588920 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Jan 20 08:33:00 np0005588920 systemd-logind[783]: New session 8 of user zuul.
Jan 20 08:33:00 np0005588920 systemd[1]: Started Session 8 of User zuul.
Jan 20 08:33:01 np0005588920 python3[30073]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 08:33:03 np0005588920 python3[30191]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 20 08:33:03 np0005588920 python3[30264]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1768915982.8869324-34068-244627387978555/source mode=0755 _original_basename=delorean.repo follow=False checksum=0f7c85cc67bf467c48edf98d5acc63e62d808324 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:33:03 np0005588920 python3[30290]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 20 08:33:04 np0005588920 python3[30363]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1768915982.8869324-34068-244627387978555/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=4ebc56dead962b5d40b8d420dad43b948b84d3fc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:33:04 np0005588920 python3[30389]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 20 08:33:04 np0005588920 python3[30462]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1768915982.8869324-34068-244627387978555/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:33:05 np0005588920 python3[30488]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 20 08:33:05 np0005588920 python3[30562]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1768915982.8869324-34068-244627387978555/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:33:05 np0005588920 python3[30588]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 20 08:33:06 np0005588920 python3[30661]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1768915982.8869324-34068-244627387978555/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:33:06 np0005588920 python3[30687]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 20 08:33:06 np0005588920 python3[30760]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1768915982.8869324-34068-244627387978555/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:33:07 np0005588920 python3[30786]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 20 08:33:07 np0005588920 python3[30859]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1768915982.8869324-34068-244627387978555/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=2583a70b3ee76a9837350b0837bc004a8e52405c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:33:20 np0005588920 python3[30910]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:38:20 np0005588920 systemd[1]: session-8.scope: Deactivated successfully.
Jan 20 08:38:20 np0005588920 systemd[1]: session-8.scope: Consumed 5.488s CPU time.
Jan 20 08:38:20 np0005588920 systemd-logind[783]: Session 8 logged out. Waiting for processes to exit.
Jan 20 08:38:20 np0005588920 systemd-logind[783]: Removed session 8.
Jan 20 08:45:43 np0005588920 systemd-logind[783]: New session 9 of user zuul.
Jan 20 08:45:43 np0005588920 systemd[1]: Started Session 9 of User zuul.
Jan 20 08:45:45 np0005588920 python3.9[31180]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 08:45:46 np0005588920 python3.9[31361]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:45:54 np0005588920 systemd[1]: session-9.scope: Deactivated successfully.
Jan 20 08:45:54 np0005588920 systemd[1]: session-9.scope: Consumed 8.410s CPU time.
Jan 20 08:45:54 np0005588920 systemd-logind[783]: Session 9 logged out. Waiting for processes to exit.
Jan 20 08:45:54 np0005588920 systemd-logind[783]: Removed session 9.
Jan 20 08:46:10 np0005588920 systemd-logind[783]: New session 10 of user zuul.
Jan 20 08:46:10 np0005588920 systemd[1]: Started Session 10 of User zuul.
Jan 20 08:46:11 np0005588920 python3.9[31571]: ansible-ansible.legacy.ping Invoked with data=pong
Jan 20 08:46:12 np0005588920 python3.9[31745]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 08:46:13 np0005588920 python3.9[31897]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:46:14 np0005588920 python3.9[32050]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 08:46:15 np0005588920 python3.9[32202]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:46:17 np0005588920 python3.9[32354]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:46:17 np0005588920 python3.9[32477]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1768916776.5661323-179-121178272426620/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:46:18 np0005588920 python3.9[32629]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 08:46:19 np0005588920 python3.9[32785]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 08:46:20 np0005588920 python3.9[32937]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 08:46:21 np0005588920 python3.9[33087]: ansible-ansible.builtin.service_facts Invoked
Jan 20 08:46:27 np0005588920 python3.9[33342]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:46:28 np0005588920 python3.9[33492]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 08:46:30 np0005588920 python3.9[33646]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 08:46:31 np0005588920 python3.9[33804]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 20 08:46:32 np0005588920 python3.9[33888]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 20 08:47:18 np0005588920 systemd[1]: Reloading.
Jan 20 08:47:18 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:47:18 np0005588920 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Jan 20 08:47:19 np0005588920 systemd[1]: Reloading.
Jan 20 08:47:19 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:47:19 np0005588920 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Jan 20 08:47:19 np0005588920 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Jan 20 08:47:19 np0005588920 systemd[1]: Reloading.
Jan 20 08:47:19 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:47:19 np0005588920 systemd[1]: Listening on LVM2 poll daemon socket.
Jan 20 08:47:20 np0005588920 dbus-broker-launch[735]: Noticed file-system modification, trigger reload.
Jan 20 08:47:20 np0005588920 dbus-broker-launch[735]: Noticed file-system modification, trigger reload.
Jan 20 08:47:20 np0005588920 dbus-broker-launch[735]: Noticed file-system modification, trigger reload.
Jan 20 08:48:25 np0005588920 kernel: SELinux:  Converting 2723 SID table entries...
Jan 20 08:48:25 np0005588920 kernel: SELinux:  policy capability network_peer_controls=1
Jan 20 08:48:25 np0005588920 kernel: SELinux:  policy capability open_perms=1
Jan 20 08:48:25 np0005588920 kernel: SELinux:  policy capability extended_socket_class=1
Jan 20 08:48:25 np0005588920 kernel: SELinux:  policy capability always_check_network=0
Jan 20 08:48:25 np0005588920 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 20 08:48:25 np0005588920 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 20 08:48:25 np0005588920 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 20 08:48:25 np0005588920 dbus-broker-launch[768]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Jan 20 08:48:25 np0005588920 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 20 08:48:25 np0005588920 systemd[1]: Starting man-db-cache-update.service...
Jan 20 08:48:25 np0005588920 systemd[1]: Reloading.
Jan 20 08:48:26 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:48:26 np0005588920 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 20 08:48:27 np0005588920 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 20 08:48:27 np0005588920 systemd[1]: Finished man-db-cache-update.service.
Jan 20 08:48:27 np0005588920 systemd[1]: man-db-cache-update.service: Consumed 1.736s CPU time.
Jan 20 08:48:27 np0005588920 systemd[1]: run-r2e0ea576c11c45b6b0a842f7692b592c.service: Deactivated successfully.
Jan 20 08:48:39 np0005588920 python3.9[35409]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:48:41 np0005588920 python3.9[35692]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Jan 20 08:48:42 np0005588920 python3.9[35844]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Jan 20 08:48:46 np0005588920 python3.9[35997]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:48:47 np0005588920 python3.9[36150]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Jan 20 08:48:48 np0005588920 python3.9[36302]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 08:48:53 np0005588920 python3.9[36455]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:48:56 np0005588920 python3.9[36578]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768916932.6838446-669-204262568392128/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=a9ac548cf1fa241f1d1335913ca73d2a10501b24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:48:58 np0005588920 python3.9[36730]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 08:48:58 np0005588920 python3.9[36882]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:48:59 np0005588920 python3.9[37035]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:49:01 np0005588920 python3.9[37187]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Jan 20 08:49:01 np0005588920 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 20 08:49:01 np0005588920 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 20 08:49:02 np0005588920 python3.9[37341]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 20 08:49:03 np0005588920 python3.9[37499]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 20 08:49:04 np0005588920 python3.9[37659]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Jan 20 08:49:05 np0005588920 python3.9[37812]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 20 08:49:06 np0005588920 python3.9[37970]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Jan 20 08:49:07 np0005588920 python3.9[38122]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 20 08:49:09 np0005588920 python3.9[38275]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 08:49:10 np0005588920 python3.9[38427]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:49:11 np0005588920 python3.9[38550]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768916950.0662718-1026-231263438342757/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 20 08:49:12 np0005588920 python3.9[38702]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 20 08:49:12 np0005588920 systemd[1]: Starting Load Kernel Modules...
Jan 20 08:49:12 np0005588920 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Jan 20 08:49:12 np0005588920 kernel: Bridge firewalling registered
Jan 20 08:49:12 np0005588920 systemd-modules-load[38706]: Inserted module 'br_netfilter'
Jan 20 08:49:12 np0005588920 systemd[1]: Finished Load Kernel Modules.
Jan 20 08:49:13 np0005588920 python3.9[38862]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:49:14 np0005588920 irqbalance[779]: Cannot change IRQ 26 affinity: Operation not permitted
Jan 20 08:49:14 np0005588920 irqbalance[779]: IRQ 26 affinity is now unmanaged
Jan 20 08:49:14 np0005588920 python3.9[38985]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768916952.7562964-1095-188974398382877/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 20 08:49:15 np0005588920 python3.9[39137]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 20 08:49:18 np0005588920 dbus-broker-launch[735]: Noticed file-system modification, trigger reload.
Jan 20 08:49:18 np0005588920 dbus-broker-launch[735]: Noticed file-system modification, trigger reload.
Jan 20 08:49:19 np0005588920 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 20 08:49:19 np0005588920 systemd[1]: Starting man-db-cache-update.service...
Jan 20 08:49:19 np0005588920 systemd[1]: Reloading.
Jan 20 08:49:19 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:49:19 np0005588920 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 20 08:49:21 np0005588920 python3.9[41379]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 08:49:22 np0005588920 python3.9[42416]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Jan 20 08:49:23 np0005588920 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 20 08:49:23 np0005588920 systemd[1]: Finished man-db-cache-update.service.
Jan 20 08:49:23 np0005588920 systemd[1]: man-db-cache-update.service: Consumed 4.831s CPU time.
Jan 20 08:49:23 np0005588920 systemd[1]: run-rba5a1c5f2d2b40a599367d90a6c78a99.service: Deactivated successfully.
Jan 20 08:49:23 np0005588920 python3.9[43155]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 08:49:24 np0005588920 python3.9[43307]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:49:24 np0005588920 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 20 08:49:24 np0005588920 systemd[1]: Starting Authorization Manager...
Jan 20 08:49:24 np0005588920 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 20 08:49:24 np0005588920 polkitd[43524]: Started polkitd version 0.117
Jan 20 08:49:24 np0005588920 systemd[1]: Started Authorization Manager.
Jan 20 08:49:26 np0005588920 python3.9[43694]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 08:49:26 np0005588920 systemd[1]: Stopping Dynamic System Tuning Daemon...
Jan 20 08:49:26 np0005588920 systemd[1]: tuned.service: Deactivated successfully.
Jan 20 08:49:26 np0005588920 systemd[1]: Stopped Dynamic System Tuning Daemon.
Jan 20 08:49:26 np0005588920 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 20 08:49:26 np0005588920 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 20 08:49:27 np0005588920 python3.9[43856]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Jan 20 08:49:31 np0005588920 python3.9[44008]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 08:49:31 np0005588920 systemd[1]: Reloading.
Jan 20 08:49:31 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:49:32 np0005588920 python3.9[44197]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 08:49:32 np0005588920 systemd[1]: Reloading.
Jan 20 08:49:32 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:49:32 np0005588920 systemd[1]: Starting dnf makecache...
Jan 20 08:49:32 np0005588920 dnf[44234]: Failed determining last makecache time.
Jan 20 08:49:32 np0005588920 dnf[44234]: delorean-openstack-barbican-42b4c41831408a8e323 117 kB/s | 3.0 kB     00:00
Jan 20 08:49:32 np0005588920 dnf[44234]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 198 kB/s | 3.0 kB     00:00
Jan 20 08:49:32 np0005588920 dnf[44234]: delorean-openstack-cinder-1c00d6490d88e436f26ef 188 kB/s | 3.0 kB     00:00
Jan 20 08:49:32 np0005588920 dnf[44234]: delorean-python-stevedore-c4acc5639fd2329372142 191 kB/s | 3.0 kB     00:00
Jan 20 08:49:32 np0005588920 dnf[44234]: delorean-python-cloudkitty-tests-tempest-2c80f8 198 kB/s | 3.0 kB     00:00
Jan 20 08:49:32 np0005588920 dnf[44234]: delorean-os-refresh-config-9bfc52b5049be2d8de61 168 kB/s | 3.0 kB     00:00
Jan 20 08:49:32 np0005588920 dnf[44234]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 149 kB/s | 3.0 kB     00:00
Jan 20 08:49:32 np0005588920 dnf[44234]: delorean-python-designate-tests-tempest-347fdbc 195 kB/s | 3.0 kB     00:00
Jan 20 08:49:32 np0005588920 dnf[44234]: delorean-openstack-glance-1fd12c29b339f30fe823e 189 kB/s | 3.0 kB     00:00
Jan 20 08:49:32 np0005588920 dnf[44234]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 185 kB/s | 3.0 kB     00:00
Jan 20 08:49:32 np0005588920 dnf[44234]: delorean-openstack-manila-3c01b7181572c95dac462 186 kB/s | 3.0 kB     00:00
Jan 20 08:49:32 np0005588920 dnf[44234]: delorean-python-whitebox-neutron-tests-tempest- 176 kB/s | 3.0 kB     00:00
Jan 20 08:49:32 np0005588920 dnf[44234]: delorean-openstack-octavia-ba397f07a7331190208c 207 kB/s | 3.0 kB     00:00
Jan 20 08:49:32 np0005588920 dnf[44234]: delorean-openstack-watcher-c014f81a8647287f6dcc 195 kB/s | 3.0 kB     00:00
Jan 20 08:49:32 np0005588920 dnf[44234]: delorean-ansible-config_template-5ccaa22121a7ff 167 kB/s | 3.0 kB     00:00
Jan 20 08:49:32 np0005588920 dnf[44234]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 175 kB/s | 3.0 kB     00:00
Jan 20 08:49:32 np0005588920 dnf[44234]: delorean-openstack-swift-dc98a8463506ac520c469a 202 kB/s | 3.0 kB     00:00
Jan 20 08:49:32 np0005588920 dnf[44234]: delorean-python-tempestconf-8515371b7cceebd4282 206 kB/s | 3.0 kB     00:00
Jan 20 08:49:33 np0005588920 dnf[44234]: delorean-openstack-heat-ui-013accbfd179753bc3f0 184 kB/s | 3.0 kB     00:00
Jan 20 08:49:33 np0005588920 dnf[44234]: CentOS Stream 9 - BaseOS                         67 kB/s | 6.4 kB     00:00
Jan 20 08:49:33 np0005588920 dnf[44234]: CentOS Stream 9 - AppStream                      73 kB/s | 6.8 kB     00:00
Jan 20 08:49:33 np0005588920 python3.9[44406]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:49:33 np0005588920 dnf[44234]: CentOS Stream 9 - CRB                            60 kB/s | 6.3 kB     00:00
Jan 20 08:49:33 np0005588920 dnf[44234]: CentOS Stream 9 - Extras packages                75 kB/s | 7.3 kB     00:00
Jan 20 08:49:33 np0005588920 dnf[44234]: dlrn-antelope-testing                           104 kB/s | 3.0 kB     00:00
Jan 20 08:49:33 np0005588920 dnf[44234]: dlrn-antelope-build-deps                        188 kB/s | 3.0 kB     00:00
Jan 20 08:49:33 np0005588920 dnf[44234]: centos9-rabbitmq                                110 kB/s | 3.0 kB     00:00
Jan 20 08:49:33 np0005588920 dnf[44234]: centos9-storage                                 141 kB/s | 3.0 kB     00:00
Jan 20 08:49:33 np0005588920 dnf[44234]: centos9-opstools                                138 kB/s | 3.0 kB     00:00
Jan 20 08:49:33 np0005588920 dnf[44234]: NFV SIG OpenvSwitch                             136 kB/s | 3.0 kB     00:00
Jan 20 08:49:33 np0005588920 dnf[44234]: repo-setup-centos-appstream                     194 kB/s | 4.4 kB     00:00
Jan 20 08:49:33 np0005588920 dnf[44234]: repo-setup-centos-baseos                        168 kB/s | 3.9 kB     00:00
Jan 20 08:49:33 np0005588920 dnf[44234]: repo-setup-centos-highavailability              179 kB/s | 3.9 kB     00:00
Jan 20 08:49:33 np0005588920 dnf[44234]: repo-setup-centos-powertools                    207 kB/s | 4.3 kB     00:00
Jan 20 08:49:34 np0005588920 python3.9[44576]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:49:34 np0005588920 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Jan 20 08:49:34 np0005588920 dnf[44234]: Extra Packages for Enterprise Linux 9 - x86_64  238 kB/s |  32 kB     00:00
Jan 20 08:49:34 np0005588920 dnf[44234]: Metadata cache created.
Jan 20 08:49:34 np0005588920 systemd[1]: dnf-makecache.service: Deactivated successfully.
Jan 20 08:49:34 np0005588920 systemd[1]: Finished dnf makecache.
Jan 20 08:49:34 np0005588920 systemd[1]: dnf-makecache.service: Consumed 1.641s CPU time.
Jan 20 08:49:34 np0005588920 python3.9[44734]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:49:37 np0005588920 python3.9[44896]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:49:38 np0005588920 python3.9[45049]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 20 08:49:38 np0005588920 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 20 08:49:38 np0005588920 systemd[1]: Stopped Apply Kernel Variables.
Jan 20 08:49:38 np0005588920 systemd[1]: Stopping Apply Kernel Variables...
Jan 20 08:49:38 np0005588920 systemd[1]: Starting Apply Kernel Variables...
Jan 20 08:49:38 np0005588920 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 20 08:49:38 np0005588920 systemd[1]: Finished Apply Kernel Variables.
Jan 20 08:49:38 np0005588920 systemd[1]: session-10.scope: Deactivated successfully.
Jan 20 08:49:38 np0005588920 systemd[1]: session-10.scope: Consumed 2min 23.503s CPU time.
Jan 20 08:49:38 np0005588920 systemd-logind[783]: Session 10 logged out. Waiting for processes to exit.
Jan 20 08:49:38 np0005588920 systemd-logind[783]: Removed session 10.
Jan 20 08:49:44 np0005588920 systemd-logind[783]: New session 11 of user zuul.
Jan 20 08:49:44 np0005588920 systemd[1]: Started Session 11 of User zuul.
Jan 20 08:49:45 np0005588920 python3.9[45232]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 08:49:46 np0005588920 python3.9[45388]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Jan 20 08:49:47 np0005588920 python3.9[45541]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 20 08:49:49 np0005588920 python3.9[45699]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 20 08:49:50 np0005588920 python3.9[45859]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 20 08:49:51 np0005588920 python3.9[45945]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 20 08:49:54 np0005588920 python3.9[46109]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 20 08:50:06 np0005588920 kernel: SELinux:  Converting 2736 SID table entries...
Jan 20 08:50:06 np0005588920 kernel: SELinux:  policy capability network_peer_controls=1
Jan 20 08:50:06 np0005588920 kernel: SELinux:  policy capability open_perms=1
Jan 20 08:50:06 np0005588920 kernel: SELinux:  policy capability extended_socket_class=1
Jan 20 08:50:06 np0005588920 kernel: SELinux:  policy capability always_check_network=0
Jan 20 08:50:06 np0005588920 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 20 08:50:06 np0005588920 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 20 08:50:06 np0005588920 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 20 08:50:07 np0005588920 dbus-broker-launch[768]: avc:  op=load_policy lsm=selinux seqno=7 res=1
Jan 20 08:50:07 np0005588920 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Jan 20 08:50:08 np0005588920 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 20 08:50:08 np0005588920 systemd[1]: Starting man-db-cache-update.service...
Jan 20 08:50:08 np0005588920 systemd[1]: Reloading.
Jan 20 08:50:08 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:50:08 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:50:08 np0005588920 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 20 08:50:09 np0005588920 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 20 08:50:09 np0005588920 systemd[1]: Finished man-db-cache-update.service.
Jan 20 08:50:09 np0005588920 systemd[1]: man-db-cache-update.service: Consumed 1.000s CPU time.
Jan 20 08:50:09 np0005588920 systemd[1]: run-r549428f5daa04d19a9e8acd3ce4e3bbe.service: Deactivated successfully.
Jan 20 08:50:11 np0005588920 python3.9[47207]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 20 08:50:12 np0005588920 systemd[1]: Reloading.
Jan 20 08:50:12 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:50:12 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:50:12 np0005588920 systemd[1]: Starting Open vSwitch Database Unit...
Jan 20 08:50:12 np0005588920 chown[47250]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Jan 20 08:50:12 np0005588920 ovs-ctl[47255]: /etc/openvswitch/conf.db does not exist ... (warning).
Jan 20 08:50:12 np0005588920 ovs-ctl[47255]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Jan 20 08:50:12 np0005588920 ovs-ctl[47255]: Starting ovsdb-server [  OK  ]
Jan 20 08:50:12 np0005588920 ovs-vsctl[47304]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Jan 20 08:50:12 np0005588920 ovs-vsctl[47320]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"7c9bfe4c-7684-437c-a64a-33562743d048\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Jan 20 08:50:12 np0005588920 ovs-ctl[47255]: Configuring Open vSwitch system IDs [  OK  ]
Jan 20 08:50:12 np0005588920 ovs-ctl[47255]: Enabling remote OVSDB managers [  OK  ]
Jan 20 08:50:12 np0005588920 systemd[1]: Started Open vSwitch Database Unit.
Jan 20 08:50:12 np0005588920 ovs-vsctl[47330]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-2
Jan 20 08:50:12 np0005588920 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Jan 20 08:50:12 np0005588920 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Jan 20 08:50:12 np0005588920 systemd[1]: Starting Open vSwitch Forwarding Unit...
Jan 20 08:50:12 np0005588920 kernel: openvswitch: Open vSwitch switching datapath
Jan 20 08:50:12 np0005588920 ovs-ctl[47374]: Inserting openvswitch module [  OK  ]
Jan 20 08:50:12 np0005588920 ovs-ctl[47343]: Starting ovs-vswitchd [  OK  ]
Jan 20 08:50:12 np0005588920 ovs-vsctl[47392]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-2
Jan 20 08:50:12 np0005588920 ovs-ctl[47343]: Enabling remote OVSDB managers [  OK  ]
Jan 20 08:50:12 np0005588920 systemd[1]: Started Open vSwitch Forwarding Unit.
Jan 20 08:50:12 np0005588920 systemd[1]: Starting Open vSwitch...
Jan 20 08:50:12 np0005588920 systemd[1]: Finished Open vSwitch.
Jan 20 08:50:14 np0005588920 python3.9[47543]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 08:50:15 np0005588920 python3.9[47695]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Jan 20 08:50:16 np0005588920 kernel: SELinux:  Converting 2750 SID table entries...
Jan 20 08:50:16 np0005588920 kernel: SELinux:  policy capability network_peer_controls=1
Jan 20 08:50:16 np0005588920 kernel: SELinux:  policy capability open_perms=1
Jan 20 08:50:16 np0005588920 kernel: SELinux:  policy capability extended_socket_class=1
Jan 20 08:50:16 np0005588920 kernel: SELinux:  policy capability always_check_network=0
Jan 20 08:50:16 np0005588920 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 20 08:50:16 np0005588920 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 20 08:50:16 np0005588920 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 20 08:50:17 np0005588920 python3.9[47850]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 08:50:18 np0005588920 dbus-broker-launch[768]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Jan 20 08:50:18 np0005588920 python3.9[48008]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 20 08:50:21 np0005588920 python3.9[48161]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:50:22 np0005588920 python3.9[48448]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Jan 20 08:50:24 np0005588920 python3.9[48598]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 08:50:25 np0005588920 python3.9[48752]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 20 08:50:26 np0005588920 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 20 08:50:26 np0005588920 systemd[1]: Starting man-db-cache-update.service...
Jan 20 08:50:26 np0005588920 systemd[1]: Reloading.
Jan 20 08:50:27 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:50:27 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:50:27 np0005588920 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 20 08:50:27 np0005588920 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 20 08:50:27 np0005588920 systemd[1]: Finished man-db-cache-update.service.
Jan 20 08:50:27 np0005588920 systemd[1]: run-r78b231009327404793364888d0d10cc6.service: Deactivated successfully.
Jan 20 08:50:28 np0005588920 python3.9[49070]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 20 08:50:28 np0005588920 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 20 08:50:28 np0005588920 systemd[1]: Stopped Network Manager Wait Online.
Jan 20 08:50:28 np0005588920 systemd[1]: Stopping Network Manager Wait Online...
Jan 20 08:50:28 np0005588920 NetworkManager[7197]: <info>  [1768917028.4997] caught SIGTERM, shutting down normally.
Jan 20 08:50:28 np0005588920 systemd[1]: Stopping Network Manager...
Jan 20 08:50:28 np0005588920 NetworkManager[7197]: <info>  [1768917028.5008] dhcp4 (eth0): canceled DHCP transaction
Jan 20 08:50:28 np0005588920 NetworkManager[7197]: <info>  [1768917028.5009] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 20 08:50:28 np0005588920 NetworkManager[7197]: <info>  [1768917028.5009] dhcp4 (eth0): state changed no lease
Jan 20 08:50:28 np0005588920 NetworkManager[7197]: <info>  [1768917028.5010] manager: NetworkManager state is now CONNECTED_SITE
Jan 20 08:50:28 np0005588920 NetworkManager[7197]: <info>  [1768917028.5084] exiting (success)
Jan 20 08:50:28 np0005588920 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 20 08:50:28 np0005588920 systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 20 08:50:28 np0005588920 systemd[1]: Stopped Network Manager.
Jan 20 08:50:28 np0005588920 systemd[1]: NetworkManager.service: Consumed 12.845s CPU time, 4.1M memory peak, read 0B from disk, written 22.0K to disk.
Jan 20 08:50:28 np0005588920 systemd[1]: Starting Network Manager...
Jan 20 08:50:28 np0005588920 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.5657] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:2aa8a071-ad9f-49e1-8122-1241f1c0c9d5)
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.5660] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.5735] manager[0x561ac594a000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 20 08:50:28 np0005588920 systemd[1]: Starting Hostname Service...
Jan 20 08:50:28 np0005588920 systemd[1]: Started Hostname Service.
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.6671] hostname: hostname: using hostnamed
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.6671] hostname: static hostname changed from (none) to "compute-2"
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.6676] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.6681] manager[0x561ac594a000]: rfkill: Wi-Fi hardware radio set enabled
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.6681] manager[0x561ac594a000]: rfkill: WWAN hardware radio set enabled
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.6714] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-ovs.so)
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.6729] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.6729] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.6730] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.6732] manager: Networking is enabled by state file
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.6735] settings: Loaded settings plugin: keyfile (internal)
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.6742] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.6772] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.6779] dhcp: init: Using DHCP client 'internal'
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.6782] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.6786] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.6789] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.6795] device (lo): Activation: starting connection 'lo' (31acf0d1-56f7-43b5-844e-a0d1773d997d)
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.6800] device (eth0): carrier: link connected
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.6804] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.6806] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.6807] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.6811] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.6816] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.6820] device (eth1): carrier: link connected
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.6823] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.6827] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (3d69abfd-ede9-58d4-966b-b715c8218c1f) (indicated)
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.6827] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.6831] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.6836] device (eth1): Activation: starting connection 'ci-private-network' (3d69abfd-ede9-58d4-966b-b715c8218c1f)
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.6840] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 20 08:50:28 np0005588920 systemd[1]: Started Network Manager.
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.6852] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.6855] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.6857] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.6859] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.6862] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.6864] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.6867] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.6871] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.6878] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.6884] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.6895] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.6909] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.6916] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.6919] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.6925] device (lo): Activation: successful, device activated.
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.6940] dhcp4 (eth0): state changed new lease, address=38.102.83.30
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.6946] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 20 08:50:28 np0005588920 systemd[1]: Starting Network Manager Wait Online...
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.7009] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.7014] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.7021] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.7026] manager: NetworkManager state is now CONNECTED_LOCAL
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.7028] device (eth1): Activation: successful, device activated.
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.7040] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.7042] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.7046] manager: NetworkManager state is now CONNECTED_SITE
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.7049] device (eth0): Activation: successful, device activated.
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.7054] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 20 08:50:28 np0005588920 NetworkManager[49076]: <info>  [1768917028.7058] manager: startup complete
Jan 20 08:50:28 np0005588920 systemd[1]: Finished Network Manager Wait Online.
Jan 20 08:50:29 np0005588920 python3.9[49296]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 20 08:50:34 np0005588920 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 20 08:50:34 np0005588920 systemd[1]: Starting man-db-cache-update.service...
Jan 20 08:50:34 np0005588920 systemd[1]: Reloading.
Jan 20 08:50:34 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:50:34 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:50:34 np0005588920 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 20 08:50:35 np0005588920 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 20 08:50:35 np0005588920 systemd[1]: Finished man-db-cache-update.service.
Jan 20 08:50:35 np0005588920 systemd[1]: run-r0297bece05494fb88d3387f24c84cc3e.service: Deactivated successfully.
Jan 20 08:50:36 np0005588920 python3.9[49754]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 08:50:38 np0005588920 python3.9[49906]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:50:38 np0005588920 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 20 08:50:39 np0005588920 python3.9[50060]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:50:39 np0005588920 python3.9[50212]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:50:40 np0005588920 python3.9[50364]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:50:41 np0005588920 python3.9[50516]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:50:42 np0005588920 python3.9[50668]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:50:42 np0005588920 python3.9[50791]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1768917041.6494558-649-235462159012595/.source _original_basename=.peefvftv follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:50:43 np0005588920 python3.9[50943]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:50:44 np0005588920 python3.9[51095]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Jan 20 08:50:45 np0005588920 python3.9[51247]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:50:47 np0005588920 python3.9[51674]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Jan 20 08:50:49 np0005588920 ansible-async_wrapper.py[51849]: Invoked with j414307377384 300 /home/zuul/.ansible/tmp/ansible-tmp-1768917048.25072-847-235989445929811/AnsiballZ_edpm_os_net_config.py _
Jan 20 08:50:49 np0005588920 ansible-async_wrapper.py[51852]: Starting module and watcher
Jan 20 08:50:49 np0005588920 ansible-async_wrapper.py[51852]: Start watching 51853 (300)
Jan 20 08:50:49 np0005588920 ansible-async_wrapper.py[51853]: Start module (51853)
Jan 20 08:50:49 np0005588920 ansible-async_wrapper.py[51849]: Return async_wrapper task started.
Jan 20 08:50:49 np0005588920 python3.9[51854]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Jan 20 08:50:50 np0005588920 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Jan 20 08:50:50 np0005588920 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Jan 20 08:50:50 np0005588920 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Jan 20 08:50:50 np0005588920 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Jan 20 08:50:50 np0005588920 kernel: cfg80211: failed to load regulatory.db
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.1699] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51855 uid=0 result="success"
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.1720] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51855 uid=0 result="success"
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2174] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2177] audit: op="connection-add" uuid="074b808b-f7cb-4f42-8d75-1cfe1913bcb2" name="br-ex-br" pid=51855 uid=0 result="success"
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2192] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2194] audit: op="connection-add" uuid="26809b5b-3322-47c2-a73a-78258098dd62" name="br-ex-port" pid=51855 uid=0 result="success"
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2205] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2207] audit: op="connection-add" uuid="1c79297b-659f-4362-8327-0bf384eb756a" name="eth1-port" pid=51855 uid=0 result="success"
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2217] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2218] audit: op="connection-add" uuid="6498b403-e9be-41f5-8deb-866a023de3e6" name="vlan20-port" pid=51855 uid=0 result="success"
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2229] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2230] audit: op="connection-add" uuid="1f5c4641-0b75-4d16-a78f-dfd24d45a03d" name="vlan21-port" pid=51855 uid=0 result="success"
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2240] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2242] audit: op="connection-add" uuid="18676ada-e7e0-497c-bb73-00d61429b17c" name="vlan22-port" pid=51855 uid=0 result="success"
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2251] manager: (vlan23): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/10)
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2253] audit: op="connection-add" uuid="ac8102e0-f26f-48eb-bda7-61f007a921f6" name="vlan23-port" pid=51855 uid=0 result="success"
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2270] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="connection.autoconnect-priority,connection.timestamp,ipv4.dhcp-client-id,ipv4.dhcp-timeout,ipv6.dhcp-timeout,ipv6.addr-gen-mode,ipv6.method,802-3-ethernet.mtu" pid=51855 uid=0 result="success"
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2285] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2287] audit: op="connection-add" uuid="e9e3fe7a-7ac4-430b-91bf-1afe17384089" name="br-ex-if" pid=51855 uid=0 result="success"
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2329] audit: op="connection-update" uuid="3d69abfd-ede9-58d4-966b-b715c8218c1f" name="ci-private-network" args="ovs-external-ids.data,connection.controller,connection.master,connection.port-type,connection.timestamp,connection.slave-type,ipv4.never-default,ipv4.routes,ipv4.dns,ipv4.method,ipv4.routing-rules,ipv4.addresses,ipv6.method,ipv6.routes,ipv6.addr-gen-mode,ipv6.dns,ipv6.routing-rules,ipv6.addresses,ovs-interface.type" pid=51855 uid=0 result="success"
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2344] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2346] audit: op="connection-add" uuid="c9125976-f777-48a9-965b-972b8c7051e1" name="vlan20-if" pid=51855 uid=0 result="success"
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2359] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2361] audit: op="connection-add" uuid="1d31e934-5f63-4c38-92d5-eca61baa3f44" name="vlan21-if" pid=51855 uid=0 result="success"
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2376] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2377] audit: op="connection-add" uuid="cd9d450f-f4a5-479f-8202-320803fcb77a" name="vlan22-if" pid=51855 uid=0 result="success"
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2392] manager: (vlan23): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/15)
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2394] audit: op="connection-add" uuid="d9db681a-595b-4331-8be5-e7d27d1f646d" name="vlan23-if" pid=51855 uid=0 result="success"
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2404] audit: op="connection-delete" uuid="12674d35-716a-3518-802b-aa82435f07ea" name="Wired connection 1" pid=51855 uid=0 result="success"
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2415] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <warn>  [1768917051.2419] device (br-ex)[Open vSwitch Bridge]: error setting IPv4 forwarding to '1': Success
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2425] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2429] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (074b808b-f7cb-4f42-8d75-1cfe1913bcb2)
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2430] audit: op="connection-activate" uuid="074b808b-f7cb-4f42-8d75-1cfe1913bcb2" name="br-ex-br" pid=51855 uid=0 result="success"
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2432] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <warn>  [1768917051.2433] device (br-ex)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2439] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2443] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (26809b5b-3322-47c2-a73a-78258098dd62)
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2445] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <warn>  [1768917051.2448] device (eth1)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2453] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2457] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (1c79297b-659f-4362-8327-0bf384eb756a)
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2459] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <warn>  [1768917051.2460] device (vlan20)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2465] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2469] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (6498b403-e9be-41f5-8deb-866a023de3e6)
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2471] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <warn>  [1768917051.2472] device (vlan21)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2477] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2481] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (1f5c4641-0b75-4d16-a78f-dfd24d45a03d)
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2483] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <warn>  [1768917051.2485] device (vlan22)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2489] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2493] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (18676ada-e7e0-497c-bb73-00d61429b17c)
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2496] device (vlan23)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <warn>  [1768917051.2497] device (vlan23)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2502] device (vlan23)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2506] device (vlan23)[Open vSwitch Port]: Activation: starting connection 'vlan23-port' (ac8102e0-f26f-48eb-bda7-61f007a921f6)
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2507] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2510] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2512] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2519] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <warn>  [1768917051.2520] device (br-ex)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2524] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2528] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (e9e3fe7a-7ac4-430b-91bf-1afe17384089)
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2530] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2533] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2535] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2537] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2538] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2548] device (eth1): disconnecting for new activation request.
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2549] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2552] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2554] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2556] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2559] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <warn>  [1768917051.2560] device (vlan20)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2563] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2567] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (c9125976-f777-48a9-965b-972b8c7051e1)
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2568] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2571] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2573] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2575] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2578] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <warn>  [1768917051.2580] device (vlan21)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2583] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2587] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (1d31e934-5f63-4c38-92d5-eca61baa3f44)
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2589] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2591] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2593] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2595] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2598] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <warn>  [1768917051.2599] device (vlan22)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2602] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2606] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (cd9d450f-f4a5-479f-8202-320803fcb77a)
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2607] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2611] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2613] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2614] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2617] device (vlan23)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <warn>  [1768917051.2619] device (vlan23)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2622] device (vlan23)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2626] device (vlan23)[Open vSwitch Interface]: Activation: starting connection 'vlan23-if' (d9db681a-595b-4331-8be5-e7d27d1f646d)
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2627] device (vlan23)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2631] device (vlan23)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2633] device (vlan23)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2635] device (vlan23)[Open vSwitch Port]: Activation: connection 'vlan23-port' attached as port, continuing activation
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2637] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2649] audit: op="device-reapply" interface="eth0" ifindex=2 args="connection.autoconnect-priority,ipv4.dhcp-client-id,ipv4.dhcp-timeout,ipv6.addr-gen-mode,ipv6.method,802-3-ethernet.mtu" pid=51855 uid=0 result="success"
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2651] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2655] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2659] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2665] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2669] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2674] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2677] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2679] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2684] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2687] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2690] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 kernel: ovs-system: entered promiscuous mode
Jan 20 08:50:51 np0005588920 kernel: Timeout policy base is empty
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2693] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2697] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2701] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2703] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2705] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2709] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2713] device (vlan23)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2716] device (vlan23)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2718] device (vlan23)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2722] device (vlan23)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 systemd-udevd[51860]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2725] dhcp4 (eth0): canceled DHCP transaction
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2725] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2725] dhcp4 (eth0): state changed no lease
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2727] dhcp4 (eth0): activation: beginning transaction (no timeout)
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2739] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Jan 20 08:50:51 np0005588920 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2742] audit: op="device-reapply" interface="eth1" ifindex=3 pid=51855 uid=0 result="fail" reason="Device is not activated"
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2747] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2783] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2786] dhcp4 (eth0): state changed new lease, address=38.102.83.30
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2789] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2827] device (vlan23)[Open vSwitch Interface]: Activation: connection 'vlan23-if' attached as port, continuing activation
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2836] device (eth1): disconnecting for new activation request.
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2837] audit: op="connection-activate" uuid="3d69abfd-ede9-58d4-966b-b715c8218c1f" name="ci-private-network" pid=51855 uid=0 result="success"
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2867] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51855 uid=0 result="success"
Jan 20 08:50:51 np0005588920 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.2918] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3088] device (eth1): Activation: starting connection 'ci-private-network' (3d69abfd-ede9-58d4-966b-b715c8218c1f)
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3093] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3102] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3105] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3110] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3113] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3117] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3118] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 kernel: br-ex: entered promiscuous mode
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3130] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3131] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3132] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3133] device (vlan23)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3136] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3141] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3144] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3146] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3148] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3150] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3153] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3155] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3157] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3160] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3163] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3165] device (vlan23)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3167] device (vlan23)[Open vSwitch Port]: Activation: successful, device activated.
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3172] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3176] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 kernel: vlan22: entered promiscuous mode
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3258] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3259] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3282] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3285] device (eth1): Activation: successful, device activated.
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3301] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3331] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3339] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 kernel: vlan23: entered promiscuous mode
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3380] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3390] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3410] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 kernel: vlan20: entered promiscuous mode
Jan 20 08:50:51 np0005588920 systemd-udevd[51859]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3439] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3459] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 kernel: vlan21: entered promiscuous mode
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3481] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 20 08:50:51 np0005588920 systemd-udevd[51861]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3571] device (vlan23)[Open vSwitch Interface]: carrier: link connected
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3615] device (vlan23)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3632] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3644] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3645] device (vlan23)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3658] device (vlan23)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3669] device (vlan23)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3697] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3727] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3738] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3744] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3753] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3766] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3773] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588920 NetworkManager[49076]: <info>  [1768917051.3784] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 20 08:50:52 np0005588920 NetworkManager[49076]: <info>  [1768917052.5174] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51855 uid=0 result="success"
Jan 20 08:50:52 np0005588920 NetworkManager[49076]: <info>  [1768917052.6910] checkpoint[0x561ac5920950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Jan 20 08:50:52 np0005588920 NetworkManager[49076]: <info>  [1768917052.6912] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51855 uid=0 result="success"
Jan 20 08:50:52 np0005588920 python3.9[52214]: ansible-ansible.legacy.async_status Invoked with jid=j414307377384.51849 mode=status _async_dir=/root/.ansible_async
Jan 20 08:50:52 np0005588920 NetworkManager[49076]: <info>  [1768917052.9731] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51855 uid=0 result="success"
Jan 20 08:50:52 np0005588920 NetworkManager[49076]: <info>  [1768917052.9743] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51855 uid=0 result="success"
Jan 20 08:50:53 np0005588920 NetworkManager[49076]: <info>  [1768917053.2030] audit: op="networking-control" arg="global-dns-configuration" pid=51855 uid=0 result="success"
Jan 20 08:50:53 np0005588920 NetworkManager[49076]: <info>  [1768917053.2055] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Jan 20 08:50:53 np0005588920 NetworkManager[49076]: <info>  [1768917053.2080] audit: op="networking-control" arg="global-dns-configuration" pid=51855 uid=0 result="success"
Jan 20 08:50:53 np0005588920 NetworkManager[49076]: <info>  [1768917053.2105] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51855 uid=0 result="success"
Jan 20 08:50:53 np0005588920 NetworkManager[49076]: <info>  [1768917053.3441] checkpoint[0x561ac5920a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Jan 20 08:50:53 np0005588920 NetworkManager[49076]: <info>  [1768917053.3444] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51855 uid=0 result="success"
Jan 20 08:50:53 np0005588920 ansible-async_wrapper.py[51853]: Module complete (51853)
Jan 20 08:50:54 np0005588920 ansible-async_wrapper.py[51852]: Done in kid B.
Jan 20 08:50:56 np0005588920 python3.9[52319]: ansible-ansible.legacy.async_status Invoked with jid=j414307377384.51849 mode=status _async_dir=/root/.ansible_async
Jan 20 08:50:57 np0005588920 python3.9[52419]: ansible-ansible.legacy.async_status Invoked with jid=j414307377384.51849 mode=cleanup _async_dir=/root/.ansible_async
Jan 20 08:50:58 np0005588920 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 20 08:50:58 np0005588920 python3.9[52571]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:50:59 np0005588920 python3.9[52696]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768917058.376008-928-278188824444681/.source.returncode _original_basename=.vbezk728 follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:51:00 np0005588920 python3.9[52849]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:51:01 np0005588920 python3.9[52972]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768917060.193659-976-59238733813282/.source.cfg _original_basename=.q7qk8zmh follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:51:02 np0005588920 python3.9[53126]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 20 08:51:02 np0005588920 systemd[1]: Reloading Network Manager...
Jan 20 08:51:02 np0005588920 NetworkManager[49076]: <info>  [1768917062.2855] audit: op="reload" arg="0" pid=53130 uid=0 result="success"
Jan 20 08:51:02 np0005588920 NetworkManager[49076]: <info>  [1768917062.2863] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Jan 20 08:51:02 np0005588920 systemd[1]: Reloaded Network Manager.
Jan 20 08:51:02 np0005588920 systemd[1]: session-11.scope: Deactivated successfully.
Jan 20 08:51:02 np0005588920 systemd[1]: session-11.scope: Consumed 52.419s CPU time.
Jan 20 08:51:02 np0005588920 systemd-logind[783]: Session 11 logged out. Waiting for processes to exit.
Jan 20 08:51:02 np0005588920 systemd-logind[783]: Removed session 11.
Jan 20 08:51:08 np0005588920 systemd-logind[783]: New session 12 of user zuul.
Jan 20 08:51:08 np0005588920 systemd[1]: Started Session 12 of User zuul.
Jan 20 08:51:09 np0005588920 python3.9[53315]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 08:51:10 np0005588920 python3.9[53469]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 20 08:51:12 np0005588920 python3.9[53662]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:51:12 np0005588920 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 20 08:51:12 np0005588920 systemd[1]: session-12.scope: Deactivated successfully.
Jan 20 08:51:12 np0005588920 systemd[1]: session-12.scope: Consumed 2.399s CPU time.
Jan 20 08:51:12 np0005588920 systemd-logind[783]: Session 12 logged out. Waiting for processes to exit.
Jan 20 08:51:12 np0005588920 systemd-logind[783]: Removed session 12.
Jan 20 08:51:18 np0005588920 systemd-logind[783]: New session 13 of user zuul.
Jan 20 08:51:18 np0005588920 systemd[1]: Started Session 13 of User zuul.
Jan 20 08:51:19 np0005588920 python3.9[53847]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 08:51:20 np0005588920 python3.9[54001]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 08:51:22 np0005588920 python3.9[54157]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 20 08:51:22 np0005588920 python3.9[54242]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 20 08:51:24 np0005588920 python3.9[54395]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 20 08:51:26 np0005588920 python3.9[54590]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:51:27 np0005588920 python3.9[54742]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:51:27 np0005588920 systemd[1]: var-lib-containers-storage-overlay-compat2319383853-merged.mount: Deactivated successfully.
Jan 20 08:51:28 np0005588920 podman[54743]: 2026-01-20 13:51:28.16617504 +0000 UTC m=+1.025229636 system refresh
Jan 20 08:51:28 np0005588920 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 20 08:51:29 np0005588920 python3.9[54905]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:51:29 np0005588920 python3.9[55028]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917088.4412987-200-209278286732831/.source.json follow=False _original_basename=podman_network_config.j2 checksum=bf7d842865f50ed55853f6e8544753eb4a7814c2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:51:30 np0005588920 python3.9[55180]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:51:31 np0005588920 python3.9[55303]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768917090.1475685-244-139720690070783/.source.conf follow=False _original_basename=registries.conf.j2 checksum=88781afee5b5da15b4e5a77559a69fa53d49a457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 20 08:51:32 np0005588920 python3.9[55455]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 20 08:51:33 np0005588920 python3.9[55607]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 20 08:51:33 np0005588920 python3.9[55759]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 20 08:51:34 np0005588920 python3.9[55911]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 20 08:51:35 np0005588920 python3.9[56063]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 20 08:51:38 np0005588920 python3.9[56216]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 08:51:38 np0005588920 python3.9[56370]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 08:51:39 np0005588920 python3.9[56522]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 08:51:40 np0005588920 python3.9[56674]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:51:41 np0005588920 python3.9[56827]: ansible-service_facts Invoked
Jan 20 08:51:41 np0005588920 network[56844]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 20 08:51:41 np0005588920 network[56845]: 'network-scripts' will be removed from distribution in near future.
Jan 20 08:51:41 np0005588920 network[56846]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 20 08:51:47 np0005588920 python3.9[57298]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 20 08:51:50 np0005588920 python3.9[57451]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Jan 20 08:51:52 np0005588920 python3.9[57603]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:51:52 np0005588920 python3.9[57728]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768917111.770254-676-279304517380058/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:51:53 np0005588920 python3.9[57882]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:51:54 np0005588920 python3.9[58007]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768917113.294787-722-135345482171002/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:51:56 np0005588920 python3.9[58161]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:51:58 np0005588920 python3.9[58315]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 20 08:51:59 np0005588920 python3.9[58399]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 08:52:01 np0005588920 python3.9[58553]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 20 08:52:01 np0005588920 python3.9[58637]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 20 08:52:01 np0005588920 chronyd[790]: chronyd exiting
Jan 20 08:52:01 np0005588920 systemd[1]: Stopping NTP client/server...
Jan 20 08:52:01 np0005588920 systemd[1]: chronyd.service: Deactivated successfully.
Jan 20 08:52:01 np0005588920 systemd[1]: Stopped NTP client/server.
Jan 20 08:52:01 np0005588920 systemd[1]: Starting NTP client/server...
Jan 20 08:52:02 np0005588920 chronyd[58646]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 20 08:52:02 np0005588920 chronyd[58646]: Frequency -28.263 +/- 0.236 ppm read from /var/lib/chrony/drift
Jan 20 08:52:02 np0005588920 chronyd[58646]: Loaded seccomp filter (level 2)
Jan 20 08:52:02 np0005588920 systemd[1]: Started NTP client/server.
Jan 20 08:52:02 np0005588920 systemd[1]: session-13.scope: Deactivated successfully.
Jan 20 08:52:02 np0005588920 systemd[1]: session-13.scope: Consumed 27.017s CPU time.
Jan 20 08:52:02 np0005588920 systemd-logind[783]: Session 13 logged out. Waiting for processes to exit.
Jan 20 08:52:02 np0005588920 systemd-logind[783]: Removed session 13.
Jan 20 08:52:08 np0005588920 systemd-logind[783]: New session 14 of user zuul.
Jan 20 08:52:08 np0005588920 systemd[1]: Started Session 14 of User zuul.
Jan 20 08:52:09 np0005588920 python3.9[58827]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:52:10 np0005588920 python3.9[58979]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:52:10 np0005588920 python3.9[59104]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/ceph-networks.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768917129.5224426-64-42316621989266/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=729ea8396013e3343245d6e934e0dcef55029ad2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:52:11 np0005588920 systemd[1]: session-14.scope: Deactivated successfully.
Jan 20 08:52:11 np0005588920 systemd[1]: session-14.scope: Consumed 1.907s CPU time.
Jan 20 08:52:11 np0005588920 systemd-logind[783]: Session 14 logged out. Waiting for processes to exit.
Jan 20 08:52:11 np0005588920 systemd-logind[783]: Removed session 14.
Jan 20 08:52:17 np0005588920 systemd-logind[783]: New session 15 of user zuul.
Jan 20 08:52:17 np0005588920 systemd[1]: Started Session 15 of User zuul.
Jan 20 08:52:18 np0005588920 python3.9[59282]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 08:52:20 np0005588920 python3.9[59438]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:52:21 np0005588920 python3.9[59613]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:52:22 np0005588920 python3.9[59736]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1768917141.1865673-85-235797608516828/.source.json _original_basename=.zlklqpqa follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:52:23 np0005588920 python3.9[59888]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:52:24 np0005588920 python3.9[60011]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768917143.1004493-155-223280779003961/.source _original_basename=.8u8x0y36 follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:52:25 np0005588920 python3.9[60163]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 20 08:52:26 np0005588920 python3.9[60315]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:52:26 np0005588920 python3.9[60438]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768917145.4649131-226-267230689320209/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 20 08:52:27 np0005588920 python3.9[60590]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:52:28 np0005588920 python3.9[60713]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768917146.789327-226-168596050970115/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 20 08:52:28 np0005588920 python3.9[60865]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:52:29 np0005588920 python3.9[61017]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:52:30 np0005588920 python3.9[61140]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917149.2291164-337-193479571764720/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:52:31 np0005588920 python3.9[61292]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:52:31 np0005588920 python3.9[61415]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917150.633923-382-275233899259889/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:52:32 np0005588920 python3.9[61567]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 08:52:32 np0005588920 systemd[1]: Reloading.
Jan 20 08:52:32 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:52:32 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:52:33 np0005588920 systemd[1]: Reloading.
Jan 20 08:52:33 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:52:33 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:52:33 np0005588920 systemd[1]: Starting EDPM Container Shutdown...
Jan 20 08:52:33 np0005588920 systemd[1]: Finished EDPM Container Shutdown.
Jan 20 08:52:34 np0005588920 python3.9[61793]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:52:34 np0005588920 python3.9[61916]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917153.8306756-452-89674164391326/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:52:36 np0005588920 python3.9[62068]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:52:36 np0005588920 python3.9[62191]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917155.3305733-497-240935882489495/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:52:37 np0005588920 python3.9[62343]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 08:52:37 np0005588920 systemd[1]: Reloading.
Jan 20 08:52:37 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:52:37 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:52:38 np0005588920 systemd[1]: Reloading.
Jan 20 08:52:38 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:52:38 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:52:38 np0005588920 systemd[1]: Starting Create netns directory...
Jan 20 08:52:38 np0005588920 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 20 08:52:38 np0005588920 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 20 08:52:38 np0005588920 systemd[1]: Finished Create netns directory.
Jan 20 08:52:40 np0005588920 python3.9[62569]: ansible-ansible.builtin.service_facts Invoked
Jan 20 08:52:40 np0005588920 network[62586]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 20 08:52:40 np0005588920 network[62587]: 'network-scripts' will be removed from distribution in near future.
Jan 20 08:52:40 np0005588920 network[62588]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 20 08:52:46 np0005588920 python3.9[62850]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 08:52:46 np0005588920 systemd[1]: Reloading.
Jan 20 08:52:46 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:52:46 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:52:46 np0005588920 systemd[1]: Stopping IPv4 firewall with iptables...
Jan 20 08:52:46 np0005588920 iptables.init[62889]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Jan 20 08:52:46 np0005588920 iptables.init[62889]: iptables: Flushing firewall rules: [  OK  ]
Jan 20 08:52:46 np0005588920 systemd[1]: iptables.service: Deactivated successfully.
Jan 20 08:52:46 np0005588920 systemd[1]: Stopped IPv4 firewall with iptables.
Jan 20 08:52:47 np0005588920 python3.9[63085]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 08:52:49 np0005588920 python3.9[63239]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 08:52:49 np0005588920 systemd[1]: Reloading.
Jan 20 08:52:49 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:52:49 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:52:49 np0005588920 systemd[1]: Starting Netfilter Tables...
Jan 20 08:52:49 np0005588920 systemd[1]: Finished Netfilter Tables.
Jan 20 08:52:51 np0005588920 python3.9[63430]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:52:52 np0005588920 python3.9[63583]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:52:53 np0005588920 python3.9[63708]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1768917172.3395317-704-53530927268509/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:52:54 np0005588920 python3.9[63861]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 20 08:52:54 np0005588920 systemd[1]: Reloading OpenSSH server daemon...
Jan 20 08:52:54 np0005588920 systemd[1]: Reloaded OpenSSH server daemon.
Jan 20 08:52:55 np0005588920 python3.9[64017]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:52:56 np0005588920 python3.9[64169]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:52:56 np0005588920 python3.9[64292]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917175.763319-796-220837746196133/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:52:58 np0005588920 python3.9[64444]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 20 08:52:58 np0005588920 systemd[1]: Starting Time & Date Service...
Jan 20 08:52:58 np0005588920 systemd[1]: Started Time & Date Service.
Jan 20 08:52:59 np0005588920 python3.9[64600]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:52:59 np0005588920 python3.9[64752]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:53:00 np0005588920 python3.9[64875]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768917179.4286883-901-228944578175228/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:53:01 np0005588920 python3.9[65027]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:53:02 np0005588920 python3.9[65150]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768917180.9065073-947-192212200193929/.source.yaml _original_basename=.f7dknz6k follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:53:02 np0005588920 python3.9[65302]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:53:03 np0005588920 python3.9[65425]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917182.385884-992-191543848844436/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:53:04 np0005588920 python3.9[65577]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:53:05 np0005588920 python3.9[65730]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:53:06 np0005588920 python3[65883]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 20 08:53:06 np0005588920 python3.9[66035]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:53:07 np0005588920 python3.9[66158]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917186.4193597-1109-189892783016564/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:53:08 np0005588920 python3.9[66310]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:53:09 np0005588920 python3.9[66433]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917188.0069966-1154-98979131439409/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:53:10 np0005588920 python3.9[66585]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:53:10 np0005588920 python3.9[66708]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917189.4694932-1199-122880312544043/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:53:11 np0005588920 python3.9[66860]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:53:12 np0005588920 python3.9[66983]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917190.9091074-1244-176816271937339/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:53:13 np0005588920 python3.9[67135]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:53:13 np0005588920 python3.9[67258]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917192.404379-1289-137081956458217/.source.nft follow=False _original_basename=ruleset.j2 checksum=693377dc03e5b6b24713cb537b18b88774724e35 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:53:14 np0005588920 python3.9[67410]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:53:15 np0005588920 python3.9[67562]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:53:16 np0005588920 python3.9[67721]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:53:17 np0005588920 python3.9[67874]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:53:17 np0005588920 python3.9[68026]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:53:19 np0005588920 python3.9[68178]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 20 08:53:19 np0005588920 python3.9[68333]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 20 08:53:20 np0005588920 systemd[1]: session-15.scope: Deactivated successfully.
Jan 20 08:53:20 np0005588920 systemd[1]: session-15.scope: Consumed 39.438s CPU time.
Jan 20 08:53:20 np0005588920 systemd-logind[783]: Session 15 logged out. Waiting for processes to exit.
Jan 20 08:53:20 np0005588920 systemd-logind[783]: Removed session 15.
Jan 20 08:53:28 np0005588920 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 20 08:53:33 np0005588920 systemd-logind[783]: New session 16 of user zuul.
Jan 20 08:53:33 np0005588920 systemd[1]: Started Session 16 of User zuul.
Jan 20 08:53:33 np0005588920 python3.9[68516]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Jan 20 08:53:34 np0005588920 python3.9[68668]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 08:53:35 np0005588920 python3.9[68820]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 08:53:37 np0005588920 python3.9[68972]: ansible-ansible.builtin.blockinfile Invoked with block=compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDrCUasX8PhlctvvIb2eE6+Z0hELmfczQ6UoBD+mPtCobptr/s786JmwJ3D8nIoKhlCLVSmhRfbqf1Pm45RUPTEtSuaa6HBDy40dZhTXU34X4KbGfKmur2bp9S/1w83ArKvI8inSqqk2qoMx1l7ECkEgeT+GbFwKfYLnbq5OV4Ms3tzl/uFUC/Xzxs2dbXlhozQiSamcO/a6EObErTvR8PrtaOoLFtTiD/I+oN+rkdBPkBc6r0qT4jS7nU1FOlT96meSZHE7Q1n8pxcy9PEc8w9hFdd1Zj8/WcGIdeEJsekuouK1Lut/sofQLZHyUMWJTcnBjx8BsjGx9NjUHPYUWIw+DZo7lT2QurAPNnaX4rp9ciGV2Bdm3ylNoOu3izNvM1JGTw3xRyYrmyxyWv3Euc35JXa0w07Xrqr+6Ckih0WTLU6q3Rlnrc/grpDC821sHrsljerHipJVOCbZB39LvV6wDDBlqfYZzfqID3dIqlVli4eL12J0K7jr7QAlPRhNf0=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOG07miJwhzuA/nm0wvGIorydl2xbBiiDhE7PypnJ/jC#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKiJpWtps/bRsuEHfak4zDuqPHKOWFLaEA2h86H7tPlrZHR8okAVZWCmY7keO3Ad1DFyffUtJPKv5OvTK91xGO8=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC/3N9PJZXpat0uFh2x2RoV9B0Ih74HU9CPf+g/5HncM7gCVvCpW3CBde1qNDRU2iY9rzpOVPwzi4YzoAUcxB5KAiqZOI9ylmzfiD8JXQ+myLmIRLxHOdXFaEQ4mMp4W+X37hCZ6sdfm6Yqd6eqBuZrM/72ltYoewWBNCG/Hgqzu30L9WC4+BF+iADHT7Qnmvh/cc9U71WxB4h2ikBo1SdGoFCqoez7ajitqx+dw7VWaOtEPliS0LZuDtN3Zt/cBBgxhb/FaAEI3jRP2ej9X0NJW91YxzBygyxiVasslx92g/GmnDFOWVZb5ai/JJsNH6pLTjs25IzvnuWIf8/ZLgZ03zziR4mBLP12CIVF8g1CzaqK1IILDKkjS/dzDiTBefmiQ2+N0i5EEXOgmxchqOqTkFPQg/ar0+0uBPkwzAI0HDk99czhyYHFlO+PhnULVkL1z+XLwHBgOrbNNVQQcJCvady4Gadh66mu1UrLpryNYOgZiugZi67Biha4ZPzPHok=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF41dx3BXAuEvQwQNtbUM7rIrbaOLr5CRvYNdDD+UMr9#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBENFrTpm22/xEaEJMzd7C5WyJttJdK+HK5kxP8/NuvvAQSlLtEulBZnvD/OX5hk3/sDYhPQelj3YsNX1Plw5PJQ=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC/dEYtIJ/delwiq9xMMctU8myoGU/TKMiFUM+i3BSaGKrC0rujad6qo1LAtjth5aYbBcgBhxy0UEX0oCruQQgc5qDpPmWHmJiAwdQJaDu6GxTRl3PlXF2u4rd0Rz72DAMuCxPSYedeHU91uL4vlrcD95xONWew2wa9lUuqQWdgj8DtqnB9T895BihDk9vFLXAaoGJcYZVGKJmXR8sOzNTFQxefqstVO0/dfbRUyFd0Ukp5v7rTmLxw0Np5WcGMOg9l/iRzWTopxnTRvXpBoGlFCmzNvTG2uH08dJ4FU5Wk9/iSxonuiVJu9DKs8Tp4EajaA4Y6cEuZiMhhqi7vw6zVCQuCmRBpny6Ub1Ag2CesMYgxwOVJO5cHsKh3BzuPFsh1gMgrrZK7v+qfm2r1rhHlPsCWrcnrtUIZa7gyzdFvHytTh/4uyGMgNpbwxkyCxgSN4PleQy2wvxy/DFW+JxCDzI4jK9LFH5aojzEhUtj+P3E7CXL/wRPxDJdfEU6PhTk=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEa1zL0TUD00vr72wZq3y4rgtSnctWBvs+gME/0/EAsV#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI2WwWe4rQW0CaFwcmci1J5n144T87fcxCH+Y2CVZd5XQ7Cvzlhh1cGNDX81Tng3KgxvKOuz3mdiSCLqx8noiD0=#012 create=True mode=0644 path=/tmp/ansible._c7cdcm0 state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:53:38 np0005588920 python3.9[69124]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible._c7cdcm0' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:53:39 np0005588920 python3.9[69278]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible._c7cdcm0 state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:53:40 np0005588920 systemd[1]: session-16.scope: Deactivated successfully.
Jan 20 08:53:40 np0005588920 systemd[1]: session-16.scope: Consumed 4.036s CPU time.
Jan 20 08:53:40 np0005588920 systemd-logind[783]: Session 16 logged out. Waiting for processes to exit.
Jan 20 08:53:40 np0005588920 systemd-logind[783]: Removed session 16.
Jan 20 08:53:46 np0005588920 systemd-logind[783]: New session 17 of user zuul.
Jan 20 08:53:46 np0005588920 systemd[1]: Started Session 17 of User zuul.
Jan 20 08:53:47 np0005588920 python3.9[69456]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 08:53:49 np0005588920 python3.9[69612]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 20 08:53:50 np0005588920 python3.9[69766]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 20 08:53:51 np0005588920 python3.9[69919]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:53:53 np0005588920 python3.9[70072]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 08:53:54 np0005588920 python3.9[70226]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:53:55 np0005588920 python3.9[70381]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:53:55 np0005588920 systemd[1]: session-17.scope: Deactivated successfully.
Jan 20 08:53:55 np0005588920 systemd[1]: session-17.scope: Consumed 5.026s CPU time.
Jan 20 08:53:55 np0005588920 systemd-logind[783]: Session 17 logged out. Waiting for processes to exit.
Jan 20 08:53:55 np0005588920 systemd-logind[783]: Removed session 17.
Jan 20 08:54:01 np0005588920 systemd-logind[783]: New session 18 of user zuul.
Jan 20 08:54:01 np0005588920 systemd[1]: Started Session 18 of User zuul.
Jan 20 08:54:02 np0005588920 python3.9[70559]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 08:54:03 np0005588920 python3.9[70715]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 20 08:54:04 np0005588920 python3.9[70799]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 20 08:54:06 np0005588920 python3.9[70950]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:54:08 np0005588920 python3.9[71101]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 20 08:54:09 np0005588920 python3.9[71251]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 08:54:09 np0005588920 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 20 08:54:09 np0005588920 python3.9[71402]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 08:54:10 np0005588920 systemd[1]: session-18.scope: Deactivated successfully.
Jan 20 08:54:10 np0005588920 systemd[1]: session-18.scope: Consumed 6.233s CPU time.
Jan 20 08:54:10 np0005588920 systemd-logind[783]: Session 18 logged out. Waiting for processes to exit.
Jan 20 08:54:10 np0005588920 systemd-logind[783]: Removed session 18.
Jan 20 08:54:12 np0005588920 chronyd[58646]: Selected source 167.160.187.179 (pool.ntp.org)
Jan 20 08:54:19 np0005588920 systemd-logind[783]: New session 19 of user zuul.
Jan 20 08:54:19 np0005588920 systemd[1]: Started Session 19 of User zuul.
Jan 20 08:54:25 np0005588920 python3[72168]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 08:54:27 np0005588920 python3[72263]: ansible-ansible.legacy.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jan 20 08:54:29 np0005588920 python3[72290]: ansible-ansible.builtin.stat Invoked with path=/dev/loop3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jan 20 08:54:30 np0005588920 python3[72316]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=7G#012losetup /dev/loop3 /var/lib/ceph-osd-0.img#012lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:54:30 np0005588920 kernel: loop: module loaded
Jan 20 08:54:30 np0005588920 kernel: loop3: detected capacity change from 0 to 14680064
Jan 20 08:54:30 np0005588920 python3[72353]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop3#012vgcreate ceph_vg0 /dev/loop3#012lvcreate -n ceph_lv0 -l +100%FREE ceph_vg0#012lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:54:30 np0005588920 lvm[72356]: PV /dev/loop3 not used.
Jan 20 08:54:30 np0005588920 lvm[72358]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 20 08:54:30 np0005588920 systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg0.
Jan 20 08:54:30 np0005588920 lvm[72361]:  1 logical volume(s) in volume group "ceph_vg0" now active
Jan 20 08:54:30 np0005588920 lvm[72368]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 20 08:54:30 np0005588920 lvm[72368]: VG ceph_vg0 finished
Jan 20 08:54:30 np0005588920 systemd[1]: lvm-activate-ceph_vg0.service: Deactivated successfully.
Jan 20 08:54:31 np0005588920 python3[72446]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-0.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 20 08:54:31 np0005588920 python3[72519]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1768917271.0321026-36959-196993486286706/source dest=/etc/systemd/system/ceph-osd-losetup-0.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=427b1db064a970126b729b07acf99fa7d0eecb9c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:54:32 np0005588920 python3[72569]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-0.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 08:54:32 np0005588920 systemd[1]: Reloading.
Jan 20 08:54:32 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:54:32 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:54:32 np0005588920 systemd[1]: Starting Ceph OSD losetup...
Jan 20 08:54:32 np0005588920 bash[72609]: /dev/loop3: [64513]:4328449 (/var/lib/ceph-osd-0.img)
Jan 20 08:54:33 np0005588920 systemd[1]: Finished Ceph OSD losetup.
Jan 20 08:54:33 np0005588920 lvm[72611]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 20 08:54:33 np0005588920 lvm[72611]: VG ceph_vg0 finished
Jan 20 08:54:35 np0005588920 python3[72635]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 08:56:29 np0005588920 systemd[1]: Created slice User Slice of UID 42477.
Jan 20 08:56:29 np0005588920 systemd[1]: Starting User Runtime Directory /run/user/42477...
Jan 20 08:56:29 np0005588920 systemd-logind[783]: New session 20 of user ceph-admin.
Jan 20 08:56:29 np0005588920 systemd[1]: Finished User Runtime Directory /run/user/42477.
Jan 20 08:56:29 np0005588920 systemd[1]: Starting User Manager for UID 42477...
Jan 20 08:56:29 np0005588920 systemd[72686]: Queued start job for default target Main User Target.
Jan 20 08:56:29 np0005588920 systemd[72686]: Created slice User Application Slice.
Jan 20 08:56:29 np0005588920 systemd[72686]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 20 08:56:29 np0005588920 systemd[72686]: Started Daily Cleanup of User's Temporary Directories.
Jan 20 08:56:29 np0005588920 systemd[72686]: Reached target Paths.
Jan 20 08:56:29 np0005588920 systemd[72686]: Reached target Timers.
Jan 20 08:56:29 np0005588920 systemd[72686]: Starting D-Bus User Message Bus Socket...
Jan 20 08:56:29 np0005588920 systemd[72686]: Starting Create User's Volatile Files and Directories...
Jan 20 08:56:29 np0005588920 systemd-logind[783]: New session 22 of user ceph-admin.
Jan 20 08:56:29 np0005588920 systemd[72686]: Listening on D-Bus User Message Bus Socket.
Jan 20 08:56:29 np0005588920 systemd[72686]: Reached target Sockets.
Jan 20 08:56:29 np0005588920 systemd[72686]: Finished Create User's Volatile Files and Directories.
Jan 20 08:56:29 np0005588920 systemd[72686]: Reached target Basic System.
Jan 20 08:56:29 np0005588920 systemd[72686]: Reached target Main User Target.
Jan 20 08:56:29 np0005588920 systemd[72686]: Startup finished in 142ms.
Jan 20 08:56:29 np0005588920 systemd[1]: Started User Manager for UID 42477.
Jan 20 08:56:29 np0005588920 systemd[1]: Started Session 20 of User ceph-admin.
Jan 20 08:56:29 np0005588920 systemd[1]: Started Session 22 of User ceph-admin.
Jan 20 08:56:29 np0005588920 systemd-logind[783]: New session 23 of user ceph-admin.
Jan 20 08:56:29 np0005588920 systemd[1]: Started Session 23 of User ceph-admin.
Jan 20 08:56:30 np0005588920 systemd-logind[783]: New session 24 of user ceph-admin.
Jan 20 08:56:30 np0005588920 systemd[1]: Started Session 24 of User ceph-admin.
Jan 20 08:56:30 np0005588920 systemd-logind[783]: New session 25 of user ceph-admin.
Jan 20 08:56:30 np0005588920 systemd[1]: Started Session 25 of User ceph-admin.
Jan 20 08:56:31 np0005588920 systemd-logind[783]: New session 26 of user ceph-admin.
Jan 20 08:56:31 np0005588920 systemd[1]: Started Session 26 of User ceph-admin.
Jan 20 08:56:31 np0005588920 systemd-logind[783]: New session 27 of user ceph-admin.
Jan 20 08:56:31 np0005588920 systemd[1]: Started Session 27 of User ceph-admin.
Jan 20 08:56:32 np0005588920 systemd-logind[783]: New session 28 of user ceph-admin.
Jan 20 08:56:32 np0005588920 systemd[1]: Started Session 28 of User ceph-admin.
Jan 20 08:56:32 np0005588920 systemd-logind[783]: New session 29 of user ceph-admin.
Jan 20 08:56:32 np0005588920 systemd[1]: Started Session 29 of User ceph-admin.
Jan 20 08:56:33 np0005588920 systemd-logind[783]: New session 30 of user ceph-admin.
Jan 20 08:56:33 np0005588920 systemd[1]: Started Session 30 of User ceph-admin.
Jan 20 08:56:33 np0005588920 systemd-logind[783]: New session 31 of user ceph-admin.
Jan 20 08:56:33 np0005588920 systemd[1]: Started Session 31 of User ceph-admin.
Jan 20 08:56:34 np0005588920 systemd-logind[783]: New session 32 of user ceph-admin.
Jan 20 08:56:34 np0005588920 systemd[1]: Started Session 32 of User ceph-admin.
Jan 20 08:56:34 np0005588920 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 20 08:57:24 np0005588920 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 20 08:57:25 np0005588920 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 20 08:57:25 np0005588920 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 20 08:57:25 np0005588920 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 20 08:57:25 np0005588920 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 73709 (sysctl)
Jan 20 08:57:25 np0005588920 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Jan 20 08:57:25 np0005588920 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Jan 20 08:57:26 np0005588920 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 20 08:57:26 np0005588920 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 20 08:57:26 np0005588920 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 20 08:57:29 np0005588920 systemd[1]: var-lib-containers-storage-overlay-compat2538297487-lower\x2dmapped.mount: Deactivated successfully.
Jan 20 08:57:43 np0005588920 podman[73985]: 2026-01-20 13:57:42.967240644 +0000 UTC m=+15.941445144 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:57:43 np0005588920 podman[73985]: 2026-01-20 13:57:43.003817051 +0000 UTC m=+15.978021591 container create b7cecb4da5ed144ada5b1607503e91a26fc441bef308abe3c96f8fbaa33e3881 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=amazing_cerf, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 08:57:43 np0005588920 systemd[1]: Created slice Virtual Machine and Container Slice.
Jan 20 08:57:43 np0005588920 systemd[1]: Started libpod-conmon-b7cecb4da5ed144ada5b1607503e91a26fc441bef308abe3c96f8fbaa33e3881.scope.
Jan 20 08:57:43 np0005588920 systemd[1]: Started libcrun container.
Jan 20 08:57:43 np0005588920 podman[73985]: 2026-01-20 13:57:43.099662429 +0000 UTC m=+16.073866959 container init b7cecb4da5ed144ada5b1607503e91a26fc441bef308abe3c96f8fbaa33e3881 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=amazing_cerf, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 20 08:57:43 np0005588920 podman[73985]: 2026-01-20 13:57:43.108801603 +0000 UTC m=+16.083006113 container start b7cecb4da5ed144ada5b1607503e91a26fc441bef308abe3c96f8fbaa33e3881 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=amazing_cerf, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, ceph=True, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 20 08:57:43 np0005588920 podman[73985]: 2026-01-20 13:57:43.112217904 +0000 UTC m=+16.086422404 container attach b7cecb4da5ed144ada5b1607503e91a26fc441bef308abe3c96f8fbaa33e3881 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=amazing_cerf, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=reef, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 20 08:57:43 np0005588920 amazing_cerf[74048]: 167 167
Jan 20 08:57:43 np0005588920 systemd[1]: libpod-b7cecb4da5ed144ada5b1607503e91a26fc441bef308abe3c96f8fbaa33e3881.scope: Deactivated successfully.
Jan 20 08:57:43 np0005588920 podman[73985]: 2026-01-20 13:57:43.11430489 +0000 UTC m=+16.088509410 container died b7cecb4da5ed144ada5b1607503e91a26fc441bef308abe3c96f8fbaa33e3881 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=amazing_cerf, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, ceph=True)
Jan 20 08:57:43 np0005588920 systemd[1]: var-lib-containers-storage-overlay-a0dfac0927803c1fcd74ba92f3862cf212b5ee755b9fffc4e3819a99d7cd8c08-merged.mount: Deactivated successfully.
Jan 20 08:57:43 np0005588920 podman[73985]: 2026-01-20 13:57:43.146715515 +0000 UTC m=+16.120920025 container remove b7cecb4da5ed144ada5b1607503e91a26fc441bef308abe3c96f8fbaa33e3881 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=amazing_cerf, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 20 08:57:43 np0005588920 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 20 08:57:43 np0005588920 systemd[1]: libpod-conmon-b7cecb4da5ed144ada5b1607503e91a26fc441bef308abe3c96f8fbaa33e3881.scope: Deactivated successfully.
Jan 20 08:57:43 np0005588920 podman[74073]: 2026-01-20 13:57:43.286845785 +0000 UTC m=+0.039462394 container create 5a236d0fe5765e411f35c077ab66f8a3b429658a513ff86a4d6cec35fdd84dcc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=magical_dubinsky, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 20 08:57:43 np0005588920 systemd[1]: Started libpod-conmon-5a236d0fe5765e411f35c077ab66f8a3b429658a513ff86a4d6cec35fdd84dcc.scope.
Jan 20 08:57:43 np0005588920 systemd[1]: Started libcrun container.
Jan 20 08:57:43 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c03d4ebe36a541b9c47e4b0ac5d6ebda217ef9bdfe5e73e8450bed4be61e98dc/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:43 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c03d4ebe36a541b9c47e4b0ac5d6ebda217ef9bdfe5e73e8450bed4be61e98dc/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:43 np0005588920 podman[74073]: 2026-01-20 13:57:43.363906892 +0000 UTC m=+0.116523551 container init 5a236d0fe5765e411f35c077ab66f8a3b429658a513ff86a4d6cec35fdd84dcc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=magical_dubinsky, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef)
Jan 20 08:57:43 np0005588920 podman[74073]: 2026-01-20 13:57:43.269829801 +0000 UTC m=+0.022446410 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:57:43 np0005588920 podman[74073]: 2026-01-20 13:57:43.369581313 +0000 UTC m=+0.122197922 container start 5a236d0fe5765e411f35c077ab66f8a3b429658a513ff86a4d6cec35fdd84dcc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=magical_dubinsky, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3)
Jan 20 08:57:43 np0005588920 podman[74073]: 2026-01-20 13:57:43.372497061 +0000 UTC m=+0.125113710 container attach 5a236d0fe5765e411f35c077ab66f8a3b429658a513ff86a4d6cec35fdd84dcc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=magical_dubinsky, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 08:57:44 np0005588920 magical_dubinsky[74090]: [
Jan 20 08:57:44 np0005588920 magical_dubinsky[74090]:    {
Jan 20 08:57:44 np0005588920 magical_dubinsky[74090]:        "available": false,
Jan 20 08:57:44 np0005588920 magical_dubinsky[74090]:        "ceph_device": false,
Jan 20 08:57:44 np0005588920 magical_dubinsky[74090]:        "device_id": "QEMU_DVD-ROM_QM00001",
Jan 20 08:57:44 np0005588920 magical_dubinsky[74090]:        "lsm_data": {},
Jan 20 08:57:44 np0005588920 magical_dubinsky[74090]:        "lvs": [],
Jan 20 08:57:44 np0005588920 magical_dubinsky[74090]:        "path": "/dev/sr0",
Jan 20 08:57:44 np0005588920 magical_dubinsky[74090]:        "rejected_reasons": [
Jan 20 08:57:44 np0005588920 magical_dubinsky[74090]:            "Has a FileSystem",
Jan 20 08:57:44 np0005588920 magical_dubinsky[74090]:            "Insufficient space (<5GB)"
Jan 20 08:57:44 np0005588920 magical_dubinsky[74090]:        ],
Jan 20 08:57:44 np0005588920 magical_dubinsky[74090]:        "sys_api": {
Jan 20 08:57:44 np0005588920 magical_dubinsky[74090]:            "actuators": null,
Jan 20 08:57:44 np0005588920 magical_dubinsky[74090]:            "device_nodes": "sr0",
Jan 20 08:57:44 np0005588920 magical_dubinsky[74090]:            "devname": "sr0",
Jan 20 08:57:44 np0005588920 magical_dubinsky[74090]:            "human_readable_size": "482.00 KB",
Jan 20 08:57:44 np0005588920 magical_dubinsky[74090]:            "id_bus": "ata",
Jan 20 08:57:44 np0005588920 magical_dubinsky[74090]:            "model": "QEMU DVD-ROM",
Jan 20 08:57:44 np0005588920 magical_dubinsky[74090]:            "nr_requests": "2",
Jan 20 08:57:44 np0005588920 magical_dubinsky[74090]:            "parent": "/dev/sr0",
Jan 20 08:57:44 np0005588920 magical_dubinsky[74090]:            "partitions": {},
Jan 20 08:57:44 np0005588920 magical_dubinsky[74090]:            "path": "/dev/sr0",
Jan 20 08:57:44 np0005588920 magical_dubinsky[74090]:            "removable": "1",
Jan 20 08:57:44 np0005588920 magical_dubinsky[74090]:            "rev": "2.5+",
Jan 20 08:57:44 np0005588920 magical_dubinsky[74090]:            "ro": "0",
Jan 20 08:57:44 np0005588920 magical_dubinsky[74090]:            "rotational": "1",
Jan 20 08:57:44 np0005588920 magical_dubinsky[74090]:            "sas_address": "",
Jan 20 08:57:44 np0005588920 magical_dubinsky[74090]:            "sas_device_handle": "",
Jan 20 08:57:44 np0005588920 magical_dubinsky[74090]:            "scheduler_mode": "mq-deadline",
Jan 20 08:57:44 np0005588920 magical_dubinsky[74090]:            "sectors": 0,
Jan 20 08:57:44 np0005588920 magical_dubinsky[74090]:            "sectorsize": "2048",
Jan 20 08:57:44 np0005588920 magical_dubinsky[74090]:            "size": 493568.0,
Jan 20 08:57:44 np0005588920 magical_dubinsky[74090]:            "support_discard": "2048",
Jan 20 08:57:44 np0005588920 magical_dubinsky[74090]:            "type": "disk",
Jan 20 08:57:44 np0005588920 magical_dubinsky[74090]:            "vendor": "QEMU"
Jan 20 08:57:44 np0005588920 magical_dubinsky[74090]:        }
Jan 20 08:57:44 np0005588920 magical_dubinsky[74090]:    }
Jan 20 08:57:44 np0005588920 magical_dubinsky[74090]: ]
Jan 20 08:57:44 np0005588920 systemd[1]: libpod-5a236d0fe5765e411f35c077ab66f8a3b429658a513ff86a4d6cec35fdd84dcc.scope: Deactivated successfully.
Jan 20 08:57:44 np0005588920 podman[74073]: 2026-01-20 13:57:44.417603945 +0000 UTC m=+1.170220554 container died 5a236d0fe5765e411f35c077ab66f8a3b429658a513ff86a4d6cec35fdd84dcc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=magical_dubinsky, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 20 08:57:44 np0005588920 systemd[1]: libpod-5a236d0fe5765e411f35c077ab66f8a3b429658a513ff86a4d6cec35fdd84dcc.scope: Consumed 1.037s CPU time.
Jan 20 08:57:44 np0005588920 systemd[1]: var-lib-containers-storage-overlay-c03d4ebe36a541b9c47e4b0ac5d6ebda217ef9bdfe5e73e8450bed4be61e98dc-merged.mount: Deactivated successfully.
Jan 20 08:57:44 np0005588920 podman[74073]: 2026-01-20 13:57:44.466742696 +0000 UTC m=+1.219359305 container remove 5a236d0fe5765e411f35c077ab66f8a3b429658a513ff86a4d6cec35fdd84dcc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=magical_dubinsky, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 20 08:57:44 np0005588920 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 20 08:57:44 np0005588920 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 20 08:57:44 np0005588920 systemd[1]: libpod-conmon-5a236d0fe5765e411f35c077ab66f8a3b429658a513ff86a4d6cec35fdd84dcc.scope: Deactivated successfully.
Jan 20 08:57:49 np0005588920 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 20 08:57:49 np0005588920 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 20 08:57:49 np0005588920 podman[76799]: 2026-01-20 13:57:49.854524689 +0000 UTC m=+0.048064804 container create c4a60e00f91a358c2cd4018ec779008c7b2d490efdc7e10ed2b92396f2cde5f2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=laughing_fermi, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.license=GPLv2)
Jan 20 08:57:49 np0005588920 systemd[1]: Started libpod-conmon-c4a60e00f91a358c2cd4018ec779008c7b2d490efdc7e10ed2b92396f2cde5f2.scope.
Jan 20 08:57:49 np0005588920 systemd[1]: Started libcrun container.
Jan 20 08:57:49 np0005588920 podman[76799]: 2026-01-20 13:57:49.917246753 +0000 UTC m=+0.110786868 container init c4a60e00f91a358c2cd4018ec779008c7b2d490efdc7e10ed2b92396f2cde5f2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=laughing_fermi, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, ceph=True, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Jan 20 08:57:49 np0005588920 podman[76799]: 2026-01-20 13:57:49.9242503 +0000 UTC m=+0.117790405 container start c4a60e00f91a358c2cd4018ec779008c7b2d490efdc7e10ed2b92396f2cde5f2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=laughing_fermi, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Jan 20 08:57:49 np0005588920 podman[76799]: 2026-01-20 13:57:49.830669802 +0000 UTC m=+0.024210007 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:57:49 np0005588920 podman[76799]: 2026-01-20 13:57:49.927537017 +0000 UTC m=+0.121077222 container attach c4a60e00f91a358c2cd4018ec779008c7b2d490efdc7e10ed2b92396f2cde5f2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=laughing_fermi, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 20 08:57:49 np0005588920 laughing_fermi[76816]: 167 167
Jan 20 08:57:49 np0005588920 systemd[1]: libpod-c4a60e00f91a358c2cd4018ec779008c7b2d490efdc7e10ed2b92396f2cde5f2.scope: Deactivated successfully.
Jan 20 08:57:49 np0005588920 podman[76799]: 2026-01-20 13:57:49.929987703 +0000 UTC m=+0.123527818 container died c4a60e00f91a358c2cd4018ec779008c7b2d490efdc7e10ed2b92396f2cde5f2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=laughing_fermi, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.39.3, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507)
Jan 20 08:57:49 np0005588920 podman[76799]: 2026-01-20 13:57:49.978406085 +0000 UTC m=+0.171946210 container remove c4a60e00f91a358c2cd4018ec779008c7b2d490efdc7e10ed2b92396f2cde5f2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=laughing_fermi, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Jan 20 08:57:49 np0005588920 systemd[1]: libpod-conmon-c4a60e00f91a358c2cd4018ec779008c7b2d490efdc7e10ed2b92396f2cde5f2.scope: Deactivated successfully.
Jan 20 08:57:50 np0005588920 podman[76834]: 2026-01-20 13:57:50.068728636 +0000 UTC m=+0.049578504 container create bfb5ecfacabeed1ce79d1a6bed0cc70a61a9387946bfe01056004b92ffb3d4bd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_jang, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 20 08:57:50 np0005588920 systemd[1]: Started libpod-conmon-bfb5ecfacabeed1ce79d1a6bed0cc70a61a9387946bfe01056004b92ffb3d4bd.scope.
Jan 20 08:57:50 np0005588920 systemd[1]: Started libcrun container.
Jan 20 08:57:50 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b3e527cfeebb58db4f67fae7d464fd63abd11c2feefec94ee1b979de7590f69/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:50 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b3e527cfeebb58db4f67fae7d464fd63abd11c2feefec94ee1b979de7590f69/merged/tmp/config supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:50 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b3e527cfeebb58db4f67fae7d464fd63abd11c2feefec94ee1b979de7590f69/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:50 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b3e527cfeebb58db4f67fae7d464fd63abd11c2feefec94ee1b979de7590f69/merged/var/lib/ceph/mon/ceph-compute-2 supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:50 np0005588920 podman[76834]: 2026-01-20 13:57:50.130734821 +0000 UTC m=+0.111584709 container init bfb5ecfacabeed1ce79d1a6bed0cc70a61a9387946bfe01056004b92ffb3d4bd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_jang, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 20 08:57:50 np0005588920 podman[76834]: 2026-01-20 13:57:50.13707225 +0000 UTC m=+0.117922118 container start bfb5ecfacabeed1ce79d1a6bed0cc70a61a9387946bfe01056004b92ffb3d4bd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_jang, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 20 08:57:50 np0005588920 podman[76834]: 2026-01-20 13:57:50.046603465 +0000 UTC m=+0.027453373 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:57:50 np0005588920 podman[76834]: 2026-01-20 13:57:50.140496681 +0000 UTC m=+0.121346619 container attach bfb5ecfacabeed1ce79d1a6bed0cc70a61a9387946bfe01056004b92ffb3d4bd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_jang, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Jan 20 08:57:50 np0005588920 systemd[1]: libpod-bfb5ecfacabeed1ce79d1a6bed0cc70a61a9387946bfe01056004b92ffb3d4bd.scope: Deactivated successfully.
Jan 20 08:57:50 np0005588920 podman[76834]: 2026-01-20 13:57:50.215672678 +0000 UTC m=+0.196522546 container died bfb5ecfacabeed1ce79d1a6bed0cc70a61a9387946bfe01056004b92ffb3d4bd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_jang, ceph=True, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Jan 20 08:57:50 np0005588920 podman[76834]: 2026-01-20 13:57:50.246508821 +0000 UTC m=+0.227358689 container remove bfb5ecfacabeed1ce79d1a6bed0cc70a61a9387946bfe01056004b92ffb3d4bd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_jang, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Jan 20 08:57:50 np0005588920 systemd[1]: libpod-conmon-bfb5ecfacabeed1ce79d1a6bed0cc70a61a9387946bfe01056004b92ffb3d4bd.scope: Deactivated successfully.
Jan 20 08:57:50 np0005588920 systemd[1]: Reloading.
Jan 20 08:57:50 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:57:50 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:57:50 np0005588920 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 20 08:57:50 np0005588920 systemd[1]: Reloading.
Jan 20 08:57:50 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:57:50 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:57:50 np0005588920 systemd[1]: Reached target All Ceph clusters and services.
Jan 20 08:57:50 np0005588920 systemd[1]: Reloading.
Jan 20 08:57:51 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:57:51 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:57:51 np0005588920 systemd[1]: Reached target Ceph cluster e399cf45-e6b6-5393-99f1-75c601d3f188.
Jan 20 08:57:51 np0005588920 systemd[1]: Reloading.
Jan 20 08:57:51 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:57:51 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:57:51 np0005588920 systemd[1]: Reloading.
Jan 20 08:57:51 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:57:51 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:57:51 np0005588920 systemd[1]: Created slice Slice /system/ceph-e399cf45-e6b6-5393-99f1-75c601d3f188.
Jan 20 08:57:51 np0005588920 systemd[1]: Reached target System Time Set.
Jan 20 08:57:51 np0005588920 systemd[1]: Reached target System Time Synchronized.
Jan 20 08:57:51 np0005588920 systemd[1]: Starting Ceph mon.compute-2 for e399cf45-e6b6-5393-99f1-75c601d3f188...
Jan 20 08:57:51 np0005588920 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 20 08:57:51 np0005588920 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 20 08:57:51 np0005588920 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 20 08:57:51 np0005588920 podman[77129]: 2026-01-20 13:57:51.928822302 +0000 UTC m=+0.050382346 container create 6d4eaf8659f13902200468a4ae46f22a0824452b9353ff1491898f5d3b0130c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mon-compute-2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250507)
Jan 20 08:57:51 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cbc2dd108660e1b5f4428ca84605e125b0f729eb7282c8c578f3fa1e5585e4fa/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:51 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cbc2dd108660e1b5f4428ca84605e125b0f729eb7282c8c578f3fa1e5585e4fa/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:51 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cbc2dd108660e1b5f4428ca84605e125b0f729eb7282c8c578f3fa1e5585e4fa/merged/var/lib/ceph/mon/ceph-compute-2 supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:51 np0005588920 podman[77129]: 2026-01-20 13:57:51.990415926 +0000 UTC m=+0.111976000 container init 6d4eaf8659f13902200468a4ae46f22a0824452b9353ff1491898f5d3b0130c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mon-compute-2, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, OSD_FLAVOR=default)
Jan 20 08:57:51 np0005588920 podman[77129]: 2026-01-20 13:57:51.995614055 +0000 UTC m=+0.117174099 container start 6d4eaf8659f13902200468a4ae46f22a0824452b9353ff1491898f5d3b0130c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mon-compute-2, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 08:57:51 np0005588920 podman[77129]: 2026-01-20 13:57:51.90253366 +0000 UTC m=+0.024093804 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:57:51 np0005588920 bash[77129]: 6d4eaf8659f13902200468a4ae46f22a0824452b9353ff1491898f5d3b0130c3
Jan 20 08:57:52 np0005588920 systemd[1]: Started Ceph mon.compute-2 for e399cf45-e6b6-5393-99f1-75c601d3f188.
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: set uid:gid to 167:167 (ceph:ceph)
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-mon, pid 2
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: pidfile_write: ignore empty --pid-file
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: load: jerasure load: lrc 
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: RocksDB version: 7.9.2
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: Git sha 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: Compile date 2025-05-06 23:30:25
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: DB SUMMARY
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: DB Session ID:  2UAGHK3PX46HCRM7QXU4
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: CURRENT file:  CURRENT
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: IDENTITY file:  IDENTITY
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: MANIFEST file:  MANIFEST-000005 size: 59 Bytes
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: SST files in /var/lib/ceph/mon/ceph-compute-2/store.db dir, Total Num: 0, files: 
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-compute-2/store.db: 000004.log size: 511 ; 
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                         Options.error_if_exists: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                       Options.create_if_missing: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                         Options.paranoid_checks: 1
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:             Options.flush_verify_memtable_count: 1
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                                     Options.env: 0x564a2e604c40
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                                      Options.fs: PosixFileSystem
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                                Options.info_log: 0x564a2f978fc0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                Options.max_file_opening_threads: 16
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                              Options.statistics: (nil)
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                               Options.use_fsync: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                       Options.max_log_file_size: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                   Options.log_file_time_to_roll: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                       Options.keep_log_file_num: 1000
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                    Options.recycle_log_file_num: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                         Options.allow_fallocate: 1
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                        Options.allow_mmap_reads: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                       Options.allow_mmap_writes: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                        Options.use_direct_reads: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:          Options.create_missing_column_families: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                              Options.db_log_dir: 
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                                 Options.wal_dir: 
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                Options.table_cache_numshardbits: 6
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                         Options.WAL_ttl_seconds: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                       Options.WAL_size_limit_MB: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:             Options.manifest_preallocation_size: 4194304
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                     Options.is_fd_close_on_exec: 1
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                   Options.advise_random_on_open: 1
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                    Options.db_write_buffer_size: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                    Options.write_buffer_manager: 0x564a2f988b40
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:         Options.access_hint_on_compaction_start: 1
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                      Options.use_adaptive_mutex: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                            Options.rate_limiter: (nil)
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                       Options.wal_recovery_mode: 2
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                  Options.enable_thread_tracking: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                  Options.enable_pipelined_write: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                  Options.unordered_write: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:             Options.write_thread_max_yield_usec: 100
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                               Options.row_cache: None
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                              Options.wal_filter: None
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:             Options.avoid_flush_during_recovery: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:             Options.allow_ingest_behind: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:             Options.two_write_queues: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:             Options.manual_wal_flush: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:             Options.wal_compression: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:             Options.atomic_flush: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                 Options.persist_stats_to_disk: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                 Options.write_dbid_to_manifest: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                 Options.log_readahead_size: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                 Options.best_efforts_recovery: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:             Options.allow_data_in_errors: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:             Options.db_host_id: __hostname__
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:             Options.enforce_single_del_contracts: true
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:             Options.max_background_jobs: 2
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:             Options.max_background_compactions: -1
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:             Options.max_subcompactions: 1
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:             Options.delayed_write_rate : 16777216
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:             Options.max_total_wal_size: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                   Options.stats_dump_period_sec: 600
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                 Options.stats_persist_period_sec: 600
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                          Options.max_open_files: -1
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                          Options.bytes_per_sync: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                      Options.wal_bytes_per_sync: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                   Options.strict_bytes_per_sync: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:       Options.compaction_readahead_size: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                  Options.max_background_flushes: -1
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: Compression algorithms supported:
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: 	kZSTD supported: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: 	kXpressCompression supported: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: 	kBZip2Compression supported: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: 	kZSTDNotFinalCompression supported: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: 	kLZ4Compression supported: 1
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: 	kZlibCompression supported: 1
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: 	kLZ4HCCompression supported: 1
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: 	kSnappyCompression supported: 1
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: Fast CRC32 supported: Supported on x86
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: DMutex implementation: pthread_mutex_t
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-compute-2/store.db/MANIFEST-000005
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:           Options.merge_operator: 
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:        Options.compaction_filter: None
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:        Options.compaction_filter_factory: None
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:  Options.sst_partitioner_factory: None
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564a2f978c00)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x564a2f9711f0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:        Options.write_buffer_size: 33554432
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:  Options.max_write_buffer_number: 2
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:          Options.compression: NoCompression
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:       Options.prefix_extractor: nullptr
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:             Options.num_levels: 7
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                  Options.compression_opts.level: 32767
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:               Options.compression_opts.strategy: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                  Options.compression_opts.enabled: false
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                        Options.arena_block_size: 1048576
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                Options.disable_auto_compactions: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                   Options.inplace_update_support: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                           Options.bloom_locality: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                    Options.max_successive_merges: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                Options.paranoid_file_checks: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                Options.force_consistency_checks: 1
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                Options.report_bg_io_stats: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                               Options.ttl: 2592000
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                       Options.enable_blob_files: false
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                           Options.min_blob_size: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                          Options.blob_file_size: 268435456
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb:                Options.blob_file_starting_level: 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-compute-2/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 3345eece-ed87-47a3-81a4-4a6b71655d31
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917472037159, "job": 1, "event": "recovery_started", "wal_files": [4]}
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917472038927, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1648, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 523, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 401, "raw_average_value_size": 80, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917472039086, "job": 1, "event": "recovery_finished"}
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x564a2f99ae00
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: DB pointer 0x564a2fa24000
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: mon.compute-2 does not exist in monmap, will attempt to join an existing cluster
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.0 total, 0.0 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      1/0    1.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012 Sum      1/0    1.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.0 total, 0.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.14 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.14 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x564a2f9711f0#2 capacity: 512.00 MB usage: 0.86 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 3.9e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(1,0.64 KB,0.00012219%) FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: using public_addr v2:192.168.122.102:0/0 -> [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0]
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: starting mon.compute-2 rank -1 at public addrs [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0] at bind addrs [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0] mon_data /var/lib/ceph/mon/ceph-compute-2 fsid e399cf45-e6b6-5393-99f1-75c601d3f188
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: mon.compute-2@-1(???) e0 preinit fsid e399cf45-e6b6-5393-99f1-75c601d3f188
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: mon.compute-2@-1(synchronizing).mds e1 new map
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: mon.compute-2@-1(synchronizing).mds e1 print_map#012e1#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: -1#012 #012No filesystems configured
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: mon.compute-2@-1(synchronizing).osd e0 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: mon.compute-2@-1(synchronizing).osd e0 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: mon.compute-2@-1(synchronizing).osd e1 e1: 0 total, 0 up, 0 in
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: mon.compute-2@-1(synchronizing).osd e2 e2: 0 total, 0 up, 0 in
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: mon.compute-2@-1(synchronizing).osd e3 e3: 0 total, 0 up, 0 in
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: mon.compute-2@-1(synchronizing).osd e4 e4: 1 total, 0 up, 1 in
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: mon.compute-2@-1(synchronizing).osd e5 e5: 2 total, 0 up, 2 in
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: mon.compute-2@-1(synchronizing).osd e6 e6: 2 total, 0 up, 2 in
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: mon.compute-2@-1(synchronizing).osd e7 e7: 2 total, 0 up, 2 in
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: mon.compute-2@-1(synchronizing).osd e8 e8: 2 total, 1 up, 2 in
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: mon.compute-2@-1(synchronizing).osd e9 e9: 2 total, 1 up, 2 in
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: mon.compute-2@-1(synchronizing).osd e10 e10: 2 total, 1 up, 2 in
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: mon.compute-2@-1(synchronizing).osd e11 e11: 2 total, 1 up, 2 in
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: mon.compute-2@-1(synchronizing).osd e12 e12: 2 total, 2 up, 2 in
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: mon.compute-2@-1(synchronizing).osd e13 e13: 2 total, 2 up, 2 in
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: mon.compute-2@-1(synchronizing).osd e14 e14: 2 total, 2 up, 2 in
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: mon.compute-2@-1(synchronizing).osd e15 e15: 2 total, 2 up, 2 in
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: mon.compute-2@-1(synchronizing).osd e16 e16: 2 total, 2 up, 2 in
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: mon.compute-2@-1(synchronizing).osd e17 e17: 2 total, 2 up, 2 in
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: mon.compute-2@-1(synchronizing).osd e18 e18: 2 total, 2 up, 2 in
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: mon.compute-2@-1(synchronizing).osd e19 e19: 2 total, 2 up, 2 in
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: mon.compute-2@-1(synchronizing).osd e20 e20: 2 total, 2 up, 2 in
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: mon.compute-2@-1(synchronizing).osd e21 e21: 2 total, 2 up, 2 in
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: mon.compute-2@-1(synchronizing).osd e22 e22: 2 total, 2 up, 2 in
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: mon.compute-2@-1(synchronizing).osd e23 e23: 2 total, 2 up, 2 in
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: mon.compute-2@-1(synchronizing).osd e24 e24: 2 total, 2 up, 2 in
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: mon.compute-2@-1(synchronizing).osd e25 e25: 2 total, 2 up, 2 in
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: mon.compute-2@-1(synchronizing).osd e25 crush map has features 3314933000852226048, adjusting msgr requires
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: mon.compute-2@-1(synchronizing).osd e25 crush map has features 288514051259236352, adjusting msgr requires
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: mon.compute-2@-1(synchronizing).osd e25 crush map has features 288514051259236352, adjusting msgr requires
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: mon.compute-2@-1(synchronizing).osd e25 crush map has features 288514051259236352, adjusting msgr requires
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: from='client.? 192.168.122.100:0/3530884063' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: Health check update: 4 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: Updating compute-2:/var/lib/ceph/e399cf45-e6b6-5393-99f1-75c601d3f188/config/ceph.client.admin.keyring
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: from='client.? 192.168.122.100:0/3880793223' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: Deploying daemon mon.compute-2 on compute-2
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: from='client.? 192.168.122.100:0/3880793223' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: Health check cleared: CEPHADM_APPLY_SPEC_FAIL (was: Failed to apply 2 service(s): mon,mgr)
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: from='client.? 192.168.122.100:0/3950308669' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"}]: dispatch
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: from='client.? 192.168.122.100:0/3950308669' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"}]': finished
Jan 20 08:57:52 np0005588920 ceph-mon[77148]: mon.compute-2@-1(synchronizing).paxosservice(auth 1..7) refresh upgraded, format 0 -> 3
Jan 20 08:57:54 np0005588920 ceph-mon[77148]: mon.compute-2@-1(probing) e2  my rank is now 1 (was -1)
Jan 20 08:57:54 np0005588920 ceph-mon[77148]: log_channel(cluster) log [INF] : mon.compute-2 calling monitor election
Jan 20 08:57:54 np0005588920 ceph-mon[77148]: paxos.1).electionLogic(0) init, first boot, initializing epoch at 1 
Jan 20 08:57:54 np0005588920 ceph-mon[77148]: mon.compute-2@1(electing) e2 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 20 08:57:54 np0005588920 ceph-mon[77148]: mon.compute-2@1(electing) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Jan 20 08:57:54 np0005588920 ceph-mon[77148]: mon.compute-2@1(electing) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Jan 20 08:57:56 np0005588920 ceph-mon[77148]: mon.compute-2@1(electing) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Jan 20 08:57:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(electing) e2 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 20 08:57:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e2 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Jan 20 08:57:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e2 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout}
Jan 20 08:57:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e2 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 20 08:57:57 np0005588920 ceph-mon[77148]: mgrc update_daemon_metadata mon.compute-2 metadata {addrs=[v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0],arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable),ceph_version_short=18.2.7,ceph_version_when_created=ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable),compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=compute-2,container_image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0,cpu=AMD EPYC-Rome Processor,created_at=2026-01-20T13:57:50.187764Z,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=centos,distro_description=CentOS Stream 9,distro_version=9,hostname=compute-2,kernel_description=#1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026,kernel_version=5.14.0-661.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864308,os=Linux}
Jan 20 08:57:57 np0005588920 ceph-mon[77148]: Deploying daemon mon.compute-1 on compute-1
Jan 20 08:57:57 np0005588920 ceph-mon[77148]: mon.compute-0 calling monitor election
Jan 20 08:57:57 np0005588920 ceph-mon[77148]: from='client.? 192.168.122.100:0/3099254653' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]: dispatch
Jan 20 08:57:57 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]: dispatch
Jan 20 08:57:57 np0005588920 ceph-mon[77148]: mon.compute-2 calling monitor election
Jan 20 08:57:57 np0005588920 ceph-mon[77148]: mon.compute-0 is new leader, mons compute-0,compute-2 in quorum (ranks 0,1)
Jan 20 08:57:57 np0005588920 ceph-mon[77148]: Health detail: HEALTH_WARN 6 pool(s) do not have an application enabled
Jan 20 08:57:57 np0005588920 ceph-mon[77148]: [WRN] POOL_APP_NOT_ENABLED: 6 pool(s) do not have an application enabled
Jan 20 08:57:57 np0005588920 ceph-mon[77148]:    application not enabled on pool 'vms'
Jan 20 08:57:57 np0005588920 ceph-mon[77148]:    application not enabled on pool 'volumes'
Jan 20 08:57:57 np0005588920 ceph-mon[77148]:    application not enabled on pool 'backups'
Jan 20 08:57:57 np0005588920 ceph-mon[77148]:    application not enabled on pool 'images'
Jan 20 08:57:57 np0005588920 ceph-mon[77148]:    application not enabled on pool 'cephfs.cephfs.meta'
Jan 20 08:57:57 np0005588920 ceph-mon[77148]:    application not enabled on pool 'cephfs.cephfs.data'
Jan 20 08:57:57 np0005588920 ceph-mon[77148]:    use 'ceph osd pool application enable <pool-name> <app-name>', where <app-name> is 'cephfs', 'rbd', 'rgw', or freeform for custom applications.
Jan 20 08:57:57 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:57:57 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:57:57 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:57:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e26 e26: 2 total, 2 up, 2 in
Jan 20 08:57:57 np0005588920 podman[77327]: 2026-01-20 13:57:57.986456203 +0000 UTC m=+0.054650059 container create aa7244015b79279fcb9e3b35850e104d0c35e425ab439dc291d0539cfeedcffc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=jovial_euclid, CEPH_REF=reef, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 20 08:57:58 np0005588920 systemd[1]: Started libpod-conmon-aa7244015b79279fcb9e3b35850e104d0c35e425ab439dc291d0539cfeedcffc.scope.
Jan 20 08:57:58 np0005588920 podman[77327]: 2026-01-20 13:57:57.957749647 +0000 UTC m=+0.025943603 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:57:58 np0005588920 systemd[1]: Started libcrun container.
Jan 20 08:57:58 np0005588920 podman[77327]: 2026-01-20 13:57:58.079028964 +0000 UTC m=+0.147222910 container init aa7244015b79279fcb9e3b35850e104d0c35e425ab439dc291d0539cfeedcffc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=jovial_euclid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Jan 20 08:57:58 np0005588920 podman[77327]: 2026-01-20 13:57:58.08709629 +0000 UTC m=+0.155290156 container start aa7244015b79279fcb9e3b35850e104d0c35e425ab439dc291d0539cfeedcffc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=jovial_euclid, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, ceph=True, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 20 08:57:58 np0005588920 podman[77327]: 2026-01-20 13:57:58.091190708 +0000 UTC m=+0.159384604 container attach aa7244015b79279fcb9e3b35850e104d0c35e425ab439dc291d0539cfeedcffc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=jovial_euclid, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0)
Jan 20 08:57:58 np0005588920 systemd[1]: libpod-aa7244015b79279fcb9e3b35850e104d0c35e425ab439dc291d0539cfeedcffc.scope: Deactivated successfully.
Jan 20 08:57:58 np0005588920 jovial_euclid[77344]: 167 167
Jan 20 08:57:58 np0005588920 conmon[77344]: conmon aa7244015b79279fcb9e <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-aa7244015b79279fcb9e3b35850e104d0c35e425ab439dc291d0539cfeedcffc.scope/container/memory.events
Jan 20 08:57:58 np0005588920 podman[77327]: 2026-01-20 13:57:58.098427701 +0000 UTC m=+0.166621557 container died aa7244015b79279fcb9e3b35850e104d0c35e425ab439dc291d0539cfeedcffc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=jovial_euclid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 20 08:57:58 np0005588920 systemd[1]: var-lib-containers-storage-overlay-dba862aaa90bfc0ab1492cd4c1d0656fea656ae46cbc1d27fee831d5234a4df1-merged.mount: Deactivated successfully.
Jan 20 08:57:58 np0005588920 podman[77327]: 2026-01-20 13:57:58.14446108 +0000 UTC m=+0.212654936 container remove aa7244015b79279fcb9e3b35850e104d0c35e425ab439dc291d0539cfeedcffc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=jovial_euclid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 20 08:57:58 np0005588920 systemd[1]: libpod-conmon-aa7244015b79279fcb9e3b35850e104d0c35e425ab439dc291d0539cfeedcffc.scope: Deactivated successfully.
Jan 20 08:57:58 np0005588920 systemd[1]: Reloading.
Jan 20 08:57:58 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:57:58 np0005588920 ceph-mon[77148]: Health check update: 5 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 20 08:57:58 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-2.gunjko", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Jan 20 08:57:58 np0005588920 ceph-mon[77148]: from='client.? 192.168.122.100:0/3099254653' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]': finished
Jan 20 08:57:58 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]': finished
Jan 20 08:57:58 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.compute-2.gunjko", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Jan 20 08:57:58 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]: dispatch
Jan 20 08:57:58 np0005588920 ceph-mon[77148]: Deploying daemon mgr.compute-2.gunjko on compute-2
Jan 20 08:57:58 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:57:58 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e27 e27: 2 total, 2 up, 2 in
Jan 20 08:57:58 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:57:58 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:57:58 np0005588920 systemd[1]: Reloading.
Jan 20 08:57:58 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:57:58 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:57:58 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Jan 20 08:57:58 np0005588920 ceph-mon[77148]: log_channel(cluster) log [INF] : mon.compute-2 calling monitor election
Jan 20 08:57:58 np0005588920 ceph-mon[77148]: paxos.1).electionLogic(10) init, last seen epoch 10
Jan 20 08:57:58 np0005588920 ceph-mon[77148]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 20 08:57:58 np0005588920 systemd[1]: Starting Ceph mgr.compute-2.gunjko for e399cf45-e6b6-5393-99f1-75c601d3f188...
Jan 20 08:57:59 np0005588920 podman[77488]: 2026-01-20 13:57:59.01330543 +0000 UTC m=+0.053416517 container create 76da91143abbb03da023ff62b56d0908cc192474485ac4455bdbe25c54e94e3d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-2-gunjko, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.39.3)
Jan 20 08:57:59 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/879c7a3fc096b6025b6d9c04247290a71e9f3f71c0643fd084b6ea789e126530/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:59 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/879c7a3fc096b6025b6d9c04247290a71e9f3f71c0643fd084b6ea789e126530/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:59 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/879c7a3fc096b6025b6d9c04247290a71e9f3f71c0643fd084b6ea789e126530/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:59 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/879c7a3fc096b6025b6d9c04247290a71e9f3f71c0643fd084b6ea789e126530/merged/var/lib/ceph/mgr/ceph-compute-2.gunjko supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:59 np0005588920 podman[77488]: 2026-01-20 13:57:58.985024585 +0000 UTC m=+0.025135662 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:57:59 np0005588920 podman[77488]: 2026-01-20 13:57:59.091760944 +0000 UTC m=+0.131872041 container init 76da91143abbb03da023ff62b56d0908cc192474485ac4455bdbe25c54e94e3d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-2-gunjko, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef)
Jan 20 08:57:59 np0005588920 podman[77488]: 2026-01-20 13:57:59.109496157 +0000 UTC m=+0.149607214 container start 76da91143abbb03da023ff62b56d0908cc192474485ac4455bdbe25c54e94e3d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-2-gunjko, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 20 08:57:59 np0005588920 bash[77488]: 76da91143abbb03da023ff62b56d0908cc192474485ac4455bdbe25c54e94e3d
Jan 20 08:57:59 np0005588920 systemd[1]: Started Ceph mgr.compute-2.gunjko for e399cf45-e6b6-5393-99f1-75c601d3f188.
Jan 20 08:57:59 np0005588920 ceph-mon[77148]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Jan 20 08:57:59 np0005588920 ceph-mon[77148]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Jan 20 08:57:59 np0005588920 ceph-mon[77148]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Jan 20 08:58:00 np0005588920 ceph-mon[77148]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Jan 20 08:58:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Jan 20 08:58:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Jan 20 08:58:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Jan 20 08:58:03 np0005588920 ceph-mon[77148]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Jan 20 08:58:03 np0005588920 ceph-mon[77148]: paxos.1).electionLogic(11) init, last seen epoch 11, mid-election, bumping
Jan 20 08:58:03 np0005588920 ceph-mon[77148]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 20 08:58:03 np0005588920 ceph-mon[77148]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 20 08:58:03 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 20 08:58:03 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e28 e28: 2 total, 2 up, 2 in
Jan 20 08:58:03 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]: dispatch
Jan 20 08:58:03 np0005588920 ceph-mon[77148]: from='client.? 192.168.122.100:0/1076842494' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]: dispatch
Jan 20 08:58:03 np0005588920 ceph-mon[77148]: mon.compute-0 calling monitor election
Jan 20 08:58:03 np0005588920 ceph-mon[77148]: mon.compute-2 calling monitor election
Jan 20 08:58:03 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 20 08:58:03 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 20 08:58:03 np0005588920 ceph-mon[77148]: mon.compute-1 calling monitor election
Jan 20 08:58:03 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 20 08:58:03 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 20 08:58:03 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 20 08:58:03 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 20 08:58:03 np0005588920 ceph-mon[77148]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Jan 20 08:58:03 np0005588920 ceph-mon[77148]: Health detail: HEALTH_WARN 5 pool(s) do not have an application enabled
Jan 20 08:58:03 np0005588920 ceph-mon[77148]: [WRN] POOL_APP_NOT_ENABLED: 5 pool(s) do not have an application enabled
Jan 20 08:58:03 np0005588920 ceph-mon[77148]:    application not enabled on pool 'volumes'
Jan 20 08:58:03 np0005588920 ceph-mon[77148]:    application not enabled on pool 'backups'
Jan 20 08:58:03 np0005588920 ceph-mon[77148]:    application not enabled on pool 'images'
Jan 20 08:58:03 np0005588920 ceph-mon[77148]:    application not enabled on pool 'cephfs.cephfs.meta'
Jan 20 08:58:03 np0005588920 ceph-mon[77148]:    application not enabled on pool 'cephfs.cephfs.data'
Jan 20 08:58:03 np0005588920 ceph-mon[77148]:    use 'ceph osd pool application enable <pool-name> <app-name>', where <app-name> is 'cephfs', 'rbd', 'rgw', or freeform for custom applications.
Jan 20 08:58:03 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:03 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:03 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:04 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e29 e29: 2 total, 2 up, 2 in
Jan 20 08:58:04 np0005588920 ceph-mon[77148]: Health check update: 4 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 20 08:58:04 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]': finished
Jan 20 08:58:04 np0005588920 ceph-mon[77148]: from='client.? 192.168.122.100:0/1076842494' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]': finished
Jan 20 08:58:04 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]': finished
Jan 20 08:58:04 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]': finished
Jan 20 08:58:04 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]': finished
Jan 20 08:58:04 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]': finished
Jan 20 08:58:04 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]': finished
Jan 20 08:58:04 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]': finished
Jan 20 08:58:04 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]: dispatch
Jan 20 08:58:04 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:04 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.oweoeg", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Jan 20 08:58:04 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.oweoeg", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Jan 20 08:58:04 np0005588920 ceph-mon[77148]: Deploying daemon mgr.compute-1.oweoeg on compute-1
Jan 20 08:58:04 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]': finished
Jan 20 08:58:04 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "32"}]: dispatch
Jan 20 08:58:05 np0005588920 ceph-mgr[77507]: set uid:gid to 167:167 (ceph:ceph)
Jan 20 08:58:05 np0005588920 ceph-mgr[77507]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-mgr, pid 2
Jan 20 08:58:05 np0005588920 ceph-mgr[77507]: pidfile_write: ignore empty --pid-file
Jan 20 08:58:05 np0005588920 ceph-mgr[77507]: mgr[py] Loading python module 'alerts'
Jan 20 08:58:05 np0005588920 ceph-mgr[77507]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 20 08:58:05 np0005588920 ceph-mgr[77507]: mgr[py] Loading python module 'balancer'
Jan 20 08:58:05 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-2-gunjko[77503]: 2026-01-20T13:58:05.564+0000 7f425d6d4140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 20 08:58:05 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e30 e30: 2 total, 2 up, 2 in
Jan 20 08:58:05 np0005588920 ceph-mon[77148]: from='client.? 192.168.122.100:0/1913464166' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]: dispatch
Jan 20 08:58:05 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 20 08:58:05 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 20 08:58:05 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "32"}]': finished
Jan 20 08:58:05 np0005588920 ceph-mon[77148]: from='client.? 192.168.122.100:0/1913464166' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]': finished
Jan 20 08:58:05 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]': finished
Jan 20 08:58:05 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]': finished
Jan 20 08:58:05 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]: dispatch
Jan 20 08:58:05 np0005588920 ceph-mgr[77507]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 20 08:58:05 np0005588920 ceph-mgr[77507]: mgr[py] Loading python module 'cephadm'
Jan 20 08:58:05 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-2-gunjko[77503]: 2026-01-20T13:58:05.815+0000 7f425d6d4140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 20 08:58:06 np0005588920 podman[77673]: 2026-01-20 13:58:06.51902179 +0000 UTC m=+0.036704610 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:58:06 np0005588920 podman[77673]: 2026-01-20 13:58:06.674927872 +0000 UTC m=+0.192610652 container create bb4f5942136fc589c433bd1dc3d17d468caaedc1866713a32ac3ac27637dfded (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=exciting_babbage, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Jan 20 08:58:06 np0005588920 systemd[1]: Started libpod-conmon-bb4f5942136fc589c433bd1dc3d17d468caaedc1866713a32ac3ac27637dfded.scope.
Jan 20 08:58:06 np0005588920 systemd[1]: Started libcrun container.
Jan 20 08:58:06 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e31 e31: 2 total, 2 up, 2 in
Jan 20 08:58:06 np0005588920 podman[77673]: 2026-01-20 13:58:06.78877063 +0000 UTC m=+0.306453410 container init bb4f5942136fc589c433bd1dc3d17d468caaedc1866713a32ac3ac27637dfded (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=exciting_babbage, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, ceph=True, CEPH_REF=reef, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 20 08:58:06 np0005588920 podman[77673]: 2026-01-20 13:58:06.798731896 +0000 UTC m=+0.316414636 container start bb4f5942136fc589c433bd1dc3d17d468caaedc1866713a32ac3ac27637dfded (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=exciting_babbage, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=reef, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507)
Jan 20 08:58:06 np0005588920 podman[77673]: 2026-01-20 13:58:06.802707082 +0000 UTC m=+0.320389912 container attach bb4f5942136fc589c433bd1dc3d17d468caaedc1866713a32ac3ac27637dfded (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=exciting_babbage, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 20 08:58:06 np0005588920 systemd[1]: libpod-bb4f5942136fc589c433bd1dc3d17d468caaedc1866713a32ac3ac27637dfded.scope: Deactivated successfully.
Jan 20 08:58:06 np0005588920 exciting_babbage[77690]: 167 167
Jan 20 08:58:06 np0005588920 podman[77673]: 2026-01-20 13:58:06.808431035 +0000 UTC m=+0.326113795 container died bb4f5942136fc589c433bd1dc3d17d468caaedc1866713a32ac3ac27637dfded (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=exciting_babbage, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507)
Jan 20 08:58:06 np0005588920 conmon[77690]: conmon bb4f5942136fc589c433 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-bb4f5942136fc589c433bd1dc3d17d468caaedc1866713a32ac3ac27637dfded.scope/container/memory.events
Jan 20 08:58:06 np0005588920 systemd[1]: var-lib-containers-storage-overlay-648f4fbe8b4f9e9041874f0b8a19177e52cb46b8a747e5fbbd8681a05dce0d26-merged.mount: Deactivated successfully.
Jan 20 08:58:06 np0005588920 podman[77673]: 2026-01-20 13:58:06.879588914 +0000 UTC m=+0.397271694 container remove bb4f5942136fc589c433bd1dc3d17d468caaedc1866713a32ac3ac27637dfded (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=exciting_babbage, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, ceph=True, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 20 08:58:06 np0005588920 systemd[1]: libpod-conmon-bb4f5942136fc589c433bd1dc3d17d468caaedc1866713a32ac3ac27637dfded.scope: Deactivated successfully.
Jan 20 08:58:07 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:07 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:07 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:07 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:07 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Jan 20 08:58:07 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished
Jan 20 08:58:07 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]': finished
Jan 20 08:58:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e31 _set_new_cache_sizes cache_size:1019929416 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 08:58:07 np0005588920 systemd[1]: Reloading.
Jan 20 08:58:07 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:58:07 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:58:07 np0005588920 systemd[1]: Reloading.
Jan 20 08:58:07 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:58:07 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:58:07 np0005588920 systemd[1]: Starting Ceph crash.compute-2 for e399cf45-e6b6-5393-99f1-75c601d3f188...
Jan 20 08:58:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e32 e32: 2 total, 2 up, 2 in
Jan 20 08:58:07 np0005588920 ceph-mgr[77507]: mgr[py] Loading python module 'crash'
Jan 20 08:58:07 np0005588920 podman[77844]: 2026-01-20 13:58:07.876299717 +0000 UTC m=+0.020018075 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:58:07 np0005588920 podman[77844]: 2026-01-20 13:58:07.975231958 +0000 UTC m=+0.118950306 container create b5aaabf9e81d8b9c0ebf4f99e7237e601f60096db3310192c81d12433431e813 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-2, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 20 08:58:08 np0005588920 ceph-mon[77148]: Deploying daemon crash.compute-2 on compute-2
Jan 20 08:58:08 np0005588920 ceph-mon[77148]: from='client.? 192.168.122.100:0/4079761379' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]: dispatch
Jan 20 08:58:08 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 20 08:58:08 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 20 08:58:08 np0005588920 ceph-mon[77148]: from='client.? 192.168.122.100:0/4079761379' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]': finished
Jan 20 08:58:08 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Jan 20 08:58:08 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]': finished
Jan 20 08:58:08 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d9cc70f27edf88932c0b43899e97247334af0c6a70a974953449ffbbd56a0c7b/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:08 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d9cc70f27edf88932c0b43899e97247334af0c6a70a974953449ffbbd56a0c7b/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:08 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d9cc70f27edf88932c0b43899e97247334af0c6a70a974953449ffbbd56a0c7b/merged/etc/ceph/ceph.client.crash.compute-2.keyring supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:08 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d9cc70f27edf88932c0b43899e97247334af0c6a70a974953449ffbbd56a0c7b/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:08 np0005588920 podman[77844]: 2026-01-20 13:58:08.107882258 +0000 UTC m=+0.251600626 container init b5aaabf9e81d8b9c0ebf4f99e7237e601f60096db3310192c81d12433431e813 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Jan 20 08:58:08 np0005588920 podman[77844]: 2026-01-20 13:58:08.113101097 +0000 UTC m=+0.256819435 container start b5aaabf9e81d8b9c0ebf4f99e7237e601f60096db3310192c81d12433431e813 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 20 08:58:08 np0005588920 bash[77844]: b5aaabf9e81d8b9c0ebf4f99e7237e601f60096db3310192c81d12433431e813
Jan 20 08:58:08 np0005588920 ceph-mgr[77507]: mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 20 08:58:08 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-2-gunjko[77503]: 2026-01-20T13:58:08.124+0000 7f425d6d4140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 20 08:58:08 np0005588920 ceph-mgr[77507]: mgr[py] Loading python module 'dashboard'
Jan 20 08:58:08 np0005588920 systemd[1]: Started Ceph crash.compute-2 for e399cf45-e6b6-5393-99f1-75c601d3f188.
Jan 20 08:58:08 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-2[77860]: INFO:ceph-crash:pinging cluster to exercise our key
Jan 20 08:58:08 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-2[77860]: 2026-01-20T13:58:08.509+0000 7fd88c61e640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Jan 20 08:58:08 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-2[77860]: 2026-01-20T13:58:08.509+0000 7fd88c61e640 -1 AuthRegistry(0x7fd8840675b0) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Jan 20 08:58:08 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-2[77860]: 2026-01-20T13:58:08.510+0000 7fd88c61e640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Jan 20 08:58:08 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-2[77860]: 2026-01-20T13:58:08.510+0000 7fd88c61e640 -1 AuthRegistry(0x7fd88c61d000) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Jan 20 08:58:08 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-2[77860]: 2026-01-20T13:58:08.510+0000 7fd88a393640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Jan 20 08:58:08 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-2[77860]: 2026-01-20T13:58:08.511+0000 7fd889b92640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Jan 20 08:58:08 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-2[77860]: 2026-01-20T13:58:08.512+0000 7fd88ab94640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Jan 20 08:58:08 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-2[77860]: 2026-01-20T13:58:08.512+0000 7fd88c61e640 -1 monclient: authenticate NOTE: no keyring found; disabled cephx authentication
Jan 20 08:58:08 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-2[77860]: [errno 13] RADOS permission denied (error connecting to the cluster)
Jan 20 08:58:08 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-2[77860]: INFO:ceph-crash:monitoring path /var/lib/ceph/crash, delay 600s
Jan 20 08:58:08 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e33 e33: 2 total, 2 up, 2 in
Jan 20 08:58:08 np0005588920 podman[78014]: 2026-01-20 13:58:08.829312292 +0000 UTC m=+0.034550772 container create 540d01199290b52b0721a318af1bcfe765a49325e000b675d80a466a7a25ec7b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_pike, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 20 08:58:08 np0005588920 systemd[1]: Started libpod-conmon-540d01199290b52b0721a318af1bcfe765a49325e000b675d80a466a7a25ec7b.scope.
Jan 20 08:58:08 np0005588920 systemd[1]: Started libcrun container.
Jan 20 08:58:08 np0005588920 podman[78014]: 2026-01-20 13:58:08.908353752 +0000 UTC m=+0.113592252 container init 540d01199290b52b0721a318af1bcfe765a49325e000b675d80a466a7a25ec7b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_pike, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507)
Jan 20 08:58:08 np0005588920 podman[78014]: 2026-01-20 13:58:08.814073587 +0000 UTC m=+0.019312097 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:58:08 np0005588920 podman[78014]: 2026-01-20 13:58:08.914728212 +0000 UTC m=+0.119966692 container start 540d01199290b52b0721a318af1bcfe765a49325e000b675d80a466a7a25ec7b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_pike, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 20 08:58:08 np0005588920 podman[78014]: 2026-01-20 13:58:08.917889617 +0000 UTC m=+0.123128097 container attach 540d01199290b52b0721a318af1bcfe765a49325e000b675d80a466a7a25ec7b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_pike, CEPH_REF=reef, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 20 08:58:08 np0005588920 suspicious_pike[78030]: 167 167
Jan 20 08:58:08 np0005588920 systemd[1]: libpod-540d01199290b52b0721a318af1bcfe765a49325e000b675d80a466a7a25ec7b.scope: Deactivated successfully.
Jan 20 08:58:08 np0005588920 conmon[78030]: conmon 540d01199290b52b0721 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-540d01199290b52b0721a318af1bcfe765a49325e000b675d80a466a7a25ec7b.scope/container/memory.events
Jan 20 08:58:08 np0005588920 podman[78014]: 2026-01-20 13:58:08.922554271 +0000 UTC m=+0.127792751 container died 540d01199290b52b0721a318af1bcfe765a49325e000b675d80a466a7a25ec7b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_pike, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef)
Jan 20 08:58:08 np0005588920 systemd[1]: var-lib-containers-storage-overlay-d85895790f4158b52bfe77603a40e0330820be41acf7a0e665e39df522c74247-merged.mount: Deactivated successfully.
Jan 20 08:58:08 np0005588920 podman[78014]: 2026-01-20 13:58:08.972143445 +0000 UTC m=+0.177381965 container remove 540d01199290b52b0721a318af1bcfe765a49325e000b675d80a466a7a25ec7b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_pike, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.build-date=20250507, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 20 08:58:08 np0005588920 systemd[1]: libpod-conmon-540d01199290b52b0721a318af1bcfe765a49325e000b675d80a466a7a25ec7b.scope: Deactivated successfully.
Jan 20 08:58:09 np0005588920 podman[78056]: 2026-01-20 13:58:09.144545746 +0000 UTC m=+0.060181107 container create 399f857734058b806bb04ea785f674edd956bf13d2bdb70729d8671be6e993d1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=inspiring_moser, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True)
Jan 20 08:58:09 np0005588920 systemd[1]: Started libpod-conmon-399f857734058b806bb04ea785f674edd956bf13d2bdb70729d8671be6e993d1.scope.
Jan 20 08:58:09 np0005588920 systemd[1]: Started libcrun container.
Jan 20 08:58:09 np0005588920 podman[78056]: 2026-01-20 13:58:09.122559639 +0000 UTC m=+0.038195010 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:58:09 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7c72e09207f10924141a4234a9ed42b5096656d1e34da34732fc32437355a046/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:09 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7c72e09207f10924141a4234a9ed42b5096656d1e34da34732fc32437355a046/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:09 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7c72e09207f10924141a4234a9ed42b5096656d1e34da34732fc32437355a046/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:09 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7c72e09207f10924141a4234a9ed42b5096656d1e34da34732fc32437355a046/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:09 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7c72e09207f10924141a4234a9ed42b5096656d1e34da34732fc32437355a046/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:09 np0005588920 podman[78056]: 2026-01-20 13:58:09.232940506 +0000 UTC m=+0.148575847 container init 399f857734058b806bb04ea785f674edd956bf13d2bdb70729d8671be6e993d1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=inspiring_moser, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef)
Jan 20 08:58:09 np0005588920 podman[78056]: 2026-01-20 13:58:09.251258184 +0000 UTC m=+0.166893555 container start 399f857734058b806bb04ea785f674edd956bf13d2bdb70729d8671be6e993d1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=inspiring_moser, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, OSD_FLAVOR=default)
Jan 20 08:58:09 np0005588920 podman[78056]: 2026-01-20 13:58:09.256139745 +0000 UTC m=+0.171775166 container attach 399f857734058b806bb04ea785f674edd956bf13d2bdb70729d8671be6e993d1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=inspiring_moser, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Jan 20 08:58:09 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:09 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:09 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:09 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:09 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 08:58:09 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 08:58:09 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:09 np0005588920 ceph-mgr[77507]: mgr[py] Loading python module 'devicehealth'
Jan 20 08:58:09 np0005588920 ceph-mgr[77507]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 20 08:58:09 np0005588920 ceph-mgr[77507]: mgr[py] Loading python module 'diskprediction_local'
Jan 20 08:58:09 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-2-gunjko[77503]: 2026-01-20T13:58:09.762+0000 7f425d6d4140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 20 08:58:09 np0005588920 inspiring_moser[78072]: --> passed data devices: 0 physical, 1 LVM
Jan 20 08:58:09 np0005588920 inspiring_moser[78072]: --> relative data size: 1.0
Jan 20 08:58:10 np0005588920 inspiring_moser[78072]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 20 08:58:10 np0005588920 inspiring_moser[78072]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new 36aff1f5-bc44-4633-b417-95c5b1ee6391
Jan 20 08:58:10 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-2-gunjko[77503]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Jan 20 08:58:10 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-2-gunjko[77503]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Jan 20 08:58:10 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-2-gunjko[77503]:  from numpy import show_config as show_numpy_config
Jan 20 08:58:10 np0005588920 ceph-mon[77148]: from='client.? 192.168.122.100:0/3413961177' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]: dispatch
Jan 20 08:58:10 np0005588920 ceph-mon[77148]: Health check update: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 20 08:58:10 np0005588920 ceph-mgr[77507]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 20 08:58:10 np0005588920 ceph-mgr[77507]: mgr[py] Loading python module 'influx'
Jan 20 08:58:10 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-2-gunjko[77503]: 2026-01-20T13:58:10.279+0000 7f425d6d4140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 20 08:58:10 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e34 e34: 2 total, 2 up, 2 in
Jan 20 08:58:10 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd new", "uuid": "36aff1f5-bc44-4633-b417-95c5b1ee6391"} v 0) v1
Jan 20 08:58:10 np0005588920 ceph-mon[77148]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/3257799028' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "36aff1f5-bc44-4633-b417-95c5b1ee6391"}]: dispatch
Jan 20 08:58:10 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e35 e35: 3 total, 2 up, 3 in
Jan 20 08:58:10 np0005588920 ceph-mgr[77507]: mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 20 08:58:10 np0005588920 ceph-mgr[77507]: mgr[py] Loading python module 'insights'
Jan 20 08:58:10 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-2-gunjko[77503]: 2026-01-20T13:58:10.526+0000 7f425d6d4140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 20 08:58:10 np0005588920 inspiring_moser[78072]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 20 08:58:10 np0005588920 inspiring_moser[78072]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-2
Jan 20 08:58:10 np0005588920 inspiring_moser[78072]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg0/ceph_lv0
Jan 20 08:58:10 np0005588920 inspiring_moser[78072]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 20 08:58:10 np0005588920 lvm[78120]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 20 08:58:10 np0005588920 lvm[78120]: VG ceph_vg0 finished
Jan 20 08:58:10 np0005588920 inspiring_moser[78072]: Running command: /usr/bin/ln -s /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Jan 20 08:58:10 np0005588920 inspiring_moser[78072]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-2/activate.monmap
Jan 20 08:58:10 np0005588920 ceph-mgr[77507]: mgr[py] Loading python module 'iostat'
Jan 20 08:58:11 np0005588920 ceph-mgr[77507]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 20 08:58:11 np0005588920 ceph-mgr[77507]: mgr[py] Loading python module 'k8sevents'
Jan 20 08:58:11 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-2-gunjko[77503]: 2026-01-20T13:58:10.999+0000 7f425d6d4140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 20 08:58:11 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon getmap"} v 0) v1
Jan 20 08:58:11 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/672023475' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
Jan 20 08:58:11 np0005588920 inspiring_moser[78072]: stderr: got monmap epoch 3
Jan 20 08:58:11 np0005588920 inspiring_moser[78072]: --> Creating keyring file for osd.2
Jan 20 08:58:11 np0005588920 inspiring_moser[78072]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2/keyring
Jan 20 08:58:11 np0005588920 inspiring_moser[78072]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2/
Jan 20 08:58:11 np0005588920 inspiring_moser[78072]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 2 --monmap /var/lib/ceph/osd/ceph-2/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-2/ --osd-uuid 36aff1f5-bc44-4633-b417-95c5b1ee6391 --setuser ceph --setgroup ceph
Jan 20 08:58:11 np0005588920 ceph-mon[77148]: from='client.? 192.168.122.100:0/3413961177' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]': finished
Jan 20 08:58:11 np0005588920 ceph-mon[77148]: from='client.? 192.168.122.102:0/3257799028' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "36aff1f5-bc44-4633-b417-95c5b1ee6391"}]: dispatch
Jan 20 08:58:11 np0005588920 ceph-mon[77148]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "36aff1f5-bc44-4633-b417-95c5b1ee6391"}]: dispatch
Jan 20 08:58:11 np0005588920 ceph-mon[77148]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "36aff1f5-bc44-4633-b417-95c5b1ee6391"}]': finished
Jan 20 08:58:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e35 _set_new_cache_sizes cache_size:1020053098 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 08:58:12 np0005588920 ceph-mon[77148]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Jan 20 08:58:12 np0005588920 ceph-mon[77148]: Cluster is now healthy
Jan 20 08:58:12 np0005588920 ceph-mgr[77507]: mgr[py] Loading python module 'localpool'
Jan 20 08:58:12 np0005588920 ceph-mgr[77507]: mgr[py] Loading python module 'mds_autoscaler'
Jan 20 08:58:13 np0005588920 ceph-mgr[77507]: mgr[py] Loading python module 'mirroring'
Jan 20 08:58:13 np0005588920 ceph-mgr[77507]: mgr[py] Loading python module 'nfs'
Jan 20 08:58:14 np0005588920 ceph-mgr[77507]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 20 08:58:14 np0005588920 ceph-mgr[77507]: mgr[py] Loading python module 'orchestrator'
Jan 20 08:58:14 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-2-gunjko[77503]: 2026-01-20T13:58:14.615+0000 7f425d6d4140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 20 08:58:14 np0005588920 inspiring_moser[78072]: stderr: 2026-01-20T13:58:11.168+0000 7f5d9daf2740 -1 bluestore(/var/lib/ceph/osd/ceph-2//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Jan 20 08:58:14 np0005588920 inspiring_moser[78072]: stderr: 2026-01-20T13:58:11.168+0000 7f5d9daf2740 -1 bluestore(/var/lib/ceph/osd/ceph-2//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Jan 20 08:58:14 np0005588920 inspiring_moser[78072]: stderr: 2026-01-20T13:58:11.168+0000 7f5d9daf2740 -1 bluestore(/var/lib/ceph/osd/ceph-2//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Jan 20 08:58:14 np0005588920 inspiring_moser[78072]: stderr: 2026-01-20T13:58:11.168+0000 7f5d9daf2740 -1 bluestore(/var/lib/ceph/osd/ceph-2/) _read_fsid unparsable uuid
Jan 20 08:58:14 np0005588920 inspiring_moser[78072]: --> ceph-volume lvm prepare successful for: ceph_vg0/ceph_lv0
Jan 20 08:58:15 np0005588920 inspiring_moser[78072]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Jan 20 08:58:15 np0005588920 inspiring_moser[78072]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-2 --no-mon-config
Jan 20 08:58:15 np0005588920 inspiring_moser[78072]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Jan 20 08:58:15 np0005588920 inspiring_moser[78072]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block
Jan 20 08:58:15 np0005588920 inspiring_moser[78072]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 20 08:58:15 np0005588920 inspiring_moser[78072]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Jan 20 08:58:15 np0005588920 inspiring_moser[78072]: --> ceph-volume lvm activate successful for osd ID: 2
Jan 20 08:58:15 np0005588920 inspiring_moser[78072]: --> ceph-volume lvm create successful for: ceph_vg0/ceph_lv0
Jan 20 08:58:15 np0005588920 systemd[1]: libpod-399f857734058b806bb04ea785f674edd956bf13d2bdb70729d8671be6e993d1.scope: Deactivated successfully.
Jan 20 08:58:15 np0005588920 systemd[1]: libpod-399f857734058b806bb04ea785f674edd956bf13d2bdb70729d8671be6e993d1.scope: Consumed 2.369s CPU time.
Jan 20 08:58:15 np0005588920 podman[78056]: 2026-01-20 13:58:15.131393438 +0000 UTC m=+6.047028779 container died 399f857734058b806bb04ea785f674edd956bf13d2bdb70729d8671be6e993d1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=inspiring_moser, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.build-date=20250507)
Jan 20 08:58:15 np0005588920 systemd[1]: var-lib-containers-storage-overlay-7c72e09207f10924141a4234a9ed42b5096656d1e34da34732fc32437355a046-merged.mount: Deactivated successfully.
Jan 20 08:58:15 np0005588920 podman[78056]: 2026-01-20 13:58:15.272562316 +0000 UTC m=+6.188197657 container remove 399f857734058b806bb04ea785f674edd956bf13d2bdb70729d8671be6e993d1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=inspiring_moser, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default)
Jan 20 08:58:15 np0005588920 systemd[1]: libpod-conmon-399f857734058b806bb04ea785f674edd956bf13d2bdb70729d8671be6e993d1.scope: Deactivated successfully.
Jan 20 08:58:15 np0005588920 ceph-mgr[77507]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 20 08:58:15 np0005588920 ceph-mgr[77507]: mgr[py] Loading python module 'osd_perf_query'
Jan 20 08:58:15 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-2-gunjko[77503]: 2026-01-20T13:58:15.283+0000 7f425d6d4140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 20 08:58:15 np0005588920 ceph-mon[77148]: from='client.? 192.168.122.100:0/1467956015' entity='client.admin' cmd=[{"prefix": "config assimilate-conf"}]: dispatch
Jan 20 08:58:15 np0005588920 ceph-mon[77148]: from='client.? 192.168.122.100:0/1467956015' entity='client.admin' cmd='[{"prefix": "config assimilate-conf"}]': finished
Jan 20 08:58:15 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:15 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:15 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 20 08:58:15 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 20 08:58:15 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 20 08:58:15 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 20 08:58:15 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 20 08:58:15 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 20 08:58:15 np0005588920 ceph-mgr[77507]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 20 08:58:15 np0005588920 ceph-mgr[77507]: mgr[py] Loading python module 'osd_support'
Jan 20 08:58:15 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-2-gunjko[77503]: 2026-01-20T13:58:15.556+0000 7f425d6d4140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 20 08:58:15 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e36 e36: 3 total, 2 up, 3 in
Jan 20 08:58:15 np0005588920 ceph-mgr[77507]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 20 08:58:15 np0005588920 ceph-mgr[77507]: mgr[py] Loading python module 'pg_autoscaler'
Jan 20 08:58:15 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-2-gunjko[77503]: 2026-01-20T13:58:15.805+0000 7f425d6d4140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 20 08:58:15 np0005588920 podman[79174]: 2026-01-20 13:58:15.935571652 +0000 UTC m=+0.057176747 container create f2b099066deea972f1e286a25fd4da3cd0c0f8f6689e3314522e762b895fa3a0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=jolly_einstein, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Jan 20 08:58:15 np0005588920 systemd[1]: Started libpod-conmon-f2b099066deea972f1e286a25fd4da3cd0c0f8f6689e3314522e762b895fa3a0.scope.
Jan 20 08:58:16 np0005588920 podman[79174]: 2026-01-20 13:58:15.914421038 +0000 UTC m=+0.036026113 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:58:16 np0005588920 systemd[1]: Started libcrun container.
Jan 20 08:58:16 np0005588920 podman[79174]: 2026-01-20 13:58:16.042340201 +0000 UTC m=+0.163945336 container init f2b099066deea972f1e286a25fd4da3cd0c0f8f6689e3314522e762b895fa3a0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=jolly_einstein, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Jan 20 08:58:16 np0005588920 podman[79174]: 2026-01-20 13:58:16.050823037 +0000 UTC m=+0.172428102 container start f2b099066deea972f1e286a25fd4da3cd0c0f8f6689e3314522e762b895fa3a0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=jolly_einstein, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_REF=reef)
Jan 20 08:58:16 np0005588920 podman[79174]: 2026-01-20 13:58:16.054399043 +0000 UTC m=+0.176004128 container attach f2b099066deea972f1e286a25fd4da3cd0c0f8f6689e3314522e762b895fa3a0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=jolly_einstein, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 20 08:58:16 np0005588920 jolly_einstein[79191]: 167 167
Jan 20 08:58:16 np0005588920 systemd[1]: libpod-f2b099066deea972f1e286a25fd4da3cd0c0f8f6689e3314522e762b895fa3a0.scope: Deactivated successfully.
Jan 20 08:58:16 np0005588920 podman[79174]: 2026-01-20 13:58:16.056433047 +0000 UTC m=+0.178038132 container died f2b099066deea972f1e286a25fd4da3cd0c0f8f6689e3314522e762b895fa3a0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=jolly_einstein, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 08:58:16 np0005588920 ceph-mgr[77507]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 20 08:58:16 np0005588920 ceph-mgr[77507]: mgr[py] Loading python module 'progress'
Jan 20 08:58:16 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-2-gunjko[77503]: 2026-01-20T13:58:16.078+0000 7f425d6d4140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 20 08:58:16 np0005588920 systemd[1]: var-lib-containers-storage-overlay-fbc6bf337d02b0e9943aabdc85948df440d916aff15b6210c89217269c846179-merged.mount: Deactivated successfully.
Jan 20 08:58:16 np0005588920 podman[79174]: 2026-01-20 13:58:16.10526043 +0000 UTC m=+0.226865525 container remove f2b099066deea972f1e286a25fd4da3cd0c0f8f6689e3314522e762b895fa3a0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=jolly_einstein, org.label-schema.schema-version=1.0, CEPH_REF=reef, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Jan 20 08:58:16 np0005588920 systemd[1]: libpod-conmon-f2b099066deea972f1e286a25fd4da3cd0c0f8f6689e3314522e762b895fa3a0.scope: Deactivated successfully.
Jan 20 08:58:16 np0005588920 podman[79214]: 2026-01-20 13:58:16.266443032 +0000 UTC m=+0.042324290 container create 8d549355bfffb0652dd3749aa19a8fd173716972c4b70259b841acb2c113a5f9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=cool_meitner, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 20 08:58:16 np0005588920 systemd[1]: Started libpod-conmon-8d549355bfffb0652dd3749aa19a8fd173716972c4b70259b841acb2c113a5f9.scope.
Jan 20 08:58:16 np0005588920 ceph-mgr[77507]: mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 20 08:58:16 np0005588920 ceph-mgr[77507]: mgr[py] Loading python module 'prometheus'
Jan 20 08:58:16 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-2-gunjko[77503]: 2026-01-20T13:58:16.318+0000 7f425d6d4140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 20 08:58:16 np0005588920 podman[79214]: 2026-01-20 13:58:16.245182575 +0000 UTC m=+0.021063863 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:58:16 np0005588920 systemd[1]: Started libcrun container.
Jan 20 08:58:16 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/26ec023a65f8d920a2c3422ce3e91b709549756aed174b9ed06b50c883a0bf97/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:16 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/26ec023a65f8d920a2c3422ce3e91b709549756aed174b9ed06b50c883a0bf97/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:16 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/26ec023a65f8d920a2c3422ce3e91b709549756aed174b9ed06b50c883a0bf97/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:16 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/26ec023a65f8d920a2c3422ce3e91b709549756aed174b9ed06b50c883a0bf97/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:16 np0005588920 podman[79214]: 2026-01-20 13:58:16.368057045 +0000 UTC m=+0.143938393 container init 8d549355bfffb0652dd3749aa19a8fd173716972c4b70259b841acb2c113a5f9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=cool_meitner, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 20 08:58:16 np0005588920 podman[79214]: 2026-01-20 13:58:16.380573029 +0000 UTC m=+0.156454287 container start 8d549355bfffb0652dd3749aa19a8fd173716972c4b70259b841acb2c113a5f9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=cool_meitner, org.label-schema.vendor=CentOS, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 20 08:58:16 np0005588920 podman[79214]: 2026-01-20 13:58:16.383972459 +0000 UTC m=+0.159853757 container attach 8d549355bfffb0652dd3749aa19a8fd173716972c4b70259b841acb2c113a5f9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=cool_meitner, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef)
Jan 20 08:58:16 np0005588920 ceph-mon[77148]: from='client.? 192.168.122.100:0/3060958510' entity='client.admin' 
Jan 20 08:58:16 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 20 08:58:16 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 20 08:58:16 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 20 08:58:16 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 20 08:58:16 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 20 08:58:16 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 20 08:58:16 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e37 e37: 3 total, 2 up, 3 in
Jan 20 08:58:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e37 _set_new_cache_sizes cache_size:1020054710 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 08:58:17 np0005588920 cool_meitner[79231]: {
Jan 20 08:58:17 np0005588920 cool_meitner[79231]:    "2": [
Jan 20 08:58:17 np0005588920 cool_meitner[79231]:        {
Jan 20 08:58:17 np0005588920 cool_meitner[79231]:            "devices": [
Jan 20 08:58:17 np0005588920 cool_meitner[79231]:                "/dev/loop3"
Jan 20 08:58:17 np0005588920 cool_meitner[79231]:            ],
Jan 20 08:58:17 np0005588920 cool_meitner[79231]:            "lv_name": "ceph_lv0",
Jan 20 08:58:17 np0005588920 cool_meitner[79231]:            "lv_path": "/dev/ceph_vg0/ceph_lv0",
Jan 20 08:58:17 np0005588920 cool_meitner[79231]:            "lv_size": "7511998464",
Jan 20 08:58:17 np0005588920 cool_meitner[79231]:            "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=aZBXRf-5VFi-lyGe-Oq4U-6c95-eEU6-kCNDa0,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=e399cf45-e6b6-5393-99f1-75c601d3f188,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=36aff1f5-bc44-4633-b417-95c5b1ee6391,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Jan 20 08:58:17 np0005588920 cool_meitner[79231]:            "lv_uuid": "aZBXRf-5VFi-lyGe-Oq4U-6c95-eEU6-kCNDa0",
Jan 20 08:58:17 np0005588920 cool_meitner[79231]:            "name": "ceph_lv0",
Jan 20 08:58:17 np0005588920 cool_meitner[79231]:            "path": "/dev/ceph_vg0/ceph_lv0",
Jan 20 08:58:17 np0005588920 cool_meitner[79231]:            "tags": {
Jan 20 08:58:17 np0005588920 cool_meitner[79231]:                "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Jan 20 08:58:17 np0005588920 cool_meitner[79231]:                "ceph.block_uuid": "aZBXRf-5VFi-lyGe-Oq4U-6c95-eEU6-kCNDa0",
Jan 20 08:58:17 np0005588920 cool_meitner[79231]:                "ceph.cephx_lockbox_secret": "",
Jan 20 08:58:17 np0005588920 cool_meitner[79231]:                "ceph.cluster_fsid": "e399cf45-e6b6-5393-99f1-75c601d3f188",
Jan 20 08:58:17 np0005588920 cool_meitner[79231]:                "ceph.cluster_name": "ceph",
Jan 20 08:58:17 np0005588920 cool_meitner[79231]:                "ceph.crush_device_class": "",
Jan 20 08:58:17 np0005588920 cool_meitner[79231]:                "ceph.encrypted": "0",
Jan 20 08:58:17 np0005588920 cool_meitner[79231]:                "ceph.osd_fsid": "36aff1f5-bc44-4633-b417-95c5b1ee6391",
Jan 20 08:58:17 np0005588920 cool_meitner[79231]:                "ceph.osd_id": "2",
Jan 20 08:58:17 np0005588920 cool_meitner[79231]:                "ceph.osdspec_affinity": "default_drive_group",
Jan 20 08:58:17 np0005588920 cool_meitner[79231]:                "ceph.type": "block",
Jan 20 08:58:17 np0005588920 cool_meitner[79231]:                "ceph.vdo": "0"
Jan 20 08:58:17 np0005588920 cool_meitner[79231]:            },
Jan 20 08:58:17 np0005588920 cool_meitner[79231]:            "type": "block",
Jan 20 08:58:17 np0005588920 cool_meitner[79231]:            "vg_name": "ceph_vg0"
Jan 20 08:58:17 np0005588920 cool_meitner[79231]:        }
Jan 20 08:58:17 np0005588920 cool_meitner[79231]:    ]
Jan 20 08:58:17 np0005588920 cool_meitner[79231]: }
Jan 20 08:58:17 np0005588920 systemd[1]: libpod-8d549355bfffb0652dd3749aa19a8fd173716972c4b70259b841acb2c113a5f9.scope: Deactivated successfully.
Jan 20 08:58:17 np0005588920 podman[79214]: 2026-01-20 13:58:17.168119159 +0000 UTC m=+0.944000417 container died 8d549355bfffb0652dd3749aa19a8fd173716972c4b70259b841acb2c113a5f9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=cool_meitner, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 20 08:58:17 np0005588920 systemd[1]: var-lib-containers-storage-overlay-26ec023a65f8d920a2c3422ce3e91b709549756aed174b9ed06b50c883a0bf97-merged.mount: Deactivated successfully.
Jan 20 08:58:17 np0005588920 podman[79214]: 2026-01-20 13:58:17.222644354 +0000 UTC m=+0.998525612 container remove 8d549355bfffb0652dd3749aa19a8fd173716972c4b70259b841acb2c113a5f9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=cool_meitner, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 20 08:58:17 np0005588920 systemd[1]: libpod-conmon-8d549355bfffb0652dd3749aa19a8fd173716972c4b70259b841acb2c113a5f9.scope: Deactivated successfully.
Jan 20 08:58:17 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-2-gunjko[77503]: 2026-01-20T13:58:17.342+0000 7f425d6d4140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 20 08:58:17 np0005588920 ceph-mgr[77507]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 20 08:58:17 np0005588920 ceph-mgr[77507]: mgr[py] Loading python module 'rbd_support'
Jan 20 08:58:17 np0005588920 ceph-mgr[77507]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 20 08:58:17 np0005588920 ceph-mgr[77507]: mgr[py] Loading python module 'restful'
Jan 20 08:58:17 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-2-gunjko[77503]: 2026-01-20T13:58:17.644+0000 7f425d6d4140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 20 08:58:17 np0005588920 ceph-mon[77148]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Jan 20 08:58:17 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:17 np0005588920 ceph-mon[77148]: Saving service ingress.rgw.default spec with placement count:2
Jan 20 08:58:17 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:17 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch
Jan 20 08:58:17 np0005588920 podman[79395]: 2026-01-20 13:58:17.82644956 +0000 UTC m=+0.038876919 container create eb36b2ea8bae4b5c8f71d77515f14a78c890561ff532cf6e3289012533ff1e77 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elated_benz, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 20 08:58:17 np0005588920 systemd[1]: Started libpod-conmon-eb36b2ea8bae4b5c8f71d77515f14a78c890561ff532cf6e3289012533ff1e77.scope.
Jan 20 08:58:17 np0005588920 systemd[1]: Started libcrun container.
Jan 20 08:58:17 np0005588920 podman[79395]: 2026-01-20 13:58:17.896811718 +0000 UTC m=+0.109239117 container init eb36b2ea8bae4b5c8f71d77515f14a78c890561ff532cf6e3289012533ff1e77 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elated_benz, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 20 08:58:17 np0005588920 podman[79395]: 2026-01-20 13:58:17.90211968 +0000 UTC m=+0.114547049 container start eb36b2ea8bae4b5c8f71d77515f14a78c890561ff532cf6e3289012533ff1e77 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elated_benz, ceph=True, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2)
Jan 20 08:58:17 np0005588920 podman[79395]: 2026-01-20 13:58:17.808420119 +0000 UTC m=+0.020847518 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:58:17 np0005588920 podman[79395]: 2026-01-20 13:58:17.906849036 +0000 UTC m=+0.119276445 container attach eb36b2ea8bae4b5c8f71d77515f14a78c890561ff532cf6e3289012533ff1e77 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elated_benz, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.schema-version=1.0)
Jan 20 08:58:17 np0005588920 elated_benz[79413]: 167 167
Jan 20 08:58:17 np0005588920 systemd[1]: libpod-eb36b2ea8bae4b5c8f71d77515f14a78c890561ff532cf6e3289012533ff1e77.scope: Deactivated successfully.
Jan 20 08:58:17 np0005588920 conmon[79413]: conmon eb36b2ea8bae4b5c8f71 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-eb36b2ea8bae4b5c8f71d77515f14a78c890561ff532cf6e3289012533ff1e77.scope/container/memory.events
Jan 20 08:58:17 np0005588920 podman[79395]: 2026-01-20 13:58:17.909322252 +0000 UTC m=+0.121749681 container died eb36b2ea8bae4b5c8f71d77515f14a78c890561ff532cf6e3289012533ff1e77 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elated_benz, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 20 08:58:17 np0005588920 systemd[1]: var-lib-containers-storage-overlay-e4d5dc4e4087ff34a27b0b5dd99fcd0d13f4f5705ead65f4057be05ed7dc4bae-merged.mount: Deactivated successfully.
Jan 20 08:58:17 np0005588920 podman[79395]: 2026-01-20 13:58:17.947277625 +0000 UTC m=+0.159704994 container remove eb36b2ea8bae4b5c8f71d77515f14a78c890561ff532cf6e3289012533ff1e77 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elated_benz, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3)
Jan 20 08:58:17 np0005588920 systemd[1]: libpod-conmon-eb36b2ea8bae4b5c8f71d77515f14a78c890561ff532cf6e3289012533ff1e77.scope: Deactivated successfully.
Jan 20 08:58:18 np0005588920 podman[79444]: 2026-01-20 13:58:18.2880469 +0000 UTC m=+0.042786583 container create 6d3958a2b560efc17396b78666551c331c3bb35c158a36903f024842002c285f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-2-activate-test, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Jan 20 08:58:18 np0005588920 systemd[1]: Started libpod-conmon-6d3958a2b560efc17396b78666551c331c3bb35c158a36903f024842002c285f.scope.
Jan 20 08:58:18 np0005588920 podman[79444]: 2026-01-20 13:58:18.267723788 +0000 UTC m=+0.022463481 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:58:18 np0005588920 systemd[1]: Started libcrun container.
Jan 20 08:58:18 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/18a96a50eae77e440ffa9d0f0217a430141569d45e5b55db6dc7ee6364019bfa/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:18 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/18a96a50eae77e440ffa9d0f0217a430141569d45e5b55db6dc7ee6364019bfa/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:18 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/18a96a50eae77e440ffa9d0f0217a430141569d45e5b55db6dc7ee6364019bfa/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:18 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/18a96a50eae77e440ffa9d0f0217a430141569d45e5b55db6dc7ee6364019bfa/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:18 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/18a96a50eae77e440ffa9d0f0217a430141569d45e5b55db6dc7ee6364019bfa/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:18 np0005588920 ceph-mgr[77507]: mgr[py] Loading python module 'rgw'
Jan 20 08:58:18 np0005588920 podman[79444]: 2026-01-20 13:58:18.464575352 +0000 UTC m=+0.219315095 container init 6d3958a2b560efc17396b78666551c331c3bb35c158a36903f024842002c285f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-2-activate-test, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef)
Jan 20 08:58:18 np0005588920 podman[79444]: 2026-01-20 13:58:18.472592986 +0000 UTC m=+0.227332669 container start 6d3958a2b560efc17396b78666551c331c3bb35c158a36903f024842002c285f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-2-activate-test, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, ceph=True, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Jan 20 08:58:18 np0005588920 podman[79444]: 2026-01-20 13:58:18.477534548 +0000 UTC m=+0.232274301 container attach 6d3958a2b560efc17396b78666551c331c3bb35c158a36903f024842002c285f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-2-activate-test, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507)
Jan 20 08:58:18 np0005588920 ceph-mon[77148]: Deploying daemon osd.2 on compute-2
Jan 20 08:58:19 np0005588920 ceph-mgr[77507]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 20 08:58:19 np0005588920 ceph-mgr[77507]: mgr[py] Loading python module 'rook'
Jan 20 08:58:19 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-2-gunjko[77503]: 2026-01-20T13:58:19.072+0000 7f425d6d4140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 20 08:58:19 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-2-activate-test[79460]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_UUID]
Jan 20 08:58:19 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-2-activate-test[79460]:                            [--no-systemd] [--no-tmpfs]
Jan 20 08:58:19 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-2-activate-test[79460]: ceph-volume activate: error: unrecognized arguments: --bad-option
Jan 20 08:58:19 np0005588920 systemd[1]: libpod-6d3958a2b560efc17396b78666551c331c3bb35c158a36903f024842002c285f.scope: Deactivated successfully.
Jan 20 08:58:19 np0005588920 podman[79444]: 2026-01-20 13:58:19.112938647 +0000 UTC m=+0.867678290 container died 6d3958a2b560efc17396b78666551c331c3bb35c158a36903f024842002c285f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-2-activate-test, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.vendor=CentOS)
Jan 20 08:58:19 np0005588920 systemd[1]: var-lib-containers-storage-overlay-18a96a50eae77e440ffa9d0f0217a430141569d45e5b55db6dc7ee6364019bfa-merged.mount: Deactivated successfully.
Jan 20 08:58:19 np0005588920 podman[79444]: 2026-01-20 13:58:19.167054592 +0000 UTC m=+0.921794235 container remove 6d3958a2b560efc17396b78666551c331c3bb35c158a36903f024842002c285f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-2-activate-test, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 20 08:58:19 np0005588920 systemd[1]: libpod-conmon-6d3958a2b560efc17396b78666551c331c3bb35c158a36903f024842002c285f.scope: Deactivated successfully.
Jan 20 08:58:19 np0005588920 systemd[1]: Reloading.
Jan 20 08:58:19 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:58:19 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:58:19 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).mds e2 new map
Jan 20 08:58:19 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).mds e2 print_map#012e2#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0112#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-20T13:58:19.644785+0000#012modified#0112026-01-20T13:58:19.644841+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#011#012up#011{}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012 #012 
Jan 20 08:58:19 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e38 e38: 3 total, 2 up, 3 in
Jan 20 08:58:19 np0005588920 systemd[1]: Reloading.
Jan 20 08:58:19 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]: dispatch
Jan 20 08:58:19 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]: dispatch
Jan 20 08:58:19 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]: dispatch
Jan 20 08:58:19 np0005588920 ceph-mon[77148]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN)
Jan 20 08:58:19 np0005588920 ceph-mon[77148]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX)
Jan 20 08:58:19 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]': finished
Jan 20 08:58:19 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:19 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:58:19 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:58:19 np0005588920 systemd[1]: Starting Ceph osd.2 for e399cf45-e6b6-5393-99f1-75c601d3f188...
Jan 20 08:58:20 np0005588920 podman[79618]: 2026-01-20 13:58:20.228041149 +0000 UTC m=+0.029506248 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:58:20 np0005588920 podman[79618]: 2026-01-20 13:58:20.473257684 +0000 UTC m=+0.274722733 container create fbcbf6cdb8909583444ca58a12fd5e06d17d5558bd78d6f598da7bbb4f0b3d05 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-2-activate, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 20 08:58:20 np0005588920 systemd[1]: Started libcrun container.
Jan 20 08:58:20 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c9bb262c385775286f712cae2a1c9dd77b0cb164a9dc4afde404e44090f48b32/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:20 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c9bb262c385775286f712cae2a1c9dd77b0cb164a9dc4afde404e44090f48b32/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:20 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c9bb262c385775286f712cae2a1c9dd77b0cb164a9dc4afde404e44090f48b32/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:20 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c9bb262c385775286f712cae2a1c9dd77b0cb164a9dc4afde404e44090f48b32/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:20 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c9bb262c385775286f712cae2a1c9dd77b0cb164a9dc4afde404e44090f48b32/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:20 np0005588920 podman[79618]: 2026-01-20 13:58:20.59863149 +0000 UTC m=+0.400096499 container init fbcbf6cdb8909583444ca58a12fd5e06d17d5558bd78d6f598da7bbb4f0b3d05 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-2-activate, ceph=True, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 20 08:58:20 np0005588920 podman[79618]: 2026-01-20 13:58:20.605554635 +0000 UTC m=+0.407019634 container start fbcbf6cdb8909583444ca58a12fd5e06d17d5558bd78d6f598da7bbb4f0b3d05 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-2-activate, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 20 08:58:20 np0005588920 podman[79618]: 2026-01-20 13:58:20.612564612 +0000 UTC m=+0.414029671 container attach fbcbf6cdb8909583444ca58a12fd5e06d17d5558bd78d6f598da7bbb4f0b3d05 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-2-activate, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 20 08:58:20 np0005588920 ceph-mon[77148]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Jan 20 08:58:20 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:21 np0005588920 ceph-mgr[77507]: mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 20 08:58:21 np0005588920 ceph-mgr[77507]: mgr[py] Loading python module 'selftest'
Jan 20 08:58:21 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-2-gunjko[77503]: 2026-01-20T13:58:21.161+0000 7f425d6d4140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 20 08:58:21 np0005588920 ceph-mgr[77507]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 20 08:58:21 np0005588920 ceph-mgr[77507]: mgr[py] Loading python module 'snap_schedule'
Jan 20 08:58:21 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-2-gunjko[77503]: 2026-01-20T13:58:21.402+0000 7f425d6d4140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 20 08:58:21 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-2-activate[79634]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Jan 20 08:58:21 np0005588920 bash[79618]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Jan 20 08:58:21 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-2-activate[79634]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-2 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Jan 20 08:58:21 np0005588920 bash[79618]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-2 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Jan 20 08:58:21 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-2-activate[79634]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Jan 20 08:58:21 np0005588920 bash[79618]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Jan 20 08:58:21 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-2-activate[79634]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 20 08:58:21 np0005588920 bash[79618]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 20 08:58:21 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-2-activate[79634]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Jan 20 08:58:21 np0005588920 bash[79618]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Jan 20 08:58:21 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-2-activate[79634]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Jan 20 08:58:21 np0005588920 bash[79618]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Jan 20 08:58:21 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-2-activate[79634]: --> ceph-volume raw activate successful for osd ID: 2
Jan 20 08:58:21 np0005588920 bash[79618]: --> ceph-volume raw activate successful for osd ID: 2
Jan 20 08:58:21 np0005588920 systemd[1]: libpod-fbcbf6cdb8909583444ca58a12fd5e06d17d5558bd78d6f598da7bbb4f0b3d05.scope: Deactivated successfully.
Jan 20 08:58:21 np0005588920 podman[79618]: 2026-01-20 13:58:21.531409777 +0000 UTC m=+1.332874786 container died fbcbf6cdb8909583444ca58a12fd5e06d17d5558bd78d6f598da7bbb4f0b3d05 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-2-activate, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 20 08:58:21 np0005588920 systemd[1]: var-lib-containers-storage-overlay-c9bb262c385775286f712cae2a1c9dd77b0cb164a9dc4afde404e44090f48b32-merged.mount: Deactivated successfully.
Jan 20 08:58:21 np0005588920 podman[79618]: 2026-01-20 13:58:21.599958547 +0000 UTC m=+1.401423576 container remove fbcbf6cdb8909583444ca58a12fd5e06d17d5558bd78d6f598da7bbb4f0b3d05 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-2-activate, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, OSD_FLAVOR=default, ceph=True)
Jan 20 08:58:21 np0005588920 ceph-mgr[77507]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 20 08:58:21 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-2-gunjko[77503]: 2026-01-20T13:58:21.655+0000 7f425d6d4140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 20 08:58:21 np0005588920 ceph-mgr[77507]: mgr[py] Loading python module 'stats'
Jan 20 08:58:21 np0005588920 ceph-mon[77148]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Jan 20 08:58:21 np0005588920 podman[79801]: 2026-01-20 13:58:21.878974674 +0000 UTC m=+0.056398307 container create 585a0e7d4bc77578a493215d05b63f4a42bd1df58fe33aac7639a834ccaa48cb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 20 08:58:21 np0005588920 ceph-mgr[77507]: mgr[py] Loading python module 'status'
Jan 20 08:58:21 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b520fa6a0e00e9ad69665cc7a3521170a683e24a86602ae94a5980ea8e82fe93/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:21 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b520fa6a0e00e9ad69665cc7a3521170a683e24a86602ae94a5980ea8e82fe93/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:21 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b520fa6a0e00e9ad69665cc7a3521170a683e24a86602ae94a5980ea8e82fe93/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:21 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b520fa6a0e00e9ad69665cc7a3521170a683e24a86602ae94a5980ea8e82fe93/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:21 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b520fa6a0e00e9ad69665cc7a3521170a683e24a86602ae94a5980ea8e82fe93/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:21 np0005588920 podman[79801]: 2026-01-20 13:58:21.854003247 +0000 UTC m=+0.031426960 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:58:21 np0005588920 podman[79801]: 2026-01-20 13:58:21.950344499 +0000 UTC m=+0.127768122 container init 585a0e7d4bc77578a493215d05b63f4a42bd1df58fe33aac7639a834ccaa48cb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 20 08:58:21 np0005588920 podman[79801]: 2026-01-20 13:58:21.95637605 +0000 UTC m=+0.133799673 container start 585a0e7d4bc77578a493215d05b63f4a42bd1df58fe33aac7639a834ccaa48cb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-2, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 20 08:58:21 np0005588920 bash[79801]: 585a0e7d4bc77578a493215d05b63f4a42bd1df58fe33aac7639a834ccaa48cb
Jan 20 08:58:21 np0005588920 systemd[1]: Started Ceph osd.2 for e399cf45-e6b6-5393-99f1-75c601d3f188.
Jan 20 08:58:22 np0005588920 ceph-osd[79820]: set uid:gid to 167:167 (ceph:ceph)
Jan 20 08:58:22 np0005588920 ceph-osd[79820]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-osd, pid 2
Jan 20 08:58:22 np0005588920 ceph-osd[79820]: pidfile_write: ignore empty --pid-file
Jan 20 08:58:22 np0005588920 ceph-osd[79820]: bdev(0x562f881b3c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 20 08:58:22 np0005588920 ceph-osd[79820]: bdev(0x562f881b3c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 20 08:58:22 np0005588920 ceph-osd[79820]: bdev(0x562f881b3c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 20 08:58:22 np0005588920 ceph-osd[79820]: bdev(0x562f881b3c00 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 20 08:58:22 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 20 08:58:22 np0005588920 ceph-osd[79820]: bdev(0x562f88fc9000 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 20 08:58:22 np0005588920 ceph-osd[79820]: bdev(0x562f88fc9000 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 20 08:58:22 np0005588920 ceph-osd[79820]: bdev(0x562f88fc9000 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 20 08:58:22 np0005588920 ceph-osd[79820]: bdev(0x562f88fc9000 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 20 08:58:22 np0005588920 ceph-osd[79820]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 7.0 GiB
Jan 20 08:58:22 np0005588920 ceph-osd[79820]: bdev(0x562f88fc9000 /var/lib/ceph/osd/ceph-2/block) close
Jan 20 08:58:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e38 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 08:58:22 np0005588920 ceph-mgr[77507]: mgr[py] Module status has missing NOTIFY_TYPES member
Jan 20 08:58:22 np0005588920 ceph-mgr[77507]: mgr[py] Loading python module 'telegraf'
Jan 20 08:58:22 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-2-gunjko[77503]: 2026-01-20T13:58:22.173+0000 7f425d6d4140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Jan 20 08:58:22 np0005588920 ceph-osd[79820]: bdev(0x562f881b3c00 /var/lib/ceph/osd/ceph-2/block) close
Jan 20 08:58:22 np0005588920 ceph-mgr[77507]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 20 08:58:22 np0005588920 ceph-mgr[77507]: mgr[py] Loading python module 'telemetry'
Jan 20 08:58:22 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-2-gunjko[77503]: 2026-01-20T13:58:22.402+0000 7f425d6d4140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 20 08:58:22 np0005588920 ceph-osd[79820]: starting osd.2 osd_data /var/lib/ceph/osd/ceph-2 /var/lib/ceph/osd/ceph-2/journal
Jan 20 08:58:22 np0005588920 ceph-osd[79820]: load: jerasure load: lrc 
Jan 20 08:58:22 np0005588920 ceph-osd[79820]: bdev(0x562f8904ec00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 20 08:58:22 np0005588920 ceph-osd[79820]: bdev(0x562f8904ec00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 20 08:58:22 np0005588920 ceph-osd[79820]: bdev(0x562f8904ec00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 20 08:58:22 np0005588920 ceph-osd[79820]: bdev(0x562f8904ec00 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 20 08:58:22 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 20 08:58:22 np0005588920 ceph-osd[79820]: bdev(0x562f8904ec00 /var/lib/ceph/osd/ceph-2/block) close
Jan 20 08:58:22 np0005588920 podman[79983]: 2026-01-20 13:58:22.69896465 +0000 UTC m=+0.049843192 container create 80d8d3efdc1a26321983e5f8078056499ab7401619ae48350947e00fe506fa87 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_rosalind, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 20 08:58:22 np0005588920 systemd[1]: Started libpod-conmon-80d8d3efdc1a26321983e5f8078056499ab7401619ae48350947e00fe506fa87.scope.
Jan 20 08:58:22 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:22 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:22 np0005588920 podman[79983]: 2026-01-20 13:58:22.676456459 +0000 UTC m=+0.027335041 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:58:22 np0005588920 systemd[1]: Started libcrun container.
Jan 20 08:58:22 np0005588920 podman[79983]: 2026-01-20 13:58:22.792798564 +0000 UTC m=+0.143677176 container init 80d8d3efdc1a26321983e5f8078056499ab7401619ae48350947e00fe506fa87 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_rosalind, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 20 08:58:22 np0005588920 podman[79983]: 2026-01-20 13:58:22.803028777 +0000 UTC m=+0.153907309 container start 80d8d3efdc1a26321983e5f8078056499ab7401619ae48350947e00fe506fa87 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_rosalind, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Jan 20 08:58:22 np0005588920 podman[79983]: 2026-01-20 13:58:22.807215939 +0000 UTC m=+0.158094471 container attach 80d8d3efdc1a26321983e5f8078056499ab7401619ae48350947e00fe506fa87 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_rosalind, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, ceph=True, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 20 08:58:22 np0005588920 sweet_rosalind[79999]: 167 167
Jan 20 08:58:22 np0005588920 systemd[1]: libpod-80d8d3efdc1a26321983e5f8078056499ab7401619ae48350947e00fe506fa87.scope: Deactivated successfully.
Jan 20 08:58:22 np0005588920 podman[79983]: 2026-01-20 13:58:22.809659734 +0000 UTC m=+0.160538276 container died 80d8d3efdc1a26321983e5f8078056499ab7401619ae48350947e00fe506fa87 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_rosalind, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 20 08:58:22 np0005588920 systemd[1]: var-lib-containers-storage-overlay-af79c20140ec055f52d8f136f44bf8bf25238a48c1c313c102ac1c1d5ca90218-merged.mount: Deactivated successfully.
Jan 20 08:58:22 np0005588920 ceph-osd[79820]: bdev(0x562f8904ec00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 20 08:58:22 np0005588920 ceph-osd[79820]: bdev(0x562f8904ec00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 20 08:58:22 np0005588920 ceph-osd[79820]: bdev(0x562f8904ec00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 20 08:58:22 np0005588920 ceph-osd[79820]: bdev(0x562f8904ec00 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 20 08:58:22 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 20 08:58:22 np0005588920 ceph-osd[79820]: bdev(0x562f8904ec00 /var/lib/ceph/osd/ceph-2/block) close
Jan 20 08:58:22 np0005588920 podman[79983]: 2026-01-20 13:58:22.854804479 +0000 UTC m=+0.205683021 container remove 80d8d3efdc1a26321983e5f8078056499ab7401619ae48350947e00fe506fa87 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_rosalind, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Jan 20 08:58:22 np0005588920 systemd[1]: libpod-conmon-80d8d3efdc1a26321983e5f8078056499ab7401619ae48350947e00fe506fa87.scope: Deactivated successfully.
Jan 20 08:58:23 np0005588920 ceph-mgr[77507]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 20 08:58:23 np0005588920 ceph-mgr[77507]: mgr[py] Loading python module 'test_orchestrator'
Jan 20 08:58:23 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-2-gunjko[77503]: 2026-01-20T13:58:23.001+0000 7f425d6d4140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 20 08:58:23 np0005588920 podman[80026]: 2026-01-20 13:58:23.048057877 +0000 UTC m=+0.064910543 container create 646e47759d138ea8afc4131b1e04d8ea1ddab052f995d121983dd0155d0f5622 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=inspiring_heyrovsky, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 20 08:58:23 np0005588920 systemd[1]: Started libpod-conmon-646e47759d138ea8afc4131b1e04d8ea1ddab052f995d121983dd0155d0f5622.scope.
Jan 20 08:58:23 np0005588920 podman[80026]: 2026-01-20 13:58:23.016472214 +0000 UTC m=+0.033324940 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:58:23 np0005588920 systemd[1]: Started libcrun container.
Jan 20 08:58:23 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e8cc0b7ddfc23017384e7b29c69ed7df39327e3e99549f995fe2480c757d21c2/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:23 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e8cc0b7ddfc23017384e7b29c69ed7df39327e3e99549f995fe2480c757d21c2/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:23 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e8cc0b7ddfc23017384e7b29c69ed7df39327e3e99549f995fe2480c757d21c2/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:23 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e8cc0b7ddfc23017384e7b29c69ed7df39327e3e99549f995fe2480c757d21c2/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:23 np0005588920 podman[80026]: 2026-01-20 13:58:23.13695558 +0000 UTC m=+0.153808226 container init 646e47759d138ea8afc4131b1e04d8ea1ddab052f995d121983dd0155d0f5622 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=inspiring_heyrovsky, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3)
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: osd.2:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: bdev(0x562f8904ec00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: bdev(0x562f8904ec00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: bdev(0x562f8904ec00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: bdev(0x562f8904ec00 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: bdev(0x562f8904f400 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: bdev(0x562f8904f400 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: bdev(0x562f8904f400 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: bdev(0x562f8904f400 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 7.0 GiB
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: bluefs mount
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: bluefs mount shared_bdev_used = 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Jan 20 08:58:23 np0005588920 podman[80026]: 2026-01-20 13:58:23.148389474 +0000 UTC m=+0.165242100 container start 646e47759d138ea8afc4131b1e04d8ea1ddab052f995d121983dd0155d0f5622 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=inspiring_heyrovsky, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: RocksDB version: 7.9.2
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Git sha 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Compile date 2025-05-06 23:30:25
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: DB SUMMARY
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: DB Session ID:  Z3M9POKJKVPM9U9YI9F8
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: CURRENT file:  CURRENT
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: IDENTITY file:  IDENTITY
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                         Options.error_if_exists: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                       Options.create_if_missing: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                         Options.paranoid_checks: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.flush_verify_memtable_count: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                                     Options.env: 0x562f89051d50
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                                      Options.fs: LegacyFileSystem
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                                Options.info_log: 0x562f8823ee80
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.max_file_opening_threads: 16
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                              Options.statistics: (nil)
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                               Options.use_fsync: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                       Options.max_log_file_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.log_file_time_to_roll: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                       Options.keep_log_file_num: 1000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                    Options.recycle_log_file_num: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                         Options.allow_fallocate: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                        Options.allow_mmap_reads: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                       Options.allow_mmap_writes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                        Options.use_direct_reads: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.create_missing_column_families: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                              Options.db_log_dir: 
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                                 Options.wal_dir: db.wal
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.table_cache_numshardbits: 6
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                         Options.WAL_ttl_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                       Options.WAL_size_limit_MB: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.manifest_preallocation_size: 4194304
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                     Options.is_fd_close_on_exec: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.advise_random_on_open: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                    Options.db_write_buffer_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                    Options.write_buffer_manager: 0x562f891646e0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.access_hint_on_compaction_start: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                      Options.use_adaptive_mutex: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                            Options.rate_limiter: (nil)
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                       Options.wal_recovery_mode: 2
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.enable_thread_tracking: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.enable_pipelined_write: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.unordered_write: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.write_thread_max_yield_usec: 100
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                               Options.row_cache: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                              Options.wal_filter: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.avoid_flush_during_recovery: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.allow_ingest_behind: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.two_write_queues: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.manual_wal_flush: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.wal_compression: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.atomic_flush: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                 Options.persist_stats_to_disk: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                 Options.write_dbid_to_manifest: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                 Options.log_readahead_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                 Options.best_efforts_recovery: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.allow_data_in_errors: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.db_host_id: __hostname__
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.enforce_single_del_contracts: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.max_background_jobs: 4
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.max_background_compactions: -1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.max_subcompactions: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:           Options.writable_file_max_buffer_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.delayed_write_rate : 16777216
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.max_total_wal_size: 1073741824
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.stats_dump_period_sec: 600
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                 Options.stats_persist_period_sec: 600
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                          Options.max_open_files: -1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                          Options.bytes_per_sync: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                      Options.wal_bytes_per_sync: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.strict_bytes_per_sync: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:       Options.compaction_readahead_size: 2097152
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.max_background_flushes: -1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Compression algorithms supported:
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: #011kZSTD supported: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: #011kXpressCompression supported: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: #011kBZip2Compression supported: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: #011kZSTDNotFinalCompression supported: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: #011kLZ4Compression supported: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: #011kZlibCompression supported: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: #011kLZ4HCCompression supported: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: #011kSnappyCompression supported: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Fast CRC32 supported: Supported on x86
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: DMutex implementation: pthread_mutex_t
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.compaction_filter: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.compaction_filter_factory: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.sst_partitioner_factory: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x562f8823f500)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x562f882262d0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.write_buffer_size: 16777216
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.max_write_buffer_number: 64
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.compression: LZ4
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:       Options.prefix_extractor: nullptr
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.num_levels: 7
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.compression_opts.level: 32767
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.compression_opts.strategy: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.compression_opts.enabled: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                        Options.arena_block_size: 1048576
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.disable_auto_compactions: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.inplace_update_support: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                           Options.bloom_locality: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                    Options.max_successive_merges: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.paranoid_file_checks: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.force_consistency_checks: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.report_bg_io_stats: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                               Options.ttl: 2592000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                       Options.enable_blob_files: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                           Options.min_blob_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                          Options.blob_file_size: 268435456
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.blob_file_starting_level: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:           Options.merge_operator: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.compaction_filter: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.compaction_filter_factory: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.sst_partitioner_factory: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x562f8823f500)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x562f882262d0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.write_buffer_size: 16777216
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.max_write_buffer_number: 64
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.compression: LZ4
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:       Options.prefix_extractor: nullptr
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.num_levels: 7
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.compression_opts.level: 32767
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.compression_opts.strategy: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.compression_opts.enabled: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                        Options.arena_block_size: 1048576
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.disable_auto_compactions: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.inplace_update_support: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                           Options.bloom_locality: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                    Options.max_successive_merges: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.paranoid_file_checks: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.force_consistency_checks: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.report_bg_io_stats: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                               Options.ttl: 2592000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                       Options.enable_blob_files: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                           Options.min_blob_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                          Options.blob_file_size: 268435456
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.blob_file_starting_level: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:           Options.merge_operator: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.compaction_filter: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.compaction_filter_factory: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.sst_partitioner_factory: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x562f8823f500)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x562f882262d0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.write_buffer_size: 16777216
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.max_write_buffer_number: 64
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.compression: LZ4
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:       Options.prefix_extractor: nullptr
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.num_levels: 7
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.compression_opts.level: 32767
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.compression_opts.strategy: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.compression_opts.enabled: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                        Options.arena_block_size: 1048576
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.disable_auto_compactions: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.inplace_update_support: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                           Options.bloom_locality: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                    Options.max_successive_merges: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.paranoid_file_checks: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.force_consistency_checks: 1
Jan 20 08:58:23 np0005588920 podman[80026]: 2026-01-20 13:58:23.160901878 +0000 UTC m=+0.177754504 container attach 646e47759d138ea8afc4131b1e04d8ea1ddab052f995d121983dd0155d0f5622 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=inspiring_heyrovsky, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.report_bg_io_stats: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                               Options.ttl: 2592000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                       Options.enable_blob_files: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                           Options.min_blob_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                          Options.blob_file_size: 268435456
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.blob_file_starting_level: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:           Options.merge_operator: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.compaction_filter: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.compaction_filter_factory: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.sst_partitioner_factory: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x562f8823f500)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x562f882262d0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.write_buffer_size: 16777216
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.max_write_buffer_number: 64
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.compression: LZ4
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:       Options.prefix_extractor: nullptr
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.num_levels: 7
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.compression_opts.level: 32767
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.compression_opts.strategy: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.compression_opts.enabled: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                        Options.arena_block_size: 1048576
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.disable_auto_compactions: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.inplace_update_support: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                           Options.bloom_locality: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                    Options.max_successive_merges: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.paranoid_file_checks: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.force_consistency_checks: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.report_bg_io_stats: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                               Options.ttl: 2592000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                       Options.enable_blob_files: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                           Options.min_blob_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                          Options.blob_file_size: 268435456
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.blob_file_starting_level: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:           Options.merge_operator: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.compaction_filter: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.compaction_filter_factory: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.sst_partitioner_factory: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x562f8823f500)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x562f882262d0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.write_buffer_size: 16777216
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.max_write_buffer_number: 64
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.compression: LZ4
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:       Options.prefix_extractor: nullptr
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.num_levels: 7
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.compression_opts.level: 32767
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.compression_opts.strategy: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.compression_opts.enabled: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                        Options.arena_block_size: 1048576
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.disable_auto_compactions: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.inplace_update_support: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                           Options.bloom_locality: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                    Options.max_successive_merges: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.paranoid_file_checks: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.force_consistency_checks: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.report_bg_io_stats: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                               Options.ttl: 2592000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                       Options.enable_blob_files: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                           Options.min_blob_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                          Options.blob_file_size: 268435456
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.blob_file_starting_level: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:           Options.merge_operator: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.compaction_filter: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.compaction_filter_factory: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.sst_partitioner_factory: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x562f8823f500)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x562f882262d0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.write_buffer_size: 16777216
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.max_write_buffer_number: 64
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.compression: LZ4
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:       Options.prefix_extractor: nullptr
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.num_levels: 7
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.compression_opts.level: 32767
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.compression_opts.strategy: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.compression_opts.enabled: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                        Options.arena_block_size: 1048576
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.disable_auto_compactions: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.inplace_update_support: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                           Options.bloom_locality: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                    Options.max_successive_merges: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.paranoid_file_checks: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.force_consistency_checks: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.report_bg_io_stats: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                               Options.ttl: 2592000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                       Options.enable_blob_files: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                           Options.min_blob_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                          Options.blob_file_size: 268435456
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.blob_file_starting_level: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:           Options.merge_operator: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.compaction_filter: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.compaction_filter_factory: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.sst_partitioner_factory: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x562f8823f500)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x562f882262d0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.write_buffer_size: 16777216
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.max_write_buffer_number: 64
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.compression: LZ4
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:       Options.prefix_extractor: nullptr
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.num_levels: 7
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.compression_opts.level: 32767
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.compression_opts.strategy: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.compression_opts.enabled: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                        Options.arena_block_size: 1048576
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.disable_auto_compactions: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.inplace_update_support: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                           Options.bloom_locality: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                    Options.max_successive_merges: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.paranoid_file_checks: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.force_consistency_checks: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.report_bg_io_stats: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                               Options.ttl: 2592000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                       Options.enable_blob_files: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                           Options.min_blob_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                          Options.blob_file_size: 268435456
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.blob_file_starting_level: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:           Options.merge_operator: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.compaction_filter: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.compaction_filter_factory: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.sst_partitioner_factory: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x562f8823f4c0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x562f88226850#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.write_buffer_size: 16777216
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.max_write_buffer_number: 64
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.compression: LZ4
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:       Options.prefix_extractor: nullptr
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.num_levels: 7
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.compression_opts.level: 32767
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.compression_opts.strategy: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.compression_opts.enabled: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                        Options.arena_block_size: 1048576
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.disable_auto_compactions: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.inplace_update_support: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                           Options.bloom_locality: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                    Options.max_successive_merges: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.paranoid_file_checks: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.force_consistency_checks: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.report_bg_io_stats: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                               Options.ttl: 2592000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                       Options.enable_blob_files: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                           Options.min_blob_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                          Options.blob_file_size: 268435456
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.blob_file_starting_level: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:           Options.merge_operator: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.compaction_filter: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.compaction_filter_factory: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.sst_partitioner_factory: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x562f8823f4c0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x562f88226850#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.write_buffer_size: 16777216
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.max_write_buffer_number: 64
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.compression: LZ4
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:       Options.prefix_extractor: nullptr
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.num_levels: 7
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.compression_opts.level: 32767
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.compression_opts.strategy: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.compression_opts.enabled: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                        Options.arena_block_size: 1048576
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.disable_auto_compactions: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.inplace_update_support: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                           Options.bloom_locality: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                    Options.max_successive_merges: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.paranoid_file_checks: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.force_consistency_checks: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.report_bg_io_stats: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                               Options.ttl: 2592000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                       Options.enable_blob_files: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                           Options.min_blob_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                          Options.blob_file_size: 268435456
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.blob_file_starting_level: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:           Options.merge_operator: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.compaction_filter: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.compaction_filter_factory: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.sst_partitioner_factory: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x562f8823f4c0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x562f88226850
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.write_buffer_size: 16777216
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.max_write_buffer_number: 64
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.compression: LZ4
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:       Options.prefix_extractor: nullptr
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.num_levels: 7
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.compression_opts.level: 32767
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.compression_opts.strategy: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.compression_opts.enabled: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                        Options.arena_block_size: 1048576
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.disable_auto_compactions: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.inplace_update_support: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                           Options.bloom_locality: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                    Options.max_successive_merges: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.paranoid_file_checks: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.force_consistency_checks: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.report_bg_io_stats: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                               Options.ttl: 2592000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                       Options.enable_blob_files: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                           Options.min_blob_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                          Options.blob_file_size: 268435456
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.blob_file_starting_level: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 379ae5f5-fd8d-428c-a265-8994967355c5
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917503183111, "job": 1, "event": "recovery_started", "wal_files": [31]}
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917503183387, "job": 1, "event": "recovery_finished"}
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta old nid_max 1025
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta old blobid_max 10240
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta min_alloc_size 0x1000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: freelist init
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: freelist _read_cfg
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _init_alloc loaded 7.0 GiB in 2 extents, allocator type hybrid, capacity 0x1bfc00000, block size 0x1000, free 0x1bfbfd000, fragmentation 5.5e-07
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: bluefs umount
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: bdev(0x562f8904f400 /var/lib/ceph/osd/ceph-2/block) close
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: bdev(0x562f8904f400 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: bdev(0x562f8904f400 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: bdev(0x562f8904f400 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: bdev(0x562f8904f400 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 7.0 GiB
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: bluefs mount
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: bluefs mount shared_bdev_used = 4718592
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: RocksDB version: 7.9.2
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Git sha 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Compile date 2025-05-06 23:30:25
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: DB SUMMARY
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: DB Session ID:  Z3M9POKJKVPM9U9YI9F9
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: CURRENT file:  CURRENT
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: IDENTITY file:  IDENTITY
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                         Options.error_if_exists: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                       Options.create_if_missing: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                         Options.paranoid_checks: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.flush_verify_memtable_count: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                                     Options.env: 0x562f8838f960
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                                      Options.fs: LegacyFileSystem
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                                Options.info_log: 0x562f88242500
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.max_file_opening_threads: 16
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                              Options.statistics: (nil)
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                               Options.use_fsync: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                       Options.max_log_file_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.log_file_time_to_roll: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                       Options.keep_log_file_num: 1000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                    Options.recycle_log_file_num: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                         Options.allow_fallocate: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                        Options.allow_mmap_reads: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                       Options.allow_mmap_writes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                        Options.use_direct_reads: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.create_missing_column_families: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                              Options.db_log_dir: 
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                                 Options.wal_dir: db.wal
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.table_cache_numshardbits: 6
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                         Options.WAL_ttl_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                       Options.WAL_size_limit_MB: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.manifest_preallocation_size: 4194304
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                     Options.is_fd_close_on_exec: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.advise_random_on_open: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                    Options.db_write_buffer_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                    Options.write_buffer_manager: 0x562f891648c0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.access_hint_on_compaction_start: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                      Options.use_adaptive_mutex: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                            Options.rate_limiter: (nil)
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                       Options.wal_recovery_mode: 2
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.enable_thread_tracking: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.enable_pipelined_write: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.unordered_write: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.write_thread_max_yield_usec: 100
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                               Options.row_cache: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                              Options.wal_filter: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.avoid_flush_during_recovery: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.allow_ingest_behind: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.two_write_queues: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.manual_wal_flush: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.wal_compression: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.atomic_flush: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                 Options.persist_stats_to_disk: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                 Options.write_dbid_to_manifest: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                 Options.log_readahead_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                 Options.best_efforts_recovery: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.allow_data_in_errors: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.db_host_id: __hostname__
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.enforce_single_del_contracts: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.max_background_jobs: 4
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.max_background_compactions: -1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.max_subcompactions: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:           Options.writable_file_max_buffer_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.delayed_write_rate : 16777216
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.max_total_wal_size: 1073741824
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.stats_dump_period_sec: 600
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                 Options.stats_persist_period_sec: 600
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                          Options.max_open_files: -1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                          Options.bytes_per_sync: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                      Options.wal_bytes_per_sync: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.strict_bytes_per_sync: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:       Options.compaction_readahead_size: 2097152
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.max_background_flushes: -1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Compression algorithms supported:
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:     kZSTD supported: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:     kXpressCompression supported: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:     kBZip2Compression supported: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:     kZSTDNotFinalCompression supported: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:     kLZ4Compression supported: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:     kZlibCompression supported: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:     kLZ4HCCompression supported: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:     kSnappyCompression supported: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Fast CRC32 supported: Supported on x86
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: DMutex implementation: pthread_mutex_t
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.compaction_filter: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.compaction_filter_factory: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.sst_partitioner_factory: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x562f8823e020)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x562f882274b0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.write_buffer_size: 16777216
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.max_write_buffer_number: 64
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.compression: LZ4
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:       Options.prefix_extractor: nullptr
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.num_levels: 7
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.compression_opts.level: 32767
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.compression_opts.strategy: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.compression_opts.enabled: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                        Options.arena_block_size: 1048576
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.disable_auto_compactions: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.inplace_update_support: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                           Options.bloom_locality: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                    Options.max_successive_merges: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.paranoid_file_checks: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.force_consistency_checks: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.report_bg_io_stats: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                               Options.ttl: 2592000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                       Options.enable_blob_files: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                           Options.min_blob_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                          Options.blob_file_size: 268435456
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.blob_file_starting_level: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:           Options.merge_operator: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.compaction_filter: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.compaction_filter_factory: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.sst_partitioner_factory: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x562f8823e020)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x562f882274b0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.write_buffer_size: 16777216
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.max_write_buffer_number: 64
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.compression: LZ4
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:       Options.prefix_extractor: nullptr
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.num_levels: 7
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.compression_opts.level: 32767
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.compression_opts.strategy: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.compression_opts.enabled: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                        Options.arena_block_size: 1048576
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.disable_auto_compactions: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.inplace_update_support: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                           Options.bloom_locality: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                    Options.max_successive_merges: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.paranoid_file_checks: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.force_consistency_checks: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.report_bg_io_stats: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                               Options.ttl: 2592000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                       Options.enable_blob_files: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                           Options.min_blob_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                          Options.blob_file_size: 268435456
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.blob_file_starting_level: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:           Options.merge_operator: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.compaction_filter: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.compaction_filter_factory: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.sst_partitioner_factory: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x562f8823e020)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x562f882274b0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.write_buffer_size: 16777216
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.max_write_buffer_number: 64
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.compression: LZ4
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:       Options.prefix_extractor: nullptr
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.num_levels: 7
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.compression_opts.level: 32767
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.compression_opts.strategy: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.compression_opts.enabled: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                        Options.arena_block_size: 1048576
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.disable_auto_compactions: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.inplace_update_support: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                           Options.bloom_locality: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                    Options.max_successive_merges: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.paranoid_file_checks: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.force_consistency_checks: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.report_bg_io_stats: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                               Options.ttl: 2592000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                       Options.enable_blob_files: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                           Options.min_blob_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                          Options.blob_file_size: 268435456
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.blob_file_starting_level: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:           Options.merge_operator: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.compaction_filter: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.compaction_filter_factory: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.sst_partitioner_factory: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x562f8823e020)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x562f882274b0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.write_buffer_size: 16777216
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.max_write_buffer_number: 64
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.compression: LZ4
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:       Options.prefix_extractor: nullptr
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.num_levels: 7
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.compression_opts.level: 32767
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.compression_opts.strategy: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.compression_opts.enabled: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                        Options.arena_block_size: 1048576
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.disable_auto_compactions: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.inplace_update_support: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                           Options.bloom_locality: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                    Options.max_successive_merges: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.paranoid_file_checks: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.force_consistency_checks: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.report_bg_io_stats: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                               Options.ttl: 2592000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                       Options.enable_blob_files: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                           Options.min_blob_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                          Options.blob_file_size: 268435456
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.blob_file_starting_level: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:           Options.merge_operator: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.compaction_filter: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.compaction_filter_factory: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.sst_partitioner_factory: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x562f8823e020)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x562f882274b0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.write_buffer_size: 16777216
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.max_write_buffer_number: 64
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.compression: LZ4
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:       Options.prefix_extractor: nullptr
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.num_levels: 7
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.compression_opts.level: 32767
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.compression_opts.strategy: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.compression_opts.enabled: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                        Options.arena_block_size: 1048576
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.disable_auto_compactions: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.inplace_update_support: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                           Options.bloom_locality: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                    Options.max_successive_merges: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.paranoid_file_checks: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.force_consistency_checks: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.report_bg_io_stats: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                               Options.ttl: 2592000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                       Options.enable_blob_files: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                           Options.min_blob_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                          Options.blob_file_size: 268435456
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.blob_file_starting_level: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:           Options.merge_operator: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.compaction_filter: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.compaction_filter_factory: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.sst_partitioner_factory: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x562f8823e020)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x562f882274b0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.write_buffer_size: 16777216
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.max_write_buffer_number: 64
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.compression: LZ4
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:       Options.prefix_extractor: nullptr
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.num_levels: 7
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.compression_opts.level: 32767
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.compression_opts.strategy: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.compression_opts.enabled: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                        Options.arena_block_size: 1048576
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.disable_auto_compactions: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.inplace_update_support: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                           Options.bloom_locality: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                    Options.max_successive_merges: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.paranoid_file_checks: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.force_consistency_checks: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.report_bg_io_stats: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                               Options.ttl: 2592000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                       Options.enable_blob_files: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                           Options.min_blob_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                          Options.blob_file_size: 268435456
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.blob_file_starting_level: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:           Options.merge_operator: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.compaction_filter: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.compaction_filter_factory: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.sst_partitioner_factory: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x562f8823e020)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x562f882274b0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.write_buffer_size: 16777216
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.max_write_buffer_number: 64
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.compression: LZ4
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:       Options.prefix_extractor: nullptr
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.num_levels: 7
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.compression_opts.level: 32767
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.compression_opts.strategy: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.compression_opts.enabled: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                        Options.arena_block_size: 1048576
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.disable_auto_compactions: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.inplace_update_support: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                           Options.bloom_locality: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                    Options.max_successive_merges: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.paranoid_file_checks: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.force_consistency_checks: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.report_bg_io_stats: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                               Options.ttl: 2592000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                       Options.enable_blob_files: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                           Options.min_blob_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                          Options.blob_file_size: 268435456
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.blob_file_starting_level: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:           Options.merge_operator: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.compaction_filter: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.compaction_filter_factory: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.sst_partitioner_factory: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x562f8823e0a0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x562f88227770#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.write_buffer_size: 16777216
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.max_write_buffer_number: 64
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.compression: LZ4
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:       Options.prefix_extractor: nullptr
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.num_levels: 7
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.compression_opts.level: 32767
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.compression_opts.strategy: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.compression_opts.enabled: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                        Options.arena_block_size: 1048576
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.disable_auto_compactions: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.inplace_update_support: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                           Options.bloom_locality: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                    Options.max_successive_merges: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.paranoid_file_checks: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.force_consistency_checks: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.report_bg_io_stats: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                               Options.ttl: 2592000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                       Options.enable_blob_files: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                           Options.min_blob_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                          Options.blob_file_size: 268435456
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.blob_file_starting_level: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:           Options.merge_operator: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.compaction_filter: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.compaction_filter_factory: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.sst_partitioner_factory: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x562f8823e0a0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x562f88227770#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.write_buffer_size: 16777216
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.max_write_buffer_number: 64
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.compression: LZ4
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:       Options.prefix_extractor: nullptr
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.num_levels: 7
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.compression_opts.level: 32767
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.compression_opts.strategy: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.compression_opts.enabled: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                        Options.arena_block_size: 1048576
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.disable_auto_compactions: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.inplace_update_support: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                           Options.bloom_locality: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                    Options.max_successive_merges: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.paranoid_file_checks: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.force_consistency_checks: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.report_bg_io_stats: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                               Options.ttl: 2592000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                       Options.enable_blob_files: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                           Options.min_blob_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                          Options.blob_file_size: 268435456
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.blob_file_starting_level: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:           Options.merge_operator: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.compaction_filter: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.compaction_filter_factory: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.sst_partitioner_factory: None
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x562f8823e0a0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x562f88227770#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.write_buffer_size: 16777216
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.max_write_buffer_number: 64
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.compression: LZ4
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:       Options.prefix_extractor: nullptr
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.num_levels: 7
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.compression_opts.level: 32767
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.compression_opts.strategy: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                  Options.compression_opts.enabled: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                        Options.arena_block_size: 1048576
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.disable_auto_compactions: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.inplace_update_support: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                           Options.bloom_locality: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                    Options.max_successive_merges: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.paranoid_file_checks: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.force_consistency_checks: 1
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.report_bg_io_stats: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                               Options.ttl: 2592000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                       Options.enable_blob_files: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                           Options.min_blob_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                          Options.blob_file_size: 268435456
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb:                Options.blob_file_starting_level: 0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 379ae5f5-fd8d-428c-a265-8994967355c5
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917503450184, "job": 1, "event": "recovery_started", "wal_files": [31]}
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917503454363, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1272, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917503, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "379ae5f5-fd8d-428c-a265-8994967355c5", "db_session_id": "Z3M9POKJKVPM9U9YI9F9", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917503457445, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1594, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 468, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 567, "raw_average_value_size": 283, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917503, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "379ae5f5-fd8d-428c-a265-8994967355c5", "db_session_id": "Z3M9POKJKVPM9U9YI9F9", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917503460464, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1275, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917503, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "379ae5f5-fd8d-428c-a265-8994967355c5", "db_session_id": "Z3M9POKJKVPM9U9YI9F9", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917503461978, "job": 1, "event": "recovery_finished"}
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x562f88262700
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: DB pointer 0x562f89151a00
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _upgrade_super from 4, latest 4
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _upgrade_super done
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 0.1 total, 0.1 interval
Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 0.1 total, 0.1 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x562f882274b0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.3e-05 secs_since: 0
Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-0] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 0.1 total, 0.1 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x562f882274b0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.3e-05 secs_since: 0
Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-1] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 0.1 total, 0.1 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x562f882274b0#2 capacity: 460.80 MB usag
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/18.2.7/rpm/el9/BUILD/ceph-18.2.7/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/18.2.7/rpm/el9/BUILD/ceph-18.2.7/src/cls/hello/cls_hello.cc:316: loading cls_hello
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: _get_class not permitted to load lua
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: _get_class not permitted to load sdk
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: _get_class not permitted to load test_remote_reads
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: osd.2 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: osd.2 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: osd.2 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: osd.2 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: osd.2 0 load_pgs
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: osd.2 0 load_pgs opened 0 pgs
Jan 20 08:58:23 np0005588920 ceph-osd[79820]: osd.2 0 log_to_monitors true
Jan 20 08:58:23 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-2[79816]: 2026-01-20T13:58:23.499+0000 7f540e313740 -1 osd.2 0 log_to_monitors true
Jan 20 08:58:23 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]} v 0) v1
Jan 20 08:58:23 np0005588920 ceph-mon[77148]: log_channel(audit) log [INF] : from='osd.2 [v2:192.168.122.102:6800/3188109873,v1:192.168.122.102:6801/3188109873]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Jan 20 08:58:23 np0005588920 ceph-mgr[77507]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 20 08:58:23 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-2-gunjko[77503]: 2026-01-20T13:58:23.670+0000 7f425d6d4140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 20 08:58:23 np0005588920 ceph-mgr[77507]: mgr[py] Loading python module 'volumes'
Jan 20 08:58:23 np0005588920 ceph-mon[77148]: from='osd.2 [v2:192.168.122.102:6800/3188109873,v1:192.168.122.102:6801/3188109873]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Jan 20 08:58:23 np0005588920 ceph-mon[77148]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Jan 20 08:58:23 np0005588920 ceph-mon[77148]: from='client.? 192.168.122.100:0/2618177133' entity='client.admin' cmd=[{"prefix": "auth import"}]: dispatch
Jan 20 08:58:23 np0005588920 ceph-mon[77148]: from='client.? 192.168.122.100:0/2618177133' entity='client.admin' cmd='[{"prefix": "auth import"}]': finished
Jan 20 08:58:23 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e39 e39: 3 total, 2 up, 3 in
Jan 20 08:58:23 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush create-or-move", "id": 2, "weight":0.0068, "args": ["host=compute-2", "root=default"]} v 0) v1
Jan 20 08:58:23 np0005588920 ceph-mon[77148]: log_channel(audit) log [INF] : from='osd.2 [v2:192.168.122.102:6800/3188109873,v1:192.168.122.102:6801/3188109873]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0068, "args": ["host=compute-2", "root=default"]}]: dispatch
Jan 20 08:58:23 np0005588920 inspiring_heyrovsky[80042]: {
Jan 20 08:58:23 np0005588920 inspiring_heyrovsky[80042]:    "36aff1f5-bc44-4633-b417-95c5b1ee6391": {
Jan 20 08:58:23 np0005588920 inspiring_heyrovsky[80042]:        "ceph_fsid": "e399cf45-e6b6-5393-99f1-75c601d3f188",
Jan 20 08:58:23 np0005588920 inspiring_heyrovsky[80042]:        "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Jan 20 08:58:23 np0005588920 inspiring_heyrovsky[80042]:        "osd_id": 2,
Jan 20 08:58:23 np0005588920 inspiring_heyrovsky[80042]:        "osd_uuid": "36aff1f5-bc44-4633-b417-95c5b1ee6391",
Jan 20 08:58:23 np0005588920 inspiring_heyrovsky[80042]:        "type": "bluestore"
Jan 20 08:58:23 np0005588920 inspiring_heyrovsky[80042]:    }
Jan 20 08:58:23 np0005588920 inspiring_heyrovsky[80042]: }
Jan 20 08:58:23 np0005588920 systemd[1]: libpod-646e47759d138ea8afc4131b1e04d8ea1ddab052f995d121983dd0155d0f5622.scope: Deactivated successfully.
Jan 20 08:58:23 np0005588920 podman[80026]: 2026-01-20 13:58:23.996039868 +0000 UTC m=+1.012892484 container died 646e47759d138ea8afc4131b1e04d8ea1ddab052f995d121983dd0155d0f5622 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=inspiring_heyrovsky, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Jan 20 08:58:24 np0005588920 systemd[1]: var-lib-containers-storage-overlay-e8cc0b7ddfc23017384e7b29c69ed7df39327e3e99549f995fe2480c757d21c2-merged.mount: Deactivated successfully.
Jan 20 08:58:24 np0005588920 podman[80026]: 2026-01-20 13:58:24.05677467 +0000 UTC m=+1.073627336 container remove 646e47759d138ea8afc4131b1e04d8ea1ddab052f995d121983dd0155d0f5622 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=inspiring_heyrovsky, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 20 08:58:24 np0005588920 systemd[1]: libpod-conmon-646e47759d138ea8afc4131b1e04d8ea1ddab052f995d121983dd0155d0f5622.scope: Deactivated successfully.
Jan 20 08:58:24 np0005588920 ceph-mgr[77507]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 20 08:58:24 np0005588920 ceph-mgr[77507]: mgr[py] Loading python module 'zabbix'
Jan 20 08:58:24 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-2-gunjko[77503]: 2026-01-20T13:58:24.366+0000 7f425d6d4140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 20 08:58:24 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Jan 20 08:58:24 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Jan 20 08:58:24 np0005588920 ceph-mgr[77507]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 20 08:58:24 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-2-gunjko[77503]: 2026-01-20T13:58:24.607+0000 7f425d6d4140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 20 08:58:24 np0005588920 ceph-mgr[77507]: ms_deliver_dispatch: unhandled message 0x55d00b78b080 mon_map magic: 0 v1 from mon.1 v2:192.168.122.102:3300/0
Jan 20 08:58:24 np0005588920 ceph-mgr[77507]: client.0 ms_handle_reset on v2:192.168.122.100:6800/2542147622
Jan 20 08:58:24 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e40 e40: 3 total, 2 up, 3 in
Jan 20 08:58:24 np0005588920 ceph-osd[79820]: osd.2 0 done with init, starting boot process
Jan 20 08:58:24 np0005588920 ceph-osd[79820]: osd.2 0 start_boot
Jan 20 08:58:24 np0005588920 ceph-osd[79820]: osd.2 0 maybe_override_options_for_qos osd_max_backfills set to 1
Jan 20 08:58:24 np0005588920 ceph-osd[79820]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Jan 20 08:58:24 np0005588920 ceph-osd[79820]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Jan 20 08:58:24 np0005588920 ceph-osd[79820]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Jan 20 08:58:24 np0005588920 ceph-osd[79820]: osd.2 0  bench count 12288000 bsize 4 KiB
Jan 20 08:58:24 np0005588920 ceph-mon[77148]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished
Jan 20 08:58:24 np0005588920 ceph-mon[77148]: from='osd.2 [v2:192.168.122.102:6800/3188109873,v1:192.168.122.102:6801/3188109873]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0068, "args": ["host=compute-2", "root=default"]}]: dispatch
Jan 20 08:58:24 np0005588920 ceph-mon[77148]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0068, "args": ["host=compute-2", "root=default"]}]: dispatch
Jan 20 08:58:24 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:24 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:25 np0005588920 podman[80706]: 2026-01-20 13:58:25.309008993 +0000 UTC m=+0.063932768 container exec 6d4eaf8659f13902200468a4ae46f22a0824452b9353ff1491898f5d3b0130c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mon-compute-2, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, ceph=True, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 20 08:58:25 np0005588920 podman[80706]: 2026-01-20 13:58:25.628240023 +0000 UTC m=+0.383163748 container exec_died 6d4eaf8659f13902200468a4ae46f22a0824452b9353ff1491898f5d3b0130c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mon-compute-2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 20 08:58:25 np0005588920 ceph-mon[77148]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0068, "args": ["host=compute-2", "root=default"]}]': finished
Jan 20 08:58:26 np0005588920 podman[80932]: 2026-01-20 13:58:26.684350731 +0000 UTC m=+0.060965318 container create 4da77e3067383126ee01c42529a811353301e99e281636448f03bfd2babacfbf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=wonderful_murdock, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 20 08:58:26 np0005588920 podman[80932]: 2026-01-20 13:58:26.64984268 +0000 UTC m=+0.026457297 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:58:26 np0005588920 systemd[1]: Started libpod-conmon-4da77e3067383126ee01c42529a811353301e99e281636448f03bfd2babacfbf.scope.
Jan 20 08:58:26 np0005588920 systemd[1]: Started libcrun container.
Jan 20 08:58:26 np0005588920 podman[80932]: 2026-01-20 13:58:26.831302413 +0000 UTC m=+0.207916980 container init 4da77e3067383126ee01c42529a811353301e99e281636448f03bfd2babacfbf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=wonderful_murdock, CEPH_REF=reef, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 20 08:58:26 np0005588920 podman[80932]: 2026-01-20 13:58:26.841905696 +0000 UTC m=+0.218520273 container start 4da77e3067383126ee01c42529a811353301e99e281636448f03bfd2babacfbf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=wonderful_murdock, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Jan 20 08:58:26 np0005588920 wonderful_murdock[80948]: 167 167
Jan 20 08:58:26 np0005588920 systemd[1]: libpod-4da77e3067383126ee01c42529a811353301e99e281636448f03bfd2babacfbf.scope: Deactivated successfully.
Jan 20 08:58:26 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:26 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:26 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:26 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:26 np0005588920 podman[80932]: 2026-01-20 13:58:26.860523062 +0000 UTC m=+0.237137689 container attach 4da77e3067383126ee01c42529a811353301e99e281636448f03bfd2babacfbf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=wonderful_murdock, ceph=True, org.label-schema.build-date=20250507, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 20 08:58:26 np0005588920 podman[80932]: 2026-01-20 13:58:26.861246352 +0000 UTC m=+0.237860929 container died 4da77e3067383126ee01c42529a811353301e99e281636448f03bfd2babacfbf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=wonderful_murdock, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 20 08:58:26 np0005588920 systemd[1]: var-lib-containers-storage-overlay-50af96b3c03997b60da70512215df99e366baa243bb7e29c2b386725abb3e8af-merged.mount: Deactivated successfully.
Jan 20 08:58:26 np0005588920 podman[80932]: 2026-01-20 13:58:26.962867034 +0000 UTC m=+0.339481641 container remove 4da77e3067383126ee01c42529a811353301e99e281636448f03bfd2babacfbf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=wonderful_murdock, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default)
Jan 20 08:58:26 np0005588920 systemd[1]: libpod-conmon-4da77e3067383126ee01c42529a811353301e99e281636448f03bfd2babacfbf.scope: Deactivated successfully.
Jan 20 08:58:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e40 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 08:58:27 np0005588920 podman[80972]: 2026-01-20 13:58:27.179227399 +0000 UTC m=+0.064177604 container create c7095a7d827280d8abfdfcdf77abc021778a99cb36f72a478e03de7d04a64a41 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_ptolemy, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 20 08:58:27 np0005588920 podman[80972]: 2026-01-20 13:58:27.157225892 +0000 UTC m=+0.042176087 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:58:27 np0005588920 systemd[1]: Started libpod-conmon-c7095a7d827280d8abfdfcdf77abc021778a99cb36f72a478e03de7d04a64a41.scope.
Jan 20 08:58:27 np0005588920 systemd[1]: Started libcrun container.
Jan 20 08:58:27 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5961b39ef1924efb0ed91efb852a2311d1900e62f8cda6455d52993ba11d7421/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:27 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5961b39ef1924efb0ed91efb852a2311d1900e62f8cda6455d52993ba11d7421/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:27 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5961b39ef1924efb0ed91efb852a2311d1900e62f8cda6455d52993ba11d7421/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:27 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5961b39ef1924efb0ed91efb852a2311d1900e62f8cda6455d52993ba11d7421/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:27 np0005588920 podman[80972]: 2026-01-20 13:58:27.31602635 +0000 UTC m=+0.200976595 container init c7095a7d827280d8abfdfcdf77abc021778a99cb36f72a478e03de7d04a64a41 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_ptolemy, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Jan 20 08:58:27 np0005588920 podman[80972]: 2026-01-20 13:58:27.328873443 +0000 UTC m=+0.213823648 container start c7095a7d827280d8abfdfcdf77abc021778a99cb36f72a478e03de7d04a64a41 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_ptolemy, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 20 08:58:27 np0005588920 podman[80972]: 2026-01-20 13:58:27.358032651 +0000 UTC m=+0.242982906 container attach c7095a7d827280d8abfdfcdf77abc021778a99cb36f72a478e03de7d04a64a41 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_ptolemy, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=reef, org.label-schema.license=GPLv2, io.buildah.version=1.39.3)
Jan 20 08:58:28 np0005588920 ceph-mon[77148]: from='client.? 192.168.122.100:0/260003973' entity='client.admin' cmd=[{"prefix": "auth get", "entity": "client.openstack"}]: dispatch
Jan 20 08:58:28 np0005588920 intelligent_ptolemy[80988]: [
Jan 20 08:58:28 np0005588920 intelligent_ptolemy[80988]:    {
Jan 20 08:58:28 np0005588920 intelligent_ptolemy[80988]:        "available": false,
Jan 20 08:58:28 np0005588920 intelligent_ptolemy[80988]:        "ceph_device": false,
Jan 20 08:58:28 np0005588920 intelligent_ptolemy[80988]:        "device_id": "QEMU_DVD-ROM_QM00001",
Jan 20 08:58:28 np0005588920 intelligent_ptolemy[80988]:        "lsm_data": {},
Jan 20 08:58:28 np0005588920 intelligent_ptolemy[80988]:        "lvs": [],
Jan 20 08:58:28 np0005588920 intelligent_ptolemy[80988]:        "path": "/dev/sr0",
Jan 20 08:58:28 np0005588920 intelligent_ptolemy[80988]:        "rejected_reasons": [
Jan 20 08:58:28 np0005588920 intelligent_ptolemy[80988]:            "Has a FileSystem",
Jan 20 08:58:28 np0005588920 intelligent_ptolemy[80988]:            "Insufficient space (<5GB)"
Jan 20 08:58:28 np0005588920 intelligent_ptolemy[80988]:        ],
Jan 20 08:58:28 np0005588920 intelligent_ptolemy[80988]:        "sys_api": {
Jan 20 08:58:28 np0005588920 intelligent_ptolemy[80988]:            "actuators": null,
Jan 20 08:58:28 np0005588920 intelligent_ptolemy[80988]:            "device_nodes": "sr0",
Jan 20 08:58:28 np0005588920 intelligent_ptolemy[80988]:            "devname": "sr0",
Jan 20 08:58:28 np0005588920 intelligent_ptolemy[80988]:            "human_readable_size": "482.00 KB",
Jan 20 08:58:28 np0005588920 intelligent_ptolemy[80988]:            "id_bus": "ata",
Jan 20 08:58:28 np0005588920 intelligent_ptolemy[80988]:            "model": "QEMU DVD-ROM",
Jan 20 08:58:28 np0005588920 intelligent_ptolemy[80988]:            "nr_requests": "2",
Jan 20 08:58:28 np0005588920 intelligent_ptolemy[80988]:            "parent": "/dev/sr0",
Jan 20 08:58:28 np0005588920 intelligent_ptolemy[80988]:            "partitions": {},
Jan 20 08:58:28 np0005588920 intelligent_ptolemy[80988]:            "path": "/dev/sr0",
Jan 20 08:58:28 np0005588920 intelligent_ptolemy[80988]:            "removable": "1",
Jan 20 08:58:28 np0005588920 intelligent_ptolemy[80988]:            "rev": "2.5+",
Jan 20 08:58:28 np0005588920 intelligent_ptolemy[80988]:            "ro": "0",
Jan 20 08:58:28 np0005588920 intelligent_ptolemy[80988]:            "rotational": "1",
Jan 20 08:58:28 np0005588920 intelligent_ptolemy[80988]:            "sas_address": "",
Jan 20 08:58:28 np0005588920 intelligent_ptolemy[80988]:            "sas_device_handle": "",
Jan 20 08:58:28 np0005588920 intelligent_ptolemy[80988]:            "scheduler_mode": "mq-deadline",
Jan 20 08:58:28 np0005588920 intelligent_ptolemy[80988]:            "sectors": 0,
Jan 20 08:58:28 np0005588920 intelligent_ptolemy[80988]:            "sectorsize": "2048",
Jan 20 08:58:28 np0005588920 intelligent_ptolemy[80988]:            "size": 493568.0,
Jan 20 08:58:28 np0005588920 intelligent_ptolemy[80988]:            "support_discard": "2048",
Jan 20 08:58:28 np0005588920 intelligent_ptolemy[80988]:            "type": "disk",
Jan 20 08:58:28 np0005588920 intelligent_ptolemy[80988]:            "vendor": "QEMU"
Jan 20 08:58:28 np0005588920 intelligent_ptolemy[80988]:        }
Jan 20 08:58:28 np0005588920 intelligent_ptolemy[80988]:    }
Jan 20 08:58:28 np0005588920 intelligent_ptolemy[80988]: ]
Jan 20 08:58:28 np0005588920 systemd[1]: libpod-c7095a7d827280d8abfdfcdf77abc021778a99cb36f72a478e03de7d04a64a41.scope: Deactivated successfully.
Jan 20 08:58:28 np0005588920 podman[80972]: 2026-01-20 13:58:28.533648729 +0000 UTC m=+1.418598924 container died c7095a7d827280d8abfdfcdf77abc021778a99cb36f72a478e03de7d04a64a41 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_ptolemy, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 20 08:58:28 np0005588920 systemd[1]: libpod-c7095a7d827280d8abfdfcdf77abc021778a99cb36f72a478e03de7d04a64a41.scope: Consumed 1.202s CPU time.
Jan 20 08:58:28 np0005588920 systemd[1]: var-lib-containers-storage-overlay-5961b39ef1924efb0ed91efb852a2311d1900e62f8cda6455d52993ba11d7421-merged.mount: Deactivated successfully.
Jan 20 08:58:28 np0005588920 podman[80972]: 2026-01-20 13:58:28.602864107 +0000 UTC m=+1.487814302 container remove c7095a7d827280d8abfdfcdf77abc021778a99cb36f72a478e03de7d04a64a41 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_ptolemy, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3)
Jan 20 08:58:28 np0005588920 systemd[1]: libpod-conmon-c7095a7d827280d8abfdfcdf77abc021778a99cb36f72a478e03de7d04a64a41.scope: Deactivated successfully.
Jan 20 08:58:28 np0005588920 ceph-osd[79820]: osd.2 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 24.412 iops: 6249.450 elapsed_sec: 0.480
Jan 20 08:58:28 np0005588920 ceph-osd[79820]: log_channel(cluster) log [WRN] : OSD bench result of 6249.450009 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Jan 20 08:58:28 np0005588920 ceph-osd[79820]: osd.2 0 waiting for initial osdmap
Jan 20 08:58:28 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-2[79816]: 2026-01-20T13:58:28.804+0000 7f540aaaa640 -1 osd.2 0 waiting for initial osdmap
Jan 20 08:58:28 np0005588920 ceph-osd[79820]: osd.2 40 crush map has features 288514051259236352, adjusting msgr requires for clients
Jan 20 08:58:28 np0005588920 ceph-osd[79820]: osd.2 40 crush map has features 288514051259236352 was 288232575208792577, adjusting msgr requires for mons
Jan 20 08:58:28 np0005588920 ceph-osd[79820]: osd.2 40 crush map has features 3314933000852226048, adjusting msgr requires for osds
Jan 20 08:58:28 np0005588920 ceph-osd[79820]: osd.2 40 check_osdmap_features require_osd_release unknown -> reef
Jan 20 08:58:28 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-2[79816]: 2026-01-20T13:58:28.838+0000 7f54058bb640 -1 osd.2 40 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Jan 20 08:58:28 np0005588920 ceph-osd[79820]: osd.2 40 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Jan 20 08:58:28 np0005588920 ceph-osd[79820]: osd.2 40 set_numa_affinity not setting numa affinity
Jan 20 08:58:28 np0005588920 ceph-osd[79820]: osd.2 40 _collect_metadata loop3:  no unique device id for loop3: fallback method has no model nor serial
Jan 20 08:58:29 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:29 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:29 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"}]: dispatch
Jan 20 08:58:29 np0005588920 ceph-mon[77148]: Adjusting osd_memory_target on compute-2 to 127.9M
Jan 20 08:58:29 np0005588920 ceph-mon[77148]: Unable to set osd_memory_target on compute-2 to 134209126: error parsing value: Value '134209126' is below minimum 939524096
Jan 20 08:58:29 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 08:58:29 np0005588920 ceph-mon[77148]: Updating compute-0:/etc/ceph/ceph.conf
Jan 20 08:58:29 np0005588920 ceph-mon[77148]: Updating compute-1:/etc/ceph/ceph.conf
Jan 20 08:58:29 np0005588920 ceph-mon[77148]: Updating compute-2:/etc/ceph/ceph.conf
Jan 20 08:58:29 np0005588920 ceph-mon[77148]: OSD bench result of 6249.450009 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Jan 20 08:58:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e41 e41: 3 total, 3 up, 3 in
Jan 20 08:58:29 np0005588920 ceph-osd[79820]: osd.2 41 state: booting -> active
Jan 20 08:58:29 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 41 pg[4.19( empty local-lis/les=0/0 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=41) [2] r=0 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:29 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 41 pg[6.1b( empty local-lis/les=0/0 n=0 ec=32/22 lis/c=32/32 les/c/f=33/33/0 sis=41) [2] r=0 lpr=41 pi=[32,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:29 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 41 pg[3.1b( empty local-lis/les=0/0 n=0 ec=28/17 lis/c=28/28 les/c/f=29/29/0 sis=41) [2] r=0 lpr=41 pi=[28,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:29 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 41 pg[4.1d( empty local-lis/les=0/0 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=41) [2] r=0 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:29 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 41 pg[3.8( empty local-lis/les=0/0 n=0 ec=28/17 lis/c=28/28 les/c/f=29/29/0 sis=41) [2] r=0 lpr=41 pi=[28,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:29 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 41 pg[6.1( empty local-lis/les=0/0 n=0 ec=32/22 lis/c=32/32 les/c/f=33/33/0 sis=41) [2] r=0 lpr=41 pi=[32,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:29 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 41 pg[4.3( empty local-lis/les=0/0 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=41) [2] r=0 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:29 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 41 pg[4.6( empty local-lis/les=0/0 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=41) [2] r=0 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:29 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 41 pg[4.2( empty local-lis/les=0/0 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=41) [2] r=0 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:29 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 41 pg[2.1b( empty local-lis/les=0/0 n=0 ec=28/15 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:29 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 41 pg[5.0( empty local-lis/les=0/0 n=0 ec=20/20 lis/c=30/30 les/c/f=31/31/0 sis=41) [2] r=0 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:29 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 41 pg[3.0( empty local-lis/les=0/0 n=0 ec=17/17 lis/c=28/28 les/c/f=29/29/0 sis=41) [2] r=0 lpr=41 pi=[28,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:29 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 41 pg[2.a( empty local-lis/les=0/0 n=0 ec=28/15 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:29 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 41 pg[5.d( empty local-lis/les=0/0 n=0 ec=30/20 lis/c=30/30 les/c/f=31/31/0 sis=41) [2] r=0 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:29 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 41 pg[2.d( empty local-lis/les=0/0 n=0 ec=28/15 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:29 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 41 pg[2.c( empty local-lis/les=0/0 n=0 ec=28/15 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:29 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 41 pg[5.b( empty local-lis/les=0/0 n=0 ec=30/20 lis/c=30/30 les/c/f=31/31/0 sis=41) [2] r=0 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:29 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 41 pg[7.a( empty local-lis/les=0/0 n=0 ec=32/24 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:29 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 41 pg[5.8( empty local-lis/les=0/0 n=0 ec=30/20 lis/c=30/30 les/c/f=31/31/0 sis=41) [2] r=0 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:29 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 41 pg[7.14( empty local-lis/les=0/0 n=0 ec=32/24 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:29 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 41 pg[2.10( empty local-lis/les=0/0 n=0 ec=28/15 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:29 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 41 pg[2.13( empty local-lis/les=0/0 n=0 ec=28/15 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:29 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 41 pg[4.14( empty local-lis/les=0/0 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=41) [2] r=0 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:29 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 41 pg[2.15( empty local-lis/les=0/0 n=0 ec=28/15 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:29 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 41 pg[5.12( empty local-lis/les=0/0 n=0 ec=30/20 lis/c=30/30 les/c/f=31/31/0 sis=41) [2] r=0 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:29 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 41 pg[5.13( empty local-lis/les=0/0 n=0 ec=30/20 lis/c=30/30 les/c/f=31/31/0 sis=41) [2] r=0 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:29 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 41 pg[4.1c( empty local-lis/les=0/0 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=41) [2] r=0 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:29 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 41 pg[7.1d( empty local-lis/les=0/0 n=0 ec=32/24 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:30 np0005588920 ceph-mon[77148]: osd.2 [v2:192.168.122.102:6800/3188109873,v1:192.168.122.102:6801/3188109873] boot
Jan 20 08:58:30 np0005588920 ceph-mon[77148]: Updating compute-2:/var/lib/ceph/e399cf45-e6b6-5393-99f1-75c601d3f188/config/ceph.conf
Jan 20 08:58:30 np0005588920 ceph-mon[77148]: Updating compute-0:/var/lib/ceph/e399cf45-e6b6-5393-99f1-75c601d3f188/config/ceph.conf
Jan 20 08:58:30 np0005588920 ceph-mon[77148]: Updating compute-1:/var/lib/ceph/e399cf45-e6b6-5393-99f1-75c601d3f188/config/ceph.conf
Jan 20 08:58:30 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e42 e42: 3 total, 3 up, 3 in
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[6.1e( empty local-lis/les=0/0 n=0 ec=32/22 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=42 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[4.1f( empty local-lis/les=0/0 n=0 ec=30/18 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=42 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[6.1c( empty local-lis/les=0/0 n=0 ec=32/22 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=42 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[2.18( empty local-lis/les=0/0 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=41) [2] r=0 lpr=42 pi=[28,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[6.12( empty local-lis/les=0/0 n=0 ec=32/22 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=42 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[6.17( empty local-lis/les=0/0 n=0 ec=32/22 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=42 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[3.15( empty local-lis/les=0/0 n=0 ec=28/17 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=42 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[7.11( empty local-lis/les=0/0 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=41) [2] r=0 lpr=42 pi=[32,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[4.15( empty local-lis/les=0/0 n=0 ec=30/18 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=42 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[7.16( empty local-lis/les=0/0 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=41) [2] r=0 lpr=42 pi=[32,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[7.1f( empty local-lis/les=0/0 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=41) [2] r=0 lpr=42 pi=[32,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[2.12( empty local-lis/les=0/0 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=41) [2] r=0 lpr=42 pi=[28,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[3.e( empty local-lis/les=0/0 n=0 ec=28/17 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=42 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[3.11( empty local-lis/les=0/0 n=0 ec=28/17 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=42 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[7.1d( empty local-lis/les=41/42 n=0 ec=32/24 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[4.9( empty local-lis/les=0/0 n=0 ec=30/18 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=42 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[4.8( empty local-lis/les=0/0 n=0 ec=30/18 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=42 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[5.4( empty local-lis/les=0/0 n=0 ec=30/20 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=42 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[7.5( empty local-lis/les=0/0 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=41) [2] r=0 lpr=42 pi=[32,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[5.13( empty local-lis/les=41/42 n=0 ec=30/20 lis/c=30/30 les/c/f=31/31/0 sis=41) [2] r=0 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[5.12( empty local-lis/les=41/42 n=0 ec=30/20 lis/c=30/30 les/c/f=31/31/0 sis=41) [2] r=0 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[2.b( empty local-lis/les=0/0 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=41) [2] r=0 lpr=42 pi=[28,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[2.5( empty local-lis/les=0/0 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=41) [2] r=0 lpr=42 pi=[28,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[2.f( empty local-lis/les=0/0 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=41) [2] r=0 lpr=42 pi=[28,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[4.1( empty local-lis/les=0/0 n=0 ec=30/18 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=42 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[3.9( empty local-lis/les=0/0 n=0 ec=28/17 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=42 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[3.1a( empty local-lis/les=0/0 n=0 ec=28/17 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=42 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[3.1d( empty local-lis/les=0/0 n=0 ec=28/17 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=42 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[5.e( empty local-lis/les=0/0 n=0 ec=30/20 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=42 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[7.14( empty local-lis/les=41/42 n=0 ec=32/24 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[4.14( empty local-lis/les=41/42 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=41) [2] r=0 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[2.13( empty local-lis/les=41/42 n=0 ec=28/15 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[2.15( empty local-lis/les=41/42 n=0 ec=28/15 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[5.b( empty local-lis/les=41/42 n=0 ec=30/20 lis/c=30/30 les/c/f=31/31/0 sis=41) [2] r=0 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[7.a( empty local-lis/les=41/42 n=0 ec=32/24 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[5.8( empty local-lis/les=41/42 n=0 ec=30/20 lis/c=30/30 les/c/f=31/31/0 sis=41) [2] r=0 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[4.19( empty local-lis/les=41/42 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=41) [2] r=0 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[2.d( empty local-lis/les=41/42 n=0 ec=28/15 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[2.10( empty local-lis/les=41/42 n=0 ec=28/15 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[2.1c( empty local-lis/les=0/0 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=41) [2] r=0 lpr=42 pi=[28,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[2.c( empty local-lis/les=41/42 n=0 ec=28/15 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[6.1b( empty local-lis/les=41/42 n=0 ec=32/22 lis/c=32/32 les/c/f=33/33/0 sis=41) [2] r=0 lpr=41 pi=[32,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[3.8( empty local-lis/les=41/42 n=0 ec=28/17 lis/c=28/28 les/c/f=29/29/0 sis=41) [2] r=0 lpr=41 pi=[28,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[4.1d( empty local-lis/les=41/42 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=41) [2] r=0 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[3.1b( empty local-lis/les=41/42 n=0 ec=28/17 lis/c=28/28 les/c/f=29/29/0 sis=41) [2] r=0 lpr=41 pi=[28,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[3.0( empty local-lis/les=41/42 n=0 ec=17/17 lis/c=28/28 les/c/f=29/29/0 sis=41) [2] r=0 lpr=41 pi=[28,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[2.1b( empty local-lis/les=41/42 n=0 ec=28/15 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[4.2( empty local-lis/les=41/42 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=41) [2] r=0 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[4.3( empty local-lis/les=41/42 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=41) [2] r=0 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[6.1( empty local-lis/les=41/42 n=0 ec=32/22 lis/c=32/32 les/c/f=33/33/0 sis=41) [2] r=0 lpr=41 pi=[32,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[5.0( empty local-lis/les=41/42 n=0 ec=20/20 lis/c=30/30 les/c/f=31/31/0 sis=41) [2] r=0 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[4.6( empty local-lis/les=41/42 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=41) [2] r=0 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[4.1c( empty local-lis/les=41/42 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=41) [2] r=0 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[5.d( empty local-lis/les=41/42 n=0 ec=30/20 lis/c=30/30 les/c/f=31/31/0 sis=41) [2] r=0 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[5.1a( empty local-lis/les=0/0 n=0 ec=30/20 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=42 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[2.a( empty local-lis/les=41/42 n=0 ec=28/15 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[2.1d( empty local-lis/les=0/0 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=41) [2] r=0 lpr=42 pi=[28,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[4.1f( empty local-lis/les=41/42 n=0 ec=30/18 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=42 pi=[36,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[6.1c( empty local-lis/les=41/42 n=0 ec=32/22 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=42 pi=[36,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[6.1e( empty local-lis/les=41/42 n=0 ec=32/22 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=42 pi=[36,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[6.17( empty local-lis/les=41/42 n=0 ec=32/22 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=42 pi=[36,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[6.12( empty local-lis/les=41/42 n=0 ec=32/22 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=42 pi=[36,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[4.15( empty local-lis/les=41/42 n=0 ec=30/18 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=42 pi=[36,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[7.11( empty local-lis/les=41/42 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=41) [2] r=0 lpr=42 pi=[32,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[7.16( empty local-lis/les=41/42 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=41) [2] r=0 lpr=42 pi=[32,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[3.15( empty local-lis/les=41/42 n=0 ec=28/17 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=42 pi=[36,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[2.12( empty local-lis/les=41/42 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=41) [2] r=0 lpr=42 pi=[28,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[7.1f( empty local-lis/les=41/42 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=41) [2] r=0 lpr=42 pi=[32,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[3.e( empty local-lis/les=41/42 n=0 ec=28/17 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=42 pi=[36,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[3.11( empty local-lis/les=41/42 n=0 ec=28/17 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=42 pi=[36,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[4.9( empty local-lis/les=41/42 n=0 ec=30/18 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=42 pi=[36,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[2.18( empty local-lis/les=41/42 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=41) [2] r=0 lpr=42 pi=[28,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[4.8( empty local-lis/les=41/42 n=0 ec=30/18 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=42 pi=[36,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[5.4( empty local-lis/les=41/42 n=0 ec=30/20 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=42 pi=[36,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[7.5( empty local-lis/les=41/42 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=41) [2] r=0 lpr=42 pi=[32,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[2.5( empty local-lis/les=41/42 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=41) [2] r=0 lpr=42 pi=[28,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[4.1( empty local-lis/les=41/42 n=0 ec=30/18 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=42 pi=[36,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[3.9( empty local-lis/les=41/42 n=0 ec=28/17 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=42 pi=[36,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[3.1a( empty local-lis/les=41/42 n=0 ec=28/17 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=42 pi=[36,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[3.1d( empty local-lis/les=41/42 n=0 ec=28/17 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=42 pi=[36,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[5.e( empty local-lis/les=41/42 n=0 ec=30/20 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=42 pi=[36,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[2.1c( empty local-lis/les=41/42 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=41) [2] r=0 lpr=42 pi=[28,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[5.1a( empty local-lis/les=41/42 n=0 ec=30/20 lis/c=36/36 les/c/f=37/37/0 sis=41) [2] r=0 lpr=42 pi=[36,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[2.b( empty local-lis/les=41/42 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=41) [2] r=0 lpr=42 pi=[28,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[2.1d( empty local-lis/les=41/42 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=41) [2] r=0 lpr=42 pi=[28,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 42 pg[2.f( empty local-lis/les=41/42 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=41) [2] r=0 lpr=42 pi=[28,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:31 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 3.1b scrub starts
Jan 20 08:58:31 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 3.1b scrub ok
Jan 20 08:58:31 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:31 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:31 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:31 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:31 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:31 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:31 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:31 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 08:58:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e42 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 08:58:34 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 4.1d deep-scrub starts
Jan 20 08:58:34 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 4.1d deep-scrub ok
Jan 20 08:58:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e42 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 08:58:37 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:37 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:37 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.ktpnzt", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 20 08:58:37 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.ktpnzt", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 20 08:58:37 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:37 np0005588920 podman[83150]: 2026-01-20 13:58:37.917003862 +0000 UTC m=+0.036039420 container create d4d2eaff8d79c62a6d4e8f3ddc59ba8b17d5bb046a334cdb20a73f3a59047278 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nostalgic_elgamal, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 20 08:58:37 np0005588920 systemd[1]: Started libpod-conmon-d4d2eaff8d79c62a6d4e8f3ddc59ba8b17d5bb046a334cdb20a73f3a59047278.scope.
Jan 20 08:58:37 np0005588920 systemd[72686]: Starting Mark boot as successful...
Jan 20 08:58:37 np0005588920 systemd[72686]: Finished Mark boot as successful.
Jan 20 08:58:37 np0005588920 systemd[1]: Started libcrun container.
Jan 20 08:58:37 np0005588920 podman[83150]: 2026-01-20 13:58:37.969871172 +0000 UTC m=+0.088906720 container init d4d2eaff8d79c62a6d4e8f3ddc59ba8b17d5bb046a334cdb20a73f3a59047278 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nostalgic_elgamal, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 20 08:58:37 np0005588920 podman[83150]: 2026-01-20 13:58:37.977364184 +0000 UTC m=+0.096399762 container start d4d2eaff8d79c62a6d4e8f3ddc59ba8b17d5bb046a334cdb20a73f3a59047278 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nostalgic_elgamal, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, ceph=True, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0)
Jan 20 08:58:37 np0005588920 podman[83150]: 2026-01-20 13:58:37.980850157 +0000 UTC m=+0.099885715 container attach d4d2eaff8d79c62a6d4e8f3ddc59ba8b17d5bb046a334cdb20a73f3a59047278 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nostalgic_elgamal, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, ceph=True, OSD_FLAVOR=default, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 20 08:58:37 np0005588920 nostalgic_elgamal[83166]: 167 167
Jan 20 08:58:37 np0005588920 systemd[1]: libpod-d4d2eaff8d79c62a6d4e8f3ddc59ba8b17d5bb046a334cdb20a73f3a59047278.scope: Deactivated successfully.
Jan 20 08:58:37 np0005588920 podman[83150]: 2026-01-20 13:58:37.985075601 +0000 UTC m=+0.104111159 container died d4d2eaff8d79c62a6d4e8f3ddc59ba8b17d5bb046a334cdb20a73f3a59047278 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nostalgic_elgamal, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS)
Jan 20 08:58:37 np0005588920 podman[83150]: 2026-01-20 13:58:37.902528443 +0000 UTC m=+0.021564051 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:58:38 np0005588920 systemd[1]: var-lib-containers-storage-overlay-380c70918a4aaa912016cf268cf41e2759f0287fd3ea2df5f334901797de50cb-merged.mount: Deactivated successfully.
Jan 20 08:58:38 np0005588920 podman[83150]: 2026-01-20 13:58:38.019427814 +0000 UTC m=+0.138463372 container remove d4d2eaff8d79c62a6d4e8f3ddc59ba8b17d5bb046a334cdb20a73f3a59047278 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nostalgic_elgamal, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0)
Jan 20 08:58:38 np0005588920 systemd[1]: libpod-conmon-d4d2eaff8d79c62a6d4e8f3ddc59ba8b17d5bb046a334cdb20a73f3a59047278.scope: Deactivated successfully.
Jan 20 08:58:38 np0005588920 systemd[1]: Reloading.
Jan 20 08:58:38 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:58:38 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:58:38 np0005588920 systemd[1]: Reloading.
Jan 20 08:58:38 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:58:38 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:58:38 np0005588920 systemd[1]: Starting Ceph rgw.rgw.compute-2.ktpnzt for e399cf45-e6b6-5393-99f1-75c601d3f188...
Jan 20 08:58:38 np0005588920 podman[83306]: 2026-01-20 13:58:38.909511435 +0000 UTC m=+0.102452794 container create 2d4dbe693f6c5b4078dc6e3c6ca3f6bc9155a2bc2694d6e0cb7d0a82fc7eae44 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-rgw-rgw-compute-2-ktpnzt, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3)
Jan 20 08:58:38 np0005588920 ceph-mon[77148]: Deploying daemon rgw.rgw.compute-2.ktpnzt on compute-2
Jan 20 08:58:38 np0005588920 podman[83306]: 2026-01-20 13:58:38.827936583 +0000 UTC m=+0.020877962 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:58:38 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d646cc084f9c829b1ceb41fc6b3b125896692bd8e5494eb0d67978ed98e5d22a/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:38 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d646cc084f9c829b1ceb41fc6b3b125896692bd8e5494eb0d67978ed98e5d22a/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:38 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d646cc084f9c829b1ceb41fc6b3b125896692bd8e5494eb0d67978ed98e5d22a/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:38 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d646cc084f9c829b1ceb41fc6b3b125896692bd8e5494eb0d67978ed98e5d22a/merged/var/lib/ceph/radosgw/ceph-rgw.rgw.compute-2.ktpnzt supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:38 np0005588920 podman[83306]: 2026-01-20 13:58:38.979149257 +0000 UTC m=+0.172090636 container init 2d4dbe693f6c5b4078dc6e3c6ca3f6bc9155a2bc2694d6e0cb7d0a82fc7eae44 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-rgw-rgw-compute-2-ktpnzt, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_REF=reef, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 20 08:58:38 np0005588920 podman[83306]: 2026-01-20 13:58:38.985996151 +0000 UTC m=+0.178937520 container start 2d4dbe693f6c5b4078dc6e3c6ca3f6bc9155a2bc2694d6e0cb7d0a82fc7eae44 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-rgw-rgw-compute-2-ktpnzt, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 08:58:38 np0005588920 bash[83306]: 2d4dbe693f6c5b4078dc6e3c6ca3f6bc9155a2bc2694d6e0cb7d0a82fc7eae44
Jan 20 08:58:38 np0005588920 systemd[1]: Started Ceph rgw.rgw.compute-2.ktpnzt for e399cf45-e6b6-5393-99f1-75c601d3f188.
Jan 20 08:58:39 np0005588920 radosgw[83324]: deferred set uid:gid to 167:167 (ceph:ceph)
Jan 20 08:58:39 np0005588920 radosgw[83324]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process radosgw, pid 2
Jan 20 08:58:39 np0005588920 radosgw[83324]: framework: beast
Jan 20 08:58:39 np0005588920 radosgw[83324]: framework conf key: endpoint, val: 192.168.122.102:8082
Jan 20 08:58:39 np0005588920 radosgw[83324]: init_numa not setting numa affinity
Jan 20 08:58:39 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 6.1 scrub starts
Jan 20 08:58:39 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 6.1 scrub ok
Jan 20 08:58:40 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:40 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:40 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:40 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.orkqpg", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 20 08:58:40 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.orkqpg", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 20 08:58:40 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:40 np0005588920 ceph-mon[77148]: Deploying daemon rgw.rgw.compute-1.orkqpg on compute-1
Jan 20 08:58:40 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e43 e43: 3 total, 3 up, 3 in
Jan 20 08:58:40 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"} v 0) v1
Jan 20 08:58:40 np0005588920 ceph-mon[77148]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/1247667946' entity='client.rgw.rgw.compute-2.ktpnzt' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Jan 20 08:58:41 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e44 e44: 3 total, 3 up, 3 in
Jan 20 08:58:41 np0005588920 ceph-mon[77148]: from='client.? 192.168.122.102:0/1247667946' entity='client.rgw.rgw.compute-2.ktpnzt' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Jan 20 08:58:41 np0005588920 ceph-mon[77148]: from='client.? ' entity='client.rgw.rgw.compute-2.ktpnzt' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Jan 20 08:58:41 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 4.3 scrub starts
Jan 20 08:58:41 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 4.3 scrub ok
Jan 20 08:58:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e44 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 08:58:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e45 e45: 3 total, 3 up, 3 in
Jan 20 08:58:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"} v 0) v1
Jan 20 08:58:42 np0005588920 ceph-mon[77148]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/1247667946' entity='client.rgw.rgw.compute-2.ktpnzt' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 20 08:58:42 np0005588920 ceph-mon[77148]: from='client.? ' entity='client.rgw.rgw.compute-2.ktpnzt' cmd='[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]': finished
Jan 20 08:58:42 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:42 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:42 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:42 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.kiggjh", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 20 08:58:42 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.kiggjh", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 20 08:58:42 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:42 np0005588920 ceph-mon[77148]: Deploying daemon rgw.rgw.compute-0.kiggjh on compute-0
Jan 20 08:58:42 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 6.1b scrub starts
Jan 20 08:58:42 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 6.1b scrub ok
Jan 20 08:58:43 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e46 e46: 3 total, 3 up, 3 in
Jan 20 08:58:43 np0005588920 ceph-mon[77148]: from='client.? 192.168.122.102:0/1247667946' entity='client.rgw.rgw.compute-2.ktpnzt' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 20 08:58:43 np0005588920 ceph-mon[77148]: from='client.? ' entity='client.rgw.rgw.compute-2.ktpnzt' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 20 08:58:43 np0005588920 ceph-mon[77148]: from='client.? 192.168.122.101:0/2347323994' entity='client.rgw.rgw.compute-1.orkqpg' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 20 08:58:43 np0005588920 ceph-mon[77148]: from='client.? ' entity='client.rgw.rgw.compute-1.orkqpg' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 20 08:58:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e47 e47: 3 total, 3 up, 3 in
Jan 20 08:58:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"} v 0) v1
Jan 20 08:58:44 np0005588920 ceph-mon[77148]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/1247667946' entity='client.rgw.rgw.compute-2.ktpnzt' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 20 08:58:44 np0005588920 ceph-mon[77148]: from='client.? ' entity='client.rgw.rgw.compute-2.ktpnzt' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Jan 20 08:58:44 np0005588920 ceph-mon[77148]: from='client.? ' entity='client.rgw.rgw.compute-1.orkqpg' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Jan 20 08:58:44 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:44 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:44 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:44 np0005588920 ceph-mon[77148]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Jan 20 08:58:44 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:44 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:44 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.jyxktq", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Jan 20 08:58:44 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.jyxktq", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Jan 20 08:58:44 np0005588920 ceph-mon[77148]: Deploying daemon mds.cephfs.compute-2.jyxktq on compute-2
Jan 20 08:58:44 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:44 np0005588920 podman[83525]: 2026-01-20 13:58:44.295182155 +0000 UTC m=+0.071443461 container create 362bda55e5c8023a7e3b91d526ba923de236fee65da44ba87a5daa246478246d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_torvalds, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_REF=reef, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 20 08:58:44 np0005588920 systemd[1]: Started libpod-conmon-362bda55e5c8023a7e3b91d526ba923de236fee65da44ba87a5daa246478246d.scope.
Jan 20 08:58:44 np0005588920 podman[83525]: 2026-01-20 13:58:44.266671129 +0000 UTC m=+0.042932495 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:58:44 np0005588920 systemd[1]: Started libcrun container.
Jan 20 08:58:44 np0005588920 podman[83525]: 2026-01-20 13:58:44.390007694 +0000 UTC m=+0.166268990 container init 362bda55e5c8023a7e3b91d526ba923de236fee65da44ba87a5daa246478246d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_torvalds, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_REF=reef)
Jan 20 08:58:44 np0005588920 podman[83525]: 2026-01-20 13:58:44.400770193 +0000 UTC m=+0.177031469 container start 362bda55e5c8023a7e3b91d526ba923de236fee65da44ba87a5daa246478246d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_torvalds, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3)
Jan 20 08:58:44 np0005588920 podman[83525]: 2026-01-20 13:58:44.40328026 +0000 UTC m=+0.179541526 container attach 362bda55e5c8023a7e3b91d526ba923de236fee65da44ba87a5daa246478246d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_torvalds, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, io.buildah.version=1.39.3)
Jan 20 08:58:44 np0005588920 festive_torvalds[83542]: 167 167
Jan 20 08:58:44 np0005588920 systemd[1]: libpod-362bda55e5c8023a7e3b91d526ba923de236fee65da44ba87a5daa246478246d.scope: Deactivated successfully.
Jan 20 08:58:44 np0005588920 podman[83525]: 2026-01-20 13:58:44.40845596 +0000 UTC m=+0.184717236 container died 362bda55e5c8023a7e3b91d526ba923de236fee65da44ba87a5daa246478246d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_torvalds, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 20 08:58:44 np0005588920 systemd[1]: var-lib-containers-storage-overlay-d484950559a16ad7f0a606092f553ef260db108f1101872cee9b1ac4a7b5bf21-merged.mount: Deactivated successfully.
Jan 20 08:58:44 np0005588920 podman[83525]: 2026-01-20 13:58:44.455418222 +0000 UTC m=+0.231679508 container remove 362bda55e5c8023a7e3b91d526ba923de236fee65da44ba87a5daa246478246d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_torvalds, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, io.buildah.version=1.39.3, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2)
Jan 20 08:58:44 np0005588920 systemd[1]: libpod-conmon-362bda55e5c8023a7e3b91d526ba923de236fee65da44ba87a5daa246478246d.scope: Deactivated successfully.
Jan 20 08:58:44 np0005588920 systemd[1]: Reloading.
Jan 20 08:58:44 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:58:44 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:58:44 np0005588920 systemd[1]: Reloading.
Jan 20 08:58:44 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:58:44 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:58:45 np0005588920 systemd[1]: Starting Ceph mds.cephfs.compute-2.jyxktq for e399cf45-e6b6-5393-99f1-75c601d3f188...
Jan 20 08:58:45 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e48 e48: 3 total, 3 up, 3 in
Jan 20 08:58:45 np0005588920 ceph-mon[77148]: from='client.? 192.168.122.101:0/2347323994' entity='client.rgw.rgw.compute-1.orkqpg' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 20 08:58:45 np0005588920 ceph-mon[77148]: from='client.? 192.168.122.100:0/418792044' entity='client.rgw.rgw.compute-0.kiggjh' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 20 08:58:45 np0005588920 ceph-mon[77148]: from='client.? ' entity='client.rgw.rgw.compute-1.orkqpg' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 20 08:58:45 np0005588920 ceph-mon[77148]: from='client.? 192.168.122.102:0/1247667946' entity='client.rgw.rgw.compute-2.ktpnzt' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 20 08:58:45 np0005588920 ceph-mon[77148]: from='client.? ' entity='client.rgw.rgw.compute-2.ktpnzt' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 20 08:58:45 np0005588920 ceph-mon[77148]: from='client.? 192.168.122.100:0/418792044' entity='client.rgw.rgw.compute-0.kiggjh' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Jan 20 08:58:45 np0005588920 ceph-mon[77148]: from='client.? ' entity='client.rgw.rgw.compute-1.orkqpg' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Jan 20 08:58:45 np0005588920 ceph-mon[77148]: from='client.? ' entity='client.rgw.rgw.compute-2.ktpnzt' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Jan 20 08:58:45 np0005588920 podman[83685]: 2026-01-20 13:58:45.392969978 +0000 UTC m=+0.050719965 container create a8aac5b8bfb0d08df84190702790ed17334ee2d4469d178ef6c35a90168c643f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mds-cephfs-compute-2-jyxktq, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 20 08:58:45 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a29da782974ad346e4ae6988c40223314a38b462216be0e919a4f642b7f0e6a/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:45 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a29da782974ad346e4ae6988c40223314a38b462216be0e919a4f642b7f0e6a/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:45 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a29da782974ad346e4ae6988c40223314a38b462216be0e919a4f642b7f0e6a/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:45 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a29da782974ad346e4ae6988c40223314a38b462216be0e919a4f642b7f0e6a/merged/var/lib/ceph/mds/ceph-cephfs.compute-2.jyxktq supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:45 np0005588920 podman[83685]: 2026-01-20 13:58:45.375592561 +0000 UTC m=+0.033342578 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:58:45 np0005588920 podman[83685]: 2026-01-20 13:58:45.470540452 +0000 UTC m=+0.128290459 container init a8aac5b8bfb0d08df84190702790ed17334ee2d4469d178ef6c35a90168c643f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mds-cephfs-compute-2-jyxktq, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 08:58:45 np0005588920 podman[83685]: 2026-01-20 13:58:45.483766188 +0000 UTC m=+0.141516175 container start a8aac5b8bfb0d08df84190702790ed17334ee2d4469d178ef6c35a90168c643f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mds-cephfs-compute-2-jyxktq, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 20 08:58:45 np0005588920 bash[83685]: a8aac5b8bfb0d08df84190702790ed17334ee2d4469d178ef6c35a90168c643f
Jan 20 08:58:45 np0005588920 systemd[1]: Started Ceph mds.cephfs.compute-2.jyxktq for e399cf45-e6b6-5393-99f1-75c601d3f188.
Jan 20 08:58:45 np0005588920 ceph-mds[83715]: set uid:gid to 167:167 (ceph:ceph)
Jan 20 08:58:45 np0005588920 ceph-mds[83715]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-mds, pid 2
Jan 20 08:58:45 np0005588920 ceph-mds[83715]: main not setting numa affinity
Jan 20 08:58:45 np0005588920 ceph-mds[83715]: pidfile_write: ignore empty --pid-file
Jan 20 08:58:45 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mds-cephfs-compute-2-jyxktq[83711]: starting mds.cephfs.compute-2.jyxktq at 
Jan 20 08:58:45 np0005588920 ceph-mds[83715]: mds.cephfs.compute-2.jyxktq Updating MDS map to version 2 from mon.1
Jan 20 08:58:46 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e49 e49: 3 total, 3 up, 3 in
Jan 20 08:58:46 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"} v 0) v1
Jan 20 08:58:46 np0005588920 ceph-mon[77148]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/159360274' entity='client.rgw.rgw.compute-2.ktpnzt' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 20 08:58:46 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).mds e3 new map
Jan 20 08:58:46 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).mds e3 print_map#012e3#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0112#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-20T13:58:19.644785+0000#012modified#0112026-01-20T13:58:19.644841+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#011#012up#011{}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-2.jyxktq{-1:24178} state up:standby seq 1 addr [v2:192.168.122.102:6804/2187119920,v1:192.168.122.102:6805/2187119920] compat {c=[1],r=[1],i=[7ff]}]
Jan 20 08:58:46 np0005588920 ceph-mds[83715]: mds.cephfs.compute-2.jyxktq Updating MDS map to version 3 from mon.1
Jan 20 08:58:46 np0005588920 ceph-mds[83715]: mds.cephfs.compute-2.jyxktq Monitors have assigned me to become a standby.
Jan 20 08:58:46 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).mds e4 new map
Jan 20 08:58:46 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).mds e4 print_map#012e4#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0114#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-20T13:58:19.644785+0000#012modified#0112026-01-20T13:58:46.558090+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24178}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012[mds.cephfs.compute-2.jyxktq{0:24178} state up:creating seq 1 addr [v2:192.168.122.102:6804/2187119920,v1:192.168.122.102:6805/2187119920] compat {c=[1],r=[1],i=[7ff]}]#012 #012 
Jan 20 08:58:46 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:46 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:46 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:46 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.znrafi", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Jan 20 08:58:46 np0005588920 ceph-mds[83715]: mds.cephfs.compute-2.jyxktq Updating MDS map to version 4 from mon.1
Jan 20 08:58:46 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.znrafi", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Jan 20 08:58:46 np0005588920 ceph-mon[77148]: Deploying daemon mds.cephfs.compute-0.znrafi on compute-0
Jan 20 08:58:46 np0005588920 ceph-mon[77148]: from='client.? 192.168.122.100:0/4243338850' entity='client.rgw.rgw.compute-0.kiggjh' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 20 08:58:46 np0005588920 ceph-mon[77148]: from='client.? 192.168.122.101:0/1523026806' entity='client.rgw.rgw.compute-1.orkqpg' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 20 08:58:46 np0005588920 ceph-mon[77148]: from='client.? ' entity='client.rgw.rgw.compute-1.orkqpg' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 20 08:58:46 np0005588920 ceph-mon[77148]: from='client.? 192.168.122.102:0/159360274' entity='client.rgw.rgw.compute-2.ktpnzt' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 20 08:58:46 np0005588920 ceph-mon[77148]: from='client.? ' entity='client.rgw.rgw.compute-2.ktpnzt' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 20 08:58:46 np0005588920 ceph-mds[83715]: mds.0.4 handle_mds_map i am now mds.0.4
Jan 20 08:58:46 np0005588920 ceph-mds[83715]: mds.0.4 handle_mds_map state change up:standby --> up:creating
Jan 20 08:58:46 np0005588920 ceph-mds[83715]: mds.0.cache creating system inode with ino:0x1
Jan 20 08:58:46 np0005588920 ceph-mds[83715]: mds.0.cache creating system inode with ino:0x100
Jan 20 08:58:46 np0005588920 ceph-mds[83715]: mds.0.cache creating system inode with ino:0x600
Jan 20 08:58:46 np0005588920 ceph-mds[83715]: mds.0.cache creating system inode with ino:0x601
Jan 20 08:58:46 np0005588920 ceph-mds[83715]: mds.0.cache creating system inode with ino:0x602
Jan 20 08:58:46 np0005588920 ceph-mds[83715]: mds.0.cache creating system inode with ino:0x603
Jan 20 08:58:46 np0005588920 ceph-mds[83715]: mds.0.cache creating system inode with ino:0x604
Jan 20 08:58:46 np0005588920 ceph-mds[83715]: mds.0.cache creating system inode with ino:0x605
Jan 20 08:58:46 np0005588920 ceph-mds[83715]: mds.0.cache creating system inode with ino:0x606
Jan 20 08:58:46 np0005588920 ceph-mds[83715]: mds.0.cache creating system inode with ino:0x607
Jan 20 08:58:46 np0005588920 ceph-mds[83715]: mds.0.cache creating system inode with ino:0x608
Jan 20 08:58:46 np0005588920 ceph-mds[83715]: mds.0.cache creating system inode with ino:0x609
Jan 20 08:58:46 np0005588920 ceph-mds[83715]: mds.0.4 creating_done
Jan 20 08:58:46 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 4.6 deep-scrub starts
Jan 20 08:58:46 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 4.6 deep-scrub ok
Jan 20 08:58:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e49 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 08:58:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e50 e50: 3 total, 3 up, 3 in
Jan 20 08:58:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"} v 0) v1
Jan 20 08:58:47 np0005588920 ceph-mon[77148]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/159360274' entity='client.rgw.rgw.compute-2.ktpnzt' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 20 08:58:47 np0005588920 ceph-mon[77148]: daemon mds.cephfs.compute-2.jyxktq assigned to filesystem cephfs as rank 0 (now has 1 ranks)
Jan 20 08:58:47 np0005588920 ceph-mon[77148]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline)
Jan 20 08:58:47 np0005588920 ceph-mon[77148]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds)
Jan 20 08:58:47 np0005588920 ceph-mon[77148]: Cluster is now healthy
Jan 20 08:58:47 np0005588920 ceph-mon[77148]: daemon mds.cephfs.compute-2.jyxktq is now active in filesystem cephfs as rank 0
Jan 20 08:58:47 np0005588920 ceph-mon[77148]: from='client.? 192.168.122.100:0/4243338850' entity='client.rgw.rgw.compute-0.kiggjh' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Jan 20 08:58:47 np0005588920 ceph-mon[77148]: from='client.? ' entity='client.rgw.rgw.compute-1.orkqpg' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Jan 20 08:58:47 np0005588920 ceph-mon[77148]: from='client.? ' entity='client.rgw.rgw.compute-2.ktpnzt' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Jan 20 08:58:47 np0005588920 ceph-mon[77148]: from='client.? 192.168.122.100:0/4243338850' entity='client.rgw.rgw.compute-0.kiggjh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 20 08:58:47 np0005588920 ceph-mon[77148]: from='client.? 192.168.122.101:0/1523026806' entity='client.rgw.rgw.compute-1.orkqpg' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 20 08:58:47 np0005588920 ceph-mon[77148]: from='client.? 192.168.122.102:0/159360274' entity='client.rgw.rgw.compute-2.ktpnzt' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 20 08:58:47 np0005588920 ceph-mon[77148]: from='client.? ' entity='client.rgw.rgw.compute-1.orkqpg' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 20 08:58:47 np0005588920 ceph-mon[77148]: from='client.? ' entity='client.rgw.rgw.compute-2.ktpnzt' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 20 08:58:47 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:47 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:47 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:47 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.rtofcx", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Jan 20 08:58:47 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.rtofcx", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Jan 20 08:58:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).mds e5 new map
Jan 20 08:58:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).mds e5 print_map#012e5#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-20T13:58:19.644785+0000#012modified#0112026-01-20T13:58:47.570199+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24178}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012[mds.cephfs.compute-2.jyxktq{0:24178} state up:active seq 2 addr [v2:192.168.122.102:6804/2187119920,v1:192.168.122.102:6805/2187119920] compat {c=[1],r=[1],i=[7ff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.znrafi{-1:14376} state up:standby seq 1 addr [v2:192.168.122.100:6806/2144836821,v1:192.168.122.100:6807/2144836821] compat {c=[1],r=[1],i=[7ff]}]
Jan 20 08:58:47 np0005588920 ceph-mds[83715]: mds.cephfs.compute-2.jyxktq Updating MDS map to version 5 from mon.1
Jan 20 08:58:47 np0005588920 ceph-mds[83715]: mds.0.4 handle_mds_map i am now mds.0.4
Jan 20 08:58:47 np0005588920 ceph-mds[83715]: mds.0.4 handle_mds_map state change up:creating --> up:active
Jan 20 08:58:47 np0005588920 ceph-mds[83715]: mds.0.4 recovery_done -- successful recovery!
Jan 20 08:58:47 np0005588920 ceph-mds[83715]: mds.0.4 active_start
Jan 20 08:58:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).mds e6 new map
Jan 20 08:58:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).mds e6 print_map#012e6#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-20T13:58:19.644785+0000#012modified#0112026-01-20T13:58:47.570199+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24178}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012[mds.cephfs.compute-2.jyxktq{0:24178} state up:active seq 2 addr [v2:192.168.122.102:6804/2187119920,v1:192.168.122.102:6805/2187119920] compat {c=[1],r=[1],i=[7ff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.znrafi{-1:14376} state up:standby seq 1 addr [v2:192.168.122.100:6806/2144836821,v1:192.168.122.100:6807/2144836821] compat {c=[1],r=[1],i=[7ff]}]
Jan 20 08:58:48 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e51 e51: 3 total, 3 up, 3 in
Jan 20 08:58:48 np0005588920 radosgw[83324]: LDAP not started since no server URIs were provided in the configuration.
Jan 20 08:58:48 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-rgw-rgw-compute-2-ktpnzt[83320]: 2026-01-20T13:58:48.420+0000 7f0b460dc940 -1 LDAP not started since no server URIs were provided in the configuration.
Jan 20 08:58:48 np0005588920 radosgw[83324]: framework: beast
Jan 20 08:58:48 np0005588920 radosgw[83324]: framework conf key: ssl_certificate, val: config://rgw/cert/$realm/$zone.crt
Jan 20 08:58:48 np0005588920 radosgw[83324]: framework conf key: ssl_private_key, val: config://rgw/cert/$realm/$zone.key
Jan 20 08:58:48 np0005588920 radosgw[83324]: INFO: RGWReshardLock::lock found lock on reshard.0000000000 to be held by another RGW process; skipping for now
Jan 20 08:58:48 np0005588920 radosgw[83324]: INFO: RGWReshardLock::lock found lock on reshard.0000000002 to be held by another RGW process; skipping for now
Jan 20 08:58:48 np0005588920 radosgw[83324]: INFO: RGWReshardLock::lock found lock on reshard.0000000003 to be held by another RGW process; skipping for now
Jan 20 08:58:48 np0005588920 radosgw[83324]: starting handler: beast
Jan 20 08:58:48 np0005588920 radosgw[83324]: INFO: RGWReshardLock::lock found lock on reshard.0000000005 to be held by another RGW process; skipping for now
Jan 20 08:58:48 np0005588920 radosgw[83324]: set uid:gid to 167:167 (ceph:ceph)
Jan 20 08:58:48 np0005588920 radosgw[83324]: INFO: RGWReshardLock::lock found lock on reshard.0000000006 to be held by another RGW process; skipping for now
Jan 20 08:58:48 np0005588920 radosgw[83324]: INFO: RGWReshardLock::lock found lock on reshard.0000000008 to be held by another RGW process; skipping for now
Jan 20 08:58:48 np0005588920 radosgw[83324]: INFO: RGWReshardLock::lock found lock on reshard.0000000009 to be held by another RGW process; skipping for now
Jan 20 08:58:48 np0005588920 radosgw[83324]: mgrc service_daemon_register rgw.24166 metadata {arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable),ceph_version_short=18.2.7,container_hostname=compute-2,container_image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0,cpu=AMD EPYC-Rome Processor,distro=centos,distro_description=CentOS Stream 9,distro_version=9,frontend_config#0=beast endpoint=192.168.122.102:8082,frontend_type#0=beast,hostname=compute-2,id=rgw.compute-2.ktpnzt,kernel_description=#1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026,kernel_version=5.14.0-661.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864308,num_handles=1,os=Linux,pid=2,realm_id=,realm_name=,zone_id=8115d0e5-f46a-4d23-887b-99af6a666d4f,zone_name=default,zonegroup_id=1c9817d6-3061-4a20-aeb7-2a830f7cf40e,zonegroup_name=default}
Jan 20 08:58:48 np0005588920 ceph-mon[77148]: Deploying daemon mds.cephfs.compute-1.rtofcx on compute-1
Jan 20 08:58:48 np0005588920 ceph-mon[77148]: from='client.? 192.168.122.100:0/4243338850' entity='client.rgw.rgw.compute-0.kiggjh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Jan 20 08:58:48 np0005588920 ceph-mon[77148]: from='client.? ' entity='client.rgw.rgw.compute-1.orkqpg' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Jan 20 08:58:48 np0005588920 ceph-mon[77148]: from='client.? ' entity='client.rgw.rgw.compute-2.ktpnzt' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Jan 20 08:58:48 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 4.2 scrub starts
Jan 20 08:58:48 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 4.2 scrub ok
Jan 20 08:58:50 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 2.1b scrub starts
Jan 20 08:58:50 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 2.1b scrub ok
Jan 20 08:58:51 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:51 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:51 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:51 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:51 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:51 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:51 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 3.0 scrub starts
Jan 20 08:58:51 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 3.0 scrub ok
Jan 20 08:58:51 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).mds e7 new map
Jan 20 08:58:51 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).mds e7 print_map#012e7#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0117#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-20T13:58:19.644785+0000#012modified#0112026-01-20T13:58:50.863864+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24178}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012[mds.cephfs.compute-2.jyxktq{0:24178} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/2187119920,v1:192.168.122.102:6805/2187119920] compat {c=[1],r=[1],i=[7ff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.znrafi{-1:14376} state up:standby seq 1 addr [v2:192.168.122.100:6806/2144836821,v1:192.168.122.100:6807/2144836821] compat {c=[1],r=[1],i=[7ff]}]#012[mds.cephfs.compute-1.rtofcx{-1:24137} state up:standby seq 1 addr [v2:192.168.122.101:6804/2015191638,v1:192.168.122.101:6805/2015191638] compat {c=[1],r=[1],i=[7ff]}]
Jan 20 08:58:51 np0005588920 ceph-mds[83715]: mds.cephfs.compute-2.jyxktq Updating MDS map to version 7 from mon.1
Jan 20 08:58:51 np0005588920 ceph-mon[77148]: Deploying daemon haproxy.rgw.default.compute-0.nqkboe on compute-0
Jan 20 08:58:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e51 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 08:58:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).mds e8 new map
Jan 20 08:58:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).mds e8 print_map#012e8#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0117#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-20T13:58:19.644785+0000#012modified#0112026-01-20T13:58:50.863864+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24178}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012[mds.cephfs.compute-2.jyxktq{0:24178} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/2187119920,v1:192.168.122.102:6805/2187119920] compat {c=[1],r=[1],i=[7ff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.znrafi{-1:14376} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.100:6806/2144836821,v1:192.168.122.100:6807/2144836821] compat {c=[1],r=[1],i=[7ff]}]#012[mds.cephfs.compute-1.rtofcx{-1:24137} state up:standby seq 1 addr [v2:192.168.122.101:6804/2015191638,v1:192.168.122.101:6805/2015191638] compat {c=[1],r=[1],i=[7ff]}]
Jan 20 08:58:53 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 5.d scrub starts
Jan 20 08:58:53 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 5.d scrub ok
Jan 20 08:58:54 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:54 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:54 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:54 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:54 np0005588920 ceph-mon[77148]: Deploying daemon haproxy.rgw.default.compute-2.cuokcs on compute-2
Jan 20 08:58:54 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).mds e9 new map
Jan 20 08:58:54 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).mds e9 print_map#012e9#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0117#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-20T13:58:19.644785+0000#012modified#0112026-01-20T13:58:50.863864+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24178}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012[mds.cephfs.compute-2.jyxktq{0:24178} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/2187119920,v1:192.168.122.102:6805/2187119920] compat {c=[1],r=[1],i=[7ff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.znrafi{-1:14376} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.100:6806/2144836821,v1:192.168.122.100:6807/2144836821] compat {c=[1],r=[1],i=[7ff]}]#012[mds.cephfs.compute-1.rtofcx{-1:24137} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.101:6804/2015191638,v1:192.168.122.101:6805/2015191638] compat {c=[1],r=[1],i=[7ff]}]
Jan 20 08:58:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:58:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.003000080s ======
Jan 20 08:58:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:13:58:55.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000080s
Jan 20 08:58:55 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 2.a scrub starts
Jan 20 08:58:55 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 2.a scrub ok
Jan 20 08:58:56 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 2.c deep-scrub starts
Jan 20 08:58:56 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 2.c deep-scrub ok
Jan 20 08:58:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e51 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 08:58:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:58:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 08:58:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:13:58:57.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 08:58:57 np0005588920 podman[84429]: 2026-01-20 13:58:57.22304847 +0000 UTC m=+2.644962034 container create b018a20012ea4fbd821a7c59722ee100000836b5407086be73ea5a606db24e85 (image=quay.io/ceph/haproxy:2.3, name=blissful_cray)
Jan 20 08:58:57 np0005588920 podman[84429]: 2026-01-20 13:58:57.20108863 +0000 UTC m=+2.623002274 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Jan 20 08:58:57 np0005588920 systemd[1]: Started libpod-conmon-b018a20012ea4fbd821a7c59722ee100000836b5407086be73ea5a606db24e85.scope.
Jan 20 08:58:57 np0005588920 systemd[1]: Started libcrun container.
Jan 20 08:58:57 np0005588920 podman[84429]: 2026-01-20 13:58:57.320326444 +0000 UTC m=+2.742240008 container init b018a20012ea4fbd821a7c59722ee100000836b5407086be73ea5a606db24e85 (image=quay.io/ceph/haproxy:2.3, name=blissful_cray)
Jan 20 08:58:57 np0005588920 podman[84429]: 2026-01-20 13:58:57.332793499 +0000 UTC m=+2.754707063 container start b018a20012ea4fbd821a7c59722ee100000836b5407086be73ea5a606db24e85 (image=quay.io/ceph/haproxy:2.3, name=blissful_cray)
Jan 20 08:58:57 np0005588920 podman[84429]: 2026-01-20 13:58:57.336224252 +0000 UTC m=+2.758137846 container attach b018a20012ea4fbd821a7c59722ee100000836b5407086be73ea5a606db24e85 (image=quay.io/ceph/haproxy:2.3, name=blissful_cray)
Jan 20 08:58:57 np0005588920 blissful_cray[84544]: 0 0
Jan 20 08:58:57 np0005588920 systemd[1]: libpod-b018a20012ea4fbd821a7c59722ee100000836b5407086be73ea5a606db24e85.scope: Deactivated successfully.
Jan 20 08:58:57 np0005588920 podman[84429]: 2026-01-20 13:58:57.343321972 +0000 UTC m=+2.765235566 container died b018a20012ea4fbd821a7c59722ee100000836b5407086be73ea5a606db24e85 (image=quay.io/ceph/haproxy:2.3, name=blissful_cray)
Jan 20 08:58:57 np0005588920 systemd[1]: var-lib-containers-storage-overlay-8180638e13ed7e4739ac7e4dafe2ca7fbeb71615e7c16d1400ad1fcc7b1fda92-merged.mount: Deactivated successfully.
Jan 20 08:58:57 np0005588920 podman[84429]: 2026-01-20 13:58:57.393490371 +0000 UTC m=+2.815403925 container remove b018a20012ea4fbd821a7c59722ee100000836b5407086be73ea5a606db24e85 (image=quay.io/ceph/haproxy:2.3, name=blissful_cray)
Jan 20 08:58:57 np0005588920 systemd[1]: libpod-conmon-b018a20012ea4fbd821a7c59722ee100000836b5407086be73ea5a606db24e85.scope: Deactivated successfully.
Jan 20 08:58:57 np0005588920 systemd[1]: Reloading.
Jan 20 08:58:57 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:58:57 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:58:57 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 2.d scrub starts
Jan 20 08:58:57 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 2.d scrub ok
Jan 20 08:58:57 np0005588920 systemd[1]: Reloading.
Jan 20 08:58:57 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:58:57 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:58:58 np0005588920 systemd[1]: Starting Ceph haproxy.rgw.default.compute-2.cuokcs for e399cf45-e6b6-5393-99f1-75c601d3f188...
Jan 20 08:58:58 np0005588920 podman[84687]: 2026-01-20 13:58:58.474709469 +0000 UTC m=+0.060578889 container create c2bc03da81fcaa831434668625c7c49511adebaebe9f2026fa9fc7c3d873fb54 (image=quay.io/ceph/haproxy:2.3, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-2-cuokcs)
Jan 20 08:58:58 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1d2750c2b0a0e3b8fd12ff199ed0b87441ae77163d5fee9f3f72c9fcae817670/merged/var/lib/haproxy supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:58 np0005588920 podman[84687]: 2026-01-20 13:58:58.529609294 +0000 UTC m=+0.115478724 container init c2bc03da81fcaa831434668625c7c49511adebaebe9f2026fa9fc7c3d873fb54 (image=quay.io/ceph/haproxy:2.3, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-2-cuokcs)
Jan 20 08:58:58 np0005588920 podman[84687]: 2026-01-20 13:58:58.535125012 +0000 UTC m=+0.120994442 container start c2bc03da81fcaa831434668625c7c49511adebaebe9f2026fa9fc7c3d873fb54 (image=quay.io/ceph/haproxy:2.3, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-2-cuokcs)
Jan 20 08:58:58 np0005588920 podman[84687]: 2026-01-20 13:58:58.444454216 +0000 UTC m=+0.030323696 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Jan 20 08:58:58 np0005588920 bash[84687]: c2bc03da81fcaa831434668625c7c49511adebaebe9f2026fa9fc7c3d873fb54
Jan 20 08:58:58 np0005588920 systemd[1]: Started Ceph haproxy.rgw.default.compute-2.cuokcs for e399cf45-e6b6-5393-99f1-75c601d3f188.
Jan 20 08:58:58 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-2-cuokcs[84702]: [NOTICE] 019/135858 (2) : New worker #1 (4) forked
Jan 20 08:58:58 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:58 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:58 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:58 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:58 np0005588920 ceph-mon[77148]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 20 08:58:58 np0005588920 ceph-mon[77148]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Jan 20 08:58:58 np0005588920 ceph-mon[77148]: Deploying daemon keepalived.rgw.default.compute-0.gcjsxe on compute-0
Jan 20 08:58:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:58:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:58:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:13:58:59.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:58:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:58:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:58:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:13:58:59.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:01.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:01.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e51 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 08:59:02 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 7.a scrub starts
Jan 20 08:59:02 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 7.a scrub ok
Jan 20 08:59:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 20 08:59:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:03.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 20 08:59:03 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 5.b scrub starts
Jan 20 08:59:03 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 5.b scrub ok
Jan 20 08:59:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:03.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:04 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:04 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:04 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:04 np0005588920 ceph-mon[77148]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Jan 20 08:59:04 np0005588920 ceph-mon[77148]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 20 08:59:04 np0005588920 ceph-mon[77148]: Deploying daemon keepalived.rgw.default.compute-2.dleeql on compute-2
Jan 20 08:59:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 08:59:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:05.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 08:59:05 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 5.8 scrub starts
Jan 20 08:59:05 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 5.8 scrub ok
Jan 20 08:59:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:05.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e51 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 08:59:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:07.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:07 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]: dispatch
Jan 20 08:59:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e52 e52: 3 total, 3 up, 3 in
Jan 20 08:59:07 np0005588920 podman[84857]: 2026-01-20 13:59:07.710554262 +0000 UTC m=+3.367926545 container create f716cd2ac11cdcf8c9289a781398ca81289f270edf30f1adc93c2eafec95a670 (image=quay.io/ceph/keepalived:2.2.4, name=vibrant_liskov, release=1793, build-date=2023-02-22T09:23:20, description=keepalived for Ceph, com.redhat.component=keepalived-container, io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.buildah.version=1.28.2, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.expose-services=, version=2.2.4, vendor=Red Hat, Inc.)
Jan 20 08:59:07 np0005588920 systemd[1]: Started libpod-conmon-f716cd2ac11cdcf8c9289a781398ca81289f270edf30f1adc93c2eafec95a670.scope.
Jan 20 08:59:07 np0005588920 systemd[1]: Started libcrun container.
Jan 20 08:59:07 np0005588920 podman[84857]: 2026-01-20 13:59:07.773479973 +0000 UTC m=+3.430852266 container init f716cd2ac11cdcf8c9289a781398ca81289f270edf30f1adc93c2eafec95a670 (image=quay.io/ceph/keepalived:2.2.4, name=vibrant_liskov, version=2.2.4, summary=Provides keepalived on RHEL 9 for Ceph., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.tags=Ceph keepalived, release=1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.display-name=Keepalived on RHEL 9, vcs-type=git, description=keepalived for Ceph, name=keepalived, com.redhat.component=keepalived-container, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, build-date=2023-02-22T09:23:20, distribution-scope=public, architecture=x86_64, io.buildah.version=1.28.2)
Jan 20 08:59:07 np0005588920 podman[84857]: 2026-01-20 13:59:07.697763498 +0000 UTC m=+3.355135781 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Jan 20 08:59:07 np0005588920 podman[84857]: 2026-01-20 13:59:07.778782205 +0000 UTC m=+3.436154488 container start f716cd2ac11cdcf8c9289a781398ca81289f270edf30f1adc93c2eafec95a670 (image=quay.io/ceph/keepalived:2.2.4, name=vibrant_liskov, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, build-date=2023-02-22T09:23:20, description=keepalived for Ceph, version=2.2.4, name=keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=keepalived-container, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, summary=Provides keepalived on RHEL 9 for Ceph., io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, architecture=x86_64)
Jan 20 08:59:07 np0005588920 podman[84857]: 2026-01-20 13:59:07.781450937 +0000 UTC m=+3.438823220 container attach f716cd2ac11cdcf8c9289a781398ca81289f270edf30f1adc93c2eafec95a670 (image=quay.io/ceph/keepalived:2.2.4, name=vibrant_liskov, summary=Provides keepalived on RHEL 9 for Ceph., vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, description=keepalived for Ceph, name=keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.buildah.version=1.28.2, io.openshift.expose-services=, build-date=2023-02-22T09:23:20, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=Ceph keepalived, distribution-scope=public, com.redhat.component=keepalived-container, release=1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.display-name=Keepalived on RHEL 9, version=2.2.4)
Jan 20 08:59:07 np0005588920 vibrant_liskov[84959]: 0 0
Jan 20 08:59:07 np0005588920 systemd[1]: libpod-f716cd2ac11cdcf8c9289a781398ca81289f270edf30f1adc93c2eafec95a670.scope: Deactivated successfully.
Jan 20 08:59:07 np0005588920 podman[84857]: 2026-01-20 13:59:07.783524413 +0000 UTC m=+3.440896696 container died f716cd2ac11cdcf8c9289a781398ca81289f270edf30f1adc93c2eafec95a670 (image=quay.io/ceph/keepalived:2.2.4, name=vibrant_liskov, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, distribution-scope=public, name=keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20, description=keepalived for Ceph, version=2.2.4, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.component=keepalived-container, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, io.buildah.version=1.28.2, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9)
Jan 20 08:59:07 np0005588920 systemd[1]: var-lib-containers-storage-overlay-5e0b83a330d1959092138d717279278c808484c8184a65798a1c081d4afecd40-merged.mount: Deactivated successfully.
Jan 20 08:59:07 np0005588920 podman[84857]: 2026-01-20 13:59:07.816146279 +0000 UTC m=+3.473518562 container remove f716cd2ac11cdcf8c9289a781398ca81289f270edf30f1adc93c2eafec95a670 (image=quay.io/ceph/keepalived:2.2.4, name=vibrant_liskov, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, architecture=x86_64, name=keepalived, vendor=Red Hat, Inc., io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., vcs-type=git, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, description=keepalived for Ceph, io.buildah.version=1.28.2, version=2.2.4, build-date=2023-02-22T09:23:20, release=1793, com.redhat.component=keepalived-container, distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 20 08:59:07 np0005588920 systemd[1]: libpod-conmon-f716cd2ac11cdcf8c9289a781398ca81289f270edf30f1adc93c2eafec95a670.scope: Deactivated successfully.
Jan 20 08:59:07 np0005588920 systemd[1]: Reloading.
Jan 20 08:59:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:07.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:07 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:59:07 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:59:08 np0005588920 systemd[1]: Reloading.
Jan 20 08:59:08 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:59:08 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:59:08 np0005588920 systemd[1]: Starting Ceph keepalived.rgw.default.compute-2.dleeql for e399cf45-e6b6-5393-99f1-75c601d3f188...
Jan 20 08:59:08 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e53 e53: 3 total, 3 up, 3 in
Jan 20 08:59:08 np0005588920 podman[85105]: 2026-01-20 13:59:08.756368968 +0000 UTC m=+0.094598993 container create 4f78941c670aecb446a8bef016d44f9adf56eddc01c85813d46adbff35b13482 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-2-dleeql, io.buildah.version=1.28.2, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, summary=Provides keepalived on RHEL 9 for Ceph., name=keepalived, release=1793, build-date=2023-02-22T09:23:20, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=Ceph keepalived, vendor=Red Hat, Inc., vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=keepalived-container, io.k8s.display-name=Keepalived on RHEL 9, architecture=x86_64, io.openshift.expose-services=, version=2.2.4, distribution-scope=public, description=keepalived for Ceph)
Jan 20 08:59:08 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]': finished
Jan 20 08:59:08 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]: dispatch
Jan 20 08:59:08 np0005588920 podman[85105]: 2026-01-20 13:59:08.690296852 +0000 UTC m=+0.028526887 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Jan 20 08:59:08 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/da163f20f86839d019f57876cc1fcfeca512d1d6ec19fb725d5dbfa41da48983/merged/etc/keepalived/keepalived.conf supports timestamps until 2038 (0x7fffffff)
Jan 20 08:59:08 np0005588920 podman[85105]: 2026-01-20 13:59:08.85837762 +0000 UTC m=+0.196607705 container init 4f78941c670aecb446a8bef016d44f9adf56eddc01c85813d46adbff35b13482 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-2-dleeql, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=keepalived-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, description=keepalived for Ceph, release=1793, distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, name=keepalived, version=2.2.4, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.28.2)
Jan 20 08:59:08 np0005588920 podman[85105]: 2026-01-20 13:59:08.863137398 +0000 UTC m=+0.201367433 container start 4f78941c670aecb446a8bef016d44f9adf56eddc01c85813d46adbff35b13482 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-2-dleeql, io.buildah.version=1.28.2, vendor=Red Hat, Inc., summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.tags=Ceph keepalived, vcs-type=git, release=1793, io.k8s.display-name=Keepalived on RHEL 9, name=keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=keepalived-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, architecture=x86_64, io.openshift.expose-services=, description=keepalived for Ceph, version=2.2.4, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 20 08:59:08 np0005588920 bash[85105]: 4f78941c670aecb446a8bef016d44f9adf56eddc01c85813d46adbff35b13482
Jan 20 08:59:08 np0005588920 systemd[1]: Started Ceph keepalived.rgw.default.compute-2.dleeql for e399cf45-e6b6-5393-99f1-75c601d3f188.
Jan 20 08:59:08 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-2-dleeql[85121]: Tue Jan 20 13:59:08 2026: Starting Keepalived v2.2.4 (08/21,2021)
Jan 20 08:59:08 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-2-dleeql[85121]: Tue Jan 20 13:59:08 2026: Running on Linux 5.14.0-661.el9.x86_64 #1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026 (built for Linux 5.14.0)
Jan 20 08:59:08 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-2-dleeql[85121]: Tue Jan 20 13:59:08 2026: Command line: '/usr/sbin/keepalived' '-n' '-l' '-f' '/etc/keepalived/keepalived.conf'
Jan 20 08:59:08 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-2-dleeql[85121]: Tue Jan 20 13:59:08 2026: Configuration file /etc/keepalived/keepalived.conf
Jan 20 08:59:08 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-2-dleeql[85121]: Tue Jan 20 13:59:08 2026: NOTICE: setting config option max_auto_priority should result in better keepalived performance
Jan 20 08:59:08 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-2-dleeql[85121]: Tue Jan 20 13:59:08 2026: Starting VRRP child process, pid=4
Jan 20 08:59:08 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-2-dleeql[85121]: Tue Jan 20 13:59:08 2026: Startup complete
Jan 20 08:59:08 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-2-dleeql[85121]: Tue Jan 20 13:59:08 2026: (VI_0) Entering BACKUP STATE (init)
Jan 20 08:59:08 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-2-dleeql[85121]: Tue Jan 20 13:59:08 2026: VRRP_Script(check_backend) succeeded
Jan 20 08:59:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 08:59:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:09.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 08:59:09 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 7.14 scrub starts
Jan 20 08:59:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e54 e54: 3 total, 3 up, 3 in
Jan 20 08:59:09 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 7.14 scrub ok
Jan 20 08:59:09 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]': finished
Jan 20 08:59:09 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]: dispatch
Jan 20 08:59:09 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:09 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:09 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:09 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 20 08:59:09 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 20 08:59:09 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:09 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]': finished
Jan 20 08:59:09 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]': finished
Jan 20 08:59:09 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]': finished
Jan 20 08:59:09 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]: dispatch
Jan 20 08:59:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:09.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:10 np0005588920 podman[85403]: 2026-01-20 13:59:10.60911017 +0000 UTC m=+0.075514671 container exec 6d4eaf8659f13902200468a4ae46f22a0824452b9353ff1491898f5d3b0130c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mon-compute-2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 20 08:59:10 np0005588920 podman[85403]: 2026-01-20 13:59:10.725604711 +0000 UTC m=+0.192009212 container exec_died 6d4eaf8659f13902200468a4ae46f22a0824452b9353ff1491898f5d3b0130c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mon-compute-2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 20 08:59:10 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e55 e55: 3 total, 3 up, 3 in
Jan 20 08:59:10 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]': finished
Jan 20 08:59:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 08:59:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:11.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 08:59:11 np0005588920 podman[85558]: 2026-01-20 13:59:11.538984491 +0000 UTC m=+0.089436905 container exec c2bc03da81fcaa831434668625c7c49511adebaebe9f2026fa9fc7c3d873fb54 (image=quay.io/ceph/haproxy:2.3, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-2-cuokcs)
Jan 20 08:59:11 np0005588920 podman[85558]: 2026-01-20 13:59:11.554926429 +0000 UTC m=+0.105378793 container exec_died c2bc03da81fcaa831434668625c7c49511adebaebe9f2026fa9fc7c3d873fb54 (image=quay.io/ceph/haproxy:2.3, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-2-cuokcs)
Jan 20 08:59:11 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e56 e56: 3 total, 3 up, 3 in
Jan 20 08:59:11 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 20 08:59:11 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 20 08:59:11 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:11 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:11 np0005588920 podman[85624]: 2026-01-20 13:59:11.8723554 +0000 UTC m=+0.047475107 container exec 4f78941c670aecb446a8bef016d44f9adf56eddc01c85813d46adbff35b13482 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-2-dleeql, summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, distribution-scope=public, io.openshift.tags=Ceph keepalived, com.redhat.component=keepalived-container, version=2.2.4, io.k8s.display-name=Keepalived on RHEL 9, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1793, description=keepalived for Ceph, name=keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793)
Jan 20 08:59:11 np0005588920 podman[85624]: 2026-01-20 13:59:11.885587066 +0000 UTC m=+0.060706763 container exec_died 4f78941c670aecb446a8bef016d44f9adf56eddc01c85813d46adbff35b13482 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-2-dleeql, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20, release=1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=keepalived, com.redhat.component=keepalived-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=2.2.4, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=Ceph keepalived, io.buildah.version=1.28.2, vcs-type=git, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vendor=Red Hat, Inc., description=keepalived for Ceph, summary=Provides keepalived on RHEL 9 for Ceph.)
Jan 20 08:59:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 08:59:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:11.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 08:59:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e56 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 08:59:12 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-2-dleeql[85121]: Tue Jan 20 13:59:12 2026: (VI_0) Entering MASTER STATE
Jan 20 08:59:12 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-2-dleeql[85121]: Tue Jan 20 13:59:12 2026: (VI_0) Master received advert from 192.168.122.100 with higher priority 100, ours 90
Jan 20 08:59:12 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-2-dleeql[85121]: Tue Jan 20 13:59:12 2026: (VI_0) Entering BACKUP STATE
Jan 20 08:59:12 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]': finished
Jan 20 08:59:12 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]': finished
Jan 20 08:59:12 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:12 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:12 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:12 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e57 e57: 3 total, 3 up, 3 in
Jan 20 08:59:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 08:59:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:13.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 08:59:13 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 08:59:13 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:13 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 08:59:13 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 08:59:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:13.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 08:59:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:15.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:15 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 2.13 scrub starts
Jan 20 08:59:15 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 2.13 scrub ok
Jan 20 08:59:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:15.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:16 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 2.10 scrub starts
Jan 20 08:59:16 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 2.10 scrub ok
Jan 20 08:59:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e57 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 08:59:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:17.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 08:59:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:17.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 08:59:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e58 e58: 3 total, 3 up, 3 in
Jan 20 08:59:17 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 58 pg[10.12( empty local-lis/les=0/0 n=0 ec=56/47 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:17 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 58 pg[10.11( empty local-lis/les=0/0 n=0 ec=56/47 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:17 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 58 pg[10.1e( empty local-lis/les=0/0 n=0 ec=56/47 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:17 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 58 pg[10.10( empty local-lis/les=0/0 n=0 ec=56/47 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:17 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 58 pg[10.4( empty local-lis/les=0/0 n=0 ec=56/47 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:17 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 58 pg[10.3( empty local-lis/les=0/0 n=0 ec=56/47 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:17 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 58 pg[10.f( empty local-lis/les=0/0 n=0 ec=56/47 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:17 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 58 pg[10.1( empty local-lis/les=0/0 n=0 ec=56/47 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:17 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 58 pg[8.15( empty local-lis/les=0/0 n=0 ec=54/43 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:17 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 58 pg[11.16( empty local-lis/les=0/0 n=0 ec=56/49 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:17 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 58 pg[11.17( empty local-lis/les=0/0 n=0 ec=56/49 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:17 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 58 pg[8.16( empty local-lis/les=0/0 n=0 ec=54/43 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:17 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 58 pg[11.13( empty local-lis/les=0/0 n=0 ec=56/49 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:17 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 58 pg[8.2( empty local-lis/les=0/0 n=0 ec=54/43 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:17 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 58 pg[8.11( empty local-lis/les=0/0 n=0 ec=54/43 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:17 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 58 pg[8.3( empty local-lis/les=0/0 n=0 ec=54/43 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:17 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 58 pg[8.f( empty local-lis/les=0/0 n=0 ec=54/43 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:17 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 58 pg[11.a( empty local-lis/les=0/0 n=0 ec=56/49 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:17 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 58 pg[8.a( empty local-lis/les=0/0 n=0 ec=54/43 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:17 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 58 pg[8.9( empty local-lis/les=0/0 n=0 ec=54/43 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:17 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 58 pg[11.e( empty local-lis/les=0/0 n=0 ec=56/49 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:17 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 20 08:59:17 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 20 08:59:17 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]: dispatch
Jan 20 08:59:17 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 20 08:59:17 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 58 pg[8.d( empty local-lis/les=0/0 n=0 ec=54/43 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:17 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 58 pg[8.c( empty local-lis/les=0/0 n=0 ec=54/43 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:17 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 58 pg[11.8( empty local-lis/les=0/0 n=0 ec=56/49 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:17 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 58 pg[8.b( empty local-lis/les=0/0 n=0 ec=54/43 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:17 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 58 pg[8.6( empty local-lis/les=0/0 n=0 ec=54/43 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:17 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 58 pg[8.5( empty local-lis/les=0/0 n=0 ec=54/43 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:17 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 58 pg[11.19( empty local-lis/les=0/0 n=0 ec=56/49 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:17 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 58 pg[8.1f( empty local-lis/les=0/0 n=0 ec=54/43 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:17 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 58 pg[8.1c( empty local-lis/les=0/0 n=0 ec=54/43 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:17 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 58 pg[11.3( empty local-lis/les=0/0 n=0 ec=56/49 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:18 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 5.12 scrub starts
Jan 20 08:59:18 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 5.12 scrub ok
Jan 20 08:59:18 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e59 e59: 3 total, 3 up, 3 in
Jan 20 08:59:18 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 59 pg[8.16( v 44'4 (0'0,44'4] local-lis/les=58/59 n=0 ec=54/43 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=44'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:18 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 59 pg[11.a( empty local-lis/les=58/59 n=0 ec=56/49 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:18 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 59 pg[8.9( v 44'4 (0'0,44'4] local-lis/les=58/59 n=0 ec=54/43 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=44'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:18 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 59 pg[8.d( v 44'4 (0'0,44'4] local-lis/les=58/59 n=0 ec=54/43 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=44'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:18 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 59 pg[11.e( empty local-lis/les=58/59 n=0 ec=56/49 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:18 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 59 pg[11.3( empty local-lis/les=58/59 n=0 ec=56/49 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:18 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 20 08:59:18 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 20 08:59:18 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]': finished
Jan 20 08:59:18 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 20 08:59:18 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 59 pg[8.11( v 44'4 (0'0,44'4] local-lis/les=58/59 n=0 ec=54/43 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=44'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:18 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 59 pg[8.2( v 44'4 (0'0,44'4] local-lis/les=58/59 n=1 ec=54/43 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=44'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:18 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 59 pg[8.f( v 44'4 lc 0'0 (0'0,44'4] local-lis/les=58/59 n=0 ec=54/43 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=44'4 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:18 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 59 pg[8.3( v 44'4 (0'0,44'4] local-lis/les=58/59 n=1 ec=54/43 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=44'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:18 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 59 pg[11.13( empty local-lis/les=58/59 n=0 ec=56/49 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:18 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 59 pg[11.16( empty local-lis/les=58/59 n=0 ec=56/49 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:18 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 59 pg[11.8( empty local-lis/les=58/59 n=0 ec=56/49 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:18 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 59 pg[8.b( v 44'4 (0'0,44'4] local-lis/les=58/59 n=0 ec=54/43 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=44'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:18 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 59 pg[8.1c( v 44'4 (0'0,44'4] local-lis/les=58/59 n=0 ec=54/43 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=44'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:18 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 59 pg[8.6( v 44'4 (0'0,44'4] local-lis/les=58/59 n=0 ec=54/43 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=44'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:18 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 59 pg[11.19( empty local-lis/les=58/59 n=0 ec=56/49 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:18 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 59 pg[8.15( v 44'4 (0'0,44'4] local-lis/les=58/59 n=0 ec=54/43 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=44'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:18 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 59 pg[8.1f( v 44'4 (0'0,44'4] local-lis/les=58/59 n=0 ec=54/43 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=44'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:18 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 59 pg[8.a( v 44'4 (0'0,44'4] local-lis/les=58/59 n=0 ec=54/43 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=44'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:18 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 59 pg[8.5( v 44'4 (0'0,44'4] local-lis/les=58/59 n=0 ec=54/43 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=44'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:18 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 59 pg[11.17( empty local-lis/les=58/59 n=0 ec=56/49 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:18 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 59 pg[10.4( v 48'48 (0'0,48'48] local-lis/les=58/59 n=1 ec=56/47 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:18 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 59 pg[8.c( v 44'4 (0'0,44'4] local-lis/les=58/59 n=0 ec=54/43 lis/c=54/54 les/c/f=55/55/0 sis=58) [2] r=0 lpr=58 pi=[54,58)/1 crt=44'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:18 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 59 pg[10.12( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=56/47 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:18 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 59 pg[10.f( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=56/47 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:18 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 59 pg[10.1e( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=56/47 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:18 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 59 pg[10.11( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=56/47 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:18 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 59 pg[10.1( v 48'48 (0'0,48'48] local-lis/les=58/59 n=1 ec=56/47 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:18 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 59 pg[10.3( v 57'51 lc 48'39 (0'0,57'51] local-lis/les=58/59 n=1 ec=56/47 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=57'51 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:18 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 59 pg[10.10( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=56/47 lis/c=56/56 les/c/f=57/57/0 sis=58) [2] r=0 lpr=58 pi=[56,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:19.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:19 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 5.13 scrub starts
Jan 20 08:59:19 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 5.13 scrub ok
Jan 20 08:59:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:19.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:19 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:19 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:20 np0005588920 ceph-mon[77148]: Reconfiguring mon.compute-0 (monmap changed)...
Jan 20 08:59:20 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 20 08:59:20 np0005588920 ceph-mon[77148]: Reconfiguring daemon mon.compute-0 on compute-0
Jan 20 08:59:20 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:20 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:20 np0005588920 ceph-mon[77148]: Reconfiguring mgr.compute-0.wookjv (monmap changed)...
Jan 20 08:59:20 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-0.wookjv", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Jan 20 08:59:20 np0005588920 ceph-mon[77148]: Reconfiguring daemon mgr.compute-0.wookjv on compute-0
Jan 20 08:59:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:21.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:21 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 4.1c scrub starts
Jan 20 08:59:21 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 4.1c scrub ok
Jan 20 08:59:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:21.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e59 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 08:59:22 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:22 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:22 np0005588920 ceph-mon[77148]: Reconfiguring crash.compute-0 (monmap changed)...
Jan 20 08:59:22 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-0", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Jan 20 08:59:22 np0005588920 ceph-mon[77148]: Reconfiguring daemon crash.compute-0 on compute-0
Jan 20 08:59:22 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:22 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:22 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
Jan 20 08:59:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 08:59:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:23.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 08:59:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:23.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:23 np0005588920 ceph-mon[77148]: Reconfiguring osd.0 (monmap changed)...
Jan 20 08:59:23 np0005588920 ceph-mon[77148]: Reconfiguring daemon osd.0 on compute-0
Jan 20 08:59:23 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:23 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:23 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-1", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Jan 20 08:59:23 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:23 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:23 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
Jan 20 08:59:24 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 7.1d scrub starts
Jan 20 08:59:24 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 7.1d scrub ok
Jan 20 08:59:24 np0005588920 ceph-mon[77148]: Reconfiguring crash.compute-1 (monmap changed)...
Jan 20 08:59:24 np0005588920 ceph-mon[77148]: Reconfiguring daemon crash.compute-1 on compute-1
Jan 20 08:59:24 np0005588920 ceph-mon[77148]: Reconfiguring osd.1 (monmap changed)...
Jan 20 08:59:24 np0005588920 ceph-mon[77148]: Reconfiguring daemon osd.1 on compute-1
Jan 20 08:59:24 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:24 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:24 np0005588920 ceph-mon[77148]: Reconfiguring mon.compute-1 (monmap changed)...
Jan 20 08:59:24 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 20 08:59:24 np0005588920 ceph-mon[77148]: Reconfiguring daemon mon.compute-1 on compute-1
Jan 20 08:59:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 08:59:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:25.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 08:59:25 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 2.15 scrub starts
Jan 20 08:59:25 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 2.15 scrub ok
Jan 20 08:59:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:25.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:25 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]: dispatch
Jan 20 08:59:25 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:25 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:25 np0005588920 ceph-mon[77148]: Reconfiguring mon.compute-2 (monmap changed)...
Jan 20 08:59:25 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 20 08:59:25 np0005588920 ceph-mon[77148]: Reconfiguring daemon mon.compute-2 on compute-2
Jan 20 08:59:25 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e60 e60: 3 total, 3 up, 3 in
Jan 20 08:59:26 np0005588920 podman[85960]: 2026-01-20 13:59:26.558529481 +0000 UTC m=+0.067558557 container create edf4482c5e8bf6ed8e6fe69c84f38245db916625b0acf8d4d24d7a6c0220d8f6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_kapitsa, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 20 08:59:26 np0005588920 systemd[1]: Started libpod-conmon-edf4482c5e8bf6ed8e6fe69c84f38245db916625b0acf8d4d24d7a6c0220d8f6.scope.
Jan 20 08:59:26 np0005588920 podman[85960]: 2026-01-20 13:59:26.531687789 +0000 UTC m=+0.040716925 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:59:26 np0005588920 systemd[1]: Started libcrun container.
Jan 20 08:59:26 np0005588920 podman[85960]: 2026-01-20 13:59:26.662070063 +0000 UTC m=+0.171099209 container init edf4482c5e8bf6ed8e6fe69c84f38245db916625b0acf8d4d24d7a6c0220d8f6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_kapitsa, CEPH_REF=reef, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 20 08:59:26 np0005588920 podman[85960]: 2026-01-20 13:59:26.674667312 +0000 UTC m=+0.183696368 container start edf4482c5e8bf6ed8e6fe69c84f38245db916625b0acf8d4d24d7a6c0220d8f6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_kapitsa, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 20 08:59:26 np0005588920 podman[85960]: 2026-01-20 13:59:26.678444483 +0000 UTC m=+0.187473569 container attach edf4482c5e8bf6ed8e6fe69c84f38245db916625b0acf8d4d24d7a6c0220d8f6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_kapitsa, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS)
Jan 20 08:59:26 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 4.19 scrub starts
Jan 20 08:59:26 np0005588920 affectionate_kapitsa[85976]: 167 167
Jan 20 08:59:26 np0005588920 systemd[1]: libpod-edf4482c5e8bf6ed8e6fe69c84f38245db916625b0acf8d4d24d7a6c0220d8f6.scope: Deactivated successfully.
Jan 20 08:59:26 np0005588920 podman[85960]: 2026-01-20 13:59:26.682965525 +0000 UTC m=+0.191994601 container died edf4482c5e8bf6ed8e6fe69c84f38245db916625b0acf8d4d24d7a6c0220d8f6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_kapitsa, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 20 08:59:26 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 4.19 scrub ok
Jan 20 08:59:26 np0005588920 systemd[1]: var-lib-containers-storage-overlay-8912080ba0d73ecfeec82b5edd7970ea4729f09f36337ff05935b9f5bb39b745-merged.mount: Deactivated successfully.
Jan 20 08:59:26 np0005588920 podman[85960]: 2026-01-20 13:59:26.737069769 +0000 UTC m=+0.246098845 container remove edf4482c5e8bf6ed8e6fe69c84f38245db916625b0acf8d4d24d7a6c0220d8f6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_kapitsa, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 20 08:59:26 np0005588920 systemd[1]: libpod-conmon-edf4482c5e8bf6ed8e6fe69c84f38245db916625b0acf8d4d24d7a6c0220d8f6.scope: Deactivated successfully.
Jan 20 08:59:26 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]': finished
Jan 20 08:59:26 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:26 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:26 np0005588920 ceph-mon[77148]: Reconfiguring mgr.compute-2.gunjko (monmap changed)...
Jan 20 08:59:26 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-2.gunjko", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Jan 20 08:59:26 np0005588920 ceph-mon[77148]: Reconfiguring daemon mgr.compute-2.gunjko on compute-2
Jan 20 08:59:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e60 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 08:59:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:27.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:27 np0005588920 podman[86112]: 2026-01-20 13:59:27.468888807 +0000 UTC m=+0.055878443 container create 65d1dd727ba817b6d3b580d3b14070952a08c5b2831a00d69c04741c75b498cd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sharp_brattain, OSD_FLAVOR=default, io.buildah.version=1.39.3, ceph=True, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 20 08:59:27 np0005588920 systemd[1]: Started libpod-conmon-65d1dd727ba817b6d3b580d3b14070952a08c5b2831a00d69c04741c75b498cd.scope.
Jan 20 08:59:27 np0005588920 podman[86112]: 2026-01-20 13:59:27.448002185 +0000 UTC m=+0.034991831 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:59:27 np0005588920 systemd[1]: Started libcrun container.
Jan 20 08:59:27 np0005588920 podman[86112]: 2026-01-20 13:59:27.574850673 +0000 UTC m=+0.161840329 container init 65d1dd727ba817b6d3b580d3b14070952a08c5b2831a00d69c04741c75b498cd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sharp_brattain, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 20 08:59:27 np0005588920 podman[86112]: 2026-01-20 13:59:27.620539521 +0000 UTC m=+0.207529147 container start 65d1dd727ba817b6d3b580d3b14070952a08c5b2831a00d69c04741c75b498cd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sharp_brattain, io.buildah.version=1.39.3, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 20 08:59:27 np0005588920 podman[86112]: 2026-01-20 13:59:27.625571487 +0000 UTC m=+0.212561113 container attach 65d1dd727ba817b6d3b580d3b14070952a08c5b2831a00d69c04741c75b498cd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sharp_brattain, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 20 08:59:27 np0005588920 sharp_brattain[86127]: 167 167
Jan 20 08:59:27 np0005588920 systemd[1]: libpod-65d1dd727ba817b6d3b580d3b14070952a08c5b2831a00d69c04741c75b498cd.scope: Deactivated successfully.
Jan 20 08:59:27 np0005588920 podman[86112]: 2026-01-20 13:59:27.627457467 +0000 UTC m=+0.214447083 container died 65d1dd727ba817b6d3b580d3b14070952a08c5b2831a00d69c04741c75b498cd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sharp_brattain, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 20 08:59:27 np0005588920 systemd[1]: var-lib-containers-storage-overlay-c2b758ce43feed1c1e5040dd24bc0a15e647bb51e9a73038eb6b0d0bdeff60ff-merged.mount: Deactivated successfully.
Jan 20 08:59:27 np0005588920 podman[86112]: 2026-01-20 13:59:27.672357884 +0000 UTC m=+0.259347520 container remove 65d1dd727ba817b6d3b580d3b14070952a08c5b2831a00d69c04741c75b498cd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sharp_brattain, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2)
Jan 20 08:59:27 np0005588920 systemd[1]: libpod-conmon-65d1dd727ba817b6d3b580d3b14070952a08c5b2831a00d69c04741c75b498cd.scope: Deactivated successfully.
Jan 20 08:59:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 08:59:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:27.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 08:59:28 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]: dispatch
Jan 20 08:59:28 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:28 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:28 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e61 e61: 3 total, 3 up, 3 in
Jan 20 08:59:28 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 61 pg[9.17( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:28 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 61 pg[9.3( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:28 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 61 pg[9.f( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:28 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 61 pg[9.b( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:28 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 61 pg[9.7( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:28 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 61 pg[9.1b( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:28 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 61 pg[9.1f( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:28 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 61 pg[9.13( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:28 np0005588920 podman[86318]: 2026-01-20 13:59:28.544120153 +0000 UTC m=+0.064117824 container exec 6d4eaf8659f13902200468a4ae46f22a0824452b9353ff1491898f5d3b0130c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mon-compute-2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0)
Jan 20 08:59:28 np0005588920 podman[86318]: 2026-01-20 13:59:28.644664065 +0000 UTC m=+0.164661766 container exec_died 6d4eaf8659f13902200468a4ae46f22a0824452b9353ff1491898f5d3b0130c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mon-compute-2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 20 08:59:29 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]': finished
Jan 20 08:59:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e62 e62: 3 total, 3 up, 3 in
Jan 20 08:59:29 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 62 pg[9.17( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[54,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:29 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 62 pg[9.3( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[54,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:29 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 62 pg[9.17( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[54,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 20 08:59:29 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 62 pg[9.3( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[54,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 20 08:59:29 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 62 pg[9.b( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[54,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:29 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 62 pg[9.b( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[54,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 20 08:59:29 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 62 pg[9.7( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[54,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:29 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 62 pg[9.7( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[54,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 20 08:59:29 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 62 pg[9.13( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[54,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:29 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 62 pg[9.1f( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[54,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:29 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 62 pg[9.13( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[54,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 20 08:59:29 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 62 pg[9.1f( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[54,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 20 08:59:29 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 62 pg[9.f( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[54,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:29 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 62 pg[9.f( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[54,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 20 08:59:29 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 62 pg[9.1b( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[54,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:29 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 62 pg[9.1b( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=62) [2]/[0] r=-1 lpr=62 pi=[54,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 20 08:59:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 08:59:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:29.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 08:59:29 np0005588920 podman[86470]: 2026-01-20 13:59:29.388357812 +0000 UTC m=+0.099465894 container exec c2bc03da81fcaa831434668625c7c49511adebaebe9f2026fa9fc7c3d873fb54 (image=quay.io/ceph/haproxy:2.3, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-2-cuokcs)
Jan 20 08:59:29 np0005588920 podman[86470]: 2026-01-20 13:59:29.444867271 +0000 UTC m=+0.155975383 container exec_died c2bc03da81fcaa831434668625c7c49511adebaebe9f2026fa9fc7c3d873fb54 (image=quay.io/ceph/haproxy:2.3, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-2-cuokcs)
Jan 20 08:59:29 np0005588920 podman[86585]: 2026-01-20 13:59:29.684052129 +0000 UTC m=+0.063624361 container exec 4f78941c670aecb446a8bef016d44f9adf56eddc01c85813d46adbff35b13482 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-2-dleeql, summary=Provides keepalived on RHEL 9 for Ceph., architecture=x86_64, release=1793, version=2.2.4, com.redhat.component=keepalived-container, io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=Ceph keepalived, vcs-type=git, name=keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.28.2, vendor=Red Hat, Inc., build-date=2023-02-22T09:23:20, description=keepalived for Ceph, io.openshift.expose-services=, distribution-scope=public)
Jan 20 08:59:29 np0005588920 podman[86585]: 2026-01-20 13:59:29.700295975 +0000 UTC m=+0.079868187 container exec_died 4f78941c670aecb446a8bef016d44f9adf56eddc01c85813d46adbff35b13482 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-2-dleeql, com.redhat.component=keepalived-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, summary=Provides keepalived on RHEL 9 for Ceph., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, name=keepalived, release=1793, build-date=2023-02-22T09:23:20, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=Ceph keepalived, description=keepalived for Ceph, io.buildah.version=1.28.2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., version=2.2.4)
Jan 20 08:59:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:29.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:30 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e63 e63: 3 total, 3 up, 3 in
Jan 20 08:59:30 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]: dispatch
Jan 20 08:59:30 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:30 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:30 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:30 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:30 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:30 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:30 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 5.0 scrub starts
Jan 20 08:59:30 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 5.0 scrub ok
Jan 20 08:59:31 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]': finished
Jan 20 08:59:31 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 08:59:31 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:31 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 08:59:31 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e64 e64: 3 total, 3 up, 3 in
Jan 20 08:59:31 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 64 pg[9.17( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=5 ec=54/45 lis/c=62/54 les/c/f=63/55/0 sis=64) [2] r=0 lpr=64 pi=[54,64)/1 luod=0'0 crt=51'1000 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:31 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 64 pg[9.17( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=5 ec=54/45 lis/c=62/54 les/c/f=63/55/0 sis=64) [2] r=0 lpr=64 pi=[54,64)/1 crt=51'1000 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:31 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 64 pg[9.3( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=6 ec=54/45 lis/c=62/54 les/c/f=63/55/0 sis=64) [2] r=0 lpr=64 pi=[54,64)/1 luod=0'0 crt=51'1000 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:31 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 64 pg[9.b( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=6 ec=54/45 lis/c=62/54 les/c/f=63/55/0 sis=64) [2] r=0 lpr=64 pi=[54,64)/1 luod=0'0 crt=51'1000 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:31 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 64 pg[9.f( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=6 ec=54/45 lis/c=62/54 les/c/f=63/55/0 sis=64) [2] r=0 lpr=64 pi=[54,64)/1 luod=0'0 crt=51'1000 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:31 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 64 pg[9.f( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=6 ec=54/45 lis/c=62/54 les/c/f=63/55/0 sis=64) [2] r=0 lpr=64 pi=[54,64)/1 crt=51'1000 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:31 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 64 pg[9.3( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=6 ec=54/45 lis/c=62/54 les/c/f=63/55/0 sis=64) [2] r=0 lpr=64 pi=[54,64)/1 crt=51'1000 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:31 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 64 pg[9.b( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=6 ec=54/45 lis/c=62/54 les/c/f=63/55/0 sis=64) [2] r=0 lpr=64 pi=[54,64)/1 crt=51'1000 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:31 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 64 pg[9.7( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=6 ec=54/45 lis/c=62/54 les/c/f=63/55/0 sis=64) [2] r=0 lpr=64 pi=[54,64)/1 luod=0'0 crt=51'1000 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:31 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 64 pg[9.1b( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=5 ec=54/45 lis/c=62/54 les/c/f=63/55/0 sis=64) [2] r=0 lpr=64 pi=[54,64)/1 luod=0'0 crt=51'1000 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:31 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 64 pg[9.1b( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=5 ec=54/45 lis/c=62/54 les/c/f=63/55/0 sis=64) [2] r=0 lpr=64 pi=[54,64)/1 crt=51'1000 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:31 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 64 pg[9.7( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=6 ec=54/45 lis/c=62/54 les/c/f=63/55/0 sis=64) [2] r=0 lpr=64 pi=[54,64)/1 crt=51'1000 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:31 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 64 pg[9.13( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=5 ec=54/45 lis/c=62/54 les/c/f=63/55/0 sis=64) [2] r=0 lpr=64 pi=[54,64)/1 luod=0'0 crt=51'1000 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:31 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 64 pg[9.1f( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=5 ec=54/45 lis/c=62/54 les/c/f=63/55/0 sis=64) [2] r=0 lpr=64 pi=[54,64)/1 luod=0'0 crt=51'1000 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:31 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 64 pg[9.13( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=5 ec=54/45 lis/c=62/54 les/c/f=63/55/0 sis=64) [2] r=0 lpr=64 pi=[54,64)/1 crt=51'1000 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:31 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 64 pg[9.1f( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=5 ec=54/45 lis/c=62/54 les/c/f=63/55/0 sis=64) [2] r=0 lpr=64 pi=[54,64)/1 crt=51'1000 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:31.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 08:59:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:31.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 08:59:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e64 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 08:59:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e65 e65: 3 total, 3 up, 3 in
Jan 20 08:59:32 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 65 pg[9.15( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=65) [2] r=0 lpr=65 pi=[54,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:32 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 65 pg[9.d( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=65) [2] r=0 lpr=65 pi=[54,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:32 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 65 pg[9.1f( v 51'1000 (0'0,51'1000] local-lis/les=64/65 n=5 ec=54/45 lis/c=62/54 les/c/f=63/55/0 sis=64) [2] r=0 lpr=64 pi=[54,64)/1 crt=51'1000 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:32 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 65 pg[9.5( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=65) [2] r=0 lpr=65 pi=[54,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:32 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 65 pg[9.1d( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=65) [2] r=0 lpr=65 pi=[54,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:32 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 65 pg[9.13( v 51'1000 (0'0,51'1000] local-lis/les=64/65 n=5 ec=54/45 lis/c=62/54 les/c/f=63/55/0 sis=64) [2] r=0 lpr=64 pi=[54,64)/1 crt=51'1000 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:32 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 65 pg[9.1b( v 51'1000 (0'0,51'1000] local-lis/les=64/65 n=5 ec=54/45 lis/c=62/54 les/c/f=63/55/0 sis=64) [2] r=0 lpr=64 pi=[54,64)/1 crt=51'1000 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:32 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 65 pg[9.7( v 51'1000 (0'0,51'1000] local-lis/les=64/65 n=6 ec=54/45 lis/c=62/54 les/c/f=63/55/0 sis=64) [2] r=0 lpr=64 pi=[54,64)/1 crt=51'1000 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:32 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 65 pg[9.f( v 51'1000 (0'0,51'1000] local-lis/les=64/65 n=6 ec=54/45 lis/c=62/54 les/c/f=63/55/0 sis=64) [2] r=0 lpr=64 pi=[54,64)/1 crt=51'1000 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:32 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 65 pg[9.3( v 51'1000 (0'0,51'1000] local-lis/les=64/65 n=6 ec=54/45 lis/c=62/54 les/c/f=63/55/0 sis=64) [2] r=0 lpr=64 pi=[54,64)/1 crt=51'1000 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:32 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 65 pg[9.b( v 51'1000 (0'0,51'1000] local-lis/les=64/65 n=6 ec=54/45 lis/c=62/54 les/c/f=63/55/0 sis=64) [2] r=0 lpr=64 pi=[54,64)/1 crt=51'1000 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:32 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 65 pg[9.17( v 51'1000 (0'0,51'1000] local-lis/les=64/65 n=5 ec=54/45 lis/c=62/54 les/c/f=63/55/0 sis=64) [2] r=0 lpr=64 pi=[54,64)/1 crt=51'1000 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:32 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]: dispatch
Jan 20 08:59:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e66 e66: 3 total, 3 up, 3 in
Jan 20 08:59:32 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 66 pg[9.d( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[54,66)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:32 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 66 pg[9.d( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[54,66)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 20 08:59:32 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 66 pg[9.5( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[54,66)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:32 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 66 pg[9.5( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[54,66)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 20 08:59:32 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 66 pg[9.1d( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[54,66)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:32 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 66 pg[9.1d( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[54,66)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 20 08:59:32 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 66 pg[9.15( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[54,66)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:32 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 66 pg[9.15( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=66) [2]/[0] r=-1 lpr=66 pi=[54,66)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 20 08:59:33 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]': finished
Jan 20 08:59:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:33.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:33 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e67 e67: 3 total, 3 up, 3 in
Jan 20 08:59:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 08:59:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:33.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 08:59:34 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]: dispatch
Jan 20 08:59:34 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]': finished
Jan 20 08:59:34 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 4.14 deep-scrub starts
Jan 20 08:59:34 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 4.14 deep-scrub ok
Jan 20 08:59:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 08:59:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:35.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 08:59:35 np0005588920 systemd[1]: session-19.scope: Deactivated successfully.
Jan 20 08:59:35 np0005588920 systemd[1]: session-19.scope: Consumed 8.789s CPU time.
Jan 20 08:59:35 np0005588920 systemd-logind[783]: Session 19 logged out. Waiting for processes to exit.
Jan 20 08:59:35 np0005588920 systemd-logind[783]: Removed session 19.
Jan 20 08:59:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:35.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:36 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e68 e68: 3 total, 3 up, 3 in
Jan 20 08:59:36 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 68 pg[9.15( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=5 ec=54/45 lis/c=66/54 les/c/f=67/55/0 sis=68) [2] r=0 lpr=68 pi=[54,68)/1 luod=0'0 crt=51'1000 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:36 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 68 pg[9.15( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=5 ec=54/45 lis/c=66/54 les/c/f=67/55/0 sis=68) [2] r=0 lpr=68 pi=[54,68)/1 crt=51'1000 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:36 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 68 pg[9.d( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=6 ec=54/45 lis/c=66/54 les/c/f=67/55/0 sis=68) [2] r=0 lpr=68 pi=[54,68)/1 luod=0'0 crt=51'1000 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:36 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 68 pg[9.d( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=6 ec=54/45 lis/c=66/54 les/c/f=67/55/0 sis=68) [2] r=0 lpr=68 pi=[54,68)/1 crt=51'1000 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:36 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 68 pg[9.1d( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=5 ec=54/45 lis/c=66/54 les/c/f=67/55/0 sis=68) [2] r=0 lpr=68 pi=[54,68)/1 luod=0'0 crt=51'1000 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:36 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 68 pg[9.1d( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=5 ec=54/45 lis/c=66/54 les/c/f=67/55/0 sis=68) [2] r=0 lpr=68 pi=[54,68)/1 crt=51'1000 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:36 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 68 pg[9.5( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=6 ec=54/45 lis/c=66/54 les/c/f=67/55/0 sis=68) [2] r=0 lpr=68 pi=[54,68)/1 luod=0'0 crt=51'1000 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:36 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 68 pg[9.5( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=6 ec=54/45 lis/c=66/54 les/c/f=67/55/0 sis=68) [2] r=0 lpr=68 pi=[54,68)/1 crt=51'1000 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e68 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 08:59:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e69 e69: 3 total, 3 up, 3 in
Jan 20 08:59:37 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 69 pg[9.d( v 51'1000 (0'0,51'1000] local-lis/les=68/69 n=6 ec=54/45 lis/c=66/54 les/c/f=67/55/0 sis=68) [2] r=0 lpr=68 pi=[54,68)/1 crt=51'1000 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:37 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 69 pg[9.5( v 51'1000 (0'0,51'1000] local-lis/les=68/69 n=6 ec=54/45 lis/c=66/54 les/c/f=67/55/0 sis=68) [2] r=0 lpr=68 pi=[54,68)/1 crt=51'1000 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:37 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 69 pg[9.1d( v 51'1000 (0'0,51'1000] local-lis/les=68/69 n=5 ec=54/45 lis/c=66/54 les/c/f=67/55/0 sis=68) [2] r=0 lpr=68 pi=[54,68)/1 crt=51'1000 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:37 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 69 pg[9.15( v 51'1000 (0'0,51'1000] local-lis/les=68/69 n=5 ec=54/45 lis/c=66/54 les/c/f=67/55/0 sis=68) [2] r=0 lpr=68 pi=[54,68)/1 crt=51'1000 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 08:59:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:37.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 08:59:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 08:59:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:37.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 08:59:38 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e70 e70: 3 total, 3 up, 3 in
Jan 20 08:59:38 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:38 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:39 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e71 e71: 3 total, 3 up, 3 in
Jan 20 08:59:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:39.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:39.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:40 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 4.1f deep-scrub starts
Jan 20 08:59:40 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 4.1f deep-scrub ok
Jan 20 08:59:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:41.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 08:59:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:41.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 08:59:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e72 e72: 3 total, 3 up, 3 in
Jan 20 08:59:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e72 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 08:59:42 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]: dispatch
Jan 20 08:59:43 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]': finished
Jan 20 08:59:43 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e73 e73: 3 total, 3 up, 3 in
Jan 20 08:59:43 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 73 pg[9.8( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=73) [2] r=0 lpr=73 pi=[54,73)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:43 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 73 pg[9.18( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=73) [2] r=0 lpr=73 pi=[54,73)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:43.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:43.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:44 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]: dispatch
Jan 20 08:59:44 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]': finished
Jan 20 08:59:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e74 e74: 3 total, 3 up, 3 in
Jan 20 08:59:44 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 74 pg[9.8( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=74) [2]/[0] r=-1 lpr=74 pi=[54,74)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:44 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 74 pg[9.8( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=74) [2]/[0] r=-1 lpr=74 pi=[54,74)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 20 08:59:44 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 74 pg[9.18( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=74) [2]/[0] r=-1 lpr=74 pi=[54,74)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:44 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 74 pg[9.18( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=74) [2]/[0] r=-1 lpr=74 pi=[54,74)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 20 08:59:45 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e75 e75: 3 total, 3 up, 3 in
Jan 20 08:59:45 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]: dispatch
Jan 20 08:59:45 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 75 pg[9.9( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=75) [2] r=0 lpr=75 pi=[54,75)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:45 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 75 pg[9.19( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=75) [2] r=0 lpr=75 pi=[54,75)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:45.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:45.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:46 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]': finished
Jan 20 08:59:46 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e76 e76: 3 total, 3 up, 3 in
Jan 20 08:59:46 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 76 pg[9.9( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=76) [2]/[0] r=-1 lpr=76 pi=[54,76)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:46 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 76 pg[9.9( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=76) [2]/[0] r=-1 lpr=76 pi=[54,76)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 20 08:59:46 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 76 pg[9.19( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=76) [2]/[0] r=-1 lpr=76 pi=[54,76)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:46 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 76 pg[9.18( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=5 ec=54/45 lis/c=74/54 les/c/f=75/55/0 sis=76) [2] r=0 lpr=76 pi=[54,76)/1 luod=0'0 crt=51'1000 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:46 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 76 pg[9.19( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=76) [2]/[0] r=-1 lpr=76 pi=[54,76)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 20 08:59:46 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 76 pg[9.18( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=5 ec=54/45 lis/c=74/54 les/c/f=75/55/0 sis=76) [2] r=0 lpr=76 pi=[54,76)/1 crt=51'1000 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:46 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 76 pg[9.8( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=6 ec=54/45 lis/c=74/54 les/c/f=75/55/0 sis=76) [2] r=0 lpr=76 pi=[54,76)/1 luod=0'0 crt=51'1000 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:46 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 76 pg[9.8( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=6 ec=54/45 lis/c=74/54 les/c/f=75/55/0 sis=76) [2] r=0 lpr=76 pi=[54,76)/1 crt=51'1000 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e76 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 08:59:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e77 e77: 3 total, 3 up, 3 in
Jan 20 08:59:47 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]: dispatch
Jan 20 08:59:47 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 77 pg[9.18( v 51'1000 (0'0,51'1000] local-lis/les=76/77 n=5 ec=54/45 lis/c=74/54 les/c/f=75/55/0 sis=76) [2] r=0 lpr=76 pi=[54,76)/1 crt=51'1000 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:47 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 77 pg[9.8( v 51'1000 (0'0,51'1000] local-lis/les=76/77 n=6 ec=54/45 lis/c=74/54 les/c/f=75/55/0 sis=76) [2] r=0 lpr=76 pi=[54,76)/1 crt=51'1000 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 08:59:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:47.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 08:59:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e78 e78: 3 total, 3 up, 3 in
Jan 20 08:59:47 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 78 pg[9.9( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=6 ec=54/45 lis/c=76/54 les/c/f=77/55/0 sis=78) [2] r=0 lpr=78 pi=[54,78)/1 luod=0'0 crt=51'1000 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:47 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 78 pg[9.19( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=5 ec=54/45 lis/c=76/54 les/c/f=77/55/0 sis=78) [2] r=0 lpr=78 pi=[54,78)/1 luod=0'0 crt=51'1000 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:47 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 78 pg[9.9( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=6 ec=54/45 lis/c=76/54 les/c/f=77/55/0 sis=78) [2] r=0 lpr=78 pi=[54,78)/1 crt=51'1000 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:47 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 78 pg[9.19( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=5 ec=54/45 lis/c=76/54 les/c/f=77/55/0 sis=78) [2] r=0 lpr=78 pi=[54,78)/1 crt=51'1000 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 08:59:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:47.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 08:59:48 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]': finished
Jan 20 08:59:48 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e79 e79: 3 total, 3 up, 3 in
Jan 20 08:59:48 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 79 pg[9.9( v 51'1000 (0'0,51'1000] local-lis/les=78/79 n=6 ec=54/45 lis/c=76/54 les/c/f=77/55/0 sis=78) [2] r=0 lpr=78 pi=[54,78)/1 crt=51'1000 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:48 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 79 pg[9.19( v 51'1000 (0'0,51'1000] local-lis/les=78/79 n=5 ec=54/45 lis/c=76/54 les/c/f=77/55/0 sis=78) [2] r=0 lpr=78 pi=[54,78)/1 crt=51'1000 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:49.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:49.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:49 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e80 e80: 3 total, 3 up, 3 in
Jan 20 08:59:50 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e81 e81: 3 total, 3 up, 3 in
Jan 20 08:59:51 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 6.1c scrub starts
Jan 20 08:59:51 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 6.1c scrub ok
Jan 20 08:59:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:51.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 08:59:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:51.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 08:59:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e81 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 08:59:52 np0005588920 systemd-logind[783]: New session 33 of user zuul.
Jan 20 08:59:52 np0005588920 systemd[1]: Started Session 33 of User zuul.
Jan 20 08:59:52 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 6.1e scrub starts
Jan 20 08:59:52 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 6.1e scrub ok
Jan 20 08:59:53 np0005588920 python3.9[86885]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 08:59:53 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 6.17 scrub starts
Jan 20 08:59:53 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 6.17 scrub ok
Jan 20 08:59:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:53.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:53.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:54 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 6.12 scrub starts
Jan 20 08:59:54 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 6.12 scrub ok
Jan 20 08:59:55 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 4.15 scrub starts
Jan 20 08:59:55 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 4.15 scrub ok
Jan 20 08:59:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:55.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:55.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:56 np0005588920 python3.9[87100]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:59:56 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 7.11 scrub starts
Jan 20 08:59:56 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 7.11 scrub ok
Jan 20 08:59:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e81 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 08:59:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:57.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:57.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:58 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]: dispatch
Jan 20 08:59:58 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e82 e82: 3 total, 3 up, 3 in
Jan 20 08:59:59 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]': finished
Jan 20 08:59:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:59.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 08:59:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 08:59:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:59.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:00:00 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]: dispatch
Jan 20 09:00:00 np0005588920 ceph-mon[77148]: overall HEALTH_OK
Jan 20 09:00:00 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e83 e83: 3 total, 3 up, 3 in
Jan 20 09:00:01 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]': finished
Jan 20 09:00:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:01.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:01.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:00:02 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]: dispatch
Jan 20 09:00:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e84 e84: 3 total, 3 up, 3 in
Jan 20 09:00:02 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 84 pg[9.d( v 51'1000 (0'0,51'1000] local-lis/les=68/69 n=6 ec=54/45 lis/c=68/68 les/c/f=69/69/0 sis=84 pruub=14.937047005s) [1] r=-1 lpr=84 pi=[68,84)/1 crt=51'1000 mlcod 0'0 active pruub 113.611114502s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:02 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 84 pg[9.d( v 51'1000 (0'0,51'1000] local-lis/les=68/69 n=6 ec=54/45 lis/c=68/68 les/c/f=69/69/0 sis=84 pruub=14.936979294s) [1] r=-1 lpr=84 pi=[68,84)/1 crt=51'1000 mlcod 0'0 unknown NOTIFY pruub 113.611114502s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 09:00:02 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 84 pg[9.1d( v 51'1000 (0'0,51'1000] local-lis/les=68/69 n=5 ec=54/45 lis/c=68/68 les/c/f=69/69/0 sis=84 pruub=14.938656807s) [1] r=-1 lpr=84 pi=[68,84)/1 crt=51'1000 mlcod 0'0 active pruub 113.613380432s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:02 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 84 pg[9.1d( v 51'1000 (0'0,51'1000] local-lis/les=68/69 n=5 ec=54/45 lis/c=68/68 les/c/f=69/69/0 sis=84 pruub=14.938628197s) [1] r=-1 lpr=84 pi=[68,84)/1 crt=51'1000 mlcod 0'0 unknown NOTIFY pruub 113.613380432s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 09:00:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e85 e85: 3 total, 3 up, 3 in
Jan 20 09:00:02 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 85 pg[9.1d( v 51'1000 (0'0,51'1000] local-lis/les=68/69 n=5 ec=54/45 lis/c=68/68 les/c/f=69/69/0 sis=85) [1]/[2] r=0 lpr=85 pi=[68,85)/1 crt=51'1000 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:02 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 85 pg[9.1d( v 51'1000 (0'0,51'1000] local-lis/les=68/69 n=5 ec=54/45 lis/c=68/68 les/c/f=69/69/0 sis=85) [1]/[2] r=0 lpr=85 pi=[68,85)/1 crt=51'1000 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 20 09:00:02 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 85 pg[9.d( v 51'1000 (0'0,51'1000] local-lis/les=68/69 n=6 ec=54/45 lis/c=68/68 les/c/f=69/69/0 sis=85) [1]/[2] r=0 lpr=85 pi=[68,85)/1 crt=51'1000 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:02 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 85 pg[9.d( v 51'1000 (0'0,51'1000] local-lis/les=68/69 n=6 ec=54/45 lis/c=68/68 les/c/f=69/69/0 sis=85) [1]/[2] r=0 lpr=85 pi=[68,85)/1 crt=51'1000 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 20 09:00:03 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]': finished
Jan 20 09:00:03 np0005588920 systemd[1]: session-33.scope: Deactivated successfully.
Jan 20 09:00:03 np0005588920 systemd[1]: session-33.scope: Consumed 8.400s CPU time.
Jan 20 09:00:03 np0005588920 systemd-logind[783]: Session 33 logged out. Waiting for processes to exit.
Jan 20 09:00:03 np0005588920 systemd-logind[783]: Removed session 33.
Jan 20 09:00:03 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 7.16 scrub starts
Jan 20 09:00:03 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 7.16 scrub ok
Jan 20 09:00:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:03.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:03 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e86 e86: 3 total, 3 up, 3 in
Jan 20 09:00:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:00:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:03.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:00:03 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 86 pg[9.1d( v 51'1000 (0'0,51'1000] local-lis/les=85/86 n=5 ec=54/45 lis/c=68/68 les/c/f=69/69/0 sis=85) [1]/[2] async=[1] r=0 lpr=85 pi=[68,85)/1 crt=51'1000 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 09:00:03 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 86 pg[9.d( v 51'1000 (0'0,51'1000] local-lis/les=85/86 n=6 ec=54/45 lis/c=68/68 les/c/f=69/69/0 sis=85) [1]/[2] async=[1] r=0 lpr=85 pi=[68,85)/1 crt=51'1000 mlcod 0'0 active+remapped mbc={255={(0+1)=8}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 09:00:04 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]: dispatch
Jan 20 09:00:04 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]': finished
Jan 20 09:00:04 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 7.1f scrub starts
Jan 20 09:00:04 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 7.1f scrub ok
Jan 20 09:00:04 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e87 e87: 3 total, 3 up, 3 in
Jan 20 09:00:04 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 87 pg[9.d( v 51'1000 (0'0,51'1000] local-lis/les=85/86 n=6 ec=54/45 lis/c=85/68 les/c/f=86/69/0 sis=87 pruub=15.004650116s) [1] async=[1] r=-1 lpr=87 pi=[68,87)/1 crt=51'1000 mlcod 51'1000 active pruub 116.468246460s@ mbc={255={}}] start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:04 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 87 pg[9.d( v 51'1000 (0'0,51'1000] local-lis/les=85/86 n=6 ec=54/45 lis/c=85/68 les/c/f=86/69/0 sis=87 pruub=15.004433632s) [1] r=-1 lpr=87 pi=[68,87)/1 crt=51'1000 mlcod 0'0 unknown NOTIFY pruub 116.468246460s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 09:00:04 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 87 pg[9.1d( v 51'1000 (0'0,51'1000] local-lis/les=85/86 n=5 ec=54/45 lis/c=85/68 les/c/f=86/69/0 sis=87 pruub=14.999007225s) [1] async=[1] r=-1 lpr=87 pi=[68,87)/1 crt=51'1000 mlcod 51'1000 active pruub 116.463714600s@ mbc={255={}}] start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:04 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 87 pg[9.1d( v 51'1000 (0'0,51'1000] local-lis/les=85/86 n=5 ec=54/45 lis/c=85/68 les/c/f=86/69/0 sis=87 pruub=14.998891830s) [1] r=-1 lpr=87 pi=[68,87)/1 crt=51'1000 mlcod 0'0 unknown NOTIFY pruub 116.463714600s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 09:00:05 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 3.15 scrub starts
Jan 20 09:00:05 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 3.15 scrub ok
Jan 20 09:00:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:05.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:05.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:05 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e88 e88: 3 total, 3 up, 3 in
Jan 20 09:00:05 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 88 pg[9.1f( v 51'1000 (0'0,51'1000] local-lis/les=64/65 n=5 ec=54/45 lis/c=64/64 les/c/f=65/65/0 sis=88 pruub=14.191205025s) [1] r=-1 lpr=88 pi=[64,88)/1 crt=51'1000 mlcod 0'0 active pruub 116.665122986s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:05 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 88 pg[9.1f( v 51'1000 (0'0,51'1000] local-lis/les=64/65 n=5 ec=54/45 lis/c=64/64 les/c/f=65/65/0 sis=88 pruub=14.191046715s) [1] r=-1 lpr=88 pi=[64,88)/1 crt=51'1000 mlcod 0'0 unknown NOTIFY pruub 116.665122986s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 09:00:05 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 88 pg[9.f( v 51'1000 (0'0,51'1000] local-lis/les=64/65 n=6 ec=54/45 lis/c=64/64 les/c/f=65/65/0 sis=88 pruub=14.192161560s) [1] r=-1 lpr=88 pi=[64,88)/1 crt=51'1000 mlcod 0'0 active pruub 116.667434692s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:05 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 88 pg[9.f( v 51'1000 (0'0,51'1000] local-lis/les=64/65 n=6 ec=54/45 lis/c=64/64 les/c/f=65/65/0 sis=88 pruub=14.192122459s) [1] r=-1 lpr=88 pi=[64,88)/1 crt=51'1000 mlcod 0'0 unknown NOTIFY pruub 116.667434692s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 09:00:06 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]: dispatch
Jan 20 09:00:06 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]': finished
Jan 20 09:00:06 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 2.12 scrub starts
Jan 20 09:00:06 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 2.12 scrub ok
Jan 20 09:00:06 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e89 e89: 3 total, 3 up, 3 in
Jan 20 09:00:06 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 89 pg[9.1f( v 51'1000 (0'0,51'1000] local-lis/les=64/65 n=5 ec=54/45 lis/c=64/64 les/c/f=65/65/0 sis=89) [1]/[2] r=0 lpr=89 pi=[64,89)/1 crt=51'1000 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:06 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 89 pg[9.1f( v 51'1000 (0'0,51'1000] local-lis/les=64/65 n=5 ec=54/45 lis/c=64/64 les/c/f=65/65/0 sis=89) [1]/[2] r=0 lpr=89 pi=[64,89)/1 crt=51'1000 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 20 09:00:06 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 89 pg[9.f( v 51'1000 (0'0,51'1000] local-lis/les=64/65 n=6 ec=54/45 lis/c=64/64 les/c/f=65/65/0 sis=89) [1]/[2] r=0 lpr=89 pi=[64,89)/1 crt=51'1000 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:06 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 89 pg[9.f( v 51'1000 (0'0,51'1000] local-lis/les=64/65 n=6 ec=54/45 lis/c=64/64 les/c/f=65/65/0 sis=89) [1]/[2] r=0 lpr=89 pi=[64,89)/1 crt=51'1000 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 20 09:00:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:00:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:07.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:00:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:07.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:00:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e90 e90: 3 total, 3 up, 3 in
Jan 20 09:00:08 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 90 pg[9.1f( v 51'1000 (0'0,51'1000] local-lis/les=89/90 n=5 ec=54/45 lis/c=64/64 les/c/f=65/65/0 sis=89) [1]/[2] async=[1] r=0 lpr=89 pi=[64,89)/1 crt=51'1000 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 09:00:08 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 90 pg[9.f( v 51'1000 (0'0,51'1000] local-lis/les=89/90 n=6 ec=54/45 lis/c=64/64 les/c/f=65/65/0 sis=89) [1]/[2] async=[1] r=0 lpr=89 pi=[64,89)/1 crt=51'1000 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 09:00:08 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 3.e scrub starts
Jan 20 09:00:08 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 3.e scrub ok
Jan 20 09:00:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e91 e91: 3 total, 3 up, 3 in
Jan 20 09:00:09 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 91 pg[9.1f( v 51'1000 (0'0,51'1000] local-lis/les=89/90 n=5 ec=54/45 lis/c=89/64 les/c/f=90/65/0 sis=91 pruub=15.182332039s) [1] async=[1] r=-1 lpr=91 pi=[64,91)/1 crt=51'1000 mlcod 51'1000 active pruub 120.972137451s@ mbc={255={}}] start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:09 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 91 pg[9.1f( v 51'1000 (0'0,51'1000] local-lis/les=89/90 n=5 ec=54/45 lis/c=89/64 les/c/f=90/65/0 sis=91 pruub=15.182240486s) [1] r=-1 lpr=91 pi=[64,91)/1 crt=51'1000 mlcod 0'0 unknown NOTIFY pruub 120.972137451s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 09:00:09 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 91 pg[9.f( v 51'1000 (0'0,51'1000] local-lis/les=89/90 n=6 ec=54/45 lis/c=89/64 les/c/f=90/65/0 sis=91 pruub=15.185222626s) [1] async=[1] r=-1 lpr=91 pi=[64,91)/1 crt=51'1000 mlcod 51'1000 active pruub 120.975639343s@ mbc={255={}}] start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:09 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 91 pg[9.f( v 51'1000 (0'0,51'1000] local-lis/les=89/90 n=6 ec=54/45 lis/c=89/64 les/c/f=90/65/0 sis=91 pruub=15.185138702s) [1] r=-1 lpr=91 pi=[64,91)/1 crt=51'1000 mlcod 0'0 unknown NOTIFY pruub 120.975639343s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 09:00:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:00:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:09.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:00:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:09.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:10 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e92 e92: 3 total, 3 up, 3 in
Jan 20 09:00:10 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 3.11 scrub starts
Jan 20 09:00:10 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 3.11 scrub ok
Jan 20 09:00:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:11.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:11.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:11 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e93 e93: 3 total, 3 up, 3 in
Jan 20 09:00:12 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]: dispatch
Jan 20 09:00:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:00:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e94 e94: 3 total, 3 up, 3 in
Jan 20 09:00:12 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 4.9 scrub starts
Jan 20 09:00:12 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 4.9 scrub ok
Jan 20 09:00:13 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]': finished
Jan 20 09:00:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e95 e95: 3 total, 3 up, 3 in
Jan 20 09:00:13 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 2.18 scrub starts
Jan 20 09:00:13 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 2.18 scrub ok
Jan 20 09:00:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:13.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:13.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:14 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]: dispatch
Jan 20 09:00:14 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]': finished
Jan 20 09:00:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e96 e96: 3 total, 3 up, 3 in
Jan 20 09:00:15 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e97 e97: 3 total, 3 up, 3 in
Jan 20 09:00:15 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]: dispatch
Jan 20 09:00:15 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 4.8 scrub starts
Jan 20 09:00:15 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 4.8 scrub ok
Jan 20 09:00:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:15.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:15.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:16 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e98 e98: 3 total, 3 up, 3 in
Jan 20 09:00:16 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]': finished
Jan 20 09:00:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e98 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:00:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e99 e99: 3 total, 3 up, 3 in
Jan 20 09:00:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:17.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e100 e100: 3 total, 3 up, 3 in
Jan 20 09:00:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:17.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:18 np0005588920 systemd-logind[783]: New session 34 of user zuul.
Jan 20 09:00:18 np0005588920 systemd[1]: Started Session 34 of User zuul.
Jan 20 09:00:18 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 5.4 scrub starts
Jan 20 09:00:18 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 5.4 scrub ok
Jan 20 09:00:18 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e101 e101: 3 total, 3 up, 3 in
Jan 20 09:00:19 np0005588920 python3.9[87374]: ansible-ansible.legacy.ping Invoked with data=pong
Jan 20 09:00:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:19.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:00:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:19.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:00:20 np0005588920 python3.9[87548]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 09:00:20 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 7.5 scrub starts
Jan 20 09:00:20 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 7.5 scrub ok
Jan 20 09:00:21 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e102 e102: 3 total, 3 up, 3 in
Jan 20 09:00:21 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]: dispatch
Jan 20 09:00:21 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 4.1 deep-scrub starts
Jan 20 09:00:21 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 4.1 deep-scrub ok
Jan 20 09:00:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:21.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:21 np0005588920 python3.9[87705]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:00:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:21.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e102 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:00:22 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]': finished
Jan 20 09:00:22 np0005588920 python3.9[87858]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:00:23 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e103 e103: 3 total, 3 up, 3 in
Jan 20 09:00:23 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]: dispatch
Jan 20 09:00:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:23.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:23 np0005588920 python3.9[88013]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:00:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:23.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:24 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]': finished
Jan 20 09:00:24 np0005588920 python3.9[88165]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:00:25 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 3.9 scrub starts
Jan 20 09:00:25 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e104 e104: 3 total, 3 up, 3 in
Jan 20 09:00:25 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 104 pg[9.15( v 51'1000 (0'0,51'1000] local-lis/les=68/69 n=5 ec=54/45 lis/c=68/68 les/c/f=69/69/0 sis=104 pruub=15.311692238s) [1] r=-1 lpr=104 pi=[68,104)/1 crt=51'1000 mlcod 0'0 active pruub 137.613800049s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:25 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 104 pg[9.15( v 51'1000 (0'0,51'1000] local-lis/les=68/69 n=5 ec=54/45 lis/c=68/68 les/c/f=69/69/0 sis=104 pruub=15.311620712s) [1] r=-1 lpr=104 pi=[68,104)/1 crt=51'1000 mlcod 0'0 unknown NOTIFY pruub 137.613800049s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 09:00:25 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 3.9 scrub ok
Jan 20 09:00:25 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]: dispatch
Jan 20 09:00:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:25.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:25 np0005588920 python3.9[88316]: ansible-ansible.builtin.service_facts Invoked
Jan 20 09:00:25 np0005588920 network[88333]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 20 09:00:25 np0005588920 network[88334]: 'network-scripts' will be removed from distribution in near future.
Jan 20 09:00:25 np0005588920 network[88335]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 20 09:00:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:00:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:25.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:00:26 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e105 e105: 3 total, 3 up, 3 in
Jan 20 09:00:26 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 105 pg[9.15( v 51'1000 (0'0,51'1000] local-lis/les=68/69 n=5 ec=54/45 lis/c=68/68 les/c/f=69/69/0 sis=105) [1]/[2] r=0 lpr=105 pi=[68,105)/1 crt=51'1000 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:26 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 105 pg[9.15( v 51'1000 (0'0,51'1000] local-lis/les=68/69 n=5 ec=54/45 lis/c=68/68 les/c/f=69/69/0 sis=105) [1]/[2] r=0 lpr=105 pi=[68,105)/1 crt=51'1000 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 20 09:00:26 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]': finished
Jan 20 09:00:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e105 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:00:27 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]: dispatch
Jan 20 09:00:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e106 e106: 3 total, 3 up, 3 in
Jan 20 09:00:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:27.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:27.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:28 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 106 pg[9.16( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=70/70 les/c/f=71/71/0 sis=106) [2] r=0 lpr=106 pi=[70,106)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 09:00:28 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 106 pg[9.15( v 51'1000 (0'0,51'1000] local-lis/les=105/106 n=5 ec=54/45 lis/c=68/68 les/c/f=69/69/0 sis=105) [1]/[2] async=[1] r=0 lpr=105 pi=[68,105)/1 crt=51'1000 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 09:00:28 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]': finished
Jan 20 09:00:28 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e107 e107: 3 total, 3 up, 3 in
Jan 20 09:00:28 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 107 pg[9.15( v 51'1000 (0'0,51'1000] local-lis/les=105/106 n=5 ec=54/45 lis/c=105/68 les/c/f=106/69/0 sis=107 pruub=15.328523636s) [1] async=[1] r=-1 lpr=107 pi=[68,107)/1 crt=51'1000 mlcod 51'1000 active pruub 140.721298218s@ mbc={255={}}] start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:28 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 107 pg[9.16( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=70/70 les/c/f=71/71/0 sis=107) [2]/[1] r=-1 lpr=107 pi=[70,107)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:28 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 107 pg[9.15( v 51'1000 (0'0,51'1000] local-lis/les=105/106 n=5 ec=54/45 lis/c=105/68 les/c/f=106/69/0 sis=107 pruub=15.328403473s) [1] r=-1 lpr=107 pi=[68,107)/1 crt=51'1000 mlcod 0'0 unknown NOTIFY pruub 140.721298218s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 09:00:28 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 107 pg[9.16( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=70/70 les/c/f=71/71/0 sis=107) [2]/[1] r=-1 lpr=107 pi=[70,107)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 20 09:00:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:29.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e108 e108: 3 total, 3 up, 3 in
Jan 20 09:00:29 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]: dispatch
Jan 20 09:00:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:29.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:30 np0005588920 python3.9[88646]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:00:30 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]': finished
Jan 20 09:00:30 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e109 e109: 3 total, 3 up, 3 in
Jan 20 09:00:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 109 pg[9.16( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=5 ec=54/45 lis/c=107/70 les/c/f=108/71/0 sis=109) [2] r=0 lpr=109 pi=[70,109)/1 luod=0'0 crt=51'1000 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:30 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 109 pg[9.16( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=5 ec=54/45 lis/c=107/70 les/c/f=108/71/0 sis=109) [2] r=0 lpr=109 pi=[70,109)/1 crt=51'1000 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 09:00:31 np0005588920 python3.9[88798]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 09:00:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:31.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:31 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]: dispatch
Jan 20 09:00:31 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e110 e110: 3 total, 3 up, 3 in
Jan 20 09:00:31 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 110 pg[9.16( v 51'1000 (0'0,51'1000] local-lis/les=109/110 n=5 ec=54/45 lis/c=107/70 les/c/f=108/71/0 sis=109) [2] r=0 lpr=109 pi=[70,109)/1 crt=51'1000 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 09:00:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:00:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:31.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:00:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:00:32 np0005588920 python3.9[88952]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 09:00:32 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]': finished
Jan 20 09:00:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 20 09:00:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:33.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 20 09:00:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:33.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:34 np0005588920 python3.9[89111]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 20 09:00:34 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]: dispatch
Jan 20 09:00:34 np0005588920 python3.9[89195]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 20 09:00:35 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e111 e111: 3 total, 3 up, 3 in
Jan 20 09:00:35 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 111 pg[9.19( v 51'1000 (0'0,51'1000] local-lis/les=78/79 n=5 ec=54/45 lis/c=78/78 les/c/f=79/79/0 sis=111 pruub=9.696326256s) [0] r=-1 lpr=111 pi=[78,111)/1 crt=51'1000 mlcod 0'0 active pruub 141.444290161s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:35 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 111 pg[9.19( v 51'1000 (0'0,51'1000] local-lis/les=78/79 n=5 ec=54/45 lis/c=78/78 les/c/f=79/79/0 sis=111 pruub=9.696252823s) [0] r=-1 lpr=111 pi=[78,111)/1 crt=51'1000 mlcod 0'0 unknown NOTIFY pruub 141.444290161s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 09:00:35 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]': finished
Jan 20 09:00:35 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]: dispatch
Jan 20 09:00:35 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 3.1a scrub starts
Jan 20 09:00:35 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 3.1a scrub ok
Jan 20 09:00:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:35.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 20 09:00:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:35.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 20 09:00:36 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e112 e112: 3 total, 3 up, 3 in
Jan 20 09:00:36 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 112 pg[9.19( v 51'1000 (0'0,51'1000] local-lis/les=78/79 n=5 ec=54/45 lis/c=78/78 les/c/f=79/79/0 sis=112) [0]/[2] r=0 lpr=112 pi=[78,112)/1 crt=51'1000 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:36 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 112 pg[9.19( v 51'1000 (0'0,51'1000] local-lis/les=78/79 n=5 ec=54/45 lis/c=78/78 les/c/f=79/79/0 sis=112) [0]/[2] r=0 lpr=112 pi=[78,112)/1 crt=51'1000 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 20 09:00:36 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]': finished
Jan 20 09:00:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:00:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e113 e113: 3 total, 3 up, 3 in
Jan 20 09:00:37 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 113 pg[9.1b( v 51'1000 (0'0,51'1000] local-lis/les=64/65 n=5 ec=54/45 lis/c=64/64 les/c/f=65/65/0 sis=113 pruub=14.888893127s) [0] r=-1 lpr=113 pi=[64,113)/1 crt=51'1000 mlcod 0'0 active pruub 148.668167114s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:37 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 113 pg[9.1b( v 51'1000 (0'0,51'1000] local-lis/les=64/65 n=5 ec=54/45 lis/c=64/64 les/c/f=65/65/0 sis=113 pruub=14.888405800s) [0] r=-1 lpr=113 pi=[64,113)/1 crt=51'1000 mlcod 0'0 unknown NOTIFY pruub 148.668167114s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 09:00:37 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 113 pg[9.19( v 51'1000 (0'0,51'1000] local-lis/les=112/113 n=5 ec=54/45 lis/c=78/78 les/c/f=79/79/0 sis=112) [0]/[2] async=[0] r=0 lpr=112 pi=[78,112)/1 crt=51'1000 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 09:00:37 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]: dispatch
Jan 20 09:00:37 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]': finished
Jan 20 09:00:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:00:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:37.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:00:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e114 e114: 3 total, 3 up, 3 in
Jan 20 09:00:37 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 114 pg[9.1b( v 51'1000 (0'0,51'1000] local-lis/les=64/65 n=5 ec=54/45 lis/c=64/64 les/c/f=65/65/0 sis=114) [0]/[2] r=0 lpr=114 pi=[64,114)/1 crt=51'1000 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:37 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 114 pg[9.19( v 51'1000 (0'0,51'1000] local-lis/les=112/113 n=5 ec=54/45 lis/c=112/78 les/c/f=113/79/0 sis=114 pruub=15.307610512s) [0] async=[0] r=-1 lpr=114 pi=[78,114)/1 crt=51'1000 mlcod 51'1000 active pruub 149.785491943s@ mbc={255={}}] start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:37 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 114 pg[9.1b( v 51'1000 (0'0,51'1000] local-lis/les=64/65 n=5 ec=54/45 lis/c=64/64 les/c/f=65/65/0 sis=114) [0]/[2] r=0 lpr=114 pi=[64,114)/1 crt=51'1000 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 20 09:00:37 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 114 pg[9.19( v 51'1000 (0'0,51'1000] local-lis/les=112/113 n=5 ec=54/45 lis/c=112/78 les/c/f=113/79/0 sis=114 pruub=15.307285309s) [0] r=-1 lpr=114 pi=[78,114)/1 crt=51'1000 mlcod 0'0 unknown NOTIFY pruub 149.785491943s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 09:00:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:37.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:38 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e115 e115: 3 total, 3 up, 3 in
Jan 20 09:00:38 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 115 pg[9.1b( v 51'1000 (0'0,51'1000] local-lis/les=114/115 n=5 ec=54/45 lis/c=64/64 les/c/f=65/65/0 sis=114) [0]/[2] async=[0] r=0 lpr=114 pi=[64,114)/1 crt=51'1000 mlcod 0'0 active+remapped mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 09:00:39 np0005588920 podman[89431]: 2026-01-20 14:00:39.300257127 +0000 UTC m=+0.077533401 container exec 6d4eaf8659f13902200468a4ae46f22a0824452b9353ff1491898f5d3b0130c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mon-compute-2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 20 09:00:39 np0005588920 podman[89431]: 2026-01-20 14:00:39.389873801 +0000 UTC m=+0.167150075 container exec_died 6d4eaf8659f13902200468a4ae46f22a0824452b9353ff1491898f5d3b0130c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mon-compute-2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 20 09:00:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:00:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:39.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:00:39 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e116 e116: 3 total, 3 up, 3 in
Jan 20 09:00:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:00:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:39.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:00:39 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 116 pg[9.1b( v 51'1000 (0'0,51'1000] local-lis/les=114/115 n=5 ec=54/45 lis/c=114/64 les/c/f=115/65/0 sis=116 pruub=14.997964859s) [0] async=[0] r=-1 lpr=116 pi=[64,116)/1 crt=51'1000 mlcod 51'1000 active pruub 151.497772217s@ mbc={255={}}] start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:39 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 116 pg[9.1b( v 51'1000 (0'0,51'1000] local-lis/les=114/115 n=5 ec=54/45 lis/c=114/64 les/c/f=115/65/0 sis=116 pruub=14.997794151s) [0] r=-1 lpr=116 pi=[64,116)/1 crt=51'1000 mlcod 0'0 unknown NOTIFY pruub 151.497772217s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 09:00:40 np0005588920 podman[89587]: 2026-01-20 14:00:40.291872709 +0000 UTC m=+0.096980031 container exec c2bc03da81fcaa831434668625c7c49511adebaebe9f2026fa9fc7c3d873fb54 (image=quay.io/ceph/haproxy:2.3, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-2-cuokcs)
Jan 20 09:00:40 np0005588920 podman[89587]: 2026-01-20 14:00:40.302842262 +0000 UTC m=+0.107949534 container exec_died c2bc03da81fcaa831434668625c7c49511adebaebe9f2026fa9fc7c3d873fb54 (image=quay.io/ceph/haproxy:2.3, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-2-cuokcs)
Jan 20 09:00:40 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:00:40 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:00:40 np0005588920 podman[89650]: 2026-01-20 14:00:40.525796046 +0000 UTC m=+0.057168098 container exec 4f78941c670aecb446a8bef016d44f9adf56eddc01c85813d46adbff35b13482 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-2-dleeql, version=2.2.4, architecture=x86_64, io.buildah.version=1.28.2, io.openshift.tags=Ceph keepalived, release=1793, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=keepalived for Ceph, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git, io.k8s.display-name=Keepalived on RHEL 9, com.redhat.component=keepalived-container)
Jan 20 09:00:40 np0005588920 podman[89650]: 2026-01-20 14:00:40.539557443 +0000 UTC m=+0.070929415 container exec_died 4f78941c670aecb446a8bef016d44f9adf56eddc01c85813d46adbff35b13482 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-2-dleeql, release=1793, io.openshift.expose-services=, description=keepalived for Ceph, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, vcs-type=git, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, version=2.2.4, io.openshift.tags=Ceph keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, architecture=x86_64, summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, io.k8s.display-name=Keepalived on RHEL 9, com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.28.2)
Jan 20 09:00:40 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e117 e117: 3 total, 3 up, 3 in
Jan 20 09:00:41 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:00:41 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:00:41 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:00:41 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:00:41 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:00:41 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:00:41 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:00:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:41.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:41 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 3.1d deep-scrub starts
Jan 20 09:00:41 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 3.1d deep-scrub ok
Jan 20 09:00:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:41.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e117 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:00:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:43.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:43.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:45 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e118 e118: 3 total, 3 up, 3 in
Jan 20 09:00:45 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]: dispatch
Jan 20 09:00:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:45.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:45.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:46 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]': finished
Jan 20 09:00:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e118 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:00:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e119 e119: 3 total, 3 up, 3 in
Jan 20 09:00:47 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]: dispatch
Jan 20 09:00:47 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:00:47 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:00:47 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0.
Jan 20 09:00:47 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:00:47.665821) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 09:00:47 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13
Jan 20 09:00:47 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917647666078, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 7048, "num_deletes": 256, "total_data_size": 12955756, "memory_usage": 13223448, "flush_reason": "Manual Compaction"}
Jan 20 09:00:47 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started
Jan 20 09:00:47 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917647777753, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 7725991, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 254, "largest_seqno": 7053, "table_properties": {"data_size": 7698461, "index_size": 18007, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 8453, "raw_key_size": 79170, "raw_average_key_size": 23, "raw_value_size": 7632919, "raw_average_value_size": 2267, "num_data_blocks": 797, "num_entries": 3366, "num_filter_entries": 3366, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 1768917472, "file_creation_time": 1768917647, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:00:47 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 112001 microseconds, and 32470 cpu microseconds.
Jan 20 09:00:47 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:00:47.777846) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 7725991 bytes OK
Jan 20 09:00:47 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:00:47.777872) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started
Jan 20 09:00:47 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:00:47.780257) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done
Jan 20 09:00:47 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:00:47.780284) EVENT_LOG_v1 {"time_micros": 1768917647780277, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0}
Jan 20 09:00:47 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:00:47.780308) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50
Jan 20 09:00:47 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 12918668, prev total WAL file size 12918668, number of live WAL files 2.
Jan 20 09:00:47 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:00:47 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:00:47.784815) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730030' seq:72057594037927935, type:22 .. '7061786F7300323532' seq:0, type:0; will stop at (end)
Jan 20 09:00:47 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00
Jan 20 09:00:47 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(7544KB) 8(1648B)]
Jan 20 09:00:47 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917647784964, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 7727639, "oldest_snapshot_seqno": -1}
Jan 20 09:00:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:47.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:47 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 3113 keys, 7722165 bytes, temperature: kUnknown
Jan 20 09:00:47 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917647898538, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 7722165, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 7695350, "index_size": 17937, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 7813, "raw_key_size": 74954, "raw_average_key_size": 24, "raw_value_size": 7632977, "raw_average_value_size": 2451, "num_data_blocks": 796, "num_entries": 3113, "num_filter_entries": 3113, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768917647, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:00:47 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:00:47 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:00:47.899037) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 7722165 bytes
Jan 20 09:00:47 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:00:47.900869) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 68.0 rd, 67.9 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(7.4, 0.0 +0.0 blob) out(7.4 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 3371, records dropped: 258 output_compression: NoCompression
Jan 20 09:00:47 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:00:47.900902) EVENT_LOG_v1 {"time_micros": 1768917647900888, "job": 4, "event": "compaction_finished", "compaction_time_micros": 113693, "compaction_time_cpu_micros": 34627, "output_level": 6, "num_output_files": 1, "total_output_size": 7722165, "num_input_records": 3371, "num_output_records": 3113, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 09:00:47 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:00:47 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917647907231, "job": 4, "event": "table_file_deletion", "file_number": 14}
Jan 20 09:00:47 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:00:47 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917647907363, "job": 4, "event": "table_file_deletion", "file_number": 8}
Jan 20 09:00:47 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:00:47.784638) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:00:47 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 119 pg[9.1d( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=87/87 les/c/f=88/88/0 sis=119) [2] r=0 lpr=119 pi=[87,119)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 09:00:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e120 e120: 3 total, 3 up, 3 in
Jan 20 09:00:47 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 120 pg[9.1d( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=87/87 les/c/f=88/88/0 sis=120) [2]/[1] r=-1 lpr=120 pi=[87,120)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:47 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 120 pg[9.1d( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=87/87 les/c/f=88/88/0 sis=120) [2]/[1] r=-1 lpr=120 pi=[87,120)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 20 09:00:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 20 09:00:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:47.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 20 09:00:48 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]': finished
Jan 20 09:00:48 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e121 e121: 3 total, 3 up, 3 in
Jan 20 09:00:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:49.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:49.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:50 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]: dispatch
Jan 20 09:00:50 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e122 e122: 3 total, 3 up, 3 in
Jan 20 09:00:50 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 122 pg[9.1d( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=5 ec=54/45 lis/c=120/87 les/c/f=121/88/0 sis=122) [2] r=0 lpr=122 pi=[87,122)/1 luod=0'0 crt=51'1000 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:50 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 122 pg[9.1d( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=5 ec=54/45 lis/c=120/87 les/c/f=121/88/0 sis=122) [2] r=0 lpr=122 pi=[87,122)/1 crt=51'1000 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 09:00:51 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e123 e123: 3 total, 3 up, 3 in
Jan 20 09:00:51 np0005588920 ceph-osd[79820]: osd.2 pg_epoch: 123 pg[9.1d( v 51'1000 (0'0,51'1000] local-lis/les=122/123 n=5 ec=54/45 lis/c=120/87 les/c/f=121/88/0 sis=122) [2] r=0 lpr=122 pi=[87,122)/1 crt=51'1000 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 09:00:51 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]': finished
Jan 20 09:00:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:51.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:51.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e124 e124: 3 total, 3 up, 3 in
Jan 20 09:00:52 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 20 09:00:52 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 20 09:00:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e124 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:00:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e125 e125: 3 total, 3 up, 3 in
Jan 20 09:00:53 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 5.e scrub starts
Jan 20 09:00:53 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 5.e scrub ok
Jan 20 09:00:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:53.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 20 09:00:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:54.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 20 09:00:54 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e126 e126: 3 total, 3 up, 3 in
Jan 20 09:00:55 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e127 e127: 3 total, 3 up, 3 in
Jan 20 09:00:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:00:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:55.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:00:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:56.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:56 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 e128: 3 total, 3 up, 3 in
Jan 20 09:00:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:00:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:57.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:00:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:58.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:00:59 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 2.1c scrub starts
Jan 20 09:00:59 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 2.1c scrub ok
Jan 20 09:00:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:00:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:59.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:01:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:00.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:01:00 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 5.1a scrub starts
Jan 20 09:01:00 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 5.1a scrub ok
Jan 20 09:01:01 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 2.1d scrub starts
Jan 20 09:01:01 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 2.1d scrub ok
Jan 20 09:01:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:01.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:02.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:01:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:03.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:01:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:04.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:01:04 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 2.b scrub starts
Jan 20 09:01:04 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 2.b scrub ok
Jan 20 09:01:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:05.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:06.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:01:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:07.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:08.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:08 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 2.f scrub starts
Jan 20 09:01:08 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 2.f scrub ok
Jan 20 09:01:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:09.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 20 09:01:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:10.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 20 09:01:10 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 10.12 scrub starts
Jan 20 09:01:10 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 10.12 scrub ok
Jan 20 09:01:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:11.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:12.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:01:13 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 10.1e scrub starts
Jan 20 09:01:13 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 10.1e scrub ok
Jan 20 09:01:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:13.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.002000053s ======
Jan 20 09:01:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:14.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Jan 20 09:01:14 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 10.4 scrub starts
Jan 20 09:01:14 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 10.4 scrub ok
Jan 20 09:01:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:01:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:15.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:01:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:01:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:16.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:01:16 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 10.11 scrub starts
Jan 20 09:01:16 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 10.11 scrub ok
Jan 20 09:01:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:01:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:01:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:17.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:01:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:01:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:18.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:01:18 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 10.f scrub starts
Jan 20 09:01:18 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 10.f scrub ok
Jan 20 09:01:19 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 10.10 scrub starts
Jan 20 09:01:19 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 10.10 scrub ok
Jan 20 09:01:19 np0005588920 python3.9[90226]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:01:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:01:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:19.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:01:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:20.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:21 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 10.3 scrub starts
Jan 20 09:01:21 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 10.3 scrub ok
Jan 20 09:01:21 np0005588920 python3.9[90513]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Jan 20 09:01:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:21.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 20 09:01:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:22.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 20 09:01:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:01:22 np0005588920 python3.9[90666]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Jan 20 09:01:22 np0005588920 python3.9[90818]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:01:23 np0005588920 python3.9[90971]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Jan 20 09:01:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:01:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:23.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:01:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:01:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:24.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:01:25 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 10.1 scrub starts
Jan 20 09:01:25 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 10.1 scrub ok
Jan 20 09:01:25 np0005588920 python3.9[91124]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:01:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:01:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:25.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:01:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:01:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:26.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:01:26 np0005588920 python3.9[91276]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:01:26 np0005588920 python3.9[91354]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem _original_basename=tls-ca-bundle.pem recurse=False state=file path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:01:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:01:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 20 09:01:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:27.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 20 09:01:28 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 8.15 scrub starts
Jan 20 09:01:28 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 8.15 scrub ok
Jan 20 09:01:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:28.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:28 np0005588920 python3.9[91509]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:01:29 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 11.16 deep-scrub starts
Jan 20 09:01:29 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 11.16 deep-scrub ok
Jan 20 09:01:29 np0005588920 python3.9[91664]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Jan 20 09:01:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 20 09:01:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:29.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 20 09:01:30 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 11.17 scrub starts
Jan 20 09:01:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:30.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:30 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 11.17 scrub ok
Jan 20 09:01:30 np0005588920 python3.9[91817]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Jan 20 09:01:31 np0005588920 python3.9[92021]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 20 09:01:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:31.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:32 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 8.2 deep-scrub starts
Jan 20 09:01:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 20 09:01:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:32.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 20 09:01:32 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 8.2 deep-scrub ok
Jan 20 09:01:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:01:32 np0005588920 python3.9[92173]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Jan 20 09:01:33 np0005588920 python3.9[92326]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 20 09:01:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:33.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:33 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 8.16 deep-scrub starts
Jan 20 09:01:33 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 8.16 deep-scrub ok
Jan 20 09:01:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 20 09:01:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:34.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 20 09:01:35 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 8.f scrub starts
Jan 20 09:01:35 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 8.f scrub ok
Jan 20 09:01:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:01:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:35.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:01:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 20 09:01:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:36.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 20 09:01:36 np0005588920 python3.9[92480]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:01:37 np0005588920 python3.9[92632]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:01:37 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 11.a scrub starts
Jan 20 09:01:37 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 11.a scrub ok
Jan 20 09:01:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:01:37 np0005588920 python3.9[92711]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/modules-load.d/99-edpm.conf _original_basename=edpm-modprobe.conf.j2 recurse=False state=file path=/etc/modules-load.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:01:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:01:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:37.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:01:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:38.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:38 np0005588920 python3.9[92863]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:01:38 np0005588920 python3.9[92941]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/sysctl.d/99-edpm.conf _original_basename=edpm-sysctl.conf.j2 recurse=False state=file path=/etc/sysctl.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:01:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:39.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:40.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:40 np0005588920 python3.9[93094]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 20 09:01:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:01:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:41.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:01:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:01:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:42.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:01:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:01:42 np0005588920 python3.9[93246]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:01:43 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 8.3 scrub starts
Jan 20 09:01:43 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 8.3 scrub ok
Jan 20 09:01:43 np0005588920 python3.9[93399]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Jan 20 09:01:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:43.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:44.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:44 np0005588920 python3.9[93549]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:01:45 np0005588920 python3.9[93702]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:01:45 np0005588920 systemd[1]: Stopping Dynamic System Tuning Daemon...
Jan 20 09:01:45 np0005588920 systemd[1]: tuned.service: Deactivated successfully.
Jan 20 09:01:45 np0005588920 systemd[1]: Stopped Dynamic System Tuning Daemon.
Jan 20 09:01:45 np0005588920 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 20 09:01:45 np0005588920 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 20 09:01:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:45.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:45 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 8.a scrub starts
Jan 20 09:01:46 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 8.a scrub ok
Jan 20 09:01:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:46.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:46 np0005588920 python3.9[93863]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Jan 20 09:01:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:01:47 np0005588920 systemd[72686]: Created slice User Background Tasks Slice.
Jan 20 09:01:47 np0005588920 systemd[72686]: Starting Cleanup of User's Temporary Files and Directories...
Jan 20 09:01:47 np0005588920 systemd[72686]: Finished Cleanup of User's Temporary Files and Directories.
Jan 20 09:01:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:47.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:01:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:48.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:01:49 np0005588920 podman[94063]: 2026-01-20 14:01:49.008067743 +0000 UTC m=+0.089347312 container exec 6d4eaf8659f13902200468a4ae46f22a0824452b9353ff1491898f5d3b0130c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mon-compute-2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 20 09:01:49 np0005588920 podman[94063]: 2026-01-20 14:01:49.141833112 +0000 UTC m=+0.223112711 container exec_died 6d4eaf8659f13902200468a4ae46f22a0824452b9353ff1491898f5d3b0130c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mon-compute-2, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 20 09:01:49 np0005588920 podman[94214]: 2026-01-20 14:01:49.693062436 +0000 UTC m=+0.067004909 container exec c2bc03da81fcaa831434668625c7c49511adebaebe9f2026fa9fc7c3d873fb54 (image=quay.io/ceph/haproxy:2.3, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-2-cuokcs)
Jan 20 09:01:49 np0005588920 podman[94214]: 2026-01-20 14:01:49.703731829 +0000 UTC m=+0.077674312 container exec_died c2bc03da81fcaa831434668625c7c49511adebaebe9f2026fa9fc7c3d873fb54 (image=quay.io/ceph/haproxy:2.3, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-2-cuokcs)
Jan 20 09:01:49 np0005588920 podman[94277]: 2026-01-20 14:01:49.910661009 +0000 UTC m=+0.050240344 container exec 4f78941c670aecb446a8bef016d44f9adf56eddc01c85813d46adbff35b13482 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-2-dleeql, io.openshift.tags=Ceph keepalived, com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, build-date=2023-02-22T09:23:20, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, description=keepalived for Ceph, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, version=2.2.4, io.openshift.expose-services=, name=keepalived, vendor=Red Hat, Inc., io.buildah.version=1.28.2, summary=Provides keepalived on RHEL 9 for Ceph.)
Jan 20 09:01:49 np0005588920 podman[94277]: 2026-01-20 14:01:49.955638912 +0000 UTC m=+0.095218257 container exec_died 4f78941c670aecb446a8bef016d44f9adf56eddc01c85813d46adbff35b13482 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-2-dleeql, release=1793, com.redhat.component=keepalived-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., description=keepalived for Ceph, io.buildah.version=1.28.2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, version=2.2.4)
Jan 20 09:01:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:49.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:50.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:50 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:01:50 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:01:50 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:01:50 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:01:50 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:01:50 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:01:50 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 8.11 scrub starts
Jan 20 09:01:50 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 8.11 scrub ok
Jan 20 09:01:50 np0005588920 python3.9[94581]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:01:51 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:01:51 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:01:51 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:01:51 np0005588920 python3.9[94772]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:01:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:51.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:52.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:01:52 np0005588920 systemd[1]: session-34.scope: Deactivated successfully.
Jan 20 09:01:52 np0005588920 systemd[1]: session-34.scope: Consumed 1min 5.739s CPU time.
Jan 20 09:01:52 np0005588920 systemd-logind[783]: Session 34 logged out. Waiting for processes to exit.
Jan 20 09:01:52 np0005588920 systemd-logind[783]: Removed session 34.
Jan 20 09:01:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:01:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:53.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:01:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:54.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:01:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:55.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:01:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:56.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:01:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:58.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:58 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:01:58 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:01:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:01:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:58.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:58 np0005588920 systemd-logind[783]: New session 35 of user zuul.
Jan 20 09:01:58 np0005588920 systemd[1]: Started Session 35 of User zuul.
Jan 20 09:01:58 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 11.e scrub starts
Jan 20 09:01:58 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 11.e scrub ok
Jan 20 09:01:59 np0005588920 python3.9[95006]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 09:01:59 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 11.13 scrub starts
Jan 20 09:01:59 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 11.13 scrub ok
Jan 20 09:02:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:02:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:00.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:02:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:02:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:00.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:02:01 np0005588920 python3.9[95163]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Jan 20 09:02:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:02.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 20 09:02:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:02.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 20 09:02:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:02:02 np0005588920 python3.9[95316]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 20 09:02:03 np0005588920 python3.9[95401]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 20 09:02:03 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 8.c scrub starts
Jan 20 09:02:03 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 8.c scrub ok
Jan 20 09:02:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:02:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:04.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:02:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 20 09:02:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:04.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 20 09:02:04 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 8.d scrub starts
Jan 20 09:02:04 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 8.d scrub ok
Jan 20 09:02:05 np0005588920 python3.9[95555]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 20 09:02:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 20 09:02:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:06.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 20 09:02:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:02:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:06.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:02:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:02:07 np0005588920 python3.9[95709]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 20 09:02:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:08.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:08.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:09 np0005588920 python3.9[95863]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 09:02:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:02:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:10.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:02:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 20 09:02:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:10.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 20 09:02:10 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 8.b scrub starts
Jan 20 09:02:10 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 8.b scrub ok
Jan 20 09:02:11 np0005588920 python3.9[96015]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Jan 20 09:02:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:02:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:12.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:02:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:12.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:02:12 np0005588920 python3.9[96216]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 09:02:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:14.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:02:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:14.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:02:14 np0005588920 python3.9[96375]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 20 09:02:14 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 8.6 scrub starts
Jan 20 09:02:14 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 8.6 scrub ok
Jan 20 09:02:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 20 09:02:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:16.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 20 09:02:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:16.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:16 np0005588920 python3.9[96529]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:02:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:02:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:18.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 20 09:02:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:18.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 20 09:02:18 np0005588920 python3.9[96817]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Jan 20 09:02:18 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 8.1c scrub starts
Jan 20 09:02:18 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 8.1c scrub ok
Jan 20 09:02:19 np0005588920 python3.9[96968]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:02:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:02:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:20.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:02:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:20.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:20 np0005588920 python3.9[97122]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 20 09:02:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:22.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:02:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:22.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:02:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:02:22 np0005588920 python3.9[97276]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 20 09:02:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:24.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 20 09:02:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:24.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 20 09:02:24 np0005588920 python3.9[97430]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:02:25 np0005588920 python3.9[97585]: ansible-ansible.builtin.slurp Invoked with path=/var/lib/edpm-config/os-net-config.returncode src=/var/lib/edpm-config/os-net-config.returncode
Jan 20 09:02:25 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 8.5 scrub starts
Jan 20 09:02:25 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 8.5 scrub ok
Jan 20 09:02:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:02:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:26.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:02:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:26.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:27 np0005588920 systemd[1]: session-35.scope: Deactivated successfully.
Jan 20 09:02:27 np0005588920 systemd[1]: session-35.scope: Consumed 19.103s CPU time.
Jan 20 09:02:27 np0005588920 systemd-logind[783]: Session 35 logged out. Waiting for processes to exit.
Jan 20 09:02:27 np0005588920 systemd-logind[783]: Removed session 35.
Jan 20 09:02:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:02:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:28.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:28.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 20 09:02:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:30.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 20 09:02:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:02:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:30.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:02:30 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 8.9 scrub starts
Jan 20 09:02:30 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 8.9 scrub ok
Jan 20 09:02:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:32.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:02:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:32.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:32 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 11.19 scrub starts
Jan 20 09:02:32 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 11.19 scrub ok
Jan 20 09:02:34 np0005588920 systemd-logind[783]: New session 36 of user zuul.
Jan 20 09:02:34 np0005588920 systemd[1]: Started Session 36 of User zuul.
Jan 20 09:02:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:34.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:02:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:34.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:02:35 np0005588920 python3.9[97819]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 09:02:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:36.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:02:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:36.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:02:36 np0005588920 python3.9[97974]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 20 09:02:36 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 11.8 scrub starts
Jan 20 09:02:36 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 11.8 scrub ok
Jan 20 09:02:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:02:38 np0005588920 python3.9[98168]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:02:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:38.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 20 09:02:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:38.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 20 09:02:38 np0005588920 systemd[1]: session-36.scope: Deactivated successfully.
Jan 20 09:02:38 np0005588920 systemd[1]: session-36.scope: Consumed 2.657s CPU time.
Jan 20 09:02:38 np0005588920 systemd-logind[783]: Session 36 logged out. Waiting for processes to exit.
Jan 20 09:02:38 np0005588920 systemd-logind[783]: Removed session 36.
Jan 20 09:02:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:40.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 20 09:02:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:40.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 20 09:02:40 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 8.1f scrub starts
Jan 20 09:02:40 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 8.1f scrub ok
Jan 20 09:02:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:42.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:02:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 20 09:02:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:42.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 20 09:02:44 np0005588920 systemd-logind[783]: New session 37 of user zuul.
Jan 20 09:02:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:44.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:44 np0005588920 systemd[1]: Started Session 37 of User zuul.
Jan 20 09:02:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 20 09:02:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:44.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 20 09:02:45 np0005588920 python3.9[98351]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 09:02:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 20 09:02:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:46.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 20 09:02:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:02:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:46.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:02:46 np0005588920 python3.9[98505]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 09:02:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:02:47 np0005588920 python3.9[98662]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 20 09:02:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:48.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:02:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:48.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:02:48 np0005588920 python3.9[98746]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 20 09:02:48 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 11.3 scrub starts
Jan 20 09:02:48 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 11.3 scrub ok
Jan 20 09:02:49 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 9.b scrub starts
Jan 20 09:02:49 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 9.b scrub ok
Jan 20 09:02:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:50.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:50.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:50 np0005588920 python3.9[98900]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 20 09:02:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:02:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:52.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:02:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:02:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:52.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:52 np0005588920 python3.9[99146]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:02:52 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 9.3 scrub starts
Jan 20 09:02:52 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 9.3 scrub ok
Jan 20 09:02:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:54.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:54.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:54 np0005588920 python3.9[99299]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:02:54 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 9.17 scrub starts
Jan 20 09:02:54 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 9.17 scrub ok
Jan 20 09:02:55 np0005588920 python3.9[99465]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:02:56 np0005588920 python3.9[99544]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:02:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:56.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:56.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:56 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 9.13 scrub starts
Jan 20 09:02:56 np0005588920 python3.9[99696]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:02:56 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 9.13 scrub ok
Jan 20 09:02:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:02:57 np0005588920 python3.9[99775]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:02:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:58.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:02:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:58.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:58 np0005588920 python3.9[100064]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:02:58 np0005588920 podman[100097]: 2026-01-20 14:02:58.39697843 +0000 UTC m=+0.061077711 container exec 6d4eaf8659f13902200468a4ae46f22a0824452b9353ff1491898f5d3b0130c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mon-compute-2, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 20 09:02:58 np0005588920 podman[100097]: 2026-01-20 14:02:58.51360307 +0000 UTC m=+0.177702351 container exec_died 6d4eaf8659f13902200468a4ae46f22a0824452b9353ff1491898f5d3b0130c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mon-compute-2, ceph=True, io.buildah.version=1.39.3, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Jan 20 09:02:58 np0005588920 python3.9[100350]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:02:59 np0005588920 podman[100403]: 2026-01-20 14:02:59.065918646 +0000 UTC m=+0.062437309 container exec c2bc03da81fcaa831434668625c7c49511adebaebe9f2026fa9fc7c3d873fb54 (image=quay.io/ceph/haproxy:2.3, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-2-cuokcs)
Jan 20 09:02:59 np0005588920 podman[100403]: 2026-01-20 14:02:59.07443057 +0000 UTC m=+0.070949203 container exec_died c2bc03da81fcaa831434668625c7c49511adebaebe9f2026fa9fc7c3d873fb54 (image=quay.io/ceph/haproxy:2.3, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-2-cuokcs)
Jan 20 09:02:59 np0005588920 podman[100542]: 2026-01-20 14:02:59.284381213 +0000 UTC m=+0.052795983 container exec 4f78941c670aecb446a8bef016d44f9adf56eddc01c85813d46adbff35b13482 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-2-dleeql, name=keepalived, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, version=2.2.4, com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1793, io.k8s.display-name=Keepalived on RHEL 9, summary=Provides keepalived on RHEL 9 for Ceph., io.buildah.version=1.28.2, build-date=2023-02-22T09:23:20, description=keepalived for Ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.tags=Ceph keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Jan 20 09:02:59 np0005588920 podman[100542]: 2026-01-20 14:02:59.300517025 +0000 UTC m=+0.068931765 container exec_died 4f78941c670aecb446a8bef016d44f9adf56eddc01c85813d46adbff35b13482 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-2-dleeql, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.tags=Ceph keepalived, name=keepalived, build-date=2023-02-22T09:23:20, release=1793, io.openshift.expose-services=, architecture=x86_64, version=2.2.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.display-name=Keepalived on RHEL 9, com.redhat.component=keepalived-container, io.buildah.version=1.28.2, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=keepalived for Ceph, distribution-scope=public)
Jan 20 09:02:59 np0005588920 python3.9[100674]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:03:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:00.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:03:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:00.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:03:00 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:03:00 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:03:00 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:03:00 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:03:00 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:03:00 np0005588920 python3.9[100934]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:03:01 np0005588920 python3.9[101087]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 20 09:03:01 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 9.7 scrub starts
Jan 20 09:03:01 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 9.7 scrub ok
Jan 20 09:03:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:03:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:02.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:03:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:03:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:02.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:04.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:03:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:04.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:03:04 np0005588920 python3.9[101242]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 09:03:05 np0005588920 python3.9[101397]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:03:05 np0005588920 python3.9[101549]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:03:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:06.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:03:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:06.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:03:06 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:03:06 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:03:06 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 9.5 deep-scrub starts
Jan 20 09:03:06 np0005588920 python3.9[101726]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:03:06 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 9.5 deep-scrub ok
Jan 20 09:03:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:03:07 np0005588920 python3.9[101905]: ansible-service_facts Invoked
Jan 20 09:03:08 np0005588920 network[101922]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 20 09:03:08 np0005588920 network[101923]: 'network-scripts' will be removed from distribution in near future.
Jan 20 09:03:08 np0005588920 network[101924]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 20 09:03:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:08.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:03:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:08.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:03:08 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 9.18 scrub starts
Jan 20 09:03:08 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 9.18 scrub ok
Jan 20 09:03:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:10.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:03:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:10.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:03:10 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 9.8 scrub starts
Jan 20 09:03:10 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 9.8 scrub ok
Jan 20 09:03:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:12.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:03:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:12.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:14.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:14.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:15 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 9.9 scrub starts
Jan 20 09:03:15 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 9.9 scrub ok
Jan 20 09:03:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:03:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:16.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:03:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:16.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:16 np0005588920 python3.9[102430]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 20 09:03:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:03:17 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 9.16 scrub starts
Jan 20 09:03:17 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 9.16 scrub ok
Jan 20 09:03:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:18.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:18.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:19 np0005588920 python3.9[102585]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Jan 20 09:03:19 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 9.1d scrub starts
Jan 20 09:03:19 np0005588920 ceph-osd[79820]: log_channel(cluster) log [DBG] : 9.1d scrub ok
Jan 20 09:03:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:03:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:20.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:03:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:20.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:21 np0005588920 python3.9[102738]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:03:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:03:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:03:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:22.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:03:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:03:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:22.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:03:22 np0005588920 python3.9[102816]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/chrony.conf _original_basename=chrony.conf.j2 recurse=False state=file path=/etc/chrony.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:03:23 np0005588920 python3.9[102969]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:03:23 np0005588920 python3.9[103047]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/chronyd _original_basename=chronyd.sysconfig.j2 recurse=False state=file path=/etc/sysconfig/chronyd force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:03:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:24.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:03:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:24.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:03:25 np0005588920 python3.9[103200]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:03:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:03:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:26.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:03:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:26.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:03:27 np0005588920 python3.9[103352]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 20 09:03:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:03:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:28.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:03:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:03:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:28.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:03:28 np0005588920 python3.9[103437]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:03:29 np0005588920 systemd[1]: session-37.scope: Deactivated successfully.
Jan 20 09:03:29 np0005588920 systemd[1]: session-37.scope: Consumed 26.906s CPU time.
Jan 20 09:03:29 np0005588920 systemd-logind[783]: Session 37 logged out. Waiting for processes to exit.
Jan 20 09:03:29 np0005588920 systemd-logind[783]: Removed session 37.
Jan 20 09:03:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:30.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:03:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:30.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:03:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:03:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:03:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:32.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:03:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:32.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:34.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:34.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:34 np0005588920 systemd-logind[783]: New session 38 of user zuul.
Jan 20 09:03:34 np0005588920 systemd[1]: Started Session 38 of User zuul.
Jan 20 09:03:35 np0005588920 python3.9[103673]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:03:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:03:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:36.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:03:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:36.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:36 np0005588920 python3.9[103825]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:03:37 np0005588920 python3.9[103903]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/ceph-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/ceph-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:03:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:03:37 np0005588920 systemd[1]: session-38.scope: Deactivated successfully.
Jan 20 09:03:37 np0005588920 systemd[1]: session-38.scope: Consumed 1.852s CPU time.
Jan 20 09:03:37 np0005588920 systemd-logind[783]: Session 38 logged out. Waiting for processes to exit.
Jan 20 09:03:37 np0005588920 systemd-logind[783]: Removed session 38.
Jan 20 09:03:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:38.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:38.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:03:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:40.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:03:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:40.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:03:42 np0005588920 systemd-logind[783]: New session 39 of user zuul.
Jan 20 09:03:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:03:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:42.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:03:42 np0005588920 systemd[1]: Started Session 39 of User zuul.
Jan 20 09:03:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:42.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:42 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0.
Jan 20 09:03:42 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:03:42.813038) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 09:03:42 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16
Jan 20 09:03:42 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917822813145, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 2601, "num_deletes": 251, "total_data_size": 5170808, "memory_usage": 5243120, "flush_reason": "Manual Compaction"}
Jan 20 09:03:42 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started
Jan 20 09:03:42 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917822851001, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 3372780, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 7058, "largest_seqno": 9654, "table_properties": {"data_size": 3362979, "index_size": 5655, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3077, "raw_key_size": 25988, "raw_average_key_size": 21, "raw_value_size": 3340744, "raw_average_value_size": 2765, "num_data_blocks": 251, "num_entries": 1208, "num_filter_entries": 1208, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917647, "oldest_key_time": 1768917647, "file_creation_time": 1768917822, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:03:42 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 37997 microseconds, and 7912 cpu microseconds.
Jan 20 09:03:42 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:03:42 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:03:42.851050) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 3372780 bytes OK
Jan 20 09:03:42 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:03:42.851067) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started
Jan 20 09:03:42 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:03:42.853560) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done
Jan 20 09:03:42 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:03:42.853602) EVENT_LOG_v1 {"time_micros": 1768917822853593, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 09:03:42 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:03:42.853626) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 09:03:42 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 5158794, prev total WAL file size 5158794, number of live WAL files 2.
Jan 20 09:03:42 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:03:42 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:03:42.854944) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300323531' seq:72057594037927935, type:22 .. '7061786F7300353033' seq:0, type:0; will stop at (end)
Jan 20 09:03:42 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 09:03:42 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(3293KB)], [15(7541KB)]
Jan 20 09:03:42 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917822855074, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 11094945, "oldest_snapshot_seqno": -1}
Jan 20 09:03:42 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 3798 keys, 9488532 bytes, temperature: kUnknown
Jan 20 09:03:42 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917822940083, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 9488532, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9457571, "index_size": 20355, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9541, "raw_key_size": 91658, "raw_average_key_size": 24, "raw_value_size": 9383472, "raw_average_value_size": 2470, "num_data_blocks": 890, "num_entries": 3798, "num_filter_entries": 3798, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768917822, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:03:42 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:03:42 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:03:42.940374) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 9488532 bytes
Jan 20 09:03:42 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:03:42.941720) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 130.9 rd, 112.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.2, 7.4 +0.0 blob) out(9.0 +0.0 blob), read-write-amplify(6.1) write-amplify(2.8) OK, records in: 4321, records dropped: 523 output_compression: NoCompression
Jan 20 09:03:42 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:03:42.941743) EVENT_LOG_v1 {"time_micros": 1768917822941732, "job": 6, "event": "compaction_finished", "compaction_time_micros": 84751, "compaction_time_cpu_micros": 44705, "output_level": 6, "num_output_files": 1, "total_output_size": 9488532, "num_input_records": 4321, "num_output_records": 3798, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 09:03:42 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:03:42 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917822942525, "job": 6, "event": "table_file_deletion", "file_number": 17}
Jan 20 09:03:42 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:03:42 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917822944289, "job": 6, "event": "table_file_deletion", "file_number": 15}
Jan 20 09:03:42 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:03:42.854790) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:03:42 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:03:42.944376) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:03:42 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:03:42.944383) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:03:42 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:03:42.944385) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:03:42 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:03:42.944387) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:03:42 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:03:42.944389) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:03:43 np0005588920 python3.9[104088]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 09:03:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:03:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:44.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:03:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:44.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:44 np0005588920 python3.9[104244]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:03:45 np0005588920 python3.9[104420]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:03:46 np0005588920 python3.9[104498]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.5qq0_tfj recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:03:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:46.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:46.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:03:47 np0005588920 python3.9[104651]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:03:47 np0005588920 python3.9[104729]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/podman_drop_in _original_basename=.ub6f25k_ recurse=False state=file path=/etc/sysconfig/podman_drop_in force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:03:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:03:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:48.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:03:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:48.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:48 np0005588920 python3.9[104881]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:03:49 np0005588920 python3.9[105034]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:03:50 np0005588920 python3.9[105112]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:03:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:50.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:50.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:50 np0005588920 python3.9[105264]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:03:51 np0005588920 python3.9[105343]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:03:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:03:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:03:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:52.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:03:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:03:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:52.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:03:52 np0005588920 python3.9[105545]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:03:53 np0005588920 python3.9[105698]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:03:53 np0005588920 python3.9[105776]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:03:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:54.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:03:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:54.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:03:54 np0005588920 python3.9[105928]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:03:55 np0005588920 python3.9[106007]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:03:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:56.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:56.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:56 np0005588920 python3.9[106159]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:03:56 np0005588920 systemd[1]: Reloading.
Jan 20 09:03:56 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:03:56 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:03:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:03:57 np0005588920 python3.9[106351]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:03:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:58.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:03:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:03:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:58.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:03:58 np0005588920 python3.9[106429]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:03:59 np0005588920 python3.9[106582]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:03:59 np0005588920 python3.9[106660]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:04:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:04:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:00.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:04:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:04:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:00.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:04:00 np0005588920 python3.9[106812]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:04:00 np0005588920 systemd[1]: Reloading.
Jan 20 09:04:00 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:04:00 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:04:01 np0005588920 systemd[1]: Starting Create netns directory...
Jan 20 09:04:01 np0005588920 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 20 09:04:01 np0005588920 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 20 09:04:01 np0005588920 systemd[1]: Finished Create netns directory.
Jan 20 09:04:01 np0005588920 python3.9[107006]: ansible-ansible.builtin.service_facts Invoked
Jan 20 09:04:02 np0005588920 network[107023]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 20 09:04:02 np0005588920 network[107024]: 'network-scripts' will be removed from distribution in near future.
Jan 20 09:04:02 np0005588920 network[107025]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 20 09:04:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:04:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:04:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:02.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:04:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:04:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:02.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:04:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:04:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:04.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:04:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:04:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:04.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:04:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:04:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:06.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:04:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:04:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:06.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:04:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:04:07 np0005588920 podman[107326]: 2026-01-20 14:04:07.879956731 +0000 UTC m=+0.063840104 container exec 6d4eaf8659f13902200468a4ae46f22a0824452b9353ff1491898f5d3b0130c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mon-compute-2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 20 09:04:08 np0005588920 podman[107326]: 2026-01-20 14:04:08.006643818 +0000 UTC m=+0.190527181 container exec_died 6d4eaf8659f13902200468a4ae46f22a0824452b9353ff1491898f5d3b0130c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mon-compute-2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 20 09:04:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:04:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:04:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:08.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:04:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:04:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:08.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:08 np0005588920 podman[107484]: 2026-01-20 14:04:08.701397235 +0000 UTC m=+0.074412370 container exec c2bc03da81fcaa831434668625c7c49511adebaebe9f2026fa9fc7c3d873fb54 (image=quay.io/ceph/haproxy:2.3, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-2-cuokcs)
Jan 20 09:04:08 np0005588920 podman[107484]: 2026-01-20 14:04:08.71334761 +0000 UTC m=+0.086362745 container exec_died c2bc03da81fcaa831434668625c7c49511adebaebe9f2026fa9fc7c3d873fb54 (image=quay.io/ceph/haproxy:2.3, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-2-cuokcs)
Jan 20 09:04:09 np0005588920 podman[107604]: 2026-01-20 14:04:09.070468428 +0000 UTC m=+0.083764773 container exec 4f78941c670aecb446a8bef016d44f9adf56eddc01c85813d46adbff35b13482 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-2-dleeql, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, release=1793, description=keepalived for Ceph, io.openshift.tags=Ceph keepalived, build-date=2023-02-22T09:23:20, io.openshift.expose-services=, name=keepalived, architecture=x86_64, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=keepalived-container, summary=Provides keepalived on RHEL 9 for Ceph., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.28.2, version=2.2.4)
Jan 20 09:04:09 np0005588920 podman[107604]: 2026-01-20 14:04:09.087549027 +0000 UTC m=+0.100845362 container exec_died 4f78941c670aecb446a8bef016d44f9adf56eddc01c85813d46adbff35b13482 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-2-dleeql, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=keepalived-container, summary=Provides keepalived on RHEL 9 for Ceph., architecture=x86_64, io.openshift.expose-services=, description=keepalived for Ceph, io.buildah.version=1.28.2, vcs-type=git, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, distribution-scope=public, release=1793, vendor=Red Hat, Inc., build-date=2023-02-22T09:23:20, io.k8s.display-name=Keepalived on RHEL 9)
Jan 20 09:04:09 np0005588920 python3.9[107785]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:04:10 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:04:10 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:04:10 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:04:10 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:04:10 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:04:10 np0005588920 python3.9[107909]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/etc/ssh/sshd_config _original_basename=sshd_config_block.j2 recurse=False state=file path=/etc/ssh/sshd_config force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:04:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:04:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:10.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:04:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:10.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:11 np0005588920 python3.9[108074]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:04:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:04:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:04:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:04:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:12.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:04:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:04:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:04:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:12.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:04:12 np0005588920 python3.9[108275]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:04:12 np0005588920 python3.9[108354]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/var/lib/edpm-config/firewall/sshd-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/sshd-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:04:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:04:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:14.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:04:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:14.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:14 np0005588920 python3.9[108507]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 20 09:04:14 np0005588920 systemd[1]: Starting Time & Date Service...
Jan 20 09:04:14 np0005588920 systemd[1]: Started Time & Date Service.
Jan 20 09:04:15 np0005588920 python3.9[108664]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:04:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:04:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:16.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:04:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:04:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:16.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:04:16 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:04:16 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:04:16 np0005588920 python3.9[108816]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:04:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:04:17 np0005588920 python3.9[108945]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:04:18 np0005588920 python3.9[109097]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:04:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:04:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:18.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:04:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:18.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:18 np0005588920 python3.9[109175]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.d48hwjny recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:04:19 np0005588920 python3.9[109328]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:04:19 np0005588920 python3.9[109406]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:04:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:04:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:04:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:20.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:04:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 09:04:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:20 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:20.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:20 np0005588920 python3.9[109558]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:04:22 np0005588920 python3[109712]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 20 09:04:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:04:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 09:04:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:04:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:04:22 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:22.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:04:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:04:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:22.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:04:23 np0005588920 python3.9[109864]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:04:23 np0005588920 python3.9[109943]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:04:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:04:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 09:04:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:24.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:24 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:24.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:24 np0005588920 python3.9[110095]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:04:25 np0005588920 python3.9[110220]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917863.7805557-902-49123509931368/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:04:26 np0005588920 python3.9[110373]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:04:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:04:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 09:04:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:26.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:04:26 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:26.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:04:26 np0005588920 python3.9[110451]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:04:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:04:27 np0005588920 python3.9[110604]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:04:27 np0005588920 python3.9[110682]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:04:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:04:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 09:04:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:28.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:28 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:28.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:28 np0005588920 python3.9[110834]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:04:29 np0005588920 python3.9[110913]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-rules.nft _original_basename=ruleset.j2 recurse=False state=file path=/etc/nftables/edpm-rules.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:04:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:04:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:30.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:04:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:04:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:30.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:04:30 np0005588920 python3.9[111065]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:04:31 np0005588920 python3.9[111221]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:04:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:04:32 np0005588920 python3.9[111373]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:04:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:04:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:32.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:04:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:32.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:32 np0005588920 python3.9[111575]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:04:33 np0005588920 python3.9[111728]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 20 09:04:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:04:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:04:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:34.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:04:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:04:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:34.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:34 np0005588920 python3.9[111880]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 20 09:04:35 np0005588920 systemd[1]: session-39.scope: Deactivated successfully.
Jan 20 09:04:35 np0005588920 systemd[1]: session-39.scope: Consumed 36.274s CPU time.
Jan 20 09:04:35 np0005588920 systemd-logind[783]: Session 39 logged out. Waiting for processes to exit.
Jan 20 09:04:35 np0005588920 systemd-logind[783]: Removed session 39.
Jan 20 09:04:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:04:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:04:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:36.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:04:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:04:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:04:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:36.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:04:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:04:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:04:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 09:04:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:38 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:38.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:38.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:04:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:40.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:04:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:40.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:40 np0005588920 systemd-logind[783]: New session 40 of user zuul.
Jan 20 09:04:40 np0005588920 systemd[1]: Started Session 40 of User zuul.
Jan 20 09:04:41 np0005588920 python3.9[112064]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Jan 20 09:04:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:04:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:04:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:04:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:42.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:04:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 09:04:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:04:42 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:42.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:04:42 np0005588920 python3.9[112216]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:04:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:04:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 09:04:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:44 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:44.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.002000056s ======
Jan 20 09:04:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:44.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000056s
Jan 20 09:04:44 np0005588920 python3.9[112371]: ansible-ansible.builtin.slurp Invoked with src=/etc/ssh/ssh_known_hosts
Jan 20 09:04:44 np0005588920 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 20 09:04:45 np0005588920 python3.9[112527]: ansible-ansible.legacy.stat Invoked with path=/tmp/ansible.tlayz19d follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:04:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:04:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:46.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:04:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:04:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:46.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:04:46 np0005588920 python3.9[112652]: ansible-ansible.legacy.copy Invoked with dest=/tmp/ansible.tlayz19d mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768917884.622171-109-197453733897319/.source.tlayz19d _original_basename=.mhixoxt7 follow=False checksum=309fed797bdebad351617b1a1ea9eb224966ee92 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:04:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:04:47 np0005588920 python3.9[112805]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 09:04:48 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0.
Jan 20 09:04:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:04:48.077650) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 09:04:48 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19
Jan 20 09:04:48 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917888077796, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 873, "num_deletes": 250, "total_data_size": 1772558, "memory_usage": 1798896, "flush_reason": "Manual Compaction"}
Jan 20 09:04:48 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started
Jan 20 09:04:48 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917888184555, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 757928, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 9659, "largest_seqno": 10527, "table_properties": {"data_size": 754473, "index_size": 1235, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1157, "raw_key_size": 8879, "raw_average_key_size": 20, "raw_value_size": 747181, "raw_average_value_size": 1690, "num_data_blocks": 54, "num_entries": 442, "num_filter_entries": 442, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917823, "oldest_key_time": 1768917823, "file_creation_time": 1768917888, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:04:48 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 106978 microseconds, and 5362 cpu microseconds.
Jan 20 09:04:48 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:04:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:04:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:04:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:48.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:04:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:04:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:48.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:04:48.184642) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 757928 bytes OK
Jan 20 09:04:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:04:48.184666) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started
Jan 20 09:04:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:04:48.308813) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done
Jan 20 09:04:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:04:48.309003) EVENT_LOG_v1 {"time_micros": 1768917888308987, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 09:04:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:04:48.309050) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 09:04:48 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 1768093, prev total WAL file size 1768093, number of live WAL files 2.
Jan 20 09:04:48 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:04:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:04:48.310881) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740030' seq:72057594037927935, type:22 .. '6D67727374617400323531' seq:0, type:0; will stop at (end)
Jan 20 09:04:48 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 09:04:48 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(740KB)], [18(9266KB)]
Jan 20 09:04:48 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917888310970, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 10246460, "oldest_snapshot_seqno": -1}
Jan 20 09:04:48 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 3752 keys, 7632986 bytes, temperature: kUnknown
Jan 20 09:04:48 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917888503169, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 7632986, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 7605238, "index_size": 17270, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9413, "raw_key_size": 91230, "raw_average_key_size": 24, "raw_value_size": 7534732, "raw_average_value_size": 2008, "num_data_blocks": 754, "num_entries": 3752, "num_filter_entries": 3752, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768917888, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:04:48 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:04:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:04:48.503564) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 7632986 bytes
Jan 20 09:04:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:04:48.520870) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 53.3 rd, 39.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.7, 9.0 +0.0 blob) out(7.3 +0.0 blob), read-write-amplify(23.6) write-amplify(10.1) OK, records in: 4240, records dropped: 488 output_compression: NoCompression
Jan 20 09:04:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:04:48.520925) EVENT_LOG_v1 {"time_micros": 1768917888520902, "job": 8, "event": "compaction_finished", "compaction_time_micros": 192348, "compaction_time_cpu_micros": 24141, "output_level": 6, "num_output_files": 1, "total_output_size": 7632986, "num_input_records": 4240, "num_output_records": 3752, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 09:04:48 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:04:48 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917888521424, "job": 8, "event": "table_file_deletion", "file_number": 20}
Jan 20 09:04:48 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:04:48 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917888524821, "job": 8, "event": "table_file_deletion", "file_number": 18}
Jan 20 09:04:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:04:48.310741) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:04:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:04:48.524950) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:04:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:04:48.524961) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:04:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:04:48.524964) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:04:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:04:48.524967) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:04:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:04:48.524972) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:04:48 np0005588920 python3.9[112957]: ansible-ansible.builtin.blockinfile Invoked with block=compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDrCUasX8PhlctvvIb2eE6+Z0hELmfczQ6UoBD+mPtCobptr/s786JmwJ3D8nIoKhlCLVSmhRfbqf1Pm45RUPTEtSuaa6HBDy40dZhTXU34X4KbGfKmur2bp9S/1w83ArKvI8inSqqk2qoMx1l7ECkEgeT+GbFwKfYLnbq5OV4Ms3tzl/uFUC/Xzxs2dbXlhozQiSamcO/a6EObErTvR8PrtaOoLFtTiD/I+oN+rkdBPkBc6r0qT4jS7nU1FOlT96meSZHE7Q1n8pxcy9PEc8w9hFdd1Zj8/WcGIdeEJsekuouK1Lut/sofQLZHyUMWJTcnBjx8BsjGx9NjUHPYUWIw+DZo7lT2QurAPNnaX4rp9ciGV2Bdm3ylNoOu3izNvM1JGTw3xRyYrmyxyWv3Euc35JXa0w07Xrqr+6Ckih0WTLU6q3Rlnrc/grpDC821sHrsljerHipJVOCbZB39LvV6wDDBlqfYZzfqID3dIqlVli4eL12J0K7jr7QAlPRhNf0=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOG07miJwhzuA/nm0wvGIorydl2xbBiiDhE7PypnJ/jC#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKiJpWtps/bRsuEHfak4zDuqPHKOWFLaEA2h86H7tPlrZHR8okAVZWCmY7keO3Ad1DFyffUtJPKv5OvTK91xGO8=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC/dEYtIJ/delwiq9xMMctU8myoGU/TKMiFUM+i3BSaGKrC0rujad6qo1LAtjth5aYbBcgBhxy0UEX0oCruQQgc5qDpPmWHmJiAwdQJaDu6GxTRl3PlXF2u4rd0Rz72DAMuCxPSYedeHU91uL4vlrcD95xONWew2wa9lUuqQWdgj8DtqnB9T895BihDk9vFLXAaoGJcYZVGKJmXR8sOzNTFQxefqstVO0/dfbRUyFd0Ukp5v7rTmLxw0Np5WcGMOg9l/iRzWTopxnTRvXpBoGlFCmzNvTG2uH08dJ4FU5Wk9/iSxonuiVJu9DKs8Tp4EajaA4Y6cEuZiMhhqi7vw6zVCQuCmRBpny6Ub1Ag2CesMYgxwOVJO5cHsKh3BzuPFsh1gMgrrZK7v+qfm2r1rhHlPsCWrcnrtUIZa7gyzdFvHytTh/4uyGMgNpbwxkyCxgSN4PleQy2wvxy/DFW+JxCDzI4jK9LFH5aojzEhUtj+P3E7CXL/wRPxDJdfEU6PhTk=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEa1zL0TUD00vr72wZq3y4rgtSnctWBvs+gME/0/EAsV#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 
AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI2WwWe4rQW0CaFwcmci1J5n144T87fcxCH+Y2CVZd5XQ7Cvzlhh1cGNDX81Tng3KgxvKOuz3mdiSCLqx8noiD0=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC/3N9PJZXpat0uFh2x2RoV9B0Ih74HU9CPf+g/5HncM7gCVvCpW3CBde1qNDRU2iY9rzpOVPwzi4YzoAUcxB5KAiqZOI9ylmzfiD8JXQ+myLmIRLxHOdXFaEQ4mMp4W+X37hCZ6sdfm6Yqd6eqBuZrM/72ltYoewWBNCG/Hgqzu30L9WC4+BF+iADHT7Qnmvh/cc9U71WxB4h2ikBo1SdGoFCqoez7ajitqx+dw7VWaOtEPliS0LZuDtN3Zt/cBBgxhb/FaAEI3jRP2ej9X0NJW91YxzBygyxiVasslx92g/GmnDFOWVZb5ai/JJsNH6pLTjs25IzvnuWIf8/ZLgZ03zziR4mBLP12CIVF8g1CzaqK1IILDKkjS/dzDiTBefmiQ2+N0i5EEXOgmxchqOqTkFPQg/ar0+0uBPkwzAI0HDk99czhyYHFlO+PhnULVkL1z+XLwHBgOrbNNVQQcJCvady4Gadh66mu1UrLpryNYOgZiugZi67Biha4ZPzPHok=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF41dx3BXAuEvQwQNtbUM7rIrbaOLr5CRvYNdDD+UMr9#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBENFrTpm22/xEaEJMzd7C5WyJttJdK+HK5kxP8/NuvvAQSlLtEulBZnvD/OX5hk3/sDYhPQelj3YsNX1Plw5PJQ=#012 create=True mode=0644 path=/tmp/ansible.tlayz19d state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:04:49 np0005588920 python3.9[113110]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.tlayz19d' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:04:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 09:04:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:04:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:50 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:50.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:50.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:50 np0005588920 python3.9[113264]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.tlayz19d state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:04:51 np0005588920 systemd[1]: session-40.scope: Deactivated successfully.
Jan 20 09:04:51 np0005588920 systemd[1]: session-40.scope: Consumed 6.068s CPU time.
Jan 20 09:04:51 np0005588920 systemd-logind[783]: Session 40 logged out. Waiting for processes to exit.
Jan 20 09:04:51 np0005588920 systemd-logind[783]: Removed session 40.
Jan 20 09:04:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:04:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 09:04:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:04:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:52 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:52.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:04:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:52.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:04:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:04:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:54.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:04:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:04:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:54.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:04:56 np0005588920 systemd-logind[783]: New session 41 of user zuul.
Jan 20 09:04:56 np0005588920 systemd[1]: Started Session 41 of User zuul.
Jan 20 09:04:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:04:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:56.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:04:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:04:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:56.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:04:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:04:57 np0005588920 python3.9[113495]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 09:04:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:04:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:04:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:58.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:04:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:04:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:58.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:58 np0005588920 python3.9[113652]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 20 09:04:59 np0005588920 python3.9[113807]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 20 09:05:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:05:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:00.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:05:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:05:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:00.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:05:00 np0005588920 python3.9[113960]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:05:01 np0005588920 python3.9[114114]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:05:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:05:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:05:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:02.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:05:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:05:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:02.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:05:02 np0005588920 python3.9[114266]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:05:03 np0005588920 systemd-logind[783]: Session 41 logged out. Waiting for processes to exit.
Jan 20 09:05:03 np0005588920 systemd[1]: session-41.scope: Deactivated successfully.
Jan 20 09:05:03 np0005588920 systemd[1]: session-41.scope: Consumed 4.534s CPU time.
Jan 20 09:05:03 np0005588920 systemd-logind[783]: Removed session 41.
Jan 20 09:05:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:05:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:04.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:05:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:04.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:06.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:06.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:05:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:08.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:05:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:08.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:05:08 np0005588920 systemd-logind[783]: New session 42 of user zuul.
Jan 20 09:05:08 np0005588920 systemd[1]: Started Session 42 of User zuul.
Jan 20 09:05:09 np0005588920 python3.9[114448]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 09:05:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:10.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:05:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:10.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:05:10 np0005588920 python3.9[114604]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 20 09:05:11 np0005588920 python3.9[114689]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 20 09:05:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:05:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:12.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:05:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:12.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:05:13 np0005588920 python3.9[114891]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:05:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:05:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:14.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:05:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:05:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:14.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:05:15 np0005588920 python3.9[115043]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 20 09:05:16 np0005588920 python3.9[115193]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:05:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:16.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:16.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:16 np0005588920 python3.9[115343]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:05:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:05:17 np0005588920 systemd[1]: session-42.scope: Deactivated successfully.
Jan 20 09:05:17 np0005588920 systemd[1]: session-42.scope: Consumed 6.194s CPU time.
Jan 20 09:05:17 np0005588920 systemd-logind[783]: Session 42 logged out. Waiting for processes to exit.
Jan 20 09:05:17 np0005588920 systemd-logind[783]: Removed session 42.
Jan 20 09:05:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:05:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:18.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:05:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:05:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:18.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:05:18 np0005588920 podman[115542]: 2026-01-20 14:05:18.307928114 +0000 UTC m=+0.097882318 container exec 6d4eaf8659f13902200468a4ae46f22a0824452b9353ff1491898f5d3b0130c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mon-compute-2, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Jan 20 09:05:18 np0005588920 podman[115542]: 2026-01-20 14:05:18.423023014 +0000 UTC m=+0.212977218 container exec_died 6d4eaf8659f13902200468a4ae46f22a0824452b9353ff1491898f5d3b0130c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mon-compute-2, CEPH_REF=reef, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 20 09:05:19 np0005588920 podman[115701]: 2026-01-20 14:05:19.369143885 +0000 UTC m=+0.084254165 container exec c2bc03da81fcaa831434668625c7c49511adebaebe9f2026fa9fc7c3d873fb54 (image=quay.io/ceph/haproxy:2.3, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-2-cuokcs)
Jan 20 09:05:19 np0005588920 podman[115701]: 2026-01-20 14:05:19.383664363 +0000 UTC m=+0.098774593 container exec_died c2bc03da81fcaa831434668625c7c49511adebaebe9f2026fa9fc7c3d873fb54 (image=quay.io/ceph/haproxy:2.3, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-2-cuokcs)
Jan 20 09:05:19 np0005588920 podman[115766]: 2026-01-20 14:05:19.673983636 +0000 UTC m=+0.071006424 container exec 4f78941c670aecb446a8bef016d44f9adf56eddc01c85813d46adbff35b13482 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-2-dleeql, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=keepalived-container, description=keepalived for Ceph, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=Ceph keepalived, io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, build-date=2023-02-22T09:23:20, version=2.2.4, vcs-type=git, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, vendor=Red Hat, Inc., name=keepalived)
Jan 20 09:05:19 np0005588920 podman[115766]: 2026-01-20 14:05:19.693620431 +0000 UTC m=+0.090643199 container exec_died 4f78941c670aecb446a8bef016d44f9adf56eddc01c85813d46adbff35b13482 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-2-dleeql, io.k8s.display-name=Keepalived on RHEL 9, architecture=x86_64, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, name=keepalived, build-date=2023-02-22T09:23:20, version=2.2.4, vcs-type=git, com.redhat.component=keepalived-container, release=1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, distribution-scope=public, io.openshift.tags=Ceph keepalived, description=keepalived for Ceph, io.openshift.expose-services=)
Jan 20 09:05:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:05:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:20.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:05:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:20.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:20 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:05:20 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:05:21 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:05:21 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:05:21 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:05:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:05:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:22.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:05:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:22.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:05:23 np0005588920 systemd-logind[783]: New session 43 of user zuul.
Jan 20 09:05:23 np0005588920 systemd[1]: Started Session 43 of User zuul.
Jan 20 09:05:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:05:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:24.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:05:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:24.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:24 np0005588920 python3.9[116085]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 09:05:26 np0005588920 python3.9[116242]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:05:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:05:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:26.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:05:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:05:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:26.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:05:26 np0005588920 python3.9[116394]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:05:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:05:27 np0005588920 python3.9[116597]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:05:27 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:05:27 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:05:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:28.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:28.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:28 np0005588920 python3.9[116720]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917927.118267-161-198501354519544/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=0914f98eec0b702cb4c053a66ad838cc6c30a920 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:05:29 np0005588920 python3.9[116873]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:05:29 np0005588920 python3.9[116996]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917928.6931489-161-65837528993/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=3f5ad343c2ed5cd826e6179427db625573e3eee3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:05:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:30.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:30.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:30 np0005588920 python3.9[117148]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:05:31 np0005588920 python3.9[117272]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917930.122355-161-166024013127511/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=24fb49672634657839507b0fd8864737c3bdbedd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:05:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:05:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:32.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:32 np0005588920 python3.9[117424]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:05:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:32.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:32 np0005588920 python3.9[117626]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:05:33 np0005588920 python3.9[117779]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:05:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:05:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:34.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:05:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:34.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:34 np0005588920 python3.9[117904]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917933.2348475-342-265266401559222/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=5ef23575b8ebe67d1e356e824a7b816dbf37b908 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:05:35 np0005588920 python3.9[118056]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:05:35 np0005588920 python3.9[118180]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917934.6391375-342-149641145986720/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=dbd41a175def1218d1038733ac1d1fb38abc7be7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:05:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:36.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:36.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:36 np0005588920 python3.9[118332]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:05:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:05:37 np0005588920 python3.9[118455]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917935.8526175-342-279592463491439/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=2388e64b64dfe00c52d031ef0c00e67b3f8427f2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:05:37 np0005588920 python3.9[118608]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:05:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:38.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:38.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:38 np0005588920 python3.9[118760]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:05:39 np0005588920 python3.9[118913]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:05:40 np0005588920 python3.9[119036]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917938.8941076-527-152313981265615/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=131f75b5fb571a6003e1f021e0a8c0e05e844eaf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:05:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:40.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:40.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:40 np0005588920 python3.9[119188]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:05:41 np0005588920 python3.9[119312]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917940.3544157-527-63259128191777/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=dbd41a175def1218d1038733ac1d1fb38abc7be7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:05:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:05:42 np0005588920 python3.9[119464]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:05:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:05:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 09:05:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:42.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:05:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:42 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:42.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:42 np0005588920 python3.9[119587]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917941.6874993-527-77760232473501/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=a4350f7ef2e49c81c147dbb049aa6134986ee2f2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:05:44 np0005588920 python3.9[119740]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:05:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:05:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:44.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:05:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:44.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:44 np0005588920 python3.9[119892]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:05:45 np0005588920 python3.9[120016]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917944.3318033-735-166942988531299/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=a9ac548cf1fa241f1d1335913ca73d2a10501b24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:05:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:05:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:46.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:05:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:46.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:46 np0005588920 python3.9[120168]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:05:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:05:47 np0005588920 python3.9[120321]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:05:47 np0005588920 python3.9[120444]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917946.7054741-810-155472580871931/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=a9ac548cf1fa241f1d1335913ca73d2a10501b24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:05:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:05:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:48.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:05:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:48.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:48 np0005588920 python3.9[120596]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:05:49 np0005588920 python3.9[120749]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:05:50 np0005588920 python3.9[120872]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917948.8617866-879-131919058309652/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=a9ac548cf1fa241f1d1335913ca73d2a10501b24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:05:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:50.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:50.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:50 np0005588920 python3.9[121024]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:05:51 np0005588920 python3.9[121177]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:05:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:05:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:05:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:52.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:05:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:52.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:52 np0005588920 python3.9[121300]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917951.1841605-951-10100588302086/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=a9ac548cf1fa241f1d1335913ca73d2a10501b24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:05:53 np0005588920 python3.9[121503]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:05:54 np0005588920 python3.9[121655]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:05:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:54.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:54.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:54 np0005588920 python3.9[121778]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917953.5057528-1025-5250623715099/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=a9ac548cf1fa241f1d1335913ca73d2a10501b24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:05:55 np0005588920 python3.9[121931]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:05:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:05:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:56.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:05:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:05:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:56.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:05:56 np0005588920 python3.9[122083]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:05:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:05:57 np0005588920 python3.9[122207]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917955.8913097-1087-102025312785939/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=a9ac548cf1fa241f1d1335913ca73d2a10501b24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:05:57 np0005588920 systemd[1]: session-43.scope: Deactivated successfully.
Jan 20 09:05:57 np0005588920 systemd[1]: session-43.scope: Consumed 26.864s CPU time.
Jan 20 09:05:57 np0005588920 systemd-logind[783]: Session 43 logged out. Waiting for processes to exit.
Jan 20 09:05:57 np0005588920 systemd-logind[783]: Removed session 43.
Jan 20 09:05:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:05:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:58.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:05:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:05:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:58.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:06:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:00.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:06:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:00.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:06:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:02.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:02.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:02 np0005588920 systemd-logind[783]: New session 44 of user zuul.
Jan 20 09:06:02 np0005588920 systemd[1]: Started Session 44 of User zuul.
Jan 20 09:06:03 np0005588920 python3.9[122391]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:06:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:06:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:04.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:06:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:06:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:04.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:06:04 np0005588920 python3.9[122543]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:06:05 np0005588920 python3.9[122667]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768917964.1052063-64-21231641307588/.source.conf _original_basename=ceph.conf follow=False checksum=906e2ddae7738a5e2d5bcdd5b659f6884e758b17 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:06:06 np0005588920 python3.9[122819]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:06:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:06.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:06.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:06 np0005588920 python3.9[122942]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1768917965.6900527-64-130507892946783/.source.keyring _original_basename=ceph.client.openstack.keyring follow=False checksum=ddae6cb53c02baaa87ed0e28941db377a2638775 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:06:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:06:07 np0005588920 systemd[1]: session-44.scope: Deactivated successfully.
Jan 20 09:06:07 np0005588920 systemd[1]: session-44.scope: Consumed 3.340s CPU time.
Jan 20 09:06:07 np0005588920 systemd-logind[783]: Session 44 logged out. Waiting for processes to exit.
Jan 20 09:06:07 np0005588920 systemd-logind[783]: Removed session 44.
Jan 20 09:06:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:08.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:08.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:10.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:06:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:10.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:06:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:06:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:06:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:12.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:06:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:12.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:13 np0005588920 systemd-logind[783]: New session 45 of user zuul.
Jan 20 09:06:13 np0005588920 systemd[1]: Started Session 45 of User zuul.
Jan 20 09:06:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:14.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:06:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:14.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:06:14 np0005588920 python3.9[123174]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 09:06:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:06:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:16.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:06:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:16.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:16 np0005588920 python3.9[123331]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:06:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:06:17 np0005588920 python3.9[123484]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:06:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:06:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:18.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:06:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:06:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:18.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:06:18 np0005588920 python3.9[123634]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 09:06:19 np0005588920 python3.9[123787]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 20 09:06:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:20.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:06:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:20.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:06:21 np0005588920 dbus-broker-launch[768]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Jan 20 09:06:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:06:22 np0005588920 python3.9[123944]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 20 09:06:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:22.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:22.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:23 np0005588920 python3.9[124029]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 20 09:06:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:06:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:24.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:06:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:06:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:24.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:06:25 np0005588920 python3.9[124183]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 20 09:06:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:26.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:26.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:26 np0005588920 python3[124338]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks#012  rule:#012    proto: udp#012    dport: 4789#012- rule_name: 119 neutron geneve networks#012  rule:#012    proto: udp#012    dport: 6081#012    state: ["UNTRACKED"]#012- rule_name: 120 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: OUTPUT#012    jump: NOTRACK#012    action: append#012    state: []#012- rule_name: 121 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: PREROUTING#012    jump: NOTRACK#012    action: append#012    state: []#012 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Jan 20 09:06:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:06:27 np0005588920 python3.9[124591]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:06:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:28.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:28.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:28 np0005588920 python3.9[124774]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:06:29 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:06:29 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:06:29 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:06:29 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:06:29 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:06:29 np0005588920 python3.9[124853]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:06:30 np0005588920 python3.9[125005]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:06:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:06:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:30.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:06:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:30.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:30 np0005588920 python3.9[125083]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.mpromsoy recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:06:31 np0005588920 python3.9[125236]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:06:32 np0005588920 python3.9[125314]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:06:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:06:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:06:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:32.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:06:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:32.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:33 np0005588920 python3.9[125496]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:06:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:06:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:34.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:06:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:34.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:34 np0005588920 python3[125670]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 20 09:06:35 np0005588920 python3.9[125823]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:06:36 np0005588920 python3.9[125998]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917994.8852441-434-174614402806960/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:06:36 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:06:36 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:06:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:36.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:06:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:36.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:06:37 np0005588920 python3.9[126150]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:06:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:06:37 np0005588920 python3.9[126276]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917996.5069804-479-264814382768921/.source.nft follow=False _original_basename=jump-chain.j2 checksum=ac8dea350c18f51f54d48dacc09613cda4c5540c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:06:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:38.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:38.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:38 np0005588920 python3.9[126428]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:06:39 np0005588920 python3.9[126554]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917998.225214-523-130359612462154/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:06:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:40.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:40.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:40 np0005588920 python3.9[126706]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:06:41 np0005588920 python3.9[126832]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917999.798222-568-31378354224874/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:06:42 np0005588920 python3.9[126984]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:06:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:06:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:42.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:42.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:42 np0005588920 python3.9[127109]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768918001.4276135-613-273227700704193/.source.nft follow=False _original_basename=ruleset.j2 checksum=bdba38546f86123f1927359d89789bd211aba99d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:06:43 np0005588920 python3.9[127262]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:06:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:44.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:44.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:44 np0005588920 python3.9[127414]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:06:45 np0005588920 python3.9[127570]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:06:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:06:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:46.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:06:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:46.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:46 np0005588920 python3.9[127722]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:06:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:06:47 np0005588920 python3.9[127876]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:06:48 np0005588920 python3.9[128030]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:06:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:48.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:06:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:48.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:06:49 np0005588920 python3.9[128185]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:06:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:50.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:06:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:50.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:06:50 np0005588920 python3.9[128336]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 09:06:52 np0005588920 python3.9[128490]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-2.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:2e:0a:8d:1d:08:09" external_ids:ovn-encap-ip=172.19.0.102 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:06:52 np0005588920 ovs-vsctl[128491]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-2.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:2e:0a:8d:1d:08:09 external_ids:ovn-encap-ip=172.19.0.102 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Jan 20 09:06:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:06:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:52.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:52.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:52 np0005588920 python3.9[128643]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:06:53 np0005588920 python3.9[128807]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:06:53 np0005588920 ovs-vsctl[128850]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Jan 20 09:06:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:54.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:06:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:54.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:06:54 np0005588920 python3.9[129000]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:06:55 np0005588920 python3.9[129155]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:06:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:06:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:56.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:06:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:56.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:56 np0005588920 python3.9[129307]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:06:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:06:57 np0005588920 python3.9[129386]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:06:57 np0005588920 python3.9[129538]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:06:58 np0005588920 python3.9[129616]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:06:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:06:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:58.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:06:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:06:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:58.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:59 np0005588920 python3.9[129769]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:07:00 np0005588920 python3.9[129921]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:07:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:00.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:00.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:00 np0005588920 python3.9[129999]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:07:01 np0005588920 python3.9[130152]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:07:02 np0005588920 python3.9[130230]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:07:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:07:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:02.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:02.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:02 np0005588920 python3.9[130382]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:07:02 np0005588920 systemd[1]: Reloading.
Jan 20 09:07:03 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:07:03 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:07:04 np0005588920 python3.9[130573]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:07:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:04.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:07:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:04.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:07:04 np0005588920 python3.9[130651]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:07:05 np0005588920 python3.9[130804]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:07:06 np0005588920 python3.9[130882]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:07:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:06.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:06.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:07 np0005588920 python3.9[131034]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:07:07 np0005588920 systemd[1]: Reloading.
Jan 20 09:07:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:07:07 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:07:07 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:07:07 np0005588920 systemd[1]: Starting Create netns directory...
Jan 20 09:07:07 np0005588920 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 20 09:07:07 np0005588920 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 20 09:07:07 np0005588920 systemd[1]: Finished Create netns directory.
Jan 20 09:07:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:07:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:08.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:07:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:07:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:08.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:07:08 np0005588920 python3.9[131229]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:07:09 np0005588920 python3.9[131382]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:07:10 np0005588920 python3.9[131505]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768918028.7598526-1366-203755411823739/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:07:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:10.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:10.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:11 np0005588920 python3.9[131658]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:07:12 np0005588920 python3.9[131810]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:07:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:07:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:12.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:12.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:12 np0005588920 python3.9[131962]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:07:13 np0005588920 python3.9[132086]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1768918032.2981057-1465-247678893377523/.source.json _original_basename=.zs6tkqft follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:07:14 np0005588920 python3.9[132286]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:07:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:07:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:14.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:07:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:14.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:16.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:16.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:17 np0005588920 python3.9[132710]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Jan 20 09:07:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:07:18 np0005588920 python3.9[132863]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 20 09:07:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:18.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:18.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:19 np0005588920 python3[133016]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json containers=['ovn_controller'] log_base_path=/var/log/containers/stdouts debug=False
Jan 20 09:07:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:20.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:20.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:07:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:22.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:22.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:24.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:24.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:25 np0005588920 podman[133029]: 2026-01-20 14:07:25.364128701 +0000 UTC m=+5.576329077 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 20 09:07:25 np0005588920 podman[133154]: 2026-01-20 14:07:25.538399448 +0000 UTC m=+0.072749214 container create 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 20 09:07:25 np0005588920 podman[133154]: 2026-01-20 14:07:25.502487639 +0000 UTC m=+0.036837475 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 20 09:07:25 np0005588920 python3[133016]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499 --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 20 09:07:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:07:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:26.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:07:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:07:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:26.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:07:26 np0005588920 python3.9[133344]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:07:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:07:27 np0005588920 python3.9[133499]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:07:28 np0005588920 python3.9[133575]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:07:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:07:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:28.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:07:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:28.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:29 np0005588920 python3.9[133726]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1768918048.2042556-1699-107355917339328/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:07:29 np0005588920 python3.9[133803]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 20 09:07:29 np0005588920 systemd[1]: Reloading.
Jan 20 09:07:29 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:07:29 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:07:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:30.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:30.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:30 np0005588920 python3.9[133914]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:07:30 np0005588920 systemd[1]: Reloading.
Jan 20 09:07:30 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:07:30 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:07:31 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0.
Jan 20 09:07:31 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:07:31.052759) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 09:07:31 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22
Jan 20 09:07:31 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918051052877, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 1710, "num_deletes": 251, "total_data_size": 4217162, "memory_usage": 4271728, "flush_reason": "Manual Compaction"}
Jan 20 09:07:31 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started
Jan 20 09:07:31 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918051075367, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 2763657, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 10532, "largest_seqno": 12237, "table_properties": {"data_size": 2756564, "index_size": 4164, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1861, "raw_key_size": 13986, "raw_average_key_size": 19, "raw_value_size": 2742491, "raw_average_value_size": 3793, "num_data_blocks": 188, "num_entries": 723, "num_filter_entries": 723, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917888, "oldest_key_time": 1768917888, "file_creation_time": 1768918051, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:07:31 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 22683 microseconds, and 12629 cpu microseconds.
Jan 20 09:07:31 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:07:31 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:07:31.075439) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 2763657 bytes OK
Jan 20 09:07:31 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:07:31.075468) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started
Jan 20 09:07:31 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:07:31.077572) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done
Jan 20 09:07:31 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:07:31.077596) EVENT_LOG_v1 {"time_micros": 1768918051077589, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 09:07:31 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:07:31.077617) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 09:07:31 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 4209477, prev total WAL file size 4209477, number of live WAL files 2.
Jan 20 09:07:31 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:07:31 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:07:31.079584) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300353032' seq:72057594037927935, type:22 .. '7061786F7300373534' seq:0, type:0; will stop at (end)
Jan 20 09:07:31 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 09:07:31 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(2698KB)], [21(7454KB)]
Jan 20 09:07:31 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918051079674, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 10396643, "oldest_snapshot_seqno": -1}
Jan 20 09:07:31 np0005588920 systemd[1]: Starting ovn_controller container...
Jan 20 09:07:31 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 3958 keys, 8245210 bytes, temperature: kUnknown
Jan 20 09:07:31 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918051173280, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 8245210, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 8216333, "index_size": 17887, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9925, "raw_key_size": 96148, "raw_average_key_size": 24, "raw_value_size": 8142356, "raw_average_value_size": 2057, "num_data_blocks": 773, "num_entries": 3958, "num_filter_entries": 3958, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768918051, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:07:31 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:07:31 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:07:31.173559) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 8245210 bytes
Jan 20 09:07:31 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:07:31.175319) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 110.9 rd, 88.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.6, 7.3 +0.0 blob) out(7.9 +0.0 blob), read-write-amplify(6.7) write-amplify(3.0) OK, records in: 4475, records dropped: 517 output_compression: NoCompression
Jan 20 09:07:31 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:07:31.175345) EVENT_LOG_v1 {"time_micros": 1768918051175334, "job": 10, "event": "compaction_finished", "compaction_time_micros": 93718, "compaction_time_cpu_micros": 30453, "output_level": 6, "num_output_files": 1, "total_output_size": 8245210, "num_input_records": 4475, "num_output_records": 3958, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 09:07:31 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:07:31 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918051176020, "job": 10, "event": "table_file_deletion", "file_number": 23}
Jan 20 09:07:31 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:07:31 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918051177743, "job": 10, "event": "table_file_deletion", "file_number": 21}
Jan 20 09:07:31 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:07:31.079482) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:07:31 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:07:31.177841) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:07:31 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:07:31.177850) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:07:31 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:07:31.177854) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:07:31 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:07:31.177859) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:07:31 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:07:31.177863) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:07:31 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:07:31 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e6b32a5abd51cb7f2ebf46104cc78621f7e142c9b1c3707eb0f3df714914229f/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Jan 20 09:07:31 np0005588920 systemd[1]: Started /usr/bin/podman healthcheck run 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc.
Jan 20 09:07:31 np0005588920 podman[133956]: 2026-01-20 14:07:31.303341491 +0000 UTC m=+0.158618233 container init 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 20 09:07:31 np0005588920 ovn_controller[133971]: + sudo -E kolla_set_configs
Jan 20 09:07:31 np0005588920 podman[133956]: 2026-01-20 14:07:31.332667896 +0000 UTC m=+0.187944618 container start 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Jan 20 09:07:31 np0005588920 edpm-start-podman-container[133956]: ovn_controller
Jan 20 09:07:31 np0005588920 systemd[1]: Created slice User Slice of UID 0.
Jan 20 09:07:31 np0005588920 systemd[1]: Starting User Runtime Directory /run/user/0...
Jan 20 09:07:31 np0005588920 systemd[1]: Finished User Runtime Directory /run/user/0.
Jan 20 09:07:31 np0005588920 systemd[1]: Starting User Manager for UID 0...
Jan 20 09:07:31 np0005588920 edpm-start-podman-container[133955]: Creating additional drop-in dependency for "ovn_controller" (29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc)
Jan 20 09:07:31 np0005588920 podman[133978]: 2026-01-20 14:07:31.431263428 +0000 UTC m=+0.077628680 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:07:31 np0005588920 systemd[1]: 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc-4c2c3af401970184.service: Main process exited, code=exited, status=1/FAILURE
Jan 20 09:07:31 np0005588920 systemd[1]: 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc-4c2c3af401970184.service: Failed with result 'exit-code'.
Jan 20 09:07:31 np0005588920 systemd[1]: Reloading.
Jan 20 09:07:31 np0005588920 systemd[134003]: Queued start job for default target Main User Target.
Jan 20 09:07:31 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:07:31 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:07:31 np0005588920 systemd[134003]: Created slice User Application Slice.
Jan 20 09:07:31 np0005588920 systemd[134003]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Jan 20 09:07:31 np0005588920 systemd[134003]: Started Daily Cleanup of User's Temporary Directories.
Jan 20 09:07:31 np0005588920 systemd[134003]: Reached target Paths.
Jan 20 09:07:31 np0005588920 systemd[134003]: Reached target Timers.
Jan 20 09:07:31 np0005588920 systemd[134003]: Starting D-Bus User Message Bus Socket...
Jan 20 09:07:31 np0005588920 systemd[134003]: Starting Create User's Volatile Files and Directories...
Jan 20 09:07:31 np0005588920 systemd[134003]: Listening on D-Bus User Message Bus Socket.
Jan 20 09:07:31 np0005588920 systemd[134003]: Reached target Sockets.
Jan 20 09:07:31 np0005588920 systemd[134003]: Finished Create User's Volatile Files and Directories.
Jan 20 09:07:31 np0005588920 systemd[134003]: Reached target Basic System.
Jan 20 09:07:31 np0005588920 systemd[134003]: Reached target Main User Target.
Jan 20 09:07:31 np0005588920 systemd[134003]: Startup finished in 149ms.
Jan 20 09:07:31 np0005588920 systemd[1]: Started User Manager for UID 0.
Jan 20 09:07:31 np0005588920 systemd[1]: Started ovn_controller container.
Jan 20 09:07:31 np0005588920 systemd[1]: Started Session c1 of User root.
Jan 20 09:07:31 np0005588920 ovn_controller[133971]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 20 09:07:31 np0005588920 ovn_controller[133971]: INFO:__main__:Validating config file
Jan 20 09:07:31 np0005588920 ovn_controller[133971]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 20 09:07:31 np0005588920 ovn_controller[133971]: INFO:__main__:Writing out command to execute
Jan 20 09:07:31 np0005588920 systemd[1]: session-c1.scope: Deactivated successfully.
Jan 20 09:07:31 np0005588920 ovn_controller[133971]: ++ cat /run_command
Jan 20 09:07:31 np0005588920 ovn_controller[133971]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 20 09:07:31 np0005588920 ovn_controller[133971]: + ARGS=
Jan 20 09:07:31 np0005588920 ovn_controller[133971]: + sudo kolla_copy_cacerts
Jan 20 09:07:31 np0005588920 systemd[1]: Started Session c2 of User root.
Jan 20 09:07:31 np0005588920 systemd[1]: session-c2.scope: Deactivated successfully.
Jan 20 09:07:31 np0005588920 ovn_controller[133971]: + [[ ! -n '' ]]
Jan 20 09:07:31 np0005588920 ovn_controller[133971]: + . kolla_extend_start
Jan 20 09:07:31 np0005588920 ovn_controller[133971]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 20 09:07:31 np0005588920 ovn_controller[133971]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Jan 20 09:07:31 np0005588920 ovn_controller[133971]: + umask 0022
Jan 20 09:07:31 np0005588920 ovn_controller[133971]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Jan 20 09:07:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:07:31Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 20 09:07:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:07:31Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 20 09:07:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:07:31Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Jan 20 09:07:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:07:31Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Jan 20 09:07:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:07:31Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 20 09:07:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:07:31Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Jan 20 09:07:31 np0005588920 NetworkManager[49076]: <info>  [1768918051.8918] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/16)
Jan 20 09:07:31 np0005588920 NetworkManager[49076]: <info>  [1768918051.8928] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 20 09:07:31 np0005588920 kernel: br-int: entered promiscuous mode
Jan 20 09:07:31 np0005588920 NetworkManager[49076]: <warn>  [1768918051.8931] device (br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 20 09:07:31 np0005588920 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 20 09:07:31 np0005588920 NetworkManager[49076]: <info>  [1768918051.8941] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Jan 20 09:07:31 np0005588920 NetworkManager[49076]: <info>  [1768918051.8946] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/18)
Jan 20 09:07:31 np0005588920 NetworkManager[49076]: <info>  [1768918051.8950] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 20 09:07:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:07:31Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 20 09:07:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:07:31Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 20 09:07:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:07:31Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 20 09:07:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:07:31Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Jan 20 09:07:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:07:31Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Jan 20 09:07:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:07:31Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Jan 20 09:07:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:07:31Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 20 09:07:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:07:31Z|00014|main|INFO|OVS feature set changed, force recompute.
Jan 20 09:07:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:07:31Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 20 09:07:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:07:31Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 20 09:07:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:07:31Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 20 09:07:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:07:31Z|00018|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 20 09:07:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:07:31Z|00019|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 20 09:07:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:07:31Z|00020|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Jan 20 09:07:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:07:31Z|00021|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Jan 20 09:07:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:07:31Z|00022|main|INFO|OVS feature set changed, force recompute.
Jan 20 09:07:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:07:31Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Jan 20 09:07:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:07:31Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Jan 20 09:07:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:07:31Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 20 09:07:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:07:31Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 20 09:07:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:07:31Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 20 09:07:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:07:31Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 20 09:07:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:07:31Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 20 09:07:31 np0005588920 NetworkManager[49076]: <info>  [1768918051.9283] manager: (ovn-920572-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Jan 20 09:07:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:07:31Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 20 09:07:31 np0005588920 NetworkManager[49076]: <info>  [1768918051.9290] manager: (ovn-367c1a-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Jan 20 09:07:31 np0005588920 NetworkManager[49076]: <info>  [1768918051.9295] manager: (ovn-5ffd4a-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/21)
Jan 20 09:07:31 np0005588920 systemd-udevd[134106]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:07:31 np0005588920 systemd-udevd[134109]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:07:31 np0005588920 kernel: genev_sys_6081: entered promiscuous mode
Jan 20 09:07:31 np0005588920 NetworkManager[49076]: <info>  [1768918051.9507] device (genev_sys_6081): carrier: link connected
Jan 20 09:07:31 np0005588920 NetworkManager[49076]: <info>  [1768918051.9510] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/22)
Jan 20 09:07:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:07:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:32.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:32.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:32 np0005588920 python3.9[134237]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 20 09:07:34 np0005588920 python3.9[134390]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:07:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:07:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:34.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:07:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:34.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:34 np0005588920 python3.9[134563]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768918053.5887656-1835-96764961790160/.source.yaml _original_basename=.phoofkiw follow=False checksum=aedaf657c77fb1feab67c7335f83a0d24eed0971 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:07:35 np0005588920 python3.9[134716]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:07:35 np0005588920 ovs-vsctl[134717]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Jan 20 09:07:36 np0005588920 python3.9[134984]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:07:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:36.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:36 np0005588920 ovs-vsctl[135017]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Jan 20 09:07:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:36.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:36 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:07:36 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:07:36 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:07:36 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:07:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:07:37 np0005588920 python3.9[135278]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:07:37 np0005588920 ovs-vsctl[135279]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Jan 20 09:07:37 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 20 09:07:37 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 20 09:07:38 np0005588920 systemd[1]: session-45.scope: Deactivated successfully.
Jan 20 09:07:38 np0005588920 systemd[1]: session-45.scope: Consumed 1min 4.877s CPU time.
Jan 20 09:07:38 np0005588920 systemd-logind[783]: Session 45 logged out. Waiting for processes to exit.
Jan 20 09:07:38 np0005588920 systemd-logind[783]: Removed session 45.
Jan 20 09:07:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:38.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:38.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:38 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]': finished
Jan 20 09:07:38 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:07:38 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:07:38 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:07:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:07:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:40.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:07:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:07:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:40.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:07:41 np0005588920 systemd[1]: Stopping User Manager for UID 0...
Jan 20 09:07:41 np0005588920 systemd[134003]: Activating special unit Exit the Session...
Jan 20 09:07:41 np0005588920 systemd[134003]: Stopped target Main User Target.
Jan 20 09:07:41 np0005588920 systemd[134003]: Stopped target Basic System.
Jan 20 09:07:41 np0005588920 systemd[134003]: Stopped target Paths.
Jan 20 09:07:41 np0005588920 systemd[134003]: Stopped target Sockets.
Jan 20 09:07:41 np0005588920 systemd[134003]: Stopped target Timers.
Jan 20 09:07:41 np0005588920 systemd[134003]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 20 09:07:41 np0005588920 systemd[134003]: Closed D-Bus User Message Bus Socket.
Jan 20 09:07:41 np0005588920 systemd[134003]: Stopped Create User's Volatile Files and Directories.
Jan 20 09:07:41 np0005588920 systemd[134003]: Removed slice User Application Slice.
Jan 20 09:07:41 np0005588920 systemd[134003]: Reached target Shutdown.
Jan 20 09:07:41 np0005588920 systemd[134003]: Finished Exit the Session.
Jan 20 09:07:41 np0005588920 systemd[134003]: Reached target Exit the Session.
Jan 20 09:07:41 np0005588920 systemd[1]: user@0.service: Deactivated successfully.
Jan 20 09:07:41 np0005588920 systemd[1]: Stopped User Manager for UID 0.
Jan 20 09:07:41 np0005588920 systemd[1]: Stopping User Runtime Directory /run/user/0...
Jan 20 09:07:41 np0005588920 systemd[1]: run-user-0.mount: Deactivated successfully.
Jan 20 09:07:41 np0005588920 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Jan 20 09:07:41 np0005588920 systemd[1]: Stopped User Runtime Directory /run/user/0.
Jan 20 09:07:41 np0005588920 systemd[1]: Removed slice User Slice of UID 0.
Jan 20 09:07:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:07:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:42.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:42.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:07:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:44.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:07:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:44.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:44 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:07:44 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:07:44 np0005588920 systemd-logind[783]: New session 47 of user zuul.
Jan 20 09:07:45 np0005588920 systemd[1]: Started Session 47 of User zuul.
Jan 20 09:07:46 np0005588920 python3.9[135514]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 09:07:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:46.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:46.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:07:47 np0005588920 python3.9[135671]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/openstack/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:07:48 np0005588920 python3.9[135823]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:07:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:48.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:48.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:49 np0005588920 python3.9[135976]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:07:50 np0005588920 python3.9[136128]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:07:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:07:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:50.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:07:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:50.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:50 np0005588920 python3.9[136281]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:07:51 np0005588920 python3.9[136432]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 09:07:52 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 09:07:52 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.0 total, 600.0 interval#012Cumulative writes: 2177 writes, 12K keys, 2177 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.04 MB/s#012Cumulative WAL: 2177 writes, 2177 syncs, 1.00 writes per sync, written: 0.02 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2177 writes, 12K keys, 2177 commit groups, 1.0 writes per commit group, ingest: 23.64 MB, 0.04 MB/s#012Interval WAL: 2177 writes, 2177 syncs, 1.00 writes per sync, written: 0.02 GB, 0.04 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     49.6      0.28              0.06         5    0.056       0      0       0.0       0.0#012  L6      1/0    7.86 MB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   2.3     77.7     65.1      0.48              0.13         4    0.121     16K   1786       0.0       0.0#012 Sum      1/0    7.86 MB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   3.3     49.1     59.4      0.77              0.19         9    0.085     16K   1786       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   3.3     49.3     59.5      0.76              0.19         8    0.096     16K   1786       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0     77.7     65.1      0.48              0.13         4    0.121     16K   1786       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     49.9      0.28              0.06         4    0.070       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.0 total, 600.0 interval#012Flush(GB): cumulative 0.014, interval 0.014#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.04 GB write, 0.08 MB/s write, 0.04 GB read, 0.06 MB/s read, 0.8 seconds#012Interval compaction: 0.04 GB write, 0.08 MB/s write, 0.04 GB read, 0.06 MB/s read, 0.8 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x564a2f9711f0#2 capacity: 308.00 MB usage: 1.29 MB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 0 last_secs: 5.6e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(62,1.13 MB,0.365681%) FilterBlock(9,53.86 KB,0.017077%) IndexBlock(9,112.48 KB,0.0356649%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 20 09:07:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:07:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:52.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:52.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:52 np0005588920 python3.9[136584]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 20 09:07:54 np0005588920 python3.9[136735]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:07:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:54.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:07:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:54.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:07:54 np0005588920 python3.9[136906]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768918073.619843-220-118645865113910/.source follow=False _original_basename=haproxy.j2 checksum=a5072e7b19ca96a1f495d94f97f31903737cfd27 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:07:55 np0005588920 python3.9[137057]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:07:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:56.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:56 np0005588920 python3.9[137178]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768918075.3012698-265-22123203938546/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:07:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:07:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:56.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:07:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:07:57 np0005588920 python3.9[137331]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 20 09:07:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:58.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:58 np0005588920 python3.9[137415]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 20 09:07:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:07:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:07:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:58.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:08:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:08:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:00.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:08:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:08:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:00.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:08:01 np0005588920 python3.9[137570]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 20 09:08:02 np0005588920 ovn_controller[133971]: 2026-01-20T14:08:02Z|00025|memory|INFO|17280 kB peak resident set size after 30.1 seconds
Jan 20 09:08:02 np0005588920 ovn_controller[133971]: 2026-01-20T14:08:02Z|00026|memory|INFO|idl-cells-OVN_Southbound:273 idl-cells-Open_vSwitch:642 ofctrl_desired_flow_usage-KB:7 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:3
Jan 20 09:08:02 np0005588920 podman[137680]: 2026-01-20 14:08:02.085668547 +0000 UTC m=+0.164211998 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Jan 20 09:08:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:08:02 np0005588920 python3.9[137744]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:08:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:02.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:08:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:02.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:08:02 np0005588920 python3.9[137871]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768918081.6473813-377-187001456390941/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:08:03 np0005588920 python3.9[138022]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:08:04 np0005588920 python3.9[138143]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768918083.0475848-377-147043026841665/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:08:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:04.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:08:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:04.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:08:05 np0005588920 python3.9[138294]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:08:06 np0005588920 python3.9[138415]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768918085.2344506-508-65316232774344/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:08:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:06.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:06.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:08:07 np0005588920 python3.9[138566]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:08:07 np0005588920 python3.9[138687]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768918086.5839334-508-199567269337433/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:08:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:08:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:08.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:08:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:08.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:08 np0005588920 python3.9[138837]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:08:09 np0005588920 python3.9[138992]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:08:10 np0005588920 python3.9[139144]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:08:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:10.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:10.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:10 np0005588920 python3.9[139222]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:08:11 np0005588920 python3.9[139375]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:08:11 np0005588920 python3.9[139453]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:08:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:08:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:12.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:12.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:12 np0005588920 python3.9[139605]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:08:13 np0005588920 python3.9[139758]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:08:14 np0005588920 python3.9[139836]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:08:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000024s ======
Jan 20 09:08:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:14.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 20 09:08:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:14.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:14 np0005588920 python3.9[140038]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:08:15 np0005588920 python3.9[140117]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:08:16 np0005588920 python3.9[140269]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:08:16 np0005588920 systemd[1]: Reloading.
Jan 20 09:08:16 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:08:16 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:08:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:16.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000024s ======
Jan 20 09:08:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:16.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 20 09:08:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:08:17 np0005588920 python3.9[140459]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:08:18 np0005588920 python3.9[140537]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:08:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000024s ======
Jan 20 09:08:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:18.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 20 09:08:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000024s ======
Jan 20 09:08:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:18.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 20 09:08:18 np0005588920 python3.9[140689]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:08:19 np0005588920 python3.9[140768]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:08:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:20.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:20.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:20 np0005588920 python3.9[140920]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:08:20 np0005588920 systemd[1]: Reloading.
Jan 20 09:08:20 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:08:20 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:08:20 np0005588920 systemd[1]: Starting Create netns directory...
Jan 20 09:08:20 np0005588920 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 20 09:08:20 np0005588920 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 20 09:08:20 np0005588920 systemd[1]: Finished Create netns directory.
Jan 20 09:08:21 np0005588920 python3.9[141116]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:08:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:08:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000024s ======
Jan 20 09:08:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:22.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 20 09:08:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:22.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:22 np0005588920 python3.9[141268]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:08:23 np0005588920 python3.9[141392]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768918102.2696187-961-181632132788120/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:08:23 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 09:08:23 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.1 total, 600.0 interval#012Cumulative writes: 4964 writes, 21K keys, 4964 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s#012Cumulative WAL: 4964 writes, 706 syncs, 7.03 writes per sync, written: 0.02 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 4964 writes, 21K keys, 4964 commit groups, 1.0 writes per commit group, ingest: 18.04 MB, 0.03 MB/s#012Interval WAL: 4964 writes, 706 syncs, 7.03 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x562f882274b0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 5.2e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x562f882274b0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 5.2e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_s
Jan 20 09:08:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:24.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:24.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:24 np0005588920 python3.9[141544]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:08:25 np0005588920 python3.9[141697]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:08:26 np0005588920 python3.9[141849]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:08:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:26.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000024s ======
Jan 20 09:08:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:26.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 20 09:08:26 np0005588920 python3.9[141972]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1768918105.744313-1060-234702126466491/.source.json _original_basename=.xibosmyp follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:08:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:08:27 np0005588920 python3.9[142123]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:08:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:28.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000024s ======
Jan 20 09:08:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:28.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 20 09:08:30 np0005588920 python3.9[142547]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Jan 20 09:08:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:30.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:30.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:31 np0005588920 python3.9[142700]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 20 09:08:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:08:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:32.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:32.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:32 np0005588920 podman[142824]: 2026-01-20 14:08:32.531303905 +0000 UTC m=+0.108743767 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:08:32 np0005588920 python3[142858]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json containers=['ovn_metadata_agent'] log_base_path=/var/log/containers/stdouts debug=False
Jan 20 09:08:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000023s ======
Jan 20 09:08:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:34.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 20 09:08:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:34.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:36.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:36.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:08:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000023s ======
Jan 20 09:08:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:38.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 20 09:08:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:38.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:40.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:40.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:41 np0005588920 podman[142893]: 2026-01-20 14:08:41.169726299 +0000 UTC m=+8.402458343 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:08:41 np0005588920 podman[143079]: 2026-01-20 14:08:41.39049111 +0000 UTC m=+0.089259054 container create 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 20 09:08:41 np0005588920 podman[143079]: 2026-01-20 14:08:41.347089508 +0000 UTC m=+0.045857452 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:08:41 np0005588920 python3[142858]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:08:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:08:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000023s ======
Jan 20 09:08:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:42.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 20 09:08:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:42.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:44.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000024s ======
Jan 20 09:08:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:44.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 20 09:08:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:46.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000024s ======
Jan 20 09:08:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:46.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 20 09:08:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:08:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:48.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:48.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:48 np0005588920 radosgw[83324]: INFO: RGWReshardLock::lock found lock on reshard.0000000000 to be held by another RGW process; skipping for now
Jan 20 09:08:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000024s ======
Jan 20 09:08:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:50.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 20 09:08:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:50.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:51 np0005588920 radosgw[83324]: INFO: RGWReshardLock::lock found lock on reshard.0000000002 to be held by another RGW process; skipping for now
Jan 20 09:08:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:08:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:52.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:52.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:54 np0005588920 radosgw[83324]: INFO: RGWReshardLock::lock found lock on reshard.0000000003 to be held by another RGW process; skipping for now
Jan 20 09:08:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000024s ======
Jan 20 09:08:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:54.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 20 09:08:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000023s ======
Jan 20 09:08:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:54.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 20 09:08:54 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 20 09:08:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:56.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:56.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:08:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000024s ======
Jan 20 09:08:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:58.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 20 09:08:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:08:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:58.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:59 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:08:59 np0005588920 radosgw[83324]: INFO: RGWReshardLock::lock found lock on reshard.0000000011 to be held by another RGW process; skipping for now
Jan 20 09:08:59 np0005588920 radosgw[83324]: INFO: RGWReshardLock::lock found lock on reshard.0000000015 to be held by another RGW process; skipping for now
Jan 20 09:09:00 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:09:00 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:09:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:09:00.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:00.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:09:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000024s ======
Jan 20 09:09:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:09:02.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 20 09:09:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:02.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:03 np0005588920 podman[143336]: 2026-01-20 14:09:03.062722451 +0000 UTC m=+0.139254933 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, 
org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 20 09:09:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:09:04.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000024s ======
Jan 20 09:09:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:04.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 20 09:09:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:09:06.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:06.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:09:07 np0005588920 python3.9[143493]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:09:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000024s ======
Jan 20 09:09:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:09:08.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 20 09:09:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:08.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:09 np0005588920 python3.9[143648]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:09:10 np0005588920 python3.9[143724]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:09:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:09:10.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:10.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:11 np0005588920 python3.9[143876]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1768918150.2823334-1294-97529106875727/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:09:11 np0005588920 python3.9[143952]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 20 09:09:11 np0005588920 systemd[1]: Reloading.
Jan 20 09:09:12 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:09:12 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:09:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:09:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:09:12.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:12.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:13 np0005588920 python3.9[144064]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:09:13 np0005588920 systemd[1]: Reloading.
Jan 20 09:09:14 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:09:14 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:09:14 np0005588920 systemd[1]: Starting ovn_metadata_agent container...
Jan 20 09:09:14 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:09:14 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/45d9b1f8e8b0d993d23bb873cf37dab307eeb1396759c5eed6052d9c6ad67ccc/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Jan 20 09:09:14 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/45d9b1f8e8b0d993d23bb873cf37dab307eeb1396759c5eed6052d9c6ad67ccc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:09:14 np0005588920 systemd[1]: Started /usr/bin/podman healthcheck run 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882.
Jan 20 09:09:14 np0005588920 podman[144108]: 2026-01-20 14:09:14.514700787 +0000 UTC m=+0.199444020 container init 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 20 09:09:14 np0005588920 ovn_metadata_agent[144123]: + sudo -E kolla_set_configs
Jan 20 09:09:14 np0005588920 podman[144108]: 2026-01-20 14:09:14.547779147 +0000 UTC m=+0.232522300 container start 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent)
Jan 20 09:09:14 np0005588920 edpm-start-podman-container[144108]: ovn_metadata_agent
Jan 20 09:09:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:09:14.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:14.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:14 np0005588920 ovn_metadata_agent[144123]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 20 09:09:14 np0005588920 ovn_metadata_agent[144123]: INFO:__main__:Validating config file
Jan 20 09:09:14 np0005588920 ovn_metadata_agent[144123]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 20 09:09:14 np0005588920 ovn_metadata_agent[144123]: INFO:__main__:Copying service configuration files
Jan 20 09:09:14 np0005588920 ovn_metadata_agent[144123]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Jan 20 09:09:14 np0005588920 ovn_metadata_agent[144123]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Jan 20 09:09:14 np0005588920 ovn_metadata_agent[144123]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Jan 20 09:09:14 np0005588920 ovn_metadata_agent[144123]: INFO:__main__:Writing out command to execute
Jan 20 09:09:14 np0005588920 ovn_metadata_agent[144123]: INFO:__main__:Setting permission for /var/lib/neutron
Jan 20 09:09:14 np0005588920 ovn_metadata_agent[144123]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Jan 20 09:09:14 np0005588920 ovn_metadata_agent[144123]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Jan 20 09:09:14 np0005588920 ovn_metadata_agent[144123]: INFO:__main__:Setting permission for /var/lib/neutron/external
Jan 20 09:09:14 np0005588920 ovn_metadata_agent[144123]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Jan 20 09:09:14 np0005588920 ovn_metadata_agent[144123]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Jan 20 09:09:14 np0005588920 ovn_metadata_agent[144123]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Jan 20 09:09:14 np0005588920 edpm-start-podman-container[144107]: Creating additional drop-in dependency for "ovn_metadata_agent" (04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882)
Jan 20 09:09:14 np0005588920 ovn_metadata_agent[144123]: ++ cat /run_command
Jan 20 09:09:14 np0005588920 ovn_metadata_agent[144123]: + CMD=neutron-ovn-metadata-agent
Jan 20 09:09:14 np0005588920 ovn_metadata_agent[144123]: + ARGS=
Jan 20 09:09:14 np0005588920 ovn_metadata_agent[144123]: + sudo kolla_copy_cacerts
Jan 20 09:09:14 np0005588920 systemd[1]: Reloading.
Jan 20 09:09:14 np0005588920 podman[144129]: 2026-01-20 14:09:14.655980097 +0000 UTC m=+0.091390044 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:09:14 np0005588920 ovn_metadata_agent[144123]: Running command: 'neutron-ovn-metadata-agent'
Jan 20 09:09:14 np0005588920 ovn_metadata_agent[144123]: + [[ ! -n '' ]]
Jan 20 09:09:14 np0005588920 ovn_metadata_agent[144123]: + . kolla_extend_start
Jan 20 09:09:14 np0005588920 ovn_metadata_agent[144123]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Jan 20 09:09:14 np0005588920 ovn_metadata_agent[144123]: + umask 0022
Jan 20 09:09:14 np0005588920 ovn_metadata_agent[144123]: + exec neutron-ovn-metadata-agent
Jan 20 09:09:14 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:09:14 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:09:14 np0005588920 systemd[1]: Started ovn_metadata_agent container.
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.377 144128 INFO neutron.common.config [-] Logging enabled!#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.377 144128 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.377 144128 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.377 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.377 144128 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.378 144128 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.378 144128 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.378 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.378 144128 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.378 144128 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.378 144128 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.378 144128 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.378 144128 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.378 144128 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.378 144128 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.379 144128 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.379 144128 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.379 144128 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.379 144128 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.379 144128 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.379 144128 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.379 144128 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.379 144128 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.379 144128 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.379 144128 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.380 144128 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.380 144128 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.380 144128 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.380 144128 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.380 144128 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.380 144128 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.380 144128 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.380 144128 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.380 144128 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.380 144128 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.381 144128 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.381 144128 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.381 144128 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.381 144128 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.381 144128 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.381 144128 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.381 144128 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.381 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.381 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.381 144128 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.382 144128 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.382 144128 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.382 144128 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.382 144128 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.382 144128 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.382 144128 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.382 144128 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.382 144128 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.382 144128 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.382 144128 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.382 144128 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.383 144128 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.383 144128 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.383 144128 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.383 144128 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.383 144128 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.383 144128 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.383 144128 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.383 144128 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.383 144128 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.384 144128 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.384 144128 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.384 144128 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.384 144128 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.384 144128 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.384 144128 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.384 144128 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.384 144128 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.384 144128 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.384 144128 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.385 144128 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.385 144128 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.385 144128 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.385 144128 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.385 144128 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.385 144128 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.385 144128 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.385 144128 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.385 144128 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.385 144128 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.385 144128 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.386 144128 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.386 144128 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.386 144128 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.386 144128 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.386 144128 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.386 144128 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.386 144128 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.386 144128 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.386 144128 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.386 144128 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.386 144128 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.387 144128 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.387 144128 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.387 144128 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.387 144128 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.387 144128 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.387 144128 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.387 144128 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.387 144128 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.387 144128 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.387 144128 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.387 144128 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.388 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.388 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.388 144128 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.388 144128 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.388 144128 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.388 144128 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.388 144128 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.388 144128 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.388 144128 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.388 144128 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.389 144128 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.389 144128 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.389 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.389 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.389 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.389 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.389 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.389 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.389 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.389 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.390 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.390 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.390 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.390 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.390 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.390 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.390 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.390 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.390 144128 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.391 144128 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.391 144128 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.391 144128 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.391 144128 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.391 144128 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.391 144128 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.391 144128 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.391 144128 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.391 144128 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.391 144128 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.392 144128 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.392 144128 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.392 144128 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.392 144128 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.392 144128 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.392 144128 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.392 144128 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.392 144128 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.392 144128 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.392 144128 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.393 144128 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.393 144128 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.393 144128 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.393 144128 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.393 144128 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.393 144128 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.393 144128 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.393 144128 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.393 144128 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.393 144128 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.394 144128 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.394 144128 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.394 144128 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.394 144128 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.394 144128 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.394 144128 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.394 144128 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.394 144128 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.394 144128 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.394 144128 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.395 144128 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.395 144128 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.395 144128 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.395 144128 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.395 144128 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.395 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.395 144128 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.395 144128 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.395 144128 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.395 144128 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.396 144128 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.396 144128 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.396 144128 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.396 144128 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.396 144128 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.396 144128 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.396 144128 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.396 144128 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.396 144128 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.397 144128 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.397 144128 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.397 144128 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.397 144128 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.397 144128 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.397 144128 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.397 144128 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.397 144128 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.397 144128 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.397 144128 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.398 144128 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.398 144128 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.398 144128 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.398 144128 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.398 144128 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.398 144128 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.398 144128 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.398 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.398 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.398 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.399 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.399 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.399 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.399 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.399 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.399 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.399 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.399 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.399 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.399 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.400 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.400 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.400 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.400 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.400 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.400 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.400 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.400 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.400 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.400 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.401 144128 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.401 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.401 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.401 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.401 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.401 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.401 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.401 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.401 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.401 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.402 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.402 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.402 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.402 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.402 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.402 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.402 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.402 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.402 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.402 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.403 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.403 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.403 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.403 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.403 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.403 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.403 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.403 144128 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.403 144128 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.403 144128 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.404 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.404 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.404 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.404 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.404 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.404 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.404 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.404 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.404 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.405 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.405 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.405 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.405 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.405 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.405 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.405 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.405 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.405 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.405 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.406 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.406 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.406 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.406 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.406 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.406 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.406 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.406 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.406 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.406 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.407 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.407 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.407 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.407 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.407 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.407 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.407 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.407 144128 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.407 144128 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.415 144128 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.415 144128 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.416 144128 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.416 144128 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.416 144128 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.429 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 7c9bfe4c-7684-437c-a64a-33562743d048 (UUID: 7c9bfe4c-7684-437c-a64a-33562743d048) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.454 144128 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.454 144128 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.454 144128 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.454 144128 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.457 144128 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.462 144128 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.468 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '7c9bfe4c-7684-437c-a64a-33562743d048'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], external_ids={}, name=7c9bfe4c-7684-437c-a64a-33562743d048, nb_cfg_timestamp=1768918059918, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.468 144128 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f79259a2f70>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.469 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.469 144128 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.469 144128 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.470 144128 INFO oslo_service.service [-] Starting 1 workers#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.473 144128 DEBUG oslo_service.service [-] Started child 144287 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.476 144128 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmp8z_stpra/privsep.sock']#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.479 144287 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-361183'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.514 144287 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.515 144287 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.515 144287 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.523 144287 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.535 144287 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 20 09:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:16.544 144287 INFO eventlet.wsgi.server [-] (144287) wsgi starting up on http:/var/lib/neutron/metadata_proxy#033[00m
Jan 20 09:09:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:09:16.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:16.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:17 np0005588920 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Jan 20 09:09:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:09:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:17.221 144128 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Jan 20 09:09:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:17.222 144128 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp8z_stpra/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Jan 20 09:09:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:17.079 144293 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 20 09:09:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:17.086 144293 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 20 09:09:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:17.090 144293 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Jan 20 09:09:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:17.090 144293 INFO oslo.privsep.daemon [-] privsep daemon running as pid 144293
Jan 20 09:09:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:17.226 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[35c300d5-fb2f-4d83-b9a7-7dd94f94c08c]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:09:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:17.691 144293 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:09:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:17.691 144293 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:09:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:17.691 144293 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:09:18 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:18.202 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[cd445623-9879-4008-8e27-3ecae5c342b6]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:09:18 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:18.205 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, column=external_ids, values=({'neutron:ovn-metadata-id': 'bf61acc1-e0aa-5616-834b-73a3fdc188dd'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 09:09:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:09:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:09:18.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:09:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:09:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:18.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:09:18 np0005588920 ceph-mds[83715]: mds.beacon.cephfs.compute-2.jyxktq missed beacon ack from the monitors
Jan 20 09:09:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:09:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:09:20.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:09:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:20.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.217 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.305 144128 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.305 144128 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.305 144128 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.305 144128 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.306 144128 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.306 144128 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.306 144128 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.306 144128 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.306 144128 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.306 144128 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.307 144128 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.307 144128 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.307 144128 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.307 144128 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.307 144128 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.307 144128 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.308 144128 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.308 144128 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.308 144128 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.308 144128 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.308 144128 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.308 144128 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.308 144128 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.309 144128 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.309 144128 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.309 144128 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.309 144128 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.309 144128 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.309 144128 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.310 144128 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.310 144128 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.310 144128 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.310 144128 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.310 144128 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.311 144128 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.311 144128 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.311 144128 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.311 144128 DEBUG oslo_service.service [-] host                           = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.311 144128 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.311 144128 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.312 144128 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.312 144128 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.312 144128 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.312 144128 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.312 144128 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.312 144128 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.312 144128 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.313 144128 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.313 144128 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.313 144128 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.313 144128 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.313 144128 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.313 144128 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.313 144128 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.314 144128 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.314 144128 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.314 144128 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.314 144128 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.314 144128 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.314 144128 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.315 144128 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.315 144128 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.315 144128 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.315 144128 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.315 144128 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.315 144128 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.315 144128 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.316 144128 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.316 144128 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.316 144128 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.316 144128 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.316 144128 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.316 144128 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.316 144128 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.317 144128 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.317 144128 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.317 144128 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.317 144128 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.317 144128 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.317 144128 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.317 144128 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.318 144128 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.318 144128 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.318 144128 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.318 144128 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.318 144128 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.318 144128 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.319 144128 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.319 144128 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.319 144128 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.319 144128 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.319 144128 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.319 144128 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.319 144128 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.319 144128 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.320 144128 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.320 144128 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.320 144128 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.320 144128 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.320 144128 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.320 144128 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.321 144128 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.321 144128 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.321 144128 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.321 144128 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.321 144128 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.321 144128 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.321 144128 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.322 144128 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.322 144128 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.322 144128 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.322 144128 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.322 144128 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.322 144128 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.323 144128 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.323 144128 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.323 144128 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.323 144128 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.323 144128 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.323 144128 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.324 144128 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.324 144128 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.324 144128 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.324 144128 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.324 144128 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.324 144128 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.325 144128 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.325 144128 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.325 144128 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.325 144128 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.325 144128 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.325 144128 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.325 144128 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.326 144128 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.326 144128 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.326 144128 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.326 144128 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.326 144128 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.326 144128 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.327 144128 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.327 144128 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.327 144128 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.327 144128 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.327 144128 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.327 144128 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.327 144128 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.328 144128 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.328 144128 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.328 144128 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.328 144128 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.328 144128 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.329 144128 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.329 144128 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.329 144128 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.329 144128 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.329 144128 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.329 144128 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.330 144128 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.330 144128 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.330 144128 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.330 144128 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.330 144128 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.330 144128 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.331 144128 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.331 144128 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.331 144128 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.331 144128 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.331 144128 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.332 144128 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.332 144128 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.332 144128 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.332 144128 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.332 144128 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.332 144128 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.333 144128 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.333 144128 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.333 144128 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.333 144128 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.333 144128 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.334 144128 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.334 144128 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.334 144128 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.334 144128 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.334 144128 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.334 144128 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.335 144128 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.335 144128 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.335 144128 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.335 144128 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.335 144128 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.335 144128 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.336 144128 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.336 144128 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.336 144128 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.336 144128 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.336 144128 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.336 144128 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.336 144128 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.337 144128 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.337 144128 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.337 144128 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.337 144128 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.337 144128 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.337 144128 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.337 144128 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.337 144128 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.338 144128 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.338 144128 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.338 144128 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.338 144128 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.338 144128 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.338 144128 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.338 144128 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.339 144128 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.339 144128 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.339 144128 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.339 144128 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.339 144128 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.339 144128 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.339 144128 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.340 144128 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.340 144128 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.340 144128 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.340 144128 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.340 144128 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.340 144128 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.340 144128 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.341 144128 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.341 144128 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.341 144128 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.341 144128 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.341 144128 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.341 144128 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.342 144128 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.342 144128 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.342 144128 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.342 144128 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.342 144128 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.343 144128 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.343 144128 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.343 144128 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.343 144128 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.343 144128 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.343 144128 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.344 144128 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.344 144128 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.344 144128 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.344 144128 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.344 144128 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.344 144128 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.344 144128 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.345 144128 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.345 144128 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.345 144128 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.345 144128 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.345 144128 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.345 144128 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.346 144128 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.346 144128 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.346 144128 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.346 144128 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.346 144128 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.346 144128 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.346 144128 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.347 144128 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.347 144128 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.347 144128 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.347 144128 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.347 144128 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.347 144128 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.348 144128 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.348 144128 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.348 144128 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.348 144128 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.348 144128 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.348 144128 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.349 144128 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.349 144128 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.349 144128 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.349 144128 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.349 144128 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.349 144128 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.349 144128 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.350 144128 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.350 144128 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.350 144128 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.350 144128 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.350 144128 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.350 144128 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.350 144128 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.351 144128 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.351 144128 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.351 144128 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.351 144128 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.351 144128 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.351 144128 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.351 144128 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.352 144128 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.352 144128 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.352 144128 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.352 144128 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.352 144128 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.352 144128 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:09:21.353 144128 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Jan 20 09:09:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:09:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:09:22.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:22.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:22 np0005588920 ceph-mds[83715]: mds.beacon.cephfs.compute-2.jyxktq missed beacon ack from the monitors
Jan 20 09:09:23 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).paxos(paxos active c 754..1350) lease_timeout -- calling new election
Jan 20 09:09:23 np0005588920 ceph-mon[77148]: log_channel(cluster) log [INF] : mon.compute-2 calling monitor election
Jan 20 09:09:23 np0005588920 ceph-mon[77148]: paxos.1).electionLogic(14) init, last seen epoch 14
Jan 20 09:09:23 np0005588920 ceph-mon[77148]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 20 09:09:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:09:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:09:24.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:09:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:24.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:26 np0005588920 ceph-mon[77148]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 20 09:09:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:09:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:09:26.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:09:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:26.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:26 np0005588920 ceph-mds[83715]: mds.beacon.cephfs.compute-2.jyxktq missed beacon ack from the monitors
Jan 20 09:09:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:09:27.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:28 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 20 09:09:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:09:28.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:09:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:28.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:09:29 np0005588920 ceph-mon[77148]: host compute-0 `cephadm ceph-volume` failed: Cannot decode JSON: #012Traceback (most recent call last):#012  File "/usr/share/ceph/mgr/cephadm/serve.py", line 1514, in _run_cephadm_json#012    return json.loads(''.join(out))#012  File "/lib64/python3.9/json/__init__.py", line 346, in loads#012    return _default_decoder.decode(s)#012  File "/lib64/python3.9/json/decoder.py", line 337, in decode#012    obj, end = self.raw_decode(s, idx=_w(s, 0).end())#012  File "/lib64/python3.9/json/decoder.py", line 355, in raw_decode#012    raise JSONDecodeError("Expecting value", s, err.value) from None#012json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
Jan 20 09:09:29 np0005588920 ceph-mon[77148]: Failed to apply osd.default_drive_group spec DriveGroupSpec.from_json(yaml.safe_load('''service_type: osd#012service_id: default_drive_group#012service_name: osd.default_drive_group#012placement:#012  hosts:#012  - compute-0#012  - compute-1#012  - compute-2#012spec:#012  data_devices:#012    paths:#012    - /dev/ceph_vg0/ceph_lv0#012  filter_logic: AND#012  objectstore: bluestore#012''')): host compute-0 `cephadm ceph-volume` failed: Cannot decode JSON#012Traceback (most recent call last):#012  File "/usr/share/ceph/mgr/cephadm/serve.py", line 1514, in _run_cephadm_json#012    return json.loads(''.join(out))#012  File "/lib64/python3.9/json/__init__.py", line 346, in loads#012    return _default_decoder.decode(s)#012  File "/lib64/python3.9/json/decoder.py", line 337, in decode#012    obj, end = self.raw_decode(s, idx=_w(s, 0).end())#012  File "/lib64/python3.9/json/decoder.py", line 355, in raw_decode#012    raise JSONDecodeError("Expecting value", s, err.value) from None#012json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)#012#012During handling of the above exception, another exception occurred:#012#012Traceback (most recent call last):#012  File "/usr/share/ceph/mgr/cephadm/serve.py", line 577, in _apply_all_services#012    if self._apply_service(spec):#012  File "/usr/share/ceph/mgr/cephadm/serve.py", line 696, in _apply_service#012    self.mgr.osd_service.create_from_spec(cast(DriveGroupSpec, spec))#012  File "/usr/share/ceph/mgr/cephadm/services/osd.py", line 79, in create_from_spec#012    ret = self.mgr.wait_async(all_hosts())#012  File "/usr/share/ceph/mgr/cephadm/module.py", line 735, in wait_async#012    return self.event_loop.get_result(coro, timeout)#012  File "/usr/share/ceph/mgr/cephadm/ssh.py", line 64, in get_result#012    return future.result(timeout)#012  File "/lib64/python3.9/concurrent/futures/_base.py", line 446, in result#012    return self.__get_result()#012  File "/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result#012    raise self._exception#012  File "/usr/share/ceph/mgr/cephadm/services/osd.py", line 76, in all_hosts#012    return await gather(*futures)#012  File "/usr/share/ceph/mgr/cephadm/services/osd.py", line 63, in create_from_spec_one#012    ret_msg = await self.create_single_host(#012  File "/usr/share/ceph/mgr/cephadm/services/osd.py", line 98, in create_single_host#012    return await self.deploy_osd_daemons_for_existing_osds(host, drive_group,#012  File "/usr/share/ceph/mgr/cephadm/services/osd.py", line 158, in deploy_osd_daemons_for_existing_osds#012    raw_elems: dict = await CephadmServe(self.mgr)._run_cephadm_json(#012  File "/usr/share/ceph/mgr/cephadm/serve.py", line 1518, in _run_cephadm_json#012    raise OrchestratorError(msg)#012orchestrator._interface.OrchestratorError: host compute-0 `cephadm ceph-volume` failed: Cannot decode JSON
Jan 20 09:09:29 np0005588920 ceph-mon[77148]: Deploying daemon haproxy.rgw.default.compute-1.uyeocq on compute-1
Jan 20 09:09:29 np0005588920 ceph-mon[77148]: mon.compute-1 calling monitor election
Jan 20 09:09:29 np0005588920 ceph-mon[77148]: mon.compute-2 calling monitor election
Jan 20 09:09:29 np0005588920 ceph-mon[77148]: mon.compute-0 calling monitor election
Jan 20 09:09:29 np0005588920 ceph-mon[77148]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Jan 20 09:09:29 np0005588920 ceph-mon[77148]: overall HEALTH_OK
Jan 20 09:09:29 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:09:29 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:09:29 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:09:29 np0005588920 ceph-mon[77148]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Jan 20 09:09:29 np0005588920 ceph-mon[77148]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 20 09:09:29 np0005588920 ceph-mon[77148]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Jan 20 09:09:29 np0005588920 ceph-mon[77148]: Deploying daemon keepalived.rgw.default.compute-1.cevitz on compute-1
Jan 20 09:09:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:09:29.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:30 np0005588920 python3.9[144429]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 20 09:09:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:09:30.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:30 np0005588920 ceph-mon[77148]: Health check failed: Failed to apply 1 service(s): osd.default_drive_group (CEPHADM_APPLY_SPEC_FAIL)
Jan 20 09:09:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:30.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:31 np0005588920 python3.9[144582]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:09:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:09:31.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:32 np0005588920 python3.9[144707]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768918170.7991056-1429-196558024514099/.source.yaml _original_basename=.gjir3juj follow=False checksum=cdeb45300f793bd9e5b2caee7d44d83f067a1a60 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:09:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:09:32 np0005588920 systemd-logind[783]: Session 47 logged out. Waiting for processes to exit.
Jan 20 09:09:32 np0005588920 systemd[1]: session-47.scope: Deactivated successfully.
Jan 20 09:09:32 np0005588920 systemd[1]: session-47.scope: Consumed 1min 3.294s CPU time.
Jan 20 09:09:32 np0005588920 systemd-logind[783]: Removed session 47.
Jan 20 09:09:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:09:32.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:32.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:09:33.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:33 np0005588920 podman[144757]: 2026-01-20 14:09:33.726653418 +0000 UTC m=+0.123931860 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 20 09:09:34 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:09:34 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:09:34 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:09:34 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:09:34 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:09:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:09:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:09:34.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:09:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:34.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:09:35.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:09:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:09:36.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:09:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:09:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:36.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:09:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:09:37 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:09:37 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:09:37 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:09:37 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:09:37 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:09:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:09:37.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:38 np0005588920 systemd-logind[783]: New session 48 of user zuul.
Jan 20 09:09:38 np0005588920 systemd[1]: Started Session 48 of User zuul.
Jan 20 09:09:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:09:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:09:38.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:09:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:38.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:38 np0005588920 ceph-mon[77148]: Health check cleared: CEPHADM_APPLY_SPEC_FAIL (was: Failed to apply 1 service(s): osd.default_drive_group)
Jan 20 09:09:38 np0005588920 ceph-mon[77148]: Cluster is now healthy
Jan 20 09:09:39 np0005588920 python3.9[145015]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 09:09:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:09:39.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:09:40.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:40.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:40 np0005588920 python3.9[145171]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:09:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:09:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:09:41.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:09:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:09:42 np0005588920 python3.9[145337]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 20 09:09:42 np0005588920 systemd[1]: Reloading.
Jan 20 09:09:42 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:09:42 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:09:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:09:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:09:42.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:09:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:42.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:43 np0005588920 python3.9[145523]: ansible-ansible.builtin.service_facts Invoked
Jan 20 09:09:43 np0005588920 network[145583]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 20 09:09:43 np0005588920 network[145587]: 'network-scripts' will be removed from distribution in near future.
Jan 20 09:09:43 np0005588920 network[145590]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 20 09:09:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:09:43.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:43 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:09:43 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:09:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.102 - anonymous [20/Jan/2026:14:09:44.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:09:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:44.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:09:44 np0005588920 ceph-mon[77148]: Removing daemon haproxy.rgw.default.compute-2.cuokcs from compute-2 -- ports [8080, 8999]
Jan 20 09:09:44 np0005588920 systemd[1]: Stopping Ceph haproxy.rgw.default.compute-2.cuokcs for e399cf45-e6b6-5393-99f1-75c601d3f188...
Jan 20 09:09:45 np0005588920 podman[145740]: 2026-01-20 14:09:45.036785694 +0000 UTC m=+0.062262944 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:09:45 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-2-cuokcs[84702]: [WARNING] 019/140945 (2) : Exiting Master process...
Jan 20 09:09:45 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-2-cuokcs[84702]: [NOTICE] 019/140945 (2) : haproxy version is 2.3.17-d1c9119
Jan 20 09:09:45 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-2-cuokcs[84702]: [NOTICE] 019/140945 (2) : path to executable is /usr/local/sbin/haproxy
Jan 20 09:09:45 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-2-cuokcs[84702]: [ALERT] 019/140945 (2) : Current worker #1 (4) exited with code 143 (Terminated)
Jan 20 09:09:45 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-2-cuokcs[84702]: [WARNING] 019/140945 (2) : All workers exited. Exiting... (0)
Jan 20 09:09:45 np0005588920 podman[145746]: 2026-01-20 14:09:45.060467395 +0000 UTC m=+0.066086729 container died c2bc03da81fcaa831434668625c7c49511adebaebe9f2026fa9fc7c3d873fb54 (image=quay.io/ceph/haproxy:2.3, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-2-cuokcs)
Jan 20 09:09:45 np0005588920 systemd[1]: var-lib-containers-storage-overlay-1d2750c2b0a0e3b8fd12ff199ed0b87441ae77163d5fee9f3f72c9fcae817670-merged.mount: Deactivated successfully.
Jan 20 09:09:45 np0005588920 podman[145746]: 2026-01-20 14:09:45.124209538 +0000 UTC m=+0.129828882 container remove c2bc03da81fcaa831434668625c7c49511adebaebe9f2026fa9fc7c3d873fb54 (image=quay.io/ceph/haproxy:2.3, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-2-cuokcs)
Jan 20 09:09:45 np0005588920 bash[145746]: ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-2-cuokcs
Jan 20 09:09:45 np0005588920 systemd[1]: ceph-e399cf45-e6b6-5393-99f1-75c601d3f188@haproxy.rgw.default.compute-2.cuokcs.service: Deactivated successfully.
Jan 20 09:09:45 np0005588920 systemd[1]: Stopped Ceph haproxy.rgw.default.compute-2.cuokcs for e399cf45-e6b6-5393-99f1-75c601d3f188.
Jan 20 09:09:45 np0005588920 systemd[1]: ceph-e399cf45-e6b6-5393-99f1-75c601d3f188@haproxy.rgw.default.compute-2.cuokcs.service: Consumed 2.159s CPU time.
Jan 20 09:09:45 np0005588920 systemd[1]: Reloading.
Jan 20 09:09:45 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:09:45 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:09:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:09:45.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:45 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth rm", "entity": "client.ingress.rgw.default.compute-2.cuokcs"}]: dispatch
Jan 20 09:09:45 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:09:46 np0005588920 systemd[1]: Stopping Ceph keepalived.rgw.default.compute-2.dleeql for e399cf45-e6b6-5393-99f1-75c601d3f188...
Jan 20 09:09:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.002000055s ======
Jan 20 09:09:46 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-2-dleeql[85121]: Tue Jan 20 14:09:46 2026: Stopping
Jan 20 09:09:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:46.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000055s
Jan 20 09:09:46 np0005588920 ceph-mon[77148]: Removing key for client.ingress.rgw.default.compute-2.cuokcs
Jan 20 09:09:46 np0005588920 ceph-mon[77148]: Removing daemon keepalived.rgw.default.compute-2.dleeql from compute-2 -- ports []
Jan 20 09:09:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:09:47 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-2-dleeql[85121]: Tue Jan 20 14:09:47 2026: Stopped
Jan 20 09:09:47 np0005588920 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-2-dleeql[85121]: Tue Jan 20 14:09:47 2026: Stopped Keepalived v2.2.4 (08/21,2021)
Jan 20 09:09:47 np0005588920 podman[146081]: 2026-01-20 14:09:47.650865917 +0000 UTC m=+1.060090380 container died 4f78941c670aecb446a8bef016d44f9adf56eddc01c85813d46adbff35b13482 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-2-dleeql, io.buildah.version=1.28.2, version=2.2.4, release=1793, vendor=Red Hat, Inc., io.openshift.tags=Ceph keepalived, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.openshift.expose-services=, name=keepalived, vcs-type=git, description=keepalived for Ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.component=keepalived-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2023-02-22T09:23:20, io.k8s.display-name=Keepalived on RHEL 9, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9)
Jan 20 09:09:47 np0005588920 systemd[1]: var-lib-containers-storage-overlay-da163f20f86839d019f57876cc1fcfeca512d1d6ec19fb725d5dbfa41da48983-merged.mount: Deactivated successfully.
Jan 20 09:09:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:09:47.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:47 np0005588920 podman[146081]: 2026-01-20 14:09:47.69858626 +0000 UTC m=+1.107810723 container remove 4f78941c670aecb446a8bef016d44f9adf56eddc01c85813d46adbff35b13482 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-2-dleeql, vcs-type=git, version=2.2.4, description=keepalived for Ceph, build-date=2023-02-22T09:23:20, io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=Ceph keepalived, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1793, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.component=keepalived-container, io.buildah.version=1.28.2, io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, name=keepalived, architecture=x86_64)
Jan 20 09:09:47 np0005588920 bash[146081]: ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-2-dleeql
Jan 20 09:09:47 np0005588920 systemd[1]: ceph-e399cf45-e6b6-5393-99f1-75c601d3f188@keepalived.rgw.default.compute-2.dleeql.service: Deactivated successfully.
Jan 20 09:09:47 np0005588920 systemd[1]: Stopped Ceph keepalived.rgw.default.compute-2.dleeql for e399cf45-e6b6-5393-99f1-75c601d3f188.
Jan 20 09:09:47 np0005588920 systemd[1]: ceph-e399cf45-e6b6-5393-99f1-75c601d3f188@keepalived.rgw.default.compute-2.dleeql.service: Consumed 4.573s CPU time.
Jan 20 09:09:47 np0005588920 systemd[1]: Reloading.
Jan 20 09:09:48 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:09:48 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:09:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:48.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:48 np0005588920 python3.9[146359]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:09:48 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth rm", "entity": "client.ingress.rgw.default.compute-2.dleeql"}]: dispatch
Jan 20 09:09:48 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:09:48 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:09:49 np0005588920 python3.9[146664]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:09:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:09:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:09:49.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:09:49 np0005588920 podman[146737]: 2026-01-20 14:09:49.807738455 +0000 UTC m=+0.085655427 container exec 6d4eaf8659f13902200468a4ae46f22a0824452b9353ff1491898f5d3b0130c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mon-compute-2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default)
Jan 20 09:09:49 np0005588920 ceph-mon[77148]: Removing key for client.ingress.rgw.default.compute-2.dleeql
Jan 20 09:09:49 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:09:49 np0005588920 podman[146737]: 2026-01-20 14:09:49.9188243 +0000 UTC m=+0.196741302 container exec_died 6d4eaf8659f13902200468a4ae46f22a0824452b9353ff1491898f5d3b0130c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mon-compute-2, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 20 09:09:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:50.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:51 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0.
Jan 20 09:09:51 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:09:51.361279) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 09:09:51 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25
Jan 20 09:09:51 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918191361381, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 1537, "num_deletes": 257, "total_data_size": 3514014, "memory_usage": 3581184, "flush_reason": "Manual Compaction"}
Jan 20 09:09:51 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started
Jan 20 09:09:51 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918191382010, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 2252287, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 12243, "largest_seqno": 13774, "table_properties": {"data_size": 2245739, "index_size": 3683, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1797, "raw_key_size": 13582, "raw_average_key_size": 18, "raw_value_size": 2232168, "raw_average_value_size": 3121, "num_data_blocks": 166, "num_entries": 715, "num_filter_entries": 715, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768918051, "oldest_key_time": 1768918051, "file_creation_time": 1768918191, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:09:51 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 20828 microseconds, and 5266 cpu microseconds.
Jan 20 09:09:51 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:09:51 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:09:51.382106) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 2252287 bytes OK
Jan 20 09:09:51 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:09:51.382145) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started
Jan 20 09:09:51 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:09:51.384772) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done
Jan 20 09:09:51 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:09:51.384784) EVENT_LOG_v1 {"time_micros": 1768918191384781, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 09:09:51 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:09:51.384798) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 09:09:51 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 3506790, prev total WAL file size 3506790, number of live WAL files 2.
Jan 20 09:09:51 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:09:51 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:09:51.385719) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760030' seq:72057594037927935, type:22 .. '6B7600323537' seq:0, type:0; will stop at (end)
Jan 20 09:09:51 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 09:09:51 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(2199KB)], [24(8051KB)]
Jan 20 09:09:51 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918191385832, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 10497497, "oldest_snapshot_seqno": -1}
Jan 20 09:09:51 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 4136 keys, 9909736 bytes, temperature: kUnknown
Jan 20 09:09:51 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918191486646, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 9909736, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9878052, "index_size": 20246, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10373, "raw_key_size": 101784, "raw_average_key_size": 24, "raw_value_size": 9799262, "raw_average_value_size": 2369, "num_data_blocks": 859, "num_entries": 4136, "num_filter_entries": 4136, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768918191, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:09:51 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:09:51 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:09:51.486947) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 9909736 bytes
Jan 20 09:09:51 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:09:51.488769) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 104.0 rd, 98.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.1, 7.9 +0.0 blob) out(9.5 +0.0 blob), read-write-amplify(9.1) write-amplify(4.4) OK, records in: 4673, records dropped: 537 output_compression: NoCompression
Jan 20 09:09:51 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:09:51.488799) EVENT_LOG_v1 {"time_micros": 1768918191488784, "job": 12, "event": "compaction_finished", "compaction_time_micros": 100893, "compaction_time_cpu_micros": 38934, "output_level": 6, "num_output_files": 1, "total_output_size": 9909736, "num_input_records": 4673, "num_output_records": 4136, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 09:09:51 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:09:51 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918191489600, "job": 12, "event": "table_file_deletion", "file_number": 26}
Jan 20 09:09:51 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:09:51 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918191492006, "job": 12, "event": "table_file_deletion", "file_number": 24}
Jan 20 09:09:51 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:09:51.385607) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:09:51 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:09:51.492052) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:09:51 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:09:51.492059) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:09:51 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:09:51.492062) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:09:51 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:09:51.492065) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:09:51 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:09:51.492068) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:09:51 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:09:51 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:09:51 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:09:51 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:09:51 np0005588920 python3.9[147124]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:09:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:09:51.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:09:52 np0005588920 python3.9[147291]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:09:52 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:09:52 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:09:52 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:09:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:52.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:53 np0005588920 python3.9[147444]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:09:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:09:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:09:53.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:09:54 np0005588920 python3.9[147597]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:09:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:54.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:55 np0005588920 python3.9[147750]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:09:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:09:55.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:56 np0005588920 python3.9[147903]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:09:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:56.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:57 np0005588920 python3.9[148055]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:09:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:09:57 np0005588920 python3.9[148207]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:09:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:09:57.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:58 np0005588920 python3.9[148359]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:09:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:58.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:59 np0005588920 python3.9[148511]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:09:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:09:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:09:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:09:59.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:09:59 np0005588920 python3.9[148713]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:10:00 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:10:00 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:10:00 np0005588920 ceph-mon[77148]: overall HEALTH_OK
Jan 20 09:10:00 np0005588920 python3.9[148865]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:10:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:00.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:01 np0005588920 ceph-mon[77148]: Reconfiguring keepalived.rgw.default.compute-0.gcjsxe (dependencies changed)...
Jan 20 09:10:01 np0005588920 ceph-mon[77148]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 20 09:10:01 np0005588920 ceph-mon[77148]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Jan 20 09:10:01 np0005588920 ceph-mon[77148]: Reconfiguring daemon keepalived.rgw.default.compute-0.gcjsxe on compute-0
Jan 20 09:10:01 np0005588920 python3.9[149017]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:10:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:01.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:10:02 np0005588920 python3.9[149169]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:10:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:02.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:03 np0005588920 python3.9[149321]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:10:03 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:10:03 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:10:03 np0005588920 ceph-mon[77148]: Reconfiguring keepalived.rgw.default.compute-1.cevitz (dependencies changed)...
Jan 20 09:10:03 np0005588920 ceph-mon[77148]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Jan 20 09:10:03 np0005588920 ceph-mon[77148]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 20 09:10:03 np0005588920 ceph-mon[77148]: Reconfiguring daemon keepalived.rgw.default.compute-1.cevitz on compute-1
Jan 20 09:10:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:03.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:03 np0005588920 python3.9[149473]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:10:04 np0005588920 podman[149475]: 2026-01-20 14:10:04.013873469 +0000 UTC m=+0.093993966 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 20 09:10:04 np0005588920 python3.9[149652]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:10:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:04.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:05 np0005588920 python3.9[149804]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:10:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:10:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:05.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:10:05 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:10:05 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:10:06 np0005588920 python3.9[150056]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:10:06 np0005588920 podman[150154]: 2026-01-20 14:10:06.358087679 +0000 UTC m=+0.076301300 container exec 6d4eaf8659f13902200468a4ae46f22a0824452b9353ff1491898f5d3b0130c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mon-compute-2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_REF=reef, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3)
Jan 20 09:10:06 np0005588920 podman[150154]: 2026-01-20 14:10:06.479507219 +0000 UTC m=+0.197720720 container exec_died 6d4eaf8659f13902200468a4ae46f22a0824452b9353ff1491898f5d3b0130c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mon-compute-2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 20 09:10:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:06.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:10:07 np0005588920 python3.9[150388]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:10:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:07.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:07 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:10:07 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:10:07 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:10:07 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:10:08 np0005588920 python3.9[150555]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 20 09:10:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:08.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:09 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:10:09 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:10:09 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:10:09 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:10:09 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:10:09 np0005588920 python3.9[150707]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 20 09:10:09 np0005588920 systemd[1]: Reloading.
Jan 20 09:10:09 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:10:09 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:10:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:09.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:10 np0005588920 python3.9[150894]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:10:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:10.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:11 np0005588920 python3.9[151047]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:10:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:11.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:12 np0005588920 python3.9[151200]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:10:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:10:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:12.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:10:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:13.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:10:13 np0005588920 python3.9[151353]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:10:14 np0005588920 python3.9[151506]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:10:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:14.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:15 np0005588920 podman[151631]: 2026-01-20 14:10:15.269169508 +0000 UTC m=+0.104500335 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 20 09:10:15 np0005588920 python3.9[151675]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:10:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:10:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:15.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:10:16 np0005588920 python3.9[151830]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:10:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:10:16.409 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:10:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:10:16.410 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:10:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:10:16.410 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:10:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:16.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:10:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:10:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:17.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:10:17 np0005588920 python3.9[152033]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Jan 20 09:10:17 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:10:17 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:10:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:18.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:18 np0005588920 python3.9[152186]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 20 09:10:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:19.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:20.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:21 np0005588920 python3.9[152344]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 20 09:10:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:21.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:10:22 np0005588920 python3.9[152504]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 20 09:10:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:10:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:22.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:10:23 np0005588920 python3.9[152588]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 20 09:10:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:23.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:24.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:25.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:10:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:26.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:10:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:10:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:27.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:28.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:29.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:30.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:10:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:31.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:10:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:10:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:10:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:32.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:10:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:10:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:33.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:10:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:34.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:35 np0005588920 podman[152731]: 2026-01-20 14:10:35.096954935 +0000 UTC m=+0.150084453 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 20 09:10:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:35.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:36.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:10:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:37.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:38.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:39.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:40.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:41.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:10:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:42.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:43.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:10:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:44.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:10:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:45.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:46 np0005588920 podman[152805]: 2026-01-20 14:10:46.004073855 +0000 UTC m=+0.079215257 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 20 09:10:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:46.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:10:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:47.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:48.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:49.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:49 np0005588920 kernel: SELinux:  Converting 2776 SID table entries...
Jan 20 09:10:50 np0005588920 kernel: SELinux:  policy capability network_peer_controls=1
Jan 20 09:10:50 np0005588920 kernel: SELinux:  policy capability open_perms=1
Jan 20 09:10:50 np0005588920 kernel: SELinux:  policy capability extended_socket_class=1
Jan 20 09:10:50 np0005588920 kernel: SELinux:  policy capability always_check_network=0
Jan 20 09:10:50 np0005588920 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 20 09:10:50 np0005588920 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 20 09:10:50 np0005588920 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 20 09:10:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:50.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:51.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:10:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:52.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:53.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:10:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:54.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:10:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:55.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:10:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:56.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:10:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:10:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:57.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:58.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:58 np0005588920 kernel: SELinux:  Converting 2776 SID table entries...
Jan 20 09:10:58 np0005588920 kernel: SELinux:  policy capability network_peer_controls=1
Jan 20 09:10:58 np0005588920 kernel: SELinux:  policy capability open_perms=1
Jan 20 09:10:58 np0005588920 kernel: SELinux:  policy capability extended_socket_class=1
Jan 20 09:10:58 np0005588920 kernel: SELinux:  policy capability always_check_network=0
Jan 20 09:10:58 np0005588920 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 20 09:10:58 np0005588920 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 20 09:10:58 np0005588920 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 20 09:10:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:10:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:59.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:00.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:01.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:11:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:02.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:03.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:04.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:05.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:05 np0005588920 dbus-broker-launch[768]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Jan 20 09:11:06 np0005588920 podman[152839]: 2026-01-20 14:11:06.071791526 +0000 UTC m=+0.132889328 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 20 09:11:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:11:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:06.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:11:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:11:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:07.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:11:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:08.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:11:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:11:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:09.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:11:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:10.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:11.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:11:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:12.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:13.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:14.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:15.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:11:16.411 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:11:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:11:16.412 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:11:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:11:16.412 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:11:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:16.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:17 np0005588920 podman[156159]: 2026-01-20 14:11:17.076237772 +0000 UTC m=+0.150851674 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 20 09:11:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:11:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:17.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:18.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:19 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:11:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:19.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:20 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:11:20 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:11:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:20.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:11:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:21.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:11:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:11:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:22.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:23.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:24.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:25.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:26.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:11:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:27.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:11:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:28.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:11:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:29.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:30.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:31.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:11:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:32.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:32 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:11:32 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:11:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:11:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:33.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:11:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:11:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:34.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:11:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:35.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:36.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:37 np0005588920 podman[166231]: 2026-01-20 14:11:37.084745343 +0000 UTC m=+0.149567462 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 20 09:11:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:11:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:11:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:37.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:11:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:38.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:11:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:39.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:11:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:40.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:11:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:41.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:11:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:11:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:42.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:43.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:44.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:45.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:46.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:11:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:47.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:48 np0005588920 podman[169962]: 2026-01-20 14:11:48.013890966 +0000 UTC m=+0.093401492 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 20 09:11:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:48.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:11:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:49.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:11:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:50.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:11:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:51.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:11:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:11:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:11:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:52.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:11:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:53.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:54.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:11:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:55.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:11:56 np0005588920 kernel: SELinux:  Converting 2777 SID table entries...
Jan 20 09:11:56 np0005588920 kernel: SELinux:  policy capability network_peer_controls=1
Jan 20 09:11:56 np0005588920 kernel: SELinux:  policy capability open_perms=1
Jan 20 09:11:56 np0005588920 kernel: SELinux:  policy capability extended_socket_class=1
Jan 20 09:11:56 np0005588920 kernel: SELinux:  policy capability always_check_network=0
Jan 20 09:11:56 np0005588920 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 20 09:11:56 np0005588920 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 20 09:11:56 np0005588920 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 20 09:11:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:56.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:11:57 np0005588920 dbus-broker-launch[735]: Noticed file-system modification, trigger reload.
Jan 20 09:11:57 np0005588920 dbus-broker-launch[768]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Jan 20 09:11:57 np0005588920 dbus-broker-launch[735]: Noticed file-system modification, trigger reload.
Jan 20 09:11:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:11:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:57.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:11:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:11:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:58.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:11:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:11:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:59.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:00.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:12:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:01.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:12:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:12:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:02.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:03.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:04.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:05 np0005588920 systemd[1]: Stopping OpenSSH server daemon...
Jan 20 09:12:05 np0005588920 systemd[1]: sshd.service: Deactivated successfully.
Jan 20 09:12:05 np0005588920 systemd[1]: Stopped OpenSSH server daemon.
Jan 20 09:12:05 np0005588920 systemd[1]: sshd.service: Consumed 6.569s CPU time, read 564.0K from disk, written 136.0K to disk.
Jan 20 09:12:05 np0005588920 systemd[1]: Stopped target sshd-keygen.target.
Jan 20 09:12:05 np0005588920 systemd[1]: Stopping sshd-keygen.target...
Jan 20 09:12:05 np0005588920 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 20 09:12:05 np0005588920 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 20 09:12:05 np0005588920 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 20 09:12:05 np0005588920 systemd[1]: Reached target sshd-keygen.target.
Jan 20 09:12:05 np0005588920 systemd[1]: Starting OpenSSH server daemon...
Jan 20 09:12:05 np0005588920 systemd[1]: Started OpenSSH server daemon.
Jan 20 09:12:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:05.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:06.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:12:07 np0005588920 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 20 09:12:07 np0005588920 systemd[1]: Starting man-db-cache-update.service...
Jan 20 09:12:07 np0005588920 systemd[1]: Reloading.
Jan 20 09:12:07 np0005588920 podman[171034]: 2026-01-20 14:12:07.298100628 +0000 UTC m=+0.104392319 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 20 09:12:07 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:12:07 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:12:07 np0005588920 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 20 09:12:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:07.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:08.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:12:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:09.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:12:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:10.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:12:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:11.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:12:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:12:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:12.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:12:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:13.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:12:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:12:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:14.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:12:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:15.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:12:16.412 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:12:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:12:16.414 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:12:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:12:16.415 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:12:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:12:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:16.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:12:17 np0005588920 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 20 09:12:17 np0005588920 systemd[1]: Finished man-db-cache-update.service.
Jan 20 09:12:17 np0005588920 systemd[1]: man-db-cache-update.service: Consumed 12.959s CPU time.
Jan 20 09:12:17 np0005588920 systemd[1]: run-ra83ff35aca9d47a291c4ff28de856eaf.service: Deactivated successfully.
Jan 20 09:12:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:12:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:17.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:18.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:19 np0005588920 podman[179521]: 2026-01-20 14:12:19.027361174 +0000 UTC m=+0.097855366 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:12:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:19.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:20.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:12:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:21.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:12:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:12:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:22.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:12:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:23.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:12:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:24.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:25.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:26.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:12:27 np0005588920 python3.9[179667]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 20 09:12:27 np0005588920 systemd[1]: Reloading.
Jan 20 09:12:27 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:12:27 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:12:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:27.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:28 np0005588920 python3.9[179857]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 20 09:12:28 np0005588920 systemd[1]: Reloading.
Jan 20 09:12:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:12:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:28.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:12:29 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:12:29 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:12:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:29.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:30 np0005588920 python3.9[180048]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 20 09:12:30 np0005588920 systemd[1]: Reloading.
Jan 20 09:12:30 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:12:30 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:12:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:30.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:31 np0005588920 python3.9[180238]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 20 09:12:31 np0005588920 systemd[1]: Reloading.
Jan 20 09:12:31 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:12:31 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:12:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:31.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:12:32 np0005588920 python3.9[180527]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 20 09:12:32 np0005588920 systemd[1]: Reloading.
Jan 20 09:12:32 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:12:32 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:12:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:32.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:33.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:34 np0005588920 python3.9[180750]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 20 09:12:34 np0005588920 systemd[1]: Reloading.
Jan 20 09:12:34 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:12:34 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:12:34 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:12:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:34.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:35 np0005588920 python3.9[180940]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 20 09:12:35 np0005588920 systemd[1]: Reloading.
Jan 20 09:12:35 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:12:35 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:12:35 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:12:35 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:12:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:35.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:36 np0005588920 python3.9[181130]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 20 09:12:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:36.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:12:37 np0005588920 python3.9[181285]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 20 09:12:37 np0005588920 systemd[1]: Reloading.
Jan 20 09:12:37 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:12:37 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:12:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:37.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:37 np0005588920 podman[181325]: 2026-01-20 14:12:37.982836594 +0000 UTC m=+0.111695450 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 20 09:12:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:38.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:39 np0005588920 python3.9[181502]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 20 09:12:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:39.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:39 np0005588920 systemd[1]: Reloading.
Jan 20 09:12:40 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:12:40 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:12:40 np0005588920 systemd[1]: Listening on libvirt proxy daemon socket.
Jan 20 09:12:40 np0005588920 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Jan 20 09:12:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:40.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:41 np0005588920 python3.9[181696]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 20 09:12:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:41.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:42 np0005588920 python3.9[181851]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 20 09:12:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:12:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:12:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:42.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:12:43 np0005588920 python3.9[182006]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 20 09:12:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:12:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:43.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:12:44 np0005588920 python3.9[182161]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 20 09:12:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:44.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:45 np0005588920 python3.9[182316]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 20 09:12:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:45.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:46 np0005588920 python3.9[182521]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 20 09:12:46 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:12:46 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:12:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:46.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:12:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:12:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:47.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:12:48 np0005588920 python3.9[182676]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 20 09:12:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:48.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:49 np0005588920 podman[182803]: 2026-01-20 14:12:49.237094466 +0000 UTC m=+0.070030474 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 20 09:12:49 np0005588920 python3.9[182850]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 20 09:12:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:49.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:50 np0005588920 python3.9[183005]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 20 09:12:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:50.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:51 np0005588920 python3.9[183160]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 20 09:12:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:51.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:12:52 np0005588920 python3.9[183315]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 20 09:12:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:12:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:52.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:12:53 np0005588920 python3.9[183470]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 20 09:12:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:53.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:54 np0005588920 python3.9[183625]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 20 09:12:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:54.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:55 np0005588920 python3.9[183780]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 20 09:12:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:55.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:56.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:12:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:57.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:58 np0005588920 python3.9[183935]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:12:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:12:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:58.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:12:59 np0005588920 python3.9[184087]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:12:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:12:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:12:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:59.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:13:00 np0005588920 python3.9[184239]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:13:00 np0005588920 python3.9[184391]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:13:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:00.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:01 np0005588920 python3.9[184543]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:13:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:13:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:01.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:13:02 np0005588920 python3.9[184695]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:13:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:13:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:13:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:02.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:13:03 np0005588920 python3.9[184845]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 09:13:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:03.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:04 np0005588920 python3.9[184997]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:13:04 np0005588920 python3.9[185122]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1768918383.3878536-1648-196075633798373/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:13:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:04.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:13:05 np0005588920 python3.9[185274]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:13:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:05.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:06 np0005588920 python3.9[185399]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1768918385.0695283-1648-51920136138253/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:06.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:07 np0005588920 python3.9[185551]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:13:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:13:07 np0005588920 python3.9[185676]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1768918386.451065-1648-159389069815755/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:08.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:08 np0005588920 podman[185800]: 2026-01-20 14:13:08.450044274 +0000 UTC m=+0.141115446 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 20 09:13:08 np0005588920 python3.9[185845]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:13:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:13:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:08.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:13:09 np0005588920 python3.9[185980]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1768918387.9594429-1648-131469945810847/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:13:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:10.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:13:10 np0005588920 python3.9[186132]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:13:10 np0005588920 python3.9[186257]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1768918389.4700718-1648-250373101602937/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:10.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:11 np0005588920 python3.9[186409]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:13:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:12.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:13:12 np0005588920 python3.9[186534]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1768918391.16113-1648-249932476419680/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:12.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:13 np0005588920 python3.9[186686]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:13:13 np0005588920 python3.9[186809]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1768918392.6656885-1648-221357926706526/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:14.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:14 np0005588920 python3.9[186961]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:13:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:13:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:14.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:13:15 np0005588920 python3.9[187086]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1768918394.0201516-1648-116908602652154/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:16.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:16 np0005588920 python3.9[187238]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Jan 20 09:13:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:13:16.414 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:13:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:13:16.414 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:13:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:13:16.415 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:13:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:16.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:13:17 np0005588920 python3.9[187391]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:17 np0005588920 python3.9[187543]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:18.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:18 np0005588920 python3.9[187695]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:13:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:18.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:13:19 np0005588920 python3.9[187847]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:19 np0005588920 podman[187971]: 2026-01-20 14:13:19.958143108 +0000 UTC m=+0.067392041 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible)
Jan 20 09:13:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:20.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:20 np0005588920 python3.9[188018]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:20 np0005588920 python3.9[188171]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:20.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:21 np0005588920 python3.9[188323]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:22.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:13:22 np0005588920 python3.9[188475]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:13:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:22.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:13:23 np0005588920 python3.9[188627]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:24 np0005588920 python3.9[188779]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:24.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:24 np0005588920 python3.9[188931]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:24.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:25 np0005588920 ceph-mgr[77507]: client.0 ms_handle_reset on v2:192.168.122.100:6800/2542147622
Jan 20 09:13:25 np0005588920 python3.9[189083]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:13:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:26.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:13:26 np0005588920 python3.9[189235]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:13:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:26.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:13:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:13:27 np0005588920 python3.9[189387]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:28.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:28 np0005588920 python3.9[189539]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:13:28 np0005588920 python3.9[189662]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768918407.5693285-2311-177055067364939/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:28.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:29 np0005588920 python3.9[189814]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:13:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:30.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:30 np0005588920 python3.9[189937]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768918409.1174793-2311-116536531851150/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:30 np0005588920 auditd[698]: Audit daemon rotating log files
Jan 20 09:13:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:31.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:31 np0005588920 python3.9[190089]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:13:31 np0005588920 python3.9[190212]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768918410.5065687-2311-137416382193658/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:13:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:32.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:13:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:13:32 np0005588920 python3.9[190364]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:13:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:33.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:33 np0005588920 python3.9[190487]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768918411.8933556-2311-115363179205269/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:34.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:34 np0005588920 python3.9[190639]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:13:34 np0005588920 python3.9[190762]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768918413.2307012-2311-167075273818153/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:35.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:36 np0005588920 python3.9[190914]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:13:36 np0005588920 python3.9[191037]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768918415.1313062-2311-162967215902041/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:37.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:37.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:13:38 np0005588920 python3.9[191189]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:13:38 np0005588920 podman[191284]: 2026-01-20 14:13:38.729597437 +0000 UTC m=+0.128747122 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 20 09:13:38 np0005588920 python3.9[191331]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768918416.998861-2311-96575417468957/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:39.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:39.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:39 np0005588920 python3.9[191491]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:13:40 np0005588920 python3.9[191614]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768918418.9872477-2311-226164578309879/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:40 np0005588920 python3.9[191766]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:13:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:13:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:41.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:13:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:41.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:41 np0005588920 python3.9[191889]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768918420.2615476-2311-99473387413617/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:42 np0005588920 python3.9[192041]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:13:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:13:42 np0005588920 python3.9[192164]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768918421.5560725-2311-172319395073721/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:43.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:13:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:43.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:13:43 np0005588920 python3.9[192316]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:13:44 np0005588920 python3.9[192439]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768918423.0168948-2311-175388649339920/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:44 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #28. Immutable memtables: 0.
Jan 20 09:13:44 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:13:44.605837) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 09:13:44 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 28
Jan 20 09:13:44 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918424605931, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 2646, "num_deletes": 501, "total_data_size": 6152794, "memory_usage": 6229800, "flush_reason": "Manual Compaction"}
Jan 20 09:13:44 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #29: started
Jan 20 09:13:44 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918424633123, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 29, "file_size": 2361262, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13779, "largest_seqno": 16420, "table_properties": {"data_size": 2353696, "index_size": 3740, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2757, "raw_key_size": 21565, "raw_average_key_size": 19, "raw_value_size": 2335088, "raw_average_value_size": 2130, "num_data_blocks": 169, "num_entries": 1096, "num_filter_entries": 1096, "num_deletions": 501, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768918191, "oldest_key_time": 1768918191, "file_creation_time": 1768918424, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:13:44 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 27325 microseconds, and 12276 cpu microseconds.
Jan 20 09:13:44 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:13:44 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:13:44.633163) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #29: 2361262 bytes OK
Jan 20 09:13:44 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:13:44.633179) [db/memtable_list.cc:519] [default] Level-0 commit table #29 started
Jan 20 09:13:44 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:13:44.634724) [db/memtable_list.cc:722] [default] Level-0 commit table #29: memtable #1 done
Jan 20 09:13:44 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:13:44.634738) EVENT_LOG_v1 {"time_micros": 1768918424634735, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 09:13:44 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:13:44.634753) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 09:13:44 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 6140455, prev total WAL file size 6140455, number of live WAL files 2.
Jan 20 09:13:44 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000025.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:13:44 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:13:44.636320) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400323530' seq:72057594037927935, type:22 .. '6D67727374617400353031' seq:0, type:0; will stop at (end)
Jan 20 09:13:44 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 09:13:44 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [29(2305KB)], [27(9677KB)]
Jan 20 09:13:44 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918424636416, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [29], "files_L6": [27], "score": -1, "input_data_size": 12270998, "oldest_snapshot_seqno": -1}
Jan 20 09:13:44 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #30: 4319 keys, 8320780 bytes, temperature: kUnknown
Jan 20 09:13:44 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918424716852, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 30, "file_size": 8320780, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 8290559, "index_size": 18335, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10821, "raw_key_size": 106665, "raw_average_key_size": 24, "raw_value_size": 8211065, "raw_average_value_size": 1901, "num_data_blocks": 773, "num_entries": 4319, "num_filter_entries": 4319, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768918424, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 30, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:13:44 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:13:44 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:13:44.717085) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 8320780 bytes
Jan 20 09:13:44 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:13:44.718337) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 152.4 rd, 103.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.3, 9.5 +0.0 blob) out(7.9 +0.0 blob), read-write-amplify(8.7) write-amplify(3.5) OK, records in: 5232, records dropped: 913 output_compression: NoCompression
Jan 20 09:13:44 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:13:44.718354) EVENT_LOG_v1 {"time_micros": 1768918424718345, "job": 14, "event": "compaction_finished", "compaction_time_micros": 80507, "compaction_time_cpu_micros": 38857, "output_level": 6, "num_output_files": 1, "total_output_size": 8320780, "num_input_records": 5232, "num_output_records": 4319, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 09:13:44 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:13:44 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918424718845, "job": 14, "event": "table_file_deletion", "file_number": 29}
Jan 20 09:13:44 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000027.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:13:44 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918424720438, "job": 14, "event": "table_file_deletion", "file_number": 27}
Jan 20 09:13:44 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:13:44.636177) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:13:44 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:13:44.720538) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:13:44 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:13:44.720546) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:13:44 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:13:44.720550) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:13:44 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:13:44.720554) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:13:44 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:13:44.720558) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:13:44 np0005588920 python3.9[192591]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:13:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:45.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:45.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:45 np0005588920 python3.9[192714]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768918424.312467-2311-6428980685515/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:46 np0005588920 python3.9[192866]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:13:46 np0005588920 python3.9[193064]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768918425.6843917-2311-148897786890612/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:47.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:47.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:13:47 np0005588920 python3.9[193273]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:13:48 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:13:48 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:13:48 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:13:48 np0005588920 python3.9[193396]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768918426.968174-2311-67081584699514/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:49.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:49.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:50 np0005588920 podman[193500]: 2026-01-20 14:13:50.996583476 +0000 UTC m=+0.079631440 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 20 09:13:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:51.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:13:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:51.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:13:51 np0005588920 python3.9[193567]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:13:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:13:52 np0005588920 python3.9[193723]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Jan 20 09:13:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:53.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:53.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:54 np0005588920 dbus-broker-launch[768]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Jan 20 09:13:54 np0005588920 python3.9[193927]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:13:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:55.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:13:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.003000083s ======
Jan 20 09:13:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:55.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000083s
Jan 20 09:13:55 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:13:55 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:13:55 np0005588920 python3.9[194081]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:56 np0005588920 python3.9[194233]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:57.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:13:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:57.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:13:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:13:57 np0005588920 python3.9[194385]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:58 np0005588920 python3.9[194537]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:58 np0005588920 python3.9[194689]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:59.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:13:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:59.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:59 np0005588920 python3.9[194841]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:14:00 np0005588920 python3.9[194993]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:14:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:01.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:01.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:01 np0005588920 python3.9[195145]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:14:02 np0005588920 python3.9[195297]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:14:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:14:02 np0005588920 python3.9[195449]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 20 09:14:02 np0005588920 systemd[1]: Reloading.
Jan 20 09:14:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:03.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:03 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:14:03 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:14:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:03.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:03 np0005588920 systemd[1]: Starting libvirt logging daemon socket...
Jan 20 09:14:03 np0005588920 systemd[1]: Listening on libvirt logging daemon socket.
Jan 20 09:14:03 np0005588920 systemd[1]: Starting libvirt logging daemon admin socket...
Jan 20 09:14:03 np0005588920 systemd[1]: Listening on libvirt logging daemon admin socket.
Jan 20 09:14:03 np0005588920 systemd[1]: Starting libvirt logging daemon...
Jan 20 09:14:03 np0005588920 systemd[1]: Started libvirt logging daemon.
Jan 20 09:14:04 np0005588920 python3.9[195643]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 20 09:14:04 np0005588920 systemd[1]: Reloading.
Jan 20 09:14:04 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:14:04 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:14:04 np0005588920 systemd[1]: Starting libvirt nodedev daemon socket...
Jan 20 09:14:04 np0005588920 systemd[1]: Listening on libvirt nodedev daemon socket.
Jan 20 09:14:04 np0005588920 systemd[1]: Starting libvirt nodedev daemon admin socket...
Jan 20 09:14:04 np0005588920 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Jan 20 09:14:04 np0005588920 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Jan 20 09:14:04 np0005588920 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Jan 20 09:14:04 np0005588920 systemd[1]: Starting libvirt nodedev daemon...
Jan 20 09:14:04 np0005588920 systemd[1]: Started libvirt nodedev daemon.
Jan 20 09:14:05 np0005588920 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Jan 20 09:14:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:14:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:05.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:14:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:14:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:05.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:14:05 np0005588920 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Jan 20 09:14:05 np0005588920 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Jan 20 09:14:05 np0005588920 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Jan 20 09:14:05 np0005588920 python3.9[195860]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 20 09:14:05 np0005588920 systemd[1]: Reloading.
Jan 20 09:14:05 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:14:05 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:14:06 np0005588920 systemd[1]: Starting libvirt proxy daemon admin socket...
Jan 20 09:14:06 np0005588920 systemd[1]: Starting libvirt proxy daemon read-only socket...
Jan 20 09:14:06 np0005588920 systemd[1]: Listening on libvirt proxy daemon admin socket.
Jan 20 09:14:06 np0005588920 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Jan 20 09:14:06 np0005588920 systemd[1]: Starting libvirt proxy daemon...
Jan 20 09:14:06 np0005588920 systemd[1]: Started libvirt proxy daemon.
Jan 20 09:14:06 np0005588920 setroubleshoot[195733]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 42984cbe-9d95-46d9-a38a-ec38b995c39b
Jan 20 09:14:06 np0005588920 setroubleshoot[195733]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Jan 20 09:14:06 np0005588920 setroubleshoot[195733]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 42984cbe-9d95-46d9-a38a-ec38b995c39b
Jan 20 09:14:06 np0005588920 setroubleshoot[195733]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Jan 20 09:14:06 np0005588920 python3.9[196081]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 20 09:14:07 np0005588920 systemd[1]: Reloading.
Jan 20 09:14:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:07.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:07 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:14:07 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:14:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:14:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:07.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:14:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:14:07 np0005588920 systemd[1]: Listening on libvirt locking daemon socket.
Jan 20 09:14:07 np0005588920 systemd[1]: Starting libvirt QEMU daemon socket...
Jan 20 09:14:07 np0005588920 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Jan 20 09:14:07 np0005588920 systemd[1]: Starting Virtual Machine and Container Registration Service...
Jan 20 09:14:07 np0005588920 systemd[1]: Listening on libvirt QEMU daemon socket.
Jan 20 09:14:07 np0005588920 systemd[1]: Starting libvirt QEMU daemon admin socket...
Jan 20 09:14:07 np0005588920 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Jan 20 09:14:07 np0005588920 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Jan 20 09:14:07 np0005588920 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Jan 20 09:14:07 np0005588920 systemd[1]: Started Virtual Machine and Container Registration Service.
Jan 20 09:14:07 np0005588920 systemd[1]: Starting libvirt QEMU daemon...
Jan 20 09:14:07 np0005588920 systemd[1]: Started libvirt QEMU daemon.
Jan 20 09:14:08 np0005588920 python3.9[196296]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 20 09:14:08 np0005588920 systemd[1]: Reloading.
Jan 20 09:14:08 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:14:08 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:14:08 np0005588920 systemd[1]: Starting libvirt secret daemon socket...
Jan 20 09:14:08 np0005588920 systemd[1]: Listening on libvirt secret daemon socket.
Jan 20 09:14:08 np0005588920 systemd[1]: Starting libvirt secret daemon admin socket...
Jan 20 09:14:08 np0005588920 systemd[1]: Starting libvirt secret daemon read-only socket...
Jan 20 09:14:08 np0005588920 systemd[1]: Listening on libvirt secret daemon admin socket.
Jan 20 09:14:08 np0005588920 systemd[1]: Listening on libvirt secret daemon read-only socket.
Jan 20 09:14:08 np0005588920 systemd[1]: Starting libvirt secret daemon...
Jan 20 09:14:08 np0005588920 systemd[1]: Started libvirt secret daemon.
Jan 20 09:14:08 np0005588920 podman[196334]: 2026-01-20 14:14:08.916686905 +0000 UTC m=+0.116857913 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, 
org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Jan 20 09:14:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:09.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:09.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:09 np0005588920 python3.9[196534]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:14:10 np0005588920 python3.9[196686]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 20 09:14:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:11.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:11.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:11 np0005588920 python3.9[196838]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail;#012echo ceph#012awk -F '=' '/fsid/ {print $2}' /var/lib/openstack/config/ceph/ceph.conf | xargs#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:14:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:14:12 np0005588920 python3.9[196992]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.keyring'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 20 09:14:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:13.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:13.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:13 np0005588920 python3.9[197142]: ansible-ansible.legacy.stat Invoked with path=/tmp/secret.xml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:14:13 np0005588920 python3.9[197263]: ansible-ansible.legacy.copy Invoked with dest=/tmp/secret.xml mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1768918452.8194673-3385-222379972668841/.source.xml follow=False _original_basename=secret.xml.j2 checksum=35bbbade4f0995b3fba698d107c82491080dc0dd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:14:14 np0005588920 python3.9[197415]: ansible-ansible.legacy.command Invoked with _raw_params=virsh secret-undefine e399cf45-e6b6-5393-99f1-75c601d3f188#012virsh secret-define --file /tmp/secret.xml#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:14:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:15.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:15.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:16 np0005588920 python3.9[197577]: ansible-ansible.builtin.file Invoked with path=/tmp/secret.xml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:14:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:14:16.415 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:14:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:14:16.416 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:14:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:14:16.416 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:14:16 np0005588920 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Jan 20 09:14:16 np0005588920 systemd[1]: setroubleshootd.service: Deactivated successfully.
Jan 20 09:14:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:17.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:14:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:17.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:18 np0005588920 python3.9[198040]: ansible-ansible.legacy.copy Invoked with dest=/etc/ceph/ceph.conf group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/config/ceph/ceph.conf backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:14:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:19.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:19.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:19 np0005588920 python3.9[198192]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:14:20 np0005588920 python3.9[198315]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1768918459.204276-3551-45816465202318/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:14:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:21.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:21.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:21 np0005588920 podman[198439]: 2026-01-20 14:14:21.232145099 +0000 UTC m=+0.098469223 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 20 09:14:21 np0005588920 python3.9[198486]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:14:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:14:22 np0005588920 python3.9[198638]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:14:22 np0005588920 python3.9[198716]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:14:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:23.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:23.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:23 np0005588920 python3.9[198868]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:14:24 np0005588920 python3.9[198946]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=._op0yunz recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:14:25 np0005588920 python3.9[199098]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:14:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:25.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:14:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:25.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:14:25 np0005588920 python3.9[199176]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:14:26 np0005588920 python3.9[199328]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:14:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:27.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:14:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:27.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:27 np0005588920 python3[199481]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 20 09:14:28 np0005588920 python3.9[199633]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:14:29 np0005588920 python3.9[199711]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:14:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:29.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:14:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:29.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:14:30 np0005588920 python3.9[199863]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:14:30 np0005588920 python3.9[199988]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768918469.4455833-3818-122214259383226/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:14:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:31.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:31.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:31 np0005588920 python3.9[200140]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:14:32 np0005588920 python3.9[200218]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:14:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:14:32 np0005588920 python3.9[200370]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:14:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:14:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:33.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:14:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:14:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:33.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:14:33 np0005588920 python3.9[200448]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:14:34 np0005588920 python3.9[200600]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:14:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:35.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:35 np0005588920 python3.9[200725]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768918473.6992464-3934-175737998209541/.source.nft follow=False _original_basename=ruleset.j2 checksum=ac3ce8ce2d33fa5fe0a79b0c811c97734ce43fa5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:14:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:35.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:35 np0005588920 python3.9[200877]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:14:36 np0005588920 python3.9[201029]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:14:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:14:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:37.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:14:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:14:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:37.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:37 np0005588920 python3.9[201184]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:14:38 np0005588920 python3.9[201336]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:14:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:39.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:39.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:39 np0005588920 podman[201461]: 2026-01-20 14:14:39.655808768 +0000 UTC m=+0.116098320 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller)
Jan 20 09:14:39 np0005588920 python3.9[201509]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:14:40 np0005588920 python3.9[201669]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:14:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:14:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:41.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:14:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:41.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:41 np0005588920 python3.9[201824]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:14:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:14:42 np0005588920 python3.9[201976]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:14:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:43.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:43 np0005588920 python3.9[202099]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768918481.9721022-4152-103133546637104/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:14:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:43.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:44 np0005588920 python3.9[202251]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:14:44 np0005588920 python3.9[202374]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768918483.4829493-4196-274914572379543/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:14:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:45.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:45.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:45 np0005588920 python3.9[202526]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:14:46 np0005588920 python3.9[202649]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768918484.9636183-4241-151262277899370/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:14:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:14:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:47.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:14:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:14:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:47.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:47 np0005588920 python3.9[202801]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:14:47 np0005588920 systemd[1]: Reloading.
Jan 20 09:14:47 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:14:47 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:14:47 np0005588920 systemd[1]: Reached target edpm_libvirt.target.
Jan 20 09:14:48 np0005588920 python3.9[202993]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 20 09:14:48 np0005588920 systemd[1]: Reloading.
Jan 20 09:14:48 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:14:48 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:14:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:49.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:49 np0005588920 systemd[1]: Reloading.
Jan 20 09:14:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:49.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:49 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:14:49 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:14:50 np0005588920 systemd[1]: session-48.scope: Deactivated successfully.
Jan 20 09:14:50 np0005588920 systemd[1]: session-48.scope: Consumed 3min 51.113s CPU time.
Jan 20 09:14:50 np0005588920 systemd-logind[783]: Session 48 logged out. Waiting for processes to exit.
Jan 20 09:14:50 np0005588920 systemd-logind[783]: Removed session 48.
Jan 20 09:14:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:51.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:51.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:52 np0005588920 podman[203091]: 2026-01-20 14:14:52.01360665 +0000 UTC m=+0.089929923 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:14:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:14:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:53.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:14:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:53.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:14:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:55.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:55.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:55 np0005588920 systemd-logind[783]: New session 49 of user zuul.
Jan 20 09:14:55 np0005588920 systemd[1]: Started Session 49 of User zuul.
Jan 20 09:14:56 np0005588920 python3.9[203395]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 09:14:56 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:14:56 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:14:56 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:14:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:57.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:14:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:57.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:58 np0005588920 python3.9[203549]: ansible-ansible.builtin.service_facts Invoked
Jan 20 09:14:58 np0005588920 network[203566]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 20 09:14:58 np0005588920 network[203567]: 'network-scripts' will be removed from distribution in near future.
Jan 20 09:14:58 np0005588920 network[203568]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 20 09:14:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:59.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:14:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:59.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:15:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:01.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:15:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:01.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:15:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:03.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:03.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:04 np0005588920 python3.9[203842]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 20 09:15:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:05.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:05.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:05 np0005588920 python3.9[203976]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 20 09:15:06 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:15:06 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:15:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:07.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:15:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:07.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:15:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:09.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:15:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:09.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:10 np0005588920 podman[203978]: 2026-01-20 14:15:10.143085343 +0000 UTC m=+0.217939798 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:15:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:11.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:11.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:12 np0005588920 python3.9[204155]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:15:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:15:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:13.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:13 np0005588920 python3.9[204307]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:15:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:13.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:14 np0005588920 python3.9[204460]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:15:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:15.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:15.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:15 np0005588920 python3.9[204612]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:15:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:15:16.417 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:15:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:15:16.418 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:15:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:15:16.418 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:15:16 np0005588920 python3.9[204765]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:15:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:17.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:15:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:17.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:17 np0005588920 python3.9[204888]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768918515.9363995-248-67213885693211/.source.iscsi _original_basename=.87hj20bk follow=False checksum=6b28cc8bb21631f87a6143220fa74fa9cf181f65 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:15:18 np0005588920 python3.9[205040]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:15:19 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #31. Immutable memtables: 0.
Jan 20 09:15:19 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:15:19.004557) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 09:15:19 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 15] Flushing memtable with next log file: 31
Jan 20 09:15:19 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918519004696, "job": 15, "event": "flush_started", "num_memtables": 1, "num_entries": 1103, "num_deletes": 251, "total_data_size": 2529106, "memory_usage": 2562104, "flush_reason": "Manual Compaction"}
Jan 20 09:15:19 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 15] Level-0 flush table #32: started
Jan 20 09:15:19 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918519023351, "cf_name": "default", "job": 15, "event": "table_file_creation", "file_number": 32, "file_size": 1658900, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16425, "largest_seqno": 17523, "table_properties": {"data_size": 1653996, "index_size": 2492, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1349, "raw_key_size": 10360, "raw_average_key_size": 19, "raw_value_size": 1644225, "raw_average_value_size": 3090, "num_data_blocks": 113, "num_entries": 532, "num_filter_entries": 532, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768918425, "oldest_key_time": 1768918425, "file_creation_time": 1768918519, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 32, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:15:19 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 15] Flush lasted 18872 microseconds, and 7601 cpu microseconds.
Jan 20 09:15:19 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:15:19 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:15:19.023441) [db/flush_job.cc:967] [default] [JOB 15] Level-0 flush table #32: 1658900 bytes OK
Jan 20 09:15:19 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:15:19.023466) [db/memtable_list.cc:519] [default] Level-0 commit table #32 started
Jan 20 09:15:19 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:15:19.025457) [db/memtable_list.cc:722] [default] Level-0 commit table #32: memtable #1 done
Jan 20 09:15:19 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:15:19.025480) EVENT_LOG_v1 {"time_micros": 1768918519025473, "job": 15, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 09:15:19 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:15:19.025501) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 09:15:19 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 15] Try to delete WAL files size 2523798, prev total WAL file size 2523798, number of live WAL files 2.
Jan 20 09:15:19 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000028.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:15:19 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:15:19.026723) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031303034' seq:72057594037927935, type:22 .. '7061786F730031323536' seq:0, type:0; will stop at (end)
Jan 20 09:15:19 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 16] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 09:15:19 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 15 Base level 0, inputs: [32(1620KB)], [30(8125KB)]
Jan 20 09:15:19 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918519026825, "job": 16, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [32], "files_L6": [30], "score": -1, "input_data_size": 9979680, "oldest_snapshot_seqno": -1}
Jan 20 09:15:19 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 16] Generated table #33: 4336 keys, 7965313 bytes, temperature: kUnknown
Jan 20 09:15:19 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918519099837, "cf_name": "default", "job": 16, "event": "table_file_creation", "file_number": 33, "file_size": 7965313, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 7935272, "index_size": 18091, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10885, "raw_key_size": 107563, "raw_average_key_size": 24, "raw_value_size": 7855728, "raw_average_value_size": 1811, "num_data_blocks": 759, "num_entries": 4336, "num_filter_entries": 4336, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768918519, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 33, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:15:19 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:15:19 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:15:19.100172) [db/compaction/compaction_job.cc:1663] [default] [JOB 16] Compacted 1@0 + 1@6 files to L6 => 7965313 bytes
Jan 20 09:15:19 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:15:19.102123) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 136.5 rd, 108.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.6, 7.9 +0.0 blob) out(7.6 +0.0 blob), read-write-amplify(10.8) write-amplify(4.8) OK, records in: 4851, records dropped: 515 output_compression: NoCompression
Jan 20 09:15:19 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:15:19.102156) EVENT_LOG_v1 {"time_micros": 1768918519102139, "job": 16, "event": "compaction_finished", "compaction_time_micros": 73132, "compaction_time_cpu_micros": 33848, "output_level": 6, "num_output_files": 1, "total_output_size": 7965313, "num_input_records": 4851, "num_output_records": 4336, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 09:15:19 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000032.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:15:19 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918519102929, "job": 16, "event": "table_file_deletion", "file_number": 32}
Jan 20 09:15:19 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000030.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:15:19 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918519106725, "job": 16, "event": "table_file_deletion", "file_number": 30}
Jan 20 09:15:19 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:15:19.026594) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:15:19 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:15:19.106826) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:15:19 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:15:19.106834) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:15:19 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:15:19.106837) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:15:19 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:15:19.106840) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:15:19 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:15:19.106843) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:15:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:19.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:19.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:19 np0005588920 python3.9[205192]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:15:19 np0005588920 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 20 09:15:19 np0005588920 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 20 09:15:21 np0005588920 python3.9[205345]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:15:21 np0005588920 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Jan 20 09:15:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:15:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:21.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:15:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:21.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:22 np0005588920 python3.9[205501]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:15:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:15:22 np0005588920 systemd[1]: Reloading.
Jan 20 09:15:22 np0005588920 podman[205503]: 2026-01-20 14:15:22.271355244 +0000 UTC m=+0.082444123 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 20 09:15:22 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:15:22 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:15:22 np0005588920 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 20 09:15:22 np0005588920 systemd[1]: Starting Open-iSCSI...
Jan 20 09:15:22 np0005588920 kernel: Loading iSCSI transport class v2.0-870.
Jan 20 09:15:22 np0005588920 systemd[1]: Started Open-iSCSI.
Jan 20 09:15:22 np0005588920 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Jan 20 09:15:22 np0005588920 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Jan 20 09:15:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:23.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:15:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:23.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:15:24 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #34. Immutable memtables: 0.
Jan 20 09:15:24 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:15:24.660378) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 09:15:24 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 17] Flushing memtable with next log file: 34
Jan 20 09:15:24 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918524660424, "job": 17, "event": "flush_started", "num_memtables": 1, "num_entries": 304, "num_deletes": 256, "total_data_size": 118358, "memory_usage": 124960, "flush_reason": "Manual Compaction"}
Jan 20 09:15:24 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 17] Level-0 flush table #35: started
Jan 20 09:15:24 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918524662890, "cf_name": "default", "job": 17, "event": "table_file_creation", "file_number": 35, "file_size": 77787, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 17529, "largest_seqno": 17827, "table_properties": {"data_size": 75866, "index_size": 149, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 709, "raw_key_size": 4505, "raw_average_key_size": 16, "raw_value_size": 72071, "raw_average_value_size": 259, "num_data_blocks": 7, "num_entries": 278, "num_filter_entries": 278, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768918519, "oldest_key_time": 1768918519, "file_creation_time": 1768918524, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:15:24 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 17] Flush lasted 2555 microseconds, and 830 cpu microseconds.
Jan 20 09:15:24 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:15:24 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:15:24.662938) [db/flush_job.cc:967] [default] [JOB 17] Level-0 flush table #35: 77787 bytes OK
Jan 20 09:15:24 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:15:24.662954) [db/memtable_list.cc:519] [default] Level-0 commit table #35 started
Jan 20 09:15:24 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:15:24.664515) [db/memtable_list.cc:722] [default] Level-0 commit table #35: memtable #1 done
Jan 20 09:15:24 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:15:24.664528) EVENT_LOG_v1 {"time_micros": 1768918524664524, "job": 17, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 09:15:24 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:15:24.664544) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 09:15:24 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 17] Try to delete WAL files size 116142, prev total WAL file size 116142, number of live WAL files 2.
Jan 20 09:15:24 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000031.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:15:24 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:15:24.665006) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0030' seq:72057594037927935, type:22 .. '6C6F676D00323533' seq:0, type:0; will stop at (end)
Jan 20 09:15:24 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 18] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 09:15:24 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 17 Base level 0, inputs: [35(75KB)], [33(7778KB)]
Jan 20 09:15:24 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918524665051, "job": 18, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [35], "files_L6": [33], "score": -1, "input_data_size": 8043100, "oldest_snapshot_seqno": -1}
Jan 20 09:15:24 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 18] Generated table #36: 4094 keys, 7701348 bytes, temperature: kUnknown
Jan 20 09:15:24 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918524731821, "cf_name": "default", "job": 18, "event": "table_file_creation", "file_number": 36, "file_size": 7701348, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 7673126, "index_size": 16912, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10245, "raw_key_size": 103837, "raw_average_key_size": 25, "raw_value_size": 7597929, "raw_average_value_size": 1855, "num_data_blocks": 696, "num_entries": 4094, "num_filter_entries": 4094, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768918524, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:15:24 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:15:24 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:15:24.732120) [db/compaction/compaction_job.cc:1663] [default] [JOB 18] Compacted 1@0 + 1@6 files to L6 => 7701348 bytes
Jan 20 09:15:24 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:15:24.734402) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 120.3 rd, 115.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.1, 7.6 +0.0 blob) out(7.3 +0.0 blob), read-write-amplify(202.4) write-amplify(99.0) OK, records in: 4614, records dropped: 520 output_compression: NoCompression
Jan 20 09:15:24 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:15:24.734438) EVENT_LOG_v1 {"time_micros": 1768918524734424, "job": 18, "event": "compaction_finished", "compaction_time_micros": 66853, "compaction_time_cpu_micros": 30548, "output_level": 6, "num_output_files": 1, "total_output_size": 7701348, "num_input_records": 4614, "num_output_records": 4094, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 09:15:24 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000035.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:15:24 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918524734618, "job": 18, "event": "table_file_deletion", "file_number": 35}
Jan 20 09:15:24 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000033.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:15:24 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918524737499, "job": 18, "event": "table_file_deletion", "file_number": 33}
Jan 20 09:15:24 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:15:24.664893) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:15:24 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:15:24.737638) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:15:24 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:15:24.737646) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:15:24 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:15:24.737649) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:15:24 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:15:24.737652) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:15:24 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:15:24.737655) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:15:24 np0005588920 python3.9[205720]: ansible-ansible.builtin.service_facts Invoked
Jan 20 09:15:24 np0005588920 network[205737]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 20 09:15:24 np0005588920 network[205738]: 'network-scripts' will be removed from distribution in near future.
Jan 20 09:15:24 np0005588920 network[205739]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 20 09:15:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:25.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:25.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:27.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:15:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:27.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:29.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:29.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:31.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:15:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:31.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:15:31 np0005588920 python3.9[206011]: ansible-ansible.legacy.dnf Invoked with name=['device-mapper-multipath'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 20 09:15:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:15:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:33.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:33.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:34 np0005588920 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 20 09:15:34 np0005588920 systemd[1]: Starting man-db-cache-update.service...
Jan 20 09:15:34 np0005588920 systemd[1]: Reloading.
Jan 20 09:15:35 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:15:35 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:15:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:35.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:35.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:35 np0005588920 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 20 09:15:35 np0005588920 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 20 09:15:35 np0005588920 systemd[1]: Finished man-db-cache-update.service.
Jan 20 09:15:35 np0005588920 systemd[1]: run-ra8d3b766793f416680f31e3881445e29.service: Deactivated successfully.
Jan 20 09:15:36 np0005588920 python3.9[206327]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 20 09:15:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:37.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:15:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:37.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:37 np0005588920 python3.9[206479]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Jan 20 09:15:38 np0005588920 python3.9[206635]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:15:39 np0005588920 python3.9[206758]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768918538.00261-512-164776511642408/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:15:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:39.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:39.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:40 np0005588920 podman[206910]: 2026-01-20 14:15:40.343165124 +0000 UTC m=+0.150306163 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:15:40 np0005588920 python3.9[206911]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:15:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:41.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:41.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:41 np0005588920 python3.9[207088]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 20 09:15:41 np0005588920 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 20 09:15:41 np0005588920 systemd[1]: Stopped Load Kernel Modules.
Jan 20 09:15:41 np0005588920 systemd[1]: Stopping Load Kernel Modules...
Jan 20 09:15:41 np0005588920 systemd[1]: Starting Load Kernel Modules...
Jan 20 09:15:41 np0005588920 systemd[1]: Finished Load Kernel Modules.
Jan 20 09:15:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:15:42 np0005588920 python3.9[207244]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:15:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:43.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:43.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:43 np0005588920 python3.9[207397]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:15:44 np0005588920 python3.9[207549]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:15:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:45.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:45 np0005588920 python3.9[207672]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768918544.0618367-664-72980257056690/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:15:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:45.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:45 np0005588920 python3.9[207824]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:15:46 np0005588920 python3.9[207977]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:15:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:15:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:47.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:47.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:47 np0005588920 python3.9[208129]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:15:48 np0005588920 python3.9[208281]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:15:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:49.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:49.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:49 np0005588920 python3.9[208433]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:15:50 np0005588920 python3.9[208585]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:15:50 np0005588920 python3.9[208737]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:15:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:15:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:51.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:15:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:51.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:51 np0005588920 python3.9[208889]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:15:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:15:52 np0005588920 python3.9[209041]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:15:53 np0005588920 podman[209092]: 2026-01-20 14:15:53.010059019 +0000 UTC m=+0.082568536 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:15:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:15:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:53.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:15:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:53.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:53 np0005588920 python3.9[209215]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/true _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:15:54 np0005588920 python3.9[209368]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:15:54 np0005588920 systemd[1]: Listening on multipathd control socket.
Jan 20 09:15:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:55.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:15:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:55.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:15:55 np0005588920 python3.9[209524]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:15:55 np0005588920 systemd[1]: Starting Wait for udev To Complete Device Initialization...
Jan 20 09:15:55 np0005588920 udevadm[209529]: systemd-udev-settle.service is deprecated. Please fix multipathd.service not to pull it in.
Jan 20 09:15:55 np0005588920 systemd[1]: Finished Wait for udev To Complete Device Initialization.
Jan 20 09:15:55 np0005588920 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 20 09:15:55 np0005588920 multipathd[209532]: --------start up--------
Jan 20 09:15:55 np0005588920 multipathd[209532]: read /etc/multipath.conf
Jan 20 09:15:55 np0005588920 multipathd[209532]: path checkers start up
Jan 20 09:15:55 np0005588920 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 20 09:15:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:15:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:57.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:15:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:57.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:15:58 np0005588920 python3.9[209691]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 20 09:15:58 np0005588920 python3.9[209843]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Jan 20 09:15:58 np0005588920 kernel: Key type psk registered
Jan 20 09:15:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:15:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:59.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:15:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:15:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:15:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:59.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:15:59 np0005588920 python3.9[210004]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:16:00 np0005588920 python3.9[210127]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768918559.2620595-1054-41056368914967/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:16:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:01.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:01 np0005588920 python3.9[210279]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:16:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:01.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:16:02 np0005588920 python3.9[210431]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 20 09:16:02 np0005588920 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 20 09:16:02 np0005588920 systemd[1]: Stopped Load Kernel Modules.
Jan 20 09:16:02 np0005588920 systemd[1]: Stopping Load Kernel Modules...
Jan 20 09:16:02 np0005588920 systemd[1]: Starting Load Kernel Modules...
Jan 20 09:16:02 np0005588920 systemd[1]: Finished Load Kernel Modules.
Jan 20 09:16:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:03.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:03.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:03 np0005588920 python3.9[210587]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 20 09:16:04 np0005588920 systemd[1]: virtnodedevd.service: Deactivated successfully.
Jan 20 09:16:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:05.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:05.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:05 np0005588920 systemd[1]: Reloading.
Jan 20 09:16:05 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:16:05 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:16:06 np0005588920 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 20 09:16:06 np0005588920 systemd[1]: Reloading.
Jan 20 09:16:06 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:16:06 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:16:06 np0005588920 systemd-logind[783]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 20 09:16:06 np0005588920 systemd-logind[783]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 20 09:16:06 np0005588920 lvm[210814]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 20 09:16:06 np0005588920 lvm[210814]: VG ceph_vg0 finished
Jan 20 09:16:06 np0005588920 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 20 09:16:06 np0005588920 systemd[1]: Starting man-db-cache-update.service...
Jan 20 09:16:06 np0005588920 systemd[1]: Reloading.
Jan 20 09:16:06 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:16:06 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:16:07 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:16:07 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:16:07 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:16:07 np0005588920 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 20 09:16:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:16:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:07.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:16:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:07.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:16:08 np0005588920 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 20 09:16:08 np0005588920 systemd[1]: Finished man-db-cache-update.service.
Jan 20 09:16:08 np0005588920 systemd[1]: man-db-cache-update.service: Consumed 1.381s CPU time.
Jan 20 09:16:08 np0005588920 systemd[1]: run-rf7ef522c2cea49e7a27054c0d4de5a78.service: Deactivated successfully.
Jan 20 09:16:08 np0005588920 python3.9[212191]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 20 09:16:08 np0005588920 systemd[1]: Stopping Open-iSCSI...
Jan 20 09:16:08 np0005588920 iscsid[205561]: iscsid shutting down.
Jan 20 09:16:08 np0005588920 systemd[1]: iscsid.service: Deactivated successfully.
Jan 20 09:16:08 np0005588920 systemd[1]: Stopped Open-iSCSI.
Jan 20 09:16:08 np0005588920 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 20 09:16:08 np0005588920 systemd[1]: Starting Open-iSCSI...
Jan 20 09:16:08 np0005588920 systemd[1]: Started Open-iSCSI.
Jan 20 09:16:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:16:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:09.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:16:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:16:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:09.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:16:09 np0005588920 python3.9[212348]: ansible-ansible.builtin.systemd_service Invoked with name=multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 20 09:16:09 np0005588920 multipathd[209532]: exit (signal)
Jan 20 09:16:09 np0005588920 multipathd[209532]: --------shut down-------
Jan 20 09:16:09 np0005588920 systemd[1]: Stopping Device-Mapper Multipath Device Controller...
Jan 20 09:16:09 np0005588920 systemd[1]: multipathd.service: Deactivated successfully.
Jan 20 09:16:09 np0005588920 systemd[1]: Stopped Device-Mapper Multipath Device Controller.
Jan 20 09:16:09 np0005588920 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 20 09:16:09 np0005588920 multipathd[212354]: --------start up--------
Jan 20 09:16:09 np0005588920 multipathd[212354]: read /etc/multipath.conf
Jan 20 09:16:09 np0005588920 multipathd[212354]: path checkers start up
Jan 20 09:16:09 np0005588920 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 20 09:16:10 np0005588920 python3.9[212511]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 09:16:11 np0005588920 podman[212516]: 2026-01-20 14:16:11.040763329 +0000 UTC m=+0.109740141 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:16:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:16:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:11.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:16:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:11.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:11 np0005588920 python3.9[212693]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:16:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:16:13 np0005588920 python3.9[212845]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 20 09:16:13 np0005588920 systemd[1]: Reloading.
Jan 20 09:16:13 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:16:13 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:16:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:13.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:13.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:14 np0005588920 python3.9[213030]: ansible-ansible.builtin.service_facts Invoked
Jan 20 09:16:14 np0005588920 network[213047]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 20 09:16:14 np0005588920 network[213048]: 'network-scripts' will be removed from distribution in near future.
Jan 20 09:16:14 np0005588920 network[213049]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 20 09:16:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:15.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:15.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:15 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:16:15 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:16:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:16:16.418 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:16:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:16:16.419 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:16:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:16:16.419 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:16:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:16:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:17.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:17.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:17 np0005588920 systemd[1]: virtqemud.service: Deactivated successfully.
Jan 20 09:16:17 np0005588920 systemd[1]: virtsecretd.service: Deactivated successfully.
Jan 20 09:16:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:19.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:19.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:21 np0005588920 python3.9[213374]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:16:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:21.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:21.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:21 np0005588920 python3.9[213527]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:16:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:16:22 np0005588920 python3.9[213680]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:16:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:23.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:23.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:23 np0005588920 podman[213805]: 2026-01-20 14:16:23.450035941 +0000 UTC m=+0.085927900 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 20 09:16:23 np0005588920 python3.9[213848]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:16:24 np0005588920 python3.9[214005]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:16:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:16:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:25.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:16:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:25.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:26 np0005588920 python3.9[214158]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:16:26 np0005588920 python3.9[214311]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:16:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:16:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:27.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:27.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:28 np0005588920 python3.9[214464]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:16:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:29.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:29.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:29 np0005588920 python3.9[214617]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:16:30 np0005588920 python3.9[214769]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:16:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:31.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:31 np0005588920 python3.9[214921]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:16:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:31.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:16:32 np0005588920 python3.9[215073]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:16:32 np0005588920 python3.9[215225]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:16:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:33.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:33.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:33 np0005588920 python3.9[215377]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:16:34 np0005588920 python3.9[215529]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:16:35 np0005588920 python3.9[215681]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:16:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:16:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:35.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:16:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:35.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:36 np0005588920 python3.9[215833]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:16:36 np0005588920 python3.9[215985]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:16:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:16:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:37.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:37.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:37 np0005588920 python3.9[216137]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:16:38 np0005588920 python3.9[216289]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:16:38 np0005588920 python3.9[216441]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:16:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:39.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:39.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:39 np0005588920 python3.9[216593]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:16:40 np0005588920 python3.9[216745]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:16:41 np0005588920 podman[216897]: 2026-01-20 14:16:41.332741225 +0000 UTC m=+0.194452007 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202)
Jan 20 09:16:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:41.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:41 np0005588920 python3.9[216898]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:16:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:41.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:16:42 np0005588920 python3.9[217076]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:16:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:43.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:43 np0005588920 python3.9[217228]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 20 09:16:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:43.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:44 np0005588920 python3.9[217380]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 20 09:16:44 np0005588920 systemd[1]: Reloading.
Jan 20 09:16:44 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:16:44 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:16:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:45.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:45.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:45 np0005588920 python3.9[217566]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:16:46 np0005588920 python3.9[217719]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:16:46 np0005588920 python3.9[217872]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:16:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:16:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:16:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:47.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:16:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:47.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:47 np0005588920 python3.9[218025]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:16:48 np0005588920 python3.9[218178]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:16:48 np0005588920 python3.9[218331]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:16:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:49.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:49.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:49 np0005588920 python3.9[218484]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:16:50 np0005588920 python3.9[218637]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:16:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:51.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:51.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:16:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:53.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:53 np0005588920 python3.9[218790]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:16:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:53.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:53 np0005588920 podman[218895]: 2026-01-20 14:16:53.989726251 +0000 UTC m=+0.071332389 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true)
Jan 20 09:16:54 np0005588920 python3.9[218961]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:16:54 np0005588920 python3.9[219114]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:16:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:55.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:55.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:55 np0005588920 python3.9[219266]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:16:56 np0005588920 python3.9[219418]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:16:56 np0005588920 python3.9[219570]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:16:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:16:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:57.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:57.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:57 np0005588920 python3.9[219722]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:16:58 np0005588920 python3.9[219874]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:16:59 np0005588920 python3.9[220026]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:16:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:59.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:16:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:16:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:59.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:16:59 np0005588920 python3.9[220178]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:17:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:01.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:17:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:01.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:17:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:17:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:03.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:03.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:05.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:05.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:05 np0005588920 python3.9[220330]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Jan 20 09:17:06 np0005588920 python3.9[220483]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 20 09:17:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:17:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:07.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:07.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:07 np0005588920 python3.9[220641]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 20 09:17:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:09.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:17:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:09.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:17:09 np0005588920 systemd-logind[783]: New session 50 of user zuul.
Jan 20 09:17:09 np0005588920 systemd[1]: Started Session 50 of User zuul.
Jan 20 09:17:09 np0005588920 systemd[1]: session-50.scope: Deactivated successfully.
Jan 20 09:17:09 np0005588920 systemd-logind[783]: Session 50 logged out. Waiting for processes to exit.
Jan 20 09:17:09 np0005588920 systemd-logind[783]: Removed session 50.
Jan 20 09:17:10 np0005588920 python3.9[220827]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:17:11 np0005588920 python3.9[220948]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768918630.170517-2662-223733713216510/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:17:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:11.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:11.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:11 np0005588920 podman[221072]: 2026-01-20 14:17:11.630388797 +0000 UTC m=+0.091546216 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Jan 20 09:17:11 np0005588920 python3.9[221113]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:17:12 np0005588920 python3.9[221200]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:17:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:17:12 np0005588920 python3.9[221350]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:17:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:13.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:13 np0005588920 python3.9[221471]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768918632.256888-2662-221097945092453/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:17:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:13.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:14 np0005588920 python3.9[221621]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:17:14 np0005588920 python3.9[221742]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768918633.575495-2662-218504598916130/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=d01cc1b48d783e4ed08d12bb4d0a107aba230a69 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:17:15 np0005588920 python3.9[221892]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:17:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:15.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:15.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:16 np0005588920 python3.9[222013]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768918634.914987-2662-111366924813447/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:17:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:17:16.419 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:17:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:17:16.419 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:17:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:17:16.420 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:17:16 np0005588920 python3.9[222295]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:17:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:17:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:17.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:17 np0005588920 python3.9[222416]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768918636.2730157-2662-132197409961368/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:17:17 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:17:17 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:17:17 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:17:17 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:17:17 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:17:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:17.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:18 np0005588920 python3.9[222568]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:17:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:19.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:19 np0005588920 python3.9[222720]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:17:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:19.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:20 np0005588920 python3.9[222872]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:17:21 np0005588920 python3.9[223024]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:17:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:21.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:21.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:21 np0005588920 python3.9[223147]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1768918640.61751-2983-274278018233942/.source _original_basename=.hsu3vk7c follow=False checksum=9861b3b7d2b77aa94d2d681766e21b7660b8af83 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Jan 20 09:17:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:17:22 np0005588920 python3.9[223299]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:17:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:23.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:23 np0005588920 python3.9[223451]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:17:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:23.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:23 np0005588920 python3.9[223572]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768918642.9367201-3061-99792923836788/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=aff5546b44cf4461a7541a94e4cce1332c9b58b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:17:24 np0005588920 podman[223696]: 2026-01-20 14:17:24.669886774 +0000 UTC m=+0.084869175 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 20 09:17:24 np0005588920 python3.9[223732]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:17:25 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:17:25 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:17:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:25.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:17:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:25.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:17:25 np0005588920 python3.9[223912]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768918644.2685301-3106-220097860415607/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:17:26 np0005588920 python3.9[224064]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Jan 20 09:17:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:17:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:17:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:27.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:17:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:17:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:27.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:17:27 np0005588920 python3.9[224216]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 20 09:17:29 np0005588920 python3[224368]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 20 09:17:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:17:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:29.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:17:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:29.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:17:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:31.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:17:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:31.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:17:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:33.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:33.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:35.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:35.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:17:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:37.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:17:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:37.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:38 np0005588920 ceph-mds[83715]: mds.beacon.cephfs.compute-2.jyxktq missed beacon ack from the monitors
Jan 20 09:17:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:39.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:39 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:17:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:39.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:39 np0005588920 podman[224379]: 2026-01-20 14:17:39.589422575 +0000 UTC m=+10.298721622 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 20 09:17:39 np0005588920 podman[224457]: 2026-01-20 14:17:39.827440825 +0000 UTC m=+0.080544982 container create 051218544c4622ef56308610ff7daeab4f240fcc2e5996468298508725d68a84 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute_init, managed_by=edpm_ansible, config_id=edpm, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 20 09:17:39 np0005588920 podman[224457]: 2026-01-20 14:17:39.789332706 +0000 UTC m=+0.042436913 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 20 09:17:39 np0005588920 python3[224368]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Jan 20 09:17:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:17:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:41.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:17:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:17:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:41.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:17:42 np0005588920 podman[224520]: 2026-01-20 14:17:42.041390251 +0000 UTC m=+0.123404496 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 20 09:17:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:43.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:43.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:17:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:45.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:45.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:46 np0005588920 python3.9[224673]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:17:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:47.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:47.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:47 np0005588920 python3.9[224827]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Jan 20 09:17:48 np0005588920 python3.9[224979]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 20 09:17:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:49.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:49 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:17:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:17:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:49.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:17:49 np0005588920 python3[225131]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 20 09:17:50 np0005588920 podman[225169]: 2026-01-20 14:17:50.209638951 +0000 UTC m=+0.081706235 container create 1a5c0888e05931f5eb52626a06d30a5cc7a8d117be43fa00a7149c8a3563bf3d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, io.buildah.version=1.41.3, config_id=edpm, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, container_name=nova_compute)
Jan 20 09:17:50 np0005588920 podman[225169]: 2026-01-20 14:17:50.169831494 +0000 UTC m=+0.041898848 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 20 09:17:50 np0005588920 python3[225131]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath --volume /etc/multipath.conf:/etc/multipath.conf:ro,Z --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Jan 20 09:17:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:51.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:17:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:51.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:17:51 np0005588920 python3.9[225359]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:17:52 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 09:17:52 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1200.0 total, 600.0 interval#012Cumulative writes: 3550 writes, 19K keys, 3550 commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.03 MB/s#012Cumulative WAL: 3550 writes, 3550 syncs, 1.00 writes per sync, written: 0.04 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1373 writes, 6577 keys, 1373 commit groups, 1.0 writes per commit group, ingest: 14.40 MB, 0.02 MB/s#012Interval WAL: 1373 writes, 1373 syncs, 1.00 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     57.0      0.35              0.08         9    0.039       0      0       0.0       0.0#012  L6      1/0    7.34 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   3.2     95.0     79.3      0.81              0.28         8    0.101     35K   4271       0.0       0.0#012 Sum      1/0    7.34 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.2     66.2     72.5      1.16              0.36        17    0.068     35K   4271       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   6.3     99.5     98.2      0.39              0.17         8    0.049     19K   2485       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   0.0     95.0     79.3      0.81              0.28         8    0.101     35K   4271       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     57.3      0.35              0.08         8    0.044       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.0 total, 600.0 interval#012Flush(GB): cumulative 0.020, interval 0.006#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.08 GB write, 0.07 MB/s write, 0.07 GB read, 0.06 MB/s read, 1.2 seconds#012Interval compaction: 0.04 GB write, 0.06 MB/s write, 0.04 GB read, 0.06 MB/s read, 0.4 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x564a2f9711f0#2 capacity: 308.00 MB usage: 4.67 MB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 0 last_secs: 0.000103 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(253,4.37 MB,1.4184%) FilterBlock(17,108.98 KB,0.0345552%) IndexBlock(17,203.77 KB,0.0646071%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 20 09:17:53 np0005588920 python3.9[225513]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:17:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:53.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:17:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:53.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:17:53 np0005588920 python3.9[225664]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1768918673.2941875-3393-107649440751089/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:17:54 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:17:54 np0005588920 python3.9[225740]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 20 09:17:54 np0005588920 systemd[1]: Reloading.
Jan 20 09:17:54 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:17:54 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:17:54 np0005588920 podman[225775]: 2026-01-20 14:17:54.983480394 +0000 UTC m=+0.094917173 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 20 09:17:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:55.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:55.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:55 np0005588920 python3.9[225870]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:17:55 np0005588920 systemd[1]: Reloading.
Jan 20 09:17:55 np0005588920 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:17:55 np0005588920 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:17:56 np0005588920 systemd[1]: Starting nova_compute container...
Jan 20 09:17:56 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:17:56 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cde3db8b5d4ce0f158862cd6e774e80423d7590fe6fdc1c3663798e14454dfa0/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 20 09:17:56 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cde3db8b5d4ce0f158862cd6e774e80423d7590fe6fdc1c3663798e14454dfa0/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 20 09:17:56 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cde3db8b5d4ce0f158862cd6e774e80423d7590fe6fdc1c3663798e14454dfa0/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 20 09:17:56 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cde3db8b5d4ce0f158862cd6e774e80423d7590fe6fdc1c3663798e14454dfa0/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 20 09:17:56 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cde3db8b5d4ce0f158862cd6e774e80423d7590fe6fdc1c3663798e14454dfa0/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 20 09:17:56 np0005588920 podman[225909]: 2026-01-20 14:17:56.237293271 +0000 UTC m=+0.127876584 container init 1a5c0888e05931f5eb52626a06d30a5cc7a8d117be43fa00a7149c8a3563bf3d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 20 09:17:56 np0005588920 podman[225909]: 2026-01-20 14:17:56.245340531 +0000 UTC m=+0.135923814 container start 1a5c0888e05931f5eb52626a06d30a5cc7a8d117be43fa00a7149c8a3563bf3d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, io.buildah.version=1.41.3, container_name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible)
Jan 20 09:17:56 np0005588920 podman[225909]: nova_compute
Jan 20 09:17:56 np0005588920 nova_compute[225924]: + sudo -E kolla_set_configs
Jan 20 09:17:56 np0005588920 systemd[1]: Started nova_compute container.
Jan 20 09:17:56 np0005588920 nova_compute[225924]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 20 09:17:56 np0005588920 nova_compute[225924]: INFO:__main__:Validating config file
Jan 20 09:17:56 np0005588920 nova_compute[225924]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 20 09:17:56 np0005588920 nova_compute[225924]: INFO:__main__:Copying service configuration files
Jan 20 09:17:56 np0005588920 nova_compute[225924]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 20 09:17:56 np0005588920 nova_compute[225924]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 20 09:17:56 np0005588920 nova_compute[225924]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 20 09:17:56 np0005588920 nova_compute[225924]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 20 09:17:56 np0005588920 nova_compute[225924]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 20 09:17:56 np0005588920 nova_compute[225924]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 20 09:17:56 np0005588920 nova_compute[225924]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 20 09:17:56 np0005588920 nova_compute[225924]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 20 09:17:56 np0005588920 nova_compute[225924]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 20 09:17:56 np0005588920 nova_compute[225924]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 20 09:17:56 np0005588920 nova_compute[225924]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 20 09:17:56 np0005588920 nova_compute[225924]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 20 09:17:56 np0005588920 nova_compute[225924]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 20 09:17:56 np0005588920 nova_compute[225924]: INFO:__main__:Deleting /etc/ceph
Jan 20 09:17:56 np0005588920 nova_compute[225924]: INFO:__main__:Creating directory /etc/ceph
Jan 20 09:17:56 np0005588920 nova_compute[225924]: INFO:__main__:Setting permission for /etc/ceph
Jan 20 09:17:56 np0005588920 nova_compute[225924]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Jan 20 09:17:56 np0005588920 nova_compute[225924]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 20 09:17:56 np0005588920 nova_compute[225924]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Jan 20 09:17:56 np0005588920 nova_compute[225924]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 20 09:17:56 np0005588920 nova_compute[225924]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 20 09:17:56 np0005588920 nova_compute[225924]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 20 09:17:56 np0005588920 nova_compute[225924]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 20 09:17:56 np0005588920 nova_compute[225924]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 20 09:17:56 np0005588920 nova_compute[225924]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 20 09:17:56 np0005588920 nova_compute[225924]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 20 09:17:56 np0005588920 nova_compute[225924]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 20 09:17:56 np0005588920 nova_compute[225924]: INFO:__main__:Writing out command to execute
Jan 20 09:17:56 np0005588920 nova_compute[225924]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 20 09:17:56 np0005588920 nova_compute[225924]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 20 09:17:56 np0005588920 nova_compute[225924]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 20 09:17:56 np0005588920 nova_compute[225924]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 20 09:17:56 np0005588920 nova_compute[225924]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 20 09:17:56 np0005588920 nova_compute[225924]: ++ cat /run_command
Jan 20 09:17:56 np0005588920 nova_compute[225924]: + CMD=nova-compute
Jan 20 09:17:56 np0005588920 nova_compute[225924]: + ARGS=
Jan 20 09:17:56 np0005588920 nova_compute[225924]: + sudo kolla_copy_cacerts
Jan 20 09:17:56 np0005588920 nova_compute[225924]: + [[ ! -n '' ]]
Jan 20 09:17:56 np0005588920 nova_compute[225924]: + . kolla_extend_start
Jan 20 09:17:56 np0005588920 nova_compute[225924]: Running command: 'nova-compute'
Jan 20 09:17:56 np0005588920 nova_compute[225924]: + echo 'Running command: '\''nova-compute'\'''
Jan 20 09:17:56 np0005588920 nova_compute[225924]: + umask 0022
Jan 20 09:17:56 np0005588920 nova_compute[225924]: + exec nova-compute
Jan 20 09:17:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:57.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:57.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:57 np0005588920 python3.9[226086]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:17:58 np0005588920 nova_compute[225924]: 2026-01-20 14:17:58.464 225928 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 20 09:17:58 np0005588920 nova_compute[225924]: 2026-01-20 14:17:58.464 225928 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 20 09:17:58 np0005588920 nova_compute[225924]: 2026-01-20 14:17:58.464 225928 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 20 09:17:58 np0005588920 nova_compute[225924]: 2026-01-20 14:17:58.465 225928 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Jan 20 09:17:58 np0005588920 python3.9[226238]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:17:58 np0005588920 nova_compute[225924]: 2026-01-20 14:17:58.607 225928 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:17:58 np0005588920 nova_compute[225924]: 2026-01-20 14:17:58.633 225928 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.026s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:17:58 np0005588920 nova_compute[225924]: 2026-01-20 14:17:58.634 225928 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.379 225928 INFO nova.virt.driver [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Jan 20 09:17:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:17:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:59.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:17:59 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:17:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:17:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:59.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:59 np0005588920 python3.9[226390]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.617 225928 INFO nova.compute.provider_config [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.642 225928 DEBUG oslo_concurrency.lockutils [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.642 225928 DEBUG oslo_concurrency.lockutils [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.643 225928 DEBUG oslo_concurrency.lockutils [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.643 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.644 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.644 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.644 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.644 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.645 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.645 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.645 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.645 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.646 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.646 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.646 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.646 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.647 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.647 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.647 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.647 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.648 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.648 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.648 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] console_host                   = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.648 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.648 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.649 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.649 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.649 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.649 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.650 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.650 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.650 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.650 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.651 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.651 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.651 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.651 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.652 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.652 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.652 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.652 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.653 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.653 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.653 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.653 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.654 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.654 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.654 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.654 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.655 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.655 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.655 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.655 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.656 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.656 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.656 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.656 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.657 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.657 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.657 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.657 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.657 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.658 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.658 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.658 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.658 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.659 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.659 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.659 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.659 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.659 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.660 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.660 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.660 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.660 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.661 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.661 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.661 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.661 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.662 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.662 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.662 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.662 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.663 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] my_block_storage_ip            = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.663 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] my_ip                          = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.663 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.663 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.664 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.664 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.664 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.664 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.664 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.665 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.665 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.665 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.665 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.666 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.666 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.666 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.666 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.666 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.667 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.667 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.667 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.667 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.668 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.668 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.668 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.668 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.669 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.669 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.669 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.669 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.669 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.670 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.670 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.670 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.670 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.671 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.671 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.671 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.671 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.671 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.672 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.672 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.672 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.672 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.673 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.673 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.673 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.673 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.673 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.674 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.674 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.674 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.674 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.674 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.675 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.675 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.675 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.675 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.675 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.675 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.675 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.675 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.676 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.676 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.676 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.676 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.676 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.676 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.676 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.677 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.677 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.677 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.677 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.677 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.677 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.677 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.678 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.678 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.678 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.678 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.678 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.678 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.678 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.679 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.679 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.679 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.679 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.679 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.679 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.680 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.680 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.680 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.680 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.680 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.680 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.680 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.680 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.681 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.681 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.681 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.681 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.681 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.681 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.681 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.682 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.682 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.682 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.682 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.682 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.682 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.683 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.683 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.683 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.683 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.683 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.683 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.683 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.683 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.684 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.684 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.684 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.684 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.684 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.684 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.684 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.685 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.685 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.685 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.685 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.685 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.685 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.685 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.686 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.686 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.686 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.686 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.686 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.686 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.686 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.687 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.687 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.687 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.687 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.687 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.687 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.687 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.688 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.688 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.688 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.688 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.688 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.688 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.688 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.688 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.689 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.689 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.689 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.689 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.689 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.689 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.689 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.690 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.690 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.690 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.690 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.690 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.690 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.690 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.691 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.691 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.691 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.691 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.691 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.691 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.691 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.691 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.692 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.692 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.692 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.692 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.692 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.692 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.692 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.693 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.693 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.693 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.693 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.693 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.693 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.693 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.694 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.694 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.694 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.694 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.694 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.694 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.694 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.695 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.695 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.695 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.695 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.695 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.695 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.695 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.695 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.696 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.696 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.696 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.696 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.696 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.696 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.696 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.697 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.697 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.697 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.697 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.697 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.697 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.697 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.697 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.698 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.698 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.698 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.698 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.698 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.698 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.698 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.699 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.699 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.699 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.699 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.699 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.699 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.699 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.700 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.700 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.700 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.700 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.700 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.700 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.700 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.700 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.701 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.701 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.701 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.701 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.701 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.701 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.701 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.702 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.702 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.702 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.702 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.702 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.702 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.702 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.703 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.703 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.703 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.703 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.703 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.703 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.703 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.703 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.704 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.704 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.704 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.704 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.704 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.704 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.704 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.705 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.705 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.705 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.705 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.705 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.705 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.706 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.706 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.706 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.706 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.706 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.706 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.706 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.707 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.707 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.707 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.707 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.707 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.707 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.707 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.708 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.708 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.708 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.708 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.708 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.708 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.708 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.708 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.709 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.709 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.709 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.709 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.709 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.709 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.709 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.710 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.710 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.710 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.710 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.710 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.710 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.710 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.711 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.711 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.711 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.711 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.711 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.711 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.711 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.711 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.712 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.712 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.712 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.712 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.712 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.712 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.712 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.713 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.713 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.713 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.713 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.713 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.713 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.713 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.714 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.714 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.714 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.714 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.714 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.714 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.714 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.715 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.715 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.715 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.715 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.715 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.715 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.715 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.715 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.716 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.716 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.716 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.716 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.716 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.716 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.716 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.717 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.717 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.717 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.717 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.717 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.717 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.717 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.717 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.718 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.718 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.718 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.718 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.718 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.718 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.718 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.719 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.719 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.719 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.719 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.719 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.719 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.719 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.720 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.720 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.720 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.720 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.720 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.720 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.720 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.721 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.721 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.721 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.721 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.721 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.721 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.721 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.722 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.722 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.722 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.722 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.722 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.722 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.722 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.723 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.723 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.723 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.723 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.723 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.723 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.723 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.724 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.724 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.724 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.724 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.724 225928 WARNING oslo_config.cfg [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 20 09:17:59 np0005588920 nova_compute[225924]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 20 09:17:59 np0005588920 nova_compute[225924]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 20 09:17:59 np0005588920 nova_compute[225924]: and ``live_migration_inbound_addr`` respectively.
Jan 20 09:17:59 np0005588920 nova_compute[225924]: ).  Its value may be silently ignored in the future.#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.724 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.725 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.725 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.725 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.725 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.725 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.725 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.725 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.725 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.726 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.726 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.726 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.726 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.726 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.726 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.726 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.727 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.727 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.727 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.rbd_secret_uuid        = e399cf45-e6b6-5393-99f1-75c601d3f188 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.727 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.727 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.727 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.727 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.728 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.728 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.728 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.728 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.728 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.728 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.728 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.728 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.729 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.729 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.729 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.729 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.729 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.729 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.730 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.730 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.730 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.730 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.730 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.730 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.730 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.731 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.731 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.731 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.731 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.731 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.731 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.731 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.731 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.732 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.732 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.732 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.732 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.732 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.732 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.732 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.733 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.733 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.733 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.733 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.733 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.733 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.733 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.733 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.734 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.734 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.734 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.734 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.734 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.734 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.734 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.735 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.735 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.735 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.735 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.735 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.735 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.735 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.735 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.736 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.736 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.736 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.736 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.736 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.736 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.737 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.737 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.737 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.737 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.737 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.737 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.737 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.737 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.738 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.738 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.738 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.738 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.738 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.738 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.738 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.739 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.739 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.739 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.739 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.739 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.739 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.739 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.740 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.740 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.740 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.740 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.740 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.740 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.740 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.740 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.741 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.741 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.741 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.741 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.741 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.741 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.741 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.742 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.742 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.742 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.742 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.742 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.742 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.742 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.743 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.743 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.743 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.743 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.743 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.743 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.743 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.744 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.744 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.744 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.744 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.744 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.744 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.744 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.745 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.745 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.745 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.745 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.745 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.745 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.745 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.745 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.746 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.746 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.746 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.746 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.746 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.746 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.747 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.747 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.747 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.747 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.747 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.747 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.747 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.747 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.748 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.748 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.748 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.748 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.748 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.748 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.748 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.749 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.749 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.749 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.749 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.749 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.749 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.749 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.750 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.750 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.750 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.750 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.750 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.750 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.750 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.751 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.751 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.751 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.751 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.751 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.751 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.751 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.751 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.752 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.752 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.752 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.752 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.752 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.752 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.753 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.753 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.753 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.753 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.753 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.753 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.753 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.753 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.754 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.754 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.754 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.754 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.754 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.754 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.754 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.755 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.755 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.755 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.755 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.755 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.755 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.755 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.756 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.756 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.756 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.756 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.756 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.756 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.756 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.757 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.757 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.757 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.757 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.757 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.757 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.757 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.758 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.758 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.758 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.758 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.758 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.758 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.758 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.758 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.759 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.759 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.759 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.759 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.759 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.759 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vnc.server_proxyclient_address = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.760 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.760 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.760 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.760 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.760 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.760 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.760 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.761 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.761 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.761 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.761 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.761 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.761 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.761 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.761 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.762 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.762 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.762 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.762 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.762 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.762 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.762 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.763 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.763 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.763 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.763 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.763 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.763 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.763 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.764 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.764 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.764 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.764 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.764 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.764 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.764 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.764 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.765 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.765 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.765 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.765 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.765 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.765 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.765 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.766 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.766 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.766 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.766 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.766 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.766 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.766 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.767 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.767 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.767 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.767 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.767 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.767 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.767 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.767 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.768 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.768 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.768 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.768 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.768 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.768 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.768 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.769 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.769 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.769 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.769 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.769 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.769 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.769 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.769 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.770 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.770 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.770 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.770 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.770 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.770 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.770 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.771 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.771 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.771 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.771 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.771 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.771 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.771 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.772 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.772 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.772 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.772 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.772 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.772 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.772 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.772 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.773 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.773 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.773 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.773 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.773 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.773 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.773 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.774 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.774 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.774 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.774 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.774 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.774 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.774 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.774 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.775 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.775 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.775 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.775 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.775 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.775 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.775 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.775 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.776 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.776 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.776 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.776 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.776 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.776 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.776 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.777 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.777 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.777 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.777 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.777 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.777 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.777 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.777 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.778 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.778 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.778 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.778 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.778 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.778 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.778 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.779 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.779 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.779 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.779 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.779 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.779 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.779 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.779 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.780 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.780 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.780 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.780 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.780 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.780 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.780 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.780 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.781 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.781 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.781 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.781 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.781 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.781 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.781 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.782 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.782 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.782 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.782 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.782 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.782 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.782 225928 DEBUG oslo_service.service [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.783 225928 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.846 225928 DEBUG nova.virt.libvirt.host [None req-03ea53aa-58ec-4710-a186-99b9f32b78d2 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.847 225928 DEBUG nova.virt.libvirt.host [None req-03ea53aa-58ec-4710-a186-99b9f32b78d2 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.847 225928 DEBUG nova.virt.libvirt.host [None req-03ea53aa-58ec-4710-a186-99b9f32b78d2 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.847 225928 DEBUG nova.virt.libvirt.host [None req-03ea53aa-58ec-4710-a186-99b9f32b78d2 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Jan 20 09:17:59 np0005588920 systemd[1]: Starting libvirt QEMU daemon...
Jan 20 09:17:59 np0005588920 systemd[1]: Started libvirt QEMU daemon.
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.919 225928 DEBUG nova.virt.libvirt.host [None req-03ea53aa-58ec-4710-a186-99b9f32b78d2 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7feabf8cf580> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.924 225928 DEBUG nova.virt.libvirt.host [None req-03ea53aa-58ec-4710-a186-99b9f32b78d2 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7feabf8cf580> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.925 225928 INFO nova.virt.libvirt.driver [None req-03ea53aa-58ec-4710-a186-99b9f32b78d2 - - - - - -] Connection event '1' reason 'None'#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.956 225928 WARNING nova.virt.libvirt.driver [None req-03ea53aa-58ec-4710-a186-99b9f32b78d2 - - - - - -] Cannot update service status on host "compute-2.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.#033[00m
Jan 20 09:17:59 np0005588920 nova_compute[225924]: 2026-01-20 14:17:59.956 225928 DEBUG nova.virt.libvirt.volume.mount [None req-03ea53aa-58ec-4710-a186-99b9f32b78d2 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Jan 20 09:18:00 np0005588920 nova_compute[225924]: 2026-01-20 14:18:00.789 225928 INFO nova.virt.libvirt.host [None req-03ea53aa-58ec-4710-a186-99b9f32b78d2 - - - - - -] Libvirt host capabilities <capabilities>
Jan 20 09:18:00 np0005588920 nova_compute[225924]: 
Jan 20 09:18:00 np0005588920 nova_compute[225924]:  <host>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <uuid>1190ec40-2b89-4358-8dec-733c5829fbed</uuid>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <cpu>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <arch>x86_64</arch>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model>EPYC-Rome-v4</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <vendor>AMD</vendor>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <microcode version='16777317'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <signature family='23' model='49' stepping='0'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <maxphysaddr mode='emulate' bits='40'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature name='x2apic'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature name='tsc-deadline'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature name='osxsave'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature name='hypervisor'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature name='tsc_adjust'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature name='spec-ctrl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature name='stibp'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature name='arch-capabilities'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature name='ssbd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature name='cmp_legacy'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature name='topoext'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature name='virt-ssbd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature name='lbrv'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature name='tsc-scale'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature name='vmcb-clean'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature name='pause-filter'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature name='pfthreshold'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature name='svme-addr-chk'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature name='rdctl-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature name='skip-l1dfl-vmentry'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature name='mds-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature name='pschange-mc-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <pages unit='KiB' size='4'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <pages unit='KiB' size='2048'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <pages unit='KiB' size='1048576'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    </cpu>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <power_management>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <suspend_mem/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    </power_management>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <iommu support='no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <migration_features>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <live/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <uri_transports>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <uri_transport>tcp</uri_transport>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <uri_transport>rdma</uri_transport>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </uri_transports>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    </migration_features>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <topology>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <cells num='1'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <cell id='0'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:          <memory unit='KiB'>7864308</memory>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:          <pages unit='KiB' size='4'>1966077</pages>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:          <pages unit='KiB' size='2048'>0</pages>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:          <pages unit='KiB' size='1048576'>0</pages>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:          <distances>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:            <sibling id='0' value='10'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:          </distances>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:          <cpus num='8'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:          </cpus>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        </cell>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </cells>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    </topology>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <cache>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    </cache>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <secmodel>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model>selinux</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <doi>0</doi>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    </secmodel>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <secmodel>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model>dac</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <doi>0</doi>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <baselabel type='kvm'>+107:+107</baselabel>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <baselabel type='qemu'>+107:+107</baselabel>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    </secmodel>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:  </host>
Jan 20 09:18:00 np0005588920 nova_compute[225924]: 
Jan 20 09:18:00 np0005588920 nova_compute[225924]:  <guest>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <os_type>hvm</os_type>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <arch name='i686'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <wordsize>32</wordsize>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <domain type='qemu'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <domain type='kvm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    </arch>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <features>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <pae/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <nonpae/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <acpi default='on' toggle='yes'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <apic default='on' toggle='no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <cpuselection/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <deviceboot/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <disksnapshot default='on' toggle='no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <externalSnapshot/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    </features>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:  </guest>
Jan 20 09:18:00 np0005588920 nova_compute[225924]: 
Jan 20 09:18:00 np0005588920 nova_compute[225924]:  <guest>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <os_type>hvm</os_type>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <arch name='x86_64'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <wordsize>64</wordsize>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <domain type='qemu'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <domain type='kvm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    </arch>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <features>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <acpi default='on' toggle='yes'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <apic default='on' toggle='no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <cpuselection/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <deviceboot/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <disksnapshot default='on' toggle='no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <externalSnapshot/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    </features>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:  </guest>
Jan 20 09:18:00 np0005588920 nova_compute[225924]: 
Jan 20 09:18:00 np0005588920 nova_compute[225924]: </capabilities>
Jan 20 09:18:00 np0005588920 nova_compute[225924]: 2026-01-20 14:18:00.817 225928 DEBUG nova.virt.libvirt.host [None req-03ea53aa-58ec-4710-a186-99b9f32b78d2 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 20 09:18:00 np0005588920 nova_compute[225924]: 2026-01-20 14:18:00.849 225928 DEBUG nova.virt.libvirt.host [None req-03ea53aa-58ec-4710-a186-99b9f32b78d2 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 20 09:18:00 np0005588920 nova_compute[225924]: <domainCapabilities>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:  <path>/usr/libexec/qemu-kvm</path>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:  <domain>kvm</domain>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:  <arch>i686</arch>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:  <vcpu max='4096'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:  <iothreads supported='yes'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:  <os supported='yes'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <enum name='firmware'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <loader supported='yes'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <enum name='type'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>rom</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>pflash</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <enum name='readonly'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>yes</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>no</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <enum name='secure'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>no</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    </loader>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:  </os>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:  <cpu>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <mode name='host-passthrough' supported='yes'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <enum name='hostPassthroughMigratable'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>on</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>off</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    </mode>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <mode name='maximum' supported='yes'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <enum name='maximumMigratable'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>on</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>off</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    </mode>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <mode name='host-model' supported='yes'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <vendor>AMD</vendor>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature policy='require' name='x2apic'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature policy='require' name='tsc-deadline'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature policy='require' name='hypervisor'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature policy='require' name='tsc_adjust'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature policy='require' name='spec-ctrl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature policy='require' name='stibp'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature policy='require' name='ssbd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature policy='require' name='cmp_legacy'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature policy='require' name='overflow-recov'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature policy='require' name='succor'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature policy='require' name='ibrs'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature policy='require' name='amd-ssbd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature policy='require' name='virt-ssbd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature policy='require' name='lbrv'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature policy='require' name='tsc-scale'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature policy='require' name='vmcb-clean'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature policy='require' name='flushbyasid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature policy='require' name='pause-filter'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature policy='require' name='pfthreshold'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature policy='require' name='svme-addr-chk'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature policy='disable' name='xsaves'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    </mode>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <mode name='custom' supported='yes'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Broadwell'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Broadwell-IBRS'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Broadwell-noTSX'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Broadwell-v1'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Broadwell-v2'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Broadwell-v3'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Broadwell-v4'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Cascadelake-Server'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Cascadelake-Server-v1'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Cascadelake-Server-v2'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Cascadelake-Server-v3'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Cascadelake-Server-v4'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Cascadelake-Server-v5'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='ClearwaterForest'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx-ifma'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx-vnni-int16'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='bhi-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='cldemote'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='cmpccxadd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='ddpd-u'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fbsdp-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='intel-psfd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='lam'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='mcdt-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pbrsb-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='prefetchiti'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='psdp-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='sha512'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='sm3'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='sm4'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='ClearwaterForest-v1'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx-ifma'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx-vnni-int16'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='bhi-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='cldemote'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='cmpccxadd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='ddpd-u'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fbsdp-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='intel-psfd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='lam'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='mcdt-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pbrsb-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='prefetchiti'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='psdp-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='sha512'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='sm3'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='sm4'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Cooperlake'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Cooperlake-v1'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Cooperlake-v2'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Denverton'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='mpx'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Denverton-v1'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='mpx'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Denverton-v2'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Denverton-v3'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Dhyana-v2'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='EPYC-Genoa'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='amd-psfd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='auto-ibrs'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='stibp-always-on'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='EPYC-Genoa-v1'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='amd-psfd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='auto-ibrs'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='stibp-always-on'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='EPYC-Genoa-v2'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='amd-psfd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='auto-ibrs'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fs-gs-base-ns'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='perfmon-v2'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='stibp-always-on'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='EPYC-Milan'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='EPYC-Milan-v1'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='EPYC-Milan-v2'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='amd-psfd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='stibp-always-on'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='EPYC-Milan-v3'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='amd-psfd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='stibp-always-on'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='EPYC-Rome'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='EPYC-Rome-v1'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='EPYC-Rome-v2'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='EPYC-Rome-v3'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='EPYC-Turin'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='amd-psfd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='auto-ibrs'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-vp2intersect'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fs-gs-base-ns'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='ibpb-brtype'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='perfmon-v2'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='prefetchi'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='sbpb'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='srso-user-kernel-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='stibp-always-on'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='EPYC-Turin-v1'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='amd-psfd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='auto-ibrs'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-vp2intersect'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fs-gs-base-ns'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='ibpb-brtype'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='perfmon-v2'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='prefetchi'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='sbpb'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='srso-user-kernel-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='stibp-always-on'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='EPYC-v3'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='EPYC-v4'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='EPYC-v5'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588920 python3.9[226602]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='GraniteRapids'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='amx-bf16'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='amx-fp16'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='amx-int8'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='amx-tile'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-fp16'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fbsdp-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fsrc'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fzrm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='mcdt-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pbrsb-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='prefetchiti'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='psdp-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xfd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='GraniteRapids-v1'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='amx-bf16'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='amx-fp16'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='amx-int8'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='amx-tile'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-fp16'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fbsdp-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fsrc'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fzrm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='mcdt-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pbrsb-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='prefetchiti'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='psdp-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xfd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='GraniteRapids-v2'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='amx-bf16'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='amx-fp16'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='amx-int8'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='amx-tile'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx10'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx10-128'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx10-256'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx10-512'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-fp16'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='cldemote'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fbsdp-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fsrc'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fzrm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='mcdt-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pbrsb-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='prefetchiti'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='psdp-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xfd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='GraniteRapids-v3'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='amx-bf16'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='amx-fp16'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='amx-int8'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='amx-tile'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx10'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx10-128'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx10-256'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx10-512'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-fp16'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='cldemote'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fbsdp-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fsrc'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fzrm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='mcdt-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pbrsb-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='prefetchiti'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='psdp-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xfd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Haswell'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Haswell-IBRS'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Haswell-noTSX'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Haswell-v1'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Haswell-v2'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Haswell-v3'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Haswell-v4'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Icelake-Server'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Icelake-Server-noTSX'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Icelake-Server-v1'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Icelake-Server-v2'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Icelake-Server-v3'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Icelake-Server-v4'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Icelake-Server-v5'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Icelake-Server-v6'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Icelake-Server-v7'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='IvyBridge'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='IvyBridge-IBRS'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='IvyBridge-v1'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='IvyBridge-v2'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='KnightsMill'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-4fmaps'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-4vnniw'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512er'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512pf'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='KnightsMill-v1'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-4fmaps'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-4vnniw'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512er'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512pf'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Opteron_G4'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fma4'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xop'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Opteron_G4-v1'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fma4'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xop'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Opteron_G5'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fma4'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='tbm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xop'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Opteron_G5-v1'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fma4'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='tbm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xop'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='SapphireRapids'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='amx-bf16'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='amx-int8'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='amx-tile'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-fp16'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fsrc'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fzrm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xfd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='SapphireRapids-v1'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='amx-bf16'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='amx-int8'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='amx-tile'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-fp16'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fsrc'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fzrm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xfd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='SapphireRapids-v2'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='amx-bf16'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='amx-int8'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='amx-tile'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-fp16'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fbsdp-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fsrc'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fzrm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='psdp-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xfd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='SapphireRapids-v3'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='amx-bf16'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='amx-int8'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='amx-tile'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-fp16'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='cldemote'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fbsdp-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fsrc'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fzrm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='psdp-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xfd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='SapphireRapids-v4'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='amx-bf16'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='amx-int8'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='amx-tile'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-fp16'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='cldemote'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fbsdp-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fsrc'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fzrm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='psdp-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xfd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='SierraForest'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx-ifma'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='cmpccxadd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fbsdp-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='mcdt-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pbrsb-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='psdp-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='SierraForest-v1'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx-ifma'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='cmpccxadd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fbsdp-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='mcdt-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pbrsb-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='psdp-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='SierraForest-v2'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx-ifma'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='cldemote'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='cmpccxadd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fbsdp-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='intel-psfd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='lam'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='mcdt-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pbrsb-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='psdp-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='SierraForest-v3'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx-ifma'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='cldemote'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='cmpccxadd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fbsdp-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='intel-psfd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='lam'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='mcdt-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pbrsb-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='psdp-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Client'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Client-IBRS'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Client-v1'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Client-v2'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Client-v3'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Client-v4'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Server'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Server-IBRS'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Server-v1'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Server-v2'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Server-v3'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Server-v4'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Server-v5'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Snowridge'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='cldemote'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='core-capability'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='mpx'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='split-lock-detect'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Snowridge-v1'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='cldemote'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='core-capability'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='mpx'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='split-lock-detect'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Snowridge-v2'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='cldemote'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='core-capability'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='split-lock-detect'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Snowridge-v3'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='cldemote'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='core-capability'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='split-lock-detect'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Snowridge-v4'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='cldemote'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='athlon'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='3dnow'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='3dnowext'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='athlon-v1'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='3dnow'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='3dnowext'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='core2duo'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='core2duo-v1'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='coreduo'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='coreduo-v1'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='n270'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='n270-v1'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='phenom'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='3dnow'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='3dnowext'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='phenom-v1'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='3dnow'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='3dnowext'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    </mode>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:  </cpu>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:  <memoryBacking supported='yes'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <enum name='sourceType'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <value>file</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <value>anonymous</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <value>memfd</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    </enum>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:  </memoryBacking>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:  <devices>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <disk supported='yes'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <enum name='diskDevice'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>disk</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>cdrom</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>floppy</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>lun</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <enum name='bus'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>fdc</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>scsi</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>virtio</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>usb</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>sata</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <enum name='model'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>virtio</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>virtio-transitional</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>virtio-non-transitional</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    </disk>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <graphics supported='yes'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <enum name='type'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>vnc</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>egl-headless</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>dbus</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    </graphics>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <video supported='yes'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <enum name='modelType'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>vga</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>cirrus</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>virtio</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>none</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>bochs</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>ramfb</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    </video>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <hostdev supported='yes'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <enum name='mode'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>subsystem</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <enum name='startupPolicy'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>default</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>mandatory</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>requisite</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>optional</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <enum name='subsysType'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>usb</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>pci</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>scsi</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <enum name='capsType'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <enum name='pciBackend'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    </hostdev>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <rng supported='yes'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <enum name='model'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>virtio</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>virtio-transitional</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>virtio-non-transitional</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <enum name='backendModel'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>random</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>egd</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>builtin</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    </rng>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <filesystem supported='yes'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <enum name='driverType'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>path</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>handle</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>virtiofs</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    </filesystem>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <tpm supported='yes'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <enum name='model'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>tpm-tis</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>tpm-crb</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <enum name='backendModel'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>emulator</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>external</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <enum name='backendVersion'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>2.0</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    </tpm>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <redirdev supported='yes'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <enum name='bus'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>usb</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    </redirdev>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <channel supported='yes'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <enum name='type'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>pty</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>unix</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    </channel>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <crypto supported='yes'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <enum name='model'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <enum name='type'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>qemu</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <enum name='backendModel'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>builtin</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    </crypto>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <interface supported='yes'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <enum name='backendType'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>default</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>passt</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    </interface>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <panic supported='yes'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <enum name='model'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>isa</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>hyperv</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    </panic>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <console supported='yes'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <enum name='type'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>null</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>vc</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>pty</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>dev</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>file</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>pipe</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>stdio</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>udp</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>tcp</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>unix</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>qemu-vdagent</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>dbus</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    </console>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:  </devices>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:  <features>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <gic supported='no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <vmcoreinfo supported='yes'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <genid supported='yes'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <backingStoreInput supported='yes'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <backup supported='yes'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <async-teardown supported='yes'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <s390-pv supported='no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <ps2 supported='yes'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <tdx supported='no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <sev supported='no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <sgx supported='no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <hyperv supported='yes'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <enum name='features'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>relaxed</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>vapic</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>spinlocks</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>vpindex</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>runtime</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>synic</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>stimer</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>reset</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>vendor_id</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>frequencies</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>reenlightenment</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>tlbflush</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>ipi</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>avic</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>emsr_bitmap</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>xmm_input</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <defaults>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <spinlocks>4095</spinlocks>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <stimer_direct>on</stimer_direct>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <tlbflush_direct>on</tlbflush_direct>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <tlbflush_extended>on</tlbflush_extended>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </defaults>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    </hyperv>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <launchSecurity supported='no'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:  </features>
Jan 20 09:18:00 np0005588920 nova_compute[225924]: </domainCapabilities>
Jan 20 09:18:00 np0005588920 nova_compute[225924]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 20 09:18:00 np0005588920 nova_compute[225924]: 2026-01-20 14:18:00.860 225928 DEBUG nova.virt.libvirt.host [None req-03ea53aa-58ec-4710-a186-99b9f32b78d2 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 20 09:18:00 np0005588920 nova_compute[225924]: <domainCapabilities>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:  <path>/usr/libexec/qemu-kvm</path>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:  <domain>kvm</domain>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:  <arch>i686</arch>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:  <vcpu max='240'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:  <iothreads supported='yes'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:  <os supported='yes'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <enum name='firmware'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <loader supported='yes'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <enum name='type'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>rom</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>pflash</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <enum name='readonly'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>yes</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>no</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <enum name='secure'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>no</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    </loader>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:  </os>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:  <cpu>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <mode name='host-passthrough' supported='yes'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <enum name='hostPassthroughMigratable'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>on</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>off</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    </mode>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <mode name='maximum' supported='yes'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <enum name='maximumMigratable'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>on</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <value>off</value>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    </mode>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <mode name='host-model' supported='yes'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <vendor>AMD</vendor>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature policy='require' name='x2apic'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature policy='require' name='tsc-deadline'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature policy='require' name='hypervisor'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature policy='require' name='tsc_adjust'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature policy='require' name='spec-ctrl'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature policy='require' name='stibp'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature policy='require' name='ssbd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature policy='require' name='cmp_legacy'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature policy='require' name='overflow-recov'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature policy='require' name='succor'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature policy='require' name='ibrs'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature policy='require' name='amd-ssbd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature policy='require' name='virt-ssbd'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature policy='require' name='lbrv'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature policy='require' name='tsc-scale'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature policy='require' name='vmcb-clean'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature policy='require' name='flushbyasid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature policy='require' name='pause-filter'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature policy='require' name='pfthreshold'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature policy='require' name='svme-addr-chk'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <feature policy='disable' name='xsaves'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    </mode>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:    <mode name='custom' supported='yes'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Broadwell'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Broadwell-IBRS'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Broadwell-noTSX'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Broadwell-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Broadwell-v2'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Broadwell-v3'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Broadwell-v4'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Cascadelake-Server'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Cascadelake-Server-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Cascadelake-Server-v2'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Cascadelake-Server-v3'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Cascadelake-Server-v4'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Cascadelake-Server-v5'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='ClearwaterForest'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni-int16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bhi-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cmpccxadd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ddpd-u'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='intel-psfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='lam'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='prefetchiti'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sha512'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sm3'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sm4'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='ClearwaterForest-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni-int16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bhi-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cmpccxadd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ddpd-u'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='intel-psfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='lam'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='prefetchiti'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sha512'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sm3'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sm4'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Cooperlake'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Cooperlake-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Cooperlake-v2'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Denverton'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='mpx'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Denverton-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='mpx'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Denverton-v2'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Denverton-v3'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Dhyana-v2'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='EPYC-Genoa'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amd-psfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='auto-ibrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='stibp-always-on'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='EPYC-Genoa-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amd-psfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='auto-ibrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='stibp-always-on'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='EPYC-Genoa-v2'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amd-psfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='auto-ibrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fs-gs-base-ns'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='perfmon-v2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='stibp-always-on'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='EPYC-Milan'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='EPYC-Milan-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='EPYC-Milan-v2'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amd-psfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='stibp-always-on'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='EPYC-Milan-v3'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amd-psfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='stibp-always-on'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='EPYC-Rome'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='EPYC-Rome-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='EPYC-Rome-v2'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='EPYC-Rome-v3'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='EPYC-Turin'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amd-psfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='auto-ibrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vp2intersect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fs-gs-base-ns'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibpb-brtype'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='perfmon-v2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='prefetchi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sbpb'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='srso-user-kernel-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='stibp-always-on'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='EPYC-Turin-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amd-psfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='auto-ibrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vp2intersect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fs-gs-base-ns'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibpb-brtype'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='perfmon-v2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='prefetchi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sbpb'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='srso-user-kernel-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='stibp-always-on'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='EPYC-v3'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='EPYC-v4'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='EPYC-v5'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='GraniteRapids'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-fp16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='prefetchiti'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='GraniteRapids-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-fp16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='prefetchiti'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='GraniteRapids-v2'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-fp16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx10'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx10-128'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx10-256'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx10-512'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='prefetchiti'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='GraniteRapids-v3'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-fp16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx10'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx10-128'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx10-256'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx10-512'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='prefetchiti'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Haswell'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Haswell-IBRS'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Haswell-noTSX'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Haswell-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Haswell-v2'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Haswell-v3'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Haswell-v4'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Icelake-Server'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Icelake-Server-noTSX'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Icelake-Server-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Icelake-Server-v2'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Icelake-Server-v3'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Icelake-Server-v4'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Icelake-Server-v5'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Icelake-Server-v6'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Icelake-Server-v7'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='IvyBridge'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='IvyBridge-IBRS'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='IvyBridge-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='IvyBridge-v2'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='KnightsMill'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-4fmaps'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-4vnniw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512er'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512pf'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='KnightsMill-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-4fmaps'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-4vnniw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512er'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512pf'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Opteron_G4'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fma4'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xop'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Opteron_G4-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fma4'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xop'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Opteron_G5'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fma4'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='tbm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xop'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Opteron_G5-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fma4'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='tbm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xop'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='SapphireRapids'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='SapphireRapids-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='SapphireRapids-v2'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='SapphireRapids-v3'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='SapphireRapids-v4'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='SierraForest'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cmpccxadd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='SierraForest-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cmpccxadd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='SierraForest-v2'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cmpccxadd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='intel-psfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='lam'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='SierraForest-v3'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cmpccxadd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='intel-psfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='lam'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Client'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Client-IBRS'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Client-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Client-v2'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Client-v3'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Client-v4'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Server'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Server-IBRS'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Server-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Server-v2'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Server-v3'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Server-v4'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Server-v5'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Snowridge'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='core-capability'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='mpx'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='split-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Snowridge-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='core-capability'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='mpx'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='split-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Snowridge-v2'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='core-capability'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='split-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Snowridge-v3'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='core-capability'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='split-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Snowridge-v4'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='athlon'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='3dnow'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='3dnowext'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='athlon-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='3dnow'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='3dnowext'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='core2duo'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='core2duo-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='coreduo'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='coreduo-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='n270'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='n270-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='phenom'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='3dnow'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='3dnowext'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='phenom-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='3dnow'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='3dnowext'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </mode>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:  </cpu>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:  <memoryBacking supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <enum name='sourceType'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <value>file</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <value>anonymous</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <value>memfd</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:  </memoryBacking>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:  <devices>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <disk supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='diskDevice'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>disk</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>cdrom</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>floppy</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>lun</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='bus'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>ide</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>fdc</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>scsi</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>virtio</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>usb</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>sata</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='model'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>virtio</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>virtio-transitional</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>virtio-non-transitional</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </disk>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <graphics supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='type'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>vnc</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>egl-headless</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>dbus</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </graphics>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <video supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='modelType'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>vga</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>cirrus</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>virtio</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>none</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>bochs</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>ramfb</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </video>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <hostdev supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='mode'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>subsystem</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='startupPolicy'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>default</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>mandatory</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>requisite</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>optional</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='subsysType'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>usb</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>pci</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>scsi</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='capsType'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='pciBackend'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </hostdev>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <rng supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='model'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>virtio</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>virtio-transitional</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>virtio-non-transitional</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='backendModel'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>random</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>egd</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>builtin</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </rng>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <filesystem supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='driverType'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>path</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>handle</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>virtiofs</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </filesystem>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <tpm supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='model'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>tpm-tis</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>tpm-crb</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='backendModel'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>emulator</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>external</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='backendVersion'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>2.0</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </tpm>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <redirdev supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='bus'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>usb</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </redirdev>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <channel supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='type'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>pty</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>unix</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </channel>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <crypto supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='model'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='type'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>qemu</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='backendModel'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>builtin</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </crypto>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <interface supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='backendType'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>default</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>passt</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </interface>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <panic supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='model'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>isa</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>hyperv</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </panic>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <console supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='type'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>null</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>vc</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>pty</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>dev</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>file</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>pipe</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>stdio</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>udp</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>tcp</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>unix</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>qemu-vdagent</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>dbus</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </console>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:  </devices>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:  <features>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <gic supported='no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <vmcoreinfo supported='yes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <genid supported='yes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <backingStoreInput supported='yes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <backup supported='yes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <async-teardown supported='yes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <s390-pv supported='no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <ps2 supported='yes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <tdx supported='no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <sev supported='no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <sgx supported='no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <hyperv supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='features'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>relaxed</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>vapic</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>spinlocks</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>vpindex</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>runtime</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>synic</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>stimer</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>reset</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>vendor_id</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>frequencies</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>reenlightenment</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>tlbflush</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>ipi</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>avic</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>emsr_bitmap</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>xmm_input</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <defaults>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <spinlocks>4095</spinlocks>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <stimer_direct>on</stimer_direct>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <tlbflush_direct>on</tlbflush_direct>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <tlbflush_extended>on</tlbflush_extended>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </defaults>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </hyperv>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <launchSecurity supported='no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:  </features>
Jan 20 09:18:01 np0005588920 nova_compute[225924]: </domainCapabilities>
Jan 20 09:18:01 np0005588920 nova_compute[225924]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 20 09:18:01 np0005588920 nova_compute[225924]: 2026-01-20 14:18:00.981 225928 DEBUG nova.virt.libvirt.host [None req-03ea53aa-58ec-4710-a186-99b9f32b78d2 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 20 09:18:01 np0005588920 nova_compute[225924]: 2026-01-20 14:18:00.987 225928 DEBUG nova.virt.libvirt.host [None req-03ea53aa-58ec-4710-a186-99b9f32b78d2 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jan 20 09:18:01 np0005588920 nova_compute[225924]: <domainCapabilities>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:  <path>/usr/libexec/qemu-kvm</path>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:  <domain>kvm</domain>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:  <arch>x86_64</arch>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:  <vcpu max='240'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:  <iothreads supported='yes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:  <os supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <enum name='firmware'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <loader supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='type'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>rom</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>pflash</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='readonly'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>yes</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>no</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='secure'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>no</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </loader>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:  </os>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:  <cpu>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <mode name='host-passthrough' supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='hostPassthroughMigratable'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>on</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>off</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </mode>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <mode name='maximum' supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='maximumMigratable'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>on</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>off</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </mode>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <mode name='host-model' supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <vendor>AMD</vendor>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <feature policy='require' name='x2apic'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <feature policy='require' name='tsc-deadline'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <feature policy='require' name='hypervisor'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <feature policy='require' name='tsc_adjust'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <feature policy='require' name='spec-ctrl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <feature policy='require' name='stibp'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <feature policy='require' name='ssbd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <feature policy='require' name='cmp_legacy'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <feature policy='require' name='overflow-recov'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <feature policy='require' name='succor'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <feature policy='require' name='ibrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <feature policy='require' name='amd-ssbd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <feature policy='require' name='virt-ssbd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <feature policy='require' name='lbrv'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <feature policy='require' name='tsc-scale'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <feature policy='require' name='vmcb-clean'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <feature policy='require' name='flushbyasid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <feature policy='require' name='pause-filter'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <feature policy='require' name='pfthreshold'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <feature policy='require' name='svme-addr-chk'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <feature policy='disable' name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </mode>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <mode name='custom' supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Broadwell'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Broadwell-IBRS'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Broadwell-noTSX'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Broadwell-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Broadwell-v2'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Broadwell-v3'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Broadwell-v4'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Cascadelake-Server'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Cascadelake-Server-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Cascadelake-Server-v2'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Cascadelake-Server-v3'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Cascadelake-Server-v4'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Cascadelake-Server-v5'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='ClearwaterForest'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni-int16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bhi-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cmpccxadd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ddpd-u'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='intel-psfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='lam'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='prefetchiti'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sha512'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sm3'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sm4'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='ClearwaterForest-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni-int16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bhi-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cmpccxadd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ddpd-u'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='intel-psfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='lam'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='prefetchiti'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sha512'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sm3'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sm4'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Cooperlake'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Cooperlake-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Cooperlake-v2'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Denverton'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='mpx'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Denverton-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='mpx'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Denverton-v2'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Denverton-v3'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Dhyana-v2'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='EPYC-Genoa'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amd-psfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='auto-ibrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='stibp-always-on'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='EPYC-Genoa-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amd-psfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='auto-ibrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='stibp-always-on'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='EPYC-Genoa-v2'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amd-psfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='auto-ibrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fs-gs-base-ns'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='perfmon-v2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='stibp-always-on'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='EPYC-Milan'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='EPYC-Milan-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='EPYC-Milan-v2'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amd-psfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='stibp-always-on'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='EPYC-Milan-v3'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amd-psfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='stibp-always-on'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='EPYC-Rome'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='EPYC-Rome-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='EPYC-Rome-v2'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='EPYC-Rome-v3'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='EPYC-Turin'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amd-psfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='auto-ibrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vp2intersect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fs-gs-base-ns'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibpb-brtype'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='perfmon-v2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='prefetchi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sbpb'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='srso-user-kernel-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='stibp-always-on'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='EPYC-Turin-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amd-psfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='auto-ibrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vp2intersect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fs-gs-base-ns'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibpb-brtype'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='perfmon-v2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='prefetchi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sbpb'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='srso-user-kernel-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='stibp-always-on'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='EPYC-v3'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='EPYC-v4'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='EPYC-v5'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='GraniteRapids'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-fp16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='prefetchiti'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='GraniteRapids-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-fp16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='prefetchiti'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='GraniteRapids-v2'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-fp16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx10'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx10-128'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx10-256'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx10-512'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='prefetchiti'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='GraniteRapids-v3'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-fp16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx10'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx10-128'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx10-256'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx10-512'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='prefetchiti'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Haswell'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Haswell-IBRS'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Haswell-noTSX'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Haswell-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Haswell-v2'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Haswell-v3'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Haswell-v4'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Icelake-Server'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Icelake-Server-noTSX'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Icelake-Server-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Icelake-Server-v2'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Icelake-Server-v3'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Icelake-Server-v4'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Icelake-Server-v5'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Icelake-Server-v6'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Icelake-Server-v7'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='IvyBridge'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='IvyBridge-IBRS'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='IvyBridge-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='IvyBridge-v2'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='KnightsMill'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-4fmaps'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-4vnniw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512er'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512pf'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='KnightsMill-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-4fmaps'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-4vnniw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512er'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512pf'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Opteron_G4'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fma4'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xop'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Opteron_G4-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fma4'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xop'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Opteron_G5'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fma4'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='tbm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xop'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Opteron_G5-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fma4'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='tbm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xop'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='SapphireRapids'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='SapphireRapids-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='SapphireRapids-v2'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='SapphireRapids-v3'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='SapphireRapids-v4'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='SierraForest'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cmpccxadd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='SierraForest-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cmpccxadd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='SierraForest-v2'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cmpccxadd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='intel-psfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='lam'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='SierraForest-v3'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cmpccxadd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='intel-psfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='lam'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Client'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Client-IBRS'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Client-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Client-v2'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Client-v3'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Client-v4'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Server'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Server-IBRS'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Server-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Server-v2'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Server-v3'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Server-v4'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Server-v5'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Snowridge'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='core-capability'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='mpx'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='split-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Snowridge-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='core-capability'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='mpx'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='split-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Snowridge-v2'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='core-capability'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='split-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Snowridge-v3'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='core-capability'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='split-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Snowridge-v4'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='athlon'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='3dnow'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='3dnowext'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='athlon-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='3dnow'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='3dnowext'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='core2duo'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='core2duo-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='coreduo'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='coreduo-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='n270'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='n270-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='phenom'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='3dnow'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='3dnowext'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='phenom-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='3dnow'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='3dnowext'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </mode>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:  </cpu>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:  <memoryBacking supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <enum name='sourceType'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <value>file</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <value>anonymous</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <value>memfd</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:  </memoryBacking>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:  <devices>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <disk supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='diskDevice'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>disk</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>cdrom</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>floppy</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>lun</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='bus'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>ide</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>fdc</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>scsi</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>virtio</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>usb</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>sata</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='model'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>virtio</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>virtio-transitional</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>virtio-non-transitional</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </disk>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <graphics supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='type'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>vnc</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>egl-headless</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>dbus</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </graphics>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <video supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='modelType'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>vga</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>cirrus</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>virtio</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>none</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>bochs</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>ramfb</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </video>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <hostdev supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='mode'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>subsystem</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='startupPolicy'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>default</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>mandatory</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>requisite</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>optional</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='subsysType'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>usb</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>pci</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>scsi</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='capsType'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='pciBackend'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </hostdev>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <rng supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='model'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>virtio</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>virtio-transitional</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>virtio-non-transitional</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='backendModel'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>random</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>egd</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>builtin</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </rng>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <filesystem supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='driverType'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>path</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>handle</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>virtiofs</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </filesystem>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <tpm supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='model'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>tpm-tis</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>tpm-crb</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='backendModel'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>emulator</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>external</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='backendVersion'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>2.0</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </tpm>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <redirdev supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='bus'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>usb</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </redirdev>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <channel supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='type'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>pty</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>unix</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </channel>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <crypto supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='model'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='type'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>qemu</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='backendModel'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>builtin</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </crypto>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <interface supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='backendType'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>default</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>passt</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </interface>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <panic supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='model'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>isa</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>hyperv</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </panic>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <console supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='type'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>null</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>vc</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>pty</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>dev</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>file</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>pipe</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>stdio</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>udp</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>tcp</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>unix</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>qemu-vdagent</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>dbus</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </console>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:  </devices>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:  <features>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <gic supported='no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <vmcoreinfo supported='yes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <genid supported='yes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <backingStoreInput supported='yes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <backup supported='yes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <async-teardown supported='yes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <s390-pv supported='no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <ps2 supported='yes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <tdx supported='no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <sev supported='no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <sgx supported='no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <hyperv supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='features'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>relaxed</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>vapic</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>spinlocks</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>vpindex</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>runtime</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>synic</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>stimer</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>reset</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>vendor_id</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>frequencies</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>reenlightenment</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>tlbflush</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>ipi</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>avic</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>emsr_bitmap</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>xmm_input</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <defaults>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <spinlocks>4095</spinlocks>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <stimer_direct>on</stimer_direct>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <tlbflush_direct>on</tlbflush_direct>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <tlbflush_extended>on</tlbflush_extended>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </defaults>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </hyperv>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <launchSecurity supported='no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:  </features>
Jan 20 09:18:01 np0005588920 nova_compute[225924]: </domainCapabilities>
Jan 20 09:18:01 np0005588920 nova_compute[225924]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 20 09:18:01 np0005588920 nova_compute[225924]: 2026-01-20 14:18:01.055 225928 DEBUG nova.virt.libvirt.host [None req-03ea53aa-58ec-4710-a186-99b9f32b78d2 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 20 09:18:01 np0005588920 nova_compute[225924]: <domainCapabilities>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:  <path>/usr/libexec/qemu-kvm</path>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:  <domain>kvm</domain>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:  <arch>x86_64</arch>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:  <vcpu max='4096'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:  <iothreads supported='yes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:  <os supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <enum name='firmware'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <value>efi</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <loader supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='type'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>rom</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>pflash</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='readonly'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>yes</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>no</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='secure'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>yes</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>no</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </loader>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:  </os>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:  <cpu>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <mode name='host-passthrough' supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='hostPassthroughMigratable'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>on</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>off</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </mode>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <mode name='maximum' supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='maximumMigratable'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>on</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>off</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </mode>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <mode name='host-model' supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <vendor>AMD</vendor>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <feature policy='require' name='x2apic'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <feature policy='require' name='tsc-deadline'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <feature policy='require' name='hypervisor'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <feature policy='require' name='tsc_adjust'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <feature policy='require' name='spec-ctrl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <feature policy='require' name='stibp'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <feature policy='require' name='ssbd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <feature policy='require' name='cmp_legacy'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <feature policy='require' name='overflow-recov'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <feature policy='require' name='succor'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <feature policy='require' name='ibrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <feature policy='require' name='amd-ssbd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <feature policy='require' name='virt-ssbd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <feature policy='require' name='lbrv'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <feature policy='require' name='tsc-scale'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <feature policy='require' name='vmcb-clean'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <feature policy='require' name='flushbyasid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <feature policy='require' name='pause-filter'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <feature policy='require' name='pfthreshold'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <feature policy='require' name='svme-addr-chk'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <feature policy='disable' name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </mode>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <mode name='custom' supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Broadwell'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Broadwell-IBRS'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Broadwell-noTSX'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Broadwell-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Broadwell-v2'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Broadwell-v3'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Broadwell-v4'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Cascadelake-Server'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Cascadelake-Server-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Cascadelake-Server-v2'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Cascadelake-Server-v3'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Cascadelake-Server-v4'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Cascadelake-Server-v5'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='ClearwaterForest'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni-int16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bhi-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cmpccxadd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ddpd-u'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='intel-psfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='lam'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='prefetchiti'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sha512'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sm3'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sm4'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='ClearwaterForest-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni-int16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bhi-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cmpccxadd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ddpd-u'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='intel-psfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='lam'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='prefetchiti'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sha512'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sm3'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sm4'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Cooperlake'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Cooperlake-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Cooperlake-v2'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Denverton'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='mpx'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Denverton-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='mpx'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Denverton-v2'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Denverton-v3'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Dhyana-v2'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='EPYC-Genoa'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amd-psfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='auto-ibrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='stibp-always-on'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='EPYC-Genoa-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amd-psfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='auto-ibrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='stibp-always-on'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='EPYC-Genoa-v2'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amd-psfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='auto-ibrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fs-gs-base-ns'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='perfmon-v2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='stibp-always-on'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='EPYC-Milan'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='EPYC-Milan-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='EPYC-Milan-v2'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amd-psfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='stibp-always-on'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='EPYC-Milan-v3'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amd-psfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='stibp-always-on'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='EPYC-Rome'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='EPYC-Rome-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='EPYC-Rome-v2'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='EPYC-Rome-v3'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='EPYC-Turin'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amd-psfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='auto-ibrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vp2intersect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fs-gs-base-ns'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibpb-brtype'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='perfmon-v2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='prefetchi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sbpb'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='srso-user-kernel-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='stibp-always-on'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='EPYC-Turin-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amd-psfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='auto-ibrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vp2intersect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fs-gs-base-ns'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibpb-brtype'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='perfmon-v2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='prefetchi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sbpb'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='srso-user-kernel-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='stibp-always-on'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='EPYC-v3'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='EPYC-v4'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='EPYC-v5'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='GraniteRapids'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-fp16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='prefetchiti'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='GraniteRapids-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-fp16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='prefetchiti'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='GraniteRapids-v2'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-fp16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx10'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx10-128'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx10-256'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx10-512'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='prefetchiti'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='GraniteRapids-v3'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-fp16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx10'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx10-128'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx10-256'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx10-512'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='prefetchiti'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Haswell'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Haswell-IBRS'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Haswell-noTSX'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Haswell-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Haswell-v2'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Haswell-v3'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Haswell-v4'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Icelake-Server'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Icelake-Server-noTSX'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Icelake-Server-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Icelake-Server-v2'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Icelake-Server-v3'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Icelake-Server-v4'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Icelake-Server-v5'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Icelake-Server-v6'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Icelake-Server-v7'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='IvyBridge'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='IvyBridge-IBRS'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='IvyBridge-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='IvyBridge-v2'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='KnightsMill'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-4fmaps'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-4vnniw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512er'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512pf'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='KnightsMill-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-4fmaps'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-4vnniw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512er'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512pf'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Opteron_G4'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fma4'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xop'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Opteron_G4-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fma4'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xop'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Opteron_G5'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fma4'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='tbm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xop'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Opteron_G5-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fma4'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='tbm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xop'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='SapphireRapids'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='SapphireRapids-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='SapphireRapids-v2'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='SapphireRapids-v3'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='SapphireRapids-v4'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='SierraForest'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cmpccxadd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='SierraForest-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cmpccxadd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='SierraForest-v2'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cmpccxadd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='intel-psfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='lam'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='SierraForest-v3'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-ifma'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cmpccxadd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='intel-psfd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='lam'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Client'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Client-IBRS'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Client-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Client-v2'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Client-v3'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Client-v4'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Server'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Server-IBRS'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Server-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Server-v2'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Server-v3'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Server-v4'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Skylake-Server-v5'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Snowridge'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='core-capability'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='mpx'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='split-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Snowridge-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='core-capability'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='mpx'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='split-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Snowridge-v2'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='core-capability'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='split-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Snowridge-v3'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='core-capability'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='split-lock-detect'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='Snowridge-v4'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='athlon'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='3dnow'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='3dnowext'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='athlon-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='3dnow'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='3dnowext'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='core2duo'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='core2duo-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='coreduo'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='coreduo-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='n270'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='n270-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='phenom'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='3dnow'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='3dnowext'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <blockers model='phenom-v1'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='3dnow'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <feature name='3dnowext'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </blockers>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </mode>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:  </cpu>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:  <memoryBacking supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <enum name='sourceType'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <value>file</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <value>anonymous</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <value>memfd</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:  </memoryBacking>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:  <devices>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <disk supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='diskDevice'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>disk</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>cdrom</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>floppy</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>lun</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='bus'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>fdc</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>scsi</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>virtio</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>usb</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>sata</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='model'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>virtio</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>virtio-transitional</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>virtio-non-transitional</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </disk>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <graphics supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='type'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>vnc</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>egl-headless</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>dbus</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </graphics>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <video supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='modelType'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>vga</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>cirrus</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>virtio</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>none</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>bochs</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>ramfb</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </video>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <hostdev supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='mode'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>subsystem</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='startupPolicy'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>default</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>mandatory</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>requisite</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>optional</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='subsysType'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>usb</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>pci</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>scsi</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='capsType'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='pciBackend'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </hostdev>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <rng supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='model'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>virtio</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>virtio-transitional</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>virtio-non-transitional</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='backendModel'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>random</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>egd</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>builtin</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </rng>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <filesystem supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='driverType'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>path</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>handle</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>virtiofs</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </filesystem>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <tpm supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='model'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>tpm-tis</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>tpm-crb</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='backendModel'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>emulator</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>external</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='backendVersion'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>2.0</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </tpm>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <redirdev supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='bus'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>usb</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </redirdev>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <channel supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='type'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>pty</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>unix</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </channel>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <crypto supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='model'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='type'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>qemu</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='backendModel'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>builtin</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </crypto>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <interface supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='backendType'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>default</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>passt</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </interface>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <panic supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='model'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>isa</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>hyperv</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </panic>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <console supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='type'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>null</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>vc</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>pty</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>dev</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>file</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>pipe</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>stdio</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>udp</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>tcp</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>unix</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>qemu-vdagent</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>dbus</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </console>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:  </devices>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:  <features>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <gic supported='no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <vmcoreinfo supported='yes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <genid supported='yes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <backingStoreInput supported='yes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <backup supported='yes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <async-teardown supported='yes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <s390-pv supported='no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <ps2 supported='yes'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <tdx supported='no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <sev supported='no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <sgx supported='no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <hyperv supported='yes'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <enum name='features'>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>relaxed</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>vapic</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>spinlocks</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>vpindex</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>runtime</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>synic</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>stimer</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>reset</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>vendor_id</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>frequencies</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>reenlightenment</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>tlbflush</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>ipi</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>avic</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>emsr_bitmap</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <value>xmm_input</value>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </enum>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      <defaults>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <spinlocks>4095</spinlocks>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <stimer_direct>on</stimer_direct>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <tlbflush_direct>on</tlbflush_direct>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <tlbflush_extended>on</tlbflush_extended>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:      </defaults>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    </hyperv>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:    <launchSecurity supported='no'/>
Jan 20 09:18:01 np0005588920 nova_compute[225924]:  </features>
Jan 20 09:18:01 np0005588920 nova_compute[225924]: </domainCapabilities>
Jan 20 09:18:01 np0005588920 nova_compute[225924]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Jan 20 09:18:01 np0005588920 nova_compute[225924]: 2026-01-20 14:18:01.163 225928 DEBUG nova.virt.libvirt.host [None req-03ea53aa-58ec-4710-a186-99b9f32b78d2 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Jan 20 09:18:01 np0005588920 nova_compute[225924]: 2026-01-20 14:18:01.169 225928 INFO nova.virt.libvirt.host [None req-03ea53aa-58ec-4710-a186-99b9f32b78d2 - - - - - -] Secure Boot support detected#033[00m
Jan 20 09:18:01 np0005588920 nova_compute[225924]: 2026-01-20 14:18:01.171 225928 INFO nova.virt.libvirt.driver [None req-03ea53aa-58ec-4710-a186-99b9f32b78d2 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Jan 20 09:18:01 np0005588920 nova_compute[225924]: 2026-01-20 14:18:01.181 225928 DEBUG nova.virt.libvirt.driver [None req-03ea53aa-58ec-4710-a186-99b9f32b78d2 - - - - - -] cpu compare xml: <cpu match="exact">
Jan 20 09:18:01 np0005588920 nova_compute[225924]:  <model>Nehalem</model>
Jan 20 09:18:01 np0005588920 nova_compute[225924]: </cpu>
Jan 20 09:18:01 np0005588920 nova_compute[225924]: _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019#033[00m
Jan 20 09:18:01 np0005588920 nova_compute[225924]: 2026-01-20 14:18:01.184 225928 DEBUG nova.virt.libvirt.driver [None req-03ea53aa-58ec-4710-a186-99b9f32b78d2 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Jan 20 09:18:01 np0005588920 nova_compute[225924]: 2026-01-20 14:18:01.310 225928 INFO nova.virt.node [None req-03ea53aa-58ec-4710-a186-99b9f32b78d2 - - - - - -] Determined node identity ff38e91c-3320-4831-90ac-bcffc89ba7b6 from /var/lib/nova/compute_id#033[00m
Jan 20 09:18:01 np0005588920 nova_compute[225924]: 2026-01-20 14:18:01.342 225928 WARNING nova.compute.manager [None req-03ea53aa-58ec-4710-a186-99b9f32b78d2 - - - - - -] Compute nodes ['ff38e91c-3320-4831-90ac-bcffc89ba7b6'] for host compute-2.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.#033[00m
Jan 20 09:18:01 np0005588920 nova_compute[225924]: 2026-01-20 14:18:01.405 225928 INFO nova.compute.manager [None req-03ea53aa-58ec-4710-a186-99b9f32b78d2 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Jan 20 09:18:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:18:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:01.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:18:01 np0005588920 nova_compute[225924]: 2026-01-20 14:18:01.482 225928 WARNING nova.compute.manager [None req-03ea53aa-58ec-4710-a186-99b9f32b78d2 - - - - - -] No compute node record found for host compute-2.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.#033[00m
Jan 20 09:18:01 np0005588920 nova_compute[225924]: 2026-01-20 14:18:01.483 225928 DEBUG oslo_concurrency.lockutils [None req-03ea53aa-58ec-4710-a186-99b9f32b78d2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:18:01 np0005588920 nova_compute[225924]: 2026-01-20 14:18:01.483 225928 DEBUG oslo_concurrency.lockutils [None req-03ea53aa-58ec-4710-a186-99b9f32b78d2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:18:01 np0005588920 nova_compute[225924]: 2026-01-20 14:18:01.484 225928 DEBUG oslo_concurrency.lockutils [None req-03ea53aa-58ec-4710-a186-99b9f32b78d2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:18:01 np0005588920 nova_compute[225924]: 2026-01-20 14:18:01.485 225928 DEBUG nova.compute.resource_tracker [None req-03ea53aa-58ec-4710-a186-99b9f32b78d2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:18:01 np0005588920 nova_compute[225924]: 2026-01-20 14:18:01.485 225928 DEBUG oslo_concurrency.processutils [None req-03ea53aa-58ec-4710-a186-99b9f32b78d2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:18:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:01.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:01 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:18:01 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3870360195' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:18:01 np0005588920 nova_compute[225924]: 2026-01-20 14:18:01.932 225928 DEBUG oslo_concurrency.processutils [None req-03ea53aa-58ec-4710-a186-99b9f32b78d2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:18:01 np0005588920 systemd[1]: Starting libvirt nodedev daemon...
Jan 20 09:18:01 np0005588920 systemd[1]: Started libvirt nodedev daemon.
Jan 20 09:18:02 np0005588920 python3.9[226800]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 20 09:18:02 np0005588920 systemd[1]: Stopping nova_compute container...
Jan 20 09:18:02 np0005588920 nova_compute[225924]: 2026-01-20 14:18:02.212 225928 DEBUG oslo_concurrency.lockutils [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:18:02 np0005588920 nova_compute[225924]: 2026-01-20 14:18:02.213 225928 DEBUG oslo_concurrency.lockutils [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:18:02 np0005588920 nova_compute[225924]: 2026-01-20 14:18:02.213 225928 DEBUG oslo_concurrency.lockutils [None req-f602176d-ab1b-4bbb-b39c-fb4897562406 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:18:02 np0005588920 virtqemud[226436]: libvirt version: 11.10.0, package: 2.el9 (builder@centos.org, 2025-12-18-15:09:54, )
Jan 20 09:18:02 np0005588920 virtqemud[226436]: hostname: compute-2
Jan 20 09:18:02 np0005588920 virtqemud[226436]: End of file while reading data: Input/output error
Jan 20 09:18:02 np0005588920 systemd[1]: libpod-1a5c0888e05931f5eb52626a06d30a5cc7a8d117be43fa00a7149c8a3563bf3d.scope: Deactivated successfully.
Jan 20 09:18:02 np0005588920 systemd[1]: libpod-1a5c0888e05931f5eb52626a06d30a5cc7a8d117be43fa00a7149c8a3563bf3d.scope: Consumed 3.645s CPU time.
Jan 20 09:18:02 np0005588920 podman[226829]: 2026-01-20 14:18:02.622969538 +0000 UTC m=+0.456268805 container died 1a5c0888e05931f5eb52626a06d30a5cc7a8d117be43fa00a7149c8a3563bf3d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=edpm, container_name=nova_compute, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:18:02 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1a5c0888e05931f5eb52626a06d30a5cc7a8d117be43fa00a7149c8a3563bf3d-userdata-shm.mount: Deactivated successfully.
Jan 20 09:18:02 np0005588920 systemd[1]: var-lib-containers-storage-overlay-cde3db8b5d4ce0f158862cd6e774e80423d7590fe6fdc1c3663798e14454dfa0-merged.mount: Deactivated successfully.
Jan 20 09:18:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:03.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:03.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:04 np0005588920 podman[226829]: 2026-01-20 14:18:04.374958806 +0000 UTC m=+2.208258083 container cleanup 1a5c0888e05931f5eb52626a06d30a5cc7a8d117be43fa00a7149c8a3563bf3d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, container_name=nova_compute, org.label-schema.schema-version=1.0)
Jan 20 09:18:04 np0005588920 podman[226829]: nova_compute
Jan 20 09:18:04 np0005588920 podman[226857]: nova_compute
Jan 20 09:18:04 np0005588920 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Jan 20 09:18:04 np0005588920 systemd[1]: Stopped nova_compute container.
Jan 20 09:18:04 np0005588920 systemd[1]: Starting nova_compute container...
Jan 20 09:18:04 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:18:04 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:18:04 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cde3db8b5d4ce0f158862cd6e774e80423d7590fe6fdc1c3663798e14454dfa0/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 20 09:18:04 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cde3db8b5d4ce0f158862cd6e774e80423d7590fe6fdc1c3663798e14454dfa0/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 20 09:18:04 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cde3db8b5d4ce0f158862cd6e774e80423d7590fe6fdc1c3663798e14454dfa0/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 20 09:18:04 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cde3db8b5d4ce0f158862cd6e774e80423d7590fe6fdc1c3663798e14454dfa0/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 20 09:18:04 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cde3db8b5d4ce0f158862cd6e774e80423d7590fe6fdc1c3663798e14454dfa0/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 20 09:18:04 np0005588920 podman[226870]: 2026-01-20 14:18:04.621300754 +0000 UTC m=+0.110182919 container init 1a5c0888e05931f5eb52626a06d30a5cc7a8d117be43fa00a7149c8a3563bf3d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 20 09:18:04 np0005588920 podman[226870]: 2026-01-20 14:18:04.633325877 +0000 UTC m=+0.122208012 container start 1a5c0888e05931f5eb52626a06d30a5cc7a8d117be43fa00a7149c8a3563bf3d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=edpm, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3)
Jan 20 09:18:04 np0005588920 podman[226870]: nova_compute
Jan 20 09:18:04 np0005588920 nova_compute[226886]: + sudo -E kolla_set_configs
Jan 20 09:18:04 np0005588920 systemd[1]: Started nova_compute container.
Jan 20 09:18:04 np0005588920 nova_compute[226886]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 20 09:18:04 np0005588920 nova_compute[226886]: INFO:__main__:Validating config file
Jan 20 09:18:04 np0005588920 nova_compute[226886]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 20 09:18:04 np0005588920 nova_compute[226886]: INFO:__main__:Copying service configuration files
Jan 20 09:18:04 np0005588920 nova_compute[226886]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 20 09:18:04 np0005588920 nova_compute[226886]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 20 09:18:04 np0005588920 nova_compute[226886]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 20 09:18:04 np0005588920 nova_compute[226886]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Jan 20 09:18:04 np0005588920 nova_compute[226886]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 20 09:18:04 np0005588920 nova_compute[226886]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 20 09:18:04 np0005588920 nova_compute[226886]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 20 09:18:04 np0005588920 nova_compute[226886]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 20 09:18:04 np0005588920 nova_compute[226886]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 20 09:18:04 np0005588920 nova_compute[226886]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 20 09:18:04 np0005588920 nova_compute[226886]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 20 09:18:04 np0005588920 nova_compute[226886]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 20 09:18:04 np0005588920 nova_compute[226886]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Jan 20 09:18:04 np0005588920 nova_compute[226886]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 20 09:18:04 np0005588920 nova_compute[226886]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 20 09:18:04 np0005588920 nova_compute[226886]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 20 09:18:04 np0005588920 nova_compute[226886]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 20 09:18:04 np0005588920 nova_compute[226886]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 20 09:18:04 np0005588920 nova_compute[226886]: INFO:__main__:Deleting /etc/ceph
Jan 20 09:18:04 np0005588920 nova_compute[226886]: INFO:__main__:Creating directory /etc/ceph
Jan 20 09:18:04 np0005588920 nova_compute[226886]: INFO:__main__:Setting permission for /etc/ceph
Jan 20 09:18:04 np0005588920 nova_compute[226886]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Jan 20 09:18:04 np0005588920 nova_compute[226886]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 20 09:18:04 np0005588920 nova_compute[226886]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Jan 20 09:18:04 np0005588920 nova_compute[226886]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 20 09:18:04 np0005588920 nova_compute[226886]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Jan 20 09:18:04 np0005588920 nova_compute[226886]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 20 09:18:04 np0005588920 nova_compute[226886]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 20 09:18:04 np0005588920 nova_compute[226886]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Jan 20 09:18:04 np0005588920 nova_compute[226886]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 20 09:18:04 np0005588920 nova_compute[226886]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 20 09:18:04 np0005588920 nova_compute[226886]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 20 09:18:04 np0005588920 nova_compute[226886]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 20 09:18:04 np0005588920 nova_compute[226886]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 20 09:18:04 np0005588920 nova_compute[226886]: INFO:__main__:Writing out command to execute
Jan 20 09:18:04 np0005588920 nova_compute[226886]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 20 09:18:04 np0005588920 nova_compute[226886]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 20 09:18:04 np0005588920 nova_compute[226886]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 20 09:18:04 np0005588920 nova_compute[226886]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 20 09:18:04 np0005588920 nova_compute[226886]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 20 09:18:04 np0005588920 nova_compute[226886]: ++ cat /run_command
Jan 20 09:18:04 np0005588920 nova_compute[226886]: + CMD=nova-compute
Jan 20 09:18:04 np0005588920 nova_compute[226886]: + ARGS=
Jan 20 09:18:04 np0005588920 nova_compute[226886]: + sudo kolla_copy_cacerts
Jan 20 09:18:04 np0005588920 nova_compute[226886]: + [[ ! -n '' ]]
Jan 20 09:18:04 np0005588920 nova_compute[226886]: + . kolla_extend_start
Jan 20 09:18:04 np0005588920 nova_compute[226886]: Running command: 'nova-compute'
Jan 20 09:18:04 np0005588920 nova_compute[226886]: + echo 'Running command: '\''nova-compute'\'''
Jan 20 09:18:04 np0005588920 nova_compute[226886]: + umask 0022
Jan 20 09:18:04 np0005588920 nova_compute[226886]: + exec nova-compute
Jan 20 09:18:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:05.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:05.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:06 np0005588920 nova_compute[226886]: 2026-01-20 14:18:06.555 226890 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 20 09:18:06 np0005588920 nova_compute[226886]: 2026-01-20 14:18:06.556 226890 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 20 09:18:06 np0005588920 nova_compute[226886]: 2026-01-20 14:18:06.556 226890 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 20 09:18:06 np0005588920 nova_compute[226886]: 2026-01-20 14:18:06.556 226890 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Jan 20 09:18:06 np0005588920 nova_compute[226886]: 2026-01-20 14:18:06.687 226890 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:18:06 np0005588920 nova_compute[226886]: 2026-01-20 14:18:06.715 226890 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:18:06 np0005588920 nova_compute[226886]: 2026-01-20 14:18:06.716 226890 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 20 09:18:06 np0005588920 python3.9[227052]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 20 09:18:07 np0005588920 systemd[1]: Started libpod-conmon-051218544c4622ef56308610ff7daeab4f240fcc2e5996468298508725d68a84.scope.
Jan 20 09:18:07 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:18:07 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/00c83bcb055df06c9e21dc3d01f1707fd4e93ea5b7645eb41f047ba395070af4/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Jan 20 09:18:07 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/00c83bcb055df06c9e21dc3d01f1707fd4e93ea5b7645eb41f047ba395070af4/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 20 09:18:07 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/00c83bcb055df06c9e21dc3d01f1707fd4e93ea5b7645eb41f047ba395070af4/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Jan 20 09:18:07 np0005588920 podman[227079]: 2026-01-20 14:18:07.175804648 +0000 UTC m=+0.189165335 container init 051218544c4622ef56308610ff7daeab4f240fcc2e5996468298508725d68a84 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, tcib_managed=true)
Jan 20 09:18:07 np0005588920 podman[227079]: 2026-01-20 14:18:07.189011875 +0000 UTC m=+0.202372502 container start 051218544c4622ef56308610ff7daeab4f240fcc2e5996468298508725d68a84 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=nova_compute_init, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm)
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.190 226890 INFO nova.virt.driver [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Jan 20 09:18:07 np0005588920 python3.9[227052]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Jan 20 09:18:07 np0005588920 nova_compute_init[227100]: INFO:nova_statedir:Applying nova statedir ownership
Jan 20 09:18:07 np0005588920 nova_compute_init[227100]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Jan 20 09:18:07 np0005588920 nova_compute_init[227100]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Jan 20 09:18:07 np0005588920 nova_compute_init[227100]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Jan 20 09:18:07 np0005588920 nova_compute_init[227100]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Jan 20 09:18:07 np0005588920 nova_compute_init[227100]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Jan 20 09:18:07 np0005588920 nova_compute_init[227100]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Jan 20 09:18:07 np0005588920 nova_compute_init[227100]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Jan 20 09:18:07 np0005588920 nova_compute_init[227100]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Jan 20 09:18:07 np0005588920 nova_compute_init[227100]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Jan 20 09:18:07 np0005588920 nova_compute_init[227100]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Jan 20 09:18:07 np0005588920 nova_compute_init[227100]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Jan 20 09:18:07 np0005588920 nova_compute_init[227100]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Jan 20 09:18:07 np0005588920 nova_compute_init[227100]: INFO:nova_statedir:Nova statedir ownership complete
Jan 20 09:18:07 np0005588920 systemd[1]: libpod-051218544c4622ef56308610ff7daeab4f240fcc2e5996468298508725d68a84.scope: Deactivated successfully.
Jan 20 09:18:07 np0005588920 podman[227101]: 2026-01-20 14:18:07.278538463 +0000 UTC m=+0.050008280 container died 051218544c4622ef56308610ff7daeab4f240fcc2e5996468298508725d68a84 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 20 09:18:07 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-051218544c4622ef56308610ff7daeab4f240fcc2e5996468298508725d68a84-userdata-shm.mount: Deactivated successfully.
Jan 20 09:18:07 np0005588920 systemd[1]: var-lib-containers-storage-overlay-00c83bcb055df06c9e21dc3d01f1707fd4e93ea5b7645eb41f047ba395070af4-merged.mount: Deactivated successfully.
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.357 226890 INFO nova.compute.provider_config [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Jan 20 09:18:07 np0005588920 podman[227114]: 2026-01-20 14:18:07.363876121 +0000 UTC m=+0.073808150 container cleanup 051218544c4622ef56308610ff7daeab4f240fcc2e5996468298508725d68a84 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, container_name=nova_compute_init, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 20 09:18:07 np0005588920 systemd[1]: libpod-conmon-051218544c4622ef56308610ff7daeab4f240fcc2e5996468298508725d68a84.scope: Deactivated successfully.
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.376 226890 DEBUG oslo_concurrency.lockutils [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.377 226890 DEBUG oslo_concurrency.lockutils [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.377 226890 DEBUG oslo_concurrency.lockutils [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.377 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.377 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.377 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.378 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.378 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.378 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.378 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.378 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.378 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.378 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.379 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.379 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.379 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.379 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.379 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.379 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.379 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.380 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.380 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.380 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] console_host                   = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.380 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.380 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.380 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.380 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.381 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.381 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.381 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.381 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.381 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.381 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.382 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.382 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.382 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.382 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.382 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.382 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.382 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.382 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.383 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.383 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.383 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.383 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.383 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.383 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.384 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.384 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.384 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.384 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.384 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.384 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.384 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.385 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.385 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.385 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.385 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.385 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.385 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.385 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.386 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.386 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.386 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.386 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.386 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.386 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.386 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.386 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.387 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.387 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.387 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.387 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.387 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.387 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.387 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.388 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.388 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.388 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.388 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.388 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.388 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.388 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.389 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] my_block_storage_ip            = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.389 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] my_ip                          = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.389 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.389 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.389 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.389 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.389 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.390 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.390 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.390 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.390 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.390 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.390 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.390 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.390 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.391 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.391 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.391 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.391 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.391 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.391 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.391 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.392 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.392 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.392 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.392 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.392 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.392 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.392 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.393 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.393 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.393 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.393 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.393 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.393 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.393 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.393 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.394 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.394 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.394 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.394 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.394 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.394 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.394 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.395 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.395 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.395 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.395 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.395 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.395 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.395 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.395 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.396 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.396 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.396 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.396 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.396 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.396 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.396 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.397 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.397 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.397 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.397 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.397 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.397 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.397 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.397 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.398 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.398 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.398 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.398 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.398 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.398 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.398 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.399 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.399 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.399 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.399 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.399 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.399 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.399 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.400 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.400 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.400 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.400 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.400 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.400 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.401 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.401 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.401 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.401 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.401 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.401 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.401 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.402 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.402 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.402 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.402 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.402 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.402 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.403 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.403 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.403 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.403 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.403 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.403 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.403 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.404 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.404 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.404 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.404 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.404 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.404 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.404 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.404 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.405 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.405 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.405 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.405 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.405 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.405 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.405 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.406 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.406 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.406 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.406 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.406 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.406 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.406 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.407 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.407 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.407 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.407 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.407 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.407 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.407 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.408 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.408 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.408 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.408 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.408 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.408 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.408 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.409 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.409 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.409 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.409 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.409 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.409 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.409 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.409 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.410 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.410 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.410 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.410 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.410 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.410 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.410 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.411 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.411 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.411 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.411 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.411 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.411 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.411 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.412 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.412 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.412 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.412 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.412 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.412 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.412 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.413 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.413 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.413 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.413 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.413 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.414 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.414 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.414 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.414 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.414 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.414 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.415 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.415 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.415 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.415 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.415 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.415 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.415 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.416 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.416 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.416 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.416 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.416 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.416 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.416 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.417 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.417 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.417 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.417 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.417 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.418 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.418 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.418 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.418 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.418 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.419 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.419 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.419 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.419 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.420 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.420 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.421 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.421 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.421 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.421 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.421 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.421 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.421 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.422 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.422 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.422 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.422 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.422 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.422 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.423 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.423 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.423 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.423 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.423 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.423 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.424 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.424 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.424 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.424 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.424 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.424 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.424 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.425 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.425 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.425 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.425 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.425 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.425 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.426 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.426 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.426 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.426 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.426 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.426 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.426 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.427 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.427 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.427 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.427 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.427 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.427 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.427 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.428 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.428 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.428 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.428 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.428 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.428 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.428 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.429 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.429 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.429 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.429 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.429 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.429 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.429 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.430 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.430 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.430 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.430 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.430 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.431 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.431 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.431 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.431 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.431 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.431 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.431 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.432 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.432 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.432 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.432 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.432 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.432 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.432 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.432 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.433 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.433 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.433 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.433 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.433 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.433 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.433 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.434 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.434 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.434 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.434 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.434 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.434 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.434 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.435 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.435 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.435 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.435 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.435 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.435 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.435 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.436 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.436 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.436 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.436 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.436 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.436 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.436 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.437 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.437 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.437 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.437 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.437 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.437 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.437 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.438 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.438 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.438 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.438 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.438 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.438 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.438 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.439 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.439 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.439 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.439 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.439 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.439 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.440 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.440 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.440 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.440 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.440 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.441 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.441 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.441 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.441 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.441 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.441 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.441 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.442 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.442 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.442 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.442 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.442 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.442 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.442 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.442 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.443 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.443 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.443 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.443 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.443 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.443 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.444 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.444 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.444 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.444 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.444 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.444 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.445 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.445 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.445 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.445 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.445 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.445 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.445 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.446 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.446 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.446 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.446 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.446 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.446 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.446 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.447 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.447 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.447 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.447 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.447 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.447 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.447 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.448 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.448 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.448 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.448 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.448 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.448 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.448 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.449 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.449 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.449 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.449 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.449 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.449 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.449 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.450 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.450 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.450 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.450 226890 WARNING oslo_config.cfg [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 20 09:18:07 np0005588920 nova_compute[226886]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 20 09:18:07 np0005588920 nova_compute[226886]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 20 09:18:07 np0005588920 nova_compute[226886]: and ``live_migration_inbound_addr`` respectively.
Jan 20 09:18:07 np0005588920 nova_compute[226886]: ).  Its value may be silently ignored in the future.#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.450 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.451 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.451 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.451 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.451 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.451 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.452 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.452 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.452 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.452 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.452 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.452 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.452 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.453 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.453 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.453 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.453 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.453 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.454 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.rbd_secret_uuid        = e399cf45-e6b6-5393-99f1-75c601d3f188 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.454 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.454 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.454 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.454 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.454 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.455 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.455 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.455 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.455 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.455 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.455 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.456 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.456 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.456 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.456 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.456 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.456 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.457 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.457 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.457 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.457 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.457 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.458 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.458 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.458 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.458 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.458 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.458 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.459 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.459 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.459 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.459 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.459 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.460 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.460 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.460 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.460 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.460 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.460 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.461 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.461 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.461 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.461 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.461 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.461 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.461 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.462 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.462 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.462 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.462 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.462 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.462 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.463 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.463 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.463 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.463 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.463 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.463 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.464 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.464 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.464 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.464 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.465 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.465 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.465 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.465 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.465 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.465 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.466 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:07.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.466 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.466 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.466 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.466 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.466 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.467 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.467 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.467 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.467 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.467 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.467 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.468 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.468 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.468 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.468 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.468 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.469 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.469 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.469 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.469 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.469 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.470 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.470 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.470 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.470 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.470 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.470 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.470 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.471 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.471 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.471 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.471 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.471 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.471 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.472 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.472 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.472 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.472 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.472 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.473 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.473 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.473 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.473 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.473 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.473 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.474 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.474 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.474 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.474 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.474 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.474 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.474 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.475 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.475 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.475 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.475 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.475 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.475 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.476 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.476 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.476 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.476 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.476 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.477 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.477 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.477 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.477 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.477 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.478 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.478 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.478 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.478 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.478 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.478 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.479 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.479 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.479 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.479 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.479 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.479 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.479 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.480 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.480 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.480 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.480 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.480 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.480 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.480 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.481 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.481 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.481 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.481 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.481 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.482 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.482 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.482 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.482 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.482 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.482 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.483 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.483 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.483 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.483 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.483 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.483 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.483 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.484 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.484 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.484 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.484 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.484 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.484 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.485 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.485 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.485 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.485 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.485 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.485 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.485 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.486 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.486 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.486 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.486 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.486 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.486 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.486 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.487 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.487 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.487 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.487 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.487 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.487 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.487 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.487 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.488 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.488 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.488 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.488 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.488 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.488 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.489 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.489 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.489 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.489 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.489 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.489 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.489 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.489 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.490 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.490 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.490 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.490 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.490 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.490 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.490 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.491 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.491 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.491 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.491 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.491 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.491 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.492 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.492 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.492 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.492 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vnc.server_proxyclient_address = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.492 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.492 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.492 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.493 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.493 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.493 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.493 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.493 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.493 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.493 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.494 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.494 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.494 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.494 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.494 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.494 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.495 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.495 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.495 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.495 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.495 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.495 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.496 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.496 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.496 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.496 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.496 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.496 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.497 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.497 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.497 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.497 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.497 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.497 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.497 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.498 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.498 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.498 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.498 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.498 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.499 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.499 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.499 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.499 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.499 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.499 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.500 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.500 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.500 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.500 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.500 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.500 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.500 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.501 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.501 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.501 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.501 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.501 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.501 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.501 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.502 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.502 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.502 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.502 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.502 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.502 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.502 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.503 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.503 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.503 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.503 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.503 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.503 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.503 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.504 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.504 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.504 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.504 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.504 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.504 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.504 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.505 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.505 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.505 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.505 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.505 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.505 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.506 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.506 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.506 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.506 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.506 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.506 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.507 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.507 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.507 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.507 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.507 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.507 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.507 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.508 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.508 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.508 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.508 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.508 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.508 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.508 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.509 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.509 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.509 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.509 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.509 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.509 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.509 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.510 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.510 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.510 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.510 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.510 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.510 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.510 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.511 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.511 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.511 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.511 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.511 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.511 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.511 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.512 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.512 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.512 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.512 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.512 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.512 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.512 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.513 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.513 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.513 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.513 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.513 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.513 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.513 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.514 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.514 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.514 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.514 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.514 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.514 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.514 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.515 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.515 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.515 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.515 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.515 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.515 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.515 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.516 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.516 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.516 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.516 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.516 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.516 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.516 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.516 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.517 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.517 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.517 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.517 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.517 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.517 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.517 226890 DEBUG oslo_service.service [None req-0939a9ff-3d3c-46a2-9c32-9b308a52a6f3 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.518 226890 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Jan 20 09:18:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:18:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:07.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.551 226890 INFO nova.virt.node [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Determined node identity ff38e91c-3320-4831-90ac-bcffc89ba7b6 from /var/lib/nova/compute_id#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.552 226890 DEBUG nova.virt.libvirt.host [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.553 226890 DEBUG nova.virt.libvirt.host [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.553 226890 DEBUG nova.virt.libvirt.host [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.553 226890 DEBUG nova.virt.libvirt.host [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.565 226890 DEBUG nova.virt.libvirt.host [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f3bac808f40> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.567 226890 DEBUG nova.virt.libvirt.host [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f3bac808f40> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.568 226890 INFO nova.virt.libvirt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Connection event '1' reason 'None'#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.573 226890 INFO nova.virt.libvirt.host [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Libvirt host capabilities <capabilities>
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  <host>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <uuid>1190ec40-2b89-4358-8dec-733c5829fbed</uuid>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <cpu>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <arch>x86_64</arch>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model>EPYC-Rome-v4</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <vendor>AMD</vendor>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <microcode version='16777317'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <signature family='23' model='49' stepping='0'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <maxphysaddr mode='emulate' bits='40'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature name='x2apic'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature name='tsc-deadline'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature name='osxsave'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature name='hypervisor'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature name='tsc_adjust'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature name='spec-ctrl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature name='stibp'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature name='arch-capabilities'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature name='ssbd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature name='cmp_legacy'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature name='topoext'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature name='virt-ssbd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature name='lbrv'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature name='tsc-scale'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature name='vmcb-clean'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature name='pause-filter'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature name='pfthreshold'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature name='svme-addr-chk'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature name='rdctl-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature name='skip-l1dfl-vmentry'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature name='mds-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature name='pschange-mc-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <pages unit='KiB' size='4'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <pages unit='KiB' size='2048'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <pages unit='KiB' size='1048576'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </cpu>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <power_management>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <suspend_mem/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </power_management>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <iommu support='no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <migration_features>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <live/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <uri_transports>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <uri_transport>tcp</uri_transport>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <uri_transport>rdma</uri_transport>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </uri_transports>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </migration_features>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <topology>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <cells num='1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <cell id='0'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:          <memory unit='KiB'>7864308</memory>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:          <pages unit='KiB' size='4'>1966077</pages>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:          <pages unit='KiB' size='2048'>0</pages>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:          <pages unit='KiB' size='1048576'>0</pages>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:          <distances>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:            <sibling id='0' value='10'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:          </distances>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:          <cpus num='8'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:          </cpus>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        </cell>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </cells>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </topology>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <cache>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </cache>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <secmodel>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model>selinux</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <doi>0</doi>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </secmodel>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <secmodel>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model>dac</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <doi>0</doi>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <baselabel type='kvm'>+107:+107</baselabel>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <baselabel type='qemu'>+107:+107</baselabel>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </secmodel>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  </host>
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  <guest>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <os_type>hvm</os_type>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <arch name='i686'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <wordsize>32</wordsize>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <domain type='qemu'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <domain type='kvm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </arch>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <features>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <pae/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <nonpae/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <acpi default='on' toggle='yes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <apic default='on' toggle='no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <cpuselection/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <deviceboot/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <disksnapshot default='on' toggle='no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <externalSnapshot/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </features>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  </guest>
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  <guest>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <os_type>hvm</os_type>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <arch name='x86_64'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <wordsize>64</wordsize>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <domain type='qemu'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <domain type='kvm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </arch>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <features>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <acpi default='on' toggle='yes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <apic default='on' toggle='no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <cpuselection/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <deviceboot/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <disksnapshot default='on' toggle='no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <externalSnapshot/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </features>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  </guest>
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 
Jan 20 09:18:07 np0005588920 nova_compute[226886]: </capabilities>
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.578 226890 DEBUG nova.virt.libvirt.host [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.582 226890 DEBUG nova.virt.libvirt.host [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 20 09:18:07 np0005588920 nova_compute[226886]: <domainCapabilities>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  <path>/usr/libexec/qemu-kvm</path>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  <domain>kvm</domain>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  <arch>i686</arch>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  <vcpu max='240'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  <iothreads supported='yes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  <os supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <enum name='firmware'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <loader supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='type'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>rom</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>pflash</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='readonly'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>yes</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>no</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='secure'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>no</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </loader>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  <cpu>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <mode name='host-passthrough' supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='hostPassthroughMigratable'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>on</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>off</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </mode>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <mode name='maximum' supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='maximumMigratable'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>on</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>off</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </mode>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <mode name='host-model' supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <vendor>AMD</vendor>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='x2apic'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='tsc-deadline'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='hypervisor'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='tsc_adjust'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='spec-ctrl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='stibp'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='ssbd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='cmp_legacy'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='overflow-recov'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='succor'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='ibrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='amd-ssbd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='virt-ssbd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='lbrv'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='tsc-scale'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='vmcb-clean'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='flushbyasid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='pause-filter'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='pfthreshold'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='svme-addr-chk'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='disable' name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </mode>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <mode name='custom' supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Broadwell'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Broadwell-IBRS'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Broadwell-noTSX'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Broadwell-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Broadwell-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Broadwell-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Broadwell-v4'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Cascadelake-Server'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Cascadelake-Server-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Cascadelake-Server-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Cascadelake-Server-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Cascadelake-Server-v4'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Cascadelake-Server-v5'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='ClearwaterForest'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni-int16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bhi-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cldemote'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cmpccxadd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ddpd-u'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fbsdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='intel-psfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='lam'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mcdt-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pbrsb-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='prefetchiti'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='psdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sha512'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sm3'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sm4'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='ClearwaterForest-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni-int16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bhi-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cldemote'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cmpccxadd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ddpd-u'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fbsdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='intel-psfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='lam'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mcdt-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pbrsb-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='prefetchiti'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='psdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sha512'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sm3'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sm4'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Cooperlake'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Cooperlake-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Cooperlake-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Denverton'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mpx'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Denverton-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mpx'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Denverton-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Denverton-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Dhyana-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-Genoa'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amd-psfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='auto-ibrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='stibp-always-on'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-Genoa-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amd-psfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='auto-ibrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='stibp-always-on'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-Genoa-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amd-psfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='auto-ibrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fs-gs-base-ns'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='perfmon-v2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='stibp-always-on'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-Milan'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-Milan-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-Milan-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amd-psfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='stibp-always-on'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-Milan-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amd-psfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='stibp-always-on'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-Rome'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-Rome-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-Rome-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-Rome-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-Turin'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amd-psfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='auto-ibrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vp2intersect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fs-gs-base-ns'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibpb-brtype'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='perfmon-v2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='prefetchi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbpb'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='srso-user-kernel-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='stibp-always-on'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-Turin-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amd-psfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='auto-ibrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vp2intersect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fs-gs-base-ns'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibpb-brtype'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='perfmon-v2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='prefetchi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbpb'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='srso-user-kernel-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='stibp-always-on'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-v4'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-v5'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='GraniteRapids'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-fp16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-tile'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-fp16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fbsdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrc'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fzrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mcdt-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pbrsb-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='prefetchiti'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='psdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='GraniteRapids-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-fp16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-tile'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-fp16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fbsdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrc'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fzrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mcdt-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pbrsb-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='prefetchiti'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='psdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='GraniteRapids-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-fp16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-tile'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx10'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx10-128'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx10-256'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx10-512'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-fp16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cldemote'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fbsdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrc'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fzrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mcdt-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pbrsb-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='prefetchiti'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='psdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='GraniteRapids-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-fp16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-tile'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx10'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx10-128'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx10-256'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx10-512'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-fp16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cldemote'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fbsdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrc'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fzrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mcdt-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pbrsb-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='prefetchiti'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='psdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Haswell'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Haswell-IBRS'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Haswell-noTSX'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Haswell-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Haswell-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Haswell-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Haswell-v4'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Icelake-Server'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Icelake-Server-noTSX'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Icelake-Server-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Icelake-Server-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Icelake-Server-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Icelake-Server-v4'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Icelake-Server-v5'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Icelake-Server-v6'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Icelake-Server-v7'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='IvyBridge'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='IvyBridge-IBRS'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='IvyBridge-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='IvyBridge-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='KnightsMill'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-4fmaps'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-4vnniw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512er'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512pf'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='KnightsMill-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-4fmaps'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-4vnniw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512er'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512pf'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Opteron_G4'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fma4'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xop'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Opteron_G4-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fma4'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xop'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Opteron_G5'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fma4'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='tbm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xop'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Opteron_G5-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fma4'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='tbm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xop'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='SapphireRapids'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-tile'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-fp16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrc'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fzrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='SapphireRapids-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-tile'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-fp16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrc'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fzrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='SapphireRapids-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-tile'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-fp16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fbsdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrc'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fzrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='psdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='SapphireRapids-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-tile'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-fp16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cldemote'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fbsdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrc'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fzrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='psdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='SapphireRapids-v4'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-tile'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-fp16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cldemote'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fbsdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrc'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fzrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='psdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='SierraForest'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cmpccxadd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fbsdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mcdt-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pbrsb-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='psdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='SierraForest-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cmpccxadd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fbsdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mcdt-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pbrsb-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='psdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='SierraForest-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cldemote'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cmpccxadd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fbsdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='intel-psfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='lam'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mcdt-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pbrsb-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='psdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='SierraForest-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cldemote'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cmpccxadd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fbsdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='intel-psfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='lam'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mcdt-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pbrsb-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='psdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Client'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Client-IBRS'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Client-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Client-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Client-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Client-v4'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Server'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Server-IBRS'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Server-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Server-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Server-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Server-v4'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Server-v5'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Snowridge'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cldemote'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='core-capability'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mpx'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='split-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Snowridge-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cldemote'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='core-capability'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mpx'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='split-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Snowridge-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cldemote'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='core-capability'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='split-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Snowridge-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cldemote'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='core-capability'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='split-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Snowridge-v4'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cldemote'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='athlon'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='3dnow'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='3dnowext'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='athlon-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='3dnow'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='3dnowext'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='core2duo'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='core2duo-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='coreduo'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='coreduo-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='n270'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='n270-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='phenom'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='3dnow'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='3dnowext'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='phenom-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='3dnow'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='3dnowext'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </mode>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  <memoryBacking supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <enum name='sourceType'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <value>file</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <value>anonymous</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <value>memfd</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  </memoryBacking>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <disk supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='diskDevice'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>disk</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>cdrom</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>floppy</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>lun</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='bus'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>ide</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>fdc</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>scsi</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>virtio</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>usb</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>sata</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='model'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>virtio</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>virtio-transitional</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>virtio-non-transitional</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <graphics supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='type'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>vnc</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>egl-headless</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>dbus</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </graphics>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <video supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='modelType'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>vga</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>cirrus</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>virtio</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>none</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>bochs</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>ramfb</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <hostdev supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='mode'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>subsystem</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='startupPolicy'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>default</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>mandatory</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>requisite</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>optional</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='subsysType'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>usb</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>pci</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>scsi</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='capsType'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='pciBackend'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </hostdev>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <rng supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='model'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>virtio</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>virtio-transitional</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>virtio-non-transitional</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='backendModel'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>random</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>egd</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>builtin</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <filesystem supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='driverType'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>path</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>handle</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>virtiofs</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </filesystem>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <tpm supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='model'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>tpm-tis</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>tpm-crb</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='backendModel'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>emulator</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>external</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='backendVersion'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>2.0</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </tpm>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <redirdev supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='bus'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>usb</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </redirdev>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <channel supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='type'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>pty</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>unix</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </channel>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <crypto supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='model'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='type'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>qemu</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='backendModel'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>builtin</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </crypto>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <interface supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='backendType'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>default</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>passt</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <panic supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='model'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>isa</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>hyperv</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </panic>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <console supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='type'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>null</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>vc</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>pty</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>dev</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>file</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>pipe</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>stdio</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>udp</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>tcp</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>unix</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>qemu-vdagent</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>dbus</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </console>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <gic supported='no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <vmcoreinfo supported='yes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <genid supported='yes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <backingStoreInput supported='yes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <backup supported='yes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <async-teardown supported='yes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <s390-pv supported='no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <ps2 supported='yes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <tdx supported='no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <sev supported='no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <sgx supported='no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <hyperv supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='features'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>relaxed</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>vapic</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>spinlocks</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>vpindex</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>runtime</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>synic</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>stimer</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>reset</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>vendor_id</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>frequencies</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>reenlightenment</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>tlbflush</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>ipi</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>avic</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>emsr_bitmap</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>xmm_input</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <defaults>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <spinlocks>4095</spinlocks>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <stimer_direct>on</stimer_direct>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <tlbflush_direct>on</tlbflush_direct>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <tlbflush_extended>on</tlbflush_extended>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </defaults>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </hyperv>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <launchSecurity supported='no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:18:07 np0005588920 nova_compute[226886]: </domainCapabilities>
Jan 20 09:18:07 np0005588920 nova_compute[226886]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.590 226890 DEBUG nova.virt.libvirt.volume.mount [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.590 226890 DEBUG nova.virt.libvirt.host [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 20 09:18:07 np0005588920 nova_compute[226886]: <domainCapabilities>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  <path>/usr/libexec/qemu-kvm</path>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  <domain>kvm</domain>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  <arch>i686</arch>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  <vcpu max='4096'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  <iothreads supported='yes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  <os supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <enum name='firmware'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <loader supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='type'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>rom</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>pflash</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='readonly'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>yes</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>no</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='secure'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>no</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </loader>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  <cpu>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <mode name='host-passthrough' supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='hostPassthroughMigratable'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>on</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>off</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </mode>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <mode name='maximum' supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='maximumMigratable'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>on</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>off</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </mode>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <mode name='host-model' supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <vendor>AMD</vendor>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='x2apic'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='tsc-deadline'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='hypervisor'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='tsc_adjust'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='spec-ctrl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='stibp'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='ssbd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='cmp_legacy'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='overflow-recov'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='succor'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='ibrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='amd-ssbd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='virt-ssbd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='lbrv'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='tsc-scale'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='vmcb-clean'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='flushbyasid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='pause-filter'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='pfthreshold'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='svme-addr-chk'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='disable' name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </mode>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <mode name='custom' supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Broadwell'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Broadwell-IBRS'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Broadwell-noTSX'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Broadwell-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Broadwell-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Broadwell-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Broadwell-v4'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Cascadelake-Server'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Cascadelake-Server-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Cascadelake-Server-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Cascadelake-Server-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Cascadelake-Server-v4'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Cascadelake-Server-v5'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='ClearwaterForest'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni-int16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bhi-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cldemote'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cmpccxadd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ddpd-u'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fbsdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='intel-psfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='lam'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mcdt-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pbrsb-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='prefetchiti'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='psdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sha512'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sm3'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sm4'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='ClearwaterForest-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni-int16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bhi-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cldemote'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cmpccxadd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ddpd-u'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fbsdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='intel-psfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='lam'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mcdt-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pbrsb-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='prefetchiti'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='psdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sha512'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sm3'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sm4'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Cooperlake'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Cooperlake-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Cooperlake-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Denverton'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mpx'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Denverton-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mpx'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Denverton-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Denverton-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Dhyana-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-Genoa'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amd-psfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='auto-ibrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='stibp-always-on'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-Genoa-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amd-psfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='auto-ibrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='stibp-always-on'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-Genoa-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amd-psfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='auto-ibrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fs-gs-base-ns'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='perfmon-v2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='stibp-always-on'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-Milan'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-Milan-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-Milan-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amd-psfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='stibp-always-on'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-Milan-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amd-psfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='stibp-always-on'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-Rome'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-Rome-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-Rome-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-Rome-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-Turin'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amd-psfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='auto-ibrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vp2intersect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fs-gs-base-ns'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibpb-brtype'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='perfmon-v2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='prefetchi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbpb'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='srso-user-kernel-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='stibp-always-on'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-Turin-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amd-psfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='auto-ibrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vp2intersect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fs-gs-base-ns'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibpb-brtype'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='perfmon-v2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='prefetchi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbpb'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='srso-user-kernel-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='stibp-always-on'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-v4'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-v5'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='GraniteRapids'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-fp16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-tile'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-fp16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fbsdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrc'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fzrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mcdt-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pbrsb-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='prefetchiti'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='psdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='GraniteRapids-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-fp16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-tile'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-fp16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fbsdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrc'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fzrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mcdt-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pbrsb-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='prefetchiti'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='psdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='GraniteRapids-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-fp16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-tile'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx10'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx10-128'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx10-256'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx10-512'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-fp16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cldemote'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fbsdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrc'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fzrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mcdt-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pbrsb-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='prefetchiti'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='psdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='GraniteRapids-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-fp16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-tile'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx10'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx10-128'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx10-256'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx10-512'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-fp16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cldemote'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fbsdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrc'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fzrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mcdt-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pbrsb-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='prefetchiti'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='psdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Haswell'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Haswell-IBRS'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Haswell-noTSX'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Haswell-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Haswell-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Haswell-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Haswell-v4'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Icelake-Server'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Icelake-Server-noTSX'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Icelake-Server-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Icelake-Server-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Icelake-Server-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Icelake-Server-v4'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Icelake-Server-v5'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Icelake-Server-v6'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Icelake-Server-v7'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='IvyBridge'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='IvyBridge-IBRS'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='IvyBridge-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='IvyBridge-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='KnightsMill'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-4fmaps'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-4vnniw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512er'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512pf'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='KnightsMill-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-4fmaps'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-4vnniw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512er'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512pf'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Opteron_G4'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fma4'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xop'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Opteron_G4-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fma4'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xop'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Opteron_G5'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fma4'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='tbm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xop'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Opteron_G5-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fma4'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='tbm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xop'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='SapphireRapids'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-tile'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-fp16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrc'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fzrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='SapphireRapids-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-tile'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-fp16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrc'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fzrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='SapphireRapids-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-tile'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-fp16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fbsdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrc'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fzrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='psdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='SapphireRapids-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-tile'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-fp16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cldemote'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fbsdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrc'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fzrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='psdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='SapphireRapids-v4'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-tile'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-fp16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cldemote'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fbsdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrc'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fzrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='psdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='SierraForest'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cmpccxadd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fbsdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mcdt-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pbrsb-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='psdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='SierraForest-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cmpccxadd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fbsdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mcdt-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pbrsb-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='psdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='SierraForest-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cldemote'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cmpccxadd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fbsdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='intel-psfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='lam'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mcdt-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pbrsb-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='psdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='SierraForest-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cldemote'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cmpccxadd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fbsdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='intel-psfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='lam'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mcdt-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pbrsb-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='psdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Client'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Client-IBRS'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Client-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Client-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Client-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Client-v4'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Server'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Server-IBRS'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Server-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Server-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Server-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Server-v4'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Server-v5'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Snowridge'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cldemote'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='core-capability'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mpx'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='split-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Snowridge-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cldemote'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='core-capability'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mpx'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='split-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Snowridge-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cldemote'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='core-capability'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='split-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Snowridge-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cldemote'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='core-capability'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='split-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Snowridge-v4'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cldemote'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='athlon'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='3dnow'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='3dnowext'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='athlon-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='3dnow'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='3dnowext'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='core2duo'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='core2duo-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='coreduo'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='coreduo-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='n270'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='n270-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='phenom'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='3dnow'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='3dnowext'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='phenom-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='3dnow'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='3dnowext'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </mode>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  <memoryBacking supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <enum name='sourceType'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <value>file</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <value>anonymous</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <value>memfd</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  </memoryBacking>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <disk supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='diskDevice'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>disk</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>cdrom</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>floppy</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>lun</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='bus'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>fdc</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>scsi</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>virtio</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>usb</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>sata</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='model'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>virtio</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>virtio-transitional</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>virtio-non-transitional</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <graphics supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='type'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>vnc</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>egl-headless</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>dbus</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </graphics>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <video supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='modelType'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>vga</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>cirrus</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>virtio</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>none</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>bochs</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>ramfb</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <hostdev supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='mode'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>subsystem</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='startupPolicy'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>default</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>mandatory</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>requisite</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>optional</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='subsysType'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>usb</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>pci</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>scsi</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='capsType'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='pciBackend'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </hostdev>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <rng supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='model'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>virtio</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>virtio-transitional</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>virtio-non-transitional</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='backendModel'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>random</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>egd</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>builtin</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <filesystem supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='driverType'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>path</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>handle</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>virtiofs</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </filesystem>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <tpm supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='model'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>tpm-tis</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>tpm-crb</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='backendModel'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>emulator</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>external</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='backendVersion'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>2.0</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </tpm>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <redirdev supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='bus'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>usb</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </redirdev>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <channel supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='type'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>pty</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>unix</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </channel>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <crypto supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='model'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='type'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>qemu</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='backendModel'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>builtin</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </crypto>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <interface supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='backendType'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>default</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>passt</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <panic supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='model'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>isa</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>hyperv</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </panic>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <console supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='type'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>null</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>vc</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>pty</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>dev</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>file</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>pipe</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>stdio</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>udp</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>tcp</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>unix</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>qemu-vdagent</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>dbus</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </console>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <gic supported='no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <vmcoreinfo supported='yes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <genid supported='yes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <backingStoreInput supported='yes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <backup supported='yes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <async-teardown supported='yes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <s390-pv supported='no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <ps2 supported='yes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <tdx supported='no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <sev supported='no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <sgx supported='no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <hyperv supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='features'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>relaxed</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>vapic</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>spinlocks</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>vpindex</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>runtime</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>synic</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>stimer</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>reset</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>vendor_id</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>frequencies</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>reenlightenment</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>tlbflush</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>ipi</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>avic</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>emsr_bitmap</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>xmm_input</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <defaults>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <spinlocks>4095</spinlocks>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <stimer_direct>on</stimer_direct>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <tlbflush_direct>on</tlbflush_direct>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <tlbflush_extended>on</tlbflush_extended>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </defaults>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </hyperv>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <launchSecurity supported='no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:18:07 np0005588920 nova_compute[226886]: </domainCapabilities>
Jan 20 09:18:07 np0005588920 nova_compute[226886]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.669 226890 DEBUG nova.virt.libvirt.host [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.677 226890 DEBUG nova.virt.libvirt.host [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jan 20 09:18:07 np0005588920 nova_compute[226886]: <domainCapabilities>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  <path>/usr/libexec/qemu-kvm</path>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  <domain>kvm</domain>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  <arch>x86_64</arch>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  <vcpu max='240'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  <iothreads supported='yes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  <os supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <enum name='firmware'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <loader supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='type'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>rom</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>pflash</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='readonly'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>yes</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>no</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='secure'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>no</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </loader>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  <cpu>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <mode name='host-passthrough' supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='hostPassthroughMigratable'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>on</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>off</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </mode>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <mode name='maximum' supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='maximumMigratable'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>on</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>off</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </mode>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <mode name='host-model' supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <vendor>AMD</vendor>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='x2apic'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='tsc-deadline'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='hypervisor'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='tsc_adjust'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='spec-ctrl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='stibp'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='ssbd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='cmp_legacy'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='overflow-recov'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='succor'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='ibrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='amd-ssbd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='virt-ssbd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='lbrv'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='tsc-scale'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='vmcb-clean'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='flushbyasid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='pause-filter'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='pfthreshold'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='svme-addr-chk'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='disable' name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </mode>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <mode name='custom' supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Broadwell'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Broadwell-IBRS'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Broadwell-noTSX'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Broadwell-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Broadwell-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Broadwell-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Broadwell-v4'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Cascadelake-Server'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Cascadelake-Server-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Cascadelake-Server-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Cascadelake-Server-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Cascadelake-Server-v4'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Cascadelake-Server-v5'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='ClearwaterForest'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni-int16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bhi-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cldemote'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cmpccxadd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ddpd-u'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fbsdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='intel-psfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='lam'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mcdt-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pbrsb-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='prefetchiti'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='psdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sha512'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sm3'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sm4'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='ClearwaterForest-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni-int16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bhi-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cldemote'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cmpccxadd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ddpd-u'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fbsdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='intel-psfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='lam'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mcdt-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pbrsb-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='prefetchiti'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='psdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sha512'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sm3'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sm4'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Cooperlake'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Cooperlake-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Cooperlake-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Denverton'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mpx'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Denverton-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mpx'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Denverton-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Denverton-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Dhyana-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-Genoa'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amd-psfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='auto-ibrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='stibp-always-on'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-Genoa-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amd-psfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='auto-ibrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='stibp-always-on'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-Genoa-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amd-psfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='auto-ibrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fs-gs-base-ns'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='perfmon-v2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='stibp-always-on'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-Milan'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-Milan-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-Milan-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amd-psfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='stibp-always-on'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-Milan-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amd-psfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='stibp-always-on'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-Rome'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-Rome-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-Rome-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-Rome-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-Turin'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amd-psfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='auto-ibrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vp2intersect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fs-gs-base-ns'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibpb-brtype'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='perfmon-v2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='prefetchi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbpb'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='srso-user-kernel-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='stibp-always-on'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-Turin-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amd-psfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='auto-ibrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vp2intersect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fs-gs-base-ns'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibpb-brtype'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='perfmon-v2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='prefetchi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbpb'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='srso-user-kernel-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='stibp-always-on'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-v4'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-v5'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='GraniteRapids'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-fp16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-tile'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-fp16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fbsdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrc'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fzrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mcdt-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pbrsb-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='prefetchiti'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='psdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='GraniteRapids-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-fp16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-tile'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-fp16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fbsdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrc'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fzrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mcdt-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pbrsb-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='prefetchiti'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='psdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='GraniteRapids-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-fp16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-tile'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx10'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx10-128'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx10-256'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx10-512'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-fp16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cldemote'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fbsdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrc'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fzrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mcdt-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pbrsb-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='prefetchiti'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='psdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='GraniteRapids-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-fp16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-tile'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx10'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx10-128'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx10-256'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx10-512'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-fp16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cldemote'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fbsdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrc'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fzrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mcdt-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pbrsb-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='prefetchiti'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='psdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Haswell'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Haswell-IBRS'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Haswell-noTSX'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Haswell-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Haswell-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Haswell-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Haswell-v4'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Icelake-Server'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Icelake-Server-noTSX'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Icelake-Server-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Icelake-Server-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Icelake-Server-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Icelake-Server-v4'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Icelake-Server-v5'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Icelake-Server-v6'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Icelake-Server-v7'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='IvyBridge'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='IvyBridge-IBRS'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='IvyBridge-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='IvyBridge-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='KnightsMill'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-4fmaps'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-4vnniw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512er'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512pf'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='KnightsMill-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-4fmaps'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-4vnniw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512er'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512pf'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Opteron_G4'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fma4'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xop'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Opteron_G4-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fma4'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xop'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Opteron_G5'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fma4'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='tbm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xop'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Opteron_G5-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fma4'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='tbm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xop'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='SapphireRapids'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-tile'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-fp16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrc'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fzrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='SapphireRapids-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-tile'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-fp16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrc'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fzrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='SapphireRapids-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-tile'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-fp16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fbsdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrc'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fzrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='psdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='SapphireRapids-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-tile'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-fp16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cldemote'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fbsdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrc'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fzrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='psdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='SapphireRapids-v4'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-tile'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-fp16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cldemote'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fbsdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrc'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fzrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='psdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='SierraForest'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cmpccxadd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fbsdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mcdt-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pbrsb-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='psdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='SierraForest-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cmpccxadd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fbsdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mcdt-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pbrsb-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='psdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='SierraForest-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cldemote'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cmpccxadd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fbsdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='intel-psfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='lam'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mcdt-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pbrsb-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='psdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='SierraForest-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cldemote'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cmpccxadd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fbsdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='intel-psfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='lam'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mcdt-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pbrsb-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='psdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Client'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Client-IBRS'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Client-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Client-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Client-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Client-v4'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Server'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Server-IBRS'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Server-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Server-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Server-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Server-v4'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Server-v5'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Snowridge'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cldemote'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='core-capability'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mpx'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='split-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Snowridge-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cldemote'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='core-capability'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mpx'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='split-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Snowridge-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cldemote'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='core-capability'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='split-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Snowridge-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cldemote'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='core-capability'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='split-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Snowridge-v4'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cldemote'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='athlon'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='3dnow'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='3dnowext'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='athlon-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='3dnow'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='3dnowext'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='core2duo'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='core2duo-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='coreduo'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='coreduo-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='n270'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='n270-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='phenom'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='3dnow'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='3dnowext'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='phenom-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='3dnow'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='3dnowext'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </mode>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  <memoryBacking supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <enum name='sourceType'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <value>file</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <value>anonymous</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <value>memfd</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  </memoryBacking>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <disk supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='diskDevice'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>disk</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>cdrom</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>floppy</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>lun</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='bus'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>ide</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>fdc</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>scsi</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>virtio</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>usb</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>sata</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='model'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>virtio</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>virtio-transitional</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>virtio-non-transitional</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <graphics supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='type'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>vnc</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>egl-headless</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>dbus</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </graphics>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <video supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='modelType'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>vga</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>cirrus</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>virtio</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>none</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>bochs</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>ramfb</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <hostdev supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='mode'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>subsystem</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='startupPolicy'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>default</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>mandatory</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>requisite</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>optional</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='subsysType'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>usb</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>pci</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>scsi</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='capsType'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='pciBackend'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </hostdev>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <rng supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='model'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>virtio</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>virtio-transitional</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>virtio-non-transitional</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='backendModel'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>random</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>egd</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>builtin</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <filesystem supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='driverType'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>path</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>handle</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>virtiofs</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </filesystem>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <tpm supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='model'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>tpm-tis</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>tpm-crb</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='backendModel'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>emulator</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>external</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='backendVersion'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>2.0</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </tpm>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <redirdev supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='bus'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>usb</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </redirdev>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <channel supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='type'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>pty</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>unix</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </channel>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <crypto supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='model'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='type'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>qemu</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='backendModel'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>builtin</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </crypto>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <interface supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='backendType'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>default</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>passt</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <panic supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='model'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>isa</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>hyperv</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </panic>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <console supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='type'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>null</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>vc</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>pty</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>dev</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>file</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>pipe</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>stdio</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>udp</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>tcp</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>unix</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>qemu-vdagent</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>dbus</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </console>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <gic supported='no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <vmcoreinfo supported='yes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <genid supported='yes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <backingStoreInput supported='yes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <backup supported='yes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <async-teardown supported='yes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <s390-pv supported='no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <ps2 supported='yes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <tdx supported='no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <sev supported='no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <sgx supported='no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <hyperv supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='features'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>relaxed</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>vapic</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>spinlocks</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>vpindex</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>runtime</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>synic</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>stimer</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>reset</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>vendor_id</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>frequencies</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>reenlightenment</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>tlbflush</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>ipi</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>avic</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>emsr_bitmap</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>xmm_input</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <defaults>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <spinlocks>4095</spinlocks>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <stimer_direct>on</stimer_direct>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <tlbflush_direct>on</tlbflush_direct>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <tlbflush_extended>on</tlbflush_extended>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </defaults>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </hyperv>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <launchSecurity supported='no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:18:07 np0005588920 nova_compute[226886]: </domainCapabilities>
Jan 20 09:18:07 np0005588920 nova_compute[226886]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.757 226890 DEBUG nova.virt.libvirt.host [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 20 09:18:07 np0005588920 nova_compute[226886]: <domainCapabilities>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  <path>/usr/libexec/qemu-kvm</path>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  <domain>kvm</domain>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  <arch>x86_64</arch>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  <vcpu max='4096'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  <iothreads supported='yes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  <os supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <enum name='firmware'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <value>efi</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <loader supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='type'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>rom</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>pflash</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='readonly'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>yes</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>no</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='secure'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>yes</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>no</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </loader>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  <cpu>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <mode name='host-passthrough' supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='hostPassthroughMigratable'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>on</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>off</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </mode>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <mode name='maximum' supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='maximumMigratable'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>on</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>off</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </mode>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <mode name='host-model' supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <vendor>AMD</vendor>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='x2apic'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='tsc-deadline'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='hypervisor'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='tsc_adjust'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='spec-ctrl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='stibp'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='ssbd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='cmp_legacy'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='overflow-recov'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='succor'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='ibrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='amd-ssbd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='virt-ssbd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='lbrv'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='tsc-scale'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='vmcb-clean'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='flushbyasid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='pause-filter'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='pfthreshold'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='svme-addr-chk'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <feature policy='disable' name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </mode>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <mode name='custom' supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Broadwell'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Broadwell-IBRS'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Broadwell-noTSX'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Broadwell-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Broadwell-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Broadwell-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Broadwell-v4'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Cascadelake-Server'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Cascadelake-Server-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Cascadelake-Server-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Cascadelake-Server-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Cascadelake-Server-v4'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Cascadelake-Server-v5'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='ClearwaterForest'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni-int16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bhi-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cldemote'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cmpccxadd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ddpd-u'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fbsdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='intel-psfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='lam'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mcdt-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pbrsb-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='prefetchiti'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='psdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sha512'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sm3'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sm4'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='ClearwaterForest-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni-int16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bhi-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cldemote'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cmpccxadd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ddpd-u'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fbsdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='intel-psfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='lam'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mcdt-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pbrsb-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='prefetchiti'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='psdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sha512'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sm3'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sm4'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Cooperlake'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Cooperlake-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Cooperlake-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Denverton'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mpx'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Denverton-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mpx'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Denverton-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Denverton-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Dhyana-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-Genoa'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amd-psfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='auto-ibrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='stibp-always-on'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-Genoa-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amd-psfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='auto-ibrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='stibp-always-on'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-Genoa-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amd-psfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='auto-ibrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fs-gs-base-ns'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='perfmon-v2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='stibp-always-on'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-Milan'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-Milan-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-Milan-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amd-psfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='stibp-always-on'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-Milan-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amd-psfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='stibp-always-on'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-Rome'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-Rome-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-Rome-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-Rome-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-Turin'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amd-psfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='auto-ibrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vp2intersect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fs-gs-base-ns'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibpb-brtype'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='perfmon-v2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='prefetchi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbpb'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='srso-user-kernel-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='stibp-always-on'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-Turin-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amd-psfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='auto-ibrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vp2intersect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fs-gs-base-ns'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibpb-brtype'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='perfmon-v2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='prefetchi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbpb'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='srso-user-kernel-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='stibp-always-on'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-v4'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='EPYC-v5'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='GraniteRapids'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-fp16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-tile'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-fp16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fbsdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrc'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fzrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mcdt-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pbrsb-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='prefetchiti'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='psdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='GraniteRapids-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-fp16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-tile'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-fp16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fbsdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrc'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fzrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mcdt-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pbrsb-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='prefetchiti'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='psdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='GraniteRapids-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-fp16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-tile'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx10'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx10-128'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx10-256'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx10-512'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-fp16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cldemote'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fbsdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrc'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fzrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mcdt-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pbrsb-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='prefetchiti'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='psdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='GraniteRapids-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-fp16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-tile'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx10'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx10-128'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx10-256'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx10-512'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-fp16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cldemote'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fbsdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrc'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fzrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mcdt-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pbrsb-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='prefetchiti'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='psdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Haswell'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Haswell-IBRS'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Haswell-noTSX'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Haswell-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Haswell-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Haswell-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Haswell-v4'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Icelake-Server'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Icelake-Server-noTSX'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Icelake-Server-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Icelake-Server-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Icelake-Server-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Icelake-Server-v4'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Icelake-Server-v5'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Icelake-Server-v6'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Icelake-Server-v7'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='IvyBridge'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='IvyBridge-IBRS'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='IvyBridge-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='IvyBridge-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='KnightsMill'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-4fmaps'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-4vnniw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512er'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512pf'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='KnightsMill-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-4fmaps'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-4vnniw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512er'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512pf'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Opteron_G4'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fma4'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xop'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Opteron_G4-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fma4'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xop'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Opteron_G5'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fma4'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='tbm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xop'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Opteron_G5-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fma4'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='tbm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xop'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='SapphireRapids'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-tile'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-fp16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrc'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fzrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='SapphireRapids-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-tile'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-fp16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrc'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fzrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='SapphireRapids-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-tile'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-fp16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fbsdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrc'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fzrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='psdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='SapphireRapids-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-tile'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-fp16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cldemote'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fbsdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrc'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fzrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='psdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='SapphireRapids-v4'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='amx-tile'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-bf16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-fp16'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bitalg'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cldemote'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fbsdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrc'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fzrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='la57'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='psdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='taa-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='SierraForest'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cmpccxadd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fbsdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mcdt-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pbrsb-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='psdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='SierraForest-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cmpccxadd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fbsdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mcdt-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pbrsb-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='psdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='SierraForest-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cldemote'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cmpccxadd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fbsdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='intel-psfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='lam'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mcdt-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pbrsb-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='psdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='SierraForest-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-ifma'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cldemote'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cmpccxadd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fbsdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='fsrs'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ibrs-all'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='intel-psfd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='lam'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mcdt-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pbrsb-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='psdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='serialize'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vaes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Client'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Client-IBRS'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Client-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Client-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Client-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Client-v4'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Server'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Server-IBRS'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Server-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Server-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='hle'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='rtm'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Server-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Server-v4'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Skylake-Server-v5'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512bw'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512cd'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512dq'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512f'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='avx512vl'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='invpcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pcid'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='pku'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Snowridge'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cldemote'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='core-capability'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mpx'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='split-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Snowridge-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cldemote'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='core-capability'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='mpx'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='split-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Snowridge-v2'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cldemote'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='core-capability'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='split-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Snowridge-v3'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cldemote'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='core-capability'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='split-lock-detect'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='Snowridge-v4'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='cldemote'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='erms'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='gfni'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdir64b'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='movdiri'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='xsaves'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='athlon'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='3dnow'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='3dnowext'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='athlon-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='3dnow'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='3dnowext'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='core2duo'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='core2duo-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='coreduo'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='coreduo-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='n270'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='n270-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='ss'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='phenom'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='3dnow'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='3dnowext'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <blockers model='phenom-v1'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='3dnow'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <feature name='3dnowext'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </blockers>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </mode>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  <memoryBacking supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <enum name='sourceType'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <value>file</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <value>anonymous</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <value>memfd</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  </memoryBacking>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <disk supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='diskDevice'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>disk</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>cdrom</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>floppy</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>lun</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='bus'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>fdc</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>scsi</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>virtio</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>usb</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>sata</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='model'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>virtio</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>virtio-transitional</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>virtio-non-transitional</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <graphics supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='type'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>vnc</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>egl-headless</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>dbus</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </graphics>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <video supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='modelType'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>vga</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>cirrus</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>virtio</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>none</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>bochs</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>ramfb</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <hostdev supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='mode'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>subsystem</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='startupPolicy'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>default</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>mandatory</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>requisite</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>optional</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='subsysType'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>usb</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>pci</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>scsi</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='capsType'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='pciBackend'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </hostdev>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <rng supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='model'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>virtio</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>virtio-transitional</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>virtio-non-transitional</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='backendModel'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>random</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>egd</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>builtin</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <filesystem supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='driverType'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>path</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>handle</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>virtiofs</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </filesystem>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <tpm supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='model'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>tpm-tis</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>tpm-crb</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='backendModel'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>emulator</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>external</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='backendVersion'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>2.0</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </tpm>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <redirdev supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='bus'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>usb</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </redirdev>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <channel supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='type'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>pty</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>unix</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </channel>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <crypto supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='model'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='type'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>qemu</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='backendModel'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>builtin</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </crypto>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <interface supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='backendType'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>default</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>passt</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <panic supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='model'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>isa</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>hyperv</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </panic>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <console supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='type'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>null</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>vc</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>pty</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>dev</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>file</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>pipe</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>stdio</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>udp</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>tcp</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>unix</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>qemu-vdagent</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>dbus</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </console>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <gic supported='no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <vmcoreinfo supported='yes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <genid supported='yes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <backingStoreInput supported='yes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <backup supported='yes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <async-teardown supported='yes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <s390-pv supported='no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <ps2 supported='yes'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <tdx supported='no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <sev supported='no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <sgx supported='no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <hyperv supported='yes'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <enum name='features'>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>relaxed</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>vapic</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>spinlocks</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>vpindex</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>runtime</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>synic</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>stimer</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>reset</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>vendor_id</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>frequencies</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>reenlightenment</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>tlbflush</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>ipi</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>avic</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>emsr_bitmap</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <value>xmm_input</value>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </enum>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      <defaults>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <spinlocks>4095</spinlocks>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <stimer_direct>on</stimer_direct>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <tlbflush_direct>on</tlbflush_direct>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <tlbflush_extended>on</tlbflush_extended>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:      </defaults>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    </hyperv>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:    <launchSecurity supported='no'/>
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:18:07 np0005588920 nova_compute[226886]: </domainCapabilities>
Jan 20 09:18:07 np0005588920 nova_compute[226886]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.832 226890 DEBUG nova.virt.libvirt.host [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.833 226890 DEBUG nova.virt.libvirt.host [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.833 226890 DEBUG nova.virt.libvirt.host [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.837 226890 INFO nova.virt.libvirt.host [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Secure Boot support detected#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.840 226890 INFO nova.virt.libvirt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.840 226890 INFO nova.virt.libvirt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.854 226890 DEBUG nova.virt.libvirt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] cpu compare xml: <cpu match="exact">
Jan 20 09:18:07 np0005588920 nova_compute[226886]:  <model>Nehalem</model>
Jan 20 09:18:07 np0005588920 nova_compute[226886]: </cpu>
Jan 20 09:18:07 np0005588920 nova_compute[226886]: _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.857 226890 DEBUG nova.virt.libvirt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.899 226890 INFO nova.virt.node [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Determined node identity ff38e91c-3320-4831-90ac-bcffc89ba7b6 from /var/lib/nova/compute_id#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.925 226890 WARNING nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Compute nodes ['ff38e91c-3320-4831-90ac-bcffc89ba7b6'] for host compute-2.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.970 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Jan 20 09:18:07 np0005588920 systemd[1]: session-49.scope: Deactivated successfully.
Jan 20 09:18:07 np0005588920 systemd[1]: session-49.scope: Consumed 2min 10.656s CPU time.
Jan 20 09:18:07 np0005588920 systemd-logind[783]: Session 49 logged out. Waiting for processes to exit.
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.994 226890 WARNING nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] No compute node record found for host compute-2.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.994 226890 DEBUG oslo_concurrency.lockutils [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.994 226890 DEBUG oslo_concurrency.lockutils [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.994 226890 DEBUG oslo_concurrency.lockutils [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.995 226890 DEBUG nova.compute.resource_tracker [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:18:07 np0005588920 nova_compute[226886]: 2026-01-20 14:18:07.995 226890 DEBUG oslo_concurrency.processutils [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:18:07 np0005588920 systemd-logind[783]: Removed session 49.
Jan 20 09:18:08 np0005588920 nova_compute[226886]: 2026-01-20 14:18:08.425 226890 DEBUG oslo_concurrency.processutils [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.430s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:18:08 np0005588920 nova_compute[226886]: 2026-01-20 14:18:08.633 226890 WARNING nova.virt.libvirt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:18:08 np0005588920 nova_compute[226886]: 2026-01-20 14:18:08.635 226890 DEBUG nova.compute.resource_tracker [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5247MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:18:08 np0005588920 nova_compute[226886]: 2026-01-20 14:18:08.636 226890 DEBUG oslo_concurrency.lockutils [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:18:08 np0005588920 nova_compute[226886]: 2026-01-20 14:18:08.636 226890 DEBUG oslo_concurrency.lockutils [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:18:08 np0005588920 nova_compute[226886]: 2026-01-20 14:18:08.655 226890 WARNING nova.compute.resource_tracker [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] No compute node record for compute-2.ctlplane.example.com:ff38e91c-3320-4831-90ac-bcffc89ba7b6: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host ff38e91c-3320-4831-90ac-bcffc89ba7b6 could not be found.#033[00m
Jan 20 09:18:08 np0005588920 nova_compute[226886]: 2026-01-20 14:18:08.677 226890 INFO nova.compute.resource_tracker [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Compute node record created for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com with uuid: ff38e91c-3320-4831-90ac-bcffc89ba7b6#033[00m
Jan 20 09:18:08 np0005588920 nova_compute[226886]: 2026-01-20 14:18:08.766 226890 DEBUG nova.compute.resource_tracker [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:18:08 np0005588920 nova_compute[226886]: 2026-01-20 14:18:08.766 226890 DEBUG nova.compute.resource_tracker [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:18:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:09.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:18:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:09.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:10 np0005588920 nova_compute[226886]: 2026-01-20 14:18:10.132 226890 INFO nova.scheduler.client.report [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [req-1a22181a-52a7-4f30-b406-a714a6fd60aa] Created resource provider record via placement API for resource provider with UUID ff38e91c-3320-4831-90ac-bcffc89ba7b6 and name compute-2.ctlplane.example.com.#033[00m
Jan 20 09:18:10 np0005588920 nova_compute[226886]: 2026-01-20 14:18:10.154 226890 DEBUG oslo_concurrency.processutils [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:18:10 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:18:10 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1303255634' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:18:10 np0005588920 nova_compute[226886]: 2026-01-20 14:18:10.609 226890 DEBUG oslo_concurrency.processutils [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:18:10 np0005588920 nova_compute[226886]: 2026-01-20 14:18:10.617 226890 DEBUG nova.virt.libvirt.host [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Jan 20 09:18:10 np0005588920 nova_compute[226886]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m
Jan 20 09:18:10 np0005588920 nova_compute[226886]: 2026-01-20 14:18:10.617 226890 INFO nova.virt.libvirt.host [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] kernel doesn't support AMD SEV#033[00m
Jan 20 09:18:10 np0005588920 nova_compute[226886]: 2026-01-20 14:18:10.619 226890 DEBUG nova.compute.provider_tree [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Updating inventory in ProviderTree for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 20, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 20 09:18:10 np0005588920 nova_compute[226886]: 2026-01-20 14:18:10.620 226890 DEBUG nova.virt.libvirt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:18:10 np0005588920 nova_compute[226886]: 2026-01-20 14:18:10.625 226890 DEBUG nova.virt.libvirt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Libvirt baseline CPU <cpu>
Jan 20 09:18:10 np0005588920 nova_compute[226886]:  <arch>x86_64</arch>
Jan 20 09:18:10 np0005588920 nova_compute[226886]:  <model>Nehalem</model>
Jan 20 09:18:10 np0005588920 nova_compute[226886]:  <vendor>AMD</vendor>
Jan 20 09:18:10 np0005588920 nova_compute[226886]:  <topology sockets="8" cores="1" threads="1"/>
Jan 20 09:18:10 np0005588920 nova_compute[226886]: </cpu>
Jan 20 09:18:10 np0005588920 nova_compute[226886]: _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537#033[00m
Jan 20 09:18:10 np0005588920 nova_compute[226886]: 2026-01-20 14:18:10.731 226890 DEBUG nova.scheduler.client.report [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Updated inventory for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 20, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Jan 20 09:18:10 np0005588920 nova_compute[226886]: 2026-01-20 14:18:10.732 226890 DEBUG nova.compute.provider_tree [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Updating resource provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Jan 20 09:18:10 np0005588920 nova_compute[226886]: 2026-01-20 14:18:10.733 226890 DEBUG nova.compute.provider_tree [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Updating inventory in ProviderTree for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 20 09:18:10 np0005588920 nova_compute[226886]: 2026-01-20 14:18:10.838 226890 DEBUG nova.compute.provider_tree [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Updating resource provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Jan 20 09:18:10 np0005588920 nova_compute[226886]: 2026-01-20 14:18:10.880 226890 DEBUG nova.compute.resource_tracker [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:18:10 np0005588920 nova_compute[226886]: 2026-01-20 14:18:10.880 226890 DEBUG oslo_concurrency.lockutils [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.244s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:18:10 np0005588920 nova_compute[226886]: 2026-01-20 14:18:10.881 226890 DEBUG nova.service [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m
Jan 20 09:18:10 np0005588920 nova_compute[226886]: 2026-01-20 14:18:10.992 226890 DEBUG nova.service [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m
Jan 20 09:18:10 np0005588920 nova_compute[226886]: 2026-01-20 14:18:10.993 226890 DEBUG nova.servicegroup.drivers.db [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] DB_Driver: join new ServiceGroup member compute-2.ctlplane.example.com to the compute group, service = <Service: host=compute-2.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m
Jan 20 09:18:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:11.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:18:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:11.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:18:13 np0005588920 podman[227229]: 2026-01-20 14:18:13.060774322 +0000 UTC m=+0.142305096 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller)
Jan 20 09:18:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:18:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:13.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:18:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:18:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:13.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:18:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:18:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:15.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:15.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:18:16.420 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:18:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:18:16.421 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:18:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:18:16.421 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:18:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:17.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:17.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:19.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:19 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:18:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:18:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:19.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:18:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:21.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:21.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:23.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:23 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 09:18:23 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1200.1 total, 600.0 interval#012Cumulative writes: 5422 writes, 22K keys, 5422 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s#012Cumulative WAL: 5422 writes, 931 syncs, 5.82 writes per sync, written: 0.02 GB, 0.02 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 458 writes, 686 keys, 458 commit groups, 1.0 writes per commit group, ingest: 0.23 MB, 0.00 MB/s#012Interval WAL: 458 writes, 225 syncs, 2.04 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x562f882274b0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 2.3e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x562f882274b0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 2.3e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 
0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_s
Jan 20 09:18:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:23.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:24 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:18:25 np0005588920 podman[227308]: 2026-01-20 14:18:25.074793833 +0000 UTC m=+0.053786668 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 20 09:18:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:25.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:25.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:26 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:18:26 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:18:26 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:18:26 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 20 09:18:26 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:18:27 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 20 09:18:27 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:18:27 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:18:27 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:18:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:27.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:27.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:28 np0005588920 nova_compute[226886]: 2026-01-20 14:18:28.995 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 09:18:29 np0005588920 nova_compute[226886]: 2026-01-20 14:18:29.230 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 09:18:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:29.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:18:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:18:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:29.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:18:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:31.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:18:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:31.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:18:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:18:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:33.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:18:33 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:18:33 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:18:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:33.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:34 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:18:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:35.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:35.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:37.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:18:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:37.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:18:39 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:18:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:39.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:39.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:41.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:18:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:41.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:18:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:43.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:18:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:43.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:18:44 np0005588920 podman[227581]: 2026-01-20 14:18:44.057142044 +0000 UTC m=+0.139744393 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator 
team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:18:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:18:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:18:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:45.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:18:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:45.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:47.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:18:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:47.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:18:49 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:18:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:49.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:18:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:49.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:18:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:51.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:18:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:51.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:18:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:18:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:53.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:18:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:53.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:54 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:18:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:55.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:18:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:55.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:18:56 np0005588920 podman[227608]: 2026-01-20 14:18:56.025718227 +0000 UTC m=+0.110250050 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:18:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:18:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:57.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:18:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:18:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:57.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:18:59 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:18:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:18:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:59.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:18:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:18:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:18:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:59.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:19:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:19:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:01.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:19:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:01.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:19:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:03.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:19:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:19:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:03.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:19:04 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:19:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:19:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:05.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:19:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:05.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:06 np0005588920 nova_compute[226886]: 2026-01-20 14:19:06.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:19:06 np0005588920 nova_compute[226886]: 2026-01-20 14:19:06.727 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:19:06 np0005588920 nova_compute[226886]: 2026-01-20 14:19:06.728 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:19:06 np0005588920 nova_compute[226886]: 2026-01-20 14:19:06.728 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 09:19:07 np0005588920 nova_compute[226886]: 2026-01-20 14:19:07.047 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 09:19:07 np0005588920 nova_compute[226886]: 2026-01-20 14:19:07.047 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:19:07 np0005588920 nova_compute[226886]: 2026-01-20 14:19:07.048 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:19:07 np0005588920 nova_compute[226886]: 2026-01-20 14:19:07.048 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:19:07 np0005588920 nova_compute[226886]: 2026-01-20 14:19:07.049 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:19:07 np0005588920 nova_compute[226886]: 2026-01-20 14:19:07.049 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:19:07 np0005588920 nova_compute[226886]: 2026-01-20 14:19:07.050 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:19:07 np0005588920 nova_compute[226886]: 2026-01-20 14:19:07.050 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:19:07 np0005588920 nova_compute[226886]: 2026-01-20 14:19:07.050 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:19:07 np0005588920 nova_compute[226886]: 2026-01-20 14:19:07.228 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:19:07 np0005588920 nova_compute[226886]: 2026-01-20 14:19:07.229 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:19:07 np0005588920 nova_compute[226886]: 2026-01-20 14:19:07.229 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:19:07 np0005588920 nova_compute[226886]: 2026-01-20 14:19:07.229 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:19:07 np0005588920 nova_compute[226886]: 2026-01-20 14:19:07.230 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:19:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:19:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:07.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:19:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:07.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:19:07 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/564146264' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:19:07 np0005588920 nova_compute[226886]: 2026-01-20 14:19:07.709 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.479s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:19:07 np0005588920 nova_compute[226886]: 2026-01-20 14:19:07.958 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:19:07 np0005588920 nova_compute[226886]: 2026-01-20 14:19:07.960 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5283MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:19:07 np0005588920 nova_compute[226886]: 2026-01-20 14:19:07.960 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:19:07 np0005588920 nova_compute[226886]: 2026-01-20 14:19:07.961 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:19:09 np0005588920 nova_compute[226886]: 2026-01-20 14:19:09.290 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:19:09 np0005588920 nova_compute[226886]: 2026-01-20 14:19:09.290 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:19:09 np0005588920 nova_compute[226886]: 2026-01-20 14:19:09.422 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:19:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:19:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:19:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:19:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:09.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:19:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:19:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:09.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:19:09 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2626967478' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:19:09 np0005588920 nova_compute[226886]: 2026-01-20 14:19:09.889 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:19:09 np0005588920 nova_compute[226886]: 2026-01-20 14:19:09.899 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:19:10 np0005588920 nova_compute[226886]: 2026-01-20 14:19:10.137 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:19:10 np0005588920 nova_compute[226886]: 2026-01-20 14:19:10.140 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:19:10 np0005588920 nova_compute[226886]: 2026-01-20 14:19:10.141 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.180s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:19:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:19:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:11.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:19:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:11.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:19:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:13.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:19:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:13.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:14 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #37. Immutable memtables: 0.
Jan 20 09:19:14 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:19:14.104267) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 09:19:14 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 19] Flushing memtable with next log file: 37
Jan 20 09:19:14 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918754104666, "job": 19, "event": "flush_started", "num_memtables": 1, "num_entries": 2304, "num_deletes": 251, "total_data_size": 5757540, "memory_usage": 5836488, "flush_reason": "Manual Compaction"}
Jan 20 09:19:14 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 19] Level-0 flush table #38: started
Jan 20 09:19:14 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918754131530, "cf_name": "default", "job": 19, "event": "table_file_creation", "file_number": 38, "file_size": 3767487, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 17832, "largest_seqno": 20131, "table_properties": {"data_size": 3758175, "index_size": 5870, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2373, "raw_key_size": 18600, "raw_average_key_size": 20, "raw_value_size": 3739661, "raw_average_value_size": 4029, "num_data_blocks": 262, "num_entries": 928, "num_filter_entries": 928, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768918525, "oldest_key_time": 1768918525, "file_creation_time": 1768918754, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 38, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:19:14 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 19] Flush lasted 27330 microseconds, and 15219 cpu microseconds.
Jan 20 09:19:14 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:19:14 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:19:14.131604) [db/flush_job.cc:967] [default] [JOB 19] Level-0 flush table #38: 3767487 bytes OK
Jan 20 09:19:14 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:19:14.131632) [db/memtable_list.cc:519] [default] Level-0 commit table #38 started
Jan 20 09:19:14 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:19:14.138882) [db/memtable_list.cc:722] [default] Level-0 commit table #38: memtable #1 done
Jan 20 09:19:14 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:19:14.138908) EVENT_LOG_v1 {"time_micros": 1768918754138901, "job": 19, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 09:19:14 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:19:14.138934) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 09:19:14 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 19] Try to delete WAL files size 5747533, prev total WAL file size 5747533, number of live WAL files 2.
Jan 20 09:19:14 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000034.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:19:14 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:19:14.141807) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031323535' seq:72057594037927935, type:22 .. '7061786F730031353037' seq:0, type:0; will stop at (end)
Jan 20 09:19:14 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 20] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 09:19:14 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 19 Base level 0, inputs: [38(3679KB)], [36(7520KB)]
Jan 20 09:19:14 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918754141949, "job": 20, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [38], "files_L6": [36], "score": -1, "input_data_size": 11468835, "oldest_snapshot_seqno": -1}
Jan 20 09:19:14 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 20] Generated table #39: 4503 keys, 9395579 bytes, temperature: kUnknown
Jan 20 09:19:14 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918754273162, "cf_name": "default", "job": 20, "event": "table_file_creation", "file_number": 39, "file_size": 9395579, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9363427, "index_size": 19818, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 11269, "raw_key_size": 112759, "raw_average_key_size": 25, "raw_value_size": 9279732, "raw_average_value_size": 2060, "num_data_blocks": 822, "num_entries": 4503, "num_filter_entries": 4503, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768918754, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 39, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:19:14 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:19:14 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:19:14.273800) [db/compaction/compaction_job.cc:1663] [default] [JOB 20] Compacted 1@0 + 1@6 files to L6 => 9395579 bytes
Jan 20 09:19:14 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:19:14.275626) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 87.1 rd, 71.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.6, 7.3 +0.0 blob) out(9.0 +0.0 blob), read-write-amplify(5.5) write-amplify(2.5) OK, records in: 5022, records dropped: 519 output_compression: NoCompression
Jan 20 09:19:14 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:19:14.275658) EVENT_LOG_v1 {"time_micros": 1768918754275643, "job": 20, "event": "compaction_finished", "compaction_time_micros": 131629, "compaction_time_cpu_micros": 39692, "output_level": 6, "num_output_files": 1, "total_output_size": 9395579, "num_input_records": 5022, "num_output_records": 4503, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 09:19:14 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000038.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:19:14 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918754277127, "job": 20, "event": "table_file_deletion", "file_number": 38}
Jan 20 09:19:14 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000036.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:19:14 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918754279841, "job": 20, "event": "table_file_deletion", "file_number": 36}
Jan 20 09:19:14 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:19:14.141519) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:19:14 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:19:14.280057) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:19:14 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:19:14.280077) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:19:14 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:19:14.280080) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:19:14 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:19:14.280083) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:19:14 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:19:14.280085) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:19:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:19:15 np0005588920 podman[227671]: 2026-01-20 14:19:15.073253073 +0000 UTC m=+0.105291694 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 20 09:19:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:19:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:15.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:19:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:19:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:15.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:19:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:19:16.420 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:19:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:19:16.421 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:19:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:19:16.421 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:19:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:19:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:17.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:19:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:17.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:19 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:19:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:19:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:19.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:19:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:19:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:19.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:19:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:19:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:19:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:21.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:19:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:19:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:19:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:21.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:19:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:19:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:19:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:23.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:19:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:19:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:19:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:23.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:19:24 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:19:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:19:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:25.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:19:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:25.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:27 np0005588920 podman[227697]: 2026-01-20 14:19:27.000948633 +0000 UTC m=+0.081426428 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 20 09:19:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:19:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:27.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:19:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:19:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:27.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:19:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:19:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:19:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:29.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:19:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:19:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:29.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:19:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:19:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:19:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:31.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:19:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:19:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:31.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:19:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:33.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:19:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:33.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:34 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 20 09:19:34 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:19:34 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:19:34 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:19:34 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:19:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:19:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 09:19:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:36.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:19:36 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:36.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:19:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:19:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:38.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 09:19:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:38 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:38.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:39 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:19:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:19:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 09:19:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:19:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:40.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:19:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:40 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:40.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:41 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:19:41 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:19:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:19:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 09:19:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:19:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:42.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:19:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:19:42 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:42.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:19:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:19:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:19:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 09:19:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:44.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:44 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:44.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:46 np0005588920 podman[227899]: 2026-01-20 14:19:46.016837702 +0000 UTC m=+0.103230024 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 20 09:19:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:19:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 09:19:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:46.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:46 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:46.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:19:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 09:19:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:48 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:48.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:48.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:49 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:19:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 09:19:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:19:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:50 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:50.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:19:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:50.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:19:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:19:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:52.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:19:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:52.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:54 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:19:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:19:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:54.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:19:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:54.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:19:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:56.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:19:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:56.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:57 np0005588920 podman[227925]: 2026-01-20 14:19:57.987613023 +0000 UTC m=+0.069813905 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:19:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:19:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 09:19:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:58.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:19:58 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:58.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:19:59 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:20:00 np0005588920 ceph-mon[77148]: overall HEALTH_OK
Jan 20 09:20:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:20:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:20:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:20:00.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:20:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:20:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:20:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:20:00.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:20:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:20:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:20:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:20:02.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:20:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:20:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:20:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:20:02.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:20:04 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:20:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:20:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:20:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:20:04.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:20:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:20:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:20:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:20:04.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:20:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:20:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:20:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:20:06.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:20:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:20:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:20:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:20:06.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:20:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:20:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:20:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:20:08.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:20:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:20:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:20:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:20:08.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:20:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:20:10 np0005588920 nova_compute[226886]: 2026-01-20 14:20:10.135 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:20:10 np0005588920 nova_compute[226886]: 2026-01-20 14:20:10.135 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:20:10 np0005588920 nova_compute[226886]: 2026-01-20 14:20:10.164 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:20:10 np0005588920 nova_compute[226886]: 2026-01-20 14:20:10.164 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:20:10 np0005588920 nova_compute[226886]: 2026-01-20 14:20:10.165 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 09:20:10 np0005588920 nova_compute[226886]: 2026-01-20 14:20:10.181 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 09:20:10 np0005588920 nova_compute[226886]: 2026-01-20 14:20:10.182 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:20:10 np0005588920 nova_compute[226886]: 2026-01-20 14:20:10.183 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:20:10 np0005588920 nova_compute[226886]: 2026-01-20 14:20:10.183 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:20:10 np0005588920 nova_compute[226886]: 2026-01-20 14:20:10.184 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:20:10 np0005588920 nova_compute[226886]: 2026-01-20 14:20:10.185 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:20:10 np0005588920 nova_compute[226886]: 2026-01-20 14:20:10.185 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:20:10 np0005588920 nova_compute[226886]: 2026-01-20 14:20:10.186 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:20:10 np0005588920 nova_compute[226886]: 2026-01-20 14:20:10.186 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:20:10 np0005588920 nova_compute[226886]: 2026-01-20 14:20:10.223 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:20:10 np0005588920 nova_compute[226886]: 2026-01-20 14:20:10.223 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:20:10 np0005588920 nova_compute[226886]: 2026-01-20 14:20:10.224 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:20:10 np0005588920 nova_compute[226886]: 2026-01-20 14:20:10.224 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:20:10 np0005588920 nova_compute[226886]: 2026-01-20 14:20:10.225 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:20:10 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:20:10 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4055056685' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:20:10 np0005588920 nova_compute[226886]: 2026-01-20 14:20:10.680 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:20:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:20:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:20:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:20:10.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:20:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:20:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:20:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:20:10.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:20:10 np0005588920 nova_compute[226886]: 2026-01-20 14:20:10.893 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:20:10 np0005588920 nova_compute[226886]: 2026-01-20 14:20:10.894 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5316MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:20:10 np0005588920 nova_compute[226886]: 2026-01-20 14:20:10.894 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:20:10 np0005588920 nova_compute[226886]: 2026-01-20 14:20:10.895 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:20:10 np0005588920 nova_compute[226886]: 2026-01-20 14:20:10.964 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:20:10 np0005588920 nova_compute[226886]: 2026-01-20 14:20:10.964 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:20:10 np0005588920 nova_compute[226886]: 2026-01-20 14:20:10.980 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:20:11 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:20:11 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/90274225' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:20:11 np0005588920 nova_compute[226886]: 2026-01-20 14:20:11.471 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.491s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:20:11 np0005588920 nova_compute[226886]: 2026-01-20 14:20:11.476 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:20:11 np0005588920 nova_compute[226886]: 2026-01-20 14:20:11.738 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:20:11 np0005588920 nova_compute[226886]: 2026-01-20 14:20:11.740 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:20:11 np0005588920 nova_compute[226886]: 2026-01-20 14:20:11.740 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.845s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:20:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:20:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:20:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:20:12.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:20:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:20:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:20:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:20:12.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:20:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:20:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:20:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:20:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:20:14.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:20:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:20:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:20:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:20:14.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:20:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:20:16.422 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:20:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:20:16.422 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:20:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:20:16.422 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:20:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:20:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:20:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:20:16.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:20:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:20:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:20:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:20:16.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:20:17 np0005588920 podman[227989]: 2026-01-20 14:20:17.013873987 +0000 UTC m=+0.098446897 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 20 09:20:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:20:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:20:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:20:18.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:20:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:20:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:20:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:20:18.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:20:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:20:19.156 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:20:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:20:19.159 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:20:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:20:19.160 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:20:19 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:20:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:20:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:20:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:20:20.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:20:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:20:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:20:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:20:20.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:20:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:20:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:20:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:20:22.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:20:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:20:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:20:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:20:22.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:20:24 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:20:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:20:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:20:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:20:24.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:20:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:20:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:20:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:20:24.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:20:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:20:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:20:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:20:26.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:20:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:20:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:20:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:20:26.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:20:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:20:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:20:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:20:28.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:20:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:20:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:20:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:20:28.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:20:29 np0005588920 podman[228017]: 2026-01-20 14:20:29.013608668 +0000 UTC m=+0.087509483 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 20 09:20:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:20:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:20:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:20:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:20:30.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:20:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:20:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:20:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:20:30.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:20:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:20:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:20:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:20:32.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:20:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:20:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:20:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:20:32.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:26:35 np0005588920 nova_compute[226886]: 2026-01-20 14:26:35.335 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:26:35 np0005588920 rsyslogd[1004]: imjournal: 4970 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Jan 20 09:26:36 np0005588920 nova_compute[226886]: 2026-01-20 14:26:36.023 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:26:36 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:26:36.022 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:26:36 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:26:36.025 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:26:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:26:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:26:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:26:36.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:26:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:26:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:26:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:26:36.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:26:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:26:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:26:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:26:38.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:26:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:26:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:26:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:26:38.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:26:38 np0005588920 nova_compute[226886]: 2026-01-20 14:26:38.210 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:26:38 np0005588920 nova_compute[226886]: 2026-01-20 14:26:38.807 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:26:39 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:26:40 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:26:40.029 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:26:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:26:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:26:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:26:40.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:26:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:26:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:26:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:26:40.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:26:40 np0005588920 nova_compute[226886]: 2026-01-20 14:26:40.338 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:26:42 np0005588920 podman[233323]: 2026-01-20 14:26:42.020964313 +0000 UTC m=+0.102428205 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Jan 20 09:26:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:26:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:26:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:26:42.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:26:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:26:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:26:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:26:42.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:26:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e150 e150: 3 total, 3 up, 3 in
Jan 20 09:26:43 np0005588920 nova_compute[226886]: 2026-01-20 14:26:43.252 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:26:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:26:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:26:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:26:44.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:26:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:26:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:26:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:26:44.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:26:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:26:44 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:26:44 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:26:45 np0005588920 nova_compute[226886]: 2026-01-20 14:26:45.340 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:26:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:26:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:26:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:26:46.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:26:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:26:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:26:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:26:46.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:26:47 np0005588920 nova_compute[226886]: 2026-01-20 14:26:47.246 226890 DEBUG oslo_concurrency.lockutils [None req-bd27b6f9-cad2-43f5-9d8d-aefd8bedfd4c 57d58248fa3b44579c14396dca4a2199 ef783a3b5dd3446faf947d627c64c5da - - default default] Acquiring lock "5ee5ed18-cd6a-421b-874a-94127d9d2e22" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:26:47 np0005588920 nova_compute[226886]: 2026-01-20 14:26:47.247 226890 DEBUG oslo_concurrency.lockutils [None req-bd27b6f9-cad2-43f5-9d8d-aefd8bedfd4c 57d58248fa3b44579c14396dca4a2199 ef783a3b5dd3446faf947d627c64c5da - - default default] Lock "5ee5ed18-cd6a-421b-874a-94127d9d2e22" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:26:47 np0005588920 nova_compute[226886]: 2026-01-20 14:26:47.248 226890 DEBUG oslo_concurrency.lockutils [None req-bd27b6f9-cad2-43f5-9d8d-aefd8bedfd4c 57d58248fa3b44579c14396dca4a2199 ef783a3b5dd3446faf947d627c64c5da - - default default] Acquiring lock "5ee5ed18-cd6a-421b-874a-94127d9d2e22-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:26:47 np0005588920 nova_compute[226886]: 2026-01-20 14:26:47.248 226890 DEBUG oslo_concurrency.lockutils [None req-bd27b6f9-cad2-43f5-9d8d-aefd8bedfd4c 57d58248fa3b44579c14396dca4a2199 ef783a3b5dd3446faf947d627c64c5da - - default default] Lock "5ee5ed18-cd6a-421b-874a-94127d9d2e22-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:26:47 np0005588920 nova_compute[226886]: 2026-01-20 14:26:47.249 226890 DEBUG oslo_concurrency.lockutils [None req-bd27b6f9-cad2-43f5-9d8d-aefd8bedfd4c 57d58248fa3b44579c14396dca4a2199 ef783a3b5dd3446faf947d627c64c5da - - default default] Lock "5ee5ed18-cd6a-421b-874a-94127d9d2e22-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:26:47 np0005588920 nova_compute[226886]: 2026-01-20 14:26:47.251 226890 INFO nova.compute.manager [None req-bd27b6f9-cad2-43f5-9d8d-aefd8bedfd4c 57d58248fa3b44579c14396dca4a2199 ef783a3b5dd3446faf947d627c64c5da - - default default] [instance: 5ee5ed18-cd6a-421b-874a-94127d9d2e22] Terminating instance#033[00m
Jan 20 09:26:47 np0005588920 nova_compute[226886]: 2026-01-20 14:26:47.254 226890 DEBUG nova.compute.manager [None req-bd27b6f9-cad2-43f5-9d8d-aefd8bedfd4c 57d58248fa3b44579c14396dca4a2199 ef783a3b5dd3446faf947d627c64c5da - - default default] [instance: 5ee5ed18-cd6a-421b-874a-94127d9d2e22] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:26:47 np0005588920 kernel: tap9a01b377-c1 (unregistering): left promiscuous mode
Jan 20 09:26:47 np0005588920 NetworkManager[49076]: <info>  [1768919207.3214] device (tap9a01b377-c1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:26:47 np0005588920 ovn_controller[133971]: 2026-01-20T14:26:47Z|00079|binding|INFO|Releasing lport 9a01b377-c198-496f-986c-48e774f33c12 from this chassis (sb_readonly=0)
Jan 20 09:26:47 np0005588920 ovn_controller[133971]: 2026-01-20T14:26:47Z|00080|binding|INFO|Setting lport 9a01b377-c198-496f-986c-48e774f33c12 down in Southbound
Jan 20 09:26:47 np0005588920 ovn_controller[133971]: 2026-01-20T14:26:47Z|00081|binding|INFO|Removing iface tap9a01b377-c1 ovn-installed in OVS
Jan 20 09:26:47 np0005588920 nova_compute[226886]: 2026-01-20 14:26:47.334 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:26:47 np0005588920 nova_compute[226886]: 2026-01-20 14:26:47.354 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:26:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:26:47.364 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dc:5e:41 10.100.0.4'], port_security=['fa:16:3e:dc:5e:41 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '5ee5ed18-cd6a-421b-874a-94127d9d2e22', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4d67e270-6232-44c0-a859-2ab75934074d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ef783a3b5dd3446faf947d627c64c5da', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c69a4378-df75-46a8-a919-3917216182c4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.235'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a00b20ec-8436-4c1b-b8fb-9d59f661148c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=9a01b377-c198-496f-986c-48e774f33c12) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:26:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:26:47.365 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 9a01b377-c198-496f-986c-48e774f33c12 in datapath 4d67e270-6232-44c0-a859-2ab75934074d unbound from our chassis#033[00m
Jan 20 09:26:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:26:47.366 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4d67e270-6232-44c0-a859-2ab75934074d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:26:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:26:47.367 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5bd00c6e-68a7-4fb9-ac85-ad46626b352c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:26:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:26:47.368 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4d67e270-6232-44c0-a859-2ab75934074d namespace which is not needed anymore#033[00m
Jan 20 09:26:47 np0005588920 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d0000000f.scope: Deactivated successfully.
Jan 20 09:26:47 np0005588920 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d0000000f.scope: Consumed 16.938s CPU time.
Jan 20 09:26:47 np0005588920 systemd-machined[196121]: Machine qemu-6-instance-0000000f terminated.
Jan 20 09:26:47 np0005588920 nova_compute[226886]: 2026-01-20 14:26:47.489 226890 INFO nova.virt.libvirt.driver [-] [instance: 5ee5ed18-cd6a-421b-874a-94127d9d2e22] Instance destroyed successfully.#033[00m
Jan 20 09:26:47 np0005588920 nova_compute[226886]: 2026-01-20 14:26:47.490 226890 DEBUG nova.objects.instance [None req-bd27b6f9-cad2-43f5-9d8d-aefd8bedfd4c 57d58248fa3b44579c14396dca4a2199 ef783a3b5dd3446faf947d627c64c5da - - default default] Lazy-loading 'resources' on Instance uuid 5ee5ed18-cd6a-421b-874a-94127d9d2e22 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:26:47 np0005588920 neutron-haproxy-ovnmeta-4d67e270-6232-44c0-a859-2ab75934074d[233029]: [NOTICE]   (233059) : haproxy version is 2.8.14-c23fe91
Jan 20 09:26:47 np0005588920 neutron-haproxy-ovnmeta-4d67e270-6232-44c0-a859-2ab75934074d[233029]: [NOTICE]   (233059) : path to executable is /usr/sbin/haproxy
Jan 20 09:26:47 np0005588920 neutron-haproxy-ovnmeta-4d67e270-6232-44c0-a859-2ab75934074d[233029]: [WARNING]  (233059) : Exiting Master process...
Jan 20 09:26:47 np0005588920 neutron-haproxy-ovnmeta-4d67e270-6232-44c0-a859-2ab75934074d[233029]: [ALERT]    (233059) : Current worker (233064) exited with code 143 (Terminated)
Jan 20 09:26:47 np0005588920 neutron-haproxy-ovnmeta-4d67e270-6232-44c0-a859-2ab75934074d[233029]: [WARNING]  (233059) : All workers exited. Exiting... (0)
Jan 20 09:26:47 np0005588920 systemd[1]: libpod-4d29830badc53b0b440294d74f331f80303c7b0cf02a700ea9d811b63c8deac5.scope: Deactivated successfully.
Jan 20 09:26:47 np0005588920 podman[233417]: 2026-01-20 14:26:47.521707568 +0000 UTC m=+0.052111969 container died 4d29830badc53b0b440294d74f331f80303c7b0cf02a700ea9d811b63c8deac5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4d67e270-6232-44c0-a859-2ab75934074d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 20 09:26:47 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4d29830badc53b0b440294d74f331f80303c7b0cf02a700ea9d811b63c8deac5-userdata-shm.mount: Deactivated successfully.
Jan 20 09:26:47 np0005588920 systemd[1]: var-lib-containers-storage-overlay-039a9a412bfe3b6412b72abcb5aa20c70230b01f091b12e94c491fc153a52b36-merged.mount: Deactivated successfully.
Jan 20 09:26:47 np0005588920 nova_compute[226886]: 2026-01-20 14:26:47.557 226890 DEBUG nova.virt.libvirt.vif [None req-bd27b6f9-cad2-43f5-9d8d-aefd8bedfd4c 57d58248fa3b44579c14396dca4a2199 ef783a3b5dd3446faf947d627c64c5da - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:25:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-1137037024',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-1137037024',ec2_ids=<?>,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(17),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverswithspecificflavortestjson-server-1137037024',id=15,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=17,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPnZ/Cx8vMumXGvEI9547JEyMeRkGznqLk5Xz3oR+TXmoMMxw6ZcUZJSSPx9PRS1PfeH2my6tZBX8mJSWH6Q1mhQIN/hiJECzeN4ewqe8NWMqXUqY2ux8nHjNnGnzhLaeQ==',key_name='tempest-keypair-660220900',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:26:09Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ef783a3b5dd3446faf947d627c64c5da',ramdisk_id='',reservation_id='r-0k04wrok',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-2064998848',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-2064998848-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:26:09Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='57d58248fa3b44579c14396dca4a2199',uuid=5ee5ed18-cd6a-421b-874a-94127d9d2e22,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9a01b377-c198-496f-986c-48e774f33c12", "address": "fa:16:3e:dc:5e:41", "network": {"id": "4d67e270-6232-44c0-a859-2ab75934074d", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1442825192-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef783a3b5dd3446faf947d627c64c5da", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a01b377-c1", "ovs_interfaceid": "9a01b377-c198-496f-986c-48e774f33c12", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:26:47 np0005588920 nova_compute[226886]: 2026-01-20 14:26:47.557 226890 DEBUG nova.network.os_vif_util [None req-bd27b6f9-cad2-43f5-9d8d-aefd8bedfd4c 57d58248fa3b44579c14396dca4a2199 ef783a3b5dd3446faf947d627c64c5da - - default default] Converting VIF {"id": "9a01b377-c198-496f-986c-48e774f33c12", "address": "fa:16:3e:dc:5e:41", "network": {"id": "4d67e270-6232-44c0-a859-2ab75934074d", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1442825192-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef783a3b5dd3446faf947d627c64c5da", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a01b377-c1", "ovs_interfaceid": "9a01b377-c198-496f-986c-48e774f33c12", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:26:47 np0005588920 podman[233417]: 2026-01-20 14:26:47.557983396 +0000 UTC m=+0.088387787 container cleanup 4d29830badc53b0b440294d74f331f80303c7b0cf02a700ea9d811b63c8deac5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4d67e270-6232-44c0-a859-2ab75934074d, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 20 09:26:47 np0005588920 nova_compute[226886]: 2026-01-20 14:26:47.558 226890 DEBUG nova.network.os_vif_util [None req-bd27b6f9-cad2-43f5-9d8d-aefd8bedfd4c 57d58248fa3b44579c14396dca4a2199 ef783a3b5dd3446faf947d627c64c5da - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:dc:5e:41,bridge_name='br-int',has_traffic_filtering=True,id=9a01b377-c198-496f-986c-48e774f33c12,network=Network(4d67e270-6232-44c0-a859-2ab75934074d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9a01b377-c1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:26:47 np0005588920 nova_compute[226886]: 2026-01-20 14:26:47.559 226890 DEBUG os_vif [None req-bd27b6f9-cad2-43f5-9d8d-aefd8bedfd4c 57d58248fa3b44579c14396dca4a2199 ef783a3b5dd3446faf947d627c64c5da - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:dc:5e:41,bridge_name='br-int',has_traffic_filtering=True,id=9a01b377-c198-496f-986c-48e774f33c12,network=Network(4d67e270-6232-44c0-a859-2ab75934074d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9a01b377-c1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:26:47 np0005588920 nova_compute[226886]: 2026-01-20 14:26:47.560 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:26:47 np0005588920 nova_compute[226886]: 2026-01-20 14:26:47.561 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9a01b377-c1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:26:47 np0005588920 nova_compute[226886]: 2026-01-20 14:26:47.563 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:26:47 np0005588920 nova_compute[226886]: 2026-01-20 14:26:47.567 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:26:47 np0005588920 nova_compute[226886]: 2026-01-20 14:26:47.569 226890 INFO os_vif [None req-bd27b6f9-cad2-43f5-9d8d-aefd8bedfd4c 57d58248fa3b44579c14396dca4a2199 ef783a3b5dd3446faf947d627c64c5da - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:dc:5e:41,bridge_name='br-int',has_traffic_filtering=True,id=9a01b377-c198-496f-986c-48e774f33c12,network=Network(4d67e270-6232-44c0-a859-2ab75934074d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9a01b377-c1')#033[00m
Jan 20 09:26:47 np0005588920 systemd[1]: libpod-conmon-4d29830badc53b0b440294d74f331f80303c7b0cf02a700ea9d811b63c8deac5.scope: Deactivated successfully.
Jan 20 09:26:47 np0005588920 podman[233457]: 2026-01-20 14:26:47.621405034 +0000 UTC m=+0.041678563 container remove 4d29830badc53b0b440294d74f331f80303c7b0cf02a700ea9d811b63c8deac5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4d67e270-6232-44c0-a859-2ab75934074d, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 20 09:26:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:26:47.627 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ceef29f3-0b2e-4dfb-9c33-22800ee9292d]: (4, ('Tue Jan 20 02:26:47 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-4d67e270-6232-44c0-a859-2ab75934074d (4d29830badc53b0b440294d74f331f80303c7b0cf02a700ea9d811b63c8deac5)\n4d29830badc53b0b440294d74f331f80303c7b0cf02a700ea9d811b63c8deac5\nTue Jan 20 02:26:47 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-4d67e270-6232-44c0-a859-2ab75934074d (4d29830badc53b0b440294d74f331f80303c7b0cf02a700ea9d811b63c8deac5)\n4d29830badc53b0b440294d74f331f80303c7b0cf02a700ea9d811b63c8deac5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:26:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:26:47.629 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[cfe05c78-de5b-4ad9-a123-4bcea751c249]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:26:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:26:47.630 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4d67e270-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:26:47 np0005588920 nova_compute[226886]: 2026-01-20 14:26:47.631 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:26:47 np0005588920 kernel: tap4d67e270-60: left promiscuous mode
Jan 20 09:26:47 np0005588920 nova_compute[226886]: 2026-01-20 14:26:47.659 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:26:47 np0005588920 nova_compute[226886]: 2026-01-20 14:26:47.661 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:26:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:26:47.662 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a578248a-750c-4268-acf6-6c9a3bce39ef]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:26:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:26:47.680 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ae6f6597-b3ab-44f0-bcef-6ea412cb247f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:26:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:26:47.682 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c10e12a5-d3c6-4def-9edb-88a7f9534dbb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:26:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:26:47.703 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[26b032fd-b1cd-4f42-98d1-ed7cd371360d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 420922, 'reachable_time': 29153, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233490, 'error': None, 'target': 'ovnmeta-4d67e270-6232-44c0-a859-2ab75934074d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:26:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:26:47.705 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4d67e270-6232-44c0-a859-2ab75934074d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:26:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:26:47.705 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[d8777518-1cb0-4ed3-bfa2-a7cd2f1678d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:26:47 np0005588920 systemd[1]: run-netns-ovnmeta\x2d4d67e270\x2d6232\x2d44c0\x2da859\x2d2ab75934074d.mount: Deactivated successfully.
Jan 20 09:26:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:26:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:26:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:26:48.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:26:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:26:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:26:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:26:48.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:26:48 np0005588920 nova_compute[226886]: 2026-01-20 14:26:48.255 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:26:48 np0005588920 nova_compute[226886]: 2026-01-20 14:26:48.573 226890 DEBUG nova.compute.manager [req-0f4911f3-3003-455f-ae37-73772ec72965 req-fd0ad2ea-7bdb-4098-bc1e-d5896d7de50b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5ee5ed18-cd6a-421b-874a-94127d9d2e22] Received event network-vif-unplugged-9a01b377-c198-496f-986c-48e774f33c12 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:26:48 np0005588920 nova_compute[226886]: 2026-01-20 14:26:48.574 226890 DEBUG oslo_concurrency.lockutils [req-0f4911f3-3003-455f-ae37-73772ec72965 req-fd0ad2ea-7bdb-4098-bc1e-d5896d7de50b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "5ee5ed18-cd6a-421b-874a-94127d9d2e22-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:26:48 np0005588920 nova_compute[226886]: 2026-01-20 14:26:48.574 226890 DEBUG oslo_concurrency.lockutils [req-0f4911f3-3003-455f-ae37-73772ec72965 req-fd0ad2ea-7bdb-4098-bc1e-d5896d7de50b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "5ee5ed18-cd6a-421b-874a-94127d9d2e22-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:26:48 np0005588920 nova_compute[226886]: 2026-01-20 14:26:48.575 226890 DEBUG oslo_concurrency.lockutils [req-0f4911f3-3003-455f-ae37-73772ec72965 req-fd0ad2ea-7bdb-4098-bc1e-d5896d7de50b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "5ee5ed18-cd6a-421b-874a-94127d9d2e22-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:26:48 np0005588920 nova_compute[226886]: 2026-01-20 14:26:48.575 226890 DEBUG nova.compute.manager [req-0f4911f3-3003-455f-ae37-73772ec72965 req-fd0ad2ea-7bdb-4098-bc1e-d5896d7de50b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5ee5ed18-cd6a-421b-874a-94127d9d2e22] No waiting events found dispatching network-vif-unplugged-9a01b377-c198-496f-986c-48e774f33c12 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:26:48 np0005588920 nova_compute[226886]: 2026-01-20 14:26:48.576 226890 DEBUG nova.compute.manager [req-0f4911f3-3003-455f-ae37-73772ec72965 req-fd0ad2ea-7bdb-4098-bc1e-d5896d7de50b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5ee5ed18-cd6a-421b-874a-94127d9d2e22] Received event network-vif-unplugged-9a01b377-c198-496f-986c-48e774f33c12 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:26:48 np0005588920 nova_compute[226886]: 2026-01-20 14:26:48.837 226890 INFO nova.virt.libvirt.driver [None req-bd27b6f9-cad2-43f5-9d8d-aefd8bedfd4c 57d58248fa3b44579c14396dca4a2199 ef783a3b5dd3446faf947d627c64c5da - - default default] [instance: 5ee5ed18-cd6a-421b-874a-94127d9d2e22] Deleting instance files /var/lib/nova/instances/5ee5ed18-cd6a-421b-874a-94127d9d2e22_del#033[00m
Jan 20 09:26:48 np0005588920 nova_compute[226886]: 2026-01-20 14:26:48.838 226890 INFO nova.virt.libvirt.driver [None req-bd27b6f9-cad2-43f5-9d8d-aefd8bedfd4c 57d58248fa3b44579c14396dca4a2199 ef783a3b5dd3446faf947d627c64c5da - - default default] [instance: 5ee5ed18-cd6a-421b-874a-94127d9d2e22] Deletion of /var/lib/nova/instances/5ee5ed18-cd6a-421b-874a-94127d9d2e22_del complete#033[00m
Jan 20 09:26:48 np0005588920 nova_compute[226886]: 2026-01-20 14:26:48.921 226890 INFO nova.compute.manager [None req-bd27b6f9-cad2-43f5-9d8d-aefd8bedfd4c 57d58248fa3b44579c14396dca4a2199 ef783a3b5dd3446faf947d627c64c5da - - default default] [instance: 5ee5ed18-cd6a-421b-874a-94127d9d2e22] Took 1.67 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:26:48 np0005588920 nova_compute[226886]: 2026-01-20 14:26:48.921 226890 DEBUG oslo.service.loopingcall [None req-bd27b6f9-cad2-43f5-9d8d-aefd8bedfd4c 57d58248fa3b44579c14396dca4a2199 ef783a3b5dd3446faf947d627c64c5da - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:26:48 np0005588920 nova_compute[226886]: 2026-01-20 14:26:48.922 226890 DEBUG nova.compute.manager [-] [instance: 5ee5ed18-cd6a-421b-874a-94127d9d2e22] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:26:48 np0005588920 nova_compute[226886]: 2026-01-20 14:26:48.922 226890 DEBUG nova.network.neutron [-] [instance: 5ee5ed18-cd6a-421b-874a-94127d9d2e22] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:26:49 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:26:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:26:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:26:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:26:50.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:26:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:26:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:26:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:26:50.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:26:50 np0005588920 nova_compute[226886]: 2026-01-20 14:26:50.762 226890 DEBUG nova.compute.manager [req-101bea90-313a-433b-9ac0-6f6e6e990793 req-99fa7323-1ffb-431a-924f-71661b93042c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5ee5ed18-cd6a-421b-874a-94127d9d2e22] Received event network-vif-plugged-9a01b377-c198-496f-986c-48e774f33c12 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:26:50 np0005588920 nova_compute[226886]: 2026-01-20 14:26:50.764 226890 DEBUG oslo_concurrency.lockutils [req-101bea90-313a-433b-9ac0-6f6e6e990793 req-99fa7323-1ffb-431a-924f-71661b93042c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "5ee5ed18-cd6a-421b-874a-94127d9d2e22-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:26:50 np0005588920 nova_compute[226886]: 2026-01-20 14:26:50.764 226890 DEBUG oslo_concurrency.lockutils [req-101bea90-313a-433b-9ac0-6f6e6e990793 req-99fa7323-1ffb-431a-924f-71661b93042c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "5ee5ed18-cd6a-421b-874a-94127d9d2e22-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:26:50 np0005588920 nova_compute[226886]: 2026-01-20 14:26:50.765 226890 DEBUG oslo_concurrency.lockutils [req-101bea90-313a-433b-9ac0-6f6e6e990793 req-99fa7323-1ffb-431a-924f-71661b93042c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "5ee5ed18-cd6a-421b-874a-94127d9d2e22-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:26:50 np0005588920 nova_compute[226886]: 2026-01-20 14:26:50.766 226890 DEBUG nova.compute.manager [req-101bea90-313a-433b-9ac0-6f6e6e990793 req-99fa7323-1ffb-431a-924f-71661b93042c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5ee5ed18-cd6a-421b-874a-94127d9d2e22] No waiting events found dispatching network-vif-plugged-9a01b377-c198-496f-986c-48e774f33c12 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:26:50 np0005588920 nova_compute[226886]: 2026-01-20 14:26:50.767 226890 WARNING nova.compute.manager [req-101bea90-313a-433b-9ac0-6f6e6e990793 req-99fa7323-1ffb-431a-924f-71661b93042c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5ee5ed18-cd6a-421b-874a-94127d9d2e22] Received unexpected event network-vif-plugged-9a01b377-c198-496f-986c-48e774f33c12 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:26:50 np0005588920 nova_compute[226886]: 2026-01-20 14:26:50.820 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:26:51 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e151 e151: 3 total, 3 up, 3 in
Jan 20 09:26:51 np0005588920 nova_compute[226886]: 2026-01-20 14:26:51.772 226890 DEBUG nova.compute.manager [req-144fbea7-239a-47ce-9732-55606c7f5687 req-affb604a-564a-48f2-ba59-8a3579ac6cbb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5ee5ed18-cd6a-421b-874a-94127d9d2e22] Received event network-vif-deleted-9a01b377-c198-496f-986c-48e774f33c12 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:26:51 np0005588920 nova_compute[226886]: 2026-01-20 14:26:51.772 226890 INFO nova.compute.manager [req-144fbea7-239a-47ce-9732-55606c7f5687 req-affb604a-564a-48f2-ba59-8a3579ac6cbb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5ee5ed18-cd6a-421b-874a-94127d9d2e22] Neutron deleted interface 9a01b377-c198-496f-986c-48e774f33c12; detaching it from the instance and deleting it from the info cache#033[00m
Jan 20 09:26:51 np0005588920 nova_compute[226886]: 2026-01-20 14:26:51.772 226890 DEBUG nova.network.neutron [req-144fbea7-239a-47ce-9732-55606c7f5687 req-affb604a-564a-48f2-ba59-8a3579ac6cbb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5ee5ed18-cd6a-421b-874a-94127d9d2e22] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:26:51 np0005588920 nova_compute[226886]: 2026-01-20 14:26:51.775 226890 DEBUG nova.network.neutron [-] [instance: 5ee5ed18-cd6a-421b-874a-94127d9d2e22] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:26:51 np0005588920 nova_compute[226886]: 2026-01-20 14:26:51.816 226890 INFO nova.compute.manager [-] [instance: 5ee5ed18-cd6a-421b-874a-94127d9d2e22] Took 2.89 seconds to deallocate network for instance.#033[00m
Jan 20 09:26:51 np0005588920 nova_compute[226886]: 2026-01-20 14:26:51.826 226890 DEBUG nova.compute.manager [req-144fbea7-239a-47ce-9732-55606c7f5687 req-affb604a-564a-48f2-ba59-8a3579ac6cbb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5ee5ed18-cd6a-421b-874a-94127d9d2e22] Detach interface failed, port_id=9a01b377-c198-496f-986c-48e774f33c12, reason: Instance 5ee5ed18-cd6a-421b-874a-94127d9d2e22 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 20 09:26:51 np0005588920 nova_compute[226886]: 2026-01-20 14:26:51.890 226890 DEBUG oslo_concurrency.lockutils [None req-bd27b6f9-cad2-43f5-9d8d-aefd8bedfd4c 57d58248fa3b44579c14396dca4a2199 ef783a3b5dd3446faf947d627c64c5da - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:26:51 np0005588920 nova_compute[226886]: 2026-01-20 14:26:51.890 226890 DEBUG oslo_concurrency.lockutils [None req-bd27b6f9-cad2-43f5-9d8d-aefd8bedfd4c 57d58248fa3b44579c14396dca4a2199 ef783a3b5dd3446faf947d627c64c5da - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:26:51 np0005588920 nova_compute[226886]: 2026-01-20 14:26:51.952 226890 DEBUG oslo_concurrency.processutils [None req-bd27b6f9-cad2-43f5-9d8d-aefd8bedfd4c 57d58248fa3b44579c14396dca4a2199 ef783a3b5dd3446faf947d627c64c5da - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:26:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:26:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:26:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:26:52.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:26:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:26:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:26:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:26:52.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:26:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:26:52 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1900977882' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:26:52 np0005588920 nova_compute[226886]: 2026-01-20 14:26:52.400 226890 DEBUG oslo_concurrency.processutils [None req-bd27b6f9-cad2-43f5-9d8d-aefd8bedfd4c 57d58248fa3b44579c14396dca4a2199 ef783a3b5dd3446faf947d627c64c5da - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:26:52 np0005588920 nova_compute[226886]: 2026-01-20 14:26:52.410 226890 DEBUG nova.compute.provider_tree [None req-bd27b6f9-cad2-43f5-9d8d-aefd8bedfd4c 57d58248fa3b44579c14396dca4a2199 ef783a3b5dd3446faf947d627c64c5da - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:26:52 np0005588920 nova_compute[226886]: 2026-01-20 14:26:52.430 226890 DEBUG nova.scheduler.client.report [None req-bd27b6f9-cad2-43f5-9d8d-aefd8bedfd4c 57d58248fa3b44579c14396dca4a2199 ef783a3b5dd3446faf947d627c64c5da - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:26:52 np0005588920 nova_compute[226886]: 2026-01-20 14:26:52.452 226890 DEBUG oslo_concurrency.lockutils [None req-bd27b6f9-cad2-43f5-9d8d-aefd8bedfd4c 57d58248fa3b44579c14396dca4a2199 ef783a3b5dd3446faf947d627c64c5da - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.562s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:26:52 np0005588920 nova_compute[226886]: 2026-01-20 14:26:52.479 226890 INFO nova.scheduler.client.report [None req-bd27b6f9-cad2-43f5-9d8d-aefd8bedfd4c 57d58248fa3b44579c14396dca4a2199 ef783a3b5dd3446faf947d627c64c5da - - default default] Deleted allocations for instance 5ee5ed18-cd6a-421b-874a-94127d9d2e22#033[00m
Jan 20 09:26:52 np0005588920 nova_compute[226886]: 2026-01-20 14:26:52.564 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:26:52 np0005588920 nova_compute[226886]: 2026-01-20 14:26:52.576 226890 DEBUG oslo_concurrency.lockutils [None req-bd27b6f9-cad2-43f5-9d8d-aefd8bedfd4c 57d58248fa3b44579c14396dca4a2199 ef783a3b5dd3446faf947d627c64c5da - - default default] Lock "5ee5ed18-cd6a-421b-874a-94127d9d2e22" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.329s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:26:53 np0005588920 nova_compute[226886]: 2026-01-20 14:26:53.286 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:26:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:26:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:26:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:26:54.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:26:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:26:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:26:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:26:54.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:26:54 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:26:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:26:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:26:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:26:56.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:26:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:26:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:26:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:26:56.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:26:57 np0005588920 nova_compute[226886]: 2026-01-20 14:26:57.569 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:26:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:26:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:26:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:26:58.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:26:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:26:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:26:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:26:58.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:26:58 np0005588920 nova_compute[226886]: 2026-01-20 14:26:58.289 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:26:59 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:27:00 np0005588920 podman[233515]: 2026-01-20 14:27:00.031445338 +0000 UTC m=+0.115177186 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, 
org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:27:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:27:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:00.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:27:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:00.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:00 np0005588920 nova_compute[226886]: 2026-01-20 14:27:00.638 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:02 np0005588920 nova_compute[226886]: 2026-01-20 14:27:02.140 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:02.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:02.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:02 np0005588920 nova_compute[226886]: 2026-01-20 14:27:02.335 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:02 np0005588920 nova_compute[226886]: 2026-01-20 14:27:02.488 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768919207.4876492, 5ee5ed18-cd6a-421b-874a-94127d9d2e22 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:27:02 np0005588920 nova_compute[226886]: 2026-01-20 14:27:02.489 226890 INFO nova.compute.manager [-] [instance: 5ee5ed18-cd6a-421b-874a-94127d9d2e22] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:27:02 np0005588920 nova_compute[226886]: 2026-01-20 14:27:02.524 226890 DEBUG nova.compute.manager [None req-6438a231-e949-4231-989f-942d80b4a67a - - - - - -] [instance: 5ee5ed18-cd6a-421b-874a-94127d9d2e22] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:27:02 np0005588920 nova_compute[226886]: 2026-01-20 14:27:02.571 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:03 np0005588920 nova_compute[226886]: 2026-01-20 14:27:03.291 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:04.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:27:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:04.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:27:04 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:27:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:27:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:06.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:27:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:06.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:07 np0005588920 nova_compute[226886]: 2026-01-20 14:27:07.575 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:08.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:27:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:08.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:27:08 np0005588920 nova_compute[226886]: 2026-01-20 14:27:08.293 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:27:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:27:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:10.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:27:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:10.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:12.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:27:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:12.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:27:12 np0005588920 nova_compute[226886]: 2026-01-20 14:27:12.579 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:12 np0005588920 podman[233543]: 2026-01-20 14:27:12.99338888 +0000 UTC m=+0.078229499 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:27:13 np0005588920 nova_compute[226886]: 2026-01-20 14:27:13.240 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:27:13 np0005588920 nova_compute[226886]: 2026-01-20 14:27:13.240 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:27:13 np0005588920 nova_compute[226886]: 2026-01-20 14:27:13.241 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 09:27:13 np0005588920 nova_compute[226886]: 2026-01-20 14:27:13.259 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 09:27:13 np0005588920 nova_compute[226886]: 2026-01-20 14:27:13.260 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:27:13 np0005588920 nova_compute[226886]: 2026-01-20 14:27:13.260 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:27:13 np0005588920 nova_compute[226886]: 2026-01-20 14:27:13.261 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:27:13 np0005588920 nova_compute[226886]: 2026-01-20 14:27:13.295 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:13 np0005588920 nova_compute[226886]: 2026-01-20 14:27:13.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:27:13 np0005588920 nova_compute[226886]: 2026-01-20 14:27:13.727 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:27:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:27:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:14.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:27:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:14.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:27:14 np0005588920 nova_compute[226886]: 2026-01-20 14:27:14.722 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:27:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:27:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:16.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:27:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:16.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:27:16.429 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:27:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:27:16.429 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:27:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:27:16.430 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:27:16 np0005588920 nova_compute[226886]: 2026-01-20 14:27:16.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:27:17 np0005588920 nova_compute[226886]: 2026-01-20 14:27:17.528 226890 DEBUG oslo_concurrency.lockutils [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Acquiring lock "21cf2820-8f37-488a-ae75-3f8f45d6ba81" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:27:17 np0005588920 nova_compute[226886]: 2026-01-20 14:27:17.529 226890 DEBUG oslo_concurrency.lockutils [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Lock "21cf2820-8f37-488a-ae75-3f8f45d6ba81" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:27:17 np0005588920 nova_compute[226886]: 2026-01-20 14:27:17.553 226890 DEBUG nova.compute.manager [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:27:17 np0005588920 nova_compute[226886]: 2026-01-20 14:27:17.582 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:17 np0005588920 nova_compute[226886]: 2026-01-20 14:27:17.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:27:17 np0005588920 nova_compute[226886]: 2026-01-20 14:27:17.832 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:27:17 np0005588920 nova_compute[226886]: 2026-01-20 14:27:17.833 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:27:17 np0005588920 nova_compute[226886]: 2026-01-20 14:27:17.834 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:27:17 np0005588920 nova_compute[226886]: 2026-01-20 14:27:17.834 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:27:17 np0005588920 nova_compute[226886]: 2026-01-20 14:27:17.835 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:27:17 np0005588920 nova_compute[226886]: 2026-01-20 14:27:17.887 226890 DEBUG oslo_concurrency.lockutils [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:27:17 np0005588920 nova_compute[226886]: 2026-01-20 14:27:17.889 226890 DEBUG oslo_concurrency.lockutils [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:27:17 np0005588920 nova_compute[226886]: 2026-01-20 14:27:17.897 226890 DEBUG nova.virt.hardware [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:27:17 np0005588920 nova_compute[226886]: 2026-01-20 14:27:17.897 226890 INFO nova.compute.claims [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 09:27:18 np0005588920 nova_compute[226886]: 2026-01-20 14:27:18.005 226890 DEBUG oslo_concurrency.processutils [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:27:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:18.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:27:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:18.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:27:18 np0005588920 nova_compute[226886]: 2026-01-20 14:27:18.296 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:18 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:27:18 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/338726957' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:27:18 np0005588920 nova_compute[226886]: 2026-01-20 14:27:18.357 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.523s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:27:18 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:27:18 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/499083521' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:27:18 np0005588920 nova_compute[226886]: 2026-01-20 14:27:18.487 226890 DEBUG oslo_concurrency.processutils [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:27:18 np0005588920 nova_compute[226886]: 2026-01-20 14:27:18.496 226890 DEBUG nova.compute.provider_tree [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:27:18 np0005588920 nova_compute[226886]: 2026-01-20 14:27:18.548 226890 DEBUG nova.scheduler.client.report [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:27:18 np0005588920 nova_compute[226886]: 2026-01-20 14:27:18.625 226890 DEBUG oslo_concurrency.lockutils [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.736s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:27:18 np0005588920 nova_compute[226886]: 2026-01-20 14:27:18.626 226890 DEBUG nova.compute.manager [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:27:18 np0005588920 nova_compute[226886]: 2026-01-20 14:27:18.650 226890 DEBUG oslo_concurrency.lockutils [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Acquiring lock "665a5c82-dfcf-43d5-9d42-1f3046305b5b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:27:18 np0005588920 nova_compute[226886]: 2026-01-20 14:27:18.651 226890 DEBUG oslo_concurrency.lockutils [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Lock "665a5c82-dfcf-43d5-9d42-1f3046305b5b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:27:18 np0005588920 nova_compute[226886]: 2026-01-20 14:27:18.672 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:27:18 np0005588920 nova_compute[226886]: 2026-01-20 14:27:18.673 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4885MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:27:18 np0005588920 nova_compute[226886]: 2026-01-20 14:27:18.673 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:27:18 np0005588920 nova_compute[226886]: 2026-01-20 14:27:18.673 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:27:18 np0005588920 nova_compute[226886]: 2026-01-20 14:27:18.698 226890 DEBUG nova.compute.manager [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:27:18 np0005588920 nova_compute[226886]: 2026-01-20 14:27:18.727 226890 DEBUG nova.compute.manager [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:27:18 np0005588920 nova_compute[226886]: 2026-01-20 14:27:18.728 226890 DEBUG nova.network.neutron [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:27:18 np0005588920 nova_compute[226886]: 2026-01-20 14:27:18.792 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance 21cf2820-8f37-488a-ae75-3f8f45d6ba81 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:27:18 np0005588920 nova_compute[226886]: 2026-01-20 14:27:18.852 226890 INFO nova.virt.libvirt.driver [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:27:18 np0005588920 nova_compute[226886]: 2026-01-20 14:27:18.888 226890 DEBUG oslo_concurrency.lockutils [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:27:18 np0005588920 nova_compute[226886]: 2026-01-20 14:27:18.900 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance 665a5c82-dfcf-43d5-9d42-1f3046305b5b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1692#033[00m
Jan 20 09:27:18 np0005588920 nova_compute[226886]: 2026-01-20 14:27:18.900 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:27:18 np0005588920 nova_compute[226886]: 2026-01-20 14:27:18.901 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:27:18 np0005588920 nova_compute[226886]: 2026-01-20 14:27:18.906 226890 DEBUG nova.compute.manager [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:27:18 np0005588920 nova_compute[226886]: 2026-01-20 14:27:18.969 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:27:19 np0005588920 nova_compute[226886]: 2026-01-20 14:27:19.087 226890 DEBUG nova.compute.manager [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:27:19 np0005588920 nova_compute[226886]: 2026-01-20 14:27:19.089 226890 DEBUG nova.virt.libvirt.driver [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:27:19 np0005588920 nova_compute[226886]: 2026-01-20 14:27:19.090 226890 INFO nova.virt.libvirt.driver [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] Creating image(s)#033[00m
Jan 20 09:27:19 np0005588920 nova_compute[226886]: 2026-01-20 14:27:19.125 226890 DEBUG nova.storage.rbd_utils [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] rbd image 21cf2820-8f37-488a-ae75-3f8f45d6ba81_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:27:19 np0005588920 nova_compute[226886]: 2026-01-20 14:27:19.166 226890 DEBUG nova.storage.rbd_utils [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] rbd image 21cf2820-8f37-488a-ae75-3f8f45d6ba81_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:27:19 np0005588920 nova_compute[226886]: 2026-01-20 14:27:19.202 226890 DEBUG nova.storage.rbd_utils [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] rbd image 21cf2820-8f37-488a-ae75-3f8f45d6ba81_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:27:19 np0005588920 nova_compute[226886]: 2026-01-20 14:27:19.207 226890 DEBUG oslo_concurrency.processutils [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:27:19 np0005588920 nova_compute[226886]: 2026-01-20 14:27:19.240 226890 DEBUG nova.network.neutron [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Jan 20 09:27:19 np0005588920 nova_compute[226886]: 2026-01-20 14:27:19.241 226890 DEBUG nova.compute.manager [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:27:19 np0005588920 nova_compute[226886]: 2026-01-20 14:27:19.299 226890 DEBUG oslo_concurrency.processutils [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:27:19 np0005588920 nova_compute[226886]: 2026-01-20 14:27:19.301 226890 DEBUG oslo_concurrency.lockutils [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:27:19 np0005588920 nova_compute[226886]: 2026-01-20 14:27:19.303 226890 DEBUG oslo_concurrency.lockutils [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:27:19 np0005588920 nova_compute[226886]: 2026-01-20 14:27:19.304 226890 DEBUG oslo_concurrency.lockutils [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:27:19 np0005588920 nova_compute[226886]: 2026-01-20 14:27:19.354 226890 DEBUG nova.storage.rbd_utils [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] rbd image 21cf2820-8f37-488a-ae75-3f8f45d6ba81_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:27:19 np0005588920 nova_compute[226886]: 2026-01-20 14:27:19.367 226890 DEBUG oslo_concurrency.processutils [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 21cf2820-8f37-488a-ae75-3f8f45d6ba81_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:27:19 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:27:19 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3505650960' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:27:19 np0005588920 nova_compute[226886]: 2026-01-20 14:27:19.451 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:27:19 np0005588920 nova_compute[226886]: 2026-01-20 14:27:19.456 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:27:19 np0005588920 nova_compute[226886]: 2026-01-20 14:27:19.475 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:27:19 np0005588920 nova_compute[226886]: 2026-01-20 14:27:19.499 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:27:19 np0005588920 nova_compute[226886]: 2026-01-20 14:27:19.500 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.826s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:27:19 np0005588920 nova_compute[226886]: 2026-01-20 14:27:19.501 226890 DEBUG oslo_concurrency.lockutils [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.612s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:27:19 np0005588920 nova_compute[226886]: 2026-01-20 14:27:19.509 226890 DEBUG nova.virt.hardware [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:27:19 np0005588920 nova_compute[226886]: 2026-01-20 14:27:19.510 226890 INFO nova.compute.claims [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 09:27:19 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:27:19 np0005588920 nova_compute[226886]: 2026-01-20 14:27:19.786 226890 DEBUG oslo_concurrency.processutils [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:27:19 np0005588920 nova_compute[226886]: 2026-01-20 14:27:19.979 226890 DEBUG oslo_concurrency.processutils [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 21cf2820-8f37-488a-ae75-3f8f45d6ba81_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.612s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:27:20 np0005588920 nova_compute[226886]: 2026-01-20 14:27:20.061 226890 DEBUG nova.storage.rbd_utils [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] resizing rbd image 21cf2820-8f37-488a-ae75-3f8f45d6ba81_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:27:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:20.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:20 np0005588920 nova_compute[226886]: 2026-01-20 14:27:20.192 226890 DEBUG nova.objects.instance [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Lazy-loading 'migration_context' on Instance uuid 21cf2820-8f37-488a-ae75-3f8f45d6ba81 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:27:20 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:27:20 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1706451896' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:27:20 np0005588920 nova_compute[226886]: 2026-01-20 14:27:20.207 226890 DEBUG nova.virt.libvirt.driver [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:27:20 np0005588920 nova_compute[226886]: 2026-01-20 14:27:20.208 226890 DEBUG nova.virt.libvirt.driver [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] Ensure instance console log exists: /var/lib/nova/instances/21cf2820-8f37-488a-ae75-3f8f45d6ba81/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:27:20 np0005588920 nova_compute[226886]: 2026-01-20 14:27:20.209 226890 DEBUG oslo_concurrency.lockutils [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:27:20 np0005588920 nova_compute[226886]: 2026-01-20 14:27:20.209 226890 DEBUG oslo_concurrency.lockutils [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:27:20 np0005588920 nova_compute[226886]: 2026-01-20 14:27:20.210 226890 DEBUG oslo_concurrency.lockutils [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:27:20 np0005588920 nova_compute[226886]: 2026-01-20 14:27:20.213 226890 DEBUG nova.virt.libvirt.driver [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:27:20 np0005588920 nova_compute[226886]: 2026-01-20 14:27:20.219 226890 WARNING nova.virt.libvirt.driver [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:27:20 np0005588920 nova_compute[226886]: 2026-01-20 14:27:20.221 226890 DEBUG oslo_concurrency.processutils [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:27:20 np0005588920 nova_compute[226886]: 2026-01-20 14:27:20.229 226890 DEBUG nova.compute.provider_tree [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:27:20 np0005588920 nova_compute[226886]: 2026-01-20 14:27:20.232 226890 DEBUG nova.virt.libvirt.host [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:27:20 np0005588920 nova_compute[226886]: 2026-01-20 14:27:20.233 226890 DEBUG nova.virt.libvirt.host [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:27:20 np0005588920 nova_compute[226886]: 2026-01-20 14:27:20.240 226890 DEBUG nova.virt.libvirt.host [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:27:20 np0005588920 nova_compute[226886]: 2026-01-20 14:27:20.241 226890 DEBUG nova.virt.libvirt.host [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:27:20 np0005588920 nova_compute[226886]: 2026-01-20 14:27:20.242 226890 DEBUG nova.virt.libvirt.driver [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:27:20 np0005588920 nova_compute[226886]: 2026-01-20 14:27:20.242 226890 DEBUG nova.virt.hardware [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:27:20 np0005588920 nova_compute[226886]: 2026-01-20 14:27:20.243 226890 DEBUG nova.virt.hardware [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:27:20 np0005588920 nova_compute[226886]: 2026-01-20 14:27:20.243 226890 DEBUG nova.virt.hardware [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:27:20 np0005588920 nova_compute[226886]: 2026-01-20 14:27:20.243 226890 DEBUG nova.virt.hardware [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:27:20 np0005588920 nova_compute[226886]: 2026-01-20 14:27:20.243 226890 DEBUG nova.virt.hardware [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:27:20 np0005588920 nova_compute[226886]: 2026-01-20 14:27:20.244 226890 DEBUG nova.virt.hardware [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:27:20 np0005588920 nova_compute[226886]: 2026-01-20 14:27:20.244 226890 DEBUG nova.virt.hardware [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:27:20 np0005588920 nova_compute[226886]: 2026-01-20 14:27:20.244 226890 DEBUG nova.virt.hardware [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:27:20 np0005588920 nova_compute[226886]: 2026-01-20 14:27:20.244 226890 DEBUG nova.virt.hardware [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:27:20 np0005588920 nova_compute[226886]: 2026-01-20 14:27:20.245 226890 DEBUG nova.virt.hardware [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:27:20 np0005588920 nova_compute[226886]: 2026-01-20 14:27:20.245 226890 DEBUG nova.virt.hardware [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:27:20 np0005588920 nova_compute[226886]: 2026-01-20 14:27:20.247 226890 DEBUG oslo_concurrency.processutils [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:27:20 np0005588920 nova_compute[226886]: 2026-01-20 14:27:20.266 226890 DEBUG nova.scheduler.client.report [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:27:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:27:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:20.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:27:20 np0005588920 nova_compute[226886]: 2026-01-20 14:27:20.307 226890 DEBUG oslo_concurrency.lockutils [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.806s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:27:20 np0005588920 nova_compute[226886]: 2026-01-20 14:27:20.308 226890 DEBUG nova.compute.manager [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:27:20 np0005588920 nova_compute[226886]: 2026-01-20 14:27:20.442 226890 DEBUG nova.compute.manager [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:27:20 np0005588920 nova_compute[226886]: 2026-01-20 14:27:20.443 226890 DEBUG nova.network.neutron [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:27:20 np0005588920 nova_compute[226886]: 2026-01-20 14:27:20.461 226890 INFO nova.virt.libvirt.driver [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:27:20 np0005588920 nova_compute[226886]: 2026-01-20 14:27:20.501 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:27:20 np0005588920 nova_compute[226886]: 2026-01-20 14:27:20.564 226890 DEBUG nova.compute.manager [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:27:20 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:27:20 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/697472392' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:27:20 np0005588920 nova_compute[226886]: 2026-01-20 14:27:20.653 226890 DEBUG nova.compute.manager [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:27:20 np0005588920 nova_compute[226886]: 2026-01-20 14:27:20.654 226890 DEBUG nova.virt.libvirt.driver [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:27:20 np0005588920 nova_compute[226886]: 2026-01-20 14:27:20.655 226890 INFO nova.virt.libvirt.driver [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] Creating image(s)#033[00m
Jan 20 09:27:20 np0005588920 nova_compute[226886]: 2026-01-20 14:27:20.679 226890 DEBUG nova.storage.rbd_utils [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] rbd image 665a5c82-dfcf-43d5-9d42-1f3046305b5b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:27:20 np0005588920 nova_compute[226886]: 2026-01-20 14:27:20.707 226890 DEBUG nova.storage.rbd_utils [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] rbd image 665a5c82-dfcf-43d5-9d42-1f3046305b5b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:27:20 np0005588920 nova_compute[226886]: 2026-01-20 14:27:20.738 226890 DEBUG nova.storage.rbd_utils [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] rbd image 665a5c82-dfcf-43d5-9d42-1f3046305b5b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:27:20 np0005588920 nova_compute[226886]: 2026-01-20 14:27:20.741 226890 DEBUG oslo_concurrency.processutils [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:27:20 np0005588920 nova_compute[226886]: 2026-01-20 14:27:20.827 226890 DEBUG oslo_concurrency.processutils [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:27:20 np0005588920 nova_compute[226886]: 2026-01-20 14:27:20.829 226890 DEBUG oslo_concurrency.lockutils [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:27:20 np0005588920 nova_compute[226886]: 2026-01-20 14:27:20.831 226890 DEBUG oslo_concurrency.lockutils [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:27:20 np0005588920 nova_compute[226886]: 2026-01-20 14:27:20.832 226890 DEBUG oslo_concurrency.lockutils [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:27:20 np0005588920 nova_compute[226886]: 2026-01-20 14:27:20.872 226890 DEBUG nova.storage.rbd_utils [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] rbd image 665a5c82-dfcf-43d5-9d42-1f3046305b5b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:27:20 np0005588920 nova_compute[226886]: 2026-01-20 14:27:20.876 226890 DEBUG oslo_concurrency.processutils [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 665a5c82-dfcf-43d5-9d42-1f3046305b5b_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:27:20 np0005588920 nova_compute[226886]: 2026-01-20 14:27:20.907 226890 DEBUG oslo_concurrency.processutils [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.660s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:27:20 np0005588920 nova_compute[226886]: 2026-01-20 14:27:20.949 226890 DEBUG nova.storage.rbd_utils [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] rbd image 21cf2820-8f37-488a-ae75-3f8f45d6ba81_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:27:20 np0005588920 nova_compute[226886]: 2026-01-20 14:27:20.956 226890 DEBUG oslo_concurrency.processutils [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:27:21 np0005588920 nova_compute[226886]: 2026-01-20 14:27:21.276 226890 DEBUG nova.network.neutron [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Jan 20 09:27:21 np0005588920 nova_compute[226886]: 2026-01-20 14:27:21.277 226890 DEBUG nova.compute.manager [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:27:21 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:27:21 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1274305540' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:27:21 np0005588920 nova_compute[226886]: 2026-01-20 14:27:21.946 226890 DEBUG oslo_concurrency.processutils [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.990s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:27:21 np0005588920 nova_compute[226886]: 2026-01-20 14:27:21.949 226890 DEBUG nova.objects.instance [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Lazy-loading 'pci_devices' on Instance uuid 21cf2820-8f37-488a-ae75-3f8f45d6ba81 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:27:21 np0005588920 nova_compute[226886]: 2026-01-20 14:27:21.969 226890 DEBUG nova.virt.libvirt.driver [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:27:21 np0005588920 nova_compute[226886]:  <uuid>21cf2820-8f37-488a-ae75-3f8f45d6ba81</uuid>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:  <name>instance-00000011</name>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:27:21 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:      <nova:name>tempest-LiveMigrationNegativeTest-server-1937781662</nova:name>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:27:20</nova:creationTime>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:27:21 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:        <nova:user uuid="399cc9abe2cd4ab196a4e5789992ae51">tempest-LiveMigrationNegativeTest-1807701797-project-member</nova:user>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:        <nova:project uuid="1759b9d61ad946b6afa3e8448ce02190">tempest-LiveMigrationNegativeTest-1807701797</nova:project>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:      <nova:ports/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:      <entry name="serial">21cf2820-8f37-488a-ae75-3f8f45d6ba81</entry>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:      <entry name="uuid">21cf2820-8f37-488a-ae75-3f8f45d6ba81</entry>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:27:21 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/21cf2820-8f37-488a-ae75-3f8f45d6ba81_disk">
Jan 20 09:27:21 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:27:21 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:27:21 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/21cf2820-8f37-488a-ae75-3f8f45d6ba81_disk.config">
Jan 20 09:27:21 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:27:21 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:27:21 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/21cf2820-8f37-488a-ae75-3f8f45d6ba81/console.log" append="off"/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:27:21 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:27:21 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:27:21 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:27:21 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:27:21 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:27:22 np0005588920 nova_compute[226886]: 2026-01-20 14:27:22.037 226890 DEBUG nova.virt.libvirt.driver [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:27:22 np0005588920 nova_compute[226886]: 2026-01-20 14:27:22.038 226890 DEBUG nova.virt.libvirt.driver [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:27:22 np0005588920 nova_compute[226886]: 2026-01-20 14:27:22.038 226890 INFO nova.virt.libvirt.driver [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] Using config drive#033[00m
Jan 20 09:27:22 np0005588920 nova_compute[226886]: 2026-01-20 14:27:22.084 226890 DEBUG nova.storage.rbd_utils [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] rbd image 21cf2820-8f37-488a-ae75-3f8f45d6ba81_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:27:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:22.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:27:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:22.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:27:22 np0005588920 nova_compute[226886]: 2026-01-20 14:27:22.333 226890 INFO nova.virt.libvirt.driver [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] Creating config drive at /var/lib/nova/instances/21cf2820-8f37-488a-ae75-3f8f45d6ba81/disk.config#033[00m
Jan 20 09:27:22 np0005588920 nova_compute[226886]: 2026-01-20 14:27:22.342 226890 DEBUG oslo_concurrency.processutils [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/21cf2820-8f37-488a-ae75-3f8f45d6ba81/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp387_bf1_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:27:22 np0005588920 nova_compute[226886]: 2026-01-20 14:27:22.472 226890 DEBUG oslo_concurrency.processutils [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/21cf2820-8f37-488a-ae75-3f8f45d6ba81/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp387_bf1_" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:27:22 np0005588920 nova_compute[226886]: 2026-01-20 14:27:22.507 226890 DEBUG nova.storage.rbd_utils [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] rbd image 21cf2820-8f37-488a-ae75-3f8f45d6ba81_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:27:22 np0005588920 nova_compute[226886]: 2026-01-20 14:27:22.510 226890 DEBUG oslo_concurrency.processutils [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/21cf2820-8f37-488a-ae75-3f8f45d6ba81/disk.config 21cf2820-8f37-488a-ae75-3f8f45d6ba81_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:27:22 np0005588920 nova_compute[226886]: 2026-01-20 14:27:22.585 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:23 np0005588920 nova_compute[226886]: 2026-01-20 14:27:23.072 226890 DEBUG oslo_concurrency.processutils [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/21cf2820-8f37-488a-ae75-3f8f45d6ba81/disk.config 21cf2820-8f37-488a-ae75-3f8f45d6ba81_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.562s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:27:23 np0005588920 nova_compute[226886]: 2026-01-20 14:27:23.073 226890 INFO nova.virt.libvirt.driver [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] Deleting local config drive /var/lib/nova/instances/21cf2820-8f37-488a-ae75-3f8f45d6ba81/disk.config because it was imported into RBD.#033[00m
Jan 20 09:27:23 np0005588920 systemd-machined[196121]: New machine qemu-7-instance-00000011.
Jan 20 09:27:23 np0005588920 systemd[1]: Started Virtual Machine qemu-7-instance-00000011.
Jan 20 09:27:23 np0005588920 nova_compute[226886]: 2026-01-20 14:27:23.298 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:23 np0005588920 nova_compute[226886]: 2026-01-20 14:27:23.352 226890 DEBUG oslo_concurrency.processutils [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 665a5c82-dfcf-43d5-9d42-1f3046305b5b_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:27:23 np0005588920 nova_compute[226886]: 2026-01-20 14:27:23.460 226890 DEBUG nova.storage.rbd_utils [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] resizing rbd image 665a5c82-dfcf-43d5-9d42-1f3046305b5b_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:27:23 np0005588920 nova_compute[226886]: 2026-01-20 14:27:23.936 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919243.935668, 21cf2820-8f37-488a-ae75-3f8f45d6ba81 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:27:23 np0005588920 nova_compute[226886]: 2026-01-20 14:27:23.938 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:27:23 np0005588920 nova_compute[226886]: 2026-01-20 14:27:23.940 226890 DEBUG nova.compute.manager [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:27:23 np0005588920 nova_compute[226886]: 2026-01-20 14:27:23.941 226890 DEBUG nova.virt.libvirt.driver [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:27:23 np0005588920 nova_compute[226886]: 2026-01-20 14:27:23.944 226890 INFO nova.virt.libvirt.driver [-] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] Instance spawned successfully.#033[00m
Jan 20 09:27:23 np0005588920 nova_compute[226886]: 2026-01-20 14:27:23.944 226890 DEBUG nova.virt.libvirt.driver [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:27:23 np0005588920 nova_compute[226886]: 2026-01-20 14:27:23.967 226890 DEBUG nova.virt.libvirt.driver [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:27:23 np0005588920 nova_compute[226886]: 2026-01-20 14:27:23.968 226890 DEBUG nova.virt.libvirt.driver [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:27:23 np0005588920 nova_compute[226886]: 2026-01-20 14:27:23.969 226890 DEBUG nova.virt.libvirt.driver [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:27:23 np0005588920 nova_compute[226886]: 2026-01-20 14:27:23.969 226890 DEBUG nova.virt.libvirt.driver [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:27:23 np0005588920 nova_compute[226886]: 2026-01-20 14:27:23.969 226890 DEBUG nova.virt.libvirt.driver [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:27:23 np0005588920 nova_compute[226886]: 2026-01-20 14:27:23.970 226890 DEBUG nova.virt.libvirt.driver [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:27:23 np0005588920 nova_compute[226886]: 2026-01-20 14:27:23.973 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:27:23 np0005588920 nova_compute[226886]: 2026-01-20 14:27:23.975 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:27:24 np0005588920 nova_compute[226886]: 2026-01-20 14:27:24.013 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:27:24 np0005588920 nova_compute[226886]: 2026-01-20 14:27:24.013 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919243.937493, 21cf2820-8f37-488a-ae75-3f8f45d6ba81 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:27:24 np0005588920 nova_compute[226886]: 2026-01-20 14:27:24.013 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] VM Started (Lifecycle Event)#033[00m
Jan 20 09:27:24 np0005588920 nova_compute[226886]: 2026-01-20 14:27:24.037 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:27:24 np0005588920 nova_compute[226886]: 2026-01-20 14:27:24.039 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:27:24 np0005588920 nova_compute[226886]: 2026-01-20 14:27:24.060 226890 INFO nova.compute.manager [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] Took 4.97 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:27:24 np0005588920 nova_compute[226886]: 2026-01-20 14:27:24.061 226890 DEBUG nova.compute.manager [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:27:24 np0005588920 nova_compute[226886]: 2026-01-20 14:27:24.070 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:27:24 np0005588920 nova_compute[226886]: 2026-01-20 14:27:24.126 226890 INFO nova.compute.manager [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] Took 6.30 seconds to build instance.#033[00m
Jan 20 09:27:24 np0005588920 nova_compute[226886]: 2026-01-20 14:27:24.148 226890 DEBUG oslo_concurrency.lockutils [None req-bc833227-384a-4cc7-ad15-39d153d82c3e 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Lock "21cf2820-8f37-488a-ae75-3f8f45d6ba81" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.619s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:27:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:24.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:24.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:24 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:27:25 np0005588920 nova_compute[226886]: 2026-01-20 14:27:25.320 226890 DEBUG nova.objects.instance [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Lazy-loading 'migration_context' on Instance uuid 665a5c82-dfcf-43d5-9d42-1f3046305b5b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:27:25 np0005588920 nova_compute[226886]: 2026-01-20 14:27:25.386 226890 DEBUG nova.virt.libvirt.driver [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:27:25 np0005588920 nova_compute[226886]: 2026-01-20 14:27:25.387 226890 DEBUG nova.virt.libvirt.driver [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] Ensure instance console log exists: /var/lib/nova/instances/665a5c82-dfcf-43d5-9d42-1f3046305b5b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:27:25 np0005588920 nova_compute[226886]: 2026-01-20 14:27:25.388 226890 DEBUG oslo_concurrency.lockutils [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:27:25 np0005588920 nova_compute[226886]: 2026-01-20 14:27:25.388 226890 DEBUG oslo_concurrency.lockutils [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:27:25 np0005588920 nova_compute[226886]: 2026-01-20 14:27:25.389 226890 DEBUG oslo_concurrency.lockutils [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:27:25 np0005588920 nova_compute[226886]: 2026-01-20 14:27:25.392 226890 DEBUG nova.virt.libvirt.driver [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:27:25 np0005588920 nova_compute[226886]: 2026-01-20 14:27:25.397 226890 WARNING nova.virt.libvirt.driver [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:27:25 np0005588920 nova_compute[226886]: 2026-01-20 14:27:25.408 226890 DEBUG nova.virt.libvirt.host [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:27:25 np0005588920 nova_compute[226886]: 2026-01-20 14:27:25.409 226890 DEBUG nova.virt.libvirt.host [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:27:25 np0005588920 nova_compute[226886]: 2026-01-20 14:27:25.413 226890 DEBUG nova.virt.libvirt.host [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:27:25 np0005588920 nova_compute[226886]: 2026-01-20 14:27:25.414 226890 DEBUG nova.virt.libvirt.host [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:27:25 np0005588920 nova_compute[226886]: 2026-01-20 14:27:25.416 226890 DEBUG nova.virt.libvirt.driver [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:27:25 np0005588920 nova_compute[226886]: 2026-01-20 14:27:25.417 226890 DEBUG nova.virt.hardware [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:27:25 np0005588920 nova_compute[226886]: 2026-01-20 14:27:25.418 226890 DEBUG nova.virt.hardware [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:27:25 np0005588920 nova_compute[226886]: 2026-01-20 14:27:25.418 226890 DEBUG nova.virt.hardware [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:27:25 np0005588920 nova_compute[226886]: 2026-01-20 14:27:25.419 226890 DEBUG nova.virt.hardware [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:27:25 np0005588920 nova_compute[226886]: 2026-01-20 14:27:25.420 226890 DEBUG nova.virt.hardware [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:27:25 np0005588920 nova_compute[226886]: 2026-01-20 14:27:25.420 226890 DEBUG nova.virt.hardware [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:27:25 np0005588920 nova_compute[226886]: 2026-01-20 14:27:25.421 226890 DEBUG nova.virt.hardware [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:27:25 np0005588920 nova_compute[226886]: 2026-01-20 14:27:25.422 226890 DEBUG nova.virt.hardware [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:27:25 np0005588920 nova_compute[226886]: 2026-01-20 14:27:25.422 226890 DEBUG nova.virt.hardware [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:27:25 np0005588920 nova_compute[226886]: 2026-01-20 14:27:25.423 226890 DEBUG nova.virt.hardware [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:27:25 np0005588920 nova_compute[226886]: 2026-01-20 14:27:25.423 226890 DEBUG nova.virt.hardware [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:27:25 np0005588920 nova_compute[226886]: 2026-01-20 14:27:25.429 226890 DEBUG oslo_concurrency.processutils [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:27:25 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:27:25 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1257689580' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:27:25 np0005588920 nova_compute[226886]: 2026-01-20 14:27:25.874 226890 DEBUG oslo_concurrency.processutils [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:27:25 np0005588920 nova_compute[226886]: 2026-01-20 14:27:25.912 226890 DEBUG nova.storage.rbd_utils [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] rbd image 665a5c82-dfcf-43d5-9d42-1f3046305b5b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:27:25 np0005588920 nova_compute[226886]: 2026-01-20 14:27:25.918 226890 DEBUG oslo_concurrency.processutils [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:27:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:26.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:27:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:26.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:27:26 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:27:26 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3520391376' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:27:26 np0005588920 nova_compute[226886]: 2026-01-20 14:27:26.403 226890 DEBUG oslo_concurrency.processutils [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.485s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:27:26 np0005588920 nova_compute[226886]: 2026-01-20 14:27:26.405 226890 DEBUG nova.objects.instance [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Lazy-loading 'pci_devices' on Instance uuid 665a5c82-dfcf-43d5-9d42-1f3046305b5b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:27:26 np0005588920 nova_compute[226886]: 2026-01-20 14:27:26.494 226890 DEBUG nova.virt.libvirt.driver [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:27:26 np0005588920 nova_compute[226886]:  <uuid>665a5c82-dfcf-43d5-9d42-1f3046305b5b</uuid>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:  <name>instance-00000012</name>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:27:26 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:      <nova:name>tempest-DeleteServersAdminTestJSON-server-568479015</nova:name>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:27:25</nova:creationTime>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:27:26 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:        <nova:user uuid="7dfe92a3bd9c43bbb3d4590ff5a57173">tempest-DeleteServersAdminTestJSON-834962290-project-member</nova:user>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:        <nova:project uuid="e3cc4a74e92f45a1811137175eae3af2">tempest-DeleteServersAdminTestJSON-834962290</nova:project>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:      <nova:ports/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:      <entry name="serial">665a5c82-dfcf-43d5-9d42-1f3046305b5b</entry>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:      <entry name="uuid">665a5c82-dfcf-43d5-9d42-1f3046305b5b</entry>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:27:26 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/665a5c82-dfcf-43d5-9d42-1f3046305b5b_disk">
Jan 20 09:27:26 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:27:26 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:27:26 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/665a5c82-dfcf-43d5-9d42-1f3046305b5b_disk.config">
Jan 20 09:27:26 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:27:26 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:27:26 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/665a5c82-dfcf-43d5-9d42-1f3046305b5b/console.log" append="off"/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:27:26 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:27:26 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:27:26 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:27:26 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:27:26 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
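The domain XML dumped above by `_get_guest_xml` describes two RBD-backed network disks (the root disk on `vda`/virtio and the config drive on `sda`/sata), each listing the Ceph monitor addresses. A minimal sketch of pulling those disk sources out of such an XML with the standard library; the embedded snippet is a trimmed, hypothetical stand-in for the logged `<devices>` section, not the exact dump:

```python
# Sketch: extract RBD disk sources from a libvirt domain XML like the one
# Nova logged above. The XML here is an illustrative, trimmed stand-in.
import xml.etree.ElementTree as ET

DOMAIN_XML = """
<domain type="kvm">
  <devices>
    <disk type="network" device="disk">
      <source protocol="rbd" name="vms/665a5c82-dfcf-43d5-9d42-1f3046305b5b_disk">
        <host name="192.168.122.100" port="6789"/>
        <host name="192.168.122.101" port="6789"/>
      </source>
      <target dev="vda" bus="virtio"/>
    </disk>
    <disk type="network" device="cdrom">
      <source protocol="rbd" name="vms/665a5c82-dfcf-43d5-9d42-1f3046305b5b_disk.config">
        <host name="192.168.122.100" port="6789"/>
      </source>
      <target dev="sda" bus="sata"/>
    </disk>
  </devices>
</domain>
"""

def rbd_sources(xml_text):
    """Return (target_dev, rbd_image, [monitor addresses]) per network disk."""
    root = ET.fromstring(xml_text)
    result = []
    for disk in root.findall("./devices/disk[@type='network']"):
        src = disk.find("source")
        tgt = disk.find("target")
        mons = ["%s:%s" % (h.get("name"), h.get("port"))
                for h in src.findall("host")]
        result.append((tgt.get("dev"), src.get("name"), mons))
    return result

for dev, image, mons in rbd_sources(DOMAIN_XML):
    print(dev, image, mons)
```

The same `<source protocol="rbd">`/`<host>` shape appears for both disks in the log, so one XPath over `disk[@type='network']` covers them.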
Jan 20 09:27:26 np0005588920 nova_compute[226886]: 2026-01-20 14:27:26.658 226890 DEBUG nova.virt.libvirt.driver [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:27:26 np0005588920 nova_compute[226886]: 2026-01-20 14:27:26.658 226890 DEBUG nova.virt.libvirt.driver [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:27:26 np0005588920 nova_compute[226886]: 2026-01-20 14:27:26.659 226890 INFO nova.virt.libvirt.driver [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] Using config drive#033[00m
Jan 20 09:27:26 np0005588920 nova_compute[226886]: 2026-01-20 14:27:26.685 226890 DEBUG nova.storage.rbd_utils [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] rbd image 665a5c82-dfcf-43d5-9d42-1f3046305b5b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:27:27 np0005588920 nova_compute[226886]: 2026-01-20 14:27:27.279 226890 INFO nova.virt.libvirt.driver [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] Creating config drive at /var/lib/nova/instances/665a5c82-dfcf-43d5-9d42-1f3046305b5b/disk.config#033[00m
Jan 20 09:27:27 np0005588920 nova_compute[226886]: 2026-01-20 14:27:27.289 226890 DEBUG oslo_concurrency.processutils [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/665a5c82-dfcf-43d5-9d42-1f3046305b5b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxjgpexf9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:27:27 np0005588920 nova_compute[226886]: 2026-01-20 14:27:27.419 226890 DEBUG oslo_concurrency.processutils [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/665a5c82-dfcf-43d5-9d42-1f3046305b5b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxjgpexf9" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:27:27 np0005588920 nova_compute[226886]: 2026-01-20 14:27:27.464 226890 DEBUG nova.storage.rbd_utils [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] rbd image 665a5c82-dfcf-43d5-9d42-1f3046305b5b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:27:27 np0005588920 nova_compute[226886]: 2026-01-20 14:27:27.470 226890 DEBUG oslo_concurrency.processutils [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/665a5c82-dfcf-43d5-9d42-1f3046305b5b/disk.config 665a5c82-dfcf-43d5-9d42-1f3046305b5b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:27:27 np0005588920 nova_compute[226886]: 2026-01-20 14:27:27.590 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:28.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:28.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:28 np0005588920 nova_compute[226886]: 2026-01-20 14:27:28.300 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:28 np0005588920 nova_compute[226886]: 2026-01-20 14:27:28.542 226890 DEBUG oslo_concurrency.processutils [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/665a5c82-dfcf-43d5-9d42-1f3046305b5b/disk.config 665a5c82-dfcf-43d5-9d42-1f3046305b5b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:27:28 np0005588920 nova_compute[226886]: 2026-01-20 14:27:28.544 226890 INFO nova.virt.libvirt.driver [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] Deleting local config drive /var/lib/nova/instances/665a5c82-dfcf-43d5-9d42-1f3046305b5b/disk.config because it was imported into RBD.#033[00m
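The config-drive episode above is a two-step flow: `mkisofs` builds an ISO9660 image with volume label `config-2` from a temporary metadata directory, then `rbd import` pushes it into the `vms` pool as `<uuid>_disk.config`, after which the local copy is deleted. A sketch that only constructs those two command lines (no execution); paths, the source directory, and the publisher string are illustrative, mirroring the logged invocations:

```python
# Sketch of the config-drive flow seen in the log: build an ISO9660 image
# labelled "config-2", then import it into the Ceph "vms" pool.
# Command construction only; arguments mirror the logged invocations.

def mkisofs_cmd(iso_path, src_dir, publisher="OpenStack Compute"):
    # Flag set copied from the logged /usr/bin/mkisofs call.
    return ["/usr/bin/mkisofs", "-o", iso_path, "-ldots", "-allow-lowercase",
            "-allow-multidot", "-l", "-publisher", publisher,
            "-quiet", "-J", "-r", "-V", "config-2", src_dir]

def rbd_import_cmd(iso_path, image_name, pool="vms", user="openstack"):
    # Matches the logged "rbd import" call (format 2, cephx id "openstack").
    return ["rbd", "import", "--pool", pool, iso_path, image_name,
            "--image-format=2", "--id", user, "--conf", "/etc/ceph/ceph.conf"]

uuid = "665a5c82-dfcf-43d5-9d42-1f3046305b5b"
iso = "/var/lib/nova/instances/%s/disk.config" % uuid
print(mkisofs_cmd(iso, "/tmp/metadata"))
print(rbd_import_cmd(iso, "%s_disk.config" % uuid))
```

Because the guest's config-drive `<disk>` in the domain XML already points at `vms/<uuid>_disk.config` over RBD, the import must finish before the guest starts, which is why the local file is deleted only after `rbd import` returns 0.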
Jan 20 09:27:28 np0005588920 systemd-machined[196121]: New machine qemu-8-instance-00000012.
Jan 20 09:27:28 np0005588920 systemd[1]: Started Virtual Machine qemu-8-instance-00000012.
Jan 20 09:27:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:27:29 np0005588920 nova_compute[226886]: 2026-01-20 14:27:29.612 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919249.6119936, 665a5c82-dfcf-43d5-9d42-1f3046305b5b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:27:29 np0005588920 nova_compute[226886]: 2026-01-20 14:27:29.612 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:27:29 np0005588920 nova_compute[226886]: 2026-01-20 14:27:29.615 226890 DEBUG nova.compute.manager [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:27:29 np0005588920 nova_compute[226886]: 2026-01-20 14:27:29.615 226890 DEBUG nova.virt.libvirt.driver [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:27:29 np0005588920 nova_compute[226886]: 2026-01-20 14:27:29.619 226890 INFO nova.virt.libvirt.driver [-] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] Instance spawned successfully.#033[00m
Jan 20 09:27:29 np0005588920 nova_compute[226886]: 2026-01-20 14:27:29.619 226890 DEBUG nova.virt.libvirt.driver [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:27:29 np0005588920 nova_compute[226886]: 2026-01-20 14:27:29.635 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:27:29 np0005588920 nova_compute[226886]: 2026-01-20 14:27:29.642 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:27:29 np0005588920 nova_compute[226886]: 2026-01-20 14:27:29.647 226890 DEBUG nova.virt.libvirt.driver [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:27:29 np0005588920 nova_compute[226886]: 2026-01-20 14:27:29.648 226890 DEBUG nova.virt.libvirt.driver [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:27:29 np0005588920 nova_compute[226886]: 2026-01-20 14:27:29.648 226890 DEBUG nova.virt.libvirt.driver [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:27:29 np0005588920 nova_compute[226886]: 2026-01-20 14:27:29.649 226890 DEBUG nova.virt.libvirt.driver [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:27:29 np0005588920 nova_compute[226886]: 2026-01-20 14:27:29.650 226890 DEBUG nova.virt.libvirt.driver [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:27:29 np0005588920 nova_compute[226886]: 2026-01-20 14:27:29.650 226890 DEBUG nova.virt.libvirt.driver [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:27:29 np0005588920 nova_compute[226886]: 2026-01-20 14:27:29.698 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:27:29 np0005588920 nova_compute[226886]: 2026-01-20 14:27:29.699 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919249.6147325, 665a5c82-dfcf-43d5-9d42-1f3046305b5b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:27:29 np0005588920 nova_compute[226886]: 2026-01-20 14:27:29.699 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] VM Started (Lifecycle Event)#033[00m
Jan 20 09:27:29 np0005588920 nova_compute[226886]: 2026-01-20 14:27:29.728 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:27:29 np0005588920 nova_compute[226886]: 2026-01-20 14:27:29.732 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:27:29 np0005588920 nova_compute[226886]: 2026-01-20 14:27:29.763 226890 INFO nova.compute.manager [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] Took 9.11 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:27:29 np0005588920 nova_compute[226886]: 2026-01-20 14:27:29.764 226890 DEBUG nova.compute.manager [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:27:29 np0005588920 nova_compute[226886]: 2026-01-20 14:27:29.765 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:27:29 np0005588920 nova_compute[226886]: 2026-01-20 14:27:29.846 226890 INFO nova.compute.manager [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] Took 10.99 seconds to build instance.#033[00m
Jan 20 09:27:29 np0005588920 nova_compute[226886]: 2026-01-20 14:27:29.918 226890 DEBUG oslo_concurrency.lockutils [None req-24c814e0-9551-4463-ae2e-f8ad629fc353 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Lock "665a5c82-dfcf-43d5-9d42-1f3046305b5b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.268s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
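The `handle_lifecycle_event` lines above show Nova comparing the DB `power_state` (0) with the hypervisor's (1) after the "Resumed" and "Started" events, and deciding to skip the sync because the instance still has a pending task (`spawning`). A simplified stand-in for that decision, not Nova's actual `_sync_instance_power_state`; the constants follow the `nova.compute.power_state` numbering visible in the log (0 = NOSTATE, 1 = RUNNING):

```python
# Sketch of the power-state sync decision in the log: skip while a task is
# pending, otherwise reconcile the DB with the hypervisor. Simplified
# stand-in for Nova's _sync_instance_power_state.
NOSTATE, RUNNING = 0, 1

def sync_power_state(db_state, vm_state, task_state):
    if task_state is not None:
        # Matches "During sync_power_state the instance has a pending
        # task (spawning). Skip." in the log.
        return "skip: pending task %s" % task_state
    if db_state != vm_state:
        return "update DB to %d" % vm_state
    return "in sync"

print(sync_power_state(NOSTATE, RUNNING, "spawning"))
```

This is why both lifecycle events in the log end in a "Skip": the spawn path itself is about to record the final state, so the event-driven sync must not race it.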
Jan 20 09:27:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:30.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:30.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:30 np0005588920 nova_compute[226886]: 2026-01-20 14:27:30.833 226890 DEBUG oslo_concurrency.lockutils [None req-dab33856-0072-43f2-8d81-9944a833af9a fa0d918445034ec398a8a856d02803b8 bf17e8c0297a4d0bbeb73dc9125b2f08 - - default default] Acquiring lock "665a5c82-dfcf-43d5-9d42-1f3046305b5b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:27:30 np0005588920 nova_compute[226886]: 2026-01-20 14:27:30.834 226890 DEBUG oslo_concurrency.lockutils [None req-dab33856-0072-43f2-8d81-9944a833af9a fa0d918445034ec398a8a856d02803b8 bf17e8c0297a4d0bbeb73dc9125b2f08 - - default default] Lock "665a5c82-dfcf-43d5-9d42-1f3046305b5b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:27:30 np0005588920 nova_compute[226886]: 2026-01-20 14:27:30.835 226890 DEBUG oslo_concurrency.lockutils [None req-dab33856-0072-43f2-8d81-9944a833af9a fa0d918445034ec398a8a856d02803b8 bf17e8c0297a4d0bbeb73dc9125b2f08 - - default default] Acquiring lock "665a5c82-dfcf-43d5-9d42-1f3046305b5b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:27:30 np0005588920 nova_compute[226886]: 2026-01-20 14:27:30.835 226890 DEBUG oslo_concurrency.lockutils [None req-dab33856-0072-43f2-8d81-9944a833af9a fa0d918445034ec398a8a856d02803b8 bf17e8c0297a4d0bbeb73dc9125b2f08 - - default default] Lock "665a5c82-dfcf-43d5-9d42-1f3046305b5b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:27:30 np0005588920 nova_compute[226886]: 2026-01-20 14:27:30.835 226890 DEBUG oslo_concurrency.lockutils [None req-dab33856-0072-43f2-8d81-9944a833af9a fa0d918445034ec398a8a856d02803b8 bf17e8c0297a4d0bbeb73dc9125b2f08 - - default default] Lock "665a5c82-dfcf-43d5-9d42-1f3046305b5b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
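The `oslo_concurrency.lockutils` lines above serialize all operations on one instance behind a lock named by its UUID, logging how long each caller waited and held it (the earlier build held it 11.268s; terminate then acquires it after waiting 0.001s). A minimal sketch of that pattern under the assumption of in-process threading locks; real lockutils also supports file-based external locks, which this stand-in omits:

```python
# Sketch of per-instance serialization behind the lockutils messages above:
# a named lock guards the whole operation, with wait/held times reported.
# Simplified in-process stand-in for oslo_concurrency.lockutils.
import threading
import time

_locks = {}

def synchronized(name, fn, *args):
    lock = _locks.setdefault(name, threading.Lock())
    t0 = time.monotonic()
    with lock:
        print('Lock "%s" acquired :: waited %.3fs'
              % (name, time.monotonic() - t0))
        t1 = time.monotonic()
        try:
            return fn(*args)
        finally:
            print('Lock "%s" released :: held %.3fs'
                  % (name, time.monotonic() - t1))

# Hypothetical usage: terminate waits until the build path releases the lock.
result = synchronized("665a5c82-dfcf-43d5-9d42-1f3046305b5b",
                      lambda: "terminated")
```

Naming the lock by instance UUID is what makes a concurrent delete (the `do_terminate_instance` request here) queue behind `_locked_do_build_and_run_instance` instead of racing it.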
Jan 20 09:27:30 np0005588920 nova_compute[226886]: 2026-01-20 14:27:30.837 226890 INFO nova.compute.manager [None req-dab33856-0072-43f2-8d81-9944a833af9a fa0d918445034ec398a8a856d02803b8 bf17e8c0297a4d0bbeb73dc9125b2f08 - - default default] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] Terminating instance#033[00m
Jan 20 09:27:30 np0005588920 nova_compute[226886]: 2026-01-20 14:27:30.838 226890 DEBUG oslo_concurrency.lockutils [None req-dab33856-0072-43f2-8d81-9944a833af9a fa0d918445034ec398a8a856d02803b8 bf17e8c0297a4d0bbeb73dc9125b2f08 - - default default] Acquiring lock "refresh_cache-665a5c82-dfcf-43d5-9d42-1f3046305b5b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:27:30 np0005588920 nova_compute[226886]: 2026-01-20 14:27:30.838 226890 DEBUG oslo_concurrency.lockutils [None req-dab33856-0072-43f2-8d81-9944a833af9a fa0d918445034ec398a8a856d02803b8 bf17e8c0297a4d0bbeb73dc9125b2f08 - - default default] Acquired lock "refresh_cache-665a5c82-dfcf-43d5-9d42-1f3046305b5b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:27:30 np0005588920 nova_compute[226886]: 2026-01-20 14:27:30.838 226890 DEBUG nova.network.neutron [None req-dab33856-0072-43f2-8d81-9944a833af9a fa0d918445034ec398a8a856d02803b8 bf17e8c0297a4d0bbeb73dc9125b2f08 - - default default] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:27:30 np0005588920 nova_compute[226886]: 2026-01-20 14:27:30.986 226890 DEBUG nova.network.neutron [None req-dab33856-0072-43f2-8d81-9944a833af9a fa0d918445034ec398a8a856d02803b8 bf17e8c0297a4d0bbeb73dc9125b2f08 - - default default] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:27:31 np0005588920 podman[234338]: 2026-01-20 14:27:31.066541763 +0000 UTC m=+0.148355417 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 20 09:27:31 np0005588920 nova_compute[226886]: 2026-01-20 14:27:31.273 226890 DEBUG nova.network.neutron [None req-dab33856-0072-43f2-8d81-9944a833af9a fa0d918445034ec398a8a856d02803b8 bf17e8c0297a4d0bbeb73dc9125b2f08 - - default default] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:27:31 np0005588920 nova_compute[226886]: 2026-01-20 14:27:31.374 226890 DEBUG oslo_concurrency.lockutils [None req-dab33856-0072-43f2-8d81-9944a833af9a fa0d918445034ec398a8a856d02803b8 bf17e8c0297a4d0bbeb73dc9125b2f08 - - default default] Releasing lock "refresh_cache-665a5c82-dfcf-43d5-9d42-1f3046305b5b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:27:31 np0005588920 nova_compute[226886]: 2026-01-20 14:27:31.375 226890 DEBUG nova.compute.manager [None req-dab33856-0072-43f2-8d81-9944a833af9a fa0d918445034ec398a8a856d02803b8 bf17e8c0297a4d0bbeb73dc9125b2f08 - - default default] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:27:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:32.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:32.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:32 np0005588920 nova_compute[226886]: 2026-01-20 14:27:32.596 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:32 np0005588920 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000012.scope: Deactivated successfully.
Jan 20 09:27:32 np0005588920 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000012.scope: Consumed 2.248s CPU time.
Jan 20 09:27:32 np0005588920 systemd-machined[196121]: Machine qemu-8-instance-00000012 terminated.
Jan 20 09:27:33 np0005588920 nova_compute[226886]: 2026-01-20 14:27:33.004 226890 INFO nova.virt.libvirt.driver [-] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] Instance destroyed successfully.#033[00m
Jan 20 09:27:33 np0005588920 nova_compute[226886]: 2026-01-20 14:27:33.005 226890 DEBUG nova.objects.instance [None req-dab33856-0072-43f2-8d81-9944a833af9a fa0d918445034ec398a8a856d02803b8 bf17e8c0297a4d0bbeb73dc9125b2f08 - - default default] Lazy-loading 'resources' on Instance uuid 665a5c82-dfcf-43d5-9d42-1f3046305b5b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:27:33 np0005588920 nova_compute[226886]: 2026-01-20 14:27:33.303 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:33 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:27:33 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3043623211' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:27:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:34.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:27:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:34.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:27:34 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:27:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:36.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:27:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:36.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:27:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e152 e152: 3 total, 3 up, 3 in
Jan 20 09:27:37 np0005588920 nova_compute[226886]: 2026-01-20 14:27:37.599 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:38 np0005588920 nova_compute[226886]: 2026-01-20 14:27:38.038 226890 INFO nova.virt.libvirt.driver [None req-dab33856-0072-43f2-8d81-9944a833af9a fa0d918445034ec398a8a856d02803b8 bf17e8c0297a4d0bbeb73dc9125b2f08 - - default default] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] Deleting instance files /var/lib/nova/instances/665a5c82-dfcf-43d5-9d42-1f3046305b5b_del#033[00m
Jan 20 09:27:38 np0005588920 nova_compute[226886]: 2026-01-20 14:27:38.039 226890 INFO nova.virt.libvirt.driver [None req-dab33856-0072-43f2-8d81-9944a833af9a fa0d918445034ec398a8a856d02803b8 bf17e8c0297a4d0bbeb73dc9125b2f08 - - default default] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] Deletion of /var/lib/nova/instances/665a5c82-dfcf-43d5-9d42-1f3046305b5b_del complete#033[00m
Jan 20 09:27:38 np0005588920 nova_compute[226886]: 2026-01-20 14:27:38.123 226890 INFO nova.compute.manager [None req-dab33856-0072-43f2-8d81-9944a833af9a fa0d918445034ec398a8a856d02803b8 bf17e8c0297a4d0bbeb73dc9125b2f08 - - default default] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] Took 6.75 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:27:38 np0005588920 nova_compute[226886]: 2026-01-20 14:27:38.124 226890 DEBUG oslo.service.loopingcall [None req-dab33856-0072-43f2-8d81-9944a833af9a fa0d918445034ec398a8a856d02803b8 bf17e8c0297a4d0bbeb73dc9125b2f08 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:27:38 np0005588920 nova_compute[226886]: 2026-01-20 14:27:38.124 226890 DEBUG nova.compute.manager [-] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:27:38 np0005588920 nova_compute[226886]: 2026-01-20 14:27:38.124 226890 DEBUG nova.network.neutron [-] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:27:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:38.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:38 np0005588920 nova_compute[226886]: 2026-01-20 14:27:38.259 226890 DEBUG nova.network.neutron [-] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:27:38 np0005588920 nova_compute[226886]: 2026-01-20 14:27:38.274 226890 DEBUG nova.network.neutron [-] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:27:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:38.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:38 np0005588920 nova_compute[226886]: 2026-01-20 14:27:38.304 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:38 np0005588920 nova_compute[226886]: 2026-01-20 14:27:38.317 226890 INFO nova.compute.manager [-] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] Took 0.19 seconds to deallocate network for instance.#033[00m
Jan 20 09:27:38 np0005588920 nova_compute[226886]: 2026-01-20 14:27:38.394 226890 DEBUG oslo_concurrency.lockutils [None req-dab33856-0072-43f2-8d81-9944a833af9a fa0d918445034ec398a8a856d02803b8 bf17e8c0297a4d0bbeb73dc9125b2f08 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:27:38 np0005588920 nova_compute[226886]: 2026-01-20 14:27:38.394 226890 DEBUG oslo_concurrency.lockutils [None req-dab33856-0072-43f2-8d81-9944a833af9a fa0d918445034ec398a8a856d02803b8 bf17e8c0297a4d0bbeb73dc9125b2f08 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:27:38 np0005588920 nova_compute[226886]: 2026-01-20 14:27:38.456 226890 DEBUG oslo_concurrency.processutils [None req-dab33856-0072-43f2-8d81-9944a833af9a fa0d918445034ec398a8a856d02803b8 bf17e8c0297a4d0bbeb73dc9125b2f08 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:27:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:27:38.571 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:27:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:27:38.572 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:27:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:27:38.573 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:27:38 np0005588920 nova_compute[226886]: 2026-01-20 14:27:38.619 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:38 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:27:38 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/817646247' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:27:38 np0005588920 nova_compute[226886]: 2026-01-20 14:27:38.938 226890 DEBUG oslo_concurrency.processutils [None req-dab33856-0072-43f2-8d81-9944a833af9a fa0d918445034ec398a8a856d02803b8 bf17e8c0297a4d0bbeb73dc9125b2f08 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:27:38 np0005588920 nova_compute[226886]: 2026-01-20 14:27:38.945 226890 DEBUG nova.compute.provider_tree [None req-dab33856-0072-43f2-8d81-9944a833af9a fa0d918445034ec398a8a856d02803b8 bf17e8c0297a4d0bbeb73dc9125b2f08 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:27:38 np0005588920 nova_compute[226886]: 2026-01-20 14:27:38.967 226890 DEBUG nova.scheduler.client.report [None req-dab33856-0072-43f2-8d81-9944a833af9a fa0d918445034ec398a8a856d02803b8 bf17e8c0297a4d0bbeb73dc9125b2f08 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:27:38 np0005588920 nova_compute[226886]: 2026-01-20 14:27:38.995 226890 DEBUG oslo_concurrency.lockutils [None req-dab33856-0072-43f2-8d81-9944a833af9a fa0d918445034ec398a8a856d02803b8 bf17e8c0297a4d0bbeb73dc9125b2f08 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.601s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:27:39 np0005588920 nova_compute[226886]: 2026-01-20 14:27:39.032 226890 INFO nova.scheduler.client.report [None req-dab33856-0072-43f2-8d81-9944a833af9a fa0d918445034ec398a8a856d02803b8 bf17e8c0297a4d0bbeb73dc9125b2f08 - - default default] Deleted allocations for instance 665a5c82-dfcf-43d5-9d42-1f3046305b5b#033[00m
Jan 20 09:27:39 np0005588920 nova_compute[226886]: 2026-01-20 14:27:39.169 226890 DEBUG oslo_concurrency.lockutils [None req-dab33856-0072-43f2-8d81-9944a833af9a fa0d918445034ec398a8a856d02803b8 bf17e8c0297a4d0bbeb73dc9125b2f08 - - default default] Lock "665a5c82-dfcf-43d5-9d42-1f3046305b5b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 8.335s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:27:39 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:27:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:40.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:40.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:41 np0005588920 ovn_controller[133971]: 2026-01-20T14:27:41Z|00082|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 20 09:27:42 np0005588920 nova_compute[226886]: 2026-01-20 14:27:42.054 226890 DEBUG oslo_concurrency.lockutils [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Acquiring lock "8aa2f978-353e-4f81-bd81-2c446cf4f5f6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:27:42 np0005588920 nova_compute[226886]: 2026-01-20 14:27:42.055 226890 DEBUG oslo_concurrency.lockutils [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Lock "8aa2f978-353e-4f81-bd81-2c446cf4f5f6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:27:42 np0005588920 nova_compute[226886]: 2026-01-20 14:27:42.077 226890 DEBUG nova.compute.manager [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:27:42 np0005588920 nova_compute[226886]: 2026-01-20 14:27:42.152 226890 DEBUG oslo_concurrency.lockutils [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:27:42 np0005588920 nova_compute[226886]: 2026-01-20 14:27:42.153 226890 DEBUG oslo_concurrency.lockutils [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:27:42 np0005588920 nova_compute[226886]: 2026-01-20 14:27:42.161 226890 DEBUG nova.virt.hardware [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:27:42 np0005588920 nova_compute[226886]: 2026-01-20 14:27:42.161 226890 INFO nova.compute.claims [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 09:27:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:42.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:42 np0005588920 nova_compute[226886]: 2026-01-20 14:27:42.286 226890 DEBUG oslo_concurrency.processutils [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:27:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:42.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:42 np0005588920 nova_compute[226886]: 2026-01-20 14:27:42.602 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:27:42 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2865709230' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:27:42 np0005588920 nova_compute[226886]: 2026-01-20 14:27:42.703 226890 DEBUG oslo_concurrency.processutils [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.417s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:27:42 np0005588920 nova_compute[226886]: 2026-01-20 14:27:42.711 226890 DEBUG nova.compute.provider_tree [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:27:42 np0005588920 nova_compute[226886]: 2026-01-20 14:27:42.730 226890 DEBUG nova.scheduler.client.report [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:27:42 np0005588920 nova_compute[226886]: 2026-01-20 14:27:42.759 226890 DEBUG oslo_concurrency.lockutils [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.606s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:27:42 np0005588920 nova_compute[226886]: 2026-01-20 14:27:42.760 226890 DEBUG nova.compute.manager [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:27:42 np0005588920 nova_compute[226886]: 2026-01-20 14:27:42.867 226890 DEBUG nova.compute.manager [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:27:42 np0005588920 nova_compute[226886]: 2026-01-20 14:27:42.868 226890 DEBUG nova.network.neutron [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:27:42 np0005588920 nova_compute[226886]: 2026-01-20 14:27:42.904 226890 INFO nova.virt.libvirt.driver [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:27:42 np0005588920 nova_compute[226886]: 2026-01-20 14:27:42.930 226890 DEBUG nova.compute.manager [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:27:43 np0005588920 nova_compute[226886]: 2026-01-20 14:27:43.075 226890 DEBUG nova.compute.manager [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:27:43 np0005588920 nova_compute[226886]: 2026-01-20 14:27:43.077 226890 DEBUG nova.virt.libvirt.driver [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:27:43 np0005588920 nova_compute[226886]: 2026-01-20 14:27:43.078 226890 INFO nova.virt.libvirt.driver [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] Creating image(s)#033[00m
Jan 20 09:27:43 np0005588920 nova_compute[226886]: 2026-01-20 14:27:43.106 226890 DEBUG nova.storage.rbd_utils [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] rbd image 8aa2f978-353e-4f81-bd81-2c446cf4f5f6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:27:43 np0005588920 nova_compute[226886]: 2026-01-20 14:27:43.138 226890 DEBUG nova.storage.rbd_utils [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] rbd image 8aa2f978-353e-4f81-bd81-2c446cf4f5f6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:27:43 np0005588920 nova_compute[226886]: 2026-01-20 14:27:43.169 226890 DEBUG nova.storage.rbd_utils [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] rbd image 8aa2f978-353e-4f81-bd81-2c446cf4f5f6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:27:43 np0005588920 nova_compute[226886]: 2026-01-20 14:27:43.173 226890 DEBUG oslo_concurrency.processutils [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:27:43 np0005588920 nova_compute[226886]: 2026-01-20 14:27:43.249 226890 DEBUG nova.network.neutron [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Jan 20 09:27:43 np0005588920 nova_compute[226886]: 2026-01-20 14:27:43.250 226890 DEBUG nova.compute.manager [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:27:43 np0005588920 nova_compute[226886]: 2026-01-20 14:27:43.260 226890 DEBUG oslo_concurrency.processutils [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:27:43 np0005588920 nova_compute[226886]: 2026-01-20 14:27:43.261 226890 DEBUG oslo_concurrency.lockutils [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:27:43 np0005588920 nova_compute[226886]: 2026-01-20 14:27:43.262 226890 DEBUG oslo_concurrency.lockutils [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:27:43 np0005588920 nova_compute[226886]: 2026-01-20 14:27:43.262 226890 DEBUG oslo_concurrency.lockutils [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:27:43 np0005588920 nova_compute[226886]: 2026-01-20 14:27:43.286 226890 DEBUG nova.storage.rbd_utils [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] rbd image 8aa2f978-353e-4f81-bd81-2c446cf4f5f6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:27:43 np0005588920 nova_compute[226886]: 2026-01-20 14:27:43.290 226890 DEBUG oslo_concurrency.processutils [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 8aa2f978-353e-4f81-bd81-2c446cf4f5f6_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:27:43 np0005588920 nova_compute[226886]: 2026-01-20 14:27:43.311 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:43 np0005588920 nova_compute[226886]: 2026-01-20 14:27:43.590 226890 DEBUG oslo_concurrency.processutils [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 8aa2f978-353e-4f81-bd81-2c446cf4f5f6_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.300s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:27:43 np0005588920 nova_compute[226886]: 2026-01-20 14:27:43.701 226890 DEBUG nova.storage.rbd_utils [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] resizing rbd image 8aa2f978-353e-4f81-bd81-2c446cf4f5f6_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:27:43 np0005588920 nova_compute[226886]: 2026-01-20 14:27:43.820 226890 DEBUG nova.objects.instance [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Lazy-loading 'migration_context' on Instance uuid 8aa2f978-353e-4f81-bd81-2c446cf4f5f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:27:43 np0005588920 nova_compute[226886]: 2026-01-20 14:27:43.970 226890 DEBUG nova.virt.libvirt.driver [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:27:43 np0005588920 nova_compute[226886]: 2026-01-20 14:27:43.971 226890 DEBUG nova.virt.libvirt.driver [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] Ensure instance console log exists: /var/lib/nova/instances/8aa2f978-353e-4f81-bd81-2c446cf4f5f6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:27:43 np0005588920 nova_compute[226886]: 2026-01-20 14:27:43.971 226890 DEBUG oslo_concurrency.lockutils [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:27:43 np0005588920 nova_compute[226886]: 2026-01-20 14:27:43.972 226890 DEBUG oslo_concurrency.lockutils [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:27:43 np0005588920 nova_compute[226886]: 2026-01-20 14:27:43.972 226890 DEBUG oslo_concurrency.lockutils [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:27:43 np0005588920 nova_compute[226886]: 2026-01-20 14:27:43.975 226890 DEBUG nova.virt.libvirt.driver [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:27:43 np0005588920 nova_compute[226886]: 2026-01-20 14:27:43.981 226890 WARNING nova.virt.libvirt.driver [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:27:43 np0005588920 nova_compute[226886]: 2026-01-20 14:27:43.987 226890 DEBUG nova.virt.libvirt.host [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:27:43 np0005588920 nova_compute[226886]: 2026-01-20 14:27:43.988 226890 DEBUG nova.virt.libvirt.host [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:27:43 np0005588920 nova_compute[226886]: 2026-01-20 14:27:43.991 226890 DEBUG nova.virt.libvirt.host [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:27:43 np0005588920 nova_compute[226886]: 2026-01-20 14:27:43.992 226890 DEBUG nova.virt.libvirt.host [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:27:43 np0005588920 nova_compute[226886]: 2026-01-20 14:27:43.995 226890 DEBUG nova.virt.libvirt.driver [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:27:43 np0005588920 nova_compute[226886]: 2026-01-20 14:27:43.995 226890 DEBUG nova.virt.hardware [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:27:43 np0005588920 nova_compute[226886]: 2026-01-20 14:27:43.996 226890 DEBUG nova.virt.hardware [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:27:43 np0005588920 nova_compute[226886]: 2026-01-20 14:27:43.997 226890 DEBUG nova.virt.hardware [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:27:43 np0005588920 nova_compute[226886]: 2026-01-20 14:27:43.997 226890 DEBUG nova.virt.hardware [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:27:43 np0005588920 nova_compute[226886]: 2026-01-20 14:27:43.997 226890 DEBUG nova.virt.hardware [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:27:43 np0005588920 nova_compute[226886]: 2026-01-20 14:27:43.998 226890 DEBUG nova.virt.hardware [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:27:43 np0005588920 nova_compute[226886]: 2026-01-20 14:27:43.998 226890 DEBUG nova.virt.hardware [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:27:43 np0005588920 nova_compute[226886]: 2026-01-20 14:27:43.999 226890 DEBUG nova.virt.hardware [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:27:43 np0005588920 nova_compute[226886]: 2026-01-20 14:27:43.999 226890 DEBUG nova.virt.hardware [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:27:44 np0005588920 nova_compute[226886]: 2026-01-20 14:27:43.999 226890 DEBUG nova.virt.hardware [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:27:44 np0005588920 nova_compute[226886]: 2026-01-20 14:27:44.000 226890 DEBUG nova.virt.hardware [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:27:44 np0005588920 nova_compute[226886]: 2026-01-20 14:27:44.005 226890 DEBUG oslo_concurrency.processutils [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:27:44 np0005588920 podman[234596]: 2026-01-20 14:27:44.011810741 +0000 UTC m=+0.088229322 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 20 09:27:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:44.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:27:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:44.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:27:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:27:44 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4262737880' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:27:44 np0005588920 nova_compute[226886]: 2026-01-20 14:27:44.505 226890 DEBUG oslo_concurrency.processutils [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.500s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:27:44 np0005588920 nova_compute[226886]: 2026-01-20 14:27:44.543 226890 DEBUG nova.storage.rbd_utils [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] rbd image 8aa2f978-353e-4f81-bd81-2c446cf4f5f6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:27:44 np0005588920 nova_compute[226886]: 2026-01-20 14:27:44.547 226890 DEBUG oslo_concurrency.processutils [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:27:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:27:44 np0005588920 nova_compute[226886]: 2026-01-20 14:27:44.962 226890 DEBUG oslo_concurrency.lockutils [None req-26e87279-8642-477e-bb59-04aaf6f47e6f 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Acquiring lock "21cf2820-8f37-488a-ae75-3f8f45d6ba81" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:27:44 np0005588920 nova_compute[226886]: 2026-01-20 14:27:44.963 226890 DEBUG oslo_concurrency.lockutils [None req-26e87279-8642-477e-bb59-04aaf6f47e6f 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Lock "21cf2820-8f37-488a-ae75-3f8f45d6ba81" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:27:44 np0005588920 nova_compute[226886]: 2026-01-20 14:27:44.963 226890 DEBUG oslo_concurrency.lockutils [None req-26e87279-8642-477e-bb59-04aaf6f47e6f 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Acquiring lock "21cf2820-8f37-488a-ae75-3f8f45d6ba81-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:27:44 np0005588920 nova_compute[226886]: 2026-01-20 14:27:44.963 226890 DEBUG oslo_concurrency.lockutils [None req-26e87279-8642-477e-bb59-04aaf6f47e6f 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Lock "21cf2820-8f37-488a-ae75-3f8f45d6ba81-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:27:44 np0005588920 nova_compute[226886]: 2026-01-20 14:27:44.964 226890 DEBUG oslo_concurrency.lockutils [None req-26e87279-8642-477e-bb59-04aaf6f47e6f 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Lock "21cf2820-8f37-488a-ae75-3f8f45d6ba81-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:27:44 np0005588920 nova_compute[226886]: 2026-01-20 14:27:44.965 226890 INFO nova.compute.manager [None req-26e87279-8642-477e-bb59-04aaf6f47e6f 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] Terminating instance#033[00m
Jan 20 09:27:44 np0005588920 nova_compute[226886]: 2026-01-20 14:27:44.966 226890 DEBUG oslo_concurrency.lockutils [None req-26e87279-8642-477e-bb59-04aaf6f47e6f 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Acquiring lock "refresh_cache-21cf2820-8f37-488a-ae75-3f8f45d6ba81" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:27:44 np0005588920 nova_compute[226886]: 2026-01-20 14:27:44.966 226890 DEBUG oslo_concurrency.lockutils [None req-26e87279-8642-477e-bb59-04aaf6f47e6f 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Acquired lock "refresh_cache-21cf2820-8f37-488a-ae75-3f8f45d6ba81" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:27:44 np0005588920 nova_compute[226886]: 2026-01-20 14:27:44.966 226890 DEBUG nova.network.neutron [None req-26e87279-8642-477e-bb59-04aaf6f47e6f 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:27:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:27:44 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/603074013' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:27:44 np0005588920 nova_compute[226886]: 2026-01-20 14:27:44.989 226890 DEBUG oslo_concurrency.processutils [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:27:44 np0005588920 nova_compute[226886]: 2026-01-20 14:27:44.990 226890 DEBUG nova.objects.instance [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Lazy-loading 'pci_devices' on Instance uuid 8aa2f978-353e-4f81-bd81-2c446cf4f5f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:27:45 np0005588920 nova_compute[226886]: 2026-01-20 14:27:45.003 226890 DEBUG nova.virt.libvirt.driver [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:27:45 np0005588920 nova_compute[226886]:  <uuid>8aa2f978-353e-4f81-bd81-2c446cf4f5f6</uuid>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:  <name>instance-00000014</name>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:27:45 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:      <nova:name>tempest-DeleteServersAdminTestJSON-server-1487733165</nova:name>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:27:43</nova:creationTime>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:27:45 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:        <nova:user uuid="7dfe92a3bd9c43bbb3d4590ff5a57173">tempest-DeleteServersAdminTestJSON-834962290-project-member</nova:user>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:        <nova:project uuid="e3cc4a74e92f45a1811137175eae3af2">tempest-DeleteServersAdminTestJSON-834962290</nova:project>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:      <nova:ports/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:      <entry name="serial">8aa2f978-353e-4f81-bd81-2c446cf4f5f6</entry>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:      <entry name="uuid">8aa2f978-353e-4f81-bd81-2c446cf4f5f6</entry>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:27:45 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/8aa2f978-353e-4f81-bd81-2c446cf4f5f6_disk">
Jan 20 09:27:45 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:27:45 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:27:45 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/8aa2f978-353e-4f81-bd81-2c446cf4f5f6_disk.config">
Jan 20 09:27:45 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:27:45 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:27:45 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/8aa2f978-353e-4f81-bd81-2c446cf4f5f6/console.log" append="off"/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:27:45 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:27:45 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:27:45 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:27:45 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:27:45 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:27:45 np0005588920 nova_compute[226886]: 2026-01-20 14:27:45.054 226890 DEBUG nova.virt.libvirt.driver [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:27:45 np0005588920 nova_compute[226886]: 2026-01-20 14:27:45.054 226890 DEBUG nova.virt.libvirt.driver [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:27:45 np0005588920 nova_compute[226886]: 2026-01-20 14:27:45.055 226890 INFO nova.virt.libvirt.driver [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] Using config drive#033[00m
Jan 20 09:27:45 np0005588920 nova_compute[226886]: 2026-01-20 14:27:45.078 226890 DEBUG nova.storage.rbd_utils [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] rbd image 8aa2f978-353e-4f81-bd81-2c446cf4f5f6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:27:45 np0005588920 nova_compute[226886]: 2026-01-20 14:27:45.301 226890 DEBUG nova.network.neutron [None req-26e87279-8642-477e-bb59-04aaf6f47e6f 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:27:45 np0005588920 nova_compute[226886]: 2026-01-20 14:27:45.307 226890 INFO nova.virt.libvirt.driver [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] Creating config drive at /var/lib/nova/instances/8aa2f978-353e-4f81-bd81-2c446cf4f5f6/disk.config#033[00m
Jan 20 09:27:45 np0005588920 nova_compute[226886]: 2026-01-20 14:27:45.315 226890 DEBUG oslo_concurrency.processutils [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8aa2f978-353e-4f81-bd81-2c446cf4f5f6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7n311__a execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:27:45 np0005588920 nova_compute[226886]: 2026-01-20 14:27:45.454 226890 DEBUG oslo_concurrency.processutils [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8aa2f978-353e-4f81-bd81-2c446cf4f5f6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7n311__a" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:27:45 np0005588920 nova_compute[226886]: 2026-01-20 14:27:45.492 226890 DEBUG nova.storage.rbd_utils [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] rbd image 8aa2f978-353e-4f81-bd81-2c446cf4f5f6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:27:45 np0005588920 nova_compute[226886]: 2026-01-20 14:27:45.497 226890 DEBUG oslo_concurrency.processutils [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/8aa2f978-353e-4f81-bd81-2c446cf4f5f6/disk.config 8aa2f978-353e-4f81-bd81-2c446cf4f5f6_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:27:45 np0005588920 nova_compute[226886]: 2026-01-20 14:27:45.654 226890 DEBUG oslo_concurrency.processutils [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/8aa2f978-353e-4f81-bd81-2c446cf4f5f6/disk.config 8aa2f978-353e-4f81-bd81-2c446cf4f5f6_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.158s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:27:45 np0005588920 nova_compute[226886]: 2026-01-20 14:27:45.655 226890 INFO nova.virt.libvirt.driver [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] Deleting local config drive /var/lib/nova/instances/8aa2f978-353e-4f81-bd81-2c446cf4f5f6/disk.config because it was imported into RBD.#033[00m
Jan 20 09:27:45 np0005588920 nova_compute[226886]: 2026-01-20 14:27:45.658 226890 DEBUG nova.network.neutron [None req-26e87279-8642-477e-bb59-04aaf6f47e6f 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:27:45 np0005588920 nova_compute[226886]: 2026-01-20 14:27:45.677 226890 DEBUG oslo_concurrency.lockutils [None req-26e87279-8642-477e-bb59-04aaf6f47e6f 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Releasing lock "refresh_cache-21cf2820-8f37-488a-ae75-3f8f45d6ba81" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:27:45 np0005588920 nova_compute[226886]: 2026-01-20 14:27:45.678 226890 DEBUG nova.compute.manager [None req-26e87279-8642-477e-bb59-04aaf6f47e6f 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:27:45 np0005588920 systemd-machined[196121]: New machine qemu-9-instance-00000014.
Jan 20 09:27:45 np0005588920 systemd[1]: Started Virtual Machine qemu-9-instance-00000014.
Jan 20 09:27:45 np0005588920 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000011.scope: Deactivated successfully.
Jan 20 09:27:45 np0005588920 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000011.scope: Consumed 13.819s CPU time.
Jan 20 09:27:45 np0005588920 systemd-machined[196121]: Machine qemu-7-instance-00000011 terminated.
Jan 20 09:27:45 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:27:45 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:27:45 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:27:45 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:27:45 np0005588920 nova_compute[226886]: 2026-01-20 14:27:45.901 226890 INFO nova.virt.libvirt.driver [-] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] Instance destroyed successfully.#033[00m
Jan 20 09:27:45 np0005588920 nova_compute[226886]: 2026-01-20 14:27:45.901 226890 DEBUG nova.objects.instance [None req-26e87279-8642-477e-bb59-04aaf6f47e6f 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Lazy-loading 'resources' on Instance uuid 21cf2820-8f37-488a-ae75-3f8f45d6ba81 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:27:46 np0005588920 nova_compute[226886]: 2026-01-20 14:27:46.036 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919266.0357304, 8aa2f978-353e-4f81-bd81-2c446cf4f5f6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:27:46 np0005588920 nova_compute[226886]: 2026-01-20 14:27:46.037 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:27:46 np0005588920 nova_compute[226886]: 2026-01-20 14:27:46.040 226890 DEBUG nova.compute.manager [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:27:46 np0005588920 nova_compute[226886]: 2026-01-20 14:27:46.040 226890 DEBUG nova.virt.libvirt.driver [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:27:46 np0005588920 nova_compute[226886]: 2026-01-20 14:27:46.044 226890 INFO nova.virt.libvirt.driver [-] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] Instance spawned successfully.#033[00m
Jan 20 09:27:46 np0005588920 nova_compute[226886]: 2026-01-20 14:27:46.045 226890 DEBUG nova.virt.libvirt.driver [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:27:46 np0005588920 nova_compute[226886]: 2026-01-20 14:27:46.069 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:27:46 np0005588920 nova_compute[226886]: 2026-01-20 14:27:46.076 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:27:46 np0005588920 nova_compute[226886]: 2026-01-20 14:27:46.082 226890 DEBUG nova.virt.libvirt.driver [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:27:46 np0005588920 nova_compute[226886]: 2026-01-20 14:27:46.082 226890 DEBUG nova.virt.libvirt.driver [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:27:46 np0005588920 nova_compute[226886]: 2026-01-20 14:27:46.083 226890 DEBUG nova.virt.libvirt.driver [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:27:46 np0005588920 nova_compute[226886]: 2026-01-20 14:27:46.083 226890 DEBUG nova.virt.libvirt.driver [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:27:46 np0005588920 nova_compute[226886]: 2026-01-20 14:27:46.084 226890 DEBUG nova.virt.libvirt.driver [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:27:46 np0005588920 nova_compute[226886]: 2026-01-20 14:27:46.084 226890 DEBUG nova.virt.libvirt.driver [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:27:46 np0005588920 nova_compute[226886]: 2026-01-20 14:27:46.097 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:27:46 np0005588920 nova_compute[226886]: 2026-01-20 14:27:46.098 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919266.0402155, 8aa2f978-353e-4f81-bd81-2c446cf4f5f6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:27:46 np0005588920 nova_compute[226886]: 2026-01-20 14:27:46.098 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] VM Started (Lifecycle Event)#033[00m
Jan 20 09:27:46 np0005588920 nova_compute[226886]: 2026-01-20 14:27:46.158 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:27:46 np0005588920 nova_compute[226886]: 2026-01-20 14:27:46.162 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:27:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:46.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:46.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:46 np0005588920 nova_compute[226886]: 2026-01-20 14:27:46.340 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:27:46 np0005588920 nova_compute[226886]: 2026-01-20 14:27:46.345 226890 INFO nova.compute.manager [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] Took 3.27 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:27:46 np0005588920 nova_compute[226886]: 2026-01-20 14:27:46.345 226890 DEBUG nova.compute.manager [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:27:46 np0005588920 nova_compute[226886]: 2026-01-20 14:27:46.418 226890 INFO nova.compute.manager [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] Took 4.29 seconds to build instance.#033[00m
Jan 20 09:27:46 np0005588920 nova_compute[226886]: 2026-01-20 14:27:46.425 226890 INFO nova.virt.libvirt.driver [None req-26e87279-8642-477e-bb59-04aaf6f47e6f 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] Deleting instance files /var/lib/nova/instances/21cf2820-8f37-488a-ae75-3f8f45d6ba81_del#033[00m
Jan 20 09:27:46 np0005588920 nova_compute[226886]: 2026-01-20 14:27:46.426 226890 INFO nova.virt.libvirt.driver [None req-26e87279-8642-477e-bb59-04aaf6f47e6f 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] Deletion of /var/lib/nova/instances/21cf2820-8f37-488a-ae75-3f8f45d6ba81_del complete#033[00m
Jan 20 09:27:46 np0005588920 nova_compute[226886]: 2026-01-20 14:27:46.450 226890 DEBUG oslo_concurrency.lockutils [None req-c240ef59-2dc6-433a-9c8a-bf5e06f21f46 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Lock "8aa2f978-353e-4f81-bd81-2c446cf4f5f6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.395s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:27:46 np0005588920 nova_compute[226886]: 2026-01-20 14:27:46.482 226890 INFO nova.compute.manager [None req-26e87279-8642-477e-bb59-04aaf6f47e6f 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] Took 0.80 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:27:46 np0005588920 nova_compute[226886]: 2026-01-20 14:27:46.483 226890 DEBUG oslo.service.loopingcall [None req-26e87279-8642-477e-bb59-04aaf6f47e6f 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:27:46 np0005588920 nova_compute[226886]: 2026-01-20 14:27:46.483 226890 DEBUG nova.compute.manager [-] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:27:46 np0005588920 nova_compute[226886]: 2026-01-20 14:27:46.483 226890 DEBUG nova.network.neutron [-] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:27:46 np0005588920 nova_compute[226886]: 2026-01-20 14:27:46.743 226890 DEBUG nova.network.neutron [-] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:27:46 np0005588920 nova_compute[226886]: 2026-01-20 14:27:46.785 226890 DEBUG nova.network.neutron [-] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:27:46 np0005588920 nova_compute[226886]: 2026-01-20 14:27:46.800 226890 INFO nova.compute.manager [-] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] Took 0.32 seconds to deallocate network for instance.#033[00m
Jan 20 09:27:46 np0005588920 nova_compute[226886]: 2026-01-20 14:27:46.865 226890 DEBUG oslo_concurrency.lockutils [None req-26e87279-8642-477e-bb59-04aaf6f47e6f 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:27:46 np0005588920 nova_compute[226886]: 2026-01-20 14:27:46.865 226890 DEBUG oslo_concurrency.lockutils [None req-26e87279-8642-477e-bb59-04aaf6f47e6f 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:27:46 np0005588920 nova_compute[226886]: 2026-01-20 14:27:46.952 226890 DEBUG oslo_concurrency.processutils [None req-26e87279-8642-477e-bb59-04aaf6f47e6f 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:27:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:27:47 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1118793105' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:27:47 np0005588920 nova_compute[226886]: 2026-01-20 14:27:47.435 226890 DEBUG oslo_concurrency.processutils [None req-26e87279-8642-477e-bb59-04aaf6f47e6f 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:27:47 np0005588920 nova_compute[226886]: 2026-01-20 14:27:47.440 226890 DEBUG nova.compute.provider_tree [None req-26e87279-8642-477e-bb59-04aaf6f47e6f 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:27:47 np0005588920 nova_compute[226886]: 2026-01-20 14:27:47.459 226890 DEBUG nova.scheduler.client.report [None req-26e87279-8642-477e-bb59-04aaf6f47e6f 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:27:47 np0005588920 nova_compute[226886]: 2026-01-20 14:27:47.481 226890 DEBUG oslo_concurrency.lockutils [None req-26e87279-8642-477e-bb59-04aaf6f47e6f 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.616s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:27:47 np0005588920 nova_compute[226886]: 2026-01-20 14:27:47.516 226890 INFO nova.scheduler.client.report [None req-26e87279-8642-477e-bb59-04aaf6f47e6f 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Deleted allocations for instance 21cf2820-8f37-488a-ae75-3f8f45d6ba81#033[00m
Jan 20 09:27:47 np0005588920 nova_compute[226886]: 2026-01-20 14:27:47.604 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:47 np0005588920 nova_compute[226886]: 2026-01-20 14:27:47.675 226890 DEBUG oslo_concurrency.lockutils [None req-26e87279-8642-477e-bb59-04aaf6f47e6f 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Lock "21cf2820-8f37-488a-ae75-3f8f45d6ba81" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.712s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:27:48 np0005588920 nova_compute[226886]: 2026-01-20 14:27:48.001 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768919253.0008953, 665a5c82-dfcf-43d5-9d42-1f3046305b5b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:27:48 np0005588920 nova_compute[226886]: 2026-01-20 14:27:48.001 226890 INFO nova.compute.manager [-] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:27:48 np0005588920 nova_compute[226886]: 2026-01-20 14:27:48.014 226890 DEBUG oslo_concurrency.lockutils [None req-0d164e0f-55a6-427a-9f5c-ab44ce7c6c24 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Acquiring lock "8aa2f978-353e-4f81-bd81-2c446cf4f5f6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:27:48 np0005588920 nova_compute[226886]: 2026-01-20 14:27:48.015 226890 DEBUG oslo_concurrency.lockutils [None req-0d164e0f-55a6-427a-9f5c-ab44ce7c6c24 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Lock "8aa2f978-353e-4f81-bd81-2c446cf4f5f6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:27:48 np0005588920 nova_compute[226886]: 2026-01-20 14:27:48.015 226890 DEBUG oslo_concurrency.lockutils [None req-0d164e0f-55a6-427a-9f5c-ab44ce7c6c24 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Acquiring lock "8aa2f978-353e-4f81-bd81-2c446cf4f5f6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:27:48 np0005588920 nova_compute[226886]: 2026-01-20 14:27:48.015 226890 DEBUG oslo_concurrency.lockutils [None req-0d164e0f-55a6-427a-9f5c-ab44ce7c6c24 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Lock "8aa2f978-353e-4f81-bd81-2c446cf4f5f6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:27:48 np0005588920 nova_compute[226886]: 2026-01-20 14:27:48.016 226890 DEBUG oslo_concurrency.lockutils [None req-0d164e0f-55a6-427a-9f5c-ab44ce7c6c24 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Lock "8aa2f978-353e-4f81-bd81-2c446cf4f5f6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:27:48 np0005588920 nova_compute[226886]: 2026-01-20 14:27:48.017 226890 INFO nova.compute.manager [None req-0d164e0f-55a6-427a-9f5c-ab44ce7c6c24 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] Terminating instance#033[00m
Jan 20 09:27:48 np0005588920 nova_compute[226886]: 2026-01-20 14:27:48.019 226890 DEBUG oslo_concurrency.lockutils [None req-0d164e0f-55a6-427a-9f5c-ab44ce7c6c24 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Acquiring lock "refresh_cache-8aa2f978-353e-4f81-bd81-2c446cf4f5f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:27:48 np0005588920 nova_compute[226886]: 2026-01-20 14:27:48.019 226890 DEBUG oslo_concurrency.lockutils [None req-0d164e0f-55a6-427a-9f5c-ab44ce7c6c24 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Acquired lock "refresh_cache-8aa2f978-353e-4f81-bd81-2c446cf4f5f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:27:48 np0005588920 nova_compute[226886]: 2026-01-20 14:27:48.020 226890 DEBUG nova.network.neutron [None req-0d164e0f-55a6-427a-9f5c-ab44ce7c6c24 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:27:48 np0005588920 nova_compute[226886]: 2026-01-20 14:27:48.022 226890 DEBUG nova.compute.manager [None req-84cb008c-4857-4e0c-8ebd-30f40ff8aa36 - - - - - -] [instance: 665a5c82-dfcf-43d5-9d42-1f3046305b5b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:27:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:48.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:48 np0005588920 nova_compute[226886]: 2026-01-20 14:27:48.308 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:48.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:48 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:27:48 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:27:48 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:27:48 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:27:48 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:27:48 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:27:48 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:27:49 np0005588920 nova_compute[226886]: 2026-01-20 14:27:49.309 226890 DEBUG nova.network.neutron [None req-0d164e0f-55a6-427a-9f5c-ab44ce7c6c24 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:27:49 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:27:50 np0005588920 nova_compute[226886]: 2026-01-20 14:27:50.029 226890 DEBUG nova.network.neutron [None req-0d164e0f-55a6-427a-9f5c-ab44ce7c6c24 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:27:50 np0005588920 nova_compute[226886]: 2026-01-20 14:27:50.042 226890 DEBUG oslo_concurrency.lockutils [None req-0d164e0f-55a6-427a-9f5c-ab44ce7c6c24 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Releasing lock "refresh_cache-8aa2f978-353e-4f81-bd81-2c446cf4f5f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:27:50 np0005588920 nova_compute[226886]: 2026-01-20 14:27:50.043 226890 DEBUG nova.compute.manager [None req-0d164e0f-55a6-427a-9f5c-ab44ce7c6c24 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:27:50 np0005588920 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000014.scope: Deactivated successfully.
Jan 20 09:27:50 np0005588920 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000014.scope: Consumed 4.455s CPU time.
Jan 20 09:27:50 np0005588920 systemd-machined[196121]: Machine qemu-9-instance-00000014 terminated.
Jan 20 09:27:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:27:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:50.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:27:50 np0005588920 nova_compute[226886]: 2026-01-20 14:27:50.265 226890 INFO nova.virt.libvirt.driver [-] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] Instance destroyed successfully.#033[00m
Jan 20 09:27:50 np0005588920 nova_compute[226886]: 2026-01-20 14:27:50.265 226890 DEBUG nova.objects.instance [None req-0d164e0f-55a6-427a-9f5c-ab44ce7c6c24 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Lazy-loading 'resources' on Instance uuid 8aa2f978-353e-4f81-bd81-2c446cf4f5f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:27:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:50.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:50 np0005588920 nova_compute[226886]: 2026-01-20 14:27:50.978 226890 INFO nova.virt.libvirt.driver [None req-0d164e0f-55a6-427a-9f5c-ab44ce7c6c24 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] Deleting instance files /var/lib/nova/instances/8aa2f978-353e-4f81-bd81-2c446cf4f5f6_del#033[00m
Jan 20 09:27:50 np0005588920 nova_compute[226886]: 2026-01-20 14:27:50.979 226890 INFO nova.virt.libvirt.driver [None req-0d164e0f-55a6-427a-9f5c-ab44ce7c6c24 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] Deletion of /var/lib/nova/instances/8aa2f978-353e-4f81-bd81-2c446cf4f5f6_del complete#033[00m
Jan 20 09:27:51 np0005588920 nova_compute[226886]: 2026-01-20 14:27:51.049 226890 INFO nova.compute.manager [None req-0d164e0f-55a6-427a-9f5c-ab44ce7c6c24 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] Took 1.01 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:27:51 np0005588920 nova_compute[226886]: 2026-01-20 14:27:51.050 226890 DEBUG oslo.service.loopingcall [None req-0d164e0f-55a6-427a-9f5c-ab44ce7c6c24 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:27:51 np0005588920 nova_compute[226886]: 2026-01-20 14:27:51.051 226890 DEBUG nova.compute.manager [-] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:27:51 np0005588920 nova_compute[226886]: 2026-01-20 14:27:51.051 226890 DEBUG nova.network.neutron [-] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:27:51 np0005588920 nova_compute[226886]: 2026-01-20 14:27:51.297 226890 DEBUG nova.network.neutron [-] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:27:51 np0005588920 nova_compute[226886]: 2026-01-20 14:27:51.333 226890 DEBUG nova.network.neutron [-] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:27:51 np0005588920 nova_compute[226886]: 2026-01-20 14:27:51.369 226890 INFO nova.compute.manager [-] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] Took 0.32 seconds to deallocate network for instance.#033[00m
Jan 20 09:27:51 np0005588920 nova_compute[226886]: 2026-01-20 14:27:51.426 226890 DEBUG oslo_concurrency.lockutils [None req-0d164e0f-55a6-427a-9f5c-ab44ce7c6c24 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:27:51 np0005588920 nova_compute[226886]: 2026-01-20 14:27:51.427 226890 DEBUG oslo_concurrency.lockutils [None req-0d164e0f-55a6-427a-9f5c-ab44ce7c6c24 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:27:51 np0005588920 nova_compute[226886]: 2026-01-20 14:27:51.469 226890 DEBUG oslo_concurrency.processutils [None req-0d164e0f-55a6-427a-9f5c-ab44ce7c6c24 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:27:51 np0005588920 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 20 09:27:51 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:27:51 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1556740575' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:27:51 np0005588920 nova_compute[226886]: 2026-01-20 14:27:51.909 226890 DEBUG oslo_concurrency.processutils [None req-0d164e0f-55a6-427a-9f5c-ab44ce7c6c24 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:27:51 np0005588920 nova_compute[226886]: 2026-01-20 14:27:51.914 226890 DEBUG nova.compute.provider_tree [None req-0d164e0f-55a6-427a-9f5c-ab44ce7c6c24 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:27:51 np0005588920 nova_compute[226886]: 2026-01-20 14:27:51.927 226890 DEBUG nova.scheduler.client.report [None req-0d164e0f-55a6-427a-9f5c-ab44ce7c6c24 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:27:51 np0005588920 nova_compute[226886]: 2026-01-20 14:27:51.949 226890 DEBUG oslo_concurrency.lockutils [None req-0d164e0f-55a6-427a-9f5c-ab44ce7c6c24 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.522s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:27:51 np0005588920 nova_compute[226886]: 2026-01-20 14:27:51.976 226890 INFO nova.scheduler.client.report [None req-0d164e0f-55a6-427a-9f5c-ab44ce7c6c24 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Deleted allocations for instance 8aa2f978-353e-4f81-bd81-2c446cf4f5f6#033[00m
Jan 20 09:27:52 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 09:27:52 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.0 total, 600.0 interval#012Cumulative writes: 5091 writes, 26K keys, 5091 commit groups, 1.0 writes per commit group, ingest: 0.05 GB, 0.03 MB/s#012Cumulative WAL: 5091 writes, 5091 syncs, 1.00 writes per sync, written: 0.05 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1541 writes, 7453 keys, 1541 commit groups, 1.0 writes per commit group, ingest: 15.90 MB, 0.03 MB/s#012Interval WAL: 1541 writes, 1541 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     67.9      0.44              0.13        14    0.031       0      0       0.0       0.0#012  L6      1/0    8.65 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   3.5    106.0     88.2      1.18              0.42        13    0.091     61K   6802       0.0       0.0#012 Sum      1/0    8.65 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.5     77.4     82.8      1.62              0.55        27    0.060     61K   6802       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   5.2    105.5    108.3      0.46              0.19        10    0.046     26K   2531       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) 
Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   0.0    106.0     88.2      1.18              0.42        13    0.091     61K   6802       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     68.2      0.44              0.13        13    0.034       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1800.0 total, 600.0 interval#012Flush(GB): cumulative 0.029, interval 0.009#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.13 GB write, 0.07 MB/s write, 0.12 GB read, 0.07 MB/s read, 1.6 seconds#012Interval compaction: 0.05 GB write, 0.08 MB/s write, 0.05 GB read, 0.08 MB/s read, 0.5 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x564a2f9711f0#2 capacity: 304.00 MB usage: 12.45 MB table_size: 0 occupancy: 18446744073709551615 collections: 4 last_copies: 0 last_secs: 0.000124 secs_since: 0#012Block cache entry stats(count,size,portion): 
DataBlock(709,11.96 MB,3.93468%) FilterBlock(27,179.42 KB,0.0576371%) IndexBlock(27,325.92 KB,0.104698%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 20 09:27:52 np0005588920 nova_compute[226886]: 2026-01-20 14:27:52.122 226890 DEBUG oslo_concurrency.lockutils [None req-0d164e0f-55a6-427a-9f5c-ab44ce7c6c24 7dfe92a3bd9c43bbb3d4590ff5a57173 e3cc4a74e92f45a1811137175eae3af2 - - default default] Lock "8aa2f978-353e-4f81-bd81-2c446cf4f5f6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:27:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:52.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:52.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:52 np0005588920 nova_compute[226886]: 2026-01-20 14:27:52.607 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:53 np0005588920 nova_compute[226886]: 2026-01-20 14:27:53.352 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:53 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:27:53 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/4134874017' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:27:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:54.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:54.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:54 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:27:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:56.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:56.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:57 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:27:57 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:27:57 np0005588920 nova_compute[226886]: 2026-01-20 14:27:57.610 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:58.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:27:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:27:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:58.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:27:58 np0005588920 nova_compute[226886]: 2026-01-20 14:27:58.355 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:59 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:28:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:00.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:00.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:00 np0005588920 nova_compute[226886]: 2026-01-20 14:28:00.575 226890 DEBUG oslo_concurrency.lockutils [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Acquiring lock "29f0b4d4-abf0-46e7-bf67-38e71eb42e28" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:28:00 np0005588920 nova_compute[226886]: 2026-01-20 14:28:00.576 226890 DEBUG oslo_concurrency.lockutils [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "29f0b4d4-abf0-46e7-bf67-38e71eb42e28" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:28:00 np0005588920 nova_compute[226886]: 2026-01-20 14:28:00.598 226890 DEBUG nova.compute.manager [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:28:00 np0005588920 nova_compute[226886]: 2026-01-20 14:28:00.668 226890 DEBUG oslo_concurrency.lockutils [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:28:00 np0005588920 nova_compute[226886]: 2026-01-20 14:28:00.669 226890 DEBUG oslo_concurrency.lockutils [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:28:00 np0005588920 nova_compute[226886]: 2026-01-20 14:28:00.677 226890 DEBUG nova.virt.hardware [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:28:00 np0005588920 nova_compute[226886]: 2026-01-20 14:28:00.677 226890 INFO nova.compute.claims [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 09:28:00 np0005588920 nova_compute[226886]: 2026-01-20 14:28:00.790 226890 DEBUG oslo_concurrency.processutils [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:28:00 np0005588920 nova_compute[226886]: 2026-01-20 14:28:00.898 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768919265.8968225, 21cf2820-8f37-488a-ae75-3f8f45d6ba81 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:28:00 np0005588920 nova_compute[226886]: 2026-01-20 14:28:00.899 226890 INFO nova.compute.manager [-] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:28:00 np0005588920 nova_compute[226886]: 2026-01-20 14:28:00.922 226890 DEBUG nova.compute.manager [None req-e32456f7-d3a0-46f3-a538-604b74e549e4 - - - - - -] [instance: 21cf2820-8f37-488a-ae75-3f8f45d6ba81] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:28:01 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:28:01 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2407875768' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:28:01 np0005588920 nova_compute[226886]: 2026-01-20 14:28:01.257 226890 DEBUG oslo_concurrency.processutils [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:28:01 np0005588920 nova_compute[226886]: 2026-01-20 14:28:01.267 226890 DEBUG nova.compute.provider_tree [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:28:01 np0005588920 nova_compute[226886]: 2026-01-20 14:28:01.297 226890 DEBUG nova.scheduler.client.report [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:28:01 np0005588920 nova_compute[226886]: 2026-01-20 14:28:01.330 226890 DEBUG oslo_concurrency.lockutils [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.661s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:28:01 np0005588920 nova_compute[226886]: 2026-01-20 14:28:01.331 226890 DEBUG nova.compute.manager [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:28:01 np0005588920 nova_compute[226886]: 2026-01-20 14:28:01.375 226890 DEBUG nova.compute.manager [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:28:01 np0005588920 nova_compute[226886]: 2026-01-20 14:28:01.376 226890 DEBUG nova.network.neutron [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:28:01 np0005588920 nova_compute[226886]: 2026-01-20 14:28:01.397 226890 INFO nova.virt.libvirt.driver [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:28:01 np0005588920 nova_compute[226886]: 2026-01-20 14:28:01.417 226890 DEBUG nova.compute.manager [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:28:01 np0005588920 nova_compute[226886]: 2026-01-20 14:28:01.524 226890 DEBUG nova.compute.manager [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:28:01 np0005588920 nova_compute[226886]: 2026-01-20 14:28:01.527 226890 DEBUG nova.virt.libvirt.driver [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:28:01 np0005588920 nova_compute[226886]: 2026-01-20 14:28:01.527 226890 INFO nova.virt.libvirt.driver [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Creating image(s)#033[00m
Jan 20 09:28:01 np0005588920 nova_compute[226886]: 2026-01-20 14:28:01.564 226890 DEBUG nova.storage.rbd_utils [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] rbd image 29f0b4d4-abf0-46e7-bf67-38e71eb42e28_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:28:01 np0005588920 nova_compute[226886]: 2026-01-20 14:28:01.604 226890 DEBUG nova.storage.rbd_utils [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] rbd image 29f0b4d4-abf0-46e7-bf67-38e71eb42e28_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:28:01 np0005588920 nova_compute[226886]: 2026-01-20 14:28:01.643 226890 DEBUG nova.storage.rbd_utils [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] rbd image 29f0b4d4-abf0-46e7-bf67-38e71eb42e28_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:28:01 np0005588920 nova_compute[226886]: 2026-01-20 14:28:01.649 226890 DEBUG oslo_concurrency.processutils [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:28:01 np0005588920 nova_compute[226886]: 2026-01-20 14:28:01.727 226890 DEBUG oslo_concurrency.processutils [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:28:01 np0005588920 nova_compute[226886]: 2026-01-20 14:28:01.729 226890 DEBUG oslo_concurrency.lockutils [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:28:01 np0005588920 nova_compute[226886]: 2026-01-20 14:28:01.730 226890 DEBUG oslo_concurrency.lockutils [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:28:01 np0005588920 nova_compute[226886]: 2026-01-20 14:28:01.731 226890 DEBUG oslo_concurrency.lockutils [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:28:01 np0005588920 nova_compute[226886]: 2026-01-20 14:28:01.770 226890 DEBUG nova.storage.rbd_utils [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] rbd image 29f0b4d4-abf0-46e7-bf67-38e71eb42e28_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:28:01 np0005588920 nova_compute[226886]: 2026-01-20 14:28:01.775 226890 DEBUG oslo_concurrency.processutils [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 29f0b4d4-abf0-46e7-bf67-38e71eb42e28_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:28:02 np0005588920 podman[235296]: 2026-01-20 14:28:02.056868596 +0000 UTC m=+0.125837898 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller)
Jan 20 09:28:02 np0005588920 nova_compute[226886]: 2026-01-20 14:28:02.151 226890 DEBUG oslo_concurrency.processutils [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 29f0b4d4-abf0-46e7-bf67-38e71eb42e28_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.377s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:28:02 np0005588920 nova_compute[226886]: 2026-01-20 14:28:02.215 226890 DEBUG nova.storage.rbd_utils [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] resizing rbd image 29f0b4d4-abf0-46e7-bf67-38e71eb42e28_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:28:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:02.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:02 np0005588920 nova_compute[226886]: 2026-01-20 14:28:02.246 226890 DEBUG nova.network.neutron [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Jan 20 09:28:02 np0005588920 nova_compute[226886]: 2026-01-20 14:28:02.246 226890 DEBUG nova.compute.manager [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:28:02 np0005588920 nova_compute[226886]: 2026-01-20 14:28:02.309 226890 DEBUG nova.objects.instance [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lazy-loading 'migration_context' on Instance uuid 29f0b4d4-abf0-46e7-bf67-38e71eb42e28 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:28:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:02.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:02 np0005588920 nova_compute[226886]: 2026-01-20 14:28:02.370 226890 DEBUG nova.virt.libvirt.driver [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:28:02 np0005588920 nova_compute[226886]: 2026-01-20 14:28:02.370 226890 DEBUG nova.virt.libvirt.driver [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Ensure instance console log exists: /var/lib/nova/instances/29f0b4d4-abf0-46e7-bf67-38e71eb42e28/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:28:02 np0005588920 nova_compute[226886]: 2026-01-20 14:28:02.371 226890 DEBUG oslo_concurrency.lockutils [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:28:02 np0005588920 nova_compute[226886]: 2026-01-20 14:28:02.371 226890 DEBUG oslo_concurrency.lockutils [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:28:02 np0005588920 nova_compute[226886]: 2026-01-20 14:28:02.372 226890 DEBUG oslo_concurrency.lockutils [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:28:02 np0005588920 nova_compute[226886]: 2026-01-20 14:28:02.373 226890 DEBUG nova.virt.libvirt.driver [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:28:02 np0005588920 nova_compute[226886]: 2026-01-20 14:28:02.379 226890 WARNING nova.virt.libvirt.driver [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:28:02 np0005588920 nova_compute[226886]: 2026-01-20 14:28:02.390 226890 DEBUG nova.virt.libvirt.host [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:28:02 np0005588920 nova_compute[226886]: 2026-01-20 14:28:02.391 226890 DEBUG nova.virt.libvirt.host [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:28:02 np0005588920 nova_compute[226886]: 2026-01-20 14:28:02.397 226890 DEBUG nova.virt.libvirt.host [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:28:02 np0005588920 nova_compute[226886]: 2026-01-20 14:28:02.398 226890 DEBUG nova.virt.libvirt.host [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:28:02 np0005588920 nova_compute[226886]: 2026-01-20 14:28:02.400 226890 DEBUG nova.virt.libvirt.driver [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:28:02 np0005588920 nova_compute[226886]: 2026-01-20 14:28:02.401 226890 DEBUG nova.virt.hardware [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:28:02 np0005588920 nova_compute[226886]: 2026-01-20 14:28:02.402 226890 DEBUG nova.virt.hardware [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:28:02 np0005588920 nova_compute[226886]: 2026-01-20 14:28:02.402 226890 DEBUG nova.virt.hardware [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:28:02 np0005588920 nova_compute[226886]: 2026-01-20 14:28:02.403 226890 DEBUG nova.virt.hardware [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:28:02 np0005588920 nova_compute[226886]: 2026-01-20 14:28:02.403 226890 DEBUG nova.virt.hardware [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:28:02 np0005588920 nova_compute[226886]: 2026-01-20 14:28:02.403 226890 DEBUG nova.virt.hardware [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:28:02 np0005588920 nova_compute[226886]: 2026-01-20 14:28:02.404 226890 DEBUG nova.virt.hardware [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:28:02 np0005588920 nova_compute[226886]: 2026-01-20 14:28:02.405 226890 DEBUG nova.virt.hardware [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:28:02 np0005588920 nova_compute[226886]: 2026-01-20 14:28:02.406 226890 DEBUG nova.virt.hardware [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:28:02 np0005588920 nova_compute[226886]: 2026-01-20 14:28:02.406 226890 DEBUG nova.virt.hardware [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:28:02 np0005588920 nova_compute[226886]: 2026-01-20 14:28:02.407 226890 DEBUG nova.virt.hardware [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:28:02 np0005588920 nova_compute[226886]: 2026-01-20 14:28:02.412 226890 DEBUG oslo_concurrency.processutils [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:28:02 np0005588920 nova_compute[226886]: 2026-01-20 14:28:02.613 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:28:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:28:02 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2155246561' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:28:02 np0005588920 nova_compute[226886]: 2026-01-20 14:28:02.861 226890 DEBUG oslo_concurrency.processutils [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:28:02 np0005588920 nova_compute[226886]: 2026-01-20 14:28:02.884 226890 DEBUG nova.storage.rbd_utils [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] rbd image 29f0b4d4-abf0-46e7-bf67-38e71eb42e28_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:28:02 np0005588920 nova_compute[226886]: 2026-01-20 14:28:02.887 226890 DEBUG oslo_concurrency.processutils [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:28:03 np0005588920 nova_compute[226886]: 2026-01-20 14:28:03.357 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:28:03 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:28:03 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3793975136' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:28:03 np0005588920 nova_compute[226886]: 2026-01-20 14:28:03.396 226890 DEBUG oslo_concurrency.processutils [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.510s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:28:03 np0005588920 nova_compute[226886]: 2026-01-20 14:28:03.400 226890 DEBUG nova.objects.instance [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lazy-loading 'pci_devices' on Instance uuid 29f0b4d4-abf0-46e7-bf67-38e71eb42e28 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:28:03 np0005588920 nova_compute[226886]: 2026-01-20 14:28:03.421 226890 DEBUG nova.virt.libvirt.driver [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:28:03 np0005588920 nova_compute[226886]:  <uuid>29f0b4d4-abf0-46e7-bf67-38e71eb42e28</uuid>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:  <name>instance-00000016</name>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:28:03 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:      <nova:name>tempest-MigrationsAdminTest-server-920976466</nova:name>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:28:02</nova:creationTime>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:28:03 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:        <nova:user uuid="01a3d712f05049b19d4ecc7051720ad5">tempest-MigrationsAdminTest-1518611738-project-member</nova:user>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:        <nova:project uuid="f3c2e72a7148496394c8bcd618a19c80">tempest-MigrationsAdminTest-1518611738</nova:project>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:      <nova:ports/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:      <entry name="serial">29f0b4d4-abf0-46e7-bf67-38e71eb42e28</entry>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:      <entry name="uuid">29f0b4d4-abf0-46e7-bf67-38e71eb42e28</entry>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:28:03 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/29f0b4d4-abf0-46e7-bf67-38e71eb42e28_disk">
Jan 20 09:28:03 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:28:03 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:28:03 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/29f0b4d4-abf0-46e7-bf67-38e71eb42e28_disk.config">
Jan 20 09:28:03 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:28:03 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:28:03 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/29f0b4d4-abf0-46e7-bf67-38e71eb42e28/console.log" append="off"/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:28:03 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:28:03 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:28:03 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:28:03 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:28:03 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 20 09:28:03 np0005588920 nova_compute[226886]: 2026-01-20 14:28:03.492 226890 DEBUG nova.virt.libvirt.driver [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 20 09:28:03 np0005588920 nova_compute[226886]: 2026-01-20 14:28:03.493 226890 DEBUG nova.virt.libvirt.driver [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 20 09:28:03 np0005588920 nova_compute[226886]: 2026-01-20 14:28:03.494 226890 INFO nova.virt.libvirt.driver [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Using config drive
Jan 20 09:28:03 np0005588920 nova_compute[226886]: 2026-01-20 14:28:03.526 226890 DEBUG nova.storage.rbd_utils [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] rbd image 29f0b4d4-abf0-46e7-bf67-38e71eb42e28_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:28:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:28:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:04.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:28:04 np0005588920 nova_compute[226886]: 2026-01-20 14:28:04.301 226890 INFO nova.virt.libvirt.driver [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Creating config drive at /var/lib/nova/instances/29f0b4d4-abf0-46e7-bf67-38e71eb42e28/disk.config
Jan 20 09:28:04 np0005588920 nova_compute[226886]: 2026-01-20 14:28:04.311 226890 DEBUG oslo_concurrency.processutils [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/29f0b4d4-abf0-46e7-bf67-38e71eb42e28/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprsivwu_r execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:28:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:04.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:04 np0005588920 nova_compute[226886]: 2026-01-20 14:28:04.456 226890 DEBUG oslo_concurrency.processutils [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/29f0b4d4-abf0-46e7-bf67-38e71eb42e28/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprsivwu_r" returned: 0 in 0.146s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:28:04 np0005588920 nova_compute[226886]: 2026-01-20 14:28:04.500 226890 DEBUG nova.storage.rbd_utils [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] rbd image 29f0b4d4-abf0-46e7-bf67-38e71eb42e28_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:28:04 np0005588920 nova_compute[226886]: 2026-01-20 14:28:04.504 226890 DEBUG oslo_concurrency.processutils [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/29f0b4d4-abf0-46e7-bf67-38e71eb42e28/disk.config 29f0b4d4-abf0-46e7-bf67-38e71eb42e28_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:28:04 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:28:04 np0005588920 nova_compute[226886]: 2026-01-20 14:28:04.659 226890 DEBUG oslo_concurrency.processutils [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/29f0b4d4-abf0-46e7-bf67-38e71eb42e28/disk.config 29f0b4d4-abf0-46e7-bf67-38e71eb42e28_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.155s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:28:04 np0005588920 nova_compute[226886]: 2026-01-20 14:28:04.660 226890 INFO nova.virt.libvirt.driver [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Deleting local config drive /var/lib/nova/instances/29f0b4d4-abf0-46e7-bf67-38e71eb42e28/disk.config because it was imported into RBD.
Jan 20 09:28:04 np0005588920 systemd-machined[196121]: New machine qemu-10-instance-00000016.
Jan 20 09:28:04 np0005588920 systemd[1]: Started Virtual Machine qemu-10-instance-00000016.
Jan 20 09:28:05 np0005588920 nova_compute[226886]: 2026-01-20 14:28:05.217 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919285.2164638, 29f0b4d4-abf0-46e7-bf67-38e71eb42e28 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 20 09:28:05 np0005588920 nova_compute[226886]: 2026-01-20 14:28:05.217 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] VM Resumed (Lifecycle Event)
Jan 20 09:28:05 np0005588920 nova_compute[226886]: 2026-01-20 14:28:05.222 226890 DEBUG nova.compute.manager [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 20 09:28:05 np0005588920 nova_compute[226886]: 2026-01-20 14:28:05.223 226890 DEBUG nova.virt.libvirt.driver [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 20 09:28:05 np0005588920 nova_compute[226886]: 2026-01-20 14:28:05.228 226890 INFO nova.virt.libvirt.driver [-] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Instance spawned successfully.
Jan 20 09:28:05 np0005588920 nova_compute[226886]: 2026-01-20 14:28:05.228 226890 DEBUG nova.virt.libvirt.driver [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 20 09:28:05 np0005588920 nova_compute[226886]: 2026-01-20 14:28:05.241 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 09:28:05 np0005588920 nova_compute[226886]: 2026-01-20 14:28:05.253 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 20 09:28:05 np0005588920 nova_compute[226886]: 2026-01-20 14:28:05.262 226890 DEBUG nova.virt.libvirt.driver [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 09:28:05 np0005588920 nova_compute[226886]: 2026-01-20 14:28:05.263 226890 DEBUG nova.virt.libvirt.driver [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 09:28:05 np0005588920 nova_compute[226886]: 2026-01-20 14:28:05.264 226890 DEBUG nova.virt.libvirt.driver [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 09:28:05 np0005588920 nova_compute[226886]: 2026-01-20 14:28:05.265 226890 DEBUG nova.virt.libvirt.driver [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 09:28:05 np0005588920 nova_compute[226886]: 2026-01-20 14:28:05.266 226890 DEBUG nova.virt.libvirt.driver [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 09:28:05 np0005588920 nova_compute[226886]: 2026-01-20 14:28:05.267 226890 DEBUG nova.virt.libvirt.driver [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 09:28:05 np0005588920 nova_compute[226886]: 2026-01-20 14:28:05.275 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768919270.2624958, 8aa2f978-353e-4f81-bd81-2c446cf4f5f6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 20 09:28:05 np0005588920 nova_compute[226886]: 2026-01-20 14:28:05.275 226890 INFO nova.compute.manager [-] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] VM Stopped (Lifecycle Event)
Jan 20 09:28:05 np0005588920 nova_compute[226886]: 2026-01-20 14:28:05.318 226890 DEBUG nova.compute.manager [None req-9a1b7f60-9b1d-42ee-8e48-47a2137aafbe - - - - - -] [instance: 8aa2f978-353e-4f81-bd81-2c446cf4f5f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 09:28:05 np0005588920 nova_compute[226886]: 2026-01-20 14:28:05.320 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 20 09:28:05 np0005588920 nova_compute[226886]: 2026-01-20 14:28:05.320 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919285.219788, 29f0b4d4-abf0-46e7-bf67-38e71eb42e28 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 20 09:28:05 np0005588920 nova_compute[226886]: 2026-01-20 14:28:05.320 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] VM Started (Lifecycle Event)
Jan 20 09:28:05 np0005588920 nova_compute[226886]: 2026-01-20 14:28:05.336 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 09:28:05 np0005588920 nova_compute[226886]: 2026-01-20 14:28:05.339 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 20 09:28:05 np0005588920 nova_compute[226886]: 2026-01-20 14:28:05.365 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 20 09:28:05 np0005588920 nova_compute[226886]: 2026-01-20 14:28:05.371 226890 INFO nova.compute.manager [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Took 3.85 seconds to spawn the instance on the hypervisor.
Jan 20 09:28:05 np0005588920 nova_compute[226886]: 2026-01-20 14:28:05.371 226890 DEBUG nova.compute.manager [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 09:28:05 np0005588920 nova_compute[226886]: 2026-01-20 14:28:05.432 226890 INFO nova.compute.manager [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Took 4.79 seconds to build instance.
Jan 20 09:28:05 np0005588920 nova_compute[226886]: 2026-01-20 14:28:05.448 226890 DEBUG oslo_concurrency.lockutils [None req-e5d5acb1-55b8-4b6a-917f-59ba5577043a 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "29f0b4d4-abf0-46e7-bf67-38e71eb42e28" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.872s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:28:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:06.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:06.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:07 np0005588920 nova_compute[226886]: 2026-01-20 14:28:07.615 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:28:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:08.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:28:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:08.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:28:08 np0005588920 nova_compute[226886]: 2026-01-20 14:28:08.359 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:28:08 np0005588920 nova_compute[226886]: 2026-01-20 14:28:08.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 09:28:09 np0005588920 nova_compute[226886]: 2026-01-20 14:28:09.205 226890 DEBUG oslo_concurrency.lockutils [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Acquiring lock "refresh_cache-29f0b4d4-abf0-46e7-bf67-38e71eb42e28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 20 09:28:09 np0005588920 nova_compute[226886]: 2026-01-20 14:28:09.205 226890 DEBUG oslo_concurrency.lockutils [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Acquired lock "refresh_cache-29f0b4d4-abf0-46e7-bf67-38e71eb42e28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 20 09:28:09 np0005588920 nova_compute[226886]: 2026-01-20 14:28:09.206 226890 DEBUG nova.network.neutron [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 20 09:28:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:28:09 np0005588920 nova_compute[226886]: 2026-01-20 14:28:09.733 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 09:28:09 np0005588920 nova_compute[226886]: 2026-01-20 14:28:09.834 226890 DEBUG nova.network.neutron [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 20 09:28:10 np0005588920 nova_compute[226886]: 2026-01-20 14:28:10.107 226890 DEBUG nova.network.neutron [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 20 09:28:10 np0005588920 nova_compute[226886]: 2026-01-20 14:28:10.132 226890 DEBUG oslo_concurrency.lockutils [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Releasing lock "refresh_cache-29f0b4d4-abf0-46e7-bf67-38e71eb42e28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 20 09:28:10 np0005588920 nova_compute[226886]: 2026-01-20 14:28:10.228 226890 DEBUG nova.virt.libvirt.driver [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Jan 20 09:28:10 np0005588920 nova_compute[226886]: 2026-01-20 14:28:10.229 226890 DEBUG nova.virt.libvirt.volume.remotefs [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Creating file /var/lib/nova/instances/29f0b4d4-abf0-46e7-bf67-38e71eb42e28/191a6f4bae1b47db97ecff8c436645dd.tmp on remote host 192.168.122.101 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79
Jan 20 09:28:10 np0005588920 nova_compute[226886]: 2026-01-20 14:28:10.229 226890 DEBUG oslo_concurrency.processutils [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/29f0b4d4-abf0-46e7-bf67-38e71eb42e28/191a6f4bae1b47db97ecff8c436645dd.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:28:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:10.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:10.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:10 np0005588920 nova_compute[226886]: 2026-01-20 14:28:10.680 226890 DEBUG oslo_concurrency.processutils [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/29f0b4d4-abf0-46e7-bf67-38e71eb42e28/191a6f4bae1b47db97ecff8c436645dd.tmp" returned: 1 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:28:10 np0005588920 nova_compute[226886]: 2026-01-20 14:28:10.681 226890 DEBUG oslo_concurrency.processutils [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] 'ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/29f0b4d4-abf0-46e7-bf67-38e71eb42e28/191a6f4bae1b47db97ecff8c436645dd.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Jan 20 09:28:10 np0005588920 nova_compute[226886]: 2026-01-20 14:28:10.681 226890 DEBUG nova.virt.libvirt.volume.remotefs [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Creating directory /var/lib/nova/instances/29f0b4d4-abf0-46e7-bf67-38e71eb42e28 on remote host 192.168.122.101 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91
Jan 20 09:28:10 np0005588920 nova_compute[226886]: 2026-01-20 14:28:10.682 226890 DEBUG oslo_concurrency.processutils [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/29f0b4d4-abf0-46e7-bf67-38e71eb42e28 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:28:10 np0005588920 nova_compute[226886]: 2026-01-20 14:28:10.959 226890 DEBUG oslo_concurrency.processutils [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/29f0b4d4-abf0-46e7-bf67-38e71eb42e28" returned: 0 in 0.277s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:28:10 np0005588920 nova_compute[226886]: 2026-01-20 14:28:10.964 226890 DEBUG nova.virt.libvirt.driver [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 20 09:28:11 np0005588920 nova_compute[226886]: 2026-01-20 14:28:11.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 09:28:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:12.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:28:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:12.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:28:12 np0005588920 nova_compute[226886]: 2026-01-20 14:28:12.617 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:28:13 np0005588920 nova_compute[226886]: 2026-01-20 14:28:13.387 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:28:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 09:28:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2012390759' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 09:28:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 09:28:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2012390759' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 09:28:13 np0005588920 nova_compute[226886]: 2026-01-20 14:28:13.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 09:28:13 np0005588920 nova_compute[226886]: 2026-01-20 14:28:13.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 20 09:28:13 np0005588920 nova_compute[226886]: 2026-01-20 14:28:13.726 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 20 09:28:13 np0005588920 nova_compute[226886]: 2026-01-20 14:28:13.743 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "refresh_cache-29f0b4d4-abf0-46e7-bf67-38e71eb42e28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 20 09:28:13 np0005588920 nova_compute[226886]: 2026-01-20 14:28:13.743 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquired lock "refresh_cache-29f0b4d4-abf0-46e7-bf67-38e71eb42e28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 20 09:28:13 np0005588920 nova_compute[226886]: 2026-01-20 14:28:13.743 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 20 09:28:13 np0005588920 nova_compute[226886]: 2026-01-20 14:28:13.744 226890 DEBUG nova.objects.instance [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 29f0b4d4-abf0-46e7-bf67-38e71eb42e28 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 09:28:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:14.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:14 np0005588920 nova_compute[226886]: 2026-01-20 14:28:14.318 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:28:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:28:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:14.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:28:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:28:14 np0005588920 nova_compute[226886]: 2026-01-20 14:28:14.704 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:28:14 np0005588920 nova_compute[226886]: 2026-01-20 14:28:14.725 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Releasing lock "refresh_cache-29f0b4d4-abf0-46e7-bf67-38e71eb42e28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:28:14 np0005588920 nova_compute[226886]: 2026-01-20 14:28:14.726 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 09:28:14 np0005588920 nova_compute[226886]: 2026-01-20 14:28:14.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:28:14 np0005588920 nova_compute[226886]: 2026-01-20 14:28:14.727 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:28:14 np0005588920 nova_compute[226886]: 2026-01-20 14:28:14.727 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:28:14 np0005588920 nova_compute[226886]: 2026-01-20 14:28:14.727 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:28:14 np0005588920 nova_compute[226886]: 2026-01-20 14:28:14.727 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 20 09:28:15 np0005588920 podman[235579]: 2026-01-20 14:28:15.013030514 +0000 UTC m=+0.088207102 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 20 09:28:15 np0005588920 nova_compute[226886]: 2026-01-20 14:28:15.747 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:28:15 np0005588920 nova_compute[226886]: 2026-01-20 14:28:15.748 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:28:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:16.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:16.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:28:16.429 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:28:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:28:16.430 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:28:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:28:16.430 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:28:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:28:17 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3394207389' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:28:17 np0005588920 nova_compute[226886]: 2026-01-20 14:28:17.619 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:28:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:28:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:18.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:28:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:28:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:18.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:28:18 np0005588920 nova_compute[226886]: 2026-01-20 14:28:18.390 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:28:18 np0005588920 nova_compute[226886]: 2026-01-20 14:28:18.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:28:19 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:28:19 np0005588920 nova_compute[226886]: 2026-01-20 14:28:19.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:28:19 np0005588920 nova_compute[226886]: 2026-01-20 14:28:19.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:28:19 np0005588920 nova_compute[226886]: 2026-01-20 14:28:19.777 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:28:19 np0005588920 nova_compute[226886]: 2026-01-20 14:28:19.778 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:28:19 np0005588920 nova_compute[226886]: 2026-01-20 14:28:19.779 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:28:19 np0005588920 nova_compute[226886]: 2026-01-20 14:28:19.780 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:28:19 np0005588920 nova_compute[226886]: 2026-01-20 14:28:19.780 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:28:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:28:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:20.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:28:20 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:28:20 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4021864044' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:28:20 np0005588920 nova_compute[226886]: 2026-01-20 14:28:20.333 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.552s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:28:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:20.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:20 np0005588920 nova_compute[226886]: 2026-01-20 14:28:20.412 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-00000016 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:28:20 np0005588920 nova_compute[226886]: 2026-01-20 14:28:20.413 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-00000016 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:28:20 np0005588920 nova_compute[226886]: 2026-01-20 14:28:20.611 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:28:20 np0005588920 nova_compute[226886]: 2026-01-20 14:28:20.613 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4650MB free_disk=20.861431121826172GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:28:20 np0005588920 nova_compute[226886]: 2026-01-20 14:28:20.614 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:28:20 np0005588920 nova_compute[226886]: 2026-01-20 14:28:20.615 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:28:20 np0005588920 nova_compute[226886]: 2026-01-20 14:28:20.744 226890 INFO nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Updating resource usage from migration 285e048b-eb47-4e34-b0d7-f2b9b65cbadb#033[00m
Jan 20 09:28:21 np0005588920 nova_compute[226886]: 2026-01-20 14:28:21.010 226890 DEBUG nova.virt.libvirt.driver [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 20 09:28:21 np0005588920 nova_compute[226886]: 2026-01-20 14:28:21.079 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Migration 285e048b-eb47-4e34-b0d7-f2b9b65cbadb is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Jan 20 09:28:21 np0005588920 nova_compute[226886]: 2026-01-20 14:28:21.080 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:28:21 np0005588920 nova_compute[226886]: 2026-01-20 14:28:21.080 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:28:21 np0005588920 nova_compute[226886]: 2026-01-20 14:28:21.205 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Refreshing inventories for resource provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 20 09:28:21 np0005588920 nova_compute[226886]: 2026-01-20 14:28:21.351 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Updating ProviderTree inventory for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 20 09:28:21 np0005588920 nova_compute[226886]: 2026-01-20 14:28:21.352 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Updating inventory in ProviderTree for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 20 09:28:21 np0005588920 nova_compute[226886]: 2026-01-20 14:28:21.375 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Refreshing aggregate associations for resource provider ff38e91c-3320-4831-90ac-bcffc89ba7b6, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 20 09:28:21 np0005588920 nova_compute[226886]: 2026-01-20 14:28:21.392 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Refreshing trait associations for resource provider ff38e91c-3320-4831-90ac-bcffc89ba7b6, traits: COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_DEVICE_TAGGING,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 20 09:28:21 np0005588920 nova_compute[226886]: 2026-01-20 14:28:21.424 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:28:21 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:28:21 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/117801400' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:28:21 np0005588920 nova_compute[226886]: 2026-01-20 14:28:21.900 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:28:21 np0005588920 nova_compute[226886]: 2026-01-20 14:28:21.908 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:28:22 np0005588920 nova_compute[226886]: 2026-01-20 14:28:22.071 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:28:22 np0005588920 nova_compute[226886]: 2026-01-20 14:28:22.098 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:28:22 np0005588920 nova_compute[226886]: 2026-01-20 14:28:22.099 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.485s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:28:22 np0005588920 nova_compute[226886]: 2026-01-20 14:28:22.100 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:28:22 np0005588920 nova_compute[226886]: 2026-01-20 14:28:22.101 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 20 09:28:22 np0005588920 nova_compute[226886]: 2026-01-20 14:28:22.131 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 20 09:28:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:28:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:22.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:28:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:22.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:22 np0005588920 nova_compute[226886]: 2026-01-20 14:28:22.623 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:28:23 np0005588920 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000016.scope: Deactivated successfully.
Jan 20 09:28:23 np0005588920 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000016.scope: Consumed 13.806s CPU time.
Jan 20 09:28:23 np0005588920 systemd-machined[196121]: Machine qemu-10-instance-00000016 terminated.
Jan 20 09:28:23 np0005588920 nova_compute[226886]: 2026-01-20 14:28:23.428 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:28:23 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 09:28:23 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.1 total, 600.0 interval#012Cumulative writes: 12K writes, 50K keys, 12K commit groups, 1.0 writes per commit group, ingest: 0.05 GB, 0.03 MB/s#012Cumulative WAL: 12K writes, 3618 syncs, 3.41 writes per sync, written: 0.05 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 6905 writes, 27K keys, 6905 commit groups, 1.0 writes per commit group, ingest: 31.53 MB, 0.05 MB/s#012Interval WAL: 6905 writes, 2687 syncs, 2.57 writes per sync, written: 0.03 GB, 0.05 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 20 09:28:24 np0005588920 nova_compute[226886]: 2026-01-20 14:28:24.026 226890 INFO nova.virt.libvirt.driver [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Instance shutdown successfully after 13 seconds.#033[00m
Jan 20 09:28:24 np0005588920 nova_compute[226886]: 2026-01-20 14:28:24.032 226890 INFO nova.virt.libvirt.driver [-] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Instance destroyed successfully.#033[00m
Jan 20 09:28:24 np0005588920 nova_compute[226886]: 2026-01-20 14:28:24.036 226890 DEBUG nova.virt.libvirt.driver [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] skipping disk for instance-00000016 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:28:24 np0005588920 nova_compute[226886]: 2026-01-20 14:28:24.037 226890 DEBUG nova.virt.libvirt.driver [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] skipping disk for instance-00000016 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:28:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:24.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:24 np0005588920 nova_compute[226886]: 2026-01-20 14:28:24.277 226890 DEBUG oslo_concurrency.lockutils [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Acquiring lock "29f0b4d4-abf0-46e7-bf67-38e71eb42e28-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:28:24 np0005588920 nova_compute[226886]: 2026-01-20 14:28:24.278 226890 DEBUG oslo_concurrency.lockutils [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Lock "29f0b4d4-abf0-46e7-bf67-38e71eb42e28-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:28:24 np0005588920 nova_compute[226886]: 2026-01-20 14:28:24.278 226890 DEBUG oslo_concurrency.lockutils [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Lock "29f0b4d4-abf0-46e7-bf67-38e71eb42e28-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:28:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:24.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:24 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:28:25 np0005588920 ceph-mgr[77507]: client.0 ms_handle_reset on v2:192.168.122.100:6800/2542147622
Jan 20 09:28:25 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e153 e153: 3 total, 3 up, 3 in
Jan 20 09:28:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:26.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:28:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:26.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:28:27 np0005588920 nova_compute[226886]: 2026-01-20 14:28:27.626 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:28:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:28:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:28.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:28:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:28:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:28.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:28:28 np0005588920 nova_compute[226886]: 2026-01-20 14:28:28.479 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:28:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e153 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:28:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:30.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:30.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:30 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e154 e154: 3 total, 3 up, 3 in
Jan 20 09:28:31 np0005588920 nova_compute[226886]: 2026-01-20 14:28:31.403 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:28:31 np0005588920 nova_compute[226886]: 2026-01-20 14:28:31.421 226890 WARNING nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] While synchronizing instance power states, found 0 instances in the database and 1 instances on the hypervisor.#033[00m
Jan 20 09:28:31 np0005588920 nova_compute[226886]: 2026-01-20 14:28:31.532 226890 DEBUG oslo_concurrency.lockutils [None req-a8f7e5ca-94f7-4dcd-9d80-48926292c8f4 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Acquiring lock "29f0b4d4-abf0-46e7-bf67-38e71eb42e28" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:28:31 np0005588920 nova_compute[226886]: 2026-01-20 14:28:31.533 226890 DEBUG oslo_concurrency.lockutils [None req-a8f7e5ca-94f7-4dcd-9d80-48926292c8f4 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "29f0b4d4-abf0-46e7-bf67-38e71eb42e28" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:28:31 np0005588920 nova_compute[226886]: 2026-01-20 14:28:31.533 226890 DEBUG nova.compute.manager [None req-a8f7e5ca-94f7-4dcd-9d80-48926292c8f4 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Going to confirm migration 5 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Jan 20 09:28:31 np0005588920 nova_compute[226886]: 2026-01-20 14:28:31.960 226890 DEBUG oslo_concurrency.lockutils [None req-a8f7e5ca-94f7-4dcd-9d80-48926292c8f4 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Acquiring lock "refresh_cache-29f0b4d4-abf0-46e7-bf67-38e71eb42e28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:28:31 np0005588920 nova_compute[226886]: 2026-01-20 14:28:31.960 226890 DEBUG oslo_concurrency.lockutils [None req-a8f7e5ca-94f7-4dcd-9d80-48926292c8f4 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Acquired lock "refresh_cache-29f0b4d4-abf0-46e7-bf67-38e71eb42e28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:28:31 np0005588920 nova_compute[226886]: 2026-01-20 14:28:31.961 226890 DEBUG nova.network.neutron [None req-a8f7e5ca-94f7-4dcd-9d80-48926292c8f4 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:28:31 np0005588920 nova_compute[226886]: 2026-01-20 14:28:31.961 226890 DEBUG nova.objects.instance [None req-a8f7e5ca-94f7-4dcd-9d80-48926292c8f4 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lazy-loading 'info_cache' on Instance uuid 29f0b4d4-abf0-46e7-bf67-38e71eb42e28 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:28:32 np0005588920 nova_compute[226886]: 2026-01-20 14:28:32.231 226890 DEBUG nova.network.neutron [None req-a8f7e5ca-94f7-4dcd-9d80-48926292c8f4 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:28:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:32.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:28:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:32.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:28:32 np0005588920 nova_compute[226886]: 2026-01-20 14:28:32.556 226890 DEBUG nova.network.neutron [None req-a8f7e5ca-94f7-4dcd-9d80-48926292c8f4 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:28:32 np0005588920 nova_compute[226886]: 2026-01-20 14:28:32.580 226890 DEBUG oslo_concurrency.lockutils [None req-a8f7e5ca-94f7-4dcd-9d80-48926292c8f4 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Releasing lock "refresh_cache-29f0b4d4-abf0-46e7-bf67-38e71eb42e28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:28:32 np0005588920 nova_compute[226886]: 2026-01-20 14:28:32.581 226890 DEBUG nova.objects.instance [None req-a8f7e5ca-94f7-4dcd-9d80-48926292c8f4 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lazy-loading 'migration_context' on Instance uuid 29f0b4d4-abf0-46e7-bf67-38e71eb42e28 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:28:32 np0005588920 nova_compute[226886]: 2026-01-20 14:28:32.640 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:28:32 np0005588920 nova_compute[226886]: 2026-01-20 14:28:32.696 226890 DEBUG nova.storage.rbd_utils [None req-a8f7e5ca-94f7-4dcd-9d80-48926292c8f4 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] removing snapshot(nova-resize) on rbd image(29f0b4d4-abf0-46e7-bf67-38e71eb42e28_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 20 09:28:33 np0005588920 podman[235682]: 2026-01-20 14:28:33.051549695 +0000 UTC m=+0.123400200 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller)
Jan 20 09:28:33 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e155 e155: 3 total, 3 up, 3 in
Jan 20 09:28:33 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 09:28:33 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2290410301' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 09:28:33 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 09:28:33 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2290410301' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 09:28:33 np0005588920 nova_compute[226886]: 2026-01-20 14:28:33.126 226890 DEBUG oslo_concurrency.lockutils [None req-a8f7e5ca-94f7-4dcd-9d80-48926292c8f4 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:28:33 np0005588920 nova_compute[226886]: 2026-01-20 14:28:33.127 226890 DEBUG oslo_concurrency.lockutils [None req-a8f7e5ca-94f7-4dcd-9d80-48926292c8f4 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:28:33 np0005588920 nova_compute[226886]: 2026-01-20 14:28:33.199 226890 DEBUG oslo_concurrency.processutils [None req-a8f7e5ca-94f7-4dcd-9d80-48926292c8f4 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:28:33 np0005588920 nova_compute[226886]: 2026-01-20 14:28:33.485 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:28:33 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:28:33 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1026837311' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:28:33 np0005588920 nova_compute[226886]: 2026-01-20 14:28:33.657 226890 DEBUG oslo_concurrency.processutils [None req-a8f7e5ca-94f7-4dcd-9d80-48926292c8f4 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:28:33 np0005588920 nova_compute[226886]: 2026-01-20 14:28:33.664 226890 DEBUG nova.compute.provider_tree [None req-a8f7e5ca-94f7-4dcd-9d80-48926292c8f4 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:28:33 np0005588920 nova_compute[226886]: 2026-01-20 14:28:33.699 226890 DEBUG nova.scheduler.client.report [None req-a8f7e5ca-94f7-4dcd-9d80-48926292c8f4 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:28:33 np0005588920 nova_compute[226886]: 2026-01-20 14:28:33.762 226890 DEBUG oslo_concurrency.lockutils [None req-a8f7e5ca-94f7-4dcd-9d80-48926292c8f4 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.635s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:28:33 np0005588920 nova_compute[226886]: 2026-01-20 14:28:33.856 226890 INFO nova.scheduler.client.report [None req-a8f7e5ca-94f7-4dcd-9d80-48926292c8f4 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Deleted allocation for migration 285e048b-eb47-4e34-b0d7-f2b9b65cbadb#033[00m
Jan 20 09:28:33 np0005588920 nova_compute[226886]: 2026-01-20 14:28:33.976 226890 DEBUG oslo_concurrency.lockutils [None req-a8f7e5ca-94f7-4dcd-9d80-48926292c8f4 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "29f0b4d4-abf0-46e7-bf67-38e71eb42e28" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 2.443s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:28:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:34.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:34.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:34 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:28:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:36.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:28:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:36.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:28:37 np0005588920 nova_compute[226886]: 2026-01-20 14:28:37.643 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:28:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:38.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:28:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:38.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:28:38 np0005588920 nova_compute[226886]: 2026-01-20 14:28:38.520 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:28:38 np0005588920 nova_compute[226886]: 2026-01-20 14:28:38.544 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768919303.5434382, 29f0b4d4-abf0-46e7-bf67-38e71eb42e28 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:28:38 np0005588920 nova_compute[226886]: 2026-01-20 14:28:38.545 226890 INFO nova.compute.manager [-] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:28:38 np0005588920 nova_compute[226886]: 2026-01-20 14:28:38.606 226890 DEBUG nova.compute.manager [None req-94aa9b6d-d577-438a-9a17-766350dc4a05 - - - - - -] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:28:39 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:28:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:28:39.891 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:28:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:28:39.892 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:28:39 np0005588920 nova_compute[226886]: 2026-01-20 14:28:39.921 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:28:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:40.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:40.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:40 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e156 e156: 3 total, 3 up, 3 in
Jan 20 09:28:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:28:41.894 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:28:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:42.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:28:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:42.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:28:42 np0005588920 nova_compute[226886]: 2026-01-20 14:28:42.646 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:28:43 np0005588920 nova_compute[226886]: 2026-01-20 14:28:43.560 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:28:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:44.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:44.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e156 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:28:46 np0005588920 podman[235730]: 2026-01-20 14:28:46.024361372 +0000 UTC m=+0.106970534 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 20 09:28:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:46.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:28:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:46.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:28:47 np0005588920 nova_compute[226886]: 2026-01-20 14:28:47.498 226890 DEBUG nova.compute.manager [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Jan 20 09:28:47 np0005588920 nova_compute[226886]: 2026-01-20 14:28:47.629 226890 DEBUG oslo_concurrency.lockutils [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:28:47 np0005588920 nova_compute[226886]: 2026-01-20 14:28:47.629 226890 DEBUG oslo_concurrency.lockutils [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:28:47 np0005588920 nova_compute[226886]: 2026-01-20 14:28:47.648 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:28:47 np0005588920 nova_compute[226886]: 2026-01-20 14:28:47.655 226890 DEBUG nova.objects.instance [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lazy-loading 'pci_requests' on Instance uuid 87fe16d6-774e-4002-8df4-9eb202621ab9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:28:47 np0005588920 nova_compute[226886]: 2026-01-20 14:28:47.675 226890 DEBUG nova.virt.hardware [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:28:47 np0005588920 nova_compute[226886]: 2026-01-20 14:28:47.676 226890 INFO nova.compute.claims [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 09:28:47 np0005588920 nova_compute[226886]: 2026-01-20 14:28:47.676 226890 DEBUG nova.objects.instance [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lazy-loading 'resources' on Instance uuid 87fe16d6-774e-4002-8df4-9eb202621ab9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:28:47 np0005588920 nova_compute[226886]: 2026-01-20 14:28:47.690 226890 DEBUG nova.objects.instance [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lazy-loading 'pci_devices' on Instance uuid 87fe16d6-774e-4002-8df4-9eb202621ab9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:28:47 np0005588920 nova_compute[226886]: 2026-01-20 14:28:47.748 226890 INFO nova.compute.resource_tracker [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Updating resource usage from migration 8dc09b06-46b2-4315-8857-eb43dfbe98ff#033[00m
Jan 20 09:28:47 np0005588920 nova_compute[226886]: 2026-01-20 14:28:47.749 226890 DEBUG nova.compute.resource_tracker [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Starting to track incoming migration 8dc09b06-46b2-4315-8857-eb43dfbe98ff with flavor 30c26a27-d918-46d8-a512-4ef3b4ce5955 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Jan 20 09:28:47 np0005588920 nova_compute[226886]: 2026-01-20 14:28:47.813 226890 DEBUG oslo_concurrency.processutils [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:28:48 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:28:48 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/663865372' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:28:48 np0005588920 nova_compute[226886]: 2026-01-20 14:28:48.280 226890 DEBUG oslo_concurrency.processutils [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:28:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:48.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:48 np0005588920 nova_compute[226886]: 2026-01-20 14:28:48.289 226890 DEBUG nova.compute.provider_tree [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:28:48 np0005588920 nova_compute[226886]: 2026-01-20 14:28:48.317 226890 DEBUG nova.scheduler.client.report [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:28:48 np0005588920 nova_compute[226886]: 2026-01-20 14:28:48.352 226890 DEBUG oslo_concurrency.lockutils [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.723s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:28:48 np0005588920 nova_compute[226886]: 2026-01-20 14:28:48.353 226890 INFO nova.compute.manager [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Migrating#033[00m
Jan 20 09:28:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:28:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:48.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:28:48 np0005588920 nova_compute[226886]: 2026-01-20 14:28:48.599 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:28:49 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e156 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:28:49 np0005588920 systemd[1]: Created slice User Slice of UID 42436.
Jan 20 09:28:49 np0005588920 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 20 09:28:49 np0005588920 systemd-logind[783]: New session 51 of user nova.
Jan 20 09:28:49 np0005588920 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 20 09:28:49 np0005588920 systemd[1]: Starting User Manager for UID 42436...
Jan 20 09:28:49 np0005588920 systemd[235775]: Queued start job for default target Main User Target.
Jan 20 09:28:49 np0005588920 systemd[235775]: Created slice User Application Slice.
Jan 20 09:28:49 np0005588920 systemd[235775]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 20 09:28:49 np0005588920 systemd[235775]: Started Daily Cleanup of User's Temporary Directories.
Jan 20 09:28:49 np0005588920 systemd[235775]: Reached target Paths.
Jan 20 09:28:49 np0005588920 systemd[235775]: Reached target Timers.
Jan 20 09:28:49 np0005588920 systemd[235775]: Starting D-Bus User Message Bus Socket...
Jan 20 09:28:49 np0005588920 systemd[235775]: Starting Create User's Volatile Files and Directories...
Jan 20 09:28:49 np0005588920 systemd[235775]: Finished Create User's Volatile Files and Directories.
Jan 20 09:28:49 np0005588920 systemd[235775]: Listening on D-Bus User Message Bus Socket.
Jan 20 09:28:49 np0005588920 systemd[235775]: Reached target Sockets.
Jan 20 09:28:49 np0005588920 systemd[235775]: Reached target Basic System.
Jan 20 09:28:49 np0005588920 systemd[235775]: Reached target Main User Target.
Jan 20 09:28:49 np0005588920 systemd[235775]: Startup finished in 173ms.
Jan 20 09:28:49 np0005588920 systemd[1]: Started User Manager for UID 42436.
Jan 20 09:28:49 np0005588920 systemd[1]: Started Session 51 of User nova.
Jan 20 09:28:50 np0005588920 systemd[1]: session-51.scope: Deactivated successfully.
Jan 20 09:28:50 np0005588920 systemd-logind[783]: Session 51 logged out. Waiting for processes to exit.
Jan 20 09:28:50 np0005588920 systemd-logind[783]: Removed session 51.
Jan 20 09:28:50 np0005588920 systemd-logind[783]: New session 53 of user nova.
Jan 20 09:28:50 np0005588920 systemd[1]: Started Session 53 of User nova.
Jan 20 09:28:50 np0005588920 systemd[1]: session-53.scope: Deactivated successfully.
Jan 20 09:28:50 np0005588920 systemd-logind[783]: Session 53 logged out. Waiting for processes to exit.
Jan 20 09:28:50 np0005588920 systemd-logind[783]: Removed session 53.
Jan 20 09:28:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:50.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:50.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:52.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:52.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:52 np0005588920 nova_compute[226886]: 2026-01-20 14:28:52.651 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:28:53 np0005588920 nova_compute[226886]: 2026-01-20 14:28:53.647 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:28:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:54.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:28:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:54.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:28:54 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e156 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:28:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.002000057s ======
Jan 20 09:28:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:56.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Jan 20 09:28:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:28:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:56.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:28:57 np0005588920 nova_compute[226886]: 2026-01-20 14:28:57.656 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:28:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:58.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:28:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:58.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:58 np0005588920 nova_compute[226886]: 2026-01-20 14:28:58.682 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:28:58 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:28:58 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:28:58 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:28:58 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 20 09:28:58 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:28:59 np0005588920 podman[236189]: 2026-01-20 14:28:59.490390825 +0000 UTC m=+0.047010974 container create 7da3f457cae363c7abf4bd6f0ece8c2f31cc7ba97bb4c3bfdce733edcd41618d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_lovelace, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default)
Jan 20 09:28:59 np0005588920 systemd[1]: Started libpod-conmon-7da3f457cae363c7abf4bd6f0ece8c2f31cc7ba97bb4c3bfdce733edcd41618d.scope.
Jan 20 09:28:59 np0005588920 podman[236189]: 2026-01-20 14:28:59.470490871 +0000 UTC m=+0.027111060 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 09:28:59 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:28:59 np0005588920 podman[236189]: 2026-01-20 14:28:59.586720896 +0000 UTC m=+0.143341065 container init 7da3f457cae363c7abf4bd6f0ece8c2f31cc7ba97bb4c3bfdce733edcd41618d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_lovelace, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 20 09:28:59 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e156 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:28:59 np0005588920 podman[236189]: 2026-01-20 14:28:59.602083602 +0000 UTC m=+0.158703741 container start 7da3f457cae363c7abf4bd6f0ece8c2f31cc7ba97bb4c3bfdce733edcd41618d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_lovelace, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, OSD_FLAVOR=default)
Jan 20 09:28:59 np0005588920 podman[236189]: 2026-01-20 14:28:59.605843978 +0000 UTC m=+0.162464127 container attach 7da3f457cae363c7abf4bd6f0ece8c2f31cc7ba97bb4c3bfdce733edcd41618d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_lovelace, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 20 09:28:59 np0005588920 suspicious_lovelace[236205]: 167 167
Jan 20 09:28:59 np0005588920 systemd[1]: libpod-7da3f457cae363c7abf4bd6f0ece8c2f31cc7ba97bb4c3bfdce733edcd41618d.scope: Deactivated successfully.
Jan 20 09:28:59 np0005588920 podman[236189]: 2026-01-20 14:28:59.614052141 +0000 UTC m=+0.170672340 container died 7da3f457cae363c7abf4bd6f0ece8c2f31cc7ba97bb4c3bfdce733edcd41618d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_lovelace, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.39.3)
Jan 20 09:28:59 np0005588920 systemd[1]: var-lib-containers-storage-overlay-337938fcb22bc4d86c08c0a2c4e6c19f20357084123b24d27506dd2ce0e1b244-merged.mount: Deactivated successfully.
Jan 20 09:28:59 np0005588920 podman[236189]: 2026-01-20 14:28:59.673408784 +0000 UTC m=+0.230028933 container remove 7da3f457cae363c7abf4bd6f0ece8c2f31cc7ba97bb4c3bfdce733edcd41618d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_lovelace, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 20 09:28:59 np0005588920 systemd[1]: libpod-conmon-7da3f457cae363c7abf4bd6f0ece8c2f31cc7ba97bb4c3bfdce733edcd41618d.scope: Deactivated successfully.
Jan 20 09:28:59 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 20 09:28:59 np0005588920 podman[236230]: 2026-01-20 14:28:59.920747286 +0000 UTC m=+0.067596277 container create fe64bd719997387a05dc6756a851cf354d3b615a2f5ac50c55dffbc4a7cdf215 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sad_neumann, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=reef, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 20 09:28:59 np0005588920 podman[236230]: 2026-01-20 14:28:59.891551509 +0000 UTC m=+0.038400560 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 09:28:59 np0005588920 systemd[1]: Started libpod-conmon-fe64bd719997387a05dc6756a851cf354d3b615a2f5ac50c55dffbc4a7cdf215.scope.
Jan 20 09:29:00 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:29:00 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd005a7ed5131a5ae09d7ce26f5c8b780dcd875bbf8a2196197c613946bc7335/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 20 09:29:00 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd005a7ed5131a5ae09d7ce26f5c8b780dcd875bbf8a2196197c613946bc7335/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 20 09:29:00 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd005a7ed5131a5ae09d7ce26f5c8b780dcd875bbf8a2196197c613946bc7335/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 20 09:29:00 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd005a7ed5131a5ae09d7ce26f5c8b780dcd875bbf8a2196197c613946bc7335/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 20 09:29:00 np0005588920 podman[236230]: 2026-01-20 14:29:00.053540941 +0000 UTC m=+0.200389982 container init fe64bd719997387a05dc6756a851cf354d3b615a2f5ac50c55dffbc4a7cdf215 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sad_neumann, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, OSD_FLAVOR=default)
Jan 20 09:29:00 np0005588920 podman[236230]: 2026-01-20 14:29:00.066996023 +0000 UTC m=+0.213845004 container start fe64bd719997387a05dc6756a851cf354d3b615a2f5ac50c55dffbc4a7cdf215 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sad_neumann, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 20 09:29:00 np0005588920 podman[236230]: 2026-01-20 14:29:00.070733809 +0000 UTC m=+0.217582800 container attach fe64bd719997387a05dc6756a851cf354d3b615a2f5ac50c55dffbc4a7cdf215 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sad_neumann, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507)
Jan 20 09:29:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.002000057s ======
Jan 20 09:29:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:00.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Jan 20 09:29:00 np0005588920 systemd[1]: Stopping User Manager for UID 42436...
Jan 20 09:29:00 np0005588920 systemd[235775]: Activating special unit Exit the Session...
Jan 20 09:29:00 np0005588920 systemd[235775]: Stopped target Main User Target.
Jan 20 09:29:00 np0005588920 systemd[235775]: Stopped target Basic System.
Jan 20 09:29:00 np0005588920 systemd[235775]: Stopped target Paths.
Jan 20 09:29:00 np0005588920 systemd[235775]: Stopped target Sockets.
Jan 20 09:29:00 np0005588920 systemd[235775]: Stopped target Timers.
Jan 20 09:29:00 np0005588920 systemd[235775]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 20 09:29:00 np0005588920 systemd[235775]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 20 09:29:00 np0005588920 systemd[235775]: Closed D-Bus User Message Bus Socket.
Jan 20 09:29:00 np0005588920 systemd[235775]: Stopped Create User's Volatile Files and Directories.
Jan 20 09:29:00 np0005588920 systemd[235775]: Removed slice User Application Slice.
Jan 20 09:29:00 np0005588920 systemd[235775]: Reached target Shutdown.
Jan 20 09:29:00 np0005588920 systemd[235775]: Finished Exit the Session.
Jan 20 09:29:00 np0005588920 systemd[235775]: Reached target Exit the Session.
Jan 20 09:29:00 np0005588920 systemd[1]: user@42436.service: Deactivated successfully.
Jan 20 09:29:00 np0005588920 systemd[1]: Stopped User Manager for UID 42436.
Jan 20 09:29:00 np0005588920 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 20 09:29:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:00.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:00 np0005588920 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 20 09:29:00 np0005588920 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 20 09:29:00 np0005588920 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 20 09:29:00 np0005588920 systemd[1]: Removed slice User Slice of UID 42436.
Jan 20 09:29:01 np0005588920 sad_neumann[236247]: [
Jan 20 09:29:01 np0005588920 sad_neumann[236247]:    {
Jan 20 09:29:01 np0005588920 sad_neumann[236247]:        "available": false,
Jan 20 09:29:01 np0005588920 sad_neumann[236247]:        "ceph_device": false,
Jan 20 09:29:01 np0005588920 sad_neumann[236247]:        "device_id": "QEMU_DVD-ROM_QM00001",
Jan 20 09:29:01 np0005588920 sad_neumann[236247]:        "lsm_data": {},
Jan 20 09:29:01 np0005588920 sad_neumann[236247]:        "lvs": [],
Jan 20 09:29:01 np0005588920 sad_neumann[236247]:        "path": "/dev/sr0",
Jan 20 09:29:01 np0005588920 sad_neumann[236247]:        "rejected_reasons": [
Jan 20 09:29:01 np0005588920 sad_neumann[236247]:            "Has a FileSystem",
Jan 20 09:29:01 np0005588920 sad_neumann[236247]:            "Insufficient space (<5GB)"
Jan 20 09:29:01 np0005588920 sad_neumann[236247]:        ],
Jan 20 09:29:01 np0005588920 sad_neumann[236247]:        "sys_api": {
Jan 20 09:29:01 np0005588920 sad_neumann[236247]:            "actuators": null,
Jan 20 09:29:01 np0005588920 sad_neumann[236247]:            "device_nodes": "sr0",
Jan 20 09:29:01 np0005588920 sad_neumann[236247]:            "devname": "sr0",
Jan 20 09:29:01 np0005588920 sad_neumann[236247]:            "human_readable_size": "482.00 KB",
Jan 20 09:29:01 np0005588920 sad_neumann[236247]:            "id_bus": "ata",
Jan 20 09:29:01 np0005588920 sad_neumann[236247]:            "model": "QEMU DVD-ROM",
Jan 20 09:29:01 np0005588920 sad_neumann[236247]:            "nr_requests": "2",
Jan 20 09:29:01 np0005588920 sad_neumann[236247]:            "parent": "/dev/sr0",
Jan 20 09:29:01 np0005588920 sad_neumann[236247]:            "partitions": {},
Jan 20 09:29:01 np0005588920 sad_neumann[236247]:            "path": "/dev/sr0",
Jan 20 09:29:01 np0005588920 sad_neumann[236247]:            "removable": "1",
Jan 20 09:29:01 np0005588920 sad_neumann[236247]:            "rev": "2.5+",
Jan 20 09:29:01 np0005588920 sad_neumann[236247]:            "ro": "0",
Jan 20 09:29:01 np0005588920 sad_neumann[236247]:            "rotational": "1",
Jan 20 09:29:01 np0005588920 sad_neumann[236247]:            "sas_address": "",
Jan 20 09:29:01 np0005588920 sad_neumann[236247]:            "sas_device_handle": "",
Jan 20 09:29:01 np0005588920 sad_neumann[236247]:            "scheduler_mode": "mq-deadline",
Jan 20 09:29:01 np0005588920 sad_neumann[236247]:            "sectors": 0,
Jan 20 09:29:01 np0005588920 sad_neumann[236247]:            "sectorsize": "2048",
Jan 20 09:29:01 np0005588920 sad_neumann[236247]:            "size": 493568.0,
Jan 20 09:29:01 np0005588920 sad_neumann[236247]:            "support_discard": "2048",
Jan 20 09:29:01 np0005588920 sad_neumann[236247]:            "type": "disk",
Jan 20 09:29:01 np0005588920 sad_neumann[236247]:            "vendor": "QEMU"
Jan 20 09:29:01 np0005588920 sad_neumann[236247]:        }
Jan 20 09:29:01 np0005588920 sad_neumann[236247]:    }
Jan 20 09:29:01 np0005588920 sad_neumann[236247]: ]
Jan 20 09:29:01 np0005588920 systemd[1]: libpod-fe64bd719997387a05dc6756a851cf354d3b615a2f5ac50c55dffbc4a7cdf215.scope: Deactivated successfully.
Jan 20 09:29:01 np0005588920 systemd[1]: libpod-fe64bd719997387a05dc6756a851cf354d3b615a2f5ac50c55dffbc4a7cdf215.scope: Consumed 1.347s CPU time.
Jan 20 09:29:01 np0005588920 podman[236230]: 2026-01-20 14:29:01.402891258 +0000 UTC m=+1.549740239 container died fe64bd719997387a05dc6756a851cf354d3b615a2f5ac50c55dffbc4a7cdf215 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sad_neumann, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef)
Jan 20 09:29:01 np0005588920 systemd[1]: var-lib-containers-storage-overlay-cd005a7ed5131a5ae09d7ce26f5c8b780dcd875bbf8a2196197c613946bc7335-merged.mount: Deactivated successfully.
Jan 20 09:29:01 np0005588920 podman[236230]: 2026-01-20 14:29:01.455316284 +0000 UTC m=+1.602165235 container remove fe64bd719997387a05dc6756a851cf354d3b615a2f5ac50c55dffbc4a7cdf215 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sad_neumann, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 20 09:29:01 np0005588920 systemd[1]: libpod-conmon-fe64bd719997387a05dc6756a851cf354d3b615a2f5ac50c55dffbc4a7cdf215.scope: Deactivated successfully.
Jan 20 09:29:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:02.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:02.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:02 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:29:02 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:29:02 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:29:02 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:29:02 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:29:02 np0005588920 nova_compute[226886]: 2026-01-20 14:29:02.660 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:03 np0005588920 nova_compute[226886]: 2026-01-20 14:29:03.684 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:04 np0005588920 podman[237314]: 2026-01-20 14:29:04.050924404 +0000 UTC m=+0.130569173 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller)
Jan 20 09:29:04 np0005588920 nova_compute[226886]: 2026-01-20 14:29:04.289 226890 DEBUG oslo_concurrency.lockutils [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Acquiring lock "refresh_cache-87fe16d6-774e-4002-8df4-9eb202621ab9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:29:04 np0005588920 nova_compute[226886]: 2026-01-20 14:29:04.290 226890 DEBUG oslo_concurrency.lockutils [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Acquired lock "refresh_cache-87fe16d6-774e-4002-8df4-9eb202621ab9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:29:04 np0005588920 nova_compute[226886]: 2026-01-20 14:29:04.290 226890 DEBUG nova.network.neutron [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:29:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:04.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:04.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:04 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e156 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:29:04 np0005588920 nova_compute[226886]: 2026-01-20 14:29:04.619 226890 DEBUG nova.network.neutron [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:29:05 np0005588920 nova_compute[226886]: 2026-01-20 14:29:05.099 226890 DEBUG nova.network.neutron [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:29:05 np0005588920 nova_compute[226886]: 2026-01-20 14:29:05.125 226890 DEBUG oslo_concurrency.lockutils [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Releasing lock "refresh_cache-87fe16d6-774e-4002-8df4-9eb202621ab9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:29:05 np0005588920 nova_compute[226886]: 2026-01-20 14:29:05.242 226890 DEBUG nova.virt.libvirt.driver [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Jan 20 09:29:05 np0005588920 nova_compute[226886]: 2026-01-20 14:29:05.244 226890 DEBUG nova.virt.libvirt.driver [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 20 09:29:05 np0005588920 nova_compute[226886]: 2026-01-20 14:29:05.245 226890 INFO nova.virt.libvirt.driver [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Creating image(s)#033[00m
Jan 20 09:29:05 np0005588920 nova_compute[226886]: 2026-01-20 14:29:05.292 226890 DEBUG nova.storage.rbd_utils [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] creating snapshot(nova-resize) on rbd image(87fe16d6-774e-4002-8df4-9eb202621ab9_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 20 09:29:05 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e157 e157: 3 total, 3 up, 3 in
Jan 20 09:29:05 np0005588920 nova_compute[226886]: 2026-01-20 14:29:05.585 226890 DEBUG nova.objects.instance [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 87fe16d6-774e-4002-8df4-9eb202621ab9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:29:05 np0005588920 nova_compute[226886]: 2026-01-20 14:29:05.679 226890 DEBUG nova.virt.libvirt.driver [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 20 09:29:05 np0005588920 nova_compute[226886]: 2026-01-20 14:29:05.680 226890 DEBUG nova.virt.libvirt.driver [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Ensure instance console log exists: /var/lib/nova/instances/87fe16d6-774e-4002-8df4-9eb202621ab9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:29:05 np0005588920 nova_compute[226886]: 2026-01-20 14:29:05.680 226890 DEBUG oslo_concurrency.lockutils [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:29:05 np0005588920 nova_compute[226886]: 2026-01-20 14:29:05.680 226890 DEBUG oslo_concurrency.lockutils [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:29:05 np0005588920 nova_compute[226886]: 2026-01-20 14:29:05.681 226890 DEBUG oslo_concurrency.lockutils [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:29:05 np0005588920 nova_compute[226886]: 2026-01-20 14:29:05.682 226890 DEBUG nova.virt.libvirt.driver [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:29:05 np0005588920 nova_compute[226886]: 2026-01-20 14:29:05.685 226890 WARNING nova.virt.libvirt.driver [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:29:05 np0005588920 nova_compute[226886]: 2026-01-20 14:29:05.691 226890 DEBUG nova.virt.libvirt.host [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:29:05 np0005588920 nova_compute[226886]: 2026-01-20 14:29:05.692 226890 DEBUG nova.virt.libvirt.host [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:29:05 np0005588920 nova_compute[226886]: 2026-01-20 14:29:05.697 226890 DEBUG nova.virt.libvirt.host [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:29:05 np0005588920 nova_compute[226886]: 2026-01-20 14:29:05.698 226890 DEBUG nova.virt.libvirt.host [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:29:05 np0005588920 nova_compute[226886]: 2026-01-20 14:29:05.699 226890 DEBUG nova.virt.libvirt.driver [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:29:05 np0005588920 nova_compute[226886]: 2026-01-20 14:29:05.700 226890 DEBUG nova.virt.hardware [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='30c26a27-d918-46d8-a512-4ef3b4ce5955',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:29:05 np0005588920 nova_compute[226886]: 2026-01-20 14:29:05.701 226890 DEBUG nova.virt.hardware [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:29:05 np0005588920 nova_compute[226886]: 2026-01-20 14:29:05.701 226890 DEBUG nova.virt.hardware [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:29:05 np0005588920 nova_compute[226886]: 2026-01-20 14:29:05.702 226890 DEBUG nova.virt.hardware [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:29:05 np0005588920 nova_compute[226886]: 2026-01-20 14:29:05.702 226890 DEBUG nova.virt.hardware [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:29:05 np0005588920 nova_compute[226886]: 2026-01-20 14:29:05.703 226890 DEBUG nova.virt.hardware [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:29:05 np0005588920 nova_compute[226886]: 2026-01-20 14:29:05.703 226890 DEBUG nova.virt.hardware [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:29:05 np0005588920 nova_compute[226886]: 2026-01-20 14:29:05.704 226890 DEBUG nova.virt.hardware [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:29:05 np0005588920 nova_compute[226886]: 2026-01-20 14:29:05.704 226890 DEBUG nova.virt.hardware [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:29:05 np0005588920 nova_compute[226886]: 2026-01-20 14:29:05.704 226890 DEBUG nova.virt.hardware [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:29:05 np0005588920 nova_compute[226886]: 2026-01-20 14:29:05.705 226890 DEBUG nova.virt.hardware [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:29:05 np0005588920 nova_compute[226886]: 2026-01-20 14:29:05.705 226890 DEBUG nova.objects.instance [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 87fe16d6-774e-4002-8df4-9eb202621ab9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:29:05 np0005588920 nova_compute[226886]: 2026-01-20 14:29:05.726 226890 DEBUG oslo_concurrency.processutils [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:29:06 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:29:06 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2168453940' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:29:06 np0005588920 nova_compute[226886]: 2026-01-20 14:29:06.235 226890 DEBUG oslo_concurrency.processutils [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.509s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:29:06 np0005588920 nova_compute[226886]: 2026-01-20 14:29:06.284 226890 DEBUG oslo_concurrency.processutils [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:29:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:06.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:29:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:06.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:29:06 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:29:06 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2585897243' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:29:06 np0005588920 nova_compute[226886]: 2026-01-20 14:29:06.726 226890 DEBUG oslo_concurrency.processutils [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:29:06 np0005588920 nova_compute[226886]: 2026-01-20 14:29:06.730 226890 DEBUG nova.virt.libvirt.driver [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:29:06 np0005588920 nova_compute[226886]:  <uuid>87fe16d6-774e-4002-8df4-9eb202621ab9</uuid>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:  <name>instance-00000018</name>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:  <memory>196608</memory>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:29:06 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:      <nova:name>tempest-MigrationsAdminTest-server-724945079</nova:name>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:29:05</nova:creationTime>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.micro">
Jan 20 09:29:06 np0005588920 nova_compute[226886]:        <nova:memory>192</nova:memory>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:        <nova:user uuid="01a3d712f05049b19d4ecc7051720ad5">tempest-MigrationsAdminTest-1518611738-project-member</nova:user>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:        <nova:project uuid="f3c2e72a7148496394c8bcd618a19c80">tempest-MigrationsAdminTest-1518611738</nova:project>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:      <nova:ports/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:      <entry name="serial">87fe16d6-774e-4002-8df4-9eb202621ab9</entry>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:      <entry name="uuid">87fe16d6-774e-4002-8df4-9eb202621ab9</entry>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:29:06 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/87fe16d6-774e-4002-8df4-9eb202621ab9_disk">
Jan 20 09:29:06 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:29:06 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:29:06 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/87fe16d6-774e-4002-8df4-9eb202621ab9_disk.config">
Jan 20 09:29:06 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:29:06 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:29:06 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/87fe16d6-774e-4002-8df4-9eb202621ab9/console.log" append="off"/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:29:06 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:29:06 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:29:06 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:29:06 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:29:06 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:29:06 np0005588920 nova_compute[226886]: 2026-01-20 14:29:06.798 226890 DEBUG nova.virt.libvirt.driver [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:29:06 np0005588920 nova_compute[226886]: 2026-01-20 14:29:06.799 226890 DEBUG nova.virt.libvirt.driver [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:29:06 np0005588920 nova_compute[226886]: 2026-01-20 14:29:06.800 226890 INFO nova.virt.libvirt.driver [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Using config drive#033[00m
Jan 20 09:29:06 np0005588920 systemd-machined[196121]: New machine qemu-11-instance-00000018.
Jan 20 09:29:06 np0005588920 systemd[1]: Started Virtual Machine qemu-11-instance-00000018.
Jan 20 09:29:07 np0005588920 nova_compute[226886]: 2026-01-20 14:29:07.338 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919347.3353462, 87fe16d6-774e-4002-8df4-9eb202621ab9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:29:07 np0005588920 nova_compute[226886]: 2026-01-20 14:29:07.339 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:29:07 np0005588920 nova_compute[226886]: 2026-01-20 14:29:07.341 226890 DEBUG nova.compute.manager [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:29:07 np0005588920 nova_compute[226886]: 2026-01-20 14:29:07.345 226890 INFO nova.virt.libvirt.driver [-] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Instance running successfully.#033[00m
Jan 20 09:29:07 np0005588920 virtqemud[226436]: argument unsupported: QEMU guest agent is not configured
Jan 20 09:29:07 np0005588920 nova_compute[226886]: 2026-01-20 14:29:07.349 226890 DEBUG nova.virt.libvirt.guest [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 20 09:29:07 np0005588920 nova_compute[226886]: 2026-01-20 14:29:07.349 226890 DEBUG nova.virt.libvirt.driver [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Jan 20 09:29:07 np0005588920 nova_compute[226886]: 2026-01-20 14:29:07.366 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:29:07 np0005588920 nova_compute[226886]: 2026-01-20 14:29:07.369 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:29:07 np0005588920 nova_compute[226886]: 2026-01-20 14:29:07.417 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Jan 20 09:29:07 np0005588920 nova_compute[226886]: 2026-01-20 14:29:07.418 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919347.3380616, 87fe16d6-774e-4002-8df4-9eb202621ab9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:29:07 np0005588920 nova_compute[226886]: 2026-01-20 14:29:07.418 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] VM Started (Lifecycle Event)#033[00m
Jan 20 09:29:07 np0005588920 nova_compute[226886]: 2026-01-20 14:29:07.451 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:29:07 np0005588920 nova_compute[226886]: 2026-01-20 14:29:07.454 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:29:07 np0005588920 nova_compute[226886]: 2026-01-20 14:29:07.663 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:08.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:29:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:08.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:29:08 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:29:08 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:29:08 np0005588920 nova_compute[226886]: 2026-01-20 14:29:08.731 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:29:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:10.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:10.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:10 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e158 e158: 3 total, 3 up, 3 in
Jan 20 09:29:11 np0005588920 nova_compute[226886]: 2026-01-20 14:29:11.743 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:29:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:12.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:12.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:12 np0005588920 nova_compute[226886]: 2026-01-20 14:29:12.666 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 09:29:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/636651936' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 09:29:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 09:29:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/636651936' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 09:29:13 np0005588920 nova_compute[226886]: 2026-01-20 14:29:13.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:29:13 np0005588920 nova_compute[226886]: 2026-01-20 14:29:13.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:29:13 np0005588920 nova_compute[226886]: 2026-01-20 14:29:13.726 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 09:29:13 np0005588920 nova_compute[226886]: 2026-01-20 14:29:13.733 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 09:29:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1914442263' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 09:29:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 09:29:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1914442263' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 09:29:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:14.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:14 np0005588920 nova_compute[226886]: 2026-01-20 14:29:14.397 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "refresh_cache-87fe16d6-774e-4002-8df4-9eb202621ab9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:29:14 np0005588920 nova_compute[226886]: 2026-01-20 14:29:14.397 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquired lock "refresh_cache-87fe16d6-774e-4002-8df4-9eb202621ab9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:29:14 np0005588920 nova_compute[226886]: 2026-01-20 14:29:14.397 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 09:29:14 np0005588920 nova_compute[226886]: 2026-01-20 14:29:14.398 226890 DEBUG nova.objects.instance [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 87fe16d6-774e-4002-8df4-9eb202621ab9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:29:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:14.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e158 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:29:14 np0005588920 nova_compute[226886]: 2026-01-20 14:29:14.981 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:29:15 np0005588920 nova_compute[226886]: 2026-01-20 14:29:15.473 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:29:15 np0005588920 nova_compute[226886]: 2026-01-20 14:29:15.517 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Releasing lock "refresh_cache-87fe16d6-774e-4002-8df4-9eb202621ab9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:29:15 np0005588920 nova_compute[226886]: 2026-01-20 14:29:15.517 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 09:29:15 np0005588920 nova_compute[226886]: 2026-01-20 14:29:15.518 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:29:15 np0005588920 nova_compute[226886]: 2026-01-20 14:29:15.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:29:15 np0005588920 nova_compute[226886]: 2026-01-20 14:29:15.726 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:29:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:29:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:16.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:29:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:29:16.430 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:29:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:29:16.431 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:29:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:29:16.431 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:29:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:16.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:16 np0005588920 nova_compute[226886]: 2026-01-20 14:29:16.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:29:17 np0005588920 podman[237601]: 2026-01-20 14:29:17.040968632 +0000 UTC m=+0.097787993 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 20 09:29:17 np0005588920 nova_compute[226886]: 2026-01-20 14:29:17.669 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:17 np0005588920 nova_compute[226886]: 2026-01-20 14:29:17.721 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:29:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:18.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:18 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:29:18 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3599865667' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:29:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:18.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:18 np0005588920 nova_compute[226886]: 2026-01-20 14:29:18.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:29:18 np0005588920 nova_compute[226886]: 2026-01-20 14:29:18.774 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:19 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e158 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:29:19 np0005588920 nova_compute[226886]: 2026-01-20 14:29:19.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:29:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:20.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:20.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:21 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e159 e159: 3 total, 3 up, 3 in
Jan 20 09:29:21 np0005588920 nova_compute[226886]: 2026-01-20 14:29:21.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:29:21 np0005588920 nova_compute[226886]: 2026-01-20 14:29:21.769 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:29:21 np0005588920 nova_compute[226886]: 2026-01-20 14:29:21.769 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:29:21 np0005588920 nova_compute[226886]: 2026-01-20 14:29:21.769 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:29:21 np0005588920 nova_compute[226886]: 2026-01-20 14:29:21.770 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:29:21 np0005588920 nova_compute[226886]: 2026-01-20 14:29:21.770 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:29:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:29:22 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/972668459' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:29:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:22.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:22 np0005588920 nova_compute[226886]: 2026-01-20 14:29:22.343 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.573s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:29:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.002000057s ======
Jan 20 09:29:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:22.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Jan 20 09:29:22 np0005588920 nova_compute[226886]: 2026-01-20 14:29:22.488 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-00000018 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:29:22 np0005588920 nova_compute[226886]: 2026-01-20 14:29:22.489 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-00000018 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:29:22 np0005588920 nova_compute[226886]: 2026-01-20 14:29:22.672 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:22 np0005588920 nova_compute[226886]: 2026-01-20 14:29:22.679 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:29:22 np0005588920 nova_compute[226886]: 2026-01-20 14:29:22.680 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4693MB free_disk=20.878726959228516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:29:22 np0005588920 nova_compute[226886]: 2026-01-20 14:29:22.680 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:29:22 np0005588920 nova_compute[226886]: 2026-01-20 14:29:22.680 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:29:22 np0005588920 nova_compute[226886]: 2026-01-20 14:29:22.841 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance 87fe16d6-774e-4002-8df4-9eb202621ab9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:29:22 np0005588920 nova_compute[226886]: 2026-01-20 14:29:22.841 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:29:22 np0005588920 nova_compute[226886]: 2026-01-20 14:29:22.841 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=704MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:29:22 np0005588920 nova_compute[226886]: 2026-01-20 14:29:22.988 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:29:23 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:29:23 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/423882642' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:29:23 np0005588920 nova_compute[226886]: 2026-01-20 14:29:23.452 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:29:23 np0005588920 nova_compute[226886]: 2026-01-20 14:29:23.459 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:29:23 np0005588920 nova_compute[226886]: 2026-01-20 14:29:23.483 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:29:23 np0005588920 nova_compute[226886]: 2026-01-20 14:29:23.536 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:29:23 np0005588920 nova_compute[226886]: 2026-01-20 14:29:23.537 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.857s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:29:23 np0005588920 nova_compute[226886]: 2026-01-20 14:29:23.814 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:24.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:24.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:24 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e159 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:29:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:29:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:26.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:29:26 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #52. Immutable memtables: 0.
Jan 20 09:29:26 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:29:26.395254) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 09:29:26 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 29] Flushing memtable with next log file: 52
Jan 20 09:29:26 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919366395347, "job": 29, "event": "flush_started", "num_memtables": 1, "num_entries": 2464, "num_deletes": 256, "total_data_size": 5766435, "memory_usage": 5862888, "flush_reason": "Manual Compaction"}
Jan 20 09:29:26 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 29] Level-0 flush table #53: started
Jan 20 09:29:26 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919366423417, "cf_name": "default", "job": 29, "event": "table_file_creation", "file_number": 53, "file_size": 3707485, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 25509, "largest_seqno": 27968, "table_properties": {"data_size": 3697574, "index_size": 6213, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2629, "raw_key_size": 21636, "raw_average_key_size": 20, "raw_value_size": 3677289, "raw_average_value_size": 3539, "num_data_blocks": 273, "num_entries": 1039, "num_filter_entries": 1039, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768919183, "oldest_key_time": 1768919183, "file_creation_time": 1768919366, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 53, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:29:26 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 29] Flush lasted 28243 microseconds, and 16309 cpu microseconds.
Jan 20 09:29:26 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:29:26 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:29:26.423492) [db/flush_job.cc:967] [default] [JOB 29] Level-0 flush table #53: 3707485 bytes OK
Jan 20 09:29:26 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:29:26.423522) [db/memtable_list.cc:519] [default] Level-0 commit table #53 started
Jan 20 09:29:26 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:29:26.425281) [db/memtable_list.cc:722] [default] Level-0 commit table #53: memtable #1 done
Jan 20 09:29:26 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:29:26.425306) EVENT_LOG_v1 {"time_micros": 1768919366425298, "job": 29, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 09:29:26 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:29:26.425329) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 09:29:26 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 29] Try to delete WAL files size 5755423, prev total WAL file size 5755423, number of live WAL files 2.
Jan 20 09:29:26 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000049.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:29:26 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:29:26.427528) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032303038' seq:72057594037927935, type:22 .. '7061786F730032323630' seq:0, type:0; will stop at (end)
Jan 20 09:29:26 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 30] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 09:29:26 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 29 Base level 0, inputs: [53(3620KB)], [51(8855KB)]
Jan 20 09:29:26 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919366427607, "job": 30, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [53], "files_L6": [51], "score": -1, "input_data_size": 12775818, "oldest_snapshot_seqno": -1}
Jan 20 09:29:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:29:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:26.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:29:26 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 30] Generated table #54: 5464 keys, 10745469 bytes, temperature: kUnknown
Jan 20 09:29:26 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919366495094, "cf_name": "default", "job": 30, "event": "table_file_creation", "file_number": 54, "file_size": 10745469, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10706891, "index_size": 23812, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13701, "raw_key_size": 137005, "raw_average_key_size": 25, "raw_value_size": 10606353, "raw_average_value_size": 1941, "num_data_blocks": 980, "num_entries": 5464, "num_filter_entries": 5464, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768919366, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 54, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:29:26 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:29:26 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:29:26.495491) [db/compaction/compaction_job.cc:1663] [default] [JOB 30] Compacted 1@0 + 1@6 files to L6 => 10745469 bytes
Jan 20 09:29:26 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:29:26.496842) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 188.8 rd, 158.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.5, 8.6 +0.0 blob) out(10.2 +0.0 blob), read-write-amplify(6.3) write-amplify(2.9) OK, records in: 5995, records dropped: 531 output_compression: NoCompression
Jan 20 09:29:26 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:29:26.496870) EVENT_LOG_v1 {"time_micros": 1768919366496857, "job": 30, "event": "compaction_finished", "compaction_time_micros": 67665, "compaction_time_cpu_micros": 39798, "output_level": 6, "num_output_files": 1, "total_output_size": 10745469, "num_input_records": 5995, "num_output_records": 5464, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 09:29:26 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000053.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:29:26 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919366498063, "job": 30, "event": "table_file_deletion", "file_number": 53}
Jan 20 09:29:26 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000051.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:29:26 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919366501110, "job": 30, "event": "table_file_deletion", "file_number": 51}
Jan 20 09:29:26 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:29:26.427407) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:29:26 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:29:26.501222) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:29:26 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:29:26.501227) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:29:26 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:29:26.501230) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:29:26 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:29:26.501232) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:29:26 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:29:26.501234) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:29:27 np0005588920 nova_compute[226886]: 2026-01-20 14:29:27.674 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:29:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:28.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:29:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:29:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:28.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:29:28 np0005588920 nova_compute[226886]: 2026-01-20 14:29:28.847 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e159 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:29:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:30.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:30.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:32.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:32.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:32 np0005588920 nova_compute[226886]: 2026-01-20 14:29:32.676 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:33 np0005588920 nova_compute[226886]: 2026-01-20 14:29:33.850 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:34.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:34.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:34 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e159 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:29:35 np0005588920 podman[237664]: 2026-01-20 14:29:35.064674273 +0000 UTC m=+0.138200489 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:29:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:29:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:36.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:29:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:29:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:36.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:29:37 np0005588920 nova_compute[226886]: 2026-01-20 14:29:37.679 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:38.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:38.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:38 np0005588920 nova_compute[226886]: 2026-01-20 14:29:38.852 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:39 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e159 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:29:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:40.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:40.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:29:41.326 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:29:41 np0005588920 nova_compute[226886]: 2026-01-20 14:29:41.327 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:29:41.328 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:29:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:42.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:29:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:42.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:29:42 np0005588920 nova_compute[226886]: 2026-01-20 14:29:42.681 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:43 np0005588920 nova_compute[226886]: 2026-01-20 14:29:43.887 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:44.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:29:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:44.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:29:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e159 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:29:46 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #55. Immutable memtables: 0.
Jan 20 09:29:46 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:29:46.168121) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 09:29:46 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 31] Flushing memtable with next log file: 55
Jan 20 09:29:46 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919386168234, "job": 31, "event": "flush_started", "num_memtables": 1, "num_entries": 433, "num_deletes": 255, "total_data_size": 518289, "memory_usage": 528120, "flush_reason": "Manual Compaction"}
Jan 20 09:29:46 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 31] Level-0 flush table #56: started
Jan 20 09:29:46 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919386173601, "cf_name": "default", "job": 31, "event": "table_file_creation", "file_number": 56, "file_size": 342211, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 27973, "largest_seqno": 28401, "table_properties": {"data_size": 339776, "index_size": 535, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 837, "raw_key_size": 5437, "raw_average_key_size": 17, "raw_value_size": 335011, "raw_average_value_size": 1053, "num_data_blocks": 24, "num_entries": 318, "num_filter_entries": 318, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768919367, "oldest_key_time": 1768919367, "file_creation_time": 1768919386, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 56, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:29:46 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 31] Flush lasted 5506 microseconds, and 1847 cpu microseconds.
Jan 20 09:29:46 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:29:46 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:29:46.173641) [db/flush_job.cc:967] [default] [JOB 31] Level-0 flush table #56: 342211 bytes OK
Jan 20 09:29:46 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:29:46.173656) [db/memtable_list.cc:519] [default] Level-0 commit table #56 started
Jan 20 09:29:46 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:29:46.175587) [db/memtable_list.cc:722] [default] Level-0 commit table #56: memtable #1 done
Jan 20 09:29:46 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:29:46.175608) EVENT_LOG_v1 {"time_micros": 1768919386175601, "job": 31, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 09:29:46 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:29:46.175627) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 09:29:46 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 31] Try to delete WAL files size 515567, prev total WAL file size 515567, number of live WAL files 2.
Jan 20 09:29:46 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000052.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:29:46 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:29:46.176142) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00353039' seq:72057594037927935, type:22 .. '6C6F676D00373630' seq:0, type:0; will stop at (end)
Jan 20 09:29:46 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 32] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 09:29:46 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 31 Base level 0, inputs: [56(334KB)], [54(10MB)]
Jan 20 09:29:46 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919386176178, "job": 32, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [56], "files_L6": [54], "score": -1, "input_data_size": 11087680, "oldest_snapshot_seqno": -1}
Jan 20 09:29:46 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 32] Generated table #57: 5264 keys, 10979867 bytes, temperature: kUnknown
Jan 20 09:29:46 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919386280296, "cf_name": "default", "job": 32, "event": "table_file_creation", "file_number": 57, "file_size": 10979867, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10941740, "index_size": 23873, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13189, "raw_key_size": 134015, "raw_average_key_size": 25, "raw_value_size": 10843852, "raw_average_value_size": 2060, "num_data_blocks": 979, "num_entries": 5264, "num_filter_entries": 5264, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768919386, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 57, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:29:46 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:29:46 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:29:46.280556) [db/compaction/compaction_job.cc:1663] [default] [JOB 32] Compacted 1@0 + 1@6 files to L6 => 10979867 bytes
Jan 20 09:29:46 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:29:46.282183) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 106.4 rd, 105.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.3, 10.2 +0.0 blob) out(10.5 +0.0 blob), read-write-amplify(64.5) write-amplify(32.1) OK, records in: 5782, records dropped: 518 output_compression: NoCompression
Jan 20 09:29:46 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:29:46.282219) EVENT_LOG_v1 {"time_micros": 1768919386282210, "job": 32, "event": "compaction_finished", "compaction_time_micros": 104200, "compaction_time_cpu_micros": 34226, "output_level": 6, "num_output_files": 1, "total_output_size": 10979867, "num_input_records": 5782, "num_output_records": 5264, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 09:29:46 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000056.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:29:46 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919386282387, "job": 32, "event": "table_file_deletion", "file_number": 56}
Jan 20 09:29:46 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000054.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:29:46 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919386283917, "job": 32, "event": "table_file_deletion", "file_number": 54}
Jan 20 09:29:46 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:29:46.176071) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:29:46 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:29:46.283972) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:29:46 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:29:46.283977) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:29:46 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:29:46.283978) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:29:46 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:29:46.283980) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:29:46 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:29:46.283981) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:29:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:46.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:46.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e160 e160: 3 total, 3 up, 3 in
Jan 20 09:29:47 np0005588920 nova_compute[226886]: 2026-01-20 14:29:47.686 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:47 np0005588920 podman[237690]: 2026-01-20 14:29:47.999306777 +0000 UTC m=+0.067528503 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 20 09:29:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:29:48.329 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:29:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:48.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:48.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:48 np0005588920 nova_compute[226886]: 2026-01-20 14:29:48.889 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:49 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:29:50 np0005588920 nova_compute[226886]: 2026-01-20 14:29:50.079 226890 DEBUG oslo_concurrency.lockutils [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Acquiring lock "ad1be106-796f-45ef-8eb7-afa4c072b371" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:29:50 np0005588920 nova_compute[226886]: 2026-01-20 14:29:50.080 226890 DEBUG oslo_concurrency.lockutils [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Lock "ad1be106-796f-45ef-8eb7-afa4c072b371" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:29:50 np0005588920 nova_compute[226886]: 2026-01-20 14:29:50.097 226890 DEBUG nova.compute.manager [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:29:50 np0005588920 nova_compute[226886]: 2026-01-20 14:29:50.189 226890 DEBUG oslo_concurrency.lockutils [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:29:50 np0005588920 nova_compute[226886]: 2026-01-20 14:29:50.190 226890 DEBUG oslo_concurrency.lockutils [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:29:50 np0005588920 nova_compute[226886]: 2026-01-20 14:29:50.199 226890 DEBUG nova.virt.hardware [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:29:50 np0005588920 nova_compute[226886]: 2026-01-20 14:29:50.200 226890 INFO nova.compute.claims [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 09:29:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:50.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:50.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:50 np0005588920 nova_compute[226886]: 2026-01-20 14:29:50.536 226890 DEBUG oslo_concurrency.processutils [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:29:50 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:29:50 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/956641195' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:29:51 np0005588920 nova_compute[226886]: 2026-01-20 14:29:50.999 226890 DEBUG oslo_concurrency.processutils [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:29:51 np0005588920 nova_compute[226886]: 2026-01-20 14:29:51.009 226890 DEBUG nova.compute.provider_tree [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:29:51 np0005588920 nova_compute[226886]: 2026-01-20 14:29:51.030 226890 DEBUG nova.scheduler.client.report [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:29:51 np0005588920 nova_compute[226886]: 2026-01-20 14:29:51.061 226890 DEBUG oslo_concurrency.lockutils [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.871s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:29:51 np0005588920 nova_compute[226886]: 2026-01-20 14:29:51.063 226890 DEBUG nova.compute.manager [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:29:51 np0005588920 nova_compute[226886]: 2026-01-20 14:29:51.124 226890 DEBUG nova.compute.manager [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:29:51 np0005588920 nova_compute[226886]: 2026-01-20 14:29:51.125 226890 DEBUG nova.network.neutron [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:29:51 np0005588920 nova_compute[226886]: 2026-01-20 14:29:51.159 226890 INFO nova.virt.libvirt.driver [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:29:51 np0005588920 nova_compute[226886]: 2026-01-20 14:29:51.214 226890 DEBUG nova.compute.manager [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:29:51 np0005588920 nova_compute[226886]: 2026-01-20 14:29:51.409 226890 DEBUG nova.compute.manager [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:29:51 np0005588920 nova_compute[226886]: 2026-01-20 14:29:51.411 226890 DEBUG nova.virt.libvirt.driver [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:29:51 np0005588920 nova_compute[226886]: 2026-01-20 14:29:51.412 226890 INFO nova.virt.libvirt.driver [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Creating image(s)#033[00m
Jan 20 09:29:51 np0005588920 nova_compute[226886]: 2026-01-20 14:29:51.443 226890 DEBUG nova.storage.rbd_utils [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] rbd image ad1be106-796f-45ef-8eb7-afa4c072b371_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:29:51 np0005588920 nova_compute[226886]: 2026-01-20 14:29:51.477 226890 DEBUG nova.storage.rbd_utils [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] rbd image ad1be106-796f-45ef-8eb7-afa4c072b371_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:29:51 np0005588920 nova_compute[226886]: 2026-01-20 14:29:51.512 226890 DEBUG nova.storage.rbd_utils [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] rbd image ad1be106-796f-45ef-8eb7-afa4c072b371_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:29:51 np0005588920 nova_compute[226886]: 2026-01-20 14:29:51.518 226890 DEBUG oslo_concurrency.processutils [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:29:51 np0005588920 nova_compute[226886]: 2026-01-20 14:29:51.554 226890 DEBUG nova.policy [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0cec872a00f742d78563d6d16fc545cb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '78f151250c04467bb4f6a229dda16fc5', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:29:51 np0005588920 nova_compute[226886]: 2026-01-20 14:29:51.608 226890 DEBUG oslo_concurrency.processutils [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:29:51 np0005588920 nova_compute[226886]: 2026-01-20 14:29:51.609 226890 DEBUG oslo_concurrency.lockutils [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:29:51 np0005588920 nova_compute[226886]: 2026-01-20 14:29:51.609 226890 DEBUG oslo_concurrency.lockutils [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:29:51 np0005588920 nova_compute[226886]: 2026-01-20 14:29:51.610 226890 DEBUG oslo_concurrency.lockutils [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:29:51 np0005588920 nova_compute[226886]: 2026-01-20 14:29:51.639 226890 DEBUG nova.storage.rbd_utils [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] rbd image ad1be106-796f-45ef-8eb7-afa4c072b371_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:29:51 np0005588920 nova_compute[226886]: 2026-01-20 14:29:51.642 226890 DEBUG oslo_concurrency.processutils [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 ad1be106-796f-45ef-8eb7-afa4c072b371_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:29:51 np0005588920 nova_compute[226886]: 2026-01-20 14:29:51.923 226890 DEBUG oslo_concurrency.processutils [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 ad1be106-796f-45ef-8eb7-afa4c072b371_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.281s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:29:51 np0005588920 nova_compute[226886]: 2026-01-20 14:29:51.987 226890 DEBUG nova.storage.rbd_utils [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] resizing rbd image ad1be106-796f-45ef-8eb7-afa4c072b371_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:29:52 np0005588920 nova_compute[226886]: 2026-01-20 14:29:52.086 226890 DEBUG nova.objects.instance [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Lazy-loading 'migration_context' on Instance uuid ad1be106-796f-45ef-8eb7-afa4c072b371 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:29:52 np0005588920 nova_compute[226886]: 2026-01-20 14:29:52.105 226890 DEBUG nova.virt.libvirt.driver [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:29:52 np0005588920 nova_compute[226886]: 2026-01-20 14:29:52.106 226890 DEBUG nova.virt.libvirt.driver [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Ensure instance console log exists: /var/lib/nova/instances/ad1be106-796f-45ef-8eb7-afa4c072b371/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:29:52 np0005588920 nova_compute[226886]: 2026-01-20 14:29:52.106 226890 DEBUG oslo_concurrency.lockutils [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:29:52 np0005588920 nova_compute[226886]: 2026-01-20 14:29:52.107 226890 DEBUG oslo_concurrency.lockutils [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:29:52 np0005588920 nova_compute[226886]: 2026-01-20 14:29:52.107 226890 DEBUG oslo_concurrency.lockutils [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:29:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:29:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:52.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:29:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:52.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:52 np0005588920 nova_compute[226886]: 2026-01-20 14:29:52.561 226890 DEBUG nova.network.neutron [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Successfully created port: a3c691ea-b51e-4524-84af-3cbb50dd9a0c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:29:52 np0005588920 nova_compute[226886]: 2026-01-20 14:29:52.688 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:53 np0005588920 nova_compute[226886]: 2026-01-20 14:29:53.891 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:54.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:54 np0005588920 nova_compute[226886]: 2026-01-20 14:29:54.486 226890 DEBUG nova.network.neutron [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Successfully updated port: a3c691ea-b51e-4524-84af-3cbb50dd9a0c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:29:54 np0005588920 nova_compute[226886]: 2026-01-20 14:29:54.517 226890 DEBUG oslo_concurrency.lockutils [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Acquiring lock "refresh_cache-ad1be106-796f-45ef-8eb7-afa4c072b371" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:29:54 np0005588920 nova_compute[226886]: 2026-01-20 14:29:54.517 226890 DEBUG oslo_concurrency.lockutils [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Acquired lock "refresh_cache-ad1be106-796f-45ef-8eb7-afa4c072b371" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:29:54 np0005588920 nova_compute[226886]: 2026-01-20 14:29:54.517 226890 DEBUG nova.network.neutron [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:29:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:54.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:54 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:29:54 np0005588920 nova_compute[226886]: 2026-01-20 14:29:54.685 226890 DEBUG nova.compute.manager [req-7e76d64e-5eb7-4d83-b207-da83825e06a9 req-a9a877f8-997a-42fb-b47c-840450facbab 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Received event network-changed-a3c691ea-b51e-4524-84af-3cbb50dd9a0c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:29:54 np0005588920 nova_compute[226886]: 2026-01-20 14:29:54.685 226890 DEBUG nova.compute.manager [req-7e76d64e-5eb7-4d83-b207-da83825e06a9 req-a9a877f8-997a-42fb-b47c-840450facbab 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Refreshing instance network info cache due to event network-changed-a3c691ea-b51e-4524-84af-3cbb50dd9a0c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:29:54 np0005588920 nova_compute[226886]: 2026-01-20 14:29:54.685 226890 DEBUG oslo_concurrency.lockutils [req-7e76d64e-5eb7-4d83-b207-da83825e06a9 req-a9a877f8-997a-42fb-b47c-840450facbab 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-ad1be106-796f-45ef-8eb7-afa4c072b371" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:29:54 np0005588920 nova_compute[226886]: 2026-01-20 14:29:54.785 226890 DEBUG nova.network.neutron [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:29:56 np0005588920 nova_compute[226886]: 2026-01-20 14:29:56.348 226890 DEBUG nova.network.neutron [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Updating instance_info_cache with network_info: [{"id": "a3c691ea-b51e-4524-84af-3cbb50dd9a0c", "address": "fa:16:3e:03:26:e5", "network": {"id": "01e6deef-9aca-4d36-8215-4517982a86a3", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-70171378-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f151250c04467bb4f6a229dda16fc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3c691ea-b5", "ovs_interfaceid": "a3c691ea-b51e-4524-84af-3cbb50dd9a0c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:29:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:56.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:56 np0005588920 nova_compute[226886]: 2026-01-20 14:29:56.386 226890 DEBUG oslo_concurrency.lockutils [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Releasing lock "refresh_cache-ad1be106-796f-45ef-8eb7-afa4c072b371" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:29:56 np0005588920 nova_compute[226886]: 2026-01-20 14:29:56.387 226890 DEBUG nova.compute.manager [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Instance network_info: |[{"id": "a3c691ea-b51e-4524-84af-3cbb50dd9a0c", "address": "fa:16:3e:03:26:e5", "network": {"id": "01e6deef-9aca-4d36-8215-4517982a86a3", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-70171378-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f151250c04467bb4f6a229dda16fc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3c691ea-b5", "ovs_interfaceid": "a3c691ea-b51e-4524-84af-3cbb50dd9a0c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:29:56 np0005588920 nova_compute[226886]: 2026-01-20 14:29:56.387 226890 DEBUG oslo_concurrency.lockutils [req-7e76d64e-5eb7-4d83-b207-da83825e06a9 req-a9a877f8-997a-42fb-b47c-840450facbab 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-ad1be106-796f-45ef-8eb7-afa4c072b371" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:29:56 np0005588920 nova_compute[226886]: 2026-01-20 14:29:56.388 226890 DEBUG nova.network.neutron [req-7e76d64e-5eb7-4d83-b207-da83825e06a9 req-a9a877f8-997a-42fb-b47c-840450facbab 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Refreshing network info cache for port a3c691ea-b51e-4524-84af-3cbb50dd9a0c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:29:56 np0005588920 nova_compute[226886]: 2026-01-20 14:29:56.393 226890 DEBUG nova.virt.libvirt.driver [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Start _get_guest_xml network_info=[{"id": "a3c691ea-b51e-4524-84af-3cbb50dd9a0c", "address": "fa:16:3e:03:26:e5", "network": {"id": "01e6deef-9aca-4d36-8215-4517982a86a3", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-70171378-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f151250c04467bb4f6a229dda16fc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3c691ea-b5", "ovs_interfaceid": "a3c691ea-b51e-4524-84af-3cbb50dd9a0c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:29:56 np0005588920 nova_compute[226886]: 2026-01-20 14:29:56.400 226890 WARNING nova.virt.libvirt.driver [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:29:56 np0005588920 nova_compute[226886]: 2026-01-20 14:29:56.404 226890 DEBUG nova.virt.libvirt.host [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:29:56 np0005588920 nova_compute[226886]: 2026-01-20 14:29:56.405 226890 DEBUG nova.virt.libvirt.host [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:29:56 np0005588920 nova_compute[226886]: 2026-01-20 14:29:56.409 226890 DEBUG nova.virt.libvirt.host [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:29:56 np0005588920 nova_compute[226886]: 2026-01-20 14:29:56.410 226890 DEBUG nova.virt.libvirt.host [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:29:56 np0005588920 nova_compute[226886]: 2026-01-20 14:29:56.412 226890 DEBUG nova.virt.libvirt.driver [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:29:56 np0005588920 nova_compute[226886]: 2026-01-20 14:29:56.412 226890 DEBUG nova.virt.hardware [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:29:56 np0005588920 nova_compute[226886]: 2026-01-20 14:29:56.413 226890 DEBUG nova.virt.hardware [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:29:56 np0005588920 nova_compute[226886]: 2026-01-20 14:29:56.413 226890 DEBUG nova.virt.hardware [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:29:56 np0005588920 nova_compute[226886]: 2026-01-20 14:29:56.414 226890 DEBUG nova.virt.hardware [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:29:56 np0005588920 nova_compute[226886]: 2026-01-20 14:29:56.414 226890 DEBUG nova.virt.hardware [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:29:56 np0005588920 nova_compute[226886]: 2026-01-20 14:29:56.414 226890 DEBUG nova.virt.hardware [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:29:56 np0005588920 nova_compute[226886]: 2026-01-20 14:29:56.415 226890 DEBUG nova.virt.hardware [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:29:56 np0005588920 nova_compute[226886]: 2026-01-20 14:29:56.415 226890 DEBUG nova.virt.hardware [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:29:56 np0005588920 nova_compute[226886]: 2026-01-20 14:29:56.416 226890 DEBUG nova.virt.hardware [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:29:56 np0005588920 nova_compute[226886]: 2026-01-20 14:29:56.416 226890 DEBUG nova.virt.hardware [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:29:56 np0005588920 nova_compute[226886]: 2026-01-20 14:29:56.417 226890 DEBUG nova.virt.hardware [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:29:56 np0005588920 nova_compute[226886]: 2026-01-20 14:29:56.422 226890 DEBUG oslo_concurrency.processutils [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:29:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:29:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:56.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:29:56 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:29:56 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2270811472' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:29:56 np0005588920 nova_compute[226886]: 2026-01-20 14:29:56.877 226890 DEBUG oslo_concurrency.processutils [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:29:56 np0005588920 nova_compute[226886]: 2026-01-20 14:29:56.903 226890 DEBUG nova.storage.rbd_utils [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] rbd image ad1be106-796f-45ef-8eb7-afa4c072b371_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:29:56 np0005588920 nova_compute[226886]: 2026-01-20 14:29:56.907 226890 DEBUG oslo_concurrency.processutils [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:29:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:29:57 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3031609186' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:29:57 np0005588920 nova_compute[226886]: 2026-01-20 14:29:57.324 226890 DEBUG oslo_concurrency.processutils [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.417s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:29:57 np0005588920 nova_compute[226886]: 2026-01-20 14:29:57.325 226890 DEBUG nova.virt.libvirt.vif [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:29:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-1204930815',display_name='tempest-FloatingIPsAssociationTestJSON-server-1204930815',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-1204930815',id=28,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='78f151250c04467bb4f6a229dda16fc5',ramdisk_id='',reservation_id='r-rz2ag1j8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationTestJSON-146254261',owner_user_name='tempest-FloatingIPsAssociationTestJSON-146254261-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:29:51Z,user_data=None,user_id='0cec872a00f742d78563d6d16fc545cb',uuid=ad1be106-796f-45ef-8eb7-afa4c072b371,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a3c691ea-b51e-4524-84af-3cbb50dd9a0c", "address": "fa:16:3e:03:26:e5", "network": {"id": "01e6deef-9aca-4d36-8215-4517982a86a3", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-70171378-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f151250c04467bb4f6a229dda16fc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3c691ea-b5", "ovs_interfaceid": "a3c691ea-b51e-4524-84af-3cbb50dd9a0c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:29:57 np0005588920 nova_compute[226886]: 2026-01-20 14:29:57.326 226890 DEBUG nova.network.os_vif_util [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Converting VIF {"id": "a3c691ea-b51e-4524-84af-3cbb50dd9a0c", "address": "fa:16:3e:03:26:e5", "network": {"id": "01e6deef-9aca-4d36-8215-4517982a86a3", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-70171378-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f151250c04467bb4f6a229dda16fc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3c691ea-b5", "ovs_interfaceid": "a3c691ea-b51e-4524-84af-3cbb50dd9a0c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:29:57 np0005588920 nova_compute[226886]: 2026-01-20 14:29:57.326 226890 DEBUG nova.network.os_vif_util [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:03:26:e5,bridge_name='br-int',has_traffic_filtering=True,id=a3c691ea-b51e-4524-84af-3cbb50dd9a0c,network=Network(01e6deef-9aca-4d36-8215-4517982a86a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3c691ea-b5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:29:57 np0005588920 nova_compute[226886]: 2026-01-20 14:29:57.328 226890 DEBUG nova.objects.instance [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Lazy-loading 'pci_devices' on Instance uuid ad1be106-796f-45ef-8eb7-afa4c072b371 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:29:57 np0005588920 nova_compute[226886]: 2026-01-20 14:29:57.345 226890 DEBUG nova.virt.libvirt.driver [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:29:57 np0005588920 nova_compute[226886]:  <uuid>ad1be106-796f-45ef-8eb7-afa4c072b371</uuid>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:  <name>instance-0000001c</name>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:29:57 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:      <nova:name>tempest-FloatingIPsAssociationTestJSON-server-1204930815</nova:name>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:29:56</nova:creationTime>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:29:57 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:        <nova:user uuid="0cec872a00f742d78563d6d16fc545cb">tempest-FloatingIPsAssociationTestJSON-146254261-project-member</nova:user>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:        <nova:project uuid="78f151250c04467bb4f6a229dda16fc5">tempest-FloatingIPsAssociationTestJSON-146254261</nova:project>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:        <nova:port uuid="a3c691ea-b51e-4524-84af-3cbb50dd9a0c">
Jan 20 09:29:57 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:      <entry name="serial">ad1be106-796f-45ef-8eb7-afa4c072b371</entry>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:      <entry name="uuid">ad1be106-796f-45ef-8eb7-afa4c072b371</entry>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:29:57 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/ad1be106-796f-45ef-8eb7-afa4c072b371_disk">
Jan 20 09:29:57 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:29:57 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:29:57 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/ad1be106-796f-45ef-8eb7-afa4c072b371_disk.config">
Jan 20 09:29:57 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:29:57 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:29:57 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:03:26:e5"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:      <target dev="tapa3c691ea-b5"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:29:57 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/ad1be106-796f-45ef-8eb7-afa4c072b371/console.log" append="off"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:29:57 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:29:57 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:29:57 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:29:57 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:29:57 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:29:57 np0005588920 nova_compute[226886]: 2026-01-20 14:29:57.346 226890 DEBUG nova.compute.manager [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Preparing to wait for external event network-vif-plugged-a3c691ea-b51e-4524-84af-3cbb50dd9a0c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:29:57 np0005588920 nova_compute[226886]: 2026-01-20 14:29:57.346 226890 DEBUG oslo_concurrency.lockutils [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Acquiring lock "ad1be106-796f-45ef-8eb7-afa4c072b371-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:29:57 np0005588920 nova_compute[226886]: 2026-01-20 14:29:57.346 226890 DEBUG oslo_concurrency.lockutils [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Lock "ad1be106-796f-45ef-8eb7-afa4c072b371-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:29:57 np0005588920 nova_compute[226886]: 2026-01-20 14:29:57.346 226890 DEBUG oslo_concurrency.lockutils [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Lock "ad1be106-796f-45ef-8eb7-afa4c072b371-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:29:57 np0005588920 nova_compute[226886]: 2026-01-20 14:29:57.347 226890 DEBUG nova.virt.libvirt.vif [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:29:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-1204930815',display_name='tempest-FloatingIPsAssociationTestJSON-server-1204930815',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-1204930815',id=28,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='78f151250c04467bb4f6a229dda16fc5',ramdisk_id='',reservation_id='r-rz2ag1j8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationTestJSON-146254261',owner_user_name='tempest-FloatingIPsAssociationTestJSON-146254261-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:29:51Z,user_data=None,user_id='0cec872a00f742d78563d6d16fc545cb',uuid=ad1be106-796f-45ef-8eb7-afa4c072b371,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a3c691ea-b51e-4524-84af-3cbb50dd9a0c", "address": "fa:16:3e:03:26:e5", "network": {"id": "01e6deef-9aca-4d36-8215-4517982a86a3", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-70171378-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f151250c04467bb4f6a229dda16fc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3c691ea-b5", "ovs_interfaceid": "a3c691ea-b51e-4524-84af-3cbb50dd9a0c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:29:57 np0005588920 nova_compute[226886]: 2026-01-20 14:29:57.347 226890 DEBUG nova.network.os_vif_util [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Converting VIF {"id": "a3c691ea-b51e-4524-84af-3cbb50dd9a0c", "address": "fa:16:3e:03:26:e5", "network": {"id": "01e6deef-9aca-4d36-8215-4517982a86a3", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-70171378-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f151250c04467bb4f6a229dda16fc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3c691ea-b5", "ovs_interfaceid": "a3c691ea-b51e-4524-84af-3cbb50dd9a0c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:29:57 np0005588920 nova_compute[226886]: 2026-01-20 14:29:57.347 226890 DEBUG nova.network.os_vif_util [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:03:26:e5,bridge_name='br-int',has_traffic_filtering=True,id=a3c691ea-b51e-4524-84af-3cbb50dd9a0c,network=Network(01e6deef-9aca-4d36-8215-4517982a86a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3c691ea-b5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:29:57 np0005588920 nova_compute[226886]: 2026-01-20 14:29:57.348 226890 DEBUG os_vif [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:26:e5,bridge_name='br-int',has_traffic_filtering=True,id=a3c691ea-b51e-4524-84af-3cbb50dd9a0c,network=Network(01e6deef-9aca-4d36-8215-4517982a86a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3c691ea-b5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:29:57 np0005588920 nova_compute[226886]: 2026-01-20 14:29:57.348 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:57 np0005588920 nova_compute[226886]: 2026-01-20 14:29:57.349 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:29:57 np0005588920 nova_compute[226886]: 2026-01-20 14:29:57.349 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:29:57 np0005588920 nova_compute[226886]: 2026-01-20 14:29:57.353 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:57 np0005588920 nova_compute[226886]: 2026-01-20 14:29:57.353 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa3c691ea-b5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:29:57 np0005588920 nova_compute[226886]: 2026-01-20 14:29:57.353 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa3c691ea-b5, col_values=(('external_ids', {'iface-id': 'a3c691ea-b51e-4524-84af-3cbb50dd9a0c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:03:26:e5', 'vm-uuid': 'ad1be106-796f-45ef-8eb7-afa4c072b371'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:29:57 np0005588920 nova_compute[226886]: 2026-01-20 14:29:57.354 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:57 np0005588920 NetworkManager[49076]: <info>  [1768919397.3555] manager: (tapa3c691ea-b5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/49)
Jan 20 09:29:57 np0005588920 nova_compute[226886]: 2026-01-20 14:29:57.357 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:29:57 np0005588920 nova_compute[226886]: 2026-01-20 14:29:57.362 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:57 np0005588920 nova_compute[226886]: 2026-01-20 14:29:57.363 226890 INFO os_vif [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:26:e5,bridge_name='br-int',has_traffic_filtering=True,id=a3c691ea-b51e-4524-84af-3cbb50dd9a0c,network=Network(01e6deef-9aca-4d36-8215-4517982a86a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3c691ea-b5')#033[00m
Jan 20 09:29:57 np0005588920 nova_compute[226886]: 2026-01-20 14:29:57.419 226890 DEBUG nova.virt.libvirt.driver [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:29:57 np0005588920 nova_compute[226886]: 2026-01-20 14:29:57.420 226890 DEBUG nova.virt.libvirt.driver [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:29:57 np0005588920 nova_compute[226886]: 2026-01-20 14:29:57.420 226890 DEBUG nova.virt.libvirt.driver [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] No VIF found with MAC fa:16:3e:03:26:e5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:29:57 np0005588920 nova_compute[226886]: 2026-01-20 14:29:57.421 226890 INFO nova.virt.libvirt.driver [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Using config drive#033[00m
Jan 20 09:29:57 np0005588920 nova_compute[226886]: 2026-01-20 14:29:57.443 226890 DEBUG nova.storage.rbd_utils [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] rbd image ad1be106-796f-45ef-8eb7-afa4c072b371_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:29:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:58.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:29:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:58.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:58 np0005588920 nova_compute[226886]: 2026-01-20 14:29:58.981 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:59 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e161 e161: 3 total, 3 up, 3 in
Jan 20 09:29:59 np0005588920 nova_compute[226886]: 2026-01-20 14:29:59.463 226890 INFO nova.virt.libvirt.driver [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Creating config drive at /var/lib/nova/instances/ad1be106-796f-45ef-8eb7-afa4c072b371/disk.config#033[00m
Jan 20 09:29:59 np0005588920 nova_compute[226886]: 2026-01-20 14:29:59.467 226890 DEBUG oslo_concurrency.processutils [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ad1be106-796f-45ef-8eb7-afa4c072b371/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptsp1krro execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:29:59 np0005588920 nova_compute[226886]: 2026-01-20 14:29:59.609 226890 DEBUG oslo_concurrency.processutils [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ad1be106-796f-45ef-8eb7-afa4c072b371/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptsp1krro" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:29:59 np0005588920 nova_compute[226886]: 2026-01-20 14:29:59.639 226890 DEBUG nova.storage.rbd_utils [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] rbd image ad1be106-796f-45ef-8eb7-afa4c072b371_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:29:59 np0005588920 nova_compute[226886]: 2026-01-20 14:29:59.642 226890 DEBUG oslo_concurrency.processutils [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/ad1be106-796f-45ef-8eb7-afa4c072b371/disk.config ad1be106-796f-45ef-8eb7-afa4c072b371_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:29:59 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e161 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:29:59 np0005588920 nova_compute[226886]: 2026-01-20 14:29:59.824 226890 DEBUG oslo_concurrency.processutils [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/ad1be106-796f-45ef-8eb7-afa4c072b371/disk.config ad1be106-796f-45ef-8eb7-afa4c072b371_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.182s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:29:59 np0005588920 nova_compute[226886]: 2026-01-20 14:29:59.826 226890 INFO nova.virt.libvirt.driver [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Deleting local config drive /var/lib/nova/instances/ad1be106-796f-45ef-8eb7-afa4c072b371/disk.config because it was imported into RBD.#033[00m
Jan 20 09:29:59 np0005588920 kernel: tapa3c691ea-b5: entered promiscuous mode
Jan 20 09:29:59 np0005588920 NetworkManager[49076]: <info>  [1768919399.8766] manager: (tapa3c691ea-b5): new Tun device (/org/freedesktop/NetworkManager/Devices/50)
Jan 20 09:29:59 np0005588920 nova_compute[226886]: 2026-01-20 14:29:59.879 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:59 np0005588920 ovn_controller[133971]: 2026-01-20T14:29:59Z|00083|binding|INFO|Claiming lport a3c691ea-b51e-4524-84af-3cbb50dd9a0c for this chassis.
Jan 20 09:29:59 np0005588920 ovn_controller[133971]: 2026-01-20T14:29:59Z|00084|binding|INFO|a3c691ea-b51e-4524-84af-3cbb50dd9a0c: Claiming fa:16:3e:03:26:e5 10.100.0.13
Jan 20 09:29:59 np0005588920 nova_compute[226886]: 2026-01-20 14:29:59.884 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:59 np0005588920 nova_compute[226886]: 2026-01-20 14:29:59.890 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:29:59.904 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:03:26:e5 10.100.0.13'], port_security=['fa:16:3e:03:26:e5 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'ad1be106-796f-45ef-8eb7-afa4c072b371', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-01e6deef-9aca-4d36-8215-4517982a86a3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '78f151250c04467bb4f6a229dda16fc5', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'dd36e8d2-993a-4618-8fff-62abafaadfd3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=86700b79-bb44-47f0-88a5-d4c8eda3acbb, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=a3c691ea-b51e-4524-84af-3cbb50dd9a0c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:29:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:29:59.906 144128 INFO neutron.agent.ovn.metadata.agent [-] Port a3c691ea-b51e-4524-84af-3cbb50dd9a0c in datapath 01e6deef-9aca-4d36-8215-4517982a86a3 bound to our chassis#033[00m
Jan 20 09:29:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:29:59.908 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 01e6deef-9aca-4d36-8215-4517982a86a3#033[00m
Jan 20 09:29:59 np0005588920 systemd-machined[196121]: New machine qemu-12-instance-0000001c.
Jan 20 09:29:59 np0005588920 systemd-udevd[238033]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:29:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:29:59.921 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[56987c6d-f86d-46ff-902c-cbfffe346ff4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:29:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:29:59.922 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap01e6deef-91 in ovnmeta-01e6deef-9aca-4d36-8215-4517982a86a3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:29:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:29:59.924 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap01e6deef-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:29:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:29:59.924 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5b2ed999-6c82-4f35-8428-f74621ab9c34]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:29:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:29:59.925 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[6f194c6a-4e80-49e8-b17f-b2778d03a40f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:29:59 np0005588920 NetworkManager[49076]: <info>  [1768919399.9379] device (tapa3c691ea-b5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:29:59 np0005588920 NetworkManager[49076]: <info>  [1768919399.9385] device (tapa3c691ea-b5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:29:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:29:59.940 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[a93977bc-e4c7-4f36-837f-bc74d408b6bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:29:59 np0005588920 systemd[1]: Started Virtual Machine qemu-12-instance-0000001c.
Jan 20 09:29:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:29:59.959 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e1b19c14-0e86-4f8a-97c7-5315e2432b64]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:29:59 np0005588920 nova_compute[226886]: 2026-01-20 14:29:59.985 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:59 np0005588920 ovn_controller[133971]: 2026-01-20T14:29:59Z|00085|binding|INFO|Setting lport a3c691ea-b51e-4524-84af-3cbb50dd9a0c ovn-installed in OVS
Jan 20 09:29:59 np0005588920 ovn_controller[133971]: 2026-01-20T14:29:59Z|00086|binding|INFO|Setting lport a3c691ea-b51e-4524-84af-3cbb50dd9a0c up in Southbound
Jan 20 09:29:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:29:59.989 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[130128cd-539f-447f-882d-3d6867b46728]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:29:59 np0005588920 nova_compute[226886]: 2026-01-20 14:29:59.991 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:59 np0005588920 NetworkManager[49076]: <info>  [1768919399.9971] manager: (tap01e6deef-90): new Veth device (/org/freedesktop/NetworkManager/Devices/51)
Jan 20 09:29:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:29:59.996 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b1803f30-db8c-409f-af8b-4c21573d34c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:29:59 np0005588920 systemd-udevd[238036]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:30:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:30:00.025 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[2b3dd408-0d34-4af8-902b-059c20ad77a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:30:00.027 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[f6b2484e-e517-46e7-975a-7046771261d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:00 np0005588920 NetworkManager[49076]: <info>  [1768919400.0479] device (tap01e6deef-90): carrier: link connected
Jan 20 09:30:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:30:00.052 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[f8f43ae5-72ed-4eb0-9077-4463b0b53821]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:30:00.071 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a0945b33-46d3-481b-922f-82ee6a0f0e16]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap01e6deef-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:47:81:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 444173, 'reachable_time': 36767, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238065, 'error': None, 'target': 'ovnmeta-01e6deef-9aca-4d36-8215-4517982a86a3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:30:00.085 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[9085b43a-bb72-46ad-b924-3c82d319e5eb]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe47:818c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 444173, 'tstamp': 444173}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238066, 'error': None, 'target': 'ovnmeta-01e6deef-9aca-4d36-8215-4517982a86a3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:00 np0005588920 ceph-mon[77148]: overall HEALTH_OK
Jan 20 09:30:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:30:00.103 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[bf441d71-72d6-49dc-9802-f1b51bf42625]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap01e6deef-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:47:81:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 444173, 'reachable_time': 36767, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 238067, 'error': None, 'target': 'ovnmeta-01e6deef-9aca-4d36-8215-4517982a86a3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:30:00.137 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[90acc677-c8f4-42d1-8fa0-cf48e5640d21]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:30:00.197 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2d10b8d9-3f67-443a-8c84-19a934b0c1c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:30:00.199 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap01e6deef-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:30:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:30:00.199 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:30:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:30:00.200 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap01e6deef-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:30:00 np0005588920 nova_compute[226886]: 2026-01-20 14:30:00.201 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:00 np0005588920 kernel: tap01e6deef-90: entered promiscuous mode
Jan 20 09:30:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:30:00.204 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap01e6deef-90, col_values=(('external_ids', {'iface-id': 'b3bfa880-f76c-4bab-98ca-24729b0d77e7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:30:00 np0005588920 NetworkManager[49076]: <info>  [1768919400.2050] manager: (tap01e6deef-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/52)
Jan 20 09:30:00 np0005588920 nova_compute[226886]: 2026-01-20 14:30:00.206 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:00 np0005588920 ovn_controller[133971]: 2026-01-20T14:30:00Z|00087|binding|INFO|Releasing lport b3bfa880-f76c-4bab-98ca-24729b0d77e7 from this chassis (sb_readonly=0)
Jan 20 09:30:00 np0005588920 nova_compute[226886]: 2026-01-20 14:30:00.219 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:30:00.219 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/01e6deef-9aca-4d36-8215-4517982a86a3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/01e6deef-9aca-4d36-8215-4517982a86a3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:30:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:30:00.220 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[34213e5a-d8a4-44b9-8422-f9ffb049c35f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:30:00.221 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:30:00 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 09:30:00 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 09:30:00 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-01e6deef-9aca-4d36-8215-4517982a86a3
Jan 20 09:30:00 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 09:30:00 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 09:30:00 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 09:30:00 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/01e6deef-9aca-4d36-8215-4517982a86a3.pid.haproxy
Jan 20 09:30:00 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 09:30:00 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:30:00 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 09:30:00 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 09:30:00 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 09:30:00 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 09:30:00 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 09:30:00 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 09:30:00 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 09:30:00 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 09:30:00 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 09:30:00 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 09:30:00 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 09:30:00 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 09:30:00 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 09:30:00 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:30:00 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:30:00 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 09:30:00 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 09:30:00 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:30:00 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID 01e6deef-9aca-4d36-8215-4517982a86a3
Jan 20 09:30:00 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:30:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:30:00.221 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-01e6deef-9aca-4d36-8215-4517982a86a3', 'env', 'PROCESS_TAG=haproxy-01e6deef-9aca-4d36-8215-4517982a86a3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/01e6deef-9aca-4d36-8215-4517982a86a3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:30:00 np0005588920 nova_compute[226886]: 2026-01-20 14:30:00.349 226890 DEBUG nova.compute.manager [req-7b68033a-efae-496d-a6a9-f43eb948119a req-5484c84f-3813-4d45-b4fb-d9bff9c124ec 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Received event network-vif-plugged-a3c691ea-b51e-4524-84af-3cbb50dd9a0c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:30:00 np0005588920 nova_compute[226886]: 2026-01-20 14:30:00.350 226890 DEBUG oslo_concurrency.lockutils [req-7b68033a-efae-496d-a6a9-f43eb948119a req-5484c84f-3813-4d45-b4fb-d9bff9c124ec 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "ad1be106-796f-45ef-8eb7-afa4c072b371-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:30:00 np0005588920 nova_compute[226886]: 2026-01-20 14:30:00.350 226890 DEBUG oslo_concurrency.lockutils [req-7b68033a-efae-496d-a6a9-f43eb948119a req-5484c84f-3813-4d45-b4fb-d9bff9c124ec 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "ad1be106-796f-45ef-8eb7-afa4c072b371-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:30:00 np0005588920 nova_compute[226886]: 2026-01-20 14:30:00.350 226890 DEBUG oslo_concurrency.lockutils [req-7b68033a-efae-496d-a6a9-f43eb948119a req-5484c84f-3813-4d45-b4fb-d9bff9c124ec 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "ad1be106-796f-45ef-8eb7-afa4c072b371-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:30:00 np0005588920 nova_compute[226886]: 2026-01-20 14:30:00.350 226890 DEBUG nova.compute.manager [req-7b68033a-efae-496d-a6a9-f43eb948119a req-5484c84f-3813-4d45-b4fb-d9bff9c124ec 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Processing event network-vif-plugged-a3c691ea-b51e-4524-84af-3cbb50dd9a0c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:30:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:00.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:00.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:00 np0005588920 nova_compute[226886]: 2026-01-20 14:30:00.560 226890 DEBUG nova.network.neutron [req-7e76d64e-5eb7-4d83-b207-da83825e06a9 req-a9a877f8-997a-42fb-b47c-840450facbab 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Updated VIF entry in instance network info cache for port a3c691ea-b51e-4524-84af-3cbb50dd9a0c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:30:00 np0005588920 nova_compute[226886]: 2026-01-20 14:30:00.561 226890 DEBUG nova.network.neutron [req-7e76d64e-5eb7-4d83-b207-da83825e06a9 req-a9a877f8-997a-42fb-b47c-840450facbab 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Updating instance_info_cache with network_info: [{"id": "a3c691ea-b51e-4524-84af-3cbb50dd9a0c", "address": "fa:16:3e:03:26:e5", "network": {"id": "01e6deef-9aca-4d36-8215-4517982a86a3", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-70171378-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f151250c04467bb4f6a229dda16fc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3c691ea-b5", "ovs_interfaceid": "a3c691ea-b51e-4524-84af-3cbb50dd9a0c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:30:00 np0005588920 podman[238099]: 2026-01-20 14:30:00.593261221 +0000 UTC m=+0.078678499 container create cbde9a9c5849b2491e8b553fc8de203b1d7b307740c8313efc170344a4dc92cd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-01e6deef-9aca-4d36-8215-4517982a86a3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 20 09:30:00 np0005588920 systemd[1]: Started libpod-conmon-cbde9a9c5849b2491e8b553fc8de203b1d7b307740c8313efc170344a4dc92cd.scope.
Jan 20 09:30:00 np0005588920 podman[238099]: 2026-01-20 14:30:00.545385882 +0000 UTC m=+0.030803180 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:30:00 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:30:00 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9fe54e6f5c163d14a3f7d4740b1e71daaabb5535ace054c695be911191d57a0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:30:00 np0005588920 podman[238099]: 2026-01-20 14:30:00.681466047 +0000 UTC m=+0.166883355 container init cbde9a9c5849b2491e8b553fc8de203b1d7b307740c8313efc170344a4dc92cd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-01e6deef-9aca-4d36-8215-4517982a86a3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 20 09:30:00 np0005588920 podman[238099]: 2026-01-20 14:30:00.688922614 +0000 UTC m=+0.174339912 container start cbde9a9c5849b2491e8b553fc8de203b1d7b307740c8313efc170344a4dc92cd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-01e6deef-9aca-4d36-8215-4517982a86a3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:30:00 np0005588920 nova_compute[226886]: 2026-01-20 14:30:00.712 226890 DEBUG oslo_concurrency.lockutils [req-7e76d64e-5eb7-4d83-b207-da83825e06a9 req-a9a877f8-997a-42fb-b47c-840450facbab 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-ad1be106-796f-45ef-8eb7-afa4c072b371" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:30:00 np0005588920 neutron-haproxy-ovnmeta-01e6deef-9aca-4d36-8215-4517982a86a3[238114]: [NOTICE]   (238118) : New worker (238120) forked
Jan 20 09:30:00 np0005588920 neutron-haproxy-ovnmeta-01e6deef-9aca-4d36-8215-4517982a86a3[238114]: [NOTICE]   (238118) : Loading success.
Jan 20 09:30:01 np0005588920 nova_compute[226886]: 2026-01-20 14:30:01.454 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919401.454473, ad1be106-796f-45ef-8eb7-afa4c072b371 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:30:01 np0005588920 nova_compute[226886]: 2026-01-20 14:30:01.455 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] VM Started (Lifecycle Event)#033[00m
Jan 20 09:30:01 np0005588920 nova_compute[226886]: 2026-01-20 14:30:01.457 226890 DEBUG nova.compute.manager [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:30:01 np0005588920 nova_compute[226886]: 2026-01-20 14:30:01.459 226890 DEBUG nova.virt.libvirt.driver [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:30:01 np0005588920 nova_compute[226886]: 2026-01-20 14:30:01.462 226890 INFO nova.virt.libvirt.driver [-] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Instance spawned successfully.#033[00m
Jan 20 09:30:01 np0005588920 nova_compute[226886]: 2026-01-20 14:30:01.462 226890 DEBUG nova.virt.libvirt.driver [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:30:01 np0005588920 nova_compute[226886]: 2026-01-20 14:30:01.498 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:30:01 np0005588920 nova_compute[226886]: 2026-01-20 14:30:01.500 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:30:01 np0005588920 nova_compute[226886]: 2026-01-20 14:30:01.515 226890 DEBUG nova.virt.libvirt.driver [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:30:01 np0005588920 nova_compute[226886]: 2026-01-20 14:30:01.516 226890 DEBUG nova.virt.libvirt.driver [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:30:01 np0005588920 nova_compute[226886]: 2026-01-20 14:30:01.516 226890 DEBUG nova.virt.libvirt.driver [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:30:01 np0005588920 nova_compute[226886]: 2026-01-20 14:30:01.516 226890 DEBUG nova.virt.libvirt.driver [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:30:01 np0005588920 nova_compute[226886]: 2026-01-20 14:30:01.517 226890 DEBUG nova.virt.libvirt.driver [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:30:01 np0005588920 nova_compute[226886]: 2026-01-20 14:30:01.517 226890 DEBUG nova.virt.libvirt.driver [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:30:01 np0005588920 nova_compute[226886]: 2026-01-20 14:30:01.575 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:30:01 np0005588920 nova_compute[226886]: 2026-01-20 14:30:01.575 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919401.454643, ad1be106-796f-45ef-8eb7-afa4c072b371 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:30:01 np0005588920 nova_compute[226886]: 2026-01-20 14:30:01.575 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:30:01 np0005588920 nova_compute[226886]: 2026-01-20 14:30:01.616 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:30:01 np0005588920 nova_compute[226886]: 2026-01-20 14:30:01.618 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919401.4590821, ad1be106-796f-45ef-8eb7-afa4c072b371 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:30:01 np0005588920 nova_compute[226886]: 2026-01-20 14:30:01.618 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:30:01 np0005588920 nova_compute[226886]: 2026-01-20 14:30:01.635 226890 INFO nova.compute.manager [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Took 10.23 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:30:01 np0005588920 nova_compute[226886]: 2026-01-20 14:30:01.636 226890 DEBUG nova.compute.manager [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:30:01 np0005588920 nova_compute[226886]: 2026-01-20 14:30:01.641 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:30:01 np0005588920 nova_compute[226886]: 2026-01-20 14:30:01.644 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:30:02 np0005588920 nova_compute[226886]: 2026-01-20 14:30:02.024 226890 INFO nova.compute.manager [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Took 11.87 seconds to build instance.#033[00m
Jan 20 09:30:02 np0005588920 nova_compute[226886]: 2026-01-20 14:30:02.059 226890 DEBUG oslo_concurrency.lockutils [None req-de0ec408-9db6-4f42-86b6-2dfebab0a4be 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Lock "ad1be106-796f-45ef-8eb7-afa4c072b371" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.979s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:30:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:30:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:02.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:30:02 np0005588920 nova_compute[226886]: 2026-01-20 14:30:02.395 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:02 np0005588920 nova_compute[226886]: 2026-01-20 14:30:02.498 226890 DEBUG nova.compute.manager [req-fd2bfcf3-3ee9-4e4f-bd29-16e2fde3e9dc req-9dbb8c62-0c2d-4fc1-ace5-9de50946a1f6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Received event network-vif-plugged-a3c691ea-b51e-4524-84af-3cbb50dd9a0c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:30:02 np0005588920 nova_compute[226886]: 2026-01-20 14:30:02.498 226890 DEBUG oslo_concurrency.lockutils [req-fd2bfcf3-3ee9-4e4f-bd29-16e2fde3e9dc req-9dbb8c62-0c2d-4fc1-ace5-9de50946a1f6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "ad1be106-796f-45ef-8eb7-afa4c072b371-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:30:02 np0005588920 nova_compute[226886]: 2026-01-20 14:30:02.499 226890 DEBUG oslo_concurrency.lockutils [req-fd2bfcf3-3ee9-4e4f-bd29-16e2fde3e9dc req-9dbb8c62-0c2d-4fc1-ace5-9de50946a1f6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "ad1be106-796f-45ef-8eb7-afa4c072b371-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:30:02 np0005588920 nova_compute[226886]: 2026-01-20 14:30:02.499 226890 DEBUG oslo_concurrency.lockutils [req-fd2bfcf3-3ee9-4e4f-bd29-16e2fde3e9dc req-9dbb8c62-0c2d-4fc1-ace5-9de50946a1f6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "ad1be106-796f-45ef-8eb7-afa4c072b371-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:30:02 np0005588920 nova_compute[226886]: 2026-01-20 14:30:02.499 226890 DEBUG nova.compute.manager [req-fd2bfcf3-3ee9-4e4f-bd29-16e2fde3e9dc req-9dbb8c62-0c2d-4fc1-ace5-9de50946a1f6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] No waiting events found dispatching network-vif-plugged-a3c691ea-b51e-4524-84af-3cbb50dd9a0c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:30:02 np0005588920 nova_compute[226886]: 2026-01-20 14:30:02.500 226890 WARNING nova.compute.manager [req-fd2bfcf3-3ee9-4e4f-bd29-16e2fde3e9dc req-9dbb8c62-0c2d-4fc1-ace5-9de50946a1f6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Received unexpected event network-vif-plugged-a3c691ea-b51e-4524-84af-3cbb50dd9a0c for instance with vm_state active and task_state None.#033[00m
Jan 20 09:30:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:02.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:03 np0005588920 nova_compute[226886]: 2026-01-20 14:30:03.983 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:04.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:04.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:04 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e161 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:30:06 np0005588920 podman[238171]: 2026-01-20 14:30:06.087293902 +0000 UTC m=+0.154832393 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible)
Jan 20 09:30:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:06.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:06 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e162 e162: 3 total, 3 up, 3 in
Jan 20 09:30:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:06.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:07 np0005588920 nova_compute[226886]: 2026-01-20 14:30:07.433 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:08.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:08.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:08 np0005588920 nova_compute[226886]: 2026-01-20 14:30:08.984 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e162 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:30:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:10.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:10 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 20 09:30:10 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:30:10 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:30:10 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:30:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:10.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:12.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:12 np0005588920 nova_compute[226886]: 2026-01-20 14:30:12.437 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:12 np0005588920 nova_compute[226886]: 2026-01-20 14:30:12.534 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:30:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:30:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:12.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:30:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 09:30:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/478407422' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 09:30:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 09:30:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/478407422' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 09:30:13 np0005588920 nova_compute[226886]: 2026-01-20 14:30:13.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:30:13 np0005588920 nova_compute[226886]: 2026-01-20 14:30:13.986 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:30:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:14.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:30:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:14.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e162 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:30:14 np0005588920 nova_compute[226886]: 2026-01-20 14:30:14.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:30:14 np0005588920 nova_compute[226886]: 2026-01-20 14:30:14.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:30:14 np0005588920 nova_compute[226886]: 2026-01-20 14:30:14.726 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 09:30:14 np0005588920 nova_compute[226886]: 2026-01-20 14:30:14.944 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "refresh_cache-87fe16d6-774e-4002-8df4-9eb202621ab9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:30:14 np0005588920 nova_compute[226886]: 2026-01-20 14:30:14.944 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquired lock "refresh_cache-87fe16d6-774e-4002-8df4-9eb202621ab9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:30:14 np0005588920 nova_compute[226886]: 2026-01-20 14:30:14.944 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 09:30:14 np0005588920 nova_compute[226886]: 2026-01-20 14:30:14.944 226890 DEBUG nova.objects.instance [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 87fe16d6-774e-4002-8df4-9eb202621ab9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:30:15 np0005588920 nova_compute[226886]: 2026-01-20 14:30:15.298 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:30:15 np0005588920 ovn_controller[133971]: 2026-01-20T14:30:15Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:03:26:e5 10.100.0.13
Jan 20 09:30:15 np0005588920 ovn_controller[133971]: 2026-01-20T14:30:15Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:03:26:e5 10.100.0.13
Jan 20 09:30:15 np0005588920 nova_compute[226886]: 2026-01-20 14:30:15.959 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:30:16 np0005588920 nova_compute[226886]: 2026-01-20 14:30:16.127 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Releasing lock "refresh_cache-87fe16d6-774e-4002-8df4-9eb202621ab9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:30:16 np0005588920 nova_compute[226886]: 2026-01-20 14:30:16.127 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 09:30:16 np0005588920 nova_compute[226886]: 2026-01-20 14:30:16.128 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:30:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:16.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:30:16.431 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:30:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:30:16.432 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:30:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:30:16.432 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:30:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:16.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:16 np0005588920 nova_compute[226886]: 2026-01-20 14:30:16.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:30:17 np0005588920 nova_compute[226886]: 2026-01-20 14:30:17.439 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:17 np0005588920 nova_compute[226886]: 2026-01-20 14:30:17.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:30:17 np0005588920 nova_compute[226886]: 2026-01-20 14:30:17.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:30:18 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:30:18 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/61502453' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:30:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:18.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:18.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:18 np0005588920 nova_compute[226886]: 2026-01-20 14:30:18.721 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:30:18 np0005588920 podman[238334]: 2026-01-20 14:30:18.984263045 +0000 UTC m=+0.061316302 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 20 09:30:19 np0005588920 nova_compute[226886]: 2026-01-20 14:30:19.019 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:19 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e162 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:30:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:20.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:30:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:20.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:30:20 np0005588920 nova_compute[226886]: 2026-01-20 14:30:20.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:30:20 np0005588920 nova_compute[226886]: 2026-01-20 14:30:20.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:30:21 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:30:22 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:30:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:22.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:22 np0005588920 nova_compute[226886]: 2026-01-20 14:30:22.442 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:22.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:23 np0005588920 nova_compute[226886]: 2026-01-20 14:30:23.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:30:23 np0005588920 nova_compute[226886]: 2026-01-20 14:30:23.748 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:30:23 np0005588920 nova_compute[226886]: 2026-01-20 14:30:23.748 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:30:23 np0005588920 nova_compute[226886]: 2026-01-20 14:30:23.748 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:30:23 np0005588920 nova_compute[226886]: 2026-01-20 14:30:23.748 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:30:23 np0005588920 nova_compute[226886]: 2026-01-20 14:30:23.749 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:30:24 np0005588920 nova_compute[226886]: 2026-01-20 14:30:24.021 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:24 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:30:24 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/256037197' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:30:24 np0005588920 nova_compute[226886]: 2026-01-20 14:30:24.263 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.514s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:30:24 np0005588920 nova_compute[226886]: 2026-01-20 14:30:24.355 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-00000018 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:30:24 np0005588920 nova_compute[226886]: 2026-01-20 14:30:24.356 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-00000018 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:30:24 np0005588920 nova_compute[226886]: 2026-01-20 14:30:24.359 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-0000001c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:30:24 np0005588920 nova_compute[226886]: 2026-01-20 14:30:24.360 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-0000001c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:30:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:24.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:24 np0005588920 nova_compute[226886]: 2026-01-20 14:30:24.499 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:30:24 np0005588920 nova_compute[226886]: 2026-01-20 14:30:24.500 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4426MB free_disk=20.698200225830078GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:30:24 np0005588920 nova_compute[226886]: 2026-01-20 14:30:24.501 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:30:24 np0005588920 nova_compute[226886]: 2026-01-20 14:30:24.501 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:30:24 np0005588920 nova_compute[226886]: 2026-01-20 14:30:24.569 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance 87fe16d6-774e-4002-8df4-9eb202621ab9 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 192, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:30:24 np0005588920 nova_compute[226886]: 2026-01-20 14:30:24.569 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance ad1be106-796f-45ef-8eb7-afa4c072b371 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:30:24 np0005588920 nova_compute[226886]: 2026-01-20 14:30:24.569 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:30:24 np0005588920 nova_compute[226886]: 2026-01-20 14:30:24.569 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=832MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:30:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:24.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:24 np0005588920 nova_compute[226886]: 2026-01-20 14:30:24.621 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:30:24 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e162 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:30:25 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:30:25 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1585323425' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:30:25 np0005588920 nova_compute[226886]: 2026-01-20 14:30:25.025 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.404s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:30:25 np0005588920 nova_compute[226886]: 2026-01-20 14:30:25.030 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:30:25 np0005588920 nova_compute[226886]: 2026-01-20 14:30:25.053 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:30:25 np0005588920 nova_compute[226886]: 2026-01-20 14:30:25.079 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:30:25 np0005588920 nova_compute[226886]: 2026-01-20 14:30:25.080 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.579s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:30:25 np0005588920 nova_compute[226886]: 2026-01-20 14:30:25.640 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:25 np0005588920 NetworkManager[49076]: <info>  [1768919425.6411] manager: (patch-br-int-to-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/53)
Jan 20 09:30:25 np0005588920 NetworkManager[49076]: <info>  [1768919425.6423] manager: (patch-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/54)
Jan 20 09:30:25 np0005588920 nova_compute[226886]: 2026-01-20 14:30:25.801 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:25 np0005588920 ovn_controller[133971]: 2026-01-20T14:30:25Z|00088|binding|INFO|Releasing lport b3bfa880-f76c-4bab-98ca-24729b0d77e7 from this chassis (sb_readonly=0)
Jan 20 09:30:25 np0005588920 nova_compute[226886]: 2026-01-20 14:30:25.815 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:26 np0005588920 nova_compute[226886]: 2026-01-20 14:30:26.219 226890 DEBUG oslo_concurrency.lockutils [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Acquiring lock "c59206c3-51bd-4f98-a12c-d24e73739926" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:30:26 np0005588920 nova_compute[226886]: 2026-01-20 14:30:26.220 226890 DEBUG oslo_concurrency.lockutils [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Lock "c59206c3-51bd-4f98-a12c-d24e73739926" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:30:26 np0005588920 nova_compute[226886]: 2026-01-20 14:30:26.239 226890 DEBUG nova.compute.manager [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:30:26 np0005588920 nova_compute[226886]: 2026-01-20 14:30:26.306 226890 DEBUG oslo_concurrency.lockutils [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:30:26 np0005588920 nova_compute[226886]: 2026-01-20 14:30:26.307 226890 DEBUG oslo_concurrency.lockutils [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:30:26 np0005588920 nova_compute[226886]: 2026-01-20 14:30:26.312 226890 DEBUG nova.virt.hardware [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:30:26 np0005588920 nova_compute[226886]: 2026-01-20 14:30:26.313 226890 INFO nova.compute.claims [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 09:30:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:26.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:26 np0005588920 nova_compute[226886]: 2026-01-20 14:30:26.464 226890 DEBUG oslo_concurrency.processutils [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:30:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:26.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:26 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:30:26 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2870659096' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:30:26 np0005588920 nova_compute[226886]: 2026-01-20 14:30:26.913 226890 DEBUG oslo_concurrency.processutils [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:30:26 np0005588920 nova_compute[226886]: 2026-01-20 14:30:26.923 226890 DEBUG nova.compute.provider_tree [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:30:26 np0005588920 nova_compute[226886]: 2026-01-20 14:30:26.941 226890 DEBUG nova.scheduler.client.report [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:30:26 np0005588920 nova_compute[226886]: 2026-01-20 14:30:26.966 226890 DEBUG oslo_concurrency.lockutils [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.659s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:30:26 np0005588920 nova_compute[226886]: 2026-01-20 14:30:26.967 226890 DEBUG nova.compute.manager [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:30:27 np0005588920 nova_compute[226886]: 2026-01-20 14:30:27.011 226890 DEBUG nova.compute.manager [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:30:27 np0005588920 nova_compute[226886]: 2026-01-20 14:30:27.011 226890 DEBUG nova.network.neutron [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:30:27 np0005588920 nova_compute[226886]: 2026-01-20 14:30:27.030 226890 INFO nova.virt.libvirt.driver [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:30:27 np0005588920 nova_compute[226886]: 2026-01-20 14:30:27.057 226890 DEBUG nova.compute.manager [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:30:27 np0005588920 nova_compute[226886]: 2026-01-20 14:30:27.280 226890 DEBUG nova.compute.manager [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:30:27 np0005588920 nova_compute[226886]: 2026-01-20 14:30:27.281 226890 DEBUG nova.virt.libvirt.driver [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:30:27 np0005588920 nova_compute[226886]: 2026-01-20 14:30:27.282 226890 INFO nova.virt.libvirt.driver [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] Creating image(s)#033[00m
Jan 20 09:30:27 np0005588920 nova_compute[226886]: 2026-01-20 14:30:27.307 226890 DEBUG nova.storage.rbd_utils [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] rbd image c59206c3-51bd-4f98-a12c-d24e73739926_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:30:27 np0005588920 nova_compute[226886]: 2026-01-20 14:30:27.334 226890 DEBUG nova.storage.rbd_utils [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] rbd image c59206c3-51bd-4f98-a12c-d24e73739926_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:30:27 np0005588920 nova_compute[226886]: 2026-01-20 14:30:27.356 226890 DEBUG nova.storage.rbd_utils [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] rbd image c59206c3-51bd-4f98-a12c-d24e73739926_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:30:27 np0005588920 nova_compute[226886]: 2026-01-20 14:30:27.362 226890 DEBUG oslo_concurrency.processutils [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:30:27 np0005588920 nova_compute[226886]: 2026-01-20 14:30:27.420 226890 DEBUG oslo_concurrency.processutils [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:30:27 np0005588920 nova_compute[226886]: 2026-01-20 14:30:27.422 226890 DEBUG oslo_concurrency.lockutils [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:30:27 np0005588920 nova_compute[226886]: 2026-01-20 14:30:27.423 226890 DEBUG oslo_concurrency.lockutils [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:30:27 np0005588920 nova_compute[226886]: 2026-01-20 14:30:27.423 226890 DEBUG oslo_concurrency.lockutils [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:30:27 np0005588920 nova_compute[226886]: 2026-01-20 14:30:27.462 226890 DEBUG nova.storage.rbd_utils [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] rbd image c59206c3-51bd-4f98-a12c-d24e73739926_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:30:27 np0005588920 nova_compute[226886]: 2026-01-20 14:30:27.493 226890 DEBUG oslo_concurrency.processutils [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 c59206c3-51bd-4f98-a12c-d24e73739926_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:30:27 np0005588920 nova_compute[226886]: 2026-01-20 14:30:27.521 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:27 np0005588920 nova_compute[226886]: 2026-01-20 14:30:27.641 226890 DEBUG nova.network.neutron [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Jan 20 09:30:27 np0005588920 nova_compute[226886]: 2026-01-20 14:30:27.642 226890 DEBUG nova.compute.manager [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:30:27 np0005588920 nova_compute[226886]: 2026-01-20 14:30:27.820 226890 DEBUG nova.compute.manager [req-5feb12f5-f148-4dd7-8e9e-4c1e1b23a535 req-cebe04e4-5266-4ccb-a562-4d3cd542d204 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Received event network-changed-a3c691ea-b51e-4524-84af-3cbb50dd9a0c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:30:27 np0005588920 nova_compute[226886]: 2026-01-20 14:30:27.821 226890 DEBUG nova.compute.manager [req-5feb12f5-f148-4dd7-8e9e-4c1e1b23a535 req-cebe04e4-5266-4ccb-a562-4d3cd542d204 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Refreshing instance network info cache due to event network-changed-a3c691ea-b51e-4524-84af-3cbb50dd9a0c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:30:27 np0005588920 nova_compute[226886]: 2026-01-20 14:30:27.822 226890 DEBUG oslo_concurrency.lockutils [req-5feb12f5-f148-4dd7-8e9e-4c1e1b23a535 req-cebe04e4-5266-4ccb-a562-4d3cd542d204 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-ad1be106-796f-45ef-8eb7-afa4c072b371" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:30:27 np0005588920 nova_compute[226886]: 2026-01-20 14:30:27.822 226890 DEBUG oslo_concurrency.lockutils [req-5feb12f5-f148-4dd7-8e9e-4c1e1b23a535 req-cebe04e4-5266-4ccb-a562-4d3cd542d204 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-ad1be106-796f-45ef-8eb7-afa4c072b371" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:30:27 np0005588920 nova_compute[226886]: 2026-01-20 14:30:27.823 226890 DEBUG nova.network.neutron [req-5feb12f5-f148-4dd7-8e9e-4c1e1b23a535 req-cebe04e4-5266-4ccb-a562-4d3cd542d204 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Refreshing network info cache for port a3c691ea-b51e-4524-84af-3cbb50dd9a0c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:30:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:28.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 09:30:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:28.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 09:30:28 np0005588920 nova_compute[226886]: 2026-01-20 14:30:28.975 226890 DEBUG oslo_concurrency.processutils [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 c59206c3-51bd-4f98-a12c-d24e73739926_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:30:29 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2.
Jan 20 09:30:29 np0005588920 nova_compute[226886]: 2026-01-20 14:30:29.056 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:29 np0005588920 nova_compute[226886]: 2026-01-20 14:30:29.062 226890 DEBUG nova.storage.rbd_utils [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] resizing rbd image c59206c3-51bd-4f98-a12c-d24e73739926_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:30:29 np0005588920 nova_compute[226886]: 2026-01-20 14:30:29.399 226890 DEBUG nova.compute.manager [req-ab317762-804a-439c-a194-c6ebb6ce5f88 req-530c2f17-8db7-4078-a0f2-563f123af08b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Received event network-changed-a3c691ea-b51e-4524-84af-3cbb50dd9a0c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:30:29 np0005588920 nova_compute[226886]: 2026-01-20 14:30:29.400 226890 DEBUG nova.compute.manager [req-ab317762-804a-439c-a194-c6ebb6ce5f88 req-530c2f17-8db7-4078-a0f2-563f123af08b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Refreshing instance network info cache due to event network-changed-a3c691ea-b51e-4524-84af-3cbb50dd9a0c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:30:29 np0005588920 nova_compute[226886]: 2026-01-20 14:30:29.400 226890 DEBUG oslo_concurrency.lockutils [req-ab317762-804a-439c-a194-c6ebb6ce5f88 req-530c2f17-8db7-4078-a0f2-563f123af08b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-ad1be106-796f-45ef-8eb7-afa4c072b371" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:30:29 np0005588920 nova_compute[226886]: 2026-01-20 14:30:29.464 226890 DEBUG nova.objects.instance [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Lazy-loading 'migration_context' on Instance uuid c59206c3-51bd-4f98-a12c-d24e73739926 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:30:29 np0005588920 nova_compute[226886]: 2026-01-20 14:30:29.479 226890 DEBUG nova.virt.libvirt.driver [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:30:29 np0005588920 nova_compute[226886]: 2026-01-20 14:30:29.479 226890 DEBUG nova.virt.libvirt.driver [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] Ensure instance console log exists: /var/lib/nova/instances/c59206c3-51bd-4f98-a12c-d24e73739926/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:30:29 np0005588920 nova_compute[226886]: 2026-01-20 14:30:29.479 226890 DEBUG oslo_concurrency.lockutils [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:30:29 np0005588920 nova_compute[226886]: 2026-01-20 14:30:29.480 226890 DEBUG oslo_concurrency.lockutils [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:30:29 np0005588920 nova_compute[226886]: 2026-01-20 14:30:29.480 226890 DEBUG oslo_concurrency.lockutils [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:30:29 np0005588920 nova_compute[226886]: 2026-01-20 14:30:29.481 226890 DEBUG nova.virt.libvirt.driver [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:30:29 np0005588920 nova_compute[226886]: 2026-01-20 14:30:29.485 226890 WARNING nova.virt.libvirt.driver [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:30:29 np0005588920 nova_compute[226886]: 2026-01-20 14:30:29.489 226890 DEBUG nova.virt.libvirt.host [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:30:29 np0005588920 nova_compute[226886]: 2026-01-20 14:30:29.490 226890 DEBUG nova.virt.libvirt.host [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:30:29 np0005588920 nova_compute[226886]: 2026-01-20 14:30:29.493 226890 DEBUG nova.virt.libvirt.host [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:30:29 np0005588920 nova_compute[226886]: 2026-01-20 14:30:29.494 226890 DEBUG nova.virt.libvirt.host [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:30:29 np0005588920 nova_compute[226886]: 2026-01-20 14:30:29.494 226890 DEBUG nova.virt.libvirt.driver [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:30:29 np0005588920 nova_compute[226886]: 2026-01-20 14:30:29.495 226890 DEBUG nova.virt.hardware [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 20 09:30:29 np0005588920 nova_compute[226886]: 2026-01-20 14:30:29.495 226890 DEBUG nova.virt.hardware [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 20 09:30:29 np0005588920 nova_compute[226886]: 2026-01-20 14:30:29.495 226890 DEBUG nova.virt.hardware [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 20 09:30:29 np0005588920 nova_compute[226886]: 2026-01-20 14:30:29.495 226890 DEBUG nova.virt.hardware [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 20 09:30:29 np0005588920 nova_compute[226886]: 2026-01-20 14:30:29.496 226890 DEBUG nova.virt.hardware [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 20 09:30:29 np0005588920 nova_compute[226886]: 2026-01-20 14:30:29.496 226890 DEBUG nova.virt.hardware [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 20 09:30:29 np0005588920 nova_compute[226886]: 2026-01-20 14:30:29.496 226890 DEBUG nova.virt.hardware [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 20 09:30:29 np0005588920 nova_compute[226886]: 2026-01-20 14:30:29.496 226890 DEBUG nova.virt.hardware [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 20 09:30:29 np0005588920 nova_compute[226886]: 2026-01-20 14:30:29.497 226890 DEBUG nova.virt.hardware [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 20 09:30:29 np0005588920 nova_compute[226886]: 2026-01-20 14:30:29.497 226890 DEBUG nova.virt.hardware [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 20 09:30:29 np0005588920 nova_compute[226886]: 2026-01-20 14:30:29.497 226890 DEBUG nova.virt.hardware [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 20 09:30:29 np0005588920 nova_compute[226886]: 2026-01-20 14:30:29.499 226890 DEBUG oslo_concurrency.processutils [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:30:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e162 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:30:29 np0005588920 nova_compute[226886]: 2026-01-20 14:30:29.760 226890 DEBUG nova.network.neutron [req-5feb12f5-f148-4dd7-8e9e-4c1e1b23a535 req-cebe04e4-5266-4ccb-a562-4d3cd542d204 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Updated VIF entry in instance network info cache for port a3c691ea-b51e-4524-84af-3cbb50dd9a0c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 20 09:30:29 np0005588920 nova_compute[226886]: 2026-01-20 14:30:29.761 226890 DEBUG nova.network.neutron [req-5feb12f5-f148-4dd7-8e9e-4c1e1b23a535 req-cebe04e4-5266-4ccb-a562-4d3cd542d204 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Updating instance_info_cache with network_info: [{"id": "a3c691ea-b51e-4524-84af-3cbb50dd9a0c", "address": "fa:16:3e:03:26:e5", "network": {"id": "01e6deef-9aca-4d36-8215-4517982a86a3", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-70171378-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.178", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f151250c04467bb4f6a229dda16fc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3c691ea-b5", "ovs_interfaceid": "a3c691ea-b51e-4524-84af-3cbb50dd9a0c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 20 09:30:29 np0005588920 nova_compute[226886]: 2026-01-20 14:30:29.786 226890 DEBUG oslo_concurrency.lockutils [req-5feb12f5-f148-4dd7-8e9e-4c1e1b23a535 req-cebe04e4-5266-4ccb-a562-4d3cd542d204 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-ad1be106-796f-45ef-8eb7-afa4c072b371" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 20 09:30:29 np0005588920 nova_compute[226886]: 2026-01-20 14:30:29.787 226890 DEBUG oslo_concurrency.lockutils [req-ab317762-804a-439c-a194-c6ebb6ce5f88 req-530c2f17-8db7-4078-a0f2-563f123af08b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-ad1be106-796f-45ef-8eb7-afa4c072b371" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 20 09:30:29 np0005588920 nova_compute[226886]: 2026-01-20 14:30:29.787 226890 DEBUG nova.network.neutron [req-ab317762-804a-439c-a194-c6ebb6ce5f88 req-530c2f17-8db7-4078-a0f2-563f123af08b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Refreshing network info cache for port a3c691ea-b51e-4524-84af-3cbb50dd9a0c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 20 09:30:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:30:29 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2358003891' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:30:29 np0005588920 nova_compute[226886]: 2026-01-20 14:30:29.917 226890 DEBUG oslo_concurrency.processutils [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.418s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:30:29 np0005588920 nova_compute[226886]: 2026-01-20 14:30:29.955 226890 DEBUG nova.storage.rbd_utils [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] rbd image c59206c3-51bd-4f98-a12c-d24e73739926_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:30:29 np0005588920 nova_compute[226886]: 2026-01-20 14:30:29.961 226890 DEBUG oslo_concurrency.processutils [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:30:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:30.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:30 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:30:30 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2341563966' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:30:30 np0005588920 nova_compute[226886]: 2026-01-20 14:30:30.461 226890 DEBUG oslo_concurrency.processutils [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.500s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:30:30 np0005588920 nova_compute[226886]: 2026-01-20 14:30:30.463 226890 DEBUG nova.objects.instance [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Lazy-loading 'pci_devices' on Instance uuid c59206c3-51bd-4f98-a12c-d24e73739926 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 09:30:30 np0005588920 nova_compute[226886]: 2026-01-20 14:30:30.505 226890 DEBUG nova.virt.libvirt.driver [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:30:30 np0005588920 nova_compute[226886]:  <uuid>c59206c3-51bd-4f98-a12c-d24e73739926</uuid>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:  <name>instance-00000020</name>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:30:30 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:      <nova:name>tempest-ServerDiagnosticsTest-server-784091485</nova:name>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:30:29</nova:creationTime>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:30:30 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:        <nova:user uuid="86df2c2ba7e34562aecf2f9566e46dc0">tempest-ServerDiagnosticsTest-287683621-project-member</nova:user>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:        <nova:project uuid="4ac25c78ebe1428c9034578c09eec31e">tempest-ServerDiagnosticsTest-287683621</nova:project>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:      <nova:ports/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:      <entry name="serial">c59206c3-51bd-4f98-a12c-d24e73739926</entry>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:      <entry name="uuid">c59206c3-51bd-4f98-a12c-d24e73739926</entry>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:30:30 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/c59206c3-51bd-4f98-a12c-d24e73739926_disk">
Jan 20 09:30:30 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:30:30 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:30:30 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/c59206c3-51bd-4f98-a12c-d24e73739926_disk.config">
Jan 20 09:30:30 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:30:30 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:30:30 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/c59206c3-51bd-4f98-a12c-d24e73739926/console.log" append="off"/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:30:30 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:30:30 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:30:30 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:30:30 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:30:30 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 20 09:30:30 np0005588920 nova_compute[226886]: 2026-01-20 14:30:30.558 226890 DEBUG nova.virt.libvirt.driver [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 20 09:30:30 np0005588920 nova_compute[226886]: 2026-01-20 14:30:30.558 226890 DEBUG nova.virt.libvirt.driver [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 20 09:30:30 np0005588920 nova_compute[226886]: 2026-01-20 14:30:30.558 226890 INFO nova.virt.libvirt.driver [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] Using config drive
Jan 20 09:30:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:30:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:30.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:30:30 np0005588920 nova_compute[226886]: 2026-01-20 14:30:30.588 226890 DEBUG nova.storage.rbd_utils [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] rbd image c59206c3-51bd-4f98-a12c-d24e73739926_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:30:30 np0005588920 nova_compute[226886]: 2026-01-20 14:30:30.782 226890 INFO nova.virt.libvirt.driver [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] Creating config drive at /var/lib/nova/instances/c59206c3-51bd-4f98-a12c-d24e73739926/disk.config
Jan 20 09:30:30 np0005588920 nova_compute[226886]: 2026-01-20 14:30:30.792 226890 DEBUG oslo_concurrency.processutils [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c59206c3-51bd-4f98-a12c-d24e73739926/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2ndtlbro execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:30:30 np0005588920 nova_compute[226886]: 2026-01-20 14:30:30.927 226890 DEBUG oslo_concurrency.processutils [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c59206c3-51bd-4f98-a12c-d24e73739926/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2ndtlbro" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:30:30 np0005588920 nova_compute[226886]: 2026-01-20 14:30:30.978 226890 DEBUG nova.storage.rbd_utils [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] rbd image c59206c3-51bd-4f98-a12c-d24e73739926_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:30:30 np0005588920 nova_compute[226886]: 2026-01-20 14:30:30.983 226890 DEBUG oslo_concurrency.processutils [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/c59206c3-51bd-4f98-a12c-d24e73739926/disk.config c59206c3-51bd-4f98-a12c-d24e73739926_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:30:31 np0005588920 nova_compute[226886]: 2026-01-20 14:30:31.105 226890 DEBUG nova.network.neutron [req-ab317762-804a-439c-a194-c6ebb6ce5f88 req-530c2f17-8db7-4078-a0f2-563f123af08b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Updated VIF entry in instance network info cache for port a3c691ea-b51e-4524-84af-3cbb50dd9a0c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 20 09:30:31 np0005588920 nova_compute[226886]: 2026-01-20 14:30:31.106 226890 DEBUG nova.network.neutron [req-ab317762-804a-439c-a194-c6ebb6ce5f88 req-530c2f17-8db7-4078-a0f2-563f123af08b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Updating instance_info_cache with network_info: [{"id": "a3c691ea-b51e-4524-84af-3cbb50dd9a0c", "address": "fa:16:3e:03:26:e5", "network": {"id": "01e6deef-9aca-4d36-8215-4517982a86a3", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-70171378-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f151250c04467bb4f6a229dda16fc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3c691ea-b5", "ovs_interfaceid": "a3c691ea-b51e-4524-84af-3cbb50dd9a0c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 20 09:30:31 np0005588920 nova_compute[226886]: 2026-01-20 14:30:31.125 226890 DEBUG oslo_concurrency.lockutils [req-ab317762-804a-439c-a194-c6ebb6ce5f88 req-530c2f17-8db7-4078-a0f2-563f123af08b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-ad1be106-796f-45ef-8eb7-afa4c072b371" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 20 09:30:31 np0005588920 nova_compute[226886]: 2026-01-20 14:30:31.185 226890 DEBUG oslo_concurrency.processutils [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/c59206c3-51bd-4f98-a12c-d24e73739926/disk.config c59206c3-51bd-4f98-a12c-d24e73739926_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.202s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:30:31 np0005588920 nova_compute[226886]: 2026-01-20 14:30:31.186 226890 INFO nova.virt.libvirt.driver [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] Deleting local config drive /var/lib/nova/instances/c59206c3-51bd-4f98-a12c-d24e73739926/disk.config because it was imported into RBD.
Jan 20 09:30:31 np0005588920 systemd-machined[196121]: New machine qemu-13-instance-00000020.
Jan 20 09:30:31 np0005588920 systemd[1]: Started Virtual Machine qemu-13-instance-00000020.
Jan 20 09:30:31 np0005588920 nova_compute[226886]: 2026-01-20 14:30:31.849 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919431.8489223, c59206c3-51bd-4f98-a12c-d24e73739926 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 20 09:30:31 np0005588920 nova_compute[226886]: 2026-01-20 14:30:31.851 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] VM Resumed (Lifecycle Event)
Jan 20 09:30:31 np0005588920 nova_compute[226886]: 2026-01-20 14:30:31.854 226890 DEBUG nova.compute.manager [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 20 09:30:31 np0005588920 nova_compute[226886]: 2026-01-20 14:30:31.855 226890 DEBUG nova.virt.libvirt.driver [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 20 09:30:31 np0005588920 nova_compute[226886]: 2026-01-20 14:30:31.860 226890 INFO nova.virt.libvirt.driver [-] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] Instance spawned successfully.
Jan 20 09:30:31 np0005588920 nova_compute[226886]: 2026-01-20 14:30:31.861 226890 DEBUG nova.virt.libvirt.driver [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 20 09:30:31 np0005588920 nova_compute[226886]: 2026-01-20 14:30:31.891 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 09:30:31 np0005588920 nova_compute[226886]: 2026-01-20 14:30:31.900 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 20 09:30:31 np0005588920 nova_compute[226886]: 2026-01-20 14:30:31.907 226890 DEBUG nova.virt.libvirt.driver [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 09:30:31 np0005588920 nova_compute[226886]: 2026-01-20 14:30:31.908 226890 DEBUG nova.virt.libvirt.driver [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 09:30:31 np0005588920 nova_compute[226886]: 2026-01-20 14:30:31.909 226890 DEBUG nova.virt.libvirt.driver [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 09:30:31 np0005588920 nova_compute[226886]: 2026-01-20 14:30:31.910 226890 DEBUG nova.virt.libvirt.driver [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 09:30:31 np0005588920 nova_compute[226886]: 2026-01-20 14:30:31.910 226890 DEBUG nova.virt.libvirt.driver [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:30:31 np0005588920 nova_compute[226886]: 2026-01-20 14:30:31.911 226890 DEBUG nova.virt.libvirt.driver [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:30:31 np0005588920 nova_compute[226886]: 2026-01-20 14:30:31.919 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:30:31 np0005588920 nova_compute[226886]: 2026-01-20 14:30:31.920 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919431.8503196, c59206c3-51bd-4f98-a12c-d24e73739926 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:30:31 np0005588920 nova_compute[226886]: 2026-01-20 14:30:31.920 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] VM Started (Lifecycle Event)#033[00m
Jan 20 09:30:31 np0005588920 nova_compute[226886]: 2026-01-20 14:30:31.944 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:30:31 np0005588920 nova_compute[226886]: 2026-01-20 14:30:31.948 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:30:31 np0005588920 nova_compute[226886]: 2026-01-20 14:30:31.977 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:30:31 np0005588920 nova_compute[226886]: 2026-01-20 14:30:31.987 226890 INFO nova.compute.manager [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] Took 4.71 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:30:31 np0005588920 nova_compute[226886]: 2026-01-20 14:30:31.988 226890 DEBUG nova.compute.manager [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:30:32 np0005588920 nova_compute[226886]: 2026-01-20 14:30:32.053 226890 INFO nova.compute.manager [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] Took 5.77 seconds to build instance.#033[00m
Jan 20 09:30:32 np0005588920 nova_compute[226886]: 2026-01-20 14:30:32.075 226890 DEBUG oslo_concurrency.lockutils [None req-f98538aa-394c-4f67-b3b4-044bd9c0d9c1 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Lock "c59206c3-51bd-4f98-a12c-d24e73739926" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.855s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:30:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:32.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:32 np0005588920 nova_compute[226886]: 2026-01-20 14:30:32.525 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:32.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:32 np0005588920 nova_compute[226886]: 2026-01-20 14:30:32.860 226890 DEBUG nova.compute.manager [None req-343dedda-14ef-4b20-98aa-c05bf1526105 fb066eb115a5479586f47ca371bba461 4cef207e759c44ec9537cb9b06aec2c8 - - default default] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:30:32 np0005588920 nova_compute[226886]: 2026-01-20 14:30:32.864 226890 INFO nova.compute.manager [None req-343dedda-14ef-4b20-98aa-c05bf1526105 fb066eb115a5479586f47ca371bba461 4cef207e759c44ec9537cb9b06aec2c8 - - default default] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] Retrieving diagnostics#033[00m
Jan 20 09:30:33 np0005588920 nova_compute[226886]: 2026-01-20 14:30:33.550 226890 DEBUG oslo_concurrency.lockutils [None req-f626129f-a83d-4bb0-8724-4f14d541324b 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Acquiring lock "c59206c3-51bd-4f98-a12c-d24e73739926" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:30:33 np0005588920 nova_compute[226886]: 2026-01-20 14:30:33.551 226890 DEBUG oslo_concurrency.lockutils [None req-f626129f-a83d-4bb0-8724-4f14d541324b 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Lock "c59206c3-51bd-4f98-a12c-d24e73739926" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:30:33 np0005588920 nova_compute[226886]: 2026-01-20 14:30:33.551 226890 DEBUG oslo_concurrency.lockutils [None req-f626129f-a83d-4bb0-8724-4f14d541324b 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Acquiring lock "c59206c3-51bd-4f98-a12c-d24e73739926-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:30:33 np0005588920 nova_compute[226886]: 2026-01-20 14:30:33.551 226890 DEBUG oslo_concurrency.lockutils [None req-f626129f-a83d-4bb0-8724-4f14d541324b 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Lock "c59206c3-51bd-4f98-a12c-d24e73739926-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:30:33 np0005588920 nova_compute[226886]: 2026-01-20 14:30:33.551 226890 DEBUG oslo_concurrency.lockutils [None req-f626129f-a83d-4bb0-8724-4f14d541324b 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Lock "c59206c3-51bd-4f98-a12c-d24e73739926-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:30:33 np0005588920 nova_compute[226886]: 2026-01-20 14:30:33.553 226890 INFO nova.compute.manager [None req-f626129f-a83d-4bb0-8724-4f14d541324b 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] Terminating instance#033[00m
Jan 20 09:30:33 np0005588920 nova_compute[226886]: 2026-01-20 14:30:33.554 226890 DEBUG oslo_concurrency.lockutils [None req-f626129f-a83d-4bb0-8724-4f14d541324b 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Acquiring lock "refresh_cache-c59206c3-51bd-4f98-a12c-d24e73739926" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:30:33 np0005588920 nova_compute[226886]: 2026-01-20 14:30:33.554 226890 DEBUG oslo_concurrency.lockutils [None req-f626129f-a83d-4bb0-8724-4f14d541324b 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Acquired lock "refresh_cache-c59206c3-51bd-4f98-a12c-d24e73739926" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:30:33 np0005588920 nova_compute[226886]: 2026-01-20 14:30:33.554 226890 DEBUG nova.network.neutron [None req-f626129f-a83d-4bb0-8724-4f14d541324b 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:30:34 np0005588920 nova_compute[226886]: 2026-01-20 14:30:34.023 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:34.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:34 np0005588920 nova_compute[226886]: 2026-01-20 14:30:34.477 226890 DEBUG nova.network.neutron [None req-f626129f-a83d-4bb0-8724-4f14d541324b 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:30:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:34.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:34 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e162 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:30:35 np0005588920 nova_compute[226886]: 2026-01-20 14:30:35.459 226890 DEBUG nova.network.neutron [None req-f626129f-a83d-4bb0-8724-4f14d541324b 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:30:35 np0005588920 nova_compute[226886]: 2026-01-20 14:30:35.475 226890 DEBUG oslo_concurrency.lockutils [None req-f626129f-a83d-4bb0-8724-4f14d541324b 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Releasing lock "refresh_cache-c59206c3-51bd-4f98-a12c-d24e73739926" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:30:35 np0005588920 nova_compute[226886]: 2026-01-20 14:30:35.475 226890 DEBUG nova.compute.manager [None req-f626129f-a83d-4bb0-8724-4f14d541324b 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:30:35 np0005588920 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000020.scope: Deactivated successfully.
Jan 20 09:30:35 np0005588920 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000020.scope: Consumed 4.126s CPU time.
Jan 20 09:30:35 np0005588920 systemd-machined[196121]: Machine qemu-13-instance-00000020 terminated.
Jan 20 09:30:35 np0005588920 nova_compute[226886]: 2026-01-20 14:30:35.901 226890 INFO nova.virt.libvirt.driver [-] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] Instance destroyed successfully.#033[00m
Jan 20 09:30:35 np0005588920 nova_compute[226886]: 2026-01-20 14:30:35.902 226890 DEBUG nova.objects.instance [None req-f626129f-a83d-4bb0-8724-4f14d541324b 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Lazy-loading 'resources' on Instance uuid c59206c3-51bd-4f98-a12c-d24e73739926 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:30:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:36.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:36.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:37 np0005588920 podman[238835]: 2026-01-20 14:30:37.067076749 +0000 UTC m=+0.142311247 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:30:37 np0005588920 nova_compute[226886]: 2026-01-20 14:30:37.527 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:38 np0005588920 nova_compute[226886]: 2026-01-20 14:30:38.232 226890 INFO nova.virt.libvirt.driver [None req-f626129f-a83d-4bb0-8724-4f14d541324b 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] Deleting instance files /var/lib/nova/instances/c59206c3-51bd-4f98-a12c-d24e73739926_del#033[00m
Jan 20 09:30:38 np0005588920 nova_compute[226886]: 2026-01-20 14:30:38.232 226890 INFO nova.virt.libvirt.driver [None req-f626129f-a83d-4bb0-8724-4f14d541324b 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] Deletion of /var/lib/nova/instances/c59206c3-51bd-4f98-a12c-d24e73739926_del complete#033[00m
Jan 20 09:30:38 np0005588920 nova_compute[226886]: 2026-01-20 14:30:38.292 226890 INFO nova.compute.manager [None req-f626129f-a83d-4bb0-8724-4f14d541324b 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] Took 2.82 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:30:38 np0005588920 nova_compute[226886]: 2026-01-20 14:30:38.292 226890 DEBUG oslo.service.loopingcall [None req-f626129f-a83d-4bb0-8724-4f14d541324b 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:30:38 np0005588920 nova_compute[226886]: 2026-01-20 14:30:38.292 226890 DEBUG nova.compute.manager [-] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:30:38 np0005588920 nova_compute[226886]: 2026-01-20 14:30:38.293 226890 DEBUG nova.network.neutron [-] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:30:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:38.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:38 np0005588920 nova_compute[226886]: 2026-01-20 14:30:38.468 226890 DEBUG nova.network.neutron [-] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:30:38 np0005588920 nova_compute[226886]: 2026-01-20 14:30:38.487 226890 DEBUG nova.network.neutron [-] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:30:38 np0005588920 nova_compute[226886]: 2026-01-20 14:30:38.500 226890 INFO nova.compute.manager [-] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] Took 0.21 seconds to deallocate network for instance.#033[00m
Jan 20 09:30:38 np0005588920 nova_compute[226886]: 2026-01-20 14:30:38.545 226890 DEBUG oslo_concurrency.lockutils [None req-f626129f-a83d-4bb0-8724-4f14d541324b 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:30:38 np0005588920 nova_compute[226886]: 2026-01-20 14:30:38.546 226890 DEBUG oslo_concurrency.lockutils [None req-f626129f-a83d-4bb0-8724-4f14d541324b 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:30:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:38.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:38 np0005588920 nova_compute[226886]: 2026-01-20 14:30:38.635 226890 DEBUG oslo_concurrency.processutils [None req-f626129f-a83d-4bb0-8724-4f14d541324b 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:30:39 np0005588920 nova_compute[226886]: 2026-01-20 14:30:39.027 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:39 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:30:39 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/561453425' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:30:39 np0005588920 nova_compute[226886]: 2026-01-20 14:30:39.099 226890 DEBUG oslo_concurrency.processutils [None req-f626129f-a83d-4bb0-8724-4f14d541324b 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:30:39 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e163 e163: 3 total, 3 up, 3 in
Jan 20 09:30:39 np0005588920 nova_compute[226886]: 2026-01-20 14:30:39.106 226890 DEBUG nova.compute.provider_tree [None req-f626129f-a83d-4bb0-8724-4f14d541324b 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:30:39 np0005588920 nova_compute[226886]: 2026-01-20 14:30:39.136 226890 DEBUG nova.scheduler.client.report [None req-f626129f-a83d-4bb0-8724-4f14d541324b 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:30:39 np0005588920 nova_compute[226886]: 2026-01-20 14:30:39.169 226890 DEBUG oslo_concurrency.lockutils [None req-f626129f-a83d-4bb0-8724-4f14d541324b 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.623s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:30:39 np0005588920 nova_compute[226886]: 2026-01-20 14:30:39.194 226890 INFO nova.scheduler.client.report [None req-f626129f-a83d-4bb0-8724-4f14d541324b 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Deleted allocations for instance c59206c3-51bd-4f98-a12c-d24e73739926#033[00m
Jan 20 09:30:39 np0005588920 nova_compute[226886]: 2026-01-20 14:30:39.278 226890 DEBUG oslo_concurrency.lockutils [None req-f626129f-a83d-4bb0-8724-4f14d541324b 86df2c2ba7e34562aecf2f9566e46dc0 4ac25c78ebe1428c9034578c09eec31e - - default default] Lock "c59206c3-51bd-4f98-a12c-d24e73739926" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.728s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:30:39 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e163 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:30:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:40.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:40.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:42.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:42 np0005588920 nova_compute[226886]: 2026-01-20 14:30:42.529 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:42.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:43 np0005588920 nova_compute[226886]: 2026-01-20 14:30:43.049 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:30:43.049 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 20 09:30:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:30:43.050 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 20 09:30:43 np0005588920 nova_compute[226886]: 2026-01-20 14:30:43.253 226890 DEBUG nova.compute.manager [req-65b69232-f380-4052-90e8-5709512dcb8c req-7d09ee87-b413-41c2-a709-3ee3ed889864 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Received event network-changed-a3c691ea-b51e-4524-84af-3cbb50dd9a0c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 09:30:43 np0005588920 nova_compute[226886]: 2026-01-20 14:30:43.254 226890 DEBUG nova.compute.manager [req-65b69232-f380-4052-90e8-5709512dcb8c req-7d09ee87-b413-41c2-a709-3ee3ed889864 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Refreshing instance network info cache due to event network-changed-a3c691ea-b51e-4524-84af-3cbb50dd9a0c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 20 09:30:43 np0005588920 nova_compute[226886]: 2026-01-20 14:30:43.254 226890 DEBUG oslo_concurrency.lockutils [req-65b69232-f380-4052-90e8-5709512dcb8c req-7d09ee87-b413-41c2-a709-3ee3ed889864 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-ad1be106-796f-45ef-8eb7-afa4c072b371" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 20 09:30:43 np0005588920 nova_compute[226886]: 2026-01-20 14:30:43.255 226890 DEBUG oslo_concurrency.lockutils [req-65b69232-f380-4052-90e8-5709512dcb8c req-7d09ee87-b413-41c2-a709-3ee3ed889864 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-ad1be106-796f-45ef-8eb7-afa4c072b371" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 20 09:30:43 np0005588920 nova_compute[226886]: 2026-01-20 14:30:43.255 226890 DEBUG nova.network.neutron [req-65b69232-f380-4052-90e8-5709512dcb8c req-7d09ee87-b413-41c2-a709-3ee3ed889864 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Refreshing network info cache for port a3c691ea-b51e-4524-84af-3cbb50dd9a0c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 20 09:30:44 np0005588920 nova_compute[226886]: 2026-01-20 14:30:44.030 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:30:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:44.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:44.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e163 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:30:45 np0005588920 nova_compute[226886]: 2026-01-20 14:30:45.012 226890 DEBUG nova.network.neutron [req-65b69232-f380-4052-90e8-5709512dcb8c req-7d09ee87-b413-41c2-a709-3ee3ed889864 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Updated VIF entry in instance network info cache for port a3c691ea-b51e-4524-84af-3cbb50dd9a0c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 20 09:30:45 np0005588920 nova_compute[226886]: 2026-01-20 14:30:45.013 226890 DEBUG nova.network.neutron [req-65b69232-f380-4052-90e8-5709512dcb8c req-7d09ee87-b413-41c2-a709-3ee3ed889864 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Updating instance_info_cache with network_info: [{"id": "a3c691ea-b51e-4524-84af-3cbb50dd9a0c", "address": "fa:16:3e:03:26:e5", "network": {"id": "01e6deef-9aca-4d36-8215-4517982a86a3", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-70171378-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.178", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f151250c04467bb4f6a229dda16fc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3c691ea-b5", "ovs_interfaceid": "a3c691ea-b51e-4524-84af-3cbb50dd9a0c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:30:45 np0005588920 nova_compute[226886]: 2026-01-20 14:30:45.040 226890 DEBUG oslo_concurrency.lockutils [req-65b69232-f380-4052-90e8-5709512dcb8c req-7d09ee87-b413-41c2-a709-3ee3ed889864 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-ad1be106-796f-45ef-8eb7-afa4c072b371" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 20 09:30:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:46.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:30:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:46.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:30:46 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e164 e164: 3 total, 3 up, 3 in
Jan 20 09:30:47 np0005588920 nova_compute[226886]: 2026-01-20 14:30:47.533 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:30:47 np0005588920 nova_compute[226886]: 2026-01-20 14:30:47.864 226890 DEBUG nova.compute.manager [req-ce7310d9-d065-4fd9-8a5c-f3da6b6a0ce4 req-59b48c44-613f-4d72-bbfa-c4e992001327 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Received event network-changed-a3c691ea-b51e-4524-84af-3cbb50dd9a0c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 09:30:47 np0005588920 nova_compute[226886]: 2026-01-20 14:30:47.865 226890 DEBUG nova.compute.manager [req-ce7310d9-d065-4fd9-8a5c-f3da6b6a0ce4 req-59b48c44-613f-4d72-bbfa-c4e992001327 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Refreshing instance network info cache due to event network-changed-a3c691ea-b51e-4524-84af-3cbb50dd9a0c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 20 09:30:47 np0005588920 nova_compute[226886]: 2026-01-20 14:30:47.865 226890 DEBUG oslo_concurrency.lockutils [req-ce7310d9-d065-4fd9-8a5c-f3da6b6a0ce4 req-59b48c44-613f-4d72-bbfa-c4e992001327 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-ad1be106-796f-45ef-8eb7-afa4c072b371" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 20 09:30:47 np0005588920 nova_compute[226886]: 2026-01-20 14:30:47.865 226890 DEBUG oslo_concurrency.lockutils [req-ce7310d9-d065-4fd9-8a5c-f3da6b6a0ce4 req-59b48c44-613f-4d72-bbfa-c4e992001327 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-ad1be106-796f-45ef-8eb7-afa4c072b371" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 20 09:30:47 np0005588920 nova_compute[226886]: 2026-01-20 14:30:47.865 226890 DEBUG nova.network.neutron [req-ce7310d9-d065-4fd9-8a5c-f3da6b6a0ce4 req-59b48c44-613f-4d72-bbfa-c4e992001327 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Refreshing network info cache for port a3c691ea-b51e-4524-84af-3cbb50dd9a0c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 20 09:30:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:48.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:30:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:48.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:30:49 np0005588920 nova_compute[226886]: 2026-01-20 14:30:49.031 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:30:49 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:30:49.052 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 09:30:49 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e164 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:30:49 np0005588920 podman[238886]: 2026-01-20 14:30:49.992489014 +0000 UTC m=+0.073142757 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 20 09:30:50 np0005588920 nova_compute[226886]: 2026-01-20 14:30:50.163 226890 DEBUG nova.network.neutron [req-ce7310d9-d065-4fd9-8a5c-f3da6b6a0ce4 req-59b48c44-613f-4d72-bbfa-c4e992001327 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Updated VIF entry in instance network info cache for port a3c691ea-b51e-4524-84af-3cbb50dd9a0c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 20 09:30:50 np0005588920 nova_compute[226886]: 2026-01-20 14:30:50.164 226890 DEBUG nova.network.neutron [req-ce7310d9-d065-4fd9-8a5c-f3da6b6a0ce4 req-59b48c44-613f-4d72-bbfa-c4e992001327 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Updating instance_info_cache with network_info: [{"id": "a3c691ea-b51e-4524-84af-3cbb50dd9a0c", "address": "fa:16:3e:03:26:e5", "network": {"id": "01e6deef-9aca-4d36-8215-4517982a86a3", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-70171378-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f151250c04467bb4f6a229dda16fc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3c691ea-b5", "ovs_interfaceid": "a3c691ea-b51e-4524-84af-3cbb50dd9a0c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:30:50 np0005588920 nova_compute[226886]: 2026-01-20 14:30:50.178 226890 DEBUG oslo_concurrency.lockutils [req-ce7310d9-d065-4fd9-8a5c-f3da6b6a0ce4 req-59b48c44-613f-4d72-bbfa-c4e992001327 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-ad1be106-796f-45ef-8eb7-afa4c072b371" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 20 09:30:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:30:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:50.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:30:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 09:30:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:50.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 09:30:50 np0005588920 nova_compute[226886]: 2026-01-20 14:30:50.898 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768919435.897653, c59206c3-51bd-4f98-a12c-d24e73739926 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 20 09:30:50 np0005588920 nova_compute[226886]: 2026-01-20 14:30:50.899 226890 INFO nova.compute.manager [-] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] VM Stopped (Lifecycle Event)
Jan 20 09:30:50 np0005588920 nova_compute[226886]: 2026-01-20 14:30:50.930 226890 DEBUG nova.compute.manager [None req-d0bf73bc-b8c8-473d-bf55-f78a3bcd94ac - - - - - -] [instance: c59206c3-51bd-4f98-a12c-d24e73739926] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 09:30:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:30:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:52.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:30:52 np0005588920 nova_compute[226886]: 2026-01-20 14:30:52.535 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:30:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:52.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:52 np0005588920 nova_compute[226886]: 2026-01-20 14:30:52.646 226890 DEBUG oslo_concurrency.lockutils [None req-6a09b217-d218-46e7-930d-3ef98d569db1 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Acquiring lock "ad1be106-796f-45ef-8eb7-afa4c072b371" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:30:52 np0005588920 nova_compute[226886]: 2026-01-20 14:30:52.647 226890 DEBUG oslo_concurrency.lockutils [None req-6a09b217-d218-46e7-930d-3ef98d569db1 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Lock "ad1be106-796f-45ef-8eb7-afa4c072b371" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:30:52 np0005588920 nova_compute[226886]: 2026-01-20 14:30:52.647 226890 DEBUG oslo_concurrency.lockutils [None req-6a09b217-d218-46e7-930d-3ef98d569db1 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Acquiring lock "ad1be106-796f-45ef-8eb7-afa4c072b371-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:30:52 np0005588920 nova_compute[226886]: 2026-01-20 14:30:52.647 226890 DEBUG oslo_concurrency.lockutils [None req-6a09b217-d218-46e7-930d-3ef98d569db1 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Lock "ad1be106-796f-45ef-8eb7-afa4c072b371-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:30:52 np0005588920 nova_compute[226886]: 2026-01-20 14:30:52.648 226890 DEBUG oslo_concurrency.lockutils [None req-6a09b217-d218-46e7-930d-3ef98d569db1 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Lock "ad1be106-796f-45ef-8eb7-afa4c072b371-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:30:52 np0005588920 nova_compute[226886]: 2026-01-20 14:30:52.649 226890 INFO nova.compute.manager [None req-6a09b217-d218-46e7-930d-3ef98d569db1 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Terminating instance
Jan 20 09:30:52 np0005588920 nova_compute[226886]: 2026-01-20 14:30:52.649 226890 DEBUG nova.compute.manager [None req-6a09b217-d218-46e7-930d-3ef98d569db1 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 20 09:30:52 np0005588920 kernel: tapa3c691ea-b5 (unregistering): left promiscuous mode
Jan 20 09:30:52 np0005588920 NetworkManager[49076]: <info>  [1768919452.7082] device (tapa3c691ea-b5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:30:52 np0005588920 ovn_controller[133971]: 2026-01-20T14:30:52Z|00089|binding|INFO|Releasing lport a3c691ea-b51e-4524-84af-3cbb50dd9a0c from this chassis (sb_readonly=0)
Jan 20 09:30:52 np0005588920 ovn_controller[133971]: 2026-01-20T14:30:52Z|00090|binding|INFO|Setting lport a3c691ea-b51e-4524-84af-3cbb50dd9a0c down in Southbound
Jan 20 09:30:52 np0005588920 nova_compute[226886]: 2026-01-20 14:30:52.716 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:30:52 np0005588920 ovn_controller[133971]: 2026-01-20T14:30:52Z|00091|binding|INFO|Removing iface tapa3c691ea-b5 ovn-installed in OVS
Jan 20 09:30:52 np0005588920 nova_compute[226886]: 2026-01-20 14:30:52.717 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:30:52 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:30:52.721 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:03:26:e5 10.100.0.13'], port_security=['fa:16:3e:03:26:e5 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'ad1be106-796f-45ef-8eb7-afa4c072b371', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-01e6deef-9aca-4d36-8215-4517982a86a3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '78f151250c04467bb4f6a229dda16fc5', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'dd36e8d2-993a-4618-8fff-62abafaadfd3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=86700b79-bb44-47f0-88a5-d4c8eda3acbb, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=a3c691ea-b51e-4524-84af-3cbb50dd9a0c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:30:52 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:30:52.722 144128 INFO neutron.agent.ovn.metadata.agent [-] Port a3c691ea-b51e-4524-84af-3cbb50dd9a0c in datapath 01e6deef-9aca-4d36-8215-4517982a86a3 unbound from our chassis
Jan 20 09:30:52 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:30:52.723 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 01e6deef-9aca-4d36-8215-4517982a86a3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 20 09:30:52 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:30:52.724 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a41c2a57-294c-4142-916e-75c921f720a0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:30:52 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:30:52.724 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-01e6deef-9aca-4d36-8215-4517982a86a3 namespace which is not needed anymore
Jan 20 09:30:52 np0005588920 nova_compute[226886]: 2026-01-20 14:30:52.737 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:30:52 np0005588920 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000001c.scope: Deactivated successfully.
Jan 20 09:30:52 np0005588920 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000001c.scope: Consumed 15.542s CPU time.
Jan 20 09:30:52 np0005588920 systemd-machined[196121]: Machine qemu-12-instance-0000001c terminated.
Jan 20 09:30:52 np0005588920 nova_compute[226886]: 2026-01-20 14:30:52.866 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:30:52 np0005588920 nova_compute[226886]: 2026-01-20 14:30:52.871 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:30:52 np0005588920 nova_compute[226886]: 2026-01-20 14:30:52.879 226890 INFO nova.virt.libvirt.driver [-] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Instance destroyed successfully.
Jan 20 09:30:52 np0005588920 nova_compute[226886]: 2026-01-20 14:30:52.880 226890 DEBUG nova.objects.instance [None req-6a09b217-d218-46e7-930d-3ef98d569db1 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Lazy-loading 'resources' on Instance uuid ad1be106-796f-45ef-8eb7-afa4c072b371 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 09:30:52 np0005588920 nova_compute[226886]: 2026-01-20 14:30:52.894 226890 DEBUG nova.virt.libvirt.vif [None req-6a09b217-d218-46e7-930d-3ef98d569db1 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:29:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-1204930815',display_name='tempest-FloatingIPsAssociationTestJSON-server-1204930815',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-1204930815',id=28,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:30:01Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='78f151250c04467bb4f6a229dda16fc5',ramdisk_id='',reservation_id='r-rz2ag1j8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_h
w_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIPsAssociationTestJSON-146254261',owner_user_name='tempest-FloatingIPsAssociationTestJSON-146254261-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:30:01Z,user_data=None,user_id='0cec872a00f742d78563d6d16fc545cb',uuid=ad1be106-796f-45ef-8eb7-afa4c072b371,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a3c691ea-b51e-4524-84af-3cbb50dd9a0c", "address": "fa:16:3e:03:26:e5", "network": {"id": "01e6deef-9aca-4d36-8215-4517982a86a3", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-70171378-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f151250c04467bb4f6a229dda16fc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3c691ea-b5", "ovs_interfaceid": "a3c691ea-b51e-4524-84af-3cbb50dd9a0c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:30:52 np0005588920 nova_compute[226886]: 2026-01-20 14:30:52.895 226890 DEBUG nova.network.os_vif_util [None req-6a09b217-d218-46e7-930d-3ef98d569db1 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Converting VIF {"id": "a3c691ea-b51e-4524-84af-3cbb50dd9a0c", "address": "fa:16:3e:03:26:e5", "network": {"id": "01e6deef-9aca-4d36-8215-4517982a86a3", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-70171378-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f151250c04467bb4f6a229dda16fc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3c691ea-b5", "ovs_interfaceid": "a3c691ea-b51e-4524-84af-3cbb50dd9a0c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:30:52 np0005588920 nova_compute[226886]: 2026-01-20 14:30:52.895 226890 DEBUG nova.network.os_vif_util [None req-6a09b217-d218-46e7-930d-3ef98d569db1 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:03:26:e5,bridge_name='br-int',has_traffic_filtering=True,id=a3c691ea-b51e-4524-84af-3cbb50dd9a0c,network=Network(01e6deef-9aca-4d36-8215-4517982a86a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3c691ea-b5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 20 09:30:52 np0005588920 nova_compute[226886]: 2026-01-20 14:30:52.896 226890 DEBUG os_vif [None req-6a09b217-d218-46e7-930d-3ef98d569db1 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:03:26:e5,bridge_name='br-int',has_traffic_filtering=True,id=a3c691ea-b51e-4524-84af-3cbb50dd9a0c,network=Network(01e6deef-9aca-4d36-8215-4517982a86a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3c691ea-b5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 20 09:30:52 np0005588920 nova_compute[226886]: 2026-01-20 14:30:52.897 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:30:52 np0005588920 nova_compute[226886]: 2026-01-20 14:30:52.897 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa3c691ea-b5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 09:30:52 np0005588920 nova_compute[226886]: 2026-01-20 14:30:52.899 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:30:52 np0005588920 nova_compute[226886]: 2026-01-20 14:30:52.900 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:30:52 np0005588920 nova_compute[226886]: 2026-01-20 14:30:52.903 226890 INFO os_vif [None req-6a09b217-d218-46e7-930d-3ef98d569db1 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:03:26:e5,bridge_name='br-int',has_traffic_filtering=True,id=a3c691ea-b51e-4524-84af-3cbb50dd9a0c,network=Network(01e6deef-9aca-4d36-8215-4517982a86a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3c691ea-b5')
Jan 20 09:30:53 np0005588920 neutron-haproxy-ovnmeta-01e6deef-9aca-4d36-8215-4517982a86a3[238114]: [NOTICE]   (238118) : haproxy version is 2.8.14-c23fe91
Jan 20 09:30:53 np0005588920 neutron-haproxy-ovnmeta-01e6deef-9aca-4d36-8215-4517982a86a3[238114]: [NOTICE]   (238118) : path to executable is /usr/sbin/haproxy
Jan 20 09:30:53 np0005588920 neutron-haproxy-ovnmeta-01e6deef-9aca-4d36-8215-4517982a86a3[238114]: [WARNING]  (238118) : Exiting Master process...
Jan 20 09:30:53 np0005588920 neutron-haproxy-ovnmeta-01e6deef-9aca-4d36-8215-4517982a86a3[238114]: [WARNING]  (238118) : Exiting Master process...
Jan 20 09:30:53 np0005588920 neutron-haproxy-ovnmeta-01e6deef-9aca-4d36-8215-4517982a86a3[238114]: [ALERT]    (238118) : Current worker (238120) exited with code 143 (Terminated)
Jan 20 09:30:53 np0005588920 neutron-haproxy-ovnmeta-01e6deef-9aca-4d36-8215-4517982a86a3[238114]: [WARNING]  (238118) : All workers exited. Exiting... (0)
Jan 20 09:30:53 np0005588920 systemd[1]: libpod-cbde9a9c5849b2491e8b553fc8de203b1d7b307740c8313efc170344a4dc92cd.scope: Deactivated successfully.
Jan 20 09:30:53 np0005588920 podman[238929]: 2026-01-20 14:30:53.051865447 +0000 UTC m=+0.239690711 container died cbde9a9c5849b2491e8b553fc8de203b1d7b307740c8313efc170344a4dc92cd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-01e6deef-9aca-4d36-8215-4517982a86a3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 20 09:30:53 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cbde9a9c5849b2491e8b553fc8de203b1d7b307740c8313efc170344a4dc92cd-userdata-shm.mount: Deactivated successfully.
Jan 20 09:30:53 np0005588920 systemd[1]: var-lib-containers-storage-overlay-f9fe54e6f5c163d14a3f7d4740b1e71daaabb5535ace054c695be911191d57a0-merged.mount: Deactivated successfully.
Jan 20 09:30:53 np0005588920 podman[238929]: 2026-01-20 14:30:53.090782724 +0000 UTC m=+0.278607988 container cleanup cbde9a9c5849b2491e8b553fc8de203b1d7b307740c8313efc170344a4dc92cd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-01e6deef-9aca-4d36-8215-4517982a86a3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:30:53 np0005588920 systemd[1]: libpod-conmon-cbde9a9c5849b2491e8b553fc8de203b1d7b307740c8313efc170344a4dc92cd.scope: Deactivated successfully.
Jan 20 09:30:53 np0005588920 nova_compute[226886]: 2026-01-20 14:30:53.162 226890 DEBUG nova.compute.manager [req-0b824f75-2139-4cbc-915f-ef864b1f682d req-22e13fd0-cfe1-4a84-84c3-d5cae9af2eb2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Received event network-vif-unplugged-a3c691ea-b51e-4524-84af-3cbb50dd9a0c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:30:53 np0005588920 nova_compute[226886]: 2026-01-20 14:30:53.162 226890 DEBUG oslo_concurrency.lockutils [req-0b824f75-2139-4cbc-915f-ef864b1f682d req-22e13fd0-cfe1-4a84-84c3-d5cae9af2eb2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "ad1be106-796f-45ef-8eb7-afa4c072b371-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:30:53 np0005588920 nova_compute[226886]: 2026-01-20 14:30:53.162 226890 DEBUG oslo_concurrency.lockutils [req-0b824f75-2139-4cbc-915f-ef864b1f682d req-22e13fd0-cfe1-4a84-84c3-d5cae9af2eb2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "ad1be106-796f-45ef-8eb7-afa4c072b371-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:30:53 np0005588920 nova_compute[226886]: 2026-01-20 14:30:53.163 226890 DEBUG oslo_concurrency.lockutils [req-0b824f75-2139-4cbc-915f-ef864b1f682d req-22e13fd0-cfe1-4a84-84c3-d5cae9af2eb2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "ad1be106-796f-45ef-8eb7-afa4c072b371-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:30:53 np0005588920 nova_compute[226886]: 2026-01-20 14:30:53.163 226890 DEBUG nova.compute.manager [req-0b824f75-2139-4cbc-915f-ef864b1f682d req-22e13fd0-cfe1-4a84-84c3-d5cae9af2eb2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] No waiting events found dispatching network-vif-unplugged-a3c691ea-b51e-4524-84af-3cbb50dd9a0c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:30:53 np0005588920 nova_compute[226886]: 2026-01-20 14:30:53.163 226890 DEBUG nova.compute.manager [req-0b824f75-2139-4cbc-915f-ef864b1f682d req-22e13fd0-cfe1-4a84-84c3-d5cae9af2eb2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Received event network-vif-unplugged-a3c691ea-b51e-4524-84af-3cbb50dd9a0c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:30:53 np0005588920 podman[238984]: 2026-01-20 14:30:53.288694484 +0000 UTC m=+0.171599223 container remove cbde9a9c5849b2491e8b553fc8de203b1d7b307740c8313efc170344a4dc92cd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-01e6deef-9aca-4d36-8215-4517982a86a3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 20 09:30:53 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:30:53.294 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[64b94547-1e37-46f5-b72a-f0fecd4fdfce]: (4, ('Tue Jan 20 02:30:52 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-01e6deef-9aca-4d36-8215-4517982a86a3 (cbde9a9c5849b2491e8b553fc8de203b1d7b307740c8313efc170344a4dc92cd)\ncbde9a9c5849b2491e8b553fc8de203b1d7b307740c8313efc170344a4dc92cd\nTue Jan 20 02:30:53 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-01e6deef-9aca-4d36-8215-4517982a86a3 (cbde9a9c5849b2491e8b553fc8de203b1d7b307740c8313efc170344a4dc92cd)\ncbde9a9c5849b2491e8b553fc8de203b1d7b307740c8313efc170344a4dc92cd\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:53 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:30:53.296 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[1052fb16-4260-4373-89de-eecd8005ad68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:53 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:30:53.297 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap01e6deef-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:30:53 np0005588920 nova_compute[226886]: 2026-01-20 14:30:53.299 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:53 np0005588920 kernel: tap01e6deef-90: left promiscuous mode
Jan 20 09:30:53 np0005588920 nova_compute[226886]: 2026-01-20 14:30:53.313 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:53 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:30:53.317 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[10a51659-4d97-4de7-a991-bd73e8ea2b75]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:53 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:30:53.333 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ca510dc1-4127-4234-ac9c-9a7037a7a102]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:53 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:30:53.335 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0900572f-a812-43c4-ab31-fe7df4d498ba]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:53 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:30:53.351 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e4db9758-f6e7-4f16-a642-7ca85087af45]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 444166, 'reachable_time': 35937, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239000, 'error': None, 'target': 'ovnmeta-01e6deef-9aca-4d36-8215-4517982a86a3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:53 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:30:53.353 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-01e6deef-9aca-4d36-8215-4517982a86a3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:30:53 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:30:53.354 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[1c9b7cf7-930b-4798-97ca-1082cb6c033a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:53 np0005588920 systemd[1]: run-netns-ovnmeta\x2d01e6deef\x2d9aca\x2d4d36\x2d8215\x2d4517982a86a3.mount: Deactivated successfully.
Jan 20 09:30:53 np0005588920 nova_compute[226886]: 2026-01-20 14:30:53.597 226890 INFO nova.virt.libvirt.driver [None req-6a09b217-d218-46e7-930d-3ef98d569db1 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Deleting instance files /var/lib/nova/instances/ad1be106-796f-45ef-8eb7-afa4c072b371_del#033[00m
Jan 20 09:30:53 np0005588920 nova_compute[226886]: 2026-01-20 14:30:53.598 226890 INFO nova.virt.libvirt.driver [None req-6a09b217-d218-46e7-930d-3ef98d569db1 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Deletion of /var/lib/nova/instances/ad1be106-796f-45ef-8eb7-afa4c072b371_del complete#033[00m
Jan 20 09:30:53 np0005588920 nova_compute[226886]: 2026-01-20 14:30:53.677 226890 INFO nova.compute.manager [None req-6a09b217-d218-46e7-930d-3ef98d569db1 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Took 1.03 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:30:53 np0005588920 nova_compute[226886]: 2026-01-20 14:30:53.678 226890 DEBUG oslo.service.loopingcall [None req-6a09b217-d218-46e7-930d-3ef98d569db1 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:30:53 np0005588920 nova_compute[226886]: 2026-01-20 14:30:53.679 226890 DEBUG nova.compute.manager [-] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:30:53 np0005588920 nova_compute[226886]: 2026-01-20 14:30:53.679 226890 DEBUG nova.network.neutron [-] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:30:54 np0005588920 nova_compute[226886]: 2026-01-20 14:30:54.083 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:54.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:54.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:54 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e164 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:30:54 np0005588920 nova_compute[226886]: 2026-01-20 14:30:54.811 226890 DEBUG nova.network.neutron [-] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:30:54 np0005588920 nova_compute[226886]: 2026-01-20 14:30:54.830 226890 INFO nova.compute.manager [-] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Took 1.15 seconds to deallocate network for instance.#033[00m
Jan 20 09:30:54 np0005588920 nova_compute[226886]: 2026-01-20 14:30:54.886 226890 DEBUG oslo_concurrency.lockutils [None req-6a09b217-d218-46e7-930d-3ef98d569db1 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:30:54 np0005588920 nova_compute[226886]: 2026-01-20 14:30:54.887 226890 DEBUG oslo_concurrency.lockutils [None req-6a09b217-d218-46e7-930d-3ef98d569db1 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:30:54 np0005588920 nova_compute[226886]: 2026-01-20 14:30:54.930 226890 DEBUG nova.compute.manager [req-618f4c61-7a48-42e5-8171-af94746431d1 req-cd743923-fb8f-4e52-a4ae-c181b4e05171 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Received event network-vif-deleted-a3c691ea-b51e-4524-84af-3cbb50dd9a0c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:30:54 np0005588920 nova_compute[226886]: 2026-01-20 14:30:54.974 226890 DEBUG oslo_concurrency.processutils [None req-6a09b217-d218-46e7-930d-3ef98d569db1 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:30:55 np0005588920 nova_compute[226886]: 2026-01-20 14:30:55.271 226890 DEBUG nova.compute.manager [req-e4aeefd5-1405-4486-98c9-aedfdbab8642 req-ee261fa1-9e6f-4b4a-8718-e5124255bf18 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Received event network-vif-plugged-a3c691ea-b51e-4524-84af-3cbb50dd9a0c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:30:55 np0005588920 nova_compute[226886]: 2026-01-20 14:30:55.272 226890 DEBUG oslo_concurrency.lockutils [req-e4aeefd5-1405-4486-98c9-aedfdbab8642 req-ee261fa1-9e6f-4b4a-8718-e5124255bf18 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "ad1be106-796f-45ef-8eb7-afa4c072b371-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:30:55 np0005588920 nova_compute[226886]: 2026-01-20 14:30:55.272 226890 DEBUG oslo_concurrency.lockutils [req-e4aeefd5-1405-4486-98c9-aedfdbab8642 req-ee261fa1-9e6f-4b4a-8718-e5124255bf18 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "ad1be106-796f-45ef-8eb7-afa4c072b371-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:30:55 np0005588920 nova_compute[226886]: 2026-01-20 14:30:55.273 226890 DEBUG oslo_concurrency.lockutils [req-e4aeefd5-1405-4486-98c9-aedfdbab8642 req-ee261fa1-9e6f-4b4a-8718-e5124255bf18 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "ad1be106-796f-45ef-8eb7-afa4c072b371-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:30:55 np0005588920 nova_compute[226886]: 2026-01-20 14:30:55.273 226890 DEBUG nova.compute.manager [req-e4aeefd5-1405-4486-98c9-aedfdbab8642 req-ee261fa1-9e6f-4b4a-8718-e5124255bf18 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] No waiting events found dispatching network-vif-plugged-a3c691ea-b51e-4524-84af-3cbb50dd9a0c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:30:55 np0005588920 nova_compute[226886]: 2026-01-20 14:30:55.273 226890 WARNING nova.compute.manager [req-e4aeefd5-1405-4486-98c9-aedfdbab8642 req-ee261fa1-9e6f-4b4a-8718-e5124255bf18 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Received unexpected event network-vif-plugged-a3c691ea-b51e-4524-84af-3cbb50dd9a0c for instance with vm_state deleted and task_state None.#033[00m
Jan 20 09:30:55 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:30:55 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1852034554' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:30:55 np0005588920 nova_compute[226886]: 2026-01-20 14:30:55.378 226890 DEBUG oslo_concurrency.processutils [None req-6a09b217-d218-46e7-930d-3ef98d569db1 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.404s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:30:55 np0005588920 nova_compute[226886]: 2026-01-20 14:30:55.384 226890 DEBUG nova.compute.provider_tree [None req-6a09b217-d218-46e7-930d-3ef98d569db1 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:30:55 np0005588920 nova_compute[226886]: 2026-01-20 14:30:55.412 226890 DEBUG nova.scheduler.client.report [None req-6a09b217-d218-46e7-930d-3ef98d569db1 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:30:55 np0005588920 nova_compute[226886]: 2026-01-20 14:30:55.442 226890 DEBUG oslo_concurrency.lockutils [None req-6a09b217-d218-46e7-930d-3ef98d569db1 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.555s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:30:55 np0005588920 nova_compute[226886]: 2026-01-20 14:30:55.478 226890 INFO nova.scheduler.client.report [None req-6a09b217-d218-46e7-930d-3ef98d569db1 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Deleted allocations for instance ad1be106-796f-45ef-8eb7-afa4c072b371#033[00m
Jan 20 09:30:55 np0005588920 nova_compute[226886]: 2026-01-20 14:30:55.558 226890 DEBUG oslo_concurrency.lockutils [None req-6a09b217-d218-46e7-930d-3ef98d569db1 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Lock "ad1be106-796f-45ef-8eb7-afa4c072b371" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.911s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:30:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:56.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:56.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:56 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e165 e165: 3 total, 3 up, 3 in
Jan 20 09:30:57 np0005588920 nova_compute[226886]: 2026-01-20 14:30:57.900 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:30:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:58.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:30:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:30:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:58.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:58 np0005588920 nova_compute[226886]: 2026-01-20 14:30:58.992 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:59 np0005588920 nova_compute[226886]: 2026-01-20 14:30:59.140 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:59 np0005588920 nova_compute[226886]: 2026-01-20 14:30:59.148 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:59 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:31:00 np0005588920 nova_compute[226886]: 2026-01-20 14:31:00.141 226890 DEBUG oslo_concurrency.lockutils [None req-ac3c2529-c51c-4b50-801b-9569a7dee8e9 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Acquiring lock "87fe16d6-774e-4002-8df4-9eb202621ab9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:31:00 np0005588920 nova_compute[226886]: 2026-01-20 14:31:00.142 226890 DEBUG oslo_concurrency.lockutils [None req-ac3c2529-c51c-4b50-801b-9569a7dee8e9 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "87fe16d6-774e-4002-8df4-9eb202621ab9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:31:00 np0005588920 nova_compute[226886]: 2026-01-20 14:31:00.142 226890 DEBUG oslo_concurrency.lockutils [None req-ac3c2529-c51c-4b50-801b-9569a7dee8e9 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Acquiring lock "87fe16d6-774e-4002-8df4-9eb202621ab9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:31:00 np0005588920 nova_compute[226886]: 2026-01-20 14:31:00.142 226890 DEBUG oslo_concurrency.lockutils [None req-ac3c2529-c51c-4b50-801b-9569a7dee8e9 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "87fe16d6-774e-4002-8df4-9eb202621ab9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:31:00 np0005588920 nova_compute[226886]: 2026-01-20 14:31:00.142 226890 DEBUG oslo_concurrency.lockutils [None req-ac3c2529-c51c-4b50-801b-9569a7dee8e9 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "87fe16d6-774e-4002-8df4-9eb202621ab9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:31:00 np0005588920 nova_compute[226886]: 2026-01-20 14:31:00.144 226890 INFO nova.compute.manager [None req-ac3c2529-c51c-4b50-801b-9569a7dee8e9 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Terminating instance#033[00m
Jan 20 09:31:00 np0005588920 nova_compute[226886]: 2026-01-20 14:31:00.145 226890 DEBUG oslo_concurrency.lockutils [None req-ac3c2529-c51c-4b50-801b-9569a7dee8e9 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Acquiring lock "refresh_cache-87fe16d6-774e-4002-8df4-9eb202621ab9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:31:00 np0005588920 nova_compute[226886]: 2026-01-20 14:31:00.145 226890 DEBUG oslo_concurrency.lockutils [None req-ac3c2529-c51c-4b50-801b-9569a7dee8e9 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Acquired lock "refresh_cache-87fe16d6-774e-4002-8df4-9eb202621ab9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:31:00 np0005588920 nova_compute[226886]: 2026-01-20 14:31:00.146 226890 DEBUG nova.network.neutron [None req-ac3c2529-c51c-4b50-801b-9569a7dee8e9 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:31:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:00.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:00 np0005588920 nova_compute[226886]: 2026-01-20 14:31:00.473 226890 DEBUG nova.network.neutron [None req-ac3c2529-c51c-4b50-801b-9569a7dee8e9 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:31:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:00.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:01 np0005588920 nova_compute[226886]: 2026-01-20 14:31:01.308 226890 DEBUG nova.network.neutron [None req-ac3c2529-c51c-4b50-801b-9569a7dee8e9 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:31:01 np0005588920 nova_compute[226886]: 2026-01-20 14:31:01.368 226890 DEBUG oslo_concurrency.lockutils [None req-ac3c2529-c51c-4b50-801b-9569a7dee8e9 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Releasing lock "refresh_cache-87fe16d6-774e-4002-8df4-9eb202621ab9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:31:01 np0005588920 nova_compute[226886]: 2026-01-20 14:31:01.369 226890 DEBUG nova.compute.manager [None req-ac3c2529-c51c-4b50-801b-9569a7dee8e9 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:31:01 np0005588920 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000018.scope: Deactivated successfully.
Jan 20 09:31:01 np0005588920 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000018.scope: Consumed 17.570s CPU time.
Jan 20 09:31:01 np0005588920 systemd-machined[196121]: Machine qemu-11-instance-00000018 terminated.
Jan 20 09:31:01 np0005588920 nova_compute[226886]: 2026-01-20 14:31:01.593 226890 INFO nova.virt.libvirt.driver [-] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Instance destroyed successfully.#033[00m
Jan 20 09:31:01 np0005588920 nova_compute[226886]: 2026-01-20 14:31:01.593 226890 DEBUG nova.objects.instance [None req-ac3c2529-c51c-4b50-801b-9569a7dee8e9 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lazy-loading 'resources' on Instance uuid 87fe16d6-774e-4002-8df4-9eb202621ab9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:31:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:02.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 09:31:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:02.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 09:31:02 np0005588920 nova_compute[226886]: 2026-01-20 14:31:02.904 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:03 np0005588920 nova_compute[226886]: 2026-01-20 14:31:03.406 226890 INFO nova.virt.libvirt.driver [None req-ac3c2529-c51c-4b50-801b-9569a7dee8e9 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Deleting instance files /var/lib/nova/instances/87fe16d6-774e-4002-8df4-9eb202621ab9_del#033[00m
Jan 20 09:31:03 np0005588920 nova_compute[226886]: 2026-01-20 14:31:03.407 226890 INFO nova.virt.libvirt.driver [None req-ac3c2529-c51c-4b50-801b-9569a7dee8e9 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Deletion of /var/lib/nova/instances/87fe16d6-774e-4002-8df4-9eb202621ab9_del complete#033[00m
Jan 20 09:31:03 np0005588920 nova_compute[226886]: 2026-01-20 14:31:03.640 226890 INFO nova.compute.manager [None req-ac3c2529-c51c-4b50-801b-9569a7dee8e9 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Took 2.27 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:31:03 np0005588920 nova_compute[226886]: 2026-01-20 14:31:03.641 226890 DEBUG oslo.service.loopingcall [None req-ac3c2529-c51c-4b50-801b-9569a7dee8e9 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:31:03 np0005588920 nova_compute[226886]: 2026-01-20 14:31:03.641 226890 DEBUG nova.compute.manager [-] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:31:03 np0005588920 nova_compute[226886]: 2026-01-20 14:31:03.641 226890 DEBUG nova.network.neutron [-] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:31:04 np0005588920 nova_compute[226886]: 2026-01-20 14:31:04.144 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:31:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:04.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:31:04 np0005588920 nova_compute[226886]: 2026-01-20 14:31:04.478 226890 DEBUG nova.network.neutron [-] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:31:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:04.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:04 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:31:04 np0005588920 nova_compute[226886]: 2026-01-20 14:31:04.791 226890 DEBUG nova.network.neutron [-] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:31:04 np0005588920 nova_compute[226886]: 2026-01-20 14:31:04.896 226890 INFO nova.compute.manager [-] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Took 1.25 seconds to deallocate network for instance.#033[00m
Jan 20 09:31:04 np0005588920 nova_compute[226886]: 2026-01-20 14:31:04.956 226890 DEBUG oslo_concurrency.lockutils [None req-ac3c2529-c51c-4b50-801b-9569a7dee8e9 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:31:04 np0005588920 nova_compute[226886]: 2026-01-20 14:31:04.956 226890 DEBUG oslo_concurrency.lockutils [None req-ac3c2529-c51c-4b50-801b-9569a7dee8e9 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:31:05 np0005588920 nova_compute[226886]: 2026-01-20 14:31:05.029 226890 DEBUG oslo_concurrency.processutils [None req-ac3c2529-c51c-4b50-801b-9569a7dee8e9 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:31:05 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:31:05 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3548925211' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:31:05 np0005588920 nova_compute[226886]: 2026-01-20 14:31:05.603 226890 DEBUG oslo_concurrency.processutils [None req-ac3c2529-c51c-4b50-801b-9569a7dee8e9 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.574s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:31:05 np0005588920 nova_compute[226886]: 2026-01-20 14:31:05.613 226890 DEBUG nova.compute.provider_tree [None req-ac3c2529-c51c-4b50-801b-9569a7dee8e9 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:31:05 np0005588920 nova_compute[226886]: 2026-01-20 14:31:05.707 226890 DEBUG nova.scheduler.client.report [None req-ac3c2529-c51c-4b50-801b-9569a7dee8e9 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:31:05 np0005588920 nova_compute[226886]: 2026-01-20 14:31:05.772 226890 DEBUG oslo_concurrency.lockutils [None req-ac3c2529-c51c-4b50-801b-9569a7dee8e9 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.816s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:31:05 np0005588920 nova_compute[226886]: 2026-01-20 14:31:05.813 226890 INFO nova.scheduler.client.report [None req-ac3c2529-c51c-4b50-801b-9569a7dee8e9 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Deleted allocations for instance 87fe16d6-774e-4002-8df4-9eb202621ab9#033[00m
Jan 20 09:31:05 np0005588920 nova_compute[226886]: 2026-01-20 14:31:05.927 226890 DEBUG oslo_concurrency.lockutils [None req-ac3c2529-c51c-4b50-801b-9569a7dee8e9 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "87fe16d6-774e-4002-8df4-9eb202621ab9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.785s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:31:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:06.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:06.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:07 np0005588920 nova_compute[226886]: 2026-01-20 14:31:07.878 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768919452.8778899, ad1be106-796f-45ef-8eb7-afa4c072b371 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:31:07 np0005588920 nova_compute[226886]: 2026-01-20 14:31:07.879 226890 INFO nova.compute.manager [-] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:31:07 np0005588920 nova_compute[226886]: 2026-01-20 14:31:07.905 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:07 np0005588920 nova_compute[226886]: 2026-01-20 14:31:07.978 226890 DEBUG nova.compute.manager [None req-20858f35-607f-4e02-82d3-a3265baef6d7 - - - - - -] [instance: ad1be106-796f-45ef-8eb7-afa4c072b371] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:31:08 np0005588920 podman[239068]: 2026-01-20 14:31:08.050137643 +0000 UTC m=+0.137328222 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Jan 20 09:31:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:08.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:08.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:09 np0005588920 nova_compute[226886]: 2026-01-20 14:31:09.147 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:31:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:10.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:31:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:10.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:31:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:12.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:12.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:12 np0005588920 nova_compute[226886]: 2026-01-20 14:31:12.930 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 09:31:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/86488182' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 09:31:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 09:31:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/86488182' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 09:31:14 np0005588920 nova_compute[226886]: 2026-01-20 14:31:14.177 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:14.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:31:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:14.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:31:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:31:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:31:16.433 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:31:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:31:16.433 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:31:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:31:16.434 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:31:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 09:31:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:16.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 09:31:16 np0005588920 nova_compute[226886]: 2026-01-20 14:31:16.592 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768919461.590599, 87fe16d6-774e-4002-8df4-9eb202621ab9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:31:16 np0005588920 nova_compute[226886]: 2026-01-20 14:31:16.592 226890 INFO nova.compute.manager [-] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:31:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:16.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:17 np0005588920 nova_compute[226886]: 2026-01-20 14:31:17.080 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:31:17 np0005588920 nova_compute[226886]: 2026-01-20 14:31:17.080 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:31:17 np0005588920 nova_compute[226886]: 2026-01-20 14:31:17.932 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:18 np0005588920 nova_compute[226886]: 2026-01-20 14:31:18.341 226890 DEBUG nova.compute.manager [None req-bc4906af-d66a-4383-879b-215db5b839f7 - - - - - -] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:31:18 np0005588920 nova_compute[226886]: 2026-01-20 14:31:18.372 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 09:31:18 np0005588920 nova_compute[226886]: 2026-01-20 14:31:18.373 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:31:18 np0005588920 nova_compute[226886]: 2026-01-20 14:31:18.374 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:31:18 np0005588920 nova_compute[226886]: 2026-01-20 14:31:18.374 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:31:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:18.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:31:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:18.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:31:19 np0005588920 nova_compute[226886]: 2026-01-20 14:31:19.179 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:19 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:31:19 np0005588920 nova_compute[226886]: 2026-01-20 14:31:19.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:31:19 np0005588920 nova_compute[226886]: 2026-01-20 14:31:19.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:31:19 np0005588920 nova_compute[226886]: 2026-01-20 14:31:19.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:31:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:20.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:20.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:20 np0005588920 podman[239119]: 2026-01-20 14:31:20.964396742 +0000 UTC m=+0.073831257 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:31:21 np0005588920 podman[239285]: 2026-01-20 14:31:21.506294039 +0000 UTC m=+0.073715394 container exec 6d4eaf8659f13902200468a4ae46f22a0824452b9353ff1491898f5d3b0130c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mon-compute-2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=reef, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 20 09:31:21 np0005588920 nova_compute[226886]: 2026-01-20 14:31:21.595 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:31:21.597 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:31:21 np0005588920 podman[239285]: 2026-01-20 14:31:21.599282255 +0000 UTC m=+0.166703580 container exec_died 6d4eaf8659f13902200468a4ae46f22a0824452b9353ff1491898f5d3b0130c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mon-compute-2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:31:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:31:21.598 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:31:21 np0005588920 nova_compute[226886]: 2026-01-20 14:31:21.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:31:22 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:31:22 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:31:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:22.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:22.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:22 np0005588920 nova_compute[226886]: 2026-01-20 14:31:22.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:31:22 np0005588920 nova_compute[226886]: 2026-01-20 14:31:22.934 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:23 np0005588920 nova_compute[226886]: 2026-01-20 14:31:23.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:31:23 np0005588920 nova_compute[226886]: 2026-01-20 14:31:23.765 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:31:23 np0005588920 nova_compute[226886]: 2026-01-20 14:31:23.765 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:31:23 np0005588920 nova_compute[226886]: 2026-01-20 14:31:23.765 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:31:23 np0005588920 nova_compute[226886]: 2026-01-20 14:31:23.765 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:31:23 np0005588920 nova_compute[226886]: 2026-01-20 14:31:23.766 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:31:23 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:31:23 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:31:23 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:31:23 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:31:24 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:31:24 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4199824210' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:31:24 np0005588920 nova_compute[226886]: 2026-01-20 14:31:24.212 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:31:24 np0005588920 nova_compute[226886]: 2026-01-20 14:31:24.214 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:24 np0005588920 nova_compute[226886]: 2026-01-20 14:31:24.382 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:31:24 np0005588920 nova_compute[226886]: 2026-01-20 14:31:24.383 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4801MB free_disk=20.89706039428711GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:31:24 np0005588920 nova_compute[226886]: 2026-01-20 14:31:24.383 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:31:24 np0005588920 nova_compute[226886]: 2026-01-20 14:31:24.384 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:31:24 np0005588920 nova_compute[226886]: 2026-01-20 14:31:24.454 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:31:24 np0005588920 nova_compute[226886]: 2026-01-20 14:31:24.454 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:31:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:24.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:24 np0005588920 nova_compute[226886]: 2026-01-20 14:31:24.482 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:31:24 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:31:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:24.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:24 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:31:24 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/183400782' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:31:24 np0005588920 nova_compute[226886]: 2026-01-20 14:31:24.886 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.404s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:31:24 np0005588920 nova_compute[226886]: 2026-01-20 14:31:24.892 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:31:24 np0005588920 nova_compute[226886]: 2026-01-20 14:31:24.912 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:31:24 np0005588920 nova_compute[226886]: 2026-01-20 14:31:24.930 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:31:24 np0005588920 nova_compute[226886]: 2026-01-20 14:31:24.930 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.547s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:31:24 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:31:24 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:31:24 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:31:25 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:31:25 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1937716485' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:31:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:31:25.601 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:31:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:26.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:31:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:26.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:31:27 np0005588920 nova_compute[226886]: 2026-01-20 14:31:27.978 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:28.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:28.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:29 np0005588920 nova_compute[226886]: 2026-01-20 14:31:29.247 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:31:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:30.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:30.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:31 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:31:31 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:31:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:32.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:31:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:32.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:31:33 np0005588920 nova_compute[226886]: 2026-01-20 14:31:33.024 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:34 np0005588920 nova_compute[226886]: 2026-01-20 14:31:34.296 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:31:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:34.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:31:34 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:31:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:31:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:34.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:31:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:36.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:31:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:36.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:31:37 np0005588920 nova_compute[226886]: 2026-01-20 14:31:37.056 226890 DEBUG oslo_concurrency.lockutils [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Acquiring lock "ebc2b8c3-8d9f-4798-8865-dd256233f4fc" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:31:37 np0005588920 nova_compute[226886]: 2026-01-20 14:31:37.056 226890 DEBUG oslo_concurrency.lockutils [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Lock "ebc2b8c3-8d9f-4798-8865-dd256233f4fc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:31:37 np0005588920 nova_compute[226886]: 2026-01-20 14:31:37.091 226890 DEBUG nova.compute.manager [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:31:37 np0005588920 nova_compute[226886]: 2026-01-20 14:31:37.644 226890 DEBUG oslo_concurrency.lockutils [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:31:37 np0005588920 nova_compute[226886]: 2026-01-20 14:31:37.645 226890 DEBUG oslo_concurrency.lockutils [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:31:37 np0005588920 nova_compute[226886]: 2026-01-20 14:31:37.651 226890 DEBUG nova.virt.hardware [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:31:37 np0005588920 nova_compute[226886]: 2026-01-20 14:31:37.651 226890 INFO nova.compute.claims [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 09:31:38 np0005588920 nova_compute[226886]: 2026-01-20 14:31:38.062 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:31:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:38.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:31:38 np0005588920 nova_compute[226886]: 2026-01-20 14:31:38.518 226890 DEBUG oslo_concurrency.lockutils [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Acquiring lock "57c49518-7381-4e7d-975f-9c6afc3ea966" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:31:38 np0005588920 nova_compute[226886]: 2026-01-20 14:31:38.518 226890 DEBUG oslo_concurrency.lockutils [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Lock "57c49518-7381-4e7d-975f-9c6afc3ea966" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:31:38 np0005588920 nova_compute[226886]: 2026-01-20 14:31:38.563 226890 DEBUG nova.compute.manager [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:31:38 np0005588920 nova_compute[226886]: 2026-01-20 14:31:38.586 226890 DEBUG oslo_concurrency.processutils [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:31:38 np0005588920 nova_compute[226886]: 2026-01-20 14:31:38.657 226890 DEBUG oslo_concurrency.lockutils [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:31:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:38.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:38 np0005588920 podman[239655]: 2026-01-20 14:31:38.985340098 +0000 UTC m=+0.074824486 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Jan 20 09:31:38 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:31:38 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/246380524' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:31:39 np0005588920 nova_compute[226886]: 2026-01-20 14:31:39.013 226890 DEBUG oslo_concurrency.processutils [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:31:39 np0005588920 nova_compute[226886]: 2026-01-20 14:31:39.018 226890 DEBUG nova.compute.provider_tree [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:31:39 np0005588920 nova_compute[226886]: 2026-01-20 14:31:39.333 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:39 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:31:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 09:31:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:40.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 09:31:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:40.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:41 np0005588920 nova_compute[226886]: 2026-01-20 14:31:41.664 226890 DEBUG nova.scheduler.client.report [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:31:41 np0005588920 nova_compute[226886]: 2026-01-20 14:31:41.889 226890 DEBUG oslo_concurrency.lockutils [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 4.244s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:31:41 np0005588920 nova_compute[226886]: 2026-01-20 14:31:41.890 226890 DEBUG nova.compute.manager [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:31:41 np0005588920 nova_compute[226886]: 2026-01-20 14:31:41.892 226890 DEBUG oslo_concurrency.lockutils [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 3.235s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:31:41 np0005588920 nova_compute[226886]: 2026-01-20 14:31:41.908 226890 DEBUG nova.virt.hardware [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:31:41 np0005588920 nova_compute[226886]: 2026-01-20 14:31:41.909 226890 INFO nova.compute.claims [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 09:31:42 np0005588920 nova_compute[226886]: 2026-01-20 14:31:42.109 226890 DEBUG nova.compute.manager [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948#033[00m
Jan 20 09:31:42 np0005588920 nova_compute[226886]: 2026-01-20 14:31:42.167 226890 INFO nova.virt.libvirt.driver [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:31:42 np0005588920 nova_compute[226886]: 2026-01-20 14:31:42.212 226890 DEBUG nova.compute.manager [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:31:42 np0005588920 nova_compute[226886]: 2026-01-20 14:31:42.326 226890 DEBUG oslo_concurrency.processutils [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:31:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:42.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:42 np0005588920 nova_compute[226886]: 2026-01-20 14:31:42.632 226890 DEBUG nova.compute.manager [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:31:42 np0005588920 nova_compute[226886]: 2026-01-20 14:31:42.633 226890 DEBUG nova.virt.libvirt.driver [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:31:42 np0005588920 nova_compute[226886]: 2026-01-20 14:31:42.634 226890 INFO nova.virt.libvirt.driver [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Creating image(s)#033[00m
Jan 20 09:31:42 np0005588920 nova_compute[226886]: 2026-01-20 14:31:42.657 226890 DEBUG nova.storage.rbd_utils [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] rbd image ebc2b8c3-8d9f-4798-8865-dd256233f4fc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:31:42 np0005588920 nova_compute[226886]: 2026-01-20 14:31:42.687 226890 DEBUG nova.storage.rbd_utils [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] rbd image ebc2b8c3-8d9f-4798-8865-dd256233f4fc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:31:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:31:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:42.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:31:42 np0005588920 nova_compute[226886]: 2026-01-20 14:31:42.719 226890 DEBUG nova.storage.rbd_utils [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] rbd image ebc2b8c3-8d9f-4798-8865-dd256233f4fc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:31:42 np0005588920 nova_compute[226886]: 2026-01-20 14:31:42.722 226890 DEBUG oslo_concurrency.processutils [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:31:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:31:42 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3784413517' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:31:42 np0005588920 nova_compute[226886]: 2026-01-20 14:31:42.772 226890 DEBUG oslo_concurrency.processutils [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:31:42 np0005588920 nova_compute[226886]: 2026-01-20 14:31:42.778 226890 DEBUG nova.compute.provider_tree [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:31:42 np0005588920 nova_compute[226886]: 2026-01-20 14:31:42.805 226890 DEBUG oslo_concurrency.processutils [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:31:42 np0005588920 nova_compute[226886]: 2026-01-20 14:31:42.805 226890 DEBUG oslo_concurrency.lockutils [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:31:42 np0005588920 nova_compute[226886]: 2026-01-20 14:31:42.806 226890 DEBUG oslo_concurrency.lockutils [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:31:42 np0005588920 nova_compute[226886]: 2026-01-20 14:31:42.806 226890 DEBUG oslo_concurrency.lockutils [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:31:42 np0005588920 nova_compute[226886]: 2026-01-20 14:31:42.840 226890 DEBUG nova.storage.rbd_utils [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] rbd image ebc2b8c3-8d9f-4798-8865-dd256233f4fc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:31:42 np0005588920 nova_compute[226886]: 2026-01-20 14:31:42.843 226890 DEBUG oslo_concurrency.processutils [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 ebc2b8c3-8d9f-4798-8865-dd256233f4fc_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:31:43 np0005588920 nova_compute[226886]: 2026-01-20 14:31:43.065 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:43 np0005588920 nova_compute[226886]: 2026-01-20 14:31:43.170 226890 DEBUG oslo_concurrency.processutils [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 ebc2b8c3-8d9f-4798-8865-dd256233f4fc_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.328s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:31:43 np0005588920 nova_compute[226886]: 2026-01-20 14:31:43.246 226890 DEBUG nova.scheduler.client.report [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:31:43 np0005588920 nova_compute[226886]: 2026-01-20 14:31:43.256 226890 DEBUG nova.storage.rbd_utils [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] resizing rbd image ebc2b8c3-8d9f-4798-8865-dd256233f4fc_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:31:43 np0005588920 nova_compute[226886]: 2026-01-20 14:31:43.302 226890 DEBUG oslo_concurrency.lockutils [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.410s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:31:43 np0005588920 nova_compute[226886]: 2026-01-20 14:31:43.303 226890 DEBUG nova.compute.manager [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:31:43 np0005588920 nova_compute[226886]: 2026-01-20 14:31:43.379 226890 DEBUG nova.objects.instance [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Lazy-loading 'migration_context' on Instance uuid ebc2b8c3-8d9f-4798-8865-dd256233f4fc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:31:43 np0005588920 nova_compute[226886]: 2026-01-20 14:31:43.407 226890 DEBUG nova.compute.manager [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:31:43 np0005588920 nova_compute[226886]: 2026-01-20 14:31:43.407 226890 DEBUG nova.network.neutron [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:31:43 np0005588920 nova_compute[226886]: 2026-01-20 14:31:43.528 226890 DEBUG nova.virt.libvirt.driver [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:31:43 np0005588920 nova_compute[226886]: 2026-01-20 14:31:43.528 226890 DEBUG nova.virt.libvirt.driver [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Ensure instance console log exists: /var/lib/nova/instances/ebc2b8c3-8d9f-4798-8865-dd256233f4fc/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:31:43 np0005588920 nova_compute[226886]: 2026-01-20 14:31:43.529 226890 DEBUG oslo_concurrency.lockutils [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:31:43 np0005588920 nova_compute[226886]: 2026-01-20 14:31:43.529 226890 DEBUG oslo_concurrency.lockutils [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:31:43 np0005588920 nova_compute[226886]: 2026-01-20 14:31:43.529 226890 DEBUG oslo_concurrency.lockutils [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:31:43 np0005588920 nova_compute[226886]: 2026-01-20 14:31:43.530 226890 DEBUG nova.virt.libvirt.driver [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:31:43 np0005588920 nova_compute[226886]: 2026-01-20 14:31:43.535 226890 WARNING nova.virt.libvirt.driver [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:31:43 np0005588920 nova_compute[226886]: 2026-01-20 14:31:43.538 226890 DEBUG nova.virt.libvirt.host [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:31:43 np0005588920 nova_compute[226886]: 2026-01-20 14:31:43.539 226890 DEBUG nova.virt.libvirt.host [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:31:43 np0005588920 nova_compute[226886]: 2026-01-20 14:31:43.542 226890 DEBUG nova.virt.libvirt.host [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:31:43 np0005588920 nova_compute[226886]: 2026-01-20 14:31:43.542 226890 DEBUG nova.virt.libvirt.host [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:31:43 np0005588920 nova_compute[226886]: 2026-01-20 14:31:43.543 226890 DEBUG nova.virt.libvirt.driver [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:31:43 np0005588920 nova_compute[226886]: 2026-01-20 14:31:43.544 226890 DEBUG nova.virt.hardware [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:31:43 np0005588920 nova_compute[226886]: 2026-01-20 14:31:43.544 226890 DEBUG nova.virt.hardware [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:31:43 np0005588920 nova_compute[226886]: 2026-01-20 14:31:43.544 226890 DEBUG nova.virt.hardware [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:31:43 np0005588920 nova_compute[226886]: 2026-01-20 14:31:43.544 226890 DEBUG nova.virt.hardware [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:31:43 np0005588920 nova_compute[226886]: 2026-01-20 14:31:43.544 226890 DEBUG nova.virt.hardware [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:31:43 np0005588920 nova_compute[226886]: 2026-01-20 14:31:43.545 226890 DEBUG nova.virt.hardware [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:31:43 np0005588920 nova_compute[226886]: 2026-01-20 14:31:43.545 226890 DEBUG nova.virt.hardware [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:31:43 np0005588920 nova_compute[226886]: 2026-01-20 14:31:43.545 226890 DEBUG nova.virt.hardware [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:31:43 np0005588920 nova_compute[226886]: 2026-01-20 14:31:43.545 226890 DEBUG nova.virt.hardware [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:31:43 np0005588920 nova_compute[226886]: 2026-01-20 14:31:43.545 226890 DEBUG nova.virt.hardware [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:31:43 np0005588920 nova_compute[226886]: 2026-01-20 14:31:43.546 226890 DEBUG nova.virt.hardware [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:31:43 np0005588920 nova_compute[226886]: 2026-01-20 14:31:43.548 226890 DEBUG oslo_concurrency.processutils [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:31:43 np0005588920 nova_compute[226886]: 2026-01-20 14:31:43.576 226890 INFO nova.virt.libvirt.driver [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:31:43 np0005588920 nova_compute[226886]: 2026-01-20 14:31:43.620 226890 DEBUG nova.compute.manager [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:31:43 np0005588920 nova_compute[226886]: 2026-01-20 14:31:43.770 226890 DEBUG nova.compute.manager [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:31:43 np0005588920 nova_compute[226886]: 2026-01-20 14:31:43.771 226890 DEBUG nova.virt.libvirt.driver [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:31:43 np0005588920 nova_compute[226886]: 2026-01-20 14:31:43.772 226890 INFO nova.virt.libvirt.driver [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Creating image(s)#033[00m
Jan 20 09:31:43 np0005588920 nova_compute[226886]: 2026-01-20 14:31:43.798 226890 DEBUG nova.storage.rbd_utils [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] rbd image 57c49518-7381-4e7d-975f-9c6afc3ea966_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:31:43 np0005588920 nova_compute[226886]: 2026-01-20 14:31:43.827 226890 DEBUG nova.storage.rbd_utils [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] rbd image 57c49518-7381-4e7d-975f-9c6afc3ea966_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:31:43 np0005588920 nova_compute[226886]: 2026-01-20 14:31:43.854 226890 DEBUG nova.storage.rbd_utils [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] rbd image 57c49518-7381-4e7d-975f-9c6afc3ea966_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:31:43 np0005588920 nova_compute[226886]: 2026-01-20 14:31:43.857 226890 DEBUG oslo_concurrency.processutils [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:31:43 np0005588920 nova_compute[226886]: 2026-01-20 14:31:43.919 226890 DEBUG oslo_concurrency.processutils [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:31:43 np0005588920 nova_compute[226886]: 2026-01-20 14:31:43.920 226890 DEBUG oslo_concurrency.lockutils [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:31:43 np0005588920 nova_compute[226886]: 2026-01-20 14:31:43.920 226890 DEBUG oslo_concurrency.lockutils [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:31:43 np0005588920 nova_compute[226886]: 2026-01-20 14:31:43.920 226890 DEBUG oslo_concurrency.lockutils [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:31:43 np0005588920 nova_compute[226886]: 2026-01-20 14:31:43.949 226890 DEBUG nova.storage.rbd_utils [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] rbd image 57c49518-7381-4e7d-975f-9c6afc3ea966_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:31:43 np0005588920 nova_compute[226886]: 2026-01-20 14:31:43.953 226890 DEBUG oslo_concurrency.processutils [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 57c49518-7381-4e7d-975f-9c6afc3ea966_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:31:43 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:31:43 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2511464146' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:31:44 np0005588920 nova_compute[226886]: 2026-01-20 14:31:44.002 226890 DEBUG oslo_concurrency.processutils [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:31:44 np0005588920 nova_compute[226886]: 2026-01-20 14:31:44.037 226890 DEBUG nova.storage.rbd_utils [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] rbd image ebc2b8c3-8d9f-4798-8865-dd256233f4fc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:31:44 np0005588920 nova_compute[226886]: 2026-01-20 14:31:44.041 226890 DEBUG oslo_concurrency.processutils [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:31:44 np0005588920 nova_compute[226886]: 2026-01-20 14:31:44.156 226890 DEBUG nova.policy [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dd0483da65fa4225846c0cc91e8e0275', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ac0564cb541e4c679c6d282fd454c05d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:31:44 np0005588920 nova_compute[226886]: 2026-01-20 14:31:44.197 226890 DEBUG oslo_concurrency.processutils [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 57c49518-7381-4e7d-975f-9c6afc3ea966_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.244s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:31:44 np0005588920 nova_compute[226886]: 2026-01-20 14:31:44.279 226890 DEBUG nova.storage.rbd_utils [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] resizing rbd image 57c49518-7381-4e7d-975f-9c6afc3ea966_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:31:44 np0005588920 nova_compute[226886]: 2026-01-20 14:31:44.335 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:44 np0005588920 nova_compute[226886]: 2026-01-20 14:31:44.386 226890 DEBUG nova.objects.instance [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Lazy-loading 'migration_context' on Instance uuid 57c49518-7381-4e7d-975f-9c6afc3ea966 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:31:44 np0005588920 nova_compute[226886]: 2026-01-20 14:31:44.415 226890 DEBUG nova.virt.libvirt.driver [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:31:44 np0005588920 nova_compute[226886]: 2026-01-20 14:31:44.415 226890 DEBUG nova.virt.libvirt.driver [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Ensure instance console log exists: /var/lib/nova/instances/57c49518-7381-4e7d-975f-9c6afc3ea966/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:31:44 np0005588920 nova_compute[226886]: 2026-01-20 14:31:44.416 226890 DEBUG oslo_concurrency.lockutils [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:31:44 np0005588920 nova_compute[226886]: 2026-01-20 14:31:44.416 226890 DEBUG oslo_concurrency.lockutils [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:31:44 np0005588920 nova_compute[226886]: 2026-01-20 14:31:44.416 226890 DEBUG oslo_concurrency.lockutils [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:31:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:44.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:31:44 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1240311285' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:31:44 np0005588920 nova_compute[226886]: 2026-01-20 14:31:44.538 226890 DEBUG oslo_concurrency.processutils [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.497s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:31:44 np0005588920 nova_compute[226886]: 2026-01-20 14:31:44.539 226890 DEBUG nova.objects.instance [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Lazy-loading 'pci_devices' on Instance uuid ebc2b8c3-8d9f-4798-8865-dd256233f4fc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:31:44 np0005588920 nova_compute[226886]: 2026-01-20 14:31:44.559 226890 DEBUG nova.virt.libvirt.driver [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:31:44 np0005588920 nova_compute[226886]:  <uuid>ebc2b8c3-8d9f-4798-8865-dd256233f4fc</uuid>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:  <name>instance-00000022</name>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:31:44 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:      <nova:name>tempest-ServersAdmin275Test-server-625178373</nova:name>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:31:43</nova:creationTime>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:31:44 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:        <nova:user uuid="3d51de2ad98d40d8ad12305518d106fd">tempest-ServersAdmin275Test-1570927802-project-member</nova:user>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:        <nova:project uuid="389fafaa99e14f31988005de907401bf">tempest-ServersAdmin275Test-1570927802</nova:project>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:      <nova:ports/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:      <entry name="serial">ebc2b8c3-8d9f-4798-8865-dd256233f4fc</entry>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:      <entry name="uuid">ebc2b8c3-8d9f-4798-8865-dd256233f4fc</entry>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:31:44 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/ebc2b8c3-8d9f-4798-8865-dd256233f4fc_disk">
Jan 20 09:31:44 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:31:44 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:31:44 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/ebc2b8c3-8d9f-4798-8865-dd256233f4fc_disk.config">
Jan 20 09:31:44 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:31:44 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:31:44 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/ebc2b8c3-8d9f-4798-8865-dd256233f4fc/console.log" append="off"/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:31:44 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:31:44 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:31:44 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:31:44 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:31:44 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:31:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:31:44 np0005588920 nova_compute[226886]: 2026-01-20 14:31:44.672 226890 DEBUG nova.virt.libvirt.driver [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:31:44 np0005588920 nova_compute[226886]: 2026-01-20 14:31:44.672 226890 DEBUG nova.virt.libvirt.driver [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:31:44 np0005588920 nova_compute[226886]: 2026-01-20 14:31:44.672 226890 INFO nova.virt.libvirt.driver [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Using config drive#033[00m
Jan 20 09:31:44 np0005588920 nova_compute[226886]: 2026-01-20 14:31:44.690 226890 DEBUG nova.storage.rbd_utils [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] rbd image ebc2b8c3-8d9f-4798-8865-dd256233f4fc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:31:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:44.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:45 np0005588920 nova_compute[226886]: 2026-01-20 14:31:45.406 226890 INFO nova.virt.libvirt.driver [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Creating config drive at /var/lib/nova/instances/ebc2b8c3-8d9f-4798-8865-dd256233f4fc/disk.config#033[00m
Jan 20 09:31:45 np0005588920 nova_compute[226886]: 2026-01-20 14:31:45.412 226890 DEBUG oslo_concurrency.processutils [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ebc2b8c3-8d9f-4798-8865-dd256233f4fc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_wh4dgbo execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:31:45 np0005588920 nova_compute[226886]: 2026-01-20 14:31:45.544 226890 DEBUG oslo_concurrency.processutils [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ebc2b8c3-8d9f-4798-8865-dd256233f4fc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_wh4dgbo" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:31:45 np0005588920 nova_compute[226886]: 2026-01-20 14:31:45.568 226890 DEBUG nova.storage.rbd_utils [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] rbd image ebc2b8c3-8d9f-4798-8865-dd256233f4fc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:31:45 np0005588920 nova_compute[226886]: 2026-01-20 14:31:45.572 226890 DEBUG oslo_concurrency.processutils [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/ebc2b8c3-8d9f-4798-8865-dd256233f4fc/disk.config ebc2b8c3-8d9f-4798-8865-dd256233f4fc_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:31:45 np0005588920 nova_compute[226886]: 2026-01-20 14:31:45.717 226890 DEBUG oslo_concurrency.processutils [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/ebc2b8c3-8d9f-4798-8865-dd256233f4fc/disk.config ebc2b8c3-8d9f-4798-8865-dd256233f4fc_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.145s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:31:45 np0005588920 nova_compute[226886]: 2026-01-20 14:31:45.718 226890 INFO nova.virt.libvirt.driver [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Deleting local config drive /var/lib/nova/instances/ebc2b8c3-8d9f-4798-8865-dd256233f4fc/disk.config because it was imported into RBD.#033[00m
Jan 20 09:31:45 np0005588920 systemd-machined[196121]: New machine qemu-14-instance-00000022.
Jan 20 09:31:45 np0005588920 systemd[1]: Started Virtual Machine qemu-14-instance-00000022.
Jan 20 09:31:45 np0005588920 nova_compute[226886]: 2026-01-20 14:31:45.921 226890 DEBUG nova.network.neutron [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Successfully created port: 7554fcf8-fae8-4efb-aa74-e25896763129 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:31:46 np0005588920 nova_compute[226886]: 2026-01-20 14:31:46.444 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919506.4442546, ebc2b8c3-8d9f-4798-8865-dd256233f4fc => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:31:46 np0005588920 nova_compute[226886]: 2026-01-20 14:31:46.445 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:31:46 np0005588920 nova_compute[226886]: 2026-01-20 14:31:46.447 226890 DEBUG nova.compute.manager [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:31:46 np0005588920 nova_compute[226886]: 2026-01-20 14:31:46.448 226890 DEBUG nova.virt.libvirt.driver [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:31:46 np0005588920 nova_compute[226886]: 2026-01-20 14:31:46.451 226890 INFO nova.virt.libvirt.driver [-] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Instance spawned successfully.#033[00m
Jan 20 09:31:46 np0005588920 nova_compute[226886]: 2026-01-20 14:31:46.452 226890 DEBUG nova.virt.libvirt.driver [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:31:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:46.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:46 np0005588920 nova_compute[226886]: 2026-01-20 14:31:46.514 226890 DEBUG nova.virt.libvirt.driver [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:31:46 np0005588920 nova_compute[226886]: 2026-01-20 14:31:46.515 226890 DEBUG nova.virt.libvirt.driver [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:31:46 np0005588920 nova_compute[226886]: 2026-01-20 14:31:46.515 226890 DEBUG nova.virt.libvirt.driver [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:31:46 np0005588920 nova_compute[226886]: 2026-01-20 14:31:46.515 226890 DEBUG nova.virt.libvirt.driver [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:31:46 np0005588920 nova_compute[226886]: 2026-01-20 14:31:46.516 226890 DEBUG nova.virt.libvirt.driver [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:31:46 np0005588920 nova_compute[226886]: 2026-01-20 14:31:46.516 226890 DEBUG nova.virt.libvirt.driver [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:31:46 np0005588920 nova_compute[226886]: 2026-01-20 14:31:46.578 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:31:46 np0005588920 nova_compute[226886]: 2026-01-20 14:31:46.581 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:31:46 np0005588920 nova_compute[226886]: 2026-01-20 14:31:46.613 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:31:46 np0005588920 nova_compute[226886]: 2026-01-20 14:31:46.614 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919506.446883, ebc2b8c3-8d9f-4798-8865-dd256233f4fc => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:31:46 np0005588920 nova_compute[226886]: 2026-01-20 14:31:46.614 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] VM Started (Lifecycle Event)#033[00m
Jan 20 09:31:46 np0005588920 nova_compute[226886]: 2026-01-20 14:31:46.627 226890 INFO nova.compute.manager [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Took 3.99 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:31:46 np0005588920 nova_compute[226886]: 2026-01-20 14:31:46.627 226890 DEBUG nova.compute.manager [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:31:46 np0005588920 nova_compute[226886]: 2026-01-20 14:31:46.635 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:31:46 np0005588920 nova_compute[226886]: 2026-01-20 14:31:46.637 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:31:46 np0005588920 nova_compute[226886]: 2026-01-20 14:31:46.665 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:31:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:46.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:46 np0005588920 nova_compute[226886]: 2026-01-20 14:31:46.707 226890 INFO nova.compute.manager [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Took 9.09 seconds to build instance.#033[00m
Jan 20 09:31:46 np0005588920 nova_compute[226886]: 2026-01-20 14:31:46.728 226890 DEBUG oslo_concurrency.lockutils [None req-0013e947-5884-4775-8575-75f2a5129eac 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Lock "ebc2b8c3-8d9f-4798-8865-dd256233f4fc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.672s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:31:48 np0005588920 nova_compute[226886]: 2026-01-20 14:31:48.067 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:48.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:48.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:48 np0005588920 nova_compute[226886]: 2026-01-20 14:31:48.782 226890 DEBUG nova.network.neutron [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Successfully updated port: 7554fcf8-fae8-4efb-aa74-e25896763129 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:31:48 np0005588920 nova_compute[226886]: 2026-01-20 14:31:48.801 226890 DEBUG oslo_concurrency.lockutils [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Acquiring lock "refresh_cache-57c49518-7381-4e7d-975f-9c6afc3ea966" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:31:48 np0005588920 nova_compute[226886]: 2026-01-20 14:31:48.802 226890 DEBUG oslo_concurrency.lockutils [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Acquired lock "refresh_cache-57c49518-7381-4e7d-975f-9c6afc3ea966" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:31:48 np0005588920 nova_compute[226886]: 2026-01-20 14:31:48.802 226890 DEBUG nova.network.neutron [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:31:48 np0005588920 nova_compute[226886]: 2026-01-20 14:31:48.927 226890 DEBUG nova.compute.manager [req-14bc26e2-d889-4344-bde4-29d9f0f561b4 req-112cfe3a-e230-4dad-99b2-0cfb08378cd6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Received event network-changed-7554fcf8-fae8-4efb-aa74-e25896763129 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:31:48 np0005588920 nova_compute[226886]: 2026-01-20 14:31:48.927 226890 DEBUG nova.compute.manager [req-14bc26e2-d889-4344-bde4-29d9f0f561b4 req-112cfe3a-e230-4dad-99b2-0cfb08378cd6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Refreshing instance network info cache due to event network-changed-7554fcf8-fae8-4efb-aa74-e25896763129. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:31:48 np0005588920 nova_compute[226886]: 2026-01-20 14:31:48.927 226890 DEBUG oslo_concurrency.lockutils [req-14bc26e2-d889-4344-bde4-29d9f0f561b4 req-112cfe3a-e230-4dad-99b2-0cfb08378cd6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-57c49518-7381-4e7d-975f-9c6afc3ea966" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:31:49 np0005588920 nova_compute[226886]: 2026-01-20 14:31:49.175 226890 DEBUG nova.network.neutron [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:31:49 np0005588920 nova_compute[226886]: 2026-01-20 14:31:49.335 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:49 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:31:50 np0005588920 nova_compute[226886]: 2026-01-20 14:31:50.021 226890 INFO nova.compute.manager [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Rebuilding instance#033[00m
Jan 20 09:31:50 np0005588920 nova_compute[226886]: 2026-01-20 14:31:50.296 226890 DEBUG nova.objects.instance [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Lazy-loading 'trusted_certs' on Instance uuid ebc2b8c3-8d9f-4798-8865-dd256233f4fc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:31:50 np0005588920 nova_compute[226886]: 2026-01-20 14:31:50.325 226890 DEBUG nova.compute.manager [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:31:50 np0005588920 nova_compute[226886]: 2026-01-20 14:31:50.396 226890 DEBUG nova.objects.instance [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Lazy-loading 'pci_requests' on Instance uuid ebc2b8c3-8d9f-4798-8865-dd256233f4fc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:31:50 np0005588920 nova_compute[226886]: 2026-01-20 14:31:50.410 226890 DEBUG nova.objects.instance [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Lazy-loading 'pci_devices' on Instance uuid ebc2b8c3-8d9f-4798-8865-dd256233f4fc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:31:50 np0005588920 nova_compute[226886]: 2026-01-20 14:31:50.432 226890 DEBUG nova.objects.instance [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Lazy-loading 'resources' on Instance uuid ebc2b8c3-8d9f-4798-8865-dd256233f4fc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:31:50 np0005588920 nova_compute[226886]: 2026-01-20 14:31:50.453 226890 DEBUG nova.objects.instance [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Lazy-loading 'migration_context' on Instance uuid ebc2b8c3-8d9f-4798-8865-dd256233f4fc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:31:50 np0005588920 nova_compute[226886]: 2026-01-20 14:31:50.476 226890 DEBUG nova.objects.instance [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 20 09:31:50 np0005588920 nova_compute[226886]: 2026-01-20 14:31:50.480 226890 DEBUG nova.virt.libvirt.driver [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 20 09:31:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:50.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:50.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:51 np0005588920 podman[240215]: 2026-01-20 14:31:51.980157669 +0000 UTC m=+0.065887775 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 20 09:31:52 np0005588920 nova_compute[226886]: 2026-01-20 14:31:52.216 226890 DEBUG nova.network.neutron [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Updating instance_info_cache with network_info: [{"id": "7554fcf8-fae8-4efb-aa74-e25896763129", "address": "fa:16:3e:1d:31:c5", "network": {"id": "42267411-7489-4be1-bea9-a5a8c37215df", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1261387871-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac0564cb541e4c679c6d282fd454c05d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7554fcf8-fa", "ovs_interfaceid": "7554fcf8-fae8-4efb-aa74-e25896763129", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:31:52 np0005588920 nova_compute[226886]: 2026-01-20 14:31:52.269 226890 DEBUG oslo_concurrency.lockutils [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Releasing lock "refresh_cache-57c49518-7381-4e7d-975f-9c6afc3ea966" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:31:52 np0005588920 nova_compute[226886]: 2026-01-20 14:31:52.269 226890 DEBUG nova.compute.manager [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Instance network_info: |[{"id": "7554fcf8-fae8-4efb-aa74-e25896763129", "address": "fa:16:3e:1d:31:c5", "network": {"id": "42267411-7489-4be1-bea9-a5a8c37215df", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1261387871-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac0564cb541e4c679c6d282fd454c05d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7554fcf8-fa", "ovs_interfaceid": "7554fcf8-fae8-4efb-aa74-e25896763129", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:31:52 np0005588920 nova_compute[226886]: 2026-01-20 14:31:52.270 226890 DEBUG oslo_concurrency.lockutils [req-14bc26e2-d889-4344-bde4-29d9f0f561b4 req-112cfe3a-e230-4dad-99b2-0cfb08378cd6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-57c49518-7381-4e7d-975f-9c6afc3ea966" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:31:52 np0005588920 nova_compute[226886]: 2026-01-20 14:31:52.270 226890 DEBUG nova.network.neutron [req-14bc26e2-d889-4344-bde4-29d9f0f561b4 req-112cfe3a-e230-4dad-99b2-0cfb08378cd6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Refreshing network info cache for port 7554fcf8-fae8-4efb-aa74-e25896763129 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:31:52 np0005588920 nova_compute[226886]: 2026-01-20 14:31:52.273 226890 DEBUG nova.virt.libvirt.driver [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Start _get_guest_xml network_info=[{"id": "7554fcf8-fae8-4efb-aa74-e25896763129", "address": "fa:16:3e:1d:31:c5", "network": {"id": "42267411-7489-4be1-bea9-a5a8c37215df", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1261387871-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac0564cb541e4c679c6d282fd454c05d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7554fcf8-fa", "ovs_interfaceid": "7554fcf8-fae8-4efb-aa74-e25896763129", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:31:52 np0005588920 nova_compute[226886]: 2026-01-20 14:31:52.277 226890 WARNING nova.virt.libvirt.driver [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:31:52 np0005588920 nova_compute[226886]: 2026-01-20 14:31:52.283 226890 DEBUG nova.virt.libvirt.host [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:31:52 np0005588920 nova_compute[226886]: 2026-01-20 14:31:52.284 226890 DEBUG nova.virt.libvirt.host [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:31:52 np0005588920 nova_compute[226886]: 2026-01-20 14:31:52.287 226890 DEBUG nova.virt.libvirt.host [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:31:52 np0005588920 nova_compute[226886]: 2026-01-20 14:31:52.287 226890 DEBUG nova.virt.libvirt.host [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:31:52 np0005588920 nova_compute[226886]: 2026-01-20 14:31:52.288 226890 DEBUG nova.virt.libvirt.driver [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:31:52 np0005588920 nova_compute[226886]: 2026-01-20 14:31:52.288 226890 DEBUG nova.virt.hardware [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:31:52 np0005588920 nova_compute[226886]: 2026-01-20 14:31:52.289 226890 DEBUG nova.virt.hardware [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:31:52 np0005588920 nova_compute[226886]: 2026-01-20 14:31:52.289 226890 DEBUG nova.virt.hardware [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:31:52 np0005588920 nova_compute[226886]: 2026-01-20 14:31:52.289 226890 DEBUG nova.virt.hardware [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:31:52 np0005588920 nova_compute[226886]: 2026-01-20 14:31:52.289 226890 DEBUG nova.virt.hardware [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:31:52 np0005588920 nova_compute[226886]: 2026-01-20 14:31:52.289 226890 DEBUG nova.virt.hardware [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:31:52 np0005588920 nova_compute[226886]: 2026-01-20 14:31:52.290 226890 DEBUG nova.virt.hardware [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:31:52 np0005588920 nova_compute[226886]: 2026-01-20 14:31:52.290 226890 DEBUG nova.virt.hardware [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:31:52 np0005588920 nova_compute[226886]: 2026-01-20 14:31:52.290 226890 DEBUG nova.virt.hardware [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:31:52 np0005588920 nova_compute[226886]: 2026-01-20 14:31:52.290 226890 DEBUG nova.virt.hardware [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:31:52 np0005588920 nova_compute[226886]: 2026-01-20 14:31:52.291 226890 DEBUG nova.virt.hardware [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:31:52 np0005588920 nova_compute[226886]: 2026-01-20 14:31:52.293 226890 DEBUG oslo_concurrency.processutils [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:31:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:52.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 09:31:52 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4014400721' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 09:31:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 09:31:52 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4014400721' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 09:31:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:31:52 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3938435753' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:31:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:52.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:52 np0005588920 nova_compute[226886]: 2026-01-20 14:31:52.720 226890 DEBUG oslo_concurrency.processutils [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:31:52 np0005588920 nova_compute[226886]: 2026-01-20 14:31:52.743 226890 DEBUG nova.storage.rbd_utils [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] rbd image 57c49518-7381-4e7d-975f-9c6afc3ea966_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:31:52 np0005588920 nova_compute[226886]: 2026-01-20 14:31:52.747 226890 DEBUG oslo_concurrency.processutils [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:31:53 np0005588920 nova_compute[226886]: 2026-01-20 14:31:53.069 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:53 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:31:53 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/715650878' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:31:53 np0005588920 nova_compute[226886]: 2026-01-20 14:31:53.147 226890 DEBUG oslo_concurrency.processutils [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.400s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:31:53 np0005588920 nova_compute[226886]: 2026-01-20 14:31:53.149 226890 DEBUG nova.virt.libvirt.vif [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:31:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationNegativeTestJSON-server-617899621',display_name='tempest-FloatingIPsAssociationNegativeTestJSON-server-617899621',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-floatingipsassociationnegativetestjson-server-617899621',id=35,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ac0564cb541e4c679c6d282fd454c05d',ramdisk_id='',reservation_id='r-crhbpp8e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationNegative
TestJSON-1763144881',owner_user_name='tempest-FloatingIPsAssociationNegativeTestJSON-1763144881-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:31:43Z,user_data=None,user_id='dd0483da65fa4225846c0cc91e8e0275',uuid=57c49518-7381-4e7d-975f-9c6afc3ea966,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7554fcf8-fae8-4efb-aa74-e25896763129", "address": "fa:16:3e:1d:31:c5", "network": {"id": "42267411-7489-4be1-bea9-a5a8c37215df", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1261387871-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac0564cb541e4c679c6d282fd454c05d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7554fcf8-fa", "ovs_interfaceid": "7554fcf8-fae8-4efb-aa74-e25896763129", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:31:53 np0005588920 nova_compute[226886]: 2026-01-20 14:31:53.150 226890 DEBUG nova.network.os_vif_util [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Converting VIF {"id": "7554fcf8-fae8-4efb-aa74-e25896763129", "address": "fa:16:3e:1d:31:c5", "network": {"id": "42267411-7489-4be1-bea9-a5a8c37215df", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1261387871-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac0564cb541e4c679c6d282fd454c05d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7554fcf8-fa", "ovs_interfaceid": "7554fcf8-fae8-4efb-aa74-e25896763129", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:31:53 np0005588920 nova_compute[226886]: 2026-01-20 14:31:53.151 226890 DEBUG nova.network.os_vif_util [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1d:31:c5,bridge_name='br-int',has_traffic_filtering=True,id=7554fcf8-fae8-4efb-aa74-e25896763129,network=Network(42267411-7489-4be1-bea9-a5a8c37215df),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7554fcf8-fa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:31:53 np0005588920 nova_compute[226886]: 2026-01-20 14:31:53.153 226890 DEBUG nova.objects.instance [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Lazy-loading 'pci_devices' on Instance uuid 57c49518-7381-4e7d-975f-9c6afc3ea966 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:31:53 np0005588920 nova_compute[226886]: 2026-01-20 14:31:53.177 226890 DEBUG nova.virt.libvirt.driver [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:31:53 np0005588920 nova_compute[226886]:  <uuid>57c49518-7381-4e7d-975f-9c6afc3ea966</uuid>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:  <name>instance-00000023</name>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:31:53 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:      <nova:name>tempest-FloatingIPsAssociationNegativeTestJSON-server-617899621</nova:name>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:31:52</nova:creationTime>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:31:53 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:        <nova:user uuid="dd0483da65fa4225846c0cc91e8e0275">tempest-FloatingIPsAssociationNegativeTestJSON-1763144881-project-member</nova:user>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:        <nova:project uuid="ac0564cb541e4c679c6d282fd454c05d">tempest-FloatingIPsAssociationNegativeTestJSON-1763144881</nova:project>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:        <nova:port uuid="7554fcf8-fae8-4efb-aa74-e25896763129">
Jan 20 09:31:53 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:      <entry name="serial">57c49518-7381-4e7d-975f-9c6afc3ea966</entry>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:      <entry name="uuid">57c49518-7381-4e7d-975f-9c6afc3ea966</entry>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:31:53 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/57c49518-7381-4e7d-975f-9c6afc3ea966_disk">
Jan 20 09:31:53 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:31:53 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:31:53 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/57c49518-7381-4e7d-975f-9c6afc3ea966_disk.config">
Jan 20 09:31:53 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:31:53 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:31:53 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:1d:31:c5"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:      <target dev="tap7554fcf8-fa"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:31:53 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/57c49518-7381-4e7d-975f-9c6afc3ea966/console.log" append="off"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:31:53 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:31:53 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:31:53 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:31:53 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:31:53 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:31:53 np0005588920 nova_compute[226886]: 2026-01-20 14:31:53.180 226890 DEBUG nova.compute.manager [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Preparing to wait for external event network-vif-plugged-7554fcf8-fae8-4efb-aa74-e25896763129 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:31:53 np0005588920 nova_compute[226886]: 2026-01-20 14:31:53.180 226890 DEBUG oslo_concurrency.lockutils [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Acquiring lock "57c49518-7381-4e7d-975f-9c6afc3ea966-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:31:53 np0005588920 nova_compute[226886]: 2026-01-20 14:31:53.181 226890 DEBUG oslo_concurrency.lockutils [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Lock "57c49518-7381-4e7d-975f-9c6afc3ea966-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:31:53 np0005588920 nova_compute[226886]: 2026-01-20 14:31:53.181 226890 DEBUG oslo_concurrency.lockutils [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Lock "57c49518-7381-4e7d-975f-9c6afc3ea966-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:31:53 np0005588920 nova_compute[226886]: 2026-01-20 14:31:53.183 226890 DEBUG nova.virt.libvirt.vif [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:31:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationNegativeTestJSON-server-617899621',display_name='tempest-FloatingIPsAssociationNegativeTestJSON-server-617899621',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-floatingipsassociationnegativetestjson-server-617899621',id=35,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ac0564cb541e4c679c6d282fd454c05d',ramdisk_id='',reservation_id='r-crhbpp8e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationNegativeTestJSON-1763144881',owner_user_name='tempest-FloatingIPsAssociationNegativeTestJSON-1763144881-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:31:43Z,user_data=None,user_id='dd0483da65fa4225846c0cc91e8e0275',uuid=57c49518-7381-4e7d-975f-9c6afc3ea966,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7554fcf8-fae8-4efb-aa74-e25896763129", "address": "fa:16:3e:1d:31:c5", "network": {"id": "42267411-7489-4be1-bea9-a5a8c37215df", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1261387871-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac0564cb541e4c679c6d282fd454c05d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7554fcf8-fa", "ovs_interfaceid": "7554fcf8-fae8-4efb-aa74-e25896763129", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:31:53 np0005588920 nova_compute[226886]: 2026-01-20 14:31:53.183 226890 DEBUG nova.network.os_vif_util [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Converting VIF {"id": "7554fcf8-fae8-4efb-aa74-e25896763129", "address": "fa:16:3e:1d:31:c5", "network": {"id": "42267411-7489-4be1-bea9-a5a8c37215df", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1261387871-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac0564cb541e4c679c6d282fd454c05d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7554fcf8-fa", "ovs_interfaceid": "7554fcf8-fae8-4efb-aa74-e25896763129", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:31:53 np0005588920 nova_compute[226886]: 2026-01-20 14:31:53.184 226890 DEBUG nova.network.os_vif_util [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1d:31:c5,bridge_name='br-int',has_traffic_filtering=True,id=7554fcf8-fae8-4efb-aa74-e25896763129,network=Network(42267411-7489-4be1-bea9-a5a8c37215df),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7554fcf8-fa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:31:53 np0005588920 nova_compute[226886]: 2026-01-20 14:31:53.185 226890 DEBUG os_vif [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1d:31:c5,bridge_name='br-int',has_traffic_filtering=True,id=7554fcf8-fae8-4efb-aa74-e25896763129,network=Network(42267411-7489-4be1-bea9-a5a8c37215df),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7554fcf8-fa') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:31:53 np0005588920 nova_compute[226886]: 2026-01-20 14:31:53.186 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:53 np0005588920 nova_compute[226886]: 2026-01-20 14:31:53.186 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:31:53 np0005588920 nova_compute[226886]: 2026-01-20 14:31:53.187 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:31:53 np0005588920 nova_compute[226886]: 2026-01-20 14:31:53.191 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:53 np0005588920 nova_compute[226886]: 2026-01-20 14:31:53.191 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7554fcf8-fa, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:31:53 np0005588920 nova_compute[226886]: 2026-01-20 14:31:53.192 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7554fcf8-fa, col_values=(('external_ids', {'iface-id': '7554fcf8-fae8-4efb-aa74-e25896763129', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1d:31:c5', 'vm-uuid': '57c49518-7381-4e7d-975f-9c6afc3ea966'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:31:53 np0005588920 nova_compute[226886]: 2026-01-20 14:31:53.194 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:53 np0005588920 NetworkManager[49076]: <info>  [1768919513.1958] manager: (tap7554fcf8-fa): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/55)
Jan 20 09:31:53 np0005588920 nova_compute[226886]: 2026-01-20 14:31:53.197 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:31:53 np0005588920 nova_compute[226886]: 2026-01-20 14:31:53.200 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:53 np0005588920 nova_compute[226886]: 2026-01-20 14:31:53.202 226890 INFO os_vif [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1d:31:c5,bridge_name='br-int',has_traffic_filtering=True,id=7554fcf8-fae8-4efb-aa74-e25896763129,network=Network(42267411-7489-4be1-bea9-a5a8c37215df),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7554fcf8-fa')#033[00m
Jan 20 09:31:53 np0005588920 nova_compute[226886]: 2026-01-20 14:31:53.263 226890 DEBUG nova.virt.libvirt.driver [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:31:53 np0005588920 nova_compute[226886]: 2026-01-20 14:31:53.264 226890 DEBUG nova.virt.libvirt.driver [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:31:53 np0005588920 nova_compute[226886]: 2026-01-20 14:31:53.264 226890 DEBUG nova.virt.libvirt.driver [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] No VIF found with MAC fa:16:3e:1d:31:c5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:31:53 np0005588920 nova_compute[226886]: 2026-01-20 14:31:53.264 226890 INFO nova.virt.libvirt.driver [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Using config drive#033[00m
Jan 20 09:31:53 np0005588920 nova_compute[226886]: 2026-01-20 14:31:53.287 226890 DEBUG nova.storage.rbd_utils [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] rbd image 57c49518-7381-4e7d-975f-9c6afc3ea966_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:31:53 np0005588920 nova_compute[226886]: 2026-01-20 14:31:53.926 226890 INFO nova.virt.libvirt.driver [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Creating config drive at /var/lib/nova/instances/57c49518-7381-4e7d-975f-9c6afc3ea966/disk.config#033[00m
Jan 20 09:31:53 np0005588920 nova_compute[226886]: 2026-01-20 14:31:53.932 226890 DEBUG oslo_concurrency.processutils [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/57c49518-7381-4e7d-975f-9c6afc3ea966/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwig5hw3z execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:31:54 np0005588920 nova_compute[226886]: 2026-01-20 14:31:54.063 226890 DEBUG oslo_concurrency.processutils [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/57c49518-7381-4e7d-975f-9c6afc3ea966/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwig5hw3z" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:31:54 np0005588920 nova_compute[226886]: 2026-01-20 14:31:54.091 226890 DEBUG nova.storage.rbd_utils [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] rbd image 57c49518-7381-4e7d-975f-9c6afc3ea966_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:31:54 np0005588920 nova_compute[226886]: 2026-01-20 14:31:54.094 226890 DEBUG oslo_concurrency.processutils [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/57c49518-7381-4e7d-975f-9c6afc3ea966/disk.config 57c49518-7381-4e7d-975f-9c6afc3ea966_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:31:54 np0005588920 nova_compute[226886]: 2026-01-20 14:31:54.232 226890 DEBUG oslo_concurrency.processutils [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/57c49518-7381-4e7d-975f-9c6afc3ea966/disk.config 57c49518-7381-4e7d-975f-9c6afc3ea966_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:31:54 np0005588920 nova_compute[226886]: 2026-01-20 14:31:54.233 226890 INFO nova.virt.libvirt.driver [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Deleting local config drive /var/lib/nova/instances/57c49518-7381-4e7d-975f-9c6afc3ea966/disk.config because it was imported into RBD.#033[00m
Jan 20 09:31:54 np0005588920 kernel: tap7554fcf8-fa: entered promiscuous mode
Jan 20 09:31:54 np0005588920 ovn_controller[133971]: 2026-01-20T14:31:54Z|00092|binding|INFO|Claiming lport 7554fcf8-fae8-4efb-aa74-e25896763129 for this chassis.
Jan 20 09:31:54 np0005588920 ovn_controller[133971]: 2026-01-20T14:31:54Z|00093|binding|INFO|7554fcf8-fae8-4efb-aa74-e25896763129: Claiming fa:16:3e:1d:31:c5 10.100.0.9
Jan 20 09:31:54 np0005588920 nova_compute[226886]: 2026-01-20 14:31:54.272 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:54 np0005588920 nova_compute[226886]: 2026-01-20 14:31:54.275 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:54 np0005588920 NetworkManager[49076]: <info>  [1768919514.2756] manager: (tap7554fcf8-fa): new Tun device (/org/freedesktop/NetworkManager/Devices/56)
Jan 20 09:31:54 np0005588920 systemd-udevd[240368]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:31:54 np0005588920 systemd-machined[196121]: New machine qemu-15-instance-00000023.
Jan 20 09:31:54 np0005588920 NetworkManager[49076]: <info>  [1768919514.3065] device (tap7554fcf8-fa): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:31:54 np0005588920 NetworkManager[49076]: <info>  [1768919514.3073] device (tap7554fcf8-fa): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:31:54 np0005588920 systemd[1]: Started Virtual Machine qemu-15-instance-00000023.
Jan 20 09:31:54 np0005588920 nova_compute[226886]: 2026-01-20 14:31:54.351 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:54 np0005588920 ovn_controller[133971]: 2026-01-20T14:31:54Z|00094|binding|INFO|Setting lport 7554fcf8-fae8-4efb-aa74-e25896763129 ovn-installed in OVS
Jan 20 09:31:54 np0005588920 nova_compute[226886]: 2026-01-20 14:31:54.357 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:54 np0005588920 ovn_controller[133971]: 2026-01-20T14:31:54Z|00095|binding|INFO|Setting lport 7554fcf8-fae8-4efb-aa74-e25896763129 up in Southbound
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:31:54.388 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1d:31:c5 10.100.0.9'], port_security=['fa:16:3e:1d:31:c5 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '57c49518-7381-4e7d-975f-9c6afc3ea966', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-42267411-7489-4be1-bea9-a5a8c37215df', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac0564cb541e4c679c6d282fd454c05d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '18113a43-33ad-4733-bb67-ffdaa13f93ec', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e9d97f7f-5f3a-4a28-99e4-dc5cb30f9bdc, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=7554fcf8-fae8-4efb-aa74-e25896763129) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:31:54.389 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 7554fcf8-fae8-4efb-aa74-e25896763129 in datapath 42267411-7489-4be1-bea9-a5a8c37215df bound to our chassis#033[00m
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:31:54.390 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 42267411-7489-4be1-bea9-a5a8c37215df#033[00m
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:31:54.399 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5aeee1b0-9b7b-4495-9cde-cab96720540c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:31:54.400 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap42267411-71 in ovnmeta-42267411-7489-4be1-bea9-a5a8c37215df namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:31:54.402 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap42267411-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:31:54.402 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c9222af1-ee65-4640-8971-83cb7a6eced1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:31:54.403 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b6d32aad-98b9-4ffb-8ae5-cc52853f9c73]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:31:54.415 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[4bb2d2b2-3a7f-4ca6-8f41-4392d6b7a739]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:31:54.440 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[97f1c206-6542-4b90-ac51-e344b017c703]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:31:54.467 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[a75f18d2-10e6-4ea8-b924-1c2481526ee8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:31:54.475 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a8a6c489-62b8-46e1-852d-8cc3a467bb84]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:31:54 np0005588920 NetworkManager[49076]: <info>  [1768919514.4763] manager: (tap42267411-70): new Veth device (/org/freedesktop/NetworkManager/Devices/57)
Jan 20 09:31:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:54.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:31:54.512 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[193b8a81-c8a8-4cb6-877c-857ec576f7d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:31:54.514 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[aa74283e-af57-4c1f-b9df-04d292460d0a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:31:54 np0005588920 NetworkManager[49076]: <info>  [1768919514.5348] device (tap42267411-70): carrier: link connected
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:31:54.544 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[de3a6219-d3ac-41d7-831b-fb0e6fd64a09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:31:54.562 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0f942d6d-9c94-477a-a57f-adbc49c3682d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap42267411-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:54:86:26'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 32], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 455622, 'reachable_time': 33936, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240436, 'error': None, 'target': 'ovnmeta-42267411-7489-4be1-bea9-a5a8c37215df', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:31:54.577 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[4551e4c4-9242-47f7-aa0a-5614d99ce7c1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe54:8626'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 455622, 'tstamp': 455622}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240440, 'error': None, 'target': 'ovnmeta-42267411-7489-4be1-bea9-a5a8c37215df', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:31:54.594 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a8a6a2e9-754b-4c09-b6fc-4b14abf90195]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap42267411-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:54:86:26'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 32], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 455622, 'reachable_time': 33936, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 240443, 'error': None, 'target': 'ovnmeta-42267411-7489-4be1-bea9-a5a8c37215df', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:31:54.626 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5f1b2904-f760-44fe-8aef-80430596be17]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:31:54 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:31:54.680 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[23a5a159-f95b-4af8-b0c0-97549daf8736]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:31:54.683 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap42267411-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:31:54.683 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:31:54.683 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap42267411-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:31:54 np0005588920 nova_compute[226886]: 2026-01-20 14:31:54.685 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:54 np0005588920 NetworkManager[49076]: <info>  [1768919514.6859] manager: (tap42267411-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/58)
Jan 20 09:31:54 np0005588920 kernel: tap42267411-70: entered promiscuous mode
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:31:54.690 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap42267411-70, col_values=(('external_ids', {'iface-id': 'e10e1853-9762-4b97-a71b-085a7333bdd9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:31:54 np0005588920 nova_compute[226886]: 2026-01-20 14:31:54.690 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919514.6899085, 57c49518-7381-4e7d-975f-9c6afc3ea966 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:31:54 np0005588920 nova_compute[226886]: 2026-01-20 14:31:54.691 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] VM Started (Lifecycle Event)#033[00m
Jan 20 09:31:54 np0005588920 ovn_controller[133971]: 2026-01-20T14:31:54Z|00096|binding|INFO|Releasing lport e10e1853-9762-4b97-a71b-085a7333bdd9 from this chassis (sb_readonly=0)
Jan 20 09:31:54 np0005588920 nova_compute[226886]: 2026-01-20 14:31:54.693 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:31:54.694 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/42267411-7489-4be1-bea9-a5a8c37215df.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/42267411-7489-4be1-bea9-a5a8c37215df.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:31:54.695 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f98ee0fe-9b5c-4082-b4a3-8a82c2837e3b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:31:54.695 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-42267411-7489-4be1-bea9-a5a8c37215df
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/42267411-7489-4be1-bea9-a5a8c37215df.pid.haproxy
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID 42267411-7489-4be1-bea9-a5a8c37215df
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:31:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:31:54.696 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-42267411-7489-4be1-bea9-a5a8c37215df', 'env', 'PROCESS_TAG=haproxy-42267411-7489-4be1-bea9-a5a8c37215df', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/42267411-7489-4be1-bea9-a5a8c37215df.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:31:54 np0005588920 nova_compute[226886]: 2026-01-20 14:31:54.708 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 09:31:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:54.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 09:31:54 np0005588920 nova_compute[226886]: 2026-01-20 14:31:54.796 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:31:54 np0005588920 nova_compute[226886]: 2026-01-20 14:31:54.804 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919514.6910236, 57c49518-7381-4e7d-975f-9c6afc3ea966 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:31:54 np0005588920 nova_compute[226886]: 2026-01-20 14:31:54.804 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:31:54 np0005588920 nova_compute[226886]: 2026-01-20 14:31:54.836 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:31:54 np0005588920 nova_compute[226886]: 2026-01-20 14:31:54.839 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:31:54 np0005588920 nova_compute[226886]: 2026-01-20 14:31:54.867 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:31:55 np0005588920 nova_compute[226886]: 2026-01-20 14:31:55.000 226890 DEBUG nova.compute.manager [req-9b583d61-3a95-487e-ac97-eee94988040a req-5ce52ae8-6866-4786-993a-97577203d962 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Received event network-vif-plugged-7554fcf8-fae8-4efb-aa74-e25896763129 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:31:55 np0005588920 nova_compute[226886]: 2026-01-20 14:31:55.001 226890 DEBUG oslo_concurrency.lockutils [req-9b583d61-3a95-487e-ac97-eee94988040a req-5ce52ae8-6866-4786-993a-97577203d962 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "57c49518-7381-4e7d-975f-9c6afc3ea966-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:31:55 np0005588920 nova_compute[226886]: 2026-01-20 14:31:55.001 226890 DEBUG oslo_concurrency.lockutils [req-9b583d61-3a95-487e-ac97-eee94988040a req-5ce52ae8-6866-4786-993a-97577203d962 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "57c49518-7381-4e7d-975f-9c6afc3ea966-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:31:55 np0005588920 nova_compute[226886]: 2026-01-20 14:31:55.001 226890 DEBUG oslo_concurrency.lockutils [req-9b583d61-3a95-487e-ac97-eee94988040a req-5ce52ae8-6866-4786-993a-97577203d962 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "57c49518-7381-4e7d-975f-9c6afc3ea966-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:31:55 np0005588920 nova_compute[226886]: 2026-01-20 14:31:55.002 226890 DEBUG nova.compute.manager [req-9b583d61-3a95-487e-ac97-eee94988040a req-5ce52ae8-6866-4786-993a-97577203d962 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Processing event network-vif-plugged-7554fcf8-fae8-4efb-aa74-e25896763129 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:31:55 np0005588920 nova_compute[226886]: 2026-01-20 14:31:55.002 226890 DEBUG nova.compute.manager [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:31:55 np0005588920 nova_compute[226886]: 2026-01-20 14:31:55.011 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919515.0064266, 57c49518-7381-4e7d-975f-9c6afc3ea966 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:31:55 np0005588920 nova_compute[226886]: 2026-01-20 14:31:55.011 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:31:55 np0005588920 nova_compute[226886]: 2026-01-20 14:31:55.012 226890 DEBUG nova.virt.libvirt.driver [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:31:55 np0005588920 nova_compute[226886]: 2026-01-20 14:31:55.017 226890 INFO nova.virt.libvirt.driver [-] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Instance spawned successfully.#033[00m
Jan 20 09:31:55 np0005588920 nova_compute[226886]: 2026-01-20 14:31:55.024 226890 DEBUG nova.virt.libvirt.driver [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:31:55 np0005588920 nova_compute[226886]: 2026-01-20 14:31:55.039 226890 DEBUG nova.network.neutron [req-14bc26e2-d889-4344-bde4-29d9f0f561b4 req-112cfe3a-e230-4dad-99b2-0cfb08378cd6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Updated VIF entry in instance network info cache for port 7554fcf8-fae8-4efb-aa74-e25896763129. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:31:55 np0005588920 nova_compute[226886]: 2026-01-20 14:31:55.040 226890 DEBUG nova.network.neutron [req-14bc26e2-d889-4344-bde4-29d9f0f561b4 req-112cfe3a-e230-4dad-99b2-0cfb08378cd6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Updating instance_info_cache with network_info: [{"id": "7554fcf8-fae8-4efb-aa74-e25896763129", "address": "fa:16:3e:1d:31:c5", "network": {"id": "42267411-7489-4be1-bea9-a5a8c37215df", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1261387871-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac0564cb541e4c679c6d282fd454c05d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7554fcf8-fa", "ovs_interfaceid": "7554fcf8-fae8-4efb-aa74-e25896763129", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:31:55 np0005588920 nova_compute[226886]: 2026-01-20 14:31:55.042 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:31:55 np0005588920 nova_compute[226886]: 2026-01-20 14:31:55.045 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:31:55 np0005588920 nova_compute[226886]: 2026-01-20 14:31:55.051 226890 DEBUG nova.virt.libvirt.driver [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:31:55 np0005588920 nova_compute[226886]: 2026-01-20 14:31:55.051 226890 DEBUG nova.virt.libvirt.driver [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:31:55 np0005588920 nova_compute[226886]: 2026-01-20 14:31:55.052 226890 DEBUG nova.virt.libvirt.driver [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:31:55 np0005588920 nova_compute[226886]: 2026-01-20 14:31:55.052 226890 DEBUG nova.virt.libvirt.driver [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:31:55 np0005588920 nova_compute[226886]: 2026-01-20 14:31:55.053 226890 DEBUG nova.virt.libvirt.driver [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:31:55 np0005588920 nova_compute[226886]: 2026-01-20 14:31:55.053 226890 DEBUG nova.virt.libvirt.driver [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:31:55 np0005588920 podman[240478]: 2026-01-20 14:31:55.07405375 +0000 UTC m=+0.059895060 container create f0658c630f6219cdd195e9abf6d8a8af596b386aa757da8e85ee60e2dc8786d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42267411-7489-4be1-bea9-a5a8c37215df, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Jan 20 09:31:55 np0005588920 nova_compute[226886]: 2026-01-20 14:31:55.075 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:31:55 np0005588920 nova_compute[226886]: 2026-01-20 14:31:55.087 226890 DEBUG oslo_concurrency.lockutils [req-14bc26e2-d889-4344-bde4-29d9f0f561b4 req-112cfe3a-e230-4dad-99b2-0cfb08378cd6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-57c49518-7381-4e7d-975f-9c6afc3ea966" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:31:55 np0005588920 systemd[1]: Started libpod-conmon-f0658c630f6219cdd195e9abf6d8a8af596b386aa757da8e85ee60e2dc8786d6.scope.
Jan 20 09:31:55 np0005588920 podman[240478]: 2026-01-20 14:31:55.042311863 +0000 UTC m=+0.028153173 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:31:55 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:31:55 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/062b87c98f50a6083a70429521c4fae820e3d7a1004760c0a37db60ffd08a9ee/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:31:55 np0005588920 nova_compute[226886]: 2026-01-20 14:31:55.166 226890 INFO nova.compute.manager [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Took 11.40 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:31:55 np0005588920 nova_compute[226886]: 2026-01-20 14:31:55.168 226890 DEBUG nova.compute.manager [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:31:55 np0005588920 podman[240478]: 2026-01-20 14:31:55.167541701 +0000 UTC m=+0.153383001 container init f0658c630f6219cdd195e9abf6d8a8af596b386aa757da8e85ee60e2dc8786d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42267411-7489-4be1-bea9-a5a8c37215df, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 20 09:31:55 np0005588920 podman[240478]: 2026-01-20 14:31:55.174509444 +0000 UTC m=+0.160350734 container start f0658c630f6219cdd195e9abf6d8a8af596b386aa757da8e85ee60e2dc8786d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42267411-7489-4be1-bea9-a5a8c37215df, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:31:55 np0005588920 neutron-haproxy-ovnmeta-42267411-7489-4be1-bea9-a5a8c37215df[240493]: [NOTICE]   (240497) : New worker (240499) forked
Jan 20 09:31:55 np0005588920 neutron-haproxy-ovnmeta-42267411-7489-4be1-bea9-a5a8c37215df[240493]: [NOTICE]   (240497) : Loading success.
Jan 20 09:31:55 np0005588920 nova_compute[226886]: 2026-01-20 14:31:55.239 226890 INFO nova.compute.manager [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Took 16.62 seconds to build instance.#033[00m
Jan 20 09:31:55 np0005588920 nova_compute[226886]: 2026-01-20 14:31:55.277 226890 DEBUG oslo_concurrency.lockutils [None req-d55a7a3c-6267-4c78-bf2d-cdf22b749352 dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Lock "57c49518-7381-4e7d-975f-9c6afc3ea966" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.759s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:31:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:56.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:56.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:57 np0005588920 nova_compute[226886]: 2026-01-20 14:31:57.444 226890 DEBUG nova.compute.manager [req-2529ff9e-50e5-434b-a1df-d8c9aa2560fb req-df1ed654-6729-4eed-b163-4c41acc17b6d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Received event network-vif-plugged-7554fcf8-fae8-4efb-aa74-e25896763129 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:31:57 np0005588920 nova_compute[226886]: 2026-01-20 14:31:57.445 226890 DEBUG oslo_concurrency.lockutils [req-2529ff9e-50e5-434b-a1df-d8c9aa2560fb req-df1ed654-6729-4eed-b163-4c41acc17b6d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "57c49518-7381-4e7d-975f-9c6afc3ea966-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:31:57 np0005588920 nova_compute[226886]: 2026-01-20 14:31:57.446 226890 DEBUG oslo_concurrency.lockutils [req-2529ff9e-50e5-434b-a1df-d8c9aa2560fb req-df1ed654-6729-4eed-b163-4c41acc17b6d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "57c49518-7381-4e7d-975f-9c6afc3ea966-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:31:57 np0005588920 nova_compute[226886]: 2026-01-20 14:31:57.446 226890 DEBUG oslo_concurrency.lockutils [req-2529ff9e-50e5-434b-a1df-d8c9aa2560fb req-df1ed654-6729-4eed-b163-4c41acc17b6d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "57c49518-7381-4e7d-975f-9c6afc3ea966-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:31:57 np0005588920 nova_compute[226886]: 2026-01-20 14:31:57.446 226890 DEBUG nova.compute.manager [req-2529ff9e-50e5-434b-a1df-d8c9aa2560fb req-df1ed654-6729-4eed-b163-4c41acc17b6d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] No waiting events found dispatching network-vif-plugged-7554fcf8-fae8-4efb-aa74-e25896763129 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:31:57 np0005588920 nova_compute[226886]: 2026-01-20 14:31:57.446 226890 WARNING nova.compute.manager [req-2529ff9e-50e5-434b-a1df-d8c9aa2560fb req-df1ed654-6729-4eed-b163-4c41acc17b6d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Received unexpected event network-vif-plugged-7554fcf8-fae8-4efb-aa74-e25896763129 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:31:58 np0005588920 nova_compute[226886]: 2026-01-20 14:31:58.195 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:31:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:58.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:31:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:31:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:58.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:59 np0005588920 nova_compute[226886]: 2026-01-20 14:31:59.355 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:59 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:32:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:00.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:00 np0005588920 nova_compute[226886]: 2026-01-20 14:32:00.529 226890 DEBUG nova.virt.libvirt.driver [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 20 09:32:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:32:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:00.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:32:01 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:32:01.784 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:32:01 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:32:01.786 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:32:01 np0005588920 nova_compute[226886]: 2026-01-20 14:32:01.788 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:32:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:02.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:32:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:02.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:03 np0005588920 nova_compute[226886]: 2026-01-20 14:32:03.199 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:03 np0005588920 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000022.scope: Deactivated successfully.
Jan 20 09:32:03 np0005588920 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000022.scope: Consumed 13.096s CPU time.
Jan 20 09:32:03 np0005588920 systemd-machined[196121]: Machine qemu-14-instance-00000022 terminated.
Jan 20 09:32:03 np0005588920 nova_compute[226886]: 2026-01-20 14:32:03.593 226890 INFO nova.virt.libvirt.driver [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Instance shutdown successfully after 13 seconds.#033[00m
Jan 20 09:32:03 np0005588920 nova_compute[226886]: 2026-01-20 14:32:03.601 226890 INFO nova.virt.libvirt.driver [-] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Instance destroyed successfully.#033[00m
Jan 20 09:32:03 np0005588920 nova_compute[226886]: 2026-01-20 14:32:03.607 226890 INFO nova.virt.libvirt.driver [-] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Instance destroyed successfully.#033[00m
Jan 20 09:32:04 np0005588920 nova_compute[226886]: 2026-01-20 14:32:04.058 226890 INFO nova.virt.libvirt.driver [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Deleting instance files /var/lib/nova/instances/ebc2b8c3-8d9f-4798-8865-dd256233f4fc_del#033[00m
Jan 20 09:32:04 np0005588920 nova_compute[226886]: 2026-01-20 14:32:04.059 226890 INFO nova.virt.libvirt.driver [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Deletion of /var/lib/nova/instances/ebc2b8c3-8d9f-4798-8865-dd256233f4fc_del complete#033[00m
Jan 20 09:32:04 np0005588920 nova_compute[226886]: 2026-01-20 14:32:04.357 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:04.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:04 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:32:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 09:32:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:04.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 09:32:04 np0005588920 nova_compute[226886]: 2026-01-20 14:32:04.881 226890 DEBUG nova.virt.libvirt.driver [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:32:04 np0005588920 nova_compute[226886]: 2026-01-20 14:32:04.882 226890 INFO nova.virt.libvirt.driver [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Creating image(s)#033[00m
Jan 20 09:32:04 np0005588920 nova_compute[226886]: 2026-01-20 14:32:04.904 226890 DEBUG nova.storage.rbd_utils [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] rbd image ebc2b8c3-8d9f-4798-8865-dd256233f4fc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:32:04 np0005588920 nova_compute[226886]: 2026-01-20 14:32:04.930 226890 DEBUG nova.storage.rbd_utils [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] rbd image ebc2b8c3-8d9f-4798-8865-dd256233f4fc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:32:04 np0005588920 nova_compute[226886]: 2026-01-20 14:32:04.954 226890 DEBUG nova.storage.rbd_utils [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] rbd image ebc2b8c3-8d9f-4798-8865-dd256233f4fc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:32:04 np0005588920 nova_compute[226886]: 2026-01-20 14:32:04.958 226890 DEBUG oslo_concurrency.lockutils [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Acquiring lock "a4ed0d2b98aa460c005e878d78a49ccb6f511f7c" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:32:04 np0005588920 nova_compute[226886]: 2026-01-20 14:32:04.959 226890 DEBUG oslo_concurrency.lockutils [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Lock "a4ed0d2b98aa460c005e878d78a49ccb6f511f7c" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:32:05 np0005588920 NetworkManager[49076]: <info>  [1768919525.1371] manager: (patch-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/59)
Jan 20 09:32:05 np0005588920 NetworkManager[49076]: <info>  [1768919525.1384] manager: (patch-br-int-to-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/60)
Jan 20 09:32:05 np0005588920 nova_compute[226886]: 2026-01-20 14:32:05.143 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:05 np0005588920 ovn_controller[133971]: 2026-01-20T14:32:05Z|00097|binding|INFO|Releasing lport e10e1853-9762-4b97-a71b-085a7333bdd9 from this chassis (sb_readonly=0)
Jan 20 09:32:05 np0005588920 nova_compute[226886]: 2026-01-20 14:32:05.216 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:05 np0005588920 nova_compute[226886]: 2026-01-20 14:32:05.221 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:05 np0005588920 ovn_controller[133971]: 2026-01-20T14:32:05Z|00098|binding|INFO|Releasing lport e10e1853-9762-4b97-a71b-085a7333bdd9 from this chassis (sb_readonly=0)
Jan 20 09:32:05 np0005588920 nova_compute[226886]: 2026-01-20 14:32:05.549 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:05 np0005588920 nova_compute[226886]: 2026-01-20 14:32:05.615 226890 DEBUG nova.compute.manager [req-6c96900b-3f26-4c67-9a73-db49f4455626 req-7a0f03a0-44a3-4293-a043-cc4eb0083932 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Received event network-changed-7554fcf8-fae8-4efb-aa74-e25896763129 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:32:05 np0005588920 nova_compute[226886]: 2026-01-20 14:32:05.616 226890 DEBUG nova.compute.manager [req-6c96900b-3f26-4c67-9a73-db49f4455626 req-7a0f03a0-44a3-4293-a043-cc4eb0083932 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Refreshing instance network info cache due to event network-changed-7554fcf8-fae8-4efb-aa74-e25896763129. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:32:05 np0005588920 nova_compute[226886]: 2026-01-20 14:32:05.616 226890 DEBUG oslo_concurrency.lockutils [req-6c96900b-3f26-4c67-9a73-db49f4455626 req-7a0f03a0-44a3-4293-a043-cc4eb0083932 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-57c49518-7381-4e7d-975f-9c6afc3ea966" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:32:05 np0005588920 nova_compute[226886]: 2026-01-20 14:32:05.616 226890 DEBUG oslo_concurrency.lockutils [req-6c96900b-3f26-4c67-9a73-db49f4455626 req-7a0f03a0-44a3-4293-a043-cc4eb0083932 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-57c49518-7381-4e7d-975f-9c6afc3ea966" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:32:05 np0005588920 nova_compute[226886]: 2026-01-20 14:32:05.616 226890 DEBUG nova.network.neutron [req-6c96900b-3f26-4c67-9a73-db49f4455626 req-7a0f03a0-44a3-4293-a043-cc4eb0083932 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Refreshing network info cache for port 7554fcf8-fae8-4efb-aa74-e25896763129 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:32:05 np0005588920 nova_compute[226886]: 2026-01-20 14:32:05.715 226890 DEBUG nova.virt.libvirt.imagebackend [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Image locations are: [{'url': 'rbd://e399cf45-e6b6-5393-99f1-75c601d3f188/images/26699514-f465-4b50-98b7-36f2cfc6a308/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://e399cf45-e6b6-5393-99f1-75c601d3f188/images/26699514-f465-4b50-98b7-36f2cfc6a308/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Jan 20 09:32:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:32:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:06.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:32:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:32:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:06.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:32:07 np0005588920 nova_compute[226886]: 2026-01-20 14:32:07.571 226890 DEBUG oslo_concurrency.processutils [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:32:07 np0005588920 nova_compute[226886]: 2026-01-20 14:32:07.631 226890 DEBUG oslo_concurrency.processutils [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c.part --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:32:07 np0005588920 nova_compute[226886]: 2026-01-20 14:32:07.634 226890 DEBUG nova.virt.images [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] 26699514-f465-4b50-98b7-36f2cfc6a308 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Jan 20 09:32:07 np0005588920 nova_compute[226886]: 2026-01-20 14:32:07.635 226890 DEBUG nova.privsep.utils [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Jan 20 09:32:07 np0005588920 nova_compute[226886]: 2026-01-20 14:32:07.636 226890 DEBUG oslo_concurrency.processutils [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c.part /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:32:07 np0005588920 nova_compute[226886]: 2026-01-20 14:32:07.790 226890 DEBUG oslo_concurrency.processutils [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c.part /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c.converted" returned: 0 in 0.154s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:32:07 np0005588920 nova_compute[226886]: 2026-01-20 14:32:07.798 226890 DEBUG oslo_concurrency.processutils [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:32:07 np0005588920 nova_compute[226886]: 2026-01-20 14:32:07.864 226890 DEBUG oslo_concurrency.processutils [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c.converted --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:32:07 np0005588920 nova_compute[226886]: 2026-01-20 14:32:07.866 226890 DEBUG oslo_concurrency.lockutils [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Lock "a4ed0d2b98aa460c005e878d78a49ccb6f511f7c" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.907s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:32:07 np0005588920 nova_compute[226886]: 2026-01-20 14:32:07.893 226890 DEBUG nova.storage.rbd_utils [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] rbd image ebc2b8c3-8d9f-4798-8865-dd256233f4fc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:32:07 np0005588920 nova_compute[226886]: 2026-01-20 14:32:07.897 226890 DEBUG oslo_concurrency.processutils [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c ebc2b8c3-8d9f-4798-8865-dd256233f4fc_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:32:07 np0005588920 ovn_controller[133971]: 2026-01-20T14:32:07Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:1d:31:c5 10.100.0.9
Jan 20 09:32:07 np0005588920 ovn_controller[133971]: 2026-01-20T14:32:07Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1d:31:c5 10.100.0.9
Jan 20 09:32:08 np0005588920 nova_compute[226886]: 2026-01-20 14:32:08.202 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:08 np0005588920 nova_compute[226886]: 2026-01-20 14:32:08.206 226890 DEBUG oslo_concurrency.processutils [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c ebc2b8c3-8d9f-4798-8865-dd256233f4fc_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.309s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:32:08 np0005588920 nova_compute[226886]: 2026-01-20 14:32:08.269 226890 DEBUG nova.storage.rbd_utils [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] resizing rbd image ebc2b8c3-8d9f-4798-8865-dd256233f4fc_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:32:08 np0005588920 nova_compute[226886]: 2026-01-20 14:32:08.367 226890 DEBUG nova.virt.libvirt.driver [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:32:08 np0005588920 nova_compute[226886]: 2026-01-20 14:32:08.368 226890 DEBUG nova.virt.libvirt.driver [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Ensure instance console log exists: /var/lib/nova/instances/ebc2b8c3-8d9f-4798-8865-dd256233f4fc/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:32:08 np0005588920 nova_compute[226886]: 2026-01-20 14:32:08.368 226890 DEBUG oslo_concurrency.lockutils [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:32:08 np0005588920 nova_compute[226886]: 2026-01-20 14:32:08.369 226890 DEBUG oslo_concurrency.lockutils [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:32:08 np0005588920 nova_compute[226886]: 2026-01-20 14:32:08.369 226890 DEBUG oslo_concurrency.lockutils [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:32:08 np0005588920 nova_compute[226886]: 2026-01-20 14:32:08.370 226890 DEBUG nova.virt.libvirt.driver [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:22:02Z,direct_url=<?>,disk_format='qcow2',id=26699514-f465-4b50-98b7-36f2cfc6a308,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:04Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:32:08 np0005588920 nova_compute[226886]: 2026-01-20 14:32:08.373 226890 WARNING nova.virt.libvirt.driver [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Jan 20 09:32:08 np0005588920 nova_compute[226886]: 2026-01-20 14:32:08.380 226890 DEBUG nova.virt.libvirt.host [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:32:08 np0005588920 nova_compute[226886]: 2026-01-20 14:32:08.381 226890 DEBUG nova.virt.libvirt.host [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:32:08 np0005588920 nova_compute[226886]: 2026-01-20 14:32:08.384 226890 DEBUG nova.virt.libvirt.host [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:32:08 np0005588920 nova_compute[226886]: 2026-01-20 14:32:08.385 226890 DEBUG nova.virt.libvirt.host [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:32:08 np0005588920 nova_compute[226886]: 2026-01-20 14:32:08.386 226890 DEBUG nova.virt.libvirt.driver [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:32:08 np0005588920 nova_compute[226886]: 2026-01-20 14:32:08.386 226890 DEBUG nova.virt.hardware [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:22:02Z,direct_url=<?>,disk_format='qcow2',id=26699514-f465-4b50-98b7-36f2cfc6a308,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:04Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:32:08 np0005588920 nova_compute[226886]: 2026-01-20 14:32:08.387 226890 DEBUG nova.virt.hardware [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:32:08 np0005588920 nova_compute[226886]: 2026-01-20 14:32:08.387 226890 DEBUG nova.virt.hardware [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:32:08 np0005588920 nova_compute[226886]: 2026-01-20 14:32:08.387 226890 DEBUG nova.virt.hardware [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:32:08 np0005588920 nova_compute[226886]: 2026-01-20 14:32:08.387 226890 DEBUG nova.virt.hardware [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:32:08 np0005588920 nova_compute[226886]: 2026-01-20 14:32:08.388 226890 DEBUG nova.virt.hardware [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:32:08 np0005588920 nova_compute[226886]: 2026-01-20 14:32:08.388 226890 DEBUG nova.virt.hardware [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:32:08 np0005588920 nova_compute[226886]: 2026-01-20 14:32:08.388 226890 DEBUG nova.virt.hardware [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:32:08 np0005588920 nova_compute[226886]: 2026-01-20 14:32:08.388 226890 DEBUG nova.virt.hardware [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:32:08 np0005588920 nova_compute[226886]: 2026-01-20 14:32:08.389 226890 DEBUG nova.virt.hardware [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:32:08 np0005588920 nova_compute[226886]: 2026-01-20 14:32:08.389 226890 DEBUG nova.virt.hardware [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:32:08 np0005588920 nova_compute[226886]: 2026-01-20 14:32:08.389 226890 DEBUG nova.objects.instance [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Lazy-loading 'vcpu_model' on Instance uuid ebc2b8c3-8d9f-4798-8865-dd256233f4fc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:32:08 np0005588920 nova_compute[226886]: 2026-01-20 14:32:08.410 226890 DEBUG oslo_concurrency.processutils [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:32:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:08.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:08 np0005588920 nova_compute[226886]: 2026-01-20 14:32:08.534 226890 DEBUG nova.network.neutron [req-6c96900b-3f26-4c67-9a73-db49f4455626 req-7a0f03a0-44a3-4293-a043-cc4eb0083932 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Updated VIF entry in instance network info cache for port 7554fcf8-fae8-4efb-aa74-e25896763129. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:32:08 np0005588920 nova_compute[226886]: 2026-01-20 14:32:08.535 226890 DEBUG nova.network.neutron [req-6c96900b-3f26-4c67-9a73-db49f4455626 req-7a0f03a0-44a3-4293-a043-cc4eb0083932 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Updating instance_info_cache with network_info: [{"id": "7554fcf8-fae8-4efb-aa74-e25896763129", "address": "fa:16:3e:1d:31:c5", "network": {"id": "42267411-7489-4be1-bea9-a5a8c37215df", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1261387871-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac0564cb541e4c679c6d282fd454c05d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7554fcf8-fa", "ovs_interfaceid": "7554fcf8-fae8-4efb-aa74-e25896763129", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:32:08 np0005588920 nova_compute[226886]: 2026-01-20 14:32:08.555 226890 DEBUG oslo_concurrency.lockutils [req-6c96900b-3f26-4c67-9a73-db49f4455626 req-7a0f03a0-44a3-4293-a043-cc4eb0083932 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-57c49518-7381-4e7d-975f-9c6afc3ea966" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:32:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:08.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:32:08.788 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:32:08 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:32:08 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1509807238' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:32:08 np0005588920 nova_compute[226886]: 2026-01-20 14:32:08.834 226890 DEBUG oslo_concurrency.processutils [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:32:08 np0005588920 nova_compute[226886]: 2026-01-20 14:32:08.857 226890 DEBUG nova.storage.rbd_utils [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] rbd image ebc2b8c3-8d9f-4798-8865-dd256233f4fc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:32:08 np0005588920 nova_compute[226886]: 2026-01-20 14:32:08.861 226890 DEBUG oslo_concurrency.processutils [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:32:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:32:09 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3429196109' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:32:09 np0005588920 nova_compute[226886]: 2026-01-20 14:32:09.267 226890 DEBUG oslo_concurrency.processutils [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.406s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:32:09 np0005588920 nova_compute[226886]: 2026-01-20 14:32:09.272 226890 DEBUG nova.virt.libvirt.driver [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:32:09 np0005588920 nova_compute[226886]:  <uuid>ebc2b8c3-8d9f-4798-8865-dd256233f4fc</uuid>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:  <name>instance-00000022</name>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:32:09 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:      <nova:name>tempest-ServersAdmin275Test-server-625178373</nova:name>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:32:08</nova:creationTime>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:32:09 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:        <nova:user uuid="3d51de2ad98d40d8ad12305518d106fd">tempest-ServersAdmin275Test-1570927802-project-member</nova:user>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:        <nova:project uuid="389fafaa99e14f31988005de907401bf">tempest-ServersAdmin275Test-1570927802</nova:project>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="26699514-f465-4b50-98b7-36f2cfc6a308"/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:      <nova:ports/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:      <entry name="serial">ebc2b8c3-8d9f-4798-8865-dd256233f4fc</entry>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:      <entry name="uuid">ebc2b8c3-8d9f-4798-8865-dd256233f4fc</entry>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:32:09 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/ebc2b8c3-8d9f-4798-8865-dd256233f4fc_disk">
Jan 20 09:32:09 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:32:09 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:32:09 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/ebc2b8c3-8d9f-4798-8865-dd256233f4fc_disk.config">
Jan 20 09:32:09 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:32:09 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:32:09 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/ebc2b8c3-8d9f-4798-8865-dd256233f4fc/console.log" append="off"/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:32:09 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:32:09 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:32:09 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:32:09 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:32:09 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:32:09 np0005588920 nova_compute[226886]: 2026-01-20 14:32:09.338 226890 DEBUG nova.virt.libvirt.driver [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:32:09 np0005588920 nova_compute[226886]: 2026-01-20 14:32:09.339 226890 DEBUG nova.virt.libvirt.driver [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:32:09 np0005588920 nova_compute[226886]: 2026-01-20 14:32:09.340 226890 INFO nova.virt.libvirt.driver [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Using config drive#033[00m
Jan 20 09:32:09 np0005588920 nova_compute[226886]: 2026-01-20 14:32:09.375 226890 DEBUG nova.storage.rbd_utils [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] rbd image ebc2b8c3-8d9f-4798-8865-dd256233f4fc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:32:09 np0005588920 nova_compute[226886]: 2026-01-20 14:32:09.386 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:09 np0005588920 podman[240772]: 2026-01-20 14:32:09.399942128 +0000 UTC m=+0.084070587 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 20 09:32:09 np0005588920 nova_compute[226886]: 2026-01-20 14:32:09.422 226890 DEBUG nova.objects.instance [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Lazy-loading 'ec2_ids' on Instance uuid ebc2b8c3-8d9f-4798-8865-dd256233f4fc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:32:09 np0005588920 nova_compute[226886]: 2026-01-20 14:32:09.483 226890 DEBUG nova.objects.instance [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Lazy-loading 'keypairs' on Instance uuid ebc2b8c3-8d9f-4798-8865-dd256233f4fc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:32:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:32:09 np0005588920 nova_compute[226886]: 2026-01-20 14:32:09.977 226890 INFO nova.virt.libvirt.driver [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Creating config drive at /var/lib/nova/instances/ebc2b8c3-8d9f-4798-8865-dd256233f4fc/disk.config#033[00m
Jan 20 09:32:09 np0005588920 nova_compute[226886]: 2026-01-20 14:32:09.982 226890 DEBUG oslo_concurrency.processutils [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ebc2b8c3-8d9f-4798-8865-dd256233f4fc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkrgrglim execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:32:10 np0005588920 nova_compute[226886]: 2026-01-20 14:32:10.126 226890 DEBUG oslo_concurrency.processutils [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ebc2b8c3-8d9f-4798-8865-dd256233f4fc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkrgrglim" returned: 0 in 0.145s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:32:10 np0005588920 nova_compute[226886]: 2026-01-20 14:32:10.160 226890 DEBUG nova.storage.rbd_utils [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] rbd image ebc2b8c3-8d9f-4798-8865-dd256233f4fc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:32:10 np0005588920 nova_compute[226886]: 2026-01-20 14:32:10.165 226890 DEBUG oslo_concurrency.processutils [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/ebc2b8c3-8d9f-4798-8865-dd256233f4fc/disk.config ebc2b8c3-8d9f-4798-8865-dd256233f4fc_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:32:10 np0005588920 nova_compute[226886]: 2026-01-20 14:32:10.341 226890 DEBUG oslo_concurrency.processutils [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/ebc2b8c3-8d9f-4798-8865-dd256233f4fc/disk.config ebc2b8c3-8d9f-4798-8865-dd256233f4fc_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.176s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:32:10 np0005588920 nova_compute[226886]: 2026-01-20 14:32:10.342 226890 INFO nova.virt.libvirt.driver [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Deleting local config drive /var/lib/nova/instances/ebc2b8c3-8d9f-4798-8865-dd256233f4fc/disk.config because it was imported into RBD.#033[00m
Jan 20 09:32:10 np0005588920 systemd-machined[196121]: New machine qemu-16-instance-00000022.
Jan 20 09:32:10 np0005588920 systemd[1]: Started Virtual Machine qemu-16-instance-00000022.
Jan 20 09:32:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 09:32:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:10.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 09:32:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:10.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:10 np0005588920 nova_compute[226886]: 2026-01-20 14:32:10.990 226890 DEBUG nova.virt.libvirt.host [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Removed pending event for ebc2b8c3-8d9f-4798-8865-dd256233f4fc due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 20 09:32:10 np0005588920 nova_compute[226886]: 2026-01-20 14:32:10.991 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919530.98967, ebc2b8c3-8d9f-4798-8865-dd256233f4fc => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:32:10 np0005588920 nova_compute[226886]: 2026-01-20 14:32:10.992 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:32:10 np0005588920 nova_compute[226886]: 2026-01-20 14:32:10.997 226890 DEBUG nova.compute.manager [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:32:10 np0005588920 nova_compute[226886]: 2026-01-20 14:32:10.998 226890 DEBUG nova.virt.libvirt.driver [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:32:11 np0005588920 nova_compute[226886]: 2026-01-20 14:32:11.003 226890 INFO nova.virt.libvirt.driver [-] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Instance spawned successfully.#033[00m
Jan 20 09:32:11 np0005588920 nova_compute[226886]: 2026-01-20 14:32:11.004 226890 DEBUG nova.virt.libvirt.driver [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:32:11 np0005588920 nova_compute[226886]: 2026-01-20 14:32:11.022 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:32:11 np0005588920 nova_compute[226886]: 2026-01-20 14:32:11.030 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:32:11 np0005588920 nova_compute[226886]: 2026-01-20 14:32:11.037 226890 DEBUG nova.virt.libvirt.driver [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:32:11 np0005588920 nova_compute[226886]: 2026-01-20 14:32:11.038 226890 DEBUG nova.virt.libvirt.driver [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:32:11 np0005588920 nova_compute[226886]: 2026-01-20 14:32:11.039 226890 DEBUG nova.virt.libvirt.driver [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:32:11 np0005588920 nova_compute[226886]: 2026-01-20 14:32:11.040 226890 DEBUG nova.virt.libvirt.driver [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:32:11 np0005588920 nova_compute[226886]: 2026-01-20 14:32:11.041 226890 DEBUG nova.virt.libvirt.driver [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:32:11 np0005588920 nova_compute[226886]: 2026-01-20 14:32:11.042 226890 DEBUG nova.virt.libvirt.driver [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:32:11 np0005588920 nova_compute[226886]: 2026-01-20 14:32:11.055 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 20 09:32:11 np0005588920 nova_compute[226886]: 2026-01-20 14:32:11.056 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919530.9907863, ebc2b8c3-8d9f-4798-8865-dd256233f4fc => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:32:11 np0005588920 nova_compute[226886]: 2026-01-20 14:32:11.056 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] VM Started (Lifecycle Event)#033[00m
Jan 20 09:32:11 np0005588920 nova_compute[226886]: 2026-01-20 14:32:11.075 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:32:11 np0005588920 nova_compute[226886]: 2026-01-20 14:32:11.078 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:32:11 np0005588920 nova_compute[226886]: 2026-01-20 14:32:11.115 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 20 09:32:11 np0005588920 nova_compute[226886]: 2026-01-20 14:32:11.140 226890 DEBUG nova.compute.manager [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:32:11 np0005588920 nova_compute[226886]: 2026-01-20 14:32:11.227 226890 DEBUG oslo_concurrency.lockutils [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:32:11 np0005588920 nova_compute[226886]: 2026-01-20 14:32:11.227 226890 DEBUG oslo_concurrency.lockutils [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:32:11 np0005588920 nova_compute[226886]: 2026-01-20 14:32:11.228 226890 DEBUG nova.objects.instance [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 20 09:32:11 np0005588920 nova_compute[226886]: 2026-01-20 14:32:11.332 226890 DEBUG oslo_concurrency.lockutils [None req-ff7d56cb-c559-47a3-9df4-d31ecd7aa8e5 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:32:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:12.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:12.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:13 np0005588920 nova_compute[226886]: 2026-01-20 14:32:13.205 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:13 np0005588920 nova_compute[226886]: 2026-01-20 14:32:13.581 226890 INFO nova.compute.manager [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Rebuilding instance#033[00m
Jan 20 09:32:14 np0005588920 nova_compute[226886]: 2026-01-20 14:32:14.101 226890 DEBUG nova.objects.instance [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] Lazy-loading 'trusted_certs' on Instance uuid ebc2b8c3-8d9f-4798-8865-dd256233f4fc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:32:14 np0005588920 nova_compute[226886]: 2026-01-20 14:32:14.119 226890 DEBUG nova.compute.manager [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:32:14 np0005588920 nova_compute[226886]: 2026-01-20 14:32:14.166 226890 DEBUG nova.objects.instance [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] Lazy-loading 'pci_requests' on Instance uuid ebc2b8c3-8d9f-4798-8865-dd256233f4fc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:32:14 np0005588920 nova_compute[226886]: 2026-01-20 14:32:14.179 226890 DEBUG nova.objects.instance [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] Lazy-loading 'pci_devices' on Instance uuid ebc2b8c3-8d9f-4798-8865-dd256233f4fc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:32:14 np0005588920 nova_compute[226886]: 2026-01-20 14:32:14.199 226890 DEBUG nova.objects.instance [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] Lazy-loading 'resources' on Instance uuid ebc2b8c3-8d9f-4798-8865-dd256233f4fc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:32:14 np0005588920 nova_compute[226886]: 2026-01-20 14:32:14.217 226890 DEBUG nova.objects.instance [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] Lazy-loading 'migration_context' on Instance uuid ebc2b8c3-8d9f-4798-8865-dd256233f4fc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:32:14 np0005588920 nova_compute[226886]: 2026-01-20 14:32:14.233 226890 DEBUG nova.objects.instance [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 20 09:32:14 np0005588920 nova_compute[226886]: 2026-01-20 14:32:14.237 226890 DEBUG nova.virt.libvirt.driver [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 20 09:32:14 np0005588920 nova_compute[226886]: 2026-01-20 14:32:14.359 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:14.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:32:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:14.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:32:16.435 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:32:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:32:16.436 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:32:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:32:16.437 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:32:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:16.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:16.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:16 np0005588920 nova_compute[226886]: 2026-01-20 14:32:16.926 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:32:17 np0005588920 nova_compute[226886]: 2026-01-20 14:32:17.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:32:17 np0005588920 nova_compute[226886]: 2026-01-20 14:32:17.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:32:17 np0005588920 nova_compute[226886]: 2026-01-20 14:32:17.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 09:32:18 np0005588920 nova_compute[226886]: 2026-01-20 14:32:18.208 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:32:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:18.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:32:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 09:32:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:18.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 09:32:18 np0005588920 nova_compute[226886]: 2026-01-20 14:32:18.787 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "refresh_cache-57c49518-7381-4e7d-975f-9c6afc3ea966" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:32:18 np0005588920 nova_compute[226886]: 2026-01-20 14:32:18.788 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquired lock "refresh_cache-57c49518-7381-4e7d-975f-9c6afc3ea966" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:32:18 np0005588920 nova_compute[226886]: 2026-01-20 14:32:18.788 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 09:32:18 np0005588920 nova_compute[226886]: 2026-01-20 14:32:18.788 226890 DEBUG nova.objects.instance [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 57c49518-7381-4e7d-975f-9c6afc3ea966 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:32:19 np0005588920 nova_compute[226886]: 2026-01-20 14:32:19.361 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:19 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:32:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:20.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:20.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:22 np0005588920 nova_compute[226886]: 2026-01-20 14:32:22.207 226890 DEBUG nova.compute.manager [req-7d95e8f8-67fd-41e3-b2b7-3fd60c3a824b req-9bb62234-abf4-490c-9f6a-7898e04ea387 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Received event network-changed-7554fcf8-fae8-4efb-aa74-e25896763129 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:32:22 np0005588920 nova_compute[226886]: 2026-01-20 14:32:22.207 226890 DEBUG nova.compute.manager [req-7d95e8f8-67fd-41e3-b2b7-3fd60c3a824b req-9bb62234-abf4-490c-9f6a-7898e04ea387 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Refreshing instance network info cache due to event network-changed-7554fcf8-fae8-4efb-aa74-e25896763129. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:32:22 np0005588920 nova_compute[226886]: 2026-01-20 14:32:22.208 226890 DEBUG oslo_concurrency.lockutils [req-7d95e8f8-67fd-41e3-b2b7-3fd60c3a824b req-9bb62234-abf4-490c-9f6a-7898e04ea387 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-57c49518-7381-4e7d-975f-9c6afc3ea966" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:32:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:32:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:22.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:32:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:22.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:22 np0005588920 nova_compute[226886]: 2026-01-20 14:32:22.942 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Updating instance_info_cache with network_info: [{"id": "7554fcf8-fae8-4efb-aa74-e25896763129", "address": "fa:16:3e:1d:31:c5", "network": {"id": "42267411-7489-4be1-bea9-a5a8c37215df", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1261387871-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac0564cb541e4c679c6d282fd454c05d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7554fcf8-fa", "ovs_interfaceid": "7554fcf8-fae8-4efb-aa74-e25896763129", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:32:22 np0005588920 nova_compute[226886]: 2026-01-20 14:32:22.975 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Releasing lock "refresh_cache-57c49518-7381-4e7d-975f-9c6afc3ea966" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:32:22 np0005588920 nova_compute[226886]: 2026-01-20 14:32:22.975 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 09:32:22 np0005588920 nova_compute[226886]: 2026-01-20 14:32:22.975 226890 DEBUG oslo_concurrency.lockutils [req-7d95e8f8-67fd-41e3-b2b7-3fd60c3a824b req-9bb62234-abf4-490c-9f6a-7898e04ea387 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-57c49518-7381-4e7d-975f-9c6afc3ea966" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:32:22 np0005588920 nova_compute[226886]: 2026-01-20 14:32:22.976 226890 DEBUG nova.network.neutron [req-7d95e8f8-67fd-41e3-b2b7-3fd60c3a824b req-9bb62234-abf4-490c-9f6a-7898e04ea387 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Refreshing network info cache for port 7554fcf8-fae8-4efb-aa74-e25896763129 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:32:22 np0005588920 nova_compute[226886]: 2026-01-20 14:32:22.977 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:32:22 np0005588920 nova_compute[226886]: 2026-01-20 14:32:22.978 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:32:22 np0005588920 nova_compute[226886]: 2026-01-20 14:32:22.978 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:32:22 np0005588920 nova_compute[226886]: 2026-01-20 14:32:22.978 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:32:22 np0005588920 nova_compute[226886]: 2026-01-20 14:32:22.979 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:32:22 np0005588920 nova_compute[226886]: 2026-01-20 14:32:22.979 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:32:22 np0005588920 nova_compute[226886]: 2026-01-20 14:32:22.980 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:32:23 np0005588920 podman[240914]: 2026-01-20 14:32:23.011343008 +0000 UTC m=+0.088508426 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 20 09:32:23 np0005588920 nova_compute[226886]: 2026-01-20 14:32:23.212 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:23 np0005588920 nova_compute[226886]: 2026-01-20 14:32:23.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:32:23 np0005588920 nova_compute[226886]: 2026-01-20 14:32:23.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:32:23 np0005588920 nova_compute[226886]: 2026-01-20 14:32:23.760 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:32:23 np0005588920 nova_compute[226886]: 2026-01-20 14:32:23.760 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:32:23 np0005588920 nova_compute[226886]: 2026-01-20 14:32:23.761 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:32:23 np0005588920 nova_compute[226886]: 2026-01-20 14:32:23.761 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:32:23 np0005588920 nova_compute[226886]: 2026-01-20 14:32:23.761 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:32:24 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:32:24 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2967665820' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:32:24 np0005588920 nova_compute[226886]: 2026-01-20 14:32:24.197 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:32:24 np0005588920 nova_compute[226886]: 2026-01-20 14:32:24.274 226890 DEBUG nova.virt.libvirt.driver [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 20 09:32:24 np0005588920 nova_compute[226886]: 2026-01-20 14:32:24.349 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-00000022 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:32:24 np0005588920 nova_compute[226886]: 2026-01-20 14:32:24.350 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-00000022 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:32:24 np0005588920 nova_compute[226886]: 2026-01-20 14:32:24.354 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-00000023 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:32:24 np0005588920 nova_compute[226886]: 2026-01-20 14:32:24.354 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-00000023 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:32:24 np0005588920 nova_compute[226886]: 2026-01-20 14:32:24.396 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:24 np0005588920 nova_compute[226886]: 2026-01-20 14:32:24.515 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:32:24 np0005588920 nova_compute[226886]: 2026-01-20 14:32:24.516 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4419MB free_disk=20.900997161865234GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:32:24 np0005588920 nova_compute[226886]: 2026-01-20 14:32:24.517 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:32:24 np0005588920 nova_compute[226886]: 2026-01-20 14:32:24.517 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:32:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:24.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:24 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:32:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:32:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:24.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:32:24 np0005588920 nova_compute[226886]: 2026-01-20 14:32:24.791 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance ebc2b8c3-8d9f-4798-8865-dd256233f4fc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:32:24 np0005588920 nova_compute[226886]: 2026-01-20 14:32:24.791 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance 57c49518-7381-4e7d-975f-9c6afc3ea966 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:32:24 np0005588920 nova_compute[226886]: 2026-01-20 14:32:24.791 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:32:24 np0005588920 nova_compute[226886]: 2026-01-20 14:32:24.792 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:32:24 np0005588920 nova_compute[226886]: 2026-01-20 14:32:24.864 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:32:25 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:32:25 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/157752220' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:32:25 np0005588920 nova_compute[226886]: 2026-01-20 14:32:25.333 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:32:25 np0005588920 nova_compute[226886]: 2026-01-20 14:32:25.341 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:32:25 np0005588920 nova_compute[226886]: 2026-01-20 14:32:25.414 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:32:25 np0005588920 nova_compute[226886]: 2026-01-20 14:32:25.434 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:32:25 np0005588920 nova_compute[226886]: 2026-01-20 14:32:25.435 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.918s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:32:25 np0005588920 nova_compute[226886]: 2026-01-20 14:32:25.583 226890 DEBUG nova.network.neutron [req-7d95e8f8-67fd-41e3-b2b7-3fd60c3a824b req-9bb62234-abf4-490c-9f6a-7898e04ea387 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Updated VIF entry in instance network info cache for port 7554fcf8-fae8-4efb-aa74-e25896763129. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:32:25 np0005588920 nova_compute[226886]: 2026-01-20 14:32:25.583 226890 DEBUG nova.network.neutron [req-7d95e8f8-67fd-41e3-b2b7-3fd60c3a824b req-9bb62234-abf4-490c-9f6a-7898e04ea387 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Updating instance_info_cache with network_info: [{"id": "7554fcf8-fae8-4efb-aa74-e25896763129", "address": "fa:16:3e:1d:31:c5", "network": {"id": "42267411-7489-4be1-bea9-a5a8c37215df", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1261387871-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac0564cb541e4c679c6d282fd454c05d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7554fcf8-fa", "ovs_interfaceid": "7554fcf8-fae8-4efb-aa74-e25896763129", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:32:25 np0005588920 nova_compute[226886]: 2026-01-20 14:32:25.600 226890 DEBUG oslo_concurrency.lockutils [req-7d95e8f8-67fd-41e3-b2b7-3fd60c3a824b req-9bb62234-abf4-490c-9f6a-7898e04ea387 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-57c49518-7381-4e7d-975f-9c6afc3ea966" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:32:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:32:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:26.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:32:26 np0005588920 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000022.scope: Deactivated successfully.
Jan 20 09:32:26 np0005588920 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000022.scope: Consumed 13.079s CPU time.
Jan 20 09:32:26 np0005588920 systemd-machined[196121]: Machine qemu-16-instance-00000022 terminated.
Jan 20 09:32:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:26.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:27 np0005588920 nova_compute[226886]: 2026-01-20 14:32:27.290 226890 INFO nova.virt.libvirt.driver [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Instance shutdown successfully after 13 seconds.#033[00m
Jan 20 09:32:27 np0005588920 nova_compute[226886]: 2026-01-20 14:32:27.297 226890 INFO nova.virt.libvirt.driver [-] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Instance destroyed successfully.#033[00m
Jan 20 09:32:27 np0005588920 nova_compute[226886]: 2026-01-20 14:32:27.305 226890 INFO nova.virt.libvirt.driver [-] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Instance destroyed successfully.#033[00m
Jan 20 09:32:27 np0005588920 nova_compute[226886]: 2026-01-20 14:32:27.670 226890 DEBUG oslo_concurrency.lockutils [None req-4592fd8a-a7cb-4a68-86b8-e2eef47a80ae dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Acquiring lock "57c49518-7381-4e7d-975f-9c6afc3ea966" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:32:27 np0005588920 nova_compute[226886]: 2026-01-20 14:32:27.671 226890 DEBUG oslo_concurrency.lockutils [None req-4592fd8a-a7cb-4a68-86b8-e2eef47a80ae dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Lock "57c49518-7381-4e7d-975f-9c6afc3ea966" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:32:27 np0005588920 nova_compute[226886]: 2026-01-20 14:32:27.671 226890 DEBUG oslo_concurrency.lockutils [None req-4592fd8a-a7cb-4a68-86b8-e2eef47a80ae dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Acquiring lock "57c49518-7381-4e7d-975f-9c6afc3ea966-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:32:27 np0005588920 nova_compute[226886]: 2026-01-20 14:32:27.672 226890 DEBUG oslo_concurrency.lockutils [None req-4592fd8a-a7cb-4a68-86b8-e2eef47a80ae dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Lock "57c49518-7381-4e7d-975f-9c6afc3ea966-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:32:27 np0005588920 nova_compute[226886]: 2026-01-20 14:32:27.672 226890 DEBUG oslo_concurrency.lockutils [None req-4592fd8a-a7cb-4a68-86b8-e2eef47a80ae dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Lock "57c49518-7381-4e7d-975f-9c6afc3ea966-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:32:27 np0005588920 nova_compute[226886]: 2026-01-20 14:32:27.674 226890 INFO nova.compute.manager [None req-4592fd8a-a7cb-4a68-86b8-e2eef47a80ae dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Terminating instance#033[00m
Jan 20 09:32:27 np0005588920 nova_compute[226886]: 2026-01-20 14:32:27.675 226890 DEBUG nova.compute.manager [None req-4592fd8a-a7cb-4a68-86b8-e2eef47a80ae dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:32:27 np0005588920 kernel: tap7554fcf8-fa (unregistering): left promiscuous mode
Jan 20 09:32:27 np0005588920 NetworkManager[49076]: <info>  [1768919547.7332] device (tap7554fcf8-fa): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:32:27 np0005588920 nova_compute[226886]: 2026-01-20 14:32:27.747 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:27 np0005588920 ovn_controller[133971]: 2026-01-20T14:32:27Z|00099|binding|INFO|Releasing lport 7554fcf8-fae8-4efb-aa74-e25896763129 from this chassis (sb_readonly=0)
Jan 20 09:32:27 np0005588920 ovn_controller[133971]: 2026-01-20T14:32:27Z|00100|binding|INFO|Setting lport 7554fcf8-fae8-4efb-aa74-e25896763129 down in Southbound
Jan 20 09:32:27 np0005588920 ovn_controller[133971]: 2026-01-20T14:32:27Z|00101|binding|INFO|Removing iface tap7554fcf8-fa ovn-installed in OVS
Jan 20 09:32:27 np0005588920 nova_compute[226886]: 2026-01-20 14:32:27.751 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:27 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:32:27.761 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1d:31:c5 10.100.0.9'], port_security=['fa:16:3e:1d:31:c5 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '57c49518-7381-4e7d-975f-9c6afc3ea966', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-42267411-7489-4be1-bea9-a5a8c37215df', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac0564cb541e4c679c6d282fd454c05d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '18113a43-33ad-4733-bb67-ffdaa13f93ec', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e9d97f7f-5f3a-4a28-99e4-dc5cb30f9bdc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=7554fcf8-fae8-4efb-aa74-e25896763129) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:32:27 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:32:27.764 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 7554fcf8-fae8-4efb-aa74-e25896763129 in datapath 42267411-7489-4be1-bea9-a5a8c37215df unbound from our chassis#033[00m
Jan 20 09:32:27 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:32:27.766 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 42267411-7489-4be1-bea9-a5a8c37215df, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:32:27 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:32:27.768 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e86e25bc-b272-4ecf-a16a-f3ae115623b3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:32:27 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:32:27.769 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-42267411-7489-4be1-bea9-a5a8c37215df namespace which is not needed anymore#033[00m
Jan 20 09:32:27 np0005588920 nova_compute[226886]: 2026-01-20 14:32:27.780 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:27 np0005588920 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000023.scope: Deactivated successfully.
Jan 20 09:32:27 np0005588920 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000023.scope: Consumed 13.796s CPU time.
Jan 20 09:32:27 np0005588920 systemd-machined[196121]: Machine qemu-15-instance-00000023 terminated.
Jan 20 09:32:27 np0005588920 NetworkManager[49076]: <info>  [1768919547.8934] manager: (tap7554fcf8-fa): new Tun device (/org/freedesktop/NetworkManager/Devices/61)
Jan 20 09:32:27 np0005588920 nova_compute[226886]: 2026-01-20 14:32:27.894 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:27 np0005588920 nova_compute[226886]: 2026-01-20 14:32:27.900 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:27 np0005588920 nova_compute[226886]: 2026-01-20 14:32:27.913 226890 INFO nova.virt.libvirt.driver [-] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Instance destroyed successfully.#033[00m
Jan 20 09:32:27 np0005588920 nova_compute[226886]: 2026-01-20 14:32:27.914 226890 DEBUG nova.objects.instance [None req-4592fd8a-a7cb-4a68-86b8-e2eef47a80ae dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Lazy-loading 'resources' on Instance uuid 57c49518-7381-4e7d-975f-9c6afc3ea966 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:32:27 np0005588920 nova_compute[226886]: 2026-01-20 14:32:27.931 226890 DEBUG nova.virt.libvirt.vif [None req-4592fd8a-a7cb-4a68-86b8-e2eef47a80ae dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:31:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-FloatingIPsAssociationNegativeTestJSON-server-617899621',display_name='tempest-FloatingIPsAssociationNegativeTestJSON-server-617899621',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-floatingipsassociationnegativetestjson-server-617899621',id=35,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:31:55Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ac0564cb541e4c679c6d282fd454c05d',ramdisk_id='',reservation_id='r-crhbpp8e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIPsAssociationNegativeTestJSON-1763144881',owner_user_name='tempest-FloatingIPsAssociationNegativeTestJSON-1763144881-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:31:55Z,user_data=None,user_id='dd0483da65fa4225846c0cc91e8e0275',uuid=57c49518-7381-4e7d-975f-9c6afc3ea966,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7554fcf8-fae8-4efb-aa74-e25896763129", "address": "fa:16:3e:1d:31:c5", "network": {"id": "42267411-7489-4be1-bea9-a5a8c37215df", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1261387871-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac0564cb541e4c679c6d282fd454c05d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7554fcf8-fa", "ovs_interfaceid": "7554fcf8-fae8-4efb-aa74-e25896763129", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:32:27 np0005588920 nova_compute[226886]: 2026-01-20 14:32:27.932 226890 DEBUG nova.network.os_vif_util [None req-4592fd8a-a7cb-4a68-86b8-e2eef47a80ae dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Converting VIF {"id": "7554fcf8-fae8-4efb-aa74-e25896763129", "address": "fa:16:3e:1d:31:c5", "network": {"id": "42267411-7489-4be1-bea9-a5a8c37215df", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1261387871-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac0564cb541e4c679c6d282fd454c05d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7554fcf8-fa", "ovs_interfaceid": "7554fcf8-fae8-4efb-aa74-e25896763129", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:32:27 np0005588920 nova_compute[226886]: 2026-01-20 14:32:27.933 226890 DEBUG nova.network.os_vif_util [None req-4592fd8a-a7cb-4a68-86b8-e2eef47a80ae dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1d:31:c5,bridge_name='br-int',has_traffic_filtering=True,id=7554fcf8-fae8-4efb-aa74-e25896763129,network=Network(42267411-7489-4be1-bea9-a5a8c37215df),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7554fcf8-fa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:32:27 np0005588920 nova_compute[226886]: 2026-01-20 14:32:27.934 226890 DEBUG os_vif [None req-4592fd8a-a7cb-4a68-86b8-e2eef47a80ae dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1d:31:c5,bridge_name='br-int',has_traffic_filtering=True,id=7554fcf8-fae8-4efb-aa74-e25896763129,network=Network(42267411-7489-4be1-bea9-a5a8c37215df),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7554fcf8-fa') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:32:27 np0005588920 nova_compute[226886]: 2026-01-20 14:32:27.936 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:27 np0005588920 nova_compute[226886]: 2026-01-20 14:32:27.937 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7554fcf8-fa, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:32:27 np0005588920 nova_compute[226886]: 2026-01-20 14:32:27.938 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:27 np0005588920 nova_compute[226886]: 2026-01-20 14:32:27.939 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:27 np0005588920 nova_compute[226886]: 2026-01-20 14:32:27.944 226890 INFO os_vif [None req-4592fd8a-a7cb-4a68-86b8-e2eef47a80ae dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1d:31:c5,bridge_name='br-int',has_traffic_filtering=True,id=7554fcf8-fae8-4efb-aa74-e25896763129,network=Network(42267411-7489-4be1-bea9-a5a8c37215df),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7554fcf8-fa')#033[00m
Jan 20 09:32:27 np0005588920 neutron-haproxy-ovnmeta-42267411-7489-4be1-bea9-a5a8c37215df[240493]: [NOTICE]   (240497) : haproxy version is 2.8.14-c23fe91
Jan 20 09:32:27 np0005588920 neutron-haproxy-ovnmeta-42267411-7489-4be1-bea9-a5a8c37215df[240493]: [NOTICE]   (240497) : path to executable is /usr/sbin/haproxy
Jan 20 09:32:27 np0005588920 neutron-haproxy-ovnmeta-42267411-7489-4be1-bea9-a5a8c37215df[240493]: [WARNING]  (240497) : Exiting Master process...
Jan 20 09:32:27 np0005588920 neutron-haproxy-ovnmeta-42267411-7489-4be1-bea9-a5a8c37215df[240493]: [ALERT]    (240497) : Current worker (240499) exited with code 143 (Terminated)
Jan 20 09:32:27 np0005588920 neutron-haproxy-ovnmeta-42267411-7489-4be1-bea9-a5a8c37215df[240493]: [WARNING]  (240497) : All workers exited. Exiting... (0)
Jan 20 09:32:27 np0005588920 systemd[1]: libpod-f0658c630f6219cdd195e9abf6d8a8af596b386aa757da8e85ee60e2dc8786d6.scope: Deactivated successfully.
Jan 20 09:32:27 np0005588920 podman[241025]: 2026-01-20 14:32:27.960627638 +0000 UTC m=+0.056499261 container died f0658c630f6219cdd195e9abf6d8a8af596b386aa757da8e85ee60e2dc8786d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42267411-7489-4be1-bea9-a5a8c37215df, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:32:28 np0005588920 systemd[1]: var-lib-containers-storage-overlay-062b87c98f50a6083a70429521c4fae820e3d7a1004760c0a37db60ffd08a9ee-merged.mount: Deactivated successfully.
Jan 20 09:32:28 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f0658c630f6219cdd195e9abf6d8a8af596b386aa757da8e85ee60e2dc8786d6-userdata-shm.mount: Deactivated successfully.
Jan 20 09:32:28 np0005588920 podman[241025]: 2026-01-20 14:32:28.012222715 +0000 UTC m=+0.108094298 container cleanup f0658c630f6219cdd195e9abf6d8a8af596b386aa757da8e85ee60e2dc8786d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42267411-7489-4be1-bea9-a5a8c37215df, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 20 09:32:28 np0005588920 systemd[1]: libpod-conmon-f0658c630f6219cdd195e9abf6d8a8af596b386aa757da8e85ee60e2dc8786d6.scope: Deactivated successfully.
Jan 20 09:32:28 np0005588920 podman[241083]: 2026-01-20 14:32:28.080312844 +0000 UTC m=+0.046065807 container remove f0658c630f6219cdd195e9abf6d8a8af596b386aa757da8e85ee60e2dc8786d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42267411-7489-4be1-bea9-a5a8c37215df, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 20 09:32:28 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:32:28.086 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[fd68f69f-2b85-49a6-916a-4753e4b18a94]: (4, ('Tue Jan 20 02:32:27 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-42267411-7489-4be1-bea9-a5a8c37215df (f0658c630f6219cdd195e9abf6d8a8af596b386aa757da8e85ee60e2dc8786d6)\nf0658c630f6219cdd195e9abf6d8a8af596b386aa757da8e85ee60e2dc8786d6\nTue Jan 20 02:32:28 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-42267411-7489-4be1-bea9-a5a8c37215df (f0658c630f6219cdd195e9abf6d8a8af596b386aa757da8e85ee60e2dc8786d6)\nf0658c630f6219cdd195e9abf6d8a8af596b386aa757da8e85ee60e2dc8786d6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:32:28 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:32:28.088 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5500e16e-4c33-4748-9877-6d566c6cab3b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:32:28 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:32:28.089 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap42267411-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:32:28 np0005588920 kernel: tap42267411-70: left promiscuous mode
Jan 20 09:32:28 np0005588920 nova_compute[226886]: 2026-01-20 14:32:28.091 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:28 np0005588920 nova_compute[226886]: 2026-01-20 14:32:28.092 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:28 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:32:28.094 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[802a442a-98c7-4c5b-9039-090ff76f1832]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:32:28 np0005588920 nova_compute[226886]: 2026-01-20 14:32:28.120 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:28 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:32:28.122 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[3fc5aacc-b99b-479f-810f-300d10f54277]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:32:28 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:32:28.123 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[fbc9abb4-7430-49bf-bc1a-1943b7a89774]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:32:28 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:32:28.136 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[186aebac-3402-479c-b45e-56949bd39ef4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 455614, 'reachable_time': 31787, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241099, 'error': None, 'target': 'ovnmeta-42267411-7489-4be1-bea9-a5a8c37215df', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:32:28 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:32:28.138 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-42267411-7489-4be1-bea9-a5a8c37215df deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:32:28 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:32:28.139 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[751d3964-392c-49c6-afd7-41f28c6ebeb5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:32:28 np0005588920 systemd[1]: run-netns-ovnmeta\x2d42267411\x2d7489\x2d4be1\x2dbea9\x2da5a8c37215df.mount: Deactivated successfully.
Jan 20 09:32:28 np0005588920 nova_compute[226886]: 2026-01-20 14:32:28.172 226890 DEBUG nova.compute.manager [req-23b1433e-cad0-45cc-b268-b28180e46d98 req-2402f498-a3cc-4214-8821-f5a2ad2dbae4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Received event network-vif-unplugged-7554fcf8-fae8-4efb-aa74-e25896763129 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:32:28 np0005588920 nova_compute[226886]: 2026-01-20 14:32:28.173 226890 DEBUG oslo_concurrency.lockutils [req-23b1433e-cad0-45cc-b268-b28180e46d98 req-2402f498-a3cc-4214-8821-f5a2ad2dbae4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "57c49518-7381-4e7d-975f-9c6afc3ea966-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:32:28 np0005588920 nova_compute[226886]: 2026-01-20 14:32:28.173 226890 DEBUG oslo_concurrency.lockutils [req-23b1433e-cad0-45cc-b268-b28180e46d98 req-2402f498-a3cc-4214-8821-f5a2ad2dbae4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "57c49518-7381-4e7d-975f-9c6afc3ea966-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:32:28 np0005588920 nova_compute[226886]: 2026-01-20 14:32:28.173 226890 DEBUG oslo_concurrency.lockutils [req-23b1433e-cad0-45cc-b268-b28180e46d98 req-2402f498-a3cc-4214-8821-f5a2ad2dbae4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "57c49518-7381-4e7d-975f-9c6afc3ea966-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:32:28 np0005588920 nova_compute[226886]: 2026-01-20 14:32:28.174 226890 DEBUG nova.compute.manager [req-23b1433e-cad0-45cc-b268-b28180e46d98 req-2402f498-a3cc-4214-8821-f5a2ad2dbae4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] No waiting events found dispatching network-vif-unplugged-7554fcf8-fae8-4efb-aa74-e25896763129 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:32:28 np0005588920 nova_compute[226886]: 2026-01-20 14:32:28.174 226890 DEBUG nova.compute.manager [req-23b1433e-cad0-45cc-b268-b28180e46d98 req-2402f498-a3cc-4214-8821-f5a2ad2dbae4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Received event network-vif-unplugged-7554fcf8-fae8-4efb-aa74-e25896763129 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:32:28 np0005588920 nova_compute[226886]: 2026-01-20 14:32:28.400 226890 INFO nova.virt.libvirt.driver [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Deleting instance files /var/lib/nova/instances/ebc2b8c3-8d9f-4798-8865-dd256233f4fc_del#033[00m
Jan 20 09:32:28 np0005588920 nova_compute[226886]: 2026-01-20 14:32:28.401 226890 INFO nova.virt.libvirt.driver [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Deletion of /var/lib/nova/instances/ebc2b8c3-8d9f-4798-8865-dd256233f4fc_del complete#033[00m
Jan 20 09:32:28 np0005588920 nova_compute[226886]: 2026-01-20 14:32:28.462 226890 INFO nova.virt.libvirt.driver [None req-4592fd8a-a7cb-4a68-86b8-e2eef47a80ae dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Deleting instance files /var/lib/nova/instances/57c49518-7381-4e7d-975f-9c6afc3ea966_del#033[00m
Jan 20 09:32:28 np0005588920 nova_compute[226886]: 2026-01-20 14:32:28.463 226890 INFO nova.virt.libvirt.driver [None req-4592fd8a-a7cb-4a68-86b8-e2eef47a80ae dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Deletion of /var/lib/nova/instances/57c49518-7381-4e7d-975f-9c6afc3ea966_del complete#033[00m
Jan 20 09:32:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:32:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:28.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:32:28 np0005588920 nova_compute[226886]: 2026-01-20 14:32:28.591 226890 INFO nova.compute.manager [None req-4592fd8a-a7cb-4a68-86b8-e2eef47a80ae dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Took 0.92 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:32:28 np0005588920 nova_compute[226886]: 2026-01-20 14:32:28.591 226890 DEBUG oslo.service.loopingcall [None req-4592fd8a-a7cb-4a68-86b8-e2eef47a80ae dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:32:28 np0005588920 nova_compute[226886]: 2026-01-20 14:32:28.592 226890 DEBUG nova.compute.manager [-] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:32:28 np0005588920 nova_compute[226886]: 2026-01-20 14:32:28.592 226890 DEBUG nova.network.neutron [-] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:32:28 np0005588920 nova_compute[226886]: 2026-01-20 14:32:28.620 226890 DEBUG nova.virt.libvirt.driver [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:32:28 np0005588920 nova_compute[226886]: 2026-01-20 14:32:28.620 226890 INFO nova.virt.libvirt.driver [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Creating image(s)#033[00m
Jan 20 09:32:28 np0005588920 nova_compute[226886]: 2026-01-20 14:32:28.651 226890 DEBUG nova.storage.rbd_utils [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] rbd image ebc2b8c3-8d9f-4798-8865-dd256233f4fc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:32:28 np0005588920 nova_compute[226886]: 2026-01-20 14:32:28.684 226890 DEBUG nova.storage.rbd_utils [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] rbd image ebc2b8c3-8d9f-4798-8865-dd256233f4fc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:32:28 np0005588920 nova_compute[226886]: 2026-01-20 14:32:28.714 226890 DEBUG nova.storage.rbd_utils [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] rbd image ebc2b8c3-8d9f-4798-8865-dd256233f4fc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:32:28 np0005588920 nova_compute[226886]: 2026-01-20 14:32:28.719 226890 DEBUG oslo_concurrency.processutils [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:32:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:28.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:28 np0005588920 nova_compute[226886]: 2026-01-20 14:32:28.803 226890 DEBUG oslo_concurrency.processutils [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:32:28 np0005588920 nova_compute[226886]: 2026-01-20 14:32:28.804 226890 DEBUG oslo_concurrency.lockutils [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:32:28 np0005588920 nova_compute[226886]: 2026-01-20 14:32:28.805 226890 DEBUG oslo_concurrency.lockutils [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:32:28 np0005588920 nova_compute[226886]: 2026-01-20 14:32:28.805 226890 DEBUG oslo_concurrency.lockutils [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:32:28 np0005588920 nova_compute[226886]: 2026-01-20 14:32:28.831 226890 DEBUG nova.storage.rbd_utils [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] rbd image ebc2b8c3-8d9f-4798-8865-dd256233f4fc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:32:28 np0005588920 nova_compute[226886]: 2026-01-20 14:32:28.834 226890 DEBUG oslo_concurrency.processutils [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 ebc2b8c3-8d9f-4798-8865-dd256233f4fc_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:32:29 np0005588920 nova_compute[226886]: 2026-01-20 14:32:29.175 226890 DEBUG oslo_concurrency.processutils [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 ebc2b8c3-8d9f-4798-8865-dd256233f4fc_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.341s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:32:29 np0005588920 nova_compute[226886]: 2026-01-20 14:32:29.260 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:29 np0005588920 nova_compute[226886]: 2026-01-20 14:32:29.266 226890 DEBUG nova.storage.rbd_utils [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] resizing rbd image ebc2b8c3-8d9f-4798-8865-dd256233f4fc_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:32:29 np0005588920 nova_compute[226886]: 2026-01-20 14:32:29.353 226890 DEBUG nova.network.neutron [-] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:32:29 np0005588920 nova_compute[226886]: 2026-01-20 14:32:29.359 226890 DEBUG nova.virt.libvirt.driver [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:32:29 np0005588920 nova_compute[226886]: 2026-01-20 14:32:29.360 226890 DEBUG nova.virt.libvirt.driver [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Ensure instance console log exists: /var/lib/nova/instances/ebc2b8c3-8d9f-4798-8865-dd256233f4fc/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:32:29 np0005588920 nova_compute[226886]: 2026-01-20 14:32:29.360 226890 DEBUG oslo_concurrency.lockutils [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:32:29 np0005588920 nova_compute[226886]: 2026-01-20 14:32:29.360 226890 DEBUG oslo_concurrency.lockutils [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:32:29 np0005588920 nova_compute[226886]: 2026-01-20 14:32:29.360 226890 DEBUG oslo_concurrency.lockutils [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:32:29 np0005588920 nova_compute[226886]: 2026-01-20 14:32:29.362 226890 DEBUG nova.virt.libvirt.driver [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:32:29 np0005588920 nova_compute[226886]: 2026-01-20 14:32:29.366 226890 WARNING nova.virt.libvirt.driver [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Jan 20 09:32:29 np0005588920 nova_compute[226886]: 2026-01-20 14:32:29.371 226890 DEBUG nova.virt.libvirt.host [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:32:29 np0005588920 nova_compute[226886]: 2026-01-20 14:32:29.371 226890 DEBUG nova.virt.libvirt.host [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:32:29 np0005588920 nova_compute[226886]: 2026-01-20 14:32:29.374 226890 DEBUG nova.virt.libvirt.host [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:32:29 np0005588920 nova_compute[226886]: 2026-01-20 14:32:29.374 226890 DEBUG nova.virt.libvirt.host [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:32:29 np0005588920 nova_compute[226886]: 2026-01-20 14:32:29.375 226890 DEBUG nova.virt.libvirt.driver [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:32:29 np0005588920 nova_compute[226886]: 2026-01-20 14:32:29.375 226890 DEBUG nova.virt.hardware [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:32:29 np0005588920 nova_compute[226886]: 2026-01-20 14:32:29.376 226890 DEBUG nova.virt.hardware [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:32:29 np0005588920 nova_compute[226886]: 2026-01-20 14:32:29.376 226890 DEBUG nova.virt.hardware [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:32:29 np0005588920 nova_compute[226886]: 2026-01-20 14:32:29.376 226890 DEBUG nova.virt.hardware [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:32:29 np0005588920 nova_compute[226886]: 2026-01-20 14:32:29.376 226890 DEBUG nova.virt.hardware [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:32:29 np0005588920 nova_compute[226886]: 2026-01-20 14:32:29.376 226890 DEBUG nova.virt.hardware [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:32:29 np0005588920 nova_compute[226886]: 2026-01-20 14:32:29.376 226890 DEBUG nova.virt.hardware [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:32:29 np0005588920 nova_compute[226886]: 2026-01-20 14:32:29.377 226890 DEBUG nova.virt.hardware [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:32:29 np0005588920 nova_compute[226886]: 2026-01-20 14:32:29.377 226890 DEBUG nova.virt.hardware [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:32:29 np0005588920 nova_compute[226886]: 2026-01-20 14:32:29.377 226890 DEBUG nova.virt.hardware [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:32:29 np0005588920 nova_compute[226886]: 2026-01-20 14:32:29.377 226890 DEBUG nova.virt.hardware [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:32:29 np0005588920 nova_compute[226886]: 2026-01-20 14:32:29.377 226890 DEBUG nova.objects.instance [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] Lazy-loading 'vcpu_model' on Instance uuid ebc2b8c3-8d9f-4798-8865-dd256233f4fc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:32:29 np0005588920 nova_compute[226886]: 2026-01-20 14:32:29.398 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:29 np0005588920 nova_compute[226886]: 2026-01-20 14:32:29.458 226890 INFO nova.compute.manager [-] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Took 0.87 seconds to deallocate network for instance.#033[00m
Jan 20 09:32:29 np0005588920 nova_compute[226886]: 2026-01-20 14:32:29.470 226890 DEBUG oslo_concurrency.processutils [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:32:29 np0005588920 nova_compute[226886]: 2026-01-20 14:32:29.511 226890 DEBUG nova.compute.manager [req-f0caa6f3-42bb-47d7-823d-3a487479520e req-fe6c61c7-7008-4090-8807-9255ec7d8b15 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Received event network-vif-deleted-7554fcf8-fae8-4efb-aa74-e25896763129 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:32:29 np0005588920 nova_compute[226886]: 2026-01-20 14:32:29.514 226890 DEBUG oslo_concurrency.lockutils [None req-4592fd8a-a7cb-4a68-86b8-e2eef47a80ae dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:32:29 np0005588920 nova_compute[226886]: 2026-01-20 14:32:29.514 226890 DEBUG oslo_concurrency.lockutils [None req-4592fd8a-a7cb-4a68-86b8-e2eef47a80ae dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:32:29 np0005588920 nova_compute[226886]: 2026-01-20 14:32:29.586 226890 DEBUG oslo_concurrency.processutils [None req-4592fd8a-a7cb-4a68-86b8-e2eef47a80ae dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:32:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:32:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:32:29 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4060471299' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:32:29 np0005588920 nova_compute[226886]: 2026-01-20 14:32:29.975 226890 DEBUG oslo_concurrency.processutils [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.505s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:32:29 np0005588920 nova_compute[226886]: 2026-01-20 14:32:29.998 226890 DEBUG nova.storage.rbd_utils [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] rbd image ebc2b8c3-8d9f-4798-8865-dd256233f4fc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:32:30 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:32:30 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/115828176' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:32:30 np0005588920 nova_compute[226886]: 2026-01-20 14:32:30.002 226890 DEBUG oslo_concurrency.processutils [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:32:30 np0005588920 nova_compute[226886]: 2026-01-20 14:32:30.021 226890 DEBUG oslo_concurrency.processutils [None req-4592fd8a-a7cb-4a68-86b8-e2eef47a80ae dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:32:30 np0005588920 nova_compute[226886]: 2026-01-20 14:32:30.028 226890 DEBUG nova.compute.provider_tree [None req-4592fd8a-a7cb-4a68-86b8-e2eef47a80ae dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:32:30 np0005588920 nova_compute[226886]: 2026-01-20 14:32:30.076 226890 DEBUG nova.scheduler.client.report [None req-4592fd8a-a7cb-4a68-86b8-e2eef47a80ae dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:32:30 np0005588920 nova_compute[226886]: 2026-01-20 14:32:30.103 226890 DEBUG oslo_concurrency.lockutils [None req-4592fd8a-a7cb-4a68-86b8-e2eef47a80ae dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.588s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:32:30 np0005588920 nova_compute[226886]: 2026-01-20 14:32:30.178 226890 INFO nova.scheduler.client.report [None req-4592fd8a-a7cb-4a68-86b8-e2eef47a80ae dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Deleted allocations for instance 57c49518-7381-4e7d-975f-9c6afc3ea966#033[00m
Jan 20 09:32:30 np0005588920 nova_compute[226886]: 2026-01-20 14:32:30.244 226890 DEBUG oslo_concurrency.lockutils [None req-4592fd8a-a7cb-4a68-86b8-e2eef47a80ae dd0483da65fa4225846c0cc91e8e0275 ac0564cb541e4c679c6d282fd454c05d - - default default] Lock "57c49518-7381-4e7d-975f-9c6afc3ea966" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.573s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:32:30 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:32:30 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/830453568' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:32:30 np0005588920 nova_compute[226886]: 2026-01-20 14:32:30.420 226890 DEBUG oslo_concurrency.processutils [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.417s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:32:30 np0005588920 nova_compute[226886]: 2026-01-20 14:32:30.424 226890 DEBUG nova.virt.libvirt.driver [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:32:30 np0005588920 nova_compute[226886]:  <uuid>ebc2b8c3-8d9f-4798-8865-dd256233f4fc</uuid>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:  <name>instance-00000022</name>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:32:30 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:      <nova:name>tempest-ServersAdmin275Test-server-625178373</nova:name>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:32:29</nova:creationTime>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:32:30 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:        <nova:user uuid="3d51de2ad98d40d8ad12305518d106fd">tempest-ServersAdmin275Test-1570927802-project-member</nova:user>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:        <nova:project uuid="389fafaa99e14f31988005de907401bf">tempest-ServersAdmin275Test-1570927802</nova:project>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:      <nova:ports/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:      <entry name="serial">ebc2b8c3-8d9f-4798-8865-dd256233f4fc</entry>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:      <entry name="uuid">ebc2b8c3-8d9f-4798-8865-dd256233f4fc</entry>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:32:30 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/ebc2b8c3-8d9f-4798-8865-dd256233f4fc_disk">
Jan 20 09:32:30 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:32:30 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:32:30 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/ebc2b8c3-8d9f-4798-8865-dd256233f4fc_disk.config">
Jan 20 09:32:30 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:32:30 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:32:30 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/ebc2b8c3-8d9f-4798-8865-dd256233f4fc/console.log" append="off"/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:32:30 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:32:30 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:32:30 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:32:30 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:32:30 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:32:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:30.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:30 np0005588920 nova_compute[226886]: 2026-01-20 14:32:30.622 226890 DEBUG nova.compute.manager [req-63bf758e-413c-414b-8770-8663aad773ea req-48d63fae-3f78-4659-8b42-18f90e2585cc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Received event network-vif-plugged-7554fcf8-fae8-4efb-aa74-e25896763129 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:32:30 np0005588920 nova_compute[226886]: 2026-01-20 14:32:30.622 226890 DEBUG oslo_concurrency.lockutils [req-63bf758e-413c-414b-8770-8663aad773ea req-48d63fae-3f78-4659-8b42-18f90e2585cc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "57c49518-7381-4e7d-975f-9c6afc3ea966-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:32:30 np0005588920 nova_compute[226886]: 2026-01-20 14:32:30.623 226890 DEBUG oslo_concurrency.lockutils [req-63bf758e-413c-414b-8770-8663aad773ea req-48d63fae-3f78-4659-8b42-18f90e2585cc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "57c49518-7381-4e7d-975f-9c6afc3ea966-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:32:30 np0005588920 nova_compute[226886]: 2026-01-20 14:32:30.623 226890 DEBUG oslo_concurrency.lockutils [req-63bf758e-413c-414b-8770-8663aad773ea req-48d63fae-3f78-4659-8b42-18f90e2585cc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "57c49518-7381-4e7d-975f-9c6afc3ea966-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:32:30 np0005588920 nova_compute[226886]: 2026-01-20 14:32:30.624 226890 DEBUG nova.compute.manager [req-63bf758e-413c-414b-8770-8663aad773ea req-48d63fae-3f78-4659-8b42-18f90e2585cc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] No waiting events found dispatching network-vif-plugged-7554fcf8-fae8-4efb-aa74-e25896763129 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:32:30 np0005588920 nova_compute[226886]: 2026-01-20 14:32:30.624 226890 WARNING nova.compute.manager [req-63bf758e-413c-414b-8770-8663aad773ea req-48d63fae-3f78-4659-8b42-18f90e2585cc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Received unexpected event network-vif-plugged-7554fcf8-fae8-4efb-aa74-e25896763129 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 09:32:30 np0005588920 nova_compute[226886]: 2026-01-20 14:32:30.662 226890 DEBUG nova.virt.libvirt.driver [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:32:30 np0005588920 nova_compute[226886]: 2026-01-20 14:32:30.662 226890 DEBUG nova.virt.libvirt.driver [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:32:30 np0005588920 nova_compute[226886]: 2026-01-20 14:32:30.663 226890 INFO nova.virt.libvirt.driver [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Using config drive#033[00m
Jan 20 09:32:30 np0005588920 nova_compute[226886]: 2026-01-20 14:32:30.691 226890 DEBUG nova.storage.rbd_utils [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] rbd image ebc2b8c3-8d9f-4798-8865-dd256233f4fc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:32:30 np0005588920 nova_compute[226886]: 2026-01-20 14:32:30.713 226890 DEBUG nova.objects.instance [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] Lazy-loading 'ec2_ids' on Instance uuid ebc2b8c3-8d9f-4798-8865-dd256233f4fc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:32:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:30 np0005588920 nova_compute[226886]: 2026-01-20 14:32:30.761 226890 DEBUG nova.objects.instance [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] Lazy-loading 'keypairs' on Instance uuid ebc2b8c3-8d9f-4798-8865-dd256233f4fc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:32:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:30.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:31 np0005588920 nova_compute[226886]: 2026-01-20 14:32:31.045 226890 INFO nova.virt.libvirt.driver [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Creating config drive at /var/lib/nova/instances/ebc2b8c3-8d9f-4798-8865-dd256233f4fc/disk.config#033[00m
Jan 20 09:32:31 np0005588920 nova_compute[226886]: 2026-01-20 14:32:31.054 226890 DEBUG oslo_concurrency.processutils [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ebc2b8c3-8d9f-4798-8865-dd256233f4fc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2b9mby0n execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:32:31 np0005588920 nova_compute[226886]: 2026-01-20 14:32:31.200 226890 DEBUG oslo_concurrency.processutils [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ebc2b8c3-8d9f-4798-8865-dd256233f4fc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2b9mby0n" returned: 0 in 0.146s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:32:31 np0005588920 nova_compute[226886]: 2026-01-20 14:32:31.241 226890 DEBUG nova.storage.rbd_utils [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] rbd image ebc2b8c3-8d9f-4798-8865-dd256233f4fc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:32:31 np0005588920 nova_compute[226886]: 2026-01-20 14:32:31.245 226890 DEBUG oslo_concurrency.processutils [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/ebc2b8c3-8d9f-4798-8865-dd256233f4fc/disk.config ebc2b8c3-8d9f-4798-8865-dd256233f4fc_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:32:31 np0005588920 nova_compute[226886]: 2026-01-20 14:32:31.419 226890 DEBUG oslo_concurrency.processutils [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/ebc2b8c3-8d9f-4798-8865-dd256233f4fc/disk.config ebc2b8c3-8d9f-4798-8865-dd256233f4fc_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.173s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:32:31 np0005588920 nova_compute[226886]: 2026-01-20 14:32:31.420 226890 INFO nova.virt.libvirt.driver [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Deleting local config drive /var/lib/nova/instances/ebc2b8c3-8d9f-4798-8865-dd256233f4fc/disk.config because it was imported into RBD.#033[00m
Jan 20 09:32:31 np0005588920 systemd-machined[196121]: New machine qemu-17-instance-00000022.
Jan 20 09:32:31 np0005588920 systemd[1]: Started Virtual Machine qemu-17-instance-00000022.
Jan 20 09:32:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:32.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:32 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:32:32 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:32:32 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:32:32 np0005588920 nova_compute[226886]: 2026-01-20 14:32:32.724 226890 DEBUG nova.virt.libvirt.host [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Removed pending event for ebc2b8c3-8d9f-4798-8865-dd256233f4fc due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 20 09:32:32 np0005588920 nova_compute[226886]: 2026-01-20 14:32:32.725 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919552.7225285, ebc2b8c3-8d9f-4798-8865-dd256233f4fc => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:32:32 np0005588920 nova_compute[226886]: 2026-01-20 14:32:32.726 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:32:32 np0005588920 nova_compute[226886]: 2026-01-20 14:32:32.730 226890 DEBUG nova.compute.manager [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:32:32 np0005588920 nova_compute[226886]: 2026-01-20 14:32:32.731 226890 DEBUG nova.virt.libvirt.driver [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:32:32 np0005588920 nova_compute[226886]: 2026-01-20 14:32:32.736 226890 INFO nova.virt.libvirt.driver [-] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Instance spawned successfully.#033[00m
Jan 20 09:32:32 np0005588920 nova_compute[226886]: 2026-01-20 14:32:32.737 226890 DEBUG nova.virt.libvirt.driver [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:32:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:32.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:32 np0005588920 nova_compute[226886]: 2026-01-20 14:32:32.771 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:32:32 np0005588920 nova_compute[226886]: 2026-01-20 14:32:32.777 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:32:32 np0005588920 nova_compute[226886]: 2026-01-20 14:32:32.779 226890 DEBUG nova.virt.libvirt.driver [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:32:32 np0005588920 nova_compute[226886]: 2026-01-20 14:32:32.780 226890 DEBUG nova.virt.libvirt.driver [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:32:32 np0005588920 nova_compute[226886]: 2026-01-20 14:32:32.780 226890 DEBUG nova.virt.libvirt.driver [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:32:32 np0005588920 nova_compute[226886]: 2026-01-20 14:32:32.780 226890 DEBUG nova.virt.libvirt.driver [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:32:32 np0005588920 nova_compute[226886]: 2026-01-20 14:32:32.781 226890 DEBUG nova.virt.libvirt.driver [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:32:32 np0005588920 nova_compute[226886]: 2026-01-20 14:32:32.781 226890 DEBUG nova.virt.libvirt.driver [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:32:32 np0005588920 nova_compute[226886]: 2026-01-20 14:32:32.819 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 20 09:32:32 np0005588920 nova_compute[226886]: 2026-01-20 14:32:32.820 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919552.7231576, ebc2b8c3-8d9f-4798-8865-dd256233f4fc => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:32:32 np0005588920 nova_compute[226886]: 2026-01-20 14:32:32.820 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] VM Started (Lifecycle Event)#033[00m
Jan 20 09:32:32 np0005588920 nova_compute[226886]: 2026-01-20 14:32:32.840 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:32:32 np0005588920 nova_compute[226886]: 2026-01-20 14:32:32.844 226890 DEBUG nova.compute.manager [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:32:32 np0005588920 nova_compute[226886]: 2026-01-20 14:32:32.845 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:32:32 np0005588920 nova_compute[226886]: 2026-01-20 14:32:32.881 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 20 09:32:32 np0005588920 nova_compute[226886]: 2026-01-20 14:32:32.923 226890 DEBUG oslo_concurrency.lockutils [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:32:32 np0005588920 nova_compute[226886]: 2026-01-20 14:32:32.924 226890 DEBUG oslo_concurrency.lockutils [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:32:32 np0005588920 nova_compute[226886]: 2026-01-20 14:32:32.924 226890 DEBUG nova.objects.instance [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 20 09:32:32 np0005588920 nova_compute[226886]: 2026-01-20 14:32:32.982 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:32 np0005588920 nova_compute[226886]: 2026-01-20 14:32:32.995 226890 DEBUG oslo_concurrency.lockutils [None req-4eea15c1-bdd8-4fed-b013-caf545a2db90 e5b1a5dfe695421db1917f321a1160e5 3760012f1c3540f68c35e3c53047394f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.072s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:32:34 np0005588920 nova_compute[226886]: 2026-01-20 14:32:34.400 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:34.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:34 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:32:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:34.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:35 np0005588920 nova_compute[226886]: 2026-01-20 14:32:35.132 226890 DEBUG oslo_concurrency.lockutils [None req-0c432b3f-67f9-4e35-90a7-bc7440a4cb6d 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Acquiring lock "ebc2b8c3-8d9f-4798-8865-dd256233f4fc" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:32:35 np0005588920 nova_compute[226886]: 2026-01-20 14:32:35.133 226890 DEBUG oslo_concurrency.lockutils [None req-0c432b3f-67f9-4e35-90a7-bc7440a4cb6d 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Lock "ebc2b8c3-8d9f-4798-8865-dd256233f4fc" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:32:35 np0005588920 nova_compute[226886]: 2026-01-20 14:32:35.133 226890 DEBUG oslo_concurrency.lockutils [None req-0c432b3f-67f9-4e35-90a7-bc7440a4cb6d 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Acquiring lock "ebc2b8c3-8d9f-4798-8865-dd256233f4fc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:32:35 np0005588920 nova_compute[226886]: 2026-01-20 14:32:35.134 226890 DEBUG oslo_concurrency.lockutils [None req-0c432b3f-67f9-4e35-90a7-bc7440a4cb6d 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Lock "ebc2b8c3-8d9f-4798-8865-dd256233f4fc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:32:35 np0005588920 nova_compute[226886]: 2026-01-20 14:32:35.134 226890 DEBUG oslo_concurrency.lockutils [None req-0c432b3f-67f9-4e35-90a7-bc7440a4cb6d 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Lock "ebc2b8c3-8d9f-4798-8865-dd256233f4fc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:32:35 np0005588920 nova_compute[226886]: 2026-01-20 14:32:35.136 226890 INFO nova.compute.manager [None req-0c432b3f-67f9-4e35-90a7-bc7440a4cb6d 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Terminating instance#033[00m
Jan 20 09:32:35 np0005588920 nova_compute[226886]: 2026-01-20 14:32:35.137 226890 DEBUG oslo_concurrency.lockutils [None req-0c432b3f-67f9-4e35-90a7-bc7440a4cb6d 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Acquiring lock "refresh_cache-ebc2b8c3-8d9f-4798-8865-dd256233f4fc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:32:35 np0005588920 nova_compute[226886]: 2026-01-20 14:32:35.138 226890 DEBUG oslo_concurrency.lockutils [None req-0c432b3f-67f9-4e35-90a7-bc7440a4cb6d 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Acquired lock "refresh_cache-ebc2b8c3-8d9f-4798-8865-dd256233f4fc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:32:35 np0005588920 nova_compute[226886]: 2026-01-20 14:32:35.138 226890 DEBUG nova.network.neutron [None req-0c432b3f-67f9-4e35-90a7-bc7440a4cb6d 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:32:35 np0005588920 nova_compute[226886]: 2026-01-20 14:32:35.338 226890 DEBUG nova.network.neutron [None req-0c432b3f-67f9-4e35-90a7-bc7440a4cb6d 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:32:35 np0005588920 nova_compute[226886]: 2026-01-20 14:32:35.677 226890 DEBUG nova.network.neutron [None req-0c432b3f-67f9-4e35-90a7-bc7440a4cb6d 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:32:35 np0005588920 nova_compute[226886]: 2026-01-20 14:32:35.707 226890 DEBUG oslo_concurrency.lockutils [None req-0c432b3f-67f9-4e35-90a7-bc7440a4cb6d 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Releasing lock "refresh_cache-ebc2b8c3-8d9f-4798-8865-dd256233f4fc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:32:35 np0005588920 nova_compute[226886]: 2026-01-20 14:32:35.708 226890 DEBUG nova.compute.manager [None req-0c432b3f-67f9-4e35-90a7-bc7440a4cb6d 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:32:35 np0005588920 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000022.scope: Deactivated successfully.
Jan 20 09:32:35 np0005588920 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000022.scope: Consumed 4.343s CPU time.
Jan 20 09:32:35 np0005588920 systemd-machined[196121]: Machine qemu-17-instance-00000022 terminated.
Jan 20 09:32:36 np0005588920 nova_compute[226886]: 2026-01-20 14:32:36.006 226890 INFO nova.virt.libvirt.driver [-] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Instance destroyed successfully.#033[00m
Jan 20 09:32:36 np0005588920 nova_compute[226886]: 2026-01-20 14:32:36.006 226890 DEBUG nova.objects.instance [None req-0c432b3f-67f9-4e35-90a7-bc7440a4cb6d 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Lazy-loading 'resources' on Instance uuid ebc2b8c3-8d9f-4798-8865-dd256233f4fc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:32:36 np0005588920 nova_compute[226886]: 2026-01-20 14:32:36.318 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:36 np0005588920 nova_compute[226886]: 2026-01-20 14:32:36.470 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:36 np0005588920 nova_compute[226886]: 2026-01-20 14:32:36.484 226890 INFO nova.virt.libvirt.driver [None req-0c432b3f-67f9-4e35-90a7-bc7440a4cb6d 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Deleting instance files /var/lib/nova/instances/ebc2b8c3-8d9f-4798-8865-dd256233f4fc_del#033[00m
Jan 20 09:32:36 np0005588920 nova_compute[226886]: 2026-01-20 14:32:36.484 226890 INFO nova.virt.libvirt.driver [None req-0c432b3f-67f9-4e35-90a7-bc7440a4cb6d 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Deletion of /var/lib/nova/instances/ebc2b8c3-8d9f-4798-8865-dd256233f4fc_del complete#033[00m
Jan 20 09:32:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:36.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:36 np0005588920 nova_compute[226886]: 2026-01-20 14:32:36.570 226890 INFO nova.compute.manager [None req-0c432b3f-67f9-4e35-90a7-bc7440a4cb6d 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Took 0.86 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:32:36 np0005588920 nova_compute[226886]: 2026-01-20 14:32:36.571 226890 DEBUG oslo.service.loopingcall [None req-0c432b3f-67f9-4e35-90a7-bc7440a4cb6d 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:32:36 np0005588920 nova_compute[226886]: 2026-01-20 14:32:36.571 226890 DEBUG nova.compute.manager [-] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:32:36 np0005588920 nova_compute[226886]: 2026-01-20 14:32:36.571 226890 DEBUG nova.network.neutron [-] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:32:36 np0005588920 nova_compute[226886]: 2026-01-20 14:32:36.752 226890 DEBUG nova.network.neutron [-] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:32:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:36.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:36 np0005588920 nova_compute[226886]: 2026-01-20 14:32:36.774 226890 DEBUG nova.network.neutron [-] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:32:36 np0005588920 nova_compute[226886]: 2026-01-20 14:32:36.799 226890 INFO nova.compute.manager [-] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Took 0.23 seconds to deallocate network for instance.#033[00m
Jan 20 09:32:36 np0005588920 nova_compute[226886]: 2026-01-20 14:32:36.853 226890 DEBUG oslo_concurrency.lockutils [None req-0c432b3f-67f9-4e35-90a7-bc7440a4cb6d 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:32:36 np0005588920 nova_compute[226886]: 2026-01-20 14:32:36.853 226890 DEBUG oslo_concurrency.lockutils [None req-0c432b3f-67f9-4e35-90a7-bc7440a4cb6d 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:32:36 np0005588920 nova_compute[226886]: 2026-01-20 14:32:36.928 226890 DEBUG oslo_concurrency.processutils [None req-0c432b3f-67f9-4e35-90a7-bc7440a4cb6d 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:32:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:32:37 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4194897972' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:32:37 np0005588920 nova_compute[226886]: 2026-01-20 14:32:37.362 226890 DEBUG oslo_concurrency.processutils [None req-0c432b3f-67f9-4e35-90a7-bc7440a4cb6d 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:32:37 np0005588920 nova_compute[226886]: 2026-01-20 14:32:37.368 226890 DEBUG nova.compute.provider_tree [None req-0c432b3f-67f9-4e35-90a7-bc7440a4cb6d 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:32:37 np0005588920 nova_compute[226886]: 2026-01-20 14:32:37.485 226890 DEBUG nova.scheduler.client.report [None req-0c432b3f-67f9-4e35-90a7-bc7440a4cb6d 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:32:37 np0005588920 nova_compute[226886]: 2026-01-20 14:32:37.576 226890 DEBUG oslo_concurrency.lockutils [None req-0c432b3f-67f9-4e35-90a7-bc7440a4cb6d 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.722s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:32:37 np0005588920 nova_compute[226886]: 2026-01-20 14:32:37.608 226890 INFO nova.scheduler.client.report [None req-0c432b3f-67f9-4e35-90a7-bc7440a4cb6d 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Deleted allocations for instance ebc2b8c3-8d9f-4798-8865-dd256233f4fc#033[00m
Jan 20 09:32:37 np0005588920 nova_compute[226886]: 2026-01-20 14:32:37.813 226890 DEBUG oslo_concurrency.lockutils [None req-0c432b3f-67f9-4e35-90a7-bc7440a4cb6d 3d51de2ad98d40d8ad12305518d106fd 389fafaa99e14f31988005de907401bf - - default default] Lock "ebc2b8c3-8d9f-4798-8865-dd256233f4fc" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.680s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:32:37 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:32:37 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:32:37 np0005588920 nova_compute[226886]: 2026-01-20 14:32:37.985 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:38.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:38.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:39 np0005588920 nova_compute[226886]: 2026-01-20 14:32:39.401 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:39 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:32:40 np0005588920 podman[241693]: 2026-01-20 14:32:40.039047655 +0000 UTC m=+0.124438605 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller)
Jan 20 09:32:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:40.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:40.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:41 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #58. Immutable memtables: 0.
Jan 20 09:32:41 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:32:41.581723) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 09:32:41 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 33] Flushing memtable with next log file: 58
Jan 20 09:32:41 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919561581902, "job": 33, "event": "flush_started", "num_memtables": 1, "num_entries": 2489, "num_deletes": 505, "total_data_size": 5062187, "memory_usage": 5140984, "flush_reason": "Manual Compaction"}
Jan 20 09:32:41 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 33] Level-0 flush table #59: started
Jan 20 09:32:41 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919561610777, "cf_name": "default", "job": 33, "event": "table_file_creation", "file_number": 59, "file_size": 2936030, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 28406, "largest_seqno": 30890, "table_properties": {"data_size": 2926940, "index_size": 5008, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3013, "raw_key_size": 24239, "raw_average_key_size": 20, "raw_value_size": 2906012, "raw_average_value_size": 2448, "num_data_blocks": 218, "num_entries": 1187, "num_filter_entries": 1187, "num_deletions": 505, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768919386, "oldest_key_time": 1768919386, "file_creation_time": 1768919561, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 59, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:32:41 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 33] Flush lasted 29101 microseconds, and 14067 cpu microseconds.
Jan 20 09:32:41 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:32:41 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:32:41.610847) [db/flush_job.cc:967] [default] [JOB 33] Level-0 flush table #59: 2936030 bytes OK
Jan 20 09:32:41 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:32:41.610872) [db/memtable_list.cc:519] [default] Level-0 commit table #59 started
Jan 20 09:32:41 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:32:41.612630) [db/memtable_list.cc:722] [default] Level-0 commit table #59: memtable #1 done
Jan 20 09:32:41 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:32:41.612653) EVENT_LOG_v1 {"time_micros": 1768919561612645, "job": 33, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 09:32:41 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:32:41.612675) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 09:32:41 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 33] Try to delete WAL files size 5050139, prev total WAL file size 5050139, number of live WAL files 2.
Jan 20 09:32:41 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000055.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:32:41 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:32:41.614669) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032323539' seq:72057594037927935, type:22 .. '7061786F730032353131' seq:0, type:0; will stop at (end)
Jan 20 09:32:41 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 34] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 09:32:41 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 33 Base level 0, inputs: [59(2867KB)], [57(10MB)]
Jan 20 09:32:41 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919561614746, "job": 34, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [59], "files_L6": [57], "score": -1, "input_data_size": 13915897, "oldest_snapshot_seqno": -1}
Jan 20 09:32:41 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 34] Generated table #60: 5444 keys, 8452207 bytes, temperature: kUnknown
Jan 20 09:32:41 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919561706480, "cf_name": "default", "job": 34, "event": "table_file_creation", "file_number": 60, "file_size": 8452207, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 8416611, "index_size": 20894, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13637, "raw_key_size": 139173, "raw_average_key_size": 25, "raw_value_size": 8319226, "raw_average_value_size": 1528, "num_data_blocks": 843, "num_entries": 5444, "num_filter_entries": 5444, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768919561, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 60, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:32:41 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:32:41 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:32:41.706727) [db/compaction/compaction_job.cc:1663] [default] [JOB 34] Compacted 1@0 + 1@6 files to L6 => 8452207 bytes
Jan 20 09:32:41 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:32:41.708124) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 151.6 rd, 92.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.8, 10.5 +0.0 blob) out(8.1 +0.0 blob), read-write-amplify(7.6) write-amplify(2.9) OK, records in: 6451, records dropped: 1007 output_compression: NoCompression
Jan 20 09:32:41 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:32:41.708141) EVENT_LOG_v1 {"time_micros": 1768919561708133, "job": 34, "event": "compaction_finished", "compaction_time_micros": 91803, "compaction_time_cpu_micros": 19888, "output_level": 6, "num_output_files": 1, "total_output_size": 8452207, "num_input_records": 6451, "num_output_records": 5444, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 09:32:41 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000059.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:32:41 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919561708754, "job": 34, "event": "table_file_deletion", "file_number": 59}
Jan 20 09:32:41 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000057.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:32:41 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919561710567, "job": 34, "event": "table_file_deletion", "file_number": 57}
Jan 20 09:32:41 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:32:41.614531) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:32:41 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:32:41.710646) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:32:41 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:32:41.710651) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:32:41 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:32:41.710653) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:32:41 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:32:41.710654) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:32:41 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:32:41.710656) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:32:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:42.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:42.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:42 np0005588920 nova_compute[226886]: 2026-01-20 14:32:42.910 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768919547.9093044, 57c49518-7381-4e7d-975f-9c6afc3ea966 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:32:42 np0005588920 nova_compute[226886]: 2026-01-20 14:32:42.911 226890 INFO nova.compute.manager [-] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:32:42 np0005588920 nova_compute[226886]: 2026-01-20 14:32:42.930 226890 DEBUG nova.compute.manager [None req-8122da06-4886-4585-b3ca-b641c71d922c - - - - - -] [instance: 57c49518-7381-4e7d-975f-9c6afc3ea966] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:32:42 np0005588920 nova_compute[226886]: 2026-01-20 14:32:42.987 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:44 np0005588920 nova_compute[226886]: 2026-01-20 14:32:44.403 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:32:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:44.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:32:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:32:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:32:44.702 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:32:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:32:44.702 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:32:44 np0005588920 nova_compute[226886]: 2026-01-20 14:32:44.703 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:32:44.703 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:32:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:44.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:32:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:46.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:32:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:46.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:47 np0005588920 nova_compute[226886]: 2026-01-20 14:32:47.990 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:48.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:48.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:49 np0005588920 nova_compute[226886]: 2026-01-20 14:32:49.405 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:49 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:32:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:50.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:50.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:51 np0005588920 nova_compute[226886]: 2026-01-20 14:32:51.005 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768919556.0042279, ebc2b8c3-8d9f-4798-8865-dd256233f4fc => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:32:51 np0005588920 nova_compute[226886]: 2026-01-20 14:32:51.006 226890 INFO nova.compute.manager [-] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:32:51 np0005588920 nova_compute[226886]: 2026-01-20 14:32:51.038 226890 DEBUG nova.compute.manager [None req-4dac1d0b-c9c5-478b-a1fa-faa04075d82d - - - - - -] [instance: ebc2b8c3-8d9f-4798-8865-dd256233f4fc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:32:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:52.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:32:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:52.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:32:52 np0005588920 nova_compute[226886]: 2026-01-20 14:32:52.993 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:53 np0005588920 podman[241720]: 2026-01-20 14:32:53.984245114 +0000 UTC m=+0.069063919 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 20 09:32:54 np0005588920 nova_compute[226886]: 2026-01-20 14:32:54.407 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:54.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:54 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:32:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:32:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:54.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:32:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:56.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:32:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:56.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:32:57 np0005588920 nova_compute[226886]: 2026-01-20 14:32:57.996 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:32:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:58.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:32:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:32:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:58.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:59 np0005588920 nova_compute[226886]: 2026-01-20 14:32:59.409 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:59 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:33:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:33:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:00.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:33:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:33:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:00.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:33:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 09:33:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:02.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 09:33:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:02.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:03 np0005588920 nova_compute[226886]: 2026-01-20 14:33:02.999 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:04 np0005588920 nova_compute[226886]: 2026-01-20 14:33:04.411 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:04.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:04 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:33:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:04.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:06.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:06.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:08 np0005588920 nova_compute[226886]: 2026-01-20 14:33:08.002 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:08.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:08.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:09 np0005588920 nova_compute[226886]: 2026-01-20 14:33:09.412 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:33:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 09:33:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:10.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 09:33:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:10.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:11 np0005588920 podman[241740]: 2026-01-20 14:33:11.066182335 +0000 UTC m=+0.141712980 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ovn_controller)
Jan 20 09:33:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 09:33:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:12.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 09:33:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:12.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:13 np0005588920 nova_compute[226886]: 2026-01-20 14:33:13.004 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 09:33:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2650109795' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 09:33:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 09:33:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2650109795' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 09:33:14 np0005588920 nova_compute[226886]: 2026-01-20 14:33:14.414 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:33:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:14.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:33:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:33:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:14.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:33:16.435 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:33:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:33:16.435 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:33:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:33:16.435 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:33:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:33:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:16.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:33:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:16.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:18 np0005588920 nova_compute[226886]: 2026-01-20 14:33:18.007 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:18.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:18.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:19 np0005588920 nova_compute[226886]: 2026-01-20 14:33:19.417 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:19 np0005588920 nova_compute[226886]: 2026-01-20 14:33:19.435 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:33:19 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:33:19 np0005588920 nova_compute[226886]: 2026-01-20 14:33:19.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:33:19 np0005588920 nova_compute[226886]: 2026-01-20 14:33:19.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:33:19 np0005588920 nova_compute[226886]: 2026-01-20 14:33:19.760 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 09:33:19 np0005588920 nova_compute[226886]: 2026-01-20 14:33:19.760 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:33:19 np0005588920 nova_compute[226886]: 2026-01-20 14:33:19.760 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:33:19 np0005588920 nova_compute[226886]: 2026-01-20 14:33:19.760 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 20 09:33:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 09:33:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:20.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 09:33:20 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e166 e166: 3 total, 3 up, 3 in
Jan 20 09:33:20 np0005588920 nova_compute[226886]: 2026-01-20 14:33:20.739 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:33:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:20.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:21 np0005588920 nova_compute[226886]: 2026-01-20 14:33:21.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:33:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:33:22.170 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:33:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:33:22.171 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:33:22 np0005588920 nova_compute[226886]: 2026-01-20 14:33:22.171 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 09:33:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:22.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 09:33:22 np0005588920 nova_compute[226886]: 2026-01-20 14:33:22.749 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:33:22 np0005588920 nova_compute[226886]: 2026-01-20 14:33:22.749 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:33:22 np0005588920 nova_compute[226886]: 2026-01-20 14:33:22.750 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:33:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:22.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:23 np0005588920 nova_compute[226886]: 2026-01-20 14:33:23.009 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:23 np0005588920 nova_compute[226886]: 2026-01-20 14:33:23.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:33:23 np0005588920 nova_compute[226886]: 2026-01-20 14:33:23.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:33:23 np0005588920 nova_compute[226886]: 2026-01-20 14:33:23.726 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 20 09:33:23 np0005588920 nova_compute[226886]: 2026-01-20 14:33:23.756 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 20 09:33:24 np0005588920 nova_compute[226886]: 2026-01-20 14:33:24.419 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:33:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:24.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:33:24 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e166 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:33:24 np0005588920 nova_compute[226886]: 2026-01-20 14:33:24.755 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:33:24 np0005588920 nova_compute[226886]: 2026-01-20 14:33:24.756 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:33:24 np0005588920 nova_compute[226886]: 2026-01-20 14:33:24.781 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:33:24 np0005588920 nova_compute[226886]: 2026-01-20 14:33:24.782 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:33:24 np0005588920 nova_compute[226886]: 2026-01-20 14:33:24.783 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:33:24 np0005588920 nova_compute[226886]: 2026-01-20 14:33:24.783 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:33:24 np0005588920 nova_compute[226886]: 2026-01-20 14:33:24.784 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:33:24 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e167 e167: 3 total, 3 up, 3 in
Jan 20 09:33:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:24.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:24 np0005588920 podman[241767]: 2026-01-20 14:33:24.953187474 +0000 UTC m=+0.041660408 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 20 09:33:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:33:25.173 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:33:25 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:33:25 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2112737495' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:33:25 np0005588920 nova_compute[226886]: 2026-01-20 14:33:25.228 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:33:25 np0005588920 nova_compute[226886]: 2026-01-20 14:33:25.414 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:33:25 np0005588920 nova_compute[226886]: 2026-01-20 14:33:25.416 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4782MB free_disk=20.94301986694336GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:33:25 np0005588920 nova_compute[226886]: 2026-01-20 14:33:25.416 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:33:25 np0005588920 nova_compute[226886]: 2026-01-20 14:33:25.416 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:33:25 np0005588920 nova_compute[226886]: 2026-01-20 14:33:25.651 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:33:25 np0005588920 nova_compute[226886]: 2026-01-20 14:33:25.652 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:33:25 np0005588920 nova_compute[226886]: 2026-01-20 14:33:25.988 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Refreshing inventories for resource provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 20 09:33:26 np0005588920 nova_compute[226886]: 2026-01-20 14:33:26.332 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Updating ProviderTree inventory for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 20 09:33:26 np0005588920 nova_compute[226886]: 2026-01-20 14:33:26.333 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Updating inventory in ProviderTree for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 20 09:33:26 np0005588920 nova_compute[226886]: 2026-01-20 14:33:26.442 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Refreshing aggregate associations for resource provider ff38e91c-3320-4831-90ac-bcffc89ba7b6, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 20 09:33:26 np0005588920 nova_compute[226886]: 2026-01-20 14:33:26.488 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Refreshing trait associations for resource provider ff38e91c-3320-4831-90ac-bcffc89ba7b6, traits: COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_DEVICE_TAGGING,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 20 09:33:26 np0005588920 nova_compute[226886]: 2026-01-20 14:33:26.507 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:33:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:26.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:26.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:33:27 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1012833942' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:33:27 np0005588920 nova_compute[226886]: 2026-01-20 14:33:27.022 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.515s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:33:27 np0005588920 nova_compute[226886]: 2026-01-20 14:33:27.028 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:33:27 np0005588920 nova_compute[226886]: 2026-01-20 14:33:27.049 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:33:27 np0005588920 nova_compute[226886]: 2026-01-20 14:33:27.090 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:33:27 np0005588920 nova_compute[226886]: 2026-01-20 14:33:27.090 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.674s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:33:28 np0005588920 nova_compute[226886]: 2026-01-20 14:33:28.012 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:28.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:28.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:29 np0005588920 nova_compute[226886]: 2026-01-20 14:33:29.421 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e167 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:33:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:30.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:30.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e168 e168: 3 total, 3 up, 3 in
Jan 20 09:33:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:32.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:32.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:33 np0005588920 nova_compute[226886]: 2026-01-20 14:33:33.015 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:34 np0005588920 nova_compute[226886]: 2026-01-20 14:33:34.423 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 09:33:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:34.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 09:33:34 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e168 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:33:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:34.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:35 np0005588920 nova_compute[226886]: 2026-01-20 14:33:35.915 226890 DEBUG oslo_concurrency.lockutils [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Acquiring lock "c9a86cb2-b092-4887-b47d-1a05fb756a83" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:33:35 np0005588920 nova_compute[226886]: 2026-01-20 14:33:35.915 226890 DEBUG oslo_concurrency.lockutils [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "c9a86cb2-b092-4887-b47d-1a05fb756a83" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:33:35 np0005588920 nova_compute[226886]: 2026-01-20 14:33:35.949 226890 DEBUG nova.compute.manager [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:33:36 np0005588920 nova_compute[226886]: 2026-01-20 14:33:36.060 226890 DEBUG oslo_concurrency.lockutils [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:33:36 np0005588920 nova_compute[226886]: 2026-01-20 14:33:36.060 226890 DEBUG oslo_concurrency.lockutils [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:33:36 np0005588920 nova_compute[226886]: 2026-01-20 14:33:36.071 226890 DEBUG nova.virt.hardware [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:33:36 np0005588920 nova_compute[226886]: 2026-01-20 14:33:36.072 226890 INFO nova.compute.claims [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 09:33:36 np0005588920 nova_compute[226886]: 2026-01-20 14:33:36.236 226890 DEBUG oslo_concurrency.processutils [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:33:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:36.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:36 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:33:36 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2118206802' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:33:36 np0005588920 nova_compute[226886]: 2026-01-20 14:33:36.721 226890 DEBUG oslo_concurrency.processutils [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.485s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:33:36 np0005588920 nova_compute[226886]: 2026-01-20 14:33:36.728 226890 DEBUG nova.compute.provider_tree [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:33:36 np0005588920 nova_compute[226886]: 2026-01-20 14:33:36.749 226890 DEBUG nova.scheduler.client.report [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:33:36 np0005588920 nova_compute[226886]: 2026-01-20 14:33:36.799 226890 DEBUG oslo_concurrency.lockutils [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.739s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:33:36 np0005588920 nova_compute[226886]: 2026-01-20 14:33:36.800 226890 DEBUG nova.compute.manager [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:33:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:36.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:36 np0005588920 nova_compute[226886]: 2026-01-20 14:33:36.891 226890 DEBUG nova.compute.manager [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:33:36 np0005588920 nova_compute[226886]: 2026-01-20 14:33:36.892 226890 DEBUG nova.network.neutron [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:33:36 np0005588920 nova_compute[226886]: 2026-01-20 14:33:36.914 226890 INFO nova.virt.libvirt.driver [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:33:36 np0005588920 nova_compute[226886]: 2026-01-20 14:33:36.934 226890 DEBUG nova.compute.manager [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:33:37 np0005588920 nova_compute[226886]: 2026-01-20 14:33:37.046 226890 DEBUG nova.compute.manager [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:33:37 np0005588920 nova_compute[226886]: 2026-01-20 14:33:37.048 226890 DEBUG nova.virt.libvirt.driver [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:33:37 np0005588920 nova_compute[226886]: 2026-01-20 14:33:37.048 226890 INFO nova.virt.libvirt.driver [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Creating image(s)#033[00m
Jan 20 09:33:37 np0005588920 nova_compute[226886]: 2026-01-20 14:33:37.082 226890 DEBUG nova.storage.rbd_utils [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] rbd image c9a86cb2-b092-4887-b47d-1a05fb756a83_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:33:37 np0005588920 nova_compute[226886]: 2026-01-20 14:33:37.115 226890 DEBUG nova.storage.rbd_utils [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] rbd image c9a86cb2-b092-4887-b47d-1a05fb756a83_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:33:37 np0005588920 nova_compute[226886]: 2026-01-20 14:33:37.147 226890 DEBUG nova.storage.rbd_utils [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] rbd image c9a86cb2-b092-4887-b47d-1a05fb756a83_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:33:37 np0005588920 nova_compute[226886]: 2026-01-20 14:33:37.152 226890 DEBUG oslo_concurrency.processutils [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:33:37 np0005588920 nova_compute[226886]: 2026-01-20 14:33:37.229 226890 DEBUG oslo_concurrency.processutils [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:33:37 np0005588920 nova_compute[226886]: 2026-01-20 14:33:37.230 226890 DEBUG oslo_concurrency.lockutils [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:33:37 np0005588920 nova_compute[226886]: 2026-01-20 14:33:37.230 226890 DEBUG oslo_concurrency.lockutils [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:33:37 np0005588920 nova_compute[226886]: 2026-01-20 14:33:37.230 226890 DEBUG oslo_concurrency.lockutils [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:33:37 np0005588920 nova_compute[226886]: 2026-01-20 14:33:37.256 226890 DEBUG nova.storage.rbd_utils [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] rbd image c9a86cb2-b092-4887-b47d-1a05fb756a83_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:33:37 np0005588920 nova_compute[226886]: 2026-01-20 14:33:37.260 226890 DEBUG oslo_concurrency.processutils [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 c9a86cb2-b092-4887-b47d-1a05fb756a83_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:33:37 np0005588920 nova_compute[226886]: 2026-01-20 14:33:37.366 226890 DEBUG nova.policy [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f51c395107c84dbd9067113b84ff01dd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a841e7a1434c488390475174e10bc161', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:33:38 np0005588920 nova_compute[226886]: 2026-01-20 14:33:38.017 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:38 np0005588920 nova_compute[226886]: 2026-01-20 14:33:38.347 226890 DEBUG oslo_concurrency.processutils [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 c9a86cb2-b092-4887-b47d-1a05fb756a83_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:33:38 np0005588920 nova_compute[226886]: 2026-01-20 14:33:38.419 226890 DEBUG nova.storage.rbd_utils [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] resizing rbd image c9a86cb2-b092-4887-b47d-1a05fb756a83_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:33:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:38.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:38 np0005588920 nova_compute[226886]: 2026-01-20 14:33:38.679 226890 DEBUG nova.objects.instance [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lazy-loading 'migration_context' on Instance uuid c9a86cb2-b092-4887-b47d-1a05fb756a83 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:33:38 np0005588920 nova_compute[226886]: 2026-01-20 14:33:38.697 226890 DEBUG nova.virt.libvirt.driver [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:33:38 np0005588920 nova_compute[226886]: 2026-01-20 14:33:38.697 226890 DEBUG nova.virt.libvirt.driver [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Ensure instance console log exists: /var/lib/nova/instances/c9a86cb2-b092-4887-b47d-1a05fb756a83/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:33:38 np0005588920 nova_compute[226886]: 2026-01-20 14:33:38.698 226890 DEBUG oslo_concurrency.lockutils [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:33:38 np0005588920 nova_compute[226886]: 2026-01-20 14:33:38.698 226890 DEBUG oslo_concurrency.lockutils [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:33:38 np0005588920 nova_compute[226886]: 2026-01-20 14:33:38.698 226890 DEBUG oslo_concurrency.lockutils [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:33:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:38.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:39 np0005588920 nova_compute[226886]: 2026-01-20 14:33:39.159 226890 DEBUG nova.network.neutron [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Successfully created port: 7a9a2efa-73d4-41be-92ee-61654388a2b1 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:33:39 np0005588920 nova_compute[226886]: 2026-01-20 14:33:39.426 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:39 np0005588920 ovn_controller[133971]: 2026-01-20T14:33:39Z|00102|memory_trim|INFO|Detected inactivity (last active 30000 ms ago): trimming memory
Jan 20 09:33:39 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e168 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:33:39 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:33:39 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:33:39 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:33:40 np0005588920 nova_compute[226886]: 2026-01-20 14:33:40.427 226890 DEBUG nova.network.neutron [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Successfully updated port: 7a9a2efa-73d4-41be-92ee-61654388a2b1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:33:40 np0005588920 nova_compute[226886]: 2026-01-20 14:33:40.442 226890 DEBUG oslo_concurrency.lockutils [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Acquiring lock "refresh_cache-c9a86cb2-b092-4887-b47d-1a05fb756a83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:33:40 np0005588920 nova_compute[226886]: 2026-01-20 14:33:40.442 226890 DEBUG oslo_concurrency.lockutils [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Acquired lock "refresh_cache-c9a86cb2-b092-4887-b47d-1a05fb756a83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:33:40 np0005588920 nova_compute[226886]: 2026-01-20 14:33:40.443 226890 DEBUG nova.network.neutron [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:33:40 np0005588920 nova_compute[226886]: 2026-01-20 14:33:40.572 226890 DEBUG nova.compute.manager [req-d9303cda-b225-48e5-813d-b5041ad8bde9 req-e3061832-2546-41d9-af18-d5723c694f05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Received event network-changed-7a9a2efa-73d4-41be-92ee-61654388a2b1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:33:40 np0005588920 nova_compute[226886]: 2026-01-20 14:33:40.573 226890 DEBUG nova.compute.manager [req-d9303cda-b225-48e5-813d-b5041ad8bde9 req-e3061832-2546-41d9-af18-d5723c694f05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Refreshing instance network info cache due to event network-changed-7a9a2efa-73d4-41be-92ee-61654388a2b1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:33:40 np0005588920 nova_compute[226886]: 2026-01-20 14:33:40.574 226890 DEBUG oslo_concurrency.lockutils [req-d9303cda-b225-48e5-813d-b5041ad8bde9 req-e3061832-2546-41d9-af18-d5723c694f05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-c9a86cb2-b092-4887-b47d-1a05fb756a83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:33:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:33:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:40.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:33:40 np0005588920 nova_compute[226886]: 2026-01-20 14:33:40.672 226890 DEBUG nova.network.neutron [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:33:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:33:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:40.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:33:42 np0005588920 podman[242150]: 2026-01-20 14:33:42.00749678 +0000 UTC m=+0.097536290 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:33:42 np0005588920 nova_compute[226886]: 2026-01-20 14:33:42.100 226890 DEBUG nova.network.neutron [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Updating instance_info_cache with network_info: [{"id": "7a9a2efa-73d4-41be-92ee-61654388a2b1", "address": "fa:16:3e:67:fe:a5", "network": {"id": "33c9a20a-d976-42a8-b8bf-f83ddfc97c9a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-202342440-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a841e7a1434c488390475174e10bc161", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a9a2efa-73", "ovs_interfaceid": "7a9a2efa-73d4-41be-92ee-61654388a2b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:33:42 np0005588920 nova_compute[226886]: 2026-01-20 14:33:42.139 226890 DEBUG oslo_concurrency.lockutils [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Releasing lock "refresh_cache-c9a86cb2-b092-4887-b47d-1a05fb756a83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:33:42 np0005588920 nova_compute[226886]: 2026-01-20 14:33:42.140 226890 DEBUG nova.compute.manager [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Instance network_info: |[{"id": "7a9a2efa-73d4-41be-92ee-61654388a2b1", "address": "fa:16:3e:67:fe:a5", "network": {"id": "33c9a20a-d976-42a8-b8bf-f83ddfc97c9a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-202342440-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a841e7a1434c488390475174e10bc161", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a9a2efa-73", "ovs_interfaceid": "7a9a2efa-73d4-41be-92ee-61654388a2b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:33:42 np0005588920 nova_compute[226886]: 2026-01-20 14:33:42.140 226890 DEBUG oslo_concurrency.lockutils [req-d9303cda-b225-48e5-813d-b5041ad8bde9 req-e3061832-2546-41d9-af18-d5723c694f05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-c9a86cb2-b092-4887-b47d-1a05fb756a83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:33:42 np0005588920 nova_compute[226886]: 2026-01-20 14:33:42.141 226890 DEBUG nova.network.neutron [req-d9303cda-b225-48e5-813d-b5041ad8bde9 req-e3061832-2546-41d9-af18-d5723c694f05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Refreshing network info cache for port 7a9a2efa-73d4-41be-92ee-61654388a2b1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:33:42 np0005588920 nova_compute[226886]: 2026-01-20 14:33:42.144 226890 DEBUG nova.virt.libvirt.driver [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Start _get_guest_xml network_info=[{"id": "7a9a2efa-73d4-41be-92ee-61654388a2b1", "address": "fa:16:3e:67:fe:a5", "network": {"id": "33c9a20a-d976-42a8-b8bf-f83ddfc97c9a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-202342440-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a841e7a1434c488390475174e10bc161", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a9a2efa-73", "ovs_interfaceid": "7a9a2efa-73d4-41be-92ee-61654388a2b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:33:42 np0005588920 nova_compute[226886]: 2026-01-20 14:33:42.148 226890 WARNING nova.virt.libvirt.driver [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:33:42 np0005588920 nova_compute[226886]: 2026-01-20 14:33:42.154 226890 DEBUG nova.virt.libvirt.host [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:33:42 np0005588920 nova_compute[226886]: 2026-01-20 14:33:42.154 226890 DEBUG nova.virt.libvirt.host [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:33:42 np0005588920 nova_compute[226886]: 2026-01-20 14:33:42.157 226890 DEBUG nova.virt.libvirt.host [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:33:42 np0005588920 nova_compute[226886]: 2026-01-20 14:33:42.157 226890 DEBUG nova.virt.libvirt.host [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:33:42 np0005588920 nova_compute[226886]: 2026-01-20 14:33:42.158 226890 DEBUG nova.virt.libvirt.driver [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:33:42 np0005588920 nova_compute[226886]: 2026-01-20 14:33:42.158 226890 DEBUG nova.virt.hardware [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:33:42 np0005588920 nova_compute[226886]: 2026-01-20 14:33:42.159 226890 DEBUG nova.virt.hardware [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:33:42 np0005588920 nova_compute[226886]: 2026-01-20 14:33:42.159 226890 DEBUG nova.virt.hardware [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:33:42 np0005588920 nova_compute[226886]: 2026-01-20 14:33:42.159 226890 DEBUG nova.virt.hardware [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:33:42 np0005588920 nova_compute[226886]: 2026-01-20 14:33:42.160 226890 DEBUG nova.virt.hardware [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:33:42 np0005588920 nova_compute[226886]: 2026-01-20 14:33:42.160 226890 DEBUG nova.virt.hardware [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:33:42 np0005588920 nova_compute[226886]: 2026-01-20 14:33:42.160 226890 DEBUG nova.virt.hardware [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:33:42 np0005588920 nova_compute[226886]: 2026-01-20 14:33:42.160 226890 DEBUG nova.virt.hardware [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:33:42 np0005588920 nova_compute[226886]: 2026-01-20 14:33:42.161 226890 DEBUG nova.virt.hardware [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:33:42 np0005588920 nova_compute[226886]: 2026-01-20 14:33:42.161 226890 DEBUG nova.virt.hardware [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:33:42 np0005588920 nova_compute[226886]: 2026-01-20 14:33:42.161 226890 DEBUG nova.virt.hardware [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:33:42 np0005588920 nova_compute[226886]: 2026-01-20 14:33:42.164 226890 DEBUG oslo_concurrency.processutils [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:33:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:33:42 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1412916936' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:33:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 09:33:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:42.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 09:33:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:42.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:42 np0005588920 nova_compute[226886]: 2026-01-20 14:33:42.887 226890 DEBUG oslo_concurrency.processutils [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.724s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:33:42 np0005588920 nova_compute[226886]: 2026-01-20 14:33:42.924 226890 DEBUG nova.storage.rbd_utils [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] rbd image c9a86cb2-b092-4887-b47d-1a05fb756a83_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:33:42 np0005588920 nova_compute[226886]: 2026-01-20 14:33:42.928 226890 DEBUG oslo_concurrency.processutils [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:33:43 np0005588920 nova_compute[226886]: 2026-01-20 14:33:43.020 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:43 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:33:43 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2864710443' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:33:43 np0005588920 nova_compute[226886]: 2026-01-20 14:33:43.338 226890 DEBUG oslo_concurrency.processutils [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.411s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:33:43 np0005588920 nova_compute[226886]: 2026-01-20 14:33:43.340 226890 DEBUG nova.virt.libvirt.vif [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:33:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-2105937994',display_name='tempest-ServersAdminTestJSON-server-2105937994',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-2105937994',id=40,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a841e7a1434c488390475174e10bc161',ramdisk_id='',reservation_id='r-1bk8rulu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-1261404595',owner_user_name='tempest-ServersAdminTestJSON-1261404595-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:33:36Z,user_data=None,user_id='f51c395107c84dbd9067113b84ff01dd',uuid=c9a86cb2-b092-4887-b47d-1a05fb756a83,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7a9a2efa-73d4-41be-92ee-61654388a2b1", "address": "fa:16:3e:67:fe:a5", "network": {"id": "33c9a20a-d976-42a8-b8bf-f83ddfc97c9a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-202342440-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a841e7a1434c488390475174e10bc161", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a9a2efa-73", "ovs_interfaceid": "7a9a2efa-73d4-41be-92ee-61654388a2b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:33:43 np0005588920 nova_compute[226886]: 2026-01-20 14:33:43.340 226890 DEBUG nova.network.os_vif_util [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Converting VIF {"id": "7a9a2efa-73d4-41be-92ee-61654388a2b1", "address": "fa:16:3e:67:fe:a5", "network": {"id": "33c9a20a-d976-42a8-b8bf-f83ddfc97c9a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-202342440-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a841e7a1434c488390475174e10bc161", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a9a2efa-73", "ovs_interfaceid": "7a9a2efa-73d4-41be-92ee-61654388a2b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:33:43 np0005588920 nova_compute[226886]: 2026-01-20 14:33:43.341 226890 DEBUG nova.network.os_vif_util [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:67:fe:a5,bridge_name='br-int',has_traffic_filtering=True,id=7a9a2efa-73d4-41be-92ee-61654388a2b1,network=Network(33c9a20a-d976-42a8-b8bf-f83ddfc97c9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a9a2efa-73') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:33:43 np0005588920 nova_compute[226886]: 2026-01-20 14:33:43.342 226890 DEBUG nova.objects.instance [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lazy-loading 'pci_devices' on Instance uuid c9a86cb2-b092-4887-b47d-1a05fb756a83 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:33:43 np0005588920 nova_compute[226886]: 2026-01-20 14:33:43.369 226890 DEBUG nova.virt.libvirt.driver [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:33:43 np0005588920 nova_compute[226886]:  <uuid>c9a86cb2-b092-4887-b47d-1a05fb756a83</uuid>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:  <name>instance-00000028</name>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:33:43 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:      <nova:name>tempest-ServersAdminTestJSON-server-2105937994</nova:name>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:33:42</nova:creationTime>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:33:43 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:        <nova:user uuid="f51c395107c84dbd9067113b84ff01dd">tempest-ServersAdminTestJSON-1261404595-project-member</nova:user>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:        <nova:project uuid="a841e7a1434c488390475174e10bc161">tempest-ServersAdminTestJSON-1261404595</nova:project>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:        <nova:port uuid="7a9a2efa-73d4-41be-92ee-61654388a2b1">
Jan 20 09:33:43 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:      <entry name="serial">c9a86cb2-b092-4887-b47d-1a05fb756a83</entry>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:      <entry name="uuid">c9a86cb2-b092-4887-b47d-1a05fb756a83</entry>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:33:43 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/c9a86cb2-b092-4887-b47d-1a05fb756a83_disk">
Jan 20 09:33:43 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:33:43 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:33:43 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/c9a86cb2-b092-4887-b47d-1a05fb756a83_disk.config">
Jan 20 09:33:43 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:33:43 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:33:43 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:67:fe:a5"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:      <target dev="tap7a9a2efa-73"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:33:43 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/c9a86cb2-b092-4887-b47d-1a05fb756a83/console.log" append="off"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:33:43 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:33:43 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:33:43 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:33:43 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:33:43 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:33:43 np0005588920 nova_compute[226886]: 2026-01-20 14:33:43.370 226890 DEBUG nova.compute.manager [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Preparing to wait for external event network-vif-plugged-7a9a2efa-73d4-41be-92ee-61654388a2b1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:33:43 np0005588920 nova_compute[226886]: 2026-01-20 14:33:43.371 226890 DEBUG oslo_concurrency.lockutils [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Acquiring lock "c9a86cb2-b092-4887-b47d-1a05fb756a83-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:33:43 np0005588920 nova_compute[226886]: 2026-01-20 14:33:43.371 226890 DEBUG oslo_concurrency.lockutils [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "c9a86cb2-b092-4887-b47d-1a05fb756a83-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:33:43 np0005588920 nova_compute[226886]: 2026-01-20 14:33:43.371 226890 DEBUG oslo_concurrency.lockutils [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "c9a86cb2-b092-4887-b47d-1a05fb756a83-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:33:43 np0005588920 nova_compute[226886]: 2026-01-20 14:33:43.371 226890 DEBUG nova.virt.libvirt.vif [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:33:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-2105937994',display_name='tempest-ServersAdminTestJSON-server-2105937994',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-2105937994',id=40,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a841e7a1434c488390475174e10bc161',ramdisk_id='',reservation_id='r-1bk8rulu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-1261404595',owner_user_name='tempest-ServersAdminTestJSON-1261404595-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:33:36Z,user_data=None,user_id='f51c395107c84dbd9067113b84ff01dd',uuid=c9a86cb2-b092-4887-b47d-1a05fb756a83,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7a9a2efa-73d4-41be-92ee-61654388a2b1", "address": "fa:16:3e:67:fe:a5", "network": {"id": "33c9a20a-d976-42a8-b8bf-f83ddfc97c9a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-202342440-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a841e7a1434c488390475174e10bc161", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a9a2efa-73", "ovs_interfaceid": "7a9a2efa-73d4-41be-92ee-61654388a2b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:33:43 np0005588920 nova_compute[226886]: 2026-01-20 14:33:43.372 226890 DEBUG nova.network.os_vif_util [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Converting VIF {"id": "7a9a2efa-73d4-41be-92ee-61654388a2b1", "address": "fa:16:3e:67:fe:a5", "network": {"id": "33c9a20a-d976-42a8-b8bf-f83ddfc97c9a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-202342440-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a841e7a1434c488390475174e10bc161", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a9a2efa-73", "ovs_interfaceid": "7a9a2efa-73d4-41be-92ee-61654388a2b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:33:43 np0005588920 nova_compute[226886]: 2026-01-20 14:33:43.372 226890 DEBUG nova.network.os_vif_util [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:67:fe:a5,bridge_name='br-int',has_traffic_filtering=True,id=7a9a2efa-73d4-41be-92ee-61654388a2b1,network=Network(33c9a20a-d976-42a8-b8bf-f83ddfc97c9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a9a2efa-73') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:33:43 np0005588920 nova_compute[226886]: 2026-01-20 14:33:43.373 226890 DEBUG os_vif [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:67:fe:a5,bridge_name='br-int',has_traffic_filtering=True,id=7a9a2efa-73d4-41be-92ee-61654388a2b1,network=Network(33c9a20a-d976-42a8-b8bf-f83ddfc97c9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a9a2efa-73') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:33:43 np0005588920 nova_compute[226886]: 2026-01-20 14:33:43.373 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:43 np0005588920 nova_compute[226886]: 2026-01-20 14:33:43.373 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:33:43 np0005588920 nova_compute[226886]: 2026-01-20 14:33:43.374 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:33:43 np0005588920 nova_compute[226886]: 2026-01-20 14:33:43.376 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:43 np0005588920 nova_compute[226886]: 2026-01-20 14:33:43.376 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7a9a2efa-73, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:33:43 np0005588920 nova_compute[226886]: 2026-01-20 14:33:43.377 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7a9a2efa-73, col_values=(('external_ids', {'iface-id': '7a9a2efa-73d4-41be-92ee-61654388a2b1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:67:fe:a5', 'vm-uuid': 'c9a86cb2-b092-4887-b47d-1a05fb756a83'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:33:43 np0005588920 NetworkManager[49076]: <info>  [1768919623.3789] manager: (tap7a9a2efa-73): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/62)
Jan 20 09:33:43 np0005588920 nova_compute[226886]: 2026-01-20 14:33:43.378 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:43 np0005588920 nova_compute[226886]: 2026-01-20 14:33:43.382 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:33:43 np0005588920 nova_compute[226886]: 2026-01-20 14:33:43.383 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:43 np0005588920 nova_compute[226886]: 2026-01-20 14:33:43.384 226890 INFO os_vif [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:67:fe:a5,bridge_name='br-int',has_traffic_filtering=True,id=7a9a2efa-73d4-41be-92ee-61654388a2b1,network=Network(33c9a20a-d976-42a8-b8bf-f83ddfc97c9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a9a2efa-73')#033[00m
Jan 20 09:33:43 np0005588920 nova_compute[226886]: 2026-01-20 14:33:43.451 226890 DEBUG nova.virt.libvirt.driver [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:33:43 np0005588920 nova_compute[226886]: 2026-01-20 14:33:43.451 226890 DEBUG nova.virt.libvirt.driver [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:33:43 np0005588920 nova_compute[226886]: 2026-01-20 14:33:43.451 226890 DEBUG nova.virt.libvirt.driver [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] No VIF found with MAC fa:16:3e:67:fe:a5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:33:43 np0005588920 nova_compute[226886]: 2026-01-20 14:33:43.452 226890 INFO nova.virt.libvirt.driver [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Using config drive#033[00m
Jan 20 09:33:43 np0005588920 nova_compute[226886]: 2026-01-20 14:33:43.473 226890 DEBUG nova.storage.rbd_utils [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] rbd image c9a86cb2-b092-4887-b47d-1a05fb756a83_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:33:44 np0005588920 nova_compute[226886]: 2026-01-20 14:33:44.307 226890 INFO nova.virt.libvirt.driver [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Creating config drive at /var/lib/nova/instances/c9a86cb2-b092-4887-b47d-1a05fb756a83/disk.config#033[00m
Jan 20 09:33:44 np0005588920 nova_compute[226886]: 2026-01-20 14:33:44.318 226890 DEBUG oslo_concurrency.processutils [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c9a86cb2-b092-4887-b47d-1a05fb756a83/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpx44lvqed execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:33:44 np0005588920 nova_compute[226886]: 2026-01-20 14:33:44.427 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:44 np0005588920 nova_compute[226886]: 2026-01-20 14:33:44.452 226890 DEBUG oslo_concurrency.processutils [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c9a86cb2-b092-4887-b47d-1a05fb756a83/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpx44lvqed" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:33:44 np0005588920 nova_compute[226886]: 2026-01-20 14:33:44.483 226890 DEBUG nova.storage.rbd_utils [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] rbd image c9a86cb2-b092-4887-b47d-1a05fb756a83_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:33:44 np0005588920 nova_compute[226886]: 2026-01-20 14:33:44.486 226890 DEBUG oslo_concurrency.processutils [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/c9a86cb2-b092-4887-b47d-1a05fb756a83/disk.config c9a86cb2-b092-4887-b47d-1a05fb756a83_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:33:44 np0005588920 nova_compute[226886]: 2026-01-20 14:33:44.583 226890 DEBUG nova.network.neutron [req-d9303cda-b225-48e5-813d-b5041ad8bde9 req-e3061832-2546-41d9-af18-d5723c694f05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Updated VIF entry in instance network info cache for port 7a9a2efa-73d4-41be-92ee-61654388a2b1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:33:44 np0005588920 nova_compute[226886]: 2026-01-20 14:33:44.585 226890 DEBUG nova.network.neutron [req-d9303cda-b225-48e5-813d-b5041ad8bde9 req-e3061832-2546-41d9-af18-d5723c694f05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Updating instance_info_cache with network_info: [{"id": "7a9a2efa-73d4-41be-92ee-61654388a2b1", "address": "fa:16:3e:67:fe:a5", "network": {"id": "33c9a20a-d976-42a8-b8bf-f83ddfc97c9a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-202342440-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a841e7a1434c488390475174e10bc161", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a9a2efa-73", "ovs_interfaceid": "7a9a2efa-73d4-41be-92ee-61654388a2b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:33:44 np0005588920 nova_compute[226886]: 2026-01-20 14:33:44.606 226890 DEBUG oslo_concurrency.lockutils [req-d9303cda-b225-48e5-813d-b5041ad8bde9 req-e3061832-2546-41d9-af18-d5723c694f05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-c9a86cb2-b092-4887-b47d-1a05fb756a83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:33:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.002000059s ======
Jan 20 09:33:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:44.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000059s
Jan 20 09:33:44 np0005588920 nova_compute[226886]: 2026-01-20 14:33:44.640 226890 DEBUG oslo_concurrency.processutils [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/c9a86cb2-b092-4887-b47d-1a05fb756a83/disk.config c9a86cb2-b092-4887-b47d-1a05fb756a83_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.154s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:33:44 np0005588920 nova_compute[226886]: 2026-01-20 14:33:44.641 226890 INFO nova.virt.libvirt.driver [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Deleting local config drive /var/lib/nova/instances/c9a86cb2-b092-4887-b47d-1a05fb756a83/disk.config because it was imported into RBD.#033[00m
Jan 20 09:33:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e168 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:33:44 np0005588920 kernel: tap7a9a2efa-73: entered promiscuous mode
Jan 20 09:33:44 np0005588920 NetworkManager[49076]: <info>  [1768919624.7005] manager: (tap7a9a2efa-73): new Tun device (/org/freedesktop/NetworkManager/Devices/63)
Jan 20 09:33:44 np0005588920 nova_compute[226886]: 2026-01-20 14:33:44.701 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:44 np0005588920 ovn_controller[133971]: 2026-01-20T14:33:44Z|00103|binding|INFO|Claiming lport 7a9a2efa-73d4-41be-92ee-61654388a2b1 for this chassis.
Jan 20 09:33:44 np0005588920 ovn_controller[133971]: 2026-01-20T14:33:44Z|00104|binding|INFO|7a9a2efa-73d4-41be-92ee-61654388a2b1: Claiming fa:16:3e:67:fe:a5 10.100.0.4
Jan 20 09:33:44 np0005588920 nova_compute[226886]: 2026-01-20 14:33:44.707 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:44 np0005588920 systemd-udevd[242310]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:33:44 np0005588920 systemd-machined[196121]: New machine qemu-18-instance-00000028.
Jan 20 09:33:44 np0005588920 NetworkManager[49076]: <info>  [1768919624.7519] device (tap7a9a2efa-73): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:33:44 np0005588920 NetworkManager[49076]: <info>  [1768919624.7527] device (tap7a9a2efa-73): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:33:44 np0005588920 nova_compute[226886]: 2026-01-20 14:33:44.766 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:44 np0005588920 systemd[1]: Started Virtual Machine qemu-18-instance-00000028.
Jan 20 09:33:44 np0005588920 ovn_controller[133971]: 2026-01-20T14:33:44Z|00105|binding|INFO|Setting lport 7a9a2efa-73d4-41be-92ee-61654388a2b1 ovn-installed in OVS
Jan 20 09:33:44 np0005588920 ovn_controller[133971]: 2026-01-20T14:33:44Z|00106|binding|INFO|Setting lport 7a9a2efa-73d4-41be-92ee-61654388a2b1 up in Southbound
Jan 20 09:33:44 np0005588920 nova_compute[226886]: 2026-01-20 14:33:44.773 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:33:44.775 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:67:fe:a5 10.100.0.4'], port_security=['fa:16:3e:67:fe:a5 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'c9a86cb2-b092-4887-b47d-1a05fb756a83', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a841e7a1434c488390475174e10bc161', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0bbdea05-fba7-47c7-ba4e-5dac58212a25', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c43dea88-ea55-4069-a4be-2c30a432a754, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=7a9a2efa-73d4-41be-92ee-61654388a2b1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:33:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:33:44.776 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 7a9a2efa-73d4-41be-92ee-61654388a2b1 in datapath 33c9a20a-d976-42a8-b8bf-f83ddfc97c9a bound to our chassis#033[00m
Jan 20 09:33:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:33:44.778 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 33c9a20a-d976-42a8-b8bf-f83ddfc97c9a#033[00m
Jan 20 09:33:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:33:44.792 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2fd28f50-bdd1-4f92-9340-802cb5293105]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:33:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:33:44.793 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap33c9a20a-d1 in ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:33:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:33:44.795 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap33c9a20a-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:33:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:33:44.795 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[fb672441-4b92-4029-9323-177d5db16eb0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:33:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:33:44.796 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[9e536331-9ed9-4e3d-b5be-57a96d6f9f2d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:33:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:33:44.806 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[a2e3d961-8f54-4153-b2af-5e299ff1bdec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:33:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:33:44.830 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[47925784-0c64-438c-bde5-00f976dd292d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:33:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:33:44.866 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[ff3f1066-16a1-426d-8528-87140dd29f7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:33:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:44.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:33:44.873 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5857accd-fe97-418d-a8ac-f30d29b5885b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:33:44 np0005588920 NetworkManager[49076]: <info>  [1768919624.8749] manager: (tap33c9a20a-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/64)
Jan 20 09:33:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:33:44.905 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[78039300-aad1-4ef5-98d9-2981807ab7fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:33:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:33:44.910 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[f5f7aef8-adfe-4575-82ae-01dab369a316]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:33:44 np0005588920 NetworkManager[49076]: <info>  [1768919624.9348] device (tap33c9a20a-d0): carrier: link connected
Jan 20 09:33:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:33:44.941 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[1a4342cb-219b-4951-8bf0-8808dff48f75]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:33:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:33:44.960 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c7d61a0b-17f6-4458-a76b-ceeb5c19731e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap33c9a20a-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:89:8e:bd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 466662, 'reachable_time': 25088, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242344, 'error': None, 'target': 'ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:33:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:33:44.977 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[8082e237-a0ee-48a3-a0d9-148cd4522a74]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe89:8ebd'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 466662, 'tstamp': 466662}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 242345, 'error': None, 'target': 'ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:33:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:33:44.997 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[802dc702-3ab4-4be3-9d76-86eee3a6236b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap33c9a20a-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:89:8e:bd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 466662, 'reachable_time': 25088, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 242346, 'error': None, 'target': 'ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:33:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:33:45.029 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c9ff3f5b-5ddb-40c4-a79e-1b4f379488e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:33:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:33:45.091 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d67fe834-1b00-4591-b459-cafff036ebd1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:33:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:33:45.092 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap33c9a20a-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:33:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:33:45.093 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:33:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:33:45.093 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap33c9a20a-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:33:45 np0005588920 nova_compute[226886]: 2026-01-20 14:33:45.095 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:45 np0005588920 NetworkManager[49076]: <info>  [1768919625.0956] manager: (tap33c9a20a-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/65)
Jan 20 09:33:45 np0005588920 kernel: tap33c9a20a-d0: entered promiscuous mode
Jan 20 09:33:45 np0005588920 nova_compute[226886]: 2026-01-20 14:33:45.097 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:33:45.099 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap33c9a20a-d0, col_values=(('external_ids', {'iface-id': '90c69687-c788-4dba-881f-3ed4a5ee6007'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:33:45 np0005588920 nova_compute[226886]: 2026-01-20 14:33:45.100 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:45 np0005588920 ovn_controller[133971]: 2026-01-20T14:33:45Z|00107|binding|INFO|Releasing lport 90c69687-c788-4dba-881f-3ed4a5ee6007 from this chassis (sb_readonly=0)
Jan 20 09:33:45 np0005588920 nova_compute[226886]: 2026-01-20 14:33:45.121 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:33:45.122 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/33c9a20a-d976-42a8-b8bf-f83ddfc97c9a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/33c9a20a-d976-42a8-b8bf-f83ddfc97c9a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:33:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:33:45.123 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b3a080fd-3972-48c3-a274-e00f40d28e60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:33:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:33:45.124 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:33:45 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 09:33:45 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 09:33:45 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a
Jan 20 09:33:45 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 09:33:45 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 09:33:45 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 09:33:45 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/33c9a20a-d976-42a8-b8bf-f83ddfc97c9a.pid.haproxy
Jan 20 09:33:45 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 09:33:45 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:33:45 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 09:33:45 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 09:33:45 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 09:33:45 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 09:33:45 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 09:33:45 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 09:33:45 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 09:33:45 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 09:33:45 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 09:33:45 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 09:33:45 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 09:33:45 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 09:33:45 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 09:33:45 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:33:45 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:33:45 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 09:33:45 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 09:33:45 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:33:45 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID 33c9a20a-d976-42a8-b8bf-f83ddfc97c9a
Jan 20 09:33:45 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:33:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:33:45.125 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a', 'env', 'PROCESS_TAG=haproxy-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/33c9a20a-d976-42a8-b8bf-f83ddfc97c9a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:33:45 np0005588920 nova_compute[226886]: 2026-01-20 14:33:45.202 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919625.201869, c9a86cb2-b092-4887-b47d-1a05fb756a83 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:33:45 np0005588920 nova_compute[226886]: 2026-01-20 14:33:45.202 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] VM Started (Lifecycle Event)#033[00m
Jan 20 09:33:45 np0005588920 nova_compute[226886]: 2026-01-20 14:33:45.226 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:33:45 np0005588920 nova_compute[226886]: 2026-01-20 14:33:45.233 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919625.2048755, c9a86cb2-b092-4887-b47d-1a05fb756a83 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:33:45 np0005588920 nova_compute[226886]: 2026-01-20 14:33:45.234 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:33:45 np0005588920 nova_compute[226886]: 2026-01-20 14:33:45.258 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:33:45 np0005588920 nova_compute[226886]: 2026-01-20 14:33:45.262 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:33:45 np0005588920 nova_compute[226886]: 2026-01-20 14:33:45.287 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:33:45 np0005588920 podman[242420]: 2026-01-20 14:33:45.477137665 +0000 UTC m=+0.052640829 container create 91a85c9d4b6ef20c39121a389ad5daacc90287436aee76b40ecef6064fcbf811 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 20 09:33:45 np0005588920 systemd[1]: Started libpod-conmon-91a85c9d4b6ef20c39121a389ad5daacc90287436aee76b40ecef6064fcbf811.scope.
Jan 20 09:33:45 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:33:45 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/805c9a9b81947f21b33ab479b627dfe2bcab8782966047413eac7a75abad9f31/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:33:45 np0005588920 podman[242420]: 2026-01-20 14:33:45.448446087 +0000 UTC m=+0.023949271 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:33:45 np0005588920 podman[242420]: 2026-01-20 14:33:45.554850504 +0000 UTC m=+0.130353668 container init 91a85c9d4b6ef20c39121a389ad5daacc90287436aee76b40ecef6064fcbf811 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 20 09:33:45 np0005588920 podman[242420]: 2026-01-20 14:33:45.559620874 +0000 UTC m=+0.135124038 container start 91a85c9d4b6ef20c39121a389ad5daacc90287436aee76b40ecef6064fcbf811 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:33:45 np0005588920 neutron-haproxy-ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a[242435]: [NOTICE]   (242439) : New worker (242441) forked
Jan 20 09:33:45 np0005588920 neutron-haproxy-ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a[242435]: [NOTICE]   (242439) : Loading success.
Jan 20 09:33:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:33:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:46.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:33:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:46.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:48 np0005588920 nova_compute[226886]: 2026-01-20 14:33:48.074 226890 DEBUG nova.compute.manager [req-465c4a74-fd8a-49b3-aff9-875b2de025d4 req-976bc3e7-0539-46f8-ad61-d582e3421be8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Received event network-vif-plugged-7a9a2efa-73d4-41be-92ee-61654388a2b1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:33:48 np0005588920 nova_compute[226886]: 2026-01-20 14:33:48.074 226890 DEBUG oslo_concurrency.lockutils [req-465c4a74-fd8a-49b3-aff9-875b2de025d4 req-976bc3e7-0539-46f8-ad61-d582e3421be8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "c9a86cb2-b092-4887-b47d-1a05fb756a83-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:33:48 np0005588920 nova_compute[226886]: 2026-01-20 14:33:48.075 226890 DEBUG oslo_concurrency.lockutils [req-465c4a74-fd8a-49b3-aff9-875b2de025d4 req-976bc3e7-0539-46f8-ad61-d582e3421be8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c9a86cb2-b092-4887-b47d-1a05fb756a83-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:33:48 np0005588920 nova_compute[226886]: 2026-01-20 14:33:48.075 226890 DEBUG oslo_concurrency.lockutils [req-465c4a74-fd8a-49b3-aff9-875b2de025d4 req-976bc3e7-0539-46f8-ad61-d582e3421be8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c9a86cb2-b092-4887-b47d-1a05fb756a83-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:33:48 np0005588920 nova_compute[226886]: 2026-01-20 14:33:48.075 226890 DEBUG nova.compute.manager [req-465c4a74-fd8a-49b3-aff9-875b2de025d4 req-976bc3e7-0539-46f8-ad61-d582e3421be8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Processing event network-vif-plugged-7a9a2efa-73d4-41be-92ee-61654388a2b1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:33:48 np0005588920 nova_compute[226886]: 2026-01-20 14:33:48.076 226890 DEBUG nova.compute.manager [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:33:48 np0005588920 nova_compute[226886]: 2026-01-20 14:33:48.080 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919628.080582, c9a86cb2-b092-4887-b47d-1a05fb756a83 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:33:48 np0005588920 nova_compute[226886]: 2026-01-20 14:33:48.081 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:33:48 np0005588920 nova_compute[226886]: 2026-01-20 14:33:48.082 226890 DEBUG nova.virt.libvirt.driver [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:33:48 np0005588920 nova_compute[226886]: 2026-01-20 14:33:48.086 226890 INFO nova.virt.libvirt.driver [-] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Instance spawned successfully.#033[00m
Jan 20 09:33:48 np0005588920 nova_compute[226886]: 2026-01-20 14:33:48.087 226890 DEBUG nova.virt.libvirt.driver [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:33:48 np0005588920 nova_compute[226886]: 2026-01-20 14:33:48.108 226890 DEBUG nova.virt.libvirt.driver [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:33:48 np0005588920 nova_compute[226886]: 2026-01-20 14:33:48.109 226890 DEBUG nova.virt.libvirt.driver [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:33:48 np0005588920 nova_compute[226886]: 2026-01-20 14:33:48.109 226890 DEBUG nova.virt.libvirt.driver [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:33:48 np0005588920 nova_compute[226886]: 2026-01-20 14:33:48.109 226890 DEBUG nova.virt.libvirt.driver [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:33:48 np0005588920 nova_compute[226886]: 2026-01-20 14:33:48.110 226890 DEBUG nova.virt.libvirt.driver [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:33:48 np0005588920 nova_compute[226886]: 2026-01-20 14:33:48.110 226890 DEBUG nova.virt.libvirt.driver [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:33:48 np0005588920 nova_compute[226886]: 2026-01-20 14:33:48.117 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:33:48 np0005588920 nova_compute[226886]: 2026-01-20 14:33:48.120 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:33:48 np0005588920 nova_compute[226886]: 2026-01-20 14:33:48.149 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:33:48 np0005588920 nova_compute[226886]: 2026-01-20 14:33:48.182 226890 INFO nova.compute.manager [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Took 11.14 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:33:48 np0005588920 nova_compute[226886]: 2026-01-20 14:33:48.183 226890 DEBUG nova.compute.manager [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:33:48 np0005588920 nova_compute[226886]: 2026-01-20 14:33:48.296 226890 INFO nova.compute.manager [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Took 12.27 seconds to build instance.#033[00m
Jan 20 09:33:48 np0005588920 nova_compute[226886]: 2026-01-20 14:33:48.319 226890 DEBUG oslo_concurrency.lockutils [None req-21e8e29f-d4ff-413e-ba69-7e534a44a030 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "c9a86cb2-b092-4887-b47d-1a05fb756a83" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.404s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:33:48 np0005588920 nova_compute[226886]: 2026-01-20 14:33:48.379 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:48.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:48.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:49 np0005588920 nova_compute[226886]: 2026-01-20 14:33:49.430 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:49 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e168 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:33:50 np0005588920 nova_compute[226886]: 2026-01-20 14:33:50.196 226890 DEBUG nova.compute.manager [req-380d2df7-56e4-48f4-9236-f0a6a95d05f8 req-93981c71-47dd-4437-9912-b0610bc389d9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Received event network-vif-plugged-7a9a2efa-73d4-41be-92ee-61654388a2b1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:33:50 np0005588920 nova_compute[226886]: 2026-01-20 14:33:50.197 226890 DEBUG oslo_concurrency.lockutils [req-380d2df7-56e4-48f4-9236-f0a6a95d05f8 req-93981c71-47dd-4437-9912-b0610bc389d9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "c9a86cb2-b092-4887-b47d-1a05fb756a83-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:33:50 np0005588920 nova_compute[226886]: 2026-01-20 14:33:50.197 226890 DEBUG oslo_concurrency.lockutils [req-380d2df7-56e4-48f4-9236-f0a6a95d05f8 req-93981c71-47dd-4437-9912-b0610bc389d9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c9a86cb2-b092-4887-b47d-1a05fb756a83-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:33:50 np0005588920 nova_compute[226886]: 2026-01-20 14:33:50.197 226890 DEBUG oslo_concurrency.lockutils [req-380d2df7-56e4-48f4-9236-f0a6a95d05f8 req-93981c71-47dd-4437-9912-b0610bc389d9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c9a86cb2-b092-4887-b47d-1a05fb756a83-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:33:50 np0005588920 nova_compute[226886]: 2026-01-20 14:33:50.198 226890 DEBUG nova.compute.manager [req-380d2df7-56e4-48f4-9236-f0a6a95d05f8 req-93981c71-47dd-4437-9912-b0610bc389d9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] No waiting events found dispatching network-vif-plugged-7a9a2efa-73d4-41be-92ee-61654388a2b1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:33:50 np0005588920 nova_compute[226886]: 2026-01-20 14:33:50.198 226890 WARNING nova.compute.manager [req-380d2df7-56e4-48f4-9236-f0a6a95d05f8 req-93981c71-47dd-4437-9912-b0610bc389d9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Received unexpected event network-vif-plugged-7a9a2efa-73d4-41be-92ee-61654388a2b1 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:33:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:33:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:50.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:33:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:50.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:52 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:33:52 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:33:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:52.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:52.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:53 np0005588920 nova_compute[226886]: 2026-01-20 14:33:53.382 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:54 np0005588920 nova_compute[226886]: 2026-01-20 14:33:54.458 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:54.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:54 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e168 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:33:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:54.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:56 np0005588920 podman[242500]: 2026-01-20 14:33:56.00960252 +0000 UTC m=+0.072851869 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 20 09:33:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:33:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:56.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:33:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:56.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:58 np0005588920 nova_compute[226886]: 2026-01-20 14:33:58.385 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:58.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:33:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:58.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:59 np0005588920 nova_compute[226886]: 2026-01-20 14:33:59.460 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:59 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e168 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:34:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:00.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:00.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 09:34:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:02.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 09:34:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:02.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:03 np0005588920 ovn_controller[133971]: 2026-01-20T14:34:03Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:67:fe:a5 10.100.0.4
Jan 20 09:34:03 np0005588920 ovn_controller[133971]: 2026-01-20T14:34:03Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:67:fe:a5 10.100.0.4
Jan 20 09:34:03 np0005588920 nova_compute[226886]: 2026-01-20 14:34:03.427 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:04 np0005588920 nova_compute[226886]: 2026-01-20 14:34:04.464 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:04.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:04 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e168 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:34:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 09:34:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:04.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 09:34:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:06.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:34:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:06.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:34:08 np0005588920 nova_compute[226886]: 2026-01-20 14:34:08.429 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:08.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:08.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:09 np0005588920 nova_compute[226886]: 2026-01-20 14:34:09.466 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e168 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:34:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:34:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:10.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:34:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:10.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:34:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:12.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:34:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:34:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:12.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:34:13 np0005588920 podman[242521]: 2026-01-20 14:34:13.036664264 +0000 UTC m=+0.114873442 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 20 09:34:13 np0005588920 nova_compute[226886]: 2026-01-20 14:34:13.430 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:14 np0005588920 nova_compute[226886]: 2026-01-20 14:34:14.468 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:14.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e168 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:34:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:14.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:34:16.436 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:34:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:34:16.438 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:34:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:34:16.439 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:34:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:16.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:16.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:18 np0005588920 nova_compute[226886]: 2026-01-20 14:34:18.434 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:18.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:18.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:19 np0005588920 nova_compute[226886]: 2026-01-20 14:34:19.470 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:19 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e168 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:34:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:20.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:20.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:21 np0005588920 nova_compute[226886]: 2026-01-20 14:34:21.060 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:34:21 np0005588920 nova_compute[226886]: 2026-01-20 14:34:21.080 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:34:21 np0005588920 nova_compute[226886]: 2026-01-20 14:34:21.080 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:34:21 np0005588920 nova_compute[226886]: 2026-01-20 14:34:21.081 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 09:34:21 np0005588920 nova_compute[226886]: 2026-01-20 14:34:21.458 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "refresh_cache-c9a86cb2-b092-4887-b47d-1a05fb756a83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:34:21 np0005588920 nova_compute[226886]: 2026-01-20 14:34:21.458 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquired lock "refresh_cache-c9a86cb2-b092-4887-b47d-1a05fb756a83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:34:21 np0005588920 nova_compute[226886]: 2026-01-20 14:34:21.458 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 09:34:21 np0005588920 nova_compute[226886]: 2026-01-20 14:34:21.459 226890 DEBUG nova.objects.instance [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c9a86cb2-b092-4887-b47d-1a05fb756a83 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:34:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:22.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:22.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:23 np0005588920 nova_compute[226886]: 2026-01-20 14:34:23.436 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:34:23.630 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:34:23 np0005588920 nova_compute[226886]: 2026-01-20 14:34:23.631 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:34:23.631 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:34:24 np0005588920 nova_compute[226886]: 2026-01-20 14:34:24.458 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Updating instance_info_cache with network_info: [{"id": "7a9a2efa-73d4-41be-92ee-61654388a2b1", "address": "fa:16:3e:67:fe:a5", "network": {"id": "33c9a20a-d976-42a8-b8bf-f83ddfc97c9a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-202342440-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a841e7a1434c488390475174e10bc161", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a9a2efa-73", "ovs_interfaceid": "7a9a2efa-73d4-41be-92ee-61654388a2b1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:34:24 np0005588920 nova_compute[226886]: 2026-01-20 14:34:24.475 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Releasing lock "refresh_cache-c9a86cb2-b092-4887-b47d-1a05fb756a83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:34:24 np0005588920 nova_compute[226886]: 2026-01-20 14:34:24.475 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 09:34:24 np0005588920 nova_compute[226886]: 2026-01-20 14:34:24.475 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:34:24 np0005588920 nova_compute[226886]: 2026-01-20 14:34:24.476 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:34:24 np0005588920 nova_compute[226886]: 2026-01-20 14:34:24.477 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:34:24 np0005588920 nova_compute[226886]: 2026-01-20 14:34:24.477 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:34:24 np0005588920 nova_compute[226886]: 2026-01-20 14:34:24.477 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:34:24 np0005588920 nova_compute[226886]: 2026-01-20 14:34:24.519 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:34:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:24.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:34:24 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e168 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:34:24 np0005588920 nova_compute[226886]: 2026-01-20 14:34:24.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:34:24 np0005588920 nova_compute[226886]: 2026-01-20 14:34:24.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:34:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:24.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:25 np0005588920 nova_compute[226886]: 2026-01-20 14:34:25.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:34:25 np0005588920 nova_compute[226886]: 2026-01-20 14:34:25.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:34:25 np0005588920 nova_compute[226886]: 2026-01-20 14:34:25.749 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:34:25 np0005588920 nova_compute[226886]: 2026-01-20 14:34:25.751 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:34:25 np0005588920 nova_compute[226886]: 2026-01-20 14:34:25.751 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:34:25 np0005588920 nova_compute[226886]: 2026-01-20 14:34:25.752 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:34:25 np0005588920 nova_compute[226886]: 2026-01-20 14:34:25.753 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:34:26 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:34:26 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2572931249' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:34:26 np0005588920 nova_compute[226886]: 2026-01-20 14:34:26.263 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.510s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:34:26 np0005588920 nova_compute[226886]: 2026-01-20 14:34:26.328 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-00000028 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:34:26 np0005588920 nova_compute[226886]: 2026-01-20 14:34:26.329 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-00000028 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:34:26 np0005588920 podman[242570]: 2026-01-20 14:34:26.350277202 +0000 UTC m=+0.048872831 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:34:26 np0005588920 nova_compute[226886]: 2026-01-20 14:34:26.468 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:34:26 np0005588920 nova_compute[226886]: 2026-01-20 14:34:26.469 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4569MB free_disk=20.860675811767578GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:34:26 np0005588920 nova_compute[226886]: 2026-01-20 14:34:26.470 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:34:26 np0005588920 nova_compute[226886]: 2026-01-20 14:34:26.470 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:34:26 np0005588920 nova_compute[226886]: 2026-01-20 14:34:26.579 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance c9a86cb2-b092-4887-b47d-1a05fb756a83 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:34:26 np0005588920 nova_compute[226886]: 2026-01-20 14:34:26.580 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:34:26 np0005588920 nova_compute[226886]: 2026-01-20 14:34:26.581 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:34:26 np0005588920 nova_compute[226886]: 2026-01-20 14:34:26.642 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:34:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:34:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:26.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:34:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:34:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:26.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:34:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:34:27 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/252065068' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:34:27 np0005588920 nova_compute[226886]: 2026-01-20 14:34:27.096 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:34:27 np0005588920 nova_compute[226886]: 2026-01-20 14:34:27.102 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:34:27 np0005588920 nova_compute[226886]: 2026-01-20 14:34:27.121 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:34:27 np0005588920 nova_compute[226886]: 2026-01-20 14:34:27.157 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:34:27 np0005588920 nova_compute[226886]: 2026-01-20 14:34:27.158 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.688s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:34:28 np0005588920 nova_compute[226886]: 2026-01-20 14:34:28.439 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:28.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:28 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #46. Immutable memtables: 3.
Jan 20 09:34:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:34:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:28.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:34:29 np0005588920 nova_compute[226886]: 2026-01-20 14:34:29.521 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:29 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:34:29.633 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:34:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e168 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:34:30 np0005588920 nova_compute[226886]: 2026-01-20 14:34:30.512 226890 DEBUG oslo_concurrency.lockutils [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Acquiring lock "39013d10-cf09-4fc7-826c-99746ff0eb68" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:34:30 np0005588920 nova_compute[226886]: 2026-01-20 14:34:30.512 226890 DEBUG oslo_concurrency.lockutils [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "39013d10-cf09-4fc7-826c-99746ff0eb68" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:34:30 np0005588920 nova_compute[226886]: 2026-01-20 14:34:30.531 226890 DEBUG nova.compute.manager [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:34:30 np0005588920 nova_compute[226886]: 2026-01-20 14:34:30.624 226890 DEBUG oslo_concurrency.lockutils [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:34:30 np0005588920 nova_compute[226886]: 2026-01-20 14:34:30.625 226890 DEBUG oslo_concurrency.lockutils [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:34:30 np0005588920 nova_compute[226886]: 2026-01-20 14:34:30.635 226890 DEBUG nova.virt.hardware [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:34:30 np0005588920 nova_compute[226886]: 2026-01-20 14:34:30.635 226890 INFO nova.compute.claims [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 09:34:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:34:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:30.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:34:30 np0005588920 nova_compute[226886]: 2026-01-20 14:34:30.784 226890 DEBUG oslo_concurrency.processutils [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:34:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:30.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:31 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:34:31 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1462311559' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:34:31 np0005588920 nova_compute[226886]: 2026-01-20 14:34:31.244 226890 DEBUG oslo_concurrency.processutils [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:34:31 np0005588920 nova_compute[226886]: 2026-01-20 14:34:31.249 226890 DEBUG nova.compute.provider_tree [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:34:31 np0005588920 nova_compute[226886]: 2026-01-20 14:34:31.269 226890 DEBUG nova.scheduler.client.report [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:34:31 np0005588920 nova_compute[226886]: 2026-01-20 14:34:31.304 226890 DEBUG oslo_concurrency.lockutils [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.679s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:34:31 np0005588920 nova_compute[226886]: 2026-01-20 14:34:31.305 226890 DEBUG nova.compute.manager [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:34:31 np0005588920 nova_compute[226886]: 2026-01-20 14:34:31.364 226890 DEBUG nova.compute.manager [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:34:31 np0005588920 nova_compute[226886]: 2026-01-20 14:34:31.364 226890 DEBUG nova.network.neutron [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:34:31 np0005588920 nova_compute[226886]: 2026-01-20 14:34:31.384 226890 INFO nova.virt.libvirt.driver [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:34:31 np0005588920 nova_compute[226886]: 2026-01-20 14:34:31.399 226890 DEBUG nova.compute.manager [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:34:31 np0005588920 nova_compute[226886]: 2026-01-20 14:34:31.479 226890 DEBUG nova.compute.manager [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:34:31 np0005588920 nova_compute[226886]: 2026-01-20 14:34:31.480 226890 DEBUG nova.virt.libvirt.driver [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:34:31 np0005588920 nova_compute[226886]: 2026-01-20 14:34:31.481 226890 INFO nova.virt.libvirt.driver [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Creating image(s)#033[00m
Jan 20 09:34:31 np0005588920 nova_compute[226886]: 2026-01-20 14:34:31.506 226890 DEBUG nova.storage.rbd_utils [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] rbd image 39013d10-cf09-4fc7-826c-99746ff0eb68_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:34:31 np0005588920 nova_compute[226886]: 2026-01-20 14:34:31.533 226890 DEBUG nova.storage.rbd_utils [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] rbd image 39013d10-cf09-4fc7-826c-99746ff0eb68_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:34:31 np0005588920 nova_compute[226886]: 2026-01-20 14:34:31.562 226890 DEBUG nova.storage.rbd_utils [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] rbd image 39013d10-cf09-4fc7-826c-99746ff0eb68_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:34:31 np0005588920 nova_compute[226886]: 2026-01-20 14:34:31.565 226890 DEBUG oslo_concurrency.processutils [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:34:31 np0005588920 nova_compute[226886]: 2026-01-20 14:34:31.641 226890 DEBUG oslo_concurrency.processutils [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:34:31 np0005588920 nova_compute[226886]: 2026-01-20 14:34:31.642 226890 DEBUG oslo_concurrency.lockutils [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:34:31 np0005588920 nova_compute[226886]: 2026-01-20 14:34:31.643 226890 DEBUG oslo_concurrency.lockutils [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:34:31 np0005588920 nova_compute[226886]: 2026-01-20 14:34:31.643 226890 DEBUG oslo_concurrency.lockutils [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:34:31 np0005588920 nova_compute[226886]: 2026-01-20 14:34:31.671 226890 DEBUG nova.storage.rbd_utils [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] rbd image 39013d10-cf09-4fc7-826c-99746ff0eb68_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:34:31 np0005588920 nova_compute[226886]: 2026-01-20 14:34:31.674 226890 DEBUG oslo_concurrency.processutils [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 39013d10-cf09-4fc7-826c-99746ff0eb68_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:34:31 np0005588920 nova_compute[226886]: 2026-01-20 14:34:31.994 226890 DEBUG oslo_concurrency.processutils [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 39013d10-cf09-4fc7-826c-99746ff0eb68_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.321s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:34:32 np0005588920 nova_compute[226886]: 2026-01-20 14:34:32.034 226890 DEBUG nova.policy [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '56e2959629114d3d8a48e7a80ed96c4b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3750c56415134773aa9d9880038f1749', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:34:32 np0005588920 nova_compute[226886]: 2026-01-20 14:34:32.073 226890 DEBUG nova.storage.rbd_utils [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] resizing rbd image 39013d10-cf09-4fc7-826c-99746ff0eb68_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:34:32 np0005588920 nova_compute[226886]: 2026-01-20 14:34:32.163 226890 DEBUG nova.objects.instance [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lazy-loading 'migration_context' on Instance uuid 39013d10-cf09-4fc7-826c-99746ff0eb68 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:34:32 np0005588920 nova_compute[226886]: 2026-01-20 14:34:32.174 226890 DEBUG nova.virt.libvirt.driver [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:34:32 np0005588920 nova_compute[226886]: 2026-01-20 14:34:32.174 226890 DEBUG nova.virt.libvirt.driver [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Ensure instance console log exists: /var/lib/nova/instances/39013d10-cf09-4fc7-826c-99746ff0eb68/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:34:32 np0005588920 nova_compute[226886]: 2026-01-20 14:34:32.175 226890 DEBUG oslo_concurrency.lockutils [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:34:32 np0005588920 nova_compute[226886]: 2026-01-20 14:34:32.175 226890 DEBUG oslo_concurrency.lockutils [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:34:32 np0005588920 nova_compute[226886]: 2026-01-20 14:34:32.175 226890 DEBUG oslo_concurrency.lockutils [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:34:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:34:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:32.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:34:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:32.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:33 np0005588920 nova_compute[226886]: 2026-01-20 14:34:33.442 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:33 np0005588920 nova_compute[226886]: 2026-01-20 14:34:33.493 226890 DEBUG nova.network.neutron [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Successfully created port: 3e4b13e0-655b-41b9-8274-940d2a5cdf49 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:34:34 np0005588920 nova_compute[226886]: 2026-01-20 14:34:34.317 226890 DEBUG nova.network.neutron [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Successfully updated port: 3e4b13e0-655b-41b9-8274-940d2a5cdf49 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:34:34 np0005588920 nova_compute[226886]: 2026-01-20 14:34:34.345 226890 DEBUG oslo_concurrency.lockutils [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Acquiring lock "refresh_cache-39013d10-cf09-4fc7-826c-99746ff0eb68" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:34:34 np0005588920 nova_compute[226886]: 2026-01-20 14:34:34.346 226890 DEBUG oslo_concurrency.lockutils [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Acquired lock "refresh_cache-39013d10-cf09-4fc7-826c-99746ff0eb68" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:34:34 np0005588920 nova_compute[226886]: 2026-01-20 14:34:34.346 226890 DEBUG nova.network.neutron [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:34:34 np0005588920 nova_compute[226886]: 2026-01-20 14:34:34.406 226890 DEBUG nova.compute.manager [req-9da3b198-e953-42a2-bd14-48dd38157055 req-e02e6539-ab44-42cb-80e0-9c05b88ae051 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Received event network-changed-3e4b13e0-655b-41b9-8274-940d2a5cdf49 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:34:34 np0005588920 nova_compute[226886]: 2026-01-20 14:34:34.406 226890 DEBUG nova.compute.manager [req-9da3b198-e953-42a2-bd14-48dd38157055 req-e02e6539-ab44-42cb-80e0-9c05b88ae051 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Refreshing instance network info cache due to event network-changed-3e4b13e0-655b-41b9-8274-940d2a5cdf49. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:34:34 np0005588920 nova_compute[226886]: 2026-01-20 14:34:34.406 226890 DEBUG oslo_concurrency.lockutils [req-9da3b198-e953-42a2-bd14-48dd38157055 req-e02e6539-ab44-42cb-80e0-9c05b88ae051 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-39013d10-cf09-4fc7-826c-99746ff0eb68" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:34:34 np0005588920 nova_compute[226886]: 2026-01-20 14:34:34.524 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:34 np0005588920 nova_compute[226886]: 2026-01-20 14:34:34.594 226890 DEBUG nova.network.neutron [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:34:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:34:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:34.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:34:34 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e168 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:34:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:34.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:35 np0005588920 nova_compute[226886]: 2026-01-20 14:34:35.521 226890 DEBUG nova.network.neutron [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Updating instance_info_cache with network_info: [{"id": "3e4b13e0-655b-41b9-8274-940d2a5cdf49", "address": "fa:16:3e:50:d7:49", "network": {"id": "abb83e3e-0b12-431b-ad86-a1d271b5b46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-766235638-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3750c56415134773aa9d9880038f1749", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e4b13e0-65", "ovs_interfaceid": "3e4b13e0-655b-41b9-8274-940d2a5cdf49", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:34:35 np0005588920 nova_compute[226886]: 2026-01-20 14:34:35.546 226890 DEBUG oslo_concurrency.lockutils [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Releasing lock "refresh_cache-39013d10-cf09-4fc7-826c-99746ff0eb68" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:34:35 np0005588920 nova_compute[226886]: 2026-01-20 14:34:35.546 226890 DEBUG nova.compute.manager [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Instance network_info: |[{"id": "3e4b13e0-655b-41b9-8274-940d2a5cdf49", "address": "fa:16:3e:50:d7:49", "network": {"id": "abb83e3e-0b12-431b-ad86-a1d271b5b46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-766235638-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3750c56415134773aa9d9880038f1749", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e4b13e0-65", "ovs_interfaceid": "3e4b13e0-655b-41b9-8274-940d2a5cdf49", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:34:35 np0005588920 nova_compute[226886]: 2026-01-20 14:34:35.547 226890 DEBUG oslo_concurrency.lockutils [req-9da3b198-e953-42a2-bd14-48dd38157055 req-e02e6539-ab44-42cb-80e0-9c05b88ae051 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-39013d10-cf09-4fc7-826c-99746ff0eb68" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:34:35 np0005588920 nova_compute[226886]: 2026-01-20 14:34:35.547 226890 DEBUG nova.network.neutron [req-9da3b198-e953-42a2-bd14-48dd38157055 req-e02e6539-ab44-42cb-80e0-9c05b88ae051 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Refreshing network info cache for port 3e4b13e0-655b-41b9-8274-940d2a5cdf49 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:34:35 np0005588920 nova_compute[226886]: 2026-01-20 14:34:35.552 226890 DEBUG nova.virt.libvirt.driver [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Start _get_guest_xml network_info=[{"id": "3e4b13e0-655b-41b9-8274-940d2a5cdf49", "address": "fa:16:3e:50:d7:49", "network": {"id": "abb83e3e-0b12-431b-ad86-a1d271b5b46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-766235638-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3750c56415134773aa9d9880038f1749", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e4b13e0-65", "ovs_interfaceid": "3e4b13e0-655b-41b9-8274-940d2a5cdf49", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:34:35 np0005588920 nova_compute[226886]: 2026-01-20 14:34:35.558 226890 WARNING nova.virt.libvirt.driver [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:34:35 np0005588920 nova_compute[226886]: 2026-01-20 14:34:35.563 226890 DEBUG nova.virt.libvirt.host [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:34:35 np0005588920 nova_compute[226886]: 2026-01-20 14:34:35.565 226890 DEBUG nova.virt.libvirt.host [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:34:35 np0005588920 nova_compute[226886]: 2026-01-20 14:34:35.567 226890 DEBUG nova.virt.libvirt.host [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:34:35 np0005588920 nova_compute[226886]: 2026-01-20 14:34:35.568 226890 DEBUG nova.virt.libvirt.host [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:34:35 np0005588920 nova_compute[226886]: 2026-01-20 14:34:35.570 226890 DEBUG nova.virt.libvirt.driver [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:34:35 np0005588920 nova_compute[226886]: 2026-01-20 14:34:35.570 226890 DEBUG nova.virt.hardware [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:34:35 np0005588920 nova_compute[226886]: 2026-01-20 14:34:35.571 226890 DEBUG nova.virt.hardware [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:34:35 np0005588920 nova_compute[226886]: 2026-01-20 14:34:35.571 226890 DEBUG nova.virt.hardware [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:34:35 np0005588920 nova_compute[226886]: 2026-01-20 14:34:35.572 226890 DEBUG nova.virt.hardware [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:34:35 np0005588920 nova_compute[226886]: 2026-01-20 14:34:35.572 226890 DEBUG nova.virt.hardware [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:34:35 np0005588920 nova_compute[226886]: 2026-01-20 14:34:35.573 226890 DEBUG nova.virt.hardware [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:34:35 np0005588920 nova_compute[226886]: 2026-01-20 14:34:35.573 226890 DEBUG nova.virt.hardware [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:34:35 np0005588920 nova_compute[226886]: 2026-01-20 14:34:35.574 226890 DEBUG nova.virt.hardware [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:34:35 np0005588920 nova_compute[226886]: 2026-01-20 14:34:35.574 226890 DEBUG nova.virt.hardware [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:34:35 np0005588920 nova_compute[226886]: 2026-01-20 14:34:35.574 226890 DEBUG nova.virt.hardware [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:34:35 np0005588920 nova_compute[226886]: 2026-01-20 14:34:35.575 226890 DEBUG nova.virt.hardware [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:34:35 np0005588920 nova_compute[226886]: 2026-01-20 14:34:35.580 226890 DEBUG oslo_concurrency.processutils [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:34:36 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:34:36 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/366852801' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:34:36 np0005588920 nova_compute[226886]: 2026-01-20 14:34:36.043 226890 DEBUG oslo_concurrency.processutils [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:34:36 np0005588920 nova_compute[226886]: 2026-01-20 14:34:36.221 226890 DEBUG nova.storage.rbd_utils [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] rbd image 39013d10-cf09-4fc7-826c-99746ff0eb68_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:34:36 np0005588920 nova_compute[226886]: 2026-01-20 14:34:36.225 226890 DEBUG oslo_concurrency.processutils [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:34:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:36.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:36 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:34:36 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3690699125' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:34:36 np0005588920 nova_compute[226886]: 2026-01-20 14:34:36.820 226890 DEBUG oslo_concurrency.processutils [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.595s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:34:36 np0005588920 nova_compute[226886]: 2026-01-20 14:34:36.822 226890 DEBUG nova.virt.libvirt.vif [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:34:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-2138888348',display_name='tempest-ImagesTestJSON-server-2138888348',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-2138888348',id=44,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3750c56415134773aa9d9880038f1749',ramdisk_id='',reservation_id='r-i9x5sh3o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-338390217',owner_user_name='tempest-ImagesTestJSON-338390217-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:34:31Z,user_data=None,user_id='56e2959629114d3d8a48e7a80ed96c4b',uuid=39013d10-cf09-4fc7-826c-99746ff0eb68,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3e4b13e0-655b-41b9-8274-940d2a5cdf49", "address": "fa:16:3e:50:d7:49", "network": {"id": "abb83e3e-0b12-431b-ad86-a1d271b5b46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-766235638-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3750c56415134773aa9d9880038f1749", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e4b13e0-65", "ovs_interfaceid": "3e4b13e0-655b-41b9-8274-940d2a5cdf49", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:34:36 np0005588920 nova_compute[226886]: 2026-01-20 14:34:36.823 226890 DEBUG nova.network.os_vif_util [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Converting VIF {"id": "3e4b13e0-655b-41b9-8274-940d2a5cdf49", "address": "fa:16:3e:50:d7:49", "network": {"id": "abb83e3e-0b12-431b-ad86-a1d271b5b46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-766235638-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3750c56415134773aa9d9880038f1749", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e4b13e0-65", "ovs_interfaceid": "3e4b13e0-655b-41b9-8274-940d2a5cdf49", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:34:36 np0005588920 nova_compute[226886]: 2026-01-20 14:34:36.824 226890 DEBUG nova.network.os_vif_util [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:50:d7:49,bridge_name='br-int',has_traffic_filtering=True,id=3e4b13e0-655b-41b9-8274-940d2a5cdf49,network=Network(abb83e3e-0b12-431b-ad86-a1d271b5b46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e4b13e0-65') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:34:36 np0005588920 nova_compute[226886]: 2026-01-20 14:34:36.825 226890 DEBUG nova.objects.instance [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lazy-loading 'pci_devices' on Instance uuid 39013d10-cf09-4fc7-826c-99746ff0eb68 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:34:36 np0005588920 nova_compute[226886]: 2026-01-20 14:34:36.861 226890 DEBUG nova.virt.libvirt.driver [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:34:36 np0005588920 nova_compute[226886]:  <uuid>39013d10-cf09-4fc7-826c-99746ff0eb68</uuid>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:  <name>instance-0000002c</name>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:34:36 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:      <nova:name>tempest-ImagesTestJSON-server-2138888348</nova:name>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:34:35</nova:creationTime>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:34:36 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:        <nova:user uuid="56e2959629114d3d8a48e7a80ed96c4b">tempest-ImagesTestJSON-338390217-project-member</nova:user>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:        <nova:project uuid="3750c56415134773aa9d9880038f1749">tempest-ImagesTestJSON-338390217</nova:project>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:        <nova:port uuid="3e4b13e0-655b-41b9-8274-940d2a5cdf49">
Jan 20 09:34:36 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:      <entry name="serial">39013d10-cf09-4fc7-826c-99746ff0eb68</entry>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:      <entry name="uuid">39013d10-cf09-4fc7-826c-99746ff0eb68</entry>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:34:36 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/39013d10-cf09-4fc7-826c-99746ff0eb68_disk">
Jan 20 09:34:36 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:34:36 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:34:36 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/39013d10-cf09-4fc7-826c-99746ff0eb68_disk.config">
Jan 20 09:34:36 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:34:36 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:34:36 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:50:d7:49"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:      <target dev="tap3e4b13e0-65"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:34:36 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/39013d10-cf09-4fc7-826c-99746ff0eb68/console.log" append="off"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:34:36 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:34:36 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:34:36 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:34:36 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:34:36 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:34:36 np0005588920 nova_compute[226886]: 2026-01-20 14:34:36.862 226890 DEBUG nova.compute.manager [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Preparing to wait for external event network-vif-plugged-3e4b13e0-655b-41b9-8274-940d2a5cdf49 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:34:36 np0005588920 nova_compute[226886]: 2026-01-20 14:34:36.863 226890 DEBUG oslo_concurrency.lockutils [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Acquiring lock "39013d10-cf09-4fc7-826c-99746ff0eb68-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:34:36 np0005588920 nova_compute[226886]: 2026-01-20 14:34:36.863 226890 DEBUG oslo_concurrency.lockutils [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "39013d10-cf09-4fc7-826c-99746ff0eb68-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:34:36 np0005588920 nova_compute[226886]: 2026-01-20 14:34:36.863 226890 DEBUG oslo_concurrency.lockutils [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "39013d10-cf09-4fc7-826c-99746ff0eb68-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:34:36 np0005588920 nova_compute[226886]: 2026-01-20 14:34:36.864 226890 DEBUG nova.virt.libvirt.vif [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:34:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-2138888348',display_name='tempest-ImagesTestJSON-server-2138888348',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-2138888348',id=44,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3750c56415134773aa9d9880038f1749',ramdisk_id='',reservation_id='r-i9x5sh3o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-338390217',owner_user_name='tempest-ImagesTestJSON-338390217-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:34:31Z,user_data=None,user_id='56e2959629114d3d8a48e7a80ed96c4b',uuid=39013d10-cf09-4fc7-826c-99746ff0eb68,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3e4b13e0-655b-41b9-8274-940d2a5cdf49", "address": "fa:16:3e:50:d7:49", "network": {"id": "abb83e3e-0b12-431b-ad86-a1d271b5b46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-766235638-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3750c56415134773aa9d9880038f1749", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e4b13e0-65", "ovs_interfaceid": "3e4b13e0-655b-41b9-8274-940d2a5cdf49", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:34:36 np0005588920 nova_compute[226886]: 2026-01-20 14:34:36.864 226890 DEBUG nova.network.os_vif_util [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Converting VIF {"id": "3e4b13e0-655b-41b9-8274-940d2a5cdf49", "address": "fa:16:3e:50:d7:49", "network": {"id": "abb83e3e-0b12-431b-ad86-a1d271b5b46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-766235638-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3750c56415134773aa9d9880038f1749", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e4b13e0-65", "ovs_interfaceid": "3e4b13e0-655b-41b9-8274-940d2a5cdf49", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:34:36 np0005588920 nova_compute[226886]: 2026-01-20 14:34:36.865 226890 DEBUG nova.network.os_vif_util [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:50:d7:49,bridge_name='br-int',has_traffic_filtering=True,id=3e4b13e0-655b-41b9-8274-940d2a5cdf49,network=Network(abb83e3e-0b12-431b-ad86-a1d271b5b46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e4b13e0-65') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:34:36 np0005588920 nova_compute[226886]: 2026-01-20 14:34:36.865 226890 DEBUG os_vif [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:d7:49,bridge_name='br-int',has_traffic_filtering=True,id=3e4b13e0-655b-41b9-8274-940d2a5cdf49,network=Network(abb83e3e-0b12-431b-ad86-a1d271b5b46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e4b13e0-65') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:34:36 np0005588920 nova_compute[226886]: 2026-01-20 14:34:36.866 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:36 np0005588920 nova_compute[226886]: 2026-01-20 14:34:36.867 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:34:36 np0005588920 nova_compute[226886]: 2026-01-20 14:34:36.867 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:34:36 np0005588920 nova_compute[226886]: 2026-01-20 14:34:36.871 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:36 np0005588920 nova_compute[226886]: 2026-01-20 14:34:36.871 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3e4b13e0-65, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:34:36 np0005588920 nova_compute[226886]: 2026-01-20 14:34:36.871 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3e4b13e0-65, col_values=(('external_ids', {'iface-id': '3e4b13e0-655b-41b9-8274-940d2a5cdf49', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:50:d7:49', 'vm-uuid': '39013d10-cf09-4fc7-826c-99746ff0eb68'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:34:36 np0005588920 NetworkManager[49076]: <info>  [1768919676.8741] manager: (tap3e4b13e0-65): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/66)
Jan 20 09:34:36 np0005588920 nova_compute[226886]: 2026-01-20 14:34:36.877 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:34:36 np0005588920 nova_compute[226886]: 2026-01-20 14:34:36.880 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:36 np0005588920 nova_compute[226886]: 2026-01-20 14:34:36.881 226890 INFO os_vif [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:d7:49,bridge_name='br-int',has_traffic_filtering=True,id=3e4b13e0-655b-41b9-8274-940d2a5cdf49,network=Network(abb83e3e-0b12-431b-ad86-a1d271b5b46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e4b13e0-65')#033[00m
Jan 20 09:34:36 np0005588920 nova_compute[226886]: 2026-01-20 14:34:36.942 226890 DEBUG nova.virt.libvirt.driver [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:34:36 np0005588920 nova_compute[226886]: 2026-01-20 14:34:36.943 226890 DEBUG nova.virt.libvirt.driver [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:34:36 np0005588920 nova_compute[226886]: 2026-01-20 14:34:36.943 226890 DEBUG nova.virt.libvirt.driver [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] No VIF found with MAC fa:16:3e:50:d7:49, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:34:36 np0005588920 nova_compute[226886]: 2026-01-20 14:34:36.943 226890 INFO nova.virt.libvirt.driver [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Using config drive#033[00m
Jan 20 09:34:36 np0005588920 nova_compute[226886]: 2026-01-20 14:34:36.965 226890 DEBUG nova.storage.rbd_utils [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] rbd image 39013d10-cf09-4fc7-826c-99746ff0eb68_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:34:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:36.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:37 np0005588920 nova_compute[226886]: 2026-01-20 14:34:37.381 226890 INFO nova.virt.libvirt.driver [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Creating config drive at /var/lib/nova/instances/39013d10-cf09-4fc7-826c-99746ff0eb68/disk.config#033[00m
Jan 20 09:34:37 np0005588920 nova_compute[226886]: 2026-01-20 14:34:37.388 226890 DEBUG oslo_concurrency.processutils [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/39013d10-cf09-4fc7-826c-99746ff0eb68/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnrqrbjcr execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:34:37 np0005588920 nova_compute[226886]: 2026-01-20 14:34:37.518 226890 DEBUG oslo_concurrency.processutils [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/39013d10-cf09-4fc7-826c-99746ff0eb68/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnrqrbjcr" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:34:37 np0005588920 nova_compute[226886]: 2026-01-20 14:34:37.540 226890 DEBUG nova.storage.rbd_utils [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] rbd image 39013d10-cf09-4fc7-826c-99746ff0eb68_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:34:37 np0005588920 nova_compute[226886]: 2026-01-20 14:34:37.543 226890 DEBUG oslo_concurrency.processutils [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/39013d10-cf09-4fc7-826c-99746ff0eb68/disk.config 39013d10-cf09-4fc7-826c-99746ff0eb68_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:34:37 np0005588920 nova_compute[226886]: 2026-01-20 14:34:37.777 226890 DEBUG oslo_concurrency.processutils [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/39013d10-cf09-4fc7-826c-99746ff0eb68/disk.config 39013d10-cf09-4fc7-826c-99746ff0eb68_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.234s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:34:37 np0005588920 nova_compute[226886]: 2026-01-20 14:34:37.778 226890 INFO nova.virt.libvirt.driver [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Deleting local config drive /var/lib/nova/instances/39013d10-cf09-4fc7-826c-99746ff0eb68/disk.config because it was imported into RBD.#033[00m
Jan 20 09:34:37 np0005588920 kernel: tap3e4b13e0-65: entered promiscuous mode
Jan 20 09:34:37 np0005588920 NetworkManager[49076]: <info>  [1768919677.8399] manager: (tap3e4b13e0-65): new Tun device (/org/freedesktop/NetworkManager/Devices/67)
Jan 20 09:34:37 np0005588920 ovn_controller[133971]: 2026-01-20T14:34:37Z|00108|binding|INFO|Claiming lport 3e4b13e0-655b-41b9-8274-940d2a5cdf49 for this chassis.
Jan 20 09:34:37 np0005588920 ovn_controller[133971]: 2026-01-20T14:34:37Z|00109|binding|INFO|3e4b13e0-655b-41b9-8274-940d2a5cdf49: Claiming fa:16:3e:50:d7:49 10.100.0.12
Jan 20 09:34:37 np0005588920 nova_compute[226886]: 2026-01-20 14:34:37.841 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:37 np0005588920 nova_compute[226886]: 2026-01-20 14:34:37.844 226890 DEBUG nova.network.neutron [req-9da3b198-e953-42a2-bd14-48dd38157055 req-e02e6539-ab44-42cb-80e0-9c05b88ae051 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Updated VIF entry in instance network info cache for port 3e4b13e0-655b-41b9-8274-940d2a5cdf49. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:34:37 np0005588920 nova_compute[226886]: 2026-01-20 14:34:37.844 226890 DEBUG nova.network.neutron [req-9da3b198-e953-42a2-bd14-48dd38157055 req-e02e6539-ab44-42cb-80e0-9c05b88ae051 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Updating instance_info_cache with network_info: [{"id": "3e4b13e0-655b-41b9-8274-940d2a5cdf49", "address": "fa:16:3e:50:d7:49", "network": {"id": "abb83e3e-0b12-431b-ad86-a1d271b5b46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-766235638-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3750c56415134773aa9d9880038f1749", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e4b13e0-65", "ovs_interfaceid": "3e4b13e0-655b-41b9-8274-940d2a5cdf49", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:34:37 np0005588920 nova_compute[226886]: 2026-01-20 14:34:37.846 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:34:37.866 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:50:d7:49 10.100.0.12'], port_security=['fa:16:3e:50:d7:49 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '39013d10-cf09-4fc7-826c-99746ff0eb68', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-abb83e3e-0b12-431b-ad86-a1d271b5b46a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3750c56415134773aa9d9880038f1749', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2e302063-2ccd-4f7c-8835-ef521762a486', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4125934e-1dea-4e34-a38d-5291c850f0b2, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=3e4b13e0-655b-41b9-8274-940d2a5cdf49) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:34:37 np0005588920 systemd-udevd[242934]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:34:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:34:37.869 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 3e4b13e0-655b-41b9-8274-940d2a5cdf49 in datapath abb83e3e-0b12-431b-ad86-a1d271b5b46a bound to our chassis#033[00m
Jan 20 09:34:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:34:37.871 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network abb83e3e-0b12-431b-ad86-a1d271b5b46a#033[00m
Jan 20 09:34:37 np0005588920 systemd-machined[196121]: New machine qemu-19-instance-0000002c.
Jan 20 09:34:37 np0005588920 NetworkManager[49076]: <info>  [1768919677.8819] device (tap3e4b13e0-65): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:34:37 np0005588920 NetworkManager[49076]: <info>  [1768919677.8825] device (tap3e4b13e0-65): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:34:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:34:37.888 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[8395dae4-6526-4e2d-98b6-122029d83a51]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:34:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:34:37.889 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapabb83e3e-01 in ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:34:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:34:37.892 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapabb83e3e-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:34:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:34:37.892 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[851ce31b-87f4-43bf-bbe9-23c90059826f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:34:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:34:37.893 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2313d121-cd3a-475d-aea7-d6b5c64cf373]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:34:37 np0005588920 systemd[1]: Started Virtual Machine qemu-19-instance-0000002c.
Jan 20 09:34:37 np0005588920 nova_compute[226886]: 2026-01-20 14:34:37.897 226890 DEBUG oslo_concurrency.lockutils [req-9da3b198-e953-42a2-bd14-48dd38157055 req-e02e6539-ab44-42cb-80e0-9c05b88ae051 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-39013d10-cf09-4fc7-826c-99746ff0eb68" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:34:37 np0005588920 nova_compute[226886]: 2026-01-20 14:34:37.903 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:37 np0005588920 ovn_controller[133971]: 2026-01-20T14:34:37Z|00110|binding|INFO|Setting lport 3e4b13e0-655b-41b9-8274-940d2a5cdf49 ovn-installed in OVS
Jan 20 09:34:37 np0005588920 ovn_controller[133971]: 2026-01-20T14:34:37Z|00111|binding|INFO|Setting lport 3e4b13e0-655b-41b9-8274-940d2a5cdf49 up in Southbound
Jan 20 09:34:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:34:37.907 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[3c7720f3-0e29-49ea-8613-c1b3e3abc00a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:34:37 np0005588920 nova_compute[226886]: 2026-01-20 14:34:37.911 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:34:37.922 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f72ed9e9-f09c-4abc-80e2-18e99d47ab7d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:34:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:34:37.955 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[8c86612d-be25-4f93-afde-cc73a89f0c44]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:34:37 np0005588920 NetworkManager[49076]: <info>  [1768919677.9616] manager: (tapabb83e3e-00): new Veth device (/org/freedesktop/NetworkManager/Devices/68)
Jan 20 09:34:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:34:37.960 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[8f140de9-4c25-40d7-82c0-87d5b51a0884]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:34:37 np0005588920 systemd-udevd[242938]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:34:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:34:37.991 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[755df3a8-43df-498c-8055-0d9e39b97c0a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:34:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:34:37.993 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[4d70ac12-3358-4e1a-99a3-231f121dfdda]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:34:38 np0005588920 NetworkManager[49076]: <info>  [1768919678.0173] device (tapabb83e3e-00): carrier: link connected
Jan 20 09:34:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:34:38.023 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[c2e6b366-169a-4ada-8beb-e5c8c7c6c43d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:34:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:34:38.043 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[3ce89a76-bc42-4b3f-85ee-9b4b7d2980e3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapabb83e3e-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fd:0b:d2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 37], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 471970, 'reachable_time': 20241, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242968, 'error': None, 'target': 'ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:34:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:34:38.057 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[bfe74089-7362-43d0-85a9-c644ef0b3b2a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefd:bd2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 471970, 'tstamp': 471970}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 242969, 'error': None, 'target': 'ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:34:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:34:38.071 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[eddbf3dd-9366-451c-82d9-5359ee3f8ad9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapabb83e3e-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fd:0b:d2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 37], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 471970, 'reachable_time': 20241, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 242970, 'error': None, 'target': 'ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:34:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:34:38.111 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b7d91585-e635-4185-8374-8e1855703faa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:34:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:34:38.181 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[66b7e257-0c8a-4d64-9365-8ec1de68eabd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:34:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:34:38.182 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapabb83e3e-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:34:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:34:38.183 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:34:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:34:38.184 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapabb83e3e-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:34:38 np0005588920 kernel: tapabb83e3e-00: entered promiscuous mode
Jan 20 09:34:38 np0005588920 NetworkManager[49076]: <info>  [1768919678.1869] manager: (tapabb83e3e-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/69)
Jan 20 09:34:38 np0005588920 nova_compute[226886]: 2026-01-20 14:34:38.185 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:38 np0005588920 nova_compute[226886]: 2026-01-20 14:34:38.188 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:34:38.189 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapabb83e3e-00, col_values=(('external_ids', {'iface-id': 'dfacaf19-f896-4c13-a7ad-47b57cf03fc1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:34:38 np0005588920 ovn_controller[133971]: 2026-01-20T14:34:38Z|00112|binding|INFO|Releasing lport dfacaf19-f896-4c13-a7ad-47b57cf03fc1 from this chassis (sb_readonly=0)
Jan 20 09:34:38 np0005588920 nova_compute[226886]: 2026-01-20 14:34:38.190 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:38 np0005588920 nova_compute[226886]: 2026-01-20 14:34:38.203 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:34:38.205 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/abb83e3e-0b12-431b-ad86-a1d271b5b46a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/abb83e3e-0b12-431b-ad86-a1d271b5b46a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:34:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:34:38.205 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[9169f654-42b9-47a3-bd1e-b202a264ba9f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:34:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:34:38.206 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:34:38 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 09:34:38 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 09:34:38 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-abb83e3e-0b12-431b-ad86-a1d271b5b46a
Jan 20 09:34:38 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 09:34:38 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 09:34:38 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 09:34:38 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/abb83e3e-0b12-431b-ad86-a1d271b5b46a.pid.haproxy
Jan 20 09:34:38 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 09:34:38 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:34:38 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 09:34:38 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 09:34:38 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 09:34:38 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 09:34:38 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 09:34:38 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 09:34:38 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 09:34:38 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 09:34:38 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 09:34:38 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 09:34:38 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 09:34:38 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 09:34:38 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 09:34:38 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:34:38 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:34:38 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 09:34:38 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 09:34:38 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:34:38 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID abb83e3e-0b12-431b-ad86-a1d271b5b46a
Jan 20 09:34:38 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:34:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:34:38.208 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a', 'env', 'PROCESS_TAG=haproxy-abb83e3e-0b12-431b-ad86-a1d271b5b46a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/abb83e3e-0b12-431b-ad86-a1d271b5b46a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:34:38 np0005588920 podman[243035]: 2026-01-20 14:34:38.548882101 +0000 UTC m=+0.046326690 container create b9fc44f08845a2afd35a5984015c83f49ad9030446754b9bf9b04af01e6eb99c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:34:38 np0005588920 systemd[1]: Started libpod-conmon-b9fc44f08845a2afd35a5984015c83f49ad9030446754b9bf9b04af01e6eb99c.scope.
Jan 20 09:34:38 np0005588920 nova_compute[226886]: 2026-01-20 14:34:38.607 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919678.6065114, 39013d10-cf09-4fc7-826c-99746ff0eb68 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:34:38 np0005588920 nova_compute[226886]: 2026-01-20 14:34:38.608 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] VM Started (Lifecycle Event)#033[00m
Jan 20 09:34:38 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:34:38 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d4c1078d1353a0ef32c417ac6af2685511647dcabb3cadf43a782184d82523ba/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:34:38 np0005588920 podman[243035]: 2026-01-20 14:34:38.523719455 +0000 UTC m=+0.021164064 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:34:38 np0005588920 podman[243035]: 2026-01-20 14:34:38.628711099 +0000 UTC m=+0.126155708 container init b9fc44f08845a2afd35a5984015c83f49ad9030446754b9bf9b04af01e6eb99c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:34:38 np0005588920 nova_compute[226886]: 2026-01-20 14:34:38.636 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:34:38 np0005588920 podman[243035]: 2026-01-20 14:34:38.639991155 +0000 UTC m=+0.137435784 container start b9fc44f08845a2afd35a5984015c83f49ad9030446754b9bf9b04af01e6eb99c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 20 09:34:38 np0005588920 nova_compute[226886]: 2026-01-20 14:34:38.640 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919678.6068, 39013d10-cf09-4fc7-826c-99746ff0eb68 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:34:38 np0005588920 nova_compute[226886]: 2026-01-20 14:34:38.640 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:34:38 np0005588920 neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a[243059]: [NOTICE]   (243063) : New worker (243065) forked
Jan 20 09:34:38 np0005588920 neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a[243059]: [NOTICE]   (243063) : Loading success.
Jan 20 09:34:38 np0005588920 nova_compute[226886]: 2026-01-20 14:34:38.660 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:34:38 np0005588920 nova_compute[226886]: 2026-01-20 14:34:38.664 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:34:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:34:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:38.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:34:38 np0005588920 nova_compute[226886]: 2026-01-20 14:34:38.682 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:34:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:38.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:39 np0005588920 nova_compute[226886]: 2026-01-20 14:34:39.144 226890 DEBUG nova.compute.manager [req-3f450be0-70bc-470c-a967-d49cc4f2e221 req-6187d80d-0c18-44e4-8468-24dbd83d315c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Received event network-vif-plugged-3e4b13e0-655b-41b9-8274-940d2a5cdf49 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:34:39 np0005588920 nova_compute[226886]: 2026-01-20 14:34:39.145 226890 DEBUG oslo_concurrency.lockutils [req-3f450be0-70bc-470c-a967-d49cc4f2e221 req-6187d80d-0c18-44e4-8468-24dbd83d315c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "39013d10-cf09-4fc7-826c-99746ff0eb68-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:34:39 np0005588920 nova_compute[226886]: 2026-01-20 14:34:39.146 226890 DEBUG oslo_concurrency.lockutils [req-3f450be0-70bc-470c-a967-d49cc4f2e221 req-6187d80d-0c18-44e4-8468-24dbd83d315c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "39013d10-cf09-4fc7-826c-99746ff0eb68-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:34:39 np0005588920 nova_compute[226886]: 2026-01-20 14:34:39.146 226890 DEBUG oslo_concurrency.lockutils [req-3f450be0-70bc-470c-a967-d49cc4f2e221 req-6187d80d-0c18-44e4-8468-24dbd83d315c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "39013d10-cf09-4fc7-826c-99746ff0eb68-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:34:39 np0005588920 nova_compute[226886]: 2026-01-20 14:34:39.147 226890 DEBUG nova.compute.manager [req-3f450be0-70bc-470c-a967-d49cc4f2e221 req-6187d80d-0c18-44e4-8468-24dbd83d315c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Processing event network-vif-plugged-3e4b13e0-655b-41b9-8274-940d2a5cdf49 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:34:39 np0005588920 nova_compute[226886]: 2026-01-20 14:34:39.148 226890 DEBUG nova.compute.manager [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:34:39 np0005588920 nova_compute[226886]: 2026-01-20 14:34:39.151 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919679.1509058, 39013d10-cf09-4fc7-826c-99746ff0eb68 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:34:39 np0005588920 nova_compute[226886]: 2026-01-20 14:34:39.152 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:34:39 np0005588920 nova_compute[226886]: 2026-01-20 14:34:39.153 226890 DEBUG nova.virt.libvirt.driver [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:34:39 np0005588920 nova_compute[226886]: 2026-01-20 14:34:39.156 226890 INFO nova.virt.libvirt.driver [-] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Instance spawned successfully.#033[00m
Jan 20 09:34:39 np0005588920 nova_compute[226886]: 2026-01-20 14:34:39.156 226890 DEBUG nova.virt.libvirt.driver [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:34:39 np0005588920 nova_compute[226886]: 2026-01-20 14:34:39.184 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:34:39 np0005588920 nova_compute[226886]: 2026-01-20 14:34:39.190 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:34:39 np0005588920 nova_compute[226886]: 2026-01-20 14:34:39.193 226890 DEBUG nova.virt.libvirt.driver [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:34:39 np0005588920 nova_compute[226886]: 2026-01-20 14:34:39.194 226890 DEBUG nova.virt.libvirt.driver [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:34:39 np0005588920 nova_compute[226886]: 2026-01-20 14:34:39.194 226890 DEBUG nova.virt.libvirt.driver [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:34:39 np0005588920 nova_compute[226886]: 2026-01-20 14:34:39.195 226890 DEBUG nova.virt.libvirt.driver [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:34:39 np0005588920 nova_compute[226886]: 2026-01-20 14:34:39.196 226890 DEBUG nova.virt.libvirt.driver [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:34:39 np0005588920 nova_compute[226886]: 2026-01-20 14:34:39.196 226890 DEBUG nova.virt.libvirt.driver [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:34:39 np0005588920 nova_compute[226886]: 2026-01-20 14:34:39.227 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:34:39 np0005588920 nova_compute[226886]: 2026-01-20 14:34:39.272 226890 INFO nova.compute.manager [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Took 7.79 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:34:39 np0005588920 nova_compute[226886]: 2026-01-20 14:34:39.273 226890 DEBUG nova.compute.manager [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:34:39 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e169 e169: 3 total, 3 up, 3 in
Jan 20 09:34:39 np0005588920 nova_compute[226886]: 2026-01-20 14:34:39.355 226890 INFO nova.compute.manager [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Took 8.76 seconds to build instance.#033[00m
Jan 20 09:34:39 np0005588920 nova_compute[226886]: 2026-01-20 14:34:39.370 226890 DEBUG oslo_concurrency.lockutils [None req-4a8b9d0e-f057-4bde-aa24-087139ca2e6b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "39013d10-cf09-4fc7-826c-99746ff0eb68" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.858s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:34:39 np0005588920 nova_compute[226886]: 2026-01-20 14:34:39.528 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:39 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e169 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:34:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:40.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:40.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:41 np0005588920 nova_compute[226886]: 2026-01-20 14:34:41.396 226890 DEBUG nova.compute.manager [req-131d16b0-d1a1-49db-9b6c-457183d90544 req-c9726189-4a54-45f1-a365-cb3621279aed 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Received event network-vif-plugged-3e4b13e0-655b-41b9-8274-940d2a5cdf49 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:34:41 np0005588920 nova_compute[226886]: 2026-01-20 14:34:41.396 226890 DEBUG oslo_concurrency.lockutils [req-131d16b0-d1a1-49db-9b6c-457183d90544 req-c9726189-4a54-45f1-a365-cb3621279aed 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "39013d10-cf09-4fc7-826c-99746ff0eb68-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:34:41 np0005588920 nova_compute[226886]: 2026-01-20 14:34:41.397 226890 DEBUG oslo_concurrency.lockutils [req-131d16b0-d1a1-49db-9b6c-457183d90544 req-c9726189-4a54-45f1-a365-cb3621279aed 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "39013d10-cf09-4fc7-826c-99746ff0eb68-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:34:41 np0005588920 nova_compute[226886]: 2026-01-20 14:34:41.397 226890 DEBUG oslo_concurrency.lockutils [req-131d16b0-d1a1-49db-9b6c-457183d90544 req-c9726189-4a54-45f1-a365-cb3621279aed 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "39013d10-cf09-4fc7-826c-99746ff0eb68-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:34:41 np0005588920 nova_compute[226886]: 2026-01-20 14:34:41.397 226890 DEBUG nova.compute.manager [req-131d16b0-d1a1-49db-9b6c-457183d90544 req-c9726189-4a54-45f1-a365-cb3621279aed 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] No waiting events found dispatching network-vif-plugged-3e4b13e0-655b-41b9-8274-940d2a5cdf49 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:34:41 np0005588920 nova_compute[226886]: 2026-01-20 14:34:41.397 226890 WARNING nova.compute.manager [req-131d16b0-d1a1-49db-9b6c-457183d90544 req-c9726189-4a54-45f1-a365-cb3621279aed 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Received unexpected event network-vif-plugged-3e4b13e0-655b-41b9-8274-940d2a5cdf49 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:34:41 np0005588920 nova_compute[226886]: 2026-01-20 14:34:41.747 226890 INFO nova.compute.manager [None req-1840619c-dcc0-46b5-b4c1-d5cf185fb13e 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Pausing#033[00m
Jan 20 09:34:41 np0005588920 nova_compute[226886]: 2026-01-20 14:34:41.748 226890 DEBUG nova.objects.instance [None req-1840619c-dcc0-46b5-b4c1-d5cf185fb13e 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lazy-loading 'flavor' on Instance uuid 39013d10-cf09-4fc7-826c-99746ff0eb68 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:34:41 np0005588920 nova_compute[226886]: 2026-01-20 14:34:41.777 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919681.7771256, 39013d10-cf09-4fc7-826c-99746ff0eb68 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:34:41 np0005588920 nova_compute[226886]: 2026-01-20 14:34:41.777 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:34:41 np0005588920 nova_compute[226886]: 2026-01-20 14:34:41.778 226890 DEBUG nova.compute.manager [None req-1840619c-dcc0-46b5-b4c1-d5cf185fb13e 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:34:41 np0005588920 nova_compute[226886]: 2026-01-20 14:34:41.810 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:34:41 np0005588920 nova_compute[226886]: 2026-01-20 14:34:41.812 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:34:41 np0005588920 nova_compute[226886]: 2026-01-20 14:34:41.842 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] During sync_power_state the instance has a pending task (pausing). Skip.#033[00m
Jan 20 09:34:41 np0005588920 nova_compute[226886]: 2026-01-20 14:34:41.875 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:42.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:42.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:44 np0005588920 podman[243075]: 2026-01-20 14:34:44.075316643 +0000 UTC m=+0.155857980 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 20 09:34:44 np0005588920 nova_compute[226886]: 2026-01-20 14:34:44.530 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:34:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:44.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:34:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e169 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:34:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:44.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:45 np0005588920 nova_compute[226886]: 2026-01-20 14:34:45.007 226890 DEBUG nova.compute.manager [None req-c198b8b1-a0fb-480c-91cb-547fafc91777 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:34:45 np0005588920 nova_compute[226886]: 2026-01-20 14:34:45.094 226890 INFO nova.compute.manager [None req-c198b8b1-a0fb-480c-91cb-547fafc91777 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] instance snapshotting#033[00m
Jan 20 09:34:45 np0005588920 nova_compute[226886]: 2026-01-20 14:34:45.095 226890 WARNING nova.compute.manager [None req-c198b8b1-a0fb-480c-91cb-547fafc91777 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] trying to snapshot a non-running instance: (state: 3 expected: 1)#033[00m
Jan 20 09:34:45 np0005588920 nova_compute[226886]: 2026-01-20 14:34:45.391 226890 INFO nova.virt.libvirt.driver [None req-c198b8b1-a0fb-480c-91cb-547fafc91777 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Beginning live snapshot process#033[00m
Jan 20 09:34:45 np0005588920 nova_compute[226886]: 2026-01-20 14:34:45.557 226890 DEBUG nova.virt.libvirt.imagebackend [None req-c198b8b1-a0fb-480c-91cb-547fafc91777 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] No parent info for a32b3e07-16d8-46fd-9a7b-c242c432fcf9; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Jan 20 09:34:46 np0005588920 nova_compute[226886]: 2026-01-20 14:34:46.009 226890 DEBUG nova.storage.rbd_utils [None req-c198b8b1-a0fb-480c-91cb-547fafc91777 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] creating snapshot(cd6eed50f02e4caab7e9e4ad69198209) on rbd image(39013d10-cf09-4fc7-826c-99746ff0eb68_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 20 09:34:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:34:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:46.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:34:46 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e170 e170: 3 total, 3 up, 3 in
Jan 20 09:34:46 np0005588920 nova_compute[226886]: 2026-01-20 14:34:46.880 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:46 np0005588920 nova_compute[226886]: 2026-01-20 14:34:46.978 226890 DEBUG nova.storage.rbd_utils [None req-c198b8b1-a0fb-480c-91cb-547fafc91777 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] cloning vms/39013d10-cf09-4fc7-826c-99746ff0eb68_disk@cd6eed50f02e4caab7e9e4ad69198209 to images/74b6b72a-3e89-4e91-95cd-b3413cc32773 clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 20 09:34:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:46.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:47 np0005588920 nova_compute[226886]: 2026-01-20 14:34:47.144 226890 DEBUG nova.storage.rbd_utils [None req-c198b8b1-a0fb-480c-91cb-547fafc91777 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] flattening images/74b6b72a-3e89-4e91-95cd-b3413cc32773 flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 20 09:34:47 np0005588920 nova_compute[226886]: 2026-01-20 14:34:47.496 226890 DEBUG nova.storage.rbd_utils [None req-c198b8b1-a0fb-480c-91cb-547fafc91777 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] removing snapshot(cd6eed50f02e4caab7e9e4ad69198209) on rbd image(39013d10-cf09-4fc7-826c-99746ff0eb68_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 20 09:34:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e171 e171: 3 total, 3 up, 3 in
Jan 20 09:34:47 np0005588920 nova_compute[226886]: 2026-01-20 14:34:47.908 226890 DEBUG nova.storage.rbd_utils [None req-c198b8b1-a0fb-480c-91cb-547fafc91777 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] creating snapshot(snap) on rbd image(74b6b72a-3e89-4e91-95cd-b3413cc32773) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 20 09:34:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:34:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:48.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:34:48 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e172 e172: 3 total, 3 up, 3 in
Jan 20 09:34:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:48.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:49 np0005588920 nova_compute[226886]: 2026-01-20 14:34:49.533 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:49 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e172 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:34:50 np0005588920 nova_compute[226886]: 2026-01-20 14:34:50.546 226890 INFO nova.virt.libvirt.driver [None req-c198b8b1-a0fb-480c-91cb-547fafc91777 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Snapshot image upload complete#033[00m
Jan 20 09:34:50 np0005588920 nova_compute[226886]: 2026-01-20 14:34:50.546 226890 INFO nova.compute.manager [None req-c198b8b1-a0fb-480c-91cb-547fafc91777 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Took 5.45 seconds to snapshot the instance on the hypervisor.#033[00m
Jan 20 09:34:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:34:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:50.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:34:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:50.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:51 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e173 e173: 3 total, 3 up, 3 in
Jan 20 09:34:51 np0005588920 nova_compute[226886]: 2026-01-20 14:34:51.944 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e174 e174: 3 total, 3 up, 3 in
Jan 20 09:34:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e175 e175: 3 total, 3 up, 3 in
Jan 20 09:34:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:52.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:53.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:53 np0005588920 nova_compute[226886]: 2026-01-20 14:34:53.106 226890 DEBUG oslo_concurrency.lockutils [None req-7af3582d-6391-466a-ada0-bcf68472fc3f 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Acquiring lock "39013d10-cf09-4fc7-826c-99746ff0eb68" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:34:53 np0005588920 nova_compute[226886]: 2026-01-20 14:34:53.106 226890 DEBUG oslo_concurrency.lockutils [None req-7af3582d-6391-466a-ada0-bcf68472fc3f 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "39013d10-cf09-4fc7-826c-99746ff0eb68" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:34:53 np0005588920 nova_compute[226886]: 2026-01-20 14:34:53.107 226890 DEBUG oslo_concurrency.lockutils [None req-7af3582d-6391-466a-ada0-bcf68472fc3f 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Acquiring lock "39013d10-cf09-4fc7-826c-99746ff0eb68-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:34:53 np0005588920 nova_compute[226886]: 2026-01-20 14:34:53.107 226890 DEBUG oslo_concurrency.lockutils [None req-7af3582d-6391-466a-ada0-bcf68472fc3f 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "39013d10-cf09-4fc7-826c-99746ff0eb68-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:34:53 np0005588920 nova_compute[226886]: 2026-01-20 14:34:53.107 226890 DEBUG oslo_concurrency.lockutils [None req-7af3582d-6391-466a-ada0-bcf68472fc3f 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "39013d10-cf09-4fc7-826c-99746ff0eb68-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:34:53 np0005588920 nova_compute[226886]: 2026-01-20 14:34:53.109 226890 INFO nova.compute.manager [None req-7af3582d-6391-466a-ada0-bcf68472fc3f 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Terminating instance#033[00m
Jan 20 09:34:53 np0005588920 nova_compute[226886]: 2026-01-20 14:34:53.111 226890 DEBUG nova.compute.manager [None req-7af3582d-6391-466a-ada0-bcf68472fc3f 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:34:53 np0005588920 kernel: tap3e4b13e0-65 (unregistering): left promiscuous mode
Jan 20 09:34:53 np0005588920 NetworkManager[49076]: <info>  [1768919693.1570] device (tap3e4b13e0-65): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:34:53 np0005588920 ovn_controller[133971]: 2026-01-20T14:34:53Z|00113|binding|INFO|Releasing lport 3e4b13e0-655b-41b9-8274-940d2a5cdf49 from this chassis (sb_readonly=0)
Jan 20 09:34:53 np0005588920 ovn_controller[133971]: 2026-01-20T14:34:53Z|00114|binding|INFO|Setting lport 3e4b13e0-655b-41b9-8274-940d2a5cdf49 down in Southbound
Jan 20 09:34:53 np0005588920 nova_compute[226886]: 2026-01-20 14:34:53.168 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:53 np0005588920 ovn_controller[133971]: 2026-01-20T14:34:53Z|00115|binding|INFO|Removing iface tap3e4b13e0-65 ovn-installed in OVS
Jan 20 09:34:53 np0005588920 nova_compute[226886]: 2026-01-20 14:34:53.171 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:53 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:34:53.179 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:50:d7:49 10.100.0.12'], port_security=['fa:16:3e:50:d7:49 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '39013d10-cf09-4fc7-826c-99746ff0eb68', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-abb83e3e-0b12-431b-ad86-a1d271b5b46a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3750c56415134773aa9d9880038f1749', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2e302063-2ccd-4f7c-8835-ef521762a486', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4125934e-1dea-4e34-a38d-5291c850f0b2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=3e4b13e0-655b-41b9-8274-940d2a5cdf49) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:34:53 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:34:53.181 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 3e4b13e0-655b-41b9-8274-940d2a5cdf49 in datapath abb83e3e-0b12-431b-ad86-a1d271b5b46a unbound from our chassis#033[00m
Jan 20 09:34:53 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:34:53.182 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network abb83e3e-0b12-431b-ad86-a1d271b5b46a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:34:53 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:34:53.183 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[251bb2b0-92ea-425e-b8fb-dcad3c23ff7d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:34:53 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:34:53.184 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a namespace which is not needed anymore#033[00m
Jan 20 09:34:53 np0005588920 nova_compute[226886]: 2026-01-20 14:34:53.185 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:53 np0005588920 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d0000002c.scope: Deactivated successfully.
Jan 20 09:34:53 np0005588920 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d0000002c.scope: Consumed 3.416s CPU time.
Jan 20 09:34:53 np0005588920 systemd-machined[196121]: Machine qemu-19-instance-0000002c terminated.
Jan 20 09:34:53 np0005588920 neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a[243059]: [NOTICE]   (243063) : haproxy version is 2.8.14-c23fe91
Jan 20 09:34:53 np0005588920 neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a[243059]: [NOTICE]   (243063) : path to executable is /usr/sbin/haproxy
Jan 20 09:34:53 np0005588920 neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a[243059]: [WARNING]  (243063) : Exiting Master process...
Jan 20 09:34:53 np0005588920 neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a[243059]: [ALERT]    (243063) : Current worker (243065) exited with code 143 (Terminated)
Jan 20 09:34:53 np0005588920 neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a[243059]: [WARNING]  (243063) : All workers exited. Exiting... (0)
Jan 20 09:34:53 np0005588920 systemd[1]: libpod-b9fc44f08845a2afd35a5984015c83f49ad9030446754b9bf9b04af01e6eb99c.scope: Deactivated successfully.
Jan 20 09:34:53 np0005588920 podman[243399]: 2026-01-20 14:34:53.319365946 +0000 UTC m=+0.046976628 container died b9fc44f08845a2afd35a5984015c83f49ad9030446754b9bf9b04af01e6eb99c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:34:53 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:34:53 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:34:53 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:34:53 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b9fc44f08845a2afd35a5984015c83f49ad9030446754b9bf9b04af01e6eb99c-userdata-shm.mount: Deactivated successfully.
Jan 20 09:34:53 np0005588920 nova_compute[226886]: 2026-01-20 14:34:53.347 226890 INFO nova.virt.libvirt.driver [-] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Instance destroyed successfully.#033[00m
Jan 20 09:34:53 np0005588920 nova_compute[226886]: 2026-01-20 14:34:53.347 226890 DEBUG nova.objects.instance [None req-7af3582d-6391-466a-ada0-bcf68472fc3f 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lazy-loading 'resources' on Instance uuid 39013d10-cf09-4fc7-826c-99746ff0eb68 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:34:53 np0005588920 systemd[1]: var-lib-containers-storage-overlay-d4c1078d1353a0ef32c417ac6af2685511647dcabb3cadf43a782184d82523ba-merged.mount: Deactivated successfully.
Jan 20 09:34:53 np0005588920 podman[243399]: 2026-01-20 14:34:53.36588871 +0000 UTC m=+0.093499362 container cleanup b9fc44f08845a2afd35a5984015c83f49ad9030446754b9bf9b04af01e6eb99c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:34:53 np0005588920 nova_compute[226886]: 2026-01-20 14:34:53.368 226890 DEBUG nova.virt.libvirt.vif [None req-7af3582d-6391-466a-ada0-bcf68472fc3f 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:34:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-2138888348',display_name='tempest-ImagesTestJSON-server-2138888348',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-2138888348',id=44,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:34:39Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=3,progress=0,project_id='3750c56415134773aa9d9880038f1749',ramdisk_id='',reservation_id='r-i9x5sh3o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesTestJSON-338390217',owner_user_name='tempest-ImagesTestJSON-338390217-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:34:50Z,user_data=None,user_id='56e2959629114d3d8a48e7a80ed96c4b',uuid=39013d10-cf09-4fc7-826c-99746ff0eb68,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='paused') vif={"id": "3e4b13e0-655b-41b9-8274-940d2a5cdf49", "address": "fa:16:3e:50:d7:49", "network": {"id": "abb83e3e-0b12-431b-ad86-a1d271b5b46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-766235638-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3750c56415134773aa9d9880038f1749", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e4b13e0-65", "ovs_interfaceid": "3e4b13e0-655b-41b9-8274-940d2a5cdf49", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:34:53 np0005588920 nova_compute[226886]: 2026-01-20 14:34:53.369 226890 DEBUG nova.network.os_vif_util [None req-7af3582d-6391-466a-ada0-bcf68472fc3f 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Converting VIF {"id": "3e4b13e0-655b-41b9-8274-940d2a5cdf49", "address": "fa:16:3e:50:d7:49", "network": {"id": "abb83e3e-0b12-431b-ad86-a1d271b5b46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-766235638-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3750c56415134773aa9d9880038f1749", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e4b13e0-65", "ovs_interfaceid": "3e4b13e0-655b-41b9-8274-940d2a5cdf49", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:34:53 np0005588920 nova_compute[226886]: 2026-01-20 14:34:53.370 226890 DEBUG nova.network.os_vif_util [None req-7af3582d-6391-466a-ada0-bcf68472fc3f 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:50:d7:49,bridge_name='br-int',has_traffic_filtering=True,id=3e4b13e0-655b-41b9-8274-940d2a5cdf49,network=Network(abb83e3e-0b12-431b-ad86-a1d271b5b46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e4b13e0-65') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:34:53 np0005588920 nova_compute[226886]: 2026-01-20 14:34:53.370 226890 DEBUG os_vif [None req-7af3582d-6391-466a-ada0-bcf68472fc3f 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:d7:49,bridge_name='br-int',has_traffic_filtering=True,id=3e4b13e0-655b-41b9-8274-940d2a5cdf49,network=Network(abb83e3e-0b12-431b-ad86-a1d271b5b46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e4b13e0-65') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:34:53 np0005588920 systemd[1]: libpod-conmon-b9fc44f08845a2afd35a5984015c83f49ad9030446754b9bf9b04af01e6eb99c.scope: Deactivated successfully.
Jan 20 09:34:53 np0005588920 nova_compute[226886]: 2026-01-20 14:34:53.373 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:53 np0005588920 nova_compute[226886]: 2026-01-20 14:34:53.373 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3e4b13e0-65, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:34:53 np0005588920 nova_compute[226886]: 2026-01-20 14:34:53.376 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:53 np0005588920 nova_compute[226886]: 2026-01-20 14:34:53.378 226890 INFO os_vif [None req-7af3582d-6391-466a-ada0-bcf68472fc3f 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:d7:49,bridge_name='br-int',has_traffic_filtering=True,id=3e4b13e0-655b-41b9-8274-940d2a5cdf49,network=Network(abb83e3e-0b12-431b-ad86-a1d271b5b46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e4b13e0-65')#033[00m
Jan 20 09:34:53 np0005588920 nova_compute[226886]: 2026-01-20 14:34:53.394 226890 DEBUG nova.compute.manager [req-8d3b8e22-c2ee-4cad-bb28-54e64a00b744 req-0b7bd40e-703f-4309-b37a-b69adfdbcb7f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Received event network-vif-unplugged-3e4b13e0-655b-41b9-8274-940d2a5cdf49 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:34:53 np0005588920 nova_compute[226886]: 2026-01-20 14:34:53.394 226890 DEBUG oslo_concurrency.lockutils [req-8d3b8e22-c2ee-4cad-bb28-54e64a00b744 req-0b7bd40e-703f-4309-b37a-b69adfdbcb7f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "39013d10-cf09-4fc7-826c-99746ff0eb68-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:34:53 np0005588920 nova_compute[226886]: 2026-01-20 14:34:53.394 226890 DEBUG oslo_concurrency.lockutils [req-8d3b8e22-c2ee-4cad-bb28-54e64a00b744 req-0b7bd40e-703f-4309-b37a-b69adfdbcb7f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "39013d10-cf09-4fc7-826c-99746ff0eb68-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:34:53 np0005588920 nova_compute[226886]: 2026-01-20 14:34:53.394 226890 DEBUG oslo_concurrency.lockutils [req-8d3b8e22-c2ee-4cad-bb28-54e64a00b744 req-0b7bd40e-703f-4309-b37a-b69adfdbcb7f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "39013d10-cf09-4fc7-826c-99746ff0eb68-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:34:53 np0005588920 nova_compute[226886]: 2026-01-20 14:34:53.395 226890 DEBUG nova.compute.manager [req-8d3b8e22-c2ee-4cad-bb28-54e64a00b744 req-0b7bd40e-703f-4309-b37a-b69adfdbcb7f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] No waiting events found dispatching network-vif-unplugged-3e4b13e0-655b-41b9-8274-940d2a5cdf49 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:34:53 np0005588920 nova_compute[226886]: 2026-01-20 14:34:53.395 226890 DEBUG nova.compute.manager [req-8d3b8e22-c2ee-4cad-bb28-54e64a00b744 req-0b7bd40e-703f-4309-b37a-b69adfdbcb7f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Received event network-vif-unplugged-3e4b13e0-655b-41b9-8274-940d2a5cdf49 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:34:53 np0005588920 podman[243437]: 2026-01-20 14:34:53.429146174 +0000 UTC m=+0.041324300 container remove b9fc44f08845a2afd35a5984015c83f49ad9030446754b9bf9b04af01e6eb99c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 20 09:34:53 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:34:53.434 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[450c6dd6-dbc5-4b91-9797-ea4b2ca691a0]: (4, ('Tue Jan 20 02:34:53 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a (b9fc44f08845a2afd35a5984015c83f49ad9030446754b9bf9b04af01e6eb99c)\nb9fc44f08845a2afd35a5984015c83f49ad9030446754b9bf9b04af01e6eb99c\nTue Jan 20 02:34:53 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a (b9fc44f08845a2afd35a5984015c83f49ad9030446754b9bf9b04af01e6eb99c)\nb9fc44f08845a2afd35a5984015c83f49ad9030446754b9bf9b04af01e6eb99c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:34:53 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:34:53.436 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d0d3d5f4-c3c6-45f4-935e-8ea771353b0e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:34:53 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:34:53.438 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapabb83e3e-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:34:53 np0005588920 nova_compute[226886]: 2026-01-20 14:34:53.439 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:53 np0005588920 kernel: tapabb83e3e-00: left promiscuous mode
Jan 20 09:34:53 np0005588920 nova_compute[226886]: 2026-01-20 14:34:53.453 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:53 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:34:53.457 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c0580c98-ff88-4aa6-90e3-53d9646138df]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:34:53 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:34:53.472 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b47152ec-c3a6-49f9-8b93-d852bda57d15]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:34:53 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:34:53.473 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[bc9fa174-4137-46fd-96e2-d5a3c9acf14f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:34:53 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:34:53.492 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[9f3a0d98-233e-41ef-891d-311584e3097d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 471963, 'reachable_time': 24565, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243470, 'error': None, 'target': 'ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:34:53 np0005588920 systemd[1]: run-netns-ovnmeta\x2dabb83e3e\x2d0b12\x2d431b\x2dad86\x2da1d271b5b46a.mount: Deactivated successfully.
Jan 20 09:34:53 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:34:53.496 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:34:53 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:34:53.496 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[4aef6c5f-3246-4d2e-aecd-415bd3663b40]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:34:53 np0005588920 nova_compute[226886]: 2026-01-20 14:34:53.721 226890 INFO nova.virt.libvirt.driver [None req-7af3582d-6391-466a-ada0-bcf68472fc3f 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Deleting instance files /var/lib/nova/instances/39013d10-cf09-4fc7-826c-99746ff0eb68_del#033[00m
Jan 20 09:34:53 np0005588920 nova_compute[226886]: 2026-01-20 14:34:53.722 226890 INFO nova.virt.libvirt.driver [None req-7af3582d-6391-466a-ada0-bcf68472fc3f 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Deletion of /var/lib/nova/instances/39013d10-cf09-4fc7-826c-99746ff0eb68_del complete#033[00m
Jan 20 09:34:53 np0005588920 nova_compute[226886]: 2026-01-20 14:34:53.862 226890 INFO nova.compute.manager [None req-7af3582d-6391-466a-ada0-bcf68472fc3f 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Took 0.75 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:34:53 np0005588920 nova_compute[226886]: 2026-01-20 14:34:53.863 226890 DEBUG oslo.service.loopingcall [None req-7af3582d-6391-466a-ada0-bcf68472fc3f 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:34:53 np0005588920 nova_compute[226886]: 2026-01-20 14:34:53.863 226890 DEBUG nova.compute.manager [-] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:34:53 np0005588920 nova_compute[226886]: 2026-01-20 14:34:53.864 226890 DEBUG nova.network.neutron [-] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:34:54 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e176 e176: 3 total, 3 up, 3 in
Jan 20 09:34:54 np0005588920 nova_compute[226886]: 2026-01-20 14:34:54.535 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:34:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:54.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:34:54 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e176 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:34:54 np0005588920 nova_compute[226886]: 2026-01-20 14:34:54.830 226890 DEBUG nova.network.neutron [-] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:34:54 np0005588920 nova_compute[226886]: 2026-01-20 14:34:54.849 226890 INFO nova.compute.manager [-] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Took 0.99 seconds to deallocate network for instance.#033[00m
Jan 20 09:34:54 np0005588920 nova_compute[226886]: 2026-01-20 14:34:54.908 226890 DEBUG oslo_concurrency.lockutils [None req-7af3582d-6391-466a-ada0-bcf68472fc3f 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:34:54 np0005588920 nova_compute[226886]: 2026-01-20 14:34:54.909 226890 DEBUG oslo_concurrency.lockutils [None req-7af3582d-6391-466a-ada0-bcf68472fc3f 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:34:54 np0005588920 nova_compute[226886]: 2026-01-20 14:34:54.920 226890 DEBUG nova.compute.manager [req-cf183d72-cbca-483c-a260-a48354340a02 req-d90ed061-825c-4338-bc2e-e0f731047816 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Received event network-vif-deleted-3e4b13e0-655b-41b9-8274-940d2a5cdf49 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:34:54 np0005588920 nova_compute[226886]: 2026-01-20 14:34:54.985 226890 DEBUG oslo_concurrency.processutils [None req-7af3582d-6391-466a-ada0-bcf68472fc3f 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:34:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:55.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:55 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:34:55 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2671332732' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:34:55 np0005588920 nova_compute[226886]: 2026-01-20 14:34:55.433 226890 DEBUG oslo_concurrency.processutils [None req-7af3582d-6391-466a-ada0-bcf68472fc3f 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:34:55 np0005588920 nova_compute[226886]: 2026-01-20 14:34:55.440 226890 DEBUG nova.compute.provider_tree [None req-7af3582d-6391-466a-ada0-bcf68472fc3f 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:34:55 np0005588920 nova_compute[226886]: 2026-01-20 14:34:55.457 226890 DEBUG nova.compute.manager [req-fe8d21d0-7f01-4aed-9a06-b82297eeb37a req-f9642c50-7186-4920-9853-d97ee7a77070 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Received event network-vif-plugged-3e4b13e0-655b-41b9-8274-940d2a5cdf49 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:34:55 np0005588920 nova_compute[226886]: 2026-01-20 14:34:55.457 226890 DEBUG oslo_concurrency.lockutils [req-fe8d21d0-7f01-4aed-9a06-b82297eeb37a req-f9642c50-7186-4920-9853-d97ee7a77070 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "39013d10-cf09-4fc7-826c-99746ff0eb68-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:34:55 np0005588920 nova_compute[226886]: 2026-01-20 14:34:55.457 226890 DEBUG oslo_concurrency.lockutils [req-fe8d21d0-7f01-4aed-9a06-b82297eeb37a req-f9642c50-7186-4920-9853-d97ee7a77070 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "39013d10-cf09-4fc7-826c-99746ff0eb68-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:34:55 np0005588920 nova_compute[226886]: 2026-01-20 14:34:55.458 226890 DEBUG oslo_concurrency.lockutils [req-fe8d21d0-7f01-4aed-9a06-b82297eeb37a req-f9642c50-7186-4920-9853-d97ee7a77070 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "39013d10-cf09-4fc7-826c-99746ff0eb68-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:34:55 np0005588920 nova_compute[226886]: 2026-01-20 14:34:55.458 226890 DEBUG nova.compute.manager [req-fe8d21d0-7f01-4aed-9a06-b82297eeb37a req-f9642c50-7186-4920-9853-d97ee7a77070 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] No waiting events found dispatching network-vif-plugged-3e4b13e0-655b-41b9-8274-940d2a5cdf49 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:34:55 np0005588920 nova_compute[226886]: 2026-01-20 14:34:55.458 226890 WARNING nova.compute.manager [req-fe8d21d0-7f01-4aed-9a06-b82297eeb37a req-f9642c50-7186-4920-9853-d97ee7a77070 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Received unexpected event network-vif-plugged-3e4b13e0-655b-41b9-8274-940d2a5cdf49 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 09:34:55 np0005588920 nova_compute[226886]: 2026-01-20 14:34:55.460 226890 DEBUG nova.scheduler.client.report [None req-7af3582d-6391-466a-ada0-bcf68472fc3f 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:34:55 np0005588920 nova_compute[226886]: 2026-01-20 14:34:55.479 226890 DEBUG oslo_concurrency.lockutils [None req-7af3582d-6391-466a-ada0-bcf68472fc3f 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.571s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:34:55 np0005588920 nova_compute[226886]: 2026-01-20 14:34:55.530 226890 INFO nova.scheduler.client.report [None req-7af3582d-6391-466a-ada0-bcf68472fc3f 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Deleted allocations for instance 39013d10-cf09-4fc7-826c-99746ff0eb68#033[00m
Jan 20 09:34:55 np0005588920 nova_compute[226886]: 2026-01-20 14:34:55.610 226890 DEBUG oslo_concurrency.lockutils [None req-7af3582d-6391-466a-ada0-bcf68472fc3f 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "39013d10-cf09-4fc7-826c-99746ff0eb68" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.504s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:34:55 np0005588920 nova_compute[226886]: 2026-01-20 14:34:55.963 226890 DEBUG oslo_concurrency.lockutils [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Acquiring lock "e22c5447-900e-45da-b2af-46423bc1e2d8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:34:55 np0005588920 nova_compute[226886]: 2026-01-20 14:34:55.964 226890 DEBUG oslo_concurrency.lockutils [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "e22c5447-900e-45da-b2af-46423bc1e2d8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:34:55 np0005588920 nova_compute[226886]: 2026-01-20 14:34:55.989 226890 DEBUG nova.compute.manager [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:34:56 np0005588920 nova_compute[226886]: 2026-01-20 14:34:56.065 226890 DEBUG oslo_concurrency.lockutils [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:34:56 np0005588920 nova_compute[226886]: 2026-01-20 14:34:56.065 226890 DEBUG oslo_concurrency.lockutils [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:34:56 np0005588920 nova_compute[226886]: 2026-01-20 14:34:56.072 226890 DEBUG nova.virt.hardware [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:34:56 np0005588920 nova_compute[226886]: 2026-01-20 14:34:56.072 226890 INFO nova.compute.claims [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 09:34:56 np0005588920 nova_compute[226886]: 2026-01-20 14:34:56.208 226890 DEBUG oslo_concurrency.processutils [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:34:56 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:34:56 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3288395047' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:34:56 np0005588920 nova_compute[226886]: 2026-01-20 14:34:56.632 226890 DEBUG oslo_concurrency.processutils [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.425s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:34:56 np0005588920 nova_compute[226886]: 2026-01-20 14:34:56.639 226890 DEBUG nova.compute.provider_tree [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:34:56 np0005588920 nova_compute[226886]: 2026-01-20 14:34:56.656 226890 DEBUG nova.scheduler.client.report [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:34:56 np0005588920 nova_compute[226886]: 2026-01-20 14:34:56.676 226890 DEBUG oslo_concurrency.lockutils [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.611s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:34:56 np0005588920 nova_compute[226886]: 2026-01-20 14:34:56.677 226890 DEBUG nova.compute.manager [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:34:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:34:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:56.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:34:56 np0005588920 nova_compute[226886]: 2026-01-20 14:34:56.719 226890 DEBUG nova.compute.manager [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:34:56 np0005588920 nova_compute[226886]: 2026-01-20 14:34:56.719 226890 DEBUG nova.network.neutron [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:34:56 np0005588920 nova_compute[226886]: 2026-01-20 14:34:56.745 226890 INFO nova.virt.libvirt.driver [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:34:56 np0005588920 nova_compute[226886]: 2026-01-20 14:34:56.763 226890 DEBUG nova.compute.manager [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:34:56 np0005588920 nova_compute[226886]: 2026-01-20 14:34:56.860 226890 DEBUG nova.compute.manager [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:34:56 np0005588920 nova_compute[226886]: 2026-01-20 14:34:56.861 226890 DEBUG nova.virt.libvirt.driver [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:34:56 np0005588920 nova_compute[226886]: 2026-01-20 14:34:56.862 226890 INFO nova.virt.libvirt.driver [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Creating image(s)#033[00m
Jan 20 09:34:56 np0005588920 nova_compute[226886]: 2026-01-20 14:34:56.889 226890 DEBUG nova.storage.rbd_utils [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] rbd image e22c5447-900e-45da-b2af-46423bc1e2d8_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:34:56 np0005588920 nova_compute[226886]: 2026-01-20 14:34:56.918 226890 DEBUG nova.storage.rbd_utils [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] rbd image e22c5447-900e-45da-b2af-46423bc1e2d8_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:34:56 np0005588920 nova_compute[226886]: 2026-01-20 14:34:56.947 226890 DEBUG nova.storage.rbd_utils [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] rbd image e22c5447-900e-45da-b2af-46423bc1e2d8_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:34:56 np0005588920 nova_compute[226886]: 2026-01-20 14:34:56.952 226890 DEBUG oslo_concurrency.processutils [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:34:56 np0005588920 podman[243535]: 2026-01-20 14:34:56.958189555 +0000 UTC m=+0.049688164 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 20 09:34:56 np0005588920 nova_compute[226886]: 2026-01-20 14:34:56.978 226890 DEBUG nova.policy [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '56e2959629114d3d8a48e7a80ed96c4b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3750c56415134773aa9d9880038f1749', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:34:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:57.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:57 np0005588920 nova_compute[226886]: 2026-01-20 14:34:57.029 226890 DEBUG oslo_concurrency.processutils [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:34:57 np0005588920 nova_compute[226886]: 2026-01-20 14:34:57.030 226890 DEBUG oslo_concurrency.lockutils [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:34:57 np0005588920 nova_compute[226886]: 2026-01-20 14:34:57.031 226890 DEBUG oslo_concurrency.lockutils [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:34:57 np0005588920 nova_compute[226886]: 2026-01-20 14:34:57.031 226890 DEBUG oslo_concurrency.lockutils [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:34:57 np0005588920 nova_compute[226886]: 2026-01-20 14:34:57.057 226890 DEBUG nova.storage.rbd_utils [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] rbd image e22c5447-900e-45da-b2af-46423bc1e2d8_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:34:57 np0005588920 nova_compute[226886]: 2026-01-20 14:34:57.061 226890 DEBUG oslo_concurrency.processutils [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 e22c5447-900e-45da-b2af-46423bc1e2d8_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:34:57 np0005588920 nova_compute[226886]: 2026-01-20 14:34:57.327 226890 DEBUG oslo_concurrency.processutils [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 e22c5447-900e-45da-b2af-46423bc1e2d8_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.265s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:34:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e177 e177: 3 total, 3 up, 3 in
Jan 20 09:34:57 np0005588920 nova_compute[226886]: 2026-01-20 14:34:57.404 226890 DEBUG nova.storage.rbd_utils [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] resizing rbd image e22c5447-900e-45da-b2af-46423bc1e2d8_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:34:57 np0005588920 nova_compute[226886]: 2026-01-20 14:34:57.513 226890 DEBUG nova.objects.instance [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lazy-loading 'migration_context' on Instance uuid e22c5447-900e-45da-b2af-46423bc1e2d8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:34:57 np0005588920 nova_compute[226886]: 2026-01-20 14:34:57.549 226890 DEBUG nova.virt.libvirt.driver [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:34:57 np0005588920 nova_compute[226886]: 2026-01-20 14:34:57.550 226890 DEBUG nova.virt.libvirt.driver [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Ensure instance console log exists: /var/lib/nova/instances/e22c5447-900e-45da-b2af-46423bc1e2d8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:34:57 np0005588920 nova_compute[226886]: 2026-01-20 14:34:57.551 226890 DEBUG oslo_concurrency.lockutils [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:34:57 np0005588920 nova_compute[226886]: 2026-01-20 14:34:57.551 226890 DEBUG oslo_concurrency.lockutils [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:34:57 np0005588920 nova_compute[226886]: 2026-01-20 14:34:57.551 226890 DEBUG oslo_concurrency.lockutils [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:34:57 np0005588920 nova_compute[226886]: 2026-01-20 14:34:57.845 226890 DEBUG nova.network.neutron [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Successfully created port: 3f3dec34-7223-4718-8c4b-bf4fdfddf3ef _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:34:58 np0005588920 nova_compute[226886]: 2026-01-20 14:34:58.375 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:58.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:34:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:59.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:59 np0005588920 nova_compute[226886]: 2026-01-20 14:34:59.133 226890 DEBUG nova.network.neutron [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Successfully updated port: 3f3dec34-7223-4718-8c4b-bf4fdfddf3ef _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:34:59 np0005588920 nova_compute[226886]: 2026-01-20 14:34:59.152 226890 DEBUG oslo_concurrency.lockutils [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Acquiring lock "refresh_cache-e22c5447-900e-45da-b2af-46423bc1e2d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:34:59 np0005588920 nova_compute[226886]: 2026-01-20 14:34:59.152 226890 DEBUG oslo_concurrency.lockutils [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Acquired lock "refresh_cache-e22c5447-900e-45da-b2af-46423bc1e2d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:34:59 np0005588920 nova_compute[226886]: 2026-01-20 14:34:59.152 226890 DEBUG nova.network.neutron [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:34:59 np0005588920 nova_compute[226886]: 2026-01-20 14:34:59.255 226890 DEBUG nova.compute.manager [req-eb288455-8559-40da-b272-231f2f68e364 req-06220174-1bd2-4781-beda-a7b1e8245d6e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Received event network-changed-3f3dec34-7223-4718-8c4b-bf4fdfddf3ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:34:59 np0005588920 nova_compute[226886]: 2026-01-20 14:34:59.256 226890 DEBUG nova.compute.manager [req-eb288455-8559-40da-b272-231f2f68e364 req-06220174-1bd2-4781-beda-a7b1e8245d6e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Refreshing instance network info cache due to event network-changed-3f3dec34-7223-4718-8c4b-bf4fdfddf3ef. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:34:59 np0005588920 nova_compute[226886]: 2026-01-20 14:34:59.256 226890 DEBUG oslo_concurrency.lockutils [req-eb288455-8559-40da-b272-231f2f68e364 req-06220174-1bd2-4781-beda-a7b1e8245d6e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-e22c5447-900e-45da-b2af-46423bc1e2d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:34:59 np0005588920 nova_compute[226886]: 2026-01-20 14:34:59.367 226890 DEBUG nova.network.neutron [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:34:59 np0005588920 nova_compute[226886]: 2026-01-20 14:34:59.536 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:59 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e177 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:35:00 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:35:00 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:35:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:00.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:00 np0005588920 nova_compute[226886]: 2026-01-20 14:35:00.780 226890 DEBUG nova.network.neutron [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Updating instance_info_cache with network_info: [{"id": "3f3dec34-7223-4718-8c4b-bf4fdfddf3ef", "address": "fa:16:3e:e9:83:16", "network": {"id": "abb83e3e-0b12-431b-ad86-a1d271b5b46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-766235638-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3750c56415134773aa9d9880038f1749", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f3dec34-72", "ovs_interfaceid": "3f3dec34-7223-4718-8c4b-bf4fdfddf3ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:35:00 np0005588920 nova_compute[226886]: 2026-01-20 14:35:00.808 226890 DEBUG oslo_concurrency.lockutils [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Releasing lock "refresh_cache-e22c5447-900e-45da-b2af-46423bc1e2d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:35:00 np0005588920 nova_compute[226886]: 2026-01-20 14:35:00.809 226890 DEBUG nova.compute.manager [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Instance network_info: |[{"id": "3f3dec34-7223-4718-8c4b-bf4fdfddf3ef", "address": "fa:16:3e:e9:83:16", "network": {"id": "abb83e3e-0b12-431b-ad86-a1d271b5b46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-766235638-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3750c56415134773aa9d9880038f1749", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f3dec34-72", "ovs_interfaceid": "3f3dec34-7223-4718-8c4b-bf4fdfddf3ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:35:00 np0005588920 nova_compute[226886]: 2026-01-20 14:35:00.809 226890 DEBUG oslo_concurrency.lockutils [req-eb288455-8559-40da-b272-231f2f68e364 req-06220174-1bd2-4781-beda-a7b1e8245d6e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-e22c5447-900e-45da-b2af-46423bc1e2d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:35:00 np0005588920 nova_compute[226886]: 2026-01-20 14:35:00.809 226890 DEBUG nova.network.neutron [req-eb288455-8559-40da-b272-231f2f68e364 req-06220174-1bd2-4781-beda-a7b1e8245d6e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Refreshing network info cache for port 3f3dec34-7223-4718-8c4b-bf4fdfddf3ef _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:35:00 np0005588920 nova_compute[226886]: 2026-01-20 14:35:00.811 226890 DEBUG nova.virt.libvirt.driver [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Start _get_guest_xml network_info=[{"id": "3f3dec34-7223-4718-8c4b-bf4fdfddf3ef", "address": "fa:16:3e:e9:83:16", "network": {"id": "abb83e3e-0b12-431b-ad86-a1d271b5b46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-766235638-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3750c56415134773aa9d9880038f1749", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f3dec34-72", "ovs_interfaceid": "3f3dec34-7223-4718-8c4b-bf4fdfddf3ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:35:00 np0005588920 nova_compute[226886]: 2026-01-20 14:35:00.816 226890 WARNING nova.virt.libvirt.driver [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:35:00 np0005588920 nova_compute[226886]: 2026-01-20 14:35:00.821 226890 DEBUG nova.virt.libvirt.host [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:35:00 np0005588920 nova_compute[226886]: 2026-01-20 14:35:00.822 226890 DEBUG nova.virt.libvirt.host [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:35:00 np0005588920 nova_compute[226886]: 2026-01-20 14:35:00.825 226890 DEBUG nova.virt.libvirt.host [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:35:00 np0005588920 nova_compute[226886]: 2026-01-20 14:35:00.825 226890 DEBUG nova.virt.libvirt.host [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:35:00 np0005588920 nova_compute[226886]: 2026-01-20 14:35:00.826 226890 DEBUG nova.virt.libvirt.driver [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:35:00 np0005588920 nova_compute[226886]: 2026-01-20 14:35:00.826 226890 DEBUG nova.virt.hardware [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:35:00 np0005588920 nova_compute[226886]: 2026-01-20 14:35:00.827 226890 DEBUG nova.virt.hardware [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:35:00 np0005588920 nova_compute[226886]: 2026-01-20 14:35:00.827 226890 DEBUG nova.virt.hardware [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:35:00 np0005588920 nova_compute[226886]: 2026-01-20 14:35:00.827 226890 DEBUG nova.virt.hardware [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:35:00 np0005588920 nova_compute[226886]: 2026-01-20 14:35:00.827 226890 DEBUG nova.virt.hardware [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:35:00 np0005588920 nova_compute[226886]: 2026-01-20 14:35:00.828 226890 DEBUG nova.virt.hardware [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:35:00 np0005588920 nova_compute[226886]: 2026-01-20 14:35:00.828 226890 DEBUG nova.virt.hardware [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:35:00 np0005588920 nova_compute[226886]: 2026-01-20 14:35:00.828 226890 DEBUG nova.virt.hardware [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:35:00 np0005588920 nova_compute[226886]: 2026-01-20 14:35:00.828 226890 DEBUG nova.virt.hardware [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:35:00 np0005588920 nova_compute[226886]: 2026-01-20 14:35:00.828 226890 DEBUG nova.virt.hardware [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:35:00 np0005588920 nova_compute[226886]: 2026-01-20 14:35:00.828 226890 DEBUG nova.virt.hardware [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:35:00 np0005588920 nova_compute[226886]: 2026-01-20 14:35:00.831 226890 DEBUG oslo_concurrency.processutils [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:35:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:35:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:01.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:35:01 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:35:01 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2503250766' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:35:01 np0005588920 nova_compute[226886]: 2026-01-20 14:35:01.325 226890 DEBUG oslo_concurrency.processutils [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.494s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:35:01 np0005588920 nova_compute[226886]: 2026-01-20 14:35:01.349 226890 DEBUG nova.storage.rbd_utils [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] rbd image e22c5447-900e-45da-b2af-46423bc1e2d8_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:35:01 np0005588920 nova_compute[226886]: 2026-01-20 14:35:01.353 226890 DEBUG oslo_concurrency.processutils [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:35:01 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:35:01 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3622645078' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:35:01 np0005588920 nova_compute[226886]: 2026-01-20 14:35:01.754 226890 DEBUG oslo_concurrency.processutils [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.402s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:35:01 np0005588920 nova_compute[226886]: 2026-01-20 14:35:01.756 226890 DEBUG nova.virt.libvirt.vif [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:34:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1596237958',display_name='tempest-ImagesTestJSON-server-1596237958',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-1596237958',id=45,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3750c56415134773aa9d9880038f1749',ramdisk_id='',reservation_id='r-3za8g4v3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-338390217',owner_user_name='tempest-ImagesTestJSON-338390217-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:34:56Z,user_data=None,user_id='56e2959629114d3d8a48e7a80ed96c4b',uuid=e22c5447-900e-45da-b2af-46423bc1e2d8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3f3dec34-7223-4718-8c4b-bf4fdfddf3ef", "address": "fa:16:3e:e9:83:16", "network": {"id": "abb83e3e-0b12-431b-ad86-a1d271b5b46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-766235638-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3750c56415134773aa9d9880038f1749", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f3dec34-72", "ovs_interfaceid": "3f3dec34-7223-4718-8c4b-bf4fdfddf3ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:35:01 np0005588920 nova_compute[226886]: 2026-01-20 14:35:01.756 226890 DEBUG nova.network.os_vif_util [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Converting VIF {"id": "3f3dec34-7223-4718-8c4b-bf4fdfddf3ef", "address": "fa:16:3e:e9:83:16", "network": {"id": "abb83e3e-0b12-431b-ad86-a1d271b5b46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-766235638-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3750c56415134773aa9d9880038f1749", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f3dec34-72", "ovs_interfaceid": "3f3dec34-7223-4718-8c4b-bf4fdfddf3ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:35:01 np0005588920 nova_compute[226886]: 2026-01-20 14:35:01.757 226890 DEBUG nova.network.os_vif_util [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e9:83:16,bridge_name='br-int',has_traffic_filtering=True,id=3f3dec34-7223-4718-8c4b-bf4fdfddf3ef,network=Network(abb83e3e-0b12-431b-ad86-a1d271b5b46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3f3dec34-72') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:35:01 np0005588920 nova_compute[226886]: 2026-01-20 14:35:01.759 226890 DEBUG nova.objects.instance [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lazy-loading 'pci_devices' on Instance uuid e22c5447-900e-45da-b2af-46423bc1e2d8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:35:01 np0005588920 nova_compute[226886]: 2026-01-20 14:35:01.778 226890 DEBUG nova.virt.libvirt.driver [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:35:01 np0005588920 nova_compute[226886]:  <uuid>e22c5447-900e-45da-b2af-46423bc1e2d8</uuid>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:  <name>instance-0000002d</name>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:35:01 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:      <nova:name>tempest-ImagesTestJSON-server-1596237958</nova:name>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:35:00</nova:creationTime>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:35:01 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:        <nova:user uuid="56e2959629114d3d8a48e7a80ed96c4b">tempest-ImagesTestJSON-338390217-project-member</nova:user>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:        <nova:project uuid="3750c56415134773aa9d9880038f1749">tempest-ImagesTestJSON-338390217</nova:project>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:        <nova:port uuid="3f3dec34-7223-4718-8c4b-bf4fdfddf3ef">
Jan 20 09:35:01 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:      <entry name="serial">e22c5447-900e-45da-b2af-46423bc1e2d8</entry>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:      <entry name="uuid">e22c5447-900e-45da-b2af-46423bc1e2d8</entry>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:35:01 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/e22c5447-900e-45da-b2af-46423bc1e2d8_disk">
Jan 20 09:35:01 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:35:01 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:35:01 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/e22c5447-900e-45da-b2af-46423bc1e2d8_disk.config">
Jan 20 09:35:01 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:35:01 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:35:01 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:e9:83:16"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:      <target dev="tap3f3dec34-72"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:35:01 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/e22c5447-900e-45da-b2af-46423bc1e2d8/console.log" append="off"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:35:01 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:35:01 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:35:01 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:35:01 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:35:01 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:35:01 np0005588920 nova_compute[226886]: 2026-01-20 14:35:01.780 226890 DEBUG nova.compute.manager [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Preparing to wait for external event network-vif-plugged-3f3dec34-7223-4718-8c4b-bf4fdfddf3ef prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:35:01 np0005588920 nova_compute[226886]: 2026-01-20 14:35:01.781 226890 DEBUG oslo_concurrency.lockutils [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Acquiring lock "e22c5447-900e-45da-b2af-46423bc1e2d8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:35:01 np0005588920 nova_compute[226886]: 2026-01-20 14:35:01.781 226890 DEBUG oslo_concurrency.lockutils [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "e22c5447-900e-45da-b2af-46423bc1e2d8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:35:01 np0005588920 nova_compute[226886]: 2026-01-20 14:35:01.781 226890 DEBUG oslo_concurrency.lockutils [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "e22c5447-900e-45da-b2af-46423bc1e2d8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:35:01 np0005588920 nova_compute[226886]: 2026-01-20 14:35:01.782 226890 DEBUG nova.virt.libvirt.vif [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:34:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1596237958',display_name='tempest-ImagesTestJSON-server-1596237958',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-1596237958',id=45,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3750c56415134773aa9d9880038f1749',ramdisk_id='',reservation_id='r-3za8g4v3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-338390217',owner_user_name='tempest-ImagesTestJSON-338390217-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:34:56Z,user_data=None,user_id='56e2959629114d3d8a48e7a80ed96c4b',uuid=e22c5447-900e-45da-b2af-46423bc1e2d8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3f3dec34-7223-4718-8c4b-bf4fdfddf3ef", "address": "fa:16:3e:e9:83:16", "network": {"id": "abb83e3e-0b12-431b-ad86-a1d271b5b46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-766235638-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3750c56415134773aa9d9880038f1749", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f3dec34-72", "ovs_interfaceid": "3f3dec34-7223-4718-8c4b-bf4fdfddf3ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:35:01 np0005588920 nova_compute[226886]: 2026-01-20 14:35:01.783 226890 DEBUG nova.network.os_vif_util [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Converting VIF {"id": "3f3dec34-7223-4718-8c4b-bf4fdfddf3ef", "address": "fa:16:3e:e9:83:16", "network": {"id": "abb83e3e-0b12-431b-ad86-a1d271b5b46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-766235638-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3750c56415134773aa9d9880038f1749", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f3dec34-72", "ovs_interfaceid": "3f3dec34-7223-4718-8c4b-bf4fdfddf3ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:35:01 np0005588920 nova_compute[226886]: 2026-01-20 14:35:01.784 226890 DEBUG nova.network.os_vif_util [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e9:83:16,bridge_name='br-int',has_traffic_filtering=True,id=3f3dec34-7223-4718-8c4b-bf4fdfddf3ef,network=Network(abb83e3e-0b12-431b-ad86-a1d271b5b46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3f3dec34-72') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:35:01 np0005588920 nova_compute[226886]: 2026-01-20 14:35:01.784 226890 DEBUG os_vif [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e9:83:16,bridge_name='br-int',has_traffic_filtering=True,id=3f3dec34-7223-4718-8c4b-bf4fdfddf3ef,network=Network(abb83e3e-0b12-431b-ad86-a1d271b5b46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3f3dec34-72') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:35:01 np0005588920 nova_compute[226886]: 2026-01-20 14:35:01.785 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:01 np0005588920 nova_compute[226886]: 2026-01-20 14:35:01.785 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:35:01 np0005588920 nova_compute[226886]: 2026-01-20 14:35:01.786 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:35:01 np0005588920 nova_compute[226886]: 2026-01-20 14:35:01.789 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:01 np0005588920 nova_compute[226886]: 2026-01-20 14:35:01.789 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3f3dec34-72, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:35:01 np0005588920 nova_compute[226886]: 2026-01-20 14:35:01.790 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3f3dec34-72, col_values=(('external_ids', {'iface-id': '3f3dec34-7223-4718-8c4b-bf4fdfddf3ef', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e9:83:16', 'vm-uuid': 'e22c5447-900e-45da-b2af-46423bc1e2d8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:35:01 np0005588920 nova_compute[226886]: 2026-01-20 14:35:01.840 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:01 np0005588920 NetworkManager[49076]: <info>  [1768919701.8413] manager: (tap3f3dec34-72): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/70)
Jan 20 09:35:01 np0005588920 nova_compute[226886]: 2026-01-20 14:35:01.842 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:35:01 np0005588920 nova_compute[226886]: 2026-01-20 14:35:01.845 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:01 np0005588920 nova_compute[226886]: 2026-01-20 14:35:01.845 226890 INFO os_vif [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e9:83:16,bridge_name='br-int',has_traffic_filtering=True,id=3f3dec34-7223-4718-8c4b-bf4fdfddf3ef,network=Network(abb83e3e-0b12-431b-ad86-a1d271b5b46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3f3dec34-72')#033[00m
Jan 20 09:35:01 np0005588920 nova_compute[226886]: 2026-01-20 14:35:01.905 226890 DEBUG nova.virt.libvirt.driver [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:35:01 np0005588920 nova_compute[226886]: 2026-01-20 14:35:01.906 226890 DEBUG nova.virt.libvirt.driver [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:35:01 np0005588920 nova_compute[226886]: 2026-01-20 14:35:01.906 226890 DEBUG nova.virt.libvirt.driver [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] No VIF found with MAC fa:16:3e:e9:83:16, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:35:01 np0005588920 nova_compute[226886]: 2026-01-20 14:35:01.906 226890 INFO nova.virt.libvirt.driver [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Using config drive#033[00m
Jan 20 09:35:01 np0005588920 nova_compute[226886]: 2026-01-20 14:35:01.934 226890 DEBUG nova.storage.rbd_utils [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] rbd image e22c5447-900e-45da-b2af-46423bc1e2d8_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:35:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e178 e178: 3 total, 3 up, 3 in
Jan 20 09:35:02 np0005588920 nova_compute[226886]: 2026-01-20 14:35:02.465 226890 DEBUG nova.network.neutron [req-eb288455-8559-40da-b272-231f2f68e364 req-06220174-1bd2-4781-beda-a7b1e8245d6e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Updated VIF entry in instance network info cache for port 3f3dec34-7223-4718-8c4b-bf4fdfddf3ef. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:35:02 np0005588920 nova_compute[226886]: 2026-01-20 14:35:02.466 226890 DEBUG nova.network.neutron [req-eb288455-8559-40da-b272-231f2f68e364 req-06220174-1bd2-4781-beda-a7b1e8245d6e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Updating instance_info_cache with network_info: [{"id": "3f3dec34-7223-4718-8c4b-bf4fdfddf3ef", "address": "fa:16:3e:e9:83:16", "network": {"id": "abb83e3e-0b12-431b-ad86-a1d271b5b46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-766235638-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3750c56415134773aa9d9880038f1749", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f3dec34-72", "ovs_interfaceid": "3f3dec34-7223-4718-8c4b-bf4fdfddf3ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:35:02 np0005588920 nova_compute[226886]: 2026-01-20 14:35:02.473 226890 INFO nova.virt.libvirt.driver [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Creating config drive at /var/lib/nova/instances/e22c5447-900e-45da-b2af-46423bc1e2d8/disk.config#033[00m
Jan 20 09:35:02 np0005588920 nova_compute[226886]: 2026-01-20 14:35:02.482 226890 DEBUG oslo_concurrency.processutils [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e22c5447-900e-45da-b2af-46423bc1e2d8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpw8v0ffsu execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:35:02 np0005588920 nova_compute[226886]: 2026-01-20 14:35:02.513 226890 DEBUG oslo_concurrency.lockutils [req-eb288455-8559-40da-b272-231f2f68e364 req-06220174-1bd2-4781-beda-a7b1e8245d6e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-e22c5447-900e-45da-b2af-46423bc1e2d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:35:02 np0005588920 nova_compute[226886]: 2026-01-20 14:35:02.617 226890 DEBUG oslo_concurrency.processutils [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e22c5447-900e-45da-b2af-46423bc1e2d8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpw8v0ffsu" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:35:02 np0005588920 nova_compute[226886]: 2026-01-20 14:35:02.660 226890 DEBUG nova.storage.rbd_utils [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] rbd image e22c5447-900e-45da-b2af-46423bc1e2d8_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:35:02 np0005588920 nova_compute[226886]: 2026-01-20 14:35:02.666 226890 DEBUG oslo_concurrency.processutils [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/e22c5447-900e-45da-b2af-46423bc1e2d8/disk.config e22c5447-900e-45da-b2af-46423bc1e2d8_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:35:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:02.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:02 np0005588920 nova_compute[226886]: 2026-01-20 14:35:02.837 226890 DEBUG oslo_concurrency.processutils [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/e22c5447-900e-45da-b2af-46423bc1e2d8/disk.config e22c5447-900e-45da-b2af-46423bc1e2d8_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.171s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:35:02 np0005588920 nova_compute[226886]: 2026-01-20 14:35:02.838 226890 INFO nova.virt.libvirt.driver [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Deleting local config drive /var/lib/nova/instances/e22c5447-900e-45da-b2af-46423bc1e2d8/disk.config because it was imported into RBD.#033[00m
Jan 20 09:35:02 np0005588920 kernel: tap3f3dec34-72: entered promiscuous mode
Jan 20 09:35:02 np0005588920 NetworkManager[49076]: <info>  [1768919702.8810] manager: (tap3f3dec34-72): new Tun device (/org/freedesktop/NetworkManager/Devices/71)
Jan 20 09:35:02 np0005588920 systemd-udevd[243889]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:35:02 np0005588920 ovn_controller[133971]: 2026-01-20T14:35:02Z|00116|binding|INFO|Claiming lport 3f3dec34-7223-4718-8c4b-bf4fdfddf3ef for this chassis.
Jan 20 09:35:02 np0005588920 nova_compute[226886]: 2026-01-20 14:35:02.928 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:02 np0005588920 ovn_controller[133971]: 2026-01-20T14:35:02Z|00117|binding|INFO|3f3dec34-7223-4718-8c4b-bf4fdfddf3ef: Claiming fa:16:3e:e9:83:16 10.100.0.6
Jan 20 09:35:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:02.935 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e9:83:16 10.100.0.6'], port_security=['fa:16:3e:e9:83:16 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'e22c5447-900e-45da-b2af-46423bc1e2d8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-abb83e3e-0b12-431b-ad86-a1d271b5b46a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3750c56415134773aa9d9880038f1749', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2e302063-2ccd-4f7c-8835-ef521762a486', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4125934e-1dea-4e34-a38d-5291c850f0b2, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=3f3dec34-7223-4718-8c4b-bf4fdfddf3ef) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:35:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:02.936 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 3f3dec34-7223-4718-8c4b-bf4fdfddf3ef in datapath abb83e3e-0b12-431b-ad86-a1d271b5b46a bound to our chassis#033[00m
Jan 20 09:35:02 np0005588920 NetworkManager[49076]: <info>  [1768919702.9393] device (tap3f3dec34-72): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:35:02 np0005588920 NetworkManager[49076]: <info>  [1768919702.9397] device (tap3f3dec34-72): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:35:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:02.938 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network abb83e3e-0b12-431b-ad86-a1d271b5b46a#033[00m
Jan 20 09:35:02 np0005588920 ovn_controller[133971]: 2026-01-20T14:35:02Z|00118|binding|INFO|Setting lport 3f3dec34-7223-4718-8c4b-bf4fdfddf3ef ovn-installed in OVS
Jan 20 09:35:02 np0005588920 ovn_controller[133971]: 2026-01-20T14:35:02Z|00119|binding|INFO|Setting lport 3f3dec34-7223-4718-8c4b-bf4fdfddf3ef up in Southbound
Jan 20 09:35:02 np0005588920 nova_compute[226886]: 2026-01-20 14:35:02.946 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:02.950 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0ba36529-1a86-4849-ae0c-e734ff79770a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:02.951 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapabb83e3e-01 in ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:35:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:02.953 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapabb83e3e-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:35:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:02.953 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5f234249-6a04-4633-8e0d-095d320d3ec2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:02.954 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[79c4b328-c871-481c-911f-5e43dbeb1b52]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:02 np0005588920 systemd-machined[196121]: New machine qemu-20-instance-0000002d.
Jan 20 09:35:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:02.964 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[717296dc-563e-4bfb-b8b9-968ec6476024]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:02 np0005588920 systemd[1]: Started Virtual Machine qemu-20-instance-0000002d.
Jan 20 09:35:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:02.987 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e5a20440-e539-41bf-9af2-59102414d14a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:03.014 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[e606f1e9-c45d-4b66-8dbe-cb742cdf032b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:03.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:03 np0005588920 systemd-udevd[243892]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:35:03 np0005588920 NetworkManager[49076]: <info>  [1768919703.0207] manager: (tapabb83e3e-00): new Veth device (/org/freedesktop/NetworkManager/Devices/72)
Jan 20 09:35:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:03.020 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d0362438-c757-4f2a-8baa-d8d70531d02e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:03.050 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[51f978ba-ac35-4506-a217-305ea4074c04]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:03.053 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[0b16ec96-0c65-48c0-bb05-7bade08bdce1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:03 np0005588920 NetworkManager[49076]: <info>  [1768919703.0740] device (tapabb83e3e-00): carrier: link connected
Jan 20 09:35:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:03.078 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[a9a29669-96cb-4042-ad61-a47bdf62c2ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:03.097 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0454e220-5aa7-4df8-a43d-af0d369b16da]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapabb83e3e-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fd:0b:d2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 40], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 474476, 'reachable_time': 37568, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243925, 'error': None, 'target': 'ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:03.116 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[72b8de5e-a664-4728-a428-fd86351e0c9c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefd:bd2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 474476, 'tstamp': 474476}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 243926, 'error': None, 'target': 'ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:03.140 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[55ed8f12-26ce-419c-9c27-8d759bcd6fa2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapabb83e3e-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fd:0b:d2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 40], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 474476, 'reachable_time': 37568, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 243927, 'error': None, 'target': 'ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:03 np0005588920 nova_compute[226886]: 2026-01-20 14:35:03.174 226890 DEBUG nova.compute.manager [req-d6a0dc26-69f7-4b3a-adc5-d737802a7089 req-fd5c3e69-005c-4bc7-af98-20df1889a4da 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Received event network-vif-plugged-3f3dec34-7223-4718-8c4b-bf4fdfddf3ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:35:03 np0005588920 nova_compute[226886]: 2026-01-20 14:35:03.175 226890 DEBUG oslo_concurrency.lockutils [req-d6a0dc26-69f7-4b3a-adc5-d737802a7089 req-fd5c3e69-005c-4bc7-af98-20df1889a4da 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "e22c5447-900e-45da-b2af-46423bc1e2d8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:35:03 np0005588920 nova_compute[226886]: 2026-01-20 14:35:03.175 226890 DEBUG oslo_concurrency.lockutils [req-d6a0dc26-69f7-4b3a-adc5-d737802a7089 req-fd5c3e69-005c-4bc7-af98-20df1889a4da 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "e22c5447-900e-45da-b2af-46423bc1e2d8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:35:03 np0005588920 nova_compute[226886]: 2026-01-20 14:35:03.176 226890 DEBUG oslo_concurrency.lockutils [req-d6a0dc26-69f7-4b3a-adc5-d737802a7089 req-fd5c3e69-005c-4bc7-af98-20df1889a4da 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "e22c5447-900e-45da-b2af-46423bc1e2d8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:35:03 np0005588920 nova_compute[226886]: 2026-01-20 14:35:03.176 226890 DEBUG nova.compute.manager [req-d6a0dc26-69f7-4b3a-adc5-d737802a7089 req-fd5c3e69-005c-4bc7-af98-20df1889a4da 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Processing event network-vif-plugged-3f3dec34-7223-4718-8c4b-bf4fdfddf3ef _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:35:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:03.178 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[79c28c60-adbe-4d04-98f0-f46bb846110e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:03.245 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[4a1e6433-8535-4e85-9d28-2fdb08c41fd7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:03.247 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapabb83e3e-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:35:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:03.247 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:35:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:03.248 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapabb83e3e-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:35:03 np0005588920 nova_compute[226886]: 2026-01-20 14:35:03.250 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:03 np0005588920 kernel: tapabb83e3e-00: entered promiscuous mode
Jan 20 09:35:03 np0005588920 NetworkManager[49076]: <info>  [1768919703.2507] manager: (tapabb83e3e-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/73)
Jan 20 09:35:03 np0005588920 nova_compute[226886]: 2026-01-20 14:35:03.256 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:03.259 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapabb83e3e-00, col_values=(('external_ids', {'iface-id': 'dfacaf19-f896-4c13-a7ad-47b57cf03fc1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:35:03 np0005588920 nova_compute[226886]: 2026-01-20 14:35:03.260 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:03 np0005588920 ovn_controller[133971]: 2026-01-20T14:35:03Z|00120|binding|INFO|Releasing lport dfacaf19-f896-4c13-a7ad-47b57cf03fc1 from this chassis (sb_readonly=0)
Jan 20 09:35:03 np0005588920 nova_compute[226886]: 2026-01-20 14:35:03.261 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:03.263 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/abb83e3e-0b12-431b-ad86-a1d271b5b46a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/abb83e3e-0b12-431b-ad86-a1d271b5b46a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:35:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:03.264 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[9bb2e76e-5152-474d-8258-867b686e8b76]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:03.265 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:35:03 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 09:35:03 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 09:35:03 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-abb83e3e-0b12-431b-ad86-a1d271b5b46a
Jan 20 09:35:03 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 09:35:03 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 09:35:03 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 09:35:03 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/abb83e3e-0b12-431b-ad86-a1d271b5b46a.pid.haproxy
Jan 20 09:35:03 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 09:35:03 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:35:03 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 09:35:03 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 09:35:03 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 09:35:03 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 09:35:03 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 09:35:03 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 09:35:03 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 09:35:03 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 09:35:03 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 09:35:03 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 09:35:03 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 09:35:03 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 09:35:03 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 09:35:03 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:35:03 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:35:03 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 09:35:03 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 09:35:03 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:35:03 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID abb83e3e-0b12-431b-ad86-a1d271b5b46a
Jan 20 09:35:03 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:35:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:03.267 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a', 'env', 'PROCESS_TAG=haproxy-abb83e3e-0b12-431b-ad86-a1d271b5b46a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/abb83e3e-0b12-431b-ad86-a1d271b5b46a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:35:03 np0005588920 nova_compute[226886]: 2026-01-20 14:35:03.274 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:03 np0005588920 podman[243959]: 2026-01-20 14:35:03.60287731 +0000 UTC m=+0.044941881 container create f4555188deb4bb8df8edaeb4d7c95d78707ca49340e72b887fe9e093b4270ba1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 20 09:35:03 np0005588920 systemd[1]: Started libpod-conmon-f4555188deb4bb8df8edaeb4d7c95d78707ca49340e72b887fe9e093b4270ba1.scope.
Jan 20 09:35:03 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:35:03 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/14bc360de7ee8c5f67acdc039ed29aeb0f1f5df9711cad7e48c743a2ed340cd3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:35:03 np0005588920 podman[243959]: 2026-01-20 14:35:03.674891529 +0000 UTC m=+0.116956130 container init f4555188deb4bb8df8edaeb4d7c95d78707ca49340e72b887fe9e093b4270ba1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 20 09:35:03 np0005588920 podman[243959]: 2026-01-20 14:35:03.579768322 +0000 UTC m=+0.021832923 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:35:03 np0005588920 podman[243959]: 2026-01-20 14:35:03.681087623 +0000 UTC m=+0.123152194 container start f4555188deb4bb8df8edaeb4d7c95d78707ca49340e72b887fe9e093b4270ba1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 20 09:35:03 np0005588920 neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a[243974]: [NOTICE]   (243978) : New worker (243980) forked
Jan 20 09:35:03 np0005588920 neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a[243974]: [NOTICE]   (243978) : Loading success.
Jan 20 09:35:03 np0005588920 nova_compute[226886]: 2026-01-20 14:35:03.973 226890 DEBUG nova.compute.manager [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:35:03 np0005588920 nova_compute[226886]: 2026-01-20 14:35:03.976 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919703.972919, e22c5447-900e-45da-b2af-46423bc1e2d8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:35:03 np0005588920 nova_compute[226886]: 2026-01-20 14:35:03.976 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] VM Started (Lifecycle Event)#033[00m
Jan 20 09:35:03 np0005588920 nova_compute[226886]: 2026-01-20 14:35:03.980 226890 DEBUG nova.virt.libvirt.driver [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:35:03 np0005588920 nova_compute[226886]: 2026-01-20 14:35:03.983 226890 INFO nova.virt.libvirt.driver [-] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Instance spawned successfully.#033[00m
Jan 20 09:35:03 np0005588920 nova_compute[226886]: 2026-01-20 14:35:03.983 226890 DEBUG nova.virt.libvirt.driver [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:35:04 np0005588920 nova_compute[226886]: 2026-01-20 14:35:04.007 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:35:04 np0005588920 nova_compute[226886]: 2026-01-20 14:35:04.013 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:35:04 np0005588920 nova_compute[226886]: 2026-01-20 14:35:04.015 226890 DEBUG nova.virt.libvirt.driver [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:35:04 np0005588920 nova_compute[226886]: 2026-01-20 14:35:04.016 226890 DEBUG nova.virt.libvirt.driver [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:35:04 np0005588920 nova_compute[226886]: 2026-01-20 14:35:04.016 226890 DEBUG nova.virt.libvirt.driver [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:35:04 np0005588920 nova_compute[226886]: 2026-01-20 14:35:04.016 226890 DEBUG nova.virt.libvirt.driver [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:35:04 np0005588920 nova_compute[226886]: 2026-01-20 14:35:04.017 226890 DEBUG nova.virt.libvirt.driver [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:35:04 np0005588920 nova_compute[226886]: 2026-01-20 14:35:04.017 226890 DEBUG nova.virt.libvirt.driver [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:35:04 np0005588920 nova_compute[226886]: 2026-01-20 14:35:04.041 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:35:04 np0005588920 nova_compute[226886]: 2026-01-20 14:35:04.041 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919703.9743788, e22c5447-900e-45da-b2af-46423bc1e2d8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:35:04 np0005588920 nova_compute[226886]: 2026-01-20 14:35:04.042 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:35:04 np0005588920 nova_compute[226886]: 2026-01-20 14:35:04.057 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:35:04 np0005588920 nova_compute[226886]: 2026-01-20 14:35:04.060 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919703.978734, e22c5447-900e-45da-b2af-46423bc1e2d8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:35:04 np0005588920 nova_compute[226886]: 2026-01-20 14:35:04.060 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:35:04 np0005588920 nova_compute[226886]: 2026-01-20 14:35:04.100 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:35:04 np0005588920 nova_compute[226886]: 2026-01-20 14:35:04.103 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:35:04 np0005588920 nova_compute[226886]: 2026-01-20 14:35:04.109 226890 INFO nova.compute.manager [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Took 7.25 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:35:04 np0005588920 nova_compute[226886]: 2026-01-20 14:35:04.110 226890 DEBUG nova.compute.manager [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:35:04 np0005588920 nova_compute[226886]: 2026-01-20 14:35:04.134 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:35:04 np0005588920 nova_compute[226886]: 2026-01-20 14:35:04.174 226890 INFO nova.compute.manager [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Took 8.13 seconds to build instance.#033[00m
Jan 20 09:35:04 np0005588920 nova_compute[226886]: 2026-01-20 14:35:04.191 226890 DEBUG oslo_concurrency.lockutils [None req-33d8f3e2-0f7c-44b9-8326-a2aed111924d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "e22c5447-900e-45da-b2af-46423bc1e2d8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.227s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:35:04 np0005588920 nova_compute[226886]: 2026-01-20 14:35:04.538 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:04 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e178 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:35:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:04.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:05.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:05 np0005588920 nova_compute[226886]: 2026-01-20 14:35:05.222 226890 DEBUG oslo_concurrency.lockutils [None req-ca3b75eb-76a2-480b-8a80-c774c0654d58 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Acquiring lock "e22c5447-900e-45da-b2af-46423bc1e2d8" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:35:05 np0005588920 nova_compute[226886]: 2026-01-20 14:35:05.222 226890 DEBUG oslo_concurrency.lockutils [None req-ca3b75eb-76a2-480b-8a80-c774c0654d58 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "e22c5447-900e-45da-b2af-46423bc1e2d8" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:35:05 np0005588920 nova_compute[226886]: 2026-01-20 14:35:05.223 226890 DEBUG nova.compute.manager [None req-ca3b75eb-76a2-480b-8a80-c774c0654d58 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:35:05 np0005588920 nova_compute[226886]: 2026-01-20 14:35:05.226 226890 DEBUG nova.compute.manager [None req-ca3b75eb-76a2-480b-8a80-c774c0654d58 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m
Jan 20 09:35:05 np0005588920 nova_compute[226886]: 2026-01-20 14:35:05.226 226890 DEBUG nova.objects.instance [None req-ca3b75eb-76a2-480b-8a80-c774c0654d58 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lazy-loading 'flavor' on Instance uuid e22c5447-900e-45da-b2af-46423bc1e2d8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:35:05 np0005588920 nova_compute[226886]: 2026-01-20 14:35:05.252 226890 DEBUG nova.virt.libvirt.driver [None req-ca3b75eb-76a2-480b-8a80-c774c0654d58 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 20 09:35:05 np0005588920 nova_compute[226886]: 2026-01-20 14:35:05.268 226890 DEBUG nova.compute.manager [req-d20d8cc5-a9d2-457d-853a-73835d185cd7 req-09ab70bd-ba0a-45d3-a79a-271172757942 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Received event network-vif-plugged-3f3dec34-7223-4718-8c4b-bf4fdfddf3ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:35:05 np0005588920 nova_compute[226886]: 2026-01-20 14:35:05.268 226890 DEBUG oslo_concurrency.lockutils [req-d20d8cc5-a9d2-457d-853a-73835d185cd7 req-09ab70bd-ba0a-45d3-a79a-271172757942 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "e22c5447-900e-45da-b2af-46423bc1e2d8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:35:05 np0005588920 nova_compute[226886]: 2026-01-20 14:35:05.268 226890 DEBUG oslo_concurrency.lockutils [req-d20d8cc5-a9d2-457d-853a-73835d185cd7 req-09ab70bd-ba0a-45d3-a79a-271172757942 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "e22c5447-900e-45da-b2af-46423bc1e2d8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:35:05 np0005588920 nova_compute[226886]: 2026-01-20 14:35:05.268 226890 DEBUG oslo_concurrency.lockutils [req-d20d8cc5-a9d2-457d-853a-73835d185cd7 req-09ab70bd-ba0a-45d3-a79a-271172757942 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "e22c5447-900e-45da-b2af-46423bc1e2d8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:35:05 np0005588920 nova_compute[226886]: 2026-01-20 14:35:05.268 226890 DEBUG nova.compute.manager [req-d20d8cc5-a9d2-457d-853a-73835d185cd7 req-09ab70bd-ba0a-45d3-a79a-271172757942 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] No waiting events found dispatching network-vif-plugged-3f3dec34-7223-4718-8c4b-bf4fdfddf3ef pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:35:05 np0005588920 nova_compute[226886]: 2026-01-20 14:35:05.269 226890 WARNING nova.compute.manager [req-d20d8cc5-a9d2-457d-853a-73835d185cd7 req-09ab70bd-ba0a-45d3-a79a-271172757942 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Received unexpected event network-vif-plugged-3f3dec34-7223-4718-8c4b-bf4fdfddf3ef for instance with vm_state active and task_state powering-off.#033[00m
Jan 20 09:35:06 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e179 e179: 3 total, 3 up, 3 in
Jan 20 09:35:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:35:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:06.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:35:06 np0005588920 nova_compute[226886]: 2026-01-20 14:35:06.841 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:07.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:07 np0005588920 nova_compute[226886]: 2026-01-20 14:35:07.211 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:07 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:07.210 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:35:07 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:07.211 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:35:07 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:07.211 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:35:08 np0005588920 nova_compute[226886]: 2026-01-20 14:35:08.345 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768919693.344259, 39013d10-cf09-4fc7-826c-99746ff0eb68 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:35:08 np0005588920 nova_compute[226886]: 2026-01-20 14:35:08.345 226890 INFO nova.compute.manager [-] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:35:08 np0005588920 nova_compute[226886]: 2026-01-20 14:35:08.364 226890 DEBUG nova.compute.manager [None req-6aa4307c-6e2b-4151-b580-8bdd55436c5b - - - - - -] [instance: 39013d10-cf09-4fc7-826c-99746ff0eb68] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:35:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:08.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:08 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e180 e180: 3 total, 3 up, 3 in
Jan 20 09:35:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:35:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:09.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:35:09 np0005588920 nova_compute[226886]: 2026-01-20 14:35:09.540 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e180 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:35:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e181 e181: 3 total, 3 up, 3 in
Jan 20 09:35:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:35:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:10.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:35:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:11.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:11 np0005588920 nova_compute[226886]: 2026-01-20 14:35:11.845 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:12 np0005588920 nova_compute[226886]: 2026-01-20 14:35:12.281 226890 DEBUG oslo_concurrency.lockutils [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Acquiring lock "06711d06-2cb8-4aa0-a787-db6d71e98029" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:35:12 np0005588920 nova_compute[226886]: 2026-01-20 14:35:12.282 226890 DEBUG oslo_concurrency.lockutils [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Lock "06711d06-2cb8-4aa0-a787-db6d71e98029" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:35:12 np0005588920 nova_compute[226886]: 2026-01-20 14:35:12.310 226890 DEBUG nova.compute.manager [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:35:12 np0005588920 nova_compute[226886]: 2026-01-20 14:35:12.382 226890 DEBUG oslo_concurrency.lockutils [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:35:12 np0005588920 nova_compute[226886]: 2026-01-20 14:35:12.382 226890 DEBUG oslo_concurrency.lockutils [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:35:12 np0005588920 nova_compute[226886]: 2026-01-20 14:35:12.397 226890 DEBUG nova.virt.hardware [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:35:12 np0005588920 nova_compute[226886]: 2026-01-20 14:35:12.398 226890 INFO nova.compute.claims [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 09:35:12 np0005588920 nova_compute[226886]: 2026-01-20 14:35:12.552 226890 DEBUG oslo_concurrency.processutils [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:35:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:35:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:12.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:35:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:35:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4216979389' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:35:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:13.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:13 np0005588920 nova_compute[226886]: 2026-01-20 14:35:13.031 226890 DEBUG oslo_concurrency.processutils [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.478s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:35:13 np0005588920 nova_compute[226886]: 2026-01-20 14:35:13.039 226890 DEBUG nova.compute.provider_tree [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:35:13 np0005588920 nova_compute[226886]: 2026-01-20 14:35:13.059 226890 DEBUG nova.scheduler.client.report [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:35:13 np0005588920 nova_compute[226886]: 2026-01-20 14:35:13.085 226890 DEBUG oslo_concurrency.lockutils [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.702s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:35:13 np0005588920 nova_compute[226886]: 2026-01-20 14:35:13.085 226890 DEBUG nova.compute.manager [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:35:13 np0005588920 nova_compute[226886]: 2026-01-20 14:35:13.128 226890 DEBUG nova.compute.manager [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:35:13 np0005588920 nova_compute[226886]: 2026-01-20 14:35:13.128 226890 DEBUG nova.network.neutron [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:35:13 np0005588920 nova_compute[226886]: 2026-01-20 14:35:13.155 226890 INFO nova.virt.libvirt.driver [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:35:13 np0005588920 nova_compute[226886]: 2026-01-20 14:35:13.182 226890 DEBUG nova.compute.manager [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:35:13 np0005588920 nova_compute[226886]: 2026-01-20 14:35:13.267 226890 DEBUG nova.compute.manager [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:35:13 np0005588920 nova_compute[226886]: 2026-01-20 14:35:13.269 226890 DEBUG nova.virt.libvirt.driver [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:35:13 np0005588920 nova_compute[226886]: 2026-01-20 14:35:13.269 226890 INFO nova.virt.libvirt.driver [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Creating image(s)#033[00m
Jan 20 09:35:13 np0005588920 nova_compute[226886]: 2026-01-20 14:35:13.319 226890 DEBUG nova.storage.rbd_utils [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] rbd image 06711d06-2cb8-4aa0-a787-db6d71e98029_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:35:13 np0005588920 nova_compute[226886]: 2026-01-20 14:35:13.362 226890 DEBUG nova.storage.rbd_utils [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] rbd image 06711d06-2cb8-4aa0-a787-db6d71e98029_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:35:13 np0005588920 nova_compute[226886]: 2026-01-20 14:35:13.393 226890 DEBUG nova.storage.rbd_utils [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] rbd image 06711d06-2cb8-4aa0-a787-db6d71e98029_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:35:13 np0005588920 nova_compute[226886]: 2026-01-20 14:35:13.397 226890 DEBUG oslo_concurrency.processutils [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:35:13 np0005588920 nova_compute[226886]: 2026-01-20 14:35:13.431 226890 DEBUG nova.policy [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6bf23282febb455daf4d4f24666cd6c3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a73bd836c7f64377a24971d95d583639', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:35:13 np0005588920 nova_compute[226886]: 2026-01-20 14:35:13.487 226890 DEBUG oslo_concurrency.processutils [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:35:13 np0005588920 nova_compute[226886]: 2026-01-20 14:35:13.488 226890 DEBUG oslo_concurrency.lockutils [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:35:13 np0005588920 nova_compute[226886]: 2026-01-20 14:35:13.488 226890 DEBUG oslo_concurrency.lockutils [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:35:13 np0005588920 nova_compute[226886]: 2026-01-20 14:35:13.488 226890 DEBUG oslo_concurrency.lockutils [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:35:13 np0005588920 nova_compute[226886]: 2026-01-20 14:35:13.508 226890 DEBUG nova.storage.rbd_utils [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] rbd image 06711d06-2cb8-4aa0-a787-db6d71e98029_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:35:13 np0005588920 nova_compute[226886]: 2026-01-20 14:35:13.511 226890 DEBUG oslo_concurrency.processutils [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 06711d06-2cb8-4aa0-a787-db6d71e98029_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:35:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 09:35:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/52686846' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 09:35:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 09:35:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/52686846' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 09:35:13 np0005588920 nova_compute[226886]: 2026-01-20 14:35:13.771 226890 DEBUG oslo_concurrency.processutils [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 06711d06-2cb8-4aa0-a787-db6d71e98029_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.260s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:35:13 np0005588920 nova_compute[226886]: 2026-01-20 14:35:13.844 226890 DEBUG nova.storage.rbd_utils [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] resizing rbd image 06711d06-2cb8-4aa0-a787-db6d71e98029_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:35:14 np0005588920 nova_compute[226886]: 2026-01-20 14:35:14.156 226890 DEBUG nova.network.neutron [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Successfully created port: 78b6999c-7e47-4732-a9f5-e2b099b471f9 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:35:14 np0005588920 nova_compute[226886]: 2026-01-20 14:35:14.312 226890 DEBUG nova.objects.instance [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Lazy-loading 'migration_context' on Instance uuid 06711d06-2cb8-4aa0-a787-db6d71e98029 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:35:14 np0005588920 nova_compute[226886]: 2026-01-20 14:35:14.327 226890 DEBUG nova.virt.libvirt.driver [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:35:14 np0005588920 nova_compute[226886]: 2026-01-20 14:35:14.327 226890 DEBUG nova.virt.libvirt.driver [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Ensure instance console log exists: /var/lib/nova/instances/06711d06-2cb8-4aa0-a787-db6d71e98029/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:35:14 np0005588920 nova_compute[226886]: 2026-01-20 14:35:14.328 226890 DEBUG oslo_concurrency.lockutils [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:35:14 np0005588920 nova_compute[226886]: 2026-01-20 14:35:14.328 226890 DEBUG oslo_concurrency.lockutils [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:35:14 np0005588920 nova_compute[226886]: 2026-01-20 14:35:14.328 226890 DEBUG oslo_concurrency.lockutils [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:35:14 np0005588920 nova_compute[226886]: 2026-01-20 14:35:14.569 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e181 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:35:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:14.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:15 np0005588920 podman[244221]: 2026-01-20 14:35:15.027916149 +0000 UTC m=+0.110421436 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 20 09:35:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:15.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:15 np0005588920 nova_compute[226886]: 2026-01-20 14:35:15.074 226890 DEBUG nova.network.neutron [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Successfully updated port: 78b6999c-7e47-4732-a9f5-e2b099b471f9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:35:15 np0005588920 nova_compute[226886]: 2026-01-20 14:35:15.088 226890 DEBUG oslo_concurrency.lockutils [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Acquiring lock "refresh_cache-06711d06-2cb8-4aa0-a787-db6d71e98029" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:35:15 np0005588920 nova_compute[226886]: 2026-01-20 14:35:15.088 226890 DEBUG oslo_concurrency.lockutils [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Acquired lock "refresh_cache-06711d06-2cb8-4aa0-a787-db6d71e98029" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:35:15 np0005588920 nova_compute[226886]: 2026-01-20 14:35:15.088 226890 DEBUG nova.network.neutron [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:35:15 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e182 e182: 3 total, 3 up, 3 in
Jan 20 09:35:15 np0005588920 nova_compute[226886]: 2026-01-20 14:35:15.289 226890 DEBUG nova.compute.manager [req-3cf4caad-dd2d-4732-a5b4-1b8b5cbb552a req-cce6e826-749b-4dd5-b158-088fe3e5ef8f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Received event network-changed-78b6999c-7e47-4732-a9f5-e2b099b471f9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:35:15 np0005588920 nova_compute[226886]: 2026-01-20 14:35:15.290 226890 DEBUG nova.compute.manager [req-3cf4caad-dd2d-4732-a5b4-1b8b5cbb552a req-cce6e826-749b-4dd5-b158-088fe3e5ef8f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Refreshing instance network info cache due to event network-changed-78b6999c-7e47-4732-a9f5-e2b099b471f9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:35:15 np0005588920 nova_compute[226886]: 2026-01-20 14:35:15.290 226890 DEBUG oslo_concurrency.lockutils [req-3cf4caad-dd2d-4732-a5b4-1b8b5cbb552a req-cce6e826-749b-4dd5-b158-088fe3e5ef8f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-06711d06-2cb8-4aa0-a787-db6d71e98029" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:35:15 np0005588920 nova_compute[226886]: 2026-01-20 14:35:15.292 226890 DEBUG nova.network.neutron [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:35:15 np0005588920 nova_compute[226886]: 2026-01-20 14:35:15.445 226890 DEBUG nova.virt.libvirt.driver [None req-ca3b75eb-76a2-480b-8a80-c774c0654d58 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 20 09:35:16 np0005588920 nova_compute[226886]: 2026-01-20 14:35:16.376 226890 DEBUG nova.network.neutron [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Updating instance_info_cache with network_info: [{"id": "78b6999c-7e47-4732-a9f5-e2b099b471f9", "address": "fa:16:3e:96:df:e0", "network": {"id": "f9e083ff-afa9-4e67-8c9b-f18349d0d534", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1787894460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a73bd836c7f64377a24971d95d583639", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap78b6999c-7e", "ovs_interfaceid": "78b6999c-7e47-4732-a9f5-e2b099b471f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:35:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:16.437 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:35:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:16.437 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:35:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:16.438 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:35:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:16.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:16 np0005588920 nova_compute[226886]: 2026-01-20 14:35:16.849 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:17.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:17 np0005588920 ovn_controller[133971]: 2026-01-20T14:35:17Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e9:83:16 10.100.0.6
Jan 20 09:35:17 np0005588920 ovn_controller[133971]: 2026-01-20T14:35:17Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e9:83:16 10.100.0.6
Jan 20 09:35:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e183 e183: 3 total, 3 up, 3 in
Jan 20 09:35:17 np0005588920 nova_compute[226886]: 2026-01-20 14:35:17.676 226890 DEBUG oslo_concurrency.lockutils [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Releasing lock "refresh_cache-06711d06-2cb8-4aa0-a787-db6d71e98029" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:35:17 np0005588920 nova_compute[226886]: 2026-01-20 14:35:17.676 226890 DEBUG nova.compute.manager [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Instance network_info: |[{"id": "78b6999c-7e47-4732-a9f5-e2b099b471f9", "address": "fa:16:3e:96:df:e0", "network": {"id": "f9e083ff-afa9-4e67-8c9b-f18349d0d534", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1787894460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a73bd836c7f64377a24971d95d583639", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap78b6999c-7e", "ovs_interfaceid": "78b6999c-7e47-4732-a9f5-e2b099b471f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:35:17 np0005588920 nova_compute[226886]: 2026-01-20 14:35:17.677 226890 DEBUG oslo_concurrency.lockutils [req-3cf4caad-dd2d-4732-a5b4-1b8b5cbb552a req-cce6e826-749b-4dd5-b158-088fe3e5ef8f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-06711d06-2cb8-4aa0-a787-db6d71e98029" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:35:17 np0005588920 nova_compute[226886]: 2026-01-20 14:35:17.677 226890 DEBUG nova.network.neutron [req-3cf4caad-dd2d-4732-a5b4-1b8b5cbb552a req-cce6e826-749b-4dd5-b158-088fe3e5ef8f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Refreshing network info cache for port 78b6999c-7e47-4732-a9f5-e2b099b471f9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:35:17 np0005588920 nova_compute[226886]: 2026-01-20 14:35:17.680 226890 DEBUG nova.virt.libvirt.driver [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Start _get_guest_xml network_info=[{"id": "78b6999c-7e47-4732-a9f5-e2b099b471f9", "address": "fa:16:3e:96:df:e0", "network": {"id": "f9e083ff-afa9-4e67-8c9b-f18349d0d534", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1787894460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a73bd836c7f64377a24971d95d583639", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap78b6999c-7e", "ovs_interfaceid": "78b6999c-7e47-4732-a9f5-e2b099b471f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:35:17 np0005588920 nova_compute[226886]: 2026-01-20 14:35:17.684 226890 WARNING nova.virt.libvirt.driver [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:35:17 np0005588920 nova_compute[226886]: 2026-01-20 14:35:17.688 226890 DEBUG nova.virt.libvirt.host [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:35:17 np0005588920 nova_compute[226886]: 2026-01-20 14:35:17.689 226890 DEBUG nova.virt.libvirt.host [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:35:17 np0005588920 nova_compute[226886]: 2026-01-20 14:35:17.692 226890 DEBUG nova.virt.libvirt.host [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:35:17 np0005588920 nova_compute[226886]: 2026-01-20 14:35:17.692 226890 DEBUG nova.virt.libvirt.host [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:35:17 np0005588920 nova_compute[226886]: 2026-01-20 14:35:17.693 226890 DEBUG nova.virt.libvirt.driver [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:35:17 np0005588920 nova_compute[226886]: 2026-01-20 14:35:17.693 226890 DEBUG nova.virt.hardware [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:35:17 np0005588920 nova_compute[226886]: 2026-01-20 14:35:17.694 226890 DEBUG nova.virt.hardware [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:35:17 np0005588920 nova_compute[226886]: 2026-01-20 14:35:17.694 226890 DEBUG nova.virt.hardware [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:35:17 np0005588920 nova_compute[226886]: 2026-01-20 14:35:17.694 226890 DEBUG nova.virt.hardware [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:35:17 np0005588920 nova_compute[226886]: 2026-01-20 14:35:17.694 226890 DEBUG nova.virt.hardware [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:35:17 np0005588920 nova_compute[226886]: 2026-01-20 14:35:17.695 226890 DEBUG nova.virt.hardware [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:35:17 np0005588920 nova_compute[226886]: 2026-01-20 14:35:17.695 226890 DEBUG nova.virt.hardware [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:35:17 np0005588920 nova_compute[226886]: 2026-01-20 14:35:17.695 226890 DEBUG nova.virt.hardware [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:35:17 np0005588920 nova_compute[226886]: 2026-01-20 14:35:17.695 226890 DEBUG nova.virt.hardware [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:35:17 np0005588920 nova_compute[226886]: 2026-01-20 14:35:17.696 226890 DEBUG nova.virt.hardware [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:35:17 np0005588920 nova_compute[226886]: 2026-01-20 14:35:17.696 226890 DEBUG nova.virt.hardware [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:35:17 np0005588920 nova_compute[226886]: 2026-01-20 14:35:17.698 226890 DEBUG oslo_concurrency.processutils [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:35:18 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:35:18 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3187377245' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:35:18 np0005588920 nova_compute[226886]: 2026-01-20 14:35:18.174 226890 DEBUG oslo_concurrency.processutils [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:35:18 np0005588920 nova_compute[226886]: 2026-01-20 14:35:18.202 226890 DEBUG nova.storage.rbd_utils [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] rbd image 06711d06-2cb8-4aa0-a787-db6d71e98029_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:35:18 np0005588920 nova_compute[226886]: 2026-01-20 14:35:18.206 226890 DEBUG oslo_concurrency.processutils [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:35:18 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e184 e184: 3 total, 3 up, 3 in
Jan 20 09:35:18 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:35:18 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1954847869' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:35:18 np0005588920 nova_compute[226886]: 2026-01-20 14:35:18.640 226890 DEBUG oslo_concurrency.processutils [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:35:18 np0005588920 nova_compute[226886]: 2026-01-20 14:35:18.642 226890 DEBUG nova.virt.libvirt.vif [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:35:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesNegativeTestJSON-server-1396811540',display_name='tempest-ImagesNegativeTestJSON-server-1396811540',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesnegativetestjson-server-1396811540',id=48,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a73bd836c7f64377a24971d95d583639',ramdisk_id='',reservation_id='r-83m77gox',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesNegativeTestJSON-1238318859',owner_user_name='tempest-ImagesNegativeT
estJSON-1238318859-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:35:13Z,user_data=None,user_id='6bf23282febb455daf4d4f24666cd6c3',uuid=06711d06-2cb8-4aa0-a787-db6d71e98029,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "78b6999c-7e47-4732-a9f5-e2b099b471f9", "address": "fa:16:3e:96:df:e0", "network": {"id": "f9e083ff-afa9-4e67-8c9b-f18349d0d534", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1787894460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a73bd836c7f64377a24971d95d583639", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap78b6999c-7e", "ovs_interfaceid": "78b6999c-7e47-4732-a9f5-e2b099b471f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:35:18 np0005588920 nova_compute[226886]: 2026-01-20 14:35:18.642 226890 DEBUG nova.network.os_vif_util [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Converting VIF {"id": "78b6999c-7e47-4732-a9f5-e2b099b471f9", "address": "fa:16:3e:96:df:e0", "network": {"id": "f9e083ff-afa9-4e67-8c9b-f18349d0d534", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1787894460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a73bd836c7f64377a24971d95d583639", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap78b6999c-7e", "ovs_interfaceid": "78b6999c-7e47-4732-a9f5-e2b099b471f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:35:18 np0005588920 nova_compute[226886]: 2026-01-20 14:35:18.643 226890 DEBUG nova.network.os_vif_util [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:96:df:e0,bridge_name='br-int',has_traffic_filtering=True,id=78b6999c-7e47-4732-a9f5-e2b099b471f9,network=Network(f9e083ff-afa9-4e67-8c9b-f18349d0d534),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap78b6999c-7e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:35:18 np0005588920 nova_compute[226886]: 2026-01-20 14:35:18.644 226890 DEBUG nova.objects.instance [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Lazy-loading 'pci_devices' on Instance uuid 06711d06-2cb8-4aa0-a787-db6d71e98029 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:35:18 np0005588920 nova_compute[226886]: 2026-01-20 14:35:18.660 226890 DEBUG nova.virt.libvirt.driver [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:35:18 np0005588920 nova_compute[226886]:  <uuid>06711d06-2cb8-4aa0-a787-db6d71e98029</uuid>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:  <name>instance-00000030</name>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:35:18 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:      <nova:name>tempest-ImagesNegativeTestJSON-server-1396811540</nova:name>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:35:17</nova:creationTime>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:35:18 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:        <nova:user uuid="6bf23282febb455daf4d4f24666cd6c3">tempest-ImagesNegativeTestJSON-1238318859-project-member</nova:user>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:        <nova:project uuid="a73bd836c7f64377a24971d95d583639">tempest-ImagesNegativeTestJSON-1238318859</nova:project>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:        <nova:port uuid="78b6999c-7e47-4732-a9f5-e2b099b471f9">
Jan 20 09:35:18 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:      <entry name="serial">06711d06-2cb8-4aa0-a787-db6d71e98029</entry>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:      <entry name="uuid">06711d06-2cb8-4aa0-a787-db6d71e98029</entry>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:35:18 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/06711d06-2cb8-4aa0-a787-db6d71e98029_disk">
Jan 20 09:35:18 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:35:18 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:35:18 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/06711d06-2cb8-4aa0-a787-db6d71e98029_disk.config">
Jan 20 09:35:18 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:35:18 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:35:18 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:96:df:e0"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:      <target dev="tap78b6999c-7e"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:35:18 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/06711d06-2cb8-4aa0-a787-db6d71e98029/console.log" append="off"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:35:18 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:35:18 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:35:18 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:35:18 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:35:18 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:35:18 np0005588920 nova_compute[226886]: 2026-01-20 14:35:18.661 226890 DEBUG nova.compute.manager [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Preparing to wait for external event network-vif-plugged-78b6999c-7e47-4732-a9f5-e2b099b471f9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:35:18 np0005588920 nova_compute[226886]: 2026-01-20 14:35:18.662 226890 DEBUG oslo_concurrency.lockutils [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Acquiring lock "06711d06-2cb8-4aa0-a787-db6d71e98029-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:35:18 np0005588920 nova_compute[226886]: 2026-01-20 14:35:18.662 226890 DEBUG oslo_concurrency.lockutils [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Lock "06711d06-2cb8-4aa0-a787-db6d71e98029-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:35:18 np0005588920 nova_compute[226886]: 2026-01-20 14:35:18.662 226890 DEBUG oslo_concurrency.lockutils [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Lock "06711d06-2cb8-4aa0-a787-db6d71e98029-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:35:18 np0005588920 nova_compute[226886]: 2026-01-20 14:35:18.663 226890 DEBUG nova.virt.libvirt.vif [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:35:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesNegativeTestJSON-server-1396811540',display_name='tempest-ImagesNegativeTestJSON-server-1396811540',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesnegativetestjson-server-1396811540',id=48,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a73bd836c7f64377a24971d95d583639',ramdisk_id='',reservation_id='r-83m77gox',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesNegativeTestJSON-1238318859',owner_user_name='tempest-ImagesNegativeTestJSON-1238318859-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:35:13Z,user_data=None,user_id='6bf23282febb455daf4d4f24666cd6c3',uuid=06711d06-2cb8-4aa0-a787-db6d71e98029,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "78b6999c-7e47-4732-a9f5-e2b099b471f9", "address": "fa:16:3e:96:df:e0", "network": {"id": "f9e083ff-afa9-4e67-8c9b-f18349d0d534", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1787894460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a73bd836c7f64377a24971d95d583639", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap78b6999c-7e", "ovs_interfaceid": "78b6999c-7e47-4732-a9f5-e2b099b471f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:35:18 np0005588920 nova_compute[226886]: 2026-01-20 14:35:18.663 226890 DEBUG nova.network.os_vif_util [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Converting VIF {"id": "78b6999c-7e47-4732-a9f5-e2b099b471f9", "address": "fa:16:3e:96:df:e0", "network": {"id": "f9e083ff-afa9-4e67-8c9b-f18349d0d534", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1787894460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a73bd836c7f64377a24971d95d583639", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap78b6999c-7e", "ovs_interfaceid": "78b6999c-7e47-4732-a9f5-e2b099b471f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:35:18 np0005588920 nova_compute[226886]: 2026-01-20 14:35:18.664 226890 DEBUG nova.network.os_vif_util [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:96:df:e0,bridge_name='br-int',has_traffic_filtering=True,id=78b6999c-7e47-4732-a9f5-e2b099b471f9,network=Network(f9e083ff-afa9-4e67-8c9b-f18349d0d534),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap78b6999c-7e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:35:18 np0005588920 nova_compute[226886]: 2026-01-20 14:35:18.664 226890 DEBUG os_vif [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:96:df:e0,bridge_name='br-int',has_traffic_filtering=True,id=78b6999c-7e47-4732-a9f5-e2b099b471f9,network=Network(f9e083ff-afa9-4e67-8c9b-f18349d0d534),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap78b6999c-7e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:35:18 np0005588920 nova_compute[226886]: 2026-01-20 14:35:18.665 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:18 np0005588920 nova_compute[226886]: 2026-01-20 14:35:18.666 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:35:18 np0005588920 nova_compute[226886]: 2026-01-20 14:35:18.666 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:35:18 np0005588920 nova_compute[226886]: 2026-01-20 14:35:18.668 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:18 np0005588920 nova_compute[226886]: 2026-01-20 14:35:18.668 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap78b6999c-7e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:35:18 np0005588920 nova_compute[226886]: 2026-01-20 14:35:18.669 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap78b6999c-7e, col_values=(('external_ids', {'iface-id': '78b6999c-7e47-4732-a9f5-e2b099b471f9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:96:df:e0', 'vm-uuid': '06711d06-2cb8-4aa0-a787-db6d71e98029'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:35:18 np0005588920 nova_compute[226886]: 2026-01-20 14:35:18.670 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:18 np0005588920 NetworkManager[49076]: <info>  [1768919718.6714] manager: (tap78b6999c-7e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/74)
Jan 20 09:35:18 np0005588920 nova_compute[226886]: 2026-01-20 14:35:18.673 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:35:18 np0005588920 nova_compute[226886]: 2026-01-20 14:35:18.676 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:18 np0005588920 nova_compute[226886]: 2026-01-20 14:35:18.677 226890 INFO os_vif [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:96:df:e0,bridge_name='br-int',has_traffic_filtering=True,id=78b6999c-7e47-4732-a9f5-e2b099b471f9,network=Network(f9e083ff-afa9-4e67-8c9b-f18349d0d534),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap78b6999c-7e')#033[00m
Jan 20 09:35:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:18.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:18 np0005588920 nova_compute[226886]: 2026-01-20 14:35:18.822 226890 DEBUG nova.virt.libvirt.driver [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:35:18 np0005588920 nova_compute[226886]: 2026-01-20 14:35:18.822 226890 DEBUG nova.virt.libvirt.driver [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:35:18 np0005588920 nova_compute[226886]: 2026-01-20 14:35:18.822 226890 DEBUG nova.virt.libvirt.driver [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] No VIF found with MAC fa:16:3e:96:df:e0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:35:18 np0005588920 nova_compute[226886]: 2026-01-20 14:35:18.822 226890 INFO nova.virt.libvirt.driver [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Using config drive#033[00m
Jan 20 09:35:18 np0005588920 nova_compute[226886]: 2026-01-20 14:35:18.852 226890 DEBUG nova.storage.rbd_utils [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] rbd image 06711d06-2cb8-4aa0-a787-db6d71e98029_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:35:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:19.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:19 np0005588920 nova_compute[226886]: 2026-01-20 14:35:19.424 226890 DEBUG nova.network.neutron [req-3cf4caad-dd2d-4732-a5b4-1b8b5cbb552a req-cce6e826-749b-4dd5-b158-088fe3e5ef8f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Updated VIF entry in instance network info cache for port 78b6999c-7e47-4732-a9f5-e2b099b471f9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:35:19 np0005588920 nova_compute[226886]: 2026-01-20 14:35:19.425 226890 DEBUG nova.network.neutron [req-3cf4caad-dd2d-4732-a5b4-1b8b5cbb552a req-cce6e826-749b-4dd5-b158-088fe3e5ef8f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Updating instance_info_cache with network_info: [{"id": "78b6999c-7e47-4732-a9f5-e2b099b471f9", "address": "fa:16:3e:96:df:e0", "network": {"id": "f9e083ff-afa9-4e67-8c9b-f18349d0d534", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1787894460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a73bd836c7f64377a24971d95d583639", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap78b6999c-7e", "ovs_interfaceid": "78b6999c-7e47-4732-a9f5-e2b099b471f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:35:19 np0005588920 nova_compute[226886]: 2026-01-20 14:35:19.454 226890 DEBUG oslo_concurrency.lockutils [req-3cf4caad-dd2d-4732-a5b4-1b8b5cbb552a req-cce6e826-749b-4dd5-b158-088fe3e5ef8f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-06711d06-2cb8-4aa0-a787-db6d71e98029" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:35:19 np0005588920 nova_compute[226886]: 2026-01-20 14:35:19.479 226890 INFO nova.virt.libvirt.driver [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Creating config drive at /var/lib/nova/instances/06711d06-2cb8-4aa0-a787-db6d71e98029/disk.config#033[00m
Jan 20 09:35:19 np0005588920 nova_compute[226886]: 2026-01-20 14:35:19.484 226890 DEBUG oslo_concurrency.processutils [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/06711d06-2cb8-4aa0-a787-db6d71e98029/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptwlshfq3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:35:19 np0005588920 nova_compute[226886]: 2026-01-20 14:35:19.572 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:19 np0005588920 nova_compute[226886]: 2026-01-20 14:35:19.616 226890 DEBUG oslo_concurrency.processutils [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/06711d06-2cb8-4aa0-a787-db6d71e98029/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptwlshfq3" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:35:19 np0005588920 nova_compute[226886]: 2026-01-20 14:35:19.647 226890 DEBUG nova.storage.rbd_utils [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] rbd image 06711d06-2cb8-4aa0-a787-db6d71e98029_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:35:19 np0005588920 nova_compute[226886]: 2026-01-20 14:35:19.652 226890 DEBUG oslo_concurrency.processutils [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/06711d06-2cb8-4aa0-a787-db6d71e98029/disk.config 06711d06-2cb8-4aa0-a787-db6d71e98029_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:35:19 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e184 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:35:19 np0005588920 nova_compute[226886]: 2026-01-20 14:35:19.884 226890 DEBUG oslo_concurrency.processutils [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/06711d06-2cb8-4aa0-a787-db6d71e98029/disk.config 06711d06-2cb8-4aa0-a787-db6d71e98029_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.233s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:35:19 np0005588920 nova_compute[226886]: 2026-01-20 14:35:19.885 226890 INFO nova.virt.libvirt.driver [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Deleting local config drive /var/lib/nova/instances/06711d06-2cb8-4aa0-a787-db6d71e98029/disk.config because it was imported into RBD.#033[00m
Jan 20 09:35:19 np0005588920 kernel: tap78b6999c-7e: entered promiscuous mode
Jan 20 09:35:19 np0005588920 NetworkManager[49076]: <info>  [1768919719.9285] manager: (tap78b6999c-7e): new Tun device (/org/freedesktop/NetworkManager/Devices/75)
Jan 20 09:35:19 np0005588920 ovn_controller[133971]: 2026-01-20T14:35:19Z|00121|binding|INFO|Claiming lport 78b6999c-7e47-4732-a9f5-e2b099b471f9 for this chassis.
Jan 20 09:35:19 np0005588920 ovn_controller[133971]: 2026-01-20T14:35:19Z|00122|binding|INFO|78b6999c-7e47-4732-a9f5-e2b099b471f9: Claiming fa:16:3e:96:df:e0 10.100.0.4
Jan 20 09:35:19 np0005588920 nova_compute[226886]: 2026-01-20 14:35:19.929 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:19.940 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:96:df:e0 10.100.0.4'], port_security=['fa:16:3e:96:df:e0 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '06711d06-2cb8-4aa0-a787-db6d71e98029', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9e083ff-afa9-4e67-8c9b-f18349d0d534', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a73bd836c7f64377a24971d95d583639', 'neutron:revision_number': '2', 'neutron:security_group_ids': '84362096-42e2-4bb5-9db5-ce10e58087a3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=70f705e1-8f62-4abd-8cf0-07de3d16048d, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=78b6999c-7e47-4732-a9f5-e2b099b471f9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:35:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:19.941 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 78b6999c-7e47-4732-a9f5-e2b099b471f9 in datapath f9e083ff-afa9-4e67-8c9b-f18349d0d534 bound to our chassis#033[00m
Jan 20 09:35:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:19.942 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f9e083ff-afa9-4e67-8c9b-f18349d0d534#033[00m
Jan 20 09:35:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:19.951 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f38d6c32-6424-4f86-a65d-5aa28841467a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:19.952 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf9e083ff-a1 in ovnmeta-f9e083ff-afa9-4e67-8c9b-f18349d0d534 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:35:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:19.954 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf9e083ff-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:35:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:19.954 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e8b981db-edc7-4b3a-9fc5-8aa4aaac85f0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:19.955 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[fb9b328e-d490-4538-a117-3a32d5512c96]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:19 np0005588920 systemd-machined[196121]: New machine qemu-21-instance-00000030.
Jan 20 09:35:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:19.966 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[ba794d14-89c0-4769-b818-369c86688253]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:19.981 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c79ba8a9-b78b-48f6-82d6-22c30bf467bb]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:19 np0005588920 systemd[1]: Started Virtual Machine qemu-21-instance-00000030.
Jan 20 09:35:20 np0005588920 systemd-udevd[244390]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:35:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:20.010 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[f5e34cfd-b0e6-4851-b085-8052dfdc0303]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:20 np0005588920 NetworkManager[49076]: <info>  [1768919720.0143] device (tap78b6999c-7e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:35:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:20.015 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a7a46ed4-d751-4ac3-8a40-d7b0e7b623d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:20 np0005588920 nova_compute[226886]: 2026-01-20 14:35:20.015 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:20 np0005588920 NetworkManager[49076]: <info>  [1768919720.0179] manager: (tapf9e083ff-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/76)
Jan 20 09:35:20 np0005588920 NetworkManager[49076]: <info>  [1768919720.0184] device (tap78b6999c-7e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:35:20 np0005588920 systemd-udevd[244394]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:35:20 np0005588920 ovn_controller[133971]: 2026-01-20T14:35:20Z|00123|binding|INFO|Setting lport 78b6999c-7e47-4732-a9f5-e2b099b471f9 ovn-installed in OVS
Jan 20 09:35:20 np0005588920 ovn_controller[133971]: 2026-01-20T14:35:20Z|00124|binding|INFO|Setting lport 78b6999c-7e47-4732-a9f5-e2b099b471f9 up in Southbound
Jan 20 09:35:20 np0005588920 nova_compute[226886]: 2026-01-20 14:35:20.022 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:20.043 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[404aae93-6ee8-4e99-9af7-4a5cb962f276]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:20.046 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[5789a36d-8a58-4394-9b85-920a419827ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:20 np0005588920 NetworkManager[49076]: <info>  [1768919720.0664] device (tapf9e083ff-a0): carrier: link connected
Jan 20 09:35:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:20.071 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[ddddad08-0c8d-4579-a80f-6ee0a48070ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:20.088 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0f5f3ecf-0629-48ad-9dfe-3c5bbda06689]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9e083ff-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d8:65:7c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 42], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 476175, 'reachable_time': 24320, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244418, 'error': None, 'target': 'ovnmeta-f9e083ff-afa9-4e67-8c9b-f18349d0d534', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:20.101 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f3f81868-500d-41ae-81e0-0b434ab2cb37]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed8:657c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 476175, 'tstamp': 476175}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 244419, 'error': None, 'target': 'ovnmeta-f9e083ff-afa9-4e67-8c9b-f18349d0d534', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:20.114 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[8dabbaee-6296-4662-adae-6ba03e18ed83]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9e083ff-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d8:65:7c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 42], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 476175, 'reachable_time': 24320, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 244420, 'error': None, 'target': 'ovnmeta-f9e083ff-afa9-4e67-8c9b-f18349d0d534', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:20.144 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[72e329db-81de-47d6-9c15-3e0146b800c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:20 np0005588920 nova_compute[226886]: 2026-01-20 14:35:20.158 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:35:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:20.215 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ed44ca04-383c-4fe6-8250-7b190c60d583]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:20.217 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9e083ff-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:35:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:20.217 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:35:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:20.217 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf9e083ff-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:35:20 np0005588920 nova_compute[226886]: 2026-01-20 14:35:20.219 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:20 np0005588920 NetworkManager[49076]: <info>  [1768919720.2201] manager: (tapf9e083ff-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/77)
Jan 20 09:35:20 np0005588920 kernel: tapf9e083ff-a0: entered promiscuous mode
Jan 20 09:35:20 np0005588920 nova_compute[226886]: 2026-01-20 14:35:20.222 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:20.229 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf9e083ff-a0, col_values=(('external_ids', {'iface-id': '7bd2b71c-c83d-4ad0-a211-547e3f33c14e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:35:20 np0005588920 nova_compute[226886]: 2026-01-20 14:35:20.230 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:20 np0005588920 nova_compute[226886]: 2026-01-20 14:35:20.231 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:20 np0005588920 ovn_controller[133971]: 2026-01-20T14:35:20Z|00125|binding|INFO|Releasing lport 7bd2b71c-c83d-4ad0-a211-547e3f33c14e from this chassis (sb_readonly=0)
Jan 20 09:35:20 np0005588920 nova_compute[226886]: 2026-01-20 14:35:20.244 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:20.245 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f9e083ff-afa9-4e67-8c9b-f18349d0d534.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f9e083ff-afa9-4e67-8c9b-f18349d0d534.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:35:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:20.246 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5d1e400a-c760-4a1e-ae8b-1f82951c6679]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:20.247 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:35:20 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 09:35:20 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 09:35:20 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-f9e083ff-afa9-4e67-8c9b-f18349d0d534
Jan 20 09:35:20 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 09:35:20 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 09:35:20 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 09:35:20 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/f9e083ff-afa9-4e67-8c9b-f18349d0d534.pid.haproxy
Jan 20 09:35:20 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 09:35:20 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:35:20 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 09:35:20 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 09:35:20 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 09:35:20 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 09:35:20 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 09:35:20 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 09:35:20 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 09:35:20 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 09:35:20 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 09:35:20 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 09:35:20 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 09:35:20 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 09:35:20 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 09:35:20 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:35:20 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:35:20 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 09:35:20 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 09:35:20 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:35:20 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID f9e083ff-afa9-4e67-8c9b-f18349d0d534
Jan 20 09:35:20 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:35:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:20.248 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f9e083ff-afa9-4e67-8c9b-f18349d0d534', 'env', 'PROCESS_TAG=haproxy-f9e083ff-afa9-4e67-8c9b-f18349d0d534', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f9e083ff-afa9-4e67-8c9b-f18349d0d534.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:35:20 np0005588920 nova_compute[226886]: 2026-01-20 14:35:20.253 226890 DEBUG nova.compute.manager [req-6f61f507-b37e-4eea-94f1-a50e17310fe3 req-0ed9ba34-4fbf-4c06-9c56-bbb4c71ddc07 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Received event network-vif-plugged-78b6999c-7e47-4732-a9f5-e2b099b471f9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:35:20 np0005588920 nova_compute[226886]: 2026-01-20 14:35:20.253 226890 DEBUG oslo_concurrency.lockutils [req-6f61f507-b37e-4eea-94f1-a50e17310fe3 req-0ed9ba34-4fbf-4c06-9c56-bbb4c71ddc07 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "06711d06-2cb8-4aa0-a787-db6d71e98029-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:35:20 np0005588920 nova_compute[226886]: 2026-01-20 14:35:20.254 226890 DEBUG oslo_concurrency.lockutils [req-6f61f507-b37e-4eea-94f1-a50e17310fe3 req-0ed9ba34-4fbf-4c06-9c56-bbb4c71ddc07 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "06711d06-2cb8-4aa0-a787-db6d71e98029-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:35:20 np0005588920 nova_compute[226886]: 2026-01-20 14:35:20.254 226890 DEBUG oslo_concurrency.lockutils [req-6f61f507-b37e-4eea-94f1-a50e17310fe3 req-0ed9ba34-4fbf-4c06-9c56-bbb4c71ddc07 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "06711d06-2cb8-4aa0-a787-db6d71e98029-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:35:20 np0005588920 nova_compute[226886]: 2026-01-20 14:35:20.254 226890 DEBUG nova.compute.manager [req-6f61f507-b37e-4eea-94f1-a50e17310fe3 req-0ed9ba34-4fbf-4c06-9c56-bbb4c71ddc07 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Processing event network-vif-plugged-78b6999c-7e47-4732-a9f5-e2b099b471f9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:35:20 np0005588920 podman[244470]: 2026-01-20 14:35:20.607126321 +0000 UTC m=+0.051148305 container create d481c8ce43df3f0ff3b6142c5d4f6fabe088f262575a54b80ae57fad9b711a73 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9e083ff-afa9-4e67-8c9b-f18349d0d534, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 20 09:35:20 np0005588920 systemd[1]: Started libpod-conmon-d481c8ce43df3f0ff3b6142c5d4f6fabe088f262575a54b80ae57fad9b711a73.scope.
Jan 20 09:35:20 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:35:20 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0cc219a76b7374b0c36019dc2f14bc3241e3299f8dc6403f5b94c0307a01ca22/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:35:20 np0005588920 podman[244470]: 2026-01-20 14:35:20.584394223 +0000 UTC m=+0.028416187 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:35:20 np0005588920 podman[244470]: 2026-01-20 14:35:20.689724317 +0000 UTC m=+0.133746301 container init d481c8ce43df3f0ff3b6142c5d4f6fabe088f262575a54b80ae57fad9b711a73 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9e083ff-afa9-4e67-8c9b-f18349d0d534, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 20 09:35:20 np0005588920 podman[244470]: 2026-01-20 14:35:20.69661384 +0000 UTC m=+0.140635774 container start d481c8ce43df3f0ff3b6142c5d4f6fabe088f262575a54b80ae57fad9b711a73 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9e083ff-afa9-4e67-8c9b-f18349d0d534, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 20 09:35:20 np0005588920 neutron-haproxy-ovnmeta-f9e083ff-afa9-4e67-8c9b-f18349d0d534[244485]: [NOTICE]   (244489) : New worker (244491) forked
Jan 20 09:35:20 np0005588920 neutron-haproxy-ovnmeta-f9e083ff-afa9-4e67-8c9b-f18349d0d534[244485]: [NOTICE]   (244489) : Loading success.
Jan 20 09:35:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:35:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:20.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:35:20 np0005588920 nova_compute[226886]: 2026-01-20 14:35:20.952 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919720.9514983, 06711d06-2cb8-4aa0-a787-db6d71e98029 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:35:20 np0005588920 nova_compute[226886]: 2026-01-20 14:35:20.952 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] VM Started (Lifecycle Event)#033[00m
Jan 20 09:35:20 np0005588920 nova_compute[226886]: 2026-01-20 14:35:20.956 226890 DEBUG nova.compute.manager [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:35:20 np0005588920 nova_compute[226886]: 2026-01-20 14:35:20.960 226890 DEBUG nova.virt.libvirt.driver [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:35:20 np0005588920 nova_compute[226886]: 2026-01-20 14:35:20.964 226890 INFO nova.virt.libvirt.driver [-] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Instance spawned successfully.#033[00m
Jan 20 09:35:20 np0005588920 nova_compute[226886]: 2026-01-20 14:35:20.964 226890 DEBUG nova.virt.libvirt.driver [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:35:20 np0005588920 nova_compute[226886]: 2026-01-20 14:35:20.992 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:35:20 np0005588920 nova_compute[226886]: 2026-01-20 14:35:20.998 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:35:21 np0005588920 nova_compute[226886]: 2026-01-20 14:35:21.000 226890 DEBUG nova.virt.libvirt.driver [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:35:21 np0005588920 nova_compute[226886]: 2026-01-20 14:35:21.001 226890 DEBUG nova.virt.libvirt.driver [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:35:21 np0005588920 nova_compute[226886]: 2026-01-20 14:35:21.001 226890 DEBUG nova.virt.libvirt.driver [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:35:21 np0005588920 nova_compute[226886]: 2026-01-20 14:35:21.002 226890 DEBUG nova.virt.libvirt.driver [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:35:21 np0005588920 nova_compute[226886]: 2026-01-20 14:35:21.002 226890 DEBUG nova.virt.libvirt.driver [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:35:21 np0005588920 nova_compute[226886]: 2026-01-20 14:35:21.002 226890 DEBUG nova.virt.libvirt.driver [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:35:21 np0005588920 nova_compute[226886]: 2026-01-20 14:35:21.014 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:35:21 np0005588920 nova_compute[226886]: 2026-01-20 14:35:21.014 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919720.9516015, 06711d06-2cb8-4aa0-a787-db6d71e98029 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:35:21 np0005588920 nova_compute[226886]: 2026-01-20 14:35:21.015 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:35:21 np0005588920 nova_compute[226886]: 2026-01-20 14:35:21.039 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:35:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:21.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:21 np0005588920 nova_compute[226886]: 2026-01-20 14:35:21.042 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919720.9595497, 06711d06-2cb8-4aa0-a787-db6d71e98029 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:35:21 np0005588920 nova_compute[226886]: 2026-01-20 14:35:21.042 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:35:21 np0005588920 nova_compute[226886]: 2026-01-20 14:35:21.061 226890 INFO nova.compute.manager [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Took 7.79 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:35:21 np0005588920 nova_compute[226886]: 2026-01-20 14:35:21.061 226890 DEBUG nova.compute.manager [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:35:21 np0005588920 nova_compute[226886]: 2026-01-20 14:35:21.067 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:35:21 np0005588920 nova_compute[226886]: 2026-01-20 14:35:21.070 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:35:21 np0005588920 nova_compute[226886]: 2026-01-20 14:35:21.102 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:35:21 np0005588920 nova_compute[226886]: 2026-01-20 14:35:21.124 226890 INFO nova.compute.manager [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Took 8.77 seconds to build instance.#033[00m
Jan 20 09:35:21 np0005588920 nova_compute[226886]: 2026-01-20 14:35:21.144 226890 DEBUG oslo_concurrency.lockutils [None req-6b1b0d4b-d1dc-4769-b3ed-349d037f5c16 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Lock "06711d06-2cb8-4aa0-a787-db6d71e98029" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.862s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:35:21 np0005588920 nova_compute[226886]: 2026-01-20 14:35:21.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:35:21 np0005588920 nova_compute[226886]: 2026-01-20 14:35:21.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:35:21 np0005588920 nova_compute[226886]: 2026-01-20 14:35:21.726 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 09:35:21 np0005588920 nova_compute[226886]: 2026-01-20 14:35:21.960 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "refresh_cache-c9a86cb2-b092-4887-b47d-1a05fb756a83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:35:21 np0005588920 nova_compute[226886]: 2026-01-20 14:35:21.960 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquired lock "refresh_cache-c9a86cb2-b092-4887-b47d-1a05fb756a83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:35:21 np0005588920 nova_compute[226886]: 2026-01-20 14:35:21.960 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 09:35:21 np0005588920 nova_compute[226886]: 2026-01-20 14:35:21.961 226890 DEBUG nova.objects.instance [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c9a86cb2-b092-4887-b47d-1a05fb756a83 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:35:22 np0005588920 nova_compute[226886]: 2026-01-20 14:35:22.376 226890 DEBUG nova.compute.manager [req-8f65b234-eedf-43ca-a123-185e4f7f029b req-fc284867-30be-4bdc-8dc0-208a758437b0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Received event network-vif-plugged-78b6999c-7e47-4732-a9f5-e2b099b471f9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:35:22 np0005588920 nova_compute[226886]: 2026-01-20 14:35:22.376 226890 DEBUG oslo_concurrency.lockutils [req-8f65b234-eedf-43ca-a123-185e4f7f029b req-fc284867-30be-4bdc-8dc0-208a758437b0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "06711d06-2cb8-4aa0-a787-db6d71e98029-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:35:22 np0005588920 nova_compute[226886]: 2026-01-20 14:35:22.377 226890 DEBUG oslo_concurrency.lockutils [req-8f65b234-eedf-43ca-a123-185e4f7f029b req-fc284867-30be-4bdc-8dc0-208a758437b0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "06711d06-2cb8-4aa0-a787-db6d71e98029-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:35:22 np0005588920 nova_compute[226886]: 2026-01-20 14:35:22.378 226890 DEBUG oslo_concurrency.lockutils [req-8f65b234-eedf-43ca-a123-185e4f7f029b req-fc284867-30be-4bdc-8dc0-208a758437b0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "06711d06-2cb8-4aa0-a787-db6d71e98029-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:35:22 np0005588920 nova_compute[226886]: 2026-01-20 14:35:22.378 226890 DEBUG nova.compute.manager [req-8f65b234-eedf-43ca-a123-185e4f7f029b req-fc284867-30be-4bdc-8dc0-208a758437b0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] No waiting events found dispatching network-vif-plugged-78b6999c-7e47-4732-a9f5-e2b099b471f9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:35:22 np0005588920 nova_compute[226886]: 2026-01-20 14:35:22.378 226890 WARNING nova.compute.manager [req-8f65b234-eedf-43ca-a123-185e4f7f029b req-fc284867-30be-4bdc-8dc0-208a758437b0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Received unexpected event network-vif-plugged-78b6999c-7e47-4732-a9f5-e2b099b471f9 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:35:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e185 e185: 3 total, 3 up, 3 in
Jan 20 09:35:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:22.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:35:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:23.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:35:23 np0005588920 nova_compute[226886]: 2026-01-20 14:35:23.144 226890 DEBUG oslo_concurrency.lockutils [None req-efc9f19c-4125-4d2b-b6ec-5e4f904a3295 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Acquiring lock "06711d06-2cb8-4aa0-a787-db6d71e98029" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:35:23 np0005588920 nova_compute[226886]: 2026-01-20 14:35:23.145 226890 DEBUG oslo_concurrency.lockutils [None req-efc9f19c-4125-4d2b-b6ec-5e4f904a3295 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Lock "06711d06-2cb8-4aa0-a787-db6d71e98029" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:35:23 np0005588920 nova_compute[226886]: 2026-01-20 14:35:23.145 226890 DEBUG oslo_concurrency.lockutils [None req-efc9f19c-4125-4d2b-b6ec-5e4f904a3295 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Acquiring lock "06711d06-2cb8-4aa0-a787-db6d71e98029-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:35:23 np0005588920 nova_compute[226886]: 2026-01-20 14:35:23.146 226890 DEBUG oslo_concurrency.lockutils [None req-efc9f19c-4125-4d2b-b6ec-5e4f904a3295 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Lock "06711d06-2cb8-4aa0-a787-db6d71e98029-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:35:23 np0005588920 nova_compute[226886]: 2026-01-20 14:35:23.146 226890 DEBUG oslo_concurrency.lockutils [None req-efc9f19c-4125-4d2b-b6ec-5e4f904a3295 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Lock "06711d06-2cb8-4aa0-a787-db6d71e98029-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:35:23 np0005588920 nova_compute[226886]: 2026-01-20 14:35:23.148 226890 INFO nova.compute.manager [None req-efc9f19c-4125-4d2b-b6ec-5e4f904a3295 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Terminating instance#033[00m
Jan 20 09:35:23 np0005588920 nova_compute[226886]: 2026-01-20 14:35:23.149 226890 DEBUG nova.compute.manager [None req-efc9f19c-4125-4d2b-b6ec-5e4f904a3295 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:35:23 np0005588920 kernel: tap78b6999c-7e (unregistering): left promiscuous mode
Jan 20 09:35:23 np0005588920 NetworkManager[49076]: <info>  [1768919723.1998] device (tap78b6999c-7e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:35:23 np0005588920 nova_compute[226886]: 2026-01-20 14:35:23.210 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:23 np0005588920 ovn_controller[133971]: 2026-01-20T14:35:23Z|00126|binding|INFO|Releasing lport 78b6999c-7e47-4732-a9f5-e2b099b471f9 from this chassis (sb_readonly=0)
Jan 20 09:35:23 np0005588920 ovn_controller[133971]: 2026-01-20T14:35:23Z|00127|binding|INFO|Setting lport 78b6999c-7e47-4732-a9f5-e2b099b471f9 down in Southbound
Jan 20 09:35:23 np0005588920 ovn_controller[133971]: 2026-01-20T14:35:23Z|00128|binding|INFO|Removing iface tap78b6999c-7e ovn-installed in OVS
Jan 20 09:35:23 np0005588920 nova_compute[226886]: 2026-01-20 14:35:23.213 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:23.219 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:96:df:e0 10.100.0.4'], port_security=['fa:16:3e:96:df:e0 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '06711d06-2cb8-4aa0-a787-db6d71e98029', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9e083ff-afa9-4e67-8c9b-f18349d0d534', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a73bd836c7f64377a24971d95d583639', 'neutron:revision_number': '4', 'neutron:security_group_ids': '84362096-42e2-4bb5-9db5-ce10e58087a3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=70f705e1-8f62-4abd-8cf0-07de3d16048d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=78b6999c-7e47-4732-a9f5-e2b099b471f9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:35:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:23.222 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 78b6999c-7e47-4732-a9f5-e2b099b471f9 in datapath f9e083ff-afa9-4e67-8c9b-f18349d0d534 unbound from our chassis#033[00m
Jan 20 09:35:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:23.225 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f9e083ff-afa9-4e67-8c9b-f18349d0d534, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:35:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:23.226 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[97fdecff-e5f6-4ce0-b6fa-e9c740d9d8cc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:23.227 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f9e083ff-afa9-4e67-8c9b-f18349d0d534 namespace which is not needed anymore#033[00m
Jan 20 09:35:23 np0005588920 nova_compute[226886]: 2026-01-20 14:35:23.229 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:23 np0005588920 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d00000030.scope: Deactivated successfully.
Jan 20 09:35:23 np0005588920 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d00000030.scope: Consumed 2.867s CPU time.
Jan 20 09:35:23 np0005588920 systemd-machined[196121]: Machine qemu-21-instance-00000030 terminated.
Jan 20 09:35:23 np0005588920 neutron-haproxy-ovnmeta-f9e083ff-afa9-4e67-8c9b-f18349d0d534[244485]: [NOTICE]   (244489) : haproxy version is 2.8.14-c23fe91
Jan 20 09:35:23 np0005588920 neutron-haproxy-ovnmeta-f9e083ff-afa9-4e67-8c9b-f18349d0d534[244485]: [NOTICE]   (244489) : path to executable is /usr/sbin/haproxy
Jan 20 09:35:23 np0005588920 neutron-haproxy-ovnmeta-f9e083ff-afa9-4e67-8c9b-f18349d0d534[244485]: [WARNING]  (244489) : Exiting Master process...
Jan 20 09:35:23 np0005588920 neutron-haproxy-ovnmeta-f9e083ff-afa9-4e67-8c9b-f18349d0d534[244485]: [ALERT]    (244489) : Current worker (244491) exited with code 143 (Terminated)
Jan 20 09:35:23 np0005588920 neutron-haproxy-ovnmeta-f9e083ff-afa9-4e67-8c9b-f18349d0d534[244485]: [WARNING]  (244489) : All workers exited. Exiting... (0)
Jan 20 09:35:23 np0005588920 systemd[1]: libpod-d481c8ce43df3f0ff3b6142c5d4f6fabe088f262575a54b80ae57fad9b711a73.scope: Deactivated successfully.
Jan 20 09:35:23 np0005588920 podman[244547]: 2026-01-20 14:35:23.357124381 +0000 UTC m=+0.044244971 container died d481c8ce43df3f0ff3b6142c5d4f6fabe088f262575a54b80ae57fad9b711a73 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9e083ff-afa9-4e67-8c9b-f18349d0d534, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:35:23 np0005588920 nova_compute[226886]: 2026-01-20 14:35:23.375 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:23 np0005588920 nova_compute[226886]: 2026-01-20 14:35:23.384 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:23 np0005588920 nova_compute[226886]: 2026-01-20 14:35:23.386 226890 INFO nova.virt.libvirt.driver [-] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Instance destroyed successfully.#033[00m
Jan 20 09:35:23 np0005588920 nova_compute[226886]: 2026-01-20 14:35:23.387 226890 DEBUG nova.objects.instance [None req-efc9f19c-4125-4d2b-b6ec-5e4f904a3295 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Lazy-loading 'resources' on Instance uuid 06711d06-2cb8-4aa0-a787-db6d71e98029 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:35:23 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d481c8ce43df3f0ff3b6142c5d4f6fabe088f262575a54b80ae57fad9b711a73-userdata-shm.mount: Deactivated successfully.
Jan 20 09:35:23 np0005588920 systemd[1]: var-lib-containers-storage-overlay-0cc219a76b7374b0c36019dc2f14bc3241e3299f8dc6403f5b94c0307a01ca22-merged.mount: Deactivated successfully.
Jan 20 09:35:23 np0005588920 nova_compute[226886]: 2026-01-20 14:35:23.403 226890 DEBUG nova.virt.libvirt.vif [None req-efc9f19c-4125-4d2b-b6ec-5e4f904a3295 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:35:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesNegativeTestJSON-server-1396811540',display_name='tempest-ImagesNegativeTestJSON-server-1396811540',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesnegativetestjson-server-1396811540',id=48,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:35:21Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a73bd836c7f64377a24971d95d583639',ramdisk_id='',reservation_id='r-83m77gox',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesNegativeTestJSON-1238318859',owner_user_name='tempest-ImagesNegativeTestJSON-1238318859-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:35:21Z,user_data=None,user_id='6bf23282febb455daf4d4f24666cd6c3',uuid=06711d06-2cb8-4aa0-a787-db6d71e98029,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "78b6999c-7e47-4732-a9f5-e2b099b471f9", "address": "fa:16:3e:96:df:e0", "network": {"id": "f9e083ff-afa9-4e67-8c9b-f18349d0d534", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1787894460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a73bd836c7f64377a24971d95d583639", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap78b6999c-7e", "ovs_interfaceid": "78b6999c-7e47-4732-a9f5-e2b099b471f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:35:23 np0005588920 nova_compute[226886]: 2026-01-20 14:35:23.404 226890 DEBUG nova.network.os_vif_util [None req-efc9f19c-4125-4d2b-b6ec-5e4f904a3295 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Converting VIF {"id": "78b6999c-7e47-4732-a9f5-e2b099b471f9", "address": "fa:16:3e:96:df:e0", "network": {"id": "f9e083ff-afa9-4e67-8c9b-f18349d0d534", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1787894460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a73bd836c7f64377a24971d95d583639", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap78b6999c-7e", "ovs_interfaceid": "78b6999c-7e47-4732-a9f5-e2b099b471f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:35:23 np0005588920 nova_compute[226886]: 2026-01-20 14:35:23.405 226890 DEBUG nova.network.os_vif_util [None req-efc9f19c-4125-4d2b-b6ec-5e4f904a3295 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:96:df:e0,bridge_name='br-int',has_traffic_filtering=True,id=78b6999c-7e47-4732-a9f5-e2b099b471f9,network=Network(f9e083ff-afa9-4e67-8c9b-f18349d0d534),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap78b6999c-7e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:35:23 np0005588920 nova_compute[226886]: 2026-01-20 14:35:23.405 226890 DEBUG os_vif [None req-efc9f19c-4125-4d2b-b6ec-5e4f904a3295 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:96:df:e0,bridge_name='br-int',has_traffic_filtering=True,id=78b6999c-7e47-4732-a9f5-e2b099b471f9,network=Network(f9e083ff-afa9-4e67-8c9b-f18349d0d534),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap78b6999c-7e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:35:23 np0005588920 nova_compute[226886]: 2026-01-20 14:35:23.407 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:23 np0005588920 nova_compute[226886]: 2026-01-20 14:35:23.407 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap78b6999c-7e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:35:23 np0005588920 nova_compute[226886]: 2026-01-20 14:35:23.408 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:23 np0005588920 podman[244547]: 2026-01-20 14:35:23.409614183 +0000 UTC m=+0.096734733 container cleanup d481c8ce43df3f0ff3b6142c5d4f6fabe088f262575a54b80ae57fad9b711a73 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9e083ff-afa9-4e67-8c9b-f18349d0d534, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 20 09:35:23 np0005588920 nova_compute[226886]: 2026-01-20 14:35:23.409 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:35:23 np0005588920 nova_compute[226886]: 2026-01-20 14:35:23.411 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:23 np0005588920 nova_compute[226886]: 2026-01-20 14:35:23.415 226890 INFO os_vif [None req-efc9f19c-4125-4d2b-b6ec-5e4f904a3295 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:96:df:e0,bridge_name='br-int',has_traffic_filtering=True,id=78b6999c-7e47-4732-a9f5-e2b099b471f9,network=Network(f9e083ff-afa9-4e67-8c9b-f18349d0d534),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap78b6999c-7e')#033[00m
Jan 20 09:35:23 np0005588920 systemd[1]: libpod-conmon-d481c8ce43df3f0ff3b6142c5d4f6fabe088f262575a54b80ae57fad9b711a73.scope: Deactivated successfully.
Jan 20 09:35:23 np0005588920 podman[244596]: 2026-01-20 14:35:23.490516581 +0000 UTC m=+0.045597499 container remove d481c8ce43df3f0ff3b6142c5d4f6fabe088f262575a54b80ae57fad9b711a73 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9e083ff-afa9-4e67-8c9b-f18349d0d534, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:35:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:23.497 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[862c3866-23bb-42e6-91d5-465bcb89ff92]: (4, ('Tue Jan 20 02:35:23 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-f9e083ff-afa9-4e67-8c9b-f18349d0d534 (d481c8ce43df3f0ff3b6142c5d4f6fabe088f262575a54b80ae57fad9b711a73)\nd481c8ce43df3f0ff3b6142c5d4f6fabe088f262575a54b80ae57fad9b711a73\nTue Jan 20 02:35:23 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-f9e083ff-afa9-4e67-8c9b-f18349d0d534 (d481c8ce43df3f0ff3b6142c5d4f6fabe088f262575a54b80ae57fad9b711a73)\nd481c8ce43df3f0ff3b6142c5d4f6fabe088f262575a54b80ae57fad9b711a73\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:23.499 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[992ab445-c4eb-4115-844c-7612a8ef6b73]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:23.500 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9e083ff-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:35:23 np0005588920 nova_compute[226886]: 2026-01-20 14:35:23.502 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:23 np0005588920 kernel: tapf9e083ff-a0: left promiscuous mode
Jan 20 09:35:23 np0005588920 nova_compute[226886]: 2026-01-20 14:35:23.517 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:23.521 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[17a11908-f27e-42ea-9f7a-7efd03ae1e2a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:23.535 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[9317bd1e-0958-4f77-b65f-0bce9988fdeb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:23.537 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d06ff716-7747-4182-9726-461b1643b382]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:23 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e186 e186: 3 total, 3 up, 3 in
Jan 20 09:35:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:23.556 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[867476ce-5d8f-4bc1-bde6-16a445e6b3c0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 476169, 'reachable_time': 44266, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244622, 'error': None, 'target': 'ovnmeta-f9e083ff-afa9-4e67-8c9b-f18349d0d534', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:23 np0005588920 systemd[1]: run-netns-ovnmeta\x2df9e083ff\x2dafa9\x2d4e67\x2d8c9b\x2df18349d0d534.mount: Deactivated successfully.
Jan 20 09:35:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:23.563 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f9e083ff-afa9-4e67-8c9b-f18349d0d534 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:35:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:23.563 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[197d937f-c24f-4f86-b491-2c05ec018e00]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:23 np0005588920 nova_compute[226886]: 2026-01-20 14:35:23.637 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Updating instance_info_cache with network_info: [{"id": "7a9a2efa-73d4-41be-92ee-61654388a2b1", "address": "fa:16:3e:67:fe:a5", "network": {"id": "33c9a20a-d976-42a8-b8bf-f83ddfc97c9a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-202342440-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a841e7a1434c488390475174e10bc161", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a9a2efa-73", "ovs_interfaceid": "7a9a2efa-73d4-41be-92ee-61654388a2b1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:35:23 np0005588920 nova_compute[226886]: 2026-01-20 14:35:23.660 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Releasing lock "refresh_cache-c9a86cb2-b092-4887-b47d-1a05fb756a83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:35:23 np0005588920 nova_compute[226886]: 2026-01-20 14:35:23.660 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 09:35:23 np0005588920 nova_compute[226886]: 2026-01-20 14:35:23.660 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:35:23 np0005588920 nova_compute[226886]: 2026-01-20 14:35:23.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:35:23 np0005588920 nova_compute[226886]: 2026-01-20 14:35:23.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:35:23 np0005588920 nova_compute[226886]: 2026-01-20 14:35:23.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:35:23 np0005588920 nova_compute[226886]: 2026-01-20 14:35:23.964 226890 INFO nova.virt.libvirt.driver [None req-efc9f19c-4125-4d2b-b6ec-5e4f904a3295 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Deleting instance files /var/lib/nova/instances/06711d06-2cb8-4aa0-a787-db6d71e98029_del#033[00m
Jan 20 09:35:23 np0005588920 nova_compute[226886]: 2026-01-20 14:35:23.965 226890 INFO nova.virt.libvirt.driver [None req-efc9f19c-4125-4d2b-b6ec-5e4f904a3295 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Deletion of /var/lib/nova/instances/06711d06-2cb8-4aa0-a787-db6d71e98029_del complete#033[00m
Jan 20 09:35:24 np0005588920 nova_compute[226886]: 2026-01-20 14:35:24.011 226890 INFO nova.compute.manager [None req-efc9f19c-4125-4d2b-b6ec-5e4f904a3295 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Took 0.86 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:35:24 np0005588920 nova_compute[226886]: 2026-01-20 14:35:24.012 226890 DEBUG oslo.service.loopingcall [None req-efc9f19c-4125-4d2b-b6ec-5e4f904a3295 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:35:24 np0005588920 nova_compute[226886]: 2026-01-20 14:35:24.012 226890 DEBUG nova.compute.manager [-] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:35:24 np0005588920 nova_compute[226886]: 2026-01-20 14:35:24.012 226890 DEBUG nova.network.neutron [-] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:35:24 np0005588920 nova_compute[226886]: 2026-01-20 14:35:24.463 226890 DEBUG nova.compute.manager [req-9cf093fa-6e3d-4c36-88f7-094489c30281 req-e06595bb-4dc9-48fc-86de-e489d0a05b50 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Received event network-vif-unplugged-78b6999c-7e47-4732-a9f5-e2b099b471f9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:35:24 np0005588920 nova_compute[226886]: 2026-01-20 14:35:24.464 226890 DEBUG oslo_concurrency.lockutils [req-9cf093fa-6e3d-4c36-88f7-094489c30281 req-e06595bb-4dc9-48fc-86de-e489d0a05b50 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "06711d06-2cb8-4aa0-a787-db6d71e98029-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:35:24 np0005588920 nova_compute[226886]: 2026-01-20 14:35:24.465 226890 DEBUG oslo_concurrency.lockutils [req-9cf093fa-6e3d-4c36-88f7-094489c30281 req-e06595bb-4dc9-48fc-86de-e489d0a05b50 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "06711d06-2cb8-4aa0-a787-db6d71e98029-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:35:24 np0005588920 nova_compute[226886]: 2026-01-20 14:35:24.465 226890 DEBUG oslo_concurrency.lockutils [req-9cf093fa-6e3d-4c36-88f7-094489c30281 req-e06595bb-4dc9-48fc-86de-e489d0a05b50 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "06711d06-2cb8-4aa0-a787-db6d71e98029-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:35:24 np0005588920 nova_compute[226886]: 2026-01-20 14:35:24.466 226890 DEBUG nova.compute.manager [req-9cf093fa-6e3d-4c36-88f7-094489c30281 req-e06595bb-4dc9-48fc-86de-e489d0a05b50 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] No waiting events found dispatching network-vif-unplugged-78b6999c-7e47-4732-a9f5-e2b099b471f9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:35:24 np0005588920 nova_compute[226886]: 2026-01-20 14:35:24.467 226890 DEBUG nova.compute.manager [req-9cf093fa-6e3d-4c36-88f7-094489c30281 req-e06595bb-4dc9-48fc-86de-e489d0a05b50 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Received event network-vif-unplugged-78b6999c-7e47-4732-a9f5-e2b099b471f9 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:35:24 np0005588920 nova_compute[226886]: 2026-01-20 14:35:24.467 226890 DEBUG nova.compute.manager [req-9cf093fa-6e3d-4c36-88f7-094489c30281 req-e06595bb-4dc9-48fc-86de-e489d0a05b50 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Received event network-vif-plugged-78b6999c-7e47-4732-a9f5-e2b099b471f9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:35:24 np0005588920 nova_compute[226886]: 2026-01-20 14:35:24.468 226890 DEBUG oslo_concurrency.lockutils [req-9cf093fa-6e3d-4c36-88f7-094489c30281 req-e06595bb-4dc9-48fc-86de-e489d0a05b50 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "06711d06-2cb8-4aa0-a787-db6d71e98029-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:35:24 np0005588920 nova_compute[226886]: 2026-01-20 14:35:24.469 226890 DEBUG oslo_concurrency.lockutils [req-9cf093fa-6e3d-4c36-88f7-094489c30281 req-e06595bb-4dc9-48fc-86de-e489d0a05b50 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "06711d06-2cb8-4aa0-a787-db6d71e98029-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:35:24 np0005588920 nova_compute[226886]: 2026-01-20 14:35:24.469 226890 DEBUG oslo_concurrency.lockutils [req-9cf093fa-6e3d-4c36-88f7-094489c30281 req-e06595bb-4dc9-48fc-86de-e489d0a05b50 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "06711d06-2cb8-4aa0-a787-db6d71e98029-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:35:24 np0005588920 nova_compute[226886]: 2026-01-20 14:35:24.469 226890 DEBUG nova.compute.manager [req-9cf093fa-6e3d-4c36-88f7-094489c30281 req-e06595bb-4dc9-48fc-86de-e489d0a05b50 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] No waiting events found dispatching network-vif-plugged-78b6999c-7e47-4732-a9f5-e2b099b471f9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:35:24 np0005588920 nova_compute[226886]: 2026-01-20 14:35:24.469 226890 WARNING nova.compute.manager [req-9cf093fa-6e3d-4c36-88f7-094489c30281 req-e06595bb-4dc9-48fc-86de-e489d0a05b50 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Received unexpected event network-vif-plugged-78b6999c-7e47-4732-a9f5-e2b099b471f9 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:35:24 np0005588920 nova_compute[226886]: 2026-01-20 14:35:24.573 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:24 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e186 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:35:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:35:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:24.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:35:24 np0005588920 nova_compute[226886]: 2026-01-20 14:35:24.795 226890 DEBUG nova.network.neutron [-] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:35:24 np0005588920 nova_compute[226886]: 2026-01-20 14:35:24.819 226890 INFO nova.compute.manager [-] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Took 0.81 seconds to deallocate network for instance.#033[00m
Jan 20 09:35:24 np0005588920 nova_compute[226886]: 2026-01-20 14:35:24.875 226890 DEBUG oslo_concurrency.lockutils [None req-efc9f19c-4125-4d2b-b6ec-5e4f904a3295 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:35:24 np0005588920 nova_compute[226886]: 2026-01-20 14:35:24.876 226890 DEBUG oslo_concurrency.lockutils [None req-efc9f19c-4125-4d2b-b6ec-5e4f904a3295 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:35:24 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e187 e187: 3 total, 3 up, 3 in
Jan 20 09:35:24 np0005588920 nova_compute[226886]: 2026-01-20 14:35:24.957 226890 DEBUG oslo_concurrency.processutils [None req-efc9f19c-4125-4d2b-b6ec-5e4f904a3295 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:35:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:25.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:25 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:35:25 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/979848207' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:35:25 np0005588920 nova_compute[226886]: 2026-01-20 14:35:25.376 226890 DEBUG nova.compute.manager [req-ce157098-14a3-4246-a8a5-9ba836d4ba8d req-a454e3f3-3543-47c3-8395-2d778f973914 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Received event network-vif-deleted-78b6999c-7e47-4732-a9f5-e2b099b471f9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:35:25 np0005588920 nova_compute[226886]: 2026-01-20 14:35:25.379 226890 DEBUG oslo_concurrency.processutils [None req-efc9f19c-4125-4d2b-b6ec-5e4f904a3295 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.422s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:35:25 np0005588920 nova_compute[226886]: 2026-01-20 14:35:25.384 226890 DEBUG nova.compute.provider_tree [None req-efc9f19c-4125-4d2b-b6ec-5e4f904a3295 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:35:25 np0005588920 nova_compute[226886]: 2026-01-20 14:35:25.398 226890 DEBUG nova.scheduler.client.report [None req-efc9f19c-4125-4d2b-b6ec-5e4f904a3295 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:35:25 np0005588920 nova_compute[226886]: 2026-01-20 14:35:25.425 226890 DEBUG oslo_concurrency.lockutils [None req-efc9f19c-4125-4d2b-b6ec-5e4f904a3295 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.549s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:35:25 np0005588920 nova_compute[226886]: 2026-01-20 14:35:25.448 226890 INFO nova.scheduler.client.report [None req-efc9f19c-4125-4d2b-b6ec-5e4f904a3295 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Deleted allocations for instance 06711d06-2cb8-4aa0-a787-db6d71e98029#033[00m
Jan 20 09:35:25 np0005588920 nova_compute[226886]: 2026-01-20 14:35:25.518 226890 DEBUG oslo_concurrency.lockutils [None req-efc9f19c-4125-4d2b-b6ec-5e4f904a3295 6bf23282febb455daf4d4f24666cd6c3 a73bd836c7f64377a24971d95d583639 - - default default] Lock "06711d06-2cb8-4aa0-a787-db6d71e98029" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.373s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:35:25 np0005588920 nova_compute[226886]: 2026-01-20 14:35:25.721 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:35:25 np0005588920 nova_compute[226886]: 2026-01-20 14:35:25.902 226890 DEBUG oslo_concurrency.lockutils [None req-bed8263c-6ee8-4fa5-bf82-3ab29db205f1 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Acquiring lock "c9a86cb2-b092-4887-b47d-1a05fb756a83" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:35:25 np0005588920 nova_compute[226886]: 2026-01-20 14:35:25.903 226890 DEBUG oslo_concurrency.lockutils [None req-bed8263c-6ee8-4fa5-bf82-3ab29db205f1 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "c9a86cb2-b092-4887-b47d-1a05fb756a83" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:35:25 np0005588920 nova_compute[226886]: 2026-01-20 14:35:25.903 226890 DEBUG oslo_concurrency.lockutils [None req-bed8263c-6ee8-4fa5-bf82-3ab29db205f1 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Acquiring lock "c9a86cb2-b092-4887-b47d-1a05fb756a83-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:35:25 np0005588920 nova_compute[226886]: 2026-01-20 14:35:25.904 226890 DEBUG oslo_concurrency.lockutils [None req-bed8263c-6ee8-4fa5-bf82-3ab29db205f1 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "c9a86cb2-b092-4887-b47d-1a05fb756a83-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:35:25 np0005588920 nova_compute[226886]: 2026-01-20 14:35:25.904 226890 DEBUG oslo_concurrency.lockutils [None req-bed8263c-6ee8-4fa5-bf82-3ab29db205f1 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "c9a86cb2-b092-4887-b47d-1a05fb756a83-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:35:25 np0005588920 nova_compute[226886]: 2026-01-20 14:35:25.905 226890 INFO nova.compute.manager [None req-bed8263c-6ee8-4fa5-bf82-3ab29db205f1 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Terminating instance#033[00m
Jan 20 09:35:25 np0005588920 nova_compute[226886]: 2026-01-20 14:35:25.907 226890 DEBUG nova.compute.manager [None req-bed8263c-6ee8-4fa5-bf82-3ab29db205f1 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:35:25 np0005588920 kernel: tap7a9a2efa-73 (unregistering): left promiscuous mode
Jan 20 09:35:25 np0005588920 NetworkManager[49076]: <info>  [1768919725.9527] device (tap7a9a2efa-73): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:35:25 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e188 e188: 3 total, 3 up, 3 in
Jan 20 09:35:25 np0005588920 ovn_controller[133971]: 2026-01-20T14:35:25Z|00129|binding|INFO|Releasing lport 7a9a2efa-73d4-41be-92ee-61654388a2b1 from this chassis (sb_readonly=0)
Jan 20 09:35:25 np0005588920 ovn_controller[133971]: 2026-01-20T14:35:25Z|00130|binding|INFO|Setting lport 7a9a2efa-73d4-41be-92ee-61654388a2b1 down in Southbound
Jan 20 09:35:25 np0005588920 ovn_controller[133971]: 2026-01-20T14:35:25Z|00131|binding|INFO|Removing iface tap7a9a2efa-73 ovn-installed in OVS
Jan 20 09:35:25 np0005588920 nova_compute[226886]: 2026-01-20 14:35:25.962 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:25 np0005588920 nova_compute[226886]: 2026-01-20 14:35:25.964 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:25.968 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:67:fe:a5 10.100.0.4'], port_security=['fa:16:3e:67:fe:a5 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'c9a86cb2-b092-4887-b47d-1a05fb756a83', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a841e7a1434c488390475174e10bc161', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0bbdea05-fba7-47c7-ba4e-5dac58212a25', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c43dea88-ea55-4069-a4be-2c30a432a754, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=7a9a2efa-73d4-41be-92ee-61654388a2b1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:35:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:25.969 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 7a9a2efa-73d4-41be-92ee-61654388a2b1 in datapath 33c9a20a-d976-42a8-b8bf-f83ddfc97c9a unbound from our chassis#033[00m
Jan 20 09:35:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:25.972 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 33c9a20a-d976-42a8-b8bf-f83ddfc97c9a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:35:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:25.973 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b90bc769-6cb4-472d-91a3-c5bb6f4adf27]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:25.974 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a namespace which is not needed anymore#033[00m
Jan 20 09:35:25 np0005588920 nova_compute[226886]: 2026-01-20 14:35:25.984 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:26 np0005588920 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000028.scope: Deactivated successfully.
Jan 20 09:35:26 np0005588920 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000028.scope: Consumed 16.566s CPU time.
Jan 20 09:35:26 np0005588920 systemd-machined[196121]: Machine qemu-18-instance-00000028 terminated.
Jan 20 09:35:26 np0005588920 neutron-haproxy-ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a[242435]: [NOTICE]   (242439) : haproxy version is 2.8.14-c23fe91
Jan 20 09:35:26 np0005588920 neutron-haproxy-ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a[242435]: [NOTICE]   (242439) : path to executable is /usr/sbin/haproxy
Jan 20 09:35:26 np0005588920 neutron-haproxy-ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a[242435]: [WARNING]  (242439) : Exiting Master process...
Jan 20 09:35:26 np0005588920 neutron-haproxy-ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a[242435]: [ALERT]    (242439) : Current worker (242441) exited with code 143 (Terminated)
Jan 20 09:35:26 np0005588920 neutron-haproxy-ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a[242435]: [WARNING]  (242439) : All workers exited. Exiting... (0)
Jan 20 09:35:26 np0005588920 nova_compute[226886]: 2026-01-20 14:35:26.147 226890 INFO nova.virt.libvirt.driver [-] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Instance destroyed successfully.#033[00m
Jan 20 09:35:26 np0005588920 nova_compute[226886]: 2026-01-20 14:35:26.147 226890 DEBUG nova.objects.instance [None req-bed8263c-6ee8-4fa5-bf82-3ab29db205f1 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lazy-loading 'resources' on Instance uuid c9a86cb2-b092-4887-b47d-1a05fb756a83 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:35:26 np0005588920 systemd[1]: libpod-91a85c9d4b6ef20c39121a389ad5daacc90287436aee76b40ecef6064fcbf811.scope: Deactivated successfully.
Jan 20 09:35:26 np0005588920 podman[244667]: 2026-01-20 14:35:26.1537651 +0000 UTC m=+0.052197645 container died 91a85c9d4b6ef20c39121a389ad5daacc90287436aee76b40ecef6064fcbf811 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 20 09:35:26 np0005588920 nova_compute[226886]: 2026-01-20 14:35:26.166 226890 DEBUG nova.virt.libvirt.vif [None req-bed8263c-6ee8-4fa5-bf82-3ab29db205f1 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:33:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-2105937994',display_name='tempest-ServersAdminTestJSON-server-2105937994',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-2105937994',id=40,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:33:48Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a841e7a1434c488390475174e10bc161',ramdisk_id='',reservation_id='r-1bk8rulu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min
_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1261404595',owner_user_name='tempest-ServersAdminTestJSON-1261404595-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:33:48Z,user_data=None,user_id='f51c395107c84dbd9067113b84ff01dd',uuid=c9a86cb2-b092-4887-b47d-1a05fb756a83,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7a9a2efa-73d4-41be-92ee-61654388a2b1", "address": "fa:16:3e:67:fe:a5", "network": {"id": "33c9a20a-d976-42a8-b8bf-f83ddfc97c9a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-202342440-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a841e7a1434c488390475174e10bc161", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a9a2efa-73", "ovs_interfaceid": "7a9a2efa-73d4-41be-92ee-61654388a2b1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:35:26 np0005588920 nova_compute[226886]: 2026-01-20 14:35:26.167 226890 DEBUG nova.network.os_vif_util [None req-bed8263c-6ee8-4fa5-bf82-3ab29db205f1 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Converting VIF {"id": "7a9a2efa-73d4-41be-92ee-61654388a2b1", "address": "fa:16:3e:67:fe:a5", "network": {"id": "33c9a20a-d976-42a8-b8bf-f83ddfc97c9a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-202342440-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a841e7a1434c488390475174e10bc161", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a9a2efa-73", "ovs_interfaceid": "7a9a2efa-73d4-41be-92ee-61654388a2b1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:35:26 np0005588920 nova_compute[226886]: 2026-01-20 14:35:26.168 226890 DEBUG nova.network.os_vif_util [None req-bed8263c-6ee8-4fa5-bf82-3ab29db205f1 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:67:fe:a5,bridge_name='br-int',has_traffic_filtering=True,id=7a9a2efa-73d4-41be-92ee-61654388a2b1,network=Network(33c9a20a-d976-42a8-b8bf-f83ddfc97c9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a9a2efa-73') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:35:26 np0005588920 nova_compute[226886]: 2026-01-20 14:35:26.168 226890 DEBUG os_vif [None req-bed8263c-6ee8-4fa5-bf82-3ab29db205f1 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:67:fe:a5,bridge_name='br-int',has_traffic_filtering=True,id=7a9a2efa-73d4-41be-92ee-61654388a2b1,network=Network(33c9a20a-d976-42a8-b8bf-f83ddfc97c9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a9a2efa-73') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:35:26 np0005588920 nova_compute[226886]: 2026-01-20 14:35:26.170 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:26 np0005588920 nova_compute[226886]: 2026-01-20 14:35:26.171 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7a9a2efa-73, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:35:26 np0005588920 nova_compute[226886]: 2026-01-20 14:35:26.174 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:35:26 np0005588920 nova_compute[226886]: 2026-01-20 14:35:26.175 226890 INFO os_vif [None req-bed8263c-6ee8-4fa5-bf82-3ab29db205f1 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:67:fe:a5,bridge_name='br-int',has_traffic_filtering=True,id=7a9a2efa-73d4-41be-92ee-61654388a2b1,network=Network(33c9a20a-d976-42a8-b8bf-f83ddfc97c9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a9a2efa-73')#033[00m
Jan 20 09:35:26 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-91a85c9d4b6ef20c39121a389ad5daacc90287436aee76b40ecef6064fcbf811-userdata-shm.mount: Deactivated successfully.
Jan 20 09:35:26 np0005588920 systemd[1]: var-lib-containers-storage-overlay-805c9a9b81947f21b33ab479b627dfe2bcab8782966047413eac7a75abad9f31-merged.mount: Deactivated successfully.
Jan 20 09:35:26 np0005588920 podman[244667]: 2026-01-20 14:35:26.192212058 +0000 UTC m=+0.090644593 container cleanup 91a85c9d4b6ef20c39121a389ad5daacc90287436aee76b40ecef6064fcbf811 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 20 09:35:26 np0005588920 systemd[1]: libpod-conmon-91a85c9d4b6ef20c39121a389ad5daacc90287436aee76b40ecef6064fcbf811.scope: Deactivated successfully.
Jan 20 09:35:26 np0005588920 podman[244722]: 2026-01-20 14:35:26.261114929 +0000 UTC m=+0.044089937 container remove 91a85c9d4b6ef20c39121a389ad5daacc90287436aee76b40ecef6064fcbf811 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 20 09:35:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:26.265 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b82541dc-49cb-4924-a3df-f8f0727d0fef]: (4, ('Tue Jan 20 02:35:26 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a (91a85c9d4b6ef20c39121a389ad5daacc90287436aee76b40ecef6064fcbf811)\n91a85c9d4b6ef20c39121a389ad5daacc90287436aee76b40ecef6064fcbf811\nTue Jan 20 02:35:26 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a (91a85c9d4b6ef20c39121a389ad5daacc90287436aee76b40ecef6064fcbf811)\n91a85c9d4b6ef20c39121a389ad5daacc90287436aee76b40ecef6064fcbf811\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:26.267 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[dce48a73-e676-460f-aa1b-507ebdc5b2e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:26.268 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap33c9a20a-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:35:26 np0005588920 kernel: tap33c9a20a-d0: left promiscuous mode
Jan 20 09:35:26 np0005588920 nova_compute[226886]: 2026-01-20 14:35:26.271 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:26 np0005588920 nova_compute[226886]: 2026-01-20 14:35:26.290 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:26.294 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[fc911171-fcc2-4b3f-956c-125856ac5097]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:26.308 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[9df0be3e-554f-4a06-9a39-23241d17a7f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:26.309 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0d27e6c1-1bd4-467f-9ae8-09c26f09d99b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:26.324 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[7cec5d69-f469-4923-a9c5-2d2fecac6932]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 466654, 'reachable_time': 41244, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244739, 'error': None, 'target': 'ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:26.326 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:35:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:26.327 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[14b80900-85a0-4282-a3a9-8310ba6ce3e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:26 np0005588920 systemd[1]: run-netns-ovnmeta\x2d33c9a20a\x2dd976\x2d42a8\x2db8bf\x2df83ddfc97c9a.mount: Deactivated successfully.
Jan 20 09:35:26 np0005588920 nova_compute[226886]: 2026-01-20 14:35:26.490 226890 DEBUG nova.virt.libvirt.driver [None req-ca3b75eb-76a2-480b-8a80-c774c0654d58 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Instance in state 1 after 21 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 20 09:35:26 np0005588920 nova_compute[226886]: 2026-01-20 14:35:26.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:35:26 np0005588920 nova_compute[226886]: 2026-01-20 14:35:26.730 226890 INFO nova.virt.libvirt.driver [None req-bed8263c-6ee8-4fa5-bf82-3ab29db205f1 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Deleting instance files /var/lib/nova/instances/c9a86cb2-b092-4887-b47d-1a05fb756a83_del#033[00m
Jan 20 09:35:26 np0005588920 nova_compute[226886]: 2026-01-20 14:35:26.731 226890 INFO nova.virt.libvirt.driver [None req-bed8263c-6ee8-4fa5-bf82-3ab29db205f1 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Deletion of /var/lib/nova/instances/c9a86cb2-b092-4887-b47d-1a05fb756a83_del complete#033[00m
Jan 20 09:35:26 np0005588920 nova_compute[226886]: 2026-01-20 14:35:26.734 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:35:26 np0005588920 nova_compute[226886]: 2026-01-20 14:35:26.743 226890 DEBUG nova.compute.manager [req-ac21e656-8a80-45f6-9771-4cb0932e640f req-aee7937a-31cd-4a65-8bb1-8eb2d6ec1f61 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Received event network-vif-unplugged-7a9a2efa-73d4-41be-92ee-61654388a2b1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:35:26 np0005588920 nova_compute[226886]: 2026-01-20 14:35:26.744 226890 DEBUG oslo_concurrency.lockutils [req-ac21e656-8a80-45f6-9771-4cb0932e640f req-aee7937a-31cd-4a65-8bb1-8eb2d6ec1f61 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "c9a86cb2-b092-4887-b47d-1a05fb756a83-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:35:26 np0005588920 nova_compute[226886]: 2026-01-20 14:35:26.744 226890 DEBUG oslo_concurrency.lockutils [req-ac21e656-8a80-45f6-9771-4cb0932e640f req-aee7937a-31cd-4a65-8bb1-8eb2d6ec1f61 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c9a86cb2-b092-4887-b47d-1a05fb756a83-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:35:26 np0005588920 nova_compute[226886]: 2026-01-20 14:35:26.745 226890 DEBUG oslo_concurrency.lockutils [req-ac21e656-8a80-45f6-9771-4cb0932e640f req-aee7937a-31cd-4a65-8bb1-8eb2d6ec1f61 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c9a86cb2-b092-4887-b47d-1a05fb756a83-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:35:26 np0005588920 nova_compute[226886]: 2026-01-20 14:35:26.745 226890 DEBUG nova.compute.manager [req-ac21e656-8a80-45f6-9771-4cb0932e640f req-aee7937a-31cd-4a65-8bb1-8eb2d6ec1f61 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] No waiting events found dispatching network-vif-unplugged-7a9a2efa-73d4-41be-92ee-61654388a2b1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:35:26 np0005588920 nova_compute[226886]: 2026-01-20 14:35:26.745 226890 DEBUG nova.compute.manager [req-ac21e656-8a80-45f6-9771-4cb0932e640f req-aee7937a-31cd-4a65-8bb1-8eb2d6ec1f61 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Received event network-vif-unplugged-7a9a2efa-73d4-41be-92ee-61654388a2b1 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:35:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:35:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:26.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:35:26 np0005588920 nova_compute[226886]: 2026-01-20 14:35:26.818 226890 INFO nova.compute.manager [None req-bed8263c-6ee8-4fa5-bf82-3ab29db205f1 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Took 0.91 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:35:26 np0005588920 nova_compute[226886]: 2026-01-20 14:35:26.819 226890 DEBUG oslo.service.loopingcall [None req-bed8263c-6ee8-4fa5-bf82-3ab29db205f1 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:35:26 np0005588920 nova_compute[226886]: 2026-01-20 14:35:26.820 226890 DEBUG nova.compute.manager [-] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:35:26 np0005588920 nova_compute[226886]: 2026-01-20 14:35:26.821 226890 DEBUG nova.network.neutron [-] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:35:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:35:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:27.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:35:27 np0005588920 nova_compute[226886]: 2026-01-20 14:35:27.436 226890 DEBUG nova.network.neutron [-] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:35:27 np0005588920 nova_compute[226886]: 2026-01-20 14:35:27.458 226890 INFO nova.compute.manager [-] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Took 0.64 seconds to deallocate network for instance.#033[00m
Jan 20 09:35:27 np0005588920 nova_compute[226886]: 2026-01-20 14:35:27.493 226890 DEBUG nova.compute.manager [req-0f9a7ccb-e755-45bf-a3b2-d1cd9bdbd835 req-571c2951-999a-470f-9764-b157ec8453bd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Received event network-vif-deleted-7a9a2efa-73d4-41be-92ee-61654388a2b1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:35:27 np0005588920 nova_compute[226886]: 2026-01-20 14:35:27.551 226890 DEBUG oslo_concurrency.lockutils [None req-bed8263c-6ee8-4fa5-bf82-3ab29db205f1 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:35:27 np0005588920 nova_compute[226886]: 2026-01-20 14:35:27.552 226890 DEBUG oslo_concurrency.lockutils [None req-bed8263c-6ee8-4fa5-bf82-3ab29db205f1 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:35:27 np0005588920 nova_compute[226886]: 2026-01-20 14:35:27.620 226890 DEBUG oslo_concurrency.processutils [None req-bed8263c-6ee8-4fa5-bf82-3ab29db205f1 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:35:27 np0005588920 nova_compute[226886]: 2026-01-20 14:35:27.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:35:27 np0005588920 nova_compute[226886]: 2026-01-20 14:35:27.763 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:35:27 np0005588920 podman[244765]: 2026-01-20 14:35:27.994352454 +0000 UTC m=+0.076504206 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 20 09:35:28 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:35:28 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2919091173' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:35:28 np0005588920 nova_compute[226886]: 2026-01-20 14:35:28.092 226890 DEBUG oslo_concurrency.processutils [None req-bed8263c-6ee8-4fa5-bf82-3ab29db205f1 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:35:28 np0005588920 nova_compute[226886]: 2026-01-20 14:35:28.099 226890 DEBUG nova.compute.provider_tree [None req-bed8263c-6ee8-4fa5-bf82-3ab29db205f1 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:35:28 np0005588920 nova_compute[226886]: 2026-01-20 14:35:28.116 226890 DEBUG nova.scheduler.client.report [None req-bed8263c-6ee8-4fa5-bf82-3ab29db205f1 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:35:28 np0005588920 nova_compute[226886]: 2026-01-20 14:35:28.137 226890 DEBUG oslo_concurrency.lockutils [None req-bed8263c-6ee8-4fa5-bf82-3ab29db205f1 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.585s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:35:28 np0005588920 nova_compute[226886]: 2026-01-20 14:35:28.140 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.377s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:35:28 np0005588920 nova_compute[226886]: 2026-01-20 14:35:28.140 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:35:28 np0005588920 nova_compute[226886]: 2026-01-20 14:35:28.141 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:35:28 np0005588920 nova_compute[226886]: 2026-01-20 14:35:28.141 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:35:28 np0005588920 nova_compute[226886]: 2026-01-20 14:35:28.201 226890 INFO nova.scheduler.client.report [None req-bed8263c-6ee8-4fa5-bf82-3ab29db205f1 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Deleted allocations for instance c9a86cb2-b092-4887-b47d-1a05fb756a83#033[00m
Jan 20 09:35:28 np0005588920 nova_compute[226886]: 2026-01-20 14:35:28.294 226890 DEBUG oslo_concurrency.lockutils [None req-bed8263c-6ee8-4fa5-bf82-3ab29db205f1 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "c9a86cb2-b092-4887-b47d-1a05fb756a83" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.391s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:35:28 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:35:28 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3279341162' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:35:28 np0005588920 nova_compute[226886]: 2026-01-20 14:35:28.621 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:35:28 np0005588920 nova_compute[226886]: 2026-01-20 14:35:28.681 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-0000002d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:35:28 np0005588920 nova_compute[226886]: 2026-01-20 14:35:28.681 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-0000002d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:35:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:28.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:28 np0005588920 kernel: tap3f3dec34-72 (unregistering): left promiscuous mode
Jan 20 09:35:28 np0005588920 NetworkManager[49076]: <info>  [1768919728.8358] device (tap3f3dec34-72): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:35:28 np0005588920 ovn_controller[133971]: 2026-01-20T14:35:28Z|00132|binding|INFO|Releasing lport 3f3dec34-7223-4718-8c4b-bf4fdfddf3ef from this chassis (sb_readonly=0)
Jan 20 09:35:28 np0005588920 ovn_controller[133971]: 2026-01-20T14:35:28Z|00133|binding|INFO|Setting lport 3f3dec34-7223-4718-8c4b-bf4fdfddf3ef down in Southbound
Jan 20 09:35:28 np0005588920 ovn_controller[133971]: 2026-01-20T14:35:28Z|00134|binding|INFO|Removing iface tap3f3dec34-72 ovn-installed in OVS
Jan 20 09:35:28 np0005588920 nova_compute[226886]: 2026-01-20 14:35:28.843 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:28 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:28.849 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e9:83:16 10.100.0.6'], port_security=['fa:16:3e:e9:83:16 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'e22c5447-900e-45da-b2af-46423bc1e2d8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-abb83e3e-0b12-431b-ad86-a1d271b5b46a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3750c56415134773aa9d9880038f1749', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2e302063-2ccd-4f7c-8835-ef521762a486', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4125934e-1dea-4e34-a38d-5291c850f0b2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=3f3dec34-7223-4718-8c4b-bf4fdfddf3ef) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:35:28 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:28.850 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 3f3dec34-7223-4718-8c4b-bf4fdfddf3ef in datapath abb83e3e-0b12-431b-ad86-a1d271b5b46a unbound from our chassis#033[00m
Jan 20 09:35:28 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:28.851 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network abb83e3e-0b12-431b-ad86-a1d271b5b46a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:35:28 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:28.852 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[cdaf90f5-fb98-452b-929a-3c33f5f8956c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:28 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:28.852 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a namespace which is not needed anymore#033[00m
Jan 20 09:35:28 np0005588920 nova_compute[226886]: 2026-01-20 14:35:28.853 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:35:28 np0005588920 nova_compute[226886]: 2026-01-20 14:35:28.854 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4453MB free_disk=20.804668426513672GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:35:28 np0005588920 nova_compute[226886]: 2026-01-20 14:35:28.854 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:35:28 np0005588920 nova_compute[226886]: 2026-01-20 14:35:28.854 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:35:28 np0005588920 nova_compute[226886]: 2026-01-20 14:35:28.861 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:28 np0005588920 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d0000002d.scope: Deactivated successfully.
Jan 20 09:35:28 np0005588920 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d0000002d.scope: Consumed 14.149s CPU time.
Jan 20 09:35:28 np0005588920 systemd-machined[196121]: Machine qemu-20-instance-0000002d terminated.
Jan 20 09:35:28 np0005588920 nova_compute[226886]: 2026-01-20 14:35:28.903 226890 DEBUG nova.compute.manager [req-bad7ea4c-3a95-49c1-8fd7-9ffd7d50a5fc req-ba519c55-41fb-4e59-b921-7df5c7ab715a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Received event network-vif-plugged-7a9a2efa-73d4-41be-92ee-61654388a2b1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:35:28 np0005588920 nova_compute[226886]: 2026-01-20 14:35:28.904 226890 DEBUG oslo_concurrency.lockutils [req-bad7ea4c-3a95-49c1-8fd7-9ffd7d50a5fc req-ba519c55-41fb-4e59-b921-7df5c7ab715a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "c9a86cb2-b092-4887-b47d-1a05fb756a83-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:35:28 np0005588920 nova_compute[226886]: 2026-01-20 14:35:28.904 226890 DEBUG oslo_concurrency.lockutils [req-bad7ea4c-3a95-49c1-8fd7-9ffd7d50a5fc req-ba519c55-41fb-4e59-b921-7df5c7ab715a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c9a86cb2-b092-4887-b47d-1a05fb756a83-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:35:28 np0005588920 nova_compute[226886]: 2026-01-20 14:35:28.904 226890 DEBUG oslo_concurrency.lockutils [req-bad7ea4c-3a95-49c1-8fd7-9ffd7d50a5fc req-ba519c55-41fb-4e59-b921-7df5c7ab715a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c9a86cb2-b092-4887-b47d-1a05fb756a83-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:35:28 np0005588920 nova_compute[226886]: 2026-01-20 14:35:28.904 226890 DEBUG nova.compute.manager [req-bad7ea4c-3a95-49c1-8fd7-9ffd7d50a5fc req-ba519c55-41fb-4e59-b921-7df5c7ab715a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] No waiting events found dispatching network-vif-plugged-7a9a2efa-73d4-41be-92ee-61654388a2b1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:35:28 np0005588920 nova_compute[226886]: 2026-01-20 14:35:28.904 226890 WARNING nova.compute.manager [req-bad7ea4c-3a95-49c1-8fd7-9ffd7d50a5fc req-ba519c55-41fb-4e59-b921-7df5c7ab715a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Received unexpected event network-vif-plugged-7a9a2efa-73d4-41be-92ee-61654388a2b1 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 09:35:28 np0005588920 nova_compute[226886]: 2026-01-20 14:35:28.937 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance e22c5447-900e-45da-b2af-46423bc1e2d8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:35:28 np0005588920 nova_compute[226886]: 2026-01-20 14:35:28.938 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:35:28 np0005588920 nova_compute[226886]: 2026-01-20 14:35:28.938 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:35:28 np0005588920 nova_compute[226886]: 2026-01-20 14:35:28.972 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:35:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:29.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:29 np0005588920 NetworkManager[49076]: <info>  [1768919729.0606] manager: (tap3f3dec34-72): new Tun device (/org/freedesktop/NetworkManager/Devices/78)
Jan 20 09:35:29 np0005588920 neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a[243974]: [NOTICE]   (243978) : haproxy version is 2.8.14-c23fe91
Jan 20 09:35:29 np0005588920 neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a[243974]: [NOTICE]   (243978) : path to executable is /usr/sbin/haproxy
Jan 20 09:35:29 np0005588920 neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a[243974]: [WARNING]  (243978) : Exiting Master process...
Jan 20 09:35:29 np0005588920 neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a[243974]: [ALERT]    (243978) : Current worker (243980) exited with code 143 (Terminated)
Jan 20 09:35:29 np0005588920 neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a[243974]: [WARNING]  (243978) : All workers exited. Exiting... (0)
Jan 20 09:35:29 np0005588920 systemd[1]: libpod-f4555188deb4bb8df8edaeb4d7c95d78707ca49340e72b887fe9e093b4270ba1.scope: Deactivated successfully.
Jan 20 09:35:29 np0005588920 podman[244832]: 2026-01-20 14:35:29.078419448 +0000 UTC m=+0.137011412 container died f4555188deb4bb8df8edaeb4d7c95d78707ca49340e72b887fe9e093b4270ba1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 20 09:35:29 np0005588920 ovn_controller[133971]: 2026-01-20T14:35:29Z|00135|binding|INFO|Releasing lport dfacaf19-f896-4c13-a7ad-47b57cf03fc1 from this chassis (sb_readonly=0)
Jan 20 09:35:29 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f4555188deb4bb8df8edaeb4d7c95d78707ca49340e72b887fe9e093b4270ba1-userdata-shm.mount: Deactivated successfully.
Jan 20 09:35:29 np0005588920 systemd[1]: var-lib-containers-storage-overlay-14bc360de7ee8c5f67acdc039ed29aeb0f1f5df9711cad7e48c743a2ed340cd3-merged.mount: Deactivated successfully.
Jan 20 09:35:29 np0005588920 podman[244832]: 2026-01-20 14:35:29.146769824 +0000 UTC m=+0.205361848 container cleanup f4555188deb4bb8df8edaeb4d7c95d78707ca49340e72b887fe9e093b4270ba1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 20 09:35:29 np0005588920 systemd[1]: libpod-conmon-f4555188deb4bb8df8edaeb4d7c95d78707ca49340e72b887fe9e093b4270ba1.scope: Deactivated successfully.
Jan 20 09:35:29 np0005588920 podman[244889]: 2026-01-20 14:35:29.204793401 +0000 UTC m=+0.038184361 container remove f4555188deb4bb8df8edaeb4d7c95d78707ca49340e72b887fe9e093b4270ba1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 20 09:35:29 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:29.209 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[8f43ca68-c787-40a2-a290-73191280df9f]: (4, ('Tue Jan 20 02:35:28 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a (f4555188deb4bb8df8edaeb4d7c95d78707ca49340e72b887fe9e093b4270ba1)\nf4555188deb4bb8df8edaeb4d7c95d78707ca49340e72b887fe9e093b4270ba1\nTue Jan 20 02:35:29 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a (f4555188deb4bb8df8edaeb4d7c95d78707ca49340e72b887fe9e093b4270ba1)\nf4555188deb4bb8df8edaeb4d7c95d78707ca49340e72b887fe9e093b4270ba1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:29 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:29.211 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[9724a55e-f93f-42cb-b80e-91a20d099c88]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:29 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:29.212 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapabb83e3e-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:35:29 np0005588920 nova_compute[226886]: 2026-01-20 14:35:29.214 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:29 np0005588920 kernel: tapabb83e3e-00: left promiscuous mode
Jan 20 09:35:29 np0005588920 nova_compute[226886]: 2026-01-20 14:35:29.281 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:29 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:29.284 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[95ffa1de-40b5-43fc-b428-211f249fded6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:29 np0005588920 nova_compute[226886]: 2026-01-20 14:35:29.301 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:29 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:29.304 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e29a3209-d9c1-4c83-90bc-d0e92c032efd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:29 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:29.305 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2abdeb3a-f242-4a84-8168-8cc6d8b86e7e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:29 np0005588920 nova_compute[226886]: 2026-01-20 14:35:29.314 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:29 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:29.321 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[edf02c83-0f78-41a0-9ee5-46e08ba9de06]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 474469, 'reachable_time': 21814, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244907, 'error': None, 'target': 'ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:29 np0005588920 systemd[1]: run-netns-ovnmeta\x2dabb83e3e\x2d0b12\x2d431b\x2dad86\x2da1d271b5b46a.mount: Deactivated successfully.
Jan 20 09:35:29 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:29.323 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:35:29 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:29.323 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[c04a30fc-b7e2-4a08-ab56-c02af063fef3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:35:29 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3897833887' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:35:29 np0005588920 nova_compute[226886]: 2026-01-20 14:35:29.425 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:35:29 np0005588920 nova_compute[226886]: 2026-01-20 14:35:29.429 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:35:29 np0005588920 nova_compute[226886]: 2026-01-20 14:35:29.445 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:35:29 np0005588920 nova_compute[226886]: 2026-01-20 14:35:29.465 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:35:29 np0005588920 nova_compute[226886]: 2026-01-20 14:35:29.465 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.611s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:35:29 np0005588920 nova_compute[226886]: 2026-01-20 14:35:29.507 226890 INFO nova.virt.libvirt.driver [None req-ca3b75eb-76a2-480b-8a80-c774c0654d58 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Instance shutdown successfully after 24 seconds.#033[00m
Jan 20 09:35:29 np0005588920 nova_compute[226886]: 2026-01-20 14:35:29.511 226890 INFO nova.virt.libvirt.driver [-] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Instance destroyed successfully.#033[00m
Jan 20 09:35:29 np0005588920 nova_compute[226886]: 2026-01-20 14:35:29.511 226890 DEBUG nova.objects.instance [None req-ca3b75eb-76a2-480b-8a80-c774c0654d58 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lazy-loading 'numa_topology' on Instance uuid e22c5447-900e-45da-b2af-46423bc1e2d8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:35:29 np0005588920 nova_compute[226886]: 2026-01-20 14:35:29.525 226890 DEBUG nova.compute.manager [None req-ca3b75eb-76a2-480b-8a80-c774c0654d58 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:35:29 np0005588920 nova_compute[226886]: 2026-01-20 14:35:29.571 226890 DEBUG oslo_concurrency.lockutils [None req-ca3b75eb-76a2-480b-8a80-c774c0654d58 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "e22c5447-900e-45da-b2af-46423bc1e2d8" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 24.349s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:35:29 np0005588920 nova_compute[226886]: 2026-01-20 14:35:29.575 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:29 np0005588920 nova_compute[226886]: 2026-01-20 14:35:29.595 226890 DEBUG nova.compute.manager [req-80b49d01-1f2a-4e59-883c-cf79f2f75b58 req-769e1dbf-fcb0-4073-b302-ea9b511a4a8f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Received event network-vif-unplugged-3f3dec34-7223-4718-8c4b-bf4fdfddf3ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:35:29 np0005588920 nova_compute[226886]: 2026-01-20 14:35:29.595 226890 DEBUG oslo_concurrency.lockutils [req-80b49d01-1f2a-4e59-883c-cf79f2f75b58 req-769e1dbf-fcb0-4073-b302-ea9b511a4a8f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "e22c5447-900e-45da-b2af-46423bc1e2d8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:35:29 np0005588920 nova_compute[226886]: 2026-01-20 14:35:29.595 226890 DEBUG oslo_concurrency.lockutils [req-80b49d01-1f2a-4e59-883c-cf79f2f75b58 req-769e1dbf-fcb0-4073-b302-ea9b511a4a8f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "e22c5447-900e-45da-b2af-46423bc1e2d8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:35:29 np0005588920 nova_compute[226886]: 2026-01-20 14:35:29.595 226890 DEBUG oslo_concurrency.lockutils [req-80b49d01-1f2a-4e59-883c-cf79f2f75b58 req-769e1dbf-fcb0-4073-b302-ea9b511a4a8f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "e22c5447-900e-45da-b2af-46423bc1e2d8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:35:29 np0005588920 nova_compute[226886]: 2026-01-20 14:35:29.595 226890 DEBUG nova.compute.manager [req-80b49d01-1f2a-4e59-883c-cf79f2f75b58 req-769e1dbf-fcb0-4073-b302-ea9b511a4a8f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] No waiting events found dispatching network-vif-unplugged-3f3dec34-7223-4718-8c4b-bf4fdfddf3ef pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:35:29 np0005588920 nova_compute[226886]: 2026-01-20 14:35:29.596 226890 WARNING nova.compute.manager [req-80b49d01-1f2a-4e59-883c-cf79f2f75b58 req-769e1dbf-fcb0-4073-b302-ea9b511a4a8f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Received unexpected event network-vif-unplugged-3f3dec34-7223-4718-8c4b-bf4fdfddf3ef for instance with vm_state stopped and task_state None.#033[00m
Jan 20 09:35:29 np0005588920 nova_compute[226886]: 2026-01-20 14:35:29.596 226890 DEBUG nova.compute.manager [req-80b49d01-1f2a-4e59-883c-cf79f2f75b58 req-769e1dbf-fcb0-4073-b302-ea9b511a4a8f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Received event network-vif-plugged-3f3dec34-7223-4718-8c4b-bf4fdfddf3ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:35:29 np0005588920 nova_compute[226886]: 2026-01-20 14:35:29.596 226890 DEBUG oslo_concurrency.lockutils [req-80b49d01-1f2a-4e59-883c-cf79f2f75b58 req-769e1dbf-fcb0-4073-b302-ea9b511a4a8f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "e22c5447-900e-45da-b2af-46423bc1e2d8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:35:29 np0005588920 nova_compute[226886]: 2026-01-20 14:35:29.596 226890 DEBUG oslo_concurrency.lockutils [req-80b49d01-1f2a-4e59-883c-cf79f2f75b58 req-769e1dbf-fcb0-4073-b302-ea9b511a4a8f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "e22c5447-900e-45da-b2af-46423bc1e2d8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:35:29 np0005588920 nova_compute[226886]: 2026-01-20 14:35:29.596 226890 DEBUG oslo_concurrency.lockutils [req-80b49d01-1f2a-4e59-883c-cf79f2f75b58 req-769e1dbf-fcb0-4073-b302-ea9b511a4a8f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "e22c5447-900e-45da-b2af-46423bc1e2d8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:35:29 np0005588920 nova_compute[226886]: 2026-01-20 14:35:29.596 226890 DEBUG nova.compute.manager [req-80b49d01-1f2a-4e59-883c-cf79f2f75b58 req-769e1dbf-fcb0-4073-b302-ea9b511a4a8f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] No waiting events found dispatching network-vif-plugged-3f3dec34-7223-4718-8c4b-bf4fdfddf3ef pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:35:29 np0005588920 nova_compute[226886]: 2026-01-20 14:35:29.597 226890 WARNING nova.compute.manager [req-80b49d01-1f2a-4e59-883c-cf79f2f75b58 req-769e1dbf-fcb0-4073-b302-ea9b511a4a8f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Received unexpected event network-vif-plugged-3f3dec34-7223-4718-8c4b-bf4fdfddf3ef for instance with vm_state stopped and task_state None.#033[00m
Jan 20 09:35:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e188 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:35:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:35:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:30.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:35:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:31.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:31 np0005588920 nova_compute[226886]: 2026-01-20 14:35:31.214 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:32 np0005588920 nova_compute[226886]: 2026-01-20 14:35:32.363 226890 DEBUG nova.compute.manager [None req-94fdf283-dcde-4ba3-a144-d4d5d431814d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:35:32 np0005588920 nova_compute[226886]: 2026-01-20 14:35:32.398 226890 INFO nova.compute.manager [None req-94fdf283-dcde-4ba3-a144-d4d5d431814d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] instance snapshotting#033[00m
Jan 20 09:35:32 np0005588920 nova_compute[226886]: 2026-01-20 14:35:32.398 226890 WARNING nova.compute.manager [None req-94fdf283-dcde-4ba3-a144-d4d5d431814d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] trying to snapshot a non-running instance: (state: 4 expected: 1)#033[00m
Jan 20 09:35:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e189 e189: 3 total, 3 up, 3 in
Jan 20 09:35:32 np0005588920 nova_compute[226886]: 2026-01-20 14:35:32.659 226890 INFO nova.virt.libvirt.driver [None req-94fdf283-dcde-4ba3-a144-d4d5d431814d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Beginning cold snapshot process#033[00m
Jan 20 09:35:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:32.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:32 np0005588920 nova_compute[226886]: 2026-01-20 14:35:32.820 226890 DEBUG nova.virt.libvirt.imagebackend [None req-94fdf283-dcde-4ba3-a144-d4d5d431814d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] No parent info for a32b3e07-16d8-46fd-9a7b-c242c432fcf9; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Jan 20 09:35:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:33.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:33 np0005588920 nova_compute[226886]: 2026-01-20 14:35:33.164 226890 DEBUG nova.storage.rbd_utils [None req-94fdf283-dcde-4ba3-a144-d4d5d431814d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] creating snapshot(f8306fabadc64c25b85d6628f0fd4daa) on rbd image(e22c5447-900e-45da-b2af-46423bc1e2d8_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 20 09:35:33 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e190 e190: 3 total, 3 up, 3 in
Jan 20 09:35:33 np0005588920 nova_compute[226886]: 2026-01-20 14:35:33.630 226890 DEBUG nova.storage.rbd_utils [None req-94fdf283-dcde-4ba3-a144-d4d5d431814d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] cloning vms/e22c5447-900e-45da-b2af-46423bc1e2d8_disk@f8306fabadc64c25b85d6628f0fd4daa to images/656ef5b8-a536-420e-a7a3-850a33da38a8 clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 20 09:35:33 np0005588920 nova_compute[226886]: 2026-01-20 14:35:33.771 226890 DEBUG nova.storage.rbd_utils [None req-94fdf283-dcde-4ba3-a144-d4d5d431814d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] flattening images/656ef5b8-a536-420e-a7a3-850a33da38a8 flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 20 09:35:34 np0005588920 nova_compute[226886]: 2026-01-20 14:35:34.157 226890 DEBUG nova.storage.rbd_utils [None req-94fdf283-dcde-4ba3-a144-d4d5d431814d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] removing snapshot(f8306fabadc64c25b85d6628f0fd4daa) on rbd image(e22c5447-900e-45da-b2af-46423bc1e2d8_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 20 09:35:34 np0005588920 nova_compute[226886]: 2026-01-20 14:35:34.577 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:34 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e191 e191: 3 total, 3 up, 3 in
Jan 20 09:35:34 np0005588920 nova_compute[226886]: 2026-01-20 14:35:34.617 226890 DEBUG nova.storage.rbd_utils [None req-94fdf283-dcde-4ba3-a144-d4d5d431814d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] creating snapshot(snap) on rbd image(656ef5b8-a536-420e-a7a3-850a33da38a8) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 20 09:35:34 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e191 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:35:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:34.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:35.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:35 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e192 e192: 3 total, 3 up, 3 in
Jan 20 09:35:36 np0005588920 nova_compute[226886]: 2026-01-20 14:35:36.217 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:35:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:36.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:35:36 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e193 e193: 3 total, 3 up, 3 in
Jan 20 09:35:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:37.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:37 np0005588920 nova_compute[226886]: 2026-01-20 14:35:37.390 226890 INFO nova.virt.libvirt.driver [None req-94fdf283-dcde-4ba3-a144-d4d5d431814d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Snapshot image upload complete#033[00m
Jan 20 09:35:37 np0005588920 nova_compute[226886]: 2026-01-20 14:35:37.391 226890 INFO nova.compute.manager [None req-94fdf283-dcde-4ba3-a144-d4d5d431814d 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Took 4.99 seconds to snapshot the instance on the hypervisor.#033[00m
Jan 20 09:35:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e194 e194: 3 total, 3 up, 3 in
Jan 20 09:35:38 np0005588920 nova_compute[226886]: 2026-01-20 14:35:38.383 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768919723.3830101, 06711d06-2cb8-4aa0-a787-db6d71e98029 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:35:38 np0005588920 nova_compute[226886]: 2026-01-20 14:35:38.384 226890 INFO nova.compute.manager [-] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:35:38 np0005588920 nova_compute[226886]: 2026-01-20 14:35:38.413 226890 DEBUG nova.compute.manager [None req-c0dca3a9-027f-43d2-a5da-298aca3a99e8 - - - - - -] [instance: 06711d06-2cb8-4aa0-a787-db6d71e98029] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:35:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:38.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:38 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e195 e195: 3 total, 3 up, 3 in
Jan 20 09:35:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:39.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:39 np0005588920 nova_compute[226886]: 2026-01-20 14:35:39.579 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:39 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e195 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:35:39 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e196 e196: 3 total, 3 up, 3 in
Jan 20 09:35:40 np0005588920 nova_compute[226886]: 2026-01-20 14:35:40.777 226890 DEBUG oslo_concurrency.lockutils [None req-b51e87fc-a094-4dd8-a720-66ca31de57dc 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Acquiring lock "e22c5447-900e-45da-b2af-46423bc1e2d8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:35:40 np0005588920 nova_compute[226886]: 2026-01-20 14:35:40.777 226890 DEBUG oslo_concurrency.lockutils [None req-b51e87fc-a094-4dd8-a720-66ca31de57dc 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "e22c5447-900e-45da-b2af-46423bc1e2d8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:35:40 np0005588920 nova_compute[226886]: 2026-01-20 14:35:40.778 226890 DEBUG oslo_concurrency.lockutils [None req-b51e87fc-a094-4dd8-a720-66ca31de57dc 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Acquiring lock "e22c5447-900e-45da-b2af-46423bc1e2d8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:35:40 np0005588920 nova_compute[226886]: 2026-01-20 14:35:40.778 226890 DEBUG oslo_concurrency.lockutils [None req-b51e87fc-a094-4dd8-a720-66ca31de57dc 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "e22c5447-900e-45da-b2af-46423bc1e2d8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:35:40 np0005588920 nova_compute[226886]: 2026-01-20 14:35:40.778 226890 DEBUG oslo_concurrency.lockutils [None req-b51e87fc-a094-4dd8-a720-66ca31de57dc 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "e22c5447-900e-45da-b2af-46423bc1e2d8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:35:40 np0005588920 nova_compute[226886]: 2026-01-20 14:35:40.780 226890 INFO nova.compute.manager [None req-b51e87fc-a094-4dd8-a720-66ca31de57dc 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Terminating instance#033[00m
Jan 20 09:35:40 np0005588920 nova_compute[226886]: 2026-01-20 14:35:40.781 226890 DEBUG nova.compute.manager [None req-b51e87fc-a094-4dd8-a720-66ca31de57dc 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:35:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:40.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:40 np0005588920 nova_compute[226886]: 2026-01-20 14:35:40.789 226890 INFO nova.virt.libvirt.driver [-] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Instance destroyed successfully.#033[00m
Jan 20 09:35:40 np0005588920 nova_compute[226886]: 2026-01-20 14:35:40.790 226890 DEBUG nova.objects.instance [None req-b51e87fc-a094-4dd8-a720-66ca31de57dc 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lazy-loading 'resources' on Instance uuid e22c5447-900e-45da-b2af-46423bc1e2d8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:35:40 np0005588920 nova_compute[226886]: 2026-01-20 14:35:40.804 226890 DEBUG nova.virt.libvirt.vif [None req-b51e87fc-a094-4dd8-a720-66ca31de57dc 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:34:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1596237958',display_name='tempest-ImagesTestJSON-server-1596237958',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-1596237958',id=45,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:35:04Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='3750c56415134773aa9d9880038f1749',ramdisk_id='',reservation_id='r-3za8g4v3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesTestJSON-338390217',owner_user_name='tempest-ImagesTestJSON-338390217-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:35:37Z,user_data=None,user_id='56e2959629114d3d8a48e7a80ed96c4b',uuid=e22c5447-900e-45da-b2af-46423bc1e2d8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "3f3dec34-7223-4718-8c4b-bf4fdfddf3ef", "address": "fa:16:3e:e9:83:16", "network": {"id": "abb83e3e-0b12-431b-ad86-a1d271b5b46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-766235638-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3750c56415134773aa9d9880038f1749", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f3dec34-72", "ovs_interfaceid": "3f3dec34-7223-4718-8c4b-bf4fdfddf3ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:35:40 np0005588920 nova_compute[226886]: 2026-01-20 14:35:40.805 226890 DEBUG nova.network.os_vif_util [None req-b51e87fc-a094-4dd8-a720-66ca31de57dc 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Converting VIF {"id": "3f3dec34-7223-4718-8c4b-bf4fdfddf3ef", "address": "fa:16:3e:e9:83:16", "network": {"id": "abb83e3e-0b12-431b-ad86-a1d271b5b46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-766235638-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3750c56415134773aa9d9880038f1749", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f3dec34-72", "ovs_interfaceid": "3f3dec34-7223-4718-8c4b-bf4fdfddf3ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:35:40 np0005588920 nova_compute[226886]: 2026-01-20 14:35:40.805 226890 DEBUG nova.network.os_vif_util [None req-b51e87fc-a094-4dd8-a720-66ca31de57dc 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e9:83:16,bridge_name='br-int',has_traffic_filtering=True,id=3f3dec34-7223-4718-8c4b-bf4fdfddf3ef,network=Network(abb83e3e-0b12-431b-ad86-a1d271b5b46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3f3dec34-72') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:35:40 np0005588920 nova_compute[226886]: 2026-01-20 14:35:40.806 226890 DEBUG os_vif [None req-b51e87fc-a094-4dd8-a720-66ca31de57dc 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e9:83:16,bridge_name='br-int',has_traffic_filtering=True,id=3f3dec34-7223-4718-8c4b-bf4fdfddf3ef,network=Network(abb83e3e-0b12-431b-ad86-a1d271b5b46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3f3dec34-72') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:35:40 np0005588920 nova_compute[226886]: 2026-01-20 14:35:40.807 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:40 np0005588920 nova_compute[226886]: 2026-01-20 14:35:40.808 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3f3dec34-72, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:35:40 np0005588920 nova_compute[226886]: 2026-01-20 14:35:40.809 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:40 np0005588920 nova_compute[226886]: 2026-01-20 14:35:40.811 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:40 np0005588920 nova_compute[226886]: 2026-01-20 14:35:40.813 226890 INFO os_vif [None req-b51e87fc-a094-4dd8-a720-66ca31de57dc 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e9:83:16,bridge_name='br-int',has_traffic_filtering=True,id=3f3dec34-7223-4718-8c4b-bf4fdfddf3ef,network=Network(abb83e3e-0b12-431b-ad86-a1d271b5b46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3f3dec34-72')#033[00m
Jan 20 09:35:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:41.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:41 np0005588920 nova_compute[226886]: 2026-01-20 14:35:41.146 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768919726.1448133, c9a86cb2-b092-4887-b47d-1a05fb756a83 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:35:41 np0005588920 nova_compute[226886]: 2026-01-20 14:35:41.146 226890 INFO nova.compute.manager [-] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:35:41 np0005588920 nova_compute[226886]: 2026-01-20 14:35:41.168 226890 DEBUG nova.compute.manager [None req-559f7d9a-9e73-4bfd-8d52-4ddb54fd74ce - - - - - -] [instance: c9a86cb2-b092-4887-b47d-1a05fb756a83] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:35:41 np0005588920 nova_compute[226886]: 2026-01-20 14:35:41.192 226890 INFO nova.virt.libvirt.driver [None req-b51e87fc-a094-4dd8-a720-66ca31de57dc 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Deleting instance files /var/lib/nova/instances/e22c5447-900e-45da-b2af-46423bc1e2d8_del#033[00m
Jan 20 09:35:41 np0005588920 nova_compute[226886]: 2026-01-20 14:35:41.193 226890 INFO nova.virt.libvirt.driver [None req-b51e87fc-a094-4dd8-a720-66ca31de57dc 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Deletion of /var/lib/nova/instances/e22c5447-900e-45da-b2af-46423bc1e2d8_del complete#033[00m
Jan 20 09:35:41 np0005588920 nova_compute[226886]: 2026-01-20 14:35:41.243 226890 INFO nova.compute.manager [None req-b51e87fc-a094-4dd8-a720-66ca31de57dc 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Took 0.46 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:35:41 np0005588920 nova_compute[226886]: 2026-01-20 14:35:41.244 226890 DEBUG oslo.service.loopingcall [None req-b51e87fc-a094-4dd8-a720-66ca31de57dc 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:35:41 np0005588920 nova_compute[226886]: 2026-01-20 14:35:41.244 226890 DEBUG nova.compute.manager [-] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:35:41 np0005588920 nova_compute[226886]: 2026-01-20 14:35:41.244 226890 DEBUG nova.network.neutron [-] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:35:42 np0005588920 nova_compute[226886]: 2026-01-20 14:35:42.461 226890 DEBUG nova.network.neutron [-] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:35:42 np0005588920 nova_compute[226886]: 2026-01-20 14:35:42.477 226890 INFO nova.compute.manager [-] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Took 1.23 seconds to deallocate network for instance.#033[00m
Jan 20 09:35:42 np0005588920 nova_compute[226886]: 2026-01-20 14:35:42.536 226890 DEBUG oslo_concurrency.lockutils [None req-b51e87fc-a094-4dd8-a720-66ca31de57dc 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:35:42 np0005588920 nova_compute[226886]: 2026-01-20 14:35:42.536 226890 DEBUG oslo_concurrency.lockutils [None req-b51e87fc-a094-4dd8-a720-66ca31de57dc 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:35:42 np0005588920 nova_compute[226886]: 2026-01-20 14:35:42.571 226890 DEBUG nova.compute.manager [req-7d110ef3-7bd6-421c-9839-2562f309c4c9 req-919f872e-16b1-4ecb-814f-f2eeb6e6ecac 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Received event network-vif-deleted-3f3dec34-7223-4718-8c4b-bf4fdfddf3ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:35:42 np0005588920 nova_compute[226886]: 2026-01-20 14:35:42.600 226890 DEBUG oslo_concurrency.processutils [None req-b51e87fc-a094-4dd8-a720-66ca31de57dc 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:35:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e197 e197: 3 total, 3 up, 3 in
Jan 20 09:35:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:42.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:35:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:43.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:35:43 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:35:43 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1152927409' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:35:43 np0005588920 nova_compute[226886]: 2026-01-20 14:35:43.662 226890 DEBUG oslo_concurrency.processutils [None req-b51e87fc-a094-4dd8-a720-66ca31de57dc 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:35:43 np0005588920 nova_compute[226886]: 2026-01-20 14:35:43.670 226890 DEBUG nova.compute.provider_tree [None req-b51e87fc-a094-4dd8-a720-66ca31de57dc 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:35:43 np0005588920 nova_compute[226886]: 2026-01-20 14:35:43.693 226890 DEBUG nova.scheduler.client.report [None req-b51e87fc-a094-4dd8-a720-66ca31de57dc 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:35:43 np0005588920 nova_compute[226886]: 2026-01-20 14:35:43.714 226890 DEBUG oslo_concurrency.lockutils [None req-b51e87fc-a094-4dd8-a720-66ca31de57dc 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.177s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:35:43 np0005588920 nova_compute[226886]: 2026-01-20 14:35:43.736 226890 INFO nova.scheduler.client.report [None req-b51e87fc-a094-4dd8-a720-66ca31de57dc 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Deleted allocations for instance e22c5447-900e-45da-b2af-46423bc1e2d8#033[00m
Jan 20 09:35:43 np0005588920 nova_compute[226886]: 2026-01-20 14:35:43.810 226890 DEBUG oslo_concurrency.lockutils [None req-b51e87fc-a094-4dd8-a720-66ca31de57dc 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "e22c5447-900e-45da-b2af-46423bc1e2d8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.033s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:35:44 np0005588920 nova_compute[226886]: 2026-01-20 14:35:44.078 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768919729.0766187, e22c5447-900e-45da-b2af-46423bc1e2d8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:35:44 np0005588920 nova_compute[226886]: 2026-01-20 14:35:44.080 226890 INFO nova.compute.manager [-] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:35:44 np0005588920 nova_compute[226886]: 2026-01-20 14:35:44.581 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e197 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:35:44 np0005588920 nova_compute[226886]: 2026-01-20 14:35:44.762 226890 DEBUG nova.compute.manager [None req-ca7b0616-4efc-4508-8be3-d60358d680bd - - - - - -] [instance: e22c5447-900e-45da-b2af-46423bc1e2d8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:35:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:35:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:44.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:35:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:35:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:45.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:35:45 np0005588920 nova_compute[226886]: 2026-01-20 14:35:45.836 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:46 np0005588920 podman[245093]: 2026-01-20 14:35:46.037230908 +0000 UTC m=+0.100333554 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:35:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:46.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:47.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e198 e198: 3 total, 3 up, 3 in
Jan 20 09:35:47 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #61. Immutable memtables: 0.
Jan 20 09:35:47 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:35:47.559157) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 09:35:47 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 35] Flushing memtable with next log file: 61
Jan 20 09:35:47 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919747559358, "job": 35, "event": "flush_started", "num_memtables": 1, "num_entries": 2550, "num_deletes": 263, "total_data_size": 5588245, "memory_usage": 5646280, "flush_reason": "Manual Compaction"}
Jan 20 09:35:47 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 35] Level-0 flush table #62: started
Jan 20 09:35:47 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919747637862, "cf_name": "default", "job": 35, "event": "table_file_creation", "file_number": 62, "file_size": 3669107, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 30895, "largest_seqno": 33440, "table_properties": {"data_size": 3658662, "index_size": 6683, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2693, "raw_key_size": 22357, "raw_average_key_size": 21, "raw_value_size": 3637541, "raw_average_value_size": 3434, "num_data_blocks": 288, "num_entries": 1059, "num_filter_entries": 1059, "num_deletions": 263, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768919562, "oldest_key_time": 1768919562, "file_creation_time": 1768919747, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 62, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:35:47 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 35] Flush lasted 78794 microseconds, and 14049 cpu microseconds.
Jan 20 09:35:47 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:35:47 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:35:47.637947) [db/flush_job.cc:967] [default] [JOB 35] Level-0 flush table #62: 3669107 bytes OK
Jan 20 09:35:47 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:35:47.637979) [db/memtable_list.cc:519] [default] Level-0 commit table #62 started
Jan 20 09:35:47 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:35:47.641412) [db/memtable_list.cc:722] [default] Level-0 commit table #62: memtable #1 done
Jan 20 09:35:47 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:35:47.641446) EVENT_LOG_v1 {"time_micros": 1768919747641436, "job": 35, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 09:35:47 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:35:47.641473) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 09:35:47 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 35] Try to delete WAL files size 5576888, prev total WAL file size 5578587, number of live WAL files 2.
Jan 20 09:35:47 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000058.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:35:47 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:35:47.643888) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032353130' seq:72057594037927935, type:22 .. '7061786F730032373632' seq:0, type:0; will stop at (end)
Jan 20 09:35:47 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 36] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 09:35:47 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 35 Base level 0, inputs: [62(3583KB)], [60(8254KB)]
Jan 20 09:35:47 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919747643982, "job": 36, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [62], "files_L6": [60], "score": -1, "input_data_size": 12121314, "oldest_snapshot_seqno": -1}
Jan 20 09:35:47 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 36] Generated table #63: 5967 keys, 10180571 bytes, temperature: kUnknown
Jan 20 09:35:47 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919747757702, "cf_name": "default", "job": 36, "event": "table_file_creation", "file_number": 63, "file_size": 10180571, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10139702, "index_size": 24823, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14981, "raw_key_size": 151229, "raw_average_key_size": 25, "raw_value_size": 10031457, "raw_average_value_size": 1681, "num_data_blocks": 1003, "num_entries": 5967, "num_filter_entries": 5967, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768919747, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 63, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:35:47 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:35:47 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:35:47.757913) [db/compaction/compaction_job.cc:1663] [default] [JOB 36] Compacted 1@0 + 1@6 files to L6 => 10180571 bytes
Jan 20 09:35:47 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:35:47.759948) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 106.5 rd, 89.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.5, 8.1 +0.0 blob) out(9.7 +0.0 blob), read-write-amplify(6.1) write-amplify(2.8) OK, records in: 6503, records dropped: 536 output_compression: NoCompression
Jan 20 09:35:47 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:35:47.759968) EVENT_LOG_v1 {"time_micros": 1768919747759956, "job": 36, "event": "compaction_finished", "compaction_time_micros": 113782, "compaction_time_cpu_micros": 48547, "output_level": 6, "num_output_files": 1, "total_output_size": 10180571, "num_input_records": 6503, "num_output_records": 5967, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 09:35:47 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:35:47.643742) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:35:47 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:35:47.760086) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:35:47 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:35:47.760092) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:35:47 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:35:47.760094) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:35:47 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:35:47.760096) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:35:47 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:35:47.760098) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:35:47 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000062.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:35:47 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919747761034, "job": 0, "event": "table_file_deletion", "file_number": 62}
Jan 20 09:35:47 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000060.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:35:47 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919747762925, "job": 0, "event": "table_file_deletion", "file_number": 60}
Jan 20 09:35:48 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e199 e199: 3 total, 3 up, 3 in
Jan 20 09:35:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:48.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:35:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:49.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:35:49 np0005588920 nova_compute[226886]: 2026-01-20 14:35:49.583 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:49 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e199 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:35:49 np0005588920 nova_compute[226886]: 2026-01-20 14:35:49.935 226890 DEBUG oslo_concurrency.lockutils [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Acquiring lock "5600f775-64ed-487b-9706-d5500878c3c0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:35:49 np0005588920 nova_compute[226886]: 2026-01-20 14:35:49.935 226890 DEBUG oslo_concurrency.lockutils [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Lock "5600f775-64ed-487b-9706-d5500878c3c0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:35:49 np0005588920 nova_compute[226886]: 2026-01-20 14:35:49.950 226890 DEBUG nova.compute.manager [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:35:50 np0005588920 nova_compute[226886]: 2026-01-20 14:35:50.038 226890 DEBUG oslo_concurrency.lockutils [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:35:50 np0005588920 nova_compute[226886]: 2026-01-20 14:35:50.038 226890 DEBUG oslo_concurrency.lockutils [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:35:50 np0005588920 nova_compute[226886]: 2026-01-20 14:35:50.045 226890 DEBUG nova.virt.hardware [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:35:50 np0005588920 nova_compute[226886]: 2026-01-20 14:35:50.045 226890 INFO nova.compute.claims [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 09:35:50 np0005588920 nova_compute[226886]: 2026-01-20 14:35:50.297 226890 DEBUG oslo_concurrency.processutils [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:35:50 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:35:50 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/260268613' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:35:50 np0005588920 nova_compute[226886]: 2026-01-20 14:35:50.742 226890 DEBUG oslo_concurrency.processutils [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:35:50 np0005588920 nova_compute[226886]: 2026-01-20 14:35:50.748 226890 DEBUG nova.compute.provider_tree [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:35:50 np0005588920 nova_compute[226886]: 2026-01-20 14:35:50.767 226890 DEBUG nova.scheduler.client.report [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:35:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:50.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:50 np0005588920 nova_compute[226886]: 2026-01-20 14:35:50.794 226890 DEBUG oslo_concurrency.lockutils [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.756s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:35:50 np0005588920 nova_compute[226886]: 2026-01-20 14:35:50.795 226890 DEBUG nova.compute.manager [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:35:50 np0005588920 nova_compute[226886]: 2026-01-20 14:35:50.839 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:50 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e200 e200: 3 total, 3 up, 3 in
Jan 20 09:35:51 np0005588920 nova_compute[226886]: 2026-01-20 14:35:51.016 226890 DEBUG nova.compute.manager [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:35:51 np0005588920 nova_compute[226886]: 2026-01-20 14:35:51.017 226890 DEBUG nova.network.neutron [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:35:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:51.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:51 np0005588920 nova_compute[226886]: 2026-01-20 14:35:51.115 226890 INFO nova.virt.libvirt.driver [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:35:51 np0005588920 nova_compute[226886]: 2026-01-20 14:35:51.330 226890 DEBUG nova.network.neutron [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Jan 20 09:35:51 np0005588920 nova_compute[226886]: 2026-01-20 14:35:51.330 226890 DEBUG nova.compute.manager [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:35:51 np0005588920 nova_compute[226886]: 2026-01-20 14:35:51.369 226890 DEBUG nova.compute.manager [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:35:51 np0005588920 nova_compute[226886]: 2026-01-20 14:35:51.518 226890 DEBUG nova.compute.manager [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:35:51 np0005588920 nova_compute[226886]: 2026-01-20 14:35:51.519 226890 DEBUG nova.virt.libvirt.driver [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:35:51 np0005588920 nova_compute[226886]: 2026-01-20 14:35:51.520 226890 INFO nova.virt.libvirt.driver [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] Creating image(s)#033[00m
Jan 20 09:35:51 np0005588920 nova_compute[226886]: 2026-01-20 14:35:51.546 226890 DEBUG nova.storage.rbd_utils [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] rbd image 5600f775-64ed-487b-9706-d5500878c3c0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:35:51 np0005588920 nova_compute[226886]: 2026-01-20 14:35:51.582 226890 DEBUG nova.storage.rbd_utils [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] rbd image 5600f775-64ed-487b-9706-d5500878c3c0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:35:51 np0005588920 nova_compute[226886]: 2026-01-20 14:35:51.610 226890 DEBUG nova.storage.rbd_utils [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] rbd image 5600f775-64ed-487b-9706-d5500878c3c0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:35:51 np0005588920 nova_compute[226886]: 2026-01-20 14:35:51.615 226890 DEBUG oslo_concurrency.processutils [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:35:51 np0005588920 nova_compute[226886]: 2026-01-20 14:35:51.686 226890 DEBUG oslo_concurrency.processutils [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:35:51 np0005588920 nova_compute[226886]: 2026-01-20 14:35:51.687 226890 DEBUG oslo_concurrency.lockutils [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:35:51 np0005588920 nova_compute[226886]: 2026-01-20 14:35:51.688 226890 DEBUG oslo_concurrency.lockutils [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:35:51 np0005588920 nova_compute[226886]: 2026-01-20 14:35:51.689 226890 DEBUG oslo_concurrency.lockutils [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:35:51 np0005588920 nova_compute[226886]: 2026-01-20 14:35:51.718 226890 DEBUG nova.storage.rbd_utils [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] rbd image 5600f775-64ed-487b-9706-d5500878c3c0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:35:51 np0005588920 nova_compute[226886]: 2026-01-20 14:35:51.722 226890 DEBUG oslo_concurrency.processutils [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 5600f775-64ed-487b-9706-d5500878c3c0_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:35:51 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e201 e201: 3 total, 3 up, 3 in
Jan 20 09:35:51 np0005588920 nova_compute[226886]: 2026-01-20 14:35:51.969 226890 DEBUG oslo_concurrency.processutils [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 5600f775-64ed-487b-9706-d5500878c3c0_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.247s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:35:52 np0005588920 nova_compute[226886]: 2026-01-20 14:35:52.034 226890 DEBUG nova.storage.rbd_utils [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] resizing rbd image 5600f775-64ed-487b-9706-d5500878c3c0_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:35:52 np0005588920 nova_compute[226886]: 2026-01-20 14:35:52.152 226890 DEBUG nova.objects.instance [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Lazy-loading 'migration_context' on Instance uuid 5600f775-64ed-487b-9706-d5500878c3c0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:35:52 np0005588920 nova_compute[226886]: 2026-01-20 14:35:52.180 226890 DEBUG nova.virt.libvirt.driver [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:35:52 np0005588920 nova_compute[226886]: 2026-01-20 14:35:52.180 226890 DEBUG nova.virt.libvirt.driver [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] Ensure instance console log exists: /var/lib/nova/instances/5600f775-64ed-487b-9706-d5500878c3c0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:35:52 np0005588920 nova_compute[226886]: 2026-01-20 14:35:52.181 226890 DEBUG oslo_concurrency.lockutils [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:35:52 np0005588920 nova_compute[226886]: 2026-01-20 14:35:52.181 226890 DEBUG oslo_concurrency.lockutils [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:35:52 np0005588920 nova_compute[226886]: 2026-01-20 14:35:52.181 226890 DEBUG oslo_concurrency.lockutils [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:35:52 np0005588920 nova_compute[226886]: 2026-01-20 14:35:52.183 226890 DEBUG nova.virt.libvirt.driver [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:35:52 np0005588920 nova_compute[226886]: 2026-01-20 14:35:52.188 226890 WARNING nova.virt.libvirt.driver [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:35:52 np0005588920 nova_compute[226886]: 2026-01-20 14:35:52.194 226890 DEBUG nova.virt.libvirt.host [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:35:52 np0005588920 nova_compute[226886]: 2026-01-20 14:35:52.194 226890 DEBUG nova.virt.libvirt.host [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:35:52 np0005588920 nova_compute[226886]: 2026-01-20 14:35:52.199 226890 DEBUG nova.virt.libvirt.host [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:35:52 np0005588920 nova_compute[226886]: 2026-01-20 14:35:52.199 226890 DEBUG nova.virt.libvirt.host [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:35:52 np0005588920 nova_compute[226886]: 2026-01-20 14:35:52.200 226890 DEBUG nova.virt.libvirt.driver [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:35:52 np0005588920 nova_compute[226886]: 2026-01-20 14:35:52.200 226890 DEBUG nova.virt.hardware [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:35:52 np0005588920 nova_compute[226886]: 2026-01-20 14:35:52.200 226890 DEBUG nova.virt.hardware [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:35:52 np0005588920 nova_compute[226886]: 2026-01-20 14:35:52.201 226890 DEBUG nova.virt.hardware [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:35:52 np0005588920 nova_compute[226886]: 2026-01-20 14:35:52.201 226890 DEBUG nova.virt.hardware [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:35:52 np0005588920 nova_compute[226886]: 2026-01-20 14:35:52.201 226890 DEBUG nova.virt.hardware [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:35:52 np0005588920 nova_compute[226886]: 2026-01-20 14:35:52.201 226890 DEBUG nova.virt.hardware [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:35:52 np0005588920 nova_compute[226886]: 2026-01-20 14:35:52.201 226890 DEBUG nova.virt.hardware [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:35:52 np0005588920 nova_compute[226886]: 2026-01-20 14:35:52.202 226890 DEBUG nova.virt.hardware [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:35:52 np0005588920 nova_compute[226886]: 2026-01-20 14:35:52.202 226890 DEBUG nova.virt.hardware [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:35:52 np0005588920 nova_compute[226886]: 2026-01-20 14:35:52.202 226890 DEBUG nova.virt.hardware [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:35:52 np0005588920 nova_compute[226886]: 2026-01-20 14:35:52.202 226890 DEBUG nova.virt.hardware [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:35:52 np0005588920 nova_compute[226886]: 2026-01-20 14:35:52.205 226890 DEBUG oslo_concurrency.processutils [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:35:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:35:52 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/635606860' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:35:52 np0005588920 nova_compute[226886]: 2026-01-20 14:35:52.601 226890 DEBUG oslo_concurrency.processutils [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.396s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:35:52 np0005588920 nova_compute[226886]: 2026-01-20 14:35:52.630 226890 DEBUG nova.storage.rbd_utils [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] rbd image 5600f775-64ed-487b-9706-d5500878c3c0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:35:52 np0005588920 nova_compute[226886]: 2026-01-20 14:35:52.636 226890 DEBUG oslo_concurrency.processutils [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:35:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:35:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:52.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:35:53 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:35:53 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1932494090' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:35:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:35:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:53.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:35:53 np0005588920 nova_compute[226886]: 2026-01-20 14:35:53.096 226890 DEBUG oslo_concurrency.processutils [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:35:53 np0005588920 nova_compute[226886]: 2026-01-20 14:35:53.099 226890 DEBUG nova.objects.instance [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Lazy-loading 'pci_devices' on Instance uuid 5600f775-64ed-487b-9706-d5500878c3c0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:35:53 np0005588920 nova_compute[226886]: 2026-01-20 14:35:53.198 226890 DEBUG nova.virt.libvirt.driver [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:35:53 np0005588920 nova_compute[226886]:  <uuid>5600f775-64ed-487b-9706-d5500878c3c0</uuid>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:  <name>instance-00000033</name>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:35:53 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:      <nova:name>tempest-ServersAdminNegativeTestJSON-server-695045971</nova:name>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:35:52</nova:creationTime>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:35:53 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:        <nova:user uuid="ecab37cbd7714ddd81e1db5b37ba85b3">tempest-ServersAdminNegativeTestJSON-1522974762-project-member</nova:user>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:        <nova:project uuid="b6594bd13c35449abc258d30a1a2509b">tempest-ServersAdminNegativeTestJSON-1522974762</nova:project>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:      <nova:ports/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:      <entry name="serial">5600f775-64ed-487b-9706-d5500878c3c0</entry>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:      <entry name="uuid">5600f775-64ed-487b-9706-d5500878c3c0</entry>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:35:53 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/5600f775-64ed-487b-9706-d5500878c3c0_disk">
Jan 20 09:35:53 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:35:53 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:35:53 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/5600f775-64ed-487b-9706-d5500878c3c0_disk.config">
Jan 20 09:35:53 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:35:53 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:35:53 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/5600f775-64ed-487b-9706-d5500878c3c0/console.log" append="off"/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:35:53 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:35:53 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:35:53 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:35:53 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:35:53 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:35:53 np0005588920 nova_compute[226886]: 2026-01-20 14:35:53.275 226890 DEBUG nova.virt.libvirt.driver [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:35:53 np0005588920 nova_compute[226886]: 2026-01-20 14:35:53.275 226890 DEBUG nova.virt.libvirt.driver [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:35:53 np0005588920 nova_compute[226886]: 2026-01-20 14:35:53.276 226890 INFO nova.virt.libvirt.driver [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] Using config drive#033[00m
Jan 20 09:35:53 np0005588920 nova_compute[226886]: 2026-01-20 14:35:53.302 226890 DEBUG nova.storage.rbd_utils [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] rbd image 5600f775-64ed-487b-9706-d5500878c3c0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:35:53 np0005588920 nova_compute[226886]: 2026-01-20 14:35:53.611 226890 INFO nova.virt.libvirt.driver [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] Creating config drive at /var/lib/nova/instances/5600f775-64ed-487b-9706-d5500878c3c0/disk.config#033[00m
Jan 20 09:35:53 np0005588920 nova_compute[226886]: 2026-01-20 14:35:53.616 226890 DEBUG oslo_concurrency.processutils [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5600f775-64ed-487b-9706-d5500878c3c0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1me48onm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:35:53 np0005588920 nova_compute[226886]: 2026-01-20 14:35:53.744 226890 DEBUG oslo_concurrency.processutils [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5600f775-64ed-487b-9706-d5500878c3c0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1me48onm" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:35:53 np0005588920 nova_compute[226886]: 2026-01-20 14:35:53.776 226890 DEBUG nova.storage.rbd_utils [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] rbd image 5600f775-64ed-487b-9706-d5500878c3c0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:35:53 np0005588920 nova_compute[226886]: 2026-01-20 14:35:53.780 226890 DEBUG oslo_concurrency.processutils [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/5600f775-64ed-487b-9706-d5500878c3c0/disk.config 5600f775-64ed-487b-9706-d5500878c3c0_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:35:53 np0005588920 nova_compute[226886]: 2026-01-20 14:35:53.935 226890 DEBUG oslo_concurrency.processutils [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/5600f775-64ed-487b-9706-d5500878c3c0/disk.config 5600f775-64ed-487b-9706-d5500878c3c0_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.155s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:35:53 np0005588920 nova_compute[226886]: 2026-01-20 14:35:53.936 226890 INFO nova.virt.libvirt.driver [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] Deleting local config drive /var/lib/nova/instances/5600f775-64ed-487b-9706-d5500878c3c0/disk.config because it was imported into RBD.#033[00m
Jan 20 09:35:54 np0005588920 systemd-machined[196121]: New machine qemu-22-instance-00000033.
Jan 20 09:35:54 np0005588920 systemd[1]: Started Virtual Machine qemu-22-instance-00000033.
Jan 20 09:35:54 np0005588920 nova_compute[226886]: 2026-01-20 14:35:54.584 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:54 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e201 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:35:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:54.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:35:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:55.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:35:55 np0005588920 nova_compute[226886]: 2026-01-20 14:35:55.221 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919755.2202759, 5600f775-64ed-487b-9706-d5500878c3c0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:35:55 np0005588920 nova_compute[226886]: 2026-01-20 14:35:55.221 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:35:55 np0005588920 nova_compute[226886]: 2026-01-20 14:35:55.227 226890 DEBUG nova.compute.manager [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:35:55 np0005588920 nova_compute[226886]: 2026-01-20 14:35:55.228 226890 DEBUG nova.virt.libvirt.driver [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:35:55 np0005588920 nova_compute[226886]: 2026-01-20 14:35:55.233 226890 INFO nova.virt.libvirt.driver [-] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] Instance spawned successfully.#033[00m
Jan 20 09:35:55 np0005588920 nova_compute[226886]: 2026-01-20 14:35:55.233 226890 DEBUG nova.virt.libvirt.driver [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:35:55 np0005588920 nova_compute[226886]: 2026-01-20 14:35:55.251 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:35:55 np0005588920 nova_compute[226886]: 2026-01-20 14:35:55.260 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:35:55 np0005588920 nova_compute[226886]: 2026-01-20 14:35:55.267 226890 DEBUG nova.virt.libvirt.driver [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:35:55 np0005588920 nova_compute[226886]: 2026-01-20 14:35:55.267 226890 DEBUG nova.virt.libvirt.driver [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:35:55 np0005588920 nova_compute[226886]: 2026-01-20 14:35:55.268 226890 DEBUG nova.virt.libvirt.driver [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:35:55 np0005588920 nova_compute[226886]: 2026-01-20 14:35:55.269 226890 DEBUG nova.virt.libvirt.driver [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:35:55 np0005588920 nova_compute[226886]: 2026-01-20 14:35:55.269 226890 DEBUG nova.virt.libvirt.driver [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:35:55 np0005588920 nova_compute[226886]: 2026-01-20 14:35:55.270 226890 DEBUG nova.virt.libvirt.driver [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:35:55 np0005588920 nova_compute[226886]: 2026-01-20 14:35:55.279 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:35:55 np0005588920 nova_compute[226886]: 2026-01-20 14:35:55.280 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919755.226701, 5600f775-64ed-487b-9706-d5500878c3c0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:35:55 np0005588920 nova_compute[226886]: 2026-01-20 14:35:55.280 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] VM Started (Lifecycle Event)#033[00m
Jan 20 09:35:55 np0005588920 nova_compute[226886]: 2026-01-20 14:35:55.350 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:35:55 np0005588920 nova_compute[226886]: 2026-01-20 14:35:55.355 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:35:55 np0005588920 nova_compute[226886]: 2026-01-20 14:35:55.526 226890 INFO nova.compute.manager [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] Took 4.01 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:35:55 np0005588920 nova_compute[226886]: 2026-01-20 14:35:55.527 226890 DEBUG nova.compute.manager [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:35:55 np0005588920 nova_compute[226886]: 2026-01-20 14:35:55.559 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:35:55 np0005588920 nova_compute[226886]: 2026-01-20 14:35:55.596 226890 INFO nova.compute.manager [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] Took 5.58 seconds to build instance.#033[00m
Jan 20 09:35:55 np0005588920 nova_compute[226886]: 2026-01-20 14:35:55.768 226890 DEBUG oslo_concurrency.lockutils [None req-b10d23e3-7b3a-4805-bab9-076a716eae47 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Lock "5600f775-64ed-487b-9706-d5500878c3c0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.833s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:35:55 np0005588920 nova_compute[226886]: 2026-01-20 14:35:55.842 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:56 np0005588920 nova_compute[226886]: 2026-01-20 14:35:56.049 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:56 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:56.049 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:35:56 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:35:56.051 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:35:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:35:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:56.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:35:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:57.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e202 e202: 3 total, 3 up, 3 in
Jan 20 09:35:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:58.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:58 np0005588920 podman[245485]: 2026-01-20 14:35:58.963966209 +0000 UTC m=+0.051364591 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 20 09:35:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:35:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:59.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:59 np0005588920 nova_compute[226886]: 2026-01-20 14:35:59.586 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:59 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e202 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:36:00 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:36:00 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:36:00 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:36:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:36:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:00.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:36:00 np0005588920 nova_compute[226886]: 2026-01-20 14:36:00.844 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:01.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:01 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e203 e203: 3 total, 3 up, 3 in
Jan 20 09:36:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e204 e204: 3 total, 3 up, 3 in
Jan 20 09:36:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:02.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:03.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:03 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e205 e205: 3 total, 3 up, 3 in
Jan 20 09:36:04 np0005588920 nova_compute[226886]: 2026-01-20 14:36:04.587 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:04 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e205 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:36:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:36:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:04.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:36:05 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:05.053 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:36:05 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e206 e206: 3 total, 3 up, 3 in
Jan 20 09:36:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:05.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:05 np0005588920 nova_compute[226886]: 2026-01-20 14:36:05.846 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:06 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e207 e207: 3 total, 3 up, 3 in
Jan 20 09:36:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:06.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:07.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:07 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:36:07 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:36:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e208 e208: 3 total, 3 up, 3 in
Jan 20 09:36:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:08.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:09.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e209 e209: 3 total, 3 up, 3 in
Jan 20 09:36:09 np0005588920 nova_compute[226886]: 2026-01-20 14:36:09.588 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e209 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:36:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:36:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:10.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:36:10 np0005588920 nova_compute[226886]: 2026-01-20 14:36:10.849 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:11.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:11 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e210 e210: 3 total, 3 up, 3 in
Jan 20 09:36:12 np0005588920 ovn_controller[133971]: 2026-01-20T14:36:12Z|00136|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Jan 20 09:36:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e211 e211: 3 total, 3 up, 3 in
Jan 20 09:36:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:36:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:12.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:36:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:13.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:14 np0005588920 nova_compute[226886]: 2026-01-20 14:36:14.591 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e211 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:36:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:36:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:14.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:36:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:15.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:15 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e212 e212: 3 total, 3 up, 3 in
Jan 20 09:36:15 np0005588920 nova_compute[226886]: 2026-01-20 14:36:15.852 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:16.438 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:36:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:16.439 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:36:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:16.439 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:36:16 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e213 e213: 3 total, 3 up, 3 in
Jan 20 09:36:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:16.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:17 np0005588920 podman[245690]: 2026-01-20 14:36:17.019089973 +0000 UTC m=+0.102464954 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 20 09:36:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:36:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:17.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:36:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e214 e214: 3 total, 3 up, 3 in
Jan 20 09:36:17 np0005588920 nova_compute[226886]: 2026-01-20 14:36:17.958 226890 DEBUG oslo_concurrency.lockutils [None req-804faf35-a3d1-47f5-99e7-ebce3984b18c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Acquiring lock "5600f775-64ed-487b-9706-d5500878c3c0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:36:17 np0005588920 nova_compute[226886]: 2026-01-20 14:36:17.959 226890 DEBUG oslo_concurrency.lockutils [None req-804faf35-a3d1-47f5-99e7-ebce3984b18c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Lock "5600f775-64ed-487b-9706-d5500878c3c0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:36:17 np0005588920 nova_compute[226886]: 2026-01-20 14:36:17.959 226890 DEBUG oslo_concurrency.lockutils [None req-804faf35-a3d1-47f5-99e7-ebce3984b18c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Acquiring lock "5600f775-64ed-487b-9706-d5500878c3c0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:36:17 np0005588920 nova_compute[226886]: 2026-01-20 14:36:17.959 226890 DEBUG oslo_concurrency.lockutils [None req-804faf35-a3d1-47f5-99e7-ebce3984b18c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Lock "5600f775-64ed-487b-9706-d5500878c3c0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:36:17 np0005588920 nova_compute[226886]: 2026-01-20 14:36:17.960 226890 DEBUG oslo_concurrency.lockutils [None req-804faf35-a3d1-47f5-99e7-ebce3984b18c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Lock "5600f775-64ed-487b-9706-d5500878c3c0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:36:17 np0005588920 nova_compute[226886]: 2026-01-20 14:36:17.961 226890 INFO nova.compute.manager [None req-804faf35-a3d1-47f5-99e7-ebce3984b18c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] Terminating instance#033[00m
Jan 20 09:36:17 np0005588920 nova_compute[226886]: 2026-01-20 14:36:17.961 226890 DEBUG oslo_concurrency.lockutils [None req-804faf35-a3d1-47f5-99e7-ebce3984b18c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Acquiring lock "refresh_cache-5600f775-64ed-487b-9706-d5500878c3c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:36:17 np0005588920 nova_compute[226886]: 2026-01-20 14:36:17.962 226890 DEBUG oslo_concurrency.lockutils [None req-804faf35-a3d1-47f5-99e7-ebce3984b18c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Acquired lock "refresh_cache-5600f775-64ed-487b-9706-d5500878c3c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:36:17 np0005588920 nova_compute[226886]: 2026-01-20 14:36:17.962 226890 DEBUG nova.network.neutron [None req-804faf35-a3d1-47f5-99e7-ebce3984b18c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:36:18 np0005588920 nova_compute[226886]: 2026-01-20 14:36:18.174 226890 DEBUG nova.network.neutron [None req-804faf35-a3d1-47f5-99e7-ebce3984b18c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:36:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:36:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:18.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:36:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:19.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:19 np0005588920 nova_compute[226886]: 2026-01-20 14:36:19.243 226890 DEBUG nova.network.neutron [None req-804faf35-a3d1-47f5-99e7-ebce3984b18c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:36:19 np0005588920 nova_compute[226886]: 2026-01-20 14:36:19.261 226890 DEBUG oslo_concurrency.lockutils [None req-804faf35-a3d1-47f5-99e7-ebce3984b18c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Releasing lock "refresh_cache-5600f775-64ed-487b-9706-d5500878c3c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:36:19 np0005588920 nova_compute[226886]: 2026-01-20 14:36:19.262 226890 DEBUG nova.compute.manager [None req-804faf35-a3d1-47f5-99e7-ebce3984b18c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:36:19 np0005588920 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d00000033.scope: Deactivated successfully.
Jan 20 09:36:19 np0005588920 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d00000033.scope: Consumed 13.445s CPU time.
Jan 20 09:36:19 np0005588920 systemd-machined[196121]: Machine qemu-22-instance-00000033 terminated.
Jan 20 09:36:19 np0005588920 nova_compute[226886]: 2026-01-20 14:36:19.488 226890 INFO nova.virt.libvirt.driver [-] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] Instance destroyed successfully.#033[00m
Jan 20 09:36:19 np0005588920 nova_compute[226886]: 2026-01-20 14:36:19.489 226890 DEBUG nova.objects.instance [None req-804faf35-a3d1-47f5-99e7-ebce3984b18c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Lazy-loading 'resources' on Instance uuid 5600f775-64ed-487b-9706-d5500878c3c0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:36:19 np0005588920 nova_compute[226886]: 2026-01-20 14:36:19.594 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:19 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e214 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:36:19 np0005588920 nova_compute[226886]: 2026-01-20 14:36:19.902 226890 INFO nova.virt.libvirt.driver [None req-804faf35-a3d1-47f5-99e7-ebce3984b18c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] Deleting instance files /var/lib/nova/instances/5600f775-64ed-487b-9706-d5500878c3c0_del#033[00m
Jan 20 09:36:19 np0005588920 nova_compute[226886]: 2026-01-20 14:36:19.903 226890 INFO nova.virt.libvirt.driver [None req-804faf35-a3d1-47f5-99e7-ebce3984b18c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] Deletion of /var/lib/nova/instances/5600f775-64ed-487b-9706-d5500878c3c0_del complete#033[00m
Jan 20 09:36:19 np0005588920 nova_compute[226886]: 2026-01-20 14:36:19.942 226890 INFO nova.compute.manager [None req-804faf35-a3d1-47f5-99e7-ebce3984b18c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] Took 0.68 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:36:19 np0005588920 nova_compute[226886]: 2026-01-20 14:36:19.943 226890 DEBUG oslo.service.loopingcall [None req-804faf35-a3d1-47f5-99e7-ebce3984b18c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:36:19 np0005588920 nova_compute[226886]: 2026-01-20 14:36:19.943 226890 DEBUG nova.compute.manager [-] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:36:19 np0005588920 nova_compute[226886]: 2026-01-20 14:36:19.943 226890 DEBUG nova.network.neutron [-] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:36:20 np0005588920 nova_compute[226886]: 2026-01-20 14:36:20.076 226890 DEBUG nova.network.neutron [-] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:36:20 np0005588920 nova_compute[226886]: 2026-01-20 14:36:20.093 226890 DEBUG nova.network.neutron [-] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:36:20 np0005588920 nova_compute[226886]: 2026-01-20 14:36:20.119 226890 INFO nova.compute.manager [-] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] Took 0.18 seconds to deallocate network for instance.#033[00m
Jan 20 09:36:20 np0005588920 nova_compute[226886]: 2026-01-20 14:36:20.174 226890 DEBUG oslo_concurrency.lockutils [None req-804faf35-a3d1-47f5-99e7-ebce3984b18c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:36:20 np0005588920 nova_compute[226886]: 2026-01-20 14:36:20.175 226890 DEBUG oslo_concurrency.lockutils [None req-804faf35-a3d1-47f5-99e7-ebce3984b18c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:36:20 np0005588920 nova_compute[226886]: 2026-01-20 14:36:20.243 226890 DEBUG oslo_concurrency.processutils [None req-804faf35-a3d1-47f5-99e7-ebce3984b18c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:36:20 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:36:20 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3748006424' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:36:20 np0005588920 nova_compute[226886]: 2026-01-20 14:36:20.711 226890 DEBUG oslo_concurrency.processutils [None req-804faf35-a3d1-47f5-99e7-ebce3984b18c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:36:20 np0005588920 nova_compute[226886]: 2026-01-20 14:36:20.720 226890 DEBUG nova.compute.provider_tree [None req-804faf35-a3d1-47f5-99e7-ebce3984b18c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:36:20 np0005588920 nova_compute[226886]: 2026-01-20 14:36:20.735 226890 DEBUG nova.scheduler.client.report [None req-804faf35-a3d1-47f5-99e7-ebce3984b18c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:36:20 np0005588920 nova_compute[226886]: 2026-01-20 14:36:20.764 226890 DEBUG oslo_concurrency.lockutils [None req-804faf35-a3d1-47f5-99e7-ebce3984b18c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.590s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:36:20 np0005588920 nova_compute[226886]: 2026-01-20 14:36:20.791 226890 INFO nova.scheduler.client.report [None req-804faf35-a3d1-47f5-99e7-ebce3984b18c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Deleted allocations for instance 5600f775-64ed-487b-9706-d5500878c3c0#033[00m
Jan 20 09:36:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:36:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:20.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:36:20 np0005588920 nova_compute[226886]: 2026-01-20 14:36:20.848 226890 DEBUG oslo_concurrency.lockutils [None req-804faf35-a3d1-47f5-99e7-ebce3984b18c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Lock "5600f775-64ed-487b-9706-d5500878c3c0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.889s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:36:20 np0005588920 nova_compute[226886]: 2026-01-20 14:36:20.856 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:21.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:21 np0005588920 nova_compute[226886]: 2026-01-20 14:36:21.465 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:36:21 np0005588920 nova_compute[226886]: 2026-01-20 14:36:21.479 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:36:21 np0005588920 nova_compute[226886]: 2026-01-20 14:36:21.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:36:21 np0005588920 nova_compute[226886]: 2026-01-20 14:36:21.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:36:21 np0005588920 nova_compute[226886]: 2026-01-20 14:36:21.769 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 09:36:21 np0005588920 nova_compute[226886]: 2026-01-20 14:36:21.769 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:36:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e215 e215: 3 total, 3 up, 3 in
Jan 20 09:36:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:36:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:22.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:36:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:23.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:24 np0005588920 nova_compute[226886]: 2026-01-20 14:36:24.597 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:24 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e215 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:36:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:24.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:25.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:25 np0005588920 nova_compute[226886]: 2026-01-20 14:36:25.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:36:25 np0005588920 nova_compute[226886]: 2026-01-20 14:36:25.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:36:25 np0005588920 nova_compute[226886]: 2026-01-20 14:36:25.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:36:25 np0005588920 nova_compute[226886]: 2026-01-20 14:36:25.858 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:26 np0005588920 nova_compute[226886]: 2026-01-20 14:36:26.721 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:36:26 np0005588920 nova_compute[226886]: 2026-01-20 14:36:26.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:36:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:36:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:26.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:36:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:27.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:27 np0005588920 nova_compute[226886]: 2026-01-20 14:36:27.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:36:27 np0005588920 nova_compute[226886]: 2026-01-20 14:36:27.751 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:36:27 np0005588920 nova_compute[226886]: 2026-01-20 14:36:27.752 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:36:27 np0005588920 nova_compute[226886]: 2026-01-20 14:36:27.752 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:36:27 np0005588920 nova_compute[226886]: 2026-01-20 14:36:27.752 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:36:27 np0005588920 nova_compute[226886]: 2026-01-20 14:36:27.753 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:36:28 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:36:28 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2774003200' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:36:28 np0005588920 nova_compute[226886]: 2026-01-20 14:36:28.183 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:36:28 np0005588920 nova_compute[226886]: 2026-01-20 14:36:28.346 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:36:28 np0005588920 nova_compute[226886]: 2026-01-20 14:36:28.347 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4651MB free_disk=20.95738983154297GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:36:28 np0005588920 nova_compute[226886]: 2026-01-20 14:36:28.348 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:36:28 np0005588920 nova_compute[226886]: 2026-01-20 14:36:28.348 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:36:28 np0005588920 nova_compute[226886]: 2026-01-20 14:36:28.414 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:36:28 np0005588920 nova_compute[226886]: 2026-01-20 14:36:28.415 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:36:28 np0005588920 nova_compute[226886]: 2026-01-20 14:36:28.428 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:36:28 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:36:28 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3876039807' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:36:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:28.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:28 np0005588920 nova_compute[226886]: 2026-01-20 14:36:28.837 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.410s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:36:28 np0005588920 nova_compute[226886]: 2026-01-20 14:36:28.844 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:36:28 np0005588920 nova_compute[226886]: 2026-01-20 14:36:28.861 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:36:28 np0005588920 nova_compute[226886]: 2026-01-20 14:36:28.901 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:36:28 np0005588920 nova_compute[226886]: 2026-01-20 14:36:28.901 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.553s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:36:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:29.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:29 np0005588920 nova_compute[226886]: 2026-01-20 14:36:29.598 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e215 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:36:29 np0005588920 nova_compute[226886]: 2026-01-20 14:36:29.902 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:36:29 np0005588920 podman[245805]: 2026-01-20 14:36:29.973673368 +0000 UTC m=+0.055392814 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 20 09:36:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:36:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:30.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:36:30 np0005588920 nova_compute[226886]: 2026-01-20 14:36:30.860 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:36:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:31.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:36:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e216 e216: 3 total, 3 up, 3 in
Jan 20 09:36:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e217 e217: 3 total, 3 up, 3 in
Jan 20 09:36:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:36:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:32.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:36:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:33.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:34 np0005588920 nova_compute[226886]: 2026-01-20 14:36:34.486 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768919779.4855971, 5600f775-64ed-487b-9706-d5500878c3c0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:36:34 np0005588920 nova_compute[226886]: 2026-01-20 14:36:34.487 226890 INFO nova.compute.manager [-] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:36:34 np0005588920 nova_compute[226886]: 2026-01-20 14:36:34.506 226890 DEBUG nova.compute.manager [None req-28fd13f2-445f-4d9f-a59c-f3b6cec84bfe - - - - - -] [instance: 5600f775-64ed-487b-9706-d5500878c3c0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:36:34 np0005588920 nova_compute[226886]: 2026-01-20 14:36:34.599 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:34 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e217 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:36:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:36:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:34.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:36:34 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e218 e218: 3 total, 3 up, 3 in
Jan 20 09:36:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:35.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:35 np0005588920 nova_compute[226886]: 2026-01-20 14:36:35.863 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:36 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e219 e219: 3 total, 3 up, 3 in
Jan 20 09:36:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:36:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:36.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:36:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:36:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:37.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:36:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:36:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:38.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:36:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:39.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:39 np0005588920 nova_compute[226886]: 2026-01-20 14:36:39.603 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:39 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:36:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:36:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:40.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:36:40 np0005588920 nova_compute[226886]: 2026-01-20 14:36:40.865 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:41.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e220 e220: 3 total, 3 up, 3 in
Jan 20 09:36:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:42.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:36:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:43.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:36:43 np0005588920 nova_compute[226886]: 2026-01-20 14:36:43.610 226890 DEBUG oslo_concurrency.lockutils [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Acquiring lock "f59628d0-8f85-42c2-93ff-a052df4ac20e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:36:43 np0005588920 nova_compute[226886]: 2026-01-20 14:36:43.610 226890 DEBUG oslo_concurrency.lockutils [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Lock "f59628d0-8f85-42c2-93ff-a052df4ac20e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:36:43 np0005588920 nova_compute[226886]: 2026-01-20 14:36:43.627 226890 DEBUG nova.compute.manager [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:36:43 np0005588920 nova_compute[226886]: 2026-01-20 14:36:43.634 226890 DEBUG oslo_concurrency.lockutils [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Acquiring lock "2557de26-ca24-4910-aa11-c697dc150296" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:36:43 np0005588920 nova_compute[226886]: 2026-01-20 14:36:43.634 226890 DEBUG oslo_concurrency.lockutils [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "2557de26-ca24-4910-aa11-c697dc150296" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:36:43 np0005588920 nova_compute[226886]: 2026-01-20 14:36:43.658 226890 DEBUG nova.compute.manager [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:36:43 np0005588920 nova_compute[226886]: 2026-01-20 14:36:43.720 226890 DEBUG oslo_concurrency.lockutils [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:36:43 np0005588920 nova_compute[226886]: 2026-01-20 14:36:43.720 226890 DEBUG oslo_concurrency.lockutils [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:36:43 np0005588920 nova_compute[226886]: 2026-01-20 14:36:43.728 226890 DEBUG nova.virt.hardware [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:36:43 np0005588920 nova_compute[226886]: 2026-01-20 14:36:43.729 226890 INFO nova.compute.claims [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 09:36:43 np0005588920 nova_compute[226886]: 2026-01-20 14:36:43.755 226890 DEBUG oslo_concurrency.lockutils [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:36:43 np0005588920 nova_compute[226886]: 2026-01-20 14:36:43.909 226890 DEBUG oslo_concurrency.processutils [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:36:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:36:44 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/658670013' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:36:44 np0005588920 nova_compute[226886]: 2026-01-20 14:36:44.393 226890 DEBUG oslo_concurrency.processutils [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:36:44 np0005588920 nova_compute[226886]: 2026-01-20 14:36:44.399 226890 DEBUG nova.compute.provider_tree [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:36:44 np0005588920 nova_compute[226886]: 2026-01-20 14:36:44.421 226890 DEBUG nova.scheduler.client.report [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:36:44 np0005588920 nova_compute[226886]: 2026-01-20 14:36:44.459 226890 DEBUG oslo_concurrency.lockutils [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.738s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:36:44 np0005588920 nova_compute[226886]: 2026-01-20 14:36:44.459 226890 DEBUG nova.compute.manager [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:36:44 np0005588920 nova_compute[226886]: 2026-01-20 14:36:44.462 226890 DEBUG oslo_concurrency.lockutils [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.707s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:36:44 np0005588920 nova_compute[226886]: 2026-01-20 14:36:44.468 226890 DEBUG nova.virt.hardware [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:36:44 np0005588920 nova_compute[226886]: 2026-01-20 14:36:44.468 226890 INFO nova.compute.claims [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 09:36:44 np0005588920 nova_compute[226886]: 2026-01-20 14:36:44.539 226890 DEBUG nova.compute.manager [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:36:44 np0005588920 nova_compute[226886]: 2026-01-20 14:36:44.539 226890 DEBUG nova.network.neutron [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:36:44 np0005588920 nova_compute[226886]: 2026-01-20 14:36:44.563 226890 INFO nova.virt.libvirt.driver [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:36:44 np0005588920 nova_compute[226886]: 2026-01-20 14:36:44.581 226890 DEBUG nova.compute.manager [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:36:44 np0005588920 nova_compute[226886]: 2026-01-20 14:36:44.605 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:44 np0005588920 nova_compute[226886]: 2026-01-20 14:36:44.662 226890 DEBUG oslo_concurrency.processutils [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:36:44 np0005588920 nova_compute[226886]: 2026-01-20 14:36:44.695 226890 DEBUG nova.compute.manager [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:36:44 np0005588920 nova_compute[226886]: 2026-01-20 14:36:44.698 226890 DEBUG nova.virt.libvirt.driver [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:36:44 np0005588920 nova_compute[226886]: 2026-01-20 14:36:44.699 226890 INFO nova.virt.libvirt.driver [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Creating image(s)#033[00m
Jan 20 09:36:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e220 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:36:44 np0005588920 nova_compute[226886]: 2026-01-20 14:36:44.732 226890 DEBUG nova.storage.rbd_utils [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] rbd image f59628d0-8f85-42c2-93ff-a052df4ac20e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:36:44 np0005588920 nova_compute[226886]: 2026-01-20 14:36:44.766 226890 DEBUG nova.storage.rbd_utils [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] rbd image f59628d0-8f85-42c2-93ff-a052df4ac20e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:36:44 np0005588920 nova_compute[226886]: 2026-01-20 14:36:44.793 226890 DEBUG nova.storage.rbd_utils [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] rbd image f59628d0-8f85-42c2-93ff-a052df4ac20e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:36:44 np0005588920 nova_compute[226886]: 2026-01-20 14:36:44.798 226890 DEBUG oslo_concurrency.processutils [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:36:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:36:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:44.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:36:44 np0005588920 nova_compute[226886]: 2026-01-20 14:36:44.892 226890 DEBUG oslo_concurrency.processutils [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:36:44 np0005588920 nova_compute[226886]: 2026-01-20 14:36:44.893 226890 DEBUG oslo_concurrency.lockutils [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:36:44 np0005588920 nova_compute[226886]: 2026-01-20 14:36:44.894 226890 DEBUG oslo_concurrency.lockutils [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:36:44 np0005588920 nova_compute[226886]: 2026-01-20 14:36:44.895 226890 DEBUG oslo_concurrency.lockutils [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:36:44 np0005588920 nova_compute[226886]: 2026-01-20 14:36:44.922 226890 DEBUG nova.storage.rbd_utils [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] rbd image f59628d0-8f85-42c2-93ff-a052df4ac20e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:36:44 np0005588920 nova_compute[226886]: 2026-01-20 14:36:44.930 226890 DEBUG oslo_concurrency.processutils [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 f59628d0-8f85-42c2-93ff-a052df4ac20e_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:36:44 np0005588920 nova_compute[226886]: 2026-01-20 14:36:44.958 226890 DEBUG nova.policy [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '592a0204f38a4596ab1ab81774214a6d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7d78990d13704d629a8a3e8910d005c5', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:36:45 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:36:45 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1444865890' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:36:45 np0005588920 nova_compute[226886]: 2026-01-20 14:36:45.105 226890 DEBUG oslo_concurrency.processutils [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:36:45 np0005588920 nova_compute[226886]: 2026-01-20 14:36:45.114 226890 DEBUG nova.compute.provider_tree [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:36:45 np0005588920 nova_compute[226886]: 2026-01-20 14:36:45.134 226890 DEBUG nova.scheduler.client.report [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:36:45 np0005588920 nova_compute[226886]: 2026-01-20 14:36:45.154 226890 DEBUG oslo_concurrency.lockutils [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.692s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:36:45 np0005588920 nova_compute[226886]: 2026-01-20 14:36:45.154 226890 DEBUG nova.compute.manager [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:36:45 np0005588920 nova_compute[226886]: 2026-01-20 14:36:45.200 226890 DEBUG nova.compute.manager [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:36:45 np0005588920 nova_compute[226886]: 2026-01-20 14:36:45.201 226890 DEBUG nova.network.neutron [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:36:45 np0005588920 nova_compute[226886]: 2026-01-20 14:36:45.220 226890 INFO nova.virt.libvirt.driver [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:36:45 np0005588920 nova_compute[226886]: 2026-01-20 14:36:45.224 226890 DEBUG oslo_concurrency.processutils [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 f59628d0-8f85-42c2-93ff-a052df4ac20e_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.294s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:36:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:36:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:45.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:36:45 np0005588920 nova_compute[226886]: 2026-01-20 14:36:45.258 226890 DEBUG nova.compute.manager [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:36:45 np0005588920 nova_compute[226886]: 2026-01-20 14:36:45.301 226890 DEBUG nova.storage.rbd_utils [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] resizing rbd image f59628d0-8f85-42c2-93ff-a052df4ac20e_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:36:45 np0005588920 nova_compute[226886]: 2026-01-20 14:36:45.384 226890 DEBUG nova.objects.instance [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Lazy-loading 'migration_context' on Instance uuid f59628d0-8f85-42c2-93ff-a052df4ac20e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:36:45 np0005588920 nova_compute[226886]: 2026-01-20 14:36:45.419 226890 DEBUG nova.virt.libvirt.driver [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:36:45 np0005588920 nova_compute[226886]: 2026-01-20 14:36:45.420 226890 DEBUG nova.virt.libvirt.driver [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Ensure instance console log exists: /var/lib/nova/instances/f59628d0-8f85-42c2-93ff-a052df4ac20e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:36:45 np0005588920 nova_compute[226886]: 2026-01-20 14:36:45.420 226890 DEBUG oslo_concurrency.lockutils [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:36:45 np0005588920 nova_compute[226886]: 2026-01-20 14:36:45.420 226890 DEBUG oslo_concurrency.lockutils [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:36:45 np0005588920 nova_compute[226886]: 2026-01-20 14:36:45.420 226890 DEBUG oslo_concurrency.lockutils [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:36:45 np0005588920 nova_compute[226886]: 2026-01-20 14:36:45.513 226890 DEBUG nova.policy [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '56e2959629114d3d8a48e7a80ed96c4b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3750c56415134773aa9d9880038f1749', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:36:45 np0005588920 nova_compute[226886]: 2026-01-20 14:36:45.530 226890 DEBUG nova.compute.manager [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:36:45 np0005588920 nova_compute[226886]: 2026-01-20 14:36:45.531 226890 DEBUG nova.virt.libvirt.driver [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:36:45 np0005588920 nova_compute[226886]: 2026-01-20 14:36:45.531 226890 INFO nova.virt.libvirt.driver [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Creating image(s)#033[00m
Jan 20 09:36:45 np0005588920 nova_compute[226886]: 2026-01-20 14:36:45.556 226890 DEBUG nova.storage.rbd_utils [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] rbd image 2557de26-ca24-4910-aa11-c697dc150296_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:36:45 np0005588920 nova_compute[226886]: 2026-01-20 14:36:45.590 226890 DEBUG nova.storage.rbd_utils [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] rbd image 2557de26-ca24-4910-aa11-c697dc150296_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:36:45 np0005588920 nova_compute[226886]: 2026-01-20 14:36:45.619 226890 DEBUG nova.storage.rbd_utils [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] rbd image 2557de26-ca24-4910-aa11-c697dc150296_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:36:45 np0005588920 nova_compute[226886]: 2026-01-20 14:36:45.622 226890 DEBUG oslo_concurrency.lockutils [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Acquiring lock "860c61ff3ea8a60295f9a6c8944bf9adce790eea" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:36:45 np0005588920 nova_compute[226886]: 2026-01-20 14:36:45.623 226890 DEBUG oslo_concurrency.lockutils [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "860c61ff3ea8a60295f9a6c8944bf9adce790eea" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:36:45 np0005588920 nova_compute[226886]: 2026-01-20 14:36:45.868 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:45 np0005588920 nova_compute[226886]: 2026-01-20 14:36:45.982 226890 DEBUG nova.virt.libvirt.imagebackend [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Image locations are: [{'url': 'rbd://e399cf45-e6b6-5393-99f1-75c601d3f188/images/d9608a6b-abac-47e3-a9dd-70a6230a92c0/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://e399cf45-e6b6-5393-99f1-75c601d3f188/images/d9608a6b-abac-47e3-a9dd-70a6230a92c0/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Jan 20 09:36:46 np0005588920 nova_compute[226886]: 2026-01-20 14:36:46.067 226890 DEBUG nova.virt.libvirt.imagebackend [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Selected location: {'url': 'rbd://e399cf45-e6b6-5393-99f1-75c601d3f188/images/d9608a6b-abac-47e3-a9dd-70a6230a92c0/snap', 'metadata': {'store': 'default_backend'}} clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1094#033[00m
Jan 20 09:36:46 np0005588920 nova_compute[226886]: 2026-01-20 14:36:46.067 226890 DEBUG nova.storage.rbd_utils [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] cloning images/d9608a6b-abac-47e3-a9dd-70a6230a92c0@snap to None/2557de26-ca24-4910-aa11-c697dc150296_disk clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 20 09:36:46 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:46.189 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:36:46 np0005588920 nova_compute[226886]: 2026-01-20 14:36:46.191 226890 DEBUG oslo_concurrency.lockutils [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "860c61ff3ea8a60295f9a6c8944bf9adce790eea" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.568s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:36:46 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:46.192 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:36:46 np0005588920 nova_compute[226886]: 2026-01-20 14:36:46.232 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:46 np0005588920 nova_compute[226886]: 2026-01-20 14:36:46.326 226890 DEBUG nova.network.neutron [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Successfully created port: 554f0c44-80f7-4a8b-84cb-85b70f4889a0 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:36:46 np0005588920 nova_compute[226886]: 2026-01-20 14:36:46.335 226890 DEBUG nova.objects.instance [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lazy-loading 'migration_context' on Instance uuid 2557de26-ca24-4910-aa11-c697dc150296 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:36:46 np0005588920 nova_compute[226886]: 2026-01-20 14:36:46.365 226890 DEBUG nova.virt.libvirt.driver [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:36:46 np0005588920 nova_compute[226886]: 2026-01-20 14:36:46.365 226890 DEBUG nova.virt.libvirt.driver [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Ensure instance console log exists: /var/lib/nova/instances/2557de26-ca24-4910-aa11-c697dc150296/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:36:46 np0005588920 nova_compute[226886]: 2026-01-20 14:36:46.366 226890 DEBUG oslo_concurrency.lockutils [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:36:46 np0005588920 nova_compute[226886]: 2026-01-20 14:36:46.366 226890 DEBUG oslo_concurrency.lockutils [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:36:46 np0005588920 nova_compute[226886]: 2026-01-20 14:36:46.366 226890 DEBUG oslo_concurrency.lockutils [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:36:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:46.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:47.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:47 np0005588920 nova_compute[226886]: 2026-01-20 14:36:47.351 226890 DEBUG nova.network.neutron [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Successfully created port: 51f66a28-a093-4e55-a9ca-b7c718ebfc5a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:36:47 np0005588920 nova_compute[226886]: 2026-01-20 14:36:47.963 226890 DEBUG nova.network.neutron [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Successfully updated port: 554f0c44-80f7-4a8b-84cb-85b70f4889a0 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:36:47 np0005588920 nova_compute[226886]: 2026-01-20 14:36:47.982 226890 DEBUG oslo_concurrency.lockutils [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Acquiring lock "refresh_cache-f59628d0-8f85-42c2-93ff-a052df4ac20e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:36:47 np0005588920 nova_compute[226886]: 2026-01-20 14:36:47.982 226890 DEBUG oslo_concurrency.lockutils [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Acquired lock "refresh_cache-f59628d0-8f85-42c2-93ff-a052df4ac20e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:36:47 np0005588920 nova_compute[226886]: 2026-01-20 14:36:47.983 226890 DEBUG nova.network.neutron [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:36:48 np0005588920 podman[246211]: 2026-01-20 14:36:48.06618713 +0000 UTC m=+0.149488312 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 20 09:36:48 np0005588920 nova_compute[226886]: 2026-01-20 14:36:48.260 226890 DEBUG nova.network.neutron [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:36:48 np0005588920 nova_compute[226886]: 2026-01-20 14:36:48.416 226890 DEBUG nova.compute.manager [req-fd10b042-54fe-4cb6-97a7-dcb645437d77 req-3cf3c1f0-3927-45db-9a1c-1c358cc343f8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Received event network-changed-554f0c44-80f7-4a8b-84cb-85b70f4889a0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:36:48 np0005588920 nova_compute[226886]: 2026-01-20 14:36:48.417 226890 DEBUG nova.compute.manager [req-fd10b042-54fe-4cb6-97a7-dcb645437d77 req-3cf3c1f0-3927-45db-9a1c-1c358cc343f8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Refreshing instance network info cache due to event network-changed-554f0c44-80f7-4a8b-84cb-85b70f4889a0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:36:48 np0005588920 nova_compute[226886]: 2026-01-20 14:36:48.417 226890 DEBUG oslo_concurrency.lockutils [req-fd10b042-54fe-4cb6-97a7-dcb645437d77 req-3cf3c1f0-3927-45db-9a1c-1c358cc343f8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-f59628d0-8f85-42c2-93ff-a052df4ac20e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:36:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:48.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:49 np0005588920 nova_compute[226886]: 2026-01-20 14:36:49.166 226890 DEBUG nova.network.neutron [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Successfully updated port: 51f66a28-a093-4e55-a9ca-b7c718ebfc5a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:36:49 np0005588920 nova_compute[226886]: 2026-01-20 14:36:49.199 226890 DEBUG oslo_concurrency.lockutils [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Acquiring lock "refresh_cache-2557de26-ca24-4910-aa11-c697dc150296" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:36:49 np0005588920 nova_compute[226886]: 2026-01-20 14:36:49.200 226890 DEBUG oslo_concurrency.lockutils [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Acquired lock "refresh_cache-2557de26-ca24-4910-aa11-c697dc150296" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:36:49 np0005588920 nova_compute[226886]: 2026-01-20 14:36:49.200 226890 DEBUG nova.network.neutron [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:36:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:49.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:49 np0005588920 nova_compute[226886]: 2026-01-20 14:36:49.413 226890 DEBUG nova.network.neutron [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:36:49 np0005588920 nova_compute[226886]: 2026-01-20 14:36:49.604 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:49 np0005588920 nova_compute[226886]: 2026-01-20 14:36:49.618 226890 DEBUG nova.network.neutron [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Updating instance_info_cache with network_info: [{"id": "554f0c44-80f7-4a8b-84cb-85b70f4889a0", "address": "fa:16:3e:cd:86:1f", "network": {"id": "b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-462971735-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7d78990d13704d629a8a3e8910d005c5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap554f0c44-80", "ovs_interfaceid": "554f0c44-80f7-4a8b-84cb-85b70f4889a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:36:49 np0005588920 nova_compute[226886]: 2026-01-20 14:36:49.632 226890 DEBUG nova.compute.manager [req-25967f6f-7d9b-4bc9-aa99-eb8128786dc3 req-6ac6465b-b8bc-41ad-bde6-97d938c1e5b5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Received event network-changed-51f66a28-a093-4e55-a9ca-b7c718ebfc5a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:36:49 np0005588920 nova_compute[226886]: 2026-01-20 14:36:49.632 226890 DEBUG nova.compute.manager [req-25967f6f-7d9b-4bc9-aa99-eb8128786dc3 req-6ac6465b-b8bc-41ad-bde6-97d938c1e5b5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Refreshing instance network info cache due to event network-changed-51f66a28-a093-4e55-a9ca-b7c718ebfc5a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:36:49 np0005588920 nova_compute[226886]: 2026-01-20 14:36:49.632 226890 DEBUG oslo_concurrency.lockutils [req-25967f6f-7d9b-4bc9-aa99-eb8128786dc3 req-6ac6465b-b8bc-41ad-bde6-97d938c1e5b5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-2557de26-ca24-4910-aa11-c697dc150296" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:36:49 np0005588920 nova_compute[226886]: 2026-01-20 14:36:49.638 226890 DEBUG oslo_concurrency.lockutils [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Releasing lock "refresh_cache-f59628d0-8f85-42c2-93ff-a052df4ac20e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:36:49 np0005588920 nova_compute[226886]: 2026-01-20 14:36:49.638 226890 DEBUG nova.compute.manager [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Instance network_info: |[{"id": "554f0c44-80f7-4a8b-84cb-85b70f4889a0", "address": "fa:16:3e:cd:86:1f", "network": {"id": "b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-462971735-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7d78990d13704d629a8a3e8910d005c5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap554f0c44-80", "ovs_interfaceid": "554f0c44-80f7-4a8b-84cb-85b70f4889a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:36:49 np0005588920 nova_compute[226886]: 2026-01-20 14:36:49.638 226890 DEBUG oslo_concurrency.lockutils [req-fd10b042-54fe-4cb6-97a7-dcb645437d77 req-3cf3c1f0-3927-45db-9a1c-1c358cc343f8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-f59628d0-8f85-42c2-93ff-a052df4ac20e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:36:49 np0005588920 nova_compute[226886]: 2026-01-20 14:36:49.638 226890 DEBUG nova.network.neutron [req-fd10b042-54fe-4cb6-97a7-dcb645437d77 req-3cf3c1f0-3927-45db-9a1c-1c358cc343f8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Refreshing network info cache for port 554f0c44-80f7-4a8b-84cb-85b70f4889a0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:36:49 np0005588920 nova_compute[226886]: 2026-01-20 14:36:49.641 226890 DEBUG nova.virt.libvirt.driver [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Start _get_guest_xml network_info=[{"id": "554f0c44-80f7-4a8b-84cb-85b70f4889a0", "address": "fa:16:3e:cd:86:1f", "network": {"id": "b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-462971735-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7d78990d13704d629a8a3e8910d005c5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap554f0c44-80", "ovs_interfaceid": "554f0c44-80f7-4a8b-84cb-85b70f4889a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:36:49 np0005588920 nova_compute[226886]: 2026-01-20 14:36:49.645 226890 WARNING nova.virt.libvirt.driver [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:36:49 np0005588920 nova_compute[226886]: 2026-01-20 14:36:49.649 226890 DEBUG nova.virt.libvirt.host [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:36:49 np0005588920 nova_compute[226886]: 2026-01-20 14:36:49.649 226890 DEBUG nova.virt.libvirt.host [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:36:49 np0005588920 nova_compute[226886]: 2026-01-20 14:36:49.652 226890 DEBUG nova.virt.libvirt.host [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:36:49 np0005588920 nova_compute[226886]: 2026-01-20 14:36:49.653 226890 DEBUG nova.virt.libvirt.host [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:36:49 np0005588920 nova_compute[226886]: 2026-01-20 14:36:49.654 226890 DEBUG nova.virt.libvirt.driver [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:36:49 np0005588920 nova_compute[226886]: 2026-01-20 14:36:49.654 226890 DEBUG nova.virt.hardware [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:36:49 np0005588920 nova_compute[226886]: 2026-01-20 14:36:49.655 226890 DEBUG nova.virt.hardware [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:36:49 np0005588920 nova_compute[226886]: 2026-01-20 14:36:49.655 226890 DEBUG nova.virt.hardware [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:36:49 np0005588920 nova_compute[226886]: 2026-01-20 14:36:49.655 226890 DEBUG nova.virt.hardware [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:36:49 np0005588920 nova_compute[226886]: 2026-01-20 14:36:49.655 226890 DEBUG nova.virt.hardware [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:36:49 np0005588920 nova_compute[226886]: 2026-01-20 14:36:49.656 226890 DEBUG nova.virt.hardware [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:36:49 np0005588920 nova_compute[226886]: 2026-01-20 14:36:49.656 226890 DEBUG nova.virt.hardware [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:36:49 np0005588920 nova_compute[226886]: 2026-01-20 14:36:49.656 226890 DEBUG nova.virt.hardware [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:36:49 np0005588920 nova_compute[226886]: 2026-01-20 14:36:49.656 226890 DEBUG nova.virt.hardware [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:36:49 np0005588920 nova_compute[226886]: 2026-01-20 14:36:49.657 226890 DEBUG nova.virt.hardware [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:36:49 np0005588920 nova_compute[226886]: 2026-01-20 14:36:49.657 226890 DEBUG nova.virt.hardware [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:36:49 np0005588920 nova_compute[226886]: 2026-01-20 14:36:49.660 226890 DEBUG oslo_concurrency.processutils [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:36:49 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e220 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:36:50 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:36:50 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/347259012' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:36:50 np0005588920 nova_compute[226886]: 2026-01-20 14:36:50.085 226890 DEBUG oslo_concurrency.processutils [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.425s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:36:50 np0005588920 nova_compute[226886]: 2026-01-20 14:36:50.107 226890 DEBUG nova.storage.rbd_utils [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] rbd image f59628d0-8f85-42c2-93ff-a052df4ac20e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:36:50 np0005588920 nova_compute[226886]: 2026-01-20 14:36:50.110 226890 DEBUG oslo_concurrency.processutils [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:36:50 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:36:50 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3157347975' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:36:50 np0005588920 nova_compute[226886]: 2026-01-20 14:36:50.529 226890 DEBUG oslo_concurrency.processutils [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.419s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:36:50 np0005588920 nova_compute[226886]: 2026-01-20 14:36:50.531 226890 DEBUG nova.virt.libvirt.vif [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:36:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-36871268',display_name='tempest-ImagesOneServerNegativeTestJSON-server-36871268',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-36871268',id=56,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7d78990d13704d629a8a3e8910d005c5',ramdisk_id='',reservation_id='r-oo656c53',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-866315696',owner_user_name='tempest-ImagesOneServerNegativeTestJSON-866315696-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:36:44Z,user_data=None,user_id='592a0204f38a4596ab1ab81774214a6d',uuid=f59628d0-8f85-42c2-93ff-a052df4ac20e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "554f0c44-80f7-4a8b-84cb-85b70f4889a0", "address": "fa:16:3e:cd:86:1f", "network": {"id": "b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-462971735-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7d78990d13704d629a8a3e8910d005c5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap554f0c44-80", "ovs_interfaceid": "554f0c44-80f7-4a8b-84cb-85b70f4889a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:36:50 np0005588920 nova_compute[226886]: 2026-01-20 14:36:50.531 226890 DEBUG nova.network.os_vif_util [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Converting VIF {"id": "554f0c44-80f7-4a8b-84cb-85b70f4889a0", "address": "fa:16:3e:cd:86:1f", "network": {"id": "b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-462971735-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7d78990d13704d629a8a3e8910d005c5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap554f0c44-80", "ovs_interfaceid": "554f0c44-80f7-4a8b-84cb-85b70f4889a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:36:50 np0005588920 nova_compute[226886]: 2026-01-20 14:36:50.532 226890 DEBUG nova.network.os_vif_util [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cd:86:1f,bridge_name='br-int',has_traffic_filtering=True,id=554f0c44-80f7-4a8b-84cb-85b70f4889a0,network=Network(b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap554f0c44-80') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:36:50 np0005588920 nova_compute[226886]: 2026-01-20 14:36:50.533 226890 DEBUG nova.objects.instance [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Lazy-loading 'pci_devices' on Instance uuid f59628d0-8f85-42c2-93ff-a052df4ac20e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:36:50 np0005588920 nova_compute[226886]: 2026-01-20 14:36:50.562 226890 DEBUG nova.virt.libvirt.driver [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:36:50 np0005588920 nova_compute[226886]:  <uuid>f59628d0-8f85-42c2-93ff-a052df4ac20e</uuid>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:  <name>instance-00000038</name>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:36:50 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:      <nova:name>tempest-ImagesOneServerNegativeTestJSON-server-36871268</nova:name>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:36:49</nova:creationTime>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:36:50 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:        <nova:user uuid="592a0204f38a4596ab1ab81774214a6d">tempest-ImagesOneServerNegativeTestJSON-866315696-project-member</nova:user>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:        <nova:project uuid="7d78990d13704d629a8a3e8910d005c5">tempest-ImagesOneServerNegativeTestJSON-866315696</nova:project>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:        <nova:port uuid="554f0c44-80f7-4a8b-84cb-85b70f4889a0">
Jan 20 09:36:50 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:      <entry name="serial">f59628d0-8f85-42c2-93ff-a052df4ac20e</entry>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:      <entry name="uuid">f59628d0-8f85-42c2-93ff-a052df4ac20e</entry>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:36:50 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/f59628d0-8f85-42c2-93ff-a052df4ac20e_disk">
Jan 20 09:36:50 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:36:50 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:36:50 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/f59628d0-8f85-42c2-93ff-a052df4ac20e_disk.config">
Jan 20 09:36:50 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:36:50 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:36:50 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:cd:86:1f"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:      <target dev="tap554f0c44-80"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:36:50 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/f59628d0-8f85-42c2-93ff-a052df4ac20e/console.log" append="off"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:36:50 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:36:50 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:36:50 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:36:50 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:36:50 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:36:50 np0005588920 nova_compute[226886]: 2026-01-20 14:36:50.563 226890 DEBUG nova.compute.manager [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Preparing to wait for external event network-vif-plugged-554f0c44-80f7-4a8b-84cb-85b70f4889a0 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:36:50 np0005588920 nova_compute[226886]: 2026-01-20 14:36:50.564 226890 DEBUG oslo_concurrency.lockutils [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Acquiring lock "f59628d0-8f85-42c2-93ff-a052df4ac20e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:36:50 np0005588920 nova_compute[226886]: 2026-01-20 14:36:50.564 226890 DEBUG oslo_concurrency.lockutils [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Lock "f59628d0-8f85-42c2-93ff-a052df4ac20e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:36:50 np0005588920 nova_compute[226886]: 2026-01-20 14:36:50.564 226890 DEBUG oslo_concurrency.lockutils [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Lock "f59628d0-8f85-42c2-93ff-a052df4ac20e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:36:50 np0005588920 nova_compute[226886]: 2026-01-20 14:36:50.565 226890 DEBUG nova.virt.libvirt.vif [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:36:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-36871268',display_name='tempest-ImagesOneServerNegativeTestJSON-server-36871268',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-36871268',id=56,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7d78990d13704d629a8a3e8910d005c5',ramdisk_id='',reservation_id='r-oo656c53',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-866315696',owner_user_name='tempest-ImagesOneServerNegativeTestJSON-866315696-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:36:44Z,user_data=None,user_id='592a0204f38a4596ab1ab81774214a6d',uuid=f59628d0-8f85-42c2-93ff-a052df4ac20e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "554f0c44-80f7-4a8b-84cb-85b70f4889a0", "address": "fa:16:3e:cd:86:1f", "network": {"id": "b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-462971735-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7d78990d13704d629a8a3e8910d005c5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap554f0c44-80", "ovs_interfaceid": "554f0c44-80f7-4a8b-84cb-85b70f4889a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:36:50 np0005588920 nova_compute[226886]: 2026-01-20 14:36:50.565 226890 DEBUG nova.network.os_vif_util [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Converting VIF {"id": "554f0c44-80f7-4a8b-84cb-85b70f4889a0", "address": "fa:16:3e:cd:86:1f", "network": {"id": "b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-462971735-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7d78990d13704d629a8a3e8910d005c5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap554f0c44-80", "ovs_interfaceid": "554f0c44-80f7-4a8b-84cb-85b70f4889a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:36:50 np0005588920 nova_compute[226886]: 2026-01-20 14:36:50.566 226890 DEBUG nova.network.os_vif_util [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cd:86:1f,bridge_name='br-int',has_traffic_filtering=True,id=554f0c44-80f7-4a8b-84cb-85b70f4889a0,network=Network(b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap554f0c44-80') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:36:50 np0005588920 nova_compute[226886]: 2026-01-20 14:36:50.566 226890 DEBUG os_vif [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cd:86:1f,bridge_name='br-int',has_traffic_filtering=True,id=554f0c44-80f7-4a8b-84cb-85b70f4889a0,network=Network(b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap554f0c44-80') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:36:50 np0005588920 nova_compute[226886]: 2026-01-20 14:36:50.566 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:50 np0005588920 nova_compute[226886]: 2026-01-20 14:36:50.567 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:36:50 np0005588920 nova_compute[226886]: 2026-01-20 14:36:50.567 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:36:50 np0005588920 nova_compute[226886]: 2026-01-20 14:36:50.570 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:50 np0005588920 nova_compute[226886]: 2026-01-20 14:36:50.570 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap554f0c44-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:36:50 np0005588920 nova_compute[226886]: 2026-01-20 14:36:50.571 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap554f0c44-80, col_values=(('external_ids', {'iface-id': '554f0c44-80f7-4a8b-84cb-85b70f4889a0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cd:86:1f', 'vm-uuid': 'f59628d0-8f85-42c2-93ff-a052df4ac20e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:36:50 np0005588920 nova_compute[226886]: 2026-01-20 14:36:50.610 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:50 np0005588920 NetworkManager[49076]: <info>  [1768919810.6111] manager: (tap554f0c44-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/79)
Jan 20 09:36:50 np0005588920 nova_compute[226886]: 2026-01-20 14:36:50.612 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:36:50 np0005588920 nova_compute[226886]: 2026-01-20 14:36:50.618 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:50 np0005588920 nova_compute[226886]: 2026-01-20 14:36:50.620 226890 INFO os_vif [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cd:86:1f,bridge_name='br-int',has_traffic_filtering=True,id=554f0c44-80f7-4a8b-84cb-85b70f4889a0,network=Network(b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap554f0c44-80')#033[00m
Jan 20 09:36:50 np0005588920 nova_compute[226886]: 2026-01-20 14:36:50.720 226890 DEBUG nova.virt.libvirt.driver [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:36:50 np0005588920 nova_compute[226886]: 2026-01-20 14:36:50.721 226890 DEBUG nova.virt.libvirt.driver [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:36:50 np0005588920 nova_compute[226886]: 2026-01-20 14:36:50.721 226890 DEBUG nova.virt.libvirt.driver [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] No VIF found with MAC fa:16:3e:cd:86:1f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:36:50 np0005588920 nova_compute[226886]: 2026-01-20 14:36:50.722 226890 INFO nova.virt.libvirt.driver [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Using config drive#033[00m
Jan 20 09:36:50 np0005588920 nova_compute[226886]: 2026-01-20 14:36:50.743 226890 DEBUG nova.storage.rbd_utils [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] rbd image f59628d0-8f85-42c2-93ff-a052df4ac20e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:36:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:50.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:51 np0005588920 nova_compute[226886]: 2026-01-20 14:36:51.236 226890 DEBUG nova.network.neutron [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Updating instance_info_cache with network_info: [{"id": "51f66a28-a093-4e55-a9ca-b7c718ebfc5a", "address": "fa:16:3e:ce:13:9c", "network": {"id": "abb83e3e-0b12-431b-ad86-a1d271b5b46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-766235638-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3750c56415134773aa9d9880038f1749", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51f66a28-a0", "ovs_interfaceid": "51f66a28-a093-4e55-a9ca-b7c718ebfc5a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:36:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:36:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:51.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:36:51 np0005588920 nova_compute[226886]: 2026-01-20 14:36:51.408 226890 DEBUG oslo_concurrency.lockutils [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Releasing lock "refresh_cache-2557de26-ca24-4910-aa11-c697dc150296" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:36:51 np0005588920 nova_compute[226886]: 2026-01-20 14:36:51.409 226890 DEBUG nova.compute.manager [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Instance network_info: |[{"id": "51f66a28-a093-4e55-a9ca-b7c718ebfc5a", "address": "fa:16:3e:ce:13:9c", "network": {"id": "abb83e3e-0b12-431b-ad86-a1d271b5b46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-766235638-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3750c56415134773aa9d9880038f1749", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51f66a28-a0", "ovs_interfaceid": "51f66a28-a093-4e55-a9ca-b7c718ebfc5a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:36:51 np0005588920 nova_compute[226886]: 2026-01-20 14:36:51.409 226890 DEBUG oslo_concurrency.lockutils [req-25967f6f-7d9b-4bc9-aa99-eb8128786dc3 req-6ac6465b-b8bc-41ad-bde6-97d938c1e5b5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-2557de26-ca24-4910-aa11-c697dc150296" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:36:51 np0005588920 nova_compute[226886]: 2026-01-20 14:36:51.409 226890 DEBUG nova.network.neutron [req-25967f6f-7d9b-4bc9-aa99-eb8128786dc3 req-6ac6465b-b8bc-41ad-bde6-97d938c1e5b5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Refreshing network info cache for port 51f66a28-a093-4e55-a9ca-b7c718ebfc5a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:36:51 np0005588920 nova_compute[226886]: 2026-01-20 14:36:51.412 226890 DEBUG nova.virt.libvirt.driver [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Start _get_guest_xml network_info=[{"id": "51f66a28-a093-4e55-a9ca-b7c718ebfc5a", "address": "fa:16:3e:ce:13:9c", "network": {"id": "abb83e3e-0b12-431b-ad86-a1d271b5b46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-766235638-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3750c56415134773aa9d9880038f1749", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51f66a28-a0", "ovs_interfaceid": "51f66a28-a093-4e55-a9ca-b7c718ebfc5a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='',container_format='bare',created_at=2026-01-20T14:36:30Z,direct_url=<?>,disk_format='raw',id=d9608a6b-abac-47e3-a9dd-70a6230a92c0,min_disk=1,min_ram=0,name='tempest-test-snap-969462570',owner='3750c56415134773aa9d9880038f1749',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2026-01-20T14:36:37Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'd9608a6b-abac-47e3-a9dd-70a6230a92c0'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:36:51 np0005588920 nova_compute[226886]: 2026-01-20 14:36:51.415 226890 WARNING nova.virt.libvirt.driver [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:36:51 np0005588920 nova_compute[226886]: 2026-01-20 14:36:51.423 226890 DEBUG nova.virt.libvirt.host [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:36:51 np0005588920 nova_compute[226886]: 2026-01-20 14:36:51.424 226890 DEBUG nova.virt.libvirt.host [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:36:51 np0005588920 nova_compute[226886]: 2026-01-20 14:36:51.427 226890 DEBUG nova.virt.libvirt.host [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:36:51 np0005588920 nova_compute[226886]: 2026-01-20 14:36:51.428 226890 DEBUG nova.virt.libvirt.host [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:36:51 np0005588920 nova_compute[226886]: 2026-01-20 14:36:51.428 226890 DEBUG nova.virt.libvirt.driver [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:36:51 np0005588920 nova_compute[226886]: 2026-01-20 14:36:51.429 226890 DEBUG nova.virt.hardware [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='',container_format='bare',created_at=2026-01-20T14:36:30Z,direct_url=<?>,disk_format='raw',id=d9608a6b-abac-47e3-a9dd-70a6230a92c0,min_disk=1,min_ram=0,name='tempest-test-snap-969462570',owner='3750c56415134773aa9d9880038f1749',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2026-01-20T14:36:37Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:36:51 np0005588920 nova_compute[226886]: 2026-01-20 14:36:51.429 226890 DEBUG nova.virt.hardware [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:36:51 np0005588920 nova_compute[226886]: 2026-01-20 14:36:51.429 226890 DEBUG nova.virt.hardware [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:36:51 np0005588920 nova_compute[226886]: 2026-01-20 14:36:51.430 226890 DEBUG nova.virt.hardware [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:36:51 np0005588920 nova_compute[226886]: 2026-01-20 14:36:51.430 226890 DEBUG nova.virt.hardware [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:36:51 np0005588920 nova_compute[226886]: 2026-01-20 14:36:51.430 226890 DEBUG nova.virt.hardware [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:36:51 np0005588920 nova_compute[226886]: 2026-01-20 14:36:51.431 226890 DEBUG nova.virt.hardware [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:36:51 np0005588920 nova_compute[226886]: 2026-01-20 14:36:51.431 226890 DEBUG nova.virt.hardware [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:36:51 np0005588920 nova_compute[226886]: 2026-01-20 14:36:51.431 226890 DEBUG nova.virt.hardware [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:36:51 np0005588920 nova_compute[226886]: 2026-01-20 14:36:51.431 226890 DEBUG nova.virt.hardware [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:36:51 np0005588920 nova_compute[226886]: 2026-01-20 14:36:51.431 226890 DEBUG nova.virt.hardware [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:36:51 np0005588920 nova_compute[226886]: 2026-01-20 14:36:51.434 226890 DEBUG oslo_concurrency.processutils [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:36:51 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:36:51 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3250829967' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:36:51 np0005588920 nova_compute[226886]: 2026-01-20 14:36:51.868 226890 DEBUG oslo_concurrency.processutils [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:36:51 np0005588920 nova_compute[226886]: 2026-01-20 14:36:51.891 226890 DEBUG nova.storage.rbd_utils [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] rbd image 2557de26-ca24-4910-aa11-c697dc150296_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:36:51 np0005588920 nova_compute[226886]: 2026-01-20 14:36:51.895 226890 DEBUG oslo_concurrency.processutils [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:36:52 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:52.194 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:36:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:36:52 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3598022065' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:36:52 np0005588920 nova_compute[226886]: 2026-01-20 14:36:52.332 226890 DEBUG oslo_concurrency.processutils [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:36:52 np0005588920 nova_compute[226886]: 2026-01-20 14:36:52.335 226890 DEBUG nova.virt.libvirt.vif [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:36:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-2069954523',display_name='tempest-ImagesTestJSON-server-2069954523',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-2069954523',id=55,image_ref='d9608a6b-abac-47e3-a9dd-70a6230a92c0',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3750c56415134773aa9d9880038f1749',ramdisk_id='',reservation_id='r-gzsfcx0e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_boot_roles='member,reader',image_container_format='bare',image_disk_format='raw',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='59387c9d-df91-4f43-b389-00174486fc84',image_min_disk='1',image_min_ram='0',image_owner_id='3750c56415134773aa9d9880038f1749',image_owner_project_name='tempest-ImagesTestJSON-338390217',image_owner_user_name='tempest-ImagesTestJSON-338390217-project-member',image_user_id='56e2959629114d3d8a48e7a80ed96c4b',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-338390217',owner_user_name='tempest-ImagesTestJSON-338390217-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:36:45Z,user_data=None,user_id='56e2959629114d3d8a48e7a80ed96c4b',uuid=2557de26-ca24-4910-aa11-c697dc150296,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "51f66a28-a093-4e55-a9ca-b7c718ebfc5a", "address": "fa:16:3e:ce:13:9c", "network": {"id": "abb83e3e-0b12-431b-ad86-a1d271b5b46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-766235638-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3750c56415134773aa9d9880038f1749", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51f66a28-a0", "ovs_interfaceid": "51f66a28-a093-4e55-a9ca-b7c718ebfc5a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:36:52 np0005588920 nova_compute[226886]: 2026-01-20 14:36:52.336 226890 DEBUG nova.network.os_vif_util [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Converting VIF {"id": "51f66a28-a093-4e55-a9ca-b7c718ebfc5a", "address": "fa:16:3e:ce:13:9c", "network": {"id": "abb83e3e-0b12-431b-ad86-a1d271b5b46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-766235638-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3750c56415134773aa9d9880038f1749", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51f66a28-a0", "ovs_interfaceid": "51f66a28-a093-4e55-a9ca-b7c718ebfc5a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:36:52 np0005588920 nova_compute[226886]: 2026-01-20 14:36:52.338 226890 DEBUG nova.network.os_vif_util [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ce:13:9c,bridge_name='br-int',has_traffic_filtering=True,id=51f66a28-a093-4e55-a9ca-b7c718ebfc5a,network=Network(abb83e3e-0b12-431b-ad86-a1d271b5b46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap51f66a28-a0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:36:52 np0005588920 nova_compute[226886]: 2026-01-20 14:36:52.340 226890 DEBUG nova.objects.instance [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2557de26-ca24-4910-aa11-c697dc150296 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:36:52 np0005588920 nova_compute[226886]: 2026-01-20 14:36:52.661 226890 DEBUG nova.virt.libvirt.driver [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:36:52 np0005588920 nova_compute[226886]:  <uuid>2557de26-ca24-4910-aa11-c697dc150296</uuid>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:  <name>instance-00000037</name>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:36:52 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:      <nova:name>tempest-ImagesTestJSON-server-2069954523</nova:name>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:36:51</nova:creationTime>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:36:52 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:        <nova:user uuid="56e2959629114d3d8a48e7a80ed96c4b">tempest-ImagesTestJSON-338390217-project-member</nova:user>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:        <nova:project uuid="3750c56415134773aa9d9880038f1749">tempest-ImagesTestJSON-338390217</nova:project>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="d9608a6b-abac-47e3-a9dd-70a6230a92c0"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:        <nova:port uuid="51f66a28-a093-4e55-a9ca-b7c718ebfc5a">
Jan 20 09:36:52 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:      <entry name="serial">2557de26-ca24-4910-aa11-c697dc150296</entry>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:      <entry name="uuid">2557de26-ca24-4910-aa11-c697dc150296</entry>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:36:52 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/2557de26-ca24-4910-aa11-c697dc150296_disk">
Jan 20 09:36:52 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:36:52 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:36:52 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/2557de26-ca24-4910-aa11-c697dc150296_disk.config">
Jan 20 09:36:52 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:36:52 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:36:52 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:ce:13:9c"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:      <target dev="tap51f66a28-a0"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:36:52 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/2557de26-ca24-4910-aa11-c697dc150296/console.log" append="off"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    <input type="keyboard" bus="usb"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:36:52 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:36:52 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:36:52 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:36:52 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:36:52 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:36:52 np0005588920 nova_compute[226886]: 2026-01-20 14:36:52.662 226890 DEBUG nova.compute.manager [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Preparing to wait for external event network-vif-plugged-51f66a28-a093-4e55-a9ca-b7c718ebfc5a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:36:52 np0005588920 nova_compute[226886]: 2026-01-20 14:36:52.663 226890 DEBUG oslo_concurrency.lockutils [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Acquiring lock "2557de26-ca24-4910-aa11-c697dc150296-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:36:52 np0005588920 nova_compute[226886]: 2026-01-20 14:36:52.663 226890 DEBUG oslo_concurrency.lockutils [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "2557de26-ca24-4910-aa11-c697dc150296-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:36:52 np0005588920 nova_compute[226886]: 2026-01-20 14:36:52.663 226890 DEBUG oslo_concurrency.lockutils [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "2557de26-ca24-4910-aa11-c697dc150296-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:36:52 np0005588920 nova_compute[226886]: 2026-01-20 14:36:52.664 226890 DEBUG nova.virt.libvirt.vif [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:36:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-2069954523',display_name='tempest-ImagesTestJSON-server-2069954523',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-2069954523',id=55,image_ref='d9608a6b-abac-47e3-a9dd-70a6230a92c0',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3750c56415134773aa9d9880038f1749',ramdisk_id='',reservation_id='r-gzsfcx0e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_boot_roles='member,reader',image_container_format='bare',image_disk_format='raw',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image
_hw_vif_model='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='59387c9d-df91-4f43-b389-00174486fc84',image_min_disk='1',image_min_ram='0',image_owner_id='3750c56415134773aa9d9880038f1749',image_owner_project_name='tempest-ImagesTestJSON-338390217',image_owner_user_name='tempest-ImagesTestJSON-338390217-project-member',image_user_id='56e2959629114d3d8a48e7a80ed96c4b',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-338390217',owner_user_name='tempest-ImagesTestJSON-338390217-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:36:45Z,user_data=None,user_id='56e2959629114d3d8a48e7a80ed96c4b',uuid=2557de26-ca24-4910-aa11-c697dc150296,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "51f66a28-a093-4e55-a9ca-b7c718ebfc5a", "address": "fa:16:3e:ce:13:9c", "network": {"id": "abb83e3e-0b12-431b-ad86-a1d271b5b46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-766235638-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3750c56415134773aa9d9880038f1749", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51f66a28-a0", "ovs_interfaceid": "51f66a28-a093-4e55-a9ca-b7c718ebfc5a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:36:52 np0005588920 nova_compute[226886]: 2026-01-20 14:36:52.664 226890 DEBUG nova.network.os_vif_util [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Converting VIF {"id": "51f66a28-a093-4e55-a9ca-b7c718ebfc5a", "address": "fa:16:3e:ce:13:9c", "network": {"id": "abb83e3e-0b12-431b-ad86-a1d271b5b46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-766235638-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3750c56415134773aa9d9880038f1749", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51f66a28-a0", "ovs_interfaceid": "51f66a28-a093-4e55-a9ca-b7c718ebfc5a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:36:52 np0005588920 nova_compute[226886]: 2026-01-20 14:36:52.665 226890 DEBUG nova.network.os_vif_util [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ce:13:9c,bridge_name='br-int',has_traffic_filtering=True,id=51f66a28-a093-4e55-a9ca-b7c718ebfc5a,network=Network(abb83e3e-0b12-431b-ad86-a1d271b5b46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap51f66a28-a0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:36:52 np0005588920 nova_compute[226886]: 2026-01-20 14:36:52.665 226890 DEBUG os_vif [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ce:13:9c,bridge_name='br-int',has_traffic_filtering=True,id=51f66a28-a093-4e55-a9ca-b7c718ebfc5a,network=Network(abb83e3e-0b12-431b-ad86-a1d271b5b46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap51f66a28-a0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:36:52 np0005588920 nova_compute[226886]: 2026-01-20 14:36:52.666 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:52 np0005588920 nova_compute[226886]: 2026-01-20 14:36:52.667 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:36:52 np0005588920 nova_compute[226886]: 2026-01-20 14:36:52.667 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:36:52 np0005588920 nova_compute[226886]: 2026-01-20 14:36:52.669 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:52 np0005588920 nova_compute[226886]: 2026-01-20 14:36:52.670 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap51f66a28-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:36:52 np0005588920 nova_compute[226886]: 2026-01-20 14:36:52.670 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap51f66a28-a0, col_values=(('external_ids', {'iface-id': '51f66a28-a093-4e55-a9ca-b7c718ebfc5a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ce:13:9c', 'vm-uuid': '2557de26-ca24-4910-aa11-c697dc150296'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:36:52 np0005588920 nova_compute[226886]: 2026-01-20 14:36:52.672 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:52 np0005588920 NetworkManager[49076]: <info>  [1768919812.6732] manager: (tap51f66a28-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/80)
Jan 20 09:36:52 np0005588920 nova_compute[226886]: 2026-01-20 14:36:52.675 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:36:52 np0005588920 nova_compute[226886]: 2026-01-20 14:36:52.679 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:52 np0005588920 nova_compute[226886]: 2026-01-20 14:36:52.680 226890 INFO os_vif [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ce:13:9c,bridge_name='br-int',has_traffic_filtering=True,id=51f66a28-a093-4e55-a9ca-b7c718ebfc5a,network=Network(abb83e3e-0b12-431b-ad86-a1d271b5b46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap51f66a28-a0')#033[00m
Jan 20 09:36:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:36:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:52.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:36:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:36:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:53.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:36:53 np0005588920 nova_compute[226886]: 2026-01-20 14:36:53.447 226890 DEBUG nova.virt.libvirt.driver [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:36:53 np0005588920 nova_compute[226886]: 2026-01-20 14:36:53.448 226890 DEBUG nova.virt.libvirt.driver [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:36:53 np0005588920 nova_compute[226886]: 2026-01-20 14:36:53.448 226890 DEBUG nova.virt.libvirt.driver [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] No VIF found with MAC fa:16:3e:ce:13:9c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:36:53 np0005588920 nova_compute[226886]: 2026-01-20 14:36:53.448 226890 INFO nova.virt.libvirt.driver [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Using config drive#033[00m
Jan 20 09:36:53 np0005588920 nova_compute[226886]: 2026-01-20 14:36:53.467 226890 DEBUG nova.storage.rbd_utils [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] rbd image 2557de26-ca24-4910-aa11-c697dc150296_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:36:53 np0005588920 nova_compute[226886]: 2026-01-20 14:36:53.819 226890 DEBUG nova.network.neutron [req-fd10b042-54fe-4cb6-97a7-dcb645437d77 req-3cf3c1f0-3927-45db-9a1c-1c358cc343f8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Updated VIF entry in instance network info cache for port 554f0c44-80f7-4a8b-84cb-85b70f4889a0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:36:53 np0005588920 nova_compute[226886]: 2026-01-20 14:36:53.820 226890 DEBUG nova.network.neutron [req-fd10b042-54fe-4cb6-97a7-dcb645437d77 req-3cf3c1f0-3927-45db-9a1c-1c358cc343f8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Updating instance_info_cache with network_info: [{"id": "554f0c44-80f7-4a8b-84cb-85b70f4889a0", "address": "fa:16:3e:cd:86:1f", "network": {"id": "b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-462971735-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7d78990d13704d629a8a3e8910d005c5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap554f0c44-80", "ovs_interfaceid": "554f0c44-80f7-4a8b-84cb-85b70f4889a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:36:53 np0005588920 nova_compute[226886]: 2026-01-20 14:36:53.841 226890 INFO nova.virt.libvirt.driver [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Creating config drive at /var/lib/nova/instances/f59628d0-8f85-42c2-93ff-a052df4ac20e/disk.config#033[00m
Jan 20 09:36:53 np0005588920 nova_compute[226886]: 2026-01-20 14:36:53.846 226890 DEBUG oslo_concurrency.processutils [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f59628d0-8f85-42c2-93ff-a052df4ac20e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpiydb0nkr execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:36:53 np0005588920 nova_compute[226886]: 2026-01-20 14:36:53.974 226890 DEBUG oslo_concurrency.processutils [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f59628d0-8f85-42c2-93ff-a052df4ac20e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpiydb0nkr" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:36:54 np0005588920 nova_compute[226886]: 2026-01-20 14:36:53.999 226890 DEBUG nova.storage.rbd_utils [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] rbd image f59628d0-8f85-42c2-93ff-a052df4ac20e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:36:54 np0005588920 nova_compute[226886]: 2026-01-20 14:36:54.003 226890 DEBUG oslo_concurrency.processutils [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/f59628d0-8f85-42c2-93ff-a052df4ac20e/disk.config f59628d0-8f85-42c2-93ff-a052df4ac20e_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:36:54 np0005588920 nova_compute[226886]: 2026-01-20 14:36:54.217 226890 DEBUG oslo_concurrency.processutils [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/f59628d0-8f85-42c2-93ff-a052df4ac20e/disk.config f59628d0-8f85-42c2-93ff-a052df4ac20e_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.214s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:36:54 np0005588920 nova_compute[226886]: 2026-01-20 14:36:54.218 226890 INFO nova.virt.libvirt.driver [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Deleting local config drive /var/lib/nova/instances/f59628d0-8f85-42c2-93ff-a052df4ac20e/disk.config because it was imported into RBD.#033[00m
Jan 20 09:36:54 np0005588920 NetworkManager[49076]: <info>  [1768919814.2743] manager: (tap554f0c44-80): new Tun device (/org/freedesktop/NetworkManager/Devices/81)
Jan 20 09:36:54 np0005588920 kernel: tap554f0c44-80: entered promiscuous mode
Jan 20 09:36:54 np0005588920 nova_compute[226886]: 2026-01-20 14:36:54.279 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:54 np0005588920 ovn_controller[133971]: 2026-01-20T14:36:54Z|00137|binding|INFO|Claiming lport 554f0c44-80f7-4a8b-84cb-85b70f4889a0 for this chassis.
Jan 20 09:36:54 np0005588920 ovn_controller[133971]: 2026-01-20T14:36:54Z|00138|binding|INFO|554f0c44-80f7-4a8b-84cb-85b70f4889a0: Claiming fa:16:3e:cd:86:1f 10.100.0.11
Jan 20 09:36:54 np0005588920 systemd-udevd[246457]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:36:54 np0005588920 systemd-machined[196121]: New machine qemu-23-instance-00000038.
Jan 20 09:36:54 np0005588920 NetworkManager[49076]: <info>  [1768919814.3172] device (tap554f0c44-80): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:36:54 np0005588920 NetworkManager[49076]: <info>  [1768919814.3180] device (tap554f0c44-80): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:36:54 np0005588920 systemd[1]: Started Virtual Machine qemu-23-instance-00000038.
Jan 20 09:36:54 np0005588920 nova_compute[226886]: 2026-01-20 14:36:54.376 226890 INFO nova.virt.libvirt.driver [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Creating config drive at /var/lib/nova/instances/2557de26-ca24-4910-aa11-c697dc150296/disk.config#033[00m
Jan 20 09:36:54 np0005588920 nova_compute[226886]: 2026-01-20 14:36:54.383 226890 DEBUG oslo_concurrency.processutils [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2557de26-ca24-4910-aa11-c697dc150296/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfvyd0ruv execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:36:54 np0005588920 ovn_controller[133971]: 2026-01-20T14:36:54Z|00139|binding|INFO|Setting lport 554f0c44-80f7-4a8b-84cb-85b70f4889a0 ovn-installed in OVS
Jan 20 09:36:54 np0005588920 nova_compute[226886]: 2026-01-20 14:36:54.466 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:54 np0005588920 nova_compute[226886]: 2026-01-20 14:36:54.569 226890 DEBUG oslo_concurrency.processutils [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2557de26-ca24-4910-aa11-c697dc150296/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfvyd0ruv" returned: 0 in 0.186s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:36:54 np0005588920 nova_compute[226886]: 2026-01-20 14:36:54.590 226890 DEBUG nova.storage.rbd_utils [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] rbd image 2557de26-ca24-4910-aa11-c697dc150296_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:36:54 np0005588920 nova_compute[226886]: 2026-01-20 14:36:54.593 226890 DEBUG oslo_concurrency.processutils [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/2557de26-ca24-4910-aa11-c697dc150296/disk.config 2557de26-ca24-4910-aa11-c697dc150296_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:36:54 np0005588920 nova_compute[226886]: 2026-01-20 14:36:54.615 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:54 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e220 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:36:54 np0005588920 ovn_controller[133971]: 2026-01-20T14:36:54Z|00140|binding|INFO|Setting lport 554f0c44-80f7-4a8b-84cb-85b70f4889a0 up in Southbound
Jan 20 09:36:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:54.762 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cd:86:1f 10.100.0.11'], port_security=['fa:16:3e:cd:86:1f 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'f59628d0-8f85-42c2-93ff-a052df4ac20e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7d78990d13704d629a8a3e8910d005c5', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3763ece7-c739-40ca-8e07-6dde1584ba85', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a613141e-df34-49c4-9712-c3d232327d6b, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=554f0c44-80f7-4a8b-84cb-85b70f4889a0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:36:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:54.763 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 554f0c44-80f7-4a8b-84cb-85b70f4889a0 in datapath b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23 bound to our chassis#033[00m
Jan 20 09:36:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:54.764 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23#033[00m
Jan 20 09:36:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:54.779 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[87c40296-9bf3-43d1-95fb-88e31d300dae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:54.780 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb1f372f9-f1 in ovnmeta-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:36:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:54.783 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb1f372f9-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:36:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:54.783 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[45a5b4dc-f1e3-4e92-b34a-8baab7268a73]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:54.784 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[cad0a2b6-a0a3-41da-826f-ae9b53463d38]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:54 np0005588920 nova_compute[226886]: 2026-01-20 14:36:54.785 226890 DEBUG oslo_concurrency.lockutils [req-fd10b042-54fe-4cb6-97a7-dcb645437d77 req-3cf3c1f0-3927-45db-9a1c-1c358cc343f8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-f59628d0-8f85-42c2-93ff-a052df4ac20e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:36:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:54.798 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[061f2d5e-2497-4fa3-b286-5f51691d1557]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:54.822 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5a7ed91e-722d-44e3-a483-d186bfb422be]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:54.848 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[607e1ab2-cb67-4a50-a942-fa8e6ffd3fa4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:54.854 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[30396273-f679-42b5-8184-918ec5b326f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:54 np0005588920 NetworkManager[49076]: <info>  [1768919814.8560] manager: (tapb1f372f9-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/82)
Jan 20 09:36:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:36:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:54.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:36:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:54.885 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[14ed2f00-7690-403c-bc6e-8fa8cce57d91]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:54.891 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[b0d7a929-1547-4668-953a-e0c0fd7e35f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:54 np0005588920 nova_compute[226886]: 2026-01-20 14:36:54.900 226890 DEBUG oslo_concurrency.processutils [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/2557de26-ca24-4910-aa11-c697dc150296/disk.config 2557de26-ca24-4910-aa11-c697dc150296_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.307s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:36:54 np0005588920 nova_compute[226886]: 2026-01-20 14:36:54.902 226890 INFO nova.virt.libvirt.driver [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Deleting local config drive /var/lib/nova/instances/2557de26-ca24-4910-aa11-c697dc150296/disk.config because it was imported into RBD.#033[00m
Jan 20 09:36:54 np0005588920 NetworkManager[49076]: <info>  [1768919814.9188] device (tapb1f372f9-f0): carrier: link connected
Jan 20 09:36:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:54.924 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[2460408a-b7dd-4909-8879-159642f0f906]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:54.942 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0ae55755-b4c0-48ae-85fa-5f096e07730a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb1f372f9-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:52:d0:c5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 47], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 485660, 'reachable_time': 39245, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246539, 'error': None, 'target': 'ovnmeta-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:54 np0005588920 NetworkManager[49076]: <info>  [1768919814.9521] manager: (tap51f66a28-a0): new Tun device (/org/freedesktop/NetworkManager/Devices/83)
Jan 20 09:36:54 np0005588920 kernel: tap51f66a28-a0: entered promiscuous mode
Jan 20 09:36:54 np0005588920 systemd-udevd[246518]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:36:54 np0005588920 nova_compute[226886]: 2026-01-20 14:36:54.954 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:54 np0005588920 ovn_controller[133971]: 2026-01-20T14:36:54Z|00141|binding|INFO|Claiming lport 51f66a28-a093-4e55-a9ca-b7c718ebfc5a for this chassis.
Jan 20 09:36:54 np0005588920 ovn_controller[133971]: 2026-01-20T14:36:54Z|00142|binding|INFO|51f66a28-a093-4e55-a9ca-b7c718ebfc5a: Claiming fa:16:3e:ce:13:9c 10.100.0.5
Jan 20 09:36:54 np0005588920 nova_compute[226886]: 2026-01-20 14:36:54.958 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:54 np0005588920 NetworkManager[49076]: <info>  [1768919814.9637] device (tap51f66a28-a0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:36:54 np0005588920 NetworkManager[49076]: <info>  [1768919814.9648] device (tap51f66a28-a0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:36:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:54.966 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ce:13:9c 10.100.0.5'], port_security=['fa:16:3e:ce:13:9c 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '2557de26-ca24-4910-aa11-c697dc150296', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-abb83e3e-0b12-431b-ad86-a1d271b5b46a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3750c56415134773aa9d9880038f1749', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2e302063-2ccd-4f7c-8835-ef521762a486', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4125934e-1dea-4e34-a38d-5291c850f0b2, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=51f66a28-a093-4e55-a9ca-b7c718ebfc5a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:36:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:54.966 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[349f395a-3d6a-4f47-ab26-5a39e8170847]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe52:d0c5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 485660, 'tstamp': 485660}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 246551, 'error': None, 'target': 'ovnmeta-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:54 np0005588920 systemd-machined[196121]: New machine qemu-24-instance-00000037.
Jan 20 09:36:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:54.983 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[53cd2806-d6d6-4995-992f-9228ed494b8d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb1f372f9-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:52:d0:c5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 47], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 485660, 'reachable_time': 39245, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 246564, 'error': None, 'target': 'ovnmeta-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:55.012 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[4fecd14b-e73b-4f53-ab3c-99c2559a8ee4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:55 np0005588920 systemd[1]: Started Virtual Machine qemu-24-instance-00000037.
Jan 20 09:36:55 np0005588920 nova_compute[226886]: 2026-01-20 14:36:55.031 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:55 np0005588920 ovn_controller[133971]: 2026-01-20T14:36:55Z|00143|binding|INFO|Setting lport 51f66a28-a093-4e55-a9ca-b7c718ebfc5a ovn-installed in OVS
Jan 20 09:36:55 np0005588920 ovn_controller[133971]: 2026-01-20T14:36:55Z|00144|binding|INFO|Setting lport 51f66a28-a093-4e55-a9ca-b7c718ebfc5a up in Southbound
Jan 20 09:36:55 np0005588920 nova_compute[226886]: 2026-01-20 14:36:55.036 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:55.073 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[49d93ca8-3dce-4de1-b1bc-34c3696711cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:55.074 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb1f372f9-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:55.074 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:55.075 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb1f372f9-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:36:55 np0005588920 NetworkManager[49076]: <info>  [1768919815.0783] manager: (tapb1f372f9-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/84)
Jan 20 09:36:55 np0005588920 nova_compute[226886]: 2026-01-20 14:36:55.077 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:55 np0005588920 kernel: tapb1f372f9-f0: entered promiscuous mode
Jan 20 09:36:55 np0005588920 nova_compute[226886]: 2026-01-20 14:36:55.081 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:55.082 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb1f372f9-f0, col_values=(('external_ids', {'iface-id': 'f0137d70-4bff-4646-9f70-7e0c82ac1e88'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:36:55 np0005588920 nova_compute[226886]: 2026-01-20 14:36:55.083 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:55 np0005588920 ovn_controller[133971]: 2026-01-20T14:36:55Z|00145|binding|INFO|Releasing lport f0137d70-4bff-4646-9f70-7e0c82ac1e88 from this chassis (sb_readonly=0)
Jan 20 09:36:55 np0005588920 nova_compute[226886]: 2026-01-20 14:36:55.100 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:55.100 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:55.101 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[97bb94e5-c14d-47b5-825f-70470523d32c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:55.102 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 09:36:55 np0005588920 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23.pid.haproxy
Jan 20 09:36:55 np0005588920 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:55.103 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23', 'env', 'PROCESS_TAG=haproxy-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:36:55 np0005588920 nova_compute[226886]: 2026-01-20 14:36:55.216 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919815.2162316, f59628d0-8f85-42c2-93ff-a052df4ac20e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:36:55 np0005588920 nova_compute[226886]: 2026-01-20 14:36:55.217 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] VM Started (Lifecycle Event)#033[00m
Jan 20 09:36:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:55.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:55 np0005588920 nova_compute[226886]: 2026-01-20 14:36:55.374 226890 DEBUG nova.compute.manager [req-c0bf54f6-b4ab-4aa5-9d73-dc4576b2852a req-cba095a3-391f-4b06-b910-a006caa31d64 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Received event network-vif-plugged-554f0c44-80f7-4a8b-84cb-85b70f4889a0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:36:55 np0005588920 nova_compute[226886]: 2026-01-20 14:36:55.376 226890 DEBUG oslo_concurrency.lockutils [req-c0bf54f6-b4ab-4aa5-9d73-dc4576b2852a req-cba095a3-391f-4b06-b910-a006caa31d64 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f59628d0-8f85-42c2-93ff-a052df4ac20e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:36:55 np0005588920 nova_compute[226886]: 2026-01-20 14:36:55.376 226890 DEBUG oslo_concurrency.lockutils [req-c0bf54f6-b4ab-4aa5-9d73-dc4576b2852a req-cba095a3-391f-4b06-b910-a006caa31d64 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f59628d0-8f85-42c2-93ff-a052df4ac20e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:36:55 np0005588920 nova_compute[226886]: 2026-01-20 14:36:55.377 226890 DEBUG oslo_concurrency.lockutils [req-c0bf54f6-b4ab-4aa5-9d73-dc4576b2852a req-cba095a3-391f-4b06-b910-a006caa31d64 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f59628d0-8f85-42c2-93ff-a052df4ac20e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:36:55 np0005588920 nova_compute[226886]: 2026-01-20 14:36:55.377 226890 DEBUG nova.compute.manager [req-c0bf54f6-b4ab-4aa5-9d73-dc4576b2852a req-cba095a3-391f-4b06-b910-a006caa31d64 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Processing event network-vif-plugged-554f0c44-80f7-4a8b-84cb-85b70f4889a0 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:36:55 np0005588920 nova_compute[226886]: 2026-01-20 14:36:55.379 226890 DEBUG nova.compute.manager [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:36:55 np0005588920 nova_compute[226886]: 2026-01-20 14:36:55.384 226890 DEBUG nova.virt.libvirt.driver [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:36:55 np0005588920 nova_compute[226886]: 2026-01-20 14:36:55.389 226890 INFO nova.virt.libvirt.driver [-] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Instance spawned successfully.#033[00m
Jan 20 09:36:55 np0005588920 nova_compute[226886]: 2026-01-20 14:36:55.389 226890 DEBUG nova.virt.libvirt.driver [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:36:55 np0005588920 nova_compute[226886]: 2026-01-20 14:36:55.393 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:36:55 np0005588920 nova_compute[226886]: 2026-01-20 14:36:55.397 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:36:55 np0005588920 nova_compute[226886]: 2026-01-20 14:36:55.417 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:36:55 np0005588920 nova_compute[226886]: 2026-01-20 14:36:55.418 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919815.2184775, f59628d0-8f85-42c2-93ff-a052df4ac20e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:36:55 np0005588920 nova_compute[226886]: 2026-01-20 14:36:55.418 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:36:55 np0005588920 nova_compute[226886]: 2026-01-20 14:36:55.430 226890 DEBUG nova.virt.libvirt.driver [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:36:55 np0005588920 nova_compute[226886]: 2026-01-20 14:36:55.430 226890 DEBUG nova.virt.libvirt.driver [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:36:55 np0005588920 nova_compute[226886]: 2026-01-20 14:36:55.431 226890 DEBUG nova.virt.libvirt.driver [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:36:55 np0005588920 nova_compute[226886]: 2026-01-20 14:36:55.431 226890 DEBUG nova.virt.libvirt.driver [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:36:55 np0005588920 nova_compute[226886]: 2026-01-20 14:36:55.431 226890 DEBUG nova.virt.libvirt.driver [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:36:55 np0005588920 nova_compute[226886]: 2026-01-20 14:36:55.432 226890 DEBUG nova.virt.libvirt.driver [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:36:55 np0005588920 nova_compute[226886]: 2026-01-20 14:36:55.440 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:36:55 np0005588920 nova_compute[226886]: 2026-01-20 14:36:55.443 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919815.3830316, f59628d0-8f85-42c2-93ff-a052df4ac20e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:36:55 np0005588920 nova_compute[226886]: 2026-01-20 14:36:55.443 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:36:55 np0005588920 nova_compute[226886]: 2026-01-20 14:36:55.462 226890 DEBUG nova.network.neutron [req-25967f6f-7d9b-4bc9-aa99-eb8128786dc3 req-6ac6465b-b8bc-41ad-bde6-97d938c1e5b5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Updated VIF entry in instance network info cache for port 51f66a28-a093-4e55-a9ca-b7c718ebfc5a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:36:55 np0005588920 nova_compute[226886]: 2026-01-20 14:36:55.462 226890 DEBUG nova.network.neutron [req-25967f6f-7d9b-4bc9-aa99-eb8128786dc3 req-6ac6465b-b8bc-41ad-bde6-97d938c1e5b5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Updating instance_info_cache with network_info: [{"id": "51f66a28-a093-4e55-a9ca-b7c718ebfc5a", "address": "fa:16:3e:ce:13:9c", "network": {"id": "abb83e3e-0b12-431b-ad86-a1d271b5b46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-766235638-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3750c56415134773aa9d9880038f1749", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51f66a28-a0", "ovs_interfaceid": "51f66a28-a093-4e55-a9ca-b7c718ebfc5a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:36:55 np0005588920 nova_compute[226886]: 2026-01-20 14:36:55.471 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:36:55 np0005588920 nova_compute[226886]: 2026-01-20 14:36:55.475 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:36:55 np0005588920 nova_compute[226886]: 2026-01-20 14:36:55.482 226890 DEBUG oslo_concurrency.lockutils [req-25967f6f-7d9b-4bc9-aa99-eb8128786dc3 req-6ac6465b-b8bc-41ad-bde6-97d938c1e5b5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-2557de26-ca24-4910-aa11-c697dc150296" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:36:55 np0005588920 podman[246665]: 2026-01-20 14:36:55.494395432 +0000 UTC m=+0.068695637 container create d60cd84441c497571c437c747bcf00e3fe9edbdb12a0c8208d1a55f117b9b3b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 20 09:36:55 np0005588920 nova_compute[226886]: 2026-01-20 14:36:55.506 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:36:55 np0005588920 nova_compute[226886]: 2026-01-20 14:36:55.516 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919815.5163808, 2557de26-ca24-4910-aa11-c697dc150296 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:36:55 np0005588920 nova_compute[226886]: 2026-01-20 14:36:55.516 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 2557de26-ca24-4910-aa11-c697dc150296] VM Started (Lifecycle Event)#033[00m
Jan 20 09:36:55 np0005588920 podman[246665]: 2026-01-20 14:36:55.449615526 +0000 UTC m=+0.023915761 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:36:55 np0005588920 systemd[1]: Started libpod-conmon-d60cd84441c497571c437c747bcf00e3fe9edbdb12a0c8208d1a55f117b9b3b8.scope.
Jan 20 09:36:55 np0005588920 nova_compute[226886]: 2026-01-20 14:36:55.547 226890 INFO nova.compute.manager [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Took 10.85 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:36:55 np0005588920 nova_compute[226886]: 2026-01-20 14:36:55.547 226890 DEBUG nova.compute.manager [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:36:55 np0005588920 nova_compute[226886]: 2026-01-20 14:36:55.549 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:36:55 np0005588920 nova_compute[226886]: 2026-01-20 14:36:55.555 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919815.516804, 2557de26-ca24-4910-aa11-c697dc150296 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:36:55 np0005588920 nova_compute[226886]: 2026-01-20 14:36:55.556 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 2557de26-ca24-4910-aa11-c697dc150296] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:36:55 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:36:55 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/53a2b302adf6a116746d9faa9c8f11e8958d034dd25562c4d4965c5ef2cff97d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:36:55 np0005588920 nova_compute[226886]: 2026-01-20 14:36:55.582 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:36:55 np0005588920 nova_compute[226886]: 2026-01-20 14:36:55.585 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:36:55 np0005588920 podman[246665]: 2026-01-20 14:36:55.601541926 +0000 UTC m=+0.175842151 container init d60cd84441c497571c437c747bcf00e3fe9edbdb12a0c8208d1a55f117b9b3b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Jan 20 09:36:55 np0005588920 podman[246665]: 2026-01-20 14:36:55.607814441 +0000 UTC m=+0.182114646 container start d60cd84441c497571c437c747bcf00e3fe9edbdb12a0c8208d1a55f117b9b3b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 20 09:36:55 np0005588920 nova_compute[226886]: 2026-01-20 14:36:55.615 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 2557de26-ca24-4910-aa11-c697dc150296] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:36:55 np0005588920 neutron-haproxy-ovnmeta-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23[246686]: [NOTICE]   (246690) : New worker (246692) forked
Jan 20 09:36:55 np0005588920 neutron-haproxy-ovnmeta-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23[246686]: [NOTICE]   (246690) : Loading success.
Jan 20 09:36:55 np0005588920 nova_compute[226886]: 2026-01-20 14:36:55.636 226890 INFO nova.compute.manager [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Took 11.95 seconds to build instance.#033[00m
Jan 20 09:36:55 np0005588920 nova_compute[226886]: 2026-01-20 14:36:55.662 226890 DEBUG oslo_concurrency.lockutils [None req-08cff503-6fba-48ec-b13a-47c8d75da11b 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Lock "f59628d0-8f85-42c2-93ff-a052df4ac20e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.051s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:55.668 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 51f66a28-a093-4e55-a9ca-b7c718ebfc5a in datapath abb83e3e-0b12-431b-ad86-a1d271b5b46a unbound from our chassis#033[00m
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:55.670 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network abb83e3e-0b12-431b-ad86-a1d271b5b46a#033[00m
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:55.683 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[4ff1f03c-009c-42f1-a44a-e384e37db2fe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:55.684 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapabb83e3e-01 in ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:55.687 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapabb83e3e-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:55.687 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c9de1497-94ce-49dc-96f9-c3afbf96c895]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:55.688 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b8a9aebc-7591-431e-8247-3d9652ed6a5c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:55.700 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[42678918-8860-4d70-bd5f-da7a59dc083d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:55.726 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a1d1e971-f79b-48fa-9df5-af477bf793e6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:55.754 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[bf4aa38b-30bf-49e2-a839-22d273866690]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:55.760 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[add67cdc-d0dd-4e91-a62b-7c3da1b3af62]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:55 np0005588920 NetworkManager[49076]: <info>  [1768919815.7617] manager: (tapabb83e3e-00): new Veth device (/org/freedesktop/NetworkManager/Devices/85)
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:55.795 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[317388a6-fbdc-42f3-b475-9ddd376ff28f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:55.798 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[ea536665-24a8-4f14-b2ae-b665e992b073]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:55 np0005588920 NetworkManager[49076]: <info>  [1768919815.8182] device (tapabb83e3e-00): carrier: link connected
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:55.823 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[810866da-59d9-42e4-b239-4c89f0f236e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:55.839 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[4885c017-baef-4247-b056-c4e633f3768f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapabb83e3e-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fd:0b:d2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 485750, 'reachable_time': 33489, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246711, 'error': None, 'target': 'ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:55.856 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[53a75878-5aca-4d35-b8bf-7f03a2921226]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefd:bd2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 485750, 'tstamp': 485750}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 246712, 'error': None, 'target': 'ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:55.873 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b3ca3ca6-48b8-4d72-84fc-27d9f5da1afc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapabb83e3e-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fd:0b:d2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 485750, 'reachable_time': 33489, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 246713, 'error': None, 'target': 'ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:55.907 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a23f7528-8068-4a48-b755-33fc4c43940b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:55.964 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5e228e36-c1b4-4ecc-a358-d3ebfd84c256]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:55.965 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapabb83e3e-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:55.966 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:55.966 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapabb83e3e-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:36:55 np0005588920 NetworkManager[49076]: <info>  [1768919815.9688] manager: (tapabb83e3e-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/86)
Jan 20 09:36:55 np0005588920 kernel: tapabb83e3e-00: entered promiscuous mode
Jan 20 09:36:55 np0005588920 nova_compute[226886]: 2026-01-20 14:36:55.968 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:55.971 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapabb83e3e-00, col_values=(('external_ids', {'iface-id': 'dfacaf19-f896-4c13-a7ad-47b57cf03fc1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:36:55 np0005588920 nova_compute[226886]: 2026-01-20 14:36:55.972 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:55 np0005588920 ovn_controller[133971]: 2026-01-20T14:36:55Z|00146|binding|INFO|Releasing lport dfacaf19-f896-4c13-a7ad-47b57cf03fc1 from this chassis (sb_readonly=0)
Jan 20 09:36:55 np0005588920 nova_compute[226886]: 2026-01-20 14:36:55.988 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:55.990 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/abb83e3e-0b12-431b-ad86-a1d271b5b46a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/abb83e3e-0b12-431b-ad86-a1d271b5b46a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:55.990 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f17f1bc7-4dd1-42aa-b3d9-343f5b5d25c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:55.991 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-abb83e3e-0b12-431b-ad86-a1d271b5b46a
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/abb83e3e-0b12-431b-ad86-a1d271b5b46a.pid.haproxy
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID abb83e3e-0b12-431b-ad86-a1d271b5b46a
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:36:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:55.992 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a', 'env', 'PROCESS_TAG=haproxy-abb83e3e-0b12-431b-ad86-a1d271b5b46a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/abb83e3e-0b12-431b-ad86-a1d271b5b46a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:36:56 np0005588920 podman[246745]: 2026-01-20 14:36:56.390808924 +0000 UTC m=+0.086436584 container create ac795d44221b79e41992ace15ab7d41fb457d7fcbed8aa6db79fabfcc750e833 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:36:56 np0005588920 podman[246745]: 2026-01-20 14:36:56.329532336 +0000 UTC m=+0.025160016 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:36:56 np0005588920 systemd[1]: Started libpod-conmon-ac795d44221b79e41992ace15ab7d41fb457d7fcbed8aa6db79fabfcc750e833.scope.
Jan 20 09:36:56 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:36:56 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/13bbd3bd4fbaa21b36f2a887d6e6c9fb156a8f0902b642990c288a27c4b9f48a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:36:56 np0005588920 podman[246745]: 2026-01-20 14:36:56.480454667 +0000 UTC m=+0.176082337 container init ac795d44221b79e41992ace15ab7d41fb457d7fcbed8aa6db79fabfcc750e833 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 20 09:36:56 np0005588920 podman[246745]: 2026-01-20 14:36:56.485692814 +0000 UTC m=+0.181320464 container start ac795d44221b79e41992ace15ab7d41fb457d7fcbed8aa6db79fabfcc750e833 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 20 09:36:56 np0005588920 neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a[246760]: [NOTICE]   (246764) : New worker (246766) forked
Jan 20 09:36:56 np0005588920 neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a[246760]: [NOTICE]   (246764) : Loading success.
Jan 20 09:36:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:56.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:57.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:57 np0005588920 nova_compute[226886]: 2026-01-20 14:36:57.599 226890 DEBUG nova.compute.manager [req-6587ea69-e61e-45b5-baf6-d4ab23ae2c54 req-f3b2351a-886e-4a78-a178-9aa649df998e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Received event network-vif-plugged-554f0c44-80f7-4a8b-84cb-85b70f4889a0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:36:57 np0005588920 nova_compute[226886]: 2026-01-20 14:36:57.600 226890 DEBUG oslo_concurrency.lockutils [req-6587ea69-e61e-45b5-baf6-d4ab23ae2c54 req-f3b2351a-886e-4a78-a178-9aa649df998e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f59628d0-8f85-42c2-93ff-a052df4ac20e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:36:57 np0005588920 nova_compute[226886]: 2026-01-20 14:36:57.600 226890 DEBUG oslo_concurrency.lockutils [req-6587ea69-e61e-45b5-baf6-d4ab23ae2c54 req-f3b2351a-886e-4a78-a178-9aa649df998e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f59628d0-8f85-42c2-93ff-a052df4ac20e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:36:57 np0005588920 nova_compute[226886]: 2026-01-20 14:36:57.600 226890 DEBUG oslo_concurrency.lockutils [req-6587ea69-e61e-45b5-baf6-d4ab23ae2c54 req-f3b2351a-886e-4a78-a178-9aa649df998e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f59628d0-8f85-42c2-93ff-a052df4ac20e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:36:57 np0005588920 nova_compute[226886]: 2026-01-20 14:36:57.600 226890 DEBUG nova.compute.manager [req-6587ea69-e61e-45b5-baf6-d4ab23ae2c54 req-f3b2351a-886e-4a78-a178-9aa649df998e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] No waiting events found dispatching network-vif-plugged-554f0c44-80f7-4a8b-84cb-85b70f4889a0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:36:57 np0005588920 nova_compute[226886]: 2026-01-20 14:36:57.600 226890 WARNING nova.compute.manager [req-6587ea69-e61e-45b5-baf6-d4ab23ae2c54 req-f3b2351a-886e-4a78-a178-9aa649df998e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Received unexpected event network-vif-plugged-554f0c44-80f7-4a8b-84cb-85b70f4889a0 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:36:57 np0005588920 nova_compute[226886]: 2026-01-20 14:36:57.601 226890 DEBUG nova.compute.manager [req-6587ea69-e61e-45b5-baf6-d4ab23ae2c54 req-f3b2351a-886e-4a78-a178-9aa649df998e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Received event network-vif-plugged-51f66a28-a093-4e55-a9ca-b7c718ebfc5a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:36:57 np0005588920 nova_compute[226886]: 2026-01-20 14:36:57.601 226890 DEBUG oslo_concurrency.lockutils [req-6587ea69-e61e-45b5-baf6-d4ab23ae2c54 req-f3b2351a-886e-4a78-a178-9aa649df998e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "2557de26-ca24-4910-aa11-c697dc150296-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:36:57 np0005588920 nova_compute[226886]: 2026-01-20 14:36:57.601 226890 DEBUG oslo_concurrency.lockutils [req-6587ea69-e61e-45b5-baf6-d4ab23ae2c54 req-f3b2351a-886e-4a78-a178-9aa649df998e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2557de26-ca24-4910-aa11-c697dc150296-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:36:57 np0005588920 nova_compute[226886]: 2026-01-20 14:36:57.601 226890 DEBUG oslo_concurrency.lockutils [req-6587ea69-e61e-45b5-baf6-d4ab23ae2c54 req-f3b2351a-886e-4a78-a178-9aa649df998e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2557de26-ca24-4910-aa11-c697dc150296-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:36:57 np0005588920 nova_compute[226886]: 2026-01-20 14:36:57.601 226890 DEBUG nova.compute.manager [req-6587ea69-e61e-45b5-baf6-d4ab23ae2c54 req-f3b2351a-886e-4a78-a178-9aa649df998e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Processing event network-vif-plugged-51f66a28-a093-4e55-a9ca-b7c718ebfc5a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:36:57 np0005588920 nova_compute[226886]: 2026-01-20 14:36:57.602 226890 DEBUG nova.compute.manager [req-6587ea69-e61e-45b5-baf6-d4ab23ae2c54 req-f3b2351a-886e-4a78-a178-9aa649df998e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Received event network-vif-plugged-51f66a28-a093-4e55-a9ca-b7c718ebfc5a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:36:57 np0005588920 nova_compute[226886]: 2026-01-20 14:36:57.602 226890 DEBUG oslo_concurrency.lockutils [req-6587ea69-e61e-45b5-baf6-d4ab23ae2c54 req-f3b2351a-886e-4a78-a178-9aa649df998e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "2557de26-ca24-4910-aa11-c697dc150296-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:36:57 np0005588920 nova_compute[226886]: 2026-01-20 14:36:57.602 226890 DEBUG oslo_concurrency.lockutils [req-6587ea69-e61e-45b5-baf6-d4ab23ae2c54 req-f3b2351a-886e-4a78-a178-9aa649df998e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2557de26-ca24-4910-aa11-c697dc150296-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:36:57 np0005588920 nova_compute[226886]: 2026-01-20 14:36:57.602 226890 DEBUG oslo_concurrency.lockutils [req-6587ea69-e61e-45b5-baf6-d4ab23ae2c54 req-f3b2351a-886e-4a78-a178-9aa649df998e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2557de26-ca24-4910-aa11-c697dc150296-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:36:57 np0005588920 nova_compute[226886]: 2026-01-20 14:36:57.602 226890 DEBUG nova.compute.manager [req-6587ea69-e61e-45b5-baf6-d4ab23ae2c54 req-f3b2351a-886e-4a78-a178-9aa649df998e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] No waiting events found dispatching network-vif-plugged-51f66a28-a093-4e55-a9ca-b7c718ebfc5a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:36:57 np0005588920 nova_compute[226886]: 2026-01-20 14:36:57.602 226890 WARNING nova.compute.manager [req-6587ea69-e61e-45b5-baf6-d4ab23ae2c54 req-f3b2351a-886e-4a78-a178-9aa649df998e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Received unexpected event network-vif-plugged-51f66a28-a093-4e55-a9ca-b7c718ebfc5a for instance with vm_state building and task_state spawning.#033[00m
Jan 20 09:36:57 np0005588920 nova_compute[226886]: 2026-01-20 14:36:57.603 226890 DEBUG nova.compute.manager [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:36:57 np0005588920 nova_compute[226886]: 2026-01-20 14:36:57.606 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919817.606539, 2557de26-ca24-4910-aa11-c697dc150296 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:36:57 np0005588920 nova_compute[226886]: 2026-01-20 14:36:57.607 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 2557de26-ca24-4910-aa11-c697dc150296] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:36:57 np0005588920 nova_compute[226886]: 2026-01-20 14:36:57.608 226890 DEBUG nova.virt.libvirt.driver [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:36:57 np0005588920 nova_compute[226886]: 2026-01-20 14:36:57.611 226890 INFO nova.virt.libvirt.driver [-] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Instance spawned successfully.#033[00m
Jan 20 09:36:57 np0005588920 nova_compute[226886]: 2026-01-20 14:36:57.611 226890 INFO nova.compute.manager [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Took 12.08 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:36:57 np0005588920 nova_compute[226886]: 2026-01-20 14:36:57.611 226890 DEBUG nova.compute.manager [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:36:57 np0005588920 nova_compute[226886]: 2026-01-20 14:36:57.629 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:36:57 np0005588920 nova_compute[226886]: 2026-01-20 14:36:57.632 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:36:57 np0005588920 nova_compute[226886]: 2026-01-20 14:36:57.672 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:57 np0005588920 nova_compute[226886]: 2026-01-20 14:36:57.676 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 2557de26-ca24-4910-aa11-c697dc150296] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:36:57 np0005588920 nova_compute[226886]: 2026-01-20 14:36:57.687 226890 INFO nova.compute.manager [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Took 13.95 seconds to build instance.#033[00m
Jan 20 09:36:57 np0005588920 nova_compute[226886]: 2026-01-20 14:36:57.742 226890 DEBUG oslo_concurrency.lockutils [None req-82dfb55c-8f3d-4f2a-96fa-8bb324f0f39b 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "2557de26-ca24-4910-aa11-c697dc150296" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:36:58 np0005588920 nova_compute[226886]: 2026-01-20 14:36:58.700 226890 DEBUG oslo_concurrency.lockutils [None req-0943bf4a-f11e-44e5-9174-064351e1c1b6 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Acquiring lock "2557de26-ca24-4910-aa11-c697dc150296" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:36:58 np0005588920 nova_compute[226886]: 2026-01-20 14:36:58.700 226890 DEBUG oslo_concurrency.lockutils [None req-0943bf4a-f11e-44e5-9174-064351e1c1b6 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "2557de26-ca24-4910-aa11-c697dc150296" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:36:58 np0005588920 nova_compute[226886]: 2026-01-20 14:36:58.701 226890 DEBUG oslo_concurrency.lockutils [None req-0943bf4a-f11e-44e5-9174-064351e1c1b6 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Acquiring lock "2557de26-ca24-4910-aa11-c697dc150296-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:36:58 np0005588920 nova_compute[226886]: 2026-01-20 14:36:58.701 226890 DEBUG oslo_concurrency.lockutils [None req-0943bf4a-f11e-44e5-9174-064351e1c1b6 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "2557de26-ca24-4910-aa11-c697dc150296-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:36:58 np0005588920 nova_compute[226886]: 2026-01-20 14:36:58.701 226890 DEBUG oslo_concurrency.lockutils [None req-0943bf4a-f11e-44e5-9174-064351e1c1b6 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "2557de26-ca24-4910-aa11-c697dc150296-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:36:58 np0005588920 nova_compute[226886]: 2026-01-20 14:36:58.702 226890 INFO nova.compute.manager [None req-0943bf4a-f11e-44e5-9174-064351e1c1b6 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Terminating instance#033[00m
Jan 20 09:36:58 np0005588920 nova_compute[226886]: 2026-01-20 14:36:58.703 226890 DEBUG nova.compute.manager [None req-0943bf4a-f11e-44e5-9174-064351e1c1b6 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:36:58 np0005588920 kernel: tap51f66a28-a0 (unregistering): left promiscuous mode
Jan 20 09:36:58 np0005588920 NetworkManager[49076]: <info>  [1768919818.7483] device (tap51f66a28-a0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:36:58 np0005588920 nova_compute[226886]: 2026-01-20 14:36:58.756 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:58 np0005588920 ovn_controller[133971]: 2026-01-20T14:36:58Z|00147|binding|INFO|Releasing lport 51f66a28-a093-4e55-a9ca-b7c718ebfc5a from this chassis (sb_readonly=0)
Jan 20 09:36:58 np0005588920 ovn_controller[133971]: 2026-01-20T14:36:58Z|00148|binding|INFO|Setting lport 51f66a28-a093-4e55-a9ca-b7c718ebfc5a down in Southbound
Jan 20 09:36:58 np0005588920 ovn_controller[133971]: 2026-01-20T14:36:58Z|00149|binding|INFO|Removing iface tap51f66a28-a0 ovn-installed in OVS
Jan 20 09:36:58 np0005588920 nova_compute[226886]: 2026-01-20 14:36:58.761 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:58.769 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ce:13:9c 10.100.0.5'], port_security=['fa:16:3e:ce:13:9c 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '2557de26-ca24-4910-aa11-c697dc150296', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-abb83e3e-0b12-431b-ad86-a1d271b5b46a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3750c56415134773aa9d9880038f1749', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2e302063-2ccd-4f7c-8835-ef521762a486', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4125934e-1dea-4e34-a38d-5291c850f0b2, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=51f66a28-a093-4e55-a9ca-b7c718ebfc5a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:36:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:58.771 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 51f66a28-a093-4e55-a9ca-b7c718ebfc5a in datapath abb83e3e-0b12-431b-ad86-a1d271b5b46a unbound from our chassis#033[00m
Jan 20 09:36:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:58.772 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network abb83e3e-0b12-431b-ad86-a1d271b5b46a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:36:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:58.773 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c01a2144-fea1-4894-87e7-f38769aeb6e6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:58.773 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a namespace which is not needed anymore#033[00m
Jan 20 09:36:58 np0005588920 nova_compute[226886]: 2026-01-20 14:36:58.775 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:58 np0005588920 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d00000037.scope: Deactivated successfully.
Jan 20 09:36:58 np0005588920 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d00000037.scope: Consumed 1.524s CPU time.
Jan 20 09:36:58 np0005588920 systemd-machined[196121]: Machine qemu-24-instance-00000037 terminated.
Jan 20 09:36:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:58.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:58 np0005588920 neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a[246760]: [NOTICE]   (246764) : haproxy version is 2.8.14-c23fe91
Jan 20 09:36:58 np0005588920 neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a[246760]: [NOTICE]   (246764) : path to executable is /usr/sbin/haproxy
Jan 20 09:36:58 np0005588920 neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a[246760]: [WARNING]  (246764) : Exiting Master process...
Jan 20 09:36:58 np0005588920 neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a[246760]: [ALERT]    (246764) : Current worker (246766) exited with code 143 (Terminated)
Jan 20 09:36:58 np0005588920 neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a[246760]: [WARNING]  (246764) : All workers exited. Exiting... (0)
Jan 20 09:36:58 np0005588920 systemd[1]: libpod-ac795d44221b79e41992ace15ab7d41fb457d7fcbed8aa6db79fabfcc750e833.scope: Deactivated successfully.
Jan 20 09:36:58 np0005588920 podman[246798]: 2026-01-20 14:36:58.922927946 +0000 UTC m=+0.044305382 container died ac795d44221b79e41992ace15ab7d41fb457d7fcbed8aa6db79fabfcc750e833 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:36:58 np0005588920 nova_compute[226886]: 2026-01-20 14:36:58.938 226890 INFO nova.virt.libvirt.driver [-] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Instance destroyed successfully.#033[00m
Jan 20 09:36:58 np0005588920 nova_compute[226886]: 2026-01-20 14:36:58.939 226890 DEBUG nova.objects.instance [None req-0943bf4a-f11e-44e5-9174-064351e1c1b6 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lazy-loading 'resources' on Instance uuid 2557de26-ca24-4910-aa11-c697dc150296 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:36:58 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ac795d44221b79e41992ace15ab7d41fb457d7fcbed8aa6db79fabfcc750e833-userdata-shm.mount: Deactivated successfully.
Jan 20 09:36:58 np0005588920 systemd[1]: var-lib-containers-storage-overlay-13bbd3bd4fbaa21b36f2a887d6e6c9fb156a8f0902b642990c288a27c4b9f48a-merged.mount: Deactivated successfully.
Jan 20 09:36:58 np0005588920 podman[246798]: 2026-01-20 14:36:58.996034206 +0000 UTC m=+0.117411652 container cleanup ac795d44221b79e41992ace15ab7d41fb457d7fcbed8aa6db79fabfcc750e833 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Jan 20 09:36:59 np0005588920 systemd[1]: libpod-conmon-ac795d44221b79e41992ace15ab7d41fb457d7fcbed8aa6db79fabfcc750e833.scope: Deactivated successfully.
Jan 20 09:36:59 np0005588920 podman[246836]: 2026-01-20 14:36:59.051498671 +0000 UTC m=+0.035901148 container remove ac795d44221b79e41992ace15ab7d41fb457d7fcbed8aa6db79fabfcc750e833 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 20 09:36:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:59.056 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5120171d-8fbc-462d-bb4e-6789df15ec8c]: (4, ('Tue Jan 20 02:36:58 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a (ac795d44221b79e41992ace15ab7d41fb457d7fcbed8aa6db79fabfcc750e833)\nac795d44221b79e41992ace15ab7d41fb457d7fcbed8aa6db79fabfcc750e833\nTue Jan 20 02:36:59 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a (ac795d44221b79e41992ace15ab7d41fb457d7fcbed8aa6db79fabfcc750e833)\nac795d44221b79e41992ace15ab7d41fb457d7fcbed8aa6db79fabfcc750e833\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:59.058 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b18c495c-faff-41e2-b8e4-75ac59d2141f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:59.059 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapabb83e3e-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:36:59 np0005588920 nova_compute[226886]: 2026-01-20 14:36:59.060 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:59 np0005588920 kernel: tapabb83e3e-00: left promiscuous mode
Jan 20 09:36:59 np0005588920 nova_compute[226886]: 2026-01-20 14:36:59.080 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:59.084 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[467ee900-a65e-4b9e-90b0-c1e6b0f5318a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:59.098 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[6a5e8fc8-bbe5-4c79-ba0f-167464272c4f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:59.099 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[7247703e-d1e9-4b40-8092-f3ee49fdcb73]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:59.114 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f64acdf3-e602-469d-9459-9a6c8ffb1bf3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 485743, 'reachable_time': 38461, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246855, 'error': None, 'target': 'ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:59 np0005588920 systemd[1]: run-netns-ovnmeta\x2dabb83e3e\x2d0b12\x2d431b\x2dad86\x2da1d271b5b46a.mount: Deactivated successfully.
Jan 20 09:36:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:59.116 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:36:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:36:59.116 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[09efeb8b-0dd2-4543-b73b-f911c1c9c2a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:36:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:59.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:59 np0005588920 nova_compute[226886]: 2026-01-20 14:36:59.609 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:59 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e220 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:36:59 np0005588920 nova_compute[226886]: 2026-01-20 14:36:59.880 226890 DEBUG nova.virt.libvirt.vif [None req-0943bf4a-f11e-44e5-9174-064351e1c1b6 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:36:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-2069954523',display_name='tempest-ImagesTestJSON-server-2069954523',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-2069954523',id=55,image_ref='d9608a6b-abac-47e3-a9dd-70a6230a92c0',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:36:57Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3750c56415134773aa9d9880038f1749',ramdisk_id='',reservation_id='r-gzsfcx0e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_boot_roles='member,reader',image_container_format='bare',image_disk_format='raw',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virt
io',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='59387c9d-df91-4f43-b389-00174486fc84',image_min_disk='1',image_min_ram='0',image_owner_id='3750c56415134773aa9d9880038f1749',image_owner_project_name='tempest-ImagesTestJSON-338390217',image_owner_user_name='tempest-ImagesTestJSON-338390217-project-member',image_user_id='56e2959629114d3d8a48e7a80ed96c4b',owner_project_name='tempest-ImagesTestJSON-338390217',owner_user_name='tempest-ImagesTestJSON-338390217-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:36:57Z,user_data=None,user_id='56e2959629114d3d8a48e7a80ed96c4b',uuid=2557de26-ca24-4910-aa11-c697dc150296,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "51f66a28-a093-4e55-a9ca-b7c718ebfc5a", "address": "fa:16:3e:ce:13:9c", "network": {"id": "abb83e3e-0b12-431b-ad86-a1d271b5b46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-766235638-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3750c56415134773aa9d9880038f1749", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51f66a28-a0", "ovs_interfaceid": "51f66a28-a093-4e55-a9ca-b7c718ebfc5a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:36:59 np0005588920 nova_compute[226886]: 2026-01-20 14:36:59.881 226890 DEBUG nova.network.os_vif_util [None req-0943bf4a-f11e-44e5-9174-064351e1c1b6 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Converting VIF {"id": "51f66a28-a093-4e55-a9ca-b7c718ebfc5a", "address": "fa:16:3e:ce:13:9c", "network": {"id": "abb83e3e-0b12-431b-ad86-a1d271b5b46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-766235638-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3750c56415134773aa9d9880038f1749", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51f66a28-a0", "ovs_interfaceid": "51f66a28-a093-4e55-a9ca-b7c718ebfc5a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:36:59 np0005588920 nova_compute[226886]: 2026-01-20 14:36:59.881 226890 DEBUG nova.network.os_vif_util [None req-0943bf4a-f11e-44e5-9174-064351e1c1b6 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ce:13:9c,bridge_name='br-int',has_traffic_filtering=True,id=51f66a28-a093-4e55-a9ca-b7c718ebfc5a,network=Network(abb83e3e-0b12-431b-ad86-a1d271b5b46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap51f66a28-a0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:36:59 np0005588920 nova_compute[226886]: 2026-01-20 14:36:59.881 226890 DEBUG os_vif [None req-0943bf4a-f11e-44e5-9174-064351e1c1b6 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ce:13:9c,bridge_name='br-int',has_traffic_filtering=True,id=51f66a28-a093-4e55-a9ca-b7c718ebfc5a,network=Network(abb83e3e-0b12-431b-ad86-a1d271b5b46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap51f66a28-a0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:36:59 np0005588920 nova_compute[226886]: 2026-01-20 14:36:59.883 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:59 np0005588920 nova_compute[226886]: 2026-01-20 14:36:59.884 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap51f66a28-a0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:36:59 np0005588920 nova_compute[226886]: 2026-01-20 14:36:59.885 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:59 np0005588920 nova_compute[226886]: 2026-01-20 14:36:59.887 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:36:59 np0005588920 nova_compute[226886]: 2026-01-20 14:36:59.889 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:59 np0005588920 nova_compute[226886]: 2026-01-20 14:36:59.893 226890 INFO os_vif [None req-0943bf4a-f11e-44e5-9174-064351e1c1b6 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ce:13:9c,bridge_name='br-int',has_traffic_filtering=True,id=51f66a28-a093-4e55-a9ca-b7c718ebfc5a,network=Network(abb83e3e-0b12-431b-ad86-a1d271b5b46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap51f66a28-a0')#033[00m
Jan 20 09:37:00 np0005588920 nova_compute[226886]: 2026-01-20 14:37:00.256 226890 INFO nova.virt.libvirt.driver [None req-0943bf4a-f11e-44e5-9174-064351e1c1b6 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Deleting instance files /var/lib/nova/instances/2557de26-ca24-4910-aa11-c697dc150296_del#033[00m
Jan 20 09:37:00 np0005588920 nova_compute[226886]: 2026-01-20 14:37:00.257 226890 INFO nova.virt.libvirt.driver [None req-0943bf4a-f11e-44e5-9174-064351e1c1b6 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Deletion of /var/lib/nova/instances/2557de26-ca24-4910-aa11-c697dc150296_del complete#033[00m
Jan 20 09:37:00 np0005588920 nova_compute[226886]: 2026-01-20 14:37:00.354 226890 INFO nova.compute.manager [None req-0943bf4a-f11e-44e5-9174-064351e1c1b6 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Took 1.65 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:37:00 np0005588920 nova_compute[226886]: 2026-01-20 14:37:00.355 226890 DEBUG oslo.service.loopingcall [None req-0943bf4a-f11e-44e5-9174-064351e1c1b6 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:37:00 np0005588920 nova_compute[226886]: 2026-01-20 14:37:00.356 226890 DEBUG nova.compute.manager [-] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:37:00 np0005588920 nova_compute[226886]: 2026-01-20 14:37:00.356 226890 DEBUG nova.network.neutron [-] [instance: 2557de26-ca24-4910-aa11-c697dc150296] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:37:00 np0005588920 nova_compute[226886]: 2026-01-20 14:37:00.690 226890 DEBUG nova.compute.manager [req-9482c594-657e-4973-99f7-9bf3d0ccb187 req-3e680a1b-9b08-438b-8854-fe66f6f57367 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Received event network-vif-unplugged-51f66a28-a093-4e55-a9ca-b7c718ebfc5a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:37:00 np0005588920 nova_compute[226886]: 2026-01-20 14:37:00.691 226890 DEBUG oslo_concurrency.lockutils [req-9482c594-657e-4973-99f7-9bf3d0ccb187 req-3e680a1b-9b08-438b-8854-fe66f6f57367 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "2557de26-ca24-4910-aa11-c697dc150296-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:37:00 np0005588920 nova_compute[226886]: 2026-01-20 14:37:00.691 226890 DEBUG oslo_concurrency.lockutils [req-9482c594-657e-4973-99f7-9bf3d0ccb187 req-3e680a1b-9b08-438b-8854-fe66f6f57367 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2557de26-ca24-4910-aa11-c697dc150296-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:37:00 np0005588920 nova_compute[226886]: 2026-01-20 14:37:00.691 226890 DEBUG oslo_concurrency.lockutils [req-9482c594-657e-4973-99f7-9bf3d0ccb187 req-3e680a1b-9b08-438b-8854-fe66f6f57367 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2557de26-ca24-4910-aa11-c697dc150296-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:37:00 np0005588920 nova_compute[226886]: 2026-01-20 14:37:00.692 226890 DEBUG nova.compute.manager [req-9482c594-657e-4973-99f7-9bf3d0ccb187 req-3e680a1b-9b08-438b-8854-fe66f6f57367 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] No waiting events found dispatching network-vif-unplugged-51f66a28-a093-4e55-a9ca-b7c718ebfc5a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:37:00 np0005588920 nova_compute[226886]: 2026-01-20 14:37:00.692 226890 DEBUG nova.compute.manager [req-9482c594-657e-4973-99f7-9bf3d0ccb187 req-3e680a1b-9b08-438b-8854-fe66f6f57367 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Received event network-vif-unplugged-51f66a28-a093-4e55-a9ca-b7c718ebfc5a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:37:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:00.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:00 np0005588920 podman[246875]: 2026-01-20 14:37:00.964429263 +0000 UTC m=+0.048368347 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 20 09:37:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:01.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:02.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:37:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:03.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:37:03 np0005588920 nova_compute[226886]: 2026-01-20 14:37:03.465 226890 DEBUG nova.compute.manager [req-540c304a-dc3a-4c1f-97e3-871bd0661f3e req-764d35c4-af27-434e-8adc-afaa8574f1f3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Received event network-vif-deleted-51f66a28-a093-4e55-a9ca-b7c718ebfc5a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:37:03 np0005588920 nova_compute[226886]: 2026-01-20 14:37:03.466 226890 INFO nova.compute.manager [req-540c304a-dc3a-4c1f-97e3-871bd0661f3e req-764d35c4-af27-434e-8adc-afaa8574f1f3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Neutron deleted interface 51f66a28-a093-4e55-a9ca-b7c718ebfc5a; detaching it from the instance and deleting it from the info cache#033[00m
Jan 20 09:37:03 np0005588920 nova_compute[226886]: 2026-01-20 14:37:03.466 226890 DEBUG nova.network.neutron [req-540c304a-dc3a-4c1f-97e3-871bd0661f3e req-764d35c4-af27-434e-8adc-afaa8574f1f3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:37:03 np0005588920 nova_compute[226886]: 2026-01-20 14:37:03.491 226890 DEBUG nova.network.neutron [-] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:37:03 np0005588920 nova_compute[226886]: 2026-01-20 14:37:03.522 226890 INFO nova.compute.manager [-] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Took 3.17 seconds to deallocate network for instance.#033[00m
Jan 20 09:37:03 np0005588920 nova_compute[226886]: 2026-01-20 14:37:03.526 226890 DEBUG nova.compute.manager [req-540c304a-dc3a-4c1f-97e3-871bd0661f3e req-764d35c4-af27-434e-8adc-afaa8574f1f3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Detach interface failed, port_id=51f66a28-a093-4e55-a9ca-b7c718ebfc5a, reason: Instance 2557de26-ca24-4910-aa11-c697dc150296 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 20 09:37:03 np0005588920 nova_compute[226886]: 2026-01-20 14:37:03.578 226890 DEBUG oslo_concurrency.lockutils [None req-0943bf4a-f11e-44e5-9174-064351e1c1b6 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:37:03 np0005588920 nova_compute[226886]: 2026-01-20 14:37:03.579 226890 DEBUG oslo_concurrency.lockutils [None req-0943bf4a-f11e-44e5-9174-064351e1c1b6 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:37:03 np0005588920 nova_compute[226886]: 2026-01-20 14:37:03.971 226890 DEBUG oslo_concurrency.processutils [None req-0943bf4a-f11e-44e5-9174-064351e1c1b6 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:37:04 np0005588920 nova_compute[226886]: 2026-01-20 14:37:04.009 226890 DEBUG nova.compute.manager [req-1acebad5-a9d0-4701-aa5f-24b40a09f5c5 req-0a7434d6-edfa-4007-a2f0-c574bca55857 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Received event network-vif-plugged-51f66a28-a093-4e55-a9ca-b7c718ebfc5a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:37:04 np0005588920 nova_compute[226886]: 2026-01-20 14:37:04.010 226890 DEBUG oslo_concurrency.lockutils [req-1acebad5-a9d0-4701-aa5f-24b40a09f5c5 req-0a7434d6-edfa-4007-a2f0-c574bca55857 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "2557de26-ca24-4910-aa11-c697dc150296-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:37:04 np0005588920 nova_compute[226886]: 2026-01-20 14:37:04.011 226890 DEBUG oslo_concurrency.lockutils [req-1acebad5-a9d0-4701-aa5f-24b40a09f5c5 req-0a7434d6-edfa-4007-a2f0-c574bca55857 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2557de26-ca24-4910-aa11-c697dc150296-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:37:04 np0005588920 nova_compute[226886]: 2026-01-20 14:37:04.011 226890 DEBUG oslo_concurrency.lockutils [req-1acebad5-a9d0-4701-aa5f-24b40a09f5c5 req-0a7434d6-edfa-4007-a2f0-c574bca55857 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2557de26-ca24-4910-aa11-c697dc150296-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:37:04 np0005588920 nova_compute[226886]: 2026-01-20 14:37:04.011 226890 DEBUG nova.compute.manager [req-1acebad5-a9d0-4701-aa5f-24b40a09f5c5 req-0a7434d6-edfa-4007-a2f0-c574bca55857 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] No waiting events found dispatching network-vif-plugged-51f66a28-a093-4e55-a9ca-b7c718ebfc5a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:37:04 np0005588920 nova_compute[226886]: 2026-01-20 14:37:04.012 226890 WARNING nova.compute.manager [req-1acebad5-a9d0-4701-aa5f-24b40a09f5c5 req-0a7434d6-edfa-4007-a2f0-c574bca55857 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Received unexpected event network-vif-plugged-51f66a28-a093-4e55-a9ca-b7c718ebfc5a for instance with vm_state deleted and task_state None.#033[00m
Jan 20 09:37:04 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:37:04 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/287032635' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:37:04 np0005588920 nova_compute[226886]: 2026-01-20 14:37:04.522 226890 DEBUG oslo_concurrency.processutils [None req-0943bf4a-f11e-44e5-9174-064351e1c1b6 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.551s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:37:04 np0005588920 nova_compute[226886]: 2026-01-20 14:37:04.527 226890 DEBUG nova.compute.provider_tree [None req-0943bf4a-f11e-44e5-9174-064351e1c1b6 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:37:04 np0005588920 nova_compute[226886]: 2026-01-20 14:37:04.554 226890 DEBUG nova.scheduler.client.report [None req-0943bf4a-f11e-44e5-9174-064351e1c1b6 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:37:04 np0005588920 nova_compute[226886]: 2026-01-20 14:37:04.587 226890 DEBUG oslo_concurrency.lockutils [None req-0943bf4a-f11e-44e5-9174-064351e1c1b6 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.008s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:37:04 np0005588920 nova_compute[226886]: 2026-01-20 14:37:04.624 226890 INFO nova.scheduler.client.report [None req-0943bf4a-f11e-44e5-9174-064351e1c1b6 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Deleted allocations for instance 2557de26-ca24-4910-aa11-c697dc150296#033[00m
Jan 20 09:37:04 np0005588920 nova_compute[226886]: 2026-01-20 14:37:04.684 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:04 np0005588920 nova_compute[226886]: 2026-01-20 14:37:04.710 226890 DEBUG oslo_concurrency.lockutils [None req-0943bf4a-f11e-44e5-9174-064351e1c1b6 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "2557de26-ca24-4910-aa11-c697dc150296" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.010s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:37:04 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e220 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:37:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:37:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:04.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:37:04 np0005588920 nova_compute[226886]: 2026-01-20 14:37:04.888 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:05.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.002000056s ======
Jan 20 09:37:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:06.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000056s
Jan 20 09:37:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:07.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e221 e221: 3 total, 3 up, 3 in
Jan 20 09:37:08 np0005588920 ovn_controller[133971]: 2026-01-20T14:37:08Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:cd:86:1f 10.100.0.11
Jan 20 09:37:08 np0005588920 ovn_controller[133971]: 2026-01-20T14:37:08Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:cd:86:1f 10.100.0.11
Jan 20 09:37:08 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:37:08 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:37:08 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:37:08 np0005588920 nova_compute[226886]: 2026-01-20 14:37:08.581 226890 DEBUG nova.compute.manager [None req-b99a27ec-68ac-495d-ab87-42441c3dc287 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:37:08 np0005588920 nova_compute[226886]: 2026-01-20 14:37:08.635 226890 INFO nova.compute.manager [None req-b99a27ec-68ac-495d-ab87-42441c3dc287 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] instance snapshotting#033[00m
Jan 20 09:37:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:37:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:08.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:37:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:09.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:09 np0005588920 nova_compute[226886]: 2026-01-20 14:37:09.273 226890 INFO nova.virt.libvirt.driver [None req-b99a27ec-68ac-495d-ab87-42441c3dc287 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Beginning live snapshot process#033[00m
Jan 20 09:37:09 np0005588920 nova_compute[226886]: 2026-01-20 14:37:09.452 226890 DEBUG nova.virt.libvirt.imagebackend [None req-b99a27ec-68ac-495d-ab87-42441c3dc287 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] No parent info for a32b3e07-16d8-46fd-9a7b-c242c432fcf9; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Jan 20 09:37:09 np0005588920 nova_compute[226886]: 2026-01-20 14:37:09.725 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:09 np0005588920 nova_compute[226886]: 2026-01-20 14:37:09.734 226890 DEBUG nova.storage.rbd_utils [None req-b99a27ec-68ac-495d-ab87-42441c3dc287 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] creating snapshot(7e947617ede7466a8af43956e63c15bc) on rbd image(f59628d0-8f85-42c2-93ff-a052df4ac20e_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 20 09:37:10 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e221 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:37:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:10.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:10 np0005588920 nova_compute[226886]: 2026-01-20 14:37:10.917 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:11.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:11 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e222 e222: 3 total, 3 up, 3 in
Jan 20 09:37:11 np0005588920 nova_compute[226886]: 2026-01-20 14:37:11.975 226890 DEBUG nova.storage.rbd_utils [None req-b99a27ec-68ac-495d-ab87-42441c3dc287 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] cloning vms/f59628d0-8f85-42c2-93ff-a052df4ac20e_disk@7e947617ede7466a8af43956e63c15bc to images/d81709b0-d34f-4017-a47c-7ee2d9a8bd84 clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 20 09:37:12 np0005588920 nova_compute[226886]: 2026-01-20 14:37:12.118 226890 DEBUG nova.storage.rbd_utils [None req-b99a27ec-68ac-495d-ab87-42441c3dc287 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] flattening images/d81709b0-d34f-4017-a47c-7ee2d9a8bd84 flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 20 09:37:12 np0005588920 nova_compute[226886]: 2026-01-20 14:37:12.481 226890 DEBUG nova.storage.rbd_utils [None req-b99a27ec-68ac-495d-ab87-42441c3dc287 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] removing snapshot(7e947617ede7466a8af43956e63c15bc) on rbd image(f59628d0-8f85-42c2-93ff-a052df4ac20e_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 20 09:37:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e223 e223: 3 total, 3 up, 3 in
Jan 20 09:37:12 np0005588920 nova_compute[226886]: 2026-01-20 14:37:12.856 226890 DEBUG nova.storage.rbd_utils [None req-b99a27ec-68ac-495d-ab87-42441c3dc287 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] creating snapshot(snap) on rbd image(d81709b0-d34f-4017-a47c-7ee2d9a8bd84) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 20 09:37:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:12.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:13.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e224 e224: 3 total, 3 up, 3 in
Jan 20 09:37:13 np0005588920 nova_compute[226886]: 2026-01-20 14:37:13.937 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768919818.9368317, 2557de26-ca24-4910-aa11-c697dc150296 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:37:13 np0005588920 nova_compute[226886]: 2026-01-20 14:37:13.937 226890 INFO nova.compute.manager [-] [instance: 2557de26-ca24-4910-aa11-c697dc150296] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:37:13 np0005588920 nova_compute[226886]: 2026-01-20 14:37:13.978 226890 DEBUG nova.compute.manager [None req-1933bf41-4f34-474f-8889-59a6db5cef9c - - - - - -] [instance: 2557de26-ca24-4910-aa11-c697dc150296] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:37:13 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:37:13 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver [None req-b99a27ec-68ac-495d-ab87-42441c3dc287 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Failed to snapshot image: nova.exception.ImageNotFound: Image d81709b0-d34f-4017-a47c-7ee2d9a8bd84 could not be found.
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver Traceback (most recent call last):
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 691, in update
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver     image = self._update_v2(context, sent_service_image_meta, data)
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 700, in _update_v2
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver     image = self._client.call(
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 191, in call
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver     result = getattr(controller, method)(*args, **kwargs)
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/v2/images.py", line 440, in update
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver     unvalidated_image = self.get(image_id)
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/v2/images.py", line 197, in get
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver     return self._get(image_id)
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/common/utils.py", line 649, in inner
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver     return RequestIdProxy(wrapped(*args, **kwargs))
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/v2/images.py", line 190, in _get
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver     resp, body = self.http_client.get(url, headers=header)
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/keystoneauth1/adapter.py", line 395, in get
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver     return self.request(url, 'GET', **kwargs)
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/common/http.py", line 380, in request
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver     return self._handle_response(resp)
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/common/http.py", line 120, in _handle_response
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver     raise exc.from_response(resp, resp.content)
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver glanceclient.exc.HTTPNotFound: HTTP 404 Not Found: No image found with ID d81709b0-d34f-4017-a47c-7ee2d9a8bd84
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver 
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver During handling of the above exception, another exception occurred:
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver 
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver Traceback (most recent call last):
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 3082, in snapshot
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver     self._image_api.update(context, image_id, metadata,
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 1243, in update
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver     return session.update(context, image_id, image_info, data=data,
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 693, in update
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver     _reraise_translated_image_exception(image_id)
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver     raise new_exc.with_traceback(exc_trace)
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 691, in update
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver     image = self._update_v2(context, sent_service_image_meta, data)
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 700, in _update_v2
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver     image = self._client.call(
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 191, in call
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver     result = getattr(controller, method)(*args, **kwargs)
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/v2/images.py", line 440, in update
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver     unvalidated_image = self.get(image_id)
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/v2/images.py", line 197, in get
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver     return self._get(image_id)
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/common/utils.py", line 649, in inner
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver     return RequestIdProxy(wrapped(*args, **kwargs))
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/v2/images.py", line 190, in _get
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver     resp, body = self.http_client.get(url, headers=header)
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/keystoneauth1/adapter.py", line 395, in get
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver     return self.request(url, 'GET', **kwargs)
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/common/http.py", line 380, in request
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver     return self._handle_response(resp)
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/common/http.py", line 120, in _handle_response
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver     raise exc.from_response(resp, resp.content)
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver nova.exception.ImageNotFound: Image d81709b0-d34f-4017-a47c-7ee2d9a8bd84 could not be found.
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.117 226890 ERROR nova.virt.libvirt.driver #033[00m
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.216 226890 DEBUG nova.storage.rbd_utils [None req-b99a27ec-68ac-495d-ab87-42441c3dc287 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] removing snapshot(snap) on rbd image(d81709b0-d34f-4017-a47c-7ee2d9a8bd84) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 20 09:37:14 np0005588920 nova_compute[226886]: 2026-01-20 14:37:14.743 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:14.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:15 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e225 e225: 3 total, 3 up, 3 in
Jan 20 09:37:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:15.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:15 np0005588920 nova_compute[226886]: 2026-01-20 14:37:15.378 226890 WARNING nova.compute.manager [None req-b99a27ec-68ac-495d-ab87-42441c3dc287 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Image not found during snapshot: nova.exception.ImageNotFound: Image d81709b0-d34f-4017-a47c-7ee2d9a8bd84 could not be found.#033[00m
Jan 20 09:37:15 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e225 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:37:15 np0005588920 nova_compute[226886]: 2026-01-20 14:37:15.953 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:37:16.440 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:37:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:37:16.440 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:37:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:37:16.441 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:37:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:37:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:16.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:37:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:17.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e226 e226: 3 total, 3 up, 3 in
Jan 20 09:37:18 np0005588920 nova_compute[226886]: 2026-01-20 14:37:18.408 226890 DEBUG oslo_concurrency.lockutils [None req-ff438d22-3b4f-4195-a7ed-fa1058192ab1 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Acquiring lock "f59628d0-8f85-42c2-93ff-a052df4ac20e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:37:18 np0005588920 nova_compute[226886]: 2026-01-20 14:37:18.409 226890 DEBUG oslo_concurrency.lockutils [None req-ff438d22-3b4f-4195-a7ed-fa1058192ab1 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Lock "f59628d0-8f85-42c2-93ff-a052df4ac20e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:37:18 np0005588920 nova_compute[226886]: 2026-01-20 14:37:18.410 226890 DEBUG oslo_concurrency.lockutils [None req-ff438d22-3b4f-4195-a7ed-fa1058192ab1 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Acquiring lock "f59628d0-8f85-42c2-93ff-a052df4ac20e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:37:18 np0005588920 nova_compute[226886]: 2026-01-20 14:37:18.411 226890 DEBUG oslo_concurrency.lockutils [None req-ff438d22-3b4f-4195-a7ed-fa1058192ab1 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Lock "f59628d0-8f85-42c2-93ff-a052df4ac20e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:37:18 np0005588920 nova_compute[226886]: 2026-01-20 14:37:18.411 226890 DEBUG oslo_concurrency.lockutils [None req-ff438d22-3b4f-4195-a7ed-fa1058192ab1 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Lock "f59628d0-8f85-42c2-93ff-a052df4ac20e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:37:18 np0005588920 nova_compute[226886]: 2026-01-20 14:37:18.412 226890 INFO nova.compute.manager [None req-ff438d22-3b4f-4195-a7ed-fa1058192ab1 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Terminating instance#033[00m
Jan 20 09:37:18 np0005588920 nova_compute[226886]: 2026-01-20 14:37:18.413 226890 DEBUG nova.compute.manager [None req-ff438d22-3b4f-4195-a7ed-fa1058192ab1 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:37:18 np0005588920 kernel: tap554f0c44-80 (unregistering): left promiscuous mode
Jan 20 09:37:18 np0005588920 NetworkManager[49076]: <info>  [1768919838.4727] device (tap554f0c44-80): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:37:18 np0005588920 nova_compute[226886]: 2026-01-20 14:37:18.491 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:18 np0005588920 ovn_controller[133971]: 2026-01-20T14:37:18Z|00150|binding|INFO|Releasing lport 554f0c44-80f7-4a8b-84cb-85b70f4889a0 from this chassis (sb_readonly=0)
Jan 20 09:37:18 np0005588920 ovn_controller[133971]: 2026-01-20T14:37:18Z|00151|binding|INFO|Setting lport 554f0c44-80f7-4a8b-84cb-85b70f4889a0 down in Southbound
Jan 20 09:37:18 np0005588920 ovn_controller[133971]: 2026-01-20T14:37:18Z|00152|binding|INFO|Removing iface tap554f0c44-80 ovn-installed in OVS
Jan 20 09:37:18 np0005588920 nova_compute[226886]: 2026-01-20 14:37:18.493 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:18 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:37:18.500 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cd:86:1f 10.100.0.11'], port_security=['fa:16:3e:cd:86:1f 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'f59628d0-8f85-42c2-93ff-a052df4ac20e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7d78990d13704d629a8a3e8910d005c5', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3763ece7-c739-40ca-8e07-6dde1584ba85', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a613141e-df34-49c4-9712-c3d232327d6b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=554f0c44-80f7-4a8b-84cb-85b70f4889a0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:37:18 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:37:18.502 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 554f0c44-80f7-4a8b-84cb-85b70f4889a0 in datapath b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23 unbound from our chassis#033[00m
Jan 20 09:37:18 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:37:18.504 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:37:18 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:37:18.506 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5114fe05-7eea-4b0c-b2c6-a96beb481541]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:18 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:37:18.507 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23 namespace which is not needed anymore#033[00m
Jan 20 09:37:18 np0005588920 nova_compute[226886]: 2026-01-20 14:37:18.516 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:18 np0005588920 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d00000038.scope: Deactivated successfully.
Jan 20 09:37:18 np0005588920 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d00000038.scope: Consumed 14.170s CPU time.
Jan 20 09:37:18 np0005588920 systemd-machined[196121]: Machine qemu-23-instance-00000038 terminated.
Jan 20 09:37:18 np0005588920 podman[247277]: 2026-01-20 14:37:18.627022611 +0000 UTC m=+0.122805054 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 20 09:37:18 np0005588920 nova_compute[226886]: 2026-01-20 14:37:18.647 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:18 np0005588920 nova_compute[226886]: 2026-01-20 14:37:18.655 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:18 np0005588920 nova_compute[226886]: 2026-01-20 14:37:18.659 226890 INFO nova.virt.libvirt.driver [-] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Instance destroyed successfully.#033[00m
Jan 20 09:37:18 np0005588920 nova_compute[226886]: 2026-01-20 14:37:18.660 226890 DEBUG nova.objects.instance [None req-ff438d22-3b4f-4195-a7ed-fa1058192ab1 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Lazy-loading 'resources' on Instance uuid f59628d0-8f85-42c2-93ff-a052df4ac20e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:37:18 np0005588920 nova_compute[226886]: 2026-01-20 14:37:18.688 226890 DEBUG nova.virt.libvirt.vif [None req-ff438d22-3b4f-4195-a7ed-fa1058192ab1 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:36:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-36871268',display_name='tempest-ImagesOneServerNegativeTestJSON-server-36871268',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-36871268',id=56,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:36:55Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7d78990d13704d629a8a3e8910d005c5',ramdisk_id='',reservation_id='r-oo656c53',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-866315696',owner_user_name='tempest-ImagesOneServerNegativeTestJSON-866315696-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:37:15Z,user_data=None,user_id='592a0204f38a4596ab1ab81774214a6d',uuid=f59628d0-8f85-42c2-93ff-a052df4ac20e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "554f0c44-80f7-4a8b-84cb-85b70f4889a0", "address": "fa:16:3e:cd:86:1f", "network": {"id": "b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-462971735-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7d78990d13704d629a8a3e8910d005c5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap554f0c44-80", "ovs_interfaceid": "554f0c44-80f7-4a8b-84cb-85b70f4889a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:37:18 np0005588920 nova_compute[226886]: 2026-01-20 14:37:18.689 226890 DEBUG nova.network.os_vif_util [None req-ff438d22-3b4f-4195-a7ed-fa1058192ab1 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Converting VIF {"id": "554f0c44-80f7-4a8b-84cb-85b70f4889a0", "address": "fa:16:3e:cd:86:1f", "network": {"id": "b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-462971735-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7d78990d13704d629a8a3e8910d005c5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap554f0c44-80", "ovs_interfaceid": "554f0c44-80f7-4a8b-84cb-85b70f4889a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:37:18 np0005588920 nova_compute[226886]: 2026-01-20 14:37:18.691 226890 DEBUG nova.network.os_vif_util [None req-ff438d22-3b4f-4195-a7ed-fa1058192ab1 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cd:86:1f,bridge_name='br-int',has_traffic_filtering=True,id=554f0c44-80f7-4a8b-84cb-85b70f4889a0,network=Network(b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap554f0c44-80') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:37:18 np0005588920 nova_compute[226886]: 2026-01-20 14:37:18.696 226890 DEBUG os_vif [None req-ff438d22-3b4f-4195-a7ed-fa1058192ab1 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cd:86:1f,bridge_name='br-int',has_traffic_filtering=True,id=554f0c44-80f7-4a8b-84cb-85b70f4889a0,network=Network(b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap554f0c44-80') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:37:18 np0005588920 nova_compute[226886]: 2026-01-20 14:37:18.698 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:18 np0005588920 nova_compute[226886]: 2026-01-20 14:37:18.699 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap554f0c44-80, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:37:18 np0005588920 nova_compute[226886]: 2026-01-20 14:37:18.701 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:18 np0005588920 nova_compute[226886]: 2026-01-20 14:37:18.702 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:18 np0005588920 neutron-haproxy-ovnmeta-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23[246686]: [NOTICE]   (246690) : haproxy version is 2.8.14-c23fe91
Jan 20 09:37:18 np0005588920 neutron-haproxy-ovnmeta-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23[246686]: [NOTICE]   (246690) : path to executable is /usr/sbin/haproxy
Jan 20 09:37:18 np0005588920 neutron-haproxy-ovnmeta-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23[246686]: [WARNING]  (246690) : Exiting Master process...
Jan 20 09:37:18 np0005588920 nova_compute[226886]: 2026-01-20 14:37:18.707 226890 INFO os_vif [None req-ff438d22-3b4f-4195-a7ed-fa1058192ab1 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cd:86:1f,bridge_name='br-int',has_traffic_filtering=True,id=554f0c44-80f7-4a8b-84cb-85b70f4889a0,network=Network(b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap554f0c44-80')#033[00m
Jan 20 09:37:18 np0005588920 neutron-haproxy-ovnmeta-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23[246686]: [WARNING]  (246690) : Exiting Master process...
Jan 20 09:37:18 np0005588920 neutron-haproxy-ovnmeta-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23[246686]: [ALERT]    (246690) : Current worker (246692) exited with code 143 (Terminated)
Jan 20 09:37:18 np0005588920 neutron-haproxy-ovnmeta-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23[246686]: [WARNING]  (246690) : All workers exited. Exiting... (0)
Jan 20 09:37:18 np0005588920 systemd[1]: libpod-d60cd84441c497571c437c747bcf00e3fe9edbdb12a0c8208d1a55f117b9b3b8.scope: Deactivated successfully.
Jan 20 09:37:18 np0005588920 podman[247329]: 2026-01-20 14:37:18.717653262 +0000 UTC m=+0.070039565 container died d60cd84441c497571c437c747bcf00e3fe9edbdb12a0c8208d1a55f117b9b3b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 20 09:37:18 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d60cd84441c497571c437c747bcf00e3fe9edbdb12a0c8208d1a55f117b9b3b8-userdata-shm.mount: Deactivated successfully.
Jan 20 09:37:18 np0005588920 systemd[1]: var-lib-containers-storage-overlay-53a2b302adf6a116746d9faa9c8f11e8958d034dd25562c4d4965c5ef2cff97d-merged.mount: Deactivated successfully.
Jan 20 09:37:18 np0005588920 podman[247329]: 2026-01-20 14:37:18.757915271 +0000 UTC m=+0.110301584 container cleanup d60cd84441c497571c437c747bcf00e3fe9edbdb12a0c8208d1a55f117b9b3b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 20 09:37:18 np0005588920 nova_compute[226886]: 2026-01-20 14:37:18.761 226890 DEBUG nova.compute.manager [req-dcce0560-0dbe-4f73-b8f5-703afd6a39f9 req-3ee73261-5e66-43dc-9378-421725899b1e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Received event network-vif-unplugged-554f0c44-80f7-4a8b-84cb-85b70f4889a0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:37:18 np0005588920 nova_compute[226886]: 2026-01-20 14:37:18.762 226890 DEBUG oslo_concurrency.lockutils [req-dcce0560-0dbe-4f73-b8f5-703afd6a39f9 req-3ee73261-5e66-43dc-9378-421725899b1e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f59628d0-8f85-42c2-93ff-a052df4ac20e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:37:18 np0005588920 nova_compute[226886]: 2026-01-20 14:37:18.762 226890 DEBUG oslo_concurrency.lockutils [req-dcce0560-0dbe-4f73-b8f5-703afd6a39f9 req-3ee73261-5e66-43dc-9378-421725899b1e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f59628d0-8f85-42c2-93ff-a052df4ac20e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:37:18 np0005588920 nova_compute[226886]: 2026-01-20 14:37:18.762 226890 DEBUG oslo_concurrency.lockutils [req-dcce0560-0dbe-4f73-b8f5-703afd6a39f9 req-3ee73261-5e66-43dc-9378-421725899b1e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f59628d0-8f85-42c2-93ff-a052df4ac20e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:37:18 np0005588920 nova_compute[226886]: 2026-01-20 14:37:18.763 226890 DEBUG nova.compute.manager [req-dcce0560-0dbe-4f73-b8f5-703afd6a39f9 req-3ee73261-5e66-43dc-9378-421725899b1e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] No waiting events found dispatching network-vif-unplugged-554f0c44-80f7-4a8b-84cb-85b70f4889a0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:37:18 np0005588920 nova_compute[226886]: 2026-01-20 14:37:18.763 226890 DEBUG nova.compute.manager [req-dcce0560-0dbe-4f73-b8f5-703afd6a39f9 req-3ee73261-5e66-43dc-9378-421725899b1e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Received event network-vif-unplugged-554f0c44-80f7-4a8b-84cb-85b70f4889a0 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:37:18 np0005588920 systemd[1]: libpod-conmon-d60cd84441c497571c437c747bcf00e3fe9edbdb12a0c8208d1a55f117b9b3b8.scope: Deactivated successfully.
Jan 20 09:37:18 np0005588920 podman[247389]: 2026-01-20 14:37:18.828578522 +0000 UTC m=+0.042640876 container remove d60cd84441c497571c437c747bcf00e3fe9edbdb12a0c8208d1a55f117b9b3b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:37:18 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:37:18.835 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b9e1b591-264c-4a4e-97d7-4536df099517]: (4, ('Tue Jan 20 02:37:18 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23 (d60cd84441c497571c437c747bcf00e3fe9edbdb12a0c8208d1a55f117b9b3b8)\nd60cd84441c497571c437c747bcf00e3fe9edbdb12a0c8208d1a55f117b9b3b8\nTue Jan 20 02:37:18 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23 (d60cd84441c497571c437c747bcf00e3fe9edbdb12a0c8208d1a55f117b9b3b8)\nd60cd84441c497571c437c747bcf00e3fe9edbdb12a0c8208d1a55f117b9b3b8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:18 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:37:18.837 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[6d226679-80cc-4dc8-b455-8fb396b9fc84]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:18 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:37:18.838 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb1f372f9-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:37:18 np0005588920 nova_compute[226886]: 2026-01-20 14:37:18.839 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:18 np0005588920 kernel: tapb1f372f9-f0: left promiscuous mode
Jan 20 09:37:18 np0005588920 nova_compute[226886]: 2026-01-20 14:37:18.854 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:18 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:37:18.857 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b8b76dfb-2725-41e8-b821-6e63e1a97084]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:18 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:37:18.876 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[8adfa3b3-17ad-436d-bd68-4cb52083c04c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:18 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:37:18.877 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0ed80ebe-b815-404c-b314-656a543987cf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:18.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:18 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:37:18.893 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[88c697a7-5403-436b-b0c4-6dc5e0c3c167]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 485653, 'reachable_time': 16683, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247404, 'error': None, 'target': 'ovnmeta-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:18 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:37:18.895 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:37:18 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:37:18.895 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[8aa7e7a2-8f72-4501-9247-8160d17d1bc6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:18 np0005588920 systemd[1]: run-netns-ovnmeta\x2db1f372f9\x2dfbd1\x2d4ef7\x2d9be7\x2dace7ce14bb23.mount: Deactivated successfully.
Jan 20 09:37:19 np0005588920 nova_compute[226886]: 2026-01-20 14:37:19.091 226890 INFO nova.virt.libvirt.driver [None req-ff438d22-3b4f-4195-a7ed-fa1058192ab1 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Deleting instance files /var/lib/nova/instances/f59628d0-8f85-42c2-93ff-a052df4ac20e_del#033[00m
Jan 20 09:37:19 np0005588920 nova_compute[226886]: 2026-01-20 14:37:19.092 226890 INFO nova.virt.libvirt.driver [None req-ff438d22-3b4f-4195-a7ed-fa1058192ab1 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Deletion of /var/lib/nova/instances/f59628d0-8f85-42c2-93ff-a052df4ac20e_del complete#033[00m
Jan 20 09:37:19 np0005588920 nova_compute[226886]: 2026-01-20 14:37:19.170 226890 INFO nova.compute.manager [None req-ff438d22-3b4f-4195-a7ed-fa1058192ab1 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Took 0.76 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:37:19 np0005588920 nova_compute[226886]: 2026-01-20 14:37:19.171 226890 DEBUG oslo.service.loopingcall [None req-ff438d22-3b4f-4195-a7ed-fa1058192ab1 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:37:19 np0005588920 nova_compute[226886]: 2026-01-20 14:37:19.171 226890 DEBUG nova.compute.manager [-] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:37:19 np0005588920 nova_compute[226886]: 2026-01-20 14:37:19.171 226890 DEBUG nova.network.neutron [-] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:37:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:19.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:19 np0005588920 nova_compute[226886]: 2026-01-20 14:37:19.745 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:20 np0005588920 nova_compute[226886]: 2026-01-20 14:37:20.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:37:20 np0005588920 nova_compute[226886]: 2026-01-20 14:37:20.875 226890 DEBUG nova.compute.manager [req-075abec7-a45e-4499-aa61-4f720046f488 req-de2e6ac2-e69a-4fb3-bfc7-e987eed12b30 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Received event network-vif-plugged-554f0c44-80f7-4a8b-84cb-85b70f4889a0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:37:20 np0005588920 nova_compute[226886]: 2026-01-20 14:37:20.876 226890 DEBUG oslo_concurrency.lockutils [req-075abec7-a45e-4499-aa61-4f720046f488 req-de2e6ac2-e69a-4fb3-bfc7-e987eed12b30 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f59628d0-8f85-42c2-93ff-a052df4ac20e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:37:20 np0005588920 nova_compute[226886]: 2026-01-20 14:37:20.876 226890 DEBUG oslo_concurrency.lockutils [req-075abec7-a45e-4499-aa61-4f720046f488 req-de2e6ac2-e69a-4fb3-bfc7-e987eed12b30 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f59628d0-8f85-42c2-93ff-a052df4ac20e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:37:20 np0005588920 nova_compute[226886]: 2026-01-20 14:37:20.877 226890 DEBUG oslo_concurrency.lockutils [req-075abec7-a45e-4499-aa61-4f720046f488 req-de2e6ac2-e69a-4fb3-bfc7-e987eed12b30 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f59628d0-8f85-42c2-93ff-a052df4ac20e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:37:20 np0005588920 nova_compute[226886]: 2026-01-20 14:37:20.877 226890 DEBUG nova.compute.manager [req-075abec7-a45e-4499-aa61-4f720046f488 req-de2e6ac2-e69a-4fb3-bfc7-e987eed12b30 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] No waiting events found dispatching network-vif-plugged-554f0c44-80f7-4a8b-84cb-85b70f4889a0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:37:20 np0005588920 nova_compute[226886]: 2026-01-20 14:37:20.877 226890 WARNING nova.compute.manager [req-075abec7-a45e-4499-aa61-4f720046f488 req-de2e6ac2-e69a-4fb3-bfc7-e987eed12b30 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Received unexpected event network-vif-plugged-554f0c44-80f7-4a8b-84cb-85b70f4889a0 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:37:20 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e226 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:37:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:20.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:21 np0005588920 nova_compute[226886]: 2026-01-20 14:37:21.072 226890 DEBUG nova.network.neutron [-] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:37:21 np0005588920 nova_compute[226886]: 2026-01-20 14:37:21.090 226890 INFO nova.compute.manager [-] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Took 1.92 seconds to deallocate network for instance.#033[00m
Jan 20 09:37:21 np0005588920 nova_compute[226886]: 2026-01-20 14:37:21.141 226890 DEBUG oslo_concurrency.lockutils [None req-ff438d22-3b4f-4195-a7ed-fa1058192ab1 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:37:21 np0005588920 nova_compute[226886]: 2026-01-20 14:37:21.142 226890 DEBUG oslo_concurrency.lockutils [None req-ff438d22-3b4f-4195-a7ed-fa1058192ab1 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:37:21 np0005588920 nova_compute[226886]: 2026-01-20 14:37:21.188 226890 DEBUG nova.compute.manager [req-1ec953ec-4c3f-4ad0-a3ef-9e916baec12e req-f53212e3-ec92-4da5-aaa4-6781de92c9a1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Received event network-vif-deleted-554f0c44-80f7-4a8b-84cb-85b70f4889a0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:37:21 np0005588920 nova_compute[226886]: 2026-01-20 14:37:21.199 226890 DEBUG oslo_concurrency.processutils [None req-ff438d22-3b4f-4195-a7ed-fa1058192ab1 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:37:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:37:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:21.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:37:21 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:37:21 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1752266427' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:37:21 np0005588920 nova_compute[226886]: 2026-01-20 14:37:21.664 226890 DEBUG oslo_concurrency.processutils [None req-ff438d22-3b4f-4195-a7ed-fa1058192ab1 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:37:21 np0005588920 nova_compute[226886]: 2026-01-20 14:37:21.671 226890 DEBUG nova.compute.provider_tree [None req-ff438d22-3b4f-4195-a7ed-fa1058192ab1 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:37:21 np0005588920 nova_compute[226886]: 2026-01-20 14:37:21.695 226890 DEBUG nova.scheduler.client.report [None req-ff438d22-3b4f-4195-a7ed-fa1058192ab1 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:37:21 np0005588920 nova_compute[226886]: 2026-01-20 14:37:21.720 226890 DEBUG oslo_concurrency.lockutils [None req-ff438d22-3b4f-4195-a7ed-fa1058192ab1 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.578s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:37:21 np0005588920 nova_compute[226886]: 2026-01-20 14:37:21.777 226890 INFO nova.scheduler.client.report [None req-ff438d22-3b4f-4195-a7ed-fa1058192ab1 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Deleted allocations for instance f59628d0-8f85-42c2-93ff-a052df4ac20e#033[00m
Jan 20 09:37:21 np0005588920 nova_compute[226886]: 2026-01-20 14:37:21.884 226890 DEBUG oslo_concurrency.lockutils [None req-ff438d22-3b4f-4195-a7ed-fa1058192ab1 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Lock "f59628d0-8f85-42c2-93ff-a052df4ac20e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.475s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:37:22 np0005588920 nova_compute[226886]: 2026-01-20 14:37:22.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:37:22 np0005588920 nova_compute[226886]: 2026-01-20 14:37:22.726 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:37:22 np0005588920 nova_compute[226886]: 2026-01-20 14:37:22.726 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 09:37:22 np0005588920 nova_compute[226886]: 2026-01-20 14:37:22.745 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 09:37:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e227 e227: 3 total, 3 up, 3 in
Jan 20 09:37:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:37:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:22.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:37:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:23.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:23 np0005588920 nova_compute[226886]: 2026-01-20 14:37:23.702 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:23 np0005588920 nova_compute[226886]: 2026-01-20 14:37:23.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:37:24 np0005588920 nova_compute[226886]: 2026-01-20 14:37:24.746 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:24.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:37:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:25.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:37:25 np0005588920 nova_compute[226886]: 2026-01-20 14:37:25.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:37:25 np0005588920 nova_compute[226886]: 2026-01-20 14:37:25.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:37:25 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e227 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:37:26 np0005588920 nova_compute[226886]: 2026-01-20 14:37:26.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:37:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:37:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:26.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:37:26 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e228 e228: 3 total, 3 up, 3 in
Jan 20 09:37:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:27.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:27 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #47. Immutable memtables: 4.
Jan 20 09:37:27 np0005588920 nova_compute[226886]: 2026-01-20 14:37:27.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:37:28 np0005588920 nova_compute[226886]: 2026-01-20 14:37:28.704 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:28 np0005588920 nova_compute[226886]: 2026-01-20 14:37:28.720 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:37:28 np0005588920 nova_compute[226886]: 2026-01-20 14:37:28.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:37:28 np0005588920 nova_compute[226886]: 2026-01-20 14:37:28.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:37:28 np0005588920 nova_compute[226886]: 2026-01-20 14:37:28.749 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:37:28 np0005588920 nova_compute[226886]: 2026-01-20 14:37:28.749 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:37:28 np0005588920 nova_compute[226886]: 2026-01-20 14:37:28.750 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:37:28 np0005588920 nova_compute[226886]: 2026-01-20 14:37:28.750 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:37:28 np0005588920 nova_compute[226886]: 2026-01-20 14:37:28.750 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:37:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:28.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:37:29 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/22471952' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:37:29 np0005588920 nova_compute[226886]: 2026-01-20 14:37:29.189 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:37:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e229 e229: 3 total, 3 up, 3 in
Jan 20 09:37:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:29.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:29 np0005588920 nova_compute[226886]: 2026-01-20 14:37:29.337 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:37:29 np0005588920 nova_compute[226886]: 2026-01-20 14:37:29.338 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4667MB free_disk=20.96738052368164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:37:29 np0005588920 nova_compute[226886]: 2026-01-20 14:37:29.342 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:37:29 np0005588920 nova_compute[226886]: 2026-01-20 14:37:29.342 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:37:29 np0005588920 nova_compute[226886]: 2026-01-20 14:37:29.552 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:37:29 np0005588920 nova_compute[226886]: 2026-01-20 14:37:29.553 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:37:29 np0005588920 nova_compute[226886]: 2026-01-20 14:37:29.598 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:37:29 np0005588920 nova_compute[226886]: 2026-01-20 14:37:29.748 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:37:29 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1863201075' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:37:30 np0005588920 nova_compute[226886]: 2026-01-20 14:37:30.006 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.408s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:37:30 np0005588920 nova_compute[226886]: 2026-01-20 14:37:30.011 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:37:30 np0005588920 nova_compute[226886]: 2026-01-20 14:37:30.024 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:37:30 np0005588920 nova_compute[226886]: 2026-01-20 14:37:30.043 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:37:30 np0005588920 nova_compute[226886]: 2026-01-20 14:37:30.044 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.702s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:37:30 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e230 e230: 3 total, 3 up, 3 in
Jan 20 09:37:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:37:30.533 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:37:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:37:30.534 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:37:30 np0005588920 nova_compute[226886]: 2026-01-20 14:37:30.625 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:30 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e230 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:37:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:30.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:31 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e231 e231: 3 total, 3 up, 3 in
Jan 20 09:37:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:31.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:31 np0005588920 podman[247473]: 2026-01-20 14:37:31.981076675 +0000 UTC m=+0.057840682 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 20 09:37:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:37:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:32.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:37:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:33.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:33 np0005588920 nova_compute[226886]: 2026-01-20 14:37:33.657 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768919838.655878, f59628d0-8f85-42c2-93ff-a052df4ac20e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:37:33 np0005588920 nova_compute[226886]: 2026-01-20 14:37:33.658 226890 INFO nova.compute.manager [-] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:37:33 np0005588920 nova_compute[226886]: 2026-01-20 14:37:33.679 226890 DEBUG nova.compute.manager [None req-73b10a8a-aade-4a96-83fa-37695dbd0fef - - - - - -] [instance: f59628d0-8f85-42c2-93ff-a052df4ac20e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:37:33 np0005588920 nova_compute[226886]: 2026-01-20 14:37:33.706 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:34 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:37:34.536 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:37:34 np0005588920 nova_compute[226886]: 2026-01-20 14:37:34.751 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:34.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:34 np0005588920 nova_compute[226886]: 2026-01-20 14:37:34.943 226890 DEBUG oslo_concurrency.lockutils [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Acquiring lock "fe3803fb-61eb-4f86-98b1-b21728174f6a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:37:34 np0005588920 nova_compute[226886]: 2026-01-20 14:37:34.944 226890 DEBUG oslo_concurrency.lockutils [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Lock "fe3803fb-61eb-4f86-98b1-b21728174f6a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:37:34 np0005588920 nova_compute[226886]: 2026-01-20 14:37:34.970 226890 DEBUG nova.compute.manager [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:37:35 np0005588920 nova_compute[226886]: 2026-01-20 14:37:35.054 226890 DEBUG oslo_concurrency.lockutils [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:37:35 np0005588920 nova_compute[226886]: 2026-01-20 14:37:35.054 226890 DEBUG oslo_concurrency.lockutils [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:37:35 np0005588920 nova_compute[226886]: 2026-01-20 14:37:35.062 226890 DEBUG nova.virt.hardware [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:37:35 np0005588920 nova_compute[226886]: 2026-01-20 14:37:35.062 226890 INFO nova.compute.claims [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 09:37:35 np0005588920 nova_compute[226886]: 2026-01-20 14:37:35.202 226890 DEBUG oslo_concurrency.processutils [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:37:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:35.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:35 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:37:35 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/471447498' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:37:35 np0005588920 nova_compute[226886]: 2026-01-20 14:37:35.605 226890 DEBUG oslo_concurrency.processutils [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.402s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:37:35 np0005588920 nova_compute[226886]: 2026-01-20 14:37:35.610 226890 DEBUG nova.compute.provider_tree [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:37:35 np0005588920 nova_compute[226886]: 2026-01-20 14:37:35.623 226890 DEBUG nova.scheduler.client.report [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:37:35 np0005588920 nova_compute[226886]: 2026-01-20 14:37:35.644 226890 DEBUG oslo_concurrency.lockutils [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.590s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:37:35 np0005588920 nova_compute[226886]: 2026-01-20 14:37:35.644 226890 DEBUG nova.compute.manager [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:37:35 np0005588920 nova_compute[226886]: 2026-01-20 14:37:35.699 226890 DEBUG nova.compute.manager [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:37:35 np0005588920 nova_compute[226886]: 2026-01-20 14:37:35.700 226890 DEBUG nova.network.neutron [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:37:35 np0005588920 nova_compute[226886]: 2026-01-20 14:37:35.717 226890 INFO nova.virt.libvirt.driver [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:37:35 np0005588920 nova_compute[226886]: 2026-01-20 14:37:35.737 226890 DEBUG nova.compute.manager [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:37:35 np0005588920 nova_compute[226886]: 2026-01-20 14:37:35.857 226890 DEBUG nova.compute.manager [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:37:35 np0005588920 nova_compute[226886]: 2026-01-20 14:37:35.859 226890 DEBUG nova.virt.libvirt.driver [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:37:35 np0005588920 nova_compute[226886]: 2026-01-20 14:37:35.860 226890 INFO nova.virt.libvirt.driver [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Creating image(s)#033[00m
Jan 20 09:37:35 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e231 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:37:35 np0005588920 nova_compute[226886]: 2026-01-20 14:37:35.891 226890 DEBUG nova.storage.rbd_utils [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] rbd image fe3803fb-61eb-4f86-98b1-b21728174f6a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:37:35 np0005588920 nova_compute[226886]: 2026-01-20 14:37:35.922 226890 DEBUG nova.storage.rbd_utils [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] rbd image fe3803fb-61eb-4f86-98b1-b21728174f6a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:37:35 np0005588920 nova_compute[226886]: 2026-01-20 14:37:35.951 226890 DEBUG nova.storage.rbd_utils [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] rbd image fe3803fb-61eb-4f86-98b1-b21728174f6a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:37:35 np0005588920 nova_compute[226886]: 2026-01-20 14:37:35.954 226890 DEBUG oslo_concurrency.processutils [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:37:36 np0005588920 nova_compute[226886]: 2026-01-20 14:37:36.012 226890 DEBUG nova.policy [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd640d75528bb42fdb7a5516f002216f2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '52415b3ad5fa4b42885bd73bc38ea3ad', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:37:36 np0005588920 nova_compute[226886]: 2026-01-20 14:37:36.033 226890 DEBUG oslo_concurrency.processutils [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:37:36 np0005588920 nova_compute[226886]: 2026-01-20 14:37:36.033 226890 DEBUG oslo_concurrency.lockutils [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:37:36 np0005588920 nova_compute[226886]: 2026-01-20 14:37:36.034 226890 DEBUG oslo_concurrency.lockutils [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:37:36 np0005588920 nova_compute[226886]: 2026-01-20 14:37:36.034 226890 DEBUG oslo_concurrency.lockutils [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:37:36 np0005588920 nova_compute[226886]: 2026-01-20 14:37:36.057 226890 DEBUG nova.storage.rbd_utils [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] rbd image fe3803fb-61eb-4f86-98b1-b21728174f6a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:37:36 np0005588920 nova_compute[226886]: 2026-01-20 14:37:36.060 226890 DEBUG oslo_concurrency.processutils [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 fe3803fb-61eb-4f86-98b1-b21728174f6a_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:37:36 np0005588920 nova_compute[226886]: 2026-01-20 14:37:36.382 226890 DEBUG oslo_concurrency.processutils [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 fe3803fb-61eb-4f86-98b1-b21728174f6a_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.322s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:37:36 np0005588920 nova_compute[226886]: 2026-01-20 14:37:36.455 226890 DEBUG nova.storage.rbd_utils [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] resizing rbd image fe3803fb-61eb-4f86-98b1-b21728174f6a_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:37:36 np0005588920 nova_compute[226886]: 2026-01-20 14:37:36.564 226890 DEBUG nova.objects.instance [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Lazy-loading 'migration_context' on Instance uuid fe3803fb-61eb-4f86-98b1-b21728174f6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:37:36 np0005588920 nova_compute[226886]: 2026-01-20 14:37:36.599 226890 DEBUG nova.virt.libvirt.driver [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:37:36 np0005588920 nova_compute[226886]: 2026-01-20 14:37:36.600 226890 DEBUG nova.virt.libvirt.driver [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Ensure instance console log exists: /var/lib/nova/instances/fe3803fb-61eb-4f86-98b1-b21728174f6a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:37:36 np0005588920 nova_compute[226886]: 2026-01-20 14:37:36.600 226890 DEBUG oslo_concurrency.lockutils [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:37:36 np0005588920 nova_compute[226886]: 2026-01-20 14:37:36.600 226890 DEBUG oslo_concurrency.lockutils [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:37:36 np0005588920 nova_compute[226886]: 2026-01-20 14:37:36.601 226890 DEBUG oslo_concurrency.lockutils [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:37:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:37:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:36.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:37:36 np0005588920 nova_compute[226886]: 2026-01-20 14:37:36.987 226890 DEBUG nova.network.neutron [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Successfully created port: 2a415699-a6bd-4fe9-b90f-22feaa382913 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:37:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:37.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 e232: 3 total, 3 up, 3 in
Jan 20 09:37:38 np0005588920 nova_compute[226886]: 2026-01-20 14:37:38.213 226890 DEBUG nova.network.neutron [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Successfully updated port: 2a415699-a6bd-4fe9-b90f-22feaa382913 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:37:38 np0005588920 nova_compute[226886]: 2026-01-20 14:37:38.484 226890 DEBUG nova.compute.manager [req-731c8b4e-974b-445d-9cf0-1257022bfaa8 req-b11806c8-f57b-4e60-87c4-2110d94eea30 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Received event network-changed-2a415699-a6bd-4fe9-b90f-22feaa382913 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:37:38 np0005588920 nova_compute[226886]: 2026-01-20 14:37:38.484 226890 DEBUG nova.compute.manager [req-731c8b4e-974b-445d-9cf0-1257022bfaa8 req-b11806c8-f57b-4e60-87c4-2110d94eea30 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Refreshing instance network info cache due to event network-changed-2a415699-a6bd-4fe9-b90f-22feaa382913. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:37:38 np0005588920 nova_compute[226886]: 2026-01-20 14:37:38.485 226890 DEBUG oslo_concurrency.lockutils [req-731c8b4e-974b-445d-9cf0-1257022bfaa8 req-b11806c8-f57b-4e60-87c4-2110d94eea30 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-fe3803fb-61eb-4f86-98b1-b21728174f6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:37:38 np0005588920 nova_compute[226886]: 2026-01-20 14:37:38.485 226890 DEBUG oslo_concurrency.lockutils [req-731c8b4e-974b-445d-9cf0-1257022bfaa8 req-b11806c8-f57b-4e60-87c4-2110d94eea30 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-fe3803fb-61eb-4f86-98b1-b21728174f6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:37:38 np0005588920 nova_compute[226886]: 2026-01-20 14:37:38.486 226890 DEBUG nova.network.neutron [req-731c8b4e-974b-445d-9cf0-1257022bfaa8 req-b11806c8-f57b-4e60-87c4-2110d94eea30 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Refreshing network info cache for port 2a415699-a6bd-4fe9-b90f-22feaa382913 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:37:38 np0005588920 nova_compute[226886]: 2026-01-20 14:37:38.490 226890 DEBUG oslo_concurrency.lockutils [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Acquiring lock "refresh_cache-fe3803fb-61eb-4f86-98b1-b21728174f6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:37:38 np0005588920 nova_compute[226886]: 2026-01-20 14:37:38.708 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:37:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:38.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:37:39 np0005588920 nova_compute[226886]: 2026-01-20 14:37:39.193 226890 DEBUG nova.network.neutron [req-731c8b4e-974b-445d-9cf0-1257022bfaa8 req-b11806c8-f57b-4e60-87c4-2110d94eea30 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:37:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:39.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:39 np0005588920 nova_compute[226886]: 2026-01-20 14:37:39.644 226890 DEBUG nova.network.neutron [req-731c8b4e-974b-445d-9cf0-1257022bfaa8 req-b11806c8-f57b-4e60-87c4-2110d94eea30 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:37:39 np0005588920 nova_compute[226886]: 2026-01-20 14:37:39.718 226890 DEBUG oslo_concurrency.lockutils [req-731c8b4e-974b-445d-9cf0-1257022bfaa8 req-b11806c8-f57b-4e60-87c4-2110d94eea30 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-fe3803fb-61eb-4f86-98b1-b21728174f6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:37:39 np0005588920 nova_compute[226886]: 2026-01-20 14:37:39.718 226890 DEBUG oslo_concurrency.lockutils [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Acquired lock "refresh_cache-fe3803fb-61eb-4f86-98b1-b21728174f6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:37:39 np0005588920 nova_compute[226886]: 2026-01-20 14:37:39.719 226890 DEBUG nova.network.neutron [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:37:39 np0005588920 nova_compute[226886]: 2026-01-20 14:37:39.781 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:40 np0005588920 nova_compute[226886]: 2026-01-20 14:37:40.320 226890 DEBUG nova.network.neutron [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:37:40 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:37:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:40.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:37:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:41.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:37:41 np0005588920 nova_compute[226886]: 2026-01-20 14:37:41.486 226890 DEBUG nova.network.neutron [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Updating instance_info_cache with network_info: [{"id": "2a415699-a6bd-4fe9-b90f-22feaa382913", "address": "fa:16:3e:df:1f:18", "network": {"id": "bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1", "bridge": "br-int", "label": "tempest-ServersTestJSON-396551132-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "52415b3ad5fa4b42885bd73bc38ea3ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a415699-a6", "ovs_interfaceid": "2a415699-a6bd-4fe9-b90f-22feaa382913", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:37:41 np0005588920 nova_compute[226886]: 2026-01-20 14:37:41.586 226890 DEBUG oslo_concurrency.lockutils [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Releasing lock "refresh_cache-fe3803fb-61eb-4f86-98b1-b21728174f6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:37:41 np0005588920 nova_compute[226886]: 2026-01-20 14:37:41.586 226890 DEBUG nova.compute.manager [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Instance network_info: |[{"id": "2a415699-a6bd-4fe9-b90f-22feaa382913", "address": "fa:16:3e:df:1f:18", "network": {"id": "bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1", "bridge": "br-int", "label": "tempest-ServersTestJSON-396551132-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "52415b3ad5fa4b42885bd73bc38ea3ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a415699-a6", "ovs_interfaceid": "2a415699-a6bd-4fe9-b90f-22feaa382913", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:37:41 np0005588920 nova_compute[226886]: 2026-01-20 14:37:41.589 226890 DEBUG nova.virt.libvirt.driver [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Start _get_guest_xml network_info=[{"id": "2a415699-a6bd-4fe9-b90f-22feaa382913", "address": "fa:16:3e:df:1f:18", "network": {"id": "bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1", "bridge": "br-int", "label": "tempest-ServersTestJSON-396551132-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "52415b3ad5fa4b42885bd73bc38ea3ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a415699-a6", "ovs_interfaceid": "2a415699-a6bd-4fe9-b90f-22feaa382913", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:37:41 np0005588920 nova_compute[226886]: 2026-01-20 14:37:41.593 226890 WARNING nova.virt.libvirt.driver [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:37:41 np0005588920 nova_compute[226886]: 2026-01-20 14:37:41.602 226890 DEBUG nova.virt.libvirt.host [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:37:41 np0005588920 nova_compute[226886]: 2026-01-20 14:37:41.602 226890 DEBUG nova.virt.libvirt.host [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:37:41 np0005588920 nova_compute[226886]: 2026-01-20 14:37:41.605 226890 DEBUG nova.virt.libvirt.host [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:37:41 np0005588920 nova_compute[226886]: 2026-01-20 14:37:41.605 226890 DEBUG nova.virt.libvirt.host [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:37:41 np0005588920 nova_compute[226886]: 2026-01-20 14:37:41.606 226890 DEBUG nova.virt.libvirt.driver [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:37:41 np0005588920 nova_compute[226886]: 2026-01-20 14:37:41.607 226890 DEBUG nova.virt.hardware [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:37:41 np0005588920 nova_compute[226886]: 2026-01-20 14:37:41.607 226890 DEBUG nova.virt.hardware [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:37:41 np0005588920 nova_compute[226886]: 2026-01-20 14:37:41.607 226890 DEBUG nova.virt.hardware [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:37:41 np0005588920 nova_compute[226886]: 2026-01-20 14:37:41.607 226890 DEBUG nova.virt.hardware [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:37:41 np0005588920 nova_compute[226886]: 2026-01-20 14:37:41.607 226890 DEBUG nova.virt.hardware [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:37:41 np0005588920 nova_compute[226886]: 2026-01-20 14:37:41.607 226890 DEBUG nova.virt.hardware [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:37:41 np0005588920 nova_compute[226886]: 2026-01-20 14:37:41.608 226890 DEBUG nova.virt.hardware [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:37:41 np0005588920 nova_compute[226886]: 2026-01-20 14:37:41.608 226890 DEBUG nova.virt.hardware [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:37:41 np0005588920 nova_compute[226886]: 2026-01-20 14:37:41.608 226890 DEBUG nova.virt.hardware [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:37:41 np0005588920 nova_compute[226886]: 2026-01-20 14:37:41.608 226890 DEBUG nova.virt.hardware [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:37:41 np0005588920 nova_compute[226886]: 2026-01-20 14:37:41.608 226890 DEBUG nova.virt.hardware [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:37:41 np0005588920 nova_compute[226886]: 2026-01-20 14:37:41.611 226890 DEBUG oslo_concurrency.processutils [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:37:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:37:42 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3670143803' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:37:42 np0005588920 nova_compute[226886]: 2026-01-20 14:37:42.061 226890 DEBUG oslo_concurrency.processutils [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:37:42 np0005588920 nova_compute[226886]: 2026-01-20 14:37:42.094 226890 DEBUG nova.storage.rbd_utils [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] rbd image fe3803fb-61eb-4f86-98b1-b21728174f6a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:37:42 np0005588920 nova_compute[226886]: 2026-01-20 14:37:42.098 226890 DEBUG oslo_concurrency.processutils [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:37:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:37:42 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3145099478' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:37:42 np0005588920 nova_compute[226886]: 2026-01-20 14:37:42.540 226890 DEBUG oslo_concurrency.processutils [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:37:42 np0005588920 nova_compute[226886]: 2026-01-20 14:37:42.541 226890 DEBUG nova.virt.libvirt.vif [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:37:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-836808259',display_name='tempest-ServersTestJSON-server-836808259',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-836808259',id=59,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF002s4g7NfltVTy/5qA7CIpXy9b6oAWF5Gh3G1MX2EkwnoUde50AHBSQaoFohXYmoz2yQYkOYpMmvcdh9PChiZg7hQh3VovUDI9db2jvTE3ZKjWnX88VE/aySam1zC5hg==',key_name='tempest-keypair-1007301700',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='52415b3ad5fa4b42885bd73bc38ea3ad',ramdisk_id='',reservation_id='r-xvgzkwgm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-631723368',owner_user_name='tempest-ServersTestJSON-631723368-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:37:35Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d640d75528bb42fdb7a5516f002216f2',uuid=fe3803fb-61eb-4f86-98b1-b21728174f6a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2a415699-a6bd-4fe9-b90f-22feaa382913", "address": "fa:16:3e:df:1f:18", "network": {"id": "bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1", "bridge": "br-int", "label": "tempest-ServersTestJSON-396551132-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "52415b3ad5fa4b42885bd73bc38ea3ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a415699-a6", "ovs_interfaceid": "2a415699-a6bd-4fe9-b90f-22feaa382913", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:37:42 np0005588920 nova_compute[226886]: 2026-01-20 14:37:42.542 226890 DEBUG nova.network.os_vif_util [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Converting VIF {"id": "2a415699-a6bd-4fe9-b90f-22feaa382913", "address": "fa:16:3e:df:1f:18", "network": {"id": "bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1", "bridge": "br-int", "label": "tempest-ServersTestJSON-396551132-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "52415b3ad5fa4b42885bd73bc38ea3ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a415699-a6", "ovs_interfaceid": "2a415699-a6bd-4fe9-b90f-22feaa382913", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:37:42 np0005588920 nova_compute[226886]: 2026-01-20 14:37:42.542 226890 DEBUG nova.network.os_vif_util [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:df:1f:18,bridge_name='br-int',has_traffic_filtering=True,id=2a415699-a6bd-4fe9-b90f-22feaa382913,network=Network(bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a415699-a6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:37:42 np0005588920 nova_compute[226886]: 2026-01-20 14:37:42.544 226890 DEBUG nova.objects.instance [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Lazy-loading 'pci_devices' on Instance uuid fe3803fb-61eb-4f86-98b1-b21728174f6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:37:42 np0005588920 nova_compute[226886]: 2026-01-20 14:37:42.571 226890 DEBUG nova.virt.libvirt.driver [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:37:42 np0005588920 nova_compute[226886]:  <uuid>fe3803fb-61eb-4f86-98b1-b21728174f6a</uuid>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:  <name>instance-0000003b</name>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:37:42 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:      <nova:name>tempest-ServersTestJSON-server-836808259</nova:name>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:37:41</nova:creationTime>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:37:42 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:        <nova:user uuid="d640d75528bb42fdb7a5516f002216f2">tempest-ServersTestJSON-631723368-project-member</nova:user>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:        <nova:project uuid="52415b3ad5fa4b42885bd73bc38ea3ad">tempest-ServersTestJSON-631723368</nova:project>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:        <nova:port uuid="2a415699-a6bd-4fe9-b90f-22feaa382913">
Jan 20 09:37:42 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:      <entry name="serial">fe3803fb-61eb-4f86-98b1-b21728174f6a</entry>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:      <entry name="uuid">fe3803fb-61eb-4f86-98b1-b21728174f6a</entry>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:37:42 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/fe3803fb-61eb-4f86-98b1-b21728174f6a_disk">
Jan 20 09:37:42 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:37:42 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:37:42 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/fe3803fb-61eb-4f86-98b1-b21728174f6a_disk.config">
Jan 20 09:37:42 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:37:42 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:37:42 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:df:1f:18"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:      <target dev="tap2a415699-a6"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:37:42 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/fe3803fb-61eb-4f86-98b1-b21728174f6a/console.log" append="off"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:37:42 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:37:42 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:37:42 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:37:42 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:37:42 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:37:42 np0005588920 nova_compute[226886]: 2026-01-20 14:37:42.571 226890 DEBUG nova.compute.manager [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Preparing to wait for external event network-vif-plugged-2a415699-a6bd-4fe9-b90f-22feaa382913 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:37:42 np0005588920 nova_compute[226886]: 2026-01-20 14:37:42.571 226890 DEBUG oslo_concurrency.lockutils [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Acquiring lock "fe3803fb-61eb-4f86-98b1-b21728174f6a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:37:42 np0005588920 nova_compute[226886]: 2026-01-20 14:37:42.572 226890 DEBUG oslo_concurrency.lockutils [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Lock "fe3803fb-61eb-4f86-98b1-b21728174f6a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:37:42 np0005588920 nova_compute[226886]: 2026-01-20 14:37:42.572 226890 DEBUG oslo_concurrency.lockutils [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Lock "fe3803fb-61eb-4f86-98b1-b21728174f6a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:37:42 np0005588920 nova_compute[226886]: 2026-01-20 14:37:42.572 226890 DEBUG nova.virt.libvirt.vif [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:37:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-836808259',display_name='tempest-ServersTestJSON-server-836808259',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-836808259',id=59,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF002s4g7NfltVTy/5qA7CIpXy9b6oAWF5Gh3G1MX2EkwnoUde50AHBSQaoFohXYmoz2yQYkOYpMmvcdh9PChiZg7hQh3VovUDI9db2jvTE3ZKjWnX88VE/aySam1zC5hg==',key_name='tempest-keypair-1007301700',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='52415b3ad5fa4b42885bd73bc38ea3ad',ramdisk_id='',reservation_id='r-xvgzkwgm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-631723368',owner_user_name='tempest-ServersTestJSON-631723368-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:37:35Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d640d75528bb42fdb7a5516f002216f2',uuid=fe3803fb-61eb-4f86-98b1-b21728174f6a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2a415699-a6bd-4fe9-b90f-22feaa382913", "address": "fa:16:3e:df:1f:18", "network": {"id": "bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1", "bridge": "br-int", "label": "tempest-ServersTestJSON-396551132-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "52415b3ad5fa4b42885bd73bc38ea3ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a415699-a6", "ovs_interfaceid": "2a415699-a6bd-4fe9-b90f-22feaa382913", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:37:42 np0005588920 nova_compute[226886]: 2026-01-20 14:37:42.573 226890 DEBUG nova.network.os_vif_util [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Converting VIF {"id": "2a415699-a6bd-4fe9-b90f-22feaa382913", "address": "fa:16:3e:df:1f:18", "network": {"id": "bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1", "bridge": "br-int", "label": "tempest-ServersTestJSON-396551132-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "52415b3ad5fa4b42885bd73bc38ea3ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a415699-a6", "ovs_interfaceid": "2a415699-a6bd-4fe9-b90f-22feaa382913", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:37:42 np0005588920 nova_compute[226886]: 2026-01-20 14:37:42.573 226890 DEBUG nova.network.os_vif_util [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:df:1f:18,bridge_name='br-int',has_traffic_filtering=True,id=2a415699-a6bd-4fe9-b90f-22feaa382913,network=Network(bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a415699-a6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:37:42 np0005588920 nova_compute[226886]: 2026-01-20 14:37:42.573 226890 DEBUG os_vif [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:df:1f:18,bridge_name='br-int',has_traffic_filtering=True,id=2a415699-a6bd-4fe9-b90f-22feaa382913,network=Network(bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a415699-a6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:37:42 np0005588920 nova_compute[226886]: 2026-01-20 14:37:42.574 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:42 np0005588920 nova_compute[226886]: 2026-01-20 14:37:42.574 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:37:42 np0005588920 nova_compute[226886]: 2026-01-20 14:37:42.574 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:37:42 np0005588920 nova_compute[226886]: 2026-01-20 14:37:42.576 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:42 np0005588920 nova_compute[226886]: 2026-01-20 14:37:42.576 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2a415699-a6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:37:42 np0005588920 nova_compute[226886]: 2026-01-20 14:37:42.577 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2a415699-a6, col_values=(('external_ids', {'iface-id': '2a415699-a6bd-4fe9-b90f-22feaa382913', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:df:1f:18', 'vm-uuid': 'fe3803fb-61eb-4f86-98b1-b21728174f6a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:37:42 np0005588920 nova_compute[226886]: 2026-01-20 14:37:42.578 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:42 np0005588920 NetworkManager[49076]: <info>  [1768919862.5791] manager: (tap2a415699-a6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/87)
Jan 20 09:37:42 np0005588920 nova_compute[226886]: 2026-01-20 14:37:42.582 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:37:42 np0005588920 nova_compute[226886]: 2026-01-20 14:37:42.584 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:42 np0005588920 nova_compute[226886]: 2026-01-20 14:37:42.585 226890 INFO os_vif [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:df:1f:18,bridge_name='br-int',has_traffic_filtering=True,id=2a415699-a6bd-4fe9-b90f-22feaa382913,network=Network(bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a415699-a6')#033[00m
Jan 20 09:37:42 np0005588920 nova_compute[226886]: 2026-01-20 14:37:42.827 226890 DEBUG nova.virt.libvirt.driver [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:37:42 np0005588920 nova_compute[226886]: 2026-01-20 14:37:42.827 226890 DEBUG nova.virt.libvirt.driver [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:37:42 np0005588920 nova_compute[226886]: 2026-01-20 14:37:42.827 226890 DEBUG nova.virt.libvirt.driver [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] No VIF found with MAC fa:16:3e:df:1f:18, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:37:42 np0005588920 nova_compute[226886]: 2026-01-20 14:37:42.828 226890 INFO nova.virt.libvirt.driver [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Using config drive#033[00m
Jan 20 09:37:42 np0005588920 nova_compute[226886]: 2026-01-20 14:37:42.846 226890 DEBUG nova.storage.rbd_utils [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] rbd image fe3803fb-61eb-4f86-98b1-b21728174f6a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:37:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:42.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:43 np0005588920 nova_compute[226886]: 2026-01-20 14:37:43.277 226890 INFO nova.virt.libvirt.driver [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Creating config drive at /var/lib/nova/instances/fe3803fb-61eb-4f86-98b1-b21728174f6a/disk.config#033[00m
Jan 20 09:37:43 np0005588920 nova_compute[226886]: 2026-01-20 14:37:43.282 226890 DEBUG oslo_concurrency.processutils [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/fe3803fb-61eb-4f86-98b1-b21728174f6a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpl2nsjttq execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:37:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:43.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:43 np0005588920 nova_compute[226886]: 2026-01-20 14:37:43.412 226890 DEBUG oslo_concurrency.processutils [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/fe3803fb-61eb-4f86-98b1-b21728174f6a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpl2nsjttq" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:37:43 np0005588920 nova_compute[226886]: 2026-01-20 14:37:43.448 226890 DEBUG nova.storage.rbd_utils [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] rbd image fe3803fb-61eb-4f86-98b1-b21728174f6a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:37:43 np0005588920 nova_compute[226886]: 2026-01-20 14:37:43.454 226890 DEBUG oslo_concurrency.processutils [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/fe3803fb-61eb-4f86-98b1-b21728174f6a/disk.config fe3803fb-61eb-4f86-98b1-b21728174f6a_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:37:43 np0005588920 nova_compute[226886]: 2026-01-20 14:37:43.714 226890 DEBUG oslo_concurrency.processutils [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/fe3803fb-61eb-4f86-98b1-b21728174f6a/disk.config fe3803fb-61eb-4f86-98b1-b21728174f6a_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.261s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:37:43 np0005588920 nova_compute[226886]: 2026-01-20 14:37:43.715 226890 INFO nova.virt.libvirt.driver [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Deleting local config drive /var/lib/nova/instances/fe3803fb-61eb-4f86-98b1-b21728174f6a/disk.config because it was imported into RBD.#033[00m
Jan 20 09:37:43 np0005588920 NetworkManager[49076]: <info>  [1768919863.7603] manager: (tap2a415699-a6): new Tun device (/org/freedesktop/NetworkManager/Devices/88)
Jan 20 09:37:43 np0005588920 systemd-udevd[247809]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:37:43 np0005588920 systemd-machined[196121]: New machine qemu-25-instance-0000003b.
Jan 20 09:37:43 np0005588920 kernel: tap2a415699-a6: entered promiscuous mode
Jan 20 09:37:43 np0005588920 NetworkManager[49076]: <info>  [1768919863.8323] device (tap2a415699-a6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:37:43 np0005588920 NetworkManager[49076]: <info>  [1768919863.8333] device (tap2a415699-a6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:37:43 np0005588920 nova_compute[226886]: 2026-01-20 14:37:43.834 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:43 np0005588920 ovn_controller[133971]: 2026-01-20T14:37:43Z|00153|binding|INFO|Claiming lport 2a415699-a6bd-4fe9-b90f-22feaa382913 for this chassis.
Jan 20 09:37:43 np0005588920 ovn_controller[133971]: 2026-01-20T14:37:43Z|00154|binding|INFO|2a415699-a6bd-4fe9-b90f-22feaa382913: Claiming fa:16:3e:df:1f:18 10.100.0.6
Jan 20 09:37:43 np0005588920 systemd[1]: Started Virtual Machine qemu-25-instance-0000003b.
Jan 20 09:37:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:37:43.842 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:df:1f:18 10.100.0.6'], port_security=['fa:16:3e:df:1f:18 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'fe3803fb-61eb-4f86-98b1-b21728174f6a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '52415b3ad5fa4b42885bd73bc38ea3ad', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0f0d5567-d313-4d91-842c-185bd82081fa', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1ea8668d-5daa-4d78-b9b2-f4c3103dc0be, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=2a415699-a6bd-4fe9-b90f-22feaa382913) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:37:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:37:43.843 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 2a415699-a6bd-4fe9-b90f-22feaa382913 in datapath bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1 bound to our chassis#033[00m
Jan 20 09:37:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:37:43.844 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1#033[00m
Jan 20 09:37:43 np0005588920 nova_compute[226886]: 2026-01-20 14:37:43.846 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:37:43.863 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c2687d4c-4596-4f94-92d7-f8dc3132a15e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:37:43.863 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapbb9c4ba4-71 in ovnmeta-bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:37:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:37:43.865 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapbb9c4ba4-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:37:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:37:43.866 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0d00424a-db32-4f3b-b0f1-e424886d0254]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:37:43.866 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[9e4316a6-0e69-4b65-a069-48f225367622]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:37:43.884 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[a594d708-8f06-46e3-8128-5417308dc29d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:37:43.903 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f3b13ca0-2e52-41be-bdc6-55daf5c1ec88]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:43 np0005588920 ovn_controller[133971]: 2026-01-20T14:37:43Z|00155|binding|INFO|Setting lport 2a415699-a6bd-4fe9-b90f-22feaa382913 ovn-installed in OVS
Jan 20 09:37:43 np0005588920 ovn_controller[133971]: 2026-01-20T14:37:43Z|00156|binding|INFO|Setting lport 2a415699-a6bd-4fe9-b90f-22feaa382913 up in Southbound
Jan 20 09:37:43 np0005588920 nova_compute[226886]: 2026-01-20 14:37:43.909 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:37:43.933 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[399e1f6b-eda2-4706-a6c4-0814d8e5cbae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:43 np0005588920 NetworkManager[49076]: <info>  [1768919863.9393] manager: (tapbb9c4ba4-70): new Veth device (/org/freedesktop/NetworkManager/Devices/89)
Jan 20 09:37:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:37:43.938 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f2daba46-006e-40ee-83d6-6a2b1375b3a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:37:43.968 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[e6659447-ecc3-49d7-94f8-840d451b37f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:37:43.971 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[d3f63f77-7f9e-47c9-a4a3-a13ca16f0400]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:43 np0005588920 NetworkManager[49076]: <info>  [1768919863.9914] device (tapbb9c4ba4-70): carrier: link connected
Jan 20 09:37:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:37:43.997 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[f0f52965-fd10-495f-a546-824d58641ded]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:37:44.013 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[82cf4f43-c9a8-4b77-8bfe-3d7dc8a15127]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbb9c4ba4-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:02:01:b7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 53], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 490567, 'reachable_time': 20514, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247846, 'error': None, 'target': 'ovnmeta-bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:37:44.028 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[03702a6e-5a06-478c-b987-055a187a88be]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe02:1b7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 490567, 'tstamp': 490567}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 247847, 'error': None, 'target': 'ovnmeta-bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:37:44.045 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5dca70f2-1a9a-4a3c-b0e5-7d8236da1e9a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbb9c4ba4-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:02:01:b7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 53], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 490567, 'reachable_time': 20514, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 247848, 'error': None, 'target': 'ovnmeta-bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:37:44.073 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[8896e215-e857-40a1-950d-01a66e0c541d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:37:44.127 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[bddc5c8d-1f47-43b2-8d8d-d30efacd3773]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:37:44.128 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbb9c4ba4-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:37:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:37:44.129 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:37:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:37:44.129 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbb9c4ba4-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:37:44 np0005588920 kernel: tapbb9c4ba4-70: entered promiscuous mode
Jan 20 09:37:44 np0005588920 NetworkManager[49076]: <info>  [1768919864.1311] manager: (tapbb9c4ba4-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/90)
Jan 20 09:37:44 np0005588920 nova_compute[226886]: 2026-01-20 14:37:44.130 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:37:44.134 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbb9c4ba4-70, col_values=(('external_ids', {'iface-id': '299efc3d-8746-45fd-93e1-2cb0aef5840e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:37:44 np0005588920 ovn_controller[133971]: 2026-01-20T14:37:44Z|00157|binding|INFO|Releasing lport 299efc3d-8746-45fd-93e1-2cb0aef5840e from this chassis (sb_readonly=0)
Jan 20 09:37:44 np0005588920 nova_compute[226886]: 2026-01-20 14:37:44.134 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:37:44.148 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:37:44 np0005588920 nova_compute[226886]: 2026-01-20 14:37:44.148 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:37:44.149 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[41662f40-939c-45b9-ab53-c1b808cf22a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:37:44.150 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:37:44 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 09:37:44 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 09:37:44 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1
Jan 20 09:37:44 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 09:37:44 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 09:37:44 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 09:37:44 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1.pid.haproxy
Jan 20 09:37:44 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 09:37:44 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:37:44 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 09:37:44 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 09:37:44 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 09:37:44 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 09:37:44 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 09:37:44 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 09:37:44 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 09:37:44 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 09:37:44 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 09:37:44 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 09:37:44 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 09:37:44 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 09:37:44 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 09:37:44 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:37:44 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:37:44 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 09:37:44 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 09:37:44 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:37:44 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1
Jan 20 09:37:44 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:37:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:37:44.150 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1', 'env', 'PROCESS_TAG=haproxy-bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:37:44 np0005588920 nova_compute[226886]: 2026-01-20 14:37:44.477 226890 DEBUG nova.compute.manager [req-873ec68c-10b9-4a8f-9032-0c0055702cc0 req-2d9c37b8-3fa7-4c66-bedf-21b01c5661d2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Received event network-vif-plugged-2a415699-a6bd-4fe9-b90f-22feaa382913 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:37:44 np0005588920 nova_compute[226886]: 2026-01-20 14:37:44.478 226890 DEBUG oslo_concurrency.lockutils [req-873ec68c-10b9-4a8f-9032-0c0055702cc0 req-2d9c37b8-3fa7-4c66-bedf-21b01c5661d2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "fe3803fb-61eb-4f86-98b1-b21728174f6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:37:44 np0005588920 nova_compute[226886]: 2026-01-20 14:37:44.478 226890 DEBUG oslo_concurrency.lockutils [req-873ec68c-10b9-4a8f-9032-0c0055702cc0 req-2d9c37b8-3fa7-4c66-bedf-21b01c5661d2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fe3803fb-61eb-4f86-98b1-b21728174f6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:37:44 np0005588920 nova_compute[226886]: 2026-01-20 14:37:44.478 226890 DEBUG oslo_concurrency.lockutils [req-873ec68c-10b9-4a8f-9032-0c0055702cc0 req-2d9c37b8-3fa7-4c66-bedf-21b01c5661d2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fe3803fb-61eb-4f86-98b1-b21728174f6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:37:44 np0005588920 nova_compute[226886]: 2026-01-20 14:37:44.479 226890 DEBUG nova.compute.manager [req-873ec68c-10b9-4a8f-9032-0c0055702cc0 req-2d9c37b8-3fa7-4c66-bedf-21b01c5661d2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Processing event network-vif-plugged-2a415699-a6bd-4fe9-b90f-22feaa382913 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:37:44 np0005588920 nova_compute[226886]: 2026-01-20 14:37:44.493 226890 DEBUG nova.compute.manager [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:37:44 np0005588920 nova_compute[226886]: 2026-01-20 14:37:44.494 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919864.4929552, fe3803fb-61eb-4f86-98b1-b21728174f6a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:37:44 np0005588920 nova_compute[226886]: 2026-01-20 14:37:44.495 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] VM Started (Lifecycle Event)#033[00m
Jan 20 09:37:44 np0005588920 nova_compute[226886]: 2026-01-20 14:37:44.498 226890 DEBUG nova.virt.libvirt.driver [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:37:44 np0005588920 nova_compute[226886]: 2026-01-20 14:37:44.501 226890 INFO nova.virt.libvirt.driver [-] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Instance spawned successfully.#033[00m
Jan 20 09:37:44 np0005588920 nova_compute[226886]: 2026-01-20 14:37:44.501 226890 DEBUG nova.virt.libvirt.driver [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:37:44 np0005588920 nova_compute[226886]: 2026-01-20 14:37:44.523 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:37:44 np0005588920 nova_compute[226886]: 2026-01-20 14:37:44.529 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:37:44 np0005588920 nova_compute[226886]: 2026-01-20 14:37:44.531 226890 DEBUG nova.virt.libvirt.driver [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:37:44 np0005588920 nova_compute[226886]: 2026-01-20 14:37:44.532 226890 DEBUG nova.virt.libvirt.driver [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:37:44 np0005588920 nova_compute[226886]: 2026-01-20 14:37:44.532 226890 DEBUG nova.virt.libvirt.driver [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:37:44 np0005588920 nova_compute[226886]: 2026-01-20 14:37:44.533 226890 DEBUG nova.virt.libvirt.driver [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:37:44 np0005588920 nova_compute[226886]: 2026-01-20 14:37:44.533 226890 DEBUG nova.virt.libvirt.driver [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:37:44 np0005588920 nova_compute[226886]: 2026-01-20 14:37:44.534 226890 DEBUG nova.virt.libvirt.driver [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:37:44 np0005588920 podman[247922]: 2026-01-20 14:37:44.546095749 +0000 UTC m=+0.054481599 container create 2ec9a5cd7e36a229de8f313fc5b21546a9def47d2735da73132f72d7424c8485 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true)
Jan 20 09:37:44 np0005588920 nova_compute[226886]: 2026-01-20 14:37:44.573 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:37:44 np0005588920 nova_compute[226886]: 2026-01-20 14:37:44.574 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919864.493827, fe3803fb-61eb-4f86-98b1-b21728174f6a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:37:44 np0005588920 nova_compute[226886]: 2026-01-20 14:37:44.574 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:37:44 np0005588920 systemd[1]: Started libpod-conmon-2ec9a5cd7e36a229de8f313fc5b21546a9def47d2735da73132f72d7424c8485.scope.
Jan 20 09:37:44 np0005588920 nova_compute[226886]: 2026-01-20 14:37:44.600 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:37:44 np0005588920 nova_compute[226886]: 2026-01-20 14:37:44.603 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919864.4970639, fe3803fb-61eb-4f86-98b1-b21728174f6a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:37:44 np0005588920 nova_compute[226886]: 2026-01-20 14:37:44.603 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:37:44 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:37:44 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1ea311ad2d69b6ac5665944112ef94c5af624393f5c80e0772c445b98a7200cc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:37:44 np0005588920 podman[247922]: 2026-01-20 14:37:44.518285529 +0000 UTC m=+0.026671399 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:37:44 np0005588920 nova_compute[226886]: 2026-01-20 14:37:44.622 226890 INFO nova.compute.manager [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Took 8.76 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:37:44 np0005588920 nova_compute[226886]: 2026-01-20 14:37:44.622 226890 DEBUG nova.compute.manager [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:37:44 np0005588920 nova_compute[226886]: 2026-01-20 14:37:44.625 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:37:44 np0005588920 podman[247922]: 2026-01-20 14:37:44.626115712 +0000 UTC m=+0.134501582 container init 2ec9a5cd7e36a229de8f313fc5b21546a9def47d2735da73132f72d7424c8485 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:37:44 np0005588920 podman[247922]: 2026-01-20 14:37:44.632212893 +0000 UTC m=+0.140598743 container start 2ec9a5cd7e36a229de8f313fc5b21546a9def47d2735da73132f72d7424c8485 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 20 09:37:44 np0005588920 nova_compute[226886]: 2026-01-20 14:37:44.632 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:37:44 np0005588920 neutron-haproxy-ovnmeta-bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1[247937]: [NOTICE]   (247941) : New worker (247943) forked
Jan 20 09:37:44 np0005588920 neutron-haproxy-ovnmeta-bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1[247937]: [NOTICE]   (247941) : Loading success.
Jan 20 09:37:44 np0005588920 nova_compute[226886]: 2026-01-20 14:37:44.658 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:37:44 np0005588920 nova_compute[226886]: 2026-01-20 14:37:44.688 226890 INFO nova.compute.manager [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Took 9.67 seconds to build instance.#033[00m
Jan 20 09:37:44 np0005588920 nova_compute[226886]: 2026-01-20 14:37:44.729 226890 DEBUG oslo_concurrency.lockutils [None req-a75012ae-7bed-4ead-9056-bd3ae3e5ceb3 d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Lock "fe3803fb-61eb-4f86-98b1-b21728174f6a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.785s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:37:44 np0005588920 nova_compute[226886]: 2026-01-20 14:37:44.784 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:44.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:45.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:45 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:37:46 np0005588920 nova_compute[226886]: 2026-01-20 14:37:46.774 226890 DEBUG nova.compute.manager [req-adaa3bc1-cf1e-43d1-aa74-31e517b73dbe req-9b3e73bc-0df7-440c-90ba-3b382dd0c099 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Received event network-vif-plugged-2a415699-a6bd-4fe9-b90f-22feaa382913 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:37:46 np0005588920 nova_compute[226886]: 2026-01-20 14:37:46.776 226890 DEBUG oslo_concurrency.lockutils [req-adaa3bc1-cf1e-43d1-aa74-31e517b73dbe req-9b3e73bc-0df7-440c-90ba-3b382dd0c099 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "fe3803fb-61eb-4f86-98b1-b21728174f6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:37:46 np0005588920 nova_compute[226886]: 2026-01-20 14:37:46.777 226890 DEBUG oslo_concurrency.lockutils [req-adaa3bc1-cf1e-43d1-aa74-31e517b73dbe req-9b3e73bc-0df7-440c-90ba-3b382dd0c099 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fe3803fb-61eb-4f86-98b1-b21728174f6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:37:46 np0005588920 nova_compute[226886]: 2026-01-20 14:37:46.778 226890 DEBUG oslo_concurrency.lockutils [req-adaa3bc1-cf1e-43d1-aa74-31e517b73dbe req-9b3e73bc-0df7-440c-90ba-3b382dd0c099 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fe3803fb-61eb-4f86-98b1-b21728174f6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:37:46 np0005588920 nova_compute[226886]: 2026-01-20 14:37:46.778 226890 DEBUG nova.compute.manager [req-adaa3bc1-cf1e-43d1-aa74-31e517b73dbe req-9b3e73bc-0df7-440c-90ba-3b382dd0c099 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] No waiting events found dispatching network-vif-plugged-2a415699-a6bd-4fe9-b90f-22feaa382913 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:37:46 np0005588920 nova_compute[226886]: 2026-01-20 14:37:46.779 226890 WARNING nova.compute.manager [req-adaa3bc1-cf1e-43d1-aa74-31e517b73dbe req-9b3e73bc-0df7-440c-90ba-3b382dd0c099 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Received unexpected event network-vif-plugged-2a415699-a6bd-4fe9-b90f-22feaa382913 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:37:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:37:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:46.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:37:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:47.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:47 np0005588920 nova_compute[226886]: 2026-01-20 14:37:47.578 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:47 np0005588920 nova_compute[226886]: 2026-01-20 14:37:47.912 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:47 np0005588920 NetworkManager[49076]: <info>  [1768919867.9164] manager: (patch-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/91)
Jan 20 09:37:47 np0005588920 NetworkManager[49076]: <info>  [1768919867.9181] manager: (patch-br-int-to-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/92)
Jan 20 09:37:48 np0005588920 nova_compute[226886]: 2026-01-20 14:37:48.056 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:48 np0005588920 ovn_controller[133971]: 2026-01-20T14:37:48Z|00158|binding|INFO|Releasing lport 299efc3d-8746-45fd-93e1-2cb0aef5840e from this chassis (sb_readonly=0)
Jan 20 09:37:48 np0005588920 nova_compute[226886]: 2026-01-20 14:37:48.072 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:37:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:48.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:37:49 np0005588920 podman[247953]: 2026-01-20 14:37:49.02031628 +0000 UTC m=+0.097658138 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 20 09:37:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:49.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:49 np0005588920 nova_compute[226886]: 2026-01-20 14:37:49.785 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:49 np0005588920 nova_compute[226886]: 2026-01-20 14:37:49.863 226890 DEBUG nova.compute.manager [req-97b063ab-1b82-410a-8a31-d2642bad40db req-5b875e12-3e8e-4969-b03e-7ab2a94f9e8f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Received event network-changed-2a415699-a6bd-4fe9-b90f-22feaa382913 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:37:49 np0005588920 nova_compute[226886]: 2026-01-20 14:37:49.864 226890 DEBUG nova.compute.manager [req-97b063ab-1b82-410a-8a31-d2642bad40db req-5b875e12-3e8e-4969-b03e-7ab2a94f9e8f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Refreshing instance network info cache due to event network-changed-2a415699-a6bd-4fe9-b90f-22feaa382913. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:37:49 np0005588920 nova_compute[226886]: 2026-01-20 14:37:49.865 226890 DEBUG oslo_concurrency.lockutils [req-97b063ab-1b82-410a-8a31-d2642bad40db req-5b875e12-3e8e-4969-b03e-7ab2a94f9e8f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-fe3803fb-61eb-4f86-98b1-b21728174f6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:37:49 np0005588920 nova_compute[226886]: 2026-01-20 14:37:49.865 226890 DEBUG oslo_concurrency.lockutils [req-97b063ab-1b82-410a-8a31-d2642bad40db req-5b875e12-3e8e-4969-b03e-7ab2a94f9e8f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-fe3803fb-61eb-4f86-98b1-b21728174f6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:37:49 np0005588920 nova_compute[226886]: 2026-01-20 14:37:49.866 226890 DEBUG nova.network.neutron [req-97b063ab-1b82-410a-8a31-d2642bad40db req-5b875e12-3e8e-4969-b03e-7ab2a94f9e8f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Refreshing network info cache for port 2a415699-a6bd-4fe9-b90f-22feaa382913 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:37:50 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:37:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:50.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:51.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:52 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 09:37:52 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 2400.0 total, 600.0 interval#012Cumulative writes: 6903 writes, 35K keys, 6903 commit groups, 1.0 writes per commit group, ingest: 0.07 GB, 0.03 MB/s#012Cumulative WAL: 6902 writes, 6902 syncs, 1.00 writes per sync, written: 0.07 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1812 writes, 8927 keys, 1812 commit groups, 1.0 writes per commit group, ingest: 17.39 MB, 0.03 MB/s#012Interval WAL: 1811 writes, 1811 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     68.8      0.58              0.18        18    0.032       0      0       0.0       0.0#012  L6      1/0    9.71 MB   0.0      0.2     0.0      0.1       0.1      0.0       0.0   3.6    110.9     91.6      1.56              0.56        17    0.092     86K   9394       0.0       0.0#012 Sum      1/0    9.71 MB   0.0      0.2     0.0      0.1       0.2      0.0       0.0   4.6     80.9     85.4      2.14              0.74        35    0.061     86K   9394       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   4.8     91.7     93.7      0.52              0.19         8    0.065     24K   2592       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.2     0.0      0.1       0.1      0.0       0.0   0.0    110.9     91.6      1.56              0.56        17    0.092     86K   9394       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     69.0      0.58              0.18        17    0.034       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 2400.0 total, 600.0 interval#012Flush(GB): cumulative 0.039, interval 0.010#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.18 GB write, 0.08 MB/s write, 0.17 GB read, 0.07 MB/s read, 2.1 seconds#012Interval compaction: 0.05 GB write, 0.08 MB/s write, 0.05 GB read, 0.08 MB/s read, 0.5 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x564a2f9711f0#2 capacity: 304.00 MB usage: 19.97 MB table_size: 0 occupancy: 18446744073709551615 collections: 5 last_copies: 0 last_secs: 0.000191 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(1151,19.29 MB,6.346%) FilterBlock(35,250.05 KB,0.0803245%) IndexBlock(35,444.61 KB,0.142825%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 20 09:37:52 np0005588920 nova_compute[226886]: 2026-01-20 14:37:52.580 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:52.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:53 np0005588920 nova_compute[226886]: 2026-01-20 14:37:53.250 226890 DEBUG nova.network.neutron [req-97b063ab-1b82-410a-8a31-d2642bad40db req-5b875e12-3e8e-4969-b03e-7ab2a94f9e8f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Updated VIF entry in instance network info cache for port 2a415699-a6bd-4fe9-b90f-22feaa382913. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:37:53 np0005588920 nova_compute[226886]: 2026-01-20 14:37:53.252 226890 DEBUG nova.network.neutron [req-97b063ab-1b82-410a-8a31-d2642bad40db req-5b875e12-3e8e-4969-b03e-7ab2a94f9e8f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Updating instance_info_cache with network_info: [{"id": "2a415699-a6bd-4fe9-b90f-22feaa382913", "address": "fa:16:3e:df:1f:18", "network": {"id": "bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1", "bridge": "br-int", "label": "tempest-ServersTestJSON-396551132-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "52415b3ad5fa4b42885bd73bc38ea3ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a415699-a6", "ovs_interfaceid": "2a415699-a6bd-4fe9-b90f-22feaa382913", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:37:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:37:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:53.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:37:53 np0005588920 nova_compute[226886]: 2026-01-20 14:37:53.422 226890 DEBUG oslo_concurrency.lockutils [req-97b063ab-1b82-410a-8a31-d2642bad40db req-5b875e12-3e8e-4969-b03e-7ab2a94f9e8f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-fe3803fb-61eb-4f86-98b1-b21728174f6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:37:54 np0005588920 nova_compute[226886]: 2026-01-20 14:37:54.787 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:54.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:55.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:55 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:37:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:56.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:57 np0005588920 ovn_controller[133971]: 2026-01-20T14:37:57Z|00159|binding|INFO|Releasing lport 299efc3d-8746-45fd-93e1-2cb0aef5840e from this chassis (sb_readonly=0)
Jan 20 09:37:57 np0005588920 nova_compute[226886]: 2026-01-20 14:37:57.253 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:57.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:57 np0005588920 nova_compute[226886]: 2026-01-20 14:37:57.582 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:58 np0005588920 nova_compute[226886]: 2026-01-20 14:37:58.086 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:58.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:37:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:59.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:59 np0005588920 nova_compute[226886]: 2026-01-20 14:37:59.788 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:00 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:38:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:00.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:01.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:01 np0005588920 ovn_controller[133971]: 2026-01-20T14:38:01Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:df:1f:18 10.100.0.6
Jan 20 09:38:01 np0005588920 ovn_controller[133971]: 2026-01-20T14:38:01Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:df:1f:18 10.100.0.6
Jan 20 09:38:02 np0005588920 nova_compute[226886]: 2026-01-20 14:38:02.583 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:38:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:02.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:38:03 np0005588920 podman[247980]: 2026-01-20 14:38:03.002765825 +0000 UTC m=+0.076717762 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 20 09:38:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:03.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:04 np0005588920 nova_compute[226886]: 2026-01-20 14:38:04.606 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:04 np0005588920 nova_compute[226886]: 2026-01-20 14:38:04.790 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:04.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:38:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:05.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:38:05 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:38:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:38:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:06.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:38:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:38:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:07.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:38:07 np0005588920 nova_compute[226886]: 2026-01-20 14:38:07.584 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:38:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:08.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:38:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:09.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:09 np0005588920 nova_compute[226886]: 2026-01-20 14:38:09.841 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:10 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:38:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:10.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:38:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:11.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:38:12 np0005588920 nova_compute[226886]: 2026-01-20 14:38:12.587 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:12 np0005588920 nova_compute[226886]: 2026-01-20 14:38:12.702 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:12.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:38:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:13.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:38:14 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:38:14 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:38:14 np0005588920 nova_compute[226886]: 2026-01-20 14:38:14.849 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:38:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:14.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:38:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:38:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:15.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:38:15 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:38:15 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:38:15 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:38:15 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:38:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:38:16.441 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:38:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:38:16.442 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:38:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:38:16.442 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:38:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:38:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:16.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:38:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:38:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:17.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:38:17 np0005588920 nova_compute[226886]: 2026-01-20 14:38:17.590 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:18 np0005588920 nova_compute[226886]: 2026-01-20 14:38:18.205 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:18 np0005588920 ovn_controller[133971]: 2026-01-20T14:38:18Z|00160|binding|INFO|Releasing lport 299efc3d-8746-45fd-93e1-2cb0aef5840e from this chassis (sb_readonly=0)
Jan 20 09:38:18 np0005588920 nova_compute[226886]: 2026-01-20 14:38:18.745 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:18.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:38:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:19.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:38:19 np0005588920 nova_compute[226886]: 2026-01-20 14:38:19.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:38:19 np0005588920 nova_compute[226886]: 2026-01-20 14:38:19.726 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 20 09:38:19 np0005588920 nova_compute[226886]: 2026-01-20 14:38:19.899 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:20 np0005588920 podman[248132]: 2026-01-20 14:38:20.062277766 +0000 UTC m=+0.126788746 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller)
Jan 20 09:38:20 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:38:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:20.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:21.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:21 np0005588920 nova_compute[226886]: 2026-01-20 14:38:21.750 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:38:22 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:38:22 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:38:22 np0005588920 nova_compute[226886]: 2026-01-20 14:38:22.592 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:22 np0005588920 nova_compute[226886]: 2026-01-20 14:38:22.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:38:22 np0005588920 nova_compute[226886]: 2026-01-20 14:38:22.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:38:22 np0005588920 nova_compute[226886]: 2026-01-20 14:38:22.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 09:38:22 np0005588920 nova_compute[226886]: 2026-01-20 14:38:22.928 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "refresh_cache-fe3803fb-61eb-4f86-98b1-b21728174f6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:38:22 np0005588920 nova_compute[226886]: 2026-01-20 14:38:22.929 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquired lock "refresh_cache-fe3803fb-61eb-4f86-98b1-b21728174f6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:38:22 np0005588920 nova_compute[226886]: 2026-01-20 14:38:22.929 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 09:38:22 np0005588920 nova_compute[226886]: 2026-01-20 14:38:22.930 226890 DEBUG nova.objects.instance [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid fe3803fb-61eb-4f86-98b1-b21728174f6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:38:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:22.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:23 np0005588920 nova_compute[226886]: 2026-01-20 14:38:23.171 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:23.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:23 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 09:38:23 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 2400.1 total, 600.0 interval#012Cumulative writes: 25K writes, 101K keys, 25K commit groups, 1.0 writes per commit group, ingest: 0.10 GB, 0.04 MB/s#012Cumulative WAL: 25K writes, 8665 syncs, 2.92 writes per sync, written: 0.10 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 12K writes, 51K keys, 12K commit groups, 1.0 writes per commit group, ingest: 51.44 MB, 0.09 MB/s#012Interval WAL: 12K writes, 5047 syncs, 2.57 writes per sync, written: 0.05 GB, 0.09 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 20 09:38:24 np0005588920 nova_compute[226886]: 2026-01-20 14:38:24.544 226890 DEBUG oslo_concurrency.lockutils [None req-de680c60-4e0c-40e8-9bb8-de40976b36af d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Acquiring lock "fe3803fb-61eb-4f86-98b1-b21728174f6a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:38:24 np0005588920 nova_compute[226886]: 2026-01-20 14:38:24.544 226890 DEBUG oslo_concurrency.lockutils [None req-de680c60-4e0c-40e8-9bb8-de40976b36af d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Lock "fe3803fb-61eb-4f86-98b1-b21728174f6a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:38:24 np0005588920 nova_compute[226886]: 2026-01-20 14:38:24.545 226890 DEBUG oslo_concurrency.lockutils [None req-de680c60-4e0c-40e8-9bb8-de40976b36af d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Acquiring lock "fe3803fb-61eb-4f86-98b1-b21728174f6a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:38:24 np0005588920 nova_compute[226886]: 2026-01-20 14:38:24.546 226890 DEBUG oslo_concurrency.lockutils [None req-de680c60-4e0c-40e8-9bb8-de40976b36af d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Lock "fe3803fb-61eb-4f86-98b1-b21728174f6a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:38:24 np0005588920 nova_compute[226886]: 2026-01-20 14:38:24.546 226890 DEBUG oslo_concurrency.lockutils [None req-de680c60-4e0c-40e8-9bb8-de40976b36af d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Lock "fe3803fb-61eb-4f86-98b1-b21728174f6a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:38:24 np0005588920 nova_compute[226886]: 2026-01-20 14:38:24.548 226890 INFO nova.compute.manager [None req-de680c60-4e0c-40e8-9bb8-de40976b36af d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Terminating instance#033[00m
Jan 20 09:38:24 np0005588920 nova_compute[226886]: 2026-01-20 14:38:24.550 226890 DEBUG nova.compute.manager [None req-de680c60-4e0c-40e8-9bb8-de40976b36af d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:38:24 np0005588920 kernel: tap2a415699-a6 (unregistering): left promiscuous mode
Jan 20 09:38:24 np0005588920 NetworkManager[49076]: <info>  [1768919904.6051] device (tap2a415699-a6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:38:24 np0005588920 ovn_controller[133971]: 2026-01-20T14:38:24Z|00161|binding|INFO|Releasing lport 2a415699-a6bd-4fe9-b90f-22feaa382913 from this chassis (sb_readonly=0)
Jan 20 09:38:24 np0005588920 ovn_controller[133971]: 2026-01-20T14:38:24Z|00162|binding|INFO|Setting lport 2a415699-a6bd-4fe9-b90f-22feaa382913 down in Southbound
Jan 20 09:38:24 np0005588920 ovn_controller[133971]: 2026-01-20T14:38:24Z|00163|binding|INFO|Removing iface tap2a415699-a6 ovn-installed in OVS
Jan 20 09:38:24 np0005588920 nova_compute[226886]: 2026-01-20 14:38:24.619 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:38:24.625 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:df:1f:18 10.100.0.6'], port_security=['fa:16:3e:df:1f:18 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'fe3803fb-61eb-4f86-98b1-b21728174f6a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '52415b3ad5fa4b42885bd73bc38ea3ad', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0f0d5567-d313-4d91-842c-185bd82081fa', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.237'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1ea8668d-5daa-4d78-b9b2-f4c3103dc0be, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=2a415699-a6bd-4fe9-b90f-22feaa382913) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:38:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:38:24.627 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 2a415699-a6bd-4fe9-b90f-22feaa382913 in datapath bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1 unbound from our chassis#033[00m
Jan 20 09:38:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:38:24.628 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:38:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:38:24.629 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d3cd7482-e3ab-4e7c-a717-2e52bfc721ce]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:38:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:38:24.629 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1 namespace which is not needed anymore#033[00m
Jan 20 09:38:24 np0005588920 nova_compute[226886]: 2026-01-20 14:38:24.654 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:24 np0005588920 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d0000003b.scope: Deactivated successfully.
Jan 20 09:38:24 np0005588920 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d0000003b.scope: Consumed 14.082s CPU time.
Jan 20 09:38:24 np0005588920 systemd-machined[196121]: Machine qemu-25-instance-0000003b terminated.
Jan 20 09:38:24 np0005588920 neutron-haproxy-ovnmeta-bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1[247937]: [NOTICE]   (247941) : haproxy version is 2.8.14-c23fe91
Jan 20 09:38:24 np0005588920 neutron-haproxy-ovnmeta-bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1[247937]: [NOTICE]   (247941) : path to executable is /usr/sbin/haproxy
Jan 20 09:38:24 np0005588920 neutron-haproxy-ovnmeta-bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1[247937]: [WARNING]  (247941) : Exiting Master process...
Jan 20 09:38:24 np0005588920 neutron-haproxy-ovnmeta-bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1[247937]: [ALERT]    (247941) : Current worker (247943) exited with code 143 (Terminated)
Jan 20 09:38:24 np0005588920 neutron-haproxy-ovnmeta-bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1[247937]: [WARNING]  (247941) : All workers exited. Exiting... (0)
Jan 20 09:38:24 np0005588920 systemd[1]: libpod-2ec9a5cd7e36a229de8f313fc5b21546a9def47d2735da73132f72d7424c8485.scope: Deactivated successfully.
Jan 20 09:38:24 np0005588920 nova_compute[226886]: 2026-01-20 14:38:24.775 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:24 np0005588920 nova_compute[226886]: 2026-01-20 14:38:24.783 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:24 np0005588920 podman[248231]: 2026-01-20 14:38:24.785121599 +0000 UTC m=+0.057419981 container died 2ec9a5cd7e36a229de8f313fc5b21546a9def47d2735da73132f72d7424c8485 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 20 09:38:24 np0005588920 nova_compute[226886]: 2026-01-20 14:38:24.796 226890 INFO nova.virt.libvirt.driver [-] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Instance destroyed successfully.#033[00m
Jan 20 09:38:24 np0005588920 nova_compute[226886]: 2026-01-20 14:38:24.797 226890 DEBUG nova.objects.instance [None req-de680c60-4e0c-40e8-9bb8-de40976b36af d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Lazy-loading 'resources' on Instance uuid fe3803fb-61eb-4f86-98b1-b21728174f6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:38:24 np0005588920 nova_compute[226886]: 2026-01-20 14:38:24.812 226890 DEBUG nova.virt.libvirt.vif [None req-de680c60-4e0c-40e8-9bb8-de40976b36af d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:37:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-836808259',display_name='tempest-ServersTestJSON-server-836808259',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-836808259',id=59,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF002s4g7NfltVTy/5qA7CIpXy9b6oAWF5Gh3G1MX2EkwnoUde50AHBSQaoFohXYmoz2yQYkOYpMmvcdh9PChiZg7hQh3VovUDI9db2jvTE3ZKjWnX88VE/aySam1zC5hg==',key_name='tempest-keypair-1007301700',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:37:44Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='52415b3ad5fa4b42885bd73bc38ea3ad',ramdisk_id='',reservation_id='r-xvgzkwgm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-631723368',owner_user_name='tempest-ServersTestJSON-631723368-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:37:44Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d640d75528bb42fdb7a5516f002216f2',uuid=fe3803fb-61eb-4f86-98b1-b21728174f6a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2a415699-a6bd-4fe9-b90f-22feaa382913", "address": "fa:16:3e:df:1f:18", "network": {"id": "bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1", "bridge": "br-int", "label": "tempest-ServersTestJSON-396551132-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "52415b3ad5fa4b42885bd73bc38ea3ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a415699-a6", "ovs_interfaceid": "2a415699-a6bd-4fe9-b90f-22feaa382913", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:38:24 np0005588920 nova_compute[226886]: 2026-01-20 14:38:24.813 226890 DEBUG nova.network.os_vif_util [None req-de680c60-4e0c-40e8-9bb8-de40976b36af d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Converting VIF {"id": "2a415699-a6bd-4fe9-b90f-22feaa382913", "address": "fa:16:3e:df:1f:18", "network": {"id": "bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1", "bridge": "br-int", "label": "tempest-ServersTestJSON-396551132-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "52415b3ad5fa4b42885bd73bc38ea3ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a415699-a6", "ovs_interfaceid": "2a415699-a6bd-4fe9-b90f-22feaa382913", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:38:24 np0005588920 nova_compute[226886]: 2026-01-20 14:38:24.814 226890 DEBUG nova.network.os_vif_util [None req-de680c60-4e0c-40e8-9bb8-de40976b36af d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:df:1f:18,bridge_name='br-int',has_traffic_filtering=True,id=2a415699-a6bd-4fe9-b90f-22feaa382913,network=Network(bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a415699-a6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:38:24 np0005588920 nova_compute[226886]: 2026-01-20 14:38:24.814 226890 DEBUG os_vif [None req-de680c60-4e0c-40e8-9bb8-de40976b36af d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:df:1f:18,bridge_name='br-int',has_traffic_filtering=True,id=2a415699-a6bd-4fe9-b90f-22feaa382913,network=Network(bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a415699-a6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:38:24 np0005588920 nova_compute[226886]: 2026-01-20 14:38:24.817 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:24 np0005588920 nova_compute[226886]: 2026-01-20 14:38:24.817 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2a415699-a6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:38:24 np0005588920 nova_compute[226886]: 2026-01-20 14:38:24.818 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:24 np0005588920 nova_compute[226886]: 2026-01-20 14:38:24.820 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:24 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2ec9a5cd7e36a229de8f313fc5b21546a9def47d2735da73132f72d7424c8485-userdata-shm.mount: Deactivated successfully.
Jan 20 09:38:24 np0005588920 systemd[1]: var-lib-containers-storage-overlay-1ea311ad2d69b6ac5665944112ef94c5af624393f5c80e0772c445b98a7200cc-merged.mount: Deactivated successfully.
Jan 20 09:38:24 np0005588920 nova_compute[226886]: 2026-01-20 14:38:24.825 226890 INFO os_vif [None req-de680c60-4e0c-40e8-9bb8-de40976b36af d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:df:1f:18,bridge_name='br-int',has_traffic_filtering=True,id=2a415699-a6bd-4fe9-b90f-22feaa382913,network=Network(bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a415699-a6')#033[00m
Jan 20 09:38:24 np0005588920 podman[248231]: 2026-01-20 14:38:24.835358797 +0000 UTC m=+0.107657159 container cleanup 2ec9a5cd7e36a229de8f313fc5b21546a9def47d2735da73132f72d7424c8485 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Jan 20 09:38:24 np0005588920 systemd[1]: libpod-conmon-2ec9a5cd7e36a229de8f313fc5b21546a9def47d2735da73132f72d7424c8485.scope: Deactivated successfully.
Jan 20 09:38:24 np0005588920 nova_compute[226886]: 2026-01-20 14:38:24.888 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Updating instance_info_cache with network_info: [{"id": "2a415699-a6bd-4fe9-b90f-22feaa382913", "address": "fa:16:3e:df:1f:18", "network": {"id": "bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1", "bridge": "br-int", "label": "tempest-ServersTestJSON-396551132-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "52415b3ad5fa4b42885bd73bc38ea3ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a415699-a6", "ovs_interfaceid": "2a415699-a6bd-4fe9-b90f-22feaa382913", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:38:24 np0005588920 podman[248285]: 2026-01-20 14:38:24.907903191 +0000 UTC m=+0.048895592 container remove 2ec9a5cd7e36a229de8f313fc5b21546a9def47d2735da73132f72d7424c8485 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:38:24 np0005588920 nova_compute[226886]: 2026-01-20 14:38:24.920 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Releasing lock "refresh_cache-fe3803fb-61eb-4f86-98b1-b21728174f6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:38:24 np0005588920 nova_compute[226886]: 2026-01-20 14:38:24.921 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 09:38:24 np0005588920 nova_compute[226886]: 2026-01-20 14:38:24.921 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:38:24 np0005588920 nova_compute[226886]: 2026-01-20 14:38:24.936 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:38:24.940 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f2b624a9-9d38-4476-9d27-6f3e8bd2184e]: (4, ('Tue Jan 20 02:38:24 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1 (2ec9a5cd7e36a229de8f313fc5b21546a9def47d2735da73132f72d7424c8485)\n2ec9a5cd7e36a229de8f313fc5b21546a9def47d2735da73132f72d7424c8485\nTue Jan 20 02:38:24 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1 (2ec9a5cd7e36a229de8f313fc5b21546a9def47d2735da73132f72d7424c8485)\n2ec9a5cd7e36a229de8f313fc5b21546a9def47d2735da73132f72d7424c8485\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:38:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:38:24.942 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f2ed39f6-8c56-4155-a0e5-5ee9554accca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:38:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:38:24.943 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbb9c4ba4-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:38:24 np0005588920 kernel: tapbb9c4ba4-70: left promiscuous mode
Jan 20 09:38:24 np0005588920 nova_compute[226886]: 2026-01-20 14:38:24.944 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:38:24.949 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[6b179fd0-3002-4a11-926b-4423c41dccaf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:38:24 np0005588920 nova_compute[226886]: 2026-01-20 14:38:24.957 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:24.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:38:24.966 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[07e16bb5-b7c5-4672-b70d-240797e65910]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:38:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:38:24.967 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0844ce52-243d-40ab-9930-87b82c05c76f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:38:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:38:24.986 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[baf913d3-b119-4f3b-b015-2c5d0bc81a39]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 490561, 'reachable_time': 21661, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 248303, 'error': None, 'target': 'ovnmeta-bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:38:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:38:24.989 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-bb9c4ba4-7d0f-4a0b-8588-dd53a55460d1 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:38:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:38:24.989 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[dd5f1aa8-d0d7-4395-aaa9-1f8a8f6c7b5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:38:24 np0005588920 systemd[1]: run-netns-ovnmeta\x2dbb9c4ba4\x2d7d0f\x2d4a0b\x2d8588\x2ddd53a55460d1.mount: Deactivated successfully.
Jan 20 09:38:25 np0005588920 nova_compute[226886]: 2026-01-20 14:38:25.080 226890 DEBUG nova.compute.manager [req-9d049650-2947-4b01-8ffe-39d52a11aad3 req-5d70f9d1-00f8-4e23-8ff0-11819440412c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Received event network-vif-unplugged-2a415699-a6bd-4fe9-b90f-22feaa382913 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:38:25 np0005588920 nova_compute[226886]: 2026-01-20 14:38:25.080 226890 DEBUG oslo_concurrency.lockutils [req-9d049650-2947-4b01-8ffe-39d52a11aad3 req-5d70f9d1-00f8-4e23-8ff0-11819440412c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "fe3803fb-61eb-4f86-98b1-b21728174f6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:38:25 np0005588920 nova_compute[226886]: 2026-01-20 14:38:25.081 226890 DEBUG oslo_concurrency.lockutils [req-9d049650-2947-4b01-8ffe-39d52a11aad3 req-5d70f9d1-00f8-4e23-8ff0-11819440412c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fe3803fb-61eb-4f86-98b1-b21728174f6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:38:25 np0005588920 nova_compute[226886]: 2026-01-20 14:38:25.081 226890 DEBUG oslo_concurrency.lockutils [req-9d049650-2947-4b01-8ffe-39d52a11aad3 req-5d70f9d1-00f8-4e23-8ff0-11819440412c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fe3803fb-61eb-4f86-98b1-b21728174f6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:38:25 np0005588920 nova_compute[226886]: 2026-01-20 14:38:25.081 226890 DEBUG nova.compute.manager [req-9d049650-2947-4b01-8ffe-39d52a11aad3 req-5d70f9d1-00f8-4e23-8ff0-11819440412c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] No waiting events found dispatching network-vif-unplugged-2a415699-a6bd-4fe9-b90f-22feaa382913 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:38:25 np0005588920 nova_compute[226886]: 2026-01-20 14:38:25.082 226890 DEBUG nova.compute.manager [req-9d049650-2947-4b01-8ffe-39d52a11aad3 req-5d70f9d1-00f8-4e23-8ff0-11819440412c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Received event network-vif-unplugged-2a415699-a6bd-4fe9-b90f-22feaa382913 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:38:25 np0005588920 nova_compute[226886]: 2026-01-20 14:38:25.278 226890 INFO nova.virt.libvirt.driver [None req-de680c60-4e0c-40e8-9bb8-de40976b36af d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Deleting instance files /var/lib/nova/instances/fe3803fb-61eb-4f86-98b1-b21728174f6a_del#033[00m
Jan 20 09:38:25 np0005588920 nova_compute[226886]: 2026-01-20 14:38:25.279 226890 INFO nova.virt.libvirt.driver [None req-de680c60-4e0c-40e8-9bb8-de40976b36af d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Deletion of /var/lib/nova/instances/fe3803fb-61eb-4f86-98b1-b21728174f6a_del complete#033[00m
Jan 20 09:38:25 np0005588920 nova_compute[226886]: 2026-01-20 14:38:25.339 226890 INFO nova.compute.manager [None req-de680c60-4e0c-40e8-9bb8-de40976b36af d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Took 0.79 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:38:25 np0005588920 nova_compute[226886]: 2026-01-20 14:38:25.340 226890 DEBUG oslo.service.loopingcall [None req-de680c60-4e0c-40e8-9bb8-de40976b36af d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:38:25 np0005588920 nova_compute[226886]: 2026-01-20 14:38:25.340 226890 DEBUG nova.compute.manager [-] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:38:25 np0005588920 nova_compute[226886]: 2026-01-20 14:38:25.340 226890 DEBUG nova.network.neutron [-] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:38:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:25.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:25 np0005588920 nova_compute[226886]: 2026-01-20 14:38:25.734 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:38:25 np0005588920 nova_compute[226886]: 2026-01-20 14:38:25.764 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:38:25 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:38:26 np0005588920 nova_compute[226886]: 2026-01-20 14:38:26.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:38:26 np0005588920 nova_compute[226886]: 2026-01-20 14:38:26.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:38:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:26.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:27 np0005588920 nova_compute[226886]: 2026-01-20 14:38:27.214 226890 DEBUG nova.compute.manager [req-afb73b6e-44a8-4852-bebd-656d4455235b req-0b6f705a-48ef-46f2-b7f2-42b5b836159f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Received event network-vif-plugged-2a415699-a6bd-4fe9-b90f-22feaa382913 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:38:27 np0005588920 nova_compute[226886]: 2026-01-20 14:38:27.214 226890 DEBUG oslo_concurrency.lockutils [req-afb73b6e-44a8-4852-bebd-656d4455235b req-0b6f705a-48ef-46f2-b7f2-42b5b836159f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "fe3803fb-61eb-4f86-98b1-b21728174f6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:38:27 np0005588920 nova_compute[226886]: 2026-01-20 14:38:27.214 226890 DEBUG oslo_concurrency.lockutils [req-afb73b6e-44a8-4852-bebd-656d4455235b req-0b6f705a-48ef-46f2-b7f2-42b5b836159f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fe3803fb-61eb-4f86-98b1-b21728174f6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:38:27 np0005588920 nova_compute[226886]: 2026-01-20 14:38:27.215 226890 DEBUG oslo_concurrency.lockutils [req-afb73b6e-44a8-4852-bebd-656d4455235b req-0b6f705a-48ef-46f2-b7f2-42b5b836159f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fe3803fb-61eb-4f86-98b1-b21728174f6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:38:27 np0005588920 nova_compute[226886]: 2026-01-20 14:38:27.215 226890 DEBUG nova.compute.manager [req-afb73b6e-44a8-4852-bebd-656d4455235b req-0b6f705a-48ef-46f2-b7f2-42b5b836159f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] No waiting events found dispatching network-vif-plugged-2a415699-a6bd-4fe9-b90f-22feaa382913 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:38:27 np0005588920 nova_compute[226886]: 2026-01-20 14:38:27.215 226890 WARNING nova.compute.manager [req-afb73b6e-44a8-4852-bebd-656d4455235b req-0b6f705a-48ef-46f2-b7f2-42b5b836159f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Received unexpected event network-vif-plugged-2a415699-a6bd-4fe9-b90f-22feaa382913 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:38:27 np0005588920 nova_compute[226886]: 2026-01-20 14:38:27.277 226890 DEBUG nova.network.neutron [-] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:38:27 np0005588920 nova_compute[226886]: 2026-01-20 14:38:27.305 226890 INFO nova.compute.manager [-] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Took 1.96 seconds to deallocate network for instance.#033[00m
Jan 20 09:38:27 np0005588920 nova_compute[226886]: 2026-01-20 14:38:27.365 226890 DEBUG oslo_concurrency.lockutils [None req-de680c60-4e0c-40e8-9bb8-de40976b36af d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:38:27 np0005588920 nova_compute[226886]: 2026-01-20 14:38:27.366 226890 DEBUG oslo_concurrency.lockutils [None req-de680c60-4e0c-40e8-9bb8-de40976b36af d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:38:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:27.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:27 np0005588920 nova_compute[226886]: 2026-01-20 14:38:27.464 226890 DEBUG nova.scheduler.client.report [None req-de680c60-4e0c-40e8-9bb8-de40976b36af d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Refreshing inventories for resource provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 20 09:38:27 np0005588920 nova_compute[226886]: 2026-01-20 14:38:27.560 226890 DEBUG nova.scheduler.client.report [None req-de680c60-4e0c-40e8-9bb8-de40976b36af d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Updating ProviderTree inventory for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 20 09:38:27 np0005588920 nova_compute[226886]: 2026-01-20 14:38:27.560 226890 DEBUG nova.compute.provider_tree [None req-de680c60-4e0c-40e8-9bb8-de40976b36af d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Updating inventory in ProviderTree for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 20 09:38:27 np0005588920 nova_compute[226886]: 2026-01-20 14:38:27.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:38:27 np0005588920 nova_compute[226886]: 2026-01-20 14:38:27.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 20 09:38:27 np0005588920 nova_compute[226886]: 2026-01-20 14:38:27.741 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 20 09:38:27 np0005588920 nova_compute[226886]: 2026-01-20 14:38:27.775 226890 DEBUG nova.scheduler.client.report [None req-de680c60-4e0c-40e8-9bb8-de40976b36af d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Refreshing aggregate associations for resource provider ff38e91c-3320-4831-90ac-bcffc89ba7b6, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 20 09:38:27 np0005588920 nova_compute[226886]: 2026-01-20 14:38:27.795 226890 DEBUG nova.scheduler.client.report [None req-de680c60-4e0c-40e8-9bb8-de40976b36af d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Refreshing trait associations for resource provider ff38e91c-3320-4831-90ac-bcffc89ba7b6, traits: COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_DEVICE_TAGGING,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 20 09:38:27 np0005588920 nova_compute[226886]: 2026-01-20 14:38:27.831 226890 DEBUG oslo_concurrency.processutils [None req-de680c60-4e0c-40e8-9bb8-de40976b36af d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:38:27 np0005588920 nova_compute[226886]: 2026-01-20 14:38:27.956 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:28 np0005588920 nova_compute[226886]: 2026-01-20 14:38:28.014 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:28 np0005588920 nova_compute[226886]: 2026-01-20 14:38:28.231 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:28 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:38:28 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/297649950' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:38:28 np0005588920 nova_compute[226886]: 2026-01-20 14:38:28.282 226890 DEBUG oslo_concurrency.processutils [None req-de680c60-4e0c-40e8-9bb8-de40976b36af d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:38:28 np0005588920 nova_compute[226886]: 2026-01-20 14:38:28.288 226890 DEBUG nova.compute.provider_tree [None req-de680c60-4e0c-40e8-9bb8-de40976b36af d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:38:28 np0005588920 nova_compute[226886]: 2026-01-20 14:38:28.308 226890 DEBUG nova.scheduler.client.report [None req-de680c60-4e0c-40e8-9bb8-de40976b36af d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:38:28 np0005588920 nova_compute[226886]: 2026-01-20 14:38:28.329 226890 DEBUG oslo_concurrency.lockutils [None req-de680c60-4e0c-40e8-9bb8-de40976b36af d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.964s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:38:28 np0005588920 nova_compute[226886]: 2026-01-20 14:38:28.358 226890 INFO nova.scheduler.client.report [None req-de680c60-4e0c-40e8-9bb8-de40976b36af d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Deleted allocations for instance fe3803fb-61eb-4f86-98b1-b21728174f6a#033[00m
Jan 20 09:38:28 np0005588920 nova_compute[226886]: 2026-01-20 14:38:28.433 226890 DEBUG oslo_concurrency.lockutils [None req-de680c60-4e0c-40e8-9bb8-de40976b36af d640d75528bb42fdb7a5516f002216f2 52415b3ad5fa4b42885bd73bc38ea3ad - - default default] Lock "fe3803fb-61eb-4f86-98b1-b21728174f6a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.889s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:38:28 np0005588920 nova_compute[226886]: 2026-01-20 14:38:28.741 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:38:28 np0005588920 nova_compute[226886]: 2026-01-20 14:38:28.742 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:38:28 np0005588920 nova_compute[226886]: 2026-01-20 14:38:28.775 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:38:28 np0005588920 nova_compute[226886]: 2026-01-20 14:38:28.776 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:38:28 np0005588920 nova_compute[226886]: 2026-01-20 14:38:28.776 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:38:28 np0005588920 nova_compute[226886]: 2026-01-20 14:38:28.776 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:38:28 np0005588920 nova_compute[226886]: 2026-01-20 14:38:28.777 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:38:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:38:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:28.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:38:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:38:29 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/416411339' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:38:29 np0005588920 nova_compute[226886]: 2026-01-20 14:38:29.301 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.524s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:38:29 np0005588920 nova_compute[226886]: 2026-01-20 14:38:29.398 226890 DEBUG nova.compute.manager [req-cbf8508f-1809-45d9-bc18-4c77030b770b req-f3c01d68-be96-491e-b6a7-42f56f3e5957 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Received event network-vif-deleted-2a415699-a6bd-4fe9-b90f-22feaa382913 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:38:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:29.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:29 np0005588920 nova_compute[226886]: 2026-01-20 14:38:29.500 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:38:29 np0005588920 nova_compute[226886]: 2026-01-20 14:38:29.501 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4682MB free_disk=20.959110260009766GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:38:29 np0005588920 nova_compute[226886]: 2026-01-20 14:38:29.501 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:38:29 np0005588920 nova_compute[226886]: 2026-01-20 14:38:29.501 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:38:29 np0005588920 nova_compute[226886]: 2026-01-20 14:38:29.546 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:38:29 np0005588920 nova_compute[226886]: 2026-01-20 14:38:29.547 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:38:29 np0005588920 nova_compute[226886]: 2026-01-20 14:38:29.567 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:38:29 np0005588920 nova_compute[226886]: 2026-01-20 14:38:29.820 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:29 np0005588920 nova_compute[226886]: 2026-01-20 14:38:29.939 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:38:29 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3805276064' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:38:29 np0005588920 nova_compute[226886]: 2026-01-20 14:38:29.990 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:38:29 np0005588920 nova_compute[226886]: 2026-01-20 14:38:29.997 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:38:30 np0005588920 nova_compute[226886]: 2026-01-20 14:38:30.011 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:38:30 np0005588920 nova_compute[226886]: 2026-01-20 14:38:30.041 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:38:30 np0005588920 nova_compute[226886]: 2026-01-20 14:38:30.041 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.540s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:38:30 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:38:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:30.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:38:31.004 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:38:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:38:31.005 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:38:31 np0005588920 nova_compute[226886]: 2026-01-20 14:38:31.024 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:38:31 np0005588920 nova_compute[226886]: 2026-01-20 14:38:31.025 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:38:31 np0005588920 nova_compute[226886]: 2026-01-20 14:38:31.025 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:38:31 np0005588920 nova_compute[226886]: 2026-01-20 14:38:31.089 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:31.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:32.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:33 np0005588920 podman[248373]: 2026-01-20 14:38:33.254535212 +0000 UTC m=+0.074851737 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 20 09:38:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:33.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:34 np0005588920 nova_compute[226886]: 2026-01-20 14:38:34.823 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:34.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:34 np0005588920 nova_compute[226886]: 2026-01-20 14:38:34.992 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:35.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:35 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:38:36 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:38:36.006 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:38:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:36.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:37.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:38.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:39.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:39 np0005588920 nova_compute[226886]: 2026-01-20 14:38:39.795 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768919904.7946415, fe3803fb-61eb-4f86-98b1-b21728174f6a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:38:39 np0005588920 nova_compute[226886]: 2026-01-20 14:38:39.796 226890 INFO nova.compute.manager [-] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:38:39 np0005588920 nova_compute[226886]: 2026-01-20 14:38:39.814 226890 DEBUG nova.compute.manager [None req-b2dcd94e-96c5-4f50-8bf9-61ac81b6a645 - - - - - -] [instance: fe3803fb-61eb-4f86-98b1-b21728174f6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:38:39 np0005588920 nova_compute[226886]: 2026-01-20 14:38:39.827 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:39 np0005588920 nova_compute[226886]: 2026-01-20 14:38:39.994 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:40 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:38:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:38:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:40.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:38:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:41.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:42.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:42 np0005588920 nova_compute[226886]: 2026-01-20 14:38:42.997 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:38:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:43.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:44 np0005588920 nova_compute[226886]: 2026-01-20 14:38:44.831 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:38:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:44.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:38:45 np0005588920 nova_compute[226886]: 2026-01-20 14:38:45.043 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:45.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:45 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:38:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:38:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:46.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:38:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:47.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:47 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #64. Immutable memtables: 0.
Jan 20 09:38:47 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:38:47.887808) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 09:38:47 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 37] Flushing memtable with next log file: 64
Jan 20 09:38:47 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919927887916, "job": 37, "event": "flush_started", "num_memtables": 1, "num_entries": 2829, "num_deletes": 526, "total_data_size": 5727781, "memory_usage": 5817376, "flush_reason": "Manual Compaction"}
Jan 20 09:38:47 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 37] Level-0 flush table #65: started
Jan 20 09:38:47 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919927926490, "cf_name": "default", "job": 37, "event": "table_file_creation", "file_number": 65, "file_size": 3737600, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 33445, "largest_seqno": 36269, "table_properties": {"data_size": 3726364, "index_size": 6834, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3333, "raw_key_size": 26949, "raw_average_key_size": 20, "raw_value_size": 3701618, "raw_average_value_size": 2791, "num_data_blocks": 296, "num_entries": 1326, "num_filter_entries": 1326, "num_deletions": 526, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768919747, "oldest_key_time": 1768919747, "file_creation_time": 1768919927, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 65, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:38:47 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 37] Flush lasted 38723 microseconds, and 10519 cpu microseconds.
Jan 20 09:38:47 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:38:47 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:38:47.926534) [db/flush_job.cc:967] [default] [JOB 37] Level-0 flush table #65: 3737600 bytes OK
Jan 20 09:38:47 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:38:47.926550) [db/memtable_list.cc:519] [default] Level-0 commit table #65 started
Jan 20 09:38:47 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:38:47.928233) [db/memtable_list.cc:722] [default] Level-0 commit table #65: memtable #1 done
Jan 20 09:38:47 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:38:47.928249) EVENT_LOG_v1 {"time_micros": 1768919927928244, "job": 37, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 09:38:47 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:38:47.928268) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 09:38:47 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 37] Try to delete WAL files size 5714208, prev total WAL file size 5714208, number of live WAL files 2.
Jan 20 09:38:47 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000061.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:38:47 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:38:47.929651) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032373631' seq:72057594037927935, type:22 .. '7061786F730033303133' seq:0, type:0; will stop at (end)
Jan 20 09:38:47 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 38] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 09:38:47 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 37 Base level 0, inputs: [65(3650KB)], [63(9941KB)]
Jan 20 09:38:47 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919927929795, "job": 38, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [65], "files_L6": [63], "score": -1, "input_data_size": 13918171, "oldest_snapshot_seqno": -1}
Jan 20 09:38:48 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 38] Generated table #66: 6237 keys, 11936372 bytes, temperature: kUnknown
Jan 20 09:38:48 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919928064533, "cf_name": "default", "job": 38, "event": "table_file_creation", "file_number": 66, "file_size": 11936372, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11891041, "index_size": 28645, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 15621, "raw_key_size": 159248, "raw_average_key_size": 25, "raw_value_size": 11775381, "raw_average_value_size": 1887, "num_data_blocks": 1154, "num_entries": 6237, "num_filter_entries": 6237, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768919927, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 66, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:38:48 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:38:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:38:48.064949) [db/compaction/compaction_job.cc:1663] [default] [JOB 38] Compacted 1@0 + 1@6 files to L6 => 11936372 bytes
Jan 20 09:38:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:38:48.066512) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 103.2 rd, 88.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.6, 9.7 +0.0 blob) out(11.4 +0.0 blob), read-write-amplify(6.9) write-amplify(3.2) OK, records in: 7293, records dropped: 1056 output_compression: NoCompression
Jan 20 09:38:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:38:48.066542) EVENT_LOG_v1 {"time_micros": 1768919928066528, "job": 38, "event": "compaction_finished", "compaction_time_micros": 134917, "compaction_time_cpu_micros": 42284, "output_level": 6, "num_output_files": 1, "total_output_size": 11936372, "num_input_records": 7293, "num_output_records": 6237, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 09:38:48 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000065.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:38:48 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919928068392, "job": 38, "event": "table_file_deletion", "file_number": 65}
Jan 20 09:38:48 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000063.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:38:48 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919928070771, "job": 38, "event": "table_file_deletion", "file_number": 63}
Jan 20 09:38:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:38:47.929494) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:38:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:38:48.070880) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:38:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:38:48.070886) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:38:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:38:48.070887) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:38:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:38:48.070889) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:38:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:38:48.070890) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:38:48 np0005588920 nova_compute[226886]: 2026-01-20 14:38:48.355 226890 DEBUG oslo_concurrency.lockutils [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquiring lock "c1a45fae-79ce-48c2-81b9-4d1e30165d46" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:38:48 np0005588920 nova_compute[226886]: 2026-01-20 14:38:48.355 226890 DEBUG oslo_concurrency.lockutils [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "c1a45fae-79ce-48c2-81b9-4d1e30165d46" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:38:48 np0005588920 nova_compute[226886]: 2026-01-20 14:38:48.376 226890 DEBUG nova.compute.manager [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:38:48 np0005588920 nova_compute[226886]: 2026-01-20 14:38:48.469 226890 DEBUG oslo_concurrency.lockutils [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:38:48 np0005588920 nova_compute[226886]: 2026-01-20 14:38:48.470 226890 DEBUG oslo_concurrency.lockutils [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:38:48 np0005588920 nova_compute[226886]: 2026-01-20 14:38:48.477 226890 DEBUG nova.virt.hardware [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:38:48 np0005588920 nova_compute[226886]: 2026-01-20 14:38:48.477 226890 INFO nova.compute.claims [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 09:38:48 np0005588920 nova_compute[226886]: 2026-01-20 14:38:48.570 226890 DEBUG oslo_concurrency.processutils [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:38:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:48.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:49 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:38:49 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3403050049' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:38:49 np0005588920 nova_compute[226886]: 2026-01-20 14:38:49.043 226890 DEBUG oslo_concurrency.processutils [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:38:49 np0005588920 nova_compute[226886]: 2026-01-20 14:38:49.052 226890 DEBUG nova.compute.provider_tree [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:38:49 np0005588920 nova_compute[226886]: 2026-01-20 14:38:49.074 226890 DEBUG nova.scheduler.client.report [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:38:49 np0005588920 nova_compute[226886]: 2026-01-20 14:38:49.103 226890 DEBUG oslo_concurrency.lockutils [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.634s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:38:49 np0005588920 nova_compute[226886]: 2026-01-20 14:38:49.104 226890 DEBUG nova.compute.manager [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:38:49 np0005588920 nova_compute[226886]: 2026-01-20 14:38:49.162 226890 DEBUG nova.compute.manager [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:38:49 np0005588920 nova_compute[226886]: 2026-01-20 14:38:49.162 226890 DEBUG nova.network.neutron [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:38:49 np0005588920 nova_compute[226886]: 2026-01-20 14:38:49.190 226890 INFO nova.virt.libvirt.driver [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:38:49 np0005588920 nova_compute[226886]: 2026-01-20 14:38:49.216 226890 DEBUG nova.compute.manager [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:38:49 np0005588920 nova_compute[226886]: 2026-01-20 14:38:49.375 226890 DEBUG nova.compute.manager [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:38:49 np0005588920 nova_compute[226886]: 2026-01-20 14:38:49.376 226890 DEBUG nova.virt.libvirt.driver [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:38:49 np0005588920 nova_compute[226886]: 2026-01-20 14:38:49.377 226890 INFO nova.virt.libvirt.driver [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Creating image(s)#033[00m
Jan 20 09:38:49 np0005588920 nova_compute[226886]: 2026-01-20 14:38:49.409 226890 DEBUG nova.storage.rbd_utils [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] rbd image c1a45fae-79ce-48c2-81b9-4d1e30165d46_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:38:49 np0005588920 nova_compute[226886]: 2026-01-20 14:38:49.444 226890 DEBUG nova.storage.rbd_utils [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] rbd image c1a45fae-79ce-48c2-81b9-4d1e30165d46_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:38:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:49.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:49 np0005588920 nova_compute[226886]: 2026-01-20 14:38:49.483 226890 DEBUG nova.storage.rbd_utils [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] rbd image c1a45fae-79ce-48c2-81b9-4d1e30165d46_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:38:49 np0005588920 nova_compute[226886]: 2026-01-20 14:38:49.489 226890 DEBUG oslo_concurrency.processutils [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:38:49 np0005588920 nova_compute[226886]: 2026-01-20 14:38:49.520 226890 DEBUG nova.policy [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c8a9fb458d27434495a77a94827b6097', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e3f93fd4b2154dda9f38e62334904303', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:38:49 np0005588920 nova_compute[226886]: 2026-01-20 14:38:49.555 226890 DEBUG oslo_concurrency.processutils [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:38:49 np0005588920 nova_compute[226886]: 2026-01-20 14:38:49.557 226890 DEBUG oslo_concurrency.lockutils [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:38:49 np0005588920 nova_compute[226886]: 2026-01-20 14:38:49.558 226890 DEBUG oslo_concurrency.lockutils [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:38:49 np0005588920 nova_compute[226886]: 2026-01-20 14:38:49.558 226890 DEBUG oslo_concurrency.lockutils [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:38:49 np0005588920 nova_compute[226886]: 2026-01-20 14:38:49.595 226890 DEBUG nova.storage.rbd_utils [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] rbd image c1a45fae-79ce-48c2-81b9-4d1e30165d46_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:38:49 np0005588920 nova_compute[226886]: 2026-01-20 14:38:49.600 226890 DEBUG oslo_concurrency.processutils [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 c1a45fae-79ce-48c2-81b9-4d1e30165d46_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:38:49 np0005588920 nova_compute[226886]: 2026-01-20 14:38:49.835 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:49 np0005588920 nova_compute[226886]: 2026-01-20 14:38:49.931 226890 DEBUG oslo_concurrency.processutils [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 c1a45fae-79ce-48c2-81b9-4d1e30165d46_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.330s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:38:50 np0005588920 nova_compute[226886]: 2026-01-20 14:38:50.029 226890 DEBUG nova.storage.rbd_utils [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] resizing rbd image c1a45fae-79ce-48c2-81b9-4d1e30165d46_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:38:50 np0005588920 nova_compute[226886]: 2026-01-20 14:38:50.084 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:50 np0005588920 nova_compute[226886]: 2026-01-20 14:38:50.197 226890 DEBUG nova.objects.instance [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lazy-loading 'migration_context' on Instance uuid c1a45fae-79ce-48c2-81b9-4d1e30165d46 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:38:50 np0005588920 nova_compute[226886]: 2026-01-20 14:38:50.222 226890 DEBUG nova.virt.libvirt.driver [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:38:50 np0005588920 nova_compute[226886]: 2026-01-20 14:38:50.223 226890 DEBUG nova.virt.libvirt.driver [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Ensure instance console log exists: /var/lib/nova/instances/c1a45fae-79ce-48c2-81b9-4d1e30165d46/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:38:50 np0005588920 nova_compute[226886]: 2026-01-20 14:38:50.224 226890 DEBUG oslo_concurrency.lockutils [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:38:50 np0005588920 nova_compute[226886]: 2026-01-20 14:38:50.224 226890 DEBUG oslo_concurrency.lockutils [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:38:50 np0005588920 nova_compute[226886]: 2026-01-20 14:38:50.225 226890 DEBUG oslo_concurrency.lockutils [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:38:50 np0005588920 nova_compute[226886]: 2026-01-20 14:38:50.501 226890 DEBUG nova.network.neutron [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Successfully created port: 82b46c8a-7331-4dba-b12c-3c4bd0d70a52 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:38:50 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:38:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:50.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:51 np0005588920 podman[248581]: 2026-01-20 14:38:51.006621297 +0000 UTC m=+0.083519199 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Jan 20 09:38:51 np0005588920 nova_compute[226886]: 2026-01-20 14:38:51.436 226890 DEBUG nova.network.neutron [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Successfully updated port: 82b46c8a-7331-4dba-b12c-3c4bd0d70a52 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:38:51 np0005588920 nova_compute[226886]: 2026-01-20 14:38:51.455 226890 DEBUG oslo_concurrency.lockutils [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquiring lock "refresh_cache-c1a45fae-79ce-48c2-81b9-4d1e30165d46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:38:51 np0005588920 nova_compute[226886]: 2026-01-20 14:38:51.456 226890 DEBUG oslo_concurrency.lockutils [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquired lock "refresh_cache-c1a45fae-79ce-48c2-81b9-4d1e30165d46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:38:51 np0005588920 nova_compute[226886]: 2026-01-20 14:38:51.456 226890 DEBUG nova.network.neutron [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:38:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:51.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:51 np0005588920 nova_compute[226886]: 2026-01-20 14:38:51.538 226890 DEBUG nova.compute.manager [req-a526fa68-1a33-4fff-bd78-6b3ca831fcbc req-4a7a8c84-2d3f-4023-a810-71f78b4ad303 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Received event network-changed-82b46c8a-7331-4dba-b12c-3c4bd0d70a52 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:38:51 np0005588920 nova_compute[226886]: 2026-01-20 14:38:51.539 226890 DEBUG nova.compute.manager [req-a526fa68-1a33-4fff-bd78-6b3ca831fcbc req-4a7a8c84-2d3f-4023-a810-71f78b4ad303 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Refreshing instance network info cache due to event network-changed-82b46c8a-7331-4dba-b12c-3c4bd0d70a52. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:38:51 np0005588920 nova_compute[226886]: 2026-01-20 14:38:51.539 226890 DEBUG oslo_concurrency.lockutils [req-a526fa68-1a33-4fff-bd78-6b3ca831fcbc req-4a7a8c84-2d3f-4023-a810-71f78b4ad303 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-c1a45fae-79ce-48c2-81b9-4d1e30165d46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:38:51 np0005588920 nova_compute[226886]: 2026-01-20 14:38:51.612 226890 DEBUG nova.network.neutron [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:38:52 np0005588920 nova_compute[226886]: 2026-01-20 14:38:52.430 226890 DEBUG nova.network.neutron [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Updating instance_info_cache with network_info: [{"id": "82b46c8a-7331-4dba-b12c-3c4bd0d70a52", "address": "fa:16:3e:b2:0f:fa", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82b46c8a-73", "ovs_interfaceid": "82b46c8a-7331-4dba-b12c-3c4bd0d70a52", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:38:52 np0005588920 nova_compute[226886]: 2026-01-20 14:38:52.454 226890 DEBUG oslo_concurrency.lockutils [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Releasing lock "refresh_cache-c1a45fae-79ce-48c2-81b9-4d1e30165d46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:38:52 np0005588920 nova_compute[226886]: 2026-01-20 14:38:52.454 226890 DEBUG nova.compute.manager [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Instance network_info: |[{"id": "82b46c8a-7331-4dba-b12c-3c4bd0d70a52", "address": "fa:16:3e:b2:0f:fa", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82b46c8a-73", "ovs_interfaceid": "82b46c8a-7331-4dba-b12c-3c4bd0d70a52", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:38:52 np0005588920 nova_compute[226886]: 2026-01-20 14:38:52.455 226890 DEBUG oslo_concurrency.lockutils [req-a526fa68-1a33-4fff-bd78-6b3ca831fcbc req-4a7a8c84-2d3f-4023-a810-71f78b4ad303 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-c1a45fae-79ce-48c2-81b9-4d1e30165d46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:38:52 np0005588920 nova_compute[226886]: 2026-01-20 14:38:52.455 226890 DEBUG nova.network.neutron [req-a526fa68-1a33-4fff-bd78-6b3ca831fcbc req-4a7a8c84-2d3f-4023-a810-71f78b4ad303 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Refreshing network info cache for port 82b46c8a-7331-4dba-b12c-3c4bd0d70a52 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:38:52 np0005588920 nova_compute[226886]: 2026-01-20 14:38:52.458 226890 DEBUG nova.virt.libvirt.driver [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Start _get_guest_xml network_info=[{"id": "82b46c8a-7331-4dba-b12c-3c4bd0d70a52", "address": "fa:16:3e:b2:0f:fa", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82b46c8a-73", "ovs_interfaceid": "82b46c8a-7331-4dba-b12c-3c4bd0d70a52", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:38:52 np0005588920 nova_compute[226886]: 2026-01-20 14:38:52.464 226890 WARNING nova.virt.libvirt.driver [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:38:52 np0005588920 nova_compute[226886]: 2026-01-20 14:38:52.469 226890 DEBUG nova.virt.libvirt.host [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:38:52 np0005588920 nova_compute[226886]: 2026-01-20 14:38:52.470 226890 DEBUG nova.virt.libvirt.host [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:38:52 np0005588920 nova_compute[226886]: 2026-01-20 14:38:52.479 226890 DEBUG nova.virt.libvirt.host [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:38:52 np0005588920 nova_compute[226886]: 2026-01-20 14:38:52.480 226890 DEBUG nova.virt.libvirt.host [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:38:52 np0005588920 nova_compute[226886]: 2026-01-20 14:38:52.481 226890 DEBUG nova.virt.libvirt.driver [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:38:52 np0005588920 nova_compute[226886]: 2026-01-20 14:38:52.482 226890 DEBUG nova.virt.hardware [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:38:52 np0005588920 nova_compute[226886]: 2026-01-20 14:38:52.482 226890 DEBUG nova.virt.hardware [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:38:52 np0005588920 nova_compute[226886]: 2026-01-20 14:38:52.482 226890 DEBUG nova.virt.hardware [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:38:52 np0005588920 nova_compute[226886]: 2026-01-20 14:38:52.483 226890 DEBUG nova.virt.hardware [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:38:52 np0005588920 nova_compute[226886]: 2026-01-20 14:38:52.483 226890 DEBUG nova.virt.hardware [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:38:52 np0005588920 nova_compute[226886]: 2026-01-20 14:38:52.483 226890 DEBUG nova.virt.hardware [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:38:52 np0005588920 nova_compute[226886]: 2026-01-20 14:38:52.483 226890 DEBUG nova.virt.hardware [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:38:52 np0005588920 nova_compute[226886]: 2026-01-20 14:38:52.484 226890 DEBUG nova.virt.hardware [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:38:52 np0005588920 nova_compute[226886]: 2026-01-20 14:38:52.484 226890 DEBUG nova.virt.hardware [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:38:52 np0005588920 nova_compute[226886]: 2026-01-20 14:38:52.484 226890 DEBUG nova.virt.hardware [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:38:52 np0005588920 nova_compute[226886]: 2026-01-20 14:38:52.484 226890 DEBUG nova.virt.hardware [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:38:52 np0005588920 nova_compute[226886]: 2026-01-20 14:38:52.487 226890 DEBUG oslo_concurrency.processutils [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:38:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:38:52 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3938428519' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:38:52 np0005588920 nova_compute[226886]: 2026-01-20 14:38:52.937 226890 DEBUG oslo_concurrency.processutils [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:38:52 np0005588920 nova_compute[226886]: 2026-01-20 14:38:52.972 226890 DEBUG nova.storage.rbd_utils [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] rbd image c1a45fae-79ce-48c2-81b9-4d1e30165d46_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:38:52 np0005588920 nova_compute[226886]: 2026-01-20 14:38:52.977 226890 DEBUG oslo_concurrency.processutils [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:38:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:52.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:53 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:38:53 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/971674263' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:38:53 np0005588920 nova_compute[226886]: 2026-01-20 14:38:53.424 226890 DEBUG oslo_concurrency.processutils [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:38:53 np0005588920 nova_compute[226886]: 2026-01-20 14:38:53.425 226890 DEBUG nova.virt.libvirt.vif [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:38:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1231961321',display_name='tempest-AttachInterfacesTestJSON-server-1231961321',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1231961321',id=62,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI6np03AtwHHtTwHRlB7NMAsNqIU3PHfCmtXu/3XUK0jxxLomgnnligcXSu/L+GIBA+09ag+WKdbpS2RoKjhZ7ql3UhQf0nGVICsCZpNRSjPMQDiRaRNyEakPPHow9Jn5w==',key_name='tempest-keypair-1916620246',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e3f93fd4b2154dda9f38e62334904303',ramdisk_id='',reservation_id='r-3wujw3bt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-305746947',owner_user_name='tempest-AttachInterfacesTestJSON-305746947-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:38:49Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c8a9fb458d27434495a77a94827b6097',uuid=c1a45fae-79ce-48c2-81b9-4d1e30165d46,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "82b46c8a-7331-4dba-b12c-3c4bd0d70a52", "address": "fa:16:3e:b2:0f:fa", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82b46c8a-73", "ovs_interfaceid": "82b46c8a-7331-4dba-b12c-3c4bd0d70a52", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:38:53 np0005588920 nova_compute[226886]: 2026-01-20 14:38:53.426 226890 DEBUG nova.network.os_vif_util [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converting VIF {"id": "82b46c8a-7331-4dba-b12c-3c4bd0d70a52", "address": "fa:16:3e:b2:0f:fa", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82b46c8a-73", "ovs_interfaceid": "82b46c8a-7331-4dba-b12c-3c4bd0d70a52", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:38:53 np0005588920 nova_compute[226886]: 2026-01-20 14:38:53.426 226890 DEBUG nova.network.os_vif_util [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b2:0f:fa,bridge_name='br-int',has_traffic_filtering=True,id=82b46c8a-7331-4dba-b12c-3c4bd0d70a52,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82b46c8a-73') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:38:53 np0005588920 nova_compute[226886]: 2026-01-20 14:38:53.427 226890 DEBUG nova.objects.instance [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lazy-loading 'pci_devices' on Instance uuid c1a45fae-79ce-48c2-81b9-4d1e30165d46 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:38:53 np0005588920 nova_compute[226886]: 2026-01-20 14:38:53.445 226890 DEBUG nova.virt.libvirt.driver [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:38:53 np0005588920 nova_compute[226886]:  <uuid>c1a45fae-79ce-48c2-81b9-4d1e30165d46</uuid>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:  <name>instance-0000003e</name>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:38:53 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:      <nova:name>tempest-AttachInterfacesTestJSON-server-1231961321</nova:name>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:38:52</nova:creationTime>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:38:53 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:        <nova:user uuid="c8a9fb458d27434495a77a94827b6097">tempest-AttachInterfacesTestJSON-305746947-project-member</nova:user>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:        <nova:project uuid="e3f93fd4b2154dda9f38e62334904303">tempest-AttachInterfacesTestJSON-305746947</nova:project>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:        <nova:port uuid="82b46c8a-7331-4dba-b12c-3c4bd0d70a52">
Jan 20 09:38:53 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:      <entry name="serial">c1a45fae-79ce-48c2-81b9-4d1e30165d46</entry>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:      <entry name="uuid">c1a45fae-79ce-48c2-81b9-4d1e30165d46</entry>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:38:53 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/c1a45fae-79ce-48c2-81b9-4d1e30165d46_disk">
Jan 20 09:38:53 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:38:53 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:38:53 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/c1a45fae-79ce-48c2-81b9-4d1e30165d46_disk.config">
Jan 20 09:38:53 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:38:53 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:38:53 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:b2:0f:fa"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:      <target dev="tap82b46c8a-73"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:38:53 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/c1a45fae-79ce-48c2-81b9-4d1e30165d46/console.log" append="off"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:38:53 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:38:53 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:38:53 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:38:53 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:38:53 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:38:53 np0005588920 nova_compute[226886]: 2026-01-20 14:38:53.447 226890 DEBUG nova.compute.manager [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Preparing to wait for external event network-vif-plugged-82b46c8a-7331-4dba-b12c-3c4bd0d70a52 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:38:53 np0005588920 nova_compute[226886]: 2026-01-20 14:38:53.447 226890 DEBUG oslo_concurrency.lockutils [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquiring lock "c1a45fae-79ce-48c2-81b9-4d1e30165d46-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:38:53 np0005588920 nova_compute[226886]: 2026-01-20 14:38:53.447 226890 DEBUG oslo_concurrency.lockutils [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "c1a45fae-79ce-48c2-81b9-4d1e30165d46-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:38:53 np0005588920 nova_compute[226886]: 2026-01-20 14:38:53.447 226890 DEBUG oslo_concurrency.lockutils [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "c1a45fae-79ce-48c2-81b9-4d1e30165d46-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:38:53 np0005588920 nova_compute[226886]: 2026-01-20 14:38:53.448 226890 DEBUG nova.virt.libvirt.vif [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:38:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1231961321',display_name='tempest-AttachInterfacesTestJSON-server-1231961321',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1231961321',id=62,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI6np03AtwHHtTwHRlB7NMAsNqIU3PHfCmtXu/3XUK0jxxLomgnnligcXSu/L+GIBA+09ag+WKdbpS2RoKjhZ7ql3UhQf0nGVICsCZpNRSjPMQDiRaRNyEakPPHow9Jn5w==',key_name='tempest-keypair-1916620246',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e3f93fd4b2154dda9f38e62334904303',ramdisk_id='',reservation_id='r-3wujw3bt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-305746947',owner_user_name='tempest-AttachInterfacesTestJSON-305746947-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:38:49Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c8a9fb458d27434495a77a94827b6097',uuid=c1a45fae-79ce-48c2-81b9-4d1e30165d46,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "82b46c8a-7331-4dba-b12c-3c4bd0d70a52", "address": "fa:16:3e:b2:0f:fa", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82b46c8a-73", "ovs_interfaceid": "82b46c8a-7331-4dba-b12c-3c4bd0d70a52", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:38:53 np0005588920 nova_compute[226886]: 2026-01-20 14:38:53.448 226890 DEBUG nova.network.os_vif_util [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converting VIF {"id": "82b46c8a-7331-4dba-b12c-3c4bd0d70a52", "address": "fa:16:3e:b2:0f:fa", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82b46c8a-73", "ovs_interfaceid": "82b46c8a-7331-4dba-b12c-3c4bd0d70a52", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:38:53 np0005588920 nova_compute[226886]: 2026-01-20 14:38:53.449 226890 DEBUG nova.network.os_vif_util [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b2:0f:fa,bridge_name='br-int',has_traffic_filtering=True,id=82b46c8a-7331-4dba-b12c-3c4bd0d70a52,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82b46c8a-73') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:38:53 np0005588920 nova_compute[226886]: 2026-01-20 14:38:53.449 226890 DEBUG os_vif [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b2:0f:fa,bridge_name='br-int',has_traffic_filtering=True,id=82b46c8a-7331-4dba-b12c-3c4bd0d70a52,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82b46c8a-73') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:38:53 np0005588920 nova_compute[226886]: 2026-01-20 14:38:53.450 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:53 np0005588920 nova_compute[226886]: 2026-01-20 14:38:53.450 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:38:53 np0005588920 nova_compute[226886]: 2026-01-20 14:38:53.451 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:38:53 np0005588920 nova_compute[226886]: 2026-01-20 14:38:53.452 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:53 np0005588920 nova_compute[226886]: 2026-01-20 14:38:53.453 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap82b46c8a-73, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:38:53 np0005588920 nova_compute[226886]: 2026-01-20 14:38:53.453 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap82b46c8a-73, col_values=(('external_ids', {'iface-id': '82b46c8a-7331-4dba-b12c-3c4bd0d70a52', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b2:0f:fa', 'vm-uuid': 'c1a45fae-79ce-48c2-81b9-4d1e30165d46'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:38:53 np0005588920 nova_compute[226886]: 2026-01-20 14:38:53.454 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:53 np0005588920 NetworkManager[49076]: <info>  [1768919933.4557] manager: (tap82b46c8a-73): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/93)
Jan 20 09:38:53 np0005588920 nova_compute[226886]: 2026-01-20 14:38:53.459 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:38:53 np0005588920 nova_compute[226886]: 2026-01-20 14:38:53.461 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:53 np0005588920 nova_compute[226886]: 2026-01-20 14:38:53.462 226890 INFO os_vif [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b2:0f:fa,bridge_name='br-int',has_traffic_filtering=True,id=82b46c8a-7331-4dba-b12c-3c4bd0d70a52,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82b46c8a-73')#033[00m
Jan 20 09:38:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:53.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:53 np0005588920 nova_compute[226886]: 2026-01-20 14:38:53.519 226890 DEBUG nova.virt.libvirt.driver [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:38:53 np0005588920 nova_compute[226886]: 2026-01-20 14:38:53.519 226890 DEBUG nova.virt.libvirt.driver [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:38:53 np0005588920 nova_compute[226886]: 2026-01-20 14:38:53.520 226890 DEBUG nova.virt.libvirt.driver [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] No VIF found with MAC fa:16:3e:b2:0f:fa, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:38:53 np0005588920 nova_compute[226886]: 2026-01-20 14:38:53.520 226890 INFO nova.virt.libvirt.driver [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Using config drive#033[00m
Jan 20 09:38:53 np0005588920 nova_compute[226886]: 2026-01-20 14:38:53.542 226890 DEBUG nova.storage.rbd_utils [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] rbd image c1a45fae-79ce-48c2-81b9-4d1e30165d46_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:38:53 np0005588920 nova_compute[226886]: 2026-01-20 14:38:53.855 226890 INFO nova.virt.libvirt.driver [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Creating config drive at /var/lib/nova/instances/c1a45fae-79ce-48c2-81b9-4d1e30165d46/disk.config#033[00m
Jan 20 09:38:53 np0005588920 nova_compute[226886]: 2026-01-20 14:38:53.868 226890 DEBUG oslo_concurrency.processutils [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c1a45fae-79ce-48c2-81b9-4d1e30165d46/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmxpcxbw9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:38:54 np0005588920 nova_compute[226886]: 2026-01-20 14:38:54.021 226890 DEBUG oslo_concurrency.processutils [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c1a45fae-79ce-48c2-81b9-4d1e30165d46/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmxpcxbw9" returned: 0 in 0.153s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:38:54 np0005588920 nova_compute[226886]: 2026-01-20 14:38:54.063 226890 DEBUG nova.storage.rbd_utils [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] rbd image c1a45fae-79ce-48c2-81b9-4d1e30165d46_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:38:54 np0005588920 nova_compute[226886]: 2026-01-20 14:38:54.067 226890 DEBUG oslo_concurrency.processutils [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/c1a45fae-79ce-48c2-81b9-4d1e30165d46/disk.config c1a45fae-79ce-48c2-81b9-4d1e30165d46_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:38:54 np0005588920 nova_compute[226886]: 2026-01-20 14:38:54.243 226890 DEBUG oslo_concurrency.processutils [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/c1a45fae-79ce-48c2-81b9-4d1e30165d46/disk.config c1a45fae-79ce-48c2-81b9-4d1e30165d46_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.176s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:38:54 np0005588920 nova_compute[226886]: 2026-01-20 14:38:54.243 226890 INFO nova.virt.libvirt.driver [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Deleting local config drive /var/lib/nova/instances/c1a45fae-79ce-48c2-81b9-4d1e30165d46/disk.config because it was imported into RBD.#033[00m
Jan 20 09:38:54 np0005588920 kernel: tap82b46c8a-73: entered promiscuous mode
Jan 20 09:38:54 np0005588920 NetworkManager[49076]: <info>  [1768919934.3016] manager: (tap82b46c8a-73): new Tun device (/org/freedesktop/NetworkManager/Devices/94)
Jan 20 09:38:54 np0005588920 nova_compute[226886]: 2026-01-20 14:38:54.302 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:54 np0005588920 ovn_controller[133971]: 2026-01-20T14:38:54Z|00164|binding|INFO|Claiming lport 82b46c8a-7331-4dba-b12c-3c4bd0d70a52 for this chassis.
Jan 20 09:38:54 np0005588920 ovn_controller[133971]: 2026-01-20T14:38:54Z|00165|binding|INFO|82b46c8a-7331-4dba-b12c-3c4bd0d70a52: Claiming fa:16:3e:b2:0f:fa 10.100.0.13
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:38:54.319 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b2:0f:fa 10.100.0.13'], port_security=['fa:16:3e:b2:0f:fa 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'c1a45fae-79ce-48c2-81b9-4d1e30165d46', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fc21b99b-4e34-422c-be05-0a440009dac4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e3f93fd4b2154dda9f38e62334904303', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8c7b5066-a548-4ff0-b2e6-b25991426411', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7af6b6bc-3cbd-48be-9f10-23ec011e0426, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=82b46c8a-7331-4dba-b12c-3c4bd0d70a52) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:38:54.320 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 82b46c8a-7331-4dba-b12c-3c4bd0d70a52 in datapath fc21b99b-4e34-422c-be05-0a440009dac4 bound to our chassis#033[00m
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:38:54.322 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fc21b99b-4e34-422c-be05-0a440009dac4#033[00m
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:38:54.335 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[562289c5-3fd8-4054-a487-c9a67f0b44b7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:38:54.336 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfc21b99b-41 in ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:38:54 np0005588920 systemd-machined[196121]: New machine qemu-26-instance-0000003e.
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:38:54.338 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfc21b99b-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:38:54.338 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[42a90f60-6859-4139-b4b5-4e719a0dbce4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:38:54.339 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[bc316ad1-a5a1-47c3-9369-1b7acc418384]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:38:54.351 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[5688fb74-8a63-4b0e-a132-4b9df495eaea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:38:54 np0005588920 systemd[1]: Started Virtual Machine qemu-26-instance-0000003e.
Jan 20 09:38:54 np0005588920 systemd-udevd[248744]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:38:54.377 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[cd7ed585-1f65-4a23-995d-1bd57c204bc5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:38:54 np0005588920 nova_compute[226886]: 2026-01-20 14:38:54.387 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:54 np0005588920 NetworkManager[49076]: <info>  [1768919934.3919] device (tap82b46c8a-73): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:38:54 np0005588920 NetworkManager[49076]: <info>  [1768919934.3934] device (tap82b46c8a-73): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:38:54 np0005588920 ovn_controller[133971]: 2026-01-20T14:38:54Z|00166|binding|INFO|Setting lport 82b46c8a-7331-4dba-b12c-3c4bd0d70a52 ovn-installed in OVS
Jan 20 09:38:54 np0005588920 ovn_controller[133971]: 2026-01-20T14:38:54Z|00167|binding|INFO|Setting lport 82b46c8a-7331-4dba-b12c-3c4bd0d70a52 up in Southbound
Jan 20 09:38:54 np0005588920 nova_compute[226886]: 2026-01-20 14:38:54.395 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:38:54.408 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[aa73c9f7-3291-4bec-b654-2f52bae1cf7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:38:54.413 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5c085e70-9d61-44e7-bc65-04c009d0e634]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:38:54 np0005588920 NetworkManager[49076]: <info>  [1768919934.4143] manager: (tapfc21b99b-40): new Veth device (/org/freedesktop/NetworkManager/Devices/95)
Jan 20 09:38:54 np0005588920 systemd-udevd[248749]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:38:54.449 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[b46ddd67-4532-4cff-a485-1e3977892cee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:38:54.451 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[9e90196d-2018-4848-b6c3-16ac9821b7e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:38:54 np0005588920 NetworkManager[49076]: <info>  [1768919934.4698] device (tapfc21b99b-40): carrier: link connected
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:38:54.476 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[93a9526b-428d-43dc-82c0-bc6ff0c7631e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:38:54.493 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2b493705-3ff7-4eb0-a366-18ed86447fd7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfc21b99b-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b1:5b:d2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 497615, 'reachable_time': 33551, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 248774, 'error': None, 'target': 'ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:38:54.510 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b89f7e59-48b3-4b0c-b7fc-d533e724bdb8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb1:5bd2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 497615, 'tstamp': 497615}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 248775, 'error': None, 'target': 'ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:38:54.525 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[322e7212-ed6f-420a-8458-6bc29e57c1d3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfc21b99b-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b1:5b:d2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 497615, 'reachable_time': 33551, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 248776, 'error': None, 'target': 'ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:38:54.550 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[9731c698-988f-4dd5-abbf-eda2f2106e0a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:38:54.605 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[18e87b29-cc93-4750-ba3e-936dfec67019]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:38:54.607 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfc21b99b-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:38:54.607 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:38:54.608 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfc21b99b-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:38:54 np0005588920 nova_compute[226886]: 2026-01-20 14:38:54.610 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:54 np0005588920 NetworkManager[49076]: <info>  [1768919934.6110] manager: (tapfc21b99b-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/96)
Jan 20 09:38:54 np0005588920 kernel: tapfc21b99b-40: entered promiscuous mode
Jan 20 09:38:54 np0005588920 nova_compute[226886]: 2026-01-20 14:38:54.612 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:38:54.616 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfc21b99b-40, col_values=(('external_ids', {'iface-id': '583df905-1d9f-49c1-b209-4b7fad1599f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:38:54 np0005588920 nova_compute[226886]: 2026-01-20 14:38:54.617 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:54 np0005588920 ovn_controller[133971]: 2026-01-20T14:38:54Z|00168|binding|INFO|Releasing lport 583df905-1d9f-49c1-b209-4b7fad1599f6 from this chassis (sb_readonly=0)
Jan 20 09:38:54 np0005588920 nova_compute[226886]: 2026-01-20 14:38:54.635 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:38:54.637 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fc21b99b-4e34-422c-be05-0a440009dac4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fc21b99b-4e34-422c-be05-0a440009dac4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:38:54.638 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ded16524-8af3-49a4-9286-db3288f7a492]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:38:54.639 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-fc21b99b-4e34-422c-be05-0a440009dac4
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/fc21b99b-4e34-422c-be05-0a440009dac4.pid.haproxy
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID fc21b99b-4e34-422c-be05-0a440009dac4
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:38:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:38:54.640 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4', 'env', 'PROCESS_TAG=haproxy-fc21b99b-4e34-422c-be05-0a440009dac4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fc21b99b-4e34-422c-be05-0a440009dac4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:38:54 np0005588920 nova_compute[226886]: 2026-01-20 14:38:54.856 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919934.8562605, c1a45fae-79ce-48c2-81b9-4d1e30165d46 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:38:54 np0005588920 nova_compute[226886]: 2026-01-20 14:38:54.857 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] VM Started (Lifecycle Event)#033[00m
Jan 20 09:38:54 np0005588920 nova_compute[226886]: 2026-01-20 14:38:54.875 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:38:54 np0005588920 nova_compute[226886]: 2026-01-20 14:38:54.879 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919934.8590221, c1a45fae-79ce-48c2-81b9-4d1e30165d46 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:38:54 np0005588920 nova_compute[226886]: 2026-01-20 14:38:54.879 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:38:54 np0005588920 nova_compute[226886]: 2026-01-20 14:38:54.904 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:38:54 np0005588920 nova_compute[226886]: 2026-01-20 14:38:54.907 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:38:54 np0005588920 nova_compute[226886]: 2026-01-20 14:38:54.925 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:38:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:54.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:54 np0005588920 nova_compute[226886]: 2026-01-20 14:38:54.993 226890 DEBUG nova.network.neutron [req-a526fa68-1a33-4fff-bd78-6b3ca831fcbc req-4a7a8c84-2d3f-4023-a810-71f78b4ad303 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Updated VIF entry in instance network info cache for port 82b46c8a-7331-4dba-b12c-3c4bd0d70a52. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:38:54 np0005588920 nova_compute[226886]: 2026-01-20 14:38:54.995 226890 DEBUG nova.network.neutron [req-a526fa68-1a33-4fff-bd78-6b3ca831fcbc req-4a7a8c84-2d3f-4023-a810-71f78b4ad303 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Updating instance_info_cache with network_info: [{"id": "82b46c8a-7331-4dba-b12c-3c4bd0d70a52", "address": "fa:16:3e:b2:0f:fa", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82b46c8a-73", "ovs_interfaceid": "82b46c8a-7331-4dba-b12c-3c4bd0d70a52", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:38:55 np0005588920 nova_compute[226886]: 2026-01-20 14:38:55.016 226890 DEBUG oslo_concurrency.lockutils [req-a526fa68-1a33-4fff-bd78-6b3ca831fcbc req-4a7a8c84-2d3f-4023-a810-71f78b4ad303 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-c1a45fae-79ce-48c2-81b9-4d1e30165d46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:38:55 np0005588920 nova_compute[226886]: 2026-01-20 14:38:55.046 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:55 np0005588920 podman[248848]: 2026-01-20 14:38:55.009184832 +0000 UTC m=+0.019508615 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:38:55 np0005588920 podman[248848]: 2026-01-20 14:38:55.118301144 +0000 UTC m=+0.128624907 container create 16df1645a457f2e256087e615c07c737e45bdd5e847fff2a3018ba0a2c1289bf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:38:55 np0005588920 systemd[1]: Started libpod-conmon-16df1645a457f2e256087e615c07c737e45bdd5e847fff2a3018ba0a2c1289bf.scope.
Jan 20 09:38:55 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:38:55 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e28a20083f2fd6b1da8a912fd73e6fae8eada3d18adf7aa7a0f48694587e29ef/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:38:55 np0005588920 podman[248848]: 2026-01-20 14:38:55.225375859 +0000 UTC m=+0.235699642 container init 16df1645a457f2e256087e615c07c737e45bdd5e847fff2a3018ba0a2c1289bf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 20 09:38:55 np0005588920 podman[248848]: 2026-01-20 14:38:55.234980757 +0000 UTC m=+0.245304520 container start 16df1645a457f2e256087e615c07c737e45bdd5e847fff2a3018ba0a2c1289bf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 20 09:38:55 np0005588920 neutron-haproxy-ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4[248863]: [NOTICE]   (248867) : New worker (248869) forked
Jan 20 09:38:55 np0005588920 neutron-haproxy-ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4[248863]: [NOTICE]   (248867) : Loading success.
Jan 20 09:38:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:55.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:55 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:38:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:56.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:57 np0005588920 nova_compute[226886]: 2026-01-20 14:38:57.405 226890 DEBUG nova.compute.manager [req-797bf830-9a8a-444b-bfb7-163bff8a386f req-f7b9c4ae-5267-4c0b-82b5-9032c4247450 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Received event network-vif-plugged-82b46c8a-7331-4dba-b12c-3c4bd0d70a52 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:38:57 np0005588920 nova_compute[226886]: 2026-01-20 14:38:57.405 226890 DEBUG oslo_concurrency.lockutils [req-797bf830-9a8a-444b-bfb7-163bff8a386f req-f7b9c4ae-5267-4c0b-82b5-9032c4247450 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "c1a45fae-79ce-48c2-81b9-4d1e30165d46-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:38:57 np0005588920 nova_compute[226886]: 2026-01-20 14:38:57.406 226890 DEBUG oslo_concurrency.lockutils [req-797bf830-9a8a-444b-bfb7-163bff8a386f req-f7b9c4ae-5267-4c0b-82b5-9032c4247450 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c1a45fae-79ce-48c2-81b9-4d1e30165d46-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:38:57 np0005588920 nova_compute[226886]: 2026-01-20 14:38:57.406 226890 DEBUG oslo_concurrency.lockutils [req-797bf830-9a8a-444b-bfb7-163bff8a386f req-f7b9c4ae-5267-4c0b-82b5-9032c4247450 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c1a45fae-79ce-48c2-81b9-4d1e30165d46-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:38:57 np0005588920 nova_compute[226886]: 2026-01-20 14:38:57.406 226890 DEBUG nova.compute.manager [req-797bf830-9a8a-444b-bfb7-163bff8a386f req-f7b9c4ae-5267-4c0b-82b5-9032c4247450 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Processing event network-vif-plugged-82b46c8a-7331-4dba-b12c-3c4bd0d70a52 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:38:57 np0005588920 nova_compute[226886]: 2026-01-20 14:38:57.407 226890 DEBUG nova.compute.manager [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:38:57 np0005588920 nova_compute[226886]: 2026-01-20 14:38:57.412 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919937.412147, c1a45fae-79ce-48c2-81b9-4d1e30165d46 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:38:57 np0005588920 nova_compute[226886]: 2026-01-20 14:38:57.412 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:38:57 np0005588920 nova_compute[226886]: 2026-01-20 14:38:57.414 226890 DEBUG nova.virt.libvirt.driver [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:38:57 np0005588920 nova_compute[226886]: 2026-01-20 14:38:57.417 226890 INFO nova.virt.libvirt.driver [-] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Instance spawned successfully.#033[00m
Jan 20 09:38:57 np0005588920 nova_compute[226886]: 2026-01-20 14:38:57.418 226890 DEBUG nova.virt.libvirt.driver [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:38:57 np0005588920 nova_compute[226886]: 2026-01-20 14:38:57.438 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:38:57 np0005588920 nova_compute[226886]: 2026-01-20 14:38:57.443 226890 DEBUG nova.virt.libvirt.driver [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:38:57 np0005588920 nova_compute[226886]: 2026-01-20 14:38:57.444 226890 DEBUG nova.virt.libvirt.driver [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:38:57 np0005588920 nova_compute[226886]: 2026-01-20 14:38:57.444 226890 DEBUG nova.virt.libvirt.driver [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:38:57 np0005588920 nova_compute[226886]: 2026-01-20 14:38:57.444 226890 DEBUG nova.virt.libvirt.driver [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:38:57 np0005588920 nova_compute[226886]: 2026-01-20 14:38:57.445 226890 DEBUG nova.virt.libvirt.driver [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:38:57 np0005588920 nova_compute[226886]: 2026-01-20 14:38:57.445 226890 DEBUG nova.virt.libvirt.driver [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:38:57 np0005588920 nova_compute[226886]: 2026-01-20 14:38:57.448 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:38:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:57.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:57 np0005588920 nova_compute[226886]: 2026-01-20 14:38:57.479 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:38:57 np0005588920 nova_compute[226886]: 2026-01-20 14:38:57.535 226890 INFO nova.compute.manager [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Took 8.16 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:38:57 np0005588920 nova_compute[226886]: 2026-01-20 14:38:57.536 226890 DEBUG nova.compute.manager [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:38:57 np0005588920 nova_compute[226886]: 2026-01-20 14:38:57.591 226890 INFO nova.compute.manager [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Took 9.16 seconds to build instance.#033[00m
Jan 20 09:38:57 np0005588920 nova_compute[226886]: 2026-01-20 14:38:57.606 226890 DEBUG oslo_concurrency.lockutils [None req-84f56ca8-b24d-4bda-854b-b0984d38fc54 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "c1a45fae-79ce-48c2-81b9-4d1e30165d46" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.250s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:38:58 np0005588920 nova_compute[226886]: 2026-01-20 14:38:58.455 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:58.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:38:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:59.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:59 np0005588920 nova_compute[226886]: 2026-01-20 14:38:59.498 226890 DEBUG nova.compute.manager [req-a55ab625-dcdd-4cb7-b96a-86d0b1206907 req-ddec0577-a807-4687-b93d-dfe9afd89005 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Received event network-vif-plugged-82b46c8a-7331-4dba-b12c-3c4bd0d70a52 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:38:59 np0005588920 nova_compute[226886]: 2026-01-20 14:38:59.498 226890 DEBUG oslo_concurrency.lockutils [req-a55ab625-dcdd-4cb7-b96a-86d0b1206907 req-ddec0577-a807-4687-b93d-dfe9afd89005 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "c1a45fae-79ce-48c2-81b9-4d1e30165d46-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:38:59 np0005588920 nova_compute[226886]: 2026-01-20 14:38:59.498 226890 DEBUG oslo_concurrency.lockutils [req-a55ab625-dcdd-4cb7-b96a-86d0b1206907 req-ddec0577-a807-4687-b93d-dfe9afd89005 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c1a45fae-79ce-48c2-81b9-4d1e30165d46-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:38:59 np0005588920 nova_compute[226886]: 2026-01-20 14:38:59.499 226890 DEBUG oslo_concurrency.lockutils [req-a55ab625-dcdd-4cb7-b96a-86d0b1206907 req-ddec0577-a807-4687-b93d-dfe9afd89005 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c1a45fae-79ce-48c2-81b9-4d1e30165d46-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:38:59 np0005588920 nova_compute[226886]: 2026-01-20 14:38:59.499 226890 DEBUG nova.compute.manager [req-a55ab625-dcdd-4cb7-b96a-86d0b1206907 req-ddec0577-a807-4687-b93d-dfe9afd89005 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] No waiting events found dispatching network-vif-plugged-82b46c8a-7331-4dba-b12c-3c4bd0d70a52 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:38:59 np0005588920 nova_compute[226886]: 2026-01-20 14:38:59.499 226890 WARNING nova.compute.manager [req-a55ab625-dcdd-4cb7-b96a-86d0b1206907 req-ddec0577-a807-4687-b93d-dfe9afd89005 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Received unexpected event network-vif-plugged-82b46c8a-7331-4dba-b12c-3c4bd0d70a52 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:39:00 np0005588920 nova_compute[226886]: 2026-01-20 14:39:00.047 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:00 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:39:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:01.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:01 np0005588920 NetworkManager[49076]: <info>  [1768919941.3816] manager: (patch-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/97)
Jan 20 09:39:01 np0005588920 nova_compute[226886]: 2026-01-20 14:39:01.380 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:01 np0005588920 NetworkManager[49076]: <info>  [1768919941.3841] manager: (patch-br-int-to-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/98)
Jan 20 09:39:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:01.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:01 np0005588920 nova_compute[226886]: 2026-01-20 14:39:01.582 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:01 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:01Z|00169|binding|INFO|Releasing lport 583df905-1d9f-49c1-b209-4b7fad1599f6 from this chassis (sb_readonly=0)
Jan 20 09:39:01 np0005588920 nova_compute[226886]: 2026-01-20 14:39:01.618 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:01 np0005588920 nova_compute[226886]: 2026-01-20 14:39:01.927 226890 DEBUG nova.compute.manager [req-8d1ff3fb-4805-4a5f-92a2-98af7f509bc9 req-8f61ee0a-63b8-4019-819e-9899cb04c5bb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Received event network-changed-82b46c8a-7331-4dba-b12c-3c4bd0d70a52 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:39:01 np0005588920 nova_compute[226886]: 2026-01-20 14:39:01.927 226890 DEBUG nova.compute.manager [req-8d1ff3fb-4805-4a5f-92a2-98af7f509bc9 req-8f61ee0a-63b8-4019-819e-9899cb04c5bb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Refreshing instance network info cache due to event network-changed-82b46c8a-7331-4dba-b12c-3c4bd0d70a52. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:39:01 np0005588920 nova_compute[226886]: 2026-01-20 14:39:01.927 226890 DEBUG oslo_concurrency.lockutils [req-8d1ff3fb-4805-4a5f-92a2-98af7f509bc9 req-8f61ee0a-63b8-4019-819e-9899cb04c5bb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-c1a45fae-79ce-48c2-81b9-4d1e30165d46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:39:01 np0005588920 nova_compute[226886]: 2026-01-20 14:39:01.927 226890 DEBUG oslo_concurrency.lockutils [req-8d1ff3fb-4805-4a5f-92a2-98af7f509bc9 req-8f61ee0a-63b8-4019-819e-9899cb04c5bb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-c1a45fae-79ce-48c2-81b9-4d1e30165d46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:39:01 np0005588920 nova_compute[226886]: 2026-01-20 14:39:01.928 226890 DEBUG nova.network.neutron [req-8d1ff3fb-4805-4a5f-92a2-98af7f509bc9 req-8f61ee0a-63b8-4019-819e-9899cb04c5bb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Refreshing network info cache for port 82b46c8a-7331-4dba-b12c-3c4bd0d70a52 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:39:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:39:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:03.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:39:03 np0005588920 nova_compute[226886]: 2026-01-20 14:39:03.166 226890 DEBUG nova.network.neutron [req-8d1ff3fb-4805-4a5f-92a2-98af7f509bc9 req-8f61ee0a-63b8-4019-819e-9899cb04c5bb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Updated VIF entry in instance network info cache for port 82b46c8a-7331-4dba-b12c-3c4bd0d70a52. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:39:03 np0005588920 nova_compute[226886]: 2026-01-20 14:39:03.166 226890 DEBUG nova.network.neutron [req-8d1ff3fb-4805-4a5f-92a2-98af7f509bc9 req-8f61ee0a-63b8-4019-819e-9899cb04c5bb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Updating instance_info_cache with network_info: [{"id": "82b46c8a-7331-4dba-b12c-3c4bd0d70a52", "address": "fa:16:3e:b2:0f:fa", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82b46c8a-73", "ovs_interfaceid": "82b46c8a-7331-4dba-b12c-3c4bd0d70a52", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:39:03 np0005588920 nova_compute[226886]: 2026-01-20 14:39:03.190 226890 DEBUG oslo_concurrency.lockutils [req-8d1ff3fb-4805-4a5f-92a2-98af7f509bc9 req-8f61ee0a-63b8-4019-819e-9899cb04c5bb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-c1a45fae-79ce-48c2-81b9-4d1e30165d46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:39:03 np0005588920 nova_compute[226886]: 2026-01-20 14:39:03.458 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:39:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:03.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:39:03 np0005588920 podman[248879]: 2026-01-20 14:39:03.970349624 +0000 UTC m=+0.054844660 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 20 09:39:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:39:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:05.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:39:05 np0005588920 nova_compute[226886]: 2026-01-20 14:39:05.050 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:05.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:05 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:39:06 np0005588920 nova_compute[226886]: 2026-01-20 14:39:06.714 226890 DEBUG oslo_concurrency.lockutils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Acquiring lock "795b0a95-448b-49b1-80cb-a18e84101480" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:06 np0005588920 nova_compute[226886]: 2026-01-20 14:39:06.715 226890 DEBUG oslo_concurrency.lockutils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:06 np0005588920 nova_compute[226886]: 2026-01-20 14:39:06.743 226890 DEBUG nova.compute.manager [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:39:06 np0005588920 nova_compute[226886]: 2026-01-20 14:39:06.847 226890 DEBUG oslo_concurrency.lockutils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:06 np0005588920 nova_compute[226886]: 2026-01-20 14:39:06.848 226890 DEBUG oslo_concurrency.lockutils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:06 np0005588920 nova_compute[226886]: 2026-01-20 14:39:06.855 226890 DEBUG nova.virt.hardware [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:39:06 np0005588920 nova_compute[226886]: 2026-01-20 14:39:06.856 226890 INFO nova.compute.claims [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 09:39:06 np0005588920 nova_compute[226886]: 2026-01-20 14:39:06.968 226890 DEBUG oslo_concurrency.processutils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:39:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:07.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:39:07 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1592924687' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:39:07 np0005588920 nova_compute[226886]: 2026-01-20 14:39:07.392 226890 DEBUG oslo_concurrency.processutils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:39:07 np0005588920 nova_compute[226886]: 2026-01-20 14:39:07.400 226890 DEBUG nova.compute.provider_tree [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:39:07 np0005588920 nova_compute[226886]: 2026-01-20 14:39:07.415 226890 DEBUG nova.scheduler.client.report [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:39:07 np0005588920 nova_compute[226886]: 2026-01-20 14:39:07.438 226890 DEBUG oslo_concurrency.lockutils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.590s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:07 np0005588920 nova_compute[226886]: 2026-01-20 14:39:07.439 226890 DEBUG nova.compute.manager [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:39:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:07.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:07 np0005588920 nova_compute[226886]: 2026-01-20 14:39:07.505 226890 DEBUG nova.compute.manager [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:39:07 np0005588920 nova_compute[226886]: 2026-01-20 14:39:07.505 226890 DEBUG nova.network.neutron [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:39:07 np0005588920 nova_compute[226886]: 2026-01-20 14:39:07.549 226890 INFO nova.virt.libvirt.driver [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:39:07 np0005588920 nova_compute[226886]: 2026-01-20 14:39:07.619 226890 DEBUG nova.compute.manager [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:39:07 np0005588920 nova_compute[226886]: 2026-01-20 14:39:07.740 226890 INFO nova.virt.block_device [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Booting with volume 8e23d5c7-a222-4e45-8e31-6afe42582e8d at /dev/vda#033[00m
Jan 20 09:39:07 np0005588920 nova_compute[226886]: 2026-01-20 14:39:07.898 226890 DEBUG os_brick.utils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 20 09:39:07 np0005588920 nova_compute[226886]: 2026-01-20 14:39:07.899 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:39:07 np0005588920 nova_compute[226886]: 2026-01-20 14:39:07.910 231437 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:39:07 np0005588920 nova_compute[226886]: 2026-01-20 14:39:07.911 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[15636b6f-7b72-4eae-8997-5fee83c4a243]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:07 np0005588920 nova_compute[226886]: 2026-01-20 14:39:07.912 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:39:07 np0005588920 nova_compute[226886]: 2026-01-20 14:39:07.921 231437 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:39:07 np0005588920 nova_compute[226886]: 2026-01-20 14:39:07.922 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[dbbcdc14-4558-444f-8cdc-daacfc4db523]: (4, ('InitiatorName=iqn.1994-05.com.redhat:f7e7b2e5d28e', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:07 np0005588920 nova_compute[226886]: 2026-01-20 14:39:07.923 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:39:07 np0005588920 nova_compute[226886]: 2026-01-20 14:39:07.933 231437 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:39:07 np0005588920 nova_compute[226886]: 2026-01-20 14:39:07.933 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[678b24ad-69c9-4073-911c-0d31456850fe]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:07 np0005588920 nova_compute[226886]: 2026-01-20 14:39:07.935 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[eaa1b684-b75e-4d8e-9d66-69fd97be71f3]: (4, '1190ec40-2b89-4358-8dec-733c5829fbed') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:07 np0005588920 nova_compute[226886]: 2026-01-20 14:39:07.935 226890 DEBUG oslo_concurrency.processutils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:39:07 np0005588920 nova_compute[226886]: 2026-01-20 14:39:07.962 226890 DEBUG oslo_concurrency.processutils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] CMD "nvme version" returned: 0 in 0.027s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:39:07 np0005588920 nova_compute[226886]: 2026-01-20 14:39:07.964 226890 DEBUG os_brick.initiator.connectors.lightos [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 20 09:39:07 np0005588920 nova_compute[226886]: 2026-01-20 14:39:07.965 226890 DEBUG os_brick.initiator.connectors.lightos [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 20 09:39:07 np0005588920 nova_compute[226886]: 2026-01-20 14:39:07.965 226890 DEBUG os_brick.initiator.connectors.lightos [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 20 09:39:07 np0005588920 nova_compute[226886]: 2026-01-20 14:39:07.965 226890 DEBUG os_brick.utils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] <== get_connector_properties: return (67ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:f7e7b2e5d28e', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '1190ec40-2b89-4358-8dec-733c5829fbed', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 20 09:39:07 np0005588920 nova_compute[226886]: 2026-01-20 14:39:07.965 226890 DEBUG nova.virt.block_device [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Updating existing volume attachment record: e6db4319-d483-4b50-84db-2afa00acac8f _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 20 09:39:08 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #67. Immutable memtables: 0.
Jan 20 09:39:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:39:08.133515) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 09:39:08 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 39] Flushing memtable with next log file: 67
Jan 20 09:39:08 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919948133589, "job": 39, "event": "flush_started", "num_memtables": 1, "num_entries": 459, "num_deletes": 251, "total_data_size": 527068, "memory_usage": 535680, "flush_reason": "Manual Compaction"}
Jan 20 09:39:08 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 39] Level-0 flush table #68: started
Jan 20 09:39:08 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919948149133, "cf_name": "default", "job": 39, "event": "table_file_creation", "file_number": 68, "file_size": 285520, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 36274, "largest_seqno": 36728, "table_properties": {"data_size": 283148, "index_size": 472, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 837, "raw_key_size": 6627, "raw_average_key_size": 20, "raw_value_size": 278274, "raw_average_value_size": 856, "num_data_blocks": 21, "num_entries": 325, "num_filter_entries": 325, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768919928, "oldest_key_time": 1768919928, "file_creation_time": 1768919948, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 68, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:39:08 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 39] Flush lasted 15701 microseconds, and 2467 cpu microseconds.
Jan 20 09:39:08 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:39:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:39:08.149229) [db/flush_job.cc:967] [default] [JOB 39] Level-0 flush table #68: 285520 bytes OK
Jan 20 09:39:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:39:08.149249) [db/memtable_list.cc:519] [default] Level-0 commit table #68 started
Jan 20 09:39:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:39:08.183167) [db/memtable_list.cc:722] [default] Level-0 commit table #68: memtable #1 done
Jan 20 09:39:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:39:08.183243) EVENT_LOG_v1 {"time_micros": 1768919948183233, "job": 39, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 09:39:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:39:08.183264) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 09:39:08 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 39] Try to delete WAL files size 524240, prev total WAL file size 524240, number of live WAL files 2.
Jan 20 09:39:08 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000064.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:39:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:39:08.183922) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031303032' seq:72057594037927935, type:22 .. '6D6772737461740031323534' seq:0, type:0; will stop at (end)
Jan 20 09:39:08 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 40] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 09:39:08 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 39 Base level 0, inputs: [68(278KB)], [66(11MB)]
Jan 20 09:39:08 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919948183957, "job": 40, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [68], "files_L6": [66], "score": -1, "input_data_size": 12221892, "oldest_snapshot_seqno": -1}
Jan 20 09:39:08 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 40] Generated table #69: 6055 keys, 8440090 bytes, temperature: kUnknown
Jan 20 09:39:08 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919948291035, "cf_name": "default", "job": 40, "event": "table_file_creation", "file_number": 69, "file_size": 8440090, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 8400673, "index_size": 23179, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 15173, "raw_key_size": 155671, "raw_average_key_size": 25, "raw_value_size": 8292853, "raw_average_value_size": 1369, "num_data_blocks": 926, "num_entries": 6055, "num_filter_entries": 6055, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768919948, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 69, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:39:08 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:39:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:39:08.291323) [db/compaction/compaction_job.cc:1663] [default] [JOB 40] Compacted 1@0 + 1@6 files to L6 => 8440090 bytes
Jan 20 09:39:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:39:08.412813) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 114.0 rd, 78.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.3, 11.4 +0.0 blob) out(8.0 +0.0 blob), read-write-amplify(72.4) write-amplify(29.6) OK, records in: 6562, records dropped: 507 output_compression: NoCompression
Jan 20 09:39:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:39:08.412847) EVENT_LOG_v1 {"time_micros": 1768919948412834, "job": 40, "event": "compaction_finished", "compaction_time_micros": 107172, "compaction_time_cpu_micros": 25336, "output_level": 6, "num_output_files": 1, "total_output_size": 8440090, "num_input_records": 6562, "num_output_records": 6055, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 09:39:08 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000068.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:39:08 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919948413240, "job": 40, "event": "table_file_deletion", "file_number": 68}
Jan 20 09:39:08 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000066.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:39:08 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919948414944, "job": 40, "event": "table_file_deletion", "file_number": 66}
Jan 20 09:39:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:39:08.183843) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:39:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:39:08.415063) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:39:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:39:08.415069) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:39:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:39:08.415073) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:39:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:39:08.415077) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:39:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:39:08.415079) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:39:08 np0005588920 nova_compute[226886]: 2026-01-20 14:39:08.469 226890 DEBUG nova.policy [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cd4ba32a01f74af199438da0b72e5a4d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8705404c3964472782118e478eb54e51', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:39:08 np0005588920 nova_compute[226886]: 2026-01-20 14:39:08.494 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:08 np0005588920 nova_compute[226886]: 2026-01-20 14:39:08.894 226890 INFO nova.virt.block_device [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Booting with volume 41d04608-e9dd-4b22-8440-9cab803a24b7 at /dev/vdb#033[00m
Jan 20 09:39:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:09.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:09 np0005588920 nova_compute[226886]: 2026-01-20 14:39:09.022 226890 DEBUG os_brick.utils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 20 09:39:09 np0005588920 nova_compute[226886]: 2026-01-20 14:39:09.023 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:39:09 np0005588920 nova_compute[226886]: 2026-01-20 14:39:09.035 231437 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.012s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:39:09 np0005588920 nova_compute[226886]: 2026-01-20 14:39:09.035 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[3e3d3201-ae76-44ac-80e4-d056f0282aa2]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:09 np0005588920 nova_compute[226886]: 2026-01-20 14:39:09.037 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:39:09 np0005588920 nova_compute[226886]: 2026-01-20 14:39:09.044 231437 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:39:09 np0005588920 nova_compute[226886]: 2026-01-20 14:39:09.045 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[3b63121c-c427-4d06-98ea-61d9c63a3b1b]: (4, ('InitiatorName=iqn.1994-05.com.redhat:f7e7b2e5d28e', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:09 np0005588920 nova_compute[226886]: 2026-01-20 14:39:09.047 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:39:09 np0005588920 nova_compute[226886]: 2026-01-20 14:39:09.056 231437 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:39:09 np0005588920 nova_compute[226886]: 2026-01-20 14:39:09.056 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[991c8fc6-6d74-4815-a85e-6451c4fbd036]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:09 np0005588920 nova_compute[226886]: 2026-01-20 14:39:09.058 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[e7f17408-e958-400a-995f-8ab2b76889d4]: (4, '1190ec40-2b89-4358-8dec-733c5829fbed') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:09 np0005588920 nova_compute[226886]: 2026-01-20 14:39:09.058 226890 DEBUG oslo_concurrency.processutils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:39:09 np0005588920 nova_compute[226886]: 2026-01-20 14:39:09.083 226890 DEBUG oslo_concurrency.processutils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] CMD "nvme version" returned: 0 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:39:09 np0005588920 nova_compute[226886]: 2026-01-20 14:39:09.085 226890 DEBUG os_brick.initiator.connectors.lightos [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 20 09:39:09 np0005588920 nova_compute[226886]: 2026-01-20 14:39:09.085 226890 DEBUG os_brick.initiator.connectors.lightos [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 20 09:39:09 np0005588920 nova_compute[226886]: 2026-01-20 14:39:09.086 226890 DEBUG os_brick.initiator.connectors.lightos [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 20 09:39:09 np0005588920 nova_compute[226886]: 2026-01-20 14:39:09.086 226890 DEBUG os_brick.utils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] <== get_connector_properties: return (64ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:f7e7b2e5d28e', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '1190ec40-2b89-4358-8dec-733c5829fbed', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 20 09:39:09 np0005588920 nova_compute[226886]: 2026-01-20 14:39:09.087 226890 DEBUG nova.virt.block_device [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Updating existing volume attachment record: 2033d6c6-f159-4dd7-9887-2993ee4695aa _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 20 09:39:09 np0005588920 nova_compute[226886]: 2026-01-20 14:39:09.425 226890 DEBUG nova.network.neutron [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Successfully created port: 9b0fc629-48f5-469d-90d2-c26339c16eec _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:39:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:39:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:09.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:39:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:39:09 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3539387829' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:39:10 np0005588920 nova_compute[226886]: 2026-01-20 14:39:10.052 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:39:10 np0005588920 nova_compute[226886]: 2026-01-20 14:39:10.162 226890 INFO nova.virt.block_device [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Booting with volume 419220b3-a6cb-447f-be9b-d4de4cac4b79 at /dev/vdc
Jan 20 09:39:10 np0005588920 nova_compute[226886]: 2026-01-20 14:39:10.262 226890 DEBUG nova.network.neutron [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Successfully created port: b03b1a08-920f-4340-b77b-37669cc14a07 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 20 09:39:10 np0005588920 nova_compute[226886]: 2026-01-20 14:39:10.317 226890 DEBUG os_brick.utils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176
Jan 20 09:39:10 np0005588920 nova_compute[226886]: 2026-01-20 14:39:10.318 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:39:10 np0005588920 nova_compute[226886]: 2026-01-20 14:39:10.327 231437 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:39:10 np0005588920 nova_compute[226886]: 2026-01-20 14:39:10.328 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[80548ae6-df13-42ed-9211-99161bb469fc]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:39:10 np0005588920 nova_compute[226886]: 2026-01-20 14:39:10.329 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:39:10 np0005588920 nova_compute[226886]: 2026-01-20 14:39:10.336 231437 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.007s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:39:10 np0005588920 nova_compute[226886]: 2026-01-20 14:39:10.337 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[36eebc68-5831-425c-9c8f-4b55c0179db0]: (4, ('InitiatorName=iqn.1994-05.com.redhat:f7e7b2e5d28e', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:39:10 np0005588920 nova_compute[226886]: 2026-01-20 14:39:10.338 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:39:10 np0005588920 nova_compute[226886]: 2026-01-20 14:39:10.345 231437 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.007s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:39:10 np0005588920 nova_compute[226886]: 2026-01-20 14:39:10.345 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[8dcf8ffe-daaa-46ed-96ce-2d968110f613]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:39:10 np0005588920 nova_compute[226886]: 2026-01-20 14:39:10.346 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[61cf2be0-ebf3-4a09-bb19-feb400fdfb7d]: (4, '1190ec40-2b89-4358-8dec-733c5829fbed') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:39:10 np0005588920 nova_compute[226886]: 2026-01-20 14:39:10.347 226890 DEBUG oslo_concurrency.processutils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:39:10 np0005588920 nova_compute[226886]: 2026-01-20 14:39:10.377 226890 DEBUG oslo_concurrency.processutils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] CMD "nvme version" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:39:10 np0005588920 nova_compute[226886]: 2026-01-20 14:39:10.379 226890 DEBUG os_brick.initiator.connectors.lightos [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98
Jan 20 09:39:10 np0005588920 nova_compute[226886]: 2026-01-20 14:39:10.380 226890 DEBUG os_brick.initiator.connectors.lightos [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76
Jan 20 09:39:10 np0005588920 nova_compute[226886]: 2026-01-20 14:39:10.380 226890 DEBUG os_brick.initiator.connectors.lightos [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79
Jan 20 09:39:10 np0005588920 nova_compute[226886]: 2026-01-20 14:39:10.380 226890 DEBUG os_brick.utils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] <== get_connector_properties: return (63ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:f7e7b2e5d28e', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '1190ec40-2b89-4358-8dec-733c5829fbed', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203
Jan 20 09:39:10 np0005588920 nova_compute[226886]: 2026-01-20 14:39:10.381 226890 DEBUG nova.virt.block_device [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Updating existing volume attachment record: a391e283-eb10-4224-b18a-131281c04977 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631
Jan 20 09:39:10 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:10Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b2:0f:fa 10.100.0.13
Jan 20 09:39:10 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:10Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b2:0f:fa 10.100.0.13
Jan 20 09:39:10 np0005588920 nova_compute[226886]: 2026-01-20 14:39:10.804 226890 DEBUG nova.network.neutron [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Successfully created port: 7a84a919-d309-4f1a-99b8-4792dcdec990 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 20 09:39:10 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:39:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:11.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:11 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:39:11 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1234018470' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:39:11 np0005588920 nova_compute[226886]: 2026-01-20 14:39:11.434 226890 DEBUG nova.compute.manager [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 20 09:39:11 np0005588920 nova_compute[226886]: 2026-01-20 14:39:11.435 226890 DEBUG nova.virt.libvirt.driver [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 20 09:39:11 np0005588920 nova_compute[226886]: 2026-01-20 14:39:11.436 226890 INFO nova.virt.libvirt.driver [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Creating image(s)
Jan 20 09:39:11 np0005588920 nova_compute[226886]: 2026-01-20 14:39:11.436 226890 DEBUG nova.virt.libvirt.driver [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Jan 20 09:39:11 np0005588920 nova_compute[226886]: 2026-01-20 14:39:11.437 226890 DEBUG nova.virt.libvirt.driver [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Ensure instance console log exists: /var/lib/nova/instances/795b0a95-448b-49b1-80cb-a18e84101480/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 20 09:39:11 np0005588920 nova_compute[226886]: 2026-01-20 14:39:11.437 226890 DEBUG oslo_concurrency.lockutils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:39:11 np0005588920 nova_compute[226886]: 2026-01-20 14:39:11.437 226890 DEBUG oslo_concurrency.lockutils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:39:11 np0005588920 nova_compute[226886]: 2026-01-20 14:39:11.438 226890 DEBUG oslo_concurrency.lockutils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:39:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:11.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:11 np0005588920 nova_compute[226886]: 2026-01-20 14:39:11.641 226890 DEBUG nova.network.neutron [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Successfully created port: ff70124b-befc-46fa-b2cb-bc4bd4a49942 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 20 09:39:12 np0005588920 nova_compute[226886]: 2026-01-20 14:39:12.216 226890 DEBUG nova.network.neutron [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Successfully created port: 5096b763-1b08-448f-a6bd-f63bcd65def6 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 20 09:39:12 np0005588920 nova_compute[226886]: 2026-01-20 14:39:12.904 226890 DEBUG nova.network.neutron [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Successfully updated port: 9b0fc629-48f5-469d-90d2-c26339c16eec _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 20 09:39:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:39:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:13.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:39:13 np0005588920 nova_compute[226886]: 2026-01-20 14:39:13.032 226890 DEBUG nova.compute.manager [req-0bfa951b-8b9c-48e7-b314-51015b805883 req-0ef060e4-8c30-47d7-9b61-bc154da1c119 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received event network-changed-9b0fc629-48f5-469d-90d2-c26339c16eec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 09:39:13 np0005588920 nova_compute[226886]: 2026-01-20 14:39:13.033 226890 DEBUG nova.compute.manager [req-0bfa951b-8b9c-48e7-b314-51015b805883 req-0ef060e4-8c30-47d7-9b61-bc154da1c119 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Refreshing instance network info cache due to event network-changed-9b0fc629-48f5-469d-90d2-c26339c16eec. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 20 09:39:13 np0005588920 nova_compute[226886]: 2026-01-20 14:39:13.033 226890 DEBUG oslo_concurrency.lockutils [req-0bfa951b-8b9c-48e7-b314-51015b805883 req-0ef060e4-8c30-47d7-9b61-bc154da1c119 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-795b0a95-448b-49b1-80cb-a18e84101480" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 20 09:39:13 np0005588920 nova_compute[226886]: 2026-01-20 14:39:13.034 226890 DEBUG oslo_concurrency.lockutils [req-0bfa951b-8b9c-48e7-b314-51015b805883 req-0ef060e4-8c30-47d7-9b61-bc154da1c119 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-795b0a95-448b-49b1-80cb-a18e84101480" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 20 09:39:13 np0005588920 nova_compute[226886]: 2026-01-20 14:39:13.034 226890 DEBUG nova.network.neutron [req-0bfa951b-8b9c-48e7-b314-51015b805883 req-0ef060e4-8c30-47d7-9b61-bc154da1c119 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Refreshing network info cache for port 9b0fc629-48f5-469d-90d2-c26339c16eec _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 20 09:39:13 np0005588920 nova_compute[226886]: 2026-01-20 14:39:13.291 226890 DEBUG nova.network.neutron [req-0bfa951b-8b9c-48e7-b314-51015b805883 req-0ef060e4-8c30-47d7-9b61-bc154da1c119 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 20 09:39:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:13.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:13 np0005588920 nova_compute[226886]: 2026-01-20 14:39:13.497 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:39:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 09:39:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2354170290' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 09:39:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 09:39:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2354170290' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 09:39:13 np0005588920 nova_compute[226886]: 2026-01-20 14:39:13.690 226890 DEBUG nova.network.neutron [req-0bfa951b-8b9c-48e7-b314-51015b805883 req-0ef060e4-8c30-47d7-9b61-bc154da1c119 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 20 09:39:13 np0005588920 nova_compute[226886]: 2026-01-20 14:39:13.703 226890 DEBUG oslo_concurrency.lockutils [req-0bfa951b-8b9c-48e7-b314-51015b805883 req-0ef060e4-8c30-47d7-9b61-bc154da1c119 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-795b0a95-448b-49b1-80cb-a18e84101480" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 20 09:39:13 np0005588920 nova_compute[226886]: 2026-01-20 14:39:13.829 226890 DEBUG nova.network.neutron [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Successfully updated port: 96b6b319-e96c-4182-b940-9f154499e22d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 20 09:39:14 np0005588920 nova_compute[226886]: 2026-01-20 14:39:14.633 226890 DEBUG nova.network.neutron [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Successfully updated port: a6658829-ef19-4914-bbb3-35b718691c7c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 20 09:39:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:15.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:15 np0005588920 nova_compute[226886]: 2026-01-20 14:39:15.054 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:39:15 np0005588920 nova_compute[226886]: 2026-01-20 14:39:15.161 226890 DEBUG nova.compute.manager [req-c613d93e-91cd-400a-ba20-891470e3e689 req-c74e7294-53c5-469c-8b26-9d2cac399d10 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received event network-changed-96b6b319-e96c-4182-b940-9f154499e22d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 09:39:15 np0005588920 nova_compute[226886]: 2026-01-20 14:39:15.162 226890 DEBUG nova.compute.manager [req-c613d93e-91cd-400a-ba20-891470e3e689 req-c74e7294-53c5-469c-8b26-9d2cac399d10 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Refreshing instance network info cache due to event network-changed-96b6b319-e96c-4182-b940-9f154499e22d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 20 09:39:15 np0005588920 nova_compute[226886]: 2026-01-20 14:39:15.162 226890 DEBUG oslo_concurrency.lockutils [req-c613d93e-91cd-400a-ba20-891470e3e689 req-c74e7294-53c5-469c-8b26-9d2cac399d10 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-795b0a95-448b-49b1-80cb-a18e84101480" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 20 09:39:15 np0005588920 nova_compute[226886]: 2026-01-20 14:39:15.162 226890 DEBUG oslo_concurrency.lockutils [req-c613d93e-91cd-400a-ba20-891470e3e689 req-c74e7294-53c5-469c-8b26-9d2cac399d10 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-795b0a95-448b-49b1-80cb-a18e84101480" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 20 09:39:15 np0005588920 nova_compute[226886]: 2026-01-20 14:39:15.163 226890 DEBUG nova.network.neutron [req-c613d93e-91cd-400a-ba20-891470e3e689 req-c74e7294-53c5-469c-8b26-9d2cac399d10 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Refreshing network info cache for port 96b6b319-e96c-4182-b940-9f154499e22d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 20 09:39:15 np0005588920 nova_compute[226886]: 2026-01-20 14:39:15.381 226890 DEBUG nova.network.neutron [req-c613d93e-91cd-400a-ba20-891470e3e689 req-c74e7294-53c5-469c-8b26-9d2cac399d10 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 20 09:39:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:15.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:15 np0005588920 nova_compute[226886]: 2026-01-20 14:39:15.499 226890 DEBUG nova.network.neutron [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Successfully updated port: b03b1a08-920f-4340-b77b-37669cc14a07 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 20 09:39:15 np0005588920 nova_compute[226886]: 2026-01-20 14:39:15.739 226890 DEBUG nova.network.neutron [req-c613d93e-91cd-400a-ba20-891470e3e689 req-c74e7294-53c5-469c-8b26-9d2cac399d10 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 20 09:39:15 np0005588920 nova_compute[226886]: 2026-01-20 14:39:15.755 226890 DEBUG oslo_concurrency.lockutils [req-c613d93e-91cd-400a-ba20-891470e3e689 req-c74e7294-53c5-469c-8b26-9d2cac399d10 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-795b0a95-448b-49b1-80cb-a18e84101480" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 20 09:39:15 np0005588920 nova_compute[226886]: 2026-01-20 14:39:15.755 226890 DEBUG nova.compute.manager [req-c613d93e-91cd-400a-ba20-891470e3e689 req-c74e7294-53c5-469c-8b26-9d2cac399d10 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received event network-changed-a6658829-ef19-4914-bbb3-35b718691c7c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 09:39:15 np0005588920 nova_compute[226886]: 2026-01-20 14:39:15.756 226890 DEBUG nova.compute.manager [req-c613d93e-91cd-400a-ba20-891470e3e689 req-c74e7294-53c5-469c-8b26-9d2cac399d10 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Refreshing instance network info cache due to event network-changed-a6658829-ef19-4914-bbb3-35b718691c7c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 20 09:39:15 np0005588920 nova_compute[226886]: 2026-01-20 14:39:15.756 226890 DEBUG oslo_concurrency.lockutils [req-c613d93e-91cd-400a-ba20-891470e3e689 req-c74e7294-53c5-469c-8b26-9d2cac399d10 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-795b0a95-448b-49b1-80cb-a18e84101480" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 20 09:39:15 np0005588920 nova_compute[226886]: 2026-01-20 14:39:15.756 226890 DEBUG oslo_concurrency.lockutils [req-c613d93e-91cd-400a-ba20-891470e3e689 req-c74e7294-53c5-469c-8b26-9d2cac399d10 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-795b0a95-448b-49b1-80cb-a18e84101480" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 20 09:39:15 np0005588920 nova_compute[226886]: 2026-01-20 14:39:15.756 226890 DEBUG nova.network.neutron [req-c613d93e-91cd-400a-ba20-891470e3e689 req-c74e7294-53c5-469c-8b26-9d2cac399d10 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Refreshing network info cache for port a6658829-ef19-4914-bbb3-35b718691c7c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 20 09:39:15 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:39:15 np0005588920 nova_compute[226886]: 2026-01-20 14:39:15.971 226890 DEBUG nova.network.neutron [req-c613d93e-91cd-400a-ba20-891470e3e689 req-c74e7294-53c5-469c-8b26-9d2cac399d10 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 20 09:39:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:16.441 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:39:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:16.442 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:39:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:16.443 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:39:16 np0005588920 nova_compute[226886]: 2026-01-20 14:39:16.512 226890 DEBUG nova.network.neutron [req-c613d93e-91cd-400a-ba20-891470e3e689 req-c74e7294-53c5-469c-8b26-9d2cac399d10 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 20 09:39:16 np0005588920 nova_compute[226886]: 2026-01-20 14:39:16.534 226890 DEBUG oslo_concurrency.lockutils [req-c613d93e-91cd-400a-ba20-891470e3e689 req-c74e7294-53c5-469c-8b26-9d2cac399d10 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-795b0a95-448b-49b1-80cb-a18e84101480" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 20 09:39:16 np0005588920 nova_compute[226886]: 2026-01-20 14:39:16.549 226890 DEBUG nova.network.neutron [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Successfully updated port: 7a84a919-d309-4f1a-99b8-4792dcdec990 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 20 09:39:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:17.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:17 np0005588920 nova_compute[226886]: 2026-01-20 14:39:17.260 226890 DEBUG nova.compute.manager [req-97e93fff-7fa5-465e-aa7e-bd80f2a2a66f req-b4b94856-8333-40f5-93bf-794c6e0b98df 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received event network-changed-b03b1a08-920f-4340-b77b-37669cc14a07 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 09:39:17 np0005588920 nova_compute[226886]: 2026-01-20 14:39:17.260 226890 DEBUG nova.compute.manager [req-97e93fff-7fa5-465e-aa7e-bd80f2a2a66f req-b4b94856-8333-40f5-93bf-794c6e0b98df 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Refreshing instance network info cache due to event network-changed-b03b1a08-920f-4340-b77b-37669cc14a07. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 20 09:39:17 np0005588920 nova_compute[226886]: 2026-01-20 14:39:17.261 226890 DEBUG oslo_concurrency.lockutils [req-97e93fff-7fa5-465e-aa7e-bd80f2a2a66f req-b4b94856-8333-40f5-93bf-794c6e0b98df 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-795b0a95-448b-49b1-80cb-a18e84101480" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 20 09:39:17 np0005588920 nova_compute[226886]: 2026-01-20 14:39:17.262 226890 DEBUG oslo_concurrency.lockutils [req-97e93fff-7fa5-465e-aa7e-bd80f2a2a66f req-b4b94856-8333-40f5-93bf-794c6e0b98df 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-795b0a95-448b-49b1-80cb-a18e84101480" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 20 09:39:17 np0005588920 nova_compute[226886]: 2026-01-20 14:39:17.262 226890 DEBUG nova.network.neutron [req-97e93fff-7fa5-465e-aa7e-bd80f2a2a66f req-b4b94856-8333-40f5-93bf-794c6e0b98df 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Refreshing network info cache for port b03b1a08-920f-4340-b77b-37669cc14a07 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 20 09:39:17 np0005588920 nova_compute[226886]: 2026-01-20 14:39:17.468 226890 DEBUG nova.network.neutron [req-97e93fff-7fa5-465e-aa7e-bd80f2a2a66f req-b4b94856-8333-40f5-93bf-794c6e0b98df 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 20 09:39:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:17.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:17 np0005588920 nova_compute[226886]: 2026-01-20 14:39:17.545 226890 DEBUG nova.network.neutron [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Successfully updated port: ff70124b-befc-46fa-b2cb-bc4bd4a49942 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:39:17 np0005588920 nova_compute[226886]: 2026-01-20 14:39:17.870 226890 DEBUG nova.network.neutron [req-97e93fff-7fa5-465e-aa7e-bd80f2a2a66f req-b4b94856-8333-40f5-93bf-794c6e0b98df 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:39:17 np0005588920 nova_compute[226886]: 2026-01-20 14:39:17.892 226890 DEBUG oslo_concurrency.lockutils [req-97e93fff-7fa5-465e-aa7e-bd80f2a2a66f req-b4b94856-8333-40f5-93bf-794c6e0b98df 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-795b0a95-448b-49b1-80cb-a18e84101480" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:39:17 np0005588920 nova_compute[226886]: 2026-01-20 14:39:17.892 226890 DEBUG nova.compute.manager [req-97e93fff-7fa5-465e-aa7e-bd80f2a2a66f req-b4b94856-8333-40f5-93bf-794c6e0b98df 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received event network-changed-7a84a919-d309-4f1a-99b8-4792dcdec990 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:39:17 np0005588920 nova_compute[226886]: 2026-01-20 14:39:17.893 226890 DEBUG nova.compute.manager [req-97e93fff-7fa5-465e-aa7e-bd80f2a2a66f req-b4b94856-8333-40f5-93bf-794c6e0b98df 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Refreshing instance network info cache due to event network-changed-7a84a919-d309-4f1a-99b8-4792dcdec990. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:39:17 np0005588920 nova_compute[226886]: 2026-01-20 14:39:17.893 226890 DEBUG oslo_concurrency.lockutils [req-97e93fff-7fa5-465e-aa7e-bd80f2a2a66f req-b4b94856-8333-40f5-93bf-794c6e0b98df 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-795b0a95-448b-49b1-80cb-a18e84101480" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:39:17 np0005588920 nova_compute[226886]: 2026-01-20 14:39:17.893 226890 DEBUG oslo_concurrency.lockutils [req-97e93fff-7fa5-465e-aa7e-bd80f2a2a66f req-b4b94856-8333-40f5-93bf-794c6e0b98df 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-795b0a95-448b-49b1-80cb-a18e84101480" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:39:17 np0005588920 nova_compute[226886]: 2026-01-20 14:39:17.894 226890 DEBUG nova.network.neutron [req-97e93fff-7fa5-465e-aa7e-bd80f2a2a66f req-b4b94856-8333-40f5-93bf-794c6e0b98df 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Refreshing network info cache for port 7a84a919-d309-4f1a-99b8-4792dcdec990 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:39:18 np0005588920 nova_compute[226886]: 2026-01-20 14:39:18.308 226890 DEBUG nova.network.neutron [req-97e93fff-7fa5-465e-aa7e-bd80f2a2a66f req-b4b94856-8333-40f5-93bf-794c6e0b98df 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:39:18 np0005588920 nova_compute[226886]: 2026-01-20 14:39:18.456 226890 DEBUG nova.network.neutron [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Successfully updated port: 5096b763-1b08-448f-a6bd-f63bcd65def6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:39:18 np0005588920 nova_compute[226886]: 2026-01-20 14:39:18.470 226890 DEBUG oslo_concurrency.lockutils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Acquiring lock "refresh_cache-795b0a95-448b-49b1-80cb-a18e84101480" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:39:18 np0005588920 nova_compute[226886]: 2026-01-20 14:39:18.502 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:18 np0005588920 nova_compute[226886]: 2026-01-20 14:39:18.588 226890 DEBUG nova.network.neutron [req-97e93fff-7fa5-465e-aa7e-bd80f2a2a66f req-b4b94856-8333-40f5-93bf-794c6e0b98df 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:39:18 np0005588920 nova_compute[226886]: 2026-01-20 14:39:18.602 226890 DEBUG oslo_concurrency.lockutils [req-97e93fff-7fa5-465e-aa7e-bd80f2a2a66f req-b4b94856-8333-40f5-93bf-794c6e0b98df 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-795b0a95-448b-49b1-80cb-a18e84101480" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:39:18 np0005588920 nova_compute[226886]: 2026-01-20 14:39:18.603 226890 DEBUG oslo_concurrency.lockutils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Acquired lock "refresh_cache-795b0a95-448b-49b1-80cb-a18e84101480" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:39:18 np0005588920 nova_compute[226886]: 2026-01-20 14:39:18.603 226890 DEBUG nova.network.neutron [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:39:18 np0005588920 nova_compute[226886]: 2026-01-20 14:39:18.777 226890 DEBUG nova.network.neutron [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:39:18 np0005588920 nova_compute[226886]: 2026-01-20 14:39:18.804 226890 DEBUG oslo_concurrency.lockutils [None req-24ebb284-7834-434a-ab14-e124354fa379 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquiring lock "interface-c1a45fae-79ce-48c2-81b9-4d1e30165d46-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:18 np0005588920 nova_compute[226886]: 2026-01-20 14:39:18.805 226890 DEBUG oslo_concurrency.lockutils [None req-24ebb284-7834-434a-ab14-e124354fa379 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "interface-c1a45fae-79ce-48c2-81b9-4d1e30165d46-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:18 np0005588920 nova_compute[226886]: 2026-01-20 14:39:18.806 226890 DEBUG nova.objects.instance [None req-24ebb284-7834-434a-ab14-e124354fa379 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lazy-loading 'flavor' on Instance uuid c1a45fae-79ce-48c2-81b9-4d1e30165d46 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:39:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:19.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:19 np0005588920 nova_compute[226886]: 2026-01-20 14:39:19.118 226890 DEBUG nova.objects.instance [None req-24ebb284-7834-434a-ab14-e124354fa379 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lazy-loading 'pci_requests' on Instance uuid c1a45fae-79ce-48c2-81b9-4d1e30165d46 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:39:19 np0005588920 nova_compute[226886]: 2026-01-20 14:39:19.130 226890 DEBUG nova.network.neutron [None req-24ebb284-7834-434a-ab14-e124354fa379 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:39:19 np0005588920 nova_compute[226886]: 2026-01-20 14:39:19.374 226890 DEBUG nova.policy [None req-24ebb284-7834-434a-ab14-e124354fa379 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c8a9fb458d27434495a77a94827b6097', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e3f93fd4b2154dda9f38e62334904303', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:39:19 np0005588920 nova_compute[226886]: 2026-01-20 14:39:19.423 226890 DEBUG nova.compute.manager [req-bfa29c13-cab6-4daf-9ad7-9e485f8fccc2 req-01c224ce-f871-4143-a5fc-66d579e627ba 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received event network-changed-ff70124b-befc-46fa-b2cb-bc4bd4a49942 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:39:19 np0005588920 nova_compute[226886]: 2026-01-20 14:39:19.423 226890 DEBUG nova.compute.manager [req-bfa29c13-cab6-4daf-9ad7-9e485f8fccc2 req-01c224ce-f871-4143-a5fc-66d579e627ba 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Refreshing instance network info cache due to event network-changed-ff70124b-befc-46fa-b2cb-bc4bd4a49942. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:39:19 np0005588920 nova_compute[226886]: 2026-01-20 14:39:19.424 226890 DEBUG oslo_concurrency.lockutils [req-bfa29c13-cab6-4daf-9ad7-9e485f8fccc2 req-01c224ce-f871-4143-a5fc-66d579e627ba 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-795b0a95-448b-49b1-80cb-a18e84101480" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:39:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:39:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:19.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:39:19 np0005588920 nova_compute[226886]: 2026-01-20 14:39:19.901 226890 DEBUG nova.network.neutron [None req-24ebb284-7834-434a-ab14-e124354fa379 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Successfully created port: f6b42586-082e-4da5-b1fd-5723992197fe _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:39:20 np0005588920 nova_compute[226886]: 2026-01-20 14:39:20.057 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:20 np0005588920 nova_compute[226886]: 2026-01-20 14:39:20.627 226890 DEBUG nova.network.neutron [None req-24ebb284-7834-434a-ab14-e124354fa379 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Successfully updated port: f6b42586-082e-4da5-b1fd-5723992197fe _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:39:20 np0005588920 nova_compute[226886]: 2026-01-20 14:39:20.642 226890 DEBUG oslo_concurrency.lockutils [None req-24ebb284-7834-434a-ab14-e124354fa379 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquiring lock "refresh_cache-c1a45fae-79ce-48c2-81b9-4d1e30165d46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:39:20 np0005588920 nova_compute[226886]: 2026-01-20 14:39:20.643 226890 DEBUG oslo_concurrency.lockutils [None req-24ebb284-7834-434a-ab14-e124354fa379 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquired lock "refresh_cache-c1a45fae-79ce-48c2-81b9-4d1e30165d46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:39:20 np0005588920 nova_compute[226886]: 2026-01-20 14:39:20.644 226890 DEBUG nova.network.neutron [None req-24ebb284-7834-434a-ab14-e124354fa379 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:39:20 np0005588920 nova_compute[226886]: 2026-01-20 14:39:20.716 226890 DEBUG nova.compute.manager [req-95982c95-7fd8-44c2-bb34-ca22a961cbc4 req-89f5bb78-d1b2-491f-8f86-cc13d7a44c4a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Received event network-changed-f6b42586-082e-4da5-b1fd-5723992197fe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:39:20 np0005588920 nova_compute[226886]: 2026-01-20 14:39:20.716 226890 DEBUG nova.compute.manager [req-95982c95-7fd8-44c2-bb34-ca22a961cbc4 req-89f5bb78-d1b2-491f-8f86-cc13d7a44c4a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Refreshing instance network info cache due to event network-changed-f6b42586-082e-4da5-b1fd-5723992197fe. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:39:20 np0005588920 nova_compute[226886]: 2026-01-20 14:39:20.717 226890 DEBUG oslo_concurrency.lockutils [req-95982c95-7fd8-44c2-bb34-ca22a961cbc4 req-89f5bb78-d1b2-491f-8f86-cc13d7a44c4a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-c1a45fae-79ce-48c2-81b9-4d1e30165d46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:39:20 np0005588920 nova_compute[226886]: 2026-01-20 14:39:20.869 226890 WARNING nova.network.neutron [None req-24ebb284-7834-434a-ab14-e124354fa379 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] fc21b99b-4e34-422c-be05-0a440009dac4 already exists in list: networks containing: ['fc21b99b-4e34-422c-be05-0a440009dac4']. ignoring it#033[00m
Jan 20 09:39:20 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:39:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:39:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:21.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:39:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:21.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:22 np0005588920 podman[248942]: 2026-01-20 14:39:22.06216303 +0000 UTC m=+0.144463679 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 20 09:39:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:23.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:23.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:23 np0005588920 nova_compute[226886]: 2026-01-20 14:39:23.503 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:23 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:39:23 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:39:23 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:39:23 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:39:23 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 20 09:39:23 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 20 09:39:23 np0005588920 nova_compute[226886]: 2026-01-20 14:39:23.753 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:39:23 np0005588920 nova_compute[226886]: 2026-01-20 14:39:23.754 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:39:23 np0005588920 nova_compute[226886]: 2026-01-20 14:39:23.754 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 09:39:23 np0005588920 nova_compute[226886]: 2026-01-20 14:39:23.776 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 20 09:39:23 np0005588920 nova_compute[226886]: 2026-01-20 14:39:23.963 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "refresh_cache-c1a45fae-79ce-48c2-81b9-4d1e30165d46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:39:24 np0005588920 nova_compute[226886]: 2026-01-20 14:39:24.409 226890 DEBUG nova.network.neutron [None req-24ebb284-7834-434a-ab14-e124354fa379 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Updating instance_info_cache with network_info: [{"id": "82b46c8a-7331-4dba-b12c-3c4bd0d70a52", "address": "fa:16:3e:b2:0f:fa", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82b46c8a-73", "ovs_interfaceid": "82b46c8a-7331-4dba-b12c-3c4bd0d70a52", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "f6b42586-082e-4da5-b1fd-5723992197fe", "address": "fa:16:3e:f1:e6:93", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6b42586-08", "ovs_interfaceid": "f6b42586-082e-4da5-b1fd-5723992197fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:39:24 np0005588920 nova_compute[226886]: 2026-01-20 14:39:24.429 226890 DEBUG oslo_concurrency.lockutils [None req-24ebb284-7834-434a-ab14-e124354fa379 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Releasing lock "refresh_cache-c1a45fae-79ce-48c2-81b9-4d1e30165d46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:39:24 np0005588920 nova_compute[226886]: 2026-01-20 14:39:24.430 226890 DEBUG oslo_concurrency.lockutils [req-95982c95-7fd8-44c2-bb34-ca22a961cbc4 req-89f5bb78-d1b2-491f-8f86-cc13d7a44c4a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-c1a45fae-79ce-48c2-81b9-4d1e30165d46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:39:24 np0005588920 nova_compute[226886]: 2026-01-20 14:39:24.430 226890 DEBUG nova.network.neutron [req-95982c95-7fd8-44c2-bb34-ca22a961cbc4 req-89f5bb78-d1b2-491f-8f86-cc13d7a44c4a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Refreshing network info cache for port f6b42586-082e-4da5-b1fd-5723992197fe _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:39:24 np0005588920 nova_compute[226886]: 2026-01-20 14:39:24.433 226890 DEBUG nova.virt.libvirt.vif [None req-24ebb284-7834-434a-ab14-e124354fa379 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:38:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1231961321',display_name='tempest-AttachInterfacesTestJSON-server-1231961321',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1231961321',id=62,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI6np03AtwHHtTwHRlB7NMAsNqIU3PHfCmtXu/3XUK0jxxLomgnnligcXSu/L+GIBA+09ag+WKdbpS2RoKjhZ7ql3UhQf0nGVICsCZpNRSjPMQDiRaRNyEakPPHow9Jn5w==',key_name='tempest-keypair-1916620246',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:38:57Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e3f93fd4b2154dda9f38e62334904303',ramdisk_id='',reservation_id='r-3wujw3bt',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-305746947',owner_user_name='tempest-AttachInterfacesTestJSON-305746947-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:38:57Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c8a9fb458d27434495a77a94827b6097',uuid=c1a45fae-79ce-48c2-81b9-4d1e30165d46,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f6b42586-082e-4da5-b1fd-5723992197fe", "address": "fa:16:3e:f1:e6:93", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6b42586-08", "ovs_interfaceid": "f6b42586-082e-4da5-b1fd-5723992197fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:39:24 np0005588920 nova_compute[226886]: 2026-01-20 14:39:24.434 226890 DEBUG nova.network.os_vif_util [None req-24ebb284-7834-434a-ab14-e124354fa379 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converting VIF {"id": "f6b42586-082e-4da5-b1fd-5723992197fe", "address": "fa:16:3e:f1:e6:93", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6b42586-08", "ovs_interfaceid": "f6b42586-082e-4da5-b1fd-5723992197fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:39:24 np0005588920 nova_compute[226886]: 2026-01-20 14:39:24.435 226890 DEBUG nova.network.os_vif_util [None req-24ebb284-7834-434a-ab14-e124354fa379 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f1:e6:93,bridge_name='br-int',has_traffic_filtering=True,id=f6b42586-082e-4da5-b1fd-5723992197fe,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6b42586-08') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:39:24 np0005588920 nova_compute[226886]: 2026-01-20 14:39:24.435 226890 DEBUG os_vif [None req-24ebb284-7834-434a-ab14-e124354fa379 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f1:e6:93,bridge_name='br-int',has_traffic_filtering=True,id=f6b42586-082e-4da5-b1fd-5723992197fe,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6b42586-08') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:39:24 np0005588920 nova_compute[226886]: 2026-01-20 14:39:24.436 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:24 np0005588920 nova_compute[226886]: 2026-01-20 14:39:24.436 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:24 np0005588920 nova_compute[226886]: 2026-01-20 14:39:24.437 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:39:24 np0005588920 nova_compute[226886]: 2026-01-20 14:39:24.439 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:24 np0005588920 nova_compute[226886]: 2026-01-20 14:39:24.440 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf6b42586-08, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:24 np0005588920 nova_compute[226886]: 2026-01-20 14:39:24.440 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf6b42586-08, col_values=(('external_ids', {'iface-id': 'f6b42586-082e-4da5-b1fd-5723992197fe', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f1:e6:93', 'vm-uuid': 'c1a45fae-79ce-48c2-81b9-4d1e30165d46'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:24 np0005588920 NetworkManager[49076]: <info>  [1768919964.4541] manager: (tapf6b42586-08): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/99)
Jan 20 09:39:24 np0005588920 nova_compute[226886]: 2026-01-20 14:39:24.453 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:24 np0005588920 nova_compute[226886]: 2026-01-20 14:39:24.456 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:39:24 np0005588920 nova_compute[226886]: 2026-01-20 14:39:24.459 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:24 np0005588920 nova_compute[226886]: 2026-01-20 14:39:24.461 226890 INFO os_vif [None req-24ebb284-7834-434a-ab14-e124354fa379 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f1:e6:93,bridge_name='br-int',has_traffic_filtering=True,id=f6b42586-082e-4da5-b1fd-5723992197fe,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6b42586-08')#033[00m
Jan 20 09:39:24 np0005588920 nova_compute[226886]: 2026-01-20 14:39:24.461 226890 DEBUG nova.virt.libvirt.vif [None req-24ebb284-7834-434a-ab14-e124354fa379 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:38:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1231961321',display_name='tempest-AttachInterfacesTestJSON-server-1231961321',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1231961321',id=62,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI6np03AtwHHtTwHRlB7NMAsNqIU3PHfCmtXu/3XUK0jxxLomgnnligcXSu/L+GIBA+09ag+WKdbpS2RoKjhZ7ql3UhQf0nGVICsCZpNRSjPMQDiRaRNyEakPPHow9Jn5w==',key_name='tempest-keypair-1916620246',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:38:57Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e3f93fd4b2154dda9f38e62334904303',ramdisk_id='',reservation_id='r-3wujw3bt',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-305746947',owner_user_name='tempest-AttachInterfacesTestJSON-305746947-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:38:57Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c8a9fb458d27434495a77a94827b6097',uuid=c1a45fae-79ce-48c2-81b9-4d1e30165d46,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f6b42586-082e-4da5-b1fd-5723992197fe", "address": "fa:16:3e:f1:e6:93", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6b42586-08", "ovs_interfaceid": "f6b42586-082e-4da5-b1fd-5723992197fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:39:24 np0005588920 nova_compute[226886]: 2026-01-20 14:39:24.462 226890 DEBUG nova.network.os_vif_util [None req-24ebb284-7834-434a-ab14-e124354fa379 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converting VIF {"id": "f6b42586-082e-4da5-b1fd-5723992197fe", "address": "fa:16:3e:f1:e6:93", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6b42586-08", "ovs_interfaceid": "f6b42586-082e-4da5-b1fd-5723992197fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:39:24 np0005588920 nova_compute[226886]: 2026-01-20 14:39:24.462 226890 DEBUG nova.network.os_vif_util [None req-24ebb284-7834-434a-ab14-e124354fa379 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f1:e6:93,bridge_name='br-int',has_traffic_filtering=True,id=f6b42586-082e-4da5-b1fd-5723992197fe,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6b42586-08') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:39:24 np0005588920 nova_compute[226886]: 2026-01-20 14:39:24.465 226890 DEBUG nova.virt.libvirt.guest [None req-24ebb284-7834-434a-ab14-e124354fa379 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] attach device xml: <interface type="ethernet">
Jan 20 09:39:24 np0005588920 nova_compute[226886]:  <mac address="fa:16:3e:f1:e6:93"/>
Jan 20 09:39:24 np0005588920 nova_compute[226886]:  <model type="virtio"/>
Jan 20 09:39:24 np0005588920 nova_compute[226886]:  <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:39:24 np0005588920 nova_compute[226886]:  <mtu size="1442"/>
Jan 20 09:39:24 np0005588920 nova_compute[226886]:  <target dev="tapf6b42586-08"/>
Jan 20 09:39:24 np0005588920 nova_compute[226886]: </interface>
Jan 20 09:39:24 np0005588920 nova_compute[226886]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 20 09:39:24 np0005588920 kernel: tapf6b42586-08: entered promiscuous mode
Jan 20 09:39:24 np0005588920 NetworkManager[49076]: <info>  [1768919964.4746] manager: (tapf6b42586-08): new Tun device (/org/freedesktop/NetworkManager/Devices/100)
Jan 20 09:39:24 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:24Z|00170|binding|INFO|Claiming lport f6b42586-082e-4da5-b1fd-5723992197fe for this chassis.
Jan 20 09:39:24 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:24Z|00171|binding|INFO|f6b42586-082e-4da5-b1fd-5723992197fe: Claiming fa:16:3e:f1:e6:93 10.100.0.14
Jan 20 09:39:24 np0005588920 nova_compute[226886]: 2026-01-20 14:39:24.476 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:24.483 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f1:e6:93 10.100.0.14'], port_security=['fa:16:3e:f1:e6:93 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'c1a45fae-79ce-48c2-81b9-4d1e30165d46', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fc21b99b-4e34-422c-be05-0a440009dac4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e3f93fd4b2154dda9f38e62334904303', 'neutron:revision_number': '2', 'neutron:security_group_ids': '52cb9fb4-4318-4f53-9b5a-002d95792517', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7af6b6bc-3cbd-48be-9f10-23ec011e0426, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=f6b42586-082e-4da5-b1fd-5723992197fe) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:39:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:24.484 144128 INFO neutron.agent.ovn.metadata.agent [-] Port f6b42586-082e-4da5-b1fd-5723992197fe in datapath fc21b99b-4e34-422c-be05-0a440009dac4 bound to our chassis#033[00m
Jan 20 09:39:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:24.486 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fc21b99b-4e34-422c-be05-0a440009dac4#033[00m
Jan 20 09:39:24 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:24Z|00172|binding|INFO|Setting lport f6b42586-082e-4da5-b1fd-5723992197fe ovn-installed in OVS
Jan 20 09:39:24 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:24Z|00173|binding|INFO|Setting lport f6b42586-082e-4da5-b1fd-5723992197fe up in Southbound
Jan 20 09:39:24 np0005588920 nova_compute[226886]: 2026-01-20 14:39:24.492 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:24 np0005588920 nova_compute[226886]: 2026-01-20 14:39:24.495 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:24.505 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a24dc154-7943-4283-95ed-b02f5fe2a3d1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:24 np0005588920 systemd-udevd[249227]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:39:24 np0005588920 NetworkManager[49076]: <info>  [1768919964.5345] device (tapf6b42586-08): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:39:24 np0005588920 NetworkManager[49076]: <info>  [1768919964.5351] device (tapf6b42586-08): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:39:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:24.537 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[b581a81f-d235-479b-b70c-feee552d3963]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:24.540 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[40bc9384-6c42-4c9b-99c9-9256b6997dc3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:24 np0005588920 nova_compute[226886]: 2026-01-20 14:39:24.549 226890 DEBUG nova.virt.libvirt.driver [None req-24ebb284-7834-434a-ab14-e124354fa379 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:39:24 np0005588920 nova_compute[226886]: 2026-01-20 14:39:24.549 226890 DEBUG nova.virt.libvirt.driver [None req-24ebb284-7834-434a-ab14-e124354fa379 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:39:24 np0005588920 nova_compute[226886]: 2026-01-20 14:39:24.549 226890 DEBUG nova.virt.libvirt.driver [None req-24ebb284-7834-434a-ab14-e124354fa379 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] No VIF found with MAC fa:16:3e:b2:0f:fa, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:39:24 np0005588920 nova_compute[226886]: 2026-01-20 14:39:24.550 226890 DEBUG nova.virt.libvirt.driver [None req-24ebb284-7834-434a-ab14-e124354fa379 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] No VIF found with MAC fa:16:3e:f1:e6:93, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:39:24 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:39:24 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:39:24 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:39:24 np0005588920 nova_compute[226886]: 2026-01-20 14:39:24.570 226890 DEBUG nova.virt.libvirt.guest [None req-24ebb284-7834-434a-ab14-e124354fa379 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:39:24 np0005588920 nova_compute[226886]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:39:24 np0005588920 nova_compute[226886]:  <nova:name>tempest-AttachInterfacesTestJSON-server-1231961321</nova:name>
Jan 20 09:39:24 np0005588920 nova_compute[226886]:  <nova:creationTime>2026-01-20 14:39:24</nova:creationTime>
Jan 20 09:39:24 np0005588920 nova_compute[226886]:  <nova:flavor name="m1.nano">
Jan 20 09:39:24 np0005588920 nova_compute[226886]:    <nova:memory>128</nova:memory>
Jan 20 09:39:24 np0005588920 nova_compute[226886]:    <nova:disk>1</nova:disk>
Jan 20 09:39:24 np0005588920 nova_compute[226886]:    <nova:swap>0</nova:swap>
Jan 20 09:39:24 np0005588920 nova_compute[226886]:    <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:39:24 np0005588920 nova_compute[226886]:    <nova:vcpus>1</nova:vcpus>
Jan 20 09:39:24 np0005588920 nova_compute[226886]:  </nova:flavor>
Jan 20 09:39:24 np0005588920 nova_compute[226886]:  <nova:owner>
Jan 20 09:39:24 np0005588920 nova_compute[226886]:    <nova:user uuid="c8a9fb458d27434495a77a94827b6097">tempest-AttachInterfacesTestJSON-305746947-project-member</nova:user>
Jan 20 09:39:24 np0005588920 nova_compute[226886]:    <nova:project uuid="e3f93fd4b2154dda9f38e62334904303">tempest-AttachInterfacesTestJSON-305746947</nova:project>
Jan 20 09:39:24 np0005588920 nova_compute[226886]:  </nova:owner>
Jan 20 09:39:24 np0005588920 nova_compute[226886]:  <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:39:24 np0005588920 nova_compute[226886]:  <nova:ports>
Jan 20 09:39:24 np0005588920 nova_compute[226886]:    <nova:port uuid="82b46c8a-7331-4dba-b12c-3c4bd0d70a52">
Jan 20 09:39:24 np0005588920 nova_compute[226886]:      <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 20 09:39:24 np0005588920 nova_compute[226886]:    </nova:port>
Jan 20 09:39:24 np0005588920 nova_compute[226886]:    <nova:port uuid="f6b42586-082e-4da5-b1fd-5723992197fe">
Jan 20 09:39:24 np0005588920 nova_compute[226886]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 20 09:39:24 np0005588920 nova_compute[226886]:    </nova:port>
Jan 20 09:39:24 np0005588920 nova_compute[226886]:  </nova:ports>
Jan 20 09:39:24 np0005588920 nova_compute[226886]: </nova:instance>
Jan 20 09:39:24 np0005588920 nova_compute[226886]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 20 09:39:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:24.572 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[4805ac71-ebcd-4221-802f-43767dc34960]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:24 np0005588920 nova_compute[226886]: 2026-01-20 14:39:24.589 226890 DEBUG oslo_concurrency.lockutils [None req-24ebb284-7834-434a-ab14-e124354fa379 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "interface-c1a45fae-79ce-48c2-81b9-4d1e30165d46-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 5.784s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:24.592 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[65bf72e1-8372-4ba5-ac0b-7c7066d5fad4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfc21b99b-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b1:5b:d2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 497615, 'reachable_time': 33551, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249233, 'error': None, 'target': 'ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:24.607 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2555506f-bf45-4a57-9f92-91f2fe2346d6]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfc21b99b-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 497625, 'tstamp': 497625}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 249234, 'error': None, 'target': 'ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfc21b99b-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 497628, 'tstamp': 497628}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 249234, 'error': None, 'target': 'ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:24.609 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfc21b99b-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:24 np0005588920 nova_compute[226886]: 2026-01-20 14:39:24.610 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:24.611 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfc21b99b-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:24.611 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:39:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:24.612 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfc21b99b-40, col_values=(('external_ids', {'iface-id': '583df905-1d9f-49c1-b209-4b7fad1599f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:24.612 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:39:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:39:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:25.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:39:25 np0005588920 nova_compute[226886]: 2026-01-20 14:39:25.059 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:25 np0005588920 nova_compute[226886]: 2026-01-20 14:39:25.076 226890 DEBUG nova.compute.manager [req-ab419b91-a12c-4a6b-bce8-3dc26c97c4a1 req-4948fa0e-3d0b-4dc4-bb5c-9cabce8a3106 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Received event network-vif-plugged-f6b42586-082e-4da5-b1fd-5723992197fe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:39:25 np0005588920 nova_compute[226886]: 2026-01-20 14:39:25.076 226890 DEBUG oslo_concurrency.lockutils [req-ab419b91-a12c-4a6b-bce8-3dc26c97c4a1 req-4948fa0e-3d0b-4dc4-bb5c-9cabce8a3106 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "c1a45fae-79ce-48c2-81b9-4d1e30165d46-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:25 np0005588920 nova_compute[226886]: 2026-01-20 14:39:25.076 226890 DEBUG oslo_concurrency.lockutils [req-ab419b91-a12c-4a6b-bce8-3dc26c97c4a1 req-4948fa0e-3d0b-4dc4-bb5c-9cabce8a3106 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c1a45fae-79ce-48c2-81b9-4d1e30165d46-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:25 np0005588920 nova_compute[226886]: 2026-01-20 14:39:25.077 226890 DEBUG oslo_concurrency.lockutils [req-ab419b91-a12c-4a6b-bce8-3dc26c97c4a1 req-4948fa0e-3d0b-4dc4-bb5c-9cabce8a3106 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c1a45fae-79ce-48c2-81b9-4d1e30165d46-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:25 np0005588920 nova_compute[226886]: 2026-01-20 14:39:25.077 226890 DEBUG nova.compute.manager [req-ab419b91-a12c-4a6b-bce8-3dc26c97c4a1 req-4948fa0e-3d0b-4dc4-bb5c-9cabce8a3106 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] No waiting events found dispatching network-vif-plugged-f6b42586-082e-4da5-b1fd-5723992197fe pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:39:25 np0005588920 nova_compute[226886]: 2026-01-20 14:39:25.077 226890 WARNING nova.compute.manager [req-ab419b91-a12c-4a6b-bce8-3dc26c97c4a1 req-4948fa0e-3d0b-4dc4-bb5c-9cabce8a3106 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Received unexpected event network-vif-plugged-f6b42586-082e-4da5-b1fd-5723992197fe for instance with vm_state active and task_state None.#033[00m
Jan 20 09:39:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:39:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:25.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:39:25 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:25Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f1:e6:93 10.100.0.14
Jan 20 09:39:25 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:25Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f1:e6:93 10.100.0.14
Jan 20 09:39:25 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:39:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:39:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:27.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:39:27 np0005588920 nova_compute[226886]: 2026-01-20 14:39:27.193 226890 DEBUG nova.compute.manager [req-a639bb43-49d1-486f-b9f3-9c323f02700b req-046fbe07-84f5-4384-884e-f54ec1ef5c27 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Received event network-vif-plugged-f6b42586-082e-4da5-b1fd-5723992197fe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:39:27 np0005588920 nova_compute[226886]: 2026-01-20 14:39:27.194 226890 DEBUG oslo_concurrency.lockutils [req-a639bb43-49d1-486f-b9f3-9c323f02700b req-046fbe07-84f5-4384-884e-f54ec1ef5c27 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "c1a45fae-79ce-48c2-81b9-4d1e30165d46-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:27 np0005588920 nova_compute[226886]: 2026-01-20 14:39:27.194 226890 DEBUG oslo_concurrency.lockutils [req-a639bb43-49d1-486f-b9f3-9c323f02700b req-046fbe07-84f5-4384-884e-f54ec1ef5c27 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c1a45fae-79ce-48c2-81b9-4d1e30165d46-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:27 np0005588920 nova_compute[226886]: 2026-01-20 14:39:27.195 226890 DEBUG oslo_concurrency.lockutils [req-a639bb43-49d1-486f-b9f3-9c323f02700b req-046fbe07-84f5-4384-884e-f54ec1ef5c27 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c1a45fae-79ce-48c2-81b9-4d1e30165d46-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:27 np0005588920 nova_compute[226886]: 2026-01-20 14:39:27.195 226890 DEBUG nova.compute.manager [req-a639bb43-49d1-486f-b9f3-9c323f02700b req-046fbe07-84f5-4384-884e-f54ec1ef5c27 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] No waiting events found dispatching network-vif-plugged-f6b42586-082e-4da5-b1fd-5723992197fe pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:39:27 np0005588920 nova_compute[226886]: 2026-01-20 14:39:27.196 226890 WARNING nova.compute.manager [req-a639bb43-49d1-486f-b9f3-9c323f02700b req-046fbe07-84f5-4384-884e-f54ec1ef5c27 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Received unexpected event network-vif-plugged-f6b42586-082e-4da5-b1fd-5723992197fe for instance with vm_state active and task_state None.#033[00m
Jan 20 09:39:27 np0005588920 nova_compute[226886]: 2026-01-20 14:39:27.475 226890 DEBUG oslo_concurrency.lockutils [None req-4a8dd796-3d53-47fb-bffa-7c56caf5e8d3 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquiring lock "interface-c1a45fae-79ce-48c2-81b9-4d1e30165d46-f6b42586-082e-4da5-b1fd-5723992197fe" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:27 np0005588920 nova_compute[226886]: 2026-01-20 14:39:27.475 226890 DEBUG oslo_concurrency.lockutils [None req-4a8dd796-3d53-47fb-bffa-7c56caf5e8d3 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "interface-c1a45fae-79ce-48c2-81b9-4d1e30165d46-f6b42586-082e-4da5-b1fd-5723992197fe" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:27 np0005588920 nova_compute[226886]: 2026-01-20 14:39:27.492 226890 DEBUG nova.objects.instance [None req-4a8dd796-3d53-47fb-bffa-7c56caf5e8d3 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lazy-loading 'flavor' on Instance uuid c1a45fae-79ce-48c2-81b9-4d1e30165d46 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:39:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:39:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:27.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:39:27 np0005588920 nova_compute[226886]: 2026-01-20 14:39:27.519 226890 DEBUG nova.virt.libvirt.vif [None req-4a8dd796-3d53-47fb-bffa-7c56caf5e8d3 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:38:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1231961321',display_name='tempest-AttachInterfacesTestJSON-server-1231961321',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1231961321',id=62,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI6np03AtwHHtTwHRlB7NMAsNqIU3PHfCmtXu/3XUK0jxxLomgnnligcXSu/L+GIBA+09ag+WKdbpS2RoKjhZ7ql3UhQf0nGVICsCZpNRSjPMQDiRaRNyEakPPHow9Jn5w==',key_name='tempest-keypair-1916620246',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:38:57Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e3f93fd4b2154dda9f38e62334904303',ramdisk_id='',reservation_id='r-3wujw3bt',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-305746947',owner_user_name='tempest-AttachInterfacesTestJSON-305746947-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:38:57Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c8a9fb458d27434495a77a94827b6097',uuid=c1a45fae-79ce-48c2-81b9-4d1e30165d46,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f6b42586-082e-4da5-b1fd-5723992197fe", "address": "fa:16:3e:f1:e6:93", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6b42586-08", "ovs_interfaceid": "f6b42586-082e-4da5-b1fd-5723992197fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:39:27 np0005588920 nova_compute[226886]: 2026-01-20 14:39:27.520 226890 DEBUG nova.network.os_vif_util [None req-4a8dd796-3d53-47fb-bffa-7c56caf5e8d3 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converting VIF {"id": "f6b42586-082e-4da5-b1fd-5723992197fe", "address": "fa:16:3e:f1:e6:93", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6b42586-08", "ovs_interfaceid": "f6b42586-082e-4da5-b1fd-5723992197fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:39:27 np0005588920 nova_compute[226886]: 2026-01-20 14:39:27.521 226890 DEBUG nova.network.os_vif_util [None req-4a8dd796-3d53-47fb-bffa-7c56caf5e8d3 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f1:e6:93,bridge_name='br-int',has_traffic_filtering=True,id=f6b42586-082e-4da5-b1fd-5723992197fe,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6b42586-08') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:39:27 np0005588920 nova_compute[226886]: 2026-01-20 14:39:27.524 226890 DEBUG nova.virt.libvirt.guest [None req-4a8dd796-3d53-47fb-bffa-7c56caf5e8d3 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:f1:e6:93"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapf6b42586-08"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 20 09:39:27 np0005588920 nova_compute[226886]: 2026-01-20 14:39:27.526 226890 DEBUG nova.virt.libvirt.guest [None req-4a8dd796-3d53-47fb-bffa-7c56caf5e8d3 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:f1:e6:93"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapf6b42586-08"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 20 09:39:27 np0005588920 nova_compute[226886]: 2026-01-20 14:39:27.528 226890 DEBUG nova.virt.libvirt.driver [None req-4a8dd796-3d53-47fb-bffa-7c56caf5e8d3 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Attempting to detach device tapf6b42586-08 from instance c1a45fae-79ce-48c2-81b9-4d1e30165d46 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 20 09:39:27 np0005588920 nova_compute[226886]: 2026-01-20 14:39:27.529 226890 DEBUG nova.virt.libvirt.guest [None req-4a8dd796-3d53-47fb-bffa-7c56caf5e8d3 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] detach device xml: <interface type="ethernet">
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <mac address="fa:16:3e:f1:e6:93"/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <model type="virtio"/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <mtu size="1442"/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <target dev="tapf6b42586-08"/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]: </interface>
Jan 20 09:39:27 np0005588920 nova_compute[226886]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 20 09:39:27 np0005588920 nova_compute[226886]: 2026-01-20 14:39:27.538 226890 DEBUG nova.virt.libvirt.guest [None req-4a8dd796-3d53-47fb-bffa-7c56caf5e8d3 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:f1:e6:93"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapf6b42586-08"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 20 09:39:27 np0005588920 nova_compute[226886]: 2026-01-20 14:39:27.542 226890 DEBUG nova.virt.libvirt.guest [None req-4a8dd796-3d53-47fb-bffa-7c56caf5e8d3 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:f1:e6:93"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapf6b42586-08"/></interface>not found in domain: <domain type='kvm' id='26'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <name>instance-0000003e</name>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <uuid>c1a45fae-79ce-48c2-81b9-4d1e30165d46</uuid>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <nova:name>tempest-AttachInterfacesTestJSON-server-1231961321</nova:name>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <nova:creationTime>2026-01-20 14:39:24</nova:creationTime>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <nova:flavor name="m1.nano">
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <nova:memory>128</nova:memory>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <nova:disk>1</nova:disk>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <nova:swap>0</nova:swap>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <nova:vcpus>1</nova:vcpus>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  </nova:flavor>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <nova:owner>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <nova:user uuid="c8a9fb458d27434495a77a94827b6097">tempest-AttachInterfacesTestJSON-305746947-project-member</nova:user>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <nova:project uuid="e3f93fd4b2154dda9f38e62334904303">tempest-AttachInterfacesTestJSON-305746947</nova:project>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  </nova:owner>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <nova:ports>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <nova:port uuid="82b46c8a-7331-4dba-b12c-3c4bd0d70a52">
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </nova:port>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <nova:port uuid="f6b42586-082e-4da5-b1fd-5723992197fe">
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </nova:port>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  </nova:ports>
Jan 20 09:39:27 np0005588920 nova_compute[226886]: </nova:instance>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <memory unit='KiB'>131072</memory>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <vcpu placement='static'>1</vcpu>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <resource>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <partition>/machine</partition>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  </resource>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <sysinfo type='smbios'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <entry name='manufacturer'>RDO</entry>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <entry name='product'>OpenStack Compute</entry>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <entry name='serial'>c1a45fae-79ce-48c2-81b9-4d1e30165d46</entry>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <entry name='uuid'>c1a45fae-79ce-48c2-81b9-4d1e30165d46</entry>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <entry name='family'>Virtual Machine</entry>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <boot dev='hd'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <smbios mode='sysinfo'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <vmcoreinfo state='on'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <cpu mode='custom' match='exact' check='full'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <model fallback='forbid'>Nehalem</model>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <feature policy='require' name='x2apic'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <feature policy='require' name='hypervisor'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <feature policy='require' name='vme'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <clock offset='utc'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <timer name='pit' tickpolicy='delay'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <timer name='hpet' present='no'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <on_poweroff>destroy</on_poweroff>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <on_reboot>restart</on_reboot>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <on_crash>destroy</on_crash>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <disk type='network' device='disk'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <driver name='qemu' type='raw' cache='none'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <auth username='openstack'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:        <secret type='ceph' uuid='e399cf45-e6b6-5393-99f1-75c601d3f188'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <source protocol='rbd' name='vms/c1a45fae-79ce-48c2-81b9-4d1e30165d46_disk' index='2'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:        <host name='192.168.122.100' port='6789'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:        <host name='192.168.122.102' port='6789'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:        <host name='192.168.122.101' port='6789'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target dev='vda' bus='virtio'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='virtio-disk0'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <disk type='network' device='cdrom'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <driver name='qemu' type='raw' cache='none'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <auth username='openstack'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:        <secret type='ceph' uuid='e399cf45-e6b6-5393-99f1-75c601d3f188'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <source protocol='rbd' name='vms/c1a45fae-79ce-48c2-81b9-4d1e30165d46_disk.config' index='1'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:        <host name='192.168.122.100' port='6789'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:        <host name='192.168.122.102' port='6789'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:        <host name='192.168.122.101' port='6789'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target dev='sda' bus='sata'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <readonly/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='sata0-0-0'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='0' model='pcie-root'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pcie.0'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target chassis='1' port='0x10'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pci.1'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target chassis='2' port='0x11'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pci.2'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target chassis='3' port='0x12'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pci.3'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target chassis='4' port='0x13'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pci.4'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target chassis='5' port='0x14'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pci.5'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target chassis='6' port='0x15'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pci.6'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target chassis='7' port='0x16'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pci.7'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target chassis='8' port='0x17'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pci.8'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target chassis='9' port='0x18'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pci.9'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target chassis='10' port='0x19'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pci.10'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target chassis='11' port='0x1a'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pci.11'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target chassis='12' port='0x1b'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pci.12'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target chassis='13' port='0x1c'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pci.13'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target chassis='14' port='0x1d'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pci.14'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target chassis='15' port='0x1e'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pci.15'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target chassis='16' port='0x1f'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pci.16'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target chassis='17' port='0x20'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pci.17'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target chassis='18' port='0x21'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pci.18'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target chassis='19' port='0x22'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pci.19'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target chassis='20' port='0x23'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pci.20'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target chassis='21' port='0x24'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pci.21'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target chassis='22' port='0x25'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pci.22'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target chassis='23' port='0x26'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pci.23'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target chassis='24' port='0x27'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pci.24'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target chassis='25' port='0x28'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pci.25'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model name='pcie-pci-bridge'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pci.26'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='usb'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='sata' index='0'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='ide'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <interface type='ethernet'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <mac address='fa:16:3e:b2:0f:fa'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target dev='tap82b46c8a-73'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model type='virtio'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <driver name='vhost' rx_queue_size='512'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <mtu size='1442'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='net0'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <interface type='ethernet'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <mac address='fa:16:3e:f1:e6:93'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target dev='tapf6b42586-08'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model type='virtio'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <driver name='vhost' rx_queue_size='512'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <mtu size='1442'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='net1'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <serial type='pty'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <source path='/dev/pts/0'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <log file='/var/lib/nova/instances/c1a45fae-79ce-48c2-81b9-4d1e30165d46/console.log' append='off'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target type='isa-serial' port='0'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:        <model name='isa-serial'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      </target>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='serial0'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <console type='pty' tty='/dev/pts/0'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <source path='/dev/pts/0'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <log file='/var/lib/nova/instances/c1a45fae-79ce-48c2-81b9-4d1e30165d46/console.log' append='off'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target type='serial' port='0'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='serial0'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </console>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <input type='tablet' bus='usb'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='input0'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='usb' bus='0' port='1'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </input>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <input type='mouse' bus='ps2'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='input1'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </input>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <input type='keyboard' bus='ps2'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='input2'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </input>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <listen type='address' address='::0'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </graphics>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <audio id='1' type='none'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model type='virtio' heads='1' primary='yes'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='video0'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <watchdog model='itco' action='reset'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='watchdog0'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </watchdog>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <memballoon model='virtio'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <stats period='10'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='balloon0'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <rng model='virtio'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <backend model='random'>/dev/urandom</backend>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='rng0'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <label>system_u:system_r:svirt_t:s0:c171,c564</label>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c171,c564</imagelabel>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  </seclabel>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <label>+107:+107</label>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <imagelabel>+107:+107</imagelabel>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  </seclabel>
Jan 20 09:39:27 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:39:27 np0005588920 nova_compute[226886]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 20 09:39:27 np0005588920 nova_compute[226886]: 2026-01-20 14:39:27.543 226890 INFO nova.virt.libvirt.driver [None req-4a8dd796-3d53-47fb-bffa-7c56caf5e8d3 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Successfully detached device tapf6b42586-08 from instance c1a45fae-79ce-48c2-81b9-4d1e30165d46 from the persistent domain config.
Jan 20 09:39:27 np0005588920 nova_compute[226886]: 2026-01-20 14:39:27.543 226890 DEBUG nova.virt.libvirt.driver [None req-4a8dd796-3d53-47fb-bffa-7c56caf5e8d3 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] (1/8): Attempting to detach device tapf6b42586-08 with device alias net1 from instance c1a45fae-79ce-48c2-81b9-4d1e30165d46 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Jan 20 09:39:27 np0005588920 nova_compute[226886]: 2026-01-20 14:39:27.543 226890 DEBUG nova.virt.libvirt.guest [None req-4a8dd796-3d53-47fb-bffa-7c56caf5e8d3 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] detach device xml: <interface type="ethernet">
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <mac address="fa:16:3e:f1:e6:93"/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <model type="virtio"/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <mtu size="1442"/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <target dev="tapf6b42586-08"/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]: </interface>
Jan 20 09:39:27 np0005588920 nova_compute[226886]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 20 09:39:27 np0005588920 kernel: tapf6b42586-08 (unregistering): left promiscuous mode
Jan 20 09:39:27 np0005588920 NetworkManager[49076]: <info>  [1768919967.5899] device (tapf6b42586-08): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:39:27 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:27Z|00174|binding|INFO|Releasing lport f6b42586-082e-4da5-b1fd-5723992197fe from this chassis (sb_readonly=0)
Jan 20 09:39:27 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:27Z|00175|binding|INFO|Setting lport f6b42586-082e-4da5-b1fd-5723992197fe down in Southbound
Jan 20 09:39:27 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:27Z|00176|binding|INFO|Removing iface tapf6b42586-08 ovn-installed in OVS
Jan 20 09:39:27 np0005588920 nova_compute[226886]: 2026-01-20 14:39:27.593 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:39:27 np0005588920 nova_compute[226886]: 2026-01-20 14:39:27.595 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:39:27 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:27.603 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f1:e6:93 10.100.0.14'], port_security=['fa:16:3e:f1:e6:93 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'c1a45fae-79ce-48c2-81b9-4d1e30165d46', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fc21b99b-4e34-422c-be05-0a440009dac4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e3f93fd4b2154dda9f38e62334904303', 'neutron:revision_number': '4', 'neutron:security_group_ids': '52cb9fb4-4318-4f53-9b5a-002d95792517', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7af6b6bc-3cbd-48be-9f10-23ec011e0426, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=f6b42586-082e-4da5-b1fd-5723992197fe) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 20 09:39:27 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:27.606 144128 INFO neutron.agent.ovn.metadata.agent [-] Port f6b42586-082e-4da5-b1fd-5723992197fe in datapath fc21b99b-4e34-422c-be05-0a440009dac4 unbound from our chassis
Jan 20 09:39:27 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:27.610 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fc21b99b-4e34-422c-be05-0a440009dac4
Jan 20 09:39:27 np0005588920 nova_compute[226886]: 2026-01-20 14:39:27.611 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:39:27 np0005588920 nova_compute[226886]: 2026-01-20 14:39:27.612 226890 DEBUG nova.virt.libvirt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Received event <DeviceRemovedEvent: 1768919967.6123414, c1a45fae-79ce-48c2-81b9-4d1e30165d46 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Jan 20 09:39:27 np0005588920 nova_compute[226886]: 2026-01-20 14:39:27.613 226890 DEBUG nova.virt.libvirt.driver [None req-4a8dd796-3d53-47fb-bffa-7c56caf5e8d3 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Start waiting for the detach event from libvirt for device tapf6b42586-08 with device alias net1 for instance c1a45fae-79ce-48c2-81b9-4d1e30165d46 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Jan 20 09:39:27 np0005588920 nova_compute[226886]: 2026-01-20 14:39:27.614 226890 DEBUG nova.virt.libvirt.guest [None req-4a8dd796-3d53-47fb-bffa-7c56caf5e8d3 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:f1:e6:93"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapf6b42586-08"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 20 09:39:27 np0005588920 nova_compute[226886]: 2026-01-20 14:39:27.618 226890 DEBUG nova.virt.libvirt.guest [None req-4a8dd796-3d53-47fb-bffa-7c56caf5e8d3 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:f1:e6:93"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapf6b42586-08"/></interface>not found in domain: <domain type='kvm' id='26'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <name>instance-0000003e</name>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <uuid>c1a45fae-79ce-48c2-81b9-4d1e30165d46</uuid>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <nova:name>tempest-AttachInterfacesTestJSON-server-1231961321</nova:name>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <nova:creationTime>2026-01-20 14:39:24</nova:creationTime>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <nova:flavor name="m1.nano">
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <nova:memory>128</nova:memory>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <nova:disk>1</nova:disk>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <nova:swap>0</nova:swap>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <nova:vcpus>1</nova:vcpus>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  </nova:flavor>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <nova:owner>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <nova:user uuid="c8a9fb458d27434495a77a94827b6097">tempest-AttachInterfacesTestJSON-305746947-project-member</nova:user>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <nova:project uuid="e3f93fd4b2154dda9f38e62334904303">tempest-AttachInterfacesTestJSON-305746947</nova:project>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  </nova:owner>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <nova:ports>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <nova:port uuid="82b46c8a-7331-4dba-b12c-3c4bd0d70a52">
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </nova:port>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <nova:port uuid="f6b42586-082e-4da5-b1fd-5723992197fe">
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </nova:port>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  </nova:ports>
Jan 20 09:39:27 np0005588920 nova_compute[226886]: </nova:instance>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <memory unit='KiB'>131072</memory>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <vcpu placement='static'>1</vcpu>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <resource>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <partition>/machine</partition>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  </resource>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <sysinfo type='smbios'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <entry name='manufacturer'>RDO</entry>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <entry name='product'>OpenStack Compute</entry>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <entry name='serial'>c1a45fae-79ce-48c2-81b9-4d1e30165d46</entry>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <entry name='uuid'>c1a45fae-79ce-48c2-81b9-4d1e30165d46</entry>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <entry name='family'>Virtual Machine</entry>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <boot dev='hd'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <smbios mode='sysinfo'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <vmcoreinfo state='on'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <cpu mode='custom' match='exact' check='full'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <model fallback='forbid'>Nehalem</model>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <feature policy='require' name='x2apic'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <feature policy='require' name='hypervisor'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <feature policy='require' name='vme'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <clock offset='utc'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <timer name='pit' tickpolicy='delay'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <timer name='hpet' present='no'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <on_poweroff>destroy</on_poweroff>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <on_reboot>restart</on_reboot>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <on_crash>destroy</on_crash>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <disk type='network' device='disk'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <driver name='qemu' type='raw' cache='none'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <auth username='openstack'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:        <secret type='ceph' uuid='e399cf45-e6b6-5393-99f1-75c601d3f188'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <source protocol='rbd' name='vms/c1a45fae-79ce-48c2-81b9-4d1e30165d46_disk' index='2'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:        <host name='192.168.122.100' port='6789'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:        <host name='192.168.122.102' port='6789'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:        <host name='192.168.122.101' port='6789'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target dev='vda' bus='virtio'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='virtio-disk0'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <disk type='network' device='cdrom'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <driver name='qemu' type='raw' cache='none'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <auth username='openstack'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:        <secret type='ceph' uuid='e399cf45-e6b6-5393-99f1-75c601d3f188'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <source protocol='rbd' name='vms/c1a45fae-79ce-48c2-81b9-4d1e30165d46_disk.config' index='1'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:        <host name='192.168.122.100' port='6789'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:        <host name='192.168.122.102' port='6789'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:        <host name='192.168.122.101' port='6789'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target dev='sda' bus='sata'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <readonly/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='sata0-0-0'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='0' model='pcie-root'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pcie.0'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target chassis='1' port='0x10'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pci.1'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target chassis='2' port='0x11'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pci.2'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target chassis='3' port='0x12'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pci.3'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target chassis='4' port='0x13'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pci.4'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target chassis='5' port='0x14'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pci.5'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target chassis='6' port='0x15'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pci.6'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target chassis='7' port='0x16'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pci.7'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target chassis='8' port='0x17'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pci.8'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target chassis='9' port='0x18'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pci.9'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target chassis='10' port='0x19'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pci.10'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target chassis='11' port='0x1a'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pci.11'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target chassis='12' port='0x1b'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pci.12'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target chassis='13' port='0x1c'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pci.13'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target chassis='14' port='0x1d'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pci.14'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target chassis='15' port='0x1e'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pci.15'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target chassis='16' port='0x1f'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pci.16'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target chassis='17' port='0x20'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pci.17'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target chassis='18' port='0x21'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pci.18'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target chassis='19' port='0x22'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pci.19'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target chassis='20' port='0x23'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pci.20'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target chassis='21' port='0x24'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pci.21'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target chassis='22' port='0x25'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pci.22'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target chassis='23' port='0x26'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pci.23'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target chassis='24' port='0x27'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pci.24'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target chassis='25' port='0x28'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pci.25'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model name='pcie-pci-bridge'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='pci.26'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='usb'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <controller type='sata' index='0'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='ide'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <interface type='ethernet'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <mac address='fa:16:3e:b2:0f:fa'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target dev='tap82b46c8a-73'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model type='virtio'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <driver name='vhost' rx_queue_size='512'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <mtu size='1442'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='net0'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <serial type='pty'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <source path='/dev/pts/0'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <log file='/var/lib/nova/instances/c1a45fae-79ce-48c2-81b9-4d1e30165d46/console.log' append='off'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target type='isa-serial' port='0'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:        <model name='isa-serial'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      </target>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='serial0'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <console type='pty' tty='/dev/pts/0'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <source path='/dev/pts/0'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <log file='/var/lib/nova/instances/c1a45fae-79ce-48c2-81b9-4d1e30165d46/console.log' append='off'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <target type='serial' port='0'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='serial0'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </console>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <input type='tablet' bus='usb'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='input0'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='usb' bus='0' port='1'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </input>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <input type='mouse' bus='ps2'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='input1'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </input>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <input type='keyboard' bus='ps2'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='input2'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </input>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <listen type='address' address='::0'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </graphics>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <audio id='1' type='none'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <model type='virtio' heads='1' primary='yes'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='video0'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <watchdog model='itco' action='reset'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='watchdog0'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </watchdog>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <memballoon model='virtio'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <stats period='10'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='balloon0'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <rng model='virtio'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <backend model='random'>/dev/urandom</backend>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <alias name='rng0'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <label>system_u:system_r:svirt_t:s0:c171,c564</label>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c171,c564</imagelabel>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  </seclabel>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <label>+107:+107</label>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <imagelabel>+107:+107</imagelabel>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  </seclabel>
Jan 20 09:39:27 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:39:27 np0005588920 nova_compute[226886]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 20 09:39:27 np0005588920 nova_compute[226886]: 2026-01-20 14:39:27.618 226890 INFO nova.virt.libvirt.driver [None req-4a8dd796-3d53-47fb-bffa-7c56caf5e8d3 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Successfully detached device tapf6b42586-08 from instance c1a45fae-79ce-48c2-81b9-4d1e30165d46 from the live domain config.
Jan 20 09:39:27 np0005588920 nova_compute[226886]: 2026-01-20 14:39:27.618 226890 DEBUG nova.virt.libvirt.vif [None req-4a8dd796-3d53-47fb-bffa-7c56caf5e8d3 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:38:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1231961321',display_name='tempest-AttachInterfacesTestJSON-server-1231961321',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1231961321',id=62,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI6np03AtwHHtTwHRlB7NMAsNqIU3PHfCmtXu/3XUK0jxxLomgnnligcXSu/L+GIBA+09ag+WKdbpS2RoKjhZ7ql3UhQf0nGVICsCZpNRSjPMQDiRaRNyEakPPHow9Jn5w==',key_name='tempest-keypair-1916620246',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:38:57Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e3f93fd4b2154dda9f38e62334904303',ramdisk_id='',reservation_id='r-3wujw3bt',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-305746947',owner_user_name='tempest-AttachInterfacesTestJSON-305746947-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:38:57Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c8a9fb458d27434495a77a94827b6097',uuid=c1a45fae-79ce-48c2-81b9-4d1e30165d46,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f6b42586-082e-4da5-b1fd-5723992197fe", "address": "fa:16:3e:f1:e6:93", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6b42586-08", "ovs_interfaceid": "f6b42586-082e-4da5-b1fd-5723992197fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 20 09:39:27 np0005588920 nova_compute[226886]: 2026-01-20 14:39:27.619 226890 DEBUG nova.network.os_vif_util [None req-4a8dd796-3d53-47fb-bffa-7c56caf5e8d3 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converting VIF {"id": "f6b42586-082e-4da5-b1fd-5723992197fe", "address": "fa:16:3e:f1:e6:93", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6b42586-08", "ovs_interfaceid": "f6b42586-082e-4da5-b1fd-5723992197fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 20 09:39:27 np0005588920 nova_compute[226886]: 2026-01-20 14:39:27.619 226890 DEBUG nova.network.os_vif_util [None req-4a8dd796-3d53-47fb-bffa-7c56caf5e8d3 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f1:e6:93,bridge_name='br-int',has_traffic_filtering=True,id=f6b42586-082e-4da5-b1fd-5723992197fe,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6b42586-08') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 20 09:39:27 np0005588920 nova_compute[226886]: 2026-01-20 14:39:27.620 226890 DEBUG os_vif [None req-4a8dd796-3d53-47fb-bffa-7c56caf5e8d3 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f1:e6:93,bridge_name='br-int',has_traffic_filtering=True,id=f6b42586-082e-4da5-b1fd-5723992197fe,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6b42586-08') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 20 09:39:27 np0005588920 nova_compute[226886]: 2026-01-20 14:39:27.621 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:39:27 np0005588920 nova_compute[226886]: 2026-01-20 14:39:27.621 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf6b42586-08, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 09:39:27 np0005588920 nova_compute[226886]: 2026-01-20 14:39:27.623 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:39:27 np0005588920 nova_compute[226886]: 2026-01-20 14:39:27.624 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:39:27 np0005588920 nova_compute[226886]: 2026-01-20 14:39:27.626 226890 INFO os_vif [None req-4a8dd796-3d53-47fb-bffa-7c56caf5e8d3 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f1:e6:93,bridge_name='br-int',has_traffic_filtering=True,id=f6b42586-082e-4da5-b1fd-5723992197fe,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6b42586-08')
Jan 20 09:39:27 np0005588920 nova_compute[226886]: 2026-01-20 14:39:27.627 226890 DEBUG nova.virt.libvirt.guest [None req-4a8dd796-3d53-47fb-bffa-7c56caf5e8d3 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <nova:name>tempest-AttachInterfacesTestJSON-server-1231961321</nova:name>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <nova:creationTime>2026-01-20 14:39:27</nova:creationTime>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <nova:flavor name="m1.nano">
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <nova:memory>128</nova:memory>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <nova:disk>1</nova:disk>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <nova:swap>0</nova:swap>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <nova:vcpus>1</nova:vcpus>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  </nova:flavor>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <nova:owner>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <nova:user uuid="c8a9fb458d27434495a77a94827b6097">tempest-AttachInterfacesTestJSON-305746947-project-member</nova:user>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <nova:project uuid="e3f93fd4b2154dda9f38e62334904303">tempest-AttachInterfacesTestJSON-305746947</nova:project>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  </nova:owner>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  <nova:ports>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    <nova:port uuid="82b46c8a-7331-4dba-b12c-3c4bd0d70a52">
Jan 20 09:39:27 np0005588920 nova_compute[226886]:      <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:    </nova:port>
Jan 20 09:39:27 np0005588920 nova_compute[226886]:  </nova:ports>
Jan 20 09:39:27 np0005588920 nova_compute[226886]: </nova:instance>
Jan 20 09:39:27 np0005588920 nova_compute[226886]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 20 09:39:27 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:27.629 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[3502db29-809f-4630-afff-486cb750ce1c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:27 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:27.656 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[7c7c7472-ef73-451c-924f-c4ff1e1faaa3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:27 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:27.659 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[c2888442-a2d1-4eca-b1c8-3edae8ae959f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:27 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:27.681 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[063ded53-b3b2-4217-83e8-66244b3d7412]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:27 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:27.696 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[59f9fb40-a09d-4169-93a2-0bcbda4a3592]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfc21b99b-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b1:5b:d2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 497615, 'reachable_time': 33551, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249247, 'error': None, 'target': 'ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:27 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:27.710 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a9b5dbeb-323e-4a8e-9f60-2d455232fa6d]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfc21b99b-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 497625, 'tstamp': 497625}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 249248, 'error': None, 'target': 'ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfc21b99b-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 497628, 'tstamp': 497628}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 249248, 'error': None, 'target': 'ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:27 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:27.712 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfc21b99b-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:27 np0005588920 nova_compute[226886]: 2026-01-20 14:39:27.745 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:27 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:27.746 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfc21b99b-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:27 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:27.747 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:39:27 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:27.747 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfc21b99b-40, col_values=(('external_ids', {'iface-id': '583df905-1d9f-49c1-b209-4b7fad1599f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:27 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:27.748 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:39:28 np0005588920 nova_compute[226886]: 2026-01-20 14:39:28.364 226890 DEBUG nova.network.neutron [req-95982c95-7fd8-44c2-bb34-ca22a961cbc4 req-89f5bb78-d1b2-491f-8f86-cc13d7a44c4a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Updated VIF entry in instance network info cache for port f6b42586-082e-4da5-b1fd-5723992197fe. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:39:28 np0005588920 nova_compute[226886]: 2026-01-20 14:39:28.365 226890 DEBUG nova.network.neutron [req-95982c95-7fd8-44c2-bb34-ca22a961cbc4 req-89f5bb78-d1b2-491f-8f86-cc13d7a44c4a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Updating instance_info_cache with network_info: [{"id": "82b46c8a-7331-4dba-b12c-3c4bd0d70a52", "address": "fa:16:3e:b2:0f:fa", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82b46c8a-73", "ovs_interfaceid": "82b46c8a-7331-4dba-b12c-3c4bd0d70a52", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "f6b42586-082e-4da5-b1fd-5723992197fe", "address": "fa:16:3e:f1:e6:93", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6b42586-08", "ovs_interfaceid": "f6b42586-082e-4da5-b1fd-5723992197fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:39:28 np0005588920 nova_compute[226886]: 2026-01-20 14:39:28.388 226890 DEBUG oslo_concurrency.lockutils [req-95982c95-7fd8-44c2-bb34-ca22a961cbc4 req-89f5bb78-d1b2-491f-8f86-cc13d7a44c4a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-c1a45fae-79ce-48c2-81b9-4d1e30165d46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:39:28 np0005588920 nova_compute[226886]: 2026-01-20 14:39:28.388 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquired lock "refresh_cache-c1a45fae-79ce-48c2-81b9-4d1e30165d46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:39:28 np0005588920 nova_compute[226886]: 2026-01-20 14:39:28.388 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 09:39:28 np0005588920 nova_compute[226886]: 2026-01-20 14:39:28.388 226890 DEBUG nova.objects.instance [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c1a45fae-79ce-48c2-81b9-4d1e30165d46 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:39:28 np0005588920 nova_compute[226886]: 2026-01-20 14:39:28.750 226890 DEBUG nova.network.neutron [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Updating instance_info_cache with network_info: [{"id": "9b0fc629-48f5-469d-90d2-c26339c16eec", "address": "fa:16:3e:11:9c:9c", "network": {"id": "b3a21065-10a5-474d-b42f-ffe66242a479", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest_v242-807218413-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b0fc629-48", "ovs_interfaceid": "9b0fc629-48f5-469d-90d2-c26339c16eec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "96b6b319-e96c-4182-b940-9f154499e22d", "address": "fa:16:3e:1a:ef:b0", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.91", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": 
null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96b6b319-e9", "ovs_interfaceid": "96b6b319-e96c-4182-b940-9f154499e22d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "a6658829-ef19-4914-bbb3-35b718691c7c", "address": "fa:16:3e:c8:a5:f5", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.112", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6658829-ef", "ovs_interfaceid": "a6658829-ef19-4914-bbb3-35b718691c7c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "b03b1a08-920f-4340-b77b-37669cc14a07", "address": "fa:16:3e:b9:ca:05", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.214", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb03b1a08-92", "ovs_interfaceid": "b03b1a08-920f-4340-b77b-37669cc14a07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7a84a919-d309-4f1a-99b8-4792dcdec990", "address": "fa:16:3e:c7:9a:c3", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.32", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a84a919-d3", "ovs_interfaceid": "7a84a919-d309-4f1a-99b8-4792dcdec990", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ff70124b-befc-46fa-b2cb-bc4bd4a49942", "address": "fa:16:3e:97:d4:a5", "network": {"id": "a36edd9d-12ce-4779-97fe-f75c00d85dcd", "bridge": "br-int", "label": "tempest-device-tagging-net2-56635846", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, 
"meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff70124b-be", "ovs_interfaceid": "ff70124b-befc-46fa-b2cb-bc4bd4a49942", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "5096b763-1b08-448f-a6bd-f63bcd65def6", "address": "fa:16:3e:5e:80:45", "network": {"id": "a36edd9d-12ce-4779-97fe-f75c00d85dcd", "bridge": "br-int", "label": "tempest-device-tagging-net2-56635846", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5096b763-1b", "ovs_interfaceid": "5096b763-1b08-448f-a6bd-f63bcd65def6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:39:28 np0005588920 nova_compute[226886]: 2026-01-20 14:39:28.786 226890 DEBUG oslo_concurrency.lockutils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Releasing lock "refresh_cache-795b0a95-448b-49b1-80cb-a18e84101480" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:39:28 np0005588920 nova_compute[226886]: 2026-01-20 14:39:28.786 226890 DEBUG nova.compute.manager [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Instance network_info: |[{"id": "9b0fc629-48f5-469d-90d2-c26339c16eec", "address": "fa:16:3e:11:9c:9c", "network": {"id": "b3a21065-10a5-474d-b42f-ffe66242a479", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest_v242-807218413-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b0fc629-48", "ovs_interfaceid": "9b0fc629-48f5-469d-90d2-c26339c16eec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "96b6b319-e96c-4182-b940-9f154499e22d", "address": "fa:16:3e:1a:ef:b0", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.91", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": 
true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96b6b319-e9", "ovs_interfaceid": "96b6b319-e96c-4182-b940-9f154499e22d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "a6658829-ef19-4914-bbb3-35b718691c7c", "address": "fa:16:3e:c8:a5:f5", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.112", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6658829-ef", "ovs_interfaceid": "a6658829-ef19-4914-bbb3-35b718691c7c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "b03b1a08-920f-4340-b77b-37669cc14a07", "address": "fa:16:3e:b9:ca:05", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.214", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb03b1a08-92", "ovs_interfaceid": "b03b1a08-920f-4340-b77b-37669cc14a07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7a84a919-d309-4f1a-99b8-4792dcdec990", "address": "fa:16:3e:c7:9a:c3", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.32", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a84a919-d3", "ovs_interfaceid": "7a84a919-d309-4f1a-99b8-4792dcdec990", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ff70124b-befc-46fa-b2cb-bc4bd4a49942", "address": "fa:16:3e:97:d4:a5", "network": {"id": "a36edd9d-12ce-4779-97fe-f75c00d85dcd", "bridge": "br-int", "label": "tempest-device-tagging-net2-56635846", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, 
"meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff70124b-be", "ovs_interfaceid": "ff70124b-befc-46fa-b2cb-bc4bd4a49942", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "5096b763-1b08-448f-a6bd-f63bcd65def6", "address": "fa:16:3e:5e:80:45", "network": {"id": "a36edd9d-12ce-4779-97fe-f75c00d85dcd", "bridge": "br-int", "label": "tempest-device-tagging-net2-56635846", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5096b763-1b", "ovs_interfaceid": "5096b763-1b08-448f-a6bd-f63bcd65def6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:39:28 np0005588920 nova_compute[226886]: 2026-01-20 14:39:28.787 226890 DEBUG oslo_concurrency.lockutils [req-bfa29c13-cab6-4daf-9ad7-9e485f8fccc2 req-01c224ce-f871-4143-a5fc-66d579e627ba 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-795b0a95-448b-49b1-80cb-a18e84101480" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:39:28 np0005588920 nova_compute[226886]: 2026-01-20 14:39:28.787 226890 DEBUG nova.network.neutron [req-bfa29c13-cab6-4daf-9ad7-9e485f8fccc2 req-01c224ce-f871-4143-a5fc-66d579e627ba 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Refreshing network info cache for port ff70124b-befc-46fa-b2cb-bc4bd4a49942 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:39:28 np0005588920 nova_compute[226886]: 2026-01-20 14:39:28.796 226890 DEBUG nova.virt.libvirt.driver [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Start _get_guest_xml network_info=[{"id": "9b0fc629-48f5-469d-90d2-c26339c16eec", "address": "fa:16:3e:11:9c:9c", "network": {"id": "b3a21065-10a5-474d-b42f-ffe66242a479", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest_v242-807218413-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b0fc629-48", "ovs_interfaceid": "9b0fc629-48f5-469d-90d2-c26339c16eec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "96b6b319-e96c-4182-b940-9f154499e22d", "address": "fa:16:3e:1a:ef:b0", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.91", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, 
"tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96b6b319-e9", "ovs_interfaceid": "96b6b319-e96c-4182-b940-9f154499e22d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "a6658829-ef19-4914-bbb3-35b718691c7c", "address": "fa:16:3e:c8:a5:f5", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.112", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6658829-ef", "ovs_interfaceid": "a6658829-ef19-4914-bbb3-35b718691c7c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "b03b1a08-920f-4340-b77b-37669cc14a07", "address": "fa:16:3e:b9:ca:05", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.214", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb03b1a08-92", "ovs_interfaceid": "b03b1a08-920f-4340-b77b-37669cc14a07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7a84a919-d309-4f1a-99b8-4792dcdec990", "address": "fa:16:3e:c7:9a:c3", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.32", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a84a919-d3", "ovs_interfaceid": "7a84a919-d309-4f1a-99b8-4792dcdec990", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ff70124b-befc-46fa-b2cb-bc4bd4a49942", "address": "fa:16:3e:97:d4:a5", "network": {"id": "a36edd9d-12ce-4779-97fe-f75c00d85dcd", "bridge": "br-int", "label": "tempest-device-tagging-net2-56635846", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, 
"meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff70124b-be", "ovs_interfaceid": "ff70124b-befc-46fa-b2cb-bc4bd4a49942", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "5096b763-1b08-448f-a6bd-f63bcd65def6", "address": "fa:16:3e:5e:80:45", "network": {"id": "a36edd9d-12ce-4779-97fe-f75c00d85dcd", "bridge": "br-int", "label": "tempest-device-tagging-net2-56635846", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5096b763-1b", "ovs_interfaceid": "5096b763-1b08-448f-a6bd-f63bcd65def6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vdb': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk', 'boot_index': '2'}, '/dev/vdc': {'bus': 'virtio', 'dev': 'vdc', 'type': 'disk', 'boot_index': '3'}, 
'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T1
Jan 20 09:39:28 np0005588920 nova_compute[226886]: in_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'device_type': 'disk', 'boot_index': 0, 'delete_on_termination': False, 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-8e23d5c7-a222-4e45-8e31-6afe42582e8d', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '8e23d5c7-a222-4e45-8e31-6afe42582e8d', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '795b0a95-448b-49b1-80cb-a18e84101480', 'attached_at': '', 'detached_at': '', 'volume_id': '8e23d5c7-a222-4e45-8e31-6afe42582e8d', 'serial': '8e23d5c7-a222-4e45-8e31-6afe42582e8d'}, 'mount_device': '/dev/vda', 'guest_format': None, 'disk_bus': 'virtio', 'attachment_id': 'e6db4319-d483-4b50-84db-2afa00acac8f', 'volume_type': None}, {'device_type': 'disk', 'boot_index': 1, 'delete_on_termination': False, 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-41d04608-e9dd-4b22-8440-9cab803a24b7', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '41d04608-e9dd-4b22-8440-9cab803a24b7', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '795b0a95-448b-49b1-80cb-a18e84101480', 'attached_at': '', 
'detached_at': '', 'volume_id': '41d04608-e9dd-4b22-8440-9cab803a24b7', 'serial': '41d04608-e9dd-4b22-8440-9cab803a24b7'}, 'mount_device': '/dev/vdb', 'guest_format': None, 'disk_bus': 'virtio', 'attachment_id': '2033d6c6-f159-4dd7-9887-2993ee4695aa', 'volume_type': None}, {'device_type': 'disk', 'boot_index': 2, 'delete_on_termination': False, 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-419220b3-a6cb-447f-be9b-d4de4cac4b79', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '419220b3-a6cb-447f-be9b-d4de4cac4b79', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '795b0a95-448b-49b1-80cb-a18e84101480', 'attached_at': '', 'detached_at': '', 'volume_id': '419220b3-a6cb-447f-be9b-d4de4cac4b79', 'serial': '419220b3-a6cb-447f-be9b-d4de4cac4b79'}, 'mount_device': '/dev/vdc', 'guest_format': None, 'disk_bus': 'virtio', 'attachment_id': 'a391e283-eb10-4224-b18a-131281c04977', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:39:28 np0005588920 nova_compute[226886]: 2026-01-20 14:39:28.800 226890 WARNING nova.virt.libvirt.driver [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:39:28 np0005588920 nova_compute[226886]: 2026-01-20 14:39:28.810 226890 DEBUG nova.virt.libvirt.host [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:39:28 np0005588920 nova_compute[226886]: 2026-01-20 14:39:28.811 226890 DEBUG nova.virt.libvirt.host [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:39:28 np0005588920 nova_compute[226886]: 2026-01-20 14:39:28.814 226890 DEBUG nova.virt.libvirt.host [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:39:28 np0005588920 nova_compute[226886]: 2026-01-20 14:39:28.815 226890 DEBUG nova.virt.libvirt.host [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:39:28 np0005588920 nova_compute[226886]: 2026-01-20 14:39:28.816 226890 DEBUG nova.virt.libvirt.driver [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:39:28 np0005588920 nova_compute[226886]: 2026-01-20 14:39:28.816 226890 DEBUG nova.virt.hardware [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:39:28 np0005588920 nova_compute[226886]: 2026-01-20 14:39:28.816 226890 DEBUG nova.virt.hardware [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:39:28 np0005588920 nova_compute[226886]: 2026-01-20 14:39:28.816 226890 DEBUG nova.virt.hardware [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:39:28 np0005588920 nova_compute[226886]: 2026-01-20 14:39:28.817 226890 DEBUG nova.virt.hardware [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:39:28 np0005588920 nova_compute[226886]: 2026-01-20 14:39:28.817 226890 DEBUG nova.virt.hardware [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:39:28 np0005588920 nova_compute[226886]: 2026-01-20 14:39:28.817 226890 DEBUG nova.virt.hardware [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:39:28 np0005588920 nova_compute[226886]: 2026-01-20 14:39:28.817 226890 DEBUG nova.virt.hardware [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:39:28 np0005588920 nova_compute[226886]: 2026-01-20 14:39:28.817 226890 DEBUG nova.virt.hardware [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:39:28 np0005588920 nova_compute[226886]: 2026-01-20 14:39:28.817 226890 DEBUG nova.virt.hardware [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:39:28 np0005588920 nova_compute[226886]: 2026-01-20 14:39:28.817 226890 DEBUG nova.virt.hardware [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:39:28 np0005588920 nova_compute[226886]: 2026-01-20 14:39:28.818 226890 DEBUG nova.virt.hardware [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:39:28 np0005588920 nova_compute[226886]: 2026-01-20 14:39:28.845 226890 DEBUG nova.storage.rbd_utils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] rbd image 795b0a95-448b-49b1-80cb-a18e84101480_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:39:28 np0005588920 nova_compute[226886]: 2026-01-20 14:39:28.849 226890 DEBUG oslo_concurrency.processutils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:39:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:29.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:29 np0005588920 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2026-01-20 14:39:28.796 226890 DEBUG nova.virt.libvirt.driver [None req-9d8bff96 [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.254 226890 DEBUG oslo_concurrency.lockutils [None req-4a8dd796-3d53-47fb-bffa-7c56caf5e8d3 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquiring lock "refresh_cache-c1a45fae-79ce-48c2-81b9-4d1e30165d46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:39:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:39:29 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3331275778' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.343 226890 DEBUG oslo_concurrency.processutils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.494s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.363 226890 DEBUG nova.compute.manager [req-808dc92a-615f-4382-8c77-c8fe80c044f5 req-c56e1d82-953a-43d1-b784-31e5bde9fbce 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Received event network-vif-unplugged-f6b42586-082e-4da5-b1fd-5723992197fe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.364 226890 DEBUG oslo_concurrency.lockutils [req-808dc92a-615f-4382-8c77-c8fe80c044f5 req-c56e1d82-953a-43d1-b784-31e5bde9fbce 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "c1a45fae-79ce-48c2-81b9-4d1e30165d46-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.364 226890 DEBUG oslo_concurrency.lockutils [req-808dc92a-615f-4382-8c77-c8fe80c044f5 req-c56e1d82-953a-43d1-b784-31e5bde9fbce 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c1a45fae-79ce-48c2-81b9-4d1e30165d46-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.365 226890 DEBUG oslo_concurrency.lockutils [req-808dc92a-615f-4382-8c77-c8fe80c044f5 req-c56e1d82-953a-43d1-b784-31e5bde9fbce 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c1a45fae-79ce-48c2-81b9-4d1e30165d46-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.365 226890 DEBUG nova.compute.manager [req-808dc92a-615f-4382-8c77-c8fe80c044f5 req-c56e1d82-953a-43d1-b784-31e5bde9fbce 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] No waiting events found dispatching network-vif-unplugged-f6b42586-082e-4da5-b1fd-5723992197fe pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.365 226890 WARNING nova.compute.manager [req-808dc92a-615f-4382-8c77-c8fe80c044f5 req-c56e1d82-953a-43d1-b784-31e5bde9fbce 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Received unexpected event network-vif-unplugged-f6b42586-082e-4da5-b1fd-5723992197fe for instance with vm_state active and task_state None.#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.366 226890 DEBUG nova.compute.manager [req-808dc92a-615f-4382-8c77-c8fe80c044f5 req-c56e1d82-953a-43d1-b784-31e5bde9fbce 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Received event network-vif-plugged-f6b42586-082e-4da5-b1fd-5723992197fe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.366 226890 DEBUG oslo_concurrency.lockutils [req-808dc92a-615f-4382-8c77-c8fe80c044f5 req-c56e1d82-953a-43d1-b784-31e5bde9fbce 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "c1a45fae-79ce-48c2-81b9-4d1e30165d46-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.366 226890 DEBUG oslo_concurrency.lockutils [req-808dc92a-615f-4382-8c77-c8fe80c044f5 req-c56e1d82-953a-43d1-b784-31e5bde9fbce 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c1a45fae-79ce-48c2-81b9-4d1e30165d46-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.367 226890 DEBUG oslo_concurrency.lockutils [req-808dc92a-615f-4382-8c77-c8fe80c044f5 req-c56e1d82-953a-43d1-b784-31e5bde9fbce 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c1a45fae-79ce-48c2-81b9-4d1e30165d46-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.367 226890 DEBUG nova.compute.manager [req-808dc92a-615f-4382-8c77-c8fe80c044f5 req-c56e1d82-953a-43d1-b784-31e5bde9fbce 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] No waiting events found dispatching network-vif-plugged-f6b42586-082e-4da5-b1fd-5723992197fe pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.367 226890 WARNING nova.compute.manager [req-808dc92a-615f-4382-8c77-c8fe80c044f5 req-c56e1d82-953a-43d1-b784-31e5bde9fbce 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Received unexpected event network-vif-plugged-f6b42586-082e-4da5-b1fd-5723992197fe for instance with vm_state active and task_state None.#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.415 226890 DEBUG nova.virt.libvirt.vif [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:39:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-68911616',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-68911616',id=64,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPmqoi4p8kSyWXNxQ3a/6cHt6bcPdJIB4+7iQjfjSS/GZvQEk00ft0q8g9eYHEm/6qNbWlrQRShcWhErCzsftWYt7Pg9lwI5WvcUf4Z28u7I0nTtYrQ91Z0PLrzqP57aiA==',key_name='tempest-keypair-396780534',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8705404c3964472782118e478eb54e51',ramdisk_id='',reservation_id='r-q1o120zj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',ima
ge_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest_v242-1563719502',owner_user_name='tempest-TaggedBootDevicesTest_v242-1563719502-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:39:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='cd4ba32a01f74af199438da0b72e5a4d',uuid=795b0a95-448b-49b1-80cb-a18e84101480,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9b0fc629-48f5-469d-90d2-c26339c16eec", "address": "fa:16:3e:11:9c:9c", "network": {"id": "b3a21065-10a5-474d-b42f-ffe66242a479", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest_v242-807218413-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b0fc629-48", "ovs_interfaceid": "9b0fc629-48f5-469d-90d2-c26339c16eec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.416 226890 DEBUG nova.network.os_vif_util [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Converting VIF {"id": "9b0fc629-48f5-469d-90d2-c26339c16eec", "address": "fa:16:3e:11:9c:9c", "network": {"id": "b3a21065-10a5-474d-b42f-ffe66242a479", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest_v242-807218413-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b0fc629-48", "ovs_interfaceid": "9b0fc629-48f5-469d-90d2-c26339c16eec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.417 226890 DEBUG nova.network.os_vif_util [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:11:9c:9c,bridge_name='br-int',has_traffic_filtering=True,id=9b0fc629-48f5-469d-90d2-c26339c16eec,network=Network(b3a21065-10a5-474d-b42f-ffe66242a479),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9b0fc629-48') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.418 226890 DEBUG nova.virt.libvirt.vif [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:39:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-68911616',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-68911616',id=64,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPmqoi4p8kSyWXNxQ3a/6cHt6bcPdJIB4+7iQjfjSS/GZvQEk00ft0q8g9eYHEm/6qNbWlrQRShcWhErCzsftWYt7Pg9lwI5WvcUf4Z28u7I0nTtYrQ91Z0PLrzqP57aiA==',key_name='tempest-keypair-396780534',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8705404c3964472782118e478eb54e51',ramdisk_id='',reservation_id='r-q1o120zj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',ima
ge_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest_v242-1563719502',owner_user_name='tempest-TaggedBootDevicesTest_v242-1563719502-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:39:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='cd4ba32a01f74af199438da0b72e5a4d',uuid=795b0a95-448b-49b1-80cb-a18e84101480,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "96b6b319-e96c-4182-b940-9f154499e22d", "address": "fa:16:3e:1a:ef:b0", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.91", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96b6b319-e9", "ovs_interfaceid": "96b6b319-e96c-4182-b940-9f154499e22d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.418 226890 DEBUG nova.network.os_vif_util [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Converting VIF {"id": "96b6b319-e96c-4182-b940-9f154499e22d", "address": "fa:16:3e:1a:ef:b0", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.91", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96b6b319-e9", "ovs_interfaceid": "96b6b319-e96c-4182-b940-9f154499e22d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.419 226890 DEBUG nova.network.os_vif_util [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1a:ef:b0,bridge_name='br-int',has_traffic_filtering=True,id=96b6b319-e96c-4182-b940-9f154499e22d,network=Network(d12cbe78-47c0-4f23-98a7-4ed621f8c3a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap96b6b319-e9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.420 226890 DEBUG nova.virt.libvirt.vif [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:39:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-68911616',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-68911616',id=64,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPmqoi4p8kSyWXNxQ3a/6cHt6bcPdJIB4+7iQjfjSS/GZvQEk00ft0q8g9eYHEm/6qNbWlrQRShcWhErCzsftWYt7Pg9lwI5WvcUf4Z28u7I0nTtYrQ91Z0PLrzqP57aiA==',key_name='tempest-keypair-396780534',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8705404c3964472782118e478eb54e51',ramdisk_id='',reservation_id='r-q1o120zj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',ima
ge_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest_v242-1563719502',owner_user_name='tempest-TaggedBootDevicesTest_v242-1563719502-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:39:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='cd4ba32a01f74af199438da0b72e5a4d',uuid=795b0a95-448b-49b1-80cb-a18e84101480,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a6658829-ef19-4914-bbb3-35b718691c7c", "address": "fa:16:3e:c8:a5:f5", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.112", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6658829-ef", "ovs_interfaceid": "a6658829-ef19-4914-bbb3-35b718691c7c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.421 226890 DEBUG nova.network.os_vif_util [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Converting VIF {"id": "a6658829-ef19-4914-bbb3-35b718691c7c", "address": "fa:16:3e:c8:a5:f5", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.112", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6658829-ef", "ovs_interfaceid": "a6658829-ef19-4914-bbb3-35b718691c7c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.421 226890 DEBUG nova.network.os_vif_util [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c8:a5:f5,bridge_name='br-int',has_traffic_filtering=True,id=a6658829-ef19-4914-bbb3-35b718691c7c,network=Network(d12cbe78-47c0-4f23-98a7-4ed621f8c3a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa6658829-ef') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.422 226890 DEBUG nova.virt.libvirt.vif [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:39:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-68911616',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-68911616',id=64,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPmqoi4p8kSyWXNxQ3a/6cHt6bcPdJIB4+7iQjfjSS/GZvQEk00ft0q8g9eYHEm/6qNbWlrQRShcWhErCzsftWYt7Pg9lwI5WvcUf4Z28u7I0nTtYrQ91Z0PLrzqP57aiA==',key_name='tempest-keypair-396780534',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8705404c3964472782118e478eb54e51',ramdisk_id='',reservation_id='r-q1o120zj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',ima
ge_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest_v242-1563719502',owner_user_name='tempest-TaggedBootDevicesTest_v242-1563719502-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:39:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='cd4ba32a01f74af199438da0b72e5a4d',uuid=795b0a95-448b-49b1-80cb-a18e84101480,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b03b1a08-920f-4340-b77b-37669cc14a07", "address": "fa:16:3e:b9:ca:05", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.214", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb03b1a08-92", "ovs_interfaceid": "b03b1a08-920f-4340-b77b-37669cc14a07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.423 226890 DEBUG nova.network.os_vif_util [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Converting VIF {"id": "b03b1a08-920f-4340-b77b-37669cc14a07", "address": "fa:16:3e:b9:ca:05", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.214", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb03b1a08-92", "ovs_interfaceid": "b03b1a08-920f-4340-b77b-37669cc14a07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.424 226890 DEBUG nova.network.os_vif_util [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b9:ca:05,bridge_name='br-int',has_traffic_filtering=True,id=b03b1a08-920f-4340-b77b-37669cc14a07,network=Network(d12cbe78-47c0-4f23-98a7-4ed621f8c3a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb03b1a08-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.424 226890 DEBUG nova.virt.libvirt.vif [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:39:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-68911616',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-68911616',id=64,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPmqoi4p8kSyWXNxQ3a/6cHt6bcPdJIB4+7iQjfjSS/GZvQEk00ft0q8g9eYHEm/6qNbWlrQRShcWhErCzsftWYt7Pg9lwI5WvcUf4Z28u7I0nTtYrQ91Z0PLrzqP57aiA==',key_name='tempest-keypair-396780534',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8705404c3964472782118e478eb54e51',ramdisk_id='',reservation_id='r-q1o120zj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',ima
ge_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest_v242-1563719502',owner_user_name='tempest-TaggedBootDevicesTest_v242-1563719502-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:39:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='cd4ba32a01f74af199438da0b72e5a4d',uuid=795b0a95-448b-49b1-80cb-a18e84101480,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7a84a919-d309-4f1a-99b8-4792dcdec990", "address": "fa:16:3e:c7:9a:c3", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.32", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a84a919-d3", "ovs_interfaceid": "7a84a919-d309-4f1a-99b8-4792dcdec990", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.425 226890 DEBUG nova.network.os_vif_util [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Converting VIF {"id": "7a84a919-d309-4f1a-99b8-4792dcdec990", "address": "fa:16:3e:c7:9a:c3", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.32", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a84a919-d3", "ovs_interfaceid": "7a84a919-d309-4f1a-99b8-4792dcdec990", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.426 226890 DEBUG nova.network.os_vif_util [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c7:9a:c3,bridge_name='br-int',has_traffic_filtering=True,id=7a84a919-d309-4f1a-99b8-4792dcdec990,network=Network(d12cbe78-47c0-4f23-98a7-4ed621f8c3a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a84a919-d3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.427 226890 DEBUG nova.virt.libvirt.vif [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:39:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-68911616',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-68911616',id=64,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPmqoi4p8kSyWXNxQ3a/6cHt6bcPdJIB4+7iQjfjSS/GZvQEk00ft0q8g9eYHEm/6qNbWlrQRShcWhErCzsftWYt7Pg9lwI5WvcUf4Z28u7I0nTtYrQ91Z0PLrzqP57aiA==',key_name='tempest-keypair-396780534',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8705404c3964472782118e478eb54e51',ramdisk_id='',reservation_id='r-q1o120zj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',ima
ge_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest_v242-1563719502',owner_user_name='tempest-TaggedBootDevicesTest_v242-1563719502-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:39:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='cd4ba32a01f74af199438da0b72e5a4d',uuid=795b0a95-448b-49b1-80cb-a18e84101480,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ff70124b-befc-46fa-b2cb-bc4bd4a49942", "address": "fa:16:3e:97:d4:a5", "network": {"id": "a36edd9d-12ce-4779-97fe-f75c00d85dcd", "bridge": "br-int", "label": "tempest-device-tagging-net2-56635846", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff70124b-be", "ovs_interfaceid": "ff70124b-befc-46fa-b2cb-bc4bd4a49942", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.427 226890 DEBUG nova.network.os_vif_util [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Converting VIF {"id": "ff70124b-befc-46fa-b2cb-bc4bd4a49942", "address": "fa:16:3e:97:d4:a5", "network": {"id": "a36edd9d-12ce-4779-97fe-f75c00d85dcd", "bridge": "br-int", "label": "tempest-device-tagging-net2-56635846", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff70124b-be", "ovs_interfaceid": "ff70124b-befc-46fa-b2cb-bc4bd4a49942", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.428 226890 DEBUG nova.network.os_vif_util [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:97:d4:a5,bridge_name='br-int',has_traffic_filtering=True,id=ff70124b-befc-46fa-b2cb-bc4bd4a49942,network=Network(a36edd9d-12ce-4779-97fe-f75c00d85dcd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapff70124b-be') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.429 226890 DEBUG nova.virt.libvirt.vif [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:39:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-68911616',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-68911616',id=64,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPmqoi4p8kSyWXNxQ3a/6cHt6bcPdJIB4+7iQjfjSS/GZvQEk00ft0q8g9eYHEm/6qNbWlrQRShcWhErCzsftWYt7Pg9lwI5WvcUf4Z28u7I0nTtYrQ91Z0PLrzqP57aiA==',key_name='tempest-keypair-396780534',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8705404c3964472782118e478eb54e51',ramdisk_id='',reservation_id='r-q1o120zj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',ima
ge_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest_v242-1563719502',owner_user_name='tempest-TaggedBootDevicesTest_v242-1563719502-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:39:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='cd4ba32a01f74af199438da0b72e5a4d',uuid=795b0a95-448b-49b1-80cb-a18e84101480,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5096b763-1b08-448f-a6bd-f63bcd65def6", "address": "fa:16:3e:5e:80:45", "network": {"id": "a36edd9d-12ce-4779-97fe-f75c00d85dcd", "bridge": "br-int", "label": "tempest-device-tagging-net2-56635846", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5096b763-1b", "ovs_interfaceid": "5096b763-1b08-448f-a6bd-f63bcd65def6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.429 226890 DEBUG nova.network.os_vif_util [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Converting VIF {"id": "5096b763-1b08-448f-a6bd-f63bcd65def6", "address": "fa:16:3e:5e:80:45", "network": {"id": "a36edd9d-12ce-4779-97fe-f75c00d85dcd", "bridge": "br-int", "label": "tempest-device-tagging-net2-56635846", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5096b763-1b", "ovs_interfaceid": "5096b763-1b08-448f-a6bd-f63bcd65def6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.430 226890 DEBUG nova.network.os_vif_util [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5e:80:45,bridge_name='br-int',has_traffic_filtering=True,id=5096b763-1b08-448f-a6bd-f63bcd65def6,network=Network(a36edd9d-12ce-4779-97fe-f75c00d85dcd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5096b763-1b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.431 226890 DEBUG nova.objects.instance [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Lazy-loading 'pci_devices' on Instance uuid 795b0a95-448b-49b1-80cb-a18e84101480 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.445 226890 DEBUG nova.virt.libvirt.driver [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:39:29 np0005588920 nova_compute[226886]:  <uuid>795b0a95-448b-49b1-80cb-a18e84101480</uuid>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:  <name>instance-00000040</name>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <nova:name>tempest-device-tagging-server-68911616</nova:name>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:39:28</nova:creationTime>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:39:29 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:        <nova:user uuid="cd4ba32a01f74af199438da0b72e5a4d">tempest-TaggedBootDevicesTest_v242-1563719502-project-member</nova:user>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:        <nova:project uuid="8705404c3964472782118e478eb54e51">tempest-TaggedBootDevicesTest_v242-1563719502</nova:project>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:        <nova:port uuid="9b0fc629-48f5-469d-90d2-c26339c16eec">
Jan 20 09:39:29 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:        <nova:port uuid="96b6b319-e96c-4182-b940-9f154499e22d">
Jan 20 09:39:29 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.1.1.91" ipVersion="4"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:        <nova:port uuid="a6658829-ef19-4914-bbb3-35b718691c7c">
Jan 20 09:39:29 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.1.1.112" ipVersion="4"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:        <nova:port uuid="b03b1a08-920f-4340-b77b-37669cc14a07">
Jan 20 09:39:29 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.1.1.214" ipVersion="4"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:        <nova:port uuid="7a84a919-d309-4f1a-99b8-4792dcdec990">
Jan 20 09:39:29 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.1.1.32" ipVersion="4"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:        <nova:port uuid="ff70124b-befc-46fa-b2cb-bc4bd4a49942">
Jan 20 09:39:29 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.2.2.100" ipVersion="4"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:        <nova:port uuid="5096b763-1b08-448f-a6bd-f63bcd65def6">
Jan 20 09:39:29 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.2.2.200" ipVersion="4"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <entry name="serial">795b0a95-448b-49b1-80cb-a18e84101480</entry>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <entry name="uuid">795b0a95-448b-49b1-80cb-a18e84101480</entry>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/795b0a95-448b-49b1-80cb-a18e84101480_disk.config">
Jan 20 09:39:29 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:39:29 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="volumes/volume-8e23d5c7-a222-4e45-8e31-6afe42582e8d">
Jan 20 09:39:29 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:39:29 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <serial>8e23d5c7-a222-4e45-8e31-6afe42582e8d</serial>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="volumes/volume-41d04608-e9dd-4b22-8440-9cab803a24b7">
Jan 20 09:39:29 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:39:29 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <target dev="vdb" bus="virtio"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <serial>41d04608-e9dd-4b22-8440-9cab803a24b7</serial>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="volumes/volume-419220b3-a6cb-447f-be9b-d4de4cac4b79">
Jan 20 09:39:29 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:39:29 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <target dev="vdc" bus="virtio"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <serial>419220b3-a6cb-447f-be9b-d4de4cac4b79</serial>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:11:9c:9c"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <target dev="tap9b0fc629-48"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:1a:ef:b0"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <target dev="tap96b6b319-e9"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:c8:a5:f5"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <target dev="tapa6658829-ef"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:b9:ca:05"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <target dev="tapb03b1a08-92"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:c7:9a:c3"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <target dev="tap7a84a919-d3"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:97:d4:a5"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <target dev="tapff70124b-be"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:5e:80:45"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <target dev="tap5096b763-1b"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/795b0a95-448b-49b1-80cb-a18e84101480/console.log" append="off"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:39:29 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:39:29 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:39:29 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:39:29 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.445 226890 DEBUG nova.compute.manager [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Preparing to wait for external event network-vif-plugged-9b0fc629-48f5-469d-90d2-c26339c16eec prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.445 226890 DEBUG oslo_concurrency.lockutils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Acquiring lock "795b0a95-448b-49b1-80cb-a18e84101480-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.446 226890 DEBUG oslo_concurrency.lockutils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.446 226890 DEBUG oslo_concurrency.lockutils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.446 226890 DEBUG nova.compute.manager [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Preparing to wait for external event network-vif-plugged-96b6b319-e96c-4182-b940-9f154499e22d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.446 226890 DEBUG oslo_concurrency.lockutils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Acquiring lock "795b0a95-448b-49b1-80cb-a18e84101480-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.446 226890 DEBUG oslo_concurrency.lockutils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.446 226890 DEBUG oslo_concurrency.lockutils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.447 226890 DEBUG nova.compute.manager [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Preparing to wait for external event network-vif-plugged-a6658829-ef19-4914-bbb3-35b718691c7c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.447 226890 DEBUG oslo_concurrency.lockutils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Acquiring lock "795b0a95-448b-49b1-80cb-a18e84101480-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.447 226890 DEBUG oslo_concurrency.lockutils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.447 226890 DEBUG oslo_concurrency.lockutils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.447 226890 DEBUG nova.compute.manager [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Preparing to wait for external event network-vif-plugged-b03b1a08-920f-4340-b77b-37669cc14a07 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.447 226890 DEBUG oslo_concurrency.lockutils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Acquiring lock "795b0a95-448b-49b1-80cb-a18e84101480-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.448 226890 DEBUG oslo_concurrency.lockutils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.448 226890 DEBUG oslo_concurrency.lockutils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.448 226890 DEBUG nova.compute.manager [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Preparing to wait for external event network-vif-plugged-7a84a919-d309-4f1a-99b8-4792dcdec990 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.448 226890 DEBUG oslo_concurrency.lockutils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Acquiring lock "795b0a95-448b-49b1-80cb-a18e84101480-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.448 226890 DEBUG oslo_concurrency.lockutils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.448 226890 DEBUG oslo_concurrency.lockutils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.449 226890 DEBUG nova.compute.manager [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Preparing to wait for external event network-vif-plugged-ff70124b-befc-46fa-b2cb-bc4bd4a49942 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.449 226890 DEBUG oslo_concurrency.lockutils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Acquiring lock "795b0a95-448b-49b1-80cb-a18e84101480-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.449 226890 DEBUG oslo_concurrency.lockutils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.449 226890 DEBUG oslo_concurrency.lockutils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.449 226890 DEBUG nova.compute.manager [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Preparing to wait for external event network-vif-plugged-5096b763-1b08-448f-a6bd-f63bcd65def6 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.449 226890 DEBUG oslo_concurrency.lockutils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Acquiring lock "795b0a95-448b-49b1-80cb-a18e84101480-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.449 226890 DEBUG oslo_concurrency.lockutils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.450 226890 DEBUG oslo_concurrency.lockutils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.450 226890 DEBUG nova.virt.libvirt.vif [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:39:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-68911616',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-68911616',id=64,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPmqoi4p8kSyWXNxQ3a/6cHt6bcPdJIB4+7iQjfjSS/GZvQEk00ft0q8g9eYHEm/6qNbWlrQRShcWhErCzsftWYt7Pg9lwI5WvcUf4Z28u7I0nTtYrQ91Z0PLrzqP57aiA==',key_name='tempest-keypair-396780534',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8705404c3964472782118e478eb54e51',ramdisk_id='',reservation_id='r-q1o120zj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest_v242-1563719502',owner_user_name='tempest-TaggedBootDevicesTest_v242-1563719502-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:39:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='cd4ba32a01f74af199438da0b72e5a4d',uuid=795b0a95-448b-49b1-80cb-a18e84101480,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9b0fc629-48f5-469d-90d2-c26339c16eec", "address": "fa:16:3e:11:9c:9c", "network": {"id": "b3a21065-10a5-474d-b42f-ffe66242a479", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest_v242-807218413-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b0fc629-48", "ovs_interfaceid": "9b0fc629-48f5-469d-90d2-c26339c16eec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.450 226890 DEBUG nova.network.os_vif_util [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Converting VIF {"id": "9b0fc629-48f5-469d-90d2-c26339c16eec", "address": "fa:16:3e:11:9c:9c", "network": {"id": "b3a21065-10a5-474d-b42f-ffe66242a479", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest_v242-807218413-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b0fc629-48", "ovs_interfaceid": "9b0fc629-48f5-469d-90d2-c26339c16eec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.451 226890 DEBUG nova.network.os_vif_util [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:11:9c:9c,bridge_name='br-int',has_traffic_filtering=True,id=9b0fc629-48f5-469d-90d2-c26339c16eec,network=Network(b3a21065-10a5-474d-b42f-ffe66242a479),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9b0fc629-48') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.451 226890 DEBUG os_vif [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:11:9c:9c,bridge_name='br-int',has_traffic_filtering=True,id=9b0fc629-48f5-469d-90d2-c26339c16eec,network=Network(b3a21065-10a5-474d-b42f-ffe66242a479),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9b0fc629-48') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.452 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.452 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.453 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.457 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.458 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9b0fc629-48, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.458 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9b0fc629-48, col_values=(('external_ids', {'iface-id': '9b0fc629-48f5-469d-90d2-c26339c16eec', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:11:9c:9c', 'vm-uuid': '795b0a95-448b-49b1-80cb-a18e84101480'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.460 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:29 np0005588920 NetworkManager[49076]: <info>  [1768919969.4609] manager: (tap9b0fc629-48): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/101)
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.462 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.468 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.470 226890 INFO os_vif [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:11:9c:9c,bridge_name='br-int',has_traffic_filtering=True,id=9b0fc629-48f5-469d-90d2-c26339c16eec,network=Network(b3a21065-10a5-474d-b42f-ffe66242a479),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9b0fc629-48')#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.471 226890 DEBUG nova.virt.libvirt.vif [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:39:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-68911616',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-68911616',id=64,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPmqoi4p8kSyWXNxQ3a/6cHt6bcPdJIB4+7iQjfjSS/GZvQEk00ft0q8g9eYHEm/6qNbWlrQRShcWhErCzsftWYt7Pg9lwI5WvcUf4Z28u7I0nTtYrQ91Z0PLrzqP57aiA==',key_name='tempest-keypair-396780534',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8705404c3964472782118e478eb54e51',ramdisk_id='',reservation_id='r-q1o120zj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest_v242-1563719502',owner_user_name='tempest-TaggedBootDevicesTest_v242-1563719502-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:39:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='cd4ba32a01f74af199438da0b72e5a4d',uuid=795b0a95-448b-49b1-80cb-a18e84101480,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "96b6b319-e96c-4182-b940-9f154499e22d", "address": "fa:16:3e:1a:ef:b0", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.91", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96b6b319-e9", "ovs_interfaceid": "96b6b319-e96c-4182-b940-9f154499e22d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.471 226890 DEBUG nova.network.os_vif_util [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Converting VIF {"id": "96b6b319-e96c-4182-b940-9f154499e22d", "address": "fa:16:3e:1a:ef:b0", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.91", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96b6b319-e9", "ovs_interfaceid": "96b6b319-e96c-4182-b940-9f154499e22d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.472 226890 DEBUG nova.network.os_vif_util [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1a:ef:b0,bridge_name='br-int',has_traffic_filtering=True,id=96b6b319-e96c-4182-b940-9f154499e22d,network=Network(d12cbe78-47c0-4f23-98a7-4ed621f8c3a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap96b6b319-e9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.472 226890 DEBUG os_vif [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1a:ef:b0,bridge_name='br-int',has_traffic_filtering=True,id=96b6b319-e96c-4182-b940-9f154499e22d,network=Network(d12cbe78-47c0-4f23-98a7-4ed621f8c3a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap96b6b319-e9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.472 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.473 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.473 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.476 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.476 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap96b6b319-e9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.476 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap96b6b319-e9, col_values=(('external_ids', {'iface-id': '96b6b319-e96c-4182-b940-9f154499e22d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1a:ef:b0', 'vm-uuid': '795b0a95-448b-49b1-80cb-a18e84101480'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.478 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:29 np0005588920 NetworkManager[49076]: <info>  [1768919969.4796] manager: (tap96b6b319-e9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/102)
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.480 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.487 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.489 226890 INFO os_vif [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1a:ef:b0,bridge_name='br-int',has_traffic_filtering=True,id=96b6b319-e96c-4182-b940-9f154499e22d,network=Network(d12cbe78-47c0-4f23-98a7-4ed621f8c3a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap96b6b319-e9')#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.489 226890 DEBUG nova.virt.libvirt.vif [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:39:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-68911616',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-68911616',id=64,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPmqoi4p8kSyWXNxQ3a/6cHt6bcPdJIB4+7iQjfjSS/GZvQEk00ft0q8g9eYHEm/6qNbWlrQRShcWhErCzsftWYt7Pg9lwI5WvcUf4Z28u7I0nTtYrQ91Z0PLrzqP57aiA==',key_name='tempest-keypair-396780534',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8705404c3964472782118e478eb54e51',ramdisk_id='',reservation_id='r-q1o120zj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest_v242-1563719502',owner_user_name='tempest-TaggedBootDevicesTest_v242-1563719502-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:39:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='cd4ba32a01f74af199438da0b72e5a4d',uuid=795b0a95-448b-49b1-80cb-a18e84101480,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a6658829-ef19-4914-bbb3-35b718691c7c", "address": "fa:16:3e:c8:a5:f5", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.112", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6658829-ef", "ovs_interfaceid": "a6658829-ef19-4914-bbb3-35b718691c7c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.490 226890 DEBUG nova.network.os_vif_util [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Converting VIF {"id": "a6658829-ef19-4914-bbb3-35b718691c7c", "address": "fa:16:3e:c8:a5:f5", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.112", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6658829-ef", "ovs_interfaceid": "a6658829-ef19-4914-bbb3-35b718691c7c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.490 226890 DEBUG nova.network.os_vif_util [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c8:a5:f5,bridge_name='br-int',has_traffic_filtering=True,id=a6658829-ef19-4914-bbb3-35b718691c7c,network=Network(d12cbe78-47c0-4f23-98a7-4ed621f8c3a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa6658829-ef') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.491 226890 DEBUG os_vif [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c8:a5:f5,bridge_name='br-int',has_traffic_filtering=True,id=a6658829-ef19-4914-bbb3-35b718691c7c,network=Network(d12cbe78-47c0-4f23-98a7-4ed621f8c3a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa6658829-ef') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.491 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.491 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.491 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.493 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.494 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa6658829-ef, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.494 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa6658829-ef, col_values=(('external_ids', {'iface-id': 'a6658829-ef19-4914-bbb3-35b718691c7c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c8:a5:f5', 'vm-uuid': '795b0a95-448b-49b1-80cb-a18e84101480'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:29 np0005588920 NetworkManager[49076]: <info>  [1768919969.4961] manager: (tapa6658829-ef): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/103)
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.498 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.507 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.508 226890 INFO os_vif [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c8:a5:f5,bridge_name='br-int',has_traffic_filtering=True,id=a6658829-ef19-4914-bbb3-35b718691c7c,network=Network(d12cbe78-47c0-4f23-98a7-4ed621f8c3a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa6658829-ef')#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.509 226890 DEBUG nova.virt.libvirt.vif [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:39:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-68911616',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-68911616',id=64,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPmqoi4p8kSyWXNxQ3a/6cHt6bcPdJIB4+7iQjfjSS/GZvQEk00ft0q8g9eYHEm/6qNbWlrQRShcWhErCzsftWYt7Pg9lwI5WvcUf4Z28u7I0nTtYrQ91Z0PLrzqP57aiA==',key_name='tempest-keypair-396780534',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8705404c3964472782118e478eb54e51',ramdisk_id='',reservation_id='r-q1o120zj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='v
irtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest_v242-1563719502',owner_user_name='tempest-TaggedBootDevicesTest_v242-1563719502-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:39:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='cd4ba32a01f74af199438da0b72e5a4d',uuid=795b0a95-448b-49b1-80cb-a18e84101480,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b03b1a08-920f-4340-b77b-37669cc14a07", "address": "fa:16:3e:b9:ca:05", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.214", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb03b1a08-92", "ovs_interfaceid": "b03b1a08-920f-4340-b77b-37669cc14a07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.509 226890 DEBUG nova.network.os_vif_util [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Converting VIF {"id": "b03b1a08-920f-4340-b77b-37669cc14a07", "address": "fa:16:3e:b9:ca:05", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.214", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb03b1a08-92", "ovs_interfaceid": "b03b1a08-920f-4340-b77b-37669cc14a07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:39:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:29.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.510 226890 DEBUG nova.network.os_vif_util [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b9:ca:05,bridge_name='br-int',has_traffic_filtering=True,id=b03b1a08-920f-4340-b77b-37669cc14a07,network=Network(d12cbe78-47c0-4f23-98a7-4ed621f8c3a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb03b1a08-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.510 226890 DEBUG os_vif [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:ca:05,bridge_name='br-int',has_traffic_filtering=True,id=b03b1a08-920f-4340-b77b-37669cc14a07,network=Network(d12cbe78-47c0-4f23-98a7-4ed621f8c3a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb03b1a08-92') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.511 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.511 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.511 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.513 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.513 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb03b1a08-92, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.514 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb03b1a08-92, col_values=(('external_ids', {'iface-id': 'b03b1a08-920f-4340-b77b-37669cc14a07', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b9:ca:05', 'vm-uuid': '795b0a95-448b-49b1-80cb-a18e84101480'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:29 np0005588920 NetworkManager[49076]: <info>  [1768919969.5158] manager: (tapb03b1a08-92): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/104)
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.517 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.530 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.531 226890 INFO os_vif [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:ca:05,bridge_name='br-int',has_traffic_filtering=True,id=b03b1a08-920f-4340-b77b-37669cc14a07,network=Network(d12cbe78-47c0-4f23-98a7-4ed621f8c3a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb03b1a08-92')#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.532 226890 DEBUG nova.virt.libvirt.vif [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:39:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-68911616',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-68911616',id=64,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPmqoi4p8kSyWXNxQ3a/6cHt6bcPdJIB4+7iQjfjSS/GZvQEk00ft0q8g9eYHEm/6qNbWlrQRShcWhErCzsftWYt7Pg9lwI5WvcUf4Z28u7I0nTtYrQ91Z0PLrzqP57aiA==',key_name='tempest-keypair-396780534',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8705404c3964472782118e478eb54e51',ramdisk_id='',reservation_id='r-q1o120zj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='v
irtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest_v242-1563719502',owner_user_name='tempest-TaggedBootDevicesTest_v242-1563719502-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:39:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='cd4ba32a01f74af199438da0b72e5a4d',uuid=795b0a95-448b-49b1-80cb-a18e84101480,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7a84a919-d309-4f1a-99b8-4792dcdec990", "address": "fa:16:3e:c7:9a:c3", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.32", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a84a919-d3", "ovs_interfaceid": "7a84a919-d309-4f1a-99b8-4792dcdec990", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.532 226890 DEBUG nova.network.os_vif_util [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Converting VIF {"id": "7a84a919-d309-4f1a-99b8-4792dcdec990", "address": "fa:16:3e:c7:9a:c3", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.32", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a84a919-d3", "ovs_interfaceid": "7a84a919-d309-4f1a-99b8-4792dcdec990", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.533 226890 DEBUG nova.network.os_vif_util [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c7:9a:c3,bridge_name='br-int',has_traffic_filtering=True,id=7a84a919-d309-4f1a-99b8-4792dcdec990,network=Network(d12cbe78-47c0-4f23-98a7-4ed621f8c3a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a84a919-d3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.533 226890 DEBUG os_vif [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c7:9a:c3,bridge_name='br-int',has_traffic_filtering=True,id=7a84a919-d309-4f1a-99b8-4792dcdec990,network=Network(d12cbe78-47c0-4f23-98a7-4ed621f8c3a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a84a919-d3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.533 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.533 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.534 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.536 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.536 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7a84a919-d3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.536 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7a84a919-d3, col_values=(('external_ids', {'iface-id': '7a84a919-d309-4f1a-99b8-4792dcdec990', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c7:9a:c3', 'vm-uuid': '795b0a95-448b-49b1-80cb-a18e84101480'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:29 np0005588920 NetworkManager[49076]: <info>  [1768919969.5378] manager: (tap7a84a919-d3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/105)
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.539 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.550 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.550 226890 INFO os_vif [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c7:9a:c3,bridge_name='br-int',has_traffic_filtering=True,id=7a84a919-d309-4f1a-99b8-4792dcdec990,network=Network(d12cbe78-47c0-4f23-98a7-4ed621f8c3a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a84a919-d3')#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.551 226890 DEBUG nova.virt.libvirt.vif [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:39:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-68911616',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-68911616',id=64,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPmqoi4p8kSyWXNxQ3a/6cHt6bcPdJIB4+7iQjfjSS/GZvQEk00ft0q8g9eYHEm/6qNbWlrQRShcWhErCzsftWYt7Pg9lwI5WvcUf4Z28u7I0nTtYrQ91Z0PLrzqP57aiA==',key_name='tempest-keypair-396780534',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8705404c3964472782118e478eb54e51',ramdisk_id='',reservation_id='r-q1o120zj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='v
irtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest_v242-1563719502',owner_user_name='tempest-TaggedBootDevicesTest_v242-1563719502-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:39:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='cd4ba32a01f74af199438da0b72e5a4d',uuid=795b0a95-448b-49b1-80cb-a18e84101480,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ff70124b-befc-46fa-b2cb-bc4bd4a49942", "address": "fa:16:3e:97:d4:a5", "network": {"id": "a36edd9d-12ce-4779-97fe-f75c00d85dcd", "bridge": "br-int", "label": "tempest-device-tagging-net2-56635846", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff70124b-be", "ovs_interfaceid": "ff70124b-befc-46fa-b2cb-bc4bd4a49942", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.551 226890 DEBUG nova.network.os_vif_util [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Converting VIF {"id": "ff70124b-befc-46fa-b2cb-bc4bd4a49942", "address": "fa:16:3e:97:d4:a5", "network": {"id": "a36edd9d-12ce-4779-97fe-f75c00d85dcd", "bridge": "br-int", "label": "tempest-device-tagging-net2-56635846", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff70124b-be", "ovs_interfaceid": "ff70124b-befc-46fa-b2cb-bc4bd4a49942", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.552 226890 DEBUG nova.network.os_vif_util [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:97:d4:a5,bridge_name='br-int',has_traffic_filtering=True,id=ff70124b-befc-46fa-b2cb-bc4bd4a49942,network=Network(a36edd9d-12ce-4779-97fe-f75c00d85dcd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapff70124b-be') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.552 226890 DEBUG os_vif [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:d4:a5,bridge_name='br-int',has_traffic_filtering=True,id=ff70124b-befc-46fa-b2cb-bc4bd4a49942,network=Network(a36edd9d-12ce-4779-97fe-f75c00d85dcd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapff70124b-be') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.552 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.552 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.552 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.555 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.555 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapff70124b-be, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.556 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapff70124b-be, col_values=(('external_ids', {'iface-id': 'ff70124b-befc-46fa-b2cb-bc4bd4a49942', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:97:d4:a5', 'vm-uuid': '795b0a95-448b-49b1-80cb-a18e84101480'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:29 np0005588920 NetworkManager[49076]: <info>  [1768919969.5578] manager: (tapff70124b-be): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/106)
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.560 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.572 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.574 226890 INFO os_vif [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:d4:a5,bridge_name='br-int',has_traffic_filtering=True,id=ff70124b-befc-46fa-b2cb-bc4bd4a49942,network=Network(a36edd9d-12ce-4779-97fe-f75c00d85dcd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapff70124b-be')#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.575 226890 DEBUG nova.virt.libvirt.vif [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:39:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-68911616',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-68911616',id=64,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPmqoi4p8kSyWXNxQ3a/6cHt6bcPdJIB4+7iQjfjSS/GZvQEk00ft0q8g9eYHEm/6qNbWlrQRShcWhErCzsftWYt7Pg9lwI5WvcUf4Z28u7I0nTtYrQ91Z0PLrzqP57aiA==',key_name='tempest-keypair-396780534',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8705404c3964472782118e478eb54e51',ramdisk_id='',reservation_id='r-q1o120zj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='v
irtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest_v242-1563719502',owner_user_name='tempest-TaggedBootDevicesTest_v242-1563719502-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:39:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='cd4ba32a01f74af199438da0b72e5a4d',uuid=795b0a95-448b-49b1-80cb-a18e84101480,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5096b763-1b08-448f-a6bd-f63bcd65def6", "address": "fa:16:3e:5e:80:45", "network": {"id": "a36edd9d-12ce-4779-97fe-f75c00d85dcd", "bridge": "br-int", "label": "tempest-device-tagging-net2-56635846", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5096b763-1b", "ovs_interfaceid": "5096b763-1b08-448f-a6bd-f63bcd65def6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.575 226890 DEBUG nova.network.os_vif_util [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Converting VIF {"id": "5096b763-1b08-448f-a6bd-f63bcd65def6", "address": "fa:16:3e:5e:80:45", "network": {"id": "a36edd9d-12ce-4779-97fe-f75c00d85dcd", "bridge": "br-int", "label": "tempest-device-tagging-net2-56635846", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5096b763-1b", "ovs_interfaceid": "5096b763-1b08-448f-a6bd-f63bcd65def6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.576 226890 DEBUG nova.network.os_vif_util [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5e:80:45,bridge_name='br-int',has_traffic_filtering=True,id=5096b763-1b08-448f-a6bd-f63bcd65def6,network=Network(a36edd9d-12ce-4779-97fe-f75c00d85dcd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5096b763-1b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.577 226890 DEBUG os_vif [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5e:80:45,bridge_name='br-int',has_traffic_filtering=True,id=5096b763-1b08-448f-a6bd-f63bcd65def6,network=Network(a36edd9d-12ce-4779-97fe-f75c00d85dcd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5096b763-1b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.578 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.578 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.579 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.581 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.581 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5096b763-1b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.582 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5096b763-1b, col_values=(('external_ids', {'iface-id': '5096b763-1b08-448f-a6bd-f63bcd65def6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5e:80:45', 'vm-uuid': '795b0a95-448b-49b1-80cb-a18e84101480'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:29 np0005588920 NetworkManager[49076]: <info>  [1768919969.5854] manager: (tap5096b763-1b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/107)
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.585 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.588 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.604 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.606 226890 INFO os_vif [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5e:80:45,bridge_name='br-int',has_traffic_filtering=True,id=5096b763-1b08-448f-a6bd-f63bcd65def6,network=Network(a36edd9d-12ce-4779-97fe-f75c00d85dcd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5096b763-1b')#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.673 226890 DEBUG nova.virt.libvirt.driver [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.674 226890 DEBUG nova.virt.libvirt.driver [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] No BDM found with device name vdc, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.674 226890 DEBUG nova.virt.libvirt.driver [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] No VIF found with MAC fa:16:3e:11:9c:9c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.677 226890 DEBUG nova.virt.libvirt.driver [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] No VIF found with MAC fa:16:3e:c7:9a:c3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.678 226890 INFO nova.virt.libvirt.driver [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Using config drive#033[00m
Jan 20 09:39:29 np0005588920 nova_compute[226886]: 2026-01-20 14:39:29.706 226890 DEBUG nova.storage.rbd_utils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] rbd image 795b0a95-448b-49b1-80cb-a18e84101480_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:39:29 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.062 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.183 226890 DEBUG oslo_concurrency.lockutils [None req-9be67e7e-3bfe-4672-8f65-ee9d2ff986a5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquiring lock "c1a45fae-79ce-48c2-81b9-4d1e30165d46" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.185 226890 DEBUG oslo_concurrency.lockutils [None req-9be67e7e-3bfe-4672-8f65-ee9d2ff986a5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "c1a45fae-79ce-48c2-81b9-4d1e30165d46" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.187 226890 DEBUG oslo_concurrency.lockutils [None req-9be67e7e-3bfe-4672-8f65-ee9d2ff986a5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquiring lock "c1a45fae-79ce-48c2-81b9-4d1e30165d46-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.188 226890 DEBUG oslo_concurrency.lockutils [None req-9be67e7e-3bfe-4672-8f65-ee9d2ff986a5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "c1a45fae-79ce-48c2-81b9-4d1e30165d46-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.188 226890 DEBUG oslo_concurrency.lockutils [None req-9be67e7e-3bfe-4672-8f65-ee9d2ff986a5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "c1a45fae-79ce-48c2-81b9-4d1e30165d46-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.191 226890 INFO nova.compute.manager [None req-9be67e7e-3bfe-4672-8f65-ee9d2ff986a5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Terminating instance#033[00m
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.193 226890 DEBUG nova.compute.manager [None req-9be67e7e-3bfe-4672-8f65-ee9d2ff986a5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.201 226890 INFO nova.virt.libvirt.driver [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Creating config drive at /var/lib/nova/instances/795b0a95-448b-49b1-80cb-a18e84101480/disk.config#033[00m
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.212 226890 DEBUG oslo_concurrency.processutils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/795b0a95-448b-49b1-80cb-a18e84101480/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpuvesjg4t execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:39:30 np0005588920 kernel: tap82b46c8a-73 (unregistering): left promiscuous mode
Jan 20 09:39:30 np0005588920 NetworkManager[49076]: <info>  [1768919970.2980] device (tap82b46c8a-73): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:39:30 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:30Z|00177|binding|INFO|Releasing lport 82b46c8a-7331-4dba-b12c-3c4bd0d70a52 from this chassis (sb_readonly=0)
Jan 20 09:39:30 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:30Z|00178|binding|INFO|Setting lport 82b46c8a-7331-4dba-b12c-3c4bd0d70a52 down in Southbound
Jan 20 09:39:30 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:30Z|00179|binding|INFO|Removing iface tap82b46c8a-73 ovn-installed in OVS
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.323 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:30.329 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b2:0f:fa 10.100.0.13'], port_security=['fa:16:3e:b2:0f:fa 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'c1a45fae-79ce-48c2-81b9-4d1e30165d46', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fc21b99b-4e34-422c-be05-0a440009dac4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e3f93fd4b2154dda9f38e62334904303', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8c7b5066-a548-4ff0-b2e6-b25991426411', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.218'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7af6b6bc-3cbd-48be-9f10-23ec011e0426, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=82b46c8a-7331-4dba-b12c-3c4bd0d70a52) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:39:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:30.330 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 82b46c8a-7331-4dba-b12c-3c4bd0d70a52 in datapath fc21b99b-4e34-422c-be05-0a440009dac4 unbound from our chassis#033[00m
Jan 20 09:39:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:30.331 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fc21b99b-4e34-422c-be05-0a440009dac4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:39:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:30.332 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[92fb7703-ceb6-4cc4-8d9f-b14f09b71e57]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:30.332 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4 namespace which is not needed anymore#033[00m
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.355 226890 DEBUG oslo_concurrency.processutils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/795b0a95-448b-49b1-80cb-a18e84101480/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpuvesjg4t" returned: 0 in 0.143s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:39:30 np0005588920 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d0000003e.scope: Deactivated successfully.
Jan 20 09:39:30 np0005588920 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d0000003e.scope: Consumed 13.855s CPU time.
Jan 20 09:39:30 np0005588920 systemd-machined[196121]: Machine qemu-26-instance-0000003e terminated.
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.386 226890 DEBUG nova.storage.rbd_utils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] rbd image 795b0a95-448b-49b1-80cb-a18e84101480_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.391 226890 DEBUG oslo_concurrency.processutils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/795b0a95-448b-49b1-80cb-a18e84101480/disk.config 795b0a95-448b-49b1-80cb-a18e84101480_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.416 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:30 np0005588920 neutron-haproxy-ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4[248863]: [NOTICE]   (248867) : haproxy version is 2.8.14-c23fe91
Jan 20 09:39:30 np0005588920 neutron-haproxy-ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4[248863]: [NOTICE]   (248867) : path to executable is /usr/sbin/haproxy
Jan 20 09:39:30 np0005588920 neutron-haproxy-ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4[248863]: [WARNING]  (248867) : Exiting Master process...
Jan 20 09:39:30 np0005588920 neutron-haproxy-ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4[248863]: [ALERT]    (248867) : Current worker (248869) exited with code 143 (Terminated)
Jan 20 09:39:30 np0005588920 neutron-haproxy-ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4[248863]: [WARNING]  (248867) : All workers exited. Exiting... (0)
Jan 20 09:39:30 np0005588920 NetworkManager[49076]: <info>  [1768919970.4637] manager: (tap82b46c8a-73): new Tun device (/org/freedesktop/NetworkManager/Devices/108)
Jan 20 09:39:30 np0005588920 systemd[1]: libpod-16df1645a457f2e256087e615c07c737e45bdd5e847fff2a3018ba0a2c1289bf.scope: Deactivated successfully.
Jan 20 09:39:30 np0005588920 podman[249444]: 2026-01-20 14:39:30.471501986 +0000 UTC m=+0.049448249 container died 16df1645a457f2e256087e615c07c737e45bdd5e847fff2a3018ba0a2c1289bf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.483 226890 INFO nova.virt.libvirt.driver [-] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Instance destroyed successfully.#033[00m
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.485 226890 DEBUG nova.objects.instance [None req-9be67e7e-3bfe-4672-8f65-ee9d2ff986a5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lazy-loading 'resources' on Instance uuid c1a45fae-79ce-48c2-81b9-4d1e30165d46 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.504 226890 DEBUG nova.virt.libvirt.vif [None req-9be67e7e-3bfe-4672-8f65-ee9d2ff986a5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:38:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1231961321',display_name='tempest-AttachInterfacesTestJSON-server-1231961321',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1231961321',id=62,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI6np03AtwHHtTwHRlB7NMAsNqIU3PHfCmtXu/3XUK0jxxLomgnnligcXSu/L+GIBA+09ag+WKdbpS2RoKjhZ7ql3UhQf0nGVICsCZpNRSjPMQDiRaRNyEakPPHow9Jn5w==',key_name='tempest-keypair-1916620246',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:38:57Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e3f93fd4b2154dda9f38e62334904303',ramdisk_id='',reservation_id='r-3wujw3bt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-305746947',owner_user_name='tempest-AttachInterfacesTestJSON-305746947-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:38:57Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c8a9fb458d27434495a77a94827b6097',uuid=c1a45fae-79ce-48c2-81b9-4d1e30165d46,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "82b46c8a-7331-4dba-b12c-3c4bd0d70a52", "address": "fa:16:3e:b2:0f:fa", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82b46c8a-73", "ovs_interfaceid": "82b46c8a-7331-4dba-b12c-3c4bd0d70a52", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.505 226890 DEBUG nova.network.os_vif_util [None req-9be67e7e-3bfe-4672-8f65-ee9d2ff986a5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converting VIF {"id": "82b46c8a-7331-4dba-b12c-3c4bd0d70a52", "address": "fa:16:3e:b2:0f:fa", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82b46c8a-73", "ovs_interfaceid": "82b46c8a-7331-4dba-b12c-3c4bd0d70a52", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.506 226890 DEBUG nova.network.os_vif_util [None req-9be67e7e-3bfe-4672-8f65-ee9d2ff986a5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b2:0f:fa,bridge_name='br-int',has_traffic_filtering=True,id=82b46c8a-7331-4dba-b12c-3c4bd0d70a52,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82b46c8a-73') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.506 226890 DEBUG os_vif [None req-9be67e7e-3bfe-4672-8f65-ee9d2ff986a5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b2:0f:fa,bridge_name='br-int',has_traffic_filtering=True,id=82b46c8a-7331-4dba-b12c-3c4bd0d70a52,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82b46c8a-73') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:39:30 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-16df1645a457f2e256087e615c07c737e45bdd5e847fff2a3018ba0a2c1289bf-userdata-shm.mount: Deactivated successfully.
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.509 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.510 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap82b46c8a-73, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:30 np0005588920 systemd[1]: var-lib-containers-storage-overlay-e28a20083f2fd6b1da8a912fd73e6fae8eada3d18adf7aa7a0f48694587e29ef-merged.mount: Deactivated successfully.
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.516 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.518 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:39:30 np0005588920 podman[249444]: 2026-01-20 14:39:30.522328413 +0000 UTC m=+0.100274656 container cleanup 16df1645a457f2e256087e615c07c737e45bdd5e847fff2a3018ba0a2c1289bf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:39:30 np0005588920 systemd[1]: libpod-conmon-16df1645a457f2e256087e615c07c737e45bdd5e847fff2a3018ba0a2c1289bf.scope: Deactivated successfully.
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.536 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.539 226890 INFO os_vif [None req-9be67e7e-3bfe-4672-8f65-ee9d2ff986a5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b2:0f:fa,bridge_name='br-int',has_traffic_filtering=True,id=82b46c8a-7331-4dba-b12c-3c4bd0d70a52,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82b46c8a-73')#033[00m
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.540 226890 DEBUG nova.virt.libvirt.vif [None req-9be67e7e-3bfe-4672-8f65-ee9d2ff986a5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:38:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1231961321',display_name='tempest-AttachInterfacesTestJSON-server-1231961321',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1231961321',id=62,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI6np03AtwHHtTwHRlB7NMAsNqIU3PHfCmtXu/3XUK0jxxLomgnnligcXSu/L+GIBA+09ag+WKdbpS2RoKjhZ7ql3UhQf0nGVICsCZpNRSjPMQDiRaRNyEakPPHow9Jn5w==',key_name='tempest-keypair-1916620246',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:38:57Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e3f93fd4b2154dda9f38e62334904303',ramdisk_id='',reservation_id='r-3wujw3bt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-305746947',owner_user_name='tempest-AttachInterfacesTestJSON-305746947-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:38:57Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c8a9fb458d27434495a77a94827b6097',uuid=c1a45fae-79ce-48c2-81b9-4d1e30165d46,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f6b42586-082e-4da5-b1fd-5723992197fe", "address": "fa:16:3e:f1:e6:93", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6b42586-08", "ovs_interfaceid": "f6b42586-082e-4da5-b1fd-5723992197fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.540 226890 DEBUG nova.network.os_vif_util [None req-9be67e7e-3bfe-4672-8f65-ee9d2ff986a5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converting VIF {"id": "f6b42586-082e-4da5-b1fd-5723992197fe", "address": "fa:16:3e:f1:e6:93", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6b42586-08", "ovs_interfaceid": "f6b42586-082e-4da5-b1fd-5723992197fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.541 226890 DEBUG nova.network.os_vif_util [None req-9be67e7e-3bfe-4672-8f65-ee9d2ff986a5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f1:e6:93,bridge_name='br-int',has_traffic_filtering=True,id=f6b42586-082e-4da5-b1fd-5723992197fe,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6b42586-08') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.541 226890 DEBUG os_vif [None req-9be67e7e-3bfe-4672-8f65-ee9d2ff986a5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f1:e6:93,bridge_name='br-int',has_traffic_filtering=True,id=f6b42586-082e-4da5-b1fd-5723992197fe,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6b42586-08') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.542 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.542 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf6b42586-08, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.542 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.543 226890 INFO os_vif [None req-9be67e7e-3bfe-4672-8f65-ee9d2ff986a5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f1:e6:93,bridge_name='br-int',has_traffic_filtering=True,id=f6b42586-082e-4da5-b1fd-5723992197fe,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6b42586-08')#033[00m
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.564 226890 DEBUG oslo_concurrency.processutils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/795b0a95-448b-49b1-80cb-a18e84101480/disk.config 795b0a95-448b-49b1-80cb-a18e84101480_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.172s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.564 226890 INFO nova.virt.libvirt.driver [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Deleting local config drive /var/lib/nova/instances/795b0a95-448b-49b1-80cb-a18e84101480/disk.config because it was imported into RBD.#033[00m
Jan 20 09:39:30 np0005588920 podman[249514]: 2026-01-20 14:39:30.585291838 +0000 UTC m=+0.039466181 container remove 16df1645a457f2e256087e615c07c737e45bdd5e847fff2a3018ba0a2c1289bf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Jan 20 09:39:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:30.590 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[fb48d84c-323b-46d1-822e-4555c05e4796]: (4, ('Tue Jan 20 02:39:30 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4 (16df1645a457f2e256087e615c07c737e45bdd5e847fff2a3018ba0a2c1289bf)\n16df1645a457f2e256087e615c07c737e45bdd5e847fff2a3018ba0a2c1289bf\nTue Jan 20 02:39:30 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4 (16df1645a457f2e256087e615c07c737e45bdd5e847fff2a3018ba0a2c1289bf)\n16df1645a457f2e256087e615c07c737e45bdd5e847fff2a3018ba0a2c1289bf\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:30.591 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[38d553dd-9b28-4ec2-907f-73f2cca026a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:30.593 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfc21b99b-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:30 np0005588920 kernel: tapfc21b99b-40: left promiscuous mode
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.595 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:30 np0005588920 NetworkManager[49076]: <info>  [1768919970.6168] manager: (tap9b0fc629-48): new Tun device (/org/freedesktop/NetworkManager/Devices/109)
Jan 20 09:39:30 np0005588920 kernel: tap9b0fc629-48: entered promiscuous mode
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.630 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:30.631 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[4a5364db-14f2-4635-982b-b0d807eaae64]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:30 np0005588920 NetworkManager[49076]: <info>  [1768919970.6342] manager: (tap96b6b319-e9): new Tun device (/org/freedesktop/NetworkManager/Devices/110)
Jan 20 09:39:30 np0005588920 kernel: tap96b6b319-e9: entered promiscuous mode
Jan 20 09:39:30 np0005588920 NetworkManager[49076]: <info>  [1768919970.6477] manager: (tapa6658829-ef): new Tun device (/org/freedesktop/NetworkManager/Devices/111)
Jan 20 09:39:30 np0005588920 systemd-udevd[249571]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:39:30 np0005588920 systemd-udevd[249572]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:39:30 np0005588920 systemd-udevd[249575]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.653 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:30.651 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[74e673a7-0065-4fd4-a138-76b9f8936542]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:30 np0005588920 NetworkManager[49076]: <info>  [1768919970.6610] manager: (tapb03b1a08-92): new Tun device (/org/freedesktop/NetworkManager/Devices/112)
Jan 20 09:39:30 np0005588920 kernel: tapb03b1a08-92: entered promiscuous mode
Jan 20 09:39:30 np0005588920 kernel: tapa6658829-ef: entered promiscuous mode
Jan 20 09:39:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:30.660 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a1028242-6f0f-4983-a7d2-94740662fcd2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:30 np0005588920 NetworkManager[49076]: <info>  [1768919970.6626] device (tapa6658829-ef): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:39:30 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:30Z|00180|binding|INFO|Claiming lport 9b0fc629-48f5-469d-90d2-c26339c16eec for this chassis.
Jan 20 09:39:30 np0005588920 systemd-udevd[249583]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:39:30 np0005588920 NetworkManager[49076]: <info>  [1768919970.6644] device (tap9b0fc629-48): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:39:30 np0005588920 NetworkManager[49076]: <info>  [1768919970.6655] device (tapa6658829-ef): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:39:30 np0005588920 NetworkManager[49076]: <info>  [1768919970.6661] device (tap9b0fc629-48): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:39:30 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:30Z|00181|binding|INFO|9b0fc629-48f5-469d-90d2-c26339c16eec: Claiming fa:16:3e:11:9c:9c 10.100.0.8
Jan 20 09:39:30 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:30Z|00182|binding|INFO|Claiming lport 96b6b319-e96c-4182-b940-9f154499e22d for this chassis.
Jan 20 09:39:30 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:30Z|00183|binding|INFO|96b6b319-e96c-4182-b940-9f154499e22d: Claiming fa:16:3e:1a:ef:b0 10.1.1.91
Jan 20 09:39:30 np0005588920 NetworkManager[49076]: <info>  [1768919970.6702] device (tap96b6b319-e9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.670 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:30 np0005588920 NetworkManager[49076]: <info>  [1768919970.6710] device (tap96b6b319-e9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:39:30 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:30Z|00184|if_status|INFO|Not updating pb chassis for b03b1a08-920f-4340-b77b-37669cc14a07 now as sb is readonly
Jan 20 09:39:30 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:30Z|00185|binding|INFO|Claiming lport b03b1a08-920f-4340-b77b-37669cc14a07 for this chassis.
Jan 20 09:39:30 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:30Z|00186|binding|INFO|b03b1a08-920f-4340-b77b-37669cc14a07: Claiming fa:16:3e:b9:ca:05 10.1.1.214
Jan 20 09:39:30 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:30Z|00187|binding|INFO|Claiming lport a6658829-ef19-4914-bbb3-35b718691c7c for this chassis.
Jan 20 09:39:30 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:30Z|00188|binding|INFO|a6658829-ef19-4914-bbb3-35b718691c7c: Claiming fa:16:3e:c8:a5:f5 10.1.1.112
Jan 20 09:39:30 np0005588920 NetworkManager[49076]: <info>  [1768919970.6757] device (tapb03b1a08-92): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:39:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:30.674 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1a:ef:b0 10.1.1.91'], port_security=['fa:16:3e:1a:ef:b0 10.1.1.91'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TaggedBootDevicesTest_v242-791226081', 'neutron:cidrs': '10.1.1.91/24', 'neutron:device_id': '795b0a95-448b-49b1-80cb-a18e84101480', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d12cbe78-47c0-4f23-98a7-4ed621f8c3a3', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TaggedBootDevicesTest_v242-791226081', 'neutron:project_id': '8705404c3964472782118e478eb54e51', 'neutron:revision_number': '2', 'neutron:security_group_ids': '99955626-758a-4f99-af26-9b1cc95cd9d2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8a10ccc6-c168-4af1-ac9c-92f5959feefb, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=96b6b319-e96c-4182-b940-9f154499e22d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:39:30 np0005588920 NetworkManager[49076]: <info>  [1768919970.6767] device (tapb03b1a08-92): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:39:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:30.677 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:11:9c:9c 10.100.0.8'], port_security=['fa:16:3e:11:9c:9c 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '795b0a95-448b-49b1-80cb-a18e84101480', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b3a21065-10a5-474d-b42f-ffe66242a479', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8705404c3964472782118e478eb54e51', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9cf1a477-710b-4936-b92a-ce2a6ea51d41', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=786e8670-14a0-43fd-9be5-ba76f5969fd0, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=9b0fc629-48f5-469d-90d2-c26339c16eec) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:39:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:30.680 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c8:a5:f5 10.1.1.112'], port_security=['fa:16:3e:c8:a5:f5 10.1.1.112'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TaggedBootDevicesTest_v242-2010021643', 'neutron:cidrs': '10.1.1.112/24', 'neutron:device_id': '795b0a95-448b-49b1-80cb-a18e84101480', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d12cbe78-47c0-4f23-98a7-4ed621f8c3a3', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TaggedBootDevicesTest_v242-2010021643', 'neutron:project_id': '8705404c3964472782118e478eb54e51', 'neutron:revision_number': '2', 'neutron:security_group_ids': '99955626-758a-4f99-af26-9b1cc95cd9d2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8a10ccc6-c168-4af1-ac9c-92f5959feefb, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=a6658829-ef19-4914-bbb3-35b718691c7c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:39:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:30.681 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b9:ca:05 10.1.1.214'], port_security=['fa:16:3e:b9:ca:05 10.1.1.214'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.1.214/24', 'neutron:device_id': '795b0a95-448b-49b1-80cb-a18e84101480', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d12cbe78-47c0-4f23-98a7-4ed621f8c3a3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8705404c3964472782118e478eb54e51', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9cf1a477-710b-4936-b92a-ce2a6ea51d41', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8a10ccc6-c168-4af1-ac9c-92f5959feefb, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=b03b1a08-920f-4340-b77b-37669cc14a07) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:39:30 np0005588920 NetworkManager[49076]: <info>  [1768919970.6841] manager: (tap7a84a919-d3): new Tun device (/org/freedesktop/NetworkManager/Devices/113)
Jan 20 09:39:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:30.685 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[09d47822-ffaa-4e1f-aad2-25a187436dec]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 497609, 'reachable_time': 39500, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249587, 'error': None, 'target': 'ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:30 np0005588920 systemd[1]: run-netns-ovnmeta\x2dfc21b99b\x2d4e34\x2d422c\x2dbe05\x2d0a440009dac4.mount: Deactivated successfully.
Jan 20 09:39:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:30.688 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:39:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:30.688 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[09355826-db2b-4aea-acfc-9adcfb7144ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:30.690 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 96b6b319-e96c-4182-b940-9f154499e22d in datapath d12cbe78-47c0-4f23-98a7-4ed621f8c3a3 bound to our chassis#033[00m
Jan 20 09:39:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:30.691 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d12cbe78-47c0-4f23-98a7-4ed621f8c3a3#033[00m
Jan 20 09:39:30 np0005588920 NetworkManager[49076]: <info>  [1768919970.7051] manager: (tapff70124b-be): new Tun device (/org/freedesktop/NetworkManager/Devices/114)
Jan 20 09:39:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:30.704 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[781c455c-f9d9-4b12-8203-0f250cb333b6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:30.705 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd12cbe78-41 in ovnmeta-d12cbe78-47c0-4f23-98a7-4ed621f8c3a3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:39:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:30.708 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd12cbe78-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:39:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:30.708 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[29c635a8-93dd-4810-a5ce-0991deae996d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:30.709 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[8f073409-8e76-451b-bafe-7b41fcbdf6fc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.709 226890 DEBUG nova.network.neutron [req-bfa29c13-cab6-4daf-9ad7-9e485f8fccc2 req-01c224ce-f871-4143-a5fc-66d579e627ba 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Updated VIF entry in instance network info cache for port ff70124b-befc-46fa-b2cb-bc4bd4a49942. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.710 226890 DEBUG nova.network.neutron [req-bfa29c13-cab6-4daf-9ad7-9e485f8fccc2 req-01c224ce-f871-4143-a5fc-66d579e627ba 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Updating instance_info_cache with network_info: [{"id": "9b0fc629-48f5-469d-90d2-c26339c16eec", "address": "fa:16:3e:11:9c:9c", "network": {"id": "b3a21065-10a5-474d-b42f-ffe66242a479", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest_v242-807218413-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b0fc629-48", "ovs_interfaceid": "9b0fc629-48f5-469d-90d2-c26339c16eec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "96b6b319-e96c-4182-b940-9f154499e22d", "address": "fa:16:3e:1a:ef:b0", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.91", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96b6b319-e9", "ovs_interfaceid": "96b6b319-e96c-4182-b940-9f154499e22d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "a6658829-ef19-4914-bbb3-35b718691c7c", "address": "fa:16:3e:c8:a5:f5", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.112", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6658829-ef", "ovs_interfaceid": "a6658829-ef19-4914-bbb3-35b718691c7c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "b03b1a08-920f-4340-b77b-37669cc14a07", "address": "fa:16:3e:b9:ca:05", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.214", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, 
"meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb03b1a08-92", "ovs_interfaceid": "b03b1a08-920f-4340-b77b-37669cc14a07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7a84a919-d309-4f1a-99b8-4792dcdec990", "address": "fa:16:3e:c7:9a:c3", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.32", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a84a919-d3", "ovs_interfaceid": "7a84a919-d309-4f1a-99b8-4792dcdec990", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ff70124b-befc-46fa-b2cb-bc4bd4a49942", "address": "fa:16:3e:97:d4:a5", "network": {"id": "a36edd9d-12ce-4779-97fe-f75c00d85dcd", "bridge": "br-int", "label": "tempest-device-tagging-net2-56635846", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff70124b-be", "ovs_interfaceid": "ff70124b-befc-46fa-b2cb-bc4bd4a49942", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "5096b763-1b08-448f-a6bd-f63bcd65def6", "address": "fa:16:3e:5e:80:45", "network": {"id": "a36edd9d-12ce-4779-97fe-f75c00d85dcd", "bridge": "br-int", "label": "tempest-device-tagging-net2-56635846", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5096b763-1b", "ovs_interfaceid": "5096b763-1b08-448f-a6bd-f63bcd65def6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:39:30 np0005588920 NetworkManager[49076]: <info>  [1768919970.7181] manager: (tap5096b763-1b): new Tun device (/org/freedesktop/NetworkManager/Devices/115)
Jan 20 09:39:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:30.724 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[3447a2b6-cde8-47a0-81f5-d6173d36e058]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.731 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:30 np0005588920 NetworkManager[49076]: <info>  [1768919970.7349] device (tapff70124b-be): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:39:30 np0005588920 kernel: tapff70124b-be: entered promiscuous mode
Jan 20 09:39:30 np0005588920 kernel: tap7a84a919-d3: entered promiscuous mode
Jan 20 09:39:30 np0005588920 NetworkManager[49076]: <info>  [1768919970.7359] device (tap7a84a919-d3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:39:30 np0005588920 kernel: tap5096b763-1b: entered promiscuous mode
Jan 20 09:39:30 np0005588920 NetworkManager[49076]: <info>  [1768919970.7366] device (tap5096b763-1b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:39:30 np0005588920 NetworkManager[49076]: <info>  [1768919970.7374] device (tapff70124b-be): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:39:30 np0005588920 NetworkManager[49076]: <info>  [1768919970.7380] device (tap7a84a919-d3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.737 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:30 np0005588920 NetworkManager[49076]: <info>  [1768919970.7383] device (tap5096b763-1b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:39:30 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:30Z|00189|binding|INFO|Claiming lport 7a84a919-d309-4f1a-99b8-4792dcdec990 for this chassis.
Jan 20 09:39:30 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:30Z|00190|binding|INFO|7a84a919-d309-4f1a-99b8-4792dcdec990: Claiming fa:16:3e:c7:9a:c3 10.1.1.32
Jan 20 09:39:30 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:30Z|00191|binding|INFO|Claiming lport 5096b763-1b08-448f-a6bd-f63bcd65def6 for this chassis.
Jan 20 09:39:30 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:30Z|00192|binding|INFO|5096b763-1b08-448f-a6bd-f63bcd65def6: Claiming fa:16:3e:5e:80:45 10.2.2.200
Jan 20 09:39:30 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:30Z|00193|binding|INFO|Claiming lport ff70124b-befc-46fa-b2cb-bc4bd4a49942 for this chassis.
Jan 20 09:39:30 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:30Z|00194|binding|INFO|ff70124b-befc-46fa-b2cb-bc4bd4a49942: Claiming fa:16:3e:97:d4:a5 10.2.2.100
Jan 20 09:39:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:30.741 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[59c5aa0b-14bc-4a41-b6ad-3ccd890404e5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.749 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:30.754 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:97:d4:a5 10.2.2.100'], port_security=['fa:16:3e:97:d4:a5 10.2.2.100'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.2.2.100/24', 'neutron:device_id': '795b0a95-448b-49b1-80cb-a18e84101480', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a36edd9d-12ce-4779-97fe-f75c00d85dcd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8705404c3964472782118e478eb54e51', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9cf1a477-710b-4936-b92a-ce2a6ea51d41', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a477e986-22d6-46cb-827e-2c814ecbcffa, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=ff70124b-befc-46fa-b2cb-bc4bd4a49942) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:39:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:30.756 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5e:80:45 10.2.2.200'], port_security=['fa:16:3e:5e:80:45 10.2.2.200'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.2.2.200/24', 'neutron:device_id': '795b0a95-448b-49b1-80cb-a18e84101480', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a36edd9d-12ce-4779-97fe-f75c00d85dcd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8705404c3964472782118e478eb54e51', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9cf1a477-710b-4936-b92a-ce2a6ea51d41', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a477e986-22d6-46cb-827e-2c814ecbcffa, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=5096b763-1b08-448f-a6bd-f63bcd65def6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:39:30 np0005588920 systemd-machined[196121]: New machine qemu-27-instance-00000040.
Jan 20 09:39:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:30.758 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c7:9a:c3 10.1.1.32'], port_security=['fa:16:3e:c7:9a:c3 10.1.1.32'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.1.32/24', 'neutron:device_id': '795b0a95-448b-49b1-80cb-a18e84101480', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d12cbe78-47c0-4f23-98a7-4ed621f8c3a3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8705404c3964472782118e478eb54e51', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9cf1a477-710b-4936-b92a-ce2a6ea51d41', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8a10ccc6-c168-4af1-ac9c-92f5959feefb, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=7a84a919-d309-4f1a-99b8-4792dcdec990) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.759 226890 DEBUG oslo_concurrency.lockutils [req-bfa29c13-cab6-4daf-9ad7-9e485f8fccc2 req-01c224ce-f871-4143-a5fc-66d579e627ba 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-795b0a95-448b-49b1-80cb-a18e84101480" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.759 226890 DEBUG nova.compute.manager [req-bfa29c13-cab6-4daf-9ad7-9e485f8fccc2 req-01c224ce-f871-4143-a5fc-66d579e627ba 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received event network-changed-5096b763-1b08-448f-a6bd-f63bcd65def6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.760 226890 DEBUG nova.compute.manager [req-bfa29c13-cab6-4daf-9ad7-9e485f8fccc2 req-01c224ce-f871-4143-a5fc-66d579e627ba 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Refreshing instance network info cache due to event network-changed-5096b763-1b08-448f-a6bd-f63bcd65def6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.760 226890 DEBUG oslo_concurrency.lockutils [req-bfa29c13-cab6-4daf-9ad7-9e485f8fccc2 req-01c224ce-f871-4143-a5fc-66d579e627ba 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-795b0a95-448b-49b1-80cb-a18e84101480" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.760 226890 DEBUG oslo_concurrency.lockutils [req-bfa29c13-cab6-4daf-9ad7-9e485f8fccc2 req-01c224ce-f871-4143-a5fc-66d579e627ba 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-795b0a95-448b-49b1-80cb-a18e84101480" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.760 226890 DEBUG nova.network.neutron [req-bfa29c13-cab6-4daf-9ad7-9e485f8fccc2 req-01c224ce-f871-4143-a5fc-66d579e627ba 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Refreshing network info cache for port 5096b763-1b08-448f-a6bd-f63bcd65def6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:39:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:30.790 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[2e6fa09d-8a90-46e4-9c85-77b3e8a4a7dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:30 np0005588920 systemd[1]: Started Virtual Machine qemu-27-instance-00000040.
Jan 20 09:39:30 np0005588920 NetworkManager[49076]: <info>  [1768919970.7997] manager: (tapd12cbe78-40): new Veth device (/org/freedesktop/NetworkManager/Devices/116)
Jan 20 09:39:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:30.798 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[552e97ee-2e2d-4068-9f9d-f9ccd952d55d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:30.833 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[632119cf-bf7e-4282-b03b-cf0dfb699037]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:30.836 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[f46a27d0-896f-4e1d-a0b3-f6420884426a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:30 np0005588920 NetworkManager[49076]: <info>  [1768919970.8584] device (tapd12cbe78-40): carrier: link connected
Jan 20 09:39:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:30.864 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[acf95ca1-c9f6-4580-a550-0b3ade3fe789]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:30.879 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[86edc859-0f67-4943-bc8d-e1bffdfcca1f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd12cbe78-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:80:9f:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 66], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 501254, 'reachable_time': 38169, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249640, 'error': None, 'target': 'ovnmeta-d12cbe78-47c0-4f23-98a7-4ed621f8c3a3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:30.897 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[7cef7c93-6607-4c81-ad46-844db9190d97]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe80:9f5b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 501254, 'tstamp': 501254}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 249641, 'error': None, 'target': 'ovnmeta-d12cbe78-47c0-4f23-98a7-4ed621f8c3a3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:30 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:39:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:30.918 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[88dbc25b-7ca1-4fd3-8b1b-675703ceb44a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd12cbe78-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:80:9f:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 66], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 501254, 'reachable_time': 38169, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 249642, 'error': None, 'target': 'ovnmeta-d12cbe78-47c0-4f23-98a7-4ed621f8c3a3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.950 226890 INFO nova.virt.libvirt.driver [None req-9be67e7e-3bfe-4672-8f65-ee9d2ff986a5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Deleting instance files /var/lib/nova/instances/c1a45fae-79ce-48c2-81b9-4d1e30165d46_del#033[00m
Jan 20 09:39:30 np0005588920 nova_compute[226886]: 2026-01-20 14:39:30.951 226890 INFO nova.virt.libvirt.driver [None req-9be67e7e-3bfe-4672-8f65-ee9d2ff986a5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Deletion of /var/lib/nova/instances/c1a45fae-79ce-48c2-81b9-4d1e30165d46_del complete#033[00m
Jan 20 09:39:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:30.950 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a67b9455-5832-4dad-a445-8b436da333fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:31 np0005588920 nova_compute[226886]: 2026-01-20 14:39:31.003 226890 INFO nova.compute.manager [None req-9be67e7e-3bfe-4672-8f65-ee9d2ff986a5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Took 0.81 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:39:31 np0005588920 nova_compute[226886]: 2026-01-20 14:39:31.004 226890 DEBUG oslo.service.loopingcall [None req-9be67e7e-3bfe-4672-8f65-ee9d2ff986a5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:39:31 np0005588920 nova_compute[226886]: 2026-01-20 14:39:31.004 226890 DEBUG nova.compute.manager [-] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:39:31 np0005588920 nova_compute[226886]: 2026-01-20 14:39:31.005 226890 DEBUG nova.network.neutron [-] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:31.024 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[84a1e43a-1045-4c59-b9b5-06ff457a3825]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:31.026 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd12cbe78-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:31.026 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:31.026 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd12cbe78-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:31 np0005588920 nova_compute[226886]: 2026-01-20 14:39:31.028 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:31 np0005588920 NetworkManager[49076]: <info>  [1768919971.0288] manager: (tapd12cbe78-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/117)
Jan 20 09:39:31 np0005588920 kernel: tapd12cbe78-40: entered promiscuous mode
Jan 20 09:39:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:31Z|00195|binding|INFO|Setting lport 9b0fc629-48f5-469d-90d2-c26339c16eec ovn-installed in OVS
Jan 20 09:39:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:31Z|00196|binding|INFO|Setting lport 9b0fc629-48f5-469d-90d2-c26339c16eec up in Southbound
Jan 20 09:39:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:31Z|00197|binding|INFO|Setting lport 96b6b319-e96c-4182-b940-9f154499e22d ovn-installed in OVS
Jan 20 09:39:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:31Z|00198|binding|INFO|Setting lport 96b6b319-e96c-4182-b940-9f154499e22d up in Southbound
Jan 20 09:39:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:39:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:31.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:39:31 np0005588920 nova_compute[226886]: 2026-01-20 14:39:31.068 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:31Z|00199|binding|INFO|Setting lport 5096b763-1b08-448f-a6bd-f63bcd65def6 up in Southbound
Jan 20 09:39:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:31Z|00200|binding|INFO|Setting lport 7a84a919-d309-4f1a-99b8-4792dcdec990 up in Southbound
Jan 20 09:39:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:31Z|00201|binding|INFO|Setting lport ff70124b-befc-46fa-b2cb-bc4bd4a49942 up in Southbound
Jan 20 09:39:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:31Z|00202|binding|INFO|Setting lport b03b1a08-920f-4340-b77b-37669cc14a07 up in Southbound
Jan 20 09:39:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:31Z|00203|binding|INFO|Setting lport a6658829-ef19-4914-bbb3-35b718691c7c up in Southbound
Jan 20 09:39:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:31Z|00204|binding|INFO|Setting lport 5096b763-1b08-448f-a6bd-f63bcd65def6 ovn-installed in OVS
Jan 20 09:39:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:31Z|00205|binding|INFO|Setting lport 7a84a919-d309-4f1a-99b8-4792dcdec990 ovn-installed in OVS
Jan 20 09:39:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:31Z|00206|binding|INFO|Setting lport ff70124b-befc-46fa-b2cb-bc4bd4a49942 ovn-installed in OVS
Jan 20 09:39:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:31Z|00207|binding|INFO|Setting lport b03b1a08-920f-4340-b77b-37669cc14a07 ovn-installed in OVS
Jan 20 09:39:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:31Z|00208|binding|INFO|Setting lport a6658829-ef19-4914-bbb3-35b718691c7c ovn-installed in OVS
Jan 20 09:39:31 np0005588920 nova_compute[226886]: 2026-01-20 14:39:31.071 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:31.072 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd12cbe78-40, col_values=(('external_ids', {'iface-id': '21e7bdd9-b254-47ae-9eff-81ffb6d9af00'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:31 np0005588920 nova_compute[226886]: 2026-01-20 14:39:31.072 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:31 np0005588920 nova_compute[226886]: 2026-01-20 14:39:31.074 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:31.074 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d12cbe78-47c0-4f23-98a7-4ed621f8c3a3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d12cbe78-47c0-4f23-98a7-4ed621f8c3a3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:31.075 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[dc9bb665-d7e2-4457-b8b8-ecf113dca423]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:31.076 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-d12cbe78-47c0-4f23-98a7-4ed621f8c3a3
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/d12cbe78-47c0-4f23-98a7-4ed621f8c3a3.pid.haproxy
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID d12cbe78-47c0-4f23-98a7-4ed621f8c3a3
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:39:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:31Z|00209|binding|INFO|Releasing lport 21e7bdd9-b254-47ae-9eff-81ffb6d9af00 from this chassis (sb_readonly=0)
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:31.077 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d12cbe78-47c0-4f23-98a7-4ed621f8c3a3', 'env', 'PROCESS_TAG=haproxy-d12cbe78-47c0-4f23-98a7-4ed621f8c3a3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d12cbe78-47c0-4f23-98a7-4ed621f8c3a3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:39:31 np0005588920 nova_compute[226886]: 2026-01-20 14:39:31.089 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:31 np0005588920 nova_compute[226886]: 2026-01-20 14:39:31.226 226890 DEBUG nova.compute.manager [req-5567d833-541e-4ddc-9254-13c63f724321 req-2f09011f-d27c-4140-a389-8f7136e7da16 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Received event network-vif-unplugged-82b46c8a-7331-4dba-b12c-3c4bd0d70a52 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:39:31 np0005588920 nova_compute[226886]: 2026-01-20 14:39:31.227 226890 DEBUG oslo_concurrency.lockutils [req-5567d833-541e-4ddc-9254-13c63f724321 req-2f09011f-d27c-4140-a389-8f7136e7da16 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "c1a45fae-79ce-48c2-81b9-4d1e30165d46-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:31 np0005588920 nova_compute[226886]: 2026-01-20 14:39:31.227 226890 DEBUG oslo_concurrency.lockutils [req-5567d833-541e-4ddc-9254-13c63f724321 req-2f09011f-d27c-4140-a389-8f7136e7da16 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c1a45fae-79ce-48c2-81b9-4d1e30165d46-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:31 np0005588920 nova_compute[226886]: 2026-01-20 14:39:31.227 226890 DEBUG oslo_concurrency.lockutils [req-5567d833-541e-4ddc-9254-13c63f724321 req-2f09011f-d27c-4140-a389-8f7136e7da16 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c1a45fae-79ce-48c2-81b9-4d1e30165d46-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:31 np0005588920 nova_compute[226886]: 2026-01-20 14:39:31.227 226890 DEBUG nova.compute.manager [req-5567d833-541e-4ddc-9254-13c63f724321 req-2f09011f-d27c-4140-a389-8f7136e7da16 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] No waiting events found dispatching network-vif-unplugged-82b46c8a-7331-4dba-b12c-3c4bd0d70a52 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:39:31 np0005588920 nova_compute[226886]: 2026-01-20 14:39:31.227 226890 DEBUG nova.compute.manager [req-5567d833-541e-4ddc-9254-13c63f724321 req-2f09011f-d27c-4140-a389-8f7136e7da16 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Received event network-vif-unplugged-82b46c8a-7331-4dba-b12c-3c4bd0d70a52 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:39:31 np0005588920 nova_compute[226886]: 2026-01-20 14:39:31.228 226890 DEBUG nova.compute.manager [req-5567d833-541e-4ddc-9254-13c63f724321 req-2f09011f-d27c-4140-a389-8f7136e7da16 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Received event network-vif-plugged-82b46c8a-7331-4dba-b12c-3c4bd0d70a52 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:39:31 np0005588920 nova_compute[226886]: 2026-01-20 14:39:31.228 226890 DEBUG oslo_concurrency.lockutils [req-5567d833-541e-4ddc-9254-13c63f724321 req-2f09011f-d27c-4140-a389-8f7136e7da16 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "c1a45fae-79ce-48c2-81b9-4d1e30165d46-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:31 np0005588920 nova_compute[226886]: 2026-01-20 14:39:31.228 226890 DEBUG oslo_concurrency.lockutils [req-5567d833-541e-4ddc-9254-13c63f724321 req-2f09011f-d27c-4140-a389-8f7136e7da16 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c1a45fae-79ce-48c2-81b9-4d1e30165d46-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:31 np0005588920 nova_compute[226886]: 2026-01-20 14:39:31.228 226890 DEBUG oslo_concurrency.lockutils [req-5567d833-541e-4ddc-9254-13c63f724321 req-2f09011f-d27c-4140-a389-8f7136e7da16 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c1a45fae-79ce-48c2-81b9-4d1e30165d46-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:31 np0005588920 nova_compute[226886]: 2026-01-20 14:39:31.228 226890 DEBUG nova.compute.manager [req-5567d833-541e-4ddc-9254-13c63f724321 req-2f09011f-d27c-4140-a389-8f7136e7da16 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] No waiting events found dispatching network-vif-plugged-82b46c8a-7331-4dba-b12c-3c4bd0d70a52 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:39:31 np0005588920 nova_compute[226886]: 2026-01-20 14:39:31.229 226890 WARNING nova.compute.manager [req-5567d833-541e-4ddc-9254-13c63f724321 req-2f09011f-d27c-4140-a389-8f7136e7da16 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Received unexpected event network-vif-plugged-82b46c8a-7331-4dba-b12c-3c4bd0d70a52 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:39:31 np0005588920 nova_compute[226886]: 2026-01-20 14:39:31.256 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919971.2556293, 795b0a95-448b-49b1-80cb-a18e84101480 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:39:31 np0005588920 nova_compute[226886]: 2026-01-20 14:39:31.256 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] VM Started (Lifecycle Event)#033[00m
Jan 20 09:39:31 np0005588920 nova_compute[226886]: 2026-01-20 14:39:31.298 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:39:31 np0005588920 nova_compute[226886]: 2026-01-20 14:39:31.303 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919971.2558146, 795b0a95-448b-49b1-80cb-a18e84101480 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:39:31 np0005588920 nova_compute[226886]: 2026-01-20 14:39:31.303 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:39:31 np0005588920 nova_compute[226886]: 2026-01-20 14:39:31.322 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:39:31 np0005588920 nova_compute[226886]: 2026-01-20 14:39:31.326 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:39:31 np0005588920 nova_compute[226886]: 2026-01-20 14:39:31.348 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:39:31 np0005588920 nova_compute[226886]: 2026-01-20 14:39:31.382 226890 DEBUG nova.compute.manager [req-e15c0981-81d2-4fe5-8e9a-e1438245b12f req-212e5bf9-c1d7-4379-a4a9-4b38017e3a69 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received event network-vif-plugged-7a84a919-d309-4f1a-99b8-4792dcdec990 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:39:31 np0005588920 nova_compute[226886]: 2026-01-20 14:39:31.382 226890 DEBUG oslo_concurrency.lockutils [req-e15c0981-81d2-4fe5-8e9a-e1438245b12f req-212e5bf9-c1d7-4379-a4a9-4b38017e3a69 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "795b0a95-448b-49b1-80cb-a18e84101480-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:31 np0005588920 nova_compute[226886]: 2026-01-20 14:39:31.383 226890 DEBUG oslo_concurrency.lockutils [req-e15c0981-81d2-4fe5-8e9a-e1438245b12f req-212e5bf9-c1d7-4379-a4a9-4b38017e3a69 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:31 np0005588920 nova_compute[226886]: 2026-01-20 14:39:31.383 226890 DEBUG oslo_concurrency.lockutils [req-e15c0981-81d2-4fe5-8e9a-e1438245b12f req-212e5bf9-c1d7-4379-a4a9-4b38017e3a69 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:31 np0005588920 nova_compute[226886]: 2026-01-20 14:39:31.383 226890 DEBUG nova.compute.manager [req-e15c0981-81d2-4fe5-8e9a-e1438245b12f req-212e5bf9-c1d7-4379-a4a9-4b38017e3a69 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Processing event network-vif-plugged-7a84a919-d309-4f1a-99b8-4792dcdec990 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:39:31 np0005588920 podman[249758]: 2026-01-20 14:39:31.416278525 +0000 UTC m=+0.043349239 container create 75c2e5e1746b49218dbd4bf0f04997dcc4c57a444abaa8ba57c893cc1dfc8c97 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d12cbe78-47c0-4f23-98a7-4ed621f8c3a3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 20 09:39:31 np0005588920 systemd[1]: Started libpod-conmon-75c2e5e1746b49218dbd4bf0f04997dcc4c57a444abaa8ba57c893cc1dfc8c97.scope.
Jan 20 09:39:31 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:39:31 np0005588920 podman[249758]: 2026-01-20 14:39:31.394679603 +0000 UTC m=+0.021750337 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:39:31 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c645bc67a6a8362f49f9aa55f94e8c822fcd92599a6c86d5356ba7d9f9472734/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:39:31 np0005588920 nova_compute[226886]: 2026-01-20 14:39:31.496 226890 DEBUG nova.compute.manager [req-7c13b44c-b89c-4f56-b82c-8efe2d27cea7 req-09244c89-e76f-4ceb-8f31-a8fc625528d1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Received event network-vif-deleted-f6b42586-082e-4da5-b1fd-5723992197fe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:39:31 np0005588920 nova_compute[226886]: 2026-01-20 14:39:31.497 226890 INFO nova.compute.manager [req-7c13b44c-b89c-4f56-b82c-8efe2d27cea7 req-09244c89-e76f-4ceb-8f31-a8fc625528d1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Neutron deleted interface f6b42586-082e-4da5-b1fd-5723992197fe; detaching it from the instance and deleting it from the info cache#033[00m
Jan 20 09:39:31 np0005588920 nova_compute[226886]: 2026-01-20 14:39:31.497 226890 DEBUG nova.network.neutron [req-7c13b44c-b89c-4f56-b82c-8efe2d27cea7 req-09244c89-e76f-4ceb-8f31-a8fc625528d1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Updating instance_info_cache with network_info: [{"id": "82b46c8a-7331-4dba-b12c-3c4bd0d70a52", "address": "fa:16:3e:b2:0f:fa", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82b46c8a-73", "ovs_interfaceid": "82b46c8a-7331-4dba-b12c-3c4bd0d70a52", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:39:31 np0005588920 podman[249758]: 2026-01-20 14:39:31.50862559 +0000 UTC m=+0.135696354 container init 75c2e5e1746b49218dbd4bf0f04997dcc4c57a444abaa8ba57c893cc1dfc8c97 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d12cbe78-47c0-4f23-98a7-4ed621f8c3a3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 20 09:39:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:31.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:31 np0005588920 podman[249758]: 2026-01-20 14:39:31.518374031 +0000 UTC m=+0.145444745 container start 75c2e5e1746b49218dbd4bf0f04997dcc4c57a444abaa8ba57c893cc1dfc8c97 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d12cbe78-47c0-4f23-98a7-4ed621f8c3a3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 20 09:39:31 np0005588920 nova_compute[226886]: 2026-01-20 14:39:31.521 226890 DEBUG nova.compute.manager [req-7c13b44c-b89c-4f56-b82c-8efe2d27cea7 req-09244c89-e76f-4ceb-8f31-a8fc625528d1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Detach interface failed, port_id=f6b42586-082e-4da5-b1fd-5723992197fe, reason: Instance c1a45fae-79ce-48c2-81b9-4d1e30165d46 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 20 09:39:31 np0005588920 neutron-haproxy-ovnmeta-d12cbe78-47c0-4f23-98a7-4ed621f8c3a3[249773]: [NOTICE]   (249777) : New worker (249779) forked
Jan 20 09:39:31 np0005588920 neutron-haproxy-ovnmeta-d12cbe78-47c0-4f23-98a7-4ed621f8c3a3[249773]: [NOTICE]   (249777) : Loading success.
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:31.571 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 9b0fc629-48f5-469d-90d2-c26339c16eec in datapath b3a21065-10a5-474d-b42f-ffe66242a479 unbound from our chassis#033[00m
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:31.573 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b3a21065-10a5-474d-b42f-ffe66242a479#033[00m
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:31.584 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d7f426e6-ff30-4777-bb09-e0fa83ed33af]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:31.585 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb3a21065-11 in ovnmeta-b3a21065-10a5-474d-b42f-ffe66242a479 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:31.587 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb3a21065-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:31.587 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[99d93557-67f5-49cc-8b42-7fe06f78aea3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:31.589 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[61890c4d-5d88-42e7-8888-0ef4bb4a9fba]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:31.600 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[6b76fa07-fa04-4a22-b069-a7100ede19bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:31.615 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[cbae4d64-7c69-49a4-9756-4bbd84b993fd]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:31.647 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[9624ecc8-ec30-414c-8c63-27b00c157d98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:31 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:39:31 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/722968545' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:39:31 np0005588920 NetworkManager[49076]: <info>  [1768919971.6544] manager: (tapb3a21065-10): new Veth device (/org/freedesktop/NetworkManager/Devices/118)
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:31.654 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[8fa93896-2564-4ed8-9f20-81583f625985]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:31.688 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[3d223d63-6059-4c74-8ffb-3943a6953820]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:31.691 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[bc001570-af05-499c-be95-c854577a4ddd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:31 np0005588920 NetworkManager[49076]: <info>  [1768919971.7161] device (tapb3a21065-10): carrier: link connected
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:31.723 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[17756304-773c-49a0-95be-387662ef3a37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:31.741 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ba7030c6-bd60-4ac9-a65a-d31bada23ceb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb3a21065-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c6:78:45'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 67], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 501340, 'reachable_time': 15551, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249798, 'error': None, 'target': 'ovnmeta-b3a21065-10a5-474d-b42f-ffe66242a479', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:31.760 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[858adfa1-4ab0-429e-a9f5-ff65d6e2d4cd]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec6:7845'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 501340, 'tstamp': 501340}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 249799, 'error': None, 'target': 'ovnmeta-b3a21065-10a5-474d-b42f-ffe66242a479', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:31.777 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[14d0e288-1971-4ddf-9d1f-423f4d5dd2bc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb3a21065-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c6:78:45'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 67], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 501340, 'reachable_time': 15551, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 249800, 'error': None, 'target': 'ovnmeta-b3a21065-10a5-474d-b42f-ffe66242a479', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:31.812 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[8dde0513-ed53-4ea9-b438-940a93c9e042]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:31.863 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[1108a4e8-4299-456a-b8e0-f3aa2a9c043f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:31.864 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb3a21065-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:31.864 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:31.865 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb3a21065-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:31 np0005588920 nova_compute[226886]: 2026-01-20 14:39:31.866 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:31 np0005588920 NetworkManager[49076]: <info>  [1768919971.8675] manager: (tapb3a21065-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/119)
Jan 20 09:39:31 np0005588920 kernel: tapb3a21065-10: entered promiscuous mode
Jan 20 09:39:31 np0005588920 nova_compute[226886]: 2026-01-20 14:39:31.869 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:31.869 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb3a21065-10, col_values=(('external_ids', {'iface-id': '17074171-3504-4b6b-920e-629def75ccc9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:31 np0005588920 nova_compute[226886]: 2026-01-20 14:39:31.870 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:31Z|00210|binding|INFO|Releasing lport 17074171-3504-4b6b-920e-629def75ccc9 from this chassis (sb_readonly=0)
Jan 20 09:39:31 np0005588920 nova_compute[226886]: 2026-01-20 14:39:31.888 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:31 np0005588920 nova_compute[226886]: 2026-01-20 14:39:31.889 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:31.889 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b3a21065-10a5-474d-b42f-ffe66242a479.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b3a21065-10a5-474d-b42f-ffe66242a479.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:31.890 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[69da219d-0dba-4df3-92cb-58e27d5c549e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:31.891 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-b3a21065-10a5-474d-b42f-ffe66242a479
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/b3a21065-10a5-474d-b42f-ffe66242a479.pid.haproxy
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID b3a21065-10a5-474d-b42f-ffe66242a479
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:39:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:31.891 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b3a21065-10a5-474d-b42f-ffe66242a479', 'env', 'PROCESS_TAG=haproxy-b3a21065-10a5-474d-b42f-ffe66242a479', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b3a21065-10a5-474d-b42f-ffe66242a479.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.048 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Updating instance_info_cache with network_info: [{"id": "82b46c8a-7331-4dba-b12c-3c4bd0d70a52", "address": "fa:16:3e:b2:0f:fa", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82b46c8a-73", "ovs_interfaceid": "82b46c8a-7331-4dba-b12c-3c4bd0d70a52", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "f6b42586-082e-4da5-b1fd-5723992197fe", "address": "fa:16:3e:f1:e6:93", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6b42586-08", "ovs_interfaceid": "f6b42586-082e-4da5-b1fd-5723992197fe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.072 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Releasing lock "refresh_cache-c1a45fae-79ce-48c2-81b9-4d1e30165d46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.073 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.073 226890 DEBUG oslo_concurrency.lockutils [None req-4a8dd796-3d53-47fb-bffa-7c56caf5e8d3 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquired lock "refresh_cache-c1a45fae-79ce-48c2-81b9-4d1e30165d46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.073 226890 DEBUG nova.network.neutron [None req-4a8dd796-3d53-47fb-bffa-7c56caf5e8d3 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.074 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.074 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.075 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.075 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.075 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.075 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.075 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.075 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.079 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.079 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.114 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.115 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.115 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.116 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.116 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:39:32 np0005588920 podman[249833]: 2026-01-20 14:39:32.260378507 +0000 UTC m=+0.062380790 container create 84fb5f99fc9a47e6bcdd0d11fe6080861313943298fd99b3df678b48b51c0fcf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b3a21065-10a5-474d-b42f-ffe66242a479, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 20 09:39:32 np0005588920 systemd[1]: Started libpod-conmon-84fb5f99fc9a47e6bcdd0d11fe6080861313943298fd99b3df678b48b51c0fcf.scope.
Jan 20 09:39:32 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:39:32 np0005588920 podman[249833]: 2026-01-20 14:39:32.229560748 +0000 UTC m=+0.031563141 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:39:32 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b817755456659e24bdeebbbd56ba4e937c79185bfa874a84684085b765f21a2d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:39:32 np0005588920 podman[249833]: 2026-01-20 14:39:32.341087257 +0000 UTC m=+0.143089540 container init 84fb5f99fc9a47e6bcdd0d11fe6080861313943298fd99b3df678b48b51c0fcf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b3a21065-10a5-474d-b42f-ffe66242a479, org.label-schema.build-date=20251202, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:39:32 np0005588920 podman[249833]: 2026-01-20 14:39:32.345925742 +0000 UTC m=+0.147928025 container start 84fb5f99fc9a47e6bcdd0d11fe6080861313943298fd99b3df678b48b51c0fcf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b3a21065-10a5-474d-b42f-ffe66242a479, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:39:32 np0005588920 neutron-haproxy-ovnmeta-b3a21065-10a5-474d-b42f-ffe66242a479[249866]: [NOTICE]   (249870) : New worker (249872) forked
Jan 20 09:39:32 np0005588920 neutron-haproxy-ovnmeta-b3a21065-10a5-474d-b42f-ffe66242a479[249866]: [NOTICE]   (249870) : Loading success.
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.414 144128 INFO neutron.agent.ovn.metadata.agent [-] Port a6658829-ef19-4914-bbb3-35b718691c7c in datapath d12cbe78-47c0-4f23-98a7-4ed621f8c3a3 unbound from our chassis#033[00m
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.416 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d12cbe78-47c0-4f23-98a7-4ed621f8c3a3#033[00m
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.433 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[75ac7613-d3a9-4342-a159-14187c3d826d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.469 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[f9bc52bf-054a-4ab5-a8e9-552eba3e3136]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.472 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[fe6b2efc-0dcb-461b-81be-1f1f955ebe2b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.498 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[596f9e68-255c-45cf-a51d-aa254538d150]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.518 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[9a54a815-fbab-49cd-94c9-10db70fc1d7a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd12cbe78-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:80:9f:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 3, 'tx_packets': 6, 'rx_bytes': 266, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 3, 'tx_packets': 6, 'rx_bytes': 266, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 66], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 501254, 'reachable_time': 38169, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 3, 'inoctets': 224, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 3, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 224, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 3, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249886, 'error': None, 'target': 'ovnmeta-d12cbe78-47c0-4f23-98a7-4ed621f8c3a3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.535 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[3255a737-f0bc-48ac-9f75-2515f6835c14]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd12cbe78-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 501266, 'tstamp': 501266}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 249887, 'error': None, 'target': 'ovnmeta-d12cbe78-47c0-4f23-98a7-4ed621f8c3a3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.1.1.2'], ['IFA_LOCAL', '10.1.1.2'], ['IFA_BROADCAST', '10.1.1.255'], ['IFA_LABEL', 'tapd12cbe78-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 501270, 'tstamp': 501270}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 249887, 'error': None, 'target': 'ovnmeta-d12cbe78-47c0-4f23-98a7-4ed621f8c3a3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:39:32 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2096865750' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.537 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd12cbe78-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.580 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.582 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd12cbe78-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.582 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.583 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd12cbe78-40, col_values=(('external_ids', {'iface-id': '21e7bdd9-b254-47ae-9eff-81ffb6d9af00'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.583 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.584 144128 INFO neutron.agent.ovn.metadata.agent [-] Port b03b1a08-920f-4340-b77b-37669cc14a07 in datapath d12cbe78-47c0-4f23-98a7-4ed621f8c3a3 unbound from our chassis#033[00m
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.587 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d12cbe78-47c0-4f23-98a7-4ed621f8c3a3#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.592 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.599 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[9c31dc55-6551-4566-88ef-2c05a6679e7d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.622 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[a68b624a-12a9-4351-aff3-e535963e6f8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.625 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[68b98a3f-0e0c-48e0-8ab4-7b81da63319e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.646 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[d3bd878e-3859-4fcd-b936-b6a918beeb9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.660 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[367c46df-82cb-44df-b7a3-4ea8d62c619a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd12cbe78-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:80:9f:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 3, 'tx_packets': 8, 'rx_bytes': 266, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 3, 'tx_packets': 8, 'rx_bytes': 266, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 66], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 501254, 'reachable_time': 38169, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 3, 'inoctets': 224, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 3, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 224, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 3, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249896, 'error': None, 'target': 'ovnmeta-d12cbe78-47c0-4f23-98a7-4ed621f8c3a3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.672 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[87f92050-dae7-456d-92b2-73ab7aee0234]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd12cbe78-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 501266, 'tstamp': 501266}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 249897, 'error': None, 'target': 'ovnmeta-d12cbe78-47c0-4f23-98a7-4ed621f8c3a3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.1.1.2'], ['IFA_LOCAL', '10.1.1.2'], ['IFA_BROADCAST', '10.1.1.255'], ['IFA_LABEL', 'tapd12cbe78-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 501270, 'tstamp': 501270}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 249897, 'error': None, 'target': 'ovnmeta-d12cbe78-47c0-4f23-98a7-4ed621f8c3a3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.673 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd12cbe78-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.674 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.677 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd12cbe78-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.677 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.677 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd12cbe78-40, col_values=(('external_ids', {'iface-id': '21e7bdd9-b254-47ae-9eff-81ffb6d9af00'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.677 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.678 144128 INFO neutron.agent.ovn.metadata.agent [-] Port ff70124b-befc-46fa-b2cb-bc4bd4a49942 in datapath a36edd9d-12ce-4779-97fe-f75c00d85dcd unbound from our chassis#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.679 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.680 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a36edd9d-12ce-4779-97fe-f75c00d85dcd#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.688 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-00000040 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.688 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-00000040 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.689 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-00000040 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.689 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[4139beca-f6d5-44d7-b562-a22f786b668d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.689 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa36edd9d-11 in ovnmeta-a36edd9d-12ce-4779-97fe-f75c00d85dcd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.689 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-00000040 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.692 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa36edd9d-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.692 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0d57a2c2-f821-4e36-af90-1372099fb8f7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.693 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5eb93cc9-4320-4b88-bba3-6b0f6a1c0c8b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.705 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[3ca81cde-abe9-48e5-946c-06c71f80e946]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.730 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[34c95aa6-f017-4b5b-83ac-bc7d3ee660e6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.760 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[26f6b570-fcaf-4fd9-b479-b733a45949ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.766 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[287a07dc-4b64-46a5-a4e3-05f33d99ad86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:32 np0005588920 NetworkManager[49076]: <info>  [1768919972.7683] manager: (tapa36edd9d-10): new Veth device (/org/freedesktop/NetworkManager/Devices/120)
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.793 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[f7d355fb-b6a3-40de-9bdb-193f939508a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.796 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[a119fcf4-bc66-4392-9716-3adcf6c8367b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.811 226890 DEBUG nova.network.neutron [req-bfa29c13-cab6-4daf-9ad7-9e485f8fccc2 req-01c224ce-f871-4143-a5fc-66d579e627ba 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Updated VIF entry in instance network info cache for port 5096b763-1b08-448f-a6bd-f63bcd65def6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.811 226890 DEBUG nova.network.neutron [req-bfa29c13-cab6-4daf-9ad7-9e485f8fccc2 req-01c224ce-f871-4143-a5fc-66d579e627ba 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Updating instance_info_cache with network_info: [{"id": "9b0fc629-48f5-469d-90d2-c26339c16eec", "address": "fa:16:3e:11:9c:9c", "network": {"id": "b3a21065-10a5-474d-b42f-ffe66242a479", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest_v242-807218413-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b0fc629-48", "ovs_interfaceid": "9b0fc629-48f5-469d-90d2-c26339c16eec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "96b6b319-e96c-4182-b940-9f154499e22d", "address": "fa:16:3e:1a:ef:b0", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.91", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96b6b319-e9", "ovs_interfaceid": "96b6b319-e96c-4182-b940-9f154499e22d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "a6658829-ef19-4914-bbb3-35b718691c7c", "address": "fa:16:3e:c8:a5:f5", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.112", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6658829-ef", "ovs_interfaceid": "a6658829-ef19-4914-bbb3-35b718691c7c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "b03b1a08-920f-4340-b77b-37669cc14a07", "address": "fa:16:3e:b9:ca:05", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.214", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, 
"meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb03b1a08-92", "ovs_interfaceid": "b03b1a08-920f-4340-b77b-37669cc14a07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7a84a919-d309-4f1a-99b8-4792dcdec990", "address": "fa:16:3e:c7:9a:c3", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.32", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a84a919-d3", "ovs_interfaceid": "7a84a919-d309-4f1a-99b8-4792dcdec990", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ff70124b-befc-46fa-b2cb-bc4bd4a49942", "address": "fa:16:3e:97:d4:a5", "network": {"id": "a36edd9d-12ce-4779-97fe-f75c00d85dcd", "bridge": "br-int", "label": "tempest-device-tagging-net2-56635846", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff70124b-be", "ovs_interfaceid": "ff70124b-befc-46fa-b2cb-bc4bd4a49942", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "5096b763-1b08-448f-a6bd-f63bcd65def6", "address": "fa:16:3e:5e:80:45", "network": {"id": "a36edd9d-12ce-4779-97fe-f75c00d85dcd", "bridge": "br-int", "label": "tempest-device-tagging-net2-56635846", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5096b763-1b", "ovs_interfaceid": "5096b763-1b08-448f-a6bd-f63bcd65def6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:39:32 np0005588920 NetworkManager[49076]: <info>  [1768919972.8183] device (tapa36edd9d-10): carrier: link connected
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.824 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[378092f3-9446-474a-8bda-67b35f3851d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.830 226890 DEBUG oslo_concurrency.lockutils [req-bfa29c13-cab6-4daf-9ad7-9e485f8fccc2 req-01c224ce-f871-4143-a5fc-66d579e627ba 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-795b0a95-448b-49b1-80cb-a18e84101480" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.841 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[17acac6c-4e20-42de-b099-cb623161fd6e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa36edd9d-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:73:e3:f0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 68], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 501450, 'reachable_time': 19116, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249908, 'error': None, 'target': 'ovnmeta-a36edd9d-12ce-4779-97fe-f75c00d85dcd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.859 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[823de967-e064-4985-875d-3d887ad953bf]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe73:e3f0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 501450, 'tstamp': 501450}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 249909, 'error': None, 'target': 'ovnmeta-a36edd9d-12ce-4779-97fe-f75c00d85dcd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.876 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5dbf5f6c-0ee1-4524-878a-abdddacd510e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa36edd9d-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:73:e3:f0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 68], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 501450, 'reachable_time': 19116, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 249910, 'error': None, 'target': 'ovnmeta-a36edd9d-12ce-4779-97fe-f75c00d85dcd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.898 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.899 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4484MB free_disk=20.852645874023438GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.899 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.900 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.905 226890 DEBUG nova.compute.manager [req-3a778949-eeb8-423b-bfc7-c0bdb7ac7fe0 req-2d3a3e2e-e212-47a4-927f-45bb73456a60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received event network-vif-plugged-96b6b319-e96c-4182-b940-9f154499e22d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.905 226890 DEBUG oslo_concurrency.lockutils [req-3a778949-eeb8-423b-bfc7-c0bdb7ac7fe0 req-2d3a3e2e-e212-47a4-927f-45bb73456a60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "795b0a95-448b-49b1-80cb-a18e84101480-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.905 226890 DEBUG oslo_concurrency.lockutils [req-3a778949-eeb8-423b-bfc7-c0bdb7ac7fe0 req-2d3a3e2e-e212-47a4-927f-45bb73456a60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.905 226890 DEBUG oslo_concurrency.lockutils [req-3a778949-eeb8-423b-bfc7-c0bdb7ac7fe0 req-2d3a3e2e-e212-47a4-927f-45bb73456a60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.906 226890 DEBUG nova.compute.manager [req-3a778949-eeb8-423b-bfc7-c0bdb7ac7fe0 req-2d3a3e2e-e212-47a4-927f-45bb73456a60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Processing event network-vif-plugged-96b6b319-e96c-4182-b940-9f154499e22d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.906 226890 DEBUG nova.compute.manager [req-3a778949-eeb8-423b-bfc7-c0bdb7ac7fe0 req-2d3a3e2e-e212-47a4-927f-45bb73456a60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received event network-vif-plugged-96b6b319-e96c-4182-b940-9f154499e22d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.906 226890 DEBUG oslo_concurrency.lockutils [req-3a778949-eeb8-423b-bfc7-c0bdb7ac7fe0 req-2d3a3e2e-e212-47a4-927f-45bb73456a60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "795b0a95-448b-49b1-80cb-a18e84101480-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.906 226890 DEBUG oslo_concurrency.lockutils [req-3a778949-eeb8-423b-bfc7-c0bdb7ac7fe0 req-2d3a3e2e-e212-47a4-927f-45bb73456a60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.906 226890 DEBUG oslo_concurrency.lockutils [req-3a778949-eeb8-423b-bfc7-c0bdb7ac7fe0 req-2d3a3e2e-e212-47a4-927f-45bb73456a60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.906 226890 DEBUG nova.compute.manager [req-3a778949-eeb8-423b-bfc7-c0bdb7ac7fe0 req-2d3a3e2e-e212-47a4-927f-45bb73456a60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] No event matching network-vif-plugged-96b6b319-e96c-4182-b940-9f154499e22d in dict_keys([('network-vif-plugged', '9b0fc629-48f5-469d-90d2-c26339c16eec'), ('network-vif-plugged', 'a6658829-ef19-4914-bbb3-35b718691c7c'), ('network-vif-plugged', 'b03b1a08-920f-4340-b77b-37669cc14a07'), ('network-vif-plugged', 'ff70124b-befc-46fa-b2cb-bc4bd4a49942'), ('network-vif-plugged', '5096b763-1b08-448f-a6bd-f63bcd65def6')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.906 226890 WARNING nova.compute.manager [req-3a778949-eeb8-423b-bfc7-c0bdb7ac7fe0 req-2d3a3e2e-e212-47a4-927f-45bb73456a60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received unexpected event network-vif-plugged-96b6b319-e96c-4182-b940-9f154499e22d for instance with vm_state building and task_state spawning.#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.907 226890 DEBUG nova.compute.manager [req-3a778949-eeb8-423b-bfc7-c0bdb7ac7fe0 req-2d3a3e2e-e212-47a4-927f-45bb73456a60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received event network-vif-plugged-a6658829-ef19-4914-bbb3-35b718691c7c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.907 226890 DEBUG oslo_concurrency.lockutils [req-3a778949-eeb8-423b-bfc7-c0bdb7ac7fe0 req-2d3a3e2e-e212-47a4-927f-45bb73456a60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "795b0a95-448b-49b1-80cb-a18e84101480-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.907 226890 DEBUG oslo_concurrency.lockutils [req-3a778949-eeb8-423b-bfc7-c0bdb7ac7fe0 req-2d3a3e2e-e212-47a4-927f-45bb73456a60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.907 226890 DEBUG oslo_concurrency.lockutils [req-3a778949-eeb8-423b-bfc7-c0bdb7ac7fe0 req-2d3a3e2e-e212-47a4-927f-45bb73456a60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.907 226890 DEBUG nova.compute.manager [req-3a778949-eeb8-423b-bfc7-c0bdb7ac7fe0 req-2d3a3e2e-e212-47a4-927f-45bb73456a60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Processing event network-vif-plugged-a6658829-ef19-4914-bbb3-35b718691c7c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.907 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[33ec6abe-0428-42b8-bf1d-51016d6088b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.907 226890 DEBUG nova.compute.manager [req-3a778949-eeb8-423b-bfc7-c0bdb7ac7fe0 req-2d3a3e2e-e212-47a4-927f-45bb73456a60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received event network-vif-plugged-a6658829-ef19-4914-bbb3-35b718691c7c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.908 226890 DEBUG oslo_concurrency.lockutils [req-3a778949-eeb8-423b-bfc7-c0bdb7ac7fe0 req-2d3a3e2e-e212-47a4-927f-45bb73456a60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "795b0a95-448b-49b1-80cb-a18e84101480-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.908 226890 DEBUG oslo_concurrency.lockutils [req-3a778949-eeb8-423b-bfc7-c0bdb7ac7fe0 req-2d3a3e2e-e212-47a4-927f-45bb73456a60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.908 226890 DEBUG oslo_concurrency.lockutils [req-3a778949-eeb8-423b-bfc7-c0bdb7ac7fe0 req-2d3a3e2e-e212-47a4-927f-45bb73456a60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.908 226890 DEBUG nova.compute.manager [req-3a778949-eeb8-423b-bfc7-c0bdb7ac7fe0 req-2d3a3e2e-e212-47a4-927f-45bb73456a60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] No event matching network-vif-plugged-a6658829-ef19-4914-bbb3-35b718691c7c in dict_keys([('network-vif-plugged', '9b0fc629-48f5-469d-90d2-c26339c16eec'), ('network-vif-plugged', 'b03b1a08-920f-4340-b77b-37669cc14a07'), ('network-vif-plugged', 'ff70124b-befc-46fa-b2cb-bc4bd4a49942'), ('network-vif-plugged', '5096b763-1b08-448f-a6bd-f63bcd65def6')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.908 226890 WARNING nova.compute.manager [req-3a778949-eeb8-423b-bfc7-c0bdb7ac7fe0 req-2d3a3e2e-e212-47a4-927f-45bb73456a60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received unexpected event network-vif-plugged-a6658829-ef19-4914-bbb3-35b718691c7c for instance with vm_state building and task_state spawning.#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.908 226890 DEBUG nova.compute.manager [req-3a778949-eeb8-423b-bfc7-c0bdb7ac7fe0 req-2d3a3e2e-e212-47a4-927f-45bb73456a60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received event network-vif-plugged-5096b763-1b08-448f-a6bd-f63bcd65def6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.908 226890 DEBUG oslo_concurrency.lockutils [req-3a778949-eeb8-423b-bfc7-c0bdb7ac7fe0 req-2d3a3e2e-e212-47a4-927f-45bb73456a60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "795b0a95-448b-49b1-80cb-a18e84101480-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.909 226890 DEBUG oslo_concurrency.lockutils [req-3a778949-eeb8-423b-bfc7-c0bdb7ac7fe0 req-2d3a3e2e-e212-47a4-927f-45bb73456a60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.909 226890 DEBUG oslo_concurrency.lockutils [req-3a778949-eeb8-423b-bfc7-c0bdb7ac7fe0 req-2d3a3e2e-e212-47a4-927f-45bb73456a60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.909 226890 DEBUG nova.compute.manager [req-3a778949-eeb8-423b-bfc7-c0bdb7ac7fe0 req-2d3a3e2e-e212-47a4-927f-45bb73456a60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Processing event network-vif-plugged-5096b763-1b08-448f-a6bd-f63bcd65def6 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.953 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[55c2f7db-6893-4f20-863c-8b0f66575d53]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.954 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa36edd9d-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.954 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.955 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa36edd9d-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.956 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:32 np0005588920 NetworkManager[49076]: <info>  [1768919972.9574] manager: (tapa36edd9d-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/121)
Jan 20 09:39:32 np0005588920 kernel: tapa36edd9d-10: entered promiscuous mode
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.959 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa36edd9d-10, col_values=(('external_ids', {'iface-id': '56d835cb-2d2e-46bd-b178-20a6f3fad645'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:32 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:32Z|00211|binding|INFO|Releasing lport 56d835cb-2d2e-46bd-b178-20a6f3fad645 from this chassis (sb_readonly=0)
Jan 20 09:39:32 np0005588920 nova_compute[226886]: 2026-01-20 14:39:32.974 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.975 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a36edd9d-12ce-4779-97fe-f75c00d85dcd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a36edd9d-12ce-4779-97fe-f75c00d85dcd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.976 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[44699590-f281-458d-8007-d2e53639f780]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.976 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-a36edd9d-12ce-4779-97fe-f75c00d85dcd
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/a36edd9d-12ce-4779-97fe-f75c00d85dcd.pid.haproxy
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID a36edd9d-12ce-4779-97fe-f75c00d85dcd
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:39:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:32.978 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a36edd9d-12ce-4779-97fe-f75c00d85dcd', 'env', 'PROCESS_TAG=haproxy-a36edd9d-12ce-4779-97fe-f75c00d85dcd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a36edd9d-12ce-4779-97fe-f75c00d85dcd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.001 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance c1a45fae-79ce-48c2-81b9-4d1e30165d46 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.001 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance 795b0a95-448b-49b1-80cb-a18e84101480 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.001 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.001 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.062 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:39:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:33.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.087 226890 DEBUG nova.network.neutron [-] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.103 226890 INFO nova.compute.manager [-] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Took 2.10 seconds to deallocate network for instance.#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.145 226890 DEBUG oslo_concurrency.lockutils [None req-9be67e7e-3bfe-4672-8f65-ee9d2ff986a5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:33 np0005588920 podman[249963]: 2026-01-20 14:39:33.358565052 +0000 UTC m=+0.052731531 container create 2e6c730101f45f795ca0e3f94bad37e770b3dfac08300eb7c54a40bd9657b8e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a36edd9d-12ce-4779-97fe-f75c00d85dcd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 20 09:39:33 np0005588920 systemd[1]: Started libpod-conmon-2e6c730101f45f795ca0e3f94bad37e770b3dfac08300eb7c54a40bd9657b8e2.scope.
Jan 20 09:39:33 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:39:33 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9a0924e85c686d8695d859d46be9da4f5e6c4cc11a5cef51ccca9b849a2f1e0f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:39:33 np0005588920 podman[249963]: 2026-01-20 14:39:33.333111332 +0000 UTC m=+0.027277831 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:39:33 np0005588920 podman[249963]: 2026-01-20 14:39:33.43132332 +0000 UTC m=+0.125489849 container init 2e6c730101f45f795ca0e3f94bad37e770b3dfac08300eb7c54a40bd9657b8e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a36edd9d-12ce-4779-97fe-f75c00d85dcd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 20 09:39:33 np0005588920 podman[249963]: 2026-01-20 14:39:33.436747432 +0000 UTC m=+0.130913921 container start 2e6c730101f45f795ca0e3f94bad37e770b3dfac08300eb7c54a40bd9657b8e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a36edd9d-12ce-4779-97fe-f75c00d85dcd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 20 09:39:33 np0005588920 neutron-haproxy-ovnmeta-a36edd9d-12ce-4779-97fe-f75c00d85dcd[249978]: [NOTICE]   (249982) : New worker (249984) forked
Jan 20 09:39:33 np0005588920 neutron-haproxy-ovnmeta-a36edd9d-12ce-4779-97fe-f75c00d85dcd[249978]: [NOTICE]   (249982) : Loading success.
Jan 20 09:39:33 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:39:33 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2502978402' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.504 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.509 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:39:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:33.512 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 5096b763-1b08-448f-a6bd-f63bcd65def6 in datapath a36edd9d-12ce-4779-97fe-f75c00d85dcd unbound from our chassis#033[00m
Jan 20 09:39:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:33.514 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a36edd9d-12ce-4779-97fe-f75c00d85dcd#033[00m
Jan 20 09:39:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:39:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:33.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.515 226890 DEBUG nova.compute.manager [req-9938d322-7b85-4f59-a76d-e8221d260062 req-6bf867f1-b386-4038-9672-83d2b349dff9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received event network-vif-plugged-7a84a919-d309-4f1a-99b8-4792dcdec990 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.516 226890 DEBUG oslo_concurrency.lockutils [req-9938d322-7b85-4f59-a76d-e8221d260062 req-6bf867f1-b386-4038-9672-83d2b349dff9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "795b0a95-448b-49b1-80cb-a18e84101480-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.517 226890 DEBUG oslo_concurrency.lockutils [req-9938d322-7b85-4f59-a76d-e8221d260062 req-6bf867f1-b386-4038-9672-83d2b349dff9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.517 226890 DEBUG oslo_concurrency.lockutils [req-9938d322-7b85-4f59-a76d-e8221d260062 req-6bf867f1-b386-4038-9672-83d2b349dff9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.517 226890 DEBUG nova.compute.manager [req-9938d322-7b85-4f59-a76d-e8221d260062 req-6bf867f1-b386-4038-9672-83d2b349dff9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] No event matching network-vif-plugged-7a84a919-d309-4f1a-99b8-4792dcdec990 in dict_keys([('network-vif-plugged', '9b0fc629-48f5-469d-90d2-c26339c16eec'), ('network-vif-plugged', 'b03b1a08-920f-4340-b77b-37669cc14a07'), ('network-vif-plugged', 'ff70124b-befc-46fa-b2cb-bc4bd4a49942')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.517 226890 WARNING nova.compute.manager [req-9938d322-7b85-4f59-a76d-e8221d260062 req-6bf867f1-b386-4038-9672-83d2b349dff9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received unexpected event network-vif-plugged-7a84a919-d309-4f1a-99b8-4792dcdec990 for instance with vm_state building and task_state spawning.#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.517 226890 DEBUG nova.compute.manager [req-9938d322-7b85-4f59-a76d-e8221d260062 req-6bf867f1-b386-4038-9672-83d2b349dff9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received event network-vif-plugged-ff70124b-befc-46fa-b2cb-bc4bd4a49942 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.518 226890 DEBUG oslo_concurrency.lockutils [req-9938d322-7b85-4f59-a76d-e8221d260062 req-6bf867f1-b386-4038-9672-83d2b349dff9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "795b0a95-448b-49b1-80cb-a18e84101480-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.518 226890 DEBUG oslo_concurrency.lockutils [req-9938d322-7b85-4f59-a76d-e8221d260062 req-6bf867f1-b386-4038-9672-83d2b349dff9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.518 226890 DEBUG oslo_concurrency.lockutils [req-9938d322-7b85-4f59-a76d-e8221d260062 req-6bf867f1-b386-4038-9672-83d2b349dff9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.518 226890 DEBUG nova.compute.manager [req-9938d322-7b85-4f59-a76d-e8221d260062 req-6bf867f1-b386-4038-9672-83d2b349dff9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Processing event network-vif-plugged-ff70124b-befc-46fa-b2cb-bc4bd4a49942 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.518 226890 DEBUG nova.compute.manager [req-9938d322-7b85-4f59-a76d-e8221d260062 req-6bf867f1-b386-4038-9672-83d2b349dff9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received event network-vif-plugged-ff70124b-befc-46fa-b2cb-bc4bd4a49942 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.518 226890 DEBUG oslo_concurrency.lockutils [req-9938d322-7b85-4f59-a76d-e8221d260062 req-6bf867f1-b386-4038-9672-83d2b349dff9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "795b0a95-448b-49b1-80cb-a18e84101480-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.518 226890 DEBUG oslo_concurrency.lockutils [req-9938d322-7b85-4f59-a76d-e8221d260062 req-6bf867f1-b386-4038-9672-83d2b349dff9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.519 226890 DEBUG oslo_concurrency.lockutils [req-9938d322-7b85-4f59-a76d-e8221d260062 req-6bf867f1-b386-4038-9672-83d2b349dff9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.519 226890 DEBUG nova.compute.manager [req-9938d322-7b85-4f59-a76d-e8221d260062 req-6bf867f1-b386-4038-9672-83d2b349dff9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] No event matching network-vif-plugged-ff70124b-befc-46fa-b2cb-bc4bd4a49942 in dict_keys([('network-vif-plugged', '9b0fc629-48f5-469d-90d2-c26339c16eec'), ('network-vif-plugged', 'b03b1a08-920f-4340-b77b-37669cc14a07')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.519 226890 WARNING nova.compute.manager [req-9938d322-7b85-4f59-a76d-e8221d260062 req-6bf867f1-b386-4038-9672-83d2b349dff9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received unexpected event network-vif-plugged-ff70124b-befc-46fa-b2cb-bc4bd4a49942 for instance with vm_state building and task_state spawning.#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.524 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:39:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:33.529 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5deecc8b-5c7e-4b1f-b6d5-1f5c33a749d1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:33.553 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[d81dbcc4-a840-44ec-8f39-10486ae58612]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:33.555 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[30146493-a34d-425f-8945-c090bb9bfd80]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.556 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.556 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.656s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.556 226890 DEBUG oslo_concurrency.lockutils [None req-9be67e7e-3bfe-4672-8f65-ee9d2ff986a5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.412s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.571 226890 DEBUG nova.compute.manager [req-5c642d98-15e9-40fe-98a9-7f828ccdac9a req-d78f1e6b-eee4-4663-9d15-b51c385c22ad 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received event network-vif-plugged-b03b1a08-920f-4340-b77b-37669cc14a07 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.571 226890 DEBUG oslo_concurrency.lockutils [req-5c642d98-15e9-40fe-98a9-7f828ccdac9a req-d78f1e6b-eee4-4663-9d15-b51c385c22ad 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "795b0a95-448b-49b1-80cb-a18e84101480-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.571 226890 DEBUG oslo_concurrency.lockutils [req-5c642d98-15e9-40fe-98a9-7f828ccdac9a req-d78f1e6b-eee4-4663-9d15-b51c385c22ad 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.571 226890 DEBUG oslo_concurrency.lockutils [req-5c642d98-15e9-40fe-98a9-7f828ccdac9a req-d78f1e6b-eee4-4663-9d15-b51c385c22ad 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.571 226890 DEBUG nova.compute.manager [req-5c642d98-15e9-40fe-98a9-7f828ccdac9a req-d78f1e6b-eee4-4663-9d15-b51c385c22ad 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Processing event network-vif-plugged-b03b1a08-920f-4340-b77b-37669cc14a07 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.572 226890 DEBUG nova.compute.manager [req-5c642d98-15e9-40fe-98a9-7f828ccdac9a req-d78f1e6b-eee4-4663-9d15-b51c385c22ad 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received event network-vif-plugged-b03b1a08-920f-4340-b77b-37669cc14a07 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.572 226890 DEBUG oslo_concurrency.lockutils [req-5c642d98-15e9-40fe-98a9-7f828ccdac9a req-d78f1e6b-eee4-4663-9d15-b51c385c22ad 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "795b0a95-448b-49b1-80cb-a18e84101480-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.572 226890 DEBUG oslo_concurrency.lockutils [req-5c642d98-15e9-40fe-98a9-7f828ccdac9a req-d78f1e6b-eee4-4663-9d15-b51c385c22ad 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.572 226890 DEBUG oslo_concurrency.lockutils [req-5c642d98-15e9-40fe-98a9-7f828ccdac9a req-d78f1e6b-eee4-4663-9d15-b51c385c22ad 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.572 226890 DEBUG nova.compute.manager [req-5c642d98-15e9-40fe-98a9-7f828ccdac9a req-d78f1e6b-eee4-4663-9d15-b51c385c22ad 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] No event matching network-vif-plugged-b03b1a08-920f-4340-b77b-37669cc14a07 in dict_keys([('network-vif-plugged', '9b0fc629-48f5-469d-90d2-c26339c16eec')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.572 226890 WARNING nova.compute.manager [req-5c642d98-15e9-40fe-98a9-7f828ccdac9a req-d78f1e6b-eee4-4663-9d15-b51c385c22ad 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received unexpected event network-vif-plugged-b03b1a08-920f-4340-b77b-37669cc14a07 for instance with vm_state building and task_state spawning.#033[00m
Jan 20 09:39:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:33.577 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[6ed0e8b2-527e-4725-bac5-eb166123b667]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:33.593 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ced56ebe-e0f1-4622-a8c0-d861052e2683]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa36edd9d-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:73:e3:f0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 5, 'rx_bytes': 180, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 5, 'rx_bytes': 180, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 68], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 501450, 'reachable_time': 19116, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 250000, 'error': None, 'target': 'ovnmeta-a36edd9d-12ce-4779-97fe-f75c00d85dcd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:33.607 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c726f646-02ff-4ad5-937e-07f6867d7b3c]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa36edd9d-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 501461, 'tstamp': 501461}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 250001, 'error': None, 'target': 'ovnmeta-a36edd9d-12ce-4779-97fe-f75c00d85dcd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.2.2.2'], ['IFA_LOCAL', '10.2.2.2'], ['IFA_BROADCAST', '10.2.2.255'], ['IFA_LABEL', 'tapa36edd9d-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 501463, 'tstamp': 501463}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 250001, 'error': None, 'target': 'ovnmeta-a36edd9d-12ce-4779-97fe-f75c00d85dcd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:33.609 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa36edd9d-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.611 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.612 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:33.613 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa36edd9d-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:33.613 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:39:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:33.613 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa36edd9d-10, col_values=(('external_ids', {'iface-id': '56d835cb-2d2e-46bd-b178-20a6f3fad645'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:33.613 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:39:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:33.614 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 7a84a919-d309-4f1a-99b8-4792dcdec990 in datapath d12cbe78-47c0-4f23-98a7-4ed621f8c3a3 unbound from our chassis#033[00m
Jan 20 09:39:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:33.616 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d12cbe78-47c0-4f23-98a7-4ed621f8c3a3#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.620 226890 DEBUG nova.compute.manager [req-34111343-4bba-4800-aa0f-9d8e27427f51 req-0f7a9bc5-d28f-491f-a188-b60c4537a8a4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received event network-vif-plugged-9b0fc629-48f5-469d-90d2-c26339c16eec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.620 226890 DEBUG oslo_concurrency.lockutils [req-34111343-4bba-4800-aa0f-9d8e27427f51 req-0f7a9bc5-d28f-491f-a188-b60c4537a8a4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "795b0a95-448b-49b1-80cb-a18e84101480-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.621 226890 DEBUG oslo_concurrency.lockutils [req-34111343-4bba-4800-aa0f-9d8e27427f51 req-0f7a9bc5-d28f-491f-a188-b60c4537a8a4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.621 226890 DEBUG oslo_concurrency.lockutils [req-34111343-4bba-4800-aa0f-9d8e27427f51 req-0f7a9bc5-d28f-491f-a188-b60c4537a8a4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.621 226890 DEBUG nova.compute.manager [req-34111343-4bba-4800-aa0f-9d8e27427f51 req-0f7a9bc5-d28f-491f-a188-b60c4537a8a4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Processing event network-vif-plugged-9b0fc629-48f5-469d-90d2-c26339c16eec _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.621 226890 DEBUG nova.compute.manager [req-34111343-4bba-4800-aa0f-9d8e27427f51 req-0f7a9bc5-d28f-491f-a188-b60c4537a8a4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received event network-vif-plugged-9b0fc629-48f5-469d-90d2-c26339c16eec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.621 226890 DEBUG oslo_concurrency.lockutils [req-34111343-4bba-4800-aa0f-9d8e27427f51 req-0f7a9bc5-d28f-491f-a188-b60c4537a8a4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "795b0a95-448b-49b1-80cb-a18e84101480-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.621 226890 DEBUG oslo_concurrency.lockutils [req-34111343-4bba-4800-aa0f-9d8e27427f51 req-0f7a9bc5-d28f-491f-a188-b60c4537a8a4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.621 226890 DEBUG oslo_concurrency.lockutils [req-34111343-4bba-4800-aa0f-9d8e27427f51 req-0f7a9bc5-d28f-491f-a188-b60c4537a8a4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.622 226890 DEBUG nova.compute.manager [req-34111343-4bba-4800-aa0f-9d8e27427f51 req-0f7a9bc5-d28f-491f-a188-b60c4537a8a4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] No waiting events found dispatching network-vif-plugged-9b0fc629-48f5-469d-90d2-c26339c16eec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.622 226890 WARNING nova.compute.manager [req-34111343-4bba-4800-aa0f-9d8e27427f51 req-0f7a9bc5-d28f-491f-a188-b60c4537a8a4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received unexpected event network-vif-plugged-9b0fc629-48f5-469d-90d2-c26339c16eec for instance with vm_state building and task_state spawning.#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.622 226890 DEBUG nova.compute.manager [req-34111343-4bba-4800-aa0f-9d8e27427f51 req-0f7a9bc5-d28f-491f-a188-b60c4537a8a4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Received event network-vif-deleted-82b46c8a-7331-4dba-b12c-3c4bd0d70a52 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.623 226890 DEBUG oslo_concurrency.processutils [None req-9be67e7e-3bfe-4672-8f65-ee9d2ff986a5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:39:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:33.629 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d23f42c1-7eb5-47b9-8bcd-f7bafc4517c7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.646 226890 DEBUG nova.compute.manager [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Instance event wait completed in 2 seconds for network-vif-plugged,network-vif-plugged,network-vif-plugged,network-vif-plugged,network-vif-plugged,network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.651 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768919973.6514697, 795b0a95-448b-49b1-80cb-a18e84101480 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.651 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.654 226890 DEBUG nova.virt.libvirt.driver [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.658 226890 INFO nova.virt.libvirt.driver [-] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Instance spawned successfully.#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.659 226890 DEBUG nova.virt.libvirt.driver [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:39:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:33.662 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[07e176f5-4cfd-4ea0-8dd9-286c772a5ce4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:33.665 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[0ff614a7-6b52-4287-bba1-61a196648842]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.674 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.684 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:39:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:33.687 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[630f6656-25f8-48fa-a518-ea6126449ba4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.688 226890 DEBUG nova.virt.libvirt.driver [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.688 226890 DEBUG nova.virt.libvirt.driver [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.688 226890 DEBUG nova.virt.libvirt.driver [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.689 226890 DEBUG nova.virt.libvirt.driver [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.689 226890 DEBUG nova.virt.libvirt.driver [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.689 226890 DEBUG nova.virt.libvirt.driver [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:39:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:33.701 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[555b3356-bf5c-4017-877d-17545579386e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd12cbe78-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:80:9f:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 10, 'rx_bytes': 532, 'tx_bytes': 608, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 10, 'rx_bytes': 532, 'tx_bytes': 608, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 66], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 501254, 'reachable_time': 38169, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 250008, 'error': None, 'target': 'ovnmeta-d12cbe78-47c0-4f23-98a7-4ed621f8c3a3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:33.717 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2b6fe077-60c1-4827-ad88-f201ac6731e8]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd12cbe78-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 501266, 'tstamp': 501266}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 250009, 'error': None, 'target': 'ovnmeta-d12cbe78-47c0-4f23-98a7-4ed621f8c3a3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.1.1.2'], ['IFA_LOCAL', '10.1.1.2'], ['IFA_BROADCAST', '10.1.1.255'], ['IFA_LABEL', 'tapd12cbe78-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 501270, 'tstamp': 501270}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 250009, 'error': None, 'target': 'ovnmeta-d12cbe78-47c0-4f23-98a7-4ed621f8c3a3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:33.718 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd12cbe78-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.719 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.720 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:33.721 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd12cbe78-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:33.721 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:39:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:33.722 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd12cbe78-40, col_values=(('external_ids', {'iface-id': '21e7bdd9-b254-47ae-9eff-81ffb6d9af00'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:33.722 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:39:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:33.724 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.724 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.768 226890 INFO nova.compute.manager [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Took 22.33 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.769 226890 DEBUG nova.compute.manager [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.850 226890 INFO nova.network.neutron [None req-4a8dd796-3d53-47fb-bffa-7c56caf5e8d3 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Port f6b42586-082e-4da5-b1fd-5723992197fe from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.850 226890 DEBUG nova.network.neutron [None req-4a8dd796-3d53-47fb-bffa-7c56caf5e8d3 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Updating instance_info_cache with network_info: [{"id": "82b46c8a-7331-4dba-b12c-3c4bd0d70a52", "address": "fa:16:3e:b2:0f:fa", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82b46c8a-73", "ovs_interfaceid": "82b46c8a-7331-4dba-b12c-3c4bd0d70a52", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.852 226890 INFO nova.compute.manager [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Took 27.03 seconds to build instance.
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.888 226890 DEBUG oslo_concurrency.lockutils [None req-4a8dd796-3d53-47fb-bffa-7c56caf5e8d3 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Releasing lock "refresh_cache-c1a45fae-79ce-48c2-81b9-4d1e30165d46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.913 226890 DEBUG oslo_concurrency.lockutils [None req-4a8dd796-3d53-47fb-bffa-7c56caf5e8d3 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "interface-c1a45fae-79ce-48c2-81b9-4d1e30165d46-f6b42586-082e-4da5-b1fd-5723992197fe" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 6.438s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:39:33 np0005588920 nova_compute[226886]: 2026-01-20 14:39:33.914 226890 DEBUG oslo_concurrency.lockutils [None req-9d8bff96-fbef-430c-a43c-278ed27877ff cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 27.199s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:39:34 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:39:34 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1735536733' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:39:34 np0005588920 nova_compute[226886]: 2026-01-20 14:39:34.081 226890 DEBUG oslo_concurrency.processutils [None req-9be67e7e-3bfe-4672-8f65-ee9d2ff986a5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:39:34 np0005588920 nova_compute[226886]: 2026-01-20 14:39:34.086 226890 DEBUG nova.compute.provider_tree [None req-9be67e7e-3bfe-4672-8f65-ee9d2ff986a5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 20 09:39:34 np0005588920 nova_compute[226886]: 2026-01-20 14:39:34.100 226890 DEBUG nova.scheduler.client.report [None req-9be67e7e-3bfe-4672-8f65-ee9d2ff986a5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 20 09:39:34 np0005588920 nova_compute[226886]: 2026-01-20 14:39:34.125 226890 DEBUG oslo_concurrency.lockutils [None req-9be67e7e-3bfe-4672-8f65-ee9d2ff986a5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.569s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:39:34 np0005588920 nova_compute[226886]: 2026-01-20 14:39:34.155 226890 INFO nova.scheduler.client.report [None req-9be67e7e-3bfe-4672-8f65-ee9d2ff986a5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Deleted allocations for instance c1a45fae-79ce-48c2-81b9-4d1e30165d46
Jan 20 09:39:34 np0005588920 nova_compute[226886]: 2026-01-20 14:39:34.227 226890 DEBUG oslo_concurrency.lockutils [None req-9be67e7e-3bfe-4672-8f65-ee9d2ff986a5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "c1a45fae-79ce-48c2-81b9-4d1e30165d46" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.042s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:39:34 np0005588920 podman[250031]: 2026-01-20 14:39:34.971050766 +0000 UTC m=+0.059376697 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 20 09:39:35 np0005588920 nova_compute[226886]: 2026-01-20 14:39:35.058 226890 DEBUG nova.compute.manager [req-1eecb461-e251-41be-abb1-171a87bc58f1 req-38dde074-e472-4fa9-be00-6804dd7fa8cf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received event network-vif-plugged-5096b763-1b08-448f-a6bd-f63bcd65def6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 09:39:35 np0005588920 nova_compute[226886]: 2026-01-20 14:39:35.058 226890 DEBUG oslo_concurrency.lockutils [req-1eecb461-e251-41be-abb1-171a87bc58f1 req-38dde074-e472-4fa9-be00-6804dd7fa8cf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "795b0a95-448b-49b1-80cb-a18e84101480-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:39:35 np0005588920 nova_compute[226886]: 2026-01-20 14:39:35.058 226890 DEBUG oslo_concurrency.lockutils [req-1eecb461-e251-41be-abb1-171a87bc58f1 req-38dde074-e472-4fa9-be00-6804dd7fa8cf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:39:35 np0005588920 nova_compute[226886]: 2026-01-20 14:39:35.059 226890 DEBUG oslo_concurrency.lockutils [req-1eecb461-e251-41be-abb1-171a87bc58f1 req-38dde074-e472-4fa9-be00-6804dd7fa8cf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:39:35 np0005588920 nova_compute[226886]: 2026-01-20 14:39:35.059 226890 DEBUG nova.compute.manager [req-1eecb461-e251-41be-abb1-171a87bc58f1 req-38dde074-e472-4fa9-be00-6804dd7fa8cf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] No waiting events found dispatching network-vif-plugged-5096b763-1b08-448f-a6bd-f63bcd65def6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 20 09:39:35 np0005588920 nova_compute[226886]: 2026-01-20 14:39:35.059 226890 WARNING nova.compute.manager [req-1eecb461-e251-41be-abb1-171a87bc58f1 req-38dde074-e472-4fa9-be00-6804dd7fa8cf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received unexpected event network-vif-plugged-5096b763-1b08-448f-a6bd-f63bcd65def6 for instance with vm_state active and task_state None.
Jan 20 09:39:35 np0005588920 nova_compute[226886]: 2026-01-20 14:39:35.065 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:39:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:39:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:35.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:39:35 np0005588920 nova_compute[226886]: 2026-01-20 14:39:35.513 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:39:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:35.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:35 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:39:36 np0005588920 NetworkManager[49076]: <info>  [1768919976.3296] manager: (patch-br-int-to-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/122)
Jan 20 09:39:36 np0005588920 nova_compute[226886]: 2026-01-20 14:39:36.328 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:39:36 np0005588920 NetworkManager[49076]: <info>  [1768919976.3305] manager: (patch-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/123)
Jan 20 09:39:36 np0005588920 nova_compute[226886]: 2026-01-20 14:39:36.520 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:39:36 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:36Z|00212|binding|INFO|Releasing lport 21e7bdd9-b254-47ae-9eff-81ffb6d9af00 from this chassis (sb_readonly=0)
Jan 20 09:39:36 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:36Z|00213|binding|INFO|Releasing lport 56d835cb-2d2e-46bd-b178-20a6f3fad645 from this chassis (sb_readonly=0)
Jan 20 09:39:36 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:36Z|00214|binding|INFO|Releasing lport 17074171-3504-4b6b-920e-629def75ccc9 from this chassis (sb_readonly=0)
Jan 20 09:39:36 np0005588920 nova_compute[226886]: 2026-01-20 14:39:36.542 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:39:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:37.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:39:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:37.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:39:37 np0005588920 nova_compute[226886]: 2026-01-20 14:39:37.763 226890 DEBUG nova.compute.manager [req-ba29a81f-4a63-4d6e-9842-52789c3bb056 req-ce7cccad-1765-4594-a001-33ac62ec7823 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received event network-changed-9b0fc629-48f5-469d-90d2-c26339c16eec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 09:39:37 np0005588920 nova_compute[226886]: 2026-01-20 14:39:37.764 226890 DEBUG nova.compute.manager [req-ba29a81f-4a63-4d6e-9842-52789c3bb056 req-ce7cccad-1765-4594-a001-33ac62ec7823 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Refreshing instance network info cache due to event network-changed-9b0fc629-48f5-469d-90d2-c26339c16eec. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 20 09:39:37 np0005588920 nova_compute[226886]: 2026-01-20 14:39:37.764 226890 DEBUG oslo_concurrency.lockutils [req-ba29a81f-4a63-4d6e-9842-52789c3bb056 req-ce7cccad-1765-4594-a001-33ac62ec7823 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-795b0a95-448b-49b1-80cb-a18e84101480" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 20 09:39:37 np0005588920 nova_compute[226886]: 2026-01-20 14:39:37.765 226890 DEBUG oslo_concurrency.lockutils [req-ba29a81f-4a63-4d6e-9842-52789c3bb056 req-ce7cccad-1765-4594-a001-33ac62ec7823 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-795b0a95-448b-49b1-80cb-a18e84101480" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 20 09:39:37 np0005588920 nova_compute[226886]: 2026-01-20 14:39:37.766 226890 DEBUG nova.network.neutron [req-ba29a81f-4a63-4d6e-9842-52789c3bb056 req-ce7cccad-1765-4594-a001-33ac62ec7823 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Refreshing network info cache for port 9b0fc629-48f5-469d-90d2-c26339c16eec _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 20 09:39:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:39.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:39:39.725 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 09:39:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:39.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:39 np0005588920 nova_compute[226886]: 2026-01-20 14:39:39.929 226890 DEBUG nova.network.neutron [req-ba29a81f-4a63-4d6e-9842-52789c3bb056 req-ce7cccad-1765-4594-a001-33ac62ec7823 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Updated VIF entry in instance network info cache for port 9b0fc629-48f5-469d-90d2-c26339c16eec. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 20 09:39:39 np0005588920 nova_compute[226886]: 2026-01-20 14:39:39.930 226890 DEBUG nova.network.neutron [req-ba29a81f-4a63-4d6e-9842-52789c3bb056 req-ce7cccad-1765-4594-a001-33ac62ec7823 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Updating instance_info_cache with network_info: [{"id": "9b0fc629-48f5-469d-90d2-c26339c16eec", "address": "fa:16:3e:11:9c:9c", "network": {"id": "b3a21065-10a5-474d-b42f-ffe66242a479", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest_v242-807218413-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b0fc629-48", "ovs_interfaceid": "9b0fc629-48f5-469d-90d2-c26339c16eec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "96b6b319-e96c-4182-b940-9f154499e22d", "address": "fa:16:3e:1a:ef:b0", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.91", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96b6b319-e9", "ovs_interfaceid": "96b6b319-e96c-4182-b940-9f154499e22d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "a6658829-ef19-4914-bbb3-35b718691c7c", "address": "fa:16:3e:c8:a5:f5", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.112", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6658829-ef", "ovs_interfaceid": "a6658829-ef19-4914-bbb3-35b718691c7c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "b03b1a08-920f-4340-b77b-37669cc14a07", "address": "fa:16:3e:b9:ca:05", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.214", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb03b1a08-92", "ovs_interfaceid": "b03b1a08-920f-4340-b77b-37669cc14a07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7a84a919-d309-4f1a-99b8-4792dcdec990", "address": "fa:16:3e:c7:9a:c3", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.32", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a84a919-d3", "ovs_interfaceid": "7a84a919-d309-4f1a-99b8-4792dcdec990", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ff70124b-befc-46fa-b2cb-bc4bd4a49942", "address": "fa:16:3e:97:d4:a5", "network": {"id": "a36edd9d-12ce-4779-97fe-f75c00d85dcd", "bridge": "br-int", "label": "tempest-device-tagging-net2-56635846", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff70124b-be", "ovs_interfaceid": "ff70124b-befc-46fa-b2cb-bc4bd4a49942", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "5096b763-1b08-448f-a6bd-f63bcd65def6", "address": "fa:16:3e:5e:80:45", "network": {"id": "a36edd9d-12ce-4779-97fe-f75c00d85dcd", "bridge": "br-int", "label": "tempest-device-tagging-net2-56635846", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5096b763-1b", "ovs_interfaceid": "5096b763-1b08-448f-a6bd-f63bcd65def6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 20 09:39:39 np0005588920 nova_compute[226886]: 2026-01-20 14:39:39.962 226890 DEBUG oslo_concurrency.lockutils [req-ba29a81f-4a63-4d6e-9842-52789c3bb056 req-ce7cccad-1765-4594-a001-33ac62ec7823 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-795b0a95-448b-49b1-80cb-a18e84101480" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 20 09:39:40 np0005588920 nova_compute[226886]: 2026-01-20 14:39:40.068 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:39:40 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:40Z|00215|binding|INFO|Releasing lport 21e7bdd9-b254-47ae-9eff-81ffb6d9af00 from this chassis (sb_readonly=0)
Jan 20 09:39:40 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:40Z|00216|binding|INFO|Releasing lport 56d835cb-2d2e-46bd-b178-20a6f3fad645 from this chassis (sb_readonly=0)
Jan 20 09:39:40 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:40Z|00217|binding|INFO|Releasing lport 17074171-3504-4b6b-920e-629def75ccc9 from this chassis (sb_readonly=0)
Jan 20 09:39:40 np0005588920 nova_compute[226886]: 2026-01-20 14:39:40.377 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:39:40 np0005588920 nova_compute[226886]: 2026-01-20 14:39:40.514 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:39:40 np0005588920 nova_compute[226886]: 2026-01-20 14:39:40.523 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 09:39:40 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:39:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:41.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:41.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:43.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:39:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:43.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:39:45 np0005588920 nova_compute[226886]: 2026-01-20 14:39:45.070 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:39:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:39:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:45.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:39:45 np0005588920 nova_compute[226886]: 2026-01-20 14:39:45.481 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768919970.480538, c1a45fae-79ce-48c2-81b9-4d1e30165d46 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 20 09:39:45 np0005588920 nova_compute[226886]: 2026-01-20 14:39:45.482 226890 INFO nova.compute.manager [-] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] VM Stopped (Lifecycle Event)
Jan 20 09:39:45 np0005588920 nova_compute[226886]: 2026-01-20 14:39:45.515 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:39:45 np0005588920 nova_compute[226886]: 2026-01-20 14:39:45.691 226890 DEBUG nova.compute.manager [None req-2095c6b4-e8b1-4ca9-9d1e-852d130df040 - - - - - -] [instance: c1a45fae-79ce-48c2-81b9-4d1e30165d46] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 09:39:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:45.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:45 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:39:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:39:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:47.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:39:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:47.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:48 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:48Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c8:a5:f5 10.1.1.112
Jan 20 09:39:48 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:48Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c8:a5:f5 10.1.1.112
Jan 20 09:39:48 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:48Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:97:d4:a5 10.2.2.100
Jan 20 09:39:48 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:48Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:97:d4:a5 10.2.2.100
Jan 20 09:39:48 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:48Z|00032|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c7:9a:c3 10.1.1.32
Jan 20 09:39:48 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:48Z|00033|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c7:9a:c3 10.1.1.32
Jan 20 09:39:48 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:48Z|00034|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:11:9c:9c 10.100.0.8
Jan 20 09:39:48 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:48Z|00035|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:11:9c:9c 10.100.0.8
Jan 20 09:39:48 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:48Z|00036|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:5e:80:45 10.2.2.200
Jan 20 09:39:48 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:48Z|00037|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:5e:80:45 10.2.2.200
Jan 20 09:39:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:39:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:49.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:39:49 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:49Z|00038|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:1a:ef:b0 10.1.1.91
Jan 20 09:39:49 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:49Z|00039|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1a:ef:b0 10.1.1.91
Jan 20 09:39:49 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:49Z|00040|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b9:ca:05 10.1.1.214
Jan 20 09:39:49 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:49Z|00041|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b9:ca:05 10.1.1.214
Jan 20 09:39:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:49.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:50 np0005588920 nova_compute[226886]: 2026-01-20 14:39:50.072 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:50 np0005588920 nova_compute[226886]: 2026-01-20 14:39:50.516 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:50 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:39:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:51.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:51.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:53 np0005588920 podman[250055]: 2026-01-20 14:39:53.019400719 +0000 UTC m=+0.088410826 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 20 09:39:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:39:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:53.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:39:53 np0005588920 nova_compute[226886]: 2026-01-20 14:39:53.459 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:53.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:55 np0005588920 nova_compute[226886]: 2026-01-20 14:39:55.077 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:55.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:55 np0005588920 nova_compute[226886]: 2026-01-20 14:39:55.518 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:55.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:55 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:39:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:39:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:57.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:39:57 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:57Z|00218|binding|INFO|Releasing lport 21e7bdd9-b254-47ae-9eff-81ffb6d9af00 from this chassis (sb_readonly=0)
Jan 20 09:39:57 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:57Z|00219|binding|INFO|Releasing lport 56d835cb-2d2e-46bd-b178-20a6f3fad645 from this chassis (sb_readonly=0)
Jan 20 09:39:57 np0005588920 ovn_controller[133971]: 2026-01-20T14:39:57Z|00220|binding|INFO|Releasing lport 17074171-3504-4b6b-920e-629def75ccc9 from this chassis (sb_readonly=0)
Jan 20 09:39:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:57.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:57 np0005588920 nova_compute[226886]: 2026-01-20 14:39:57.809 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:57 np0005588920 nova_compute[226886]: 2026-01-20 14:39:57.958 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:59.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:39:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:59.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:00 np0005588920 nova_compute[226886]: 2026-01-20 14:40:00.079 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:00 np0005588920 nova_compute[226886]: 2026-01-20 14:40:00.585 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:00 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:40:01 np0005588920 ceph-mon[77148]: overall HEALTH_OK
Jan 20 09:40:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:01.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:01.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:40:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:03.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:40:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:03.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:05 np0005588920 nova_compute[226886]: 2026-01-20 14:40:05.083 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:40:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:05.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:40:05 np0005588920 nova_compute[226886]: 2026-01-20 14:40:05.585 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:05 np0005588920 nova_compute[226886]: 2026-01-20 14:40:05.587 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:05.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:05 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:40:05 np0005588920 podman[250083]: 2026-01-20 14:40:05.986322022 +0000 UTC m=+0.075374842 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 20 09:40:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:40:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:07.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:40:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:07.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:09.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:09 np0005588920 nova_compute[226886]: 2026-01-20 14:40:09.657 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:09.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:10 np0005588920 nova_compute[226886]: 2026-01-20 14:40:10.087 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:10 np0005588920 nova_compute[226886]: 2026-01-20 14:40:10.636 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:10 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:10.763 144287 DEBUG eventlet.wsgi.server [-] (144287) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m
Jan 20 09:40:10 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:10.765 144287 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /openstack/latest/meta_data.json HTTP/1.0#015
Jan 20 09:40:10 np0005588920 ovn_metadata_agent[144123]: Accept: */*#015
Jan 20 09:40:10 np0005588920 ovn_metadata_agent[144123]: Connection: close#015
Jan 20 09:40:10 np0005588920 ovn_metadata_agent[144123]: Content-Type: text/plain#015
Jan 20 09:40:10 np0005588920 ovn_metadata_agent[144123]: Host: 169.254.169.254#015
Jan 20 09:40:10 np0005588920 ovn_metadata_agent[144123]: User-Agent: curl/7.84.0#015
Jan 20 09:40:10 np0005588920 ovn_metadata_agent[144123]: X-Forwarded-For: 10.100.0.8#015
Jan 20 09:40:10 np0005588920 ovn_metadata_agent[144123]: X-Ovn-Network-Id: b3a21065-10a5-474d-b42f-ffe66242a479 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m
Jan 20 09:40:10 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:40:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:11.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:40:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:11.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:40:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:12.288 144287 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m
Jan 20 09:40:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:12.289 144287 INFO eventlet.wsgi.server [-] 10.100.0.8,<local> "GET /openstack/latest/meta_data.json HTTP/1.1" status: 200  len: 2546 time: 1.5242741#033[00m
Jan 20 09:40:12 np0005588920 haproxy-metadata-proxy-b3a21065-10a5-474d-b42f-ffe66242a479[249872]: 10.100.0.8:40392 [20/Jan/2026:14:40:10.762] listener listener/metadata 0/0/0/1526/1526 200 2530 - - ---- 1/1/0/0/0 0/0 "GET /openstack/latest/meta_data.json HTTP/1.1"
Jan 20 09:40:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:13.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:13 np0005588920 nova_compute[226886]: 2026-01-20 14:40:13.246 226890 DEBUG oslo_concurrency.lockutils [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Acquiring lock "795b0a95-448b-49b1-80cb-a18e84101480" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:40:13 np0005588920 nova_compute[226886]: 2026-01-20 14:40:13.246 226890 DEBUG oslo_concurrency.lockutils [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:40:13 np0005588920 nova_compute[226886]: 2026-01-20 14:40:13.247 226890 DEBUG oslo_concurrency.lockutils [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Acquiring lock "795b0a95-448b-49b1-80cb-a18e84101480-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:40:13 np0005588920 nova_compute[226886]: 2026-01-20 14:40:13.247 226890 DEBUG oslo_concurrency.lockutils [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:40:13 np0005588920 nova_compute[226886]: 2026-01-20 14:40:13.247 226890 DEBUG oslo_concurrency.lockutils [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:40:13 np0005588920 nova_compute[226886]: 2026-01-20 14:40:13.248 226890 INFO nova.compute.manager [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Terminating instance#033[00m
Jan 20 09:40:13 np0005588920 nova_compute[226886]: 2026-01-20 14:40:13.249 226890 DEBUG nova.compute.manager [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:40:13 np0005588920 kernel: tap9b0fc629-48 (unregistering): left promiscuous mode
Jan 20 09:40:13 np0005588920 NetworkManager[49076]: <info>  [1768920013.5314] device (tap9b0fc629-48): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:40:13 np0005588920 nova_compute[226886]: 2026-01-20 14:40:13.571 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:13 np0005588920 ovn_controller[133971]: 2026-01-20T14:40:13Z|00221|binding|INFO|Releasing lport 9b0fc629-48f5-469d-90d2-c26339c16eec from this chassis (sb_readonly=0)
Jan 20 09:40:13 np0005588920 ovn_controller[133971]: 2026-01-20T14:40:13Z|00222|binding|INFO|Setting lport 9b0fc629-48f5-469d-90d2-c26339c16eec down in Southbound
Jan 20 09:40:13 np0005588920 ovn_controller[133971]: 2026-01-20T14:40:13Z|00223|binding|INFO|Removing iface tap9b0fc629-48 ovn-installed in OVS
Jan 20 09:40:13 np0005588920 nova_compute[226886]: 2026-01-20 14:40:13.575 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:13 np0005588920 kernel: tap96b6b319-e9 (unregistering): left promiscuous mode
Jan 20 09:40:13 np0005588920 NetworkManager[49076]: <info>  [1768920013.5892] device (tap96b6b319-e9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:40:13 np0005588920 nova_compute[226886]: 2026-01-20 14:40:13.590 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:13 np0005588920 nova_compute[226886]: 2026-01-20 14:40:13.597 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:13 np0005588920 ovn_controller[133971]: 2026-01-20T14:40:13Z|00224|binding|INFO|Releasing lport 96b6b319-e96c-4182-b940-9f154499e22d from this chassis (sb_readonly=1)
Jan 20 09:40:13 np0005588920 ovn_controller[133971]: 2026-01-20T14:40:13Z|00225|binding|INFO|Removing iface tap96b6b319-e9 ovn-installed in OVS
Jan 20 09:40:13 np0005588920 ovn_controller[133971]: 2026-01-20T14:40:13Z|00226|if_status|INFO|Not setting lport 96b6b319-e96c-4182-b940-9f154499e22d down as sb is readonly
Jan 20 09:40:13 np0005588920 nova_compute[226886]: 2026-01-20 14:40:13.599 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:13 np0005588920 nova_compute[226886]: 2026-01-20 14:40:13.611 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:13 np0005588920 kernel: tapa6658829-ef (unregistering): left promiscuous mode
Jan 20 09:40:13 np0005588920 NetworkManager[49076]: <info>  [1768920013.6194] device (tapa6658829-ef): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:40:13 np0005588920 ovn_controller[133971]: 2026-01-20T14:40:13Z|00227|binding|INFO|Setting lport 96b6b319-e96c-4182-b940-9f154499e22d down in Southbound
Jan 20 09:40:13 np0005588920 ovn_controller[133971]: 2026-01-20T14:40:13Z|00228|binding|INFO|Releasing lport a6658829-ef19-4914-bbb3-35b718691c7c from this chassis (sb_readonly=1)
Jan 20 09:40:13 np0005588920 ovn_controller[133971]: 2026-01-20T14:40:13Z|00229|binding|INFO|Removing iface tapa6658829-ef ovn-installed in OVS
Jan 20 09:40:13 np0005588920 nova_compute[226886]: 2026-01-20 14:40:13.633 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:13.633 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:11:9c:9c 10.100.0.8'], port_security=['fa:16:3e:11:9c:9c 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '795b0a95-448b-49b1-80cb-a18e84101480', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b3a21065-10a5-474d-b42f-ffe66242a479', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8705404c3964472782118e478eb54e51', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9cf1a477-710b-4936-b92a-ce2a6ea51d41', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.246'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=786e8670-14a0-43fd-9be5-ba76f5969fd0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=9b0fc629-48f5-469d-90d2-c26339c16eec) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:40:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:13.636 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 9b0fc629-48f5-469d-90d2-c26339c16eec in datapath b3a21065-10a5-474d-b42f-ffe66242a479 unbound from our chassis#033[00m
Jan 20 09:40:13 np0005588920 ovn_controller[133971]: 2026-01-20T14:40:13Z|00230|binding|INFO|Setting lport a6658829-ef19-4914-bbb3-35b718691c7c down in Southbound
Jan 20 09:40:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:13.637 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b3a21065-10a5-474d-b42f-ffe66242a479, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:40:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:13.638 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1a:ef:b0 10.1.1.91'], port_security=['fa:16:3e:1a:ef:b0 10.1.1.91'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TaggedBootDevicesTest_v242-791226081', 'neutron:cidrs': '10.1.1.91/24', 'neutron:device_id': '795b0a95-448b-49b1-80cb-a18e84101480', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d12cbe78-47c0-4f23-98a7-4ed621f8c3a3', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TaggedBootDevicesTest_v242-791226081', 'neutron:project_id': '8705404c3964472782118e478eb54e51', 'neutron:revision_number': '4', 'neutron:security_group_ids': '99955626-758a-4f99-af26-9b1cc95cd9d2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8a10ccc6-c168-4af1-ac9c-92f5959feefb, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=96b6b319-e96c-4182-b940-9f154499e22d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:40:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:13.638 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a37ac71c-12d7-47d3-924b-08c068f7c9f2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:13.639 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b3a21065-10a5-474d-b42f-ffe66242a479 namespace which is not needed anymore#033[00m
Jan 20 09:40:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:13.642 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c8:a5:f5 10.1.1.112'], port_security=['fa:16:3e:c8:a5:f5 10.1.1.112'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TaggedBootDevicesTest_v242-2010021643', 'neutron:cidrs': '10.1.1.112/24', 'neutron:device_id': '795b0a95-448b-49b1-80cb-a18e84101480', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d12cbe78-47c0-4f23-98a7-4ed621f8c3a3', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TaggedBootDevicesTest_v242-2010021643', 'neutron:project_id': '8705404c3964472782118e478eb54e51', 'neutron:revision_number': '4', 'neutron:security_group_ids': '99955626-758a-4f99-af26-9b1cc95cd9d2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8a10ccc6-c168-4af1-ac9c-92f5959feefb, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=a6658829-ef19-4914-bbb3-35b718691c7c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:40:13 np0005588920 nova_compute[226886]: 2026-01-20 14:40:13.647 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:13 np0005588920 kernel: tapb03b1a08-92 (unregistering): left promiscuous mode
Jan 20 09:40:13 np0005588920 NetworkManager[49076]: <info>  [1768920013.6516] device (tapb03b1a08-92): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:40:13 np0005588920 ovn_controller[133971]: 2026-01-20T14:40:13Z|00231|binding|INFO|Releasing lport b03b1a08-920f-4340-b77b-37669cc14a07 from this chassis (sb_readonly=0)
Jan 20 09:40:13 np0005588920 ovn_controller[133971]: 2026-01-20T14:40:13Z|00232|binding|INFO|Setting lport b03b1a08-920f-4340-b77b-37669cc14a07 down in Southbound
Jan 20 09:40:13 np0005588920 nova_compute[226886]: 2026-01-20 14:40:13.669 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:13 np0005588920 ovn_controller[133971]: 2026-01-20T14:40:13Z|00233|binding|INFO|Removing iface tapb03b1a08-92 ovn-installed in OVS
Jan 20 09:40:13 np0005588920 nova_compute[226886]: 2026-01-20 14:40:13.671 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:13.676 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b9:ca:05 10.1.1.214'], port_security=['fa:16:3e:b9:ca:05 10.1.1.214'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.1.214/24', 'neutron:device_id': '795b0a95-448b-49b1-80cb-a18e84101480', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d12cbe78-47c0-4f23-98a7-4ed621f8c3a3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8705404c3964472782118e478eb54e51', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9cf1a477-710b-4936-b92a-ce2a6ea51d41', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8a10ccc6-c168-4af1-ac9c-92f5959feefb, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=b03b1a08-920f-4340-b77b-37669cc14a07) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:40:13 np0005588920 kernel: tap7a84a919-d3 (unregistering): left promiscuous mode
Jan 20 09:40:13 np0005588920 nova_compute[226886]: 2026-01-20 14:40:13.684 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:13 np0005588920 NetworkManager[49076]: <info>  [1768920013.6873] device (tap7a84a919-d3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:40:13 np0005588920 ovn_controller[133971]: 2026-01-20T14:40:13Z|00234|binding|INFO|Releasing lport 7a84a919-d309-4f1a-99b8-4792dcdec990 from this chassis (sb_readonly=0)
Jan 20 09:40:13 np0005588920 ovn_controller[133971]: 2026-01-20T14:40:13Z|00235|binding|INFO|Setting lport 7a84a919-d309-4f1a-99b8-4792dcdec990 down in Southbound
Jan 20 09:40:13 np0005588920 ovn_controller[133971]: 2026-01-20T14:40:13Z|00236|binding|INFO|Removing iface tap7a84a919-d3 ovn-installed in OVS
Jan 20 09:40:13 np0005588920 nova_compute[226886]: 2026-01-20 14:40:13.702 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:13 np0005588920 nova_compute[226886]: 2026-01-20 14:40:13.704 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:13 np0005588920 kernel: tapff70124b-be (unregistering): left promiscuous mode
Jan 20 09:40:13 np0005588920 NetworkManager[49076]: <info>  [1768920013.7191] device (tapff70124b-be): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:40:13 np0005588920 nova_compute[226886]: 2026-01-20 14:40:13.720 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:13.730 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c7:9a:c3 10.1.1.32'], port_security=['fa:16:3e:c7:9a:c3 10.1.1.32'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.1.32/24', 'neutron:device_id': '795b0a95-448b-49b1-80cb-a18e84101480', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d12cbe78-47c0-4f23-98a7-4ed621f8c3a3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8705404c3964472782118e478eb54e51', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9cf1a477-710b-4936-b92a-ce2a6ea51d41', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8a10ccc6-c168-4af1-ac9c-92f5959feefb, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=7a84a919-d309-4f1a-99b8-4792dcdec990) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:40:13 np0005588920 nova_compute[226886]: 2026-01-20 14:40:13.735 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:13 np0005588920 ovn_controller[133971]: 2026-01-20T14:40:13Z|00237|binding|INFO|Releasing lport ff70124b-befc-46fa-b2cb-bc4bd4a49942 from this chassis (sb_readonly=0)
Jan 20 09:40:13 np0005588920 ovn_controller[133971]: 2026-01-20T14:40:13Z|00238|binding|INFO|Setting lport ff70124b-befc-46fa-b2cb-bc4bd4a49942 down in Southbound
Jan 20 09:40:13 np0005588920 ovn_controller[133971]: 2026-01-20T14:40:13Z|00239|binding|INFO|Removing iface tapff70124b-be ovn-installed in OVS
Jan 20 09:40:13 np0005588920 nova_compute[226886]: 2026-01-20 14:40:13.737 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:13 np0005588920 kernel: tap5096b763-1b (unregistering): left promiscuous mode
Jan 20 09:40:13 np0005588920 NetworkManager[49076]: <info>  [1768920013.7520] device (tap5096b763-1b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:40:13 np0005588920 nova_compute[226886]: 2026-01-20 14:40:13.754 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:13 np0005588920 nova_compute[226886]: 2026-01-20 14:40:13.771 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:13 np0005588920 ovn_controller[133971]: 2026-01-20T14:40:13Z|00240|binding|INFO|Releasing lport 5096b763-1b08-448f-a6bd-f63bcd65def6 from this chassis (sb_readonly=1)
Jan 20 09:40:13 np0005588920 ovn_controller[133971]: 2026-01-20T14:40:13Z|00241|binding|INFO|Removing iface tap5096b763-1b ovn-installed in OVS
Jan 20 09:40:13 np0005588920 nova_compute[226886]: 2026-01-20 14:40:13.773 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:13 np0005588920 ovn_controller[133971]: 2026-01-20T14:40:13Z|00242|binding|INFO|Setting lport 5096b763-1b08-448f-a6bd-f63bcd65def6 down in Southbound
Jan 20 09:40:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:13.800 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:97:d4:a5 10.2.2.100'], port_security=['fa:16:3e:97:d4:a5 10.2.2.100'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.2.2.100/24', 'neutron:device_id': '795b0a95-448b-49b1-80cb-a18e84101480', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a36edd9d-12ce-4779-97fe-f75c00d85dcd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8705404c3964472782118e478eb54e51', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9cf1a477-710b-4936-b92a-ce2a6ea51d41', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a477e986-22d6-46cb-827e-2c814ecbcffa, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=ff70124b-befc-46fa-b2cb-bc4bd4a49942) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:40:13 np0005588920 nova_compute[226886]: 2026-01-20 14:40:13.802 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:13.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:13 np0005588920 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d00000040.scope: Deactivated successfully.
Jan 20 09:40:13 np0005588920 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d00000040.scope: Consumed 16.673s CPU time.
Jan 20 09:40:13 np0005588920 systemd-machined[196121]: Machine qemu-27-instance-00000040 terminated.
Jan 20 09:40:13 np0005588920 neutron-haproxy-ovnmeta-b3a21065-10a5-474d-b42f-ffe66242a479[249866]: [NOTICE]   (249870) : haproxy version is 2.8.14-c23fe91
Jan 20 09:40:13 np0005588920 neutron-haproxy-ovnmeta-b3a21065-10a5-474d-b42f-ffe66242a479[249866]: [NOTICE]   (249870) : path to executable is /usr/sbin/haproxy
Jan 20 09:40:13 np0005588920 neutron-haproxy-ovnmeta-b3a21065-10a5-474d-b42f-ffe66242a479[249866]: [WARNING]  (249870) : Exiting Master process...
Jan 20 09:40:13 np0005588920 neutron-haproxy-ovnmeta-b3a21065-10a5-474d-b42f-ffe66242a479[249866]: [WARNING]  (249870) : Exiting Master process...
Jan 20 09:40:13 np0005588920 neutron-haproxy-ovnmeta-b3a21065-10a5-474d-b42f-ffe66242a479[249866]: [ALERT]    (249870) : Current worker (249872) exited with code 143 (Terminated)
Jan 20 09:40:13 np0005588920 neutron-haproxy-ovnmeta-b3a21065-10a5-474d-b42f-ffe66242a479[249866]: [WARNING]  (249870) : All workers exited. Exiting... (0)
Jan 20 09:40:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:13.823 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5e:80:45 10.2.2.200'], port_security=['fa:16:3e:5e:80:45 10.2.2.200'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.2.2.200/24', 'neutron:device_id': '795b0a95-448b-49b1-80cb-a18e84101480', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a36edd9d-12ce-4779-97fe-f75c00d85dcd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8705404c3964472782118e478eb54e51', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9cf1a477-710b-4936-b92a-ce2a6ea51d41', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a477e986-22d6-46cb-827e-2c814ecbcffa, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=5096b763-1b08-448f-a6bd-f63bcd65def6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:40:13 np0005588920 systemd[1]: libpod-84fb5f99fc9a47e6bcdd0d11fe6080861313943298fd99b3df678b48b51c0fcf.scope: Deactivated successfully.
Jan 20 09:40:13 np0005588920 podman[250155]: 2026-01-20 14:40:13.831489072 +0000 UTC m=+0.099304919 container died 84fb5f99fc9a47e6bcdd0d11fe6080861313943298fd99b3df678b48b51c0fcf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b3a21065-10a5-474d-b42f-ffe66242a479, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 20 09:40:13 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-84fb5f99fc9a47e6bcdd0d11fe6080861313943298fd99b3df678b48b51c0fcf-userdata-shm.mount: Deactivated successfully.
Jan 20 09:40:13 np0005588920 systemd[1]: var-lib-containers-storage-overlay-b817755456659e24bdeebbbd56ba4e937c79185bfa874a84684085b765f21a2d-merged.mount: Deactivated successfully.
Jan 20 09:40:13 np0005588920 podman[250155]: 2026-01-20 14:40:13.864388919 +0000 UTC m=+0.132204776 container cleanup 84fb5f99fc9a47e6bcdd0d11fe6080861313943298fd99b3df678b48b51c0fcf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b3a21065-10a5-474d-b42f-ffe66242a479, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:40:13 np0005588920 NetworkManager[49076]: <info>  [1768920013.8698] manager: (tap9b0fc629-48): new Tun device (/org/freedesktop/NetworkManager/Devices/124)
Jan 20 09:40:13 np0005588920 systemd[1]: libpod-conmon-84fb5f99fc9a47e6bcdd0d11fe6080861313943298fd99b3df678b48b51c0fcf.scope: Deactivated successfully.
Jan 20 09:40:13 np0005588920 NetworkManager[49076]: <info>  [1768920013.8863] manager: (tap96b6b319-e9): new Tun device (/org/freedesktop/NetworkManager/Devices/125)
Jan 20 09:40:13 np0005588920 NetworkManager[49076]: <info>  [1768920013.9265] manager: (tapff70124b-be): new Tun device (/org/freedesktop/NetworkManager/Devices/126)
Jan 20 09:40:13 np0005588920 podman[250208]: 2026-01-20 14:40:13.926905582 +0000 UTC m=+0.041053175 container remove 84fb5f99fc9a47e6bcdd0d11fe6080861313943298fd99b3df678b48b51c0fcf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b3a21065-10a5-474d-b42f-ffe66242a479, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:40:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:13.932 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c25537e5-dc91-435f-9bc9-570c15e5b221]: (4, ('Tue Jan 20 02:40:13 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b3a21065-10a5-474d-b42f-ffe66242a479 (84fb5f99fc9a47e6bcdd0d11fe6080861313943298fd99b3df678b48b51c0fcf)\n84fb5f99fc9a47e6bcdd0d11fe6080861313943298fd99b3df678b48b51c0fcf\nTue Jan 20 02:40:13 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b3a21065-10a5-474d-b42f-ffe66242a479 (84fb5f99fc9a47e6bcdd0d11fe6080861313943298fd99b3df678b48b51c0fcf)\n84fb5f99fc9a47e6bcdd0d11fe6080861313943298fd99b3df678b48b51c0fcf\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:13 np0005588920 NetworkManager[49076]: <info>  [1768920013.9344] manager: (tap5096b763-1b): new Tun device (/org/freedesktop/NetworkManager/Devices/127)
Jan 20 09:40:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:13.934 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0332bbd0-d021-4145-ae0c-cac26ebb7385]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:13.935 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb3a21065-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:13 np0005588920 nova_compute[226886]: 2026-01-20 14:40:13.938 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:13 np0005588920 nova_compute[226886]: 2026-01-20 14:40:13.947 226890 INFO nova.virt.libvirt.driver [-] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Instance destroyed successfully.#033[00m
Jan 20 09:40:13 np0005588920 nova_compute[226886]: 2026-01-20 14:40:13.948 226890 DEBUG nova.objects.instance [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Lazy-loading 'resources' on Instance uuid 795b0a95-448b-49b1-80cb-a18e84101480 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:40:13 np0005588920 nova_compute[226886]: 2026-01-20 14:40:13.953 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:13 np0005588920 kernel: tapb3a21065-10: left promiscuous mode
Jan 20 09:40:13 np0005588920 nova_compute[226886]: 2026-01-20 14:40:13.966 226890 DEBUG nova.virt.libvirt.vif [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:39:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-68911616',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-68911616',id=64,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPmqoi4p8kSyWXNxQ3a/6cHt6bcPdJIB4+7iQjfjSS/GZvQEk00ft0q8g9eYHEm/6qNbWlrQRShcWhErCzsftWYt7Pg9lwI5WvcUf4Z28u7I0nTtYrQ91Z0PLrzqP57aiA==',key_name='tempest-keypair-396780534',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:39:33Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8705404c3964472782118e478eb54e51',ramdisk_id='',reservation_id='r-q1o120zj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus
='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TaggedBootDevicesTest_v242-1563719502',owner_user_name='tempest-TaggedBootDevicesTest_v242-1563719502-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:39:33Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='cd4ba32a01f74af199438da0b72e5a4d',uuid=795b0a95-448b-49b1-80cb-a18e84101480,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9b0fc629-48f5-469d-90d2-c26339c16eec", "address": "fa:16:3e:11:9c:9c", "network": {"id": "b3a21065-10a5-474d-b42f-ffe66242a479", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest_v242-807218413-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b0fc629-48", "ovs_interfaceid": "9b0fc629-48f5-469d-90d2-c26339c16eec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:40:13 np0005588920 nova_compute[226886]: 2026-01-20 14:40:13.967 226890 DEBUG nova.network.os_vif_util [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Converting VIF {"id": "9b0fc629-48f5-469d-90d2-c26339c16eec", "address": "fa:16:3e:11:9c:9c", "network": {"id": "b3a21065-10a5-474d-b42f-ffe66242a479", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest_v242-807218413-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b0fc629-48", "ovs_interfaceid": "9b0fc629-48f5-469d-90d2-c26339c16eec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:40:13 np0005588920 nova_compute[226886]: 2026-01-20 14:40:13.967 226890 DEBUG nova.network.os_vif_util [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:11:9c:9c,bridge_name='br-int',has_traffic_filtering=True,id=9b0fc629-48f5-469d-90d2-c26339c16eec,network=Network(b3a21065-10a5-474d-b42f-ffe66242a479),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9b0fc629-48') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:40:13 np0005588920 nova_compute[226886]: 2026-01-20 14:40:13.967 226890 DEBUG os_vif [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:11:9c:9c,bridge_name='br-int',has_traffic_filtering=True,id=9b0fc629-48f5-469d-90d2-c26339c16eec,network=Network(b3a21065-10a5-474d-b42f-ffe66242a479),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9b0fc629-48') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:40:13 np0005588920 nova_compute[226886]: 2026-01-20 14:40:13.969 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:13 np0005588920 nova_compute[226886]: 2026-01-20 14:40:13.969 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9b0fc629-48, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:13 np0005588920 nova_compute[226886]: 2026-01-20 14:40:13.972 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:13 np0005588920 nova_compute[226886]: 2026-01-20 14:40:13.973 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:40:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:13.973 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[56c44674-e2bf-4d9b-b262-be6146ba67bc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:13 np0005588920 nova_compute[226886]: 2026-01-20 14:40:13.986 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:13 np0005588920 nova_compute[226886]: 2026-01-20 14:40:13.990 226890 INFO os_vif [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:11:9c:9c,bridge_name='br-int',has_traffic_filtering=True,id=9b0fc629-48f5-469d-90d2-c26339c16eec,network=Network(b3a21065-10a5-474d-b42f-ffe66242a479),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9b0fc629-48')#033[00m
Jan 20 09:40:13 np0005588920 nova_compute[226886]: 2026-01-20 14:40:13.990 226890 DEBUG nova.virt.libvirt.vif [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:39:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-68911616',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-68911616',id=64,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPmqoi4p8kSyWXNxQ3a/6cHt6bcPdJIB4+7iQjfjSS/GZvQEk00ft0q8g9eYHEm/6qNbWlrQRShcWhErCzsftWYt7Pg9lwI5WvcUf4Z28u7I0nTtYrQ91Z0PLrzqP57aiA==',key_name='tempest-keypair-396780534',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:39:33Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8705404c3964472782118e478eb54e51',ramdisk_id='',reservation_id='r-q1o120zj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus
='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TaggedBootDevicesTest_v242-1563719502',owner_user_name='tempest-TaggedBootDevicesTest_v242-1563719502-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:39:33Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='cd4ba32a01f74af199438da0b72e5a4d',uuid=795b0a95-448b-49b1-80cb-a18e84101480,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "96b6b319-e96c-4182-b940-9f154499e22d", "address": "fa:16:3e:1a:ef:b0", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.91", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96b6b319-e9", "ovs_interfaceid": "96b6b319-e96c-4182-b940-9f154499e22d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:40:13 np0005588920 nova_compute[226886]: 2026-01-20 14:40:13.990 226890 DEBUG nova.network.os_vif_util [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Converting VIF {"id": "96b6b319-e96c-4182-b940-9f154499e22d", "address": "fa:16:3e:1a:ef:b0", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.91", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96b6b319-e9", "ovs_interfaceid": "96b6b319-e96c-4182-b940-9f154499e22d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:40:13 np0005588920 nova_compute[226886]: 2026-01-20 14:40:13.991 226890 DEBUG nova.network.os_vif_util [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1a:ef:b0,bridge_name='br-int',has_traffic_filtering=True,id=96b6b319-e96c-4182-b940-9f154499e22d,network=Network(d12cbe78-47c0-4f23-98a7-4ed621f8c3a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap96b6b319-e9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:40:13 np0005588920 nova_compute[226886]: 2026-01-20 14:40:13.991 226890 DEBUG os_vif [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1a:ef:b0,bridge_name='br-int',has_traffic_filtering=True,id=96b6b319-e96c-4182-b940-9f154499e22d,network=Network(d12cbe78-47c0-4f23-98a7-4ed621f8c3a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap96b6b319-e9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:40:13 np0005588920 nova_compute[226886]: 2026-01-20 14:40:13.992 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:13 np0005588920 nova_compute[226886]: 2026-01-20 14:40:13.992 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap96b6b319-e9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:13 np0005588920 nova_compute[226886]: 2026-01-20 14:40:13.993 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:13.994 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[01fd8105-8678-40f8-b0e6-7a8d7c93ec6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:13.995 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[3334e3d6-3c2f-46a1-afe8-96d289c42005]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:13 np0005588920 nova_compute[226886]: 2026-01-20 14:40:13.996 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:40:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:14.009 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d12fff60-6832-4bad-8c93-77e4220d19da]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 501332, 'reachable_time': 27284, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 250304, 'error': None, 'target': 'ovnmeta-b3a21065-10a5-474d-b42f-ffe66242a479', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:14 np0005588920 systemd[1]: run-netns-ovnmeta\x2db3a21065\x2d10a5\x2d474d\x2db42f\x2dffe66242a479.mount: Deactivated successfully.
Jan 20 09:40:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:14.011 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b3a21065-10a5-474d-b42f-ffe66242a479 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:40:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:14.011 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[5831eb7c-c399-4128-9d18-1a6e437061e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:14.013 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 96b6b319-e96c-4182-b940-9f154499e22d in datapath d12cbe78-47c0-4f23-98a7-4ed621f8c3a3 unbound from our chassis#033[00m
Jan 20 09:40:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:14.014 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d12cbe78-47c0-4f23-98a7-4ed621f8c3a3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:40:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:14.015 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e44f4b22-09d7-4293-92ea-268c8c5911b4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:14.015 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d12cbe78-47c0-4f23-98a7-4ed621f8c3a3 namespace which is not needed anymore#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.017 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.018 226890 INFO os_vif [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1a:ef:b0,bridge_name='br-int',has_traffic_filtering=True,id=96b6b319-e96c-4182-b940-9f154499e22d,network=Network(d12cbe78-47c0-4f23-98a7-4ed621f8c3a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap96b6b319-e9')#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.019 226890 DEBUG nova.virt.libvirt.vif [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:39:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-68911616',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-68911616',id=64,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPmqoi4p8kSyWXNxQ3a/6cHt6bcPdJIB4+7iQjfjSS/GZvQEk00ft0q8g9eYHEm/6qNbWlrQRShcWhErCzsftWYt7Pg9lwI5WvcUf4Z28u7I0nTtYrQ91Z0PLrzqP57aiA==',key_name='tempest-keypair-396780534',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:39:33Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8705404c3964472782118e478eb54e51',ramdisk_id='',reservation_id='r-q1o120zj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus
='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TaggedBootDevicesTest_v242-1563719502',owner_user_name='tempest-TaggedBootDevicesTest_v242-1563719502-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:39:33Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='cd4ba32a01f74af199438da0b72e5a4d',uuid=795b0a95-448b-49b1-80cb-a18e84101480,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a6658829-ef19-4914-bbb3-35b718691c7c", "address": "fa:16:3e:c8:a5:f5", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.112", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6658829-ef", "ovs_interfaceid": "a6658829-ef19-4914-bbb3-35b718691c7c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.019 226890 DEBUG nova.network.os_vif_util [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Converting VIF {"id": "a6658829-ef19-4914-bbb3-35b718691c7c", "address": "fa:16:3e:c8:a5:f5", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.112", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6658829-ef", "ovs_interfaceid": "a6658829-ef19-4914-bbb3-35b718691c7c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.020 226890 DEBUG nova.network.os_vif_util [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c8:a5:f5,bridge_name='br-int',has_traffic_filtering=True,id=a6658829-ef19-4914-bbb3-35b718691c7c,network=Network(d12cbe78-47c0-4f23-98a7-4ed621f8c3a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa6658829-ef') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.020 226890 DEBUG os_vif [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c8:a5:f5,bridge_name='br-int',has_traffic_filtering=True,id=a6658829-ef19-4914-bbb3-35b718691c7c,network=Network(d12cbe78-47c0-4f23-98a7-4ed621f8c3a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa6658829-ef') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.021 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.021 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa6658829-ef, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.022 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.024 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.031 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.035 226890 DEBUG nova.compute.manager [req-ede1ef48-da1a-49b9-95f4-766b8b21d93d req-11e8444c-c4a6-425a-ad07-fd1fae3a0c7d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received event network-vif-unplugged-b03b1a08-920f-4340-b77b-37669cc14a07 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.035 226890 DEBUG oslo_concurrency.lockutils [req-ede1ef48-da1a-49b9-95f4-766b8b21d93d req-11e8444c-c4a6-425a-ad07-fd1fae3a0c7d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "795b0a95-448b-49b1-80cb-a18e84101480-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.035 226890 DEBUG oslo_concurrency.lockutils [req-ede1ef48-da1a-49b9-95f4-766b8b21d93d req-11e8444c-c4a6-425a-ad07-fd1fae3a0c7d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.036 226890 DEBUG oslo_concurrency.lockutils [req-ede1ef48-da1a-49b9-95f4-766b8b21d93d req-11e8444c-c4a6-425a-ad07-fd1fae3a0c7d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.036 226890 DEBUG nova.compute.manager [req-ede1ef48-da1a-49b9-95f4-766b8b21d93d req-11e8444c-c4a6-425a-ad07-fd1fae3a0c7d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] No waiting events found dispatching network-vif-unplugged-b03b1a08-920f-4340-b77b-37669cc14a07 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.036 226890 DEBUG nova.compute.manager [req-ede1ef48-da1a-49b9-95f4-766b8b21d93d req-11e8444c-c4a6-425a-ad07-fd1fae3a0c7d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received event network-vif-unplugged-b03b1a08-920f-4340-b77b-37669cc14a07 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.037 226890 INFO os_vif [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c8:a5:f5,bridge_name='br-int',has_traffic_filtering=True,id=a6658829-ef19-4914-bbb3-35b718691c7c,network=Network(d12cbe78-47c0-4f23-98a7-4ed621f8c3a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa6658829-ef')#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.037 226890 DEBUG nova.virt.libvirt.vif [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:39:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-68911616',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-68911616',id=64,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPmqoi4p8kSyWXNxQ3a/6cHt6bcPdJIB4+7iQjfjSS/GZvQEk00ft0q8g9eYHEm/6qNbWlrQRShcWhErCzsftWYt7Pg9lwI5WvcUf4Z28u7I0nTtYrQ91Z0PLrzqP57aiA==',key_name='tempest-keypair-396780534',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:39:33Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8705404c3964472782118e478eb54e51',ramdisk_id='',reservation_id='r-q1o120zj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus
='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TaggedBootDevicesTest_v242-1563719502',owner_user_name='tempest-TaggedBootDevicesTest_v242-1563719502-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:39:33Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='cd4ba32a01f74af199438da0b72e5a4d',uuid=795b0a95-448b-49b1-80cb-a18e84101480,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b03b1a08-920f-4340-b77b-37669cc14a07", "address": "fa:16:3e:b9:ca:05", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.214", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb03b1a08-92", "ovs_interfaceid": "b03b1a08-920f-4340-b77b-37669cc14a07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.038 226890 DEBUG nova.network.os_vif_util [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Converting VIF {"id": "b03b1a08-920f-4340-b77b-37669cc14a07", "address": "fa:16:3e:b9:ca:05", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.214", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb03b1a08-92", "ovs_interfaceid": "b03b1a08-920f-4340-b77b-37669cc14a07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.038 226890 DEBUG nova.network.os_vif_util [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b9:ca:05,bridge_name='br-int',has_traffic_filtering=True,id=b03b1a08-920f-4340-b77b-37669cc14a07,network=Network(d12cbe78-47c0-4f23-98a7-4ed621f8c3a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb03b1a08-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.038 226890 DEBUG os_vif [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:ca:05,bridge_name='br-int',has_traffic_filtering=True,id=b03b1a08-920f-4340-b77b-37669cc14a07,network=Network(d12cbe78-47c0-4f23-98a7-4ed621f8c3a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb03b1a08-92') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.039 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.039 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb03b1a08-92, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.040 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.042 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.049 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.050 226890 INFO os_vif [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:ca:05,bridge_name='br-int',has_traffic_filtering=True,id=b03b1a08-920f-4340-b77b-37669cc14a07,network=Network(d12cbe78-47c0-4f23-98a7-4ed621f8c3a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb03b1a08-92')#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.051 226890 DEBUG nova.virt.libvirt.vif [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:39:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-68911616',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-68911616',id=64,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPmqoi4p8kSyWXNxQ3a/6cHt6bcPdJIB4+7iQjfjSS/GZvQEk00ft0q8g9eYHEm/6qNbWlrQRShcWhErCzsftWYt7Pg9lwI5WvcUf4Z28u7I0nTtYrQ91Z0PLrzqP57aiA==',key_name='tempest-keypair-396780534',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:39:33Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8705404c3964472782118e478eb54e51',ramdisk_id='',reservation_id='r-q1o120zj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus
='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TaggedBootDevicesTest_v242-1563719502',owner_user_name='tempest-TaggedBootDevicesTest_v242-1563719502-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:39:33Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='cd4ba32a01f74af199438da0b72e5a4d',uuid=795b0a95-448b-49b1-80cb-a18e84101480,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7a84a919-d309-4f1a-99b8-4792dcdec990", "address": "fa:16:3e:c7:9a:c3", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.32", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a84a919-d3", "ovs_interfaceid": "7a84a919-d309-4f1a-99b8-4792dcdec990", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.051 226890 DEBUG nova.network.os_vif_util [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Converting VIF {"id": "7a84a919-d309-4f1a-99b8-4792dcdec990", "address": "fa:16:3e:c7:9a:c3", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.32", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a84a919-d3", "ovs_interfaceid": "7a84a919-d309-4f1a-99b8-4792dcdec990", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.052 226890 DEBUG nova.network.os_vif_util [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c7:9a:c3,bridge_name='br-int',has_traffic_filtering=True,id=7a84a919-d309-4f1a-99b8-4792dcdec990,network=Network(d12cbe78-47c0-4f23-98a7-4ed621f8c3a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a84a919-d3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.052 226890 DEBUG os_vif [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c7:9a:c3,bridge_name='br-int',has_traffic_filtering=True,id=7a84a919-d309-4f1a-99b8-4792dcdec990,network=Network(d12cbe78-47c0-4f23-98a7-4ed621f8c3a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a84a919-d3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.053 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.053 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7a84a919-d3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.054 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.057 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.061 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.063 226890 INFO os_vif [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c7:9a:c3,bridge_name='br-int',has_traffic_filtering=True,id=7a84a919-d309-4f1a-99b8-4792dcdec990,network=Network(d12cbe78-47c0-4f23-98a7-4ed621f8c3a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a84a919-d3')#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.064 226890 DEBUG nova.virt.libvirt.vif [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:39:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-68911616',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-68911616',id=64,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPmqoi4p8kSyWXNxQ3a/6cHt6bcPdJIB4+7iQjfjSS/GZvQEk00ft0q8g9eYHEm/6qNbWlrQRShcWhErCzsftWYt7Pg9lwI5WvcUf4Z28u7I0nTtYrQ91Z0PLrzqP57aiA==',key_name='tempest-keypair-396780534',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:39:33Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8705404c3964472782118e478eb54e51',ramdisk_id='',reservation_id='r-q1o120zj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus
='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TaggedBootDevicesTest_v242-1563719502',owner_user_name='tempest-TaggedBootDevicesTest_v242-1563719502-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:39:33Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='cd4ba32a01f74af199438da0b72e5a4d',uuid=795b0a95-448b-49b1-80cb-a18e84101480,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ff70124b-befc-46fa-b2cb-bc4bd4a49942", "address": "fa:16:3e:97:d4:a5", "network": {"id": "a36edd9d-12ce-4779-97fe-f75c00d85dcd", "bridge": "br-int", "label": "tempest-device-tagging-net2-56635846", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff70124b-be", "ovs_interfaceid": "ff70124b-befc-46fa-b2cb-bc4bd4a49942", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.064 226890 DEBUG nova.network.os_vif_util [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Converting VIF {"id": "ff70124b-befc-46fa-b2cb-bc4bd4a49942", "address": "fa:16:3e:97:d4:a5", "network": {"id": "a36edd9d-12ce-4779-97fe-f75c00d85dcd", "bridge": "br-int", "label": "tempest-device-tagging-net2-56635846", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff70124b-be", "ovs_interfaceid": "ff70124b-befc-46fa-b2cb-bc4bd4a49942", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.065 226890 DEBUG nova.network.os_vif_util [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:97:d4:a5,bridge_name='br-int',has_traffic_filtering=True,id=ff70124b-befc-46fa-b2cb-bc4bd4a49942,network=Network(a36edd9d-12ce-4779-97fe-f75c00d85dcd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapff70124b-be') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.065 226890 DEBUG os_vif [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:d4:a5,bridge_name='br-int',has_traffic_filtering=True,id=ff70124b-befc-46fa-b2cb-bc4bd4a49942,network=Network(a36edd9d-12ce-4779-97fe-f75c00d85dcd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapff70124b-be') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.066 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.066 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapff70124b-be, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.067 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.069 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.071 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.073 226890 INFO os_vif [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:d4:a5,bridge_name='br-int',has_traffic_filtering=True,id=ff70124b-befc-46fa-b2cb-bc4bd4a49942,network=Network(a36edd9d-12ce-4779-97fe-f75c00d85dcd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapff70124b-be')#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.073 226890 DEBUG nova.virt.libvirt.vif [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:39:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-68911616',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-68911616',id=64,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPmqoi4p8kSyWXNxQ3a/6cHt6bcPdJIB4+7iQjfjSS/GZvQEk00ft0q8g9eYHEm/6qNbWlrQRShcWhErCzsftWYt7Pg9lwI5WvcUf4Z28u7I0nTtYrQ91Z0PLrzqP57aiA==',key_name='tempest-keypair-396780534',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:39:33Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8705404c3964472782118e478eb54e51',ramdisk_id='',reservation_id='r-q1o120zj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus
='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TaggedBootDevicesTest_v242-1563719502',owner_user_name='tempest-TaggedBootDevicesTest_v242-1563719502-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:39:33Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='cd4ba32a01f74af199438da0b72e5a4d',uuid=795b0a95-448b-49b1-80cb-a18e84101480,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5096b763-1b08-448f-a6bd-f63bcd65def6", "address": "fa:16:3e:5e:80:45", "network": {"id": "a36edd9d-12ce-4779-97fe-f75c00d85dcd", "bridge": "br-int", "label": "tempest-device-tagging-net2-56635846", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5096b763-1b", "ovs_interfaceid": "5096b763-1b08-448f-a6bd-f63bcd65def6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.073 226890 DEBUG nova.network.os_vif_util [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Converting VIF {"id": "5096b763-1b08-448f-a6bd-f63bcd65def6", "address": "fa:16:3e:5e:80:45", "network": {"id": "a36edd9d-12ce-4779-97fe-f75c00d85dcd", "bridge": "br-int", "label": "tempest-device-tagging-net2-56635846", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5096b763-1b", "ovs_interfaceid": "5096b763-1b08-448f-a6bd-f63bcd65def6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.074 226890 DEBUG nova.network.os_vif_util [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5e:80:45,bridge_name='br-int',has_traffic_filtering=True,id=5096b763-1b08-448f-a6bd-f63bcd65def6,network=Network(a36edd9d-12ce-4779-97fe-f75c00d85dcd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5096b763-1b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.074 226890 DEBUG os_vif [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5e:80:45,bridge_name='br-int',has_traffic_filtering=True,id=5096b763-1b08-448f-a6bd-f63bcd65def6,network=Network(a36edd9d-12ce-4779-97fe-f75c00d85dcd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5096b763-1b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.075 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.075 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5096b763-1b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.076 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.077 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.079 226890 INFO os_vif [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5e:80:45,bridge_name='br-int',has_traffic_filtering=True,id=5096b763-1b08-448f-a6bd-f63bcd65def6,network=Network(a36edd9d-12ce-4779-97fe-f75c00d85dcd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5096b763-1b')#033[00m
Jan 20 09:40:14 np0005588920 neutron-haproxy-ovnmeta-d12cbe78-47c0-4f23-98a7-4ed621f8c3a3[249773]: [NOTICE]   (249777) : haproxy version is 2.8.14-c23fe91
Jan 20 09:40:14 np0005588920 neutron-haproxy-ovnmeta-d12cbe78-47c0-4f23-98a7-4ed621f8c3a3[249773]: [NOTICE]   (249777) : path to executable is /usr/sbin/haproxy
Jan 20 09:40:14 np0005588920 neutron-haproxy-ovnmeta-d12cbe78-47c0-4f23-98a7-4ed621f8c3a3[249773]: [WARNING]  (249777) : Exiting Master process...
Jan 20 09:40:14 np0005588920 neutron-haproxy-ovnmeta-d12cbe78-47c0-4f23-98a7-4ed621f8c3a3[249773]: [ALERT]    (249777) : Current worker (249779) exited with code 143 (Terminated)
Jan 20 09:40:14 np0005588920 neutron-haproxy-ovnmeta-d12cbe78-47c0-4f23-98a7-4ed621f8c3a3[249773]: [WARNING]  (249777) : All workers exited. Exiting... (0)
Jan 20 09:40:14 np0005588920 systemd[1]: libpod-75c2e5e1746b49218dbd4bf0f04997dcc4c57a444abaa8ba57c893cc1dfc8c97.scope: Deactivated successfully.
Jan 20 09:40:14 np0005588920 podman[250338]: 2026-01-20 14:40:14.138616054 +0000 UTC m=+0.039011638 container died 75c2e5e1746b49218dbd4bf0f04997dcc4c57a444abaa8ba57c893cc1dfc8c97 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d12cbe78-47c0-4f23-98a7-4ed621f8c3a3, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 20 09:40:14 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-75c2e5e1746b49218dbd4bf0f04997dcc4c57a444abaa8ba57c893cc1dfc8c97-userdata-shm.mount: Deactivated successfully.
Jan 20 09:40:14 np0005588920 systemd[1]: var-lib-containers-storage-overlay-c645bc67a6a8362f49f9aa55f94e8c822fcd92599a6c86d5356ba7d9f9472734-merged.mount: Deactivated successfully.
Jan 20 09:40:14 np0005588920 podman[250338]: 2026-01-20 14:40:14.175362939 +0000 UTC m=+0.075758523 container cleanup 75c2e5e1746b49218dbd4bf0f04997dcc4c57a444abaa8ba57c893cc1dfc8c97 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d12cbe78-47c0-4f23-98a7-4ed621f8c3a3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 20 09:40:14 np0005588920 systemd[1]: libpod-conmon-75c2e5e1746b49218dbd4bf0f04997dcc4c57a444abaa8ba57c893cc1dfc8c97.scope: Deactivated successfully.
Jan 20 09:40:14 np0005588920 podman[250387]: 2026-01-20 14:40:14.233599072 +0000 UTC m=+0.039383409 container remove 75c2e5e1746b49218dbd4bf0f04997dcc4c57a444abaa8ba57c893cc1dfc8c97 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d12cbe78-47c0-4f23-98a7-4ed621f8c3a3, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 20 09:40:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:14.238 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[6494a374-699b-40e3-bbb4-41acf2a97de1]: (4, ('Tue Jan 20 02:40:14 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-d12cbe78-47c0-4f23-98a7-4ed621f8c3a3 (75c2e5e1746b49218dbd4bf0f04997dcc4c57a444abaa8ba57c893cc1dfc8c97)\n75c2e5e1746b49218dbd4bf0f04997dcc4c57a444abaa8ba57c893cc1dfc8c97\nTue Jan 20 02:40:14 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d12cbe78-47c0-4f23-98a7-4ed621f8c3a3 (75c2e5e1746b49218dbd4bf0f04997dcc4c57a444abaa8ba57c893cc1dfc8c97)\n75c2e5e1746b49218dbd4bf0f04997dcc4c57a444abaa8ba57c893cc1dfc8c97\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:14.239 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[96ba0884-9ad7-490d-bf3d-832281f02142]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:14.240 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd12cbe78-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.241 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:14 np0005588920 kernel: tapd12cbe78-40: left promiscuous mode
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.255 226890 INFO nova.virt.libvirt.driver [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Deleting instance files /var/lib/nova/instances/795b0a95-448b-49b1-80cb-a18e84101480_del#033[00m
Jan 20 09:40:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:14.256 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[3c877673-d81a-4e12-b260-024476fabe75]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.256 226890 INFO nova.virt.libvirt.driver [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Deletion of /var/lib/nova/instances/795b0a95-448b-49b1-80cb-a18e84101480_del complete#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.259 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:14.269 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[00edc885-f006-40e9-be48-ab6df147869b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:14.270 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[77dd4711-c85f-4670-8884-7137c9086044]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:14.283 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d47ccd4e-d976-48bd-a822-5c2e53775065]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 501246, 'reachable_time': 38141, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 250403, 'error': None, 'target': 'ovnmeta-d12cbe78-47c0-4f23-98a7-4ed621f8c3a3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:14.284 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d12cbe78-47c0-4f23-98a7-4ed621f8c3a3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:40:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:14.284 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[da7c3a12-81b5-4d9f-991c-b53d086a6ae6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:14.285 144128 INFO neutron.agent.ovn.metadata.agent [-] Port a6658829-ef19-4914-bbb3-35b718691c7c in datapath d12cbe78-47c0-4f23-98a7-4ed621f8c3a3 unbound from our chassis#033[00m
Jan 20 09:40:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:14.286 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d12cbe78-47c0-4f23-98a7-4ed621f8c3a3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:40:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:14.287 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[af50208b-9277-4458-8b7c-bee3c961fb65]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:14.287 144128 INFO neutron.agent.ovn.metadata.agent [-] Port b03b1a08-920f-4340-b77b-37669cc14a07 in datapath d12cbe78-47c0-4f23-98a7-4ed621f8c3a3 unbound from our chassis#033[00m
Jan 20 09:40:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:14.288 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d12cbe78-47c0-4f23-98a7-4ed621f8c3a3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:40:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:14.289 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[4863a049-d6ae-4753-8687-d39af0786175]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:14.289 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 7a84a919-d309-4f1a-99b8-4792dcdec990 in datapath d12cbe78-47c0-4f23-98a7-4ed621f8c3a3 unbound from our chassis#033[00m
Jan 20 09:40:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:14.290 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d12cbe78-47c0-4f23-98a7-4ed621f8c3a3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:40:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:14.291 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[dd643ad0-3985-4459-a82d-ddd423a6dedb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:14.291 144128 INFO neutron.agent.ovn.metadata.agent [-] Port ff70124b-befc-46fa-b2cb-bc4bd4a49942 in datapath a36edd9d-12ce-4779-97fe-f75c00d85dcd unbound from our chassis#033[00m
Jan 20 09:40:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:14.292 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a36edd9d-12ce-4779-97fe-f75c00d85dcd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:40:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:14.293 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[346c3f54-b360-49ec-8d5e-d7198cda20fe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:14.293 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a36edd9d-12ce-4779-97fe-f75c00d85dcd namespace which is not needed anymore#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.318 226890 INFO nova.compute.manager [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Took 1.07 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.319 226890 DEBUG oslo.service.loopingcall [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.319 226890 DEBUG nova.compute.manager [-] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.320 226890 DEBUG nova.network.neutron [-] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:40:14 np0005588920 neutron-haproxy-ovnmeta-a36edd9d-12ce-4779-97fe-f75c00d85dcd[249978]: [NOTICE]   (249982) : haproxy version is 2.8.14-c23fe91
Jan 20 09:40:14 np0005588920 neutron-haproxy-ovnmeta-a36edd9d-12ce-4779-97fe-f75c00d85dcd[249978]: [NOTICE]   (249982) : path to executable is /usr/sbin/haproxy
Jan 20 09:40:14 np0005588920 neutron-haproxy-ovnmeta-a36edd9d-12ce-4779-97fe-f75c00d85dcd[249978]: [WARNING]  (249982) : Exiting Master process...
Jan 20 09:40:14 np0005588920 neutron-haproxy-ovnmeta-a36edd9d-12ce-4779-97fe-f75c00d85dcd[249978]: [ALERT]    (249982) : Current worker (249984) exited with code 143 (Terminated)
Jan 20 09:40:14 np0005588920 neutron-haproxy-ovnmeta-a36edd9d-12ce-4779-97fe-f75c00d85dcd[249978]: [WARNING]  (249982) : All workers exited. Exiting... (0)
Jan 20 09:40:14 np0005588920 systemd[1]: libpod-2e6c730101f45f795ca0e3f94bad37e770b3dfac08300eb7c54a40bd9657b8e2.scope: Deactivated successfully.
Jan 20 09:40:14 np0005588920 podman[250422]: 2026-01-20 14:40:14.406183534 +0000 UTC m=+0.038796403 container died 2e6c730101f45f795ca0e3f94bad37e770b3dfac08300eb7c54a40bd9657b8e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a36edd9d-12ce-4779-97fe-f75c00d85dcd, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:40:14 np0005588920 podman[250422]: 2026-01-20 14:40:14.441821907 +0000 UTC m=+0.074434926 container cleanup 2e6c730101f45f795ca0e3f94bad37e770b3dfac08300eb7c54a40bd9657b8e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a36edd9d-12ce-4779-97fe-f75c00d85dcd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 20 09:40:14 np0005588920 systemd[1]: libpod-conmon-2e6c730101f45f795ca0e3f94bad37e770b3dfac08300eb7c54a40bd9657b8e2.scope: Deactivated successfully.
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.479 226890 DEBUG nova.compute.manager [req-7003d4a6-6212-4a6e-9ddc-8575e6daab11 req-697e950f-f77b-45e1-8454-e66eb704f67b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received event network-vif-unplugged-7a84a919-d309-4f1a-99b8-4792dcdec990 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.479 226890 DEBUG oslo_concurrency.lockutils [req-7003d4a6-6212-4a6e-9ddc-8575e6daab11 req-697e950f-f77b-45e1-8454-e66eb704f67b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "795b0a95-448b-49b1-80cb-a18e84101480-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.479 226890 DEBUG oslo_concurrency.lockutils [req-7003d4a6-6212-4a6e-9ddc-8575e6daab11 req-697e950f-f77b-45e1-8454-e66eb704f67b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.480 226890 DEBUG oslo_concurrency.lockutils [req-7003d4a6-6212-4a6e-9ddc-8575e6daab11 req-697e950f-f77b-45e1-8454-e66eb704f67b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.480 226890 DEBUG nova.compute.manager [req-7003d4a6-6212-4a6e-9ddc-8575e6daab11 req-697e950f-f77b-45e1-8454-e66eb704f67b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] No waiting events found dispatching network-vif-unplugged-7a84a919-d309-4f1a-99b8-4792dcdec990 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.480 226890 DEBUG nova.compute.manager [req-7003d4a6-6212-4a6e-9ddc-8575e6daab11 req-697e950f-f77b-45e1-8454-e66eb704f67b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received event network-vif-unplugged-7a84a919-d309-4f1a-99b8-4792dcdec990 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:40:14 np0005588920 podman[250453]: 2026-01-20 14:40:14.499822994 +0000 UTC m=+0.040221292 container remove 2e6c730101f45f795ca0e3f94bad37e770b3dfac08300eb7c54a40bd9657b8e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a36edd9d-12ce-4779-97fe-f75c00d85dcd, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 20 09:40:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:14.504 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[95d6ecc3-8c24-47d8-9af1-f077c4261e64]: (4, ('Tue Jan 20 02:40:14 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-a36edd9d-12ce-4779-97fe-f75c00d85dcd (2e6c730101f45f795ca0e3f94bad37e770b3dfac08300eb7c54a40bd9657b8e2)\n2e6c730101f45f795ca0e3f94bad37e770b3dfac08300eb7c54a40bd9657b8e2\nTue Jan 20 02:40:14 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-a36edd9d-12ce-4779-97fe-f75c00d85dcd (2e6c730101f45f795ca0e3f94bad37e770b3dfac08300eb7c54a40bd9657b8e2)\n2e6c730101f45f795ca0e3f94bad37e770b3dfac08300eb7c54a40bd9657b8e2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:14.506 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[dc334a7d-0dae-42f2-a8f6-5e3fa00c21dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:14.507 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa36edd9d-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.509 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:14 np0005588920 kernel: tapa36edd9d-10: left promiscuous mode
Jan 20 09:40:14 np0005588920 nova_compute[226886]: 2026-01-20 14:40:14.536 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:14.539 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2b986906-cd1a-43b1-9806-d7e457274db2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:14.559 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f4b6c82c-2438-4b8f-a429-1fbe82c852aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:14.560 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[dd617cff-51e3-46cd-9ca1-72b982e04c2f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:14.575 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[7655090b-085e-4483-a15a-3b29b0ecbe63]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 501444, 'reachable_time': 25754, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 250468, 'error': None, 'target': 'ovnmeta-a36edd9d-12ce-4779-97fe-f75c00d85dcd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:14.579 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a36edd9d-12ce-4779-97fe-f75c00d85dcd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:40:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:14.579 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[69a0066e-0479-4a73-a09b-0d937ff92a32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:14.580 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 5096b763-1b08-448f-a6bd-f63bcd65def6 in datapath a36edd9d-12ce-4779-97fe-f75c00d85dcd unbound from our chassis#033[00m
Jan 20 09:40:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:14.582 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a36edd9d-12ce-4779-97fe-f75c00d85dcd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:40:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:14.583 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e5ffc2b1-809a-47c3-bcec-608e5b867cee]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:14 np0005588920 systemd[1]: var-lib-containers-storage-overlay-9a0924e85c686d8695d859d46be9da4f5e6c4cc11a5cef51ccca9b849a2f1e0f-merged.mount: Deactivated successfully.
Jan 20 09:40:14 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2e6c730101f45f795ca0e3f94bad37e770b3dfac08300eb7c54a40bd9657b8e2-userdata-shm.mount: Deactivated successfully.
Jan 20 09:40:14 np0005588920 systemd[1]: run-netns-ovnmeta\x2da36edd9d\x2d12ce\x2d4779\x2d97fe\x2df75c00d85dcd.mount: Deactivated successfully.
Jan 20 09:40:14 np0005588920 systemd[1]: run-netns-ovnmeta\x2dd12cbe78\x2d47c0\x2d4f23\x2d98a7\x2d4ed621f8c3a3.mount: Deactivated successfully.
Jan 20 09:40:15 np0005588920 nova_compute[226886]: 2026-01-20 14:40:15.089 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:40:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:15.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:40:15 np0005588920 nova_compute[226886]: 2026-01-20 14:40:15.499 226890 DEBUG nova.compute.manager [req-69b3e2eb-c277-4215-b973-3c859ebdfe05 req-0a014ef5-fe7e-4653-93ca-854990060307 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received event network-vif-unplugged-96b6b319-e96c-4182-b940-9f154499e22d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:40:15 np0005588920 nova_compute[226886]: 2026-01-20 14:40:15.499 226890 DEBUG oslo_concurrency.lockutils [req-69b3e2eb-c277-4215-b973-3c859ebdfe05 req-0a014ef5-fe7e-4653-93ca-854990060307 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "795b0a95-448b-49b1-80cb-a18e84101480-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:40:15 np0005588920 nova_compute[226886]: 2026-01-20 14:40:15.500 226890 DEBUG oslo_concurrency.lockutils [req-69b3e2eb-c277-4215-b973-3c859ebdfe05 req-0a014ef5-fe7e-4653-93ca-854990060307 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:40:15 np0005588920 nova_compute[226886]: 2026-01-20 14:40:15.500 226890 DEBUG oslo_concurrency.lockutils [req-69b3e2eb-c277-4215-b973-3c859ebdfe05 req-0a014ef5-fe7e-4653-93ca-854990060307 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:40:15 np0005588920 nova_compute[226886]: 2026-01-20 14:40:15.500 226890 DEBUG nova.compute.manager [req-69b3e2eb-c277-4215-b973-3c859ebdfe05 req-0a014ef5-fe7e-4653-93ca-854990060307 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] No waiting events found dispatching network-vif-unplugged-96b6b319-e96c-4182-b940-9f154499e22d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:40:15 np0005588920 nova_compute[226886]: 2026-01-20 14:40:15.500 226890 DEBUG nova.compute.manager [req-69b3e2eb-c277-4215-b973-3c859ebdfe05 req-0a014ef5-fe7e-4653-93ca-854990060307 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received event network-vif-unplugged-96b6b319-e96c-4182-b940-9f154499e22d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:40:15 np0005588920 nova_compute[226886]: 2026-01-20 14:40:15.664 226890 DEBUG nova.compute.manager [req-d277bfc6-045f-4476-9863-bd5938769b3c req-03469c4a-83b1-485c-abcd-f9f35f27b65a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received event network-vif-unplugged-9b0fc629-48f5-469d-90d2-c26339c16eec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:40:15 np0005588920 nova_compute[226886]: 2026-01-20 14:40:15.665 226890 DEBUG oslo_concurrency.lockutils [req-d277bfc6-045f-4476-9863-bd5938769b3c req-03469c4a-83b1-485c-abcd-f9f35f27b65a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "795b0a95-448b-49b1-80cb-a18e84101480-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:40:15 np0005588920 nova_compute[226886]: 2026-01-20 14:40:15.666 226890 DEBUG oslo_concurrency.lockutils [req-d277bfc6-045f-4476-9863-bd5938769b3c req-03469c4a-83b1-485c-abcd-f9f35f27b65a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:40:15 np0005588920 nova_compute[226886]: 2026-01-20 14:40:15.666 226890 DEBUG oslo_concurrency.lockutils [req-d277bfc6-045f-4476-9863-bd5938769b3c req-03469c4a-83b1-485c-abcd-f9f35f27b65a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:40:15 np0005588920 nova_compute[226886]: 2026-01-20 14:40:15.666 226890 DEBUG nova.compute.manager [req-d277bfc6-045f-4476-9863-bd5938769b3c req-03469c4a-83b1-485c-abcd-f9f35f27b65a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] No waiting events found dispatching network-vif-unplugged-9b0fc629-48f5-469d-90d2-c26339c16eec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:40:15 np0005588920 nova_compute[226886]: 2026-01-20 14:40:15.667 226890 DEBUG nova.compute.manager [req-d277bfc6-045f-4476-9863-bd5938769b3c req-03469c4a-83b1-485c-abcd-f9f35f27b65a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received event network-vif-unplugged-9b0fc629-48f5-469d-90d2-c26339c16eec for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:40:15 np0005588920 nova_compute[226886]: 2026-01-20 14:40:15.667 226890 DEBUG nova.compute.manager [req-d277bfc6-045f-4476-9863-bd5938769b3c req-03469c4a-83b1-485c-abcd-f9f35f27b65a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received event network-vif-plugged-9b0fc629-48f5-469d-90d2-c26339c16eec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:40:15 np0005588920 nova_compute[226886]: 2026-01-20 14:40:15.667 226890 DEBUG oslo_concurrency.lockutils [req-d277bfc6-045f-4476-9863-bd5938769b3c req-03469c4a-83b1-485c-abcd-f9f35f27b65a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "795b0a95-448b-49b1-80cb-a18e84101480-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:40:15 np0005588920 nova_compute[226886]: 2026-01-20 14:40:15.667 226890 DEBUG oslo_concurrency.lockutils [req-d277bfc6-045f-4476-9863-bd5938769b3c req-03469c4a-83b1-485c-abcd-f9f35f27b65a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:40:15 np0005588920 nova_compute[226886]: 2026-01-20 14:40:15.668 226890 DEBUG oslo_concurrency.lockutils [req-d277bfc6-045f-4476-9863-bd5938769b3c req-03469c4a-83b1-485c-abcd-f9f35f27b65a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:40:15 np0005588920 nova_compute[226886]: 2026-01-20 14:40:15.668 226890 DEBUG nova.compute.manager [req-d277bfc6-045f-4476-9863-bd5938769b3c req-03469c4a-83b1-485c-abcd-f9f35f27b65a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] No waiting events found dispatching network-vif-plugged-9b0fc629-48f5-469d-90d2-c26339c16eec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:40:15 np0005588920 nova_compute[226886]: 2026-01-20 14:40:15.668 226890 WARNING nova.compute.manager [req-d277bfc6-045f-4476-9863-bd5938769b3c req-03469c4a-83b1-485c-abcd-f9f35f27b65a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received unexpected event network-vif-plugged-9b0fc629-48f5-469d-90d2-c26339c16eec for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:40:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:15.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:15 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:40:16 np0005588920 nova_compute[226886]: 2026-01-20 14:40:16.143 226890 DEBUG nova.compute.manager [req-1f1492a3-6f34-455c-8fad-c4042a6bf7b4 req-d6b0be0c-003b-44b1-8baa-0b3ddd3d3f17 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received event network-vif-plugged-b03b1a08-920f-4340-b77b-37669cc14a07 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:40:16 np0005588920 nova_compute[226886]: 2026-01-20 14:40:16.144 226890 DEBUG oslo_concurrency.lockutils [req-1f1492a3-6f34-455c-8fad-c4042a6bf7b4 req-d6b0be0c-003b-44b1-8baa-0b3ddd3d3f17 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "795b0a95-448b-49b1-80cb-a18e84101480-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:40:16 np0005588920 nova_compute[226886]: 2026-01-20 14:40:16.145 226890 DEBUG oslo_concurrency.lockutils [req-1f1492a3-6f34-455c-8fad-c4042a6bf7b4 req-d6b0be0c-003b-44b1-8baa-0b3ddd3d3f17 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:40:16 np0005588920 nova_compute[226886]: 2026-01-20 14:40:16.146 226890 DEBUG oslo_concurrency.lockutils [req-1f1492a3-6f34-455c-8fad-c4042a6bf7b4 req-d6b0be0c-003b-44b1-8baa-0b3ddd3d3f17 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:40:16 np0005588920 nova_compute[226886]: 2026-01-20 14:40:16.146 226890 DEBUG nova.compute.manager [req-1f1492a3-6f34-455c-8fad-c4042a6bf7b4 req-d6b0be0c-003b-44b1-8baa-0b3ddd3d3f17 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] No waiting events found dispatching network-vif-plugged-b03b1a08-920f-4340-b77b-37669cc14a07 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:40:16 np0005588920 nova_compute[226886]: 2026-01-20 14:40:16.147 226890 WARNING nova.compute.manager [req-1f1492a3-6f34-455c-8fad-c4042a6bf7b4 req-d6b0be0c-003b-44b1-8baa-0b3ddd3d3f17 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received unexpected event network-vif-plugged-b03b1a08-920f-4340-b77b-37669cc14a07 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:40:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:16.442 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:40:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:16.443 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:40:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:16.443 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:40:16 np0005588920 nova_compute[226886]: 2026-01-20 14:40:16.584 226890 DEBUG nova.compute.manager [req-239355bb-a8ad-447a-98c4-ca93fc237460 req-11c44948-5b04-4540-b4e5-b09c45b9b44d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received event network-vif-plugged-7a84a919-d309-4f1a-99b8-4792dcdec990 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:40:16 np0005588920 nova_compute[226886]: 2026-01-20 14:40:16.585 226890 DEBUG oslo_concurrency.lockutils [req-239355bb-a8ad-447a-98c4-ca93fc237460 req-11c44948-5b04-4540-b4e5-b09c45b9b44d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "795b0a95-448b-49b1-80cb-a18e84101480-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:40:16 np0005588920 nova_compute[226886]: 2026-01-20 14:40:16.585 226890 DEBUG oslo_concurrency.lockutils [req-239355bb-a8ad-447a-98c4-ca93fc237460 req-11c44948-5b04-4540-b4e5-b09c45b9b44d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:40:16 np0005588920 nova_compute[226886]: 2026-01-20 14:40:16.585 226890 DEBUG oslo_concurrency.lockutils [req-239355bb-a8ad-447a-98c4-ca93fc237460 req-11c44948-5b04-4540-b4e5-b09c45b9b44d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:40:16 np0005588920 nova_compute[226886]: 2026-01-20 14:40:16.585 226890 DEBUG nova.compute.manager [req-239355bb-a8ad-447a-98c4-ca93fc237460 req-11c44948-5b04-4540-b4e5-b09c45b9b44d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] No waiting events found dispatching network-vif-plugged-7a84a919-d309-4f1a-99b8-4792dcdec990 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:40:16 np0005588920 nova_compute[226886]: 2026-01-20 14:40:16.585 226890 WARNING nova.compute.manager [req-239355bb-a8ad-447a-98c4-ca93fc237460 req-11c44948-5b04-4540-b4e5-b09c45b9b44d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received unexpected event network-vif-plugged-7a84a919-d309-4f1a-99b8-4792dcdec990 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:40:16 np0005588920 nova_compute[226886]: 2026-01-20 14:40:16.586 226890 DEBUG nova.compute.manager [req-239355bb-a8ad-447a-98c4-ca93fc237460 req-11c44948-5b04-4540-b4e5-b09c45b9b44d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received event network-vif-unplugged-ff70124b-befc-46fa-b2cb-bc4bd4a49942 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:40:16 np0005588920 nova_compute[226886]: 2026-01-20 14:40:16.586 226890 DEBUG oslo_concurrency.lockutils [req-239355bb-a8ad-447a-98c4-ca93fc237460 req-11c44948-5b04-4540-b4e5-b09c45b9b44d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "795b0a95-448b-49b1-80cb-a18e84101480-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:40:16 np0005588920 nova_compute[226886]: 2026-01-20 14:40:16.586 226890 DEBUG oslo_concurrency.lockutils [req-239355bb-a8ad-447a-98c4-ca93fc237460 req-11c44948-5b04-4540-b4e5-b09c45b9b44d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:40:16 np0005588920 nova_compute[226886]: 2026-01-20 14:40:16.586 226890 DEBUG oslo_concurrency.lockutils [req-239355bb-a8ad-447a-98c4-ca93fc237460 req-11c44948-5b04-4540-b4e5-b09c45b9b44d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:40:16 np0005588920 nova_compute[226886]: 2026-01-20 14:40:16.587 226890 DEBUG nova.compute.manager [req-239355bb-a8ad-447a-98c4-ca93fc237460 req-11c44948-5b04-4540-b4e5-b09c45b9b44d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] No waiting events found dispatching network-vif-unplugged-ff70124b-befc-46fa-b2cb-bc4bd4a49942 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:40:16 np0005588920 nova_compute[226886]: 2026-01-20 14:40:16.587 226890 DEBUG nova.compute.manager [req-239355bb-a8ad-447a-98c4-ca93fc237460 req-11c44948-5b04-4540-b4e5-b09c45b9b44d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received event network-vif-unplugged-ff70124b-befc-46fa-b2cb-bc4bd4a49942 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:40:16 np0005588920 nova_compute[226886]: 2026-01-20 14:40:16.587 226890 DEBUG nova.compute.manager [req-239355bb-a8ad-447a-98c4-ca93fc237460 req-11c44948-5b04-4540-b4e5-b09c45b9b44d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received event network-vif-plugged-ff70124b-befc-46fa-b2cb-bc4bd4a49942 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:40:16 np0005588920 nova_compute[226886]: 2026-01-20 14:40:16.587 226890 DEBUG oslo_concurrency.lockutils [req-239355bb-a8ad-447a-98c4-ca93fc237460 req-11c44948-5b04-4540-b4e5-b09c45b9b44d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "795b0a95-448b-49b1-80cb-a18e84101480-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:40:16 np0005588920 nova_compute[226886]: 2026-01-20 14:40:16.588 226890 DEBUG oslo_concurrency.lockutils [req-239355bb-a8ad-447a-98c4-ca93fc237460 req-11c44948-5b04-4540-b4e5-b09c45b9b44d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:40:16 np0005588920 nova_compute[226886]: 2026-01-20 14:40:16.588 226890 DEBUG oslo_concurrency.lockutils [req-239355bb-a8ad-447a-98c4-ca93fc237460 req-11c44948-5b04-4540-b4e5-b09c45b9b44d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:40:16 np0005588920 nova_compute[226886]: 2026-01-20 14:40:16.588 226890 DEBUG nova.compute.manager [req-239355bb-a8ad-447a-98c4-ca93fc237460 req-11c44948-5b04-4540-b4e5-b09c45b9b44d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] No waiting events found dispatching network-vif-plugged-ff70124b-befc-46fa-b2cb-bc4bd4a49942 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:40:16 np0005588920 nova_compute[226886]: 2026-01-20 14:40:16.588 226890 WARNING nova.compute.manager [req-239355bb-a8ad-447a-98c4-ca93fc237460 req-11c44948-5b04-4540-b4e5-b09c45b9b44d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received unexpected event network-vif-plugged-ff70124b-befc-46fa-b2cb-bc4bd4a49942 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:40:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:17.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:17 np0005588920 nova_compute[226886]: 2026-01-20 14:40:17.637 226890 DEBUG nova.compute.manager [req-e6807e46-ee6a-45c9-8854-de7e4e41f096 req-c06b6ddd-538b-4577-93bb-d38aeeef2d05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received event network-vif-plugged-96b6b319-e96c-4182-b940-9f154499e22d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:40:17 np0005588920 nova_compute[226886]: 2026-01-20 14:40:17.638 226890 DEBUG oslo_concurrency.lockutils [req-e6807e46-ee6a-45c9-8854-de7e4e41f096 req-c06b6ddd-538b-4577-93bb-d38aeeef2d05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "795b0a95-448b-49b1-80cb-a18e84101480-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:40:17 np0005588920 nova_compute[226886]: 2026-01-20 14:40:17.638 226890 DEBUG oslo_concurrency.lockutils [req-e6807e46-ee6a-45c9-8854-de7e4e41f096 req-c06b6ddd-538b-4577-93bb-d38aeeef2d05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:40:17 np0005588920 nova_compute[226886]: 2026-01-20 14:40:17.639 226890 DEBUG oslo_concurrency.lockutils [req-e6807e46-ee6a-45c9-8854-de7e4e41f096 req-c06b6ddd-538b-4577-93bb-d38aeeef2d05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:40:17 np0005588920 nova_compute[226886]: 2026-01-20 14:40:17.639 226890 DEBUG nova.compute.manager [req-e6807e46-ee6a-45c9-8854-de7e4e41f096 req-c06b6ddd-538b-4577-93bb-d38aeeef2d05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] No waiting events found dispatching network-vif-plugged-96b6b319-e96c-4182-b940-9f154499e22d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:40:17 np0005588920 nova_compute[226886]: 2026-01-20 14:40:17.640 226890 WARNING nova.compute.manager [req-e6807e46-ee6a-45c9-8854-de7e4e41f096 req-c06b6ddd-538b-4577-93bb-d38aeeef2d05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received unexpected event network-vif-plugged-96b6b319-e96c-4182-b940-9f154499e22d for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:40:17 np0005588920 nova_compute[226886]: 2026-01-20 14:40:17.640 226890 DEBUG nova.compute.manager [req-e6807e46-ee6a-45c9-8854-de7e4e41f096 req-c06b6ddd-538b-4577-93bb-d38aeeef2d05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received event network-vif-unplugged-a6658829-ef19-4914-bbb3-35b718691c7c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:40:17 np0005588920 nova_compute[226886]: 2026-01-20 14:40:17.640 226890 DEBUG oslo_concurrency.lockutils [req-e6807e46-ee6a-45c9-8854-de7e4e41f096 req-c06b6ddd-538b-4577-93bb-d38aeeef2d05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "795b0a95-448b-49b1-80cb-a18e84101480-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:40:17 np0005588920 nova_compute[226886]: 2026-01-20 14:40:17.641 226890 DEBUG oslo_concurrency.lockutils [req-e6807e46-ee6a-45c9-8854-de7e4e41f096 req-c06b6ddd-538b-4577-93bb-d38aeeef2d05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:40:17 np0005588920 nova_compute[226886]: 2026-01-20 14:40:17.641 226890 DEBUG oslo_concurrency.lockutils [req-e6807e46-ee6a-45c9-8854-de7e4e41f096 req-c06b6ddd-538b-4577-93bb-d38aeeef2d05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:40:17 np0005588920 nova_compute[226886]: 2026-01-20 14:40:17.641 226890 DEBUG nova.compute.manager [req-e6807e46-ee6a-45c9-8854-de7e4e41f096 req-c06b6ddd-538b-4577-93bb-d38aeeef2d05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] No waiting events found dispatching network-vif-unplugged-a6658829-ef19-4914-bbb3-35b718691c7c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:40:17 np0005588920 nova_compute[226886]: 2026-01-20 14:40:17.642 226890 DEBUG nova.compute.manager [req-e6807e46-ee6a-45c9-8854-de7e4e41f096 req-c06b6ddd-538b-4577-93bb-d38aeeef2d05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received event network-vif-unplugged-a6658829-ef19-4914-bbb3-35b718691c7c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:40:17 np0005588920 nova_compute[226886]: 2026-01-20 14:40:17.642 226890 DEBUG nova.compute.manager [req-e6807e46-ee6a-45c9-8854-de7e4e41f096 req-c06b6ddd-538b-4577-93bb-d38aeeef2d05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received event network-vif-plugged-a6658829-ef19-4914-bbb3-35b718691c7c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:40:17 np0005588920 nova_compute[226886]: 2026-01-20 14:40:17.642 226890 DEBUG oslo_concurrency.lockutils [req-e6807e46-ee6a-45c9-8854-de7e4e41f096 req-c06b6ddd-538b-4577-93bb-d38aeeef2d05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "795b0a95-448b-49b1-80cb-a18e84101480-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:40:17 np0005588920 nova_compute[226886]: 2026-01-20 14:40:17.643 226890 DEBUG oslo_concurrency.lockutils [req-e6807e46-ee6a-45c9-8854-de7e4e41f096 req-c06b6ddd-538b-4577-93bb-d38aeeef2d05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:40:17 np0005588920 nova_compute[226886]: 2026-01-20 14:40:17.643 226890 DEBUG oslo_concurrency.lockutils [req-e6807e46-ee6a-45c9-8854-de7e4e41f096 req-c06b6ddd-538b-4577-93bb-d38aeeef2d05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:40:17 np0005588920 nova_compute[226886]: 2026-01-20 14:40:17.643 226890 DEBUG nova.compute.manager [req-e6807e46-ee6a-45c9-8854-de7e4e41f096 req-c06b6ddd-538b-4577-93bb-d38aeeef2d05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] No waiting events found dispatching network-vif-plugged-a6658829-ef19-4914-bbb3-35b718691c7c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:40:17 np0005588920 nova_compute[226886]: 2026-01-20 14:40:17.644 226890 WARNING nova.compute.manager [req-e6807e46-ee6a-45c9-8854-de7e4e41f096 req-c06b6ddd-538b-4577-93bb-d38aeeef2d05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received unexpected event network-vif-plugged-a6658829-ef19-4914-bbb3-35b718691c7c for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:40:17 np0005588920 nova_compute[226886]: 2026-01-20 14:40:17.644 226890 DEBUG nova.compute.manager [req-e6807e46-ee6a-45c9-8854-de7e4e41f096 req-c06b6ddd-538b-4577-93bb-d38aeeef2d05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received event network-vif-unplugged-5096b763-1b08-448f-a6bd-f63bcd65def6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:40:17 np0005588920 nova_compute[226886]: 2026-01-20 14:40:17.644 226890 DEBUG oslo_concurrency.lockutils [req-e6807e46-ee6a-45c9-8854-de7e4e41f096 req-c06b6ddd-538b-4577-93bb-d38aeeef2d05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "795b0a95-448b-49b1-80cb-a18e84101480-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:40:17 np0005588920 nova_compute[226886]: 2026-01-20 14:40:17.645 226890 DEBUG oslo_concurrency.lockutils [req-e6807e46-ee6a-45c9-8854-de7e4e41f096 req-c06b6ddd-538b-4577-93bb-d38aeeef2d05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:40:17 np0005588920 nova_compute[226886]: 2026-01-20 14:40:17.645 226890 DEBUG oslo_concurrency.lockutils [req-e6807e46-ee6a-45c9-8854-de7e4e41f096 req-c06b6ddd-538b-4577-93bb-d38aeeef2d05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:40:17 np0005588920 nova_compute[226886]: 2026-01-20 14:40:17.645 226890 DEBUG nova.compute.manager [req-e6807e46-ee6a-45c9-8854-de7e4e41f096 req-c06b6ddd-538b-4577-93bb-d38aeeef2d05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] No waiting events found dispatching network-vif-unplugged-5096b763-1b08-448f-a6bd-f63bcd65def6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:40:17 np0005588920 nova_compute[226886]: 2026-01-20 14:40:17.646 226890 DEBUG nova.compute.manager [req-e6807e46-ee6a-45c9-8854-de7e4e41f096 req-c06b6ddd-538b-4577-93bb-d38aeeef2d05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received event network-vif-unplugged-5096b763-1b08-448f-a6bd-f63bcd65def6 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:40:17 np0005588920 nova_compute[226886]: 2026-01-20 14:40:17.646 226890 DEBUG nova.compute.manager [req-e6807e46-ee6a-45c9-8854-de7e4e41f096 req-c06b6ddd-538b-4577-93bb-d38aeeef2d05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received event network-vif-plugged-5096b763-1b08-448f-a6bd-f63bcd65def6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:40:17 np0005588920 nova_compute[226886]: 2026-01-20 14:40:17.646 226890 DEBUG oslo_concurrency.lockutils [req-e6807e46-ee6a-45c9-8854-de7e4e41f096 req-c06b6ddd-538b-4577-93bb-d38aeeef2d05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "795b0a95-448b-49b1-80cb-a18e84101480-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:40:17 np0005588920 nova_compute[226886]: 2026-01-20 14:40:17.646 226890 DEBUG oslo_concurrency.lockutils [req-e6807e46-ee6a-45c9-8854-de7e4e41f096 req-c06b6ddd-538b-4577-93bb-d38aeeef2d05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:40:17 np0005588920 nova_compute[226886]: 2026-01-20 14:40:17.647 226890 DEBUG oslo_concurrency.lockutils [req-e6807e46-ee6a-45c9-8854-de7e4e41f096 req-c06b6ddd-538b-4577-93bb-d38aeeef2d05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:40:17 np0005588920 nova_compute[226886]: 2026-01-20 14:40:17.647 226890 DEBUG nova.compute.manager [req-e6807e46-ee6a-45c9-8854-de7e4e41f096 req-c06b6ddd-538b-4577-93bb-d38aeeef2d05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] No waiting events found dispatching network-vif-plugged-5096b763-1b08-448f-a6bd-f63bcd65def6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:40:17 np0005588920 nova_compute[226886]: 2026-01-20 14:40:17.647 226890 WARNING nova.compute.manager [req-e6807e46-ee6a-45c9-8854-de7e4e41f096 req-c06b6ddd-538b-4577-93bb-d38aeeef2d05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received unexpected event network-vif-plugged-5096b763-1b08-448f-a6bd-f63bcd65def6 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:40:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:17.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:19 np0005588920 nova_compute[226886]: 2026-01-20 14:40:19.108 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:19.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:19 np0005588920 nova_compute[226886]: 2026-01-20 14:40:19.756 226890 DEBUG nova.compute.manager [req-9aa08c3b-df5d-455d-a6c7-b122e663321f req-3850f364-33e1-4a03-8ef0-db34ae5d7263 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received event network-vif-deleted-ff70124b-befc-46fa-b2cb-bc4bd4a49942 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:40:19 np0005588920 nova_compute[226886]: 2026-01-20 14:40:19.757 226890 INFO nova.compute.manager [req-9aa08c3b-df5d-455d-a6c7-b122e663321f req-3850f364-33e1-4a03-8ef0-db34ae5d7263 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Neutron deleted interface ff70124b-befc-46fa-b2cb-bc4bd4a49942; detaching it from the instance and deleting it from the info cache#033[00m
Jan 20 09:40:19 np0005588920 nova_compute[226886]: 2026-01-20 14:40:19.757 226890 DEBUG nova.network.neutron [req-9aa08c3b-df5d-455d-a6c7-b122e663321f req-3850f364-33e1-4a03-8ef0-db34ae5d7263 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Updating instance_info_cache with network_info: [{"id": "9b0fc629-48f5-469d-90d2-c26339c16eec", "address": "fa:16:3e:11:9c:9c", "network": {"id": "b3a21065-10a5-474d-b42f-ffe66242a479", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest_v242-807218413-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b0fc629-48", "ovs_interfaceid": "9b0fc629-48f5-469d-90d2-c26339c16eec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "96b6b319-e96c-4182-b940-9f154499e22d", "address": "fa:16:3e:1a:ef:b0", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.91", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96b6b319-e9", "ovs_interfaceid": "96b6b319-e96c-4182-b940-9f154499e22d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "a6658829-ef19-4914-bbb3-35b718691c7c", "address": "fa:16:3e:c8:a5:f5", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.112", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6658829-ef", "ovs_interfaceid": "a6658829-ef19-4914-bbb3-35b718691c7c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "b03b1a08-920f-4340-b77b-37669cc14a07", "address": "fa:16:3e:b9:ca:05", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.214", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb03b1a08-92", "ovs_interfaceid": "b03b1a08-920f-4340-b77b-37669cc14a07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7a84a919-d309-4f1a-99b8-4792dcdec990", "address": "fa:16:3e:c7:9a:c3", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.32", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a84a919-d3", "ovs_interfaceid": "7a84a919-d309-4f1a-99b8-4792dcdec990", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "5096b763-1b08-448f-a6bd-f63bcd65def6", "address": "fa:16:3e:5e:80:45", "network": {"id": "a36edd9d-12ce-4779-97fe-f75c00d85dcd", "bridge": "br-int", "label": "tempest-device-tagging-net2-56635846", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5096b763-1b", "ovs_interfaceid": "5096b763-1b08-448f-a6bd-f63bcd65def6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:40:19 np0005588920 nova_compute[226886]: 2026-01-20 14:40:19.798 226890 DEBUG nova.compute.manager [req-9aa08c3b-df5d-455d-a6c7-b122e663321f req-3850f364-33e1-4a03-8ef0-db34ae5d7263 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Detach interface failed, port_id=ff70124b-befc-46fa-b2cb-bc4bd4a49942, reason: Instance 795b0a95-448b-49b1-80cb-a18e84101480 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 20 09:40:19 np0005588920 nova_compute[226886]: 2026-01-20 14:40:19.798 226890 DEBUG nova.compute.manager [req-9aa08c3b-df5d-455d-a6c7-b122e663321f req-3850f364-33e1-4a03-8ef0-db34ae5d7263 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received event network-vif-deleted-b03b1a08-920f-4340-b77b-37669cc14a07 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:40:19 np0005588920 nova_compute[226886]: 2026-01-20 14:40:19.799 226890 INFO nova.compute.manager [req-9aa08c3b-df5d-455d-a6c7-b122e663321f req-3850f364-33e1-4a03-8ef0-db34ae5d7263 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Neutron deleted interface b03b1a08-920f-4340-b77b-37669cc14a07; detaching it from the instance and deleting it from the info cache#033[00m
Jan 20 09:40:19 np0005588920 nova_compute[226886]: 2026-01-20 14:40:19.799 226890 DEBUG nova.network.neutron [req-9aa08c3b-df5d-455d-a6c7-b122e663321f req-3850f364-33e1-4a03-8ef0-db34ae5d7263 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Updating instance_info_cache with network_info: [{"id": "9b0fc629-48f5-469d-90d2-c26339c16eec", "address": "fa:16:3e:11:9c:9c", "network": {"id": "b3a21065-10a5-474d-b42f-ffe66242a479", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest_v242-807218413-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b0fc629-48", "ovs_interfaceid": "9b0fc629-48f5-469d-90d2-c26339c16eec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "96b6b319-e96c-4182-b940-9f154499e22d", "address": "fa:16:3e:1a:ef:b0", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.91", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96b6b319-e9", "ovs_interfaceid": "96b6b319-e96c-4182-b940-9f154499e22d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "a6658829-ef19-4914-bbb3-35b718691c7c", "address": "fa:16:3e:c8:a5:f5", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.112", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6658829-ef", "ovs_interfaceid": "a6658829-ef19-4914-bbb3-35b718691c7c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "7a84a919-d309-4f1a-99b8-4792dcdec990", "address": "fa:16:3e:c7:9a:c3", "network": {"id": "d12cbe78-47c0-4f23-98a7-4ed621f8c3a3", "bridge": "br-int", "label": "tempest-device-tagging-net1-346296188", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.32", "type": "fixed", "version": 4, "meta": {}, "floating_ips": 
[]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a84a919-d3", "ovs_interfaceid": "7a84a919-d309-4f1a-99b8-4792dcdec990", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "5096b763-1b08-448f-a6bd-f63bcd65def6", "address": "fa:16:3e:5e:80:45", "network": {"id": "a36edd9d-12ce-4779-97fe-f75c00d85dcd", "bridge": "br-int", "label": "tempest-device-tagging-net2-56635846", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8705404c3964472782118e478eb54e51", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5096b763-1b", "ovs_interfaceid": "5096b763-1b08-448f-a6bd-f63bcd65def6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:40:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:19.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:19 np0005588920 nova_compute[226886]: 2026-01-20 14:40:19.823 226890 DEBUG nova.compute.manager [req-9aa08c3b-df5d-455d-a6c7-b122e663321f req-3850f364-33e1-4a03-8ef0-db34ae5d7263 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Detach interface failed, port_id=b03b1a08-920f-4340-b77b-37669cc14a07, reason: Instance 795b0a95-448b-49b1-80cb-a18e84101480 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 20 09:40:20 np0005588920 nova_compute[226886]: 2026-01-20 14:40:20.092 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:20 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:40:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:21.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:21 np0005588920 nova_compute[226886]: 2026-01-20 14:40:21.365 226890 DEBUG nova.network.neutron [-] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:40:21 np0005588920 nova_compute[226886]: 2026-01-20 14:40:21.382 226890 INFO nova.compute.manager [-] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Took 7.06 seconds to deallocate network for instance.#033[00m
Jan 20 09:40:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:21.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:21 np0005588920 nova_compute[226886]: 2026-01-20 14:40:21.867 226890 DEBUG nova.compute.manager [req-08521e48-03a6-4901-9492-78424ce68bbd req-4dac17cb-4548-4fa9-abc6-6a883c1f4e8a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received event network-vif-deleted-5096b763-1b08-448f-a6bd-f63bcd65def6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:40:21 np0005588920 nova_compute[226886]: 2026-01-20 14:40:21.867 226890 DEBUG nova.compute.manager [req-08521e48-03a6-4901-9492-78424ce68bbd req-4dac17cb-4548-4fa9-abc6-6a883c1f4e8a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received event network-vif-deleted-9b0fc629-48f5-469d-90d2-c26339c16eec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:40:21 np0005588920 nova_compute[226886]: 2026-01-20 14:40:21.867 226890 DEBUG nova.compute.manager [req-08521e48-03a6-4901-9492-78424ce68bbd req-4dac17cb-4548-4fa9-abc6-6a883c1f4e8a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Received event network-vif-deleted-7a84a919-d309-4f1a-99b8-4792dcdec990 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:40:22 np0005588920 nova_compute[226886]: 2026-01-20 14:40:22.039 226890 INFO nova.compute.manager [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Took 0.66 seconds to detach 3 volumes for instance.#033[00m
Jan 20 09:40:22 np0005588920 nova_compute[226886]: 2026-01-20 14:40:22.076 226890 DEBUG oslo_concurrency.lockutils [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:40:22 np0005588920 nova_compute[226886]: 2026-01-20 14:40:22.076 226890 DEBUG oslo_concurrency.lockutils [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:40:22 np0005588920 nova_compute[226886]: 2026-01-20 14:40:22.147 226890 DEBUG oslo_concurrency.processutils [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:40:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:40:22 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/488662490' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:40:22 np0005588920 nova_compute[226886]: 2026-01-20 14:40:22.640 226890 DEBUG oslo_concurrency.processutils [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.493s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:40:22 np0005588920 nova_compute[226886]: 2026-01-20 14:40:22.648 226890 DEBUG nova.compute.provider_tree [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:40:22 np0005588920 nova_compute[226886]: 2026-01-20 14:40:22.678 226890 DEBUG nova.scheduler.client.report [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:40:22 np0005588920 nova_compute[226886]: 2026-01-20 14:40:22.700 226890 DEBUG oslo_concurrency.lockutils [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.624s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:40:22 np0005588920 nova_compute[226886]: 2026-01-20 14:40:22.736 226890 INFO nova.scheduler.client.report [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Deleted allocations for instance 795b0a95-448b-49b1-80cb-a18e84101480#033[00m
Jan 20 09:40:22 np0005588920 nova_compute[226886]: 2026-01-20 14:40:22.849 226890 DEBUG oslo_concurrency.lockutils [None req-304b9434-e287-4721-b82f-f88c15814d37 cd4ba32a01f74af199438da0b72e5a4d 8705404c3964472782118e478eb54e51 - - default default] Lock "795b0a95-448b-49b1-80cb-a18e84101480" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 9.602s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:40:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:40:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:23.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:40:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:23.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:24 np0005588920 podman[250491]: 2026-01-20 14:40:24.054261193 +0000 UTC m=+0.119104811 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:40:24 np0005588920 nova_compute[226886]: 2026-01-20 14:40:24.109 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:24 np0005588920 nova_compute[226886]: 2026-01-20 14:40:24.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:40:24 np0005588920 nova_compute[226886]: 2026-01-20 14:40:24.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:40:24 np0005588920 nova_compute[226886]: 2026-01-20 14:40:24.726 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 09:40:24 np0005588920 nova_compute[226886]: 2026-01-20 14:40:24.746 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 09:40:25 np0005588920 nova_compute[226886]: 2026-01-20 14:40:25.094 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:40:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:25.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:40:25 np0005588920 nova_compute[226886]: 2026-01-20 14:40:25.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:40:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:25.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:25 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:40:26 np0005588920 nova_compute[226886]: 2026-01-20 14:40:26.720 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:40:26 np0005588920 nova_compute[226886]: 2026-01-20 14:40:26.750 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:40:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:27.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:27.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:28 np0005588920 nova_compute[226886]: 2026-01-20 14:40:28.947 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768920013.9457324, 795b0a95-448b-49b1-80cb-a18e84101480 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:40:28 np0005588920 nova_compute[226886]: 2026-01-20 14:40:28.948 226890 INFO nova.compute.manager [-] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:40:28 np0005588920 nova_compute[226886]: 2026-01-20 14:40:28.967 226890 DEBUG nova.compute.manager [None req-79cb8d89-a166-4c42-a722-55089bed871f - - - - - -] [instance: 795b0a95-448b-49b1-80cb-a18e84101480] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:40:29 np0005588920 nova_compute[226886]: 2026-01-20 14:40:29.112 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:29.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:29 np0005588920 nova_compute[226886]: 2026-01-20 14:40:29.767 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:40:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:29.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:40:30 np0005588920 nova_compute[226886]: 2026-01-20 14:40:30.095 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:30 np0005588920 nova_compute[226886]: 2026-01-20 14:40:30.099 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:30 np0005588920 nova_compute[226886]: 2026-01-20 14:40:30.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:40:30 np0005588920 nova_compute[226886]: 2026-01-20 14:40:30.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:40:30 np0005588920 nova_compute[226886]: 2026-01-20 14:40:30.727 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:40:30 np0005588920 nova_compute[226886]: 2026-01-20 14:40:30.727 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:40:30 np0005588920 nova_compute[226886]: 2026-01-20 14:40:30.768 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:40:30 np0005588920 nova_compute[226886]: 2026-01-20 14:40:30.768 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:40:30 np0005588920 nova_compute[226886]: 2026-01-20 14:40:30.769 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:40:30 np0005588920 nova_compute[226886]: 2026-01-20 14:40:30.769 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:40:30 np0005588920 nova_compute[226886]: 2026-01-20 14:40:30.769 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:40:30 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:40:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:31.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:31 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:40:31 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1270905783' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:40:31 np0005588920 nova_compute[226886]: 2026-01-20 14:40:31.210 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:40:31 np0005588920 nova_compute[226886]: 2026-01-20 14:40:31.375 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:40:31 np0005588920 nova_compute[226886]: 2026-01-20 14:40:31.376 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4597MB free_disk=20.876895904541016GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:40:31 np0005588920 nova_compute[226886]: 2026-01-20 14:40:31.376 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:40:31 np0005588920 nova_compute[226886]: 2026-01-20 14:40:31.376 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:40:31 np0005588920 nova_compute[226886]: 2026-01-20 14:40:31.463 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:40:31 np0005588920 nova_compute[226886]: 2026-01-20 14:40:31.464 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:40:31 np0005588920 nova_compute[226886]: 2026-01-20 14:40:31.488 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:40:31 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 09:40:31 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2878926089' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 09:40:31 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 09:40:31 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2878926089' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 09:40:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:31.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:31 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 20 09:40:31 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:40:31 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:40:31 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:40:31 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:40:31 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2590034195' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:40:31 np0005588920 nova_compute[226886]: 2026-01-20 14:40:31.957 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:40:31 np0005588920 nova_compute[226886]: 2026-01-20 14:40:31.963 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:40:31 np0005588920 nova_compute[226886]: 2026-01-20 14:40:31.978 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:40:31 np0005588920 nova_compute[226886]: 2026-01-20 14:40:31.998 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:40:31 np0005588920 nova_compute[226886]: 2026-01-20 14:40:31.999 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.623s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:40:32 np0005588920 nova_compute[226886]: 2026-01-20 14:40:32.993 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:40:32 np0005588920 nova_compute[226886]: 2026-01-20 14:40:32.994 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:40:32 np0005588920 nova_compute[226886]: 2026-01-20 14:40:32.994 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:40:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:33.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:33 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 09:40:33 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3830778880' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 09:40:33 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 09:40:33 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3830778880' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 09:40:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:33.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:34 np0005588920 nova_compute[226886]: 2026-01-20 14:40:34.116 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:35 np0005588920 nova_compute[226886]: 2026-01-20 14:40:35.097 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:40:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:35.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:40:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:40:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:35.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:40:35 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:40:36 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 09:40:36 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1847975642' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 09:40:36 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 09:40:36 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1847975642' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 09:40:36 np0005588920 podman[250695]: 2026-01-20 14:40:36.964119265 +0000 UTC m=+0.053632016 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:40:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:40:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:37.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:40:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:37.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:37.854 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:40:37 np0005588920 nova_compute[226886]: 2026-01-20 14:40:37.855 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:37.855 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:40:39 np0005588920 nova_compute[226886]: 2026-01-20 14:40:39.118 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:39.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:39 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:40:39 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:40:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:39.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:40 np0005588920 nova_compute[226886]: 2026-01-20 14:40:40.099 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:40 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:40:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:40:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:41.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:40:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:41.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:40:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:43.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:40:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:43.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:40:43.857 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:44 np0005588920 nova_compute[226886]: 2026-01-20 14:40:44.123 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:45 np0005588920 nova_compute[226886]: 2026-01-20 14:40:45.101 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:45.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:45.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:45 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:40:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:47.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:40:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:47.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:40:49 np0005588920 nova_compute[226886]: 2026-01-20 14:40:49.125 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:40:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:49.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:40:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:49.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:50 np0005588920 nova_compute[226886]: 2026-01-20 14:40:50.103 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:50 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:40:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:40:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:51.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:40:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:40:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:51.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:40:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:53.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:53.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:54 np0005588920 nova_compute[226886]: 2026-01-20 14:40:54.161 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:54 np0005588920 podman[250766]: 2026-01-20 14:40:54.982898249 +0000 UTC m=+0.074538579 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 20 09:40:55 np0005588920 nova_compute[226886]: 2026-01-20 14:40:55.104 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:40:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:55.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:40:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:55.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:55 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:40:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:57.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:40:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:57.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:40:59 np0005588920 nova_compute[226886]: 2026-01-20 14:40:59.164 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:59.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:40:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:59.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:00 np0005588920 nova_compute[226886]: 2026-01-20 14:41:00.106 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:00 np0005588920 nova_compute[226886]: 2026-01-20 14:41:00.867 226890 DEBUG oslo_concurrency.lockutils [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Acquiring lock "6e4afbc3-37b1-4657-b152-91645facfcca" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:41:00 np0005588920 nova_compute[226886]: 2026-01-20 14:41:00.867 226890 DEBUG oslo_concurrency.lockutils [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Lock "6e4afbc3-37b1-4657-b152-91645facfcca" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:41:00 np0005588920 nova_compute[226886]: 2026-01-20 14:41:00.888 226890 DEBUG nova.compute.manager [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:41:00 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:41:01 np0005588920 nova_compute[226886]: 2026-01-20 14:41:01.003 226890 DEBUG oslo_concurrency.lockutils [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:41:01 np0005588920 nova_compute[226886]: 2026-01-20 14:41:01.004 226890 DEBUG oslo_concurrency.lockutils [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:41:01 np0005588920 nova_compute[226886]: 2026-01-20 14:41:01.012 226890 DEBUG nova.virt.hardware [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:41:01 np0005588920 nova_compute[226886]: 2026-01-20 14:41:01.012 226890 INFO nova.compute.claims [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 09:41:01 np0005588920 nova_compute[226886]: 2026-01-20 14:41:01.125 226890 DEBUG oslo_concurrency.processutils [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:41:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:41:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:01.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:41:01 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:41:01 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1609353612' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:41:01 np0005588920 nova_compute[226886]: 2026-01-20 14:41:01.561 226890 DEBUG oslo_concurrency.processutils [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:41:01 np0005588920 nova_compute[226886]: 2026-01-20 14:41:01.567 226890 DEBUG nova.compute.provider_tree [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:41:01 np0005588920 nova_compute[226886]: 2026-01-20 14:41:01.590 226890 DEBUG nova.scheduler.client.report [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:41:01 np0005588920 nova_compute[226886]: 2026-01-20 14:41:01.620 226890 DEBUG oslo_concurrency.lockutils [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.616s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:41:01 np0005588920 nova_compute[226886]: 2026-01-20 14:41:01.621 226890 DEBUG nova.compute.manager [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:41:01 np0005588920 nova_compute[226886]: 2026-01-20 14:41:01.665 226890 DEBUG nova.compute.manager [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:41:01 np0005588920 nova_compute[226886]: 2026-01-20 14:41:01.666 226890 DEBUG nova.network.neutron [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:41:01 np0005588920 nova_compute[226886]: 2026-01-20 14:41:01.687 226890 INFO nova.virt.libvirt.driver [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:41:01 np0005588920 nova_compute[226886]: 2026-01-20 14:41:01.711 226890 DEBUG nova.compute.manager [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:41:01 np0005588920 nova_compute[226886]: 2026-01-20 14:41:01.843 226890 DEBUG nova.compute.manager [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:41:01 np0005588920 nova_compute[226886]: 2026-01-20 14:41:01.845 226890 DEBUG nova.virt.libvirt.driver [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:41:01 np0005588920 nova_compute[226886]: 2026-01-20 14:41:01.845 226890 INFO nova.virt.libvirt.driver [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Creating image(s)#033[00m
Jan 20 09:41:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:01.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:02 np0005588920 nova_compute[226886]: 2026-01-20 14:41:02.004 226890 DEBUG nova.storage.rbd_utils [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] rbd image 6e4afbc3-37b1-4657-b152-91645facfcca_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:41:02 np0005588920 nova_compute[226886]: 2026-01-20 14:41:02.030 226890 DEBUG nova.storage.rbd_utils [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] rbd image 6e4afbc3-37b1-4657-b152-91645facfcca_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:41:02 np0005588920 nova_compute[226886]: 2026-01-20 14:41:02.055 226890 DEBUG nova.storage.rbd_utils [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] rbd image 6e4afbc3-37b1-4657-b152-91645facfcca_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:41:02 np0005588920 nova_compute[226886]: 2026-01-20 14:41:02.059 226890 DEBUG oslo_concurrency.processutils [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:41:02 np0005588920 nova_compute[226886]: 2026-01-20 14:41:02.117 226890 DEBUG oslo_concurrency.processutils [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:41:02 np0005588920 nova_compute[226886]: 2026-01-20 14:41:02.118 226890 DEBUG oslo_concurrency.lockutils [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:41:02 np0005588920 nova_compute[226886]: 2026-01-20 14:41:02.119 226890 DEBUG oslo_concurrency.lockutils [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:41:02 np0005588920 nova_compute[226886]: 2026-01-20 14:41:02.120 226890 DEBUG oslo_concurrency.lockutils [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:41:02 np0005588920 nova_compute[226886]: 2026-01-20 14:41:02.147 226890 DEBUG nova.storage.rbd_utils [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] rbd image 6e4afbc3-37b1-4657-b152-91645facfcca_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:41:02 np0005588920 nova_compute[226886]: 2026-01-20 14:41:02.152 226890 DEBUG oslo_concurrency.processutils [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 6e4afbc3-37b1-4657-b152-91645facfcca_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:41:02 np0005588920 nova_compute[226886]: 2026-01-20 14:41:02.177 226890 DEBUG nova.policy [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7e3fb126d8254300b5f6f408fceefb19', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c8c509d8e23246e1a509bf2197b73ebf', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:41:02 np0005588920 nova_compute[226886]: 2026-01-20 14:41:02.772 226890 DEBUG oslo_concurrency.processutils [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 6e4afbc3-37b1-4657-b152-91645facfcca_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.620s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:41:02 np0005588920 nova_compute[226886]: 2026-01-20 14:41:02.840 226890 DEBUG nova.storage.rbd_utils [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] resizing rbd image 6e4afbc3-37b1-4657-b152-91645facfcca_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:41:02 np0005588920 nova_compute[226886]: 2026-01-20 14:41:02.971 226890 DEBUG nova.objects.instance [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Lazy-loading 'migration_context' on Instance uuid 6e4afbc3-37b1-4657-b152-91645facfcca obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:41:03 np0005588920 nova_compute[226886]: 2026-01-20 14:41:03.001 226890 DEBUG nova.virt.libvirt.driver [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:41:03 np0005588920 nova_compute[226886]: 2026-01-20 14:41:03.001 226890 DEBUG nova.virt.libvirt.driver [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Ensure instance console log exists: /var/lib/nova/instances/6e4afbc3-37b1-4657-b152-91645facfcca/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:41:03 np0005588920 nova_compute[226886]: 2026-01-20 14:41:03.002 226890 DEBUG oslo_concurrency.lockutils [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:41:03 np0005588920 nova_compute[226886]: 2026-01-20 14:41:03.002 226890 DEBUG oslo_concurrency.lockutils [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:41:03 np0005588920 nova_compute[226886]: 2026-01-20 14:41:03.002 226890 DEBUG oslo_concurrency.lockutils [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:41:03 np0005588920 nova_compute[226886]: 2026-01-20 14:41:03.048 226890 DEBUG nova.network.neutron [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Successfully created port: b798b69c-a652-408f-810b-0a1d3d9e324c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:41:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:03.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:41:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:03.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:41:04 np0005588920 nova_compute[226886]: 2026-01-20 14:41:04.167 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:04 np0005588920 nova_compute[226886]: 2026-01-20 14:41:04.343 226890 DEBUG nova.network.neutron [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Successfully updated port: b798b69c-a652-408f-810b-0a1d3d9e324c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:41:04 np0005588920 nova_compute[226886]: 2026-01-20 14:41:04.377 226890 DEBUG oslo_concurrency.lockutils [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Acquiring lock "refresh_cache-6e4afbc3-37b1-4657-b152-91645facfcca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:41:04 np0005588920 nova_compute[226886]: 2026-01-20 14:41:04.377 226890 DEBUG oslo_concurrency.lockutils [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Acquired lock "refresh_cache-6e4afbc3-37b1-4657-b152-91645facfcca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:41:04 np0005588920 nova_compute[226886]: 2026-01-20 14:41:04.378 226890 DEBUG nova.network.neutron [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:41:04 np0005588920 nova_compute[226886]: 2026-01-20 14:41:04.628 226890 DEBUG nova.network.neutron [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:41:05 np0005588920 nova_compute[226886]: 2026-01-20 14:41:05.108 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:05.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:05.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:05 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:41:06 np0005588920 nova_compute[226886]: 2026-01-20 14:41:06.088 226890 DEBUG nova.network.neutron [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Updating instance_info_cache with network_info: [{"id": "b798b69c-a652-408f-810b-0a1d3d9e324c", "address": "fa:16:3e:1f:5b:64", "network": {"id": "ed088f03-9b1a-4d56-98b6-03d264f312c6", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1860463566-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c8c509d8e23246e1a509bf2197b73ebf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb798b69c-a6", "ovs_interfaceid": "b798b69c-a652-408f-810b-0a1d3d9e324c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:41:06 np0005588920 nova_compute[226886]: 2026-01-20 14:41:06.119 226890 DEBUG oslo_concurrency.lockutils [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Releasing lock "refresh_cache-6e4afbc3-37b1-4657-b152-91645facfcca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:41:06 np0005588920 nova_compute[226886]: 2026-01-20 14:41:06.119 226890 DEBUG nova.compute.manager [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Instance network_info: |[{"id": "b798b69c-a652-408f-810b-0a1d3d9e324c", "address": "fa:16:3e:1f:5b:64", "network": {"id": "ed088f03-9b1a-4d56-98b6-03d264f312c6", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1860463566-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c8c509d8e23246e1a509bf2197b73ebf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb798b69c-a6", "ovs_interfaceid": "b798b69c-a652-408f-810b-0a1d3d9e324c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:41:06 np0005588920 nova_compute[226886]: 2026-01-20 14:41:06.122 226890 DEBUG nova.virt.libvirt.driver [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Start _get_guest_xml network_info=[{"id": "b798b69c-a652-408f-810b-0a1d3d9e324c", "address": "fa:16:3e:1f:5b:64", "network": {"id": "ed088f03-9b1a-4d56-98b6-03d264f312c6", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1860463566-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c8c509d8e23246e1a509bf2197b73ebf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb798b69c-a6", "ovs_interfaceid": "b798b69c-a652-408f-810b-0a1d3d9e324c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:41:06 np0005588920 nova_compute[226886]: 2026-01-20 14:41:06.126 226890 DEBUG nova.compute.manager [req-87403f35-c713-43ed-bfae-db4d182cb2fa req-96aec9ea-b236-4614-b7d4-56cc7ba5b3ad 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Received event network-changed-b798b69c-a652-408f-810b-0a1d3d9e324c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:41:06 np0005588920 nova_compute[226886]: 2026-01-20 14:41:06.126 226890 DEBUG nova.compute.manager [req-87403f35-c713-43ed-bfae-db4d182cb2fa req-96aec9ea-b236-4614-b7d4-56cc7ba5b3ad 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Refreshing instance network info cache due to event network-changed-b798b69c-a652-408f-810b-0a1d3d9e324c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:41:06 np0005588920 nova_compute[226886]: 2026-01-20 14:41:06.126 226890 DEBUG oslo_concurrency.lockutils [req-87403f35-c713-43ed-bfae-db4d182cb2fa req-96aec9ea-b236-4614-b7d4-56cc7ba5b3ad 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-6e4afbc3-37b1-4657-b152-91645facfcca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:41:06 np0005588920 nova_compute[226886]: 2026-01-20 14:41:06.127 226890 DEBUG oslo_concurrency.lockutils [req-87403f35-c713-43ed-bfae-db4d182cb2fa req-96aec9ea-b236-4614-b7d4-56cc7ba5b3ad 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-6e4afbc3-37b1-4657-b152-91645facfcca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:41:06 np0005588920 nova_compute[226886]: 2026-01-20 14:41:06.127 226890 DEBUG nova.network.neutron [req-87403f35-c713-43ed-bfae-db4d182cb2fa req-96aec9ea-b236-4614-b7d4-56cc7ba5b3ad 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Refreshing network info cache for port b798b69c-a652-408f-810b-0a1d3d9e324c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:41:06 np0005588920 nova_compute[226886]: 2026-01-20 14:41:06.134 226890 WARNING nova.virt.libvirt.driver [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:41:06 np0005588920 nova_compute[226886]: 2026-01-20 14:41:06.138 226890 DEBUG nova.virt.libvirt.host [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:41:06 np0005588920 nova_compute[226886]: 2026-01-20 14:41:06.139 226890 DEBUG nova.virt.libvirt.host [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:41:06 np0005588920 nova_compute[226886]: 2026-01-20 14:41:06.148 226890 DEBUG nova.virt.libvirt.host [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:41:06 np0005588920 nova_compute[226886]: 2026-01-20 14:41:06.148 226890 DEBUG nova.virt.libvirt.host [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:41:06 np0005588920 nova_compute[226886]: 2026-01-20 14:41:06.149 226890 DEBUG nova.virt.libvirt.driver [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:41:06 np0005588920 nova_compute[226886]: 2026-01-20 14:41:06.150 226890 DEBUG nova.virt.hardware [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:41:06 np0005588920 nova_compute[226886]: 2026-01-20 14:41:06.150 226890 DEBUG nova.virt.hardware [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:41:06 np0005588920 nova_compute[226886]: 2026-01-20 14:41:06.150 226890 DEBUG nova.virt.hardware [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:41:06 np0005588920 nova_compute[226886]: 2026-01-20 14:41:06.151 226890 DEBUG nova.virt.hardware [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:41:06 np0005588920 nova_compute[226886]: 2026-01-20 14:41:06.151 226890 DEBUG nova.virt.hardware [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:41:06 np0005588920 nova_compute[226886]: 2026-01-20 14:41:06.151 226890 DEBUG nova.virt.hardware [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:41:06 np0005588920 nova_compute[226886]: 2026-01-20 14:41:06.151 226890 DEBUG nova.virt.hardware [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:41:06 np0005588920 nova_compute[226886]: 2026-01-20 14:41:06.151 226890 DEBUG nova.virt.hardware [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:41:06 np0005588920 nova_compute[226886]: 2026-01-20 14:41:06.152 226890 DEBUG nova.virt.hardware [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:41:06 np0005588920 nova_compute[226886]: 2026-01-20 14:41:06.152 226890 DEBUG nova.virt.hardware [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:41:06 np0005588920 nova_compute[226886]: 2026-01-20 14:41:06.152 226890 DEBUG nova.virt.hardware [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:41:06 np0005588920 nova_compute[226886]: 2026-01-20 14:41:06.154 226890 DEBUG oslo_concurrency.processutils [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:41:06 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:41:06 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1173221353' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:41:06 np0005588920 nova_compute[226886]: 2026-01-20 14:41:06.562 226890 DEBUG oslo_concurrency.processutils [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.407s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:41:06 np0005588920 nova_compute[226886]: 2026-01-20 14:41:06.582 226890 DEBUG nova.storage.rbd_utils [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] rbd image 6e4afbc3-37b1-4657-b152-91645facfcca_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:41:06 np0005588920 nova_compute[226886]: 2026-01-20 14:41:06.586 226890 DEBUG oslo_concurrency.processutils [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:41:06 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:41:06 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2638918246' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:41:06 np0005588920 nova_compute[226886]: 2026-01-20 14:41:06.996 226890 DEBUG oslo_concurrency.processutils [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.410s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:41:06 np0005588920 nova_compute[226886]: 2026-01-20 14:41:06.998 226890 DEBUG nova.virt.libvirt.vif [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:40:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-InstanceActionsTestJSON-server-201944342',display_name='tempest-InstanceActionsTestJSON-server-201944342',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-instanceactionstestjson-server-201944342',id=70,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c8c509d8e23246e1a509bf2197b73ebf',ramdisk_id='',reservation_id='r-vq75u88a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-InstanceActionsTestJSON-1406975575',owner_user_name='tempest-InstanceAction
sTestJSON-1406975575-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:41:01Z,user_data=None,user_id='7e3fb126d8254300b5f6f408fceefb19',uuid=6e4afbc3-37b1-4657-b152-91645facfcca,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b798b69c-a652-408f-810b-0a1d3d9e324c", "address": "fa:16:3e:1f:5b:64", "network": {"id": "ed088f03-9b1a-4d56-98b6-03d264f312c6", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1860463566-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c8c509d8e23246e1a509bf2197b73ebf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb798b69c-a6", "ovs_interfaceid": "b798b69c-a652-408f-810b-0a1d3d9e324c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:41:06 np0005588920 nova_compute[226886]: 2026-01-20 14:41:06.998 226890 DEBUG nova.network.os_vif_util [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Converting VIF {"id": "b798b69c-a652-408f-810b-0a1d3d9e324c", "address": "fa:16:3e:1f:5b:64", "network": {"id": "ed088f03-9b1a-4d56-98b6-03d264f312c6", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1860463566-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c8c509d8e23246e1a509bf2197b73ebf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb798b69c-a6", "ovs_interfaceid": "b798b69c-a652-408f-810b-0a1d3d9e324c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:41:06 np0005588920 nova_compute[226886]: 2026-01-20 14:41:06.999 226890 DEBUG nova.network.os_vif_util [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1f:5b:64,bridge_name='br-int',has_traffic_filtering=True,id=b798b69c-a652-408f-810b-0a1d3d9e324c,network=Network(ed088f03-9b1a-4d56-98b6-03d264f312c6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb798b69c-a6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:41:07 np0005588920 nova_compute[226886]: 2026-01-20 14:41:07.000 226890 DEBUG nova.objects.instance [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Lazy-loading 'pci_devices' on Instance uuid 6e4afbc3-37b1-4657-b152-91645facfcca obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:41:07 np0005588920 nova_compute[226886]: 2026-01-20 14:41:07.019 226890 DEBUG nova.virt.libvirt.driver [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:41:07 np0005588920 nova_compute[226886]:  <uuid>6e4afbc3-37b1-4657-b152-91645facfcca</uuid>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:  <name>instance-00000046</name>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:41:07 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:      <nova:name>tempest-InstanceActionsTestJSON-server-201944342</nova:name>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:41:06</nova:creationTime>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:41:07 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:        <nova:user uuid="7e3fb126d8254300b5f6f408fceefb19">tempest-InstanceActionsTestJSON-1406975575-project-member</nova:user>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:        <nova:project uuid="c8c509d8e23246e1a509bf2197b73ebf">tempest-InstanceActionsTestJSON-1406975575</nova:project>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:        <nova:port uuid="b798b69c-a652-408f-810b-0a1d3d9e324c">
Jan 20 09:41:07 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:      <entry name="serial">6e4afbc3-37b1-4657-b152-91645facfcca</entry>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:      <entry name="uuid">6e4afbc3-37b1-4657-b152-91645facfcca</entry>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:41:07 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/6e4afbc3-37b1-4657-b152-91645facfcca_disk">
Jan 20 09:41:07 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:41:07 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:41:07 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/6e4afbc3-37b1-4657-b152-91645facfcca_disk.config">
Jan 20 09:41:07 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:41:07 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:41:07 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:1f:5b:64"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:      <target dev="tapb798b69c-a6"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:41:07 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/6e4afbc3-37b1-4657-b152-91645facfcca/console.log" append="off"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:41:07 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:41:07 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:41:07 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:41:07 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:41:07 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:41:07 np0005588920 nova_compute[226886]: 2026-01-20 14:41:07.019 226890 DEBUG nova.compute.manager [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Preparing to wait for external event network-vif-plugged-b798b69c-a652-408f-810b-0a1d3d9e324c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:41:07 np0005588920 nova_compute[226886]: 2026-01-20 14:41:07.019 226890 DEBUG oslo_concurrency.lockutils [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Acquiring lock "6e4afbc3-37b1-4657-b152-91645facfcca-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:41:07 np0005588920 nova_compute[226886]: 2026-01-20 14:41:07.019 226890 DEBUG oslo_concurrency.lockutils [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Lock "6e4afbc3-37b1-4657-b152-91645facfcca-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:41:07 np0005588920 nova_compute[226886]: 2026-01-20 14:41:07.020 226890 DEBUG oslo_concurrency.lockutils [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Lock "6e4afbc3-37b1-4657-b152-91645facfcca-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:41:07 np0005588920 nova_compute[226886]: 2026-01-20 14:41:07.020 226890 DEBUG nova.virt.libvirt.vif [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:40:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-InstanceActionsTestJSON-server-201944342',display_name='tempest-InstanceActionsTestJSON-server-201944342',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-instanceactionstestjson-server-201944342',id=70,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c8c509d8e23246e1a509bf2197b73ebf',ramdisk_id='',reservation_id='r-vq75u88a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-InstanceActionsTestJSON-1406975575',owner_user_name='tempest-InstanceActionsTestJSON-1406975575-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:41:01Z,user_data=None,user_id='7e3fb126d8254300b5f6f408fceefb19',uuid=6e4afbc3-37b1-4657-b152-91645facfcca,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b798b69c-a652-408f-810b-0a1d3d9e324c", "address": "fa:16:3e:1f:5b:64", "network": {"id": "ed088f03-9b1a-4d56-98b6-03d264f312c6", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1860463566-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c8c509d8e23246e1a509bf2197b73ebf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb798b69c-a6", "ovs_interfaceid": "b798b69c-a652-408f-810b-0a1d3d9e324c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:41:07 np0005588920 nova_compute[226886]: 2026-01-20 14:41:07.020 226890 DEBUG nova.network.os_vif_util [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Converting VIF {"id": "b798b69c-a652-408f-810b-0a1d3d9e324c", "address": "fa:16:3e:1f:5b:64", "network": {"id": "ed088f03-9b1a-4d56-98b6-03d264f312c6", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1860463566-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c8c509d8e23246e1a509bf2197b73ebf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb798b69c-a6", "ovs_interfaceid": "b798b69c-a652-408f-810b-0a1d3d9e324c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:41:07 np0005588920 nova_compute[226886]: 2026-01-20 14:41:07.021 226890 DEBUG nova.network.os_vif_util [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1f:5b:64,bridge_name='br-int',has_traffic_filtering=True,id=b798b69c-a652-408f-810b-0a1d3d9e324c,network=Network(ed088f03-9b1a-4d56-98b6-03d264f312c6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb798b69c-a6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:41:07 np0005588920 nova_compute[226886]: 2026-01-20 14:41:07.021 226890 DEBUG os_vif [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1f:5b:64,bridge_name='br-int',has_traffic_filtering=True,id=b798b69c-a652-408f-810b-0a1d3d9e324c,network=Network(ed088f03-9b1a-4d56-98b6-03d264f312c6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb798b69c-a6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:41:07 np0005588920 nova_compute[226886]: 2026-01-20 14:41:07.022 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:07 np0005588920 nova_compute[226886]: 2026-01-20 14:41:07.022 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:41:07 np0005588920 nova_compute[226886]: 2026-01-20 14:41:07.022 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:41:07 np0005588920 nova_compute[226886]: 2026-01-20 14:41:07.024 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:07 np0005588920 nova_compute[226886]: 2026-01-20 14:41:07.025 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb798b69c-a6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:41:07 np0005588920 nova_compute[226886]: 2026-01-20 14:41:07.025 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb798b69c-a6, col_values=(('external_ids', {'iface-id': 'b798b69c-a652-408f-810b-0a1d3d9e324c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1f:5b:64', 'vm-uuid': '6e4afbc3-37b1-4657-b152-91645facfcca'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:41:07 np0005588920 nova_compute[226886]: 2026-01-20 14:41:07.026 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:07 np0005588920 NetworkManager[49076]: <info>  [1768920067.0273] manager: (tapb798b69c-a6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/128)
Jan 20 09:41:07 np0005588920 nova_compute[226886]: 2026-01-20 14:41:07.028 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:41:07 np0005588920 nova_compute[226886]: 2026-01-20 14:41:07.032 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:07 np0005588920 nova_compute[226886]: 2026-01-20 14:41:07.033 226890 INFO os_vif [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1f:5b:64,bridge_name='br-int',has_traffic_filtering=True,id=b798b69c-a652-408f-810b-0a1d3d9e324c,network=Network(ed088f03-9b1a-4d56-98b6-03d264f312c6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb798b69c-a6')#033[00m
Jan 20 09:41:07 np0005588920 nova_compute[226886]: 2026-01-20 14:41:07.133 226890 DEBUG nova.virt.libvirt.driver [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:41:07 np0005588920 nova_compute[226886]: 2026-01-20 14:41:07.134 226890 DEBUG nova.virt.libvirt.driver [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:41:07 np0005588920 nova_compute[226886]: 2026-01-20 14:41:07.135 226890 DEBUG nova.virt.libvirt.driver [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] No VIF found with MAC fa:16:3e:1f:5b:64, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:41:07 np0005588920 nova_compute[226886]: 2026-01-20 14:41:07.135 226890 INFO nova.virt.libvirt.driver [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Using config drive#033[00m
Jan 20 09:41:07 np0005588920 nova_compute[226886]: 2026-01-20 14:41:07.155 226890 DEBUG nova.storage.rbd_utils [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] rbd image 6e4afbc3-37b1-4657-b152-91645facfcca_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:41:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:07.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:07 np0005588920 nova_compute[226886]: 2026-01-20 14:41:07.633 226890 INFO nova.virt.libvirt.driver [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Creating config drive at /var/lib/nova/instances/6e4afbc3-37b1-4657-b152-91645facfcca/disk.config#033[00m
Jan 20 09:41:07 np0005588920 nova_compute[226886]: 2026-01-20 14:41:07.639 226890 DEBUG oslo_concurrency.processutils [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6e4afbc3-37b1-4657-b152-91645facfcca/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxj_el3x5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:41:07 np0005588920 nova_compute[226886]: 2026-01-20 14:41:07.769 226890 DEBUG oslo_concurrency.processutils [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6e4afbc3-37b1-4657-b152-91645facfcca/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxj_el3x5" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:41:07 np0005588920 nova_compute[226886]: 2026-01-20 14:41:07.791 226890 DEBUG nova.storage.rbd_utils [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] rbd image 6e4afbc3-37b1-4657-b152-91645facfcca_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:41:07 np0005588920 nova_compute[226886]: 2026-01-20 14:41:07.795 226890 DEBUG oslo_concurrency.processutils [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/6e4afbc3-37b1-4657-b152-91645facfcca/disk.config 6e4afbc3-37b1-4657-b152-91645facfcca_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:41:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:41:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:07.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:41:07 np0005588920 podman[251100]: 2026-01-20 14:41:07.952122158 +0000 UTC m=+0.043993767 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 20 09:41:08 np0005588920 nova_compute[226886]: 2026-01-20 14:41:08.043 226890 DEBUG nova.network.neutron [req-87403f35-c713-43ed-bfae-db4d182cb2fa req-96aec9ea-b236-4614-b7d4-56cc7ba5b3ad 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Updated VIF entry in instance network info cache for port b798b69c-a652-408f-810b-0a1d3d9e324c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:41:08 np0005588920 nova_compute[226886]: 2026-01-20 14:41:08.044 226890 DEBUG nova.network.neutron [req-87403f35-c713-43ed-bfae-db4d182cb2fa req-96aec9ea-b236-4614-b7d4-56cc7ba5b3ad 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Updating instance_info_cache with network_info: [{"id": "b798b69c-a652-408f-810b-0a1d3d9e324c", "address": "fa:16:3e:1f:5b:64", "network": {"id": "ed088f03-9b1a-4d56-98b6-03d264f312c6", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1860463566-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c8c509d8e23246e1a509bf2197b73ebf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb798b69c-a6", "ovs_interfaceid": "b798b69c-a652-408f-810b-0a1d3d9e324c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:41:08 np0005588920 nova_compute[226886]: 2026-01-20 14:41:08.068 226890 DEBUG oslo_concurrency.lockutils [req-87403f35-c713-43ed-bfae-db4d182cb2fa req-96aec9ea-b236-4614-b7d4-56cc7ba5b3ad 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-6e4afbc3-37b1-4657-b152-91645facfcca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:41:08 np0005588920 nova_compute[226886]: 2026-01-20 14:41:08.166 226890 DEBUG oslo_concurrency.processutils [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/6e4afbc3-37b1-4657-b152-91645facfcca/disk.config 6e4afbc3-37b1-4657-b152-91645facfcca_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.372s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:41:08 np0005588920 nova_compute[226886]: 2026-01-20 14:41:08.167 226890 INFO nova.virt.libvirt.driver [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Deleting local config drive /var/lib/nova/instances/6e4afbc3-37b1-4657-b152-91645facfcca/disk.config because it was imported into RBD.#033[00m
Jan 20 09:41:08 np0005588920 kernel: tapb798b69c-a6: entered promiscuous mode
Jan 20 09:41:08 np0005588920 ovn_controller[133971]: 2026-01-20T14:41:08Z|00243|binding|INFO|Claiming lport b798b69c-a652-408f-810b-0a1d3d9e324c for this chassis.
Jan 20 09:41:08 np0005588920 nova_compute[226886]: 2026-01-20 14:41:08.227 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:08 np0005588920 NetworkManager[49076]: <info>  [1768920068.2278] manager: (tapb798b69c-a6): new Tun device (/org/freedesktop/NetworkManager/Devices/129)
Jan 20 09:41:08 np0005588920 ovn_controller[133971]: 2026-01-20T14:41:08Z|00244|binding|INFO|b798b69c-a652-408f-810b-0a1d3d9e324c: Claiming fa:16:3e:1f:5b:64 10.100.0.10
Jan 20 09:41:08 np0005588920 nova_compute[226886]: 2026-01-20 14:41:08.234 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:08.240 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1f:5b:64 10.100.0.10'], port_security=['fa:16:3e:1f:5b:64 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '6e4afbc3-37b1-4657-b152-91645facfcca', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ed088f03-9b1a-4d56-98b6-03d264f312c6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c8c509d8e23246e1a509bf2197b73ebf', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2942ac75-5271-4ea9-ab7b-3d6b61584672', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cf43ed79-9a6d-411c-b553-d9c9674f1bcb, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=b798b69c-a652-408f-810b-0a1d3d9e324c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:08.242 144128 INFO neutron.agent.ovn.metadata.agent [-] Port b798b69c-a652-408f-810b-0a1d3d9e324c in datapath ed088f03-9b1a-4d56-98b6-03d264f312c6 bound to our chassis#033[00m
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:08.243 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ed088f03-9b1a-4d56-98b6-03d264f312c6#033[00m
Jan 20 09:41:08 np0005588920 systemd-udevd[251135]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:08.259 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5d1389a2-ad3b-48e5-9051-8a17486ab8dc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:08.260 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH taped088f03-91 in ovnmeta-ed088f03-9b1a-4d56-98b6-03d264f312c6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:08.262 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface taped088f03-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:08.263 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b9e0f746-f61f-480e-9bf5-1a9c7246c2d3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:08.264 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c2392fc2-61d0-4fcd-88e8-7bc72401bf96]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:08 np0005588920 systemd-machined[196121]: New machine qemu-28-instance-00000046.
Jan 20 09:41:08 np0005588920 NetworkManager[49076]: <info>  [1768920068.2707] device (tapb798b69c-a6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:41:08 np0005588920 NetworkManager[49076]: <info>  [1768920068.2717] device (tapb798b69c-a6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:08.279 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[c5b1bc12-5cb3-483b-bf93-7ecf13785c23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:08 np0005588920 nova_compute[226886]: 2026-01-20 14:41:08.296 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:08 np0005588920 systemd[1]: Started Virtual Machine qemu-28-instance-00000046.
Jan 20 09:41:08 np0005588920 ovn_controller[133971]: 2026-01-20T14:41:08Z|00245|binding|INFO|Setting lport b798b69c-a652-408f-810b-0a1d3d9e324c ovn-installed in OVS
Jan 20 09:41:08 np0005588920 ovn_controller[133971]: 2026-01-20T14:41:08Z|00246|binding|INFO|Setting lport b798b69c-a652-408f-810b-0a1d3d9e324c up in Southbound
Jan 20 09:41:08 np0005588920 nova_compute[226886]: 2026-01-20 14:41:08.304 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:08.304 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[cc4f947a-9314-4b5d-96fb-dee395048420]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:08.333 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[2c8f2659-12d2-42b0-8be7-ffcc0148142f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:08 np0005588920 systemd-udevd[251139]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:41:08 np0005588920 NetworkManager[49076]: <info>  [1768920068.3387] manager: (taped088f03-90): new Veth device (/org/freedesktop/NetworkManager/Devices/130)
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:08.338 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[8030567f-6126-4e5f-973f-ce8b6ee28de6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:08.366 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[13812dcf-e4dd-4e69-8206-7b5d21b9bcf3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:08.369 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[c4a3aa54-1b88-48f3-81f8-e628569c2822]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:08 np0005588920 NetworkManager[49076]: <info>  [1768920068.3908] device (taped088f03-90): carrier: link connected
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:08.396 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[72e47331-6bfc-493c-91f4-964af9ec3cd7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:08.411 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[268f4ad4-61dc-4424-a05b-4afa7f41ce8f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'taped088f03-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:72:85:d1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 77], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 511007, 'reachable_time': 38546, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 251168, 'error': None, 'target': 'ovnmeta-ed088f03-9b1a-4d56-98b6-03d264f312c6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:08.423 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2e574cb2-f4f0-4ffb-af57-2cc50b130896]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe72:85d1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 511007, 'tstamp': 511007}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 251169, 'error': None, 'target': 'ovnmeta-ed088f03-9b1a-4d56-98b6-03d264f312c6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:08.437 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[aebd6c43-43c7-413b-8b1e-c74163b897cb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'taped088f03-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:72:85:d1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 77], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 511007, 'reachable_time': 38546, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 251170, 'error': None, 'target': 'ovnmeta-ed088f03-9b1a-4d56-98b6-03d264f312c6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:08.463 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[689c5f85-68de-4311-888c-1459887c94ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:08.516 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[550a1ba1-07ac-4797-a565-74270d7619ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:08.517 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=taped088f03-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:08.517 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:08.517 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=taped088f03-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:41:08 np0005588920 nova_compute[226886]: 2026-01-20 14:41:08.519 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:08 np0005588920 kernel: taped088f03-90: entered promiscuous mode
Jan 20 09:41:08 np0005588920 NetworkManager[49076]: <info>  [1768920068.5211] manager: (taped088f03-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/131)
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:08.522 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=taped088f03-90, col_values=(('external_ids', {'iface-id': '5c384a16-e516-4384-a312-07ab9aa7b9a1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:41:08 np0005588920 nova_compute[226886]: 2026-01-20 14:41:08.523 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:08 np0005588920 ovn_controller[133971]: 2026-01-20T14:41:08Z|00247|binding|INFO|Releasing lport 5c384a16-e516-4384-a312-07ab9aa7b9a1 from this chassis (sb_readonly=0)
Jan 20 09:41:08 np0005588920 nova_compute[226886]: 2026-01-20 14:41:08.535 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:08.537 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ed088f03-9b1a-4d56-98b6-03d264f312c6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ed088f03-9b1a-4d56-98b6-03d264f312c6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:08.538 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f60084b4-4437-4cc9-a45f-b1688d020fba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:08.539 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-ed088f03-9b1a-4d56-98b6-03d264f312c6
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/ed088f03-9b1a-4d56-98b6-03d264f312c6.pid.haproxy
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID ed088f03-9b1a-4d56-98b6-03d264f312c6
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:41:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:08.539 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ed088f03-9b1a-4d56-98b6-03d264f312c6', 'env', 'PROCESS_TAG=haproxy-ed088f03-9b1a-4d56-98b6-03d264f312c6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ed088f03-9b1a-4d56-98b6-03d264f312c6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:41:08 np0005588920 podman[251238]: 2026-01-20 14:41:08.854515236 +0000 UTC m=+0.020361319 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:41:09 np0005588920 nova_compute[226886]: 2026-01-20 14:41:09.068 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920069.0682542, 6e4afbc3-37b1-4657-b152-91645facfcca => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:41:09 np0005588920 nova_compute[226886]: 2026-01-20 14:41:09.069 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] VM Started (Lifecycle Event)#033[00m
Jan 20 09:41:09 np0005588920 podman[251238]: 2026-01-20 14:41:09.089280251 +0000 UTC m=+0.255126314 container create 0149688c39d4502e2c6136b686f87160b1a639e3a7e3b2e69822f1f0a15e5389 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ed088f03-9b1a-4d56-98b6-03d264f312c6, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:41:09 np0005588920 nova_compute[226886]: 2026-01-20 14:41:09.091 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:41:09 np0005588920 nova_compute[226886]: 2026-01-20 14:41:09.095 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920069.0694551, 6e4afbc3-37b1-4657-b152-91645facfcca => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:41:09 np0005588920 nova_compute[226886]: 2026-01-20 14:41:09.096 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:41:09 np0005588920 systemd[1]: Started libpod-conmon-0149688c39d4502e2c6136b686f87160b1a639e3a7e3b2e69822f1f0a15e5389.scope.
Jan 20 09:41:09 np0005588920 nova_compute[226886]: 2026-01-20 14:41:09.134 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:41:09 np0005588920 nova_compute[226886]: 2026-01-20 14:41:09.137 226890 DEBUG nova.compute.manager [req-ab99d524-77e6-4d8a-a9de-97ac8e6d6428 req-6f25aa7c-9a4d-41d5-830d-4b7076fa29d7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Received event network-vif-plugged-b798b69c-a652-408f-810b-0a1d3d9e324c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:41:09 np0005588920 nova_compute[226886]: 2026-01-20 14:41:09.138 226890 DEBUG oslo_concurrency.lockutils [req-ab99d524-77e6-4d8a-a9de-97ac8e6d6428 req-6f25aa7c-9a4d-41d5-830d-4b7076fa29d7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "6e4afbc3-37b1-4657-b152-91645facfcca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:41:09 np0005588920 nova_compute[226886]: 2026-01-20 14:41:09.138 226890 DEBUG oslo_concurrency.lockutils [req-ab99d524-77e6-4d8a-a9de-97ac8e6d6428 req-6f25aa7c-9a4d-41d5-830d-4b7076fa29d7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "6e4afbc3-37b1-4657-b152-91645facfcca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:41:09 np0005588920 nova_compute[226886]: 2026-01-20 14:41:09.139 226890 DEBUG oslo_concurrency.lockutils [req-ab99d524-77e6-4d8a-a9de-97ac8e6d6428 req-6f25aa7c-9a4d-41d5-830d-4b7076fa29d7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "6e4afbc3-37b1-4657-b152-91645facfcca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:41:09 np0005588920 nova_compute[226886]: 2026-01-20 14:41:09.139 226890 DEBUG nova.compute.manager [req-ab99d524-77e6-4d8a-a9de-97ac8e6d6428 req-6f25aa7c-9a4d-41d5-830d-4b7076fa29d7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Processing event network-vif-plugged-b798b69c-a652-408f-810b-0a1d3d9e324c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:41:09 np0005588920 nova_compute[226886]: 2026-01-20 14:41:09.140 226890 DEBUG nova.compute.manager [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:41:09 np0005588920 nova_compute[226886]: 2026-01-20 14:41:09.143 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:41:09 np0005588920 nova_compute[226886]: 2026-01-20 14:41:09.146 226890 DEBUG nova.virt.libvirt.driver [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:41:09 np0005588920 nova_compute[226886]: 2026-01-20 14:41:09.149 226890 INFO nova.virt.libvirt.driver [-] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Instance spawned successfully.#033[00m
Jan 20 09:41:09 np0005588920 nova_compute[226886]: 2026-01-20 14:41:09.150 226890 DEBUG nova.virt.libvirt.driver [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:41:09 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:41:09 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cc493cba9e0974f17b1d3ca4592da6f6962979de4305250866f375dcf6123aa5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:41:09 np0005588920 nova_compute[226886]: 2026-01-20 14:41:09.168 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:41:09 np0005588920 nova_compute[226886]: 2026-01-20 14:41:09.169 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920069.1437902, 6e4afbc3-37b1-4657-b152-91645facfcca => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:41:09 np0005588920 nova_compute[226886]: 2026-01-20 14:41:09.169 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:41:09 np0005588920 podman[251238]: 2026-01-20 14:41:09.172653225 +0000 UTC m=+0.338499288 container init 0149688c39d4502e2c6136b686f87160b1a639e3a7e3b2e69822f1f0a15e5389 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ed088f03-9b1a-4d56-98b6-03d264f312c6, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 20 09:41:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:09.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:09 np0005588920 nova_compute[226886]: 2026-01-20 14:41:09.178 226890 DEBUG nova.virt.libvirt.driver [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:41:09 np0005588920 nova_compute[226886]: 2026-01-20 14:41:09.179 226890 DEBUG nova.virt.libvirt.driver [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:41:09 np0005588920 podman[251238]: 2026-01-20 14:41:09.180041661 +0000 UTC m=+0.345887724 container start 0149688c39d4502e2c6136b686f87160b1a639e3a7e3b2e69822f1f0a15e5389 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ed088f03-9b1a-4d56-98b6-03d264f312c6, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:41:09 np0005588920 nova_compute[226886]: 2026-01-20 14:41:09.180 226890 DEBUG nova.virt.libvirt.driver [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:41:09 np0005588920 nova_compute[226886]: 2026-01-20 14:41:09.180 226890 DEBUG nova.virt.libvirt.driver [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:41:09 np0005588920 nova_compute[226886]: 2026-01-20 14:41:09.180 226890 DEBUG nova.virt.libvirt.driver [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:41:09 np0005588920 nova_compute[226886]: 2026-01-20 14:41:09.181 226890 DEBUG nova.virt.libvirt.driver [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:41:09 np0005588920 nova_compute[226886]: 2026-01-20 14:41:09.185 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:41:09 np0005588920 nova_compute[226886]: 2026-01-20 14:41:09.189 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:41:09 np0005588920 neutron-haproxy-ovnmeta-ed088f03-9b1a-4d56-98b6-03d264f312c6[251260]: [NOTICE]   (251264) : New worker (251266) forked
Jan 20 09:41:09 np0005588920 neutron-haproxy-ovnmeta-ed088f03-9b1a-4d56-98b6-03d264f312c6[251260]: [NOTICE]   (251264) : Loading success.
Jan 20 09:41:09 np0005588920 nova_compute[226886]: 2026-01-20 14:41:09.212 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:41:09 np0005588920 nova_compute[226886]: 2026-01-20 14:41:09.267 226890 INFO nova.compute.manager [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Took 7.42 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:41:09 np0005588920 nova_compute[226886]: 2026-01-20 14:41:09.268 226890 DEBUG nova.compute.manager [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:41:09 np0005588920 nova_compute[226886]: 2026-01-20 14:41:09.335 226890 INFO nova.compute.manager [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Took 8.36 seconds to build instance.#033[00m
Jan 20 09:41:09 np0005588920 nova_compute[226886]: 2026-01-20 14:41:09.359 226890 DEBUG oslo_concurrency.lockutils [None req-ceb3b326-cffe-45b7-9f17-0306dd1d12f8 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Lock "6e4afbc3-37b1-4657-b152-91645facfcca" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.492s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:41:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:41:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:09.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:41:10 np0005588920 nova_compute[226886]: 2026-01-20 14:41:10.110 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:10 np0005588920 nova_compute[226886]: 2026-01-20 14:41:10.861 226890 DEBUG oslo_concurrency.lockutils [None req-fe9c99d5-6c58-439e-8d97-87b5141807a9 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Acquiring lock "6e4afbc3-37b1-4657-b152-91645facfcca" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:41:10 np0005588920 nova_compute[226886]: 2026-01-20 14:41:10.862 226890 DEBUG oslo_concurrency.lockutils [None req-fe9c99d5-6c58-439e-8d97-87b5141807a9 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Lock "6e4afbc3-37b1-4657-b152-91645facfcca" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:41:10 np0005588920 nova_compute[226886]: 2026-01-20 14:41:10.863 226890 INFO nova.compute.manager [None req-fe9c99d5-6c58-439e-8d97-87b5141807a9 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Rebooting instance#033[00m
Jan 20 09:41:10 np0005588920 nova_compute[226886]: 2026-01-20 14:41:10.880 226890 DEBUG oslo_concurrency.lockutils [None req-fe9c99d5-6c58-439e-8d97-87b5141807a9 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Acquiring lock "refresh_cache-6e4afbc3-37b1-4657-b152-91645facfcca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:41:10 np0005588920 nova_compute[226886]: 2026-01-20 14:41:10.880 226890 DEBUG oslo_concurrency.lockutils [None req-fe9c99d5-6c58-439e-8d97-87b5141807a9 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Acquired lock "refresh_cache-6e4afbc3-37b1-4657-b152-91645facfcca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:41:10 np0005588920 nova_compute[226886]: 2026-01-20 14:41:10.881 226890 DEBUG nova.network.neutron [None req-fe9c99d5-6c58-439e-8d97-87b5141807a9 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:41:10 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:41:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:41:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:11.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:41:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:11.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:11 np0005588920 nova_compute[226886]: 2026-01-20 14:41:11.995 226890 DEBUG nova.compute.manager [req-c9438bf8-7af3-41c3-8473-cf9748168ef3 req-46153f77-60ed-40bf-929f-4892ad20d242 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Received event network-vif-plugged-b798b69c-a652-408f-810b-0a1d3d9e324c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:41:11 np0005588920 nova_compute[226886]: 2026-01-20 14:41:11.996 226890 DEBUG oslo_concurrency.lockutils [req-c9438bf8-7af3-41c3-8473-cf9748168ef3 req-46153f77-60ed-40bf-929f-4892ad20d242 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "6e4afbc3-37b1-4657-b152-91645facfcca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:41:11 np0005588920 nova_compute[226886]: 2026-01-20 14:41:11.996 226890 DEBUG oslo_concurrency.lockutils [req-c9438bf8-7af3-41c3-8473-cf9748168ef3 req-46153f77-60ed-40bf-929f-4892ad20d242 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "6e4afbc3-37b1-4657-b152-91645facfcca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:41:11 np0005588920 nova_compute[226886]: 2026-01-20 14:41:11.996 226890 DEBUG oslo_concurrency.lockutils [req-c9438bf8-7af3-41c3-8473-cf9748168ef3 req-46153f77-60ed-40bf-929f-4892ad20d242 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "6e4afbc3-37b1-4657-b152-91645facfcca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:41:11 np0005588920 nova_compute[226886]: 2026-01-20 14:41:11.997 226890 DEBUG nova.compute.manager [req-c9438bf8-7af3-41c3-8473-cf9748168ef3 req-46153f77-60ed-40bf-929f-4892ad20d242 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] No waiting events found dispatching network-vif-plugged-b798b69c-a652-408f-810b-0a1d3d9e324c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:41:11 np0005588920 nova_compute[226886]: 2026-01-20 14:41:11.997 226890 WARNING nova.compute.manager [req-c9438bf8-7af3-41c3-8473-cf9748168ef3 req-46153f77-60ed-40bf-929f-4892ad20d242 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Received unexpected event network-vif-plugged-b798b69c-a652-408f-810b-0a1d3d9e324c for instance with vm_state active and task_state rebooting_hard.#033[00m
Jan 20 09:41:12 np0005588920 nova_compute[226886]: 2026-01-20 14:41:12.027 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:13 np0005588920 nova_compute[226886]: 2026-01-20 14:41:13.097 226890 DEBUG nova.network.neutron [None req-fe9c99d5-6c58-439e-8d97-87b5141807a9 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Updating instance_info_cache with network_info: [{"id": "b798b69c-a652-408f-810b-0a1d3d9e324c", "address": "fa:16:3e:1f:5b:64", "network": {"id": "ed088f03-9b1a-4d56-98b6-03d264f312c6", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1860463566-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c8c509d8e23246e1a509bf2197b73ebf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb798b69c-a6", "ovs_interfaceid": "b798b69c-a652-408f-810b-0a1d3d9e324c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:41:13 np0005588920 nova_compute[226886]: 2026-01-20 14:41:13.126 226890 DEBUG oslo_concurrency.lockutils [None req-fe9c99d5-6c58-439e-8d97-87b5141807a9 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Releasing lock "refresh_cache-6e4afbc3-37b1-4657-b152-91645facfcca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:41:13 np0005588920 nova_compute[226886]: 2026-01-20 14:41:13.128 226890 DEBUG nova.compute.manager [None req-fe9c99d5-6c58-439e-8d97-87b5141807a9 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:41:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:13.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:13 np0005588920 kernel: tapb798b69c-a6 (unregistering): left promiscuous mode
Jan 20 09:41:13 np0005588920 NetworkManager[49076]: <info>  [1768920073.3136] device (tapb798b69c-a6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:41:13 np0005588920 ovn_controller[133971]: 2026-01-20T14:41:13Z|00248|binding|INFO|Releasing lport b798b69c-a652-408f-810b-0a1d3d9e324c from this chassis (sb_readonly=0)
Jan 20 09:41:13 np0005588920 nova_compute[226886]: 2026-01-20 14:41:13.319 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:13 np0005588920 ovn_controller[133971]: 2026-01-20T14:41:13Z|00249|binding|INFO|Setting lport b798b69c-a652-408f-810b-0a1d3d9e324c down in Southbound
Jan 20 09:41:13 np0005588920 ovn_controller[133971]: 2026-01-20T14:41:13Z|00250|binding|INFO|Removing iface tapb798b69c-a6 ovn-installed in OVS
Jan 20 09:41:13 np0005588920 nova_compute[226886]: 2026-01-20 14:41:13.321 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:13.327 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1f:5b:64 10.100.0.10'], port_security=['fa:16:3e:1f:5b:64 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '6e4afbc3-37b1-4657-b152-91645facfcca', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ed088f03-9b1a-4d56-98b6-03d264f312c6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c8c509d8e23246e1a509bf2197b73ebf', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2942ac75-5271-4ea9-ab7b-3d6b61584672', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cf43ed79-9a6d-411c-b553-d9c9674f1bcb, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=b798b69c-a652-408f-810b-0a1d3d9e324c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:41:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:13.329 144128 INFO neutron.agent.ovn.metadata.agent [-] Port b798b69c-a652-408f-810b-0a1d3d9e324c in datapath ed088f03-9b1a-4d56-98b6-03d264f312c6 unbound from our chassis#033[00m
Jan 20 09:41:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:13.330 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ed088f03-9b1a-4d56-98b6-03d264f312c6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:41:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:13.331 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[858b7123-0b70-4e54-a318-8f05bc0f2448]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:13.332 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ed088f03-9b1a-4d56-98b6-03d264f312c6 namespace which is not needed anymore#033[00m
Jan 20 09:41:13 np0005588920 nova_compute[226886]: 2026-01-20 14:41:13.346 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:13 np0005588920 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d00000046.scope: Deactivated successfully.
Jan 20 09:41:13 np0005588920 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d00000046.scope: Consumed 4.815s CPU time.
Jan 20 09:41:13 np0005588920 systemd-machined[196121]: Machine qemu-28-instance-00000046 terminated.
Jan 20 09:41:13 np0005588920 nova_compute[226886]: 2026-01-20 14:41:13.467 226890 INFO nova.virt.libvirt.driver [-] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Instance destroyed successfully.#033[00m
Jan 20 09:41:13 np0005588920 nova_compute[226886]: 2026-01-20 14:41:13.467 226890 DEBUG nova.objects.instance [None req-fe9c99d5-6c58-439e-8d97-87b5141807a9 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Lazy-loading 'resources' on Instance uuid 6e4afbc3-37b1-4657-b152-91645facfcca obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:41:13 np0005588920 nova_compute[226886]: 2026-01-20 14:41:13.487 226890 DEBUG nova.virt.libvirt.vif [None req-fe9c99d5-6c58-439e-8d97-87b5141807a9 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:40:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-InstanceActionsTestJSON-server-201944342',display_name='tempest-InstanceActionsTestJSON-server-201944342',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-instanceactionstestjson-server-201944342',id=70,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:41:09Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c8c509d8e23246e1a509bf2197b73ebf',ramdisk_id='',reservation_id='r-vq75u88a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-InstanceActionsTestJSON-1406975575',owner_user_name='tempest-InstanceActionsTestJSON-1406975575-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:41:13Z,user_data=None,user_id='7e3fb126d8254300b5f6f408fceefb19',uuid=6e4afbc3-37b1-4657-b152-91645facfcca,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b798b69c-a652-408f-810b-0a1d3d9e324c", "address": "fa:16:3e:1f:5b:64", "network": {"id": "ed088f03-9b1a-4d56-98b6-03d264f312c6", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1860463566-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c8c509d8e23246e1a509bf2197b73ebf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb798b69c-a6", "ovs_interfaceid": "b798b69c-a652-408f-810b-0a1d3d9e324c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:41:13 np0005588920 nova_compute[226886]: 2026-01-20 14:41:13.488 226890 DEBUG nova.network.os_vif_util [None req-fe9c99d5-6c58-439e-8d97-87b5141807a9 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Converting VIF {"id": "b798b69c-a652-408f-810b-0a1d3d9e324c", "address": "fa:16:3e:1f:5b:64", "network": {"id": "ed088f03-9b1a-4d56-98b6-03d264f312c6", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1860463566-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c8c509d8e23246e1a509bf2197b73ebf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb798b69c-a6", "ovs_interfaceid": "b798b69c-a652-408f-810b-0a1d3d9e324c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:41:13 np0005588920 nova_compute[226886]: 2026-01-20 14:41:13.489 226890 DEBUG nova.network.os_vif_util [None req-fe9c99d5-6c58-439e-8d97-87b5141807a9 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1f:5b:64,bridge_name='br-int',has_traffic_filtering=True,id=b798b69c-a652-408f-810b-0a1d3d9e324c,network=Network(ed088f03-9b1a-4d56-98b6-03d264f312c6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb798b69c-a6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:41:13 np0005588920 nova_compute[226886]: 2026-01-20 14:41:13.489 226890 DEBUG os_vif [None req-fe9c99d5-6c58-439e-8d97-87b5141807a9 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1f:5b:64,bridge_name='br-int',has_traffic_filtering=True,id=b798b69c-a652-408f-810b-0a1d3d9e324c,network=Network(ed088f03-9b1a-4d56-98b6-03d264f312c6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb798b69c-a6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:41:13 np0005588920 nova_compute[226886]: 2026-01-20 14:41:13.491 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:13 np0005588920 nova_compute[226886]: 2026-01-20 14:41:13.491 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb798b69c-a6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:41:13 np0005588920 nova_compute[226886]: 2026-01-20 14:41:13.492 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:13 np0005588920 nova_compute[226886]: 2026-01-20 14:41:13.493 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:13 np0005588920 nova_compute[226886]: 2026-01-20 14:41:13.496 226890 INFO os_vif [None req-fe9c99d5-6c58-439e-8d97-87b5141807a9 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1f:5b:64,bridge_name='br-int',has_traffic_filtering=True,id=b798b69c-a652-408f-810b-0a1d3d9e324c,network=Network(ed088f03-9b1a-4d56-98b6-03d264f312c6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb798b69c-a6')#033[00m
Jan 20 09:41:13 np0005588920 nova_compute[226886]: 2026-01-20 14:41:13.502 226890 DEBUG nova.virt.libvirt.driver [None req-fe9c99d5-6c58-439e-8d97-87b5141807a9 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Start _get_guest_xml network_info=[{"id": "b798b69c-a652-408f-810b-0a1d3d9e324c", "address": "fa:16:3e:1f:5b:64", "network": {"id": "ed088f03-9b1a-4d56-98b6-03d264f312c6", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1860463566-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c8c509d8e23246e1a509bf2197b73ebf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb798b69c-a6", "ovs_interfaceid": "b798b69c-a652-408f-810b-0a1d3d9e324c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None 
block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:41:13 np0005588920 nova_compute[226886]: 2026-01-20 14:41:13.505 226890 WARNING nova.virt.libvirt.driver [None req-fe9c99d5-6c58-439e-8d97-87b5141807a9 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:41:13 np0005588920 nova_compute[226886]: 2026-01-20 14:41:13.513 226890 DEBUG nova.virt.libvirt.host [None req-fe9c99d5-6c58-439e-8d97-87b5141807a9 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:41:13 np0005588920 nova_compute[226886]: 2026-01-20 14:41:13.513 226890 DEBUG nova.virt.libvirt.host [None req-fe9c99d5-6c58-439e-8d97-87b5141807a9 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:41:13 np0005588920 nova_compute[226886]: 2026-01-20 14:41:13.519 226890 DEBUG nova.virt.libvirt.host [None req-fe9c99d5-6c58-439e-8d97-87b5141807a9 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:41:13 np0005588920 nova_compute[226886]: 2026-01-20 14:41:13.520 226890 DEBUG nova.virt.libvirt.host [None req-fe9c99d5-6c58-439e-8d97-87b5141807a9 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:41:13 np0005588920 nova_compute[226886]: 2026-01-20 14:41:13.521 226890 DEBUG nova.virt.libvirt.driver [None req-fe9c99d5-6c58-439e-8d97-87b5141807a9 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:41:13 np0005588920 nova_compute[226886]: 2026-01-20 14:41:13.521 226890 DEBUG nova.virt.hardware [None req-fe9c99d5-6c58-439e-8d97-87b5141807a9 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:41:13 np0005588920 nova_compute[226886]: 2026-01-20 14:41:13.522 226890 DEBUG nova.virt.hardware [None req-fe9c99d5-6c58-439e-8d97-87b5141807a9 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:41:13 np0005588920 nova_compute[226886]: 2026-01-20 14:41:13.522 226890 DEBUG nova.virt.hardware [None req-fe9c99d5-6c58-439e-8d97-87b5141807a9 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:41:13 np0005588920 nova_compute[226886]: 2026-01-20 14:41:13.522 226890 DEBUG nova.virt.hardware [None req-fe9c99d5-6c58-439e-8d97-87b5141807a9 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:41:13 np0005588920 nova_compute[226886]: 2026-01-20 14:41:13.522 226890 DEBUG nova.virt.hardware [None req-fe9c99d5-6c58-439e-8d97-87b5141807a9 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:41:13 np0005588920 nova_compute[226886]: 2026-01-20 14:41:13.523 226890 DEBUG nova.virt.hardware [None req-fe9c99d5-6c58-439e-8d97-87b5141807a9 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:41:13 np0005588920 nova_compute[226886]: 2026-01-20 14:41:13.523 226890 DEBUG nova.virt.hardware [None req-fe9c99d5-6c58-439e-8d97-87b5141807a9 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:41:13 np0005588920 nova_compute[226886]: 2026-01-20 14:41:13.523 226890 DEBUG nova.virt.hardware [None req-fe9c99d5-6c58-439e-8d97-87b5141807a9 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:41:13 np0005588920 nova_compute[226886]: 2026-01-20 14:41:13.523 226890 DEBUG nova.virt.hardware [None req-fe9c99d5-6c58-439e-8d97-87b5141807a9 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:41:13 np0005588920 nova_compute[226886]: 2026-01-20 14:41:13.523 226890 DEBUG nova.virt.hardware [None req-fe9c99d5-6c58-439e-8d97-87b5141807a9 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:41:13 np0005588920 nova_compute[226886]: 2026-01-20 14:41:13.524 226890 DEBUG nova.virt.hardware [None req-fe9c99d5-6c58-439e-8d97-87b5141807a9 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:41:13 np0005588920 nova_compute[226886]: 2026-01-20 14:41:13.524 226890 DEBUG nova.objects.instance [None req-fe9c99d5-6c58-439e-8d97-87b5141807a9 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Lazy-loading 'vcpu_model' on Instance uuid 6e4afbc3-37b1-4657-b152-91645facfcca obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:41:13 np0005588920 nova_compute[226886]: 2026-01-20 14:41:13.571 226890 DEBUG oslo_concurrency.processutils [None req-fe9c99d5-6c58-439e-8d97-87b5141807a9 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:41:13 np0005588920 nova_compute[226886]: 2026-01-20 14:41:13.733 226890 DEBUG oslo_concurrency.lockutils [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Acquiring lock "8de70a62-d30c-4aa1-90fd-d5f6c551d606" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:41:13 np0005588920 nova_compute[226886]: 2026-01-20 14:41:13.734 226890 DEBUG oslo_concurrency.lockutils [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Lock "8de70a62-d30c-4aa1-90fd-d5f6c551d606" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:41:13 np0005588920 nova_compute[226886]: 2026-01-20 14:41:13.763 226890 DEBUG nova.compute.manager [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:41:13 np0005588920 nova_compute[226886]: 2026-01-20 14:41:13.851 226890 DEBUG oslo_concurrency.lockutils [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:41:13 np0005588920 nova_compute[226886]: 2026-01-20 14:41:13.851 226890 DEBUG oslo_concurrency.lockutils [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:41:13 np0005588920 nova_compute[226886]: 2026-01-20 14:41:13.858 226890 DEBUG nova.virt.hardware [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:41:13 np0005588920 nova_compute[226886]: 2026-01-20 14:41:13.859 226890 INFO nova.compute.claims [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 09:41:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:13.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:13 np0005588920 neutron-haproxy-ovnmeta-ed088f03-9b1a-4d56-98b6-03d264f312c6[251260]: [NOTICE]   (251264) : haproxy version is 2.8.14-c23fe91
Jan 20 09:41:13 np0005588920 neutron-haproxy-ovnmeta-ed088f03-9b1a-4d56-98b6-03d264f312c6[251260]: [NOTICE]   (251264) : path to executable is /usr/sbin/haproxy
Jan 20 09:41:13 np0005588920 neutron-haproxy-ovnmeta-ed088f03-9b1a-4d56-98b6-03d264f312c6[251260]: [WARNING]  (251264) : Exiting Master process...
Jan 20 09:41:13 np0005588920 neutron-haproxy-ovnmeta-ed088f03-9b1a-4d56-98b6-03d264f312c6[251260]: [ALERT]    (251264) : Current worker (251266) exited with code 143 (Terminated)
Jan 20 09:41:13 np0005588920 neutron-haproxy-ovnmeta-ed088f03-9b1a-4d56-98b6-03d264f312c6[251260]: [WARNING]  (251264) : All workers exited. Exiting... (0)
Jan 20 09:41:13 np0005588920 systemd[1]: libpod-0149688c39d4502e2c6136b686f87160b1a639e3a7e3b2e69822f1f0a15e5389.scope: Deactivated successfully.
Jan 20 09:41:13 np0005588920 podman[251301]: 2026-01-20 14:41:13.926871073 +0000 UTC m=+0.506616724 container died 0149688c39d4502e2c6136b686f87160b1a639e3a7e3b2e69822f1f0a15e5389 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ed088f03-9b1a-4d56-98b6-03d264f312c6, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 20 09:41:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:41:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3966429821' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:41:14 np0005588920 nova_compute[226886]: 2026-01-20 14:41:14.043 226890 DEBUG oslo_concurrency.processutils [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:41:14 np0005588920 nova_compute[226886]: 2026-01-20 14:41:14.112 226890 DEBUG nova.compute.manager [req-b4ef3b01-424e-4fc3-94ae-d72197fbcc0d req-f319cccc-5e2e-455c-926f-52723d0fc77e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Received event network-vif-unplugged-b798b69c-a652-408f-810b-0a1d3d9e324c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:41:14 np0005588920 nova_compute[226886]: 2026-01-20 14:41:14.113 226890 DEBUG oslo_concurrency.lockutils [req-b4ef3b01-424e-4fc3-94ae-d72197fbcc0d req-f319cccc-5e2e-455c-926f-52723d0fc77e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "6e4afbc3-37b1-4657-b152-91645facfcca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:41:14 np0005588920 nova_compute[226886]: 2026-01-20 14:41:14.113 226890 DEBUG oslo_concurrency.lockutils [req-b4ef3b01-424e-4fc3-94ae-d72197fbcc0d req-f319cccc-5e2e-455c-926f-52723d0fc77e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "6e4afbc3-37b1-4657-b152-91645facfcca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:41:14 np0005588920 nova_compute[226886]: 2026-01-20 14:41:14.114 226890 DEBUG oslo_concurrency.lockutils [req-b4ef3b01-424e-4fc3-94ae-d72197fbcc0d req-f319cccc-5e2e-455c-926f-52723d0fc77e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "6e4afbc3-37b1-4657-b152-91645facfcca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:41:14 np0005588920 nova_compute[226886]: 2026-01-20 14:41:14.114 226890 DEBUG nova.compute.manager [req-b4ef3b01-424e-4fc3-94ae-d72197fbcc0d req-f319cccc-5e2e-455c-926f-52723d0fc77e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] No waiting events found dispatching network-vif-unplugged-b798b69c-a652-408f-810b-0a1d3d9e324c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:41:14 np0005588920 nova_compute[226886]: 2026-01-20 14:41:14.115 226890 WARNING nova.compute.manager [req-b4ef3b01-424e-4fc3-94ae-d72197fbcc0d req-f319cccc-5e2e-455c-926f-52723d0fc77e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Received unexpected event network-vif-unplugged-b798b69c-a652-408f-810b-0a1d3d9e324c for instance with vm_state active and task_state reboot_started_hard.#033[00m
Jan 20 09:41:14 np0005588920 nova_compute[226886]: 2026-01-20 14:41:14.115 226890 DEBUG nova.compute.manager [req-b4ef3b01-424e-4fc3-94ae-d72197fbcc0d req-f319cccc-5e2e-455c-926f-52723d0fc77e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Received event network-vif-plugged-b798b69c-a652-408f-810b-0a1d3d9e324c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:41:14 np0005588920 nova_compute[226886]: 2026-01-20 14:41:14.116 226890 DEBUG oslo_concurrency.lockutils [req-b4ef3b01-424e-4fc3-94ae-d72197fbcc0d req-f319cccc-5e2e-455c-926f-52723d0fc77e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "6e4afbc3-37b1-4657-b152-91645facfcca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:41:14 np0005588920 nova_compute[226886]: 2026-01-20 14:41:14.116 226890 DEBUG oslo_concurrency.lockutils [req-b4ef3b01-424e-4fc3-94ae-d72197fbcc0d req-f319cccc-5e2e-455c-926f-52723d0fc77e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "6e4afbc3-37b1-4657-b152-91645facfcca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:41:14 np0005588920 nova_compute[226886]: 2026-01-20 14:41:14.117 226890 DEBUG oslo_concurrency.lockutils [req-b4ef3b01-424e-4fc3-94ae-d72197fbcc0d req-f319cccc-5e2e-455c-926f-52723d0fc77e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "6e4afbc3-37b1-4657-b152-91645facfcca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:41:14 np0005588920 nova_compute[226886]: 2026-01-20 14:41:14.117 226890 DEBUG nova.compute.manager [req-b4ef3b01-424e-4fc3-94ae-d72197fbcc0d req-f319cccc-5e2e-455c-926f-52723d0fc77e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] No waiting events found dispatching network-vif-plugged-b798b69c-a652-408f-810b-0a1d3d9e324c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:41:14 np0005588920 nova_compute[226886]: 2026-01-20 14:41:14.117 226890 WARNING nova.compute.manager [req-b4ef3b01-424e-4fc3-94ae-d72197fbcc0d req-f319cccc-5e2e-455c-926f-52723d0fc77e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Received unexpected event network-vif-plugged-b798b69c-a652-408f-810b-0a1d3d9e324c for instance with vm_state active and task_state reboot_started_hard.#033[00m
Jan 20 09:41:14 np0005588920 nova_compute[226886]: 2026-01-20 14:41:14.129 226890 DEBUG oslo_concurrency.processutils [None req-fe9c99d5-6c58-439e-8d97-87b5141807a9 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.558s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:41:14 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0149688c39d4502e2c6136b686f87160b1a639e3a7e3b2e69822f1f0a15e5389-userdata-shm.mount: Deactivated successfully.
Jan 20 09:41:14 np0005588920 systemd[1]: var-lib-containers-storage-overlay-cc493cba9e0974f17b1d3ca4592da6f6962979de4305250866f375dcf6123aa5-merged.mount: Deactivated successfully.
Jan 20 09:41:14 np0005588920 nova_compute[226886]: 2026-01-20 14:41:14.496 226890 DEBUG oslo_concurrency.processutils [None req-fe9c99d5-6c58-439e-8d97-87b5141807a9 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:41:14 np0005588920 podman[251301]: 2026-01-20 14:41:14.51108765 +0000 UTC m=+1.090833321 container cleanup 0149688c39d4502e2c6136b686f87160b1a639e3a7e3b2e69822f1f0a15e5389 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ed088f03-9b1a-4d56-98b6-03d264f312c6, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 20 09:41:14 np0005588920 systemd[1]: libpod-conmon-0149688c39d4502e2c6136b686f87160b1a639e3a7e3b2e69822f1f0a15e5389.scope: Deactivated successfully.
Jan 20 09:41:14 np0005588920 podman[251402]: 2026-01-20 14:41:14.574080046 +0000 UTC m=+0.039436730 container remove 0149688c39d4502e2c6136b686f87160b1a639e3a7e3b2e69822f1f0a15e5389 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ed088f03-9b1a-4d56-98b6-03d264f312c6, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 20 09:41:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:14.580 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e8f794e5-be70-4ebe-acd0-bd6f001245ec]: (4, ('Tue Jan 20 02:41:13 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ed088f03-9b1a-4d56-98b6-03d264f312c6 (0149688c39d4502e2c6136b686f87160b1a639e3a7e3b2e69822f1f0a15e5389)\n0149688c39d4502e2c6136b686f87160b1a639e3a7e3b2e69822f1f0a15e5389\nTue Jan 20 02:41:14 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ed088f03-9b1a-4d56-98b6-03d264f312c6 (0149688c39d4502e2c6136b686f87160b1a639e3a7e3b2e69822f1f0a15e5389)\n0149688c39d4502e2c6136b686f87160b1a639e3a7e3b2e69822f1f0a15e5389\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:14.583 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[33b60ce7-e95c-4a17-acd1-0c3c8aae811c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:14.585 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=taped088f03-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:41:14 np0005588920 nova_compute[226886]: 2026-01-20 14:41:14.587 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:14 np0005588920 kernel: taped088f03-90: left promiscuous mode
Jan 20 09:41:14 np0005588920 nova_compute[226886]: 2026-01-20 14:41:14.601 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:14.606 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[03ac90c3-1b01-492b-8445-afb37c4c2d53]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:14.623 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[7fc9fb55-ccdd-44fe-801f-c8262ac00a37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:14.624 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c5a69d7f-0fec-4faf-b71e-2d3e7d8ebc78]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:14.642 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[00adb9d8-7379-4b59-b817-d7fdd9866ec5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 511001, 'reachable_time': 34535, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 251418, 'error': None, 'target': 'ovnmeta-ed088f03-9b1a-4d56-98b6-03d264f312c6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:14 np0005588920 systemd[1]: run-netns-ovnmeta\x2ded088f03\x2d9b1a\x2d4d56\x2d98b6\x2d03d264f312c6.mount: Deactivated successfully.
Jan 20 09:41:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:14.646 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ed088f03-9b1a-4d56-98b6-03d264f312c6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:41:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:14.647 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[2629fe50-853d-477d-ac10-bbf2a872feb6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:41:14 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2521530103' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:41:14 np0005588920 nova_compute[226886]: 2026-01-20 14:41:14.803 226890 DEBUG oslo_concurrency.processutils [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.760s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:41:14 np0005588920 nova_compute[226886]: 2026-01-20 14:41:14.809 226890 DEBUG nova.compute.provider_tree [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:41:14 np0005588920 nova_compute[226886]: 2026-01-20 14:41:14.826 226890 DEBUG nova.scheduler.client.report [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:41:14 np0005588920 nova_compute[226886]: 2026-01-20 14:41:14.848 226890 DEBUG oslo_concurrency.lockutils [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.997s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:41:14 np0005588920 nova_compute[226886]: 2026-01-20 14:41:14.849 226890 DEBUG nova.compute.manager [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:41:14 np0005588920 nova_compute[226886]: 2026-01-20 14:41:14.914 226890 DEBUG nova.compute.manager [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:41:14 np0005588920 nova_compute[226886]: 2026-01-20 14:41:14.914 226890 DEBUG nova.network.neutron [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:41:14 np0005588920 nova_compute[226886]: 2026-01-20 14:41:14.932 226890 INFO nova.virt.libvirt.driver [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:41:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:41:14 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2225346245' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:41:14 np0005588920 nova_compute[226886]: 2026-01-20 14:41:14.948 226890 DEBUG nova.compute.manager [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:41:14 np0005588920 nova_compute[226886]: 2026-01-20 14:41:14.963 226890 DEBUG oslo_concurrency.processutils [None req-fe9c99d5-6c58-439e-8d97-87b5141807a9 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:41:14 np0005588920 nova_compute[226886]: 2026-01-20 14:41:14.964 226890 DEBUG nova.virt.libvirt.vif [None req-fe9c99d5-6c58-439e-8d97-87b5141807a9 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:40:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-InstanceActionsTestJSON-server-201944342',display_name='tempest-InstanceActionsTestJSON-server-201944342',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-instanceactionstestjson-server-201944342',id=70,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:41:09Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c8c509d8e23246e1a509bf2197b73ebf',ramdisk_id='',reservation_id='r-vq75u88a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',ima
ge_min_disk='1',image_min_ram='0',owner_project_name='tempest-InstanceActionsTestJSON-1406975575',owner_user_name='tempest-InstanceActionsTestJSON-1406975575-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:41:13Z,user_data=None,user_id='7e3fb126d8254300b5f6f408fceefb19',uuid=6e4afbc3-37b1-4657-b152-91645facfcca,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b798b69c-a652-408f-810b-0a1d3d9e324c", "address": "fa:16:3e:1f:5b:64", "network": {"id": "ed088f03-9b1a-4d56-98b6-03d264f312c6", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1860463566-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c8c509d8e23246e1a509bf2197b73ebf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb798b69c-a6", "ovs_interfaceid": "b798b69c-a652-408f-810b-0a1d3d9e324c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:41:14 np0005588920 nova_compute[226886]: 2026-01-20 14:41:14.964 226890 DEBUG nova.network.os_vif_util [None req-fe9c99d5-6c58-439e-8d97-87b5141807a9 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Converting VIF {"id": "b798b69c-a652-408f-810b-0a1d3d9e324c", "address": "fa:16:3e:1f:5b:64", "network": {"id": "ed088f03-9b1a-4d56-98b6-03d264f312c6", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1860463566-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c8c509d8e23246e1a509bf2197b73ebf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb798b69c-a6", "ovs_interfaceid": "b798b69c-a652-408f-810b-0a1d3d9e324c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:41:14 np0005588920 nova_compute[226886]: 2026-01-20 14:41:14.965 226890 DEBUG nova.network.os_vif_util [None req-fe9c99d5-6c58-439e-8d97-87b5141807a9 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1f:5b:64,bridge_name='br-int',has_traffic_filtering=True,id=b798b69c-a652-408f-810b-0a1d3d9e324c,network=Network(ed088f03-9b1a-4d56-98b6-03d264f312c6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb798b69c-a6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:41:14 np0005588920 nova_compute[226886]: 2026-01-20 14:41:14.966 226890 DEBUG nova.objects.instance [None req-fe9c99d5-6c58-439e-8d97-87b5141807a9 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Lazy-loading 'pci_devices' on Instance uuid 6e4afbc3-37b1-4657-b152-91645facfcca obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:41:14 np0005588920 nova_compute[226886]: 2026-01-20 14:41:14.989 226890 DEBUG nova.virt.libvirt.driver [None req-fe9c99d5-6c58-439e-8d97-87b5141807a9 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:41:14 np0005588920 nova_compute[226886]:  <uuid>6e4afbc3-37b1-4657-b152-91645facfcca</uuid>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:  <name>instance-00000046</name>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:41:14 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:      <nova:name>tempest-InstanceActionsTestJSON-server-201944342</nova:name>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:41:13</nova:creationTime>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:41:14 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:        <nova:user uuid="7e3fb126d8254300b5f6f408fceefb19">tempest-InstanceActionsTestJSON-1406975575-project-member</nova:user>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:        <nova:project uuid="c8c509d8e23246e1a509bf2197b73ebf">tempest-InstanceActionsTestJSON-1406975575</nova:project>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:        <nova:port uuid="b798b69c-a652-408f-810b-0a1d3d9e324c">
Jan 20 09:41:14 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:      <entry name="serial">6e4afbc3-37b1-4657-b152-91645facfcca</entry>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:      <entry name="uuid">6e4afbc3-37b1-4657-b152-91645facfcca</entry>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:41:14 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/6e4afbc3-37b1-4657-b152-91645facfcca_disk">
Jan 20 09:41:14 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:41:14 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:41:14 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/6e4afbc3-37b1-4657-b152-91645facfcca_disk.config">
Jan 20 09:41:14 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:41:14 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:41:14 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:1f:5b:64"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:      <target dev="tapb798b69c-a6"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:41:14 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/6e4afbc3-37b1-4657-b152-91645facfcca/console.log" append="off"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    <input type="keyboard" bus="usb"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:41:14 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:41:14 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:41:14 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:41:14 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:41:14 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:41:14 np0005588920 nova_compute[226886]: 2026-01-20 14:41:14.991 226890 DEBUG nova.virt.libvirt.driver [None req-fe9c99d5-6c58-439e-8d97-87b5141807a9 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] skipping disk for instance-00000046 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:41:14 np0005588920 nova_compute[226886]: 2026-01-20 14:41:14.992 226890 DEBUG nova.virt.libvirt.driver [None req-fe9c99d5-6c58-439e-8d97-87b5141807a9 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] skipping disk for instance-00000046 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:41:14 np0005588920 nova_compute[226886]: 2026-01-20 14:41:14.994 226890 DEBUG nova.virt.libvirt.vif [None req-fe9c99d5-6c58-439e-8d97-87b5141807a9 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:40:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-InstanceActionsTestJSON-server-201944342',display_name='tempest-InstanceActionsTestJSON-server-201944342',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-instanceactionstestjson-server-201944342',id=70,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:41:09Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=1,progress=0,project_id='c8c509d8e23246e1a509bf2197b73ebf',ramdisk_id='',reservation_id='r-vq75u88a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='v
irtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-InstanceActionsTestJSON-1406975575',owner_user_name='tempest-InstanceActionsTestJSON-1406975575-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:41:13Z,user_data=None,user_id='7e3fb126d8254300b5f6f408fceefb19',uuid=6e4afbc3-37b1-4657-b152-91645facfcca,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b798b69c-a652-408f-810b-0a1d3d9e324c", "address": "fa:16:3e:1f:5b:64", "network": {"id": "ed088f03-9b1a-4d56-98b6-03d264f312c6", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1860463566-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c8c509d8e23246e1a509bf2197b73ebf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb798b69c-a6", "ovs_interfaceid": "b798b69c-a652-408f-810b-0a1d3d9e324c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:41:14 np0005588920 nova_compute[226886]: 2026-01-20 14:41:14.994 226890 DEBUG nova.network.os_vif_util [None req-fe9c99d5-6c58-439e-8d97-87b5141807a9 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Converting VIF {"id": "b798b69c-a652-408f-810b-0a1d3d9e324c", "address": "fa:16:3e:1f:5b:64", "network": {"id": "ed088f03-9b1a-4d56-98b6-03d264f312c6", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1860463566-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c8c509d8e23246e1a509bf2197b73ebf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb798b69c-a6", "ovs_interfaceid": "b798b69c-a652-408f-810b-0a1d3d9e324c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:41:14 np0005588920 nova_compute[226886]: 2026-01-20 14:41:14.996 226890 DEBUG nova.network.os_vif_util [None req-fe9c99d5-6c58-439e-8d97-87b5141807a9 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1f:5b:64,bridge_name='br-int',has_traffic_filtering=True,id=b798b69c-a652-408f-810b-0a1d3d9e324c,network=Network(ed088f03-9b1a-4d56-98b6-03d264f312c6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb798b69c-a6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:41:14 np0005588920 nova_compute[226886]: 2026-01-20 14:41:14.996 226890 DEBUG os_vif [None req-fe9c99d5-6c58-439e-8d97-87b5141807a9 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1f:5b:64,bridge_name='br-int',has_traffic_filtering=True,id=b798b69c-a652-408f-810b-0a1d3d9e324c,network=Network(ed088f03-9b1a-4d56-98b6-03d264f312c6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb798b69c-a6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:41:14 np0005588920 nova_compute[226886]: 2026-01-20 14:41:14.999 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:15 np0005588920 nova_compute[226886]: 2026-01-20 14:41:15.000 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:41:15 np0005588920 nova_compute[226886]: 2026-01-20 14:41:15.001 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:41:15 np0005588920 nova_compute[226886]: 2026-01-20 14:41:15.010 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:15 np0005588920 nova_compute[226886]: 2026-01-20 14:41:15.011 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb798b69c-a6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:41:15 np0005588920 nova_compute[226886]: 2026-01-20 14:41:15.012 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb798b69c-a6, col_values=(('external_ids', {'iface-id': 'b798b69c-a652-408f-810b-0a1d3d9e324c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1f:5b:64', 'vm-uuid': '6e4afbc3-37b1-4657-b152-91645facfcca'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:41:15 np0005588920 NetworkManager[49076]: <info>  [1768920075.0155] manager: (tapb798b69c-a6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/132)
Jan 20 09:41:15 np0005588920 nova_compute[226886]: 2026-01-20 14:41:15.019 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:41:15 np0005588920 nova_compute[226886]: 2026-01-20 14:41:15.021 226890 INFO os_vif [None req-fe9c99d5-6c58-439e-8d97-87b5141807a9 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1f:5b:64,bridge_name='br-int',has_traffic_filtering=True,id=b798b69c-a652-408f-810b-0a1d3d9e324c,network=Network(ed088f03-9b1a-4d56-98b6-03d264f312c6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb798b69c-a6')#033[00m
Jan 20 09:41:15 np0005588920 nova_compute[226886]: 2026-01-20 14:41:15.051 226890 DEBUG nova.compute.manager [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:41:15 np0005588920 nova_compute[226886]: 2026-01-20 14:41:15.053 226890 DEBUG nova.virt.libvirt.driver [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:41:15 np0005588920 nova_compute[226886]: 2026-01-20 14:41:15.053 226890 INFO nova.virt.libvirt.driver [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Creating image(s)#033[00m
Jan 20 09:41:15 np0005588920 nova_compute[226886]: 2026-01-20 14:41:15.081 226890 DEBUG nova.storage.rbd_utils [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] rbd image 8de70a62-d30c-4aa1-90fd-d5f6c551d606_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:41:15 np0005588920 nova_compute[226886]: 2026-01-20 14:41:15.117 226890 DEBUG nova.storage.rbd_utils [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] rbd image 8de70a62-d30c-4aa1-90fd-d5f6c551d606_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:41:15 np0005588920 nova_compute[226886]: 2026-01-20 14:41:15.142 226890 DEBUG nova.storage.rbd_utils [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] rbd image 8de70a62-d30c-4aa1-90fd-d5f6c551d606_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:41:15 np0005588920 nova_compute[226886]: 2026-01-20 14:41:15.145 226890 DEBUG oslo_concurrency.processutils [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:41:15 np0005588920 nova_compute[226886]: 2026-01-20 14:41:15.162 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:15 np0005588920 nova_compute[226886]: 2026-01-20 14:41:15.167 226890 DEBUG nova.policy [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6f68701c3f984f11981d5e1ddaa6f093', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '646f2240b79c44f08af243493552cf0e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:41:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:15.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:15 np0005588920 nova_compute[226886]: 2026-01-20 14:41:15.202 226890 DEBUG oslo_concurrency.processutils [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:41:15 np0005588920 nova_compute[226886]: 2026-01-20 14:41:15.204 226890 DEBUG oslo_concurrency.lockutils [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:41:15 np0005588920 nova_compute[226886]: 2026-01-20 14:41:15.205 226890 DEBUG oslo_concurrency.lockutils [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:41:15 np0005588920 nova_compute[226886]: 2026-01-20 14:41:15.206 226890 DEBUG oslo_concurrency.lockutils [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:41:15 np0005588920 nova_compute[226886]: 2026-01-20 14:41:15.239 226890 DEBUG nova.storage.rbd_utils [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] rbd image 8de70a62-d30c-4aa1-90fd-d5f6c551d606_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:41:15 np0005588920 nova_compute[226886]: 2026-01-20 14:41:15.244 226890 DEBUG oslo_concurrency.processutils [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 8de70a62-d30c-4aa1-90fd-d5f6c551d606_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:41:15 np0005588920 kernel: tapb798b69c-a6: entered promiscuous mode
Jan 20 09:41:15 np0005588920 NetworkManager[49076]: <info>  [1768920075.3209] manager: (tapb798b69c-a6): new Tun device (/org/freedesktop/NetworkManager/Devices/133)
Jan 20 09:41:15 np0005588920 systemd-udevd[251282]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:41:15 np0005588920 ovn_controller[133971]: 2026-01-20T14:41:15Z|00251|binding|INFO|Claiming lport b798b69c-a652-408f-810b-0a1d3d9e324c for this chassis.
Jan 20 09:41:15 np0005588920 ovn_controller[133971]: 2026-01-20T14:41:15Z|00252|binding|INFO|b798b69c-a652-408f-810b-0a1d3d9e324c: Claiming fa:16:3e:1f:5b:64 10.100.0.10
Jan 20 09:41:15 np0005588920 nova_compute[226886]: 2026-01-20 14:41:15.326 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:15.332 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1f:5b:64 10.100.0.10'], port_security=['fa:16:3e:1f:5b:64 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '6e4afbc3-37b1-4657-b152-91645facfcca', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ed088f03-9b1a-4d56-98b6-03d264f312c6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c8c509d8e23246e1a509bf2197b73ebf', 'neutron:revision_number': '5', 'neutron:security_group_ids': '2942ac75-5271-4ea9-ab7b-3d6b61584672', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cf43ed79-9a6d-411c-b553-d9c9674f1bcb, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=b798b69c-a652-408f-810b-0a1d3d9e324c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:15.334 144128 INFO neutron.agent.ovn.metadata.agent [-] Port b798b69c-a652-408f-810b-0a1d3d9e324c in datapath ed088f03-9b1a-4d56-98b6-03d264f312c6 bound to our chassis#033[00m
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:15.335 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ed088f03-9b1a-4d56-98b6-03d264f312c6#033[00m
Jan 20 09:41:15 np0005588920 NetworkManager[49076]: <info>  [1768920075.3379] device (tapb798b69c-a6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:41:15 np0005588920 NetworkManager[49076]: <info>  [1768920075.3385] device (tapb798b69c-a6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:41:15 np0005588920 ovn_controller[133971]: 2026-01-20T14:41:15Z|00253|binding|INFO|Setting lport b798b69c-a652-408f-810b-0a1d3d9e324c ovn-installed in OVS
Jan 20 09:41:15 np0005588920 ovn_controller[133971]: 2026-01-20T14:41:15Z|00254|binding|INFO|Setting lport b798b69c-a652-408f-810b-0a1d3d9e324c up in Southbound
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:15.347 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[90152a7d-192a-4230-90ff-fe0e73da008f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:15.348 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH taped088f03-91 in ovnmeta-ed088f03-9b1a-4d56-98b6-03d264f312c6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:15.350 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface taped088f03-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:15.350 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e18e3eed-e3a3-46f4-ae6c-bf60736c8789]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:15 np0005588920 nova_compute[226886]: 2026-01-20 14:41:15.351 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:15.352 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[fe1dd5b4-e9ec-4b36-985a-5d156197364e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:15 np0005588920 systemd-machined[196121]: New machine qemu-29-instance-00000046.
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:15.361 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[e50ab906-e9f1-44a6-9f4f-1b5b2fe6b588]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:15 np0005588920 systemd[1]: Started Virtual Machine qemu-29-instance-00000046.
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:15.373 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e2a953c4-61cc-449d-ad0c-1059e4eb2469]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:15.396 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[ee54256c-2fef-475f-8c1e-9cbec81022d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:15 np0005588920 NetworkManager[49076]: <info>  [1768920075.4028] manager: (taped088f03-90): new Veth device (/org/freedesktop/NetworkManager/Devices/134)
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:15.402 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ddc7f887-9908-4248-84cd-11a5f07d49b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:15.432 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[6f7f673f-fbec-417f-876d-759556235318]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:15.435 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[da399856-f572-4a9c-b187-08ac7edde644]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:15 np0005588920 NetworkManager[49076]: <info>  [1768920075.4593] device (taped088f03-90): carrier: link connected
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:15.461 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[56879f2b-b004-4c64-b47e-2509149222df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:15.476 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5fb0f1fd-8f5e-4d14-ba9b-56580140e45f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'taped088f03-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:72:85:d1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 80], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 511714, 'reachable_time': 30328, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 251577, 'error': None, 'target': 'ovnmeta-ed088f03-9b1a-4d56-98b6-03d264f312c6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:15.490 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c5a7ecdb-3fe3-4bb6-b083-b43a071d906e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe72:85d1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 511714, 'tstamp': 511714}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 251578, 'error': None, 'target': 'ovnmeta-ed088f03-9b1a-4d56-98b6-03d264f312c6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:15.508 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e89ffb01-ba34-40ef-83f1-f2866fd2f9dd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'taped088f03-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:72:85:d1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 80], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 511714, 'reachable_time': 30328, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 251579, 'error': None, 'target': 'ovnmeta-ed088f03-9b1a-4d56-98b6-03d264f312c6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:15.535 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[847c6747-9242-43c3-973e-10584b48c940]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:15.588 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[cdb05cb5-946e-4e93-bc63-194730227a45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:15.590 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=taped088f03-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:15.590 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:15.590 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=taped088f03-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:41:15 np0005588920 nova_compute[226886]: 2026-01-20 14:41:15.591 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:15 np0005588920 NetworkManager[49076]: <info>  [1768920075.5925] manager: (taped088f03-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/135)
Jan 20 09:41:15 np0005588920 kernel: taped088f03-90: entered promiscuous mode
Jan 20 09:41:15 np0005588920 nova_compute[226886]: 2026-01-20 14:41:15.596 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:15.599 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=taped088f03-90, col_values=(('external_ids', {'iface-id': '5c384a16-e516-4384-a312-07ab9aa7b9a1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:41:15 np0005588920 nova_compute[226886]: 2026-01-20 14:41:15.600 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:15 np0005588920 ovn_controller[133971]: 2026-01-20T14:41:15Z|00255|binding|INFO|Releasing lport 5c384a16-e516-4384-a312-07ab9aa7b9a1 from this chassis (sb_readonly=0)
Jan 20 09:41:15 np0005588920 nova_compute[226886]: 2026-01-20 14:41:15.602 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:15.603 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ed088f03-9b1a-4d56-98b6-03d264f312c6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ed088f03-9b1a-4d56-98b6-03d264f312c6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:15.605 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[380622b7-84d9-4df6-920c-d536bfb95284]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:15.605 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-ed088f03-9b1a-4d56-98b6-03d264f312c6
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/ed088f03-9b1a-4d56-98b6-03d264f312c6.pid.haproxy
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID ed088f03-9b1a-4d56-98b6-03d264f312c6
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:41:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:15.606 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ed088f03-9b1a-4d56-98b6-03d264f312c6', 'env', 'PROCESS_TAG=haproxy-ed088f03-9b1a-4d56-98b6-03d264f312c6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ed088f03-9b1a-4d56-98b6-03d264f312c6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:41:15 np0005588920 nova_compute[226886]: 2026-01-20 14:41:15.615 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:15 np0005588920 nova_compute[226886]: 2026-01-20 14:41:15.799 226890 DEBUG nova.virt.libvirt.host [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Removed pending event for 6e4afbc3-37b1-4657-b152-91645facfcca due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 20 09:41:15 np0005588920 nova_compute[226886]: 2026-01-20 14:41:15.800 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920075.79928, 6e4afbc3-37b1-4657-b152-91645facfcca => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:41:15 np0005588920 nova_compute[226886]: 2026-01-20 14:41:15.800 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:41:15 np0005588920 nova_compute[226886]: 2026-01-20 14:41:15.811 226890 DEBUG nova.compute.manager [None req-fe9c99d5-6c58-439e-8d97-87b5141807a9 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:41:15 np0005588920 nova_compute[226886]: 2026-01-20 14:41:15.814 226890 INFO nova.virt.libvirt.driver [-] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Instance rebooted successfully.#033[00m
Jan 20 09:41:15 np0005588920 nova_compute[226886]: 2026-01-20 14:41:15.815 226890 DEBUG nova.compute.manager [None req-fe9c99d5-6c58-439e-8d97-87b5141807a9 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:41:15 np0005588920 nova_compute[226886]: 2026-01-20 14:41:15.861 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:41:15 np0005588920 nova_compute[226886]: 2026-01-20 14:41:15.866 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started_hard, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:41:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:15 np0005588920 nova_compute[226886]: 2026-01-20 14:41:15.892 226890 DEBUG oslo_concurrency.lockutils [None req-fe9c99d5-6c58-439e-8d97-87b5141807a9 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Lock "6e4afbc3-37b1-4657-b152-91645facfcca" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 5.030s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:41:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:15.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:15 np0005588920 nova_compute[226886]: 2026-01-20 14:41:15.895 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920075.8116062, 6e4afbc3-37b1-4657-b152-91645facfcca => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:41:15 np0005588920 nova_compute[226886]: 2026-01-20 14:41:15.896 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] VM Started (Lifecycle Event)#033[00m
Jan 20 09:41:15 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:41:15 np0005588920 nova_compute[226886]: 2026-01-20 14:41:15.918 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:41:15 np0005588920 nova_compute[226886]: 2026-01-20 14:41:15.923 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:41:16 np0005588920 nova_compute[226886]: 2026-01-20 14:41:16.006 226890 DEBUG nova.network.neutron [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Successfully created port: 4de40e08-0741-4eb2-99b6-d4cc48958b1e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:41:16 np0005588920 podman[251656]: 2026-01-20 14:41:15.97924897 +0000 UTC m=+0.025901833 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:41:16 np0005588920 podman[251656]: 2026-01-20 14:41:16.088084825 +0000 UTC m=+0.134737678 container create 2cc5910d2d5816202a958f12856eff52383829b4991cf81b0e2f2a4eb1da4d60 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ed088f03-9b1a-4d56-98b6-03d264f312c6, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 20 09:41:16 np0005588920 nova_compute[226886]: 2026-01-20 14:41:16.123 226890 DEBUG oslo_concurrency.processutils [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 8de70a62-d30c-4aa1-90fd-d5f6c551d606_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.879s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:41:16 np0005588920 systemd[1]: Started libpod-conmon-2cc5910d2d5816202a958f12856eff52383829b4991cf81b0e2f2a4eb1da4d60.scope.
Jan 20 09:41:16 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:41:16 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bbee24f9c339fe0f3f8236ebd7adad325d405973c449e877ad5e5e6148aebbdf/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:41:16 np0005588920 nova_compute[226886]: 2026-01-20 14:41:16.200 226890 DEBUG nova.storage.rbd_utils [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] resizing rbd image 8de70a62-d30c-4aa1-90fd-d5f6c551d606_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:41:16 np0005588920 podman[251656]: 2026-01-20 14:41:16.405891345 +0000 UTC m=+0.452544238 container init 2cc5910d2d5816202a958f12856eff52383829b4991cf81b0e2f2a4eb1da4d60 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ed088f03-9b1a-4d56-98b6-03d264f312c6, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 20 09:41:16 np0005588920 podman[251656]: 2026-01-20 14:41:16.411566143 +0000 UTC m=+0.458218996 container start 2cc5910d2d5816202a958f12856eff52383829b4991cf81b0e2f2a4eb1da4d60 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ed088f03-9b1a-4d56-98b6-03d264f312c6, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 20 09:41:16 np0005588920 neutron-haproxy-ovnmeta-ed088f03-9b1a-4d56-98b6-03d264f312c6[251679]: [NOTICE]   (251730) : New worker (251732) forked
Jan 20 09:41:16 np0005588920 neutron-haproxy-ovnmeta-ed088f03-9b1a-4d56-98b6-03d264f312c6[251679]: [NOTICE]   (251730) : Loading success.
Jan 20 09:41:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:16.444 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:41:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:16.445 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:41:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:16.445 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:41:16 np0005588920 nova_compute[226886]: 2026-01-20 14:41:16.584 226890 DEBUG nova.compute.manager [req-78c849df-6664-4577-a589-dd8b9df98a25 req-d15d574b-d2d2-46ab-9ae6-e398be01b8ae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Received event network-vif-plugged-b798b69c-a652-408f-810b-0a1d3d9e324c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:41:16 np0005588920 nova_compute[226886]: 2026-01-20 14:41:16.585 226890 DEBUG oslo_concurrency.lockutils [req-78c849df-6664-4577-a589-dd8b9df98a25 req-d15d574b-d2d2-46ab-9ae6-e398be01b8ae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "6e4afbc3-37b1-4657-b152-91645facfcca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:41:16 np0005588920 nova_compute[226886]: 2026-01-20 14:41:16.585 226890 DEBUG oslo_concurrency.lockutils [req-78c849df-6664-4577-a589-dd8b9df98a25 req-d15d574b-d2d2-46ab-9ae6-e398be01b8ae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "6e4afbc3-37b1-4657-b152-91645facfcca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:41:16 np0005588920 nova_compute[226886]: 2026-01-20 14:41:16.586 226890 DEBUG oslo_concurrency.lockutils [req-78c849df-6664-4577-a589-dd8b9df98a25 req-d15d574b-d2d2-46ab-9ae6-e398be01b8ae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "6e4afbc3-37b1-4657-b152-91645facfcca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:41:16 np0005588920 nova_compute[226886]: 2026-01-20 14:41:16.586 226890 DEBUG nova.compute.manager [req-78c849df-6664-4577-a589-dd8b9df98a25 req-d15d574b-d2d2-46ab-9ae6-e398be01b8ae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] No waiting events found dispatching network-vif-plugged-b798b69c-a652-408f-810b-0a1d3d9e324c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:41:16 np0005588920 nova_compute[226886]: 2026-01-20 14:41:16.586 226890 WARNING nova.compute.manager [req-78c849df-6664-4577-a589-dd8b9df98a25 req-d15d574b-d2d2-46ab-9ae6-e398be01b8ae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Received unexpected event network-vif-plugged-b798b69c-a652-408f-810b-0a1d3d9e324c for instance with vm_state active and task_state None.#033[00m
Jan 20 09:41:16 np0005588920 nova_compute[226886]: 2026-01-20 14:41:16.586 226890 DEBUG nova.compute.manager [req-78c849df-6664-4577-a589-dd8b9df98a25 req-d15d574b-d2d2-46ab-9ae6-e398be01b8ae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Received event network-vif-plugged-b798b69c-a652-408f-810b-0a1d3d9e324c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:41:16 np0005588920 nova_compute[226886]: 2026-01-20 14:41:16.587 226890 DEBUG oslo_concurrency.lockutils [req-78c849df-6664-4577-a589-dd8b9df98a25 req-d15d574b-d2d2-46ab-9ae6-e398be01b8ae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "6e4afbc3-37b1-4657-b152-91645facfcca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:41:16 np0005588920 nova_compute[226886]: 2026-01-20 14:41:16.587 226890 DEBUG oslo_concurrency.lockutils [req-78c849df-6664-4577-a589-dd8b9df98a25 req-d15d574b-d2d2-46ab-9ae6-e398be01b8ae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "6e4afbc3-37b1-4657-b152-91645facfcca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:41:16 np0005588920 nova_compute[226886]: 2026-01-20 14:41:16.587 226890 DEBUG oslo_concurrency.lockutils [req-78c849df-6664-4577-a589-dd8b9df98a25 req-d15d574b-d2d2-46ab-9ae6-e398be01b8ae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "6e4afbc3-37b1-4657-b152-91645facfcca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:41:16 np0005588920 nova_compute[226886]: 2026-01-20 14:41:16.588 226890 DEBUG nova.compute.manager [req-78c849df-6664-4577-a589-dd8b9df98a25 req-d15d574b-d2d2-46ab-9ae6-e398be01b8ae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] No waiting events found dispatching network-vif-plugged-b798b69c-a652-408f-810b-0a1d3d9e324c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:41:16 np0005588920 nova_compute[226886]: 2026-01-20 14:41:16.588 226890 WARNING nova.compute.manager [req-78c849df-6664-4577-a589-dd8b9df98a25 req-d15d574b-d2d2-46ab-9ae6-e398be01b8ae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Received unexpected event network-vif-plugged-b798b69c-a652-408f-810b-0a1d3d9e324c for instance with vm_state active and task_state None.#033[00m
Jan 20 09:41:16 np0005588920 nova_compute[226886]: 2026-01-20 14:41:16.655 226890 DEBUG nova.objects.instance [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Lazy-loading 'migration_context' on Instance uuid 8de70a62-d30c-4aa1-90fd-d5f6c551d606 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:41:16 np0005588920 nova_compute[226886]: 2026-01-20 14:41:16.680 226890 DEBUG nova.virt.libvirt.driver [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:41:16 np0005588920 nova_compute[226886]: 2026-01-20 14:41:16.680 226890 DEBUG nova.virt.libvirt.driver [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Ensure instance console log exists: /var/lib/nova/instances/8de70a62-d30c-4aa1-90fd-d5f6c551d606/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:41:16 np0005588920 nova_compute[226886]: 2026-01-20 14:41:16.680 226890 DEBUG oslo_concurrency.lockutils [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:41:16 np0005588920 nova_compute[226886]: 2026-01-20 14:41:16.681 226890 DEBUG oslo_concurrency.lockutils [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:41:16 np0005588920 nova_compute[226886]: 2026-01-20 14:41:16.681 226890 DEBUG oslo_concurrency.lockutils [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:41:16 np0005588920 nova_compute[226886]: 2026-01-20 14:41:16.885 226890 DEBUG nova.network.neutron [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Successfully updated port: 4de40e08-0741-4eb2-99b6-d4cc48958b1e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:41:16 np0005588920 nova_compute[226886]: 2026-01-20 14:41:16.977 226890 DEBUG oslo_concurrency.lockutils [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Acquiring lock "refresh_cache-8de70a62-d30c-4aa1-90fd-d5f6c551d606" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:41:16 np0005588920 nova_compute[226886]: 2026-01-20 14:41:16.977 226890 DEBUG oslo_concurrency.lockutils [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Acquired lock "refresh_cache-8de70a62-d30c-4aa1-90fd-d5f6c551d606" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:41:16 np0005588920 nova_compute[226886]: 2026-01-20 14:41:16.978 226890 DEBUG nova.network.neutron [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:41:17 np0005588920 nova_compute[226886]: 2026-01-20 14:41:17.121 226890 DEBUG nova.network.neutron [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:41:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:17.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:17 np0005588920 nova_compute[226886]: 2026-01-20 14:41:17.226 226890 DEBUG oslo_concurrency.lockutils [None req-d23c7d82-fda4-40e0-855f-d5476c9cef09 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Acquiring lock "6e4afbc3-37b1-4657-b152-91645facfcca" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:41:17 np0005588920 nova_compute[226886]: 2026-01-20 14:41:17.227 226890 DEBUG oslo_concurrency.lockutils [None req-d23c7d82-fda4-40e0-855f-d5476c9cef09 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Lock "6e4afbc3-37b1-4657-b152-91645facfcca" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:41:17 np0005588920 nova_compute[226886]: 2026-01-20 14:41:17.229 226890 DEBUG oslo_concurrency.lockutils [None req-d23c7d82-fda4-40e0-855f-d5476c9cef09 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Acquiring lock "6e4afbc3-37b1-4657-b152-91645facfcca-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:41:17 np0005588920 nova_compute[226886]: 2026-01-20 14:41:17.230 226890 DEBUG oslo_concurrency.lockutils [None req-d23c7d82-fda4-40e0-855f-d5476c9cef09 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Lock "6e4afbc3-37b1-4657-b152-91645facfcca-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:41:17 np0005588920 nova_compute[226886]: 2026-01-20 14:41:17.230 226890 DEBUG oslo_concurrency.lockutils [None req-d23c7d82-fda4-40e0-855f-d5476c9cef09 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Lock "6e4afbc3-37b1-4657-b152-91645facfcca-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:41:17 np0005588920 nova_compute[226886]: 2026-01-20 14:41:17.231 226890 INFO nova.compute.manager [None req-d23c7d82-fda4-40e0-855f-d5476c9cef09 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Terminating instance#033[00m
Jan 20 09:41:17 np0005588920 nova_compute[226886]: 2026-01-20 14:41:17.233 226890 DEBUG nova.compute.manager [None req-d23c7d82-fda4-40e0-855f-d5476c9cef09 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:41:17 np0005588920 kernel: tapb798b69c-a6 (unregistering): left promiscuous mode
Jan 20 09:41:17 np0005588920 NetworkManager[49076]: <info>  [1768920077.5786] device (tapb798b69c-a6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:41:17 np0005588920 nova_compute[226886]: 2026-01-20 14:41:17.592 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:17 np0005588920 ovn_controller[133971]: 2026-01-20T14:41:17Z|00256|binding|INFO|Releasing lport b798b69c-a652-408f-810b-0a1d3d9e324c from this chassis (sb_readonly=0)
Jan 20 09:41:17 np0005588920 ovn_controller[133971]: 2026-01-20T14:41:17Z|00257|binding|INFO|Setting lport b798b69c-a652-408f-810b-0a1d3d9e324c down in Southbound
Jan 20 09:41:17 np0005588920 ovn_controller[133971]: 2026-01-20T14:41:17Z|00258|binding|INFO|Removing iface tapb798b69c-a6 ovn-installed in OVS
Jan 20 09:41:17 np0005588920 nova_compute[226886]: 2026-01-20 14:41:17.595 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:17.617 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1f:5b:64 10.100.0.10'], port_security=['fa:16:3e:1f:5b:64 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '6e4afbc3-37b1-4657-b152-91645facfcca', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ed088f03-9b1a-4d56-98b6-03d264f312c6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c8c509d8e23246e1a509bf2197b73ebf', 'neutron:revision_number': '6', 'neutron:security_group_ids': '2942ac75-5271-4ea9-ab7b-3d6b61584672', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cf43ed79-9a6d-411c-b553-d9c9674f1bcb, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=b798b69c-a652-408f-810b-0a1d3d9e324c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:41:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:17.619 144128 INFO neutron.agent.ovn.metadata.agent [-] Port b798b69c-a652-408f-810b-0a1d3d9e324c in datapath ed088f03-9b1a-4d56-98b6-03d264f312c6 unbound from our chassis#033[00m
Jan 20 09:41:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:17.622 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ed088f03-9b1a-4d56-98b6-03d264f312c6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:41:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:17.623 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[de7daca1-cff3-4286-90fd-300b5e62a975]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:17.624 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ed088f03-9b1a-4d56-98b6-03d264f312c6 namespace which is not needed anymore#033[00m
Jan 20 09:41:17 np0005588920 nova_compute[226886]: 2026-01-20 14:41:17.628 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:17 np0005588920 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d00000046.scope: Deactivated successfully.
Jan 20 09:41:17 np0005588920 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d00000046.scope: Consumed 1.912s CPU time.
Jan 20 09:41:17 np0005588920 systemd-machined[196121]: Machine qemu-29-instance-00000046 terminated.
Jan 20 09:41:17 np0005588920 nova_compute[226886]: 2026-01-20 14:41:17.869 226890 INFO nova.virt.libvirt.driver [-] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Instance destroyed successfully.#033[00m
Jan 20 09:41:17 np0005588920 nova_compute[226886]: 2026-01-20 14:41:17.870 226890 DEBUG nova.objects.instance [None req-d23c7d82-fda4-40e0-855f-d5476c9cef09 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Lazy-loading 'resources' on Instance uuid 6e4afbc3-37b1-4657-b152-91645facfcca obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:41:17 np0005588920 nova_compute[226886]: 2026-01-20 14:41:17.886 226890 DEBUG nova.virt.libvirt.vif [None req-d23c7d82-fda4-40e0-855f-d5476c9cef09 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:40:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-InstanceActionsTestJSON-server-201944342',display_name='tempest-InstanceActionsTestJSON-server-201944342',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-instanceactionstestjson-server-201944342',id=70,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:41:09Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c8c509d8e23246e1a509bf2197b73ebf',ramdisk_id='',reservation_id='r-vq75u88a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-InstanceActionsTestJSON-1406975575',owner_user_name='tempest-InstanceActionsTestJSON-1406975575-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:41:15Z,user_data=None,user_id='7e3fb126d8254300b5f6f408fceefb19',uuid=6e4afbc3-37b1-4657-b152-91645facfcca,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b798b69c-a652-408f-810b-0a1d3d9e324c", "address": "fa:16:3e:1f:5b:64", "network": {"id": "ed088f03-9b1a-4d56-98b6-03d264f312c6", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1860463566-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c8c509d8e23246e1a509bf2197b73ebf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb798b69c-a6", "ovs_interfaceid": "b798b69c-a652-408f-810b-0a1d3d9e324c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:41:17 np0005588920 nova_compute[226886]: 2026-01-20 14:41:17.886 226890 DEBUG nova.network.os_vif_util [None req-d23c7d82-fda4-40e0-855f-d5476c9cef09 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Converting VIF {"id": "b798b69c-a652-408f-810b-0a1d3d9e324c", "address": "fa:16:3e:1f:5b:64", "network": {"id": "ed088f03-9b1a-4d56-98b6-03d264f312c6", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1860463566-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c8c509d8e23246e1a509bf2197b73ebf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb798b69c-a6", "ovs_interfaceid": "b798b69c-a652-408f-810b-0a1d3d9e324c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:41:17 np0005588920 nova_compute[226886]: 2026-01-20 14:41:17.887 226890 DEBUG nova.network.os_vif_util [None req-d23c7d82-fda4-40e0-855f-d5476c9cef09 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1f:5b:64,bridge_name='br-int',has_traffic_filtering=True,id=b798b69c-a652-408f-810b-0a1d3d9e324c,network=Network(ed088f03-9b1a-4d56-98b6-03d264f312c6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb798b69c-a6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:41:17 np0005588920 nova_compute[226886]: 2026-01-20 14:41:17.887 226890 DEBUG os_vif [None req-d23c7d82-fda4-40e0-855f-d5476c9cef09 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1f:5b:64,bridge_name='br-int',has_traffic_filtering=True,id=b798b69c-a652-408f-810b-0a1d3d9e324c,network=Network(ed088f03-9b1a-4d56-98b6-03d264f312c6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb798b69c-a6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:41:17 np0005588920 nova_compute[226886]: 2026-01-20 14:41:17.888 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:17 np0005588920 nova_compute[226886]: 2026-01-20 14:41:17.889 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb798b69c-a6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:41:17 np0005588920 nova_compute[226886]: 2026-01-20 14:41:17.890 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:17 np0005588920 nova_compute[226886]: 2026-01-20 14:41:17.892 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:41:17 np0005588920 nova_compute[226886]: 2026-01-20 14:41:17.894 226890 INFO os_vif [None req-d23c7d82-fda4-40e0-855f-d5476c9cef09 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1f:5b:64,bridge_name='br-int',has_traffic_filtering=True,id=b798b69c-a652-408f-810b-0a1d3d9e324c,network=Network(ed088f03-9b1a-4d56-98b6-03d264f312c6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb798b69c-a6')#033[00m
Jan 20 09:41:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:17.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:17 np0005588920 neutron-haproxy-ovnmeta-ed088f03-9b1a-4d56-98b6-03d264f312c6[251679]: [NOTICE]   (251730) : haproxy version is 2.8.14-c23fe91
Jan 20 09:41:17 np0005588920 neutron-haproxy-ovnmeta-ed088f03-9b1a-4d56-98b6-03d264f312c6[251679]: [NOTICE]   (251730) : path to executable is /usr/sbin/haproxy
Jan 20 09:41:17 np0005588920 neutron-haproxy-ovnmeta-ed088f03-9b1a-4d56-98b6-03d264f312c6[251679]: [WARNING]  (251730) : Exiting Master process...
Jan 20 09:41:17 np0005588920 neutron-haproxy-ovnmeta-ed088f03-9b1a-4d56-98b6-03d264f312c6[251679]: [ALERT]    (251730) : Current worker (251732) exited with code 143 (Terminated)
Jan 20 09:41:17 np0005588920 neutron-haproxy-ovnmeta-ed088f03-9b1a-4d56-98b6-03d264f312c6[251679]: [WARNING]  (251730) : All workers exited. Exiting... (0)
Jan 20 09:41:17 np0005588920 systemd[1]: libpod-2cc5910d2d5816202a958f12856eff52383829b4991cf81b0e2f2a4eb1da4d60.scope: Deactivated successfully.
Jan 20 09:41:17 np0005588920 conmon[251679]: conmon 2cc5910d2d5816202a95 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-2cc5910d2d5816202a958f12856eff52383829b4991cf81b0e2f2a4eb1da4d60.scope/container/memory.events
Jan 20 09:41:17 np0005588920 podman[251784]: 2026-01-20 14:41:17.917479055 +0000 UTC m=+0.205769988 container died 2cc5910d2d5816202a958f12856eff52383829b4991cf81b0e2f2a4eb1da4d60 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ed088f03-9b1a-4d56-98b6-03d264f312c6, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:41:17 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2cc5910d2d5816202a958f12856eff52383829b4991cf81b0e2f2a4eb1da4d60-userdata-shm.mount: Deactivated successfully.
Jan 20 09:41:17 np0005588920 systemd[1]: var-lib-containers-storage-overlay-bbee24f9c339fe0f3f8236ebd7adad325d405973c449e877ad5e5e6148aebbdf-merged.mount: Deactivated successfully.
Jan 20 09:41:17 np0005588920 podman[251784]: 2026-01-20 14:41:17.969710761 +0000 UTC m=+0.258001694 container cleanup 2cc5910d2d5816202a958f12856eff52383829b4991cf81b0e2f2a4eb1da4d60 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ed088f03-9b1a-4d56-98b6-03d264f312c6, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 20 09:41:17 np0005588920 systemd[1]: libpod-conmon-2cc5910d2d5816202a958f12856eff52383829b4991cf81b0e2f2a4eb1da4d60.scope: Deactivated successfully.
Jan 20 09:41:18 np0005588920 podman[251845]: 2026-01-20 14:41:18.035073833 +0000 UTC m=+0.042560658 container remove 2cc5910d2d5816202a958f12856eff52383829b4991cf81b0e2f2a4eb1da4d60 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ed088f03-9b1a-4d56-98b6-03d264f312c6, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 20 09:41:18 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:18.041 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f51e5e84-2909-4080-8eea-c96ab834e1fe]: (4, ('Tue Jan 20 02:41:17 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ed088f03-9b1a-4d56-98b6-03d264f312c6 (2cc5910d2d5816202a958f12856eff52383829b4991cf81b0e2f2a4eb1da4d60)\n2cc5910d2d5816202a958f12856eff52383829b4991cf81b0e2f2a4eb1da4d60\nTue Jan 20 02:41:17 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ed088f03-9b1a-4d56-98b6-03d264f312c6 (2cc5910d2d5816202a958f12856eff52383829b4991cf81b0e2f2a4eb1da4d60)\n2cc5910d2d5816202a958f12856eff52383829b4991cf81b0e2f2a4eb1da4d60\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:18 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:18.042 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[885584e5-a953-48f3-91f4-3553dc145669]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:18 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:18.043 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=taped088f03-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:41:18 np0005588920 nova_compute[226886]: 2026-01-20 14:41:18.045 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:18 np0005588920 kernel: taped088f03-90: left promiscuous mode
Jan 20 09:41:18 np0005588920 nova_compute[226886]: 2026-01-20 14:41:18.047 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:18 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:18.049 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[80646228-7ce7-437c-b395-2820a2431394]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:18 np0005588920 nova_compute[226886]: 2026-01-20 14:41:18.060 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:18 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:18.066 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f4d5776f-d6d4-4ee3-b446-c37b77248351]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:18 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:18.067 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ff6efed2-fea1-4dc2-9d5f-1cd70bd9a2ea]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:18 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:18.083 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[9e53ea93-6ec7-4652-830d-67b53e5645ea]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 511707, 'reachable_time': 41275, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 251860, 'error': None, 'target': 'ovnmeta-ed088f03-9b1a-4d56-98b6-03d264f312c6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:18 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:18.086 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ed088f03-9b1a-4d56-98b6-03d264f312c6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:41:18 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:18.086 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[06ee691a-52a7-47e1-ab7b-bb0863287bd4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:18 np0005588920 systemd[1]: run-netns-ovnmeta\x2ded088f03\x2d9b1a\x2d4d56\x2d98b6\x2d03d264f312c6.mount: Deactivated successfully.
Jan 20 09:41:18 np0005588920 nova_compute[226886]: 2026-01-20 14:41:18.141 226890 DEBUG nova.network.neutron [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Updating instance_info_cache with network_info: [{"id": "4de40e08-0741-4eb2-99b6-d4cc48958b1e", "address": "fa:16:3e:74:90:ac", "network": {"id": "fb86544d-3856-43a6-b7ec-b1e8666198f0", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-1876154185-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "646f2240b79c44f08af243493552cf0e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4de40e08-07", "ovs_interfaceid": "4de40e08-0741-4eb2-99b6-d4cc48958b1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:41:18 np0005588920 nova_compute[226886]: 2026-01-20 14:41:18.209 226890 DEBUG oslo_concurrency.lockutils [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Releasing lock "refresh_cache-8de70a62-d30c-4aa1-90fd-d5f6c551d606" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:41:18 np0005588920 nova_compute[226886]: 2026-01-20 14:41:18.209 226890 DEBUG nova.compute.manager [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Instance network_info: |[{"id": "4de40e08-0741-4eb2-99b6-d4cc48958b1e", "address": "fa:16:3e:74:90:ac", "network": {"id": "fb86544d-3856-43a6-b7ec-b1e8666198f0", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-1876154185-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "646f2240b79c44f08af243493552cf0e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4de40e08-07", "ovs_interfaceid": "4de40e08-0741-4eb2-99b6-d4cc48958b1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:41:18 np0005588920 nova_compute[226886]: 2026-01-20 14:41:18.211 226890 DEBUG nova.virt.libvirt.driver [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Start _get_guest_xml network_info=[{"id": "4de40e08-0741-4eb2-99b6-d4cc48958b1e", "address": "fa:16:3e:74:90:ac", "network": {"id": "fb86544d-3856-43a6-b7ec-b1e8666198f0", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-1876154185-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "646f2240b79c44f08af243493552cf0e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4de40e08-07", "ovs_interfaceid": "4de40e08-0741-4eb2-99b6-d4cc48958b1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:41:18 np0005588920 nova_compute[226886]: 2026-01-20 14:41:18.215 226890 WARNING nova.virt.libvirt.driver [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:41:18 np0005588920 nova_compute[226886]: 2026-01-20 14:41:18.220 226890 DEBUG nova.virt.libvirt.host [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:41:18 np0005588920 nova_compute[226886]: 2026-01-20 14:41:18.220 226890 DEBUG nova.virt.libvirt.host [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:41:18 np0005588920 nova_compute[226886]: 2026-01-20 14:41:18.224 226890 DEBUG nova.virt.libvirt.host [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:41:18 np0005588920 nova_compute[226886]: 2026-01-20 14:41:18.225 226890 DEBUG nova.virt.libvirt.host [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:41:18 np0005588920 nova_compute[226886]: 2026-01-20 14:41:18.226 226890 DEBUG nova.virt.libvirt.driver [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:41:18 np0005588920 nova_compute[226886]: 2026-01-20 14:41:18.226 226890 DEBUG nova.virt.hardware [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:41:18 np0005588920 nova_compute[226886]: 2026-01-20 14:41:18.226 226890 DEBUG nova.virt.hardware [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:41:18 np0005588920 nova_compute[226886]: 2026-01-20 14:41:18.226 226890 DEBUG nova.virt.hardware [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:41:18 np0005588920 nova_compute[226886]: 2026-01-20 14:41:18.226 226890 DEBUG nova.virt.hardware [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:41:18 np0005588920 nova_compute[226886]: 2026-01-20 14:41:18.227 226890 DEBUG nova.virt.hardware [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:41:18 np0005588920 nova_compute[226886]: 2026-01-20 14:41:18.227 226890 DEBUG nova.virt.hardware [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:41:18 np0005588920 nova_compute[226886]: 2026-01-20 14:41:18.227 226890 DEBUG nova.virt.hardware [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:41:18 np0005588920 nova_compute[226886]: 2026-01-20 14:41:18.227 226890 DEBUG nova.virt.hardware [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:41:18 np0005588920 nova_compute[226886]: 2026-01-20 14:41:18.227 226890 DEBUG nova.virt.hardware [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:41:18 np0005588920 nova_compute[226886]: 2026-01-20 14:41:18.227 226890 DEBUG nova.virt.hardware [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:41:18 np0005588920 nova_compute[226886]: 2026-01-20 14:41:18.227 226890 DEBUG nova.virt.hardware [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:41:18 np0005588920 nova_compute[226886]: 2026-01-20 14:41:18.230 226890 DEBUG oslo_concurrency.processutils [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:41:18 np0005588920 nova_compute[226886]: 2026-01-20 14:41:18.336 226890 DEBUG nova.compute.manager [req-1d30f8ad-9cd9-4c87-8a02-c127c2a67fb1 req-353debe9-08f3-4c84-b766-581dcde63b1d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Received event network-changed-4de40e08-0741-4eb2-99b6-d4cc48958b1e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:41:18 np0005588920 nova_compute[226886]: 2026-01-20 14:41:18.337 226890 DEBUG nova.compute.manager [req-1d30f8ad-9cd9-4c87-8a02-c127c2a67fb1 req-353debe9-08f3-4c84-b766-581dcde63b1d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Refreshing instance network info cache due to event network-changed-4de40e08-0741-4eb2-99b6-d4cc48958b1e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:41:18 np0005588920 nova_compute[226886]: 2026-01-20 14:41:18.337 226890 DEBUG oslo_concurrency.lockutils [req-1d30f8ad-9cd9-4c87-8a02-c127c2a67fb1 req-353debe9-08f3-4c84-b766-581dcde63b1d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-8de70a62-d30c-4aa1-90fd-d5f6c551d606" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:41:18 np0005588920 nova_compute[226886]: 2026-01-20 14:41:18.337 226890 DEBUG oslo_concurrency.lockutils [req-1d30f8ad-9cd9-4c87-8a02-c127c2a67fb1 req-353debe9-08f3-4c84-b766-581dcde63b1d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-8de70a62-d30c-4aa1-90fd-d5f6c551d606" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:41:18 np0005588920 nova_compute[226886]: 2026-01-20 14:41:18.337 226890 DEBUG nova.network.neutron [req-1d30f8ad-9cd9-4c87-8a02-c127c2a67fb1 req-353debe9-08f3-4c84-b766-581dcde63b1d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Refreshing network info cache for port 4de40e08-0741-4eb2-99b6-d4cc48958b1e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:41:18 np0005588920 nova_compute[226886]: 2026-01-20 14:41:18.472 226890 INFO nova.virt.libvirt.driver [None req-d23c7d82-fda4-40e0-855f-d5476c9cef09 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Deleting instance files /var/lib/nova/instances/6e4afbc3-37b1-4657-b152-91645facfcca_del#033[00m
Jan 20 09:41:18 np0005588920 nova_compute[226886]: 2026-01-20 14:41:18.473 226890 INFO nova.virt.libvirt.driver [None req-d23c7d82-fda4-40e0-855f-d5476c9cef09 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Deletion of /var/lib/nova/instances/6e4afbc3-37b1-4657-b152-91645facfcca_del complete#033[00m
Jan 20 09:41:18 np0005588920 nova_compute[226886]: 2026-01-20 14:41:18.528 226890 INFO nova.compute.manager [None req-d23c7d82-fda4-40e0-855f-d5476c9cef09 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Took 1.29 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:41:18 np0005588920 nova_compute[226886]: 2026-01-20 14:41:18.528 226890 DEBUG oslo.service.loopingcall [None req-d23c7d82-fda4-40e0-855f-d5476c9cef09 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:41:18 np0005588920 nova_compute[226886]: 2026-01-20 14:41:18.529 226890 DEBUG nova.compute.manager [-] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:41:18 np0005588920 nova_compute[226886]: 2026-01-20 14:41:18.529 226890 DEBUG nova.network.neutron [-] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:41:18 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:41:18 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2318344812' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:41:18 np0005588920 nova_compute[226886]: 2026-01-20 14:41:18.653 226890 DEBUG oslo_concurrency.processutils [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:41:18 np0005588920 nova_compute[226886]: 2026-01-20 14:41:18.677 226890 DEBUG nova.storage.rbd_utils [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] rbd image 8de70a62-d30c-4aa1-90fd-d5f6c551d606_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:41:18 np0005588920 nova_compute[226886]: 2026-01-20 14:41:18.680 226890 DEBUG oslo_concurrency.processutils [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:41:19 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:41:19 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/906151994' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:41:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:19.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:19 np0005588920 nova_compute[226886]: 2026-01-20 14:41:19.309 226890 DEBUG oslo_concurrency.processutils [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.629s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:41:19 np0005588920 nova_compute[226886]: 2026-01-20 14:41:19.311 226890 DEBUG nova.virt.libvirt.vif [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:41:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-InstanceActionsNegativeTestJSON-server-280212278',display_name='tempest-InstanceActionsNegativeTestJSON-server-280212278',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-instanceactionsnegativetestjson-server-280212278',id=72,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='646f2240b79c44f08af243493552cf0e',ramdisk_id='',reservation_id='r-auq5vcve',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-InstanceActionsNegativeTestJSON-1932198437',owner_u
ser_name='tempest-InstanceActionsNegativeTestJSON-1932198437-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:41:14Z,user_data=None,user_id='6f68701c3f984f11981d5e1ddaa6f093',uuid=8de70a62-d30c-4aa1-90fd-d5f6c551d606,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4de40e08-0741-4eb2-99b6-d4cc48958b1e", "address": "fa:16:3e:74:90:ac", "network": {"id": "fb86544d-3856-43a6-b7ec-b1e8666198f0", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-1876154185-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "646f2240b79c44f08af243493552cf0e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4de40e08-07", "ovs_interfaceid": "4de40e08-0741-4eb2-99b6-d4cc48958b1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:41:19 np0005588920 nova_compute[226886]: 2026-01-20 14:41:19.311 226890 DEBUG nova.network.os_vif_util [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Converting VIF {"id": "4de40e08-0741-4eb2-99b6-d4cc48958b1e", "address": "fa:16:3e:74:90:ac", "network": {"id": "fb86544d-3856-43a6-b7ec-b1e8666198f0", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-1876154185-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "646f2240b79c44f08af243493552cf0e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4de40e08-07", "ovs_interfaceid": "4de40e08-0741-4eb2-99b6-d4cc48958b1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:41:19 np0005588920 nova_compute[226886]: 2026-01-20 14:41:19.312 226890 DEBUG nova.network.os_vif_util [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:74:90:ac,bridge_name='br-int',has_traffic_filtering=True,id=4de40e08-0741-4eb2-99b6-d4cc48958b1e,network=Network(fb86544d-3856-43a6-b7ec-b1e8666198f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4de40e08-07') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:41:19 np0005588920 nova_compute[226886]: 2026-01-20 14:41:19.313 226890 DEBUG nova.objects.instance [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Lazy-loading 'pci_devices' on Instance uuid 8de70a62-d30c-4aa1-90fd-d5f6c551d606 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:41:19 np0005588920 nova_compute[226886]: 2026-01-20 14:41:19.448 226890 DEBUG nova.virt.libvirt.driver [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:41:19 np0005588920 nova_compute[226886]:  <uuid>8de70a62-d30c-4aa1-90fd-d5f6c551d606</uuid>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:  <name>instance-00000048</name>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:41:19 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:      <nova:name>tempest-InstanceActionsNegativeTestJSON-server-280212278</nova:name>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:41:18</nova:creationTime>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:41:19 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:        <nova:user uuid="6f68701c3f984f11981d5e1ddaa6f093">tempest-InstanceActionsNegativeTestJSON-1932198437-project-member</nova:user>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:        <nova:project uuid="646f2240b79c44f08af243493552cf0e">tempest-InstanceActionsNegativeTestJSON-1932198437</nova:project>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:        <nova:port uuid="4de40e08-0741-4eb2-99b6-d4cc48958b1e">
Jan 20 09:41:19 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:      <entry name="serial">8de70a62-d30c-4aa1-90fd-d5f6c551d606</entry>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:      <entry name="uuid">8de70a62-d30c-4aa1-90fd-d5f6c551d606</entry>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:41:19 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/8de70a62-d30c-4aa1-90fd-d5f6c551d606_disk">
Jan 20 09:41:19 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:41:19 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:41:19 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/8de70a62-d30c-4aa1-90fd-d5f6c551d606_disk.config">
Jan 20 09:41:19 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:41:19 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:41:19 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:74:90:ac"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:      <target dev="tap4de40e08-07"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:41:19 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/8de70a62-d30c-4aa1-90fd-d5f6c551d606/console.log" append="off"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:41:19 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:41:19 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:41:19 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:41:19 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:41:19 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:41:19 np0005588920 nova_compute[226886]: 2026-01-20 14:41:19.450 226890 DEBUG nova.compute.manager [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Preparing to wait for external event network-vif-plugged-4de40e08-0741-4eb2-99b6-d4cc48958b1e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:41:19 np0005588920 nova_compute[226886]: 2026-01-20 14:41:19.451 226890 DEBUG oslo_concurrency.lockutils [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Acquiring lock "8de70a62-d30c-4aa1-90fd-d5f6c551d606-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:41:19 np0005588920 nova_compute[226886]: 2026-01-20 14:41:19.451 226890 DEBUG oslo_concurrency.lockutils [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Lock "8de70a62-d30c-4aa1-90fd-d5f6c551d606-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:41:19 np0005588920 nova_compute[226886]: 2026-01-20 14:41:19.452 226890 DEBUG oslo_concurrency.lockutils [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Lock "8de70a62-d30c-4aa1-90fd-d5f6c551d606-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:41:19 np0005588920 nova_compute[226886]: 2026-01-20 14:41:19.453 226890 DEBUG nova.virt.libvirt.vif [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:41:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-InstanceActionsNegativeTestJSON-server-280212278',display_name='tempest-InstanceActionsNegativeTestJSON-server-280212278',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-instanceactionsnegativetestjson-server-280212278',id=72,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='646f2240b79c44f08af243493552cf0e',ramdisk_id='',reservation_id='r-auq5vcve',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-InstanceActionsNegativeTestJSON-1932198437',owner_user_name='tempest-InstanceActionsNegativeTestJSON-1932198437-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:41:14Z,user_data=None,user_id='6f68701c3f984f11981d5e1ddaa6f093',uuid=8de70a62-d30c-4aa1-90fd-d5f6c551d606,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4de40e08-0741-4eb2-99b6-d4cc48958b1e", "address": "fa:16:3e:74:90:ac", "network": {"id": "fb86544d-3856-43a6-b7ec-b1e8666198f0", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-1876154185-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "646f2240b79c44f08af243493552cf0e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4de40e08-07", "ovs_interfaceid": "4de40e08-0741-4eb2-99b6-d4cc48958b1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:41:19 np0005588920 nova_compute[226886]: 2026-01-20 14:41:19.454 226890 DEBUG nova.network.os_vif_util [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Converting VIF {"id": "4de40e08-0741-4eb2-99b6-d4cc48958b1e", "address": "fa:16:3e:74:90:ac", "network": {"id": "fb86544d-3856-43a6-b7ec-b1e8666198f0", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-1876154185-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "646f2240b79c44f08af243493552cf0e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4de40e08-07", "ovs_interfaceid": "4de40e08-0741-4eb2-99b6-d4cc48958b1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:41:19 np0005588920 nova_compute[226886]: 2026-01-20 14:41:19.455 226890 DEBUG nova.network.os_vif_util [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:74:90:ac,bridge_name='br-int',has_traffic_filtering=True,id=4de40e08-0741-4eb2-99b6-d4cc48958b1e,network=Network(fb86544d-3856-43a6-b7ec-b1e8666198f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4de40e08-07') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:41:19 np0005588920 nova_compute[226886]: 2026-01-20 14:41:19.455 226890 DEBUG os_vif [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:74:90:ac,bridge_name='br-int',has_traffic_filtering=True,id=4de40e08-0741-4eb2-99b6-d4cc48958b1e,network=Network(fb86544d-3856-43a6-b7ec-b1e8666198f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4de40e08-07') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:41:19 np0005588920 nova_compute[226886]: 2026-01-20 14:41:19.456 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:19 np0005588920 nova_compute[226886]: 2026-01-20 14:41:19.457 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:41:19 np0005588920 nova_compute[226886]: 2026-01-20 14:41:19.458 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:41:19 np0005588920 nova_compute[226886]: 2026-01-20 14:41:19.466 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:19 np0005588920 nova_compute[226886]: 2026-01-20 14:41:19.467 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4de40e08-07, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:41:19 np0005588920 nova_compute[226886]: 2026-01-20 14:41:19.468 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4de40e08-07, col_values=(('external_ids', {'iface-id': '4de40e08-0741-4eb2-99b6-d4cc48958b1e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:74:90:ac', 'vm-uuid': '8de70a62-d30c-4aa1-90fd-d5f6c551d606'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:41:19 np0005588920 nova_compute[226886]: 2026-01-20 14:41:19.469 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:19 np0005588920 NetworkManager[49076]: <info>  [1768920079.4703] manager: (tap4de40e08-07): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/136)
Jan 20 09:41:19 np0005588920 nova_compute[226886]: 2026-01-20 14:41:19.472 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:41:19 np0005588920 nova_compute[226886]: 2026-01-20 14:41:19.474 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:19 np0005588920 nova_compute[226886]: 2026-01-20 14:41:19.475 226890 INFO os_vif [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:74:90:ac,bridge_name='br-int',has_traffic_filtering=True,id=4de40e08-0741-4eb2-99b6-d4cc48958b1e,network=Network(fb86544d-3856-43a6-b7ec-b1e8666198f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4de40e08-07')#033[00m
Jan 20 09:41:19 np0005588920 nova_compute[226886]: 2026-01-20 14:41:19.535 226890 DEBUG nova.network.neutron [-] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:41:19 np0005588920 nova_compute[226886]: 2026-01-20 14:41:19.770 226890 DEBUG nova.compute.manager [req-146f3d82-5d64-4797-b154-269ebf6d75a4 req-03bd1aaa-496b-4900-a048-b23c982fefea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Received event network-vif-deleted-b798b69c-a652-408f-810b-0a1d3d9e324c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:41:19 np0005588920 nova_compute[226886]: 2026-01-20 14:41:19.770 226890 INFO nova.compute.manager [req-146f3d82-5d64-4797-b154-269ebf6d75a4 req-03bd1aaa-496b-4900-a048-b23c982fefea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Neutron deleted interface b798b69c-a652-408f-810b-0a1d3d9e324c; detaching it from the instance and deleting it from the info cache#033[00m
Jan 20 09:41:19 np0005588920 nova_compute[226886]: 2026-01-20 14:41:19.771 226890 DEBUG nova.network.neutron [req-146f3d82-5d64-4797-b154-269ebf6d75a4 req-03bd1aaa-496b-4900-a048-b23c982fefea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:41:19 np0005588920 nova_compute[226886]: 2026-01-20 14:41:19.775 226890 INFO nova.compute.manager [-] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Took 1.25 seconds to deallocate network for instance.#033[00m
Jan 20 09:41:19 np0005588920 nova_compute[226886]: 2026-01-20 14:41:19.790 226890 DEBUG nova.virt.libvirt.driver [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:41:19 np0005588920 nova_compute[226886]: 2026-01-20 14:41:19.791 226890 DEBUG nova.virt.libvirt.driver [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:41:19 np0005588920 nova_compute[226886]: 2026-01-20 14:41:19.791 226890 DEBUG nova.virt.libvirt.driver [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] No VIF found with MAC fa:16:3e:74:90:ac, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:41:19 np0005588920 nova_compute[226886]: 2026-01-20 14:41:19.793 226890 INFO nova.virt.libvirt.driver [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Using config drive#033[00m
Jan 20 09:41:19 np0005588920 nova_compute[226886]: 2026-01-20 14:41:19.822 226890 DEBUG nova.storage.rbd_utils [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] rbd image 8de70a62-d30c-4aa1-90fd-d5f6c551d606_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:41:19 np0005588920 nova_compute[226886]: 2026-01-20 14:41:19.833 226890 DEBUG nova.compute.manager [req-146f3d82-5d64-4797-b154-269ebf6d75a4 req-03bd1aaa-496b-4900-a048-b23c982fefea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Detach interface failed, port_id=b798b69c-a652-408f-810b-0a1d3d9e324c, reason: Instance 6e4afbc3-37b1-4657-b152-91645facfcca could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 20 09:41:19 np0005588920 nova_compute[226886]: 2026-01-20 14:41:19.849 226890 DEBUG oslo_concurrency.lockutils [None req-d23c7d82-fda4-40e0-855f-d5476c9cef09 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:41:19 np0005588920 nova_compute[226886]: 2026-01-20 14:41:19.850 226890 DEBUG oslo_concurrency.lockutils [None req-d23c7d82-fda4-40e0-855f-d5476c9cef09 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:41:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:19.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:19 np0005588920 nova_compute[226886]: 2026-01-20 14:41:19.917 226890 DEBUG oslo_concurrency.processutils [None req-d23c7d82-fda4-40e0-855f-d5476c9cef09 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:41:20 np0005588920 nova_compute[226886]: 2026-01-20 14:41:20.000 226890 DEBUG nova.network.neutron [req-1d30f8ad-9cd9-4c87-8a02-c127c2a67fb1 req-353debe9-08f3-4c84-b766-581dcde63b1d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Updated VIF entry in instance network info cache for port 4de40e08-0741-4eb2-99b6-d4cc48958b1e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:41:20 np0005588920 nova_compute[226886]: 2026-01-20 14:41:20.001 226890 DEBUG nova.network.neutron [req-1d30f8ad-9cd9-4c87-8a02-c127c2a67fb1 req-353debe9-08f3-4c84-b766-581dcde63b1d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Updating instance_info_cache with network_info: [{"id": "4de40e08-0741-4eb2-99b6-d4cc48958b1e", "address": "fa:16:3e:74:90:ac", "network": {"id": "fb86544d-3856-43a6-b7ec-b1e8666198f0", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-1876154185-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "646f2240b79c44f08af243493552cf0e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4de40e08-07", "ovs_interfaceid": "4de40e08-0741-4eb2-99b6-d4cc48958b1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:41:20 np0005588920 nova_compute[226886]: 2026-01-20 14:41:20.019 226890 DEBUG oslo_concurrency.lockutils [req-1d30f8ad-9cd9-4c87-8a02-c127c2a67fb1 req-353debe9-08f3-4c84-b766-581dcde63b1d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-8de70a62-d30c-4aa1-90fd-d5f6c551d606" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:41:20 np0005588920 nova_compute[226886]: 2026-01-20 14:41:20.020 226890 DEBUG nova.compute.manager [req-1d30f8ad-9cd9-4c87-8a02-c127c2a67fb1 req-353debe9-08f3-4c84-b766-581dcde63b1d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Received event network-vif-unplugged-b798b69c-a652-408f-810b-0a1d3d9e324c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:41:20 np0005588920 nova_compute[226886]: 2026-01-20 14:41:20.021 226890 DEBUG oslo_concurrency.lockutils [req-1d30f8ad-9cd9-4c87-8a02-c127c2a67fb1 req-353debe9-08f3-4c84-b766-581dcde63b1d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "6e4afbc3-37b1-4657-b152-91645facfcca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:41:20 np0005588920 nova_compute[226886]: 2026-01-20 14:41:20.022 226890 DEBUG oslo_concurrency.lockutils [req-1d30f8ad-9cd9-4c87-8a02-c127c2a67fb1 req-353debe9-08f3-4c84-b766-581dcde63b1d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "6e4afbc3-37b1-4657-b152-91645facfcca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:41:20 np0005588920 nova_compute[226886]: 2026-01-20 14:41:20.022 226890 DEBUG oslo_concurrency.lockutils [req-1d30f8ad-9cd9-4c87-8a02-c127c2a67fb1 req-353debe9-08f3-4c84-b766-581dcde63b1d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "6e4afbc3-37b1-4657-b152-91645facfcca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:41:20 np0005588920 nova_compute[226886]: 2026-01-20 14:41:20.023 226890 DEBUG nova.compute.manager [req-1d30f8ad-9cd9-4c87-8a02-c127c2a67fb1 req-353debe9-08f3-4c84-b766-581dcde63b1d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] No waiting events found dispatching network-vif-unplugged-b798b69c-a652-408f-810b-0a1d3d9e324c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:41:20 np0005588920 nova_compute[226886]: 2026-01-20 14:41:20.024 226890 DEBUG nova.compute.manager [req-1d30f8ad-9cd9-4c87-8a02-c127c2a67fb1 req-353debe9-08f3-4c84-b766-581dcde63b1d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Received event network-vif-unplugged-b798b69c-a652-408f-810b-0a1d3d9e324c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:41:20 np0005588920 nova_compute[226886]: 2026-01-20 14:41:20.025 226890 DEBUG nova.compute.manager [req-1d30f8ad-9cd9-4c87-8a02-c127c2a67fb1 req-353debe9-08f3-4c84-b766-581dcde63b1d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Received event network-vif-plugged-b798b69c-a652-408f-810b-0a1d3d9e324c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:41:20 np0005588920 nova_compute[226886]: 2026-01-20 14:41:20.025 226890 DEBUG oslo_concurrency.lockutils [req-1d30f8ad-9cd9-4c87-8a02-c127c2a67fb1 req-353debe9-08f3-4c84-b766-581dcde63b1d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "6e4afbc3-37b1-4657-b152-91645facfcca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:41:20 np0005588920 nova_compute[226886]: 2026-01-20 14:41:20.025 226890 DEBUG oslo_concurrency.lockutils [req-1d30f8ad-9cd9-4c87-8a02-c127c2a67fb1 req-353debe9-08f3-4c84-b766-581dcde63b1d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "6e4afbc3-37b1-4657-b152-91645facfcca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:41:20 np0005588920 nova_compute[226886]: 2026-01-20 14:41:20.026 226890 DEBUG oslo_concurrency.lockutils [req-1d30f8ad-9cd9-4c87-8a02-c127c2a67fb1 req-353debe9-08f3-4c84-b766-581dcde63b1d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "6e4afbc3-37b1-4657-b152-91645facfcca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:41:20 np0005588920 nova_compute[226886]: 2026-01-20 14:41:20.026 226890 DEBUG nova.compute.manager [req-1d30f8ad-9cd9-4c87-8a02-c127c2a67fb1 req-353debe9-08f3-4c84-b766-581dcde63b1d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] No waiting events found dispatching network-vif-plugged-b798b69c-a652-408f-810b-0a1d3d9e324c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:41:20 np0005588920 nova_compute[226886]: 2026-01-20 14:41:20.026 226890 WARNING nova.compute.manager [req-1d30f8ad-9cd9-4c87-8a02-c127c2a67fb1 req-353debe9-08f3-4c84-b766-581dcde63b1d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Received unexpected event network-vif-plugged-b798b69c-a652-408f-810b-0a1d3d9e324c for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:41:20 np0005588920 nova_compute[226886]: 2026-01-20 14:41:20.113 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:20 np0005588920 nova_compute[226886]: 2026-01-20 14:41:20.187 226890 INFO nova.virt.libvirt.driver [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Creating config drive at /var/lib/nova/instances/8de70a62-d30c-4aa1-90fd-d5f6c551d606/disk.config#033[00m
Jan 20 09:41:20 np0005588920 nova_compute[226886]: 2026-01-20 14:41:20.191 226890 DEBUG oslo_concurrency.processutils [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8de70a62-d30c-4aa1-90fd-d5f6c551d606/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbif7adtq execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:41:20 np0005588920 nova_compute[226886]: 2026-01-20 14:41:20.318 226890 DEBUG oslo_concurrency.processutils [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8de70a62-d30c-4aa1-90fd-d5f6c551d606/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbif7adtq" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:41:20 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:41:20 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1467856985' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:41:20 np0005588920 nova_compute[226886]: 2026-01-20 14:41:20.392 226890 DEBUG nova.storage.rbd_utils [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] rbd image 8de70a62-d30c-4aa1-90fd-d5f6c551d606_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:41:20 np0005588920 nova_compute[226886]: 2026-01-20 14:41:20.395 226890 DEBUG oslo_concurrency.processutils [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/8de70a62-d30c-4aa1-90fd-d5f6c551d606/disk.config 8de70a62-d30c-4aa1-90fd-d5f6c551d606_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:41:20 np0005588920 nova_compute[226886]: 2026-01-20 14:41:20.417 226890 DEBUG oslo_concurrency.processutils [None req-d23c7d82-fda4-40e0-855f-d5476c9cef09 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.500s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:41:20 np0005588920 nova_compute[226886]: 2026-01-20 14:41:20.423 226890 DEBUG nova.compute.provider_tree [None req-d23c7d82-fda4-40e0-855f-d5476c9cef09 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:41:20 np0005588920 nova_compute[226886]: 2026-01-20 14:41:20.439 226890 DEBUG nova.scheduler.client.report [None req-d23c7d82-fda4-40e0-855f-d5476c9cef09 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:41:20 np0005588920 nova_compute[226886]: 2026-01-20 14:41:20.460 226890 DEBUG oslo_concurrency.lockutils [None req-d23c7d82-fda4-40e0-855f-d5476c9cef09 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.611s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:41:20 np0005588920 nova_compute[226886]: 2026-01-20 14:41:20.488 226890 INFO nova.scheduler.client.report [None req-d23c7d82-fda4-40e0-855f-d5476c9cef09 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Deleted allocations for instance 6e4afbc3-37b1-4657-b152-91645facfcca#033[00m
Jan 20 09:41:20 np0005588920 nova_compute[226886]: 2026-01-20 14:41:20.602 226890 DEBUG oslo_concurrency.lockutils [None req-d23c7d82-fda4-40e0-855f-d5476c9cef09 7e3fb126d8254300b5f6f408fceefb19 c8c509d8e23246e1a509bf2197b73ebf - - default default] Lock "6e4afbc3-37b1-4657-b152-91645facfcca" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.376s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:41:20 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:41:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:41:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:21.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:41:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:21.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:22 np0005588920 nova_compute[226886]: 2026-01-20 14:41:22.209 226890 DEBUG oslo_concurrency.processutils [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/8de70a62-d30c-4aa1-90fd-d5f6c551d606/disk.config 8de70a62-d30c-4aa1-90fd-d5f6c551d606_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.814s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:41:22 np0005588920 nova_compute[226886]: 2026-01-20 14:41:22.210 226890 INFO nova.virt.libvirt.driver [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Deleting local config drive /var/lib/nova/instances/8de70a62-d30c-4aa1-90fd-d5f6c551d606/disk.config because it was imported into RBD.#033[00m
Jan 20 09:41:22 np0005588920 kernel: tap4de40e08-07: entered promiscuous mode
Jan 20 09:41:22 np0005588920 ovn_controller[133971]: 2026-01-20T14:41:22Z|00259|binding|INFO|Claiming lport 4de40e08-0741-4eb2-99b6-d4cc48958b1e for this chassis.
Jan 20 09:41:22 np0005588920 ovn_controller[133971]: 2026-01-20T14:41:22Z|00260|binding|INFO|4de40e08-0741-4eb2-99b6-d4cc48958b1e: Claiming fa:16:3e:74:90:ac 10.100.0.4
Jan 20 09:41:22 np0005588920 NetworkManager[49076]: <info>  [1768920082.2623] manager: (tap4de40e08-07): new Tun device (/org/freedesktop/NetworkManager/Devices/137)
Jan 20 09:41:22 np0005588920 nova_compute[226886]: 2026-01-20 14:41:22.260 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:22.271 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:74:90:ac 10.100.0.4'], port_security=['fa:16:3e:74:90:ac 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '8de70a62-d30c-4aa1-90fd-d5f6c551d606', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fb86544d-3856-43a6-b7ec-b1e8666198f0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '646f2240b79c44f08af243493552cf0e', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd5b99083-cd50-404f-b6b6-90744d9bf38d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=899eeb21-8219-43a7-a868-b1fc58a40bef, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=4de40e08-0741-4eb2-99b6-d4cc48958b1e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:22.272 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 4de40e08-0741-4eb2-99b6-d4cc48958b1e in datapath fb86544d-3856-43a6-b7ec-b1e8666198f0 bound to our chassis
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:22.274 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fb86544d-3856-43a6-b7ec-b1e8666198f0
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:22.286 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[4cce84ac-6ceb-49e8-b059-920a3e84e96d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:22.287 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfb86544d-31 in ovnmeta-fb86544d-3856-43a6-b7ec-b1e8666198f0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:22.289 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfb86544d-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:22.289 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[20ffc27f-b7f5-49c0-8889-9db44a577c32]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:22.290 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[38ec43a0-814b-48af-b2eb-baee10637c96]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:41:22 np0005588920 systemd-udevd[252020]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:41:22 np0005588920 systemd-machined[196121]: New machine qemu-30-instance-00000048.
Jan 20 09:41:22 np0005588920 NetworkManager[49076]: <info>  [1768920082.3049] device (tap4de40e08-07): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:41:22 np0005588920 NetworkManager[49076]: <info>  [1768920082.3056] device (tap4de40e08-07): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:22.305 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[ef971f3e-4c8a-4bc5-9de2-e09b86ce494f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:41:22 np0005588920 systemd[1]: Started Virtual Machine qemu-30-instance-00000048.
Jan 20 09:41:22 np0005588920 nova_compute[226886]: 2026-01-20 14:41:22.330 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:22.332 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[6f6d5197-379c-4568-bd0f-a7b03769da72]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:41:22 np0005588920 nova_compute[226886]: 2026-01-20 14:41:22.335 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:41:22 np0005588920 ovn_controller[133971]: 2026-01-20T14:41:22Z|00261|binding|INFO|Setting lport 4de40e08-0741-4eb2-99b6-d4cc48958b1e ovn-installed in OVS
Jan 20 09:41:22 np0005588920 ovn_controller[133971]: 2026-01-20T14:41:22Z|00262|binding|INFO|Setting lport 4de40e08-0741-4eb2-99b6-d4cc48958b1e up in Southbound
Jan 20 09:41:22 np0005588920 nova_compute[226886]: 2026-01-20 14:41:22.339 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:22.357 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[12f7f9f9-4864-4e83-a161-31d3f18110e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:41:22 np0005588920 NetworkManager[49076]: <info>  [1768920082.3625] manager: (tapfb86544d-30): new Veth device (/org/freedesktop/NetworkManager/Devices/138)
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:22.362 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[3a2c03bf-6337-41d6-a6e0-2629aca421e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:22.393 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[8cc8772d-797d-4e9a-9952-ad77d198470f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:22.395 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[2a44ef43-6013-4aa0-a8cd-c1e0f1513149]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:41:22 np0005588920 NetworkManager[49076]: <info>  [1768920082.4156] device (tapfb86544d-30): carrier: link connected
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:22.422 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[0889ffa4-8b49-438d-85fe-14b8b8c8ad3d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:22.436 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[cfe2bbc4-c022-442e-b7b1-e84c30d04073]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfb86544d-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f6:bc:5d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 83], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 512410, 'reachable_time': 26643, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 252053, 'error': None, 'target': 'ovnmeta-fb86544d-3856-43a6-b7ec-b1e8666198f0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:22.451 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[bc3e73ef-2d4e-45c8-a658-842541509af2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef6:bc5d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 512410, 'tstamp': 512410}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 252054, 'error': None, 'target': 'ovnmeta-fb86544d-3856-43a6-b7ec-b1e8666198f0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:22.467 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c1b9776d-9000-4bf1-9177-ce55d1ced94e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfb86544d-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f6:bc:5d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 83], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 512410, 'reachable_time': 26643, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 252055, 'error': None, 'target': 'ovnmeta-fb86544d-3856-43a6-b7ec-b1e8666198f0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:22.494 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[cc1c919e-63f4-49b7-9784-b24bef6e64ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:22.544 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f0d4f311-cb73-498d-9e6e-29acc7a4ded3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:22.545 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfb86544d-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:22.546 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:22.546 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfb86544d-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 09:41:22 np0005588920 nova_compute[226886]: 2026-01-20 14:41:22.548 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:41:22 np0005588920 NetworkManager[49076]: <info>  [1768920082.5487] manager: (tapfb86544d-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/139)
Jan 20 09:41:22 np0005588920 kernel: tapfb86544d-30: entered promiscuous mode
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:22.550 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfb86544d-30, col_values=(('external_ids', {'iface-id': '9b7401a0-b683-4ef3-8490-61cd698e6e17'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 09:41:22 np0005588920 ovn_controller[133971]: 2026-01-20T14:41:22Z|00263|binding|INFO|Releasing lport 9b7401a0-b683-4ef3-8490-61cd698e6e17 from this chassis (sb_readonly=0)
Jan 20 09:41:22 np0005588920 nova_compute[226886]: 2026-01-20 14:41:22.551 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:41:22 np0005588920 nova_compute[226886]: 2026-01-20 14:41:22.565 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:22.566 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fb86544d-3856-43a6-b7ec-b1e8666198f0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fb86544d-3856-43a6-b7ec-b1e8666198f0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:22.566 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[88cd23b0-6325-47e9-99ac-a585a397da61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:22.567 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-fb86544d-3856-43a6-b7ec-b1e8666198f0
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/fb86544d-3856-43a6-b7ec-b1e8666198f0.pid.haproxy
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID fb86544d-3856-43a6-b7ec-b1e8666198f0
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 20 09:41:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:22.568 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fb86544d-3856-43a6-b7ec-b1e8666198f0', 'env', 'PROCESS_TAG=haproxy-fb86544d-3856-43a6-b7ec-b1e8666198f0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fb86544d-3856-43a6-b7ec-b1e8666198f0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 20 09:41:23 np0005588920 podman[252087]: 2026-01-20 14:41:22.906469829 +0000 UTC m=+0.026821059 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:41:23 np0005588920 podman[252087]: 2026-01-20 14:41:23.020871298 +0000 UTC m=+0.141222518 container create 9eb612b7f1936a97c8be54d158429b9a0866d7815e7d1f02c07a00be32ca32bb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fb86544d-3856-43a6-b7ec-b1e8666198f0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 20 09:41:23 np0005588920 systemd[1]: Started libpod-conmon-9eb612b7f1936a97c8be54d158429b9a0866d7815e7d1f02c07a00be32ca32bb.scope.
Jan 20 09:41:23 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:41:23 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a38e62ca8171a0489b31f5206768b9af9fe786796c2305d23c1611ed7a0668f8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:41:23 np0005588920 podman[252087]: 2026-01-20 14:41:23.106800024 +0000 UTC m=+0.227151254 container init 9eb612b7f1936a97c8be54d158429b9a0866d7815e7d1f02c07a00be32ca32bb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fb86544d-3856-43a6-b7ec-b1e8666198f0, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 20 09:41:23 np0005588920 podman[252087]: 2026-01-20 14:41:23.111795973 +0000 UTC m=+0.232147183 container start 9eb612b7f1936a97c8be54d158429b9a0866d7815e7d1f02c07a00be32ca32bb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fb86544d-3856-43a6-b7ec-b1e8666198f0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:41:23 np0005588920 neutron-haproxy-ovnmeta-fb86544d-3856-43a6-b7ec-b1e8666198f0[252141]: [NOTICE]   (252148) : New worker (252151) forked
Jan 20 09:41:23 np0005588920 neutron-haproxy-ovnmeta-fb86544d-3856-43a6-b7ec-b1e8666198f0[252141]: [NOTICE]   (252148) : Loading success.
Jan 20 09:41:23 np0005588920 nova_compute[226886]: 2026-01-20 14:41:23.155 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920083.1551564, 8de70a62-d30c-4aa1-90fd-d5f6c551d606 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 20 09:41:23 np0005588920 nova_compute[226886]: 2026-01-20 14:41:23.156 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] VM Started (Lifecycle Event)
Jan 20 09:41:23 np0005588920 nova_compute[226886]: 2026-01-20 14:41:23.183 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 09:41:23 np0005588920 nova_compute[226886]: 2026-01-20 14:41:23.188 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920083.1558352, 8de70a62-d30c-4aa1-90fd-d5f6c551d606 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 20 09:41:23 np0005588920 nova_compute[226886]: 2026-01-20 14:41:23.188 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] VM Paused (Lifecycle Event)
Jan 20 09:41:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:23.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:23 np0005588920 nova_compute[226886]: 2026-01-20 14:41:23.209 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 09:41:23 np0005588920 nova_compute[226886]: 2026-01-20 14:41:23.211 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 20 09:41:23 np0005588920 nova_compute[226886]: 2026-01-20 14:41:23.251 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 20 09:41:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:41:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:23.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:41:24 np0005588920 nova_compute[226886]: 2026-01-20 14:41:24.471 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:41:24 np0005588920 nova_compute[226886]: 2026-01-20 14:41:24.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 09:41:24 np0005588920 nova_compute[226886]: 2026-01-20 14:41:24.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 20 09:41:24 np0005588920 nova_compute[226886]: 2026-01-20 14:41:24.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 20 09:41:24 np0005588920 nova_compute[226886]: 2026-01-20 14:41:24.779 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 20 09:41:24 np0005588920 nova_compute[226886]: 2026-01-20 14:41:24.779 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 20 09:41:25 np0005588920 ovn_controller[133971]: 2026-01-20T14:41:25Z|00264|binding|INFO|Releasing lport 9b7401a0-b683-4ef3-8490-61cd698e6e17 from this chassis (sb_readonly=0)
Jan 20 09:41:25 np0005588920 nova_compute[226886]: 2026-01-20 14:41:25.141 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:41:25 np0005588920 nova_compute[226886]: 2026-01-20 14:41:25.143 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:41:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:25.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:25.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:25 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:41:26 np0005588920 podman[252160]: 2026-01-20 14:41:26.016187262 +0000 UTC m=+0.098285691 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 20 09:41:26 np0005588920 nova_compute[226886]: 2026-01-20 14:41:26.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:41:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:41:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:27.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:41:27 np0005588920 nova_compute[226886]: 2026-01-20 14:41:27.459 226890 DEBUG nova.compute.manager [req-1dd0d503-5a1f-4306-8f6b-9684042507f5 req-eb28c5dc-ce96-40b0-bdf6-c64bcf277459 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Received event network-vif-plugged-4de40e08-0741-4eb2-99b6-d4cc48958b1e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:41:27 np0005588920 nova_compute[226886]: 2026-01-20 14:41:27.459 226890 DEBUG oslo_concurrency.lockutils [req-1dd0d503-5a1f-4306-8f6b-9684042507f5 req-eb28c5dc-ce96-40b0-bdf6-c64bcf277459 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "8de70a62-d30c-4aa1-90fd-d5f6c551d606-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:41:27 np0005588920 nova_compute[226886]: 2026-01-20 14:41:27.460 226890 DEBUG oslo_concurrency.lockutils [req-1dd0d503-5a1f-4306-8f6b-9684042507f5 req-eb28c5dc-ce96-40b0-bdf6-c64bcf277459 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "8de70a62-d30c-4aa1-90fd-d5f6c551d606-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:41:27 np0005588920 nova_compute[226886]: 2026-01-20 14:41:27.460 226890 DEBUG oslo_concurrency.lockutils [req-1dd0d503-5a1f-4306-8f6b-9684042507f5 req-eb28c5dc-ce96-40b0-bdf6-c64bcf277459 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "8de70a62-d30c-4aa1-90fd-d5f6c551d606-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:41:27 np0005588920 nova_compute[226886]: 2026-01-20 14:41:27.460 226890 DEBUG nova.compute.manager [req-1dd0d503-5a1f-4306-8f6b-9684042507f5 req-eb28c5dc-ce96-40b0-bdf6-c64bcf277459 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Processing event network-vif-plugged-4de40e08-0741-4eb2-99b6-d4cc48958b1e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:41:27 np0005588920 nova_compute[226886]: 2026-01-20 14:41:27.461 226890 DEBUG nova.compute.manager [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Instance event wait completed in 4 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:41:27 np0005588920 nova_compute[226886]: 2026-01-20 14:41:27.463 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920087.4636545, 8de70a62-d30c-4aa1-90fd-d5f6c551d606 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:41:27 np0005588920 nova_compute[226886]: 2026-01-20 14:41:27.464 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:41:27 np0005588920 nova_compute[226886]: 2026-01-20 14:41:27.467 226890 DEBUG nova.virt.libvirt.driver [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:41:27 np0005588920 nova_compute[226886]: 2026-01-20 14:41:27.474 226890 INFO nova.virt.libvirt.driver [-] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Instance spawned successfully.#033[00m
Jan 20 09:41:27 np0005588920 nova_compute[226886]: 2026-01-20 14:41:27.474 226890 DEBUG nova.virt.libvirt.driver [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:41:27 np0005588920 nova_compute[226886]: 2026-01-20 14:41:27.536 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:41:27 np0005588920 nova_compute[226886]: 2026-01-20 14:41:27.543 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:41:27 np0005588920 nova_compute[226886]: 2026-01-20 14:41:27.548 226890 DEBUG nova.virt.libvirt.driver [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:41:27 np0005588920 nova_compute[226886]: 2026-01-20 14:41:27.549 226890 DEBUG nova.virt.libvirt.driver [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:41:27 np0005588920 nova_compute[226886]: 2026-01-20 14:41:27.550 226890 DEBUG nova.virt.libvirt.driver [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:41:27 np0005588920 nova_compute[226886]: 2026-01-20 14:41:27.550 226890 DEBUG nova.virt.libvirt.driver [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:41:27 np0005588920 nova_compute[226886]: 2026-01-20 14:41:27.551 226890 DEBUG nova.virt.libvirt.driver [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:41:27 np0005588920 nova_compute[226886]: 2026-01-20 14:41:27.551 226890 DEBUG nova.virt.libvirt.driver [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:41:27 np0005588920 nova_compute[226886]: 2026-01-20 14:41:27.579 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:41:27 np0005588920 nova_compute[226886]: 2026-01-20 14:41:27.686 226890 INFO nova.compute.manager [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Took 12.63 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:41:27 np0005588920 nova_compute[226886]: 2026-01-20 14:41:27.686 226890 DEBUG nova.compute.manager [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:41:27 np0005588920 nova_compute[226886]: 2026-01-20 14:41:27.855 226890 INFO nova.compute.manager [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Took 14.04 seconds to build instance.#033[00m
Jan 20 09:41:27 np0005588920 nova_compute[226886]: 2026-01-20 14:41:27.887 226890 DEBUG oslo_concurrency.lockutils [None req-ec70d4d2-2f23-4f6e-8ec1-10f3dca500eb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Lock "8de70a62-d30c-4aa1-90fd-d5f6c551d606" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:41:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:27.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:28 np0005588920 nova_compute[226886]: 2026-01-20 14:41:28.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:41:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:29.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:29 np0005588920 nova_compute[226886]: 2026-01-20 14:41:29.472 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:29.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:30 np0005588920 nova_compute[226886]: 2026-01-20 14:41:30.144 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:30 np0005588920 nova_compute[226886]: 2026-01-20 14:41:30.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:41:30 np0005588920 nova_compute[226886]: 2026-01-20 14:41:30.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:41:30 np0005588920 nova_compute[226886]: 2026-01-20 14:41:30.881 226890 DEBUG nova.compute.manager [req-638a82c3-21fc-4eb6-b8b2-837d087f38a2 req-67035687-88fe-411d-a04c-967b438866b9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Received event network-vif-plugged-4de40e08-0741-4eb2-99b6-d4cc48958b1e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:41:30 np0005588920 nova_compute[226886]: 2026-01-20 14:41:30.881 226890 DEBUG oslo_concurrency.lockutils [req-638a82c3-21fc-4eb6-b8b2-837d087f38a2 req-67035687-88fe-411d-a04c-967b438866b9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "8de70a62-d30c-4aa1-90fd-d5f6c551d606-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:41:30 np0005588920 nova_compute[226886]: 2026-01-20 14:41:30.881 226890 DEBUG oslo_concurrency.lockutils [req-638a82c3-21fc-4eb6-b8b2-837d087f38a2 req-67035687-88fe-411d-a04c-967b438866b9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "8de70a62-d30c-4aa1-90fd-d5f6c551d606-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:41:30 np0005588920 nova_compute[226886]: 2026-01-20 14:41:30.882 226890 DEBUG oslo_concurrency.lockutils [req-638a82c3-21fc-4eb6-b8b2-837d087f38a2 req-67035687-88fe-411d-a04c-967b438866b9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "8de70a62-d30c-4aa1-90fd-d5f6c551d606-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:41:30 np0005588920 nova_compute[226886]: 2026-01-20 14:41:30.882 226890 DEBUG nova.compute.manager [req-638a82c3-21fc-4eb6-b8b2-837d087f38a2 req-67035687-88fe-411d-a04c-967b438866b9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] No waiting events found dispatching network-vif-plugged-4de40e08-0741-4eb2-99b6-d4cc48958b1e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:41:30 np0005588920 nova_compute[226886]: 2026-01-20 14:41:30.882 226890 WARNING nova.compute.manager [req-638a82c3-21fc-4eb6-b8b2-837d087f38a2 req-67035687-88fe-411d-a04c-967b438866b9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Received unexpected event network-vif-plugged-4de40e08-0741-4eb2-99b6-d4cc48958b1e for instance with vm_state active and task_state None.#033[00m
Jan 20 09:41:30 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:41:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:31.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:41:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:31.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:41:32 np0005588920 nova_compute[226886]: 2026-01-20 14:41:32.721 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:41:32 np0005588920 nova_compute[226886]: 2026-01-20 14:41:32.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:41:32 np0005588920 nova_compute[226886]: 2026-01-20 14:41:32.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:41:32 np0005588920 nova_compute[226886]: 2026-01-20 14:41:32.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:41:32 np0005588920 nova_compute[226886]: 2026-01-20 14:41:32.868 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768920077.8679354, 6e4afbc3-37b1-4657-b152-91645facfcca => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:41:32 np0005588920 nova_compute[226886]: 2026-01-20 14:41:32.869 226890 INFO nova.compute.manager [-] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:41:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:33.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:33 np0005588920 nova_compute[226886]: 2026-01-20 14:41:33.783 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:41:33 np0005588920 nova_compute[226886]: 2026-01-20 14:41:33.784 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:41:33 np0005588920 nova_compute[226886]: 2026-01-20 14:41:33.784 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:41:33 np0005588920 nova_compute[226886]: 2026-01-20 14:41:33.784 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:41:33 np0005588920 nova_compute[226886]: 2026-01-20 14:41:33.784 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:41:33 np0005588920 nova_compute[226886]: 2026-01-20 14:41:33.819 226890 DEBUG nova.compute.manager [None req-88c6b8ca-039b-45ef-8e79-0d81d5072b19 - - - - - -] [instance: 6e4afbc3-37b1-4657-b152-91645facfcca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:41:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:33.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:34 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:41:34 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2759836805' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:41:34 np0005588920 nova_compute[226886]: 2026-01-20 14:41:34.258 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:41:34 np0005588920 nova_compute[226886]: 2026-01-20 14:41:34.475 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:35 np0005588920 nova_compute[226886]: 2026-01-20 14:41:35.146 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:41:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:35.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:41:35 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:41:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:35.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:37 np0005588920 nova_compute[226886]: 2026-01-20 14:41:37.150 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-00000048 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:41:37 np0005588920 nova_compute[226886]: 2026-01-20 14:41:37.150 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-00000048 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:41:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:41:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:37.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:41:37 np0005588920 nova_compute[226886]: 2026-01-20 14:41:37.307 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:41:37 np0005588920 nova_compute[226886]: 2026-01-20 14:41:37.309 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4461MB free_disk=20.900901794433594GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:41:37 np0005588920 nova_compute[226886]: 2026-01-20 14:41:37.309 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:41:37 np0005588920 nova_compute[226886]: 2026-01-20 14:41:37.309 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:41:37 np0005588920 nova_compute[226886]: 2026-01-20 14:41:37.583 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance 8de70a62-d30c-4aa1-90fd-d5f6c551d606 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:41:37 np0005588920 nova_compute[226886]: 2026-01-20 14:41:37.584 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:41:37 np0005588920 nova_compute[226886]: 2026-01-20 14:41:37.584 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:41:37 np0005588920 nova_compute[226886]: 2026-01-20 14:41:37.630 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:41:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:37.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:38 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:41:38 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3010398542' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:41:38 np0005588920 nova_compute[226886]: 2026-01-20 14:41:38.077 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:41:38 np0005588920 nova_compute[226886]: 2026-01-20 14:41:38.082 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 20 09:41:38 np0005588920 nova_compute[226886]: 2026-01-20 14:41:38.280 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 20 09:41:38 np0005588920 nova_compute[226886]: 2026-01-20 14:41:38.313 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 20 09:41:38 np0005588920 nova_compute[226886]: 2026-01-20 14:41:38.314 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:41:39 np0005588920 podman[252256]: 2026-01-20 14:41:39.006369954 +0000 UTC m=+0.091150122 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 20 09:41:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:39.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:39 np0005588920 nova_compute[226886]: 2026-01-20 14:41:39.315 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 09:41:39 np0005588920 nova_compute[226886]: 2026-01-20 14:41:39.479 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:41:39 np0005588920 ovn_controller[133971]: 2026-01-20T14:41:39Z|00042|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:74:90:ac 10.100.0.4
Jan 20 09:41:39 np0005588920 ovn_controller[133971]: 2026-01-20T14:41:39Z|00043|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:74:90:ac 10.100.0.4
Jan 20 09:41:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:39.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:40 np0005588920 podman[252423]: 2026-01-20 14:41:40.023103759 +0000 UTC m=+0.518408654 container exec 6d4eaf8659f13902200468a4ae46f22a0824452b9353ff1491898f5d3b0130c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mon-compute-2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Jan 20 09:41:40 np0005588920 nova_compute[226886]: 2026-01-20 14:41:40.147 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:41:40 np0005588920 podman[252423]: 2026-01-20 14:41:40.186079762 +0000 UTC m=+0.681384577 container exec_died 6d4eaf8659f13902200468a4ae46f22a0824452b9353ff1491898f5d3b0130c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mon-compute-2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 20 09:41:40 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:41:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:41:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:41.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:41:41 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:41:41 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:41:41 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:41:41 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:41:41 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:41:41 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:41:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:41:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:41.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:41:42 np0005588920 nova_compute[226886]: 2026-01-20 14:41:42.210 226890 DEBUG oslo_concurrency.lockutils [None req-5578920c-7c1f-4852-b6d3-2ea0bc35dedb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Acquiring lock "8de70a62-d30c-4aa1-90fd-d5f6c551d606" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:41:42 np0005588920 nova_compute[226886]: 2026-01-20 14:41:42.212 226890 DEBUG oslo_concurrency.lockutils [None req-5578920c-7c1f-4852-b6d3-2ea0bc35dedb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Lock "8de70a62-d30c-4aa1-90fd-d5f6c551d606" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:41:42 np0005588920 nova_compute[226886]: 2026-01-20 14:41:42.212 226890 DEBUG oslo_concurrency.lockutils [None req-5578920c-7c1f-4852-b6d3-2ea0bc35dedb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Acquiring lock "8de70a62-d30c-4aa1-90fd-d5f6c551d606-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:41:42 np0005588920 nova_compute[226886]: 2026-01-20 14:41:42.213 226890 DEBUG oslo_concurrency.lockutils [None req-5578920c-7c1f-4852-b6d3-2ea0bc35dedb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Lock "8de70a62-d30c-4aa1-90fd-d5f6c551d606-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:41:42 np0005588920 nova_compute[226886]: 2026-01-20 14:41:42.213 226890 DEBUG oslo_concurrency.lockutils [None req-5578920c-7c1f-4852-b6d3-2ea0bc35dedb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Lock "8de70a62-d30c-4aa1-90fd-d5f6c551d606-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:41:42 np0005588920 nova_compute[226886]: 2026-01-20 14:41:42.216 226890 INFO nova.compute.manager [None req-5578920c-7c1f-4852-b6d3-2ea0bc35dedb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Terminating instance
Jan 20 09:41:42 np0005588920 nova_compute[226886]: 2026-01-20 14:41:42.218 226890 DEBUG nova.compute.manager [None req-5578920c-7c1f-4852-b6d3-2ea0bc35dedb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 20 09:41:42 np0005588920 kernel: tap4de40e08-07 (unregistering): left promiscuous mode
Jan 20 09:41:42 np0005588920 NetworkManager[49076]: <info>  [1768920102.3611] device (tap4de40e08-07): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:41:42 np0005588920 nova_compute[226886]: 2026-01-20 14:41:42.380 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:41:42 np0005588920 ovn_controller[133971]: 2026-01-20T14:41:42Z|00265|binding|INFO|Releasing lport 4de40e08-0741-4eb2-99b6-d4cc48958b1e from this chassis (sb_readonly=0)
Jan 20 09:41:42 np0005588920 ovn_controller[133971]: 2026-01-20T14:41:42Z|00266|binding|INFO|Setting lport 4de40e08-0741-4eb2-99b6-d4cc48958b1e down in Southbound
Jan 20 09:41:42 np0005588920 ovn_controller[133971]: 2026-01-20T14:41:42Z|00267|binding|INFO|Removing iface tap4de40e08-07 ovn-installed in OVS
Jan 20 09:41:42 np0005588920 nova_compute[226886]: 2026-01-20 14:41:42.382 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:41:42 np0005588920 nova_compute[226886]: 2026-01-20 14:41:42.400 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:41:42 np0005588920 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d00000048.scope: Deactivated successfully.
Jan 20 09:41:42 np0005588920 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d00000048.scope: Consumed 13.330s CPU time.
Jan 20 09:41:42 np0005588920 systemd-machined[196121]: Machine qemu-30-instance-00000048 terminated.
Jan 20 09:41:42 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:42.484 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:74:90:ac 10.100.0.4'], port_security=['fa:16:3e:74:90:ac 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '8de70a62-d30c-4aa1-90fd-d5f6c551d606', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fb86544d-3856-43a6-b7ec-b1e8666198f0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '646f2240b79c44f08af243493552cf0e', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd5b99083-cd50-404f-b6b6-90744d9bf38d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=899eeb21-8219-43a7-a868-b1fc58a40bef, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=4de40e08-0741-4eb2-99b6-d4cc48958b1e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 20 09:41:42 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:42.485 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 4de40e08-0741-4eb2-99b6-d4cc48958b1e in datapath fb86544d-3856-43a6-b7ec-b1e8666198f0 unbound from our chassis
Jan 20 09:41:42 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:42.486 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fb86544d-3856-43a6-b7ec-b1e8666198f0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 20 09:41:42 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:42.487 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[87aea3af-077f-4c70-b7bf-6ccbfa7d4834]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:41:42 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:42.488 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fb86544d-3856-43a6-b7ec-b1e8666198f0 namespace which is not needed anymore
Jan 20 09:41:42 np0005588920 neutron-haproxy-ovnmeta-fb86544d-3856-43a6-b7ec-b1e8666198f0[252141]: [NOTICE]   (252148) : haproxy version is 2.8.14-c23fe91
Jan 20 09:41:42 np0005588920 neutron-haproxy-ovnmeta-fb86544d-3856-43a6-b7ec-b1e8666198f0[252141]: [NOTICE]   (252148) : path to executable is /usr/sbin/haproxy
Jan 20 09:41:42 np0005588920 neutron-haproxy-ovnmeta-fb86544d-3856-43a6-b7ec-b1e8666198f0[252141]: [WARNING]  (252148) : Exiting Master process...
Jan 20 09:41:42 np0005588920 neutron-haproxy-ovnmeta-fb86544d-3856-43a6-b7ec-b1e8666198f0[252141]: [ALERT]    (252148) : Current worker (252151) exited with code 143 (Terminated)
Jan 20 09:41:42 np0005588920 neutron-haproxy-ovnmeta-fb86544d-3856-43a6-b7ec-b1e8666198f0[252141]: [WARNING]  (252148) : All workers exited. Exiting... (0)
Jan 20 09:41:42 np0005588920 systemd[1]: libpod-9eb612b7f1936a97c8be54d158429b9a0866d7815e7d1f02c07a00be32ca32bb.scope: Deactivated successfully.
Jan 20 09:41:42 np0005588920 podman[252704]: 2026-01-20 14:41:42.641522235 +0000 UTC m=+0.055579340 container died 9eb612b7f1936a97c8be54d158429b9a0866d7815e7d1f02c07a00be32ca32bb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fb86544d-3856-43a6-b7ec-b1e8666198f0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 20 09:41:42 np0005588920 nova_compute[226886]: 2026-01-20 14:41:42.647 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:41:42 np0005588920 nova_compute[226886]: 2026-01-20 14:41:42.652 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:41:42 np0005588920 nova_compute[226886]: 2026-01-20 14:41:42.664 226890 INFO nova.virt.libvirt.driver [-] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Instance destroyed successfully.
Jan 20 09:41:42 np0005588920 nova_compute[226886]: 2026-01-20 14:41:42.665 226890 DEBUG nova.objects.instance [None req-5578920c-7c1f-4852-b6d3-2ea0bc35dedb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Lazy-loading 'resources' on Instance uuid 8de70a62-d30c-4aa1-90fd-d5f6c551d606 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 09:41:42 np0005588920 systemd[1]: var-lib-containers-storage-overlay-a38e62ca8171a0489b31f5206768b9af9fe786796c2305d23c1611ed7a0668f8-merged.mount: Deactivated successfully.
Jan 20 09:41:42 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9eb612b7f1936a97c8be54d158429b9a0866d7815e7d1f02c07a00be32ca32bb-userdata-shm.mount: Deactivated successfully.
Jan 20 09:41:42 np0005588920 podman[252704]: 2026-01-20 14:41:42.68579577 +0000 UTC m=+0.099852875 container cleanup 9eb612b7f1936a97c8be54d158429b9a0866d7815e7d1f02c07a00be32ca32bb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fb86544d-3856-43a6-b7ec-b1e8666198f0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:41:42 np0005588920 systemd[1]: libpod-conmon-9eb612b7f1936a97c8be54d158429b9a0866d7815e7d1f02c07a00be32ca32bb.scope: Deactivated successfully.
Jan 20 09:41:42 np0005588920 podman[252746]: 2026-01-20 14:41:42.744358552 +0000 UTC m=+0.037742033 container remove 9eb612b7f1936a97c8be54d158429b9a0866d7815e7d1f02c07a00be32ca32bb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fb86544d-3856-43a6-b7ec-b1e8666198f0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:41:42 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:42.749 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d254e0a1-5ba9-4960-9169-293a58d74d31]: (4, ('Tue Jan 20 02:41:42 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-fb86544d-3856-43a6-b7ec-b1e8666198f0 (9eb612b7f1936a97c8be54d158429b9a0866d7815e7d1f02c07a00be32ca32bb)\n9eb612b7f1936a97c8be54d158429b9a0866d7815e7d1f02c07a00be32ca32bb\nTue Jan 20 02:41:42 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-fb86544d-3856-43a6-b7ec-b1e8666198f0 (9eb612b7f1936a97c8be54d158429b9a0866d7815e7d1f02c07a00be32ca32bb)\n9eb612b7f1936a97c8be54d158429b9a0866d7815e7d1f02c07a00be32ca32bb\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:41:42 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:42.752 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0dc13069-725b-4253-b8e4-4a021663a0da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:41:42 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:42.753 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfb86544d-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 09:41:42 np0005588920 nova_compute[226886]: 2026-01-20 14:41:42.754 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:41:42 np0005588920 kernel: tapfb86544d-30: left promiscuous mode
Jan 20 09:41:42 np0005588920 nova_compute[226886]: 2026-01-20 14:41:42.771 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:41:42 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:42.773 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[3c4f0777-aa91-4a63-b5d8-02b7078e9c62]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:41:42 np0005588920 nova_compute[226886]: 2026-01-20 14:41:42.773 226890 DEBUG nova.virt.libvirt.vif [None req-5578920c-7c1f-4852-b6d3-2ea0bc35dedb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:41:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-InstanceActionsNegativeTestJSON-server-280212278',display_name='tempest-InstanceActionsNegativeTestJSON-server-280212278',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-instanceactionsnegativetestjson-server-280212278',id=72,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:41:27Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='646f2240b79c44f08af243493552cf0e',ramdisk_id='',reservation_id='r-auq5vcve',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_h
w_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-InstanceActionsNegativeTestJSON-1932198437',owner_user_name='tempest-InstanceActionsNegativeTestJSON-1932198437-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:41:27Z,user_data=None,user_id='6f68701c3f984f11981d5e1ddaa6f093',uuid=8de70a62-d30c-4aa1-90fd-d5f6c551d606,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4de40e08-0741-4eb2-99b6-d4cc48958b1e", "address": "fa:16:3e:74:90:ac", "network": {"id": "fb86544d-3856-43a6-b7ec-b1e8666198f0", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-1876154185-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "646f2240b79c44f08af243493552cf0e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4de40e08-07", "ovs_interfaceid": "4de40e08-0741-4eb2-99b6-d4cc48958b1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 20 09:41:42 np0005588920 nova_compute[226886]: 2026-01-20 14:41:42.774 226890 DEBUG nova.network.os_vif_util [None req-5578920c-7c1f-4852-b6d3-2ea0bc35dedb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Converting VIF {"id": "4de40e08-0741-4eb2-99b6-d4cc48958b1e", "address": "fa:16:3e:74:90:ac", "network": {"id": "fb86544d-3856-43a6-b7ec-b1e8666198f0", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-1876154185-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "646f2240b79c44f08af243493552cf0e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4de40e08-07", "ovs_interfaceid": "4de40e08-0741-4eb2-99b6-d4cc48958b1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 20 09:41:42 np0005588920 nova_compute[226886]: 2026-01-20 14:41:42.774 226890 DEBUG nova.network.os_vif_util [None req-5578920c-7c1f-4852-b6d3-2ea0bc35dedb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:74:90:ac,bridge_name='br-int',has_traffic_filtering=True,id=4de40e08-0741-4eb2-99b6-d4cc48958b1e,network=Network(fb86544d-3856-43a6-b7ec-b1e8666198f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4de40e08-07') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 20 09:41:42 np0005588920 nova_compute[226886]: 2026-01-20 14:41:42.775 226890 DEBUG os_vif [None req-5578920c-7c1f-4852-b6d3-2ea0bc35dedb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:74:90:ac,bridge_name='br-int',has_traffic_filtering=True,id=4de40e08-0741-4eb2-99b6-d4cc48958b1e,network=Network(fb86544d-3856-43a6-b7ec-b1e8666198f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4de40e08-07') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 20 09:41:42 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:41:42 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:41:42 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:41:42 np0005588920 nova_compute[226886]: 2026-01-20 14:41:42.776 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:42 np0005588920 nova_compute[226886]: 2026-01-20 14:41:42.777 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4de40e08-07, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:41:42 np0005588920 nova_compute[226886]: 2026-01-20 14:41:42.778 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:42 np0005588920 nova_compute[226886]: 2026-01-20 14:41:42.780 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:41:42 np0005588920 nova_compute[226886]: 2026-01-20 14:41:42.782 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:42 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:42.785 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[8cb7f472-2c34-4618-a3ec-7c96f556ba07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:42 np0005588920 nova_compute[226886]: 2026-01-20 14:41:42.786 226890 INFO os_vif [None req-5578920c-7c1f-4852-b6d3-2ea0bc35dedb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:74:90:ac,bridge_name='br-int',has_traffic_filtering=True,id=4de40e08-0741-4eb2-99b6-d4cc48958b1e,network=Network(fb86544d-3856-43a6-b7ec-b1e8666198f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4de40e08-07')#033[00m
Jan 20 09:41:42 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:42.786 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[808488bc-6f0f-4f8c-9de5-6ac45fa17d56]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:42 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:42.800 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a7b55d8a-0e9e-4bc2-9132-fb0c25f5a796]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 512403, 'reachable_time': 35596, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 252764, 'error': None, 'target': 'ovnmeta-fb86544d-3856-43a6-b7ec-b1e8666198f0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:42 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:42.803 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fb86544d-3856-43a6-b7ec-b1e8666198f0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:41:42 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:42.803 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[0b205bba-7b6b-440a-9e5d-f7256b5661f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:42 np0005588920 systemd[1]: run-netns-ovnmeta\x2dfb86544d\x2d3856\x2d43a6\x2db7ec\x2db1e8666198f0.mount: Deactivated successfully.
Jan 20 09:41:43 np0005588920 nova_compute[226886]: 2026-01-20 14:41:43.143 226890 INFO nova.virt.libvirt.driver [None req-5578920c-7c1f-4852-b6d3-2ea0bc35dedb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Deleting instance files /var/lib/nova/instances/8de70a62-d30c-4aa1-90fd-d5f6c551d606_del#033[00m
Jan 20 09:41:43 np0005588920 nova_compute[226886]: 2026-01-20 14:41:43.144 226890 INFO nova.virt.libvirt.driver [None req-5578920c-7c1f-4852-b6d3-2ea0bc35dedb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Deletion of /var/lib/nova/instances/8de70a62-d30c-4aa1-90fd-d5f6c551d606_del complete#033[00m
Jan 20 09:41:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:43.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:43 np0005588920 nova_compute[226886]: 2026-01-20 14:41:43.445 226890 INFO nova.compute.manager [None req-5578920c-7c1f-4852-b6d3-2ea0bc35dedb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Took 1.23 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:41:43 np0005588920 nova_compute[226886]: 2026-01-20 14:41:43.445 226890 DEBUG oslo.service.loopingcall [None req-5578920c-7c1f-4852-b6d3-2ea0bc35dedb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:41:43 np0005588920 nova_compute[226886]: 2026-01-20 14:41:43.445 226890 DEBUG nova.compute.manager [-] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:41:43 np0005588920 nova_compute[226886]: 2026-01-20 14:41:43.446 226890 DEBUG nova.network.neutron [-] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:41:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:43.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:45 np0005588920 nova_compute[226886]: 2026-01-20 14:41:45.152 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:45.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:45 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:41:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:41:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:45.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:41:46 np0005588920 nova_compute[226886]: 2026-01-20 14:41:46.385 226890 DEBUG nova.compute.manager [req-88e68e13-6b59-4301-a544-8e53784eb4f0 req-009984ff-89ab-4717-a317-821ea4a553b3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Received event network-vif-unplugged-4de40e08-0741-4eb2-99b6-d4cc48958b1e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:41:46 np0005588920 nova_compute[226886]: 2026-01-20 14:41:46.385 226890 DEBUG oslo_concurrency.lockutils [req-88e68e13-6b59-4301-a544-8e53784eb4f0 req-009984ff-89ab-4717-a317-821ea4a553b3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "8de70a62-d30c-4aa1-90fd-d5f6c551d606-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:41:46 np0005588920 nova_compute[226886]: 2026-01-20 14:41:46.386 226890 DEBUG oslo_concurrency.lockutils [req-88e68e13-6b59-4301-a544-8e53784eb4f0 req-009984ff-89ab-4717-a317-821ea4a553b3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "8de70a62-d30c-4aa1-90fd-d5f6c551d606-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:41:46 np0005588920 nova_compute[226886]: 2026-01-20 14:41:46.386 226890 DEBUG oslo_concurrency.lockutils [req-88e68e13-6b59-4301-a544-8e53784eb4f0 req-009984ff-89ab-4717-a317-821ea4a553b3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "8de70a62-d30c-4aa1-90fd-d5f6c551d606-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:41:46 np0005588920 nova_compute[226886]: 2026-01-20 14:41:46.386 226890 DEBUG nova.compute.manager [req-88e68e13-6b59-4301-a544-8e53784eb4f0 req-009984ff-89ab-4717-a317-821ea4a553b3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] No waiting events found dispatching network-vif-unplugged-4de40e08-0741-4eb2-99b6-d4cc48958b1e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:41:46 np0005588920 nova_compute[226886]: 2026-01-20 14:41:46.386 226890 DEBUG nova.compute.manager [req-88e68e13-6b59-4301-a544-8e53784eb4f0 req-009984ff-89ab-4717-a317-821ea4a553b3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Received event network-vif-unplugged-4de40e08-0741-4eb2-99b6-d4cc48958b1e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:41:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:47.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:47 np0005588920 nova_compute[226886]: 2026-01-20 14:41:47.778 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:47.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:48.001 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:41:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:48.002 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:41:48 np0005588920 nova_compute[226886]: 2026-01-20 14:41:48.003 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:48 np0005588920 nova_compute[226886]: 2026-01-20 14:41:48.102 226890 DEBUG nova.network.neutron [-] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:41:48 np0005588920 nova_compute[226886]: 2026-01-20 14:41:48.141 226890 INFO nova.compute.manager [-] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Took 4.70 seconds to deallocate network for instance.#033[00m
Jan 20 09:41:48 np0005588920 nova_compute[226886]: 2026-01-20 14:41:48.226 226890 DEBUG oslo_concurrency.lockutils [None req-5578920c-7c1f-4852-b6d3-2ea0bc35dedb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:41:48 np0005588920 nova_compute[226886]: 2026-01-20 14:41:48.227 226890 DEBUG oslo_concurrency.lockutils [None req-5578920c-7c1f-4852-b6d3-2ea0bc35dedb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:41:48 np0005588920 nova_compute[226886]: 2026-01-20 14:41:48.261 226890 DEBUG nova.compute.manager [req-7cdb3eae-27a7-426e-8c98-9e846f2e6ab2 req-63390a95-9c95-4e28-943d-b1ce3212b5c0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Received event network-vif-deleted-4de40e08-0741-4eb2-99b6-d4cc48958b1e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:41:48 np0005588920 nova_compute[226886]: 2026-01-20 14:41:48.312 226890 DEBUG oslo_concurrency.processutils [None req-5578920c-7c1f-4852-b6d3-2ea0bc35dedb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:41:48 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:41:48 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3821528778' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:41:48 np0005588920 nova_compute[226886]: 2026-01-20 14:41:48.780 226890 DEBUG oslo_concurrency.processutils [None req-5578920c-7c1f-4852-b6d3-2ea0bc35dedb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:41:48 np0005588920 nova_compute[226886]: 2026-01-20 14:41:48.787 226890 DEBUG nova.compute.provider_tree [None req-5578920c-7c1f-4852-b6d3-2ea0bc35dedb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:41:48 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:41:48 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:41:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:49.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:49 np0005588920 nova_compute[226886]: 2026-01-20 14:41:49.688 226890 DEBUG nova.scheduler.client.report [None req-5578920c-7c1f-4852-b6d3-2ea0bc35dedb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:41:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:41:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:49.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:41:50 np0005588920 nova_compute[226886]: 2026-01-20 14:41:50.151 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:50 np0005588920 nova_compute[226886]: 2026-01-20 14:41:50.254 226890 DEBUG nova.compute.manager [req-49a55e00-db2c-4d94-9f07-98c3c5651f32 req-a95b3b4b-c23a-464c-844b-8dc899d94cfe 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Received event network-vif-plugged-4de40e08-0741-4eb2-99b6-d4cc48958b1e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:41:50 np0005588920 nova_compute[226886]: 2026-01-20 14:41:50.254 226890 DEBUG oslo_concurrency.lockutils [req-49a55e00-db2c-4d94-9f07-98c3c5651f32 req-a95b3b4b-c23a-464c-844b-8dc899d94cfe 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "8de70a62-d30c-4aa1-90fd-d5f6c551d606-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:41:50 np0005588920 nova_compute[226886]: 2026-01-20 14:41:50.255 226890 DEBUG oslo_concurrency.lockutils [req-49a55e00-db2c-4d94-9f07-98c3c5651f32 req-a95b3b4b-c23a-464c-844b-8dc899d94cfe 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "8de70a62-d30c-4aa1-90fd-d5f6c551d606-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:41:50 np0005588920 nova_compute[226886]: 2026-01-20 14:41:50.255 226890 DEBUG oslo_concurrency.lockutils [req-49a55e00-db2c-4d94-9f07-98c3c5651f32 req-a95b3b4b-c23a-464c-844b-8dc899d94cfe 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "8de70a62-d30c-4aa1-90fd-d5f6c551d606-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:41:50 np0005588920 nova_compute[226886]: 2026-01-20 14:41:50.255 226890 DEBUG nova.compute.manager [req-49a55e00-db2c-4d94-9f07-98c3c5651f32 req-a95b3b4b-c23a-464c-844b-8dc899d94cfe 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] No waiting events found dispatching network-vif-plugged-4de40e08-0741-4eb2-99b6-d4cc48958b1e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:41:50 np0005588920 nova_compute[226886]: 2026-01-20 14:41:50.256 226890 WARNING nova.compute.manager [req-49a55e00-db2c-4d94-9f07-98c3c5651f32 req-a95b3b4b-c23a-464c-844b-8dc899d94cfe 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Received unexpected event network-vif-plugged-4de40e08-0741-4eb2-99b6-d4cc48958b1e for instance with vm_state deleted and task_state None.#033[00m
Jan 20 09:41:50 np0005588920 nova_compute[226886]: 2026-01-20 14:41:50.608 226890 DEBUG oslo_concurrency.lockutils [None req-5578920c-7c1f-4852-b6d3-2ea0bc35dedb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 2.381s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:41:50 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:41:51 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:41:51.003 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:41:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:41:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:51.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:41:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:51.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:52 np0005588920 nova_compute[226886]: 2026-01-20 14:41:52.181 226890 INFO nova.scheduler.client.report [None req-5578920c-7c1f-4852-b6d3-2ea0bc35dedb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Deleted allocations for instance 8de70a62-d30c-4aa1-90fd-d5f6c551d606#033[00m
Jan 20 09:41:52 np0005588920 nova_compute[226886]: 2026-01-20 14:41:52.307 226890 DEBUG oslo_concurrency.lockutils [None req-5578920c-7c1f-4852-b6d3-2ea0bc35dedb 6f68701c3f984f11981d5e1ddaa6f093 646f2240b79c44f08af243493552cf0e - - default default] Lock "8de70a62-d30c-4aa1-90fd-d5f6c551d606" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 10.096s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:41:52 np0005588920 nova_compute[226886]: 2026-01-20 14:41:52.780 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:41:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:53.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:41:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:53.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:55 np0005588920 nova_compute[226886]: 2026-01-20 14:41:55.153 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:55.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:55 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:41:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:55.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:57 np0005588920 podman[252856]: 2026-01-20 14:41:57.059166242 +0000 UTC m=+0.134625084 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Jan 20 09:41:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:41:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:57.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:41:57 np0005588920 nova_compute[226886]: 2026-01-20 14:41:57.662 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768920102.660436, 8de70a62-d30c-4aa1-90fd-d5f6c551d606 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:41:57 np0005588920 nova_compute[226886]: 2026-01-20 14:41:57.662 226890 INFO nova.compute.manager [-] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:41:57 np0005588920 nova_compute[226886]: 2026-01-20 14:41:57.782 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:57 np0005588920 nova_compute[226886]: 2026-01-20 14:41:57.807 226890 DEBUG nova.compute.manager [None req-5319b357-7f0d-4ea0-8a9d-b041b54b832d - - - - - -] [instance: 8de70a62-d30c-4aa1-90fd-d5f6c551d606] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:41:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:41:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:57.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:41:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:41:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:59.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:41:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:41:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:41:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:59.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:42:00 np0005588920 nova_compute[226886]: 2026-01-20 14:42:00.154 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:00 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:42:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:01.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:42:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:01.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:42:02 np0005588920 nova_compute[226886]: 2026-01-20 14:42:02.189 226890 DEBUG oslo_concurrency.lockutils [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Acquiring lock "55628882-70c9-4b43-b653-9983ba87ca0d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:02 np0005588920 nova_compute[226886]: 2026-01-20 14:42:02.189 226890 DEBUG oslo_concurrency.lockutils [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Lock "55628882-70c9-4b43-b653-9983ba87ca0d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:02 np0005588920 nova_compute[226886]: 2026-01-20 14:42:02.225 226890 DEBUG nova.compute.manager [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:42:02 np0005588920 nova_compute[226886]: 2026-01-20 14:42:02.363 226890 DEBUG oslo_concurrency.lockutils [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:02 np0005588920 nova_compute[226886]: 2026-01-20 14:42:02.364 226890 DEBUG oslo_concurrency.lockutils [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:02 np0005588920 nova_compute[226886]: 2026-01-20 14:42:02.390 226890 DEBUG nova.virt.hardware [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:42:02 np0005588920 nova_compute[226886]: 2026-01-20 14:42:02.391 226890 INFO nova.compute.claims [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 09:42:02 np0005588920 nova_compute[226886]: 2026-01-20 14:42:02.784 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:03.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:03 np0005588920 nova_compute[226886]: 2026-01-20 14:42:03.554 226890 DEBUG oslo_concurrency.processutils [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:42:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:03.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:04 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:42:04 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4238600245' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:42:04 np0005588920 nova_compute[226886]: 2026-01-20 14:42:04.263 226890 DEBUG oslo_concurrency.processutils [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.710s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:42:04 np0005588920 nova_compute[226886]: 2026-01-20 14:42:04.269 226890 DEBUG nova.compute.provider_tree [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:42:04 np0005588920 nova_compute[226886]: 2026-01-20 14:42:04.291 226890 DEBUG nova.scheduler.client.report [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:42:04 np0005588920 nova_compute[226886]: 2026-01-20 14:42:04.322 226890 DEBUG oslo_concurrency.lockutils [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.958s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:04 np0005588920 nova_compute[226886]: 2026-01-20 14:42:04.323 226890 DEBUG nova.compute.manager [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:42:04 np0005588920 nova_compute[226886]: 2026-01-20 14:42:04.398 226890 DEBUG nova.compute.manager [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:42:04 np0005588920 nova_compute[226886]: 2026-01-20 14:42:04.399 226890 DEBUG nova.network.neutron [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:42:04 np0005588920 nova_compute[226886]: 2026-01-20 14:42:04.429 226890 INFO nova.virt.libvirt.driver [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:42:04 np0005588920 nova_compute[226886]: 2026-01-20 14:42:04.446 226890 DEBUG nova.compute.manager [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:42:04 np0005588920 nova_compute[226886]: 2026-01-20 14:42:04.582 226890 DEBUG nova.compute.manager [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:42:04 np0005588920 nova_compute[226886]: 2026-01-20 14:42:04.584 226890 DEBUG nova.virt.libvirt.driver [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:42:04 np0005588920 nova_compute[226886]: 2026-01-20 14:42:04.584 226890 INFO nova.virt.libvirt.driver [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Creating image(s)#033[00m
Jan 20 09:42:04 np0005588920 nova_compute[226886]: 2026-01-20 14:42:04.615 226890 DEBUG nova.storage.rbd_utils [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] rbd image 55628882-70c9-4b43-b653-9983ba87ca0d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:42:04 np0005588920 nova_compute[226886]: 2026-01-20 14:42:04.652 226890 DEBUG nova.storage.rbd_utils [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] rbd image 55628882-70c9-4b43-b653-9983ba87ca0d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:42:04 np0005588920 nova_compute[226886]: 2026-01-20 14:42:04.689 226890 DEBUG nova.storage.rbd_utils [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] rbd image 55628882-70c9-4b43-b653-9983ba87ca0d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:42:04 np0005588920 nova_compute[226886]: 2026-01-20 14:42:04.694 226890 DEBUG oslo_concurrency.processutils [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:42:04 np0005588920 nova_compute[226886]: 2026-01-20 14:42:04.792 226890 DEBUG oslo_concurrency.processutils [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:42:04 np0005588920 nova_compute[226886]: 2026-01-20 14:42:04.793 226890 DEBUG oslo_concurrency.lockutils [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:04 np0005588920 nova_compute[226886]: 2026-01-20 14:42:04.793 226890 DEBUG oslo_concurrency.lockutils [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:04 np0005588920 nova_compute[226886]: 2026-01-20 14:42:04.793 226890 DEBUG oslo_concurrency.lockutils [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:04 np0005588920 nova_compute[226886]: 2026-01-20 14:42:04.822 226890 DEBUG nova.storage.rbd_utils [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] rbd image 55628882-70c9-4b43-b653-9983ba87ca0d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:42:04 np0005588920 nova_compute[226886]: 2026-01-20 14:42:04.826 226890 DEBUG oslo_concurrency.processutils [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 55628882-70c9-4b43-b653-9983ba87ca0d_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:42:05 np0005588920 nova_compute[226886]: 2026-01-20 14:42:05.014 226890 DEBUG nova.policy [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6a00517a957e4ceb8564cbf1dfa15ee2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '13c0d93976f745dba4ab050770ccaae6', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:42:05 np0005588920 nova_compute[226886]: 2026-01-20 14:42:05.087 226890 DEBUG oslo_concurrency.processutils [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 55628882-70c9-4b43-b653-9983ba87ca0d_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.260s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:42:05 np0005588920 nova_compute[226886]: 2026-01-20 14:42:05.171 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:05 np0005588920 nova_compute[226886]: 2026-01-20 14:42:05.179 226890 DEBUG nova.storage.rbd_utils [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] resizing rbd image 55628882-70c9-4b43-b653-9983ba87ca0d_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:42:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:05.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:05 np0005588920 nova_compute[226886]: 2026-01-20 14:42:05.306 226890 DEBUG nova.objects.instance [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Lazy-loading 'migration_context' on Instance uuid 55628882-70c9-4b43-b653-9983ba87ca0d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:42:05 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:42:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:05.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:07.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:07 np0005588920 nova_compute[226886]: 2026-01-20 14:42:07.674 226890 DEBUG nova.virt.libvirt.driver [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:42:07 np0005588920 nova_compute[226886]: 2026-01-20 14:42:07.675 226890 DEBUG nova.virt.libvirt.driver [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Ensure instance console log exists: /var/lib/nova/instances/55628882-70c9-4b43-b653-9983ba87ca0d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:42:07 np0005588920 nova_compute[226886]: 2026-01-20 14:42:07.676 226890 DEBUG oslo_concurrency.lockutils [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:07 np0005588920 nova_compute[226886]: 2026-01-20 14:42:07.676 226890 DEBUG oslo_concurrency.lockutils [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:07 np0005588920 nova_compute[226886]: 2026-01-20 14:42:07.677 226890 DEBUG oslo_concurrency.lockutils [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:07 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #70. Immutable memtables: 0.
Jan 20 09:42:07 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:42:07.729997) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 09:42:07 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 41] Flushing memtable with next log file: 70
Jan 20 09:42:07 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920127730077, "job": 41, "event": "flush_started", "num_memtables": 1, "num_entries": 2175, "num_deletes": 251, "total_data_size": 5092944, "memory_usage": 5161232, "flush_reason": "Manual Compaction"}
Jan 20 09:42:07 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 41] Level-0 flush table #71: started
Jan 20 09:42:07 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920127761816, "cf_name": "default", "job": 41, "event": "table_file_creation", "file_number": 71, "file_size": 3307047, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 36733, "largest_seqno": 38903, "table_properties": {"data_size": 3298227, "index_size": 5378, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2373, "raw_key_size": 18936, "raw_average_key_size": 20, "raw_value_size": 3280465, "raw_average_value_size": 3554, "num_data_blocks": 234, "num_entries": 923, "num_filter_entries": 923, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768919948, "oldest_key_time": 1768919948, "file_creation_time": 1768920127, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 71, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:42:07 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 41] Flush lasted 31893 microseconds, and 8126 cpu microseconds.
Jan 20 09:42:07 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:42:07 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:42:07.761883) [db/flush_job.cc:967] [default] [JOB 41] Level-0 flush table #71: 3307047 bytes OK
Jan 20 09:42:07 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:42:07.761907) [db/memtable_list.cc:519] [default] Level-0 commit table #71 started
Jan 20 09:42:07 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:42:07.763753) [db/memtable_list.cc:722] [default] Level-0 commit table #71: memtable #1 done
Jan 20 09:42:07 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:42:07.763772) EVENT_LOG_v1 {"time_micros": 1768920127763766, "job": 41, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 09:42:07 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:42:07.763792) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 09:42:07 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 41] Try to delete WAL files size 5083216, prev total WAL file size 5083927, number of live WAL files 2.
Jan 20 09:42:07 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000067.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:42:07 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:42:07.765552) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033303132' seq:72057594037927935, type:22 .. '7061786F730033323634' seq:0, type:0; will stop at (end)
Jan 20 09:42:07 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 42] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 09:42:07 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 41 Base level 0, inputs: [71(3229KB)], [69(8242KB)]
Jan 20 09:42:07 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920127765603, "job": 42, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [71], "files_L6": [69], "score": -1, "input_data_size": 11747137, "oldest_snapshot_seqno": -1}
Jan 20 09:42:07 np0005588920 nova_compute[226886]: 2026-01-20 14:42:07.786 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:07 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 42] Generated table #72: 6459 keys, 9845900 bytes, temperature: kUnknown
Jan 20 09:42:07 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920127829562, "cf_name": "default", "job": 42, "event": "table_file_creation", "file_number": 72, "file_size": 9845900, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9802929, "index_size": 25709, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16197, "raw_key_size": 164929, "raw_average_key_size": 25, "raw_value_size": 9687253, "raw_average_value_size": 1499, "num_data_blocks": 1030, "num_entries": 6459, "num_filter_entries": 6459, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768920127, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 72, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:42:07 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:42:07 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:42:07.829817) [db/compaction/compaction_job.cc:1663] [default] [JOB 42] Compacted 1@0 + 1@6 files to L6 => 9845900 bytes
Jan 20 09:42:07 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:42:07.838021) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 183.4 rd, 153.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.2, 8.0 +0.0 blob) out(9.4 +0.0 blob), read-write-amplify(6.5) write-amplify(3.0) OK, records in: 6978, records dropped: 519 output_compression: NoCompression
Jan 20 09:42:07 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:42:07.838053) EVENT_LOG_v1 {"time_micros": 1768920127838041, "job": 42, "event": "compaction_finished", "compaction_time_micros": 64037, "compaction_time_cpu_micros": 25517, "output_level": 6, "num_output_files": 1, "total_output_size": 9845900, "num_input_records": 6978, "num_output_records": 6459, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 09:42:07 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000071.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:42:07 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920127838721, "job": 42, "event": "table_file_deletion", "file_number": 71}
Jan 20 09:42:07 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000069.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:42:07 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920127840038, "job": 42, "event": "table_file_deletion", "file_number": 69}
Jan 20 09:42:07 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:42:07.765447) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:42:07 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:42:07.840080) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:42:07 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:42:07.840085) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:42:07 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:42:07.840086) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:42:07 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:42:07.840088) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:42:07 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:42:07.840089) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:42:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:07.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:08 np0005588920 nova_compute[226886]: 2026-01-20 14:42:08.070 226890 DEBUG nova.network.neutron [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Successfully created port: 2251427b-053e-477b-919c-0a2be96a4c01 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:42:08 np0005588920 nova_compute[226886]: 2026-01-20 14:42:08.262 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:09.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:09 np0005588920 podman[253071]: 2026-01-20 14:42:09.95127557 +0000 UTC m=+0.045132969 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 20 09:42:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:42:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:09.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:42:10 np0005588920 nova_compute[226886]: 2026-01-20 14:42:10.157 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:10 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:42:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:11.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:11 np0005588920 nova_compute[226886]: 2026-01-20 14:42:11.855 226890 DEBUG nova.network.neutron [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Successfully created port: 01ad691e-a0f7-4b59-9b84-3486189332a5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:42:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:11.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:12 np0005588920 nova_compute[226886]: 2026-01-20 14:42:12.787 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:13 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #73. Immutable memtables: 0.
Jan 20 09:42:13 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:42:13.159133) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 09:42:13 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 43] Flushing memtable with next log file: 73
Jan 20 09:42:13 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920133159182, "job": 43, "event": "flush_started", "num_memtables": 1, "num_entries": 310, "num_deletes": 255, "total_data_size": 122037, "memory_usage": 128784, "flush_reason": "Manual Compaction"}
Jan 20 09:42:13 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 43] Level-0 flush table #74: started
Jan 20 09:42:13 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920133161891, "cf_name": "default", "job": 43, "event": "table_file_creation", "file_number": 74, "file_size": 79974, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 38908, "largest_seqno": 39213, "table_properties": {"data_size": 78047, "index_size": 155, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 709, "raw_key_size": 4925, "raw_average_key_size": 17, "raw_value_size": 74148, "raw_average_value_size": 264, "num_data_blocks": 7, "num_entries": 280, "num_filter_entries": 280, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768920127, "oldest_key_time": 1768920127, "file_creation_time": 1768920133, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 74, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:42:13 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 43] Flush lasted 2781 microseconds, and 1054 cpu microseconds.
Jan 20 09:42:13 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:42:13 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:42:13.161918) [db/flush_job.cc:967] [default] [JOB 43] Level-0 flush table #74: 79974 bytes OK
Jan 20 09:42:13 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:42:13.161931) [db/memtable_list.cc:519] [default] Level-0 commit table #74 started
Jan 20 09:42:13 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:42:13.163149) [db/memtable_list.cc:722] [default] Level-0 commit table #74: memtable #1 done
Jan 20 09:42:13 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:42:13.163159) EVENT_LOG_v1 {"time_micros": 1768920133163156, "job": 43, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 09:42:13 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:42:13.163170) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 09:42:13 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 43] Try to delete WAL files size 119792, prev total WAL file size 119792, number of live WAL files 2.
Jan 20 09:42:13 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000070.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:42:13 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:42:13.163454) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0031303131' seq:72057594037927935, type:22 .. '6C6F676D0031323632' seq:0, type:0; will stop at (end)
Jan 20 09:42:13 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 44] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 09:42:13 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 43 Base level 0, inputs: [74(78KB)], [72(9615KB)]
Jan 20 09:42:13 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920133163488, "job": 44, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [74], "files_L6": [72], "score": -1, "input_data_size": 9925874, "oldest_snapshot_seqno": -1}
Jan 20 09:42:13 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 44] Generated table #75: 6221 keys, 9792445 bytes, temperature: kUnknown
Jan 20 09:42:13 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920133247656, "cf_name": "default", "job": 44, "event": "table_file_creation", "file_number": 75, "file_size": 9792445, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9750545, "index_size": 25214, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 15621, "raw_key_size": 160917, "raw_average_key_size": 25, "raw_value_size": 9638549, "raw_average_value_size": 1549, "num_data_blocks": 1005, "num_entries": 6221, "num_filter_entries": 6221, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768920133, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 75, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:42:13 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:42:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:13.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:13 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:42:13.247930) [db/compaction/compaction_job.cc:1663] [default] [JOB 44] Compacted 1@0 + 1@6 files to L6 => 9792445 bytes
Jan 20 09:42:13 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:42:13.255745) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 117.8 rd, 116.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.1, 9.4 +0.0 blob) out(9.3 +0.0 blob), read-write-amplify(246.6) write-amplify(122.4) OK, records in: 6739, records dropped: 518 output_compression: NoCompression
Jan 20 09:42:13 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:42:13.255781) EVENT_LOG_v1 {"time_micros": 1768920133255765, "job": 44, "event": "compaction_finished", "compaction_time_micros": 84282, "compaction_time_cpu_micros": 20920, "output_level": 6, "num_output_files": 1, "total_output_size": 9792445, "num_input_records": 6739, "num_output_records": 6221, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 09:42:13 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000074.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:42:13 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920133256390, "job": 44, "event": "table_file_deletion", "file_number": 74}
Jan 20 09:42:13 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000072.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:42:13 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920133259870, "job": 44, "event": "table_file_deletion", "file_number": 72}
Jan 20 09:42:13 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:42:13.163373) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:42:13 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:42:13.259990) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:42:13 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:42:13.259994) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:42:13 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:42:13.259997) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:42:13 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:42:13.259998) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:42:13 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:42:13.260000) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:42:13 np0005588920 nova_compute[226886]: 2026-01-20 14:42:13.802 226890 DEBUG nova.network.neutron [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Successfully created port: 7eea52e4-93c4-48e5-9db5-b9d834c2bdbd _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:42:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:13.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:15 np0005588920 nova_compute[226886]: 2026-01-20 14:42:15.160 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:15.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:15 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:42:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:15.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:16 np0005588920 nova_compute[226886]: 2026-01-20 14:42:16.283 226890 DEBUG nova.network.neutron [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Successfully updated port: 2251427b-053e-477b-919c-0a2be96a4c01 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:42:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:16.445 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:16.445 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:16.446 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:17.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:17 np0005588920 nova_compute[226886]: 2026-01-20 14:42:17.300 226890 DEBUG oslo_concurrency.lockutils [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Acquiring lock "525b8695-a4df-46c5-875a-42d3b18b78be" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:17 np0005588920 nova_compute[226886]: 2026-01-20 14:42:17.301 226890 DEBUG oslo_concurrency.lockutils [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Lock "525b8695-a4df-46c5-875a-42d3b18b78be" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:17 np0005588920 nova_compute[226886]: 2026-01-20 14:42:17.335 226890 DEBUG nova.compute.manager [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:42:17 np0005588920 nova_compute[226886]: 2026-01-20 14:42:17.448 226890 DEBUG oslo_concurrency.lockutils [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:17 np0005588920 nova_compute[226886]: 2026-01-20 14:42:17.449 226890 DEBUG oslo_concurrency.lockutils [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:17 np0005588920 nova_compute[226886]: 2026-01-20 14:42:17.459 226890 DEBUG nova.virt.hardware [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:42:17 np0005588920 nova_compute[226886]: 2026-01-20 14:42:17.460 226890 INFO nova.compute.claims [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 09:42:17 np0005588920 nova_compute[226886]: 2026-01-20 14:42:17.654 226890 DEBUG oslo_concurrency.processutils [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:42:17 np0005588920 nova_compute[226886]: 2026-01-20 14:42:17.789 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:17.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:18 np0005588920 nova_compute[226886]: 2026-01-20 14:42:18.074 226890 DEBUG oslo_concurrency.processutils [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.420s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:42:18 np0005588920 nova_compute[226886]: 2026-01-20 14:42:18.081 226890 DEBUG nova.compute.provider_tree [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 20 09:42:18 np0005588920 nova_compute[226886]: 2026-01-20 14:42:18.103 226890 DEBUG nova.scheduler.client.report [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 20 09:42:18 np0005588920 nova_compute[226886]: 2026-01-20 14:42:18.133 226890 DEBUG oslo_concurrency.lockutils [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.684s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:42:18 np0005588920 nova_compute[226886]: 2026-01-20 14:42:18.134 226890 DEBUG nova.compute.manager [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 20 09:42:18 np0005588920 nova_compute[226886]: 2026-01-20 14:42:18.216 226890 DEBUG nova.compute.manager [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 20 09:42:18 np0005588920 nova_compute[226886]: 2026-01-20 14:42:18.216 226890 DEBUG nova.network.neutron [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 20 09:42:18 np0005588920 nova_compute[226886]: 2026-01-20 14:42:18.238 226890 INFO nova.virt.libvirt.driver [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 20 09:42:18 np0005588920 nova_compute[226886]: 2026-01-20 14:42:18.265 226890 DEBUG nova.compute.manager [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 20 09:42:18 np0005588920 nova_compute[226886]: 2026-01-20 14:42:18.407 226890 DEBUG nova.compute.manager [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 20 09:42:18 np0005588920 nova_compute[226886]: 2026-01-20 14:42:18.409 226890 DEBUG nova.virt.libvirt.driver [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 20 09:42:18 np0005588920 nova_compute[226886]: 2026-01-20 14:42:18.410 226890 INFO nova.virt.libvirt.driver [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Creating image(s)
Jan 20 09:42:18 np0005588920 nova_compute[226886]: 2026-01-20 14:42:18.448 226890 DEBUG nova.storage.rbd_utils [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] rbd image 525b8695-a4df-46c5-875a-42d3b18b78be_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:42:18 np0005588920 nova_compute[226886]: 2026-01-20 14:42:18.487 226890 DEBUG nova.storage.rbd_utils [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] rbd image 525b8695-a4df-46c5-875a-42d3b18b78be_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:42:18 np0005588920 nova_compute[226886]: 2026-01-20 14:42:18.526 226890 DEBUG nova.storage.rbd_utils [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] rbd image 525b8695-a4df-46c5-875a-42d3b18b78be_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:42:18 np0005588920 nova_compute[226886]: 2026-01-20 14:42:18.532 226890 DEBUG oslo_concurrency.processutils [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:42:18 np0005588920 nova_compute[226886]: 2026-01-20 14:42:18.595 226890 DEBUG nova.network.neutron [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Successfully updated port: 01ad691e-a0f7-4b59-9b84-3486189332a5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 20 09:42:18 np0005588920 nova_compute[226886]: 2026-01-20 14:42:18.601 226890 DEBUG nova.compute.manager [req-b3918137-a073-415c-9c82-7b0d2282be42 req-b2a93b21-24f2-49f8-b8b6-811294bcd38b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Received event network-changed-2251427b-053e-477b-919c-0a2be96a4c01 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 09:42:18 np0005588920 nova_compute[226886]: 2026-01-20 14:42:18.601 226890 DEBUG nova.compute.manager [req-b3918137-a073-415c-9c82-7b0d2282be42 req-b2a93b21-24f2-49f8-b8b6-811294bcd38b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Refreshing instance network info cache due to event network-changed-2251427b-053e-477b-919c-0a2be96a4c01. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 20 09:42:18 np0005588920 nova_compute[226886]: 2026-01-20 14:42:18.602 226890 DEBUG oslo_concurrency.lockutils [req-b3918137-a073-415c-9c82-7b0d2282be42 req-b2a93b21-24f2-49f8-b8b6-811294bcd38b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-55628882-70c9-4b43-b653-9983ba87ca0d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 20 09:42:18 np0005588920 nova_compute[226886]: 2026-01-20 14:42:18.602 226890 DEBUG oslo_concurrency.lockutils [req-b3918137-a073-415c-9c82-7b0d2282be42 req-b2a93b21-24f2-49f8-b8b6-811294bcd38b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-55628882-70c9-4b43-b653-9983ba87ca0d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 20 09:42:18 np0005588920 nova_compute[226886]: 2026-01-20 14:42:18.602 226890 DEBUG nova.network.neutron [req-b3918137-a073-415c-9c82-7b0d2282be42 req-b2a93b21-24f2-49f8-b8b6-811294bcd38b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Refreshing network info cache for port 2251427b-053e-477b-919c-0a2be96a4c01 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 20 09:42:18 np0005588920 nova_compute[226886]: 2026-01-20 14:42:18.627 226890 DEBUG oslo_concurrency.processutils [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:42:18 np0005588920 nova_compute[226886]: 2026-01-20 14:42:18.627 226890 DEBUG oslo_concurrency.lockutils [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:42:18 np0005588920 nova_compute[226886]: 2026-01-20 14:42:18.628 226890 DEBUG oslo_concurrency.lockutils [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:42:18 np0005588920 nova_compute[226886]: 2026-01-20 14:42:18.628 226890 DEBUG oslo_concurrency.lockutils [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:42:18 np0005588920 nova_compute[226886]: 2026-01-20 14:42:18.656 226890 DEBUG nova.storage.rbd_utils [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] rbd image 525b8695-a4df-46c5-875a-42d3b18b78be_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:42:18 np0005588920 nova_compute[226886]: 2026-01-20 14:42:18.660 226890 DEBUG oslo_concurrency.processutils [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 525b8695-a4df-46c5-875a-42d3b18b78be_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:42:18 np0005588920 nova_compute[226886]: 2026-01-20 14:42:18.745 226890 DEBUG nova.policy [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ff99fc8eda0640928c6e82981dacb266', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4b95747114ab4043b93a260387199c91', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 20 09:42:18 np0005588920 nova_compute[226886]: 2026-01-20 14:42:18.966 226890 DEBUG oslo_concurrency.processutils [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 525b8695-a4df-46c5-875a-42d3b18b78be_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.306s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:42:19 np0005588920 nova_compute[226886]: 2026-01-20 14:42:19.058 226890 DEBUG nova.storage.rbd_utils [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] resizing rbd image 525b8695-a4df-46c5-875a-42d3b18b78be_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 20 09:42:19 np0005588920 nova_compute[226886]: 2026-01-20 14:42:19.181 226890 DEBUG nova.objects.instance [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Lazy-loading 'migration_context' on Instance uuid 525b8695-a4df-46c5-875a-42d3b18b78be obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 09:42:19 np0005588920 nova_compute[226886]: 2026-01-20 14:42:19.213 226890 DEBUG nova.virt.libvirt.driver [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 20 09:42:19 np0005588920 nova_compute[226886]: 2026-01-20 14:42:19.214 226890 DEBUG nova.virt.libvirt.driver [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Ensure instance console log exists: /var/lib/nova/instances/525b8695-a4df-46c5-875a-42d3b18b78be/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 20 09:42:19 np0005588920 nova_compute[226886]: 2026-01-20 14:42:19.215 226890 DEBUG oslo_concurrency.lockutils [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:42:19 np0005588920 nova_compute[226886]: 2026-01-20 14:42:19.216 226890 DEBUG oslo_concurrency.lockutils [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:42:19 np0005588920 nova_compute[226886]: 2026-01-20 14:42:19.218 226890 DEBUG oslo_concurrency.lockutils [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:42:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:19.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:19 np0005588920 nova_compute[226886]: 2026-01-20 14:42:19.616 226890 DEBUG nova.network.neutron [req-b3918137-a073-415c-9c82-7b0d2282be42 req-b2a93b21-24f2-49f8-b8b6-811294bcd38b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 20 09:42:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:19.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:20 np0005588920 nova_compute[226886]: 2026-01-20 14:42:20.164 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:42:20 np0005588920 nova_compute[226886]: 2026-01-20 14:42:20.825 226890 DEBUG nova.compute.manager [req-0cef002d-4f84-4d05-af57-d12af3485ad9 req-2cd00fba-e63f-45d8-ab13-d65e88e602c6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Received event network-changed-01ad691e-a0f7-4b59-9b84-3486189332a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 09:42:20 np0005588920 nova_compute[226886]: 2026-01-20 14:42:20.825 226890 DEBUG nova.compute.manager [req-0cef002d-4f84-4d05-af57-d12af3485ad9 req-2cd00fba-e63f-45d8-ab13-d65e88e602c6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Refreshing instance network info cache due to event network-changed-01ad691e-a0f7-4b59-9b84-3486189332a5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 20 09:42:20 np0005588920 nova_compute[226886]: 2026-01-20 14:42:20.826 226890 DEBUG oslo_concurrency.lockutils [req-0cef002d-4f84-4d05-af57-d12af3485ad9 req-2cd00fba-e63f-45d8-ab13-d65e88e602c6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-55628882-70c9-4b43-b653-9983ba87ca0d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 20 09:42:20 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:42:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:21.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:21 np0005588920 nova_compute[226886]: 2026-01-20 14:42:21.514 226890 DEBUG nova.network.neutron [req-b3918137-a073-415c-9c82-7b0d2282be42 req-b2a93b21-24f2-49f8-b8b6-811294bcd38b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 20 09:42:21 np0005588920 nova_compute[226886]: 2026-01-20 14:42:21.540 226890 DEBUG oslo_concurrency.lockutils [req-b3918137-a073-415c-9c82-7b0d2282be42 req-b2a93b21-24f2-49f8-b8b6-811294bcd38b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-55628882-70c9-4b43-b653-9983ba87ca0d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 20 09:42:21 np0005588920 nova_compute[226886]: 2026-01-20 14:42:21.540 226890 DEBUG oslo_concurrency.lockutils [req-0cef002d-4f84-4d05-af57-d12af3485ad9 req-2cd00fba-e63f-45d8-ab13-d65e88e602c6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-55628882-70c9-4b43-b653-9983ba87ca0d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 20 09:42:21 np0005588920 nova_compute[226886]: 2026-01-20 14:42:21.541 226890 DEBUG nova.network.neutron [req-0cef002d-4f84-4d05-af57-d12af3485ad9 req-2cd00fba-e63f-45d8-ab13-d65e88e602c6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Refreshing network info cache for port 01ad691e-a0f7-4b59-9b84-3486189332a5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 20 09:42:21 np0005588920 nova_compute[226886]: 2026-01-20 14:42:21.552 226890 DEBUG nova.network.neutron [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Successfully updated port: 7eea52e4-93c4-48e5-9db5-b9d834c2bdbd _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 20 09:42:21 np0005588920 nova_compute[226886]: 2026-01-20 14:42:21.565 226890 DEBUG oslo_concurrency.lockutils [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Acquiring lock "refresh_cache-55628882-70c9-4b43-b653-9983ba87ca0d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 20 09:42:21 np0005588920 nova_compute[226886]: 2026-01-20 14:42:21.955 226890 DEBUG nova.network.neutron [req-0cef002d-4f84-4d05-af57-d12af3485ad9 req-2cd00fba-e63f-45d8-ab13-d65e88e602c6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 20 09:42:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:21.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:22 np0005588920 nova_compute[226886]: 2026-01-20 14:42:22.267 226890 DEBUG nova.network.neutron [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Successfully created port: 01b95c4f-9db6-469f-9458-8c279a5778f0 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 20 09:42:22 np0005588920 nova_compute[226886]: 2026-01-20 14:42:22.518 226890 DEBUG nova.network.neutron [req-0cef002d-4f84-4d05-af57-d12af3485ad9 req-2cd00fba-e63f-45d8-ab13-d65e88e602c6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 20 09:42:22 np0005588920 nova_compute[226886]: 2026-01-20 14:42:22.549 226890 DEBUG oslo_concurrency.lockutils [req-0cef002d-4f84-4d05-af57-d12af3485ad9 req-2cd00fba-e63f-45d8-ab13-d65e88e602c6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-55628882-70c9-4b43-b653-9983ba87ca0d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 20 09:42:22 np0005588920 nova_compute[226886]: 2026-01-20 14:42:22.549 226890 DEBUG oslo_concurrency.lockutils [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Acquired lock "refresh_cache-55628882-70c9-4b43-b653-9983ba87ca0d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 20 09:42:22 np0005588920 nova_compute[226886]: 2026-01-20 14:42:22.549 226890 DEBUG nova.network.neutron [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 20 09:42:22 np0005588920 nova_compute[226886]: 2026-01-20 14:42:22.773 226890 DEBUG nova.network.neutron [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 20 09:42:22 np0005588920 nova_compute[226886]: 2026-01-20 14:42:22.791 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:42:22 np0005588920 nova_compute[226886]: 2026-01-20 14:42:22.983 226890 DEBUG nova.compute.manager [req-5d18b1e0-0183-4e69-9f8a-983b3863ef5d req-fcf4e23a-21ae-4d2e-be83-ec922b4ae070 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Received event network-changed-7eea52e4-93c4-48e5-9db5-b9d834c2bdbd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 09:42:22 np0005588920 nova_compute[226886]: 2026-01-20 14:42:22.984 226890 DEBUG nova.compute.manager [req-5d18b1e0-0183-4e69-9f8a-983b3863ef5d req-fcf4e23a-21ae-4d2e-be83-ec922b4ae070 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Refreshing instance network info cache due to event network-changed-7eea52e4-93c4-48e5-9db5-b9d834c2bdbd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 20 09:42:22 np0005588920 nova_compute[226886]: 2026-01-20 14:42:22.985 226890 DEBUG oslo_concurrency.lockutils [req-5d18b1e0-0183-4e69-9f8a-983b3863ef5d req-fcf4e23a-21ae-4d2e-be83-ec922b4ae070 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-55628882-70c9-4b43-b653-9983ba87ca0d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 20 09:42:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:42:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:23.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:42:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:23.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:24 np0005588920 nova_compute[226886]: 2026-01-20 14:42:24.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 09:42:24 np0005588920 nova_compute[226886]: 2026-01-20 14:42:24.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 20 09:42:24 np0005588920 nova_compute[226886]: 2026-01-20 14:42:24.726 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 20 09:42:24 np0005588920 nova_compute[226886]: 2026-01-20 14:42:24.786 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 20 09:42:24 np0005588920 nova_compute[226886]: 2026-01-20 14:42:24.786 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 20 09:42:24 np0005588920 nova_compute[226886]: 2026-01-20 14:42:24.786 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 20 09:42:25 np0005588920 nova_compute[226886]: 2026-01-20 14:42:25.167 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:42:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:42:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:25.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:42:25 np0005588920 nova_compute[226886]: 2026-01-20 14:42:25.824 226890 DEBUG nova.network.neutron [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Successfully updated port: 01b95c4f-9db6-469f-9458-8c279a5778f0 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 20 09:42:25 np0005588920 nova_compute[226886]: 2026-01-20 14:42:25.844 226890 DEBUG oslo_concurrency.lockutils [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Acquiring lock "refresh_cache-525b8695-a4df-46c5-875a-42d3b18b78be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 20 09:42:25 np0005588920 nova_compute[226886]: 2026-01-20 14:42:25.844 226890 DEBUG oslo_concurrency.lockutils [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Acquired lock "refresh_cache-525b8695-a4df-46c5-875a-42d3b18b78be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 20 09:42:25 np0005588920 nova_compute[226886]: 2026-01-20 14:42:25.844 226890 DEBUG nova.network.neutron [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 20 09:42:25 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:42:25 np0005588920 nova_compute[226886]: 2026-01-20 14:42:25.989 226890 DEBUG nova.compute.manager [req-8ef92bfb-02d7-498f-8926-44bb7d094b03 req-78a51f7d-ef54-4f32-9f98-3b92d1220ed7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Received event network-changed-01b95c4f-9db6-469f-9458-8c279a5778f0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:42:25 np0005588920 nova_compute[226886]: 2026-01-20 14:42:25.989 226890 DEBUG nova.compute.manager [req-8ef92bfb-02d7-498f-8926-44bb7d094b03 req-78a51f7d-ef54-4f32-9f98-3b92d1220ed7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Refreshing instance network info cache due to event network-changed-01b95c4f-9db6-469f-9458-8c279a5778f0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:42:25 np0005588920 nova_compute[226886]: 2026-01-20 14:42:25.989 226890 DEBUG oslo_concurrency.lockutils [req-8ef92bfb-02d7-498f-8926-44bb7d094b03 req-78a51f7d-ef54-4f32-9f98-3b92d1220ed7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-525b8695-a4df-46c5-875a-42d3b18b78be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:42:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:25.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:26 np0005588920 nova_compute[226886]: 2026-01-20 14:42:26.567 226890 DEBUG nova.network.neutron [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:42:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:27.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:27 np0005588920 nova_compute[226886]: 2026-01-20 14:42:27.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:42:27 np0005588920 nova_compute[226886]: 2026-01-20 14:42:27.792 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:27.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:28 np0005588920 podman[253279]: 2026-01-20 14:42:28.022016709 +0000 UTC m=+0.114261666 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 20 09:42:28 np0005588920 nova_compute[226886]: 2026-01-20 14:42:28.515 226890 DEBUG nova.network.neutron [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Updating instance_info_cache with network_info: [{"id": "2251427b-053e-477b-919c-0a2be96a4c01", "address": "fa:16:3e:89:1e:d5", "network": {"id": "6e7523a4-fd95-46c8-82f2-10c4527c1b7d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1506530564", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.171", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13c0d93976f745dba4ab050770ccaae6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2251427b-05", "ovs_interfaceid": "2251427b-053e-477b-919c-0a2be96a4c01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "01ad691e-a0f7-4b59-9b84-3486189332a5", "address": "fa:16:3e:54:02:77", "network": {"id": "1a896e9d-306e-4f99-81e1-986b217a807d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1473946507", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.236", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13c0d93976f745dba4ab050770ccaae6", "mtu": 1442, "physical_network": 
null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01ad691e-a0", "ovs_interfaceid": "01ad691e-a0f7-4b59-9b84-3486189332a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7eea52e4-93c4-48e5-9db5-b9d834c2bdbd", "address": "fa:16:3e:5a:cf:4c", "network": {"id": "6e7523a4-fd95-46c8-82f2-10c4527c1b7d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1506530564", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.107", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13c0d93976f745dba4ab050770ccaae6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7eea52e4-93", "ovs_interfaceid": "7eea52e4-93c4-48e5-9db5-b9d834c2bdbd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:42:28 np0005588920 nova_compute[226886]: 2026-01-20 14:42:28.554 226890 DEBUG oslo_concurrency.lockutils [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Releasing lock "refresh_cache-55628882-70c9-4b43-b653-9983ba87ca0d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:42:28 np0005588920 nova_compute[226886]: 2026-01-20 14:42:28.555 226890 DEBUG nova.compute.manager [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Instance network_info: |[{"id": "2251427b-053e-477b-919c-0a2be96a4c01", "address": "fa:16:3e:89:1e:d5", "network": {"id": "6e7523a4-fd95-46c8-82f2-10c4527c1b7d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1506530564", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.171", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13c0d93976f745dba4ab050770ccaae6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2251427b-05", "ovs_interfaceid": "2251427b-053e-477b-919c-0a2be96a4c01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "01ad691e-a0f7-4b59-9b84-3486189332a5", "address": "fa:16:3e:54:02:77", "network": {"id": "1a896e9d-306e-4f99-81e1-986b217a807d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1473946507", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.236", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13c0d93976f745dba4ab050770ccaae6", "mtu": 1442, "physical_network": null, "tunneled": true}}, 
"type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01ad691e-a0", "ovs_interfaceid": "01ad691e-a0f7-4b59-9b84-3486189332a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7eea52e4-93c4-48e5-9db5-b9d834c2bdbd", "address": "fa:16:3e:5a:cf:4c", "network": {"id": "6e7523a4-fd95-46c8-82f2-10c4527c1b7d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1506530564", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.107", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13c0d93976f745dba4ab050770ccaae6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7eea52e4-93", "ovs_interfaceid": "7eea52e4-93c4-48e5-9db5-b9d834c2bdbd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:42:28 np0005588920 nova_compute[226886]: 2026-01-20 14:42:28.555 226890 DEBUG oslo_concurrency.lockutils [req-5d18b1e0-0183-4e69-9f8a-983b3863ef5d req-fcf4e23a-21ae-4d2e-be83-ec922b4ae070 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-55628882-70c9-4b43-b653-9983ba87ca0d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:42:28 np0005588920 nova_compute[226886]: 2026-01-20 14:42:28.555 226890 DEBUG nova.network.neutron [req-5d18b1e0-0183-4e69-9f8a-983b3863ef5d req-fcf4e23a-21ae-4d2e-be83-ec922b4ae070 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Refreshing network info cache for port 7eea52e4-93c4-48e5-9db5-b9d834c2bdbd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:42:28 np0005588920 nova_compute[226886]: 2026-01-20 14:42:28.559 226890 DEBUG nova.virt.libvirt.driver [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Start _get_guest_xml network_info=[{"id": "2251427b-053e-477b-919c-0a2be96a4c01", "address": "fa:16:3e:89:1e:d5", "network": {"id": "6e7523a4-fd95-46c8-82f2-10c4527c1b7d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1506530564", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.171", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13c0d93976f745dba4ab050770ccaae6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2251427b-05", "ovs_interfaceid": "2251427b-053e-477b-919c-0a2be96a4c01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "01ad691e-a0f7-4b59-9b84-3486189332a5", "address": "fa:16:3e:54:02:77", "network": {"id": "1a896e9d-306e-4f99-81e1-986b217a807d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1473946507", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.236", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13c0d93976f745dba4ab050770ccaae6", "mtu": 1442, "physical_network": null, 
"tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01ad691e-a0", "ovs_interfaceid": "01ad691e-a0f7-4b59-9b84-3486189332a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7eea52e4-93c4-48e5-9db5-b9d834c2bdbd", "address": "fa:16:3e:5a:cf:4c", "network": {"id": "6e7523a4-fd95-46c8-82f2-10c4527c1b7d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1506530564", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.107", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13c0d93976f745dba4ab050770ccaae6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7eea52e4-93", "ovs_interfaceid": "7eea52e4-93c4-48e5-9db5-b9d834c2bdbd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:42:28 np0005588920 nova_compute[226886]: 2026-01-20 14:42:28.564 226890 WARNING nova.virt.libvirt.driver [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:42:28 np0005588920 nova_compute[226886]: 2026-01-20 14:42:28.569 226890 DEBUG nova.virt.libvirt.host [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:42:28 np0005588920 nova_compute[226886]: 2026-01-20 14:42:28.570 226890 DEBUG nova.virt.libvirt.host [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:42:28 np0005588920 nova_compute[226886]: 2026-01-20 14:42:28.577 226890 DEBUG nova.virt.libvirt.host [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:42:28 np0005588920 nova_compute[226886]: 2026-01-20 14:42:28.578 226890 DEBUG nova.virt.libvirt.host [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:42:28 np0005588920 nova_compute[226886]: 2026-01-20 14:42:28.579 226890 DEBUG nova.virt.libvirt.driver [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:42:28 np0005588920 nova_compute[226886]: 2026-01-20 14:42:28.579 226890 DEBUG nova.virt.hardware [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:42:28 np0005588920 nova_compute[226886]: 2026-01-20 14:42:28.580 226890 DEBUG nova.virt.hardware [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:42:28 np0005588920 nova_compute[226886]: 2026-01-20 14:42:28.580 226890 DEBUG nova.virt.hardware [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:42:28 np0005588920 nova_compute[226886]: 2026-01-20 14:42:28.580 226890 DEBUG nova.virt.hardware [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:42:28 np0005588920 nova_compute[226886]: 2026-01-20 14:42:28.580 226890 DEBUG nova.virt.hardware [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:42:28 np0005588920 nova_compute[226886]: 2026-01-20 14:42:28.580 226890 DEBUG nova.virt.hardware [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:42:28 np0005588920 nova_compute[226886]: 2026-01-20 14:42:28.581 226890 DEBUG nova.virt.hardware [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:42:28 np0005588920 nova_compute[226886]: 2026-01-20 14:42:28.581 226890 DEBUG nova.virt.hardware [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:42:28 np0005588920 nova_compute[226886]: 2026-01-20 14:42:28.581 226890 DEBUG nova.virt.hardware [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:42:28 np0005588920 nova_compute[226886]: 2026-01-20 14:42:28.581 226890 DEBUG nova.virt.hardware [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:42:28 np0005588920 nova_compute[226886]: 2026-01-20 14:42:28.581 226890 DEBUG nova.virt.hardware [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:42:28 np0005588920 nova_compute[226886]: 2026-01-20 14:42:28.584 226890 DEBUG oslo_concurrency.processutils [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:42:28 np0005588920 nova_compute[226886]: 2026-01-20 14:42:28.929 226890 DEBUG nova.network.neutron [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Updating instance_info_cache with network_info: [{"id": "01b95c4f-9db6-469f-9458-8c279a5778f0", "address": "fa:16:3e:d7:80:84", "network": {"id": "b36e9cab-12c6-4a09-9aab-ef2679d875ba", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-432532406-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b95747114ab4043b93a260387199c91", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01b95c4f-9d", "ovs_interfaceid": "01b95c4f-9db6-469f-9458-8c279a5778f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:42:28 np0005588920 nova_compute[226886]: 2026-01-20 14:42:28.968 226890 DEBUG oslo_concurrency.lockutils [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Releasing lock "refresh_cache-525b8695-a4df-46c5-875a-42d3b18b78be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:42:28 np0005588920 nova_compute[226886]: 2026-01-20 14:42:28.969 226890 DEBUG nova.compute.manager [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Instance network_info: |[{"id": "01b95c4f-9db6-469f-9458-8c279a5778f0", "address": "fa:16:3e:d7:80:84", "network": {"id": "b36e9cab-12c6-4a09-9aab-ef2679d875ba", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-432532406-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b95747114ab4043b93a260387199c91", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01b95c4f-9d", "ovs_interfaceid": "01b95c4f-9db6-469f-9458-8c279a5778f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:42:28 np0005588920 nova_compute[226886]: 2026-01-20 14:42:28.969 226890 DEBUG oslo_concurrency.lockutils [req-8ef92bfb-02d7-498f-8926-44bb7d094b03 req-78a51f7d-ef54-4f32-9f98-3b92d1220ed7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-525b8695-a4df-46c5-875a-42d3b18b78be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:42:28 np0005588920 nova_compute[226886]: 2026-01-20 14:42:28.970 226890 DEBUG nova.network.neutron [req-8ef92bfb-02d7-498f-8926-44bb7d094b03 req-78a51f7d-ef54-4f32-9f98-3b92d1220ed7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Refreshing network info cache for port 01b95c4f-9db6-469f-9458-8c279a5778f0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:42:28 np0005588920 nova_compute[226886]: 2026-01-20 14:42:28.975 226890 DEBUG nova.virt.libvirt.driver [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Start _get_guest_xml network_info=[{"id": "01b95c4f-9db6-469f-9458-8c279a5778f0", "address": "fa:16:3e:d7:80:84", "network": {"id": "b36e9cab-12c6-4a09-9aab-ef2679d875ba", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-432532406-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b95747114ab4043b93a260387199c91", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01b95c4f-9d", "ovs_interfaceid": "01b95c4f-9db6-469f-9458-8c279a5778f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:42:28 np0005588920 nova_compute[226886]: 2026-01-20 14:42:28.981 226890 WARNING nova.virt.libvirt.driver [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:42:28 np0005588920 nova_compute[226886]: 2026-01-20 14:42:28.986 226890 DEBUG nova.virt.libvirt.host [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:42:28 np0005588920 nova_compute[226886]: 2026-01-20 14:42:28.987 226890 DEBUG nova.virt.libvirt.host [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:42:28 np0005588920 nova_compute[226886]: 2026-01-20 14:42:28.990 226890 DEBUG nova.virt.libvirt.host [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:42:28 np0005588920 nova_compute[226886]: 2026-01-20 14:42:28.994 226890 DEBUG nova.virt.libvirt.host [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:42:28 np0005588920 nova_compute[226886]: 2026-01-20 14:42:28.995 226890 DEBUG nova.virt.libvirt.driver [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:42:28 np0005588920 nova_compute[226886]: 2026-01-20 14:42:28.996 226890 DEBUG nova.virt.hardware [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='30c26a27-d918-46d8-a512-4ef3b4ce5955',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:42:28 np0005588920 nova_compute[226886]: 2026-01-20 14:42:28.996 226890 DEBUG nova.virt.hardware [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:42:28 np0005588920 nova_compute[226886]: 2026-01-20 14:42:28.997 226890 DEBUG nova.virt.hardware [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:42:28 np0005588920 nova_compute[226886]: 2026-01-20 14:42:28.997 226890 DEBUG nova.virt.hardware [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:42:28 np0005588920 nova_compute[226886]: 2026-01-20 14:42:28.997 226890 DEBUG nova.virt.hardware [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:42:28 np0005588920 nova_compute[226886]: 2026-01-20 14:42:28.998 226890 DEBUG nova.virt.hardware [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:42:28 np0005588920 nova_compute[226886]: 2026-01-20 14:42:28.998 226890 DEBUG nova.virt.hardware [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:42:28 np0005588920 nova_compute[226886]: 2026-01-20 14:42:28.998 226890 DEBUG nova.virt.hardware [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:42:28 np0005588920 nova_compute[226886]: 2026-01-20 14:42:28.999 226890 DEBUG nova.virt.hardware [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:42:28 np0005588920 nova_compute[226886]: 2026-01-20 14:42:28.999 226890 DEBUG nova.virt.hardware [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:28.999 226890 DEBUG nova.virt.hardware [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.004 226890 DEBUG oslo_concurrency.processutils [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:42:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:42:29 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/311125799' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.048 226890 DEBUG oslo_concurrency.processutils [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.096 226890 DEBUG nova.storage.rbd_utils [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] rbd image 55628882-70c9-4b43-b653-9983ba87ca0d_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.101 226890 DEBUG oslo_concurrency.processutils [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:42:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:42:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:29.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:42:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:42:29 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4237730022' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.468 226890 DEBUG oslo_concurrency.processutils [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.496 226890 DEBUG nova.storage.rbd_utils [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] rbd image 525b8695-a4df-46c5-875a-42d3b18b78be_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.500 226890 DEBUG oslo_concurrency.processutils [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:42:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:42:29 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3877248440' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.561 226890 DEBUG oslo_concurrency.processutils [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.563 226890 DEBUG nova.virt.libvirt.vif [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:41:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-74208518',display_name='tempest-ServersTestMultiNic-server-74208518',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-74208518',id=73,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='13c0d93976f745dba4ab050770ccaae6',ramdisk_id='',reservation_id='r-so2diw9c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1634662961',owner_user_name='tempest-ServersTestMultiNic-1634662961-pr
oject-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:42:04Z,user_data=None,user_id='6a00517a957e4ceb8564cbf1dfa15ee2',uuid=55628882-70c9-4b43-b653-9983ba87ca0d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2251427b-053e-477b-919c-0a2be96a4c01", "address": "fa:16:3e:89:1e:d5", "network": {"id": "6e7523a4-fd95-46c8-82f2-10c4527c1b7d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1506530564", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.171", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13c0d93976f745dba4ab050770ccaae6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2251427b-05", "ovs_interfaceid": "2251427b-053e-477b-919c-0a2be96a4c01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.563 226890 DEBUG nova.network.os_vif_util [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Converting VIF {"id": "2251427b-053e-477b-919c-0a2be96a4c01", "address": "fa:16:3e:89:1e:d5", "network": {"id": "6e7523a4-fd95-46c8-82f2-10c4527c1b7d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1506530564", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.171", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13c0d93976f745dba4ab050770ccaae6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2251427b-05", "ovs_interfaceid": "2251427b-053e-477b-919c-0a2be96a4c01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.564 226890 DEBUG nova.network.os_vif_util [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:89:1e:d5,bridge_name='br-int',has_traffic_filtering=True,id=2251427b-053e-477b-919c-0a2be96a4c01,network=Network(6e7523a4-fd95-46c8-82f2-10c4527c1b7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2251427b-05') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.564 226890 DEBUG nova.virt.libvirt.vif [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:41:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-74208518',display_name='tempest-ServersTestMultiNic-server-74208518',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-74208518',id=73,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='13c0d93976f745dba4ab050770ccaae6',ramdisk_id='',reservation_id='r-so2diw9c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1634662961',owner_user_name='tempest-ServersTestMultiNic-1634662961-pr
oject-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:42:04Z,user_data=None,user_id='6a00517a957e4ceb8564cbf1dfa15ee2',uuid=55628882-70c9-4b43-b653-9983ba87ca0d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "01ad691e-a0f7-4b59-9b84-3486189332a5", "address": "fa:16:3e:54:02:77", "network": {"id": "1a896e9d-306e-4f99-81e1-986b217a807d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1473946507", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.236", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13c0d93976f745dba4ab050770ccaae6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01ad691e-a0", "ovs_interfaceid": "01ad691e-a0f7-4b59-9b84-3486189332a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.565 226890 DEBUG nova.network.os_vif_util [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Converting VIF {"id": "01ad691e-a0f7-4b59-9b84-3486189332a5", "address": "fa:16:3e:54:02:77", "network": {"id": "1a896e9d-306e-4f99-81e1-986b217a807d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1473946507", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.236", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13c0d93976f745dba4ab050770ccaae6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01ad691e-a0", "ovs_interfaceid": "01ad691e-a0f7-4b59-9b84-3486189332a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.565 226890 DEBUG nova.network.os_vif_util [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:54:02:77,bridge_name='br-int',has_traffic_filtering=True,id=01ad691e-a0f7-4b59-9b84-3486189332a5,network=Network(1a896e9d-306e-4f99-81e1-986b217a807d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01ad691e-a0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.565 226890 DEBUG nova.virt.libvirt.vif [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:41:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-74208518',display_name='tempest-ServersTestMultiNic-server-74208518',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-74208518',id=73,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='13c0d93976f745dba4ab050770ccaae6',ramdisk_id='',reservation_id='r-so2diw9c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1634662961',owner_user_name='tempest-ServersTestMultiNic-1634662961-pr
oject-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:42:04Z,user_data=None,user_id='6a00517a957e4ceb8564cbf1dfa15ee2',uuid=55628882-70c9-4b43-b653-9983ba87ca0d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7eea52e4-93c4-48e5-9db5-b9d834c2bdbd", "address": "fa:16:3e:5a:cf:4c", "network": {"id": "6e7523a4-fd95-46c8-82f2-10c4527c1b7d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1506530564", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.107", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13c0d93976f745dba4ab050770ccaae6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7eea52e4-93", "ovs_interfaceid": "7eea52e4-93c4-48e5-9db5-b9d834c2bdbd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.566 226890 DEBUG nova.network.os_vif_util [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Converting VIF {"id": "7eea52e4-93c4-48e5-9db5-b9d834c2bdbd", "address": "fa:16:3e:5a:cf:4c", "network": {"id": "6e7523a4-fd95-46c8-82f2-10c4527c1b7d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1506530564", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.107", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13c0d93976f745dba4ab050770ccaae6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7eea52e4-93", "ovs_interfaceid": "7eea52e4-93c4-48e5-9db5-b9d834c2bdbd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.566 226890 DEBUG nova.network.os_vif_util [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5a:cf:4c,bridge_name='br-int',has_traffic_filtering=True,id=7eea52e4-93c4-48e5-9db5-b9d834c2bdbd,network=Network(6e7523a4-fd95-46c8-82f2-10c4527c1b7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7eea52e4-93') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.567 226890 DEBUG nova.objects.instance [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Lazy-loading 'pci_devices' on Instance uuid 55628882-70c9-4b43-b653-9983ba87ca0d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.590 226890 DEBUG nova.virt.libvirt.driver [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:42:29 np0005588920 nova_compute[226886]:  <uuid>55628882-70c9-4b43-b653-9983ba87ca0d</uuid>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:  <name>instance-00000049</name>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <nova:name>tempest-ServersTestMultiNic-server-74208518</nova:name>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:42:28</nova:creationTime>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:42:29 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:        <nova:user uuid="6a00517a957e4ceb8564cbf1dfa15ee2">tempest-ServersTestMultiNic-1634662961-project-member</nova:user>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:        <nova:project uuid="13c0d93976f745dba4ab050770ccaae6">tempest-ServersTestMultiNic-1634662961</nova:project>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:        <nova:port uuid="2251427b-053e-477b-919c-0a2be96a4c01">
Jan 20 09:42:29 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.171" ipVersion="4"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:        <nova:port uuid="01ad691e-a0f7-4b59-9b84-3486189332a5">
Jan 20 09:42:29 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.1.236" ipVersion="4"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:        <nova:port uuid="7eea52e4-93c4-48e5-9db5-b9d834c2bdbd">
Jan 20 09:42:29 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.107" ipVersion="4"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <entry name="serial">55628882-70c9-4b43-b653-9983ba87ca0d</entry>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <entry name="uuid">55628882-70c9-4b43-b653-9983ba87ca0d</entry>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/55628882-70c9-4b43-b653-9983ba87ca0d_disk">
Jan 20 09:42:29 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:42:29 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/55628882-70c9-4b43-b653-9983ba87ca0d_disk.config">
Jan 20 09:42:29 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:42:29 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:89:1e:d5"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <target dev="tap2251427b-05"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:54:02:77"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <target dev="tap01ad691e-a0"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:5a:cf:4c"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <target dev="tap7eea52e4-93"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/55628882-70c9-4b43-b653-9983ba87ca0d/console.log" append="off"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:42:29 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:42:29 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.590 226890 DEBUG nova.compute.manager [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Preparing to wait for external event network-vif-plugged-2251427b-053e-477b-919c-0a2be96a4c01 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.590 226890 DEBUG oslo_concurrency.lockutils [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Acquiring lock "55628882-70c9-4b43-b653-9983ba87ca0d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.591 226890 DEBUG oslo_concurrency.lockutils [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Lock "55628882-70c9-4b43-b653-9983ba87ca0d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.591 226890 DEBUG oslo_concurrency.lockutils [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Lock "55628882-70c9-4b43-b653-9983ba87ca0d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.591 226890 DEBUG nova.compute.manager [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Preparing to wait for external event network-vif-plugged-01ad691e-a0f7-4b59-9b84-3486189332a5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.591 226890 DEBUG oslo_concurrency.lockutils [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Acquiring lock "55628882-70c9-4b43-b653-9983ba87ca0d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.591 226890 DEBUG oslo_concurrency.lockutils [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Lock "55628882-70c9-4b43-b653-9983ba87ca0d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.592 226890 DEBUG oslo_concurrency.lockutils [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Lock "55628882-70c9-4b43-b653-9983ba87ca0d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.592 226890 DEBUG nova.compute.manager [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Preparing to wait for external event network-vif-plugged-7eea52e4-93c4-48e5-9db5-b9d834c2bdbd prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.592 226890 DEBUG oslo_concurrency.lockutils [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Acquiring lock "55628882-70c9-4b43-b653-9983ba87ca0d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.592 226890 DEBUG oslo_concurrency.lockutils [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Lock "55628882-70c9-4b43-b653-9983ba87ca0d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.592 226890 DEBUG oslo_concurrency.lockutils [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Lock "55628882-70c9-4b43-b653-9983ba87ca0d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.593 226890 DEBUG nova.virt.libvirt.vif [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:41:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-74208518',display_name='tempest-ServersTestMultiNic-server-74208518',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-74208518',id=73,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='13c0d93976f745dba4ab050770ccaae6',ramdisk_id='',reservation_id='r-so2diw9c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1634662961',owner_user_name='tempest-ServersTestMultiNic-1634662961-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:42:04Z,user_data=None,user_id='6a00517a957e4ceb8564cbf1dfa15ee2',uuid=55628882-70c9-4b43-b653-9983ba87ca0d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2251427b-053e-477b-919c-0a2be96a4c01", "address": "fa:16:3e:89:1e:d5", "network": {"id": "6e7523a4-fd95-46c8-82f2-10c4527c1b7d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1506530564", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.171", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13c0d93976f745dba4ab050770ccaae6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2251427b-05", "ovs_interfaceid": "2251427b-053e-477b-919c-0a2be96a4c01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.593 226890 DEBUG nova.network.os_vif_util [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Converting VIF {"id": "2251427b-053e-477b-919c-0a2be96a4c01", "address": "fa:16:3e:89:1e:d5", "network": {"id": "6e7523a4-fd95-46c8-82f2-10c4527c1b7d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1506530564", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.171", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13c0d93976f745dba4ab050770ccaae6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2251427b-05", "ovs_interfaceid": "2251427b-053e-477b-919c-0a2be96a4c01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.593 226890 DEBUG nova.network.os_vif_util [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:89:1e:d5,bridge_name='br-int',has_traffic_filtering=True,id=2251427b-053e-477b-919c-0a2be96a4c01,network=Network(6e7523a4-fd95-46c8-82f2-10c4527c1b7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2251427b-05') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.594 226890 DEBUG os_vif [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:89:1e:d5,bridge_name='br-int',has_traffic_filtering=True,id=2251427b-053e-477b-919c-0a2be96a4c01,network=Network(6e7523a4-fd95-46c8-82f2-10c4527c1b7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2251427b-05') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.594 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.594 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.595 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.606 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.606 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2251427b-05, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.607 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2251427b-05, col_values=(('external_ids', {'iface-id': '2251427b-053e-477b-919c-0a2be96a4c01', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:89:1e:d5', 'vm-uuid': '55628882-70c9-4b43-b653-9983ba87ca0d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.655 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:29 np0005588920 NetworkManager[49076]: <info>  [1768920149.6556] manager: (tap2251427b-05): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/140)
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.657 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.661 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.662 226890 INFO os_vif [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:89:1e:d5,bridge_name='br-int',has_traffic_filtering=True,id=2251427b-053e-477b-919c-0a2be96a4c01,network=Network(6e7523a4-fd95-46c8-82f2-10c4527c1b7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2251427b-05')#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.662 226890 DEBUG nova.virt.libvirt.vif [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:41:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-74208518',display_name='tempest-ServersTestMultiNic-server-74208518',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-74208518',id=73,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='13c0d93976f745dba4ab050770ccaae6',ramdisk_id='',reservation_id='r-so2diw9c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1634662961',owner_user_name='tempest-ServersTestMultiNic-1634662961-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:42:04Z,user_data=None,user_id='6a00517a957e4ceb8564cbf1dfa15ee2',uuid=55628882-70c9-4b43-b653-9983ba87ca0d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "01ad691e-a0f7-4b59-9b84-3486189332a5", "address": "fa:16:3e:54:02:77", "network": {"id": "1a896e9d-306e-4f99-81e1-986b217a807d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1473946507", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.236", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13c0d93976f745dba4ab050770ccaae6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01ad691e-a0", "ovs_interfaceid": "01ad691e-a0f7-4b59-9b84-3486189332a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.663 226890 DEBUG nova.network.os_vif_util [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Converting VIF {"id": "01ad691e-a0f7-4b59-9b84-3486189332a5", "address": "fa:16:3e:54:02:77", "network": {"id": "1a896e9d-306e-4f99-81e1-986b217a807d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1473946507", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.236", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13c0d93976f745dba4ab050770ccaae6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01ad691e-a0", "ovs_interfaceid": "01ad691e-a0f7-4b59-9b84-3486189332a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.663 226890 DEBUG nova.network.os_vif_util [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:54:02:77,bridge_name='br-int',has_traffic_filtering=True,id=01ad691e-a0f7-4b59-9b84-3486189332a5,network=Network(1a896e9d-306e-4f99-81e1-986b217a807d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01ad691e-a0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.663 226890 DEBUG os_vif [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:54:02:77,bridge_name='br-int',has_traffic_filtering=True,id=01ad691e-a0f7-4b59-9b84-3486189332a5,network=Network(1a896e9d-306e-4f99-81e1-986b217a807d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01ad691e-a0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.664 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.664 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.664 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.666 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.666 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap01ad691e-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.666 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap01ad691e-a0, col_values=(('external_ids', {'iface-id': '01ad691e-a0f7-4b59-9b84-3486189332a5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:54:02:77', 'vm-uuid': '55628882-70c9-4b43-b653-9983ba87ca0d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.667 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:29 np0005588920 NetworkManager[49076]: <info>  [1768920149.6682] manager: (tap01ad691e-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/141)
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.669 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.673 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.674 226890 INFO os_vif [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:54:02:77,bridge_name='br-int',has_traffic_filtering=True,id=01ad691e-a0f7-4b59-9b84-3486189332a5,network=Network(1a896e9d-306e-4f99-81e1-986b217a807d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01ad691e-a0')#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.674 226890 DEBUG nova.virt.libvirt.vif [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:41:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-74208518',display_name='tempest-ServersTestMultiNic-server-74208518',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-74208518',id=73,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='13c0d93976f745dba4ab050770ccaae6',ramdisk_id='',reservation_id='r-so2diw9c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1634662961',owner_user_name='tempest-ServersTestMultiNic-1634662961-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:42:04Z,user_data=None,user_id='6a00517a957e4ceb8564cbf1dfa15ee2',uuid=55628882-70c9-4b43-b653-9983ba87ca0d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7eea52e4-93c4-48e5-9db5-b9d834c2bdbd", "address": "fa:16:3e:5a:cf:4c", "network": {"id": "6e7523a4-fd95-46c8-82f2-10c4527c1b7d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1506530564", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.107", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13c0d93976f745dba4ab050770ccaae6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7eea52e4-93", "ovs_interfaceid": "7eea52e4-93c4-48e5-9db5-b9d834c2bdbd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.674 226890 DEBUG nova.network.os_vif_util [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Converting VIF {"id": "7eea52e4-93c4-48e5-9db5-b9d834c2bdbd", "address": "fa:16:3e:5a:cf:4c", "network": {"id": "6e7523a4-fd95-46c8-82f2-10c4527c1b7d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1506530564", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.107", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13c0d93976f745dba4ab050770ccaae6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7eea52e4-93", "ovs_interfaceid": "7eea52e4-93c4-48e5-9db5-b9d834c2bdbd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.675 226890 DEBUG nova.network.os_vif_util [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5a:cf:4c,bridge_name='br-int',has_traffic_filtering=True,id=7eea52e4-93c4-48e5-9db5-b9d834c2bdbd,network=Network(6e7523a4-fd95-46c8-82f2-10c4527c1b7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7eea52e4-93') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.675 226890 DEBUG os_vif [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5a:cf:4c,bridge_name='br-int',has_traffic_filtering=True,id=7eea52e4-93c4-48e5-9db5-b9d834c2bdbd,network=Network(6e7523a4-fd95-46c8-82f2-10c4527c1b7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7eea52e4-93') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.675 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.675 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.676 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.678 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.678 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7eea52e4-93, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.678 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7eea52e4-93, col_values=(('external_ids', {'iface-id': '7eea52e4-93c4-48e5-9db5-b9d834c2bdbd', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5a:cf:4c', 'vm-uuid': '55628882-70c9-4b43-b653-9983ba87ca0d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.679 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:29 np0005588920 NetworkManager[49076]: <info>  [1768920149.6799] manager: (tap7eea52e4-93): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/142)
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.681 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.687 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.687 226890 INFO os_vif [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5a:cf:4c,bridge_name='br-int',has_traffic_filtering=True,id=7eea52e4-93c4-48e5-9db5-b9d834c2bdbd,network=Network(6e7523a4-fd95-46c8-82f2-10c4527c1b7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7eea52e4-93')#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.762 226890 DEBUG nova.virt.libvirt.driver [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.763 226890 DEBUG nova.virt.libvirt.driver [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.763 226890 DEBUG nova.virt.libvirt.driver [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] No VIF found with MAC fa:16:3e:89:1e:d5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.763 226890 DEBUG nova.virt.libvirt.driver [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] No VIF found with MAC fa:16:3e:54:02:77, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.763 226890 DEBUG nova.virt.libvirt.driver [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] No VIF found with MAC fa:16:3e:5a:cf:4c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.764 226890 INFO nova.virt.libvirt.driver [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Using config drive#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.785 226890 DEBUG nova.storage.rbd_utils [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] rbd image 55628882-70c9-4b43-b653-9983ba87ca0d_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:42:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:42:29 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2794071667' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.956 226890 DEBUG oslo_concurrency.processutils [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.957 226890 DEBUG nova.virt.libvirt.vif [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:42:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1173241717',display_name='tempest-ListServerFiltersTestJSON-instance-1173241717',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1173241717',id=76,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4b95747114ab4043b93a260387199c91',ramdisk_id='',reservation_id='r-m1qpexqd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-2126845308',owner_user_name='tempest-ListServerFiltersTestJSON-2126845308-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:42:18Z,user_data=None,user_id='ff99fc8eda0640928c6e82981dacb266',uuid=525b8695-a4df-46c5-875a-42d3b18b78be,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "01b95c4f-9db6-469f-9458-8c279a5778f0", "address": "fa:16:3e:d7:80:84", "network": {"id": "b36e9cab-12c6-4a09-9aab-ef2679d875ba", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-432532406-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b95747114ab4043b93a260387199c91", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01b95c4f-9d", "ovs_interfaceid": "01b95c4f-9db6-469f-9458-8c279a5778f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.957 226890 DEBUG nova.network.os_vif_util [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Converting VIF {"id": "01b95c4f-9db6-469f-9458-8c279a5778f0", "address": "fa:16:3e:d7:80:84", "network": {"id": "b36e9cab-12c6-4a09-9aab-ef2679d875ba", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-432532406-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b95747114ab4043b93a260387199c91", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01b95c4f-9d", "ovs_interfaceid": "01b95c4f-9db6-469f-9458-8c279a5778f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.958 226890 DEBUG nova.network.os_vif_util [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d7:80:84,bridge_name='br-int',has_traffic_filtering=True,id=01b95c4f-9db6-469f-9458-8c279a5778f0,network=Network(b36e9cab-12c6-4a09-9aab-ef2679d875ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01b95c4f-9d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.959 226890 DEBUG nova.objects.instance [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Lazy-loading 'pci_devices' on Instance uuid 525b8695-a4df-46c5-875a-42d3b18b78be obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.975 226890 DEBUG nova.virt.libvirt.driver [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:42:29 np0005588920 nova_compute[226886]:  <uuid>525b8695-a4df-46c5-875a-42d3b18b78be</uuid>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:  <name>instance-0000004c</name>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:  <memory>196608</memory>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <nova:name>tempest-ListServerFiltersTestJSON-instance-1173241717</nova:name>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:42:28</nova:creationTime>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.micro">
Jan 20 09:42:29 np0005588920 nova_compute[226886]:        <nova:memory>192</nova:memory>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:        <nova:user uuid="ff99fc8eda0640928c6e82981dacb266">tempest-ListServerFiltersTestJSON-2126845308-project-member</nova:user>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:        <nova:project uuid="4b95747114ab4043b93a260387199c91">tempest-ListServerFiltersTestJSON-2126845308</nova:project>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:        <nova:port uuid="01b95c4f-9db6-469f-9458-8c279a5778f0">
Jan 20 09:42:29 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <entry name="serial">525b8695-a4df-46c5-875a-42d3b18b78be</entry>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <entry name="uuid">525b8695-a4df-46c5-875a-42d3b18b78be</entry>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/525b8695-a4df-46c5-875a-42d3b18b78be_disk">
Jan 20 09:42:29 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:42:29 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/525b8695-a4df-46c5-875a-42d3b18b78be_disk.config">
Jan 20 09:42:29 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:42:29 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:d7:80:84"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <target dev="tap01b95c4f-9d"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/525b8695-a4df-46c5-875a-42d3b18b78be/console.log" append="off"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:42:29 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:42:29 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:42:29 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:42:29 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.976 226890 DEBUG nova.compute.manager [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Preparing to wait for external event network-vif-plugged-01b95c4f-9db6-469f-9458-8c279a5778f0 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.976 226890 DEBUG oslo_concurrency.lockutils [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Acquiring lock "525b8695-a4df-46c5-875a-42d3b18b78be-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.976 226890 DEBUG oslo_concurrency.lockutils [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Lock "525b8695-a4df-46c5-875a-42d3b18b78be-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.976 226890 DEBUG oslo_concurrency.lockutils [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Lock "525b8695-a4df-46c5-875a-42d3b18b78be-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.977 226890 DEBUG nova.virt.libvirt.vif [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:42:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1173241717',display_name='tempest-ListServerFiltersTestJSON-instance-1173241717',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1173241717',id=76,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4b95747114ab4043b93a260387199c91',ramdisk_id='',reservation_id='r-m1qpexqd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-2126845308',owner_user_name='tempest-ListServerFiltersTestJSON-2126845308-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:42:18Z,user_data=None,user_id='ff99fc8eda0640928c6e82981dacb266',uuid=525b8695-a4df-46c5-875a-42d3b18b78be,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "01b95c4f-9db6-469f-9458-8c279a5778f0", "address": "fa:16:3e:d7:80:84", "network": {"id": "b36e9cab-12c6-4a09-9aab-ef2679d875ba", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-432532406-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b95747114ab4043b93a260387199c91", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01b95c4f-9d", "ovs_interfaceid": "01b95c4f-9db6-469f-9458-8c279a5778f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.977 226890 DEBUG nova.network.os_vif_util [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Converting VIF {"id": "01b95c4f-9db6-469f-9458-8c279a5778f0", "address": "fa:16:3e:d7:80:84", "network": {"id": "b36e9cab-12c6-4a09-9aab-ef2679d875ba", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-432532406-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b95747114ab4043b93a260387199c91", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01b95c4f-9d", "ovs_interfaceid": "01b95c4f-9db6-469f-9458-8c279a5778f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.978 226890 DEBUG nova.network.os_vif_util [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d7:80:84,bridge_name='br-int',has_traffic_filtering=True,id=01b95c4f-9db6-469f-9458-8c279a5778f0,network=Network(b36e9cab-12c6-4a09-9aab-ef2679d875ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01b95c4f-9d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.978 226890 DEBUG os_vif [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d7:80:84,bridge_name='br-int',has_traffic_filtering=True,id=01b95c4f-9db6-469f-9458-8c279a5778f0,network=Network(b36e9cab-12c6-4a09-9aab-ef2679d875ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01b95c4f-9d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.979 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.979 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.979 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.981 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.982 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap01b95c4f-9d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.982 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap01b95c4f-9d, col_values=(('external_ids', {'iface-id': '01b95c4f-9db6-469f-9458-8c279a5778f0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d7:80:84', 'vm-uuid': '525b8695-a4df-46c5-875a-42d3b18b78be'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.983 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:29 np0005588920 NetworkManager[49076]: <info>  [1768920149.9843] manager: (tap01b95c4f-9d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/143)
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.985 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.994 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:29 np0005588920 nova_compute[226886]: 2026-01-20 14:42:29.994 226890 INFO os_vif [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d7:80:84,bridge_name='br-int',has_traffic_filtering=True,id=01b95c4f-9db6-469f-9458-8c279a5778f0,network=Network(b36e9cab-12c6-4a09-9aab-ef2679d875ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01b95c4f-9d')#033[00m
Jan 20 09:42:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:42:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:30.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:42:30 np0005588920 nova_compute[226886]: 2026-01-20 14:42:30.045 226890 DEBUG nova.virt.libvirt.driver [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:42:30 np0005588920 nova_compute[226886]: 2026-01-20 14:42:30.045 226890 DEBUG nova.virt.libvirt.driver [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:42:30 np0005588920 nova_compute[226886]: 2026-01-20 14:42:30.045 226890 DEBUG nova.virt.libvirt.driver [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] No VIF found with MAC fa:16:3e:d7:80:84, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:42:30 np0005588920 nova_compute[226886]: 2026-01-20 14:42:30.046 226890 INFO nova.virt.libvirt.driver [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Using config drive#033[00m
Jan 20 09:42:30 np0005588920 nova_compute[226886]: 2026-01-20 14:42:30.069 226890 DEBUG nova.storage.rbd_utils [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] rbd image 525b8695-a4df-46c5-875a-42d3b18b78be_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:42:30 np0005588920 nova_compute[226886]: 2026-01-20 14:42:30.169 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:30 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:42:31 np0005588920 nova_compute[226886]: 2026-01-20 14:42:31.101 226890 INFO nova.virt.libvirt.driver [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Creating config drive at /var/lib/nova/instances/55628882-70c9-4b43-b653-9983ba87ca0d/disk.config#033[00m
Jan 20 09:42:31 np0005588920 nova_compute[226886]: 2026-01-20 14:42:31.110 226890 DEBUG oslo_concurrency.processutils [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/55628882-70c9-4b43-b653-9983ba87ca0d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpku2yrazw execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:42:31 np0005588920 nova_compute[226886]: 2026-01-20 14:42:31.260 226890 DEBUG oslo_concurrency.processutils [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/55628882-70c9-4b43-b653-9983ba87ca0d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpku2yrazw" returned: 0 in 0.150s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:42:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:42:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:31.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:42:31 np0005588920 nova_compute[226886]: 2026-01-20 14:42:31.291 226890 DEBUG nova.storage.rbd_utils [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] rbd image 55628882-70c9-4b43-b653-9983ba87ca0d_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:42:31 np0005588920 nova_compute[226886]: 2026-01-20 14:42:31.296 226890 DEBUG oslo_concurrency.processutils [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/55628882-70c9-4b43-b653-9983ba87ca0d/disk.config 55628882-70c9-4b43-b653-9983ba87ca0d_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:42:31 np0005588920 nova_compute[226886]: 2026-01-20 14:42:31.437 226890 DEBUG oslo_concurrency.processutils [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/55628882-70c9-4b43-b653-9983ba87ca0d/disk.config 55628882-70c9-4b43-b653-9983ba87ca0d_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:42:31 np0005588920 nova_compute[226886]: 2026-01-20 14:42:31.438 226890 INFO nova.virt.libvirt.driver [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Deleting local config drive /var/lib/nova/instances/55628882-70c9-4b43-b653-9983ba87ca0d/disk.config because it was imported into RBD.#033[00m
Jan 20 09:42:31 np0005588920 NetworkManager[49076]: <info>  [1768920151.4905] manager: (tap2251427b-05): new Tun device (/org/freedesktop/NetworkManager/Devices/144)
Jan 20 09:42:31 np0005588920 kernel: tap2251427b-05: entered promiscuous mode
Jan 20 09:42:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:42:31Z|00268|binding|INFO|Claiming lport 2251427b-053e-477b-919c-0a2be96a4c01 for this chassis.
Jan 20 09:42:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:42:31Z|00269|binding|INFO|2251427b-053e-477b-919c-0a2be96a4c01: Claiming fa:16:3e:89:1e:d5 10.100.0.171
Jan 20 09:42:31 np0005588920 NetworkManager[49076]: <info>  [1768920151.5043] manager: (tap01ad691e-a0): new Tun device (/org/freedesktop/NetworkManager/Devices/145)
Jan 20 09:42:31 np0005588920 nova_compute[226886]: 2026-01-20 14:42:31.502 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:31 np0005588920 kernel: tap01ad691e-a0: entered promiscuous mode
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:31.515 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:89:1e:d5 10.100.0.171'], port_security=['fa:16:3e:89:1e:d5 10.100.0.171'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.171/24', 'neutron:device_id': '55628882-70c9-4b43-b653-9983ba87ca0d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6e7523a4-fd95-46c8-82f2-10c4527c1b7d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '13c0d93976f745dba4ab050770ccaae6', 'neutron:revision_number': '2', 'neutron:security_group_ids': '774782e1-5f70-48c6-908e-055253fbcd30', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=409644b9-9c91-489f-870e-52aa4bf20678, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=2251427b-053e-477b-919c-0a2be96a4c01) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:31.516 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 2251427b-053e-477b-919c-0a2be96a4c01 in datapath 6e7523a4-fd95-46c8-82f2-10c4527c1b7d bound to our chassis#033[00m
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:31.517 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6e7523a4-fd95-46c8-82f2-10c4527c1b7d#033[00m
Jan 20 09:42:31 np0005588920 NetworkManager[49076]: <info>  [1768920151.5202] manager: (tap7eea52e4-93): new Tun device (/org/freedesktop/NetworkManager/Devices/146)
Jan 20 09:42:31 np0005588920 systemd-udevd[253541]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:42:31 np0005588920 systemd-udevd[253542]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:42:31 np0005588920 systemd-udevd[253540]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:31.532 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[935cc31a-45d0-4eab-ba06-f157b41208ee]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:31.534 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6e7523a4-f1 in ovnmeta-6e7523a4-fd95-46c8-82f2-10c4527c1b7d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:31.537 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6e7523a4-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:31.537 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[8c64dad8-6357-40e1-b968-a92147b1b545]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:31.539 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[1ea37828-01c8-4ce3-8519-851d2f0a0c77]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:31 np0005588920 NetworkManager[49076]: <info>  [1768920151.5476] device (tap2251427b-05): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:42:31 np0005588920 NetworkManager[49076]: <info>  [1768920151.5486] device (tap2251427b-05): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:42:31 np0005588920 NetworkManager[49076]: <info>  [1768920151.5543] device (tap01ad691e-a0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:42:31 np0005588920 NetworkManager[49076]: <info>  [1768920151.5557] device (tap01ad691e-a0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:31.560 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[702dc166-7500-419e-b310-c2f18a90e828]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:31 np0005588920 systemd-machined[196121]: New machine qemu-31-instance-00000049.
Jan 20 09:42:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:42:31Z|00270|binding|INFO|Claiming lport 01ad691e-a0f7-4b59-9b84-3486189332a5 for this chassis.
Jan 20 09:42:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:42:31Z|00271|binding|INFO|01ad691e-a0f7-4b59-9b84-3486189332a5: Claiming fa:16:3e:54:02:77 10.100.1.236
Jan 20 09:42:31 np0005588920 nova_compute[226886]: 2026-01-20 14:42:31.569 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:31.573 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e3498f01-49d3-459f-a323-4a4e86832f55]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:31 np0005588920 systemd[1]: Started Virtual Machine qemu-31-instance-00000049.
Jan 20 09:42:31 np0005588920 kernel: tap7eea52e4-93: entered promiscuous mode
Jan 20 09:42:31 np0005588920 NetworkManager[49076]: <info>  [1768920151.5804] device (tap7eea52e4-93): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:31.580 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:54:02:77 10.100.1.236'], port_security=['fa:16:3e:54:02:77 10.100.1.236'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.1.236/24', 'neutron:device_id': '55628882-70c9-4b43-b653-9983ba87ca0d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1a896e9d-306e-4f99-81e1-986b217a807d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '13c0d93976f745dba4ab050770ccaae6', 'neutron:revision_number': '2', 'neutron:security_group_ids': '774782e1-5f70-48c6-908e-055253fbcd30', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f570666c-7b28-45d1-80d8-f28c7296833d, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=01ad691e-a0f7-4b59-9b84-3486189332a5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:42:31 np0005588920 NetworkManager[49076]: <info>  [1768920151.5816] device (tap7eea52e4-93): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:42:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:42:31Z|00272|binding|INFO|Claiming lport 7eea52e4-93c4-48e5-9db5-b9d834c2bdbd for this chassis.
Jan 20 09:42:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:42:31Z|00273|binding|INFO|7eea52e4-93c4-48e5-9db5-b9d834c2bdbd: Claiming fa:16:3e:5a:cf:4c 10.100.0.107
Jan 20 09:42:31 np0005588920 nova_compute[226886]: 2026-01-20 14:42:31.584 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:42:31Z|00274|binding|INFO|Setting lport 2251427b-053e-477b-919c-0a2be96a4c01 ovn-installed in OVS
Jan 20 09:42:31 np0005588920 nova_compute[226886]: 2026-01-20 14:42:31.592 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:31.608 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[29dc1af3-7ca0-4301-9098-d417d04ca657]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:31 np0005588920 NetworkManager[49076]: <info>  [1768920151.6187] manager: (tap6e7523a4-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/147)
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:31.618 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[055142a2-50f0-4a2f-93f8-819990ed541b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:42:31Z|00275|binding|INFO|Setting lport 2251427b-053e-477b-919c-0a2be96a4c01 up in Southbound
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:31.627 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5a:cf:4c 10.100.0.107'], port_security=['fa:16:3e:5a:cf:4c 10.100.0.107'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.107/24', 'neutron:device_id': '55628882-70c9-4b43-b653-9983ba87ca0d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6e7523a4-fd95-46c8-82f2-10c4527c1b7d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '13c0d93976f745dba4ab050770ccaae6', 'neutron:revision_number': '2', 'neutron:security_group_ids': '774782e1-5f70-48c6-908e-055253fbcd30', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=409644b9-9c91-489f-870e-52aa4bf20678, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=7eea52e4-93c4-48e5-9db5-b9d834c2bdbd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:42:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:42:31Z|00276|binding|INFO|Setting lport 01ad691e-a0f7-4b59-9b84-3486189332a5 ovn-installed in OVS
Jan 20 09:42:31 np0005588920 nova_compute[226886]: 2026-01-20 14:42:31.635 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:42:31Z|00277|binding|INFO|Setting lport 01ad691e-a0f7-4b59-9b84-3486189332a5 up in Southbound
Jan 20 09:42:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:42:31Z|00278|binding|INFO|Setting lport 7eea52e4-93c4-48e5-9db5-b9d834c2bdbd ovn-installed in OVS
Jan 20 09:42:31 np0005588920 nova_compute[226886]: 2026-01-20 14:42:31.648 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:42:31Z|00279|binding|INFO|Setting lport 7eea52e4-93c4-48e5-9db5-b9d834c2bdbd up in Southbound
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:31.659 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[b4f5ff37-0db0-465d-bdbc-76c5e9596c8d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:31.662 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[8d119efd-109d-4a9d-b378-6b19306110e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:31 np0005588920 NetworkManager[49076]: <info>  [1768920151.6845] device (tap6e7523a4-f0): carrier: link connected
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:31.689 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[d32d4939-ea82-4303-af4b-28f0d2c8e8f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:31.707 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[1ed4dfe8-209e-45bd-a59b-88b1d4d35f33]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6e7523a4-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:93:03'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 88], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 519337, 'reachable_time': 28108, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 253582, 'error': None, 'target': 'ovnmeta-6e7523a4-fd95-46c8-82f2-10c4527c1b7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:31 np0005588920 nova_compute[226886]: 2026-01-20 14:42:31.721 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:31.724 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[75a5891c-99ae-4d42-a6a6-cd08197e29ea]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe36:9303'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 519337, 'tstamp': 519337}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 253583, 'error': None, 'target': 'ovnmeta-6e7523a4-fd95-46c8-82f2-10c4527c1b7d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:31.740 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[89c4b76b-0bc5-4877-9f7d-aafe6ae47775]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6e7523a4-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:93:03'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 88], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 519337, 'reachable_time': 28108, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 253584, 'error': None, 'target': 'ovnmeta-6e7523a4-fd95-46c8-82f2-10c4527c1b7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:31 np0005588920 nova_compute[226886]: 2026-01-20 14:42:31.752 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:42:31 np0005588920 nova_compute[226886]: 2026-01-20 14:42:31.752 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:31.776 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[35770c1a-d29a-4865-86bc-21d04ad5ca04]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:31.844 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e2c16977-ff73-49ee-92a1-8fca637748e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:31.845 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6e7523a4-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:31.845 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:31.846 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6e7523a4-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:31 np0005588920 nova_compute[226886]: 2026-01-20 14:42:31.847 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:31 np0005588920 NetworkManager[49076]: <info>  [1768920151.8483] manager: (tap6e7523a4-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/148)
Jan 20 09:42:31 np0005588920 kernel: tap6e7523a4-f0: entered promiscuous mode
Jan 20 09:42:31 np0005588920 nova_compute[226886]: 2026-01-20 14:42:31.854 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:31.855 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6e7523a4-f0, col_values=(('external_ids', {'iface-id': '73e18e36-18cb-4146-ae19-4b33b97a050a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:31 np0005588920 nova_compute[226886]: 2026-01-20 14:42:31.856 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:42:31Z|00280|binding|INFO|Releasing lport 73e18e36-18cb-4146-ae19-4b33b97a050a from this chassis (sb_readonly=0)
Jan 20 09:42:31 np0005588920 nova_compute[226886]: 2026-01-20 14:42:31.871 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:31 np0005588920 nova_compute[226886]: 2026-01-20 14:42:31.876 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:31.877 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6e7523a4-fd95-46c8-82f2-10c4527c1b7d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6e7523a4-fd95-46c8-82f2-10c4527c1b7d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:31.877 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[9c4b28e7-49cb-4dea-af78-1cc43ee0cb93]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:31.878 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-6e7523a4-fd95-46c8-82f2-10c4527c1b7d
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/6e7523a4-fd95-46c8-82f2-10c4527c1b7d.pid.haproxy
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID 6e7523a4-fd95-46c8-82f2-10c4527c1b7d
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:42:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:31.879 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6e7523a4-fd95-46c8-82f2-10c4527c1b7d', 'env', 'PROCESS_TAG=haproxy-6e7523a4-fd95-46c8-82f2-10c4527c1b7d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6e7523a4-fd95-46c8-82f2-10c4527c1b7d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:42:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:32.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:32 np0005588920 nova_compute[226886]: 2026-01-20 14:42:32.014 226890 INFO nova.virt.libvirt.driver [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Creating config drive at /var/lib/nova/instances/525b8695-a4df-46c5-875a-42d3b18b78be/disk.config#033[00m
Jan 20 09:42:32 np0005588920 nova_compute[226886]: 2026-01-20 14:42:32.019 226890 DEBUG oslo_concurrency.processutils [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/525b8695-a4df-46c5-875a-42d3b18b78be/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkxgrhvav execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:42:32 np0005588920 nova_compute[226886]: 2026-01-20 14:42:32.153 226890 DEBUG oslo_concurrency.processutils [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/525b8695-a4df-46c5-875a-42d3b18b78be/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkxgrhvav" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:42:32 np0005588920 nova_compute[226886]: 2026-01-20 14:42:32.189 226890 DEBUG nova.storage.rbd_utils [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] rbd image 525b8695-a4df-46c5-875a-42d3b18b78be_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:42:32 np0005588920 nova_compute[226886]: 2026-01-20 14:42:32.193 226890 DEBUG oslo_concurrency.processutils [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/525b8695-a4df-46c5-875a-42d3b18b78be/disk.config 525b8695-a4df-46c5-875a-42d3b18b78be_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:42:32 np0005588920 podman[253674]: 2026-01-20 14:42:32.247536568 +0000 UTC m=+0.049853041 container create a160ddd29c1b79bd7d7f22cc94af84c95f286b5c2b34ceca96c5fb30e9d9b45c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6e7523a4-fd95-46c8-82f2-10c4527c1b7d, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:42:32 np0005588920 systemd[1]: Started libpod-conmon-a160ddd29c1b79bd7d7f22cc94af84c95f286b5c2b34ceca96c5fb30e9d9b45c.scope.
Jan 20 09:42:32 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:42:32 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1b60ffe80042a32b7f20eeef822902410f0f9eb9b9517bf405b0a09859f02306/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:42:32 np0005588920 podman[253674]: 2026-01-20 14:42:32.220033661 +0000 UTC m=+0.022350154 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:42:32 np0005588920 nova_compute[226886]: 2026-01-20 14:42:32.328 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920152.3265283, 55628882-70c9-4b43-b653-9983ba87ca0d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:42:32 np0005588920 podman[253674]: 2026-01-20 14:42:32.327563709 +0000 UTC m=+0.129880232 container init a160ddd29c1b79bd7d7f22cc94af84c95f286b5c2b34ceca96c5fb30e9d9b45c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6e7523a4-fd95-46c8-82f2-10c4527c1b7d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:42:32 np0005588920 nova_compute[226886]: 2026-01-20 14:42:32.329 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] VM Started (Lifecycle Event)#033[00m
Jan 20 09:42:32 np0005588920 podman[253674]: 2026-01-20 14:42:32.335393158 +0000 UTC m=+0.137709671 container start a160ddd29c1b79bd7d7f22cc94af84c95f286b5c2b34ceca96c5fb30e9d9b45c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6e7523a4-fd95-46c8-82f2-10c4527c1b7d, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:42:32 np0005588920 nova_compute[226886]: 2026-01-20 14:42:32.357 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:42:32 np0005588920 neutron-haproxy-ovnmeta-6e7523a4-fd95-46c8-82f2-10c4527c1b7d[253714]: [NOTICE]   (253722) : New worker (253724) forked
Jan 20 09:42:32 np0005588920 neutron-haproxy-ovnmeta-6e7523a4-fd95-46c8-82f2-10c4527c1b7d[253714]: [NOTICE]   (253722) : Loading success.
Jan 20 09:42:32 np0005588920 nova_compute[226886]: 2026-01-20 14:42:32.366 226890 DEBUG oslo_concurrency.processutils [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/525b8695-a4df-46c5-875a-42d3b18b78be/disk.config 525b8695-a4df-46c5-875a-42d3b18b78be_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.173s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:42:32 np0005588920 nova_compute[226886]: 2026-01-20 14:42:32.367 226890 INFO nova.virt.libvirt.driver [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Deleting local config drive /var/lib/nova/instances/525b8695-a4df-46c5-875a-42d3b18b78be/disk.config because it was imported into RBD.#033[00m
Jan 20 09:42:32 np0005588920 nova_compute[226886]: 2026-01-20 14:42:32.372 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920152.3268113, 55628882-70c9-4b43-b653-9983ba87ca0d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:42:32 np0005588920 nova_compute[226886]: 2026-01-20 14:42:32.372 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:42:32 np0005588920 nova_compute[226886]: 2026-01-20 14:42:32.392 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:32.394 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 01ad691e-a0f7-4b59-9b84-3486189332a5 in datapath 1a896e9d-306e-4f99-81e1-986b217a807d unbound from our chassis#033[00m
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:32.397 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1a896e9d-306e-4f99-81e1-986b217a807d#033[00m
Jan 20 09:42:32 np0005588920 nova_compute[226886]: 2026-01-20 14:42:32.397 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:32.409 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d60ab7e8-71e7-46a9-bbd7-519e9ba83a0d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:32.411 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1a896e9d-31 in ovnmeta-1a896e9d-306e-4f99-81e1-986b217a807d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:32.414 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1a896e9d-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:32.414 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[fc83d209-6f94-4ad4-a3ba-7a919d07414d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:32.415 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[186a6ef1-9279-44dc-8564-c70eea27c98e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:32 np0005588920 kernel: tap01b95c4f-9d: entered promiscuous mode
Jan 20 09:42:32 np0005588920 NetworkManager[49076]: <info>  [1768920152.4195] manager: (tap01b95c4f-9d): new Tun device (/org/freedesktop/NetworkManager/Devices/149)
Jan 20 09:42:32 np0005588920 systemd-udevd[253566]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:42:32 np0005588920 ovn_controller[133971]: 2026-01-20T14:42:32Z|00281|binding|INFO|Claiming lport 01b95c4f-9db6-469f-9458-8c279a5778f0 for this chassis.
Jan 20 09:42:32 np0005588920 ovn_controller[133971]: 2026-01-20T14:42:32Z|00282|binding|INFO|01b95c4f-9db6-469f-9458-8c279a5778f0: Claiming fa:16:3e:d7:80:84 10.100.0.14
Jan 20 09:42:32 np0005588920 nova_compute[226886]: 2026-01-20 14:42:32.420 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:32 np0005588920 NetworkManager[49076]: <info>  [1768920152.4343] device (tap01b95c4f-9d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:32.434 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d7:80:84 10.100.0.14'], port_security=['fa:16:3e:d7:80:84 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '525b8695-a4df-46c5-875a-42d3b18b78be', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b36e9cab-12c6-4a09-9aab-ef2679d875ba', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4b95747114ab4043b93a260387199c91', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f18b0222-78a5-4c37-8065-772dbe5c63e1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=80e2aa5b-ecb8-4e93-992f-baaef718dd34, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=01b95c4f-9db6-469f-9458-8c279a5778f0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:42:32 np0005588920 nova_compute[226886]: 2026-01-20 14:42:32.434 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:42:32 np0005588920 NetworkManager[49076]: <info>  [1768920152.4356] device (tap01b95c4f-9d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:32.433 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[0b31f64d-ace1-43d3-9399-2ffa0c3f2b75]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:32 np0005588920 nova_compute[226886]: 2026-01-20 14:42:32.438 226890 DEBUG nova.network.neutron [req-5d18b1e0-0183-4e69-9f8a-983b3863ef5d req-fcf4e23a-21ae-4d2e-be83-ec922b4ae070 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Updated VIF entry in instance network info cache for port 7eea52e4-93c4-48e5-9db5-b9d834c2bdbd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:42:32 np0005588920 nova_compute[226886]: 2026-01-20 14:42:32.438 226890 DEBUG nova.network.neutron [req-5d18b1e0-0183-4e69-9f8a-983b3863ef5d req-fcf4e23a-21ae-4d2e-be83-ec922b4ae070 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Updating instance_info_cache with network_info: [{"id": "2251427b-053e-477b-919c-0a2be96a4c01", "address": "fa:16:3e:89:1e:d5", "network": {"id": "6e7523a4-fd95-46c8-82f2-10c4527c1b7d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1506530564", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.171", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13c0d93976f745dba4ab050770ccaae6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2251427b-05", "ovs_interfaceid": "2251427b-053e-477b-919c-0a2be96a4c01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "01ad691e-a0f7-4b59-9b84-3486189332a5", "address": "fa:16:3e:54:02:77", "network": {"id": "1a896e9d-306e-4f99-81e1-986b217a807d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1473946507", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.236", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13c0d93976f745dba4ab050770ccaae6", 
"mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01ad691e-a0", "ovs_interfaceid": "01ad691e-a0f7-4b59-9b84-3486189332a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7eea52e4-93c4-48e5-9db5-b9d834c2bdbd", "address": "fa:16:3e:5a:cf:4c", "network": {"id": "6e7523a4-fd95-46c8-82f2-10c4527c1b7d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1506530564", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.107", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13c0d93976f745dba4ab050770ccaae6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7eea52e4-93", "ovs_interfaceid": "7eea52e4-93c4-48e5-9db5-b9d834c2bdbd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:42:32 np0005588920 systemd-machined[196121]: New machine qemu-32-instance-0000004c.
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:32.457 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[cdafaa97-b2ab-4bc2-a382-e9ad3207052f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:32 np0005588920 systemd[1]: Started Virtual Machine qemu-32-instance-0000004c.
Jan 20 09:42:32 np0005588920 nova_compute[226886]: 2026-01-20 14:42:32.474 226890 DEBUG oslo_concurrency.lockutils [req-5d18b1e0-0183-4e69-9f8a-983b3863ef5d req-fcf4e23a-21ae-4d2e-be83-ec922b4ae070 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-55628882-70c9-4b43-b653-9983ba87ca0d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:42:32 np0005588920 nova_compute[226886]: 2026-01-20 14:42:32.481 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:32 np0005588920 ovn_controller[133971]: 2026-01-20T14:42:32Z|00283|binding|INFO|Setting lport 01b95c4f-9db6-469f-9458-8c279a5778f0 ovn-installed in OVS
Jan 20 09:42:32 np0005588920 ovn_controller[133971]: 2026-01-20T14:42:32Z|00284|binding|INFO|Setting lport 01b95c4f-9db6-469f-9458-8c279a5778f0 up in Southbound
Jan 20 09:42:32 np0005588920 nova_compute[226886]: 2026-01-20 14:42:32.484 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:32.491 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[980ba112-4f37-4e88-a9fa-7f9b7552e0de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:32.495 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[8b721444-1e40-41a3-b446-6834790ffc12]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:32 np0005588920 NetworkManager[49076]: <info>  [1768920152.4971] manager: (tap1a896e9d-30): new Veth device (/org/freedesktop/NetworkManager/Devices/150)
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:32.525 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[57009e11-2b90-41a9-948e-8a4b13781fd3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:32.528 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[c441c543-cac0-44eb-9ed6-0b8f2ec78f9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:32 np0005588920 NetworkManager[49076]: <info>  [1768920152.5503] device (tap1a896e9d-30): carrier: link connected
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:32.557 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[a7a1e2ee-384c-4d54-bc03-5ff99edb6ec4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:32.575 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[35d0a19a-0205-425f-b06e-cbbb4814fd9c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1a896e9d-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:34:bb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 90], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 519423, 'reachable_time': 41325, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 253763, 'error': None, 'target': 'ovnmeta-1a896e9d-306e-4f99-81e1-986b217a807d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:32.593 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[1f2e207e-2a29-4210-977e-522634bb3526]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe09:34bb'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 519423, 'tstamp': 519423}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 253764, 'error': None, 'target': 'ovnmeta-1a896e9d-306e-4f99-81e1-986b217a807d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:32.610 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[39343f59-9452-49ce-a8cb-aa16a3930151]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1a896e9d-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:34:bb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 90], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 519423, 'reachable_time': 41325, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 253765, 'error': None, 'target': 'ovnmeta-1a896e9d-306e-4f99-81e1-986b217a807d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:32.645 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[93f4f41a-e18d-44c0-815b-ee391c640b90]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:32 np0005588920 nova_compute[226886]: 2026-01-20 14:42:32.669 226890 DEBUG nova.compute.manager [req-6fc98064-589e-4571-a57f-55ba3c0b2c2a req-5f97af2d-ccba-4313-84bc-4a0d0fc94756 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Received event network-vif-plugged-01ad691e-a0f7-4b59-9b84-3486189332a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:42:32 np0005588920 nova_compute[226886]: 2026-01-20 14:42:32.670 226890 DEBUG oslo_concurrency.lockutils [req-6fc98064-589e-4571-a57f-55ba3c0b2c2a req-5f97af2d-ccba-4313-84bc-4a0d0fc94756 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "55628882-70c9-4b43-b653-9983ba87ca0d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:32 np0005588920 nova_compute[226886]: 2026-01-20 14:42:32.670 226890 DEBUG oslo_concurrency.lockutils [req-6fc98064-589e-4571-a57f-55ba3c0b2c2a req-5f97af2d-ccba-4313-84bc-4a0d0fc94756 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "55628882-70c9-4b43-b653-9983ba87ca0d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:32 np0005588920 nova_compute[226886]: 2026-01-20 14:42:32.670 226890 DEBUG oslo_concurrency.lockutils [req-6fc98064-589e-4571-a57f-55ba3c0b2c2a req-5f97af2d-ccba-4313-84bc-4a0d0fc94756 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "55628882-70c9-4b43-b653-9983ba87ca0d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:32 np0005588920 nova_compute[226886]: 2026-01-20 14:42:32.670 226890 DEBUG nova.compute.manager [req-6fc98064-589e-4571-a57f-55ba3c0b2c2a req-5f97af2d-ccba-4313-84bc-4a0d0fc94756 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Processing event network-vif-plugged-01ad691e-a0f7-4b59-9b84-3486189332a5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:32.714 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[15fef6e2-9687-459c-9634-ac8467a24ce4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:32.715 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1a896e9d-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:32.715 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:32.716 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1a896e9d-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:32 np0005588920 kernel: tap1a896e9d-30: entered promiscuous mode
Jan 20 09:42:32 np0005588920 NetworkManager[49076]: <info>  [1768920152.7182] manager: (tap1a896e9d-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/151)
Jan 20 09:42:32 np0005588920 nova_compute[226886]: 2026-01-20 14:42:32.719 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:32.720 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1a896e9d-30, col_values=(('external_ids', {'iface-id': 'a9b1e0d1-42a9-412f-924b-0ee011c5c730'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:32 np0005588920 ovn_controller[133971]: 2026-01-20T14:42:32Z|00285|binding|INFO|Releasing lport a9b1e0d1-42a9-412f-924b-0ee011c5c730 from this chassis (sb_readonly=0)
Jan 20 09:42:32 np0005588920 nova_compute[226886]: 2026-01-20 14:42:32.722 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:32.723 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1a896e9d-306e-4f99-81e1-986b217a807d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1a896e9d-306e-4f99-81e1-986b217a807d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:32.723 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a9f8454d-fac1-4694-ad1c-ac28633af0c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:32.724 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-1a896e9d-306e-4f99-81e1-986b217a807d
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/1a896e9d-306e-4f99-81e1-986b217a807d.pid.haproxy
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID 1a896e9d-306e-4f99-81e1-986b217a807d
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:42:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:32.725 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1a896e9d-306e-4f99-81e1-986b217a807d', 'env', 'PROCESS_TAG=haproxy-1a896e9d-306e-4f99-81e1-986b217a807d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1a896e9d-306e-4f99-81e1-986b217a807d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:42:32 np0005588920 nova_compute[226886]: 2026-01-20 14:42:32.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:42:32 np0005588920 nova_compute[226886]: 2026-01-20 14:42:32.737 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:32 np0005588920 nova_compute[226886]: 2026-01-20 14:42:32.763 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:32 np0005588920 nova_compute[226886]: 2026-01-20 14:42:32.763 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:32 np0005588920 nova_compute[226886]: 2026-01-20 14:42:32.764 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:32 np0005588920 nova_compute[226886]: 2026-01-20 14:42:32.764 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:42:32 np0005588920 nova_compute[226886]: 2026-01-20 14:42:32.764 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:42:32 np0005588920 nova_compute[226886]: 2026-01-20 14:42:32.852 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920152.8511734, 525b8695-a4df-46c5-875a-42d3b18b78be => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:42:32 np0005588920 nova_compute[226886]: 2026-01-20 14:42:32.852 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] VM Started (Lifecycle Event)#033[00m
Jan 20 09:42:32 np0005588920 nova_compute[226886]: 2026-01-20 14:42:32.878 226890 DEBUG nova.network.neutron [req-8ef92bfb-02d7-498f-8926-44bb7d094b03 req-78a51f7d-ef54-4f32-9f98-3b92d1220ed7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Updated VIF entry in instance network info cache for port 01b95c4f-9db6-469f-9458-8c279a5778f0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:42:32 np0005588920 nova_compute[226886]: 2026-01-20 14:42:32.879 226890 DEBUG nova.network.neutron [req-8ef92bfb-02d7-498f-8926-44bb7d094b03 req-78a51f7d-ef54-4f32-9f98-3b92d1220ed7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Updating instance_info_cache with network_info: [{"id": "01b95c4f-9db6-469f-9458-8c279a5778f0", "address": "fa:16:3e:d7:80:84", "network": {"id": "b36e9cab-12c6-4a09-9aab-ef2679d875ba", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-432532406-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b95747114ab4043b93a260387199c91", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01b95c4f-9d", "ovs_interfaceid": "01b95c4f-9db6-469f-9458-8c279a5778f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:42:32 np0005588920 nova_compute[226886]: 2026-01-20 14:42:32.883 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:42:32 np0005588920 nova_compute[226886]: 2026-01-20 14:42:32.891 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920152.8551304, 525b8695-a4df-46c5-875a-42d3b18b78be => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:42:32 np0005588920 nova_compute[226886]: 2026-01-20 14:42:32.893 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:42:32 np0005588920 nova_compute[226886]: 2026-01-20 14:42:32.897 226890 DEBUG oslo_concurrency.lockutils [req-8ef92bfb-02d7-498f-8926-44bb7d094b03 req-78a51f7d-ef54-4f32-9f98-3b92d1220ed7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-525b8695-a4df-46c5-875a-42d3b18b78be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:42:32 np0005588920 nova_compute[226886]: 2026-01-20 14:42:32.916 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:42:32 np0005588920 nova_compute[226886]: 2026-01-20 14:42:32.920 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:42:32 np0005588920 nova_compute[226886]: 2026-01-20 14:42:32.949 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:42:33 np0005588920 podman[253859]: 2026-01-20 14:42:33.102699239 +0000 UTC m=+0.043086952 container create c2baea4fcf241670bfe2b8c09e5695b06d25a378928cf222c6df0f7652b8ac96 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a896e9d-306e-4f99-81e1-986b217a807d, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 20 09:42:33 np0005588920 systemd[1]: Started libpod-conmon-c2baea4fcf241670bfe2b8c09e5695b06d25a378928cf222c6df0f7652b8ac96.scope.
Jan 20 09:42:33 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:42:33 np0005588920 podman[253859]: 2026-01-20 14:42:33.080250523 +0000 UTC m=+0.020638236 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:42:33 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b3a426077250f76807b457026b1e012c2576eb0c862b757e8d88db3d21154bb8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:42:33 np0005588920 podman[253859]: 2026-01-20 14:42:33.196446622 +0000 UTC m=+0.136834385 container init c2baea4fcf241670bfe2b8c09e5695b06d25a378928cf222c6df0f7652b8ac96 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a896e9d-306e-4f99-81e1-986b217a807d, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 20 09:42:33 np0005588920 podman[253859]: 2026-01-20 14:42:33.202864561 +0000 UTC m=+0.143252284 container start c2baea4fcf241670bfe2b8c09e5695b06d25a378928cf222c6df0f7652b8ac96 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a896e9d-306e-4f99-81e1-986b217a807d, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 20 09:42:33 np0005588920 neutron-haproxy-ovnmeta-1a896e9d-306e-4f99-81e1-986b217a807d[253875]: [NOTICE]   (253879) : New worker (253881) forked
Jan 20 09:42:33 np0005588920 neutron-haproxy-ovnmeta-1a896e9d-306e-4f99-81e1-986b217a807d[253875]: [NOTICE]   (253879) : Loading success.
Jan 20 09:42:33 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:42:33 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3354975338' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:33.258 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 7eea52e4-93c4-48e5-9db5-b9d834c2bdbd in datapath 6e7523a4-fd95-46c8-82f2-10c4527c1b7d unbound from our chassis#033[00m
Jan 20 09:42:33 np0005588920 nova_compute[226886]: 2026-01-20 14:42:33.259 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.495s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:33.261 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6e7523a4-fd95-46c8-82f2-10c4527c1b7d#033[00m
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:33.275 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[3ec5fd6c-9304-4343-bca7-f55f1876ef74]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:33.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:33.306 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[81474d41-3188-40da-b057-6ee0a9a10626]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:33.309 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[cbedb373-e29c-45e9-a3b4-793711c91160]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:33.344 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[86cd8c65-62f3-4025-b36c-ebbd219b6df0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:33.374 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a960c649-f023-456a-a997-7faac5c96c19]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6e7523a4-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:93:03'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 88], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 519337, 'reachable_time': 28108, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 253897, 'error': None, 'target': 'ovnmeta-6e7523a4-fd95-46c8-82f2-10c4527c1b7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:33.397 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[01eef135-755f-4ca2-a5f5-c336d1861a2e]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap6e7523a4-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 519349, 'tstamp': 519349}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 253898, 'error': None, 'target': 'ovnmeta-6e7523a4-fd95-46c8-82f2-10c4527c1b7d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.255'], ['IFA_LABEL', 'tap6e7523a4-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 519351, 'tstamp': 519351}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 253898, 'error': None, 'target': 'ovnmeta-6e7523a4-fd95-46c8-82f2-10c4527c1b7d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:33.399 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6e7523a4-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:33 np0005588920 nova_compute[226886]: 2026-01-20 14:42:33.402 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:33.404 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6e7523a4-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:33.405 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:42:33 np0005588920 nova_compute[226886]: 2026-01-20 14:42:33.405 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-00000049 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:42:33 np0005588920 nova_compute[226886]: 2026-01-20 14:42:33.405 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-00000049 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:33.406 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6e7523a4-f0, col_values=(('external_ids', {'iface-id': '73e18e36-18cb-4146-ae19-4b33b97a050a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:33.407 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:33.409 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 01b95c4f-9db6-469f-9458-8c279a5778f0 in datapath b36e9cab-12c6-4a09-9aab-ef2679d875ba unbound from our chassis#033[00m
Jan 20 09:42:33 np0005588920 nova_compute[226886]: 2026-01-20 14:42:33.411 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-0000004c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:42:33 np0005588920 nova_compute[226886]: 2026-01-20 14:42:33.412 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-0000004c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:33.412 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b36e9cab-12c6-4a09-9aab-ef2679d875ba#033[00m
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:33.423 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[55e26130-c95f-49ba-87aa-7d52248c4172]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:33.424 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb36e9cab-11 in ovnmeta-b36e9cab-12c6-4a09-9aab-ef2679d875ba namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:33.425 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb36e9cab-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:33.425 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[7f45ee08-1081-4138-8d2e-47a1e714f875]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:33.427 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0c63b694-9e28-4768-bf37-bdac75b436a7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:33.440 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[fd9f556a-a8b8-4520-8162-607022071dba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:33.462 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2e58c25f-fb3c-4b19-b71d-95dc6b732901]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:33.495 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[279ae25d-a398-406e-86fe-de103e968d10]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:33.500 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[260efe53-2cda-41b3-be38-a425aba5c098]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:33 np0005588920 NetworkManager[49076]: <info>  [1768920153.5013] manager: (tapb36e9cab-10): new Veth device (/org/freedesktop/NetworkManager/Devices/152)
Jan 20 09:42:33 np0005588920 nova_compute[226886]: 2026-01-20 14:42:33.526 226890 DEBUG nova.compute.manager [req-53cdb316-d57b-4233-b2cd-03f63bd10dd7 req-110f8ef9-1642-4098-88ce-eeb9ea0af1b4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Received event network-vif-plugged-01b95c4f-9db6-469f-9458-8c279a5778f0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:42:33 np0005588920 nova_compute[226886]: 2026-01-20 14:42:33.528 226890 DEBUG oslo_concurrency.lockutils [req-53cdb316-d57b-4233-b2cd-03f63bd10dd7 req-110f8ef9-1642-4098-88ce-eeb9ea0af1b4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "525b8695-a4df-46c5-875a-42d3b18b78be-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:33 np0005588920 nova_compute[226886]: 2026-01-20 14:42:33.528 226890 DEBUG oslo_concurrency.lockutils [req-53cdb316-d57b-4233-b2cd-03f63bd10dd7 req-110f8ef9-1642-4098-88ce-eeb9ea0af1b4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "525b8695-a4df-46c5-875a-42d3b18b78be-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:33 np0005588920 nova_compute[226886]: 2026-01-20 14:42:33.529 226890 DEBUG oslo_concurrency.lockutils [req-53cdb316-d57b-4233-b2cd-03f63bd10dd7 req-110f8ef9-1642-4098-88ce-eeb9ea0af1b4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "525b8695-a4df-46c5-875a-42d3b18b78be-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:33 np0005588920 nova_compute[226886]: 2026-01-20 14:42:33.529 226890 DEBUG nova.compute.manager [req-53cdb316-d57b-4233-b2cd-03f63bd10dd7 req-110f8ef9-1642-4098-88ce-eeb9ea0af1b4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Processing event network-vif-plugged-01b95c4f-9db6-469f-9458-8c279a5778f0 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:42:33 np0005588920 nova_compute[226886]: 2026-01-20 14:42:33.529 226890 DEBUG nova.compute.manager [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:33.530 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[7c67d844-e8f8-486b-9731-c4a57799b321]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:33 np0005588920 nova_compute[226886]: 2026-01-20 14:42:33.533 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920153.5331566, 525b8695-a4df-46c5-875a-42d3b18b78be => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:42:33 np0005588920 nova_compute[226886]: 2026-01-20 14:42:33.533 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:42:33 np0005588920 nova_compute[226886]: 2026-01-20 14:42:33.536 226890 DEBUG nova.virt.libvirt.driver [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:33.538 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[8f6ca4c3-67a9-40ee-8253-221753130ba5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:33 np0005588920 nova_compute[226886]: 2026-01-20 14:42:33.542 226890 INFO nova.virt.libvirt.driver [-] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Instance spawned successfully.#033[00m
Jan 20 09:42:33 np0005588920 nova_compute[226886]: 2026-01-20 14:42:33.542 226890 DEBUG nova.virt.libvirt.driver [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:42:33 np0005588920 NetworkManager[49076]: <info>  [1768920153.5709] device (tapb36e9cab-10): carrier: link connected
Jan 20 09:42:33 np0005588920 nova_compute[226886]: 2026-01-20 14:42:33.573 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:42:33 np0005588920 nova_compute[226886]: 2026-01-20 14:42:33.577 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:33.579 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[ba4e6db3-fe11-4778-a427-db168a8586bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:33 np0005588920 nova_compute[226886]: 2026-01-20 14:42:33.583 226890 DEBUG nova.virt.libvirt.driver [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:42:33 np0005588920 nova_compute[226886]: 2026-01-20 14:42:33.583 226890 DEBUG nova.virt.libvirt.driver [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:42:33 np0005588920 nova_compute[226886]: 2026-01-20 14:42:33.584 226890 DEBUG nova.virt.libvirt.driver [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:42:33 np0005588920 nova_compute[226886]: 2026-01-20 14:42:33.584 226890 DEBUG nova.virt.libvirt.driver [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:42:33 np0005588920 nova_compute[226886]: 2026-01-20 14:42:33.585 226890 DEBUG nova.virt.libvirt.driver [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:42:33 np0005588920 nova_compute[226886]: 2026-01-20 14:42:33.585 226890 DEBUG nova.virt.libvirt.driver [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:33.604 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f32d0c44-9d1f-4c61-b050-8bb2698163d1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb36e9cab-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:08:c2:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 91], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 519525, 'reachable_time': 26406, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 253909, 'error': None, 'target': 'ovnmeta-b36e9cab-12c6-4a09-9aab-ef2679d875ba', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:33 np0005588920 nova_compute[226886]: 2026-01-20 14:42:33.614 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:33.619 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2e24220c-d348-4eb7-9c01-4f3583583a57]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe08:c252'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 519525, 'tstamp': 519525}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 253910, 'error': None, 'target': 'ovnmeta-b36e9cab-12c6-4a09-9aab-ef2679d875ba', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:33 np0005588920 nova_compute[226886]: 2026-01-20 14:42:33.633 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:42:33 np0005588920 nova_compute[226886]: 2026-01-20 14:42:33.634 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4412MB free_disk=20.81378173828125GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:42:33 np0005588920 nova_compute[226886]: 2026-01-20 14:42:33.634 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:33 np0005588920 nova_compute[226886]: 2026-01-20 14:42:33.635 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:33.638 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[639b314b-6002-4137-b59d-0f5cd383e645]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb36e9cab-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:08:c2:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 91], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 519525, 'reachable_time': 26406, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 253911, 'error': None, 'target': 'ovnmeta-b36e9cab-12c6-4a09-9aab-ef2679d875ba', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:33.668 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[58a3e6fa-dee3-43ef-b451-0d124d3dfc49]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:33.726 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5aecaf99-5e48-422c-b9e4-f2c49ae796bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:33.727 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb36e9cab-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:33.727 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:33.727 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb36e9cab-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:33 np0005588920 NetworkManager[49076]: <info>  [1768920153.7295] manager: (tapb36e9cab-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/153)
Jan 20 09:42:33 np0005588920 kernel: tapb36e9cab-10: entered promiscuous mode
Jan 20 09:42:33 np0005588920 nova_compute[226886]: 2026-01-20 14:42:33.728 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:33.732 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb36e9cab-10, col_values=(('external_ids', {'iface-id': '5dcae274-b8f4-440a-a3eb-5c1a5a044346'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:33 np0005588920 nova_compute[226886]: 2026-01-20 14:42:33.733 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:33 np0005588920 ovn_controller[133971]: 2026-01-20T14:42:33Z|00286|binding|INFO|Releasing lport 5dcae274-b8f4-440a-a3eb-5c1a5a044346 from this chassis (sb_readonly=0)
Jan 20 09:42:33 np0005588920 nova_compute[226886]: 2026-01-20 14:42:33.745 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance 55628882-70c9-4b43-b653-9983ba87ca0d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:42:33 np0005588920 nova_compute[226886]: 2026-01-20 14:42:33.745 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance 525b8695-a4df-46c5-875a-42d3b18b78be actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:42:33 np0005588920 nova_compute[226886]: 2026-01-20 14:42:33.746 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:42:33 np0005588920 nova_compute[226886]: 2026-01-20 14:42:33.746 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=832MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:33.749 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b36e9cab-12c6-4a09-9aab-ef2679d875ba.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b36e9cab-12c6-4a09-9aab-ef2679d875ba.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:42:33 np0005588920 nova_compute[226886]: 2026-01-20 14:42:33.748 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:33.750 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[3fe1d96b-4b86-4323-95cb-6b574253cd31]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:33.751 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-b36e9cab-12c6-4a09-9aab-ef2679d875ba
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/b36e9cab-12c6-4a09-9aab-ef2679d875ba.pid.haproxy
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID b36e9cab-12c6-4a09-9aab-ef2679d875ba
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:42:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:33.751 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b36e9cab-12c6-4a09-9aab-ef2679d875ba', 'env', 'PROCESS_TAG=haproxy-b36e9cab-12c6-4a09-9aab-ef2679d875ba', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b36e9cab-12c6-4a09-9aab-ef2679d875ba.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:42:33 np0005588920 nova_compute[226886]: 2026-01-20 14:42:33.761 226890 INFO nova.compute.manager [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Took 15.35 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:42:33 np0005588920 nova_compute[226886]: 2026-01-20 14:42:33.761 226890 DEBUG nova.compute.manager [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:42:33 np0005588920 nova_compute[226886]: 2026-01-20 14:42:33.933 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:42:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:34.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:34 np0005588920 nova_compute[226886]: 2026-01-20 14:42:34.148 226890 INFO nova.compute.manager [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Took 16.74 seconds to build instance.#033[00m
Jan 20 09:42:34 np0005588920 podman[253962]: 2026-01-20 14:42:34.171477585 +0000 UTC m=+0.057798063 container create 555298e7c120e880b2f02f1d59ccbc1715a04331944ba049a623571d23408496 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b36e9cab-12c6-4a09-9aab-ef2679d875ba, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 20 09:42:34 np0005588920 nova_compute[226886]: 2026-01-20 14:42:34.214 226890 DEBUG oslo_concurrency.lockutils [None req-9d82a39f-bb98-4f91-8dfa-4d749e5111ba ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Lock "525b8695-a4df-46c5-875a-42d3b18b78be" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.913s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:34 np0005588920 systemd[1]: Started libpod-conmon-555298e7c120e880b2f02f1d59ccbc1715a04331944ba049a623571d23408496.scope.
Jan 20 09:42:34 np0005588920 podman[253962]: 2026-01-20 14:42:34.139609226 +0000 UTC m=+0.025929744 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:42:34 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:42:34 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/76b433b9b3697077276b8a6bd83a77b539ad94a8f90d450b2329161d85c9e966/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:42:34 np0005588920 podman[253962]: 2026-01-20 14:42:34.359785934 +0000 UTC m=+0.246106402 container init 555298e7c120e880b2f02f1d59ccbc1715a04331944ba049a623571d23408496 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b36e9cab-12c6-4a09-9aab-ef2679d875ba, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:42:34 np0005588920 podman[253962]: 2026-01-20 14:42:34.367558721 +0000 UTC m=+0.253879189 container start 555298e7c120e880b2f02f1d59ccbc1715a04331944ba049a623571d23408496 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b36e9cab-12c6-4a09-9aab-ef2679d875ba, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 20 09:42:34 np0005588920 neutron-haproxy-ovnmeta-b36e9cab-12c6-4a09-9aab-ef2679d875ba[253977]: [NOTICE]   (253981) : New worker (253983) forked
Jan 20 09:42:34 np0005588920 neutron-haproxy-ovnmeta-b36e9cab-12c6-4a09-9aab-ef2679d875ba[253977]: [NOTICE]   (253981) : Loading success.
Jan 20 09:42:34 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:42:34 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3903070332' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:42:34 np0005588920 nova_compute[226886]: 2026-01-20 14:42:34.438 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.505s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:42:34 np0005588920 nova_compute[226886]: 2026-01-20 14:42:34.444 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:42:34 np0005588920 nova_compute[226886]: 2026-01-20 14:42:34.480 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:42:34 np0005588920 nova_compute[226886]: 2026-01-20 14:42:34.539 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:42:34 np0005588920 nova_compute[226886]: 2026-01-20 14:42:34.539 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.905s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:34 np0005588920 nova_compute[226886]: 2026-01-20 14:42:34.997 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:35 np0005588920 nova_compute[226886]: 2026-01-20 14:42:35.012 226890 DEBUG nova.compute.manager [req-fcf539f3-6195-457d-b6aa-496809caef2b req-2c49bb5a-ab5c-48f6-9868-7078783c98b6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Received event network-vif-plugged-01ad691e-a0f7-4b59-9b84-3486189332a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:42:35 np0005588920 nova_compute[226886]: 2026-01-20 14:42:35.013 226890 DEBUG oslo_concurrency.lockutils [req-fcf539f3-6195-457d-b6aa-496809caef2b req-2c49bb5a-ab5c-48f6-9868-7078783c98b6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "55628882-70c9-4b43-b653-9983ba87ca0d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:35 np0005588920 nova_compute[226886]: 2026-01-20 14:42:35.013 226890 DEBUG oslo_concurrency.lockutils [req-fcf539f3-6195-457d-b6aa-496809caef2b req-2c49bb5a-ab5c-48f6-9868-7078783c98b6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "55628882-70c9-4b43-b653-9983ba87ca0d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:35 np0005588920 nova_compute[226886]: 2026-01-20 14:42:35.013 226890 DEBUG oslo_concurrency.lockutils [req-fcf539f3-6195-457d-b6aa-496809caef2b req-2c49bb5a-ab5c-48f6-9868-7078783c98b6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "55628882-70c9-4b43-b653-9983ba87ca0d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:35 np0005588920 nova_compute[226886]: 2026-01-20 14:42:35.013 226890 DEBUG nova.compute.manager [req-fcf539f3-6195-457d-b6aa-496809caef2b req-2c49bb5a-ab5c-48f6-9868-7078783c98b6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] No event matching network-vif-plugged-01ad691e-a0f7-4b59-9b84-3486189332a5 in dict_keys([('network-vif-plugged', '2251427b-053e-477b-919c-0a2be96a4c01'), ('network-vif-plugged', '7eea52e4-93c4-48e5-9db5-b9d834c2bdbd')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Jan 20 09:42:35 np0005588920 nova_compute[226886]: 2026-01-20 14:42:35.014 226890 WARNING nova.compute.manager [req-fcf539f3-6195-457d-b6aa-496809caef2b req-2c49bb5a-ab5c-48f6-9868-7078783c98b6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Received unexpected event network-vif-plugged-01ad691e-a0f7-4b59-9b84-3486189332a5 for instance with vm_state building and task_state spawning.#033[00m
Jan 20 09:42:35 np0005588920 nova_compute[226886]: 2026-01-20 14:42:35.171 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:42:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:35.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:42:35 np0005588920 nova_compute[226886]: 2026-01-20 14:42:35.307 226890 DEBUG nova.compute.manager [req-85d7d5be-14a8-4f98-8f63-0f2b47ecd4fd req-a2c13b90-c48d-4ffa-8ca3-1f6682af2ac5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Received event network-vif-plugged-2251427b-053e-477b-919c-0a2be96a4c01 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:42:35 np0005588920 nova_compute[226886]: 2026-01-20 14:42:35.308 226890 DEBUG oslo_concurrency.lockutils [req-85d7d5be-14a8-4f98-8f63-0f2b47ecd4fd req-a2c13b90-c48d-4ffa-8ca3-1f6682af2ac5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "55628882-70c9-4b43-b653-9983ba87ca0d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:35 np0005588920 nova_compute[226886]: 2026-01-20 14:42:35.308 226890 DEBUG oslo_concurrency.lockutils [req-85d7d5be-14a8-4f98-8f63-0f2b47ecd4fd req-a2c13b90-c48d-4ffa-8ca3-1f6682af2ac5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "55628882-70c9-4b43-b653-9983ba87ca0d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:35 np0005588920 nova_compute[226886]: 2026-01-20 14:42:35.308 226890 DEBUG oslo_concurrency.lockutils [req-85d7d5be-14a8-4f98-8f63-0f2b47ecd4fd req-a2c13b90-c48d-4ffa-8ca3-1f6682af2ac5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "55628882-70c9-4b43-b653-9983ba87ca0d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:35 np0005588920 nova_compute[226886]: 2026-01-20 14:42:35.308 226890 DEBUG nova.compute.manager [req-85d7d5be-14a8-4f98-8f63-0f2b47ecd4fd req-a2c13b90-c48d-4ffa-8ca3-1f6682af2ac5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Processing event network-vif-plugged-2251427b-053e-477b-919c-0a2be96a4c01 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:42:35 np0005588920 nova_compute[226886]: 2026-01-20 14:42:35.540 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:42:35 np0005588920 nova_compute[226886]: 2026-01-20 14:42:35.541 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:42:35 np0005588920 nova_compute[226886]: 2026-01-20 14:42:35.541 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:42:35 np0005588920 nova_compute[226886]: 2026-01-20 14:42:35.689 226890 DEBUG nova.compute.manager [req-cfd1f2a7-a6c4-4053-bef8-6e3776185dad req-16d1bf9d-72b2-4dd1-a9a0-9e4a989ced8d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Received event network-vif-plugged-01b95c4f-9db6-469f-9458-8c279a5778f0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:42:35 np0005588920 nova_compute[226886]: 2026-01-20 14:42:35.689 226890 DEBUG oslo_concurrency.lockutils [req-cfd1f2a7-a6c4-4053-bef8-6e3776185dad req-16d1bf9d-72b2-4dd1-a9a0-9e4a989ced8d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "525b8695-a4df-46c5-875a-42d3b18b78be-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:35 np0005588920 nova_compute[226886]: 2026-01-20 14:42:35.689 226890 DEBUG oslo_concurrency.lockutils [req-cfd1f2a7-a6c4-4053-bef8-6e3776185dad req-16d1bf9d-72b2-4dd1-a9a0-9e4a989ced8d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "525b8695-a4df-46c5-875a-42d3b18b78be-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:35 np0005588920 nova_compute[226886]: 2026-01-20 14:42:35.690 226890 DEBUG oslo_concurrency.lockutils [req-cfd1f2a7-a6c4-4053-bef8-6e3776185dad req-16d1bf9d-72b2-4dd1-a9a0-9e4a989ced8d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "525b8695-a4df-46c5-875a-42d3b18b78be-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:35 np0005588920 nova_compute[226886]: 2026-01-20 14:42:35.690 226890 DEBUG nova.compute.manager [req-cfd1f2a7-a6c4-4053-bef8-6e3776185dad req-16d1bf9d-72b2-4dd1-a9a0-9e4a989ced8d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] No waiting events found dispatching network-vif-plugged-01b95c4f-9db6-469f-9458-8c279a5778f0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:42:35 np0005588920 nova_compute[226886]: 2026-01-20 14:42:35.690 226890 WARNING nova.compute.manager [req-cfd1f2a7-a6c4-4053-bef8-6e3776185dad req-16d1bf9d-72b2-4dd1-a9a0-9e4a989ced8d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Received unexpected event network-vif-plugged-01b95c4f-9db6-469f-9458-8c279a5778f0 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:42:35 np0005588920 nova_compute[226886]: 2026-01-20 14:42:35.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:42:35 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:42:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:36.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:37.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:37 np0005588920 nova_compute[226886]: 2026-01-20 14:42:37.436 226890 DEBUG nova.compute.manager [req-a3662387-f8e9-4fa3-89e9-929adcf31f49 req-b55b30d8-0d99-42dd-8a67-643e4351944b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Received event network-vif-plugged-2251427b-053e-477b-919c-0a2be96a4c01 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:42:37 np0005588920 nova_compute[226886]: 2026-01-20 14:42:37.437 226890 DEBUG oslo_concurrency.lockutils [req-a3662387-f8e9-4fa3-89e9-929adcf31f49 req-b55b30d8-0d99-42dd-8a67-643e4351944b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "55628882-70c9-4b43-b653-9983ba87ca0d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:37 np0005588920 nova_compute[226886]: 2026-01-20 14:42:37.438 226890 DEBUG oslo_concurrency.lockutils [req-a3662387-f8e9-4fa3-89e9-929adcf31f49 req-b55b30d8-0d99-42dd-8a67-643e4351944b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "55628882-70c9-4b43-b653-9983ba87ca0d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:37 np0005588920 nova_compute[226886]: 2026-01-20 14:42:37.439 226890 DEBUG oslo_concurrency.lockutils [req-a3662387-f8e9-4fa3-89e9-929adcf31f49 req-b55b30d8-0d99-42dd-8a67-643e4351944b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "55628882-70c9-4b43-b653-9983ba87ca0d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:37 np0005588920 nova_compute[226886]: 2026-01-20 14:42:37.439 226890 DEBUG nova.compute.manager [req-a3662387-f8e9-4fa3-89e9-929adcf31f49 req-b55b30d8-0d99-42dd-8a67-643e4351944b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] No event matching network-vif-plugged-2251427b-053e-477b-919c-0a2be96a4c01 in dict_keys([('network-vif-plugged', '7eea52e4-93c4-48e5-9db5-b9d834c2bdbd')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Jan 20 09:42:37 np0005588920 nova_compute[226886]: 2026-01-20 14:42:37.440 226890 WARNING nova.compute.manager [req-a3662387-f8e9-4fa3-89e9-929adcf31f49 req-b55b30d8-0d99-42dd-8a67-643e4351944b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Received unexpected event network-vif-plugged-2251427b-053e-477b-919c-0a2be96a4c01 for instance with vm_state building and task_state spawning.#033[00m
Jan 20 09:42:37 np0005588920 nova_compute[226886]: 2026-01-20 14:42:37.440 226890 DEBUG nova.compute.manager [req-a3662387-f8e9-4fa3-89e9-929adcf31f49 req-b55b30d8-0d99-42dd-8a67-643e4351944b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Received event network-vif-plugged-7eea52e4-93c4-48e5-9db5-b9d834c2bdbd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:42:37 np0005588920 nova_compute[226886]: 2026-01-20 14:42:37.441 226890 DEBUG oslo_concurrency.lockutils [req-a3662387-f8e9-4fa3-89e9-929adcf31f49 req-b55b30d8-0d99-42dd-8a67-643e4351944b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "55628882-70c9-4b43-b653-9983ba87ca0d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:37 np0005588920 nova_compute[226886]: 2026-01-20 14:42:37.442 226890 DEBUG oslo_concurrency.lockutils [req-a3662387-f8e9-4fa3-89e9-929adcf31f49 req-b55b30d8-0d99-42dd-8a67-643e4351944b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "55628882-70c9-4b43-b653-9983ba87ca0d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:37 np0005588920 nova_compute[226886]: 2026-01-20 14:42:37.442 226890 DEBUG oslo_concurrency.lockutils [req-a3662387-f8e9-4fa3-89e9-929adcf31f49 req-b55b30d8-0d99-42dd-8a67-643e4351944b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "55628882-70c9-4b43-b653-9983ba87ca0d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:37 np0005588920 nova_compute[226886]: 2026-01-20 14:42:37.443 226890 DEBUG nova.compute.manager [req-a3662387-f8e9-4fa3-89e9-929adcf31f49 req-b55b30d8-0d99-42dd-8a67-643e4351944b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Processing event network-vif-plugged-7eea52e4-93c4-48e5-9db5-b9d834c2bdbd _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:42:37 np0005588920 nova_compute[226886]: 2026-01-20 14:42:37.443 226890 DEBUG nova.compute.manager [req-a3662387-f8e9-4fa3-89e9-929adcf31f49 req-b55b30d8-0d99-42dd-8a67-643e4351944b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Received event network-vif-plugged-7eea52e4-93c4-48e5-9db5-b9d834c2bdbd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:42:37 np0005588920 nova_compute[226886]: 2026-01-20 14:42:37.444 226890 DEBUG oslo_concurrency.lockutils [req-a3662387-f8e9-4fa3-89e9-929adcf31f49 req-b55b30d8-0d99-42dd-8a67-643e4351944b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "55628882-70c9-4b43-b653-9983ba87ca0d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:37 np0005588920 nova_compute[226886]: 2026-01-20 14:42:37.444 226890 DEBUG oslo_concurrency.lockutils [req-a3662387-f8e9-4fa3-89e9-929adcf31f49 req-b55b30d8-0d99-42dd-8a67-643e4351944b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "55628882-70c9-4b43-b653-9983ba87ca0d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:37 np0005588920 nova_compute[226886]: 2026-01-20 14:42:37.445 226890 DEBUG oslo_concurrency.lockutils [req-a3662387-f8e9-4fa3-89e9-929adcf31f49 req-b55b30d8-0d99-42dd-8a67-643e4351944b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "55628882-70c9-4b43-b653-9983ba87ca0d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:37 np0005588920 nova_compute[226886]: 2026-01-20 14:42:37.446 226890 DEBUG nova.compute.manager [req-a3662387-f8e9-4fa3-89e9-929adcf31f49 req-b55b30d8-0d99-42dd-8a67-643e4351944b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] No waiting events found dispatching network-vif-plugged-7eea52e4-93c4-48e5-9db5-b9d834c2bdbd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:42:37 np0005588920 nova_compute[226886]: 2026-01-20 14:42:37.446 226890 WARNING nova.compute.manager [req-a3662387-f8e9-4fa3-89e9-929adcf31f49 req-b55b30d8-0d99-42dd-8a67-643e4351944b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Received unexpected event network-vif-plugged-7eea52e4-93c4-48e5-9db5-b9d834c2bdbd for instance with vm_state building and task_state spawning.#033[00m
Jan 20 09:42:37 np0005588920 nova_compute[226886]: 2026-01-20 14:42:37.450 226890 DEBUG nova.compute.manager [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Instance event wait completed in 5 seconds for network-vif-plugged,network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:42:37 np0005588920 nova_compute[226886]: 2026-01-20 14:42:37.456 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920157.4564848, 55628882-70c9-4b43-b653-9983ba87ca0d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:42:37 np0005588920 nova_compute[226886]: 2026-01-20 14:42:37.457 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:42:37 np0005588920 nova_compute[226886]: 2026-01-20 14:42:37.471 226890 DEBUG nova.virt.libvirt.driver [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:42:37 np0005588920 nova_compute[226886]: 2026-01-20 14:42:37.474 226890 INFO nova.virt.libvirt.driver [-] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Instance spawned successfully.#033[00m
Jan 20 09:42:37 np0005588920 nova_compute[226886]: 2026-01-20 14:42:37.475 226890 DEBUG nova.virt.libvirt.driver [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:42:37 np0005588920 nova_compute[226886]: 2026-01-20 14:42:37.491 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:42:37 np0005588920 nova_compute[226886]: 2026-01-20 14:42:37.494 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:42:37 np0005588920 nova_compute[226886]: 2026-01-20 14:42:37.503 226890 DEBUG nova.virt.libvirt.driver [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:42:37 np0005588920 nova_compute[226886]: 2026-01-20 14:42:37.503 226890 DEBUG nova.virt.libvirt.driver [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:42:37 np0005588920 nova_compute[226886]: 2026-01-20 14:42:37.504 226890 DEBUG nova.virt.libvirt.driver [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:42:37 np0005588920 nova_compute[226886]: 2026-01-20 14:42:37.504 226890 DEBUG nova.virt.libvirt.driver [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:42:37 np0005588920 nova_compute[226886]: 2026-01-20 14:42:37.505 226890 DEBUG nova.virt.libvirt.driver [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:42:37 np0005588920 nova_compute[226886]: 2026-01-20 14:42:37.505 226890 DEBUG nova.virt.libvirt.driver [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:42:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:37.515 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:42:37 np0005588920 nova_compute[226886]: 2026-01-20 14:42:37.515 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:37.516 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:42:37 np0005588920 nova_compute[226886]: 2026-01-20 14:42:37.545 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:42:37 np0005588920 nova_compute[226886]: 2026-01-20 14:42:37.681 226890 INFO nova.compute.manager [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Took 33.10 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:42:37 np0005588920 nova_compute[226886]: 2026-01-20 14:42:37.681 226890 DEBUG nova.compute.manager [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:42:37 np0005588920 nova_compute[226886]: 2026-01-20 14:42:37.950 226890 INFO nova.compute.manager [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Took 35.65 seconds to build instance.#033[00m
Jan 20 09:42:37 np0005588920 nova_compute[226886]: 2026-01-20 14:42:37.997 226890 DEBUG oslo_concurrency.lockutils [None req-b3453114-734c-4696-a753-b07ba35028ca 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Lock "55628882-70c9-4b43-b653-9983ba87ca0d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 35.808s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:38.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:39.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.502 226890 DEBUG oslo_concurrency.lockutils [None req-0f20f07f-129e-4155-b447-3f020eaa7538 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Acquiring lock "55628882-70c9-4b43-b653-9983ba87ca0d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.503 226890 DEBUG oslo_concurrency.lockutils [None req-0f20f07f-129e-4155-b447-3f020eaa7538 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Lock "55628882-70c9-4b43-b653-9983ba87ca0d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.503 226890 DEBUG oslo_concurrency.lockutils [None req-0f20f07f-129e-4155-b447-3f020eaa7538 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Acquiring lock "55628882-70c9-4b43-b653-9983ba87ca0d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.503 226890 DEBUG oslo_concurrency.lockutils [None req-0f20f07f-129e-4155-b447-3f020eaa7538 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Lock "55628882-70c9-4b43-b653-9983ba87ca0d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.504 226890 DEBUG oslo_concurrency.lockutils [None req-0f20f07f-129e-4155-b447-3f020eaa7538 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Lock "55628882-70c9-4b43-b653-9983ba87ca0d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.505 226890 INFO nova.compute.manager [None req-0f20f07f-129e-4155-b447-3f020eaa7538 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Terminating instance#033[00m
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.505 226890 DEBUG nova.compute.manager [None req-0f20f07f-129e-4155-b447-3f020eaa7538 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:42:39 np0005588920 kernel: tap2251427b-05 (unregistering): left promiscuous mode
Jan 20 09:42:39 np0005588920 NetworkManager[49076]: <info>  [1768920159.5456] device (tap2251427b-05): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:42:39 np0005588920 ovn_controller[133971]: 2026-01-20T14:42:39Z|00287|binding|INFO|Releasing lport 2251427b-053e-477b-919c-0a2be96a4c01 from this chassis (sb_readonly=0)
Jan 20 09:42:39 np0005588920 ovn_controller[133971]: 2026-01-20T14:42:39Z|00288|binding|INFO|Setting lport 2251427b-053e-477b-919c-0a2be96a4c01 down in Southbound
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.552 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:39 np0005588920 ovn_controller[133971]: 2026-01-20T14:42:39Z|00289|binding|INFO|Removing iface tap2251427b-05 ovn-installed in OVS
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.555 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:39.564 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:89:1e:d5 10.100.0.171'], port_security=['fa:16:3e:89:1e:d5 10.100.0.171'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.171/24', 'neutron:device_id': '55628882-70c9-4b43-b653-9983ba87ca0d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6e7523a4-fd95-46c8-82f2-10c4527c1b7d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '13c0d93976f745dba4ab050770ccaae6', 'neutron:revision_number': '4', 'neutron:security_group_ids': '774782e1-5f70-48c6-908e-055253fbcd30', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=409644b9-9c91-489f-870e-52aa4bf20678, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=2251427b-053e-477b-919c-0a2be96a4c01) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:42:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:39.565 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 2251427b-053e-477b-919c-0a2be96a4c01 in datapath 6e7523a4-fd95-46c8-82f2-10c4527c1b7d unbound from our chassis#033[00m
Jan 20 09:42:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:39.566 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6e7523a4-fd95-46c8-82f2-10c4527c1b7d#033[00m
Jan 20 09:42:39 np0005588920 kernel: tap01ad691e-a0 (unregistering): left promiscuous mode
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.577 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:39 np0005588920 NetworkManager[49076]: <info>  [1768920159.5805] device (tap01ad691e-a0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:42:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:39.584 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[03c59be5-7e86-467e-8a51-f48bfb82edba]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:39 np0005588920 ovn_controller[133971]: 2026-01-20T14:42:39Z|00290|binding|INFO|Releasing lport 01ad691e-a0f7-4b59-9b84-3486189332a5 from this chassis (sb_readonly=0)
Jan 20 09:42:39 np0005588920 ovn_controller[133971]: 2026-01-20T14:42:39Z|00291|binding|INFO|Setting lport 01ad691e-a0f7-4b59-9b84-3486189332a5 down in Southbound
Jan 20 09:42:39 np0005588920 ovn_controller[133971]: 2026-01-20T14:42:39Z|00292|binding|INFO|Removing iface tap01ad691e-a0 ovn-installed in OVS
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.591 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.593 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:39.599 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:54:02:77 10.100.1.236'], port_security=['fa:16:3e:54:02:77 10.100.1.236'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.1.236/24', 'neutron:device_id': '55628882-70c9-4b43-b653-9983ba87ca0d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1a896e9d-306e-4f99-81e1-986b217a807d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '13c0d93976f745dba4ab050770ccaae6', 'neutron:revision_number': '4', 'neutron:security_group_ids': '774782e1-5f70-48c6-908e-055253fbcd30', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f570666c-7b28-45d1-80d8-f28c7296833d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=01ad691e-a0f7-4b59-9b84-3486189332a5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:42:39 np0005588920 kernel: tap7eea52e4-93 (unregistering): left promiscuous mode
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.606 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:39 np0005588920 NetworkManager[49076]: <info>  [1768920159.6114] device (tap7eea52e4-93): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:42:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:39.618 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[6b3da70d-bdda-45be-944f-dfec81190fbd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:39 np0005588920 ovn_controller[133971]: 2026-01-20T14:42:39Z|00293|binding|INFO|Releasing lport 7eea52e4-93c4-48e5-9db5-b9d834c2bdbd from this chassis (sb_readonly=0)
Jan 20 09:42:39 np0005588920 ovn_controller[133971]: 2026-01-20T14:42:39Z|00294|binding|INFO|Setting lport 7eea52e4-93c4-48e5-9db5-b9d834c2bdbd down in Southbound
Jan 20 09:42:39 np0005588920 ovn_controller[133971]: 2026-01-20T14:42:39Z|00295|binding|INFO|Removing iface tap7eea52e4-93 ovn-installed in OVS
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.621 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.623 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:39.621 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[e027e22e-bc5f-49ef-a2f7-a399765101a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:39.631 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5a:cf:4c 10.100.0.107'], port_security=['fa:16:3e:5a:cf:4c 10.100.0.107'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.107/24', 'neutron:device_id': '55628882-70c9-4b43-b653-9983ba87ca0d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6e7523a4-fd95-46c8-82f2-10c4527c1b7d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '13c0d93976f745dba4ab050770ccaae6', 'neutron:revision_number': '4', 'neutron:security_group_ids': '774782e1-5f70-48c6-908e-055253fbcd30', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=409644b9-9c91-489f-870e-52aa4bf20678, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=7eea52e4-93c4-48e5-9db5-b9d834c2bdbd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.635 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:39.653 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[d7d880bd-68e6-4ed7-b82d-fe8e08de1a80]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:39 np0005588920 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d00000049.scope: Deactivated successfully.
Jan 20 09:42:39 np0005588920 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d00000049.scope: Consumed 2.955s CPU time.
Jan 20 09:42:39 np0005588920 systemd-machined[196121]: Machine qemu-31-instance-00000049 terminated.
Jan 20 09:42:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:39.669 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e31a4691-334b-4a73-888d-4d92d2b44f64]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6e7523a4-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:93:03'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 7, 'rx_bytes': 532, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 7, 'rx_bytes': 532, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 88], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 519337, 'reachable_time': 28108, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 254016, 'error': None, 'target': 'ovnmeta-6e7523a4-fd95-46c8-82f2-10c4527c1b7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:39.683 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[8b9c1edb-61ca-4248-a822-64d6a63b297a]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap6e7523a4-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 519349, 'tstamp': 519349}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 254017, 'error': None, 'target': 'ovnmeta-6e7523a4-fd95-46c8-82f2-10c4527c1b7d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.255'], ['IFA_LABEL', 'tap6e7523a4-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 519351, 'tstamp': 519351}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 254017, 'error': None, 'target': 'ovnmeta-6e7523a4-fd95-46c8-82f2-10c4527c1b7d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:39.685 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6e7523a4-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.686 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.694 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:39.695 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6e7523a4-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:39.695 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:42:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:39.696 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6e7523a4-f0, col_values=(('external_ids', {'iface-id': '73e18e36-18cb-4146-ae19-4b33b97a050a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:39.696 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:42:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:39.697 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 01ad691e-a0f7-4b59-9b84-3486189332a5 in datapath 1a896e9d-306e-4f99-81e1-986b217a807d unbound from our chassis#033[00m
Jan 20 09:42:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:39.698 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1a896e9d-306e-4f99-81e1-986b217a807d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:42:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:39.699 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[77eb6f3e-748e-445a-87e4-8334fc56215b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:39.699 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1a896e9d-306e-4f99-81e1-986b217a807d namespace which is not needed anymore#033[00m
Jan 20 09:42:39 np0005588920 NetworkManager[49076]: <info>  [1768920159.7351] manager: (tap01ad691e-a0): new Tun device (/org/freedesktop/NetworkManager/Devices/154)
Jan 20 09:42:39 np0005588920 NetworkManager[49076]: <info>  [1768920159.7492] manager: (tap7eea52e4-93): new Tun device (/org/freedesktop/NetworkManager/Devices/155)
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.764 226890 INFO nova.virt.libvirt.driver [-] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Instance destroyed successfully.#033[00m
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.765 226890 DEBUG nova.objects.instance [None req-0f20f07f-129e-4155-b447-3f020eaa7538 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Lazy-loading 'resources' on Instance uuid 55628882-70c9-4b43-b653-9983ba87ca0d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.789 226890 DEBUG nova.virt.libvirt.vif [None req-0f20f07f-129e-4155-b447-3f020eaa7538 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:41:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-74208518',display_name='tempest-ServersTestMultiNic-server-74208518',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-74208518',id=73,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:42:37Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='13c0d93976f745dba4ab050770ccaae6',ramdisk_id='',reservation_id='r-so2diw9c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1'
,image_min_ram='0',owner_project_name='tempest-ServersTestMultiNic-1634662961',owner_user_name='tempest-ServersTestMultiNic-1634662961-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:42:37Z,user_data=None,user_id='6a00517a957e4ceb8564cbf1dfa15ee2',uuid=55628882-70c9-4b43-b653-9983ba87ca0d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2251427b-053e-477b-919c-0a2be96a4c01", "address": "fa:16:3e:89:1e:d5", "network": {"id": "6e7523a4-fd95-46c8-82f2-10c4527c1b7d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1506530564", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.171", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13c0d93976f745dba4ab050770ccaae6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2251427b-05", "ovs_interfaceid": "2251427b-053e-477b-919c-0a2be96a4c01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.790 226890 DEBUG nova.network.os_vif_util [None req-0f20f07f-129e-4155-b447-3f020eaa7538 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Converting VIF {"id": "2251427b-053e-477b-919c-0a2be96a4c01", "address": "fa:16:3e:89:1e:d5", "network": {"id": "6e7523a4-fd95-46c8-82f2-10c4527c1b7d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1506530564", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.171", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13c0d93976f745dba4ab050770ccaae6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2251427b-05", "ovs_interfaceid": "2251427b-053e-477b-919c-0a2be96a4c01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.790 226890 DEBUG nova.network.os_vif_util [None req-0f20f07f-129e-4155-b447-3f020eaa7538 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:89:1e:d5,bridge_name='br-int',has_traffic_filtering=True,id=2251427b-053e-477b-919c-0a2be96a4c01,network=Network(6e7523a4-fd95-46c8-82f2-10c4527c1b7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2251427b-05') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.790 226890 DEBUG os_vif [None req-0f20f07f-129e-4155-b447-3f020eaa7538 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:89:1e:d5,bridge_name='br-int',has_traffic_filtering=True,id=2251427b-053e-477b-919c-0a2be96a4c01,network=Network(6e7523a4-fd95-46c8-82f2-10c4527c1b7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2251427b-05') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.792 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.793 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2251427b-05, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.794 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.797 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.805 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.807 226890 INFO os_vif [None req-0f20f07f-129e-4155-b447-3f020eaa7538 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:89:1e:d5,bridge_name='br-int',has_traffic_filtering=True,id=2251427b-053e-477b-919c-0a2be96a4c01,network=Network(6e7523a4-fd95-46c8-82f2-10c4527c1b7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2251427b-05')#033[00m
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.808 226890 DEBUG nova.virt.libvirt.vif [None req-0f20f07f-129e-4155-b447-3f020eaa7538 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:41:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-74208518',display_name='tempest-ServersTestMultiNic-server-74208518',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-74208518',id=73,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:42:37Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='13c0d93976f745dba4ab050770ccaae6',ramdisk_id='',reservation_id='r-so2diw9c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1'
,image_min_ram='0',owner_project_name='tempest-ServersTestMultiNic-1634662961',owner_user_name='tempest-ServersTestMultiNic-1634662961-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:42:37Z,user_data=None,user_id='6a00517a957e4ceb8564cbf1dfa15ee2',uuid=55628882-70c9-4b43-b653-9983ba87ca0d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "01ad691e-a0f7-4b59-9b84-3486189332a5", "address": "fa:16:3e:54:02:77", "network": {"id": "1a896e9d-306e-4f99-81e1-986b217a807d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1473946507", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.236", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13c0d93976f745dba4ab050770ccaae6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01ad691e-a0", "ovs_interfaceid": "01ad691e-a0f7-4b59-9b84-3486189332a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.808 226890 DEBUG nova.network.os_vif_util [None req-0f20f07f-129e-4155-b447-3f020eaa7538 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Converting VIF {"id": "01ad691e-a0f7-4b59-9b84-3486189332a5", "address": "fa:16:3e:54:02:77", "network": {"id": "1a896e9d-306e-4f99-81e1-986b217a807d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1473946507", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.236", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13c0d93976f745dba4ab050770ccaae6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01ad691e-a0", "ovs_interfaceid": "01ad691e-a0f7-4b59-9b84-3486189332a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.809 226890 DEBUG nova.network.os_vif_util [None req-0f20f07f-129e-4155-b447-3f020eaa7538 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:54:02:77,bridge_name='br-int',has_traffic_filtering=True,id=01ad691e-a0f7-4b59-9b84-3486189332a5,network=Network(1a896e9d-306e-4f99-81e1-986b217a807d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01ad691e-a0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.809 226890 DEBUG os_vif [None req-0f20f07f-129e-4155-b447-3f020eaa7538 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:54:02:77,bridge_name='br-int',has_traffic_filtering=True,id=01ad691e-a0f7-4b59-9b84-3486189332a5,network=Network(1a896e9d-306e-4f99-81e1-986b217a807d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01ad691e-a0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.810 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.811 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap01ad691e-a0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.812 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.814 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.816 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.817 226890 INFO os_vif [None req-0f20f07f-129e-4155-b447-3f020eaa7538 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:54:02:77,bridge_name='br-int',has_traffic_filtering=True,id=01ad691e-a0f7-4b59-9b84-3486189332a5,network=Network(1a896e9d-306e-4f99-81e1-986b217a807d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01ad691e-a0')#033[00m
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.818 226890 DEBUG nova.virt.libvirt.vif [None req-0f20f07f-129e-4155-b447-3f020eaa7538 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:41:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-74208518',display_name='tempest-ServersTestMultiNic-server-74208518',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-74208518',id=73,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:42:37Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='13c0d93976f745dba4ab050770ccaae6',ramdisk_id='',reservation_id='r-so2diw9c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1'
,image_min_ram='0',owner_project_name='tempest-ServersTestMultiNic-1634662961',owner_user_name='tempest-ServersTestMultiNic-1634662961-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:42:37Z,user_data=None,user_id='6a00517a957e4ceb8564cbf1dfa15ee2',uuid=55628882-70c9-4b43-b653-9983ba87ca0d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7eea52e4-93c4-48e5-9db5-b9d834c2bdbd", "address": "fa:16:3e:5a:cf:4c", "network": {"id": "6e7523a4-fd95-46c8-82f2-10c4527c1b7d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1506530564", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.107", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13c0d93976f745dba4ab050770ccaae6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7eea52e4-93", "ovs_interfaceid": "7eea52e4-93c4-48e5-9db5-b9d834c2bdbd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.819 226890 DEBUG nova.network.os_vif_util [None req-0f20f07f-129e-4155-b447-3f020eaa7538 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Converting VIF {"id": "7eea52e4-93c4-48e5-9db5-b9d834c2bdbd", "address": "fa:16:3e:5a:cf:4c", "network": {"id": "6e7523a4-fd95-46c8-82f2-10c4527c1b7d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1506530564", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.107", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13c0d93976f745dba4ab050770ccaae6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7eea52e4-93", "ovs_interfaceid": "7eea52e4-93c4-48e5-9db5-b9d834c2bdbd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.819 226890 DEBUG nova.network.os_vif_util [None req-0f20f07f-129e-4155-b447-3f020eaa7538 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5a:cf:4c,bridge_name='br-int',has_traffic_filtering=True,id=7eea52e4-93c4-48e5-9db5-b9d834c2bdbd,network=Network(6e7523a4-fd95-46c8-82f2-10c4527c1b7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7eea52e4-93') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.820 226890 DEBUG os_vif [None req-0f20f07f-129e-4155-b447-3f020eaa7538 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5a:cf:4c,bridge_name='br-int',has_traffic_filtering=True,id=7eea52e4-93c4-48e5-9db5-b9d834c2bdbd,network=Network(6e7523a4-fd95-46c8-82f2-10c4527c1b7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7eea52e4-93') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.821 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.821 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7eea52e4-93, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.822 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.823 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.825 226890 INFO os_vif [None req-0f20f07f-129e-4155-b447-3f020eaa7538 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5a:cf:4c,bridge_name='br-int',has_traffic_filtering=True,id=7eea52e4-93c4-48e5-9db5-b9d834c2bdbd,network=Network(6e7523a4-fd95-46c8-82f2-10c4527c1b7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7eea52e4-93')#033[00m
Jan 20 09:42:39 np0005588920 neutron-haproxy-ovnmeta-1a896e9d-306e-4f99-81e1-986b217a807d[253875]: [NOTICE]   (253879) : haproxy version is 2.8.14-c23fe91
Jan 20 09:42:39 np0005588920 neutron-haproxy-ovnmeta-1a896e9d-306e-4f99-81e1-986b217a807d[253875]: [NOTICE]   (253879) : path to executable is /usr/sbin/haproxy
Jan 20 09:42:39 np0005588920 neutron-haproxy-ovnmeta-1a896e9d-306e-4f99-81e1-986b217a807d[253875]: [WARNING]  (253879) : Exiting Master process...
Jan 20 09:42:39 np0005588920 neutron-haproxy-ovnmeta-1a896e9d-306e-4f99-81e1-986b217a807d[253875]: [ALERT]    (253879) : Current worker (253881) exited with code 143 (Terminated)
Jan 20 09:42:39 np0005588920 neutron-haproxy-ovnmeta-1a896e9d-306e-4f99-81e1-986b217a807d[253875]: [WARNING]  (253879) : All workers exited. Exiting... (0)
Jan 20 09:42:39 np0005588920 systemd[1]: libpod-c2baea4fcf241670bfe2b8c09e5695b06d25a378928cf222c6df0f7652b8ac96.scope: Deactivated successfully.
Jan 20 09:42:39 np0005588920 podman[254074]: 2026-01-20 14:42:39.845073524 +0000 UTC m=+0.054932302 container died c2baea4fcf241670bfe2b8c09e5695b06d25a378928cf222c6df0f7652b8ac96 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a896e9d-306e-4f99-81e1-986b217a807d, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 20 09:42:39 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c2baea4fcf241670bfe2b8c09e5695b06d25a378928cf222c6df0f7652b8ac96-userdata-shm.mount: Deactivated successfully.
Jan 20 09:42:39 np0005588920 systemd[1]: var-lib-containers-storage-overlay-b3a426077250f76807b457026b1e012c2576eb0c862b757e8d88db3d21154bb8-merged.mount: Deactivated successfully.
Jan 20 09:42:39 np0005588920 podman[254074]: 2026-01-20 14:42:39.878805275 +0000 UTC m=+0.088664053 container cleanup c2baea4fcf241670bfe2b8c09e5695b06d25a378928cf222c6df0f7652b8ac96 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a896e9d-306e-4f99-81e1-986b217a807d, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:42:39 np0005588920 systemd[1]: libpod-conmon-c2baea4fcf241670bfe2b8c09e5695b06d25a378928cf222c6df0f7652b8ac96.scope: Deactivated successfully.
Jan 20 09:42:39 np0005588920 podman[254120]: 2026-01-20 14:42:39.933753087 +0000 UTC m=+0.035800500 container remove c2baea4fcf241670bfe2b8c09e5695b06d25a378928cf222c6df0f7652b8ac96 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a896e9d-306e-4f99-81e1-986b217a807d, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 20 09:42:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:39.938 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[384422c8-6f1c-446d-8426-7088892d5235]: (4, ('Tue Jan 20 02:42:39 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1a896e9d-306e-4f99-81e1-986b217a807d (c2baea4fcf241670bfe2b8c09e5695b06d25a378928cf222c6df0f7652b8ac96)\nc2baea4fcf241670bfe2b8c09e5695b06d25a378928cf222c6df0f7652b8ac96\nTue Jan 20 02:42:39 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1a896e9d-306e-4f99-81e1-986b217a807d (c2baea4fcf241670bfe2b8c09e5695b06d25a378928cf222c6df0f7652b8ac96)\nc2baea4fcf241670bfe2b8c09e5695b06d25a378928cf222c6df0f7652b8ac96\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:39.940 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[1e8b7d6d-2749-4ebb-8b7c-0c3c1d01d1bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:39.944 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1a896e9d-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.945 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:39 np0005588920 kernel: tap1a896e9d-30: left promiscuous mode
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.958 226890 DEBUG nova.compute.manager [req-f999557f-de30-44f8-923d-5b0a29f81a82 req-49f1a9a7-48cd-479e-8f44-dd0959a274a1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Received event network-vif-unplugged-01ad691e-a0f7-4b59-9b84-3486189332a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.958 226890 DEBUG oslo_concurrency.lockutils [req-f999557f-de30-44f8-923d-5b0a29f81a82 req-49f1a9a7-48cd-479e-8f44-dd0959a274a1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "55628882-70c9-4b43-b653-9983ba87ca0d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.958 226890 DEBUG oslo_concurrency.lockutils [req-f999557f-de30-44f8-923d-5b0a29f81a82 req-49f1a9a7-48cd-479e-8f44-dd0959a274a1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "55628882-70c9-4b43-b653-9983ba87ca0d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.959 226890 DEBUG oslo_concurrency.lockutils [req-f999557f-de30-44f8-923d-5b0a29f81a82 req-49f1a9a7-48cd-479e-8f44-dd0959a274a1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "55628882-70c9-4b43-b653-9983ba87ca0d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.959 226890 DEBUG nova.compute.manager [req-f999557f-de30-44f8-923d-5b0a29f81a82 req-49f1a9a7-48cd-479e-8f44-dd0959a274a1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] No waiting events found dispatching network-vif-unplugged-01ad691e-a0f7-4b59-9b84-3486189332a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.959 226890 DEBUG nova.compute.manager [req-f999557f-de30-44f8-923d-5b0a29f81a82 req-49f1a9a7-48cd-479e-8f44-dd0959a274a1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Received event network-vif-unplugged-01ad691e-a0f7-4b59-9b84-3486189332a5 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:42:39 np0005588920 nova_compute[226886]: 2026-01-20 14:42:39.961 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:39.965 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[988eee4b-3b2c-454c-b641-d99a5e4f63ff]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:39.985 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a5964551-b4ea-4093-8605-ab5843dd2e3e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:39.987 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[fb277f87-01e3-4a5e-becc-b7c7149b08b2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:40 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:40.003 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[8830ea96-f27f-48b2-a400-ad562af705d0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 519417, 'reachable_time': 16050, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 254136, 'error': None, 'target': 'ovnmeta-1a896e9d-306e-4f99-81e1-986b217a807d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:40 np0005588920 systemd[1]: run-netns-ovnmeta\x2d1a896e9d\x2d306e\x2d4f99\x2d81e1\x2d986b217a807d.mount: Deactivated successfully.
Jan 20 09:42:40 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:40.009 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1a896e9d-306e-4f99-81e1-986b217a807d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:42:40 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:40.010 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[bfbd5e37-385e-494d-ad9d-9f8efd61d66a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:40 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:40.011 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 7eea52e4-93c4-48e5-9db5-b9d834c2bdbd in datapath 6e7523a4-fd95-46c8-82f2-10c4527c1b7d unbound from our chassis#033[00m
Jan 20 09:42:40 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:40.013 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6e7523a4-fd95-46c8-82f2-10c4527c1b7d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:42:40 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:40.013 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c9989111-e7f8-40d0-87a3-7031d80efb37]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:40 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:40.014 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6e7523a4-fd95-46c8-82f2-10c4527c1b7d namespace which is not needed anymore#033[00m
Jan 20 09:42:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:42:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:40.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:42:40 np0005588920 podman[254134]: 2026-01-20 14:42:40.078266815 +0000 UTC m=+0.082342196 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Jan 20 09:42:40 np0005588920 neutron-haproxy-ovnmeta-6e7523a4-fd95-46c8-82f2-10c4527c1b7d[253714]: [NOTICE]   (253722) : haproxy version is 2.8.14-c23fe91
Jan 20 09:42:40 np0005588920 neutron-haproxy-ovnmeta-6e7523a4-fd95-46c8-82f2-10c4527c1b7d[253714]: [NOTICE]   (253722) : path to executable is /usr/sbin/haproxy
Jan 20 09:42:40 np0005588920 neutron-haproxy-ovnmeta-6e7523a4-fd95-46c8-82f2-10c4527c1b7d[253714]: [ALERT]    (253722) : Current worker (253724) exited with code 143 (Terminated)
Jan 20 09:42:40 np0005588920 neutron-haproxy-ovnmeta-6e7523a4-fd95-46c8-82f2-10c4527c1b7d[253714]: [WARNING]  (253722) : All workers exited. Exiting... (0)
Jan 20 09:42:40 np0005588920 systemd[1]: libpod-a160ddd29c1b79bd7d7f22cc94af84c95f286b5c2b34ceca96c5fb30e9d9b45c.scope: Deactivated successfully.
Jan 20 09:42:40 np0005588920 conmon[253714]: conmon a160ddd29c1b79bd7d7f <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-a160ddd29c1b79bd7d7f22cc94af84c95f286b5c2b34ceca96c5fb30e9d9b45c.scope/container/memory.events
Jan 20 09:42:40 np0005588920 podman[254173]: 2026-01-20 14:42:40.155696414 +0000 UTC m=+0.055543489 container died a160ddd29c1b79bd7d7f22cc94af84c95f286b5c2b34ceca96c5fb30e9d9b45c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6e7523a4-fd95-46c8-82f2-10c4527c1b7d, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:42:40 np0005588920 nova_compute[226886]: 2026-01-20 14:42:40.182 226890 DEBUG nova.compute.manager [req-3864951c-7244-480a-bb57-1db8b3e12227 req-142f3873-1a55-4913-ad7f-991c766c3ac7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Received event network-vif-unplugged-2251427b-053e-477b-919c-0a2be96a4c01 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:42:40 np0005588920 nova_compute[226886]: 2026-01-20 14:42:40.183 226890 DEBUG oslo_concurrency.lockutils [req-3864951c-7244-480a-bb57-1db8b3e12227 req-142f3873-1a55-4913-ad7f-991c766c3ac7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "55628882-70c9-4b43-b653-9983ba87ca0d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:40 np0005588920 nova_compute[226886]: 2026-01-20 14:42:40.183 226890 DEBUG oslo_concurrency.lockutils [req-3864951c-7244-480a-bb57-1db8b3e12227 req-142f3873-1a55-4913-ad7f-991c766c3ac7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "55628882-70c9-4b43-b653-9983ba87ca0d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:40 np0005588920 nova_compute[226886]: 2026-01-20 14:42:40.184 226890 DEBUG oslo_concurrency.lockutils [req-3864951c-7244-480a-bb57-1db8b3e12227 req-142f3873-1a55-4913-ad7f-991c766c3ac7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "55628882-70c9-4b43-b653-9983ba87ca0d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:40 np0005588920 nova_compute[226886]: 2026-01-20 14:42:40.184 226890 DEBUG nova.compute.manager [req-3864951c-7244-480a-bb57-1db8b3e12227 req-142f3873-1a55-4913-ad7f-991c766c3ac7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] No waiting events found dispatching network-vif-unplugged-2251427b-053e-477b-919c-0a2be96a4c01 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:42:40 np0005588920 nova_compute[226886]: 2026-01-20 14:42:40.185 226890 DEBUG nova.compute.manager [req-3864951c-7244-480a-bb57-1db8b3e12227 req-142f3873-1a55-4913-ad7f-991c766c3ac7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Received event network-vif-unplugged-2251427b-053e-477b-919c-0a2be96a4c01 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:42:40 np0005588920 nova_compute[226886]: 2026-01-20 14:42:40.210 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:40 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a160ddd29c1b79bd7d7f22cc94af84c95f286b5c2b34ceca96c5fb30e9d9b45c-userdata-shm.mount: Deactivated successfully.
Jan 20 09:42:40 np0005588920 systemd[1]: var-lib-containers-storage-overlay-1b60ffe80042a32b7f20eeef822902410f0f9eb9b9517bf405b0a09859f02306-merged.mount: Deactivated successfully.
Jan 20 09:42:40 np0005588920 podman[254173]: 2026-01-20 14:42:40.223265958 +0000 UTC m=+0.123113033 container cleanup a160ddd29c1b79bd7d7f22cc94af84c95f286b5c2b34ceca96c5fb30e9d9b45c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6e7523a4-fd95-46c8-82f2-10c4527c1b7d, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:42:40 np0005588920 systemd[1]: libpod-conmon-a160ddd29c1b79bd7d7f22cc94af84c95f286b5c2b34ceca96c5fb30e9d9b45c.scope: Deactivated successfully.
Jan 20 09:42:40 np0005588920 nova_compute[226886]: 2026-01-20 14:42:40.252 226890 INFO nova.virt.libvirt.driver [None req-0f20f07f-129e-4155-b447-3f020eaa7538 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Deleting instance files /var/lib/nova/instances/55628882-70c9-4b43-b653-9983ba87ca0d_del#033[00m
Jan 20 09:42:40 np0005588920 nova_compute[226886]: 2026-01-20 14:42:40.253 226890 INFO nova.virt.libvirt.driver [None req-0f20f07f-129e-4155-b447-3f020eaa7538 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Deletion of /var/lib/nova/instances/55628882-70c9-4b43-b653-9983ba87ca0d_del complete#033[00m
Jan 20 09:42:40 np0005588920 podman[254202]: 2026-01-20 14:42:40.278115957 +0000 UTC m=+0.036074747 container remove a160ddd29c1b79bd7d7f22cc94af84c95f286b5c2b34ceca96c5fb30e9d9b45c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6e7523a4-fd95-46c8-82f2-10c4527c1b7d, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 20 09:42:40 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:40.284 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e09ac76a-b5f9-48bc-9efa-e87437498d7d]: (4, ('Tue Jan 20 02:42:40 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-6e7523a4-fd95-46c8-82f2-10c4527c1b7d (a160ddd29c1b79bd7d7f22cc94af84c95f286b5c2b34ceca96c5fb30e9d9b45c)\na160ddd29c1b79bd7d7f22cc94af84c95f286b5c2b34ceca96c5fb30e9d9b45c\nTue Jan 20 02:42:40 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-6e7523a4-fd95-46c8-82f2-10c4527c1b7d (a160ddd29c1b79bd7d7f22cc94af84c95f286b5c2b34ceca96c5fb30e9d9b45c)\na160ddd29c1b79bd7d7f22cc94af84c95f286b5c2b34ceca96c5fb30e9d9b45c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:40 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:40.286 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2483a537-5271-44e5-a543-c31705341606]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:40 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:40.287 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6e7523a4-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:40 np0005588920 nova_compute[226886]: 2026-01-20 14:42:40.289 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:40 np0005588920 kernel: tap6e7523a4-f0: left promiscuous mode
Jan 20 09:42:40 np0005588920 nova_compute[226886]: 2026-01-20 14:42:40.302 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:40 np0005588920 nova_compute[226886]: 2026-01-20 14:42:40.304 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:40 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:40.305 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[7fc387cd-4066-4f80-a807-6254cd0e6c53]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:40 np0005588920 nova_compute[226886]: 2026-01-20 14:42:40.332 226890 INFO nova.compute.manager [None req-0f20f07f-129e-4155-b447-3f020eaa7538 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Took 0.83 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:42:40 np0005588920 nova_compute[226886]: 2026-01-20 14:42:40.333 226890 DEBUG oslo.service.loopingcall [None req-0f20f07f-129e-4155-b447-3f020eaa7538 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:42:40 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:40.333 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2b2d2440-7fac-49ea-963d-76d76aaaef0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:40 np0005588920 nova_compute[226886]: 2026-01-20 14:42:40.334 226890 DEBUG nova.compute.manager [-] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:42:40 np0005588920 nova_compute[226886]: 2026-01-20 14:42:40.334 226890 DEBUG nova.network.neutron [-] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:42:40 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:40.335 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[6f42062f-9425-4a84-9c22-b870ac2be04e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:40 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:40.348 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0c26eef6-83bc-4b18-89ae-5f943f9aa2ce]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 519328, 'reachable_time': 43034, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 254217, 'error': None, 'target': 'ovnmeta-6e7523a4-fd95-46c8-82f2-10c4527c1b7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:40 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:40.350 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6e7523a4-fd95-46c8-82f2-10c4527c1b7d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:42:40 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:40.350 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[b2239aca-9446-420a-8d35-c41c7ccc2417]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:40 np0005588920 systemd[1]: run-netns-ovnmeta\x2d6e7523a4\x2dfd95\x2d46c8\x2d82f2\x2d10c4527c1b7d.mount: Deactivated successfully.
Jan 20 09:42:41 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:42:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:42:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:41.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:42:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:42.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:42 np0005588920 nova_compute[226886]: 2026-01-20 14:42:42.307 226890 DEBUG nova.compute.manager [req-a4c423ee-3814-4a0b-a36d-809c06a65fdc req-092a1cbe-9970-4a76-8f50-f9bc63f8cd52 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Received event network-vif-plugged-2251427b-053e-477b-919c-0a2be96a4c01 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:42:42 np0005588920 nova_compute[226886]: 2026-01-20 14:42:42.308 226890 DEBUG oslo_concurrency.lockutils [req-a4c423ee-3814-4a0b-a36d-809c06a65fdc req-092a1cbe-9970-4a76-8f50-f9bc63f8cd52 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "55628882-70c9-4b43-b653-9983ba87ca0d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:42 np0005588920 nova_compute[226886]: 2026-01-20 14:42:42.309 226890 DEBUG oslo_concurrency.lockutils [req-a4c423ee-3814-4a0b-a36d-809c06a65fdc req-092a1cbe-9970-4a76-8f50-f9bc63f8cd52 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "55628882-70c9-4b43-b653-9983ba87ca0d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:42 np0005588920 nova_compute[226886]: 2026-01-20 14:42:42.309 226890 DEBUG oslo_concurrency.lockutils [req-a4c423ee-3814-4a0b-a36d-809c06a65fdc req-092a1cbe-9970-4a76-8f50-f9bc63f8cd52 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "55628882-70c9-4b43-b653-9983ba87ca0d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:42 np0005588920 nova_compute[226886]: 2026-01-20 14:42:42.310 226890 DEBUG nova.compute.manager [req-a4c423ee-3814-4a0b-a36d-809c06a65fdc req-092a1cbe-9970-4a76-8f50-f9bc63f8cd52 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] No waiting events found dispatching network-vif-plugged-2251427b-053e-477b-919c-0a2be96a4c01 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:42:42 np0005588920 nova_compute[226886]: 2026-01-20 14:42:42.311 226890 WARNING nova.compute.manager [req-a4c423ee-3814-4a0b-a36d-809c06a65fdc req-092a1cbe-9970-4a76-8f50-f9bc63f8cd52 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Received unexpected event network-vif-plugged-2251427b-053e-477b-919c-0a2be96a4c01 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:42:42 np0005588920 nova_compute[226886]: 2026-01-20 14:42:42.311 226890 DEBUG nova.compute.manager [req-a4c423ee-3814-4a0b-a36d-809c06a65fdc req-092a1cbe-9970-4a76-8f50-f9bc63f8cd52 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Received event network-vif-unplugged-7eea52e4-93c4-48e5-9db5-b9d834c2bdbd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:42:42 np0005588920 nova_compute[226886]: 2026-01-20 14:42:42.312 226890 DEBUG oslo_concurrency.lockutils [req-a4c423ee-3814-4a0b-a36d-809c06a65fdc req-092a1cbe-9970-4a76-8f50-f9bc63f8cd52 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "55628882-70c9-4b43-b653-9983ba87ca0d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:42 np0005588920 nova_compute[226886]: 2026-01-20 14:42:42.312 226890 DEBUG oslo_concurrency.lockutils [req-a4c423ee-3814-4a0b-a36d-809c06a65fdc req-092a1cbe-9970-4a76-8f50-f9bc63f8cd52 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "55628882-70c9-4b43-b653-9983ba87ca0d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:42 np0005588920 nova_compute[226886]: 2026-01-20 14:42:42.312 226890 DEBUG oslo_concurrency.lockutils [req-a4c423ee-3814-4a0b-a36d-809c06a65fdc req-092a1cbe-9970-4a76-8f50-f9bc63f8cd52 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "55628882-70c9-4b43-b653-9983ba87ca0d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:42 np0005588920 nova_compute[226886]: 2026-01-20 14:42:42.313 226890 DEBUG nova.compute.manager [req-a4c423ee-3814-4a0b-a36d-809c06a65fdc req-092a1cbe-9970-4a76-8f50-f9bc63f8cd52 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] No waiting events found dispatching network-vif-unplugged-7eea52e4-93c4-48e5-9db5-b9d834c2bdbd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:42:42 np0005588920 nova_compute[226886]: 2026-01-20 14:42:42.313 226890 DEBUG nova.compute.manager [req-a4c423ee-3814-4a0b-a36d-809c06a65fdc req-092a1cbe-9970-4a76-8f50-f9bc63f8cd52 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Received event network-vif-unplugged-7eea52e4-93c4-48e5-9db5-b9d834c2bdbd for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:42:42 np0005588920 nova_compute[226886]: 2026-01-20 14:42:42.314 226890 DEBUG nova.compute.manager [req-a4c423ee-3814-4a0b-a36d-809c06a65fdc req-092a1cbe-9970-4a76-8f50-f9bc63f8cd52 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Received event network-vif-plugged-7eea52e4-93c4-48e5-9db5-b9d834c2bdbd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:42:42 np0005588920 nova_compute[226886]: 2026-01-20 14:42:42.314 226890 DEBUG oslo_concurrency.lockutils [req-a4c423ee-3814-4a0b-a36d-809c06a65fdc req-092a1cbe-9970-4a76-8f50-f9bc63f8cd52 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "55628882-70c9-4b43-b653-9983ba87ca0d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:42 np0005588920 nova_compute[226886]: 2026-01-20 14:42:42.315 226890 DEBUG oslo_concurrency.lockutils [req-a4c423ee-3814-4a0b-a36d-809c06a65fdc req-092a1cbe-9970-4a76-8f50-f9bc63f8cd52 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "55628882-70c9-4b43-b653-9983ba87ca0d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:42 np0005588920 nova_compute[226886]: 2026-01-20 14:42:42.316 226890 DEBUG oslo_concurrency.lockutils [req-a4c423ee-3814-4a0b-a36d-809c06a65fdc req-092a1cbe-9970-4a76-8f50-f9bc63f8cd52 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "55628882-70c9-4b43-b653-9983ba87ca0d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:42 np0005588920 nova_compute[226886]: 2026-01-20 14:42:42.316 226890 DEBUG nova.compute.manager [req-a4c423ee-3814-4a0b-a36d-809c06a65fdc req-092a1cbe-9970-4a76-8f50-f9bc63f8cd52 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] No waiting events found dispatching network-vif-plugged-7eea52e4-93c4-48e5-9db5-b9d834c2bdbd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:42:42 np0005588920 nova_compute[226886]: 2026-01-20 14:42:42.316 226890 WARNING nova.compute.manager [req-a4c423ee-3814-4a0b-a36d-809c06a65fdc req-092a1cbe-9970-4a76-8f50-f9bc63f8cd52 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Received unexpected event network-vif-plugged-7eea52e4-93c4-48e5-9db5-b9d834c2bdbd for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:42:42 np0005588920 nova_compute[226886]: 2026-01-20 14:42:42.317 226890 DEBUG nova.compute.manager [req-a4c423ee-3814-4a0b-a36d-809c06a65fdc req-092a1cbe-9970-4a76-8f50-f9bc63f8cd52 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Received event network-vif-deleted-2251427b-053e-477b-919c-0a2be96a4c01 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:42:42 np0005588920 nova_compute[226886]: 2026-01-20 14:42:42.317 226890 INFO nova.compute.manager [req-a4c423ee-3814-4a0b-a36d-809c06a65fdc req-092a1cbe-9970-4a76-8f50-f9bc63f8cd52 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Neutron deleted interface 2251427b-053e-477b-919c-0a2be96a4c01; detaching it from the instance and deleting it from the info cache#033[00m
Jan 20 09:42:42 np0005588920 nova_compute[226886]: 2026-01-20 14:42:42.318 226890 DEBUG nova.network.neutron [req-a4c423ee-3814-4a0b-a36d-809c06a65fdc req-092a1cbe-9970-4a76-8f50-f9bc63f8cd52 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Updating instance_info_cache with network_info: [{"id": "01ad691e-a0f7-4b59-9b84-3486189332a5", "address": "fa:16:3e:54:02:77", "network": {"id": "1a896e9d-306e-4f99-81e1-986b217a807d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1473946507", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.236", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13c0d93976f745dba4ab050770ccaae6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01ad691e-a0", "ovs_interfaceid": "01ad691e-a0f7-4b59-9b84-3486189332a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7eea52e4-93c4-48e5-9db5-b9d834c2bdbd", "address": "fa:16:3e:5a:cf:4c", "network": {"id": "6e7523a4-fd95-46c8-82f2-10c4527c1b7d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1506530564", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.107", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13c0d93976f745dba4ab050770ccaae6", 
"mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7eea52e4-93", "ovs_interfaceid": "7eea52e4-93c4-48e5-9db5-b9d834c2bdbd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:42:42 np0005588920 nova_compute[226886]: 2026-01-20 14:42:42.361 226890 DEBUG nova.compute.manager [req-a4c423ee-3814-4a0b-a36d-809c06a65fdc req-092a1cbe-9970-4a76-8f50-f9bc63f8cd52 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Detach interface failed, port_id=2251427b-053e-477b-919c-0a2be96a4c01, reason: Instance 55628882-70c9-4b43-b653-9983ba87ca0d could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 20 09:42:42 np0005588920 nova_compute[226886]: 2026-01-20 14:42:42.362 226890 DEBUG nova.compute.manager [req-a4c423ee-3814-4a0b-a36d-809c06a65fdc req-092a1cbe-9970-4a76-8f50-f9bc63f8cd52 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Received event network-vif-deleted-01ad691e-a0f7-4b59-9b84-3486189332a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:42:42 np0005588920 nova_compute[226886]: 2026-01-20 14:42:42.362 226890 INFO nova.compute.manager [req-a4c423ee-3814-4a0b-a36d-809c06a65fdc req-092a1cbe-9970-4a76-8f50-f9bc63f8cd52 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Neutron deleted interface 01ad691e-a0f7-4b59-9b84-3486189332a5; detaching it from the instance and deleting it from the info cache#033[00m
Jan 20 09:42:42 np0005588920 nova_compute[226886]: 2026-01-20 14:42:42.363 226890 DEBUG nova.network.neutron [req-a4c423ee-3814-4a0b-a36d-809c06a65fdc req-092a1cbe-9970-4a76-8f50-f9bc63f8cd52 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Updating instance_info_cache with network_info: [{"id": "7eea52e4-93c4-48e5-9db5-b9d834c2bdbd", "address": "fa:16:3e:5a:cf:4c", "network": {"id": "6e7523a4-fd95-46c8-82f2-10c4527c1b7d", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1506530564", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.107", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13c0d93976f745dba4ab050770ccaae6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7eea52e4-93", "ovs_interfaceid": "7eea52e4-93c4-48e5-9db5-b9d834c2bdbd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:42:42 np0005588920 nova_compute[226886]: 2026-01-20 14:42:42.383 226890 DEBUG nova.compute.manager [req-a4c423ee-3814-4a0b-a36d-809c06a65fdc req-092a1cbe-9970-4a76-8f50-f9bc63f8cd52 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Detach interface failed, port_id=01ad691e-a0f7-4b59-9b84-3486189332a5, reason: Instance 55628882-70c9-4b43-b653-9983ba87ca0d could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 20 09:42:42 np0005588920 nova_compute[226886]: 2026-01-20 14:42:42.440 226890 DEBUG nova.compute.manager [req-8d643c84-8a26-4a18-abad-4e41b93da3e4 req-2de7f05c-3304-46c7-89b6-9a9226a77e0b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Received event network-vif-plugged-01ad691e-a0f7-4b59-9b84-3486189332a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:42:42 np0005588920 nova_compute[226886]: 2026-01-20 14:42:42.440 226890 DEBUG oslo_concurrency.lockutils [req-8d643c84-8a26-4a18-abad-4e41b93da3e4 req-2de7f05c-3304-46c7-89b6-9a9226a77e0b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "55628882-70c9-4b43-b653-9983ba87ca0d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:42 np0005588920 nova_compute[226886]: 2026-01-20 14:42:42.440 226890 DEBUG oslo_concurrency.lockutils [req-8d643c84-8a26-4a18-abad-4e41b93da3e4 req-2de7f05c-3304-46c7-89b6-9a9226a77e0b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "55628882-70c9-4b43-b653-9983ba87ca0d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:42 np0005588920 nova_compute[226886]: 2026-01-20 14:42:42.441 226890 DEBUG oslo_concurrency.lockutils [req-8d643c84-8a26-4a18-abad-4e41b93da3e4 req-2de7f05c-3304-46c7-89b6-9a9226a77e0b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "55628882-70c9-4b43-b653-9983ba87ca0d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:42 np0005588920 nova_compute[226886]: 2026-01-20 14:42:42.441 226890 DEBUG nova.compute.manager [req-8d643c84-8a26-4a18-abad-4e41b93da3e4 req-2de7f05c-3304-46c7-89b6-9a9226a77e0b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] No waiting events found dispatching network-vif-plugged-01ad691e-a0f7-4b59-9b84-3486189332a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:42:42 np0005588920 nova_compute[226886]: 2026-01-20 14:42:42.441 226890 WARNING nova.compute.manager [req-8d643c84-8a26-4a18-abad-4e41b93da3e4 req-2de7f05c-3304-46c7-89b6-9a9226a77e0b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Received unexpected event network-vif-plugged-01ad691e-a0f7-4b59-9b84-3486189332a5 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:42:42 np0005588920 nova_compute[226886]: 2026-01-20 14:42:42.743 226890 DEBUG nova.network.neutron [-] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:42:42 np0005588920 nova_compute[226886]: 2026-01-20 14:42:42.813 226890 INFO nova.compute.manager [-] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Took 2.48 seconds to deallocate network for instance.#033[00m
Jan 20 09:42:43 np0005588920 nova_compute[226886]: 2026-01-20 14:42:43.033 226890 DEBUG oslo_concurrency.lockutils [None req-0f20f07f-129e-4155-b447-3f020eaa7538 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:43 np0005588920 nova_compute[226886]: 2026-01-20 14:42:43.034 226890 DEBUG oslo_concurrency.lockutils [None req-0f20f07f-129e-4155-b447-3f020eaa7538 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:43 np0005588920 nova_compute[226886]: 2026-01-20 14:42:43.150 226890 DEBUG oslo_concurrency.processutils [None req-0f20f07f-129e-4155-b447-3f020eaa7538 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:42:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:42:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:43.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:42:43 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:42:43 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1080104359' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:42:43 np0005588920 nova_compute[226886]: 2026-01-20 14:42:43.631 226890 DEBUG oslo_concurrency.processutils [None req-0f20f07f-129e-4155-b447-3f020eaa7538 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:42:43 np0005588920 nova_compute[226886]: 2026-01-20 14:42:43.641 226890 DEBUG nova.compute.provider_tree [None req-0f20f07f-129e-4155-b447-3f020eaa7538 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:42:43 np0005588920 nova_compute[226886]: 2026-01-20 14:42:43.707 226890 DEBUG nova.scheduler.client.report [None req-0f20f07f-129e-4155-b447-3f020eaa7538 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:42:43 np0005588920 nova_compute[226886]: 2026-01-20 14:42:43.791 226890 DEBUG oslo_concurrency.lockutils [None req-0f20f07f-129e-4155-b447-3f020eaa7538 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.757s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:43 np0005588920 nova_compute[226886]: 2026-01-20 14:42:43.825 226890 INFO nova.scheduler.client.report [None req-0f20f07f-129e-4155-b447-3f020eaa7538 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Deleted allocations for instance 55628882-70c9-4b43-b653-9983ba87ca0d#033[00m
Jan 20 09:42:43 np0005588920 nova_compute[226886]: 2026-01-20 14:42:43.978 226890 DEBUG oslo_concurrency.lockutils [None req-0f20f07f-129e-4155-b447-3f020eaa7538 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Lock "55628882-70c9-4b43-b653-9983ba87ca0d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.475s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:44.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:44 np0005588920 nova_compute[226886]: 2026-01-20 14:42:44.442 226890 DEBUG nova.compute.manager [req-cbca2c07-3c0c-459e-ba9c-3cbcd66be508 req-bc2e1d81-e6b6-4811-8439-898bec189b71 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Received event network-vif-deleted-7eea52e4-93c4-48e5-9db5-b9d834c2bdbd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:42:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:42:44.518 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:44 np0005588920 nova_compute[226886]: 2026-01-20 14:42:44.824 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:45 np0005588920 nova_compute[226886]: 2026-01-20 14:42:45.211 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:45 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #48. Immutable memtables: 5.
Jan 20 09:42:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:42:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:45.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:42:45 np0005588920 ovn_controller[133971]: 2026-01-20T14:42:45Z|00296|binding|INFO|Releasing lport 5dcae274-b8f4-440a-a3eb-5c1a5a044346 from this chassis (sb_readonly=0)
Jan 20 09:42:45 np0005588920 nova_compute[226886]: 2026-01-20 14:42:45.812 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:46.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:46 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:42:47 np0005588920 ovn_controller[133971]: 2026-01-20T14:42:47Z|00044|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d7:80:84 10.100.0.14
Jan 20 09:42:47 np0005588920 ovn_controller[133971]: 2026-01-20T14:42:47Z|00045|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d7:80:84 10.100.0.14
Jan 20 09:42:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:42:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:47.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:42:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:48.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:49 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:42:49 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:42:49 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:42:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:49.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:49 np0005588920 nova_compute[226886]: 2026-01-20 14:42:49.828 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:50.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:50 np0005588920 nova_compute[226886]: 2026-01-20 14:42:50.214 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:51 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:42:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:51.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:52.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:42:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:53.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:42:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:42:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:54.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:42:54 np0005588920 nova_compute[226886]: 2026-01-20 14:42:54.762 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768920159.760048, 55628882-70c9-4b43-b653-9983ba87ca0d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:42:54 np0005588920 nova_compute[226886]: 2026-01-20 14:42:54.763 226890 INFO nova.compute.manager [-] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:42:54 np0005588920 nova_compute[226886]: 2026-01-20 14:42:54.795 226890 DEBUG nova.compute.manager [None req-61523821-029a-4ed5-bb01-cdd5c5db9c57 - - - - - -] [instance: 55628882-70c9-4b43-b653-9983ba87ca0d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:42:54 np0005588920 nova_compute[226886]: 2026-01-20 14:42:54.829 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:55 np0005588920 nova_compute[226886]: 2026-01-20 14:42:55.218 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:42:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:55.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:42:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:56.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:56 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:42:56 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:42:56 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:42:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:57.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:42:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:58.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:42:58 np0005588920 nova_compute[226886]: 2026-01-20 14:42:58.678 226890 DEBUG oslo_concurrency.lockutils [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Acquiring lock "8cd92082-94e5-46f3-992f-afb6b04a3801" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:58 np0005588920 nova_compute[226886]: 2026-01-20 14:42:58.679 226890 DEBUG oslo_concurrency.lockutils [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Lock "8cd92082-94e5-46f3-992f-afb6b04a3801" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:58 np0005588920 nova_compute[226886]: 2026-01-20 14:42:58.707 226890 DEBUG nova.compute.manager [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:42:58 np0005588920 nova_compute[226886]: 2026-01-20 14:42:58.856 226890 DEBUG oslo_concurrency.lockutils [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:58 np0005588920 nova_compute[226886]: 2026-01-20 14:42:58.857 226890 DEBUG oslo_concurrency.lockutils [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:58 np0005588920 nova_compute[226886]: 2026-01-20 14:42:58.873 226890 DEBUG nova.virt.hardware [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:42:58 np0005588920 nova_compute[226886]: 2026-01-20 14:42:58.873 226890 INFO nova.compute.claims [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 09:42:58 np0005588920 podman[254422]: 2026-01-20 14:42:58.995008032 +0000 UTC m=+0.083802127 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 20 09:42:59 np0005588920 nova_compute[226886]: 2026-01-20 14:42:59.014 226890 DEBUG oslo_concurrency.processutils [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:42:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:42:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:59.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:59 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:42:59 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/306700131' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:42:59 np0005588920 nova_compute[226886]: 2026-01-20 14:42:59.442 226890 DEBUG oslo_concurrency.processutils [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.428s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:42:59 np0005588920 nova_compute[226886]: 2026-01-20 14:42:59.450 226890 DEBUG nova.compute.provider_tree [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:42:59 np0005588920 nova_compute[226886]: 2026-01-20 14:42:59.465 226890 DEBUG nova.scheduler.client.report [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:42:59 np0005588920 nova_compute[226886]: 2026-01-20 14:42:59.489 226890 DEBUG oslo_concurrency.lockutils [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.632s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:59 np0005588920 nova_compute[226886]: 2026-01-20 14:42:59.490 226890 DEBUG nova.compute.manager [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:42:59 np0005588920 nova_compute[226886]: 2026-01-20 14:42:59.558 226890 DEBUG nova.compute.manager [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:42:59 np0005588920 nova_compute[226886]: 2026-01-20 14:42:59.559 226890 DEBUG nova.network.neutron [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:42:59 np0005588920 nova_compute[226886]: 2026-01-20 14:42:59.580 226890 INFO nova.virt.libvirt.driver [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:42:59 np0005588920 nova_compute[226886]: 2026-01-20 14:42:59.601 226890 DEBUG nova.compute.manager [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:42:59 np0005588920 nova_compute[226886]: 2026-01-20 14:42:59.750 226890 DEBUG nova.compute.manager [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:42:59 np0005588920 nova_compute[226886]: 2026-01-20 14:42:59.751 226890 DEBUG nova.virt.libvirt.driver [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:42:59 np0005588920 nova_compute[226886]: 2026-01-20 14:42:59.752 226890 INFO nova.virt.libvirt.driver [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Creating image(s)#033[00m
Jan 20 09:42:59 np0005588920 nova_compute[226886]: 2026-01-20 14:42:59.777 226890 DEBUG nova.storage.rbd_utils [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] rbd image 8cd92082-94e5-46f3-992f-afb6b04a3801_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:42:59 np0005588920 nova_compute[226886]: 2026-01-20 14:42:59.802 226890 DEBUG nova.storage.rbd_utils [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] rbd image 8cd92082-94e5-46f3-992f-afb6b04a3801_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:42:59 np0005588920 nova_compute[226886]: 2026-01-20 14:42:59.826 226890 DEBUG nova.storage.rbd_utils [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] rbd image 8cd92082-94e5-46f3-992f-afb6b04a3801_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:42:59 np0005588920 nova_compute[226886]: 2026-01-20 14:42:59.830 226890 DEBUG oslo_concurrency.processutils [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:42:59 np0005588920 nova_compute[226886]: 2026-01-20 14:42:59.864 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:59 np0005588920 nova_compute[226886]: 2026-01-20 14:42:59.869 226890 DEBUG nova.policy [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6a00517a957e4ceb8564cbf1dfa15ee2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '13c0d93976f745dba4ab050770ccaae6', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:42:59 np0005588920 nova_compute[226886]: 2026-01-20 14:42:59.921 226890 DEBUG oslo_concurrency.processutils [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:42:59 np0005588920 nova_compute[226886]: 2026-01-20 14:42:59.922 226890 DEBUG oslo_concurrency.lockutils [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:59 np0005588920 nova_compute[226886]: 2026-01-20 14:42:59.923 226890 DEBUG oslo_concurrency.lockutils [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:59 np0005588920 nova_compute[226886]: 2026-01-20 14:42:59.923 226890 DEBUG oslo_concurrency.lockutils [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:59 np0005588920 nova_compute[226886]: 2026-01-20 14:42:59.949 226890 DEBUG nova.storage.rbd_utils [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] rbd image 8cd92082-94e5-46f3-992f-afb6b04a3801_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:42:59 np0005588920 nova_compute[226886]: 2026-01-20 14:42:59.952 226890 DEBUG oslo_concurrency.processutils [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 8cd92082-94e5-46f3-992f-afb6b04a3801_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:43:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:43:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:00.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:43:00 np0005588920 nova_compute[226886]: 2026-01-20 14:43:00.221 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:00 np0005588920 nova_compute[226886]: 2026-01-20 14:43:00.250 226890 DEBUG oslo_concurrency.processutils [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 8cd92082-94e5-46f3-992f-afb6b04a3801_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.298s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:43:00 np0005588920 nova_compute[226886]: 2026-01-20 14:43:00.318 226890 DEBUG nova.storage.rbd_utils [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] resizing rbd image 8cd92082-94e5-46f3-992f-afb6b04a3801_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:43:00 np0005588920 nova_compute[226886]: 2026-01-20 14:43:00.422 226890 DEBUG nova.objects.instance [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Lazy-loading 'migration_context' on Instance uuid 8cd92082-94e5-46f3-992f-afb6b04a3801 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:43:00 np0005588920 nova_compute[226886]: 2026-01-20 14:43:00.437 226890 DEBUG nova.virt.libvirt.driver [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:43:00 np0005588920 nova_compute[226886]: 2026-01-20 14:43:00.437 226890 DEBUG nova.virt.libvirt.driver [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Ensure instance console log exists: /var/lib/nova/instances/8cd92082-94e5-46f3-992f-afb6b04a3801/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:43:00 np0005588920 nova_compute[226886]: 2026-01-20 14:43:00.438 226890 DEBUG oslo_concurrency.lockutils [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:43:00 np0005588920 nova_compute[226886]: 2026-01-20 14:43:00.438 226890 DEBUG oslo_concurrency.lockutils [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:43:00 np0005588920 nova_compute[226886]: 2026-01-20 14:43:00.438 226890 DEBUG oslo_concurrency.lockutils [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:00 np0005588920 nova_compute[226886]: 2026-01-20 14:43:00.655 226890 DEBUG nova.network.neutron [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Successfully created port: 531c124f-8aea-41ea-bdb8-428b944ad2a5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:43:01 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:43:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:43:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:01.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:43:01 np0005588920 nova_compute[226886]: 2026-01-20 14:43:01.405 226890 DEBUG nova.network.neutron [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Successfully created port: 9e71fac0-8593-43ea-8364-0ad2c9111d03 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:43:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:02.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:02 np0005588920 nova_compute[226886]: 2026-01-20 14:43:02.610 226890 DEBUG oslo_concurrency.lockutils [None req-67345773-3388-4d5c-a46d-8a0a0617a0d4 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Acquiring lock "525b8695-a4df-46c5-875a-42d3b18b78be" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:43:02 np0005588920 nova_compute[226886]: 2026-01-20 14:43:02.610 226890 DEBUG oslo_concurrency.lockutils [None req-67345773-3388-4d5c-a46d-8a0a0617a0d4 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Lock "525b8695-a4df-46c5-875a-42d3b18b78be" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:43:02 np0005588920 nova_compute[226886]: 2026-01-20 14:43:02.610 226890 DEBUG oslo_concurrency.lockutils [None req-67345773-3388-4d5c-a46d-8a0a0617a0d4 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Acquiring lock "525b8695-a4df-46c5-875a-42d3b18b78be-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:43:02 np0005588920 nova_compute[226886]: 2026-01-20 14:43:02.611 226890 DEBUG oslo_concurrency.lockutils [None req-67345773-3388-4d5c-a46d-8a0a0617a0d4 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Lock "525b8695-a4df-46c5-875a-42d3b18b78be-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:43:02 np0005588920 nova_compute[226886]: 2026-01-20 14:43:02.611 226890 DEBUG oslo_concurrency.lockutils [None req-67345773-3388-4d5c-a46d-8a0a0617a0d4 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Lock "525b8695-a4df-46c5-875a-42d3b18b78be-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:02 np0005588920 nova_compute[226886]: 2026-01-20 14:43:02.612 226890 INFO nova.compute.manager [None req-67345773-3388-4d5c-a46d-8a0a0617a0d4 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Terminating instance#033[00m
Jan 20 09:43:02 np0005588920 nova_compute[226886]: 2026-01-20 14:43:02.613 226890 DEBUG nova.compute.manager [None req-67345773-3388-4d5c-a46d-8a0a0617a0d4 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:43:03 np0005588920 kernel: tap01b95c4f-9d (unregistering): left promiscuous mode
Jan 20 09:43:03 np0005588920 NetworkManager[49076]: <info>  [1768920183.0046] device (tap01b95c4f-9d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:43:03 np0005588920 ovn_controller[133971]: 2026-01-20T14:43:03Z|00297|binding|INFO|Releasing lport 01b95c4f-9db6-469f-9458-8c279a5778f0 from this chassis (sb_readonly=0)
Jan 20 09:43:03 np0005588920 ovn_controller[133971]: 2026-01-20T14:43:03Z|00298|binding|INFO|Setting lport 01b95c4f-9db6-469f-9458-8c279a5778f0 down in Southbound
Jan 20 09:43:03 np0005588920 ovn_controller[133971]: 2026-01-20T14:43:03Z|00299|binding|INFO|Removing iface tap01b95c4f-9d ovn-installed in OVS
Jan 20 09:43:03 np0005588920 nova_compute[226886]: 2026-01-20 14:43:03.014 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:03.026 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d7:80:84 10.100.0.14'], port_security=['fa:16:3e:d7:80:84 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '525b8695-a4df-46c5-875a-42d3b18b78be', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b36e9cab-12c6-4a09-9aab-ef2679d875ba', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4b95747114ab4043b93a260387199c91', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f18b0222-78a5-4c37-8065-772dbe5c63e1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=80e2aa5b-ecb8-4e93-992f-baaef718dd34, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=01b95c4f-9db6-469f-9458-8c279a5778f0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:43:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:03.028 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 01b95c4f-9db6-469f-9458-8c279a5778f0 in datapath b36e9cab-12c6-4a09-9aab-ef2679d875ba unbound from our chassis#033[00m
Jan 20 09:43:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:03.029 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b36e9cab-12c6-4a09-9aab-ef2679d875ba, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:43:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:03.030 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0484ea5e-fd96-4322-86ef-06be1c7ab62e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:03.030 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b36e9cab-12c6-4a09-9aab-ef2679d875ba namespace which is not needed anymore#033[00m
Jan 20 09:43:03 np0005588920 nova_compute[226886]: 2026-01-20 14:43:03.037 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:03 np0005588920 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d0000004c.scope: Deactivated successfully.
Jan 20 09:43:03 np0005588920 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d0000004c.scope: Consumed 13.741s CPU time.
Jan 20 09:43:03 np0005588920 systemd-machined[196121]: Machine qemu-32-instance-0000004c terminated.
Jan 20 09:43:03 np0005588920 neutron-haproxy-ovnmeta-b36e9cab-12c6-4a09-9aab-ef2679d875ba[253977]: [NOTICE]   (253981) : haproxy version is 2.8.14-c23fe91
Jan 20 09:43:03 np0005588920 neutron-haproxy-ovnmeta-b36e9cab-12c6-4a09-9aab-ef2679d875ba[253977]: [NOTICE]   (253981) : path to executable is /usr/sbin/haproxy
Jan 20 09:43:03 np0005588920 neutron-haproxy-ovnmeta-b36e9cab-12c6-4a09-9aab-ef2679d875ba[253977]: [WARNING]  (253981) : Exiting Master process...
Jan 20 09:43:03 np0005588920 neutron-haproxy-ovnmeta-b36e9cab-12c6-4a09-9aab-ef2679d875ba[253977]: [WARNING]  (253981) : Exiting Master process...
Jan 20 09:43:03 np0005588920 neutron-haproxy-ovnmeta-b36e9cab-12c6-4a09-9aab-ef2679d875ba[253977]: [ALERT]    (253981) : Current worker (253983) exited with code 143 (Terminated)
Jan 20 09:43:03 np0005588920 neutron-haproxy-ovnmeta-b36e9cab-12c6-4a09-9aab-ef2679d875ba[253977]: [WARNING]  (253981) : All workers exited. Exiting... (0)
Jan 20 09:43:03 np0005588920 systemd[1]: libpod-555298e7c120e880b2f02f1d59ccbc1715a04331944ba049a623571d23408496.scope: Deactivated successfully.
Jan 20 09:43:03 np0005588920 podman[254662]: 2026-01-20 14:43:03.204320072 +0000 UTC m=+0.069308133 container died 555298e7c120e880b2f02f1d59ccbc1715a04331944ba049a623571d23408496 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b36e9cab-12c6-4a09-9aab-ef2679d875ba, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 20 09:43:03 np0005588920 nova_compute[226886]: 2026-01-20 14:43:03.252 226890 INFO nova.virt.libvirt.driver [-] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Instance destroyed successfully.#033[00m
Jan 20 09:43:03 np0005588920 nova_compute[226886]: 2026-01-20 14:43:03.252 226890 DEBUG nova.objects.instance [None req-67345773-3388-4d5c-a46d-8a0a0617a0d4 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Lazy-loading 'resources' on Instance uuid 525b8695-a4df-46c5-875a-42d3b18b78be obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:43:03 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-555298e7c120e880b2f02f1d59ccbc1715a04331944ba049a623571d23408496-userdata-shm.mount: Deactivated successfully.
Jan 20 09:43:03 np0005588920 systemd[1]: var-lib-containers-storage-overlay-76b433b9b3697077276b8a6bd83a77b539ad94a8f90d450b2329161d85c9e966-merged.mount: Deactivated successfully.
Jan 20 09:43:03 np0005588920 nova_compute[226886]: 2026-01-20 14:43:03.266 226890 DEBUG nova.virt.libvirt.vif [None req-67345773-3388-4d5c-a46d-8a0a0617a0d4 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:42:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1173241717',display_name='tempest-ListServerFiltersTestJSON-instance-1173241717',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1173241717',id=76,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:42:33Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4b95747114ab4043b93a260387199c91',ramdisk_id='',reservation_id='r-m1qpexqd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-2126845308',owner_user_name='tempest-ListServerFiltersTestJSON-2126845308-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:42:33Z,user_data=None,user_id='ff99fc8eda0640928c6e82981dacb266',uuid=525b8695-a4df-46c5-875a-42d3b18b78be,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "01b95c4f-9db6-469f-9458-8c279a5778f0", "address": "fa:16:3e:d7:80:84", "network": {"id": "b36e9cab-12c6-4a09-9aab-ef2679d875ba", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-432532406-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b95747114ab4043b93a260387199c91", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01b95c4f-9d", "ovs_interfaceid": "01b95c4f-9db6-469f-9458-8c279a5778f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:43:03 np0005588920 nova_compute[226886]: 2026-01-20 14:43:03.266 226890 DEBUG nova.network.os_vif_util [None req-67345773-3388-4d5c-a46d-8a0a0617a0d4 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Converting VIF {"id": "01b95c4f-9db6-469f-9458-8c279a5778f0", "address": "fa:16:3e:d7:80:84", "network": {"id": "b36e9cab-12c6-4a09-9aab-ef2679d875ba", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-432532406-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b95747114ab4043b93a260387199c91", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01b95c4f-9d", "ovs_interfaceid": "01b95c4f-9db6-469f-9458-8c279a5778f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:43:03 np0005588920 nova_compute[226886]: 2026-01-20 14:43:03.267 226890 DEBUG nova.network.os_vif_util [None req-67345773-3388-4d5c-a46d-8a0a0617a0d4 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d7:80:84,bridge_name='br-int',has_traffic_filtering=True,id=01b95c4f-9db6-469f-9458-8c279a5778f0,network=Network(b36e9cab-12c6-4a09-9aab-ef2679d875ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01b95c4f-9d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:43:03 np0005588920 nova_compute[226886]: 2026-01-20 14:43:03.268 226890 DEBUG os_vif [None req-67345773-3388-4d5c-a46d-8a0a0617a0d4 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d7:80:84,bridge_name='br-int',has_traffic_filtering=True,id=01b95c4f-9db6-469f-9458-8c279a5778f0,network=Network(b36e9cab-12c6-4a09-9aab-ef2679d875ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01b95c4f-9d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:43:03 np0005588920 nova_compute[226886]: 2026-01-20 14:43:03.269 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:03 np0005588920 nova_compute[226886]: 2026-01-20 14:43:03.269 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap01b95c4f-9d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:43:03 np0005588920 nova_compute[226886]: 2026-01-20 14:43:03.271 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:03 np0005588920 nova_compute[226886]: 2026-01-20 14:43:03.273 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:43:03 np0005588920 nova_compute[226886]: 2026-01-20 14:43:03.275 226890 INFO os_vif [None req-67345773-3388-4d5c-a46d-8a0a0617a0d4 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d7:80:84,bridge_name='br-int',has_traffic_filtering=True,id=01b95c4f-9db6-469f-9458-8c279a5778f0,network=Network(b36e9cab-12c6-4a09-9aab-ef2679d875ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01b95c4f-9d')#033[00m
Jan 20 09:43:03 np0005588920 podman[254662]: 2026-01-20 14:43:03.286345198 +0000 UTC m=+0.151333239 container cleanup 555298e7c120e880b2f02f1d59ccbc1715a04331944ba049a623571d23408496 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b36e9cab-12c6-4a09-9aab-ef2679d875ba, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:43:03 np0005588920 systemd[1]: libpod-conmon-555298e7c120e880b2f02f1d59ccbc1715a04331944ba049a623571d23408496.scope: Deactivated successfully.
Jan 20 09:43:03 np0005588920 nova_compute[226886]: 2026-01-20 14:43:03.316 226890 DEBUG nova.compute.manager [req-ce7d516b-6838-46c9-a3c6-1f0a56f50c6d req-66c9f0a0-3c8a-4090-b5fe-f9cf4907e48e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Received event network-vif-unplugged-01b95c4f-9db6-469f-9458-8c279a5778f0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:43:03 np0005588920 nova_compute[226886]: 2026-01-20 14:43:03.316 226890 DEBUG oslo_concurrency.lockutils [req-ce7d516b-6838-46c9-a3c6-1f0a56f50c6d req-66c9f0a0-3c8a-4090-b5fe-f9cf4907e48e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "525b8695-a4df-46c5-875a-42d3b18b78be-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:43:03 np0005588920 nova_compute[226886]: 2026-01-20 14:43:03.316 226890 DEBUG oslo_concurrency.lockutils [req-ce7d516b-6838-46c9-a3c6-1f0a56f50c6d req-66c9f0a0-3c8a-4090-b5fe-f9cf4907e48e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "525b8695-a4df-46c5-875a-42d3b18b78be-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:43:03 np0005588920 nova_compute[226886]: 2026-01-20 14:43:03.317 226890 DEBUG oslo_concurrency.lockutils [req-ce7d516b-6838-46c9-a3c6-1f0a56f50c6d req-66c9f0a0-3c8a-4090-b5fe-f9cf4907e48e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "525b8695-a4df-46c5-875a-42d3b18b78be-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:03 np0005588920 nova_compute[226886]: 2026-01-20 14:43:03.317 226890 DEBUG nova.compute.manager [req-ce7d516b-6838-46c9-a3c6-1f0a56f50c6d req-66c9f0a0-3c8a-4090-b5fe-f9cf4907e48e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] No waiting events found dispatching network-vif-unplugged-01b95c4f-9db6-469f-9458-8c279a5778f0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:43:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:03 np0005588920 nova_compute[226886]: 2026-01-20 14:43:03.318 226890 DEBUG nova.compute.manager [req-ce7d516b-6838-46c9-a3c6-1f0a56f50c6d req-66c9f0a0-3c8a-4090-b5fe-f9cf4907e48e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Received event network-vif-unplugged-01b95c4f-9db6-469f-9458-8c279a5778f0 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:43:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:03.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:03 np0005588920 podman[254716]: 2026-01-20 14:43:03.34416203 +0000 UTC m=+0.039341288 container remove 555298e7c120e880b2f02f1d59ccbc1715a04331944ba049a623571d23408496 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b36e9cab-12c6-4a09-9aab-ef2679d875ba, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 20 09:43:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:03.349 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[81cb0af6-4da9-4c8d-beab-0baaded8cfcc]: (4, ('Tue Jan 20 02:43:03 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b36e9cab-12c6-4a09-9aab-ef2679d875ba (555298e7c120e880b2f02f1d59ccbc1715a04331944ba049a623571d23408496)\n555298e7c120e880b2f02f1d59ccbc1715a04331944ba049a623571d23408496\nTue Jan 20 02:43:03 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b36e9cab-12c6-4a09-9aab-ef2679d875ba (555298e7c120e880b2f02f1d59ccbc1715a04331944ba049a623571d23408496)\n555298e7c120e880b2f02f1d59ccbc1715a04331944ba049a623571d23408496\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:03.350 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[76c65212-eb0e-4233-be4c-ff964bb073f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:03.351 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb36e9cab-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:43:03 np0005588920 kernel: tapb36e9cab-10: left promiscuous mode
Jan 20 09:43:03 np0005588920 nova_compute[226886]: 2026-01-20 14:43:03.352 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:03 np0005588920 nova_compute[226886]: 2026-01-20 14:43:03.366 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:03.368 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[86bc442c-262e-42ba-b664-87e1516d3fff]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:03.382 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b07eefc9-f1d6-4cf8-b599-797449983a18]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:03.383 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[7c05de05-8543-4256-83ac-57bab86f01f6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:03.397 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[7312f9a1-48cd-4e46-b813-8cdbb06fd921]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 519517, 'reachable_time': 44526, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 254737, 'error': None, 'target': 'ovnmeta-b36e9cab-12c6-4a09-9aab-ef2679d875ba', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:03.400 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b36e9cab-12c6-4a09-9aab-ef2679d875ba deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:43:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:03.400 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[84fa19f7-f2d2-4142-b7c8-5bb4abfb5959]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:03 np0005588920 systemd[1]: run-netns-ovnmeta\x2db36e9cab\x2d12c6\x2d4a09\x2d9aab\x2def2679d875ba.mount: Deactivated successfully.
Jan 20 09:43:04 np0005588920 nova_compute[226886]: 2026-01-20 14:43:04.029 226890 DEBUG nova.network.neutron [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Successfully updated port: 531c124f-8aea-41ea-bdb8-428b944ad2a5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:43:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:04.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:04 np0005588920 nova_compute[226886]: 2026-01-20 14:43:04.102 226890 DEBUG nova.compute.manager [req-452c4bf1-9f8a-410a-8164-2b922c913380 req-5c856a1d-ea9f-4e1c-b41a-968ca56ed1ab 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Received event network-changed-531c124f-8aea-41ea-bdb8-428b944ad2a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:43:04 np0005588920 nova_compute[226886]: 2026-01-20 14:43:04.103 226890 DEBUG nova.compute.manager [req-452c4bf1-9f8a-410a-8164-2b922c913380 req-5c856a1d-ea9f-4e1c-b41a-968ca56ed1ab 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Refreshing instance network info cache due to event network-changed-531c124f-8aea-41ea-bdb8-428b944ad2a5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:43:04 np0005588920 nova_compute[226886]: 2026-01-20 14:43:04.103 226890 DEBUG oslo_concurrency.lockutils [req-452c4bf1-9f8a-410a-8164-2b922c913380 req-5c856a1d-ea9f-4e1c-b41a-968ca56ed1ab 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-8cd92082-94e5-46f3-992f-afb6b04a3801" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:43:04 np0005588920 nova_compute[226886]: 2026-01-20 14:43:04.103 226890 DEBUG oslo_concurrency.lockutils [req-452c4bf1-9f8a-410a-8164-2b922c913380 req-5c856a1d-ea9f-4e1c-b41a-968ca56ed1ab 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-8cd92082-94e5-46f3-992f-afb6b04a3801" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:43:04 np0005588920 nova_compute[226886]: 2026-01-20 14:43:04.104 226890 DEBUG nova.network.neutron [req-452c4bf1-9f8a-410a-8164-2b922c913380 req-5c856a1d-ea9f-4e1c-b41a-968ca56ed1ab 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Refreshing network info cache for port 531c124f-8aea-41ea-bdb8-428b944ad2a5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:43:04 np0005588920 nova_compute[226886]: 2026-01-20 14:43:04.143 226890 INFO nova.virt.libvirt.driver [None req-67345773-3388-4d5c-a46d-8a0a0617a0d4 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Deleting instance files /var/lib/nova/instances/525b8695-a4df-46c5-875a-42d3b18b78be_del#033[00m
Jan 20 09:43:04 np0005588920 nova_compute[226886]: 2026-01-20 14:43:04.144 226890 INFO nova.virt.libvirt.driver [None req-67345773-3388-4d5c-a46d-8a0a0617a0d4 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Deletion of /var/lib/nova/instances/525b8695-a4df-46c5-875a-42d3b18b78be_del complete#033[00m
Jan 20 09:43:04 np0005588920 nova_compute[226886]: 2026-01-20 14:43:04.209 226890 INFO nova.compute.manager [None req-67345773-3388-4d5c-a46d-8a0a0617a0d4 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Took 1.60 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:43:04 np0005588920 nova_compute[226886]: 2026-01-20 14:43:04.210 226890 DEBUG oslo.service.loopingcall [None req-67345773-3388-4d5c-a46d-8a0a0617a0d4 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:43:04 np0005588920 nova_compute[226886]: 2026-01-20 14:43:04.210 226890 DEBUG nova.compute.manager [-] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:43:04 np0005588920 nova_compute[226886]: 2026-01-20 14:43:04.210 226890 DEBUG nova.network.neutron [-] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:43:04 np0005588920 nova_compute[226886]: 2026-01-20 14:43:04.308 226890 DEBUG nova.network.neutron [req-452c4bf1-9f8a-410a-8164-2b922c913380 req-5c856a1d-ea9f-4e1c-b41a-968ca56ed1ab 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:43:04 np0005588920 nova_compute[226886]: 2026-01-20 14:43:04.853 226890 DEBUG nova.network.neutron [req-452c4bf1-9f8a-410a-8164-2b922c913380 req-5c856a1d-ea9f-4e1c-b41a-968ca56ed1ab 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:43:04 np0005588920 nova_compute[226886]: 2026-01-20 14:43:04.870 226890 DEBUG oslo_concurrency.lockutils [req-452c4bf1-9f8a-410a-8164-2b922c913380 req-5c856a1d-ea9f-4e1c-b41a-968ca56ed1ab 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-8cd92082-94e5-46f3-992f-afb6b04a3801" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:43:05 np0005588920 nova_compute[226886]: 2026-01-20 14:43:05.222 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:05 np0005588920 nova_compute[226886]: 2026-01-20 14:43:05.285 226890 DEBUG nova.network.neutron [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Successfully updated port: 9e71fac0-8593-43ea-8364-0ad2c9111d03 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:43:05 np0005588920 nova_compute[226886]: 2026-01-20 14:43:05.305 226890 DEBUG oslo_concurrency.lockutils [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Acquiring lock "refresh_cache-8cd92082-94e5-46f3-992f-afb6b04a3801" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:43:05 np0005588920 nova_compute[226886]: 2026-01-20 14:43:05.305 226890 DEBUG oslo_concurrency.lockutils [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Acquired lock "refresh_cache-8cd92082-94e5-46f3-992f-afb6b04a3801" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:43:05 np0005588920 nova_compute[226886]: 2026-01-20 14:43:05.306 226890 DEBUG nova.network.neutron [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:43:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:05.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:05 np0005588920 nova_compute[226886]: 2026-01-20 14:43:05.437 226890 DEBUG nova.network.neutron [-] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:43:05 np0005588920 nova_compute[226886]: 2026-01-20 14:43:05.447 226890 DEBUG nova.compute.manager [req-cea34333-31cb-45d4-983f-517a3c58ddae req-e9a86320-43e5-4d57-b110-183e212d8dc8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Received event network-vif-plugged-01b95c4f-9db6-469f-9458-8c279a5778f0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:43:05 np0005588920 nova_compute[226886]: 2026-01-20 14:43:05.447 226890 DEBUG oslo_concurrency.lockutils [req-cea34333-31cb-45d4-983f-517a3c58ddae req-e9a86320-43e5-4d57-b110-183e212d8dc8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "525b8695-a4df-46c5-875a-42d3b18b78be-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:43:05 np0005588920 nova_compute[226886]: 2026-01-20 14:43:05.448 226890 DEBUG oslo_concurrency.lockutils [req-cea34333-31cb-45d4-983f-517a3c58ddae req-e9a86320-43e5-4d57-b110-183e212d8dc8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "525b8695-a4df-46c5-875a-42d3b18b78be-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:43:05 np0005588920 nova_compute[226886]: 2026-01-20 14:43:05.448 226890 DEBUG oslo_concurrency.lockutils [req-cea34333-31cb-45d4-983f-517a3c58ddae req-e9a86320-43e5-4d57-b110-183e212d8dc8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "525b8695-a4df-46c5-875a-42d3b18b78be-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:05 np0005588920 nova_compute[226886]: 2026-01-20 14:43:05.448 226890 DEBUG nova.compute.manager [req-cea34333-31cb-45d4-983f-517a3c58ddae req-e9a86320-43e5-4d57-b110-183e212d8dc8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] No waiting events found dispatching network-vif-plugged-01b95c4f-9db6-469f-9458-8c279a5778f0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:43:05 np0005588920 nova_compute[226886]: 2026-01-20 14:43:05.449 226890 WARNING nova.compute.manager [req-cea34333-31cb-45d4-983f-517a3c58ddae req-e9a86320-43e5-4d57-b110-183e212d8dc8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Received unexpected event network-vif-plugged-01b95c4f-9db6-469f-9458-8c279a5778f0 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:43:05 np0005588920 nova_compute[226886]: 2026-01-20 14:43:05.462 226890 INFO nova.compute.manager [-] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Took 1.25 seconds to deallocate network for instance.#033[00m
Jan 20 09:43:05 np0005588920 nova_compute[226886]: 2026-01-20 14:43:05.491 226890 DEBUG nova.network.neutron [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:43:05 np0005588920 nova_compute[226886]: 2026-01-20 14:43:05.535 226890 DEBUG nova.compute.manager [req-d3b1e074-28ae-422a-80a6-6428ff20acd4 req-1d8b99f7-bd84-44d8-ae02-49b0340f5092 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Received event network-vif-deleted-01b95c4f-9db6-469f-9458-8c279a5778f0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:43:05 np0005588920 nova_compute[226886]: 2026-01-20 14:43:05.551 226890 DEBUG oslo_concurrency.lockutils [None req-67345773-3388-4d5c-a46d-8a0a0617a0d4 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:43:05 np0005588920 nova_compute[226886]: 2026-01-20 14:43:05.552 226890 DEBUG oslo_concurrency.lockutils [None req-67345773-3388-4d5c-a46d-8a0a0617a0d4 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:43:05 np0005588920 nova_compute[226886]: 2026-01-20 14:43:05.626 226890 DEBUG oslo_concurrency.processutils [None req-67345773-3388-4d5c-a46d-8a0a0617a0d4 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:43:06 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:43:06 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2177212168' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:43:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:43:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:06.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:43:06 np0005588920 nova_compute[226886]: 2026-01-20 14:43:06.064 226890 DEBUG oslo_concurrency.processutils [None req-67345773-3388-4d5c-a46d-8a0a0617a0d4 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:43:06 np0005588920 nova_compute[226886]: 2026-01-20 14:43:06.070 226890 DEBUG nova.compute.provider_tree [None req-67345773-3388-4d5c-a46d-8a0a0617a0d4 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:43:06 np0005588920 nova_compute[226886]: 2026-01-20 14:43:06.086 226890 DEBUG nova.scheduler.client.report [None req-67345773-3388-4d5c-a46d-8a0a0617a0d4 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:43:06 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:43:06 np0005588920 nova_compute[226886]: 2026-01-20 14:43:06.121 226890 DEBUG oslo_concurrency.lockutils [None req-67345773-3388-4d5c-a46d-8a0a0617a0d4 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.569s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:06 np0005588920 nova_compute[226886]: 2026-01-20 14:43:06.184 226890 INFO nova.scheduler.client.report [None req-67345773-3388-4d5c-a46d-8a0a0617a0d4 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Deleted allocations for instance 525b8695-a4df-46c5-875a-42d3b18b78be#033[00m
Jan 20 09:43:06 np0005588920 nova_compute[226886]: 2026-01-20 14:43:06.272 226890 DEBUG oslo_concurrency.lockutils [None req-67345773-3388-4d5c-a46d-8a0a0617a0d4 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Lock "525b8695-a4df-46c5-875a-42d3b18b78be" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.662s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:06 np0005588920 nova_compute[226886]: 2026-01-20 14:43:06.607 226890 DEBUG nova.compute.manager [req-1f772236-9ace-43b6-b962-50712de61acf req-d4db3aa9-b981-44f2-bcce-e20d50a127fa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Received event network-changed-9e71fac0-8593-43ea-8364-0ad2c9111d03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:43:06 np0005588920 nova_compute[226886]: 2026-01-20 14:43:06.607 226890 DEBUG nova.compute.manager [req-1f772236-9ace-43b6-b962-50712de61acf req-d4db3aa9-b981-44f2-bcce-e20d50a127fa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Refreshing instance network info cache due to event network-changed-9e71fac0-8593-43ea-8364-0ad2c9111d03. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:43:06 np0005588920 nova_compute[226886]: 2026-01-20 14:43:06.608 226890 DEBUG oslo_concurrency.lockutils [req-1f772236-9ace-43b6-b962-50712de61acf req-d4db3aa9-b981-44f2-bcce-e20d50a127fa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-8cd92082-94e5-46f3-992f-afb6b04a3801" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:43:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:43:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:07.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:43:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:08.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:08 np0005588920 nova_compute[226886]: 2026-01-20 14:43:08.271 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:08 np0005588920 nova_compute[226886]: 2026-01-20 14:43:08.542 226890 DEBUG nova.network.neutron [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Updating instance_info_cache with network_info: [{"id": "531c124f-8aea-41ea-bdb8-428b944ad2a5", "address": "fa:16:3e:cf:24:d8", "network": {"id": "49c2fa90-1bfc-42c2-8a83-e0a4d3888265", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-551973941", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.222", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13c0d93976f745dba4ab050770ccaae6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap531c124f-8a", "ovs_interfaceid": "531c124f-8aea-41ea-bdb8-428b944ad2a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "9e71fac0-8593-43ea-8364-0ad2c9111d03", "address": "fa:16:3e:cd:42:db", "network": {"id": "e1c2787c-8a6c-4f83-bc72-392ade5fb67f", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1156837827", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.39", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13c0d93976f745dba4ab050770ccaae6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e71fac0-85", "ovs_interfaceid": "9e71fac0-8593-43ea-8364-0ad2c9111d03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:43:08 np0005588920 nova_compute[226886]: 2026-01-20 14:43:08.568 226890 DEBUG oslo_concurrency.lockutils [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Releasing lock "refresh_cache-8cd92082-94e5-46f3-992f-afb6b04a3801" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:43:08 np0005588920 nova_compute[226886]: 2026-01-20 14:43:08.568 226890 DEBUG nova.compute.manager [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Instance network_info: |[{"id": "531c124f-8aea-41ea-bdb8-428b944ad2a5", "address": "fa:16:3e:cf:24:d8", "network": {"id": "49c2fa90-1bfc-42c2-8a83-e0a4d3888265", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-551973941", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.222", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13c0d93976f745dba4ab050770ccaae6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap531c124f-8a", "ovs_interfaceid": "531c124f-8aea-41ea-bdb8-428b944ad2a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "9e71fac0-8593-43ea-8364-0ad2c9111d03", "address": "fa:16:3e:cd:42:db", "network": {"id": "e1c2787c-8a6c-4f83-bc72-392ade5fb67f", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1156837827", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.39", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13c0d93976f745dba4ab050770ccaae6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e71fac0-85", "ovs_interfaceid": "9e71fac0-8593-43ea-8364-0ad2c9111d03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:43:08 np0005588920 nova_compute[226886]: 2026-01-20 14:43:08.569 226890 DEBUG oslo_concurrency.lockutils [req-1f772236-9ace-43b6-b962-50712de61acf req-d4db3aa9-b981-44f2-bcce-e20d50a127fa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-8cd92082-94e5-46f3-992f-afb6b04a3801" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:43:08 np0005588920 nova_compute[226886]: 2026-01-20 14:43:08.569 226890 DEBUG nova.network.neutron [req-1f772236-9ace-43b6-b962-50712de61acf req-d4db3aa9-b981-44f2-bcce-e20d50a127fa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Refreshing network info cache for port 9e71fac0-8593-43ea-8364-0ad2c9111d03 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:43:08 np0005588920 nova_compute[226886]: 2026-01-20 14:43:08.574 226890 DEBUG nova.virt.libvirt.driver [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Start _get_guest_xml network_info=[{"id": "531c124f-8aea-41ea-bdb8-428b944ad2a5", "address": "fa:16:3e:cf:24:d8", "network": {"id": "49c2fa90-1bfc-42c2-8a83-e0a4d3888265", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-551973941", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.222", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13c0d93976f745dba4ab050770ccaae6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap531c124f-8a", "ovs_interfaceid": "531c124f-8aea-41ea-bdb8-428b944ad2a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "9e71fac0-8593-43ea-8364-0ad2c9111d03", "address": "fa:16:3e:cd:42:db", "network": {"id": "e1c2787c-8a6c-4f83-bc72-392ade5fb67f", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1156837827", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.39", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13c0d93976f745dba4ab050770ccaae6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e71fac0-85", "ovs_interfaceid": "9e71fac0-8593-43ea-8364-0ad2c9111d03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:43:08 np0005588920 nova_compute[226886]: 2026-01-20 14:43:08.579 226890 WARNING nova.virt.libvirt.driver [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:43:08 np0005588920 nova_compute[226886]: 2026-01-20 14:43:08.583 226890 DEBUG nova.virt.libvirt.host [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:43:08 np0005588920 nova_compute[226886]: 2026-01-20 14:43:08.584 226890 DEBUG nova.virt.libvirt.host [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:43:08 np0005588920 nova_compute[226886]: 2026-01-20 14:43:08.587 226890 DEBUG nova.virt.libvirt.host [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:43:08 np0005588920 nova_compute[226886]: 2026-01-20 14:43:08.588 226890 DEBUG nova.virt.libvirt.host [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:43:08 np0005588920 nova_compute[226886]: 2026-01-20 14:43:08.589 226890 DEBUG nova.virt.libvirt.driver [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:43:08 np0005588920 nova_compute[226886]: 2026-01-20 14:43:08.589 226890 DEBUG nova.virt.hardware [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:43:08 np0005588920 nova_compute[226886]: 2026-01-20 14:43:08.590 226890 DEBUG nova.virt.hardware [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:43:08 np0005588920 nova_compute[226886]: 2026-01-20 14:43:08.590 226890 DEBUG nova.virt.hardware [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:43:08 np0005588920 nova_compute[226886]: 2026-01-20 14:43:08.590 226890 DEBUG nova.virt.hardware [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:43:08 np0005588920 nova_compute[226886]: 2026-01-20 14:43:08.590 226890 DEBUG nova.virt.hardware [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:43:08 np0005588920 nova_compute[226886]: 2026-01-20 14:43:08.591 226890 DEBUG nova.virt.hardware [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:43:08 np0005588920 nova_compute[226886]: 2026-01-20 14:43:08.591 226890 DEBUG nova.virt.hardware [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:43:08 np0005588920 nova_compute[226886]: 2026-01-20 14:43:08.591 226890 DEBUG nova.virt.hardware [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:43:08 np0005588920 nova_compute[226886]: 2026-01-20 14:43:08.591 226890 DEBUG nova.virt.hardware [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:43:08 np0005588920 nova_compute[226886]: 2026-01-20 14:43:08.592 226890 DEBUG nova.virt.hardware [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:43:08 np0005588920 nova_compute[226886]: 2026-01-20 14:43:08.592 226890 DEBUG nova.virt.hardware [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:43:08 np0005588920 nova_compute[226886]: 2026-01-20 14:43:08.595 226890 DEBUG oslo_concurrency.processutils [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:43:08 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:43:08 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1530503608' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.007 226890 DEBUG oslo_concurrency.processutils [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.412s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.041 226890 DEBUG nova.storage.rbd_utils [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] rbd image 8cd92082-94e5-46f3-992f-afb6b04a3801_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.047 226890 DEBUG oslo_concurrency.processutils [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:43:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:43:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:09.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:43:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:43:09 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1857567897' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.464 226890 DEBUG oslo_concurrency.processutils [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.416s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.466 226890 DEBUG nova.virt.libvirt.vif [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:42:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-1117384226',display_name='tempest-ServersTestMultiNic-server-1117384226',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-1117384226',id=79,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='13c0d93976f745dba4ab050770ccaae6',ramdisk_id='',reservation_id='r-w391cf92',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1634662961',owner_user_name='tempest-ServersTestMultiNic-1634662961-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:42:59Z,user_data=None,user_id='6a00517a957e4ceb8564cbf1dfa15ee2',uuid=8cd92082-94e5-46f3-992f-afb6b04a3801,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "531c124f-8aea-41ea-bdb8-428b944ad2a5", "address": "fa:16:3e:cf:24:d8", "network": {"id": "49c2fa90-1bfc-42c2-8a83-e0a4d3888265", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-551973941", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.222", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13c0d93976f745dba4ab050770ccaae6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap531c124f-8a", "ovs_interfaceid": "531c124f-8aea-41ea-bdb8-428b944ad2a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.466 226890 DEBUG nova.network.os_vif_util [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Converting VIF {"id": "531c124f-8aea-41ea-bdb8-428b944ad2a5", "address": "fa:16:3e:cf:24:d8", "network": {"id": "49c2fa90-1bfc-42c2-8a83-e0a4d3888265", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-551973941", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.222", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13c0d93976f745dba4ab050770ccaae6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap531c124f-8a", "ovs_interfaceid": "531c124f-8aea-41ea-bdb8-428b944ad2a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.467 226890 DEBUG nova.network.os_vif_util [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cf:24:d8,bridge_name='br-int',has_traffic_filtering=True,id=531c124f-8aea-41ea-bdb8-428b944ad2a5,network=Network(49c2fa90-1bfc-42c2-8a83-e0a4d3888265),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap531c124f-8a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.468 226890 DEBUG nova.virt.libvirt.vif [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:42:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-1117384226',display_name='tempest-ServersTestMultiNic-server-1117384226',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-1117384226',id=79,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='13c0d93976f745dba4ab050770ccaae6',ramdisk_id='',reservation_id='r-w391cf92',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1634662961',owner_user_name='tempest-ServersTestMultiNic-1634662
961-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:42:59Z,user_data=None,user_id='6a00517a957e4ceb8564cbf1dfa15ee2',uuid=8cd92082-94e5-46f3-992f-afb6b04a3801,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9e71fac0-8593-43ea-8364-0ad2c9111d03", "address": "fa:16:3e:cd:42:db", "network": {"id": "e1c2787c-8a6c-4f83-bc72-392ade5fb67f", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1156837827", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.39", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13c0d93976f745dba4ab050770ccaae6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e71fac0-85", "ovs_interfaceid": "9e71fac0-8593-43ea-8364-0ad2c9111d03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.468 226890 DEBUG nova.network.os_vif_util [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Converting VIF {"id": "9e71fac0-8593-43ea-8364-0ad2c9111d03", "address": "fa:16:3e:cd:42:db", "network": {"id": "e1c2787c-8a6c-4f83-bc72-392ade5fb67f", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1156837827", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.39", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13c0d93976f745dba4ab050770ccaae6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e71fac0-85", "ovs_interfaceid": "9e71fac0-8593-43ea-8364-0ad2c9111d03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.469 226890 DEBUG nova.network.os_vif_util [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cd:42:db,bridge_name='br-int',has_traffic_filtering=True,id=9e71fac0-8593-43ea-8364-0ad2c9111d03,network=Network(e1c2787c-8a6c-4f83-bc72-392ade5fb67f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9e71fac0-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.470 226890 DEBUG nova.objects.instance [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Lazy-loading 'pci_devices' on Instance uuid 8cd92082-94e5-46f3-992f-afb6b04a3801 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.482 226890 DEBUG nova.virt.libvirt.driver [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:43:09 np0005588920 nova_compute[226886]:  <uuid>8cd92082-94e5-46f3-992f-afb6b04a3801</uuid>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:  <name>instance-0000004f</name>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:43:09 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:      <nova:name>tempest-ServersTestMultiNic-server-1117384226</nova:name>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:43:08</nova:creationTime>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:43:09 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:        <nova:user uuid="6a00517a957e4ceb8564cbf1dfa15ee2">tempest-ServersTestMultiNic-1634662961-project-member</nova:user>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:        <nova:project uuid="13c0d93976f745dba4ab050770ccaae6">tempest-ServersTestMultiNic-1634662961</nova:project>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:        <nova:port uuid="531c124f-8aea-41ea-bdb8-428b944ad2a5">
Jan 20 09:43:09 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.222" ipVersion="4"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:        <nova:port uuid="9e71fac0-8593-43ea-8364-0ad2c9111d03">
Jan 20 09:43:09 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.1.39" ipVersion="4"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:      <entry name="serial">8cd92082-94e5-46f3-992f-afb6b04a3801</entry>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:      <entry name="uuid">8cd92082-94e5-46f3-992f-afb6b04a3801</entry>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:43:09 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/8cd92082-94e5-46f3-992f-afb6b04a3801_disk">
Jan 20 09:43:09 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:43:09 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:43:09 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/8cd92082-94e5-46f3-992f-afb6b04a3801_disk.config">
Jan 20 09:43:09 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:43:09 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:43:09 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:cf:24:d8"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:      <target dev="tap531c124f-8a"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:43:09 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:cd:42:db"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:      <target dev="tap9e71fac0-85"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:43:09 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/8cd92082-94e5-46f3-992f-afb6b04a3801/console.log" append="off"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:43:09 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:43:09 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:43:09 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:43:09 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:43:09 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.484 226890 DEBUG nova.compute.manager [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Preparing to wait for external event network-vif-plugged-531c124f-8aea-41ea-bdb8-428b944ad2a5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.484 226890 DEBUG oslo_concurrency.lockutils [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Acquiring lock "8cd92082-94e5-46f3-992f-afb6b04a3801-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.485 226890 DEBUG oslo_concurrency.lockutils [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Lock "8cd92082-94e5-46f3-992f-afb6b04a3801-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.485 226890 DEBUG oslo_concurrency.lockutils [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Lock "8cd92082-94e5-46f3-992f-afb6b04a3801-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.485 226890 DEBUG nova.compute.manager [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Preparing to wait for external event network-vif-plugged-9e71fac0-8593-43ea-8364-0ad2c9111d03 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.486 226890 DEBUG oslo_concurrency.lockutils [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Acquiring lock "8cd92082-94e5-46f3-992f-afb6b04a3801-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.486 226890 DEBUG oslo_concurrency.lockutils [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Lock "8cd92082-94e5-46f3-992f-afb6b04a3801-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.486 226890 DEBUG oslo_concurrency.lockutils [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Lock "8cd92082-94e5-46f3-992f-afb6b04a3801-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.487 226890 DEBUG nova.virt.libvirt.vif [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:42:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-1117384226',display_name='tempest-ServersTestMultiNic-server-1117384226',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-1117384226',id=79,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='13c0d93976f745dba4ab050770ccaae6',ramdisk_id='',reservation_id='r-w391cf92',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1634662961',owner_user_name='tempest-ServersTestMultiN
ic-1634662961-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:42:59Z,user_data=None,user_id='6a00517a957e4ceb8564cbf1dfa15ee2',uuid=8cd92082-94e5-46f3-992f-afb6b04a3801,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "531c124f-8aea-41ea-bdb8-428b944ad2a5", "address": "fa:16:3e:cf:24:d8", "network": {"id": "49c2fa90-1bfc-42c2-8a83-e0a4d3888265", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-551973941", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.222", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13c0d93976f745dba4ab050770ccaae6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap531c124f-8a", "ovs_interfaceid": "531c124f-8aea-41ea-bdb8-428b944ad2a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.487 226890 DEBUG nova.network.os_vif_util [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Converting VIF {"id": "531c124f-8aea-41ea-bdb8-428b944ad2a5", "address": "fa:16:3e:cf:24:d8", "network": {"id": "49c2fa90-1bfc-42c2-8a83-e0a4d3888265", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-551973941", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.222", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13c0d93976f745dba4ab050770ccaae6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap531c124f-8a", "ovs_interfaceid": "531c124f-8aea-41ea-bdb8-428b944ad2a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.488 226890 DEBUG nova.network.os_vif_util [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cf:24:d8,bridge_name='br-int',has_traffic_filtering=True,id=531c124f-8aea-41ea-bdb8-428b944ad2a5,network=Network(49c2fa90-1bfc-42c2-8a83-e0a4d3888265),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap531c124f-8a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.489 226890 DEBUG os_vif [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cf:24:d8,bridge_name='br-int',has_traffic_filtering=True,id=531c124f-8aea-41ea-bdb8-428b944ad2a5,network=Network(49c2fa90-1bfc-42c2-8a83-e0a4d3888265),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap531c124f-8a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.489 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.490 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.490 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.494 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.494 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap531c124f-8a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.494 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap531c124f-8a, col_values=(('external_ids', {'iface-id': '531c124f-8aea-41ea-bdb8-428b944ad2a5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cf:24:d8', 'vm-uuid': '8cd92082-94e5-46f3-992f-afb6b04a3801'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.496 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:09 np0005588920 NetworkManager[49076]: <info>  [1768920189.4977] manager: (tap531c124f-8a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/156)
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.499 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.504 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.505 226890 INFO os_vif [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cf:24:d8,bridge_name='br-int',has_traffic_filtering=True,id=531c124f-8aea-41ea-bdb8-428b944ad2a5,network=Network(49c2fa90-1bfc-42c2-8a83-e0a4d3888265),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap531c124f-8a')#033[00m
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.506 226890 DEBUG nova.virt.libvirt.vif [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:42:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-1117384226',display_name='tempest-ServersTestMultiNic-server-1117384226',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-1117384226',id=79,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='13c0d93976f745dba4ab050770ccaae6',ramdisk_id='',reservation_id='r-w391cf92',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1634662961',owner_user_name='tempest-ServersTestMultiN
ic-1634662961-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:42:59Z,user_data=None,user_id='6a00517a957e4ceb8564cbf1dfa15ee2',uuid=8cd92082-94e5-46f3-992f-afb6b04a3801,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9e71fac0-8593-43ea-8364-0ad2c9111d03", "address": "fa:16:3e:cd:42:db", "network": {"id": "e1c2787c-8a6c-4f83-bc72-392ade5fb67f", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1156837827", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.39", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13c0d93976f745dba4ab050770ccaae6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e71fac0-85", "ovs_interfaceid": "9e71fac0-8593-43ea-8364-0ad2c9111d03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.506 226890 DEBUG nova.network.os_vif_util [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Converting VIF {"id": "9e71fac0-8593-43ea-8364-0ad2c9111d03", "address": "fa:16:3e:cd:42:db", "network": {"id": "e1c2787c-8a6c-4f83-bc72-392ade5fb67f", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1156837827", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.39", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13c0d93976f745dba4ab050770ccaae6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e71fac0-85", "ovs_interfaceid": "9e71fac0-8593-43ea-8364-0ad2c9111d03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.507 226890 DEBUG nova.network.os_vif_util [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cd:42:db,bridge_name='br-int',has_traffic_filtering=True,id=9e71fac0-8593-43ea-8364-0ad2c9111d03,network=Network(e1c2787c-8a6c-4f83-bc72-392ade5fb67f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9e71fac0-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.507 226890 DEBUG os_vif [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cd:42:db,bridge_name='br-int',has_traffic_filtering=True,id=9e71fac0-8593-43ea-8364-0ad2c9111d03,network=Network(e1c2787c-8a6c-4f83-bc72-392ade5fb67f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9e71fac0-85') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.508 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.508 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.508 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.511 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.511 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9e71fac0-85, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.512 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9e71fac0-85, col_values=(('external_ids', {'iface-id': '9e71fac0-8593-43ea-8364-0ad2c9111d03', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cd:42:db', 'vm-uuid': '8cd92082-94e5-46f3-992f-afb6b04a3801'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.513 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:09 np0005588920 NetworkManager[49076]: <info>  [1768920189.5147] manager: (tap9e71fac0-85): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/157)
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.515 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.524 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.526 226890 INFO os_vif [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cd:42:db,bridge_name='br-int',has_traffic_filtering=True,id=9e71fac0-8593-43ea-8364-0ad2c9111d03,network=Network(e1c2787c-8a6c-4f83-bc72-392ade5fb67f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9e71fac0-85')#033[00m
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.606 226890 DEBUG nova.virt.libvirt.driver [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.607 226890 DEBUG nova.virt.libvirt.driver [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.607 226890 DEBUG nova.virt.libvirt.driver [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] No VIF found with MAC fa:16:3e:cf:24:d8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.607 226890 DEBUG nova.virt.libvirt.driver [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] No VIF found with MAC fa:16:3e:cd:42:db, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.608 226890 INFO nova.virt.libvirt.driver [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Using config drive#033[00m
Jan 20 09:43:09 np0005588920 nova_compute[226886]: 2026-01-20 14:43:09.632 226890 DEBUG nova.storage.rbd_utils [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] rbd image 8cd92082-94e5-46f3-992f-afb6b04a3801_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:43:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:10.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:10 np0005588920 nova_compute[226886]: 2026-01-20 14:43:10.104 226890 INFO nova.virt.libvirt.driver [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Creating config drive at /var/lib/nova/instances/8cd92082-94e5-46f3-992f-afb6b04a3801/disk.config#033[00m
Jan 20 09:43:10 np0005588920 nova_compute[226886]: 2026-01-20 14:43:10.110 226890 DEBUG oslo_concurrency.processutils [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8cd92082-94e5-46f3-992f-afb6b04a3801/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpv5fhicbj execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:43:10 np0005588920 nova_compute[226886]: 2026-01-20 14:43:10.225 226890 DEBUG nova.network.neutron [req-1f772236-9ace-43b6-b962-50712de61acf req-d4db3aa9-b981-44f2-bcce-e20d50a127fa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Updated VIF entry in instance network info cache for port 9e71fac0-8593-43ea-8364-0ad2c9111d03. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:43:10 np0005588920 nova_compute[226886]: 2026-01-20 14:43:10.226 226890 DEBUG nova.network.neutron [req-1f772236-9ace-43b6-b962-50712de61acf req-d4db3aa9-b981-44f2-bcce-e20d50a127fa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Updating instance_info_cache with network_info: [{"id": "531c124f-8aea-41ea-bdb8-428b944ad2a5", "address": "fa:16:3e:cf:24:d8", "network": {"id": "49c2fa90-1bfc-42c2-8a83-e0a4d3888265", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-551973941", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.222", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13c0d93976f745dba4ab050770ccaae6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap531c124f-8a", "ovs_interfaceid": "531c124f-8aea-41ea-bdb8-428b944ad2a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "9e71fac0-8593-43ea-8364-0ad2c9111d03", "address": "fa:16:3e:cd:42:db", "network": {"id": "e1c2787c-8a6c-4f83-bc72-392ade5fb67f", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1156837827", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.39", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13c0d93976f745dba4ab050770ccaae6", 
"mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e71fac0-85", "ovs_interfaceid": "9e71fac0-8593-43ea-8364-0ad2c9111d03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:43:10 np0005588920 nova_compute[226886]: 2026-01-20 14:43:10.228 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:10 np0005588920 nova_compute[226886]: 2026-01-20 14:43:10.243 226890 DEBUG oslo_concurrency.processutils [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8cd92082-94e5-46f3-992f-afb6b04a3801/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpv5fhicbj" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:43:10 np0005588920 nova_compute[226886]: 2026-01-20 14:43:10.272 226890 DEBUG nova.storage.rbd_utils [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] rbd image 8cd92082-94e5-46f3-992f-afb6b04a3801_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:43:10 np0005588920 nova_compute[226886]: 2026-01-20 14:43:10.276 226890 DEBUG oslo_concurrency.processutils [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/8cd92082-94e5-46f3-992f-afb6b04a3801/disk.config 8cd92082-94e5-46f3-992f-afb6b04a3801_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:43:10 np0005588920 nova_compute[226886]: 2026-01-20 14:43:10.301 226890 DEBUG oslo_concurrency.lockutils [req-1f772236-9ace-43b6-b962-50712de61acf req-d4db3aa9-b981-44f2-bcce-e20d50a127fa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-8cd92082-94e5-46f3-992f-afb6b04a3801" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:43:10 np0005588920 nova_compute[226886]: 2026-01-20 14:43:10.727 226890 DEBUG oslo_concurrency.processutils [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/8cd92082-94e5-46f3-992f-afb6b04a3801/disk.config 8cd92082-94e5-46f3-992f-afb6b04a3801_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:43:10 np0005588920 nova_compute[226886]: 2026-01-20 14:43:10.727 226890 INFO nova.virt.libvirt.driver [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Deleting local config drive /var/lib/nova/instances/8cd92082-94e5-46f3-992f-afb6b04a3801/disk.config because it was imported into RBD.#033[00m
Jan 20 09:43:10 np0005588920 kernel: tap531c124f-8a: entered promiscuous mode
Jan 20 09:43:10 np0005588920 NetworkManager[49076]: <info>  [1768920190.7829] manager: (tap531c124f-8a): new Tun device (/org/freedesktop/NetworkManager/Devices/158)
Jan 20 09:43:10 np0005588920 ovn_controller[133971]: 2026-01-20T14:43:10Z|00300|binding|INFO|Claiming lport 531c124f-8aea-41ea-bdb8-428b944ad2a5 for this chassis.
Jan 20 09:43:10 np0005588920 ovn_controller[133971]: 2026-01-20T14:43:10Z|00301|binding|INFO|531c124f-8aea-41ea-bdb8-428b944ad2a5: Claiming fa:16:3e:cf:24:d8 10.100.0.222
Jan 20 09:43:10 np0005588920 nova_compute[226886]: 2026-01-20 14:43:10.787 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:10 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:10.799 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cf:24:d8 10.100.0.222'], port_security=['fa:16:3e:cf:24:d8 10.100.0.222'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.222/24', 'neutron:device_id': '8cd92082-94e5-46f3-992f-afb6b04a3801', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-49c2fa90-1bfc-42c2-8a83-e0a4d3888265', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '13c0d93976f745dba4ab050770ccaae6', 'neutron:revision_number': '2', 'neutron:security_group_ids': '774782e1-5f70-48c6-908e-055253fbcd30', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b22412b1-dc16-4643-ab31-b29b37d280c4, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=531c124f-8aea-41ea-bdb8-428b944ad2a5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:43:10 np0005588920 NetworkManager[49076]: <info>  [1768920190.8010] manager: (tap9e71fac0-85): new Tun device (/org/freedesktop/NetworkManager/Devices/159)
Jan 20 09:43:10 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:10.801 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 531c124f-8aea-41ea-bdb8-428b944ad2a5 in datapath 49c2fa90-1bfc-42c2-8a83-e0a4d3888265 bound to our chassis#033[00m
Jan 20 09:43:10 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:10.803 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 49c2fa90-1bfc-42c2-8a83-e0a4d3888265#033[00m
Jan 20 09:43:10 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:10.815 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ba8eace9-e57b-4555-9fe3-f8d105ea3dd5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:10 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:10.816 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap49c2fa90-11 in ovnmeta-49c2fa90-1bfc-42c2-8a83-e0a4d3888265 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:43:10 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:10.818 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap49c2fa90-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:43:10 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:10.818 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[580b63c8-e415-4f64-8ad8-c737b43f7273]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:10 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:10.819 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[57142e7e-dfc2-4a5e-9cac-92069039c9b9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:10 np0005588920 kernel: tap9e71fac0-85: entered promiscuous mode
Jan 20 09:43:10 np0005588920 nova_compute[226886]: 2026-01-20 14:43:10.826 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:10 np0005588920 ovn_controller[133971]: 2026-01-20T14:43:10Z|00302|binding|INFO|Claiming lport 9e71fac0-8593-43ea-8364-0ad2c9111d03 for this chassis.
Jan 20 09:43:10 np0005588920 ovn_controller[133971]: 2026-01-20T14:43:10Z|00303|binding|INFO|9e71fac0-8593-43ea-8364-0ad2c9111d03: Claiming fa:16:3e:cd:42:db 10.100.1.39
Jan 20 09:43:10 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:10.830 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[643a151e-8e97-4550-8388-08bc4b8c7ebb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:10 np0005588920 systemd-machined[196121]: New machine qemu-33-instance-0000004f.
Jan 20 09:43:10 np0005588920 nova_compute[226886]: 2026-01-20 14:43:10.833 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:10 np0005588920 ovn_controller[133971]: 2026-01-20T14:43:10Z|00304|binding|INFO|Setting lport 531c124f-8aea-41ea-bdb8-428b944ad2a5 ovn-installed in OVS
Jan 20 09:43:10 np0005588920 ovn_controller[133971]: 2026-01-20T14:43:10Z|00305|binding|INFO|Setting lport 531c124f-8aea-41ea-bdb8-428b944ad2a5 up in Southbound
Jan 20 09:43:10 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:10.837 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cd:42:db 10.100.1.39'], port_security=['fa:16:3e:cd:42:db 10.100.1.39'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.1.39/24', 'neutron:device_id': '8cd92082-94e5-46f3-992f-afb6b04a3801', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e1c2787c-8a6c-4f83-bc72-392ade5fb67f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '13c0d93976f745dba4ab050770ccaae6', 'neutron:revision_number': '2', 'neutron:security_group_ids': '774782e1-5f70-48c6-908e-055253fbcd30', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da894a36-fdaf-403d-9ea0-4526779be5e4, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=9e71fac0-8593-43ea-8364-0ad2c9111d03) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:43:10 np0005588920 nova_compute[226886]: 2026-01-20 14:43:10.841 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:10 np0005588920 systemd[1]: Started Virtual Machine qemu-33-instance-0000004f.
Jan 20 09:43:10 np0005588920 systemd-udevd[254914]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:43:10 np0005588920 systemd-udevd[254915]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:43:10 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:10.863 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b0165877-4835-45ce-a1ab-8524c18afd0f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:10 np0005588920 NetworkManager[49076]: <info>  [1768920190.8668] device (tap531c124f-8a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:43:10 np0005588920 NetworkManager[49076]: <info>  [1768920190.8673] device (tap9e71fac0-85): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:43:10 np0005588920 NetworkManager[49076]: <info>  [1768920190.8679] device (tap531c124f-8a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:43:10 np0005588920 NetworkManager[49076]: <info>  [1768920190.8684] device (tap9e71fac0-85): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:43:10 np0005588920 ovn_controller[133971]: 2026-01-20T14:43:10Z|00306|binding|INFO|Setting lport 9e71fac0-8593-43ea-8364-0ad2c9111d03 ovn-installed in OVS
Jan 20 09:43:10 np0005588920 ovn_controller[133971]: 2026-01-20T14:43:10Z|00307|binding|INFO|Setting lport 9e71fac0-8593-43ea-8364-0ad2c9111d03 up in Southbound
Jan 20 09:43:10 np0005588920 nova_compute[226886]: 2026-01-20 14:43:10.878 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:10 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:10.890 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[d170e46f-a497-43a9-b9ef-c706947d8e61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:10 np0005588920 systemd-udevd[254920]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:43:10 np0005588920 NetworkManager[49076]: <info>  [1768920190.8979] manager: (tap49c2fa90-10): new Veth device (/org/freedesktop/NetworkManager/Devices/160)
Jan 20 09:43:10 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:10.897 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[538a7245-b60a-43b3-88b6-d2c40820b779]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:10 np0005588920 podman[254896]: 2026-01-20 14:43:10.915340243 +0000 UTC m=+0.099318280 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:43:10 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:10.931 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[dc7dc714-b477-4317-8534-60ea3feca20f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:10 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:10.934 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[9f3c3d87-743d-4fdc-9677-197185badd6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:10 np0005588920 NetworkManager[49076]: <info>  [1768920190.9559] device (tap49c2fa90-10): carrier: link connected
Jan 20 09:43:10 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:10.961 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[a4b47c88-c307-4372-9b57-1b9deb0475f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:10 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:10.977 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[56b12012-6c2e-45a5-920e-00ca1dd2d22a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap49c2fa90-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4d:3e:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 98], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 523264, 'reachable_time': 39802, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 254951, 'error': None, 'target': 'ovnmeta-49c2fa90-1bfc-42c2-8a83-e0a4d3888265', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:10 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:10.991 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ba26c766-6056-434d-9156-dc8bc9817fb3]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4d:3e81'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 523264, 'tstamp': 523264}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 254952, 'error': None, 'target': 'ovnmeta-49c2fa90-1bfc-42c2-8a83-e0a4d3888265', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:11 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:11.009 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ccdf028d-7fe7-46b5-95fb-4a3dde2068a2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap49c2fa90-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4d:3e:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 98], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 523264, 'reachable_time': 39802, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 254953, 'error': None, 'target': 'ovnmeta-49c2fa90-1bfc-42c2-8a83-e0a4d3888265', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:11 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:11.036 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c43fe271-bd37-4593-b5dd-e00a512f3ee6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:11 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:11.095 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f2c72c8a-2725-4ac5-a1c1-4b05ea387afb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:11 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:11.096 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap49c2fa90-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:43:11 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:11.097 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:43:11 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:43:11 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:11.097 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap49c2fa90-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:43:11 np0005588920 NetworkManager[49076]: <info>  [1768920191.1298] manager: (tap49c2fa90-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/161)
Jan 20 09:43:11 np0005588920 kernel: tap49c2fa90-10: entered promiscuous mode
Jan 20 09:43:11 np0005588920 nova_compute[226886]: 2026-01-20 14:43:11.129 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:11 np0005588920 nova_compute[226886]: 2026-01-20 14:43:11.132 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:11 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:11.133 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap49c2fa90-10, col_values=(('external_ids', {'iface-id': 'cbf41c99-774d-406a-ab32-8286ac9ca6da'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:43:11 np0005588920 nova_compute[226886]: 2026-01-20 14:43:11.134 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:11 np0005588920 ovn_controller[133971]: 2026-01-20T14:43:11Z|00308|binding|INFO|Releasing lport cbf41c99-774d-406a-ab32-8286ac9ca6da from this chassis (sb_readonly=0)
Jan 20 09:43:11 np0005588920 nova_compute[226886]: 2026-01-20 14:43:11.135 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:11 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:11.136 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/49c2fa90-1bfc-42c2-8a83-e0a4d3888265.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/49c2fa90-1bfc-42c2-8a83-e0a4d3888265.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:43:11 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:11.137 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0ac353ec-019d-454b-99f5-b794f5cfc819]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:11 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:11.138 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:43:11 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 09:43:11 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 09:43:11 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-49c2fa90-1bfc-42c2-8a83-e0a4d3888265
Jan 20 09:43:11 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 09:43:11 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 09:43:11 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 09:43:11 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/49c2fa90-1bfc-42c2-8a83-e0a4d3888265.pid.haproxy
Jan 20 09:43:11 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 09:43:11 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:43:11 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 09:43:11 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 09:43:11 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 09:43:11 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 09:43:11 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 09:43:11 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 09:43:11 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 09:43:11 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 09:43:11 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 09:43:11 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 09:43:11 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 09:43:11 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 09:43:11 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 09:43:11 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:43:11 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:43:11 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 09:43:11 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 09:43:11 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:43:11 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID 49c2fa90-1bfc-42c2-8a83-e0a4d3888265
Jan 20 09:43:11 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:43:11 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:11.138 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-49c2fa90-1bfc-42c2-8a83-e0a4d3888265', 'env', 'PROCESS_TAG=haproxy-49c2fa90-1bfc-42c2-8a83-e0a4d3888265', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/49c2fa90-1bfc-42c2-8a83-e0a4d3888265.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:43:11 np0005588920 nova_compute[226886]: 2026-01-20 14:43:11.148 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:43:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:11.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:43:11 np0005588920 podman[254985]: 2026-01-20 14:43:11.43399544 +0000 UTC m=+0.019393111 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:43:11 np0005588920 nova_compute[226886]: 2026-01-20 14:43:11.925 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920191.924307, 8cd92082-94e5-46f3-992f-afb6b04a3801 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:43:11 np0005588920 nova_compute[226886]: 2026-01-20 14:43:11.926 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] VM Started (Lifecycle Event)#033[00m
Jan 20 09:43:11 np0005588920 nova_compute[226886]: 2026-01-20 14:43:11.949 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:43:11 np0005588920 nova_compute[226886]: 2026-01-20 14:43:11.953 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920191.9248707, 8cd92082-94e5-46f3-992f-afb6b04a3801 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:43:11 np0005588920 nova_compute[226886]: 2026-01-20 14:43:11.953 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:43:11 np0005588920 nova_compute[226886]: 2026-01-20 14:43:11.971 226890 DEBUG nova.compute.manager [req-fab0b556-8429-4440-bd1e-42069968c3b8 req-dcc9152c-9612-4dd7-b1fd-72b8255220cb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Received event network-vif-plugged-9e71fac0-8593-43ea-8364-0ad2c9111d03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:43:11 np0005588920 nova_compute[226886]: 2026-01-20 14:43:11.972 226890 DEBUG oslo_concurrency.lockutils [req-fab0b556-8429-4440-bd1e-42069968c3b8 req-dcc9152c-9612-4dd7-b1fd-72b8255220cb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "8cd92082-94e5-46f3-992f-afb6b04a3801-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:43:11 np0005588920 nova_compute[226886]: 2026-01-20 14:43:11.972 226890 DEBUG oslo_concurrency.lockutils [req-fab0b556-8429-4440-bd1e-42069968c3b8 req-dcc9152c-9612-4dd7-b1fd-72b8255220cb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "8cd92082-94e5-46f3-992f-afb6b04a3801-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:43:11 np0005588920 nova_compute[226886]: 2026-01-20 14:43:11.972 226890 DEBUG oslo_concurrency.lockutils [req-fab0b556-8429-4440-bd1e-42069968c3b8 req-dcc9152c-9612-4dd7-b1fd-72b8255220cb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "8cd92082-94e5-46f3-992f-afb6b04a3801-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:11 np0005588920 nova_compute[226886]: 2026-01-20 14:43:11.973 226890 DEBUG nova.compute.manager [req-fab0b556-8429-4440-bd1e-42069968c3b8 req-dcc9152c-9612-4dd7-b1fd-72b8255220cb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Processing event network-vif-plugged-9e71fac0-8593-43ea-8364-0ad2c9111d03 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:43:11 np0005588920 nova_compute[226886]: 2026-01-20 14:43:11.974 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:43:11 np0005588920 nova_compute[226886]: 2026-01-20 14:43:11.977 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:43:11 np0005588920 nova_compute[226886]: 2026-01-20 14:43:11.996 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:43:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:43:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:12.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:43:12 np0005588920 podman[254985]: 2026-01-20 14:43:12.07141864 +0000 UTC m=+0.656816291 container create 3d1b65036365ec9ac99574c223595b26f81e77d6087d66e6e02b6892c78b941e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-49c2fa90-1bfc-42c2-8a83-e0a4d3888265, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 20 09:43:12 np0005588920 systemd[1]: Started libpod-conmon-3d1b65036365ec9ac99574c223595b26f81e77d6087d66e6e02b6892c78b941e.scope.
Jan 20 09:43:12 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:43:12 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3f4071adb58cfc4531fdaa1d1fb7e2f5bb643a655903f81b48c412b1a6012383/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:43:12 np0005588920 nova_compute[226886]: 2026-01-20 14:43:12.614 226890 DEBUG nova.compute.manager [req-1c1a303d-266b-43d8-ba6e-07d16209eeaf req-ee8aacef-b97e-4cfe-a516-541dc56549fc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Received event network-vif-plugged-531c124f-8aea-41ea-bdb8-428b944ad2a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:43:12 np0005588920 nova_compute[226886]: 2026-01-20 14:43:12.615 226890 DEBUG oslo_concurrency.lockutils [req-1c1a303d-266b-43d8-ba6e-07d16209eeaf req-ee8aacef-b97e-4cfe-a516-541dc56549fc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "8cd92082-94e5-46f3-992f-afb6b04a3801-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:43:12 np0005588920 nova_compute[226886]: 2026-01-20 14:43:12.616 226890 DEBUG oslo_concurrency.lockutils [req-1c1a303d-266b-43d8-ba6e-07d16209eeaf req-ee8aacef-b97e-4cfe-a516-541dc56549fc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "8cd92082-94e5-46f3-992f-afb6b04a3801-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:43:12 np0005588920 nova_compute[226886]: 2026-01-20 14:43:12.616 226890 DEBUG oslo_concurrency.lockutils [req-1c1a303d-266b-43d8-ba6e-07d16209eeaf req-ee8aacef-b97e-4cfe-a516-541dc56549fc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "8cd92082-94e5-46f3-992f-afb6b04a3801-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:12 np0005588920 nova_compute[226886]: 2026-01-20 14:43:12.617 226890 DEBUG nova.compute.manager [req-1c1a303d-266b-43d8-ba6e-07d16209eeaf req-ee8aacef-b97e-4cfe-a516-541dc56549fc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Processing event network-vif-plugged-531c124f-8aea-41ea-bdb8-428b944ad2a5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:43:12 np0005588920 nova_compute[226886]: 2026-01-20 14:43:12.617 226890 DEBUG nova.compute.manager [req-1c1a303d-266b-43d8-ba6e-07d16209eeaf req-ee8aacef-b97e-4cfe-a516-541dc56549fc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Received event network-vif-plugged-531c124f-8aea-41ea-bdb8-428b944ad2a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:43:12 np0005588920 nova_compute[226886]: 2026-01-20 14:43:12.618 226890 DEBUG oslo_concurrency.lockutils [req-1c1a303d-266b-43d8-ba6e-07d16209eeaf req-ee8aacef-b97e-4cfe-a516-541dc56549fc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "8cd92082-94e5-46f3-992f-afb6b04a3801-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:43:12 np0005588920 nova_compute[226886]: 2026-01-20 14:43:12.618 226890 DEBUG oslo_concurrency.lockutils [req-1c1a303d-266b-43d8-ba6e-07d16209eeaf req-ee8aacef-b97e-4cfe-a516-541dc56549fc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "8cd92082-94e5-46f3-992f-afb6b04a3801-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:43:12 np0005588920 nova_compute[226886]: 2026-01-20 14:43:12.618 226890 DEBUG oslo_concurrency.lockutils [req-1c1a303d-266b-43d8-ba6e-07d16209eeaf req-ee8aacef-b97e-4cfe-a516-541dc56549fc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "8cd92082-94e5-46f3-992f-afb6b04a3801-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:12 np0005588920 nova_compute[226886]: 2026-01-20 14:43:12.619 226890 DEBUG nova.compute.manager [req-1c1a303d-266b-43d8-ba6e-07d16209eeaf req-ee8aacef-b97e-4cfe-a516-541dc56549fc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] No waiting events found dispatching network-vif-plugged-531c124f-8aea-41ea-bdb8-428b944ad2a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:43:12 np0005588920 nova_compute[226886]: 2026-01-20 14:43:12.619 226890 WARNING nova.compute.manager [req-1c1a303d-266b-43d8-ba6e-07d16209eeaf req-ee8aacef-b97e-4cfe-a516-541dc56549fc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Received unexpected event network-vif-plugged-531c124f-8aea-41ea-bdb8-428b944ad2a5 for instance with vm_state building and task_state spawning.#033[00m
Jan 20 09:43:12 np0005588920 nova_compute[226886]: 2026-01-20 14:43:12.620 226890 DEBUG nova.compute.manager [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Instance event wait completed in 0 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:43:12 np0005588920 nova_compute[226886]: 2026-01-20 14:43:12.624 226890 DEBUG nova.virt.libvirt.driver [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:43:12 np0005588920 nova_compute[226886]: 2026-01-20 14:43:12.625 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920192.624161, 8cd92082-94e5-46f3-992f-afb6b04a3801 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:43:12 np0005588920 nova_compute[226886]: 2026-01-20 14:43:12.626 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:43:12 np0005588920 nova_compute[226886]: 2026-01-20 14:43:12.631 226890 INFO nova.virt.libvirt.driver [-] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Instance spawned successfully.#033[00m
Jan 20 09:43:12 np0005588920 nova_compute[226886]: 2026-01-20 14:43:12.632 226890 DEBUG nova.virt.libvirt.driver [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:43:12 np0005588920 nova_compute[226886]: 2026-01-20 14:43:12.655 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:43:12 np0005588920 nova_compute[226886]: 2026-01-20 14:43:12.666 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:43:12 np0005588920 podman[254985]: 2026-01-20 14:43:12.665761807 +0000 UTC m=+1.251159508 container init 3d1b65036365ec9ac99574c223595b26f81e77d6087d66e6e02b6892c78b941e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-49c2fa90-1bfc-42c2-8a83-e0a4d3888265, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 20 09:43:12 np0005588920 nova_compute[226886]: 2026-01-20 14:43:12.672 226890 DEBUG nova.virt.libvirt.driver [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:43:12 np0005588920 nova_compute[226886]: 2026-01-20 14:43:12.673 226890 DEBUG nova.virt.libvirt.driver [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:43:12 np0005588920 nova_compute[226886]: 2026-01-20 14:43:12.674 226890 DEBUG nova.virt.libvirt.driver [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:43:12 np0005588920 nova_compute[226886]: 2026-01-20 14:43:12.674 226890 DEBUG nova.virt.libvirt.driver [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:43:12 np0005588920 nova_compute[226886]: 2026-01-20 14:43:12.675 226890 DEBUG nova.virt.libvirt.driver [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:43:12 np0005588920 nova_compute[226886]: 2026-01-20 14:43:12.676 226890 DEBUG nova.virt.libvirt.driver [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:43:12 np0005588920 podman[254985]: 2026-01-20 14:43:12.682026581 +0000 UTC m=+1.267424272 container start 3d1b65036365ec9ac99574c223595b26f81e77d6087d66e6e02b6892c78b941e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-49c2fa90-1bfc-42c2-8a83-e0a4d3888265, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 20 09:43:12 np0005588920 nova_compute[226886]: 2026-01-20 14:43:12.687 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:43:12 np0005588920 neutron-haproxy-ovnmeta-49c2fa90-1bfc-42c2-8a83-e0a4d3888265[255043]: [NOTICE]   (255047) : New worker (255049) forked
Jan 20 09:43:12 np0005588920 neutron-haproxy-ovnmeta-49c2fa90-1bfc-42c2-8a83-e0a4d3888265[255043]: [NOTICE]   (255047) : Loading success.
Jan 20 09:43:12 np0005588920 nova_compute[226886]: 2026-01-20 14:43:12.767 226890 INFO nova.compute.manager [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Took 13.02 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:43:12 np0005588920 nova_compute[226886]: 2026-01-20 14:43:12.768 226890 DEBUG nova.compute.manager [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:43:12 np0005588920 nova_compute[226886]: 2026-01-20 14:43:12.835 226890 INFO nova.compute.manager [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Took 14.02 seconds to build instance.#033[00m
Jan 20 09:43:12 np0005588920 nova_compute[226886]: 2026-01-20 14:43:12.852 226890 DEBUG oslo_concurrency.lockutils [None req-29c4e9d0-322e-4c2d-9cb2-c8a3a4528188 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Lock "8cd92082-94e5-46f3-992f-afb6b04a3801" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.173s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:12.890 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 9e71fac0-8593-43ea-8364-0ad2c9111d03 in datapath e1c2787c-8a6c-4f83-bc72-392ade5fb67f unbound from our chassis#033[00m
Jan 20 09:43:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:12.894 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e1c2787c-8a6c-4f83-bc72-392ade5fb67f#033[00m
Jan 20 09:43:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:12.907 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[283aebae-e6eb-4c84-b8ad-405e07edc40b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:12.908 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape1c2787c-81 in ovnmeta-e1c2787c-8a6c-4f83-bc72-392ade5fb67f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:43:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:12.911 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape1c2787c-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:43:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:12.911 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c5b9a7a5-9ce3-4168-8a21-e6a6880dfdf3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:12.913 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[18093053-110b-48c2-9477-947681bbfc01]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:12.929 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[77f95307-2a98-4e7f-b61b-29b017c42511]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:12.943 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0ae75da5-a4d3-4968-93dd-d0cf50eca325]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:12.968 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[62a2b812-b551-443f-8d34-6de1fea1ce5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:12.976 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2ccacc50-d683-4b02-9075-d246afd7bf77]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:12 np0005588920 NetworkManager[49076]: <info>  [1768920192.9777] manager: (tape1c2787c-80): new Veth device (/org/freedesktop/NetworkManager/Devices/162)
Jan 20 09:43:12 np0005588920 systemd-udevd[254941]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:43:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:13.004 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[56fbe5d1-2b49-4e84-a029-ce171ac9ad67]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:13.007 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[0a2304ef-10f5-4326-9a02-5faa348b84ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:13 np0005588920 NetworkManager[49076]: <info>  [1768920193.0346] device (tape1c2787c-80): carrier: link connected
Jan 20 09:43:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:13.039 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[87220607-bdcc-4d2b-af45-9dd856fbe168]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:13.056 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b289a4f6-7e0f-4ad7-b8ad-a9a242ddc201]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape1c2787c-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6e:15:e7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 99], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 523472, 'reachable_time': 33662, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 255070, 'error': None, 'target': 'ovnmeta-e1c2787c-8a6c-4f83-bc72-392ade5fb67f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:13.071 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2a010c70-3b59-49cd-93a0-3f6848603102]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6e:15e7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 523472, 'tstamp': 523472}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 255071, 'error': None, 'target': 'ovnmeta-e1c2787c-8a6c-4f83-bc72-392ade5fb67f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:13.090 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0172f28a-d090-4516-a9a8-5c75cacbe8d6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape1c2787c-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6e:15:e7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 99], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 523472, 'reachable_time': 33662, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 255072, 'error': None, 'target': 'ovnmeta-e1c2787c-8a6c-4f83-bc72-392ade5fb67f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:13.118 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[feeff678-fe7c-432c-9d99-a82bb9ef741d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:13.185 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d88dba1c-6ba9-4b15-a6ed-403854ee93ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:13.186 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape1c2787c-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:43:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:13.187 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:43:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:13.187 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape1c2787c-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:43:13 np0005588920 NetworkManager[49076]: <info>  [1768920193.1896] manager: (tape1c2787c-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/163)
Jan 20 09:43:13 np0005588920 kernel: tape1c2787c-80: entered promiscuous mode
Jan 20 09:43:13 np0005588920 nova_compute[226886]: 2026-01-20 14:43:13.190 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:13.194 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape1c2787c-80, col_values=(('external_ids', {'iface-id': '70618085-44e0-43f9-a18f-4879712f80f5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:43:13 np0005588920 nova_compute[226886]: 2026-01-20 14:43:13.196 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:13 np0005588920 ovn_controller[133971]: 2026-01-20T14:43:13Z|00309|binding|INFO|Releasing lport 70618085-44e0-43f9-a18f-4879712f80f5 from this chassis (sb_readonly=0)
Jan 20 09:43:13 np0005588920 nova_compute[226886]: 2026-01-20 14:43:13.197 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:13.199 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e1c2787c-8a6c-4f83-bc72-392ade5fb67f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e1c2787c-8a6c-4f83-bc72-392ade5fb67f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:43:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:13.200 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[92e29ed3-8811-4883-a0d4-e049d4cb3cc3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:13.200 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:43:13 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 09:43:13 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 09:43:13 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-e1c2787c-8a6c-4f83-bc72-392ade5fb67f
Jan 20 09:43:13 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 09:43:13 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 09:43:13 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 09:43:13 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/e1c2787c-8a6c-4f83-bc72-392ade5fb67f.pid.haproxy
Jan 20 09:43:13 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 09:43:13 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:43:13 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 09:43:13 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 09:43:13 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 09:43:13 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 09:43:13 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 09:43:13 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 09:43:13 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 09:43:13 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 09:43:13 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 09:43:13 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 09:43:13 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 09:43:13 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 09:43:13 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 09:43:13 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:43:13 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:43:13 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 09:43:13 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 09:43:13 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:43:13 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID e1c2787c-8a6c-4f83-bc72-392ade5fb67f
Jan 20 09:43:13 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:43:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:13.201 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e1c2787c-8a6c-4f83-bc72-392ade5fb67f', 'env', 'PROCESS_TAG=haproxy-e1c2787c-8a6c-4f83-bc72-392ade5fb67f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e1c2787c-8a6c-4f83-bc72-392ade5fb67f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:43:13 np0005588920 nova_compute[226886]: 2026-01-20 14:43:13.211 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:13.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:13 np0005588920 podman[255105]: 2026-01-20 14:43:13.559841531 +0000 UTC m=+0.046454446 container create 60437d9ab638ff02ce9712c88853d462c003f2217eea7b26bf149112f4db6668 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e1c2787c-8a6c-4f83-bc72-392ade5fb67f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:43:13 np0005588920 systemd[1]: Started libpod-conmon-60437d9ab638ff02ce9712c88853d462c003f2217eea7b26bf149112f4db6668.scope.
Jan 20 09:43:13 np0005588920 podman[255105]: 2026-01-20 14:43:13.535494942 +0000 UTC m=+0.022107877 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:43:13 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:43:13 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6bf69571189c889700360802a93965c3317cf6d7e8e46878885f3a01cfaf9994/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:43:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 09:43:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1120863099' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 09:43:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 09:43:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1120863099' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 09:43:13 np0005588920 podman[255105]: 2026-01-20 14:43:13.673300284 +0000 UTC m=+0.159913229 container init 60437d9ab638ff02ce9712c88853d462c003f2217eea7b26bf149112f4db6668 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e1c2787c-8a6c-4f83-bc72-392ade5fb67f, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:43:13 np0005588920 podman[255105]: 2026-01-20 14:43:13.681299687 +0000 UTC m=+0.167912602 container start 60437d9ab638ff02ce9712c88853d462c003f2217eea7b26bf149112f4db6668 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e1c2787c-8a6c-4f83-bc72-392ade5fb67f, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 20 09:43:13 np0005588920 neutron-haproxy-ovnmeta-e1c2787c-8a6c-4f83-bc72-392ade5fb67f[255120]: [NOTICE]   (255124) : New worker (255126) forked
Jan 20 09:43:13 np0005588920 neutron-haproxy-ovnmeta-e1c2787c-8a6c-4f83-bc72-392ade5fb67f[255120]: [NOTICE]   (255124) : Loading success.
Jan 20 09:43:13 np0005588920 nova_compute[226886]: 2026-01-20 14:43:13.803 226890 DEBUG oslo_concurrency.lockutils [None req-c57d7971-bbf0-4e19-87cc-f0e72faab993 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Acquiring lock "8cd92082-94e5-46f3-992f-afb6b04a3801" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:43:13 np0005588920 nova_compute[226886]: 2026-01-20 14:43:13.804 226890 DEBUG oslo_concurrency.lockutils [None req-c57d7971-bbf0-4e19-87cc-f0e72faab993 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Lock "8cd92082-94e5-46f3-992f-afb6b04a3801" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:43:13 np0005588920 nova_compute[226886]: 2026-01-20 14:43:13.804 226890 DEBUG oslo_concurrency.lockutils [None req-c57d7971-bbf0-4e19-87cc-f0e72faab993 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Acquiring lock "8cd92082-94e5-46f3-992f-afb6b04a3801-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:43:13 np0005588920 nova_compute[226886]: 2026-01-20 14:43:13.805 226890 DEBUG oslo_concurrency.lockutils [None req-c57d7971-bbf0-4e19-87cc-f0e72faab993 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Lock "8cd92082-94e5-46f3-992f-afb6b04a3801-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:43:13 np0005588920 nova_compute[226886]: 2026-01-20 14:43:13.805 226890 DEBUG oslo_concurrency.lockutils [None req-c57d7971-bbf0-4e19-87cc-f0e72faab993 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Lock "8cd92082-94e5-46f3-992f-afb6b04a3801-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:13 np0005588920 nova_compute[226886]: 2026-01-20 14:43:13.807 226890 INFO nova.compute.manager [None req-c57d7971-bbf0-4e19-87cc-f0e72faab993 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Terminating instance#033[00m
Jan 20 09:43:13 np0005588920 nova_compute[226886]: 2026-01-20 14:43:13.808 226890 DEBUG nova.compute.manager [None req-c57d7971-bbf0-4e19-87cc-f0e72faab993 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:43:14 np0005588920 kernel: tap531c124f-8a (unregistering): left promiscuous mode
Jan 20 09:43:14 np0005588920 NetworkManager[49076]: <info>  [1768920194.0528] device (tap531c124f-8a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:43:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:14.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:14 np0005588920 ovn_controller[133971]: 2026-01-20T14:43:14Z|00310|binding|INFO|Releasing lport 531c124f-8aea-41ea-bdb8-428b944ad2a5 from this chassis (sb_readonly=0)
Jan 20 09:43:14 np0005588920 ovn_controller[133971]: 2026-01-20T14:43:14Z|00311|binding|INFO|Setting lport 531c124f-8aea-41ea-bdb8-428b944ad2a5 down in Southbound
Jan 20 09:43:14 np0005588920 ovn_controller[133971]: 2026-01-20T14:43:14Z|00312|binding|INFO|Removing iface tap531c124f-8a ovn-installed in OVS
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.063 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:14.068 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cf:24:d8 10.100.0.222'], port_security=['fa:16:3e:cf:24:d8 10.100.0.222'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.222/24', 'neutron:device_id': '8cd92082-94e5-46f3-992f-afb6b04a3801', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-49c2fa90-1bfc-42c2-8a83-e0a4d3888265', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '13c0d93976f745dba4ab050770ccaae6', 'neutron:revision_number': '4', 'neutron:security_group_ids': '774782e1-5f70-48c6-908e-055253fbcd30', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b22412b1-dc16-4643-ab31-b29b37d280c4, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=531c124f-8aea-41ea-bdb8-428b944ad2a5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:43:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:14.069 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 531c124f-8aea-41ea-bdb8-428b944ad2a5 in datapath 49c2fa90-1bfc-42c2-8a83-e0a4d3888265 unbound from our chassis#033[00m
Jan 20 09:43:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:14.070 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 49c2fa90-1bfc-42c2-8a83-e0a4d3888265, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:43:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:14.071 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[49548dcc-c269-4a46-ad01-11036fabbd1d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:14.071 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-49c2fa90-1bfc-42c2-8a83-e0a4d3888265 namespace which is not needed anymore#033[00m
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.079 226890 DEBUG nova.compute.manager [req-bb906044-eaaa-44ad-baa8-136826e2cbbe req-3d61d88b-c849-4920-aacf-0133b896786b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Received event network-vif-plugged-9e71fac0-8593-43ea-8364-0ad2c9111d03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.080 226890 DEBUG oslo_concurrency.lockutils [req-bb906044-eaaa-44ad-baa8-136826e2cbbe req-3d61d88b-c849-4920-aacf-0133b896786b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "8cd92082-94e5-46f3-992f-afb6b04a3801-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.080 226890 DEBUG oslo_concurrency.lockutils [req-bb906044-eaaa-44ad-baa8-136826e2cbbe req-3d61d88b-c849-4920-aacf-0133b896786b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "8cd92082-94e5-46f3-992f-afb6b04a3801-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:43:14 np0005588920 kernel: tap9e71fac0-85 (unregistering): left promiscuous mode
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.081 226890 DEBUG oslo_concurrency.lockutils [req-bb906044-eaaa-44ad-baa8-136826e2cbbe req-3d61d88b-c849-4920-aacf-0133b896786b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "8cd92082-94e5-46f3-992f-afb6b04a3801-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.084 226890 DEBUG nova.compute.manager [req-bb906044-eaaa-44ad-baa8-136826e2cbbe req-3d61d88b-c849-4920-aacf-0133b896786b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] No waiting events found dispatching network-vif-plugged-9e71fac0-8593-43ea-8364-0ad2c9111d03 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.084 226890 WARNING nova.compute.manager [req-bb906044-eaaa-44ad-baa8-136826e2cbbe req-3d61d88b-c849-4920-aacf-0133b896786b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Received unexpected event network-vif-plugged-9e71fac0-8593-43ea-8364-0ad2c9111d03 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:43:14 np0005588920 NetworkManager[49076]: <info>  [1768920194.0880] device (tap9e71fac0-85): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.090 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:14 np0005588920 ovn_controller[133971]: 2026-01-20T14:43:14Z|00313|binding|INFO|Releasing lport 9e71fac0-8593-43ea-8364-0ad2c9111d03 from this chassis (sb_readonly=0)
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.100 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:14 np0005588920 ovn_controller[133971]: 2026-01-20T14:43:14Z|00314|binding|INFO|Setting lport 9e71fac0-8593-43ea-8364-0ad2c9111d03 down in Southbound
Jan 20 09:43:14 np0005588920 ovn_controller[133971]: 2026-01-20T14:43:14Z|00315|binding|INFO|Removing iface tap9e71fac0-85 ovn-installed in OVS
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.106 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:14.112 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cd:42:db 10.100.1.39'], port_security=['fa:16:3e:cd:42:db 10.100.1.39'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.1.39/24', 'neutron:device_id': '8cd92082-94e5-46f3-992f-afb6b04a3801', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e1c2787c-8a6c-4f83-bc72-392ade5fb67f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '13c0d93976f745dba4ab050770ccaae6', 'neutron:revision_number': '4', 'neutron:security_group_ids': '774782e1-5f70-48c6-908e-055253fbcd30', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da894a36-fdaf-403d-9ea0-4526779be5e4, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=9e71fac0-8593-43ea-8364-0ad2c9111d03) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.134 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:14 np0005588920 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d0000004f.scope: Deactivated successfully.
Jan 20 09:43:14 np0005588920 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d0000004f.scope: Consumed 1.987s CPU time.
Jan 20 09:43:14 np0005588920 systemd-machined[196121]: Machine qemu-33-instance-0000004f terminated.
Jan 20 09:43:14 np0005588920 neutron-haproxy-ovnmeta-49c2fa90-1bfc-42c2-8a83-e0a4d3888265[255043]: [NOTICE]   (255047) : haproxy version is 2.8.14-c23fe91
Jan 20 09:43:14 np0005588920 neutron-haproxy-ovnmeta-49c2fa90-1bfc-42c2-8a83-e0a4d3888265[255043]: [NOTICE]   (255047) : path to executable is /usr/sbin/haproxy
Jan 20 09:43:14 np0005588920 neutron-haproxy-ovnmeta-49c2fa90-1bfc-42c2-8a83-e0a4d3888265[255043]: [WARNING]  (255047) : Exiting Master process...
Jan 20 09:43:14 np0005588920 neutron-haproxy-ovnmeta-49c2fa90-1bfc-42c2-8a83-e0a4d3888265[255043]: [ALERT]    (255047) : Current worker (255049) exited with code 143 (Terminated)
Jan 20 09:43:14 np0005588920 neutron-haproxy-ovnmeta-49c2fa90-1bfc-42c2-8a83-e0a4d3888265[255043]: [WARNING]  (255047) : All workers exited. Exiting... (0)
Jan 20 09:43:14 np0005588920 systemd[1]: libpod-3d1b65036365ec9ac99574c223595b26f81e77d6087d66e6e02b6892c78b941e.scope: Deactivated successfully.
Jan 20 09:43:14 np0005588920 conmon[255043]: conmon 3d1b65036365ec9ac995 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-3d1b65036365ec9ac99574c223595b26f81e77d6087d66e6e02b6892c78b941e.scope/container/memory.events
Jan 20 09:43:14 np0005588920 NetworkManager[49076]: <info>  [1768920194.2434] manager: (tap9e71fac0-85): new Tun device (/org/freedesktop/NetworkManager/Devices/164)
Jan 20 09:43:14 np0005588920 podman[255164]: 2026-01-20 14:43:14.245462894 +0000 UTC m=+0.052309589 container died 3d1b65036365ec9ac99574c223595b26f81e77d6087d66e6e02b6892c78b941e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-49c2fa90-1bfc-42c2-8a83-e0a4d3888265, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.260 226890 INFO nova.virt.libvirt.driver [-] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Instance destroyed successfully.#033[00m
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.260 226890 DEBUG nova.objects.instance [None req-c57d7971-bbf0-4e19-87cc-f0e72faab993 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Lazy-loading 'resources' on Instance uuid 8cd92082-94e5-46f3-992f-afb6b04a3801 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:43:14 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3d1b65036365ec9ac99574c223595b26f81e77d6087d66e6e02b6892c78b941e-userdata-shm.mount: Deactivated successfully.
Jan 20 09:43:14 np0005588920 systemd[1]: var-lib-containers-storage-overlay-3f4071adb58cfc4531fdaa1d1fb7e2f5bb643a655903f81b48c412b1a6012383-merged.mount: Deactivated successfully.
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.286 226890 DEBUG nova.virt.libvirt.vif [None req-c57d7971-bbf0-4e19-87cc-f0e72faab993 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:42:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-1117384226',display_name='tempest-ServersTestMultiNic-server-1117384226',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-1117384226',id=79,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:43:12Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='13c0d93976f745dba4ab050770ccaae6',ramdisk_id='',reservation_id='r-w391cf92',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestMultiNic-1634662961',owner_user_name='tempest-ServersTestMultiNic-1634662961-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:43:12Z,user_data=None,user_id='6a00517a957e4ceb8564cbf1dfa15ee2',uuid=8cd92082-94e5-46f3-992f-afb6b04a3801,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "531c124f-8aea-41ea-bdb8-428b944ad2a5", "address": "fa:16:3e:cf:24:d8", "network": {"id": "49c2fa90-1bfc-42c2-8a83-e0a4d3888265", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-551973941", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.222", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13c0d93976f745dba4ab050770ccaae6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap531c124f-8a", "ovs_interfaceid": "531c124f-8aea-41ea-bdb8-428b944ad2a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.287 226890 DEBUG nova.network.os_vif_util [None req-c57d7971-bbf0-4e19-87cc-f0e72faab993 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Converting VIF {"id": "531c124f-8aea-41ea-bdb8-428b944ad2a5", "address": "fa:16:3e:cf:24:d8", "network": {"id": "49c2fa90-1bfc-42c2-8a83-e0a4d3888265", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-551973941", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.222", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13c0d93976f745dba4ab050770ccaae6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap531c124f-8a", "ovs_interfaceid": "531c124f-8aea-41ea-bdb8-428b944ad2a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.288 226890 DEBUG nova.network.os_vif_util [None req-c57d7971-bbf0-4e19-87cc-f0e72faab993 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cf:24:d8,bridge_name='br-int',has_traffic_filtering=True,id=531c124f-8aea-41ea-bdb8-428b944ad2a5,network=Network(49c2fa90-1bfc-42c2-8a83-e0a4d3888265),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap531c124f-8a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.288 226890 DEBUG os_vif [None req-c57d7971-bbf0-4e19-87cc-f0e72faab993 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cf:24:d8,bridge_name='br-int',has_traffic_filtering=True,id=531c124f-8aea-41ea-bdb8-428b944ad2a5,network=Network(49c2fa90-1bfc-42c2-8a83-e0a4d3888265),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap531c124f-8a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.290 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.290 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap531c124f-8a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.292 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.295 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.297 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.301 226890 INFO os_vif [None req-c57d7971-bbf0-4e19-87cc-f0e72faab993 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cf:24:d8,bridge_name='br-int',has_traffic_filtering=True,id=531c124f-8aea-41ea-bdb8-428b944ad2a5,network=Network(49c2fa90-1bfc-42c2-8a83-e0a4d3888265),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap531c124f-8a')#033[00m
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.302 226890 DEBUG nova.virt.libvirt.vif [None req-c57d7971-bbf0-4e19-87cc-f0e72faab993 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:42:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-1117384226',display_name='tempest-ServersTestMultiNic-server-1117384226',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-1117384226',id=79,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:43:12Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='13c0d93976f745dba4ab050770ccaae6',ramdisk_id='',reservation_id='r-w391cf92',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestMultiNic-1634662961',owner_user_name='tempest-ServersTestMultiNic-1634662961-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:43:12Z,user_data=None,user_id='6a00517a957e4ceb8564cbf1dfa15ee2',uuid=8cd92082-94e5-46f3-992f-afb6b04a3801,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9e71fac0-8593-43ea-8364-0ad2c9111d03", "address": "fa:16:3e:cd:42:db", "network": {"id": "e1c2787c-8a6c-4f83-bc72-392ade5fb67f", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1156837827", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.39", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13c0d93976f745dba4ab050770ccaae6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e71fac0-85", "ovs_interfaceid": "9e71fac0-8593-43ea-8364-0ad2c9111d03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.303 226890 DEBUG nova.network.os_vif_util [None req-c57d7971-bbf0-4e19-87cc-f0e72faab993 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Converting VIF {"id": "9e71fac0-8593-43ea-8364-0ad2c9111d03", "address": "fa:16:3e:cd:42:db", "network": {"id": "e1c2787c-8a6c-4f83-bc72-392ade5fb67f", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1156837827", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.39", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13c0d93976f745dba4ab050770ccaae6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e71fac0-85", "ovs_interfaceid": "9e71fac0-8593-43ea-8364-0ad2c9111d03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.304 226890 DEBUG nova.network.os_vif_util [None req-c57d7971-bbf0-4e19-87cc-f0e72faab993 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cd:42:db,bridge_name='br-int',has_traffic_filtering=True,id=9e71fac0-8593-43ea-8364-0ad2c9111d03,network=Network(e1c2787c-8a6c-4f83-bc72-392ade5fb67f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9e71fac0-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.304 226890 DEBUG os_vif [None req-c57d7971-bbf0-4e19-87cc-f0e72faab993 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cd:42:db,bridge_name='br-int',has_traffic_filtering=True,id=9e71fac0-8593-43ea-8364-0ad2c9111d03,network=Network(e1c2787c-8a6c-4f83-bc72-392ade5fb67f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9e71fac0-85') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.305 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.306 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9e71fac0-85, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.308 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:14 np0005588920 podman[255164]: 2026-01-20 14:43:14.309708505 +0000 UTC m=+0.116555150 container cleanup 3d1b65036365ec9ac99574c223595b26f81e77d6087d66e6e02b6892c78b941e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-49c2fa90-1bfc-42c2-8a83-e0a4d3888265, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.311 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.313 226890 INFO os_vif [None req-c57d7971-bbf0-4e19-87cc-f0e72faab993 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cd:42:db,bridge_name='br-int',has_traffic_filtering=True,id=9e71fac0-8593-43ea-8364-0ad2c9111d03,network=Network(e1c2787c-8a6c-4f83-bc72-392ade5fb67f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9e71fac0-85')#033[00m
Jan 20 09:43:14 np0005588920 systemd[1]: libpod-conmon-3d1b65036365ec9ac99574c223595b26f81e77d6087d66e6e02b6892c78b941e.scope: Deactivated successfully.
Jan 20 09:43:14 np0005588920 podman[255216]: 2026-01-20 14:43:14.389124839 +0000 UTC m=+0.045970983 container remove 3d1b65036365ec9ac99574c223595b26f81e77d6087d66e6e02b6892c78b941e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-49c2fa90-1bfc-42c2-8a83-e0a4d3888265, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:43:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:14.395 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c390def7-b1fe-498c-964c-d67f317f3fd6]: (4, ('Tue Jan 20 02:43:14 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-49c2fa90-1bfc-42c2-8a83-e0a4d3888265 (3d1b65036365ec9ac99574c223595b26f81e77d6087d66e6e02b6892c78b941e)\n3d1b65036365ec9ac99574c223595b26f81e77d6087d66e6e02b6892c78b941e\nTue Jan 20 02:43:14 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-49c2fa90-1bfc-42c2-8a83-e0a4d3888265 (3d1b65036365ec9ac99574c223595b26f81e77d6087d66e6e02b6892c78b941e)\n3d1b65036365ec9ac99574c223595b26f81e77d6087d66e6e02b6892c78b941e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:14.397 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[60cfc48d-b95d-4911-9856-3097131de60f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:14.399 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap49c2fa90-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.401 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:14 np0005588920 kernel: tap49c2fa90-10: left promiscuous mode
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.415 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:14.418 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a98c289b-e4d3-468e-8af3-f1c5e50818d4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:14.432 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[545947f2-a86c-4cc0-809b-85d55cb89ea1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:14.434 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[309b7b11-9c76-4224-91c9-a89191f9e808]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:14.451 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[17414767-42e2-44a9-8d9f-7e7262347fb6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 523257, 'reachable_time': 26985, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 255246, 'error': None, 'target': 'ovnmeta-49c2fa90-1bfc-42c2-8a83-e0a4d3888265', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:14 np0005588920 systemd[1]: run-netns-ovnmeta\x2d49c2fa90\x2d1bfc\x2d42c2\x2d8a83\x2de0a4d3888265.mount: Deactivated successfully.
Jan 20 09:43:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:14.455 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-49c2fa90-1bfc-42c2-8a83-e0a4d3888265 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:43:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:14.455 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[19784fa8-34ac-4675-b366-72ec4afb1e98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:14.456 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 9e71fac0-8593-43ea-8364-0ad2c9111d03 in datapath e1c2787c-8a6c-4f83-bc72-392ade5fb67f unbound from our chassis#033[00m
Jan 20 09:43:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:14.458 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e1c2787c-8a6c-4f83-bc72-392ade5fb67f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:43:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:14.459 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[4b0e46cf-ca32-4c47-96a1-e7e5a1986982]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:14.459 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e1c2787c-8a6c-4f83-bc72-392ade5fb67f namespace which is not needed anymore#033[00m
Jan 20 09:43:14 np0005588920 neutron-haproxy-ovnmeta-e1c2787c-8a6c-4f83-bc72-392ade5fb67f[255120]: [NOTICE]   (255124) : haproxy version is 2.8.14-c23fe91
Jan 20 09:43:14 np0005588920 neutron-haproxy-ovnmeta-e1c2787c-8a6c-4f83-bc72-392ade5fb67f[255120]: [NOTICE]   (255124) : path to executable is /usr/sbin/haproxy
Jan 20 09:43:14 np0005588920 neutron-haproxy-ovnmeta-e1c2787c-8a6c-4f83-bc72-392ade5fb67f[255120]: [WARNING]  (255124) : Exiting Master process...
Jan 20 09:43:14 np0005588920 neutron-haproxy-ovnmeta-e1c2787c-8a6c-4f83-bc72-392ade5fb67f[255120]: [WARNING]  (255124) : Exiting Master process...
Jan 20 09:43:14 np0005588920 neutron-haproxy-ovnmeta-e1c2787c-8a6c-4f83-bc72-392ade5fb67f[255120]: [ALERT]    (255124) : Current worker (255126) exited with code 143 (Terminated)
Jan 20 09:43:14 np0005588920 neutron-haproxy-ovnmeta-e1c2787c-8a6c-4f83-bc72-392ade5fb67f[255120]: [WARNING]  (255124) : All workers exited. Exiting... (0)
Jan 20 09:43:14 np0005588920 systemd[1]: libpod-60437d9ab638ff02ce9712c88853d462c003f2217eea7b26bf149112f4db6668.scope: Deactivated successfully.
Jan 20 09:43:14 np0005588920 podman[255265]: 2026-01-20 14:43:14.57930912 +0000 UTC m=+0.042483105 container died 60437d9ab638ff02ce9712c88853d462c003f2217eea7b26bf149112f4db6668 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e1c2787c-8a6c-4f83-bc72-392ade5fb67f, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 20 09:43:14 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-60437d9ab638ff02ce9712c88853d462c003f2217eea7b26bf149112f4db6668-userdata-shm.mount: Deactivated successfully.
Jan 20 09:43:14 np0005588920 systemd[1]: var-lib-containers-storage-overlay-6bf69571189c889700360802a93965c3317cf6d7e8e46878885f3a01cfaf9994-merged.mount: Deactivated successfully.
Jan 20 09:43:14 np0005588920 podman[255265]: 2026-01-20 14:43:14.61337477 +0000 UTC m=+0.076548765 container cleanup 60437d9ab638ff02ce9712c88853d462c003f2217eea7b26bf149112f4db6668 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e1c2787c-8a6c-4f83-bc72-392ade5fb67f, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 20 09:43:14 np0005588920 systemd[1]: libpod-conmon-60437d9ab638ff02ce9712c88853d462c003f2217eea7b26bf149112f4db6668.scope: Deactivated successfully.
Jan 20 09:43:14 np0005588920 podman[255298]: 2026-01-20 14:43:14.678167276 +0000 UTC m=+0.041606091 container remove 60437d9ab638ff02ce9712c88853d462c003f2217eea7b26bf149112f4db6668 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e1c2787c-8a6c-4f83-bc72-392ade5fb67f, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.679 226890 INFO nova.virt.libvirt.driver [None req-c57d7971-bbf0-4e19-87cc-f0e72faab993 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Deleting instance files /var/lib/nova/instances/8cd92082-94e5-46f3-992f-afb6b04a3801_del#033[00m
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.679 226890 INFO nova.virt.libvirt.driver [None req-c57d7971-bbf0-4e19-87cc-f0e72faab993 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Deletion of /var/lib/nova/instances/8cd92082-94e5-46f3-992f-afb6b04a3801_del complete#033[00m
Jan 20 09:43:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:14.683 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[26724142-5c3a-4770-9097-226a34ffcce3]: (4, ('Tue Jan 20 02:43:14 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-e1c2787c-8a6c-4f83-bc72-392ade5fb67f (60437d9ab638ff02ce9712c88853d462c003f2217eea7b26bf149112f4db6668)\n60437d9ab638ff02ce9712c88853d462c003f2217eea7b26bf149112f4db6668\nTue Jan 20 02:43:14 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-e1c2787c-8a6c-4f83-bc72-392ade5fb67f (60437d9ab638ff02ce9712c88853d462c003f2217eea7b26bf149112f4db6668)\n60437d9ab638ff02ce9712c88853d462c003f2217eea7b26bf149112f4db6668\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:14.684 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[7db34872-d42d-4850-8421-89a633481102]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:14.685 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape1c2787c-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:43:14 np0005588920 kernel: tape1c2787c-80: left promiscuous mode
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.687 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.700 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:14.702 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[04816129-5f4a-4b90-b6b1-52422dabc29a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.711 226890 DEBUG nova.compute.manager [req-574f4a1f-765d-44fa-8b09-01c1c5be9a03 req-976d640a-57e5-4146-9da6-ecadb28276e3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Received event network-vif-unplugged-531c124f-8aea-41ea-bdb8-428b944ad2a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.711 226890 DEBUG oslo_concurrency.lockutils [req-574f4a1f-765d-44fa-8b09-01c1c5be9a03 req-976d640a-57e5-4146-9da6-ecadb28276e3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "8cd92082-94e5-46f3-992f-afb6b04a3801-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.711 226890 DEBUG oslo_concurrency.lockutils [req-574f4a1f-765d-44fa-8b09-01c1c5be9a03 req-976d640a-57e5-4146-9da6-ecadb28276e3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "8cd92082-94e5-46f3-992f-afb6b04a3801-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.712 226890 DEBUG oslo_concurrency.lockutils [req-574f4a1f-765d-44fa-8b09-01c1c5be9a03 req-976d640a-57e5-4146-9da6-ecadb28276e3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "8cd92082-94e5-46f3-992f-afb6b04a3801-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.712 226890 DEBUG nova.compute.manager [req-574f4a1f-765d-44fa-8b09-01c1c5be9a03 req-976d640a-57e5-4146-9da6-ecadb28276e3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] No waiting events found dispatching network-vif-unplugged-531c124f-8aea-41ea-bdb8-428b944ad2a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.712 226890 DEBUG nova.compute.manager [req-574f4a1f-765d-44fa-8b09-01c1c5be9a03 req-976d640a-57e5-4146-9da6-ecadb28276e3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Received event network-vif-unplugged-531c124f-8aea-41ea-bdb8-428b944ad2a5 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.712 226890 DEBUG nova.compute.manager [req-574f4a1f-765d-44fa-8b09-01c1c5be9a03 req-976d640a-57e5-4146-9da6-ecadb28276e3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Received event network-vif-plugged-531c124f-8aea-41ea-bdb8-428b944ad2a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.713 226890 DEBUG oslo_concurrency.lockutils [req-574f4a1f-765d-44fa-8b09-01c1c5be9a03 req-976d640a-57e5-4146-9da6-ecadb28276e3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "8cd92082-94e5-46f3-992f-afb6b04a3801-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.713 226890 DEBUG oslo_concurrency.lockutils [req-574f4a1f-765d-44fa-8b09-01c1c5be9a03 req-976d640a-57e5-4146-9da6-ecadb28276e3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "8cd92082-94e5-46f3-992f-afb6b04a3801-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.713 226890 DEBUG oslo_concurrency.lockutils [req-574f4a1f-765d-44fa-8b09-01c1c5be9a03 req-976d640a-57e5-4146-9da6-ecadb28276e3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "8cd92082-94e5-46f3-992f-afb6b04a3801-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.713 226890 DEBUG nova.compute.manager [req-574f4a1f-765d-44fa-8b09-01c1c5be9a03 req-976d640a-57e5-4146-9da6-ecadb28276e3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] No waiting events found dispatching network-vif-plugged-531c124f-8aea-41ea-bdb8-428b944ad2a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.713 226890 WARNING nova.compute.manager [req-574f4a1f-765d-44fa-8b09-01c1c5be9a03 req-976d640a-57e5-4146-9da6-ecadb28276e3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Received unexpected event network-vif-plugged-531c124f-8aea-41ea-bdb8-428b944ad2a5 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:43:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:14.719 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[6129d9fe-2355-4035-a19d-a30936eb36af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:14.721 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[9ac310f3-4f47-4fe1-b589-fe7bdf7361df]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:14.737 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[3975f4f7-e68b-4bb1-a038-7c9e1bf45634]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 523465, 'reachable_time': 30470, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 255313, 'error': None, 'target': 'ovnmeta-e1c2787c-8a6c-4f83-bc72-392ade5fb67f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:14.739 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e1c2787c-8a6c-4f83-bc72-392ade5fb67f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:43:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:14.739 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[3802bc7b-9764-4136-a232-48161aab2dee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:14 np0005588920 systemd[1]: run-netns-ovnmeta\x2de1c2787c\x2d8a6c\x2d4f83\x2dbc72\x2d392ade5fb67f.mount: Deactivated successfully.
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.771 226890 INFO nova.compute.manager [None req-c57d7971-bbf0-4e19-87cc-f0e72faab993 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Took 0.96 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.773 226890 DEBUG oslo.service.loopingcall [None req-c57d7971-bbf0-4e19-87cc-f0e72faab993 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.774 226890 DEBUG nova.compute.manager [-] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:43:14 np0005588920 nova_compute[226886]: 2026-01-20 14:43:14.774 226890 DEBUG nova.network.neutron [-] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:43:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:15.028 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:43:15 np0005588920 nova_compute[226886]: 2026-01-20 14:43:15.028 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:15.029 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:43:15 np0005588920 nova_compute[226886]: 2026-01-20 14:43:15.225 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:15.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:16.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:16 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:43:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:16.446 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:43:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:16.447 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:43:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:16.447 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:16 np0005588920 nova_compute[226886]: 2026-01-20 14:43:16.578 226890 DEBUG nova.compute.manager [req-dbd2cf3b-503f-44db-aaae-6f706538d49d req-eb947aeb-7735-48ed-8645-b2e45e6d4167 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Received event network-vif-unplugged-9e71fac0-8593-43ea-8364-0ad2c9111d03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:43:16 np0005588920 nova_compute[226886]: 2026-01-20 14:43:16.579 226890 DEBUG oslo_concurrency.lockutils [req-dbd2cf3b-503f-44db-aaae-6f706538d49d req-eb947aeb-7735-48ed-8645-b2e45e6d4167 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "8cd92082-94e5-46f3-992f-afb6b04a3801-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:43:16 np0005588920 nova_compute[226886]: 2026-01-20 14:43:16.579 226890 DEBUG oslo_concurrency.lockutils [req-dbd2cf3b-503f-44db-aaae-6f706538d49d req-eb947aeb-7735-48ed-8645-b2e45e6d4167 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "8cd92082-94e5-46f3-992f-afb6b04a3801-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:43:16 np0005588920 nova_compute[226886]: 2026-01-20 14:43:16.579 226890 DEBUG oslo_concurrency.lockutils [req-dbd2cf3b-503f-44db-aaae-6f706538d49d req-eb947aeb-7735-48ed-8645-b2e45e6d4167 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "8cd92082-94e5-46f3-992f-afb6b04a3801-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:16 np0005588920 nova_compute[226886]: 2026-01-20 14:43:16.579 226890 DEBUG nova.compute.manager [req-dbd2cf3b-503f-44db-aaae-6f706538d49d req-eb947aeb-7735-48ed-8645-b2e45e6d4167 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] No waiting events found dispatching network-vif-unplugged-9e71fac0-8593-43ea-8364-0ad2c9111d03 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:43:16 np0005588920 nova_compute[226886]: 2026-01-20 14:43:16.579 226890 DEBUG nova.compute.manager [req-dbd2cf3b-503f-44db-aaae-6f706538d49d req-eb947aeb-7735-48ed-8645-b2e45e6d4167 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Received event network-vif-unplugged-9e71fac0-8593-43ea-8364-0ad2c9111d03 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:43:16 np0005588920 nova_compute[226886]: 2026-01-20 14:43:16.580 226890 DEBUG nova.compute.manager [req-dbd2cf3b-503f-44db-aaae-6f706538d49d req-eb947aeb-7735-48ed-8645-b2e45e6d4167 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Received event network-vif-plugged-9e71fac0-8593-43ea-8364-0ad2c9111d03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:43:16 np0005588920 nova_compute[226886]: 2026-01-20 14:43:16.580 226890 DEBUG oslo_concurrency.lockutils [req-dbd2cf3b-503f-44db-aaae-6f706538d49d req-eb947aeb-7735-48ed-8645-b2e45e6d4167 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "8cd92082-94e5-46f3-992f-afb6b04a3801-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:43:16 np0005588920 nova_compute[226886]: 2026-01-20 14:43:16.580 226890 DEBUG oslo_concurrency.lockutils [req-dbd2cf3b-503f-44db-aaae-6f706538d49d req-eb947aeb-7735-48ed-8645-b2e45e6d4167 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "8cd92082-94e5-46f3-992f-afb6b04a3801-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:43:16 np0005588920 nova_compute[226886]: 2026-01-20 14:43:16.580 226890 DEBUG oslo_concurrency.lockutils [req-dbd2cf3b-503f-44db-aaae-6f706538d49d req-eb947aeb-7735-48ed-8645-b2e45e6d4167 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "8cd92082-94e5-46f3-992f-afb6b04a3801-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:16 np0005588920 nova_compute[226886]: 2026-01-20 14:43:16.580 226890 DEBUG nova.compute.manager [req-dbd2cf3b-503f-44db-aaae-6f706538d49d req-eb947aeb-7735-48ed-8645-b2e45e6d4167 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] No waiting events found dispatching network-vif-plugged-9e71fac0-8593-43ea-8364-0ad2c9111d03 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:43:16 np0005588920 nova_compute[226886]: 2026-01-20 14:43:16.580 226890 WARNING nova.compute.manager [req-dbd2cf3b-503f-44db-aaae-6f706538d49d req-eb947aeb-7735-48ed-8645-b2e45e6d4167 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Received unexpected event network-vif-plugged-9e71fac0-8593-43ea-8364-0ad2c9111d03 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:43:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:17.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:17 np0005588920 nova_compute[226886]: 2026-01-20 14:43:17.515 226890 DEBUG nova.network.neutron [-] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:43:17 np0005588920 nova_compute[226886]: 2026-01-20 14:43:17.618 226890 INFO nova.compute.manager [-] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Took 2.84 seconds to deallocate network for instance.#033[00m
Jan 20 09:43:17 np0005588920 nova_compute[226886]: 2026-01-20 14:43:17.714 226890 DEBUG oslo_concurrency.lockutils [None req-c57d7971-bbf0-4e19-87cc-f0e72faab993 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:43:17 np0005588920 nova_compute[226886]: 2026-01-20 14:43:17.715 226890 DEBUG oslo_concurrency.lockutils [None req-c57d7971-bbf0-4e19-87cc-f0e72faab993 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:43:17 np0005588920 nova_compute[226886]: 2026-01-20 14:43:17.805 226890 DEBUG oslo_concurrency.processutils [None req-c57d7971-bbf0-4e19-87cc-f0e72faab993 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:43:18 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:18.030 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:43:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:43:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:18.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:43:18 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:43:18 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3490191056' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:43:18 np0005588920 nova_compute[226886]: 2026-01-20 14:43:18.250 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768920183.2496107, 525b8695-a4df-46c5-875a-42d3b18b78be => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:43:18 np0005588920 nova_compute[226886]: 2026-01-20 14:43:18.251 226890 INFO nova.compute.manager [-] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:43:18 np0005588920 nova_compute[226886]: 2026-01-20 14:43:18.264 226890 DEBUG oslo_concurrency.processutils [None req-c57d7971-bbf0-4e19-87cc-f0e72faab993 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:43:18 np0005588920 nova_compute[226886]: 2026-01-20 14:43:18.270 226890 DEBUG nova.compute.manager [None req-fee87220-d473-462e-9088-362bbe871b0a - - - - - -] [instance: 525b8695-a4df-46c5-875a-42d3b18b78be] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:43:18 np0005588920 nova_compute[226886]: 2026-01-20 14:43:18.274 226890 DEBUG nova.compute.provider_tree [None req-c57d7971-bbf0-4e19-87cc-f0e72faab993 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:43:18 np0005588920 nova_compute[226886]: 2026-01-20 14:43:18.309 226890 DEBUG nova.scheduler.client.report [None req-c57d7971-bbf0-4e19-87cc-f0e72faab993 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:43:18 np0005588920 nova_compute[226886]: 2026-01-20 14:43:18.345 226890 DEBUG oslo_concurrency.lockutils [None req-c57d7971-bbf0-4e19-87cc-f0e72faab993 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.631s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:18 np0005588920 nova_compute[226886]: 2026-01-20 14:43:18.369 226890 INFO nova.scheduler.client.report [None req-c57d7971-bbf0-4e19-87cc-f0e72faab993 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Deleted allocations for instance 8cd92082-94e5-46f3-992f-afb6b04a3801#033[00m
Jan 20 09:43:18 np0005588920 nova_compute[226886]: 2026-01-20 14:43:18.439 226890 DEBUG oslo_concurrency.lockutils [None req-c57d7971-bbf0-4e19-87cc-f0e72faab993 6a00517a957e4ceb8564cbf1dfa15ee2 13c0d93976f745dba4ab050770ccaae6 - - default default] Lock "8cd92082-94e5-46f3-992f-afb6b04a3801" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.635s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:18 np0005588920 nova_compute[226886]: 2026-01-20 14:43:18.530 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:19 np0005588920 nova_compute[226886]: 2026-01-20 14:43:19.309 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:19.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:19 np0005588920 nova_compute[226886]: 2026-01-20 14:43:19.631 226890 DEBUG nova.compute.manager [req-20cbf48d-cd14-4c4d-be2e-409d5abd4d46 req-0078c1c9-78c2-4c7f-8084-f93bd3812f19 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Received event network-vif-deleted-531c124f-8aea-41ea-bdb8-428b944ad2a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:43:19 np0005588920 nova_compute[226886]: 2026-01-20 14:43:19.632 226890 DEBUG nova.compute.manager [req-20cbf48d-cd14-4c4d-be2e-409d5abd4d46 req-0078c1c9-78c2-4c7f-8084-f93bd3812f19 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Received event network-vif-deleted-9e71fac0-8593-43ea-8364-0ad2c9111d03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:43:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:20.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:20 np0005588920 nova_compute[226886]: 2026-01-20 14:43:20.227 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:21 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:43:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:21.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:22.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:23.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:24.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:24 np0005588920 nova_compute[226886]: 2026-01-20 14:43:24.311 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:24 np0005588920 nova_compute[226886]: 2026-01-20 14:43:24.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:43:24 np0005588920 nova_compute[226886]: 2026-01-20 14:43:24.726 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:43:24 np0005588920 nova_compute[226886]: 2026-01-20 14:43:24.726 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 09:43:24 np0005588920 nova_compute[226886]: 2026-01-20 14:43:24.784 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 09:43:25 np0005588920 nova_compute[226886]: 2026-01-20 14:43:25.228 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:25.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:25 np0005588920 ceph-mgr[77507]: client.0 ms_handle_reset on v2:192.168.122.100:6800/2542147622
Jan 20 09:43:25 np0005588920 nova_compute[226886]: 2026-01-20 14:43:25.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:43:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:43:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:26.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:43:26 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:43:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:43:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:27.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:43:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:28.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:29 np0005588920 nova_compute[226886]: 2026-01-20 14:43:29.259 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768920194.2582843, 8cd92082-94e5-46f3-992f-afb6b04a3801 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:43:29 np0005588920 nova_compute[226886]: 2026-01-20 14:43:29.260 226890 INFO nova.compute.manager [-] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:43:29 np0005588920 nova_compute[226886]: 2026-01-20 14:43:29.290 226890 DEBUG nova.compute.manager [None req-625f10f6-04c9-4ed2-b885-0f7eb6cc568a - - - - - -] [instance: 8cd92082-94e5-46f3-992f-afb6b04a3801] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:43:29 np0005588920 nova_compute[226886]: 2026-01-20 14:43:29.314 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:29.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:29 np0005588920 nova_compute[226886]: 2026-01-20 14:43:29.743 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:43:29 np0005588920 nova_compute[226886]: 2026-01-20 14:43:29.745 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:43:29 np0005588920 nova_compute[226886]: 2026-01-20 14:43:29.745 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:43:29 np0005588920 nova_compute[226886]: 2026-01-20 14:43:29.746 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 20 09:43:30 np0005588920 podman[255336]: 2026-01-20 14:43:30.048050635 +0000 UTC m=+0.130592582 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 20 09:43:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:30.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:30 np0005588920 nova_compute[226886]: 2026-01-20 14:43:30.232 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:31 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:43:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:43:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:31.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:43:31 np0005588920 nova_compute[226886]: 2026-01-20 14:43:31.758 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:43:31 np0005588920 nova_compute[226886]: 2026-01-20 14:43:31.759 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:43:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:32.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:32 np0005588920 nova_compute[226886]: 2026-01-20 14:43:32.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:43:32 np0005588920 nova_compute[226886]: 2026-01-20 14:43:32.786 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:43:32 np0005588920 nova_compute[226886]: 2026-01-20 14:43:32.786 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:43:32 np0005588920 nova_compute[226886]: 2026-01-20 14:43:32.786 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:32 np0005588920 nova_compute[226886]: 2026-01-20 14:43:32.787 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:43:32 np0005588920 nova_compute[226886]: 2026-01-20 14:43:32.787 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:43:33 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:43:33 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/23827069' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:43:33 np0005588920 nova_compute[226886]: 2026-01-20 14:43:33.214 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:43:33 np0005588920 nova_compute[226886]: 2026-01-20 14:43:33.344 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:43:33 np0005588920 nova_compute[226886]: 2026-01-20 14:43:33.345 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4628MB free_disk=20.967357635498047GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:43:33 np0005588920 nova_compute[226886]: 2026-01-20 14:43:33.346 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:43:33 np0005588920 nova_compute[226886]: 2026-01-20 14:43:33.346 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:43:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:33.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:33 np0005588920 nova_compute[226886]: 2026-01-20 14:43:33.657 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:43:33 np0005588920 nova_compute[226886]: 2026-01-20 14:43:33.658 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:43:33 np0005588920 nova_compute[226886]: 2026-01-20 14:43:33.867 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Refreshing inventories for resource provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 20 09:43:34 np0005588920 nova_compute[226886]: 2026-01-20 14:43:34.052 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Updating ProviderTree inventory for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 20 09:43:34 np0005588920 nova_compute[226886]: 2026-01-20 14:43:34.053 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Updating inventory in ProviderTree for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 20 09:43:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:34.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:34 np0005588920 nova_compute[226886]: 2026-01-20 14:43:34.100 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Refreshing aggregate associations for resource provider ff38e91c-3320-4831-90ac-bcffc89ba7b6, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 20 09:43:34 np0005588920 nova_compute[226886]: 2026-01-20 14:43:34.153 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Refreshing trait associations for resource provider ff38e91c-3320-4831-90ac-bcffc89ba7b6, traits: COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_DEVICE_TAGGING,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 20 09:43:34 np0005588920 nova_compute[226886]: 2026-01-20 14:43:34.198 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:43:34 np0005588920 nova_compute[226886]: 2026-01-20 14:43:34.316 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:34 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:43:34 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2644878239' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:43:34 np0005588920 nova_compute[226886]: 2026-01-20 14:43:34.671 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:43:34 np0005588920 nova_compute[226886]: 2026-01-20 14:43:34.677 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:43:34 np0005588920 nova_compute[226886]: 2026-01-20 14:43:34.705 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:43:34 np0005588920 nova_compute[226886]: 2026-01-20 14:43:34.763 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:43:34 np0005588920 nova_compute[226886]: 2026-01-20 14:43:34.763 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.418s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:34 np0005588920 nova_compute[226886]: 2026-01-20 14:43:34.764 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:43:34 np0005588920 nova_compute[226886]: 2026-01-20 14:43:34.764 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 20 09:43:34 np0005588920 nova_compute[226886]: 2026-01-20 14:43:34.787 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 20 09:43:35 np0005588920 nova_compute[226886]: 2026-01-20 14:43:35.235 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:43:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:35.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:43:35 np0005588920 nova_compute[226886]: 2026-01-20 14:43:35.788 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:43:35 np0005588920 nova_compute[226886]: 2026-01-20 14:43:35.788 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:43:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:36.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:36 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:43:36 np0005588920 nova_compute[226886]: 2026-01-20 14:43:36.721 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:43:37 np0005588920 nova_compute[226886]: 2026-01-20 14:43:37.267 226890 DEBUG oslo_concurrency.lockutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Acquiring lock "02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:43:37 np0005588920 nova_compute[226886]: 2026-01-20 14:43:37.267 226890 DEBUG oslo_concurrency.lockutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Lock "02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:43:37 np0005588920 nova_compute[226886]: 2026-01-20 14:43:37.314 226890 DEBUG nova.compute.manager [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:43:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:37.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:37 np0005588920 nova_compute[226886]: 2026-01-20 14:43:37.401 226890 DEBUG oslo_concurrency.lockutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:43:37 np0005588920 nova_compute[226886]: 2026-01-20 14:43:37.402 226890 DEBUG oslo_concurrency.lockutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:43:37 np0005588920 nova_compute[226886]: 2026-01-20 14:43:37.413 226890 DEBUG nova.virt.hardware [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:43:37 np0005588920 nova_compute[226886]: 2026-01-20 14:43:37.413 226890 INFO nova.compute.claims [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 09:43:37 np0005588920 nova_compute[226886]: 2026-01-20 14:43:37.673 226890 DEBUG oslo_concurrency.processutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:43:37 np0005588920 nova_compute[226886]: 2026-01-20 14:43:37.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:43:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:38.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:38 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:43:38 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4057377900' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:43:38 np0005588920 nova_compute[226886]: 2026-01-20 14:43:38.131 226890 DEBUG oslo_concurrency.processutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:43:38 np0005588920 nova_compute[226886]: 2026-01-20 14:43:38.136 226890 DEBUG nova.compute.provider_tree [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:43:38 np0005588920 nova_compute[226886]: 2026-01-20 14:43:38.156 226890 DEBUG nova.scheduler.client.report [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:43:38 np0005588920 nova_compute[226886]: 2026-01-20 14:43:38.185 226890 DEBUG oslo_concurrency.lockutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.783s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:38 np0005588920 nova_compute[226886]: 2026-01-20 14:43:38.186 226890 DEBUG nova.compute.manager [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:43:38 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:43:38 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/23348796' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:43:38 np0005588920 nova_compute[226886]: 2026-01-20 14:43:38.290 226890 DEBUG nova.compute.manager [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:43:38 np0005588920 nova_compute[226886]: 2026-01-20 14:43:38.291 226890 DEBUG nova.network.neutron [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:43:38 np0005588920 nova_compute[226886]: 2026-01-20 14:43:38.335 226890 INFO nova.virt.libvirt.driver [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:43:38 np0005588920 nova_compute[226886]: 2026-01-20 14:43:38.392 226890 DEBUG nova.compute.manager [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:43:38 np0005588920 nova_compute[226886]: 2026-01-20 14:43:38.533 226890 DEBUG nova.compute.manager [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 20 09:43:38 np0005588920 nova_compute[226886]: 2026-01-20 14:43:38.534 226890 DEBUG nova.virt.libvirt.driver [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 20 09:43:38 np0005588920 nova_compute[226886]: 2026-01-20 14:43:38.534 226890 INFO nova.virt.libvirt.driver [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Creating image(s)
Jan 20 09:43:38 np0005588920 nova_compute[226886]: 2026-01-20 14:43:38.565 226890 DEBUG nova.storage.rbd_utils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] rbd image 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:43:38 np0005588920 nova_compute[226886]: 2026-01-20 14:43:38.593 226890 DEBUG nova.storage.rbd_utils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] rbd image 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:43:38 np0005588920 nova_compute[226886]: 2026-01-20 14:43:38.625 226890 DEBUG nova.storage.rbd_utils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] rbd image 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:43:38 np0005588920 nova_compute[226886]: 2026-01-20 14:43:38.630 226890 DEBUG oslo_concurrency.processutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:43:38 np0005588920 nova_compute[226886]: 2026-01-20 14:43:38.696 226890 DEBUG oslo_concurrency.processutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:43:38 np0005588920 nova_compute[226886]: 2026-01-20 14:43:38.697 226890 DEBUG oslo_concurrency.lockutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:43:38 np0005588920 nova_compute[226886]: 2026-01-20 14:43:38.698 226890 DEBUG oslo_concurrency.lockutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:43:38 np0005588920 nova_compute[226886]: 2026-01-20 14:43:38.699 226890 DEBUG oslo_concurrency.lockutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:43:38 np0005588920 nova_compute[226886]: 2026-01-20 14:43:38.727 226890 DEBUG nova.storage.rbd_utils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] rbd image 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:43:38 np0005588920 nova_compute[226886]: 2026-01-20 14:43:38.731 226890 DEBUG oslo_concurrency.processutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:43:38 np0005588920 nova_compute[226886]: 2026-01-20 14:43:38.951 226890 DEBUG nova.policy [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2975742546164cad937d13671d17108a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '28a523cfe06042ff96554913a78e1e3a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 20 09:43:39 np0005588920 nova_compute[226886]: 2026-01-20 14:43:39.085 226890 DEBUG oslo_concurrency.processutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.354s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:43:39 np0005588920 nova_compute[226886]: 2026-01-20 14:43:39.168 226890 DEBUG nova.storage.rbd_utils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] resizing rbd image 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 20 09:43:39 np0005588920 nova_compute[226886]: 2026-01-20 14:43:39.320 226890 DEBUG nova.objects.instance [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Lazy-loading 'migration_context' on Instance uuid 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 09:43:39 np0005588920 nova_compute[226886]: 2026-01-20 14:43:39.323 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:43:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:39.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:39 np0005588920 nova_compute[226886]: 2026-01-20 14:43:39.395 226890 DEBUG nova.virt.libvirt.driver [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 20 09:43:39 np0005588920 nova_compute[226886]: 2026-01-20 14:43:39.396 226890 DEBUG nova.virt.libvirt.driver [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Ensure instance console log exists: /var/lib/nova/instances/02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 20 09:43:39 np0005588920 nova_compute[226886]: 2026-01-20 14:43:39.396 226890 DEBUG oslo_concurrency.lockutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:43:39 np0005588920 nova_compute[226886]: 2026-01-20 14:43:39.397 226890 DEBUG oslo_concurrency.lockutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:43:39 np0005588920 nova_compute[226886]: 2026-01-20 14:43:39.397 226890 DEBUG oslo_concurrency.lockutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:43:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:40.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:40 np0005588920 nova_compute[226886]: 2026-01-20 14:43:40.279 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:43:41 np0005588920 nova_compute[226886]: 2026-01-20 14:43:41.036 226890 DEBUG nova.network.neutron [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Successfully created port: 5b25aa23-d016-4a23-85f7-5ba4a1159f2c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 20 09:43:41 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:43:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:41.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:41 np0005588920 podman[255596]: 2026-01-20 14:43:41.971768571 +0000 UTC m=+0.052626598 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 20 09:43:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:42.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:43:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:43.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:43:43 np0005588920 nova_compute[226886]: 2026-01-20 14:43:43.733 226890 DEBUG nova.network.neutron [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Successfully updated port: 5b25aa23-d016-4a23-85f7-5ba4a1159f2c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 20 09:43:43 np0005588920 nova_compute[226886]: 2026-01-20 14:43:43.762 226890 DEBUG oslo_concurrency.lockutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Acquiring lock "refresh_cache-02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 20 09:43:43 np0005588920 nova_compute[226886]: 2026-01-20 14:43:43.762 226890 DEBUG oslo_concurrency.lockutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Acquired lock "refresh_cache-02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 20 09:43:43 np0005588920 nova_compute[226886]: 2026-01-20 14:43:43.762 226890 DEBUG nova.network.neutron [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 20 09:43:43 np0005588920 nova_compute[226886]: 2026-01-20 14:43:43.993 226890 DEBUG nova.compute.manager [req-c0cfe1a9-4b19-4a58-a47e-e8cf764b526f req-34aa0b7d-22a5-42fd-904d-80dff64e3c3a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Received event network-changed-5b25aa23-d016-4a23-85f7-5ba4a1159f2c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 09:43:43 np0005588920 nova_compute[226886]: 2026-01-20 14:43:43.994 226890 DEBUG nova.compute.manager [req-c0cfe1a9-4b19-4a58-a47e-e8cf764b526f req-34aa0b7d-22a5-42fd-904d-80dff64e3c3a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Refreshing instance network info cache due to event network-changed-5b25aa23-d016-4a23-85f7-5ba4a1159f2c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 20 09:43:43 np0005588920 nova_compute[226886]: 2026-01-20 14:43:43.994 226890 DEBUG oslo_concurrency.lockutils [req-c0cfe1a9-4b19-4a58-a47e-e8cf764b526f req-34aa0b7d-22a5-42fd-904d-80dff64e3c3a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 20 09:43:44 np0005588920 nova_compute[226886]: 2026-01-20 14:43:44.088 226890 DEBUG nova.network.neutron [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 20 09:43:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:44.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:44 np0005588920 nova_compute[226886]: 2026-01-20 14:43:44.365 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:43:45 np0005588920 nova_compute[226886]: 2026-01-20 14:43:45.281 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:43:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:45.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:45 np0005588920 nova_compute[226886]: 2026-01-20 14:43:45.440 226890 DEBUG nova.network.neutron [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Updating instance_info_cache with network_info: [{"id": "5b25aa23-d016-4a23-85f7-5ba4a1159f2c", "address": "fa:16:3e:fd:7c:21", "network": {"id": "53d0b281-776f-4682-8aaf-098e1d364008", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1516883251-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "28a523cfe06042ff96554913a78e1e3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b25aa23-d0", "ovs_interfaceid": "5b25aa23-d016-4a23-85f7-5ba4a1159f2c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 20 09:43:45 np0005588920 nova_compute[226886]: 2026-01-20 14:43:45.473 226890 DEBUG oslo_concurrency.lockutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Releasing lock "refresh_cache-02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 20 09:43:45 np0005588920 nova_compute[226886]: 2026-01-20 14:43:45.473 226890 DEBUG nova.compute.manager [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Instance network_info: |[{"id": "5b25aa23-d016-4a23-85f7-5ba4a1159f2c", "address": "fa:16:3e:fd:7c:21", "network": {"id": "53d0b281-776f-4682-8aaf-098e1d364008", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1516883251-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "28a523cfe06042ff96554913a78e1e3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b25aa23-d0", "ovs_interfaceid": "5b25aa23-d016-4a23-85f7-5ba4a1159f2c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 20 09:43:45 np0005588920 nova_compute[226886]: 2026-01-20 14:43:45.474 226890 DEBUG oslo_concurrency.lockutils [req-c0cfe1a9-4b19-4a58-a47e-e8cf764b526f req-34aa0b7d-22a5-42fd-904d-80dff64e3c3a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 20 09:43:45 np0005588920 nova_compute[226886]: 2026-01-20 14:43:45.474 226890 DEBUG nova.network.neutron [req-c0cfe1a9-4b19-4a58-a47e-e8cf764b526f req-34aa0b7d-22a5-42fd-904d-80dff64e3c3a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Refreshing network info cache for port 5b25aa23-d016-4a23-85f7-5ba4a1159f2c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 20 09:43:45 np0005588920 nova_compute[226886]: 2026-01-20 14:43:45.478 226890 DEBUG nova.virt.libvirt.driver [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Start _get_guest_xml network_info=[{"id": "5b25aa23-d016-4a23-85f7-5ba4a1159f2c", "address": "fa:16:3e:fd:7c:21", "network": {"id": "53d0b281-776f-4682-8aaf-098e1d364008", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1516883251-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "28a523cfe06042ff96554913a78e1e3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b25aa23-d0", "ovs_interfaceid": "5b25aa23-d016-4a23-85f7-5ba4a1159f2c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 20 09:43:45 np0005588920 nova_compute[226886]: 2026-01-20 14:43:45.483 226890 WARNING nova.virt.libvirt.driver [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 20 09:43:45 np0005588920 nova_compute[226886]: 2026-01-20 14:43:45.490 226890 DEBUG nova.virt.libvirt.host [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 20 09:43:45 np0005588920 nova_compute[226886]: 2026-01-20 14:43:45.491 226890 DEBUG nova.virt.libvirt.host [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 20 09:43:45 np0005588920 nova_compute[226886]: 2026-01-20 14:43:45.499 226890 DEBUG nova.virt.libvirt.host [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 20 09:43:45 np0005588920 nova_compute[226886]: 2026-01-20 14:43:45.499 226890 DEBUG nova.virt.libvirt.host [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 20 09:43:45 np0005588920 nova_compute[226886]: 2026-01-20 14:43:45.501 226890 DEBUG nova.virt.libvirt.driver [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 20 09:43:45 np0005588920 nova_compute[226886]: 2026-01-20 14:43:45.501 226890 DEBUG nova.virt.hardware [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 20 09:43:45 np0005588920 nova_compute[226886]: 2026-01-20 14:43:45.502 226890 DEBUG nova.virt.hardware [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 20 09:43:45 np0005588920 nova_compute[226886]: 2026-01-20 14:43:45.502 226890 DEBUG nova.virt.hardware [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 20 09:43:45 np0005588920 nova_compute[226886]: 2026-01-20 14:43:45.502 226890 DEBUG nova.virt.hardware [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 20 09:43:45 np0005588920 nova_compute[226886]: 2026-01-20 14:43:45.503 226890 DEBUG nova.virt.hardware [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 20 09:43:45 np0005588920 nova_compute[226886]: 2026-01-20 14:43:45.503 226890 DEBUG nova.virt.hardware [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 20 09:43:45 np0005588920 nova_compute[226886]: 2026-01-20 14:43:45.503 226890 DEBUG nova.virt.hardware [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 20 09:43:45 np0005588920 nova_compute[226886]: 2026-01-20 14:43:45.504 226890 DEBUG nova.virt.hardware [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 20 09:43:45 np0005588920 nova_compute[226886]: 2026-01-20 14:43:45.504 226890 DEBUG nova.virt.hardware [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 20 09:43:45 np0005588920 nova_compute[226886]: 2026-01-20 14:43:45.505 226890 DEBUG nova.virt.hardware [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 20 09:43:45 np0005588920 nova_compute[226886]: 2026-01-20 14:43:45.505 226890 DEBUG nova.virt.hardware [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 20 09:43:45 np0005588920 nova_compute[226886]: 2026-01-20 14:43:45.507 226890 DEBUG oslo_concurrency.processutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:43:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:46.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:46 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:43:46 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3855425524' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:43:46 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:43:46 np0005588920 nova_compute[226886]: 2026-01-20 14:43:46.363 226890 DEBUG oslo_concurrency.processutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.855s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:43:46 np0005588920 nova_compute[226886]: 2026-01-20 14:43:46.401 226890 DEBUG nova.storage.rbd_utils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] rbd image 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:43:46 np0005588920 nova_compute[226886]: 2026-01-20 14:43:46.405 226890 DEBUG oslo_concurrency.processutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:43:46 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:43:46 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1131127473' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:43:46 np0005588920 nova_compute[226886]: 2026-01-20 14:43:46.841 226890 DEBUG oslo_concurrency.processutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:43:46 np0005588920 nova_compute[226886]: 2026-01-20 14:43:46.843 226890 DEBUG nova.virt.libvirt.vif [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:43:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-1818827013',display_name='tempest-ListServersNegativeTestJSON-server-1818827013-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-1818827013-1',id=83,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='28a523cfe06042ff96554913a78e1e3a',ramdisk_id='',reservation_id='r-s8qopxwt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServersNegativeTestJSON-1080060493',owner_user_name=
'tempest-ListServersNegativeTestJSON-1080060493-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:43:38Z,user_data=None,user_id='2975742546164cad937d13671d17108a',uuid=02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5b25aa23-d016-4a23-85f7-5ba4a1159f2c", "address": "fa:16:3e:fd:7c:21", "network": {"id": "53d0b281-776f-4682-8aaf-098e1d364008", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1516883251-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "28a523cfe06042ff96554913a78e1e3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b25aa23-d0", "ovs_interfaceid": "5b25aa23-d016-4a23-85f7-5ba4a1159f2c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:43:46 np0005588920 nova_compute[226886]: 2026-01-20 14:43:46.843 226890 DEBUG nova.network.os_vif_util [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Converting VIF {"id": "5b25aa23-d016-4a23-85f7-5ba4a1159f2c", "address": "fa:16:3e:fd:7c:21", "network": {"id": "53d0b281-776f-4682-8aaf-098e1d364008", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1516883251-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "28a523cfe06042ff96554913a78e1e3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b25aa23-d0", "ovs_interfaceid": "5b25aa23-d016-4a23-85f7-5ba4a1159f2c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:43:46 np0005588920 nova_compute[226886]: 2026-01-20 14:43:46.844 226890 DEBUG nova.network.os_vif_util [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fd:7c:21,bridge_name='br-int',has_traffic_filtering=True,id=5b25aa23-d016-4a23-85f7-5ba4a1159f2c,network=Network(53d0b281-776f-4682-8aaf-098e1d364008),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5b25aa23-d0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:43:46 np0005588920 nova_compute[226886]: 2026-01-20 14:43:46.845 226890 DEBUG nova.objects.instance [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Lazy-loading 'pci_devices' on Instance uuid 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:43:46 np0005588920 nova_compute[226886]: 2026-01-20 14:43:46.876 226890 DEBUG nova.virt.libvirt.driver [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:43:46 np0005588920 nova_compute[226886]:  <uuid>02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6</uuid>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:  <name>instance-00000053</name>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:43:46 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:      <nova:name>tempest-ListServersNegativeTestJSON-server-1818827013-1</nova:name>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:43:45</nova:creationTime>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:43:46 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:        <nova:user uuid="2975742546164cad937d13671d17108a">tempest-ListServersNegativeTestJSON-1080060493-project-member</nova:user>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:        <nova:project uuid="28a523cfe06042ff96554913a78e1e3a">tempest-ListServersNegativeTestJSON-1080060493</nova:project>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:        <nova:port uuid="5b25aa23-d016-4a23-85f7-5ba4a1159f2c">
Jan 20 09:43:46 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:      <entry name="serial">02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6</entry>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:      <entry name="uuid">02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6</entry>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:43:46 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6_disk">
Jan 20 09:43:46 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:43:46 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:43:46 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6_disk.config">
Jan 20 09:43:46 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:43:46 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:43:46 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:fd:7c:21"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:      <target dev="tap5b25aa23-d0"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:43:46 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6/console.log" append="off"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:43:46 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:43:46 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:43:46 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:43:46 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:43:46 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:43:46 np0005588920 nova_compute[226886]: 2026-01-20 14:43:46.877 226890 DEBUG nova.compute.manager [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Preparing to wait for external event network-vif-plugged-5b25aa23-d016-4a23-85f7-5ba4a1159f2c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:43:46 np0005588920 nova_compute[226886]: 2026-01-20 14:43:46.877 226890 DEBUG oslo_concurrency.lockutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Acquiring lock "02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:43:46 np0005588920 nova_compute[226886]: 2026-01-20 14:43:46.877 226890 DEBUG oslo_concurrency.lockutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Lock "02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:43:46 np0005588920 nova_compute[226886]: 2026-01-20 14:43:46.878 226890 DEBUG oslo_concurrency.lockutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Lock "02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:46 np0005588920 nova_compute[226886]: 2026-01-20 14:43:46.878 226890 DEBUG nova.virt.libvirt.vif [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:43:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-1818827013',display_name='tempest-ListServersNegativeTestJSON-server-1818827013-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-1818827013-1',id=83,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='28a523cfe06042ff96554913a78e1e3a',ramdisk_id='',reservation_id='r-s8qopxwt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServersNegativeTestJSON-1080060493',owner_
user_name='tempest-ListServersNegativeTestJSON-1080060493-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:43:38Z,user_data=None,user_id='2975742546164cad937d13671d17108a',uuid=02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5b25aa23-d016-4a23-85f7-5ba4a1159f2c", "address": "fa:16:3e:fd:7c:21", "network": {"id": "53d0b281-776f-4682-8aaf-098e1d364008", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1516883251-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "28a523cfe06042ff96554913a78e1e3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b25aa23-d0", "ovs_interfaceid": "5b25aa23-d016-4a23-85f7-5ba4a1159f2c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:43:46 np0005588920 nova_compute[226886]: 2026-01-20 14:43:46.878 226890 DEBUG nova.network.os_vif_util [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Converting VIF {"id": "5b25aa23-d016-4a23-85f7-5ba4a1159f2c", "address": "fa:16:3e:fd:7c:21", "network": {"id": "53d0b281-776f-4682-8aaf-098e1d364008", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1516883251-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "28a523cfe06042ff96554913a78e1e3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b25aa23-d0", "ovs_interfaceid": "5b25aa23-d016-4a23-85f7-5ba4a1159f2c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:43:46 np0005588920 nova_compute[226886]: 2026-01-20 14:43:46.879 226890 DEBUG nova.network.os_vif_util [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fd:7c:21,bridge_name='br-int',has_traffic_filtering=True,id=5b25aa23-d016-4a23-85f7-5ba4a1159f2c,network=Network(53d0b281-776f-4682-8aaf-098e1d364008),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5b25aa23-d0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:43:46 np0005588920 nova_compute[226886]: 2026-01-20 14:43:46.879 226890 DEBUG os_vif [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fd:7c:21,bridge_name='br-int',has_traffic_filtering=True,id=5b25aa23-d016-4a23-85f7-5ba4a1159f2c,network=Network(53d0b281-776f-4682-8aaf-098e1d364008),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5b25aa23-d0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:43:46 np0005588920 nova_compute[226886]: 2026-01-20 14:43:46.882 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:46 np0005588920 nova_compute[226886]: 2026-01-20 14:43:46.882 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:43:46 np0005588920 nova_compute[226886]: 2026-01-20 14:43:46.882 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:43:46 np0005588920 nova_compute[226886]: 2026-01-20 14:43:46.885 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:46 np0005588920 nova_compute[226886]: 2026-01-20 14:43:46.885 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5b25aa23-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:43:46 np0005588920 nova_compute[226886]: 2026-01-20 14:43:46.886 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5b25aa23-d0, col_values=(('external_ids', {'iface-id': '5b25aa23-d016-4a23-85f7-5ba4a1159f2c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fd:7c:21', 'vm-uuid': '02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:43:46 np0005588920 nova_compute[226886]: 2026-01-20 14:43:46.887 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:46 np0005588920 NetworkManager[49076]: <info>  [1768920226.8884] manager: (tap5b25aa23-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/165)
Jan 20 09:43:46 np0005588920 nova_compute[226886]: 2026-01-20 14:43:46.889 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:43:46 np0005588920 nova_compute[226886]: 2026-01-20 14:43:46.892 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:46 np0005588920 nova_compute[226886]: 2026-01-20 14:43:46.892 226890 INFO os_vif [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fd:7c:21,bridge_name='br-int',has_traffic_filtering=True,id=5b25aa23-d016-4a23-85f7-5ba4a1159f2c,network=Network(53d0b281-776f-4682-8aaf-098e1d364008),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5b25aa23-d0')#033[00m
Jan 20 09:43:47 np0005588920 nova_compute[226886]: 2026-01-20 14:43:47.102 226890 DEBUG nova.virt.libvirt.driver [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:43:47 np0005588920 nova_compute[226886]: 2026-01-20 14:43:47.103 226890 DEBUG nova.virt.libvirt.driver [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:43:47 np0005588920 nova_compute[226886]: 2026-01-20 14:43:47.104 226890 DEBUG nova.virt.libvirt.driver [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] No VIF found with MAC fa:16:3e:fd:7c:21, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:43:47 np0005588920 nova_compute[226886]: 2026-01-20 14:43:47.105 226890 INFO nova.virt.libvirt.driver [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Using config drive#033[00m
Jan 20 09:43:47 np0005588920 nova_compute[226886]: 2026-01-20 14:43:47.145 226890 DEBUG nova.storage.rbd_utils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] rbd image 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:43:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:43:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:47.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:43:47 np0005588920 nova_compute[226886]: 2026-01-20 14:43:47.977 226890 INFO nova.virt.libvirt.driver [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Creating config drive at /var/lib/nova/instances/02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6/disk.config#033[00m
Jan 20 09:43:47 np0005588920 nova_compute[226886]: 2026-01-20 14:43:47.983 226890 DEBUG oslo_concurrency.processutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpf8onssq0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:43:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:43:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:48.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:43:48 np0005588920 nova_compute[226886]: 2026-01-20 14:43:48.116 226890 DEBUG oslo_concurrency.processutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpf8onssq0" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:43:48 np0005588920 nova_compute[226886]: 2026-01-20 14:43:48.151 226890 DEBUG nova.storage.rbd_utils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] rbd image 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:43:48 np0005588920 nova_compute[226886]: 2026-01-20 14:43:48.154 226890 DEBUG oslo_concurrency.processutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6/disk.config 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:43:48 np0005588920 nova_compute[226886]: 2026-01-20 14:43:48.327 226890 DEBUG oslo_concurrency.processutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6/disk.config 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.172s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:43:48 np0005588920 nova_compute[226886]: 2026-01-20 14:43:48.328 226890 INFO nova.virt.libvirt.driver [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Deleting local config drive /var/lib/nova/instances/02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6/disk.config because it was imported into RBD.#033[00m
Jan 20 09:43:48 np0005588920 kernel: tap5b25aa23-d0: entered promiscuous mode
Jan 20 09:43:48 np0005588920 NetworkManager[49076]: <info>  [1768920228.3951] manager: (tap5b25aa23-d0): new Tun device (/org/freedesktop/NetworkManager/Devices/166)
Jan 20 09:43:48 np0005588920 ovn_controller[133971]: 2026-01-20T14:43:48Z|00316|binding|INFO|Claiming lport 5b25aa23-d016-4a23-85f7-5ba4a1159f2c for this chassis.
Jan 20 09:43:48 np0005588920 ovn_controller[133971]: 2026-01-20T14:43:48Z|00317|binding|INFO|5b25aa23-d016-4a23-85f7-5ba4a1159f2c: Claiming fa:16:3e:fd:7c:21 10.100.0.12
Jan 20 09:43:48 np0005588920 nova_compute[226886]: 2026-01-20 14:43:48.395 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:48 np0005588920 nova_compute[226886]: 2026-01-20 14:43:48.412 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:48.423 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fd:7c:21 10.100.0.12'], port_security=['fa:16:3e:fd:7c:21 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-53d0b281-776f-4682-8aaf-098e1d364008', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '28a523cfe06042ff96554913a78e1e3a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1879c269-0854-40a3-8eb9-b61f97d38545', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a1afefec-2060-4dfb-acbb-1ce14c3a663c, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=5b25aa23-d016-4a23-85f7-5ba4a1159f2c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:48.425 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 5b25aa23-d016-4a23-85f7-5ba4a1159f2c in datapath 53d0b281-776f-4682-8aaf-098e1d364008 bound to our chassis#033[00m
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:48.428 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 53d0b281-776f-4682-8aaf-098e1d364008#033[00m
Jan 20 09:43:48 np0005588920 systemd-udevd[255750]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:43:48 np0005588920 systemd-machined[196121]: New machine qemu-34-instance-00000053.
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:48.441 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[1fe5a550-3704-4bda-aa4e-861f598bebfc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:48.442 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap53d0b281-71 in ovnmeta-53d0b281-776f-4682-8aaf-098e1d364008 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:48.444 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap53d0b281-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:48.444 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[1df42a6e-dc47-409c-b42d-593116de370d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:48.445 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[55d83e98-df67-4980-aeae-2e9b0a3d21e4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:48 np0005588920 systemd[1]: Started Virtual Machine qemu-34-instance-00000053.
Jan 20 09:43:48 np0005588920 NetworkManager[49076]: <info>  [1768920228.4584] device (tap5b25aa23-d0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:43:48 np0005588920 NetworkManager[49076]: <info>  [1768920228.4608] device (tap5b25aa23-d0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:48.463 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[c0e70ed2-c3d9-4952-be93-98743cc84368]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:48.488 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[1dae55f7-8518-4946-a5bf-a20d0cd598d1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:48 np0005588920 ovn_controller[133971]: 2026-01-20T14:43:48Z|00318|binding|INFO|Setting lport 5b25aa23-d016-4a23-85f7-5ba4a1159f2c ovn-installed in OVS
Jan 20 09:43:48 np0005588920 ovn_controller[133971]: 2026-01-20T14:43:48Z|00319|binding|INFO|Setting lport 5b25aa23-d016-4a23-85f7-5ba4a1159f2c up in Southbound
Jan 20 09:43:48 np0005588920 nova_compute[226886]: 2026-01-20 14:43:48.514 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:48.520 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[bf0e55f5-e0d9-4091-a85a-18fa844f0497]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:48.525 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d9b3cc5d-4dbc-4b64-b019-e0e205d227f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:48 np0005588920 NetworkManager[49076]: <info>  [1768920228.5259] manager: (tap53d0b281-70): new Veth device (/org/freedesktop/NetworkManager/Devices/167)
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:48.554 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[a44d3e69-6793-4c13-96f1-a2438e191a05]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:48.557 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[831845c3-d90f-4b85-b2c3-d59cada4366a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:48 np0005588920 NetworkManager[49076]: <info>  [1768920228.5806] device (tap53d0b281-70): carrier: link connected
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:48.586 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[cd0d4ffc-f7c8-4a79-bc26-474f9ef26298]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:48.599 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0146f0ec-ed20-4133-8ff6-5282df988cba]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap53d0b281-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5c:be:fe'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 103], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 527026, 'reachable_time': 21886, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 255783, 'error': None, 'target': 'ovnmeta-53d0b281-776f-4682-8aaf-098e1d364008', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:48.612 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[970ac1a2-3714-4315-87b4-b51e27f3056f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5c:befe'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 527026, 'tstamp': 527026}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 255784, 'error': None, 'target': 'ovnmeta-53d0b281-776f-4682-8aaf-098e1d364008', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:48.625 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[8357eb91-cbdc-42d6-9b45-255a811e79e5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap53d0b281-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5c:be:fe'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 103], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 527026, 'reachable_time': 21886, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 255785, 'error': None, 'target': 'ovnmeta-53d0b281-776f-4682-8aaf-098e1d364008', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:48 np0005588920 nova_compute[226886]: 2026-01-20 14:43:48.646 226890 DEBUG nova.network.neutron [req-c0cfe1a9-4b19-4a58-a47e-e8cf764b526f req-34aa0b7d-22a5-42fd-904d-80dff64e3c3a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Updated VIF entry in instance network info cache for port 5b25aa23-d016-4a23-85f7-5ba4a1159f2c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:43:48 np0005588920 nova_compute[226886]: 2026-01-20 14:43:48.647 226890 DEBUG nova.network.neutron [req-c0cfe1a9-4b19-4a58-a47e-e8cf764b526f req-34aa0b7d-22a5-42fd-904d-80dff64e3c3a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Updating instance_info_cache with network_info: [{"id": "5b25aa23-d016-4a23-85f7-5ba4a1159f2c", "address": "fa:16:3e:fd:7c:21", "network": {"id": "53d0b281-776f-4682-8aaf-098e1d364008", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1516883251-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "28a523cfe06042ff96554913a78e1e3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b25aa23-d0", "ovs_interfaceid": "5b25aa23-d016-4a23-85f7-5ba4a1159f2c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:48.654 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[68d4f6c2-9e45-46d2-8acf-3b74917695d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:48 np0005588920 nova_compute[226886]: 2026-01-20 14:43:48.686 226890 DEBUG oslo_concurrency.lockutils [req-c0cfe1a9-4b19-4a58-a47e-e8cf764b526f req-34aa0b7d-22a5-42fd-904d-80dff64e3c3a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:48.709 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ff776b15-4309-4b5d-ace6-9a6f8c1938d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:48.710 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap53d0b281-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:48.710 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:48.711 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap53d0b281-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:43:48 np0005588920 nova_compute[226886]: 2026-01-20 14:43:48.712 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:48 np0005588920 NetworkManager[49076]: <info>  [1768920228.7134] manager: (tap53d0b281-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/168)
Jan 20 09:43:48 np0005588920 kernel: tap53d0b281-70: entered promiscuous mode
Jan 20 09:43:48 np0005588920 nova_compute[226886]: 2026-01-20 14:43:48.715 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:48.716 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap53d0b281-70, col_values=(('external_ids', {'iface-id': '2ea34810-4753-414f-ae43-b7b379fc432c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:43:48 np0005588920 nova_compute[226886]: 2026-01-20 14:43:48.717 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:48 np0005588920 ovn_controller[133971]: 2026-01-20T14:43:48Z|00320|binding|INFO|Releasing lport 2ea34810-4753-414f-ae43-b7b379fc432c from this chassis (sb_readonly=0)
Jan 20 09:43:48 np0005588920 nova_compute[226886]: 2026-01-20 14:43:48.731 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:48.732 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/53d0b281-776f-4682-8aaf-098e1d364008.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/53d0b281-776f-4682-8aaf-098e1d364008.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:48.733 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0ed45f68-dd26-4228-9580-b339f7a940f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:48.733 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-53d0b281-776f-4682-8aaf-098e1d364008
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/53d0b281-776f-4682-8aaf-098e1d364008.pid.haproxy
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID 53d0b281-776f-4682-8aaf-098e1d364008
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:48.734 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-53d0b281-776f-4682-8aaf-098e1d364008', 'env', 'PROCESS_TAG=haproxy-53d0b281-776f-4682-8aaf-098e1d364008', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/53d0b281-776f-4682-8aaf-098e1d364008.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:43:48 np0005588920 nova_compute[226886]: 2026-01-20 14:43:48.847 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920228.8470073, 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:43:48 np0005588920 nova_compute[226886]: 2026-01-20 14:43:48.848 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] VM Started (Lifecycle Event)#033[00m
Jan 20 09:43:48 np0005588920 nova_compute[226886]: 2026-01-20 14:43:48.895 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:43:48 np0005588920 nova_compute[226886]: 2026-01-20 14:43:48.900 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920228.8473458, 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:43:48 np0005588920 nova_compute[226886]: 2026-01-20 14:43:48.900 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:43:48 np0005588920 nova_compute[226886]: 2026-01-20 14:43:48.940 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:43:48 np0005588920 nova_compute[226886]: 2026-01-20 14:43:48.945 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:43:48 np0005588920 nova_compute[226886]: 2026-01-20 14:43:48.971 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:43:49 np0005588920 podman[255859]: 2026-01-20 14:43:49.082361707 +0000 UTC m=+0.047405732 container create f319567e6fb9a335d24cde9a5a9cd128c888c75be4bb6acb64e342a25cdb7f63 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-53d0b281-776f-4682-8aaf-098e1d364008, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 20 09:43:49 np0005588920 systemd[1]: Started libpod-conmon-f319567e6fb9a335d24cde9a5a9cd128c888c75be4bb6acb64e342a25cdb7f63.scope.
Jan 20 09:43:49 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:43:49 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f02be0061536e42dd6b7fe2f284be0f8a3d098da32fe85d4f366716173606c28/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:43:49 np0005588920 podman[255859]: 2026-01-20 14:43:49.057659649 +0000 UTC m=+0.022703694 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:43:49 np0005588920 podman[255859]: 2026-01-20 14:43:49.163992123 +0000 UTC m=+0.129036188 container init f319567e6fb9a335d24cde9a5a9cd128c888c75be4bb6acb64e342a25cdb7f63 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-53d0b281-776f-4682-8aaf-098e1d364008, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:43:49 np0005588920 podman[255859]: 2026-01-20 14:43:49.169484156 +0000 UTC m=+0.134528181 container start f319567e6fb9a335d24cde9a5a9cd128c888c75be4bb6acb64e342a25cdb7f63 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-53d0b281-776f-4682-8aaf-098e1d364008, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 20 09:43:49 np0005588920 neutron-haproxy-ovnmeta-53d0b281-776f-4682-8aaf-098e1d364008[255875]: [NOTICE]   (255879) : New worker (255881) forked
Jan 20 09:43:49 np0005588920 neutron-haproxy-ovnmeta-53d0b281-776f-4682-8aaf-098e1d364008[255875]: [NOTICE]   (255879) : Loading success.
Jan 20 09:43:49 np0005588920 nova_compute[226886]: 2026-01-20 14:43:49.352 226890 DEBUG nova.compute.manager [req-b6c48e7f-712c-4973-b97e-0cafcc152a15 req-12f2dbe5-4c4d-42f6-b027-3c3104a10423 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Received event network-vif-plugged-5b25aa23-d016-4a23-85f7-5ba4a1159f2c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:43:49 np0005588920 nova_compute[226886]: 2026-01-20 14:43:49.353 226890 DEBUG oslo_concurrency.lockutils [req-b6c48e7f-712c-4973-b97e-0cafcc152a15 req-12f2dbe5-4c4d-42f6-b027-3c3104a10423 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:43:49 np0005588920 nova_compute[226886]: 2026-01-20 14:43:49.353 226890 DEBUG oslo_concurrency.lockutils [req-b6c48e7f-712c-4973-b97e-0cafcc152a15 req-12f2dbe5-4c4d-42f6-b027-3c3104a10423 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:43:49 np0005588920 nova_compute[226886]: 2026-01-20 14:43:49.354 226890 DEBUG oslo_concurrency.lockutils [req-b6c48e7f-712c-4973-b97e-0cafcc152a15 req-12f2dbe5-4c4d-42f6-b027-3c3104a10423 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:49 np0005588920 nova_compute[226886]: 2026-01-20 14:43:49.354 226890 DEBUG nova.compute.manager [req-b6c48e7f-712c-4973-b97e-0cafcc152a15 req-12f2dbe5-4c4d-42f6-b027-3c3104a10423 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Processing event network-vif-plugged-5b25aa23-d016-4a23-85f7-5ba4a1159f2c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:43:49 np0005588920 nova_compute[226886]: 2026-01-20 14:43:49.354 226890 DEBUG nova.compute.manager [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:43:49 np0005588920 nova_compute[226886]: 2026-01-20 14:43:49.358 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920229.3584833, 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:43:49 np0005588920 nova_compute[226886]: 2026-01-20 14:43:49.359 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:43:49 np0005588920 nova_compute[226886]: 2026-01-20 14:43:49.361 226890 DEBUG nova.virt.libvirt.driver [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:43:49 np0005588920 nova_compute[226886]: 2026-01-20 14:43:49.364 226890 INFO nova.virt.libvirt.driver [-] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Instance spawned successfully.#033[00m
Jan 20 09:43:49 np0005588920 nova_compute[226886]: 2026-01-20 14:43:49.365 226890 DEBUG nova.virt.libvirt.driver [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:43:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:43:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:49.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:43:49 np0005588920 nova_compute[226886]: 2026-01-20 14:43:49.404 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:43:49 np0005588920 nova_compute[226886]: 2026-01-20 14:43:49.408 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:43:49 np0005588920 nova_compute[226886]: 2026-01-20 14:43:49.412 226890 DEBUG nova.virt.libvirt.driver [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:43:49 np0005588920 nova_compute[226886]: 2026-01-20 14:43:49.412 226890 DEBUG nova.virt.libvirt.driver [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:43:49 np0005588920 nova_compute[226886]: 2026-01-20 14:43:49.412 226890 DEBUG nova.virt.libvirt.driver [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:43:49 np0005588920 nova_compute[226886]: 2026-01-20 14:43:49.413 226890 DEBUG nova.virt.libvirt.driver [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:43:49 np0005588920 nova_compute[226886]: 2026-01-20 14:43:49.413 226890 DEBUG nova.virt.libvirt.driver [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:43:49 np0005588920 nova_compute[226886]: 2026-01-20 14:43:49.414 226890 DEBUG nova.virt.libvirt.driver [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:43:49 np0005588920 nova_compute[226886]: 2026-01-20 14:43:49.444 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:43:49 np0005588920 nova_compute[226886]: 2026-01-20 14:43:49.483 226890 INFO nova.compute.manager [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Took 10.95 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:43:49 np0005588920 nova_compute[226886]: 2026-01-20 14:43:49.483 226890 DEBUG nova.compute.manager [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:43:49 np0005588920 nova_compute[226886]: 2026-01-20 14:43:49.629 226890 INFO nova.compute.manager [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Took 12.25 seconds to build instance.#033[00m
Jan 20 09:43:49 np0005588920 nova_compute[226886]: 2026-01-20 14:43:49.655 226890 DEBUG oslo_concurrency.lockutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Lock "02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.388s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:43:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:50.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:43:50 np0005588920 nova_compute[226886]: 2026-01-20 14:43:50.282 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:51 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:43:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:43:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:51.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:43:51 np0005588920 nova_compute[226886]: 2026-01-20 14:43:51.453 226890 DEBUG nova.compute.manager [req-2a6b6f8b-b1f8-4294-8c02-b0591fec0714 req-19178456-c4a8-435a-96c5-d761818d1b71 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Received event network-vif-plugged-5b25aa23-d016-4a23-85f7-5ba4a1159f2c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:43:51 np0005588920 nova_compute[226886]: 2026-01-20 14:43:51.453 226890 DEBUG oslo_concurrency.lockutils [req-2a6b6f8b-b1f8-4294-8c02-b0591fec0714 req-19178456-c4a8-435a-96c5-d761818d1b71 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:43:51 np0005588920 nova_compute[226886]: 2026-01-20 14:43:51.454 226890 DEBUG oslo_concurrency.lockutils [req-2a6b6f8b-b1f8-4294-8c02-b0591fec0714 req-19178456-c4a8-435a-96c5-d761818d1b71 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:43:51 np0005588920 nova_compute[226886]: 2026-01-20 14:43:51.454 226890 DEBUG oslo_concurrency.lockutils [req-2a6b6f8b-b1f8-4294-8c02-b0591fec0714 req-19178456-c4a8-435a-96c5-d761818d1b71 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:51 np0005588920 nova_compute[226886]: 2026-01-20 14:43:51.454 226890 DEBUG nova.compute.manager [req-2a6b6f8b-b1f8-4294-8c02-b0591fec0714 req-19178456-c4a8-435a-96c5-d761818d1b71 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] No waiting events found dispatching network-vif-plugged-5b25aa23-d016-4a23-85f7-5ba4a1159f2c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:43:51 np0005588920 nova_compute[226886]: 2026-01-20 14:43:51.455 226890 WARNING nova.compute.manager [req-2a6b6f8b-b1f8-4294-8c02-b0591fec0714 req-19178456-c4a8-435a-96c5-d761818d1b71 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Received unexpected event network-vif-plugged-5b25aa23-d016-4a23-85f7-5ba4a1159f2c for instance with vm_state active and task_state None.#033[00m
Jan 20 09:43:51 np0005588920 nova_compute[226886]: 2026-01-20 14:43:51.891 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:43:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:52.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:43:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:53.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:53 np0005588920 nova_compute[226886]: 2026-01-20 14:43:53.474 226890 DEBUG oslo_concurrency.lockutils [None req-32db6479-c0bf-4267-a3cd-7b1504010fc3 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Acquiring lock "02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:43:53 np0005588920 nova_compute[226886]: 2026-01-20 14:43:53.475 226890 DEBUG oslo_concurrency.lockutils [None req-32db6479-c0bf-4267-a3cd-7b1504010fc3 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Lock "02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:43:53 np0005588920 nova_compute[226886]: 2026-01-20 14:43:53.475 226890 DEBUG oslo_concurrency.lockutils [None req-32db6479-c0bf-4267-a3cd-7b1504010fc3 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Acquiring lock "02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:43:53 np0005588920 nova_compute[226886]: 2026-01-20 14:43:53.475 226890 DEBUG oslo_concurrency.lockutils [None req-32db6479-c0bf-4267-a3cd-7b1504010fc3 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Lock "02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:43:53 np0005588920 nova_compute[226886]: 2026-01-20 14:43:53.475 226890 DEBUG oslo_concurrency.lockutils [None req-32db6479-c0bf-4267-a3cd-7b1504010fc3 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Lock "02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:53 np0005588920 nova_compute[226886]: 2026-01-20 14:43:53.477 226890 INFO nova.compute.manager [None req-32db6479-c0bf-4267-a3cd-7b1504010fc3 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Terminating instance#033[00m
Jan 20 09:43:53 np0005588920 nova_compute[226886]: 2026-01-20 14:43:53.477 226890 DEBUG nova.compute.manager [None req-32db6479-c0bf-4267-a3cd-7b1504010fc3 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:43:53 np0005588920 kernel: tap5b25aa23-d0 (unregistering): left promiscuous mode
Jan 20 09:43:53 np0005588920 NetworkManager[49076]: <info>  [1768920233.5129] device (tap5b25aa23-d0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:43:53 np0005588920 ovn_controller[133971]: 2026-01-20T14:43:53Z|00321|binding|INFO|Releasing lport 5b25aa23-d016-4a23-85f7-5ba4a1159f2c from this chassis (sb_readonly=0)
Jan 20 09:43:53 np0005588920 ovn_controller[133971]: 2026-01-20T14:43:53Z|00322|binding|INFO|Setting lport 5b25aa23-d016-4a23-85f7-5ba4a1159f2c down in Southbound
Jan 20 09:43:53 np0005588920 nova_compute[226886]: 2026-01-20 14:43:53.520 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:53 np0005588920 ovn_controller[133971]: 2026-01-20T14:43:53Z|00323|binding|INFO|Removing iface tap5b25aa23-d0 ovn-installed in OVS
Jan 20 09:43:53 np0005588920 nova_compute[226886]: 2026-01-20 14:43:53.522 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:53 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:53.527 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fd:7c:21 10.100.0.12'], port_security=['fa:16:3e:fd:7c:21 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-53d0b281-776f-4682-8aaf-098e1d364008', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '28a523cfe06042ff96554913a78e1e3a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1879c269-0854-40a3-8eb9-b61f97d38545', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a1afefec-2060-4dfb-acbb-1ce14c3a663c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=5b25aa23-d016-4a23-85f7-5ba4a1159f2c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:43:53 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:53.528 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 5b25aa23-d016-4a23-85f7-5ba4a1159f2c in datapath 53d0b281-776f-4682-8aaf-098e1d364008 unbound from our chassis#033[00m
Jan 20 09:43:53 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:53.530 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 53d0b281-776f-4682-8aaf-098e1d364008, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:43:53 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:53.531 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[91b7e8e8-bbf8-4a8e-b721-cffefc1a0eb7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:53 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:53.532 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-53d0b281-776f-4682-8aaf-098e1d364008 namespace which is not needed anymore#033[00m
Jan 20 09:43:53 np0005588920 nova_compute[226886]: 2026-01-20 14:43:53.542 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:53 np0005588920 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d00000053.scope: Deactivated successfully.
Jan 20 09:43:53 np0005588920 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d00000053.scope: Consumed 4.649s CPU time.
Jan 20 09:43:53 np0005588920 systemd-machined[196121]: Machine qemu-34-instance-00000053 terminated.
Jan 20 09:43:53 np0005588920 neutron-haproxy-ovnmeta-53d0b281-776f-4682-8aaf-098e1d364008[255875]: [NOTICE]   (255879) : haproxy version is 2.8.14-c23fe91
Jan 20 09:43:53 np0005588920 neutron-haproxy-ovnmeta-53d0b281-776f-4682-8aaf-098e1d364008[255875]: [NOTICE]   (255879) : path to executable is /usr/sbin/haproxy
Jan 20 09:43:53 np0005588920 neutron-haproxy-ovnmeta-53d0b281-776f-4682-8aaf-098e1d364008[255875]: [ALERT]    (255879) : Current worker (255881) exited with code 143 (Terminated)
Jan 20 09:43:53 np0005588920 neutron-haproxy-ovnmeta-53d0b281-776f-4682-8aaf-098e1d364008[255875]: [WARNING]  (255879) : All workers exited. Exiting... (0)
Jan 20 09:43:53 np0005588920 systemd[1]: libpod-f319567e6fb9a335d24cde9a5a9cd128c888c75be4bb6acb64e342a25cdb7f63.scope: Deactivated successfully.
Jan 20 09:43:53 np0005588920 podman[255913]: 2026-01-20 14:43:53.657174884 +0000 UTC m=+0.042367852 container died f319567e6fb9a335d24cde9a5a9cd128c888c75be4bb6acb64e342a25cdb7f63 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-53d0b281-776f-4682-8aaf-098e1d364008, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 20 09:43:53 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f319567e6fb9a335d24cde9a5a9cd128c888c75be4bb6acb64e342a25cdb7f63-userdata-shm.mount: Deactivated successfully.
Jan 20 09:43:53 np0005588920 systemd[1]: var-lib-containers-storage-overlay-f02be0061536e42dd6b7fe2f284be0f8a3d098da32fe85d4f366716173606c28-merged.mount: Deactivated successfully.
Jan 20 09:43:53 np0005588920 podman[255913]: 2026-01-20 14:43:53.687250063 +0000 UTC m=+0.072443031 container cleanup f319567e6fb9a335d24cde9a5a9cd128c888c75be4bb6acb64e342a25cdb7f63 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-53d0b281-776f-4682-8aaf-098e1d364008, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 20 09:43:53 np0005588920 systemd[1]: libpod-conmon-f319567e6fb9a335d24cde9a5a9cd128c888c75be4bb6acb64e342a25cdb7f63.scope: Deactivated successfully.
Jan 20 09:43:53 np0005588920 nova_compute[226886]: 2026-01-20 14:43:53.717 226890 INFO nova.virt.libvirt.driver [-] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Instance destroyed successfully.#033[00m
Jan 20 09:43:53 np0005588920 nova_compute[226886]: 2026-01-20 14:43:53.717 226890 DEBUG nova.objects.instance [None req-32db6479-c0bf-4267-a3cd-7b1504010fc3 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Lazy-loading 'resources' on Instance uuid 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:43:53 np0005588920 podman[255945]: 2026-01-20 14:43:53.748452769 +0000 UTC m=+0.039591635 container remove f319567e6fb9a335d24cde9a5a9cd128c888c75be4bb6acb64e342a25cdb7f63 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-53d0b281-776f-4682-8aaf-098e1d364008, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 20 09:43:53 np0005588920 nova_compute[226886]: 2026-01-20 14:43:53.748 226890 DEBUG nova.virt.libvirt.vif [None req-32db6479-c0bf-4267-a3cd-7b1504010fc3 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:43:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-1818827013',display_name='tempest-ListServersNegativeTestJSON-server-1818827013-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-1818827013-1',id=83,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:43:49Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='28a523cfe06042ff96554913a78e1e3a',ramdisk_id='',reservation_id='r-s8qopxwt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServersNegativeTestJSON-1080060493',owner_user_name='tempest-ListServersNegativeTestJSON-1080060493-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:43:49Z,user_data=None,user_id='2975742546164cad937d13671d17108a',uuid=02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5b25aa23-d016-4a23-85f7-5ba4a1159f2c", "address": "fa:16:3e:fd:7c:21", "network": {"id": "53d0b281-776f-4682-8aaf-098e1d364008", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1516883251-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "28a523cfe06042ff96554913a78e1e3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b25aa23-d0", "ovs_interfaceid": "5b25aa23-d016-4a23-85f7-5ba4a1159f2c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:43:53 np0005588920 nova_compute[226886]: 2026-01-20 14:43:53.748 226890 DEBUG nova.network.os_vif_util [None req-32db6479-c0bf-4267-a3cd-7b1504010fc3 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Converting VIF {"id": "5b25aa23-d016-4a23-85f7-5ba4a1159f2c", "address": "fa:16:3e:fd:7c:21", "network": {"id": "53d0b281-776f-4682-8aaf-098e1d364008", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1516883251-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "28a523cfe06042ff96554913a78e1e3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b25aa23-d0", "ovs_interfaceid": "5b25aa23-d016-4a23-85f7-5ba4a1159f2c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:43:53 np0005588920 nova_compute[226886]: 2026-01-20 14:43:53.749 226890 DEBUG nova.network.os_vif_util [None req-32db6479-c0bf-4267-a3cd-7b1504010fc3 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fd:7c:21,bridge_name='br-int',has_traffic_filtering=True,id=5b25aa23-d016-4a23-85f7-5ba4a1159f2c,network=Network(53d0b281-776f-4682-8aaf-098e1d364008),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5b25aa23-d0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:43:53 np0005588920 nova_compute[226886]: 2026-01-20 14:43:53.749 226890 DEBUG os_vif [None req-32db6479-c0bf-4267-a3cd-7b1504010fc3 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fd:7c:21,bridge_name='br-int',has_traffic_filtering=True,id=5b25aa23-d016-4a23-85f7-5ba4a1159f2c,network=Network(53d0b281-776f-4682-8aaf-098e1d364008),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5b25aa23-d0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:43:53 np0005588920 nova_compute[226886]: 2026-01-20 14:43:53.751 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:53 np0005588920 nova_compute[226886]: 2026-01-20 14:43:53.751 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5b25aa23-d0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:43:53 np0005588920 nova_compute[226886]: 2026-01-20 14:43:53.753 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:53 np0005588920 nova_compute[226886]: 2026-01-20 14:43:53.754 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:53 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:53.754 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[eb095ce2-00a9-4d47-99d6-e07eb116a687]: (4, ('Tue Jan 20 02:43:53 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-53d0b281-776f-4682-8aaf-098e1d364008 (f319567e6fb9a335d24cde9a5a9cd128c888c75be4bb6acb64e342a25cdb7f63)\nf319567e6fb9a335d24cde9a5a9cd128c888c75be4bb6acb64e342a25cdb7f63\nTue Jan 20 02:43:53 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-53d0b281-776f-4682-8aaf-098e1d364008 (f319567e6fb9a335d24cde9a5a9cd128c888c75be4bb6acb64e342a25cdb7f63)\nf319567e6fb9a335d24cde9a5a9cd128c888c75be4bb6acb64e342a25cdb7f63\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:53 np0005588920 nova_compute[226886]: 2026-01-20 14:43:53.756 226890 INFO os_vif [None req-32db6479-c0bf-4267-a3cd-7b1504010fc3 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fd:7c:21,bridge_name='br-int',has_traffic_filtering=True,id=5b25aa23-d016-4a23-85f7-5ba4a1159f2c,network=Network(53d0b281-776f-4682-8aaf-098e1d364008),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5b25aa23-d0')#033[00m
Jan 20 09:43:53 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:53.757 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b38adae4-6f40-4fd4-9fe5-949e191f20e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:53 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:53.758 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap53d0b281-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:43:53 np0005588920 kernel: tap53d0b281-70: left promiscuous mode
Jan 20 09:43:53 np0005588920 nova_compute[226886]: 2026-01-20 14:43:53.770 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:53 np0005588920 nova_compute[226886]: 2026-01-20 14:43:53.773 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:53 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:53.775 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[097d43e8-6bb7-41c2-a85b-4c5e0ce268f5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:53 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:53.787 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[63cf8799-6983-4ce5-949d-8fa8a9aae272]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:53 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:53.788 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2853f23f-5c72-4a99-8742-f1fe26366521]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:53 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:53.802 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[4557f85c-c112-4a07-9e3b-b746e546ea77]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 527020, 'reachable_time': 37156, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 255985, 'error': None, 'target': 'ovnmeta-53d0b281-776f-4682-8aaf-098e1d364008', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:53 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:53.806 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-53d0b281-776f-4682-8aaf-098e1d364008 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:43:53 np0005588920 systemd[1]: run-netns-ovnmeta\x2d53d0b281\x2d776f\x2d4682\x2d8aaf\x2d098e1d364008.mount: Deactivated successfully.
Jan 20 09:43:53 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:43:53.806 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[5c809cfb-f63a-45a7-bfde-d1c5cbab4d03]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:54 np0005588920 nova_compute[226886]: 2026-01-20 14:43:54.103 226890 INFO nova.virt.libvirt.driver [None req-32db6479-c0bf-4267-a3cd-7b1504010fc3 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Deleting instance files /var/lib/nova/instances/02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6_del#033[00m
Jan 20 09:43:54 np0005588920 nova_compute[226886]: 2026-01-20 14:43:54.105 226890 INFO nova.virt.libvirt.driver [None req-32db6479-c0bf-4267-a3cd-7b1504010fc3 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Deletion of /var/lib/nova/instances/02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6_del complete#033[00m
Jan 20 09:43:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:54.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:54 np0005588920 nova_compute[226886]: 2026-01-20 14:43:54.166 226890 INFO nova.compute.manager [None req-32db6479-c0bf-4267-a3cd-7b1504010fc3 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Took 0.69 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:43:54 np0005588920 nova_compute[226886]: 2026-01-20 14:43:54.167 226890 DEBUG oslo.service.loopingcall [None req-32db6479-c0bf-4267-a3cd-7b1504010fc3 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:43:54 np0005588920 nova_compute[226886]: 2026-01-20 14:43:54.167 226890 DEBUG nova.compute.manager [-] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:43:54 np0005588920 nova_compute[226886]: 2026-01-20 14:43:54.167 226890 DEBUG nova.network.neutron [-] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:43:54 np0005588920 nova_compute[226886]: 2026-01-20 14:43:54.635 226890 DEBUG nova.compute.manager [req-b1fc4530-d218-4738-a082-c94b58251678 req-c01d4a27-404b-4788-8a2f-d8d8bcde7c86 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Received event network-vif-unplugged-5b25aa23-d016-4a23-85f7-5ba4a1159f2c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:43:54 np0005588920 nova_compute[226886]: 2026-01-20 14:43:54.636 226890 DEBUG oslo_concurrency.lockutils [req-b1fc4530-d218-4738-a082-c94b58251678 req-c01d4a27-404b-4788-8a2f-d8d8bcde7c86 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:43:54 np0005588920 nova_compute[226886]: 2026-01-20 14:43:54.636 226890 DEBUG oslo_concurrency.lockutils [req-b1fc4530-d218-4738-a082-c94b58251678 req-c01d4a27-404b-4788-8a2f-d8d8bcde7c86 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:43:54 np0005588920 nova_compute[226886]: 2026-01-20 14:43:54.636 226890 DEBUG oslo_concurrency.lockutils [req-b1fc4530-d218-4738-a082-c94b58251678 req-c01d4a27-404b-4788-8a2f-d8d8bcde7c86 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:54 np0005588920 nova_compute[226886]: 2026-01-20 14:43:54.636 226890 DEBUG nova.compute.manager [req-b1fc4530-d218-4738-a082-c94b58251678 req-c01d4a27-404b-4788-8a2f-d8d8bcde7c86 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] No waiting events found dispatching network-vif-unplugged-5b25aa23-d016-4a23-85f7-5ba4a1159f2c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:43:54 np0005588920 nova_compute[226886]: 2026-01-20 14:43:54.637 226890 DEBUG nova.compute.manager [req-b1fc4530-d218-4738-a082-c94b58251678 req-c01d4a27-404b-4788-8a2f-d8d8bcde7c86 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Received event network-vif-unplugged-5b25aa23-d016-4a23-85f7-5ba4a1159f2c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:43:55 np0005588920 nova_compute[226886]: 2026-01-20 14:43:55.223 226890 DEBUG oslo_concurrency.lockutils [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Acquiring lock "e9dec9ac-568f-4ce4-a58a-351e3b2fff52" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:43:55 np0005588920 nova_compute[226886]: 2026-01-20 14:43:55.224 226890 DEBUG oslo_concurrency.lockutils [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "e9dec9ac-568f-4ce4-a58a-351e3b2fff52" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:43:55 np0005588920 nova_compute[226886]: 2026-01-20 14:43:55.254 226890 DEBUG nova.compute.manager [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:43:55 np0005588920 nova_compute[226886]: 2026-01-20 14:43:55.284 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:55 np0005588920 nova_compute[226886]: 2026-01-20 14:43:55.323 226890 DEBUG nova.network.neutron [-] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:43:55 np0005588920 nova_compute[226886]: 2026-01-20 14:43:55.348 226890 INFO nova.compute.manager [-] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Took 1.18 seconds to deallocate network for instance.#033[00m
Jan 20 09:43:55 np0005588920 nova_compute[226886]: 2026-01-20 14:43:55.378 226890 DEBUG oslo_concurrency.lockutils [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:43:55 np0005588920 nova_compute[226886]: 2026-01-20 14:43:55.378 226890 DEBUG oslo_concurrency.lockutils [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:43:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:43:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:55.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:43:55 np0005588920 nova_compute[226886]: 2026-01-20 14:43:55.386 226890 DEBUG nova.virt.hardware [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 20 09:43:55 np0005588920 nova_compute[226886]: 2026-01-20 14:43:55.387 226890 INFO nova.compute.claims [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Claim successful on node compute-2.ctlplane.example.com
Jan 20 09:43:55 np0005588920 nova_compute[226886]: 2026-01-20 14:43:55.419 226890 DEBUG oslo_concurrency.lockutils [None req-32db6479-c0bf-4267-a3cd-7b1504010fc3 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:43:55 np0005588920 nova_compute[226886]: 2026-01-20 14:43:55.423 226890 DEBUG nova.compute.manager [req-98b435ef-078a-4767-8332-2a72e91bd8a4 req-b4943b9b-670e-4ad2-a352-9e9eb12289e6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Received event network-vif-deleted-5b25aa23-d016-4a23-85f7-5ba4a1159f2c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 09:43:55 np0005588920 nova_compute[226886]: 2026-01-20 14:43:55.583 226890 DEBUG oslo_concurrency.processutils [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:43:56 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:43:56 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/278798223' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:43:56 np0005588920 nova_compute[226886]: 2026-01-20 14:43:56.080 226890 DEBUG oslo_concurrency.processutils [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.497s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:43:56 np0005588920 nova_compute[226886]: 2026-01-20 14:43:56.086 226890 DEBUG nova.compute.provider_tree [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 20 09:43:56 np0005588920 nova_compute[226886]: 2026-01-20 14:43:56.107 226890 DEBUG nova.scheduler.client.report [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 20 09:43:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:56.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:56 np0005588920 nova_compute[226886]: 2026-01-20 14:43:56.146 226890 DEBUG oslo_concurrency.lockutils [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.768s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:43:56 np0005588920 nova_compute[226886]: 2026-01-20 14:43:56.147 226890 DEBUG nova.compute.manager [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 20 09:43:56 np0005588920 nova_compute[226886]: 2026-01-20 14:43:56.151 226890 DEBUG oslo_concurrency.lockutils [None req-32db6479-c0bf-4267-a3cd-7b1504010fc3 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.732s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:43:56 np0005588920 nova_compute[226886]: 2026-01-20 14:43:56.206 226890 DEBUG nova.compute.manager [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 20 09:43:56 np0005588920 nova_compute[226886]: 2026-01-20 14:43:56.207 226890 DEBUG nova.network.neutron [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 20 09:43:56 np0005588920 nova_compute[226886]: 2026-01-20 14:43:56.237 226890 INFO nova.virt.libvirt.driver [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 20 09:43:56 np0005588920 nova_compute[226886]: 2026-01-20 14:43:56.258 226890 DEBUG oslo_concurrency.processutils [None req-32db6479-c0bf-4267-a3cd-7b1504010fc3 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:43:56 np0005588920 nova_compute[226886]: 2026-01-20 14:43:56.280 226890 DEBUG nova.compute.manager [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 20 09:43:56 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:43:56 np0005588920 nova_compute[226886]: 2026-01-20 14:43:56.383 226890 DEBUG nova.compute.manager [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 20 09:43:56 np0005588920 nova_compute[226886]: 2026-01-20 14:43:56.385 226890 DEBUG nova.virt.libvirt.driver [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 20 09:43:56 np0005588920 nova_compute[226886]: 2026-01-20 14:43:56.386 226890 INFO nova.virt.libvirt.driver [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Creating image(s)
Jan 20 09:43:56 np0005588920 nova_compute[226886]: 2026-01-20 14:43:56.415 226890 DEBUG nova.storage.rbd_utils [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] rbd image e9dec9ac-568f-4ce4-a58a-351e3b2fff52_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:43:56 np0005588920 nova_compute[226886]: 2026-01-20 14:43:56.454 226890 DEBUG nova.storage.rbd_utils [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] rbd image e9dec9ac-568f-4ce4-a58a-351e3b2fff52_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:43:56 np0005588920 nova_compute[226886]: 2026-01-20 14:43:56.488 226890 DEBUG nova.storage.rbd_utils [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] rbd image e9dec9ac-568f-4ce4-a58a-351e3b2fff52_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:43:56 np0005588920 nova_compute[226886]: 2026-01-20 14:43:56.493 226890 DEBUG oslo_concurrency.processutils [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:43:56 np0005588920 nova_compute[226886]: 2026-01-20 14:43:56.581 226890 DEBUG oslo_concurrency.processutils [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:43:56 np0005588920 nova_compute[226886]: 2026-01-20 14:43:56.582 226890 DEBUG oslo_concurrency.lockutils [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:43:56 np0005588920 nova_compute[226886]: 2026-01-20 14:43:56.583 226890 DEBUG oslo_concurrency.lockutils [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:43:56 np0005588920 nova_compute[226886]: 2026-01-20 14:43:56.583 226890 DEBUG oslo_concurrency.lockutils [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:43:56 np0005588920 nova_compute[226886]: 2026-01-20 14:43:56.610 226890 DEBUG nova.storage.rbd_utils [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] rbd image e9dec9ac-568f-4ce4-a58a-351e3b2fff52_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:43:56 np0005588920 nova_compute[226886]: 2026-01-20 14:43:56.614 226890 DEBUG oslo_concurrency.processutils [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 e9dec9ac-568f-4ce4-a58a-351e3b2fff52_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:43:56 np0005588920 nova_compute[226886]: 2026-01-20 14:43:56.644 226890 DEBUG nova.policy [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '37e9ef97fbe0448e9fbe32d48b66211f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3b31139b2a4e49cba5e7048febf901c4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 20 09:43:56 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:43:56 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3948072205' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:43:56 np0005588920 nova_compute[226886]: 2026-01-20 14:43:56.678 226890 DEBUG oslo_concurrency.processutils [None req-32db6479-c0bf-4267-a3cd-7b1504010fc3 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.420s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:43:56 np0005588920 nova_compute[226886]: 2026-01-20 14:43:56.690 226890 DEBUG nova.compute.provider_tree [None req-32db6479-c0bf-4267-a3cd-7b1504010fc3 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 20 09:43:56 np0005588920 nova_compute[226886]: 2026-01-20 14:43:56.712 226890 DEBUG nova.scheduler.client.report [None req-32db6479-c0bf-4267-a3cd-7b1504010fc3 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 20 09:43:56 np0005588920 nova_compute[226886]: 2026-01-20 14:43:56.748 226890 DEBUG oslo_concurrency.lockutils [None req-32db6479-c0bf-4267-a3cd-7b1504010fc3 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.597s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:43:56 np0005588920 nova_compute[226886]: 2026-01-20 14:43:56.779 226890 DEBUG nova.compute.manager [req-8902fa0d-de2d-4f4a-ac53-b2f36830157f req-3715dba9-c292-47a8-a8a2-bd8fda27af95 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Received event network-vif-plugged-5b25aa23-d016-4a23-85f7-5ba4a1159f2c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 09:43:56 np0005588920 nova_compute[226886]: 2026-01-20 14:43:56.779 226890 DEBUG oslo_concurrency.lockutils [req-8902fa0d-de2d-4f4a-ac53-b2f36830157f req-3715dba9-c292-47a8-a8a2-bd8fda27af95 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:43:56 np0005588920 nova_compute[226886]: 2026-01-20 14:43:56.779 226890 DEBUG oslo_concurrency.lockutils [req-8902fa0d-de2d-4f4a-ac53-b2f36830157f req-3715dba9-c292-47a8-a8a2-bd8fda27af95 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:43:56 np0005588920 nova_compute[226886]: 2026-01-20 14:43:56.780 226890 DEBUG oslo_concurrency.lockutils [req-8902fa0d-de2d-4f4a-ac53-b2f36830157f req-3715dba9-c292-47a8-a8a2-bd8fda27af95 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:43:56 np0005588920 nova_compute[226886]: 2026-01-20 14:43:56.780 226890 DEBUG nova.compute.manager [req-8902fa0d-de2d-4f4a-ac53-b2f36830157f req-3715dba9-c292-47a8-a8a2-bd8fda27af95 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] No waiting events found dispatching network-vif-plugged-5b25aa23-d016-4a23-85f7-5ba4a1159f2c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 20 09:43:56 np0005588920 nova_compute[226886]: 2026-01-20 14:43:56.780 226890 WARNING nova.compute.manager [req-8902fa0d-de2d-4f4a-ac53-b2f36830157f req-3715dba9-c292-47a8-a8a2-bd8fda27af95 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Received unexpected event network-vif-plugged-5b25aa23-d016-4a23-85f7-5ba4a1159f2c for instance with vm_state deleted and task_state None.
Jan 20 09:43:56 np0005588920 nova_compute[226886]: 2026-01-20 14:43:56.823 226890 INFO nova.scheduler.client.report [None req-32db6479-c0bf-4267-a3cd-7b1504010fc3 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Deleted allocations for instance 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6
Jan 20 09:43:56 np0005588920 nova_compute[226886]: 2026-01-20 14:43:56.931 226890 DEBUG oslo_concurrency.lockutils [None req-32db6479-c0bf-4267-a3cd-7b1504010fc3 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Lock "02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.456s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:43:57 np0005588920 nova_compute[226886]: 2026-01-20 14:43:57.010 226890 DEBUG oslo_concurrency.processutils [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 e9dec9ac-568f-4ce4-a58a-351e3b2fff52_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.396s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:43:57 np0005588920 nova_compute[226886]: 2026-01-20 14:43:57.068 226890 DEBUG nova.storage.rbd_utils [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] resizing rbd image e9dec9ac-568f-4ce4-a58a-351e3b2fff52_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 20 09:43:57 np0005588920 nova_compute[226886]: 2026-01-20 14:43:57.155 226890 DEBUG nova.objects.instance [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lazy-loading 'migration_context' on Instance uuid e9dec9ac-568f-4ce4-a58a-351e3b2fff52 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 09:43:57 np0005588920 nova_compute[226886]: 2026-01-20 14:43:57.173 226890 DEBUG nova.virt.libvirt.driver [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 20 09:43:57 np0005588920 nova_compute[226886]: 2026-01-20 14:43:57.174 226890 DEBUG nova.virt.libvirt.driver [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Ensure instance console log exists: /var/lib/nova/instances/e9dec9ac-568f-4ce4-a58a-351e3b2fff52/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 20 09:43:57 np0005588920 nova_compute[226886]: 2026-01-20 14:43:57.174 226890 DEBUG oslo_concurrency.lockutils [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:43:57 np0005588920 nova_compute[226886]: 2026-01-20 14:43:57.174 226890 DEBUG oslo_concurrency.lockutils [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:43:57 np0005588920 nova_compute[226886]: 2026-01-20 14:43:57.175 226890 DEBUG oslo_concurrency.lockutils [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:43:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.003000084s ======
Jan 20 09:43:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:57.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000084s
Jan 20 09:43:57 np0005588920 nova_compute[226886]: 2026-01-20 14:43:57.698 226890 DEBUG nova.network.neutron [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Successfully created port: d222354f-f133-42a4-aaf5-a102e641821a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 20 09:43:58 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:43:58 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:43:58 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:43:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:58.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:58 np0005588920 nova_compute[226886]: 2026-01-20 14:43:58.754 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:43:58 np0005588920 nova_compute[226886]: 2026-01-20 14:43:58.957 226890 DEBUG nova.network.neutron [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Successfully updated port: d222354f-f133-42a4-aaf5-a102e641821a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 20 09:43:58 np0005588920 nova_compute[226886]: 2026-01-20 14:43:58.979 226890 DEBUG oslo_concurrency.lockutils [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Acquiring lock "refresh_cache-e9dec9ac-568f-4ce4-a58a-351e3b2fff52" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 20 09:43:58 np0005588920 nova_compute[226886]: 2026-01-20 14:43:58.979 226890 DEBUG oslo_concurrency.lockutils [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Acquired lock "refresh_cache-e9dec9ac-568f-4ce4-a58a-351e3b2fff52" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 20 09:43:58 np0005588920 nova_compute[226886]: 2026-01-20 14:43:58.979 226890 DEBUG nova.network.neutron [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 20 09:43:59 np0005588920 nova_compute[226886]: 2026-01-20 14:43:59.120 226890 DEBUG nova.compute.manager [req-5aa525c5-2137-40a1-bf94-9d3f101e114b req-36f8ad3c-dd0d-4b0c-b63b-43f8641f0427 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Received event network-changed-d222354f-f133-42a4-aaf5-a102e641821a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 09:43:59 np0005588920 nova_compute[226886]: 2026-01-20 14:43:59.121 226890 DEBUG nova.compute.manager [req-5aa525c5-2137-40a1-bf94-9d3f101e114b req-36f8ad3c-dd0d-4b0c-b63b-43f8641f0427 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Refreshing instance network info cache due to event network-changed-d222354f-f133-42a4-aaf5-a102e641821a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 20 09:43:59 np0005588920 nova_compute[226886]: 2026-01-20 14:43:59.121 226890 DEBUG oslo_concurrency.lockutils [req-5aa525c5-2137-40a1-bf94-9d3f101e114b req-36f8ad3c-dd0d-4b0c-b63b-43f8641f0427 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-e9dec9ac-568f-4ce4-a58a-351e3b2fff52" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 20 09:43:59 np0005588920 nova_compute[226886]: 2026-01-20 14:43:59.246 226890 DEBUG nova.network.neutron [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 20 09:43:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:43:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:59.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:00.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:00 np0005588920 nova_compute[226886]: 2026-01-20 14:44:00.378 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:44:00 np0005588920 nova_compute[226886]: 2026-01-20 14:44:00.655 226890 DEBUG nova.network.neutron [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Updating instance_info_cache with network_info: [{"id": "d222354f-f133-42a4-aaf5-a102e641821a", "address": "fa:16:3e:a7:d0:29", "network": {"id": "fbd5d614-a7d3-4563-913c-104506628e59", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-60721994-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b31139b2a4e49cba5e7048febf901c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd222354f-f1", "ovs_interfaceid": "d222354f-f133-42a4-aaf5-a102e641821a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 20 09:44:00 np0005588920 nova_compute[226886]: 2026-01-20 14:44:00.692 226890 DEBUG oslo_concurrency.lockutils [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Releasing lock "refresh_cache-e9dec9ac-568f-4ce4-a58a-351e3b2fff52" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 20 09:44:00 np0005588920 nova_compute[226886]: 2026-01-20 14:44:00.692 226890 DEBUG nova.compute.manager [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Instance network_info: |[{"id": "d222354f-f133-42a4-aaf5-a102e641821a", "address": "fa:16:3e:a7:d0:29", "network": {"id": "fbd5d614-a7d3-4563-913c-104506628e59", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-60721994-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b31139b2a4e49cba5e7048febf901c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd222354f-f1", "ovs_interfaceid": "d222354f-f133-42a4-aaf5-a102e641821a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 20 09:44:00 np0005588920 nova_compute[226886]: 2026-01-20 14:44:00.693 226890 DEBUG oslo_concurrency.lockutils [req-5aa525c5-2137-40a1-bf94-9d3f101e114b req-36f8ad3c-dd0d-4b0c-b63b-43f8641f0427 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-e9dec9ac-568f-4ce4-a58a-351e3b2fff52" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 20 09:44:00 np0005588920 nova_compute[226886]: 2026-01-20 14:44:00.693 226890 DEBUG nova.network.neutron [req-5aa525c5-2137-40a1-bf94-9d3f101e114b req-36f8ad3c-dd0d-4b0c-b63b-43f8641f0427 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Refreshing network info cache for port d222354f-f133-42a4-aaf5-a102e641821a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 20 09:44:00 np0005588920 nova_compute[226886]: 2026-01-20 14:44:00.697 226890 DEBUG nova.virt.libvirt.driver [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Start _get_guest_xml network_info=[{"id": "d222354f-f133-42a4-aaf5-a102e641821a", "address": "fa:16:3e:a7:d0:29", "network": {"id": "fbd5d614-a7d3-4563-913c-104506628e59", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-60721994-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b31139b2a4e49cba5e7048febf901c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd222354f-f1", "ovs_interfaceid": "d222354f-f133-42a4-aaf5-a102e641821a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:44:00 np0005588920 nova_compute[226886]: 2026-01-20 14:44:00.700 226890 WARNING nova.virt.libvirt.driver [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:44:00 np0005588920 nova_compute[226886]: 2026-01-20 14:44:00.706 226890 DEBUG nova.virt.libvirt.host [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:44:00 np0005588920 nova_compute[226886]: 2026-01-20 14:44:00.707 226890 DEBUG nova.virt.libvirt.host [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:44:00 np0005588920 nova_compute[226886]: 2026-01-20 14:44:00.712 226890 DEBUG nova.virt.libvirt.host [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:44:00 np0005588920 nova_compute[226886]: 2026-01-20 14:44:00.712 226890 DEBUG nova.virt.libvirt.host [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:44:00 np0005588920 nova_compute[226886]: 2026-01-20 14:44:00.713 226890 DEBUG nova.virt.libvirt.driver [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:44:00 np0005588920 nova_compute[226886]: 2026-01-20 14:44:00.713 226890 DEBUG nova.virt.hardware [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:44:00 np0005588920 nova_compute[226886]: 2026-01-20 14:44:00.714 226890 DEBUG nova.virt.hardware [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:44:00 np0005588920 nova_compute[226886]: 2026-01-20 14:44:00.714 226890 DEBUG nova.virt.hardware [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:44:00 np0005588920 nova_compute[226886]: 2026-01-20 14:44:00.714 226890 DEBUG nova.virt.hardware [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:44:00 np0005588920 nova_compute[226886]: 2026-01-20 14:44:00.715 226890 DEBUG nova.virt.hardware [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:44:00 np0005588920 nova_compute[226886]: 2026-01-20 14:44:00.715 226890 DEBUG nova.virt.hardware [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:44:00 np0005588920 nova_compute[226886]: 2026-01-20 14:44:00.715 226890 DEBUG nova.virt.hardware [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:44:00 np0005588920 nova_compute[226886]: 2026-01-20 14:44:00.715 226890 DEBUG nova.virt.hardware [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:44:00 np0005588920 nova_compute[226886]: 2026-01-20 14:44:00.716 226890 DEBUG nova.virt.hardware [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:44:00 np0005588920 nova_compute[226886]: 2026-01-20 14:44:00.716 226890 DEBUG nova.virt.hardware [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:44:00 np0005588920 nova_compute[226886]: 2026-01-20 14:44:00.716 226890 DEBUG nova.virt.hardware [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:44:00 np0005588920 nova_compute[226886]: 2026-01-20 14:44:00.719 226890 DEBUG oslo_concurrency.processutils [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:44:01 np0005588920 podman[256347]: 2026-01-20 14:44:01.014022003 +0000 UTC m=+0.101026797 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3)
Jan 20 09:44:01 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:44:01 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1574054384' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:44:01 np0005588920 nova_compute[226886]: 2026-01-20 14:44:01.137 226890 DEBUG oslo_concurrency.processutils [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.418s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:44:01 np0005588920 nova_compute[226886]: 2026-01-20 14:44:01.159 226890 DEBUG nova.storage.rbd_utils [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] rbd image e9dec9ac-568f-4ce4-a58a-351e3b2fff52_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:44:01 np0005588920 nova_compute[226886]: 2026-01-20 14:44:01.163 226890 DEBUG oslo_concurrency.processutils [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:44:01 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:44:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:01.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:01 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:44:01 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4262766353' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:44:01 np0005588920 nova_compute[226886]: 2026-01-20 14:44:01.711 226890 DEBUG oslo_concurrency.processutils [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.548s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:44:01 np0005588920 nova_compute[226886]: 2026-01-20 14:44:01.713 226890 DEBUG nova.virt.libvirt.vif [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:43:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1094021331',display_name='tempest-DeleteServersTestJSON-server-1094021331',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1094021331',id=86,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3b31139b2a4e49cba5e7048febf901c4',ramdisk_id='',reservation_id='r-rpusmqis',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-1162922273',owner_user_name='tempest-DeleteServersTestJSON-1162922273-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:43:56Z,user_data=None,user_id='37e9ef97fbe0448e9fbe32d48b66211f',uuid=e9dec9ac-568f-4ce4-a58a-351e3b2fff52,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d222354f-f133-42a4-aaf5-a102e641821a", "address": "fa:16:3e:a7:d0:29", "network": {"id": "fbd5d614-a7d3-4563-913c-104506628e59", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-60721994-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b31139b2a4e49cba5e7048febf901c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd222354f-f1", "ovs_interfaceid": "d222354f-f133-42a4-aaf5-a102e641821a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:44:01 np0005588920 nova_compute[226886]: 2026-01-20 14:44:01.713 226890 DEBUG nova.network.os_vif_util [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Converting VIF {"id": "d222354f-f133-42a4-aaf5-a102e641821a", "address": "fa:16:3e:a7:d0:29", "network": {"id": "fbd5d614-a7d3-4563-913c-104506628e59", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-60721994-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b31139b2a4e49cba5e7048febf901c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd222354f-f1", "ovs_interfaceid": "d222354f-f133-42a4-aaf5-a102e641821a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:44:01 np0005588920 nova_compute[226886]: 2026-01-20 14:44:01.714 226890 DEBUG nova.network.os_vif_util [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a7:d0:29,bridge_name='br-int',has_traffic_filtering=True,id=d222354f-f133-42a4-aaf5-a102e641821a,network=Network(fbd5d614-a7d3-4563-913c-104506628e59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd222354f-f1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:44:01 np0005588920 nova_compute[226886]: 2026-01-20 14:44:01.715 226890 DEBUG nova.objects.instance [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lazy-loading 'pci_devices' on Instance uuid e9dec9ac-568f-4ce4-a58a-351e3b2fff52 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:44:01 np0005588920 nova_compute[226886]: 2026-01-20 14:44:01.778 226890 DEBUG nova.virt.libvirt.driver [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:44:01 np0005588920 nova_compute[226886]:  <uuid>e9dec9ac-568f-4ce4-a58a-351e3b2fff52</uuid>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:  <name>instance-00000056</name>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:44:01 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:      <nova:name>tempest-DeleteServersTestJSON-server-1094021331</nova:name>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:44:00</nova:creationTime>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:44:01 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:        <nova:user uuid="37e9ef97fbe0448e9fbe32d48b66211f">tempest-DeleteServersTestJSON-1162922273-project-member</nova:user>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:        <nova:project uuid="3b31139b2a4e49cba5e7048febf901c4">tempest-DeleteServersTestJSON-1162922273</nova:project>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:        <nova:port uuid="d222354f-f133-42a4-aaf5-a102e641821a">
Jan 20 09:44:01 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:      <entry name="serial">e9dec9ac-568f-4ce4-a58a-351e3b2fff52</entry>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:      <entry name="uuid">e9dec9ac-568f-4ce4-a58a-351e3b2fff52</entry>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:44:01 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/e9dec9ac-568f-4ce4-a58a-351e3b2fff52_disk">
Jan 20 09:44:01 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:44:01 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:44:01 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/e9dec9ac-568f-4ce4-a58a-351e3b2fff52_disk.config">
Jan 20 09:44:01 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:44:01 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:44:01 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:a7:d0:29"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:      <target dev="tapd222354f-f1"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:44:01 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/e9dec9ac-568f-4ce4-a58a-351e3b2fff52/console.log" append="off"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:44:01 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:44:01 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:44:01 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:44:01 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:44:01 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:44:01 np0005588920 nova_compute[226886]: 2026-01-20 14:44:01.779 226890 DEBUG nova.compute.manager [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Preparing to wait for external event network-vif-plugged-d222354f-f133-42a4-aaf5-a102e641821a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:44:01 np0005588920 nova_compute[226886]: 2026-01-20 14:44:01.780 226890 DEBUG oslo_concurrency.lockutils [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Acquiring lock "e9dec9ac-568f-4ce4-a58a-351e3b2fff52-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:01 np0005588920 nova_compute[226886]: 2026-01-20 14:44:01.781 226890 DEBUG oslo_concurrency.lockutils [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "e9dec9ac-568f-4ce4-a58a-351e3b2fff52-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:01 np0005588920 nova_compute[226886]: 2026-01-20 14:44:01.781 226890 DEBUG oslo_concurrency.lockutils [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "e9dec9ac-568f-4ce4-a58a-351e3b2fff52-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:01 np0005588920 nova_compute[226886]: 2026-01-20 14:44:01.782 226890 DEBUG nova.virt.libvirt.vif [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:43:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1094021331',display_name='tempest-DeleteServersTestJSON-server-1094021331',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1094021331',id=86,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3b31139b2a4e49cba5e7048febf901c4',ramdisk_id='',reservation_id='r-rpusmqis',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-1162922273',owner_user_name='tempest-DeleteSer
versTestJSON-1162922273-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:43:56Z,user_data=None,user_id='37e9ef97fbe0448e9fbe32d48b66211f',uuid=e9dec9ac-568f-4ce4-a58a-351e3b2fff52,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d222354f-f133-42a4-aaf5-a102e641821a", "address": "fa:16:3e:a7:d0:29", "network": {"id": "fbd5d614-a7d3-4563-913c-104506628e59", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-60721994-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b31139b2a4e49cba5e7048febf901c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd222354f-f1", "ovs_interfaceid": "d222354f-f133-42a4-aaf5-a102e641821a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:44:01 np0005588920 nova_compute[226886]: 2026-01-20 14:44:01.783 226890 DEBUG nova.network.os_vif_util [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Converting VIF {"id": "d222354f-f133-42a4-aaf5-a102e641821a", "address": "fa:16:3e:a7:d0:29", "network": {"id": "fbd5d614-a7d3-4563-913c-104506628e59", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-60721994-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b31139b2a4e49cba5e7048febf901c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd222354f-f1", "ovs_interfaceid": "d222354f-f133-42a4-aaf5-a102e641821a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:44:01 np0005588920 nova_compute[226886]: 2026-01-20 14:44:01.784 226890 DEBUG nova.network.os_vif_util [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a7:d0:29,bridge_name='br-int',has_traffic_filtering=True,id=d222354f-f133-42a4-aaf5-a102e641821a,network=Network(fbd5d614-a7d3-4563-913c-104506628e59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd222354f-f1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:44:01 np0005588920 nova_compute[226886]: 2026-01-20 14:44:01.784 226890 DEBUG os_vif [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a7:d0:29,bridge_name='br-int',has_traffic_filtering=True,id=d222354f-f133-42a4-aaf5-a102e641821a,network=Network(fbd5d614-a7d3-4563-913c-104506628e59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd222354f-f1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:44:01 np0005588920 nova_compute[226886]: 2026-01-20 14:44:01.785 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:01 np0005588920 nova_compute[226886]: 2026-01-20 14:44:01.786 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:44:01 np0005588920 nova_compute[226886]: 2026-01-20 14:44:01.787 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:44:01 np0005588920 nova_compute[226886]: 2026-01-20 14:44:01.790 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:01 np0005588920 nova_compute[226886]: 2026-01-20 14:44:01.791 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd222354f-f1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:44:01 np0005588920 nova_compute[226886]: 2026-01-20 14:44:01.792 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd222354f-f1, col_values=(('external_ids', {'iface-id': 'd222354f-f133-42a4-aaf5-a102e641821a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a7:d0:29', 'vm-uuid': 'e9dec9ac-568f-4ce4-a58a-351e3b2fff52'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:44:01 np0005588920 NetworkManager[49076]: <info>  [1768920241.7948] manager: (tapd222354f-f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/169)
Jan 20 09:44:01 np0005588920 nova_compute[226886]: 2026-01-20 14:44:01.794 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:01 np0005588920 nova_compute[226886]: 2026-01-20 14:44:01.798 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:44:01 np0005588920 nova_compute[226886]: 2026-01-20 14:44:01.803 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:01 np0005588920 nova_compute[226886]: 2026-01-20 14:44:01.804 226890 INFO os_vif [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a7:d0:29,bridge_name='br-int',has_traffic_filtering=True,id=d222354f-f133-42a4-aaf5-a102e641821a,network=Network(fbd5d614-a7d3-4563-913c-104506628e59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd222354f-f1')#033[00m
Jan 20 09:44:01 np0005588920 nova_compute[226886]: 2026-01-20 14:44:01.877 226890 DEBUG nova.virt.libvirt.driver [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:44:01 np0005588920 nova_compute[226886]: 2026-01-20 14:44:01.877 226890 DEBUG nova.virt.libvirt.driver [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:44:01 np0005588920 nova_compute[226886]: 2026-01-20 14:44:01.877 226890 DEBUG nova.virt.libvirt.driver [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] No VIF found with MAC fa:16:3e:a7:d0:29, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:44:01 np0005588920 nova_compute[226886]: 2026-01-20 14:44:01.878 226890 INFO nova.virt.libvirt.driver [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Using config drive#033[00m
Jan 20 09:44:01 np0005588920 nova_compute[226886]: 2026-01-20 14:44:01.900 226890 DEBUG nova.storage.rbd_utils [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] rbd image e9dec9ac-568f-4ce4-a58a-351e3b2fff52_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:44:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:44:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:02.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:44:02 np0005588920 nova_compute[226886]: 2026-01-20 14:44:02.377 226890 INFO nova.virt.libvirt.driver [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Creating config drive at /var/lib/nova/instances/e9dec9ac-568f-4ce4-a58a-351e3b2fff52/disk.config#033[00m
Jan 20 09:44:02 np0005588920 nova_compute[226886]: 2026-01-20 14:44:02.381 226890 DEBUG oslo_concurrency.processutils [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e9dec9ac-568f-4ce4-a58a-351e3b2fff52/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpi0l7a2l4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:44:02 np0005588920 nova_compute[226886]: 2026-01-20 14:44:02.509 226890 DEBUG oslo_concurrency.processutils [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e9dec9ac-568f-4ce4-a58a-351e3b2fff52/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpi0l7a2l4" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:44:02 np0005588920 nova_compute[226886]: 2026-01-20 14:44:02.541 226890 DEBUG nova.storage.rbd_utils [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] rbd image e9dec9ac-568f-4ce4-a58a-351e3b2fff52_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:44:02 np0005588920 nova_compute[226886]: 2026-01-20 14:44:02.546 226890 DEBUG oslo_concurrency.processutils [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/e9dec9ac-568f-4ce4-a58a-351e3b2fff52/disk.config e9dec9ac-568f-4ce4-a58a-351e3b2fff52_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:44:02 np0005588920 nova_compute[226886]: 2026-01-20 14:44:02.751 226890 DEBUG oslo_concurrency.processutils [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/e9dec9ac-568f-4ce4-a58a-351e3b2fff52/disk.config e9dec9ac-568f-4ce4-a58a-351e3b2fff52_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.206s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:44:02 np0005588920 nova_compute[226886]: 2026-01-20 14:44:02.753 226890 INFO nova.virt.libvirt.driver [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Deleting local config drive /var/lib/nova/instances/e9dec9ac-568f-4ce4-a58a-351e3b2fff52/disk.config because it was imported into RBD.#033[00m
Jan 20 09:44:02 np0005588920 kernel: tapd222354f-f1: entered promiscuous mode
Jan 20 09:44:02 np0005588920 nova_compute[226886]: 2026-01-20 14:44:02.819 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:02 np0005588920 NetworkManager[49076]: <info>  [1768920242.8216] manager: (tapd222354f-f1): new Tun device (/org/freedesktop/NetworkManager/Devices/170)
Jan 20 09:44:02 np0005588920 ovn_controller[133971]: 2026-01-20T14:44:02Z|00324|binding|INFO|Claiming lport d222354f-f133-42a4-aaf5-a102e641821a for this chassis.
Jan 20 09:44:02 np0005588920 ovn_controller[133971]: 2026-01-20T14:44:02Z|00325|binding|INFO|d222354f-f133-42a4-aaf5-a102e641821a: Claiming fa:16:3e:a7:d0:29 10.100.0.12
Jan 20 09:44:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:02.837 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a7:d0:29 10.100.0.12'], port_security=['fa:16:3e:a7:d0:29 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'e9dec9ac-568f-4ce4-a58a-351e3b2fff52', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbd5d614-a7d3-4563-913c-104506628e59', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3b31139b2a4e49cba5e7048febf901c4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '117d6f57-074c-4b36-b375-42e0ab117254', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c42c6982-be52-495a-8746-42a46932572f, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=d222354f-f133-42a4-aaf5-a102e641821a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:44:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:02.839 144128 INFO neutron.agent.ovn.metadata.agent [-] Port d222354f-f133-42a4-aaf5-a102e641821a in datapath fbd5d614-a7d3-4563-913c-104506628e59 bound to our chassis#033[00m
Jan 20 09:44:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:02.840 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fbd5d614-a7d3-4563-913c-104506628e59#033[00m
Jan 20 09:44:02 np0005588920 systemd-udevd[256489]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:44:02 np0005588920 systemd-machined[196121]: New machine qemu-35-instance-00000056.
Jan 20 09:44:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:02.855 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[1c7c775c-8199-44b6-82df-8930f1a06d25]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:02.856 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfbd5d614-a1 in ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:44:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:02.859 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfbd5d614-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:44:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:02.859 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d2e01c9f-73d3-4c52-995a-9fe76ca3fc3b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:02.860 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[49f29978-56e3-4827-9d47-049af6e6f05e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:02 np0005588920 NetworkManager[49076]: <info>  [1768920242.8653] device (tapd222354f-f1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:44:02 np0005588920 NetworkManager[49076]: <info>  [1768920242.8664] device (tapd222354f-f1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:44:02 np0005588920 systemd[1]: Started Virtual Machine qemu-35-instance-00000056.
Jan 20 09:44:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:02.870 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[0482fe4e-5f05-4fb2-a801-e940f83212f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:02.897 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[07b2873c-5fa6-4a34-9ffc-a72e034ee731]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:02 np0005588920 nova_compute[226886]: 2026-01-20 14:44:02.898 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:02 np0005588920 nova_compute[226886]: 2026-01-20 14:44:02.908 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:02.926 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[8a14b41c-830e-49be-99f8-12dd5d27f070]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:02.934 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[3a5a5618-e048-4693-981c-bb0a8a47ebf1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:02 np0005588920 systemd-udevd[256493]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:44:02 np0005588920 NetworkManager[49076]: <info>  [1768920242.9468] manager: (tapfbd5d614-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/171)
Jan 20 09:44:02 np0005588920 ovn_controller[133971]: 2026-01-20T14:44:02Z|00326|binding|INFO|Setting lport d222354f-f133-42a4-aaf5-a102e641821a ovn-installed in OVS
Jan 20 09:44:02 np0005588920 ovn_controller[133971]: 2026-01-20T14:44:02Z|00327|binding|INFO|Setting lport d222354f-f133-42a4-aaf5-a102e641821a up in Southbound
Jan 20 09:44:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:02.969 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[25feda91-5ed4-4790-a39e-a0e91ee44ad7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:02 np0005588920 nova_compute[226886]: 2026-01-20 14:44:02.971 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:02.972 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[3d32822b-6287-4b00-b475-07ce8e847737]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:02 np0005588920 NetworkManager[49076]: <info>  [1768920242.9932] device (tapfbd5d614-a0): carrier: link connected
Jan 20 09:44:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:03.000 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[5e61854f-7a27-4859-8f64-7b2c484e0267]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:03.015 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[8e11ad93-4523-493e-977c-62b6f8de7134]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbd5d614-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5c:38:be'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 106], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 528467, 'reachable_time': 40779, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 256522, 'error': None, 'target': 'ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:03.034 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b9341d65-0625-4807-a2c1-1e2795916670]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5c:38be'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 528467, 'tstamp': 528467}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 256523, 'error': None, 'target': 'ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:03.049 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[01ae6a9e-f1df-4b49-bc22-f51469886ee1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbd5d614-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5c:38:be'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 106], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 528467, 'reachable_time': 40779, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 256524, 'error': None, 'target': 'ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:03.079 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[bdc823a7-6bf7-4e72-b0b9-954ba911ac8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:03.152 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[dd2894e4-2266-4fcb-a710-2f8b472d0bcb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:03.153 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbd5d614-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:44:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:03.153 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:44:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:03.154 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfbd5d614-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:44:03 np0005588920 kernel: tapfbd5d614-a0: entered promiscuous mode
Jan 20 09:44:03 np0005588920 NetworkManager[49076]: <info>  [1768920243.1564] manager: (tapfbd5d614-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/172)
Jan 20 09:44:03 np0005588920 nova_compute[226886]: 2026-01-20 14:44:03.155 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:03.159 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfbd5d614-a0, col_values=(('external_ids', {'iface-id': 'b370b74e-dca0-4ff7-a96f-85b392e20721'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:44:03 np0005588920 nova_compute[226886]: 2026-01-20 14:44:03.160 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:03 np0005588920 ovn_controller[133971]: 2026-01-20T14:44:03Z|00328|binding|INFO|Releasing lport b370b74e-dca0-4ff7-a96f-85b392e20721 from this chassis (sb_readonly=0)
Jan 20 09:44:03 np0005588920 nova_compute[226886]: 2026-01-20 14:44:03.161 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:03.161 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fbd5d614-a7d3-4563-913c-104506628e59.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fbd5d614-a7d3-4563-913c-104506628e59.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:44:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:03.162 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c152b741-171b-4a01-b527-654c849c12d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:03.163 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:44:03 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 09:44:03 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 09:44:03 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-fbd5d614-a7d3-4563-913c-104506628e59
Jan 20 09:44:03 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 09:44:03 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 09:44:03 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 09:44:03 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/fbd5d614-a7d3-4563-913c-104506628e59.pid.haproxy
Jan 20 09:44:03 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 09:44:03 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:44:03 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 09:44:03 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 09:44:03 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 09:44:03 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 09:44:03 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 09:44:03 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 09:44:03 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 09:44:03 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 09:44:03 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 09:44:03 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 09:44:03 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 09:44:03 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 09:44:03 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 09:44:03 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:44:03 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:44:03 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 09:44:03 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 09:44:03 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:44:03 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID fbd5d614-a7d3-4563-913c-104506628e59
Jan 20 09:44:03 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:44:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:03.163 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59', 'env', 'PROCESS_TAG=haproxy-fbd5d614-a7d3-4563-913c-104506628e59', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fbd5d614-a7d3-4563-913c-104506628e59.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:44:03 np0005588920 nova_compute[226886]: 2026-01-20 14:44:03.174 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:03 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:44:03 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:44:03 np0005588920 nova_compute[226886]: 2026-01-20 14:44:03.325 226890 DEBUG nova.compute.manager [req-331bde43-d672-41f8-8297-15cd157c46f5 req-139c004a-63ef-4c5d-8ebe-0c5bbd494db8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Received event network-vif-plugged-d222354f-f133-42a4-aaf5-a102e641821a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:44:03 np0005588920 nova_compute[226886]: 2026-01-20 14:44:03.326 226890 DEBUG oslo_concurrency.lockutils [req-331bde43-d672-41f8-8297-15cd157c46f5 req-139c004a-63ef-4c5d-8ebe-0c5bbd494db8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "e9dec9ac-568f-4ce4-a58a-351e3b2fff52-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:03 np0005588920 nova_compute[226886]: 2026-01-20 14:44:03.327 226890 DEBUG oslo_concurrency.lockutils [req-331bde43-d672-41f8-8297-15cd157c46f5 req-139c004a-63ef-4c5d-8ebe-0c5bbd494db8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "e9dec9ac-568f-4ce4-a58a-351e3b2fff52-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:03 np0005588920 nova_compute[226886]: 2026-01-20 14:44:03.327 226890 DEBUG oslo_concurrency.lockutils [req-331bde43-d672-41f8-8297-15cd157c46f5 req-139c004a-63ef-4c5d-8ebe-0c5bbd494db8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "e9dec9ac-568f-4ce4-a58a-351e3b2fff52-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:03 np0005588920 nova_compute[226886]: 2026-01-20 14:44:03.328 226890 DEBUG nova.compute.manager [req-331bde43-d672-41f8-8297-15cd157c46f5 req-139c004a-63ef-4c5d-8ebe-0c5bbd494db8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Processing event network-vif-plugged-d222354f-f133-42a4-aaf5-a102e641821a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:44:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:03.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:03 np0005588920 podman[256604]: 2026-01-20 14:44:03.575399073 +0000 UTC m=+0.060481466 container create 39b6d852d507fee90ed5df0c91a5caaf81e66d1bf32f65709b68393bc9d4c139 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:44:03 np0005588920 systemd[1]: Started libpod-conmon-39b6d852d507fee90ed5df0c91a5caaf81e66d1bf32f65709b68393bc9d4c139.scope.
Jan 20 09:44:03 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:44:03 np0005588920 podman[256604]: 2026-01-20 14:44:03.536503399 +0000 UTC m=+0.021585802 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:44:03 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/19c3c1d69ff97c7b0271f1b5f9a569205c08405894113a942c7d64b43204733e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:44:03 np0005588920 podman[256604]: 2026-01-20 14:44:03.65167429 +0000 UTC m=+0.136756653 container init 39b6d852d507fee90ed5df0c91a5caaf81e66d1bf32f65709b68393bc9d4c139 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 20 09:44:03 np0005588920 podman[256604]: 2026-01-20 14:44:03.656243277 +0000 UTC m=+0.141325640 container start 39b6d852d507fee90ed5df0c91a5caaf81e66d1bf32f65709b68393bc9d4c139 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:44:03 np0005588920 neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59[256619]: [NOTICE]   (256623) : New worker (256625) forked
Jan 20 09:44:03 np0005588920 neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59[256619]: [NOTICE]   (256623) : Loading success.
Jan 20 09:44:03 np0005588920 nova_compute[226886]: 2026-01-20 14:44:03.697 226890 DEBUG nova.network.neutron [req-5aa525c5-2137-40a1-bf94-9d3f101e114b req-36f8ad3c-dd0d-4b0c-b63b-43f8641f0427 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Updated VIF entry in instance network info cache for port d222354f-f133-42a4-aaf5-a102e641821a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:44:03 np0005588920 nova_compute[226886]: 2026-01-20 14:44:03.698 226890 DEBUG nova.network.neutron [req-5aa525c5-2137-40a1-bf94-9d3f101e114b req-36f8ad3c-dd0d-4b0c-b63b-43f8641f0427 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Updating instance_info_cache with network_info: [{"id": "d222354f-f133-42a4-aaf5-a102e641821a", "address": "fa:16:3e:a7:d0:29", "network": {"id": "fbd5d614-a7d3-4563-913c-104506628e59", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-60721994-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b31139b2a4e49cba5e7048febf901c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd222354f-f1", "ovs_interfaceid": "d222354f-f133-42a4-aaf5-a102e641821a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:44:03 np0005588920 nova_compute[226886]: 2026-01-20 14:44:03.728 226890 DEBUG oslo_concurrency.lockutils [req-5aa525c5-2137-40a1-bf94-9d3f101e114b req-36f8ad3c-dd0d-4b0c-b63b-43f8641f0427 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-e9dec9ac-568f-4ce4-a58a-351e3b2fff52" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:44:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:44:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:04.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:44:04 np0005588920 nova_compute[226886]: 2026-01-20 14:44:04.241 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920244.2405405, e9dec9ac-568f-4ce4-a58a-351e3b2fff52 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:44:04 np0005588920 nova_compute[226886]: 2026-01-20 14:44:04.242 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] VM Started (Lifecycle Event)#033[00m
Jan 20 09:44:04 np0005588920 nova_compute[226886]: 2026-01-20 14:44:04.244 226890 DEBUG nova.compute.manager [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:44:04 np0005588920 nova_compute[226886]: 2026-01-20 14:44:04.248 226890 DEBUG nova.virt.libvirt.driver [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:44:04 np0005588920 nova_compute[226886]: 2026-01-20 14:44:04.252 226890 INFO nova.virt.libvirt.driver [-] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Instance spawned successfully.#033[00m
Jan 20 09:44:04 np0005588920 nova_compute[226886]: 2026-01-20 14:44:04.252 226890 DEBUG nova.virt.libvirt.driver [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:44:04 np0005588920 nova_compute[226886]: 2026-01-20 14:44:04.277 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:44:04 np0005588920 nova_compute[226886]: 2026-01-20 14:44:04.282 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:44:04 np0005588920 nova_compute[226886]: 2026-01-20 14:44:04.285 226890 DEBUG nova.virt.libvirt.driver [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:44:04 np0005588920 nova_compute[226886]: 2026-01-20 14:44:04.285 226890 DEBUG nova.virt.libvirt.driver [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:44:04 np0005588920 nova_compute[226886]: 2026-01-20 14:44:04.286 226890 DEBUG nova.virt.libvirt.driver [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:44:04 np0005588920 nova_compute[226886]: 2026-01-20 14:44:04.286 226890 DEBUG nova.virt.libvirt.driver [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:44:04 np0005588920 nova_compute[226886]: 2026-01-20 14:44:04.286 226890 DEBUG nova.virt.libvirt.driver [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:44:04 np0005588920 nova_compute[226886]: 2026-01-20 14:44:04.287 226890 DEBUG nova.virt.libvirt.driver [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:44:04 np0005588920 nova_compute[226886]: 2026-01-20 14:44:04.329 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:44:04 np0005588920 nova_compute[226886]: 2026-01-20 14:44:04.329 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920244.2407703, e9dec9ac-568f-4ce4-a58a-351e3b2fff52 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:44:04 np0005588920 nova_compute[226886]: 2026-01-20 14:44:04.329 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:44:04 np0005588920 nova_compute[226886]: 2026-01-20 14:44:04.369 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:44:04 np0005588920 nova_compute[226886]: 2026-01-20 14:44:04.373 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920244.2473938, e9dec9ac-568f-4ce4-a58a-351e3b2fff52 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:44:04 np0005588920 nova_compute[226886]: 2026-01-20 14:44:04.373 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:44:04 np0005588920 nova_compute[226886]: 2026-01-20 14:44:04.382 226890 INFO nova.compute.manager [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Took 8.00 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:44:04 np0005588920 nova_compute[226886]: 2026-01-20 14:44:04.382 226890 DEBUG nova.compute.manager [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:44:04 np0005588920 nova_compute[226886]: 2026-01-20 14:44:04.396 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:44:04 np0005588920 nova_compute[226886]: 2026-01-20 14:44:04.403 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:44:04 np0005588920 nova_compute[226886]: 2026-01-20 14:44:04.435 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:44:04 np0005588920 nova_compute[226886]: 2026-01-20 14:44:04.457 226890 INFO nova.compute.manager [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Took 9.12 seconds to build instance.#033[00m
Jan 20 09:44:04 np0005588920 nova_compute[226886]: 2026-01-20 14:44:04.482 226890 DEBUG oslo_concurrency.lockutils [None req-9f9e9cbd-e735-4f6f-a1a9-a20a1a8b527f 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "e9dec9ac-568f-4ce4-a58a-351e3b2fff52" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.258s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:05 np0005588920 nova_compute[226886]: 2026-01-20 14:44:05.380 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:05.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:05 np0005588920 nova_compute[226886]: 2026-01-20 14:44:05.465 226890 DEBUG nova.compute.manager [req-84e69318-64a6-4de5-b1bc-863053a23940 req-656aee56-5da4-4fc0-b004-66b3cd50bd12 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Received event network-vif-plugged-d222354f-f133-42a4-aaf5-a102e641821a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:44:05 np0005588920 nova_compute[226886]: 2026-01-20 14:44:05.465 226890 DEBUG oslo_concurrency.lockutils [req-84e69318-64a6-4de5-b1bc-863053a23940 req-656aee56-5da4-4fc0-b004-66b3cd50bd12 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "e9dec9ac-568f-4ce4-a58a-351e3b2fff52-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:05 np0005588920 nova_compute[226886]: 2026-01-20 14:44:05.467 226890 DEBUG oslo_concurrency.lockutils [req-84e69318-64a6-4de5-b1bc-863053a23940 req-656aee56-5da4-4fc0-b004-66b3cd50bd12 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "e9dec9ac-568f-4ce4-a58a-351e3b2fff52-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:05 np0005588920 nova_compute[226886]: 2026-01-20 14:44:05.467 226890 DEBUG oslo_concurrency.lockutils [req-84e69318-64a6-4de5-b1bc-863053a23940 req-656aee56-5da4-4fc0-b004-66b3cd50bd12 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "e9dec9ac-568f-4ce4-a58a-351e3b2fff52-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:05 np0005588920 nova_compute[226886]: 2026-01-20 14:44:05.468 226890 DEBUG nova.compute.manager [req-84e69318-64a6-4de5-b1bc-863053a23940 req-656aee56-5da4-4fc0-b004-66b3cd50bd12 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] No waiting events found dispatching network-vif-plugged-d222354f-f133-42a4-aaf5-a102e641821a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:44:05 np0005588920 nova_compute[226886]: 2026-01-20 14:44:05.469 226890 WARNING nova.compute.manager [req-84e69318-64a6-4de5-b1bc-863053a23940 req-656aee56-5da4-4fc0-b004-66b3cd50bd12 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Received unexpected event network-vif-plugged-d222354f-f133-42a4-aaf5-a102e641821a for instance with vm_state active and task_state None.#033[00m
Jan 20 09:44:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:06.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:06 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:44:06 np0005588920 nova_compute[226886]: 2026-01-20 14:44:06.688 226890 DEBUG oslo_concurrency.lockutils [None req-16cb69a8-a9f0-4400-bb72-2d88f1f22b99 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Acquiring lock "e9dec9ac-568f-4ce4-a58a-351e3b2fff52" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:06 np0005588920 nova_compute[226886]: 2026-01-20 14:44:06.689 226890 DEBUG oslo_concurrency.lockutils [None req-16cb69a8-a9f0-4400-bb72-2d88f1f22b99 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "e9dec9ac-568f-4ce4-a58a-351e3b2fff52" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:06 np0005588920 nova_compute[226886]: 2026-01-20 14:44:06.690 226890 DEBUG oslo_concurrency.lockutils [None req-16cb69a8-a9f0-4400-bb72-2d88f1f22b99 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Acquiring lock "e9dec9ac-568f-4ce4-a58a-351e3b2fff52-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:06 np0005588920 nova_compute[226886]: 2026-01-20 14:44:06.690 226890 DEBUG oslo_concurrency.lockutils [None req-16cb69a8-a9f0-4400-bb72-2d88f1f22b99 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "e9dec9ac-568f-4ce4-a58a-351e3b2fff52-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:06 np0005588920 nova_compute[226886]: 2026-01-20 14:44:06.691 226890 DEBUG oslo_concurrency.lockutils [None req-16cb69a8-a9f0-4400-bb72-2d88f1f22b99 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "e9dec9ac-568f-4ce4-a58a-351e3b2fff52-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:06 np0005588920 nova_compute[226886]: 2026-01-20 14:44:06.692 226890 INFO nova.compute.manager [None req-16cb69a8-a9f0-4400-bb72-2d88f1f22b99 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Terminating instance#033[00m
Jan 20 09:44:06 np0005588920 nova_compute[226886]: 2026-01-20 14:44:06.694 226890 DEBUG nova.compute.manager [None req-16cb69a8-a9f0-4400-bb72-2d88f1f22b99 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:44:06 np0005588920 kernel: tapd222354f-f1 (unregistering): left promiscuous mode
Jan 20 09:44:06 np0005588920 NetworkManager[49076]: <info>  [1768920246.7424] device (tapd222354f-f1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:44:06 np0005588920 ovn_controller[133971]: 2026-01-20T14:44:06Z|00329|binding|INFO|Releasing lport d222354f-f133-42a4-aaf5-a102e641821a from this chassis (sb_readonly=0)
Jan 20 09:44:06 np0005588920 ovn_controller[133971]: 2026-01-20T14:44:06Z|00330|binding|INFO|Setting lport d222354f-f133-42a4-aaf5-a102e641821a down in Southbound
Jan 20 09:44:06 np0005588920 nova_compute[226886]: 2026-01-20 14:44:06.752 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:06 np0005588920 ovn_controller[133971]: 2026-01-20T14:44:06Z|00331|binding|INFO|Removing iface tapd222354f-f1 ovn-installed in OVS
Jan 20 09:44:06 np0005588920 nova_compute[226886]: 2026-01-20 14:44:06.755 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:06 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:06.762 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a7:d0:29 10.100.0.12'], port_security=['fa:16:3e:a7:d0:29 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'e9dec9ac-568f-4ce4-a58a-351e3b2fff52', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbd5d614-a7d3-4563-913c-104506628e59', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3b31139b2a4e49cba5e7048febf901c4', 'neutron:revision_number': '4', 'neutron:security_group_ids': '117d6f57-074c-4b36-b375-42e0ab117254', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c42c6982-be52-495a-8746-42a46932572f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=d222354f-f133-42a4-aaf5-a102e641821a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:44:06 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:06.764 144128 INFO neutron.agent.ovn.metadata.agent [-] Port d222354f-f133-42a4-aaf5-a102e641821a in datapath fbd5d614-a7d3-4563-913c-104506628e59 unbound from our chassis#033[00m
Jan 20 09:44:06 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:06.768 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fbd5d614-a7d3-4563-913c-104506628e59, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:44:06 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:06.769 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c61a2fed-aa58-4aa8-bc2f-eadcc0cdb5ad]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:06 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:06.770 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59 namespace which is not needed anymore#033[00m
Jan 20 09:44:06 np0005588920 nova_compute[226886]: 2026-01-20 14:44:06.786 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:06 np0005588920 nova_compute[226886]: 2026-01-20 14:44:06.794 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:06 np0005588920 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d00000056.scope: Deactivated successfully.
Jan 20 09:44:06 np0005588920 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d00000056.scope: Consumed 3.631s CPU time.
Jan 20 09:44:06 np0005588920 systemd-machined[196121]: Machine qemu-35-instance-00000056 terminated.
Jan 20 09:44:06 np0005588920 nova_compute[226886]: 2026-01-20 14:44:06.917 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:06 np0005588920 nova_compute[226886]: 2026-01-20 14:44:06.923 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:06 np0005588920 neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59[256619]: [NOTICE]   (256623) : haproxy version is 2.8.14-c23fe91
Jan 20 09:44:06 np0005588920 neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59[256619]: [NOTICE]   (256623) : path to executable is /usr/sbin/haproxy
Jan 20 09:44:06 np0005588920 neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59[256619]: [WARNING]  (256623) : Exiting Master process...
Jan 20 09:44:06 np0005588920 neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59[256619]: [ALERT]    (256623) : Current worker (256625) exited with code 143 (Terminated)
Jan 20 09:44:06 np0005588920 neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59[256619]: [WARNING]  (256623) : All workers exited. Exiting... (0)
Jan 20 09:44:06 np0005588920 systemd[1]: libpod-39b6d852d507fee90ed5df0c91a5caaf81e66d1bf32f65709b68393bc9d4c139.scope: Deactivated successfully.
Jan 20 09:44:06 np0005588920 nova_compute[226886]: 2026-01-20 14:44:06.938 226890 INFO nova.virt.libvirt.driver [-] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Instance destroyed successfully.#033[00m
Jan 20 09:44:06 np0005588920 nova_compute[226886]: 2026-01-20 14:44:06.938 226890 DEBUG nova.objects.instance [None req-16cb69a8-a9f0-4400-bb72-2d88f1f22b99 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lazy-loading 'resources' on Instance uuid e9dec9ac-568f-4ce4-a58a-351e3b2fff52 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:44:06 np0005588920 podman[256700]: 2026-01-20 14:44:06.93943177 +0000 UTC m=+0.055172189 container died 39b6d852d507fee90ed5df0c91a5caaf81e66d1bf32f65709b68393bc9d4c139 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:44:06 np0005588920 nova_compute[226886]: 2026-01-20 14:44:06.963 226890 DEBUG nova.virt.libvirt.vif [None req-16cb69a8-a9f0-4400-bb72-2d88f1f22b99 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:43:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1094021331',display_name='tempest-DeleteServersTestJSON-server-1094021331',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1094021331',id=86,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:44:04Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3b31139b2a4e49cba5e7048febf901c4',ramdisk_id='',reservation_id='r-rpusmqis',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-DeleteServersTestJSON-1162922273',owner_user_name='tempest-DeleteServersTestJSON-1162922273-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:44:04Z,user_data=None,user_id='37e9ef97fbe0448e9fbe32d48b66211f',uuid=e9dec9ac-568f-4ce4-a58a-351e3b2fff52,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d222354f-f133-42a4-aaf5-a102e641821a", "address": "fa:16:3e:a7:d0:29", "network": {"id": "fbd5d614-a7d3-4563-913c-104506628e59", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-60721994-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b31139b2a4e49cba5e7048febf901c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd222354f-f1", "ovs_interfaceid": "d222354f-f133-42a4-aaf5-a102e641821a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:44:06 np0005588920 nova_compute[226886]: 2026-01-20 14:44:06.963 226890 DEBUG nova.network.os_vif_util [None req-16cb69a8-a9f0-4400-bb72-2d88f1f22b99 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Converting VIF {"id": "d222354f-f133-42a4-aaf5-a102e641821a", "address": "fa:16:3e:a7:d0:29", "network": {"id": "fbd5d614-a7d3-4563-913c-104506628e59", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-60721994-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b31139b2a4e49cba5e7048febf901c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd222354f-f1", "ovs_interfaceid": "d222354f-f133-42a4-aaf5-a102e641821a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:44:06 np0005588920 nova_compute[226886]: 2026-01-20 14:44:06.964 226890 DEBUG nova.network.os_vif_util [None req-16cb69a8-a9f0-4400-bb72-2d88f1f22b99 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a7:d0:29,bridge_name='br-int',has_traffic_filtering=True,id=d222354f-f133-42a4-aaf5-a102e641821a,network=Network(fbd5d614-a7d3-4563-913c-104506628e59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd222354f-f1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:44:06 np0005588920 nova_compute[226886]: 2026-01-20 14:44:06.964 226890 DEBUG os_vif [None req-16cb69a8-a9f0-4400-bb72-2d88f1f22b99 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a7:d0:29,bridge_name='br-int',has_traffic_filtering=True,id=d222354f-f133-42a4-aaf5-a102e641821a,network=Network(fbd5d614-a7d3-4563-913c-104506628e59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd222354f-f1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:44:06 np0005588920 nova_compute[226886]: 2026-01-20 14:44:06.966 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:06 np0005588920 nova_compute[226886]: 2026-01-20 14:44:06.966 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd222354f-f1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:44:06 np0005588920 nova_compute[226886]: 2026-01-20 14:44:06.967 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:06 np0005588920 nova_compute[226886]: 2026-01-20 14:44:06.970 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:44:06 np0005588920 nova_compute[226886]: 2026-01-20 14:44:06.972 226890 INFO os_vif [None req-16cb69a8-a9f0-4400-bb72-2d88f1f22b99 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a7:d0:29,bridge_name='br-int',has_traffic_filtering=True,id=d222354f-f133-42a4-aaf5-a102e641821a,network=Network(fbd5d614-a7d3-4563-913c-104506628e59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd222354f-f1')#033[00m
Jan 20 09:44:06 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-39b6d852d507fee90ed5df0c91a5caaf81e66d1bf32f65709b68393bc9d4c139-userdata-shm.mount: Deactivated successfully.
Jan 20 09:44:06 np0005588920 systemd[1]: var-lib-containers-storage-overlay-19c3c1d69ff97c7b0271f1b5f9a569205c08405894113a942c7d64b43204733e-merged.mount: Deactivated successfully.
Jan 20 09:44:06 np0005588920 podman[256700]: 2026-01-20 14:44:06.984937768 +0000 UTC m=+0.100678177 container cleanup 39b6d852d507fee90ed5df0c91a5caaf81e66d1bf32f65709b68393bc9d4c139 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 20 09:44:06 np0005588920 systemd[1]: libpod-conmon-39b6d852d507fee90ed5df0c91a5caaf81e66d1bf32f65709b68393bc9d4c139.scope: Deactivated successfully.
Jan 20 09:44:07 np0005588920 podman[256753]: 2026-01-20 14:44:07.045537397 +0000 UTC m=+0.039596074 container remove 39b6d852d507fee90ed5df0c91a5caaf81e66d1bf32f65709b68393bc9d4c139 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 20 09:44:07 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:07.053 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[4518925b-5a4d-4da4-bc81-0f8656fa466f]: (4, ('Tue Jan 20 02:44:06 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59 (39b6d852d507fee90ed5df0c91a5caaf81e66d1bf32f65709b68393bc9d4c139)\n39b6d852d507fee90ed5df0c91a5caaf81e66d1bf32f65709b68393bc9d4c139\nTue Jan 20 02:44:06 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59 (39b6d852d507fee90ed5df0c91a5caaf81e66d1bf32f65709b68393bc9d4c139)\n39b6d852d507fee90ed5df0c91a5caaf81e66d1bf32f65709b68393bc9d4c139\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:07 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:07.055 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[989c0a65-c0d7-4259-9f20-0acc7bc62928]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:07 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:07.055 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbd5d614-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:44:07 np0005588920 nova_compute[226886]: 2026-01-20 14:44:07.057 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:07 np0005588920 kernel: tapfbd5d614-a0: left promiscuous mode
Jan 20 09:44:07 np0005588920 nova_compute[226886]: 2026-01-20 14:44:07.075 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:07 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:07.077 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c94414ae-f19d-4c1e-80ce-c8dfcb1d10b2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:07 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:07.099 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a0d5512c-9594-4088-bbc4-a4b253716738]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:07 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:07.100 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[74b3aa8c-0f52-42c7-88af-e37c4a023c7c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:07 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:07.119 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2d6d2f20-64a4-4f96-ba71-71b5cb41c2ee]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 528460, 'reachable_time': 29640, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 256771, 'error': None, 'target': 'ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:07 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:07.121 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:44:07 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:07.121 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[15032c88-9846-456e-acab-e8ba1017b2b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:07 np0005588920 systemd[1]: run-netns-ovnmeta\x2dfbd5d614\x2da7d3\x2d4563\x2d913c\x2d104506628e59.mount: Deactivated successfully.
Jan 20 09:44:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:07.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:07 np0005588920 nova_compute[226886]: 2026-01-20 14:44:07.687 226890 DEBUG nova.compute.manager [req-8e71cb69-bfea-4fb7-8708-ee066fffaf03 req-e7fb4c48-4628-4023-a4d6-6e35581bf198 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Received event network-vif-unplugged-d222354f-f133-42a4-aaf5-a102e641821a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:44:07 np0005588920 nova_compute[226886]: 2026-01-20 14:44:07.688 226890 DEBUG oslo_concurrency.lockutils [req-8e71cb69-bfea-4fb7-8708-ee066fffaf03 req-e7fb4c48-4628-4023-a4d6-6e35581bf198 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "e9dec9ac-568f-4ce4-a58a-351e3b2fff52-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:07 np0005588920 nova_compute[226886]: 2026-01-20 14:44:07.688 226890 DEBUG oslo_concurrency.lockutils [req-8e71cb69-bfea-4fb7-8708-ee066fffaf03 req-e7fb4c48-4628-4023-a4d6-6e35581bf198 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "e9dec9ac-568f-4ce4-a58a-351e3b2fff52-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:07 np0005588920 nova_compute[226886]: 2026-01-20 14:44:07.688 226890 DEBUG oslo_concurrency.lockutils [req-8e71cb69-bfea-4fb7-8708-ee066fffaf03 req-e7fb4c48-4628-4023-a4d6-6e35581bf198 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "e9dec9ac-568f-4ce4-a58a-351e3b2fff52-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:07 np0005588920 nova_compute[226886]: 2026-01-20 14:44:07.689 226890 DEBUG nova.compute.manager [req-8e71cb69-bfea-4fb7-8708-ee066fffaf03 req-e7fb4c48-4628-4023-a4d6-6e35581bf198 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] No waiting events found dispatching network-vif-unplugged-d222354f-f133-42a4-aaf5-a102e641821a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:44:07 np0005588920 nova_compute[226886]: 2026-01-20 14:44:07.689 226890 DEBUG nova.compute.manager [req-8e71cb69-bfea-4fb7-8708-ee066fffaf03 req-e7fb4c48-4628-4023-a4d6-6e35581bf198 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Received event network-vif-unplugged-d222354f-f133-42a4-aaf5-a102e641821a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:44:07 np0005588920 nova_compute[226886]: 2026-01-20 14:44:07.689 226890 DEBUG nova.compute.manager [req-8e71cb69-bfea-4fb7-8708-ee066fffaf03 req-e7fb4c48-4628-4023-a4d6-6e35581bf198 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Received event network-vif-plugged-d222354f-f133-42a4-aaf5-a102e641821a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:44:07 np0005588920 nova_compute[226886]: 2026-01-20 14:44:07.689 226890 DEBUG oslo_concurrency.lockutils [req-8e71cb69-bfea-4fb7-8708-ee066fffaf03 req-e7fb4c48-4628-4023-a4d6-6e35581bf198 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "e9dec9ac-568f-4ce4-a58a-351e3b2fff52-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:07 np0005588920 nova_compute[226886]: 2026-01-20 14:44:07.690 226890 DEBUG oslo_concurrency.lockutils [req-8e71cb69-bfea-4fb7-8708-ee066fffaf03 req-e7fb4c48-4628-4023-a4d6-6e35581bf198 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "e9dec9ac-568f-4ce4-a58a-351e3b2fff52-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:07 np0005588920 nova_compute[226886]: 2026-01-20 14:44:07.690 226890 DEBUG oslo_concurrency.lockutils [req-8e71cb69-bfea-4fb7-8708-ee066fffaf03 req-e7fb4c48-4628-4023-a4d6-6e35581bf198 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "e9dec9ac-568f-4ce4-a58a-351e3b2fff52-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:07 np0005588920 nova_compute[226886]: 2026-01-20 14:44:07.690 226890 DEBUG nova.compute.manager [req-8e71cb69-bfea-4fb7-8708-ee066fffaf03 req-e7fb4c48-4628-4023-a4d6-6e35581bf198 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] No waiting events found dispatching network-vif-plugged-d222354f-f133-42a4-aaf5-a102e641821a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:44:07 np0005588920 nova_compute[226886]: 2026-01-20 14:44:07.691 226890 WARNING nova.compute.manager [req-8e71cb69-bfea-4fb7-8708-ee066fffaf03 req-e7fb4c48-4628-4023-a4d6-6e35581bf198 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Received unexpected event network-vif-plugged-d222354f-f133-42a4-aaf5-a102e641821a for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:44:07 np0005588920 nova_compute[226886]: 2026-01-20 14:44:07.734 226890 INFO nova.virt.libvirt.driver [None req-16cb69a8-a9f0-4400-bb72-2d88f1f22b99 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Deleting instance files /var/lib/nova/instances/e9dec9ac-568f-4ce4-a58a-351e3b2fff52_del#033[00m
Jan 20 09:44:07 np0005588920 nova_compute[226886]: 2026-01-20 14:44:07.735 226890 INFO nova.virt.libvirt.driver [None req-16cb69a8-a9f0-4400-bb72-2d88f1f22b99 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Deletion of /var/lib/nova/instances/e9dec9ac-568f-4ce4-a58a-351e3b2fff52_del complete#033[00m
Jan 20 09:44:07 np0005588920 nova_compute[226886]: 2026-01-20 14:44:07.809 226890 INFO nova.compute.manager [None req-16cb69a8-a9f0-4400-bb72-2d88f1f22b99 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Took 1.11 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:44:07 np0005588920 nova_compute[226886]: 2026-01-20 14:44:07.810 226890 DEBUG oslo.service.loopingcall [None req-16cb69a8-a9f0-4400-bb72-2d88f1f22b99 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:44:07 np0005588920 nova_compute[226886]: 2026-01-20 14:44:07.810 226890 DEBUG nova.compute.manager [-] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:44:07 np0005588920 nova_compute[226886]: 2026-01-20 14:44:07.810 226890 DEBUG nova.network.neutron [-] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:44:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:08.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:08 np0005588920 nova_compute[226886]: 2026-01-20 14:44:08.716 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768920233.7154505, 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:44:08 np0005588920 nova_compute[226886]: 2026-01-20 14:44:08.717 226890 INFO nova.compute.manager [-] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:44:08 np0005588920 nova_compute[226886]: 2026-01-20 14:44:08.754 226890 DEBUG nova.compute.manager [None req-a67f033e-a38c-4fe0-a736-c12b81584426 - - - - - -] [instance: 02cb232b-4ecb-41bb-8d5f-4ad6f2e6bae6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:44:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:44:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:09.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:44:09 np0005588920 nova_compute[226886]: 2026-01-20 14:44:09.466 226890 DEBUG nova.network.neutron [-] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:44:09 np0005588920 nova_compute[226886]: 2026-01-20 14:44:09.502 226890 INFO nova.compute.manager [-] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Took 1.69 seconds to deallocate network for instance.#033[00m
Jan 20 09:44:09 np0005588920 nova_compute[226886]: 2026-01-20 14:44:09.626 226890 DEBUG nova.compute.manager [req-225dc67a-2a57-4d6a-b5e2-d9f4f44abbda req-43321944-788d-4032-989c-a1171ae4169d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Received event network-vif-deleted-d222354f-f133-42a4-aaf5-a102e641821a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:44:09 np0005588920 nova_compute[226886]: 2026-01-20 14:44:09.634 226890 DEBUG oslo_concurrency.lockutils [None req-16cb69a8-a9f0-4400-bb72-2d88f1f22b99 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:09 np0005588920 nova_compute[226886]: 2026-01-20 14:44:09.634 226890 DEBUG oslo_concurrency.lockutils [None req-16cb69a8-a9f0-4400-bb72-2d88f1f22b99 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:09 np0005588920 nova_compute[226886]: 2026-01-20 14:44:09.694 226890 DEBUG oslo_concurrency.processutils [None req-16cb69a8-a9f0-4400-bb72-2d88f1f22b99 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:44:10 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:44:10 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1216611789' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:44:10 np0005588920 nova_compute[226886]: 2026-01-20 14:44:10.112 226890 DEBUG oslo_concurrency.processutils [None req-16cb69a8-a9f0-4400-bb72-2d88f1f22b99 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.418s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:44:10 np0005588920 nova_compute[226886]: 2026-01-20 14:44:10.119 226890 DEBUG nova.compute.provider_tree [None req-16cb69a8-a9f0-4400-bb72-2d88f1f22b99 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:44:10 np0005588920 nova_compute[226886]: 2026-01-20 14:44:10.139 226890 DEBUG nova.scheduler.client.report [None req-16cb69a8-a9f0-4400-bb72-2d88f1f22b99 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:44:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:10.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:10 np0005588920 nova_compute[226886]: 2026-01-20 14:44:10.182 226890 DEBUG oslo_concurrency.lockutils [None req-16cb69a8-a9f0-4400-bb72-2d88f1f22b99 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.548s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:10 np0005588920 nova_compute[226886]: 2026-01-20 14:44:10.233 226890 INFO nova.scheduler.client.report [None req-16cb69a8-a9f0-4400-bb72-2d88f1f22b99 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Deleted allocations for instance e9dec9ac-568f-4ce4-a58a-351e3b2fff52#033[00m
Jan 20 09:44:10 np0005588920 nova_compute[226886]: 2026-01-20 14:44:10.304 226890 DEBUG oslo_concurrency.lockutils [None req-16cb69a8-a9f0-4400-bb72-2d88f1f22b99 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "e9dec9ac-568f-4ce4-a58a-351e3b2fff52" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.615s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:10 np0005588920 nova_compute[226886]: 2026-01-20 14:44:10.382 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:11 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:44:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:11.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:11 np0005588920 nova_compute[226886]: 2026-01-20 14:44:11.970 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:44:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:12.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:44:12 np0005588920 podman[256796]: 2026-01-20 14:44:12.991768314 +0000 UTC m=+0.052204236 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:44:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:13.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 09:44:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1650611202' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 09:44:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 09:44:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1650611202' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 09:44:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:14.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:15 np0005588920 nova_compute[226886]: 2026-01-20 14:44:15.384 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:44:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:15.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:44:15 np0005588920 nova_compute[226886]: 2026-01-20 14:44:15.560 226890 DEBUG oslo_concurrency.lockutils [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Acquiring lock "c3c7df3a-49bb-4ca6-a517-e560cf730181" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:15 np0005588920 nova_compute[226886]: 2026-01-20 14:44:15.560 226890 DEBUG oslo_concurrency.lockutils [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "c3c7df3a-49bb-4ca6-a517-e560cf730181" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:15 np0005588920 nova_compute[226886]: 2026-01-20 14:44:15.595 226890 DEBUG nova.compute.manager [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:44:15 np0005588920 nova_compute[226886]: 2026-01-20 14:44:15.694 226890 DEBUG oslo_concurrency.lockutils [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:15 np0005588920 nova_compute[226886]: 2026-01-20 14:44:15.695 226890 DEBUG oslo_concurrency.lockutils [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:15 np0005588920 nova_compute[226886]: 2026-01-20 14:44:15.700 226890 DEBUG nova.virt.hardware [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:44:15 np0005588920 nova_compute[226886]: 2026-01-20 14:44:15.700 226890 INFO nova.compute.claims [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 09:44:15 np0005588920 nova_compute[226886]: 2026-01-20 14:44:15.883 226890 DEBUG oslo_concurrency.processutils [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:44:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:16.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:16 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:44:16 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2953342997' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:44:16 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:44:16 np0005588920 nova_compute[226886]: 2026-01-20 14:44:16.333 226890 DEBUG oslo_concurrency.processutils [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:44:16 np0005588920 nova_compute[226886]: 2026-01-20 14:44:16.340 226890 DEBUG nova.compute.provider_tree [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:44:16 np0005588920 nova_compute[226886]: 2026-01-20 14:44:16.368 226890 DEBUG nova.scheduler.client.report [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:44:16 np0005588920 nova_compute[226886]: 2026-01-20 14:44:16.392 226890 DEBUG oslo_concurrency.lockutils [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.697s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:16 np0005588920 nova_compute[226886]: 2026-01-20 14:44:16.393 226890 DEBUG nova.compute.manager [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:44:16 np0005588920 nova_compute[226886]: 2026-01-20 14:44:16.444 226890 DEBUG nova.compute.manager [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:44:16 np0005588920 nova_compute[226886]: 2026-01-20 14:44:16.445 226890 DEBUG nova.network.neutron [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:44:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:16.447 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:16.448 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:16.449 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:16 np0005588920 nova_compute[226886]: 2026-01-20 14:44:16.465 226890 INFO nova.virt.libvirt.driver [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:44:16 np0005588920 nova_compute[226886]: 2026-01-20 14:44:16.491 226890 DEBUG nova.compute.manager [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:44:16 np0005588920 nova_compute[226886]: 2026-01-20 14:44:16.623 226890 DEBUG nova.compute.manager [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:44:16 np0005588920 nova_compute[226886]: 2026-01-20 14:44:16.625 226890 DEBUG nova.virt.libvirt.driver [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:44:16 np0005588920 nova_compute[226886]: 2026-01-20 14:44:16.626 226890 INFO nova.virt.libvirt.driver [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Creating image(s)#033[00m
Jan 20 09:44:16 np0005588920 nova_compute[226886]: 2026-01-20 14:44:16.661 226890 DEBUG nova.storage.rbd_utils [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] rbd image c3c7df3a-49bb-4ca6-a517-e560cf730181_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:44:16 np0005588920 nova_compute[226886]: 2026-01-20 14:44:16.700 226890 DEBUG nova.storage.rbd_utils [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] rbd image c3c7df3a-49bb-4ca6-a517-e560cf730181_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:44:16 np0005588920 nova_compute[226886]: 2026-01-20 14:44:16.730 226890 DEBUG nova.storage.rbd_utils [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] rbd image c3c7df3a-49bb-4ca6-a517-e560cf730181_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:44:16 np0005588920 nova_compute[226886]: 2026-01-20 14:44:16.733 226890 DEBUG oslo_concurrency.processutils [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:44:16 np0005588920 nova_compute[226886]: 2026-01-20 14:44:16.800 226890 DEBUG nova.policy [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '37e9ef97fbe0448e9fbe32d48b66211f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3b31139b2a4e49cba5e7048febf901c4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:44:16 np0005588920 nova_compute[226886]: 2026-01-20 14:44:16.806 226890 DEBUG oslo_concurrency.processutils [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:44:16 np0005588920 nova_compute[226886]: 2026-01-20 14:44:16.807 226890 DEBUG oslo_concurrency.lockutils [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:16 np0005588920 nova_compute[226886]: 2026-01-20 14:44:16.808 226890 DEBUG oslo_concurrency.lockutils [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:16 np0005588920 nova_compute[226886]: 2026-01-20 14:44:16.808 226890 DEBUG oslo_concurrency.lockutils [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:16 np0005588920 nova_compute[226886]: 2026-01-20 14:44:16.842 226890 DEBUG nova.storage.rbd_utils [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] rbd image c3c7df3a-49bb-4ca6-a517-e560cf730181_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:44:16 np0005588920 nova_compute[226886]: 2026-01-20 14:44:16.847 226890 DEBUG oslo_concurrency.processutils [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 c3c7df3a-49bb-4ca6-a517-e560cf730181_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:44:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:16.950 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:44:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:16.952 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:44:16 np0005588920 nova_compute[226886]: 2026-01-20 14:44:16.952 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:16 np0005588920 nova_compute[226886]: 2026-01-20 14:44:16.972 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:17 np0005588920 nova_compute[226886]: 2026-01-20 14:44:17.345 226890 DEBUG oslo_concurrency.processutils [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 c3c7df3a-49bb-4ca6-a517-e560cf730181_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.498s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:44:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:44:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:17.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:44:17 np0005588920 nova_compute[226886]: 2026-01-20 14:44:17.414 226890 DEBUG nova.storage.rbd_utils [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] resizing rbd image c3c7df3a-49bb-4ca6-a517-e560cf730181_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:44:17 np0005588920 nova_compute[226886]: 2026-01-20 14:44:17.620 226890 DEBUG nova.objects.instance [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lazy-loading 'migration_context' on Instance uuid c3c7df3a-49bb-4ca6-a517-e560cf730181 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:44:17 np0005588920 nova_compute[226886]: 2026-01-20 14:44:17.641 226890 DEBUG nova.virt.libvirt.driver [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:44:17 np0005588920 nova_compute[226886]: 2026-01-20 14:44:17.642 226890 DEBUG nova.virt.libvirt.driver [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Ensure instance console log exists: /var/lib/nova/instances/c3c7df3a-49bb-4ca6-a517-e560cf730181/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:44:17 np0005588920 nova_compute[226886]: 2026-01-20 14:44:17.642 226890 DEBUG oslo_concurrency.lockutils [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:17 np0005588920 nova_compute[226886]: 2026-01-20 14:44:17.643 226890 DEBUG oslo_concurrency.lockutils [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:17 np0005588920 nova_compute[226886]: 2026-01-20 14:44:17.643 226890 DEBUG oslo_concurrency.lockutils [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:17 np0005588920 nova_compute[226886]: 2026-01-20 14:44:17.738 226890 DEBUG nova.network.neutron [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Successfully created port: 778b7086-528a-4216-8236-277193c5e77c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:44:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:44:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:18.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:44:19 np0005588920 nova_compute[226886]: 2026-01-20 14:44:19.344 226890 DEBUG nova.network.neutron [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Successfully updated port: 778b7086-528a-4216-8236-277193c5e77c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:44:19 np0005588920 nova_compute[226886]: 2026-01-20 14:44:19.381 226890 DEBUG oslo_concurrency.lockutils [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Acquiring lock "refresh_cache-c3c7df3a-49bb-4ca6-a517-e560cf730181" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:44:19 np0005588920 nova_compute[226886]: 2026-01-20 14:44:19.381 226890 DEBUG oslo_concurrency.lockutils [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Acquired lock "refresh_cache-c3c7df3a-49bb-4ca6-a517-e560cf730181" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:44:19 np0005588920 nova_compute[226886]: 2026-01-20 14:44:19.382 226890 DEBUG nova.network.neutron [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:44:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:19.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:19 np0005588920 nova_compute[226886]: 2026-01-20 14:44:19.473 226890 DEBUG nova.compute.manager [req-13aac63c-cf9d-4098-9e3d-c90538c7e77c req-6b43c972-ff1e-407d-9af9-cfe3c5645043 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Received event network-changed-778b7086-528a-4216-8236-277193c5e77c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:44:19 np0005588920 nova_compute[226886]: 2026-01-20 14:44:19.474 226890 DEBUG nova.compute.manager [req-13aac63c-cf9d-4098-9e3d-c90538c7e77c req-6b43c972-ff1e-407d-9af9-cfe3c5645043 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Refreshing instance network info cache due to event network-changed-778b7086-528a-4216-8236-277193c5e77c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:44:19 np0005588920 nova_compute[226886]: 2026-01-20 14:44:19.475 226890 DEBUG oslo_concurrency.lockutils [req-13aac63c-cf9d-4098-9e3d-c90538c7e77c req-6b43c972-ff1e-407d-9af9-cfe3c5645043 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-c3c7df3a-49bb-4ca6-a517-e560cf730181" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:44:19 np0005588920 nova_compute[226886]: 2026-01-20 14:44:19.606 226890 DEBUG nova.network.neutron [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:44:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:20.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:20 np0005588920 nova_compute[226886]: 2026-01-20 14:44:20.387 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:21 np0005588920 nova_compute[226886]: 2026-01-20 14:44:21.074 226890 DEBUG nova.network.neutron [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Updating instance_info_cache with network_info: [{"id": "778b7086-528a-4216-8236-277193c5e77c", "address": "fa:16:3e:70:ac:01", "network": {"id": "fbd5d614-a7d3-4563-913c-104506628e59", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-60721994-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b31139b2a4e49cba5e7048febf901c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap778b7086-52", "ovs_interfaceid": "778b7086-528a-4216-8236-277193c5e77c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:44:21 np0005588920 nova_compute[226886]: 2026-01-20 14:44:21.134 226890 DEBUG oslo_concurrency.lockutils [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Releasing lock "refresh_cache-c3c7df3a-49bb-4ca6-a517-e560cf730181" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:44:21 np0005588920 nova_compute[226886]: 2026-01-20 14:44:21.134 226890 DEBUG nova.compute.manager [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Instance network_info: |[{"id": "778b7086-528a-4216-8236-277193c5e77c", "address": "fa:16:3e:70:ac:01", "network": {"id": "fbd5d614-a7d3-4563-913c-104506628e59", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-60721994-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b31139b2a4e49cba5e7048febf901c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap778b7086-52", "ovs_interfaceid": "778b7086-528a-4216-8236-277193c5e77c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:44:21 np0005588920 nova_compute[226886]: 2026-01-20 14:44:21.135 226890 DEBUG oslo_concurrency.lockutils [req-13aac63c-cf9d-4098-9e3d-c90538c7e77c req-6b43c972-ff1e-407d-9af9-cfe3c5645043 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-c3c7df3a-49bb-4ca6-a517-e560cf730181" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:44:21 np0005588920 nova_compute[226886]: 2026-01-20 14:44:21.135 226890 DEBUG nova.network.neutron [req-13aac63c-cf9d-4098-9e3d-c90538c7e77c req-6b43c972-ff1e-407d-9af9-cfe3c5645043 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Refreshing network info cache for port 778b7086-528a-4216-8236-277193c5e77c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:44:21 np0005588920 nova_compute[226886]: 2026-01-20 14:44:21.138 226890 DEBUG nova.virt.libvirt.driver [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Start _get_guest_xml network_info=[{"id": "778b7086-528a-4216-8236-277193c5e77c", "address": "fa:16:3e:70:ac:01", "network": {"id": "fbd5d614-a7d3-4563-913c-104506628e59", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-60721994-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b31139b2a4e49cba5e7048febf901c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap778b7086-52", "ovs_interfaceid": "778b7086-528a-4216-8236-277193c5e77c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:44:21 np0005588920 nova_compute[226886]: 2026-01-20 14:44:21.142 226890 WARNING nova.virt.libvirt.driver [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:44:21 np0005588920 nova_compute[226886]: 2026-01-20 14:44:21.147 226890 DEBUG nova.virt.libvirt.host [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:44:21 np0005588920 nova_compute[226886]: 2026-01-20 14:44:21.148 226890 DEBUG nova.virt.libvirt.host [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:44:21 np0005588920 nova_compute[226886]: 2026-01-20 14:44:21.155 226890 DEBUG nova.virt.libvirt.host [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:44:21 np0005588920 nova_compute[226886]: 2026-01-20 14:44:21.156 226890 DEBUG nova.virt.libvirt.host [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:44:21 np0005588920 nova_compute[226886]: 2026-01-20 14:44:21.157 226890 DEBUG nova.virt.libvirt.driver [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:44:21 np0005588920 nova_compute[226886]: 2026-01-20 14:44:21.157 226890 DEBUG nova.virt.hardware [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:44:21 np0005588920 nova_compute[226886]: 2026-01-20 14:44:21.158 226890 DEBUG nova.virt.hardware [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:44:21 np0005588920 nova_compute[226886]: 2026-01-20 14:44:21.158 226890 DEBUG nova.virt.hardware [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:44:21 np0005588920 nova_compute[226886]: 2026-01-20 14:44:21.158 226890 DEBUG nova.virt.hardware [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:44:21 np0005588920 nova_compute[226886]: 2026-01-20 14:44:21.158 226890 DEBUG nova.virt.hardware [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:44:21 np0005588920 nova_compute[226886]: 2026-01-20 14:44:21.158 226890 DEBUG nova.virt.hardware [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:44:21 np0005588920 nova_compute[226886]: 2026-01-20 14:44:21.159 226890 DEBUG nova.virt.hardware [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:44:21 np0005588920 nova_compute[226886]: 2026-01-20 14:44:21.159 226890 DEBUG nova.virt.hardware [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:44:21 np0005588920 nova_compute[226886]: 2026-01-20 14:44:21.159 226890 DEBUG nova.virt.hardware [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:44:21 np0005588920 nova_compute[226886]: 2026-01-20 14:44:21.159 226890 DEBUG nova.virt.hardware [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:44:21 np0005588920 nova_compute[226886]: 2026-01-20 14:44:21.160 226890 DEBUG nova.virt.hardware [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:44:21 np0005588920 nova_compute[226886]: 2026-01-20 14:44:21.162 226890 DEBUG oslo_concurrency.processutils [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:44:21 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:44:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:44:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:21.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:44:21 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:44:21 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/805797725' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:44:21 np0005588920 nova_compute[226886]: 2026-01-20 14:44:21.616 226890 DEBUG oslo_concurrency.processutils [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:44:21 np0005588920 nova_compute[226886]: 2026-01-20 14:44:21.645 226890 DEBUG nova.storage.rbd_utils [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] rbd image c3c7df3a-49bb-4ca6-a517-e560cf730181_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:44:21 np0005588920 nova_compute[226886]: 2026-01-20 14:44:21.650 226890 DEBUG oslo_concurrency.processutils [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:44:21 np0005588920 nova_compute[226886]: 2026-01-20 14:44:21.935 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768920246.9342616, e9dec9ac-568f-4ce4-a58a-351e3b2fff52 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:44:21 np0005588920 nova_compute[226886]: 2026-01-20 14:44:21.936 226890 INFO nova.compute.manager [-] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:44:21 np0005588920 nova_compute[226886]: 2026-01-20 14:44:21.973 226890 DEBUG nova.compute.manager [None req-ad8a017c-a09b-48ba-a188-4de883bd0a74 - - - - - -] [instance: e9dec9ac-568f-4ce4-a58a-351e3b2fff52] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:44:21 np0005588920 nova_compute[226886]: 2026-01-20 14:44:21.974 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:44:22 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/723143886' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:44:22 np0005588920 nova_compute[226886]: 2026-01-20 14:44:22.092 226890 DEBUG oslo_concurrency.processutils [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:44:22 np0005588920 nova_compute[226886]: 2026-01-20 14:44:22.094 226890 DEBUG nova.virt.libvirt.vif [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:44:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1057408982',display_name='tempest-DeleteServersTestJSON-server-1057408982',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1057408982',id=88,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3b31139b2a4e49cba5e7048febf901c4',ramdisk_id='',reservation_id='r-dh6e9rzl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-1162922273',owner_user_name='tempest-DeleteServersTestJS
ON-1162922273-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:44:16Z,user_data=None,user_id='37e9ef97fbe0448e9fbe32d48b66211f',uuid=c3c7df3a-49bb-4ca6-a517-e560cf730181,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "778b7086-528a-4216-8236-277193c5e77c", "address": "fa:16:3e:70:ac:01", "network": {"id": "fbd5d614-a7d3-4563-913c-104506628e59", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-60721994-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b31139b2a4e49cba5e7048febf901c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap778b7086-52", "ovs_interfaceid": "778b7086-528a-4216-8236-277193c5e77c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:44:22 np0005588920 nova_compute[226886]: 2026-01-20 14:44:22.094 226890 DEBUG nova.network.os_vif_util [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Converting VIF {"id": "778b7086-528a-4216-8236-277193c5e77c", "address": "fa:16:3e:70:ac:01", "network": {"id": "fbd5d614-a7d3-4563-913c-104506628e59", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-60721994-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b31139b2a4e49cba5e7048febf901c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap778b7086-52", "ovs_interfaceid": "778b7086-528a-4216-8236-277193c5e77c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:44:22 np0005588920 nova_compute[226886]: 2026-01-20 14:44:22.095 226890 DEBUG nova.network.os_vif_util [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:70:ac:01,bridge_name='br-int',has_traffic_filtering=True,id=778b7086-528a-4216-8236-277193c5e77c,network=Network(fbd5d614-a7d3-4563-913c-104506628e59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap778b7086-52') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:44:22 np0005588920 nova_compute[226886]: 2026-01-20 14:44:22.096 226890 DEBUG nova.objects.instance [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lazy-loading 'pci_devices' on Instance uuid c3c7df3a-49bb-4ca6-a517-e560cf730181 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:44:22 np0005588920 nova_compute[226886]: 2026-01-20 14:44:22.120 226890 DEBUG nova.virt.libvirt.driver [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:44:22 np0005588920 nova_compute[226886]:  <uuid>c3c7df3a-49bb-4ca6-a517-e560cf730181</uuid>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:  <name>instance-00000058</name>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:44:22 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:      <nova:name>tempest-DeleteServersTestJSON-server-1057408982</nova:name>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:44:21</nova:creationTime>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:44:22 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:        <nova:user uuid="37e9ef97fbe0448e9fbe32d48b66211f">tempest-DeleteServersTestJSON-1162922273-project-member</nova:user>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:        <nova:project uuid="3b31139b2a4e49cba5e7048febf901c4">tempest-DeleteServersTestJSON-1162922273</nova:project>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:        <nova:port uuid="778b7086-528a-4216-8236-277193c5e77c">
Jan 20 09:44:22 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:      <entry name="serial">c3c7df3a-49bb-4ca6-a517-e560cf730181</entry>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:      <entry name="uuid">c3c7df3a-49bb-4ca6-a517-e560cf730181</entry>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:44:22 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/c3c7df3a-49bb-4ca6-a517-e560cf730181_disk">
Jan 20 09:44:22 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:44:22 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:44:22 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/c3c7df3a-49bb-4ca6-a517-e560cf730181_disk.config">
Jan 20 09:44:22 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:44:22 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:44:22 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:70:ac:01"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:      <target dev="tap778b7086-52"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:44:22 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/c3c7df3a-49bb-4ca6-a517-e560cf730181/console.log" append="off"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:44:22 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:44:22 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:44:22 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:44:22 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:44:22 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:44:22 np0005588920 nova_compute[226886]: 2026-01-20 14:44:22.121 226890 DEBUG nova.compute.manager [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Preparing to wait for external event network-vif-plugged-778b7086-528a-4216-8236-277193c5e77c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:44:22 np0005588920 nova_compute[226886]: 2026-01-20 14:44:22.122 226890 DEBUG oslo_concurrency.lockutils [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Acquiring lock "c3c7df3a-49bb-4ca6-a517-e560cf730181-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:22 np0005588920 nova_compute[226886]: 2026-01-20 14:44:22.122 226890 DEBUG oslo_concurrency.lockutils [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "c3c7df3a-49bb-4ca6-a517-e560cf730181-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:22 np0005588920 nova_compute[226886]: 2026-01-20 14:44:22.122 226890 DEBUG oslo_concurrency.lockutils [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "c3c7df3a-49bb-4ca6-a517-e560cf730181-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:22 np0005588920 nova_compute[226886]: 2026-01-20 14:44:22.123 226890 DEBUG nova.virt.libvirt.vif [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:44:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1057408982',display_name='tempest-DeleteServersTestJSON-server-1057408982',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1057408982',id=88,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3b31139b2a4e49cba5e7048febf901c4',ramdisk_id='',reservation_id='r-dh6e9rzl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-1162922273',owner_user_name='tempest-DeleteSer
versTestJSON-1162922273-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:44:16Z,user_data=None,user_id='37e9ef97fbe0448e9fbe32d48b66211f',uuid=c3c7df3a-49bb-4ca6-a517-e560cf730181,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "778b7086-528a-4216-8236-277193c5e77c", "address": "fa:16:3e:70:ac:01", "network": {"id": "fbd5d614-a7d3-4563-913c-104506628e59", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-60721994-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b31139b2a4e49cba5e7048febf901c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap778b7086-52", "ovs_interfaceid": "778b7086-528a-4216-8236-277193c5e77c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:44:22 np0005588920 nova_compute[226886]: 2026-01-20 14:44:22.123 226890 DEBUG nova.network.os_vif_util [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Converting VIF {"id": "778b7086-528a-4216-8236-277193c5e77c", "address": "fa:16:3e:70:ac:01", "network": {"id": "fbd5d614-a7d3-4563-913c-104506628e59", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-60721994-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b31139b2a4e49cba5e7048febf901c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap778b7086-52", "ovs_interfaceid": "778b7086-528a-4216-8236-277193c5e77c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:44:22 np0005588920 nova_compute[226886]: 2026-01-20 14:44:22.124 226890 DEBUG nova.network.os_vif_util [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:70:ac:01,bridge_name='br-int',has_traffic_filtering=True,id=778b7086-528a-4216-8236-277193c5e77c,network=Network(fbd5d614-a7d3-4563-913c-104506628e59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap778b7086-52') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:44:22 np0005588920 nova_compute[226886]: 2026-01-20 14:44:22.124 226890 DEBUG os_vif [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:70:ac:01,bridge_name='br-int',has_traffic_filtering=True,id=778b7086-528a-4216-8236-277193c5e77c,network=Network(fbd5d614-a7d3-4563-913c-104506628e59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap778b7086-52') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:44:22 np0005588920 nova_compute[226886]: 2026-01-20 14:44:22.124 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:22 np0005588920 nova_compute[226886]: 2026-01-20 14:44:22.125 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:44:22 np0005588920 nova_compute[226886]: 2026-01-20 14:44:22.125 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:44:22 np0005588920 nova_compute[226886]: 2026-01-20 14:44:22.127 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:22 np0005588920 nova_compute[226886]: 2026-01-20 14:44:22.128 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap778b7086-52, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:44:22 np0005588920 nova_compute[226886]: 2026-01-20 14:44:22.128 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap778b7086-52, col_values=(('external_ids', {'iface-id': '778b7086-528a-4216-8236-277193c5e77c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:70:ac:01', 'vm-uuid': 'c3c7df3a-49bb-4ca6-a517-e560cf730181'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:44:22 np0005588920 nova_compute[226886]: 2026-01-20 14:44:22.129 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:22 np0005588920 NetworkManager[49076]: <info>  [1768920262.1308] manager: (tap778b7086-52): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/173)
Jan 20 09:44:22 np0005588920 nova_compute[226886]: 2026-01-20 14:44:22.133 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:44:22 np0005588920 nova_compute[226886]: 2026-01-20 14:44:22.136 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:22 np0005588920 nova_compute[226886]: 2026-01-20 14:44:22.136 226890 INFO os_vif [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:70:ac:01,bridge_name='br-int',has_traffic_filtering=True,id=778b7086-528a-4216-8236-277193c5e77c,network=Network(fbd5d614-a7d3-4563-913c-104506628e59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap778b7086-52')#033[00m
Jan 20 09:44:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:22.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:22 np0005588920 nova_compute[226886]: 2026-01-20 14:44:22.196 226890 DEBUG nova.virt.libvirt.driver [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:44:22 np0005588920 nova_compute[226886]: 2026-01-20 14:44:22.197 226890 DEBUG nova.virt.libvirt.driver [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:44:22 np0005588920 nova_compute[226886]: 2026-01-20 14:44:22.197 226890 DEBUG nova.virt.libvirt.driver [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] No VIF found with MAC fa:16:3e:70:ac:01, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:44:22 np0005588920 nova_compute[226886]: 2026-01-20 14:44:22.198 226890 INFO nova.virt.libvirt.driver [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Using config drive#033[00m
Jan 20 09:44:22 np0005588920 nova_compute[226886]: 2026-01-20 14:44:22.225 226890 DEBUG nova.storage.rbd_utils [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] rbd image c3c7df3a-49bb-4ca6-a517-e560cf730181_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:44:22 np0005588920 nova_compute[226886]: 2026-01-20 14:44:22.718 226890 INFO nova.virt.libvirt.driver [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Creating config drive at /var/lib/nova/instances/c3c7df3a-49bb-4ca6-a517-e560cf730181/disk.config#033[00m
Jan 20 09:44:22 np0005588920 nova_compute[226886]: 2026-01-20 14:44:22.723 226890 DEBUG oslo_concurrency.processutils [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c3c7df3a-49bb-4ca6-a517-e560cf730181/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5m565axy execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:44:22 np0005588920 nova_compute[226886]: 2026-01-20 14:44:22.850 226890 DEBUG oslo_concurrency.processutils [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c3c7df3a-49bb-4ca6-a517-e560cf730181/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5m565axy" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:44:22 np0005588920 nova_compute[226886]: 2026-01-20 14:44:22.878 226890 DEBUG nova.storage.rbd_utils [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] rbd image c3c7df3a-49bb-4ca6-a517-e560cf730181_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:44:22 np0005588920 nova_compute[226886]: 2026-01-20 14:44:22.881 226890 DEBUG oslo_concurrency.processutils [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/c3c7df3a-49bb-4ca6-a517-e560cf730181/disk.config c3c7df3a-49bb-4ca6-a517-e560cf730181_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:44:23 np0005588920 nova_compute[226886]: 2026-01-20 14:44:23.007 226890 DEBUG oslo_concurrency.processutils [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/c3c7df3a-49bb-4ca6-a517-e560cf730181/disk.config c3c7df3a-49bb-4ca6-a517-e560cf730181_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:44:23 np0005588920 nova_compute[226886]: 2026-01-20 14:44:23.008 226890 INFO nova.virt.libvirt.driver [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Deleting local config drive /var/lib/nova/instances/c3c7df3a-49bb-4ca6-a517-e560cf730181/disk.config because it was imported into RBD.#033[00m
Jan 20 09:44:23 np0005588920 kernel: tap778b7086-52: entered promiscuous mode
Jan 20 09:44:23 np0005588920 NetworkManager[49076]: <info>  [1768920263.0613] manager: (tap778b7086-52): new Tun device (/org/freedesktop/NetworkManager/Devices/174)
Jan 20 09:44:23 np0005588920 ovn_controller[133971]: 2026-01-20T14:44:23Z|00332|binding|INFO|Claiming lport 778b7086-528a-4216-8236-277193c5e77c for this chassis.
Jan 20 09:44:23 np0005588920 ovn_controller[133971]: 2026-01-20T14:44:23Z|00333|binding|INFO|778b7086-528a-4216-8236-277193c5e77c: Claiming fa:16:3e:70:ac:01 10.100.0.14
Jan 20 09:44:23 np0005588920 nova_compute[226886]: 2026-01-20 14:44:23.061 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:23.068 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:70:ac:01 10.100.0.14'], port_security=['fa:16:3e:70:ac:01 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'c3c7df3a-49bb-4ca6-a517-e560cf730181', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbd5d614-a7d3-4563-913c-104506628e59', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3b31139b2a4e49cba5e7048febf901c4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '117d6f57-074c-4b36-b375-42e0ab117254', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c42c6982-be52-495a-8746-42a46932572f, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=778b7086-528a-4216-8236-277193c5e77c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:23.069 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 778b7086-528a-4216-8236-277193c5e77c in datapath fbd5d614-a7d3-4563-913c-104506628e59 bound to our chassis#033[00m
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:23.070 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fbd5d614-a7d3-4563-913c-104506628e59#033[00m
Jan 20 09:44:23 np0005588920 ovn_controller[133971]: 2026-01-20T14:44:23Z|00334|binding|INFO|Setting lport 778b7086-528a-4216-8236-277193c5e77c ovn-installed in OVS
Jan 20 09:44:23 np0005588920 ovn_controller[133971]: 2026-01-20T14:44:23Z|00335|binding|INFO|Setting lport 778b7086-528a-4216-8236-277193c5e77c up in Southbound
Jan 20 09:44:23 np0005588920 nova_compute[226886]: 2026-01-20 14:44:23.077 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:23 np0005588920 nova_compute[226886]: 2026-01-20 14:44:23.080 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:23.082 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[78b3f2ba-726b-4f81-b985-72b7d60867af]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:23.083 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfbd5d614-a1 in ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:23.084 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfbd5d614-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:23.084 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[df5b0285-04ee-4969-875a-23b982d5ca24]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:23.085 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2abdd797-786c-4260-8110-dcc6d78054d1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:23 np0005588920 systemd-udevd[257140]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:44:23 np0005588920 systemd-machined[196121]: New machine qemu-36-instance-00000058.
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:23.095 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[aad3b353-0e5a-46a3-8b00-6b9a7f58cdae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:23 np0005588920 NetworkManager[49076]: <info>  [1768920263.1043] device (tap778b7086-52): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:44:23 np0005588920 NetworkManager[49076]: <info>  [1768920263.1052] device (tap778b7086-52): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:44:23 np0005588920 systemd[1]: Started Virtual Machine qemu-36-instance-00000058.
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:23.111 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0dc2d0cc-0bff-4f7a-80da-863c800c8c65]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:23.139 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[ccd3241b-6844-4d76-a7f1-d7b89b3c115e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:23.143 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[40b56e9e-3f12-4ade-8722-5844ccd8deb2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:23 np0005588920 NetworkManager[49076]: <info>  [1768920263.1446] manager: (tapfbd5d614-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/175)
Jan 20 09:44:23 np0005588920 systemd-udevd[257143]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:23.177 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[42de0d61-c1f5-4c69-933f-0042e185cb51]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:23.180 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[46fce4ab-9fc9-4b02-b4ba-5d36f83e7789]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:23 np0005588920 NetworkManager[49076]: <info>  [1768920263.2013] device (tapfbd5d614-a0): carrier: link connected
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:23.205 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[68e208a4-c984-4846-9591-38b0125d2fbe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:23.222 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[3989dcbc-e3fd-42ce-98f8-86b1c12259d8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbd5d614-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5c:38:be'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 109], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 530488, 'reachable_time': 23691, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 257171, 'error': None, 'target': 'ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:23.236 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e1d23d34-ea0c-43be-9c6a-550a9ed11bd9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5c:38be'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 530488, 'tstamp': 530488}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 257172, 'error': None, 'target': 'ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:23.249 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[11ac9478-2d5f-45d4-807b-11d5c9a50807]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbd5d614-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5c:38:be'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 109], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 530488, 'reachable_time': 23691, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 257173, 'error': None, 'target': 'ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:23.276 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[90692bd4-fa50-4ac6-8317-034db41b7b3a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:23.327 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f3204605-9868-4a92-a6a6-652d869ac439]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:23.328 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbd5d614-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:23.329 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:23.329 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfbd5d614-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:44:23 np0005588920 nova_compute[226886]: 2026-01-20 14:44:23.330 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:23 np0005588920 NetworkManager[49076]: <info>  [1768920263.3315] manager: (tapfbd5d614-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/176)
Jan 20 09:44:23 np0005588920 kernel: tapfbd5d614-a0: entered promiscuous mode
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:23.334 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfbd5d614-a0, col_values=(('external_ids', {'iface-id': 'b370b74e-dca0-4ff7-a96f-85b392e20721'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:44:23 np0005588920 ovn_controller[133971]: 2026-01-20T14:44:23Z|00336|binding|INFO|Releasing lport b370b74e-dca0-4ff7-a96f-85b392e20721 from this chassis (sb_readonly=0)
Jan 20 09:44:23 np0005588920 nova_compute[226886]: 2026-01-20 14:44:23.357 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:23.357 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fbd5d614-a7d3-4563-913c-104506628e59.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fbd5d614-a7d3-4563-913c-104506628e59.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:23.358 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[311bee7b-92c2-4862-9ca1-b999df6903d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:23.359 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-fbd5d614-a7d3-4563-913c-104506628e59
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/fbd5d614-a7d3-4563-913c-104506628e59.pid.haproxy
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID fbd5d614-a7d3-4563-913c-104506628e59
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:44:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:23.360 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59', 'env', 'PROCESS_TAG=haproxy-fbd5d614-a7d3-4563-913c-104506628e59', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fbd5d614-a7d3-4563-913c-104506628e59.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:44:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:23.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:23 np0005588920 nova_compute[226886]: 2026-01-20 14:44:23.668 226890 DEBUG nova.network.neutron [req-13aac63c-cf9d-4098-9e3d-c90538c7e77c req-6b43c972-ff1e-407d-9af9-cfe3c5645043 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Updated VIF entry in instance network info cache for port 778b7086-528a-4216-8236-277193c5e77c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:44:23 np0005588920 nova_compute[226886]: 2026-01-20 14:44:23.669 226890 DEBUG nova.network.neutron [req-13aac63c-cf9d-4098-9e3d-c90538c7e77c req-6b43c972-ff1e-407d-9af9-cfe3c5645043 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Updating instance_info_cache with network_info: [{"id": "778b7086-528a-4216-8236-277193c5e77c", "address": "fa:16:3e:70:ac:01", "network": {"id": "fbd5d614-a7d3-4563-913c-104506628e59", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-60721994-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b31139b2a4e49cba5e7048febf901c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap778b7086-52", "ovs_interfaceid": "778b7086-528a-4216-8236-277193c5e77c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:44:23 np0005588920 nova_compute[226886]: 2026-01-20 14:44:23.681 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920263.6810696, c3c7df3a-49bb-4ca6-a517-e560cf730181 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:44:23 np0005588920 nova_compute[226886]: 2026-01-20 14:44:23.682 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] VM Started (Lifecycle Event)#033[00m
Jan 20 09:44:23 np0005588920 nova_compute[226886]: 2026-01-20 14:44:23.688 226890 DEBUG oslo_concurrency.lockutils [req-13aac63c-cf9d-4098-9e3d-c90538c7e77c req-6b43c972-ff1e-407d-9af9-cfe3c5645043 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-c3c7df3a-49bb-4ca6-a517-e560cf730181" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:44:23 np0005588920 nova_compute[226886]: 2026-01-20 14:44:23.703 226890 DEBUG nova.compute.manager [req-03cf3b2e-c7d1-4dc2-a14c-356a41171e89 req-2d3dbab5-a786-4e07-a81c-eaaf3161b93f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Received event network-vif-plugged-778b7086-528a-4216-8236-277193c5e77c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:44:23 np0005588920 nova_compute[226886]: 2026-01-20 14:44:23.704 226890 DEBUG oslo_concurrency.lockutils [req-03cf3b2e-c7d1-4dc2-a14c-356a41171e89 req-2d3dbab5-a786-4e07-a81c-eaaf3161b93f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "c3c7df3a-49bb-4ca6-a517-e560cf730181-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:23 np0005588920 nova_compute[226886]: 2026-01-20 14:44:23.704 226890 DEBUG oslo_concurrency.lockutils [req-03cf3b2e-c7d1-4dc2-a14c-356a41171e89 req-2d3dbab5-a786-4e07-a81c-eaaf3161b93f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c3c7df3a-49bb-4ca6-a517-e560cf730181-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:23 np0005588920 nova_compute[226886]: 2026-01-20 14:44:23.705 226890 DEBUG oslo_concurrency.lockutils [req-03cf3b2e-c7d1-4dc2-a14c-356a41171e89 req-2d3dbab5-a786-4e07-a81c-eaaf3161b93f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c3c7df3a-49bb-4ca6-a517-e560cf730181-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:23 np0005588920 nova_compute[226886]: 2026-01-20 14:44:23.705 226890 DEBUG nova.compute.manager [req-03cf3b2e-c7d1-4dc2-a14c-356a41171e89 req-2d3dbab5-a786-4e07-a81c-eaaf3161b93f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Processing event network-vif-plugged-778b7086-528a-4216-8236-277193c5e77c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:44:23 np0005588920 nova_compute[226886]: 2026-01-20 14:44:23.706 226890 DEBUG nova.compute.manager [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:44:23 np0005588920 nova_compute[226886]: 2026-01-20 14:44:23.712 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:44:23 np0005588920 nova_compute[226886]: 2026-01-20 14:44:23.715 226890 DEBUG nova.virt.libvirt.driver [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:44:23 np0005588920 nova_compute[226886]: 2026-01-20 14:44:23.718 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:44:23 np0005588920 nova_compute[226886]: 2026-01-20 14:44:23.721 226890 INFO nova.virt.libvirt.driver [-] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Instance spawned successfully.#033[00m
Jan 20 09:44:23 np0005588920 nova_compute[226886]: 2026-01-20 14:44:23.722 226890 DEBUG nova.virt.libvirt.driver [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:44:23 np0005588920 podman[257247]: 2026-01-20 14:44:23.73088717 +0000 UTC m=+0.055770616 container create f9bfbabad1bbacdd0275bb139ca61f7eac931279a1c64dda0e68f86dd89ae775 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 20 09:44:23 np0005588920 nova_compute[226886]: 2026-01-20 14:44:23.752 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:44:23 np0005588920 nova_compute[226886]: 2026-01-20 14:44:23.753 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920263.6814227, c3c7df3a-49bb-4ca6-a517-e560cf730181 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:44:23 np0005588920 nova_compute[226886]: 2026-01-20 14:44:23.754 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:44:23 np0005588920 nova_compute[226886]: 2026-01-20 14:44:23.759 226890 DEBUG nova.virt.libvirt.driver [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:44:23 np0005588920 nova_compute[226886]: 2026-01-20 14:44:23.760 226890 DEBUG nova.virt.libvirt.driver [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:44:23 np0005588920 nova_compute[226886]: 2026-01-20 14:44:23.760 226890 DEBUG nova.virt.libvirt.driver [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:44:23 np0005588920 nova_compute[226886]: 2026-01-20 14:44:23.761 226890 DEBUG nova.virt.libvirt.driver [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:44:23 np0005588920 nova_compute[226886]: 2026-01-20 14:44:23.761 226890 DEBUG nova.virt.libvirt.driver [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:44:23 np0005588920 nova_compute[226886]: 2026-01-20 14:44:23.762 226890 DEBUG nova.virt.libvirt.driver [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:44:23 np0005588920 systemd[1]: Started libpod-conmon-f9bfbabad1bbacdd0275bb139ca61f7eac931279a1c64dda0e68f86dd89ae775.scope.
Jan 20 09:44:23 np0005588920 podman[257247]: 2026-01-20 14:44:23.697969382 +0000 UTC m=+0.022852868 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:44:23 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:44:23 np0005588920 nova_compute[226886]: 2026-01-20 14:44:23.805 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:44:23 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8253751ca0941cc225e487e6eda8b96813d3da912956c9b9dd0c51413efd7e2d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:44:23 np0005588920 nova_compute[226886]: 2026-01-20 14:44:23.817 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920263.7099657, c3c7df3a-49bb-4ca6-a517-e560cf730181 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:44:23 np0005588920 nova_compute[226886]: 2026-01-20 14:44:23.818 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:44:23 np0005588920 podman[257247]: 2026-01-20 14:44:23.824667024 +0000 UTC m=+0.149550460 container init f9bfbabad1bbacdd0275bb139ca61f7eac931279a1c64dda0e68f86dd89ae775 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 20 09:44:23 np0005588920 nova_compute[226886]: 2026-01-20 14:44:23.826 226890 INFO nova.compute.manager [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Took 7.20 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:44:23 np0005588920 nova_compute[226886]: 2026-01-20 14:44:23.827 226890 DEBUG nova.compute.manager [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:44:23 np0005588920 podman[257247]: 2026-01-20 14:44:23.829869159 +0000 UTC m=+0.154752595 container start f9bfbabad1bbacdd0275bb139ca61f7eac931279a1c64dda0e68f86dd89ae775 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:44:23 np0005588920 neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59[257262]: [NOTICE]   (257266) : New worker (257268) forked
Jan 20 09:44:23 np0005588920 neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59[257262]: [NOTICE]   (257266) : Loading success.
Jan 20 09:44:23 np0005588920 nova_compute[226886]: 2026-01-20 14:44:23.852 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:44:23 np0005588920 nova_compute[226886]: 2026-01-20 14:44:23.855 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:44:23 np0005588920 nova_compute[226886]: 2026-01-20 14:44:23.893 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:44:23 np0005588920 nova_compute[226886]: 2026-01-20 14:44:23.927 226890 INFO nova.compute.manager [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Took 8.27 seconds to build instance.#033[00m
Jan 20 09:44:23 np0005588920 nova_compute[226886]: 2026-01-20 14:44:23.949 226890 DEBUG oslo_concurrency.lockutils [None req-70fa294a-6075-45c1-a371-66f3052d3023 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "c3c7df3a-49bb-4ca6-a517-e560cf730181" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.389s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:24.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:24 np0005588920 nova_compute[226886]: 2026-01-20 14:44:24.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:44:24 np0005588920 nova_compute[226886]: 2026-01-20 14:44:24.726 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:44:24 np0005588920 nova_compute[226886]: 2026-01-20 14:44:24.727 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 09:44:25 np0005588920 nova_compute[226886]: 2026-01-20 14:44:25.010 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "refresh_cache-c3c7df3a-49bb-4ca6-a517-e560cf730181" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:44:25 np0005588920 nova_compute[226886]: 2026-01-20 14:44:25.011 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquired lock "refresh_cache-c3c7df3a-49bb-4ca6-a517-e560cf730181" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:44:25 np0005588920 nova_compute[226886]: 2026-01-20 14:44:25.011 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 09:44:25 np0005588920 nova_compute[226886]: 2026-01-20 14:44:25.011 226890 DEBUG nova.objects.instance [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c3c7df3a-49bb-4ca6-a517-e560cf730181 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:44:25 np0005588920 nova_compute[226886]: 2026-01-20 14:44:25.390 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:25.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:25 np0005588920 nova_compute[226886]: 2026-01-20 14:44:25.888 226890 DEBUG nova.compute.manager [req-5ba7373a-7af8-4e79-aaf1-fb78d93e5b14 req-afbaf772-717b-455c-843f-c4011d268377 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Received event network-vif-plugged-778b7086-528a-4216-8236-277193c5e77c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:44:25 np0005588920 nova_compute[226886]: 2026-01-20 14:44:25.889 226890 DEBUG oslo_concurrency.lockutils [req-5ba7373a-7af8-4e79-aaf1-fb78d93e5b14 req-afbaf772-717b-455c-843f-c4011d268377 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "c3c7df3a-49bb-4ca6-a517-e560cf730181-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:25 np0005588920 nova_compute[226886]: 2026-01-20 14:44:25.890 226890 DEBUG oslo_concurrency.lockutils [req-5ba7373a-7af8-4e79-aaf1-fb78d93e5b14 req-afbaf772-717b-455c-843f-c4011d268377 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c3c7df3a-49bb-4ca6-a517-e560cf730181-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:25 np0005588920 nova_compute[226886]: 2026-01-20 14:44:25.891 226890 DEBUG oslo_concurrency.lockutils [req-5ba7373a-7af8-4e79-aaf1-fb78d93e5b14 req-afbaf772-717b-455c-843f-c4011d268377 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c3c7df3a-49bb-4ca6-a517-e560cf730181-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:25 np0005588920 nova_compute[226886]: 2026-01-20 14:44:25.892 226890 DEBUG nova.compute.manager [req-5ba7373a-7af8-4e79-aaf1-fb78d93e5b14 req-afbaf772-717b-455c-843f-c4011d268377 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] No waiting events found dispatching network-vif-plugged-778b7086-528a-4216-8236-277193c5e77c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:44:25 np0005588920 nova_compute[226886]: 2026-01-20 14:44:25.892 226890 WARNING nova.compute.manager [req-5ba7373a-7af8-4e79-aaf1-fb78d93e5b14 req-afbaf772-717b-455c-843f-c4011d268377 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Received unexpected event network-vif-plugged-778b7086-528a-4216-8236-277193c5e77c for instance with vm_state active and task_state None.#033[00m
Jan 20 09:44:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:25.955 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:44:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:26.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:26 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:44:26 np0005588920 nova_compute[226886]: 2026-01-20 14:44:26.978 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Updating instance_info_cache with network_info: [{"id": "778b7086-528a-4216-8236-277193c5e77c", "address": "fa:16:3e:70:ac:01", "network": {"id": "fbd5d614-a7d3-4563-913c-104506628e59", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-60721994-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b31139b2a4e49cba5e7048febf901c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap778b7086-52", "ovs_interfaceid": "778b7086-528a-4216-8236-277193c5e77c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:44:27 np0005588920 nova_compute[226886]: 2026-01-20 14:44:27.033 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Releasing lock "refresh_cache-c3c7df3a-49bb-4ca6-a517-e560cf730181" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:44:27 np0005588920 nova_compute[226886]: 2026-01-20 14:44:27.034 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 09:44:27 np0005588920 nova_compute[226886]: 2026-01-20 14:44:27.131 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:27.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:28.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:29 np0005588920 nova_compute[226886]: 2026-01-20 14:44:29.019 226890 DEBUG oslo_concurrency.lockutils [None req-42dd1f30-9d78-4088-908a-aab6e422020c 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Acquiring lock "c3c7df3a-49bb-4ca6-a517-e560cf730181" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:29 np0005588920 nova_compute[226886]: 2026-01-20 14:44:29.020 226890 DEBUG oslo_concurrency.lockutils [None req-42dd1f30-9d78-4088-908a-aab6e422020c 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "c3c7df3a-49bb-4ca6-a517-e560cf730181" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:29 np0005588920 nova_compute[226886]: 2026-01-20 14:44:29.034 226890 DEBUG nova.objects.instance [None req-42dd1f30-9d78-4088-908a-aab6e422020c 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lazy-loading 'flavor' on Instance uuid c3c7df3a-49bb-4ca6-a517-e560cf730181 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:44:29 np0005588920 nova_compute[226886]: 2026-01-20 14:44:29.083 226890 DEBUG oslo_concurrency.lockutils [None req-42dd1f30-9d78-4088-908a-aab6e422020c 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "c3c7df3a-49bb-4ca6-a517-e560cf730181" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.063s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:29 np0005588920 nova_compute[226886]: 2026-01-20 14:44:29.388 226890 DEBUG oslo_concurrency.lockutils [None req-42dd1f30-9d78-4088-908a-aab6e422020c 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Acquiring lock "c3c7df3a-49bb-4ca6-a517-e560cf730181" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:29 np0005588920 nova_compute[226886]: 2026-01-20 14:44:29.389 226890 DEBUG oslo_concurrency.lockutils [None req-42dd1f30-9d78-4088-908a-aab6e422020c 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "c3c7df3a-49bb-4ca6-a517-e560cf730181" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:29 np0005588920 nova_compute[226886]: 2026-01-20 14:44:29.390 226890 INFO nova.compute.manager [None req-42dd1f30-9d78-4088-908a-aab6e422020c 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Attaching volume 5930261f-9813-4b24-a25f-b6b26b90d24b to /dev/vdb#033[00m
Jan 20 09:44:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:29.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:29 np0005588920 nova_compute[226886]: 2026-01-20 14:44:29.575 226890 DEBUG os_brick.utils [None req-42dd1f30-9d78-4088-908a-aab6e422020c 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 20 09:44:29 np0005588920 nova_compute[226886]: 2026-01-20 14:44:29.577 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:44:29 np0005588920 nova_compute[226886]: 2026-01-20 14:44:29.594 231437 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.017s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:44:29 np0005588920 nova_compute[226886]: 2026-01-20 14:44:29.595 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[f379926f-52a8-47f1-90f7-3b095c542138]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:29 np0005588920 nova_compute[226886]: 2026-01-20 14:44:29.598 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:44:29 np0005588920 nova_compute[226886]: 2026-01-20 14:44:29.606 231437 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:44:29 np0005588920 nova_compute[226886]: 2026-01-20 14:44:29.607 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[b7d6c2a2-1aaf-4bec-9ca9-3289e6700044]: (4, ('InitiatorName=iqn.1994-05.com.redhat:f7e7b2e5d28e', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:29 np0005588920 nova_compute[226886]: 2026-01-20 14:44:29.609 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:44:29 np0005588920 nova_compute[226886]: 2026-01-20 14:44:29.620 231437 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:44:29 np0005588920 nova_compute[226886]: 2026-01-20 14:44:29.620 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[8e8b1662-543e-4141-a60c-51fafb8c630f]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:29 np0005588920 nova_compute[226886]: 2026-01-20 14:44:29.622 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[ab5d1728-009f-41a2-b35b-bd09ad113d62]: (4, '1190ec40-2b89-4358-8dec-733c5829fbed') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:29 np0005588920 nova_compute[226886]: 2026-01-20 14:44:29.623 226890 DEBUG oslo_concurrency.processutils [None req-42dd1f30-9d78-4088-908a-aab6e422020c 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:44:29 np0005588920 nova_compute[226886]: 2026-01-20 14:44:29.645 226890 DEBUG oslo_concurrency.processutils [None req-42dd1f30-9d78-4088-908a-aab6e422020c 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] CMD "nvme version" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:44:29 np0005588920 nova_compute[226886]: 2026-01-20 14:44:29.647 226890 DEBUG os_brick.initiator.connectors.lightos [None req-42dd1f30-9d78-4088-908a-aab6e422020c 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 20 09:44:29 np0005588920 nova_compute[226886]: 2026-01-20 14:44:29.648 226890 DEBUG os_brick.initiator.connectors.lightos [None req-42dd1f30-9d78-4088-908a-aab6e422020c 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 20 09:44:29 np0005588920 nova_compute[226886]: 2026-01-20 14:44:29.648 226890 DEBUG os_brick.initiator.connectors.lightos [None req-42dd1f30-9d78-4088-908a-aab6e422020c 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 20 09:44:29 np0005588920 nova_compute[226886]: 2026-01-20 14:44:29.648 226890 DEBUG os_brick.utils [None req-42dd1f30-9d78-4088-908a-aab6e422020c 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] <== get_connector_properties: return (72ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:f7e7b2e5d28e', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '1190ec40-2b89-4358-8dec-733c5829fbed', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 20 09:44:29 np0005588920 nova_compute[226886]: 2026-01-20 14:44:29.649 226890 DEBUG nova.virt.block_device [None req-42dd1f30-9d78-4088-908a-aab6e422020c 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Updating existing volume attachment record: 53649e5b-5992-49a4-99de-679629482b83 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 20 09:44:29 np0005588920 nova_compute[226886]: 2026-01-20 14:44:29.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:44:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:30.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:30 np0005588920 nova_compute[226886]: 2026-01-20 14:44:30.393 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:30 np0005588920 nova_compute[226886]: 2026-01-20 14:44:30.511 226890 DEBUG nova.objects.instance [None req-42dd1f30-9d78-4088-908a-aab6e422020c 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lazy-loading 'flavor' on Instance uuid c3c7df3a-49bb-4ca6-a517-e560cf730181 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:44:30 np0005588920 nova_compute[226886]: 2026-01-20 14:44:30.535 226890 DEBUG nova.virt.libvirt.driver [None req-42dd1f30-9d78-4088-908a-aab6e422020c 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Attempting to attach volume 5930261f-9813-4b24-a25f-b6b26b90d24b with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 20 09:44:30 np0005588920 nova_compute[226886]: 2026-01-20 14:44:30.538 226890 DEBUG nova.virt.libvirt.guest [None req-42dd1f30-9d78-4088-908a-aab6e422020c 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] attach device xml: <disk type="network" device="disk">
Jan 20 09:44:30 np0005588920 nova_compute[226886]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 09:44:30 np0005588920 nova_compute[226886]:  <source protocol="rbd" name="volumes/volume-5930261f-9813-4b24-a25f-b6b26b90d24b">
Jan 20 09:44:30 np0005588920 nova_compute[226886]:    <host name="192.168.122.100" port="6789"/>
Jan 20 09:44:30 np0005588920 nova_compute[226886]:    <host name="192.168.122.102" port="6789"/>
Jan 20 09:44:30 np0005588920 nova_compute[226886]:    <host name="192.168.122.101" port="6789"/>
Jan 20 09:44:30 np0005588920 nova_compute[226886]:  </source>
Jan 20 09:44:30 np0005588920 nova_compute[226886]:  <auth username="openstack">
Jan 20 09:44:30 np0005588920 nova_compute[226886]:    <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:44:30 np0005588920 nova_compute[226886]:  </auth>
Jan 20 09:44:30 np0005588920 nova_compute[226886]:  <target dev="vdb" bus="virtio"/>
Jan 20 09:44:30 np0005588920 nova_compute[226886]:  <serial>5930261f-9813-4b24-a25f-b6b26b90d24b</serial>
Jan 20 09:44:30 np0005588920 nova_compute[226886]: </disk>
Jan 20 09:44:30 np0005588920 nova_compute[226886]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 20 09:44:30 np0005588920 nova_compute[226886]: 2026-01-20 14:44:30.858 226890 DEBUG nova.virt.libvirt.driver [None req-42dd1f30-9d78-4088-908a-aab6e422020c 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:44:30 np0005588920 nova_compute[226886]: 2026-01-20 14:44:30.858 226890 DEBUG nova.virt.libvirt.driver [None req-42dd1f30-9d78-4088-908a-aab6e422020c 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:44:30 np0005588920 nova_compute[226886]: 2026-01-20 14:44:30.859 226890 DEBUG nova.virt.libvirt.driver [None req-42dd1f30-9d78-4088-908a-aab6e422020c 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:44:30 np0005588920 nova_compute[226886]: 2026-01-20 14:44:30.859 226890 DEBUG nova.virt.libvirt.driver [None req-42dd1f30-9d78-4088-908a-aab6e422020c 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] No VIF found with MAC fa:16:3e:70:ac:01, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:44:31 np0005588920 nova_compute[226886]: 2026-01-20 14:44:31.064 226890 DEBUG oslo_concurrency.lockutils [None req-42dd1f30-9d78-4088-908a-aab6e422020c 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "c3c7df3a-49bb-4ca6-a517-e560cf730181" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 1.675s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:31 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:44:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:31.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:31 np0005588920 nova_compute[226886]: 2026-01-20 14:44:31.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:44:31 np0005588920 podman[257304]: 2026-01-20 14:44:31.997024687 +0000 UTC m=+0.084118846 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 20 09:44:32 np0005588920 nova_compute[226886]: 2026-01-20 14:44:32.069 226890 DEBUG oslo_concurrency.lockutils [None req-8d73d288-205f-484b-a73a-2995a5185f72 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Acquiring lock "c3c7df3a-49bb-4ca6-a517-e560cf730181" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:32 np0005588920 nova_compute[226886]: 2026-01-20 14:44:32.070 226890 DEBUG oslo_concurrency.lockutils [None req-8d73d288-205f-484b-a73a-2995a5185f72 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "c3c7df3a-49bb-4ca6-a517-e560cf730181" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:32 np0005588920 nova_compute[226886]: 2026-01-20 14:44:32.070 226890 DEBUG oslo_concurrency.lockutils [None req-8d73d288-205f-484b-a73a-2995a5185f72 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Acquiring lock "c3c7df3a-49bb-4ca6-a517-e560cf730181-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:32 np0005588920 nova_compute[226886]: 2026-01-20 14:44:32.070 226890 DEBUG oslo_concurrency.lockutils [None req-8d73d288-205f-484b-a73a-2995a5185f72 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "c3c7df3a-49bb-4ca6-a517-e560cf730181-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:32 np0005588920 nova_compute[226886]: 2026-01-20 14:44:32.071 226890 DEBUG oslo_concurrency.lockutils [None req-8d73d288-205f-484b-a73a-2995a5185f72 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "c3c7df3a-49bb-4ca6-a517-e560cf730181-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:32 np0005588920 nova_compute[226886]: 2026-01-20 14:44:32.072 226890 INFO nova.compute.manager [None req-8d73d288-205f-484b-a73a-2995a5185f72 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Terminating instance#033[00m
Jan 20 09:44:32 np0005588920 nova_compute[226886]: 2026-01-20 14:44:32.073 226890 DEBUG nova.compute.manager [None req-8d73d288-205f-484b-a73a-2995a5185f72 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:44:32 np0005588920 nova_compute[226886]: 2026-01-20 14:44:32.133 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:32.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:32 np0005588920 nova_compute[226886]: 2026-01-20 14:44:32.720 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:44:32 np0005588920 kernel: tap778b7086-52 (unregistering): left promiscuous mode
Jan 20 09:44:32 np0005588920 NetworkManager[49076]: <info>  [1768920272.7275] device (tap778b7086-52): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:44:32 np0005588920 nova_compute[226886]: 2026-01-20 14:44:32.735 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:32 np0005588920 ovn_controller[133971]: 2026-01-20T14:44:32Z|00337|binding|INFO|Releasing lport 778b7086-528a-4216-8236-277193c5e77c from this chassis (sb_readonly=0)
Jan 20 09:44:32 np0005588920 ovn_controller[133971]: 2026-01-20T14:44:32Z|00338|binding|INFO|Setting lport 778b7086-528a-4216-8236-277193c5e77c down in Southbound
Jan 20 09:44:32 np0005588920 ovn_controller[133971]: 2026-01-20T14:44:32Z|00339|binding|INFO|Removing iface tap778b7086-52 ovn-installed in OVS
Jan 20 09:44:32 np0005588920 nova_compute[226886]: 2026-01-20 14:44:32.738 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:32.745 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:70:ac:01 10.100.0.14'], port_security=['fa:16:3e:70:ac:01 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'c3c7df3a-49bb-4ca6-a517-e560cf730181', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbd5d614-a7d3-4563-913c-104506628e59', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3b31139b2a4e49cba5e7048febf901c4', 'neutron:revision_number': '4', 'neutron:security_group_ids': '117d6f57-074c-4b36-b375-42e0ab117254', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c42c6982-be52-495a-8746-42a46932572f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=778b7086-528a-4216-8236-277193c5e77c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:44:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:32.746 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 778b7086-528a-4216-8236-277193c5e77c in datapath fbd5d614-a7d3-4563-913c-104506628e59 unbound from our chassis#033[00m
Jan 20 09:44:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:32.747 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fbd5d614-a7d3-4563-913c-104506628e59, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:44:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:32.748 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d4cf5314-b5c5-4e2d-9e11-6704526035e0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:32.749 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59 namespace which is not needed anymore#033[00m
Jan 20 09:44:32 np0005588920 nova_compute[226886]: 2026-01-20 14:44:32.761 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:44:32 np0005588920 nova_compute[226886]: 2026-01-20 14:44:32.772 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:32 np0005588920 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d00000058.scope: Deactivated successfully.
Jan 20 09:44:32 np0005588920 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d00000058.scope: Consumed 9.020s CPU time.
Jan 20 09:44:32 np0005588920 systemd-machined[196121]: Machine qemu-36-instance-00000058 terminated.
Jan 20 09:44:32 np0005588920 nova_compute[226886]: 2026-01-20 14:44:32.797 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:32 np0005588920 nova_compute[226886]: 2026-01-20 14:44:32.797 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:32 np0005588920 nova_compute[226886]: 2026-01-20 14:44:32.798 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:32 np0005588920 nova_compute[226886]: 2026-01-20 14:44:32.798 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:44:32 np0005588920 nova_compute[226886]: 2026-01-20 14:44:32.798 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:44:32 np0005588920 nova_compute[226886]: 2026-01-20 14:44:32.916 226890 INFO nova.virt.libvirt.driver [-] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Instance destroyed successfully.#033[00m
Jan 20 09:44:32 np0005588920 nova_compute[226886]: 2026-01-20 14:44:32.917 226890 DEBUG nova.objects.instance [None req-8d73d288-205f-484b-a73a-2995a5185f72 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lazy-loading 'resources' on Instance uuid c3c7df3a-49bb-4ca6-a517-e560cf730181 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:44:32 np0005588920 nova_compute[226886]: 2026-01-20 14:44:32.938 226890 DEBUG nova.virt.libvirt.vif [None req-8d73d288-205f-484b-a73a-2995a5185f72 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:44:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1057408982',display_name='tempest-DeleteServersTestJSON-server-1057408982',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1057408982',id=88,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:44:23Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3b31139b2a4e49cba5e7048febf901c4',ramdisk_id='',reservation_id='r-dh6e9rzl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-DeleteServersTestJSON-1162922273',owner_user_name='tempest-DeleteServersTestJSON-1162922273-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:44:23Z,user_data=None,user_id='37e9ef97fbe0448e9fbe32d48b66211f',uuid=c3c7df3a-49bb-4ca6-a517-e560cf730181,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "778b7086-528a-4216-8236-277193c5e77c", "address": "fa:16:3e:70:ac:01", "network": {"id": "fbd5d614-a7d3-4563-913c-104506628e59", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-60721994-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b31139b2a4e49cba5e7048febf901c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap778b7086-52", "ovs_interfaceid": "778b7086-528a-4216-8236-277193c5e77c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:44:32 np0005588920 nova_compute[226886]: 2026-01-20 14:44:32.939 226890 DEBUG nova.network.os_vif_util [None req-8d73d288-205f-484b-a73a-2995a5185f72 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Converting VIF {"id": "778b7086-528a-4216-8236-277193c5e77c", "address": "fa:16:3e:70:ac:01", "network": {"id": "fbd5d614-a7d3-4563-913c-104506628e59", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-60721994-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b31139b2a4e49cba5e7048febf901c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap778b7086-52", "ovs_interfaceid": "778b7086-528a-4216-8236-277193c5e77c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:44:32 np0005588920 nova_compute[226886]: 2026-01-20 14:44:32.940 226890 DEBUG nova.network.os_vif_util [None req-8d73d288-205f-484b-a73a-2995a5185f72 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:70:ac:01,bridge_name='br-int',has_traffic_filtering=True,id=778b7086-528a-4216-8236-277193c5e77c,network=Network(fbd5d614-a7d3-4563-913c-104506628e59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap778b7086-52') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:44:32 np0005588920 nova_compute[226886]: 2026-01-20 14:44:32.940 226890 DEBUG os_vif [None req-8d73d288-205f-484b-a73a-2995a5185f72 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:70:ac:01,bridge_name='br-int',has_traffic_filtering=True,id=778b7086-528a-4216-8236-277193c5e77c,network=Network(fbd5d614-a7d3-4563-913c-104506628e59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap778b7086-52') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:44:32 np0005588920 nova_compute[226886]: 2026-01-20 14:44:32.942 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:32 np0005588920 nova_compute[226886]: 2026-01-20 14:44:32.942 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap778b7086-52, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:44:32 np0005588920 nova_compute[226886]: 2026-01-20 14:44:32.944 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:32 np0005588920 nova_compute[226886]: 2026-01-20 14:44:32.946 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:44:32 np0005588920 nova_compute[226886]: 2026-01-20 14:44:32.948 226890 INFO os_vif [None req-8d73d288-205f-484b-a73a-2995a5185f72 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:70:ac:01,bridge_name='br-int',has_traffic_filtering=True,id=778b7086-528a-4216-8236-277193c5e77c,network=Network(fbd5d614-a7d3-4563-913c-104506628e59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap778b7086-52')#033[00m
Jan 20 09:44:33 np0005588920 nova_compute[226886]: 2026-01-20 14:44:33.076 226890 DEBUG nova.compute.manager [req-5bb30f36-5471-4f65-8e9d-80fc50e94fb3 req-7dde5a1a-ea82-495b-ae96-a0c8d7f6a999 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Received event network-vif-unplugged-778b7086-528a-4216-8236-277193c5e77c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:44:33 np0005588920 nova_compute[226886]: 2026-01-20 14:44:33.081 226890 DEBUG oslo_concurrency.lockutils [req-5bb30f36-5471-4f65-8e9d-80fc50e94fb3 req-7dde5a1a-ea82-495b-ae96-a0c8d7f6a999 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "c3c7df3a-49bb-4ca6-a517-e560cf730181-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:33 np0005588920 nova_compute[226886]: 2026-01-20 14:44:33.082 226890 DEBUG oslo_concurrency.lockutils [req-5bb30f36-5471-4f65-8e9d-80fc50e94fb3 req-7dde5a1a-ea82-495b-ae96-a0c8d7f6a999 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c3c7df3a-49bb-4ca6-a517-e560cf730181-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:33 np0005588920 nova_compute[226886]: 2026-01-20 14:44:33.082 226890 DEBUG oslo_concurrency.lockutils [req-5bb30f36-5471-4f65-8e9d-80fc50e94fb3 req-7dde5a1a-ea82-495b-ae96-a0c8d7f6a999 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c3c7df3a-49bb-4ca6-a517-e560cf730181-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:33 np0005588920 nova_compute[226886]: 2026-01-20 14:44:33.082 226890 DEBUG nova.compute.manager [req-5bb30f36-5471-4f65-8e9d-80fc50e94fb3 req-7dde5a1a-ea82-495b-ae96-a0c8d7f6a999 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] No waiting events found dispatching network-vif-unplugged-778b7086-528a-4216-8236-277193c5e77c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:44:33 np0005588920 nova_compute[226886]: 2026-01-20 14:44:33.083 226890 DEBUG nova.compute.manager [req-5bb30f36-5471-4f65-8e9d-80fc50e94fb3 req-7dde5a1a-ea82-495b-ae96-a0c8d7f6a999 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Received event network-vif-unplugged-778b7086-528a-4216-8236-277193c5e77c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:44:33 np0005588920 neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59[257262]: [NOTICE]   (257266) : haproxy version is 2.8.14-c23fe91
Jan 20 09:44:33 np0005588920 neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59[257262]: [NOTICE]   (257266) : path to executable is /usr/sbin/haproxy
Jan 20 09:44:33 np0005588920 neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59[257262]: [WARNING]  (257266) : Exiting Master process...
Jan 20 09:44:33 np0005588920 neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59[257262]: [ALERT]    (257266) : Current worker (257268) exited with code 143 (Terminated)
Jan 20 09:44:33 np0005588920 neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59[257262]: [WARNING]  (257266) : All workers exited. Exiting... (0)
Jan 20 09:44:33 np0005588920 systemd[1]: libpod-f9bfbabad1bbacdd0275bb139ca61f7eac931279a1c64dda0e68f86dd89ae775.scope: Deactivated successfully.
Jan 20 09:44:33 np0005588920 conmon[257262]: conmon f9bfbabad1bbacdd0275 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-f9bfbabad1bbacdd0275bb139ca61f7eac931279a1c64dda0e68f86dd89ae775.scope/container/memory.events
Jan 20 09:44:33 np0005588920 podman[257355]: 2026-01-20 14:44:33.154779391 +0000 UTC m=+0.317529993 container died f9bfbabad1bbacdd0275bb139ca61f7eac931279a1c64dda0e68f86dd89ae775 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:44:33 np0005588920 nova_compute[226886]: 2026-01-20 14:44:33.255 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:44:33 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f9bfbabad1bbacdd0275bb139ca61f7eac931279a1c64dda0e68f86dd89ae775-userdata-shm.mount: Deactivated successfully.
Jan 20 09:44:33 np0005588920 systemd[1]: var-lib-containers-storage-overlay-8253751ca0941cc225e487e6eda8b96813d3da912956c9b9dd0c51413efd7e2d-merged.mount: Deactivated successfully.
Jan 20 09:44:33 np0005588920 podman[257355]: 2026-01-20 14:44:33.301281534 +0000 UTC m=+0.464032136 container cleanup f9bfbabad1bbacdd0275bb139ca61f7eac931279a1c64dda0e68f86dd89ae775 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:44:33 np0005588920 systemd[1]: libpod-conmon-f9bfbabad1bbacdd0275bb139ca61f7eac931279a1c64dda0e68f86dd89ae775.scope: Deactivated successfully.
Jan 20 09:44:33 np0005588920 nova_compute[226886]: 2026-01-20 14:44:33.326 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-00000058 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:44:33 np0005588920 nova_compute[226886]: 2026-01-20 14:44:33.326 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-00000058 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:44:33 np0005588920 nova_compute[226886]: 2026-01-20 14:44:33.326 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-00000058 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:44:33 np0005588920 podman[257437]: 2026-01-20 14:44:33.36531841 +0000 UTC m=+0.043352280 container remove f9bfbabad1bbacdd0275bb139ca61f7eac931279a1c64dda0e68f86dd89ae775 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:44:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:33.372 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[289c3e82-aa8d-4f8d-a33e-c9f63a62b61f]: (4, ('Tue Jan 20 02:44:32 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59 (f9bfbabad1bbacdd0275bb139ca61f7eac931279a1c64dda0e68f86dd89ae775)\nf9bfbabad1bbacdd0275bb139ca61f7eac931279a1c64dda0e68f86dd89ae775\nTue Jan 20 02:44:33 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59 (f9bfbabad1bbacdd0275bb139ca61f7eac931279a1c64dda0e68f86dd89ae775)\nf9bfbabad1bbacdd0275bb139ca61f7eac931279a1c64dda0e68f86dd89ae775\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:33.374 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[8da7fbdf-062b-4a86-8cd2-f8a4d3766b81]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:33.375 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbd5d614-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:44:33 np0005588920 nova_compute[226886]: 2026-01-20 14:44:33.376 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:33 np0005588920 kernel: tapfbd5d614-a0: left promiscuous mode
Jan 20 09:44:33 np0005588920 nova_compute[226886]: 2026-01-20 14:44:33.392 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:33.395 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d120deef-2767-434b-9f41-6468dcd999fb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:33.412 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5f89b6af-fcf3-4adc-9b75-08b904b04ce7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:33.413 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[356f9fa5-53b8-4020-8b7a-4c728fc68fe5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:33.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:33.428 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[3b4a0cef-f98e-4924-bf05-da4347005163]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 530482, 'reachable_time': 41820, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 257453, 'error': None, 'target': 'ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:33.431 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:44:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:33.431 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[fdd7f7e6-4581-4fde-b7bc-5672f4b09076]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:33 np0005588920 systemd[1]: run-netns-ovnmeta\x2dfbd5d614\x2da7d3\x2d4563\x2d913c\x2d104506628e59.mount: Deactivated successfully.
Jan 20 09:44:33 np0005588920 nova_compute[226886]: 2026-01-20 14:44:33.484 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:44:33 np0005588920 nova_compute[226886]: 2026-01-20 14:44:33.484 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4558MB free_disk=20.92196273803711GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:44:33 np0005588920 nova_compute[226886]: 2026-01-20 14:44:33.485 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:33 np0005588920 nova_compute[226886]: 2026-01-20 14:44:33.485 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:33 np0005588920 nova_compute[226886]: 2026-01-20 14:44:33.563 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance c3c7df3a-49bb-4ca6-a517-e560cf730181 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:44:33 np0005588920 nova_compute[226886]: 2026-01-20 14:44:33.564 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:44:33 np0005588920 nova_compute[226886]: 2026-01-20 14:44:33.564 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:44:33 np0005588920 nova_compute[226886]: 2026-01-20 14:44:33.608 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:44:34 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:44:34 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/899748910' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:44:34 np0005588920 nova_compute[226886]: 2026-01-20 14:44:34.121 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.514s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:44:34 np0005588920 nova_compute[226886]: 2026-01-20 14:44:34.128 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:44:34 np0005588920 nova_compute[226886]: 2026-01-20 14:44:34.147 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:44:34 np0005588920 nova_compute[226886]: 2026-01-20 14:44:34.171 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:44:34 np0005588920 nova_compute[226886]: 2026-01-20 14:44:34.172 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.687s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:34.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:35 np0005588920 nova_compute[226886]: 2026-01-20 14:44:35.136 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:44:35 np0005588920 nova_compute[226886]: 2026-01-20 14:44:35.136 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:44:35 np0005588920 nova_compute[226886]: 2026-01-20 14:44:35.136 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:44:35 np0005588920 nova_compute[226886]: 2026-01-20 14:44:35.168 226890 DEBUG nova.compute.manager [req-8429e876-d8dd-4a62-a3b3-0e605683986c req-53691fa4-bbcb-4557-a992-bb8f72df58e5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Received event network-vif-plugged-778b7086-528a-4216-8236-277193c5e77c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 09:44:35 np0005588920 nova_compute[226886]: 2026-01-20 14:44:35.168 226890 DEBUG oslo_concurrency.lockutils [req-8429e876-d8dd-4a62-a3b3-0e605683986c req-53691fa4-bbcb-4557-a992-bb8f72df58e5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "c3c7df3a-49bb-4ca6-a517-e560cf730181-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:44:35 np0005588920 nova_compute[226886]: 2026-01-20 14:44:35.168 226890 DEBUG oslo_concurrency.lockutils [req-8429e876-d8dd-4a62-a3b3-0e605683986c req-53691fa4-bbcb-4557-a992-bb8f72df58e5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c3c7df3a-49bb-4ca6-a517-e560cf730181-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:44:35 np0005588920 nova_compute[226886]: 2026-01-20 14:44:35.169 226890 DEBUG oslo_concurrency.lockutils [req-8429e876-d8dd-4a62-a3b3-0e605683986c req-53691fa4-bbcb-4557-a992-bb8f72df58e5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c3c7df3a-49bb-4ca6-a517-e560cf730181-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:44:35 np0005588920 nova_compute[226886]: 2026-01-20 14:44:35.169 226890 DEBUG nova.compute.manager [req-8429e876-d8dd-4a62-a3b3-0e605683986c req-53691fa4-bbcb-4557-a992-bb8f72df58e5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] No waiting events found dispatching network-vif-plugged-778b7086-528a-4216-8236-277193c5e77c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 20 09:44:35 np0005588920 nova_compute[226886]: 2026-01-20 14:44:35.169 226890 WARNING nova.compute.manager [req-8429e876-d8dd-4a62-a3b3-0e605683986c req-53691fa4-bbcb-4557-a992-bb8f72df58e5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Received unexpected event network-vif-plugged-778b7086-528a-4216-8236-277193c5e77c for instance with vm_state active and task_state deleting.
Jan 20 09:44:35 np0005588920 nova_compute[226886]: 2026-01-20 14:44:35.281 226890 INFO nova.virt.libvirt.driver [None req-8d73d288-205f-484b-a73a-2995a5185f72 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Deleting instance files /var/lib/nova/instances/c3c7df3a-49bb-4ca6-a517-e560cf730181_del
Jan 20 09:44:35 np0005588920 nova_compute[226886]: 2026-01-20 14:44:35.282 226890 INFO nova.virt.libvirt.driver [None req-8d73d288-205f-484b-a73a-2995a5185f72 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Deletion of /var/lib/nova/instances/c3c7df3a-49bb-4ca6-a517-e560cf730181_del complete
Jan 20 09:44:35 np0005588920 nova_compute[226886]: 2026-01-20 14:44:35.335 226890 INFO nova.compute.manager [None req-8d73d288-205f-484b-a73a-2995a5185f72 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Took 3.26 seconds to destroy the instance on the hypervisor.
Jan 20 09:44:35 np0005588920 nova_compute[226886]: 2026-01-20 14:44:35.336 226890 DEBUG oslo.service.loopingcall [None req-8d73d288-205f-484b-a73a-2995a5185f72 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 20 09:44:35 np0005588920 nova_compute[226886]: 2026-01-20 14:44:35.336 226890 DEBUG nova.compute.manager [-] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 20 09:44:35 np0005588920 nova_compute[226886]: 2026-01-20 14:44:35.337 226890 DEBUG nova.network.neutron [-] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 20 09:44:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:44:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:35.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:44:35 np0005588920 nova_compute[226886]: 2026-01-20 14:44:35.433 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:44:36 np0005588920 nova_compute[226886]: 2026-01-20 14:44:36.011 226890 DEBUG nova.network.neutron [-] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 20 09:44:36 np0005588920 nova_compute[226886]: 2026-01-20 14:44:36.028 226890 INFO nova.compute.manager [-] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Took 0.69 seconds to deallocate network for instance.
Jan 20 09:44:36 np0005588920 nova_compute[226886]: 2026-01-20 14:44:36.120 226890 DEBUG nova.compute.manager [req-c7f00a0e-56ca-4d64-b516-a9d0dc25d030 req-42e42759-26fb-45a0-8f3c-8d97b416d4cd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Received event network-vif-deleted-778b7086-528a-4216-8236-277193c5e77c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 09:44:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:36.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:36 np0005588920 nova_compute[226886]: 2026-01-20 14:44:36.271 226890 INFO nova.compute.manager [None req-8d73d288-205f-484b-a73a-2995a5185f72 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Took 0.24 seconds to detach 1 volumes for instance.
Jan 20 09:44:36 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:44:36 np0005588920 nova_compute[226886]: 2026-01-20 14:44:36.351 226890 DEBUG oslo_concurrency.lockutils [None req-8d73d288-205f-484b-a73a-2995a5185f72 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:44:36 np0005588920 nova_compute[226886]: 2026-01-20 14:44:36.351 226890 DEBUG oslo_concurrency.lockutils [None req-8d73d288-205f-484b-a73a-2995a5185f72 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:44:36 np0005588920 nova_compute[226886]: 2026-01-20 14:44:36.409 226890 DEBUG oslo_concurrency.processutils [None req-8d73d288-205f-484b-a73a-2995a5185f72 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:44:36 np0005588920 nova_compute[226886]: 2026-01-20 14:44:36.721 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 09:44:36 np0005588920 nova_compute[226886]: 2026-01-20 14:44:36.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 09:44:36 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:44:36 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4240458191' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:44:36 np0005588920 nova_compute[226886]: 2026-01-20 14:44:36.851 226890 DEBUG oslo_concurrency.processutils [None req-8d73d288-205f-484b-a73a-2995a5185f72 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:44:36 np0005588920 nova_compute[226886]: 2026-01-20 14:44:36.857 226890 DEBUG nova.compute.provider_tree [None req-8d73d288-205f-484b-a73a-2995a5185f72 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 20 09:44:36 np0005588920 nova_compute[226886]: 2026-01-20 14:44:36.881 226890 DEBUG nova.scheduler.client.report [None req-8d73d288-205f-484b-a73a-2995a5185f72 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 20 09:44:36 np0005588920 nova_compute[226886]: 2026-01-20 14:44:36.899 226890 DEBUG oslo_concurrency.lockutils [None req-8d73d288-205f-484b-a73a-2995a5185f72 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.548s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:44:36 np0005588920 nova_compute[226886]: 2026-01-20 14:44:36.919 226890 INFO nova.scheduler.client.report [None req-8d73d288-205f-484b-a73a-2995a5185f72 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Deleted allocations for instance c3c7df3a-49bb-4ca6-a517-e560cf730181
Jan 20 09:44:36 np0005588920 nova_compute[226886]: 2026-01-20 14:44:36.969 226890 DEBUG oslo_concurrency.lockutils [None req-8d73d288-205f-484b-a73a-2995a5185f72 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "c3c7df3a-49bb-4ca6-a517-e560cf730181" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.899s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:44:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:37.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:37 np0005588920 nova_compute[226886]: 2026-01-20 14:44:37.772 226890 DEBUG oslo_concurrency.lockutils [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "cc7de61a-b40f-4367-873d-c51b6f29310b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:44:37 np0005588920 nova_compute[226886]: 2026-01-20 14:44:37.773 226890 DEBUG oslo_concurrency.lockutils [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "cc7de61a-b40f-4367-873d-c51b6f29310b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:44:37 np0005588920 nova_compute[226886]: 2026-01-20 14:44:37.788 226890 DEBUG nova.compute.manager [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 20 09:44:37 np0005588920 nova_compute[226886]: 2026-01-20 14:44:37.851 226890 DEBUG oslo_concurrency.lockutils [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:44:37 np0005588920 nova_compute[226886]: 2026-01-20 14:44:37.852 226890 DEBUG oslo_concurrency.lockutils [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:44:37 np0005588920 nova_compute[226886]: 2026-01-20 14:44:37.859 226890 DEBUG nova.virt.hardware [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 20 09:44:37 np0005588920 nova_compute[226886]: 2026-01-20 14:44:37.859 226890 INFO nova.compute.claims [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Claim successful on node compute-2.ctlplane.example.com
Jan 20 09:44:37 np0005588920 nova_compute[226886]: 2026-01-20 14:44:37.947 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:44:37 np0005588920 nova_compute[226886]: 2026-01-20 14:44:37.964 226890 DEBUG oslo_concurrency.processutils [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:44:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:44:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:38.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:44:38 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:44:38 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1059237412' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:44:38 np0005588920 nova_compute[226886]: 2026-01-20 14:44:38.421 226890 DEBUG oslo_concurrency.processutils [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:44:38 np0005588920 nova_compute[226886]: 2026-01-20 14:44:38.428 226890 DEBUG nova.compute.provider_tree [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 20 09:44:38 np0005588920 nova_compute[226886]: 2026-01-20 14:44:38.443 226890 DEBUG nova.scheduler.client.report [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 20 09:44:38 np0005588920 nova_compute[226886]: 2026-01-20 14:44:38.460 226890 DEBUG oslo_concurrency.lockutils [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.608s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:44:38 np0005588920 nova_compute[226886]: 2026-01-20 14:44:38.461 226890 DEBUG nova.compute.manager [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 20 09:44:38 np0005588920 nova_compute[226886]: 2026-01-20 14:44:38.515 226890 DEBUG nova.compute.manager [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 20 09:44:38 np0005588920 nova_compute[226886]: 2026-01-20 14:44:38.515 226890 DEBUG nova.network.neutron [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 20 09:44:38 np0005588920 nova_compute[226886]: 2026-01-20 14:44:38.537 226890 INFO nova.virt.libvirt.driver [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 20 09:44:38 np0005588920 nova_compute[226886]: 2026-01-20 14:44:38.565 226890 DEBUG nova.compute.manager [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 20 09:44:38 np0005588920 nova_compute[226886]: 2026-01-20 14:44:38.680 226890 DEBUG nova.compute.manager [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 20 09:44:38 np0005588920 nova_compute[226886]: 2026-01-20 14:44:38.681 226890 DEBUG nova.virt.libvirt.driver [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 20 09:44:38 np0005588920 nova_compute[226886]: 2026-01-20 14:44:38.682 226890 INFO nova.virt.libvirt.driver [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Creating image(s)
Jan 20 09:44:38 np0005588920 nova_compute[226886]: 2026-01-20 14:44:38.782 226890 DEBUG nova.storage.rbd_utils [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] rbd image cc7de61a-b40f-4367-873d-c51b6f29310b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:44:39 np0005588920 nova_compute[226886]: 2026-01-20 14:44:39.008 226890 DEBUG nova.storage.rbd_utils [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] rbd image cc7de61a-b40f-4367-873d-c51b6f29310b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:44:39 np0005588920 nova_compute[226886]: 2026-01-20 14:44:39.039 226890 DEBUG nova.storage.rbd_utils [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] rbd image cc7de61a-b40f-4367-873d-c51b6f29310b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:44:39 np0005588920 nova_compute[226886]: 2026-01-20 14:44:39.044 226890 DEBUG oslo_concurrency.processutils [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:44:39 np0005588920 nova_compute[226886]: 2026-01-20 14:44:39.072 226890 DEBUG nova.policy [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '869086208e10436c9dc96c78bee9a85d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b683fcc0026242e28ba6d8fba638688e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 20 09:44:39 np0005588920 nova_compute[226886]: 2026-01-20 14:44:39.128 226890 DEBUG oslo_concurrency.processutils [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:44:39 np0005588920 nova_compute[226886]: 2026-01-20 14:44:39.129 226890 DEBUG oslo_concurrency.lockutils [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:44:39 np0005588920 nova_compute[226886]: 2026-01-20 14:44:39.129 226890 DEBUG oslo_concurrency.lockutils [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:44:39 np0005588920 nova_compute[226886]: 2026-01-20 14:44:39.130 226890 DEBUG oslo_concurrency.lockutils [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:44:39 np0005588920 nova_compute[226886]: 2026-01-20 14:44:39.158 226890 DEBUG nova.storage.rbd_utils [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] rbd image cc7de61a-b40f-4367-873d-c51b6f29310b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:44:39 np0005588920 nova_compute[226886]: 2026-01-20 14:44:39.163 226890 DEBUG oslo_concurrency.processutils [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 cc7de61a-b40f-4367-873d-c51b6f29310b_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:44:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:39.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:39 np0005588920 nova_compute[226886]: 2026-01-20 14:44:39.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 09:44:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:40.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:40 np0005588920 nova_compute[226886]: 2026-01-20 14:44:40.292 226890 DEBUG oslo_concurrency.processutils [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 cc7de61a-b40f-4367-873d-c51b6f29310b_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:44:40 np0005588920 nova_compute[226886]: 2026-01-20 14:44:40.361 226890 DEBUG nova.storage.rbd_utils [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] resizing rbd image cc7de61a-b40f-4367-873d-c51b6f29310b_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 20 09:44:40 np0005588920 nova_compute[226886]: 2026-01-20 14:44:40.447 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:44:40 np0005588920 nova_compute[226886]: 2026-01-20 14:44:40.453 226890 DEBUG nova.objects.instance [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lazy-loading 'migration_context' on Instance uuid cc7de61a-b40f-4367-873d-c51b6f29310b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 09:44:40 np0005588920 nova_compute[226886]: 2026-01-20 14:44:40.474 226890 DEBUG nova.virt.libvirt.driver [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 20 09:44:40 np0005588920 nova_compute[226886]: 2026-01-20 14:44:40.474 226890 DEBUG nova.virt.libvirt.driver [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Ensure instance console log exists: /var/lib/nova/instances/cc7de61a-b40f-4367-873d-c51b6f29310b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 20 09:44:40 np0005588920 nova_compute[226886]: 2026-01-20 14:44:40.475 226890 DEBUG oslo_concurrency.lockutils [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:44:40 np0005588920 nova_compute[226886]: 2026-01-20 14:44:40.475 226890 DEBUG oslo_concurrency.lockutils [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:44:40 np0005588920 nova_compute[226886]: 2026-01-20 14:44:40.475 226890 DEBUG oslo_concurrency.lockutils [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:44:40 np0005588920 nova_compute[226886]: 2026-01-20 14:44:40.659 226890 DEBUG nova.network.neutron [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Successfully created port: 4de545f7-326a-4971-87cd-a23be2cbce6a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 20 09:44:41 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:44:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:41.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:41 np0005588920 nova_compute[226886]: 2026-01-20 14:44:41.912 226890 DEBUG nova.network.neutron [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Successfully updated port: 4de545f7-326a-4971-87cd-a23be2cbce6a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:44:41 np0005588920 nova_compute[226886]: 2026-01-20 14:44:41.936 226890 DEBUG oslo_concurrency.lockutils [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "refresh_cache-cc7de61a-b40f-4367-873d-c51b6f29310b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:44:41 np0005588920 nova_compute[226886]: 2026-01-20 14:44:41.937 226890 DEBUG oslo_concurrency.lockutils [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquired lock "refresh_cache-cc7de61a-b40f-4367-873d-c51b6f29310b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:44:41 np0005588920 nova_compute[226886]: 2026-01-20 14:44:41.937 226890 DEBUG nova.network.neutron [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:44:41 np0005588920 nova_compute[226886]: 2026-01-20 14:44:41.997 226890 DEBUG nova.compute.manager [req-2bdc89a8-cf8c-4961-93ca-52f8579eea03 req-5698b848-5209-4d21-9e9a-f04fe4223483 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Received event network-changed-4de545f7-326a-4971-87cd-a23be2cbce6a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:44:41 np0005588920 nova_compute[226886]: 2026-01-20 14:44:41.998 226890 DEBUG nova.compute.manager [req-2bdc89a8-cf8c-4961-93ca-52f8579eea03 req-5698b848-5209-4d21-9e9a-f04fe4223483 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Refreshing instance network info cache due to event network-changed-4de545f7-326a-4971-87cd-a23be2cbce6a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:44:41 np0005588920 nova_compute[226886]: 2026-01-20 14:44:41.998 226890 DEBUG oslo_concurrency.lockutils [req-2bdc89a8-cf8c-4961-93ca-52f8579eea03 req-5698b848-5209-4d21-9e9a-f04fe4223483 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-cc7de61a-b40f-4367-873d-c51b6f29310b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:44:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:42.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:42 np0005588920 nova_compute[226886]: 2026-01-20 14:44:42.200 226890 DEBUG nova.network.neutron [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:44:42 np0005588920 nova_compute[226886]: 2026-01-20 14:44:42.951 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:44:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:43.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:44:44 np0005588920 nova_compute[226886]: 2026-01-20 14:44:44.009 226890 DEBUG nova.network.neutron [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Updating instance_info_cache with network_info: [{"id": "4de545f7-326a-4971-87cd-a23be2cbce6a", "address": "fa:16:3e:19:9b:8e", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4de545f7-32", "ovs_interfaceid": "4de545f7-326a-4971-87cd-a23be2cbce6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:44:44 np0005588920 podman[257686]: 2026-01-20 14:44:44.018678063 +0000 UTC m=+0.095952466 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Jan 20 09:44:44 np0005588920 nova_compute[226886]: 2026-01-20 14:44:44.034 226890 DEBUG oslo_concurrency.lockutils [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Releasing lock "refresh_cache-cc7de61a-b40f-4367-873d-c51b6f29310b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:44:44 np0005588920 nova_compute[226886]: 2026-01-20 14:44:44.035 226890 DEBUG nova.compute.manager [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Instance network_info: |[{"id": "4de545f7-326a-4971-87cd-a23be2cbce6a", "address": "fa:16:3e:19:9b:8e", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4de545f7-32", "ovs_interfaceid": "4de545f7-326a-4971-87cd-a23be2cbce6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:44:44 np0005588920 nova_compute[226886]: 2026-01-20 14:44:44.035 226890 DEBUG oslo_concurrency.lockutils [req-2bdc89a8-cf8c-4961-93ca-52f8579eea03 req-5698b848-5209-4d21-9e9a-f04fe4223483 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-cc7de61a-b40f-4367-873d-c51b6f29310b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:44:44 np0005588920 nova_compute[226886]: 2026-01-20 14:44:44.035 226890 DEBUG nova.network.neutron [req-2bdc89a8-cf8c-4961-93ca-52f8579eea03 req-5698b848-5209-4d21-9e9a-f04fe4223483 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Refreshing network info cache for port 4de545f7-326a-4971-87cd-a23be2cbce6a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:44:44 np0005588920 nova_compute[226886]: 2026-01-20 14:44:44.040 226890 DEBUG nova.virt.libvirt.driver [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Start _get_guest_xml network_info=[{"id": "4de545f7-326a-4971-87cd-a23be2cbce6a", "address": "fa:16:3e:19:9b:8e", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4de545f7-32", "ovs_interfaceid": "4de545f7-326a-4971-87cd-a23be2cbce6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:44:44 np0005588920 nova_compute[226886]: 2026-01-20 14:44:44.046 226890 WARNING nova.virt.libvirt.driver [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:44:44 np0005588920 nova_compute[226886]: 2026-01-20 14:44:44.053 226890 DEBUG nova.virt.libvirt.host [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:44:44 np0005588920 nova_compute[226886]: 2026-01-20 14:44:44.054 226890 DEBUG nova.virt.libvirt.host [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:44:44 np0005588920 nova_compute[226886]: 2026-01-20 14:44:44.058 226890 DEBUG nova.virt.libvirt.host [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:44:44 np0005588920 nova_compute[226886]: 2026-01-20 14:44:44.058 226890 DEBUG nova.virt.libvirt.host [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:44:44 np0005588920 nova_compute[226886]: 2026-01-20 14:44:44.059 226890 DEBUG nova.virt.libvirt.driver [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:44:44 np0005588920 nova_compute[226886]: 2026-01-20 14:44:44.060 226890 DEBUG nova.virt.hardware [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:44:44 np0005588920 nova_compute[226886]: 2026-01-20 14:44:44.060 226890 DEBUG nova.virt.hardware [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:44:44 np0005588920 nova_compute[226886]: 2026-01-20 14:44:44.060 226890 DEBUG nova.virt.hardware [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:44:44 np0005588920 nova_compute[226886]: 2026-01-20 14:44:44.061 226890 DEBUG nova.virt.hardware [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:44:44 np0005588920 nova_compute[226886]: 2026-01-20 14:44:44.061 226890 DEBUG nova.virt.hardware [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:44:44 np0005588920 nova_compute[226886]: 2026-01-20 14:44:44.061 226890 DEBUG nova.virt.hardware [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:44:44 np0005588920 nova_compute[226886]: 2026-01-20 14:44:44.062 226890 DEBUG nova.virt.hardware [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:44:44 np0005588920 nova_compute[226886]: 2026-01-20 14:44:44.062 226890 DEBUG nova.virt.hardware [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:44:44 np0005588920 nova_compute[226886]: 2026-01-20 14:44:44.062 226890 DEBUG nova.virt.hardware [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:44:44 np0005588920 nova_compute[226886]: 2026-01-20 14:44:44.062 226890 DEBUG nova.virt.hardware [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:44:44 np0005588920 nova_compute[226886]: 2026-01-20 14:44:44.063 226890 DEBUG nova.virt.hardware [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:44:44 np0005588920 nova_compute[226886]: 2026-01-20 14:44:44.066 226890 DEBUG oslo_concurrency.processutils [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:44:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:44.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:44 np0005588920 nova_compute[226886]: 2026-01-20 14:44:44.352 226890 DEBUG oslo_concurrency.lockutils [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Acquiring lock "b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:44 np0005588920 nova_compute[226886]: 2026-01-20 14:44:44.353 226890 DEBUG oslo_concurrency.lockutils [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:44 np0005588920 nova_compute[226886]: 2026-01-20 14:44:44.375 226890 DEBUG nova.compute.manager [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:44:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:44:44 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3502547636' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:44:44 np0005588920 nova_compute[226886]: 2026-01-20 14:44:44.577 226890 DEBUG oslo_concurrency.lockutils [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:44 np0005588920 nova_compute[226886]: 2026-01-20 14:44:44.577 226890 DEBUG oslo_concurrency.lockutils [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:44 np0005588920 nova_compute[226886]: 2026-01-20 14:44:44.586 226890 DEBUG nova.virt.hardware [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:44:44 np0005588920 nova_compute[226886]: 2026-01-20 14:44:44.586 226890 INFO nova.compute.claims [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 09:44:44 np0005588920 nova_compute[226886]: 2026-01-20 14:44:44.610 226890 DEBUG oslo_concurrency.processutils [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.544s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:44:44 np0005588920 nova_compute[226886]: 2026-01-20 14:44:44.637 226890 DEBUG nova.storage.rbd_utils [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] rbd image cc7de61a-b40f-4367-873d-c51b6f29310b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:44:44 np0005588920 nova_compute[226886]: 2026-01-20 14:44:44.641 226890 DEBUG oslo_concurrency.processutils [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:44:44 np0005588920 nova_compute[226886]: 2026-01-20 14:44:44.797 226890 DEBUG oslo_concurrency.processutils [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:44:45 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:44:45 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/714686278' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.072 226890 DEBUG oslo_concurrency.processutils [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.074 226890 DEBUG nova.virt.libvirt.vif [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:44:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1867010105',display_name='tempest-tempest.common.compute-instance-1867010105',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1867010105',id=90,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b683fcc0026242e28ba6d8fba638688e',ramdisk_id='',reservation_id='r-kc7qns9k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-967087071',owner_user_name='tempest-ServerActionsTestOtherA-967087071-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:44:38Z,user_data=None,user_id='869086208e10436c9dc96c78bee9a85d',uuid=cc7de61a-b40f-4367-873d-c51b6f29310b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4de545f7-326a-4971-87cd-a23be2cbce6a", "address": "fa:16:3e:19:9b:8e", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4de545f7-32", "ovs_interfaceid": "4de545f7-326a-4971-87cd-a23be2cbce6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.074 226890 DEBUG nova.network.os_vif_util [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Converting VIF {"id": "4de545f7-326a-4971-87cd-a23be2cbce6a", "address": "fa:16:3e:19:9b:8e", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4de545f7-32", "ovs_interfaceid": "4de545f7-326a-4971-87cd-a23be2cbce6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.075 226890 DEBUG nova.network.os_vif_util [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:9b:8e,bridge_name='br-int',has_traffic_filtering=True,id=4de545f7-326a-4971-87cd-a23be2cbce6a,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4de545f7-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.076 226890 DEBUG nova.objects.instance [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lazy-loading 'pci_devices' on Instance uuid cc7de61a-b40f-4367-873d-c51b6f29310b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.092 226890 DEBUG nova.virt.libvirt.driver [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:44:45 np0005588920 nova_compute[226886]:  <uuid>cc7de61a-b40f-4367-873d-c51b6f29310b</uuid>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:  <name>instance-0000005a</name>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:44:45 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:      <nova:name>tempest-tempest.common.compute-instance-1867010105</nova:name>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:44:44</nova:creationTime>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:44:45 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:        <nova:user uuid="869086208e10436c9dc96c78bee9a85d">tempest-ServerActionsTestOtherA-967087071-project-member</nova:user>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:        <nova:project uuid="b683fcc0026242e28ba6d8fba638688e">tempest-ServerActionsTestOtherA-967087071</nova:project>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:        <nova:port uuid="4de545f7-326a-4971-87cd-a23be2cbce6a">
Jan 20 09:44:45 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:      <entry name="serial">cc7de61a-b40f-4367-873d-c51b6f29310b</entry>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:      <entry name="uuid">cc7de61a-b40f-4367-873d-c51b6f29310b</entry>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:44:45 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/cc7de61a-b40f-4367-873d-c51b6f29310b_disk">
Jan 20 09:44:45 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:44:45 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:44:45 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/cc7de61a-b40f-4367-873d-c51b6f29310b_disk.config">
Jan 20 09:44:45 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:44:45 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:44:45 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:19:9b:8e"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:      <target dev="tap4de545f7-32"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:44:45 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/cc7de61a-b40f-4367-873d-c51b6f29310b/console.log" append="off"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:44:45 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:44:45 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:44:45 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:44:45 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:44:45 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.094 226890 DEBUG nova.compute.manager [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Preparing to wait for external event network-vif-plugged-4de545f7-326a-4971-87cd-a23be2cbce6a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.094 226890 DEBUG oslo_concurrency.lockutils [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "cc7de61a-b40f-4367-873d-c51b6f29310b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.095 226890 DEBUG oslo_concurrency.lockutils [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "cc7de61a-b40f-4367-873d-c51b6f29310b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.095 226890 DEBUG oslo_concurrency.lockutils [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "cc7de61a-b40f-4367-873d-c51b6f29310b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.096 226890 DEBUG nova.virt.libvirt.vif [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:44:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1867010105',display_name='tempest-tempest.common.compute-instance-1867010105',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1867010105',id=90,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b683fcc0026242e28ba6d8fba638688e',ramdisk_id='',reservation_id='r-kc7qns9k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-967087071',owner_user_name='tempest
-ServerActionsTestOtherA-967087071-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:44:38Z,user_data=None,user_id='869086208e10436c9dc96c78bee9a85d',uuid=cc7de61a-b40f-4367-873d-c51b6f29310b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4de545f7-326a-4971-87cd-a23be2cbce6a", "address": "fa:16:3e:19:9b:8e", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4de545f7-32", "ovs_interfaceid": "4de545f7-326a-4971-87cd-a23be2cbce6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.096 226890 DEBUG nova.network.os_vif_util [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Converting VIF {"id": "4de545f7-326a-4971-87cd-a23be2cbce6a", "address": "fa:16:3e:19:9b:8e", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4de545f7-32", "ovs_interfaceid": "4de545f7-326a-4971-87cd-a23be2cbce6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.097 226890 DEBUG nova.network.os_vif_util [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:9b:8e,bridge_name='br-int',has_traffic_filtering=True,id=4de545f7-326a-4971-87cd-a23be2cbce6a,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4de545f7-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.097 226890 DEBUG os_vif [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:9b:8e,bridge_name='br-int',has_traffic_filtering=True,id=4de545f7-326a-4971-87cd-a23be2cbce6a,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4de545f7-32') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.098 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.098 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.098 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.102 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.102 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4de545f7-32, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.102 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4de545f7-32, col_values=(('external_ids', {'iface-id': '4de545f7-326a-4971-87cd-a23be2cbce6a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:19:9b:8e', 'vm-uuid': 'cc7de61a-b40f-4367-873d-c51b6f29310b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.104 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:45 np0005588920 NetworkManager[49076]: <info>  [1768920285.1049] manager: (tap4de545f7-32): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/177)
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.106 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.109 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.110 226890 INFO os_vif [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:9b:8e,bridge_name='br-int',has_traffic_filtering=True,id=4de545f7-326a-4971-87cd-a23be2cbce6a,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4de545f7-32')#033[00m
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.159 226890 DEBUG nova.virt.libvirt.driver [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.159 226890 DEBUG nova.virt.libvirt.driver [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.160 226890 DEBUG nova.virt.libvirt.driver [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] No VIF found with MAC fa:16:3e:19:9b:8e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.160 226890 INFO nova.virt.libvirt.driver [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Using config drive#033[00m
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.186 226890 DEBUG nova.storage.rbd_utils [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] rbd image cc7de61a-b40f-4367-873d-c51b6f29310b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:44:45 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:44:45 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2544638103' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.221 226890 DEBUG oslo_concurrency.processutils [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.226 226890 DEBUG nova.compute.provider_tree [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.239 226890 DEBUG nova.scheduler.client.report [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.263 226890 DEBUG oslo_concurrency.lockutils [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.685s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.263 226890 DEBUG nova.compute.manager [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.345 226890 DEBUG nova.compute.manager [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.346 226890 DEBUG nova.network.neutron [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.365 226890 INFO nova.virt.libvirt.driver [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.380 226890 DEBUG nova.compute.manager [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.437 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:45.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.472 226890 DEBUG nova.compute.manager [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.473 226890 DEBUG nova.virt.libvirt.driver [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.473 226890 INFO nova.virt.libvirt.driver [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Creating image(s)#033[00m
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.495 226890 DEBUG nova.storage.rbd_utils [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] rbd image b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.522 226890 DEBUG nova.storage.rbd_utils [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] rbd image b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.549 226890 DEBUG nova.storage.rbd_utils [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] rbd image b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.552 226890 DEBUG oslo_concurrency.processutils [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.612 226890 DEBUG oslo_concurrency.processutils [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.613 226890 DEBUG oslo_concurrency.lockutils [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.613 226890 DEBUG oslo_concurrency.lockutils [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.614 226890 DEBUG oslo_concurrency.lockutils [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.634 226890 DEBUG nova.storage.rbd_utils [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] rbd image b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.637 226890 DEBUG oslo_concurrency.processutils [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.917 226890 DEBUG oslo_concurrency.processutils [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.280s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:44:45 np0005588920 nova_compute[226886]: 2026-01-20 14:44:45.980 226890 DEBUG nova.storage.rbd_utils [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] resizing rbd image b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:44:46 np0005588920 nova_compute[226886]: 2026-01-20 14:44:46.100 226890 DEBUG nova.objects.instance [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lazy-loading 'migration_context' on Instance uuid b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:44:46 np0005588920 nova_compute[226886]: 2026-01-20 14:44:46.154 226890 DEBUG nova.virt.libvirt.driver [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:44:46 np0005588920 nova_compute[226886]: 2026-01-20 14:44:46.154 226890 DEBUG nova.virt.libvirt.driver [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Ensure instance console log exists: /var/lib/nova/instances/b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:44:46 np0005588920 nova_compute[226886]: 2026-01-20 14:44:46.154 226890 DEBUG oslo_concurrency.lockutils [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:46 np0005588920 nova_compute[226886]: 2026-01-20 14:44:46.155 226890 DEBUG oslo_concurrency.lockutils [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:46 np0005588920 nova_compute[226886]: 2026-01-20 14:44:46.155 226890 DEBUG oslo_concurrency.lockutils [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:46.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:46 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:44:46 np0005588920 nova_compute[226886]: 2026-01-20 14:44:46.575 226890 DEBUG nova.policy [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '37e9ef97fbe0448e9fbe32d48b66211f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3b31139b2a4e49cba5e7048febf901c4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:44:46 np0005588920 nova_compute[226886]: 2026-01-20 14:44:46.632 226890 DEBUG nova.network.neutron [req-2bdc89a8-cf8c-4961-93ca-52f8579eea03 req-5698b848-5209-4d21-9e9a-f04fe4223483 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Updated VIF entry in instance network info cache for port 4de545f7-326a-4971-87cd-a23be2cbce6a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:44:46 np0005588920 nova_compute[226886]: 2026-01-20 14:44:46.632 226890 DEBUG nova.network.neutron [req-2bdc89a8-cf8c-4961-93ca-52f8579eea03 req-5698b848-5209-4d21-9e9a-f04fe4223483 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Updating instance_info_cache with network_info: [{"id": "4de545f7-326a-4971-87cd-a23be2cbce6a", "address": "fa:16:3e:19:9b:8e", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4de545f7-32", "ovs_interfaceid": "4de545f7-326a-4971-87cd-a23be2cbce6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:44:46 np0005588920 nova_compute[226886]: 2026-01-20 14:44:46.656 226890 DEBUG oslo_concurrency.lockutils [req-2bdc89a8-cf8c-4961-93ca-52f8579eea03 req-5698b848-5209-4d21-9e9a-f04fe4223483 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-cc7de61a-b40f-4367-873d-c51b6f29310b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:44:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:44:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:47.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:44:47 np0005588920 nova_compute[226886]: 2026-01-20 14:44:47.536 226890 INFO nova.virt.libvirt.driver [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Creating config drive at /var/lib/nova/instances/cc7de61a-b40f-4367-873d-c51b6f29310b/disk.config#033[00m
Jan 20 09:44:47 np0005588920 nova_compute[226886]: 2026-01-20 14:44:47.541 226890 DEBUG oslo_concurrency.processutils [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/cc7de61a-b40f-4367-873d-c51b6f29310b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqncpaskv execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:44:47 np0005588920 nova_compute[226886]: 2026-01-20 14:44:47.675 226890 DEBUG oslo_concurrency.processutils [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/cc7de61a-b40f-4367-873d-c51b6f29310b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqncpaskv" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:44:47 np0005588920 nova_compute[226886]: 2026-01-20 14:44:47.706 226890 DEBUG nova.storage.rbd_utils [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] rbd image cc7de61a-b40f-4367-873d-c51b6f29310b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:44:47 np0005588920 nova_compute[226886]: 2026-01-20 14:44:47.709 226890 DEBUG oslo_concurrency.processutils [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/cc7de61a-b40f-4367-873d-c51b6f29310b/disk.config cc7de61a-b40f-4367-873d-c51b6f29310b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:44:47 np0005588920 nova_compute[226886]: 2026-01-20 14:44:47.871 226890 DEBUG oslo_concurrency.processutils [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/cc7de61a-b40f-4367-873d-c51b6f29310b/disk.config cc7de61a-b40f-4367-873d-c51b6f29310b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.162s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:44:47 np0005588920 nova_compute[226886]: 2026-01-20 14:44:47.872 226890 INFO nova.virt.libvirt.driver [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Deleting local config drive /var/lib/nova/instances/cc7de61a-b40f-4367-873d-c51b6f29310b/disk.config because it was imported into RBD.#033[00m
Jan 20 09:44:47 np0005588920 nova_compute[226886]: 2026-01-20 14:44:47.915 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768920272.9139166, c3c7df3a-49bb-4ca6-a517-e560cf730181 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:44:47 np0005588920 nova_compute[226886]: 2026-01-20 14:44:47.915 226890 INFO nova.compute.manager [-] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:44:47 np0005588920 kernel: tap4de545f7-32: entered promiscuous mode
Jan 20 09:44:47 np0005588920 NetworkManager[49076]: <info>  [1768920287.9190] manager: (tap4de545f7-32): new Tun device (/org/freedesktop/NetworkManager/Devices/178)
Jan 20 09:44:47 np0005588920 ovn_controller[133971]: 2026-01-20T14:44:47Z|00340|binding|INFO|Claiming lport 4de545f7-326a-4971-87cd-a23be2cbce6a for this chassis.
Jan 20 09:44:47 np0005588920 ovn_controller[133971]: 2026-01-20T14:44:47Z|00341|binding|INFO|4de545f7-326a-4971-87cd-a23be2cbce6a: Claiming fa:16:3e:19:9b:8e 10.100.0.11
Jan 20 09:44:47 np0005588920 nova_compute[226886]: 2026-01-20 14:44:47.920 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:47 np0005588920 nova_compute[226886]: 2026-01-20 14:44:47.928 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:47 np0005588920 nova_compute[226886]: 2026-01-20 14:44:47.936 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:47 np0005588920 NetworkManager[49076]: <info>  [1768920287.9377] manager: (patch-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/179)
Jan 20 09:44:47 np0005588920 NetworkManager[49076]: <info>  [1768920287.9383] manager: (patch-br-int-to-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/180)
Jan 20 09:44:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:47.938 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:19:9b:8e 10.100.0.11'], port_security=['fa:16:3e:19:9b:8e 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'cc7de61a-b40f-4367-873d-c51b6f29310b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b683fcc0026242e28ba6d8fba638688e', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ac411cec-795a-42a6-ba83-9468a87a4a14', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=361f9a69-30a6-4be4-89ad-2a8f92877af2, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=4de545f7-326a-4971-87cd-a23be2cbce6a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:44:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:47.939 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 4de545f7-326a-4971-87cd-a23be2cbce6a in datapath a19e9d1a-864f-41ee-bdea-188e65973ea5 bound to our chassis#033[00m
Jan 20 09:44:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:47.942 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a19e9d1a-864f-41ee-bdea-188e65973ea5#033[00m
Jan 20 09:44:47 np0005588920 systemd-udevd[258027]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:44:47 np0005588920 nova_compute[226886]: 2026-01-20 14:44:47.943 226890 DEBUG nova.compute.manager [None req-96bdc4d7-98ff-458b-a307-421caae237af - - - - - -] [instance: c3c7df3a-49bb-4ca6-a517-e560cf730181] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:44:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:47.954 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e16e9f6b-d587-4367-9d44-78845a1c29ac]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:47.956 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa19e9d1a-81 in ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:44:47 np0005588920 systemd-machined[196121]: New machine qemu-37-instance-0000005a.
Jan 20 09:44:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:47.959 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa19e9d1a-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:44:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:47.959 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ef6dd1b1-e097-4580-a15b-b0fee520f03c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:47.960 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[3d686ece-c153-40dd-a026-2933c5f3dd2c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:47 np0005588920 NetworkManager[49076]: <info>  [1768920287.9616] device (tap4de545f7-32): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:44:47 np0005588920 NetworkManager[49076]: <info>  [1768920287.9623] device (tap4de545f7-32): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:44:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:47.973 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[5edb5214-a809-4d6b-9a8c-fecd017b38ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:47 np0005588920 systemd[1]: Started Virtual Machine qemu-37-instance-0000005a.
Jan 20 09:44:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:47.998 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d8e44b3a-0fef-44e6-bcd3-3cafba0aa8d7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:48.028 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[d5ed93c2-03c3-4f6e-beea-43ca9da671f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:48.038 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[20c23e89-7d29-4208-85d3-d5fdbb79b150]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:48 np0005588920 NetworkManager[49076]: <info>  [1768920288.0400] manager: (tapa19e9d1a-80): new Veth device (/org/freedesktop/NetworkManager/Devices/181)
Jan 20 09:44:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:48.078 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[96845c42-8188-4f66-8a3d-f876725bd3b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:48.081 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[f48592c1-557b-432f-86fa-da3d9aa45e08]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:48 np0005588920 NetworkManager[49076]: <info>  [1768920288.0986] device (tapa19e9d1a-80): carrier: link connected
Jan 20 09:44:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:48.103 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[5e7f00a1-6635-45c0-862d-3184aa77b357]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:48.120 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[214050e5-226d-4662-83ed-d0edb548c86c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa19e9d1a-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:21:53:13'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 112], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 532978, 'reachable_time': 37574, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 258061, 'error': None, 'target': 'ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:48.135 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[78354f3d-f3db-44d4-bce6-08af4647a8e5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe21:5313'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 532978, 'tstamp': 532978}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 258062, 'error': None, 'target': 'ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:48 np0005588920 nova_compute[226886]: 2026-01-20 14:44:48.139 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:48 np0005588920 nova_compute[226886]: 2026-01-20 14:44:48.158 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:48.167 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f6fdf098-dd4e-4a8b-ae31-9a94fb8bf8a9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa19e9d1a-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:21:53:13'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 112], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 532978, 'reachable_time': 37574, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 258063, 'error': None, 'target': 'ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:48 np0005588920 ovn_controller[133971]: 2026-01-20T14:44:48Z|00342|binding|INFO|Setting lport 4de545f7-326a-4971-87cd-a23be2cbce6a ovn-installed in OVS
Jan 20 09:44:48 np0005588920 ovn_controller[133971]: 2026-01-20T14:44:48Z|00343|binding|INFO|Setting lport 4de545f7-326a-4971-87cd-a23be2cbce6a up in Southbound
Jan 20 09:44:48 np0005588920 nova_compute[226886]: 2026-01-20 14:44:48.171 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:48.198 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[4461ac8c-6c73-4dfc-bf7d-df13c7c6b80d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:48.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:48.248 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0b333fd0-54dd-43b7-8f3a-2e58d0d1cca4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:48.249 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa19e9d1a-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:44:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:48.249 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:44:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:48.249 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa19e9d1a-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:44:48 np0005588920 kernel: tapa19e9d1a-80: entered promiscuous mode
Jan 20 09:44:48 np0005588920 nova_compute[226886]: 2026-01-20 14:44:48.250 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:48 np0005588920 NetworkManager[49076]: <info>  [1768920288.2518] manager: (tapa19e9d1a-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/182)
Jan 20 09:44:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:48.254 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa19e9d1a-80, col_values=(('external_ids', {'iface-id': '5527ab8d-a985-420b-9d5b-7e5d9baf7004'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:44:48 np0005588920 nova_compute[226886]: 2026-01-20 14:44:48.255 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:48 np0005588920 ovn_controller[133971]: 2026-01-20T14:44:48Z|00344|binding|INFO|Releasing lport 5527ab8d-a985-420b-9d5b-7e5d9baf7004 from this chassis (sb_readonly=1)
Jan 20 09:44:48 np0005588920 nova_compute[226886]: 2026-01-20 14:44:48.256 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:48.256 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a19e9d1a-864f-41ee-bdea-188e65973ea5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a19e9d1a-864f-41ee-bdea-188e65973ea5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:44:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:48.257 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[00af5170-2cb1-47df-b92d-7de2b97ba983]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:48.258 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:44:48 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 09:44:48 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 09:44:48 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-a19e9d1a-864f-41ee-bdea-188e65973ea5
Jan 20 09:44:48 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 09:44:48 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 09:44:48 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 09:44:48 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/a19e9d1a-864f-41ee-bdea-188e65973ea5.pid.haproxy
Jan 20 09:44:48 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 09:44:48 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:44:48 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 09:44:48 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 09:44:48 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 09:44:48 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 09:44:48 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 09:44:48 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 09:44:48 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 09:44:48 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 09:44:48 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 09:44:48 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 09:44:48 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 09:44:48 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 09:44:48 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 09:44:48 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:44:48 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:44:48 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 09:44:48 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 09:44:48 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:44:48 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID a19e9d1a-864f-41ee-bdea-188e65973ea5
Jan 20 09:44:48 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:44:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:48.259 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'env', 'PROCESS_TAG=haproxy-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a19e9d1a-864f-41ee-bdea-188e65973ea5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:44:48 np0005588920 nova_compute[226886]: 2026-01-20 14:44:48.270 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:48 np0005588920 nova_compute[226886]: 2026-01-20 14:44:48.450 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920288.4499989, cc7de61a-b40f-4367-873d-c51b6f29310b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:44:48 np0005588920 nova_compute[226886]: 2026-01-20 14:44:48.450 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] VM Started (Lifecycle Event)#033[00m
Jan 20 09:44:48 np0005588920 nova_compute[226886]: 2026-01-20 14:44:48.471 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:44:48 np0005588920 nova_compute[226886]: 2026-01-20 14:44:48.474 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920288.451213, cc7de61a-b40f-4367-873d-c51b6f29310b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:44:48 np0005588920 nova_compute[226886]: 2026-01-20 14:44:48.475 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:44:48 np0005588920 nova_compute[226886]: 2026-01-20 14:44:48.507 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:44:48 np0005588920 nova_compute[226886]: 2026-01-20 14:44:48.512 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:44:48 np0005588920 nova_compute[226886]: 2026-01-20 14:44:48.537 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:44:48 np0005588920 podman[258137]: 2026-01-20 14:44:48.611859991 +0000 UTC m=+0.048890574 container create 1bf65416d8513e17f0c7eebcbad247985d561759ae695283474d9d6ced49ea43 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:44:48 np0005588920 systemd[1]: Started libpod-conmon-1bf65416d8513e17f0c7eebcbad247985d561759ae695283474d9d6ced49ea43.scope.
Jan 20 09:44:48 np0005588920 podman[258137]: 2026-01-20 14:44:48.585765504 +0000 UTC m=+0.022796067 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:44:48 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:44:48 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc6f75607ebd2f6f152495500a34a77353a3e2d95d66ff024c0b1b15c366c4cf/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:44:48 np0005588920 podman[258137]: 2026-01-20 14:44:48.703811664 +0000 UTC m=+0.140842207 container init 1bf65416d8513e17f0c7eebcbad247985d561759ae695283474d9d6ced49ea43 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 20 09:44:48 np0005588920 podman[258137]: 2026-01-20 14:44:48.710527062 +0000 UTC m=+0.147557605 container start 1bf65416d8513e17f0c7eebcbad247985d561759ae695283474d9d6ced49ea43 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:44:48 np0005588920 neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5[258152]: [NOTICE]   (258156) : New worker (258158) forked
Jan 20 09:44:48 np0005588920 neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5[258152]: [NOTICE]   (258156) : Loading success.
Jan 20 09:44:48 np0005588920 nova_compute[226886]: 2026-01-20 14:44:48.807 226890 DEBUG nova.compute.manager [req-abf41946-56aa-4ba3-b2d7-6a21813d06be req-ad2e8ee9-7375-42f8-9469-f8e892743f4f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Received event network-vif-plugged-4de545f7-326a-4971-87cd-a23be2cbce6a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:44:48 np0005588920 nova_compute[226886]: 2026-01-20 14:44:48.807 226890 DEBUG oslo_concurrency.lockutils [req-abf41946-56aa-4ba3-b2d7-6a21813d06be req-ad2e8ee9-7375-42f8-9469-f8e892743f4f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "cc7de61a-b40f-4367-873d-c51b6f29310b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:48 np0005588920 nova_compute[226886]: 2026-01-20 14:44:48.807 226890 DEBUG oslo_concurrency.lockutils [req-abf41946-56aa-4ba3-b2d7-6a21813d06be req-ad2e8ee9-7375-42f8-9469-f8e892743f4f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "cc7de61a-b40f-4367-873d-c51b6f29310b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:48 np0005588920 nova_compute[226886]: 2026-01-20 14:44:48.808 226890 DEBUG oslo_concurrency.lockutils [req-abf41946-56aa-4ba3-b2d7-6a21813d06be req-ad2e8ee9-7375-42f8-9469-f8e892743f4f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "cc7de61a-b40f-4367-873d-c51b6f29310b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:48 np0005588920 nova_compute[226886]: 2026-01-20 14:44:48.808 226890 DEBUG nova.compute.manager [req-abf41946-56aa-4ba3-b2d7-6a21813d06be req-ad2e8ee9-7375-42f8-9469-f8e892743f4f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Processing event network-vif-plugged-4de545f7-326a-4971-87cd-a23be2cbce6a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:44:48 np0005588920 nova_compute[226886]: 2026-01-20 14:44:48.808 226890 DEBUG nova.compute.manager [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:44:48 np0005588920 nova_compute[226886]: 2026-01-20 14:44:48.811 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920288.8112888, cc7de61a-b40f-4367-873d-c51b6f29310b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:44:48 np0005588920 nova_compute[226886]: 2026-01-20 14:44:48.811 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:44:48 np0005588920 nova_compute[226886]: 2026-01-20 14:44:48.813 226890 DEBUG nova.virt.libvirt.driver [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:44:48 np0005588920 nova_compute[226886]: 2026-01-20 14:44:48.815 226890 INFO nova.virt.libvirt.driver [-] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Instance spawned successfully.#033[00m
Jan 20 09:44:48 np0005588920 nova_compute[226886]: 2026-01-20 14:44:48.816 226890 DEBUG nova.virt.libvirt.driver [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:44:48 np0005588920 nova_compute[226886]: 2026-01-20 14:44:48.858 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:44:48 np0005588920 nova_compute[226886]: 2026-01-20 14:44:48.866 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:44:48 np0005588920 nova_compute[226886]: 2026-01-20 14:44:48.870 226890 DEBUG nova.virt.libvirt.driver [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:44:48 np0005588920 nova_compute[226886]: 2026-01-20 14:44:48.870 226890 DEBUG nova.virt.libvirt.driver [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:44:48 np0005588920 nova_compute[226886]: 2026-01-20 14:44:48.871 226890 DEBUG nova.virt.libvirt.driver [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:44:48 np0005588920 nova_compute[226886]: 2026-01-20 14:44:48.871 226890 DEBUG nova.virt.libvirt.driver [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:44:48 np0005588920 nova_compute[226886]: 2026-01-20 14:44:48.872 226890 DEBUG nova.virt.libvirt.driver [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:44:48 np0005588920 nova_compute[226886]: 2026-01-20 14:44:48.873 226890 DEBUG nova.virt.libvirt.driver [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:44:48 np0005588920 nova_compute[226886]: 2026-01-20 14:44:48.933 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:44:48 np0005588920 nova_compute[226886]: 2026-01-20 14:44:48.957 226890 INFO nova.compute.manager [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Took 10.28 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:44:48 np0005588920 nova_compute[226886]: 2026-01-20 14:44:48.958 226890 DEBUG nova.compute.manager [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:44:49 np0005588920 nova_compute[226886]: 2026-01-20 14:44:49.018 226890 INFO nova.compute.manager [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Took 11.18 seconds to build instance.#033[00m
Jan 20 09:44:49 np0005588920 nova_compute[226886]: 2026-01-20 14:44:49.037 226890 DEBUG oslo_concurrency.lockutils [None req-a74d2126-2145-4b80-9a23-15c4e7bcce96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "cc7de61a-b40f-4367-873d-c51b6f29310b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.264s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:49.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:49 np0005588920 nova_compute[226886]: 2026-01-20 14:44:49.602 226890 DEBUG nova.network.neutron [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Successfully created port: 254849e1-8318-4ef7-919b-0cdab5b4bc42 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:44:50 np0005588920 nova_compute[226886]: 2026-01-20 14:44:50.105 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:50.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:50 np0005588920 nova_compute[226886]: 2026-01-20 14:44:50.438 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:51 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:44:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:44:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:51.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:44:51 np0005588920 nova_compute[226886]: 2026-01-20 14:44:51.726 226890 DEBUG nova.compute.manager [req-c0062cb3-2d7f-4d1c-816b-cee741d75d03 req-7875d838-d449-4345-adeb-2b94d8e1bfc2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Received event network-vif-plugged-4de545f7-326a-4971-87cd-a23be2cbce6a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:44:51 np0005588920 nova_compute[226886]: 2026-01-20 14:44:51.727 226890 DEBUG oslo_concurrency.lockutils [req-c0062cb3-2d7f-4d1c-816b-cee741d75d03 req-7875d838-d449-4345-adeb-2b94d8e1bfc2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "cc7de61a-b40f-4367-873d-c51b6f29310b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:51 np0005588920 nova_compute[226886]: 2026-01-20 14:44:51.727 226890 DEBUG oslo_concurrency.lockutils [req-c0062cb3-2d7f-4d1c-816b-cee741d75d03 req-7875d838-d449-4345-adeb-2b94d8e1bfc2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "cc7de61a-b40f-4367-873d-c51b6f29310b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:51 np0005588920 nova_compute[226886]: 2026-01-20 14:44:51.728 226890 DEBUG oslo_concurrency.lockutils [req-c0062cb3-2d7f-4d1c-816b-cee741d75d03 req-7875d838-d449-4345-adeb-2b94d8e1bfc2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "cc7de61a-b40f-4367-873d-c51b6f29310b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:51 np0005588920 nova_compute[226886]: 2026-01-20 14:44:51.728 226890 DEBUG nova.compute.manager [req-c0062cb3-2d7f-4d1c-816b-cee741d75d03 req-7875d838-d449-4345-adeb-2b94d8e1bfc2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] No waiting events found dispatching network-vif-plugged-4de545f7-326a-4971-87cd-a23be2cbce6a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:44:51 np0005588920 nova_compute[226886]: 2026-01-20 14:44:51.728 226890 WARNING nova.compute.manager [req-c0062cb3-2d7f-4d1c-816b-cee741d75d03 req-7875d838-d449-4345-adeb-2b94d8e1bfc2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Received unexpected event network-vif-plugged-4de545f7-326a-4971-87cd-a23be2cbce6a for instance with vm_state active and task_state None.#033[00m
Jan 20 09:44:51 np0005588920 nova_compute[226886]: 2026-01-20 14:44:51.847 226890 DEBUG oslo_concurrency.lockutils [None req-30f83f89-1b3c-4746-bdef-3ce4ccda7357 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "cc7de61a-b40f-4367-873d-c51b6f29310b" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:51 np0005588920 nova_compute[226886]: 2026-01-20 14:44:51.847 226890 DEBUG oslo_concurrency.lockutils [None req-30f83f89-1b3c-4746-bdef-3ce4ccda7357 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "cc7de61a-b40f-4367-873d-c51b6f29310b" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:51 np0005588920 nova_compute[226886]: 2026-01-20 14:44:51.847 226890 DEBUG nova.compute.manager [None req-30f83f89-1b3c-4746-bdef-3ce4ccda7357 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:44:51 np0005588920 nova_compute[226886]: 2026-01-20 14:44:51.850 226890 DEBUG nova.compute.manager [None req-30f83f89-1b3c-4746-bdef-3ce4ccda7357 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m
Jan 20 09:44:51 np0005588920 nova_compute[226886]: 2026-01-20 14:44:51.851 226890 DEBUG nova.objects.instance [None req-30f83f89-1b3c-4746-bdef-3ce4ccda7357 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lazy-loading 'flavor' on Instance uuid cc7de61a-b40f-4367-873d-c51b6f29310b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:44:51 np0005588920 nova_compute[226886]: 2026-01-20 14:44:51.877 226890 DEBUG nova.virt.libvirt.driver [None req-30f83f89-1b3c-4746-bdef-3ce4ccda7357 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 20 09:44:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:52.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:52 np0005588920 nova_compute[226886]: 2026-01-20 14:44:52.299 226890 DEBUG nova.network.neutron [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Successfully updated port: 254849e1-8318-4ef7-919b-0cdab5b4bc42 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:44:52 np0005588920 nova_compute[226886]: 2026-01-20 14:44:52.316 226890 DEBUG oslo_concurrency.lockutils [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Acquiring lock "refresh_cache-b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:44:52 np0005588920 nova_compute[226886]: 2026-01-20 14:44:52.317 226890 DEBUG oslo_concurrency.lockutils [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Acquired lock "refresh_cache-b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:44:52 np0005588920 nova_compute[226886]: 2026-01-20 14:44:52.317 226890 DEBUG nova.network.neutron [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:44:52 np0005588920 nova_compute[226886]: 2026-01-20 14:44:52.481 226890 DEBUG nova.network.neutron [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:44:53 np0005588920 ovn_controller[133971]: 2026-01-20T14:44:53Z|00345|binding|INFO|Releasing lport 5527ab8d-a985-420b-9d5b-7e5d9baf7004 from this chassis (sb_readonly=0)
Jan 20 09:44:53 np0005588920 nova_compute[226886]: 2026-01-20 14:44:53.336 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:53.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:53 np0005588920 nova_compute[226886]: 2026-01-20 14:44:53.470 226890 DEBUG nova.network.neutron [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Updating instance_info_cache with network_info: [{"id": "254849e1-8318-4ef7-919b-0cdab5b4bc42", "address": "fa:16:3e:f9:20:3e", "network": {"id": "fbd5d614-a7d3-4563-913c-104506628e59", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-60721994-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b31139b2a4e49cba5e7048febf901c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap254849e1-83", "ovs_interfaceid": "254849e1-8318-4ef7-919b-0cdab5b4bc42", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:44:53 np0005588920 nova_compute[226886]: 2026-01-20 14:44:53.493 226890 DEBUG oslo_concurrency.lockutils [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Releasing lock "refresh_cache-b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:44:53 np0005588920 nova_compute[226886]: 2026-01-20 14:44:53.493 226890 DEBUG nova.compute.manager [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Instance network_info: |[{"id": "254849e1-8318-4ef7-919b-0cdab5b4bc42", "address": "fa:16:3e:f9:20:3e", "network": {"id": "fbd5d614-a7d3-4563-913c-104506628e59", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-60721994-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b31139b2a4e49cba5e7048febf901c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap254849e1-83", "ovs_interfaceid": "254849e1-8318-4ef7-919b-0cdab5b4bc42", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:44:53 np0005588920 nova_compute[226886]: 2026-01-20 14:44:53.495 226890 DEBUG nova.virt.libvirt.driver [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Start _get_guest_xml network_info=[{"id": "254849e1-8318-4ef7-919b-0cdab5b4bc42", "address": "fa:16:3e:f9:20:3e", "network": {"id": "fbd5d614-a7d3-4563-913c-104506628e59", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-60721994-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b31139b2a4e49cba5e7048febf901c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap254849e1-83", "ovs_interfaceid": "254849e1-8318-4ef7-919b-0cdab5b4bc42", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:44:53 np0005588920 nova_compute[226886]: 2026-01-20 14:44:53.499 226890 WARNING nova.virt.libvirt.driver [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:44:53 np0005588920 nova_compute[226886]: 2026-01-20 14:44:53.503 226890 DEBUG nova.virt.libvirt.host [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:44:53 np0005588920 nova_compute[226886]: 2026-01-20 14:44:53.504 226890 DEBUG nova.virt.libvirt.host [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:44:53 np0005588920 nova_compute[226886]: 2026-01-20 14:44:53.507 226890 DEBUG nova.virt.libvirt.host [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:44:53 np0005588920 nova_compute[226886]: 2026-01-20 14:44:53.507 226890 DEBUG nova.virt.libvirt.host [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:44:53 np0005588920 nova_compute[226886]: 2026-01-20 14:44:53.509 226890 DEBUG nova.virt.libvirt.driver [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:44:53 np0005588920 nova_compute[226886]: 2026-01-20 14:44:53.509 226890 DEBUG nova.virt.hardware [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:44:53 np0005588920 nova_compute[226886]: 2026-01-20 14:44:53.509 226890 DEBUG nova.virt.hardware [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:44:53 np0005588920 nova_compute[226886]: 2026-01-20 14:44:53.510 226890 DEBUG nova.virt.hardware [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:44:53 np0005588920 nova_compute[226886]: 2026-01-20 14:44:53.510 226890 DEBUG nova.virt.hardware [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:44:53 np0005588920 nova_compute[226886]: 2026-01-20 14:44:53.510 226890 DEBUG nova.virt.hardware [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:44:53 np0005588920 nova_compute[226886]: 2026-01-20 14:44:53.510 226890 DEBUG nova.virt.hardware [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:44:53 np0005588920 nova_compute[226886]: 2026-01-20 14:44:53.510 226890 DEBUG nova.virt.hardware [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:44:53 np0005588920 nova_compute[226886]: 2026-01-20 14:44:53.511 226890 DEBUG nova.virt.hardware [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:44:53 np0005588920 nova_compute[226886]: 2026-01-20 14:44:53.511 226890 DEBUG nova.virt.hardware [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:44:53 np0005588920 nova_compute[226886]: 2026-01-20 14:44:53.511 226890 DEBUG nova.virt.hardware [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:44:53 np0005588920 nova_compute[226886]: 2026-01-20 14:44:53.511 226890 DEBUG nova.virt.hardware [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:44:53 np0005588920 nova_compute[226886]: 2026-01-20 14:44:53.514 226890 DEBUG oslo_concurrency.processutils [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:44:53 np0005588920 nova_compute[226886]: 2026-01-20 14:44:53.889 226890 DEBUG nova.compute.manager [req-82f4cc5b-9955-42bb-81a2-2c03ec8c623f req-a391a362-32a9-419f-a629-63b7cb346fb5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Received event network-changed-254849e1-8318-4ef7-919b-0cdab5b4bc42 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:44:53 np0005588920 nova_compute[226886]: 2026-01-20 14:44:53.890 226890 DEBUG nova.compute.manager [req-82f4cc5b-9955-42bb-81a2-2c03ec8c623f req-a391a362-32a9-419f-a629-63b7cb346fb5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Refreshing instance network info cache due to event network-changed-254849e1-8318-4ef7-919b-0cdab5b4bc42. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:44:53 np0005588920 nova_compute[226886]: 2026-01-20 14:44:53.891 226890 DEBUG oslo_concurrency.lockutils [req-82f4cc5b-9955-42bb-81a2-2c03ec8c623f req-a391a362-32a9-419f-a629-63b7cb346fb5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:44:53 np0005588920 nova_compute[226886]: 2026-01-20 14:44:53.891 226890 DEBUG oslo_concurrency.lockutils [req-82f4cc5b-9955-42bb-81a2-2c03ec8c623f req-a391a362-32a9-419f-a629-63b7cb346fb5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:44:53 np0005588920 nova_compute[226886]: 2026-01-20 14:44:53.891 226890 DEBUG nova.network.neutron [req-82f4cc5b-9955-42bb-81a2-2c03ec8c623f req-a391a362-32a9-419f-a629-63b7cb346fb5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Refreshing network info cache for port 254849e1-8318-4ef7-919b-0cdab5b4bc42 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:44:53 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:44:53 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/366812670' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:44:53 np0005588920 nova_compute[226886]: 2026-01-20 14:44:53.963 226890 DEBUG oslo_concurrency.processutils [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:44:53 np0005588920 nova_compute[226886]: 2026-01-20 14:44:53.989 226890 DEBUG nova.storage.rbd_utils [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] rbd image b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:44:53 np0005588920 nova_compute[226886]: 2026-01-20 14:44:53.993 226890 DEBUG oslo_concurrency.processutils [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:44:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:54.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:54 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:44:54 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/733160525' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:44:54 np0005588920 nova_compute[226886]: 2026-01-20 14:44:54.414 226890 DEBUG oslo_concurrency.processutils [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.420s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:44:54 np0005588920 nova_compute[226886]: 2026-01-20 14:44:54.416 226890 DEBUG nova.virt.libvirt.vif [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:44:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1005928532',display_name='tempest-DeleteServersTestJSON-server-1005928532',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1005928532',id=92,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3b31139b2a4e49cba5e7048febf901c4',ramdisk_id='',reservation_id='r-j0ouxdjr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-1162922273',owner_user_name='tempest-DeleteServersTestJS
ON-1162922273-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:44:45Z,user_data=None,user_id='37e9ef97fbe0448e9fbe32d48b66211f',uuid=b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "254849e1-8318-4ef7-919b-0cdab5b4bc42", "address": "fa:16:3e:f9:20:3e", "network": {"id": "fbd5d614-a7d3-4563-913c-104506628e59", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-60721994-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b31139b2a4e49cba5e7048febf901c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap254849e1-83", "ovs_interfaceid": "254849e1-8318-4ef7-919b-0cdab5b4bc42", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:44:54 np0005588920 nova_compute[226886]: 2026-01-20 14:44:54.417 226890 DEBUG nova.network.os_vif_util [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Converting VIF {"id": "254849e1-8318-4ef7-919b-0cdab5b4bc42", "address": "fa:16:3e:f9:20:3e", "network": {"id": "fbd5d614-a7d3-4563-913c-104506628e59", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-60721994-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b31139b2a4e49cba5e7048febf901c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap254849e1-83", "ovs_interfaceid": "254849e1-8318-4ef7-919b-0cdab5b4bc42", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:44:54 np0005588920 nova_compute[226886]: 2026-01-20 14:44:54.418 226890 DEBUG nova.network.os_vif_util [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f9:20:3e,bridge_name='br-int',has_traffic_filtering=True,id=254849e1-8318-4ef7-919b-0cdab5b4bc42,network=Network(fbd5d614-a7d3-4563-913c-104506628e59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap254849e1-83') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:44:54 np0005588920 nova_compute[226886]: 2026-01-20 14:44:54.420 226890 DEBUG nova.objects.instance [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lazy-loading 'pci_devices' on Instance uuid b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:44:54 np0005588920 nova_compute[226886]: 2026-01-20 14:44:54.436 226890 DEBUG nova.virt.libvirt.driver [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:44:54 np0005588920 nova_compute[226886]:  <uuid>b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2</uuid>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:  <name>instance-0000005c</name>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:44:54 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:      <nova:name>tempest-DeleteServersTestJSON-server-1005928532</nova:name>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:44:53</nova:creationTime>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:44:54 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:        <nova:user uuid="37e9ef97fbe0448e9fbe32d48b66211f">tempest-DeleteServersTestJSON-1162922273-project-member</nova:user>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:        <nova:project uuid="3b31139b2a4e49cba5e7048febf901c4">tempest-DeleteServersTestJSON-1162922273</nova:project>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:        <nova:port uuid="254849e1-8318-4ef7-919b-0cdab5b4bc42">
Jan 20 09:44:54 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:      <entry name="serial">b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2</entry>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:      <entry name="uuid">b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2</entry>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:44:54 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2_disk">
Jan 20 09:44:54 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:44:54 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:44:54 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2_disk.config">
Jan 20 09:44:54 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:44:54 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:44:54 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:f9:20:3e"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:      <target dev="tap254849e1-83"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:44:54 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2/console.log" append="off"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:44:54 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:44:54 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:44:54 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:44:54 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:44:54 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:44:54 np0005588920 nova_compute[226886]: 2026-01-20 14:44:54.444 226890 DEBUG nova.compute.manager [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Preparing to wait for external event network-vif-plugged-254849e1-8318-4ef7-919b-0cdab5b4bc42 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:44:54 np0005588920 nova_compute[226886]: 2026-01-20 14:44:54.445 226890 DEBUG oslo_concurrency.lockutils [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Acquiring lock "b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:54 np0005588920 nova_compute[226886]: 2026-01-20 14:44:54.445 226890 DEBUG oslo_concurrency.lockutils [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:54 np0005588920 nova_compute[226886]: 2026-01-20 14:44:54.445 226890 DEBUG oslo_concurrency.lockutils [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:54 np0005588920 nova_compute[226886]: 2026-01-20 14:44:54.446 226890 DEBUG nova.virt.libvirt.vif [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:44:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1005928532',display_name='tempest-DeleteServersTestJSON-server-1005928532',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1005928532',id=92,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3b31139b2a4e49cba5e7048febf901c4',ramdisk_id='',reservation_id='r-j0ouxdjr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-1162922273',owner_user_name='tempest-DeleteSer
versTestJSON-1162922273-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:44:45Z,user_data=None,user_id='37e9ef97fbe0448e9fbe32d48b66211f',uuid=b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "254849e1-8318-4ef7-919b-0cdab5b4bc42", "address": "fa:16:3e:f9:20:3e", "network": {"id": "fbd5d614-a7d3-4563-913c-104506628e59", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-60721994-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b31139b2a4e49cba5e7048febf901c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap254849e1-83", "ovs_interfaceid": "254849e1-8318-4ef7-919b-0cdab5b4bc42", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:44:54 np0005588920 nova_compute[226886]: 2026-01-20 14:44:54.447 226890 DEBUG nova.network.os_vif_util [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Converting VIF {"id": "254849e1-8318-4ef7-919b-0cdab5b4bc42", "address": "fa:16:3e:f9:20:3e", "network": {"id": "fbd5d614-a7d3-4563-913c-104506628e59", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-60721994-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b31139b2a4e49cba5e7048febf901c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap254849e1-83", "ovs_interfaceid": "254849e1-8318-4ef7-919b-0cdab5b4bc42", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:44:54 np0005588920 nova_compute[226886]: 2026-01-20 14:44:54.448 226890 DEBUG nova.network.os_vif_util [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f9:20:3e,bridge_name='br-int',has_traffic_filtering=True,id=254849e1-8318-4ef7-919b-0cdab5b4bc42,network=Network(fbd5d614-a7d3-4563-913c-104506628e59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap254849e1-83') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:44:54 np0005588920 nova_compute[226886]: 2026-01-20 14:44:54.448 226890 DEBUG os_vif [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f9:20:3e,bridge_name='br-int',has_traffic_filtering=True,id=254849e1-8318-4ef7-919b-0cdab5b4bc42,network=Network(fbd5d614-a7d3-4563-913c-104506628e59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap254849e1-83') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:44:54 np0005588920 nova_compute[226886]: 2026-01-20 14:44:54.449 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:54 np0005588920 nova_compute[226886]: 2026-01-20 14:44:54.450 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:44:54 np0005588920 nova_compute[226886]: 2026-01-20 14:44:54.451 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:44:54 np0005588920 nova_compute[226886]: 2026-01-20 14:44:54.454 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:54 np0005588920 nova_compute[226886]: 2026-01-20 14:44:54.455 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap254849e1-83, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:44:54 np0005588920 nova_compute[226886]: 2026-01-20 14:44:54.455 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap254849e1-83, col_values=(('external_ids', {'iface-id': '254849e1-8318-4ef7-919b-0cdab5b4bc42', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f9:20:3e', 'vm-uuid': 'b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:44:54 np0005588920 NetworkManager[49076]: <info>  [1768920294.4580] manager: (tap254849e1-83): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/183)
Jan 20 09:44:54 np0005588920 nova_compute[226886]: 2026-01-20 14:44:54.460 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:44:54 np0005588920 nova_compute[226886]: 2026-01-20 14:44:54.463 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:54 np0005588920 nova_compute[226886]: 2026-01-20 14:44:54.465 226890 INFO os_vif [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f9:20:3e,bridge_name='br-int',has_traffic_filtering=True,id=254849e1-8318-4ef7-919b-0cdab5b4bc42,network=Network(fbd5d614-a7d3-4563-913c-104506628e59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap254849e1-83')#033[00m
Jan 20 09:44:54 np0005588920 nova_compute[226886]: 2026-01-20 14:44:54.555 226890 DEBUG nova.virt.libvirt.driver [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:44:54 np0005588920 nova_compute[226886]: 2026-01-20 14:44:54.555 226890 DEBUG nova.virt.libvirt.driver [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:44:54 np0005588920 nova_compute[226886]: 2026-01-20 14:44:54.556 226890 DEBUG nova.virt.libvirt.driver [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] No VIF found with MAC fa:16:3e:f9:20:3e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:44:54 np0005588920 nova_compute[226886]: 2026-01-20 14:44:54.556 226890 INFO nova.virt.libvirt.driver [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Using config drive#033[00m
Jan 20 09:44:54 np0005588920 nova_compute[226886]: 2026-01-20 14:44:54.580 226890 DEBUG nova.storage.rbd_utils [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] rbd image b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:44:55 np0005588920 nova_compute[226886]: 2026-01-20 14:44:55.218 226890 INFO nova.virt.libvirt.driver [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Creating config drive at /var/lib/nova/instances/b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2/disk.config#033[00m
Jan 20 09:44:55 np0005588920 nova_compute[226886]: 2026-01-20 14:44:55.226 226890 DEBUG oslo_concurrency.processutils [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpe_s086dt execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:44:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:55.330 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:44:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:55.333 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:44:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:55.334 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:44:55 np0005588920 nova_compute[226886]: 2026-01-20 14:44:55.376 226890 DEBUG oslo_concurrency.processutils [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpe_s086dt" returned: 0 in 0.150s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:44:55 np0005588920 nova_compute[226886]: 2026-01-20 14:44:55.377 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:55.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:55 np0005588920 nova_compute[226886]: 2026-01-20 14:44:55.454 226890 DEBUG nova.storage.rbd_utils [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] rbd image b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:44:55 np0005588920 nova_compute[226886]: 2026-01-20 14:44:55.458 226890 DEBUG oslo_concurrency.processutils [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2/disk.config b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:44:55 np0005588920 nova_compute[226886]: 2026-01-20 14:44:55.481 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:55 np0005588920 nova_compute[226886]: 2026-01-20 14:44:55.617 226890 DEBUG oslo_concurrency.processutils [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2/disk.config b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.158s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:44:55 np0005588920 nova_compute[226886]: 2026-01-20 14:44:55.618 226890 INFO nova.virt.libvirt.driver [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Deleting local config drive /var/lib/nova/instances/b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2/disk.config because it was imported into RBD.#033[00m
Jan 20 09:44:55 np0005588920 nova_compute[226886]: 2026-01-20 14:44:55.641 226890 DEBUG nova.network.neutron [req-82f4cc5b-9955-42bb-81a2-2c03ec8c623f req-a391a362-32a9-419f-a629-63b7cb346fb5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Updated VIF entry in instance network info cache for port 254849e1-8318-4ef7-919b-0cdab5b4bc42. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:44:55 np0005588920 nova_compute[226886]: 2026-01-20 14:44:55.642 226890 DEBUG nova.network.neutron [req-82f4cc5b-9955-42bb-81a2-2c03ec8c623f req-a391a362-32a9-419f-a629-63b7cb346fb5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Updating instance_info_cache with network_info: [{"id": "254849e1-8318-4ef7-919b-0cdab5b4bc42", "address": "fa:16:3e:f9:20:3e", "network": {"id": "fbd5d614-a7d3-4563-913c-104506628e59", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-60721994-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b31139b2a4e49cba5e7048febf901c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap254849e1-83", "ovs_interfaceid": "254849e1-8318-4ef7-919b-0cdab5b4bc42", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:44:55 np0005588920 nova_compute[226886]: 2026-01-20 14:44:55.665 226890 DEBUG oslo_concurrency.lockutils [req-82f4cc5b-9955-42bb-81a2-2c03ec8c623f req-a391a362-32a9-419f-a629-63b7cb346fb5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:44:55 np0005588920 kernel: tap254849e1-83: entered promiscuous mode
Jan 20 09:44:55 np0005588920 NetworkManager[49076]: <info>  [1768920295.6825] manager: (tap254849e1-83): new Tun device (/org/freedesktop/NetworkManager/Devices/184)
Jan 20 09:44:55 np0005588920 ovn_controller[133971]: 2026-01-20T14:44:55Z|00346|binding|INFO|Claiming lport 254849e1-8318-4ef7-919b-0cdab5b4bc42 for this chassis.
Jan 20 09:44:55 np0005588920 ovn_controller[133971]: 2026-01-20T14:44:55Z|00347|binding|INFO|254849e1-8318-4ef7-919b-0cdab5b4bc42: Claiming fa:16:3e:f9:20:3e 10.100.0.13
Jan 20 09:44:55 np0005588920 nova_compute[226886]: 2026-01-20 14:44:55.684 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:55.694 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f9:20:3e 10.100.0.13'], port_security=['fa:16:3e:f9:20:3e 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbd5d614-a7d3-4563-913c-104506628e59', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3b31139b2a4e49cba5e7048febf901c4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '117d6f57-074c-4b36-b375-42e0ab117254', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c42c6982-be52-495a-8746-42a46932572f, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=254849e1-8318-4ef7-919b-0cdab5b4bc42) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:44:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:55.695 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 254849e1-8318-4ef7-919b-0cdab5b4bc42 in datapath fbd5d614-a7d3-4563-913c-104506628e59 bound to our chassis#033[00m
Jan 20 09:44:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:55.696 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fbd5d614-a7d3-4563-913c-104506628e59#033[00m
Jan 20 09:44:55 np0005588920 ovn_controller[133971]: 2026-01-20T14:44:55Z|00348|binding|INFO|Setting lport 254849e1-8318-4ef7-919b-0cdab5b4bc42 ovn-installed in OVS
Jan 20 09:44:55 np0005588920 ovn_controller[133971]: 2026-01-20T14:44:55Z|00349|binding|INFO|Setting lport 254849e1-8318-4ef7-919b-0cdab5b4bc42 up in Southbound
Jan 20 09:44:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:55.713 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[56291585-b0cd-41e9-bd13-248f5442a4fc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:55.714 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfbd5d614-a1 in ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:44:55 np0005588920 nova_compute[226886]: 2026-01-20 14:44:55.714 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:55 np0005588920 nova_compute[226886]: 2026-01-20 14:44:55.718 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:55.717 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfbd5d614-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:44:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:55.717 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[48ca655b-63c6-471d-92d8-b167937b6149]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:55.721 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[06b71634-74db-4830-a8bd-1aa779515363]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:55 np0005588920 systemd-udevd[258303]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:44:55 np0005588920 systemd-machined[196121]: New machine qemu-38-instance-0000005c.
Jan 20 09:44:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:55.743 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[810f1950-2271-48f8-b76f-d6dc371ef96a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:55 np0005588920 NetworkManager[49076]: <info>  [1768920295.7466] device (tap254849e1-83): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:44:55 np0005588920 NetworkManager[49076]: <info>  [1768920295.7482] device (tap254849e1-83): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:44:55 np0005588920 systemd[1]: Started Virtual Machine qemu-38-instance-0000005c.
Jan 20 09:44:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:55.757 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a098660f-3496-44c8-b331-8fd98d156ac4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:55.792 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[8243669f-93a7-418a-864e-83ebd3ea76c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:55 np0005588920 systemd-udevd[258307]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:44:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:55.797 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f1ba1460-4253-44df-b7ec-2d2eeaddf797]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:55 np0005588920 NetworkManager[49076]: <info>  [1768920295.7994] manager: (tapfbd5d614-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/185)
Jan 20 09:44:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:55.826 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[13c88885-8850-456c-9fc3-578269bc4486]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:55.829 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[93d46007-3ec4-43fb-a485-d406af8f9a9f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:55 np0005588920 NetworkManager[49076]: <info>  [1768920295.8516] device (tapfbd5d614-a0): carrier: link connected
Jan 20 09:44:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:55.857 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[2d40bc3f-ca68-426f-a4b3-fd164c012702]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:55.877 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[aa488122-6e54-47f5-bb3a-7c03f8b44cc3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbd5d614-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5c:38:be'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 114], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 533753, 'reachable_time': 42138, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 258336, 'error': None, 'target': 'ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:55.895 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[be3382cb-620a-4c25-ac9a-73d6bf57687c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5c:38be'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 533753, 'tstamp': 533753}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 258337, 'error': None, 'target': 'ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:55.910 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ab2b6c86-945c-4ec7-b5da-aac5aa7bfe6c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbd5d614-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5c:38:be'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 114], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 533753, 'reachable_time': 42138, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 258338, 'error': None, 'target': 'ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:55.940 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b79ae3fd-a8bd-451a-8606-2b3c8f6f168f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:56 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:56.000 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[1659be3c-abac-4fa1-a9c7-cc1fac71c7cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:56 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:56.002 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbd5d614-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:44:56 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:56.002 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:44:56 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:56.003 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfbd5d614-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:44:56 np0005588920 nova_compute[226886]: 2026-01-20 14:44:56.004 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:56 np0005588920 kernel: tapfbd5d614-a0: entered promiscuous mode
Jan 20 09:44:56 np0005588920 NetworkManager[49076]: <info>  [1768920296.0052] manager: (tapfbd5d614-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/186)
Jan 20 09:44:56 np0005588920 nova_compute[226886]: 2026-01-20 14:44:56.006 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:56 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:56.009 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfbd5d614-a0, col_values=(('external_ids', {'iface-id': 'b370b74e-dca0-4ff7-a96f-85b392e20721'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:44:56 np0005588920 nova_compute[226886]: 2026-01-20 14:44:56.010 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:56 np0005588920 ovn_controller[133971]: 2026-01-20T14:44:56Z|00350|binding|INFO|Releasing lport b370b74e-dca0-4ff7-a96f-85b392e20721 from this chassis (sb_readonly=0)
Jan 20 09:44:56 np0005588920 nova_compute[226886]: 2026-01-20 14:44:56.011 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:56 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:56.013 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fbd5d614-a7d3-4563-913c-104506628e59.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fbd5d614-a7d3-4563-913c-104506628e59.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:44:56 np0005588920 nova_compute[226886]: 2026-01-20 14:44:56.024 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:56 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:56.023 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[9e91510d-75fa-41c2-8960-6ca7b62143cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:56 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:56.026 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:44:56 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 09:44:56 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 09:44:56 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-fbd5d614-a7d3-4563-913c-104506628e59
Jan 20 09:44:56 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 09:44:56 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 09:44:56 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 09:44:56 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/fbd5d614-a7d3-4563-913c-104506628e59.pid.haproxy
Jan 20 09:44:56 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 09:44:56 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:44:56 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 09:44:56 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 09:44:56 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 09:44:56 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 09:44:56 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 09:44:56 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 09:44:56 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 09:44:56 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 09:44:56 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 09:44:56 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 09:44:56 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 09:44:56 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 09:44:56 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 09:44:56 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:44:56 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:44:56 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 09:44:56 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 09:44:56 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:44:56 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID fbd5d614-a7d3-4563-913c-104506628e59
Jan 20 09:44:56 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:44:56 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:44:56.026 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59', 'env', 'PROCESS_TAG=haproxy-fbd5d614-a7d3-4563-913c-104506628e59', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fbd5d614-a7d3-4563-913c-104506628e59.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:44:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:56.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:56 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:44:56 np0005588920 podman[258386]: 2026-01-20 14:44:56.366387207 +0000 UTC m=+0.026381006 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:44:56 np0005588920 podman[258386]: 2026-01-20 14:44:56.538861405 +0000 UTC m=+0.198855154 container create cc049919cf352e7c602972e087d24123a805b23e36ab3de8c916a79cb596ca84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:44:56 np0005588920 systemd[1]: Started libpod-conmon-cc049919cf352e7c602972e087d24123a805b23e36ab3de8c916a79cb596ca84.scope.
Jan 20 09:44:56 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:44:56 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/190c992ec9d7a57881e9cbcc0cf24463c658a32e0b826be30c5219acf208c1fc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:44:56 np0005588920 nova_compute[226886]: 2026-01-20 14:44:56.622 226890 DEBUG nova.compute.manager [req-1f880e6b-0cce-4310-a256-1ab57517c449 req-416a224a-068e-4153-beb2-d8aa2e2823aa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Received event network-vif-plugged-254849e1-8318-4ef7-919b-0cdab5b4bc42 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:44:56 np0005588920 nova_compute[226886]: 2026-01-20 14:44:56.622 226890 DEBUG oslo_concurrency.lockutils [req-1f880e6b-0cce-4310-a256-1ab57517c449 req-416a224a-068e-4153-beb2-d8aa2e2823aa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:56 np0005588920 nova_compute[226886]: 2026-01-20 14:44:56.623 226890 DEBUG oslo_concurrency.lockutils [req-1f880e6b-0cce-4310-a256-1ab57517c449 req-416a224a-068e-4153-beb2-d8aa2e2823aa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:56 np0005588920 nova_compute[226886]: 2026-01-20 14:44:56.623 226890 DEBUG oslo_concurrency.lockutils [req-1f880e6b-0cce-4310-a256-1ab57517c449 req-416a224a-068e-4153-beb2-d8aa2e2823aa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:56 np0005588920 nova_compute[226886]: 2026-01-20 14:44:56.623 226890 DEBUG nova.compute.manager [req-1f880e6b-0cce-4310-a256-1ab57517c449 req-416a224a-068e-4153-beb2-d8aa2e2823aa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Processing event network-vif-plugged-254849e1-8318-4ef7-919b-0cdab5b4bc42 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:44:56 np0005588920 podman[258386]: 2026-01-20 14:44:56.625887681 +0000 UTC m=+0.285881450 container init cc049919cf352e7c602972e087d24123a805b23e36ab3de8c916a79cb596ca84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:44:56 np0005588920 podman[258386]: 2026-01-20 14:44:56.632711291 +0000 UTC m=+0.292705040 container start cc049919cf352e7c602972e087d24123a805b23e36ab3de8c916a79cb596ca84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:44:56 np0005588920 neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59[258421]: [NOTICE]   (258429) : New worker (258432) forked
Jan 20 09:44:56 np0005588920 neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59[258421]: [NOTICE]   (258429) : Loading success.
Jan 20 09:44:56 np0005588920 nova_compute[226886]: 2026-01-20 14:44:56.681 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920296.6805081, b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:44:56 np0005588920 nova_compute[226886]: 2026-01-20 14:44:56.681 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] VM Started (Lifecycle Event)#033[00m
Jan 20 09:44:56 np0005588920 nova_compute[226886]: 2026-01-20 14:44:56.684 226890 DEBUG nova.compute.manager [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:44:56 np0005588920 nova_compute[226886]: 2026-01-20 14:44:56.693 226890 DEBUG nova.virt.libvirt.driver [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:44:56 np0005588920 nova_compute[226886]: 2026-01-20 14:44:56.696 226890 INFO nova.virt.libvirt.driver [-] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Instance spawned successfully.#033[00m
Jan 20 09:44:56 np0005588920 nova_compute[226886]: 2026-01-20 14:44:56.696 226890 DEBUG nova.virt.libvirt.driver [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:44:56 np0005588920 nova_compute[226886]: 2026-01-20 14:44:56.714 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:44:56 np0005588920 nova_compute[226886]: 2026-01-20 14:44:56.719 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:44:56 np0005588920 nova_compute[226886]: 2026-01-20 14:44:56.723 226890 DEBUG nova.virt.libvirt.driver [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:44:56 np0005588920 nova_compute[226886]: 2026-01-20 14:44:56.723 226890 DEBUG nova.virt.libvirt.driver [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:44:56 np0005588920 nova_compute[226886]: 2026-01-20 14:44:56.724 226890 DEBUG nova.virt.libvirt.driver [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:44:56 np0005588920 nova_compute[226886]: 2026-01-20 14:44:56.724 226890 DEBUG nova.virt.libvirt.driver [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:44:56 np0005588920 nova_compute[226886]: 2026-01-20 14:44:56.725 226890 DEBUG nova.virt.libvirt.driver [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:44:56 np0005588920 nova_compute[226886]: 2026-01-20 14:44:56.725 226890 DEBUG nova.virt.libvirt.driver [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:44:56 np0005588920 nova_compute[226886]: 2026-01-20 14:44:56.773 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:44:56 np0005588920 nova_compute[226886]: 2026-01-20 14:44:56.773 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920296.6806598, b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:44:56 np0005588920 nova_compute[226886]: 2026-01-20 14:44:56.773 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:44:56 np0005588920 nova_compute[226886]: 2026-01-20 14:44:56.796 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:44:56 np0005588920 nova_compute[226886]: 2026-01-20 14:44:56.799 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920296.692893, b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:44:56 np0005588920 nova_compute[226886]: 2026-01-20 14:44:56.799 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:44:56 np0005588920 nova_compute[226886]: 2026-01-20 14:44:56.813 226890 INFO nova.compute.manager [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Took 11.34 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:44:56 np0005588920 nova_compute[226886]: 2026-01-20 14:44:56.813 226890 DEBUG nova.compute.manager [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:44:56 np0005588920 nova_compute[226886]: 2026-01-20 14:44:56.849 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:44:56 np0005588920 nova_compute[226886]: 2026-01-20 14:44:56.853 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:44:56 np0005588920 nova_compute[226886]: 2026-01-20 14:44:56.878 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:44:56 np0005588920 nova_compute[226886]: 2026-01-20 14:44:56.897 226890 INFO nova.compute.manager [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Took 12.35 seconds to build instance.#033[00m
Jan 20 09:44:56 np0005588920 nova_compute[226886]: 2026-01-20 14:44:56.921 226890 DEBUG oslo_concurrency.lockutils [None req-8f97a5d1-f3a4-4411-a0b7-a87366f728e0 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.567s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:57.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:57 np0005588920 nova_compute[226886]: 2026-01-20 14:44:57.879 226890 INFO nova.compute.manager [None req-a94881fb-f6c8-4bb9-8325-c1676e3773fa 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Pausing#033[00m
Jan 20 09:44:57 np0005588920 nova_compute[226886]: 2026-01-20 14:44:57.880 226890 DEBUG nova.objects.instance [None req-a94881fb-f6c8-4bb9-8325-c1676e3773fa 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lazy-loading 'flavor' on Instance uuid b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:44:57 np0005588920 nova_compute[226886]: 2026-01-20 14:44:57.901 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920297.9010727, b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:44:57 np0005588920 nova_compute[226886]: 2026-01-20 14:44:57.901 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:44:57 np0005588920 nova_compute[226886]: 2026-01-20 14:44:57.903 226890 DEBUG nova.compute.manager [None req-a94881fb-f6c8-4bb9-8325-c1676e3773fa 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:44:57 np0005588920 nova_compute[226886]: 2026-01-20 14:44:57.926 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:44:57 np0005588920 nova_compute[226886]: 2026-01-20 14:44:57.932 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:44:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:58.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:58 np0005588920 nova_compute[226886]: 2026-01-20 14:44:58.706 226890 DEBUG nova.compute.manager [req-56ebfeee-8636-49cd-a1d3-4d717d2a29d1 req-d5a04daf-b4e7-46db-b3db-9c5dacae30bb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Received event network-vif-plugged-254849e1-8318-4ef7-919b-0cdab5b4bc42 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:44:58 np0005588920 nova_compute[226886]: 2026-01-20 14:44:58.707 226890 DEBUG oslo_concurrency.lockutils [req-56ebfeee-8636-49cd-a1d3-4d717d2a29d1 req-d5a04daf-b4e7-46db-b3db-9c5dacae30bb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:58 np0005588920 nova_compute[226886]: 2026-01-20 14:44:58.708 226890 DEBUG oslo_concurrency.lockutils [req-56ebfeee-8636-49cd-a1d3-4d717d2a29d1 req-d5a04daf-b4e7-46db-b3db-9c5dacae30bb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:58 np0005588920 nova_compute[226886]: 2026-01-20 14:44:58.708 226890 DEBUG oslo_concurrency.lockutils [req-56ebfeee-8636-49cd-a1d3-4d717d2a29d1 req-d5a04daf-b4e7-46db-b3db-9c5dacae30bb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:58 np0005588920 nova_compute[226886]: 2026-01-20 14:44:58.708 226890 DEBUG nova.compute.manager [req-56ebfeee-8636-49cd-a1d3-4d717d2a29d1 req-d5a04daf-b4e7-46db-b3db-9c5dacae30bb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] No waiting events found dispatching network-vif-plugged-254849e1-8318-4ef7-919b-0cdab5b4bc42 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:44:58 np0005588920 nova_compute[226886]: 2026-01-20 14:44:58.708 226890 WARNING nova.compute.manager [req-56ebfeee-8636-49cd-a1d3-4d717d2a29d1 req-d5a04daf-b4e7-46db-b3db-9c5dacae30bb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Received unexpected event network-vif-plugged-254849e1-8318-4ef7-919b-0cdab5b4bc42 for instance with vm_state paused and task_state None.#033[00m
Jan 20 09:44:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:44:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:44:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:59.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:44:59 np0005588920 nova_compute[226886]: 2026-01-20 14:44:59.459 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:00.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:00 np0005588920 nova_compute[226886]: 2026-01-20 14:45:00.441 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:01 np0005588920 nova_compute[226886]: 2026-01-20 14:45:01.041 226890 DEBUG oslo_concurrency.lockutils [None req-a26f264c-bb96-4ad6-98b6-6d5ca96e6f68 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Acquiring lock "b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:45:01 np0005588920 nova_compute[226886]: 2026-01-20 14:45:01.041 226890 DEBUG oslo_concurrency.lockutils [None req-a26f264c-bb96-4ad6-98b6-6d5ca96e6f68 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:45:01 np0005588920 nova_compute[226886]: 2026-01-20 14:45:01.042 226890 DEBUG oslo_concurrency.lockutils [None req-a26f264c-bb96-4ad6-98b6-6d5ca96e6f68 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Acquiring lock "b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:45:01 np0005588920 nova_compute[226886]: 2026-01-20 14:45:01.042 226890 DEBUG oslo_concurrency.lockutils [None req-a26f264c-bb96-4ad6-98b6-6d5ca96e6f68 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:45:01 np0005588920 nova_compute[226886]: 2026-01-20 14:45:01.043 226890 DEBUG oslo_concurrency.lockutils [None req-a26f264c-bb96-4ad6-98b6-6d5ca96e6f68 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:45:01 np0005588920 nova_compute[226886]: 2026-01-20 14:45:01.045 226890 INFO nova.compute.manager [None req-a26f264c-bb96-4ad6-98b6-6d5ca96e6f68 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Terminating instance#033[00m
Jan 20 09:45:01 np0005588920 nova_compute[226886]: 2026-01-20 14:45:01.047 226890 DEBUG nova.compute.manager [None req-a26f264c-bb96-4ad6-98b6-6d5ca96e6f68 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:45:01 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:45:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:45:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:01.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:45:01 np0005588920 nova_compute[226886]: 2026-01-20 14:45:01.926 226890 DEBUG nova.virt.libvirt.driver [None req-30f83f89-1b3c-4746-bdef-3ce4ccda7357 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 20 09:45:02 np0005588920 kernel: tap254849e1-83 (unregistering): left promiscuous mode
Jan 20 09:45:02 np0005588920 NetworkManager[49076]: <info>  [1768920302.0490] device (tap254849e1-83): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:45:02 np0005588920 nova_compute[226886]: 2026-01-20 14:45:02.055 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:02 np0005588920 ovn_controller[133971]: 2026-01-20T14:45:02Z|00351|binding|INFO|Releasing lport 254849e1-8318-4ef7-919b-0cdab5b4bc42 from this chassis (sb_readonly=0)
Jan 20 09:45:02 np0005588920 ovn_controller[133971]: 2026-01-20T14:45:02Z|00352|binding|INFO|Setting lport 254849e1-8318-4ef7-919b-0cdab5b4bc42 down in Southbound
Jan 20 09:45:02 np0005588920 ovn_controller[133971]: 2026-01-20T14:45:02Z|00353|binding|INFO|Removing iface tap254849e1-83 ovn-installed in OVS
Jan 20 09:45:02 np0005588920 nova_compute[226886]: 2026-01-20 14:45:02.060 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:02.064 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f9:20:3e 10.100.0.13'], port_security=['fa:16:3e:f9:20:3e 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbd5d614-a7d3-4563-913c-104506628e59', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3b31139b2a4e49cba5e7048febf901c4', 'neutron:revision_number': '4', 'neutron:security_group_ids': '117d6f57-074c-4b36-b375-42e0ab117254', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c42c6982-be52-495a-8746-42a46932572f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=254849e1-8318-4ef7-919b-0cdab5b4bc42) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:45:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:02.066 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 254849e1-8318-4ef7-919b-0cdab5b4bc42 in datapath fbd5d614-a7d3-4563-913c-104506628e59 unbound from our chassis#033[00m
Jan 20 09:45:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:02.068 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fbd5d614-a7d3-4563-913c-104506628e59, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:45:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:02.069 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2a8bb290-9773-403c-bab3-958d363953ed]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:02.070 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59 namespace which is not needed anymore#033[00m
Jan 20 09:45:02 np0005588920 nova_compute[226886]: 2026-01-20 14:45:02.078 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:02 np0005588920 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d0000005c.scope: Deactivated successfully.
Jan 20 09:45:02 np0005588920 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d0000005c.scope: Consumed 1.903s CPU time.
Jan 20 09:45:02 np0005588920 systemd-machined[196121]: Machine qemu-38-instance-0000005c terminated.
Jan 20 09:45:02 np0005588920 podman[258443]: 2026-01-20 14:45:02.182748094 +0000 UTC m=+0.104077031 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Jan 20 09:45:02 np0005588920 neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59[258421]: [NOTICE]   (258429) : haproxy version is 2.8.14-c23fe91
Jan 20 09:45:02 np0005588920 neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59[258421]: [NOTICE]   (258429) : path to executable is /usr/sbin/haproxy
Jan 20 09:45:02 np0005588920 neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59[258421]: [WARNING]  (258429) : Exiting Master process...
Jan 20 09:45:02 np0005588920 neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59[258421]: [ALERT]    (258429) : Current worker (258432) exited with code 143 (Terminated)
Jan 20 09:45:02 np0005588920 neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59[258421]: [WARNING]  (258429) : All workers exited. Exiting... (0)
Jan 20 09:45:02 np0005588920 systemd[1]: libpod-cc049919cf352e7c602972e087d24123a805b23e36ab3de8c916a79cb596ca84.scope: Deactivated successfully.
Jan 20 09:45:02 np0005588920 podman[258484]: 2026-01-20 14:45:02.207906775 +0000 UTC m=+0.048964255 container died cc049919cf352e7c602972e087d24123a805b23e36ab3de8c916a79cb596ca84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:45:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:02.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:02 np0005588920 nova_compute[226886]: 2026-01-20 14:45:02.278 226890 INFO nova.virt.libvirt.driver [-] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Instance destroyed successfully.#033[00m
Jan 20 09:45:02 np0005588920 nova_compute[226886]: 2026-01-20 14:45:02.279 226890 DEBUG nova.objects.instance [None req-a26f264c-bb96-4ad6-98b6-6d5ca96e6f68 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lazy-loading 'resources' on Instance uuid b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:45:02 np0005588920 nova_compute[226886]: 2026-01-20 14:45:02.295 226890 DEBUG nova.virt.libvirt.vif [None req-a26f264c-bb96-4ad6-98b6-6d5ca96e6f68 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:44:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1005928532',display_name='tempest-DeleteServersTestJSON-server-1005928532',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1005928532',id=92,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:44:56Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=3,progress=0,project_id='3b31139b2a4e49cba5e7048febf901c4',ramdisk_id='',reservation_id='r-j0ouxdjr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_
min_disk='1',image_min_ram='0',owner_project_name='tempest-DeleteServersTestJSON-1162922273',owner_user_name='tempest-DeleteServersTestJSON-1162922273-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:44:57Z,user_data=None,user_id='37e9ef97fbe0448e9fbe32d48b66211f',uuid=b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='paused') vif={"id": "254849e1-8318-4ef7-919b-0cdab5b4bc42", "address": "fa:16:3e:f9:20:3e", "network": {"id": "fbd5d614-a7d3-4563-913c-104506628e59", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-60721994-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b31139b2a4e49cba5e7048febf901c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap254849e1-83", "ovs_interfaceid": "254849e1-8318-4ef7-919b-0cdab5b4bc42", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:45:02 np0005588920 nova_compute[226886]: 2026-01-20 14:45:02.296 226890 DEBUG nova.network.os_vif_util [None req-a26f264c-bb96-4ad6-98b6-6d5ca96e6f68 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Converting VIF {"id": "254849e1-8318-4ef7-919b-0cdab5b4bc42", "address": "fa:16:3e:f9:20:3e", "network": {"id": "fbd5d614-a7d3-4563-913c-104506628e59", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-60721994-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b31139b2a4e49cba5e7048febf901c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap254849e1-83", "ovs_interfaceid": "254849e1-8318-4ef7-919b-0cdab5b4bc42", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:45:02 np0005588920 nova_compute[226886]: 2026-01-20 14:45:02.296 226890 DEBUG nova.network.os_vif_util [None req-a26f264c-bb96-4ad6-98b6-6d5ca96e6f68 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f9:20:3e,bridge_name='br-int',has_traffic_filtering=True,id=254849e1-8318-4ef7-919b-0cdab5b4bc42,network=Network(fbd5d614-a7d3-4563-913c-104506628e59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap254849e1-83') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:45:02 np0005588920 nova_compute[226886]: 2026-01-20 14:45:02.297 226890 DEBUG os_vif [None req-a26f264c-bb96-4ad6-98b6-6d5ca96e6f68 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f9:20:3e,bridge_name='br-int',has_traffic_filtering=True,id=254849e1-8318-4ef7-919b-0cdab5b4bc42,network=Network(fbd5d614-a7d3-4563-913c-104506628e59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap254849e1-83') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:45:02 np0005588920 nova_compute[226886]: 2026-01-20 14:45:02.299 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:02 np0005588920 nova_compute[226886]: 2026-01-20 14:45:02.300 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap254849e1-83, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:45:02 np0005588920 nova_compute[226886]: 2026-01-20 14:45:02.303 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:02 np0005588920 nova_compute[226886]: 2026-01-20 14:45:02.306 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:45:02 np0005588920 nova_compute[226886]: 2026-01-20 14:45:02.308 226890 INFO os_vif [None req-a26f264c-bb96-4ad6-98b6-6d5ca96e6f68 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f9:20:3e,bridge_name='br-int',has_traffic_filtering=True,id=254849e1-8318-4ef7-919b-0cdab5b4bc42,network=Network(fbd5d614-a7d3-4563-913c-104506628e59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap254849e1-83')#033[00m
Jan 20 09:45:02 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cc049919cf352e7c602972e087d24123a805b23e36ab3de8c916a79cb596ca84-userdata-shm.mount: Deactivated successfully.
Jan 20 09:45:02 np0005588920 systemd[1]: var-lib-containers-storage-overlay-190c992ec9d7a57881e9cbcc0cf24463c658a32e0b826be30c5219acf208c1fc-merged.mount: Deactivated successfully.
Jan 20 09:45:02 np0005588920 podman[258484]: 2026-01-20 14:45:02.335682027 +0000 UTC m=+0.176739507 container cleanup cc049919cf352e7c602972e087d24123a805b23e36ab3de8c916a79cb596ca84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 20 09:45:02 np0005588920 systemd[1]: libpod-conmon-cc049919cf352e7c602972e087d24123a805b23e36ab3de8c916a79cb596ca84.scope: Deactivated successfully.
Jan 20 09:45:02 np0005588920 podman[258545]: 2026-01-20 14:45:02.457984047 +0000 UTC m=+0.103893447 container remove cc049919cf352e7c602972e087d24123a805b23e36ab3de8c916a79cb596ca84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 20 09:45:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:02.463 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[81aa4380-0fbd-4186-9910-fead9e1f1469]: (4, ('Tue Jan 20 02:45:02 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59 (cc049919cf352e7c602972e087d24123a805b23e36ab3de8c916a79cb596ca84)\ncc049919cf352e7c602972e087d24123a805b23e36ab3de8c916a79cb596ca84\nTue Jan 20 02:45:02 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59 (cc049919cf352e7c602972e087d24123a805b23e36ab3de8c916a79cb596ca84)\ncc049919cf352e7c602972e087d24123a805b23e36ab3de8c916a79cb596ca84\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:02.465 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0468e29d-bf7a-4d1f-8c08-55678219ebf1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:02.467 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbd5d614-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:45:02 np0005588920 nova_compute[226886]: 2026-01-20 14:45:02.469 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:02 np0005588920 kernel: tapfbd5d614-a0: left promiscuous mode
Jan 20 09:45:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:02.475 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b806b4fc-6403-45c6-b353-d4bf2c66643b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:02 np0005588920 nova_compute[226886]: 2026-01-20 14:45:02.493 226890 DEBUG nova.compute.manager [req-f9afc211-7f0e-4c37-8360-cd9711b35907 req-b156a1f3-fc06-4ae4-b0e1-d7523eaffc05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Received event network-vif-unplugged-254849e1-8318-4ef7-919b-0cdab5b4bc42 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:45:02 np0005588920 nova_compute[226886]: 2026-01-20 14:45:02.494 226890 DEBUG oslo_concurrency.lockutils [req-f9afc211-7f0e-4c37-8360-cd9711b35907 req-b156a1f3-fc06-4ae4-b0e1-d7523eaffc05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:45:02 np0005588920 nova_compute[226886]: 2026-01-20 14:45:02.494 226890 DEBUG oslo_concurrency.lockutils [req-f9afc211-7f0e-4c37-8360-cd9711b35907 req-b156a1f3-fc06-4ae4-b0e1-d7523eaffc05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:45:02 np0005588920 nova_compute[226886]: 2026-01-20 14:45:02.494 226890 DEBUG oslo_concurrency.lockutils [req-f9afc211-7f0e-4c37-8360-cd9711b35907 req-b156a1f3-fc06-4ae4-b0e1-d7523eaffc05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:45:02 np0005588920 nova_compute[226886]: 2026-01-20 14:45:02.494 226890 DEBUG nova.compute.manager [req-f9afc211-7f0e-4c37-8360-cd9711b35907 req-b156a1f3-fc06-4ae4-b0e1-d7523eaffc05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] No waiting events found dispatching network-vif-unplugged-254849e1-8318-4ef7-919b-0cdab5b4bc42 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:45:02 np0005588920 nova_compute[226886]: 2026-01-20 14:45:02.494 226890 DEBUG nova.compute.manager [req-f9afc211-7f0e-4c37-8360-cd9711b35907 req-b156a1f3-fc06-4ae4-b0e1-d7523eaffc05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Received event network-vif-unplugged-254849e1-8318-4ef7-919b-0cdab5b4bc42 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:45:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:02.505 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[18eeaeba-26bb-4a54-9f4e-0c93a09cbbad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:02.507 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d4344bf2-5685-42b5-8e88-e5260732e266]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:02.522 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[4d98296a-c3ea-4315-99ef-7761994b7b45]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 533747, 'reachable_time': 16530, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 258566, 'error': None, 'target': 'ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:02 np0005588920 nova_compute[226886]: 2026-01-20 14:45:02.554 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:02 np0005588920 systemd[1]: run-netns-ovnmeta\x2dfbd5d614\x2da7d3\x2d4563\x2d913c\x2d104506628e59.mount: Deactivated successfully.
Jan 20 09:45:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:02.557 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:45:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:02.557 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[962e5fc4-7b8f-41c9-9d70-b48e24a357f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:03.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:03 np0005588920 ovn_controller[133971]: 2026-01-20T14:45:03Z|00046|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:19:9b:8e 10.100.0.11
Jan 20 09:45:03 np0005588920 ovn_controller[133971]: 2026-01-20T14:45:03Z|00047|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:19:9b:8e 10.100.0.11
Jan 20 09:45:04 np0005588920 nova_compute[226886]: 2026-01-20 14:45:04.147 226890 INFO nova.virt.libvirt.driver [None req-a26f264c-bb96-4ad6-98b6-6d5ca96e6f68 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Deleting instance files /var/lib/nova/instances/b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2_del#033[00m
Jan 20 09:45:04 np0005588920 nova_compute[226886]: 2026-01-20 14:45:04.148 226890 INFO nova.virt.libvirt.driver [None req-a26f264c-bb96-4ad6-98b6-6d5ca96e6f68 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Deletion of /var/lib/nova/instances/b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2_del complete#033[00m
Jan 20 09:45:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:04.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:04 np0005588920 nova_compute[226886]: 2026-01-20 14:45:04.230 226890 INFO nova.compute.manager [None req-a26f264c-bb96-4ad6-98b6-6d5ca96e6f68 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Took 3.18 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:45:04 np0005588920 nova_compute[226886]: 2026-01-20 14:45:04.231 226890 DEBUG oslo.service.loopingcall [None req-a26f264c-bb96-4ad6-98b6-6d5ca96e6f68 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:45:04 np0005588920 nova_compute[226886]: 2026-01-20 14:45:04.231 226890 DEBUG nova.compute.manager [-] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:45:04 np0005588920 nova_compute[226886]: 2026-01-20 14:45:04.231 226890 DEBUG nova.network.neutron [-] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:45:04 np0005588920 nova_compute[226886]: 2026-01-20 14:45:04.593 226890 DEBUG nova.compute.manager [req-f5ef831f-9458-4ae5-b0ad-47b5e94f9453 req-b8192fd4-ad3f-4719-8c92-feef9399c6cb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Received event network-vif-plugged-254849e1-8318-4ef7-919b-0cdab5b4bc42 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:45:04 np0005588920 nova_compute[226886]: 2026-01-20 14:45:04.594 226890 DEBUG oslo_concurrency.lockutils [req-f5ef831f-9458-4ae5-b0ad-47b5e94f9453 req-b8192fd4-ad3f-4719-8c92-feef9399c6cb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:45:04 np0005588920 nova_compute[226886]: 2026-01-20 14:45:04.595 226890 DEBUG oslo_concurrency.lockutils [req-f5ef831f-9458-4ae5-b0ad-47b5e94f9453 req-b8192fd4-ad3f-4719-8c92-feef9399c6cb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:45:04 np0005588920 nova_compute[226886]: 2026-01-20 14:45:04.595 226890 DEBUG oslo_concurrency.lockutils [req-f5ef831f-9458-4ae5-b0ad-47b5e94f9453 req-b8192fd4-ad3f-4719-8c92-feef9399c6cb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:45:04 np0005588920 nova_compute[226886]: 2026-01-20 14:45:04.595 226890 DEBUG nova.compute.manager [req-f5ef831f-9458-4ae5-b0ad-47b5e94f9453 req-b8192fd4-ad3f-4719-8c92-feef9399c6cb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] No waiting events found dispatching network-vif-plugged-254849e1-8318-4ef7-919b-0cdab5b4bc42 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:45:04 np0005588920 nova_compute[226886]: 2026-01-20 14:45:04.596 226890 WARNING nova.compute.manager [req-f5ef831f-9458-4ae5-b0ad-47b5e94f9453 req-b8192fd4-ad3f-4719-8c92-feef9399c6cb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Received unexpected event network-vif-plugged-254849e1-8318-4ef7-919b-0cdab5b4bc42 for instance with vm_state paused and task_state deleting.#033[00m
Jan 20 09:45:04 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:45:04 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:45:04 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:45:05 np0005588920 nova_compute[226886]: 2026-01-20 14:45:05.188 226890 DEBUG nova.network.neutron [-] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:45:05 np0005588920 nova_compute[226886]: 2026-01-20 14:45:05.220 226890 INFO nova.compute.manager [-] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Took 0.99 seconds to deallocate network for instance.#033[00m
Jan 20 09:45:05 np0005588920 nova_compute[226886]: 2026-01-20 14:45:05.272 226890 DEBUG oslo_concurrency.lockutils [None req-a26f264c-bb96-4ad6-98b6-6d5ca96e6f68 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:45:05 np0005588920 nova_compute[226886]: 2026-01-20 14:45:05.272 226890 DEBUG oslo_concurrency.lockutils [None req-a26f264c-bb96-4ad6-98b6-6d5ca96e6f68 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:45:05 np0005588920 nova_compute[226886]: 2026-01-20 14:45:05.289 226890 DEBUG nova.compute.manager [req-18303416-110f-4bfe-9ee1-38381b7174d7 req-5ac46a96-a97c-4c4f-bfe0-4650d0a9c747 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Received event network-vif-deleted-254849e1-8318-4ef7-919b-0cdab5b4bc42 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:45:05 np0005588920 nova_compute[226886]: 2026-01-20 14:45:05.354 226890 DEBUG oslo_concurrency.processutils [None req-a26f264c-bb96-4ad6-98b6-6d5ca96e6f68 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:45:05 np0005588920 nova_compute[226886]: 2026-01-20 14:45:05.444 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:05.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:05 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:45:05 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1471590964' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:45:05 np0005588920 nova_compute[226886]: 2026-01-20 14:45:05.808 226890 DEBUG oslo_concurrency.processutils [None req-a26f264c-bb96-4ad6-98b6-6d5ca96e6f68 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:45:05 np0005588920 nova_compute[226886]: 2026-01-20 14:45:05.814 226890 DEBUG nova.compute.provider_tree [None req-a26f264c-bb96-4ad6-98b6-6d5ca96e6f68 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:45:05 np0005588920 nova_compute[226886]: 2026-01-20 14:45:05.885 226890 DEBUG nova.scheduler.client.report [None req-a26f264c-bb96-4ad6-98b6-6d5ca96e6f68 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:45:05 np0005588920 nova_compute[226886]: 2026-01-20 14:45:05.913 226890 DEBUG oslo_concurrency.lockutils [None req-a26f264c-bb96-4ad6-98b6-6d5ca96e6f68 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.641s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:45:05 np0005588920 nova_compute[226886]: 2026-01-20 14:45:05.932 226890 INFO nova.scheduler.client.report [None req-a26f264c-bb96-4ad6-98b6-6d5ca96e6f68 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Deleted allocations for instance b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2#033[00m
Jan 20 09:45:05 np0005588920 nova_compute[226886]: 2026-01-20 14:45:05.998 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:06 np0005588920 nova_compute[226886]: 2026-01-20 14:45:06.010 226890 DEBUG oslo_concurrency.lockutils [None req-a26f264c-bb96-4ad6-98b6-6d5ca96e6f68 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.968s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:45:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:06.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:06 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:45:07 np0005588920 nova_compute[226886]: 2026-01-20 14:45:07.303 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:45:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:07.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:45:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:45:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:08.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:45:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:09.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:10.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:10 np0005588920 nova_compute[226886]: 2026-01-20 14:45:10.446 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:11 np0005588920 nova_compute[226886]: 2026-01-20 14:45:11.295 226890 DEBUG oslo_concurrency.lockutils [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "75736b87-b14e-45b7-b43b-5129cf7d3279" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:45:11 np0005588920 nova_compute[226886]: 2026-01-20 14:45:11.295 226890 DEBUG oslo_concurrency.lockutils [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:45:11 np0005588920 nova_compute[226886]: 2026-01-20 14:45:11.323 226890 DEBUG nova.compute.manager [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:45:11 np0005588920 nova_compute[226886]: 2026-01-20 14:45:11.393 226890 DEBUG oslo_concurrency.lockutils [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:45:11 np0005588920 nova_compute[226886]: 2026-01-20 14:45:11.393 226890 DEBUG oslo_concurrency.lockutils [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:45:11 np0005588920 nova_compute[226886]: 2026-01-20 14:45:11.403 226890 DEBUG nova.virt.hardware [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:45:11 np0005588920 nova_compute[226886]: 2026-01-20 14:45:11.403 226890 INFO nova.compute.claims [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 09:45:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:11.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:11 np0005588920 nova_compute[226886]: 2026-01-20 14:45:11.500 226890 DEBUG oslo_concurrency.processutils [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:45:11 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:45:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:45:12 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/950872874' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:45:12 np0005588920 nova_compute[226886]: 2026-01-20 14:45:12.159 226890 DEBUG oslo_concurrency.processutils [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.659s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:45:12 np0005588920 nova_compute[226886]: 2026-01-20 14:45:12.165 226890 DEBUG nova.compute.provider_tree [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:45:12 np0005588920 nova_compute[226886]: 2026-01-20 14:45:12.191 226890 DEBUG nova.scheduler.client.report [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:45:12 np0005588920 nova_compute[226886]: 2026-01-20 14:45:12.218 226890 DEBUG oslo_concurrency.lockutils [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.825s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:45:12 np0005588920 nova_compute[226886]: 2026-01-20 14:45:12.219 226890 DEBUG nova.compute.manager [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:45:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:45:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:12.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:45:12 np0005588920 nova_compute[226886]: 2026-01-20 14:45:12.286 226890 DEBUG nova.compute.manager [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:45:12 np0005588920 nova_compute[226886]: 2026-01-20 14:45:12.287 226890 DEBUG nova.network.neutron [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:45:12 np0005588920 nova_compute[226886]: 2026-01-20 14:45:12.306 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:12 np0005588920 nova_compute[226886]: 2026-01-20 14:45:12.309 226890 INFO nova.virt.libvirt.driver [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:45:12 np0005588920 nova_compute[226886]: 2026-01-20 14:45:12.328 226890 DEBUG nova.compute.manager [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:45:12 np0005588920 nova_compute[226886]: 2026-01-20 14:45:12.427 226890 DEBUG nova.compute.manager [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:45:12 np0005588920 nova_compute[226886]: 2026-01-20 14:45:12.429 226890 DEBUG nova.virt.libvirt.driver [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:45:12 np0005588920 nova_compute[226886]: 2026-01-20 14:45:12.430 226890 INFO nova.virt.libvirt.driver [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Creating image(s)#033[00m
Jan 20 09:45:12 np0005588920 nova_compute[226886]: 2026-01-20 14:45:12.475 226890 DEBUG nova.storage.rbd_utils [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] rbd image 75736b87-b14e-45b7-b43b-5129cf7d3279_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:45:12 np0005588920 nova_compute[226886]: 2026-01-20 14:45:12.503 226890 DEBUG nova.storage.rbd_utils [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] rbd image 75736b87-b14e-45b7-b43b-5129cf7d3279_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:45:12 np0005588920 nova_compute[226886]: 2026-01-20 14:45:12.528 226890 DEBUG nova.storage.rbd_utils [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] rbd image 75736b87-b14e-45b7-b43b-5129cf7d3279_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:45:12 np0005588920 nova_compute[226886]: 2026-01-20 14:45:12.532 226890 DEBUG oslo_concurrency.processutils [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:45:12 np0005588920 nova_compute[226886]: 2026-01-20 14:45:12.599 226890 DEBUG nova.policy [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3e9278fdb9e645b7938f3edb20c4d3cf', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1c5f03d46c0c4162a3b2f1530850bb6c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:45:12 np0005588920 nova_compute[226886]: 2026-01-20 14:45:12.624 226890 DEBUG oslo_concurrency.processutils [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:45:12 np0005588920 nova_compute[226886]: 2026-01-20 14:45:12.625 226890 DEBUG oslo_concurrency.lockutils [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:45:12 np0005588920 nova_compute[226886]: 2026-01-20 14:45:12.626 226890 DEBUG oslo_concurrency.lockutils [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:45:12 np0005588920 nova_compute[226886]: 2026-01-20 14:45:12.626 226890 DEBUG oslo_concurrency.lockutils [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:45:12 np0005588920 nova_compute[226886]: 2026-01-20 14:45:12.647 226890 DEBUG nova.storage.rbd_utils [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] rbd image 75736b87-b14e-45b7-b43b-5129cf7d3279_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:45:12 np0005588920 nova_compute[226886]: 2026-01-20 14:45:12.654 226890 DEBUG oslo_concurrency.processutils [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 75736b87-b14e-45b7-b43b-5129cf7d3279_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:45:12 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:45:12 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:45:12 np0005588920 nova_compute[226886]: 2026-01-20 14:45:12.970 226890 DEBUG nova.virt.libvirt.driver [None req-30f83f89-1b3c-4746-bdef-3ce4ccda7357 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Instance in state 1 after 21 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 20 09:45:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:13.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:13 np0005588920 nova_compute[226886]: 2026-01-20 14:45:13.648 226890 DEBUG nova.network.neutron [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Successfully created port: d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:45:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:14.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:14 np0005588920 nova_compute[226886]: 2026-01-20 14:45:14.388 226890 DEBUG nova.network.neutron [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Successfully updated port: d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:45:14 np0005588920 nova_compute[226886]: 2026-01-20 14:45:14.402 226890 DEBUG oslo_concurrency.lockutils [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "refresh_cache-75736b87-b14e-45b7-b43b-5129cf7d3279" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:45:14 np0005588920 nova_compute[226886]: 2026-01-20 14:45:14.402 226890 DEBUG oslo_concurrency.lockutils [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquired lock "refresh_cache-75736b87-b14e-45b7-b43b-5129cf7d3279" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:45:14 np0005588920 nova_compute[226886]: 2026-01-20 14:45:14.402 226890 DEBUG nova.network.neutron [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:45:14 np0005588920 nova_compute[226886]: 2026-01-20 14:45:14.983 226890 DEBUG nova.network.neutron [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:45:15 np0005588920 podman[258887]: 2026-01-20 14:45:15.003998861 +0000 UTC m=+0.080501275 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 20 09:45:15 np0005588920 nova_compute[226886]: 2026-01-20 14:45:15.216 226890 DEBUG oslo_concurrency.processutils [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 75736b87-b14e-45b7-b43b-5129cf7d3279_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.562s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:45:15 np0005588920 nova_compute[226886]: 2026-01-20 14:45:15.308 226890 DEBUG nova.storage.rbd_utils [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] resizing rbd image 75736b87-b14e-45b7-b43b-5129cf7d3279_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:45:15 np0005588920 nova_compute[226886]: 2026-01-20 14:45:15.368 226890 DEBUG nova.compute.manager [req-fc444285-0239-454c-bf62-8e6889364690 req-443e9b4e-2a7d-44fe-9b0d-c73732422f7c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Received event network-changed-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:45:15 np0005588920 nova_compute[226886]: 2026-01-20 14:45:15.368 226890 DEBUG nova.compute.manager [req-fc444285-0239-454c-bf62-8e6889364690 req-443e9b4e-2a7d-44fe-9b0d-c73732422f7c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Refreshing instance network info cache due to event network-changed-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:45:15 np0005588920 nova_compute[226886]: 2026-01-20 14:45:15.369 226890 DEBUG oslo_concurrency.lockutils [req-fc444285-0239-454c-bf62-8e6889364690 req-443e9b4e-2a7d-44fe-9b0d-c73732422f7c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-75736b87-b14e-45b7-b43b-5129cf7d3279" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:45:15 np0005588920 nova_compute[226886]: 2026-01-20 14:45:15.472 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:15.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:16.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:16 np0005588920 nova_compute[226886]: 2026-01-20 14:45:16.398 226890 DEBUG nova.objects.instance [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'migration_context' on Instance uuid 75736b87-b14e-45b7-b43b-5129cf7d3279 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:45:16 np0005588920 nova_compute[226886]: 2026-01-20 14:45:16.417 226890 DEBUG nova.virt.libvirt.driver [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:45:16 np0005588920 nova_compute[226886]: 2026-01-20 14:45:16.417 226890 DEBUG nova.virt.libvirt.driver [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Ensure instance console log exists: /var/lib/nova/instances/75736b87-b14e-45b7-b43b-5129cf7d3279/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:45:16 np0005588920 nova_compute[226886]: 2026-01-20 14:45:16.418 226890 DEBUG oslo_concurrency.lockutils [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:45:16 np0005588920 nova_compute[226886]: 2026-01-20 14:45:16.418 226890 DEBUG oslo_concurrency.lockutils [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:45:16 np0005588920 nova_compute[226886]: 2026-01-20 14:45:16.419 226890 DEBUG oslo_concurrency.lockutils [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:45:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:16.449 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:45:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:16.449 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:45:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:16.450 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:45:16 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:45:17 np0005588920 nova_compute[226886]: 2026-01-20 14:45:17.278 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768920302.2770567, b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:45:17 np0005588920 nova_compute[226886]: 2026-01-20 14:45:17.279 226890 INFO nova.compute.manager [-] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:45:17 np0005588920 nova_compute[226886]: 2026-01-20 14:45:17.306 226890 DEBUG nova.compute.manager [None req-9e6ecd01-fb12-4ec5-895f-922eb15bc943 - - - - - -] [instance: b84fbeb2-0fcd-47d3-9415-5cc2ed0a7ea2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:45:17 np0005588920 nova_compute[226886]: 2026-01-20 14:45:17.310 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:17 np0005588920 kernel: tap4de545f7-32 (unregistering): left promiscuous mode
Jan 20 09:45:17 np0005588920 NetworkManager[49076]: <info>  [1768920317.3680] device (tap4de545f7-32): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:45:17 np0005588920 ovn_controller[133971]: 2026-01-20T14:45:17Z|00354|binding|INFO|Releasing lport 4de545f7-326a-4971-87cd-a23be2cbce6a from this chassis (sb_readonly=0)
Jan 20 09:45:17 np0005588920 ovn_controller[133971]: 2026-01-20T14:45:17Z|00355|binding|INFO|Setting lport 4de545f7-326a-4971-87cd-a23be2cbce6a down in Southbound
Jan 20 09:45:17 np0005588920 ovn_controller[133971]: 2026-01-20T14:45:17Z|00356|binding|INFO|Removing iface tap4de545f7-32 ovn-installed in OVS
Jan 20 09:45:17 np0005588920 nova_compute[226886]: 2026-01-20 14:45:17.378 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:17.385 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:19:9b:8e 10.100.0.11'], port_security=['fa:16:3e:19:9b:8e 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'cc7de61a-b40f-4367-873d-c51b6f29310b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b683fcc0026242e28ba6d8fba638688e', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ac411cec-795a-42a6-ba83-9468a87a4a14', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=361f9a69-30a6-4be4-89ad-2a8f92877af2, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=4de545f7-326a-4971-87cd-a23be2cbce6a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:45:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:17.386 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 4de545f7-326a-4971-87cd-a23be2cbce6a in datapath a19e9d1a-864f-41ee-bdea-188e65973ea5 unbound from our chassis#033[00m
Jan 20 09:45:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:17.388 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a19e9d1a-864f-41ee-bdea-188e65973ea5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:45:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:17.389 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[85125111-c734-40fc-bd3b-ff0cacd71c2f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:17.389 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5 namespace which is not needed anymore#033[00m
Jan 20 09:45:17 np0005588920 nova_compute[226886]: 2026-01-20 14:45:17.397 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:17 np0005588920 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d0000005a.scope: Deactivated successfully.
Jan 20 09:45:17 np0005588920 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d0000005a.scope: Consumed 14.314s CPU time.
Jan 20 09:45:17 np0005588920 systemd-machined[196121]: Machine qemu-37-instance-0000005a terminated.
Jan 20 09:45:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:17.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:17 np0005588920 neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5[258152]: [NOTICE]   (258156) : haproxy version is 2.8.14-c23fe91
Jan 20 09:45:17 np0005588920 neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5[258152]: [NOTICE]   (258156) : path to executable is /usr/sbin/haproxy
Jan 20 09:45:17 np0005588920 neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5[258152]: [WARNING]  (258156) : Exiting Master process...
Jan 20 09:45:17 np0005588920 neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5[258152]: [ALERT]    (258156) : Current worker (258158) exited with code 143 (Terminated)
Jan 20 09:45:17 np0005588920 neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5[258152]: [WARNING]  (258156) : All workers exited. Exiting... (0)
Jan 20 09:45:17 np0005588920 systemd[1]: libpod-1bf65416d8513e17f0c7eebcbad247985d561759ae695283474d9d6ced49ea43.scope: Deactivated successfully.
Jan 20 09:45:17 np0005588920 podman[259004]: 2026-01-20 14:45:17.512435397 +0000 UTC m=+0.040005367 container died 1bf65416d8513e17f0c7eebcbad247985d561759ae695283474d9d6ced49ea43 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:45:17 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1bf65416d8513e17f0c7eebcbad247985d561759ae695283474d9d6ced49ea43-userdata-shm.mount: Deactivated successfully.
Jan 20 09:45:17 np0005588920 systemd[1]: var-lib-containers-storage-overlay-bc6f75607ebd2f6f152495500a34a77353a3e2d95d66ff024c0b1b15c366c4cf-merged.mount: Deactivated successfully.
Jan 20 09:45:17 np0005588920 podman[259004]: 2026-01-20 14:45:17.54842196 +0000 UTC m=+0.075991930 container cleanup 1bf65416d8513e17f0c7eebcbad247985d561759ae695283474d9d6ced49ea43 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:45:17 np0005588920 systemd[1]: libpod-conmon-1bf65416d8513e17f0c7eebcbad247985d561759ae695283474d9d6ced49ea43.scope: Deactivated successfully.
Jan 20 09:45:17 np0005588920 nova_compute[226886]: 2026-01-20 14:45:17.591 226890 DEBUG nova.network.neutron [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Updating instance_info_cache with network_info: [{"id": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "address": "fa:16:3e:22:f9:d2", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a9a684-c9", "ovs_interfaceid": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:45:17 np0005588920 podman[259035]: 2026-01-20 14:45:17.60904679 +0000 UTC m=+0.042428884 container remove 1bf65416d8513e17f0c7eebcbad247985d561759ae695283474d9d6ced49ea43 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:45:17 np0005588920 nova_compute[226886]: 2026-01-20 14:45:17.610 226890 DEBUG oslo_concurrency.lockutils [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Releasing lock "refresh_cache-75736b87-b14e-45b7-b43b-5129cf7d3279" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:45:17 np0005588920 nova_compute[226886]: 2026-01-20 14:45:17.612 226890 DEBUG nova.compute.manager [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Instance network_info: |[{"id": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "address": "fa:16:3e:22:f9:d2", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a9a684-c9", "ovs_interfaceid": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:45:17 np0005588920 nova_compute[226886]: 2026-01-20 14:45:17.612 226890 DEBUG oslo_concurrency.lockutils [req-fc444285-0239-454c-bf62-8e6889364690 req-443e9b4e-2a7d-44fe-9b0d-c73732422f7c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-75736b87-b14e-45b7-b43b-5129cf7d3279" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:45:17 np0005588920 nova_compute[226886]: 2026-01-20 14:45:17.613 226890 DEBUG nova.network.neutron [req-fc444285-0239-454c-bf62-8e6889364690 req-443e9b4e-2a7d-44fe-9b0d-c73732422f7c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Refreshing network info cache for port d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:45:17 np0005588920 nova_compute[226886]: 2026-01-20 14:45:17.615 226890 DEBUG nova.virt.libvirt.driver [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Start _get_guest_xml network_info=[{"id": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "address": "fa:16:3e:22:f9:d2", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a9a684-c9", "ovs_interfaceid": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:45:17 np0005588920 nova_compute[226886]: 2026-01-20 14:45:17.619 226890 WARNING nova.virt.libvirt.driver [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:45:17 np0005588920 nova_compute[226886]: 2026-01-20 14:45:17.623 226890 DEBUG nova.virt.libvirt.host [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:45:17 np0005588920 nova_compute[226886]: 2026-01-20 14:45:17.624 226890 DEBUG nova.virt.libvirt.host [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:45:17 np0005588920 nova_compute[226886]: 2026-01-20 14:45:17.627 226890 DEBUG nova.virt.libvirt.host [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:45:17 np0005588920 nova_compute[226886]: 2026-01-20 14:45:17.627 226890 DEBUG nova.virt.libvirt.host [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:45:17 np0005588920 nova_compute[226886]: 2026-01-20 14:45:17.629 226890 DEBUG nova.virt.libvirt.driver [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:45:17 np0005588920 nova_compute[226886]: 2026-01-20 14:45:17.629 226890 DEBUG nova.virt.hardware [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:45:17 np0005588920 nova_compute[226886]: 2026-01-20 14:45:17.629 226890 DEBUG nova.virt.hardware [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:45:17 np0005588920 nova_compute[226886]: 2026-01-20 14:45:17.630 226890 DEBUG nova.virt.hardware [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:45:17 np0005588920 nova_compute[226886]: 2026-01-20 14:45:17.630 226890 DEBUG nova.virt.hardware [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:45:17 np0005588920 nova_compute[226886]: 2026-01-20 14:45:17.630 226890 DEBUG nova.virt.hardware [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:45:17 np0005588920 nova_compute[226886]: 2026-01-20 14:45:17.631 226890 DEBUG nova.virt.hardware [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:45:17 np0005588920 nova_compute[226886]: 2026-01-20 14:45:17.631 226890 DEBUG nova.virt.hardware [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:45:17 np0005588920 nova_compute[226886]: 2026-01-20 14:45:17.631 226890 DEBUG nova.virt.hardware [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:45:17 np0005588920 nova_compute[226886]: 2026-01-20 14:45:17.631 226890 DEBUG nova.virt.hardware [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:45:17 np0005588920 nova_compute[226886]: 2026-01-20 14:45:17.632 226890 DEBUG nova.virt.hardware [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:45:17 np0005588920 nova_compute[226886]: 2026-01-20 14:45:17.632 226890 DEBUG nova.virt.hardware [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:45:17 np0005588920 nova_compute[226886]: 2026-01-20 14:45:17.634 226890 DEBUG oslo_concurrency.processutils [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:45:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:17.647 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[40674e54-35b4-4d6b-a72d-2f445de907e7]: (4, ('Tue Jan 20 02:45:17 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5 (1bf65416d8513e17f0c7eebcbad247985d561759ae695283474d9d6ced49ea43)\n1bf65416d8513e17f0c7eebcbad247985d561759ae695283474d9d6ced49ea43\nTue Jan 20 02:45:17 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5 (1bf65416d8513e17f0c7eebcbad247985d561759ae695283474d9d6ced49ea43)\n1bf65416d8513e17f0c7eebcbad247985d561759ae695283474d9d6ced49ea43\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:17.648 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[aab587a2-c9cd-4240-9bd0-5c83c0000c25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:17.649 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa19e9d1a-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:45:17 np0005588920 nova_compute[226886]: 2026-01-20 14:45:17.655 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:17 np0005588920 kernel: tapa19e9d1a-80: left promiscuous mode
Jan 20 09:45:17 np0005588920 nova_compute[226886]: 2026-01-20 14:45:17.664 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:17 np0005588920 nova_compute[226886]: 2026-01-20 14:45:17.669 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:17.672 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[01e1e94d-2f7c-40c4-8dfa-f438f2950fa0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:17.686 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[623790d5-274f-4a3b-82fe-7eb7b3965ed2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:17.687 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[bd68a0ae-a773-4c4b-9382-995abddfdf93]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:17.703 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[51a15dc5-41ed-4181-bdfc-411a6e0e023a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 532970, 'reachable_time': 28219, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 259063, 'error': None, 'target': 'ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:17 np0005588920 systemd[1]: run-netns-ovnmeta\x2da19e9d1a\x2d864f\x2d41ee\x2dbdea\x2d188e65973ea5.mount: Deactivated successfully.
Jan 20 09:45:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:17.705 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:45:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:17.705 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[2afa621e-60e2-49ea-a601-395211628734]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:17 np0005588920 nova_compute[226886]: 2026-01-20 14:45:17.822 226890 DEBUG nova.compute.manager [req-6ab018cb-02a3-45d7-a437-02e125894150 req-7c46968e-94fe-410c-bf31-6c53a2f348b9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Received event network-vif-unplugged-4de545f7-326a-4971-87cd-a23be2cbce6a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:45:17 np0005588920 nova_compute[226886]: 2026-01-20 14:45:17.822 226890 DEBUG oslo_concurrency.lockutils [req-6ab018cb-02a3-45d7-a437-02e125894150 req-7c46968e-94fe-410c-bf31-6c53a2f348b9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "cc7de61a-b40f-4367-873d-c51b6f29310b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:45:17 np0005588920 nova_compute[226886]: 2026-01-20 14:45:17.823 226890 DEBUG oslo_concurrency.lockutils [req-6ab018cb-02a3-45d7-a437-02e125894150 req-7c46968e-94fe-410c-bf31-6c53a2f348b9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "cc7de61a-b40f-4367-873d-c51b6f29310b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:45:17 np0005588920 nova_compute[226886]: 2026-01-20 14:45:17.823 226890 DEBUG oslo_concurrency.lockutils [req-6ab018cb-02a3-45d7-a437-02e125894150 req-7c46968e-94fe-410c-bf31-6c53a2f348b9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "cc7de61a-b40f-4367-873d-c51b6f29310b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:45:17 np0005588920 nova_compute[226886]: 2026-01-20 14:45:17.823 226890 DEBUG nova.compute.manager [req-6ab018cb-02a3-45d7-a437-02e125894150 req-7c46968e-94fe-410c-bf31-6c53a2f348b9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] No waiting events found dispatching network-vif-unplugged-4de545f7-326a-4971-87cd-a23be2cbce6a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:45:17 np0005588920 nova_compute[226886]: 2026-01-20 14:45:17.824 226890 WARNING nova.compute.manager [req-6ab018cb-02a3-45d7-a437-02e125894150 req-7c46968e-94fe-410c-bf31-6c53a2f348b9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Received unexpected event network-vif-unplugged-4de545f7-326a-4971-87cd-a23be2cbce6a for instance with vm_state active and task_state powering-off.#033[00m
Jan 20 09:45:18 np0005588920 nova_compute[226886]: 2026-01-20 14:45:18.001 226890 INFO nova.virt.libvirt.driver [None req-30f83f89-1b3c-4746-bdef-3ce4ccda7357 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Instance shutdown successfully after 26 seconds.#033[00m
Jan 20 09:45:18 np0005588920 nova_compute[226886]: 2026-01-20 14:45:18.007 226890 INFO nova.virt.libvirt.driver [-] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Instance destroyed successfully.#033[00m
Jan 20 09:45:18 np0005588920 nova_compute[226886]: 2026-01-20 14:45:18.008 226890 DEBUG nova.objects.instance [None req-30f83f89-1b3c-4746-bdef-3ce4ccda7357 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lazy-loading 'numa_topology' on Instance uuid cc7de61a-b40f-4367-873d-c51b6f29310b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:45:18 np0005588920 nova_compute[226886]: 2026-01-20 14:45:18.019 226890 DEBUG nova.compute.manager [None req-30f83f89-1b3c-4746-bdef-3ce4ccda7357 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:45:18 np0005588920 nova_compute[226886]: 2026-01-20 14:45:18.060 226890 DEBUG oslo_concurrency.lockutils [None req-30f83f89-1b3c-4746-bdef-3ce4ccda7357 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "cc7de61a-b40f-4367-873d-c51b6f29310b" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 26.213s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:45:18 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:45:18 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3296741815' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:45:18 np0005588920 nova_compute[226886]: 2026-01-20 14:45:18.084 226890 DEBUG oslo_concurrency.processutils [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:45:18 np0005588920 nova_compute[226886]: 2026-01-20 14:45:18.107 226890 DEBUG nova.storage.rbd_utils [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] rbd image 75736b87-b14e-45b7-b43b-5129cf7d3279_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:45:18 np0005588920 nova_compute[226886]: 2026-01-20 14:45:18.110 226890 DEBUG oslo_concurrency.processutils [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:45:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:18.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:18 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:45:18 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/129227990' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:45:18 np0005588920 nova_compute[226886]: 2026-01-20 14:45:18.534 226890 DEBUG oslo_concurrency.processutils [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:45:18 np0005588920 nova_compute[226886]: 2026-01-20 14:45:18.536 226890 DEBUG nova.virt.libvirt.vif [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:45:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1202945337',display_name='tempest-ServerActionsTestJSON-server-1202945337',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1202945337',id=94,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLgEzx5mLsSqRL7L9WKOzM+WdeJ40U103wY9H3VMZ41G/sN5tQtQSt9lXWKTyc6pt00bfKD0E9GPugNMpy+dzSSpK23o3CgadkzAfAvjQCeCPOSp3fX13FGApomGd1HRCQ==',key_name='tempest-keypair-1602241722',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1c5f03d46c0c4162a3b2f1530850bb6c',ramdisk_id='',reservation_id='r-luaqa362',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1020442335',owner_user_name='tempest-ServerActionsTestJSON-1020442335-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:45:12Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e9278fdb9e645b7938f3edb20c4d3cf',uuid=75736b87-b14e-45b7-b43b-5129cf7d3279,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "address": "fa:16:3e:22:f9:d2", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a9a684-c9", "ovs_interfaceid": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:45:18 np0005588920 nova_compute[226886]: 2026-01-20 14:45:18.536 226890 DEBUG nova.network.os_vif_util [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converting VIF {"id": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "address": "fa:16:3e:22:f9:d2", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a9a684-c9", "ovs_interfaceid": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:45:18 np0005588920 nova_compute[226886]: 2026-01-20 14:45:18.537 226890 DEBUG nova.network.os_vif_util [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:22:f9:d2,bridge_name='br-int',has_traffic_filtering=True,id=d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3a9a684-c9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:45:18 np0005588920 nova_compute[226886]: 2026-01-20 14:45:18.538 226890 DEBUG nova.objects.instance [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'pci_devices' on Instance uuid 75736b87-b14e-45b7-b43b-5129cf7d3279 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:45:18 np0005588920 nova_compute[226886]: 2026-01-20 14:45:18.560 226890 DEBUG nova.virt.libvirt.driver [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:45:18 np0005588920 nova_compute[226886]:  <uuid>75736b87-b14e-45b7-b43b-5129cf7d3279</uuid>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:  <name>instance-0000005e</name>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:45:18 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:      <nova:name>tempest-ServerActionsTestJSON-server-1202945337</nova:name>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:45:17</nova:creationTime>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:45:18 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:        <nova:user uuid="3e9278fdb9e645b7938f3edb20c4d3cf">tempest-ServerActionsTestJSON-1020442335-project-member</nova:user>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:        <nova:project uuid="1c5f03d46c0c4162a3b2f1530850bb6c">tempest-ServerActionsTestJSON-1020442335</nova:project>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:        <nova:port uuid="d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6">
Jan 20 09:45:18 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:      <entry name="serial">75736b87-b14e-45b7-b43b-5129cf7d3279</entry>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:      <entry name="uuid">75736b87-b14e-45b7-b43b-5129cf7d3279</entry>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:45:18 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/75736b87-b14e-45b7-b43b-5129cf7d3279_disk">
Jan 20 09:45:18 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:45:18 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:45:18 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/75736b87-b14e-45b7-b43b-5129cf7d3279_disk.config">
Jan 20 09:45:18 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:45:18 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:45:18 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:22:f9:d2"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:      <target dev="tapd3a9a684-c9"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:45:18 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/75736b87-b14e-45b7-b43b-5129cf7d3279/console.log" append="off"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:45:18 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:45:18 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:45:18 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:45:18 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:45:18 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:45:18 np0005588920 nova_compute[226886]: 2026-01-20 14:45:18.561 226890 DEBUG nova.compute.manager [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Preparing to wait for external event network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:45:18 np0005588920 nova_compute[226886]: 2026-01-20 14:45:18.562 226890 DEBUG oslo_concurrency.lockutils [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:45:18 np0005588920 nova_compute[226886]: 2026-01-20 14:45:18.562 226890 DEBUG oslo_concurrency.lockutils [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:45:18 np0005588920 nova_compute[226886]: 2026-01-20 14:45:18.562 226890 DEBUG oslo_concurrency.lockutils [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:45:18 np0005588920 nova_compute[226886]: 2026-01-20 14:45:18.563 226890 DEBUG nova.virt.libvirt.vif [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:45:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1202945337',display_name='tempest-ServerActionsTestJSON-server-1202945337',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1202945337',id=94,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLgEzx5mLsSqRL7L9WKOzM+WdeJ40U103wY9H3VMZ41G/sN5tQtQSt9lXWKTyc6pt00bfKD0E9GPugNMpy+dzSSpK23o3CgadkzAfAvjQCeCPOSp3fX13FGApomGd1HRCQ==',key_name='tempest-keypair-1602241722',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1c5f03d46c0c4162a3b2f1530850bb6c',ramdisk_id='',reservation_id='r-luaqa362',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1020442335',owner_user_name='tempest-ServerActionsTestJSON-1020442335-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:45:12Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e9278fdb9e645b7938f3edb20c4d3cf',uuid=75736b87-b14e-45b7-b43b-5129cf7d3279,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "address": "fa:16:3e:22:f9:d2", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a9a684-c9", "ovs_interfaceid": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:45:18 np0005588920 nova_compute[226886]: 2026-01-20 14:45:18.563 226890 DEBUG nova.network.os_vif_util [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converting VIF {"id": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "address": "fa:16:3e:22:f9:d2", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a9a684-c9", "ovs_interfaceid": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:45:18 np0005588920 nova_compute[226886]: 2026-01-20 14:45:18.564 226890 DEBUG nova.network.os_vif_util [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:22:f9:d2,bridge_name='br-int',has_traffic_filtering=True,id=d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3a9a684-c9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:45:18 np0005588920 nova_compute[226886]: 2026-01-20 14:45:18.564 226890 DEBUG os_vif [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:f9:d2,bridge_name='br-int',has_traffic_filtering=True,id=d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3a9a684-c9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:45:18 np0005588920 nova_compute[226886]: 2026-01-20 14:45:18.565 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:18 np0005588920 nova_compute[226886]: 2026-01-20 14:45:18.565 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:45:18 np0005588920 nova_compute[226886]: 2026-01-20 14:45:18.566 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:45:18 np0005588920 nova_compute[226886]: 2026-01-20 14:45:18.568 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:18 np0005588920 nova_compute[226886]: 2026-01-20 14:45:18.568 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd3a9a684-c9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:45:18 np0005588920 nova_compute[226886]: 2026-01-20 14:45:18.568 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd3a9a684-c9, col_values=(('external_ids', {'iface-id': 'd3a9a684-c9a7-4abc-a085-9dcd17bfc2e6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:22:f9:d2', 'vm-uuid': '75736b87-b14e-45b7-b43b-5129cf7d3279'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:45:18 np0005588920 NetworkManager[49076]: <info>  [1768920318.5710] manager: (tapd3a9a684-c9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/187)
Jan 20 09:45:18 np0005588920 nova_compute[226886]: 2026-01-20 14:45:18.572 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:45:18 np0005588920 nova_compute[226886]: 2026-01-20 14:45:18.577 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:18 np0005588920 nova_compute[226886]: 2026-01-20 14:45:18.577 226890 INFO os_vif [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:f9:d2,bridge_name='br-int',has_traffic_filtering=True,id=d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3a9a684-c9')#033[00m
Jan 20 09:45:18 np0005588920 nova_compute[226886]: 2026-01-20 14:45:18.634 226890 DEBUG nova.virt.libvirt.driver [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:45:18 np0005588920 nova_compute[226886]: 2026-01-20 14:45:18.634 226890 DEBUG nova.virt.libvirt.driver [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:45:18 np0005588920 nova_compute[226886]: 2026-01-20 14:45:18.634 226890 DEBUG nova.virt.libvirt.driver [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] No VIF found with MAC fa:16:3e:22:f9:d2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:45:18 np0005588920 nova_compute[226886]: 2026-01-20 14:45:18.635 226890 INFO nova.virt.libvirt.driver [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Using config drive#033[00m
Jan 20 09:45:18 np0005588920 nova_compute[226886]: 2026-01-20 14:45:18.659 226890 DEBUG nova.storage.rbd_utils [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] rbd image 75736b87-b14e-45b7-b43b-5129cf7d3279_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:45:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:19.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:19 np0005588920 nova_compute[226886]: 2026-01-20 14:45:19.919 226890 DEBUG nova.compute.manager [req-c55d9ca3-4dfc-4f8b-bada-a015e8a3d29f req-dc5abf14-e02c-4b0a-bf86-f23475aeb56d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Received event network-vif-plugged-4de545f7-326a-4971-87cd-a23be2cbce6a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:45:19 np0005588920 nova_compute[226886]: 2026-01-20 14:45:19.919 226890 DEBUG oslo_concurrency.lockutils [req-c55d9ca3-4dfc-4f8b-bada-a015e8a3d29f req-dc5abf14-e02c-4b0a-bf86-f23475aeb56d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "cc7de61a-b40f-4367-873d-c51b6f29310b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:45:19 np0005588920 nova_compute[226886]: 2026-01-20 14:45:19.920 226890 DEBUG oslo_concurrency.lockutils [req-c55d9ca3-4dfc-4f8b-bada-a015e8a3d29f req-dc5abf14-e02c-4b0a-bf86-f23475aeb56d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "cc7de61a-b40f-4367-873d-c51b6f29310b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:45:19 np0005588920 nova_compute[226886]: 2026-01-20 14:45:19.920 226890 DEBUG oslo_concurrency.lockutils [req-c55d9ca3-4dfc-4f8b-bada-a015e8a3d29f req-dc5abf14-e02c-4b0a-bf86-f23475aeb56d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "cc7de61a-b40f-4367-873d-c51b6f29310b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:45:19 np0005588920 nova_compute[226886]: 2026-01-20 14:45:19.920 226890 DEBUG nova.compute.manager [req-c55d9ca3-4dfc-4f8b-bada-a015e8a3d29f req-dc5abf14-e02c-4b0a-bf86-f23475aeb56d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] No waiting events found dispatching network-vif-plugged-4de545f7-326a-4971-87cd-a23be2cbce6a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:45:19 np0005588920 nova_compute[226886]: 2026-01-20 14:45:19.920 226890 WARNING nova.compute.manager [req-c55d9ca3-4dfc-4f8b-bada-a015e8a3d29f req-dc5abf14-e02c-4b0a-bf86-f23475aeb56d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Received unexpected event network-vif-plugged-4de545f7-326a-4971-87cd-a23be2cbce6a for instance with vm_state stopped and task_state None.#033[00m
Jan 20 09:45:19 np0005588920 nova_compute[226886]: 2026-01-20 14:45:19.960 226890 INFO nova.virt.libvirt.driver [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Creating config drive at /var/lib/nova/instances/75736b87-b14e-45b7-b43b-5129cf7d3279/disk.config#033[00m
Jan 20 09:45:19 np0005588920 nova_compute[226886]: 2026-01-20 14:45:19.968 226890 DEBUG oslo_concurrency.processutils [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/75736b87-b14e-45b7-b43b-5129cf7d3279/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpn_kcmztl execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:45:20 np0005588920 nova_compute[226886]: 2026-01-20 14:45:20.095 226890 DEBUG oslo_concurrency.processutils [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/75736b87-b14e-45b7-b43b-5129cf7d3279/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpn_kcmztl" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:45:20 np0005588920 nova_compute[226886]: 2026-01-20 14:45:20.145 226890 DEBUG nova.storage.rbd_utils [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] rbd image 75736b87-b14e-45b7-b43b-5129cf7d3279_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:45:20 np0005588920 nova_compute[226886]: 2026-01-20 14:45:20.149 226890 DEBUG oslo_concurrency.processutils [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/75736b87-b14e-45b7-b43b-5129cf7d3279/disk.config 75736b87-b14e-45b7-b43b-5129cf7d3279_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:45:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:20.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:20 np0005588920 nova_compute[226886]: 2026-01-20 14:45:20.475 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:20 np0005588920 nova_compute[226886]: 2026-01-20 14:45:20.491 226890 DEBUG nova.network.neutron [req-fc444285-0239-454c-bf62-8e6889364690 req-443e9b4e-2a7d-44fe-9b0d-c73732422f7c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Updated VIF entry in instance network info cache for port d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:45:20 np0005588920 nova_compute[226886]: 2026-01-20 14:45:20.492 226890 DEBUG nova.network.neutron [req-fc444285-0239-454c-bf62-8e6889364690 req-443e9b4e-2a7d-44fe-9b0d-c73732422f7c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Updating instance_info_cache with network_info: [{"id": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "address": "fa:16:3e:22:f9:d2", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a9a684-c9", "ovs_interfaceid": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:45:20 np0005588920 nova_compute[226886]: 2026-01-20 14:45:20.520 226890 DEBUG oslo_concurrency.lockutils [req-fc444285-0239-454c-bf62-8e6889364690 req-443e9b4e-2a7d-44fe-9b0d-c73732422f7c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-75736b87-b14e-45b7-b43b-5129cf7d3279" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:45:20 np0005588920 nova_compute[226886]: 2026-01-20 14:45:20.878 226890 DEBUG oslo_concurrency.processutils [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/75736b87-b14e-45b7-b43b-5129cf7d3279/disk.config 75736b87-b14e-45b7-b43b-5129cf7d3279_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.729s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:45:20 np0005588920 nova_compute[226886]: 2026-01-20 14:45:20.878 226890 INFO nova.virt.libvirt.driver [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Deleting local config drive /var/lib/nova/instances/75736b87-b14e-45b7-b43b-5129cf7d3279/disk.config because it was imported into RBD.#033[00m
Jan 20 09:45:20 np0005588920 kernel: tapd3a9a684-c9: entered promiscuous mode
Jan 20 09:45:20 np0005588920 NetworkManager[49076]: <info>  [1768920320.9328] manager: (tapd3a9a684-c9): new Tun device (/org/freedesktop/NetworkManager/Devices/188)
Jan 20 09:45:20 np0005588920 ovn_controller[133971]: 2026-01-20T14:45:20Z|00357|binding|INFO|Claiming lport d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 for this chassis.
Jan 20 09:45:20 np0005588920 ovn_controller[133971]: 2026-01-20T14:45:20Z|00358|binding|INFO|d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6: Claiming fa:16:3e:22:f9:d2 10.100.0.4
Jan 20 09:45:20 np0005588920 nova_compute[226886]: 2026-01-20 14:45:20.955 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:20 np0005588920 systemd-machined[196121]: New machine qemu-39-instance-0000005e.
Jan 20 09:45:20 np0005588920 ovn_controller[133971]: 2026-01-20T14:45:20Z|00359|binding|INFO|Setting lport d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 ovn-installed in OVS
Jan 20 09:45:20 np0005588920 ovn_controller[133971]: 2026-01-20T14:45:20Z|00360|binding|INFO|Setting lport d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 up in Southbound
Jan 20 09:45:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:20.986 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:22:f9:d2 10.100.0.4'], port_security=['fa:16:3e:22:f9:d2 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '75736b87-b14e-45b7-b43b-5129cf7d3279', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-762e1859-4db4-4d9e-b66f-d50316f80df4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1c5f03d46c0c4162a3b2f1530850bb6c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '80535eda-fa59-4edc-8e3d-9bfea6517730', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2474a8ca-bb96-4cae-9133-23419b81a9fc, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:45:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:20.987 144128 INFO neutron.agent.ovn.metadata.agent [-] Port d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 in datapath 762e1859-4db4-4d9e-b66f-d50316f80df4 bound to our chassis#033[00m
Jan 20 09:45:20 np0005588920 nova_compute[226886]: 2026-01-20 14:45:20.987 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:20 np0005588920 systemd[1]: Started Virtual Machine qemu-39-instance-0000005e.
Jan 20 09:45:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:20.989 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 762e1859-4db4-4d9e-b66f-d50316f80df4#033[00m
Jan 20 09:45:20 np0005588920 nova_compute[226886]: 2026-01-20 14:45:20.994 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:20 np0005588920 nova_compute[226886]: 2026-01-20 14:45:20.997 226890 INFO nova.compute.manager [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Rebuilding instance#033[00m
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:21.000 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5ba1de15-a524-4bd4-b425-249f6f2136dc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:21.001 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap762e1859-41 in ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:21.003 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap762e1859-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:21.003 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a38b8d0c-5620-4673-90ff-0f06659531c9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:21.004 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c2a339b2-a6d3-4c7d-8390-3ed20ba71ec3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:21 np0005588920 systemd-udevd[259203]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:45:21 np0005588920 NetworkManager[49076]: <info>  [1768920321.0211] device (tapd3a9a684-c9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:45:21 np0005588920 NetworkManager[49076]: <info>  [1768920321.0218] device (tapd3a9a684-c9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:21.019 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[8150a690-8929-42b2-ab7c-7e2998074969]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:21.047 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[591fc4f6-46c9-42f1-be83-d5ab14d670b1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:21.092 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[54727105-6314-494d-a88c-9afbf6318bec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:21 np0005588920 NetworkManager[49076]: <info>  [1768920321.1019] manager: (tap762e1859-40): new Veth device (/org/freedesktop/NetworkManager/Devices/189)
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:21.101 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[70d07f38-0394-459f-b500-fbcd60dc7731]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:21 np0005588920 systemd-udevd[259207]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:21.141 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[f039c8ca-c5c5-4aaf-a6c7-1c82cd5edf56]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:21.145 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[f95685e7-2932-4599-81e0-d03ff4c83176]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:21 np0005588920 NetworkManager[49076]: <info>  [1768920321.1679] device (tap762e1859-40): carrier: link connected
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:21.172 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[99c43909-51cd-4462-94d3-549a8891e9df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:21.192 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b0190b80-f931-4aaa-922b-7f2d436c4ede]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap762e1859-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:69:f1:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 118], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 536285, 'reachable_time': 29177, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 259238, 'error': None, 'target': 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:21.207 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[45c267e8-c432-4ccb-8d90-887ca6160c5f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe69:f1da'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 536285, 'tstamp': 536285}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 259239, 'error': None, 'target': 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:21.223 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[31aed81d-fd81-4b57-82f1-21b9d971c1a1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap762e1859-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:69:f1:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 118], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 536285, 'reachable_time': 29177, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 259240, 'error': None, 'target': 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:21.251 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a4c5cc1d-1c16-4fae-badc-849ac28c87f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:21.306 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5b898ca5-4054-4024-97c4-f3d437348c8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:21.307 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap762e1859-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:21.308 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:21.308 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap762e1859-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.309 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:21 np0005588920 NetworkManager[49076]: <info>  [1768920321.3107] manager: (tap762e1859-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/190)
Jan 20 09:45:21 np0005588920 kernel: tap762e1859-40: entered promiscuous mode
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.316 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:21.319 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap762e1859-40, col_values=(('external_ids', {'iface-id': '9e775c45-1646-436d-a0cb-a5b5ec356e1b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.320 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:21 np0005588920 ovn_controller[133971]: 2026-01-20T14:45:21Z|00361|binding|INFO|Releasing lport 9e775c45-1646-436d-a0cb-a5b5ec356e1b from this chassis (sb_readonly=0)
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.335 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.340 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:21.340 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/762e1859-4db4-4d9e-b66f-d50316f80df4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/762e1859-4db4-4d9e-b66f-d50316f80df4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:21.341 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[064a6f00-b006-4ef6-b1e8-b3df3eb25aed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:21.342 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-762e1859-4db4-4d9e-b66f-d50316f80df4
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/762e1859-4db4-4d9e-b66f-d50316f80df4.pid.haproxy
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID 762e1859-4db4-4d9e-b66f-d50316f80df4
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:45:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:21.343 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'env', 'PROCESS_TAG=haproxy-762e1859-4db4-4d9e-b66f-d50316f80df4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/762e1859-4db4-4d9e-b66f-d50316f80df4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.370 226890 DEBUG nova.objects.instance [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lazy-loading 'trusted_certs' on Instance uuid cc7de61a-b40f-4367-873d-c51b6f29310b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.384 226890 DEBUG nova.compute.manager [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.420 226890 DEBUG nova.objects.instance [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lazy-loading 'pci_requests' on Instance uuid cc7de61a-b40f-4367-873d-c51b6f29310b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.429 226890 DEBUG nova.objects.instance [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lazy-loading 'pci_devices' on Instance uuid cc7de61a-b40f-4367-873d-c51b6f29310b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.440 226890 DEBUG nova.objects.instance [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lazy-loading 'resources' on Instance uuid cc7de61a-b40f-4367-873d-c51b6f29310b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.452 226890 DEBUG nova.objects.instance [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lazy-loading 'migration_context' on Instance uuid cc7de61a-b40f-4367-873d-c51b6f29310b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.464 226890 DEBUG nova.objects.instance [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.469 226890 INFO nova.virt.libvirt.driver [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Instance already shutdown.#033[00m
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.476 226890 INFO nova.virt.libvirt.driver [-] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Instance destroyed successfully.#033[00m
Jan 20 09:45:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:21.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.484 226890 INFO nova.virt.libvirt.driver [-] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Instance destroyed successfully.#033[00m
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.485 226890 DEBUG nova.virt.libvirt.vif [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:44:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1867010105',display_name='tempest-tempest.common.compute-instance-1867010105',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1867010105',id=90,image_ref='26699514-f465-4b50-98b7-36f2cfc6a308',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:44:48Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='b683fcc0026242e28ba6d8fba638688e',ramdisk_id='',reservation_id='r-kc7qns9k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='26699514-f465-4b50-98b7-36f2cfc6a308',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-967087071',owner_user_name='tempest-ServerActionsTestOtherA-967087071-project-me
mber'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:45:20Z,user_data=None,user_id='869086208e10436c9dc96c78bee9a85d',uuid=cc7de61a-b40f-4367-873d-c51b6f29310b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "4de545f7-326a-4971-87cd-a23be2cbce6a", "address": "fa:16:3e:19:9b:8e", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4de545f7-32", "ovs_interfaceid": "4de545f7-326a-4971-87cd-a23be2cbce6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.486 226890 DEBUG nova.network.os_vif_util [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Converting VIF {"id": "4de545f7-326a-4971-87cd-a23be2cbce6a", "address": "fa:16:3e:19:9b:8e", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4de545f7-32", "ovs_interfaceid": "4de545f7-326a-4971-87cd-a23be2cbce6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.487 226890 DEBUG nova.network.os_vif_util [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:9b:8e,bridge_name='br-int',has_traffic_filtering=True,id=4de545f7-326a-4971-87cd-a23be2cbce6a,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4de545f7-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.487 226890 DEBUG os_vif [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:9b:8e,bridge_name='br-int',has_traffic_filtering=True,id=4de545f7-326a-4971-87cd-a23be2cbce6a,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4de545f7-32') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.489 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.490 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4de545f7-32, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.492 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.493 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.496 226890 INFO os_vif [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:9b:8e,bridge_name='br-int',has_traffic_filtering=True,id=4de545f7-326a-4971-87cd-a23be2cbce6a,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4de545f7-32')#033[00m
Jan 20 09:45:21 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:45:21 np0005588920 podman[259329]: 2026-01-20 14:45:21.764482186 +0000 UTC m=+0.079540658 container create aaac9033398261ad9a81d39e95ab7ae4331e49107137a31dd2e895fa3cd73f34 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.784 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920321.7837224, 75736b87-b14e-45b7-b43b-5129cf7d3279 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.786 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] VM Started (Lifecycle Event)#033[00m
Jan 20 09:45:21 np0005588920 systemd[1]: Started libpod-conmon-aaac9033398261ad9a81d39e95ab7ae4331e49107137a31dd2e895fa3cd73f34.scope.
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.805 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.809 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920321.7838244, 75736b87-b14e-45b7-b43b-5129cf7d3279 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.809 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:45:21 np0005588920 podman[259329]: 2026-01-20 14:45:21.7186878 +0000 UTC m=+0.033746292 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.834 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:45:21 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.839 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:45:21 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/48df6bf2eed0decf8cd70a6cf35494807ab4e0f52710a6f2cf43bdce0e1b5452/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.861 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:45:21 np0005588920 podman[259329]: 2026-01-20 14:45:21.870162432 +0000 UTC m=+0.185220934 container init aaac9033398261ad9a81d39e95ab7ae4331e49107137a31dd2e895fa3cd73f34 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:45:21 np0005588920 podman[259329]: 2026-01-20 14:45:21.87655271 +0000 UTC m=+0.191611172 container start aaac9033398261ad9a81d39e95ab7ae4331e49107137a31dd2e895fa3cd73f34 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.881 226890 DEBUG nova.compute.manager [req-f6cb1ecb-3d48-44fd-99c1-0d737a9c9329 req-ad87a46e-30ce-4313-b92c-e2f4d7f2593b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Received event network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.881 226890 DEBUG oslo_concurrency.lockutils [req-f6cb1ecb-3d48-44fd-99c1-0d737a9c9329 req-ad87a46e-30ce-4313-b92c-e2f4d7f2593b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.882 226890 DEBUG oslo_concurrency.lockutils [req-f6cb1ecb-3d48-44fd-99c1-0d737a9c9329 req-ad87a46e-30ce-4313-b92c-e2f4d7f2593b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.882 226890 DEBUG oslo_concurrency.lockutils [req-f6cb1ecb-3d48-44fd-99c1-0d737a9c9329 req-ad87a46e-30ce-4313-b92c-e2f4d7f2593b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.882 226890 DEBUG nova.compute.manager [req-f6cb1ecb-3d48-44fd-99c1-0d737a9c9329 req-ad87a46e-30ce-4313-b92c-e2f4d7f2593b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Processing event network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.883 226890 DEBUG nova.compute.manager [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.887 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920321.8871877, 75736b87-b14e-45b7-b43b-5129cf7d3279 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.887 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.889 226890 DEBUG nova.virt.libvirt.driver [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.893 226890 INFO nova.virt.libvirt.driver [-] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Instance spawned successfully.#033[00m
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.894 226890 DEBUG nova.virt.libvirt.driver [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:45:21 np0005588920 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[259350]: [NOTICE]   (259354) : New worker (259356) forked
Jan 20 09:45:21 np0005588920 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[259350]: [NOTICE]   (259354) : Loading success.
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.909 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.916 226890 DEBUG nova.virt.libvirt.driver [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.916 226890 DEBUG nova.virt.libvirt.driver [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.917 226890 DEBUG nova.virt.libvirt.driver [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.917 226890 DEBUG nova.virt.libvirt.driver [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.918 226890 DEBUG nova.virt.libvirt.driver [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.918 226890 DEBUG nova.virt.libvirt.driver [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.921 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.949 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.986 226890 INFO nova.compute.manager [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Took 9.56 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:45:21 np0005588920 nova_compute[226886]: 2026-01-20 14:45:21.987 226890 DEBUG nova.compute.manager [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:45:22 np0005588920 nova_compute[226886]: 2026-01-20 14:45:22.054 226890 INFO nova.compute.manager [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Took 10.68 seconds to build instance.#033[00m
Jan 20 09:45:22 np0005588920 nova_compute[226886]: 2026-01-20 14:45:22.074 226890 DEBUG oslo_concurrency.lockutils [None req-7c7ebffe-eee5-49a0-b357-de8488ee798b 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.779s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:45:22 np0005588920 nova_compute[226886]: 2026-01-20 14:45:22.131 226890 INFO nova.virt.libvirt.driver [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Deleting instance files /var/lib/nova/instances/cc7de61a-b40f-4367-873d-c51b6f29310b_del#033[00m
Jan 20 09:45:22 np0005588920 nova_compute[226886]: 2026-01-20 14:45:22.132 226890 INFO nova.virt.libvirt.driver [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Deletion of /var/lib/nova/instances/cc7de61a-b40f-4367-873d-c51b6f29310b_del complete#033[00m
Jan 20 09:45:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:22.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:22 np0005588920 nova_compute[226886]: 2026-01-20 14:45:22.298 226890 DEBUG nova.virt.libvirt.driver [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:45:22 np0005588920 nova_compute[226886]: 2026-01-20 14:45:22.298 226890 INFO nova.virt.libvirt.driver [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Creating image(s)#033[00m
Jan 20 09:45:22 np0005588920 nova_compute[226886]: 2026-01-20 14:45:22.319 226890 DEBUG nova.storage.rbd_utils [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] rbd image cc7de61a-b40f-4367-873d-c51b6f29310b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:45:22 np0005588920 nova_compute[226886]: 2026-01-20 14:45:22.344 226890 DEBUG nova.storage.rbd_utils [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] rbd image cc7de61a-b40f-4367-873d-c51b6f29310b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:45:22 np0005588920 nova_compute[226886]: 2026-01-20 14:45:22.368 226890 DEBUG nova.storage.rbd_utils [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] rbd image cc7de61a-b40f-4367-873d-c51b6f29310b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:45:22 np0005588920 nova_compute[226886]: 2026-01-20 14:45:22.371 226890 DEBUG oslo_concurrency.processutils [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:45:22 np0005588920 nova_compute[226886]: 2026-01-20 14:45:22.442 226890 DEBUG oslo_concurrency.processutils [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:45:22 np0005588920 nova_compute[226886]: 2026-01-20 14:45:22.443 226890 DEBUG oslo_concurrency.lockutils [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "a4ed0d2b98aa460c005e878d78a49ccb6f511f7c" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:45:22 np0005588920 nova_compute[226886]: 2026-01-20 14:45:22.444 226890 DEBUG oslo_concurrency.lockutils [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "a4ed0d2b98aa460c005e878d78a49ccb6f511f7c" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:45:22 np0005588920 nova_compute[226886]: 2026-01-20 14:45:22.444 226890 DEBUG oslo_concurrency.lockutils [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "a4ed0d2b98aa460c005e878d78a49ccb6f511f7c" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:45:22 np0005588920 nova_compute[226886]: 2026-01-20 14:45:22.471 226890 DEBUG nova.storage.rbd_utils [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] rbd image cc7de61a-b40f-4367-873d-c51b6f29310b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:45:22 np0005588920 nova_compute[226886]: 2026-01-20 14:45:22.475 226890 DEBUG oslo_concurrency.processutils [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c cc7de61a-b40f-4367-873d-c51b6f29310b_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:45:22 np0005588920 nova_compute[226886]: 2026-01-20 14:45:22.974 226890 DEBUG oslo_concurrency.processutils [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c cc7de61a-b40f-4367-873d-c51b6f29310b_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.499s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:45:23 np0005588920 nova_compute[226886]: 2026-01-20 14:45:23.036 226890 DEBUG nova.storage.rbd_utils [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] resizing rbd image cc7de61a-b40f-4367-873d-c51b6f29310b_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:45:23 np0005588920 nova_compute[226886]: 2026-01-20 14:45:23.129 226890 DEBUG nova.virt.libvirt.driver [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:45:23 np0005588920 nova_compute[226886]: 2026-01-20 14:45:23.130 226890 DEBUG nova.virt.libvirt.driver [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Ensure instance console log exists: /var/lib/nova/instances/cc7de61a-b40f-4367-873d-c51b6f29310b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:45:23 np0005588920 nova_compute[226886]: 2026-01-20 14:45:23.130 226890 DEBUG oslo_concurrency.lockutils [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:45:23 np0005588920 nova_compute[226886]: 2026-01-20 14:45:23.131 226890 DEBUG oslo_concurrency.lockutils [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:45:23 np0005588920 nova_compute[226886]: 2026-01-20 14:45:23.131 226890 DEBUG oslo_concurrency.lockutils [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:45:23 np0005588920 nova_compute[226886]: 2026-01-20 14:45:23.133 226890 DEBUG nova.virt.libvirt.driver [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Start _get_guest_xml network_info=[{"id": "4de545f7-326a-4971-87cd-a23be2cbce6a", "address": "fa:16:3e:19:9b:8e", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4de545f7-32", "ovs_interfaceid": "4de545f7-326a-4971-87cd-a23be2cbce6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:22:02Z,direct_url=<?>,disk_format='qcow2',id=26699514-f465-4b50-98b7-36f2cfc6a308,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:04Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:45:23 np0005588920 nova_compute[226886]: 2026-01-20 14:45:23.136 226890 WARNING nova.virt.libvirt.driver [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Jan 20 09:45:23 np0005588920 nova_compute[226886]: 2026-01-20 14:45:23.141 226890 DEBUG nova.virt.libvirt.host [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:45:23 np0005588920 nova_compute[226886]: 2026-01-20 14:45:23.142 226890 DEBUG nova.virt.libvirt.host [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:45:23 np0005588920 nova_compute[226886]: 2026-01-20 14:45:23.145 226890 DEBUG nova.virt.libvirt.host [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:45:23 np0005588920 nova_compute[226886]: 2026-01-20 14:45:23.145 226890 DEBUG nova.virt.libvirt.host [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:45:23 np0005588920 nova_compute[226886]: 2026-01-20 14:45:23.146 226890 DEBUG nova.virt.libvirt.driver [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:45:23 np0005588920 nova_compute[226886]: 2026-01-20 14:45:23.146 226890 DEBUG nova.virt.hardware [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:22:02Z,direct_url=<?>,disk_format='qcow2',id=26699514-f465-4b50-98b7-36f2cfc6a308,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:04Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:45:23 np0005588920 nova_compute[226886]: 2026-01-20 14:45:23.147 226890 DEBUG nova.virt.hardware [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:45:23 np0005588920 nova_compute[226886]: 2026-01-20 14:45:23.147 226890 DEBUG nova.virt.hardware [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:45:23 np0005588920 nova_compute[226886]: 2026-01-20 14:45:23.147 226890 DEBUG nova.virt.hardware [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:45:23 np0005588920 nova_compute[226886]: 2026-01-20 14:45:23.147 226890 DEBUG nova.virt.hardware [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:45:23 np0005588920 nova_compute[226886]: 2026-01-20 14:45:23.147 226890 DEBUG nova.virt.hardware [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:45:23 np0005588920 nova_compute[226886]: 2026-01-20 14:45:23.148 226890 DEBUG nova.virt.hardware [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:45:23 np0005588920 nova_compute[226886]: 2026-01-20 14:45:23.148 226890 DEBUG nova.virt.hardware [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:45:23 np0005588920 nova_compute[226886]: 2026-01-20 14:45:23.148 226890 DEBUG nova.virt.hardware [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:45:23 np0005588920 nova_compute[226886]: 2026-01-20 14:45:23.148 226890 DEBUG nova.virt.hardware [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:45:23 np0005588920 nova_compute[226886]: 2026-01-20 14:45:23.148 226890 DEBUG nova.virt.hardware [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:45:23 np0005588920 nova_compute[226886]: 2026-01-20 14:45:23.148 226890 DEBUG nova.objects.instance [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lazy-loading 'vcpu_model' on Instance uuid cc7de61a-b40f-4367-873d-c51b6f29310b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:45:23 np0005588920 nova_compute[226886]: 2026-01-20 14:45:23.163 226890 DEBUG oslo_concurrency.processutils [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:45:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:23.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:23 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:45:23 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1557210698' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:45:23 np0005588920 nova_compute[226886]: 2026-01-20 14:45:23.610 226890 DEBUG oslo_concurrency.processutils [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:45:23 np0005588920 nova_compute[226886]: 2026-01-20 14:45:23.633 226890 DEBUG nova.storage.rbd_utils [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] rbd image cc7de61a-b40f-4367-873d-c51b6f29310b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:45:23 np0005588920 nova_compute[226886]: 2026-01-20 14:45:23.637 226890 DEBUG oslo_concurrency.processutils [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:45:23 np0005588920 nova_compute[226886]: 2026-01-20 14:45:23.985 226890 DEBUG nova.compute.manager [req-082b1b91-ab52-4372-a5ba-cf6c18aaf395 req-71c1c15c-091c-4340-8ee4-d68f99c61bd9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Received event network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:45:23 np0005588920 nova_compute[226886]: 2026-01-20 14:45:23.986 226890 DEBUG oslo_concurrency.lockutils [req-082b1b91-ab52-4372-a5ba-cf6c18aaf395 req-71c1c15c-091c-4340-8ee4-d68f99c61bd9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:45:23 np0005588920 nova_compute[226886]: 2026-01-20 14:45:23.986 226890 DEBUG oslo_concurrency.lockutils [req-082b1b91-ab52-4372-a5ba-cf6c18aaf395 req-71c1c15c-091c-4340-8ee4-d68f99c61bd9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:45:23 np0005588920 nova_compute[226886]: 2026-01-20 14:45:23.986 226890 DEBUG oslo_concurrency.lockutils [req-082b1b91-ab52-4372-a5ba-cf6c18aaf395 req-71c1c15c-091c-4340-8ee4-d68f99c61bd9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:45:23 np0005588920 nova_compute[226886]: 2026-01-20 14:45:23.986 226890 DEBUG nova.compute.manager [req-082b1b91-ab52-4372-a5ba-cf6c18aaf395 req-71c1c15c-091c-4340-8ee4-d68f99c61bd9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] No waiting events found dispatching network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:45:23 np0005588920 nova_compute[226886]: 2026-01-20 14:45:23.986 226890 WARNING nova.compute.manager [req-082b1b91-ab52-4372-a5ba-cf6c18aaf395 req-71c1c15c-091c-4340-8ee4-d68f99c61bd9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Received unexpected event network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:45:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:24.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:24 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:45:24 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4037203044' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:45:24 np0005588920 nova_compute[226886]: 2026-01-20 14:45:24.302 226890 DEBUG oslo_concurrency.processutils [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.666s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:45:24 np0005588920 nova_compute[226886]: 2026-01-20 14:45:24.305 226890 DEBUG nova.virt.libvirt.vif [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-20T14:44:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1867010105',display_name='tempest-tempest.common.compute-instance-1867010105',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1867010105',id=90,image_ref='26699514-f465-4b50-98b7-36f2cfc6a308',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:44:48Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='b683fcc0026242e28ba6d8fba638688e',ramdisk_id='',reservation_id='r-kc7qns9k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='26699514-f465-4b50-98b7-36f2cfc6a308',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-967087071',owner_user_name='tempest-ServerActionsTestOtherA-967087071-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:45:22Z,user_data=None,user_id='869086208e10436c9dc96c78bee9a85d',uuid=cc7de61a-b40f-4367-873d-c51b6f29310b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "4de545f7-326a-4971-87cd-a23be2cbce6a", "address": "fa:16:3e:19:9b:8e", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4de545f7-32", "ovs_interfaceid": "4de545f7-326a-4971-87cd-a23be2cbce6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:45:24 np0005588920 nova_compute[226886]: 2026-01-20 14:45:24.305 226890 DEBUG nova.network.os_vif_util [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Converting VIF {"id": "4de545f7-326a-4971-87cd-a23be2cbce6a", "address": "fa:16:3e:19:9b:8e", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4de545f7-32", "ovs_interfaceid": "4de545f7-326a-4971-87cd-a23be2cbce6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:45:24 np0005588920 nova_compute[226886]: 2026-01-20 14:45:24.307 226890 DEBUG nova.network.os_vif_util [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:9b:8e,bridge_name='br-int',has_traffic_filtering=True,id=4de545f7-326a-4971-87cd-a23be2cbce6a,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4de545f7-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:45:24 np0005588920 nova_compute[226886]: 2026-01-20 14:45:24.312 226890 DEBUG nova.virt.libvirt.driver [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:45:24 np0005588920 nova_compute[226886]:  <uuid>cc7de61a-b40f-4367-873d-c51b6f29310b</uuid>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:  <name>instance-0000005a</name>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:45:24 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:      <nova:name>tempest-tempest.common.compute-instance-1867010105</nova:name>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:45:23</nova:creationTime>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:45:24 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:        <nova:user uuid="869086208e10436c9dc96c78bee9a85d">tempest-ServerActionsTestOtherA-967087071-project-member</nova:user>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:        <nova:project uuid="b683fcc0026242e28ba6d8fba638688e">tempest-ServerActionsTestOtherA-967087071</nova:project>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="26699514-f465-4b50-98b7-36f2cfc6a308"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:        <nova:port uuid="4de545f7-326a-4971-87cd-a23be2cbce6a">
Jan 20 09:45:24 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:      <entry name="serial">cc7de61a-b40f-4367-873d-c51b6f29310b</entry>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:      <entry name="uuid">cc7de61a-b40f-4367-873d-c51b6f29310b</entry>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:45:24 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/cc7de61a-b40f-4367-873d-c51b6f29310b_disk">
Jan 20 09:45:24 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:45:24 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:45:24 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/cc7de61a-b40f-4367-873d-c51b6f29310b_disk.config">
Jan 20 09:45:24 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:45:24 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:45:24 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:19:9b:8e"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:      <target dev="tap4de545f7-32"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:45:24 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/cc7de61a-b40f-4367-873d-c51b6f29310b/console.log" append="off"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:45:24 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:45:24 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:45:24 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:45:24 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:45:24 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:45:24 np0005588920 nova_compute[226886]: 2026-01-20 14:45:24.314 226890 DEBUG nova.compute.manager [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Preparing to wait for external event network-vif-plugged-4de545f7-326a-4971-87cd-a23be2cbce6a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:45:24 np0005588920 nova_compute[226886]: 2026-01-20 14:45:24.315 226890 DEBUG oslo_concurrency.lockutils [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "cc7de61a-b40f-4367-873d-c51b6f29310b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:45:24 np0005588920 nova_compute[226886]: 2026-01-20 14:45:24.315 226890 DEBUG oslo_concurrency.lockutils [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "cc7de61a-b40f-4367-873d-c51b6f29310b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:45:24 np0005588920 nova_compute[226886]: 2026-01-20 14:45:24.316 226890 DEBUG oslo_concurrency.lockutils [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "cc7de61a-b40f-4367-873d-c51b6f29310b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:45:24 np0005588920 nova_compute[226886]: 2026-01-20 14:45:24.317 226890 DEBUG nova.virt.libvirt.vif [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-20T14:44:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1867010105',display_name='tempest-tempest.common.compute-instance-1867010105',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1867010105',id=90,image_ref='26699514-f465-4b50-98b7-36f2cfc6a308',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:44:48Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='b683fcc0026242e28ba6d8fba638688e',ramdisk_id='',reservation_id='r-kc7qns9k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='26699514-f465-4b50-98b7-36f2cfc6a308',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-967087071',owner_user_name='tempest-ServerActionsTestOtherA-967087071-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:45:22Z,user_data=None,user_id='869086208e10436c9dc96c78bee9a85d',uuid=cc7de61a-b40f-4367-873d-c51b6f29310b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "4de545f7-326a-4971-87cd-a23be2cbce6a", "address": "fa:16:3e:19:9b:8e", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4de545f7-32", "ovs_interfaceid": "4de545f7-326a-4971-87cd-a23be2cbce6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:45:24 np0005588920 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 20 09:45:24 np0005588920 nova_compute[226886]: 2026-01-20 14:45:24.317 226890 DEBUG nova.network.os_vif_util [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Converting VIF {"id": "4de545f7-326a-4971-87cd-a23be2cbce6a", "address": "fa:16:3e:19:9b:8e", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4de545f7-32", "ovs_interfaceid": "4de545f7-326a-4971-87cd-a23be2cbce6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:45:24 np0005588920 nova_compute[226886]: 2026-01-20 14:45:24.319 226890 DEBUG nova.network.os_vif_util [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:9b:8e,bridge_name='br-int',has_traffic_filtering=True,id=4de545f7-326a-4971-87cd-a23be2cbce6a,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4de545f7-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:45:24 np0005588920 nova_compute[226886]: 2026-01-20 14:45:24.320 226890 DEBUG os_vif [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:9b:8e,bridge_name='br-int',has_traffic_filtering=True,id=4de545f7-326a-4971-87cd-a23be2cbce6a,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4de545f7-32') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:45:24 np0005588920 nova_compute[226886]: 2026-01-20 14:45:24.321 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:24 np0005588920 nova_compute[226886]: 2026-01-20 14:45:24.322 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:45:24 np0005588920 nova_compute[226886]: 2026-01-20 14:45:24.323 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:45:24 np0005588920 nova_compute[226886]: 2026-01-20 14:45:24.325 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:24 np0005588920 nova_compute[226886]: 2026-01-20 14:45:24.326 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4de545f7-32, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:45:24 np0005588920 nova_compute[226886]: 2026-01-20 14:45:24.327 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4de545f7-32, col_values=(('external_ids', {'iface-id': '4de545f7-326a-4971-87cd-a23be2cbce6a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:19:9b:8e', 'vm-uuid': 'cc7de61a-b40f-4367-873d-c51b6f29310b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:45:24 np0005588920 nova_compute[226886]: 2026-01-20 14:45:24.329 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:24 np0005588920 NetworkManager[49076]: <info>  [1768920324.3298] manager: (tap4de545f7-32): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/191)
Jan 20 09:45:24 np0005588920 nova_compute[226886]: 2026-01-20 14:45:24.333 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:45:24 np0005588920 nova_compute[226886]: 2026-01-20 14:45:24.335 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:24 np0005588920 nova_compute[226886]: 2026-01-20 14:45:24.337 226890 INFO os_vif [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:9b:8e,bridge_name='br-int',has_traffic_filtering=True,id=4de545f7-326a-4971-87cd-a23be2cbce6a,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4de545f7-32')#033[00m
Jan 20 09:45:24 np0005588920 nova_compute[226886]: 2026-01-20 14:45:24.440 226890 DEBUG nova.virt.libvirt.driver [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:45:24 np0005588920 nova_compute[226886]: 2026-01-20 14:45:24.441 226890 DEBUG nova.virt.libvirt.driver [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:45:24 np0005588920 nova_compute[226886]: 2026-01-20 14:45:24.441 226890 DEBUG nova.virt.libvirt.driver [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] No VIF found with MAC fa:16:3e:19:9b:8e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:45:24 np0005588920 nova_compute[226886]: 2026-01-20 14:45:24.442 226890 INFO nova.virt.libvirt.driver [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Using config drive#033[00m
Jan 20 09:45:24 np0005588920 nova_compute[226886]: 2026-01-20 14:45:24.471 226890 DEBUG nova.storage.rbd_utils [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] rbd image cc7de61a-b40f-4367-873d-c51b6f29310b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:45:24 np0005588920 nova_compute[226886]: 2026-01-20 14:45:24.487 226890 DEBUG nova.objects.instance [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lazy-loading 'ec2_ids' on Instance uuid cc7de61a-b40f-4367-873d-c51b6f29310b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:45:24 np0005588920 nova_compute[226886]: 2026-01-20 14:45:24.521 226890 DEBUG nova.objects.instance [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lazy-loading 'keypairs' on Instance uuid cc7de61a-b40f-4367-873d-c51b6f29310b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:45:24 np0005588920 nova_compute[226886]: 2026-01-20 14:45:24.857 226890 INFO nova.virt.libvirt.driver [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Creating config drive at /var/lib/nova/instances/cc7de61a-b40f-4367-873d-c51b6f29310b/disk.config#033[00m
Jan 20 09:45:24 np0005588920 nova_compute[226886]: 2026-01-20 14:45:24.862 226890 DEBUG oslo_concurrency.processutils [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/cc7de61a-b40f-4367-873d-c51b6f29310b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpop_7ygxd execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:45:24 np0005588920 nova_compute[226886]: 2026-01-20 14:45:24.990 226890 DEBUG oslo_concurrency.processutils [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/cc7de61a-b40f-4367-873d-c51b6f29310b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpop_7ygxd" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:45:25 np0005588920 nova_compute[226886]: 2026-01-20 14:45:25.428 226890 DEBUG nova.storage.rbd_utils [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] rbd image cc7de61a-b40f-4367-873d-c51b6f29310b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:45:25 np0005588920 nova_compute[226886]: 2026-01-20 14:45:25.431 226890 DEBUG oslo_concurrency.processutils [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/cc7de61a-b40f-4367-873d-c51b6f29310b/disk.config cc7de61a-b40f-4367-873d-c51b6f29310b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:45:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:25.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:25 np0005588920 nova_compute[226886]: 2026-01-20 14:45:25.513 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:25 np0005588920 nova_compute[226886]: 2026-01-20 14:45:25.676 226890 DEBUG oslo_concurrency.processutils [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/cc7de61a-b40f-4367-873d-c51b6f29310b/disk.config cc7de61a-b40f-4367-873d-c51b6f29310b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.244s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:45:25 np0005588920 nova_compute[226886]: 2026-01-20 14:45:25.677 226890 INFO nova.virt.libvirt.driver [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Deleting local config drive /var/lib/nova/instances/cc7de61a-b40f-4367-873d-c51b6f29310b/disk.config because it was imported into RBD.#033[00m
Jan 20 09:45:25 np0005588920 kernel: tap4de545f7-32: entered promiscuous mode
Jan 20 09:45:25 np0005588920 NetworkManager[49076]: <info>  [1768920325.7281] manager: (tap4de545f7-32): new Tun device (/org/freedesktop/NetworkManager/Devices/192)
Jan 20 09:45:25 np0005588920 ovn_controller[133971]: 2026-01-20T14:45:25Z|00362|binding|INFO|Claiming lport 4de545f7-326a-4971-87cd-a23be2cbce6a for this chassis.
Jan 20 09:45:25 np0005588920 nova_compute[226886]: 2026-01-20 14:45:25.730 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:25 np0005588920 ovn_controller[133971]: 2026-01-20T14:45:25Z|00363|binding|INFO|4de545f7-326a-4971-87cd-a23be2cbce6a: Claiming fa:16:3e:19:9b:8e 10.100.0.11
Jan 20 09:45:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:25.741 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:19:9b:8e 10.100.0.11'], port_security=['fa:16:3e:19:9b:8e 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'cc7de61a-b40f-4367-873d-c51b6f29310b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b683fcc0026242e28ba6d8fba638688e', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'ac411cec-795a-42a6-ba83-9468a87a4a14', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=361f9a69-30a6-4be4-89ad-2a8f92877af2, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=4de545f7-326a-4971-87cd-a23be2cbce6a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:45:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:25.742 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 4de545f7-326a-4971-87cd-a23be2cbce6a in datapath a19e9d1a-864f-41ee-bdea-188e65973ea5 bound to our chassis#033[00m
Jan 20 09:45:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:25.743 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a19e9d1a-864f-41ee-bdea-188e65973ea5#033[00m
Jan 20 09:45:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:25.757 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[39f086b1-de10-4e40-89d7-6918b7e1031a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:25.758 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa19e9d1a-81 in ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:45:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:25.760 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa19e9d1a-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:45:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:25.760 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[7295261d-821c-4c88-b153-0c13fe190215]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:25 np0005588920 ovn_controller[133971]: 2026-01-20T14:45:25Z|00364|binding|INFO|Setting lport 4de545f7-326a-4971-87cd-a23be2cbce6a ovn-installed in OVS
Jan 20 09:45:25 np0005588920 ovn_controller[133971]: 2026-01-20T14:45:25Z|00365|binding|INFO|Setting lport 4de545f7-326a-4971-87cd-a23be2cbce6a up in Southbound
Jan 20 09:45:25 np0005588920 nova_compute[226886]: 2026-01-20 14:45:25.764 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:25.764 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[6fa15ddf-df63-474b-b064-9b5dc3738420]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:25 np0005588920 systemd-udevd[259671]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:45:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:25.777 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[902b8e59-59fd-4176-9ba0-d64993ca12aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:25 np0005588920 systemd-machined[196121]: New machine qemu-40-instance-0000005a.
Jan 20 09:45:25 np0005588920 NetworkManager[49076]: <info>  [1768920325.7873] device (tap4de545f7-32): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:45:25 np0005588920 NetworkManager[49076]: <info>  [1768920325.7886] device (tap4de545f7-32): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:45:25 np0005588920 systemd[1]: Started Virtual Machine qemu-40-instance-0000005a.
Jan 20 09:45:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:25.802 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c5eda45a-b55f-4919-9823-167dc7941b2f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:25.830 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[db2047f7-409b-4b28-97cb-4dda14f029c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:25.837 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d2b4d32d-fc8c-43be-a5fa-8838ac7b1dda]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:25 np0005588920 NetworkManager[49076]: <info>  [1768920325.8386] manager: (tapa19e9d1a-80): new Veth device (/org/freedesktop/NetworkManager/Devices/193)
Jan 20 09:45:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:25.872 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[7d28d03a-5978-4213-90c7-6dbb1a70667e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:25.874 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[ddf5805a-cdcb-4ccf-a179-bb87a1ad138b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:25 np0005588920 NetworkManager[49076]: <info>  [1768920325.8984] device (tapa19e9d1a-80): carrier: link connected
Jan 20 09:45:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:25.906 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[b19410d6-4fbe-47be-9b97-a997e03c81af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:25.922 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[1e625bd9-6571-43d3-8e7d-ed6adf15c3d1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa19e9d1a-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:21:53:13'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 120], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 536758, 'reachable_time': 26433, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 259704, 'error': None, 'target': 'ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:25.937 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c6c0be0a-40a1-47cb-879a-9739a1789915]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe21:5313'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 536758, 'tstamp': 536758}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 259705, 'error': None, 'target': 'ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:25.951 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[6eddf0a9-14d9-4628-8663-4af0779fa9c8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa19e9d1a-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:21:53:13'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 120], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 536758, 'reachable_time': 26433, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 259706, 'error': None, 'target': 'ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:25 np0005588920 nova_compute[226886]: 2026-01-20 14:45:25.964 226890 DEBUG nova.compute.manager [req-335953fa-f105-4763-9fea-2306733159a1 req-5a70efd4-81e7-4a59-a730-08923e2d089f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Received event network-vif-plugged-4de545f7-326a-4971-87cd-a23be2cbce6a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:45:25 np0005588920 nova_compute[226886]: 2026-01-20 14:45:25.965 226890 DEBUG oslo_concurrency.lockutils [req-335953fa-f105-4763-9fea-2306733159a1 req-5a70efd4-81e7-4a59-a730-08923e2d089f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "cc7de61a-b40f-4367-873d-c51b6f29310b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:45:25 np0005588920 nova_compute[226886]: 2026-01-20 14:45:25.965 226890 DEBUG oslo_concurrency.lockutils [req-335953fa-f105-4763-9fea-2306733159a1 req-5a70efd4-81e7-4a59-a730-08923e2d089f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "cc7de61a-b40f-4367-873d-c51b6f29310b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:45:25 np0005588920 nova_compute[226886]: 2026-01-20 14:45:25.965 226890 DEBUG oslo_concurrency.lockutils [req-335953fa-f105-4763-9fea-2306733159a1 req-5a70efd4-81e7-4a59-a730-08923e2d089f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "cc7de61a-b40f-4367-873d-c51b6f29310b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:45:25 np0005588920 nova_compute[226886]: 2026-01-20 14:45:25.965 226890 DEBUG nova.compute.manager [req-335953fa-f105-4763-9fea-2306733159a1 req-5a70efd4-81e7-4a59-a730-08923e2d089f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Processing event network-vif-plugged-4de545f7-326a-4971-87cd-a23be2cbce6a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:45:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:25.983 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2e33b18b-43e0-4d40-a034-ae41bd8dd10d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:26.036 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[6742865c-8e3a-4e1a-9a8b-bfbf138c0332]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:26.038 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa19e9d1a-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:45:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:26.039 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:45:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:26.039 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa19e9d1a-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:45:26 np0005588920 NetworkManager[49076]: <info>  [1768920326.0426] manager: (tapa19e9d1a-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/194)
Jan 20 09:45:26 np0005588920 nova_compute[226886]: 2026-01-20 14:45:26.042 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:26 np0005588920 kernel: tapa19e9d1a-80: entered promiscuous mode
Jan 20 09:45:26 np0005588920 nova_compute[226886]: 2026-01-20 14:45:26.045 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:26.048 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa19e9d1a-80, col_values=(('external_ids', {'iface-id': '5527ab8d-a985-420b-9d5b-7e5d9baf7004'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:45:26 np0005588920 nova_compute[226886]: 2026-01-20 14:45:26.050 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:26 np0005588920 ovn_controller[133971]: 2026-01-20T14:45:26Z|00366|binding|INFO|Releasing lport 5527ab8d-a985-420b-9d5b-7e5d9baf7004 from this chassis (sb_readonly=0)
Jan 20 09:45:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:26.054 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a19e9d1a-864f-41ee-bdea-188e65973ea5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a19e9d1a-864f-41ee-bdea-188e65973ea5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:45:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:26.055 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2acf5877-65f9-48d5-beab-e61c3673127e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:26.056 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:45:26 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 09:45:26 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 09:45:26 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-a19e9d1a-864f-41ee-bdea-188e65973ea5
Jan 20 09:45:26 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 09:45:26 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 09:45:26 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 09:45:26 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/a19e9d1a-864f-41ee-bdea-188e65973ea5.pid.haproxy
Jan 20 09:45:26 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 09:45:26 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:45:26 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 09:45:26 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 09:45:26 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 09:45:26 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 09:45:26 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 09:45:26 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 09:45:26 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 09:45:26 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 09:45:26 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 09:45:26 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 09:45:26 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 09:45:26 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 09:45:26 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 09:45:26 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:45:26 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:45:26 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 09:45:26 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 09:45:26 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:45:26 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID a19e9d1a-864f-41ee-bdea-188e65973ea5
Jan 20 09:45:26 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:45:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:26.058 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'env', 'PROCESS_TAG=haproxy-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a19e9d1a-864f-41ee-bdea-188e65973ea5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:45:26 np0005588920 nova_compute[226886]: 2026-01-20 14:45:26.069 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:26 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #76. Immutable memtables: 0.
Jan 20 09:45:26 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:45:26.081293) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 09:45:26 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 45] Flushing memtable with next log file: 76
Jan 20 09:45:26 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920326081327, "job": 45, "event": "flush_started", "num_memtables": 1, "num_entries": 2337, "num_deletes": 251, "total_data_size": 5186721, "memory_usage": 5269680, "flush_reason": "Manual Compaction"}
Jan 20 09:45:26 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 45] Level-0 flush table #77: started
Jan 20 09:45:26 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920326097371, "cf_name": "default", "job": 45, "event": "table_file_creation", "file_number": 77, "file_size": 3397927, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 39218, "largest_seqno": 41550, "table_properties": {"data_size": 3388811, "index_size": 5546, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 20370, "raw_average_key_size": 20, "raw_value_size": 3370006, "raw_average_value_size": 3400, "num_data_blocks": 242, "num_entries": 991, "num_filter_entries": 991, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768920134, "oldest_key_time": 1768920134, "file_creation_time": 1768920326, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 77, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:45:26 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 45] Flush lasted 16133 microseconds, and 6817 cpu microseconds.
Jan 20 09:45:26 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:45:26 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:45:26.097423) [db/flush_job.cc:967] [default] [JOB 45] Level-0 flush table #77: 3397927 bytes OK
Jan 20 09:45:26 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:45:26.097444) [db/memtable_list.cc:519] [default] Level-0 commit table #77 started
Jan 20 09:45:26 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:45:26.099170) [db/memtable_list.cc:722] [default] Level-0 commit table #77: memtable #1 done
Jan 20 09:45:26 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:45:26.099210) EVENT_LOG_v1 {"time_micros": 1768920326099205, "job": 45, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 09:45:26 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:45:26.099226) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 09:45:26 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 45] Try to delete WAL files size 5176303, prev total WAL file size 5176303, number of live WAL files 2.
Jan 20 09:45:26 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000073.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:45:26 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:45:26.100297) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033323633' seq:72057594037927935, type:22 .. '7061786F730033353135' seq:0, type:0; will stop at (end)
Jan 20 09:45:26 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 46] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 09:45:26 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 45 Base level 0, inputs: [77(3318KB)], [75(9562KB)]
Jan 20 09:45:26 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920326100337, "job": 46, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [77], "files_L6": [75], "score": -1, "input_data_size": 13190372, "oldest_snapshot_seqno": -1}
Jan 20 09:45:26 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 46] Generated table #78: 6697 keys, 11261864 bytes, temperature: kUnknown
Jan 20 09:45:26 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920326233986, "cf_name": "default", "job": 46, "event": "table_file_creation", "file_number": 78, "file_size": 11261864, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11216076, "index_size": 27948, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16773, "raw_key_size": 171825, "raw_average_key_size": 25, "raw_value_size": 11095111, "raw_average_value_size": 1656, "num_data_blocks": 1115, "num_entries": 6697, "num_filter_entries": 6697, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768920326, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 78, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:45:26 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:45:26 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:45:26.234269) [db/compaction/compaction_job.cc:1663] [default] [JOB 46] Compacted 1@0 + 1@6 files to L6 => 11261864 bytes
Jan 20 09:45:26 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:45:26.236255) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 98.6 rd, 84.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.2, 9.3 +0.0 blob) out(10.7 +0.0 blob), read-write-amplify(7.2) write-amplify(3.3) OK, records in: 7212, records dropped: 515 output_compression: NoCompression
Jan 20 09:45:26 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:45:26.236278) EVENT_LOG_v1 {"time_micros": 1768920326236268, "job": 46, "event": "compaction_finished", "compaction_time_micros": 133732, "compaction_time_cpu_micros": 25985, "output_level": 6, "num_output_files": 1, "total_output_size": 11261864, "num_input_records": 7212, "num_output_records": 6697, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 09:45:26 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000077.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:45:26 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920326236862, "job": 46, "event": "table_file_deletion", "file_number": 77}
Jan 20 09:45:26 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000075.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:45:26 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920326238660, "job": 46, "event": "table_file_deletion", "file_number": 75}
Jan 20 09:45:26 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:45:26.100228) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:45:26 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:45:26.238710) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:45:26 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:45:26.238715) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:45:26 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:45:26.238716) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:45:26 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:45:26.238718) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:45:26 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:45:26.238719) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:45:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:45:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:26.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:45:26 np0005588920 nova_compute[226886]: 2026-01-20 14:45:26.295 226890 DEBUG nova.compute.manager [req-579e879c-8b07-4983-b4b5-2fc4b2bd3c42 req-8d2f526a-7229-4da8-a8cf-d1d16cebab12 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Received event network-changed-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:45:26 np0005588920 nova_compute[226886]: 2026-01-20 14:45:26.296 226890 DEBUG nova.compute.manager [req-579e879c-8b07-4983-b4b5-2fc4b2bd3c42 req-8d2f526a-7229-4da8-a8cf-d1d16cebab12 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Refreshing instance network info cache due to event network-changed-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:45:26 np0005588920 nova_compute[226886]: 2026-01-20 14:45:26.296 226890 DEBUG oslo_concurrency.lockutils [req-579e879c-8b07-4983-b4b5-2fc4b2bd3c42 req-8d2f526a-7229-4da8-a8cf-d1d16cebab12 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-75736b87-b14e-45b7-b43b-5129cf7d3279" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:45:26 np0005588920 nova_compute[226886]: 2026-01-20 14:45:26.297 226890 DEBUG oslo_concurrency.lockutils [req-579e879c-8b07-4983-b4b5-2fc4b2bd3c42 req-8d2f526a-7229-4da8-a8cf-d1d16cebab12 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-75736b87-b14e-45b7-b43b-5129cf7d3279" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:45:26 np0005588920 nova_compute[226886]: 2026-01-20 14:45:26.297 226890 DEBUG nova.network.neutron [req-579e879c-8b07-4983-b4b5-2fc4b2bd3c42 req-8d2f526a-7229-4da8-a8cf-d1d16cebab12 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Refreshing network info cache for port d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:45:26 np0005588920 nova_compute[226886]: 2026-01-20 14:45:26.373 226890 DEBUG nova.compute.manager [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:45:26 np0005588920 nova_compute[226886]: 2026-01-20 14:45:26.374 226890 DEBUG nova.virt.libvirt.host [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Removed pending event for cc7de61a-b40f-4367-873d-c51b6f29310b due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 20 09:45:26 np0005588920 nova_compute[226886]: 2026-01-20 14:45:26.375 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920326.3741481, cc7de61a-b40f-4367-873d-c51b6f29310b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:45:26 np0005588920 nova_compute[226886]: 2026-01-20 14:45:26.375 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] VM Started (Lifecycle Event)#033[00m
Jan 20 09:45:26 np0005588920 nova_compute[226886]: 2026-01-20 14:45:26.379 226890 DEBUG nova.virt.libvirt.driver [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:45:26 np0005588920 nova_compute[226886]: 2026-01-20 14:45:26.386 226890 INFO nova.virt.libvirt.driver [-] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Instance spawned successfully.#033[00m
Jan 20 09:45:26 np0005588920 nova_compute[226886]: 2026-01-20 14:45:26.386 226890 DEBUG nova.virt.libvirt.driver [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:45:26 np0005588920 nova_compute[226886]: 2026-01-20 14:45:26.409 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:45:26 np0005588920 nova_compute[226886]: 2026-01-20 14:45:26.413 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: stopped, current task_state: rebuild_spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:45:26 np0005588920 nova_compute[226886]: 2026-01-20 14:45:26.416 226890 DEBUG nova.virt.libvirt.driver [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:45:26 np0005588920 nova_compute[226886]: 2026-01-20 14:45:26.417 226890 DEBUG nova.virt.libvirt.driver [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:45:26 np0005588920 nova_compute[226886]: 2026-01-20 14:45:26.417 226890 DEBUG nova.virt.libvirt.driver [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:45:26 np0005588920 nova_compute[226886]: 2026-01-20 14:45:26.417 226890 DEBUG nova.virt.libvirt.driver [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:45:26 np0005588920 nova_compute[226886]: 2026-01-20 14:45:26.418 226890 DEBUG nova.virt.libvirt.driver [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:45:26 np0005588920 nova_compute[226886]: 2026-01-20 14:45:26.418 226890 DEBUG nova.virt.libvirt.driver [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:45:26 np0005588920 podman[259777]: 2026-01-20 14:45:26.428346947 +0000 UTC m=+0.064977923 container create 0399e8990e63e24c5fe5f327bb1380db307ce2674a0bec97118fc9cbd7ceb5dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 20 09:45:26 np0005588920 nova_compute[226886]: 2026-01-20 14:45:26.439 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 20 09:45:26 np0005588920 nova_compute[226886]: 2026-01-20 14:45:26.439 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920326.374882, cc7de61a-b40f-4367-873d-c51b6f29310b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:45:26 np0005588920 nova_compute[226886]: 2026-01-20 14:45:26.441 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:45:26 np0005588920 nova_compute[226886]: 2026-01-20 14:45:26.462 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:45:26 np0005588920 systemd[1]: Started libpod-conmon-0399e8990e63e24c5fe5f327bb1380db307ce2674a0bec97118fc9cbd7ceb5dd.scope.
Jan 20 09:45:26 np0005588920 nova_compute[226886]: 2026-01-20 14:45:26.467 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920326.3783169, cc7de61a-b40f-4367-873d-c51b6f29310b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:45:26 np0005588920 nova_compute[226886]: 2026-01-20 14:45:26.467 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:45:26 np0005588920 nova_compute[226886]: 2026-01-20 14:45:26.470 226890 DEBUG nova.compute.manager [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:45:26 np0005588920 nova_compute[226886]: 2026-01-20 14:45:26.482 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:45:26 np0005588920 nova_compute[226886]: 2026-01-20 14:45:26.484 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: rebuild_spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:45:26 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:45:26 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9cb2ff7f18e2161082b161abc65eeae2be1f694c223f062d2a4294cde62396ba/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:45:26 np0005588920 podman[259777]: 2026-01-20 14:45:26.403226047 +0000 UTC m=+0.039857043 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:45:26 np0005588920 podman[259777]: 2026-01-20 14:45:26.503850612 +0000 UTC m=+0.140481608 container init 0399e8990e63e24c5fe5f327bb1380db307ce2674a0bec97118fc9cbd7ceb5dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 20 09:45:26 np0005588920 podman[259777]: 2026-01-20 14:45:26.508888652 +0000 UTC m=+0.145519628 container start 0399e8990e63e24c5fe5f327bb1380db307ce2674a0bec97118fc9cbd7ceb5dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 20 09:45:26 np0005588920 nova_compute[226886]: 2026-01-20 14:45:26.514 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 20 09:45:26 np0005588920 nova_compute[226886]: 2026-01-20 14:45:26.518 226890 INFO nova.compute.manager [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] bringing vm to original state: 'stopped'#033[00m
Jan 20 09:45:26 np0005588920 neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5[259791]: [NOTICE]   (259795) : New worker (259797) forked
Jan 20 09:45:26 np0005588920 neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5[259791]: [NOTICE]   (259795) : Loading success.
Jan 20 09:45:26 np0005588920 nova_compute[226886]: 2026-01-20 14:45:26.588 226890 DEBUG oslo_concurrency.lockutils [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "cc7de61a-b40f-4367-873d-c51b6f29310b" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:45:26 np0005588920 nova_compute[226886]: 2026-01-20 14:45:26.589 226890 DEBUG oslo_concurrency.lockutils [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "cc7de61a-b40f-4367-873d-c51b6f29310b" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:45:26 np0005588920 nova_compute[226886]: 2026-01-20 14:45:26.589 226890 DEBUG nova.compute.manager [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:45:26 np0005588920 nova_compute[226886]: 2026-01-20 14:45:26.592 226890 DEBUG nova.compute.manager [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m
Jan 20 09:45:26 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:45:26 np0005588920 nova_compute[226886]: 2026-01-20 14:45:26.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:45:26 np0005588920 nova_compute[226886]: 2026-01-20 14:45:26.726 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:45:26 np0005588920 nova_compute[226886]: 2026-01-20 14:45:26.727 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 09:45:26 np0005588920 kernel: tap4de545f7-32 (unregistering): left promiscuous mode
Jan 20 09:45:26 np0005588920 NetworkManager[49076]: <info>  [1768920326.8460] device (tap4de545f7-32): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:45:26 np0005588920 nova_compute[226886]: 2026-01-20 14:45:26.857 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:26 np0005588920 ovn_controller[133971]: 2026-01-20T14:45:26Z|00367|binding|INFO|Releasing lport 4de545f7-326a-4971-87cd-a23be2cbce6a from this chassis (sb_readonly=0)
Jan 20 09:45:26 np0005588920 ovn_controller[133971]: 2026-01-20T14:45:26Z|00368|binding|INFO|Setting lport 4de545f7-326a-4971-87cd-a23be2cbce6a down in Southbound
Jan 20 09:45:26 np0005588920 ovn_controller[133971]: 2026-01-20T14:45:26Z|00369|binding|INFO|Removing iface tap4de545f7-32 ovn-installed in OVS
Jan 20 09:45:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:26.864 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:19:9b:8e 10.100.0.11'], port_security=['fa:16:3e:19:9b:8e 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'cc7de61a-b40f-4367-873d-c51b6f29310b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b683fcc0026242e28ba6d8fba638688e', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'ac411cec-795a-42a6-ba83-9468a87a4a14', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=361f9a69-30a6-4be4-89ad-2a8f92877af2, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=4de545f7-326a-4971-87cd-a23be2cbce6a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:45:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:26.866 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 4de545f7-326a-4971-87cd-a23be2cbce6a in datapath a19e9d1a-864f-41ee-bdea-188e65973ea5 unbound from our chassis#033[00m
Jan 20 09:45:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:26.867 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a19e9d1a-864f-41ee-bdea-188e65973ea5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:45:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:26.868 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b73207c9-6ebb-4b7e-b06b-45d9128790ba]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:26.868 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5 namespace which is not needed anymore#033[00m
Jan 20 09:45:26 np0005588920 nova_compute[226886]: 2026-01-20 14:45:26.886 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:26 np0005588920 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d0000005a.scope: Deactivated successfully.
Jan 20 09:45:26 np0005588920 systemd-machined[196121]: Machine qemu-40-instance-0000005a terminated.
Jan 20 09:45:26 np0005588920 nova_compute[226886]: 2026-01-20 14:45:26.950 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "refresh_cache-75736b87-b14e-45b7-b43b-5129cf7d3279" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:45:27 np0005588920 nova_compute[226886]: 2026-01-20 14:45:27.037 226890 INFO nova.virt.libvirt.driver [-] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Instance destroyed successfully.#033[00m
Jan 20 09:45:27 np0005588920 nova_compute[226886]: 2026-01-20 14:45:27.037 226890 DEBUG nova.compute.manager [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:45:27 np0005588920 nova_compute[226886]: 2026-01-20 14:45:27.106 226890 DEBUG oslo_concurrency.lockutils [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "cc7de61a-b40f-4367-873d-c51b6f29310b" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 0.517s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:45:27 np0005588920 nova_compute[226886]: 2026-01-20 14:45:27.150 226890 DEBUG oslo_concurrency.lockutils [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:45:27 np0005588920 nova_compute[226886]: 2026-01-20 14:45:27.151 226890 DEBUG oslo_concurrency.lockutils [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:45:27 np0005588920 nova_compute[226886]: 2026-01-20 14:45:27.151 226890 DEBUG nova.objects.instance [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 20 09:45:27 np0005588920 neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5[259791]: [NOTICE]   (259795) : haproxy version is 2.8.14-c23fe91
Jan 20 09:45:27 np0005588920 neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5[259791]: [NOTICE]   (259795) : path to executable is /usr/sbin/haproxy
Jan 20 09:45:27 np0005588920 neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5[259791]: [WARNING]  (259795) : Exiting Master process...
Jan 20 09:45:27 np0005588920 neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5[259791]: [ALERT]    (259795) : Current worker (259797) exited with code 143 (Terminated)
Jan 20 09:45:27 np0005588920 neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5[259791]: [WARNING]  (259795) : All workers exited. Exiting... (0)
Jan 20 09:45:27 np0005588920 systemd[1]: libpod-0399e8990e63e24c5fe5f327bb1380db307ce2674a0bec97118fc9cbd7ceb5dd.scope: Deactivated successfully.
Jan 20 09:45:27 np0005588920 podman[259827]: 2026-01-20 14:45:27.208509274 +0000 UTC m=+0.256891061 container died 0399e8990e63e24c5fe5f327bb1380db307ce2674a0bec97118fc9cbd7ceb5dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 20 09:45:27 np0005588920 nova_compute[226886]: 2026-01-20 14:45:27.222 226890 DEBUG oslo_concurrency.lockutils [None req-b706857f-4b7b-40eb-bcd3-59fb5cbecb9d 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.071s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:45:27 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0399e8990e63e24c5fe5f327bb1380db307ce2674a0bec97118fc9cbd7ceb5dd-userdata-shm.mount: Deactivated successfully.
Jan 20 09:45:27 np0005588920 systemd[1]: var-lib-containers-storage-overlay-9cb2ff7f18e2161082b161abc65eeae2be1f694c223f062d2a4294cde62396ba-merged.mount: Deactivated successfully.
Jan 20 09:45:27 np0005588920 podman[259827]: 2026-01-20 14:45:27.268820565 +0000 UTC m=+0.317202392 container cleanup 0399e8990e63e24c5fe5f327bb1380db307ce2674a0bec97118fc9cbd7ceb5dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:45:27 np0005588920 systemd[1]: libpod-conmon-0399e8990e63e24c5fe5f327bb1380db307ce2674a0bec97118fc9cbd7ceb5dd.scope: Deactivated successfully.
Jan 20 09:45:27 np0005588920 podman[259869]: 2026-01-20 14:45:27.336652606 +0000 UTC m=+0.039209244 container remove 0399e8990e63e24c5fe5f327bb1380db307ce2674a0bec97118fc9cbd7ceb5dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 20 09:45:27 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:27.343 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2f6e9f05-a61e-4528-b8b5-0786e4368102]: (4, ('Tue Jan 20 02:45:26 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5 (0399e8990e63e24c5fe5f327bb1380db307ce2674a0bec97118fc9cbd7ceb5dd)\n0399e8990e63e24c5fe5f327bb1380db307ce2674a0bec97118fc9cbd7ceb5dd\nTue Jan 20 02:45:27 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5 (0399e8990e63e24c5fe5f327bb1380db307ce2674a0bec97118fc9cbd7ceb5dd)\n0399e8990e63e24c5fe5f327bb1380db307ce2674a0bec97118fc9cbd7ceb5dd\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:27 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:27.345 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5412db3d-612c-4e7a-80e8-20595d97028f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:27 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:27.346 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa19e9d1a-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:45:27 np0005588920 nova_compute[226886]: 2026-01-20 14:45:27.349 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:27 np0005588920 kernel: tapa19e9d1a-80: left promiscuous mode
Jan 20 09:45:27 np0005588920 nova_compute[226886]: 2026-01-20 14:45:27.367 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:27 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:27.371 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[4be9b817-b25b-4ed4-9eb8-0c966d787ed8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:27 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:27.391 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0258d7b5-e724-4d7e-9e71-e09b47274d95]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:27 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:27.393 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[13f21e10-843e-48d4-9e9e-bc35cc861913]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:27 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:27.408 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a2f7e233-ee9d-4671-bef6-abff3a04a976]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 536751, 'reachable_time': 24108, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 259886, 'error': None, 'target': 'ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:27 np0005588920 systemd[1]: run-netns-ovnmeta\x2da19e9d1a\x2d864f\x2d41ee\x2dbdea\x2d188e65973ea5.mount: Deactivated successfully.
Jan 20 09:45:27 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:27.410 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:45:27 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:27.410 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[4cb9c74c-70ec-4f99-83c1-b59a9826705e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:27.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:27 np0005588920 nova_compute[226886]: 2026-01-20 14:45:27.709 226890 DEBUG nova.network.neutron [req-579e879c-8b07-4983-b4b5-2fc4b2bd3c42 req-8d2f526a-7229-4da8-a8cf-d1d16cebab12 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Updated VIF entry in instance network info cache for port d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:45:27 np0005588920 nova_compute[226886]: 2026-01-20 14:45:27.710 226890 DEBUG nova.network.neutron [req-579e879c-8b07-4983-b4b5-2fc4b2bd3c42 req-8d2f526a-7229-4da8-a8cf-d1d16cebab12 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Updating instance_info_cache with network_info: [{"id": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "address": "fa:16:3e:22:f9:d2", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a9a684-c9", "ovs_interfaceid": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:45:27 np0005588920 nova_compute[226886]: 2026-01-20 14:45:27.744 226890 DEBUG oslo_concurrency.lockutils [req-579e879c-8b07-4983-b4b5-2fc4b2bd3c42 req-8d2f526a-7229-4da8-a8cf-d1d16cebab12 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-75736b87-b14e-45b7-b43b-5129cf7d3279" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:45:27 np0005588920 nova_compute[226886]: 2026-01-20 14:45:27.745 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquired lock "refresh_cache-75736b87-b14e-45b7-b43b-5129cf7d3279" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:45:27 np0005588920 nova_compute[226886]: 2026-01-20 14:45:27.745 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 09:45:27 np0005588920 nova_compute[226886]: 2026-01-20 14:45:27.745 226890 DEBUG nova.objects.instance [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 75736b87-b14e-45b7-b43b-5129cf7d3279 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:45:28 np0005588920 nova_compute[226886]: 2026-01-20 14:45:28.055 226890 DEBUG nova.compute.manager [req-541eb22b-3dab-4867-b835-252b19039c5f req-73084f13-40ee-4d11-9176-7641d9a184ea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Received event network-vif-plugged-4de545f7-326a-4971-87cd-a23be2cbce6a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:45:28 np0005588920 nova_compute[226886]: 2026-01-20 14:45:28.058 226890 DEBUG oslo_concurrency.lockutils [req-541eb22b-3dab-4867-b835-252b19039c5f req-73084f13-40ee-4d11-9176-7641d9a184ea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "cc7de61a-b40f-4367-873d-c51b6f29310b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:45:28 np0005588920 nova_compute[226886]: 2026-01-20 14:45:28.058 226890 DEBUG oslo_concurrency.lockutils [req-541eb22b-3dab-4867-b835-252b19039c5f req-73084f13-40ee-4d11-9176-7641d9a184ea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "cc7de61a-b40f-4367-873d-c51b6f29310b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:45:28 np0005588920 nova_compute[226886]: 2026-01-20 14:45:28.058 226890 DEBUG oslo_concurrency.lockutils [req-541eb22b-3dab-4867-b835-252b19039c5f req-73084f13-40ee-4d11-9176-7641d9a184ea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "cc7de61a-b40f-4367-873d-c51b6f29310b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:45:28 np0005588920 nova_compute[226886]: 2026-01-20 14:45:28.059 226890 DEBUG nova.compute.manager [req-541eb22b-3dab-4867-b835-252b19039c5f req-73084f13-40ee-4d11-9176-7641d9a184ea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] No waiting events found dispatching network-vif-plugged-4de545f7-326a-4971-87cd-a23be2cbce6a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:45:28 np0005588920 nova_compute[226886]: 2026-01-20 14:45:28.059 226890 WARNING nova.compute.manager [req-541eb22b-3dab-4867-b835-252b19039c5f req-73084f13-40ee-4d11-9176-7641d9a184ea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Received unexpected event network-vif-plugged-4de545f7-326a-4971-87cd-a23be2cbce6a for instance with vm_state stopped and task_state None.#033[00m
Jan 20 09:45:28 np0005588920 nova_compute[226886]: 2026-01-20 14:45:28.060 226890 DEBUG nova.compute.manager [req-541eb22b-3dab-4867-b835-252b19039c5f req-73084f13-40ee-4d11-9176-7641d9a184ea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Received event network-vif-unplugged-4de545f7-326a-4971-87cd-a23be2cbce6a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:45:28 np0005588920 nova_compute[226886]: 2026-01-20 14:45:28.060 226890 DEBUG oslo_concurrency.lockutils [req-541eb22b-3dab-4867-b835-252b19039c5f req-73084f13-40ee-4d11-9176-7641d9a184ea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "cc7de61a-b40f-4367-873d-c51b6f29310b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:45:28 np0005588920 nova_compute[226886]: 2026-01-20 14:45:28.060 226890 DEBUG oslo_concurrency.lockutils [req-541eb22b-3dab-4867-b835-252b19039c5f req-73084f13-40ee-4d11-9176-7641d9a184ea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "cc7de61a-b40f-4367-873d-c51b6f29310b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:45:28 np0005588920 nova_compute[226886]: 2026-01-20 14:45:28.061 226890 DEBUG oslo_concurrency.lockutils [req-541eb22b-3dab-4867-b835-252b19039c5f req-73084f13-40ee-4d11-9176-7641d9a184ea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "cc7de61a-b40f-4367-873d-c51b6f29310b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:45:28 np0005588920 nova_compute[226886]: 2026-01-20 14:45:28.061 226890 DEBUG nova.compute.manager [req-541eb22b-3dab-4867-b835-252b19039c5f req-73084f13-40ee-4d11-9176-7641d9a184ea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] No waiting events found dispatching network-vif-unplugged-4de545f7-326a-4971-87cd-a23be2cbce6a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:45:28 np0005588920 nova_compute[226886]: 2026-01-20 14:45:28.061 226890 WARNING nova.compute.manager [req-541eb22b-3dab-4867-b835-252b19039c5f req-73084f13-40ee-4d11-9176-7641d9a184ea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Received unexpected event network-vif-unplugged-4de545f7-326a-4971-87cd-a23be2cbce6a for instance with vm_state stopped and task_state None.#033[00m
Jan 20 09:45:28 np0005588920 nova_compute[226886]: 2026-01-20 14:45:28.061 226890 DEBUG nova.compute.manager [req-541eb22b-3dab-4867-b835-252b19039c5f req-73084f13-40ee-4d11-9176-7641d9a184ea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Received event network-vif-plugged-4de545f7-326a-4971-87cd-a23be2cbce6a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:45:28 np0005588920 nova_compute[226886]: 2026-01-20 14:45:28.062 226890 DEBUG oslo_concurrency.lockutils [req-541eb22b-3dab-4867-b835-252b19039c5f req-73084f13-40ee-4d11-9176-7641d9a184ea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "cc7de61a-b40f-4367-873d-c51b6f29310b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:45:28 np0005588920 nova_compute[226886]: 2026-01-20 14:45:28.062 226890 DEBUG oslo_concurrency.lockutils [req-541eb22b-3dab-4867-b835-252b19039c5f req-73084f13-40ee-4d11-9176-7641d9a184ea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "cc7de61a-b40f-4367-873d-c51b6f29310b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:45:28 np0005588920 nova_compute[226886]: 2026-01-20 14:45:28.062 226890 DEBUG oslo_concurrency.lockutils [req-541eb22b-3dab-4867-b835-252b19039c5f req-73084f13-40ee-4d11-9176-7641d9a184ea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "cc7de61a-b40f-4367-873d-c51b6f29310b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:45:28 np0005588920 nova_compute[226886]: 2026-01-20 14:45:28.063 226890 DEBUG nova.compute.manager [req-541eb22b-3dab-4867-b835-252b19039c5f req-73084f13-40ee-4d11-9176-7641d9a184ea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] No waiting events found dispatching network-vif-plugged-4de545f7-326a-4971-87cd-a23be2cbce6a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:45:28 np0005588920 nova_compute[226886]: 2026-01-20 14:45:28.063 226890 WARNING nova.compute.manager [req-541eb22b-3dab-4867-b835-252b19039c5f req-73084f13-40ee-4d11-9176-7641d9a184ea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Received unexpected event network-vif-plugged-4de545f7-326a-4971-87cd-a23be2cbce6a for instance with vm_state stopped and task_state None.#033[00m
Jan 20 09:45:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:45:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:28.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:45:29 np0005588920 nova_compute[226886]: 2026-01-20 14:45:29.367 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:29.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:29 np0005588920 nova_compute[226886]: 2026-01-20 14:45:29.649 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Updating instance_info_cache with network_info: [{"id": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "address": "fa:16:3e:22:f9:d2", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a9a684-c9", "ovs_interfaceid": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:45:29 np0005588920 nova_compute[226886]: 2026-01-20 14:45:29.679 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Releasing lock "refresh_cache-75736b87-b14e-45b7-b43b-5129cf7d3279" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:45:29 np0005588920 nova_compute[226886]: 2026-01-20 14:45:29.680 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 09:45:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:30.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:30 np0005588920 nova_compute[226886]: 2026-01-20 14:45:30.316 226890 DEBUG oslo_concurrency.lockutils [None req-ade4c578-759b-412c-933f-59514c15514f 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "cc7de61a-b40f-4367-873d-c51b6f29310b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:45:30 np0005588920 nova_compute[226886]: 2026-01-20 14:45:30.317 226890 DEBUG oslo_concurrency.lockutils [None req-ade4c578-759b-412c-933f-59514c15514f 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "cc7de61a-b40f-4367-873d-c51b6f29310b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:45:30 np0005588920 nova_compute[226886]: 2026-01-20 14:45:30.317 226890 DEBUG oslo_concurrency.lockutils [None req-ade4c578-759b-412c-933f-59514c15514f 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "cc7de61a-b40f-4367-873d-c51b6f29310b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:45:30 np0005588920 nova_compute[226886]: 2026-01-20 14:45:30.317 226890 DEBUG oslo_concurrency.lockutils [None req-ade4c578-759b-412c-933f-59514c15514f 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "cc7de61a-b40f-4367-873d-c51b6f29310b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:45:30 np0005588920 nova_compute[226886]: 2026-01-20 14:45:30.318 226890 DEBUG oslo_concurrency.lockutils [None req-ade4c578-759b-412c-933f-59514c15514f 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "cc7de61a-b40f-4367-873d-c51b6f29310b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:45:30 np0005588920 nova_compute[226886]: 2026-01-20 14:45:30.319 226890 INFO nova.compute.manager [None req-ade4c578-759b-412c-933f-59514c15514f 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Terminating instance#033[00m
Jan 20 09:45:30 np0005588920 nova_compute[226886]: 2026-01-20 14:45:30.320 226890 DEBUG nova.compute.manager [None req-ade4c578-759b-412c-933f-59514c15514f 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:45:30 np0005588920 nova_compute[226886]: 2026-01-20 14:45:30.325 226890 INFO nova.virt.libvirt.driver [-] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Instance destroyed successfully.#033[00m
Jan 20 09:45:30 np0005588920 nova_compute[226886]: 2026-01-20 14:45:30.325 226890 DEBUG nova.objects.instance [None req-ade4c578-759b-412c-933f-59514c15514f 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lazy-loading 'resources' on Instance uuid cc7de61a-b40f-4367-873d-c51b6f29310b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:45:30 np0005588920 nova_compute[226886]: 2026-01-20 14:45:30.340 226890 DEBUG nova.virt.libvirt.vif [None req-ade4c578-759b-412c-933f-59514c15514f 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-20T14:44:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1867010105',display_name='tempest-tempest.common.compute-instance-1867010105',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1867010105',id=90,image_ref='26699514-f465-4b50-98b7-36f2cfc6a308',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:45:26Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='b683fcc0026242e28ba6d8fba638688e',ramdisk_id='',reservation_id='r-kc7qns9k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='26699514-f465-4b50-98b7-36f2cfc6a308',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-967087071',owner_user_name='tempest-ServerActionsTestOtherA-967087071-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:45:27Z,user_data=None,user_id='869086208e10436c9dc96c78bee9a85d',uuid=cc7de61a-b40f-4367-873d-c51b6f29310b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "4de545f7-326a-4971-87cd-a23be2cbce6a", "address": "fa:16:3e:19:9b:8e", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4de545f7-32", "ovs_interfaceid": "4de545f7-326a-4971-87cd-a23be2cbce6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:45:30 np0005588920 nova_compute[226886]: 2026-01-20 14:45:30.341 226890 DEBUG nova.network.os_vif_util [None req-ade4c578-759b-412c-933f-59514c15514f 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Converting VIF {"id": "4de545f7-326a-4971-87cd-a23be2cbce6a", "address": "fa:16:3e:19:9b:8e", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4de545f7-32", "ovs_interfaceid": "4de545f7-326a-4971-87cd-a23be2cbce6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:45:30 np0005588920 nova_compute[226886]: 2026-01-20 14:45:30.342 226890 DEBUG nova.network.os_vif_util [None req-ade4c578-759b-412c-933f-59514c15514f 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:9b:8e,bridge_name='br-int',has_traffic_filtering=True,id=4de545f7-326a-4971-87cd-a23be2cbce6a,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4de545f7-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:45:30 np0005588920 nova_compute[226886]: 2026-01-20 14:45:30.342 226890 DEBUG os_vif [None req-ade4c578-759b-412c-933f-59514c15514f 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:9b:8e,bridge_name='br-int',has_traffic_filtering=True,id=4de545f7-326a-4971-87cd-a23be2cbce6a,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4de545f7-32') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:45:30 np0005588920 nova_compute[226886]: 2026-01-20 14:45:30.344 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:30 np0005588920 nova_compute[226886]: 2026-01-20 14:45:30.344 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4de545f7-32, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:45:30 np0005588920 nova_compute[226886]: 2026-01-20 14:45:30.346 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:30 np0005588920 nova_compute[226886]: 2026-01-20 14:45:30.347 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:30 np0005588920 nova_compute[226886]: 2026-01-20 14:45:30.349 226890 INFO os_vif [None req-ade4c578-759b-412c-933f-59514c15514f 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:9b:8e,bridge_name='br-int',has_traffic_filtering=True,id=4de545f7-326a-4971-87cd-a23be2cbce6a,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4de545f7-32')#033[00m
Jan 20 09:45:30 np0005588920 nova_compute[226886]: 2026-01-20 14:45:30.567 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:30 np0005588920 nova_compute[226886]: 2026-01-20 14:45:30.796 226890 INFO nova.virt.libvirt.driver [None req-ade4c578-759b-412c-933f-59514c15514f 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Deleting instance files /var/lib/nova/instances/cc7de61a-b40f-4367-873d-c51b6f29310b_del#033[00m
Jan 20 09:45:30 np0005588920 nova_compute[226886]: 2026-01-20 14:45:30.797 226890 INFO nova.virt.libvirt.driver [None req-ade4c578-759b-412c-933f-59514c15514f 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Deletion of /var/lib/nova/instances/cc7de61a-b40f-4367-873d-c51b6f29310b_del complete#033[00m
Jan 20 09:45:30 np0005588920 nova_compute[226886]: 2026-01-20 14:45:30.836 226890 INFO nova.compute.manager [None req-ade4c578-759b-412c-933f-59514c15514f 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Took 0.52 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:45:30 np0005588920 nova_compute[226886]: 2026-01-20 14:45:30.837 226890 DEBUG oslo.service.loopingcall [None req-ade4c578-759b-412c-933f-59514c15514f 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:45:30 np0005588920 nova_compute[226886]: 2026-01-20 14:45:30.837 226890 DEBUG nova.compute.manager [-] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:45:30 np0005588920 nova_compute[226886]: 2026-01-20 14:45:30.838 226890 DEBUG nova.network.neutron [-] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:45:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:31.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:31 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:45:31 np0005588920 nova_compute[226886]: 2026-01-20 14:45:31.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:45:31 np0005588920 nova_compute[226886]: 2026-01-20 14:45:31.770 226890 DEBUG nova.network.neutron [-] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:45:31 np0005588920 nova_compute[226886]: 2026-01-20 14:45:31.798 226890 INFO nova.compute.manager [-] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Took 0.96 seconds to deallocate network for instance.#033[00m
Jan 20 09:45:31 np0005588920 nova_compute[226886]: 2026-01-20 14:45:31.859 226890 DEBUG oslo_concurrency.lockutils [None req-ade4c578-759b-412c-933f-59514c15514f 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:45:31 np0005588920 nova_compute[226886]: 2026-01-20 14:45:31.860 226890 DEBUG oslo_concurrency.lockutils [None req-ade4c578-759b-412c-933f-59514c15514f 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:45:31 np0005588920 nova_compute[226886]: 2026-01-20 14:45:31.928 226890 DEBUG oslo_concurrency.processutils [None req-ade4c578-759b-412c-933f-59514c15514f 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:45:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:32.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:45:32 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1381354546' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:45:32 np0005588920 nova_compute[226886]: 2026-01-20 14:45:32.361 226890 DEBUG oslo_concurrency.processutils [None req-ade4c578-759b-412c-933f-59514c15514f 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:45:32 np0005588920 nova_compute[226886]: 2026-01-20 14:45:32.367 226890 DEBUG nova.compute.manager [req-76f1d35a-dfdd-49a8-b543-38912667ea7d req-16088e9c-61fe-4bec-878b-9190fd7b8fc5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Received event network-vif-deleted-4de545f7-326a-4971-87cd-a23be2cbce6a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:45:32 np0005588920 nova_compute[226886]: 2026-01-20 14:45:32.369 226890 DEBUG nova.compute.provider_tree [None req-ade4c578-759b-412c-933f-59514c15514f 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:45:32 np0005588920 nova_compute[226886]: 2026-01-20 14:45:32.392 226890 DEBUG nova.scheduler.client.report [None req-ade4c578-759b-412c-933f-59514c15514f 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:45:32 np0005588920 nova_compute[226886]: 2026-01-20 14:45:32.415 226890 DEBUG oslo_concurrency.lockutils [None req-ade4c578-759b-412c-933f-59514c15514f 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.555s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:45:32 np0005588920 nova_compute[226886]: 2026-01-20 14:45:32.444 226890 INFO nova.scheduler.client.report [None req-ade4c578-759b-412c-933f-59514c15514f 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Deleted allocations for instance cc7de61a-b40f-4367-873d-c51b6f29310b#033[00m
Jan 20 09:45:32 np0005588920 nova_compute[226886]: 2026-01-20 14:45:32.503 226890 DEBUG oslo_concurrency.lockutils [None req-ade4c578-759b-412c-933f-59514c15514f 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "cc7de61a-b40f-4367-873d-c51b6f29310b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.187s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:45:32 np0005588920 nova_compute[226886]: 2026-01-20 14:45:32.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:45:32 np0005588920 nova_compute[226886]: 2026-01-20 14:45:32.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:45:32 np0005588920 nova_compute[226886]: 2026-01-20 14:45:32.777 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:45:32 np0005588920 nova_compute[226886]: 2026-01-20 14:45:32.777 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:45:32 np0005588920 nova_compute[226886]: 2026-01-20 14:45:32.777 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:45:32 np0005588920 nova_compute[226886]: 2026-01-20 14:45:32.778 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:45:32 np0005588920 nova_compute[226886]: 2026-01-20 14:45:32.778 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:45:33 np0005588920 podman[259930]: 2026-01-20 14:45:33.003169447 +0000 UTC m=+0.084087565 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true)
Jan 20 09:45:33 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:45:33 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1341277129' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:45:33 np0005588920 nova_compute[226886]: 2026-01-20 14:45:33.238 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:45:33 np0005588920 nova_compute[226886]: 2026-01-20 14:45:33.313 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-0000005e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:45:33 np0005588920 nova_compute[226886]: 2026-01-20 14:45:33.313 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-0000005e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:45:33 np0005588920 nova_compute[226886]: 2026-01-20 14:45:33.469 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:45:33 np0005588920 nova_compute[226886]: 2026-01-20 14:45:33.470 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4372MB free_disk=20.879981994628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 20 09:45:33 np0005588920 nova_compute[226886]: 2026-01-20 14:45:33.471 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:45:33 np0005588920 nova_compute[226886]: 2026-01-20 14:45:33.471 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:45:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:33.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:33 np0005588920 nova_compute[226886]: 2026-01-20 14:45:33.526 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance 75736b87-b14e-45b7-b43b-5129cf7d3279 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 20 09:45:33 np0005588920 nova_compute[226886]: 2026-01-20 14:45:33.526 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 20 09:45:33 np0005588920 nova_compute[226886]: 2026-01-20 14:45:33.526 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 20 09:45:33 np0005588920 nova_compute[226886]: 2026-01-20 14:45:33.580 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:45:34 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:45:34 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/263069660' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:45:34 np0005588920 nova_compute[226886]: 2026-01-20 14:45:34.079 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.499s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:45:34 np0005588920 nova_compute[226886]: 2026-01-20 14:45:34.084 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 20 09:45:34 np0005588920 nova_compute[226886]: 2026-01-20 14:45:34.100 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 20 09:45:34 np0005588920 nova_compute[226886]: 2026-01-20 14:45:34.120 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 20 09:45:34 np0005588920 nova_compute[226886]: 2026-01-20 14:45:34.120 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.649s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:45:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:34.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:35 np0005588920 nova_compute[226886]: 2026-01-20 14:45:35.120 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 09:45:35 np0005588920 nova_compute[226886]: 2026-01-20 14:45:35.121 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 20 09:45:35 np0005588920 nova_compute[226886]: 2026-01-20 14:45:35.349 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:45:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:35.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:35 np0005588920 nova_compute[226886]: 2026-01-20 14:45:35.604 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:45:35 np0005588920 ovn_controller[133971]: 2026-01-20T14:45:35Z|00048|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:22:f9:d2 10.100.0.4
Jan 20 09:45:35 np0005588920 ovn_controller[133971]: 2026-01-20T14:45:35Z|00049|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:22:f9:d2 10.100.0.4
Jan 20 09:45:35 np0005588920 nova_compute[226886]: 2026-01-20 14:45:35.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 09:45:35 np0005588920 ovn_controller[133971]: 2026-01-20T14:45:35Z|00370|binding|INFO|Releasing lport 9e775c45-1646-436d-a0cb-a5b5ec356e1b from this chassis (sb_readonly=0)
Jan 20 09:45:36 np0005588920 nova_compute[226886]: 2026-01-20 14:45:36.046 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:45:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:36.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:36 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e233 e233: 3 total, 3 up, 3 in
Jan 20 09:45:36 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e233 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:45:36 np0005588920 nova_compute[226886]: 2026-01-20 14:45:36.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 09:45:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:37.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e234 e234: 3 total, 3 up, 3 in
Jan 20 09:45:37 np0005588920 nova_compute[226886]: 2026-01-20 14:45:37.720 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 09:45:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:38.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:39.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:40 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e235 e235: 3 total, 3 up, 3 in
Jan 20 09:45:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:40.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:40 np0005588920 nova_compute[226886]: 2026-01-20 14:45:40.353 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:45:40 np0005588920 nova_compute[226886]: 2026-01-20 14:45:40.604 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:45:40 np0005588920 nova_compute[226886]: 2026-01-20 14:45:40.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 09:45:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:41.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:41 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e235 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:45:42 np0005588920 nova_compute[226886]: 2026-01-20 14:45:42.036 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768920327.034771, cc7de61a-b40f-4367-873d-c51b6f29310b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 20 09:45:42 np0005588920 nova_compute[226886]: 2026-01-20 14:45:42.036 226890 INFO nova.compute.manager [-] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] VM Stopped (Lifecycle Event)
Jan 20 09:45:42 np0005588920 nova_compute[226886]: 2026-01-20 14:45:42.058 226890 DEBUG nova.compute.manager [None req-0e120a17-e448-43ae-8c03-b68f1d60dcfc - - - - - -] [instance: cc7de61a-b40f-4367-873d-c51b6f29310b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 09:45:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:42.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:43.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:44.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:45 np0005588920 nova_compute[226886]: 2026-01-20 14:45:45.151 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:45:45 np0005588920 nova_compute[226886]: 2026-01-20 14:45:45.355 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:45:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:45:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:45.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:45:45 np0005588920 nova_compute[226886]: 2026-01-20 14:45:45.607 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:45:45 np0005588920 podman[259999]: 2026-01-20 14:45:45.955954086 +0000 UTC m=+0.048679688 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 20 09:45:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:46.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:46 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e235 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:45:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:47.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:48.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:48 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e236 e236: 3 total, 3 up, 3 in
Jan 20 09:45:49 np0005588920 nova_compute[226886]: 2026-01-20 14:45:49.087 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:45:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:45:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:49.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:45:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:50.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:50 np0005588920 nova_compute[226886]: 2026-01-20 14:45:50.356 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:45:50 np0005588920 nova_compute[226886]: 2026-01-20 14:45:50.609 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:45:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:50.824 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 20 09:45:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:50.825 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 20 09:45:50 np0005588920 nova_compute[226886]: 2026-01-20 14:45:50.871 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:45:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:51.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:51 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e236 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:45:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:52.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:52 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:45:52.827 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 09:45:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:45:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:53.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:45:53 np0005588920 nova_compute[226886]: 2026-01-20 14:45:53.890 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:45:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:54.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:54 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e237 e237: 3 total, 3 up, 3 in
Jan 20 09:45:55 np0005588920 nova_compute[226886]: 2026-01-20 14:45:55.360 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:45:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:55.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:55 np0005588920 nova_compute[226886]: 2026-01-20 14:45:55.611 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:45:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:56.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:56 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e237 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:45:57 np0005588920 nova_compute[226886]: 2026-01-20 14:45:57.163 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:45:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:57.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:45:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:58.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:45:58 np0005588920 nova_compute[226886]: 2026-01-20 14:45:58.858 226890 DEBUG oslo_concurrency.lockutils [None req-59fa0485-7c25-4f49-8ae6-efcac6f014f6 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "75736b87-b14e-45b7-b43b-5129cf7d3279" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:45:58 np0005588920 nova_compute[226886]: 2026-01-20 14:45:58.859 226890 DEBUG oslo_concurrency.lockutils [None req-59fa0485-7c25-4f49-8ae6-efcac6f014f6 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:45:58 np0005588920 nova_compute[226886]: 2026-01-20 14:45:58.859 226890 INFO nova.compute.manager [None req-59fa0485-7c25-4f49-8ae6-efcac6f014f6 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Rebooting instance
Jan 20 09:45:58 np0005588920 nova_compute[226886]: 2026-01-20 14:45:58.871 226890 DEBUG oslo_concurrency.lockutils [None req-59fa0485-7c25-4f49-8ae6-efcac6f014f6 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "refresh_cache-75736b87-b14e-45b7-b43b-5129cf7d3279" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 20 09:45:58 np0005588920 nova_compute[226886]: 2026-01-20 14:45:58.871 226890 DEBUG oslo_concurrency.lockutils [None req-59fa0485-7c25-4f49-8ae6-efcac6f014f6 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquired lock "refresh_cache-75736b87-b14e-45b7-b43b-5129cf7d3279" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 20 09:45:58 np0005588920 nova_compute[226886]: 2026-01-20 14:45:58.871 226890 DEBUG nova.network.neutron [None req-59fa0485-7c25-4f49-8ae6-efcac6f014f6 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 20 09:45:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:45:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:59.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:46:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:00.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:46:00 np0005588920 nova_compute[226886]: 2026-01-20 14:46:00.362 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:46:00 np0005588920 nova_compute[226886]: 2026-01-20 14:46:00.416 226890 DEBUG nova.network.neutron [None req-59fa0485-7c25-4f49-8ae6-efcac6f014f6 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Updating instance_info_cache with network_info: [{"id": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "address": "fa:16:3e:22:f9:d2", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a9a684-c9", "ovs_interfaceid": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 20 09:46:00 np0005588920 nova_compute[226886]: 2026-01-20 14:46:00.430 226890 DEBUG oslo_concurrency.lockutils [None req-59fa0485-7c25-4f49-8ae6-efcac6f014f6 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Releasing lock "refresh_cache-75736b87-b14e-45b7-b43b-5129cf7d3279" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 20 09:46:00 np0005588920 nova_compute[226886]: 2026-01-20 14:46:00.432 226890 DEBUG nova.compute.manager [None req-59fa0485-7c25-4f49-8ae6-efcac6f014f6 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 09:46:00 np0005588920 nova_compute[226886]: 2026-01-20 14:46:00.613 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:46:00 np0005588920 kernel: tapd3a9a684-c9 (unregistering): left promiscuous mode
Jan 20 09:46:00 np0005588920 NetworkManager[49076]: <info>  [1768920360.6416] device (tapd3a9a684-c9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:46:00 np0005588920 ovn_controller[133971]: 2026-01-20T14:46:00Z|00371|binding|INFO|Releasing lport d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 from this chassis (sb_readonly=0)
Jan 20 09:46:00 np0005588920 ovn_controller[133971]: 2026-01-20T14:46:00Z|00372|binding|INFO|Setting lport d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 down in Southbound
Jan 20 09:46:00 np0005588920 ovn_controller[133971]: 2026-01-20T14:46:00Z|00373|binding|INFO|Removing iface tapd3a9a684-c9 ovn-installed in OVS
Jan 20 09:46:00 np0005588920 nova_compute[226886]: 2026-01-20 14:46:00.660 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:00 np0005588920 nova_compute[226886]: 2026-01-20 14:46:00.664 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:00.670 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:22:f9:d2 10.100.0.4'], port_security=['fa:16:3e:22:f9:d2 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '75736b87-b14e-45b7-b43b-5129cf7d3279', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-762e1859-4db4-4d9e-b66f-d50316f80df4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1c5f03d46c0c4162a3b2f1530850bb6c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '80535eda-fa59-4edc-8e3d-9bfea6517730', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.180'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2474a8ca-bb96-4cae-9133-23419b81a9fc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:46:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:00.671 144128 INFO neutron.agent.ovn.metadata.agent [-] Port d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 in datapath 762e1859-4db4-4d9e-b66f-d50316f80df4 unbound from our chassis#033[00m
Jan 20 09:46:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:00.673 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 762e1859-4db4-4d9e-b66f-d50316f80df4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:46:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:00.675 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[bad0b965-d9d6-41aa-a890-60f9f74a9f30]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:00.676 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 namespace which is not needed anymore#033[00m
Jan 20 09:46:00 np0005588920 nova_compute[226886]: 2026-01-20 14:46:00.683 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:00 np0005588920 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d0000005e.scope: Deactivated successfully.
Jan 20 09:46:00 np0005588920 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d0000005e.scope: Consumed 13.915s CPU time.
Jan 20 09:46:00 np0005588920 systemd-machined[196121]: Machine qemu-39-instance-0000005e terminated.
Jan 20 09:46:00 np0005588920 nova_compute[226886]: 2026-01-20 14:46:00.757 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:00 np0005588920 nova_compute[226886]: 2026-01-20 14:46:00.763 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:00 np0005588920 nova_compute[226886]: 2026-01-20 14:46:00.768 226890 INFO nova.virt.libvirt.driver [-] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Instance destroyed successfully.#033[00m
Jan 20 09:46:00 np0005588920 nova_compute[226886]: 2026-01-20 14:46:00.769 226890 DEBUG nova.objects.instance [None req-59fa0485-7c25-4f49-8ae6-efcac6f014f6 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'resources' on Instance uuid 75736b87-b14e-45b7-b43b-5129cf7d3279 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:46:00 np0005588920 nova_compute[226886]: 2026-01-20 14:46:00.781 226890 DEBUG nova.virt.libvirt.vif [None req-59fa0485-7c25-4f49-8ae6-efcac6f014f6 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:45:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1202945337',display_name='tempest-ServerActionsTestJSON-server-1202945337',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1202945337',id=94,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLgEzx5mLsSqRL7L9WKOzM+WdeJ40U103wY9H3VMZ41G/sN5tQtQSt9lXWKTyc6pt00bfKD0E9GPugNMpy+dzSSpK23o3CgadkzAfAvjQCeCPOSp3fX13FGApomGd1HRCQ==',key_name='tempest-keypair-1602241722',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:45:21Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1c5f03d46c0c4162a3b2f1530850bb6c',ramdisk_id='',reservation_id='r-luaqa362',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1020442335',owner_user_name='tempest-ServerActionsTestJSON-1020442335-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:46:00Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e9278fdb9e645b7938f3edb20c4d3cf',uuid=75736b87-b14e-45b7-b43b-5129cf7d3279,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "address": "fa:16:3e:22:f9:d2", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": 
[], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a9a684-c9", "ovs_interfaceid": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:46:00 np0005588920 nova_compute[226886]: 2026-01-20 14:46:00.782 226890 DEBUG nova.network.os_vif_util [None req-59fa0485-7c25-4f49-8ae6-efcac6f014f6 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converting VIF {"id": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "address": "fa:16:3e:22:f9:d2", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a9a684-c9", "ovs_interfaceid": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:46:00 np0005588920 nova_compute[226886]: 2026-01-20 14:46:00.783 226890 DEBUG nova.network.os_vif_util [None req-59fa0485-7c25-4f49-8ae6-efcac6f014f6 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:22:f9:d2,bridge_name='br-int',has_traffic_filtering=True,id=d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3a9a684-c9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:46:00 np0005588920 nova_compute[226886]: 2026-01-20 14:46:00.783 226890 DEBUG os_vif [None req-59fa0485-7c25-4f49-8ae6-efcac6f014f6 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:22:f9:d2,bridge_name='br-int',has_traffic_filtering=True,id=d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3a9a684-c9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:46:00 np0005588920 nova_compute[226886]: 2026-01-20 14:46:00.785 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:00 np0005588920 nova_compute[226886]: 2026-01-20 14:46:00.785 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd3a9a684-c9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:46:00 np0005588920 nova_compute[226886]: 2026-01-20 14:46:00.786 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:00 np0005588920 nova_compute[226886]: 2026-01-20 14:46:00.788 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:00 np0005588920 nova_compute[226886]: 2026-01-20 14:46:00.790 226890 INFO os_vif [None req-59fa0485-7c25-4f49-8ae6-efcac6f014f6 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:22:f9:d2,bridge_name='br-int',has_traffic_filtering=True,id=d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3a9a684-c9')#033[00m
Jan 20 09:46:00 np0005588920 nova_compute[226886]: 2026-01-20 14:46:00.795 226890 DEBUG nova.virt.libvirt.driver [None req-59fa0485-7c25-4f49-8ae6-efcac6f014f6 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Start _get_guest_xml network_info=[{"id": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "address": "fa:16:3e:22:f9:d2", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a9a684-c9", "ovs_interfaceid": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:46:00 np0005588920 nova_compute[226886]: 2026-01-20 14:46:00.798 226890 WARNING nova.virt.libvirt.driver [None req-59fa0485-7c25-4f49-8ae6-efcac6f014f6 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:46:00 np0005588920 nova_compute[226886]: 2026-01-20 14:46:00.803 226890 DEBUG nova.virt.libvirt.host [None req-59fa0485-7c25-4f49-8ae6-efcac6f014f6 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:46:00 np0005588920 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[259350]: [NOTICE]   (259354) : haproxy version is 2.8.14-c23fe91
Jan 20 09:46:00 np0005588920 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[259350]: [NOTICE]   (259354) : path to executable is /usr/sbin/haproxy
Jan 20 09:46:00 np0005588920 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[259350]: [WARNING]  (259354) : Exiting Master process...
Jan 20 09:46:00 np0005588920 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[259350]: [ALERT]    (259354) : Current worker (259356) exited with code 143 (Terminated)
Jan 20 09:46:00 np0005588920 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[259350]: [WARNING]  (259354) : All workers exited. Exiting... (0)
Jan 20 09:46:00 np0005588920 nova_compute[226886]: 2026-01-20 14:46:00.804 226890 DEBUG nova.virt.libvirt.host [None req-59fa0485-7c25-4f49-8ae6-efcac6f014f6 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:46:00 np0005588920 systemd[1]: libpod-aaac9033398261ad9a81d39e95ab7ae4331e49107137a31dd2e895fa3cd73f34.scope: Deactivated successfully.
Jan 20 09:46:00 np0005588920 nova_compute[226886]: 2026-01-20 14:46:00.809 226890 DEBUG nova.virt.libvirt.host [None req-59fa0485-7c25-4f49-8ae6-efcac6f014f6 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:46:00 np0005588920 nova_compute[226886]: 2026-01-20 14:46:00.809 226890 DEBUG nova.virt.libvirt.host [None req-59fa0485-7c25-4f49-8ae6-efcac6f014f6 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:46:00 np0005588920 nova_compute[226886]: 2026-01-20 14:46:00.810 226890 DEBUG nova.virt.libvirt.driver [None req-59fa0485-7c25-4f49-8ae6-efcac6f014f6 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:46:00 np0005588920 nova_compute[226886]: 2026-01-20 14:46:00.810 226890 DEBUG nova.virt.hardware [None req-59fa0485-7c25-4f49-8ae6-efcac6f014f6 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:46:00 np0005588920 nova_compute[226886]: 2026-01-20 14:46:00.811 226890 DEBUG nova.virt.hardware [None req-59fa0485-7c25-4f49-8ae6-efcac6f014f6 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:46:00 np0005588920 nova_compute[226886]: 2026-01-20 14:46:00.811 226890 DEBUG nova.virt.hardware [None req-59fa0485-7c25-4f49-8ae6-efcac6f014f6 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:46:00 np0005588920 nova_compute[226886]: 2026-01-20 14:46:00.811 226890 DEBUG nova.virt.hardware [None req-59fa0485-7c25-4f49-8ae6-efcac6f014f6 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:46:00 np0005588920 nova_compute[226886]: 2026-01-20 14:46:00.811 226890 DEBUG nova.virt.hardware [None req-59fa0485-7c25-4f49-8ae6-efcac6f014f6 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:46:00 np0005588920 nova_compute[226886]: 2026-01-20 14:46:00.812 226890 DEBUG nova.virt.hardware [None req-59fa0485-7c25-4f49-8ae6-efcac6f014f6 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:46:00 np0005588920 nova_compute[226886]: 2026-01-20 14:46:00.812 226890 DEBUG nova.virt.hardware [None req-59fa0485-7c25-4f49-8ae6-efcac6f014f6 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:46:00 np0005588920 nova_compute[226886]: 2026-01-20 14:46:00.812 226890 DEBUG nova.virt.hardware [None req-59fa0485-7c25-4f49-8ae6-efcac6f014f6 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:46:00 np0005588920 nova_compute[226886]: 2026-01-20 14:46:00.813 226890 DEBUG nova.virt.hardware [None req-59fa0485-7c25-4f49-8ae6-efcac6f014f6 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:46:00 np0005588920 nova_compute[226886]: 2026-01-20 14:46:00.813 226890 DEBUG nova.virt.hardware [None req-59fa0485-7c25-4f49-8ae6-efcac6f014f6 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:46:00 np0005588920 nova_compute[226886]: 2026-01-20 14:46:00.813 226890 DEBUG nova.virt.hardware [None req-59fa0485-7c25-4f49-8ae6-efcac6f014f6 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:46:00 np0005588920 nova_compute[226886]: 2026-01-20 14:46:00.813 226890 DEBUG nova.objects.instance [None req-59fa0485-7c25-4f49-8ae6-efcac6f014f6 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'vcpu_model' on Instance uuid 75736b87-b14e-45b7-b43b-5129cf7d3279 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:46:00 np0005588920 podman[260044]: 2026-01-20 14:46:00.814670796 +0000 UTC m=+0.052374291 container died aaac9033398261ad9a81d39e95ab7ae4331e49107137a31dd2e895fa3cd73f34 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:46:00 np0005588920 nova_compute[226886]: 2026-01-20 14:46:00.827 226890 DEBUG oslo_concurrency.processutils [None req-59fa0485-7c25-4f49-8ae6-efcac6f014f6 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:46:00 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-aaac9033398261ad9a81d39e95ab7ae4331e49107137a31dd2e895fa3cd73f34-userdata-shm.mount: Deactivated successfully.
Jan 20 09:46:00 np0005588920 systemd[1]: var-lib-containers-storage-overlay-48df6bf2eed0decf8cd70a6cf35494807ab4e0f52710a6f2cf43bdce0e1b5452-merged.mount: Deactivated successfully.
Jan 20 09:46:00 np0005588920 podman[260044]: 2026-01-20 14:46:00.848267563 +0000 UTC m=+0.085971058 container cleanup aaac9033398261ad9a81d39e95ab7ae4331e49107137a31dd2e895fa3cd73f34 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 20 09:46:00 np0005588920 systemd[1]: libpod-conmon-aaac9033398261ad9a81d39e95ab7ae4331e49107137a31dd2e895fa3cd73f34.scope: Deactivated successfully.
Jan 20 09:46:00 np0005588920 podman[260081]: 2026-01-20 14:46:00.906864506 +0000 UTC m=+0.037994170 container remove aaac9033398261ad9a81d39e95ab7ae4331e49107137a31dd2e895fa3cd73f34 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 20 09:46:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:00.912 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[60cc8436-9b76-48e1-9893-38cd71115787]: (4, ('Tue Jan 20 02:46:00 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 (aaac9033398261ad9a81d39e95ab7ae4331e49107137a31dd2e895fa3cd73f34)\naaac9033398261ad9a81d39e95ab7ae4331e49107137a31dd2e895fa3cd73f34\nTue Jan 20 02:46:00 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 (aaac9033398261ad9a81d39e95ab7ae4331e49107137a31dd2e895fa3cd73f34)\naaac9033398261ad9a81d39e95ab7ae4331e49107137a31dd2e895fa3cd73f34\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:00.913 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[df6c05d3-270f-4099-9a90-d3c4cc6978d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:00.914 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap762e1859-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:46:00 np0005588920 kernel: tap762e1859-40: left promiscuous mode
Jan 20 09:46:00 np0005588920 nova_compute[226886]: 2026-01-20 14:46:00.917 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:00.923 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[79497adb-03f9-4678-aace-ceb1c43f94e9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:00 np0005588920 nova_compute[226886]: 2026-01-20 14:46:00.936 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:00.941 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[dc0b424b-cf39-4c5c-b888-580b3d4cc714]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:00.943 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[296cbe8e-d098-4db5-a748-50ffc4752fe9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:00.957 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[78c5784f-167b-48bb-8925-430eeb622313]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 536276, 'reachable_time': 27134, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 260097, 'error': None, 'target': 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:00.960 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:46:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:00.960 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[4328f58c-f780-467f-b35f-c3f43e3811b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:00 np0005588920 systemd[1]: run-netns-ovnmeta\x2d762e1859\x2d4db4\x2d4d9e\x2db66f\x2dd50316f80df4.mount: Deactivated successfully.
Jan 20 09:46:01 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:46:01 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2979122474' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:46:01 np0005588920 nova_compute[226886]: 2026-01-20 14:46:01.271 226890 DEBUG oslo_concurrency.processutils [None req-59fa0485-7c25-4f49-8ae6-efcac6f014f6 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:46:01 np0005588920 nova_compute[226886]: 2026-01-20 14:46:01.302 226890 DEBUG oslo_concurrency.processutils [None req-59fa0485-7c25-4f49-8ae6-efcac6f014f6 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:46:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:01.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:01 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e237 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:46:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:46:02 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/113448044' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:46:02 np0005588920 nova_compute[226886]: 2026-01-20 14:46:02.033 226890 DEBUG oslo_concurrency.processutils [None req-59fa0485-7c25-4f49-8ae6-efcac6f014f6 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.731s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:46:02 np0005588920 nova_compute[226886]: 2026-01-20 14:46:02.036 226890 DEBUG nova.virt.libvirt.vif [None req-59fa0485-7c25-4f49-8ae6-efcac6f014f6 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:45:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1202945337',display_name='tempest-ServerActionsTestJSON-server-1202945337',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1202945337',id=94,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLgEzx5mLsSqRL7L9WKOzM+WdeJ40U103wY9H3VMZ41G/sN5tQtQSt9lXWKTyc6pt00bfKD0E9GPugNMpy+dzSSpK23o3CgadkzAfAvjQCeCPOSp3fX13FGApomGd1HRCQ==',key_name='tempest-keypair-1602241722',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:45:21Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1c5f03d46c0c4162a3b2f1530850bb6c',ramdisk_id='',reservation_id='r-luaqa362',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1020442335',owner_user_name='tempest-ServerActionsTestJSON-1020442335-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:46:00Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e9278fdb9e645b7938f3edb20c4d3cf',uuid=75736b87-b14e-45b7-b43b-5129cf7d3279,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "address": "fa:16:3e:22:f9:d2", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a9a684-c9", "ovs_interfaceid": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:46:02 np0005588920 nova_compute[226886]: 2026-01-20 14:46:02.037 226890 DEBUG nova.network.os_vif_util [None req-59fa0485-7c25-4f49-8ae6-efcac6f014f6 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converting VIF {"id": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "address": "fa:16:3e:22:f9:d2", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a9a684-c9", "ovs_interfaceid": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:46:02 np0005588920 nova_compute[226886]: 2026-01-20 14:46:02.038 226890 DEBUG nova.network.os_vif_util [None req-59fa0485-7c25-4f49-8ae6-efcac6f014f6 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:22:f9:d2,bridge_name='br-int',has_traffic_filtering=True,id=d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3a9a684-c9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:46:02 np0005588920 nova_compute[226886]: 2026-01-20 14:46:02.041 226890 DEBUG nova.objects.instance [None req-59fa0485-7c25-4f49-8ae6-efcac6f014f6 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'pci_devices' on Instance uuid 75736b87-b14e-45b7-b43b-5129cf7d3279 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:46:02 np0005588920 nova_compute[226886]: 2026-01-20 14:46:02.193 226890 DEBUG nova.virt.libvirt.driver [None req-59fa0485-7c25-4f49-8ae6-efcac6f014f6 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:46:02 np0005588920 nova_compute[226886]:  <uuid>75736b87-b14e-45b7-b43b-5129cf7d3279</uuid>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:  <name>instance-0000005e</name>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:46:02 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:      <nova:name>tempest-ServerActionsTestJSON-server-1202945337</nova:name>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:46:00</nova:creationTime>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:46:02 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:        <nova:user uuid="3e9278fdb9e645b7938f3edb20c4d3cf">tempest-ServerActionsTestJSON-1020442335-project-member</nova:user>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:        <nova:project uuid="1c5f03d46c0c4162a3b2f1530850bb6c">tempest-ServerActionsTestJSON-1020442335</nova:project>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:        <nova:port uuid="d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6">
Jan 20 09:46:02 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:      <entry name="serial">75736b87-b14e-45b7-b43b-5129cf7d3279</entry>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:      <entry name="uuid">75736b87-b14e-45b7-b43b-5129cf7d3279</entry>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:46:02 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/75736b87-b14e-45b7-b43b-5129cf7d3279_disk">
Jan 20 09:46:02 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:46:02 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:46:02 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/75736b87-b14e-45b7-b43b-5129cf7d3279_disk.config">
Jan 20 09:46:02 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:46:02 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:46:02 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:22:f9:d2"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:      <target dev="tapd3a9a684-c9"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:46:02 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/75736b87-b14e-45b7-b43b-5129cf7d3279/console.log" append="off"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    <input type="keyboard" bus="usb"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:46:02 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:46:02 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:46:02 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:46:02 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:46:02 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:46:02 np0005588920 nova_compute[226886]: 2026-01-20 14:46:02.194 226890 DEBUG nova.virt.libvirt.driver [None req-59fa0485-7c25-4f49-8ae6-efcac6f014f6 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] skipping disk for instance-0000005e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:46:02 np0005588920 nova_compute[226886]: 2026-01-20 14:46:02.195 226890 DEBUG nova.virt.libvirt.driver [None req-59fa0485-7c25-4f49-8ae6-efcac6f014f6 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] skipping disk for instance-0000005e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:46:02 np0005588920 nova_compute[226886]: 2026-01-20 14:46:02.196 226890 DEBUG nova.virt.libvirt.vif [None req-59fa0485-7c25-4f49-8ae6-efcac6f014f6 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:45:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1202945337',display_name='tempest-ServerActionsTestJSON-server-1202945337',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1202945337',id=94,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLgEzx5mLsSqRL7L9WKOzM+WdeJ40U103wY9H3VMZ41G/sN5tQtQSt9lXWKTyc6pt00bfKD0E9GPugNMpy+dzSSpK23o3CgadkzAfAvjQCeCPOSp3fX13FGApomGd1HRCQ==',key_name='tempest-keypair-1602241722',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:45:21Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=1,progress=0,project_id='1c5f03d46c0c4162a3b2f1530850bb6c',ramdisk_id='',reservation_id='r-luaqa362',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1020442335',owner_user_name='tempest-ServerActionsTestJSON-1020442335-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:46:00Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e9278fdb9e645b7938f3edb20c4d3cf',uuid=75736b87-b14e-45b7-b43b-5129cf7d3279,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "address": "fa:16:3e:22:f9:d2", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a9a684-c9", "ovs_interfaceid": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:46:02 np0005588920 nova_compute[226886]: 2026-01-20 14:46:02.196 226890 DEBUG nova.network.os_vif_util [None req-59fa0485-7c25-4f49-8ae6-efcac6f014f6 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converting VIF {"id": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "address": "fa:16:3e:22:f9:d2", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a9a684-c9", "ovs_interfaceid": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:46:02 np0005588920 nova_compute[226886]: 2026-01-20 14:46:02.197 226890 DEBUG nova.network.os_vif_util [None req-59fa0485-7c25-4f49-8ae6-efcac6f014f6 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:22:f9:d2,bridge_name='br-int',has_traffic_filtering=True,id=d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3a9a684-c9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:46:02 np0005588920 nova_compute[226886]: 2026-01-20 14:46:02.197 226890 DEBUG os_vif [None req-59fa0485-7c25-4f49-8ae6-efcac6f014f6 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:22:f9:d2,bridge_name='br-int',has_traffic_filtering=True,id=d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3a9a684-c9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:46:02 np0005588920 nova_compute[226886]: 2026-01-20 14:46:02.198 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:02 np0005588920 nova_compute[226886]: 2026-01-20 14:46:02.198 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:46:02 np0005588920 nova_compute[226886]: 2026-01-20 14:46:02.199 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:46:02 np0005588920 nova_compute[226886]: 2026-01-20 14:46:02.202 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:02 np0005588920 nova_compute[226886]: 2026-01-20 14:46:02.202 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd3a9a684-c9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:46:02 np0005588920 nova_compute[226886]: 2026-01-20 14:46:02.202 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd3a9a684-c9, col_values=(('external_ids', {'iface-id': 'd3a9a684-c9a7-4abc-a085-9dcd17bfc2e6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:22:f9:d2', 'vm-uuid': '75736b87-b14e-45b7-b43b-5129cf7d3279'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:46:02 np0005588920 nova_compute[226886]: 2026-01-20 14:46:02.204 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:02 np0005588920 NetworkManager[49076]: <info>  [1768920362.2050] manager: (tapd3a9a684-c9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/195)
Jan 20 09:46:02 np0005588920 nova_compute[226886]: 2026-01-20 14:46:02.206 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:46:02 np0005588920 nova_compute[226886]: 2026-01-20 14:46:02.209 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:02 np0005588920 nova_compute[226886]: 2026-01-20 14:46:02.209 226890 INFO os_vif [None req-59fa0485-7c25-4f49-8ae6-efcac6f014f6 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:22:f9:d2,bridge_name='br-int',has_traffic_filtering=True,id=d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3a9a684-c9')#033[00m
Jan 20 09:46:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:02.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:02 np0005588920 kernel: tapd3a9a684-c9: entered promiscuous mode
Jan 20 09:46:02 np0005588920 NetworkManager[49076]: <info>  [1768920362.3458] manager: (tapd3a9a684-c9): new Tun device (/org/freedesktop/NetworkManager/Devices/196)
Jan 20 09:46:02 np0005588920 systemd-udevd[260022]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:46:02 np0005588920 ovn_controller[133971]: 2026-01-20T14:46:02Z|00374|binding|INFO|Claiming lport d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 for this chassis.
Jan 20 09:46:02 np0005588920 ovn_controller[133971]: 2026-01-20T14:46:02Z|00375|binding|INFO|d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6: Claiming fa:16:3e:22:f9:d2 10.100.0.4
Jan 20 09:46:02 np0005588920 nova_compute[226886]: 2026-01-20 14:46:02.354 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:02 np0005588920 NetworkManager[49076]: <info>  [1768920362.3586] device (tapd3a9a684-c9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:46:02 np0005588920 NetworkManager[49076]: <info>  [1768920362.3600] device (tapd3a9a684-c9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:02.368 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:22:f9:d2 10.100.0.4'], port_security=['fa:16:3e:22:f9:d2 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '75736b87-b14e-45b7-b43b-5129cf7d3279', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-762e1859-4db4-4d9e-b66f-d50316f80df4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1c5f03d46c0c4162a3b2f1530850bb6c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '80535eda-fa59-4edc-8e3d-9bfea6517730', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.180'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2474a8ca-bb96-4cae-9133-23419b81a9fc, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:02.369 144128 INFO neutron.agent.ovn.metadata.agent [-] Port d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 in datapath 762e1859-4db4-4d9e-b66f-d50316f80df4 bound to our chassis#033[00m
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:02.371 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 762e1859-4db4-4d9e-b66f-d50316f80df4#033[00m
Jan 20 09:46:02 np0005588920 nova_compute[226886]: 2026-01-20 14:46:02.376 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:02 np0005588920 ovn_controller[133971]: 2026-01-20T14:46:02Z|00376|binding|INFO|Setting lport d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 ovn-installed in OVS
Jan 20 09:46:02 np0005588920 ovn_controller[133971]: 2026-01-20T14:46:02Z|00377|binding|INFO|Setting lport d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 up in Southbound
Jan 20 09:46:02 np0005588920 nova_compute[226886]: 2026-01-20 14:46:02.378 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:02 np0005588920 systemd-machined[196121]: New machine qemu-41-instance-0000005e.
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:02.384 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[6bb8fe77-6e5f-443c-a230-a3fdc9a355ea]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:02.385 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap762e1859-41 in ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:46:02 np0005588920 nova_compute[226886]: 2026-01-20 14:46:02.383 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:02.386 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap762e1859-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:02.387 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f2611db9-4889-4ad2-af1c-c973352ccb49]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:02.388 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[adb663a4-ff8e-4acf-a025-09ba34aaa5e5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:02 np0005588920 systemd[1]: Started Virtual Machine qemu-41-instance-0000005e.
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:02.400 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[513cd05e-ac7a-45c0-b5a9-9c4b0fb9c863]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:02.415 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[fa425a40-ab23-4bda-87ba-66aa7a7672e1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:02.448 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[437ec483-9362-4154-9b95-01142e90cdb5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:02 np0005588920 NetworkManager[49076]: <info>  [1768920362.4532] manager: (tap762e1859-40): new Veth device (/org/freedesktop/NetworkManager/Devices/197)
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:02.452 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ea29324c-a0c7-404e-8f5c-c4e62bf09971]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:02.487 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[a2b7b6c8-e628-4e16-a892-5f9727a688f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:02.491 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[b495b166-0e0e-4c9e-ac2d-e2445064d193]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:02 np0005588920 NetworkManager[49076]: <info>  [1768920362.5117] device (tap762e1859-40): carrier: link connected
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:02.518 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[070733fb-659c-4f85-9a34-77bb89839ca7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:02.534 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[1fb2aa8c-7a0a-4e07-96ed-e70483334097]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap762e1859-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:69:f1:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 124], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 540419, 'reachable_time': 33336, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 260204, 'error': None, 'target': 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:02.548 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a288f9f4-7f33-4da9-af15-72cd146d653d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe69:f1da'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 540419, 'tstamp': 540419}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 260205, 'error': None, 'target': 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:02 np0005588920 nova_compute[226886]: 2026-01-20 14:46:02.562 226890 DEBUG nova.compute.manager [req-9d48b7a7-88ed-481e-ab1e-e626e101d108 req-a9041ffb-b6d1-453e-ae4d-ef7189db39da 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Received event network-vif-unplugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:46:02 np0005588920 nova_compute[226886]: 2026-01-20 14:46:02.562 226890 DEBUG oslo_concurrency.lockutils [req-9d48b7a7-88ed-481e-ab1e-e626e101d108 req-a9041ffb-b6d1-453e-ae4d-ef7189db39da 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:46:02 np0005588920 nova_compute[226886]: 2026-01-20 14:46:02.562 226890 DEBUG oslo_concurrency.lockutils [req-9d48b7a7-88ed-481e-ab1e-e626e101d108 req-a9041ffb-b6d1-453e-ae4d-ef7189db39da 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:46:02 np0005588920 nova_compute[226886]: 2026-01-20 14:46:02.563 226890 DEBUG oslo_concurrency.lockutils [req-9d48b7a7-88ed-481e-ab1e-e626e101d108 req-a9041ffb-b6d1-453e-ae4d-ef7189db39da 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:46:02 np0005588920 nova_compute[226886]: 2026-01-20 14:46:02.563 226890 DEBUG nova.compute.manager [req-9d48b7a7-88ed-481e-ab1e-e626e101d108 req-a9041ffb-b6d1-453e-ae4d-ef7189db39da 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] No waiting events found dispatching network-vif-unplugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:46:02 np0005588920 nova_compute[226886]: 2026-01-20 14:46:02.563 226890 WARNING nova.compute.manager [req-9d48b7a7-88ed-481e-ab1e-e626e101d108 req-a9041ffb-b6d1-453e-ae4d-ef7189db39da 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Received unexpected event network-vif-unplugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 for instance with vm_state active and task_state reboot_started_hard.#033[00m
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:02.567 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[eb6979de-5308-461d-926e-a3cd017ef368]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap762e1859-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:69:f1:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 124], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 540419, 'reachable_time': 33336, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 260206, 'error': None, 'target': 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:02.600 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[9cc764fa-69be-4bb1-99c5-d93f16283805]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:02.662 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[23aab794-1f7c-45f8-824f-c513312572ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:02.663 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap762e1859-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:02.664 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:02.664 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap762e1859-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:46:02 np0005588920 NetworkManager[49076]: <info>  [1768920362.6667] manager: (tap762e1859-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/198)
Jan 20 09:46:02 np0005588920 nova_compute[226886]: 2026-01-20 14:46:02.666 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:02 np0005588920 kernel: tap762e1859-40: entered promiscuous mode
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:02.669 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap762e1859-40, col_values=(('external_ids', {'iface-id': '9e775c45-1646-436d-a0cb-a5b5ec356e1b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:46:02 np0005588920 ovn_controller[133971]: 2026-01-20T14:46:02Z|00378|binding|INFO|Releasing lport 9e775c45-1646-436d-a0cb-a5b5ec356e1b from this chassis (sb_readonly=0)
Jan 20 09:46:02 np0005588920 nova_compute[226886]: 2026-01-20 14:46:02.672 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:02.673 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/762e1859-4db4-4d9e-b66f-d50316f80df4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/762e1859-4db4-4d9e-b66f-d50316f80df4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:02.673 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[82c13815-a38f-4162-8e2c-9c98eddf7995]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:02.674 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-762e1859-4db4-4d9e-b66f-d50316f80df4
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/762e1859-4db4-4d9e-b66f-d50316f80df4.pid.haproxy
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID 762e1859-4db4-4d9e-b66f-d50316f80df4
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:46:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:02.676 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'env', 'PROCESS_TAG=haproxy-762e1859-4db4-4d9e-b66f-d50316f80df4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/762e1859-4db4-4d9e-b66f-d50316f80df4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:46:02 np0005588920 nova_compute[226886]: 2026-01-20 14:46:02.685 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:03 np0005588920 podman[260262]: 2026-01-20 14:46:03.002412002 +0000 UTC m=+0.020115002 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:46:03 np0005588920 podman[260262]: 2026-01-20 14:46:03.474174642 +0000 UTC m=+0.491877632 container create 0d0f295407dcbc6c66f72289fb95ceee29f314264a6efeaf353b7477a2f1d499 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 20 09:46:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:03.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:03 np0005588920 nova_compute[226886]: 2026-01-20 14:46:03.719 226890 DEBUG nova.virt.libvirt.host [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Removed pending event for 75736b87-b14e-45b7-b43b-5129cf7d3279 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 20 09:46:03 np0005588920 nova_compute[226886]: 2026-01-20 14:46:03.720 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920363.7193875, 75736b87-b14e-45b7-b43b-5129cf7d3279 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:46:03 np0005588920 nova_compute[226886]: 2026-01-20 14:46:03.720 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:46:03 np0005588920 nova_compute[226886]: 2026-01-20 14:46:03.722 226890 DEBUG nova.compute.manager [None req-59fa0485-7c25-4f49-8ae6-efcac6f014f6 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:46:03 np0005588920 nova_compute[226886]: 2026-01-20 14:46:03.725 226890 INFO nova.virt.libvirt.driver [-] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Instance rebooted successfully.#033[00m
Jan 20 09:46:03 np0005588920 nova_compute[226886]: 2026-01-20 14:46:03.725 226890 DEBUG nova.compute.manager [None req-59fa0485-7c25-4f49-8ae6-efcac6f014f6 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:46:03 np0005588920 nova_compute[226886]: 2026-01-20 14:46:03.818 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:46:03 np0005588920 nova_compute[226886]: 2026-01-20 14:46:03.822 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started_hard, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:46:03 np0005588920 nova_compute[226886]: 2026-01-20 14:46:03.863 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] During sync_power_state the instance has a pending task (reboot_started_hard). Skip.#033[00m
Jan 20 09:46:03 np0005588920 nova_compute[226886]: 2026-01-20 14:46:03.863 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920363.7221582, 75736b87-b14e-45b7-b43b-5129cf7d3279 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:46:03 np0005588920 nova_compute[226886]: 2026-01-20 14:46:03.863 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] VM Started (Lifecycle Event)#033[00m
Jan 20 09:46:03 np0005588920 nova_compute[226886]: 2026-01-20 14:46:03.866 226890 DEBUG oslo_concurrency.lockutils [None req-59fa0485-7c25-4f49-8ae6-efcac6f014f6 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 5.007s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:46:03 np0005588920 nova_compute[226886]: 2026-01-20 14:46:03.882 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:46:03 np0005588920 nova_compute[226886]: 2026-01-20 14:46:03.886 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:46:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:04.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:04 np0005588920 nova_compute[226886]: 2026-01-20 14:46:04.659 226890 DEBUG nova.compute.manager [req-1fab213c-c4cf-446e-9660-96983a9c1612 req-0ab3ec78-9a14-4721-abdb-7724b029f399 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Received event network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:46:04 np0005588920 nova_compute[226886]: 2026-01-20 14:46:04.660 226890 DEBUG oslo_concurrency.lockutils [req-1fab213c-c4cf-446e-9660-96983a9c1612 req-0ab3ec78-9a14-4721-abdb-7724b029f399 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:46:04 np0005588920 nova_compute[226886]: 2026-01-20 14:46:04.660 226890 DEBUG oslo_concurrency.lockutils [req-1fab213c-c4cf-446e-9660-96983a9c1612 req-0ab3ec78-9a14-4721-abdb-7724b029f399 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:46:04 np0005588920 nova_compute[226886]: 2026-01-20 14:46:04.660 226890 DEBUG oslo_concurrency.lockutils [req-1fab213c-c4cf-446e-9660-96983a9c1612 req-0ab3ec78-9a14-4721-abdb-7724b029f399 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:46:04 np0005588920 nova_compute[226886]: 2026-01-20 14:46:04.660 226890 DEBUG nova.compute.manager [req-1fab213c-c4cf-446e-9660-96983a9c1612 req-0ab3ec78-9a14-4721-abdb-7724b029f399 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] No waiting events found dispatching network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:46:04 np0005588920 nova_compute[226886]: 2026-01-20 14:46:04.661 226890 WARNING nova.compute.manager [req-1fab213c-c4cf-446e-9660-96983a9c1612 req-0ab3ec78-9a14-4721-abdb-7724b029f399 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Received unexpected event network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:46:04 np0005588920 nova_compute[226886]: 2026-01-20 14:46:04.661 226890 DEBUG nova.compute.manager [req-1fab213c-c4cf-446e-9660-96983a9c1612 req-0ab3ec78-9a14-4721-abdb-7724b029f399 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Received event network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:46:04 np0005588920 nova_compute[226886]: 2026-01-20 14:46:04.661 226890 DEBUG oslo_concurrency.lockutils [req-1fab213c-c4cf-446e-9660-96983a9c1612 req-0ab3ec78-9a14-4721-abdb-7724b029f399 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:46:04 np0005588920 nova_compute[226886]: 2026-01-20 14:46:04.662 226890 DEBUG oslo_concurrency.lockutils [req-1fab213c-c4cf-446e-9660-96983a9c1612 req-0ab3ec78-9a14-4721-abdb-7724b029f399 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:46:04 np0005588920 nova_compute[226886]: 2026-01-20 14:46:04.662 226890 DEBUG oslo_concurrency.lockutils [req-1fab213c-c4cf-446e-9660-96983a9c1612 req-0ab3ec78-9a14-4721-abdb-7724b029f399 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:46:04 np0005588920 nova_compute[226886]: 2026-01-20 14:46:04.662 226890 DEBUG nova.compute.manager [req-1fab213c-c4cf-446e-9660-96983a9c1612 req-0ab3ec78-9a14-4721-abdb-7724b029f399 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] No waiting events found dispatching network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:46:04 np0005588920 nova_compute[226886]: 2026-01-20 14:46:04.662 226890 WARNING nova.compute.manager [req-1fab213c-c4cf-446e-9660-96983a9c1612 req-0ab3ec78-9a14-4721-abdb-7724b029f399 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Received unexpected event network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:46:04 np0005588920 nova_compute[226886]: 2026-01-20 14:46:04.663 226890 DEBUG nova.compute.manager [req-1fab213c-c4cf-446e-9660-96983a9c1612 req-0ab3ec78-9a14-4721-abdb-7724b029f399 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Received event network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:46:04 np0005588920 nova_compute[226886]: 2026-01-20 14:46:04.663 226890 DEBUG oslo_concurrency.lockutils [req-1fab213c-c4cf-446e-9660-96983a9c1612 req-0ab3ec78-9a14-4721-abdb-7724b029f399 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:46:04 np0005588920 nova_compute[226886]: 2026-01-20 14:46:04.663 226890 DEBUG oslo_concurrency.lockutils [req-1fab213c-c4cf-446e-9660-96983a9c1612 req-0ab3ec78-9a14-4721-abdb-7724b029f399 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:46:04 np0005588920 nova_compute[226886]: 2026-01-20 14:46:04.664 226890 DEBUG oslo_concurrency.lockutils [req-1fab213c-c4cf-446e-9660-96983a9c1612 req-0ab3ec78-9a14-4721-abdb-7724b029f399 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:46:04 np0005588920 nova_compute[226886]: 2026-01-20 14:46:04.664 226890 DEBUG nova.compute.manager [req-1fab213c-c4cf-446e-9660-96983a9c1612 req-0ab3ec78-9a14-4721-abdb-7724b029f399 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] No waiting events found dispatching network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:46:04 np0005588920 nova_compute[226886]: 2026-01-20 14:46:04.664 226890 WARNING nova.compute.manager [req-1fab213c-c4cf-446e-9660-96983a9c1612 req-0ab3ec78-9a14-4721-abdb-7724b029f399 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Received unexpected event network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:46:04 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e238 e238: 3 total, 3 up, 3 in
Jan 20 09:46:05 np0005588920 systemd[1]: Started libpod-conmon-0d0f295407dcbc6c66f72289fb95ceee29f314264a6efeaf353b7477a2f1d499.scope.
Jan 20 09:46:05 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:46:05 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eaf85e64b76581f5d51c6da2b1bb2d2f13e80a5addb67c456c320362b323fcea/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:46:05 np0005588920 podman[260262]: 2026-01-20 14:46:05.188165913 +0000 UTC m=+2.205868913 container init 0d0f295407dcbc6c66f72289fb95ceee29f314264a6efeaf353b7477a2f1d499 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 20 09:46:05 np0005588920 podman[260262]: 2026-01-20 14:46:05.196128355 +0000 UTC m=+2.213831355 container start 0d0f295407dcbc6c66f72289fb95ceee29f314264a6efeaf353b7477a2f1d499 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:46:05 np0005588920 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[260306]: [NOTICE]   (260310) : New worker (260312) forked
Jan 20 09:46:05 np0005588920 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[260306]: [NOTICE]   (260310) : Loading success.
Jan 20 09:46:05 np0005588920 podman[260286]: 2026-01-20 14:46:05.541470301 +0000 UTC m=+2.129173744 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 20 09:46:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:46:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:05.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:46:05 np0005588920 nova_compute[226886]: 2026-01-20 14:46:05.615 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:05 np0005588920 nova_compute[226886]: 2026-01-20 14:46:05.714 226890 INFO nova.compute.manager [None req-2c1ffec7-ab6a-4234-960c-e1b6c4a23cda 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Get console output#033[00m
Jan 20 09:46:05 np0005588920 nova_compute[226886]: 2026-01-20 14:46:05.719 226890 INFO oslo.privsep.daemon [None req-2c1ffec7-ab6a-4234-960c-e1b6c4a23cda 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpjvmwf32j/privsep.sock']#033[00m
Jan 20 09:46:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:06.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:06 np0005588920 nova_compute[226886]: 2026-01-20 14:46:06.630 226890 INFO oslo.privsep.daemon [None req-2c1ffec7-ab6a-4234-960c-e1b6c4a23cda 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Spawned new privsep daemon via rootwrap#033[00m
Jan 20 09:46:06 np0005588920 nova_compute[226886]: 2026-01-20 14:46:06.510 260344 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Jan 20 09:46:06 np0005588920 nova_compute[226886]: 2026-01-20 14:46:06.515 260344 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Jan 20 09:46:06 np0005588920 nova_compute[226886]: 2026-01-20 14:46:06.518 260344 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Jan 20 09:46:06 np0005588920 nova_compute[226886]: 2026-01-20 14:46:06.518 260344 INFO oslo.privsep.daemon [-] privsep daemon running as pid 260344#033[00m
Jan 20 09:46:06 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e238 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:46:06 np0005588920 nova_compute[226886]: 2026-01-20 14:46:06.727 260344 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 20 09:46:07 np0005588920 nova_compute[226886]: 2026-01-20 14:46:07.322 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:07.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:08.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:08 np0005588920 nova_compute[226886]: 2026-01-20 14:46:08.555 226890 DEBUG oslo_concurrency.lockutils [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Acquiring lock "6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:46:08 np0005588920 nova_compute[226886]: 2026-01-20 14:46:08.556 226890 DEBUG oslo_concurrency.lockutils [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Lock "6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:46:08 np0005588920 nova_compute[226886]: 2026-01-20 14:46:08.579 226890 DEBUG nova.compute.manager [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:46:08 np0005588920 nova_compute[226886]: 2026-01-20 14:46:08.664 226890 DEBUG oslo_concurrency.lockutils [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:46:08 np0005588920 nova_compute[226886]: 2026-01-20 14:46:08.665 226890 DEBUG oslo_concurrency.lockutils [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:46:08 np0005588920 nova_compute[226886]: 2026-01-20 14:46:08.672 226890 DEBUG nova.virt.hardware [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:46:08 np0005588920 nova_compute[226886]: 2026-01-20 14:46:08.672 226890 INFO nova.compute.claims [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 09:46:09 np0005588920 nova_compute[226886]: 2026-01-20 14:46:09.057 226890 DEBUG oslo_concurrency.processutils [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:46:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:46:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:09.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:46:10 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:46:10 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2347137275' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:46:10 np0005588920 nova_compute[226886]: 2026-01-20 14:46:10.092 226890 DEBUG oslo_concurrency.processutils [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:46:10 np0005588920 nova_compute[226886]: 2026-01-20 14:46:10.099 226890 DEBUG nova.compute.provider_tree [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 20 09:46:10 np0005588920 nova_compute[226886]: 2026-01-20 14:46:10.117 226890 DEBUG nova.scheduler.client.report [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 20 09:46:10 np0005588920 nova_compute[226886]: 2026-01-20 14:46:10.145 226890 DEBUG oslo_concurrency.lockutils [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.481s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:46:10 np0005588920 nova_compute[226886]: 2026-01-20 14:46:10.147 226890 DEBUG nova.compute.manager [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 20 09:46:10 np0005588920 nova_compute[226886]: 2026-01-20 14:46:10.194 226890 DEBUG nova.compute.manager [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 20 09:46:10 np0005588920 nova_compute[226886]: 2026-01-20 14:46:10.195 226890 DEBUG nova.network.neutron [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 20 09:46:10 np0005588920 nova_compute[226886]: 2026-01-20 14:46:10.215 226890 INFO nova.virt.libvirt.driver [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 20 09:46:10 np0005588920 nova_compute[226886]: 2026-01-20 14:46:10.232 226890 DEBUG nova.compute.manager [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 20 09:46:10 np0005588920 nova_compute[226886]: 2026-01-20 14:46:10.326 226890 DEBUG nova.compute.manager [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 20 09:46:10 np0005588920 nova_compute[226886]: 2026-01-20 14:46:10.327 226890 DEBUG nova.virt.libvirt.driver [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 20 09:46:10 np0005588920 nova_compute[226886]: 2026-01-20 14:46:10.328 226890 INFO nova.virt.libvirt.driver [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Creating image(s)
Jan 20 09:46:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:10.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:10 np0005588920 nova_compute[226886]: 2026-01-20 14:46:10.347 226890 DEBUG nova.storage.rbd_utils [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] rbd image 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:46:10 np0005588920 nova_compute[226886]: 2026-01-20 14:46:10.373 226890 DEBUG nova.storage.rbd_utils [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] rbd image 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:46:10 np0005588920 nova_compute[226886]: 2026-01-20 14:46:10.400 226890 DEBUG nova.storage.rbd_utils [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] rbd image 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:46:10 np0005588920 nova_compute[226886]: 2026-01-20 14:46:10.403 226890 DEBUG oslo_concurrency.processutils [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:46:10 np0005588920 nova_compute[226886]: 2026-01-20 14:46:10.430 226890 DEBUG nova.policy [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5cd9508688214bedb977528f8b6f95d1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7bbf722f17654404925cfb53e48cd473', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 20 09:46:10 np0005588920 nova_compute[226886]: 2026-01-20 14:46:10.465 226890 DEBUG oslo_concurrency.processutils [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:46:10 np0005588920 nova_compute[226886]: 2026-01-20 14:46:10.466 226890 DEBUG oslo_concurrency.lockutils [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:46:10 np0005588920 nova_compute[226886]: 2026-01-20 14:46:10.466 226890 DEBUG oslo_concurrency.lockutils [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:46:10 np0005588920 nova_compute[226886]: 2026-01-20 14:46:10.467 226890 DEBUG oslo_concurrency.lockutils [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:46:11 np0005588920 nova_compute[226886]: 2026-01-20 14:46:11.259 226890 DEBUG nova.storage.rbd_utils [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] rbd image 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:46:11 np0005588920 nova_compute[226886]: 2026-01-20 14:46:11.264 226890 DEBUG oslo_concurrency.processutils [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:46:11 np0005588920 nova_compute[226886]: 2026-01-20 14:46:11.290 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:46:11 np0005588920 nova_compute[226886]: 2026-01-20 14:46:11.331 226890 DEBUG oslo_concurrency.lockutils [None req-a1ab864e-59a3-4f6a-b18b-0dc2c09fc375 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "75736b87-b14e-45b7-b43b-5129cf7d3279" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:46:11 np0005588920 nova_compute[226886]: 2026-01-20 14:46:11.332 226890 DEBUG oslo_concurrency.lockutils [None req-a1ab864e-59a3-4f6a-b18b-0dc2c09fc375 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:46:11 np0005588920 nova_compute[226886]: 2026-01-20 14:46:11.332 226890 DEBUG nova.compute.manager [None req-a1ab864e-59a3-4f6a-b18b-0dc2c09fc375 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 09:46:11 np0005588920 nova_compute[226886]: 2026-01-20 14:46:11.336 226890 DEBUG nova.compute.manager [None req-a1ab864e-59a3-4f6a-b18b-0dc2c09fc375 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338
Jan 20 09:46:11 np0005588920 nova_compute[226886]: 2026-01-20 14:46:11.337 226890 DEBUG nova.objects.instance [None req-a1ab864e-59a3-4f6a-b18b-0dc2c09fc375 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'flavor' on Instance uuid 75736b87-b14e-45b7-b43b-5129cf7d3279 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 09:46:11 np0005588920 nova_compute[226886]: 2026-01-20 14:46:11.367 226890 DEBUG nova.virt.libvirt.driver [None req-a1ab864e-59a3-4f6a-b18b-0dc2c09fc375 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 20 09:46:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:11.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e238 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:46:12 np0005588920 nova_compute[226886]: 2026-01-20 14:46:12.325 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:46:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:12.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:12 np0005588920 nova_compute[226886]: 2026-01-20 14:46:12.878 226890 DEBUG nova.network.neutron [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Successfully created port: fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 20 09:46:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:13.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:13 np0005588920 nova_compute[226886]: 2026-01-20 14:46:13.974 226890 DEBUG nova.network.neutron [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Successfully updated port: fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 20 09:46:13 np0005588920 nova_compute[226886]: 2026-01-20 14:46:13.987 226890 DEBUG oslo_concurrency.lockutils [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Acquiring lock "refresh_cache-6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 20 09:46:13 np0005588920 nova_compute[226886]: 2026-01-20 14:46:13.987 226890 DEBUG oslo_concurrency.lockutils [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Acquired lock "refresh_cache-6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 20 09:46:13 np0005588920 nova_compute[226886]: 2026-01-20 14:46:13.988 226890 DEBUG nova.network.neutron [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 20 09:46:14 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:46:14 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:46:14 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:46:14 np0005588920 nova_compute[226886]: 2026-01-20 14:46:14.152 226890 DEBUG nova.network.neutron [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 20 09:46:14 np0005588920 nova_compute[226886]: 2026-01-20 14:46:14.157 226890 DEBUG nova.compute.manager [req-1ba5d724-95e6-4979-aa45-ac866ba1b95f req-7774349e-7bb3-4b94-8819-4609c78832ed 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Received event network-changed-fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 09:46:14 np0005588920 nova_compute[226886]: 2026-01-20 14:46:14.157 226890 DEBUG nova.compute.manager [req-1ba5d724-95e6-4979-aa45-ac866ba1b95f req-7774349e-7bb3-4b94-8819-4609c78832ed 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Refreshing instance network info cache due to event network-changed-fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 20 09:46:14 np0005588920 nova_compute[226886]: 2026-01-20 14:46:14.158 226890 DEBUG oslo_concurrency.lockutils [req-1ba5d724-95e6-4979-aa45-ac866ba1b95f req-7774349e-7bb3-4b94-8819-4609c78832ed 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 20 09:46:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:14.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:14 np0005588920 nova_compute[226886]: 2026-01-20 14:46:14.799 226890 DEBUG oslo_concurrency.processutils [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 3.535s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:46:14 np0005588920 nova_compute[226886]: 2026-01-20 14:46:14.860 226890 DEBUG nova.storage.rbd_utils [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] resizing rbd image 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 20 09:46:15 np0005588920 nova_compute[226886]: 2026-01-20 14:46:15.189 226890 DEBUG nova.network.neutron [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Updating instance_info_cache with network_info: [{"id": "fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1", "address": "fa:16:3e:80:88:4e", "network": {"id": "c59c8bba-9fc6-441e-8b7d-cd5444901b2a", "bridge": "br-int", "label": "tempest-TaggedAttachmentsTest-680871933-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7bbf722f17654404925cfb53e48cd473", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc4a2805-d7", "ovs_interfaceid": "fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 20 09:46:15 np0005588920 nova_compute[226886]: 2026-01-20 14:46:15.212 226890 DEBUG oslo_concurrency.lockutils [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Releasing lock "refresh_cache-6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 20 09:46:15 np0005588920 nova_compute[226886]: 2026-01-20 14:46:15.212 226890 DEBUG nova.compute.manager [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Instance network_info: |[{"id": "fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1", "address": "fa:16:3e:80:88:4e", "network": {"id": "c59c8bba-9fc6-441e-8b7d-cd5444901b2a", "bridge": "br-int", "label": "tempest-TaggedAttachmentsTest-680871933-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7bbf722f17654404925cfb53e48cd473", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc4a2805-d7", "ovs_interfaceid": "fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 20 09:46:15 np0005588920 nova_compute[226886]: 2026-01-20 14:46:15.213 226890 DEBUG oslo_concurrency.lockutils [req-1ba5d724-95e6-4979-aa45-ac866ba1b95f req-7774349e-7bb3-4b94-8819-4609c78832ed 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 20 09:46:15 np0005588920 nova_compute[226886]: 2026-01-20 14:46:15.213 226890 DEBUG nova.network.neutron [req-1ba5d724-95e6-4979-aa45-ac866ba1b95f req-7774349e-7bb3-4b94-8819-4609c78832ed 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Refreshing network info cache for port fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 20 09:46:15 np0005588920 nova_compute[226886]: 2026-01-20 14:46:15.501 226890 DEBUG nova.objects.instance [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Lazy-loading 'migration_context' on Instance uuid 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 09:46:15 np0005588920 nova_compute[226886]: 2026-01-20 14:46:15.515 226890 DEBUG nova.virt.libvirt.driver [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 20 09:46:15 np0005588920 nova_compute[226886]: 2026-01-20 14:46:15.515 226890 DEBUG nova.virt.libvirt.driver [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Ensure instance console log exists: /var/lib/nova/instances/6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 20 09:46:15 np0005588920 nova_compute[226886]: 2026-01-20 14:46:15.516 226890 DEBUG oslo_concurrency.lockutils [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:46:15 np0005588920 nova_compute[226886]: 2026-01-20 14:46:15.516 226890 DEBUG oslo_concurrency.lockutils [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:46:15 np0005588920 nova_compute[226886]: 2026-01-20 14:46:15.517 226890 DEBUG oslo_concurrency.lockutils [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:46:15 np0005588920 nova_compute[226886]: 2026-01-20 14:46:15.519 226890 DEBUG nova.virt.libvirt.driver [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Start _get_guest_xml network_info=[{"id": "fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1", "address": "fa:16:3e:80:88:4e", "network": {"id": "c59c8bba-9fc6-441e-8b7d-cd5444901b2a", "bridge": "br-int", "label": "tempest-TaggedAttachmentsTest-680871933-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7bbf722f17654404925cfb53e48cd473", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc4a2805-d7", "ovs_interfaceid": "fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 20 09:46:15 np0005588920 nova_compute[226886]: 2026-01-20 14:46:15.524 226890 WARNING nova.virt.libvirt.driver [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 20 09:46:15 np0005588920 nova_compute[226886]: 2026-01-20 14:46:15.529 226890 DEBUG nova.virt.libvirt.host [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 20 09:46:15 np0005588920 nova_compute[226886]: 2026-01-20 14:46:15.530 226890 DEBUG nova.virt.libvirt.host [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 20 09:46:15 np0005588920 nova_compute[226886]: 2026-01-20 14:46:15.534 226890 DEBUG nova.virt.libvirt.host [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 20 09:46:15 np0005588920 nova_compute[226886]: 2026-01-20 14:46:15.535 226890 DEBUG nova.virt.libvirt.host [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 20 09:46:15 np0005588920 nova_compute[226886]: 2026-01-20 14:46:15.537 226890 DEBUG nova.virt.libvirt.driver [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 20 09:46:15 np0005588920 nova_compute[226886]: 2026-01-20 14:46:15.537 226890 DEBUG nova.virt.hardware [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 20 09:46:15 np0005588920 nova_compute[226886]: 2026-01-20 14:46:15.538 226890 DEBUG nova.virt.hardware [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 20 09:46:15 np0005588920 nova_compute[226886]: 2026-01-20 14:46:15.538 226890 DEBUG nova.virt.hardware [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 20 09:46:15 np0005588920 nova_compute[226886]: 2026-01-20 14:46:15.538 226890 DEBUG nova.virt.hardware [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 20 09:46:15 np0005588920 nova_compute[226886]: 2026-01-20 14:46:15.539 226890 DEBUG nova.virt.hardware [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 20 09:46:15 np0005588920 nova_compute[226886]: 2026-01-20 14:46:15.539 226890 DEBUG nova.virt.hardware [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 20 09:46:15 np0005588920 nova_compute[226886]: 2026-01-20 14:46:15.539 226890 DEBUG nova.virt.hardware [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 20 09:46:15 np0005588920 nova_compute[226886]: 2026-01-20 14:46:15.540 226890 DEBUG nova.virt.hardware [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:46:15 np0005588920 nova_compute[226886]: 2026-01-20 14:46:15.540 226890 DEBUG nova.virt.hardware [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:46:15 np0005588920 nova_compute[226886]: 2026-01-20 14:46:15.540 226890 DEBUG nova.virt.hardware [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:46:15 np0005588920 nova_compute[226886]: 2026-01-20 14:46:15.541 226890 DEBUG nova.virt.hardware [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:46:15 np0005588920 nova_compute[226886]: 2026-01-20 14:46:15.544 226890 DEBUG oslo_concurrency.processutils [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:46:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:15.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:15 np0005588920 nova_compute[226886]: 2026-01-20 14:46:15.618 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:15 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:46:15 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2610441827' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:46:16 np0005588920 nova_compute[226886]: 2026-01-20 14:46:16.005 226890 DEBUG oslo_concurrency.processutils [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:46:16 np0005588920 nova_compute[226886]: 2026-01-20 14:46:16.028 226890 DEBUG nova.storage.rbd_utils [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] rbd image 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:46:16 np0005588920 nova_compute[226886]: 2026-01-20 14:46:16.033 226890 DEBUG oslo_concurrency.processutils [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:46:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:16.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:16.449 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:46:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:16.451 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:46:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:16.451 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:46:16 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:46:16 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1110613297' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:46:16 np0005588920 nova_compute[226886]: 2026-01-20 14:46:16.483 226890 DEBUG oslo_concurrency.processutils [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:46:16 np0005588920 nova_compute[226886]: 2026-01-20 14:46:16.485 226890 DEBUG nova.virt.libvirt.vif [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:46:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-1302621797',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-1302621797',id=97,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOduM1b5U0PoRBxfWNx/s8WugTbFKTiIFgBHu9L46ctJZjGe+8jT4Yj1g5uDYe7bVMyBzUuTiVef/RKuMuEZx2XkjNuYbN4taGM1Lc0bPehwylAtyxjPpZUim+i9YC026A==',key_name='tempest-keypair-1946074746',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7bbf722f17654404925cfb53e48cd473',ramdisk_id='',reservation_id='r-d41vvz1v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedAttachmentsTest-1005708480',owner_user_name='tempest-TaggedAttachmentsTest-1005708480-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:46:10Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='5cd9508688214bedb977528f8b6f95d1',uuid=6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1", "address": "fa:16:3e:80:88:4e", "network": {"id": "c59c8bba-9fc6-441e-8b7d-cd5444901b2a", "bridge": "br-int", "label": "tempest-TaggedAttachmentsTest-680871933-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7bbf722f17654404925cfb53e48cd473", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc4a2805-d7", "ovs_interfaceid": "fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:46:16 np0005588920 nova_compute[226886]: 2026-01-20 14:46:16.485 226890 DEBUG nova.network.os_vif_util [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Converting VIF {"id": "fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1", "address": "fa:16:3e:80:88:4e", "network": {"id": "c59c8bba-9fc6-441e-8b7d-cd5444901b2a", "bridge": "br-int", "label": "tempest-TaggedAttachmentsTest-680871933-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7bbf722f17654404925cfb53e48cd473", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc4a2805-d7", "ovs_interfaceid": "fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:46:16 np0005588920 nova_compute[226886]: 2026-01-20 14:46:16.486 226890 DEBUG nova.network.os_vif_util [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:80:88:4e,bridge_name='br-int',has_traffic_filtering=True,id=fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1,network=Network(c59c8bba-9fc6-441e-8b7d-cd5444901b2a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc4a2805-d7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:46:16 np0005588920 nova_compute[226886]: 2026-01-20 14:46:16.487 226890 DEBUG nova.objects.instance [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:46:16 np0005588920 nova_compute[226886]: 2026-01-20 14:46:16.502 226890 DEBUG nova.virt.libvirt.driver [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:46:16 np0005588920 nova_compute[226886]:  <uuid>6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835</uuid>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:  <name>instance-00000061</name>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:46:16 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:      <nova:name>tempest-device-tagging-server-1302621797</nova:name>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:46:15</nova:creationTime>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:46:16 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:        <nova:user uuid="5cd9508688214bedb977528f8b6f95d1">tempest-TaggedAttachmentsTest-1005708480-project-member</nova:user>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:        <nova:project uuid="7bbf722f17654404925cfb53e48cd473">tempest-TaggedAttachmentsTest-1005708480</nova:project>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:        <nova:port uuid="fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1">
Jan 20 09:46:16 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:      <entry name="serial">6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835</entry>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:      <entry name="uuid">6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835</entry>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:46:16 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835_disk">
Jan 20 09:46:16 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:46:16 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:46:16 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835_disk.config">
Jan 20 09:46:16 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:46:16 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:46:16 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:80:88:4e"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:      <target dev="tapfc4a2805-d7"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:46:16 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835/console.log" append="off"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:46:16 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:46:16 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:46:16 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:46:16 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:46:16 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:46:16 np0005588920 nova_compute[226886]: 2026-01-20 14:46:16.503 226890 DEBUG nova.compute.manager [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Preparing to wait for external event network-vif-plugged-fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:46:16 np0005588920 nova_compute[226886]: 2026-01-20 14:46:16.504 226890 DEBUG oslo_concurrency.lockutils [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Acquiring lock "6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:46:16 np0005588920 nova_compute[226886]: 2026-01-20 14:46:16.504 226890 DEBUG oslo_concurrency.lockutils [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Lock "6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:46:16 np0005588920 nova_compute[226886]: 2026-01-20 14:46:16.504 226890 DEBUG oslo_concurrency.lockutils [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Lock "6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:46:16 np0005588920 nova_compute[226886]: 2026-01-20 14:46:16.505 226890 DEBUG nova.virt.libvirt.vif [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:46:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-1302621797',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-1302621797',id=97,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOduM1b5U0PoRBxfWNx/s8WugTbFKTiIFgBHu9L46ctJZjGe+8jT4Yj1g5uDYe7bVMyBzUuTiVef/RKuMuEZx2XkjNuYbN4taGM1Lc0bPehwylAtyxjPpZUim+i9YC026A==',key_name='tempest-keypair-1946074746',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7bbf722f17654404925cfb53e48cd473',ramdisk_id='',reservation_id='r-d41vvz1v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedAttachmentsTest-1005708480',owner_user_name='tempest-TaggedAttachmentsTest-1005708480-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:46:10Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='5cd9508688214bedb977528f8b6f95d1',uuid=6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1", "address": "fa:16:3e:80:88:4e", "network": {"id": "c59c8bba-9fc6-441e-8b7d-cd5444901b2a", "bridge": "br-int", "label": "tempest-TaggedAttachmentsTest-680871933-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7bbf722f17654404925cfb53e48cd473", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc4a2805-d7", "ovs_interfaceid": "fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:46:16 np0005588920 nova_compute[226886]: 2026-01-20 14:46:16.505 226890 DEBUG nova.network.os_vif_util [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Converting VIF {"id": "fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1", "address": "fa:16:3e:80:88:4e", "network": {"id": "c59c8bba-9fc6-441e-8b7d-cd5444901b2a", "bridge": "br-int", "label": "tempest-TaggedAttachmentsTest-680871933-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7bbf722f17654404925cfb53e48cd473", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc4a2805-d7", "ovs_interfaceid": "fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:46:16 np0005588920 nova_compute[226886]: 2026-01-20 14:46:16.505 226890 DEBUG nova.network.os_vif_util [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:80:88:4e,bridge_name='br-int',has_traffic_filtering=True,id=fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1,network=Network(c59c8bba-9fc6-441e-8b7d-cd5444901b2a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc4a2805-d7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:46:16 np0005588920 nova_compute[226886]: 2026-01-20 14:46:16.506 226890 DEBUG os_vif [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:80:88:4e,bridge_name='br-int',has_traffic_filtering=True,id=fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1,network=Network(c59c8bba-9fc6-441e-8b7d-cd5444901b2a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc4a2805-d7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:46:16 np0005588920 nova_compute[226886]: 2026-01-20 14:46:16.506 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:16 np0005588920 nova_compute[226886]: 2026-01-20 14:46:16.507 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:46:16 np0005588920 nova_compute[226886]: 2026-01-20 14:46:16.507 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:46:16 np0005588920 nova_compute[226886]: 2026-01-20 14:46:16.509 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:16 np0005588920 nova_compute[226886]: 2026-01-20 14:46:16.510 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfc4a2805-d7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:46:16 np0005588920 nova_compute[226886]: 2026-01-20 14:46:16.510 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfc4a2805-d7, col_values=(('external_ids', {'iface-id': 'fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:80:88:4e', 'vm-uuid': '6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:46:16 np0005588920 nova_compute[226886]: 2026-01-20 14:46:16.511 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:16 np0005588920 NetworkManager[49076]: <info>  [1768920376.5125] manager: (tapfc4a2805-d7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/199)
Jan 20 09:46:16 np0005588920 nova_compute[226886]: 2026-01-20 14:46:16.514 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:46:16 np0005588920 nova_compute[226886]: 2026-01-20 14:46:16.517 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:16 np0005588920 nova_compute[226886]: 2026-01-20 14:46:16.518 226890 INFO os_vif [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:80:88:4e,bridge_name='br-int',has_traffic_filtering=True,id=fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1,network=Network(c59c8bba-9fc6-441e-8b7d-cd5444901b2a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc4a2805-d7')#033[00m
Jan 20 09:46:16 np0005588920 podman[260730]: 2026-01-20 14:46:16.600103462 +0000 UTC m=+0.052759701 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 20 09:46:16 np0005588920 nova_compute[226886]: 2026-01-20 14:46:16.942 226890 DEBUG nova.virt.libvirt.driver [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:46:16 np0005588920 nova_compute[226886]: 2026-01-20 14:46:16.943 226890 DEBUG nova.virt.libvirt.driver [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:46:16 np0005588920 nova_compute[226886]: 2026-01-20 14:46:16.943 226890 DEBUG nova.virt.libvirt.driver [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] No VIF found with MAC fa:16:3e:80:88:4e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:46:16 np0005588920 nova_compute[226886]: 2026-01-20 14:46:16.943 226890 INFO nova.virt.libvirt.driver [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Using config drive#033[00m
Jan 20 09:46:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e238 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:46:17 np0005588920 nova_compute[226886]: 2026-01-20 14:46:17.315 226890 DEBUG nova.storage.rbd_utils [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] rbd image 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:46:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:17.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:18 np0005588920 ovn_controller[133971]: 2026-01-20T14:46:18Z|00050|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:22:f9:d2 10.100.0.4
Jan 20 09:46:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:18.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:18 np0005588920 nova_compute[226886]: 2026-01-20 14:46:18.613 226890 INFO nova.virt.libvirt.driver [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Creating config drive at /var/lib/nova/instances/6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835/disk.config#033[00m
Jan 20 09:46:18 np0005588920 nova_compute[226886]: 2026-01-20 14:46:18.619 226890 DEBUG oslo_concurrency.processutils [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbto75p0h execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:46:18 np0005588920 nova_compute[226886]: 2026-01-20 14:46:18.746 226890 DEBUG oslo_concurrency.processutils [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbto75p0h" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:46:18 np0005588920 nova_compute[226886]: 2026-01-20 14:46:18.774 226890 DEBUG nova.storage.rbd_utils [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] rbd image 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:46:18 np0005588920 nova_compute[226886]: 2026-01-20 14:46:18.776 226890 DEBUG oslo_concurrency.processutils [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835/disk.config 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:46:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:46:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:19.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:46:19 np0005588920 nova_compute[226886]: 2026-01-20 14:46:19.618 226890 DEBUG nova.network.neutron [req-1ba5d724-95e6-4979-aa45-ac866ba1b95f req-7774349e-7bb3-4b94-8819-4609c78832ed 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Updated VIF entry in instance network info cache for port fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:46:19 np0005588920 nova_compute[226886]: 2026-01-20 14:46:19.619 226890 DEBUG nova.network.neutron [req-1ba5d724-95e6-4979-aa45-ac866ba1b95f req-7774349e-7bb3-4b94-8819-4609c78832ed 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Updating instance_info_cache with network_info: [{"id": "fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1", "address": "fa:16:3e:80:88:4e", "network": {"id": "c59c8bba-9fc6-441e-8b7d-cd5444901b2a", "bridge": "br-int", "label": "tempest-TaggedAttachmentsTest-680871933-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7bbf722f17654404925cfb53e48cd473", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc4a2805-d7", "ovs_interfaceid": "fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:46:19 np0005588920 nova_compute[226886]: 2026-01-20 14:46:19.640 226890 DEBUG oslo_concurrency.lockutils [req-1ba5d724-95e6-4979-aa45-ac866ba1b95f req-7774349e-7bb3-4b94-8819-4609c78832ed 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:46:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:20.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:20 np0005588920 nova_compute[226886]: 2026-01-20 14:46:20.621 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:21 np0005588920 nova_compute[226886]: 2026-01-20 14:46:21.098 226890 DEBUG oslo_concurrency.processutils [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835/disk.config 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.322s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:46:21 np0005588920 nova_compute[226886]: 2026-01-20 14:46:21.099 226890 INFO nova.virt.libvirt.driver [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Deleting local config drive /var/lib/nova/instances/6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835/disk.config because it was imported into RBD.#033[00m
Jan 20 09:46:21 np0005588920 NetworkManager[49076]: <info>  [1768920381.1460] manager: (tapfc4a2805-d7): new Tun device (/org/freedesktop/NetworkManager/Devices/200)
Jan 20 09:46:21 np0005588920 kernel: tapfc4a2805-d7: entered promiscuous mode
Jan 20 09:46:21 np0005588920 nova_compute[226886]: 2026-01-20 14:46:21.147 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:21 np0005588920 ovn_controller[133971]: 2026-01-20T14:46:21Z|00379|binding|INFO|Claiming lport fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1 for this chassis.
Jan 20 09:46:21 np0005588920 ovn_controller[133971]: 2026-01-20T14:46:21Z|00380|binding|INFO|fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1: Claiming fa:16:3e:80:88:4e 10.100.0.9
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:21.165 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:80:88:4e 10.100.0.9'], port_security=['fa:16:3e:80:88:4e 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c59c8bba-9fc6-441e-8b7d-cd5444901b2a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7bbf722f17654404925cfb53e48cd473', 'neutron:revision_number': '2', 'neutron:security_group_ids': '34f09647-0a97-406a-bb20-4a478cae9ceb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=becf0890-2062-451f-a3f9-626953c24d96, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:21.167 144128 INFO neutron.agent.ovn.metadata.agent [-] Port fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1 in datapath c59c8bba-9fc6-441e-8b7d-cd5444901b2a bound to our chassis#033[00m
Jan 20 09:46:21 np0005588920 ovn_controller[133971]: 2026-01-20T14:46:21Z|00381|binding|INFO|Setting lport fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1 ovn-installed in OVS
Jan 20 09:46:21 np0005588920 ovn_controller[133971]: 2026-01-20T14:46:21Z|00382|binding|INFO|Setting lport fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1 up in Southbound
Jan 20 09:46:21 np0005588920 nova_compute[226886]: 2026-01-20 14:46:21.173 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:21.170 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c59c8bba-9fc6-441e-8b7d-cd5444901b2a#033[00m
Jan 20 09:46:21 np0005588920 nova_compute[226886]: 2026-01-20 14:46:21.177 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:21 np0005588920 systemd-udevd[260821]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:21.181 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[54618d88-7fd9-4798-a9fc-f4bb487c3ebe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:21.182 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc59c8bba-91 in ovnmeta-c59c8bba-9fc6-441e-8b7d-cd5444901b2a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:21.183 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc59c8bba-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:21.184 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a076dc55-761f-4079-ad1e-8429721e493f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:21.184 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c3122cf8-0a88-4cfd-acce-cef4df24173e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:21 np0005588920 systemd-machined[196121]: New machine qemu-42-instance-00000061.
Jan 20 09:46:21 np0005588920 NetworkManager[49076]: <info>  [1768920381.1920] device (tapfc4a2805-d7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:46:21 np0005588920 NetworkManager[49076]: <info>  [1768920381.1925] device (tapfc4a2805-d7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:21.197 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[eb2113d4-2196-4b79-ae87-3600673c98ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:21 np0005588920 systemd[1]: Started Virtual Machine qemu-42-instance-00000061.
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:21.222 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e67192b6-77a3-469a-9005-b829a5b949ea]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:21.250 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[5fa9b592-0755-47fb-bd15-17d94628a80e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:21 np0005588920 systemd-udevd[260825]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:21.256 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[cf7b2ec0-d44e-4870-a4c3-c9a73f817ac0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:21 np0005588920 NetworkManager[49076]: <info>  [1768920381.2574] manager: (tapc59c8bba-90): new Veth device (/org/freedesktop/NetworkManager/Devices/201)
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:21.284 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[fbed9b72-400a-46af-b512-6125cabc175b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:21.287 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[076abc93-59cc-4037-b98b-73f8a8b39946]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:21 np0005588920 NetworkManager[49076]: <info>  [1768920381.3085] device (tapc59c8bba-90): carrier: link connected
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:21.313 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[7d615c65-90ef-4ca7-94c0-b47b0babe57d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:21.330 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[12d881c5-cebb-477e-9eeb-35a04d0351b9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc59c8bba-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f4:aa:8f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 126], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 542299, 'reachable_time': 35399, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 260854, 'error': None, 'target': 'ovnmeta-c59c8bba-9fc6-441e-8b7d-cd5444901b2a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:21.345 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5f03ea1f-18bd-40bc-9e96-0d9fa1ee00b2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef4:aa8f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 542299, 'tstamp': 542299}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 260855, 'error': None, 'target': 'ovnmeta-c59c8bba-9fc6-441e-8b7d-cd5444901b2a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:21.362 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5ed6cada-db3c-4bd9-84f1-f32c41753c37]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc59c8bba-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f4:aa:8f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 126], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 542299, 'reachable_time': 35399, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 260856, 'error': None, 'target': 'ovnmeta-c59c8bba-9fc6-441e-8b7d-cd5444901b2a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:21.393 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[3347a308-cb0b-403b-a127-9f3fa38bff22]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:21 np0005588920 nova_compute[226886]: 2026-01-20 14:46:21.412 226890 DEBUG nova.virt.libvirt.driver [None req-a1ab864e-59a3-4f6a-b18b-0dc2c09fc375 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:21.447 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ee448866-3f1a-409f-8130-1940725eaabe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:21.449 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc59c8bba-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:21.449 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:21.450 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc59c8bba-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:46:21 np0005588920 nova_compute[226886]: 2026-01-20 14:46:21.451 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:21 np0005588920 kernel: tapc59c8bba-90: entered promiscuous mode
Jan 20 09:46:21 np0005588920 NetworkManager[49076]: <info>  [1768920381.4536] manager: (tapc59c8bba-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/202)
Jan 20 09:46:21 np0005588920 nova_compute[226886]: 2026-01-20 14:46:21.454 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:21.455 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc59c8bba-90, col_values=(('external_ids', {'iface-id': 'fba04072-e3c6-4239-ae0e-7afd6e092d1a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:46:21 np0005588920 nova_compute[226886]: 2026-01-20 14:46:21.456 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:21 np0005588920 ovn_controller[133971]: 2026-01-20T14:46:21Z|00383|binding|INFO|Releasing lport fba04072-e3c6-4239-ae0e-7afd6e092d1a from this chassis (sb_readonly=0)
Jan 20 09:46:21 np0005588920 nova_compute[226886]: 2026-01-20 14:46:21.470 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:21.470 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c59c8bba-9fc6-441e-8b7d-cd5444901b2a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c59c8bba-9fc6-441e-8b7d-cd5444901b2a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:21.471 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[73d896dc-9abd-4aae-a956-c86f9b9e8d1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:21.472 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-c59c8bba-9fc6-441e-8b7d-cd5444901b2a
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/c59c8bba-9fc6-441e-8b7d-cd5444901b2a.pid.haproxy
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID c59c8bba-9fc6-441e-8b7d-cd5444901b2a
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:46:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:21.473 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c59c8bba-9fc6-441e-8b7d-cd5444901b2a', 'env', 'PROCESS_TAG=haproxy-c59c8bba-9fc6-441e-8b7d-cd5444901b2a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c59c8bba-9fc6-441e-8b7d-cd5444901b2a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:46:21 np0005588920 nova_compute[226886]: 2026-01-20 14:46:21.512 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:21.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:21 np0005588920 podman[260979]: 2026-01-20 14:46:21.892437171 +0000 UTC m=+0.064018656 container create 20ad94eb1bba186f5acc2ddd5685c4c1f2deeb54ca503942417d552d6f52d866 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c59c8bba-9fc6-441e-8b7d-cd5444901b2a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:46:21 np0005588920 nova_compute[226886]: 2026-01-20 14:46:21.912 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920381.9115212, 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:46:21 np0005588920 nova_compute[226886]: 2026-01-20 14:46:21.912 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] VM Started (Lifecycle Event)#033[00m
Jan 20 09:46:21 np0005588920 systemd[1]: Started libpod-conmon-20ad94eb1bba186f5acc2ddd5685c4c1f2deeb54ca503942417d552d6f52d866.scope.
Jan 20 09:46:21 np0005588920 nova_compute[226886]: 2026-01-20 14:46:21.939 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:46:21 np0005588920 nova_compute[226886]: 2026-01-20 14:46:21.942 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920381.91183, 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:46:21 np0005588920 nova_compute[226886]: 2026-01-20 14:46:21.943 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:46:21 np0005588920 podman[260979]: 2026-01-20 14:46:21.85117374 +0000 UTC m=+0.022755275 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:46:21 np0005588920 nova_compute[226886]: 2026-01-20 14:46:21.955 226890 DEBUG nova.compute.manager [req-51a870fb-650c-403b-83ef-c2ae2ecb1c3f req-c3105147-70dc-4d52-9e8f-fe23a4614047 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Received event network-vif-plugged-fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:46:21 np0005588920 nova_compute[226886]: 2026-01-20 14:46:21.955 226890 DEBUG oslo_concurrency.lockutils [req-51a870fb-650c-403b-83ef-c2ae2ecb1c3f req-c3105147-70dc-4d52-9e8f-fe23a4614047 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:46:21 np0005588920 nova_compute[226886]: 2026-01-20 14:46:21.955 226890 DEBUG oslo_concurrency.lockutils [req-51a870fb-650c-403b-83ef-c2ae2ecb1c3f req-c3105147-70dc-4d52-9e8f-fe23a4614047 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:46:21 np0005588920 nova_compute[226886]: 2026-01-20 14:46:21.955 226890 DEBUG oslo_concurrency.lockutils [req-51a870fb-650c-403b-83ef-c2ae2ecb1c3f req-c3105147-70dc-4d52-9e8f-fe23a4614047 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:46:21 np0005588920 nova_compute[226886]: 2026-01-20 14:46:21.956 226890 DEBUG nova.compute.manager [req-51a870fb-650c-403b-83ef-c2ae2ecb1c3f req-c3105147-70dc-4d52-9e8f-fe23a4614047 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Processing event network-vif-plugged-fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:46:21 np0005588920 nova_compute[226886]: 2026-01-20 14:46:21.956 226890 DEBUG nova.compute.manager [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:46:21 np0005588920 nova_compute[226886]: 2026-01-20 14:46:21.959 226890 DEBUG nova.virt.libvirt.driver [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:46:21 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:46:21 np0005588920 nova_compute[226886]: 2026-01-20 14:46:21.962 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:46:21 np0005588920 nova_compute[226886]: 2026-01-20 14:46:21.963 226890 INFO nova.virt.libvirt.driver [-] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Instance spawned successfully.#033[00m
Jan 20 09:46:21 np0005588920 nova_compute[226886]: 2026-01-20 14:46:21.964 226890 DEBUG nova.virt.libvirt.driver [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:46:21 np0005588920 nova_compute[226886]: 2026-01-20 14:46:21.966 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920381.9588015, 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:46:21 np0005588920 nova_compute[226886]: 2026-01-20 14:46:21.966 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:46:21 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7cd0c84eb320478cd48992671f02f9d7a597064f617d158a39e212c251bbcaba/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:46:21 np0005588920 podman[260979]: 2026-01-20 14:46:21.979163628 +0000 UTC m=+0.150745093 container init 20ad94eb1bba186f5acc2ddd5685c4c1f2deeb54ca503942417d552d6f52d866 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c59c8bba-9fc6-441e-8b7d-cd5444901b2a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 20 09:46:21 np0005588920 podman[260979]: 2026-01-20 14:46:21.984367743 +0000 UTC m=+0.155949188 container start 20ad94eb1bba186f5acc2ddd5685c4c1f2deeb54ca503942417d552d6f52d866 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c59c8bba-9fc6-441e-8b7d-cd5444901b2a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:46:21 np0005588920 nova_compute[226886]: 2026-01-20 14:46:21.983 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:46:21 np0005588920 nova_compute[226886]: 2026-01-20 14:46:21.987 226890 DEBUG nova.virt.libvirt.driver [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:46:21 np0005588920 nova_compute[226886]: 2026-01-20 14:46:21.988 226890 DEBUG nova.virt.libvirt.driver [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:46:21 np0005588920 nova_compute[226886]: 2026-01-20 14:46:21.988 226890 DEBUG nova.virt.libvirt.driver [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:46:21 np0005588920 nova_compute[226886]: 2026-01-20 14:46:21.988 226890 DEBUG nova.virt.libvirt.driver [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:46:21 np0005588920 nova_compute[226886]: 2026-01-20 14:46:21.989 226890 DEBUG nova.virt.libvirt.driver [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:46:21 np0005588920 nova_compute[226886]: 2026-01-20 14:46:21.989 226890 DEBUG nova.virt.libvirt.driver [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:46:21 np0005588920 nova_compute[226886]: 2026-01-20 14:46:21.994 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:46:22 np0005588920 neutron-haproxy-ovnmeta-c59c8bba-9fc6-441e-8b7d-cd5444901b2a[260995]: [NOTICE]   (260999) : New worker (261001) forked
Jan 20 09:46:22 np0005588920 neutron-haproxy-ovnmeta-c59c8bba-9fc6-441e-8b7d-cd5444901b2a[260995]: [NOTICE]   (260999) : Loading success.
Jan 20 09:46:22 np0005588920 nova_compute[226886]: 2026-01-20 14:46:22.018 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:46:22 np0005588920 nova_compute[226886]: 2026-01-20 14:46:22.044 226890 INFO nova.compute.manager [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Took 11.72 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:46:22 np0005588920 nova_compute[226886]: 2026-01-20 14:46:22.045 226890 DEBUG nova.compute.manager [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:46:22 np0005588920 nova_compute[226886]: 2026-01-20 14:46:22.157 226890 INFO nova.compute.manager [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Took 13.53 seconds to build instance.#033[00m
Jan 20 09:46:22 np0005588920 nova_compute[226886]: 2026-01-20 14:46:22.283 226890 DEBUG oslo_concurrency.lockutils [None req-5fb0ae48-5901-4018-a90f-f534ab4dd41a 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Lock "6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.727s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:46:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e238 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:46:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:22.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:46:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:23.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:46:23 np0005588920 kernel: tapd3a9a684-c9 (unregistering): left promiscuous mode
Jan 20 09:46:23 np0005588920 NetworkManager[49076]: <info>  [1768920383.7829] device (tapd3a9a684-c9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:46:23 np0005588920 nova_compute[226886]: 2026-01-20 14:46:23.796 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:23 np0005588920 ovn_controller[133971]: 2026-01-20T14:46:23Z|00384|binding|INFO|Releasing lport d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 from this chassis (sb_readonly=0)
Jan 20 09:46:23 np0005588920 ovn_controller[133971]: 2026-01-20T14:46:23Z|00385|binding|INFO|Setting lport d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 down in Southbound
Jan 20 09:46:23 np0005588920 ovn_controller[133971]: 2026-01-20T14:46:23Z|00386|binding|INFO|Removing iface tapd3a9a684-c9 ovn-installed in OVS
Jan 20 09:46:23 np0005588920 nova_compute[226886]: 2026-01-20 14:46:23.800 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:23.812 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:22:f9:d2 10.100.0.4'], port_security=['fa:16:3e:22:f9:d2 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '75736b87-b14e-45b7-b43b-5129cf7d3279', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-762e1859-4db4-4d9e-b66f-d50316f80df4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1c5f03d46c0c4162a3b2f1530850bb6c', 'neutron:revision_number': '6', 'neutron:security_group_ids': '80535eda-fa59-4edc-8e3d-9bfea6517730', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.180'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2474a8ca-bb96-4cae-9133-23419b81a9fc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:46:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:23.813 144128 INFO neutron.agent.ovn.metadata.agent [-] Port d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 in datapath 762e1859-4db4-4d9e-b66f-d50316f80df4 unbound from our chassis#033[00m
Jan 20 09:46:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:23.814 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 762e1859-4db4-4d9e-b66f-d50316f80df4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:46:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:23.819 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[935265d6-6ea9-4c86-aabd-7dbc13326f93]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:23.819 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 namespace which is not needed anymore#033[00m
Jan 20 09:46:23 np0005588920 nova_compute[226886]: 2026-01-20 14:46:23.832 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:23 np0005588920 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d0000005e.scope: Deactivated successfully.
Jan 20 09:46:23 np0005588920 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d0000005e.scope: Consumed 13.866s CPU time.
Jan 20 09:46:23 np0005588920 systemd-machined[196121]: Machine qemu-41-instance-0000005e terminated.
Jan 20 09:46:23 np0005588920 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[260306]: [NOTICE]   (260310) : haproxy version is 2.8.14-c23fe91
Jan 20 09:46:23 np0005588920 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[260306]: [NOTICE]   (260310) : path to executable is /usr/sbin/haproxy
Jan 20 09:46:23 np0005588920 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[260306]: [WARNING]  (260310) : Exiting Master process...
Jan 20 09:46:23 np0005588920 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[260306]: [WARNING]  (260310) : Exiting Master process...
Jan 20 09:46:23 np0005588920 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[260306]: [ALERT]    (260310) : Current worker (260312) exited with code 143 (Terminated)
Jan 20 09:46:23 np0005588920 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[260306]: [WARNING]  (260310) : All workers exited. Exiting... (0)
Jan 20 09:46:23 np0005588920 systemd[1]: libpod-0d0f295407dcbc6c66f72289fb95ceee29f314264a6efeaf353b7477a2f1d499.scope: Deactivated successfully.
Jan 20 09:46:23 np0005588920 podman[261031]: 2026-01-20 14:46:23.98306945 +0000 UTC m=+0.050586931 container died 0d0f295407dcbc6c66f72289fb95ceee29f314264a6efeaf353b7477a2f1d499 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:46:24 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0d0f295407dcbc6c66f72289fb95ceee29f314264a6efeaf353b7477a2f1d499-userdata-shm.mount: Deactivated successfully.
Jan 20 09:46:24 np0005588920 systemd[1]: var-lib-containers-storage-overlay-eaf85e64b76581f5d51c6da2b1bb2d2f13e80a5addb67c456c320362b323fcea-merged.mount: Deactivated successfully.
Jan 20 09:46:24 np0005588920 nova_compute[226886]: 2026-01-20 14:46:24.018 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:24 np0005588920 nova_compute[226886]: 2026-01-20 14:46:24.024 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:24 np0005588920 podman[261031]: 2026-01-20 14:46:24.029120574 +0000 UTC m=+0.096638065 container cleanup 0d0f295407dcbc6c66f72289fb95ceee29f314264a6efeaf353b7477a2f1d499 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:46:24 np0005588920 systemd[1]: libpod-conmon-0d0f295407dcbc6c66f72289fb95ceee29f314264a6efeaf353b7477a2f1d499.scope: Deactivated successfully.
Jan 20 09:46:24 np0005588920 nova_compute[226886]: 2026-01-20 14:46:24.037 226890 DEBUG nova.compute.manager [req-eaba3a75-ef8a-47ff-bbc5-d93c47b5f8f9 req-e0574d17-ac6a-4b01-9924-62d8e3ae7adb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Received event network-vif-unplugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:46:24 np0005588920 nova_compute[226886]: 2026-01-20 14:46:24.037 226890 DEBUG oslo_concurrency.lockutils [req-eaba3a75-ef8a-47ff-bbc5-d93c47b5f8f9 req-e0574d17-ac6a-4b01-9924-62d8e3ae7adb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:46:24 np0005588920 nova_compute[226886]: 2026-01-20 14:46:24.037 226890 DEBUG oslo_concurrency.lockutils [req-eaba3a75-ef8a-47ff-bbc5-d93c47b5f8f9 req-e0574d17-ac6a-4b01-9924-62d8e3ae7adb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:46:24 np0005588920 nova_compute[226886]: 2026-01-20 14:46:24.038 226890 DEBUG oslo_concurrency.lockutils [req-eaba3a75-ef8a-47ff-bbc5-d93c47b5f8f9 req-e0574d17-ac6a-4b01-9924-62d8e3ae7adb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:46:24 np0005588920 nova_compute[226886]: 2026-01-20 14:46:24.038 226890 DEBUG nova.compute.manager [req-eaba3a75-ef8a-47ff-bbc5-d93c47b5f8f9 req-e0574d17-ac6a-4b01-9924-62d8e3ae7adb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] No waiting events found dispatching network-vif-unplugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:46:24 np0005588920 nova_compute[226886]: 2026-01-20 14:46:24.038 226890 WARNING nova.compute.manager [req-eaba3a75-ef8a-47ff-bbc5-d93c47b5f8f9 req-e0574d17-ac6a-4b01-9924-62d8e3ae7adb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Received unexpected event network-vif-unplugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 for instance with vm_state active and task_state powering-off.#033[00m
Jan 20 09:46:24 np0005588920 nova_compute[226886]: 2026-01-20 14:46:24.039 226890 DEBUG nova.compute.manager [req-cdc1bc08-24b5-4ffd-a55f-50a591ef3f32 req-b76ac46b-922e-4fb9-9d57-fe967269c779 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Received event network-vif-plugged-fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:46:24 np0005588920 nova_compute[226886]: 2026-01-20 14:46:24.040 226890 DEBUG oslo_concurrency.lockutils [req-cdc1bc08-24b5-4ffd-a55f-50a591ef3f32 req-b76ac46b-922e-4fb9-9d57-fe967269c779 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:46:24 np0005588920 nova_compute[226886]: 2026-01-20 14:46:24.040 226890 DEBUG oslo_concurrency.lockutils [req-cdc1bc08-24b5-4ffd-a55f-50a591ef3f32 req-b76ac46b-922e-4fb9-9d57-fe967269c779 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:46:24 np0005588920 nova_compute[226886]: 2026-01-20 14:46:24.040 226890 DEBUG oslo_concurrency.lockutils [req-cdc1bc08-24b5-4ffd-a55f-50a591ef3f32 req-b76ac46b-922e-4fb9-9d57-fe967269c779 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:46:24 np0005588920 nova_compute[226886]: 2026-01-20 14:46:24.041 226890 DEBUG nova.compute.manager [req-cdc1bc08-24b5-4ffd-a55f-50a591ef3f32 req-b76ac46b-922e-4fb9-9d57-fe967269c779 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] No waiting events found dispatching network-vif-plugged-fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:46:24 np0005588920 nova_compute[226886]: 2026-01-20 14:46:24.041 226890 WARNING nova.compute.manager [req-cdc1bc08-24b5-4ffd-a55f-50a591ef3f32 req-b76ac46b-922e-4fb9-9d57-fe967269c779 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Received unexpected event network-vif-plugged-fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:46:24 np0005588920 podman[261070]: 2026-01-20 14:46:24.092327746 +0000 UTC m=+0.042181497 container remove 0d0f295407dcbc6c66f72289fb95ceee29f314264a6efeaf353b7477a2f1d499 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 20 09:46:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:24.100 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[09e5e9bd-237c-4134-a126-25c4014a87e5]: (4, ('Tue Jan 20 02:46:23 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 (0d0f295407dcbc6c66f72289fb95ceee29f314264a6efeaf353b7477a2f1d499)\n0d0f295407dcbc6c66f72289fb95ceee29f314264a6efeaf353b7477a2f1d499\nTue Jan 20 02:46:24 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 (0d0f295407dcbc6c66f72289fb95ceee29f314264a6efeaf353b7477a2f1d499)\n0d0f295407dcbc6c66f72289fb95ceee29f314264a6efeaf353b7477a2f1d499\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:24.102 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[002b7804-48b0-4c06-b6d1-37416af00486]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:24.103 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap762e1859-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:46:24 np0005588920 nova_compute[226886]: 2026-01-20 14:46:24.104 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:24 np0005588920 kernel: tap762e1859-40: left promiscuous mode
Jan 20 09:46:24 np0005588920 nova_compute[226886]: 2026-01-20 14:46:24.121 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:24.125 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0610a980-023c-4456-9995-4e6afbcd88d2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:24.139 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5df7ac65-e69a-4f75-837c-bd31ab10a4f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:24.140 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[31abc62b-e66d-4a1b-a1bc-add1b8974d1e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:24.155 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2412d96c-f4ef-46cf-8ca3-caf310d8229a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 540412, 'reachable_time': 29706, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 261089, 'error': None, 'target': 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:24 np0005588920 systemd[1]: run-netns-ovnmeta\x2d762e1859\x2d4db4\x2d4d9e\x2db66f\x2dd50316f80df4.mount: Deactivated successfully.
Jan 20 09:46:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:24.158 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:46:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:24.159 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[8b7d31e1-3441-4fd0-9d59-f2c7e4556bc6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:46:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:24.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:46:24 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:46:24 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:46:24 np0005588920 nova_compute[226886]: 2026-01-20 14:46:24.428 226890 INFO nova.virt.libvirt.driver [None req-a1ab864e-59a3-4f6a-b18b-0dc2c09fc375 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Instance shutdown successfully after 13 seconds.#033[00m
Jan 20 09:46:24 np0005588920 nova_compute[226886]: 2026-01-20 14:46:24.434 226890 INFO nova.virt.libvirt.driver [-] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Instance destroyed successfully.#033[00m
Jan 20 09:46:24 np0005588920 nova_compute[226886]: 2026-01-20 14:46:24.435 226890 DEBUG nova.objects.instance [None req-a1ab864e-59a3-4f6a-b18b-0dc2c09fc375 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'numa_topology' on Instance uuid 75736b87-b14e-45b7-b43b-5129cf7d3279 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:46:24 np0005588920 nova_compute[226886]: 2026-01-20 14:46:24.447 226890 DEBUG nova.compute.manager [None req-a1ab864e-59a3-4f6a-b18b-0dc2c09fc375 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:46:24 np0005588920 nova_compute[226886]: 2026-01-20 14:46:24.484 226890 DEBUG oslo_concurrency.lockutils [None req-a1ab864e-59a3-4f6a-b18b-0dc2c09fc375 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 13.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:46:25 np0005588920 nova_compute[226886]: 2026-01-20 14:46:25.062 226890 DEBUG nova.compute.manager [req-86f9f61b-4d0c-4acc-86a1-7ac9a184c167 req-90f1955f-db89-4fbe-b692-858a14ee89e3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Received event network-changed-fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:46:25 np0005588920 nova_compute[226886]: 2026-01-20 14:46:25.063 226890 DEBUG nova.compute.manager [req-86f9f61b-4d0c-4acc-86a1-7ac9a184c167 req-90f1955f-db89-4fbe-b692-858a14ee89e3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Refreshing instance network info cache due to event network-changed-fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:46:25 np0005588920 nova_compute[226886]: 2026-01-20 14:46:25.063 226890 DEBUG oslo_concurrency.lockutils [req-86f9f61b-4d0c-4acc-86a1-7ac9a184c167 req-90f1955f-db89-4fbe-b692-858a14ee89e3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:46:25 np0005588920 nova_compute[226886]: 2026-01-20 14:46:25.063 226890 DEBUG oslo_concurrency.lockutils [req-86f9f61b-4d0c-4acc-86a1-7ac9a184c167 req-90f1955f-db89-4fbe-b692-858a14ee89e3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:46:25 np0005588920 nova_compute[226886]: 2026-01-20 14:46:25.064 226890 DEBUG nova.network.neutron [req-86f9f61b-4d0c-4acc-86a1-7ac9a184c167 req-90f1955f-db89-4fbe-b692-858a14ee89e3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Refreshing network info cache for port fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:46:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:25.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:25 np0005588920 nova_compute[226886]: 2026-01-20 14:46:25.622 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:25 np0005588920 nova_compute[226886]: 2026-01-20 14:46:25.740 226890 DEBUG nova.objects.instance [None req-05782539-6dae-4253-8edc-e28015f32ec9 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'flavor' on Instance uuid 75736b87-b14e-45b7-b43b-5129cf7d3279 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:46:25 np0005588920 nova_compute[226886]: 2026-01-20 14:46:25.760 226890 DEBUG oslo_concurrency.lockutils [None req-05782539-6dae-4253-8edc-e28015f32ec9 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "refresh_cache-75736b87-b14e-45b7-b43b-5129cf7d3279" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:46:25 np0005588920 nova_compute[226886]: 2026-01-20 14:46:25.760 226890 DEBUG oslo_concurrency.lockutils [None req-05782539-6dae-4253-8edc-e28015f32ec9 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquired lock "refresh_cache-75736b87-b14e-45b7-b43b-5129cf7d3279" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:46:25 np0005588920 nova_compute[226886]: 2026-01-20 14:46:25.761 226890 DEBUG nova.network.neutron [None req-05782539-6dae-4253-8edc-e28015f32ec9 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:46:25 np0005588920 nova_compute[226886]: 2026-01-20 14:46:25.761 226890 DEBUG nova.objects.instance [None req-05782539-6dae-4253-8edc-e28015f32ec9 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'info_cache' on Instance uuid 75736b87-b14e-45b7-b43b-5129cf7d3279 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:46:26 np0005588920 nova_compute[226886]: 2026-01-20 14:46:26.193 226890 DEBUG nova.compute.manager [req-a479628d-b8b7-4e9f-afe4-3b43620d0f20 req-8d27c20e-bc0d-4167-a5ce-d0eae78f1cfb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Received event network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:46:26 np0005588920 nova_compute[226886]: 2026-01-20 14:46:26.193 226890 DEBUG oslo_concurrency.lockutils [req-a479628d-b8b7-4e9f-afe4-3b43620d0f20 req-8d27c20e-bc0d-4167-a5ce-d0eae78f1cfb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:46:26 np0005588920 nova_compute[226886]: 2026-01-20 14:46:26.193 226890 DEBUG oslo_concurrency.lockutils [req-a479628d-b8b7-4e9f-afe4-3b43620d0f20 req-8d27c20e-bc0d-4167-a5ce-d0eae78f1cfb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:46:26 np0005588920 nova_compute[226886]: 2026-01-20 14:46:26.194 226890 DEBUG oslo_concurrency.lockutils [req-a479628d-b8b7-4e9f-afe4-3b43620d0f20 req-8d27c20e-bc0d-4167-a5ce-d0eae78f1cfb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:46:26 np0005588920 nova_compute[226886]: 2026-01-20 14:46:26.194 226890 DEBUG nova.compute.manager [req-a479628d-b8b7-4e9f-afe4-3b43620d0f20 req-8d27c20e-bc0d-4167-a5ce-d0eae78f1cfb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] No waiting events found dispatching network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:46:26 np0005588920 nova_compute[226886]: 2026-01-20 14:46:26.194 226890 WARNING nova.compute.manager [req-a479628d-b8b7-4e9f-afe4-3b43620d0f20 req-8d27c20e-bc0d-4167-a5ce-d0eae78f1cfb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Received unexpected event network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 for instance with vm_state stopped and task_state powering-on.#033[00m
Jan 20 09:46:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:26.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:26 np0005588920 nova_compute[226886]: 2026-01-20 14:46:26.514 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:27 np0005588920 nova_compute[226886]: 2026-01-20 14:46:27.098 226890 DEBUG nova.network.neutron [req-86f9f61b-4d0c-4acc-86a1-7ac9a184c167 req-90f1955f-db89-4fbe-b692-858a14ee89e3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Updated VIF entry in instance network info cache for port fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:46:27 np0005588920 nova_compute[226886]: 2026-01-20 14:46:27.099 226890 DEBUG nova.network.neutron [req-86f9f61b-4d0c-4acc-86a1-7ac9a184c167 req-90f1955f-db89-4fbe-b692-858a14ee89e3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Updating instance_info_cache with network_info: [{"id": "fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1", "address": "fa:16:3e:80:88:4e", "network": {"id": "c59c8bba-9fc6-441e-8b7d-cd5444901b2a", "bridge": "br-int", "label": "tempest-TaggedAttachmentsTest-680871933-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7bbf722f17654404925cfb53e48cd473", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc4a2805-d7", "ovs_interfaceid": "fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:46:27 np0005588920 nova_compute[226886]: 2026-01-20 14:46:27.121 226890 DEBUG oslo_concurrency.lockutils [req-86f9f61b-4d0c-4acc-86a1-7ac9a184c167 req-90f1955f-db89-4fbe-b692-858a14ee89e3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:46:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e238 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:46:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:27.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:27 np0005588920 nova_compute[226886]: 2026-01-20 14:46:27.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:46:27 np0005588920 nova_compute[226886]: 2026-01-20 14:46:27.726 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:46:27 np0005588920 nova_compute[226886]: 2026-01-20 14:46:27.769 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 09:46:28 np0005588920 nova_compute[226886]: 2026-01-20 14:46:28.027 226890 DEBUG nova.network.neutron [None req-05782539-6dae-4253-8edc-e28015f32ec9 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Updating instance_info_cache with network_info: [{"id": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "address": "fa:16:3e:22:f9:d2", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a9a684-c9", "ovs_interfaceid": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:46:28 np0005588920 nova_compute[226886]: 2026-01-20 14:46:28.049 226890 DEBUG oslo_concurrency.lockutils [None req-05782539-6dae-4253-8edc-e28015f32ec9 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Releasing lock "refresh_cache-75736b87-b14e-45b7-b43b-5129cf7d3279" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:46:28 np0005588920 nova_compute[226886]: 2026-01-20 14:46:28.079 226890 INFO nova.virt.libvirt.driver [-] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Instance destroyed successfully.#033[00m
Jan 20 09:46:28 np0005588920 nova_compute[226886]: 2026-01-20 14:46:28.079 226890 DEBUG nova.objects.instance [None req-05782539-6dae-4253-8edc-e28015f32ec9 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'numa_topology' on Instance uuid 75736b87-b14e-45b7-b43b-5129cf7d3279 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:46:28 np0005588920 nova_compute[226886]: 2026-01-20 14:46:28.096 226890 DEBUG nova.objects.instance [None req-05782539-6dae-4253-8edc-e28015f32ec9 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'resources' on Instance uuid 75736b87-b14e-45b7-b43b-5129cf7d3279 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:46:28 np0005588920 nova_compute[226886]: 2026-01-20 14:46:28.118 226890 DEBUG nova.virt.libvirt.vif [None req-05782539-6dae-4253-8edc-e28015f32ec9 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:45:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1202945337',display_name='tempest-ServerActionsTestJSON-server-1202945337',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1202945337',id=94,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLgEzx5mLsSqRL7L9WKOzM+WdeJ40U103wY9H3VMZ41G/sN5tQtQSt9lXWKTyc6pt00bfKD0E9GPugNMpy+dzSSpK23o3CgadkzAfAvjQCeCPOSp3fX13FGApomGd1HRCQ==',key_name='tempest-keypair-1602241722',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:45:21Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='1c5f03d46c0c4162a3b2f1530850bb6c',ramdisk_id='',reservation_id='r-luaqa362',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1020442335',owner_user_name='tempest-ServerActionsTestJSON-1020442335-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:46:24Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e9278fdb9e645b7938f3edb20c4d3cf',uuid=75736b87-b14e-45b7-b43b-5129cf7d3279,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "address": "fa:16:3e:22:f9:d2", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a9a684-c9", "ovs_interfaceid": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:46:28 np0005588920 nova_compute[226886]: 2026-01-20 14:46:28.119 226890 DEBUG nova.network.os_vif_util [None req-05782539-6dae-4253-8edc-e28015f32ec9 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converting VIF {"id": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "address": "fa:16:3e:22:f9:d2", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a9a684-c9", "ovs_interfaceid": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:46:28 np0005588920 nova_compute[226886]: 2026-01-20 14:46:28.120 226890 DEBUG nova.network.os_vif_util [None req-05782539-6dae-4253-8edc-e28015f32ec9 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:22:f9:d2,bridge_name='br-int',has_traffic_filtering=True,id=d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3a9a684-c9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:46:28 np0005588920 nova_compute[226886]: 2026-01-20 14:46:28.121 226890 DEBUG os_vif [None req-05782539-6dae-4253-8edc-e28015f32ec9 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:f9:d2,bridge_name='br-int',has_traffic_filtering=True,id=d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3a9a684-c9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:46:28 np0005588920 nova_compute[226886]: 2026-01-20 14:46:28.123 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:28 np0005588920 nova_compute[226886]: 2026-01-20 14:46:28.124 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd3a9a684-c9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:46:28 np0005588920 nova_compute[226886]: 2026-01-20 14:46:28.125 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:28 np0005588920 nova_compute[226886]: 2026-01-20 14:46:28.127 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:28 np0005588920 nova_compute[226886]: 2026-01-20 14:46:28.130 226890 INFO os_vif [None req-05782539-6dae-4253-8edc-e28015f32ec9 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:f9:d2,bridge_name='br-int',has_traffic_filtering=True,id=d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3a9a684-c9')#033[00m
Jan 20 09:46:28 np0005588920 nova_compute[226886]: 2026-01-20 14:46:28.138 226890 DEBUG nova.virt.libvirt.driver [None req-05782539-6dae-4253-8edc-e28015f32ec9 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Start _get_guest_xml network_info=[{"id": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "address": "fa:16:3e:22:f9:d2", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a9a684-c9", "ovs_interfaceid": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:46:28 np0005588920 nova_compute[226886]: 2026-01-20 14:46:28.142 226890 WARNING nova.virt.libvirt.driver [None req-05782539-6dae-4253-8edc-e28015f32ec9 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:46:28 np0005588920 nova_compute[226886]: 2026-01-20 14:46:28.155 226890 DEBUG nova.virt.libvirt.host [None req-05782539-6dae-4253-8edc-e28015f32ec9 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:46:28 np0005588920 nova_compute[226886]: 2026-01-20 14:46:28.156 226890 DEBUG nova.virt.libvirt.host [None req-05782539-6dae-4253-8edc-e28015f32ec9 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:46:28 np0005588920 nova_compute[226886]: 2026-01-20 14:46:28.162 226890 DEBUG nova.virt.libvirt.host [None req-05782539-6dae-4253-8edc-e28015f32ec9 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:46:28 np0005588920 nova_compute[226886]: 2026-01-20 14:46:28.162 226890 DEBUG nova.virt.libvirt.host [None req-05782539-6dae-4253-8edc-e28015f32ec9 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:46:28 np0005588920 nova_compute[226886]: 2026-01-20 14:46:28.163 226890 DEBUG nova.virt.libvirt.driver [None req-05782539-6dae-4253-8edc-e28015f32ec9 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:46:28 np0005588920 nova_compute[226886]: 2026-01-20 14:46:28.163 226890 DEBUG nova.virt.hardware [None req-05782539-6dae-4253-8edc-e28015f32ec9 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:46:28 np0005588920 nova_compute[226886]: 2026-01-20 14:46:28.164 226890 DEBUG nova.virt.hardware [None req-05782539-6dae-4253-8edc-e28015f32ec9 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:46:28 np0005588920 nova_compute[226886]: 2026-01-20 14:46:28.164 226890 DEBUG nova.virt.hardware [None req-05782539-6dae-4253-8edc-e28015f32ec9 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:46:28 np0005588920 nova_compute[226886]: 2026-01-20 14:46:28.164 226890 DEBUG nova.virt.hardware [None req-05782539-6dae-4253-8edc-e28015f32ec9 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:46:28 np0005588920 nova_compute[226886]: 2026-01-20 14:46:28.164 226890 DEBUG nova.virt.hardware [None req-05782539-6dae-4253-8edc-e28015f32ec9 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:46:28 np0005588920 nova_compute[226886]: 2026-01-20 14:46:28.164 226890 DEBUG nova.virt.hardware [None req-05782539-6dae-4253-8edc-e28015f32ec9 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:46:28 np0005588920 nova_compute[226886]: 2026-01-20 14:46:28.165 226890 DEBUG nova.virt.hardware [None req-05782539-6dae-4253-8edc-e28015f32ec9 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:46:28 np0005588920 nova_compute[226886]: 2026-01-20 14:46:28.165 226890 DEBUG nova.virt.hardware [None req-05782539-6dae-4253-8edc-e28015f32ec9 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:46:28 np0005588920 nova_compute[226886]: 2026-01-20 14:46:28.165 226890 DEBUG nova.virt.hardware [None req-05782539-6dae-4253-8edc-e28015f32ec9 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:46:28 np0005588920 nova_compute[226886]: 2026-01-20 14:46:28.165 226890 DEBUG nova.virt.hardware [None req-05782539-6dae-4253-8edc-e28015f32ec9 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:46:28 np0005588920 nova_compute[226886]: 2026-01-20 14:46:28.165 226890 DEBUG nova.virt.hardware [None req-05782539-6dae-4253-8edc-e28015f32ec9 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:46:28 np0005588920 nova_compute[226886]: 2026-01-20 14:46:28.166 226890 DEBUG nova.objects.instance [None req-05782539-6dae-4253-8edc-e28015f32ec9 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'vcpu_model' on Instance uuid 75736b87-b14e-45b7-b43b-5129cf7d3279 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:46:28 np0005588920 nova_compute[226886]: 2026-01-20 14:46:28.184 226890 DEBUG oslo_concurrency.processutils [None req-05782539-6dae-4253-8edc-e28015f32ec9 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:46:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:46:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:28.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:46:28 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:46:28 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3610685573' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:46:28 np0005588920 nova_compute[226886]: 2026-01-20 14:46:28.609 226890 DEBUG oslo_concurrency.processutils [None req-05782539-6dae-4253-8edc-e28015f32ec9 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.425s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:46:28 np0005588920 nova_compute[226886]: 2026-01-20 14:46:28.651 226890 DEBUG oslo_concurrency.processutils [None req-05782539-6dae-4253-8edc-e28015f32ec9 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:46:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:46:29 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3828261511' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:46:29 np0005588920 nova_compute[226886]: 2026-01-20 14:46:29.084 226890 DEBUG oslo_concurrency.processutils [None req-05782539-6dae-4253-8edc-e28015f32ec9 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:46:29 np0005588920 nova_compute[226886]: 2026-01-20 14:46:29.086 226890 DEBUG nova.virt.libvirt.vif [None req-05782539-6dae-4253-8edc-e28015f32ec9 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:45:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1202945337',display_name='tempest-ServerActionsTestJSON-server-1202945337',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1202945337',id=94,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLgEzx5mLsSqRL7L9WKOzM+WdeJ40U103wY9H3VMZ41G/sN5tQtQSt9lXWKTyc6pt00bfKD0E9GPugNMpy+dzSSpK23o3CgadkzAfAvjQCeCPOSp3fX13FGApomGd1HRCQ==',key_name='tempest-keypair-1602241722',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:45:21Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='1c5f03d46c0c4162a3b2f1530850bb6c',ramdisk_id='',reservation_id='r-luaqa362',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1020442335',owner_user_name='tempest-ServerActionsTestJSON-1020442335-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:46:24Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e9278fdb9e645b7938f3edb20c4d3cf',uuid=75736b87-b14e-45b7-b43b-5129cf7d3279,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "address": "fa:16:3e:22:f9:d2", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a9a684-c9", "ovs_interfaceid": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:46:29 np0005588920 nova_compute[226886]: 2026-01-20 14:46:29.086 226890 DEBUG nova.network.os_vif_util [None req-05782539-6dae-4253-8edc-e28015f32ec9 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converting VIF {"id": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "address": "fa:16:3e:22:f9:d2", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a9a684-c9", "ovs_interfaceid": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:46:29 np0005588920 nova_compute[226886]: 2026-01-20 14:46:29.087 226890 DEBUG nova.network.os_vif_util [None req-05782539-6dae-4253-8edc-e28015f32ec9 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:22:f9:d2,bridge_name='br-int',has_traffic_filtering=True,id=d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3a9a684-c9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:46:29 np0005588920 nova_compute[226886]: 2026-01-20 14:46:29.088 226890 DEBUG nova.objects.instance [None req-05782539-6dae-4253-8edc-e28015f32ec9 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'pci_devices' on Instance uuid 75736b87-b14e-45b7-b43b-5129cf7d3279 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:46:29 np0005588920 nova_compute[226886]: 2026-01-20 14:46:29.107 226890 DEBUG nova.virt.libvirt.driver [None req-05782539-6dae-4253-8edc-e28015f32ec9 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:46:29 np0005588920 nova_compute[226886]:  <uuid>75736b87-b14e-45b7-b43b-5129cf7d3279</uuid>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:  <name>instance-0000005e</name>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:46:29 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:      <nova:name>tempest-ServerActionsTestJSON-server-1202945337</nova:name>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:46:28</nova:creationTime>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:46:29 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:        <nova:user uuid="3e9278fdb9e645b7938f3edb20c4d3cf">tempest-ServerActionsTestJSON-1020442335-project-member</nova:user>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:        <nova:project uuid="1c5f03d46c0c4162a3b2f1530850bb6c">tempest-ServerActionsTestJSON-1020442335</nova:project>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:        <nova:port uuid="d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6">
Jan 20 09:46:29 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:      <entry name="serial">75736b87-b14e-45b7-b43b-5129cf7d3279</entry>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:      <entry name="uuid">75736b87-b14e-45b7-b43b-5129cf7d3279</entry>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:46:29 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/75736b87-b14e-45b7-b43b-5129cf7d3279_disk">
Jan 20 09:46:29 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:46:29 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:46:29 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/75736b87-b14e-45b7-b43b-5129cf7d3279_disk.config">
Jan 20 09:46:29 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:46:29 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:46:29 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:22:f9:d2"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:      <target dev="tapd3a9a684-c9"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:46:29 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/75736b87-b14e-45b7-b43b-5129cf7d3279/console.log" append="off"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    <input type="keyboard" bus="usb"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:46:29 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:46:29 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:46:29 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:46:29 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:46:29 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:46:29 np0005588920 nova_compute[226886]: 2026-01-20 14:46:29.109 226890 DEBUG nova.virt.libvirt.driver [None req-05782539-6dae-4253-8edc-e28015f32ec9 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] skipping disk for instance-0000005e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:46:29 np0005588920 nova_compute[226886]: 2026-01-20 14:46:29.109 226890 DEBUG nova.virt.libvirt.driver [None req-05782539-6dae-4253-8edc-e28015f32ec9 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] skipping disk for instance-0000005e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:46:29 np0005588920 nova_compute[226886]: 2026-01-20 14:46:29.110 226890 DEBUG nova.virt.libvirt.vif [None req-05782539-6dae-4253-8edc-e28015f32ec9 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:45:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1202945337',display_name='tempest-ServerActionsTestJSON-server-1202945337',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1202945337',id=94,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLgEzx5mLsSqRL7L9WKOzM+WdeJ40U103wY9H3VMZ41G/sN5tQtQSt9lXWKTyc6pt00bfKD0E9GPugNMpy+dzSSpK23o3CgadkzAfAvjQCeCPOSp3fX13FGApomGd1HRCQ==',key_name='tempest-keypair-1602241722',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:45:21Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='1c5f03d46c0c4162a3b2f1530850bb6c',ramdisk_id='',reservation_id='r-luaqa362',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1020442335',owner_user_name='tempest-ServerActionsTestJSON-1020442335-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:46:24Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e9278fdb9e645b7938f3edb20c4d3cf',uuid=75736b87-b14e-45b7-b43b-5129cf7d3279,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "address": "fa:16:3e:22:f9:d2", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": 
[], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a9a684-c9", "ovs_interfaceid": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:46:29 np0005588920 nova_compute[226886]: 2026-01-20 14:46:29.111 226890 DEBUG nova.network.os_vif_util [None req-05782539-6dae-4253-8edc-e28015f32ec9 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converting VIF {"id": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "address": "fa:16:3e:22:f9:d2", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a9a684-c9", "ovs_interfaceid": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:46:29 np0005588920 nova_compute[226886]: 2026-01-20 14:46:29.111 226890 DEBUG nova.network.os_vif_util [None req-05782539-6dae-4253-8edc-e28015f32ec9 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:22:f9:d2,bridge_name='br-int',has_traffic_filtering=True,id=d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3a9a684-c9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:46:29 np0005588920 nova_compute[226886]: 2026-01-20 14:46:29.112 226890 DEBUG os_vif [None req-05782539-6dae-4253-8edc-e28015f32ec9 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:f9:d2,bridge_name='br-int',has_traffic_filtering=True,id=d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3a9a684-c9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:46:29 np0005588920 nova_compute[226886]: 2026-01-20 14:46:29.112 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:29 np0005588920 nova_compute[226886]: 2026-01-20 14:46:29.113 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:46:29 np0005588920 nova_compute[226886]: 2026-01-20 14:46:29.113 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:46:29 np0005588920 nova_compute[226886]: 2026-01-20 14:46:29.116 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:29 np0005588920 nova_compute[226886]: 2026-01-20 14:46:29.116 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd3a9a684-c9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:46:29 np0005588920 nova_compute[226886]: 2026-01-20 14:46:29.117 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd3a9a684-c9, col_values=(('external_ids', {'iface-id': 'd3a9a684-c9a7-4abc-a085-9dcd17bfc2e6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:22:f9:d2', 'vm-uuid': '75736b87-b14e-45b7-b43b-5129cf7d3279'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:46:29 np0005588920 nova_compute[226886]: 2026-01-20 14:46:29.118 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:29 np0005588920 NetworkManager[49076]: <info>  [1768920389.1193] manager: (tapd3a9a684-c9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/203)
Jan 20 09:46:29 np0005588920 nova_compute[226886]: 2026-01-20 14:46:29.120 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:46:29 np0005588920 nova_compute[226886]: 2026-01-20 14:46:29.123 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:29 np0005588920 nova_compute[226886]: 2026-01-20 14:46:29.124 226890 INFO os_vif [None req-05782539-6dae-4253-8edc-e28015f32ec9 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:f9:d2,bridge_name='br-int',has_traffic_filtering=True,id=d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3a9a684-c9')#033[00m
Jan 20 09:46:29 np0005588920 kernel: tapd3a9a684-c9: entered promiscuous mode
Jan 20 09:46:29 np0005588920 nova_compute[226886]: 2026-01-20 14:46:29.183 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:29 np0005588920 ovn_controller[133971]: 2026-01-20T14:46:29Z|00387|binding|INFO|Claiming lport d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 for this chassis.
Jan 20 09:46:29 np0005588920 ovn_controller[133971]: 2026-01-20T14:46:29Z|00388|binding|INFO|d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6: Claiming fa:16:3e:22:f9:d2 10.100.0.4
Jan 20 09:46:29 np0005588920 NetworkManager[49076]: <info>  [1768920389.1917] manager: (tapd3a9a684-c9): new Tun device (/org/freedesktop/NetworkManager/Devices/204)
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:29.194 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:22:f9:d2 10.100.0.4'], port_security=['fa:16:3e:22:f9:d2 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '75736b87-b14e-45b7-b43b-5129cf7d3279', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-762e1859-4db4-4d9e-b66f-d50316f80df4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1c5f03d46c0c4162a3b2f1530850bb6c', 'neutron:revision_number': '7', 'neutron:security_group_ids': '80535eda-fa59-4edc-8e3d-9bfea6517730', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.180'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2474a8ca-bb96-4cae-9133-23419b81a9fc, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:29.195 144128 INFO neutron.agent.ovn.metadata.agent [-] Port d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 in datapath 762e1859-4db4-4d9e-b66f-d50316f80df4 bound to our chassis#033[00m
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:29.197 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 762e1859-4db4-4d9e-b66f-d50316f80df4#033[00m
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:29.208 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[05055941-b5df-4130-abdc-8c4825faa11f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:29.209 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap762e1859-41 in ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:46:29 np0005588920 nova_compute[226886]: 2026-01-20 14:46:29.210 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:29 np0005588920 ovn_controller[133971]: 2026-01-20T14:46:29Z|00389|binding|INFO|Setting lport d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 ovn-installed in OVS
Jan 20 09:46:29 np0005588920 ovn_controller[133971]: 2026-01-20T14:46:29Z|00390|binding|INFO|Setting lport d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 up in Southbound
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:29.211 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap762e1859-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:46:29 np0005588920 nova_compute[226886]: 2026-01-20 14:46:29.211 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:29.211 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5a6ca783-4e91-4898-8af2-638276931cd5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:29.213 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[49bd40a2-ac00-4d8b-a1a3-713a7bfb3ab7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:29 np0005588920 nova_compute[226886]: 2026-01-20 14:46:29.225 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:29 np0005588920 systemd-machined[196121]: New machine qemu-43-instance-0000005e.
Jan 20 09:46:29 np0005588920 systemd[1]: Started Virtual Machine qemu-43-instance-0000005e.
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:29.231 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[685bae33-1d5e-4b9a-905a-92e7d7cd683c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:29 np0005588920 systemd-udevd[261170]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:29.248 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[13f352aa-2429-4dd2-b55d-4da305df1854]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:29 np0005588920 NetworkManager[49076]: <info>  [1768920389.2585] device (tapd3a9a684-c9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:46:29 np0005588920 NetworkManager[49076]: <info>  [1768920389.2595] device (tapd3a9a684-c9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:29.284 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[86bb5d35-b015-4729-a526-af5f4d6d1393]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:29 np0005588920 NetworkManager[49076]: <info>  [1768920389.2921] manager: (tap762e1859-40): new Veth device (/org/freedesktop/NetworkManager/Devices/205)
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:29.291 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[9c432d3a-8e9c-4bd5-8ed7-597d076cf2c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:29 np0005588920 systemd-udevd[261174]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:29.334 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[c252cf83-c13d-4caa-807d-948efef3d0fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:29.337 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[51cad676-ab35-48be-a3c2-03123c651bda]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:29 np0005588920 NetworkManager[49076]: <info>  [1768920389.3614] device (tap762e1859-40): carrier: link connected
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:29.369 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[6f008cc7-ec6b-44d5-bdf4-13a1e0b093f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:29.386 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[bce2959e-9f97-4016-bca5-381e45edca91]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap762e1859-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:69:f1:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 129], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 543104, 'reachable_time': 31016, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 261200, 'error': None, 'target': 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:29.408 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[7f5c4101-a1a4-4579-b7d4-b4cb150449a8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe69:f1da'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 543104, 'tstamp': 543104}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 261201, 'error': None, 'target': 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:29.431 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[3eb8bb17-09fb-4a8b-9c50-aa8a70f8a38f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap762e1859-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:69:f1:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 129], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 543104, 'reachable_time': 31016, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 261202, 'error': None, 'target': 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:29.468 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[41adcc81-84c3-4b1a-bfe0-12bba12624cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:46:29 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2519742879' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:29.550 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[7d71e3c7-e6fb-47a1-aa7c-19d2e837a3b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:29.552 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap762e1859-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:29.553 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:29.554 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap762e1859-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:46:29 np0005588920 nova_compute[226886]: 2026-01-20 14:46:29.556 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:29 np0005588920 kernel: tap762e1859-40: entered promiscuous mode
Jan 20 09:46:29 np0005588920 NetworkManager[49076]: <info>  [1768920389.5588] manager: (tap762e1859-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/206)
Jan 20 09:46:29 np0005588920 nova_compute[226886]: 2026-01-20 14:46:29.561 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:29.566 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap762e1859-40, col_values=(('external_ids', {'iface-id': '9e775c45-1646-436d-a0cb-a5b5ec356e1b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:46:29 np0005588920 nova_compute[226886]: 2026-01-20 14:46:29.568 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:29 np0005588920 ovn_controller[133971]: 2026-01-20T14:46:29Z|00391|binding|INFO|Releasing lport 9e775c45-1646-436d-a0cb-a5b5ec356e1b from this chassis (sb_readonly=0)
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:29.572 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/762e1859-4db4-4d9e-b66f-d50316f80df4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/762e1859-4db4-4d9e-b66f-d50316f80df4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:29.574 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[8935ae70-ab8d-4af2-b320-4b8721dbea1a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:29.576 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-762e1859-4db4-4d9e-b66f-d50316f80df4
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/762e1859-4db4-4d9e-b66f-d50316f80df4.pid.haproxy
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID 762e1859-4db4-4d9e-b66f-d50316f80df4
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:46:29 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:29.578 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'env', 'PROCESS_TAG=haproxy-762e1859-4db4-4d9e-b66f-d50316f80df4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/762e1859-4db4-4d9e-b66f-d50316f80df4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:46:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:29.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:29 np0005588920 nova_compute[226886]: 2026-01-20 14:46:29.592 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:29 np0005588920 nova_compute[226886]: 2026-01-20 14:46:29.835 226890 DEBUG nova.compute.manager [req-19536d38-d737-44e0-ad56-4f49311acdf7 req-8891b008-9f83-45cb-b837-0010fee71688 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Received event network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:46:29 np0005588920 nova_compute[226886]: 2026-01-20 14:46:29.836 226890 DEBUG oslo_concurrency.lockutils [req-19536d38-d737-44e0-ad56-4f49311acdf7 req-8891b008-9f83-45cb-b837-0010fee71688 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:46:29 np0005588920 nova_compute[226886]: 2026-01-20 14:46:29.836 226890 DEBUG oslo_concurrency.lockutils [req-19536d38-d737-44e0-ad56-4f49311acdf7 req-8891b008-9f83-45cb-b837-0010fee71688 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:46:29 np0005588920 nova_compute[226886]: 2026-01-20 14:46:29.837 226890 DEBUG oslo_concurrency.lockutils [req-19536d38-d737-44e0-ad56-4f49311acdf7 req-8891b008-9f83-45cb-b837-0010fee71688 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:46:29 np0005588920 nova_compute[226886]: 2026-01-20 14:46:29.837 226890 DEBUG nova.compute.manager [req-19536d38-d737-44e0-ad56-4f49311acdf7 req-8891b008-9f83-45cb-b837-0010fee71688 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] No waiting events found dispatching network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:46:29 np0005588920 nova_compute[226886]: 2026-01-20 14:46:29.837 226890 WARNING nova.compute.manager [req-19536d38-d737-44e0-ad56-4f49311acdf7 req-8891b008-9f83-45cb-b837-0010fee71688 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Received unexpected event network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 for instance with vm_state stopped and task_state powering-on.#033[00m
Jan 20 09:46:30 np0005588920 podman[261232]: 2026-01-20 14:46:29.916612273 +0000 UTC m=+0.028602478 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:46:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:30.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:30 np0005588920 podman[261232]: 2026-01-20 14:46:30.504058979 +0000 UTC m=+0.616049164 container create 24713f5a8234981a08e67f062731cc7923d4a83a0090be88645677080a684ecb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:46:30 np0005588920 systemd[1]: Started libpod-conmon-24713f5a8234981a08e67f062731cc7923d4a83a0090be88645677080a684ecb.scope.
Jan 20 09:46:30 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:46:30 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47062e0df0cfa1d0ea402d5c1307c6fe84cf2a90848f6933450d9fe1e2b4a2ef/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:46:30 np0005588920 podman[261232]: 2026-01-20 14:46:30.57369276 +0000 UTC m=+0.685682965 container init 24713f5a8234981a08e67f062731cc7923d4a83a0090be88645677080a684ecb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 20 09:46:30 np0005588920 podman[261232]: 2026-01-20 14:46:30.580043507 +0000 UTC m=+0.692033692 container start 24713f5a8234981a08e67f062731cc7923d4a83a0090be88645677080a684ecb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:46:30 np0005588920 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[261288]: [NOTICE]   (261294) : New worker (261297) forked
Jan 20 09:46:30 np0005588920 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[261288]: [NOTICE]   (261294) : Loading success.
Jan 20 09:46:30 np0005588920 nova_compute[226886]: 2026-01-20 14:46:30.626 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:30 np0005588920 nova_compute[226886]: 2026-01-20 14:46:30.635 226890 DEBUG nova.virt.libvirt.host [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Removed pending event for 75736b87-b14e-45b7-b43b-5129cf7d3279 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 20 09:46:30 np0005588920 nova_compute[226886]: 2026-01-20 14:46:30.635 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920390.6348112, 75736b87-b14e-45b7-b43b-5129cf7d3279 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:46:30 np0005588920 nova_compute[226886]: 2026-01-20 14:46:30.636 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:46:30 np0005588920 nova_compute[226886]: 2026-01-20 14:46:30.638 226890 DEBUG nova.compute.manager [None req-05782539-6dae-4253-8edc-e28015f32ec9 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:46:30 np0005588920 nova_compute[226886]: 2026-01-20 14:46:30.642 226890 INFO nova.virt.libvirt.driver [-] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Instance rebooted successfully.#033[00m
Jan 20 09:46:30 np0005588920 nova_compute[226886]: 2026-01-20 14:46:30.643 226890 DEBUG nova.compute.manager [None req-05782539-6dae-4253-8edc-e28015f32ec9 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:46:30 np0005588920 nova_compute[226886]: 2026-01-20 14:46:30.763 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:46:30 np0005588920 nova_compute[226886]: 2026-01-20 14:46:30.766 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:46:30 np0005588920 nova_compute[226886]: 2026-01-20 14:46:30.936 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920390.63626, 75736b87-b14e-45b7-b43b-5129cf7d3279 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:46:30 np0005588920 nova_compute[226886]: 2026-01-20 14:46:30.937 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] VM Started (Lifecycle Event)#033[00m
Jan 20 09:46:30 np0005588920 nova_compute[226886]: 2026-01-20 14:46:30.964 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:46:30 np0005588920 nova_compute[226886]: 2026-01-20 14:46:30.968 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:46:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:31.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:32 np0005588920 nova_compute[226886]: 2026-01-20 14:46:32.049 226890 DEBUG nova.compute.manager [req-5154f22f-093f-40f3-94c5-5c3255bbedb9 req-2be27284-9361-47e2-bce4-36e455389018 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Received event network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:46:32 np0005588920 nova_compute[226886]: 2026-01-20 14:46:32.050 226890 DEBUG oslo_concurrency.lockutils [req-5154f22f-093f-40f3-94c5-5c3255bbedb9 req-2be27284-9361-47e2-bce4-36e455389018 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:46:32 np0005588920 nova_compute[226886]: 2026-01-20 14:46:32.051 226890 DEBUG oslo_concurrency.lockutils [req-5154f22f-093f-40f3-94c5-5c3255bbedb9 req-2be27284-9361-47e2-bce4-36e455389018 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:46:32 np0005588920 nova_compute[226886]: 2026-01-20 14:46:32.051 226890 DEBUG oslo_concurrency.lockutils [req-5154f22f-093f-40f3-94c5-5c3255bbedb9 req-2be27284-9361-47e2-bce4-36e455389018 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:46:32 np0005588920 nova_compute[226886]: 2026-01-20 14:46:32.052 226890 DEBUG nova.compute.manager [req-5154f22f-093f-40f3-94c5-5c3255bbedb9 req-2be27284-9361-47e2-bce4-36e455389018 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] No waiting events found dispatching network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:46:32 np0005588920 nova_compute[226886]: 2026-01-20 14:46:32.052 226890 WARNING nova.compute.manager [req-5154f22f-093f-40f3-94c5-5c3255bbedb9 req-2be27284-9361-47e2-bce4-36e455389018 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Received unexpected event network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:46:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e238 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:46:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:32.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:32 np0005588920 nova_compute[226886]: 2026-01-20 14:46:32.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:46:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:33.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:33 np0005588920 nova_compute[226886]: 2026-01-20 14:46:33.721 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 09:46:33 np0005588920 nova_compute[226886]: 2026-01-20 14:46:33.760 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 09:46:34 np0005588920 nova_compute[226886]: 2026-01-20 14:46:34.119 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:46:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:46:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:34.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:46:34 np0005588920 nova_compute[226886]: 2026-01-20 14:46:34.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 09:46:34 np0005588920 nova_compute[226886]: 2026-01-20 14:46:34.758 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:46:34 np0005588920 nova_compute[226886]: 2026-01-20 14:46:34.759 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:46:34 np0005588920 nova_compute[226886]: 2026-01-20 14:46:34.760 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:46:34 np0005588920 nova_compute[226886]: 2026-01-20 14:46:34.760 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 20 09:46:34 np0005588920 nova_compute[226886]: 2026-01-20 14:46:34.761 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:46:34 np0005588920 nova_compute[226886]: 2026-01-20 14:46:34.938 226890 INFO nova.compute.manager [None req-05c97dc8-9c43-4028-a41b-f19f27ddca80 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Pausing
Jan 20 09:46:34 np0005588920 nova_compute[226886]: 2026-01-20 14:46:34.939 226890 DEBUG nova.objects.instance [None req-05c97dc8-9c43-4028-a41b-f19f27ddca80 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'flavor' on Instance uuid 75736b87-b14e-45b7-b43b-5129cf7d3279 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 09:46:34 np0005588920 nova_compute[226886]: 2026-01-20 14:46:34.963 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920394.9633312, 75736b87-b14e-45b7-b43b-5129cf7d3279 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 20 09:46:34 np0005588920 nova_compute[226886]: 2026-01-20 14:46:34.963 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] VM Paused (Lifecycle Event)
Jan 20 09:46:34 np0005588920 nova_compute[226886]: 2026-01-20 14:46:34.965 226890 DEBUG nova.compute.manager [None req-05c97dc8-9c43-4028-a41b-f19f27ddca80 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 09:46:34 np0005588920 nova_compute[226886]: 2026-01-20 14:46:34.990 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 09:46:34 np0005588920 nova_compute[226886]: 2026-01-20 14:46:34.995 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 20 09:46:35 np0005588920 nova_compute[226886]: 2026-01-20 14:46:35.021 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] During sync_power_state the instance has a pending task (pausing). Skip.
Jan 20 09:46:35 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:46:35 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3432940160' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:46:35 np0005588920 nova_compute[226886]: 2026-01-20 14:46:35.219 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:46:35 np0005588920 nova_compute[226886]: 2026-01-20 14:46:35.283 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-0000005e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 20 09:46:35 np0005588920 nova_compute[226886]: 2026-01-20 14:46:35.283 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-0000005e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 20 09:46:35 np0005588920 nova_compute[226886]: 2026-01-20 14:46:35.286 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-00000061 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 20 09:46:35 np0005588920 nova_compute[226886]: 2026-01-20 14:46:35.286 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-00000061 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 20 09:46:35 np0005588920 nova_compute[226886]: 2026-01-20 14:46:35.473 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 20 09:46:35 np0005588920 nova_compute[226886]: 2026-01-20 14:46:35.475 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4235MB free_disk=20.798847198486328GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 20 09:46:35 np0005588920 nova_compute[226886]: 2026-01-20 14:46:35.475 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:46:35 np0005588920 nova_compute[226886]: 2026-01-20 14:46:35.476 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:46:35 np0005588920 ovn_controller[133971]: 2026-01-20T14:46:35Z|00051|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:80:88:4e 10.100.0.9
Jan 20 09:46:35 np0005588920 ovn_controller[133971]: 2026-01-20T14:46:35Z|00052|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:80:88:4e 10.100.0.9
Jan 20 09:46:35 np0005588920 nova_compute[226886]: 2026-01-20 14:46:35.538 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance 75736b87-b14e-45b7-b43b-5129cf7d3279 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 20 09:46:35 np0005588920 nova_compute[226886]: 2026-01-20 14:46:35.538 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 20 09:46:35 np0005588920 nova_compute[226886]: 2026-01-20 14:46:35.539 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 20 09:46:35 np0005588920 nova_compute[226886]: 2026-01-20 14:46:35.539 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 20 09:46:35 np0005588920 nova_compute[226886]: 2026-01-20 14:46:35.591 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:46:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:46:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:35.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:46:35 np0005588920 nova_compute[226886]: 2026-01-20 14:46:35.626 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:46:35 np0005588920 podman[261349]: 2026-01-20 14:46:35.990931471 +0000 UTC m=+0.078791887 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 20 09:46:36 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:46:36 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4114297959' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:46:36 np0005588920 nova_compute[226886]: 2026-01-20 14:46:36.018 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:46:36 np0005588920 nova_compute[226886]: 2026-01-20 14:46:36.023 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 20 09:46:36 np0005588920 nova_compute[226886]: 2026-01-20 14:46:36.098 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 20 09:46:36 np0005588920 nova_compute[226886]: 2026-01-20 14:46:36.138 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 20 09:46:36 np0005588920 nova_compute[226886]: 2026-01-20 14:46:36.139 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.663s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:46:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:46:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:36.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:46:37 np0005588920 nova_compute[226886]: 2026-01-20 14:46:37.068 226890 INFO nova.compute.manager [None req-3eff38f9-74b6-4954-a535-1c5c88a0d6e4 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Unpausing
Jan 20 09:46:37 np0005588920 nova_compute[226886]: 2026-01-20 14:46:37.070 226890 DEBUG nova.objects.instance [None req-3eff38f9-74b6-4954-a535-1c5c88a0d6e4 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'flavor' on Instance uuid 75736b87-b14e-45b7-b43b-5129cf7d3279 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 09:46:37 np0005588920 nova_compute[226886]: 2026-01-20 14:46:37.096 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920397.0958302, 75736b87-b14e-45b7-b43b-5129cf7d3279 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 20 09:46:37 np0005588920 nova_compute[226886]: 2026-01-20 14:46:37.096 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] VM Resumed (Lifecycle Event)
Jan 20 09:46:37 np0005588920 virtqemud[226436]: argument unsupported: QEMU guest agent is not configured
Jan 20 09:46:37 np0005588920 nova_compute[226886]: 2026-01-20 14:46:37.100 226890 DEBUG nova.virt.libvirt.guest [None req-3eff38f9-74b6-4954-a535-1c5c88a0d6e4 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Jan 20 09:46:37 np0005588920 nova_compute[226886]: 2026-01-20 14:46:37.101 226890 DEBUG nova.compute.manager [None req-3eff38f9-74b6-4954-a535-1c5c88a0d6e4 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 09:46:37 np0005588920 nova_compute[226886]: 2026-01-20 14:46:37.121 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 09:46:37 np0005588920 nova_compute[226886]: 2026-01-20 14:46:37.124 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: paused, current task_state: unpausing, current DB power_state: 3, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 20 09:46:37 np0005588920 nova_compute[226886]: 2026-01-20 14:46:37.139 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 09:46:37 np0005588920 nova_compute[226886]: 2026-01-20 14:46:37.140 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 09:46:37 np0005588920 nova_compute[226886]: 2026-01-20 14:46:37.140 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 20 09:46:37 np0005588920 nova_compute[226886]: 2026-01-20 14:46:37.146 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] During sync_power_state the instance has a pending task (unpausing). Skip.
Jan 20 09:46:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e238 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:46:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:46:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:37.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:46:37 np0005588920 nova_compute[226886]: 2026-01-20 14:46:37.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 09:46:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:38.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:38 np0005588920 nova_compute[226886]: 2026-01-20 14:46:38.721 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 09:46:39 np0005588920 nova_compute[226886]: 2026-01-20 14:46:39.121 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:46:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:39.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:46:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:40.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:46:40 np0005588920 nova_compute[226886]: 2026-01-20 14:46:40.628 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:46:40 np0005588920 nova_compute[226886]: 2026-01-20 14:46:40.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 09:46:41 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:46:41 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2051009862' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:46:41 np0005588920 nova_compute[226886]: 2026-01-20 14:46:41.489 226890 DEBUG oslo_concurrency.lockutils [None req-6bd57b2e-8e2e-428b-978c-b5dc548e7b22 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Acquiring lock "interface-6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:46:41 np0005588920 nova_compute[226886]: 2026-01-20 14:46:41.489 226890 DEBUG oslo_concurrency.lockutils [None req-6bd57b2e-8e2e-428b-978c-b5dc548e7b22 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Lock "interface-6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:46:41 np0005588920 nova_compute[226886]: 2026-01-20 14:46:41.490 226890 DEBUG nova.objects.instance [None req-6bd57b2e-8e2e-428b-978c-b5dc548e7b22 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Lazy-loading 'flavor' on Instance uuid 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 09:46:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:46:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:41.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:46:41 np0005588920 nova_compute[226886]: 2026-01-20 14:46:41.914 226890 DEBUG nova.objects.instance [None req-6bd57b2e-8e2e-428b-978c-b5dc548e7b22 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Lazy-loading 'pci_requests' on Instance uuid 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 09:46:41 np0005588920 nova_compute[226886]: 2026-01-20 14:46:41.931 226890 DEBUG nova.network.neutron [None req-6bd57b2e-8e2e-428b-978c-b5dc548e7b22 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 20 09:46:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e238 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:46:42 np0005588920 nova_compute[226886]: 2026-01-20 14:46:42.374 226890 DEBUG nova.policy [None req-6bd57b2e-8e2e-428b-978c-b5dc548e7b22 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5cd9508688214bedb977528f8b6f95d1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7bbf722f17654404925cfb53e48cd473', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 20 09:46:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:42.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:43 np0005588920 nova_compute[226886]: 2026-01-20 14:46:43.346 226890 DEBUG nova.network.neutron [None req-6bd57b2e-8e2e-428b-978c-b5dc548e7b22 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Successfully created port: 67e337c5-f5f0-4e50-8fd6-77a02fb05273 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:46:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:43.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:44 np0005588920 nova_compute[226886]: 2026-01-20 14:46:44.158 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:44 np0005588920 nova_compute[226886]: 2026-01-20 14:46:44.310 226890 DEBUG nova.network.neutron [None req-6bd57b2e-8e2e-428b-978c-b5dc548e7b22 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Successfully updated port: 67e337c5-f5f0-4e50-8fd6-77a02fb05273 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:46:44 np0005588920 nova_compute[226886]: 2026-01-20 14:46:44.352 226890 DEBUG oslo_concurrency.lockutils [None req-6bd57b2e-8e2e-428b-978c-b5dc548e7b22 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Acquiring lock "refresh_cache-6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:46:44 np0005588920 nova_compute[226886]: 2026-01-20 14:46:44.353 226890 DEBUG oslo_concurrency.lockutils [None req-6bd57b2e-8e2e-428b-978c-b5dc548e7b22 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Acquired lock "refresh_cache-6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:46:44 np0005588920 nova_compute[226886]: 2026-01-20 14:46:44.353 226890 DEBUG nova.network.neutron [None req-6bd57b2e-8e2e-428b-978c-b5dc548e7b22 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:46:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:44.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:44 np0005588920 nova_compute[226886]: 2026-01-20 14:46:44.406 226890 DEBUG nova.compute.manager [req-367868be-3fd5-4baf-ab6a-18673c3d88da req-10fcc128-6d40-4a9f-9e3c-bc544503d7aa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Received event network-changed-67e337c5-f5f0-4e50-8fd6-77a02fb05273 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:46:44 np0005588920 nova_compute[226886]: 2026-01-20 14:46:44.407 226890 DEBUG nova.compute.manager [req-367868be-3fd5-4baf-ab6a-18673c3d88da req-10fcc128-6d40-4a9f-9e3c-bc544503d7aa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Refreshing instance network info cache due to event network-changed-67e337c5-f5f0-4e50-8fd6-77a02fb05273. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:46:44 np0005588920 nova_compute[226886]: 2026-01-20 14:46:44.407 226890 DEBUG oslo_concurrency.lockutils [req-367868be-3fd5-4baf-ab6a-18673c3d88da req-10fcc128-6d40-4a9f-9e3c-bc544503d7aa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:46:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:45.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:45 np0005588920 nova_compute[226886]: 2026-01-20 14:46:45.633 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:46.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:46 np0005588920 ovn_controller[133971]: 2026-01-20T14:46:46Z|00053|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:22:f9:d2 10.100.0.4
Jan 20 09:46:46 np0005588920 podman[261377]: 2026-01-20 14:46:46.963959655 +0000 UTC m=+0.049390627 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:46:47 np0005588920 nova_compute[226886]: 2026-01-20 14:46:47.134 226890 DEBUG nova.network.neutron [None req-6bd57b2e-8e2e-428b-978c-b5dc548e7b22 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Updating instance_info_cache with network_info: [{"id": "fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1", "address": "fa:16:3e:80:88:4e", "network": {"id": "c59c8bba-9fc6-441e-8b7d-cd5444901b2a", "bridge": "br-int", "label": "tempest-TaggedAttachmentsTest-680871933-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7bbf722f17654404925cfb53e48cd473", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc4a2805-d7", "ovs_interfaceid": "fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "67e337c5-f5f0-4e50-8fd6-77a02fb05273", "address": "fa:16:3e:14:2a:53", "network": {"id": "224f915a-f1b3-471e-87e9-97b33406d6fd", "bridge": "br-int", "label": "tempest-tagged-attachments-test-net-1970629831", "subnets": [{"cidr": "10.10.10.0/24", "dns": [], "gateway": {"address": "10.10.10.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.10.179", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "7bbf722f17654404925cfb53e48cd473", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap67e337c5-f5", "ovs_interfaceid": "67e337c5-f5f0-4e50-8fd6-77a02fb05273", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:46:47 np0005588920 nova_compute[226886]: 2026-01-20 14:46:47.152 226890 DEBUG oslo_concurrency.lockutils [None req-6bd57b2e-8e2e-428b-978c-b5dc548e7b22 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Releasing lock "refresh_cache-6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:46:47 np0005588920 nova_compute[226886]: 2026-01-20 14:46:47.153 226890 DEBUG oslo_concurrency.lockutils [req-367868be-3fd5-4baf-ab6a-18673c3d88da req-10fcc128-6d40-4a9f-9e3c-bc544503d7aa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:46:47 np0005588920 nova_compute[226886]: 2026-01-20 14:46:47.153 226890 DEBUG nova.network.neutron [req-367868be-3fd5-4baf-ab6a-18673c3d88da req-10fcc128-6d40-4a9f-9e3c-bc544503d7aa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Refreshing network info cache for port 67e337c5-f5f0-4e50-8fd6-77a02fb05273 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:46:47 np0005588920 nova_compute[226886]: 2026-01-20 14:46:47.156 226890 DEBUG nova.virt.libvirt.vif [None req-6bd57b2e-8e2e-428b-978c-b5dc548e7b22 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:46:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-1302621797',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-1302621797',id=97,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOduM1b5U0PoRBxfWNx/s8WugTbFKTiIFgBHu9L46ctJZjGe+8jT4Yj1g5uDYe7bVMyBzUuTiVef/RKuMuEZx2XkjNuYbN4taGM1Lc0bPehwylAtyxjPpZUim+i9YC026A==',key_name='tempest-keypair-1946074746',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:46:22Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='7bbf722f17654404925cfb53e48cd473',ramdisk_id='',reservation_id='r-d41vvz1v',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio'
,image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TaggedAttachmentsTest-1005708480',owner_user_name='tempest-TaggedAttachmentsTest-1005708480-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:46:22Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='5cd9508688214bedb977528f8b6f95d1',uuid=6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "67e337c5-f5f0-4e50-8fd6-77a02fb05273", "address": "fa:16:3e:14:2a:53", "network": {"id": "224f915a-f1b3-471e-87e9-97b33406d6fd", "bridge": "br-int", "label": "tempest-tagged-attachments-test-net-1970629831", "subnets": [{"cidr": "10.10.10.0/24", "dns": [], "gateway": {"address": "10.10.10.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.10.179", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7bbf722f17654404925cfb53e48cd473", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap67e337c5-f5", "ovs_interfaceid": "67e337c5-f5f0-4e50-8fd6-77a02fb05273", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:46:47 np0005588920 nova_compute[226886]: 2026-01-20 14:46:47.156 226890 DEBUG nova.network.os_vif_util [None req-6bd57b2e-8e2e-428b-978c-b5dc548e7b22 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Converting VIF {"id": "67e337c5-f5f0-4e50-8fd6-77a02fb05273", "address": "fa:16:3e:14:2a:53", "network": {"id": "224f915a-f1b3-471e-87e9-97b33406d6fd", "bridge": "br-int", "label": "tempest-tagged-attachments-test-net-1970629831", "subnets": [{"cidr": "10.10.10.0/24", "dns": [], "gateway": {"address": "10.10.10.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.10.179", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7bbf722f17654404925cfb53e48cd473", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap67e337c5-f5", "ovs_interfaceid": "67e337c5-f5f0-4e50-8fd6-77a02fb05273", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:46:47 np0005588920 nova_compute[226886]: 2026-01-20 14:46:47.157 226890 DEBUG nova.network.os_vif_util [None req-6bd57b2e-8e2e-428b-978c-b5dc548e7b22 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:14:2a:53,bridge_name='br-int',has_traffic_filtering=True,id=67e337c5-f5f0-4e50-8fd6-77a02fb05273,network=Network(224f915a-f1b3-471e-87e9-97b33406d6fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap67e337c5-f5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:46:47 np0005588920 nova_compute[226886]: 2026-01-20 14:46:47.157 226890 DEBUG os_vif [None req-6bd57b2e-8e2e-428b-978c-b5dc548e7b22 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:14:2a:53,bridge_name='br-int',has_traffic_filtering=True,id=67e337c5-f5f0-4e50-8fd6-77a02fb05273,network=Network(224f915a-f1b3-471e-87e9-97b33406d6fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap67e337c5-f5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:46:47 np0005588920 nova_compute[226886]: 2026-01-20 14:46:47.157 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:47 np0005588920 nova_compute[226886]: 2026-01-20 14:46:47.158 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:46:47 np0005588920 nova_compute[226886]: 2026-01-20 14:46:47.158 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:46:47 np0005588920 nova_compute[226886]: 2026-01-20 14:46:47.161 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:47 np0005588920 nova_compute[226886]: 2026-01-20 14:46:47.161 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap67e337c5-f5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:46:47 np0005588920 nova_compute[226886]: 2026-01-20 14:46:47.162 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap67e337c5-f5, col_values=(('external_ids', {'iface-id': '67e337c5-f5f0-4e50-8fd6-77a02fb05273', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:14:2a:53', 'vm-uuid': '6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:46:47 np0005588920 nova_compute[226886]: 2026-01-20 14:46:47.163 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:47 np0005588920 NetworkManager[49076]: <info>  [1768920407.1644] manager: (tap67e337c5-f5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/207)
Jan 20 09:46:47 np0005588920 nova_compute[226886]: 2026-01-20 14:46:47.165 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:46:47 np0005588920 nova_compute[226886]: 2026-01-20 14:46:47.171 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:47 np0005588920 nova_compute[226886]: 2026-01-20 14:46:47.172 226890 INFO os_vif [None req-6bd57b2e-8e2e-428b-978c-b5dc548e7b22 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:14:2a:53,bridge_name='br-int',has_traffic_filtering=True,id=67e337c5-f5f0-4e50-8fd6-77a02fb05273,network=Network(224f915a-f1b3-471e-87e9-97b33406d6fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap67e337c5-f5')#033[00m
Jan 20 09:46:47 np0005588920 nova_compute[226886]: 2026-01-20 14:46:47.173 226890 DEBUG nova.virt.libvirt.vif [None req-6bd57b2e-8e2e-428b-978c-b5dc548e7b22 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:46:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-1302621797',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-1302621797',id=97,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOduM1b5U0PoRBxfWNx/s8WugTbFKTiIFgBHu9L46ctJZjGe+8jT4Yj1g5uDYe7bVMyBzUuTiVef/RKuMuEZx2XkjNuYbN4taGM1Lc0bPehwylAtyxjPpZUim+i9YC026A==',key_name='tempest-keypair-1946074746',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:46:22Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='7bbf722f17654404925cfb53e48cd473',ramdisk_id='',reservation_id='r-d41vvz1v',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio'
,image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TaggedAttachmentsTest-1005708480',owner_user_name='tempest-TaggedAttachmentsTest-1005708480-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:46:22Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='5cd9508688214bedb977528f8b6f95d1',uuid=6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "67e337c5-f5f0-4e50-8fd6-77a02fb05273", "address": "fa:16:3e:14:2a:53", "network": {"id": "224f915a-f1b3-471e-87e9-97b33406d6fd", "bridge": "br-int", "label": "tempest-tagged-attachments-test-net-1970629831", "subnets": [{"cidr": "10.10.10.0/24", "dns": [], "gateway": {"address": "10.10.10.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.10.179", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7bbf722f17654404925cfb53e48cd473", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap67e337c5-f5", "ovs_interfaceid": "67e337c5-f5f0-4e50-8fd6-77a02fb05273", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:46:47 np0005588920 nova_compute[226886]: 2026-01-20 14:46:47.173 226890 DEBUG nova.network.os_vif_util [None req-6bd57b2e-8e2e-428b-978c-b5dc548e7b22 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Converting VIF {"id": "67e337c5-f5f0-4e50-8fd6-77a02fb05273", "address": "fa:16:3e:14:2a:53", "network": {"id": "224f915a-f1b3-471e-87e9-97b33406d6fd", "bridge": "br-int", "label": "tempest-tagged-attachments-test-net-1970629831", "subnets": [{"cidr": "10.10.10.0/24", "dns": [], "gateway": {"address": "10.10.10.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.10.179", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7bbf722f17654404925cfb53e48cd473", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap67e337c5-f5", "ovs_interfaceid": "67e337c5-f5f0-4e50-8fd6-77a02fb05273", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:46:47 np0005588920 nova_compute[226886]: 2026-01-20 14:46:47.174 226890 DEBUG nova.network.os_vif_util [None req-6bd57b2e-8e2e-428b-978c-b5dc548e7b22 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:14:2a:53,bridge_name='br-int',has_traffic_filtering=True,id=67e337c5-f5f0-4e50-8fd6-77a02fb05273,network=Network(224f915a-f1b3-471e-87e9-97b33406d6fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap67e337c5-f5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:46:47 np0005588920 nova_compute[226886]: 2026-01-20 14:46:47.176 226890 DEBUG nova.virt.libvirt.guest [None req-6bd57b2e-8e2e-428b-978c-b5dc548e7b22 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] attach device xml: <interface type="ethernet">
Jan 20 09:46:47 np0005588920 nova_compute[226886]:  <mac address="fa:16:3e:14:2a:53"/>
Jan 20 09:46:47 np0005588920 nova_compute[226886]:  <model type="virtio"/>
Jan 20 09:46:47 np0005588920 nova_compute[226886]:  <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:46:47 np0005588920 nova_compute[226886]:  <mtu size="1442"/>
Jan 20 09:46:47 np0005588920 nova_compute[226886]:  <target dev="tap67e337c5-f5"/>
Jan 20 09:46:47 np0005588920 nova_compute[226886]: </interface>
Jan 20 09:46:47 np0005588920 nova_compute[226886]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 20 09:46:47 np0005588920 NetworkManager[49076]: <info>  [1768920407.1878] manager: (tap67e337c5-f5): new Tun device (/org/freedesktop/NetworkManager/Devices/208)
Jan 20 09:46:47 np0005588920 kernel: tap67e337c5-f5: entered promiscuous mode
Jan 20 09:46:47 np0005588920 ovn_controller[133971]: 2026-01-20T14:46:47Z|00392|binding|INFO|Claiming lport 67e337c5-f5f0-4e50-8fd6-77a02fb05273 for this chassis.
Jan 20 09:46:47 np0005588920 ovn_controller[133971]: 2026-01-20T14:46:47Z|00393|binding|INFO|67e337c5-f5f0-4e50-8fd6-77a02fb05273: Claiming fa:16:3e:14:2a:53 10.10.10.179
Jan 20 09:46:47 np0005588920 nova_compute[226886]: 2026-01-20 14:46:47.189 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:47.200 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:14:2a:53 10.10.10.179'], port_security=['fa:16:3e:14:2a:53 10.10.10.179'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.10.10.179/24', 'neutron:device_id': '6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-224f915a-f1b3-471e-87e9-97b33406d6fd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7bbf722f17654404925cfb53e48cd473', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'de73e42c-fe22-4450-87c1-9d723f334d4b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dacd89c2-dc9a-4d7b-aa39-3acd1e4a96f2, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=67e337c5-f5f0-4e50-8fd6-77a02fb05273) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:47.202 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 67e337c5-f5f0-4e50-8fd6-77a02fb05273 in datapath 224f915a-f1b3-471e-87e9-97b33406d6fd bound to our chassis#033[00m
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:47.203 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 224f915a-f1b3-471e-87e9-97b33406d6fd#033[00m
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:47.216 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0d6cb88e-e041-4653-9d75-0b67ed20e609]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:47.217 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap224f915a-f1 in ovnmeta-224f915a-f1b3-471e-87e9-97b33406d6fd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:47.220 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap224f915a-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:47.220 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[bf179f33-c657-491a-ba4c-d8e27178f3cc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:47.221 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[3df419ff-05f0-461c-bf52-7e871433b693]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:47 np0005588920 systemd-udevd[261404]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:47.234 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[007234c3-3e18-45f0-914c-6671ec2828c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:47 np0005588920 nova_compute[226886]: 2026-01-20 14:46:47.243 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:47 np0005588920 ovn_controller[133971]: 2026-01-20T14:46:47Z|00394|binding|INFO|Setting lport 67e337c5-f5f0-4e50-8fd6-77a02fb05273 ovn-installed in OVS
Jan 20 09:46:47 np0005588920 ovn_controller[133971]: 2026-01-20T14:46:47Z|00395|binding|INFO|Setting lport 67e337c5-f5f0-4e50-8fd6-77a02fb05273 up in Southbound
Jan 20 09:46:47 np0005588920 nova_compute[226886]: 2026-01-20 14:46:47.246 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:47 np0005588920 NetworkManager[49076]: <info>  [1768920407.2529] device (tap67e337c5-f5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:46:47 np0005588920 NetworkManager[49076]: <info>  [1768920407.2536] device (tap67e337c5-f5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:47.259 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[3728204b-cceb-4354-a8c4-a4d656257622]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:47.289 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[ce2e21d5-b245-4f25-9131-a08b31d5e5cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e238 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:46:47 np0005588920 NetworkManager[49076]: <info>  [1768920407.2955] manager: (tap224f915a-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/209)
Jan 20 09:46:47 np0005588920 nova_compute[226886]: 2026-01-20 14:46:47.295 226890 DEBUG nova.virt.libvirt.driver [None req-6bd57b2e-8e2e-428b-978c-b5dc548e7b22 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:46:47 np0005588920 systemd-udevd[261407]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:47.295 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[aa22deef-2360-455c-970b-b3b8cff6db36]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:47 np0005588920 nova_compute[226886]: 2026-01-20 14:46:47.296 226890 DEBUG nova.virt.libvirt.driver [None req-6bd57b2e-8e2e-428b-978c-b5dc548e7b22 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:46:47 np0005588920 nova_compute[226886]: 2026-01-20 14:46:47.296 226890 DEBUG nova.virt.libvirt.driver [None req-6bd57b2e-8e2e-428b-978c-b5dc548e7b22 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] No VIF found with MAC fa:16:3e:80:88:4e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:47.326 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[9abcdd32-e288-43fb-a63a-0949227020ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:47.328 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[71713425-ffb6-453b-a80f-828e04525a7b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:47 np0005588920 nova_compute[226886]: 2026-01-20 14:46:47.328 226890 DEBUG nova.virt.libvirt.guest [None req-6bd57b2e-8e2e-428b-978c-b5dc548e7b22 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:46:47 np0005588920 nova_compute[226886]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:46:47 np0005588920 nova_compute[226886]:  <nova:name>tempest-device-tagging-server-1302621797</nova:name>
Jan 20 09:46:47 np0005588920 nova_compute[226886]:  <nova:creationTime>2026-01-20 14:46:47</nova:creationTime>
Jan 20 09:46:47 np0005588920 nova_compute[226886]:  <nova:flavor name="m1.nano">
Jan 20 09:46:47 np0005588920 nova_compute[226886]:    <nova:memory>128</nova:memory>
Jan 20 09:46:47 np0005588920 nova_compute[226886]:    <nova:disk>1</nova:disk>
Jan 20 09:46:47 np0005588920 nova_compute[226886]:    <nova:swap>0</nova:swap>
Jan 20 09:46:47 np0005588920 nova_compute[226886]:    <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:46:47 np0005588920 nova_compute[226886]:    <nova:vcpus>1</nova:vcpus>
Jan 20 09:46:47 np0005588920 nova_compute[226886]:  </nova:flavor>
Jan 20 09:46:47 np0005588920 nova_compute[226886]:  <nova:owner>
Jan 20 09:46:47 np0005588920 nova_compute[226886]:    <nova:user uuid="5cd9508688214bedb977528f8b6f95d1">tempest-TaggedAttachmentsTest-1005708480-project-member</nova:user>
Jan 20 09:46:47 np0005588920 nova_compute[226886]:    <nova:project uuid="7bbf722f17654404925cfb53e48cd473">tempest-TaggedAttachmentsTest-1005708480</nova:project>
Jan 20 09:46:47 np0005588920 nova_compute[226886]:  </nova:owner>
Jan 20 09:46:47 np0005588920 nova_compute[226886]:  <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:46:47 np0005588920 nova_compute[226886]:  <nova:ports>
Jan 20 09:46:47 np0005588920 nova_compute[226886]:    <nova:port uuid="fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1">
Jan 20 09:46:47 np0005588920 nova_compute[226886]:      <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 20 09:46:47 np0005588920 nova_compute[226886]:    </nova:port>
Jan 20 09:46:47 np0005588920 nova_compute[226886]:    <nova:port uuid="67e337c5-f5f0-4e50-8fd6-77a02fb05273">
Jan 20 09:46:47 np0005588920 nova_compute[226886]:      <nova:ip type="fixed" address="10.10.10.179" ipVersion="4"/>
Jan 20 09:46:47 np0005588920 nova_compute[226886]:    </nova:port>
Jan 20 09:46:47 np0005588920 nova_compute[226886]:  </nova:ports>
Jan 20 09:46:47 np0005588920 nova_compute[226886]: </nova:instance>
Jan 20 09:46:47 np0005588920 nova_compute[226886]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 20 09:46:47 np0005588920 NetworkManager[49076]: <info>  [1768920407.3484] device (tap224f915a-f0): carrier: link connected
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:47.354 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[6241a5e2-8823-4bf8-8654-02ecc03f38c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:47 np0005588920 nova_compute[226886]: 2026-01-20 14:46:47.360 226890 DEBUG oslo_concurrency.lockutils [None req-6bd57b2e-8e2e-428b-978c-b5dc548e7b22 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Lock "interface-6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 5.870s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:47.373 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[aaf4bd59-171d-4452-923a-d69ca7c565ab]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap224f915a-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:80:4e:00'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 131], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 544903, 'reachable_time': 24680, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 261429, 'error': None, 'target': 'ovnmeta-224f915a-f1b3-471e-87e9-97b33406d6fd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:47.389 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5dc05c31-77c6-466c-a367-b53d7d659505]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe80:4e00'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 544903, 'tstamp': 544903}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 261430, 'error': None, 'target': 'ovnmeta-224f915a-f1b3-471e-87e9-97b33406d6fd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:47.409 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ae8dbe82-e978-4608-b2eb-074df2eaf2e1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap224f915a-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:80:4e:00'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 131], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 544903, 'reachable_time': 24680, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 261431, 'error': None, 'target': 'ovnmeta-224f915a-f1b3-471e-87e9-97b33406d6fd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:47.442 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[010bc511-fc9a-4fbc-a38f-c35aecddac46]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:47.508 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[9582c792-3df6-4db4-9b06-68de88aa606e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:47.510 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap224f915a-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:47.510 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:47.511 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap224f915a-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:46:47 np0005588920 NetworkManager[49076]: <info>  [1768920407.5135] manager: (tap224f915a-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/210)
Jan 20 09:46:47 np0005588920 kernel: tap224f915a-f0: entered promiscuous mode
Jan 20 09:46:47 np0005588920 nova_compute[226886]: 2026-01-20 14:46:47.512 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:47.517 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap224f915a-f0, col_values=(('external_ids', {'iface-id': 'de0398b4-2d24-42f0-ba2f-4c198fbff906'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:46:47 np0005588920 ovn_controller[133971]: 2026-01-20T14:46:47Z|00396|binding|INFO|Releasing lport de0398b4-2d24-42f0-ba2f-4c198fbff906 from this chassis (sb_readonly=0)
Jan 20 09:46:47 np0005588920 nova_compute[226886]: 2026-01-20 14:46:47.519 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:47.520 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/224f915a-f1b3-471e-87e9-97b33406d6fd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/224f915a-f1b3-471e-87e9-97b33406d6fd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:47.521 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[9fd041eb-0e13-4adc-b210-dcb508fa8e34]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:47.522 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-224f915a-f1b3-471e-87e9-97b33406d6fd
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/224f915a-f1b3-471e-87e9-97b33406d6fd.pid.haproxy
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID 224f915a-f1b3-471e-87e9-97b33406d6fd
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:46:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:47.523 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-224f915a-f1b3-471e-87e9-97b33406d6fd', 'env', 'PROCESS_TAG=haproxy-224f915a-f1b3-471e-87e9-97b33406d6fd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/224f915a-f1b3-471e-87e9-97b33406d6fd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:46:47 np0005588920 nova_compute[226886]: 2026-01-20 14:46:47.533 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:47.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:47 np0005588920 nova_compute[226886]: 2026-01-20 14:46:47.779 226890 DEBUG nova.compute.manager [req-10b0266b-5725-4601-bdfa-39c29bc2fc60 req-5650b405-0b37-4d9e-b35b-7d8ddaad0013 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Received event network-vif-plugged-67e337c5-f5f0-4e50-8fd6-77a02fb05273 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:46:47 np0005588920 nova_compute[226886]: 2026-01-20 14:46:47.779 226890 DEBUG oslo_concurrency.lockutils [req-10b0266b-5725-4601-bdfa-39c29bc2fc60 req-5650b405-0b37-4d9e-b35b-7d8ddaad0013 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:46:47 np0005588920 nova_compute[226886]: 2026-01-20 14:46:47.780 226890 DEBUG oslo_concurrency.lockutils [req-10b0266b-5725-4601-bdfa-39c29bc2fc60 req-5650b405-0b37-4d9e-b35b-7d8ddaad0013 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:46:47 np0005588920 nova_compute[226886]: 2026-01-20 14:46:47.780 226890 DEBUG oslo_concurrency.lockutils [req-10b0266b-5725-4601-bdfa-39c29bc2fc60 req-5650b405-0b37-4d9e-b35b-7d8ddaad0013 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:46:47 np0005588920 nova_compute[226886]: 2026-01-20 14:46:47.780 226890 DEBUG nova.compute.manager [req-10b0266b-5725-4601-bdfa-39c29bc2fc60 req-5650b405-0b37-4d9e-b35b-7d8ddaad0013 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] No waiting events found dispatching network-vif-plugged-67e337c5-f5f0-4e50-8fd6-77a02fb05273 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:46:47 np0005588920 nova_compute[226886]: 2026-01-20 14:46:47.780 226890 WARNING nova.compute.manager [req-10b0266b-5725-4601-bdfa-39c29bc2fc60 req-5650b405-0b37-4d9e-b35b-7d8ddaad0013 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Received unexpected event network-vif-plugged-67e337c5-f5f0-4e50-8fd6-77a02fb05273 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:46:47 np0005588920 podman[261464]: 2026-01-20 14:46:47.880047013 +0000 UTC m=+0.026809059 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:46:48 np0005588920 nova_compute[226886]: 2026-01-20 14:46:48.041 226890 DEBUG oslo_concurrency.lockutils [None req-c2313525-d8a5-4bef-890e-76ecf92f37e9 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Acquiring lock "6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:46:48 np0005588920 nova_compute[226886]: 2026-01-20 14:46:48.042 226890 DEBUG oslo_concurrency.lockutils [None req-c2313525-d8a5-4bef-890e-76ecf92f37e9 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Lock "6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:46:48 np0005588920 nova_compute[226886]: 2026-01-20 14:46:48.065 226890 DEBUG nova.objects.instance [None req-c2313525-d8a5-4bef-890e-76ecf92f37e9 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Lazy-loading 'flavor' on Instance uuid 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:46:48 np0005588920 nova_compute[226886]: 2026-01-20 14:46:48.114 226890 DEBUG oslo_concurrency.lockutils [None req-c2313525-d8a5-4bef-890e-76ecf92f37e9 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Lock "6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.072s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:46:48 np0005588920 nova_compute[226886]: 2026-01-20 14:46:48.368 226890 DEBUG oslo_concurrency.lockutils [None req-c2313525-d8a5-4bef-890e-76ecf92f37e9 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Acquiring lock "6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:46:48 np0005588920 nova_compute[226886]: 2026-01-20 14:46:48.369 226890 DEBUG oslo_concurrency.lockutils [None req-c2313525-d8a5-4bef-890e-76ecf92f37e9 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Lock "6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:46:48 np0005588920 nova_compute[226886]: 2026-01-20 14:46:48.369 226890 INFO nova.compute.manager [None req-c2313525-d8a5-4bef-890e-76ecf92f37e9 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Attaching volume 42ddc596-73e2-4763-a6a5-75e17ba882ad to /dev/vdb#033[00m
Jan 20 09:46:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:48.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:48 np0005588920 nova_compute[226886]: 2026-01-20 14:46:48.540 226890 DEBUG os_brick.utils [None req-c2313525-d8a5-4bef-890e-76ecf92f37e9 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 20 09:46:48 np0005588920 nova_compute[226886]: 2026-01-20 14:46:48.542 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:46:48 np0005588920 nova_compute[226886]: 2026-01-20 14:46:48.562 231437 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:46:48 np0005588920 nova_compute[226886]: 2026-01-20 14:46:48.563 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[d4838726-82f9-4d41-adc0-914d5409b29e]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:48 np0005588920 nova_compute[226886]: 2026-01-20 14:46:48.564 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:46:48 np0005588920 nova_compute[226886]: 2026-01-20 14:46:48.577 231437 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.013s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:46:48 np0005588920 nova_compute[226886]: 2026-01-20 14:46:48.578 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[880c5cfd-2ce9-4cfc-b3ad-cf2aed66c37f]: (4, ('InitiatorName=iqn.1994-05.com.redhat:f7e7b2e5d28e', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:48 np0005588920 nova_compute[226886]: 2026-01-20 14:46:48.579 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:46:48 np0005588920 podman[261464]: 2026-01-20 14:46:48.586220018 +0000 UTC m=+0.732982044 container create cd4962dfe3f2e73bb2c1c7b7ba2c79fb86e68c92a783a70e5f070241d9c9b0c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-224f915a-f1b3-471e-87e9-97b33406d6fd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 20 09:46:48 np0005588920 nova_compute[226886]: 2026-01-20 14:46:48.589 231437 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:46:48 np0005588920 nova_compute[226886]: 2026-01-20 14:46:48.589 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[50c3096a-15fa-4fb4-a951-f0cb40359a87]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:48 np0005588920 nova_compute[226886]: 2026-01-20 14:46:48.590 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[6a5e2332-4cdb-4924-bb1d-fe4d498c52a8]: (4, '1190ec40-2b89-4358-8dec-733c5829fbed') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:48 np0005588920 nova_compute[226886]: 2026-01-20 14:46:48.591 226890 DEBUG oslo_concurrency.processutils [None req-c2313525-d8a5-4bef-890e-76ecf92f37e9 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:46:48 np0005588920 nova_compute[226886]: 2026-01-20 14:46:48.612 226890 DEBUG oslo_concurrency.processutils [None req-c2313525-d8a5-4bef-890e-76ecf92f37e9 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] CMD "nvme version" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:46:48 np0005588920 nova_compute[226886]: 2026-01-20 14:46:48.614 226890 DEBUG os_brick.initiator.connectors.lightos [None req-c2313525-d8a5-4bef-890e-76ecf92f37e9 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 20 09:46:48 np0005588920 nova_compute[226886]: 2026-01-20 14:46:48.615 226890 DEBUG os_brick.initiator.connectors.lightos [None req-c2313525-d8a5-4bef-890e-76ecf92f37e9 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 20 09:46:48 np0005588920 nova_compute[226886]: 2026-01-20 14:46:48.615 226890 DEBUG os_brick.initiator.connectors.lightos [None req-c2313525-d8a5-4bef-890e-76ecf92f37e9 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 20 09:46:48 np0005588920 nova_compute[226886]: 2026-01-20 14:46:48.615 226890 DEBUG os_brick.utils [None req-c2313525-d8a5-4bef-890e-76ecf92f37e9 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] <== get_connector_properties: return (75ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:f7e7b2e5d28e', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '1190ec40-2b89-4358-8dec-733c5829fbed', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 20 09:46:48 np0005588920 nova_compute[226886]: 2026-01-20 14:46:48.616 226890 DEBUG nova.virt.block_device [None req-c2313525-d8a5-4bef-890e-76ecf92f37e9 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Updating existing volume attachment record: 6619e757-e35b-4400-88fd-ee32e8d659d5 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 20 09:46:48 np0005588920 nova_compute[226886]: 2026-01-20 14:46:48.645 226890 DEBUG nova.network.neutron [req-367868be-3fd5-4baf-ab6a-18673c3d88da req-10fcc128-6d40-4a9f-9e3c-bc544503d7aa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Updated VIF entry in instance network info cache for port 67e337c5-f5f0-4e50-8fd6-77a02fb05273. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:46:48 np0005588920 nova_compute[226886]: 2026-01-20 14:46:48.645 226890 DEBUG nova.network.neutron [req-367868be-3fd5-4baf-ab6a-18673c3d88da req-10fcc128-6d40-4a9f-9e3c-bc544503d7aa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Updating instance_info_cache with network_info: [{"id": "fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1", "address": "fa:16:3e:80:88:4e", "network": {"id": "c59c8bba-9fc6-441e-8b7d-cd5444901b2a", "bridge": "br-int", "label": "tempest-TaggedAttachmentsTest-680871933-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7bbf722f17654404925cfb53e48cd473", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc4a2805-d7", "ovs_interfaceid": "fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "67e337c5-f5f0-4e50-8fd6-77a02fb05273", "address": "fa:16:3e:14:2a:53", "network": {"id": "224f915a-f1b3-471e-87e9-97b33406d6fd", "bridge": "br-int", "label": "tempest-tagged-attachments-test-net-1970629831", "subnets": [{"cidr": "10.10.10.0/24", "dns": [], "gateway": {"address": "10.10.10.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.10.179", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7bbf722f17654404925cfb53e48cd473", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap67e337c5-f5", "ovs_interfaceid": "67e337c5-f5f0-4e50-8fd6-77a02fb05273", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:46:48 np0005588920 nova_compute[226886]: 2026-01-20 14:46:48.667 226890 DEBUG oslo_concurrency.lockutils [req-367868be-3fd5-4baf-ab6a-18673c3d88da req-10fcc128-6d40-4a9f-9e3c-bc544503d7aa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:46:48 np0005588920 systemd[1]: Started libpod-conmon-cd4962dfe3f2e73bb2c1c7b7ba2c79fb86e68c92a783a70e5f070241d9c9b0c0.scope.
Jan 20 09:46:48 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:46:48 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4523ddceeee5ed78d82434632c00af7e4213b152bd5e4ed5f8d5a9e031533f21/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:46:48 np0005588920 ovn_controller[133971]: 2026-01-20T14:46:48Z|00054|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:14:2a:53 10.10.10.179
Jan 20 09:46:48 np0005588920 ovn_controller[133971]: 2026-01-20T14:46:48Z|00055|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:14:2a:53 10.10.10.179
Jan 20 09:46:49 np0005588920 podman[261464]: 2026-01-20 14:46:49.082757509 +0000 UTC m=+1.229519566 container init cd4962dfe3f2e73bb2c1c7b7ba2c79fb86e68c92a783a70e5f070241d9c9b0c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-224f915a-f1b3-471e-87e9-97b33406d6fd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 20 09:46:49 np0005588920 podman[261464]: 2026-01-20 14:46:49.087743019 +0000 UTC m=+1.234505045 container start cd4962dfe3f2e73bb2c1c7b7ba2c79fb86e68c92a783a70e5f070241d9c9b0c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-224f915a-f1b3-471e-87e9-97b33406d6fd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:46:49 np0005588920 neutron-haproxy-ovnmeta-224f915a-f1b3-471e-87e9-97b33406d6fd[261487]: [NOTICE]   (261491) : New worker (261493) forked
Jan 20 09:46:49 np0005588920 neutron-haproxy-ovnmeta-224f915a-f1b3-471e-87e9-97b33406d6fd[261487]: [NOTICE]   (261491) : Loading success.
Jan 20 09:46:49 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:46:49 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2897975098' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:46:49 np0005588920 nova_compute[226886]: 2026-01-20 14:46:49.392 226890 DEBUG nova.objects.instance [None req-c2313525-d8a5-4bef-890e-76ecf92f37e9 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Lazy-loading 'flavor' on Instance uuid 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:46:49 np0005588920 nova_compute[226886]: 2026-01-20 14:46:49.416 226890 DEBUG nova.virt.libvirt.driver [None req-c2313525-d8a5-4bef-890e-76ecf92f37e9 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Attempting to attach volume 42ddc596-73e2-4763-a6a5-75e17ba882ad with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 20 09:46:49 np0005588920 nova_compute[226886]: 2026-01-20 14:46:49.418 226890 DEBUG nova.virt.libvirt.guest [None req-c2313525-d8a5-4bef-890e-76ecf92f37e9 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] attach device xml: <disk type="network" device="disk">
Jan 20 09:46:49 np0005588920 nova_compute[226886]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 09:46:49 np0005588920 nova_compute[226886]:  <source protocol="rbd" name="volumes/volume-42ddc596-73e2-4763-a6a5-75e17ba882ad">
Jan 20 09:46:49 np0005588920 nova_compute[226886]:    <host name="192.168.122.100" port="6789"/>
Jan 20 09:46:49 np0005588920 nova_compute[226886]:    <host name="192.168.122.102" port="6789"/>
Jan 20 09:46:49 np0005588920 nova_compute[226886]:    <host name="192.168.122.101" port="6789"/>
Jan 20 09:46:49 np0005588920 nova_compute[226886]:  </source>
Jan 20 09:46:49 np0005588920 nova_compute[226886]:  <auth username="openstack">
Jan 20 09:46:49 np0005588920 nova_compute[226886]:    <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:46:49 np0005588920 nova_compute[226886]:  </auth>
Jan 20 09:46:49 np0005588920 nova_compute[226886]:  <target dev="vdb" bus="virtio"/>
Jan 20 09:46:49 np0005588920 nova_compute[226886]:  <serial>42ddc596-73e2-4763-a6a5-75e17ba882ad</serial>
Jan 20 09:46:49 np0005588920 nova_compute[226886]: </disk>
Jan 20 09:46:49 np0005588920 nova_compute[226886]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 20 09:46:49 np0005588920 nova_compute[226886]: 2026-01-20 14:46:49.549 226890 DEBUG nova.virt.libvirt.driver [None req-c2313525-d8a5-4bef-890e-76ecf92f37e9 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:46:49 np0005588920 nova_compute[226886]: 2026-01-20 14:46:49.549 226890 DEBUG nova.virt.libvirt.driver [None req-c2313525-d8a5-4bef-890e-76ecf92f37e9 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:46:49 np0005588920 nova_compute[226886]: 2026-01-20 14:46:49.550 226890 DEBUG nova.virt.libvirt.driver [None req-c2313525-d8a5-4bef-890e-76ecf92f37e9 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] No VIF found with MAC fa:16:3e:80:88:4e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:46:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:46:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:49.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:46:49 np0005588920 nova_compute[226886]: 2026-01-20 14:46:49.787 226890 DEBUG oslo_concurrency.lockutils [None req-c2313525-d8a5-4bef-890e-76ecf92f37e9 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Lock "6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 1.418s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:46:49 np0005588920 nova_compute[226886]: 2026-01-20 14:46:49.887 226890 DEBUG nova.compute.manager [req-b55b7f85-a533-4f7a-91af-53997c10f4a1 req-b6fd0e62-5f61-4b7d-a502-783f01595090 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Received event network-vif-plugged-67e337c5-f5f0-4e50-8fd6-77a02fb05273 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:46:49 np0005588920 nova_compute[226886]: 2026-01-20 14:46:49.888 226890 DEBUG oslo_concurrency.lockutils [req-b55b7f85-a533-4f7a-91af-53997c10f4a1 req-b6fd0e62-5f61-4b7d-a502-783f01595090 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:46:49 np0005588920 nova_compute[226886]: 2026-01-20 14:46:49.888 226890 DEBUG oslo_concurrency.lockutils [req-b55b7f85-a533-4f7a-91af-53997c10f4a1 req-b6fd0e62-5f61-4b7d-a502-783f01595090 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:46:49 np0005588920 nova_compute[226886]: 2026-01-20 14:46:49.888 226890 DEBUG oslo_concurrency.lockutils [req-b55b7f85-a533-4f7a-91af-53997c10f4a1 req-b6fd0e62-5f61-4b7d-a502-783f01595090 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:46:49 np0005588920 nova_compute[226886]: 2026-01-20 14:46:49.889 226890 DEBUG nova.compute.manager [req-b55b7f85-a533-4f7a-91af-53997c10f4a1 req-b6fd0e62-5f61-4b7d-a502-783f01595090 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] No waiting events found dispatching network-vif-plugged-67e337c5-f5f0-4e50-8fd6-77a02fb05273 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:46:49 np0005588920 nova_compute[226886]: 2026-01-20 14:46:49.889 226890 WARNING nova.compute.manager [req-b55b7f85-a533-4f7a-91af-53997c10f4a1 req-b6fd0e62-5f61-4b7d-a502-783f01595090 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Received unexpected event network-vif-plugged-67e337c5-f5f0-4e50-8fd6-77a02fb05273 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:46:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:50.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:50 np0005588920 nova_compute[226886]: 2026-01-20 14:46:50.635 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:50.989 144287 DEBUG eventlet.wsgi.server [-] (144287) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m
Jan 20 09:46:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:50.990 144287 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /openstack/latest/meta_data.json HTTP/1.0#015
Jan 20 09:46:50 np0005588920 ovn_metadata_agent[144123]: Accept: */*#015
Jan 20 09:46:50 np0005588920 ovn_metadata_agent[144123]: Connection: close#015
Jan 20 09:46:50 np0005588920 ovn_metadata_agent[144123]: Content-Type: text/plain#015
Jan 20 09:46:50 np0005588920 ovn_metadata_agent[144123]: Host: 169.254.169.254#015
Jan 20 09:46:50 np0005588920 ovn_metadata_agent[144123]: User-Agent: curl/7.84.0#015
Jan 20 09:46:50 np0005588920 ovn_metadata_agent[144123]: X-Forwarded-For: 10.100.0.9#015
Jan 20 09:46:50 np0005588920 ovn_metadata_agent[144123]: X-Ovn-Network-Id: c59c8bba-9fc6-441e-8b7d-cd5444901b2a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m
Jan 20 09:46:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:51.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:52 np0005588920 nova_compute[226886]: 2026-01-20 14:46:52.164 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e238 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:46:52 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:52.341 144287 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m
Jan 20 09:46:52 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:52.341 144287 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /openstack/latest/meta_data.json HTTP/1.1" status: 200  len: 1918 time: 1.3511534#033[00m
Jan 20 09:46:52 np0005588920 haproxy-metadata-proxy-c59c8bba-9fc6-441e-8b7d-cd5444901b2a[261001]: 10.100.0.9:53322 [20/Jan/2026:14:46:50.988] listener listener/metadata 0/0/0/1353/1353 200 1902 - - ---- 1/1/0/0/0 0/0 "GET /openstack/latest/meta_data.json HTTP/1.1"
Jan 20 09:46:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:52.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:52 np0005588920 nova_compute[226886]: 2026-01-20 14:46:52.834 226890 DEBUG oslo_concurrency.lockutils [None req-96aef5c2-afc1-4bbe-ba70-2878e6f81532 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Acquiring lock "6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:46:52 np0005588920 nova_compute[226886]: 2026-01-20 14:46:52.835 226890 DEBUG oslo_concurrency.lockutils [None req-96aef5c2-afc1-4bbe-ba70-2878e6f81532 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Lock "6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:46:52 np0005588920 nova_compute[226886]: 2026-01-20 14:46:52.850 226890 INFO nova.compute.manager [None req-96aef5c2-afc1-4bbe-ba70-2878e6f81532 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Detaching volume 42ddc596-73e2-4763-a6a5-75e17ba882ad#033[00m
Jan 20 09:46:52 np0005588920 nova_compute[226886]: 2026-01-20 14:46:52.956 226890 INFO nova.virt.block_device [None req-96aef5c2-afc1-4bbe-ba70-2878e6f81532 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Attempting to driver detach volume 42ddc596-73e2-4763-a6a5-75e17ba882ad from mountpoint /dev/vdb#033[00m
Jan 20 09:46:52 np0005588920 nova_compute[226886]: 2026-01-20 14:46:52.968 226890 DEBUG nova.virt.libvirt.driver [None req-96aef5c2-afc1-4bbe-ba70-2878e6f81532 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Attempting to detach device vdb from instance 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 20 09:46:52 np0005588920 nova_compute[226886]: 2026-01-20 14:46:52.969 226890 DEBUG nova.virt.libvirt.guest [None req-96aef5c2-afc1-4bbe-ba70-2878e6f81532 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] detach device xml: <disk type="network" device="disk">
Jan 20 09:46:52 np0005588920 nova_compute[226886]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 09:46:52 np0005588920 nova_compute[226886]:  <source protocol="rbd" name="volumes/volume-42ddc596-73e2-4763-a6a5-75e17ba882ad">
Jan 20 09:46:52 np0005588920 nova_compute[226886]:    <host name="192.168.122.100" port="6789"/>
Jan 20 09:46:52 np0005588920 nova_compute[226886]:    <host name="192.168.122.102" port="6789"/>
Jan 20 09:46:52 np0005588920 nova_compute[226886]:    <host name="192.168.122.101" port="6789"/>
Jan 20 09:46:52 np0005588920 nova_compute[226886]:  </source>
Jan 20 09:46:52 np0005588920 nova_compute[226886]:  <target dev="vdb" bus="virtio"/>
Jan 20 09:46:52 np0005588920 nova_compute[226886]:  <serial>42ddc596-73e2-4763-a6a5-75e17ba882ad</serial>
Jan 20 09:46:52 np0005588920 nova_compute[226886]:  <address type="pci" domain="0x0000" bus="0x07" slot="0x00" function="0x0"/>
Jan 20 09:46:52 np0005588920 nova_compute[226886]: </disk>
Jan 20 09:46:52 np0005588920 nova_compute[226886]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 20 09:46:52 np0005588920 nova_compute[226886]: 2026-01-20 14:46:52.977 226890 INFO nova.virt.libvirt.driver [None req-96aef5c2-afc1-4bbe-ba70-2878e6f81532 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Successfully detached device vdb from instance 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835 from the persistent domain config.#033[00m
Jan 20 09:46:52 np0005588920 nova_compute[226886]: 2026-01-20 14:46:52.977 226890 DEBUG nova.virt.libvirt.driver [None req-96aef5c2-afc1-4bbe-ba70-2878e6f81532 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 20 09:46:52 np0005588920 nova_compute[226886]: 2026-01-20 14:46:52.978 226890 DEBUG nova.virt.libvirt.guest [None req-96aef5c2-afc1-4bbe-ba70-2878e6f81532 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] detach device xml: <disk type="network" device="disk">
Jan 20 09:46:52 np0005588920 nova_compute[226886]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 09:46:52 np0005588920 nova_compute[226886]:  <source protocol="rbd" name="volumes/volume-42ddc596-73e2-4763-a6a5-75e17ba882ad">
Jan 20 09:46:52 np0005588920 nova_compute[226886]:    <host name="192.168.122.100" port="6789"/>
Jan 20 09:46:52 np0005588920 nova_compute[226886]:    <host name="192.168.122.102" port="6789"/>
Jan 20 09:46:52 np0005588920 nova_compute[226886]:    <host name="192.168.122.101" port="6789"/>
Jan 20 09:46:52 np0005588920 nova_compute[226886]:  </source>
Jan 20 09:46:52 np0005588920 nova_compute[226886]:  <target dev="vdb" bus="virtio"/>
Jan 20 09:46:52 np0005588920 nova_compute[226886]:  <serial>42ddc596-73e2-4763-a6a5-75e17ba882ad</serial>
Jan 20 09:46:52 np0005588920 nova_compute[226886]:  <address type="pci" domain="0x0000" bus="0x07" slot="0x00" function="0x0"/>
Jan 20 09:46:52 np0005588920 nova_compute[226886]: </disk>
Jan 20 09:46:52 np0005588920 nova_compute[226886]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 20 09:46:53 np0005588920 nova_compute[226886]: 2026-01-20 14:46:53.103 226890 DEBUG nova.virt.libvirt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Received event <DeviceRemovedEvent: 1768920413.1024888, 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835 => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 20 09:46:53 np0005588920 nova_compute[226886]: 2026-01-20 14:46:53.106 226890 DEBUG nova.virt.libvirt.driver [None req-96aef5c2-afc1-4bbe-ba70-2878e6f81532 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 20 09:46:53 np0005588920 nova_compute[226886]: 2026-01-20 14:46:53.108 226890 INFO nova.virt.libvirt.driver [None req-96aef5c2-afc1-4bbe-ba70-2878e6f81532 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Successfully detached device vdb from instance 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835 from the live domain config.#033[00m
Jan 20 09:46:53 np0005588920 nova_compute[226886]: 2026-01-20 14:46:53.290 226890 DEBUG nova.objects.instance [None req-96aef5c2-afc1-4bbe-ba70-2878e6f81532 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Lazy-loading 'flavor' on Instance uuid 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:46:53 np0005588920 nova_compute[226886]: 2026-01-20 14:46:53.371 226890 DEBUG oslo_concurrency.lockutils [None req-96aef5c2-afc1-4bbe-ba70-2878e6f81532 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Lock "6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 0.536s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:46:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:53.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:54.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:54 np0005588920 nova_compute[226886]: 2026-01-20 14:46:54.668 226890 DEBUG oslo_concurrency.lockutils [None req-70bf110c-cb43-4482-9173-b96bc09dd0e6 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Acquiring lock "interface-6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835-67e337c5-f5f0-4e50-8fd6-77a02fb05273" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:46:54 np0005588920 nova_compute[226886]: 2026-01-20 14:46:54.669 226890 DEBUG oslo_concurrency.lockutils [None req-70bf110c-cb43-4482-9173-b96bc09dd0e6 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Lock "interface-6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835-67e337c5-f5f0-4e50-8fd6-77a02fb05273" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:46:54 np0005588920 nova_compute[226886]: 2026-01-20 14:46:54.694 226890 DEBUG nova.objects.instance [None req-70bf110c-cb43-4482-9173-b96bc09dd0e6 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Lazy-loading 'flavor' on Instance uuid 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:46:54 np0005588920 nova_compute[226886]: 2026-01-20 14:46:54.732 226890 DEBUG nova.virt.libvirt.vif [None req-70bf110c-cb43-4482-9173-b96bc09dd0e6 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:46:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=InstanceDeviceMetadata,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-1302621797',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-1302621797',id=97,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOduM1b5U0PoRBxfWNx/s8WugTbFKTiIFgBHu9L46ctJZjGe+8jT4Yj1g5uDYe7bVMyBzUuTiVef/RKuMuEZx2XkjNuYbN4taGM1Lc0bPehwylAtyxjPpZUim+i9YC026A==',key_name='tempest-keypair-1946074746',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:46:22Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7bbf722f17654404925cfb53e48cd473',ramdisk_id='',reservation_id='r-d41vvz1v',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TaggedAttachmentsTest-1005708480',owner_user_name='tempest-TaggedAttachmentsTest-1005708480-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:46:22Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='5cd9508688214bedb977528f8b6f95d1',uuid=6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "67e337c5-f5f0-4e50-8fd6-77a02fb05273", "address": "fa:16:3e:14:2a:53", "network": {"id": "224f915a-f1b3-471e-87e9-97b33406d6fd", "bridge": "br-int", "label": "tempest-tagged-attachments-test-net-1970629831", "subnets": [{"cidr": "10.10.10.0/24", "dns": [], "gateway": 
{"address": "10.10.10.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.10.179", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7bbf722f17654404925cfb53e48cd473", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap67e337c5-f5", "ovs_interfaceid": "67e337c5-f5f0-4e50-8fd6-77a02fb05273", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:46:54 np0005588920 nova_compute[226886]: 2026-01-20 14:46:54.733 226890 DEBUG nova.network.os_vif_util [None req-70bf110c-cb43-4482-9173-b96bc09dd0e6 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Converting VIF {"id": "67e337c5-f5f0-4e50-8fd6-77a02fb05273", "address": "fa:16:3e:14:2a:53", "network": {"id": "224f915a-f1b3-471e-87e9-97b33406d6fd", "bridge": "br-int", "label": "tempest-tagged-attachments-test-net-1970629831", "subnets": [{"cidr": "10.10.10.0/24", "dns": [], "gateway": {"address": "10.10.10.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.10.179", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7bbf722f17654404925cfb53e48cd473", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap67e337c5-f5", "ovs_interfaceid": "67e337c5-f5f0-4e50-8fd6-77a02fb05273", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:46:54 np0005588920 nova_compute[226886]: 2026-01-20 14:46:54.733 226890 DEBUG nova.network.os_vif_util [None req-70bf110c-cb43-4482-9173-b96bc09dd0e6 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:14:2a:53,bridge_name='br-int',has_traffic_filtering=True,id=67e337c5-f5f0-4e50-8fd6-77a02fb05273,network=Network(224f915a-f1b3-471e-87e9-97b33406d6fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap67e337c5-f5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:46:54 np0005588920 nova_compute[226886]: 2026-01-20 14:46:54.737 226890 DEBUG nova.virt.libvirt.guest [None req-70bf110c-cb43-4482-9173-b96bc09dd0e6 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:14:2a:53"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap67e337c5-f5"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 20 09:46:54 np0005588920 nova_compute[226886]: 2026-01-20 14:46:54.740 226890 DEBUG nova.virt.libvirt.guest [None req-70bf110c-cb43-4482-9173-b96bc09dd0e6 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:14:2a:53"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap67e337c5-f5"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 20 09:46:54 np0005588920 nova_compute[226886]: 2026-01-20 14:46:54.743 226890 DEBUG nova.virt.libvirt.driver [None req-70bf110c-cb43-4482-9173-b96bc09dd0e6 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Attempting to detach device tap67e337c5-f5 from instance 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 20 09:46:54 np0005588920 nova_compute[226886]: 2026-01-20 14:46:54.744 226890 DEBUG nova.virt.libvirt.guest [None req-70bf110c-cb43-4482-9173-b96bc09dd0e6 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] detach device xml: <interface type="ethernet">
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <mac address="fa:16:3e:14:2a:53"/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <model type="virtio"/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <mtu size="1442"/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <target dev="tap67e337c5-f5"/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]: </interface>
Jan 20 09:46:54 np0005588920 nova_compute[226886]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 20 09:46:54 np0005588920 nova_compute[226886]: 2026-01-20 14:46:54.752 226890 DEBUG nova.virt.libvirt.guest [None req-70bf110c-cb43-4482-9173-b96bc09dd0e6 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:14:2a:53"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap67e337c5-f5"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 20 09:46:54 np0005588920 nova_compute[226886]: 2026-01-20 14:46:54.756 226890 DEBUG nova.virt.libvirt.guest [None req-70bf110c-cb43-4482-9173-b96bc09dd0e6 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:14:2a:53"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap67e337c5-f5"/></interface>not found in domain: <domain type='kvm' id='42'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <name>instance-00000061</name>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <uuid>6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835</uuid>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <nova:name>tempest-device-tagging-server-1302621797</nova:name>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <nova:creationTime>2026-01-20 14:46:47</nova:creationTime>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <nova:flavor name="m1.nano">
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <nova:memory>128</nova:memory>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <nova:disk>1</nova:disk>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <nova:swap>0</nova:swap>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <nova:vcpus>1</nova:vcpus>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  </nova:flavor>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <nova:owner>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <nova:user uuid="5cd9508688214bedb977528f8b6f95d1">tempest-TaggedAttachmentsTest-1005708480-project-member</nova:user>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <nova:project uuid="7bbf722f17654404925cfb53e48cd473">tempest-TaggedAttachmentsTest-1005708480</nova:project>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  </nova:owner>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <nova:ports>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <nova:port uuid="fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1">
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </nova:port>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <nova:port uuid="67e337c5-f5f0-4e50-8fd6-77a02fb05273">
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <nova:ip type="fixed" address="10.10.10.179" ipVersion="4"/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </nova:port>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  </nova:ports>
Jan 20 09:46:54 np0005588920 nova_compute[226886]: </nova:instance>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <memory unit='KiB'>131072</memory>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <vcpu placement='static'>1</vcpu>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <resource>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <partition>/machine</partition>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  </resource>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <sysinfo type='smbios'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <entry name='manufacturer'>RDO</entry>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <entry name='product'>OpenStack Compute</entry>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <entry name='serial'>6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835</entry>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <entry name='uuid'>6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835</entry>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <entry name='family'>Virtual Machine</entry>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <boot dev='hd'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <smbios mode='sysinfo'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <vmcoreinfo state='on'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <cpu mode='custom' match='exact' check='full'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <model fallback='forbid'>Nehalem</model>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <feature policy='require' name='x2apic'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <feature policy='require' name='hypervisor'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <feature policy='require' name='vme'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <clock offset='utc'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <timer name='pit' tickpolicy='delay'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <timer name='hpet' present='no'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <on_poweroff>destroy</on_poweroff>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <on_reboot>restart</on_reboot>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <on_crash>destroy</on_crash>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <disk type='network' device='disk'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <driver name='qemu' type='raw' cache='none'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <auth username='openstack'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:        <secret type='ceph' uuid='e399cf45-e6b6-5393-99f1-75c601d3f188'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <source protocol='rbd' name='vms/6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835_disk' index='2'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:        <host name='192.168.122.100' port='6789'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:        <host name='192.168.122.102' port='6789'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:        <host name='192.168.122.101' port='6789'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target dev='vda' bus='virtio'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='virtio-disk0'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <disk type='network' device='cdrom'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <driver name='qemu' type='raw' cache='none'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <auth username='openstack'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:        <secret type='ceph' uuid='e399cf45-e6b6-5393-99f1-75c601d3f188'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <source protocol='rbd' name='vms/6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835_disk.config' index='1'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:        <host name='192.168.122.100' port='6789'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:        <host name='192.168.122.102' port='6789'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:        <host name='192.168.122.101' port='6789'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target dev='sda' bus='sata'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <readonly/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='sata0-0-0'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='0' model='pcie-root'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pcie.0'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target chassis='1' port='0x10'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pci.1'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target chassis='2' port='0x11'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pci.2'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target chassis='3' port='0x12'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pci.3'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target chassis='4' port='0x13'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pci.4'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target chassis='5' port='0x14'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pci.5'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target chassis='6' port='0x15'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pci.6'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target chassis='7' port='0x16'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pci.7'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target chassis='8' port='0x17'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pci.8'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target chassis='9' port='0x18'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pci.9'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target chassis='10' port='0x19'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pci.10'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target chassis='11' port='0x1a'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pci.11'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target chassis='12' port='0x1b'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pci.12'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target chassis='13' port='0x1c'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pci.13'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target chassis='14' port='0x1d'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pci.14'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target chassis='15' port='0x1e'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pci.15'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target chassis='16' port='0x1f'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pci.16'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target chassis='17' port='0x20'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pci.17'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target chassis='18' port='0x21'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pci.18'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target chassis='19' port='0x22'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pci.19'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target chassis='20' port='0x23'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pci.20'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target chassis='21' port='0x24'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pci.21'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target chassis='22' port='0x25'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pci.22'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target chassis='23' port='0x26'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pci.23'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target chassis='24' port='0x27'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pci.24'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target chassis='25' port='0x28'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pci.25'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model name='pcie-pci-bridge'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pci.26'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='usb'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='sata' index='0'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='ide'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <interface type='ethernet'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <mac address='fa:16:3e:80:88:4e'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target dev='tapfc4a2805-d7'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model type='virtio'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <driver name='vhost' rx_queue_size='512'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <mtu size='1442'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='net0'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <interface type='ethernet'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <mac address='fa:16:3e:14:2a:53'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target dev='tap67e337c5-f5'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model type='virtio'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <driver name='vhost' rx_queue_size='512'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <mtu size='1442'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='net1'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <serial type='pty'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <source path='/dev/pts/1'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <log file='/var/lib/nova/instances/6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835/console.log' append='off'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target type='isa-serial' port='0'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:        <model name='isa-serial'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      </target>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='serial0'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <console type='pty' tty='/dev/pts/1'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <source path='/dev/pts/1'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <log file='/var/lib/nova/instances/6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835/console.log' append='off'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target type='serial' port='0'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='serial0'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </console>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <input type='tablet' bus='usb'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='input0'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='usb' bus='0' port='1'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </input>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <input type='mouse' bus='ps2'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='input1'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </input>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <input type='keyboard' bus='ps2'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='input2'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </input>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <graphics type='vnc' port='5901' autoport='yes' listen='::0'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <listen type='address' address='::0'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </graphics>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <audio id='1' type='none'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model type='virtio' heads='1' primary='yes'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='video0'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <watchdog model='itco' action='reset'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='watchdog0'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </watchdog>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <memballoon model='virtio'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <stats period='10'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='balloon0'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <rng model='virtio'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <backend model='random'>/dev/urandom</backend>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='rng0'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <label>system_u:system_r:svirt_t:s0:c12,c164</label>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c12,c164</imagelabel>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  </seclabel>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <label>+107:+107</label>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <imagelabel>+107:+107</imagelabel>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  </seclabel>
Jan 20 09:46:54 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:46:54 np0005588920 nova_compute[226886]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 20 09:46:54 np0005588920 nova_compute[226886]: 2026-01-20 14:46:54.757 226890 INFO nova.virt.libvirt.driver [None req-70bf110c-cb43-4482-9173-b96bc09dd0e6 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Successfully detached device tap67e337c5-f5 from instance 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835 from the persistent domain config.
Jan 20 09:46:54 np0005588920 nova_compute[226886]: 2026-01-20 14:46:54.757 226890 DEBUG nova.virt.libvirt.driver [None req-70bf110c-cb43-4482-9173-b96bc09dd0e6 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] (1/8): Attempting to detach device tap67e337c5-f5 with device alias net1 from instance 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Jan 20 09:46:54 np0005588920 nova_compute[226886]: 2026-01-20 14:46:54.757 226890 DEBUG nova.virt.libvirt.guest [None req-70bf110c-cb43-4482-9173-b96bc09dd0e6 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] detach device xml: <interface type="ethernet">
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <mac address="fa:16:3e:14:2a:53"/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <model type="virtio"/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <mtu size="1442"/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <target dev="tap67e337c5-f5"/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]: </interface>
Jan 20 09:46:54 np0005588920 nova_compute[226886]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 20 09:46:54 np0005588920 kernel: tap67e337c5-f5 (unregistering): left promiscuous mode
Jan 20 09:46:54 np0005588920 NetworkManager[49076]: <info>  [1768920414.8669] device (tap67e337c5-f5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:46:54 np0005588920 nova_compute[226886]: 2026-01-20 14:46:54.872 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:46:54 np0005588920 ovn_controller[133971]: 2026-01-20T14:46:54Z|00397|binding|INFO|Releasing lport 67e337c5-f5f0-4e50-8fd6-77a02fb05273 from this chassis (sb_readonly=0)
Jan 20 09:46:54 np0005588920 ovn_controller[133971]: 2026-01-20T14:46:54Z|00398|binding|INFO|Setting lport 67e337c5-f5f0-4e50-8fd6-77a02fb05273 down in Southbound
Jan 20 09:46:54 np0005588920 ovn_controller[133971]: 2026-01-20T14:46:54Z|00399|binding|INFO|Removing iface tap67e337c5-f5 ovn-installed in OVS
Jan 20 09:46:54 np0005588920 nova_compute[226886]: 2026-01-20 14:46:54.874 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:46:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:54.878 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:14:2a:53 10.10.10.179'], port_security=['fa:16:3e:14:2a:53 10.10.10.179'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.10.10.179/24', 'neutron:device_id': '6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-224f915a-f1b3-471e-87e9-97b33406d6fd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7bbf722f17654404925cfb53e48cd473', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'de73e42c-fe22-4450-87c1-9d723f334d4b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dacd89c2-dc9a-4d7b-aa39-3acd1e4a96f2, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=67e337c5-f5f0-4e50-8fd6-77a02fb05273) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 20 09:46:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:54.880 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 67e337c5-f5f0-4e50-8fd6-77a02fb05273 in datapath 224f915a-f1b3-471e-87e9-97b33406d6fd unbound from our chassis
Jan 20 09:46:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:54.881 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 224f915a-f1b3-471e-87e9-97b33406d6fd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 20 09:46:54 np0005588920 nova_compute[226886]: 2026-01-20 14:46:54.882 226890 DEBUG nova.virt.libvirt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Received event <DeviceRemovedEvent: 1768920414.8824532, 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Jan 20 09:46:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:54.882 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[11c2c255-8a85-4f57-9bca-324e6fe626e4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:46:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:54.883 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-224f915a-f1b3-471e-87e9-97b33406d6fd namespace which is not needed anymore
Jan 20 09:46:54 np0005588920 nova_compute[226886]: 2026-01-20 14:46:54.884 226890 DEBUG nova.virt.libvirt.driver [None req-70bf110c-cb43-4482-9173-b96bc09dd0e6 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Start waiting for the detach event from libvirt for device tap67e337c5-f5 with device alias net1 for instance 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Jan 20 09:46:54 np0005588920 nova_compute[226886]: 2026-01-20 14:46:54.884 226890 DEBUG nova.virt.libvirt.guest [None req-70bf110c-cb43-4482-9173-b96bc09dd0e6 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:14:2a:53"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap67e337c5-f5"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 20 09:46:54 np0005588920 nova_compute[226886]: 2026-01-20 14:46:54.887 226890 DEBUG nova.virt.libvirt.guest [None req-70bf110c-cb43-4482-9173-b96bc09dd0e6 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:14:2a:53"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap67e337c5-f5"/></interface>not found in domain: <domain type='kvm' id='42'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <name>instance-00000061</name>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <uuid>6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835</uuid>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <nova:name>tempest-device-tagging-server-1302621797</nova:name>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <nova:creationTime>2026-01-20 14:46:47</nova:creationTime>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <nova:flavor name="m1.nano">
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <nova:memory>128</nova:memory>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <nova:disk>1</nova:disk>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <nova:swap>0</nova:swap>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <nova:vcpus>1</nova:vcpus>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  </nova:flavor>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <nova:owner>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <nova:user uuid="5cd9508688214bedb977528f8b6f95d1">tempest-TaggedAttachmentsTest-1005708480-project-member</nova:user>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <nova:project uuid="7bbf722f17654404925cfb53e48cd473">tempest-TaggedAttachmentsTest-1005708480</nova:project>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  </nova:owner>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <nova:ports>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <nova:port uuid="fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1">
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </nova:port>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <nova:port uuid="67e337c5-f5f0-4e50-8fd6-77a02fb05273">
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <nova:ip type="fixed" address="10.10.10.179" ipVersion="4"/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </nova:port>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  </nova:ports>
Jan 20 09:46:54 np0005588920 nova_compute[226886]: </nova:instance>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <memory unit='KiB'>131072</memory>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <vcpu placement='static'>1</vcpu>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <resource>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <partition>/machine</partition>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  </resource>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <sysinfo type='smbios'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <entry name='manufacturer'>RDO</entry>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <entry name='product'>OpenStack Compute</entry>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <entry name='serial'>6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835</entry>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <entry name='uuid'>6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835</entry>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <entry name='family'>Virtual Machine</entry>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <boot dev='hd'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <smbios mode='sysinfo'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <vmcoreinfo state='on'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <cpu mode='custom' match='exact' check='full'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <model fallback='forbid'>Nehalem</model>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <feature policy='require' name='x2apic'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <feature policy='require' name='hypervisor'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <feature policy='require' name='vme'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <clock offset='utc'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <timer name='pit' tickpolicy='delay'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <timer name='hpet' present='no'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <on_poweroff>destroy</on_poweroff>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <on_reboot>restart</on_reboot>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <on_crash>destroy</on_crash>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <disk type='network' device='disk'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <driver name='qemu' type='raw' cache='none'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <auth username='openstack'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:        <secret type='ceph' uuid='e399cf45-e6b6-5393-99f1-75c601d3f188'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <source protocol='rbd' name='vms/6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835_disk' index='2'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:        <host name='192.168.122.100' port='6789'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:        <host name='192.168.122.102' port='6789'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:        <host name='192.168.122.101' port='6789'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target dev='vda' bus='virtio'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='virtio-disk0'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <disk type='network' device='cdrom'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <driver name='qemu' type='raw' cache='none'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <auth username='openstack'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:        <secret type='ceph' uuid='e399cf45-e6b6-5393-99f1-75c601d3f188'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <source protocol='rbd' name='vms/6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835_disk.config' index='1'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:        <host name='192.168.122.100' port='6789'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:        <host name='192.168.122.102' port='6789'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:        <host name='192.168.122.101' port='6789'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target dev='sda' bus='sata'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <readonly/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='sata0-0-0'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='0' model='pcie-root'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pcie.0'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target chassis='1' port='0x10'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pci.1'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target chassis='2' port='0x11'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pci.2'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target chassis='3' port='0x12'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pci.3'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target chassis='4' port='0x13'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pci.4'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target chassis='5' port='0x14'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pci.5'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target chassis='6' port='0x15'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pci.6'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target chassis='7' port='0x16'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pci.7'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target chassis='8' port='0x17'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pci.8'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target chassis='9' port='0x18'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pci.9'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target chassis='10' port='0x19'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pci.10'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target chassis='11' port='0x1a'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pci.11'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target chassis='12' port='0x1b'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pci.12'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target chassis='13' port='0x1c'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pci.13'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target chassis='14' port='0x1d'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pci.14'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target chassis='15' port='0x1e'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pci.15'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target chassis='16' port='0x1f'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pci.16'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target chassis='17' port='0x20'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pci.17'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target chassis='18' port='0x21'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pci.18'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target chassis='19' port='0x22'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pci.19'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target chassis='20' port='0x23'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pci.20'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target chassis='21' port='0x24'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pci.21'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target chassis='22' port='0x25'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pci.22'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target chassis='23' port='0x26'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pci.23'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target chassis='24' port='0x27'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pci.24'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target chassis='25' port='0x28'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pci.25'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model name='pcie-pci-bridge'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='pci.26'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='usb'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <controller type='sata' index='0'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='ide'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <interface type='ethernet'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <mac address='fa:16:3e:80:88:4e'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target dev='tapfc4a2805-d7'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model type='virtio'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <driver name='vhost' rx_queue_size='512'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <mtu size='1442'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='net0'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <serial type='pty'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <source path='/dev/pts/1'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <log file='/var/lib/nova/instances/6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835/console.log' append='off'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target type='isa-serial' port='0'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:        <model name='isa-serial'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      </target>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='serial0'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <console type='pty' tty='/dev/pts/1'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <source path='/dev/pts/1'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <log file='/var/lib/nova/instances/6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835/console.log' append='off'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <target type='serial' port='0'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='serial0'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </console>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <input type='tablet' bus='usb'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='input0'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='usb' bus='0' port='1'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </input>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <input type='mouse' bus='ps2'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='input1'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </input>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <input type='keyboard' bus='ps2'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='input2'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </input>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <graphics type='vnc' port='5901' autoport='yes' listen='::0'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <listen type='address' address='::0'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </graphics>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <audio id='1' type='none'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <model type='virtio' heads='1' primary='yes'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='video0'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <watchdog model='itco' action='reset'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='watchdog0'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </watchdog>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <memballoon model='virtio'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <stats period='10'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='balloon0'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <rng model='virtio'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <backend model='random'>/dev/urandom</backend>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <alias name='rng0'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <label>system_u:system_r:svirt_t:s0:c12,c164</label>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c12,c164</imagelabel>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  </seclabel>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <label>+107:+107</label>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <imagelabel>+107:+107</imagelabel>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  </seclabel>
Jan 20 09:46:54 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:46:54 np0005588920 nova_compute[226886]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Jan 20 09:46:54 np0005588920 nova_compute[226886]: 2026-01-20 14:46:54.888 226890 INFO nova.virt.libvirt.driver [None req-70bf110c-cb43-4482-9173-b96bc09dd0e6 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Successfully detached device tap67e337c5-f5 from instance 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835 from the live domain config.#033[00m
Jan 20 09:46:54 np0005588920 nova_compute[226886]: 2026-01-20 14:46:54.889 226890 DEBUG nova.virt.libvirt.vif [None req-70bf110c-cb43-4482-9173-b96bc09dd0e6 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:46:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=InstanceDeviceMetadata,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-1302621797',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-1302621797',id=97,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOduM1b5U0PoRBxfWNx/s8WugTbFKTiIFgBHu9L46ctJZjGe+8jT4Yj1g5uDYe7bVMyBzUuTiVef/RKuMuEZx2XkjNuYbN4taGM1Lc0bPehwylAtyxjPpZUim+i9YC026A==',key_name='tempest-keypair-1946074746',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:46:22Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7bbf722f17654404925cfb53e48cd473',ramdisk_id='',reservation_id='r-d41vvz1v',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TaggedAttachmentsTest-1005708480',owner_user_name='tempest-TaggedAttachmentsTest-1005708480-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:46:22Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='5cd9508688214bedb977528f8b6f95d1',uuid=6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "67e337c5-f5f0-4e50-8fd6-77a02fb05273", "address": "fa:16:3e:14:2a:53", "network": {"id": "224f915a-f1b3-471e-87e9-97b33406d6fd", "bridge": "br-int", "label": "tempest-tagged-attachments-test-net-1970629831", "subnets": [{"cidr": "10.10.10.0/24", "dns": [], "gateway": 
{"address": "10.10.10.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.10.179", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7bbf722f17654404925cfb53e48cd473", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap67e337c5-f5", "ovs_interfaceid": "67e337c5-f5f0-4e50-8fd6-77a02fb05273", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:46:54 np0005588920 nova_compute[226886]: 2026-01-20 14:46:54.889 226890 DEBUG nova.network.os_vif_util [None req-70bf110c-cb43-4482-9173-b96bc09dd0e6 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Converting VIF {"id": "67e337c5-f5f0-4e50-8fd6-77a02fb05273", "address": "fa:16:3e:14:2a:53", "network": {"id": "224f915a-f1b3-471e-87e9-97b33406d6fd", "bridge": "br-int", "label": "tempest-tagged-attachments-test-net-1970629831", "subnets": [{"cidr": "10.10.10.0/24", "dns": [], "gateway": {"address": "10.10.10.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.10.179", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7bbf722f17654404925cfb53e48cd473", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap67e337c5-f5", "ovs_interfaceid": "67e337c5-f5f0-4e50-8fd6-77a02fb05273", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:46:54 np0005588920 nova_compute[226886]: 2026-01-20 14:46:54.890 226890 DEBUG nova.network.os_vif_util [None req-70bf110c-cb43-4482-9173-b96bc09dd0e6 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:14:2a:53,bridge_name='br-int',has_traffic_filtering=True,id=67e337c5-f5f0-4e50-8fd6-77a02fb05273,network=Network(224f915a-f1b3-471e-87e9-97b33406d6fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap67e337c5-f5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:46:54 np0005588920 nova_compute[226886]: 2026-01-20 14:46:54.890 226890 DEBUG os_vif [None req-70bf110c-cb43-4482-9173-b96bc09dd0e6 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:14:2a:53,bridge_name='br-int',has_traffic_filtering=True,id=67e337c5-f5f0-4e50-8fd6-77a02fb05273,network=Network(224f915a-f1b3-471e-87e9-97b33406d6fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap67e337c5-f5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:46:54 np0005588920 nova_compute[226886]: 2026-01-20 14:46:54.892 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:54 np0005588920 nova_compute[226886]: 2026-01-20 14:46:54.893 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap67e337c5-f5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:46:54 np0005588920 nova_compute[226886]: 2026-01-20 14:46:54.894 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:54 np0005588920 nova_compute[226886]: 2026-01-20 14:46:54.897 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:46:54 np0005588920 nova_compute[226886]: 2026-01-20 14:46:54.898 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:54 np0005588920 nova_compute[226886]: 2026-01-20 14:46:54.901 226890 INFO os_vif [None req-70bf110c-cb43-4482-9173-b96bc09dd0e6 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:14:2a:53,bridge_name='br-int',has_traffic_filtering=True,id=67e337c5-f5f0-4e50-8fd6-77a02fb05273,network=Network(224f915a-f1b3-471e-87e9-97b33406d6fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap67e337c5-f5')#033[00m
Jan 20 09:46:54 np0005588920 nova_compute[226886]: 2026-01-20 14:46:54.901 226890 DEBUG nova.virt.libvirt.guest [None req-70bf110c-cb43-4482-9173-b96bc09dd0e6 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <nova:name>tempest-device-tagging-server-1302621797</nova:name>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <nova:creationTime>2026-01-20 14:46:54</nova:creationTime>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <nova:flavor name="m1.nano">
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <nova:memory>128</nova:memory>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <nova:disk>1</nova:disk>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <nova:swap>0</nova:swap>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <nova:vcpus>1</nova:vcpus>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  </nova:flavor>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <nova:owner>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <nova:user uuid="5cd9508688214bedb977528f8b6f95d1">tempest-TaggedAttachmentsTest-1005708480-project-member</nova:user>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <nova:project uuid="7bbf722f17654404925cfb53e48cd473">tempest-TaggedAttachmentsTest-1005708480</nova:project>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  </nova:owner>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  <nova:ports>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    <nova:port uuid="fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1">
Jan 20 09:46:54 np0005588920 nova_compute[226886]:      <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:    </nova:port>
Jan 20 09:46:54 np0005588920 nova_compute[226886]:  </nova:ports>
Jan 20 09:46:54 np0005588920 nova_compute[226886]: </nova:instance>
Jan 20 09:46:54 np0005588920 nova_compute[226886]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 20 09:46:55 np0005588920 neutron-haproxy-ovnmeta-224f915a-f1b3-471e-87e9-97b33406d6fd[261487]: [NOTICE]   (261491) : haproxy version is 2.8.14-c23fe91
Jan 20 09:46:55 np0005588920 neutron-haproxy-ovnmeta-224f915a-f1b3-471e-87e9-97b33406d6fd[261487]: [NOTICE]   (261491) : path to executable is /usr/sbin/haproxy
Jan 20 09:46:55 np0005588920 neutron-haproxy-ovnmeta-224f915a-f1b3-471e-87e9-97b33406d6fd[261487]: [WARNING]  (261491) : Exiting Master process...
Jan 20 09:46:55 np0005588920 neutron-haproxy-ovnmeta-224f915a-f1b3-471e-87e9-97b33406d6fd[261487]: [ALERT]    (261491) : Current worker (261493) exited with code 143 (Terminated)
Jan 20 09:46:55 np0005588920 neutron-haproxy-ovnmeta-224f915a-f1b3-471e-87e9-97b33406d6fd[261487]: [WARNING]  (261491) : All workers exited. Exiting... (0)
Jan 20 09:46:55 np0005588920 systemd[1]: libpod-cd4962dfe3f2e73bb2c1c7b7ba2c79fb86e68c92a783a70e5f070241d9c9b0c0.scope: Deactivated successfully.
Jan 20 09:46:55 np0005588920 conmon[261487]: conmon cd4962dfe3f2e73bb2c1 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-cd4962dfe3f2e73bb2c1c7b7ba2c79fb86e68c92a783a70e5f070241d9c9b0c0.scope/container/memory.events
Jan 20 09:46:55 np0005588920 podman[261548]: 2026-01-20 14:46:55.013988918 +0000 UTC m=+0.044485082 container died cd4962dfe3f2e73bb2c1c7b7ba2c79fb86e68c92a783a70e5f070241d9c9b0c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-224f915a-f1b3-471e-87e9-97b33406d6fd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 20 09:46:55 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cd4962dfe3f2e73bb2c1c7b7ba2c79fb86e68c92a783a70e5f070241d9c9b0c0-userdata-shm.mount: Deactivated successfully.
Jan 20 09:46:55 np0005588920 systemd[1]: var-lib-containers-storage-overlay-4523ddceeee5ed78d82434632c00af7e4213b152bd5e4ed5f8d5a9e031533f21-merged.mount: Deactivated successfully.
Jan 20 09:46:55 np0005588920 podman[261548]: 2026-01-20 14:46:55.051697339 +0000 UTC m=+0.082193493 container cleanup cd4962dfe3f2e73bb2c1c7b7ba2c79fb86e68c92a783a70e5f070241d9c9b0c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-224f915a-f1b3-471e-87e9-97b33406d6fd, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 20 09:46:55 np0005588920 systemd[1]: libpod-conmon-cd4962dfe3f2e73bb2c1c7b7ba2c79fb86e68c92a783a70e5f070241d9c9b0c0.scope: Deactivated successfully.
Jan 20 09:46:55 np0005588920 podman[261580]: 2026-01-20 14:46:55.122331848 +0000 UTC m=+0.046469477 container remove cd4962dfe3f2e73bb2c1c7b7ba2c79fb86e68c92a783a70e5f070241d9c9b0c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-224f915a-f1b3-471e-87e9-97b33406d6fd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 20 09:46:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:55.128 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[68dbcd3f-90c1-481d-8a27-12febfd11378]: (4, ('Tue Jan 20 02:46:54 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-224f915a-f1b3-471e-87e9-97b33406d6fd (cd4962dfe3f2e73bb2c1c7b7ba2c79fb86e68c92a783a70e5f070241d9c9b0c0)\ncd4962dfe3f2e73bb2c1c7b7ba2c79fb86e68c92a783a70e5f070241d9c9b0c0\nTue Jan 20 02:46:55 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-224f915a-f1b3-471e-87e9-97b33406d6fd (cd4962dfe3f2e73bb2c1c7b7ba2c79fb86e68c92a783a70e5f070241d9c9b0c0)\ncd4962dfe3f2e73bb2c1c7b7ba2c79fb86e68c92a783a70e5f070241d9c9b0c0\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:55.130 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[08562df9-520e-4897-94b1-5ecae3948aae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:55.131 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap224f915a-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:46:55 np0005588920 nova_compute[226886]: 2026-01-20 14:46:55.132 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:55 np0005588920 kernel: tap224f915a-f0: left promiscuous mode
Jan 20 09:46:55 np0005588920 nova_compute[226886]: 2026-01-20 14:46:55.146 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:55 np0005588920 nova_compute[226886]: 2026-01-20 14:46:55.147 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:55.148 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[74ef6ea3-9f48-45c6-8b27-15232dd587da]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:55.170 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[722e3929-502a-4fc3-b9f0-aac0b498899b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:55.171 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ddcb9894-3020-44a2-a526-bd2798f7e636]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:55.186 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[25f89bb7-d59c-4d36-a473-cabd1ccf7710]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 544896, 'reachable_time': 16716, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 261594, 'error': None, 'target': 'ovnmeta-224f915a-f1b3-471e-87e9-97b33406d6fd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:55.189 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-224f915a-f1b3-471e-87e9-97b33406d6fd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:46:55 np0005588920 systemd[1]: run-netns-ovnmeta\x2d224f915a\x2df1b3\x2d471e\x2d87e9\x2d97b33406d6fd.mount: Deactivated successfully.
Jan 20 09:46:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:55.189 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[151c51d2-e396-408d-9e09-ff9a0b07d037]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:55 np0005588920 nova_compute[226886]: 2026-01-20 14:46:55.336 226890 DEBUG nova.compute.manager [req-a783b15c-d62b-4a9c-abad-26d1d78638c6 req-ed007704-748d-4d96-bb42-32dc0f4153ff 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Received event network-vif-unplugged-67e337c5-f5f0-4e50-8fd6-77a02fb05273 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:46:55 np0005588920 nova_compute[226886]: 2026-01-20 14:46:55.337 226890 DEBUG oslo_concurrency.lockutils [req-a783b15c-d62b-4a9c-abad-26d1d78638c6 req-ed007704-748d-4d96-bb42-32dc0f4153ff 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:46:55 np0005588920 nova_compute[226886]: 2026-01-20 14:46:55.337 226890 DEBUG oslo_concurrency.lockutils [req-a783b15c-d62b-4a9c-abad-26d1d78638c6 req-ed007704-748d-4d96-bb42-32dc0f4153ff 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:46:55 np0005588920 nova_compute[226886]: 2026-01-20 14:46:55.337 226890 DEBUG oslo_concurrency.lockutils [req-a783b15c-d62b-4a9c-abad-26d1d78638c6 req-ed007704-748d-4d96-bb42-32dc0f4153ff 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:46:55 np0005588920 nova_compute[226886]: 2026-01-20 14:46:55.337 226890 DEBUG nova.compute.manager [req-a783b15c-d62b-4a9c-abad-26d1d78638c6 req-ed007704-748d-4d96-bb42-32dc0f4153ff 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] No waiting events found dispatching network-vif-unplugged-67e337c5-f5f0-4e50-8fd6-77a02fb05273 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:46:55 np0005588920 nova_compute[226886]: 2026-01-20 14:46:55.338 226890 WARNING nova.compute.manager [req-a783b15c-d62b-4a9c-abad-26d1d78638c6 req-ed007704-748d-4d96-bb42-32dc0f4153ff 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Received unexpected event network-vif-unplugged-67e337c5-f5f0-4e50-8fd6-77a02fb05273 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:46:55 np0005588920 nova_compute[226886]: 2026-01-20 14:46:55.461 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:55.461 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=30, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=29) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:46:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:55.463 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:46:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:55.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:55 np0005588920 nova_compute[226886]: 2026-01-20 14:46:55.637 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:56 np0005588920 nova_compute[226886]: 2026-01-20 14:46:56.341 226890 DEBUG oslo_concurrency.lockutils [None req-70bf110c-cb43-4482-9173-b96bc09dd0e6 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Acquiring lock "refresh_cache-6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:46:56 np0005588920 nova_compute[226886]: 2026-01-20 14:46:56.342 226890 DEBUG oslo_concurrency.lockutils [None req-70bf110c-cb43-4482-9173-b96bc09dd0e6 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Acquired lock "refresh_cache-6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:46:56 np0005588920 nova_compute[226886]: 2026-01-20 14:46:56.342 226890 DEBUG nova.network.neutron [None req-70bf110c-cb43-4482-9173-b96bc09dd0e6 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:46:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:56.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e238 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:46:57 np0005588920 nova_compute[226886]: 2026-01-20 14:46:57.433 226890 DEBUG nova.compute.manager [req-3c41b281-4505-4896-bfb4-808f08e0ef82 req-a9b4b291-71fd-46e6-835c-4854d24a037c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Received event network-vif-plugged-67e337c5-f5f0-4e50-8fd6-77a02fb05273 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:46:57 np0005588920 nova_compute[226886]: 2026-01-20 14:46:57.434 226890 DEBUG oslo_concurrency.lockutils [req-3c41b281-4505-4896-bfb4-808f08e0ef82 req-a9b4b291-71fd-46e6-835c-4854d24a037c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:46:57 np0005588920 nova_compute[226886]: 2026-01-20 14:46:57.434 226890 DEBUG oslo_concurrency.lockutils [req-3c41b281-4505-4896-bfb4-808f08e0ef82 req-a9b4b291-71fd-46e6-835c-4854d24a037c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:46:57 np0005588920 nova_compute[226886]: 2026-01-20 14:46:57.435 226890 DEBUG oslo_concurrency.lockutils [req-3c41b281-4505-4896-bfb4-808f08e0ef82 req-a9b4b291-71fd-46e6-835c-4854d24a037c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:46:57 np0005588920 nova_compute[226886]: 2026-01-20 14:46:57.435 226890 DEBUG nova.compute.manager [req-3c41b281-4505-4896-bfb4-808f08e0ef82 req-a9b4b291-71fd-46e6-835c-4854d24a037c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] No waiting events found dispatching network-vif-plugged-67e337c5-f5f0-4e50-8fd6-77a02fb05273 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:46:57 np0005588920 nova_compute[226886]: 2026-01-20 14:46:57.435 226890 WARNING nova.compute.manager [req-3c41b281-4505-4896-bfb4-808f08e0ef82 req-a9b4b291-71fd-46e6-835c-4854d24a037c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Received unexpected event network-vif-plugged-67e337c5-f5f0-4e50-8fd6-77a02fb05273 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:46:57 np0005588920 nova_compute[226886]: 2026-01-20 14:46:57.435 226890 DEBUG nova.compute.manager [req-3c41b281-4505-4896-bfb4-808f08e0ef82 req-a9b4b291-71fd-46e6-835c-4854d24a037c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Received event network-vif-deleted-67e337c5-f5f0-4e50-8fd6-77a02fb05273 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:46:57 np0005588920 nova_compute[226886]: 2026-01-20 14:46:57.435 226890 INFO nova.compute.manager [req-3c41b281-4505-4896-bfb4-808f08e0ef82 req-a9b4b291-71fd-46e6-835c-4854d24a037c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Neutron deleted interface 67e337c5-f5f0-4e50-8fd6-77a02fb05273; detaching it from the instance and deleting it from the info cache#033[00m
Jan 20 09:46:57 np0005588920 nova_compute[226886]: 2026-01-20 14:46:57.435 226890 DEBUG nova.network.neutron [req-3c41b281-4505-4896-bfb4-808f08e0ef82 req-a9b4b291-71fd-46e6-835c-4854d24a037c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Updating instance_info_cache with network_info: [{"id": "fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1", "address": "fa:16:3e:80:88:4e", "network": {"id": "c59c8bba-9fc6-441e-8b7d-cd5444901b2a", "bridge": "br-int", "label": "tempest-TaggedAttachmentsTest-680871933-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7bbf722f17654404925cfb53e48cd473", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc4a2805-d7", "ovs_interfaceid": "fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:46:57 np0005588920 nova_compute[226886]: 2026-01-20 14:46:57.454 226890 DEBUG nova.objects.instance [req-3c41b281-4505-4896-bfb4-808f08e0ef82 req-a9b4b291-71fd-46e6-835c-4854d24a037c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lazy-loading 'system_metadata' on Instance uuid 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:46:57 np0005588920 nova_compute[226886]: 2026-01-20 14:46:57.479 226890 DEBUG nova.objects.instance [req-3c41b281-4505-4896-bfb4-808f08e0ef82 req-a9b4b291-71fd-46e6-835c-4854d24a037c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lazy-loading 'flavor' on Instance uuid 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:46:57 np0005588920 nova_compute[226886]: 2026-01-20 14:46:57.508 226890 DEBUG nova.virt.libvirt.vif [req-3c41b281-4505-4896-bfb4-808f08e0ef82 req-a9b4b291-71fd-46e6-835c-4854d24a037c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:46:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-1302621797',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-1302621797',id=97,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOduM1b5U0PoRBxfWNx/s8WugTbFKTiIFgBHu9L46ctJZjGe+8jT4Yj1g5uDYe7bVMyBzUuTiVef/RKuMuEZx2XkjNuYbN4taGM1Lc0bPehwylAtyxjPpZUim+i9YC026A==',key_name='tempest-keypair-1946074746',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:46:22Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7bbf722f17654404925cfb53e48cd473',ramdisk_id='',reservation_id='r-d41vvz1v',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TaggedAttachmentsTest-1005708480',owner_user_name='tempest-TaggedAttachmentsTest-1005708480-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:46:22Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='5cd9508688214bedb977528f8b6f95d1',uuid=6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "67e337c5-f5f0-4e50-8fd6-77a02fb05273", "address": "fa:16:3e:14:2a:53", "network": {"id": "224f915a-f1b3-471e-87e9-97b33406d6fd", "bridge": "br-int", "label": "tempest-tagged-attachments-test-net-1970629831", "subnets": [{"cidr": "10.10.10.0/24", "dns": [], "gateway": {"address": 
"10.10.10.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.10.179", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7bbf722f17654404925cfb53e48cd473", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap67e337c5-f5", "ovs_interfaceid": "67e337c5-f5f0-4e50-8fd6-77a02fb05273", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:46:57 np0005588920 nova_compute[226886]: 2026-01-20 14:46:57.508 226890 DEBUG nova.network.os_vif_util [req-3c41b281-4505-4896-bfb4-808f08e0ef82 req-a9b4b291-71fd-46e6-835c-4854d24a037c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Converting VIF {"id": "67e337c5-f5f0-4e50-8fd6-77a02fb05273", "address": "fa:16:3e:14:2a:53", "network": {"id": "224f915a-f1b3-471e-87e9-97b33406d6fd", "bridge": "br-int", "label": "tempest-tagged-attachments-test-net-1970629831", "subnets": [{"cidr": "10.10.10.0/24", "dns": [], "gateway": {"address": "10.10.10.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.10.179", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7bbf722f17654404925cfb53e48cd473", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap67e337c5-f5", "ovs_interfaceid": "67e337c5-f5f0-4e50-8fd6-77a02fb05273", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:46:57 np0005588920 nova_compute[226886]: 2026-01-20 14:46:57.509 226890 DEBUG nova.network.os_vif_util [req-3c41b281-4505-4896-bfb4-808f08e0ef82 req-a9b4b291-71fd-46e6-835c-4854d24a037c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:14:2a:53,bridge_name='br-int',has_traffic_filtering=True,id=67e337c5-f5f0-4e50-8fd6-77a02fb05273,network=Network(224f915a-f1b3-471e-87e9-97b33406d6fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap67e337c5-f5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:46:57 np0005588920 nova_compute[226886]: 2026-01-20 14:46:57.514 226890 DEBUG nova.virt.libvirt.guest [req-3c41b281-4505-4896-bfb4-808f08e0ef82 req-a9b4b291-71fd-46e6-835c-4854d24a037c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:14:2a:53"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap67e337c5-f5"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 20 09:46:57 np0005588920 nova_compute[226886]: 2026-01-20 14:46:57.520 226890 DEBUG nova.virt.libvirt.guest [req-3c41b281-4505-4896-bfb4-808f08e0ef82 req-a9b4b291-71fd-46e6-835c-4854d24a037c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:14:2a:53"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap67e337c5-f5"/></interface>not found in domain: <domain type='kvm' id='42'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <name>instance-00000061</name>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <uuid>6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835</uuid>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <nova:name>tempest-device-tagging-server-1302621797</nova:name>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <nova:creationTime>2026-01-20 14:46:54</nova:creationTime>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <nova:flavor name="m1.nano">
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <nova:memory>128</nova:memory>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <nova:disk>1</nova:disk>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <nova:swap>0</nova:swap>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <nova:vcpus>1</nova:vcpus>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  </nova:flavor>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <nova:owner>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <nova:user uuid="5cd9508688214bedb977528f8b6f95d1">tempest-TaggedAttachmentsTest-1005708480-project-member</nova:user>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <nova:project uuid="7bbf722f17654404925cfb53e48cd473">tempest-TaggedAttachmentsTest-1005708480</nova:project>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  </nova:owner>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <nova:ports>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <nova:port uuid="fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1">
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </nova:port>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  </nova:ports>
Jan 20 09:46:57 np0005588920 nova_compute[226886]: </nova:instance>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <memory unit='KiB'>131072</memory>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <vcpu placement='static'>1</vcpu>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <resource>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <partition>/machine</partition>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  </resource>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <sysinfo type='smbios'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <entry name='manufacturer'>RDO</entry>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <entry name='product'>OpenStack Compute</entry>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <entry name='serial'>6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835</entry>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <entry name='uuid'>6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835</entry>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <entry name='family'>Virtual Machine</entry>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <boot dev='hd'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <smbios mode='sysinfo'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <vmcoreinfo state='on'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <cpu mode='custom' match='exact' check='full'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <model fallback='forbid'>Nehalem</model>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <feature policy='require' name='x2apic'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <feature policy='require' name='hypervisor'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <feature policy='require' name='vme'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <clock offset='utc'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <timer name='pit' tickpolicy='delay'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <timer name='hpet' present='no'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <on_poweroff>destroy</on_poweroff>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <on_reboot>restart</on_reboot>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <on_crash>destroy</on_crash>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <disk type='network' device='disk'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <driver name='qemu' type='raw' cache='none'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <auth username='openstack'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:        <secret type='ceph' uuid='e399cf45-e6b6-5393-99f1-75c601d3f188'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <source protocol='rbd' name='vms/6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835_disk' index='2'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:        <host name='192.168.122.100' port='6789'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:        <host name='192.168.122.102' port='6789'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:        <host name='192.168.122.101' port='6789'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target dev='vda' bus='virtio'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='virtio-disk0'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <disk type='network' device='cdrom'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <driver name='qemu' type='raw' cache='none'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <auth username='openstack'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:        <secret type='ceph' uuid='e399cf45-e6b6-5393-99f1-75c601d3f188'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <source protocol='rbd' name='vms/6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835_disk.config' index='1'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:        <host name='192.168.122.100' port='6789'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:        <host name='192.168.122.102' port='6789'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:        <host name='192.168.122.101' port='6789'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target dev='sda' bus='sata'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <readonly/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='sata0-0-0'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='0' model='pcie-root'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pcie.0'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target chassis='1' port='0x10'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pci.1'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target chassis='2' port='0x11'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pci.2'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target chassis='3' port='0x12'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pci.3'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target chassis='4' port='0x13'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pci.4'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target chassis='5' port='0x14'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pci.5'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target chassis='6' port='0x15'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pci.6'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target chassis='7' port='0x16'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pci.7'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target chassis='8' port='0x17'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pci.8'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target chassis='9' port='0x18'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pci.9'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target chassis='10' port='0x19'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pci.10'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target chassis='11' port='0x1a'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pci.11'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target chassis='12' port='0x1b'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pci.12'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target chassis='13' port='0x1c'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pci.13'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target chassis='14' port='0x1d'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pci.14'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target chassis='15' port='0x1e'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pci.15'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target chassis='16' port='0x1f'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pci.16'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target chassis='17' port='0x20'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pci.17'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target chassis='18' port='0x21'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pci.18'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target chassis='19' port='0x22'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pci.19'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target chassis='20' port='0x23'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pci.20'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target chassis='21' port='0x24'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pci.21'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target chassis='22' port='0x25'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pci.22'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target chassis='23' port='0x26'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pci.23'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target chassis='24' port='0x27'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pci.24'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target chassis='25' port='0x28'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pci.25'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model name='pcie-pci-bridge'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pci.26'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='usb'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='sata' index='0'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='ide'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <interface type='ethernet'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <mac address='fa:16:3e:80:88:4e'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target dev='tapfc4a2805-d7'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model type='virtio'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <driver name='vhost' rx_queue_size='512'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <mtu size='1442'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='net0'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <serial type='pty'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <source path='/dev/pts/1'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <log file='/var/lib/nova/instances/6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835/console.log' append='off'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target type='isa-serial' port='0'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:        <model name='isa-serial'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      </target>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='serial0'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <console type='pty' tty='/dev/pts/1'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <source path='/dev/pts/1'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <log file='/var/lib/nova/instances/6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835/console.log' append='off'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target type='serial' port='0'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='serial0'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </console>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <input type='tablet' bus='usb'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='input0'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='usb' bus='0' port='1'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </input>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <input type='mouse' bus='ps2'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='input1'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </input>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <input type='keyboard' bus='ps2'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='input2'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </input>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <graphics type='vnc' port='5901' autoport='yes' listen='::0'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <listen type='address' address='::0'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </graphics>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <audio id='1' type='none'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model type='virtio' heads='1' primary='yes'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='video0'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <watchdog model='itco' action='reset'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='watchdog0'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </watchdog>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <memballoon model='virtio'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <stats period='10'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='balloon0'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <rng model='virtio'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <backend model='random'>/dev/urandom</backend>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='rng0'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <label>system_u:system_r:svirt_t:s0:c12,c164</label>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c12,c164</imagelabel>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  </seclabel>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <label>+107:+107</label>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <imagelabel>+107:+107</imagelabel>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  </seclabel>
Jan 20 09:46:57 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:46:57 np0005588920 nova_compute[226886]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 20 09:46:57 np0005588920 nova_compute[226886]: 2026-01-20 14:46:57.521 226890 DEBUG nova.virt.libvirt.guest [req-3c41b281-4505-4896-bfb4-808f08e0ef82 req-a9b4b291-71fd-46e6-835c-4854d24a037c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:14:2a:53"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap67e337c5-f5"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 20 09:46:57 np0005588920 nova_compute[226886]: 2026-01-20 14:46:57.524 226890 DEBUG nova.virt.libvirt.guest [req-3c41b281-4505-4896-bfb4-808f08e0ef82 req-a9b4b291-71fd-46e6-835c-4854d24a037c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:14:2a:53"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap67e337c5-f5"/></interface>not found in domain: <domain type='kvm' id='42'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <name>instance-00000061</name>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <uuid>6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835</uuid>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <nova:name>tempest-device-tagging-server-1302621797</nova:name>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <nova:creationTime>2026-01-20 14:46:54</nova:creationTime>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <nova:flavor name="m1.nano">
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <nova:memory>128</nova:memory>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <nova:disk>1</nova:disk>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <nova:swap>0</nova:swap>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <nova:vcpus>1</nova:vcpus>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  </nova:flavor>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <nova:owner>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <nova:user uuid="5cd9508688214bedb977528f8b6f95d1">tempest-TaggedAttachmentsTest-1005708480-project-member</nova:user>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <nova:project uuid="7bbf722f17654404925cfb53e48cd473">tempest-TaggedAttachmentsTest-1005708480</nova:project>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  </nova:owner>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <nova:ports>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <nova:port uuid="fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1">
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </nova:port>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  </nova:ports>
Jan 20 09:46:57 np0005588920 nova_compute[226886]: </nova:instance>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <memory unit='KiB'>131072</memory>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <vcpu placement='static'>1</vcpu>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <resource>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <partition>/machine</partition>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  </resource>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <sysinfo type='smbios'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <entry name='manufacturer'>RDO</entry>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <entry name='product'>OpenStack Compute</entry>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <entry name='serial'>6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835</entry>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <entry name='uuid'>6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835</entry>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <entry name='family'>Virtual Machine</entry>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <boot dev='hd'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <smbios mode='sysinfo'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <vmcoreinfo state='on'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <cpu mode='custom' match='exact' check='full'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <model fallback='forbid'>Nehalem</model>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <feature policy='require' name='x2apic'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <feature policy='require' name='hypervisor'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <feature policy='require' name='vme'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <clock offset='utc'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <timer name='pit' tickpolicy='delay'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <timer name='hpet' present='no'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <on_poweroff>destroy</on_poweroff>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <on_reboot>restart</on_reboot>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <on_crash>destroy</on_crash>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <disk type='network' device='disk'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <driver name='qemu' type='raw' cache='none'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <auth username='openstack'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:        <secret type='ceph' uuid='e399cf45-e6b6-5393-99f1-75c601d3f188'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <source protocol='rbd' name='vms/6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835_disk' index='2'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:        <host name='192.168.122.100' port='6789'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:        <host name='192.168.122.102' port='6789'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:        <host name='192.168.122.101' port='6789'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target dev='vda' bus='virtio'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='virtio-disk0'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <disk type='network' device='cdrom'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <driver name='qemu' type='raw' cache='none'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <auth username='openstack'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:        <secret type='ceph' uuid='e399cf45-e6b6-5393-99f1-75c601d3f188'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <source protocol='rbd' name='vms/6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835_disk.config' index='1'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:        <host name='192.168.122.100' port='6789'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:        <host name='192.168.122.102' port='6789'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:        <host name='192.168.122.101' port='6789'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target dev='sda' bus='sata'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <readonly/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='sata0-0-0'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='0' model='pcie-root'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pcie.0'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target chassis='1' port='0x10'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pci.1'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target chassis='2' port='0x11'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pci.2'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target chassis='3' port='0x12'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pci.3'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target chassis='4' port='0x13'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pci.4'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target chassis='5' port='0x14'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pci.5'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target chassis='6' port='0x15'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pci.6'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target chassis='7' port='0x16'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pci.7'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target chassis='8' port='0x17'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pci.8'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target chassis='9' port='0x18'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pci.9'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target chassis='10' port='0x19'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pci.10'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target chassis='11' port='0x1a'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pci.11'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target chassis='12' port='0x1b'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pci.12'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target chassis='13' port='0x1c'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pci.13'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target chassis='14' port='0x1d'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pci.14'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target chassis='15' port='0x1e'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pci.15'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target chassis='16' port='0x1f'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pci.16'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target chassis='17' port='0x20'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pci.17'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target chassis='18' port='0x21'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pci.18'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target chassis='19' port='0x22'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pci.19'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target chassis='20' port='0x23'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pci.20'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target chassis='21' port='0x24'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pci.21'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target chassis='22' port='0x25'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pci.22'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target chassis='23' port='0x26'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pci.23'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target chassis='24' port='0x27'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pci.24'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model name='pcie-root-port'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target chassis='25' port='0x28'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pci.25'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model name='pcie-pci-bridge'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='pci.26'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='usb'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <controller type='sata' index='0'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='ide'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </controller>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <interface type='ethernet'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <mac address='fa:16:3e:80:88:4e'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target dev='tapfc4a2805-d7'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model type='virtio'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <driver name='vhost' rx_queue_size='512'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <mtu size='1442'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='net0'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <serial type='pty'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <source path='/dev/pts/1'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <log file='/var/lib/nova/instances/6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835/console.log' append='off'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target type='isa-serial' port='0'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:        <model name='isa-serial'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      </target>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='serial0'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <console type='pty' tty='/dev/pts/1'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <source path='/dev/pts/1'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <log file='/var/lib/nova/instances/6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835/console.log' append='off'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <target type='serial' port='0'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='serial0'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </console>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <input type='tablet' bus='usb'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='input0'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='usb' bus='0' port='1'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </input>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <input type='mouse' bus='ps2'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='input1'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </input>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <input type='keyboard' bus='ps2'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='input2'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </input>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <graphics type='vnc' port='5901' autoport='yes' listen='::0'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <listen type='address' address='::0'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </graphics>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <audio id='1' type='none'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <model type='virtio' heads='1' primary='yes'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='video0'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <watchdog model='itco' action='reset'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='watchdog0'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </watchdog>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <memballoon model='virtio'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <stats period='10'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='balloon0'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <rng model='virtio'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <backend model='random'>/dev/urandom</backend>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <alias name='rng0'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <label>system_u:system_r:svirt_t:s0:c12,c164</label>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c12,c164</imagelabel>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  </seclabel>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <label>+107:+107</label>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <imagelabel>+107:+107</imagelabel>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  </seclabel>
Jan 20 09:46:57 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:46:57 np0005588920 nova_compute[226886]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 20 09:46:57 np0005588920 nova_compute[226886]: 2026-01-20 14:46:57.525 226890 WARNING nova.virt.libvirt.driver [req-3c41b281-4505-4896-bfb4-808f08e0ef82 req-a9b4b291-71fd-46e6-835c-4854d24a037c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Detaching interface fa:16:3e:14:2a:53 failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tap67e337c5-f5' not found.
Jan 20 09:46:57 np0005588920 nova_compute[226886]: 2026-01-20 14:46:57.526 226890 DEBUG nova.virt.libvirt.vif [req-3c41b281-4505-4896-bfb4-808f08e0ef82 req-a9b4b291-71fd-46e6-835c-4854d24a037c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:46:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-1302621797',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-1302621797',id=97,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOduM1b5U0PoRBxfWNx/s8WugTbFKTiIFgBHu9L46ctJZjGe+8jT4Yj1g5uDYe7bVMyBzUuTiVef/RKuMuEZx2XkjNuYbN4taGM1Lc0bPehwylAtyxjPpZUim+i9YC026A==',key_name='tempest-keypair-1946074746',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:46:22Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7bbf722f17654404925cfb53e48cd473',ramdisk_id='',reservation_id='r-d41vvz1v',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TaggedAttachmentsTest-1005708480',owner_user_name='tempest-TaggedAttachmentsTest-1005708480-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:46:22Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='5cd9508688214bedb977528f8b6f95d1',uuid=6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "67e337c5-f5f0-4e50-8fd6-77a02fb05273", "address": "fa:16:3e:14:2a:53", "network": {"id": "224f915a-f1b3-471e-87e9-97b33406d6fd", "bridge": "br-int", "label": "tempest-tagged-attachments-test-net-1970629831", "subnets": [{"cidr": "10.10.10.0/24", "dns": [], "gateway": {"address": "10.10.10.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.10.179", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7bbf722f17654404925cfb53e48cd473", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap67e337c5-f5", "ovs_interfaceid": "67e337c5-f5f0-4e50-8fd6-77a02fb05273", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:46:57 np0005588920 nova_compute[226886]: 2026-01-20 14:46:57.526 226890 DEBUG nova.network.os_vif_util [req-3c41b281-4505-4896-bfb4-808f08e0ef82 req-a9b4b291-71fd-46e6-835c-4854d24a037c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Converting VIF {"id": "67e337c5-f5f0-4e50-8fd6-77a02fb05273", "address": "fa:16:3e:14:2a:53", "network": {"id": "224f915a-f1b3-471e-87e9-97b33406d6fd", "bridge": "br-int", "label": "tempest-tagged-attachments-test-net-1970629831", "subnets": [{"cidr": "10.10.10.0/24", "dns": [], "gateway": {"address": "10.10.10.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.10.179", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7bbf722f17654404925cfb53e48cd473", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap67e337c5-f5", "ovs_interfaceid": "67e337c5-f5f0-4e50-8fd6-77a02fb05273", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:46:57 np0005588920 nova_compute[226886]: 2026-01-20 14:46:57.527 226890 DEBUG nova.network.os_vif_util [req-3c41b281-4505-4896-bfb4-808f08e0ef82 req-a9b4b291-71fd-46e6-835c-4854d24a037c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:14:2a:53,bridge_name='br-int',has_traffic_filtering=True,id=67e337c5-f5f0-4e50-8fd6-77a02fb05273,network=Network(224f915a-f1b3-471e-87e9-97b33406d6fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap67e337c5-f5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:46:57 np0005588920 nova_compute[226886]: 2026-01-20 14:46:57.527 226890 DEBUG os_vif [req-3c41b281-4505-4896-bfb4-808f08e0ef82 req-a9b4b291-71fd-46e6-835c-4854d24a037c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:14:2a:53,bridge_name='br-int',has_traffic_filtering=True,id=67e337c5-f5f0-4e50-8fd6-77a02fb05273,network=Network(224f915a-f1b3-471e-87e9-97b33406d6fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap67e337c5-f5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:46:57 np0005588920 nova_compute[226886]: 2026-01-20 14:46:57.529 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:57 np0005588920 nova_compute[226886]: 2026-01-20 14:46:57.529 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap67e337c5-f5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:46:57 np0005588920 nova_compute[226886]: 2026-01-20 14:46:57.529 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:46:57 np0005588920 nova_compute[226886]: 2026-01-20 14:46:57.531 226890 INFO os_vif [req-3c41b281-4505-4896-bfb4-808f08e0ef82 req-a9b4b291-71fd-46e6-835c-4854d24a037c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:14:2a:53,bridge_name='br-int',has_traffic_filtering=True,id=67e337c5-f5f0-4e50-8fd6-77a02fb05273,network=Network(224f915a-f1b3-471e-87e9-97b33406d6fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap67e337c5-f5')#033[00m
Jan 20 09:46:57 np0005588920 nova_compute[226886]: 2026-01-20 14:46:57.532 226890 DEBUG nova.virt.libvirt.guest [req-3c41b281-4505-4896-bfb4-808f08e0ef82 req-a9b4b291-71fd-46e6-835c-4854d24a037c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <nova:name>tempest-device-tagging-server-1302621797</nova:name>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <nova:creationTime>2026-01-20 14:46:57</nova:creationTime>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <nova:flavor name="m1.nano">
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <nova:memory>128</nova:memory>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <nova:disk>1</nova:disk>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <nova:swap>0</nova:swap>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <nova:vcpus>1</nova:vcpus>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  </nova:flavor>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <nova:owner>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <nova:user uuid="5cd9508688214bedb977528f8b6f95d1">tempest-TaggedAttachmentsTest-1005708480-project-member</nova:user>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <nova:project uuid="7bbf722f17654404925cfb53e48cd473">tempest-TaggedAttachmentsTest-1005708480</nova:project>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  </nova:owner>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  <nova:ports>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    <nova:port uuid="fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1">
Jan 20 09:46:57 np0005588920 nova_compute[226886]:      <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:    </nova:port>
Jan 20 09:46:57 np0005588920 nova_compute[226886]:  </nova:ports>
Jan 20 09:46:57 np0005588920 nova_compute[226886]: </nova:instance>
Jan 20 09:46:57 np0005588920 nova_compute[226886]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 20 09:46:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:57.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:57 np0005588920 nova_compute[226886]: 2026-01-20 14:46:57.990 226890 INFO nova.network.neutron [None req-70bf110c-cb43-4482-9173-b96bc09dd0e6 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Port 67e337c5-f5f0-4e50-8fd6-77a02fb05273 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Jan 20 09:46:57 np0005588920 nova_compute[226886]: 2026-01-20 14:46:57.991 226890 DEBUG nova.network.neutron [None req-70bf110c-cb43-4482-9173-b96bc09dd0e6 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Updating instance_info_cache with network_info: [{"id": "fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1", "address": "fa:16:3e:80:88:4e", "network": {"id": "c59c8bba-9fc6-441e-8b7d-cd5444901b2a", "bridge": "br-int", "label": "tempest-TaggedAttachmentsTest-680871933-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7bbf722f17654404925cfb53e48cd473", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc4a2805-d7", "ovs_interfaceid": "fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:46:58 np0005588920 nova_compute[226886]: 2026-01-20 14:46:58.017 226890 DEBUG oslo_concurrency.lockutils [None req-70bf110c-cb43-4482-9173-b96bc09dd0e6 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Releasing lock "refresh_cache-6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:46:58 np0005588920 nova_compute[226886]: 2026-01-20 14:46:58.042 226890 DEBUG oslo_concurrency.lockutils [None req-70bf110c-cb43-4482-9173-b96bc09dd0e6 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Lock "interface-6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835-67e337c5-f5f0-4e50-8fd6-77a02fb05273" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 3.373s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:46:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:58.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:59 np0005588920 nova_compute[226886]: 2026-01-20 14:46:59.245 226890 DEBUG oslo_concurrency.lockutils [None req-c3408307-4ead-4159-a942-8e4a77aee88f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "75736b87-b14e-45b7-b43b-5129cf7d3279" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:46:59 np0005588920 nova_compute[226886]: 2026-01-20 14:46:59.246 226890 DEBUG oslo_concurrency.lockutils [None req-c3408307-4ead-4159-a942-8e4a77aee88f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:46:59 np0005588920 nova_compute[226886]: 2026-01-20 14:46:59.246 226890 INFO nova.compute.manager [None req-c3408307-4ead-4159-a942-8e4a77aee88f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Rebooting instance#033[00m
Jan 20 09:46:59 np0005588920 nova_compute[226886]: 2026-01-20 14:46:59.267 226890 DEBUG oslo_concurrency.lockutils [None req-c3408307-4ead-4159-a942-8e4a77aee88f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "refresh_cache-75736b87-b14e-45b7-b43b-5129cf7d3279" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:46:59 np0005588920 nova_compute[226886]: 2026-01-20 14:46:59.267 226890 DEBUG oslo_concurrency.lockutils [None req-c3408307-4ead-4159-a942-8e4a77aee88f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquired lock "refresh_cache-75736b87-b14e-45b7-b43b-5129cf7d3279" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:46:59 np0005588920 nova_compute[226886]: 2026-01-20 14:46:59.268 226890 DEBUG nova.network.neutron [None req-c3408307-4ead-4159-a942-8e4a77aee88f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:46:59 np0005588920 nova_compute[226886]: 2026-01-20 14:46:59.557 226890 DEBUG oslo_concurrency.lockutils [None req-260a4e92-3e35-4638-999b-317a8329a6d7 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Acquiring lock "6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:46:59 np0005588920 nova_compute[226886]: 2026-01-20 14:46:59.558 226890 DEBUG oslo_concurrency.lockutils [None req-260a4e92-3e35-4638-999b-317a8329a6d7 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Lock "6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:46:59 np0005588920 nova_compute[226886]: 2026-01-20 14:46:59.558 226890 DEBUG oslo_concurrency.lockutils [None req-260a4e92-3e35-4638-999b-317a8329a6d7 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Acquiring lock "6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:46:59 np0005588920 nova_compute[226886]: 2026-01-20 14:46:59.558 226890 DEBUG oslo_concurrency.lockutils [None req-260a4e92-3e35-4638-999b-317a8329a6d7 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Lock "6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:46:59 np0005588920 nova_compute[226886]: 2026-01-20 14:46:59.559 226890 DEBUG oslo_concurrency.lockutils [None req-260a4e92-3e35-4638-999b-317a8329a6d7 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Lock "6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:46:59 np0005588920 nova_compute[226886]: 2026-01-20 14:46:59.560 226890 INFO nova.compute.manager [None req-260a4e92-3e35-4638-999b-317a8329a6d7 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Terminating instance#033[00m
Jan 20 09:46:59 np0005588920 nova_compute[226886]: 2026-01-20 14:46:59.561 226890 DEBUG nova.compute.manager [None req-260a4e92-3e35-4638-999b-317a8329a6d7 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:46:59 np0005588920 kernel: tapfc4a2805-d7 (unregistering): left promiscuous mode
Jan 20 09:46:59 np0005588920 NetworkManager[49076]: <info>  [1768920419.6075] device (tapfc4a2805-d7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:46:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:46:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:59.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:59 np0005588920 ovn_controller[133971]: 2026-01-20T14:46:59Z|00400|binding|INFO|Releasing lport fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1 from this chassis (sb_readonly=0)
Jan 20 09:46:59 np0005588920 ovn_controller[133971]: 2026-01-20T14:46:59Z|00401|binding|INFO|Setting lport fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1 down in Southbound
Jan 20 09:46:59 np0005588920 nova_compute[226886]: 2026-01-20 14:46:59.671 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:59 np0005588920 ovn_controller[133971]: 2026-01-20T14:46:59Z|00402|binding|INFO|Removing iface tapfc4a2805-d7 ovn-installed in OVS
Jan 20 09:46:59 np0005588920 nova_compute[226886]: 2026-01-20 14:46:59.672 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:59.679 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:80:88:4e 10.100.0.9'], port_security=['fa:16:3e:80:88:4e 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c59c8bba-9fc6-441e-8b7d-cd5444901b2a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7bbf722f17654404925cfb53e48cd473', 'neutron:revision_number': '4', 'neutron:security_group_ids': '34f09647-0a97-406a-bb20-4a478cae9ceb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.245'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=becf0890-2062-451f-a3f9-626953c24d96, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:46:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:59.680 144128 INFO neutron.agent.ovn.metadata.agent [-] Port fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1 in datapath c59c8bba-9fc6-441e-8b7d-cd5444901b2a unbound from our chassis#033[00m
Jan 20 09:46:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:59.682 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c59c8bba-9fc6-441e-8b7d-cd5444901b2a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:46:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:59.683 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ee157563-1f87-4d01-9c45-9f1e9f7c4586]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:59.683 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c59c8bba-9fc6-441e-8b7d-cd5444901b2a namespace which is not needed anymore#033[00m
Jan 20 09:46:59 np0005588920 nova_compute[226886]: 2026-01-20 14:46:59.685 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:59 np0005588920 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d00000061.scope: Deactivated successfully.
Jan 20 09:46:59 np0005588920 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d00000061.scope: Consumed 14.345s CPU time.
Jan 20 09:46:59 np0005588920 systemd-machined[196121]: Machine qemu-42-instance-00000061 terminated.
Jan 20 09:46:59 np0005588920 kernel: tapfc4a2805-d7: entered promiscuous mode
Jan 20 09:46:59 np0005588920 kernel: tapfc4a2805-d7 (unregistering): left promiscuous mode
Jan 20 09:46:59 np0005588920 NetworkManager[49076]: <info>  [1768920419.7785] manager: (tapfc4a2805-d7): new Tun device (/org/freedesktop/NetworkManager/Devices/211)
Jan 20 09:46:59 np0005588920 ovn_controller[133971]: 2026-01-20T14:46:59Z|00403|binding|INFO|Claiming lport fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1 for this chassis.
Jan 20 09:46:59 np0005588920 nova_compute[226886]: 2026-01-20 14:46:59.781 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:59 np0005588920 ovn_controller[133971]: 2026-01-20T14:46:59Z|00404|binding|INFO|fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1: Claiming fa:16:3e:80:88:4e 10.100.0.9
Jan 20 09:46:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:59.793 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:80:88:4e 10.100.0.9'], port_security=['fa:16:3e:80:88:4e 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c59c8bba-9fc6-441e-8b7d-cd5444901b2a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7bbf722f17654404925cfb53e48cd473', 'neutron:revision_number': '4', 'neutron:security_group_ids': '34f09647-0a97-406a-bb20-4a478cae9ceb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.245'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=becf0890-2062-451f-a3f9-626953c24d96, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:46:59 np0005588920 nova_compute[226886]: 2026-01-20 14:46:59.795 226890 INFO nova.virt.libvirt.driver [-] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Instance destroyed successfully.#033[00m
Jan 20 09:46:59 np0005588920 nova_compute[226886]: 2026-01-20 14:46:59.796 226890 DEBUG nova.objects.instance [None req-260a4e92-3e35-4638-999b-317a8329a6d7 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Lazy-loading 'resources' on Instance uuid 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:46:59 np0005588920 ovn_controller[133971]: 2026-01-20T14:46:59Z|00405|binding|INFO|Setting lport fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1 ovn-installed in OVS
Jan 20 09:46:59 np0005588920 ovn_controller[133971]: 2026-01-20T14:46:59Z|00406|binding|INFO|Setting lport fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1 up in Southbound
Jan 20 09:46:59 np0005588920 ovn_controller[133971]: 2026-01-20T14:46:59Z|00407|binding|INFO|Releasing lport fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1 from this chassis (sb_readonly=1)
Jan 20 09:46:59 np0005588920 nova_compute[226886]: 2026-01-20 14:46:59.803 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:59 np0005588920 ovn_controller[133971]: 2026-01-20T14:46:59Z|00408|if_status|INFO|Dropped 9 log messages in last 406 seconds (most recently, 406 seconds ago) due to excessive rate
Jan 20 09:46:59 np0005588920 ovn_controller[133971]: 2026-01-20T14:46:59Z|00409|if_status|INFO|Not setting lport fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1 down as sb is readonly
Jan 20 09:46:59 np0005588920 ovn_controller[133971]: 2026-01-20T14:46:59Z|00410|binding|INFO|Removing iface tapfc4a2805-d7 ovn-installed in OVS
Jan 20 09:46:59 np0005588920 nova_compute[226886]: 2026-01-20 14:46:59.806 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:59 np0005588920 ovn_controller[133971]: 2026-01-20T14:46:59Z|00411|binding|INFO|Releasing lport fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1 from this chassis (sb_readonly=0)
Jan 20 09:46:59 np0005588920 ovn_controller[133971]: 2026-01-20T14:46:59Z|00412|binding|INFO|Setting lport fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1 down in Southbound
Jan 20 09:46:59 np0005588920 neutron-haproxy-ovnmeta-c59c8bba-9fc6-441e-8b7d-cd5444901b2a[260995]: [NOTICE]   (260999) : haproxy version is 2.8.14-c23fe91
Jan 20 09:46:59 np0005588920 neutron-haproxy-ovnmeta-c59c8bba-9fc6-441e-8b7d-cd5444901b2a[260995]: [NOTICE]   (260999) : path to executable is /usr/sbin/haproxy
Jan 20 09:46:59 np0005588920 neutron-haproxy-ovnmeta-c59c8bba-9fc6-441e-8b7d-cd5444901b2a[260995]: [WARNING]  (260999) : Exiting Master process...
Jan 20 09:46:59 np0005588920 neutron-haproxy-ovnmeta-c59c8bba-9fc6-441e-8b7d-cd5444901b2a[260995]: [ALERT]    (260999) : Current worker (261001) exited with code 143 (Terminated)
Jan 20 09:46:59 np0005588920 neutron-haproxy-ovnmeta-c59c8bba-9fc6-441e-8b7d-cd5444901b2a[260995]: [WARNING]  (260999) : All workers exited. Exiting... (0)
Jan 20 09:46:59 np0005588920 nova_compute[226886]: 2026-01-20 14:46:59.813 226890 DEBUG nova.virt.libvirt.vif [None req-260a4e92-3e35-4638-999b-317a8329a6d7 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:46:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-1302621797',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-1302621797',id=97,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOduM1b5U0PoRBxfWNx/s8WugTbFKTiIFgBHu9L46ctJZjGe+8jT4Yj1g5uDYe7bVMyBzUuTiVef/RKuMuEZx2XkjNuYbN4taGM1Lc0bPehwylAtyxjPpZUim+i9YC026A==',key_name='tempest-keypair-1946074746',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:46:22Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7bbf722f17654404925cfb53e48cd473',ramdisk_id='',reservation_id='r-d41vvz1v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TaggedAttachmentsTest-1005708480',owner_user_name='tempest-TaggedAttachmentsTest-1005708480-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:46:22Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='5cd9508688214bedb977528f8b6f95d1',uuid=6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1", "address": "fa:16:3e:80:88:4e", "network": {"id": "c59c8bba-9fc6-441e-8b7d-cd5444901b2a", "bridge": "br-int", "label": "tempest-TaggedAttachmentsTest-680871933-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7bbf722f17654404925cfb53e48cd473", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc4a2805-d7", "ovs_interfaceid": "fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:46:59 np0005588920 nova_compute[226886]: 2026-01-20 14:46:59.814 226890 DEBUG nova.network.os_vif_util [None req-260a4e92-3e35-4638-999b-317a8329a6d7 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Converting VIF {"id": "fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1", "address": "fa:16:3e:80:88:4e", "network": {"id": "c59c8bba-9fc6-441e-8b7d-cd5444901b2a", "bridge": "br-int", "label": "tempest-TaggedAttachmentsTest-680871933-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7bbf722f17654404925cfb53e48cd473", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc4a2805-d7", "ovs_interfaceid": "fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:46:59 np0005588920 nova_compute[226886]: 2026-01-20 14:46:59.814 226890 DEBUG nova.network.os_vif_util [None req-260a4e92-3e35-4638-999b-317a8329a6d7 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:80:88:4e,bridge_name='br-int',has_traffic_filtering=True,id=fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1,network=Network(c59c8bba-9fc6-441e-8b7d-cd5444901b2a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc4a2805-d7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:46:59 np0005588920 nova_compute[226886]: 2026-01-20 14:46:59.815 226890 DEBUG os_vif [None req-260a4e92-3e35-4638-999b-317a8329a6d7 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:80:88:4e,bridge_name='br-int',has_traffic_filtering=True,id=fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1,network=Network(c59c8bba-9fc6-441e-8b7d-cd5444901b2a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc4a2805-d7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:46:59 np0005588920 nova_compute[226886]: 2026-01-20 14:46:59.816 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:59 np0005588920 nova_compute[226886]: 2026-01-20 14:46:59.816 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfc4a2805-d7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:46:59 np0005588920 systemd[1]: libpod-20ad94eb1bba186f5acc2ddd5685c4c1f2deeb54ca503942417d552d6f52d866.scope: Deactivated successfully.
Jan 20 09:46:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:59.818 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:80:88:4e 10.100.0.9'], port_security=['fa:16:3e:80:88:4e 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c59c8bba-9fc6-441e-8b7d-cd5444901b2a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7bbf722f17654404925cfb53e48cd473', 'neutron:revision_number': '4', 'neutron:security_group_ids': '34f09647-0a97-406a-bb20-4a478cae9ceb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.245'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=becf0890-2062-451f-a3f9-626953c24d96, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:46:59 np0005588920 nova_compute[226886]: 2026-01-20 14:46:59.820 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:59 np0005588920 nova_compute[226886]: 2026-01-20 14:46:59.822 226890 INFO os_vif [None req-260a4e92-3e35-4638-999b-317a8329a6d7 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:80:88:4e,bridge_name='br-int',has_traffic_filtering=True,id=fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1,network=Network(c59c8bba-9fc6-441e-8b7d-cd5444901b2a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc4a2805-d7')#033[00m
Jan 20 09:46:59 np0005588920 podman[261619]: 2026-01-20 14:46:59.823428646 +0000 UTC m=+0.056989029 container died 20ad94eb1bba186f5acc2ddd5685c4c1f2deeb54ca503942417d552d6f52d866 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c59c8bba-9fc6-441e-8b7d-cd5444901b2a, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 20 09:46:59 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-20ad94eb1bba186f5acc2ddd5685c4c1f2deeb54ca503942417d552d6f52d866-userdata-shm.mount: Deactivated successfully.
Jan 20 09:46:59 np0005588920 systemd[1]: var-lib-containers-storage-overlay-7cd0c84eb320478cd48992671f02f9d7a597064f617d158a39e212c251bbcaba-merged.mount: Deactivated successfully.
Jan 20 09:46:59 np0005588920 podman[261619]: 2026-01-20 14:46:59.858053132 +0000 UTC m=+0.091613495 container cleanup 20ad94eb1bba186f5acc2ddd5685c4c1f2deeb54ca503942417d552d6f52d866 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c59c8bba-9fc6-441e-8b7d-cd5444901b2a, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 20 09:46:59 np0005588920 systemd[1]: libpod-conmon-20ad94eb1bba186f5acc2ddd5685c4c1f2deeb54ca503942417d552d6f52d866.scope: Deactivated successfully.
Jan 20 09:46:59 np0005588920 podman[261669]: 2026-01-20 14:46:59.922840078 +0000 UTC m=+0.041959281 container remove 20ad94eb1bba186f5acc2ddd5685c4c1f2deeb54ca503942417d552d6f52d866 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c59c8bba-9fc6-441e-8b7d-cd5444901b2a, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 20 09:46:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:59.928 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[9876d1c0-d326-4a03-830c-078dd5ed7cd9]: (4, ('Tue Jan 20 02:46:59 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-c59c8bba-9fc6-441e-8b7d-cd5444901b2a (20ad94eb1bba186f5acc2ddd5685c4c1f2deeb54ca503942417d552d6f52d866)\n20ad94eb1bba186f5acc2ddd5685c4c1f2deeb54ca503942417d552d6f52d866\nTue Jan 20 02:46:59 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-c59c8bba-9fc6-441e-8b7d-cd5444901b2a (20ad94eb1bba186f5acc2ddd5685c4c1f2deeb54ca503942417d552d6f52d866)\n20ad94eb1bba186f5acc2ddd5685c4c1f2deeb54ca503942417d552d6f52d866\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:59.930 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d308658e-1497-4232-be15-730347b0ac4d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:59.931 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc59c8bba-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:46:59 np0005588920 nova_compute[226886]: 2026-01-20 14:46:59.932 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:59 np0005588920 kernel: tapc59c8bba-90: left promiscuous mode
Jan 20 09:46:59 np0005588920 nova_compute[226886]: 2026-01-20 14:46:59.946 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:59.949 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[6e7bb49f-dd3b-4405-97d8-b19e6da72a3e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:59.962 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[dca6acf6-b50e-4c3a-be4c-d8a36dcf8170]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:59.963 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[4e72c6f7-9162-4f33-8922-f17f8b483ff0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:59.977 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f2b36f2e-d7a3-49ff-a224-9ffada0cf308]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 542293, 'reachable_time': 16380, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 261685, 'error': None, 'target': 'ovnmeta-c59c8bba-9fc6-441e-8b7d-cd5444901b2a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:59.979 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c59c8bba-9fc6-441e-8b7d-cd5444901b2a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:46:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:59.979 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[c1c785f6-f650-4b18-b25e-827bff8637c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:59.980 144128 INFO neutron.agent.ovn.metadata.agent [-] Port fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1 in datapath c59c8bba-9fc6-441e-8b7d-cd5444901b2a unbound from our chassis#033[00m
Jan 20 09:46:59 np0005588920 systemd[1]: run-netns-ovnmeta\x2dc59c8bba\x2d9fc6\x2d441e\x2d8b7d\x2dcd5444901b2a.mount: Deactivated successfully.
Jan 20 09:46:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:59.981 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c59c8bba-9fc6-441e-8b7d-cd5444901b2a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:46:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:59.982 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[fc531789-f76d-4097-8084-e3354b1a9d37]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:59.983 144128 INFO neutron.agent.ovn.metadata.agent [-] Port fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1 in datapath c59c8bba-9fc6-441e-8b7d-cd5444901b2a unbound from our chassis#033[00m
Jan 20 09:46:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:59.984 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c59c8bba-9fc6-441e-8b7d-cd5444901b2a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:46:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:46:59.984 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5b823116-f182-4e7b-bb57-052d987d2cc3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:00 np0005588920 nova_compute[226886]: 2026-01-20 14:47:00.018 226890 DEBUG nova.compute.manager [req-124c97ac-28ce-404a-9964-f656e34a7f92 req-629fb6b4-0766-4dfd-b366-204f71ddb4d1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Received event network-vif-unplugged-fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:47:00 np0005588920 nova_compute[226886]: 2026-01-20 14:47:00.018 226890 DEBUG oslo_concurrency.lockutils [req-124c97ac-28ce-404a-9964-f656e34a7f92 req-629fb6b4-0766-4dfd-b366-204f71ddb4d1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:47:00 np0005588920 nova_compute[226886]: 2026-01-20 14:47:00.019 226890 DEBUG oslo_concurrency.lockutils [req-124c97ac-28ce-404a-9964-f656e34a7f92 req-629fb6b4-0766-4dfd-b366-204f71ddb4d1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:47:00 np0005588920 nova_compute[226886]: 2026-01-20 14:47:00.019 226890 DEBUG oslo_concurrency.lockutils [req-124c97ac-28ce-404a-9964-f656e34a7f92 req-629fb6b4-0766-4dfd-b366-204f71ddb4d1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:47:00 np0005588920 nova_compute[226886]: 2026-01-20 14:47:00.019 226890 DEBUG nova.compute.manager [req-124c97ac-28ce-404a-9964-f656e34a7f92 req-629fb6b4-0766-4dfd-b366-204f71ddb4d1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] No waiting events found dispatching network-vif-unplugged-fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:47:00 np0005588920 nova_compute[226886]: 2026-01-20 14:47:00.019 226890 DEBUG nova.compute.manager [req-124c97ac-28ce-404a-9964-f656e34a7f92 req-629fb6b4-0766-4dfd-b366-204f71ddb4d1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Received event network-vif-unplugged-fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:47:00 np0005588920 nova_compute[226886]: 2026-01-20 14:47:00.179 226890 INFO nova.virt.libvirt.driver [None req-260a4e92-3e35-4638-999b-317a8329a6d7 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Deleting instance files /var/lib/nova/instances/6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835_del#033[00m
Jan 20 09:47:00 np0005588920 nova_compute[226886]: 2026-01-20 14:47:00.180 226890 INFO nova.virt.libvirt.driver [None req-260a4e92-3e35-4638-999b-317a8329a6d7 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Deletion of /var/lib/nova/instances/6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835_del complete#033[00m
Jan 20 09:47:00 np0005588920 nova_compute[226886]: 2026-01-20 14:47:00.233 226890 INFO nova.compute.manager [None req-260a4e92-3e35-4638-999b-317a8329a6d7 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Took 0.67 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:47:00 np0005588920 nova_compute[226886]: 2026-01-20 14:47:00.233 226890 DEBUG oslo.service.loopingcall [None req-260a4e92-3e35-4638-999b-317a8329a6d7 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:47:00 np0005588920 nova_compute[226886]: 2026-01-20 14:47:00.233 226890 DEBUG nova.compute.manager [-] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:47:00 np0005588920 nova_compute[226886]: 2026-01-20 14:47:00.234 226890 DEBUG nova.network.neutron [-] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:47:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:47:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:00.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:47:00 np0005588920 nova_compute[226886]: 2026-01-20 14:47:00.638 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:01 np0005588920 nova_compute[226886]: 2026-01-20 14:47:01.459 226890 DEBUG nova.network.neutron [-] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:47:01 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:01.465 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '30'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:47:01 np0005588920 nova_compute[226886]: 2026-01-20 14:47:01.478 226890 INFO nova.compute.manager [-] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Took 1.24 seconds to deallocate network for instance.#033[00m
Jan 20 09:47:01 np0005588920 nova_compute[226886]: 2026-01-20 14:47:01.528 226890 DEBUG oslo_concurrency.lockutils [None req-260a4e92-3e35-4638-999b-317a8329a6d7 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:47:01 np0005588920 nova_compute[226886]: 2026-01-20 14:47:01.529 226890 DEBUG oslo_concurrency.lockutils [None req-260a4e92-3e35-4638-999b-317a8329a6d7 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:47:01 np0005588920 nova_compute[226886]: 2026-01-20 14:47:01.581 226890 DEBUG nova.compute.manager [req-7df2b4ce-7e32-4e42-9208-988f99de8970 req-37b99f0b-ba23-441e-9806-43e0b5643f90 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Received event network-vif-deleted-fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:47:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:01.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:01 np0005588920 nova_compute[226886]: 2026-01-20 14:47:01.639 226890 DEBUG oslo_concurrency.processutils [None req-260a4e92-3e35-4638-999b-317a8329a6d7 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:47:01 np0005588920 nova_compute[226886]: 2026-01-20 14:47:01.994 226890 DEBUG nova.network.neutron [None req-c3408307-4ead-4159-a942-8e4a77aee88f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Updating instance_info_cache with network_info: [{"id": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "address": "fa:16:3e:22:f9:d2", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a9a684-c9", "ovs_interfaceid": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.017 226890 DEBUG oslo_concurrency.lockutils [None req-c3408307-4ead-4159-a942-8e4a77aee88f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Releasing lock "refresh_cache-75736b87-b14e-45b7-b43b-5129cf7d3279" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.019 226890 DEBUG nova.compute.manager [None req-c3408307-4ead-4159-a942-8e4a77aee88f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:47:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:47:02 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2640005168' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.067 226890 DEBUG oslo_concurrency.processutils [None req-260a4e92-3e35-4638-999b-317a8329a6d7 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.428s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.072 226890 DEBUG nova.compute.provider_tree [None req-260a4e92-3e35-4638-999b-317a8329a6d7 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.100 226890 DEBUG nova.compute.manager [req-b64f7cad-ccc2-4f50-8bb0-dab6874473d3 req-4fcb4e62-ca8d-4e4c-b348-e7736c127525 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Received event network-vif-plugged-fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.101 226890 DEBUG oslo_concurrency.lockutils [req-b64f7cad-ccc2-4f50-8bb0-dab6874473d3 req-4fcb4e62-ca8d-4e4c-b348-e7736c127525 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.101 226890 DEBUG oslo_concurrency.lockutils [req-b64f7cad-ccc2-4f50-8bb0-dab6874473d3 req-4fcb4e62-ca8d-4e4c-b348-e7736c127525 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.101 226890 DEBUG oslo_concurrency.lockutils [req-b64f7cad-ccc2-4f50-8bb0-dab6874473d3 req-4fcb4e62-ca8d-4e4c-b348-e7736c127525 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.102 226890 DEBUG nova.compute.manager [req-b64f7cad-ccc2-4f50-8bb0-dab6874473d3 req-4fcb4e62-ca8d-4e4c-b348-e7736c127525 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] No waiting events found dispatching network-vif-plugged-fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.102 226890 WARNING nova.compute.manager [req-b64f7cad-ccc2-4f50-8bb0-dab6874473d3 req-4fcb4e62-ca8d-4e4c-b348-e7736c127525 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Received unexpected event network-vif-plugged-fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.102 226890 DEBUG nova.compute.manager [req-b64f7cad-ccc2-4f50-8bb0-dab6874473d3 req-4fcb4e62-ca8d-4e4c-b348-e7736c127525 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Received event network-vif-plugged-fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.102 226890 DEBUG oslo_concurrency.lockutils [req-b64f7cad-ccc2-4f50-8bb0-dab6874473d3 req-4fcb4e62-ca8d-4e4c-b348-e7736c127525 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.102 226890 DEBUG oslo_concurrency.lockutils [req-b64f7cad-ccc2-4f50-8bb0-dab6874473d3 req-4fcb4e62-ca8d-4e4c-b348-e7736c127525 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.102 226890 DEBUG oslo_concurrency.lockutils [req-b64f7cad-ccc2-4f50-8bb0-dab6874473d3 req-4fcb4e62-ca8d-4e4c-b348-e7736c127525 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.102 226890 DEBUG nova.compute.manager [req-b64f7cad-ccc2-4f50-8bb0-dab6874473d3 req-4fcb4e62-ca8d-4e4c-b348-e7736c127525 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] No waiting events found dispatching network-vif-plugged-fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.103 226890 WARNING nova.compute.manager [req-b64f7cad-ccc2-4f50-8bb0-dab6874473d3 req-4fcb4e62-ca8d-4e4c-b348-e7736c127525 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Received unexpected event network-vif-plugged-fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.103 226890 DEBUG nova.compute.manager [req-b64f7cad-ccc2-4f50-8bb0-dab6874473d3 req-4fcb4e62-ca8d-4e4c-b348-e7736c127525 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Received event network-vif-plugged-fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.103 226890 DEBUG oslo_concurrency.lockutils [req-b64f7cad-ccc2-4f50-8bb0-dab6874473d3 req-4fcb4e62-ca8d-4e4c-b348-e7736c127525 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.103 226890 DEBUG oslo_concurrency.lockutils [req-b64f7cad-ccc2-4f50-8bb0-dab6874473d3 req-4fcb4e62-ca8d-4e4c-b348-e7736c127525 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.103 226890 DEBUG oslo_concurrency.lockutils [req-b64f7cad-ccc2-4f50-8bb0-dab6874473d3 req-4fcb4e62-ca8d-4e4c-b348-e7736c127525 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.103 226890 DEBUG nova.compute.manager [req-b64f7cad-ccc2-4f50-8bb0-dab6874473d3 req-4fcb4e62-ca8d-4e4c-b348-e7736c127525 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] No waiting events found dispatching network-vif-plugged-fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.103 226890 WARNING nova.compute.manager [req-b64f7cad-ccc2-4f50-8bb0-dab6874473d3 req-4fcb4e62-ca8d-4e4c-b348-e7736c127525 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Received unexpected event network-vif-plugged-fc4a2805-d733-4cb9-9ce9-3d4891a1cbc1 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.105 226890 DEBUG nova.scheduler.client.report [None req-260a4e92-3e35-4638-999b-317a8329a6d7 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.127 226890 DEBUG oslo_concurrency.lockutils [None req-260a4e92-3e35-4638-999b-317a8329a6d7 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.597s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.163 226890 INFO nova.scheduler.client.report [None req-260a4e92-3e35-4638-999b-317a8329a6d7 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Deleted allocations for instance 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835#033[00m
Jan 20 09:47:02 np0005588920 kernel: tapd3a9a684-c9 (unregistering): left promiscuous mode
Jan 20 09:47:02 np0005588920 NetworkManager[49076]: <info>  [1768920422.1713] device (tapd3a9a684-c9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:47:02 np0005588920 ovn_controller[133971]: 2026-01-20T14:47:02Z|00413|binding|INFO|Releasing lport d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 from this chassis (sb_readonly=0)
Jan 20 09:47:02 np0005588920 ovn_controller[133971]: 2026-01-20T14:47:02Z|00414|binding|INFO|Setting lport d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 down in Southbound
Jan 20 09:47:02 np0005588920 ovn_controller[133971]: 2026-01-20T14:47:02Z|00415|binding|INFO|Removing iface tapd3a9a684-c9 ovn-installed in OVS
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.183 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.186 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:02.193 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:22:f9:d2 10.100.0.4'], port_security=['fa:16:3e:22:f9:d2 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '75736b87-b14e-45b7-b43b-5129cf7d3279', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-762e1859-4db4-4d9e-b66f-d50316f80df4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1c5f03d46c0c4162a3b2f1530850bb6c', 'neutron:revision_number': '8', 'neutron:security_group_ids': '80535eda-fa59-4edc-8e3d-9bfea6517730', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.180', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2474a8ca-bb96-4cae-9133-23419b81a9fc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:47:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:02.194 144128 INFO neutron.agent.ovn.metadata.agent [-] Port d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 in datapath 762e1859-4db4-4d9e-b66f-d50316f80df4 unbound from our chassis#033[00m
Jan 20 09:47:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:02.196 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 762e1859-4db4-4d9e-b66f-d50316f80df4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:47:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:02.196 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c1eedf86-c510-4aa0-a654-de8074d67521]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:02.197 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 namespace which is not needed anymore#033[00m
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.203 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:02 np0005588920 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d0000005e.scope: Deactivated successfully.
Jan 20 09:47:02 np0005588920 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d0000005e.scope: Consumed 14.194s CPU time.
Jan 20 09:47:02 np0005588920 systemd-machined[196121]: Machine qemu-43-instance-0000005e terminated.
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.257 226890 DEBUG oslo_concurrency.lockutils [None req-260a4e92-3e35-4638-999b-317a8329a6d7 5cd9508688214bedb977528f8b6f95d1 7bbf722f17654404925cfb53e48cd473 - - default default] Lock "6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.699s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:47:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e238 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:47:02 np0005588920 NetworkManager[49076]: <info>  [1768920422.3356] manager: (tapd3a9a684-c9): new Tun device (/org/freedesktop/NetworkManager/Devices/212)
Jan 20 09:47:02 np0005588920 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[261288]: [NOTICE]   (261294) : haproxy version is 2.8.14-c23fe91
Jan 20 09:47:02 np0005588920 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[261288]: [NOTICE]   (261294) : path to executable is /usr/sbin/haproxy
Jan 20 09:47:02 np0005588920 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[261288]: [ALERT]    (261294) : Current worker (261297) exited with code 143 (Terminated)
Jan 20 09:47:02 np0005588920 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[261288]: [WARNING]  (261294) : All workers exited. Exiting... (0)
Jan 20 09:47:02 np0005588920 systemd[1]: libpod-24713f5a8234981a08e67f062731cc7923d4a83a0090be88645677080a684ecb.scope: Deactivated successfully.
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.354 226890 INFO nova.virt.libvirt.driver [-] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Instance destroyed successfully.#033[00m
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.355 226890 DEBUG nova.objects.instance [None req-c3408307-4ead-4159-a942-8e4a77aee88f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'resources' on Instance uuid 75736b87-b14e-45b7-b43b-5129cf7d3279 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:47:02 np0005588920 podman[261730]: 2026-01-20 14:47:02.356607751 +0000 UTC m=+0.067671237 container died 24713f5a8234981a08e67f062731cc7923d4a83a0090be88645677080a684ecb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.370 226890 DEBUG nova.virt.libvirt.vif [None req-c3408307-4ead-4159-a942-8e4a77aee88f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:45:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1202945337',display_name='tempest-ServerActionsTestJSON-server-1202945337',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1202945337',id=94,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLgEzx5mLsSqRL7L9WKOzM+WdeJ40U103wY9H3VMZ41G/sN5tQtQSt9lXWKTyc6pt00bfKD0E9GPugNMpy+dzSSpK23o3CgadkzAfAvjQCeCPOSp3fX13FGApomGd1HRCQ==',key_name='tempest-keypair-1602241722',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:45:21Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1c5f03d46c0c4162a3b2f1530850bb6c',ramdisk_id='',reservation_id='r-luaqa362',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1020442335',owner_user_name='tempest-ServerActionsTestJSON-1020442335-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:47:02Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e9278fdb9e645b7938f3edb20c4d3cf',uuid=75736b87-b14e-45b7-b43b-5129cf7d3279,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "address": "fa:16:3e:22:f9:d2", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": 
[], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a9a684-c9", "ovs_interfaceid": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.370 226890 DEBUG nova.network.os_vif_util [None req-c3408307-4ead-4159-a942-8e4a77aee88f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converting VIF {"id": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "address": "fa:16:3e:22:f9:d2", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a9a684-c9", "ovs_interfaceid": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.371 226890 DEBUG nova.network.os_vif_util [None req-c3408307-4ead-4159-a942-8e4a77aee88f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:22:f9:d2,bridge_name='br-int',has_traffic_filtering=True,id=d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3a9a684-c9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.371 226890 DEBUG os_vif [None req-c3408307-4ead-4159-a942-8e4a77aee88f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:22:f9:d2,bridge_name='br-int',has_traffic_filtering=True,id=d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3a9a684-c9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.373 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.373 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd3a9a684-c9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.376 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.378 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.381 226890 INFO os_vif [None req-c3408307-4ead-4159-a942-8e4a77aee88f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:22:f9:d2,bridge_name='br-int',has_traffic_filtering=True,id=d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3a9a684-c9')#033[00m
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.388 226890 DEBUG nova.virt.libvirt.driver [None req-c3408307-4ead-4159-a942-8e4a77aee88f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Start _get_guest_xml network_info=[{"id": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "address": "fa:16:3e:22:f9:d2", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a9a684-c9", "ovs_interfaceid": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.392 226890 WARNING nova.virt.libvirt.driver [None req-c3408307-4ead-4159-a942-8e4a77aee88f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.396 226890 DEBUG nova.virt.libvirt.host [None req-c3408307-4ead-4159-a942-8e4a77aee88f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.398 226890 DEBUG nova.virt.libvirt.host [None req-c3408307-4ead-4159-a942-8e4a77aee88f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.401 226890 DEBUG nova.virt.libvirt.host [None req-c3408307-4ead-4159-a942-8e4a77aee88f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.402 226890 DEBUG nova.virt.libvirt.host [None req-c3408307-4ead-4159-a942-8e4a77aee88f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.403 226890 DEBUG nova.virt.libvirt.driver [None req-c3408307-4ead-4159-a942-8e4a77aee88f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.403 226890 DEBUG nova.virt.hardware [None req-c3408307-4ead-4159-a942-8e4a77aee88f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.404 226890 DEBUG nova.virt.hardware [None req-c3408307-4ead-4159-a942-8e4a77aee88f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.404 226890 DEBUG nova.virt.hardware [None req-c3408307-4ead-4159-a942-8e4a77aee88f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.404 226890 DEBUG nova.virt.hardware [None req-c3408307-4ead-4159-a942-8e4a77aee88f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.404 226890 DEBUG nova.virt.hardware [None req-c3408307-4ead-4159-a942-8e4a77aee88f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.404 226890 DEBUG nova.virt.hardware [None req-c3408307-4ead-4159-a942-8e4a77aee88f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.404 226890 DEBUG nova.virt.hardware [None req-c3408307-4ead-4159-a942-8e4a77aee88f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.405 226890 DEBUG nova.virt.hardware [None req-c3408307-4ead-4159-a942-8e4a77aee88f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.405 226890 DEBUG nova.virt.hardware [None req-c3408307-4ead-4159-a942-8e4a77aee88f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.405 226890 DEBUG nova.virt.hardware [None req-c3408307-4ead-4159-a942-8e4a77aee88f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.405 226890 DEBUG nova.virt.hardware [None req-c3408307-4ead-4159-a942-8e4a77aee88f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:47:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.405 226890 DEBUG nova.objects.instance [None req-c3408307-4ead-4159-a942-8e4a77aee88f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'vcpu_model' on Instance uuid 75736b87-b14e-45b7-b43b-5129cf7d3279 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:47:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:02.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:02 np0005588920 nova_compute[226886]: 2026-01-20 14:47:02.423 226890 DEBUG oslo_concurrency.processutils [None req-c3408307-4ead-4159-a942-8e4a77aee88f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:47:02 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-24713f5a8234981a08e67f062731cc7923d4a83a0090be88645677080a684ecb-userdata-shm.mount: Deactivated successfully.
Jan 20 09:47:02 np0005588920 systemd[1]: var-lib-containers-storage-overlay-47062e0df0cfa1d0ea402d5c1307c6fe84cf2a90848f6933450d9fe1e2b4a2ef-merged.mount: Deactivated successfully.
Jan 20 09:47:03 np0005588920 podman[261730]: 2026-01-20 14:47:03.198002166 +0000 UTC m=+0.909065662 container cleanup 24713f5a8234981a08e67f062731cc7923d4a83a0090be88645677080a684ecb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 20 09:47:03 np0005588920 systemd[1]: libpod-conmon-24713f5a8234981a08e67f062731cc7923d4a83a0090be88645677080a684ecb.scope: Deactivated successfully.
Jan 20 09:47:03 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:47:03 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1489706811' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:47:03 np0005588920 nova_compute[226886]: 2026-01-20 14:47:03.538 226890 DEBUG oslo_concurrency.processutils [None req-c3408307-4ead-4159-a942-8e4a77aee88f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.115s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:47:03 np0005588920 nova_compute[226886]: 2026-01-20 14:47:03.580 226890 DEBUG oslo_concurrency.processutils [None req-c3408307-4ead-4159-a942-8e4a77aee88f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:47:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:03.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:03 np0005588920 podman[261791]: 2026-01-20 14:47:03.814973154 +0000 UTC m=+0.570621297 container remove 24713f5a8234981a08e67f062731cc7923d4a83a0090be88645677080a684ecb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 20 09:47:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:03.825 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c067a84c-b98b-4545-974c-5f7a179ce992]: (4, ('Tue Jan 20 02:47:02 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 (24713f5a8234981a08e67f062731cc7923d4a83a0090be88645677080a684ecb)\n24713f5a8234981a08e67f062731cc7923d4a83a0090be88645677080a684ecb\nTue Jan 20 02:47:03 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 (24713f5a8234981a08e67f062731cc7923d4a83a0090be88645677080a684ecb)\n24713f5a8234981a08e67f062731cc7923d4a83a0090be88645677080a684ecb\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:03.828 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[4f89c80e-c435-43ba-a59d-7d5ec001f9af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:03.829 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap762e1859-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:47:03 np0005588920 kernel: tap762e1859-40: left promiscuous mode
Jan 20 09:47:03 np0005588920 nova_compute[226886]: 2026-01-20 14:47:03.832 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:03.841 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c6d138d8-296e-4944-99f2-c32f790c0bcd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:03 np0005588920 nova_compute[226886]: 2026-01-20 14:47:03.856 226890 DEBUG nova.compute.manager [req-7a2ddcf6-5623-4776-87c9-27fe59271610 req-9d1e5588-1a05-4278-95db-cbef75ae4355 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Received event network-vif-unplugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:47:03 np0005588920 nova_compute[226886]: 2026-01-20 14:47:03.857 226890 DEBUG oslo_concurrency.lockutils [req-7a2ddcf6-5623-4776-87c9-27fe59271610 req-9d1e5588-1a05-4278-95db-cbef75ae4355 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:47:03 np0005588920 nova_compute[226886]: 2026-01-20 14:47:03.857 226890 DEBUG oslo_concurrency.lockutils [req-7a2ddcf6-5623-4776-87c9-27fe59271610 req-9d1e5588-1a05-4278-95db-cbef75ae4355 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:47:03 np0005588920 nova_compute[226886]: 2026-01-20 14:47:03.857 226890 DEBUG oslo_concurrency.lockutils [req-7a2ddcf6-5623-4776-87c9-27fe59271610 req-9d1e5588-1a05-4278-95db-cbef75ae4355 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:47:03 np0005588920 nova_compute[226886]: 2026-01-20 14:47:03.858 226890 DEBUG nova.compute.manager [req-7a2ddcf6-5623-4776-87c9-27fe59271610 req-9d1e5588-1a05-4278-95db-cbef75ae4355 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] No waiting events found dispatching network-vif-unplugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:47:03 np0005588920 nova_compute[226886]: 2026-01-20 14:47:03.858 226890 WARNING nova.compute.manager [req-7a2ddcf6-5623-4776-87c9-27fe59271610 req-9d1e5588-1a05-4278-95db-cbef75ae4355 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Received unexpected event network-vif-unplugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 for instance with vm_state active and task_state reboot_started_hard.#033[00m
Jan 20 09:47:03 np0005588920 nova_compute[226886]: 2026-01-20 14:47:03.858 226890 DEBUG nova.compute.manager [req-7a2ddcf6-5623-4776-87c9-27fe59271610 req-9d1e5588-1a05-4278-95db-cbef75ae4355 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Received event network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:47:03 np0005588920 nova_compute[226886]: 2026-01-20 14:47:03.858 226890 DEBUG oslo_concurrency.lockutils [req-7a2ddcf6-5623-4776-87c9-27fe59271610 req-9d1e5588-1a05-4278-95db-cbef75ae4355 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:47:03 np0005588920 nova_compute[226886]: 2026-01-20 14:47:03.858 226890 DEBUG oslo_concurrency.lockutils [req-7a2ddcf6-5623-4776-87c9-27fe59271610 req-9d1e5588-1a05-4278-95db-cbef75ae4355 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:47:03 np0005588920 nova_compute[226886]: 2026-01-20 14:47:03.859 226890 DEBUG oslo_concurrency.lockutils [req-7a2ddcf6-5623-4776-87c9-27fe59271610 req-9d1e5588-1a05-4278-95db-cbef75ae4355 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:47:03 np0005588920 nova_compute[226886]: 2026-01-20 14:47:03.859 226890 DEBUG nova.compute.manager [req-7a2ddcf6-5623-4776-87c9-27fe59271610 req-9d1e5588-1a05-4278-95db-cbef75ae4355 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] No waiting events found dispatching network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:47:03 np0005588920 nova_compute[226886]: 2026-01-20 14:47:03.859 226890 WARNING nova.compute.manager [req-7a2ddcf6-5623-4776-87c9-27fe59271610 req-9d1e5588-1a05-4278-95db-cbef75ae4355 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Received unexpected event network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 for instance with vm_state active and task_state reboot_started_hard.#033[00m
Jan 20 09:47:03 np0005588920 nova_compute[226886]: 2026-01-20 14:47:03.859 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:03.868 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b58126da-c9d6-46ec-9a2a-c7d8490190e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:03.870 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[4de43b24-df6f-4c63-b067-dc9c349984bd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:03.886 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[23e3b599-e2e4-4afe-88ce-6f676a7bf93e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 543096, 'reachable_time': 21471, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 261844, 'error': None, 'target': 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:03.889 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:47:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:03.889 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[0e3df073-46ba-4b8c-84f0-32cc3dc99bc4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:03 np0005588920 systemd[1]: run-netns-ovnmeta\x2d762e1859\x2d4db4\x2d4d9e\x2db66f\x2dd50316f80df4.mount: Deactivated successfully.
Jan 20 09:47:04 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:47:04 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3517147788' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:47:04 np0005588920 nova_compute[226886]: 2026-01-20 14:47:04.078 226890 DEBUG oslo_concurrency.processutils [None req-c3408307-4ead-4159-a942-8e4a77aee88f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.499s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:47:04 np0005588920 nova_compute[226886]: 2026-01-20 14:47:04.080 226890 DEBUG nova.virt.libvirt.vif [None req-c3408307-4ead-4159-a942-8e4a77aee88f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:45:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1202945337',display_name='tempest-ServerActionsTestJSON-server-1202945337',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1202945337',id=94,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLgEzx5mLsSqRL7L9WKOzM+WdeJ40U103wY9H3VMZ41G/sN5tQtQSt9lXWKTyc6pt00bfKD0E9GPugNMpy+dzSSpK23o3CgadkzAfAvjQCeCPOSp3fX13FGApomGd1HRCQ==',key_name='tempest-keypair-1602241722',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:45:21Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1c5f03d46c0c4162a3b2f1530850bb6c',ramdisk_id='',reservation_id='r-luaqa362',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1020442335',owner_user_name='tempest-ServerActionsTestJSON-1020442335-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:47:02Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e9278fdb9e645b7938f3edb20c4d3cf',uuid=75736b87-b14e-45b7-b43b-5129cf7d3279,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "address": "fa:16:3e:22:f9:d2", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a9a684-c9", "ovs_interfaceid": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:47:04 np0005588920 nova_compute[226886]: 2026-01-20 14:47:04.080 226890 DEBUG nova.network.os_vif_util [None req-c3408307-4ead-4159-a942-8e4a77aee88f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converting VIF {"id": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "address": "fa:16:3e:22:f9:d2", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a9a684-c9", "ovs_interfaceid": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:47:04 np0005588920 nova_compute[226886]: 2026-01-20 14:47:04.082 226890 DEBUG nova.network.os_vif_util [None req-c3408307-4ead-4159-a942-8e4a77aee88f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:22:f9:d2,bridge_name='br-int',has_traffic_filtering=True,id=d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3a9a684-c9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:47:04 np0005588920 nova_compute[226886]: 2026-01-20 14:47:04.083 226890 DEBUG nova.objects.instance [None req-c3408307-4ead-4159-a942-8e4a77aee88f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'pci_devices' on Instance uuid 75736b87-b14e-45b7-b43b-5129cf7d3279 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:47:04 np0005588920 nova_compute[226886]: 2026-01-20 14:47:04.105 226890 DEBUG nova.virt.libvirt.driver [None req-c3408307-4ead-4159-a942-8e4a77aee88f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:47:04 np0005588920 nova_compute[226886]:  <uuid>75736b87-b14e-45b7-b43b-5129cf7d3279</uuid>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:  <name>instance-0000005e</name>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:47:04 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:      <nova:name>tempest-ServerActionsTestJSON-server-1202945337</nova:name>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:47:02</nova:creationTime>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:47:04 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:        <nova:user uuid="3e9278fdb9e645b7938f3edb20c4d3cf">tempest-ServerActionsTestJSON-1020442335-project-member</nova:user>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:        <nova:project uuid="1c5f03d46c0c4162a3b2f1530850bb6c">tempest-ServerActionsTestJSON-1020442335</nova:project>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:        <nova:port uuid="d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6">
Jan 20 09:47:04 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:      <entry name="serial">75736b87-b14e-45b7-b43b-5129cf7d3279</entry>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:      <entry name="uuid">75736b87-b14e-45b7-b43b-5129cf7d3279</entry>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:47:04 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/75736b87-b14e-45b7-b43b-5129cf7d3279_disk">
Jan 20 09:47:04 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:47:04 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:47:04 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/75736b87-b14e-45b7-b43b-5129cf7d3279_disk.config">
Jan 20 09:47:04 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:47:04 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:47:04 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:22:f9:d2"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:      <target dev="tapd3a9a684-c9"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:47:04 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/75736b87-b14e-45b7-b43b-5129cf7d3279/console.log" append="off"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    <input type="keyboard" bus="usb"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:47:04 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:47:04 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:47:04 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:47:04 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:47:04 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:47:04 np0005588920 nova_compute[226886]: 2026-01-20 14:47:04.105 226890 DEBUG nova.virt.libvirt.driver [None req-c3408307-4ead-4159-a942-8e4a77aee88f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] skipping disk for instance-0000005e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:47:04 np0005588920 nova_compute[226886]: 2026-01-20 14:47:04.106 226890 DEBUG nova.virt.libvirt.driver [None req-c3408307-4ead-4159-a942-8e4a77aee88f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] skipping disk for instance-0000005e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:47:04 np0005588920 nova_compute[226886]: 2026-01-20 14:47:04.107 226890 DEBUG nova.virt.libvirt.vif [None req-c3408307-4ead-4159-a942-8e4a77aee88f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:45:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1202945337',display_name='tempest-ServerActionsTestJSON-server-1202945337',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1202945337',id=94,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLgEzx5mLsSqRL7L9WKOzM+WdeJ40U103wY9H3VMZ41G/sN5tQtQSt9lXWKTyc6pt00bfKD0E9GPugNMpy+dzSSpK23o3CgadkzAfAvjQCeCPOSp3fX13FGApomGd1HRCQ==',key_name='tempest-keypair-1602241722',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:45:21Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=1,progress=0,project_id='1c5f03d46c0c4162a3b2f1530850bb6c',ramdisk_id='',reservation_id='r-luaqa362',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1020442335',owner_user_name='tempest-ServerActionsTestJSON-1020442335-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:47:02Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e9278fdb9e645b7938f3edb20c4d3cf',uuid=75736b87-b14e-45b7-b43b-5129cf7d3279,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "address": "fa:16:3e:22:f9:d2", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a9a684-c9", "ovs_interfaceid": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:47:04 np0005588920 nova_compute[226886]: 2026-01-20 14:47:04.107 226890 DEBUG nova.network.os_vif_util [None req-c3408307-4ead-4159-a942-8e4a77aee88f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converting VIF {"id": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "address": "fa:16:3e:22:f9:d2", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a9a684-c9", "ovs_interfaceid": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:47:04 np0005588920 nova_compute[226886]: 2026-01-20 14:47:04.107 226890 DEBUG nova.network.os_vif_util [None req-c3408307-4ead-4159-a942-8e4a77aee88f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:22:f9:d2,bridge_name='br-int',has_traffic_filtering=True,id=d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3a9a684-c9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:47:04 np0005588920 nova_compute[226886]: 2026-01-20 14:47:04.108 226890 DEBUG os_vif [None req-c3408307-4ead-4159-a942-8e4a77aee88f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:22:f9:d2,bridge_name='br-int',has_traffic_filtering=True,id=d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3a9a684-c9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:47:04 np0005588920 nova_compute[226886]: 2026-01-20 14:47:04.108 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:04 np0005588920 nova_compute[226886]: 2026-01-20 14:47:04.109 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:47:04 np0005588920 nova_compute[226886]: 2026-01-20 14:47:04.109 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:47:04 np0005588920 nova_compute[226886]: 2026-01-20 14:47:04.111 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:04 np0005588920 nova_compute[226886]: 2026-01-20 14:47:04.111 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd3a9a684-c9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:47:04 np0005588920 nova_compute[226886]: 2026-01-20 14:47:04.111 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd3a9a684-c9, col_values=(('external_ids', {'iface-id': 'd3a9a684-c9a7-4abc-a085-9dcd17bfc2e6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:22:f9:d2', 'vm-uuid': '75736b87-b14e-45b7-b43b-5129cf7d3279'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:47:04 np0005588920 nova_compute[226886]: 2026-01-20 14:47:04.113 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:04 np0005588920 NetworkManager[49076]: <info>  [1768920424.1140] manager: (tapd3a9a684-c9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/213)
Jan 20 09:47:04 np0005588920 nova_compute[226886]: 2026-01-20 14:47:04.115 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:47:04 np0005588920 nova_compute[226886]: 2026-01-20 14:47:04.119 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:04 np0005588920 nova_compute[226886]: 2026-01-20 14:47:04.120 226890 INFO os_vif [None req-c3408307-4ead-4159-a942-8e4a77aee88f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:22:f9:d2,bridge_name='br-int',has_traffic_filtering=True,id=d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3a9a684-c9')#033[00m
Jan 20 09:47:04 np0005588920 kernel: tapd3a9a684-c9: entered promiscuous mode
Jan 20 09:47:04 np0005588920 systemd-udevd[261845]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:47:04 np0005588920 NetworkManager[49076]: <info>  [1768920424.3075] manager: (tapd3a9a684-c9): new Tun device (/org/freedesktop/NetworkManager/Devices/214)
Jan 20 09:47:04 np0005588920 ovn_controller[133971]: 2026-01-20T14:47:04Z|00416|binding|INFO|Claiming lport d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 for this chassis.
Jan 20 09:47:04 np0005588920 ovn_controller[133971]: 2026-01-20T14:47:04Z|00417|binding|INFO|d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6: Claiming fa:16:3e:22:f9:d2 10.100.0.4
Jan 20 09:47:04 np0005588920 nova_compute[226886]: 2026-01-20 14:47:04.306 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:04 np0005588920 NetworkManager[49076]: <info>  [1768920424.3163] device (tapd3a9a684-c9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:47:04 np0005588920 NetworkManager[49076]: <info>  [1768920424.3167] device (tapd3a9a684-c9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:47:04 np0005588920 ovn_controller[133971]: 2026-01-20T14:47:04Z|00418|binding|INFO|Setting lport d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 ovn-installed in OVS
Jan 20 09:47:04 np0005588920 ovn_controller[133971]: 2026-01-20T14:47:04Z|00419|binding|INFO|Setting lport d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 up in Southbound
Jan 20 09:47:04 np0005588920 nova_compute[226886]: 2026-01-20 14:47:04.321 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:04.323 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:22:f9:d2 10.100.0.4'], port_security=['fa:16:3e:22:f9:d2 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '75736b87-b14e-45b7-b43b-5129cf7d3279', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-762e1859-4db4-4d9e-b66f-d50316f80df4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1c5f03d46c0c4162a3b2f1530850bb6c', 'neutron:revision_number': '9', 'neutron:security_group_ids': '80535eda-fa59-4edc-8e3d-9bfea6517730', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.180'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2474a8ca-bb96-4cae-9133-23419b81a9fc, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:04.325 144128 INFO neutron.agent.ovn.metadata.agent [-] Port d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 in datapath 762e1859-4db4-4d9e-b66f-d50316f80df4 bound to our chassis#033[00m
Jan 20 09:47:04 np0005588920 nova_compute[226886]: 2026-01-20 14:47:04.327 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:04.328 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 762e1859-4db4-4d9e-b66f-d50316f80df4#033[00m
Jan 20 09:47:04 np0005588920 systemd-machined[196121]: New machine qemu-44-instance-0000005e.
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:04.339 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a96b0d1f-3325-4789-b2ad-d317b1c1e38c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:04.340 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap762e1859-41 in ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:04.342 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap762e1859-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:04.342 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c30d3354-cc6c-4eca-89d3-90c6ce8dbeec]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:04.343 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c76743e9-fba4-4a48-8df1-87575027376c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:04.353 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[900a0ef2-59cd-464b-b83e-cfcfff46aeba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:04 np0005588920 systemd[1]: Started Virtual Machine qemu-44-instance-0000005e.
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:04.375 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[372c1603-af95-401b-961b-33b18e8fe02a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:04.402 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[3255c08d-0790-4809-8db3-4fbfc10b4c04]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:04.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:04 np0005588920 NetworkManager[49076]: <info>  [1768920424.4110] manager: (tap762e1859-40): new Veth device (/org/freedesktop/NetworkManager/Devices/215)
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:04.410 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[98fe49c4-7724-4a76-8da2-60aac4d9c543]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:04.445 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[debc1d9f-0566-42c6-aa4c-e2fbc24321ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:04.448 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[6ef4b6ca-6179-4853-9208-64a0130050aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:04 np0005588920 NetworkManager[49076]: <info>  [1768920424.4690] device (tap762e1859-40): carrier: link connected
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:04.475 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[ff8109b6-4b72-4646-9c23-97fa3d742d6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:04.490 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[34250c27-107b-4511-a54e-3be9c052b549]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap762e1859-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:69:f1:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 135], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 546615, 'reachable_time': 44908, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 261899, 'error': None, 'target': 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:04.503 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[17b79cf1-155e-4c6c-b16d-f2f045083741]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe69:f1da'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 546615, 'tstamp': 546615}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 261900, 'error': None, 'target': 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:04.518 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2b8b533a-15a9-4aaa-844b-006931c1ad9c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap762e1859-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:69:f1:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 135], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 546615, 'reachable_time': 44908, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 261901, 'error': None, 'target': 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:04.546 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[fae75645-1005-4dc2-bb29-4bf0e83811a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:04.608 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[44a639ff-a96d-4f07-83fc-ad23dd915ff0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:04.609 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap762e1859-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:04.609 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:04.609 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap762e1859-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:47:04 np0005588920 nova_compute[226886]: 2026-01-20 14:47:04.611 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:04 np0005588920 kernel: tap762e1859-40: entered promiscuous mode
Jan 20 09:47:04 np0005588920 NetworkManager[49076]: <info>  [1768920424.6119] manager: (tap762e1859-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/216)
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:04.617 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap762e1859-40, col_values=(('external_ids', {'iface-id': '9e775c45-1646-436d-a0cb-a5b5ec356e1b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:47:04 np0005588920 ovn_controller[133971]: 2026-01-20T14:47:04Z|00420|binding|INFO|Releasing lport 9e775c45-1646-436d-a0cb-a5b5ec356e1b from this chassis (sb_readonly=0)
Jan 20 09:47:04 np0005588920 nova_compute[226886]: 2026-01-20 14:47:04.618 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:04.621 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/762e1859-4db4-4d9e-b66f-d50316f80df4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/762e1859-4db4-4d9e-b66f-d50316f80df4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:04.622 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[10fe5d7d-ba3e-4d1a-9277-f76d968e5f08]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:04.622 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-762e1859-4db4-4d9e-b66f-d50316f80df4
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/762e1859-4db4-4d9e-b66f-d50316f80df4.pid.haproxy
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID 762e1859-4db4-4d9e-b66f-d50316f80df4
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:47:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:04.623 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'env', 'PROCESS_TAG=haproxy-762e1859-4db4-4d9e-b66f-d50316f80df4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/762e1859-4db4-4d9e-b66f-d50316f80df4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:47:04 np0005588920 nova_compute[226886]: 2026-01-20 14:47:04.633 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:04 np0005588920 nova_compute[226886]: 2026-01-20 14:47:04.767 226890 DEBUG nova.virt.libvirt.host [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Removed pending event for 75736b87-b14e-45b7-b43b-5129cf7d3279 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 20 09:47:04 np0005588920 nova_compute[226886]: 2026-01-20 14:47:04.767 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920424.766736, 75736b87-b14e-45b7-b43b-5129cf7d3279 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:47:04 np0005588920 nova_compute[226886]: 2026-01-20 14:47:04.767 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:47:04 np0005588920 nova_compute[226886]: 2026-01-20 14:47:04.770 226890 DEBUG nova.compute.manager [None req-c3408307-4ead-4159-a942-8e4a77aee88f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:47:04 np0005588920 nova_compute[226886]: 2026-01-20 14:47:04.772 226890 INFO nova.virt.libvirt.driver [-] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Instance rebooted successfully.#033[00m
Jan 20 09:47:04 np0005588920 nova_compute[226886]: 2026-01-20 14:47:04.773 226890 DEBUG nova.compute.manager [None req-c3408307-4ead-4159-a942-8e4a77aee88f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:47:04 np0005588920 nova_compute[226886]: 2026-01-20 14:47:04.799 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:47:04 np0005588920 nova_compute[226886]: 2026-01-20 14:47:04.802 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started_hard, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:47:04 np0005588920 nova_compute[226886]: 2026-01-20 14:47:04.830 226890 DEBUG oslo_concurrency.lockutils [None req-c3408307-4ead-4159-a942-8e4a77aee88f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 5.584s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:47:04 np0005588920 nova_compute[226886]: 2026-01-20 14:47:04.832 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920424.7675211, 75736b87-b14e-45b7-b43b-5129cf7d3279 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:47:04 np0005588920 nova_compute[226886]: 2026-01-20 14:47:04.832 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] VM Started (Lifecycle Event)#033[00m
Jan 20 09:47:04 np0005588920 nova_compute[226886]: 2026-01-20 14:47:04.861 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:47:04 np0005588920 nova_compute[226886]: 2026-01-20 14:47:04.871 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:47:05 np0005588920 podman[261975]: 2026-01-20 14:47:05.065849254 +0000 UTC m=+0.112026634 container create 9c31fcb32c3511de32cd28492aec2f6327f8019e3b1262760138ccc65c30546f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 20 09:47:05 np0005588920 podman[261975]: 2026-01-20 14:47:04.975064183 +0000 UTC m=+0.021241583 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:47:05 np0005588920 systemd[1]: Started libpod-conmon-9c31fcb32c3511de32cd28492aec2f6327f8019e3b1262760138ccc65c30546f.scope.
Jan 20 09:47:05 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:47:05 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d27fe4a6de03d9b0a6b5830b438afe51369c32f43e19de3a2a41c5bd986d7b15/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:47:05 np0005588920 podman[261975]: 2026-01-20 14:47:05.263554205 +0000 UTC m=+0.309731615 container init 9c31fcb32c3511de32cd28492aec2f6327f8019e3b1262760138ccc65c30546f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Jan 20 09:47:05 np0005588920 podman[261975]: 2026-01-20 14:47:05.269616154 +0000 UTC m=+0.315793534 container start 9c31fcb32c3511de32cd28492aec2f6327f8019e3b1262760138ccc65c30546f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 20 09:47:05 np0005588920 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[261989]: [NOTICE]   (261993) : New worker (261995) forked
Jan 20 09:47:05 np0005588920 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[261989]: [NOTICE]   (261993) : Loading success.
Jan 20 09:47:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:05.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:05 np0005588920 nova_compute[226886]: 2026-01-20 14:47:05.640 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:05 np0005588920 nova_compute[226886]: 2026-01-20 14:47:05.954 226890 DEBUG nova.compute.manager [req-6860a240-6678-4d38-ad32-293b650352d9 req-d86dd513-65c3-4229-be2a-a2560ac54185 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Received event network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:47:05 np0005588920 nova_compute[226886]: 2026-01-20 14:47:05.955 226890 DEBUG oslo_concurrency.lockutils [req-6860a240-6678-4d38-ad32-293b650352d9 req-d86dd513-65c3-4229-be2a-a2560ac54185 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:47:05 np0005588920 nova_compute[226886]: 2026-01-20 14:47:05.955 226890 DEBUG oslo_concurrency.lockutils [req-6860a240-6678-4d38-ad32-293b650352d9 req-d86dd513-65c3-4229-be2a-a2560ac54185 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:47:05 np0005588920 nova_compute[226886]: 2026-01-20 14:47:05.955 226890 DEBUG oslo_concurrency.lockutils [req-6860a240-6678-4d38-ad32-293b650352d9 req-d86dd513-65c3-4229-be2a-a2560ac54185 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:47:05 np0005588920 nova_compute[226886]: 2026-01-20 14:47:05.955 226890 DEBUG nova.compute.manager [req-6860a240-6678-4d38-ad32-293b650352d9 req-d86dd513-65c3-4229-be2a-a2560ac54185 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] No waiting events found dispatching network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:47:05 np0005588920 nova_compute[226886]: 2026-01-20 14:47:05.956 226890 WARNING nova.compute.manager [req-6860a240-6678-4d38-ad32-293b650352d9 req-d86dd513-65c3-4229-be2a-a2560ac54185 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Received unexpected event network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:47:05 np0005588920 nova_compute[226886]: 2026-01-20 14:47:05.956 226890 DEBUG nova.compute.manager [req-6860a240-6678-4d38-ad32-293b650352d9 req-d86dd513-65c3-4229-be2a-a2560ac54185 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Received event network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:47:05 np0005588920 nova_compute[226886]: 2026-01-20 14:47:05.956 226890 DEBUG oslo_concurrency.lockutils [req-6860a240-6678-4d38-ad32-293b650352d9 req-d86dd513-65c3-4229-be2a-a2560ac54185 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:47:05 np0005588920 nova_compute[226886]: 2026-01-20 14:47:05.956 226890 DEBUG oslo_concurrency.lockutils [req-6860a240-6678-4d38-ad32-293b650352d9 req-d86dd513-65c3-4229-be2a-a2560ac54185 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:47:05 np0005588920 nova_compute[226886]: 2026-01-20 14:47:05.956 226890 DEBUG oslo_concurrency.lockutils [req-6860a240-6678-4d38-ad32-293b650352d9 req-d86dd513-65c3-4229-be2a-a2560ac54185 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:47:05 np0005588920 nova_compute[226886]: 2026-01-20 14:47:05.957 226890 DEBUG nova.compute.manager [req-6860a240-6678-4d38-ad32-293b650352d9 req-d86dd513-65c3-4229-be2a-a2560ac54185 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] No waiting events found dispatching network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:47:05 np0005588920 nova_compute[226886]: 2026-01-20 14:47:05.957 226890 WARNING nova.compute.manager [req-6860a240-6678-4d38-ad32-293b650352d9 req-d86dd513-65c3-4229-be2a-a2560ac54185 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Received unexpected event network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:47:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:47:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:06.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:47:06 np0005588920 ovn_controller[133971]: 2026-01-20T14:47:06Z|00421|binding|INFO|Releasing lport 9e775c45-1646-436d-a0cb-a5b5ec356e1b from this chassis (sb_readonly=0)
Jan 20 09:47:06 np0005588920 nova_compute[226886]: 2026-01-20 14:47:06.885 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:07 np0005588920 podman[262004]: 2026-01-20 14:47:07.034666047 +0000 UTC m=+0.115538292 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:47:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e238 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:47:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:47:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:07.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:47:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 09:47:07 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/724914289' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 09:47:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 09:47:07 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/724914289' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 09:47:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:08.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:09 np0005588920 nova_compute[226886]: 2026-01-20 14:47:09.113 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:09.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:10.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:10 np0005588920 nova_compute[226886]: 2026-01-20 14:47:10.642 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:11 np0005588920 ovn_controller[133971]: 2026-01-20T14:47:11Z|00422|binding|INFO|Releasing lport 9e775c45-1646-436d-a0cb-a5b5ec356e1b from this chassis (sb_readonly=0)
Jan 20 09:47:11 np0005588920 nova_compute[226886]: 2026-01-20 14:47:11.589 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:11.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:12.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e238 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:47:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 09:47:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1676694116' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 09:47:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 09:47:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1676694116' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 09:47:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:13.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:14 np0005588920 nova_compute[226886]: 2026-01-20 14:47:14.116 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:14 np0005588920 nova_compute[226886]: 2026-01-20 14:47:14.421 226890 DEBUG nova.compute.manager [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Jan 20 09:47:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.006000167s ======
Jan 20 09:47:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:14.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.006000167s
Jan 20 09:47:14 np0005588920 nova_compute[226886]: 2026-01-20 14:47:14.541 226890 DEBUG oslo_concurrency.lockutils [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:47:14 np0005588920 nova_compute[226886]: 2026-01-20 14:47:14.542 226890 DEBUG oslo_concurrency.lockutils [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:47:14 np0005588920 nova_compute[226886]: 2026-01-20 14:47:14.567 226890 DEBUG nova.objects.instance [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lazy-loading 'pci_requests' on Instance uuid bf7690ac-9b5a-41e3-83bf-3c83cbacc45c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:47:14 np0005588920 nova_compute[226886]: 2026-01-20 14:47:14.584 226890 DEBUG nova.virt.hardware [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:47:14 np0005588920 nova_compute[226886]: 2026-01-20 14:47:14.585 226890 INFO nova.compute.claims [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 09:47:14 np0005588920 nova_compute[226886]: 2026-01-20 14:47:14.585 226890 DEBUG nova.objects.instance [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lazy-loading 'resources' on Instance uuid bf7690ac-9b5a-41e3-83bf-3c83cbacc45c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:47:14 np0005588920 nova_compute[226886]: 2026-01-20 14:47:14.596 226890 DEBUG nova.objects.instance [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lazy-loading 'pci_devices' on Instance uuid bf7690ac-9b5a-41e3-83bf-3c83cbacc45c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:47:14 np0005588920 nova_compute[226886]: 2026-01-20 14:47:14.645 226890 INFO nova.compute.resource_tracker [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] Updating resource usage from migration a1059fde-04ca-4e1c-8ccd-474c4cd4cbec#033[00m
Jan 20 09:47:14 np0005588920 nova_compute[226886]: 2026-01-20 14:47:14.646 226890 DEBUG nova.compute.resource_tracker [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] Starting to track incoming migration a1059fde-04ca-4e1c-8ccd-474c4cd4cbec with flavor 30c26a27-d918-46d8-a512-4ef3b4ce5955 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Jan 20 09:47:14 np0005588920 nova_compute[226886]: 2026-01-20 14:47:14.758 226890 DEBUG oslo_concurrency.processutils [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:47:14 np0005588920 nova_compute[226886]: 2026-01-20 14:47:14.791 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768920419.7891772, 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:47:14 np0005588920 nova_compute[226886]: 2026-01-20 14:47:14.792 226890 INFO nova.compute.manager [-] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:47:14 np0005588920 nova_compute[226886]: 2026-01-20 14:47:14.814 226890 DEBUG nova.compute.manager [None req-5aeed895-a8a4-499a-a7cf-b25b9cec374d - - - - - -] [instance: 6d9b0e60-8e92-4c9a-8aab-1b7f44bb2835] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:47:14 np0005588920 ovn_controller[133971]: 2026-01-20T14:47:14Z|00423|binding|INFO|Releasing lport 9e775c45-1646-436d-a0cb-a5b5ec356e1b from this chassis (sb_readonly=0)
Jan 20 09:47:14 np0005588920 nova_compute[226886]: 2026-01-20 14:47:14.956 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:15 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:47:15 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/279901560' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:47:15 np0005588920 nova_compute[226886]: 2026-01-20 14:47:15.204 226890 DEBUG oslo_concurrency.processutils [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:47:15 np0005588920 nova_compute[226886]: 2026-01-20 14:47:15.209 226890 DEBUG nova.compute.provider_tree [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:47:15 np0005588920 nova_compute[226886]: 2026-01-20 14:47:15.233 226890 DEBUG nova.scheduler.client.report [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:47:15 np0005588920 nova_compute[226886]: 2026-01-20 14:47:15.261 226890 DEBUG oslo_concurrency.lockutils [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.718s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:47:15 np0005588920 nova_compute[226886]: 2026-01-20 14:47:15.261 226890 INFO nova.compute.manager [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] Migrating#033[00m
Jan 20 09:47:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:15.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:15 np0005588920 nova_compute[226886]: 2026-01-20 14:47:15.645 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:16.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:16.450 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:47:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:16.451 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:47:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:16.451 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:47:16 np0005588920 ovn_controller[133971]: 2026-01-20T14:47:16Z|00056|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:22:f9:d2 10.100.0.4
Jan 20 09:47:17 np0005588920 systemd[1]: Created slice User Slice of UID 42436.
Jan 20 09:47:17 np0005588920 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 20 09:47:17 np0005588920 systemd-logind[783]: New session 54 of user nova.
Jan 20 09:47:17 np0005588920 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 20 09:47:17 np0005588920 systemd[1]: Starting User Manager for UID 42436...
Jan 20 09:47:17 np0005588920 podman[262056]: 2026-01-20 14:47:17.345338167 +0000 UTC m=+0.091969974 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:47:17 np0005588920 systemd[262074]: Queued start job for default target Main User Target.
Jan 20 09:47:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e238 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:47:17 np0005588920 systemd[262074]: Created slice User Application Slice.
Jan 20 09:47:17 np0005588920 systemd[262074]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 20 09:47:17 np0005588920 systemd[262074]: Started Daily Cleanup of User's Temporary Directories.
Jan 20 09:47:17 np0005588920 systemd[262074]: Reached target Paths.
Jan 20 09:47:17 np0005588920 systemd[262074]: Reached target Timers.
Jan 20 09:47:17 np0005588920 systemd[262074]: Starting D-Bus User Message Bus Socket...
Jan 20 09:47:17 np0005588920 systemd[262074]: Starting Create User's Volatile Files and Directories...
Jan 20 09:47:17 np0005588920 systemd[262074]: Listening on D-Bus User Message Bus Socket.
Jan 20 09:47:17 np0005588920 systemd[262074]: Finished Create User's Volatile Files and Directories.
Jan 20 09:47:17 np0005588920 systemd[262074]: Reached target Sockets.
Jan 20 09:47:17 np0005588920 systemd[262074]: Reached target Basic System.
Jan 20 09:47:17 np0005588920 systemd[262074]: Reached target Main User Target.
Jan 20 09:47:17 np0005588920 systemd[262074]: Startup finished in 199ms.
Jan 20 09:47:17 np0005588920 systemd[1]: Started User Manager for UID 42436.
Jan 20 09:47:17 np0005588920 systemd[1]: Started Session 54 of User nova.
Jan 20 09:47:17 np0005588920 systemd[1]: session-54.scope: Deactivated successfully.
Jan 20 09:47:17 np0005588920 systemd-logind[783]: Session 54 logged out. Waiting for processes to exit.
Jan 20 09:47:17 np0005588920 systemd-logind[783]: Removed session 54.
Jan 20 09:47:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:17.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:17 np0005588920 systemd-logind[783]: New session 56 of user nova.
Jan 20 09:47:17 np0005588920 systemd[1]: Started Session 56 of User nova.
Jan 20 09:47:17 np0005588920 systemd[1]: session-56.scope: Deactivated successfully.
Jan 20 09:47:17 np0005588920 systemd-logind[783]: Session 56 logged out. Waiting for processes to exit.
Jan 20 09:47:17 np0005588920 systemd-logind[783]: Removed session 56.
Jan 20 09:47:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:47:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:18.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:47:19 np0005588920 nova_compute[226886]: 2026-01-20 14:47:19.120 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:19.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:47:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:20.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:47:20 np0005588920 nova_compute[226886]: 2026-01-20 14:47:20.647 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:21.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:22.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e238 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:47:23 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #49. Immutable memtables: 6.
Jan 20 09:47:23 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:47:23 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:47:23 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:47:23 np0005588920 nova_compute[226886]: 2026-01-20 14:47:23.569 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:23.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:24 np0005588920 nova_compute[226886]: 2026-01-20 14:47:24.167 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:24.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:47:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:25.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:47:25 np0005588920 nova_compute[226886]: 2026-01-20 14:47:25.650 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:26.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e238 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:47:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:27.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:27 np0005588920 systemd[1]: Stopping User Manager for UID 42436...
Jan 20 09:47:27 np0005588920 systemd[262074]: Activating special unit Exit the Session...
Jan 20 09:47:27 np0005588920 systemd[262074]: Stopped target Main User Target.
Jan 20 09:47:27 np0005588920 systemd[262074]: Stopped target Basic System.
Jan 20 09:47:27 np0005588920 systemd[262074]: Stopped target Paths.
Jan 20 09:47:27 np0005588920 systemd[262074]: Stopped target Sockets.
Jan 20 09:47:27 np0005588920 systemd[262074]: Stopped target Timers.
Jan 20 09:47:27 np0005588920 systemd[262074]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 20 09:47:27 np0005588920 systemd[262074]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 20 09:47:27 np0005588920 systemd[262074]: Closed D-Bus User Message Bus Socket.
Jan 20 09:47:27 np0005588920 systemd[262074]: Stopped Create User's Volatile Files and Directories.
Jan 20 09:47:27 np0005588920 systemd[262074]: Removed slice User Application Slice.
Jan 20 09:47:27 np0005588920 systemd[262074]: Reached target Shutdown.
Jan 20 09:47:27 np0005588920 systemd[262074]: Finished Exit the Session.
Jan 20 09:47:27 np0005588920 systemd[262074]: Reached target Exit the Session.
Jan 20 09:47:27 np0005588920 systemd[1]: user@42436.service: Deactivated successfully.
Jan 20 09:47:27 np0005588920 systemd[1]: Stopped User Manager for UID 42436.
Jan 20 09:47:27 np0005588920 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 20 09:47:27 np0005588920 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 20 09:47:27 np0005588920 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 20 09:47:27 np0005588920 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 20 09:47:27 np0005588920 systemd[1]: Removed slice User Slice of UID 42436.
Jan 20 09:47:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:28.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:29 np0005588920 nova_compute[226886]: 2026-01-20 14:47:29.168 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:29 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:47:29 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:47:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:47:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:29.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:47:29 np0005588920 nova_compute[226886]: 2026-01-20 14:47:29.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:47:29 np0005588920 nova_compute[226886]: 2026-01-20 14:47:29.726 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:47:29 np0005588920 nova_compute[226886]: 2026-01-20 14:47:29.726 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 09:47:29 np0005588920 nova_compute[226886]: 2026-01-20 14:47:29.955 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "refresh_cache-75736b87-b14e-45b7-b43b-5129cf7d3279" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:47:29 np0005588920 nova_compute[226886]: 2026-01-20 14:47:29.955 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquired lock "refresh_cache-75736b87-b14e-45b7-b43b-5129cf7d3279" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:47:29 np0005588920 nova_compute[226886]: 2026-01-20 14:47:29.955 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 09:47:29 np0005588920 nova_compute[226886]: 2026-01-20 14:47:29.956 226890 DEBUG nova.objects.instance [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 75736b87-b14e-45b7-b43b-5129cf7d3279 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:47:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:30.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:30 np0005588920 nova_compute[226886]: 2026-01-20 14:47:30.576 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:30 np0005588920 nova_compute[226886]: 2026-01-20 14:47:30.651 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:31 np0005588920 nova_compute[226886]: 2026-01-20 14:47:31.334 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Updating instance_info_cache with network_info: [{"id": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "address": "fa:16:3e:22:f9:d2", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a9a684-c9", "ovs_interfaceid": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:47:31 np0005588920 nova_compute[226886]: 2026-01-20 14:47:31.360 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Releasing lock "refresh_cache-75736b87-b14e-45b7-b43b-5129cf7d3279" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:47:31 np0005588920 nova_compute[226886]: 2026-01-20 14:47:31.361 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 09:47:31 np0005588920 nova_compute[226886]: 2026-01-20 14:47:31.419 226890 INFO nova.network.neutron [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] Updating port 5659965f-0485-4982-898c-f273d7898a5f with attributes {'binding:host_id': 'compute-2.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Jan 20 09:47:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:31.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:31 np0005588920 nova_compute[226886]: 2026-01-20 14:47:31.759 226890 DEBUG oslo_concurrency.lockutils [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "9beb3ec3-721e-4919-9713-a92c82ad189b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:47:31 np0005588920 nova_compute[226886]: 2026-01-20 14:47:31.760 226890 DEBUG oslo_concurrency.lockutils [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "9beb3ec3-721e-4919-9713-a92c82ad189b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:47:31 np0005588920 nova_compute[226886]: 2026-01-20 14:47:31.776 226890 DEBUG nova.compute.manager [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:47:31 np0005588920 nova_compute[226886]: 2026-01-20 14:47:31.873 226890 DEBUG oslo_concurrency.lockutils [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:47:31 np0005588920 nova_compute[226886]: 2026-01-20 14:47:31.874 226890 DEBUG oslo_concurrency.lockutils [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:47:31 np0005588920 nova_compute[226886]: 2026-01-20 14:47:31.881 226890 DEBUG nova.virt.hardware [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:47:31 np0005588920 nova_compute[226886]: 2026-01-20 14:47:31.881 226890 INFO nova.compute.claims [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 09:47:31 np0005588920 nova_compute[226886]: 2026-01-20 14:47:31.949 226890 DEBUG nova.compute.manager [req-f9ff71d0-fab8-4acf-ba20-f246357d4344 req-918284ae-e991-41bb-8d2b-bab258f6aeae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] Received event network-vif-unplugged-5659965f-0485-4982-898c-f273d7898a5f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:47:31 np0005588920 nova_compute[226886]: 2026-01-20 14:47:31.949 226890 DEBUG oslo_concurrency.lockutils [req-f9ff71d0-fab8-4acf-ba20-f246357d4344 req-918284ae-e991-41bb-8d2b-bab258f6aeae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "bf7690ac-9b5a-41e3-83bf-3c83cbacc45c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:47:31 np0005588920 nova_compute[226886]: 2026-01-20 14:47:31.949 226890 DEBUG oslo_concurrency.lockutils [req-f9ff71d0-fab8-4acf-ba20-f246357d4344 req-918284ae-e991-41bb-8d2b-bab258f6aeae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "bf7690ac-9b5a-41e3-83bf-3c83cbacc45c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:47:31 np0005588920 nova_compute[226886]: 2026-01-20 14:47:31.950 226890 DEBUG oslo_concurrency.lockutils [req-f9ff71d0-fab8-4acf-ba20-f246357d4344 req-918284ae-e991-41bb-8d2b-bab258f6aeae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "bf7690ac-9b5a-41e3-83bf-3c83cbacc45c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:47:31 np0005588920 nova_compute[226886]: 2026-01-20 14:47:31.950 226890 DEBUG nova.compute.manager [req-f9ff71d0-fab8-4acf-ba20-f246357d4344 req-918284ae-e991-41bb-8d2b-bab258f6aeae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] No waiting events found dispatching network-vif-unplugged-5659965f-0485-4982-898c-f273d7898a5f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:47:31 np0005588920 nova_compute[226886]: 2026-01-20 14:47:31.950 226890 WARNING nova.compute.manager [req-f9ff71d0-fab8-4acf-ba20-f246357d4344 req-918284ae-e991-41bb-8d2b-bab258f6aeae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] Received unexpected event network-vif-unplugged-5659965f-0485-4982-898c-f273d7898a5f for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 20 09:47:32 np0005588920 nova_compute[226886]: 2026-01-20 14:47:32.009 226890 DEBUG oslo_concurrency.processutils [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:47:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:47:32 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3143770173' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:47:32 np0005588920 nova_compute[226886]: 2026-01-20 14:47:32.431 226890 DEBUG oslo_concurrency.processutils [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.422s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:47:32 np0005588920 nova_compute[226886]: 2026-01-20 14:47:32.437 226890 DEBUG nova.compute.provider_tree [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:47:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:32.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:32 np0005588920 nova_compute[226886]: 2026-01-20 14:47:32.454 226890 DEBUG nova.scheduler.client.report [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:47:32 np0005588920 nova_compute[226886]: 2026-01-20 14:47:32.478 226890 DEBUG oslo_concurrency.lockutils [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.605s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:47:32 np0005588920 nova_compute[226886]: 2026-01-20 14:47:32.479 226890 DEBUG nova.compute.manager [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:47:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e238 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:47:32 np0005588920 nova_compute[226886]: 2026-01-20 14:47:32.531 226890 DEBUG nova.compute.manager [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:47:32 np0005588920 nova_compute[226886]: 2026-01-20 14:47:32.531 226890 DEBUG nova.network.neutron [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:47:32 np0005588920 nova_compute[226886]: 2026-01-20 14:47:32.556 226890 INFO nova.virt.libvirt.driver [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:47:32 np0005588920 nova_compute[226886]: 2026-01-20 14:47:32.580 226890 DEBUG nova.compute.manager [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:47:32 np0005588920 nova_compute[226886]: 2026-01-20 14:47:32.681 226890 DEBUG oslo_concurrency.lockutils [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Acquiring lock "refresh_cache-bf7690ac-9b5a-41e3-83bf-3c83cbacc45c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:47:32 np0005588920 nova_compute[226886]: 2026-01-20 14:47:32.682 226890 DEBUG oslo_concurrency.lockutils [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Acquired lock "refresh_cache-bf7690ac-9b5a-41e3-83bf-3c83cbacc45c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:47:32 np0005588920 nova_compute[226886]: 2026-01-20 14:47:32.682 226890 DEBUG nova.network.neutron [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:47:32 np0005588920 nova_compute[226886]: 2026-01-20 14:47:32.685 226890 INFO nova.virt.block_device [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Booting with volume 9219aafd-6c66-4f38-9927-85b54b4175ae at /dev/vda#033[00m
Jan 20 09:47:32 np0005588920 nova_compute[226886]: 2026-01-20 14:47:32.810 226890 DEBUG nova.policy [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '869086208e10436c9dc96c78bee9a85d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b683fcc0026242e28ba6d8fba638688e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:47:32 np0005588920 nova_compute[226886]: 2026-01-20 14:47:32.879 226890 DEBUG os_brick.utils [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 20 09:47:32 np0005588920 nova_compute[226886]: 2026-01-20 14:47:32.881 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:47:32 np0005588920 nova_compute[226886]: 2026-01-20 14:47:32.892 231437 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:47:32 np0005588920 nova_compute[226886]: 2026-01-20 14:47:32.893 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[d91e453e-f346-4aea-ae05-bf8b59a80b98]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:32 np0005588920 nova_compute[226886]: 2026-01-20 14:47:32.894 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:47:32 np0005588920 nova_compute[226886]: 2026-01-20 14:47:32.902 231437 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:47:32 np0005588920 nova_compute[226886]: 2026-01-20 14:47:32.903 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[d9d31448-ec10-44ea-9474-2dd70848fc3f]: (4, ('InitiatorName=iqn.1994-05.com.redhat:f7e7b2e5d28e', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:32 np0005588920 nova_compute[226886]: 2026-01-20 14:47:32.904 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:47:32 np0005588920 nova_compute[226886]: 2026-01-20 14:47:32.918 231437 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.014s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:47:32 np0005588920 nova_compute[226886]: 2026-01-20 14:47:32.918 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[64bc5351-28cd-46f1-b7a7-f629c1d79cb6]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:32 np0005588920 nova_compute[226886]: 2026-01-20 14:47:32.920 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[9f11dbe7-2520-4692-b5ae-d4df524b2d75]: (4, '1190ec40-2b89-4358-8dec-733c5829fbed') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:32 np0005588920 nova_compute[226886]: 2026-01-20 14:47:32.920 226890 DEBUG oslo_concurrency.processutils [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:47:32 np0005588920 nova_compute[226886]: 2026-01-20 14:47:32.949 226890 DEBUG oslo_concurrency.processutils [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "nvme version" returned: 0 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:47:32 np0005588920 nova_compute[226886]: 2026-01-20 14:47:32.953 226890 DEBUG os_brick.initiator.connectors.lightos [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 20 09:47:32 np0005588920 nova_compute[226886]: 2026-01-20 14:47:32.954 226890 DEBUG os_brick.initiator.connectors.lightos [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 20 09:47:32 np0005588920 nova_compute[226886]: 2026-01-20 14:47:32.954 226890 DEBUG os_brick.initiator.connectors.lightos [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 20 09:47:32 np0005588920 nova_compute[226886]: 2026-01-20 14:47:32.954 226890 DEBUG os_brick.utils [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] <== get_connector_properties: return (74ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:f7e7b2e5d28e', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '1190ec40-2b89-4358-8dec-733c5829fbed', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 20 09:47:32 np0005588920 nova_compute[226886]: 2026-01-20 14:47:32.955 226890 DEBUG nova.virt.block_device [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Updating existing volume attachment record: 44ad9b49-9629-4243-9bc6-4177f69f7660 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 20 09:47:33 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:47:33 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2578026580' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:47:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:47:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:33.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:47:33 np0005588920 nova_compute[226886]: 2026-01-20 14:47:33.916 226890 DEBUG nova.compute.manager [req-bdbb8a28-8748-42d7-9f0d-ec50d6672627 req-d1fa5ba2-d9ec-4f61-a344-41746951dced 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] Received event network-vif-plugged-5659965f-0485-4982-898c-f273d7898a5f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:47:33 np0005588920 nova_compute[226886]: 2026-01-20 14:47:33.917 226890 DEBUG oslo_concurrency.lockutils [req-bdbb8a28-8748-42d7-9f0d-ec50d6672627 req-d1fa5ba2-d9ec-4f61-a344-41746951dced 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "bf7690ac-9b5a-41e3-83bf-3c83cbacc45c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:47:33 np0005588920 nova_compute[226886]: 2026-01-20 14:47:33.917 226890 DEBUG oslo_concurrency.lockutils [req-bdbb8a28-8748-42d7-9f0d-ec50d6672627 req-d1fa5ba2-d9ec-4f61-a344-41746951dced 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "bf7690ac-9b5a-41e3-83bf-3c83cbacc45c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:47:33 np0005588920 nova_compute[226886]: 2026-01-20 14:47:33.917 226890 DEBUG oslo_concurrency.lockutils [req-bdbb8a28-8748-42d7-9f0d-ec50d6672627 req-d1fa5ba2-d9ec-4f61-a344-41746951dced 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "bf7690ac-9b5a-41e3-83bf-3c83cbacc45c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:47:33 np0005588920 nova_compute[226886]: 2026-01-20 14:47:33.917 226890 DEBUG nova.compute.manager [req-bdbb8a28-8748-42d7-9f0d-ec50d6672627 req-d1fa5ba2-d9ec-4f61-a344-41746951dced 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] No waiting events found dispatching network-vif-plugged-5659965f-0485-4982-898c-f273d7898a5f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:47:33 np0005588920 nova_compute[226886]: 2026-01-20 14:47:33.917 226890 WARNING nova.compute.manager [req-bdbb8a28-8748-42d7-9f0d-ec50d6672627 req-d1fa5ba2-d9ec-4f61-a344-41746951dced 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] Received unexpected event network-vif-plugged-5659965f-0485-4982-898c-f273d7898a5f for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 20 09:47:33 np0005588920 nova_compute[226886]: 2026-01-20 14:47:33.988 226890 DEBUG nova.compute.manager [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:47:33 np0005588920 nova_compute[226886]: 2026-01-20 14:47:33.989 226890 DEBUG nova.virt.libvirt.driver [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:47:33 np0005588920 nova_compute[226886]: 2026-01-20 14:47:33.990 226890 INFO nova.virt.libvirt.driver [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Creating image(s)#033[00m
Jan 20 09:47:33 np0005588920 nova_compute[226886]: 2026-01-20 14:47:33.990 226890 DEBUG nova.virt.libvirt.driver [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 20 09:47:33 np0005588920 nova_compute[226886]: 2026-01-20 14:47:33.991 226890 DEBUG nova.virt.libvirt.driver [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Ensure instance console log exists: /var/lib/nova/instances/9beb3ec3-721e-4919-9713-a92c82ad189b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:47:33 np0005588920 nova_compute[226886]: 2026-01-20 14:47:33.991 226890 DEBUG oslo_concurrency.lockutils [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:47:33 np0005588920 nova_compute[226886]: 2026-01-20 14:47:33.991 226890 DEBUG oslo_concurrency.lockutils [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:47:33 np0005588920 nova_compute[226886]: 2026-01-20 14:47:33.992 226890 DEBUG oslo_concurrency.lockutils [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:47:34 np0005588920 nova_compute[226886]: 2026-01-20 14:47:34.170 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:34.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:34 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #79. Immutable memtables: 0.
Jan 20 09:47:34 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:47:34.517512) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 09:47:34 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 47] Flushing memtable with next log file: 79
Jan 20 09:47:34 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920454517852, "job": 47, "event": "flush_started", "num_memtables": 1, "num_entries": 1636, "num_deletes": 254, "total_data_size": 3488221, "memory_usage": 3538176, "flush_reason": "Manual Compaction"}
Jan 20 09:47:34 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 47] Level-0 flush table #80: started
Jan 20 09:47:34 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920454528126, "cf_name": "default", "job": 47, "event": "table_file_creation", "file_number": 80, "file_size": 1416448, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 41556, "largest_seqno": 43186, "table_properties": {"data_size": 1411216, "index_size": 2436, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1733, "raw_key_size": 14662, "raw_average_key_size": 21, "raw_value_size": 1399361, "raw_average_value_size": 2031, "num_data_blocks": 108, "num_entries": 689, "num_filter_entries": 689, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768920327, "oldest_key_time": 1768920327, "file_creation_time": 1768920454, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 80, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:47:34 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 47] Flush lasted 10656 microseconds, and 5562 cpu microseconds.
Jan 20 09:47:34 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:47:34 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:47:34.528166) [db/flush_job.cc:967] [default] [JOB 47] Level-0 flush table #80: 1416448 bytes OK
Jan 20 09:47:34 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:47:34.528183) [db/memtable_list.cc:519] [default] Level-0 commit table #80 started
Jan 20 09:47:34 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:47:34.529540) [db/memtable_list.cc:722] [default] Level-0 commit table #80: memtable #1 done
Jan 20 09:47:34 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:47:34.529556) EVENT_LOG_v1 {"time_micros": 1768920454529551, "job": 47, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 09:47:34 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:47:34.529572) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 09:47:34 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 47] Try to delete WAL files size 3480631, prev total WAL file size 3480631, number of live WAL files 2.
Jan 20 09:47:34 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000076.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:47:34 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:47:34.530916) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031323533' seq:72057594037927935, type:22 .. '6D6772737461740031353036' seq:0, type:0; will stop at (end)
Jan 20 09:47:34 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 48] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 09:47:34 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 47 Base level 0, inputs: [80(1383KB)], [78(10MB)]
Jan 20 09:47:34 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920454530988, "job": 48, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [80], "files_L6": [78], "score": -1, "input_data_size": 12678312, "oldest_snapshot_seqno": -1}
Jan 20 09:47:34 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 48] Generated table #81: 6918 keys, 9676535 bytes, temperature: kUnknown
Jan 20 09:47:34 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920454596257, "cf_name": "default", "job": 48, "event": "table_file_creation", "file_number": 81, "file_size": 9676535, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9632340, "index_size": 25779, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17349, "raw_key_size": 176823, "raw_average_key_size": 25, "raw_value_size": 9510694, "raw_average_value_size": 1374, "num_data_blocks": 1026, "num_entries": 6918, "num_filter_entries": 6918, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768920454, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 81, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:47:34 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:47:34 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:47:34.596524) [db/compaction/compaction_job.cc:1663] [default] [JOB 48] Compacted 1@0 + 1@6 files to L6 => 9676535 bytes
Jan 20 09:47:34 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:47:34.649433) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 193.9 rd, 148.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.4, 10.7 +0.0 blob) out(9.2 +0.0 blob), read-write-amplify(15.8) write-amplify(6.8) OK, records in: 7386, records dropped: 468 output_compression: NoCompression
Jan 20 09:47:34 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:47:34.649469) EVENT_LOG_v1 {"time_micros": 1768920454649456, "job": 48, "event": "compaction_finished", "compaction_time_micros": 65373, "compaction_time_cpu_micros": 25344, "output_level": 6, "num_output_files": 1, "total_output_size": 9676535, "num_input_records": 7386, "num_output_records": 6918, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 09:47:34 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000080.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:47:34 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920454649906, "job": 48, "event": "table_file_deletion", "file_number": 80}
Jan 20 09:47:34 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000078.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:47:34 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920454651936, "job": 48, "event": "table_file_deletion", "file_number": 78}
Jan 20 09:47:34 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:47:34.530797) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:47:34 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:47:34.651979) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:47:34 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:47:34.651983) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:47:34 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:47:34.652220) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:47:34 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:47:34.652225) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:47:34 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:47:34.652227) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:47:34 np0005588920 nova_compute[226886]: 2026-01-20 14:47:34.721 226890 DEBUG nova.network.neutron [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Successfully created port: efc8b363-e70d-42f6-9be8-99865e269ec9 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:47:34 np0005588920 nova_compute[226886]: 2026-01-20 14:47:34.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:47:35 np0005588920 nova_compute[226886]: 2026-01-20 14:47:35.022 226890 DEBUG nova.network.neutron [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] Updating instance_info_cache with network_info: [{"id": "5659965f-0485-4982-898c-f273d7898a5f", "address": "fa:16:3e:b7:cb:a9", "network": {"id": "fbd5d614-a7d3-4563-913c-104506628e59", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-60721994-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b31139b2a4e49cba5e7048febf901c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5659965f-04", "ovs_interfaceid": "5659965f-0485-4982-898c-f273d7898a5f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:47:35 np0005588920 nova_compute[226886]: 2026-01-20 14:47:35.058 226890 DEBUG oslo_concurrency.lockutils [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Releasing lock "refresh_cache-bf7690ac-9b5a-41e3-83bf-3c83cbacc45c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:47:35 np0005588920 nova_compute[226886]: 2026-01-20 14:47:35.131 226890 DEBUG nova.compute.manager [req-f7f44aec-103d-4b07-abc4-ce3d2c461c1e req-3f3a5da4-66fd-4665-bee4-62c79d00457b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] Received event network-changed-5659965f-0485-4982-898c-f273d7898a5f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:47:35 np0005588920 nova_compute[226886]: 2026-01-20 14:47:35.132 226890 DEBUG nova.compute.manager [req-f7f44aec-103d-4b07-abc4-ce3d2c461c1e req-3f3a5da4-66fd-4665-bee4-62c79d00457b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] Refreshing instance network info cache due to event network-changed-5659965f-0485-4982-898c-f273d7898a5f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:47:35 np0005588920 nova_compute[226886]: 2026-01-20 14:47:35.132 226890 DEBUG oslo_concurrency.lockutils [req-f7f44aec-103d-4b07-abc4-ce3d2c461c1e req-3f3a5da4-66fd-4665-bee4-62c79d00457b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-bf7690ac-9b5a-41e3-83bf-3c83cbacc45c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:47:35 np0005588920 nova_compute[226886]: 2026-01-20 14:47:35.132 226890 DEBUG oslo_concurrency.lockutils [req-f7f44aec-103d-4b07-abc4-ce3d2c461c1e req-3f3a5da4-66fd-4665-bee4-62c79d00457b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-bf7690ac-9b5a-41e3-83bf-3c83cbacc45c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:47:35 np0005588920 nova_compute[226886]: 2026-01-20 14:47:35.132 226890 DEBUG nova.network.neutron [req-f7f44aec-103d-4b07-abc4-ce3d2c461c1e req-3f3a5da4-66fd-4665-bee4-62c79d00457b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] Refreshing network info cache for port 5659965f-0485-4982-898c-f273d7898a5f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:47:35 np0005588920 nova_compute[226886]: 2026-01-20 14:47:35.154 226890 DEBUG nova.virt.libvirt.driver [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Jan 20 09:47:35 np0005588920 nova_compute[226886]: 2026-01-20 14:47:35.155 226890 DEBUG nova.virt.libvirt.driver [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 20 09:47:35 np0005588920 nova_compute[226886]: 2026-01-20 14:47:35.156 226890 INFO nova.virt.libvirt.driver [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] Creating image(s)#033[00m
Jan 20 09:47:35 np0005588920 nova_compute[226886]: 2026-01-20 14:47:35.193 226890 DEBUG nova.storage.rbd_utils [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] creating snapshot(nova-resize) on rbd image(bf7690ac-9b5a-41e3-83bf-3c83cbacc45c_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 20 09:47:35 np0005588920 nova_compute[226886]: 2026-01-20 14:47:35.653 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:35.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:35 np0005588920 nova_compute[226886]: 2026-01-20 14:47:35.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:47:35 np0005588920 nova_compute[226886]: 2026-01-20 14:47:35.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:47:35 np0005588920 nova_compute[226886]: 2026-01-20 14:47:35.753 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:47:35 np0005588920 nova_compute[226886]: 2026-01-20 14:47:35.753 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:47:35 np0005588920 nova_compute[226886]: 2026-01-20 14:47:35.754 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:47:35 np0005588920 nova_compute[226886]: 2026-01-20 14:47:35.754 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:47:35 np0005588920 nova_compute[226886]: 2026-01-20 14:47:35.754 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:47:35 np0005588920 nova_compute[226886]: 2026-01-20 14:47:35.902 226890 DEBUG nova.network.neutron [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Successfully updated port: efc8b363-e70d-42f6-9be8-99865e269ec9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:47:35 np0005588920 nova_compute[226886]: 2026-01-20 14:47:35.920 226890 DEBUG oslo_concurrency.lockutils [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "refresh_cache-9beb3ec3-721e-4919-9713-a92c82ad189b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:47:35 np0005588920 nova_compute[226886]: 2026-01-20 14:47:35.921 226890 DEBUG oslo_concurrency.lockutils [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquired lock "refresh_cache-9beb3ec3-721e-4919-9713-a92c82ad189b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:47:35 np0005588920 nova_compute[226886]: 2026-01-20 14:47:35.921 226890 DEBUG nova.network.neutron [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:47:36 np0005588920 nova_compute[226886]: 2026-01-20 14:47:36.194 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:47:36 np0005588920 nova_compute[226886]: 2026-01-20 14:47:36.279 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-0000005e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:47:36 np0005588920 nova_compute[226886]: 2026-01-20 14:47:36.279 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-0000005e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:47:36 np0005588920 nova_compute[226886]: 2026-01-20 14:47:36.439 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:47:36 np0005588920 nova_compute[226886]: 2026-01-20 14:47:36.440 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4329MB free_disk=20.806148529052734GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:47:36 np0005588920 nova_compute[226886]: 2026-01-20 14:47:36.440 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:47:36 np0005588920 nova_compute[226886]: 2026-01-20 14:47:36.441 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:47:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:36.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:36 np0005588920 nova_compute[226886]: 2026-01-20 14:47:36.522 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Applying migration context for instance bf7690ac-9b5a-41e3-83bf-3c83cbacc45c as it has an incoming, in-progress migration a1059fde-04ca-4e1c-8ccd-474c4cd4cbec. Migration status is post-migrating _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:950#033[00m
Jan 20 09:47:36 np0005588920 nova_compute[226886]: 2026-01-20 14:47:36.523 226890 INFO nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] Updating resource usage from migration a1059fde-04ca-4e1c-8ccd-474c4cd4cbec#033[00m
Jan 20 09:47:36 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e239 e239: 3 total, 3 up, 3 in
Jan 20 09:47:36 np0005588920 nova_compute[226886]: 2026-01-20 14:47:36.555 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance 75736b87-b14e-45b7-b43b-5129cf7d3279 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:47:36 np0005588920 nova_compute[226886]: 2026-01-20 14:47:36.556 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance bf7690ac-9b5a-41e3-83bf-3c83cbacc45c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:47:36 np0005588920 nova_compute[226886]: 2026-01-20 14:47:36.556 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance 9beb3ec3-721e-4919-9713-a92c82ad189b actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:47:36 np0005588920 nova_compute[226886]: 2026-01-20 14:47:36.556 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:47:36 np0005588920 nova_compute[226886]: 2026-01-20 14:47:36.557 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=960MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:47:36 np0005588920 nova_compute[226886]: 2026-01-20 14:47:36.584 226890 DEBUG nova.objects.instance [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lazy-loading 'trusted_certs' on Instance uuid bf7690ac-9b5a-41e3-83bf-3c83cbacc45c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:47:36 np0005588920 nova_compute[226886]: 2026-01-20 14:47:36.641 226890 DEBUG nova.network.neutron [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:47:36 np0005588920 nova_compute[226886]: 2026-01-20 14:47:36.698 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:47:36 np0005588920 nova_compute[226886]: 2026-01-20 14:47:36.721 226890 DEBUG nova.virt.libvirt.driver [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 20 09:47:36 np0005588920 nova_compute[226886]: 2026-01-20 14:47:36.722 226890 DEBUG nova.virt.libvirt.driver [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] Ensure instance console log exists: /var/lib/nova/instances/bf7690ac-9b5a-41e3-83bf-3c83cbacc45c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:47:36 np0005588920 nova_compute[226886]: 2026-01-20 14:47:36.723 226890 DEBUG oslo_concurrency.lockutils [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:47:36 np0005588920 nova_compute[226886]: 2026-01-20 14:47:36.723 226890 DEBUG oslo_concurrency.lockutils [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:47:36 np0005588920 nova_compute[226886]: 2026-01-20 14:47:36.724 226890 DEBUG oslo_concurrency.lockutils [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:47:36 np0005588920 nova_compute[226886]: 2026-01-20 14:47:36.727 226890 DEBUG nova.virt.libvirt.driver [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] Start _get_guest_xml network_info=[{"id": "5659965f-0485-4982-898c-f273d7898a5f", "address": "fa:16:3e:b7:cb:a9", "network": {"id": "fbd5d614-a7d3-4563-913c-104506628e59", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-60721994-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-60721994-network", "vif_mac": "fa:16:3e:b7:cb:a9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b31139b2a4e49cba5e7048febf901c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5659965f-04", "ovs_interfaceid": "5659965f-0485-4982-898c-f273d7898a5f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:47:36 np0005588920 nova_compute[226886]: 2026-01-20 14:47:36.731 226890 WARNING nova.virt.libvirt.driver [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:47:36 np0005588920 nova_compute[226886]: 2026-01-20 14:47:36.740 226890 DEBUG nova.virt.libvirt.host [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:47:36 np0005588920 nova_compute[226886]: 2026-01-20 14:47:36.741 226890 DEBUG nova.virt.libvirt.host [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:47:36 np0005588920 nova_compute[226886]: 2026-01-20 14:47:36.745 226890 DEBUG nova.virt.libvirt.host [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:47:36 np0005588920 nova_compute[226886]: 2026-01-20 14:47:36.745 226890 DEBUG nova.virt.libvirt.host [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:47:36 np0005588920 nova_compute[226886]: 2026-01-20 14:47:36.746 226890 DEBUG nova.virt.libvirt.driver [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:47:36 np0005588920 nova_compute[226886]: 2026-01-20 14:47:36.746 226890 DEBUG nova.virt.hardware [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='30c26a27-d918-46d8-a512-4ef3b4ce5955',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:47:36 np0005588920 nova_compute[226886]: 2026-01-20 14:47:36.747 226890 DEBUG nova.virt.hardware [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:47:36 np0005588920 nova_compute[226886]: 2026-01-20 14:47:36.747 226890 DEBUG nova.virt.hardware [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:47:36 np0005588920 nova_compute[226886]: 2026-01-20 14:47:36.747 226890 DEBUG nova.virt.hardware [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:47:36 np0005588920 nova_compute[226886]: 2026-01-20 14:47:36.747 226890 DEBUG nova.virt.hardware [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:47:36 np0005588920 nova_compute[226886]: 2026-01-20 14:47:36.748 226890 DEBUG nova.virt.hardware [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:47:36 np0005588920 nova_compute[226886]: 2026-01-20 14:47:36.748 226890 DEBUG nova.virt.hardware [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:47:36 np0005588920 nova_compute[226886]: 2026-01-20 14:47:36.748 226890 DEBUG nova.virt.hardware [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:47:36 np0005588920 nova_compute[226886]: 2026-01-20 14:47:36.748 226890 DEBUG nova.virt.hardware [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:47:36 np0005588920 nova_compute[226886]: 2026-01-20 14:47:36.748 226890 DEBUG nova.virt.hardware [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:47:36 np0005588920 nova_compute[226886]: 2026-01-20 14:47:36.749 226890 DEBUG nova.virt.hardware [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:47:36 np0005588920 nova_compute[226886]: 2026-01-20 14:47:36.749 226890 DEBUG nova.objects.instance [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lazy-loading 'vcpu_model' on Instance uuid bf7690ac-9b5a-41e3-83bf-3c83cbacc45c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:47:36 np0005588920 nova_compute[226886]: 2026-01-20 14:47:36.773 226890 DEBUG oslo_concurrency.processutils [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:47:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:47:37 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1261327825' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:47:37 np0005588920 nova_compute[226886]: 2026-01-20 14:47:37.137 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:47:37 np0005588920 nova_compute[226886]: 2026-01-20 14:47:37.147 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:47:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:47:37 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3809842418' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:47:37 np0005588920 nova_compute[226886]: 2026-01-20 14:47:37.261 226890 DEBUG oslo_concurrency.processutils [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:47:37 np0005588920 nova_compute[226886]: 2026-01-20 14:47:37.300 226890 DEBUG oslo_concurrency.processutils [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:47:37 np0005588920 nova_compute[226886]: 2026-01-20 14:47:37.327 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:47:37 np0005588920 nova_compute[226886]: 2026-01-20 14:47:37.336 226890 DEBUG nova.compute.manager [req-fe9d7156-810e-4276-949c-8d24315706c7 req-d2c61e83-0431-442b-8d84-054956588631 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Received event network-changed-efc8b363-e70d-42f6-9be8-99865e269ec9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:47:37 np0005588920 nova_compute[226886]: 2026-01-20 14:47:37.336 226890 DEBUG nova.compute.manager [req-fe9d7156-810e-4276-949c-8d24315706c7 req-d2c61e83-0431-442b-8d84-054956588631 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Refreshing instance network info cache due to event network-changed-efc8b363-e70d-42f6-9be8-99865e269ec9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:47:37 np0005588920 nova_compute[226886]: 2026-01-20 14:47:37.337 226890 DEBUG oslo_concurrency.lockutils [req-fe9d7156-810e-4276-949c-8d24315706c7 req-d2c61e83-0431-442b-8d84-054956588631 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-9beb3ec3-721e-4919-9713-a92c82ad189b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:47:37 np0005588920 nova_compute[226886]: 2026-01-20 14:47:37.356 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:47:37 np0005588920 nova_compute[226886]: 2026-01-20 14:47:37.357 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.916s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:47:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:37.363 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=31, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=30) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:47:37 np0005588920 nova_compute[226886]: 2026-01-20 14:47:37.364 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:37.365 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:47:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e239 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:47:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:37.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:47:37 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2610479657' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:47:37 np0005588920 nova_compute[226886]: 2026-01-20 14:47:37.727 226890 DEBUG nova.network.neutron [req-f7f44aec-103d-4b07-abc4-ce3d2c461c1e req-3f3a5da4-66fd-4665-bee4-62c79d00457b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] Updated VIF entry in instance network info cache for port 5659965f-0485-4982-898c-f273d7898a5f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:47:37 np0005588920 nova_compute[226886]: 2026-01-20 14:47:37.728 226890 DEBUG nova.network.neutron [req-f7f44aec-103d-4b07-abc4-ce3d2c461c1e req-3f3a5da4-66fd-4665-bee4-62c79d00457b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] Updating instance_info_cache with network_info: [{"id": "5659965f-0485-4982-898c-f273d7898a5f", "address": "fa:16:3e:b7:cb:a9", "network": {"id": "fbd5d614-a7d3-4563-913c-104506628e59", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-60721994-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b31139b2a4e49cba5e7048febf901c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5659965f-04", "ovs_interfaceid": "5659965f-0485-4982-898c-f273d7898a5f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:47:37 np0005588920 nova_compute[226886]: 2026-01-20 14:47:37.742 226890 DEBUG oslo_concurrency.processutils [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:47:37 np0005588920 nova_compute[226886]: 2026-01-20 14:47:37.743 226890 DEBUG nova.virt.libvirt.vif [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:47:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1331472194',display_name='tempest-DeleteServersTestJSON-server-1331472194',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1331472194',id=99,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:47:10Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3b31139b2a4e49cba5e7048febf901c4',ramdisk_id='',reservation_id='r-5fm4q1fz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-DeleteServersTestJSON-1162922273',owner_user_name='tempest-DeleteServersTestJSON-1162922273-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:47:31Z,user_data=None,user_id='37e9ef97fbe0448e9fbe32d48b66211f',uuid=bf7690ac-9b5a-41e3-83bf-3c83cbacc45c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5659965f-0485-4982-898c-f273d7898a5f", "address": "fa:16:3e:b7:cb:a9", "network": {"id": "fbd5d614-a7d3-4563-913c-104506628e59", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-60721994-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-60721994-network", "vif_mac": "fa:16:3e:b7:cb:a9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b31139b2a4e49cba5e7048febf901c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5659965f-04", "ovs_interfaceid": "5659965f-0485-4982-898c-f273d7898a5f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:47:37 np0005588920 nova_compute[226886]: 2026-01-20 14:47:37.744 226890 DEBUG nova.network.os_vif_util [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Converting VIF {"id": "5659965f-0485-4982-898c-f273d7898a5f", "address": "fa:16:3e:b7:cb:a9", "network": {"id": "fbd5d614-a7d3-4563-913c-104506628e59", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-60721994-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-60721994-network", "vif_mac": "fa:16:3e:b7:cb:a9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b31139b2a4e49cba5e7048febf901c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5659965f-04", "ovs_interfaceid": "5659965f-0485-4982-898c-f273d7898a5f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:47:37 np0005588920 nova_compute[226886]: 2026-01-20 14:47:37.744 226890 DEBUG nova.network.os_vif_util [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b7:cb:a9,bridge_name='br-int',has_traffic_filtering=True,id=5659965f-0485-4982-898c-f273d7898a5f,network=Network(fbd5d614-a7d3-4563-913c-104506628e59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5659965f-04') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:47:37 np0005588920 nova_compute[226886]: 2026-01-20 14:47:37.747 226890 DEBUG nova.virt.libvirt.driver [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:47:37 np0005588920 nova_compute[226886]:  <uuid>bf7690ac-9b5a-41e3-83bf-3c83cbacc45c</uuid>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:  <name>instance-00000063</name>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:  <memory>196608</memory>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:47:37 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:      <nova:name>tempest-DeleteServersTestJSON-server-1331472194</nova:name>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:47:36</nova:creationTime>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.micro">
Jan 20 09:47:37 np0005588920 nova_compute[226886]:        <nova:memory>192</nova:memory>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:        <nova:user uuid="37e9ef97fbe0448e9fbe32d48b66211f">tempest-DeleteServersTestJSON-1162922273-project-member</nova:user>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:        <nova:project uuid="3b31139b2a4e49cba5e7048febf901c4">tempest-DeleteServersTestJSON-1162922273</nova:project>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:        <nova:port uuid="5659965f-0485-4982-898c-f273d7898a5f">
Jan 20 09:47:37 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:      <entry name="serial">bf7690ac-9b5a-41e3-83bf-3c83cbacc45c</entry>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:      <entry name="uuid">bf7690ac-9b5a-41e3-83bf-3c83cbacc45c</entry>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:47:37 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/bf7690ac-9b5a-41e3-83bf-3c83cbacc45c_disk">
Jan 20 09:47:37 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:47:37 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:47:37 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/bf7690ac-9b5a-41e3-83bf-3c83cbacc45c_disk.config">
Jan 20 09:47:37 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:47:37 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:47:37 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:b7:cb:a9"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:      <target dev="tap5659965f-04"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:47:37 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/bf7690ac-9b5a-41e3-83bf-3c83cbacc45c/console.log" append="off"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:47:37 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:47:37 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:47:37 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:47:37 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:47:37 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:47:37 np0005588920 nova_compute[226886]: 2026-01-20 14:47:37.750 226890 DEBUG nova.virt.libvirt.vif [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:47:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1331472194',display_name='tempest-DeleteServersTestJSON-server-1331472194',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1331472194',id=99,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:47:10Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3b31139b2a4e49cba5e7048febf901c4',ramdisk_id='',reservation_id='r-5fm4q1fz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-DeleteServersTestJSON-1162922273',owner_user_name='tempest-DeleteServersTestJSON-1162922273-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:47:31Z,user_data=None,user_id='37e9ef97fbe0448e9fbe32d48b66211f',uuid=bf7690ac-9b5a-41e3-83bf-3c83cbacc45c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5659965f-0485-4982-898c-f273d7898a5f", "address": "fa:16:3e:b7:cb:a9", "network": {"id": "fbd5d614-a7d3-4563-913c-104506628e59", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-60721994-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-60721994-network", "vif_mac": "fa:16:3e:b7:cb:a9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b31139b2a4e49cba5e7048febf901c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5659965f-04", "ovs_interfaceid": "5659965f-0485-4982-898c-f273d7898a5f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:47:37 np0005588920 nova_compute[226886]: 2026-01-20 14:47:37.751 226890 DEBUG nova.network.os_vif_util [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Converting VIF {"id": "5659965f-0485-4982-898c-f273d7898a5f", "address": "fa:16:3e:b7:cb:a9", "network": {"id": "fbd5d614-a7d3-4563-913c-104506628e59", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-60721994-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-60721994-network", "vif_mac": "fa:16:3e:b7:cb:a9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b31139b2a4e49cba5e7048febf901c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5659965f-04", "ovs_interfaceid": "5659965f-0485-4982-898c-f273d7898a5f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:47:37 np0005588920 nova_compute[226886]: 2026-01-20 14:47:37.752 226890 DEBUG nova.network.os_vif_util [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b7:cb:a9,bridge_name='br-int',has_traffic_filtering=True,id=5659965f-0485-4982-898c-f273d7898a5f,network=Network(fbd5d614-a7d3-4563-913c-104506628e59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5659965f-04') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:47:37 np0005588920 nova_compute[226886]: 2026-01-20 14:47:37.752 226890 DEBUG os_vif [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b7:cb:a9,bridge_name='br-int',has_traffic_filtering=True,id=5659965f-0485-4982-898c-f273d7898a5f,network=Network(fbd5d614-a7d3-4563-913c-104506628e59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5659965f-04') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:47:37 np0005588920 nova_compute[226886]: 2026-01-20 14:47:37.753 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:37 np0005588920 nova_compute[226886]: 2026-01-20 14:47:37.754 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:47:37 np0005588920 nova_compute[226886]: 2026-01-20 14:47:37.754 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:47:37 np0005588920 nova_compute[226886]: 2026-01-20 14:47:37.756 226890 DEBUG oslo_concurrency.lockutils [req-f7f44aec-103d-4b07-abc4-ce3d2c461c1e req-3f3a5da4-66fd-4665-bee4-62c79d00457b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-bf7690ac-9b5a-41e3-83bf-3c83cbacc45c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:47:37 np0005588920 nova_compute[226886]: 2026-01-20 14:47:37.757 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:37 np0005588920 nova_compute[226886]: 2026-01-20 14:47:37.758 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5659965f-04, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:47:37 np0005588920 nova_compute[226886]: 2026-01-20 14:47:37.758 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5659965f-04, col_values=(('external_ids', {'iface-id': '5659965f-0485-4982-898c-f273d7898a5f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b7:cb:a9', 'vm-uuid': 'bf7690ac-9b5a-41e3-83bf-3c83cbacc45c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:47:37 np0005588920 nova_compute[226886]: 2026-01-20 14:47:37.759 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:37 np0005588920 NetworkManager[49076]: <info>  [1768920457.7607] manager: (tap5659965f-04): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/217)
Jan 20 09:47:37 np0005588920 nova_compute[226886]: 2026-01-20 14:47:37.762 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:47:37 np0005588920 nova_compute[226886]: 2026-01-20 14:47:37.766 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:37 np0005588920 nova_compute[226886]: 2026-01-20 14:47:37.768 226890 INFO os_vif [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b7:cb:a9,bridge_name='br-int',has_traffic_filtering=True,id=5659965f-0485-4982-898c-f273d7898a5f,network=Network(fbd5d614-a7d3-4563-913c-104506628e59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5659965f-04')#033[00m
Jan 20 09:47:37 np0005588920 nova_compute[226886]: 2026-01-20 14:47:37.841 226890 DEBUG nova.virt.libvirt.driver [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:47:37 np0005588920 nova_compute[226886]: 2026-01-20 14:47:37.841 226890 DEBUG nova.virt.libvirt.driver [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:47:37 np0005588920 nova_compute[226886]: 2026-01-20 14:47:37.842 226890 DEBUG nova.virt.libvirt.driver [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] No VIF found with MAC fa:16:3e:b7:cb:a9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:47:37 np0005588920 nova_compute[226886]: 2026-01-20 14:47:37.842 226890 INFO nova.virt.libvirt.driver [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] Using config drive#033[00m
Jan 20 09:47:37 np0005588920 NetworkManager[49076]: <info>  [1768920457.9260] manager: (tap5659965f-04): new Tun device (/org/freedesktop/NetworkManager/Devices/218)
Jan 20 09:47:37 np0005588920 kernel: tap5659965f-04: entered promiscuous mode
Jan 20 09:47:37 np0005588920 systemd-udevd[262528]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:47:37 np0005588920 nova_compute[226886]: 2026-01-20 14:47:37.961 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:37 np0005588920 ovn_controller[133971]: 2026-01-20T14:47:37Z|00424|binding|INFO|Claiming lport 5659965f-0485-4982-898c-f273d7898a5f for this chassis.
Jan 20 09:47:37 np0005588920 ovn_controller[133971]: 2026-01-20T14:47:37Z|00425|binding|INFO|5659965f-0485-4982-898c-f273d7898a5f: Claiming fa:16:3e:b7:cb:a9 10.100.0.4
Jan 20 09:47:37 np0005588920 NetworkManager[49076]: <info>  [1768920457.9710] device (tap5659965f-04): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:47:37 np0005588920 NetworkManager[49076]: <info>  [1768920457.9716] device (tap5659965f-04): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:47:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:37.981 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b7:cb:a9 10.100.0.4'], port_security=['fa:16:3e:b7:cb:a9 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'bf7690ac-9b5a-41e3-83bf-3c83cbacc45c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbd5d614-a7d3-4563-913c-104506628e59', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3b31139b2a4e49cba5e7048febf901c4', 'neutron:revision_number': '5', 'neutron:security_group_ids': '117d6f57-074c-4b36-b375-42e0ab117254', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c42c6982-be52-495a-8746-42a46932572f, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=5659965f-0485-4982-898c-f273d7898a5f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:47:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:37.983 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 5659965f-0485-4982-898c-f273d7898a5f in datapath fbd5d614-a7d3-4563-913c-104506628e59 bound to our chassis#033[00m
Jan 20 09:47:37 np0005588920 ovn_controller[133971]: 2026-01-20T14:47:37Z|00426|binding|INFO|Setting lport 5659965f-0485-4982-898c-f273d7898a5f ovn-installed in OVS
Jan 20 09:47:37 np0005588920 ovn_controller[133971]: 2026-01-20T14:47:37Z|00427|binding|INFO|Setting lport 5659965f-0485-4982-898c-f273d7898a5f up in Southbound
Jan 20 09:47:37 np0005588920 nova_compute[226886]: 2026-01-20 14:47:37.985 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:37 np0005588920 nova_compute[226886]: 2026-01-20 14:47:37.986 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:37.985 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fbd5d614-a7d3-4563-913c-104506628e59#033[00m
Jan 20 09:47:37 np0005588920 systemd-machined[196121]: New machine qemu-45-instance-00000063.
Jan 20 09:47:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:37.997 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[aef22bca-ec2b-4f00-86cc-47500d579924]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:37.998 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfbd5d614-a1 in ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:47:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:38.000 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfbd5d614-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:47:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:38.000 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[db1faf97-7e47-46ff-bcb3-44db720fb0a8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:38.001 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[26a49751-091d-4711-ad57-04ab5f066aa5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:38 np0005588920 systemd[1]: Started Virtual Machine qemu-45-instance-00000063.
Jan 20 09:47:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:38.015 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[5fada0d4-0372-4b02-94d9-3645fee9d2db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:38 np0005588920 nova_compute[226886]: 2026-01-20 14:47:38.028 226890 DEBUG nova.network.neutron [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Updating instance_info_cache with network_info: [{"id": "efc8b363-e70d-42f6-9be8-99865e269ec9", "address": "fa:16:3e:36:66:1d", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefc8b363-e7", "ovs_interfaceid": "efc8b363-e70d-42f6-9be8-99865e269ec9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:47:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:38.033 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[59006534-fbd1-4e95-96ec-3ba62e6aee9b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:38 np0005588920 nova_compute[226886]: 2026-01-20 14:47:38.045 226890 DEBUG oslo_concurrency.lockutils [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Releasing lock "refresh_cache-9beb3ec3-721e-4919-9713-a92c82ad189b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:47:38 np0005588920 nova_compute[226886]: 2026-01-20 14:47:38.046 226890 DEBUG nova.compute.manager [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Instance network_info: |[{"id": "efc8b363-e70d-42f6-9be8-99865e269ec9", "address": "fa:16:3e:36:66:1d", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefc8b363-e7", "ovs_interfaceid": "efc8b363-e70d-42f6-9be8-99865e269ec9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:47:38 np0005588920 nova_compute[226886]: 2026-01-20 14:47:38.046 226890 DEBUG oslo_concurrency.lockutils [req-fe9d7156-810e-4276-949c-8d24315706c7 req-d2c61e83-0431-442b-8d84-054956588631 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-9beb3ec3-721e-4919-9713-a92c82ad189b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:47:38 np0005588920 nova_compute[226886]: 2026-01-20 14:47:38.046 226890 DEBUG nova.network.neutron [req-fe9d7156-810e-4276-949c-8d24315706c7 req-d2c61e83-0431-442b-8d84-054956588631 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Refreshing network info cache for port efc8b363-e70d-42f6-9be8-99865e269ec9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:47:38 np0005588920 nova_compute[226886]: 2026-01-20 14:47:38.050 226890 DEBUG nova.virt.libvirt.driver [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Start _get_guest_xml network_info=[{"id": "efc8b363-e70d-42f6-9be8-99865e269ec9", "address": "fa:16:3e:36:66:1d", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefc8b363-e7", "ovs_interfaceid": "efc8b363-e70d-42f6-9be8-99865e269ec9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'device_type': 'disk', 'boot_index': 0, 'delete_on_termination': True, 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-9219aafd-6c66-4f38-9927-85b54b4175ae', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '9219aafd-6c66-4f38-9927-85b54b4175ae', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '9beb3ec3-721e-4919-9713-a92c82ad189b', 'attached_at': '', 'detached_at': '', 'volume_id': '9219aafd-6c66-4f38-9927-85b54b4175ae', 'serial': '9219aafd-6c66-4f38-9927-85b54b4175ae'}, 'mount_device': '/dev/vda', 'guest_format': None, 'disk_bus': 'virtio', 'attachment_id': '44ad9b49-9629-4243-9bc6-4177f69f7660', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:47:38 np0005588920 nova_compute[226886]: 2026-01-20 14:47:38.059 226890 WARNING nova.virt.libvirt.driver [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:47:38 np0005588920 nova_compute[226886]: 2026-01-20 14:47:38.068 226890 DEBUG nova.virt.libvirt.host [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:47:38 np0005588920 nova_compute[226886]: 2026-01-20 14:47:38.069 226890 DEBUG nova.virt.libvirt.host [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:47:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:38.073 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[d2538085-6250-4960-8d80-4426cbddf21f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:38 np0005588920 nova_compute[226886]: 2026-01-20 14:47:38.074 226890 DEBUG nova.virt.libvirt.host [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:47:38 np0005588920 nova_compute[226886]: 2026-01-20 14:47:38.075 226890 DEBUG nova.virt.libvirt.host [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:47:38 np0005588920 nova_compute[226886]: 2026-01-20 14:47:38.076 226890 DEBUG nova.virt.libvirt.driver [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:47:38 np0005588920 nova_compute[226886]: 2026-01-20 14:47:38.076 226890 DEBUG nova.virt.hardware [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:47:38 np0005588920 nova_compute[226886]: 2026-01-20 14:47:38.077 226890 DEBUG nova.virt.hardware [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:47:38 np0005588920 nova_compute[226886]: 2026-01-20 14:47:38.077 226890 DEBUG nova.virt.hardware [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:47:38 np0005588920 nova_compute[226886]: 2026-01-20 14:47:38.077 226890 DEBUG nova.virt.hardware [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:47:38 np0005588920 nova_compute[226886]: 2026-01-20 14:47:38.077 226890 DEBUG nova.virt.hardware [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:47:38 np0005588920 nova_compute[226886]: 2026-01-20 14:47:38.077 226890 DEBUG nova.virt.hardware [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:47:38 np0005588920 nova_compute[226886]: 2026-01-20 14:47:38.078 226890 DEBUG nova.virt.hardware [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:47:38 np0005588920 nova_compute[226886]: 2026-01-20 14:47:38.078 226890 DEBUG nova.virt.hardware [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:47:38 np0005588920 nova_compute[226886]: 2026-01-20 14:47:38.078 226890 DEBUG nova.virt.hardware [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:47:38 np0005588920 nova_compute[226886]: 2026-01-20 14:47:38.078 226890 DEBUG nova.virt.hardware [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:47:38 np0005588920 nova_compute[226886]: 2026-01-20 14:47:38.078 226890 DEBUG nova.virt.hardware [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:47:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:38.079 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ea116a6f-95e5-49b3-803d-24385c180c3a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:38 np0005588920 NetworkManager[49076]: <info>  [1768920458.0807] manager: (tapfbd5d614-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/219)
Jan 20 09:47:38 np0005588920 podman[262508]: 2026-01-20 14:47:38.086430307 +0000 UTC m=+0.174585787 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 20 09:47:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:38.116 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[2be2c109-f53e-49f3-b87c-e92c99caa02b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:38.120 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[e301c8a6-99e9-42e5-a2d0-9010cd9aedd7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:38 np0005588920 nova_compute[226886]: 2026-01-20 14:47:38.126 226890 DEBUG nova.storage.rbd_utils [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] rbd image 9beb3ec3-721e-4919-9713-a92c82ad189b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:47:38 np0005588920 nova_compute[226886]: 2026-01-20 14:47:38.131 226890 DEBUG oslo_concurrency.processutils [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:47:38 np0005588920 NetworkManager[49076]: <info>  [1768920458.1433] device (tapfbd5d614-a0): carrier: link connected
Jan 20 09:47:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:38.148 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[1a7b6582-83d8-4a3f-becf-dcb76ff8e08c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:38.165 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e2268e28-4409-4ace-b9e9-33791a383387]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbd5d614-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5c:38:be'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 137], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 549982, 'reachable_time': 18159, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 262598, 'error': None, 'target': 'ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:38.184 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[fb207f9a-12b5-4e68-83bd-8f8b0e9feecd]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5c:38be'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 549982, 'tstamp': 549982}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 262599, 'error': None, 'target': 'ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:38.207 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[9a03ffc8-1526-4e75-b31c-6ebbd4e4ef62]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbd5d614-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5c:38:be'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 137], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 549982, 'reachable_time': 18159, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 262600, 'error': None, 'target': 'ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:38.253 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[fdd97fcf-b5e9-4183-a1ce-c5e6d63f6a43]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:38.312 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[52610c5f-fb19-465a-921e-90a19ec32ea4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:38.314 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbd5d614-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:47:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:38.314 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:47:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:38.315 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfbd5d614-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:47:38 np0005588920 nova_compute[226886]: 2026-01-20 14:47:38.317 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:38 np0005588920 kernel: tapfbd5d614-a0: entered promiscuous mode
Jan 20 09:47:38 np0005588920 NetworkManager[49076]: <info>  [1768920458.3176] manager: (tapfbd5d614-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/220)
Jan 20 09:47:38 np0005588920 nova_compute[226886]: 2026-01-20 14:47:38.319 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:38.321 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfbd5d614-a0, col_values=(('external_ids', {'iface-id': 'b370b74e-dca0-4ff7-a96f-85b392e20721'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:47:38 np0005588920 nova_compute[226886]: 2026-01-20 14:47:38.322 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:38 np0005588920 ovn_controller[133971]: 2026-01-20T14:47:38Z|00428|binding|INFO|Releasing lport b370b74e-dca0-4ff7-a96f-85b392e20721 from this chassis (sb_readonly=0)
Jan 20 09:47:38 np0005588920 nova_compute[226886]: 2026-01-20 14:47:38.323 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:38.324 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fbd5d614-a7d3-4563-913c-104506628e59.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fbd5d614-a7d3-4563-913c-104506628e59.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:47:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:38.324 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[52df1df0-2d43-4825-8ccc-2d449f7dddf5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:38.325 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:47:38 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 09:47:38 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 09:47:38 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-fbd5d614-a7d3-4563-913c-104506628e59
Jan 20 09:47:38 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 09:47:38 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 09:47:38 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 09:47:38 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/fbd5d614-a7d3-4563-913c-104506628e59.pid.haproxy
Jan 20 09:47:38 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 09:47:38 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:47:38 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 09:47:38 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 09:47:38 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 09:47:38 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 09:47:38 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 09:47:38 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 09:47:38 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 09:47:38 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 09:47:38 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 09:47:38 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 09:47:38 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 09:47:38 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 09:47:38 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 09:47:38 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:47:38 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:47:38 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 09:47:38 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 09:47:38 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:47:38 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID fbd5d614-a7d3-4563-913c-104506628e59
Jan 20 09:47:38 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:47:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:38.328 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59', 'env', 'PROCESS_TAG=haproxy-fbd5d614-a7d3-4563-913c-104506628e59', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fbd5d614-a7d3-4563-913c-104506628e59.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:47:38 np0005588920 nova_compute[226886]: 2026-01-20 14:47:38.337 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:38 np0005588920 nova_compute[226886]: 2026-01-20 14:47:38.357 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:47:38 np0005588920 nova_compute[226886]: 2026-01-20 14:47:38.357 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:47:38 np0005588920 nova_compute[226886]: 2026-01-20 14:47:38.357 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:47:38 np0005588920 nova_compute[226886]: 2026-01-20 14:47:38.358 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:47:38 np0005588920 nova_compute[226886]: 2026-01-20 14:47:38.427 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920458.4274888, bf7690ac-9b5a-41e3-83bf-3c83cbacc45c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:47:38 np0005588920 nova_compute[226886]: 2026-01-20 14:47:38.428 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:47:38 np0005588920 nova_compute[226886]: 2026-01-20 14:47:38.431 226890 DEBUG nova.compute.manager [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:47:38 np0005588920 nova_compute[226886]: 2026-01-20 14:47:38.434 226890 INFO nova.virt.libvirt.driver [-] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] Instance running successfully.#033[00m
Jan 20 09:47:38 np0005588920 virtqemud[226436]: argument unsupported: QEMU guest agent is not configured
Jan 20 09:47:38 np0005588920 nova_compute[226886]: 2026-01-20 14:47:38.442 226890 DEBUG nova.virt.libvirt.guest [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 20 09:47:38 np0005588920 nova_compute[226886]: 2026-01-20 14:47:38.443 226890 DEBUG nova.virt.libvirt.driver [None req-2e9a95ec-dc91-4aa4-95a4-8879707ac308 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Jan 20 09:47:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:38.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:38 np0005588920 nova_compute[226886]: 2026-01-20 14:47:38.455 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:47:38 np0005588920 nova_compute[226886]: 2026-01-20 14:47:38.459 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:47:38 np0005588920 nova_compute[226886]: 2026-01-20 14:47:38.496 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Jan 20 09:47:38 np0005588920 nova_compute[226886]: 2026-01-20 14:47:38.496 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920458.4317842, bf7690ac-9b5a-41e3-83bf-3c83cbacc45c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:47:38 np0005588920 nova_compute[226886]: 2026-01-20 14:47:38.496 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] VM Started (Lifecycle Event)#033[00m
Jan 20 09:47:38 np0005588920 nova_compute[226886]: 2026-01-20 14:47:38.542 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:47:38 np0005588920 nova_compute[226886]: 2026-01-20 14:47:38.547 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:47:38 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:47:38 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1209793672' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:47:38 np0005588920 nova_compute[226886]: 2026-01-20 14:47:38.607 226890 DEBUG oslo_concurrency.processutils [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:47:38 np0005588920 podman[262694]: 2026-01-20 14:47:38.702265925 +0000 UTC m=+0.048582966 container create 02236f43122d6f682dc80eed97255ce80a62371b4074a29b9a2e33655f87a005 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:47:38 np0005588920 systemd[1]: Started libpod-conmon-02236f43122d6f682dc80eed97255ce80a62371b4074a29b9a2e33655f87a005.scope.
Jan 20 09:47:38 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:47:38 np0005588920 podman[262694]: 2026-01-20 14:47:38.677186466 +0000 UTC m=+0.023503517 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:47:38 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/269d97b051d0a627ede9217472492a1759cef0aba2acb311a14ac2c0c4af46c8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:47:38 np0005588920 podman[262694]: 2026-01-20 14:47:38.788854858 +0000 UTC m=+0.135171929 container init 02236f43122d6f682dc80eed97255ce80a62371b4074a29b9a2e33655f87a005 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 20 09:47:38 np0005588920 podman[262694]: 2026-01-20 14:47:38.795525114 +0000 UTC m=+0.141842175 container start 02236f43122d6f682dc80eed97255ce80a62371b4074a29b9a2e33655f87a005 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Jan 20 09:47:38 np0005588920 neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59[262711]: [NOTICE]   (262715) : New worker (262717) forked
Jan 20 09:47:38 np0005588920 neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59[262711]: [NOTICE]   (262715) : Loading success.
Jan 20 09:47:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:39.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:40 np0005588920 nova_compute[226886]: 2026-01-20 14:47:40.077 226890 DEBUG nova.virt.libvirt.vif [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:47:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-757916410',display_name='tempest-ServerActionsTestOtherA-server-757916410',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-757916410',id=101,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDOC6AOV9rhSIyyfBXYGEFvhdWE5GLRuNfs0sPvnXoLHIstQY2OpqwhI35UcFW1s96uqz0+j9sMbdcq/NuNcfgrlnXkEz6j2iO7WUECWdPrQW34JB1FTWAvtA4R6RDoaZA==',key_name='tempest-keypair-1611966828',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b683fcc0026242e28ba6d8fba638688e',ramdisk_id='',reservation_id='r-042ihw9k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-967087071',owner_user_name='tempest-ServerActionsTestOtherA-967087071-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:47:32Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='869086208e10436c9dc96c78bee9a85d',uuid=9beb3ec3-721e-4919-9713-a92c82ad189b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "efc8b363-e70d-42f6-9be8-99865e269ec9", "address": "fa:16:3e:36:66:1d", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefc8b363-e7", "ovs_interfaceid": "efc8b363-e70d-42f6-9be8-99865e269ec9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:47:40 np0005588920 nova_compute[226886]: 2026-01-20 14:47:40.078 226890 DEBUG nova.network.os_vif_util [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Converting VIF {"id": "efc8b363-e70d-42f6-9be8-99865e269ec9", "address": "fa:16:3e:36:66:1d", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefc8b363-e7", "ovs_interfaceid": "efc8b363-e70d-42f6-9be8-99865e269ec9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:47:40 np0005588920 nova_compute[226886]: 2026-01-20 14:47:40.079 226890 DEBUG nova.network.os_vif_util [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:36:66:1d,bridge_name='br-int',has_traffic_filtering=True,id=efc8b363-e70d-42f6-9be8-99865e269ec9,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapefc8b363-e7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:47:40 np0005588920 nova_compute[226886]: 2026-01-20 14:47:40.080 226890 DEBUG nova.objects.instance [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lazy-loading 'pci_devices' on Instance uuid 9beb3ec3-721e-4919-9713-a92c82ad189b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:47:40 np0005588920 nova_compute[226886]: 2026-01-20 14:47:40.086 226890 DEBUG nova.compute.manager [req-e5d6d3e2-93ab-47ad-a0ea-59b1a68ded25 req-d619e353-1830-45b0-a608-bce4a2a63b90 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] Received event network-vif-plugged-5659965f-0485-4982-898c-f273d7898a5f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:47:40 np0005588920 nova_compute[226886]: 2026-01-20 14:47:40.087 226890 DEBUG oslo_concurrency.lockutils [req-e5d6d3e2-93ab-47ad-a0ea-59b1a68ded25 req-d619e353-1830-45b0-a608-bce4a2a63b90 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "bf7690ac-9b5a-41e3-83bf-3c83cbacc45c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:47:40 np0005588920 nova_compute[226886]: 2026-01-20 14:47:40.087 226890 DEBUG oslo_concurrency.lockutils [req-e5d6d3e2-93ab-47ad-a0ea-59b1a68ded25 req-d619e353-1830-45b0-a608-bce4a2a63b90 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "bf7690ac-9b5a-41e3-83bf-3c83cbacc45c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:47:40 np0005588920 nova_compute[226886]: 2026-01-20 14:47:40.087 226890 DEBUG oslo_concurrency.lockutils [req-e5d6d3e2-93ab-47ad-a0ea-59b1a68ded25 req-d619e353-1830-45b0-a608-bce4a2a63b90 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "bf7690ac-9b5a-41e3-83bf-3c83cbacc45c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:47:40 np0005588920 nova_compute[226886]: 2026-01-20 14:47:40.088 226890 DEBUG nova.compute.manager [req-e5d6d3e2-93ab-47ad-a0ea-59b1a68ded25 req-d619e353-1830-45b0-a608-bce4a2a63b90 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] No waiting events found dispatching network-vif-plugged-5659965f-0485-4982-898c-f273d7898a5f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:47:40 np0005588920 nova_compute[226886]: 2026-01-20 14:47:40.088 226890 WARNING nova.compute.manager [req-e5d6d3e2-93ab-47ad-a0ea-59b1a68ded25 req-d619e353-1830-45b0-a608-bce4a2a63b90 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] Received unexpected event network-vif-plugged-5659965f-0485-4982-898c-f273d7898a5f for instance with vm_state resized and task_state None.#033[00m
Jan 20 09:47:40 np0005588920 nova_compute[226886]: 2026-01-20 14:47:40.105 226890 DEBUG nova.virt.libvirt.driver [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:47:40 np0005588920 nova_compute[226886]:  <uuid>9beb3ec3-721e-4919-9713-a92c82ad189b</uuid>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:  <name>instance-00000065</name>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:47:40 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:      <nova:name>tempest-ServerActionsTestOtherA-server-757916410</nova:name>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:47:38</nova:creationTime>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:47:40 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:        <nova:user uuid="869086208e10436c9dc96c78bee9a85d">tempest-ServerActionsTestOtherA-967087071-project-member</nova:user>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:        <nova:project uuid="b683fcc0026242e28ba6d8fba638688e">tempest-ServerActionsTestOtherA-967087071</nova:project>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:        <nova:port uuid="efc8b363-e70d-42f6-9be8-99865e269ec9">
Jan 20 09:47:40 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:      <entry name="serial">9beb3ec3-721e-4919-9713-a92c82ad189b</entry>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:      <entry name="uuid">9beb3ec3-721e-4919-9713-a92c82ad189b</entry>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:47:40 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/9beb3ec3-721e-4919-9713-a92c82ad189b_disk.config">
Jan 20 09:47:40 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:47:40 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:47:40 np0005588920 nova_compute[226886]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="volumes/volume-9219aafd-6c66-4f38-9927-85b54b4175ae">
Jan 20 09:47:40 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:47:40 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:      <serial>9219aafd-6c66-4f38-9927-85b54b4175ae</serial>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:47:40 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:36:66:1d"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:      <target dev="tapefc8b363-e7"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:47:40 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/9beb3ec3-721e-4919-9713-a92c82ad189b/console.log" append="off"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:47:40 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:47:40 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:47:40 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:47:40 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:47:40 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:47:40 np0005588920 nova_compute[226886]: 2026-01-20 14:47:40.105 226890 DEBUG nova.compute.manager [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Preparing to wait for external event network-vif-plugged-efc8b363-e70d-42f6-9be8-99865e269ec9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:47:40 np0005588920 nova_compute[226886]: 2026-01-20 14:47:40.106 226890 DEBUG oslo_concurrency.lockutils [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "9beb3ec3-721e-4919-9713-a92c82ad189b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:47:40 np0005588920 nova_compute[226886]: 2026-01-20 14:47:40.106 226890 DEBUG oslo_concurrency.lockutils [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "9beb3ec3-721e-4919-9713-a92c82ad189b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:47:40 np0005588920 nova_compute[226886]: 2026-01-20 14:47:40.106 226890 DEBUG oslo_concurrency.lockutils [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "9beb3ec3-721e-4919-9713-a92c82ad189b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:47:40 np0005588920 nova_compute[226886]: 2026-01-20 14:47:40.107 226890 DEBUG nova.virt.libvirt.vif [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:47:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-757916410',display_name='tempest-ServerActionsTestOtherA-server-757916410',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-757916410',id=101,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDOC6AOV9rhSIyyfBXYGEFvhdWE5GLRuNfs0sPvnXoLHIstQY2OpqwhI35UcFW1s96uqz0+j9sMbdcq/NuNcfgrlnXkEz6j2iO7WUECWdPrQW34JB1FTWAvtA4R6RDoaZA==',key_name='tempest-keypair-1611966828',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b683fcc0026242e28ba6d8fba638688e',ramdisk_id='',reservation_id='r-042ihw9k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-967087071',owner_user_name='tempest-ServerActionsTestOtherA-967087071-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:47:32Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='869086208e10436c9dc96c78bee9a85d',uuid=9beb3ec3-721e-4919-9713-a92c82ad189b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "efc8b363-e70d-42f6-9be8-99865e269ec9", "address": "fa:16:3e:36:66:1d", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": 
[]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefc8b363-e7", "ovs_interfaceid": "efc8b363-e70d-42f6-9be8-99865e269ec9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:47:40 np0005588920 nova_compute[226886]: 2026-01-20 14:47:40.107 226890 DEBUG nova.network.os_vif_util [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Converting VIF {"id": "efc8b363-e70d-42f6-9be8-99865e269ec9", "address": "fa:16:3e:36:66:1d", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefc8b363-e7", "ovs_interfaceid": "efc8b363-e70d-42f6-9be8-99865e269ec9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:47:40 np0005588920 nova_compute[226886]: 2026-01-20 14:47:40.108 226890 DEBUG nova.network.os_vif_util [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:36:66:1d,bridge_name='br-int',has_traffic_filtering=True,id=efc8b363-e70d-42f6-9be8-99865e269ec9,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapefc8b363-e7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:47:40 np0005588920 nova_compute[226886]: 2026-01-20 14:47:40.108 226890 DEBUG os_vif [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:36:66:1d,bridge_name='br-int',has_traffic_filtering=True,id=efc8b363-e70d-42f6-9be8-99865e269ec9,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapefc8b363-e7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:47:40 np0005588920 nova_compute[226886]: 2026-01-20 14:47:40.109 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:40 np0005588920 nova_compute[226886]: 2026-01-20 14:47:40.109 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:47:40 np0005588920 nova_compute[226886]: 2026-01-20 14:47:40.109 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:47:40 np0005588920 nova_compute[226886]: 2026-01-20 14:47:40.112 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:40 np0005588920 nova_compute[226886]: 2026-01-20 14:47:40.112 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapefc8b363-e7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:47:40 np0005588920 nova_compute[226886]: 2026-01-20 14:47:40.113 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapefc8b363-e7, col_values=(('external_ids', {'iface-id': 'efc8b363-e70d-42f6-9be8-99865e269ec9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:36:66:1d', 'vm-uuid': '9beb3ec3-721e-4919-9713-a92c82ad189b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:47:40 np0005588920 nova_compute[226886]: 2026-01-20 14:47:40.114 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:40 np0005588920 NetworkManager[49076]: <info>  [1768920460.1163] manager: (tapefc8b363-e7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/221)
Jan 20 09:47:40 np0005588920 nova_compute[226886]: 2026-01-20 14:47:40.116 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:47:40 np0005588920 nova_compute[226886]: 2026-01-20 14:47:40.123 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:40 np0005588920 nova_compute[226886]: 2026-01-20 14:47:40.123 226890 INFO os_vif [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:36:66:1d,bridge_name='br-int',has_traffic_filtering=True,id=efc8b363-e70d-42f6-9be8-99865e269ec9,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapefc8b363-e7')#033[00m
Jan 20 09:47:40 np0005588920 nova_compute[226886]: 2026-01-20 14:47:40.201 226890 DEBUG nova.virt.libvirt.driver [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:47:40 np0005588920 nova_compute[226886]: 2026-01-20 14:47:40.202 226890 DEBUG nova.virt.libvirt.driver [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:47:40 np0005588920 nova_compute[226886]: 2026-01-20 14:47:40.202 226890 DEBUG nova.virt.libvirt.driver [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] No VIF found with MAC fa:16:3e:36:66:1d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:47:40 np0005588920 nova_compute[226886]: 2026-01-20 14:47:40.202 226890 INFO nova.virt.libvirt.driver [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Using config drive#033[00m
Jan 20 09:47:40 np0005588920 nova_compute[226886]: 2026-01-20 14:47:40.225 226890 DEBUG nova.storage.rbd_utils [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] rbd image 9beb3ec3-721e-4919-9713-a92c82ad189b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:47:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:40.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:40 np0005588920 nova_compute[226886]: 2026-01-20 14:47:40.655 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:40 np0005588920 nova_compute[226886]: 2026-01-20 14:47:40.720 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:47:40 np0005588920 nova_compute[226886]: 2026-01-20 14:47:40.900 226890 INFO nova.virt.libvirt.driver [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Creating config drive at /var/lib/nova/instances/9beb3ec3-721e-4919-9713-a92c82ad189b/disk.config#033[00m
Jan 20 09:47:40 np0005588920 nova_compute[226886]: 2026-01-20 14:47:40.905 226890 DEBUG oslo_concurrency.processutils [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9beb3ec3-721e-4919-9713-a92c82ad189b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpii650oer execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:47:41 np0005588920 nova_compute[226886]: 2026-01-20 14:47:41.036 226890 DEBUG oslo_concurrency.processutils [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9beb3ec3-721e-4919-9713-a92c82ad189b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpii650oer" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:47:41 np0005588920 nova_compute[226886]: 2026-01-20 14:47:41.063 226890 DEBUG nova.storage.rbd_utils [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] rbd image 9beb3ec3-721e-4919-9713-a92c82ad189b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:47:41 np0005588920 nova_compute[226886]: 2026-01-20 14:47:41.067 226890 DEBUG oslo_concurrency.processutils [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/9beb3ec3-721e-4919-9713-a92c82ad189b/disk.config 9beb3ec3-721e-4919-9713-a92c82ad189b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:47:41 np0005588920 nova_compute[226886]: 2026-01-20 14:47:41.249 226890 DEBUG oslo_concurrency.processutils [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/9beb3ec3-721e-4919-9713-a92c82ad189b/disk.config 9beb3ec3-721e-4919-9713-a92c82ad189b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.182s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:47:41 np0005588920 nova_compute[226886]: 2026-01-20 14:47:41.250 226890 INFO nova.virt.libvirt.driver [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Deleting local config drive /var/lib/nova/instances/9beb3ec3-721e-4919-9713-a92c82ad189b/disk.config because it was imported into RBD.#033[00m
Jan 20 09:47:41 np0005588920 kernel: tapefc8b363-e7: entered promiscuous mode
Jan 20 09:47:41 np0005588920 NetworkManager[49076]: <info>  [1768920461.2896] manager: (tapefc8b363-e7): new Tun device (/org/freedesktop/NetworkManager/Devices/222)
Jan 20 09:47:41 np0005588920 ovn_controller[133971]: 2026-01-20T14:47:41Z|00429|binding|INFO|Claiming lport efc8b363-e70d-42f6-9be8-99865e269ec9 for this chassis.
Jan 20 09:47:41 np0005588920 ovn_controller[133971]: 2026-01-20T14:47:41Z|00430|binding|INFO|efc8b363-e70d-42f6-9be8-99865e269ec9: Claiming fa:16:3e:36:66:1d 10.100.0.8
Jan 20 09:47:41 np0005588920 nova_compute[226886]: 2026-01-20 14:47:41.299 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:41 np0005588920 ovn_controller[133971]: 2026-01-20T14:47:41Z|00431|binding|INFO|Setting lport efc8b363-e70d-42f6-9be8-99865e269ec9 ovn-installed in OVS
Jan 20 09:47:41 np0005588920 nova_compute[226886]: 2026-01-20 14:47:41.317 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:41 np0005588920 systemd-udevd[262799]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:47:41 np0005588920 nova_compute[226886]: 2026-01-20 14:47:41.324 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:41 np0005588920 systemd-machined[196121]: New machine qemu-46-instance-00000065.
Jan 20 09:47:41 np0005588920 NetworkManager[49076]: <info>  [1768920461.3354] device (tapefc8b363-e7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:47:41 np0005588920 NetworkManager[49076]: <info>  [1768920461.3360] device (tapefc8b363-e7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:47:41 np0005588920 systemd[1]: Started Virtual Machine qemu-46-instance-00000065.
Jan 20 09:47:41 np0005588920 ovn_controller[133971]: 2026-01-20T14:47:41Z|00432|binding|INFO|Setting lport efc8b363-e70d-42f6-9be8-99865e269ec9 up in Southbound
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:41.511 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:36:66:1d 10.100.0.8'], port_security=['fa:16:3e:36:66:1d 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '9beb3ec3-721e-4919-9713-a92c82ad189b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b683fcc0026242e28ba6d8fba638688e', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7ceb05b5-53ff-444a-b0ef-2ba8294d585b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=361f9a69-30a6-4be4-89ad-2a8f92877af2, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=efc8b363-e70d-42f6-9be8-99865e269ec9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:41.513 144128 INFO neutron.agent.ovn.metadata.agent [-] Port efc8b363-e70d-42f6-9be8-99865e269ec9 in datapath a19e9d1a-864f-41ee-bdea-188e65973ea5 bound to our chassis#033[00m
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:41.515 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a19e9d1a-864f-41ee-bdea-188e65973ea5#033[00m
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:41.528 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[3561c0d1-d93c-4619-8f2d-94b304f4181f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:41.529 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa19e9d1a-81 in ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:41.532 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa19e9d1a-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:41.532 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[6d93af54-79b8-49b1-93a5-210df33fb3c9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:41.533 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[edc0d24f-e44c-4189-9bea-8ec35675da4e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:41.550 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[04ba716d-2169-4158-a970-f87d1e2d50b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:41.574 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d2e3b9f8-35b9-4ec5-abb3-71b2b7aa3426]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:41.604 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[dbadcc2b-3b5c-496e-9033-384c9be4068e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:41 np0005588920 NetworkManager[49076]: <info>  [1768920461.6125] manager: (tapa19e9d1a-80): new Veth device (/org/freedesktop/NetworkManager/Devices/223)
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:41.611 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f3664006-8582-44aa-8355-a083792d01a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:41.643 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[794252ac-02b1-4733-b755-65779c994949]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:41.646 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[51b7cb2a-cc6d-4858-9202-3c12fa2c074a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:41.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:41 np0005588920 NetworkManager[49076]: <info>  [1768920461.6705] device (tapa19e9d1a-80): carrier: link connected
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:41.680 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[8c61aa62-b25a-45f2-9fad-bd12c1e7bd44]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:41.701 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[60b256f8-9a01-40ca-ad01-66ff849170c2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa19e9d1a-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:21:53:13'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 139], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 550335, 'reachable_time': 40239, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 262833, 'error': None, 'target': 'ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:41.716 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[56d82fcd-3544-4988-9e0f-9da17972e2cf]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe21:5313'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 550335, 'tstamp': 550335}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 262834, 'error': None, 'target': 'ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:41 np0005588920 nova_compute[226886]: 2026-01-20 14:47:41.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:41.732 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[3f71d207-b2cd-4661-9982-58f4c5983572]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa19e9d1a-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:21:53:13'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 139], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 550335, 'reachable_time': 40239, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 262835, 'error': None, 'target': 'ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:41.760 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[9b736818-0dbc-417b-b6ed-18b467cd3452]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:41.832 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[018d941f-bcdd-4558-b059-7d152f23a784]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:41.837 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa19e9d1a-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:41.837 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:41.838 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa19e9d1a-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:47:41 np0005588920 NetworkManager[49076]: <info>  [1768920461.8405] manager: (tapa19e9d1a-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/224)
Jan 20 09:47:41 np0005588920 kernel: tapa19e9d1a-80: entered promiscuous mode
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:41.844 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa19e9d1a-80, col_values=(('external_ids', {'iface-id': '5527ab8d-a985-420b-9d5b-7e5d9baf7004'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:47:41 np0005588920 ovn_controller[133971]: 2026-01-20T14:47:41Z|00433|binding|INFO|Releasing lport 5527ab8d-a985-420b-9d5b-7e5d9baf7004 from this chassis (sb_readonly=0)
Jan 20 09:47:41 np0005588920 nova_compute[226886]: 2026-01-20 14:47:41.851 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:41 np0005588920 nova_compute[226886]: 2026-01-20 14:47:41.858 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:41.860 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a19e9d1a-864f-41ee-bdea-188e65973ea5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a19e9d1a-864f-41ee-bdea-188e65973ea5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:41.860 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[562fd2ca-77de-410e-b8fa-e9fc44dbc804]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:41.861 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-a19e9d1a-864f-41ee-bdea-188e65973ea5
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/a19e9d1a-864f-41ee-bdea-188e65973ea5.pid.haproxy
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID a19e9d1a-864f-41ee-bdea-188e65973ea5
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:47:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:41.862 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'env', 'PROCESS_TAG=haproxy-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a19e9d1a-864f-41ee-bdea-188e65973ea5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:47:42 np0005588920 nova_compute[226886]: 2026-01-20 14:47:42.197 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920462.1969175, 9beb3ec3-721e-4919-9713-a92c82ad189b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:47:42 np0005588920 nova_compute[226886]: 2026-01-20 14:47:42.198 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] VM Started (Lifecycle Event)#033[00m
Jan 20 09:47:42 np0005588920 nova_compute[226886]: 2026-01-20 14:47:42.233 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:47:42 np0005588920 nova_compute[226886]: 2026-01-20 14:47:42.237 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920462.1969948, 9beb3ec3-721e-4919-9713-a92c82ad189b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:47:42 np0005588920 nova_compute[226886]: 2026-01-20 14:47:42.238 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:47:42 np0005588920 podman[262907]: 2026-01-20 14:47:42.243351166 +0000 UTC m=+0.081380559 container create 7e351b44d379e0cf2522819975195a1ca599b0fba9d65a4eadea3a3172553e38 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 20 09:47:42 np0005588920 nova_compute[226886]: 2026-01-20 14:47:42.265 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:47:42 np0005588920 nova_compute[226886]: 2026-01-20 14:47:42.268 226890 DEBUG nova.network.neutron [req-fe9d7156-810e-4276-949c-8d24315706c7 req-d2c61e83-0431-442b-8d84-054956588631 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Updated VIF entry in instance network info cache for port efc8b363-e70d-42f6-9be8-99865e269ec9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:47:42 np0005588920 nova_compute[226886]: 2026-01-20 14:47:42.269 226890 DEBUG nova.network.neutron [req-fe9d7156-810e-4276-949c-8d24315706c7 req-d2c61e83-0431-442b-8d84-054956588631 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Updating instance_info_cache with network_info: [{"id": "efc8b363-e70d-42f6-9be8-99865e269ec9", "address": "fa:16:3e:36:66:1d", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefc8b363-e7", "ovs_interfaceid": "efc8b363-e70d-42f6-9be8-99865e269ec9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:47:42 np0005588920 nova_compute[226886]: 2026-01-20 14:47:42.271 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:47:42 np0005588920 systemd[1]: Started libpod-conmon-7e351b44d379e0cf2522819975195a1ca599b0fba9d65a4eadea3a3172553e38.scope.
Jan 20 09:47:42 np0005588920 podman[262907]: 2026-01-20 14:47:42.195600275 +0000 UTC m=+0.033629688 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:47:42 np0005588920 nova_compute[226886]: 2026-01-20 14:47:42.292 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:47:42 np0005588920 nova_compute[226886]: 2026-01-20 14:47:42.299 226890 DEBUG oslo_concurrency.lockutils [req-fe9d7156-810e-4276-949c-8d24315706c7 req-d2c61e83-0431-442b-8d84-054956588631 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-9beb3ec3-721e-4919-9713-a92c82ad189b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:47:42 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:47:42 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/77021e52f09fab56d798e398bd9c68349b9a361e1e99edb6e6e5b24031504467/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:47:42 np0005588920 podman[262907]: 2026-01-20 14:47:42.325032373 +0000 UTC m=+0.163061786 container init 7e351b44d379e0cf2522819975195a1ca599b0fba9d65a4eadea3a3172553e38 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 20 09:47:42 np0005588920 podman[262907]: 2026-01-20 14:47:42.329673922 +0000 UTC m=+0.167703325 container start 7e351b44d379e0cf2522819975195a1ca599b0fba9d65a4eadea3a3172553e38 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:47:42 np0005588920 neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5[262923]: [NOTICE]   (262927) : New worker (262929) forked
Jan 20 09:47:42 np0005588920 neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5[262923]: [NOTICE]   (262927) : Loading success.
Jan 20 09:47:42 np0005588920 nova_compute[226886]: 2026-01-20 14:47:42.394 226890 DEBUG nova.compute.manager [req-785b2972-3136-48ad-9a7a-61ed3d572c4e req-ecca0e1c-631a-4ec2-993c-731a091d73a7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] Received event network-vif-plugged-5659965f-0485-4982-898c-f273d7898a5f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:47:42 np0005588920 nova_compute[226886]: 2026-01-20 14:47:42.394 226890 DEBUG oslo_concurrency.lockutils [req-785b2972-3136-48ad-9a7a-61ed3d572c4e req-ecca0e1c-631a-4ec2-993c-731a091d73a7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "bf7690ac-9b5a-41e3-83bf-3c83cbacc45c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:47:42 np0005588920 nova_compute[226886]: 2026-01-20 14:47:42.394 226890 DEBUG oslo_concurrency.lockutils [req-785b2972-3136-48ad-9a7a-61ed3d572c4e req-ecca0e1c-631a-4ec2-993c-731a091d73a7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "bf7690ac-9b5a-41e3-83bf-3c83cbacc45c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:47:42 np0005588920 nova_compute[226886]: 2026-01-20 14:47:42.395 226890 DEBUG oslo_concurrency.lockutils [req-785b2972-3136-48ad-9a7a-61ed3d572c4e req-ecca0e1c-631a-4ec2-993c-731a091d73a7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "bf7690ac-9b5a-41e3-83bf-3c83cbacc45c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:47:42 np0005588920 nova_compute[226886]: 2026-01-20 14:47:42.395 226890 DEBUG nova.compute.manager [req-785b2972-3136-48ad-9a7a-61ed3d572c4e req-ecca0e1c-631a-4ec2-993c-731a091d73a7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] No waiting events found dispatching network-vif-plugged-5659965f-0485-4982-898c-f273d7898a5f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:47:42 np0005588920 nova_compute[226886]: 2026-01-20 14:47:42.395 226890 WARNING nova.compute.manager [req-785b2972-3136-48ad-9a7a-61ed3d572c4e req-ecca0e1c-631a-4ec2-993c-731a091d73a7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] Received unexpected event network-vif-plugged-5659965f-0485-4982-898c-f273d7898a5f for instance with vm_state resized and task_state None.#033[00m
Jan 20 09:47:42 np0005588920 nova_compute[226886]: 2026-01-20 14:47:42.395 226890 DEBUG nova.compute.manager [req-785b2972-3136-48ad-9a7a-61ed3d572c4e req-ecca0e1c-631a-4ec2-993c-731a091d73a7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Received event network-vif-plugged-efc8b363-e70d-42f6-9be8-99865e269ec9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:47:42 np0005588920 nova_compute[226886]: 2026-01-20 14:47:42.395 226890 DEBUG oslo_concurrency.lockutils [req-785b2972-3136-48ad-9a7a-61ed3d572c4e req-ecca0e1c-631a-4ec2-993c-731a091d73a7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "9beb3ec3-721e-4919-9713-a92c82ad189b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:47:42 np0005588920 nova_compute[226886]: 2026-01-20 14:47:42.396 226890 DEBUG oslo_concurrency.lockutils [req-785b2972-3136-48ad-9a7a-61ed3d572c4e req-ecca0e1c-631a-4ec2-993c-731a091d73a7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "9beb3ec3-721e-4919-9713-a92c82ad189b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:47:42 np0005588920 nova_compute[226886]: 2026-01-20 14:47:42.396 226890 DEBUG oslo_concurrency.lockutils [req-785b2972-3136-48ad-9a7a-61ed3d572c4e req-ecca0e1c-631a-4ec2-993c-731a091d73a7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "9beb3ec3-721e-4919-9713-a92c82ad189b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:47:42 np0005588920 nova_compute[226886]: 2026-01-20 14:47:42.396 226890 DEBUG nova.compute.manager [req-785b2972-3136-48ad-9a7a-61ed3d572c4e req-ecca0e1c-631a-4ec2-993c-731a091d73a7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Processing event network-vif-plugged-efc8b363-e70d-42f6-9be8-99865e269ec9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:47:42 np0005588920 nova_compute[226886]: 2026-01-20 14:47:42.397 226890 DEBUG nova.compute.manager [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:47:42 np0005588920 nova_compute[226886]: 2026-01-20 14:47:42.399 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920462.399714, 9beb3ec3-721e-4919-9713-a92c82ad189b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:47:42 np0005588920 nova_compute[226886]: 2026-01-20 14:47:42.400 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:47:42 np0005588920 nova_compute[226886]: 2026-01-20 14:47:42.416 226890 DEBUG nova.virt.libvirt.driver [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:47:42 np0005588920 nova_compute[226886]: 2026-01-20 14:47:42.420 226890 INFO nova.virt.libvirt.driver [-] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Instance spawned successfully.#033[00m
Jan 20 09:47:42 np0005588920 nova_compute[226886]: 2026-01-20 14:47:42.420 226890 DEBUG nova.virt.libvirt.driver [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:47:42 np0005588920 nova_compute[226886]: 2026-01-20 14:47:42.430 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:47:42 np0005588920 nova_compute[226886]: 2026-01-20 14:47:42.434 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:47:42 np0005588920 nova_compute[226886]: 2026-01-20 14:47:42.441 226890 DEBUG nova.virt.libvirt.driver [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:47:42 np0005588920 nova_compute[226886]: 2026-01-20 14:47:42.442 226890 DEBUG nova.virt.libvirt.driver [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:47:42 np0005588920 nova_compute[226886]: 2026-01-20 14:47:42.442 226890 DEBUG nova.virt.libvirt.driver [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:47:42 np0005588920 nova_compute[226886]: 2026-01-20 14:47:42.442 226890 DEBUG nova.virt.libvirt.driver [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:47:42 np0005588920 nova_compute[226886]: 2026-01-20 14:47:42.443 226890 DEBUG nova.virt.libvirt.driver [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:47:42 np0005588920 nova_compute[226886]: 2026-01-20 14:47:42.443 226890 DEBUG nova.virt.libvirt.driver [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:47:42 np0005588920 nova_compute[226886]: 2026-01-20 14:47:42.450 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:47:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:42.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:42 np0005588920 nova_compute[226886]: 2026-01-20 14:47:42.513 226890 INFO nova.compute.manager [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Took 8.52 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:47:42 np0005588920 nova_compute[226886]: 2026-01-20 14:47:42.513 226890 DEBUG nova.compute.manager [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:47:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e239 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:47:42 np0005588920 nova_compute[226886]: 2026-01-20 14:47:42.621 226890 INFO nova.compute.manager [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Took 10.78 seconds to build instance.#033[00m
Jan 20 09:47:42 np0005588920 nova_compute[226886]: 2026-01-20 14:47:42.640 226890 DEBUG oslo_concurrency.lockutils [None req-6a154d51-da17-41d4-ac3b-ce0db400ad3a 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "9beb3ec3-721e-4919-9713-a92c82ad189b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.880s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:47:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:43.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:44.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:45 np0005588920 nova_compute[226886]: 2026-01-20 14:47:45.115 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:45 np0005588920 nova_compute[226886]: 2026-01-20 14:47:45.153 226890 DEBUG nova.compute.manager [req-484e05a9-b578-4cd6-86a6-85e4e593e367 req-35774c1a-0564-45ba-b042-8ad4476488e7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Received event network-vif-plugged-efc8b363-e70d-42f6-9be8-99865e269ec9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:47:45 np0005588920 nova_compute[226886]: 2026-01-20 14:47:45.154 226890 DEBUG oslo_concurrency.lockutils [req-484e05a9-b578-4cd6-86a6-85e4e593e367 req-35774c1a-0564-45ba-b042-8ad4476488e7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "9beb3ec3-721e-4919-9713-a92c82ad189b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:47:45 np0005588920 nova_compute[226886]: 2026-01-20 14:47:45.154 226890 DEBUG oslo_concurrency.lockutils [req-484e05a9-b578-4cd6-86a6-85e4e593e367 req-35774c1a-0564-45ba-b042-8ad4476488e7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "9beb3ec3-721e-4919-9713-a92c82ad189b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:47:45 np0005588920 nova_compute[226886]: 2026-01-20 14:47:45.155 226890 DEBUG oslo_concurrency.lockutils [req-484e05a9-b578-4cd6-86a6-85e4e593e367 req-35774c1a-0564-45ba-b042-8ad4476488e7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "9beb3ec3-721e-4919-9713-a92c82ad189b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:47:45 np0005588920 nova_compute[226886]: 2026-01-20 14:47:45.155 226890 DEBUG nova.compute.manager [req-484e05a9-b578-4cd6-86a6-85e4e593e367 req-35774c1a-0564-45ba-b042-8ad4476488e7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] No waiting events found dispatching network-vif-plugged-efc8b363-e70d-42f6-9be8-99865e269ec9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:47:45 np0005588920 nova_compute[226886]: 2026-01-20 14:47:45.155 226890 WARNING nova.compute.manager [req-484e05a9-b578-4cd6-86a6-85e4e593e367 req-35774c1a-0564-45ba-b042-8ad4476488e7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Received unexpected event network-vif-plugged-efc8b363-e70d-42f6-9be8-99865e269ec9 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:47:45 np0005588920 nova_compute[226886]: 2026-01-20 14:47:45.658 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:45.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:47:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:46.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:47:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:47.366 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '31'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:47:47 np0005588920 nova_compute[226886]: 2026-01-20 14:47:47.401 226890 DEBUG nova.compute.manager [req-fd34f402-c720-4127-83c3-6789282bb8e3 req-76a6d4c6-0dcc-4359-a4dc-0ff4cb9b16b4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Received event network-changed-efc8b363-e70d-42f6-9be8-99865e269ec9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:47:47 np0005588920 nova_compute[226886]: 2026-01-20 14:47:47.402 226890 DEBUG nova.compute.manager [req-fd34f402-c720-4127-83c3-6789282bb8e3 req-76a6d4c6-0dcc-4359-a4dc-0ff4cb9b16b4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Refreshing instance network info cache due to event network-changed-efc8b363-e70d-42f6-9be8-99865e269ec9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:47:47 np0005588920 nova_compute[226886]: 2026-01-20 14:47:47.402 226890 DEBUG oslo_concurrency.lockutils [req-fd34f402-c720-4127-83c3-6789282bb8e3 req-76a6d4c6-0dcc-4359-a4dc-0ff4cb9b16b4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-9beb3ec3-721e-4919-9713-a92c82ad189b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:47:47 np0005588920 nova_compute[226886]: 2026-01-20 14:47:47.402 226890 DEBUG oslo_concurrency.lockutils [req-fd34f402-c720-4127-83c3-6789282bb8e3 req-76a6d4c6-0dcc-4359-a4dc-0ff4cb9b16b4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-9beb3ec3-721e-4919-9713-a92c82ad189b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:47:47 np0005588920 nova_compute[226886]: 2026-01-20 14:47:47.403 226890 DEBUG nova.network.neutron [req-fd34f402-c720-4127-83c3-6789282bb8e3 req-76a6d4c6-0dcc-4359-a4dc-0ff4cb9b16b4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Refreshing network info cache for port efc8b363-e70d-42f6-9be8-99865e269ec9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:47:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e239 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:47:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:47.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:47 np0005588920 podman[262938]: 2026-01-20 14:47:47.983871579 +0000 UTC m=+0.061890597 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:47:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:47:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:48.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:47:48 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e240 e240: 3 total, 3 up, 3 in
Jan 20 09:47:48 np0005588920 nova_compute[226886]: 2026-01-20 14:47:48.946 226890 DEBUG nova.network.neutron [req-fd34f402-c720-4127-83c3-6789282bb8e3 req-76a6d4c6-0dcc-4359-a4dc-0ff4cb9b16b4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Updated VIF entry in instance network info cache for port efc8b363-e70d-42f6-9be8-99865e269ec9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:47:48 np0005588920 nova_compute[226886]: 2026-01-20 14:47:48.947 226890 DEBUG nova.network.neutron [req-fd34f402-c720-4127-83c3-6789282bb8e3 req-76a6d4c6-0dcc-4359-a4dc-0ff4cb9b16b4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Updating instance_info_cache with network_info: [{"id": "efc8b363-e70d-42f6-9be8-99865e269ec9", "address": "fa:16:3e:36:66:1d", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefc8b363-e7", "ovs_interfaceid": "efc8b363-e70d-42f6-9be8-99865e269ec9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:47:48 np0005588920 nova_compute[226886]: 2026-01-20 14:47:48.971 226890 DEBUG oslo_concurrency.lockutils [req-fd34f402-c720-4127-83c3-6789282bb8e3 req-76a6d4c6-0dcc-4359-a4dc-0ff4cb9b16b4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-9beb3ec3-721e-4919-9713-a92c82ad189b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:47:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:49.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:49 np0005588920 nova_compute[226886]: 2026-01-20 14:47:49.971 226890 DEBUG oslo_concurrency.lockutils [None req-78c57d6e-f9b9-4f93-bbb4-c780048e481a 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Acquiring lock "bf7690ac-9b5a-41e3-83bf-3c83cbacc45c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:47:49 np0005588920 nova_compute[226886]: 2026-01-20 14:47:49.972 226890 DEBUG oslo_concurrency.lockutils [None req-78c57d6e-f9b9-4f93-bbb4-c780048e481a 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "bf7690ac-9b5a-41e3-83bf-3c83cbacc45c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:47:49 np0005588920 nova_compute[226886]: 2026-01-20 14:47:49.972 226890 DEBUG oslo_concurrency.lockutils [None req-78c57d6e-f9b9-4f93-bbb4-c780048e481a 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Acquiring lock "bf7690ac-9b5a-41e3-83bf-3c83cbacc45c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:47:49 np0005588920 nova_compute[226886]: 2026-01-20 14:47:49.972 226890 DEBUG oslo_concurrency.lockutils [None req-78c57d6e-f9b9-4f93-bbb4-c780048e481a 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "bf7690ac-9b5a-41e3-83bf-3c83cbacc45c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:47:49 np0005588920 nova_compute[226886]: 2026-01-20 14:47:49.973 226890 DEBUG oslo_concurrency.lockutils [None req-78c57d6e-f9b9-4f93-bbb4-c780048e481a 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "bf7690ac-9b5a-41e3-83bf-3c83cbacc45c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:47:49 np0005588920 nova_compute[226886]: 2026-01-20 14:47:49.974 226890 INFO nova.compute.manager [None req-78c57d6e-f9b9-4f93-bbb4-c780048e481a 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] Terminating instance#033[00m
Jan 20 09:47:49 np0005588920 nova_compute[226886]: 2026-01-20 14:47:49.975 226890 DEBUG nova.compute.manager [None req-78c57d6e-f9b9-4f93-bbb4-c780048e481a 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:47:50 np0005588920 kernel: tap5659965f-04 (unregistering): left promiscuous mode
Jan 20 09:47:50 np0005588920 NetworkManager[49076]: <info>  [1768920470.0390] device (tap5659965f-04): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:47:50 np0005588920 nova_compute[226886]: 2026-01-20 14:47:50.046 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:50 np0005588920 ovn_controller[133971]: 2026-01-20T14:47:50Z|00434|binding|INFO|Releasing lport 5659965f-0485-4982-898c-f273d7898a5f from this chassis (sb_readonly=0)
Jan 20 09:47:50 np0005588920 ovn_controller[133971]: 2026-01-20T14:47:50Z|00435|binding|INFO|Setting lport 5659965f-0485-4982-898c-f273d7898a5f down in Southbound
Jan 20 09:47:50 np0005588920 ovn_controller[133971]: 2026-01-20T14:47:50Z|00436|binding|INFO|Removing iface tap5659965f-04 ovn-installed in OVS
Jan 20 09:47:50 np0005588920 nova_compute[226886]: 2026-01-20 14:47:50.051 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:50.055 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b7:cb:a9 10.100.0.4'], port_security=['fa:16:3e:b7:cb:a9 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'bf7690ac-9b5a-41e3-83bf-3c83cbacc45c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbd5d614-a7d3-4563-913c-104506628e59', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3b31139b2a4e49cba5e7048febf901c4', 'neutron:revision_number': '7', 'neutron:security_group_ids': '117d6f57-074c-4b36-b375-42e0ab117254', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c42c6982-be52-495a-8746-42a46932572f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=5659965f-0485-4982-898c-f273d7898a5f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:47:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:50.057 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 5659965f-0485-4982-898c-f273d7898a5f in datapath fbd5d614-a7d3-4563-913c-104506628e59 unbound from our chassis#033[00m
Jan 20 09:47:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:50.059 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fbd5d614-a7d3-4563-913c-104506628e59, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:47:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:50.060 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[75108ed1-5724-43e0-9c4c-58e33fb35bcd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:50.060 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59 namespace which is not needed anymore#033[00m
Jan 20 09:47:50 np0005588920 nova_compute[226886]: 2026-01-20 14:47:50.066 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:50 np0005588920 systemd[1]: machine-qemu\x2d45\x2dinstance\x2d00000063.scope: Deactivated successfully.
Jan 20 09:47:50 np0005588920 systemd[1]: machine-qemu\x2d45\x2dinstance\x2d00000063.scope: Consumed 11.604s CPU time.
Jan 20 09:47:50 np0005588920 systemd-machined[196121]: Machine qemu-45-instance-00000063 terminated.
Jan 20 09:47:50 np0005588920 nova_compute[226886]: 2026-01-20 14:47:50.117 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:50 np0005588920 neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59[262711]: [NOTICE]   (262715) : haproxy version is 2.8.14-c23fe91
Jan 20 09:47:50 np0005588920 neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59[262711]: [NOTICE]   (262715) : path to executable is /usr/sbin/haproxy
Jan 20 09:47:50 np0005588920 neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59[262711]: [WARNING]  (262715) : Exiting Master process...
Jan 20 09:47:50 np0005588920 neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59[262711]: [ALERT]    (262715) : Current worker (262717) exited with code 143 (Terminated)
Jan 20 09:47:50 np0005588920 neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59[262711]: [WARNING]  (262715) : All workers exited. Exiting... (0)
Jan 20 09:47:50 np0005588920 systemd[1]: libpod-02236f43122d6f682dc80eed97255ce80a62371b4074a29b9a2e33655f87a005.scope: Deactivated successfully.
Jan 20 09:47:50 np0005588920 podman[262980]: 2026-01-20 14:47:50.197437315 +0000 UTC m=+0.052171046 container died 02236f43122d6f682dc80eed97255ce80a62371b4074a29b9a2e33655f87a005 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:47:50 np0005588920 nova_compute[226886]: 2026-01-20 14:47:50.207 226890 DEBUG oslo_concurrency.lockutils [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "refresh_cache-9beb3ec3-721e-4919-9713-a92c82ad189b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:47:50 np0005588920 nova_compute[226886]: 2026-01-20 14:47:50.208 226890 DEBUG oslo_concurrency.lockutils [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquired lock "refresh_cache-9beb3ec3-721e-4919-9713-a92c82ad189b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:47:50 np0005588920 nova_compute[226886]: 2026-01-20 14:47:50.208 226890 DEBUG nova.network.neutron [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:47:50 np0005588920 nova_compute[226886]: 2026-01-20 14:47:50.215 226890 INFO nova.virt.libvirt.driver [-] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] Instance destroyed successfully.#033[00m
Jan 20 09:47:50 np0005588920 nova_compute[226886]: 2026-01-20 14:47:50.215 226890 DEBUG nova.objects.instance [None req-78c57d6e-f9b9-4f93-bbb4-c780048e481a 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lazy-loading 'resources' on Instance uuid bf7690ac-9b5a-41e3-83bf-3c83cbacc45c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:47:50 np0005588920 nova_compute[226886]: 2026-01-20 14:47:50.225 226890 DEBUG nova.virt.libvirt.vif [None req-78c57d6e-f9b9-4f93-bbb4-c780048e481a 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:47:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1331472194',display_name='tempest-DeleteServersTestJSON-server-1331472194',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1331472194',id=99,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:47:38Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3b31139b2a4e49cba5e7048febf901c4',ramdisk_id='',reservation_id='r-5fm4q1fz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virt
io',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-DeleteServersTestJSON-1162922273',owner_user_name='tempest-DeleteServersTestJSON-1162922273-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:47:38Z,user_data=None,user_id='37e9ef97fbe0448e9fbe32d48b66211f',uuid=bf7690ac-9b5a-41e3-83bf-3c83cbacc45c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "5659965f-0485-4982-898c-f273d7898a5f", "address": "fa:16:3e:b7:cb:a9", "network": {"id": "fbd5d614-a7d3-4563-913c-104506628e59", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-60721994-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b31139b2a4e49cba5e7048febf901c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5659965f-04", "ovs_interfaceid": "5659965f-0485-4982-898c-f273d7898a5f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:47:50 np0005588920 nova_compute[226886]: 2026-01-20 14:47:50.226 226890 DEBUG nova.network.os_vif_util [None req-78c57d6e-f9b9-4f93-bbb4-c780048e481a 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Converting VIF {"id": "5659965f-0485-4982-898c-f273d7898a5f", "address": "fa:16:3e:b7:cb:a9", "network": {"id": "fbd5d614-a7d3-4563-913c-104506628e59", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-60721994-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b31139b2a4e49cba5e7048febf901c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5659965f-04", "ovs_interfaceid": "5659965f-0485-4982-898c-f273d7898a5f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:47:50 np0005588920 nova_compute[226886]: 2026-01-20 14:47:50.227 226890 DEBUG nova.network.os_vif_util [None req-78c57d6e-f9b9-4f93-bbb4-c780048e481a 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b7:cb:a9,bridge_name='br-int',has_traffic_filtering=True,id=5659965f-0485-4982-898c-f273d7898a5f,network=Network(fbd5d614-a7d3-4563-913c-104506628e59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5659965f-04') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:47:50 np0005588920 nova_compute[226886]: 2026-01-20 14:47:50.228 226890 DEBUG os_vif [None req-78c57d6e-f9b9-4f93-bbb4-c780048e481a 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b7:cb:a9,bridge_name='br-int',has_traffic_filtering=True,id=5659965f-0485-4982-898c-f273d7898a5f,network=Network(fbd5d614-a7d3-4563-913c-104506628e59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5659965f-04') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:47:50 np0005588920 nova_compute[226886]: 2026-01-20 14:47:50.229 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:50 np0005588920 nova_compute[226886]: 2026-01-20 14:47:50.230 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5659965f-04, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:47:50 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-02236f43122d6f682dc80eed97255ce80a62371b4074a29b9a2e33655f87a005-userdata-shm.mount: Deactivated successfully.
Jan 20 09:47:50 np0005588920 nova_compute[226886]: 2026-01-20 14:47:50.231 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:50 np0005588920 nova_compute[226886]: 2026-01-20 14:47:50.233 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:50 np0005588920 nova_compute[226886]: 2026-01-20 14:47:50.235 226890 INFO os_vif [None req-78c57d6e-f9b9-4f93-bbb4-c780048e481a 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b7:cb:a9,bridge_name='br-int',has_traffic_filtering=True,id=5659965f-0485-4982-898c-f273d7898a5f,network=Network(fbd5d614-a7d3-4563-913c-104506628e59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5659965f-04')#033[00m
Jan 20 09:47:50 np0005588920 systemd[1]: var-lib-containers-storage-overlay-269d97b051d0a627ede9217472492a1759cef0aba2acb311a14ac2c0c4af46c8-merged.mount: Deactivated successfully.
Jan 20 09:47:50 np0005588920 podman[262980]: 2026-01-20 14:47:50.246164303 +0000 UTC m=+0.100898034 container cleanup 02236f43122d6f682dc80eed97255ce80a62371b4074a29b9a2e33655f87a005 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 20 09:47:50 np0005588920 systemd[1]: libpod-conmon-02236f43122d6f682dc80eed97255ce80a62371b4074a29b9a2e33655f87a005.scope: Deactivated successfully.
Jan 20 09:47:50 np0005588920 nova_compute[226886]: 2026-01-20 14:47:50.291 226890 DEBUG nova.compute.manager [req-4f566a9b-abe9-4e8c-827c-3cf1946e4c67 req-052f3c47-b413-4d11-812a-1d38450e8d3c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] Received event network-vif-unplugged-5659965f-0485-4982-898c-f273d7898a5f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:47:50 np0005588920 nova_compute[226886]: 2026-01-20 14:47:50.292 226890 DEBUG oslo_concurrency.lockutils [req-4f566a9b-abe9-4e8c-827c-3cf1946e4c67 req-052f3c47-b413-4d11-812a-1d38450e8d3c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "bf7690ac-9b5a-41e3-83bf-3c83cbacc45c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:47:50 np0005588920 nova_compute[226886]: 2026-01-20 14:47:50.292 226890 DEBUG oslo_concurrency.lockutils [req-4f566a9b-abe9-4e8c-827c-3cf1946e4c67 req-052f3c47-b413-4d11-812a-1d38450e8d3c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "bf7690ac-9b5a-41e3-83bf-3c83cbacc45c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:47:50 np0005588920 nova_compute[226886]: 2026-01-20 14:47:50.292 226890 DEBUG oslo_concurrency.lockutils [req-4f566a9b-abe9-4e8c-827c-3cf1946e4c67 req-052f3c47-b413-4d11-812a-1d38450e8d3c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "bf7690ac-9b5a-41e3-83bf-3c83cbacc45c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:47:50 np0005588920 nova_compute[226886]: 2026-01-20 14:47:50.293 226890 DEBUG nova.compute.manager [req-4f566a9b-abe9-4e8c-827c-3cf1946e4c67 req-052f3c47-b413-4d11-812a-1d38450e8d3c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] No waiting events found dispatching network-vif-unplugged-5659965f-0485-4982-898c-f273d7898a5f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:47:50 np0005588920 nova_compute[226886]: 2026-01-20 14:47:50.293 226890 WARNING nova.compute.manager [req-4f566a9b-abe9-4e8c-827c-3cf1946e4c67 req-052f3c47-b413-4d11-812a-1d38450e8d3c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] Received unexpected event network-vif-unplugged-5659965f-0485-4982-898c-f273d7898a5f for instance with vm_state active and task_state None.#033[00m
Jan 20 09:47:50 np0005588920 podman[263032]: 2026-01-20 14:47:50.308792829 +0000 UTC m=+0.041645482 container remove 02236f43122d6f682dc80eed97255ce80a62371b4074a29b9a2e33655f87a005 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 20 09:47:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:50.314 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[3bf620e5-20c2-45d6-8a4c-6334ec2102d2]: (4, ('Tue Jan 20 02:47:50 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59 (02236f43122d6f682dc80eed97255ce80a62371b4074a29b9a2e33655f87a005)\n02236f43122d6f682dc80eed97255ce80a62371b4074a29b9a2e33655f87a005\nTue Jan 20 02:47:50 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59 (02236f43122d6f682dc80eed97255ce80a62371b4074a29b9a2e33655f87a005)\n02236f43122d6f682dc80eed97255ce80a62371b4074a29b9a2e33655f87a005\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:50.316 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[145298b2-3188-4846-a941-4f9b47bf2abd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:50.317 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbd5d614-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:47:50 np0005588920 nova_compute[226886]: 2026-01-20 14:47:50.318 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:50 np0005588920 kernel: tapfbd5d614-a0: left promiscuous mode
Jan 20 09:47:50 np0005588920 nova_compute[226886]: 2026-01-20 14:47:50.320 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:50 np0005588920 nova_compute[226886]: 2026-01-20 14:47:50.336 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:50.337 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b4da81b3-ae08-4928-bc93-9f4ce0a4e126]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:50.353 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[54b4be91-f92f-4e07-aa62-7f8f79f6dfd1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:50.354 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[bcb74f4c-a922-48af-ae62-1d9311726b85]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:50.370 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[4902102f-2645-4156-9bdf-59ecd797b6c3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 549975, 'reachable_time': 30752, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 263050, 'error': None, 'target': 'ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:50 np0005588920 systemd[1]: run-netns-ovnmeta\x2dfbd5d614\x2da7d3\x2d4563\x2d913c\x2d104506628e59.mount: Deactivated successfully.
Jan 20 09:47:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:50.375 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fbd5d614-a7d3-4563-913c-104506628e59 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:47:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:47:50.375 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[99de33c7-cfba-4fe7-ab74-77720c787cca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:47:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:50.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:47:50 np0005588920 nova_compute[226886]: 2026-01-20 14:47:50.655 226890 INFO nova.virt.libvirt.driver [None req-78c57d6e-f9b9-4f93-bbb4-c780048e481a 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] Deleting instance files /var/lib/nova/instances/bf7690ac-9b5a-41e3-83bf-3c83cbacc45c_del#033[00m
Jan 20 09:47:50 np0005588920 nova_compute[226886]: 2026-01-20 14:47:50.657 226890 INFO nova.virt.libvirt.driver [None req-78c57d6e-f9b9-4f93-bbb4-c780048e481a 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] Deletion of /var/lib/nova/instances/bf7690ac-9b5a-41e3-83bf-3c83cbacc45c_del complete#033[00m
Jan 20 09:47:50 np0005588920 nova_compute[226886]: 2026-01-20 14:47:50.665 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:50 np0005588920 nova_compute[226886]: 2026-01-20 14:47:50.725 226890 INFO nova.compute.manager [None req-78c57d6e-f9b9-4f93-bbb4-c780048e481a 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] Took 0.75 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:47:50 np0005588920 nova_compute[226886]: 2026-01-20 14:47:50.725 226890 DEBUG oslo.service.loopingcall [None req-78c57d6e-f9b9-4f93-bbb4-c780048e481a 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:47:50 np0005588920 nova_compute[226886]: 2026-01-20 14:47:50.726 226890 DEBUG nova.compute.manager [-] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:47:50 np0005588920 nova_compute[226886]: 2026-01-20 14:47:50.726 226890 DEBUG nova.network.neutron [-] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:47:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:51.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:51 np0005588920 nova_compute[226886]: 2026-01-20 14:47:51.702 226890 DEBUG nova.network.neutron [-] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:47:51 np0005588920 nova_compute[226886]: 2026-01-20 14:47:51.719 226890 DEBUG nova.network.neutron [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Updating instance_info_cache with network_info: [{"id": "efc8b363-e70d-42f6-9be8-99865e269ec9", "address": "fa:16:3e:36:66:1d", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefc8b363-e7", "ovs_interfaceid": "efc8b363-e70d-42f6-9be8-99865e269ec9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:47:51 np0005588920 nova_compute[226886]: 2026-01-20 14:47:51.732 226890 INFO nova.compute.manager [-] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] Took 1.01 seconds to deallocate network for instance.#033[00m
Jan 20 09:47:51 np0005588920 nova_compute[226886]: 2026-01-20 14:47:51.773 226890 DEBUG oslo_concurrency.lockutils [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Releasing lock "refresh_cache-9beb3ec3-721e-4919-9713-a92c82ad189b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:47:51 np0005588920 nova_compute[226886]: 2026-01-20 14:47:51.808 226890 DEBUG oslo_concurrency.lockutils [None req-78c57d6e-f9b9-4f93-bbb4-c780048e481a 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:47:51 np0005588920 nova_compute[226886]: 2026-01-20 14:47:51.809 226890 DEBUG oslo_concurrency.lockutils [None req-78c57d6e-f9b9-4f93-bbb4-c780048e481a 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:47:51 np0005588920 nova_compute[226886]: 2026-01-20 14:47:51.841 226890 DEBUG nova.compute.manager [req-f887a178-e84c-4f6e-8fd9-c867e7700a93 req-15ef12e4-51b8-4bab-bf15-dd30b80dd954 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] Received event network-vif-deleted-5659965f-0485-4982-898c-f273d7898a5f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:47:51 np0005588920 nova_compute[226886]: 2026-01-20 14:47:51.944 226890 DEBUG nova.virt.libvirt.driver [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Jan 20 09:47:51 np0005588920 nova_compute[226886]: 2026-01-20 14:47:51.945 226890 DEBUG nova.virt.libvirt.volume.remotefs [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Creating file /var/lib/nova/instances/9beb3ec3-721e-4919-9713-a92c82ad189b/8603e16117fc4d04877f9a7da5f70a5c.tmp on remote host 192.168.122.101 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Jan 20 09:47:51 np0005588920 nova_compute[226886]: 2026-01-20 14:47:51.946 226890 DEBUG oslo_concurrency.processutils [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/9beb3ec3-721e-4919-9713-a92c82ad189b/8603e16117fc4d04877f9a7da5f70a5c.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:47:51 np0005588920 nova_compute[226886]: 2026-01-20 14:47:51.988 226890 DEBUG oslo_concurrency.processutils [None req-78c57d6e-f9b9-4f93-bbb4-c780048e481a 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:47:52 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 09:47:52 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3000.0 total, 600.0 interval#012Cumulative writes: 8502 writes, 43K keys, 8502 commit groups, 1.0 writes per commit group, ingest: 0.08 GB, 0.03 MB/s#012Cumulative WAL: 8502 writes, 8502 syncs, 1.00 writes per sync, written: 0.08 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1599 writes, 7940 keys, 1599 commit groups, 1.0 writes per commit group, ingest: 15.68 MB, 0.03 MB/s#012Interval WAL: 1600 writes, 1600 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     74.1      0.69              0.21        24    0.029       0      0       0.0       0.0#012  L6      1/0    9.23 MB   0.0      0.2     0.1      0.2       0.2      0.0       0.0   3.9    113.2     93.5      2.15              0.73        23    0.093    128K    12K       0.0       0.0#012 Sum      1/0    9.23 MB   0.0      0.2     0.1      0.2       0.2      0.1       0.0   4.9     85.5     88.8      2.85              0.94        47    0.061    128K    12K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   6.0     99.6     98.9      0.71              0.20        12    0.059     42K   3583       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) 
Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.2     0.1      0.2       0.2      0.0       0.0   0.0    113.2     93.5      2.15              0.73        23    0.093    128K    12K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     74.3      0.69              0.21        23    0.030       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 3000.0 total, 600.0 interval#012Flush(GB): cumulative 0.050, interval 0.011#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.25 GB write, 0.08 MB/s write, 0.24 GB read, 0.08 MB/s read, 2.8 seconds#012Interval compaction: 0.07 GB write, 0.12 MB/s write, 0.07 GB read, 0.12 MB/s read, 0.7 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x564a2f9711f0#2 capacity: 304.00 MB usage: 28.03 MB table_size: 0 occupancy: 18446744073709551615 collections: 6 last_copies: 0 last_secs: 0.000261 secs_since: 0#012Block cache entry stats(count,size,portion): 
DataBlock(1620,27.05 MB,8.8985%) FilterBlock(47,366.80 KB,0.117829%) IndexBlock(47,634.67 KB,0.203881%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 20 09:47:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:47:52 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/919004951' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:47:52 np0005588920 nova_compute[226886]: 2026-01-20 14:47:52.404 226890 DEBUG oslo_concurrency.processutils [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/9beb3ec3-721e-4919-9713-a92c82ad189b/8603e16117fc4d04877f9a7da5f70a5c.tmp" returned: 1 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:47:52 np0005588920 nova_compute[226886]: 2026-01-20 14:47:52.405 226890 DEBUG oslo_concurrency.processutils [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] 'ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/9beb3ec3-721e-4919-9713-a92c82ad189b/8603e16117fc4d04877f9a7da5f70a5c.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 20 09:47:52 np0005588920 nova_compute[226886]: 2026-01-20 14:47:52.405 226890 DEBUG nova.virt.libvirt.volume.remotefs [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Creating directory /var/lib/nova/instances/9beb3ec3-721e-4919-9713-a92c82ad189b on remote host 192.168.122.101 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Jan 20 09:47:52 np0005588920 nova_compute[226886]: 2026-01-20 14:47:52.405 226890 DEBUG oslo_concurrency.processutils [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/9beb3ec3-721e-4919-9713-a92c82ad189b execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:47:52 np0005588920 nova_compute[226886]: 2026-01-20 14:47:52.431 226890 DEBUG oslo_concurrency.processutils [None req-78c57d6e-f9b9-4f93-bbb4-c780048e481a 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:47:52 np0005588920 nova_compute[226886]: 2026-01-20 14:47:52.433 226890 DEBUG nova.compute.manager [req-6cfce73f-6457-47c5-be92-1044b6eabc95 req-bfa82b71-22b1-476d-89d9-d78560eed692 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] Received event network-vif-plugged-5659965f-0485-4982-898c-f273d7898a5f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:47:52 np0005588920 nova_compute[226886]: 2026-01-20 14:47:52.434 226890 DEBUG oslo_concurrency.lockutils [req-6cfce73f-6457-47c5-be92-1044b6eabc95 req-bfa82b71-22b1-476d-89d9-d78560eed692 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "bf7690ac-9b5a-41e3-83bf-3c83cbacc45c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:47:52 np0005588920 nova_compute[226886]: 2026-01-20 14:47:52.434 226890 DEBUG oslo_concurrency.lockutils [req-6cfce73f-6457-47c5-be92-1044b6eabc95 req-bfa82b71-22b1-476d-89d9-d78560eed692 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "bf7690ac-9b5a-41e3-83bf-3c83cbacc45c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:47:52 np0005588920 nova_compute[226886]: 2026-01-20 14:47:52.434 226890 DEBUG oslo_concurrency.lockutils [req-6cfce73f-6457-47c5-be92-1044b6eabc95 req-bfa82b71-22b1-476d-89d9-d78560eed692 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "bf7690ac-9b5a-41e3-83bf-3c83cbacc45c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:47:52 np0005588920 nova_compute[226886]: 2026-01-20 14:47:52.435 226890 DEBUG nova.compute.manager [req-6cfce73f-6457-47c5-be92-1044b6eabc95 req-bfa82b71-22b1-476d-89d9-d78560eed692 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] No waiting events found dispatching network-vif-plugged-5659965f-0485-4982-898c-f273d7898a5f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:47:52 np0005588920 nova_compute[226886]: 2026-01-20 14:47:52.435 226890 WARNING nova.compute.manager [req-6cfce73f-6457-47c5-be92-1044b6eabc95 req-bfa82b71-22b1-476d-89d9-d78560eed692 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] Received unexpected event network-vif-plugged-5659965f-0485-4982-898c-f273d7898a5f for instance with vm_state deleted and task_state None.#033[00m
Jan 20 09:47:52 np0005588920 nova_compute[226886]: 2026-01-20 14:47:52.441 226890 DEBUG nova.compute.provider_tree [None req-78c57d6e-f9b9-4f93-bbb4-c780048e481a 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:47:52 np0005588920 nova_compute[226886]: 2026-01-20 14:47:52.454 226890 DEBUG nova.scheduler.client.report [None req-78c57d6e-f9b9-4f93-bbb4-c780048e481a 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:47:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:52.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:52 np0005588920 nova_compute[226886]: 2026-01-20 14:47:52.474 226890 DEBUG oslo_concurrency.lockutils [None req-78c57d6e-f9b9-4f93-bbb4-c780048e481a 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.665s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:47:52 np0005588920 nova_compute[226886]: 2026-01-20 14:47:52.505 226890 INFO nova.scheduler.client.report [None req-78c57d6e-f9b9-4f93-bbb4-c780048e481a 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Deleted allocations for instance bf7690ac-9b5a-41e3-83bf-3c83cbacc45c#033[00m
Jan 20 09:47:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:47:52 np0005588920 nova_compute[226886]: 2026-01-20 14:47:52.605 226890 DEBUG oslo_concurrency.lockutils [None req-78c57d6e-f9b9-4f93-bbb4-c780048e481a 37e9ef97fbe0448e9fbe32d48b66211f 3b31139b2a4e49cba5e7048febf901c4 - - default default] Lock "bf7690ac-9b5a-41e3-83bf-3c83cbacc45c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.633s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:47:52 np0005588920 nova_compute[226886]: 2026-01-20 14:47:52.606 226890 DEBUG oslo_concurrency.processutils [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/9beb3ec3-721e-4919-9713-a92c82ad189b" returned: 0 in 0.201s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:47:52 np0005588920 nova_compute[226886]: 2026-01-20 14:47:52.611 226890 DEBUG nova.virt.libvirt.driver [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 20 09:47:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:53.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:54.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:55 np0005588920 nova_compute[226886]: 2026-01-20 14:47:55.233 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:55 np0005588920 nova_compute[226886]: 2026-01-20 14:47:55.662 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:55.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:56.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:47:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:47:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:57.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:47:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:47:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:58.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:47:59 np0005588920 ovn_controller[133971]: 2026-01-20T14:47:59Z|00057|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:36:66:1d 10.100.0.8
Jan 20 09:47:59 np0005588920 ovn_controller[133971]: 2026-01-20T14:47:59Z|00058|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:36:66:1d 10.100.0.8
Jan 20 09:47:59 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e241 e241: 3 total, 3 up, 3 in
Jan 20 09:47:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:47:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:59.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:00 np0005588920 nova_compute[226886]: 2026-01-20 14:48:00.237 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:00.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:00 np0005588920 nova_compute[226886]: 2026-01-20 14:48:00.664 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:01.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:48:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:02.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:48:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e241 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:48:02 np0005588920 nova_compute[226886]: 2026-01-20 14:48:02.655 226890 DEBUG nova.virt.libvirt.driver [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 20 09:48:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:03.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:04.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:04 np0005588920 ovn_controller[133971]: 2026-01-20T14:48:04Z|00437|binding|INFO|Releasing lport 9e775c45-1646-436d-a0cb-a5b5ec356e1b from this chassis (sb_readonly=0)
Jan 20 09:48:04 np0005588920 ovn_controller[133971]: 2026-01-20T14:48:04Z|00438|binding|INFO|Releasing lport 5527ab8d-a985-420b-9d5b-7e5d9baf7004 from this chassis (sb_readonly=0)
Jan 20 09:48:05 np0005588920 nova_compute[226886]: 2026-01-20 14:48:05.075 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:05 np0005588920 nova_compute[226886]: 2026-01-20 14:48:05.214 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768920470.2125762, bf7690ac-9b5a-41e3-83bf-3c83cbacc45c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:48:05 np0005588920 nova_compute[226886]: 2026-01-20 14:48:05.214 226890 INFO nova.compute.manager [-] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:48:05 np0005588920 nova_compute[226886]: 2026-01-20 14:48:05.239 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:05 np0005588920 nova_compute[226886]: 2026-01-20 14:48:05.249 226890 DEBUG nova.compute.manager [None req-9ca3d3bf-3c4e-4fc2-bd4c-65e56b1c1bb7 - - - - - -] [instance: bf7690ac-9b5a-41e3-83bf-3c83cbacc45c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:48:05 np0005588920 nova_compute[226886]: 2026-01-20 14:48:05.666 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:05.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:06 np0005588920 kernel: tapefc8b363-e7 (unregistering): left promiscuous mode
Jan 20 09:48:06 np0005588920 NetworkManager[49076]: <info>  [1768920486.0577] device (tapefc8b363-e7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:48:06 np0005588920 nova_compute[226886]: 2026-01-20 14:48:06.066 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:06 np0005588920 ovn_controller[133971]: 2026-01-20T14:48:06Z|00439|binding|INFO|Releasing lport efc8b363-e70d-42f6-9be8-99865e269ec9 from this chassis (sb_readonly=0)
Jan 20 09:48:06 np0005588920 ovn_controller[133971]: 2026-01-20T14:48:06Z|00440|binding|INFO|Setting lport efc8b363-e70d-42f6-9be8-99865e269ec9 down in Southbound
Jan 20 09:48:06 np0005588920 ovn_controller[133971]: 2026-01-20T14:48:06Z|00441|binding|INFO|Removing iface tapefc8b363-e7 ovn-installed in OVS
Jan 20 09:48:06 np0005588920 nova_compute[226886]: 2026-01-20 14:48:06.068 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:06 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:48:06.072 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:36:66:1d 10.100.0.8'], port_security=['fa:16:3e:36:66:1d 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '9beb3ec3-721e-4919-9713-a92c82ad189b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b683fcc0026242e28ba6d8fba638688e', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7ceb05b5-53ff-444a-b0ef-2ba8294d585b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.213'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=361f9a69-30a6-4be4-89ad-2a8f92877af2, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=efc8b363-e70d-42f6-9be8-99865e269ec9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:48:06 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:48:06.074 144128 INFO neutron.agent.ovn.metadata.agent [-] Port efc8b363-e70d-42f6-9be8-99865e269ec9 in datapath a19e9d1a-864f-41ee-bdea-188e65973ea5 unbound from our chassis#033[00m
Jan 20 09:48:06 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:48:06.075 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a19e9d1a-864f-41ee-bdea-188e65973ea5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:48:06 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:48:06.076 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d7ce7e3b-c5ce-414d-a56b-33b567be4204]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:06 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:48:06.077 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5 namespace which is not needed anymore#033[00m
Jan 20 09:48:06 np0005588920 nova_compute[226886]: 2026-01-20 14:48:06.081 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:06 np0005588920 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d00000065.scope: Deactivated successfully.
Jan 20 09:48:06 np0005588920 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d00000065.scope: Consumed 14.371s CPU time.
Jan 20 09:48:06 np0005588920 systemd-machined[196121]: Machine qemu-46-instance-00000065 terminated.
Jan 20 09:48:06 np0005588920 neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5[262923]: [NOTICE]   (262927) : haproxy version is 2.8.14-c23fe91
Jan 20 09:48:06 np0005588920 neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5[262923]: [NOTICE]   (262927) : path to executable is /usr/sbin/haproxy
Jan 20 09:48:06 np0005588920 neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5[262923]: [WARNING]  (262927) : Exiting Master process...
Jan 20 09:48:06 np0005588920 neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5[262923]: [WARNING]  (262927) : Exiting Master process...
Jan 20 09:48:06 np0005588920 neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5[262923]: [ALERT]    (262927) : Current worker (262929) exited with code 143 (Terminated)
Jan 20 09:48:06 np0005588920 neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5[262923]: [WARNING]  (262927) : All workers exited. Exiting... (0)
Jan 20 09:48:06 np0005588920 systemd[1]: libpod-7e351b44d379e0cf2522819975195a1ca599b0fba9d65a4eadea3a3172553e38.scope: Deactivated successfully.
Jan 20 09:48:06 np0005588920 podman[263099]: 2026-01-20 14:48:06.219059563 +0000 UTC m=+0.049887892 container died 7e351b44d379e0cf2522819975195a1ca599b0fba9d65a4eadea3a3172553e38 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 20 09:48:06 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7e351b44d379e0cf2522819975195a1ca599b0fba9d65a4eadea3a3172553e38-userdata-shm.mount: Deactivated successfully.
Jan 20 09:48:06 np0005588920 systemd[1]: var-lib-containers-storage-overlay-77021e52f09fab56d798e398bd9c68349b9a361e1e99edb6e6e5b24031504467-merged.mount: Deactivated successfully.
Jan 20 09:48:06 np0005588920 nova_compute[226886]: 2026-01-20 14:48:06.286 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:06 np0005588920 podman[263099]: 2026-01-20 14:48:06.293162438 +0000 UTC m=+0.123990727 container cleanup 7e351b44d379e0cf2522819975195a1ca599b0fba9d65a4eadea3a3172553e38 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:48:06 np0005588920 nova_compute[226886]: 2026-01-20 14:48:06.294 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:06 np0005588920 systemd[1]: libpod-conmon-7e351b44d379e0cf2522819975195a1ca599b0fba9d65a4eadea3a3172553e38.scope: Deactivated successfully.
Jan 20 09:48:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:48:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:06.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:48:06 np0005588920 podman[263137]: 2026-01-20 14:48:06.573138493 +0000 UTC m=+0.258873697 container remove 7e351b44d379e0cf2522819975195a1ca599b0fba9d65a4eadea3a3172553e38 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 20 09:48:06 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:48:06.583 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[7cbe00c9-c607-4158-8762-3bdf20c85cb5]: (4, ('Tue Jan 20 02:48:06 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5 (7e351b44d379e0cf2522819975195a1ca599b0fba9d65a4eadea3a3172553e38)\n7e351b44d379e0cf2522819975195a1ca599b0fba9d65a4eadea3a3172553e38\nTue Jan 20 02:48:06 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5 (7e351b44d379e0cf2522819975195a1ca599b0fba9d65a4eadea3a3172553e38)\n7e351b44d379e0cf2522819975195a1ca599b0fba9d65a4eadea3a3172553e38\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:06 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:48:06.585 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[44ed93a2-78b6-46b4-9cd4-9a29b4764d75]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:06 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:48:06.586 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa19e9d1a-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:48:06 np0005588920 nova_compute[226886]: 2026-01-20 14:48:06.634 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:06 np0005588920 kernel: tapa19e9d1a-80: left promiscuous mode
Jan 20 09:48:06 np0005588920 nova_compute[226886]: 2026-01-20 14:48:06.652 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:06 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:48:06.655 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ccec9ae2-b829-4436-a8a3-82f90a98425c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:06 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:48:06.669 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e682e8b3-83f4-4dbd-be5b-6edd14779ae3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:06 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:48:06.670 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[99de7f84-04a5-4417-a23c-4c20be77d1e2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:06 np0005588920 nova_compute[226886]: 2026-01-20 14:48:06.674 226890 INFO nova.virt.libvirt.driver [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Instance shutdown successfully after 14 seconds.#033[00m
Jan 20 09:48:06 np0005588920 nova_compute[226886]: 2026-01-20 14:48:06.679 226890 INFO nova.virt.libvirt.driver [-] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Instance destroyed successfully.#033[00m
Jan 20 09:48:06 np0005588920 nova_compute[226886]: 2026-01-20 14:48:06.680 226890 DEBUG nova.virt.libvirt.vif [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:47:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-757916410',display_name='tempest-ServerActionsTestOtherA-server-757916410',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-757916410',id=101,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDOC6AOV9rhSIyyfBXYGEFvhdWE5GLRuNfs0sPvnXoLHIstQY2OpqwhI35UcFW1s96uqz0+j9sMbdcq/NuNcfgrlnXkEz6j2iO7WUECWdPrQW34JB1FTWAvtA4R6RDoaZA==',key_name='tempest-keypair-1611966828',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:47:42Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b683fcc0026242e28ba6d8fba638688e',ramdisk_id='',reservation_id='r-042ihw9k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',old_vm_state='active',owner_project_name='tempest-ServerActionsTestOtherA-967087071',owner_user_name='tempest-ServerActionsTestOtherA-967087071-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:47:48Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='869086208e10436c9dc96c78bee9a85d',uuid=9beb3ec3-721e-4919-9713-a92c82ad189b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "efc8b363-e70d-42f6-9be8-99865e269ec9", "address": "fa:16:3e:36:66:1d", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherA-916311998-network", "vif_mac": "fa:16:3e:36:66:1d"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefc8b363-e7", "ovs_interfaceid": "efc8b363-e70d-42f6-9be8-99865e269ec9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:48:06 np0005588920 nova_compute[226886]: 2026-01-20 14:48:06.680 226890 DEBUG nova.network.os_vif_util [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Converting VIF {"id": "efc8b363-e70d-42f6-9be8-99865e269ec9", "address": "fa:16:3e:36:66:1d", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherA-916311998-network", "vif_mac": "fa:16:3e:36:66:1d"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefc8b363-e7", "ovs_interfaceid": "efc8b363-e70d-42f6-9be8-99865e269ec9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:48:06 np0005588920 nova_compute[226886]: 2026-01-20 14:48:06.681 226890 DEBUG nova.network.os_vif_util [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:36:66:1d,bridge_name='br-int',has_traffic_filtering=True,id=efc8b363-e70d-42f6-9be8-99865e269ec9,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapefc8b363-e7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:48:06 np0005588920 nova_compute[226886]: 2026-01-20 14:48:06.681 226890 DEBUG os_vif [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:36:66:1d,bridge_name='br-int',has_traffic_filtering=True,id=efc8b363-e70d-42f6-9be8-99865e269ec9,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapefc8b363-e7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:48:06 np0005588920 nova_compute[226886]: 2026-01-20 14:48:06.682 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:06 np0005588920 nova_compute[226886]: 2026-01-20 14:48:06.683 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapefc8b363-e7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:48:06 np0005588920 nova_compute[226886]: 2026-01-20 14:48:06.684 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:06 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:48:06.686 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[7f6b0f4c-308c-4f19-a9db-edf0254cf8a5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 550328, 'reachable_time': 29087, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 263155, 'error': None, 'target': 'ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:06 np0005588920 nova_compute[226886]: 2026-01-20 14:48:06.688 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:48:06 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:48:06.689 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:48:06 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:48:06.689 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[b3894243-2d2c-4b01-bd53-c8d0f2eccfdd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:06 np0005588920 systemd[1]: run-netns-ovnmeta\x2da19e9d1a\x2d864f\x2d41ee\x2dbdea\x2d188e65973ea5.mount: Deactivated successfully.
Jan 20 09:48:06 np0005588920 nova_compute[226886]: 2026-01-20 14:48:06.691 226890 INFO os_vif [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:36:66:1d,bridge_name='br-int',has_traffic_filtering=True,id=efc8b363-e70d-42f6-9be8-99865e269ec9,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapefc8b363-e7')#033[00m
Jan 20 09:48:06 np0005588920 nova_compute[226886]: 2026-01-20 14:48:06.696 226890 DEBUG nova.virt.libvirt.driver [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] skipping disk for instance-00000065 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:48:06 np0005588920 nova_compute[226886]: 2026-01-20 14:48:06.697 226890 DEBUG nova.virt.libvirt.driver [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] skipping disk for instance-00000065 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:48:06 np0005588920 nova_compute[226886]: 2026-01-20 14:48:06.853 226890 DEBUG nova.compute.manager [req-097c2d3f-04a2-477f-be4d-2ec10e58f733 req-0fac2271-8ace-4584-b943-f32d437f62b6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Received event network-vif-unplugged-efc8b363-e70d-42f6-9be8-99865e269ec9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:48:06 np0005588920 nova_compute[226886]: 2026-01-20 14:48:06.854 226890 DEBUG oslo_concurrency.lockutils [req-097c2d3f-04a2-477f-be4d-2ec10e58f733 req-0fac2271-8ace-4584-b943-f32d437f62b6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "9beb3ec3-721e-4919-9713-a92c82ad189b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:06 np0005588920 nova_compute[226886]: 2026-01-20 14:48:06.854 226890 DEBUG oslo_concurrency.lockutils [req-097c2d3f-04a2-477f-be4d-2ec10e58f733 req-0fac2271-8ace-4584-b943-f32d437f62b6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "9beb3ec3-721e-4919-9713-a92c82ad189b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:06 np0005588920 nova_compute[226886]: 2026-01-20 14:48:06.854 226890 DEBUG oslo_concurrency.lockutils [req-097c2d3f-04a2-477f-be4d-2ec10e58f733 req-0fac2271-8ace-4584-b943-f32d437f62b6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "9beb3ec3-721e-4919-9713-a92c82ad189b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:06 np0005588920 nova_compute[226886]: 2026-01-20 14:48:06.854 226890 DEBUG nova.compute.manager [req-097c2d3f-04a2-477f-be4d-2ec10e58f733 req-0fac2271-8ace-4584-b943-f32d437f62b6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] No waiting events found dispatching network-vif-unplugged-efc8b363-e70d-42f6-9be8-99865e269ec9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:48:06 np0005588920 nova_compute[226886]: 2026-01-20 14:48:06.855 226890 WARNING nova.compute.manager [req-097c2d3f-04a2-477f-be4d-2ec10e58f733 req-0fac2271-8ace-4584-b943-f32d437f62b6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Received unexpected event network-vif-unplugged-efc8b363-e70d-42f6-9be8-99865e269ec9 for instance with vm_state active and task_state resize_migrating.#033[00m
Jan 20 09:48:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e241 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:48:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:07.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:07 np0005588920 nova_compute[226886]: 2026-01-20 14:48:07.937 226890 DEBUG neutronclient.v2_0.client [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port efc8b363-e70d-42f6-9be8-99865e269ec9 for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Jan 20 09:48:08 np0005588920 nova_compute[226886]: 2026-01-20 14:48:08.048 226890 DEBUG oslo_concurrency.lockutils [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "9beb3ec3-721e-4919-9713-a92c82ad189b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:08 np0005588920 nova_compute[226886]: 2026-01-20 14:48:08.049 226890 DEBUG oslo_concurrency.lockutils [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "9beb3ec3-721e-4919-9713-a92c82ad189b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:08 np0005588920 nova_compute[226886]: 2026-01-20 14:48:08.049 226890 DEBUG oslo_concurrency.lockutils [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "9beb3ec3-721e-4919-9713-a92c82ad189b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:08.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:08 np0005588920 nova_compute[226886]: 2026-01-20 14:48:08.979 226890 DEBUG nova.compute.manager [req-e20e56d3-b184-482a-bf1c-4bef38a10f5b req-a76ed729-c047-4567-b166-ff20c00ca3d6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Received event network-vif-plugged-efc8b363-e70d-42f6-9be8-99865e269ec9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:48:08 np0005588920 nova_compute[226886]: 2026-01-20 14:48:08.979 226890 DEBUG oslo_concurrency.lockutils [req-e20e56d3-b184-482a-bf1c-4bef38a10f5b req-a76ed729-c047-4567-b166-ff20c00ca3d6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "9beb3ec3-721e-4919-9713-a92c82ad189b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:08 np0005588920 nova_compute[226886]: 2026-01-20 14:48:08.979 226890 DEBUG oslo_concurrency.lockutils [req-e20e56d3-b184-482a-bf1c-4bef38a10f5b req-a76ed729-c047-4567-b166-ff20c00ca3d6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "9beb3ec3-721e-4919-9713-a92c82ad189b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:08 np0005588920 nova_compute[226886]: 2026-01-20 14:48:08.980 226890 DEBUG oslo_concurrency.lockutils [req-e20e56d3-b184-482a-bf1c-4bef38a10f5b req-a76ed729-c047-4567-b166-ff20c00ca3d6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "9beb3ec3-721e-4919-9713-a92c82ad189b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:08 np0005588920 nova_compute[226886]: 2026-01-20 14:48:08.980 226890 DEBUG nova.compute.manager [req-e20e56d3-b184-482a-bf1c-4bef38a10f5b req-a76ed729-c047-4567-b166-ff20c00ca3d6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] No waiting events found dispatching network-vif-plugged-efc8b363-e70d-42f6-9be8-99865e269ec9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:48:08 np0005588920 nova_compute[226886]: 2026-01-20 14:48:08.980 226890 WARNING nova.compute.manager [req-e20e56d3-b184-482a-bf1c-4bef38a10f5b req-a76ed729-c047-4567-b166-ff20c00ca3d6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Received unexpected event network-vif-plugged-efc8b363-e70d-42f6-9be8-99865e269ec9 for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 20 09:48:09 np0005588920 podman[263156]: 2026-01-20 14:48:09.001541477 +0000 UTC m=+0.090402251 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:48:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:09.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:10.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:10 np0005588920 nova_compute[226886]: 2026-01-20 14:48:10.669 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:11 np0005588920 nova_compute[226886]: 2026-01-20 14:48:11.085 226890 DEBUG nova.compute.manager [req-ef09fc88-af27-40ef-880d-cee8107b5ee9 req-33809d9c-85ae-465c-82b7-2ba2a4cbd952 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Received event network-changed-efc8b363-e70d-42f6-9be8-99865e269ec9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:48:11 np0005588920 nova_compute[226886]: 2026-01-20 14:48:11.086 226890 DEBUG nova.compute.manager [req-ef09fc88-af27-40ef-880d-cee8107b5ee9 req-33809d9c-85ae-465c-82b7-2ba2a4cbd952 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Refreshing instance network info cache due to event network-changed-efc8b363-e70d-42f6-9be8-99865e269ec9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:48:11 np0005588920 nova_compute[226886]: 2026-01-20 14:48:11.086 226890 DEBUG oslo_concurrency.lockutils [req-ef09fc88-af27-40ef-880d-cee8107b5ee9 req-33809d9c-85ae-465c-82b7-2ba2a4cbd952 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-9beb3ec3-721e-4919-9713-a92c82ad189b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:48:11 np0005588920 nova_compute[226886]: 2026-01-20 14:48:11.086 226890 DEBUG oslo_concurrency.lockutils [req-ef09fc88-af27-40ef-880d-cee8107b5ee9 req-33809d9c-85ae-465c-82b7-2ba2a4cbd952 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-9beb3ec3-721e-4919-9713-a92c82ad189b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:48:11 np0005588920 nova_compute[226886]: 2026-01-20 14:48:11.087 226890 DEBUG nova.network.neutron [req-ef09fc88-af27-40ef-880d-cee8107b5ee9 req-33809d9c-85ae-465c-82b7-2ba2a4cbd952 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Refreshing network info cache for port efc8b363-e70d-42f6-9be8-99865e269ec9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:48:11 np0005588920 nova_compute[226886]: 2026-01-20 14:48:11.684 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:11.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:12.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e241 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:48:13 np0005588920 nova_compute[226886]: 2026-01-20 14:48:13.388 226890 DEBUG nova.network.neutron [req-ef09fc88-af27-40ef-880d-cee8107b5ee9 req-33809d9c-85ae-465c-82b7-2ba2a4cbd952 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Updated VIF entry in instance network info cache for port efc8b363-e70d-42f6-9be8-99865e269ec9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:48:13 np0005588920 nova_compute[226886]: 2026-01-20 14:48:13.388 226890 DEBUG nova.network.neutron [req-ef09fc88-af27-40ef-880d-cee8107b5ee9 req-33809d9c-85ae-465c-82b7-2ba2a4cbd952 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Updating instance_info_cache with network_info: [{"id": "efc8b363-e70d-42f6-9be8-99865e269ec9", "address": "fa:16:3e:36:66:1d", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefc8b363-e7", "ovs_interfaceid": "efc8b363-e70d-42f6-9be8-99865e269ec9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:48:13 np0005588920 nova_compute[226886]: 2026-01-20 14:48:13.411 226890 DEBUG oslo_concurrency.lockutils [req-ef09fc88-af27-40ef-880d-cee8107b5ee9 req-33809d9c-85ae-465c-82b7-2ba2a4cbd952 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-9beb3ec3-721e-4919-9713-a92c82ad189b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:48:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:13.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:14.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:14 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #82. Immutable memtables: 0.
Jan 20 09:48:14 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:48:14.538052) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 09:48:14 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 49] Flushing memtable with next log file: 82
Jan 20 09:48:14 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920494538109, "job": 49, "event": "flush_started", "num_memtables": 1, "num_entries": 699, "num_deletes": 258, "total_data_size": 1110707, "memory_usage": 1128944, "flush_reason": "Manual Compaction"}
Jan 20 09:48:14 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 49] Level-0 flush table #83: started
Jan 20 09:48:14 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920494544692, "cf_name": "default", "job": 49, "event": "table_file_creation", "file_number": 83, "file_size": 732123, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 43191, "largest_seqno": 43885, "table_properties": {"data_size": 728727, "index_size": 1240, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 8094, "raw_average_key_size": 19, "raw_value_size": 721713, "raw_average_value_size": 1702, "num_data_blocks": 55, "num_entries": 424, "num_filter_entries": 424, "num_deletions": 258, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768920455, "oldest_key_time": 1768920455, "file_creation_time": 1768920494, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 83, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:48:14 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 49] Flush lasted 6669 microseconds, and 2329 cpu microseconds.
Jan 20 09:48:14 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:48:14 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:48:14.544720) [db/flush_job.cc:967] [default] [JOB 49] Level-0 flush table #83: 732123 bytes OK
Jan 20 09:48:14 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:48:14.544733) [db/memtable_list.cc:519] [default] Level-0 commit table #83 started
Jan 20 09:48:14 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:48:14.546848) [db/memtable_list.cc:722] [default] Level-0 commit table #83: memtable #1 done
Jan 20 09:48:14 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:48:14.546861) EVENT_LOG_v1 {"time_micros": 1768920494546857, "job": 49, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 09:48:14 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:48:14.546875) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 09:48:14 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 49] Try to delete WAL files size 1106864, prev total WAL file size 1106864, number of live WAL files 2.
Jan 20 09:48:14 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000079.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:48:14 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:48:14.547342) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0031323631' seq:72057594037927935, type:22 .. '6C6F676D0031353134' seq:0, type:0; will stop at (end)
Jan 20 09:48:14 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 50] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 09:48:14 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 49 Base level 0, inputs: [83(714KB)], [81(9449KB)]
Jan 20 09:48:14 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920494547404, "job": 50, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [83], "files_L6": [81], "score": -1, "input_data_size": 10408658, "oldest_snapshot_seqno": -1}
Jan 20 09:48:14 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 50] Generated table #84: 6813 keys, 10279389 bytes, temperature: kUnknown
Jan 20 09:48:14 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920494612046, "cf_name": "default", "job": 50, "event": "table_file_creation", "file_number": 84, "file_size": 10279389, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10234852, "index_size": 26377, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17093, "raw_key_size": 175686, "raw_average_key_size": 25, "raw_value_size": 10113971, "raw_average_value_size": 1484, "num_data_blocks": 1049, "num_entries": 6813, "num_filter_entries": 6813, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768920494, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 84, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:48:14 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:48:14 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:48:14.612350) [db/compaction/compaction_job.cc:1663] [default] [JOB 50] Compacted 1@0 + 1@6 files to L6 => 10279389 bytes
Jan 20 09:48:14 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:48:14.631775) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 160.7 rd, 158.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.7, 9.2 +0.0 blob) out(9.8 +0.0 blob), read-write-amplify(28.3) write-amplify(14.0) OK, records in: 7342, records dropped: 529 output_compression: NoCompression
Jan 20 09:48:14 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:48:14.631819) EVENT_LOG_v1 {"time_micros": 1768920494631802, "job": 50, "event": "compaction_finished", "compaction_time_micros": 64767, "compaction_time_cpu_micros": 23878, "output_level": 6, "num_output_files": 1, "total_output_size": 10279389, "num_input_records": 7342, "num_output_records": 6813, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 09:48:14 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000083.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:48:14 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920494632160, "job": 50, "event": "table_file_deletion", "file_number": 83}
Jan 20 09:48:14 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000081.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:48:14 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920494634324, "job": 50, "event": "table_file_deletion", "file_number": 81}
Jan 20 09:48:14 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:48:14.547229) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:48:14 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:48:14.634392) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:48:14 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:48:14.634397) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:48:14 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:48:14.634399) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:48:14 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:48:14.634401) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:48:14 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:48:14.634402) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:48:14 np0005588920 nova_compute[226886]: 2026-01-20 14:48:14.705 226890 DEBUG nova.compute.manager [req-52bf3fba-92ea-4c19-a011-4c91a4e991d1 req-bddb78a8-ee80-4baf-b8cf-d6ceb0db8bb8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Received event network-vif-plugged-efc8b363-e70d-42f6-9be8-99865e269ec9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:48:14 np0005588920 nova_compute[226886]: 2026-01-20 14:48:14.706 226890 DEBUG oslo_concurrency.lockutils [req-52bf3fba-92ea-4c19-a011-4c91a4e991d1 req-bddb78a8-ee80-4baf-b8cf-d6ceb0db8bb8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "9beb3ec3-721e-4919-9713-a92c82ad189b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:14 np0005588920 nova_compute[226886]: 2026-01-20 14:48:14.706 226890 DEBUG oslo_concurrency.lockutils [req-52bf3fba-92ea-4c19-a011-4c91a4e991d1 req-bddb78a8-ee80-4baf-b8cf-d6ceb0db8bb8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "9beb3ec3-721e-4919-9713-a92c82ad189b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:14 np0005588920 nova_compute[226886]: 2026-01-20 14:48:14.706 226890 DEBUG oslo_concurrency.lockutils [req-52bf3fba-92ea-4c19-a011-4c91a4e991d1 req-bddb78a8-ee80-4baf-b8cf-d6ceb0db8bb8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "9beb3ec3-721e-4919-9713-a92c82ad189b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:14 np0005588920 nova_compute[226886]: 2026-01-20 14:48:14.706 226890 DEBUG nova.compute.manager [req-52bf3fba-92ea-4c19-a011-4c91a4e991d1 req-bddb78a8-ee80-4baf-b8cf-d6ceb0db8bb8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] No waiting events found dispatching network-vif-plugged-efc8b363-e70d-42f6-9be8-99865e269ec9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:48:14 np0005588920 nova_compute[226886]: 2026-01-20 14:48:14.707 226890 WARNING nova.compute.manager [req-52bf3fba-92ea-4c19-a011-4c91a4e991d1 req-bddb78a8-ee80-4baf-b8cf-d6ceb0db8bb8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Received unexpected event network-vif-plugged-efc8b363-e70d-42f6-9be8-99865e269ec9 for instance with vm_state active and task_state resize_finish.#033[00m
Jan 20 09:48:15 np0005588920 nova_compute[226886]: 2026-01-20 14:48:15.459 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:48:15.583 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=32, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=31) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:48:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:48:15.584 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:48:15 np0005588920 nova_compute[226886]: 2026-01-20 14:48:15.584 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:15 np0005588920 nova_compute[226886]: 2026-01-20 14:48:15.670 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:48:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:15.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:48:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:48:16.451 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:48:16.451 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:48:16.452 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:16.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:16 np0005588920 nova_compute[226886]: 2026-01-20 14:48:16.686 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:17 np0005588920 nova_compute[226886]: 2026-01-20 14:48:17.453 226890 DEBUG nova.compute.manager [req-01de2e77-9b49-4448-b97a-9c6cba352038 req-93fc5194-f820-4253-a593-05bfd11c4a53 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Received event network-vif-plugged-efc8b363-e70d-42f6-9be8-99865e269ec9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:48:17 np0005588920 nova_compute[226886]: 2026-01-20 14:48:17.455 226890 DEBUG oslo_concurrency.lockutils [req-01de2e77-9b49-4448-b97a-9c6cba352038 req-93fc5194-f820-4253-a593-05bfd11c4a53 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "9beb3ec3-721e-4919-9713-a92c82ad189b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:17 np0005588920 nova_compute[226886]: 2026-01-20 14:48:17.456 226890 DEBUG oslo_concurrency.lockutils [req-01de2e77-9b49-4448-b97a-9c6cba352038 req-93fc5194-f820-4253-a593-05bfd11c4a53 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "9beb3ec3-721e-4919-9713-a92c82ad189b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:17 np0005588920 nova_compute[226886]: 2026-01-20 14:48:17.456 226890 DEBUG oslo_concurrency.lockutils [req-01de2e77-9b49-4448-b97a-9c6cba352038 req-93fc5194-f820-4253-a593-05bfd11c4a53 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "9beb3ec3-721e-4919-9713-a92c82ad189b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:17 np0005588920 nova_compute[226886]: 2026-01-20 14:48:17.457 226890 DEBUG nova.compute.manager [req-01de2e77-9b49-4448-b97a-9c6cba352038 req-93fc5194-f820-4253-a593-05bfd11c4a53 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] No waiting events found dispatching network-vif-plugged-efc8b363-e70d-42f6-9be8-99865e269ec9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:48:17 np0005588920 nova_compute[226886]: 2026-01-20 14:48:17.457 226890 WARNING nova.compute.manager [req-01de2e77-9b49-4448-b97a-9c6cba352038 req-93fc5194-f820-4253-a593-05bfd11c4a53 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Received unexpected event network-vif-plugged-efc8b363-e70d-42f6-9be8-99865e269ec9 for instance with vm_state resized and task_state None.#033[00m
Jan 20 09:48:17 np0005588920 nova_compute[226886]: 2026-01-20 14:48:17.526 226890 DEBUG oslo_concurrency.lockutils [None req-9c3a30d8-3b88-43d1-b6ca-785f27e39617 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "9beb3ec3-721e-4919-9713-a92c82ad189b" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:17 np0005588920 nova_compute[226886]: 2026-01-20 14:48:17.527 226890 DEBUG oslo_concurrency.lockutils [None req-9c3a30d8-3b88-43d1-b6ca-785f27e39617 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "9beb3ec3-721e-4919-9713-a92c82ad189b" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:17 np0005588920 nova_compute[226886]: 2026-01-20 14:48:17.527 226890 DEBUG nova.compute.manager [None req-9c3a30d8-3b88-43d1-b6ca-785f27e39617 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Going to confirm migration 12 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Jan 20 09:48:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e241 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:48:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:17.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:48:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:18.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:48:18 np0005588920 podman[263187]: 2026-01-20 14:48:18.966087368 +0000 UTC m=+0.058932704 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent)
Jan 20 09:48:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:19.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:19 np0005588920 nova_compute[226886]: 2026-01-20 14:48:19.910 226890 DEBUG neutronclient.v2_0.client [None req-9c3a30d8-3b88-43d1-b6ca-785f27e39617 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port efc8b363-e70d-42f6-9be8-99865e269ec9 for host compute-2.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Jan 20 09:48:19 np0005588920 nova_compute[226886]: 2026-01-20 14:48:19.911 226890 DEBUG oslo_concurrency.lockutils [None req-9c3a30d8-3b88-43d1-b6ca-785f27e39617 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "refresh_cache-9beb3ec3-721e-4919-9713-a92c82ad189b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:48:19 np0005588920 nova_compute[226886]: 2026-01-20 14:48:19.911 226890 DEBUG oslo_concurrency.lockutils [None req-9c3a30d8-3b88-43d1-b6ca-785f27e39617 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquired lock "refresh_cache-9beb3ec3-721e-4919-9713-a92c82ad189b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:48:19 np0005588920 nova_compute[226886]: 2026-01-20 14:48:19.912 226890 DEBUG nova.network.neutron [None req-9c3a30d8-3b88-43d1-b6ca-785f27e39617 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:48:19 np0005588920 nova_compute[226886]: 2026-01-20 14:48:19.912 226890 DEBUG nova.objects.instance [None req-9c3a30d8-3b88-43d1-b6ca-785f27e39617 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lazy-loading 'info_cache' on Instance uuid 9beb3ec3-721e-4919-9713-a92c82ad189b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:48:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:20.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:20 np0005588920 nova_compute[226886]: 2026-01-20 14:48:20.674 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:20 np0005588920 nova_compute[226886]: 2026-01-20 14:48:20.902 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:21 np0005588920 nova_compute[226886]: 2026-01-20 14:48:21.293 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768920486.2924094, 9beb3ec3-721e-4919-9713-a92c82ad189b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:48:21 np0005588920 nova_compute[226886]: 2026-01-20 14:48:21.294 226890 INFO nova.compute.manager [-] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:48:21 np0005588920 nova_compute[226886]: 2026-01-20 14:48:21.328 226890 DEBUG nova.compute.manager [None req-6b862e88-ba49-4361-b66f-e3af168a26ac - - - - - -] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:48:21 np0005588920 nova_compute[226886]: 2026-01-20 14:48:21.331 226890 DEBUG nova.compute.manager [None req-6b862e88-ba49-4361-b66f-e3af168a26ac - - - - - -] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: resized, current task_state: None, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:48:21 np0005588920 nova_compute[226886]: 2026-01-20 14:48:21.384 226890 INFO nova.compute.manager [None req-6b862e88-ba49-4361-b66f-e3af168a26ac - - - - - -] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-2.ctlplane.example.com#033[00m
Jan 20 09:48:21 np0005588920 nova_compute[226886]: 2026-01-20 14:48:21.697 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:48:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:21.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:48:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:22.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e241 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:48:23 np0005588920 nova_compute[226886]: 2026-01-20 14:48:23.341 226890 DEBUG nova.network.neutron [None req-9c3a30d8-3b88-43d1-b6ca-785f27e39617 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Updating instance_info_cache with network_info: [{"id": "efc8b363-e70d-42f6-9be8-99865e269ec9", "address": "fa:16:3e:36:66:1d", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefc8b363-e7", "ovs_interfaceid": "efc8b363-e70d-42f6-9be8-99865e269ec9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:48:23 np0005588920 nova_compute[226886]: 2026-01-20 14:48:23.381 226890 DEBUG oslo_concurrency.lockutils [None req-9c3a30d8-3b88-43d1-b6ca-785f27e39617 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Releasing lock "refresh_cache-9beb3ec3-721e-4919-9713-a92c82ad189b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:48:23 np0005588920 nova_compute[226886]: 2026-01-20 14:48:23.381 226890 DEBUG nova.objects.instance [None req-9c3a30d8-3b88-43d1-b6ca-785f27e39617 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lazy-loading 'migration_context' on Instance uuid 9beb3ec3-721e-4919-9713-a92c82ad189b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:48:23 np0005588920 nova_compute[226886]: 2026-01-20 14:48:23.432 226890 DEBUG nova.storage.rbd_utils [None req-9c3a30d8-3b88-43d1-b6ca-785f27e39617 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] rbd image 9beb3ec3-721e-4919-9713-a92c82ad189b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:48:23 np0005588920 nova_compute[226886]: 2026-01-20 14:48:23.440 226890 DEBUG nova.virt.libvirt.vif [None req-9c3a30d8-3b88-43d1-b6ca-785f27e39617 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:47:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-757916410',display_name='tempest-ServerActionsTestOtherA-server-757916410',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-757916410',id=101,image_ref='',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDOC6AOV9rhSIyyfBXYGEFvhdWE5GLRuNfs0sPvnXoLHIstQY2OpqwhI35UcFW1s96uqz0+j9sMbdcq/NuNcfgrlnXkEz6j2iO7WUECWdPrQW34JB1FTWAvtA4R6RDoaZA==',key_name='tempest-keypair-1611966828',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:48:15Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b683fcc0026242e28ba6d8fba638688e',ramdisk_id='',reservation_id='r-042ihw9k',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',old_vm_state='active',owner_project_name='tempest-ServerActionsTestOtherA-967087071',owner_user_name='tempest-ServerActionsTestOtherA-967087071-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:48:15Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='869086208e10436c9dc96c78bee9a85d',uuid=9beb3ec3-721e-4919-9713-a92c82ad189b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "efc8b363-e70d-42f6-9be8-99865e269ec9", "address": "fa:16:3e:36:66:1d", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", 
"type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefc8b363-e7", "ovs_interfaceid": "efc8b363-e70d-42f6-9be8-99865e269ec9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:48:23 np0005588920 nova_compute[226886]: 2026-01-20 14:48:23.440 226890 DEBUG nova.network.os_vif_util [None req-9c3a30d8-3b88-43d1-b6ca-785f27e39617 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Converting VIF {"id": "efc8b363-e70d-42f6-9be8-99865e269ec9", "address": "fa:16:3e:36:66:1d", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefc8b363-e7", "ovs_interfaceid": "efc8b363-e70d-42f6-9be8-99865e269ec9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:48:23 np0005588920 nova_compute[226886]: 2026-01-20 14:48:23.441 226890 DEBUG nova.network.os_vif_util [None req-9c3a30d8-3b88-43d1-b6ca-785f27e39617 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:36:66:1d,bridge_name='br-int',has_traffic_filtering=True,id=efc8b363-e70d-42f6-9be8-99865e269ec9,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapefc8b363-e7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:48:23 np0005588920 nova_compute[226886]: 2026-01-20 14:48:23.441 226890 DEBUG os_vif [None req-9c3a30d8-3b88-43d1-b6ca-785f27e39617 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:36:66:1d,bridge_name='br-int',has_traffic_filtering=True,id=efc8b363-e70d-42f6-9be8-99865e269ec9,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapefc8b363-e7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:48:23 np0005588920 nova_compute[226886]: 2026-01-20 14:48:23.443 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:23 np0005588920 nova_compute[226886]: 2026-01-20 14:48:23.443 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapefc8b363-e7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:48:23 np0005588920 nova_compute[226886]: 2026-01-20 14:48:23.443 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:48:23 np0005588920 nova_compute[226886]: 2026-01-20 14:48:23.445 226890 INFO os_vif [None req-9c3a30d8-3b88-43d1-b6ca-785f27e39617 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:36:66:1d,bridge_name='br-int',has_traffic_filtering=True,id=efc8b363-e70d-42f6-9be8-99865e269ec9,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapefc8b363-e7')#033[00m
Jan 20 09:48:23 np0005588920 nova_compute[226886]: 2026-01-20 14:48:23.445 226890 DEBUG oslo_concurrency.lockutils [None req-9c3a30d8-3b88-43d1-b6ca-785f27e39617 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:23 np0005588920 nova_compute[226886]: 2026-01-20 14:48:23.445 226890 DEBUG oslo_concurrency.lockutils [None req-9c3a30d8-3b88-43d1-b6ca-785f27e39617 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:23 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 09:48:23 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3000.1 total, 600.0 interval#012Cumulative writes: 35K writes, 144K keys, 35K commit groups, 1.0 writes per commit group, ingest: 0.14 GB, 0.05 MB/s#012Cumulative WAL: 35K writes, 12K syncs, 2.84 writes per sync, written: 0.14 GB, 0.05 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 10K writes, 42K keys, 10K commit groups, 1.0 writes per commit group, ingest: 46.89 MB, 0.08 MB/s#012Interval WAL: 10K writes, 3945 syncs, 2.66 writes per sync, written: 0.05 GB, 0.08 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 20 09:48:23 np0005588920 nova_compute[226886]: 2026-01-20 14:48:23.558 226890 DEBUG oslo_concurrency.processutils [None req-9c3a30d8-3b88-43d1-b6ca-785f27e39617 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:48:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:23.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:23 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:48:23 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/624077283' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:48:24 np0005588920 nova_compute[226886]: 2026-01-20 14:48:24.002 226890 DEBUG oslo_concurrency.processutils [None req-9c3a30d8-3b88-43d1-b6ca-785f27e39617 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:48:24 np0005588920 nova_compute[226886]: 2026-01-20 14:48:24.008 226890 DEBUG nova.compute.provider_tree [None req-9c3a30d8-3b88-43d1-b6ca-785f27e39617 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:48:24 np0005588920 nova_compute[226886]: 2026-01-20 14:48:24.027 226890 DEBUG nova.scheduler.client.report [None req-9c3a30d8-3b88-43d1-b6ca-785f27e39617 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:48:24 np0005588920 nova_compute[226886]: 2026-01-20 14:48:24.073 226890 DEBUG oslo_concurrency.lockutils [None req-9c3a30d8-3b88-43d1-b6ca-785f27e39617 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.628s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:24 np0005588920 nova_compute[226886]: 2026-01-20 14:48:24.202 226890 INFO nova.scheduler.client.report [None req-9c3a30d8-3b88-43d1-b6ca-785f27e39617 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Deleted allocation for migration d75f7553-0bf9-4277-b1f7-34600960db53#033[00m
Jan 20 09:48:24 np0005588920 nova_compute[226886]: 2026-01-20 14:48:24.359 226890 DEBUG oslo_concurrency.lockutils [None req-9c3a30d8-3b88-43d1-b6ca-785f27e39617 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "9beb3ec3-721e-4919-9713-a92c82ad189b" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 6.832s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:48:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:24.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:48:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:48:24.585 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '32'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:48:25 np0005588920 nova_compute[226886]: 2026-01-20 14:48:25.674 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:25.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:26.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:26 np0005588920 nova_compute[226886]: 2026-01-20 14:48:26.699 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e241 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:48:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:27.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:28.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:28 np0005588920 nova_compute[226886]: 2026-01-20 14:48:28.691 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:29 np0005588920 nova_compute[226886]: 2026-01-20 14:48:29.232 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:48:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:29.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:29 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:48:29 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:48:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:30.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:30 np0005588920 nova_compute[226886]: 2026-01-20 14:48:30.676 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:30 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:48:30 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:48:30 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:48:31 np0005588920 nova_compute[226886]: 2026-01-20 14:48:31.701 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:31.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:31 np0005588920 nova_compute[226886]: 2026-01-20 14:48:31.771 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:48:31 np0005588920 nova_compute[226886]: 2026-01-20 14:48:31.771 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:48:31 np0005588920 nova_compute[226886]: 2026-01-20 14:48:31.771 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 09:48:32 np0005588920 nova_compute[226886]: 2026-01-20 14:48:32.062 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "refresh_cache-75736b87-b14e-45b7-b43b-5129cf7d3279" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:48:32 np0005588920 nova_compute[226886]: 2026-01-20 14:48:32.063 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquired lock "refresh_cache-75736b87-b14e-45b7-b43b-5129cf7d3279" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:48:32 np0005588920 nova_compute[226886]: 2026-01-20 14:48:32.063 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 09:48:32 np0005588920 nova_compute[226886]: 2026-01-20 14:48:32.063 226890 DEBUG nova.objects.instance [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 75736b87-b14e-45b7-b43b-5129cf7d3279 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:48:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:48:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:32.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:48:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e241 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:48:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:48:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:33.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:48:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:34.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:35 np0005588920 nova_compute[226886]: 2026-01-20 14:48:35.673 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Updating instance_info_cache with network_info: [{"id": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "address": "fa:16:3e:22:f9:d2", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a9a684-c9", "ovs_interfaceid": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:48:35 np0005588920 nova_compute[226886]: 2026-01-20 14:48:35.678 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:35 np0005588920 nova_compute[226886]: 2026-01-20 14:48:35.697 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Releasing lock "refresh_cache-75736b87-b14e-45b7-b43b-5129cf7d3279" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:48:35 np0005588920 nova_compute[226886]: 2026-01-20 14:48:35.697 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 09:48:35 np0005588920 nova_compute[226886]: 2026-01-20 14:48:35.697 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:48:35 np0005588920 nova_compute[226886]: 2026-01-20 14:48:35.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:48:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:35.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:35 np0005588920 nova_compute[226886]: 2026-01-20 14:48:35.750 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:35 np0005588920 nova_compute[226886]: 2026-01-20 14:48:35.751 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:35 np0005588920 nova_compute[226886]: 2026-01-20 14:48:35.751 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:35 np0005588920 nova_compute[226886]: 2026-01-20 14:48:35.751 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:48:35 np0005588920 nova_compute[226886]: 2026-01-20 14:48:35.751 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:48:35 np0005588920 nova_compute[226886]: 2026-01-20 14:48:35.896 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:36 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:48:36 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/851343660' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:48:36 np0005588920 nova_compute[226886]: 2026-01-20 14:48:36.167 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.416s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:48:36 np0005588920 nova_compute[226886]: 2026-01-20 14:48:36.248 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-0000005e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:48:36 np0005588920 nova_compute[226886]: 2026-01-20 14:48:36.248 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-0000005e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:48:36 np0005588920 nova_compute[226886]: 2026-01-20 14:48:36.391 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:48:36 np0005588920 nova_compute[226886]: 2026-01-20 14:48:36.392 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4356MB free_disk=20.849884033203125GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:48:36 np0005588920 nova_compute[226886]: 2026-01-20 14:48:36.392 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:36 np0005588920 nova_compute[226886]: 2026-01-20 14:48:36.393 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:36 np0005588920 nova_compute[226886]: 2026-01-20 14:48:36.465 226890 INFO nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Updating resource usage from migration 37b266a8-8f13-40bc-ab16-470d7fe422ef#033[00m
Jan 20 09:48:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:36.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:36 np0005588920 nova_compute[226886]: 2026-01-20 14:48:36.614 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Migration 37b266a8-8f13-40bc-ab16-470d7fe422ef is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Jan 20 09:48:36 np0005588920 nova_compute[226886]: 2026-01-20 14:48:36.615 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:48:36 np0005588920 nova_compute[226886]: 2026-01-20 14:48:36.615 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:48:36 np0005588920 nova_compute[226886]: 2026-01-20 14:48:36.703 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:36 np0005588920 nova_compute[226886]: 2026-01-20 14:48:36.792 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Refreshing inventories for resource provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 20 09:48:36 np0005588920 nova_compute[226886]: 2026-01-20 14:48:36.853 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Updating ProviderTree inventory for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 20 09:48:36 np0005588920 nova_compute[226886]: 2026-01-20 14:48:36.853 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Updating inventory in ProviderTree for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 20 09:48:36 np0005588920 nova_compute[226886]: 2026-01-20 14:48:36.990 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Refreshing aggregate associations for resource provider ff38e91c-3320-4831-90ac-bcffc89ba7b6, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 20 09:48:37 np0005588920 nova_compute[226886]: 2026-01-20 14:48:37.044 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Refreshing trait associations for resource provider ff38e91c-3320-4831-90ac-bcffc89ba7b6, traits: COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_DEVICE_TAGGING,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 20 09:48:37 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:48:37 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:48:37 np0005588920 nova_compute[226886]: 2026-01-20 14:48:37.128 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:48:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:48:37 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/821706828' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:48:37 np0005588920 nova_compute[226886]: 2026-01-20 14:48:37.537 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.410s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:48:37 np0005588920 nova_compute[226886]: 2026-01-20 14:48:37.543 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:48:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e241 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:48:37 np0005588920 nova_compute[226886]: 2026-01-20 14:48:37.567 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:48:37 np0005588920 nova_compute[226886]: 2026-01-20 14:48:37.607 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:48:37 np0005588920 nova_compute[226886]: 2026-01-20 14:48:37.608 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.215s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:37.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:38 np0005588920 nova_compute[226886]: 2026-01-20 14:48:38.160 226890 DEBUG oslo_concurrency.lockutils [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "refresh_cache-75736b87-b14e-45b7-b43b-5129cf7d3279" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:48:38 np0005588920 nova_compute[226886]: 2026-01-20 14:48:38.161 226890 DEBUG oslo_concurrency.lockutils [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquired lock "refresh_cache-75736b87-b14e-45b7-b43b-5129cf7d3279" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:48:38 np0005588920 nova_compute[226886]: 2026-01-20 14:48:38.161 226890 DEBUG nova.network.neutron [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:48:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:38.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:38 np0005588920 nova_compute[226886]: 2026-01-20 14:48:38.608 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:48:38 np0005588920 nova_compute[226886]: 2026-01-20 14:48:38.609 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:48:38 np0005588920 nova_compute[226886]: 2026-01-20 14:48:38.610 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:48:38 np0005588920 nova_compute[226886]: 2026-01-20 14:48:38.610 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:48:38 np0005588920 nova_compute[226886]: 2026-01-20 14:48:38.610 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:48:38 np0005588920 nova_compute[226886]: 2026-01-20 14:48:38.722 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:48:38 np0005588920 nova_compute[226886]: 2026-01-20 14:48:38.752 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:48:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:39.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:39 np0005588920 nova_compute[226886]: 2026-01-20 14:48:39.746 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:48:39 np0005588920 nova_compute[226886]: 2026-01-20 14:48:39.747 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 20 09:48:39 np0005588920 nova_compute[226886]: 2026-01-20 14:48:39.775 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 20 09:48:39 np0005588920 nova_compute[226886]: 2026-01-20 14:48:39.775 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:48:39 np0005588920 nova_compute[226886]: 2026-01-20 14:48:39.775 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 20 09:48:40 np0005588920 podman[263473]: 2026-01-20 14:48:40.055121206 +0000 UTC m=+0.132099563 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 20 09:48:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.002000055s ======
Jan 20 09:48:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:40.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000055s
Jan 20 09:48:40 np0005588920 nova_compute[226886]: 2026-01-20 14:48:40.680 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:41 np0005588920 nova_compute[226886]: 2026-01-20 14:48:41.114 226890 DEBUG nova.network.neutron [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Updating instance_info_cache with network_info: [{"id": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "address": "fa:16:3e:22:f9:d2", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a9a684-c9", "ovs_interfaceid": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:48:41 np0005588920 nova_compute[226886]: 2026-01-20 14:48:41.131 226890 DEBUG oslo_concurrency.lockutils [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Releasing lock "refresh_cache-75736b87-b14e-45b7-b43b-5129cf7d3279" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:48:41 np0005588920 nova_compute[226886]: 2026-01-20 14:48:41.238 226890 DEBUG nova.virt.libvirt.driver [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Jan 20 09:48:41 np0005588920 nova_compute[226886]: 2026-01-20 14:48:41.239 226890 DEBUG nova.virt.libvirt.volume.remotefs [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Creating file /var/lib/nova/instances/75736b87-b14e-45b7-b43b-5129cf7d3279/3e8c128d65b74363a5d890b3774d11fa.tmp on remote host 192.168.122.101 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Jan 20 09:48:41 np0005588920 nova_compute[226886]: 2026-01-20 14:48:41.240 226890 DEBUG oslo_concurrency.processutils [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/75736b87-b14e-45b7-b43b-5129cf7d3279/3e8c128d65b74363a5d890b3774d11fa.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:48:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:41 np0005588920 nova_compute[226886]: 2026-01-20 14:48:41.741 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:41.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:41 np0005588920 nova_compute[226886]: 2026-01-20 14:48:41.748 226890 DEBUG oslo_concurrency.processutils [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/75736b87-b14e-45b7-b43b-5129cf7d3279/3e8c128d65b74363a5d890b3774d11fa.tmp" returned: 1 in 0.508s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:48:41 np0005588920 nova_compute[226886]: 2026-01-20 14:48:41.749 226890 DEBUG oslo_concurrency.processutils [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] 'ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/75736b87-b14e-45b7-b43b-5129cf7d3279/3e8c128d65b74363a5d890b3774d11fa.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 20 09:48:41 np0005588920 nova_compute[226886]: 2026-01-20 14:48:41.749 226890 DEBUG nova.virt.libvirt.volume.remotefs [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Creating directory /var/lib/nova/instances/75736b87-b14e-45b7-b43b-5129cf7d3279 on remote host 192.168.122.101 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Jan 20 09:48:41 np0005588920 nova_compute[226886]: 2026-01-20 14:48:41.749 226890 DEBUG oslo_concurrency.processutils [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/75736b87-b14e-45b7-b43b-5129cf7d3279 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:48:41 np0005588920 nova_compute[226886]: 2026-01-20 14:48:41.786 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:48:41 np0005588920 nova_compute[226886]: 2026-01-20 14:48:41.787 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:48:41 np0005588920 nova_compute[226886]: 2026-01-20 14:48:41.941 226890 DEBUG oslo_concurrency.processutils [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/75736b87-b14e-45b7-b43b-5129cf7d3279" returned: 0 in 0.191s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:48:41 np0005588920 nova_compute[226886]: 2026-01-20 14:48:41.944 226890 DEBUG nova.virt.libvirt.driver [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 20 09:48:42 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #85. Immutable memtables: 0.
Jan 20 09:48:42 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:48:42.165903) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 09:48:42 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 51] Flushing memtable with next log file: 85
Jan 20 09:48:42 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920522165924, "job": 51, "event": "flush_started", "num_memtables": 1, "num_entries": 611, "num_deletes": 251, "total_data_size": 885578, "memory_usage": 896888, "flush_reason": "Manual Compaction"}
Jan 20 09:48:42 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 51] Level-0 flush table #86: started
Jan 20 09:48:42 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920522171064, "cf_name": "default", "job": 51, "event": "table_file_creation", "file_number": 86, "file_size": 572733, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 43890, "largest_seqno": 44496, "table_properties": {"data_size": 569651, "index_size": 990, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1029, "raw_key_size": 7616, "raw_average_key_size": 19, "raw_value_size": 563367, "raw_average_value_size": 1444, "num_data_blocks": 43, "num_entries": 390, "num_filter_entries": 390, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768920495, "oldest_key_time": 1768920495, "file_creation_time": 1768920522, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 86, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:48:42 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 51] Flush lasted 5193 microseconds, and 2261 cpu microseconds.
Jan 20 09:48:42 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:48:42 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:48:42.171095) [db/flush_job.cc:967] [default] [JOB 51] Level-0 flush table #86: 572733 bytes OK
Jan 20 09:48:42 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:48:42.171108) [db/memtable_list.cc:519] [default] Level-0 commit table #86 started
Jan 20 09:48:42 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:48:42.173053) [db/memtable_list.cc:722] [default] Level-0 commit table #86: memtable #1 done
Jan 20 09:48:42 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:48:42.173064) EVENT_LOG_v1 {"time_micros": 1768920522173061, "job": 51, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 09:48:42 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:48:42.173077) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 09:48:42 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 51] Try to delete WAL files size 882106, prev total WAL file size 882106, number of live WAL files 2.
Jan 20 09:48:42 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000082.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:48:42 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:48:42.173507) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033353134' seq:72057594037927935, type:22 .. '7061786F730033373636' seq:0, type:0; will stop at (end)
Jan 20 09:48:42 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 52] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 09:48:42 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 51 Base level 0, inputs: [86(559KB)], [84(10038KB)]
Jan 20 09:48:42 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920522173556, "job": 52, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [86], "files_L6": [84], "score": -1, "input_data_size": 10852122, "oldest_snapshot_seqno": -1}
Jan 20 09:48:42 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 52] Generated table #87: 6689 keys, 8960655 bytes, temperature: kUnknown
Jan 20 09:48:42 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920522250513, "cf_name": "default", "job": 52, "event": "table_file_creation", "file_number": 87, "file_size": 8960655, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 8918017, "index_size": 24798, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16773, "raw_key_size": 173864, "raw_average_key_size": 25, "raw_value_size": 8800335, "raw_average_value_size": 1315, "num_data_blocks": 976, "num_entries": 6689, "num_filter_entries": 6689, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768920522, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 87, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:48:42 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:48:42 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:48:42.250903) [db/compaction/compaction_job.cc:1663] [default] [JOB 52] Compacted 1@0 + 1@6 files to L6 => 8960655 bytes
Jan 20 09:48:42 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:48:42.252382) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 140.7 rd, 116.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.5, 9.8 +0.0 blob) out(8.5 +0.0 blob), read-write-amplify(34.6) write-amplify(15.6) OK, records in: 7203, records dropped: 514 output_compression: NoCompression
Jan 20 09:48:42 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:48:42.252415) EVENT_LOG_v1 {"time_micros": 1768920522252400, "job": 52, "event": "compaction_finished", "compaction_time_micros": 77116, "compaction_time_cpu_micros": 21339, "output_level": 6, "num_output_files": 1, "total_output_size": 8960655, "num_input_records": 7203, "num_output_records": 6689, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 09:48:42 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000086.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:48:42 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920522252755, "job": 52, "event": "table_file_deletion", "file_number": 86}
Jan 20 09:48:42 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000084.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:48:42 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920522256379, "job": 52, "event": "table_file_deletion", "file_number": 84}
Jan 20 09:48:42 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:48:42.173430) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:48:42 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:48:42.256414) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:48:42 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:48:42.256419) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:48:42 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:48:42.256422) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:48:42 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:48:42.256424) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:48:42 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:48:42.256427) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:48:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:42.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e241 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:48:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:43.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:44 np0005588920 kernel: tapd3a9a684-c9 (unregistering): left promiscuous mode
Jan 20 09:48:44 np0005588920 NetworkManager[49076]: <info>  [1768920524.1410] device (tapd3a9a684-c9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:48:44 np0005588920 ovn_controller[133971]: 2026-01-20T14:48:44Z|00442|binding|INFO|Releasing lport d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 from this chassis (sb_readonly=0)
Jan 20 09:48:44 np0005588920 ovn_controller[133971]: 2026-01-20T14:48:44Z|00443|binding|INFO|Setting lport d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 down in Southbound
Jan 20 09:48:44 np0005588920 nova_compute[226886]: 2026-01-20 14:48:44.148 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:44 np0005588920 ovn_controller[133971]: 2026-01-20T14:48:44Z|00444|binding|INFO|Removing iface tapd3a9a684-c9 ovn-installed in OVS
Jan 20 09:48:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:48:44.155 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:22:f9:d2 10.100.0.4'], port_security=['fa:16:3e:22:f9:d2 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '75736b87-b14e-45b7-b43b-5129cf7d3279', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-762e1859-4db4-4d9e-b66f-d50316f80df4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1c5f03d46c0c4162a3b2f1530850bb6c', 'neutron:revision_number': '10', 'neutron:security_group_ids': '80535eda-fa59-4edc-8e3d-9bfea6517730', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.180', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2474a8ca-bb96-4cae-9133-23419b81a9fc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:48:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:48:44.157 144128 INFO neutron.agent.ovn.metadata.agent [-] Port d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 in datapath 762e1859-4db4-4d9e-b66f-d50316f80df4 unbound from our chassis#033[00m
Jan 20 09:48:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:48:44.158 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 762e1859-4db4-4d9e-b66f-d50316f80df4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:48:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:48:44.159 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[679f2400-a52a-495d-b5ee-b86f4f775b89]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:48:44.160 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 namespace which is not needed anymore#033[00m
Jan 20 09:48:44 np0005588920 nova_compute[226886]: 2026-01-20 14:48:44.170 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:44 np0005588920 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d0000005e.scope: Deactivated successfully.
Jan 20 09:48:44 np0005588920 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d0000005e.scope: Consumed 16.521s CPU time.
Jan 20 09:48:44 np0005588920 systemd-machined[196121]: Machine qemu-44-instance-0000005e terminated.
Jan 20 09:48:44 np0005588920 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[261989]: [NOTICE]   (261993) : haproxy version is 2.8.14-c23fe91
Jan 20 09:48:44 np0005588920 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[261989]: [NOTICE]   (261993) : path to executable is /usr/sbin/haproxy
Jan 20 09:48:44 np0005588920 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[261989]: [WARNING]  (261993) : Exiting Master process...
Jan 20 09:48:44 np0005588920 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[261989]: [ALERT]    (261993) : Current worker (261995) exited with code 143 (Terminated)
Jan 20 09:48:44 np0005588920 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[261989]: [WARNING]  (261993) : All workers exited. Exiting... (0)
Jan 20 09:48:44 np0005588920 systemd[1]: libpod-9c31fcb32c3511de32cd28492aec2f6327f8019e3b1262760138ccc65c30546f.scope: Deactivated successfully.
Jan 20 09:48:44 np0005588920 podman[263524]: 2026-01-20 14:48:44.294923533 +0000 UTC m=+0.043435901 container died 9c31fcb32c3511de32cd28492aec2f6327f8019e3b1262760138ccc65c30546f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 20 09:48:44 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9c31fcb32c3511de32cd28492aec2f6327f8019e3b1262760138ccc65c30546f-userdata-shm.mount: Deactivated successfully.
Jan 20 09:48:44 np0005588920 systemd[1]: var-lib-containers-storage-overlay-d27fe4a6de03d9b0a6b5830b438afe51369c32f43e19de3a2a41c5bd986d7b15-merged.mount: Deactivated successfully.
Jan 20 09:48:44 np0005588920 podman[263524]: 2026-01-20 14:48:44.339555228 +0000 UTC m=+0.088067596 container cleanup 9c31fcb32c3511de32cd28492aec2f6327f8019e3b1262760138ccc65c30546f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:48:44 np0005588920 systemd[1]: libpod-conmon-9c31fcb32c3511de32cd28492aec2f6327f8019e3b1262760138ccc65c30546f.scope: Deactivated successfully.
Jan 20 09:48:44 np0005588920 podman[263555]: 2026-01-20 14:48:44.418433356 +0000 UTC m=+0.060720423 container remove 9c31fcb32c3511de32cd28492aec2f6327f8019e3b1262760138ccc65c30546f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 20 09:48:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:48:44.423 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[1fe928a9-bbda-4e37-8d95-bd391dfbafb5]: (4, ('Tue Jan 20 02:48:44 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 (9c31fcb32c3511de32cd28492aec2f6327f8019e3b1262760138ccc65c30546f)\n9c31fcb32c3511de32cd28492aec2f6327f8019e3b1262760138ccc65c30546f\nTue Jan 20 02:48:44 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 (9c31fcb32c3511de32cd28492aec2f6327f8019e3b1262760138ccc65c30546f)\n9c31fcb32c3511de32cd28492aec2f6327f8019e3b1262760138ccc65c30546f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:48:44.425 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[bc0eac74-ce2b-4eb9-a68f-582bab380207]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:48:44.426 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap762e1859-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:48:44 np0005588920 nova_compute[226886]: 2026-01-20 14:48:44.427 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:44 np0005588920 kernel: tap762e1859-40: left promiscuous mode
Jan 20 09:48:44 np0005588920 nova_compute[226886]: 2026-01-20 14:48:44.443 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:48:44.446 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[97b85590-7913-436b-99b6-665b7bd93474]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:48:44.465 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c60eb930-4989-4f54-b2ee-c5a160a76abe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:48:44.467 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[6f336264-97e0-4983-87b5-8e8cd8320d5e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:48:44.480 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ad929101-a6e3-458f-9f3e-9f3c1fc1ba2f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 546608, 'reachable_time': 40505, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 263582, 'error': None, 'target': 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:44 np0005588920 systemd[1]: run-netns-ovnmeta\x2d762e1859\x2d4db4\x2d4d9e\x2db66f\x2dd50316f80df4.mount: Deactivated successfully.
Jan 20 09:48:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:48:44.484 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:48:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:48:44.484 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[7b32a379-fc5c-4ba4-9295-5700860dd048]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:48:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:44.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:48:44 np0005588920 nova_compute[226886]: 2026-01-20 14:48:44.961 226890 INFO nova.virt.libvirt.driver [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Instance shutdown successfully after 3 seconds.#033[00m
Jan 20 09:48:44 np0005588920 nova_compute[226886]: 2026-01-20 14:48:44.965 226890 INFO nova.virt.libvirt.driver [-] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Instance destroyed successfully.#033[00m
Jan 20 09:48:44 np0005588920 nova_compute[226886]: 2026-01-20 14:48:44.966 226890 DEBUG nova.virt.libvirt.vif [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:45:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1202945337',display_name='tempest-ServerActionsTestJSON-server-1202945337',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1202945337',id=94,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLgEzx5mLsSqRL7L9WKOzM+WdeJ40U103wY9H3VMZ41G/sN5tQtQSt9lXWKTyc6pt00bfKD0E9GPugNMpy+dzSSpK23o3CgadkzAfAvjQCeCPOSp3fX13FGApomGd1HRCQ==',key_name='tempest-keypair-1602241722',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:45:21Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1c5f03d46c0c4162a3b2f1530850bb6c',ramdisk_id='',reservation_id='r-luaqa362',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-1020442335',owner_user_name='tempest-ServerActionsTestJSON-1020442335-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:48:36Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e9278fdb9e645b7938f3edb20c4d3cf',uuid=75736b87-b14e-45b7-b43b-5129cf7d3279,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "address": "fa:16:3e:22:f9:d2", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1917526237-network", "vif_mac": "fa:16:3e:22:f9:d2"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a9a684-c9", "ovs_interfaceid": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:48:44 np0005588920 nova_compute[226886]: 2026-01-20 14:48:44.967 226890 DEBUG nova.network.os_vif_util [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converting VIF {"id": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "address": "fa:16:3e:22:f9:d2", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1917526237-network", "vif_mac": "fa:16:3e:22:f9:d2"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a9a684-c9", "ovs_interfaceid": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:48:44 np0005588920 nova_compute[226886]: 2026-01-20 14:48:44.968 226890 DEBUG nova.network.os_vif_util [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:22:f9:d2,bridge_name='br-int',has_traffic_filtering=True,id=d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3a9a684-c9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:48:44 np0005588920 nova_compute[226886]: 2026-01-20 14:48:44.968 226890 DEBUG os_vif [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:22:f9:d2,bridge_name='br-int',has_traffic_filtering=True,id=d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3a9a684-c9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:48:44 np0005588920 nova_compute[226886]: 2026-01-20 14:48:44.970 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:44 np0005588920 nova_compute[226886]: 2026-01-20 14:48:44.970 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd3a9a684-c9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:48:45 np0005588920 nova_compute[226886]: 2026-01-20 14:48:45.016 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:45 np0005588920 nova_compute[226886]: 2026-01-20 14:48:45.018 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:48:45 np0005588920 nova_compute[226886]: 2026-01-20 14:48:45.020 226890 INFO os_vif [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:22:f9:d2,bridge_name='br-int',has_traffic_filtering=True,id=d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3a9a684-c9')#033[00m
Jan 20 09:48:45 np0005588920 nova_compute[226886]: 2026-01-20 14:48:45.023 226890 DEBUG nova.virt.libvirt.driver [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] skipping disk for instance-0000005e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:48:45 np0005588920 nova_compute[226886]: 2026-01-20 14:48:45.023 226890 DEBUG nova.virt.libvirt.driver [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] skipping disk for instance-0000005e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:48:45 np0005588920 nova_compute[226886]: 2026-01-20 14:48:45.057 226890 DEBUG nova.compute.manager [req-2e1deb02-3da8-4fe0-a4bc-1d10c3f9ec01 req-61da54de-835e-4ae7-8081-4457a5260c2a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Received event network-vif-unplugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:48:45 np0005588920 nova_compute[226886]: 2026-01-20 14:48:45.057 226890 DEBUG oslo_concurrency.lockutils [req-2e1deb02-3da8-4fe0-a4bc-1d10c3f9ec01 req-61da54de-835e-4ae7-8081-4457a5260c2a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:45 np0005588920 nova_compute[226886]: 2026-01-20 14:48:45.057 226890 DEBUG oslo_concurrency.lockutils [req-2e1deb02-3da8-4fe0-a4bc-1d10c3f9ec01 req-61da54de-835e-4ae7-8081-4457a5260c2a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:45 np0005588920 nova_compute[226886]: 2026-01-20 14:48:45.058 226890 DEBUG oslo_concurrency.lockutils [req-2e1deb02-3da8-4fe0-a4bc-1d10c3f9ec01 req-61da54de-835e-4ae7-8081-4457a5260c2a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:45 np0005588920 nova_compute[226886]: 2026-01-20 14:48:45.058 226890 DEBUG nova.compute.manager [req-2e1deb02-3da8-4fe0-a4bc-1d10c3f9ec01 req-61da54de-835e-4ae7-8081-4457a5260c2a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] No waiting events found dispatching network-vif-unplugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:48:45 np0005588920 nova_compute[226886]: 2026-01-20 14:48:45.058 226890 WARNING nova.compute.manager [req-2e1deb02-3da8-4fe0-a4bc-1d10c3f9ec01 req-61da54de-835e-4ae7-8081-4457a5260c2a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Received unexpected event network-vif-unplugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 for instance with vm_state active and task_state resize_migrating.#033[00m
Jan 20 09:48:45 np0005588920 nova_compute[226886]: 2026-01-20 14:48:45.468 226890 DEBUG neutronclient.v2_0.client [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Jan 20 09:48:45 np0005588920 nova_compute[226886]: 2026-01-20 14:48:45.611 226890 DEBUG oslo_concurrency.lockutils [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:45 np0005588920 nova_compute[226886]: 2026-01-20 14:48:45.612 226890 DEBUG oslo_concurrency.lockutils [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:45 np0005588920 nova_compute[226886]: 2026-01-20 14:48:45.612 226890 DEBUG oslo_concurrency.lockutils [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:45 np0005588920 nova_compute[226886]: 2026-01-20 14:48:45.682 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:45.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:46 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:48:46 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2739244399' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:48:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:48:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:46.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:48:47 np0005588920 nova_compute[226886]: 2026-01-20 14:48:47.331 226890 DEBUG nova.compute.manager [req-3a41699a-0367-4973-ac53-161ff8f9a117 req-226c3ec0-7a42-441c-bda7-ac430516d7d7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Received event network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:48:47 np0005588920 nova_compute[226886]: 2026-01-20 14:48:47.331 226890 DEBUG oslo_concurrency.lockutils [req-3a41699a-0367-4973-ac53-161ff8f9a117 req-226c3ec0-7a42-441c-bda7-ac430516d7d7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:47 np0005588920 nova_compute[226886]: 2026-01-20 14:48:47.332 226890 DEBUG oslo_concurrency.lockutils [req-3a41699a-0367-4973-ac53-161ff8f9a117 req-226c3ec0-7a42-441c-bda7-ac430516d7d7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:47 np0005588920 nova_compute[226886]: 2026-01-20 14:48:47.332 226890 DEBUG oslo_concurrency.lockutils [req-3a41699a-0367-4973-ac53-161ff8f9a117 req-226c3ec0-7a42-441c-bda7-ac430516d7d7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:47 np0005588920 nova_compute[226886]: 2026-01-20 14:48:47.332 226890 DEBUG nova.compute.manager [req-3a41699a-0367-4973-ac53-161ff8f9a117 req-226c3ec0-7a42-441c-bda7-ac430516d7d7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] No waiting events found dispatching network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:48:47 np0005588920 nova_compute[226886]: 2026-01-20 14:48:47.333 226890 WARNING nova.compute.manager [req-3a41699a-0367-4973-ac53-161ff8f9a117 req-226c3ec0-7a42-441c-bda7-ac430516d7d7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Received unexpected event network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 20 09:48:47 np0005588920 nova_compute[226886]: 2026-01-20 14:48:47.392 226890 DEBUG nova.compute.manager [req-c7e8339e-e267-4fef-95d8-eea084a60576 req-e982c00d-bc4f-4669-a1f6-4f7f2fd55d8b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Received event network-changed-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:48:47 np0005588920 nova_compute[226886]: 2026-01-20 14:48:47.392 226890 DEBUG nova.compute.manager [req-c7e8339e-e267-4fef-95d8-eea084a60576 req-e982c00d-bc4f-4669-a1f6-4f7f2fd55d8b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Refreshing instance network info cache due to event network-changed-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:48:47 np0005588920 nova_compute[226886]: 2026-01-20 14:48:47.393 226890 DEBUG oslo_concurrency.lockutils [req-c7e8339e-e267-4fef-95d8-eea084a60576 req-e982c00d-bc4f-4669-a1f6-4f7f2fd55d8b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-75736b87-b14e-45b7-b43b-5129cf7d3279" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:48:47 np0005588920 nova_compute[226886]: 2026-01-20 14:48:47.393 226890 DEBUG oslo_concurrency.lockutils [req-c7e8339e-e267-4fef-95d8-eea084a60576 req-e982c00d-bc4f-4669-a1f6-4f7f2fd55d8b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-75736b87-b14e-45b7-b43b-5129cf7d3279" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:48:47 np0005588920 nova_compute[226886]: 2026-01-20 14:48:47.393 226890 DEBUG nova.network.neutron [req-c7e8339e-e267-4fef-95d8-eea084a60576 req-e982c00d-bc4f-4669-a1f6-4f7f2fd55d8b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Refreshing network info cache for port d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:48:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e241 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:48:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:47.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:48.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:48:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:49.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:48:50 np0005588920 podman[263583]: 2026-01-20 14:48:50.006120929 +0000 UTC m=+0.086447590 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 20 09:48:50 np0005588920 nova_compute[226886]: 2026-01-20 14:48:50.016 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:50 np0005588920 nova_compute[226886]: 2026-01-20 14:48:50.436 226890 DEBUG nova.network.neutron [req-c7e8339e-e267-4fef-95d8-eea084a60576 req-e982c00d-bc4f-4669-a1f6-4f7f2fd55d8b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Updated VIF entry in instance network info cache for port d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:48:50 np0005588920 nova_compute[226886]: 2026-01-20 14:48:50.437 226890 DEBUG nova.network.neutron [req-c7e8339e-e267-4fef-95d8-eea084a60576 req-e982c00d-bc4f-4669-a1f6-4f7f2fd55d8b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Updating instance_info_cache with network_info: [{"id": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "address": "fa:16:3e:22:f9:d2", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a9a684-c9", "ovs_interfaceid": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:48:50 np0005588920 nova_compute[226886]: 2026-01-20 14:48:50.470 226890 DEBUG oslo_concurrency.lockutils [req-c7e8339e-e267-4fef-95d8-eea084a60576 req-e982c00d-bc4f-4669-a1f6-4f7f2fd55d8b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-75736b87-b14e-45b7-b43b-5129cf7d3279" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:48:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:48:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:50.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:48:50 np0005588920 nova_compute[226886]: 2026-01-20 14:48:50.685 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:51 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e242 e242: 3 total, 3 up, 3 in
Jan 20 09:48:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:48:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:51.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:48:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:52.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e242 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:48:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:53.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:53 np0005588920 nova_compute[226886]: 2026-01-20 14:48:53.908 226890 DEBUG nova.compute.manager [req-2bff57d0-bcb0-4eb7-8fce-7079b335a84b req-b0d9ae89-0168-4e69-9785-4e65cbc3b1f8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Received event network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:48:53 np0005588920 nova_compute[226886]: 2026-01-20 14:48:53.910 226890 DEBUG oslo_concurrency.lockutils [req-2bff57d0-bcb0-4eb7-8fce-7079b335a84b req-b0d9ae89-0168-4e69-9785-4e65cbc3b1f8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:53 np0005588920 nova_compute[226886]: 2026-01-20 14:48:53.910 226890 DEBUG oslo_concurrency.lockutils [req-2bff57d0-bcb0-4eb7-8fce-7079b335a84b req-b0d9ae89-0168-4e69-9785-4e65cbc3b1f8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:53 np0005588920 nova_compute[226886]: 2026-01-20 14:48:53.910 226890 DEBUG oslo_concurrency.lockutils [req-2bff57d0-bcb0-4eb7-8fce-7079b335a84b req-b0d9ae89-0168-4e69-9785-4e65cbc3b1f8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:53 np0005588920 nova_compute[226886]: 2026-01-20 14:48:53.911 226890 DEBUG nova.compute.manager [req-2bff57d0-bcb0-4eb7-8fce-7079b335a84b req-b0d9ae89-0168-4e69-9785-4e65cbc3b1f8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] No waiting events found dispatching network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:48:53 np0005588920 nova_compute[226886]: 2026-01-20 14:48:53.911 226890 WARNING nova.compute.manager [req-2bff57d0-bcb0-4eb7-8fce-7079b335a84b req-b0d9ae89-0168-4e69-9785-4e65cbc3b1f8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Received unexpected event network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 for instance with vm_state active and task_state resize_finish.#033[00m
Jan 20 09:48:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:54.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:55 np0005588920 nova_compute[226886]: 2026-01-20 14:48:55.018 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:55 np0005588920 nova_compute[226886]: 2026-01-20 14:48:55.408 226890 DEBUG oslo_concurrency.lockutils [None req-d933618b-302d-4921-b095-6bb73cf107f2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "75736b87-b14e-45b7-b43b-5129cf7d3279" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:55 np0005588920 nova_compute[226886]: 2026-01-20 14:48:55.408 226890 DEBUG oslo_concurrency.lockutils [None req-d933618b-302d-4921-b095-6bb73cf107f2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:55 np0005588920 nova_compute[226886]: 2026-01-20 14:48:55.409 226890 DEBUG nova.compute.manager [None req-d933618b-302d-4921-b095-6bb73cf107f2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Going to confirm migration 13 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Jan 20 09:48:55 np0005588920 nova_compute[226886]: 2026-01-20 14:48:55.687 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:48:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:55.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:48:56 np0005588920 nova_compute[226886]: 2026-01-20 14:48:56.082 226890 DEBUG neutronclient.v2_0.client [None req-d933618b-302d-4921-b095-6bb73cf107f2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 for host compute-2.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Jan 20 09:48:56 np0005588920 nova_compute[226886]: 2026-01-20 14:48:56.083 226890 DEBUG oslo_concurrency.lockutils [None req-d933618b-302d-4921-b095-6bb73cf107f2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "refresh_cache-75736b87-b14e-45b7-b43b-5129cf7d3279" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:48:56 np0005588920 nova_compute[226886]: 2026-01-20 14:48:56.083 226890 DEBUG oslo_concurrency.lockutils [None req-d933618b-302d-4921-b095-6bb73cf107f2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquired lock "refresh_cache-75736b87-b14e-45b7-b43b-5129cf7d3279" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:48:56 np0005588920 nova_compute[226886]: 2026-01-20 14:48:56.083 226890 DEBUG nova.network.neutron [None req-d933618b-302d-4921-b095-6bb73cf107f2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:48:56 np0005588920 nova_compute[226886]: 2026-01-20 14:48:56.083 226890 DEBUG nova.objects.instance [None req-d933618b-302d-4921-b095-6bb73cf107f2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'info_cache' on Instance uuid 75736b87-b14e-45b7-b43b-5129cf7d3279 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:48:56 np0005588920 nova_compute[226886]: 2026-01-20 14:48:56.311 226890 DEBUG nova.compute.manager [req-1defb927-5b7a-454c-bb4b-437cc9b76850 req-fc38b063-5446-4aac-9ebb-fafe81e2b080 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Received event network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:48:56 np0005588920 nova_compute[226886]: 2026-01-20 14:48:56.312 226890 DEBUG oslo_concurrency.lockutils [req-1defb927-5b7a-454c-bb4b-437cc9b76850 req-fc38b063-5446-4aac-9ebb-fafe81e2b080 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:56 np0005588920 nova_compute[226886]: 2026-01-20 14:48:56.312 226890 DEBUG oslo_concurrency.lockutils [req-1defb927-5b7a-454c-bb4b-437cc9b76850 req-fc38b063-5446-4aac-9ebb-fafe81e2b080 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:56 np0005588920 nova_compute[226886]: 2026-01-20 14:48:56.312 226890 DEBUG oslo_concurrency.lockutils [req-1defb927-5b7a-454c-bb4b-437cc9b76850 req-fc38b063-5446-4aac-9ebb-fafe81e2b080 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:56 np0005588920 nova_compute[226886]: 2026-01-20 14:48:56.312 226890 DEBUG nova.compute.manager [req-1defb927-5b7a-454c-bb4b-437cc9b76850 req-fc38b063-5446-4aac-9ebb-fafe81e2b080 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] No waiting events found dispatching network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:48:56 np0005588920 nova_compute[226886]: 2026-01-20 14:48:56.313 226890 WARNING nova.compute.manager [req-1defb927-5b7a-454c-bb4b-437cc9b76850 req-fc38b063-5446-4aac-9ebb-fafe81e2b080 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Received unexpected event network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 for instance with vm_state resized and task_state None.#033[00m
Jan 20 09:48:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:56.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e242 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:48:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:57.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:58.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:59 np0005588920 nova_compute[226886]: 2026-01-20 14:48:59.387 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768920524.386758, 75736b87-b14e-45b7-b43b-5129cf7d3279 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:48:59 np0005588920 nova_compute[226886]: 2026-01-20 14:48:59.388 226890 INFO nova.compute.manager [-] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:48:59 np0005588920 nova_compute[226886]: 2026-01-20 14:48:59.415 226890 DEBUG nova.compute.manager [None req-59033f8f-e48d-44de-bd7e-e0f7a125ad95 - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:48:59 np0005588920 nova_compute[226886]: 2026-01-20 14:48:59.418 226890 DEBUG nova.compute.manager [None req-59033f8f-e48d-44de-bd7e-e0f7a125ad95 - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: resized, current task_state: None, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:48:59 np0005588920 nova_compute[226886]: 2026-01-20 14:48:59.458 226890 INFO nova.compute.manager [None req-59033f8f-e48d-44de-bd7e-e0f7a125ad95 - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-2.ctlplane.example.com#033[00m
Jan 20 09:48:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:48:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:59.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:00 np0005588920 nova_compute[226886]: 2026-01-20 14:49:00.020 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:00 np0005588920 nova_compute[226886]: 2026-01-20 14:49:00.350 226890 DEBUG nova.network.neutron [None req-d933618b-302d-4921-b095-6bb73cf107f2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Updating instance_info_cache with network_info: [{"id": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "address": "fa:16:3e:22:f9:d2", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a9a684-c9", "ovs_interfaceid": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:49:00 np0005588920 nova_compute[226886]: 2026-01-20 14:49:00.378 226890 DEBUG oslo_concurrency.lockutils [None req-d933618b-302d-4921-b095-6bb73cf107f2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Releasing lock "refresh_cache-75736b87-b14e-45b7-b43b-5129cf7d3279" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:49:00 np0005588920 nova_compute[226886]: 2026-01-20 14:49:00.378 226890 DEBUG nova.objects.instance [None req-d933618b-302d-4921-b095-6bb73cf107f2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'migration_context' on Instance uuid 75736b87-b14e-45b7-b43b-5129cf7d3279 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:49:00 np0005588920 nova_compute[226886]: 2026-01-20 14:49:00.482 226890 DEBUG nova.storage.rbd_utils [None req-d933618b-302d-4921-b095-6bb73cf107f2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] removing snapshot(nova-resize) on rbd image(75736b87-b14e-45b7-b43b-5129cf7d3279_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 20 09:49:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:49:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:00.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:49:00 np0005588920 nova_compute[226886]: 2026-01-20 14:49:00.728 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:01 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e243 e243: 3 total, 3 up, 3 in
Jan 20 09:49:01 np0005588920 nova_compute[226886]: 2026-01-20 14:49:01.187 226890 DEBUG nova.virt.libvirt.vif [None req-d933618b-302d-4921-b095-6bb73cf107f2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:45:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1202945337',display_name='tempest-ServerActionsTestJSON-server-1202945337',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1202945337',id=94,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLgEzx5mLsSqRL7L9WKOzM+WdeJ40U103wY9H3VMZ41G/sN5tQtQSt9lXWKTyc6pt00bfKD0E9GPugNMpy+dzSSpK23o3CgadkzAfAvjQCeCPOSp3fX13FGApomGd1HRCQ==',key_name='tempest-keypair-1602241722',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:48:53Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1c5f03d46c0c4162a3b2f1530850bb6c',ramdisk_id='',reservation_id='r-luaqa362',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-1020442335',owner_user_name='tempest-ServerActionsTestJSON-1020442335-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:48:53Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e9278fdb9e645b7938f3edb20c4d3cf',uuid=75736b87-b14e-45b7-b43b-5129cf7d3279,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "address": "fa:16:3e:22:f9:d2", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a9a684-c9", "ovs_interfaceid": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:49:01 np0005588920 nova_compute[226886]: 2026-01-20 14:49:01.188 226890 DEBUG nova.network.os_vif_util [None req-d933618b-302d-4921-b095-6bb73cf107f2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converting VIF {"id": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "address": "fa:16:3e:22:f9:d2", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a9a684-c9", "ovs_interfaceid": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:49:01 np0005588920 nova_compute[226886]: 2026-01-20 14:49:01.189 226890 DEBUG nova.network.os_vif_util [None req-d933618b-302d-4921-b095-6bb73cf107f2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:22:f9:d2,bridge_name='br-int',has_traffic_filtering=True,id=d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3a9a684-c9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:49:01 np0005588920 nova_compute[226886]: 2026-01-20 14:49:01.189 226890 DEBUG os_vif [None req-d933618b-302d-4921-b095-6bb73cf107f2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:22:f9:d2,bridge_name='br-int',has_traffic_filtering=True,id=d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3a9a684-c9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:49:01 np0005588920 nova_compute[226886]: 2026-01-20 14:49:01.190 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:01 np0005588920 nova_compute[226886]: 2026-01-20 14:49:01.191 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd3a9a684-c9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:49:01 np0005588920 nova_compute[226886]: 2026-01-20 14:49:01.191 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:49:01 np0005588920 nova_compute[226886]: 2026-01-20 14:49:01.193 226890 INFO os_vif [None req-d933618b-302d-4921-b095-6bb73cf107f2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:22:f9:d2,bridge_name='br-int',has_traffic_filtering=True,id=d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3a9a684-c9')#033[00m
Jan 20 09:49:01 np0005588920 nova_compute[226886]: 2026-01-20 14:49:01.193 226890 DEBUG oslo_concurrency.lockutils [None req-d933618b-302d-4921-b095-6bb73cf107f2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:49:01 np0005588920 nova_compute[226886]: 2026-01-20 14:49:01.193 226890 DEBUG oslo_concurrency.lockutils [None req-d933618b-302d-4921-b095-6bb73cf107f2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:49:01 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:49:01.236 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=33, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=32) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:49:01 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:49:01.237 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:49:01 np0005588920 nova_compute[226886]: 2026-01-20 14:49:01.237 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:01 np0005588920 nova_compute[226886]: 2026-01-20 14:49:01.289 226890 DEBUG oslo_concurrency.processutils [None req-d933618b-302d-4921-b095-6bb73cf107f2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:49:01 np0005588920 nova_compute[226886]: 2026-01-20 14:49:01.688 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:01.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:01 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:49:01 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4043656728' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:49:01 np0005588920 nova_compute[226886]: 2026-01-20 14:49:01.801 226890 DEBUG oslo_concurrency.processutils [None req-d933618b-302d-4921-b095-6bb73cf107f2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.512s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:49:01 np0005588920 nova_compute[226886]: 2026-01-20 14:49:01.807 226890 DEBUG nova.compute.provider_tree [None req-d933618b-302d-4921-b095-6bb73cf107f2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:49:01 np0005588920 nova_compute[226886]: 2026-01-20 14:49:01.892 226890 DEBUG nova.scheduler.client.report [None req-d933618b-302d-4921-b095-6bb73cf107f2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:49:01 np0005588920 nova_compute[226886]: 2026-01-20 14:49:01.942 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:02 np0005588920 nova_compute[226886]: 2026-01-20 14:49:02.003 226890 DEBUG oslo_concurrency.lockutils [None req-d933618b-302d-4921-b095-6bb73cf107f2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.809s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:49:02 np0005588920 nova_compute[226886]: 2026-01-20 14:49:02.116 226890 INFO nova.scheduler.client.report [None req-d933618b-302d-4921-b095-6bb73cf107f2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Deleted allocation for migration 37b266a8-8f13-40bc-ab16-470d7fe422ef#033[00m
Jan 20 09:49:02 np0005588920 nova_compute[226886]: 2026-01-20 14:49:02.202 226890 DEBUG oslo_concurrency.lockutils [None req-d933618b-302d-4921-b095-6bb73cf107f2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 6.794s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:49:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:49:02.239 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '33'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:49:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e243 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:49:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:49:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:02.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:49:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:49:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:03.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:49:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:49:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:04.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:49:05 np0005588920 nova_compute[226886]: 2026-01-20 14:49:05.022 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:05 np0005588920 nova_compute[226886]: 2026-01-20 14:49:05.730 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:49:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:05.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:49:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:06.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e243 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:49:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:07.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:08.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:08 np0005588920 nova_compute[226886]: 2026-01-20 14:49:08.996 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:49:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e244 e244: 3 total, 3 up, 3 in
Jan 20 09:49:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:49:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:09.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:49:10 np0005588920 nova_compute[226886]: 2026-01-20 14:49:10.024 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:49:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:10.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:49:10 np0005588920 nova_compute[226886]: 2026-01-20 14:49:10.775 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:10 np0005588920 podman[263663]: 2026-01-20 14:49:10.988259479 +0000 UTC m=+0.076136573 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:49:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:11.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e244 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:49:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:12.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 09:49:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/95823729' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 09:49:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 09:49:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/95823729' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 09:49:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:13.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:14.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:15 np0005588920 nova_compute[226886]: 2026-01-20 14:49:15.027 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:15 np0005588920 nova_compute[226886]: 2026-01-20 14:49:15.777 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:15.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:49:16.451 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:49:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:49:16.451 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:49:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:49:16.451 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:49:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:16.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e244 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:49:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:17.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:18.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:19.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:20 np0005588920 nova_compute[226886]: 2026-01-20 14:49:20.071 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:20.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:20 np0005588920 nova_compute[226886]: 2026-01-20 14:49:20.778 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:20 np0005588920 podman[263691]: 2026-01-20 14:49:20.989350921 +0000 UTC m=+0.073372147 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 20 09:49:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:21.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e244 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:49:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:49:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:22.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:49:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:23.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:24.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:25 np0005588920 nova_compute[226886]: 2026-01-20 14:49:25.073 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:25 np0005588920 nova_compute[226886]: 2026-01-20 14:49:25.781 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:25.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:25 np0005588920 nova_compute[226886]: 2026-01-20 14:49:25.915 226890 DEBUG oslo_concurrency.lockutils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Acquiring lock "f444ccf6-5adb-489a-b174-7450017a351b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:49:25 np0005588920 nova_compute[226886]: 2026-01-20 14:49:25.916 226890 DEBUG oslo_concurrency.lockutils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:49:25 np0005588920 nova_compute[226886]: 2026-01-20 14:49:25.951 226890 DEBUG nova.compute.manager [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:49:26 np0005588920 nova_compute[226886]: 2026-01-20 14:49:26.039 226890 DEBUG oslo_concurrency.lockutils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:49:26 np0005588920 nova_compute[226886]: 2026-01-20 14:49:26.039 226890 DEBUG oslo_concurrency.lockutils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:49:26 np0005588920 nova_compute[226886]: 2026-01-20 14:49:26.046 226890 DEBUG nova.virt.hardware [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:49:26 np0005588920 nova_compute[226886]: 2026-01-20 14:49:26.046 226890 INFO nova.compute.claims [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 09:49:26 np0005588920 nova_compute[226886]: 2026-01-20 14:49:26.214 226890 DEBUG oslo_concurrency.processutils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:49:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:26.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:26 np0005588920 nova_compute[226886]: 2026-01-20 14:49:26.687 226890 DEBUG oslo_concurrency.processutils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:49:26 np0005588920 nova_compute[226886]: 2026-01-20 14:49:26.692 226890 DEBUG nova.compute.provider_tree [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:49:26 np0005588920 nova_compute[226886]: 2026-01-20 14:49:26.714 226890 DEBUG nova.scheduler.client.report [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:49:26 np0005588920 nova_compute[226886]: 2026-01-20 14:49:26.750 226890 DEBUG oslo_concurrency.lockutils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.711s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:49:26 np0005588920 nova_compute[226886]: 2026-01-20 14:49:26.751 226890 DEBUG nova.compute.manager [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:49:26 np0005588920 nova_compute[226886]: 2026-01-20 14:49:26.869 226890 DEBUG nova.compute.manager [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:49:26 np0005588920 nova_compute[226886]: 2026-01-20 14:49:26.870 226890 DEBUG nova.network.neutron [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:49:26 np0005588920 nova_compute[226886]: 2026-01-20 14:49:26.901 226890 INFO nova.virt.libvirt.driver [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:49:26 np0005588920 nova_compute[226886]: 2026-01-20 14:49:26.995 226890 DEBUG nova.compute.manager [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:49:27 np0005588920 nova_compute[226886]: 2026-01-20 14:49:27.053 226890 INFO nova.virt.block_device [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Booting with volume 8cd2ec74-aafb-4e12-a845-44b6fe96ba18 at /dev/vda#033[00m
Jan 20 09:49:27 np0005588920 nova_compute[226886]: 2026-01-20 14:49:27.339 226890 DEBUG os_brick.utils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 20 09:49:27 np0005588920 nova_compute[226886]: 2026-01-20 14:49:27.340 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:49:27 np0005588920 nova_compute[226886]: 2026-01-20 14:49:27.363 231437 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:49:27 np0005588920 nova_compute[226886]: 2026-01-20 14:49:27.363 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[fe882a8e-d9cf-4d32-86e2-96bc69c2618a]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:49:27 np0005588920 nova_compute[226886]: 2026-01-20 14:49:27.365 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:49:27 np0005588920 nova_compute[226886]: 2026-01-20 14:49:27.372 231437 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.007s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:49:27 np0005588920 nova_compute[226886]: 2026-01-20 14:49:27.372 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[3a48e53b-dc6b-4b25-b85c-9363d93452fa]: (4, ('InitiatorName=iqn.1994-05.com.redhat:f7e7b2e5d28e', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:49:27 np0005588920 nova_compute[226886]: 2026-01-20 14:49:27.374 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:49:27 np0005588920 nova_compute[226886]: 2026-01-20 14:49:27.381 231437 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.007s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:49:27 np0005588920 nova_compute[226886]: 2026-01-20 14:49:27.381 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[8ba0dd4b-8206-47f3-b26b-8a25392e2141]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:49:27 np0005588920 nova_compute[226886]: 2026-01-20 14:49:27.382 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[3c248436-0f9e-488d-a534-d6e19950df05]: (4, '1190ec40-2b89-4358-8dec-733c5829fbed') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:49:27 np0005588920 nova_compute[226886]: 2026-01-20 14:49:27.383 226890 DEBUG oslo_concurrency.processutils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:49:27 np0005588920 nova_compute[226886]: 2026-01-20 14:49:27.405 226890 DEBUG oslo_concurrency.processutils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] CMD "nvme version" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:49:27 np0005588920 nova_compute[226886]: 2026-01-20 14:49:27.407 226890 DEBUG os_brick.initiator.connectors.lightos [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 20 09:49:27 np0005588920 nova_compute[226886]: 2026-01-20 14:49:27.408 226890 DEBUG os_brick.initiator.connectors.lightos [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 20 09:49:27 np0005588920 nova_compute[226886]: 2026-01-20 14:49:27.408 226890 DEBUG os_brick.initiator.connectors.lightos [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 20 09:49:27 np0005588920 nova_compute[226886]: 2026-01-20 14:49:27.408 226890 DEBUG os_brick.utils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] <== get_connector_properties: return (68ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:f7e7b2e5d28e', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '1190ec40-2b89-4358-8dec-733c5829fbed', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 20 09:49:27 np0005588920 nova_compute[226886]: 2026-01-20 14:49:27.408 226890 DEBUG nova.virt.block_device [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Updating existing volume attachment record: 1c8151c5-460c-482c-8032-494f5d7b3a37 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 20 09:49:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e244 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:49:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:27.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:27 np0005588920 nova_compute[226886]: 2026-01-20 14:49:27.979 226890 DEBUG nova.policy [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1d45e7e42e6d419898780db108ff93ff', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b15c4e6eb57e4b0ca4e63c85ed92fc5f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:49:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:28.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:28 np0005588920 nova_compute[226886]: 2026-01-20 14:49:28.810 226890 INFO nova.virt.block_device [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Booting with volume 25d4c4a0-3582-454d-9d4a-312b5c351d9d at /dev/vdb#033[00m
Jan 20 09:49:29 np0005588920 nova_compute[226886]: 2026-01-20 14:49:29.042 226890 DEBUG os_brick.utils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 20 09:49:29 np0005588920 nova_compute[226886]: 2026-01-20 14:49:29.043 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:49:29 np0005588920 nova_compute[226886]: 2026-01-20 14:49:29.052 231437 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:49:29 np0005588920 nova_compute[226886]: 2026-01-20 14:49:29.052 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[0a17f532-2e5b-48e5-96f6-2e0fb871a031]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:49:29 np0005588920 nova_compute[226886]: 2026-01-20 14:49:29.054 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:49:29 np0005588920 nova_compute[226886]: 2026-01-20 14:49:29.060 231437 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.006s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:49:29 np0005588920 nova_compute[226886]: 2026-01-20 14:49:29.061 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[6e07819a-0657-4eb0-84dc-4b88bb56e1e4]: (4, ('InitiatorName=iqn.1994-05.com.redhat:f7e7b2e5d28e', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:49:29 np0005588920 nova_compute[226886]: 2026-01-20 14:49:29.062 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:49:29 np0005588920 nova_compute[226886]: 2026-01-20 14:49:29.069 231437 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:49:29 np0005588920 nova_compute[226886]: 2026-01-20 14:49:29.070 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[b2ce9355-09fd-4cc6-baf1-403945c9db64]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:49:29 np0005588920 nova_compute[226886]: 2026-01-20 14:49:29.071 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[4f9877e4-5169-4201-83b7-a62c9b38be05]: (4, '1190ec40-2b89-4358-8dec-733c5829fbed') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:49:29 np0005588920 nova_compute[226886]: 2026-01-20 14:49:29.071 226890 DEBUG oslo_concurrency.processutils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:49:29 np0005588920 nova_compute[226886]: 2026-01-20 14:49:29.092 226890 DEBUG oslo_concurrency.processutils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] CMD "nvme version" returned: 0 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:49:29 np0005588920 nova_compute[226886]: 2026-01-20 14:49:29.094 226890 DEBUG os_brick.initiator.connectors.lightos [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 20 09:49:29 np0005588920 nova_compute[226886]: 2026-01-20 14:49:29.094 226890 DEBUG os_brick.initiator.connectors.lightos [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 20 09:49:29 np0005588920 nova_compute[226886]: 2026-01-20 14:49:29.095 226890 DEBUG os_brick.initiator.connectors.lightos [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 20 09:49:29 np0005588920 nova_compute[226886]: 2026-01-20 14:49:29.095 226890 DEBUG os_brick.utils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] <== get_connector_properties: return (51ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:f7e7b2e5d28e', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '1190ec40-2b89-4358-8dec-733c5829fbed', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 20 09:49:29 np0005588920 nova_compute[226886]: 2026-01-20 14:49:29.095 226890 DEBUG nova.virt.block_device [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Updating existing volume attachment record: bb01cd80-4d51-458e-a034-095eecb4b63c _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 20 09:49:29 np0005588920 nova_compute[226886]: 2026-01-20 14:49:29.585 226890 DEBUG nova.network.neutron [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Successfully created port: ed97bbce-18dc-4c9b-9a04-919dd3a45a8e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:49:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:29.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:49:29 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4003458176' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:49:30 np0005588920 nova_compute[226886]: 2026-01-20 14:49:30.074 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:30 np0005588920 nova_compute[226886]: 2026-01-20 14:49:30.359 226890 INFO nova.virt.block_device [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Booting with volume 9a6aba77-de94-4bfe-8062-70d79455ddbe at /dev/vdc#033[00m
Jan 20 09:49:30 np0005588920 nova_compute[226886]: 2026-01-20 14:49:30.522 226890 DEBUG os_brick.utils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 20 09:49:30 np0005588920 nova_compute[226886]: 2026-01-20 14:49:30.523 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:49:30 np0005588920 nova_compute[226886]: 2026-01-20 14:49:30.536 231437 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.013s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:49:30 np0005588920 nova_compute[226886]: 2026-01-20 14:49:30.536 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[c054cf85-b601-43e3-883e-69d5d64c3957]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:49:30 np0005588920 nova_compute[226886]: 2026-01-20 14:49:30.537 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:49:30 np0005588920 nova_compute[226886]: 2026-01-20 14:49:30.544 231437 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.007s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:49:30 np0005588920 nova_compute[226886]: 2026-01-20 14:49:30.544 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[a0572444-416e-4cb3-b6f2-3ee25bbb5770]: (4, ('InitiatorName=iqn.1994-05.com.redhat:f7e7b2e5d28e', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:49:30 np0005588920 nova_compute[226886]: 2026-01-20 14:49:30.546 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:49:30 np0005588920 nova_compute[226886]: 2026-01-20 14:49:30.555 231437 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:49:30 np0005588920 nova_compute[226886]: 2026-01-20 14:49:30.556 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[22890217-e6a7-4998-b5c7-fe74ed1c8fb4]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:49:30 np0005588920 nova_compute[226886]: 2026-01-20 14:49:30.558 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[1edef72e-fb1c-44d8-9338-1d76f59edddb]: (4, '1190ec40-2b89-4358-8dec-733c5829fbed') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:49:30 np0005588920 nova_compute[226886]: 2026-01-20 14:49:30.558 226890 DEBUG oslo_concurrency.processutils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:49:30 np0005588920 nova_compute[226886]: 2026-01-20 14:49:30.587 226890 DEBUG oslo_concurrency.processutils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] CMD "nvme version" returned: 0 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:49:30 np0005588920 nova_compute[226886]: 2026-01-20 14:49:30.590 226890 DEBUG os_brick.initiator.connectors.lightos [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 20 09:49:30 np0005588920 nova_compute[226886]: 2026-01-20 14:49:30.590 226890 DEBUG os_brick.initiator.connectors.lightos [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 20 09:49:30 np0005588920 nova_compute[226886]: 2026-01-20 14:49:30.590 226890 DEBUG os_brick.initiator.connectors.lightos [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 20 09:49:30 np0005588920 nova_compute[226886]: 2026-01-20 14:49:30.591 226890 DEBUG os_brick.utils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] <== get_connector_properties: return (68ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:f7e7b2e5d28e', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '1190ec40-2b89-4358-8dec-733c5829fbed', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 20 09:49:30 np0005588920 nova_compute[226886]: 2026-01-20 14:49:30.591 226890 DEBUG nova.virt.block_device [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Updating existing volume attachment record: a8507e5b-0e79-4bf9-bb87-c0c2cd4679ea _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 20 09:49:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:30.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:30 np0005588920 nova_compute[226886]: 2026-01-20 14:49:30.782 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:30 np0005588920 nova_compute[226886]: 2026-01-20 14:49:30.808 226890 DEBUG nova.network.neutron [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Successfully created port: 62ccfbd3-f504-46d0-a4af-ec2dcb7b5764 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:49:31 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:49:31 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4060347495' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:49:31 np0005588920 nova_compute[226886]: 2026-01-20 14:49:31.747 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:49:31 np0005588920 nova_compute[226886]: 2026-01-20 14:49:31.748 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:49:31 np0005588920 nova_compute[226886]: 2026-01-20 14:49:31.748 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 09:49:31 np0005588920 nova_compute[226886]: 2026-01-20 14:49:31.784 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 20 09:49:31 np0005588920 nova_compute[226886]: 2026-01-20 14:49:31.784 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 09:49:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:31.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:31 np0005588920 nova_compute[226886]: 2026-01-20 14:49:31.919 226890 DEBUG nova.compute.manager [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:49:31 np0005588920 nova_compute[226886]: 2026-01-20 14:49:31.922 226890 DEBUG nova.virt.libvirt.driver [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:49:31 np0005588920 nova_compute[226886]: 2026-01-20 14:49:31.923 226890 INFO nova.virt.libvirt.driver [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Creating image(s)#033[00m
Jan 20 09:49:31 np0005588920 nova_compute[226886]: 2026-01-20 14:49:31.924 226890 DEBUG nova.virt.libvirt.driver [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 20 09:49:31 np0005588920 nova_compute[226886]: 2026-01-20 14:49:31.924 226890 DEBUG nova.virt.libvirt.driver [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Ensure instance console log exists: /var/lib/nova/instances/f444ccf6-5adb-489a-b174-7450017a351b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:49:31 np0005588920 nova_compute[226886]: 2026-01-20 14:49:31.925 226890 DEBUG oslo_concurrency.lockutils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:49:31 np0005588920 nova_compute[226886]: 2026-01-20 14:49:31.926 226890 DEBUG oslo_concurrency.lockutils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:49:31 np0005588920 nova_compute[226886]: 2026-01-20 14:49:31.926 226890 DEBUG oslo_concurrency.lockutils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:49:32 np0005588920 nova_compute[226886]: 2026-01-20 14:49:32.292 226890 DEBUG nova.network.neutron [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Successfully created port: d2be8515-193f-43f4-bae4-d2a509320929 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:49:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e244 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:49:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:32.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:33.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:34 np0005588920 nova_compute[226886]: 2026-01-20 14:49:34.386 226890 DEBUG nova.network.neutron [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Successfully created port: e10436e2-7916-4b6b-905e-e9be7cb338b9 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:49:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:49:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:34.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:49:35 np0005588920 nova_compute[226886]: 2026-01-20 14:49:35.076 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:35 np0005588920 nova_compute[226886]: 2026-01-20 14:49:35.776 226890 DEBUG nova.network.neutron [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Successfully created port: ea69e1af-9543-4c76-9981-b8475aa031fe _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:49:35 np0005588920 nova_compute[226886]: 2026-01-20 14:49:35.784 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:35.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:36.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:36 np0005588920 nova_compute[226886]: 2026-01-20 14:49:36.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:49:36 np0005588920 nova_compute[226886]: 2026-01-20 14:49:36.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:49:36 np0005588920 nova_compute[226886]: 2026-01-20 14:49:36.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:49:36 np0005588920 nova_compute[226886]: 2026-01-20 14:49:36.759 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:49:36 np0005588920 nova_compute[226886]: 2026-01-20 14:49:36.759 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:49:36 np0005588920 nova_compute[226886]: 2026-01-20 14:49:36.760 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:49:36 np0005588920 nova_compute[226886]: 2026-01-20 14:49:36.760 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:49:36 np0005588920 nova_compute[226886]: 2026-01-20 14:49:36.760 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:49:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:49:37 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1764439710' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:49:37 np0005588920 nova_compute[226886]: 2026-01-20 14:49:37.195 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:49:37 np0005588920 nova_compute[226886]: 2026-01-20 14:49:37.350 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:49:37 np0005588920 nova_compute[226886]: 2026-01-20 14:49:37.351 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4547MB free_disk=20.935195922851562GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:49:37 np0005588920 nova_compute[226886]: 2026-01-20 14:49:37.351 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:49:37 np0005588920 nova_compute[226886]: 2026-01-20 14:49:37.351 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:49:37 np0005588920 nova_compute[226886]: 2026-01-20 14:49:37.468 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance f444ccf6-5adb-489a-b174-7450017a351b actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:49:37 np0005588920 nova_compute[226886]: 2026-01-20 14:49:37.469 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:49:37 np0005588920 nova_compute[226886]: 2026-01-20 14:49:37.469 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:49:37 np0005588920 nova_compute[226886]: 2026-01-20 14:49:37.518 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:49:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e244 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:49:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:37.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:49:37 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2922371658' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:49:37 np0005588920 nova_compute[226886]: 2026-01-20 14:49:37.949 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:49:37 np0005588920 nova_compute[226886]: 2026-01-20 14:49:37.956 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:49:37 np0005588920 nova_compute[226886]: 2026-01-20 14:49:37.986 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:49:37 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:49:37 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:49:37 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:49:37 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:49:37 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 20 09:49:38 np0005588920 nova_compute[226886]: 2026-01-20 14:49:38.018 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:49:38 np0005588920 nova_compute[226886]: 2026-01-20 14:49:38.019 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.667s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:49:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:38.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:38 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 20 09:49:38 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:49:38 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:49:38 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:49:39 np0005588920 nova_compute[226886]: 2026-01-20 14:49:39.019 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:49:39 np0005588920 nova_compute[226886]: 2026-01-20 14:49:39.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:49:39 np0005588920 nova_compute[226886]: 2026-01-20 14:49:39.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:49:39 np0005588920 nova_compute[226886]: 2026-01-20 14:49:39.726 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:49:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:39.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:39 np0005588920 nova_compute[226886]: 2026-01-20 14:49:39.929 226890 DEBUG nova.network.neutron [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Successfully updated port: ed97bbce-18dc-4c9b-9a04-919dd3a45a8e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:49:40 np0005588920 nova_compute[226886]: 2026-01-20 14:49:40.060 226890 DEBUG nova.compute.manager [req-f2dceeef-6bce-4e9e-af65-ab6e9475ff59 req-6ad6fadd-a73f-4313-8b79-60e1454922c0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received event network-changed-ed97bbce-18dc-4c9b-9a04-919dd3a45a8e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:49:40 np0005588920 nova_compute[226886]: 2026-01-20 14:49:40.060 226890 DEBUG nova.compute.manager [req-f2dceeef-6bce-4e9e-af65-ab6e9475ff59 req-6ad6fadd-a73f-4313-8b79-60e1454922c0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Refreshing instance network info cache due to event network-changed-ed97bbce-18dc-4c9b-9a04-919dd3a45a8e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:49:40 np0005588920 nova_compute[226886]: 2026-01-20 14:49:40.061 226890 DEBUG oslo_concurrency.lockutils [req-f2dceeef-6bce-4e9e-af65-ab6e9475ff59 req-6ad6fadd-a73f-4313-8b79-60e1454922c0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-f444ccf6-5adb-489a-b174-7450017a351b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:49:40 np0005588920 nova_compute[226886]: 2026-01-20 14:49:40.061 226890 DEBUG oslo_concurrency.lockutils [req-f2dceeef-6bce-4e9e-af65-ab6e9475ff59 req-6ad6fadd-a73f-4313-8b79-60e1454922c0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-f444ccf6-5adb-489a-b174-7450017a351b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:49:40 np0005588920 nova_compute[226886]: 2026-01-20 14:49:40.061 226890 DEBUG nova.network.neutron [req-f2dceeef-6bce-4e9e-af65-ab6e9475ff59 req-6ad6fadd-a73f-4313-8b79-60e1454922c0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Refreshing network info cache for port ed97bbce-18dc-4c9b-9a04-919dd3a45a8e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:49:40 np0005588920 nova_compute[226886]: 2026-01-20 14:49:40.079 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:40.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:40 np0005588920 nova_compute[226886]: 2026-01-20 14:49:40.787 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:40 np0005588920 nova_compute[226886]: 2026-01-20 14:49:40.822 226890 DEBUG nova.network.neutron [req-f2dceeef-6bce-4e9e-af65-ab6e9475ff59 req-6ad6fadd-a73f-4313-8b79-60e1454922c0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:49:41 np0005588920 nova_compute[226886]: 2026-01-20 14:49:41.622 226890 DEBUG nova.network.neutron [req-f2dceeef-6bce-4e9e-af65-ab6e9475ff59 req-6ad6fadd-a73f-4313-8b79-60e1454922c0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:49:41 np0005588920 nova_compute[226886]: 2026-01-20 14:49:41.681 226890 DEBUG oslo_concurrency.lockutils [req-f2dceeef-6bce-4e9e-af65-ab6e9475ff59 req-6ad6fadd-a73f-4313-8b79-60e1454922c0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-f444ccf6-5adb-489a-b174-7450017a351b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:49:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:41.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:42 np0005588920 podman[264049]: 2026-01-20 14:49:42.031254323 +0000 UTC m=+0.108951878 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 20 09:49:42 np0005588920 nova_compute[226886]: 2026-01-20 14:49:42.476 226890 DEBUG nova.network.neutron [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Successfully updated port: b194a444-cc69-43f2-9931-e9e53ee450c9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:49:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e244 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:49:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:42.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:42 np0005588920 nova_compute[226886]: 2026-01-20 14:49:42.630 226890 DEBUG nova.compute.manager [req-d9b50a76-c785-400b-afe0-bd97b4a804f1 req-93e7422e-de18-4bed-a714-56dddee8848c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received event network-changed-b194a444-cc69-43f2-9931-e9e53ee450c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:49:42 np0005588920 nova_compute[226886]: 2026-01-20 14:49:42.630 226890 DEBUG nova.compute.manager [req-d9b50a76-c785-400b-afe0-bd97b4a804f1 req-93e7422e-de18-4bed-a714-56dddee8848c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Refreshing instance network info cache due to event network-changed-b194a444-cc69-43f2-9931-e9e53ee450c9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:49:42 np0005588920 nova_compute[226886]: 2026-01-20 14:49:42.630 226890 DEBUG oslo_concurrency.lockutils [req-d9b50a76-c785-400b-afe0-bd97b4a804f1 req-93e7422e-de18-4bed-a714-56dddee8848c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-f444ccf6-5adb-489a-b174-7450017a351b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:49:42 np0005588920 nova_compute[226886]: 2026-01-20 14:49:42.630 226890 DEBUG oslo_concurrency.lockutils [req-d9b50a76-c785-400b-afe0-bd97b4a804f1 req-93e7422e-de18-4bed-a714-56dddee8848c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-f444ccf6-5adb-489a-b174-7450017a351b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:49:42 np0005588920 nova_compute[226886]: 2026-01-20 14:49:42.630 226890 DEBUG nova.network.neutron [req-d9b50a76-c785-400b-afe0-bd97b4a804f1 req-93e7422e-de18-4bed-a714-56dddee8848c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Refreshing network info cache for port b194a444-cc69-43f2-9931-e9e53ee450c9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:49:42 np0005588920 nova_compute[226886]: 2026-01-20 14:49:42.721 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:49:43 np0005588920 nova_compute[226886]: 2026-01-20 14:49:43.287 226890 DEBUG nova.network.neutron [req-d9b50a76-c785-400b-afe0-bd97b4a804f1 req-93e7422e-de18-4bed-a714-56dddee8848c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:49:43 np0005588920 nova_compute[226886]: 2026-01-20 14:49:43.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:49:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:43.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:43 np0005588920 nova_compute[226886]: 2026-01-20 14:49:43.814 226890 DEBUG nova.network.neutron [req-d9b50a76-c785-400b-afe0-bd97b4a804f1 req-93e7422e-de18-4bed-a714-56dddee8848c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:49:43 np0005588920 nova_compute[226886]: 2026-01-20 14:49:43.836 226890 DEBUG oslo_concurrency.lockutils [req-d9b50a76-c785-400b-afe0-bd97b4a804f1 req-93e7422e-de18-4bed-a714-56dddee8848c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-f444ccf6-5adb-489a-b174-7450017a351b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:49:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:49:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:44.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:49:45 np0005588920 nova_compute[226886]: 2026-01-20 14:49:45.080 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:45 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:49:45 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:49:45 np0005588920 nova_compute[226886]: 2026-01-20 14:49:45.790 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:45.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:49:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:46.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:49:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e244 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:49:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:49:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:47.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:49:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:49:47.817 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=34, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=33) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:49:47 np0005588920 nova_compute[226886]: 2026-01-20 14:49:47.818 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:49:47.819 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:49:48 np0005588920 nova_compute[226886]: 2026-01-20 14:49:48.243 226890 DEBUG nova.network.neutron [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Successfully updated port: 10da9204-0ccb-45d0-981d-fdff5c41cda1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:49:48 np0005588920 nova_compute[226886]: 2026-01-20 14:49:48.407 226890 DEBUG nova.compute.manager [req-72ab6b6f-0178-4418-abc0-3c6be417ced2 req-6c1392d8-413f-4518-aad3-e813588c6168 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received event network-changed-10da9204-0ccb-45d0-981d-fdff5c41cda1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:49:48 np0005588920 nova_compute[226886]: 2026-01-20 14:49:48.407 226890 DEBUG nova.compute.manager [req-72ab6b6f-0178-4418-abc0-3c6be417ced2 req-6c1392d8-413f-4518-aad3-e813588c6168 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Refreshing instance network info cache due to event network-changed-10da9204-0ccb-45d0-981d-fdff5c41cda1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:49:48 np0005588920 nova_compute[226886]: 2026-01-20 14:49:48.407 226890 DEBUG oslo_concurrency.lockutils [req-72ab6b6f-0178-4418-abc0-3c6be417ced2 req-6c1392d8-413f-4518-aad3-e813588c6168 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-f444ccf6-5adb-489a-b174-7450017a351b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:49:48 np0005588920 nova_compute[226886]: 2026-01-20 14:49:48.408 226890 DEBUG oslo_concurrency.lockutils [req-72ab6b6f-0178-4418-abc0-3c6be417ced2 req-6c1392d8-413f-4518-aad3-e813588c6168 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-f444ccf6-5adb-489a-b174-7450017a351b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:49:48 np0005588920 nova_compute[226886]: 2026-01-20 14:49:48.408 226890 DEBUG nova.network.neutron [req-72ab6b6f-0178-4418-abc0-3c6be417ced2 req-6c1392d8-413f-4518-aad3-e813588c6168 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Refreshing network info cache for port 10da9204-0ccb-45d0-981d-fdff5c41cda1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:49:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:48.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:48 np0005588920 nova_compute[226886]: 2026-01-20 14:49:48.894 226890 DEBUG nova.network.neutron [req-72ab6b6f-0178-4418-abc0-3c6be417ced2 req-6c1392d8-413f-4518-aad3-e813588c6168 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:49:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:49.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:49 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:49:49.820 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '34'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:49:50 np0005588920 nova_compute[226886]: 2026-01-20 14:49:50.015 226890 DEBUG nova.network.neutron [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Successfully updated port: 62ccfbd3-f504-46d0-a4af-ec2dcb7b5764 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:49:50 np0005588920 nova_compute[226886]: 2026-01-20 14:49:50.082 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:50 np0005588920 nova_compute[226886]: 2026-01-20 14:49:50.109 226890 DEBUG nova.network.neutron [req-72ab6b6f-0178-4418-abc0-3c6be417ced2 req-6c1392d8-413f-4518-aad3-e813588c6168 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:49:50 np0005588920 nova_compute[226886]: 2026-01-20 14:49:50.135 226890 DEBUG oslo_concurrency.lockutils [req-72ab6b6f-0178-4418-abc0-3c6be417ced2 req-6c1392d8-413f-4518-aad3-e813588c6168 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-f444ccf6-5adb-489a-b174-7450017a351b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:49:50 np0005588920 nova_compute[226886]: 2026-01-20 14:49:50.529 226890 DEBUG nova.compute.manager [req-5bef0a21-73da-45c8-9674-8aaa84cba184 req-1361147e-09a2-470f-9226-84daa78a33c7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received event network-changed-62ccfbd3-f504-46d0-a4af-ec2dcb7b5764 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:49:50 np0005588920 nova_compute[226886]: 2026-01-20 14:49:50.530 226890 DEBUG nova.compute.manager [req-5bef0a21-73da-45c8-9674-8aaa84cba184 req-1361147e-09a2-470f-9226-84daa78a33c7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Refreshing instance network info cache due to event network-changed-62ccfbd3-f504-46d0-a4af-ec2dcb7b5764. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:49:50 np0005588920 nova_compute[226886]: 2026-01-20 14:49:50.531 226890 DEBUG oslo_concurrency.lockutils [req-5bef0a21-73da-45c8-9674-8aaa84cba184 req-1361147e-09a2-470f-9226-84daa78a33c7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-f444ccf6-5adb-489a-b174-7450017a351b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:49:50 np0005588920 nova_compute[226886]: 2026-01-20 14:49:50.532 226890 DEBUG oslo_concurrency.lockutils [req-5bef0a21-73da-45c8-9674-8aaa84cba184 req-1361147e-09a2-470f-9226-84daa78a33c7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-f444ccf6-5adb-489a-b174-7450017a351b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:49:50 np0005588920 nova_compute[226886]: 2026-01-20 14:49:50.532 226890 DEBUG nova.network.neutron [req-5bef0a21-73da-45c8-9674-8aaa84cba184 req-1361147e-09a2-470f-9226-84daa78a33c7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Refreshing network info cache for port 62ccfbd3-f504-46d0-a4af-ec2dcb7b5764 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:49:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:49:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:50.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:49:50 np0005588920 nova_compute[226886]: 2026-01-20 14:49:50.825 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:50 np0005588920 nova_compute[226886]: 2026-01-20 14:49:50.858 226890 DEBUG nova.network.neutron [req-5bef0a21-73da-45c8-9674-8aaa84cba184 req-1361147e-09a2-470f-9226-84daa78a33c7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:49:51 np0005588920 nova_compute[226886]: 2026-01-20 14:49:51.512 226890 DEBUG nova.network.neutron [req-5bef0a21-73da-45c8-9674-8aaa84cba184 req-1361147e-09a2-470f-9226-84daa78a33c7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:49:51 np0005588920 nova_compute[226886]: 2026-01-20 14:49:51.539 226890 DEBUG oslo_concurrency.lockutils [req-5bef0a21-73da-45c8-9674-8aaa84cba184 req-1361147e-09a2-470f-9226-84daa78a33c7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-f444ccf6-5adb-489a-b174-7450017a351b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:49:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:51.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:51 np0005588920 nova_compute[226886]: 2026-01-20 14:49:51.946 226890 DEBUG nova.network.neutron [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Successfully updated port: d2be8515-193f-43f4-bae4-d2a509320929 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:49:52 np0005588920 podman[264126]: 2026-01-20 14:49:52.023866167 +0000 UTC m=+0.097235082 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent)
Jan 20 09:49:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e244 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:49:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:52.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:52 np0005588920 nova_compute[226886]: 2026-01-20 14:49:52.934 226890 DEBUG nova.compute.manager [req-453f081c-4aa1-4003-b770-69a075a2ccbd req-771c2641-7568-498f-bd40-9b7ac0cb900c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received event network-changed-d2be8515-193f-43f4-bae4-d2a509320929 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:49:52 np0005588920 nova_compute[226886]: 2026-01-20 14:49:52.934 226890 DEBUG nova.compute.manager [req-453f081c-4aa1-4003-b770-69a075a2ccbd req-771c2641-7568-498f-bd40-9b7ac0cb900c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Refreshing instance network info cache due to event network-changed-d2be8515-193f-43f4-bae4-d2a509320929. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:49:52 np0005588920 nova_compute[226886]: 2026-01-20 14:49:52.935 226890 DEBUG oslo_concurrency.lockutils [req-453f081c-4aa1-4003-b770-69a075a2ccbd req-771c2641-7568-498f-bd40-9b7ac0cb900c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-f444ccf6-5adb-489a-b174-7450017a351b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:49:52 np0005588920 nova_compute[226886]: 2026-01-20 14:49:52.935 226890 DEBUG oslo_concurrency.lockutils [req-453f081c-4aa1-4003-b770-69a075a2ccbd req-771c2641-7568-498f-bd40-9b7ac0cb900c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-f444ccf6-5adb-489a-b174-7450017a351b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:49:52 np0005588920 nova_compute[226886]: 2026-01-20 14:49:52.935 226890 DEBUG nova.network.neutron [req-453f081c-4aa1-4003-b770-69a075a2ccbd req-771c2641-7568-498f-bd40-9b7ac0cb900c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Refreshing network info cache for port d2be8515-193f-43f4-bae4-d2a509320929 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:49:53 np0005588920 nova_compute[226886]: 2026-01-20 14:49:53.295 226890 DEBUG nova.network.neutron [req-453f081c-4aa1-4003-b770-69a075a2ccbd req-771c2641-7568-498f-bd40-9b7ac0cb900c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:49:53 np0005588920 nova_compute[226886]: 2026-01-20 14:49:53.353 226890 DEBUG nova.network.neutron [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Successfully updated port: e10436e2-7916-4b6b-905e-e9be7cb338b9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:49:53 np0005588920 nova_compute[226886]: 2026-01-20 14:49:53.512 226890 DEBUG nova.compute.manager [req-fa6b6b93-29ce-4291-bab5-ae82a80cb416 req-a619ccb3-2c59-430f-b10b-5bdd57426c99 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received event network-changed-e10436e2-7916-4b6b-905e-e9be7cb338b9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:49:53 np0005588920 nova_compute[226886]: 2026-01-20 14:49:53.512 226890 DEBUG nova.compute.manager [req-fa6b6b93-29ce-4291-bab5-ae82a80cb416 req-a619ccb3-2c59-430f-b10b-5bdd57426c99 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Refreshing instance network info cache due to event network-changed-e10436e2-7916-4b6b-905e-e9be7cb338b9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:49:53 np0005588920 nova_compute[226886]: 2026-01-20 14:49:53.513 226890 DEBUG oslo_concurrency.lockutils [req-fa6b6b93-29ce-4291-bab5-ae82a80cb416 req-a619ccb3-2c59-430f-b10b-5bdd57426c99 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-f444ccf6-5adb-489a-b174-7450017a351b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:49:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:53.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:53 np0005588920 nova_compute[226886]: 2026-01-20 14:49:53.891 226890 DEBUG nova.network.neutron [req-453f081c-4aa1-4003-b770-69a075a2ccbd req-771c2641-7568-498f-bd40-9b7ac0cb900c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:49:53 np0005588920 nova_compute[226886]: 2026-01-20 14:49:53.920 226890 DEBUG oslo_concurrency.lockutils [req-453f081c-4aa1-4003-b770-69a075a2ccbd req-771c2641-7568-498f-bd40-9b7ac0cb900c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-f444ccf6-5adb-489a-b174-7450017a351b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:49:53 np0005588920 nova_compute[226886]: 2026-01-20 14:49:53.921 226890 DEBUG oslo_concurrency.lockutils [req-fa6b6b93-29ce-4291-bab5-ae82a80cb416 req-a619ccb3-2c59-430f-b10b-5bdd57426c99 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-f444ccf6-5adb-489a-b174-7450017a351b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:49:53 np0005588920 nova_compute[226886]: 2026-01-20 14:49:53.921 226890 DEBUG nova.network.neutron [req-fa6b6b93-29ce-4291-bab5-ae82a80cb416 req-a619ccb3-2c59-430f-b10b-5bdd57426c99 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Refreshing network info cache for port e10436e2-7916-4b6b-905e-e9be7cb338b9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:49:54 np0005588920 nova_compute[226886]: 2026-01-20 14:49:54.176 226890 DEBUG nova.network.neutron [req-fa6b6b93-29ce-4291-bab5-ae82a80cb416 req-a619ccb3-2c59-430f-b10b-5bdd57426c99 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:49:54 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e245 e245: 3 total, 3 up, 3 in
Jan 20 09:49:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:49:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:54.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:49:55 np0005588920 nova_compute[226886]: 2026-01-20 14:49:55.085 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:55 np0005588920 nova_compute[226886]: 2026-01-20 14:49:55.386 226890 DEBUG nova.network.neutron [req-fa6b6b93-29ce-4291-bab5-ae82a80cb416 req-a619ccb3-2c59-430f-b10b-5bdd57426c99 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:49:55 np0005588920 nova_compute[226886]: 2026-01-20 14:49:55.399 226890 DEBUG nova.network.neutron [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Successfully updated port: ea69e1af-9543-4c76-9981-b8475aa031fe _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:49:55 np0005588920 nova_compute[226886]: 2026-01-20 14:49:55.411 226890 DEBUG oslo_concurrency.lockutils [req-fa6b6b93-29ce-4291-bab5-ae82a80cb416 req-a619ccb3-2c59-430f-b10b-5bdd57426c99 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-f444ccf6-5adb-489a-b174-7450017a351b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:49:55 np0005588920 nova_compute[226886]: 2026-01-20 14:49:55.419 226890 DEBUG oslo_concurrency.lockutils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Acquiring lock "refresh_cache-f444ccf6-5adb-489a-b174-7450017a351b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:49:55 np0005588920 nova_compute[226886]: 2026-01-20 14:49:55.419 226890 DEBUG oslo_concurrency.lockutils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Acquired lock "refresh_cache-f444ccf6-5adb-489a-b174-7450017a351b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:49:55 np0005588920 nova_compute[226886]: 2026-01-20 14:49:55.420 226890 DEBUG nova.network.neutron [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:49:55 np0005588920 nova_compute[226886]: 2026-01-20 14:49:55.702 226890 DEBUG nova.compute.manager [req-5ceb9e3c-3b8e-450b-b064-9acfc073fd1e req-5fe63db3-89a3-49fc-aa06-f85d8f878052 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received event network-changed-ea69e1af-9543-4c76-9981-b8475aa031fe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:49:55 np0005588920 nova_compute[226886]: 2026-01-20 14:49:55.703 226890 DEBUG nova.compute.manager [req-5ceb9e3c-3b8e-450b-b064-9acfc073fd1e req-5fe63db3-89a3-49fc-aa06-f85d8f878052 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Refreshing instance network info cache due to event network-changed-ea69e1af-9543-4c76-9981-b8475aa031fe. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:49:55 np0005588920 nova_compute[226886]: 2026-01-20 14:49:55.703 226890 DEBUG oslo_concurrency.lockutils [req-5ceb9e3c-3b8e-450b-b064-9acfc073fd1e req-5fe63db3-89a3-49fc-aa06-f85d8f878052 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-f444ccf6-5adb-489a-b174-7450017a351b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:49:55 np0005588920 nova_compute[226886]: 2026-01-20 14:49:55.776 226890 DEBUG nova.network.neutron [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:49:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:55.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:55 np0005588920 nova_compute[226886]: 2026-01-20 14:49:55.827 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:56.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e245 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:49:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:57.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:58.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:59 np0005588920 ovn_controller[133971]: 2026-01-20T14:49:59Z|00445|memory_trim|INFO|Detected inactivity (last active 30000 ms ago): trimming memory
Jan 20 09:49:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:49:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:59.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:00 np0005588920 nova_compute[226886]: 2026-01-20 14:50:00.086 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:00.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:00 np0005588920 ceph-mon[77148]: overall HEALTH_OK
Jan 20 09:50:00 np0005588920 nova_compute[226886]: 2026-01-20 14:50:00.829 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:01.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e245 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:50:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:50:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:02.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:50:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:50:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:03.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:50:03 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e246 e246: 3 total, 3 up, 3 in
Jan 20 09:50:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:50:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:04.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:50:04 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e247 e247: 3 total, 3 up, 3 in
Jan 20 09:50:05 np0005588920 nova_compute[226886]: 2026-01-20 14:50:05.087 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:05 np0005588920 nova_compute[226886]: 2026-01-20 14:50:05.832 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:05.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:06.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:50:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:07.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:08.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:09 np0005588920 nova_compute[226886]: 2026-01-20 14:50:09.615 226890 DEBUG nova.network.neutron [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Updating instance_info_cache with network_info: [{"id": "ed97bbce-18dc-4c9b-9a04-919dd3a45a8e", "address": "fa:16:3e:f7:b6:ce", "network": {"id": "ff283be9-fe7c-4cc6-900d-7258ea771ba5", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest-1025807292-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "taped97bbce-18", "ovs_interfaceid": "ed97bbce-18dc-4c9b-9a04-919dd3a45a8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "b194a444-cc69-43f2-9931-e9e53ee450c9", "address": "fa:16:3e:2a:e3:2c", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": 
null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb194a444-cc", "ovs_interfaceid": "b194a444-cc69-43f2-9931-e9e53ee450c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "10da9204-0ccb-45d0-981d-fdff5c41cda1", "address": "fa:16:3e:8b:cf:d1", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.179", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10da9204-0c", "ovs_interfaceid": "10da9204-0ccb-45d0-981d-fdff5c41cda1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "62ccfbd3-f504-46d0-a4af-ec2dcb7b5764", "address": "fa:16:3e:a0:d2:6d", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.81", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62ccfbd3-f5", "ovs_interfaceid": "62ccfbd3-f504-46d0-a4af-ec2dcb7b5764", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d2be8515-193f-43f4-bae4-d2a509320929", "address": "fa:16:3e:3d:83:a1", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.252", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2be8515-19", "ovs_interfaceid": "d2be8515-193f-43f4-bae4-d2a509320929", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "e10436e2-7916-4b6b-905e-e9be7cb338b9", "address": "fa:16:3e:c9:16:c9", "network": {"id": "76110867-e0cf-4657-99dd-486c8fecc844", "bridge": "br-int", "label": "tempest-device-tagging-net2-1836923782", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, 
"meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape10436e2-79", "ovs_interfaceid": "e10436e2-7916-4b6b-905e-e9be7cb338b9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ea69e1af-9543-4c76-9981-b8475aa031fe", "address": "fa:16:3e:35:77:c5", "network": {"id": "76110867-e0cf-4657-99dd-486c8fecc844", "bridge": "br-int", "label": "tempest-device-tagging-net2-1836923782", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea69e1af-95", "ovs_interfaceid": "ea69e1af-9543-4c76-9981-b8475aa031fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:50:09 np0005588920 nova_compute[226886]: 2026-01-20 14:50:09.663 226890 DEBUG oslo_concurrency.lockutils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Releasing lock "refresh_cache-f444ccf6-5adb-489a-b174-7450017a351b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:50:09 np0005588920 nova_compute[226886]: 2026-01-20 14:50:09.663 226890 DEBUG nova.compute.manager [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Instance network_info: |[{"id": "ed97bbce-18dc-4c9b-9a04-919dd3a45a8e", "address": "fa:16:3e:f7:b6:ce", "network": {"id": "ff283be9-fe7c-4cc6-900d-7258ea771ba5", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest-1025807292-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "taped97bbce-18", "ovs_interfaceid": "ed97bbce-18dc-4c9b-9a04-919dd3a45a8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "b194a444-cc69-43f2-9931-e9e53ee450c9", "address": "fa:16:3e:2a:e3:2c", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, 
"type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb194a444-cc", "ovs_interfaceid": "b194a444-cc69-43f2-9931-e9e53ee450c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "10da9204-0ccb-45d0-981d-fdff5c41cda1", "address": "fa:16:3e:8b:cf:d1", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.179", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10da9204-0c", "ovs_interfaceid": "10da9204-0ccb-45d0-981d-fdff5c41cda1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "62ccfbd3-f504-46d0-a4af-ec2dcb7b5764", "address": "fa:16:3e:a0:d2:6d", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.81", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62ccfbd3-f5", "ovs_interfaceid": "62ccfbd3-f504-46d0-a4af-ec2dcb7b5764", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d2be8515-193f-43f4-bae4-d2a509320929", "address": "fa:16:3e:3d:83:a1", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.252", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2be8515-19", "ovs_interfaceid": "d2be8515-193f-43f4-bae4-d2a509320929", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "e10436e2-7916-4b6b-905e-e9be7cb338b9", "address": "fa:16:3e:c9:16:c9", "network": {"id": "76110867-e0cf-4657-99dd-486c8fecc844", "bridge": "br-int", "label": "tempest-device-tagging-net2-1836923782", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, 
"meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape10436e2-79", "ovs_interfaceid": "e10436e2-7916-4b6b-905e-e9be7cb338b9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ea69e1af-9543-4c76-9981-b8475aa031fe", "address": "fa:16:3e:35:77:c5", "network": {"id": "76110867-e0cf-4657-99dd-486c8fecc844", "bridge": "br-int", "label": "tempest-device-tagging-net2-1836923782", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea69e1af-95", "ovs_interfaceid": "ea69e1af-9543-4c76-9981-b8475aa031fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:50:09 np0005588920 nova_compute[226886]: 2026-01-20 14:50:09.664 226890 DEBUG oslo_concurrency.lockutils [req-5ceb9e3c-3b8e-450b-b064-9acfc073fd1e req-5fe63db3-89a3-49fc-aa06-f85d8f878052 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-f444ccf6-5adb-489a-b174-7450017a351b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:50:09 np0005588920 nova_compute[226886]: 2026-01-20 14:50:09.664 226890 DEBUG nova.network.neutron [req-5ceb9e3c-3b8e-450b-b064-9acfc073fd1e req-5fe63db3-89a3-49fc-aa06-f85d8f878052 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Refreshing network info cache for port ea69e1af-9543-4c76-9981-b8475aa031fe _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:50:09 np0005588920 nova_compute[226886]: 2026-01-20 14:50:09.675 226890 DEBUG nova.virt.libvirt.driver [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Start _get_guest_xml network_info=[{"id": "ed97bbce-18dc-4c9b-9a04-919dd3a45a8e", "address": "fa:16:3e:f7:b6:ce", "network": {"id": "ff283be9-fe7c-4cc6-900d-7258ea771ba5", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest-1025807292-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "taped97bbce-18", "ovs_interfaceid": "ed97bbce-18dc-4c9b-9a04-919dd3a45a8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "b194a444-cc69-43f2-9931-e9e53ee450c9", "address": "fa:16:3e:2a:e3:2c", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, 
"tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb194a444-cc", "ovs_interfaceid": "b194a444-cc69-43f2-9931-e9e53ee450c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "10da9204-0ccb-45d0-981d-fdff5c41cda1", "address": "fa:16:3e:8b:cf:d1", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.179", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10da9204-0c", "ovs_interfaceid": "10da9204-0ccb-45d0-981d-fdff5c41cda1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "62ccfbd3-f504-46d0-a4af-ec2dcb7b5764", "address": "fa:16:3e:a0:d2:6d", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.81", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62ccfbd3-f5", "ovs_interfaceid": "62ccfbd3-f504-46d0-a4af-ec2dcb7b5764", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d2be8515-193f-43f4-bae4-d2a509320929", "address": "fa:16:3e:3d:83:a1", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.252", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2be8515-19", "ovs_interfaceid": "d2be8515-193f-43f4-bae4-d2a509320929", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "e10436e2-7916-4b6b-905e-e9be7cb338b9", "address": "fa:16:3e:c9:16:c9", "network": {"id": "76110867-e0cf-4657-99dd-486c8fecc844", "bridge": "br-int", "label": "tempest-device-tagging-net2-1836923782", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, 
"meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape10436e2-79", "ovs_interfaceid": "e10436e2-7916-4b6b-905e-e9be7cb338b9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ea69e1af-9543-4c76-9981-b8475aa031fe", "address": "fa:16:3e:35:77:c5", "network": {"id": "76110867-e0cf-4657-99dd-486c8fecc844", "bridge": "br-int", "label": "tempest-device-tagging-net2-1836923782", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea69e1af-95", "ovs_interfaceid": "ea69e1af-9543-4c76-9981-b8475aa031fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vdb': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk', 'boot_index': '2'}, '/dev/vdc': {'bus': 'virtio', 'dev': 'vdc', 'type': 'disk', 'boot_index': '3'}, 
'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01
Jan 20 09:50:09 np0005588920 nova_compute[226886]: k=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'device_type': 'disk', 'boot_index': 0, 'delete_on_termination': False, 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-8cd2ec74-aafb-4e12-a845-44b6fe96ba18', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '8cd2ec74-aafb-4e12-a845-44b6fe96ba18', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'f444ccf6-5adb-489a-b174-7450017a351b', 'attached_at': '', 'detached_at': '', 'volume_id': '8cd2ec74-aafb-4e12-a845-44b6fe96ba18', 'serial': '8cd2ec74-aafb-4e12-a845-44b6fe96ba18'}, 'mount_device': '/dev/vda', 'guest_format': None, 'disk_bus': 'virtio', 'attachment_id': '1c8151c5-460c-482c-8032-494f5d7b3a37', 'volume_type': None}, {'device_type': 'disk', 'boot_index': 1, 'delete_on_termination': False, 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-25d4c4a0-3582-454d-9d4a-312b5c351d9d', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '25d4c4a0-3582-454d-9d4a-312b5c351d9d', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'f444ccf6-5adb-489a-b174-7450017a351b', 'attached_at': 
'', 'detached_at': '', 'volume_id': '25d4c4a0-3582-454d-9d4a-312b5c351d9d', 'serial': '25d4c4a0-3582-454d-9d4a-312b5c351d9d'}, 'mount_device': '/dev/vdb', 'guest_format': None, 'disk_bus': 'virtio', 'attachment_id': 'bb01cd80-4d51-458e-a034-095eecb4b63c', 'volume_type': None}, {'device_type': 'disk', 'boot_index': 2, 'delete_on_termination': False, 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-9a6aba77-de94-4bfe-8062-70d79455ddbe', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '9a6aba77-de94-4bfe-8062-70d79455ddbe', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'f444ccf6-5adb-489a-b174-7450017a351b', 'attached_at': '', 'detached_at': '', 'volume_id': '9a6aba77-de94-4bfe-8062-70d79455ddbe', 'serial': '9a6aba77-de94-4bfe-8062-70d79455ddbe'}, 'mount_device': '/dev/vdc', 'guest_format': None, 'disk_bus': 'virtio', 'attachment_id': 'a8507e5b-0e79-4bf9-bb87-c0c2cd4679ea', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:50:09 np0005588920 nova_compute[226886]: 2026-01-20 14:50:09.679 226890 WARNING nova.virt.libvirt.driver [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:50:09 np0005588920 nova_compute[226886]: 2026-01-20 14:50:09.687 226890 DEBUG nova.virt.libvirt.host [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:50:09 np0005588920 nova_compute[226886]: 2026-01-20 14:50:09.688 226890 DEBUG nova.virt.libvirt.host [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:50:09 np0005588920 nova_compute[226886]: 2026-01-20 14:50:09.698 226890 DEBUG nova.virt.libvirt.host [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:50:09 np0005588920 nova_compute[226886]: 2026-01-20 14:50:09.698 226890 DEBUG nova.virt.libvirt.host [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:50:09 np0005588920 nova_compute[226886]: 2026-01-20 14:50:09.699 226890 DEBUG nova.virt.libvirt.driver [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:50:09 np0005588920 nova_compute[226886]: 2026-01-20 14:50:09.700 226890 DEBUG nova.virt.hardware [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:50:09 np0005588920 nova_compute[226886]: 2026-01-20 14:50:09.700 226890 DEBUG nova.virt.hardware [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:50:09 np0005588920 nova_compute[226886]: 2026-01-20 14:50:09.701 226890 DEBUG nova.virt.hardware [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:50:09 np0005588920 nova_compute[226886]: 2026-01-20 14:50:09.701 226890 DEBUG nova.virt.hardware [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:50:09 np0005588920 nova_compute[226886]: 2026-01-20 14:50:09.701 226890 DEBUG nova.virt.hardware [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:50:09 np0005588920 nova_compute[226886]: 2026-01-20 14:50:09.701 226890 DEBUG nova.virt.hardware [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:50:09 np0005588920 nova_compute[226886]: 2026-01-20 14:50:09.702 226890 DEBUG nova.virt.hardware [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:50:09 np0005588920 nova_compute[226886]: 2026-01-20 14:50:09.702 226890 DEBUG nova.virt.hardware [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:50:09 np0005588920 nova_compute[226886]: 2026-01-20 14:50:09.702 226890 DEBUG nova.virt.hardware [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:50:09 np0005588920 nova_compute[226886]: 2026-01-20 14:50:09.702 226890 DEBUG nova.virt.hardware [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:50:09 np0005588920 nova_compute[226886]: 2026-01-20 14:50:09.703 226890 DEBUG nova.virt.hardware [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:50:09 np0005588920 nova_compute[226886]: 2026-01-20 14:50:09.728 226890 DEBUG nova.storage.rbd_utils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] rbd image f444ccf6-5adb-489a-b174-7450017a351b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:50:09 np0005588920 nova_compute[226886]: 2026-01-20 14:50:09.732 226890 DEBUG oslo_concurrency.processutils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:50:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:09.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:09 np0005588920 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2026-01-20 14:50:09.675 226890 DEBUG nova.virt.libvirt.driver [None req-b5c45e92 [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.106 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:10 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:50:10 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3567694707' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.191 226890 DEBUG oslo_concurrency.processutils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.287 226890 DEBUG nova.virt.libvirt.vif [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:49:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-59996163',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-59996163',id=107,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGiovI3eWhBuKsex9urSvFX3uKzTSBMdGJM+MZZXdjOxuu2em/kXiVf+3Fw7ODXXJEAuGgn6bWpPSlVWZnY7sGK3DnbQgH5/90LwE2A9ResE+BovU1cWvqEkt55sBmeBLw==',key_name='tempest-keypair-854902004',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b15c4e6eb57e4b0ca4e63c85ed92fc5f',ramdisk_id='',reservation_id='r-9tjmx0bx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',im
age_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest-228784294',owner_user_name='tempest-TaggedBootDevicesTest-228784294-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:49:27Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1d45e7e42e6d419898780db108ff93ff',uuid=f444ccf6-5adb-489a-b174-7450017a351b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ed97bbce-18dc-4c9b-9a04-919dd3a45a8e", "address": "fa:16:3e:f7:b6:ce", "network": {"id": "ff283be9-fe7c-4cc6-900d-7258ea771ba5", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest-1025807292-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "taped97bbce-18", "ovs_interfaceid": "ed97bbce-18dc-4c9b-9a04-919dd3a45a8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.288 226890 DEBUG nova.network.os_vif_util [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Converting VIF {"id": "ed97bbce-18dc-4c9b-9a04-919dd3a45a8e", "address": "fa:16:3e:f7:b6:ce", "network": {"id": "ff283be9-fe7c-4cc6-900d-7258ea771ba5", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest-1025807292-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "taped97bbce-18", "ovs_interfaceid": "ed97bbce-18dc-4c9b-9a04-919dd3a45a8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.289 226890 DEBUG nova.network.os_vif_util [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f7:b6:ce,bridge_name='br-int',has_traffic_filtering=True,id=ed97bbce-18dc-4c9b-9a04-919dd3a45a8e,network=Network(ff283be9-fe7c-4cc6-900d-7258ea771ba5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='taped97bbce-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.291 226890 DEBUG nova.virt.libvirt.vif [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:49:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-59996163',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-59996163',id=107,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGiovI3eWhBuKsex9urSvFX3uKzTSBMdGJM+MZZXdjOxuu2em/kXiVf+3Fw7ODXXJEAuGgn6bWpPSlVWZnY7sGK3DnbQgH5/90LwE2A9ResE+BovU1cWvqEkt55sBmeBLw==',key_name='tempest-keypair-854902004',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b15c4e6eb57e4b0ca4e63c85ed92fc5f',ramdisk_id='',reservation_id='r-9tjmx0bx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest-228784294',owner_user_name='tempest-TaggedBootDevicesTest-228784294-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:49:27Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1d45e7e42e6d419898780db108ff93ff',uuid=f444ccf6-5adb-489a-b174-7450017a351b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b194a444-cc69-43f2-9931-e9e53ee450c9", "address": "fa:16:3e:2a:e3:2c", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb194a444-cc", "ovs_interfaceid": "b194a444-cc69-43f2-9931-e9e53ee450c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.291 226890 DEBUG nova.network.os_vif_util [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Converting VIF {"id": "b194a444-cc69-43f2-9931-e9e53ee450c9", "address": "fa:16:3e:2a:e3:2c", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb194a444-cc", "ovs_interfaceid": "b194a444-cc69-43f2-9931-e9e53ee450c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.292 226890 DEBUG nova.network.os_vif_util [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2a:e3:2c,bridge_name='br-int',has_traffic_filtering=True,id=b194a444-cc69-43f2-9931-e9e53ee450c9,network=Network(cfaa226a-b6e0-41ba-a3f5-d7b004368355),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb194a444-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.294 226890 DEBUG nova.virt.libvirt.vif [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:49:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-59996163',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-59996163',id=107,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGiovI3eWhBuKsex9urSvFX3uKzTSBMdGJM+MZZXdjOxuu2em/kXiVf+3Fw7ODXXJEAuGgn6bWpPSlVWZnY7sGK3DnbQgH5/90LwE2A9ResE+BovU1cWvqEkt55sBmeBLw==',key_name='tempest-keypair-854902004',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b15c4e6eb57e4b0ca4e63c85ed92fc5f',ramdisk_id='',reservation_id='r-9tjmx0bx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest-228784294',owner_user_name='tempest-TaggedBootDevicesTest-228784294-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:49:27Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1d45e7e42e6d419898780db108ff93ff',uuid=f444ccf6-5adb-489a-b174-7450017a351b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "10da9204-0ccb-45d0-981d-fdff5c41cda1", "address": "fa:16:3e:8b:cf:d1", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.179", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10da9204-0c", "ovs_interfaceid": "10da9204-0ccb-45d0-981d-fdff5c41cda1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.294 226890 DEBUG nova.network.os_vif_util [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Converting VIF {"id": "10da9204-0ccb-45d0-981d-fdff5c41cda1", "address": "fa:16:3e:8b:cf:d1", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.179", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10da9204-0c", "ovs_interfaceid": "10da9204-0ccb-45d0-981d-fdff5c41cda1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.295 226890 DEBUG nova.network.os_vif_util [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:cf:d1,bridge_name='br-int',has_traffic_filtering=True,id=10da9204-0ccb-45d0-981d-fdff5c41cda1,network=Network(cfaa226a-b6e0-41ba-a3f5-d7b004368355),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap10da9204-0c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.297 226890 DEBUG nova.virt.libvirt.vif [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:49:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-59996163',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-59996163',id=107,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGiovI3eWhBuKsex9urSvFX3uKzTSBMdGJM+MZZXdjOxuu2em/kXiVf+3Fw7ODXXJEAuGgn6bWpPSlVWZnY7sGK3DnbQgH5/90LwE2A9ResE+BovU1cWvqEkt55sBmeBLw==',key_name='tempest-keypair-854902004',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b15c4e6eb57e4b0ca4e63c85ed92fc5f',ramdisk_id='',reservation_id='r-9tjmx0bx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest-228784294',owner_user_name='tempest-TaggedBootDevicesTest-228784294-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:49:27Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1d45e7e42e6d419898780db108ff93ff',uuid=f444ccf6-5adb-489a-b174-7450017a351b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "62ccfbd3-f504-46d0-a4af-ec2dcb7b5764", "address": "fa:16:3e:a0:d2:6d", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.81", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62ccfbd3-f5", "ovs_interfaceid": "62ccfbd3-f504-46d0-a4af-ec2dcb7b5764", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.297 226890 DEBUG nova.network.os_vif_util [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Converting VIF {"id": "62ccfbd3-f504-46d0-a4af-ec2dcb7b5764", "address": "fa:16:3e:a0:d2:6d", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.81", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62ccfbd3-f5", "ovs_interfaceid": "62ccfbd3-f504-46d0-a4af-ec2dcb7b5764", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.298 226890 DEBUG nova.network.os_vif_util [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a0:d2:6d,bridge_name='br-int',has_traffic_filtering=True,id=62ccfbd3-f504-46d0-a4af-ec2dcb7b5764,network=Network(cfaa226a-b6e0-41ba-a3f5-d7b004368355),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap62ccfbd3-f5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.299 226890 DEBUG nova.virt.libvirt.vif [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:49:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-59996163',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-59996163',id=107,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGiovI3eWhBuKsex9urSvFX3uKzTSBMdGJM+MZZXdjOxuu2em/kXiVf+3Fw7ODXXJEAuGgn6bWpPSlVWZnY7sGK3DnbQgH5/90LwE2A9ResE+BovU1cWvqEkt55sBmeBLw==',key_name='tempest-keypair-854902004',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b15c4e6eb57e4b0ca4e63c85ed92fc5f',ramdisk_id='',reservation_id='r-9tjmx0bx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest-228784294',owner_user_name='tempest-TaggedBootDevicesTest-228784294-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:49:27Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1d45e7e42e6d419898780db108ff93ff',uuid=f444ccf6-5adb-489a-b174-7450017a351b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d2be8515-193f-43f4-bae4-d2a509320929", "address": "fa:16:3e:3d:83:a1", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.252", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2be8515-19", "ovs_interfaceid": "d2be8515-193f-43f4-bae4-d2a509320929", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.300 226890 DEBUG nova.network.os_vif_util [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Converting VIF {"id": "d2be8515-193f-43f4-bae4-d2a509320929", "address": "fa:16:3e:3d:83:a1", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.252", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2be8515-19", "ovs_interfaceid": "d2be8515-193f-43f4-bae4-d2a509320929", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.301 226890 DEBUG nova.network.os_vif_util [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3d:83:a1,bridge_name='br-int',has_traffic_filtering=True,id=d2be8515-193f-43f4-bae4-d2a509320929,network=Network(cfaa226a-b6e0-41ba-a3f5-d7b004368355),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd2be8515-19') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.305 226890 DEBUG nova.virt.libvirt.vif [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:49:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-59996163',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-59996163',id=107,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGiovI3eWhBuKsex9urSvFX3uKzTSBMdGJM+MZZXdjOxuu2em/kXiVf+3Fw7ODXXJEAuGgn6bWpPSlVWZnY7sGK3DnbQgH5/90LwE2A9ResE+BovU1cWvqEkt55sBmeBLw==',key_name='tempest-keypair-854902004',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b15c4e6eb57e4b0ca4e63c85ed92fc5f',ramdisk_id='',reservation_id='r-9tjmx0bx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest-228784294',owner_user_name='tempest-TaggedBootDevicesTest-228784294-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:49:27Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1d45e7e42e6d419898780db108ff93ff',uuid=f444ccf6-5adb-489a-b174-7450017a351b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e10436e2-7916-4b6b-905e-e9be7cb338b9", "address": "fa:16:3e:c9:16:c9", "network": {"id": "76110867-e0cf-4657-99dd-486c8fecc844", "bridge": "br-int", "label": "tempest-device-tagging-net2-1836923782", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape10436e2-79", "ovs_interfaceid": "e10436e2-7916-4b6b-905e-e9be7cb338b9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.305 226890 DEBUG nova.network.os_vif_util [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Converting VIF {"id": "e10436e2-7916-4b6b-905e-e9be7cb338b9", "address": "fa:16:3e:c9:16:c9", "network": {"id": "76110867-e0cf-4657-99dd-486c8fecc844", "bridge": "br-int", "label": "tempest-device-tagging-net2-1836923782", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape10436e2-79", "ovs_interfaceid": "e10436e2-7916-4b6b-905e-e9be7cb338b9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.306 226890 DEBUG nova.network.os_vif_util [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c9:16:c9,bridge_name='br-int',has_traffic_filtering=True,id=e10436e2-7916-4b6b-905e-e9be7cb338b9,network=Network(76110867-e0cf-4657-99dd-486c8fecc844),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape10436e2-79') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.307 226890 DEBUG nova.virt.libvirt.vif [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:49:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-59996163',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-59996163',id=107,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGiovI3eWhBuKsex9urSvFX3uKzTSBMdGJM+MZZXdjOxuu2em/kXiVf+3Fw7ODXXJEAuGgn6bWpPSlVWZnY7sGK3DnbQgH5/90LwE2A9ResE+BovU1cWvqEkt55sBmeBLw==',key_name='tempest-keypair-854902004',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b15c4e6eb57e4b0ca4e63c85ed92fc5f',ramdisk_id='',reservation_id='r-9tjmx0bx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest-228784294',owner_user_name='tempest-TaggedBootDevicesTest-228784294-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:49:27Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1d45e7e42e6d419898780db108ff93ff',uuid=f444ccf6-5adb-489a-b174-7450017a351b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ea69e1af-9543-4c76-9981-b8475aa031fe", "address": "fa:16:3e:35:77:c5", "network": {"id": "76110867-e0cf-4657-99dd-486c8fecc844", "bridge": "br-int", "label": "tempest-device-tagging-net2-1836923782", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea69e1af-95", "ovs_interfaceid": "ea69e1af-9543-4c76-9981-b8475aa031fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.307 226890 DEBUG nova.network.os_vif_util [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Converting VIF {"id": "ea69e1af-9543-4c76-9981-b8475aa031fe", "address": "fa:16:3e:35:77:c5", "network": {"id": "76110867-e0cf-4657-99dd-486c8fecc844", "bridge": "br-int", "label": "tempest-device-tagging-net2-1836923782", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea69e1af-95", "ovs_interfaceid": "ea69e1af-9543-4c76-9981-b8475aa031fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.308 226890 DEBUG nova.network.os_vif_util [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:77:c5,bridge_name='br-int',has_traffic_filtering=True,id=ea69e1af-9543-4c76-9981-b8475aa031fe,network=Network(76110867-e0cf-4657-99dd-486c8fecc844),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea69e1af-95') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.309 226890 DEBUG nova.objects.instance [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Lazy-loading 'pci_devices' on Instance uuid f444ccf6-5adb-489a-b174-7450017a351b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.330 226890 DEBUG nova.virt.libvirt.driver [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:50:10 np0005588920 nova_compute[226886]:  <uuid>f444ccf6-5adb-489a-b174-7450017a351b</uuid>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:  <name>instance-0000006b</name>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <nova:name>tempest-device-tagging-server-59996163</nova:name>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:50:09</nova:creationTime>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:50:10 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:        <nova:user uuid="1d45e7e42e6d419898780db108ff93ff">tempest-TaggedBootDevicesTest-228784294-project-member</nova:user>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:        <nova:project uuid="b15c4e6eb57e4b0ca4e63c85ed92fc5f">tempest-TaggedBootDevicesTest-228784294</nova:project>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:        <nova:port uuid="ed97bbce-18dc-4c9b-9a04-919dd3a45a8e">
Jan 20 09:50:10 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:        <nova:port uuid="b194a444-cc69-43f2-9931-e9e53ee450c9">
Jan 20 09:50:10 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.1.1.14" ipVersion="4"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:        <nova:port uuid="10da9204-0ccb-45d0-981d-fdff5c41cda1">
Jan 20 09:50:10 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.1.1.179" ipVersion="4"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:        <nova:port uuid="62ccfbd3-f504-46d0-a4af-ec2dcb7b5764">
Jan 20 09:50:10 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.1.1.81" ipVersion="4"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:        <nova:port uuid="d2be8515-193f-43f4-bae4-d2a509320929">
Jan 20 09:50:10 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.1.1.252" ipVersion="4"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:        <nova:port uuid="e10436e2-7916-4b6b-905e-e9be7cb338b9">
Jan 20 09:50:10 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.2.2.100" ipVersion="4"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:        <nova:port uuid="ea69e1af-9543-4c76-9981-b8475aa031fe">
Jan 20 09:50:10 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.2.2.200" ipVersion="4"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <entry name="serial">f444ccf6-5adb-489a-b174-7450017a351b</entry>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <entry name="uuid">f444ccf6-5adb-489a-b174-7450017a351b</entry>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/f444ccf6-5adb-489a-b174-7450017a351b_disk.config">
Jan 20 09:50:10 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:50:10 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="volumes/volume-8cd2ec74-aafb-4e12-a845-44b6fe96ba18">
Jan 20 09:50:10 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:50:10 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <serial>8cd2ec74-aafb-4e12-a845-44b6fe96ba18</serial>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="volumes/volume-25d4c4a0-3582-454d-9d4a-312b5c351d9d">
Jan 20 09:50:10 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:50:10 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <target dev="vdb" bus="virtio"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <serial>25d4c4a0-3582-454d-9d4a-312b5c351d9d</serial>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="volumes/volume-9a6aba77-de94-4bfe-8062-70d79455ddbe">
Jan 20 09:50:10 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:50:10 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <target dev="vdc" bus="virtio"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <serial>9a6aba77-de94-4bfe-8062-70d79455ddbe</serial>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:f7:b6:ce"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <target dev="taped97bbce-18"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:2a:e3:2c"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <target dev="tapb194a444-cc"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:8b:cf:d1"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <target dev="tap10da9204-0c"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:a0:d2:6d"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <target dev="tap62ccfbd3-f5"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:3d:83:a1"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <target dev="tapd2be8515-19"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:c9:16:c9"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <target dev="tape10436e2-79"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:35:77:c5"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <target dev="tapea69e1af-95"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/f444ccf6-5adb-489a-b174-7450017a351b/console.log" append="off"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:50:10 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:50:10 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:50:10 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:50:10 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.331 226890 DEBUG nova.compute.manager [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Preparing to wait for external event network-vif-plugged-ed97bbce-18dc-4c9b-9a04-919dd3a45a8e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.331 226890 DEBUG oslo_concurrency.lockutils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Acquiring lock "f444ccf6-5adb-489a-b174-7450017a351b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.331 226890 DEBUG oslo_concurrency.lockutils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.332 226890 DEBUG oslo_concurrency.lockutils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.332 226890 DEBUG nova.compute.manager [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Preparing to wait for external event network-vif-plugged-b194a444-cc69-43f2-9931-e9e53ee450c9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.332 226890 DEBUG oslo_concurrency.lockutils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Acquiring lock "f444ccf6-5adb-489a-b174-7450017a351b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.332 226890 DEBUG oslo_concurrency.lockutils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.332 226890 DEBUG oslo_concurrency.lockutils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.332 226890 DEBUG nova.compute.manager [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Preparing to wait for external event network-vif-plugged-10da9204-0ccb-45d0-981d-fdff5c41cda1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.333 226890 DEBUG oslo_concurrency.lockutils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Acquiring lock "f444ccf6-5adb-489a-b174-7450017a351b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.333 226890 DEBUG oslo_concurrency.lockutils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.333 226890 DEBUG oslo_concurrency.lockutils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.333 226890 DEBUG nova.compute.manager [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Preparing to wait for external event network-vif-plugged-62ccfbd3-f504-46d0-a4af-ec2dcb7b5764 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.333 226890 DEBUG oslo_concurrency.lockutils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Acquiring lock "f444ccf6-5adb-489a-b174-7450017a351b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.334 226890 DEBUG oslo_concurrency.lockutils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.334 226890 DEBUG oslo_concurrency.lockutils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.334 226890 DEBUG nova.compute.manager [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Preparing to wait for external event network-vif-plugged-d2be8515-193f-43f4-bae4-d2a509320929 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.334 226890 DEBUG oslo_concurrency.lockutils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Acquiring lock "f444ccf6-5adb-489a-b174-7450017a351b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.334 226890 DEBUG oslo_concurrency.lockutils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.334 226890 DEBUG oslo_concurrency.lockutils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.334 226890 DEBUG nova.compute.manager [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Preparing to wait for external event network-vif-plugged-e10436e2-7916-4b6b-905e-e9be7cb338b9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.335 226890 DEBUG oslo_concurrency.lockutils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Acquiring lock "f444ccf6-5adb-489a-b174-7450017a351b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.335 226890 DEBUG oslo_concurrency.lockutils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.335 226890 DEBUG oslo_concurrency.lockutils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.335 226890 DEBUG nova.compute.manager [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Preparing to wait for external event network-vif-plugged-ea69e1af-9543-4c76-9981-b8475aa031fe prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.335 226890 DEBUG oslo_concurrency.lockutils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Acquiring lock "f444ccf6-5adb-489a-b174-7450017a351b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.335 226890 DEBUG oslo_concurrency.lockutils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.335 226890 DEBUG oslo_concurrency.lockutils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.336 226890 DEBUG nova.virt.libvirt.vif [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:49:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-59996163',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-59996163',id=107,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGiovI3eWhBuKsex9urSvFX3uKzTSBMdGJM+MZZXdjOxuu2em/kXiVf+3Fw7ODXXJEAuGgn6bWpPSlVWZnY7sGK3DnbQgH5/90LwE2A9ResE+BovU1cWvqEkt55sBmeBLw==',key_name='tempest-keypair-854902004',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b15c4e6eb57e4b0ca4e63c85ed92fc5f',ramdisk_id='',reservation_id='r-9tjmx0bx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest-228784294',owner_user_name='tempest-TaggedBootDevicesTest-228784294-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:49:27Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1d45e7e42e6d419898780db108ff93ff',uuid=f444ccf6-5adb-489a-b174-7450017a351b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ed97bbce-18dc-4c9b-9a04-919dd3a45a8e", "address": "fa:16:3e:f7:b6:ce", "network": {"id": "ff283be9-fe7c-4cc6-900d-7258ea771ba5", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest-1025807292-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "taped97bbce-18", "ovs_interfaceid": "ed97bbce-18dc-4c9b-9a04-919dd3a45a8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.336 226890 DEBUG nova.network.os_vif_util [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Converting VIF {"id": "ed97bbce-18dc-4c9b-9a04-919dd3a45a8e", "address": "fa:16:3e:f7:b6:ce", "network": {"id": "ff283be9-fe7c-4cc6-900d-7258ea771ba5", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest-1025807292-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "taped97bbce-18", "ovs_interfaceid": "ed97bbce-18dc-4c9b-9a04-919dd3a45a8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.337 226890 DEBUG nova.network.os_vif_util [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f7:b6:ce,bridge_name='br-int',has_traffic_filtering=True,id=ed97bbce-18dc-4c9b-9a04-919dd3a45a8e,network=Network(ff283be9-fe7c-4cc6-900d-7258ea771ba5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='taped97bbce-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.337 226890 DEBUG os_vif [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f7:b6:ce,bridge_name='br-int',has_traffic_filtering=True,id=ed97bbce-18dc-4c9b-9a04-919dd3a45a8e,network=Network(ff283be9-fe7c-4cc6-900d-7258ea771ba5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='taped97bbce-18') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.337 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.338 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.338 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.341 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.341 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=taped97bbce-18, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.341 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=taped97bbce-18, col_values=(('external_ids', {'iface-id': 'ed97bbce-18dc-4c9b-9a04-919dd3a45a8e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f7:b6:ce', 'vm-uuid': 'f444ccf6-5adb-489a-b174-7450017a351b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:10 np0005588920 NetworkManager[49076]: <info>  [1768920610.3437] manager: (taped97bbce-18): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/225)
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.342 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.346 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.349 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.350 226890 INFO os_vif [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f7:b6:ce,bridge_name='br-int',has_traffic_filtering=True,id=ed97bbce-18dc-4c9b-9a04-919dd3a45a8e,network=Network(ff283be9-fe7c-4cc6-900d-7258ea771ba5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='taped97bbce-18')#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.351 226890 DEBUG nova.virt.libvirt.vif [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:49:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-59996163',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-59996163',id=107,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGiovI3eWhBuKsex9urSvFX3uKzTSBMdGJM+MZZXdjOxuu2em/kXiVf+3Fw7ODXXJEAuGgn6bWpPSlVWZnY7sGK3DnbQgH5/90LwE2A9ResE+BovU1cWvqEkt55sBmeBLw==',key_name='tempest-keypair-854902004',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b15c4e6eb57e4b0ca4e63c85ed92fc5f',ramdisk_id='',reservation_id='r-9tjmx0bx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest-228784294',owner_user_name='tempest-TaggedBootDevicesTest-228784294-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:49:27Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1d45e7e42e6d419898780db108ff93ff',uuid=f444ccf6-5adb-489a-b174-7450017a351b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b194a444-cc69-43f2-9931-e9e53ee450c9", "address": "fa:16:3e:2a:e3:2c", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb194a444-cc", "ovs_interfaceid": "b194a444-cc69-43f2-9931-e9e53ee450c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.351 226890 DEBUG nova.network.os_vif_util [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Converting VIF {"id": "b194a444-cc69-43f2-9931-e9e53ee450c9", "address": "fa:16:3e:2a:e3:2c", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb194a444-cc", "ovs_interfaceid": "b194a444-cc69-43f2-9931-e9e53ee450c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.351 226890 DEBUG nova.network.os_vif_util [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2a:e3:2c,bridge_name='br-int',has_traffic_filtering=True,id=b194a444-cc69-43f2-9931-e9e53ee450c9,network=Network(cfaa226a-b6e0-41ba-a3f5-d7b004368355),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb194a444-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.352 226890 DEBUG os_vif [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2a:e3:2c,bridge_name='br-int',has_traffic_filtering=True,id=b194a444-cc69-43f2-9931-e9e53ee450c9,network=Network(cfaa226a-b6e0-41ba-a3f5-d7b004368355),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb194a444-cc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.352 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.352 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.353 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.354 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.354 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb194a444-cc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.355 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb194a444-cc, col_values=(('external_ids', {'iface-id': 'b194a444-cc69-43f2-9931-e9e53ee450c9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2a:e3:2c', 'vm-uuid': 'f444ccf6-5adb-489a-b174-7450017a351b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:10 np0005588920 NetworkManager[49076]: <info>  [1768920610.3570] manager: (tapb194a444-cc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/226)
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.357 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.359 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.363 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.363 226890 INFO os_vif [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2a:e3:2c,bridge_name='br-int',has_traffic_filtering=True,id=b194a444-cc69-43f2-9931-e9e53ee450c9,network=Network(cfaa226a-b6e0-41ba-a3f5-d7b004368355),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb194a444-cc')#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.364 226890 DEBUG nova.virt.libvirt.vif [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:49:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-59996163',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-59996163',id=107,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGiovI3eWhBuKsex9urSvFX3uKzTSBMdGJM+MZZXdjOxuu2em/kXiVf+3Fw7ODXXJEAuGgn6bWpPSlVWZnY7sGK3DnbQgH5/90LwE2A9ResE+BovU1cWvqEkt55sBmeBLw==',key_name='tempest-keypair-854902004',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b15c4e6eb57e4b0ca4e63c85ed92fc5f',ramdisk_id='',reservation_id='r-9tjmx0bx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest-228784294',owner_user_name='tempest-TaggedBootDevicesTest-228784294-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:49:27Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1d45e7e42e6d419898780db108ff93ff',uuid=f444ccf6-5adb-489a-b174-7450017a351b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "10da9204-0ccb-45d0-981d-fdff5c41cda1", "address": "fa:16:3e:8b:cf:d1", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.179", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10da9204-0c", "ovs_interfaceid": "10da9204-0ccb-45d0-981d-fdff5c41cda1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.364 226890 DEBUG nova.network.os_vif_util [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Converting VIF {"id": "10da9204-0ccb-45d0-981d-fdff5c41cda1", "address": "fa:16:3e:8b:cf:d1", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.179", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10da9204-0c", "ovs_interfaceid": "10da9204-0ccb-45d0-981d-fdff5c41cda1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.365 226890 DEBUG nova.network.os_vif_util [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:cf:d1,bridge_name='br-int',has_traffic_filtering=True,id=10da9204-0ccb-45d0-981d-fdff5c41cda1,network=Network(cfaa226a-b6e0-41ba-a3f5-d7b004368355),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap10da9204-0c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.365 226890 DEBUG os_vif [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:cf:d1,bridge_name='br-int',has_traffic_filtering=True,id=10da9204-0ccb-45d0-981d-fdff5c41cda1,network=Network(cfaa226a-b6e0-41ba-a3f5-d7b004368355),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap10da9204-0c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.366 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.366 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.366 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.368 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.368 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap10da9204-0c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.368 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap10da9204-0c, col_values=(('external_ids', {'iface-id': '10da9204-0ccb-45d0-981d-fdff5c41cda1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8b:cf:d1', 'vm-uuid': 'f444ccf6-5adb-489a-b174-7450017a351b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.369 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:10 np0005588920 NetworkManager[49076]: <info>  [1768920610.3707] manager: (tap10da9204-0c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/227)
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.372 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.378 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.379 226890 INFO os_vif [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:cf:d1,bridge_name='br-int',has_traffic_filtering=True,id=10da9204-0ccb-45d0-981d-fdff5c41cda1,network=Network(cfaa226a-b6e0-41ba-a3f5-d7b004368355),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap10da9204-0c')#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.380 226890 DEBUG nova.virt.libvirt.vif [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:49:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-59996163',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-59996163',id=107,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGiovI3eWhBuKsex9urSvFX3uKzTSBMdGJM+MZZXdjOxuu2em/kXiVf+3Fw7ODXXJEAuGgn6bWpPSlVWZnY7sGK3DnbQgH5/90LwE2A9ResE+BovU1cWvqEkt55sBmeBLw==',key_name='tempest-keypair-854902004',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b15c4e6eb57e4b0ca4e63c85ed92fc5f',ramdisk_id='',reservation_id='r-9tjmx0bx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest-228784294',owner_user_name='tempest-TaggedBootDevicesTest-228784294-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:49:27Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1d45e7e42e6d419898780db108ff93ff',uuid=f444ccf6-5adb-489a-b174-7450017a351b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "62ccfbd3-f504-46d0-a4af-ec2dcb7b5764", "address": "fa:16:3e:a0:d2:6d", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.81", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62ccfbd3-f5", "ovs_interfaceid": "62ccfbd3-f504-46d0-a4af-ec2dcb7b5764", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.380 226890 DEBUG nova.network.os_vif_util [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Converting VIF {"id": "62ccfbd3-f504-46d0-a4af-ec2dcb7b5764", "address": "fa:16:3e:a0:d2:6d", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.81", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62ccfbd3-f5", "ovs_interfaceid": "62ccfbd3-f504-46d0-a4af-ec2dcb7b5764", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.381 226890 DEBUG nova.network.os_vif_util [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a0:d2:6d,bridge_name='br-int',has_traffic_filtering=True,id=62ccfbd3-f504-46d0-a4af-ec2dcb7b5764,network=Network(cfaa226a-b6e0-41ba-a3f5-d7b004368355),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap62ccfbd3-f5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.381 226890 DEBUG os_vif [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a0:d2:6d,bridge_name='br-int',has_traffic_filtering=True,id=62ccfbd3-f504-46d0-a4af-ec2dcb7b5764,network=Network(cfaa226a-b6e0-41ba-a3f5-d7b004368355),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap62ccfbd3-f5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.381 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.382 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.382 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.384 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.384 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap62ccfbd3-f5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.384 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap62ccfbd3-f5, col_values=(('external_ids', {'iface-id': '62ccfbd3-f504-46d0-a4af-ec2dcb7b5764', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a0:d2:6d', 'vm-uuid': 'f444ccf6-5adb-489a-b174-7450017a351b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.385 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:10 np0005588920 NetworkManager[49076]: <info>  [1768920610.3864] manager: (tap62ccfbd3-f5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/228)
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.388 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.395 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.396 226890 INFO os_vif [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a0:d2:6d,bridge_name='br-int',has_traffic_filtering=True,id=62ccfbd3-f504-46d0-a4af-ec2dcb7b5764,network=Network(cfaa226a-b6e0-41ba-a3f5-d7b004368355),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap62ccfbd3-f5')#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.396 226890 DEBUG nova.virt.libvirt.vif [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:49:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-59996163',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-59996163',id=107,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGiovI3eWhBuKsex9urSvFX3uKzTSBMdGJM+MZZXdjOxuu2em/kXiVf+3Fw7ODXXJEAuGgn6bWpPSlVWZnY7sGK3DnbQgH5/90LwE2A9ResE+BovU1cWvqEkt55sBmeBLw==',key_name='tempest-keypair-854902004',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b15c4e6eb57e4b0ca4e63c85ed92fc5f',ramdisk_id='',reservation_id='r-9tjmx0bx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='
virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest-228784294',owner_user_name='tempest-TaggedBootDevicesTest-228784294-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:49:27Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1d45e7e42e6d419898780db108ff93ff',uuid=f444ccf6-5adb-489a-b174-7450017a351b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d2be8515-193f-43f4-bae4-d2a509320929", "address": "fa:16:3e:3d:83:a1", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.252", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2be8515-19", "ovs_interfaceid": "d2be8515-193f-43f4-bae4-d2a509320929", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.396 226890 DEBUG nova.network.os_vif_util [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Converting VIF {"id": "d2be8515-193f-43f4-bae4-d2a509320929", "address": "fa:16:3e:3d:83:a1", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.252", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2be8515-19", "ovs_interfaceid": "d2be8515-193f-43f4-bae4-d2a509320929", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.397 226890 DEBUG nova.network.os_vif_util [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3d:83:a1,bridge_name='br-int',has_traffic_filtering=True,id=d2be8515-193f-43f4-bae4-d2a509320929,network=Network(cfaa226a-b6e0-41ba-a3f5-d7b004368355),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd2be8515-19') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.397 226890 DEBUG os_vif [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3d:83:a1,bridge_name='br-int',has_traffic_filtering=True,id=d2be8515-193f-43f4-bae4-d2a509320929,network=Network(cfaa226a-b6e0-41ba-a3f5-d7b004368355),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd2be8515-19') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.398 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.398 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.398 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.399 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.400 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd2be8515-19, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.400 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd2be8515-19, col_values=(('external_ids', {'iface-id': 'd2be8515-193f-43f4-bae4-d2a509320929', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3d:83:a1', 'vm-uuid': 'f444ccf6-5adb-489a-b174-7450017a351b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.401 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:10 np0005588920 NetworkManager[49076]: <info>  [1768920610.4019] manager: (tapd2be8515-19): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/229)
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.403 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.413 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.414 226890 INFO os_vif [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3d:83:a1,bridge_name='br-int',has_traffic_filtering=True,id=d2be8515-193f-43f4-bae4-d2a509320929,network=Network(cfaa226a-b6e0-41ba-a3f5-d7b004368355),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd2be8515-19')#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.414 226890 DEBUG nova.virt.libvirt.vif [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:49:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-59996163',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-59996163',id=107,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGiovI3eWhBuKsex9urSvFX3uKzTSBMdGJM+MZZXdjOxuu2em/kXiVf+3Fw7ODXXJEAuGgn6bWpPSlVWZnY7sGK3DnbQgH5/90LwE2A9ResE+BovU1cWvqEkt55sBmeBLw==',key_name='tempest-keypair-854902004',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b15c4e6eb57e4b0ca4e63c85ed92fc5f',ramdisk_id='',reservation_id='r-9tjmx0bx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='
virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest-228784294',owner_user_name='tempest-TaggedBootDevicesTest-228784294-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:49:27Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1d45e7e42e6d419898780db108ff93ff',uuid=f444ccf6-5adb-489a-b174-7450017a351b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e10436e2-7916-4b6b-905e-e9be7cb338b9", "address": "fa:16:3e:c9:16:c9", "network": {"id": "76110867-e0cf-4657-99dd-486c8fecc844", "bridge": "br-int", "label": "tempest-device-tagging-net2-1836923782", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape10436e2-79", "ovs_interfaceid": "e10436e2-7916-4b6b-905e-e9be7cb338b9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.415 226890 DEBUG nova.network.os_vif_util [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Converting VIF {"id": "e10436e2-7916-4b6b-905e-e9be7cb338b9", "address": "fa:16:3e:c9:16:c9", "network": {"id": "76110867-e0cf-4657-99dd-486c8fecc844", "bridge": "br-int", "label": "tempest-device-tagging-net2-1836923782", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape10436e2-79", "ovs_interfaceid": "e10436e2-7916-4b6b-905e-e9be7cb338b9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.415 226890 DEBUG nova.network.os_vif_util [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c9:16:c9,bridge_name='br-int',has_traffic_filtering=True,id=e10436e2-7916-4b6b-905e-e9be7cb338b9,network=Network(76110867-e0cf-4657-99dd-486c8fecc844),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape10436e2-79') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.415 226890 DEBUG os_vif [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c9:16:c9,bridge_name='br-int',has_traffic_filtering=True,id=e10436e2-7916-4b6b-905e-e9be7cb338b9,network=Network(76110867-e0cf-4657-99dd-486c8fecc844),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape10436e2-79') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.416 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.416 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.416 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.418 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.418 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape10436e2-79, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.419 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape10436e2-79, col_values=(('external_ids', {'iface-id': 'e10436e2-7916-4b6b-905e-e9be7cb338b9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c9:16:c9', 'vm-uuid': 'f444ccf6-5adb-489a-b174-7450017a351b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.420 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:10 np0005588920 NetworkManager[49076]: <info>  [1768920610.4214] manager: (tape10436e2-79): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/230)
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.422 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.437 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.438 226890 INFO os_vif [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c9:16:c9,bridge_name='br-int',has_traffic_filtering=True,id=e10436e2-7916-4b6b-905e-e9be7cb338b9,network=Network(76110867-e0cf-4657-99dd-486c8fecc844),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape10436e2-79')#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.439 226890 DEBUG nova.virt.libvirt.vif [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:49:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-59996163',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-59996163',id=107,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGiovI3eWhBuKsex9urSvFX3uKzTSBMdGJM+MZZXdjOxuu2em/kXiVf+3Fw7ODXXJEAuGgn6bWpPSlVWZnY7sGK3DnbQgH5/90LwE2A9ResE+BovU1cWvqEkt55sBmeBLw==',key_name='tempest-keypair-854902004',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b15c4e6eb57e4b0ca4e63c85ed92fc5f',ramdisk_id='',reservation_id='r-9tjmx0bx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='
virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest-228784294',owner_user_name='tempest-TaggedBootDevicesTest-228784294-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:49:27Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1d45e7e42e6d419898780db108ff93ff',uuid=f444ccf6-5adb-489a-b174-7450017a351b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ea69e1af-9543-4c76-9981-b8475aa031fe", "address": "fa:16:3e:35:77:c5", "network": {"id": "76110867-e0cf-4657-99dd-486c8fecc844", "bridge": "br-int", "label": "tempest-device-tagging-net2-1836923782", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea69e1af-95", "ovs_interfaceid": "ea69e1af-9543-4c76-9981-b8475aa031fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.439 226890 DEBUG nova.network.os_vif_util [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Converting VIF {"id": "ea69e1af-9543-4c76-9981-b8475aa031fe", "address": "fa:16:3e:35:77:c5", "network": {"id": "76110867-e0cf-4657-99dd-486c8fecc844", "bridge": "br-int", "label": "tempest-device-tagging-net2-1836923782", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea69e1af-95", "ovs_interfaceid": "ea69e1af-9543-4c76-9981-b8475aa031fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.440 226890 DEBUG nova.network.os_vif_util [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:77:c5,bridge_name='br-int',has_traffic_filtering=True,id=ea69e1af-9543-4c76-9981-b8475aa031fe,network=Network(76110867-e0cf-4657-99dd-486c8fecc844),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea69e1af-95') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.440 226890 DEBUG os_vif [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:77:c5,bridge_name='br-int',has_traffic_filtering=True,id=ea69e1af-9543-4c76-9981-b8475aa031fe,network=Network(76110867-e0cf-4657-99dd-486c8fecc844),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea69e1af-95') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.441 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.441 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.441 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.444 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.444 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapea69e1af-95, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.444 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapea69e1af-95, col_values=(('external_ids', {'iface-id': 'ea69e1af-9543-4c76-9981-b8475aa031fe', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:35:77:c5', 'vm-uuid': 'f444ccf6-5adb-489a-b174-7450017a351b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.445 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:10 np0005588920 NetworkManager[49076]: <info>  [1768920610.4467] manager: (tapea69e1af-95): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/231)
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.448 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.462 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.463 226890 INFO os_vif [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:77:c5,bridge_name='br-int',has_traffic_filtering=True,id=ea69e1af-9543-4c76-9981-b8475aa031fe,network=Network(76110867-e0cf-4657-99dd-486c8fecc844),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea69e1af-95')#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.522 226890 DEBUG nova.virt.libvirt.driver [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.522 226890 DEBUG nova.virt.libvirt.driver [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] No BDM found with device name vdc, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.522 226890 DEBUG nova.virt.libvirt.driver [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] No VIF found with MAC fa:16:3e:f7:b6:ce, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.522 226890 DEBUG nova.virt.libvirt.driver [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] No VIF found with MAC fa:16:3e:3d:83:a1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.523 226890 INFO nova.virt.libvirt.driver [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Using config drive#033[00m
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.542 226890 DEBUG nova.storage.rbd_utils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] rbd image f444ccf6-5adb-489a-b174-7450017a351b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:50:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:50:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:10.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:50:10 np0005588920 nova_compute[226886]: 2026-01-20 14:50:10.835 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:11.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:12 np0005588920 nova_compute[226886]: 2026-01-20 14:50:12.377 226890 INFO nova.virt.libvirt.driver [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Creating config drive at /var/lib/nova/instances/f444ccf6-5adb-489a-b174-7450017a351b/disk.config#033[00m
Jan 20 09:50:12 np0005588920 nova_compute[226886]: 2026-01-20 14:50:12.390 226890 DEBUG oslo_concurrency.processutils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f444ccf6-5adb-489a-b174-7450017a351b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6_g_fmy6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:50:12 np0005588920 nova_compute[226886]: 2026-01-20 14:50:12.543 226890 DEBUG oslo_concurrency.processutils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f444ccf6-5adb-489a-b174-7450017a351b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6_g_fmy6" returned: 0 in 0.153s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:50:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:50:12 np0005588920 nova_compute[226886]: 2026-01-20 14:50:12.573 226890 DEBUG nova.storage.rbd_utils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] rbd image f444ccf6-5adb-489a-b174-7450017a351b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:50:12 np0005588920 nova_compute[226886]: 2026-01-20 14:50:12.577 226890 DEBUG oslo_concurrency.processutils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/f444ccf6-5adb-489a-b174-7450017a351b/disk.config f444ccf6-5adb-489a-b174-7450017a351b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:50:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:12.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:12 np0005588920 nova_compute[226886]: 2026-01-20 14:50:12.750 226890 DEBUG oslo_concurrency.processutils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/f444ccf6-5adb-489a-b174-7450017a351b/disk.config f444ccf6-5adb-489a-b174-7450017a351b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.173s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:50:12 np0005588920 nova_compute[226886]: 2026-01-20 14:50:12.751 226890 INFO nova.virt.libvirt.driver [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Deleting local config drive /var/lib/nova/instances/f444ccf6-5adb-489a-b174-7450017a351b/disk.config because it was imported into RBD.#033[00m
Jan 20 09:50:12 np0005588920 NetworkManager[49076]: <info>  [1768920612.8146] manager: (taped97bbce-18): new Tun device (/org/freedesktop/NetworkManager/Devices/232)
Jan 20 09:50:12 np0005588920 kernel: taped97bbce-18: entered promiscuous mode
Jan 20 09:50:12 np0005588920 nova_compute[226886]: 2026-01-20 14:50:12.829 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:12 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:12Z|00446|binding|INFO|Claiming lport ed97bbce-18dc-4c9b-9a04-919dd3a45a8e for this chassis.
Jan 20 09:50:12 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:12Z|00447|binding|INFO|ed97bbce-18dc-4c9b-9a04-919dd3a45a8e: Claiming fa:16:3e:f7:b6:ce 10.100.0.10
Jan 20 09:50:12 np0005588920 NetworkManager[49076]: <info>  [1768920612.8328] manager: (tapb194a444-cc): new Tun device (/org/freedesktop/NetworkManager/Devices/233)
Jan 20 09:50:12 np0005588920 systemd-udevd[264317]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:50:12 np0005588920 systemd-udevd[264319]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:50:12 np0005588920 NetworkManager[49076]: <info>  [1768920612.8491] manager: (tap10da9204-0c): new Tun device (/org/freedesktop/NetworkManager/Devices/234)
Jan 20 09:50:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:12.849 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f7:b6:ce 10.100.0.10'], port_security=['fa:16:3e:f7:b6:ce 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'f444ccf6-5adb-489a-b174-7450017a351b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ff283be9-fe7c-4cc6-900d-7258ea771ba5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b15c4e6eb57e4b0ca4e63c85ed92fc5f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '47ff9a81-c98c-4db5-ad99-41dc6ffcd899', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=15443dea-d8ce-4297-a7fe-a5cbf38bfa28, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=ed97bbce-18dc-4c9b-9a04-919dd3a45a8e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:50:12 np0005588920 systemd-udevd[264321]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:50:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:12.850 144128 INFO neutron.agent.ovn.metadata.agent [-] Port ed97bbce-18dc-4c9b-9a04-919dd3a45a8e in datapath ff283be9-fe7c-4cc6-900d-7258ea771ba5 bound to our chassis#033[00m
Jan 20 09:50:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:12.852 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ff283be9-fe7c-4cc6-900d-7258ea771ba5#033[00m
Jan 20 09:50:12 np0005588920 NetworkManager[49076]: <info>  [1768920612.8600] device (taped97bbce-18): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:50:12 np0005588920 NetworkManager[49076]: <info>  [1768920612.8613] device (taped97bbce-18): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:50:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:12.863 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[4402ce1d-1677-4585-bb3b-023b8b860cad]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:12.863 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapff283be9-f1 in ovnmeta-ff283be9-fe7c-4cc6-900d-7258ea771ba5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:50:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:12.865 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapff283be9-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:50:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:12.866 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[11e94d91-e0b1-4257-b4a5-e73e8cf71a28]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:12.867 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[fa072e6b-67dd-4321-8752-a435401bb25b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:12 np0005588920 NetworkManager[49076]: <info>  [1768920612.8692] manager: (tap62ccfbd3-f5): new Tun device (/org/freedesktop/NetworkManager/Devices/235)
Jan 20 09:50:12 np0005588920 NetworkManager[49076]: <info>  [1768920612.8810] manager: (tapd2be8515-19): new Tun device (/org/freedesktop/NetworkManager/Devices/236)
Jan 20 09:50:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:12.882 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[bbbb36e1-cdcb-4218-9809-5207e9869ef1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:12 np0005588920 kernel: tap62ccfbd3-f5: entered promiscuous mode
Jan 20 09:50:12 np0005588920 NetworkManager[49076]: <info>  [1768920612.8977] manager: (tape10436e2-79): new Tun device (/org/freedesktop/NetworkManager/Devices/237)
Jan 20 09:50:12 np0005588920 NetworkManager[49076]: <info>  [1768920612.8988] device (tap62ccfbd3-f5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:50:12 np0005588920 kernel: tap10da9204-0c: entered promiscuous mode
Jan 20 09:50:12 np0005588920 kernel: tapd2be8515-19: entered promiscuous mode
Jan 20 09:50:12 np0005588920 NetworkManager[49076]: <info>  [1768920612.9003] device (tap10da9204-0c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:50:12 np0005588920 kernel: tape10436e2-79: entered promiscuous mode
Jan 20 09:50:12 np0005588920 kernel: tapb194a444-cc: entered promiscuous mode
Jan 20 09:50:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:12.899 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[4e0dcd4b-13f9-4d1d-b4f0-1db6518000bb]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:12 np0005588920 NetworkManager[49076]: <info>  [1768920612.9013] device (tapd2be8515-19): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:50:12 np0005588920 NetworkManager[49076]: <info>  [1768920612.9025] device (tap62ccfbd3-f5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:50:12 np0005588920 NetworkManager[49076]: <info>  [1768920612.9031] device (tapb194a444-cc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:50:12 np0005588920 NetworkManager[49076]: <info>  [1768920612.9041] device (tap10da9204-0c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:50:12 np0005588920 nova_compute[226886]: 2026-01-20 14:50:12.903 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:12 np0005588920 NetworkManager[49076]: <info>  [1768920612.9046] device (tapd2be8515-19): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:50:12 np0005588920 NetworkManager[49076]: <info>  [1768920612.9052] device (tapb194a444-cc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:50:12 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:12Z|00448|binding|INFO|Claiming lport b194a444-cc69-43f2-9931-e9e53ee450c9 for this chassis.
Jan 20 09:50:12 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:12Z|00449|binding|INFO|b194a444-cc69-43f2-9931-e9e53ee450c9: Claiming fa:16:3e:2a:e3:2c 10.1.1.14
Jan 20 09:50:12 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:12Z|00450|binding|INFO|Claiming lport 10da9204-0ccb-45d0-981d-fdff5c41cda1 for this chassis.
Jan 20 09:50:12 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:12Z|00451|binding|INFO|10da9204-0ccb-45d0-981d-fdff5c41cda1: Claiming fa:16:3e:8b:cf:d1 10.1.1.179
Jan 20 09:50:12 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:12Z|00452|binding|INFO|Claiming lport d2be8515-193f-43f4-bae4-d2a509320929 for this chassis.
Jan 20 09:50:12 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:12Z|00453|binding|INFO|d2be8515-193f-43f4-bae4-d2a509320929: Claiming fa:16:3e:3d:83:a1 10.1.1.252
Jan 20 09:50:12 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:12Z|00454|binding|INFO|Claiming lport 62ccfbd3-f504-46d0-a4af-ec2dcb7b5764 for this chassis.
Jan 20 09:50:12 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:12Z|00455|binding|INFO|62ccfbd3-f504-46d0-a4af-ec2dcb7b5764: Claiming fa:16:3e:a0:d2:6d 10.1.1.81
Jan 20 09:50:12 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:12Z|00456|binding|INFO|Claiming lport e10436e2-7916-4b6b-905e-e9be7cb338b9 for this chassis.
Jan 20 09:50:12 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:12Z|00457|binding|INFO|e10436e2-7916-4b6b-905e-e9be7cb338b9: Claiming fa:16:3e:c9:16:c9 10.2.2.100
Jan 20 09:50:12 np0005588920 kernel: tapea69e1af-95: entered promiscuous mode
Jan 20 09:50:12 np0005588920 NetworkManager[49076]: <info>  [1768920612.9116] manager: (tapea69e1af-95): new Tun device (/org/freedesktop/NetworkManager/Devices/238)
Jan 20 09:50:12 np0005588920 nova_compute[226886]: 2026-01-20 14:50:12.914 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:12 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:12Z|00458|binding|INFO|Claiming lport ea69e1af-9543-4c76-9981-b8475aa031fe for this chassis.
Jan 20 09:50:12 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:12Z|00459|binding|INFO|ea69e1af-9543-4c76-9981-b8475aa031fe: Claiming fa:16:3e:35:77:c5 10.2.2.200
Jan 20 09:50:12 np0005588920 NetworkManager[49076]: <info>  [1768920612.9193] device (tape10436e2-79): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:50:12 np0005588920 NetworkManager[49076]: <info>  [1768920612.9200] device (tape10436e2-79): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:50:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:12.920 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2a:e3:2c 10.1.1.14'], port_security=['fa:16:3e:2a:e3:2c 10.1.1.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TaggedBootDevicesTest-1149616215', 'neutron:cidrs': '10.1.1.14/24', 'neutron:device_id': 'f444ccf6-5adb-489a-b174-7450017a351b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cfaa226a-b6e0-41ba-a3f5-d7b004368355', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TaggedBootDevicesTest-1149616215', 'neutron:project_id': 'b15c4e6eb57e4b0ca4e63c85ed92fc5f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2681bf0f-676e-409e-8d3b-a85f151d084a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8379fdc8-7594-4ecf-a54b-2f6eb6ad8d77, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=b194a444-cc69-43f2-9931-e9e53ee450c9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:50:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:12.922 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a0:d2:6d 10.1.1.81'], port_security=['fa:16:3e:a0:d2:6d 10.1.1.81'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.1.81/24', 'neutron:device_id': 'f444ccf6-5adb-489a-b174-7450017a351b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cfaa226a-b6e0-41ba-a3f5-d7b004368355', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b15c4e6eb57e4b0ca4e63c85ed92fc5f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '47ff9a81-c98c-4db5-ad99-41dc6ffcd899', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8379fdc8-7594-4ecf-a54b-2f6eb6ad8d77, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=62ccfbd3-f504-46d0-a4af-ec2dcb7b5764) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:50:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:12.923 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8b:cf:d1 10.1.1.179'], port_security=['fa:16:3e:8b:cf:d1 10.1.1.179'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TaggedBootDevicesTest-1615926068', 'neutron:cidrs': '10.1.1.179/24', 'neutron:device_id': 'f444ccf6-5adb-489a-b174-7450017a351b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cfaa226a-b6e0-41ba-a3f5-d7b004368355', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TaggedBootDevicesTest-1615926068', 'neutron:project_id': 'b15c4e6eb57e4b0ca4e63c85ed92fc5f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2681bf0f-676e-409e-8d3b-a85f151d084a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8379fdc8-7594-4ecf-a54b-2f6eb6ad8d77, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=10da9204-0ccb-45d0-981d-fdff5c41cda1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:50:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:12.925 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3d:83:a1 10.1.1.252'], port_security=['fa:16:3e:3d:83:a1 10.1.1.252'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.1.252/24', 'neutron:device_id': 'f444ccf6-5adb-489a-b174-7450017a351b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cfaa226a-b6e0-41ba-a3f5-d7b004368355', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b15c4e6eb57e4b0ca4e63c85ed92fc5f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '47ff9a81-c98c-4db5-ad99-41dc6ffcd899', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8379fdc8-7594-4ecf-a54b-2f6eb6ad8d77, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=d2be8515-193f-43f4-bae4-d2a509320929) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:50:12 np0005588920 nova_compute[226886]: 2026-01-20 14:50:12.925 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:12.926 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c9:16:c9 10.2.2.100'], port_security=['fa:16:3e:c9:16:c9 10.2.2.100'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.2.2.100/24', 'neutron:device_id': 'f444ccf6-5adb-489a-b174-7450017a351b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-76110867-e0cf-4657-99dd-486c8fecc844', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b15c4e6eb57e4b0ca4e63c85ed92fc5f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '47ff9a81-c98c-4db5-ad99-41dc6ffcd899', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0117568c-b7e4-4ec7-b573-7c7e6aecac58, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=e10436e2-7916-4b6b-905e-e9be7cb338b9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:50:12 np0005588920 NetworkManager[49076]: <info>  [1768920612.9270] device (tapea69e1af-95): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:50:12 np0005588920 NetworkManager[49076]: <info>  [1768920612.9275] device (tapea69e1af-95): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:50:12 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:12Z|00460|binding|INFO|Setting lport ed97bbce-18dc-4c9b-9a04-919dd3a45a8e ovn-installed in OVS
Jan 20 09:50:12 np0005588920 nova_compute[226886]: 2026-01-20 14:50:12.929 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:12 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:12Z|00461|binding|INFO|Setting lport ed97bbce-18dc-4c9b-9a04-919dd3a45a8e up in Southbound
Jan 20 09:50:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:12.931 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:35:77:c5 10.2.2.200'], port_security=['fa:16:3e:35:77:c5 10.2.2.200'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.2.2.200/24', 'neutron:device_id': 'f444ccf6-5adb-489a-b174-7450017a351b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-76110867-e0cf-4657-99dd-486c8fecc844', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b15c4e6eb57e4b0ca4e63c85ed92fc5f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '47ff9a81-c98c-4db5-ad99-41dc6ffcd899', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0117568c-b7e4-4ec7-b573-7c7e6aecac58, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=ea69e1af-9543-4c76-9981-b8475aa031fe) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:50:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:12.936 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[c1e88ff0-0f4d-40a8-8c60-85f1d8c32136]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:12 np0005588920 podman[264275]: 2026-01-20 14:50:12.942604257 +0000 UTC m=+0.160995779 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 20 09:50:12 np0005588920 NetworkManager[49076]: <info>  [1768920612.9458] manager: (tapff283be9-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/239)
Jan 20 09:50:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:12.945 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[77d159bb-72a8-45be-af1d-406034a0d045]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:12 np0005588920 systemd-machined[196121]: New machine qemu-47-instance-0000006b.
Jan 20 09:50:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:12.974 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[48224c79-1256-4f23-aa33-41b18ddb6f6a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:12.976 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[8cc1b189-207d-4bc3-b4b4-4b0e54c3bc5f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:12 np0005588920 NetworkManager[49076]: <info>  [1768920612.9980] device (tapff283be9-f0): carrier: link connected
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:13.004 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[8f923fd3-9c89-4e95-8637-09178eed0a87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:13 np0005588920 systemd[1]: Started Virtual Machine qemu-47-instance-0000006b.
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:13.024 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[450916d6-3157-43ac-957e-d13caf696823]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapff283be9-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:8c:b8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 150], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 565468, 'reachable_time': 25748, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 264372, 'error': None, 'target': 'ovnmeta-ff283be9-fe7c-4cc6-900d-7258ea771ba5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:13.040 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[19a36bb6-9402-4714-ac86-e2a9569369af]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea7:8cb8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 565468, 'tstamp': 565468}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 264373, 'error': None, 'target': 'ovnmeta-ff283be9-fe7c-4cc6-900d-7258ea771ba5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:13 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:13Z|00462|binding|INFO|Setting lport 62ccfbd3-f504-46d0-a4af-ec2dcb7b5764 ovn-installed in OVS
Jan 20 09:50:13 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:13Z|00463|binding|INFO|Setting lport 62ccfbd3-f504-46d0-a4af-ec2dcb7b5764 up in Southbound
Jan 20 09:50:13 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:13Z|00464|binding|INFO|Setting lport ea69e1af-9543-4c76-9981-b8475aa031fe ovn-installed in OVS
Jan 20 09:50:13 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:13Z|00465|binding|INFO|Setting lport ea69e1af-9543-4c76-9981-b8475aa031fe up in Southbound
Jan 20 09:50:13 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:13Z|00466|binding|INFO|Setting lport e10436e2-7916-4b6b-905e-e9be7cb338b9 ovn-installed in OVS
Jan 20 09:50:13 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:13Z|00467|binding|INFO|Setting lport e10436e2-7916-4b6b-905e-e9be7cb338b9 up in Southbound
Jan 20 09:50:13 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:13Z|00468|binding|INFO|Setting lport b194a444-cc69-43f2-9931-e9e53ee450c9 ovn-installed in OVS
Jan 20 09:50:13 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:13Z|00469|binding|INFO|Setting lport b194a444-cc69-43f2-9931-e9e53ee450c9 up in Southbound
Jan 20 09:50:13 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:13Z|00470|binding|INFO|Setting lport d2be8515-193f-43f4-bae4-d2a509320929 ovn-installed in OVS
Jan 20 09:50:13 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:13Z|00471|binding|INFO|Setting lport d2be8515-193f-43f4-bae4-d2a509320929 up in Southbound
Jan 20 09:50:13 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:13Z|00472|binding|INFO|Setting lport 10da9204-0ccb-45d0-981d-fdff5c41cda1 ovn-installed in OVS
Jan 20 09:50:13 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:13Z|00473|binding|INFO|Setting lport 10da9204-0ccb-45d0-981d-fdff5c41cda1 up in Southbound
Jan 20 09:50:13 np0005588920 nova_compute[226886]: 2026-01-20 14:50:13.045 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:13.056 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[3c5386ad-85b2-4204-97ae-a029655be174]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapff283be9-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:8c:b8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 150], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 565468, 'reachable_time': 25748, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 264374, 'error': None, 'target': 'ovnmeta-ff283be9-fe7c-4cc6-900d-7258ea771ba5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:13.080 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2d4b3330-4228-4627-a79e-b7ed04082aec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:13.128 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e1878b44-a730-40f2-817f-522cad7b9aeb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:13.129 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapff283be9-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:13.130 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:50:13 np0005588920 nova_compute[226886]: 2026-01-20 14:50:13.129 226890 DEBUG nova.network.neutron [req-5ceb9e3c-3b8e-450b-b064-9acfc073fd1e req-5fe63db3-89a3-49fc-aa06-f85d8f878052 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Updated VIF entry in instance network info cache for port ea69e1af-9543-4c76-9981-b8475aa031fe. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:13.130 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapff283be9-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:13 np0005588920 nova_compute[226886]: 2026-01-20 14:50:13.130 226890 DEBUG nova.network.neutron [req-5ceb9e3c-3b8e-450b-b064-9acfc073fd1e req-5fe63db3-89a3-49fc-aa06-f85d8f878052 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Updating instance_info_cache with network_info: [{"id": "ed97bbce-18dc-4c9b-9a04-919dd3a45a8e", "address": "fa:16:3e:f7:b6:ce", "network": {"id": "ff283be9-fe7c-4cc6-900d-7258ea771ba5", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest-1025807292-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "taped97bbce-18", "ovs_interfaceid": "ed97bbce-18dc-4c9b-9a04-919dd3a45a8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "b194a444-cc69-43f2-9931-e9e53ee450c9", "address": "fa:16:3e:2a:e3:2c", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", 
"mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb194a444-cc", "ovs_interfaceid": "b194a444-cc69-43f2-9931-e9e53ee450c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "10da9204-0ccb-45d0-981d-fdff5c41cda1", "address": "fa:16:3e:8b:cf:d1", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.179", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10da9204-0c", "ovs_interfaceid": "10da9204-0ccb-45d0-981d-fdff5c41cda1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "62ccfbd3-f504-46d0-a4af-ec2dcb7b5764", "address": "fa:16:3e:a0:d2:6d", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.81", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62ccfbd3-f5", "ovs_interfaceid": "62ccfbd3-f504-46d0-a4af-ec2dcb7b5764", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d2be8515-193f-43f4-bae4-d2a509320929", "address": "fa:16:3e:3d:83:a1", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.252", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2be8515-19", "ovs_interfaceid": "d2be8515-193f-43f4-bae4-d2a509320929", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "e10436e2-7916-4b6b-905e-e9be7cb338b9", "address": "fa:16:3e:c9:16:c9", "network": {"id": "76110867-e0cf-4657-99dd-486c8fecc844", "bridge": "br-int", "label": "tempest-device-tagging-net2-1836923782", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": 
[]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape10436e2-79", "ovs_interfaceid": "e10436e2-7916-4b6b-905e-e9be7cb338b9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ea69e1af-9543-4c76-9981-b8475aa031fe", "address": "fa:16:3e:35:77:c5", "network": {"id": "76110867-e0cf-4657-99dd-486c8fecc844", "bridge": "br-int", "label": "tempest-device-tagging-net2-1836923782", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea69e1af-95", "ovs_interfaceid": "ea69e1af-9543-4c76-9981-b8475aa031fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:50:13 np0005588920 NetworkManager[49076]: <info>  [1768920613.1325] manager: (tapff283be9-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/240)
Jan 20 09:50:13 np0005588920 kernel: tapff283be9-f0: entered promiscuous mode
Jan 20 09:50:13 np0005588920 nova_compute[226886]: 2026-01-20 14:50:13.133 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:13.135 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapff283be9-f0, col_values=(('external_ids', {'iface-id': '2daa37e3-5efb-4b15-b751-64ce5a073b56'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:13 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:13Z|00474|binding|INFO|Releasing lport 2daa37e3-5efb-4b15-b751-64ce5a073b56 from this chassis (sb_readonly=0)
Jan 20 09:50:13 np0005588920 nova_compute[226886]: 2026-01-20 14:50:13.156 226890 DEBUG oslo_concurrency.lockutils [req-5ceb9e3c-3b8e-450b-b064-9acfc073fd1e req-5fe63db3-89a3-49fc-aa06-f85d8f878052 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-f444ccf6-5adb-489a-b174-7450017a351b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:50:13 np0005588920 nova_compute[226886]: 2026-01-20 14:50:13.173 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:13.174 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ff283be9-fe7c-4cc6-900d-7258ea771ba5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ff283be9-fe7c-4cc6-900d-7258ea771ba5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:13.175 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f935ea24-a38f-4444-8102-80ed5b38ce27]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:13.176 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-ff283be9-fe7c-4cc6-900d-7258ea771ba5
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/ff283be9-fe7c-4cc6-900d-7258ea771ba5.pid.haproxy
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID ff283be9-fe7c-4cc6-900d-7258ea771ba5
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:13.178 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ff283be9-fe7c-4cc6-900d-7258ea771ba5', 'env', 'PROCESS_TAG=haproxy-ff283be9-fe7c-4cc6-900d-7258ea771ba5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ff283be9-fe7c-4cc6-900d-7258ea771ba5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:50:13 np0005588920 nova_compute[226886]: 2026-01-20 14:50:13.369 226890 DEBUG nova.compute.manager [req-74b03899-1917-4756-b3e6-b4484e02a1ce req-85a60873-88c9-415f-bde0-60b922e7983e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received event network-vif-plugged-ed97bbce-18dc-4c9b-9a04-919dd3a45a8e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:50:13 np0005588920 nova_compute[226886]: 2026-01-20 14:50:13.369 226890 DEBUG oslo_concurrency.lockutils [req-74b03899-1917-4756-b3e6-b4484e02a1ce req-85a60873-88c9-415f-bde0-60b922e7983e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f444ccf6-5adb-489a-b174-7450017a351b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:13 np0005588920 nova_compute[226886]: 2026-01-20 14:50:13.370 226890 DEBUG oslo_concurrency.lockutils [req-74b03899-1917-4756-b3e6-b4484e02a1ce req-85a60873-88c9-415f-bde0-60b922e7983e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:13 np0005588920 nova_compute[226886]: 2026-01-20 14:50:13.370 226890 DEBUG oslo_concurrency.lockutils [req-74b03899-1917-4756-b3e6-b4484e02a1ce req-85a60873-88c9-415f-bde0-60b922e7983e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:13 np0005588920 nova_compute[226886]: 2026-01-20 14:50:13.370 226890 DEBUG nova.compute.manager [req-74b03899-1917-4756-b3e6-b4484e02a1ce req-85a60873-88c9-415f-bde0-60b922e7983e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Processing event network-vif-plugged-ed97bbce-18dc-4c9b-9a04-919dd3a45a8e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:50:13 np0005588920 nova_compute[226886]: 2026-01-20 14:50:13.439 226890 DEBUG nova.compute.manager [req-7d2ac111-d412-4e86-86cc-990c1673a727 req-717d0ea8-5f80-4273-bbf9-6a95d8f4fa2a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received event network-vif-plugged-ea69e1af-9543-4c76-9981-b8475aa031fe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:50:13 np0005588920 nova_compute[226886]: 2026-01-20 14:50:13.439 226890 DEBUG oslo_concurrency.lockutils [req-7d2ac111-d412-4e86-86cc-990c1673a727 req-717d0ea8-5f80-4273-bbf9-6a95d8f4fa2a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f444ccf6-5adb-489a-b174-7450017a351b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:13 np0005588920 nova_compute[226886]: 2026-01-20 14:50:13.439 226890 DEBUG oslo_concurrency.lockutils [req-7d2ac111-d412-4e86-86cc-990c1673a727 req-717d0ea8-5f80-4273-bbf9-6a95d8f4fa2a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:13 np0005588920 nova_compute[226886]: 2026-01-20 14:50:13.440 226890 DEBUG oslo_concurrency.lockutils [req-7d2ac111-d412-4e86-86cc-990c1673a727 req-717d0ea8-5f80-4273-bbf9-6a95d8f4fa2a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:13 np0005588920 nova_compute[226886]: 2026-01-20 14:50:13.440 226890 DEBUG nova.compute.manager [req-7d2ac111-d412-4e86-86cc-990c1673a727 req-717d0ea8-5f80-4273-bbf9-6a95d8f4fa2a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Processing event network-vif-plugged-ea69e1af-9543-4c76-9981-b8475aa031fe _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:50:13 np0005588920 podman[264450]: 2026-01-20 14:50:13.558534345 +0000 UTC m=+0.050883999 container create 025fc60e0c754131db8894879bf8a3819fb6dc5f01d317b752cd4d15778efd61 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ff283be9-fe7c-4cc6-900d-7258ea771ba5, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 20 09:50:13 np0005588920 systemd[1]: Started libpod-conmon-025fc60e0c754131db8894879bf8a3819fb6dc5f01d317b752cd4d15778efd61.scope.
Jan 20 09:50:13 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:50:13 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d73cbc4f713f6586fdc8003c7d2ef301cffb202142b5ff8961b3a588c1f5dfeb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:50:13 np0005588920 podman[264450]: 2026-01-20 14:50:13.532458689 +0000 UTC m=+0.024808363 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:50:13 np0005588920 podman[264450]: 2026-01-20 14:50:13.638358211 +0000 UTC m=+0.130707895 container init 025fc60e0c754131db8894879bf8a3819fb6dc5f01d317b752cd4d15778efd61 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ff283be9-fe7c-4cc6-900d-7258ea771ba5, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 20 09:50:13 np0005588920 podman[264450]: 2026-01-20 14:50:13.643906115 +0000 UTC m=+0.136255769 container start 025fc60e0c754131db8894879bf8a3819fb6dc5f01d317b752cd4d15778efd61 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ff283be9-fe7c-4cc6-900d-7258ea771ba5, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 20 09:50:13 np0005588920 neutron-haproxy-ovnmeta-ff283be9-fe7c-4cc6-900d-7258ea771ba5[264509]: [NOTICE]   (264515) : New worker (264518) forked
Jan 20 09:50:13 np0005588920 neutron-haproxy-ovnmeta-ff283be9-fe7c-4cc6-900d-7258ea771ba5[264509]: [NOTICE]   (264515) : Loading success.
Jan 20 09:50:13 np0005588920 nova_compute[226886]: 2026-01-20 14:50:13.701 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920613.7006366, f444ccf6-5adb-489a-b174-7450017a351b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:50:13 np0005588920 nova_compute[226886]: 2026-01-20 14:50:13.703 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: f444ccf6-5adb-489a-b174-7450017a351b] VM Started (Lifecycle Event)#033[00m
Jan 20 09:50:13 np0005588920 nova_compute[226886]: 2026-01-20 14:50:13.725 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:50:13 np0005588920 nova_compute[226886]: 2026-01-20 14:50:13.730 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920613.7013443, f444ccf6-5adb-489a-b174-7450017a351b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:50:13 np0005588920 nova_compute[226886]: 2026-01-20 14:50:13.731 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: f444ccf6-5adb-489a-b174-7450017a351b] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:13.735 144128 INFO neutron.agent.ovn.metadata.agent [-] Port b194a444-cc69-43f2-9931-e9e53ee450c9 in datapath cfaa226a-b6e0-41ba-a3f5-d7b004368355 unbound from our chassis#033[00m
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:13.738 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network cfaa226a-b6e0-41ba-a3f5-d7b004368355#033[00m
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:13.749 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[26c27579-8b4f-481d-b794-53e27f60d279]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:13.750 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapcfaa226a-b1 in ovnmeta-cfaa226a-b6e0-41ba-a3f5-d7b004368355 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:13.752 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapcfaa226a-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:13.752 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0b345edc-2efd-475d-8b54-7d4cdaa03382]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:13.753 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0da3eea8-3a0e-473c-bd59-466164c8c57f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:13 np0005588920 nova_compute[226886]: 2026-01-20 14:50:13.754 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:50:13 np0005588920 nova_compute[226886]: 2026-01-20 14:50:13.759 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:13.767 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[27a6cb8d-7a87-4367-8efd-426bb4fb0746]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:13.790 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f8b4dac3-39e9-4e13-a272-5cf3ccb8f9c7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:13 np0005588920 nova_compute[226886]: 2026-01-20 14:50:13.810 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: f444ccf6-5adb-489a-b174-7450017a351b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:13.818 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[f3ec95bf-16fb-4f9a-b853-af81f1f7e90b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:13.823 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[4be2a533-5c59-41ea-8352-d7441c564470]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:13 np0005588920 NetworkManager[49076]: <info>  [1768920613.8244] manager: (tapcfaa226a-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/241)
Jan 20 09:50:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:13.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:13.856 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[09ec2c47-1b4c-48eb-916d-e5005e4a84bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:13.859 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[4559e258-0760-4c1c-a2e5-d0cd6b0a12ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:13 np0005588920 NetworkManager[49076]: <info>  [1768920613.8831] device (tapcfaa226a-b0): carrier: link connected
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:13.886 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[65f002a3-c1d4-44f2-b9df-ff9aa58c419f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:13.902 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[7f4186ec-6c58-495f-bab7-c24ba3f887ac]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcfaa226a-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ee:5f:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 151], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 565556, 'reachable_time': 41013, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 264537, 'error': None, 'target': 'ovnmeta-cfaa226a-b6e0-41ba-a3f5-d7b004368355', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:13 np0005588920 nova_compute[226886]: 2026-01-20 14:50:13.905 226890 DEBUG nova.compute.manager [req-e7de9a82-4c54-45b5-9e24-7690355fb9c2 req-f48f56d3-57f9-4fb9-b0ff-3725023c52fd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received event network-vif-plugged-b194a444-cc69-43f2-9931-e9e53ee450c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:50:13 np0005588920 nova_compute[226886]: 2026-01-20 14:50:13.905 226890 DEBUG oslo_concurrency.lockutils [req-e7de9a82-4c54-45b5-9e24-7690355fb9c2 req-f48f56d3-57f9-4fb9-b0ff-3725023c52fd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f444ccf6-5adb-489a-b174-7450017a351b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:13 np0005588920 nova_compute[226886]: 2026-01-20 14:50:13.906 226890 DEBUG oslo_concurrency.lockutils [req-e7de9a82-4c54-45b5-9e24-7690355fb9c2 req-f48f56d3-57f9-4fb9-b0ff-3725023c52fd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:13 np0005588920 nova_compute[226886]: 2026-01-20 14:50:13.906 226890 DEBUG oslo_concurrency.lockutils [req-e7de9a82-4c54-45b5-9e24-7690355fb9c2 req-f48f56d3-57f9-4fb9-b0ff-3725023c52fd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:13 np0005588920 nova_compute[226886]: 2026-01-20 14:50:13.906 226890 DEBUG nova.compute.manager [req-e7de9a82-4c54-45b5-9e24-7690355fb9c2 req-f48f56d3-57f9-4fb9-b0ff-3725023c52fd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Processing event network-vif-plugged-b194a444-cc69-43f2-9931-e9e53ee450c9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:13.920 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[4f209efb-065e-4633-a5ff-40b248619c10]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feee:5fda'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 565556, 'tstamp': 565556}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 264538, 'error': None, 'target': 'ovnmeta-cfaa226a-b6e0-41ba-a3f5-d7b004368355', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:13.939 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e619e3ee-f7d5-4b5e-9b00-69a8af8ff51a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcfaa226a-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ee:5f:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 151], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 565556, 'reachable_time': 41013, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 264539, 'error': None, 'target': 'ovnmeta-cfaa226a-b6e0-41ba-a3f5-d7b004368355', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:13.969 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[30a4aa7d-a007-4249-9a01-d9dcb7e3c832]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.034 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a8532008-32af-47dc-b48d-fd37eb019172]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.036 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcfaa226a-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.036 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.036 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcfaa226a-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:14 np0005588920 nova_compute[226886]: 2026-01-20 14:50:14.038 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:14 np0005588920 NetworkManager[49076]: <info>  [1768920614.0393] manager: (tapcfaa226a-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/242)
Jan 20 09:50:14 np0005588920 kernel: tapcfaa226a-b0: entered promiscuous mode
Jan 20 09:50:14 np0005588920 nova_compute[226886]: 2026-01-20 14:50:14.041 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.041 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapcfaa226a-b0, col_values=(('external_ids', {'iface-id': '7e4baad3-091e-4d38-8db5-dc6d66194858'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:14 np0005588920 nova_compute[226886]: 2026-01-20 14:50:14.042 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:14 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:14Z|00475|binding|INFO|Releasing lport 7e4baad3-091e-4d38-8db5-dc6d66194858 from this chassis (sb_readonly=0)
Jan 20 09:50:14 np0005588920 nova_compute[226886]: 2026-01-20 14:50:14.061 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.062 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cfaa226a-b6e0-41ba-a3f5-d7b004368355.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cfaa226a-b6e0-41ba-a3f5-d7b004368355.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.063 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e238d5f4-7ba2-4a7c-9db1-72bd2a099dae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.064 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-cfaa226a-b6e0-41ba-a3f5-d7b004368355
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/cfaa226a-b6e0-41ba-a3f5-d7b004368355.pid.haproxy
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID cfaa226a-b6e0-41ba-a3f5-d7b004368355
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.065 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-cfaa226a-b6e0-41ba-a3f5-d7b004368355', 'env', 'PROCESS_TAG=haproxy-cfaa226a-b6e0-41ba-a3f5-d7b004368355', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/cfaa226a-b6e0-41ba-a3f5-d7b004368355.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:50:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:50:14 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2969810977' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:50:14 np0005588920 podman[264571]: 2026-01-20 14:50:14.391699391 +0000 UTC m=+0.041236351 container create f425d8b2628e69529ac6f1b2c808868f6f5796de66c92c7692c41e78a6d45ea4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cfaa226a-b6e0-41ba-a3f5-d7b004368355, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:50:14 np0005588920 systemd[1]: Started libpod-conmon-f425d8b2628e69529ac6f1b2c808868f6f5796de66c92c7692c41e78a6d45ea4.scope.
Jan 20 09:50:14 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:50:14 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0dc8c79443c86075c67bf9bb3d9c08f01c14647c4fad70ca2f6d5fe0d4dace2e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:50:14 np0005588920 podman[264571]: 2026-01-20 14:50:14.369574444 +0000 UTC m=+0.019111404 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:50:14 np0005588920 podman[264571]: 2026-01-20 14:50:14.47598068 +0000 UTC m=+0.125517640 container init f425d8b2628e69529ac6f1b2c808868f6f5796de66c92c7692c41e78a6d45ea4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cfaa226a-b6e0-41ba-a3f5-d7b004368355, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:50:14 np0005588920 podman[264571]: 2026-01-20 14:50:14.487638705 +0000 UTC m=+0.137175645 container start f425d8b2628e69529ac6f1b2c808868f6f5796de66c92c7692c41e78a6d45ea4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cfaa226a-b6e0-41ba-a3f5-d7b004368355, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 20 09:50:14 np0005588920 neutron-haproxy-ovnmeta-cfaa226a-b6e0-41ba-a3f5-d7b004368355[264586]: [NOTICE]   (264591) : New worker (264593) forked
Jan 20 09:50:14 np0005588920 neutron-haproxy-ovnmeta-cfaa226a-b6e0-41ba-a3f5-d7b004368355[264586]: [NOTICE]   (264591) : Loading success.
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.534 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 62ccfbd3-f504-46d0-a4af-ec2dcb7b5764 in datapath cfaa226a-b6e0-41ba-a3f5-d7b004368355 unbound from our chassis#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.537 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network cfaa226a-b6e0-41ba-a3f5-d7b004368355#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.553 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[74721b8e-2b2f-41a7-a959-4488515fa371]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e248 e248: 3 total, 3 up, 3 in
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.581 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[a5ed33b4-42d6-4250-b7c0-4159456a009c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.584 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[72af380b-ea03-418d-9b87-5b4b472ba331]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.612 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[92c66cae-d093-4911-9480-79f8e0ba4c73]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.627 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[69a530e7-e988-4c7b-98e9-08285fc0686d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcfaa226a-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ee:5f:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 6, 'rx_bytes': 176, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 6, 'rx_bytes': 176, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 151], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 565556, 'reachable_time': 41013, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 264607, 'error': None, 'target': 'ovnmeta-cfaa226a-b6e0-41ba-a3f5-d7b004368355', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.644 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[aa54e952-8ca3-415b-9189-49089a6e9d4f]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapcfaa226a-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 565568, 'tstamp': 565568}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 264608, 'error': None, 'target': 'ovnmeta-cfaa226a-b6e0-41ba-a3f5-d7b004368355', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.1.1.2'], ['IFA_LOCAL', '10.1.1.2'], ['IFA_BROADCAST', '10.1.1.255'], ['IFA_LABEL', 'tapcfaa226a-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 565571, 'tstamp': 565571}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 264608, 'error': None, 'target': 'ovnmeta-cfaa226a-b6e0-41ba-a3f5-d7b004368355', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.646 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcfaa226a-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:14 np0005588920 nova_compute[226886]: 2026-01-20 14:50:14.647 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:14 np0005588920 nova_compute[226886]: 2026-01-20 14:50:14.648 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.649 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcfaa226a-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.649 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.649 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapcfaa226a-b0, col_values=(('external_ids', {'iface-id': '7e4baad3-091e-4d38-8db5-dc6d66194858'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.650 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:50:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:50:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:14.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.651 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 10da9204-0ccb-45d0-981d-fdff5c41cda1 in datapath cfaa226a-b6e0-41ba-a3f5-d7b004368355 unbound from our chassis#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.653 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network cfaa226a-b6e0-41ba-a3f5-d7b004368355#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.667 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ba916495-4ca1-47c3-9ab4-2f6c8aabf20b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.695 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[7e7318d6-5ca5-439e-88d2-3fc4699e0a19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.698 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[41eb3421-97c1-4e41-8077-fce0bd5819bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.724 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[222e78ad-74ab-4be4-9c78-705193f05150]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.746 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[64ef44c4-a92b-4961-98d8-1111a161b16e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcfaa226a-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ee:5f:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 3, 'tx_packets': 8, 'rx_bytes': 266, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 3, 'tx_packets': 8, 'rx_bytes': 266, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 151], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 565556, 'reachable_time': 41013, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 3, 'inoctets': 224, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 3, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 224, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 3, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 264614, 'error': None, 'target': 'ovnmeta-cfaa226a-b6e0-41ba-a3f5-d7b004368355', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.761 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[50981924-37af-42b2-943f-a8df8e9c3860]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapcfaa226a-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 565568, 'tstamp': 565568}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 264615, 'error': None, 'target': 'ovnmeta-cfaa226a-b6e0-41ba-a3f5-d7b004368355', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.1.1.2'], ['IFA_LOCAL', '10.1.1.2'], ['IFA_BROADCAST', '10.1.1.255'], ['IFA_LABEL', 'tapcfaa226a-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 565571, 'tstamp': 565571}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 264615, 'error': None, 'target': 'ovnmeta-cfaa226a-b6e0-41ba-a3f5-d7b004368355', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.763 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcfaa226a-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:14 np0005588920 nova_compute[226886]: 2026-01-20 14:50:14.765 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:14 np0005588920 nova_compute[226886]: 2026-01-20 14:50:14.766 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.767 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcfaa226a-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.767 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.768 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapcfaa226a-b0, col_values=(('external_ids', {'iface-id': '7e4baad3-091e-4d38-8db5-dc6d66194858'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.768 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.769 144128 INFO neutron.agent.ovn.metadata.agent [-] Port d2be8515-193f-43f4-bae4-d2a509320929 in datapath cfaa226a-b6e0-41ba-a3f5-d7b004368355 unbound from our chassis#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.771 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network cfaa226a-b6e0-41ba-a3f5-d7b004368355#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.785 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[828c2225-78c5-4f0b-ba07-b1e88fd66a9b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.816 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[74b591b2-46b8-403e-b3de-33d8c4ed1657]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.819 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[d5d96c1d-4c6f-4f56-b4de-37d43ee3ba50]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.850 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[cff97c76-d197-4dae-8958-de753911c2b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.870 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[22f43ed8-0fac-49b0-9358-8beed810633b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcfaa226a-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ee:5f:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 3, 'tx_packets': 10, 'rx_bytes': 266, 'tx_bytes': 612, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 3, 'tx_packets': 10, 'rx_bytes': 266, 'tx_bytes': 612, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 151], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 565556, 'reachable_time': 41013, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 3, 'inoctets': 224, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 3, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 224, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 3, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 264621, 'error': None, 'target': 'ovnmeta-cfaa226a-b6e0-41ba-a3f5-d7b004368355', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.888 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[9ba0747c-8545-49d6-b715-1aa465c6df7f]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapcfaa226a-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 565568, 'tstamp': 565568}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 264622, 'error': None, 'target': 'ovnmeta-cfaa226a-b6e0-41ba-a3f5-d7b004368355', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.1.1.2'], ['IFA_LOCAL', '10.1.1.2'], ['IFA_BROADCAST', '10.1.1.255'], ['IFA_LABEL', 'tapcfaa226a-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 565571, 'tstamp': 565571}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 264622, 'error': None, 'target': 'ovnmeta-cfaa226a-b6e0-41ba-a3f5-d7b004368355', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.890 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcfaa226a-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:14 np0005588920 nova_compute[226886]: 2026-01-20 14:50:14.892 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:14 np0005588920 nova_compute[226886]: 2026-01-20 14:50:14.893 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.893 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcfaa226a-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.893 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.894 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapcfaa226a-b0, col_values=(('external_ids', {'iface-id': '7e4baad3-091e-4d38-8db5-dc6d66194858'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.894 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.895 144128 INFO neutron.agent.ovn.metadata.agent [-] Port e10436e2-7916-4b6b-905e-e9be7cb338b9 in datapath 76110867-e0cf-4657-99dd-486c8fecc844 unbound from our chassis#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.896 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 76110867-e0cf-4657-99dd-486c8fecc844#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.909 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[929e61c3-659a-4f4a-bc87-4d29c3066bc9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.909 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap76110867-e1 in ovnmeta-76110867-e0cf-4657-99dd-486c8fecc844 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.911 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap76110867-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.911 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[7b975080-1ce1-41f0-96eb-4a650bdd7afc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.912 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[21e3066a-ed0d-4ead-928b-7b4ee2c643d6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.923 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[06eaf8c0-6dc4-4e8d-b1db-2980b4a33f91]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.935 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ebef4015-6ddf-4267-906f-bb4af269bedd]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.969 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[88f7cf31-3ce7-4618-8e51-a72bb625ca21]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:14 np0005588920 NetworkManager[49076]: <info>  [1768920614.9756] manager: (tap76110867-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/243)
Jan 20 09:50:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:14.975 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0d64e4e4-8b7b-428a-a5af-96f4544774d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:15.005 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[90880344-01ce-4c5a-a6d9-56e088c72dd7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:15.008 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[0081d31f-0f6f-4b4e-9d92-cafbe12777ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:15 np0005588920 NetworkManager[49076]: <info>  [1768920615.0386] device (tap76110867-e0): carrier: link connected
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:15.044 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[30beebeb-6043-4744-bd0b-f8c1af173156]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:15.063 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[34e2aabb-f031-4769-9941-f6562e4b1fb6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap76110867-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:11:91:e0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 152], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 565672, 'reachable_time': 32946, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 264633, 'error': None, 'target': 'ovnmeta-76110867-e0cf-4657-99dd-486c8fecc844', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:15.081 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2fb72e93-0e1a-41e1-8c37-ed0129e14bc7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe11:91e0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 565672, 'tstamp': 565672}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 264634, 'error': None, 'target': 'ovnmeta-76110867-e0cf-4657-99dd-486c8fecc844', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:15.102 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0c1bfd35-838f-4826-9ab2-2028ea8ff5af]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap76110867-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:11:91:e0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 152], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 565672, 'reachable_time': 32946, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 264635, 'error': None, 'target': 'ovnmeta-76110867-e0cf-4657-99dd-486c8fecc844', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:15.138 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5c916b97-1ab3-41e6-bb4d-aa6f1c6b5c7b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:15.197 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d1e3770d-d442-4e63-88f8-428c63baa189]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:15.198 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap76110867-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:15.198 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:15.199 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap76110867-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.200 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:15 np0005588920 NetworkManager[49076]: <info>  [1768920615.2010] manager: (tap76110867-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/244)
Jan 20 09:50:15 np0005588920 kernel: tap76110867-e0: entered promiscuous mode
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.202 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:15.204 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap76110867-e0, col_values=(('external_ids', {'iface-id': '4b5ed097-4982-4e56-9d26-8c4481a0c10e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.205 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:15 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:15Z|00476|binding|INFO|Releasing lport 4b5ed097-4982-4e56-9d26-8c4481a0c10e from this chassis (sb_readonly=0)
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.217 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:15.217 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/76110867-e0cf-4657-99dd-486c8fecc844.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/76110867-e0cf-4657-99dd-486c8fecc844.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:15.218 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[15394c62-9b17-4799-a5e1-55723f518f1a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:15.219 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-76110867-e0cf-4657-99dd-486c8fecc844
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/76110867-e0cf-4657-99dd-486c8fecc844.pid.haproxy
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID 76110867-e0cf-4657-99dd-486c8fecc844
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:15.220 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-76110867-e0cf-4657-99dd-486c8fecc844', 'env', 'PROCESS_TAG=haproxy-76110867-e0cf-4657-99dd-486c8fecc844', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/76110867-e0cf-4657-99dd-486c8fecc844.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.446 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.476 226890 DEBUG nova.compute.manager [req-d9270a55-2d7d-4b7a-bdfe-d7a524f067a7 req-7b5cbfb3-87df-4627-ada6-4016f475bd26 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received event network-vif-plugged-ed97bbce-18dc-4c9b-9a04-919dd3a45a8e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.476 226890 DEBUG oslo_concurrency.lockutils [req-d9270a55-2d7d-4b7a-bdfe-d7a524f067a7 req-7b5cbfb3-87df-4627-ada6-4016f475bd26 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f444ccf6-5adb-489a-b174-7450017a351b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.477 226890 DEBUG oslo_concurrency.lockutils [req-d9270a55-2d7d-4b7a-bdfe-d7a524f067a7 req-7b5cbfb3-87df-4627-ada6-4016f475bd26 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.477 226890 DEBUG oslo_concurrency.lockutils [req-d9270a55-2d7d-4b7a-bdfe-d7a524f067a7 req-7b5cbfb3-87df-4627-ada6-4016f475bd26 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.477 226890 DEBUG nova.compute.manager [req-d9270a55-2d7d-4b7a-bdfe-d7a524f067a7 req-7b5cbfb3-87df-4627-ada6-4016f475bd26 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] No event matching network-vif-plugged-ed97bbce-18dc-4c9b-9a04-919dd3a45a8e in dict_keys([('network-vif-plugged', '10da9204-0ccb-45d0-981d-fdff5c41cda1'), ('network-vif-plugged', '62ccfbd3-f504-46d0-a4af-ec2dcb7b5764'), ('network-vif-plugged', 'd2be8515-193f-43f4-bae4-d2a509320929'), ('network-vif-plugged', 'e10436e2-7916-4b6b-905e-e9be7cb338b9')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.478 226890 WARNING nova.compute.manager [req-d9270a55-2d7d-4b7a-bdfe-d7a524f067a7 req-7b5cbfb3-87df-4627-ada6-4016f475bd26 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received unexpected event network-vif-plugged-ed97bbce-18dc-4c9b-9a04-919dd3a45a8e for instance with vm_state building and task_state spawning.#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.478 226890 DEBUG nova.compute.manager [req-d9270a55-2d7d-4b7a-bdfe-d7a524f067a7 req-7b5cbfb3-87df-4627-ada6-4016f475bd26 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received event network-vif-plugged-d2be8515-193f-43f4-bae4-d2a509320929 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.478 226890 DEBUG oslo_concurrency.lockutils [req-d9270a55-2d7d-4b7a-bdfe-d7a524f067a7 req-7b5cbfb3-87df-4627-ada6-4016f475bd26 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f444ccf6-5adb-489a-b174-7450017a351b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.478 226890 DEBUG oslo_concurrency.lockutils [req-d9270a55-2d7d-4b7a-bdfe-d7a524f067a7 req-7b5cbfb3-87df-4627-ada6-4016f475bd26 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.479 226890 DEBUG oslo_concurrency.lockutils [req-d9270a55-2d7d-4b7a-bdfe-d7a524f067a7 req-7b5cbfb3-87df-4627-ada6-4016f475bd26 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.479 226890 DEBUG nova.compute.manager [req-d9270a55-2d7d-4b7a-bdfe-d7a524f067a7 req-7b5cbfb3-87df-4627-ada6-4016f475bd26 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Processing event network-vif-plugged-d2be8515-193f-43f4-bae4-d2a509320929 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.479 226890 DEBUG nova.compute.manager [req-d9270a55-2d7d-4b7a-bdfe-d7a524f067a7 req-7b5cbfb3-87df-4627-ada6-4016f475bd26 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received event network-vif-plugged-d2be8515-193f-43f4-bae4-d2a509320929 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.480 226890 DEBUG oslo_concurrency.lockutils [req-d9270a55-2d7d-4b7a-bdfe-d7a524f067a7 req-7b5cbfb3-87df-4627-ada6-4016f475bd26 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f444ccf6-5adb-489a-b174-7450017a351b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.480 226890 DEBUG oslo_concurrency.lockutils [req-d9270a55-2d7d-4b7a-bdfe-d7a524f067a7 req-7b5cbfb3-87df-4627-ada6-4016f475bd26 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.480 226890 DEBUG oslo_concurrency.lockutils [req-d9270a55-2d7d-4b7a-bdfe-d7a524f067a7 req-7b5cbfb3-87df-4627-ada6-4016f475bd26 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.480 226890 DEBUG nova.compute.manager [req-d9270a55-2d7d-4b7a-bdfe-d7a524f067a7 req-7b5cbfb3-87df-4627-ada6-4016f475bd26 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] No event matching network-vif-plugged-d2be8515-193f-43f4-bae4-d2a509320929 in dict_keys([('network-vif-plugged', '10da9204-0ccb-45d0-981d-fdff5c41cda1'), ('network-vif-plugged', '62ccfbd3-f504-46d0-a4af-ec2dcb7b5764'), ('network-vif-plugged', 'e10436e2-7916-4b6b-905e-e9be7cb338b9')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.481 226890 WARNING nova.compute.manager [req-d9270a55-2d7d-4b7a-bdfe-d7a524f067a7 req-7b5cbfb3-87df-4627-ada6-4016f475bd26 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received unexpected event network-vif-plugged-d2be8515-193f-43f4-bae4-d2a509320929 for instance with vm_state building and task_state spawning.#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.481 226890 DEBUG nova.compute.manager [req-d9270a55-2d7d-4b7a-bdfe-d7a524f067a7 req-7b5cbfb3-87df-4627-ada6-4016f475bd26 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received event network-vif-plugged-62ccfbd3-f504-46d0-a4af-ec2dcb7b5764 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.481 226890 DEBUG oslo_concurrency.lockutils [req-d9270a55-2d7d-4b7a-bdfe-d7a524f067a7 req-7b5cbfb3-87df-4627-ada6-4016f475bd26 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f444ccf6-5adb-489a-b174-7450017a351b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.481 226890 DEBUG oslo_concurrency.lockutils [req-d9270a55-2d7d-4b7a-bdfe-d7a524f067a7 req-7b5cbfb3-87df-4627-ada6-4016f475bd26 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.482 226890 DEBUG oslo_concurrency.lockutils [req-d9270a55-2d7d-4b7a-bdfe-d7a524f067a7 req-7b5cbfb3-87df-4627-ada6-4016f475bd26 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.482 226890 DEBUG nova.compute.manager [req-d9270a55-2d7d-4b7a-bdfe-d7a524f067a7 req-7b5cbfb3-87df-4627-ada6-4016f475bd26 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Processing event network-vif-plugged-62ccfbd3-f504-46d0-a4af-ec2dcb7b5764 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.482 226890 DEBUG nova.compute.manager [req-d9270a55-2d7d-4b7a-bdfe-d7a524f067a7 req-7b5cbfb3-87df-4627-ada6-4016f475bd26 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received event network-vif-plugged-62ccfbd3-f504-46d0-a4af-ec2dcb7b5764 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.483 226890 DEBUG oslo_concurrency.lockutils [req-d9270a55-2d7d-4b7a-bdfe-d7a524f067a7 req-7b5cbfb3-87df-4627-ada6-4016f475bd26 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f444ccf6-5adb-489a-b174-7450017a351b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.483 226890 DEBUG oslo_concurrency.lockutils [req-d9270a55-2d7d-4b7a-bdfe-d7a524f067a7 req-7b5cbfb3-87df-4627-ada6-4016f475bd26 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.483 226890 DEBUG oslo_concurrency.lockutils [req-d9270a55-2d7d-4b7a-bdfe-d7a524f067a7 req-7b5cbfb3-87df-4627-ada6-4016f475bd26 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.483 226890 DEBUG nova.compute.manager [req-d9270a55-2d7d-4b7a-bdfe-d7a524f067a7 req-7b5cbfb3-87df-4627-ada6-4016f475bd26 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] No event matching network-vif-plugged-62ccfbd3-f504-46d0-a4af-ec2dcb7b5764 in dict_keys([('network-vif-plugged', '10da9204-0ccb-45d0-981d-fdff5c41cda1'), ('network-vif-plugged', 'e10436e2-7916-4b6b-905e-e9be7cb338b9')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.484 226890 WARNING nova.compute.manager [req-d9270a55-2d7d-4b7a-bdfe-d7a524f067a7 req-7b5cbfb3-87df-4627-ada6-4016f475bd26 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received unexpected event network-vif-plugged-62ccfbd3-f504-46d0-a4af-ec2dcb7b5764 for instance with vm_state building and task_state spawning.#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.484 226890 DEBUG nova.compute.manager [req-d9270a55-2d7d-4b7a-bdfe-d7a524f067a7 req-7b5cbfb3-87df-4627-ada6-4016f475bd26 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received event network-vif-plugged-e10436e2-7916-4b6b-905e-e9be7cb338b9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.484 226890 DEBUG oslo_concurrency.lockutils [req-d9270a55-2d7d-4b7a-bdfe-d7a524f067a7 req-7b5cbfb3-87df-4627-ada6-4016f475bd26 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f444ccf6-5adb-489a-b174-7450017a351b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.484 226890 DEBUG oslo_concurrency.lockutils [req-d9270a55-2d7d-4b7a-bdfe-d7a524f067a7 req-7b5cbfb3-87df-4627-ada6-4016f475bd26 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.485 226890 DEBUG oslo_concurrency.lockutils [req-d9270a55-2d7d-4b7a-bdfe-d7a524f067a7 req-7b5cbfb3-87df-4627-ada6-4016f475bd26 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.485 226890 DEBUG nova.compute.manager [req-d9270a55-2d7d-4b7a-bdfe-d7a524f067a7 req-7b5cbfb3-87df-4627-ada6-4016f475bd26 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Processing event network-vif-plugged-e10436e2-7916-4b6b-905e-e9be7cb338b9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:50:15 np0005588920 podman[264665]: 2026-01-20 14:50:15.567695963 +0000 UTC m=+0.061528796 container create 5a9d6484e563a447fe3635abfc7c55326e0a4556ae2012c83558b440a2a4875e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-76110867-e0cf-4657-99dd-486c8fecc844, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.569 226890 DEBUG nova.compute.manager [req-43e47545-d9e6-4ccf-a07e-9b65d5b88a37 req-25845b54-f4ff-4d1a-a154-9ea0b6e56f08 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received event network-vif-plugged-ea69e1af-9543-4c76-9981-b8475aa031fe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.569 226890 DEBUG oslo_concurrency.lockutils [req-43e47545-d9e6-4ccf-a07e-9b65d5b88a37 req-25845b54-f4ff-4d1a-a154-9ea0b6e56f08 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f444ccf6-5adb-489a-b174-7450017a351b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.570 226890 DEBUG oslo_concurrency.lockutils [req-43e47545-d9e6-4ccf-a07e-9b65d5b88a37 req-25845b54-f4ff-4d1a-a154-9ea0b6e56f08 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.570 226890 DEBUG oslo_concurrency.lockutils [req-43e47545-d9e6-4ccf-a07e-9b65d5b88a37 req-25845b54-f4ff-4d1a-a154-9ea0b6e56f08 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.570 226890 DEBUG nova.compute.manager [req-43e47545-d9e6-4ccf-a07e-9b65d5b88a37 req-25845b54-f4ff-4d1a-a154-9ea0b6e56f08 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] No event matching network-vif-plugged-ea69e1af-9543-4c76-9981-b8475aa031fe in dict_keys([('network-vif-plugged', '10da9204-0ccb-45d0-981d-fdff5c41cda1')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.571 226890 WARNING nova.compute.manager [req-43e47545-d9e6-4ccf-a07e-9b65d5b88a37 req-25845b54-f4ff-4d1a-a154-9ea0b6e56f08 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received unexpected event network-vif-plugged-ea69e1af-9543-4c76-9981-b8475aa031fe for instance with vm_state building and task_state spawning.#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.571 226890 DEBUG nova.compute.manager [req-43e47545-d9e6-4ccf-a07e-9b65d5b88a37 req-25845b54-f4ff-4d1a-a154-9ea0b6e56f08 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received event network-vif-plugged-10da9204-0ccb-45d0-981d-fdff5c41cda1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.571 226890 DEBUG oslo_concurrency.lockutils [req-43e47545-d9e6-4ccf-a07e-9b65d5b88a37 req-25845b54-f4ff-4d1a-a154-9ea0b6e56f08 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f444ccf6-5adb-489a-b174-7450017a351b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.572 226890 DEBUG oslo_concurrency.lockutils [req-43e47545-d9e6-4ccf-a07e-9b65d5b88a37 req-25845b54-f4ff-4d1a-a154-9ea0b6e56f08 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.572 226890 DEBUG oslo_concurrency.lockutils [req-43e47545-d9e6-4ccf-a07e-9b65d5b88a37 req-25845b54-f4ff-4d1a-a154-9ea0b6e56f08 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.573 226890 DEBUG nova.compute.manager [req-43e47545-d9e6-4ccf-a07e-9b65d5b88a37 req-25845b54-f4ff-4d1a-a154-9ea0b6e56f08 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Processing event network-vif-plugged-10da9204-0ccb-45d0-981d-fdff5c41cda1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.573 226890 DEBUG nova.compute.manager [req-43e47545-d9e6-4ccf-a07e-9b65d5b88a37 req-25845b54-f4ff-4d1a-a154-9ea0b6e56f08 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received event network-vif-plugged-10da9204-0ccb-45d0-981d-fdff5c41cda1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.573 226890 DEBUG oslo_concurrency.lockutils [req-43e47545-d9e6-4ccf-a07e-9b65d5b88a37 req-25845b54-f4ff-4d1a-a154-9ea0b6e56f08 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f444ccf6-5adb-489a-b174-7450017a351b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.574 226890 DEBUG oslo_concurrency.lockutils [req-43e47545-d9e6-4ccf-a07e-9b65d5b88a37 req-25845b54-f4ff-4d1a-a154-9ea0b6e56f08 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.574 226890 DEBUG oslo_concurrency.lockutils [req-43e47545-d9e6-4ccf-a07e-9b65d5b88a37 req-25845b54-f4ff-4d1a-a154-9ea0b6e56f08 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.574 226890 DEBUG nova.compute.manager [req-43e47545-d9e6-4ccf-a07e-9b65d5b88a37 req-25845b54-f4ff-4d1a-a154-9ea0b6e56f08 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] No waiting events found dispatching network-vif-plugged-10da9204-0ccb-45d0-981d-fdff5c41cda1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.574 226890 WARNING nova.compute.manager [req-43e47545-d9e6-4ccf-a07e-9b65d5b88a37 req-25845b54-f4ff-4d1a-a154-9ea0b6e56f08 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received unexpected event network-vif-plugged-10da9204-0ccb-45d0-981d-fdff5c41cda1 for instance with vm_state building and task_state spawning.#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.575 226890 DEBUG nova.compute.manager [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Instance event wait completed in 1 seconds for network-vif-plugged,network-vif-plugged,network-vif-plugged,network-vif-plugged,network-vif-plugged,network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.578 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920615.5786142, f444ccf6-5adb-489a-b174-7450017a351b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.578 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: f444ccf6-5adb-489a-b174-7450017a351b] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.580 226890 DEBUG nova.virt.libvirt.driver [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.583 226890 INFO nova.virt.libvirt.driver [-] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Instance spawned successfully.#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.583 226890 DEBUG nova.virt.libvirt.driver [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.597 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.603 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.606 226890 DEBUG nova.virt.libvirt.driver [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.607 226890 DEBUG nova.virt.libvirt.driver [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.607 226890 DEBUG nova.virt.libvirt.driver [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.608 226890 DEBUG nova.virt.libvirt.driver [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.608 226890 DEBUG nova.virt.libvirt.driver [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.609 226890 DEBUG nova.virt.libvirt.driver [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:50:15 np0005588920 systemd[1]: Started libpod-conmon-5a9d6484e563a447fe3635abfc7c55326e0a4556ae2012c83558b440a2a4875e.scope.
Jan 20 09:50:15 np0005588920 podman[264665]: 2026-01-20 14:50:15.533236013 +0000 UTC m=+0.027068936 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:50:15 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:50:15 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9a07298257a630114c8a4c22ea64dc59d68781fadc5f960af57c6b4e36e1518f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.655 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: f444ccf6-5adb-489a-b174-7450017a351b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:50:15 np0005588920 podman[264665]: 2026-01-20 14:50:15.660550992 +0000 UTC m=+0.154383835 container init 5a9d6484e563a447fe3635abfc7c55326e0a4556ae2012c83558b440a2a4875e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-76110867-e0cf-4657-99dd-486c8fecc844, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:50:15 np0005588920 podman[264665]: 2026-01-20 14:50:15.665626043 +0000 UTC m=+0.159458876 container start 5a9d6484e563a447fe3635abfc7c55326e0a4556ae2012c83558b440a2a4875e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-76110867-e0cf-4657-99dd-486c8fecc844, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 20 09:50:15 np0005588920 neutron-haproxy-ovnmeta-76110867-e0cf-4657-99dd-486c8fecc844[264680]: [NOTICE]   (264684) : New worker (264686) forked
Jan 20 09:50:15 np0005588920 neutron-haproxy-ovnmeta-76110867-e0cf-4657-99dd-486c8fecc844[264680]: [NOTICE]   (264684) : Loading success.
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.708 226890 INFO nova.compute.manager [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Took 43.79 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.709 226890 DEBUG nova.compute.manager [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:15.714 144128 INFO neutron.agent.ovn.metadata.agent [-] Port ea69e1af-9543-4c76-9981-b8475aa031fe in datapath 76110867-e0cf-4657-99dd-486c8fecc844 unbound from our chassis#033[00m
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:15.716 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 76110867-e0cf-4657-99dd-486c8fecc844#033[00m
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:15.730 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ccbb0910-fec0-43f2-ac5c-e0515a7d8c28]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:15.758 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[5ec8b13d-d04b-4b0f-8921-689b54d210bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:15.761 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[afab2756-2369-41cf-80ae-cbb8f10c0860]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:15.791 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[b34b26a7-2855-4fe6-af01-d88cb4b1688b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:15.807 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[19796710-1c2f-42b0-a0d8-3b57bbab1930]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap76110867-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:11:91:e0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 5, 'rx_bytes': 180, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 5, 'rx_bytes': 180, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 152], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 565672, 'reachable_time': 32946, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 264700, 'error': None, 'target': 'ovnmeta-76110867-e0cf-4657-99dd-486c8fecc844', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:15.820 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[de6561f9-f01f-4957-8616-72c9393bde54]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap76110867-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 565684, 'tstamp': 565684}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 264701, 'error': None, 'target': 'ovnmeta-76110867-e0cf-4657-99dd-486c8fecc844', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.2.2.2'], ['IFA_LOCAL', '10.2.2.2'], ['IFA_BROADCAST', '10.2.2.255'], ['IFA_LABEL', 'tap76110867-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 565687, 'tstamp': 565687}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 264701, 'error': None, 'target': 'ovnmeta-76110867-e0cf-4657-99dd-486c8fecc844', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:15.822 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap76110867-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:15.825 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap76110867-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:15.826 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:15.826 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap76110867-e0, col_values=(('external_ids', {'iface-id': '4b5ed097-4982-4e56-9d26-8c4481a0c10e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:15.826 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.832 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.838 226890 INFO nova.compute.manager [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Took 49.84 seconds to build instance.#033[00m
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.841 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:15.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:15 np0005588920 nova_compute[226886]: 2026-01-20 14:50:15.874 226890 DEBUG oslo_concurrency.lockutils [None req-b5c45e92-9f6d-41ec-8c78-45afd172d4ea 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 49.958s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:16.451 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:16.452 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:16.453 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:50:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:16.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:50:16 np0005588920 nova_compute[226886]: 2026-01-20 14:50:16.876 226890 DEBUG nova.compute.manager [req-5461f9f8-25cf-4b91-aaa7-639b4f1ccca6 req-66c129e6-9d1d-419f-953d-b0be27ade3b9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received event network-vif-plugged-b194a444-cc69-43f2-9931-e9e53ee450c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:50:16 np0005588920 nova_compute[226886]: 2026-01-20 14:50:16.877 226890 DEBUG oslo_concurrency.lockutils [req-5461f9f8-25cf-4b91-aaa7-639b4f1ccca6 req-66c129e6-9d1d-419f-953d-b0be27ade3b9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f444ccf6-5adb-489a-b174-7450017a351b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:16 np0005588920 nova_compute[226886]: 2026-01-20 14:50:16.878 226890 DEBUG oslo_concurrency.lockutils [req-5461f9f8-25cf-4b91-aaa7-639b4f1ccca6 req-66c129e6-9d1d-419f-953d-b0be27ade3b9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:16 np0005588920 nova_compute[226886]: 2026-01-20 14:50:16.878 226890 DEBUG oslo_concurrency.lockutils [req-5461f9f8-25cf-4b91-aaa7-639b4f1ccca6 req-66c129e6-9d1d-419f-953d-b0be27ade3b9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:16 np0005588920 nova_compute[226886]: 2026-01-20 14:50:16.879 226890 DEBUG nova.compute.manager [req-5461f9f8-25cf-4b91-aaa7-639b4f1ccca6 req-66c129e6-9d1d-419f-953d-b0be27ade3b9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] No waiting events found dispatching network-vif-plugged-b194a444-cc69-43f2-9931-e9e53ee450c9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:50:16 np0005588920 nova_compute[226886]: 2026-01-20 14:50:16.879 226890 WARNING nova.compute.manager [req-5461f9f8-25cf-4b91-aaa7-639b4f1ccca6 req-66c129e6-9d1d-419f-953d-b0be27ade3b9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received unexpected event network-vif-plugged-b194a444-cc69-43f2-9931-e9e53ee450c9 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:50:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e248 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:50:17 np0005588920 nova_compute[226886]: 2026-01-20 14:50:17.615 226890 DEBUG nova.compute.manager [req-238b57a2-08fa-428f-926e-00dc3e4027cf req-e43ff5b0-466e-4dd1-86ab-2af9205b9631 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received event network-vif-plugged-e10436e2-7916-4b6b-905e-e9be7cb338b9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:50:17 np0005588920 nova_compute[226886]: 2026-01-20 14:50:17.616 226890 DEBUG oslo_concurrency.lockutils [req-238b57a2-08fa-428f-926e-00dc3e4027cf req-e43ff5b0-466e-4dd1-86ab-2af9205b9631 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f444ccf6-5adb-489a-b174-7450017a351b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:17 np0005588920 nova_compute[226886]: 2026-01-20 14:50:17.616 226890 DEBUG oslo_concurrency.lockutils [req-238b57a2-08fa-428f-926e-00dc3e4027cf req-e43ff5b0-466e-4dd1-86ab-2af9205b9631 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:17 np0005588920 nova_compute[226886]: 2026-01-20 14:50:17.617 226890 DEBUG oslo_concurrency.lockutils [req-238b57a2-08fa-428f-926e-00dc3e4027cf req-e43ff5b0-466e-4dd1-86ab-2af9205b9631 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:17 np0005588920 nova_compute[226886]: 2026-01-20 14:50:17.617 226890 DEBUG nova.compute.manager [req-238b57a2-08fa-428f-926e-00dc3e4027cf req-e43ff5b0-466e-4dd1-86ab-2af9205b9631 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] No waiting events found dispatching network-vif-plugged-e10436e2-7916-4b6b-905e-e9be7cb338b9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:50:17 np0005588920 nova_compute[226886]: 2026-01-20 14:50:17.618 226890 WARNING nova.compute.manager [req-238b57a2-08fa-428f-926e-00dc3e4027cf req-e43ff5b0-466e-4dd1-86ab-2af9205b9631 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received unexpected event network-vif-plugged-e10436e2-7916-4b6b-905e-e9be7cb338b9 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:50:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:50:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:17.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:50:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:50:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:18.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:50:19 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e249 e249: 3 total, 3 up, 3 in
Jan 20 09:50:19 np0005588920 NetworkManager[49076]: <info>  [1768920619.3780] manager: (patch-br-int-to-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/245)
Jan 20 09:50:19 np0005588920 nova_compute[226886]: 2026-01-20 14:50:19.376 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:19 np0005588920 NetworkManager[49076]: <info>  [1768920619.3794] manager: (patch-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/246)
Jan 20 09:50:19 np0005588920 nova_compute[226886]: 2026-01-20 14:50:19.573 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:19 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:19Z|00477|binding|INFO|Releasing lport 2daa37e3-5efb-4b15-b751-64ce5a073b56 from this chassis (sb_readonly=0)
Jan 20 09:50:19 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:19Z|00478|binding|INFO|Releasing lport 4b5ed097-4982-4e56-9d26-8c4481a0c10e from this chassis (sb_readonly=0)
Jan 20 09:50:19 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:19Z|00479|binding|INFO|Releasing lport 7e4baad3-091e-4d38-8db5-dc6d66194858 from this chassis (sb_readonly=0)
Jan 20 09:50:19 np0005588920 nova_compute[226886]: 2026-01-20 14:50:19.598 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:19 np0005588920 nova_compute[226886]: 2026-01-20 14:50:19.807 226890 DEBUG nova.compute.manager [req-06bd33ad-e933-491f-85e6-966691e3004e req-7a111655-e276-406c-928a-c6ee425c1b5a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received event network-changed-ed97bbce-18dc-4c9b-9a04-919dd3a45a8e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:50:19 np0005588920 nova_compute[226886]: 2026-01-20 14:50:19.807 226890 DEBUG nova.compute.manager [req-06bd33ad-e933-491f-85e6-966691e3004e req-7a111655-e276-406c-928a-c6ee425c1b5a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Refreshing instance network info cache due to event network-changed-ed97bbce-18dc-4c9b-9a04-919dd3a45a8e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:50:19 np0005588920 nova_compute[226886]: 2026-01-20 14:50:19.807 226890 DEBUG oslo_concurrency.lockutils [req-06bd33ad-e933-491f-85e6-966691e3004e req-7a111655-e276-406c-928a-c6ee425c1b5a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-f444ccf6-5adb-489a-b174-7450017a351b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:50:19 np0005588920 nova_compute[226886]: 2026-01-20 14:50:19.808 226890 DEBUG oslo_concurrency.lockutils [req-06bd33ad-e933-491f-85e6-966691e3004e req-7a111655-e276-406c-928a-c6ee425c1b5a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-f444ccf6-5adb-489a-b174-7450017a351b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:50:19 np0005588920 nova_compute[226886]: 2026-01-20 14:50:19.808 226890 DEBUG nova.network.neutron [req-06bd33ad-e933-491f-85e6-966691e3004e req-7a111655-e276-406c-928a-c6ee425c1b5a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Refreshing network info cache for port ed97bbce-18dc-4c9b-9a04-919dd3a45a8e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:50:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:50:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:19.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:50:20 np0005588920 nova_compute[226886]: 2026-01-20 14:50:20.448 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:20 np0005588920 nova_compute[226886]: 2026-01-20 14:50:20.653 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:20.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:20 np0005588920 nova_compute[226886]: 2026-01-20 14:50:20.839 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:21.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:21 np0005588920 nova_compute[226886]: 2026-01-20 14:50:21.932 226890 DEBUG nova.network.neutron [req-06bd33ad-e933-491f-85e6-966691e3004e req-7a111655-e276-406c-928a-c6ee425c1b5a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Updated VIF entry in instance network info cache for port ed97bbce-18dc-4c9b-9a04-919dd3a45a8e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:50:21 np0005588920 nova_compute[226886]: 2026-01-20 14:50:21.934 226890 DEBUG nova.network.neutron [req-06bd33ad-e933-491f-85e6-966691e3004e req-7a111655-e276-406c-928a-c6ee425c1b5a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Updating instance_info_cache with network_info: [{"id": "ed97bbce-18dc-4c9b-9a04-919dd3a45a8e", "address": "fa:16:3e:f7:b6:ce", "network": {"id": "ff283be9-fe7c-4cc6-900d-7258ea771ba5", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest-1025807292-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.227", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "taped97bbce-18", "ovs_interfaceid": "ed97bbce-18dc-4c9b-9a04-919dd3a45a8e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "b194a444-cc69-43f2-9931-e9e53ee450c9", "address": "fa:16:3e:2a:e3:2c", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb194a444-cc", "ovs_interfaceid": "b194a444-cc69-43f2-9931-e9e53ee450c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "10da9204-0ccb-45d0-981d-fdff5c41cda1", "address": "fa:16:3e:8b:cf:d1", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.179", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10da9204-0c", "ovs_interfaceid": "10da9204-0ccb-45d0-981d-fdff5c41cda1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "62ccfbd3-f504-46d0-a4af-ec2dcb7b5764", "address": "fa:16:3e:a0:d2:6d", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.81", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62ccfbd3-f5", "ovs_interfaceid": "62ccfbd3-f504-46d0-a4af-ec2dcb7b5764", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d2be8515-193f-43f4-bae4-d2a509320929", "address": "fa:16:3e:3d:83:a1", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.252", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2be8515-19", "ovs_interfaceid": "d2be8515-193f-43f4-bae4-d2a509320929", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "e10436e2-7916-4b6b-905e-e9be7cb338b9", "address": "fa:16:3e:c9:16:c9", "network": {"id": "76110867-e0cf-4657-99dd-486c8fecc844", "bridge": "br-int", "label": "tempest-device-tagging-net2-1836923782", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, 
"ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape10436e2-79", "ovs_interfaceid": "e10436e2-7916-4b6b-905e-e9be7cb338b9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ea69e1af-9543-4c76-9981-b8475aa031fe", "address": "fa:16:3e:35:77:c5", "network": {"id": "76110867-e0cf-4657-99dd-486c8fecc844", "bridge": "br-int", "label": "tempest-device-tagging-net2-1836923782", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea69e1af-95", "ovs_interfaceid": "ea69e1af-9543-4c76-9981-b8475aa031fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:50:21 np0005588920 nova_compute[226886]: 2026-01-20 14:50:21.968 226890 DEBUG oslo_concurrency.lockutils [req-06bd33ad-e933-491f-85e6-966691e3004e req-7a111655-e276-406c-928a-c6ee425c1b5a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-f444ccf6-5adb-489a-b174-7450017a351b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:50:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e249 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:50:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:50:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:22.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:50:22 np0005588920 podman[264703]: 2026-01-20 14:50:22.991524251 +0000 UTC m=+0.069435107 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 20 09:50:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:23.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:24.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:25 np0005588920 nova_compute[226886]: 2026-01-20 14:50:25.451 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:25 np0005588920 nova_compute[226886]: 2026-01-20 14:50:25.843 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:25.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:50:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:26.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:50:26 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:50:26 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3209917238' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:50:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e249 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:50:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:27.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:28.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e250 e250: 3 total, 3 up, 3 in
Jan 20 09:50:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:29.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:30 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:30Z|00059|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:8b:cf:d1 10.1.1.179
Jan 20 09:50:30 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:30Z|00060|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8b:cf:d1 10.1.1.179
Jan 20 09:50:30 np0005588920 nova_compute[226886]: 2026-01-20 14:50:30.454 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:30 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:30Z|00061|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a0:d2:6d 10.1.1.81
Jan 20 09:50:30 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:30Z|00062|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a0:d2:6d 10.1.1.81
Jan 20 09:50:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:30.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:30 np0005588920 nova_compute[226886]: 2026-01-20 14:50:30.845 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:30 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:30Z|00063|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:3d:83:a1 10.1.1.252
Jan 20 09:50:30 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:30Z|00064|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:3d:83:a1 10.1.1.252
Jan 20 09:50:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:31Z|00065|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:2a:e3:2c 10.1.1.14
Jan 20 09:50:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:31Z|00066|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:2a:e3:2c 10.1.1.14
Jan 20 09:50:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:31Z|00067|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:35:77:c5 10.2.2.200
Jan 20 09:50:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:31Z|00068|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:35:77:c5 10.2.2.200
Jan 20 09:50:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:31Z|00069|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f7:b6:ce 10.100.0.10
Jan 20 09:50:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:31Z|00070|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f7:b6:ce 10.100.0.10
Jan 20 09:50:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:31Z|00071|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c9:16:c9 10.2.2.100
Jan 20 09:50:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:31Z|00072|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c9:16:c9 10.2.2.100
Jan 20 09:50:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:31.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:32 np0005588920 nova_compute[226886]: 2026-01-20 14:50:32.117 226890 DEBUG oslo_concurrency.lockutils [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "91701d8b-36b9-42fe-a5ae-bf6c9c74fc14" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:32 np0005588920 nova_compute[226886]: 2026-01-20 14:50:32.117 226890 DEBUG oslo_concurrency.lockutils [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "91701d8b-36b9-42fe-a5ae-bf6c9c74fc14" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:32 np0005588920 nova_compute[226886]: 2026-01-20 14:50:32.137 226890 DEBUG nova.compute.manager [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:50:32 np0005588920 nova_compute[226886]: 2026-01-20 14:50:32.231 226890 DEBUG oslo_concurrency.lockutils [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:32 np0005588920 nova_compute[226886]: 2026-01-20 14:50:32.232 226890 DEBUG oslo_concurrency.lockutils [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:32 np0005588920 nova_compute[226886]: 2026-01-20 14:50:32.239 226890 DEBUG nova.virt.hardware [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:50:32 np0005588920 nova_compute[226886]: 2026-01-20 14:50:32.240 226890 INFO nova.compute.claims [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 09:50:32 np0005588920 nova_compute[226886]: 2026-01-20 14:50:32.355 226890 DEBUG oslo_concurrency.processutils [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:50:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e250 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:50:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:50:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:32.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:50:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:50:32 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2955992127' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:50:32 np0005588920 nova_compute[226886]: 2026-01-20 14:50:32.794 226890 DEBUG oslo_concurrency.processutils [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:50:32 np0005588920 nova_compute[226886]: 2026-01-20 14:50:32.803 226890 DEBUG nova.compute.provider_tree [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:50:32 np0005588920 nova_compute[226886]: 2026-01-20 14:50:32.827 226890 DEBUG nova.scheduler.client.report [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:50:32 np0005588920 nova_compute[226886]: 2026-01-20 14:50:32.873 226890 DEBUG oslo_concurrency.lockutils [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.641s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:32 np0005588920 nova_compute[226886]: 2026-01-20 14:50:32.874 226890 DEBUG nova.compute.manager [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:50:32 np0005588920 nova_compute[226886]: 2026-01-20 14:50:32.918 226890 DEBUG nova.compute.manager [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:50:32 np0005588920 nova_compute[226886]: 2026-01-20 14:50:32.918 226890 DEBUG nova.network.neutron [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:50:32 np0005588920 nova_compute[226886]: 2026-01-20 14:50:32.942 226890 INFO nova.virt.libvirt.driver [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:50:32 np0005588920 nova_compute[226886]: 2026-01-20 14:50:32.968 226890 DEBUG nova.compute.manager [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:50:33 np0005588920 nova_compute[226886]: 2026-01-20 14:50:33.060 226890 DEBUG nova.compute.manager [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:50:33 np0005588920 nova_compute[226886]: 2026-01-20 14:50:33.062 226890 DEBUG nova.virt.libvirt.driver [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:50:33 np0005588920 nova_compute[226886]: 2026-01-20 14:50:33.063 226890 INFO nova.virt.libvirt.driver [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Creating image(s)#033[00m
Jan 20 09:50:33 np0005588920 nova_compute[226886]: 2026-01-20 14:50:33.092 226890 DEBUG nova.storage.rbd_utils [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] rbd image 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:50:33 np0005588920 nova_compute[226886]: 2026-01-20 14:50:33.120 226890 DEBUG nova.storage.rbd_utils [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] rbd image 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:50:33 np0005588920 nova_compute[226886]: 2026-01-20 14:50:33.149 226890 DEBUG nova.storage.rbd_utils [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] rbd image 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:50:33 np0005588920 nova_compute[226886]: 2026-01-20 14:50:33.152 226890 DEBUG oslo_concurrency.processutils [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:50:33 np0005588920 nova_compute[226886]: 2026-01-20 14:50:33.214 226890 DEBUG oslo_concurrency.processutils [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:50:33 np0005588920 nova_compute[226886]: 2026-01-20 14:50:33.216 226890 DEBUG oslo_concurrency.lockutils [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:33 np0005588920 nova_compute[226886]: 2026-01-20 14:50:33.216 226890 DEBUG oslo_concurrency.lockutils [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:33 np0005588920 nova_compute[226886]: 2026-01-20 14:50:33.217 226890 DEBUG oslo_concurrency.lockutils [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:33 np0005588920 nova_compute[226886]: 2026-01-20 14:50:33.244 226890 DEBUG nova.storage.rbd_utils [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] rbd image 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:50:33 np0005588920 nova_compute[226886]: 2026-01-20 14:50:33.247 226890 DEBUG oslo_concurrency.processutils [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:50:33 np0005588920 nova_compute[226886]: 2026-01-20 14:50:33.296 226890 DEBUG nova.policy [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3e9278fdb9e645b7938f3edb20c4d3cf', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1c5f03d46c0c4162a3b2f1530850bb6c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:50:33 np0005588920 nova_compute[226886]: 2026-01-20 14:50:33.499 226890 DEBUG oslo_concurrency.processutils [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.251s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:50:33 np0005588920 nova_compute[226886]: 2026-01-20 14:50:33.552 226890 DEBUG nova.storage.rbd_utils [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] resizing rbd image 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:50:33 np0005588920 nova_compute[226886]: 2026-01-20 14:50:33.632 226890 DEBUG nova.objects.instance [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'migration_context' on Instance uuid 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:50:33 np0005588920 nova_compute[226886]: 2026-01-20 14:50:33.654 226890 DEBUG nova.virt.libvirt.driver [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:50:33 np0005588920 nova_compute[226886]: 2026-01-20 14:50:33.654 226890 DEBUG nova.virt.libvirt.driver [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Ensure instance console log exists: /var/lib/nova/instances/91701d8b-36b9-42fe-a5ae-bf6c9c74fc14/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:50:33 np0005588920 nova_compute[226886]: 2026-01-20 14:50:33.655 226890 DEBUG oslo_concurrency.lockutils [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:33 np0005588920 nova_compute[226886]: 2026-01-20 14:50:33.655 226890 DEBUG oslo_concurrency.lockutils [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:33 np0005588920 nova_compute[226886]: 2026-01-20 14:50:33.655 226890 DEBUG oslo_concurrency.lockutils [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:33 np0005588920 nova_compute[226886]: 2026-01-20 14:50:33.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:50:33 np0005588920 nova_compute[226886]: 2026-01-20 14:50:33.726 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:50:33 np0005588920 nova_compute[226886]: 2026-01-20 14:50:33.726 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 09:50:33 np0005588920 nova_compute[226886]: 2026-01-20 14:50:33.744 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 20 09:50:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:33.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:34 np0005588920 nova_compute[226886]: 2026-01-20 14:50:34.055 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "refresh_cache-f444ccf6-5adb-489a-b174-7450017a351b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:50:34 np0005588920 nova_compute[226886]: 2026-01-20 14:50:34.055 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquired lock "refresh_cache-f444ccf6-5adb-489a-b174-7450017a351b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:50:34 np0005588920 nova_compute[226886]: 2026-01-20 14:50:34.055 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 09:50:34 np0005588920 nova_compute[226886]: 2026-01-20 14:50:34.055 226890 DEBUG nova.objects.instance [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f444ccf6-5adb-489a-b174-7450017a351b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:50:34 np0005588920 nova_compute[226886]: 2026-01-20 14:50:34.178 226890 DEBUG nova.network.neutron [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Successfully created port: 2c9f3e71-2562-4ae0-bf22-d56553a40405 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:50:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:34.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:35.100 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=35, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=34) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:50:35 np0005588920 nova_compute[226886]: 2026-01-20 14:50:35.100 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:35.101 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:50:35 np0005588920 nova_compute[226886]: 2026-01-20 14:50:35.125 226890 DEBUG nova.network.neutron [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Successfully updated port: 2c9f3e71-2562-4ae0-bf22-d56553a40405 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:50:35 np0005588920 nova_compute[226886]: 2026-01-20 14:50:35.144 226890 DEBUG oslo_concurrency.lockutils [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "refresh_cache-91701d8b-36b9-42fe-a5ae-bf6c9c74fc14" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:50:35 np0005588920 nova_compute[226886]: 2026-01-20 14:50:35.144 226890 DEBUG oslo_concurrency.lockutils [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquired lock "refresh_cache-91701d8b-36b9-42fe-a5ae-bf6c9c74fc14" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:50:35 np0005588920 nova_compute[226886]: 2026-01-20 14:50:35.144 226890 DEBUG nova.network.neutron [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:50:35 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e251 e251: 3 total, 3 up, 3 in
Jan 20 09:50:35 np0005588920 nova_compute[226886]: 2026-01-20 14:50:35.277 226890 DEBUG nova.compute.manager [req-589afc55-91e9-4192-a6f1-74eb65925c11 req-2abc6b4a-275d-4ed6-991c-ba9b9e241ec1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Received event network-changed-2c9f3e71-2562-4ae0-bf22-d56553a40405 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:50:35 np0005588920 nova_compute[226886]: 2026-01-20 14:50:35.277 226890 DEBUG nova.compute.manager [req-589afc55-91e9-4192-a6f1-74eb65925c11 req-2abc6b4a-275d-4ed6-991c-ba9b9e241ec1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Refreshing instance network info cache due to event network-changed-2c9f3e71-2562-4ae0-bf22-d56553a40405. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:50:35 np0005588920 nova_compute[226886]: 2026-01-20 14:50:35.278 226890 DEBUG oslo_concurrency.lockutils [req-589afc55-91e9-4192-a6f1-74eb65925c11 req-2abc6b4a-275d-4ed6-991c-ba9b9e241ec1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-91701d8b-36b9-42fe-a5ae-bf6c9c74fc14" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:50:35 np0005588920 nova_compute[226886]: 2026-01-20 14:50:35.377 226890 DEBUG nova.network.neutron [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:50:35 np0005588920 nova_compute[226886]: 2026-01-20 14:50:35.456 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:35 np0005588920 nova_compute[226886]: 2026-01-20 14:50:35.848 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:50:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:35.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:50:36 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e252 e252: 3 total, 3 up, 3 in
Jan 20 09:50:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:36.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:36 np0005588920 nova_compute[226886]: 2026-01-20 14:50:36.860 226890 DEBUG nova.network.neutron [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Updating instance_info_cache with network_info: [{"id": "2c9f3e71-2562-4ae0-bf22-d56553a40405", "address": "fa:16:3e:cb:72:0c", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c9f3e71-25", "ovs_interfaceid": "2c9f3e71-2562-4ae0-bf22-d56553a40405", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:50:36 np0005588920 nova_compute[226886]: 2026-01-20 14:50:36.956 226890 DEBUG oslo_concurrency.lockutils [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Releasing lock "refresh_cache-91701d8b-36b9-42fe-a5ae-bf6c9c74fc14" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:50:36 np0005588920 nova_compute[226886]: 2026-01-20 14:50:36.956 226890 DEBUG nova.compute.manager [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Instance network_info: |[{"id": "2c9f3e71-2562-4ae0-bf22-d56553a40405", "address": "fa:16:3e:cb:72:0c", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c9f3e71-25", "ovs_interfaceid": "2c9f3e71-2562-4ae0-bf22-d56553a40405", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:50:36 np0005588920 nova_compute[226886]: 2026-01-20 14:50:36.956 226890 DEBUG oslo_concurrency.lockutils [req-589afc55-91e9-4192-a6f1-74eb65925c11 req-2abc6b4a-275d-4ed6-991c-ba9b9e241ec1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-91701d8b-36b9-42fe-a5ae-bf6c9c74fc14" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:50:36 np0005588920 nova_compute[226886]: 2026-01-20 14:50:36.957 226890 DEBUG nova.network.neutron [req-589afc55-91e9-4192-a6f1-74eb65925c11 req-2abc6b4a-275d-4ed6-991c-ba9b9e241ec1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Refreshing network info cache for port 2c9f3e71-2562-4ae0-bf22-d56553a40405 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:50:36 np0005588920 nova_compute[226886]: 2026-01-20 14:50:36.959 226890 DEBUG nova.virt.libvirt.driver [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Start _get_guest_xml network_info=[{"id": "2c9f3e71-2562-4ae0-bf22-d56553a40405", "address": "fa:16:3e:cb:72:0c", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c9f3e71-25", "ovs_interfaceid": "2c9f3e71-2562-4ae0-bf22-d56553a40405", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:50:36 np0005588920 nova_compute[226886]: 2026-01-20 14:50:36.964 226890 WARNING nova.virt.libvirt.driver [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:50:36 np0005588920 nova_compute[226886]: 2026-01-20 14:50:36.969 226890 DEBUG nova.virt.libvirt.host [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:50:36 np0005588920 nova_compute[226886]: 2026-01-20 14:50:36.970 226890 DEBUG nova.virt.libvirt.host [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:50:36 np0005588920 nova_compute[226886]: 2026-01-20 14:50:36.973 226890 DEBUG nova.virt.libvirt.host [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:50:36 np0005588920 nova_compute[226886]: 2026-01-20 14:50:36.973 226890 DEBUG nova.virt.libvirt.host [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:50:36 np0005588920 nova_compute[226886]: 2026-01-20 14:50:36.974 226890 DEBUG nova.virt.libvirt.driver [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:50:36 np0005588920 nova_compute[226886]: 2026-01-20 14:50:36.975 226890 DEBUG nova.virt.hardware [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:50:36 np0005588920 nova_compute[226886]: 2026-01-20 14:50:36.975 226890 DEBUG nova.virt.hardware [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:50:36 np0005588920 nova_compute[226886]: 2026-01-20 14:50:36.975 226890 DEBUG nova.virt.hardware [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:50:36 np0005588920 nova_compute[226886]: 2026-01-20 14:50:36.975 226890 DEBUG nova.virt.hardware [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:50:36 np0005588920 nova_compute[226886]: 2026-01-20 14:50:36.976 226890 DEBUG nova.virt.hardware [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:50:36 np0005588920 nova_compute[226886]: 2026-01-20 14:50:36.976 226890 DEBUG nova.virt.hardware [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:50:36 np0005588920 nova_compute[226886]: 2026-01-20 14:50:36.976 226890 DEBUG nova.virt.hardware [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:50:36 np0005588920 nova_compute[226886]: 2026-01-20 14:50:36.976 226890 DEBUG nova.virt.hardware [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:50:36 np0005588920 nova_compute[226886]: 2026-01-20 14:50:36.977 226890 DEBUG nova.virt.hardware [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:50:36 np0005588920 nova_compute[226886]: 2026-01-20 14:50:36.977 226890 DEBUG nova.virt.hardware [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:50:36 np0005588920 nova_compute[226886]: 2026-01-20 14:50:36.977 226890 DEBUG nova.virt.hardware [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:50:36 np0005588920 nova_compute[226886]: 2026-01-20 14:50:36.980 226890 DEBUG oslo_concurrency.processutils [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:50:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:37.102 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '35'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e253 e253: 3 total, 3 up, 3 in
Jan 20 09:50:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:50:37 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3655918105' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:50:37 np0005588920 nova_compute[226886]: 2026-01-20 14:50:37.400 226890 DEBUG oslo_concurrency.processutils [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.420s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:50:37 np0005588920 nova_compute[226886]: 2026-01-20 14:50:37.427 226890 DEBUG nova.storage.rbd_utils [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] rbd image 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:50:37 np0005588920 nova_compute[226886]: 2026-01-20 14:50:37.431 226890 DEBUG oslo_concurrency.processutils [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:50:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:50:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:37.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:50:37 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2564209469' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:50:37 np0005588920 nova_compute[226886]: 2026-01-20 14:50:37.913 226890 DEBUG oslo_concurrency.processutils [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:50:37 np0005588920 nova_compute[226886]: 2026-01-20 14:50:37.914 226890 DEBUG nova.virt.libvirt.vif [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:50:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1254775729',display_name='tempest-ServerActionsTestJSON-server-1254775729',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1254775729',id=111,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLgEzx5mLsSqRL7L9WKOzM+WdeJ40U103wY9H3VMZ41G/sN5tQtQSt9lXWKTyc6pt00bfKD0E9GPugNMpy+dzSSpK23o3CgadkzAfAvjQCeCPOSp3fX13FGApomGd1HRCQ==',key_name='tempest-keypair-1602241722',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1c5f03d46c0c4162a3b2f1530850bb6c',ramdisk_id='',reservation_id='r-f09hwh77',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1020442335',owner_user_name='tempest-ServerActionsTestJSON-1020442335-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:50:33Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e9278fdb9e645b7938f3edb20c4d3cf',uuid=91701d8b-36b9-42fe-a5ae-bf6c9c74fc14,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2c9f3e71-2562-4ae0-bf22-d56553a40405", "address": "fa:16:3e:cb:72:0c", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c9f3e71-25", "ovs_interfaceid": "2c9f3e71-2562-4ae0-bf22-d56553a40405", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:50:37 np0005588920 nova_compute[226886]: 2026-01-20 14:50:37.915 226890 DEBUG nova.network.os_vif_util [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converting VIF {"id": "2c9f3e71-2562-4ae0-bf22-d56553a40405", "address": "fa:16:3e:cb:72:0c", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c9f3e71-25", "ovs_interfaceid": "2c9f3e71-2562-4ae0-bf22-d56553a40405", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:50:37 np0005588920 nova_compute[226886]: 2026-01-20 14:50:37.915 226890 DEBUG nova.network.os_vif_util [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cb:72:0c,bridge_name='br-int',has_traffic_filtering=True,id=2c9f3e71-2562-4ae0-bf22-d56553a40405,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c9f3e71-25') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:50:37 np0005588920 nova_compute[226886]: 2026-01-20 14:50:37.916 226890 DEBUG nova.objects.instance [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'pci_devices' on Instance uuid 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:50:37 np0005588920 nova_compute[226886]: 2026-01-20 14:50:37.940 226890 DEBUG nova.virt.libvirt.driver [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:50:37 np0005588920 nova_compute[226886]:  <uuid>91701d8b-36b9-42fe-a5ae-bf6c9c74fc14</uuid>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:  <name>instance-0000006f</name>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:50:37 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:      <nova:name>tempest-ServerActionsTestJSON-server-1254775729</nova:name>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:50:36</nova:creationTime>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:50:37 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:        <nova:user uuid="3e9278fdb9e645b7938f3edb20c4d3cf">tempest-ServerActionsTestJSON-1020442335-project-member</nova:user>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:        <nova:project uuid="1c5f03d46c0c4162a3b2f1530850bb6c">tempest-ServerActionsTestJSON-1020442335</nova:project>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:        <nova:port uuid="2c9f3e71-2562-4ae0-bf22-d56553a40405">
Jan 20 09:50:37 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:      <entry name="serial">91701d8b-36b9-42fe-a5ae-bf6c9c74fc14</entry>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:      <entry name="uuid">91701d8b-36b9-42fe-a5ae-bf6c9c74fc14</entry>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:50:37 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/91701d8b-36b9-42fe-a5ae-bf6c9c74fc14_disk">
Jan 20 09:50:37 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:50:37 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:50:37 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/91701d8b-36b9-42fe-a5ae-bf6c9c74fc14_disk.config">
Jan 20 09:50:37 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:50:37 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:50:37 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:cb:72:0c"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:      <target dev="tap2c9f3e71-25"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:50:37 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/91701d8b-36b9-42fe-a5ae-bf6c9c74fc14/console.log" append="off"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:50:37 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:50:37 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:50:37 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:50:37 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:50:37 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:50:37 np0005588920 nova_compute[226886]: 2026-01-20 14:50:37.942 226890 DEBUG nova.compute.manager [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Preparing to wait for external event network-vif-plugged-2c9f3e71-2562-4ae0-bf22-d56553a40405 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:50:37 np0005588920 nova_compute[226886]: 2026-01-20 14:50:37.942 226890 DEBUG oslo_concurrency.lockutils [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "91701d8b-36b9-42fe-a5ae-bf6c9c74fc14-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:37 np0005588920 nova_compute[226886]: 2026-01-20 14:50:37.942 226890 DEBUG oslo_concurrency.lockutils [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "91701d8b-36b9-42fe-a5ae-bf6c9c74fc14-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:37 np0005588920 nova_compute[226886]: 2026-01-20 14:50:37.943 226890 DEBUG oslo_concurrency.lockutils [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "91701d8b-36b9-42fe-a5ae-bf6c9c74fc14-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:37 np0005588920 nova_compute[226886]: 2026-01-20 14:50:37.944 226890 DEBUG nova.virt.libvirt.vif [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:50:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1254775729',display_name='tempest-ServerActionsTestJSON-server-1254775729',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1254775729',id=111,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLgEzx5mLsSqRL7L9WKOzM+WdeJ40U103wY9H3VMZ41G/sN5tQtQSt9lXWKTyc6pt00bfKD0E9GPugNMpy+dzSSpK23o3CgadkzAfAvjQCeCPOSp3fX13FGApomGd1HRCQ==',key_name='tempest-keypair-1602241722',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1c5f03d46c0c4162a3b2f1530850bb6c',ramdisk_id='',reservation_id='r-f09hwh77',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1020442335',owner_user_name='tempest-ServerActionsTestJSON-1020442335-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:50:33Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e9278fdb9e645b7938f3edb20c4d3cf',uuid=91701d8b-36b9-42fe-a5ae-bf6c9c74fc14,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2c9f3e71-2562-4ae0-bf22-d56553a40405", "address": "fa:16:3e:cb:72:0c", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c9f3e71-25", "ovs_interfaceid": "2c9f3e71-2562-4ae0-bf22-d56553a40405", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:50:37 np0005588920 nova_compute[226886]: 2026-01-20 14:50:37.944 226890 DEBUG nova.network.os_vif_util [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converting VIF {"id": "2c9f3e71-2562-4ae0-bf22-d56553a40405", "address": "fa:16:3e:cb:72:0c", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c9f3e71-25", "ovs_interfaceid": "2c9f3e71-2562-4ae0-bf22-d56553a40405", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:50:37 np0005588920 nova_compute[226886]: 2026-01-20 14:50:37.945 226890 DEBUG nova.network.os_vif_util [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cb:72:0c,bridge_name='br-int',has_traffic_filtering=True,id=2c9f3e71-2562-4ae0-bf22-d56553a40405,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c9f3e71-25') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:50:37 np0005588920 nova_compute[226886]: 2026-01-20 14:50:37.945 226890 DEBUG os_vif [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cb:72:0c,bridge_name='br-int',has_traffic_filtering=True,id=2c9f3e71-2562-4ae0-bf22-d56553a40405,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c9f3e71-25') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:50:37 np0005588920 nova_compute[226886]: 2026-01-20 14:50:37.945 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:37 np0005588920 nova_compute[226886]: 2026-01-20 14:50:37.946 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:37 np0005588920 nova_compute[226886]: 2026-01-20 14:50:37.946 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:50:37 np0005588920 nova_compute[226886]: 2026-01-20 14:50:37.950 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:37 np0005588920 nova_compute[226886]: 2026-01-20 14:50:37.950 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2c9f3e71-25, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:37 np0005588920 nova_compute[226886]: 2026-01-20 14:50:37.951 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2c9f3e71-25, col_values=(('external_ids', {'iface-id': '2c9f3e71-2562-4ae0-bf22-d56553a40405', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cb:72:0c', 'vm-uuid': '91701d8b-36b9-42fe-a5ae-bf6c9c74fc14'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:37 np0005588920 nova_compute[226886]: 2026-01-20 14:50:37.952 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:37 np0005588920 NetworkManager[49076]: <info>  [1768920637.9537] manager: (tap2c9f3e71-25): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/247)
Jan 20 09:50:37 np0005588920 nova_compute[226886]: 2026-01-20 14:50:37.955 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:50:37 np0005588920 nova_compute[226886]: 2026-01-20 14:50:37.961 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:37 np0005588920 nova_compute[226886]: 2026-01-20 14:50:37.963 226890 INFO os_vif [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cb:72:0c,bridge_name='br-int',has_traffic_filtering=True,id=2c9f3e71-2562-4ae0-bf22-d56553a40405,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c9f3e71-25')#033[00m
Jan 20 09:50:38 np0005588920 nova_compute[226886]: 2026-01-20 14:50:38.212 226890 DEBUG nova.virt.libvirt.driver [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:50:38 np0005588920 nova_compute[226886]: 2026-01-20 14:50:38.213 226890 DEBUG nova.virt.libvirt.driver [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:50:38 np0005588920 nova_compute[226886]: 2026-01-20 14:50:38.214 226890 DEBUG nova.virt.libvirt.driver [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] No VIF found with MAC fa:16:3e:cb:72:0c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:50:38 np0005588920 nova_compute[226886]: 2026-01-20 14:50:38.214 226890 INFO nova.virt.libvirt.driver [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Using config drive#033[00m
Jan 20 09:50:38 np0005588920 nova_compute[226886]: 2026-01-20 14:50:38.246 226890 DEBUG nova.storage.rbd_utils [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] rbd image 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:50:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:50:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:38.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:50:38 np0005588920 nova_compute[226886]: 2026-01-20 14:50:38.894 226890 INFO nova.virt.libvirt.driver [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Creating config drive at /var/lib/nova/instances/91701d8b-36b9-42fe-a5ae-bf6c9c74fc14/disk.config#033[00m
Jan 20 09:50:38 np0005588920 nova_compute[226886]: 2026-01-20 14:50:38.900 226890 DEBUG oslo_concurrency.processutils [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/91701d8b-36b9-42fe-a5ae-bf6c9c74fc14/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpow90tpe1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:50:39 np0005588920 nova_compute[226886]: 2026-01-20 14:50:39.029 226890 DEBUG oslo_concurrency.processutils [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/91701d8b-36b9-42fe-a5ae-bf6c9c74fc14/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpow90tpe1" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:50:39 np0005588920 nova_compute[226886]: 2026-01-20 14:50:39.057 226890 DEBUG nova.storage.rbd_utils [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] rbd image 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:50:39 np0005588920 nova_compute[226886]: 2026-01-20 14:50:39.062 226890 DEBUG oslo_concurrency.processutils [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/91701d8b-36b9-42fe-a5ae-bf6c9c74fc14/disk.config 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:50:39 np0005588920 nova_compute[226886]: 2026-01-20 14:50:39.210 226890 DEBUG oslo_concurrency.processutils [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/91701d8b-36b9-42fe-a5ae-bf6c9c74fc14/disk.config 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.149s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:50:39 np0005588920 nova_compute[226886]: 2026-01-20 14:50:39.212 226890 INFO nova.virt.libvirt.driver [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Deleting local config drive /var/lib/nova/instances/91701d8b-36b9-42fe-a5ae-bf6c9c74fc14/disk.config because it was imported into RBD.#033[00m
Jan 20 09:50:39 np0005588920 kernel: tap2c9f3e71-25: entered promiscuous mode
Jan 20 09:50:39 np0005588920 NetworkManager[49076]: <info>  [1768920639.2552] manager: (tap2c9f3e71-25): new Tun device (/org/freedesktop/NetworkManager/Devices/248)
Jan 20 09:50:39 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:39Z|00480|binding|INFO|Claiming lport 2c9f3e71-2562-4ae0-bf22-d56553a40405 for this chassis.
Jan 20 09:50:39 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:39Z|00481|binding|INFO|2c9f3e71-2562-4ae0-bf22-d56553a40405: Claiming fa:16:3e:cb:72:0c 10.100.0.14
Jan 20 09:50:39 np0005588920 nova_compute[226886]: 2026-01-20 14:50:39.257 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:39.266 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cb:72:0c 10.100.0.14'], port_security=['fa:16:3e:cb:72:0c 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '91701d8b-36b9-42fe-a5ae-bf6c9c74fc14', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-762e1859-4db4-4d9e-b66f-d50316f80df4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1c5f03d46c0c4162a3b2f1530850bb6c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '80535eda-fa59-4edc-8e3d-9bfea6517730', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2474a8ca-bb96-4cae-9133-23419b81a9fc, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=2c9f3e71-2562-4ae0-bf22-d56553a40405) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:39.269 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 2c9f3e71-2562-4ae0-bf22-d56553a40405 in datapath 762e1859-4db4-4d9e-b66f-d50316f80df4 bound to our chassis#033[00m
Jan 20 09:50:39 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:39Z|00482|binding|INFO|Setting lport 2c9f3e71-2562-4ae0-bf22-d56553a40405 ovn-installed in OVS
Jan 20 09:50:39 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:39Z|00483|binding|INFO|Setting lport 2c9f3e71-2562-4ae0-bf22-d56553a40405 up in Southbound
Jan 20 09:50:39 np0005588920 nova_compute[226886]: 2026-01-20 14:50:39.273 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:39.274 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 762e1859-4db4-4d9e-b66f-d50316f80df4#033[00m
Jan 20 09:50:39 np0005588920 nova_compute[226886]: 2026-01-20 14:50:39.278 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:39.286 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b3ff90ea-4072-4511-be5c-3313f11a91a8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:39.288 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap762e1859-41 in ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:50:39 np0005588920 systemd-machined[196121]: New machine qemu-48-instance-0000006f.
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:39.290 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap762e1859-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:39.290 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[db121eaa-c409-4938-b7f3-bdf4e9f5a1f7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:39.291 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[afa8b0ab-6ad1-46f4-a147-196bf2e3e85e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:39.301 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[01f4a8f0-44d8-4ca7-ad9a-7a3630479c09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:39 np0005588920 systemd[1]: Started Virtual Machine qemu-48-instance-0000006f.
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:39.325 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c246fdcf-a352-472a-9d92-24282a323a6e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:39 np0005588920 systemd-udevd[265052]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:50:39 np0005588920 NetworkManager[49076]: <info>  [1768920639.3410] device (tap2c9f3e71-25): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:50:39 np0005588920 NetworkManager[49076]: <info>  [1768920639.3421] device (tap2c9f3e71-25): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:39.355 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[7f7d1323-b8c0-4c00-818d-4032a64a28a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:39.362 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a59fc719-9d0c-4513-90d5-2c0500ec2ed2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:39 np0005588920 NetworkManager[49076]: <info>  [1768920639.3641] manager: (tap762e1859-40): new Veth device (/org/freedesktop/NetworkManager/Devices/249)
Jan 20 09:50:39 np0005588920 systemd-udevd[265058]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:39.395 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[24d2466e-5b2f-4a02-a1c9-221c142a0c36]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:39.398 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[a2f6519d-9e05-4545-9060-753907e3ee33]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:39 np0005588920 NetworkManager[49076]: <info>  [1768920639.4180] device (tap762e1859-40): carrier: link connected
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:39.421 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[c2e12611-7ca4-4d47-ad43-105d141fb82d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:39.436 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[fb7691ff-f9dc-468c-9a38-c4640146ffcb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap762e1859-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:69:f1:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 154], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 568110, 'reachable_time': 26307, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 265084, 'error': None, 'target': 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:39.449 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[8ef00eaa-2af8-4aca-a789-a58b50093243]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe69:f1da'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 568110, 'tstamp': 568110}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 265085, 'error': None, 'target': 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:39.462 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[4aaded53-56f4-4c98-90f7-3bd12a44cc40]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap762e1859-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:69:f1:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 154], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 568110, 'reachable_time': 26307, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 265086, 'error': None, 'target': 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:39.490 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d33d9858-39de-493b-ab91-b62f1f15d7fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:39.550 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2f9dce2c-f638-4be9-b0c4-02bd05674db9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:39.552 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap762e1859-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:39.552 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:39.552 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap762e1859-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:39 np0005588920 nova_compute[226886]: 2026-01-20 14:50:39.554 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:39 np0005588920 kernel: tap762e1859-40: entered promiscuous mode
Jan 20 09:50:39 np0005588920 NetworkManager[49076]: <info>  [1768920639.5586] manager: (tap762e1859-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/250)
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:39.560 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap762e1859-40, col_values=(('external_ids', {'iface-id': '9e775c45-1646-436d-a0cb-a5b5ec356e1b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:39 np0005588920 nova_compute[226886]: 2026-01-20 14:50:39.562 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:39 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:39Z|00484|binding|INFO|Releasing lport 9e775c45-1646-436d-a0cb-a5b5ec356e1b from this chassis (sb_readonly=0)
Jan 20 09:50:39 np0005588920 nova_compute[226886]: 2026-01-20 14:50:39.563 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:39.565 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/762e1859-4db4-4d9e-b66f-d50316f80df4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/762e1859-4db4-4d9e-b66f-d50316f80df4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:39.566 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[3c53d3e9-d755-4a2c-88fa-86282a08d156]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:39.568 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-762e1859-4db4-4d9e-b66f-d50316f80df4
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/762e1859-4db4-4d9e-b66f-d50316f80df4.pid.haproxy
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID 762e1859-4db4-4d9e-b66f-d50316f80df4
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:50:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:39.569 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'env', 'PROCESS_TAG=haproxy-762e1859-4db4-4d9e-b66f-d50316f80df4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/762e1859-4db4-4d9e-b66f-d50316f80df4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:50:39 np0005588920 nova_compute[226886]: 2026-01-20 14:50:39.578 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:39 np0005588920 nova_compute[226886]: 2026-01-20 14:50:39.704 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920639.7038996, 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:50:39 np0005588920 nova_compute[226886]: 2026-01-20 14:50:39.705 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] VM Started (Lifecycle Event)#033[00m
Jan 20 09:50:39 np0005588920 nova_compute[226886]: 2026-01-20 14:50:39.744 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:50:39 np0005588920 nova_compute[226886]: 2026-01-20 14:50:39.748 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920639.7040074, 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:50:39 np0005588920 nova_compute[226886]: 2026-01-20 14:50:39.749 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:50:39 np0005588920 nova_compute[226886]: 2026-01-20 14:50:39.787 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:50:39 np0005588920 nova_compute[226886]: 2026-01-20 14:50:39.790 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:50:39 np0005588920 nova_compute[226886]: 2026-01-20 14:50:39.829 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:50:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:39.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:39 np0005588920 nova_compute[226886]: 2026-01-20 14:50:39.881 226890 DEBUG nova.network.neutron [req-589afc55-91e9-4192-a6f1-74eb65925c11 req-2abc6b4a-275d-4ed6-991c-ba9b9e241ec1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Updated VIF entry in instance network info cache for port 2c9f3e71-2562-4ae0-bf22-d56553a40405. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:50:39 np0005588920 nova_compute[226886]: 2026-01-20 14:50:39.882 226890 DEBUG nova.network.neutron [req-589afc55-91e9-4192-a6f1-74eb65925c11 req-2abc6b4a-275d-4ed6-991c-ba9b9e241ec1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Updating instance_info_cache with network_info: [{"id": "2c9f3e71-2562-4ae0-bf22-d56553a40405", "address": "fa:16:3e:cb:72:0c", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c9f3e71-25", "ovs_interfaceid": "2c9f3e71-2562-4ae0-bf22-d56553a40405", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:50:39 np0005588920 nova_compute[226886]: 2026-01-20 14:50:39.909 226890 DEBUG oslo_concurrency.lockutils [req-589afc55-91e9-4192-a6f1-74eb65925c11 req-2abc6b4a-275d-4ed6-991c-ba9b9e241ec1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-91701d8b-36b9-42fe-a5ae-bf6c9c74fc14" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:50:39 np0005588920 podman[265159]: 2026-01-20 14:50:39.922055084 +0000 UTC m=+0.045368635 container create ecd43435019f47fc52166494679da4faa4655084f129f35d02a7ef2f73503469 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:50:39 np0005588920 systemd[1]: Started libpod-conmon-ecd43435019f47fc52166494679da4faa4655084f129f35d02a7ef2f73503469.scope.
Jan 20 09:50:39 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:50:39 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8547778e3af0e305b2d9b868cb26df82069846e48ececd61b645d71fe6cd5a72/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:50:39 np0005588920 podman[265159]: 2026-01-20 14:50:39.899222828 +0000 UTC m=+0.022536399 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:50:40 np0005588920 podman[265159]: 2026-01-20 14:50:40.002094876 +0000 UTC m=+0.125408447 container init ecd43435019f47fc52166494679da4faa4655084f129f35d02a7ef2f73503469 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:50:40 np0005588920 podman[265159]: 2026-01-20 14:50:40.009605475 +0000 UTC m=+0.132919036 container start ecd43435019f47fc52166494679da4faa4655084f129f35d02a7ef2f73503469 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 20 09:50:40 np0005588920 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[265174]: [NOTICE]   (265178) : New worker (265180) forked
Jan 20 09:50:40 np0005588920 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[265174]: [NOTICE]   (265178) : Loading success.
Jan 20 09:50:40 np0005588920 nova_compute[226886]: 2026-01-20 14:50:40.380 226890 DEBUG nova.compute.manager [req-93f08ed5-d873-4269-bf25-41509ffa99a1 req-d9fa6224-49a5-4288-bcb3-10df6a1fd615 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Received event network-vif-plugged-2c9f3e71-2562-4ae0-bf22-d56553a40405 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:50:40 np0005588920 nova_compute[226886]: 2026-01-20 14:50:40.381 226890 DEBUG oslo_concurrency.lockutils [req-93f08ed5-d873-4269-bf25-41509ffa99a1 req-d9fa6224-49a5-4288-bcb3-10df6a1fd615 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "91701d8b-36b9-42fe-a5ae-bf6c9c74fc14-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:40 np0005588920 nova_compute[226886]: 2026-01-20 14:50:40.382 226890 DEBUG oslo_concurrency.lockutils [req-93f08ed5-d873-4269-bf25-41509ffa99a1 req-d9fa6224-49a5-4288-bcb3-10df6a1fd615 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "91701d8b-36b9-42fe-a5ae-bf6c9c74fc14-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:40 np0005588920 nova_compute[226886]: 2026-01-20 14:50:40.382 226890 DEBUG oslo_concurrency.lockutils [req-93f08ed5-d873-4269-bf25-41509ffa99a1 req-d9fa6224-49a5-4288-bcb3-10df6a1fd615 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "91701d8b-36b9-42fe-a5ae-bf6c9c74fc14-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:40 np0005588920 nova_compute[226886]: 2026-01-20 14:50:40.383 226890 DEBUG nova.compute.manager [req-93f08ed5-d873-4269-bf25-41509ffa99a1 req-d9fa6224-49a5-4288-bcb3-10df6a1fd615 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Processing event network-vif-plugged-2c9f3e71-2562-4ae0-bf22-d56553a40405 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:50:40 np0005588920 nova_compute[226886]: 2026-01-20 14:50:40.384 226890 DEBUG nova.compute.manager [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:50:40 np0005588920 nova_compute[226886]: 2026-01-20 14:50:40.390 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920640.389653, 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:50:40 np0005588920 nova_compute[226886]: 2026-01-20 14:50:40.390 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:50:40 np0005588920 nova_compute[226886]: 2026-01-20 14:50:40.393 226890 DEBUG nova.virt.libvirt.driver [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:50:40 np0005588920 nova_compute[226886]: 2026-01-20 14:50:40.399 226890 INFO nova.virt.libvirt.driver [-] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Instance spawned successfully.#033[00m
Jan 20 09:50:40 np0005588920 nova_compute[226886]: 2026-01-20 14:50:40.401 226890 DEBUG nova.virt.libvirt.driver [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:50:40 np0005588920 nova_compute[226886]: 2026-01-20 14:50:40.425 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:50:40 np0005588920 nova_compute[226886]: 2026-01-20 14:50:40.433 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:50:40 np0005588920 nova_compute[226886]: 2026-01-20 14:50:40.439 226890 DEBUG nova.virt.libvirt.driver [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:50:40 np0005588920 nova_compute[226886]: 2026-01-20 14:50:40.440 226890 DEBUG nova.virt.libvirt.driver [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:50:40 np0005588920 nova_compute[226886]: 2026-01-20 14:50:40.441 226890 DEBUG nova.virt.libvirt.driver [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:50:40 np0005588920 nova_compute[226886]: 2026-01-20 14:50:40.442 226890 DEBUG nova.virt.libvirt.driver [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:50:40 np0005588920 nova_compute[226886]: 2026-01-20 14:50:40.442 226890 DEBUG nova.virt.libvirt.driver [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:50:40 np0005588920 nova_compute[226886]: 2026-01-20 14:50:40.443 226890 DEBUG nova.virt.libvirt.driver [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:50:40 np0005588920 nova_compute[226886]: 2026-01-20 14:50:40.476 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:50:40 np0005588920 nova_compute[226886]: 2026-01-20 14:50:40.503 226890 INFO nova.compute.manager [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Took 7.44 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:50:40 np0005588920 nova_compute[226886]: 2026-01-20 14:50:40.504 226890 DEBUG nova.compute.manager [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:50:40 np0005588920 nova_compute[226886]: 2026-01-20 14:50:40.566 226890 INFO nova.compute.manager [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Took 8.36 seconds to build instance.#033[00m
Jan 20 09:50:40 np0005588920 nova_compute[226886]: 2026-01-20 14:50:40.580 226890 DEBUG oslo_concurrency.lockutils [None req-381352e2-c980-4981-911f-0c3280388b70 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "91701d8b-36b9-42fe-a5ae-bf6c9c74fc14" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.462s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:40.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:40 np0005588920 nova_compute[226886]: 2026-01-20 14:50:40.852 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:41.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:42 np0005588920 nova_compute[226886]: 2026-01-20 14:50:42.459 226890 DEBUG nova.compute.manager [req-54bc073d-7696-4b14-8610-1beb144db1ab req-752b96aa-c3aa-44e8-8ba1-7a4cc083ace9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Received event network-vif-plugged-2c9f3e71-2562-4ae0-bf22-d56553a40405 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:50:42 np0005588920 nova_compute[226886]: 2026-01-20 14:50:42.459 226890 DEBUG oslo_concurrency.lockutils [req-54bc073d-7696-4b14-8610-1beb144db1ab req-752b96aa-c3aa-44e8-8ba1-7a4cc083ace9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "91701d8b-36b9-42fe-a5ae-bf6c9c74fc14-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:42 np0005588920 nova_compute[226886]: 2026-01-20 14:50:42.459 226890 DEBUG oslo_concurrency.lockutils [req-54bc073d-7696-4b14-8610-1beb144db1ab req-752b96aa-c3aa-44e8-8ba1-7a4cc083ace9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "91701d8b-36b9-42fe-a5ae-bf6c9c74fc14-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:42 np0005588920 nova_compute[226886]: 2026-01-20 14:50:42.460 226890 DEBUG oslo_concurrency.lockutils [req-54bc073d-7696-4b14-8610-1beb144db1ab req-752b96aa-c3aa-44e8-8ba1-7a4cc083ace9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "91701d8b-36b9-42fe-a5ae-bf6c9c74fc14-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:42 np0005588920 nova_compute[226886]: 2026-01-20 14:50:42.460 226890 DEBUG nova.compute.manager [req-54bc073d-7696-4b14-8610-1beb144db1ab req-752b96aa-c3aa-44e8-8ba1-7a4cc083ace9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] No waiting events found dispatching network-vif-plugged-2c9f3e71-2562-4ae0-bf22-d56553a40405 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:50:42 np0005588920 nova_compute[226886]: 2026-01-20 14:50:42.460 226890 WARNING nova.compute.manager [req-54bc073d-7696-4b14-8610-1beb144db1ab req-752b96aa-c3aa-44e8-8ba1-7a4cc083ace9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Received unexpected event network-vif-plugged-2c9f3e71-2562-4ae0-bf22-d56553a40405 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:50:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:50:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:50:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:42.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:50:42 np0005588920 nova_compute[226886]: 2026-01-20 14:50:42.953 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:43 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #50. Immutable memtables: 7.
Jan 20 09:50:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:50:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:43.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:50:44 np0005588920 podman[265189]: 2026-01-20 14:50:44.036462368 +0000 UTC m=+0.103439654 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 20 09:50:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e254 e254: 3 total, 3 up, 3 in
Jan 20 09:50:44 np0005588920 nova_compute[226886]: 2026-01-20 14:50:44.591 226890 DEBUG nova.compute.manager [req-210f6cc4-9bfe-4a91-9ba8-51a9315cdaf3 req-8f553dc5-2d38-470d-ab8c-a3580150266d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Received event network-changed-2c9f3e71-2562-4ae0-bf22-d56553a40405 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:50:44 np0005588920 nova_compute[226886]: 2026-01-20 14:50:44.592 226890 DEBUG nova.compute.manager [req-210f6cc4-9bfe-4a91-9ba8-51a9315cdaf3 req-8f553dc5-2d38-470d-ab8c-a3580150266d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Refreshing instance network info cache due to event network-changed-2c9f3e71-2562-4ae0-bf22-d56553a40405. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:50:44 np0005588920 nova_compute[226886]: 2026-01-20 14:50:44.592 226890 DEBUG oslo_concurrency.lockutils [req-210f6cc4-9bfe-4a91-9ba8-51a9315cdaf3 req-8f553dc5-2d38-470d-ab8c-a3580150266d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-91701d8b-36b9-42fe-a5ae-bf6c9c74fc14" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:50:44 np0005588920 nova_compute[226886]: 2026-01-20 14:50:44.593 226890 DEBUG oslo_concurrency.lockutils [req-210f6cc4-9bfe-4a91-9ba8-51a9315cdaf3 req-8f553dc5-2d38-470d-ab8c-a3580150266d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-91701d8b-36b9-42fe-a5ae-bf6c9c74fc14" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:50:44 np0005588920 nova_compute[226886]: 2026-01-20 14:50:44.593 226890 DEBUG nova.network.neutron [req-210f6cc4-9bfe-4a91-9ba8-51a9315cdaf3 req-8f553dc5-2d38-470d-ab8c-a3580150266d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Refreshing network info cache for port 2c9f3e71-2562-4ae0-bf22-d56553a40405 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:50:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:44.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:45 np0005588920 nova_compute[226886]: 2026-01-20 14:50:45.422 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Updating instance_info_cache with network_info: [{"id": "ed97bbce-18dc-4c9b-9a04-919dd3a45a8e", "address": "fa:16:3e:f7:b6:ce", "network": {"id": "ff283be9-fe7c-4cc6-900d-7258ea771ba5", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest-1025807292-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.227", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "taped97bbce-18", "ovs_interfaceid": "ed97bbce-18dc-4c9b-9a04-919dd3a45a8e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "b194a444-cc69-43f2-9931-e9e53ee450c9", "address": "fa:16:3e:2a:e3:2c", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb194a444-cc", "ovs_interfaceid": "b194a444-cc69-43f2-9931-e9e53ee450c9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "10da9204-0ccb-45d0-981d-fdff5c41cda1", "address": "fa:16:3e:8b:cf:d1", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.179", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10da9204-0c", "ovs_interfaceid": "10da9204-0ccb-45d0-981d-fdff5c41cda1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "62ccfbd3-f504-46d0-a4af-ec2dcb7b5764", "address": "fa:16:3e:a0:d2:6d", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.81", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62ccfbd3-f5", "ovs_interfaceid": "62ccfbd3-f504-46d0-a4af-ec2dcb7b5764", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d2be8515-193f-43f4-bae4-d2a509320929", "address": "fa:16:3e:3d:83:a1", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.252", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2be8515-19", "ovs_interfaceid": "d2be8515-193f-43f4-bae4-d2a509320929", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "e10436e2-7916-4b6b-905e-e9be7cb338b9", "address": "fa:16:3e:c9:16:c9", "network": {"id": "76110867-e0cf-4657-99dd-486c8fecc844", "bridge": "br-int", "label": "tempest-device-tagging-net2-1836923782", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape10436e2-79", "ovs_interfaceid": "e10436e2-7916-4b6b-905e-e9be7cb338b9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ea69e1af-9543-4c76-9981-b8475aa031fe", "address": "fa:16:3e:35:77:c5", "network": {"id": "76110867-e0cf-4657-99dd-486c8fecc844", "bridge": "br-int", "label": "tempest-device-tagging-net2-1836923782", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea69e1af-95", "ovs_interfaceid": "ea69e1af-9543-4c76-9981-b8475aa031fe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:50:45 np0005588920 nova_compute[226886]: 2026-01-20 14:50:45.454 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Releasing lock "refresh_cache-f444ccf6-5adb-489a-b174-7450017a351b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:50:45 np0005588920 nova_compute[226886]: 2026-01-20 14:50:45.455 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 09:50:45 np0005588920 nova_compute[226886]: 2026-01-20 14:50:45.455 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:50:45 np0005588920 nova_compute[226886]: 2026-01-20 14:50:45.456 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:50:45 np0005588920 nova_compute[226886]: 2026-01-20 14:50:45.456 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:50:45 np0005588920 nova_compute[226886]: 2026-01-20 14:50:45.456 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:50:45 np0005588920 nova_compute[226886]: 2026-01-20 14:50:45.456 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:50:45 np0005588920 nova_compute[226886]: 2026-01-20 14:50:45.457 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:50:45 np0005588920 nova_compute[226886]: 2026-01-20 14:50:45.457 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:50:45 np0005588920 nova_compute[226886]: 2026-01-20 14:50:45.457 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:50:45 np0005588920 nova_compute[226886]: 2026-01-20 14:50:45.475 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:45 np0005588920 nova_compute[226886]: 2026-01-20 14:50:45.475 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:45 np0005588920 nova_compute[226886]: 2026-01-20 14:50:45.476 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:45 np0005588920 nova_compute[226886]: 2026-01-20 14:50:45.476 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:50:45 np0005588920 nova_compute[226886]: 2026-01-20 14:50:45.476 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:50:45 np0005588920 nova_compute[226886]: 2026-01-20 14:50:45.854 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:45 np0005588920 nova_compute[226886]: 2026-01-20 14:50:45.873 226890 DEBUG nova.network.neutron [req-210f6cc4-9bfe-4a91-9ba8-51a9315cdaf3 req-8f553dc5-2d38-470d-ab8c-a3580150266d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Updated VIF entry in instance network info cache for port 2c9f3e71-2562-4ae0-bf22-d56553a40405. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:50:45 np0005588920 nova_compute[226886]: 2026-01-20 14:50:45.874 226890 DEBUG nova.network.neutron [req-210f6cc4-9bfe-4a91-9ba8-51a9315cdaf3 req-8f553dc5-2d38-470d-ab8c-a3580150266d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Updating instance_info_cache with network_info: [{"id": "2c9f3e71-2562-4ae0-bf22-d56553a40405", "address": "fa:16:3e:cb:72:0c", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c9f3e71-25", "ovs_interfaceid": "2c9f3e71-2562-4ae0-bf22-d56553a40405", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:50:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:45.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:45 np0005588920 nova_compute[226886]: 2026-01-20 14:50:45.889 226890 DEBUG oslo_concurrency.lockutils [req-210f6cc4-9bfe-4a91-9ba8-51a9315cdaf3 req-8f553dc5-2d38-470d-ab8c-a3580150266d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-91701d8b-36b9-42fe-a5ae-bf6c9c74fc14" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:50:45 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:50:45 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/203403028' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:50:45 np0005588920 nova_compute[226886]: 2026-01-20 14:50:45.908 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:50:45 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 20 09:50:45 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:50:45 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:50:45 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:50:45 np0005588920 nova_compute[226886]: 2026-01-20 14:50:45.984 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-0000006f as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:50:45 np0005588920 nova_compute[226886]: 2026-01-20 14:50:45.984 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-0000006f as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:50:45 np0005588920 nova_compute[226886]: 2026-01-20 14:50:45.988 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-0000006b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:50:45 np0005588920 nova_compute[226886]: 2026-01-20 14:50:45.988 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-0000006b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:50:45 np0005588920 nova_compute[226886]: 2026-01-20 14:50:45.988 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-0000006b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:50:45 np0005588920 nova_compute[226886]: 2026-01-20 14:50:45.988 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-0000006b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:50:46 np0005588920 nova_compute[226886]: 2026-01-20 14:50:46.204 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:50:46 np0005588920 nova_compute[226886]: 2026-01-20 14:50:46.205 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4079MB free_disk=20.92176055908203GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:50:46 np0005588920 nova_compute[226886]: 2026-01-20 14:50:46.205 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:46 np0005588920 nova_compute[226886]: 2026-01-20 14:50:46.206 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:46 np0005588920 nova_compute[226886]: 2026-01-20 14:50:46.269 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance f444ccf6-5adb-489a-b174-7450017a351b actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:50:46 np0005588920 nova_compute[226886]: 2026-01-20 14:50:46.270 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:50:46 np0005588920 nova_compute[226886]: 2026-01-20 14:50:46.270 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:50:46 np0005588920 nova_compute[226886]: 2026-01-20 14:50:46.270 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:50:46 np0005588920 nova_compute[226886]: 2026-01-20 14:50:46.317 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:50:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:46.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:46 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:50:46 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2235680583' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:50:46 np0005588920 nova_compute[226886]: 2026-01-20 14:50:46.730 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.414s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:50:46 np0005588920 nova_compute[226886]: 2026-01-20 14:50:46.735 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:50:46 np0005588920 nova_compute[226886]: 2026-01-20 14:50:46.762 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:50:46 np0005588920 nova_compute[226886]: 2026-01-20 14:50:46.783 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:50:46 np0005588920 nova_compute[226886]: 2026-01-20 14:50:46.784 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.578s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e254 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:50:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:47.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:47 np0005588920 nova_compute[226886]: 2026-01-20 14:50:47.957 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:48.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:50:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:49.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:50:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:50.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:50 np0005588920 nova_compute[226886]: 2026-01-20 14:50:50.857 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:51.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e254 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:50:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:52.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:52 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:50:52 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:50:52 np0005588920 nova_compute[226886]: 2026-01-20 14:50:52.961 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:53 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:53Z|00073|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:cb:72:0c 10.100.0.14
Jan 20 09:50:53 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:53Z|00074|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:cb:72:0c 10.100.0.14
Jan 20 09:50:53 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:53.294 144287 DEBUG eventlet.wsgi.server [-] (144287) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m
Jan 20 09:50:53 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:53.295 144287 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /openstack/latest/meta_data.json HTTP/1.0#015
Jan 20 09:50:53 np0005588920 ovn_metadata_agent[144123]: Accept: */*#015
Jan 20 09:50:53 np0005588920 ovn_metadata_agent[144123]: Connection: close#015
Jan 20 09:50:53 np0005588920 ovn_metadata_agent[144123]: Content-Type: text/plain#015
Jan 20 09:50:53 np0005588920 ovn_metadata_agent[144123]: Host: 169.254.169.254#015
Jan 20 09:50:53 np0005588920 ovn_metadata_agent[144123]: User-Agent: curl/7.84.0#015
Jan 20 09:50:53 np0005588920 ovn_metadata_agent[144123]: X-Forwarded-For: 10.100.0.10#015
Jan 20 09:50:53 np0005588920 ovn_metadata_agent[144123]: X-Ovn-Network-Id: ff283be9-fe7c-4cc6-900d-7258ea771ba5 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m
Jan 20 09:50:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:53.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:53 np0005588920 podman[265442]: 2026-01-20 14:50:53.968237426 +0000 UTC m=+0.056436104 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 20 09:50:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:54.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:54 np0005588920 nova_compute[226886]: 2026-01-20 14:50:54.779 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:50:54 np0005588920 nova_compute[226886]: 2026-01-20 14:50:54.779 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:50:55 np0005588920 nova_compute[226886]: 2026-01-20 14:50:55.860 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:55.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:50:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:56.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:50:56 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:56.840 144287 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m
Jan 20 09:50:56 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:56.841 144287 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /openstack/latest/meta_data.json HTTP/1.1" status: 200  len: 2546 time: 3.5464902#033[00m
Jan 20 09:50:56 np0005588920 haproxy-metadata-proxy-ff283be9-fe7c-4cc6-900d-7258ea771ba5[264518]: 10.100.0.10:44682 [20/Jan/2026:14:50:53.293] listener listener/metadata 0/0/0/3548/3548 200 2530 - - ---- 1/1/0/0/0 0/0 "GET /openstack/latest/meta_data.json HTTP/1.1"
Jan 20 09:50:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e254 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:50:57 np0005588920 nova_compute[226886]: 2026-01-20 14:50:57.892 226890 DEBUG oslo_concurrency.lockutils [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Acquiring lock "f444ccf6-5adb-489a-b174-7450017a351b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:57 np0005588920 nova_compute[226886]: 2026-01-20 14:50:57.893 226890 DEBUG oslo_concurrency.lockutils [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:57 np0005588920 nova_compute[226886]: 2026-01-20 14:50:57.893 226890 DEBUG oslo_concurrency.lockutils [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Acquiring lock "f444ccf6-5adb-489a-b174-7450017a351b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:57 np0005588920 nova_compute[226886]: 2026-01-20 14:50:57.893 226890 DEBUG oslo_concurrency.lockutils [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:57 np0005588920 nova_compute[226886]: 2026-01-20 14:50:57.893 226890 DEBUG oslo_concurrency.lockutils [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:57 np0005588920 nova_compute[226886]: 2026-01-20 14:50:57.895 226890 INFO nova.compute.manager [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Terminating instance#033[00m
Jan 20 09:50:57 np0005588920 nova_compute[226886]: 2026-01-20 14:50:57.896 226890 DEBUG nova.compute.manager [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:50:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:57.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:57 np0005588920 nova_compute[226886]: 2026-01-20 14:50:57.964 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:58.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:58 np0005588920 kernel: taped97bbce-18 (unregistering): left promiscuous mode
Jan 20 09:50:58 np0005588920 NetworkManager[49076]: <info>  [1768920658.7659] device (taped97bbce-18): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:50:58 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:58Z|00485|binding|INFO|Releasing lport ed97bbce-18dc-4c9b-9a04-919dd3a45a8e from this chassis (sb_readonly=0)
Jan 20 09:50:58 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:58Z|00486|binding|INFO|Setting lport ed97bbce-18dc-4c9b-9a04-919dd3a45a8e down in Southbound
Jan 20 09:50:58 np0005588920 nova_compute[226886]: 2026-01-20 14:50:58.782 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:58 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:58Z|00487|binding|INFO|Removing iface taped97bbce-18 ovn-installed in OVS
Jan 20 09:50:58 np0005588920 nova_compute[226886]: 2026-01-20 14:50:58.786 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:58 np0005588920 kernel: tapb194a444-cc (unregistering): left promiscuous mode
Jan 20 09:50:58 np0005588920 NetworkManager[49076]: <info>  [1768920658.8066] device (tapb194a444-cc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:50:58 np0005588920 nova_compute[226886]: 2026-01-20 14:50:58.851 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:58.856 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f7:b6:ce 10.100.0.10'], port_security=['fa:16:3e:f7:b6:ce 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'f444ccf6-5adb-489a-b174-7450017a351b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ff283be9-fe7c-4cc6-900d-7258ea771ba5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b15c4e6eb57e4b0ca4e63c85ed92fc5f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '47ff9a81-c98c-4db5-ad99-41dc6ffcd899', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.227'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=15443dea-d8ce-4297-a7fe-a5cbf38bfa28, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=ed97bbce-18dc-4c9b-9a04-919dd3a45a8e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:50:58 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:58Z|00488|binding|INFO|Releasing lport b194a444-cc69-43f2-9931-e9e53ee450c9 from this chassis (sb_readonly=0)
Jan 20 09:50:58 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:58Z|00489|binding|INFO|Setting lport b194a444-cc69-43f2-9931-e9e53ee450c9 down in Southbound
Jan 20 09:50:58 np0005588920 nova_compute[226886]: 2026-01-20 14:50:58.856 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:58 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:58Z|00490|binding|INFO|Removing iface tapb194a444-cc ovn-installed in OVS
Jan 20 09:50:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:58.858 144128 INFO neutron.agent.ovn.metadata.agent [-] Port ed97bbce-18dc-4c9b-9a04-919dd3a45a8e in datapath ff283be9-fe7c-4cc6-900d-7258ea771ba5 unbound from our chassis#033[00m
Jan 20 09:50:58 np0005588920 nova_compute[226886]: 2026-01-20 14:50:58.859 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:58.860 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ff283be9-fe7c-4cc6-900d-7258ea771ba5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:50:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:58.862 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b12c6880-df9d-48f5-a6f0-fde650e8ef39]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:58.863 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2a:e3:2c 10.1.1.14'], port_security=['fa:16:3e:2a:e3:2c 10.1.1.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TaggedBootDevicesTest-1149616215', 'neutron:cidrs': '10.1.1.14/24', 'neutron:device_id': 'f444ccf6-5adb-489a-b174-7450017a351b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cfaa226a-b6e0-41ba-a3f5-d7b004368355', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TaggedBootDevicesTest-1149616215', 'neutron:project_id': 'b15c4e6eb57e4b0ca4e63c85ed92fc5f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2681bf0f-676e-409e-8d3b-a85f151d084a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8379fdc8-7594-4ecf-a54b-2f6eb6ad8d77, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=b194a444-cc69-43f2-9931-e9e53ee450c9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:50:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:58.864 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ff283be9-fe7c-4cc6-900d-7258ea771ba5 namespace which is not needed anymore#033[00m
Jan 20 09:50:58 np0005588920 kernel: tap10da9204-0c (unregistering): left promiscuous mode
Jan 20 09:50:58 np0005588920 NetworkManager[49076]: <info>  [1768920658.8742] device (tap10da9204-0c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:50:58 np0005588920 nova_compute[226886]: 2026-01-20 14:50:58.877 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:58 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:58Z|00491|binding|INFO|Releasing lport 10da9204-0ccb-45d0-981d-fdff5c41cda1 from this chassis (sb_readonly=0)
Jan 20 09:50:58 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:58Z|00492|binding|INFO|Setting lport 10da9204-0ccb-45d0-981d-fdff5c41cda1 down in Southbound
Jan 20 09:50:58 np0005588920 nova_compute[226886]: 2026-01-20 14:50:58.886 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:58 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:58Z|00493|binding|INFO|Removing iface tap10da9204-0c ovn-installed in OVS
Jan 20 09:50:58 np0005588920 nova_compute[226886]: 2026-01-20 14:50:58.888 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:58.896 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8b:cf:d1 10.1.1.179'], port_security=['fa:16:3e:8b:cf:d1 10.1.1.179'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TaggedBootDevicesTest-1615926068', 'neutron:cidrs': '10.1.1.179/24', 'neutron:device_id': 'f444ccf6-5adb-489a-b174-7450017a351b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cfaa226a-b6e0-41ba-a3f5-d7b004368355', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TaggedBootDevicesTest-1615926068', 'neutron:project_id': 'b15c4e6eb57e4b0ca4e63c85ed92fc5f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2681bf0f-676e-409e-8d3b-a85f151d084a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8379fdc8-7594-4ecf-a54b-2f6eb6ad8d77, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=10da9204-0ccb-45d0-981d-fdff5c41cda1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:50:58 np0005588920 kernel: tap62ccfbd3-f5 (unregistering): left promiscuous mode
Jan 20 09:50:58 np0005588920 NetworkManager[49076]: <info>  [1768920658.9049] device (tap62ccfbd3-f5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:50:58 np0005588920 nova_compute[226886]: 2026-01-20 14:50:58.908 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:58 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:58Z|00494|binding|INFO|Releasing lport 62ccfbd3-f504-46d0-a4af-ec2dcb7b5764 from this chassis (sb_readonly=0)
Jan 20 09:50:58 np0005588920 nova_compute[226886]: 2026-01-20 14:50:58.917 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:58 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:58Z|00495|binding|INFO|Setting lport 62ccfbd3-f504-46d0-a4af-ec2dcb7b5764 down in Southbound
Jan 20 09:50:58 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:58Z|00496|binding|INFO|Removing iface tap62ccfbd3-f5 ovn-installed in OVS
Jan 20 09:50:58 np0005588920 kernel: tapd2be8515-19 (unregistering): left promiscuous mode
Jan 20 09:50:58 np0005588920 nova_compute[226886]: 2026-01-20 14:50:58.921 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:58 np0005588920 NetworkManager[49076]: <info>  [1768920658.9239] device (tapd2be8515-19): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:50:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:58.927 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a0:d2:6d 10.1.1.81'], port_security=['fa:16:3e:a0:d2:6d 10.1.1.81'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.1.81/24', 'neutron:device_id': 'f444ccf6-5adb-489a-b174-7450017a351b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cfaa226a-b6e0-41ba-a3f5-d7b004368355', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b15c4e6eb57e4b0ca4e63c85ed92fc5f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '47ff9a81-c98c-4db5-ad99-41dc6ffcd899', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8379fdc8-7594-4ecf-a54b-2f6eb6ad8d77, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=62ccfbd3-f504-46d0-a4af-ec2dcb7b5764) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:50:58 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:58Z|00497|binding|INFO|Releasing lport d2be8515-193f-43f4-bae4-d2a509320929 from this chassis (sb_readonly=0)
Jan 20 09:50:58 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:58Z|00498|binding|INFO|Setting lport d2be8515-193f-43f4-bae4-d2a509320929 down in Southbound
Jan 20 09:50:58 np0005588920 nova_compute[226886]: 2026-01-20 14:50:58.950 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:58 np0005588920 kernel: tape10436e2-79 (unregistering): left promiscuous mode
Jan 20 09:50:58 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:58Z|00499|binding|INFO|Removing iface tapd2be8515-19 ovn-installed in OVS
Jan 20 09:50:58 np0005588920 nova_compute[226886]: 2026-01-20 14:50:58.953 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:58 np0005588920 NetworkManager[49076]: <info>  [1768920658.9560] device (tape10436e2-79): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:50:58 np0005588920 kernel: tapea69e1af-95 (unregistering): left promiscuous mode
Jan 20 09:50:58 np0005588920 NetworkManager[49076]: <info>  [1768920658.9795] device (tapea69e1af-95): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:50:58 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:58Z|00500|binding|INFO|Releasing lport e10436e2-7916-4b6b-905e-e9be7cb338b9 from this chassis (sb_readonly=0)
Jan 20 09:50:58 np0005588920 nova_compute[226886]: 2026-01-20 14:50:58.988 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:58 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:58Z|00501|binding|INFO|Setting lport e10436e2-7916-4b6b-905e-e9be7cb338b9 down in Southbound
Jan 20 09:50:58 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:58Z|00502|binding|INFO|Removing iface tape10436e2-79 ovn-installed in OVS
Jan 20 09:50:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:58.990 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3d:83:a1 10.1.1.252'], port_security=['fa:16:3e:3d:83:a1 10.1.1.252'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.1.252/24', 'neutron:device_id': 'f444ccf6-5adb-489a-b174-7450017a351b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cfaa226a-b6e0-41ba-a3f5-d7b004368355', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b15c4e6eb57e4b0ca4e63c85ed92fc5f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '47ff9a81-c98c-4db5-ad99-41dc6ffcd899', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8379fdc8-7594-4ecf-a54b-2f6eb6ad8d77, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=d2be8515-193f-43f4-bae4-d2a509320929) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:50:58 np0005588920 nova_compute[226886]: 2026-01-20 14:50:58.992 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:58.994 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c9:16:c9 10.2.2.100'], port_security=['fa:16:3e:c9:16:c9 10.2.2.100'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.2.2.100/24', 'neutron:device_id': 'f444ccf6-5adb-489a-b174-7450017a351b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-76110867-e0cf-4657-99dd-486c8fecc844', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b15c4e6eb57e4b0ca4e63c85ed92fc5f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '47ff9a81-c98c-4db5-ad99-41dc6ffcd899', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0117568c-b7e4-4ec7-b573-7c7e6aecac58, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=e10436e2-7916-4b6b-905e-e9be7cb338b9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:50:59 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:59Z|00503|binding|INFO|Releasing lport ea69e1af-9543-4c76-9981-b8475aa031fe from this chassis (sb_readonly=0)
Jan 20 09:50:59 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:59Z|00504|binding|INFO|Setting lport ea69e1af-9543-4c76-9981-b8475aa031fe down in Southbound
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.017 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:59 np0005588920 ovn_controller[133971]: 2026-01-20T14:50:59Z|00505|binding|INFO|Removing iface tapea69e1af-95 ovn-installed in OVS
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.019 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:59.024 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:35:77:c5 10.2.2.200'], port_security=['fa:16:3e:35:77:c5 10.2.2.200'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.2.2.200/24', 'neutron:device_id': 'f444ccf6-5adb-489a-b174-7450017a351b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-76110867-e0cf-4657-99dd-486c8fecc844', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b15c4e6eb57e4b0ca4e63c85ed92fc5f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '47ff9a81-c98c-4db5-ad99-41dc6ffcd899', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0117568c-b7e4-4ec7-b573-7c7e6aecac58, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=ea69e1af-9543-4c76-9981-b8475aa031fe) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.029 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:59 np0005588920 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d0000006b.scope: Deactivated successfully.
Jan 20 09:50:59 np0005588920 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d0000006b.scope: Consumed 17.279s CPU time.
Jan 20 09:50:59 np0005588920 systemd-machined[196121]: Machine qemu-47-instance-0000006b terminated.
Jan 20 09:50:59 np0005588920 NetworkManager[49076]: <info>  [1768920659.1279] manager: (tapb194a444-cc): new Tun device (/org/freedesktop/NetworkManager/Devices/251)
Jan 20 09:50:59 np0005588920 NetworkManager[49076]: <info>  [1768920659.1405] manager: (tap10da9204-0c): new Tun device (/org/freedesktop/NetworkManager/Devices/252)
Jan 20 09:50:59 np0005588920 NetworkManager[49076]: <info>  [1768920659.1499] manager: (tap62ccfbd3-f5): new Tun device (/org/freedesktop/NetworkManager/Devices/253)
Jan 20 09:50:59 np0005588920 NetworkManager[49076]: <info>  [1768920659.1627] manager: (tapd2be8515-19): new Tun device (/org/freedesktop/NetworkManager/Devices/254)
Jan 20 09:50:59 np0005588920 NetworkManager[49076]: <info>  [1768920659.1863] manager: (tapea69e1af-95): new Tun device (/org/freedesktop/NetworkManager/Devices/255)
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.206 226890 INFO nova.virt.libvirt.driver [-] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Instance destroyed successfully.#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.208 226890 DEBUG nova.objects.instance [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Lazy-loading 'resources' on Instance uuid f444ccf6-5adb-489a-b174-7450017a351b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.254 226890 DEBUG nova.virt.libvirt.vif [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:49:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-59996163',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-59996163',id=107,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGiovI3eWhBuKsex9urSvFX3uKzTSBMdGJM+MZZXdjOxuu2em/kXiVf+3Fw7ODXXJEAuGgn6bWpPSlVWZnY7sGK3DnbQgH5/90LwE2A9ResE+BovU1cWvqEkt55sBmeBLw==',key_name='tempest-keypair-854902004',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:50:15Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b15c4e6eb57e4b0ca4e63c85ed92fc5f',ramdisk_id='',reservation_id='r-9tjmx0bx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TaggedBootDevicesTest-228784294',owner_user_name='tempest-TaggedBootDevicesTest-228784294-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:50:15Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1d45e7e42e6d419898780db108ff93ff',uuid=f444ccf6-5adb-489a-b174-7450017a351b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ed97bbce-18dc-4c9b-9a04-919dd3a45a8e", "address": "fa:16:3e:f7:b6:ce", "network": {"id": "ff283be9-fe7c-4cc6-900d-7258ea771ba5", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest-1025807292-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.227", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "taped97bbce-18", "ovs_interfaceid": "ed97bbce-18dc-4c9b-9a04-919dd3a45a8e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.255 226890 DEBUG nova.network.os_vif_util [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Converting VIF {"id": "ed97bbce-18dc-4c9b-9a04-919dd3a45a8e", "address": "fa:16:3e:f7:b6:ce", "network": {"id": "ff283be9-fe7c-4cc6-900d-7258ea771ba5", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest-1025807292-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.227", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "taped97bbce-18", "ovs_interfaceid": "ed97bbce-18dc-4c9b-9a04-919dd3a45a8e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.255 226890 DEBUG nova.network.os_vif_util [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f7:b6:ce,bridge_name='br-int',has_traffic_filtering=True,id=ed97bbce-18dc-4c9b-9a04-919dd3a45a8e,network=Network(ff283be9-fe7c-4cc6-900d-7258ea771ba5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='taped97bbce-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.256 226890 DEBUG os_vif [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f7:b6:ce,bridge_name='br-int',has_traffic_filtering=True,id=ed97bbce-18dc-4c9b-9a04-919dd3a45a8e,network=Network(ff283be9-fe7c-4cc6-900d-7258ea771ba5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='taped97bbce-18') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.257 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.257 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=taped97bbce-18, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.259 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.260 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.277 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.279 226890 INFO os_vif [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f7:b6:ce,bridge_name='br-int',has_traffic_filtering=True,id=ed97bbce-18dc-4c9b-9a04-919dd3a45a8e,network=Network(ff283be9-fe7c-4cc6-900d-7258ea771ba5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='taped97bbce-18')#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.280 226890 DEBUG nova.virt.libvirt.vif [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:49:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-59996163',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-59996163',id=107,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGiovI3eWhBuKsex9urSvFX3uKzTSBMdGJM+MZZXdjOxuu2em/kXiVf+3Fw7ODXXJEAuGgn6bWpPSlVWZnY7sGK3DnbQgH5/90LwE2A9ResE+BovU1cWvqEkt55sBmeBLw==',key_name='tempest-keypair-854902004',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:50:15Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b15c4e6eb57e4b0ca4e63c85ed92fc5f',ramdisk_id='',reservation_id='r-9tjmx0bx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TaggedBootDevicesTest-228784294',owner_user_name='tempest-TaggedBootDevicesTest-228784294-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:50:15Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1d45e7e42e6d419898780db108ff93ff',uuid=f444ccf6-5adb-489a-b174-7450017a351b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b194a444-cc69-43f2-9931-e9e53ee450c9", "address": "fa:16:3e:2a:e3:2c", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb194a444-cc", "ovs_interfaceid": "b194a444-cc69-43f2-9931-e9e53ee450c9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.280 226890 DEBUG nova.network.os_vif_util [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Converting VIF {"id": "b194a444-cc69-43f2-9931-e9e53ee450c9", "address": "fa:16:3e:2a:e3:2c", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb194a444-cc", "ovs_interfaceid": "b194a444-cc69-43f2-9931-e9e53ee450c9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.281 226890 DEBUG nova.network.os_vif_util [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2a:e3:2c,bridge_name='br-int',has_traffic_filtering=True,id=b194a444-cc69-43f2-9931-e9e53ee450c9,network=Network(cfaa226a-b6e0-41ba-a3f5-d7b004368355),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb194a444-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.281 226890 DEBUG os_vif [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:2a:e3:2c,bridge_name='br-int',has_traffic_filtering=True,id=b194a444-cc69-43f2-9931-e9e53ee450c9,network=Network(cfaa226a-b6e0-41ba-a3f5-d7b004368355),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb194a444-cc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.282 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.282 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb194a444-cc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.284 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.285 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.296 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.298 226890 INFO os_vif [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:2a:e3:2c,bridge_name='br-int',has_traffic_filtering=True,id=b194a444-cc69-43f2-9931-e9e53ee450c9,network=Network(cfaa226a-b6e0-41ba-a3f5-d7b004368355),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb194a444-cc')#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.299 226890 DEBUG nova.virt.libvirt.vif [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:49:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-59996163',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-59996163',id=107,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGiovI3eWhBuKsex9urSvFX3uKzTSBMdGJM+MZZXdjOxuu2em/kXiVf+3Fw7ODXXJEAuGgn6bWpPSlVWZnY7sGK3DnbQgH5/90LwE2A9ResE+BovU1cWvqEkt55sBmeBLw==',key_name='tempest-keypair-854902004',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:50:15Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b15c4e6eb57e4b0ca4e63c85ed92fc5f',ramdisk_id='',reservation_id='r-9tjmx0bx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bu
s='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TaggedBootDevicesTest-228784294',owner_user_name='tempest-TaggedBootDevicesTest-228784294-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:50:15Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1d45e7e42e6d419898780db108ff93ff',uuid=f444ccf6-5adb-489a-b174-7450017a351b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "10da9204-0ccb-45d0-981d-fdff5c41cda1", "address": "fa:16:3e:8b:cf:d1", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.179", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10da9204-0c", "ovs_interfaceid": "10da9204-0ccb-45d0-981d-fdff5c41cda1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.299 226890 DEBUG nova.network.os_vif_util [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Converting VIF {"id": "10da9204-0ccb-45d0-981d-fdff5c41cda1", "address": "fa:16:3e:8b:cf:d1", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.179", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10da9204-0c", "ovs_interfaceid": "10da9204-0ccb-45d0-981d-fdff5c41cda1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.300 226890 DEBUG nova.network.os_vif_util [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8b:cf:d1,bridge_name='br-int',has_traffic_filtering=True,id=10da9204-0ccb-45d0-981d-fdff5c41cda1,network=Network(cfaa226a-b6e0-41ba-a3f5-d7b004368355),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap10da9204-0c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.300 226890 DEBUG os_vif [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8b:cf:d1,bridge_name='br-int',has_traffic_filtering=True,id=10da9204-0ccb-45d0-981d-fdff5c41cda1,network=Network(cfaa226a-b6e0-41ba-a3f5-d7b004368355),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap10da9204-0c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.301 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.301 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap10da9204-0c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.302 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.304 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.313 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.315 226890 INFO os_vif [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8b:cf:d1,bridge_name='br-int',has_traffic_filtering=True,id=10da9204-0ccb-45d0-981d-fdff5c41cda1,network=Network(cfaa226a-b6e0-41ba-a3f5-d7b004368355),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap10da9204-0c')#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.316 226890 DEBUG nova.virt.libvirt.vif [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:49:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-59996163',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-59996163',id=107,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGiovI3eWhBuKsex9urSvFX3uKzTSBMdGJM+MZZXdjOxuu2em/kXiVf+3Fw7ODXXJEAuGgn6bWpPSlVWZnY7sGK3DnbQgH5/90LwE2A9ResE+BovU1cWvqEkt55sBmeBLw==',key_name='tempest-keypair-854902004',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:50:15Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b15c4e6eb57e4b0ca4e63c85ed92fc5f',ramdisk_id='',reservation_id='r-9tjmx0bx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bu
s='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TaggedBootDevicesTest-228784294',owner_user_name='tempest-TaggedBootDevicesTest-228784294-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:50:15Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1d45e7e42e6d419898780db108ff93ff',uuid=f444ccf6-5adb-489a-b174-7450017a351b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "62ccfbd3-f504-46d0-a4af-ec2dcb7b5764", "address": "fa:16:3e:a0:d2:6d", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.81", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62ccfbd3-f5", "ovs_interfaceid": "62ccfbd3-f504-46d0-a4af-ec2dcb7b5764", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.316 226890 DEBUG nova.network.os_vif_util [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Converting VIF {"id": "62ccfbd3-f504-46d0-a4af-ec2dcb7b5764", "address": "fa:16:3e:a0:d2:6d", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.81", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62ccfbd3-f5", "ovs_interfaceid": "62ccfbd3-f504-46d0-a4af-ec2dcb7b5764", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.317 226890 DEBUG nova.network.os_vif_util [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a0:d2:6d,bridge_name='br-int',has_traffic_filtering=True,id=62ccfbd3-f504-46d0-a4af-ec2dcb7b5764,network=Network(cfaa226a-b6e0-41ba-a3f5-d7b004368355),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap62ccfbd3-f5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.317 226890 DEBUG os_vif [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a0:d2:6d,bridge_name='br-int',has_traffic_filtering=True,id=62ccfbd3-f504-46d0-a4af-ec2dcb7b5764,network=Network(cfaa226a-b6e0-41ba-a3f5-d7b004368355),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap62ccfbd3-f5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.318 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.318 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62ccfbd3-f5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.319 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.321 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.330 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.332 226890 INFO os_vif [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a0:d2:6d,bridge_name='br-int',has_traffic_filtering=True,id=62ccfbd3-f504-46d0-a4af-ec2dcb7b5764,network=Network(cfaa226a-b6e0-41ba-a3f5-d7b004368355),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap62ccfbd3-f5')#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.333 226890 DEBUG nova.virt.libvirt.vif [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:49:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-59996163',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-59996163',id=107,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGiovI3eWhBuKsex9urSvFX3uKzTSBMdGJM+MZZXdjOxuu2em/kXiVf+3Fw7ODXXJEAuGgn6bWpPSlVWZnY7sGK3DnbQgH5/90LwE2A9ResE+BovU1cWvqEkt55sBmeBLw==',key_name='tempest-keypair-854902004',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:50:15Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b15c4e6eb57e4b0ca4e63c85ed92fc5f',ramdisk_id='',reservation_id='r-9tjmx0bx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bu
s='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TaggedBootDevicesTest-228784294',owner_user_name='tempest-TaggedBootDevicesTest-228784294-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:50:15Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1d45e7e42e6d419898780db108ff93ff',uuid=f444ccf6-5adb-489a-b174-7450017a351b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d2be8515-193f-43f4-bae4-d2a509320929", "address": "fa:16:3e:3d:83:a1", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.252", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2be8515-19", "ovs_interfaceid": "d2be8515-193f-43f4-bae4-d2a509320929", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.333 226890 DEBUG nova.network.os_vif_util [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Converting VIF {"id": "d2be8515-193f-43f4-bae4-d2a509320929", "address": "fa:16:3e:3d:83:a1", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.252", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2be8515-19", "ovs_interfaceid": "d2be8515-193f-43f4-bae4-d2a509320929", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.334 226890 DEBUG nova.network.os_vif_util [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:3d:83:a1,bridge_name='br-int',has_traffic_filtering=True,id=d2be8515-193f-43f4-bae4-d2a509320929,network=Network(cfaa226a-b6e0-41ba-a3f5-d7b004368355),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd2be8515-19') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.334 226890 DEBUG os_vif [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:3d:83:a1,bridge_name='br-int',has_traffic_filtering=True,id=d2be8515-193f-43f4-bae4-d2a509320929,network=Network(cfaa226a-b6e0-41ba-a3f5-d7b004368355),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd2be8515-19') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.335 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.335 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd2be8515-19, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.336 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.338 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.343 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.345 226890 INFO os_vif [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:3d:83:a1,bridge_name='br-int',has_traffic_filtering=True,id=d2be8515-193f-43f4-bae4-d2a509320929,network=Network(cfaa226a-b6e0-41ba-a3f5-d7b004368355),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd2be8515-19')#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.346 226890 DEBUG nova.virt.libvirt.vif [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:49:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-59996163',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-59996163',id=107,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGiovI3eWhBuKsex9urSvFX3uKzTSBMdGJM+MZZXdjOxuu2em/kXiVf+3Fw7ODXXJEAuGgn6bWpPSlVWZnY7sGK3DnbQgH5/90LwE2A9ResE+BovU1cWvqEkt55sBmeBLw==',key_name='tempest-keypair-854902004',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:50:15Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b15c4e6eb57e4b0ca4e63c85ed92fc5f',ramdisk_id='',reservation_id='r-9tjmx0bx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bu
s='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TaggedBootDevicesTest-228784294',owner_user_name='tempest-TaggedBootDevicesTest-228784294-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:50:15Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1d45e7e42e6d419898780db108ff93ff',uuid=f444ccf6-5adb-489a-b174-7450017a351b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e10436e2-7916-4b6b-905e-e9be7cb338b9", "address": "fa:16:3e:c9:16:c9", "network": {"id": "76110867-e0cf-4657-99dd-486c8fecc844", "bridge": "br-int", "label": "tempest-device-tagging-net2-1836923782", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape10436e2-79", "ovs_interfaceid": "e10436e2-7916-4b6b-905e-e9be7cb338b9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.347 226890 DEBUG nova.network.os_vif_util [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Converting VIF {"id": "e10436e2-7916-4b6b-905e-e9be7cb338b9", "address": "fa:16:3e:c9:16:c9", "network": {"id": "76110867-e0cf-4657-99dd-486c8fecc844", "bridge": "br-int", "label": "tempest-device-tagging-net2-1836923782", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape10436e2-79", "ovs_interfaceid": "e10436e2-7916-4b6b-905e-e9be7cb338b9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.347 226890 DEBUG nova.network.os_vif_util [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c9:16:c9,bridge_name='br-int',has_traffic_filtering=True,id=e10436e2-7916-4b6b-905e-e9be7cb338b9,network=Network(76110867-e0cf-4657-99dd-486c8fecc844),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape10436e2-79') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.347 226890 DEBUG os_vif [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c9:16:c9,bridge_name='br-int',has_traffic_filtering=True,id=e10436e2-7916-4b6b-905e-e9be7cb338b9,network=Network(76110867-e0cf-4657-99dd-486c8fecc844),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape10436e2-79') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.348 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.349 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape10436e2-79, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.350 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.351 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.354 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.355 226890 INFO os_vif [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c9:16:c9,bridge_name='br-int',has_traffic_filtering=True,id=e10436e2-7916-4b6b-905e-e9be7cb338b9,network=Network(76110867-e0cf-4657-99dd-486c8fecc844),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape10436e2-79')#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.356 226890 DEBUG nova.virt.libvirt.vif [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:49:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-59996163',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-device-tagging-server-59996163',id=107,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGiovI3eWhBuKsex9urSvFX3uKzTSBMdGJM+MZZXdjOxuu2em/kXiVf+3Fw7ODXXJEAuGgn6bWpPSlVWZnY7sGK3DnbQgH5/90LwE2A9ResE+BovU1cWvqEkt55sBmeBLw==',key_name='tempest-keypair-854902004',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:50:15Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b15c4e6eb57e4b0ca4e63c85ed92fc5f',ramdisk_id='',reservation_id='r-9tjmx0bx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bu
s='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TaggedBootDevicesTest-228784294',owner_user_name='tempest-TaggedBootDevicesTest-228784294-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:50:15Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1d45e7e42e6d419898780db108ff93ff',uuid=f444ccf6-5adb-489a-b174-7450017a351b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ea69e1af-9543-4c76-9981-b8475aa031fe", "address": "fa:16:3e:35:77:c5", "network": {"id": "76110867-e0cf-4657-99dd-486c8fecc844", "bridge": "br-int", "label": "tempest-device-tagging-net2-1836923782", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea69e1af-95", "ovs_interfaceid": "ea69e1af-9543-4c76-9981-b8475aa031fe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.356 226890 DEBUG nova.network.os_vif_util [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Converting VIF {"id": "ea69e1af-9543-4c76-9981-b8475aa031fe", "address": "fa:16:3e:35:77:c5", "network": {"id": "76110867-e0cf-4657-99dd-486c8fecc844", "bridge": "br-int", "label": "tempest-device-tagging-net2-1836923782", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea69e1af-95", "ovs_interfaceid": "ea69e1af-9543-4c76-9981-b8475aa031fe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.357 226890 DEBUG nova.network.os_vif_util [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:35:77:c5,bridge_name='br-int',has_traffic_filtering=True,id=ea69e1af-9543-4c76-9981-b8475aa031fe,network=Network(76110867-e0cf-4657-99dd-486c8fecc844),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea69e1af-95') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.357 226890 DEBUG os_vif [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:35:77:c5,bridge_name='br-int',has_traffic_filtering=True,id=ea69e1af-9543-4c76-9981-b8475aa031fe,network=Network(76110867-e0cf-4657-99dd-486c8fecc844),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea69e1af-95') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.358 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.358 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapea69e1af-95, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.359 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.361 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.362 226890 INFO os_vif [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:35:77:c5,bridge_name='br-int',has_traffic_filtering=True,id=ea69e1af-9543-4c76-9981-b8475aa031fe,network=Network(76110867-e0cf-4657-99dd-486c8fecc844),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea69e1af-95')#033[00m
Jan 20 09:50:59 np0005588920 neutron-haproxy-ovnmeta-ff283be9-fe7c-4cc6-900d-7258ea771ba5[264509]: [NOTICE]   (264515) : haproxy version is 2.8.14-c23fe91
Jan 20 09:50:59 np0005588920 neutron-haproxy-ovnmeta-ff283be9-fe7c-4cc6-900d-7258ea771ba5[264509]: [NOTICE]   (264515) : path to executable is /usr/sbin/haproxy
Jan 20 09:50:59 np0005588920 neutron-haproxy-ovnmeta-ff283be9-fe7c-4cc6-900d-7258ea771ba5[264509]: [WARNING]  (264515) : Exiting Master process...
Jan 20 09:50:59 np0005588920 neutron-haproxy-ovnmeta-ff283be9-fe7c-4cc6-900d-7258ea771ba5[264509]: [ALERT]    (264515) : Current worker (264518) exited with code 143 (Terminated)
Jan 20 09:50:59 np0005588920 neutron-haproxy-ovnmeta-ff283be9-fe7c-4cc6-900d-7258ea771ba5[264509]: [WARNING]  (264515) : All workers exited. Exiting... (0)
Jan 20 09:50:59 np0005588920 systemd[1]: libpod-025fc60e0c754131db8894879bf8a3819fb6dc5f01d317b752cd4d15778efd61.scope: Deactivated successfully.
Jan 20 09:50:59 np0005588920 podman[265504]: 2026-01-20 14:50:59.45192066 +0000 UTC m=+0.491473631 container died 025fc60e0c754131db8894879bf8a3819fb6dc5f01d317b752cd4d15778efd61 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ff283be9-fe7c-4cc6-900d-7258ea771ba5, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:50:59 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-025fc60e0c754131db8894879bf8a3819fb6dc5f01d317b752cd4d15778efd61-userdata-shm.mount: Deactivated successfully.
Jan 20 09:50:59 np0005588920 systemd[1]: var-lib-containers-storage-overlay-d73cbc4f713f6586fdc8003c7d2ef301cffb202142b5ff8961b3a588c1f5dfeb-merged.mount: Deactivated successfully.
Jan 20 09:50:59 np0005588920 podman[265504]: 2026-01-20 14:50:59.742219663 +0000 UTC m=+0.781772644 container cleanup 025fc60e0c754131db8894879bf8a3819fb6dc5f01d317b752cd4d15778efd61 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ff283be9-fe7c-4cc6-900d-7258ea771ba5, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 20 09:50:59 np0005588920 systemd[1]: libpod-conmon-025fc60e0c754131db8894879bf8a3819fb6dc5f01d317b752cd4d15778efd61.scope: Deactivated successfully.
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.813 226890 INFO nova.virt.libvirt.driver [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Deleting instance files /var/lib/nova/instances/f444ccf6-5adb-489a-b174-7450017a351b_del#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.814 226890 INFO nova.virt.libvirt.driver [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Deletion of /var/lib/nova/instances/f444ccf6-5adb-489a-b174-7450017a351b_del complete#033[00m
Jan 20 09:50:59 np0005588920 podman[265674]: 2026-01-20 14:50:59.817208213 +0000 UTC m=+0.047478504 container remove 025fc60e0c754131db8894879bf8a3819fb6dc5f01d317b752cd4d15778efd61 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ff283be9-fe7c-4cc6-900d-7258ea771ba5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 20 09:50:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:59.824 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[dc26b202-87ba-4ce2-8ca3-5ab75122ae06]: (4, ('Tue Jan 20 02:50:58 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ff283be9-fe7c-4cc6-900d-7258ea771ba5 (025fc60e0c754131db8894879bf8a3819fb6dc5f01d317b752cd4d15778efd61)\n025fc60e0c754131db8894879bf8a3819fb6dc5f01d317b752cd4d15778efd61\nTue Jan 20 02:50:59 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ff283be9-fe7c-4cc6-900d-7258ea771ba5 (025fc60e0c754131db8894879bf8a3819fb6dc5f01d317b752cd4d15778efd61)\n025fc60e0c754131db8894879bf8a3819fb6dc5f01d317b752cd4d15778efd61\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:59.825 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e604c4e0-5a6e-4c4a-8e99-2d11ce2104df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:59.826 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapff283be9-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:59 np0005588920 kernel: tapff283be9-f0: left promiscuous mode
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.828 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.846 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:59.849 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d5e9f19a-2429-4713-ac6f-211538615a3d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:59.868 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[211727d2-d483-4dc1-80d1-50dd59b16142]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.870 226890 INFO nova.compute.manager [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Took 1.97 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.870 226890 DEBUG oslo.service.loopingcall [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.871 226890 DEBUG nova.compute.manager [-] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:50:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:59.870 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[628e83b5-5fa5-4b46-afe2-fca9d90e9d53]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:59 np0005588920 nova_compute[226886]: 2026-01-20 14:50:59.871 226890 DEBUG nova.network.neutron [-] [instance: f444ccf6-5adb-489a-b174-7450017a351b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:50:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:59.891 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[13fdf13f-a5f1-46da-b1dc-a6432505dbff]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 565461, 'reachable_time': 30164, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 265690, 'error': None, 'target': 'ovnmeta-ff283be9-fe7c-4cc6-900d-7258ea771ba5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:59.894 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ff283be9-fe7c-4cc6-900d-7258ea771ba5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:50:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:59.894 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[88200d0f-6a5b-49a6-b81f-9c3138e6a4a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:59.895 144128 INFO neutron.agent.ovn.metadata.agent [-] Port b194a444-cc69-43f2-9931-e9e53ee450c9 in datapath cfaa226a-b6e0-41ba-a3f5-d7b004368355 unbound from our chassis#033[00m
Jan 20 09:50:59 np0005588920 systemd[1]: run-netns-ovnmeta\x2dff283be9\x2dfe7c\x2d4cc6\x2d900d\x2d7258ea771ba5.mount: Deactivated successfully.
Jan 20 09:50:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:59.897 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cfaa226a-b6e0-41ba-a3f5-d7b004368355, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:50:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:59.898 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[554eef7c-b51b-4870-ae86-b1440513b086]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:50:59.899 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-cfaa226a-b6e0-41ba-a3f5-d7b004368355 namespace which is not needed anymore#033[00m
Jan 20 09:50:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:50:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:59.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:00 np0005588920 neutron-haproxy-ovnmeta-cfaa226a-b6e0-41ba-a3f5-d7b004368355[264586]: [NOTICE]   (264591) : haproxy version is 2.8.14-c23fe91
Jan 20 09:51:00 np0005588920 neutron-haproxy-ovnmeta-cfaa226a-b6e0-41ba-a3f5-d7b004368355[264586]: [NOTICE]   (264591) : path to executable is /usr/sbin/haproxy
Jan 20 09:51:00 np0005588920 neutron-haproxy-ovnmeta-cfaa226a-b6e0-41ba-a3f5-d7b004368355[264586]: [WARNING]  (264591) : Exiting Master process...
Jan 20 09:51:00 np0005588920 neutron-haproxy-ovnmeta-cfaa226a-b6e0-41ba-a3f5-d7b004368355[264586]: [ALERT]    (264591) : Current worker (264593) exited with code 143 (Terminated)
Jan 20 09:51:00 np0005588920 neutron-haproxy-ovnmeta-cfaa226a-b6e0-41ba-a3f5-d7b004368355[264586]: [WARNING]  (264591) : All workers exited. Exiting... (0)
Jan 20 09:51:00 np0005588920 systemd[1]: libpod-f425d8b2628e69529ac6f1b2c808868f6f5796de66c92c7692c41e78a6d45ea4.scope: Deactivated successfully.
Jan 20 09:51:00 np0005588920 podman[265708]: 2026-01-20 14:51:00.037963346 +0000 UTC m=+0.045898770 container died f425d8b2628e69529ac6f1b2c808868f6f5796de66c92c7692c41e78a6d45ea4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cfaa226a-b6e0-41ba-a3f5-d7b004368355, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 20 09:51:00 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f425d8b2628e69529ac6f1b2c808868f6f5796de66c92c7692c41e78a6d45ea4-userdata-shm.mount: Deactivated successfully.
Jan 20 09:51:00 np0005588920 systemd[1]: var-lib-containers-storage-overlay-0dc8c79443c86075c67bf9bb3d9c08f01c14647c4fad70ca2f6d5fe0d4dace2e-merged.mount: Deactivated successfully.
Jan 20 09:51:00 np0005588920 podman[265708]: 2026-01-20 14:51:00.071647015 +0000 UTC m=+0.079582429 container cleanup f425d8b2628e69529ac6f1b2c808868f6f5796de66c92c7692c41e78a6d45ea4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cfaa226a-b6e0-41ba-a3f5-d7b004368355, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:51:00 np0005588920 systemd[1]: libpod-conmon-f425d8b2628e69529ac6f1b2c808868f6f5796de66c92c7692c41e78a6d45ea4.scope: Deactivated successfully.
Jan 20 09:51:00 np0005588920 podman[265736]: 2026-01-20 14:51:00.136993356 +0000 UTC m=+0.042646579 container remove f425d8b2628e69529ac6f1b2c808868f6f5796de66c92c7692c41e78a6d45ea4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cfaa226a-b6e0-41ba-a3f5-d7b004368355, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 20 09:51:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:00.143 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ff564210-3a77-423d-926a-43909cda8a8c]: (4, ('Tue Jan 20 02:50:59 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-cfaa226a-b6e0-41ba-a3f5-d7b004368355 (f425d8b2628e69529ac6f1b2c808868f6f5796de66c92c7692c41e78a6d45ea4)\nf425d8b2628e69529ac6f1b2c808868f6f5796de66c92c7692c41e78a6d45ea4\nTue Jan 20 02:51:00 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-cfaa226a-b6e0-41ba-a3f5-d7b004368355 (f425d8b2628e69529ac6f1b2c808868f6f5796de66c92c7692c41e78a6d45ea4)\nf425d8b2628e69529ac6f1b2c808868f6f5796de66c92c7692c41e78a6d45ea4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:00.144 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[8b42f323-12e2-4b0f-a445-dabe18ff28d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:00.145 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcfaa226a-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:51:00 np0005588920 kernel: tapcfaa226a-b0: left promiscuous mode
Jan 20 09:51:00 np0005588920 nova_compute[226886]: 2026-01-20 14:51:00.148 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:00 np0005588920 nova_compute[226886]: 2026-01-20 14:51:00.160 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:00.163 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f4a5fcc7-2ca5-47eb-ba33-dd1be62b9e60]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:00.182 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[6cea260f-ce50-4754-a9b3-fe39a87afe78]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:00.183 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[53a36849-230a-426f-b724-2c17cca36a58]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:00.196 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c780790b-ff7f-409b-a2aa-fbe8b068dffe]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 565550, 'reachable_time': 30566, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 265751, 'error': None, 'target': 'ovnmeta-cfaa226a-b6e0-41ba-a3f5-d7b004368355', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:00.198 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-cfaa226a-b6e0-41ba-a3f5-d7b004368355 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:51:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:00.199 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[dc53eca1-a254-49e1-84ab-a05271b12454]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:00.199 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 10da9204-0ccb-45d0-981d-fdff5c41cda1 in datapath cfaa226a-b6e0-41ba-a3f5-d7b004368355 unbound from our chassis#033[00m
Jan 20 09:51:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:00.201 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cfaa226a-b6e0-41ba-a3f5-d7b004368355, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:51:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:00.201 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2f552dd9-586f-4191-980b-d692475ef5b3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:00.201 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 62ccfbd3-f504-46d0-a4af-ec2dcb7b5764 in datapath cfaa226a-b6e0-41ba-a3f5-d7b004368355 unbound from our chassis#033[00m
Jan 20 09:51:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:00.203 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cfaa226a-b6e0-41ba-a3f5-d7b004368355, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:51:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:00.203 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[363eb806-0acd-42a7-acd1-1da537725feb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:00.204 144128 INFO neutron.agent.ovn.metadata.agent [-] Port d2be8515-193f-43f4-bae4-d2a509320929 in datapath cfaa226a-b6e0-41ba-a3f5-d7b004368355 unbound from our chassis#033[00m
Jan 20 09:51:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:00.205 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cfaa226a-b6e0-41ba-a3f5-d7b004368355, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:51:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:00.205 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[28439c72-da55-4056-a604-09340ad0435d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:00.205 144128 INFO neutron.agent.ovn.metadata.agent [-] Port e10436e2-7916-4b6b-905e-e9be7cb338b9 in datapath 76110867-e0cf-4657-99dd-486c8fecc844 unbound from our chassis#033[00m
Jan 20 09:51:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:00.207 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 76110867-e0cf-4657-99dd-486c8fecc844, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:51:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:00.207 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[9e780c83-b2d0-4a83-a0d4-226ada7aacf8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:00.208 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-76110867-e0cf-4657-99dd-486c8fecc844 namespace which is not needed anymore#033[00m
Jan 20 09:51:00 np0005588920 neutron-haproxy-ovnmeta-76110867-e0cf-4657-99dd-486c8fecc844[264680]: [NOTICE]   (264684) : haproxy version is 2.8.14-c23fe91
Jan 20 09:51:00 np0005588920 neutron-haproxy-ovnmeta-76110867-e0cf-4657-99dd-486c8fecc844[264680]: [NOTICE]   (264684) : path to executable is /usr/sbin/haproxy
Jan 20 09:51:00 np0005588920 neutron-haproxy-ovnmeta-76110867-e0cf-4657-99dd-486c8fecc844[264680]: [WARNING]  (264684) : Exiting Master process...
Jan 20 09:51:00 np0005588920 neutron-haproxy-ovnmeta-76110867-e0cf-4657-99dd-486c8fecc844[264680]: [ALERT]    (264684) : Current worker (264686) exited with code 143 (Terminated)
Jan 20 09:51:00 np0005588920 neutron-haproxy-ovnmeta-76110867-e0cf-4657-99dd-486c8fecc844[264680]: [WARNING]  (264684) : All workers exited. Exiting... (0)
Jan 20 09:51:00 np0005588920 systemd[1]: libpod-5a9d6484e563a447fe3635abfc7c55326e0a4556ae2012c83558b440a2a4875e.scope: Deactivated successfully.
Jan 20 09:51:00 np0005588920 podman[265769]: 2026-01-20 14:51:00.328627259 +0000 UTC m=+0.042500046 container died 5a9d6484e563a447fe3635abfc7c55326e0a4556ae2012c83558b440a2a4875e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-76110867-e0cf-4657-99dd-486c8fecc844, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:51:00 np0005588920 podman[265769]: 2026-01-20 14:51:00.365047434 +0000 UTC m=+0.078920201 container cleanup 5a9d6484e563a447fe3635abfc7c55326e0a4556ae2012c83558b440a2a4875e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-76110867-e0cf-4657-99dd-486c8fecc844, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:51:00 np0005588920 systemd[1]: libpod-conmon-5a9d6484e563a447fe3635abfc7c55326e0a4556ae2012c83558b440a2a4875e.scope: Deactivated successfully.
Jan 20 09:51:00 np0005588920 podman[265800]: 2026-01-20 14:51:00.419299426 +0000 UTC m=+0.036966051 container remove 5a9d6484e563a447fe3635abfc7c55326e0a4556ae2012c83558b440a2a4875e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-76110867-e0cf-4657-99dd-486c8fecc844, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 20 09:51:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:00.425 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[6d5c56ac-2a04-4d0d-a175-3ea047b8357a]: (4, ('Tue Jan 20 02:51:00 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-76110867-e0cf-4657-99dd-486c8fecc844 (5a9d6484e563a447fe3635abfc7c55326e0a4556ae2012c83558b440a2a4875e)\n5a9d6484e563a447fe3635abfc7c55326e0a4556ae2012c83558b440a2a4875e\nTue Jan 20 02:51:00 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-76110867-e0cf-4657-99dd-486c8fecc844 (5a9d6484e563a447fe3635abfc7c55326e0a4556ae2012c83558b440a2a4875e)\n5a9d6484e563a447fe3635abfc7c55326e0a4556ae2012c83558b440a2a4875e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:00.427 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[9a05b438-e144-490a-8488-7eb4c279a235]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:00.428 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap76110867-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:51:00 np0005588920 nova_compute[226886]: 2026-01-20 14:51:00.429 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:00 np0005588920 kernel: tap76110867-e0: left promiscuous mode
Jan 20 09:51:00 np0005588920 nova_compute[226886]: 2026-01-20 14:51:00.442 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:00.446 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e87bdd22-04ed-4a54-8728-342f7d34e9d6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:00.473 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[dfd9a1dd-f00c-4bb7-8b5f-6b009fb6246e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:00.474 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[aecc5c51-0f30-483a-ae67-d49630e7888d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:00.493 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[01898b4c-7424-4c27-b695-162fe59883da]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 565665, 'reachable_time': 44166, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 265815, 'error': None, 'target': 'ovnmeta-76110867-e0cf-4657-99dd-486c8fecc844', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:00.495 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-76110867-e0cf-4657-99dd-486c8fecc844 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:51:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:00.495 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[0fde07e3-ac86-4f93-954b-7c6410229495]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:00.496 144128 INFO neutron.agent.ovn.metadata.agent [-] Port ea69e1af-9543-4c76-9981-b8475aa031fe in datapath 76110867-e0cf-4657-99dd-486c8fecc844 unbound from our chassis#033[00m
Jan 20 09:51:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:00.497 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 76110867-e0cf-4657-99dd-486c8fecc844, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:51:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:00.498 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[41d1b4ab-d990-425a-a531-973243d1ebf4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:51:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:00.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:51:00 np0005588920 systemd[1]: var-lib-containers-storage-overlay-9a07298257a630114c8a4c22ea64dc59d68781fadc5f960af57c6b4e36e1518f-merged.mount: Deactivated successfully.
Jan 20 09:51:00 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5a9d6484e563a447fe3635abfc7c55326e0a4556ae2012c83558b440a2a4875e-userdata-shm.mount: Deactivated successfully.
Jan 20 09:51:00 np0005588920 systemd[1]: run-netns-ovnmeta\x2d76110867\x2de0cf\x2d4657\x2d99dd\x2d486c8fecc844.mount: Deactivated successfully.
Jan 20 09:51:00 np0005588920 systemd[1]: run-netns-ovnmeta\x2dcfaa226a\x2db6e0\x2d41ba\x2da3f5\x2dd7b004368355.mount: Deactivated successfully.
Jan 20 09:51:00 np0005588920 nova_compute[226886]: 2026-01-20 14:51:00.863 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:01 np0005588920 nova_compute[226886]: 2026-01-20 14:51:01.379 226890 DEBUG nova.compute.manager [req-a6655175-2b32-4380-bd2c-930c186faa1b req-92922cd9-55ef-469c-8b48-77953f7c972b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received event network-vif-unplugged-b194a444-cc69-43f2-9931-e9e53ee450c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:51:01 np0005588920 nova_compute[226886]: 2026-01-20 14:51:01.379 226890 DEBUG oslo_concurrency.lockutils [req-a6655175-2b32-4380-bd2c-930c186faa1b req-92922cd9-55ef-469c-8b48-77953f7c972b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f444ccf6-5adb-489a-b174-7450017a351b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:01 np0005588920 nova_compute[226886]: 2026-01-20 14:51:01.380 226890 DEBUG oslo_concurrency.lockutils [req-a6655175-2b32-4380-bd2c-930c186faa1b req-92922cd9-55ef-469c-8b48-77953f7c972b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:01 np0005588920 nova_compute[226886]: 2026-01-20 14:51:01.380 226890 DEBUG oslo_concurrency.lockutils [req-a6655175-2b32-4380-bd2c-930c186faa1b req-92922cd9-55ef-469c-8b48-77953f7c972b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:01 np0005588920 nova_compute[226886]: 2026-01-20 14:51:01.380 226890 DEBUG nova.compute.manager [req-a6655175-2b32-4380-bd2c-930c186faa1b req-92922cd9-55ef-469c-8b48-77953f7c972b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] No waiting events found dispatching network-vif-unplugged-b194a444-cc69-43f2-9931-e9e53ee450c9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:51:01 np0005588920 nova_compute[226886]: 2026-01-20 14:51:01.380 226890 DEBUG nova.compute.manager [req-a6655175-2b32-4380-bd2c-930c186faa1b req-92922cd9-55ef-469c-8b48-77953f7c972b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received event network-vif-unplugged-b194a444-cc69-43f2-9931-e9e53ee450c9 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:51:01 np0005588920 nova_compute[226886]: 2026-01-20 14:51:01.443 226890 DEBUG oslo_concurrency.lockutils [None req-69248774-11bc-475a-9af3-540082bcdd43 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "91701d8b-36b9-42fe-a5ae-bf6c9c74fc14" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:01 np0005588920 nova_compute[226886]: 2026-01-20 14:51:01.444 226890 DEBUG oslo_concurrency.lockutils [None req-69248774-11bc-475a-9af3-540082bcdd43 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "91701d8b-36b9-42fe-a5ae-bf6c9c74fc14" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:01 np0005588920 nova_compute[226886]: 2026-01-20 14:51:01.444 226890 DEBUG nova.compute.manager [None req-69248774-11bc-475a-9af3-540082bcdd43 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:51:01 np0005588920 nova_compute[226886]: 2026-01-20 14:51:01.450 226890 DEBUG nova.compute.manager [None req-69248774-11bc-475a-9af3-540082bcdd43 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m
Jan 20 09:51:01 np0005588920 nova_compute[226886]: 2026-01-20 14:51:01.451 226890 DEBUG nova.objects.instance [None req-69248774-11bc-475a-9af3-540082bcdd43 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'flavor' on Instance uuid 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:51:01 np0005588920 nova_compute[226886]: 2026-01-20 14:51:01.480 226890 DEBUG nova.virt.libvirt.driver [None req-69248774-11bc-475a-9af3-540082bcdd43 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 20 09:51:01 np0005588920 nova_compute[226886]: 2026-01-20 14:51:01.508 226890 DEBUG nova.compute.manager [req-ded2f584-c24b-4840-8e53-45699b65285b req-bac162b3-5011-4188-8882-bf39ec66a2d3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received event network-vif-unplugged-ed97bbce-18dc-4c9b-9a04-919dd3a45a8e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:51:01 np0005588920 nova_compute[226886]: 2026-01-20 14:51:01.508 226890 DEBUG oslo_concurrency.lockutils [req-ded2f584-c24b-4840-8e53-45699b65285b req-bac162b3-5011-4188-8882-bf39ec66a2d3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f444ccf6-5adb-489a-b174-7450017a351b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:01 np0005588920 nova_compute[226886]: 2026-01-20 14:51:01.508 226890 DEBUG oslo_concurrency.lockutils [req-ded2f584-c24b-4840-8e53-45699b65285b req-bac162b3-5011-4188-8882-bf39ec66a2d3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:01 np0005588920 nova_compute[226886]: 2026-01-20 14:51:01.509 226890 DEBUG oslo_concurrency.lockutils [req-ded2f584-c24b-4840-8e53-45699b65285b req-bac162b3-5011-4188-8882-bf39ec66a2d3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:01 np0005588920 nova_compute[226886]: 2026-01-20 14:51:01.509 226890 DEBUG nova.compute.manager [req-ded2f584-c24b-4840-8e53-45699b65285b req-bac162b3-5011-4188-8882-bf39ec66a2d3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] No waiting events found dispatching network-vif-unplugged-ed97bbce-18dc-4c9b-9a04-919dd3a45a8e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:51:01 np0005588920 nova_compute[226886]: 2026-01-20 14:51:01.509 226890 DEBUG nova.compute.manager [req-ded2f584-c24b-4840-8e53-45699b65285b req-bac162b3-5011-4188-8882-bf39ec66a2d3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received event network-vif-unplugged-ed97bbce-18dc-4c9b-9a04-919dd3a45a8e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:51:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:01.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e255 e255: 3 total, 3 up, 3 in
Jan 20 09:51:02 np0005588920 nova_compute[226886]: 2026-01-20 14:51:02.172 226890 DEBUG nova.compute.manager [req-a4716a9a-1d97-4c91-b400-49abdda7f61e req-fa538d9e-0d16-4d66-a0af-9eb289f76d5c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received event network-vif-unplugged-10da9204-0ccb-45d0-981d-fdff5c41cda1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:51:02 np0005588920 nova_compute[226886]: 2026-01-20 14:51:02.173 226890 DEBUG oslo_concurrency.lockutils [req-a4716a9a-1d97-4c91-b400-49abdda7f61e req-fa538d9e-0d16-4d66-a0af-9eb289f76d5c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f444ccf6-5adb-489a-b174-7450017a351b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:02 np0005588920 nova_compute[226886]: 2026-01-20 14:51:02.173 226890 DEBUG oslo_concurrency.lockutils [req-a4716a9a-1d97-4c91-b400-49abdda7f61e req-fa538d9e-0d16-4d66-a0af-9eb289f76d5c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:02 np0005588920 nova_compute[226886]: 2026-01-20 14:51:02.173 226890 DEBUG oslo_concurrency.lockutils [req-a4716a9a-1d97-4c91-b400-49abdda7f61e req-fa538d9e-0d16-4d66-a0af-9eb289f76d5c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:02 np0005588920 nova_compute[226886]: 2026-01-20 14:51:02.173 226890 DEBUG nova.compute.manager [req-a4716a9a-1d97-4c91-b400-49abdda7f61e req-fa538d9e-0d16-4d66-a0af-9eb289f76d5c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] No waiting events found dispatching network-vif-unplugged-10da9204-0ccb-45d0-981d-fdff5c41cda1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:51:02 np0005588920 nova_compute[226886]: 2026-01-20 14:51:02.174 226890 DEBUG nova.compute.manager [req-a4716a9a-1d97-4c91-b400-49abdda7f61e req-fa538d9e-0d16-4d66-a0af-9eb289f76d5c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received event network-vif-unplugged-10da9204-0ccb-45d0-981d-fdff5c41cda1 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:51:02 np0005588920 nova_compute[226886]: 2026-01-20 14:51:02.174 226890 DEBUG nova.compute.manager [req-a4716a9a-1d97-4c91-b400-49abdda7f61e req-fa538d9e-0d16-4d66-a0af-9eb289f76d5c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received event network-vif-plugged-10da9204-0ccb-45d0-981d-fdff5c41cda1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:51:02 np0005588920 nova_compute[226886]: 2026-01-20 14:51:02.174 226890 DEBUG oslo_concurrency.lockutils [req-a4716a9a-1d97-4c91-b400-49abdda7f61e req-fa538d9e-0d16-4d66-a0af-9eb289f76d5c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f444ccf6-5adb-489a-b174-7450017a351b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:02 np0005588920 nova_compute[226886]: 2026-01-20 14:51:02.175 226890 DEBUG oslo_concurrency.lockutils [req-a4716a9a-1d97-4c91-b400-49abdda7f61e req-fa538d9e-0d16-4d66-a0af-9eb289f76d5c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:02 np0005588920 nova_compute[226886]: 2026-01-20 14:51:02.175 226890 DEBUG oslo_concurrency.lockutils [req-a4716a9a-1d97-4c91-b400-49abdda7f61e req-fa538d9e-0d16-4d66-a0af-9eb289f76d5c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:02 np0005588920 nova_compute[226886]: 2026-01-20 14:51:02.175 226890 DEBUG nova.compute.manager [req-a4716a9a-1d97-4c91-b400-49abdda7f61e req-fa538d9e-0d16-4d66-a0af-9eb289f76d5c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] No waiting events found dispatching network-vif-plugged-10da9204-0ccb-45d0-981d-fdff5c41cda1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:51:02 np0005588920 nova_compute[226886]: 2026-01-20 14:51:02.175 226890 WARNING nova.compute.manager [req-a4716a9a-1d97-4c91-b400-49abdda7f61e req-fa538d9e-0d16-4d66-a0af-9eb289f76d5c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received unexpected event network-vif-plugged-10da9204-0ccb-45d0-981d-fdff5c41cda1 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:51:02 np0005588920 nova_compute[226886]: 2026-01-20 14:51:02.176 226890 DEBUG nova.compute.manager [req-a4716a9a-1d97-4c91-b400-49abdda7f61e req-fa538d9e-0d16-4d66-a0af-9eb289f76d5c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received event network-vif-unplugged-ea69e1af-9543-4c76-9981-b8475aa031fe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:51:02 np0005588920 nova_compute[226886]: 2026-01-20 14:51:02.176 226890 DEBUG oslo_concurrency.lockutils [req-a4716a9a-1d97-4c91-b400-49abdda7f61e req-fa538d9e-0d16-4d66-a0af-9eb289f76d5c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f444ccf6-5adb-489a-b174-7450017a351b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:02 np0005588920 nova_compute[226886]: 2026-01-20 14:51:02.176 226890 DEBUG oslo_concurrency.lockutils [req-a4716a9a-1d97-4c91-b400-49abdda7f61e req-fa538d9e-0d16-4d66-a0af-9eb289f76d5c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:02 np0005588920 nova_compute[226886]: 2026-01-20 14:51:02.177 226890 DEBUG oslo_concurrency.lockutils [req-a4716a9a-1d97-4c91-b400-49abdda7f61e req-fa538d9e-0d16-4d66-a0af-9eb289f76d5c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:02 np0005588920 nova_compute[226886]: 2026-01-20 14:51:02.177 226890 DEBUG nova.compute.manager [req-a4716a9a-1d97-4c91-b400-49abdda7f61e req-fa538d9e-0d16-4d66-a0af-9eb289f76d5c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] No waiting events found dispatching network-vif-unplugged-ea69e1af-9543-4c76-9981-b8475aa031fe pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:51:02 np0005588920 nova_compute[226886]: 2026-01-20 14:51:02.177 226890 DEBUG nova.compute.manager [req-a4716a9a-1d97-4c91-b400-49abdda7f61e req-fa538d9e-0d16-4d66-a0af-9eb289f76d5c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received event network-vif-unplugged-ea69e1af-9543-4c76-9981-b8475aa031fe for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:51:02 np0005588920 nova_compute[226886]: 2026-01-20 14:51:02.177 226890 DEBUG nova.compute.manager [req-a4716a9a-1d97-4c91-b400-49abdda7f61e req-fa538d9e-0d16-4d66-a0af-9eb289f76d5c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received event network-vif-plugged-ea69e1af-9543-4c76-9981-b8475aa031fe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:51:02 np0005588920 nova_compute[226886]: 2026-01-20 14:51:02.178 226890 DEBUG oslo_concurrency.lockutils [req-a4716a9a-1d97-4c91-b400-49abdda7f61e req-fa538d9e-0d16-4d66-a0af-9eb289f76d5c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f444ccf6-5adb-489a-b174-7450017a351b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:02 np0005588920 nova_compute[226886]: 2026-01-20 14:51:02.178 226890 DEBUG oslo_concurrency.lockutils [req-a4716a9a-1d97-4c91-b400-49abdda7f61e req-fa538d9e-0d16-4d66-a0af-9eb289f76d5c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:02 np0005588920 nova_compute[226886]: 2026-01-20 14:51:02.178 226890 DEBUG oslo_concurrency.lockutils [req-a4716a9a-1d97-4c91-b400-49abdda7f61e req-fa538d9e-0d16-4d66-a0af-9eb289f76d5c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:02 np0005588920 nova_compute[226886]: 2026-01-20 14:51:02.178 226890 DEBUG nova.compute.manager [req-a4716a9a-1d97-4c91-b400-49abdda7f61e req-fa538d9e-0d16-4d66-a0af-9eb289f76d5c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] No waiting events found dispatching network-vif-plugged-ea69e1af-9543-4c76-9981-b8475aa031fe pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:51:02 np0005588920 nova_compute[226886]: 2026-01-20 14:51:02.179 226890 WARNING nova.compute.manager [req-a4716a9a-1d97-4c91-b400-49abdda7f61e req-fa538d9e-0d16-4d66-a0af-9eb289f76d5c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received unexpected event network-vif-plugged-ea69e1af-9543-4c76-9981-b8475aa031fe for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:51:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e255 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:51:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:02.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:03 np0005588920 nova_compute[226886]: 2026-01-20 14:51:03.465 226890 DEBUG nova.compute.manager [req-e346684f-2b88-4bad-b8d1-1ba724ff0b10 req-5c97e69b-f123-42e0-b34c-e61e47700860 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received event network-vif-plugged-b194a444-cc69-43f2-9931-e9e53ee450c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:51:03 np0005588920 nova_compute[226886]: 2026-01-20 14:51:03.465 226890 DEBUG oslo_concurrency.lockutils [req-e346684f-2b88-4bad-b8d1-1ba724ff0b10 req-5c97e69b-f123-42e0-b34c-e61e47700860 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f444ccf6-5adb-489a-b174-7450017a351b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:03 np0005588920 nova_compute[226886]: 2026-01-20 14:51:03.466 226890 DEBUG oslo_concurrency.lockutils [req-e346684f-2b88-4bad-b8d1-1ba724ff0b10 req-5c97e69b-f123-42e0-b34c-e61e47700860 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:03 np0005588920 nova_compute[226886]: 2026-01-20 14:51:03.466 226890 DEBUG oslo_concurrency.lockutils [req-e346684f-2b88-4bad-b8d1-1ba724ff0b10 req-5c97e69b-f123-42e0-b34c-e61e47700860 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:03 np0005588920 nova_compute[226886]: 2026-01-20 14:51:03.466 226890 DEBUG nova.compute.manager [req-e346684f-2b88-4bad-b8d1-1ba724ff0b10 req-5c97e69b-f123-42e0-b34c-e61e47700860 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] No waiting events found dispatching network-vif-plugged-b194a444-cc69-43f2-9931-e9e53ee450c9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:51:03 np0005588920 nova_compute[226886]: 2026-01-20 14:51:03.467 226890 WARNING nova.compute.manager [req-e346684f-2b88-4bad-b8d1-1ba724ff0b10 req-5c97e69b-f123-42e0-b34c-e61e47700860 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received unexpected event network-vif-plugged-b194a444-cc69-43f2-9931-e9e53ee450c9 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:51:03 np0005588920 nova_compute[226886]: 2026-01-20 14:51:03.603 226890 DEBUG nova.compute.manager [req-139efc2e-8c04-4da5-87e9-51d3537f3dc3 req-62e0006f-23c8-400e-94d1-6f010d650651 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received event network-vif-plugged-ed97bbce-18dc-4c9b-9a04-919dd3a45a8e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:51:03 np0005588920 nova_compute[226886]: 2026-01-20 14:51:03.604 226890 DEBUG oslo_concurrency.lockutils [req-139efc2e-8c04-4da5-87e9-51d3537f3dc3 req-62e0006f-23c8-400e-94d1-6f010d650651 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f444ccf6-5adb-489a-b174-7450017a351b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:03 np0005588920 nova_compute[226886]: 2026-01-20 14:51:03.604 226890 DEBUG oslo_concurrency.lockutils [req-139efc2e-8c04-4da5-87e9-51d3537f3dc3 req-62e0006f-23c8-400e-94d1-6f010d650651 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:03 np0005588920 nova_compute[226886]: 2026-01-20 14:51:03.604 226890 DEBUG oslo_concurrency.lockutils [req-139efc2e-8c04-4da5-87e9-51d3537f3dc3 req-62e0006f-23c8-400e-94d1-6f010d650651 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:03 np0005588920 nova_compute[226886]: 2026-01-20 14:51:03.605 226890 DEBUG nova.compute.manager [req-139efc2e-8c04-4da5-87e9-51d3537f3dc3 req-62e0006f-23c8-400e-94d1-6f010d650651 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] No waiting events found dispatching network-vif-plugged-ed97bbce-18dc-4c9b-9a04-919dd3a45a8e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:51:03 np0005588920 nova_compute[226886]: 2026-01-20 14:51:03.605 226890 WARNING nova.compute.manager [req-139efc2e-8c04-4da5-87e9-51d3537f3dc3 req-62e0006f-23c8-400e-94d1-6f010d650651 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received unexpected event network-vif-plugged-ed97bbce-18dc-4c9b-9a04-919dd3a45a8e for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:51:03 np0005588920 nova_compute[226886]: 2026-01-20 14:51:03.605 226890 DEBUG nova.compute.manager [req-139efc2e-8c04-4da5-87e9-51d3537f3dc3 req-62e0006f-23c8-400e-94d1-6f010d650651 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received event network-vif-unplugged-62ccfbd3-f504-46d0-a4af-ec2dcb7b5764 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:51:03 np0005588920 nova_compute[226886]: 2026-01-20 14:51:03.605 226890 DEBUG oslo_concurrency.lockutils [req-139efc2e-8c04-4da5-87e9-51d3537f3dc3 req-62e0006f-23c8-400e-94d1-6f010d650651 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f444ccf6-5adb-489a-b174-7450017a351b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:03 np0005588920 nova_compute[226886]: 2026-01-20 14:51:03.605 226890 DEBUG oslo_concurrency.lockutils [req-139efc2e-8c04-4da5-87e9-51d3537f3dc3 req-62e0006f-23c8-400e-94d1-6f010d650651 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:03 np0005588920 nova_compute[226886]: 2026-01-20 14:51:03.606 226890 DEBUG oslo_concurrency.lockutils [req-139efc2e-8c04-4da5-87e9-51d3537f3dc3 req-62e0006f-23c8-400e-94d1-6f010d650651 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:03 np0005588920 nova_compute[226886]: 2026-01-20 14:51:03.606 226890 DEBUG nova.compute.manager [req-139efc2e-8c04-4da5-87e9-51d3537f3dc3 req-62e0006f-23c8-400e-94d1-6f010d650651 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] No waiting events found dispatching network-vif-unplugged-62ccfbd3-f504-46d0-a4af-ec2dcb7b5764 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:51:03 np0005588920 nova_compute[226886]: 2026-01-20 14:51:03.606 226890 DEBUG nova.compute.manager [req-139efc2e-8c04-4da5-87e9-51d3537f3dc3 req-62e0006f-23c8-400e-94d1-6f010d650651 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received event network-vif-unplugged-62ccfbd3-f504-46d0-a4af-ec2dcb7b5764 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:51:03 np0005588920 nova_compute[226886]: 2026-01-20 14:51:03.606 226890 DEBUG nova.compute.manager [req-139efc2e-8c04-4da5-87e9-51d3537f3dc3 req-62e0006f-23c8-400e-94d1-6f010d650651 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received event network-vif-plugged-62ccfbd3-f504-46d0-a4af-ec2dcb7b5764 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:51:03 np0005588920 nova_compute[226886]: 2026-01-20 14:51:03.607 226890 DEBUG oslo_concurrency.lockutils [req-139efc2e-8c04-4da5-87e9-51d3537f3dc3 req-62e0006f-23c8-400e-94d1-6f010d650651 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f444ccf6-5adb-489a-b174-7450017a351b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:03 np0005588920 nova_compute[226886]: 2026-01-20 14:51:03.607 226890 DEBUG oslo_concurrency.lockutils [req-139efc2e-8c04-4da5-87e9-51d3537f3dc3 req-62e0006f-23c8-400e-94d1-6f010d650651 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:03 np0005588920 nova_compute[226886]: 2026-01-20 14:51:03.607 226890 DEBUG oslo_concurrency.lockutils [req-139efc2e-8c04-4da5-87e9-51d3537f3dc3 req-62e0006f-23c8-400e-94d1-6f010d650651 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:03 np0005588920 nova_compute[226886]: 2026-01-20 14:51:03.607 226890 DEBUG nova.compute.manager [req-139efc2e-8c04-4da5-87e9-51d3537f3dc3 req-62e0006f-23c8-400e-94d1-6f010d650651 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] No waiting events found dispatching network-vif-plugged-62ccfbd3-f504-46d0-a4af-ec2dcb7b5764 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:51:03 np0005588920 nova_compute[226886]: 2026-01-20 14:51:03.607 226890 WARNING nova.compute.manager [req-139efc2e-8c04-4da5-87e9-51d3537f3dc3 req-62e0006f-23c8-400e-94d1-6f010d650651 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received unexpected event network-vif-plugged-62ccfbd3-f504-46d0-a4af-ec2dcb7b5764 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:51:03 np0005588920 nova_compute[226886]: 2026-01-20 14:51:03.608 226890 DEBUG nova.compute.manager [req-139efc2e-8c04-4da5-87e9-51d3537f3dc3 req-62e0006f-23c8-400e-94d1-6f010d650651 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received event network-vif-unplugged-d2be8515-193f-43f4-bae4-d2a509320929 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:51:03 np0005588920 nova_compute[226886]: 2026-01-20 14:51:03.608 226890 DEBUG oslo_concurrency.lockutils [req-139efc2e-8c04-4da5-87e9-51d3537f3dc3 req-62e0006f-23c8-400e-94d1-6f010d650651 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f444ccf6-5adb-489a-b174-7450017a351b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:03 np0005588920 nova_compute[226886]: 2026-01-20 14:51:03.608 226890 DEBUG oslo_concurrency.lockutils [req-139efc2e-8c04-4da5-87e9-51d3537f3dc3 req-62e0006f-23c8-400e-94d1-6f010d650651 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:03 np0005588920 nova_compute[226886]: 2026-01-20 14:51:03.608 226890 DEBUG oslo_concurrency.lockutils [req-139efc2e-8c04-4da5-87e9-51d3537f3dc3 req-62e0006f-23c8-400e-94d1-6f010d650651 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:03 np0005588920 nova_compute[226886]: 2026-01-20 14:51:03.608 226890 DEBUG nova.compute.manager [req-139efc2e-8c04-4da5-87e9-51d3537f3dc3 req-62e0006f-23c8-400e-94d1-6f010d650651 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] No waiting events found dispatching network-vif-unplugged-d2be8515-193f-43f4-bae4-d2a509320929 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:51:03 np0005588920 nova_compute[226886]: 2026-01-20 14:51:03.609 226890 DEBUG nova.compute.manager [req-139efc2e-8c04-4da5-87e9-51d3537f3dc3 req-62e0006f-23c8-400e-94d1-6f010d650651 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received event network-vif-unplugged-d2be8515-193f-43f4-bae4-d2a509320929 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:51:03 np0005588920 nova_compute[226886]: 2026-01-20 14:51:03.609 226890 DEBUG nova.compute.manager [req-139efc2e-8c04-4da5-87e9-51d3537f3dc3 req-62e0006f-23c8-400e-94d1-6f010d650651 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received event network-vif-plugged-d2be8515-193f-43f4-bae4-d2a509320929 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:51:03 np0005588920 nova_compute[226886]: 2026-01-20 14:51:03.609 226890 DEBUG oslo_concurrency.lockutils [req-139efc2e-8c04-4da5-87e9-51d3537f3dc3 req-62e0006f-23c8-400e-94d1-6f010d650651 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f444ccf6-5adb-489a-b174-7450017a351b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:03 np0005588920 nova_compute[226886]: 2026-01-20 14:51:03.609 226890 DEBUG oslo_concurrency.lockutils [req-139efc2e-8c04-4da5-87e9-51d3537f3dc3 req-62e0006f-23c8-400e-94d1-6f010d650651 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:03 np0005588920 nova_compute[226886]: 2026-01-20 14:51:03.610 226890 DEBUG oslo_concurrency.lockutils [req-139efc2e-8c04-4da5-87e9-51d3537f3dc3 req-62e0006f-23c8-400e-94d1-6f010d650651 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:03 np0005588920 nova_compute[226886]: 2026-01-20 14:51:03.610 226890 DEBUG nova.compute.manager [req-139efc2e-8c04-4da5-87e9-51d3537f3dc3 req-62e0006f-23c8-400e-94d1-6f010d650651 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] No waiting events found dispatching network-vif-plugged-d2be8515-193f-43f4-bae4-d2a509320929 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:51:03 np0005588920 nova_compute[226886]: 2026-01-20 14:51:03.610 226890 WARNING nova.compute.manager [req-139efc2e-8c04-4da5-87e9-51d3537f3dc3 req-62e0006f-23c8-400e-94d1-6f010d650651 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received unexpected event network-vif-plugged-d2be8515-193f-43f4-bae4-d2a509320929 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:51:03 np0005588920 nova_compute[226886]: 2026-01-20 14:51:03.610 226890 DEBUG nova.compute.manager [req-139efc2e-8c04-4da5-87e9-51d3537f3dc3 req-62e0006f-23c8-400e-94d1-6f010d650651 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received event network-vif-unplugged-e10436e2-7916-4b6b-905e-e9be7cb338b9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:51:03 np0005588920 nova_compute[226886]: 2026-01-20 14:51:03.610 226890 DEBUG oslo_concurrency.lockutils [req-139efc2e-8c04-4da5-87e9-51d3537f3dc3 req-62e0006f-23c8-400e-94d1-6f010d650651 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f444ccf6-5adb-489a-b174-7450017a351b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:03 np0005588920 nova_compute[226886]: 2026-01-20 14:51:03.611 226890 DEBUG oslo_concurrency.lockutils [req-139efc2e-8c04-4da5-87e9-51d3537f3dc3 req-62e0006f-23c8-400e-94d1-6f010d650651 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:03 np0005588920 nova_compute[226886]: 2026-01-20 14:51:03.611 226890 DEBUG oslo_concurrency.lockutils [req-139efc2e-8c04-4da5-87e9-51d3537f3dc3 req-62e0006f-23c8-400e-94d1-6f010d650651 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:03 np0005588920 nova_compute[226886]: 2026-01-20 14:51:03.611 226890 DEBUG nova.compute.manager [req-139efc2e-8c04-4da5-87e9-51d3537f3dc3 req-62e0006f-23c8-400e-94d1-6f010d650651 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] No waiting events found dispatching network-vif-unplugged-e10436e2-7916-4b6b-905e-e9be7cb338b9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:51:03 np0005588920 nova_compute[226886]: 2026-01-20 14:51:03.612 226890 DEBUG nova.compute.manager [req-139efc2e-8c04-4da5-87e9-51d3537f3dc3 req-62e0006f-23c8-400e-94d1-6f010d650651 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received event network-vif-unplugged-e10436e2-7916-4b6b-905e-e9be7cb338b9 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:51:03 np0005588920 nova_compute[226886]: 2026-01-20 14:51:03.612 226890 DEBUG nova.compute.manager [req-139efc2e-8c04-4da5-87e9-51d3537f3dc3 req-62e0006f-23c8-400e-94d1-6f010d650651 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received event network-vif-plugged-e10436e2-7916-4b6b-905e-e9be7cb338b9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:51:03 np0005588920 nova_compute[226886]: 2026-01-20 14:51:03.612 226890 DEBUG oslo_concurrency.lockutils [req-139efc2e-8c04-4da5-87e9-51d3537f3dc3 req-62e0006f-23c8-400e-94d1-6f010d650651 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f444ccf6-5adb-489a-b174-7450017a351b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:03 np0005588920 nova_compute[226886]: 2026-01-20 14:51:03.612 226890 DEBUG oslo_concurrency.lockutils [req-139efc2e-8c04-4da5-87e9-51d3537f3dc3 req-62e0006f-23c8-400e-94d1-6f010d650651 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:03 np0005588920 nova_compute[226886]: 2026-01-20 14:51:03.613 226890 DEBUG oslo_concurrency.lockutils [req-139efc2e-8c04-4da5-87e9-51d3537f3dc3 req-62e0006f-23c8-400e-94d1-6f010d650651 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:03 np0005588920 nova_compute[226886]: 2026-01-20 14:51:03.613 226890 DEBUG nova.compute.manager [req-139efc2e-8c04-4da5-87e9-51d3537f3dc3 req-62e0006f-23c8-400e-94d1-6f010d650651 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] No waiting events found dispatching network-vif-plugged-e10436e2-7916-4b6b-905e-e9be7cb338b9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:51:03 np0005588920 nova_compute[226886]: 2026-01-20 14:51:03.613 226890 WARNING nova.compute.manager [req-139efc2e-8c04-4da5-87e9-51d3537f3dc3 req-62e0006f-23c8-400e-94d1-6f010d650651 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received unexpected event network-vif-plugged-e10436e2-7916-4b6b-905e-e9be7cb338b9 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:51:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:03.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:04 np0005588920 kernel: tap2c9f3e71-25 (unregistering): left promiscuous mode
Jan 20 09:51:04 np0005588920 NetworkManager[49076]: <info>  [1768920664.2452] device (tap2c9f3e71-25): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:51:04 np0005588920 nova_compute[226886]: 2026-01-20 14:51:04.256 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:04 np0005588920 ovn_controller[133971]: 2026-01-20T14:51:04Z|00506|binding|INFO|Releasing lport 2c9f3e71-2562-4ae0-bf22-d56553a40405 from this chassis (sb_readonly=0)
Jan 20 09:51:04 np0005588920 ovn_controller[133971]: 2026-01-20T14:51:04Z|00507|binding|INFO|Setting lport 2c9f3e71-2562-4ae0-bf22-d56553a40405 down in Southbound
Jan 20 09:51:04 np0005588920 ovn_controller[133971]: 2026-01-20T14:51:04Z|00508|binding|INFO|Removing iface tap2c9f3e71-25 ovn-installed in OVS
Jan 20 09:51:04 np0005588920 nova_compute[226886]: 2026-01-20 14:51:04.259 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:04.265 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cb:72:0c 10.100.0.14'], port_security=['fa:16:3e:cb:72:0c 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '91701d8b-36b9-42fe-a5ae-bf6c9c74fc14', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-762e1859-4db4-4d9e-b66f-d50316f80df4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1c5f03d46c0c4162a3b2f1530850bb6c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '80535eda-fa59-4edc-8e3d-9bfea6517730', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.180'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2474a8ca-bb96-4cae-9133-23419b81a9fc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=2c9f3e71-2562-4ae0-bf22-d56553a40405) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:51:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:04.267 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 2c9f3e71-2562-4ae0-bf22-d56553a40405 in datapath 762e1859-4db4-4d9e-b66f-d50316f80df4 unbound from our chassis#033[00m
Jan 20 09:51:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:04.269 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 762e1859-4db4-4d9e-b66f-d50316f80df4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:51:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:04.270 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[22428740-e713-4264-b04c-831b6d1cb3be]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:04.270 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 namespace which is not needed anymore#033[00m
Jan 20 09:51:04 np0005588920 nova_compute[226886]: 2026-01-20 14:51:04.276 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:04 np0005588920 systemd[1]: machine-qemu\x2d48\x2dinstance\x2d0000006f.scope: Deactivated successfully.
Jan 20 09:51:04 np0005588920 systemd[1]: machine-qemu\x2d48\x2dinstance\x2d0000006f.scope: Consumed 14.388s CPU time.
Jan 20 09:51:04 np0005588920 systemd-machined[196121]: Machine qemu-48-instance-0000006f terminated.
Jan 20 09:51:04 np0005588920 nova_compute[226886]: 2026-01-20 14:51:04.360 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:04 np0005588920 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[265174]: [NOTICE]   (265178) : haproxy version is 2.8.14-c23fe91
Jan 20 09:51:04 np0005588920 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[265174]: [NOTICE]   (265178) : path to executable is /usr/sbin/haproxy
Jan 20 09:51:04 np0005588920 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[265174]: [WARNING]  (265178) : Exiting Master process...
Jan 20 09:51:04 np0005588920 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[265174]: [ALERT]    (265178) : Current worker (265180) exited with code 143 (Terminated)
Jan 20 09:51:04 np0005588920 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[265174]: [WARNING]  (265178) : All workers exited. Exiting... (0)
Jan 20 09:51:04 np0005588920 systemd[1]: libpod-ecd43435019f47fc52166494679da4faa4655084f129f35d02a7ef2f73503469.scope: Deactivated successfully.
Jan 20 09:51:04 np0005588920 podman[265840]: 2026-01-20 14:51:04.459622504 +0000 UTC m=+0.107335833 container died ecd43435019f47fc52166494679da4faa4655084f129f35d02a7ef2f73503469 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 20 09:51:04 np0005588920 nova_compute[226886]: 2026-01-20 14:51:04.495 226890 INFO nova.virt.libvirt.driver [None req-69248774-11bc-475a-9af3-540082bcdd43 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Instance shutdown successfully after 3 seconds.#033[00m
Jan 20 09:51:04 np0005588920 nova_compute[226886]: 2026-01-20 14:51:04.504 226890 INFO nova.virt.libvirt.driver [-] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Instance destroyed successfully.#033[00m
Jan 20 09:51:04 np0005588920 nova_compute[226886]: 2026-01-20 14:51:04.504 226890 DEBUG nova.objects.instance [None req-69248774-11bc-475a-9af3-540082bcdd43 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'numa_topology' on Instance uuid 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:51:04 np0005588920 nova_compute[226886]: 2026-01-20 14:51:04.515 226890 DEBUG nova.compute.manager [None req-69248774-11bc-475a-9af3-540082bcdd43 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:51:04 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ecd43435019f47fc52166494679da4faa4655084f129f35d02a7ef2f73503469-userdata-shm.mount: Deactivated successfully.
Jan 20 09:51:04 np0005588920 systemd[1]: var-lib-containers-storage-overlay-8547778e3af0e305b2d9b868cb26df82069846e48ececd61b645d71fe6cd5a72-merged.mount: Deactivated successfully.
Jan 20 09:51:04 np0005588920 podman[265840]: 2026-01-20 14:51:04.555536698 +0000 UTC m=+0.203249977 container cleanup ecd43435019f47fc52166494679da4faa4655084f129f35d02a7ef2f73503469 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 20 09:51:04 np0005588920 nova_compute[226886]: 2026-01-20 14:51:04.558 226890 DEBUG oslo_concurrency.lockutils [None req-69248774-11bc-475a-9af3-540082bcdd43 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "91701d8b-36b9-42fe-a5ae-bf6c9c74fc14" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 3.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:04 np0005588920 systemd[1]: libpod-conmon-ecd43435019f47fc52166494679da4faa4655084f129f35d02a7ef2f73503469.scope: Deactivated successfully.
Jan 20 09:51:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:51:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:04.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:51:04 np0005588920 podman[265879]: 2026-01-20 14:51:04.788535873 +0000 UTC m=+0.209550252 container remove ecd43435019f47fc52166494679da4faa4655084f129f35d02a7ef2f73503469 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 20 09:51:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:04.794 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[29c1e7d6-6863-42c6-b4e7-52418461e92d]: (4, ('Tue Jan 20 02:51:04 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 (ecd43435019f47fc52166494679da4faa4655084f129f35d02a7ef2f73503469)\necd43435019f47fc52166494679da4faa4655084f129f35d02a7ef2f73503469\nTue Jan 20 02:51:04 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 (ecd43435019f47fc52166494679da4faa4655084f129f35d02a7ef2f73503469)\necd43435019f47fc52166494679da4faa4655084f129f35d02a7ef2f73503469\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:04.795 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[036890ab-980b-4ac1-92b0-ab7c440a206b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:04.796 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap762e1859-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:51:04 np0005588920 nova_compute[226886]: 2026-01-20 14:51:04.798 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:04 np0005588920 kernel: tap762e1859-40: left promiscuous mode
Jan 20 09:51:04 np0005588920 nova_compute[226886]: 2026-01-20 14:51:04.815 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:04.818 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[8cc36a46-e471-4cc6-adaf-4e8302e33988]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:04.833 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[64266dca-9c65-44e2-8930-07bd5ba10bc4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:04.834 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ea4b99a1-f4fb-4e34-9996-931a250d4062]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:04.858 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[00a95321-afc6-447e-88d1-ca27e5750ffb]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 568103, 'reachable_time': 20270, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 265898, 'error': None, 'target': 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:04.861 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:51:04 np0005588920 systemd[1]: run-netns-ovnmeta\x2d762e1859\x2d4db4\x2d4d9e\x2db66f\x2dd50316f80df4.mount: Deactivated successfully.
Jan 20 09:51:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:04.861 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[e02c18b6-901b-4259-942c-feba712132dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:05 np0005588920 nova_compute[226886]: 2026-01-20 14:51:05.604 226890 DEBUG nova.compute.manager [req-c32b7f8d-9983-4b88-a00a-911e052a9a02 req-697eba0c-977a-4371-9d04-c0d632ca86ac 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received event network-vif-deleted-e10436e2-7916-4b6b-905e-e9be7cb338b9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:51:05 np0005588920 nova_compute[226886]: 2026-01-20 14:51:05.605 226890 INFO nova.compute.manager [req-c32b7f8d-9983-4b88-a00a-911e052a9a02 req-697eba0c-977a-4371-9d04-c0d632ca86ac 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Neutron deleted interface e10436e2-7916-4b6b-905e-e9be7cb338b9; detaching it from the instance and deleting it from the info cache#033[00m
Jan 20 09:51:05 np0005588920 nova_compute[226886]: 2026-01-20 14:51:05.605 226890 DEBUG nova.network.neutron [req-c32b7f8d-9983-4b88-a00a-911e052a9a02 req-697eba0c-977a-4371-9d04-c0d632ca86ac 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Updating instance_info_cache with network_info: [{"id": "ed97bbce-18dc-4c9b-9a04-919dd3a45a8e", "address": "fa:16:3e:f7:b6:ce", "network": {"id": "ff283be9-fe7c-4cc6-900d-7258ea771ba5", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest-1025807292-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.227", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "taped97bbce-18", "ovs_interfaceid": "ed97bbce-18dc-4c9b-9a04-919dd3a45a8e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "b194a444-cc69-43f2-9931-e9e53ee450c9", "address": "fa:16:3e:2a:e3:2c", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb194a444-cc", "ovs_interfaceid": "b194a444-cc69-43f2-9931-e9e53ee450c9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "10da9204-0ccb-45d0-981d-fdff5c41cda1", "address": "fa:16:3e:8b:cf:d1", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.179", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10da9204-0c", "ovs_interfaceid": "10da9204-0ccb-45d0-981d-fdff5c41cda1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "62ccfbd3-f504-46d0-a4af-ec2dcb7b5764", "address": "fa:16:3e:a0:d2:6d", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.81", "type": "fixed", "version": 4, "meta": {}, "floating_ips": 
[]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62ccfbd3-f5", "ovs_interfaceid": "62ccfbd3-f504-46d0-a4af-ec2dcb7b5764", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d2be8515-193f-43f4-bae4-d2a509320929", "address": "fa:16:3e:3d:83:a1", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.252", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2be8515-19", "ovs_interfaceid": "d2be8515-193f-43f4-bae4-d2a509320929", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ea69e1af-9543-4c76-9981-b8475aa031fe", "address": "fa:16:3e:35:77:c5", "network": {"id": "76110867-e0cf-4657-99dd-486c8fecc844", "bridge": "br-int", "label": "tempest-device-tagging-net2-1836923782", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea69e1af-95", "ovs_interfaceid": "ea69e1af-9543-4c76-9981-b8475aa031fe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 20 09:51:05 np0005588920 nova_compute[226886]: 2026-01-20 14:51:05.638 226890 DEBUG nova.compute.manager [req-c32b7f8d-9983-4b88-a00a-911e052a9a02 req-697eba0c-977a-4371-9d04-c0d632ca86ac 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Detach interface failed, port_id=e10436e2-7916-4b6b-905e-e9be7cb338b9, reason: Instance f444ccf6-5adb-489a-b174-7450017a351b could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 20 09:51:05 np0005588920 nova_compute[226886]: 2026-01-20 14:51:05.639 226890 DEBUG nova.compute.manager [req-c32b7f8d-9983-4b88-a00a-911e052a9a02 req-697eba0c-977a-4371-9d04-c0d632ca86ac 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received event network-vif-deleted-d2be8515-193f-43f4-bae4-d2a509320929 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 09:51:05 np0005588920 nova_compute[226886]: 2026-01-20 14:51:05.639 226890 INFO nova.compute.manager [req-c32b7f8d-9983-4b88-a00a-911e052a9a02 req-697eba0c-977a-4371-9d04-c0d632ca86ac 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Neutron deleted interface d2be8515-193f-43f4-bae4-d2a509320929; detaching it from the instance and deleting it from the info cache
Jan 20 09:51:05 np0005588920 nova_compute[226886]: 2026-01-20 14:51:05.639 226890 DEBUG nova.network.neutron [req-c32b7f8d-9983-4b88-a00a-911e052a9a02 req-697eba0c-977a-4371-9d04-c0d632ca86ac 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Updating instance_info_cache with network_info: [{"id": "ed97bbce-18dc-4c9b-9a04-919dd3a45a8e", "address": "fa:16:3e:f7:b6:ce", "network": {"id": "ff283be9-fe7c-4cc6-900d-7258ea771ba5", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest-1025807292-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.227", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "taped97bbce-18", "ovs_interfaceid": "ed97bbce-18dc-4c9b-9a04-919dd3a45a8e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "b194a444-cc69-43f2-9931-e9e53ee450c9", "address": "fa:16:3e:2a:e3:2c", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb194a444-cc", "ovs_interfaceid": "b194a444-cc69-43f2-9931-e9e53ee450c9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "10da9204-0ccb-45d0-981d-fdff5c41cda1", "address": "fa:16:3e:8b:cf:d1", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.179", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10da9204-0c", "ovs_interfaceid": "10da9204-0ccb-45d0-981d-fdff5c41cda1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "62ccfbd3-f504-46d0-a4af-ec2dcb7b5764", "address": "fa:16:3e:a0:d2:6d", "network": {"id": "cfaa226a-b6e0-41ba-a3f5-d7b004368355", "bridge": "br-int", "label": "tempest-device-tagging-net1-1605565951", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.81", "type": "fixed", "version": 4, "meta": {}, "floating_ips": 
[]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62ccfbd3-f5", "ovs_interfaceid": "62ccfbd3-f504-46d0-a4af-ec2dcb7b5764", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ea69e1af-9543-4c76-9981-b8475aa031fe", "address": "fa:16:3e:35:77:c5", "network": {"id": "76110867-e0cf-4657-99dd-486c8fecc844", "bridge": "br-int", "label": "tempest-device-tagging-net2-1836923782", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b15c4e6eb57e4b0ca4e63c85ed92fc5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea69e1af-95", "ovs_interfaceid": "ea69e1af-9543-4c76-9981-b8475aa031fe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 20 09:51:05 np0005588920 nova_compute[226886]: 2026-01-20 14:51:05.662 226890 DEBUG nova.compute.manager [req-c32b7f8d-9983-4b88-a00a-911e052a9a02 req-697eba0c-977a-4371-9d04-c0d632ca86ac 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Detach interface failed, port_id=d2be8515-193f-43f4-bae4-d2a509320929, reason: Instance f444ccf6-5adb-489a-b174-7450017a351b could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 20 09:51:05 np0005588920 nova_compute[226886]: 2026-01-20 14:51:05.663 226890 DEBUG nova.compute.manager [req-c32b7f8d-9983-4b88-a00a-911e052a9a02 req-697eba0c-977a-4371-9d04-c0d632ca86ac 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Received event network-vif-unplugged-2c9f3e71-2562-4ae0-bf22-d56553a40405 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 09:51:05 np0005588920 nova_compute[226886]: 2026-01-20 14:51:05.663 226890 DEBUG oslo_concurrency.lockutils [req-c32b7f8d-9983-4b88-a00a-911e052a9a02 req-697eba0c-977a-4371-9d04-c0d632ca86ac 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "91701d8b-36b9-42fe-a5ae-bf6c9c74fc14-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:51:05 np0005588920 nova_compute[226886]: 2026-01-20 14:51:05.663 226890 DEBUG oslo_concurrency.lockutils [req-c32b7f8d-9983-4b88-a00a-911e052a9a02 req-697eba0c-977a-4371-9d04-c0d632ca86ac 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "91701d8b-36b9-42fe-a5ae-bf6c9c74fc14-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:51:05 np0005588920 nova_compute[226886]: 2026-01-20 14:51:05.664 226890 DEBUG oslo_concurrency.lockutils [req-c32b7f8d-9983-4b88-a00a-911e052a9a02 req-697eba0c-977a-4371-9d04-c0d632ca86ac 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "91701d8b-36b9-42fe-a5ae-bf6c9c74fc14-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:51:05 np0005588920 nova_compute[226886]: 2026-01-20 14:51:05.664 226890 DEBUG nova.compute.manager [req-c32b7f8d-9983-4b88-a00a-911e052a9a02 req-697eba0c-977a-4371-9d04-c0d632ca86ac 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] No waiting events found dispatching network-vif-unplugged-2c9f3e71-2562-4ae0-bf22-d56553a40405 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 20 09:51:05 np0005588920 nova_compute[226886]: 2026-01-20 14:51:05.664 226890 WARNING nova.compute.manager [req-c32b7f8d-9983-4b88-a00a-911e052a9a02 req-697eba0c-977a-4371-9d04-c0d632ca86ac 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Received unexpected event network-vif-unplugged-2c9f3e71-2562-4ae0-bf22-d56553a40405 for instance with vm_state stopped and task_state None.
Jan 20 09:51:05 np0005588920 nova_compute[226886]: 2026-01-20 14:51:05.864 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:51:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:05.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:51:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:06.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:51:07 np0005588920 nova_compute[226886]: 2026-01-20 14:51:07.164 226890 DEBUG nova.network.neutron [-] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 20 09:51:07 np0005588920 nova_compute[226886]: 2026-01-20 14:51:07.188 226890 INFO nova.compute.manager [-] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Took 7.32 seconds to deallocate network for instance.
Jan 20 09:51:07 np0005588920 nova_compute[226886]: 2026-01-20 14:51:07.509 226890 DEBUG nova.objects.instance [None req-b996dbc6-f98a-4c66-a2d8-76d123f83f9f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'flavor' on Instance uuid 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 09:51:07 np0005588920 nova_compute[226886]: 2026-01-20 14:51:07.551 226890 DEBUG oslo_concurrency.lockutils [None req-b996dbc6-f98a-4c66-a2d8-76d123f83f9f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "refresh_cache-91701d8b-36b9-42fe-a5ae-bf6c9c74fc14" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 20 09:51:07 np0005588920 nova_compute[226886]: 2026-01-20 14:51:07.552 226890 DEBUG oslo_concurrency.lockutils [None req-b996dbc6-f98a-4c66-a2d8-76d123f83f9f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquired lock "refresh_cache-91701d8b-36b9-42fe-a5ae-bf6c9c74fc14" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 20 09:51:07 np0005588920 nova_compute[226886]: 2026-01-20 14:51:07.552 226890 DEBUG nova.network.neutron [None req-b996dbc6-f98a-4c66-a2d8-76d123f83f9f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 20 09:51:07 np0005588920 nova_compute[226886]: 2026-01-20 14:51:07.553 226890 DEBUG nova.objects.instance [None req-b996dbc6-f98a-4c66-a2d8-76d123f83f9f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'info_cache' on Instance uuid 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 09:51:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e255 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:51:07 np0005588920 nova_compute[226886]: 2026-01-20 14:51:07.740 226890 DEBUG nova.compute.manager [req-bedd4aca-a2ab-48ce-a638-d340a14ef916 req-3773b4aa-c642-4261-b180-8443989dd539 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received event network-vif-deleted-ea69e1af-9543-4c76-9981-b8475aa031fe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 09:51:07 np0005588920 nova_compute[226886]: 2026-01-20 14:51:07.744 226890 DEBUG nova.compute.manager [req-bedd4aca-a2ab-48ce-a638-d340a14ef916 req-3773b4aa-c642-4261-b180-8443989dd539 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Received event network-vif-plugged-2c9f3e71-2562-4ae0-bf22-d56553a40405 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 09:51:07 np0005588920 nova_compute[226886]: 2026-01-20 14:51:07.744 226890 DEBUG oslo_concurrency.lockutils [req-bedd4aca-a2ab-48ce-a638-d340a14ef916 req-3773b4aa-c642-4261-b180-8443989dd539 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "91701d8b-36b9-42fe-a5ae-bf6c9c74fc14-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:51:07 np0005588920 nova_compute[226886]: 2026-01-20 14:51:07.744 226890 DEBUG oslo_concurrency.lockutils [req-bedd4aca-a2ab-48ce-a638-d340a14ef916 req-3773b4aa-c642-4261-b180-8443989dd539 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "91701d8b-36b9-42fe-a5ae-bf6c9c74fc14-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:51:07 np0005588920 nova_compute[226886]: 2026-01-20 14:51:07.745 226890 DEBUG oslo_concurrency.lockutils [req-bedd4aca-a2ab-48ce-a638-d340a14ef916 req-3773b4aa-c642-4261-b180-8443989dd539 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "91701d8b-36b9-42fe-a5ae-bf6c9c74fc14-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:51:07 np0005588920 nova_compute[226886]: 2026-01-20 14:51:07.745 226890 DEBUG nova.compute.manager [req-bedd4aca-a2ab-48ce-a638-d340a14ef916 req-3773b4aa-c642-4261-b180-8443989dd539 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] No waiting events found dispatching network-vif-plugged-2c9f3e71-2562-4ae0-bf22-d56553a40405 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 20 09:51:07 np0005588920 nova_compute[226886]: 2026-01-20 14:51:07.745 226890 WARNING nova.compute.manager [req-bedd4aca-a2ab-48ce-a638-d340a14ef916 req-3773b4aa-c642-4261-b180-8443989dd539 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Received unexpected event network-vif-plugged-2c9f3e71-2562-4ae0-bf22-d56553a40405 for instance with vm_state stopped and task_state powering-on.
Jan 20 09:51:07 np0005588920 nova_compute[226886]: 2026-01-20 14:51:07.746 226890 DEBUG nova.compute.manager [req-bedd4aca-a2ab-48ce-a638-d340a14ef916 req-3773b4aa-c642-4261-b180-8443989dd539 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received event network-vif-deleted-62ccfbd3-f504-46d0-a4af-ec2dcb7b5764 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 09:51:07 np0005588920 nova_compute[226886]: 2026-01-20 14:51:07.746 226890 DEBUG nova.compute.manager [req-bedd4aca-a2ab-48ce-a638-d340a14ef916 req-3773b4aa-c642-4261-b180-8443989dd539 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Received event network-vif-deleted-ed97bbce-18dc-4c9b-9a04-919dd3a45a8e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 09:51:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:07.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:07 np0005588920 nova_compute[226886]: 2026-01-20 14:51:07.988 226890 INFO nova.compute.manager [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Took 0.80 seconds to detach 3 volumes for instance.
Jan 20 09:51:08 np0005588920 nova_compute[226886]: 2026-01-20 14:51:08.030 226890 DEBUG oslo_concurrency.lockutils [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:51:08 np0005588920 nova_compute[226886]: 2026-01-20 14:51:08.030 226890 DEBUG oslo_concurrency.lockutils [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:51:08 np0005588920 nova_compute[226886]: 2026-01-20 14:51:08.101 226890 DEBUG oslo_concurrency.processutils [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:51:08 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:51:08 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2039283088' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:51:08 np0005588920 nova_compute[226886]: 2026-01-20 14:51:08.515 226890 DEBUG oslo_concurrency.processutils [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.414s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:51:08 np0005588920 nova_compute[226886]: 2026-01-20 14:51:08.523 226890 DEBUG nova.compute.provider_tree [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 20 09:51:08 np0005588920 nova_compute[226886]: 2026-01-20 14:51:08.615 226890 DEBUG nova.scheduler.client.report [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 20 09:51:08 np0005588920 nova_compute[226886]: 2026-01-20 14:51:08.655 226890 DEBUG oslo_concurrency.lockutils [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.625s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:51:08 np0005588920 nova_compute[226886]: 2026-01-20 14:51:08.683 226890 INFO nova.scheduler.client.report [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Deleted allocations for instance f444ccf6-5adb-489a-b174-7450017a351b
Jan 20 09:51:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:08.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:08 np0005588920 nova_compute[226886]: 2026-01-20 14:51:08.829 226890 DEBUG oslo_concurrency.lockutils [None req-89361a6e-9088-4e8e-a1d4-8e2944ce1503 1d45e7e42e6d419898780db108ff93ff b15c4e6eb57e4b0ca4e63c85ed92fc5f - - default default] Lock "f444ccf6-5adb-489a-b174-7450017a351b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 10.937s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:51:08 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e256 e256: 3 total, 3 up, 3 in
Jan 20 09:51:09 np0005588920 nova_compute[226886]: 2026-01-20 14:51:09.268 226890 DEBUG nova.network.neutron [None req-b996dbc6-f98a-4c66-a2d8-76d123f83f9f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Updating instance_info_cache with network_info: [{"id": "2c9f3e71-2562-4ae0-bf22-d56553a40405", "address": "fa:16:3e:cb:72:0c", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c9f3e71-25", "ovs_interfaceid": "2c9f3e71-2562-4ae0-bf22-d56553a40405", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 20 09:51:09 np0005588920 nova_compute[226886]: 2026-01-20 14:51:09.291 226890 DEBUG oslo_concurrency.lockutils [None req-b996dbc6-f98a-4c66-a2d8-76d123f83f9f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Releasing lock "refresh_cache-91701d8b-36b9-42fe-a5ae-bf6c9c74fc14" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 20 09:51:09 np0005588920 nova_compute[226886]: 2026-01-20 14:51:09.319 226890 INFO nova.virt.libvirt.driver [-] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Instance destroyed successfully.
Jan 20 09:51:09 np0005588920 nova_compute[226886]: 2026-01-20 14:51:09.320 226890 DEBUG nova.objects.instance [None req-b996dbc6-f98a-4c66-a2d8-76d123f83f9f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'numa_topology' on Instance uuid 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 09:51:09 np0005588920 nova_compute[226886]: 2026-01-20 14:51:09.332 226890 DEBUG nova.objects.instance [None req-b996dbc6-f98a-4c66-a2d8-76d123f83f9f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'resources' on Instance uuid 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 09:51:09 np0005588920 nova_compute[226886]: 2026-01-20 14:51:09.341 226890 DEBUG nova.virt.libvirt.vif [None req-b996dbc6-f98a-4c66-a2d8-76d123f83f9f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:50:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1254775729',display_name='tempest-ServerActionsTestJSON-server-1254775729',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1254775729',id=111,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLgEzx5mLsSqRL7L9WKOzM+WdeJ40U103wY9H3VMZ41G/sN5tQtQSt9lXWKTyc6pt00bfKD0E9GPugNMpy+dzSSpK23o3CgadkzAfAvjQCeCPOSp3fX13FGApomGd1HRCQ==',key_name='tempest-keypair-1602241722',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:50:40Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='1c5f03d46c0c4162a3b2f1530850bb6c',ramdisk_id='',reservation_id='r-f09hwh77',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1020442335',owner_user_name='tempest-ServerActionsTestJSON-1020442335-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:51:04Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e9278fdb9e645b7938f3edb20c4d3cf',uuid=91701d8b-36b9-42fe-a5ae-bf6c9c74fc14,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "2c9f3e71-2562-4ae0-bf22-d56553a40405", "address": "fa:16:3e:cb:72:0c", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c9f3e71-25", "ovs_interfaceid": "2c9f3e71-2562-4ae0-bf22-d56553a40405", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:51:09 np0005588920 nova_compute[226886]: 2026-01-20 14:51:09.342 226890 DEBUG nova.network.os_vif_util [None req-b996dbc6-f98a-4c66-a2d8-76d123f83f9f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converting VIF {"id": "2c9f3e71-2562-4ae0-bf22-d56553a40405", "address": "fa:16:3e:cb:72:0c", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c9f3e71-25", "ovs_interfaceid": "2c9f3e71-2562-4ae0-bf22-d56553a40405", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:51:09 np0005588920 nova_compute[226886]: 2026-01-20 14:51:09.343 226890 DEBUG nova.network.os_vif_util [None req-b996dbc6-f98a-4c66-a2d8-76d123f83f9f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cb:72:0c,bridge_name='br-int',has_traffic_filtering=True,id=2c9f3e71-2562-4ae0-bf22-d56553a40405,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c9f3e71-25') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:51:09 np0005588920 nova_compute[226886]: 2026-01-20 14:51:09.343 226890 DEBUG os_vif [None req-b996dbc6-f98a-4c66-a2d8-76d123f83f9f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cb:72:0c,bridge_name='br-int',has_traffic_filtering=True,id=2c9f3e71-2562-4ae0-bf22-d56553a40405,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c9f3e71-25') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:51:09 np0005588920 nova_compute[226886]: 2026-01-20 14:51:09.344 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:09 np0005588920 nova_compute[226886]: 2026-01-20 14:51:09.345 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2c9f3e71-25, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:51:09 np0005588920 nova_compute[226886]: 2026-01-20 14:51:09.346 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:09 np0005588920 nova_compute[226886]: 2026-01-20 14:51:09.348 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:51:09 np0005588920 nova_compute[226886]: 2026-01-20 14:51:09.351 226890 INFO os_vif [None req-b996dbc6-f98a-4c66-a2d8-76d123f83f9f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cb:72:0c,bridge_name='br-int',has_traffic_filtering=True,id=2c9f3e71-2562-4ae0-bf22-d56553a40405,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c9f3e71-25')#033[00m
Jan 20 09:51:09 np0005588920 nova_compute[226886]: 2026-01-20 14:51:09.357 226890 DEBUG nova.virt.libvirt.driver [None req-b996dbc6-f98a-4c66-a2d8-76d123f83f9f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Start _get_guest_xml network_info=[{"id": "2c9f3e71-2562-4ae0-bf22-d56553a40405", "address": "fa:16:3e:cb:72:0c", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c9f3e71-25", "ovs_interfaceid": "2c9f3e71-2562-4ae0-bf22-d56553a40405", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:51:09 np0005588920 nova_compute[226886]: 2026-01-20 14:51:09.361 226890 WARNING nova.virt.libvirt.driver [None req-b996dbc6-f98a-4c66-a2d8-76d123f83f9f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:51:09 np0005588920 nova_compute[226886]: 2026-01-20 14:51:09.365 226890 DEBUG nova.virt.libvirt.host [None req-b996dbc6-f98a-4c66-a2d8-76d123f83f9f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:51:09 np0005588920 nova_compute[226886]: 2026-01-20 14:51:09.366 226890 DEBUG nova.virt.libvirt.host [None req-b996dbc6-f98a-4c66-a2d8-76d123f83f9f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:51:09 np0005588920 nova_compute[226886]: 2026-01-20 14:51:09.368 226890 DEBUG nova.virt.libvirt.host [None req-b996dbc6-f98a-4c66-a2d8-76d123f83f9f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:51:09 np0005588920 nova_compute[226886]: 2026-01-20 14:51:09.369 226890 DEBUG nova.virt.libvirt.host [None req-b996dbc6-f98a-4c66-a2d8-76d123f83f9f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:51:09 np0005588920 nova_compute[226886]: 2026-01-20 14:51:09.370 226890 DEBUG nova.virt.libvirt.driver [None req-b996dbc6-f98a-4c66-a2d8-76d123f83f9f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:51:09 np0005588920 nova_compute[226886]: 2026-01-20 14:51:09.370 226890 DEBUG nova.virt.hardware [None req-b996dbc6-f98a-4c66-a2d8-76d123f83f9f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:51:09 np0005588920 nova_compute[226886]: 2026-01-20 14:51:09.371 226890 DEBUG nova.virt.hardware [None req-b996dbc6-f98a-4c66-a2d8-76d123f83f9f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:51:09 np0005588920 nova_compute[226886]: 2026-01-20 14:51:09.371 226890 DEBUG nova.virt.hardware [None req-b996dbc6-f98a-4c66-a2d8-76d123f83f9f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:51:09 np0005588920 nova_compute[226886]: 2026-01-20 14:51:09.371 226890 DEBUG nova.virt.hardware [None req-b996dbc6-f98a-4c66-a2d8-76d123f83f9f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:51:09 np0005588920 nova_compute[226886]: 2026-01-20 14:51:09.371 226890 DEBUG nova.virt.hardware [None req-b996dbc6-f98a-4c66-a2d8-76d123f83f9f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:51:09 np0005588920 nova_compute[226886]: 2026-01-20 14:51:09.371 226890 DEBUG nova.virt.hardware [None req-b996dbc6-f98a-4c66-a2d8-76d123f83f9f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:51:09 np0005588920 nova_compute[226886]: 2026-01-20 14:51:09.372 226890 DEBUG nova.virt.hardware [None req-b996dbc6-f98a-4c66-a2d8-76d123f83f9f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:51:09 np0005588920 nova_compute[226886]: 2026-01-20 14:51:09.372 226890 DEBUG nova.virt.hardware [None req-b996dbc6-f98a-4c66-a2d8-76d123f83f9f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:51:09 np0005588920 nova_compute[226886]: 2026-01-20 14:51:09.372 226890 DEBUG nova.virt.hardware [None req-b996dbc6-f98a-4c66-a2d8-76d123f83f9f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:51:09 np0005588920 nova_compute[226886]: 2026-01-20 14:51:09.372 226890 DEBUG nova.virt.hardware [None req-b996dbc6-f98a-4c66-a2d8-76d123f83f9f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:51:09 np0005588920 nova_compute[226886]: 2026-01-20 14:51:09.372 226890 DEBUG nova.virt.hardware [None req-b996dbc6-f98a-4c66-a2d8-76d123f83f9f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:51:09 np0005588920 nova_compute[226886]: 2026-01-20 14:51:09.373 226890 DEBUG nova.objects.instance [None req-b996dbc6-f98a-4c66-a2d8-76d123f83f9f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'vcpu_model' on Instance uuid 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:51:09 np0005588920 nova_compute[226886]: 2026-01-20 14:51:09.389 226890 DEBUG oslo_concurrency.processutils [None req-b996dbc6-f98a-4c66-a2d8-76d123f83f9f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:51:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:51:09 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2607352360' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:51:09 np0005588920 nova_compute[226886]: 2026-01-20 14:51:09.821 226890 DEBUG oslo_concurrency.processutils [None req-b996dbc6-f98a-4c66-a2d8-76d123f83f9f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:51:09 np0005588920 nova_compute[226886]: 2026-01-20 14:51:09.850 226890 DEBUG oslo_concurrency.processutils [None req-b996dbc6-f98a-4c66-a2d8-76d123f83f9f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:51:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:09.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:10 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:51:10 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2098571758' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:51:10 np0005588920 nova_compute[226886]: 2026-01-20 14:51:10.275 226890 DEBUG oslo_concurrency.processutils [None req-b996dbc6-f98a-4c66-a2d8-76d123f83f9f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.425s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:51:10 np0005588920 nova_compute[226886]: 2026-01-20 14:51:10.278 226890 DEBUG nova.virt.libvirt.vif [None req-b996dbc6-f98a-4c66-a2d8-76d123f83f9f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:50:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1254775729',display_name='tempest-ServerActionsTestJSON-server-1254775729',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1254775729',id=111,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLgEzx5mLsSqRL7L9WKOzM+WdeJ40U103wY9H3VMZ41G/sN5tQtQSt9lXWKTyc6pt00bfKD0E9GPugNMpy+dzSSpK23o3CgadkzAfAvjQCeCPOSp3fX13FGApomGd1HRCQ==',key_name='tempest-keypair-1602241722',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:50:40Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='1c5f03d46c0c4162a3b2f1530850bb6c',ramdisk_id='',reservation_id='r-f09hwh77',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1020442335',owner_user_name='tempest-ServerActionsTestJSON-1020442335-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:51:04Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e9278fdb9e645b7938f3edb20c4d3cf',uuid=91701d8b-36b9-42fe-a5ae-bf6c9c74fc14,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "2c9f3e71-2562-4ae0-bf22-d56553a40405", "address": "fa:16:3e:cb:72:0c", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c9f3e71-25", "ovs_interfaceid": "2c9f3e71-2562-4ae0-bf22-d56553a40405", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:51:10 np0005588920 nova_compute[226886]: 2026-01-20 14:51:10.279 226890 DEBUG nova.network.os_vif_util [None req-b996dbc6-f98a-4c66-a2d8-76d123f83f9f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converting VIF {"id": "2c9f3e71-2562-4ae0-bf22-d56553a40405", "address": "fa:16:3e:cb:72:0c", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c9f3e71-25", "ovs_interfaceid": "2c9f3e71-2562-4ae0-bf22-d56553a40405", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:51:10 np0005588920 nova_compute[226886]: 2026-01-20 14:51:10.280 226890 DEBUG nova.network.os_vif_util [None req-b996dbc6-f98a-4c66-a2d8-76d123f83f9f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cb:72:0c,bridge_name='br-int',has_traffic_filtering=True,id=2c9f3e71-2562-4ae0-bf22-d56553a40405,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c9f3e71-25') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:51:10 np0005588920 nova_compute[226886]: 2026-01-20 14:51:10.282 226890 DEBUG nova.objects.instance [None req-b996dbc6-f98a-4c66-a2d8-76d123f83f9f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'pci_devices' on Instance uuid 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:51:10 np0005588920 nova_compute[226886]: 2026-01-20 14:51:10.300 226890 DEBUG nova.virt.libvirt.driver [None req-b996dbc6-f98a-4c66-a2d8-76d123f83f9f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:51:10 np0005588920 nova_compute[226886]:  <uuid>91701d8b-36b9-42fe-a5ae-bf6c9c74fc14</uuid>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:  <name>instance-0000006f</name>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:51:10 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:      <nova:name>tempest-ServerActionsTestJSON-server-1254775729</nova:name>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:51:09</nova:creationTime>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:51:10 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:        <nova:user uuid="3e9278fdb9e645b7938f3edb20c4d3cf">tempest-ServerActionsTestJSON-1020442335-project-member</nova:user>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:        <nova:project uuid="1c5f03d46c0c4162a3b2f1530850bb6c">tempest-ServerActionsTestJSON-1020442335</nova:project>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:        <nova:port uuid="2c9f3e71-2562-4ae0-bf22-d56553a40405">
Jan 20 09:51:10 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:      <entry name="serial">91701d8b-36b9-42fe-a5ae-bf6c9c74fc14</entry>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:      <entry name="uuid">91701d8b-36b9-42fe-a5ae-bf6c9c74fc14</entry>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:51:10 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/91701d8b-36b9-42fe-a5ae-bf6c9c74fc14_disk">
Jan 20 09:51:10 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:51:10 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:51:10 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/91701d8b-36b9-42fe-a5ae-bf6c9c74fc14_disk.config">
Jan 20 09:51:10 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:51:10 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:51:10 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:cb:72:0c"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:      <target dev="tap2c9f3e71-25"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:51:10 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/91701d8b-36b9-42fe-a5ae-bf6c9c74fc14/console.log" append="off"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    <input type="keyboard" bus="usb"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:51:10 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:51:10 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:51:10 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:51:10 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:51:10 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:51:10 np0005588920 nova_compute[226886]: 2026-01-20 14:51:10.302 226890 DEBUG nova.virt.libvirt.driver [None req-b996dbc6-f98a-4c66-a2d8-76d123f83f9f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] skipping disk for instance-0000006f as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:51:10 np0005588920 nova_compute[226886]: 2026-01-20 14:51:10.302 226890 DEBUG nova.virt.libvirt.driver [None req-b996dbc6-f98a-4c66-a2d8-76d123f83f9f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] skipping disk for instance-0000006f as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:51:10 np0005588920 nova_compute[226886]: 2026-01-20 14:51:10.304 226890 DEBUG nova.virt.libvirt.vif [None req-b996dbc6-f98a-4c66-a2d8-76d123f83f9f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:50:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1254775729',display_name='tempest-ServerActionsTestJSON-server-1254775729',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1254775729',id=111,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLgEzx5mLsSqRL7L9WKOzM+WdeJ40U103wY9H3VMZ41G/sN5tQtQSt9lXWKTyc6pt00bfKD0E9GPugNMpy+dzSSpK23o3CgadkzAfAvjQCeCPOSp3fX13FGApomGd1HRCQ==',key_name='tempest-keypair-1602241722',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:50:40Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='1c5f03d46c0c4162a3b2f1530850bb6c',ramdisk_id='',reservation_id='r-f09hwh77',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1020442335',owner_user_name='tempest-ServerActionsTestJSON-1020442335-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:51:04Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e9278fdb9e645b7938f3edb20c4d3cf',uuid=91701d8b-36b9-42fe-a5ae-bf6c9c74fc14,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "2c9f3e71-2562-4ae0-bf22-d56553a40405", "address": "fa:16:3e:cb:72:0c", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": 
[], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c9f3e71-25", "ovs_interfaceid": "2c9f3e71-2562-4ae0-bf22-d56553a40405", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:51:10 np0005588920 nova_compute[226886]: 2026-01-20 14:51:10.304 226890 DEBUG nova.network.os_vif_util [None req-b996dbc6-f98a-4c66-a2d8-76d123f83f9f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converting VIF {"id": "2c9f3e71-2562-4ae0-bf22-d56553a40405", "address": "fa:16:3e:cb:72:0c", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c9f3e71-25", "ovs_interfaceid": "2c9f3e71-2562-4ae0-bf22-d56553a40405", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:51:10 np0005588920 nova_compute[226886]: 2026-01-20 14:51:10.306 226890 DEBUG nova.network.os_vif_util [None req-b996dbc6-f98a-4c66-a2d8-76d123f83f9f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cb:72:0c,bridge_name='br-int',has_traffic_filtering=True,id=2c9f3e71-2562-4ae0-bf22-d56553a40405,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c9f3e71-25') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:51:10 np0005588920 nova_compute[226886]: 2026-01-20 14:51:10.306 226890 DEBUG os_vif [None req-b996dbc6-f98a-4c66-a2d8-76d123f83f9f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cb:72:0c,bridge_name='br-int',has_traffic_filtering=True,id=2c9f3e71-2562-4ae0-bf22-d56553a40405,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c9f3e71-25') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:51:10 np0005588920 nova_compute[226886]: 2026-01-20 14:51:10.308 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:10 np0005588920 nova_compute[226886]: 2026-01-20 14:51:10.309 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:51:10 np0005588920 nova_compute[226886]: 2026-01-20 14:51:10.310 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:51:10 np0005588920 nova_compute[226886]: 2026-01-20 14:51:10.314 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:10 np0005588920 nova_compute[226886]: 2026-01-20 14:51:10.314 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2c9f3e71-25, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:51:10 np0005588920 nova_compute[226886]: 2026-01-20 14:51:10.315 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2c9f3e71-25, col_values=(('external_ids', {'iface-id': '2c9f3e71-2562-4ae0-bf22-d56553a40405', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cb:72:0c', 'vm-uuid': '91701d8b-36b9-42fe-a5ae-bf6c9c74fc14'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:51:10 np0005588920 nova_compute[226886]: 2026-01-20 14:51:10.317 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:10 np0005588920 NetworkManager[49076]: <info>  [1768920670.3185] manager: (tap2c9f3e71-25): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/256)
Jan 20 09:51:10 np0005588920 nova_compute[226886]: 2026-01-20 14:51:10.320 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:51:10 np0005588920 nova_compute[226886]: 2026-01-20 14:51:10.322 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:10 np0005588920 nova_compute[226886]: 2026-01-20 14:51:10.323 226890 INFO os_vif [None req-b996dbc6-f98a-4c66-a2d8-76d123f83f9f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cb:72:0c,bridge_name='br-int',has_traffic_filtering=True,id=2c9f3e71-2562-4ae0-bf22-d56553a40405,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c9f3e71-25')#033[00m
Jan 20 09:51:10 np0005588920 kernel: tap2c9f3e71-25: entered promiscuous mode
Jan 20 09:51:10 np0005588920 ovn_controller[133971]: 2026-01-20T14:51:10Z|00509|binding|INFO|Claiming lport 2c9f3e71-2562-4ae0-bf22-d56553a40405 for this chassis.
Jan 20 09:51:10 np0005588920 ovn_controller[133971]: 2026-01-20T14:51:10Z|00510|binding|INFO|2c9f3e71-2562-4ae0-bf22-d56553a40405: Claiming fa:16:3e:cb:72:0c 10.100.0.14
Jan 20 09:51:10 np0005588920 NetworkManager[49076]: <info>  [1768920670.4251] manager: (tap2c9f3e71-25): new Tun device (/org/freedesktop/NetworkManager/Devices/257)
Jan 20 09:51:10 np0005588920 nova_compute[226886]: 2026-01-20 14:51:10.423 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:10.430 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cb:72:0c 10.100.0.14'], port_security=['fa:16:3e:cb:72:0c 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '91701d8b-36b9-42fe-a5ae-bf6c9c74fc14', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-762e1859-4db4-4d9e-b66f-d50316f80df4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1c5f03d46c0c4162a3b2f1530850bb6c', 'neutron:revision_number': '5', 'neutron:security_group_ids': '80535eda-fa59-4edc-8e3d-9bfea6517730', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.180'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2474a8ca-bb96-4cae-9133-23419b81a9fc, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=2c9f3e71-2562-4ae0-bf22-d56553a40405) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:10.432 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 2c9f3e71-2562-4ae0-bf22-d56553a40405 in datapath 762e1859-4db4-4d9e-b66f-d50316f80df4 bound to our chassis#033[00m
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:10.434 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 762e1859-4db4-4d9e-b66f-d50316f80df4#033[00m
Jan 20 09:51:10 np0005588920 ovn_controller[133971]: 2026-01-20T14:51:10Z|00511|binding|INFO|Setting lport 2c9f3e71-2562-4ae0-bf22-d56553a40405 ovn-installed in OVS
Jan 20 09:51:10 np0005588920 ovn_controller[133971]: 2026-01-20T14:51:10Z|00512|binding|INFO|Setting lport 2c9f3e71-2562-4ae0-bf22-d56553a40405 up in Southbound
Jan 20 09:51:10 np0005588920 nova_compute[226886]: 2026-01-20 14:51:10.438 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:10 np0005588920 nova_compute[226886]: 2026-01-20 14:51:10.440 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:10 np0005588920 nova_compute[226886]: 2026-01-20 14:51:10.441 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:10 np0005588920 nova_compute[226886]: 2026-01-20 14:51:10.447 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:10.453 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[32da3312-24df-4865-9cb1-45c1f75bddb3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:10.454 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap762e1859-41 in ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:10.456 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap762e1859-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:10.456 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a02361f4-3a0f-43b5-a4f4-d22eeac8e31f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:10.458 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[3b3978b5-b8ea-4471-8dc7-6f35c77ec07a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:10 np0005588920 systemd-machined[196121]: New machine qemu-49-instance-0000006f.
Jan 20 09:51:10 np0005588920 systemd-udevd[266000]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:10.477 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[6d861471-72f4-455d-898b-de9e0e358c04]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:10 np0005588920 systemd[1]: Started Virtual Machine qemu-49-instance-0000006f.
Jan 20 09:51:10 np0005588920 NetworkManager[49076]: <info>  [1768920670.4920] device (tap2c9f3e71-25): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:51:10 np0005588920 NetworkManager[49076]: <info>  [1768920670.4933] device (tap2c9f3e71-25): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:10.494 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5ef99111-fdb5-494d-80f3-7c0917f76bad]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:10.534 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[25a34d15-2d93-4aed-9823-9c225ec21485]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:10 np0005588920 systemd-udevd[266003]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:10.542 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[7fde096e-a835-47ff-9d31-c7465e1921f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:10 np0005588920 NetworkManager[49076]: <info>  [1768920670.5451] manager: (tap762e1859-40): new Veth device (/org/freedesktop/NetworkManager/Devices/258)
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:10.580 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[241e8a85-2113-4bde-b1a2-54b6abd70991]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:10.583 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[7ba1d610-e9e5-498b-adfd-bec7388c641a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:10 np0005588920 NetworkManager[49076]: <info>  [1768920670.6156] device (tap762e1859-40): carrier: link connected
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:10.622 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[faf81604-5ed1-4408-b108-bb01104179b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:10.640 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e09c269c-cc09-4424-bd14-bd8db57faf67]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap762e1859-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:69:f1:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 164], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 571230, 'reachable_time': 18663, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 266031, 'error': None, 'target': 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:10.657 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[78dbf9a7-7d2f-4d3b-bc37-d9bc1ddc76ac]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe69:f1da'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 571230, 'tstamp': 571230}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 266032, 'error': None, 'target': 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:10.673 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[50c53865-a8fa-4b27-9030-0290ea888874]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap762e1859-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:69:f1:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 164], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 571230, 'reachable_time': 18663, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 266033, 'error': None, 'target': 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:10.706 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[fe11ebb6-2aba-4e48-b271-918e35a0f650]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:10.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:10.762 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f0dfdb55-168e-46a1-864f-5dd24c7dfb4b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:10.763 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap762e1859-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:10.763 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:10.764 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap762e1859-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:51:10 np0005588920 nova_compute[226886]: 2026-01-20 14:51:10.766 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:10 np0005588920 NetworkManager[49076]: <info>  [1768920670.7666] manager: (tap762e1859-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/259)
Jan 20 09:51:10 np0005588920 kernel: tap762e1859-40: entered promiscuous mode
Jan 20 09:51:10 np0005588920 nova_compute[226886]: 2026-01-20 14:51:10.768 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:10.769 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap762e1859-40, col_values=(('external_ids', {'iface-id': '9e775c45-1646-436d-a0cb-a5b5ec356e1b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:51:10 np0005588920 nova_compute[226886]: 2026-01-20 14:51:10.769 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:10 np0005588920 ovn_controller[133971]: 2026-01-20T14:51:10Z|00513|binding|INFO|Releasing lport 9e775c45-1646-436d-a0cb-a5b5ec356e1b from this chassis (sb_readonly=0)
Jan 20 09:51:10 np0005588920 nova_compute[226886]: 2026-01-20 14:51:10.781 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:10.782 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/762e1859-4db4-4d9e-b66f-d50316f80df4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/762e1859-4db4-4d9e-b66f-d50316f80df4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:10.783 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ad6f52eb-d243-4aae-adef-268844964b80]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:10.784 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-762e1859-4db4-4d9e-b66f-d50316f80df4
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/762e1859-4db4-4d9e-b66f-d50316f80df4.pid.haproxy
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID 762e1859-4db4-4d9e-b66f-d50316f80df4
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:51:10 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:10.787 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'env', 'PROCESS_TAG=haproxy-762e1859-4db4-4d9e-b66f-d50316f80df4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/762e1859-4db4-4d9e-b66f-d50316f80df4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:51:10 np0005588920 nova_compute[226886]: 2026-01-20 14:51:10.867 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:11 np0005588920 nova_compute[226886]: 2026-01-20 14:51:11.076 226890 DEBUG nova.compute.manager [req-9b8f1a1f-2b87-4949-a471-d3d3011e3db1 req-d825ac00-13da-4561-8a3b-f12303d3030e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Received event network-vif-plugged-2c9f3e71-2562-4ae0-bf22-d56553a40405 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:51:11 np0005588920 nova_compute[226886]: 2026-01-20 14:51:11.076 226890 DEBUG oslo_concurrency.lockutils [req-9b8f1a1f-2b87-4949-a471-d3d3011e3db1 req-d825ac00-13da-4561-8a3b-f12303d3030e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "91701d8b-36b9-42fe-a5ae-bf6c9c74fc14-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:11 np0005588920 nova_compute[226886]: 2026-01-20 14:51:11.076 226890 DEBUG oslo_concurrency.lockutils [req-9b8f1a1f-2b87-4949-a471-d3d3011e3db1 req-d825ac00-13da-4561-8a3b-f12303d3030e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "91701d8b-36b9-42fe-a5ae-bf6c9c74fc14-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:11 np0005588920 nova_compute[226886]: 2026-01-20 14:51:11.076 226890 DEBUG oslo_concurrency.lockutils [req-9b8f1a1f-2b87-4949-a471-d3d3011e3db1 req-d825ac00-13da-4561-8a3b-f12303d3030e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "91701d8b-36b9-42fe-a5ae-bf6c9c74fc14-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:11 np0005588920 nova_compute[226886]: 2026-01-20 14:51:11.077 226890 DEBUG nova.compute.manager [req-9b8f1a1f-2b87-4949-a471-d3d3011e3db1 req-d825ac00-13da-4561-8a3b-f12303d3030e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] No waiting events found dispatching network-vif-plugged-2c9f3e71-2562-4ae0-bf22-d56553a40405 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:51:11 np0005588920 nova_compute[226886]: 2026-01-20 14:51:11.077 226890 WARNING nova.compute.manager [req-9b8f1a1f-2b87-4949-a471-d3d3011e3db1 req-d825ac00-13da-4561-8a3b-f12303d3030e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Received unexpected event network-vif-plugged-2c9f3e71-2562-4ae0-bf22-d56553a40405 for instance with vm_state stopped and task_state powering-on.#033[00m
Jan 20 09:51:11 np0005588920 podman[266065]: 2026-01-20 14:51:11.12075278 +0000 UTC m=+0.044886302 container create fab9b23d20a4c274c8b3737a3aa8ae8d41e76243099e28688085734eeba04500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 20 09:51:11 np0005588920 systemd[1]: Started libpod-conmon-fab9b23d20a4c274c8b3737a3aa8ae8d41e76243099e28688085734eeba04500.scope.
Jan 20 09:51:11 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:51:11 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cef76c54ace23e9244a0b0c451b15407e22f12b9b785e7f3e0cbef7100de0a7a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:51:11 np0005588920 podman[266065]: 2026-01-20 14:51:11.096399021 +0000 UTC m=+0.020532553 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:51:11 np0005588920 podman[266065]: 2026-01-20 14:51:11.198224579 +0000 UTC m=+0.122358111 container init fab9b23d20a4c274c8b3737a3aa8ae8d41e76243099e28688085734eeba04500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:51:11 np0005588920 podman[266065]: 2026-01-20 14:51:11.203172847 +0000 UTC m=+0.127306349 container start fab9b23d20a4c274c8b3737a3aa8ae8d41e76243099e28688085734eeba04500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 20 09:51:11 np0005588920 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[266081]: [NOTICE]   (266101) : New worker (266112) forked
Jan 20 09:51:11 np0005588920 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[266081]: [NOTICE]   (266101) : Loading success.
Jan 20 09:51:11 np0005588920 nova_compute[226886]: 2026-01-20 14:51:11.386 226890 DEBUG nova.virt.libvirt.host [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Removed pending event for 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 20 09:51:11 np0005588920 nova_compute[226886]: 2026-01-20 14:51:11.388 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920671.385845, 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:51:11 np0005588920 nova_compute[226886]: 2026-01-20 14:51:11.388 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:51:11 np0005588920 nova_compute[226886]: 2026-01-20 14:51:11.392 226890 DEBUG nova.compute.manager [None req-b996dbc6-f98a-4c66-a2d8-76d123f83f9f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:51:11 np0005588920 nova_compute[226886]: 2026-01-20 14:51:11.395 226890 INFO nova.virt.libvirt.driver [-] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Instance rebooted successfully.#033[00m
Jan 20 09:51:11 np0005588920 nova_compute[226886]: 2026-01-20 14:51:11.396 226890 DEBUG nova.compute.manager [None req-b996dbc6-f98a-4c66-a2d8-76d123f83f9f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:51:11 np0005588920 nova_compute[226886]: 2026-01-20 14:51:11.422 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:51:11 np0005588920 nova_compute[226886]: 2026-01-20 14:51:11.425 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:51:11 np0005588920 nova_compute[226886]: 2026-01-20 14:51:11.463 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] During sync_power_state the instance has a pending task (powering-on). Skip.#033[00m
Jan 20 09:51:11 np0005588920 nova_compute[226886]: 2026-01-20 14:51:11.464 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920671.387326, 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:51:11 np0005588920 nova_compute[226886]: 2026-01-20 14:51:11.464 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] VM Started (Lifecycle Event)#033[00m
Jan 20 09:51:11 np0005588920 nova_compute[226886]: 2026-01-20 14:51:11.483 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:51:11 np0005588920 nova_compute[226886]: 2026-01-20 14:51:11.487 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:51:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:11.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:51:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:51:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:12.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:51:13 np0005588920 nova_compute[226886]: 2026-01-20 14:51:13.224 226890 DEBUG nova.compute.manager [req-2fa09998-9fca-41ea-b04b-b8e530f27617 req-ab7ca5ed-6e22-4f3c-80c1-b92f4b7d7591 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Received event network-vif-plugged-2c9f3e71-2562-4ae0-bf22-d56553a40405 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:51:13 np0005588920 nova_compute[226886]: 2026-01-20 14:51:13.225 226890 DEBUG oslo_concurrency.lockutils [req-2fa09998-9fca-41ea-b04b-b8e530f27617 req-ab7ca5ed-6e22-4f3c-80c1-b92f4b7d7591 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "91701d8b-36b9-42fe-a5ae-bf6c9c74fc14-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:13 np0005588920 nova_compute[226886]: 2026-01-20 14:51:13.226 226890 DEBUG oslo_concurrency.lockutils [req-2fa09998-9fca-41ea-b04b-b8e530f27617 req-ab7ca5ed-6e22-4f3c-80c1-b92f4b7d7591 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "91701d8b-36b9-42fe-a5ae-bf6c9c74fc14-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:13 np0005588920 nova_compute[226886]: 2026-01-20 14:51:13.226 226890 DEBUG oslo_concurrency.lockutils [req-2fa09998-9fca-41ea-b04b-b8e530f27617 req-ab7ca5ed-6e22-4f3c-80c1-b92f4b7d7591 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "91701d8b-36b9-42fe-a5ae-bf6c9c74fc14-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:13 np0005588920 nova_compute[226886]: 2026-01-20 14:51:13.226 226890 DEBUG nova.compute.manager [req-2fa09998-9fca-41ea-b04b-b8e530f27617 req-ab7ca5ed-6e22-4f3c-80c1-b92f4b7d7591 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] No waiting events found dispatching network-vif-plugged-2c9f3e71-2562-4ae0-bf22-d56553a40405 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:51:13 np0005588920 nova_compute[226886]: 2026-01-20 14:51:13.227 226890 WARNING nova.compute.manager [req-2fa09998-9fca-41ea-b04b-b8e530f27617 req-ab7ca5ed-6e22-4f3c-80c1-b92f4b7d7591 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Received unexpected event network-vif-plugged-2c9f3e71-2562-4ae0-bf22-d56553a40405 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:51:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:13.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:14 np0005588920 nova_compute[226886]: 2026-01-20 14:51:14.203 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768920659.2027454, f444ccf6-5adb-489a-b174-7450017a351b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:51:14 np0005588920 nova_compute[226886]: 2026-01-20 14:51:14.204 226890 INFO nova.compute.manager [-] [instance: f444ccf6-5adb-489a-b174-7450017a351b] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:51:14 np0005588920 nova_compute[226886]: 2026-01-20 14:51:14.223 226890 DEBUG oslo_concurrency.lockutils [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Acquiring lock "7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:14 np0005588920 nova_compute[226886]: 2026-01-20 14:51:14.224 226890 DEBUG oslo_concurrency.lockutils [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:14 np0005588920 nova_compute[226886]: 2026-01-20 14:51:14.248 226890 DEBUG nova.compute.manager [None req-71ce919d-9f1f-442a-99b4-633ae41b151f - - - - - -] [instance: f444ccf6-5adb-489a-b174-7450017a351b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:51:14 np0005588920 nova_compute[226886]: 2026-01-20 14:51:14.272 226890 DEBUG nova.compute.manager [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:51:14 np0005588920 nova_compute[226886]: 2026-01-20 14:51:14.371 226890 DEBUG oslo_concurrency.lockutils [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:14 np0005588920 nova_compute[226886]: 2026-01-20 14:51:14.372 226890 DEBUG oslo_concurrency.lockutils [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:14 np0005588920 nova_compute[226886]: 2026-01-20 14:51:14.379 226890 DEBUG nova.virt.hardware [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:51:14 np0005588920 nova_compute[226886]: 2026-01-20 14:51:14.379 226890 INFO nova.compute.claims [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 09:51:14 np0005588920 nova_compute[226886]: 2026-01-20 14:51:14.533 226890 DEBUG oslo_concurrency.processutils [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:51:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:14.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:14.945 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=36, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=35) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:51:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:14.946 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:51:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:51:14 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1484244040' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:51:14 np0005588920 nova_compute[226886]: 2026-01-20 14:51:14.946 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:14 np0005588920 nova_compute[226886]: 2026-01-20 14:51:14.967 226890 DEBUG oslo_concurrency.processutils [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:51:14 np0005588920 nova_compute[226886]: 2026-01-20 14:51:14.974 226890 DEBUG nova.compute.provider_tree [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:51:14 np0005588920 nova_compute[226886]: 2026-01-20 14:51:14.992 226890 DEBUG nova.scheduler.client.report [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:51:15 np0005588920 podman[266158]: 2026-01-20 14:51:15.003957948 +0000 UTC m=+0.089519356 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 20 09:51:15 np0005588920 nova_compute[226886]: 2026-01-20 14:51:15.021 226890 DEBUG oslo_concurrency.lockutils [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.649s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:15 np0005588920 nova_compute[226886]: 2026-01-20 14:51:15.022 226890 DEBUG nova.compute.manager [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:51:15 np0005588920 nova_compute[226886]: 2026-01-20 14:51:15.183 226890 DEBUG nova.compute.manager [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:51:15 np0005588920 nova_compute[226886]: 2026-01-20 14:51:15.183 226890 DEBUG nova.network.neutron [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:51:15 np0005588920 nova_compute[226886]: 2026-01-20 14:51:15.258 226890 INFO nova.virt.libvirt.driver [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:51:15 np0005588920 nova_compute[226886]: 2026-01-20 14:51:15.302 226890 DEBUG nova.compute.manager [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:51:15 np0005588920 nova_compute[226886]: 2026-01-20 14:51:15.317 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:15 np0005588920 nova_compute[226886]: 2026-01-20 14:51:15.611 226890 DEBUG nova.compute.manager [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:51:15 np0005588920 nova_compute[226886]: 2026-01-20 14:51:15.613 226890 DEBUG nova.virt.libvirt.driver [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:51:15 np0005588920 nova_compute[226886]: 2026-01-20 14:51:15.614 226890 INFO nova.virt.libvirt.driver [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Creating image(s)#033[00m
Jan 20 09:51:15 np0005588920 nova_compute[226886]: 2026-01-20 14:51:15.637 226890 DEBUG nova.storage.rbd_utils [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] rbd image 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:51:15 np0005588920 nova_compute[226886]: 2026-01-20 14:51:15.661 226890 DEBUG nova.storage.rbd_utils [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] rbd image 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:51:15 np0005588920 nova_compute[226886]: 2026-01-20 14:51:15.683 226890 DEBUG nova.storage.rbd_utils [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] rbd image 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:51:15 np0005588920 nova_compute[226886]: 2026-01-20 14:51:15.685 226890 DEBUG oslo_concurrency.processutils [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:51:15 np0005588920 nova_compute[226886]: 2026-01-20 14:51:15.724 226890 DEBUG nova.policy [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd85d286ce6224326a0f4a15a06afbfea', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0a29915e0dd2403fbd7b7e847696b00a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:51:15 np0005588920 nova_compute[226886]: 2026-01-20 14:51:15.746 226890 DEBUG oslo_concurrency.processutils [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:51:15 np0005588920 nova_compute[226886]: 2026-01-20 14:51:15.746 226890 DEBUG oslo_concurrency.lockutils [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:15 np0005588920 nova_compute[226886]: 2026-01-20 14:51:15.747 226890 DEBUG oslo_concurrency.lockutils [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:15 np0005588920 nova_compute[226886]: 2026-01-20 14:51:15.747 226890 DEBUG oslo_concurrency.lockutils [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:15 np0005588920 nova_compute[226886]: 2026-01-20 14:51:15.767 226890 DEBUG nova.storage.rbd_utils [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] rbd image 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:51:15 np0005588920 nova_compute[226886]: 2026-01-20 14:51:15.770 226890 DEBUG oslo_concurrency.processutils [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:51:15 np0005588920 nova_compute[226886]: 2026-01-20 14:51:15.916 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:15.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:16 np0005588920 nova_compute[226886]: 2026-01-20 14:51:16.091 226890 DEBUG oslo_concurrency.processutils [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.320s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:51:16 np0005588920 nova_compute[226886]: 2026-01-20 14:51:16.155 226890 DEBUG nova.storage.rbd_utils [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] resizing rbd image 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:51:16 np0005588920 nova_compute[226886]: 2026-01-20 14:51:16.239 226890 DEBUG nova.objects.instance [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lazy-loading 'migration_context' on Instance uuid 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:51:16 np0005588920 nova_compute[226886]: 2026-01-20 14:51:16.281 226890 DEBUG nova.virt.libvirt.driver [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:51:16 np0005588920 nova_compute[226886]: 2026-01-20 14:51:16.282 226890 DEBUG nova.virt.libvirt.driver [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Ensure instance console log exists: /var/lib/nova/instances/7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:51:16 np0005588920 nova_compute[226886]: 2026-01-20 14:51:16.283 226890 DEBUG oslo_concurrency.lockutils [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:16 np0005588920 nova_compute[226886]: 2026-01-20 14:51:16.283 226890 DEBUG oslo_concurrency.lockutils [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:16 np0005588920 nova_compute[226886]: 2026-01-20 14:51:16.284 226890 DEBUG oslo_concurrency.lockutils [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:16.452 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:16.453 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:16.454 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:16 np0005588920 ovn_controller[133971]: 2026-01-20T14:51:16Z|00514|binding|INFO|Releasing lport 9e775c45-1646-436d-a0cb-a5b5ec356e1b from this chassis (sb_readonly=0)
Jan 20 09:51:16 np0005588920 nova_compute[226886]: 2026-01-20 14:51:16.561 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:16 np0005588920 nova_compute[226886]: 2026-01-20 14:51:16.707 226890 DEBUG nova.network.neutron [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Successfully created port: 362a0992-4e48-4999-a396-29fc2957fa09 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:51:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:16.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:51:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:17.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:18 np0005588920 nova_compute[226886]: 2026-01-20 14:51:18.015 226890 DEBUG nova.objects.instance [None req-74c71026-218b-406b-b946-5e61ad5602e8 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'pci_devices' on Instance uuid 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:51:18 np0005588920 nova_compute[226886]: 2026-01-20 14:51:18.064 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920678.0645528, 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:51:18 np0005588920 nova_compute[226886]: 2026-01-20 14:51:18.065 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:51:18 np0005588920 nova_compute[226886]: 2026-01-20 14:51:18.077 226890 DEBUG nova.network.neutron [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Successfully updated port: 362a0992-4e48-4999-a396-29fc2957fa09 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:51:18 np0005588920 nova_compute[226886]: 2026-01-20 14:51:18.183 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:51:18 np0005588920 nova_compute[226886]: 2026-01-20 14:51:18.185 226890 DEBUG oslo_concurrency.lockutils [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Acquiring lock "refresh_cache-7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:51:18 np0005588920 nova_compute[226886]: 2026-01-20 14:51:18.185 226890 DEBUG oslo_concurrency.lockutils [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Acquired lock "refresh_cache-7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:51:18 np0005588920 nova_compute[226886]: 2026-01-20 14:51:18.185 226890 DEBUG nova.network.neutron [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:51:18 np0005588920 nova_compute[226886]: 2026-01-20 14:51:18.190 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:51:18 np0005588920 nova_compute[226886]: 2026-01-20 14:51:18.282 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] During sync_power_state the instance has a pending task (suspending). Skip.#033[00m
Jan 20 09:51:18 np0005588920 nova_compute[226886]: 2026-01-20 14:51:18.461 226890 DEBUG nova.compute.manager [req-3b8a970e-65b5-4101-91eb-643d759fc8ed req-76ba2e7c-b415-46a8-b3a3-7afe30d7040f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Received event network-changed-362a0992-4e48-4999-a396-29fc2957fa09 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:51:18 np0005588920 nova_compute[226886]: 2026-01-20 14:51:18.462 226890 DEBUG nova.compute.manager [req-3b8a970e-65b5-4101-91eb-643d759fc8ed req-76ba2e7c-b415-46a8-b3a3-7afe30d7040f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Refreshing instance network info cache due to event network-changed-362a0992-4e48-4999-a396-29fc2957fa09. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:51:18 np0005588920 nova_compute[226886]: 2026-01-20 14:51:18.462 226890 DEBUG oslo_concurrency.lockutils [req-3b8a970e-65b5-4101-91eb-643d759fc8ed req-76ba2e7c-b415-46a8-b3a3-7afe30d7040f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:51:18 np0005588920 nova_compute[226886]: 2026-01-20 14:51:18.489 226890 DEBUG nova.network.neutron [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:51:18 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 09:51:18 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/35703789' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 09:51:18 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 09:51:18 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/35703789' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 09:51:18 np0005588920 kernel: tap2c9f3e71-25 (unregistering): left promiscuous mode
Jan 20 09:51:18 np0005588920 NetworkManager[49076]: <info>  [1768920678.5826] device (tap2c9f3e71-25): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:51:18 np0005588920 nova_compute[226886]: 2026-01-20 14:51:18.589 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:18 np0005588920 ovn_controller[133971]: 2026-01-20T14:51:18Z|00515|binding|INFO|Releasing lport 2c9f3e71-2562-4ae0-bf22-d56553a40405 from this chassis (sb_readonly=0)
Jan 20 09:51:18 np0005588920 ovn_controller[133971]: 2026-01-20T14:51:18Z|00516|binding|INFO|Setting lport 2c9f3e71-2562-4ae0-bf22-d56553a40405 down in Southbound
Jan 20 09:51:18 np0005588920 ovn_controller[133971]: 2026-01-20T14:51:18Z|00517|binding|INFO|Removing iface tap2c9f3e71-25 ovn-installed in OVS
Jan 20 09:51:18 np0005588920 nova_compute[226886]: 2026-01-20 14:51:18.592 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:18 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:18.600 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cb:72:0c 10.100.0.14'], port_security=['fa:16:3e:cb:72:0c 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '91701d8b-36b9-42fe-a5ae-bf6c9c74fc14', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-762e1859-4db4-4d9e-b66f-d50316f80df4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1c5f03d46c0c4162a3b2f1530850bb6c', 'neutron:revision_number': '6', 'neutron:security_group_ids': '80535eda-fa59-4edc-8e3d-9bfea6517730', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.180', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2474a8ca-bb96-4cae-9133-23419b81a9fc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=2c9f3e71-2562-4ae0-bf22-d56553a40405) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:51:18 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:18.601 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 2c9f3e71-2562-4ae0-bf22-d56553a40405 in datapath 762e1859-4db4-4d9e-b66f-d50316f80df4 unbound from our chassis#033[00m
Jan 20 09:51:18 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:18.602 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 762e1859-4db4-4d9e-b66f-d50316f80df4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:51:18 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:18.603 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[25d0736e-23ae-4e54-85f7-8ae546770714]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:18 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:18.604 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 namespace which is not needed anymore#033[00m
Jan 20 09:51:18 np0005588920 nova_compute[226886]: 2026-01-20 14:51:18.610 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:18 np0005588920 systemd[1]: machine-qemu\x2d49\x2dinstance\x2d0000006f.scope: Deactivated successfully.
Jan 20 09:51:18 np0005588920 systemd[1]: machine-qemu\x2d49\x2dinstance\x2d0000006f.scope: Consumed 7.932s CPU time.
Jan 20 09:51:18 np0005588920 systemd-machined[196121]: Machine qemu-49-instance-0000006f terminated.
Jan 20 09:51:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:18.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:18 np0005588920 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[266081]: [NOTICE]   (266101) : haproxy version is 2.8.14-c23fe91
Jan 20 09:51:18 np0005588920 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[266081]: [NOTICE]   (266101) : path to executable is /usr/sbin/haproxy
Jan 20 09:51:18 np0005588920 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[266081]: [WARNING]  (266101) : Exiting Master process...
Jan 20 09:51:18 np0005588920 nova_compute[226886]: 2026-01-20 14:51:18.768 226890 DEBUG nova.compute.manager [None req-74c71026-218b-406b-b946-5e61ad5602e8 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:51:18 np0005588920 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[266081]: [ALERT]    (266101) : Current worker (266112) exited with code 143 (Terminated)
Jan 20 09:51:18 np0005588920 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[266081]: [WARNING]  (266101) : All workers exited. Exiting... (0)
Jan 20 09:51:18 np0005588920 systemd[1]: libpod-fab9b23d20a4c274c8b3737a3aa8ae8d41e76243099e28688085734eeba04500.scope: Deactivated successfully.
Jan 20 09:51:18 np0005588920 podman[266380]: 2026-01-20 14:51:18.779468465 +0000 UTC m=+0.083611412 container died fab9b23d20a4c274c8b3737a3aa8ae8d41e76243099e28688085734eeba04500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 20 09:51:18 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fab9b23d20a4c274c8b3737a3aa8ae8d41e76243099e28688085734eeba04500-userdata-shm.mount: Deactivated successfully.
Jan 20 09:51:18 np0005588920 systemd[1]: var-lib-containers-storage-overlay-cef76c54ace23e9244a0b0c451b15407e22f12b9b785e7f3e0cbef7100de0a7a-merged.mount: Deactivated successfully.
Jan 20 09:51:18 np0005588920 podman[266380]: 2026-01-20 14:51:18.820320163 +0000 UTC m=+0.124463090 container cleanup fab9b23d20a4c274c8b3737a3aa8ae8d41e76243099e28688085734eeba04500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:51:18 np0005588920 systemd[1]: libpod-conmon-fab9b23d20a4c274c8b3737a3aa8ae8d41e76243099e28688085734eeba04500.scope: Deactivated successfully.
Jan 20 09:51:18 np0005588920 podman[266417]: 2026-01-20 14:51:18.877516088 +0000 UTC m=+0.037717753 container remove fab9b23d20a4c274c8b3737a3aa8ae8d41e76243099e28688085734eeba04500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 20 09:51:18 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:18.882 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[71397ae3-aec1-40cb-9772-abf36ed0b5cb]: (4, ('Tue Jan 20 02:51:18 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 (fab9b23d20a4c274c8b3737a3aa8ae8d41e76243099e28688085734eeba04500)\nfab9b23d20a4c274c8b3737a3aa8ae8d41e76243099e28688085734eeba04500\nTue Jan 20 02:51:18 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 (fab9b23d20a4c274c8b3737a3aa8ae8d41e76243099e28688085734eeba04500)\nfab9b23d20a4c274c8b3737a3aa8ae8d41e76243099e28688085734eeba04500\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:18 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:18.884 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2071d989-832b-417c-ae9b-54f50f9a9c5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:18 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:18.884 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap762e1859-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:51:18 np0005588920 nova_compute[226886]: 2026-01-20 14:51:18.886 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:18 np0005588920 kernel: tap762e1859-40: left promiscuous mode
Jan 20 09:51:18 np0005588920 nova_compute[226886]: 2026-01-20 14:51:18.898 226890 DEBUG nova.compute.manager [req-de738b80-26b5-494c-9e23-8c8fa703e4f6 req-6588c20f-ac81-4ef0-abe2-c495e690a56e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Received event network-vif-unplugged-2c9f3e71-2562-4ae0-bf22-d56553a40405 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:51:18 np0005588920 nova_compute[226886]: 2026-01-20 14:51:18.899 226890 DEBUG oslo_concurrency.lockutils [req-de738b80-26b5-494c-9e23-8c8fa703e4f6 req-6588c20f-ac81-4ef0-abe2-c495e690a56e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "91701d8b-36b9-42fe-a5ae-bf6c9c74fc14-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:18 np0005588920 nova_compute[226886]: 2026-01-20 14:51:18.900 226890 DEBUG oslo_concurrency.lockutils [req-de738b80-26b5-494c-9e23-8c8fa703e4f6 req-6588c20f-ac81-4ef0-abe2-c495e690a56e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "91701d8b-36b9-42fe-a5ae-bf6c9c74fc14-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:18 np0005588920 nova_compute[226886]: 2026-01-20 14:51:18.901 226890 DEBUG oslo_concurrency.lockutils [req-de738b80-26b5-494c-9e23-8c8fa703e4f6 req-6588c20f-ac81-4ef0-abe2-c495e690a56e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "91701d8b-36b9-42fe-a5ae-bf6c9c74fc14-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:18 np0005588920 nova_compute[226886]: 2026-01-20 14:51:18.901 226890 DEBUG nova.compute.manager [req-de738b80-26b5-494c-9e23-8c8fa703e4f6 req-6588c20f-ac81-4ef0-abe2-c495e690a56e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] No waiting events found dispatching network-vif-unplugged-2c9f3e71-2562-4ae0-bf22-d56553a40405 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:51:18 np0005588920 nova_compute[226886]: 2026-01-20 14:51:18.902 226890 WARNING nova.compute.manager [req-de738b80-26b5-494c-9e23-8c8fa703e4f6 req-6588c20f-ac81-4ef0-abe2-c495e690a56e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Received unexpected event network-vif-unplugged-2c9f3e71-2562-4ae0-bf22-d56553a40405 for instance with vm_state suspended and task_state None.#033[00m
Jan 20 09:51:18 np0005588920 nova_compute[226886]: 2026-01-20 14:51:18.904 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:18 np0005588920 nova_compute[226886]: 2026-01-20 14:51:18.906 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:18 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:18.907 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[96704719-1a01-41aa-8008-b979f5d52b70]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:18 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:18.923 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[fcce8cd2-a6b5-41ae-a5f7-343db04fdadb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:18 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:18.924 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[1a5bca22-ea85-40c6-967c-ffc846873123]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:18 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:18.938 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e202b189-7bc6-402a-8ae0-3a4a92b10902]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 571220, 'reachable_time': 24091, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 266436, 'error': None, 'target': 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:18 np0005588920 systemd[1]: run-netns-ovnmeta\x2d762e1859\x2d4db4\x2d4d9e\x2db66f\x2dd50316f80df4.mount: Deactivated successfully.
Jan 20 09:51:18 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:18.940 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:51:18 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:18.940 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[12ad578c-f2fb-4955-ba8f-cfbea25a1209]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:19 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e257 e257: 3 total, 3 up, 3 in
Jan 20 09:51:19 np0005588920 nova_compute[226886]: 2026-01-20 14:51:19.650 226890 DEBUG nova.network.neutron [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Updating instance_info_cache with network_info: [{"id": "362a0992-4e48-4999-a396-29fc2957fa09", "address": "fa:16:3e:83:1f:c0", "network": {"id": "79184781-1f23-4584-87de-08e262242488", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a29915e0dd2403fbd7b7e847696b00a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap362a0992-4e", "ovs_interfaceid": "362a0992-4e48-4999-a396-29fc2957fa09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:51:19 np0005588920 nova_compute[226886]: 2026-01-20 14:51:19.683 226890 DEBUG oslo_concurrency.lockutils [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Releasing lock "refresh_cache-7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:51:19 np0005588920 nova_compute[226886]: 2026-01-20 14:51:19.684 226890 DEBUG nova.compute.manager [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Instance network_info: |[{"id": "362a0992-4e48-4999-a396-29fc2957fa09", "address": "fa:16:3e:83:1f:c0", "network": {"id": "79184781-1f23-4584-87de-08e262242488", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a29915e0dd2403fbd7b7e847696b00a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap362a0992-4e", "ovs_interfaceid": "362a0992-4e48-4999-a396-29fc2957fa09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:51:19 np0005588920 nova_compute[226886]: 2026-01-20 14:51:19.685 226890 DEBUG oslo_concurrency.lockutils [req-3b8a970e-65b5-4101-91eb-643d759fc8ed req-76ba2e7c-b415-46a8-b3a3-7afe30d7040f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:51:19 np0005588920 nova_compute[226886]: 2026-01-20 14:51:19.685 226890 DEBUG nova.network.neutron [req-3b8a970e-65b5-4101-91eb-643d759fc8ed req-76ba2e7c-b415-46a8-b3a3-7afe30d7040f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Refreshing network info cache for port 362a0992-4e48-4999-a396-29fc2957fa09 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:51:19 np0005588920 nova_compute[226886]: 2026-01-20 14:51:19.691 226890 DEBUG nova.virt.libvirt.driver [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Start _get_guest_xml network_info=[{"id": "362a0992-4e48-4999-a396-29fc2957fa09", "address": "fa:16:3e:83:1f:c0", "network": {"id": "79184781-1f23-4584-87de-08e262242488", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a29915e0dd2403fbd7b7e847696b00a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap362a0992-4e", "ovs_interfaceid": "362a0992-4e48-4999-a396-29fc2957fa09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:51:19 np0005588920 nova_compute[226886]: 2026-01-20 14:51:19.698 226890 WARNING nova.virt.libvirt.driver [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:51:19 np0005588920 nova_compute[226886]: 2026-01-20 14:51:19.706 226890 DEBUG nova.virt.libvirt.host [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:51:19 np0005588920 nova_compute[226886]: 2026-01-20 14:51:19.707 226890 DEBUG nova.virt.libvirt.host [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:51:19 np0005588920 nova_compute[226886]: 2026-01-20 14:51:19.715 226890 DEBUG nova.virt.libvirt.host [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:51:19 np0005588920 nova_compute[226886]: 2026-01-20 14:51:19.716 226890 DEBUG nova.virt.libvirt.host [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:51:19 np0005588920 nova_compute[226886]: 2026-01-20 14:51:19.717 226890 DEBUG nova.virt.libvirt.driver [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:51:19 np0005588920 nova_compute[226886]: 2026-01-20 14:51:19.718 226890 DEBUG nova.virt.hardware [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:51:19 np0005588920 nova_compute[226886]: 2026-01-20 14:51:19.718 226890 DEBUG nova.virt.hardware [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:51:19 np0005588920 nova_compute[226886]: 2026-01-20 14:51:19.718 226890 DEBUG nova.virt.hardware [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:51:19 np0005588920 nova_compute[226886]: 2026-01-20 14:51:19.718 226890 DEBUG nova.virt.hardware [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:51:19 np0005588920 nova_compute[226886]: 2026-01-20 14:51:19.719 226890 DEBUG nova.virt.hardware [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:51:19 np0005588920 nova_compute[226886]: 2026-01-20 14:51:19.719 226890 DEBUG nova.virt.hardware [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:51:19 np0005588920 nova_compute[226886]: 2026-01-20 14:51:19.719 226890 DEBUG nova.virt.hardware [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:51:19 np0005588920 nova_compute[226886]: 2026-01-20 14:51:19.719 226890 DEBUG nova.virt.hardware [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:51:19 np0005588920 nova_compute[226886]: 2026-01-20 14:51:19.719 226890 DEBUG nova.virt.hardware [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:51:19 np0005588920 nova_compute[226886]: 2026-01-20 14:51:19.720 226890 DEBUG nova.virt.hardware [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:51:19 np0005588920 nova_compute[226886]: 2026-01-20 14:51:19.720 226890 DEBUG nova.virt.hardware [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:51:19 np0005588920 nova_compute[226886]: 2026-01-20 14:51:19.722 226890 DEBUG oslo_concurrency.processutils [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:51:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:19.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:20 np0005588920 nova_compute[226886]: 2026-01-20 14:51:20.027 226890 INFO nova.compute.manager [None req-8e00b004-ef93-4102-ac69-7eb6ce068196 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Resuming#033[00m
Jan 20 09:51:20 np0005588920 nova_compute[226886]: 2026-01-20 14:51:20.028 226890 DEBUG nova.objects.instance [None req-8e00b004-ef93-4102-ac69-7eb6ce068196 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'flavor' on Instance uuid 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:51:20 np0005588920 nova_compute[226886]: 2026-01-20 14:51:20.104 226890 DEBUG oslo_concurrency.lockutils [None req-8e00b004-ef93-4102-ac69-7eb6ce068196 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "refresh_cache-91701d8b-36b9-42fe-a5ae-bf6c9c74fc14" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:51:20 np0005588920 nova_compute[226886]: 2026-01-20 14:51:20.104 226890 DEBUG oslo_concurrency.lockutils [None req-8e00b004-ef93-4102-ac69-7eb6ce068196 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquired lock "refresh_cache-91701d8b-36b9-42fe-a5ae-bf6c9c74fc14" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:51:20 np0005588920 nova_compute[226886]: 2026-01-20 14:51:20.105 226890 DEBUG nova.network.neutron [None req-8e00b004-ef93-4102-ac69-7eb6ce068196 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:51:20 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 09:51:20 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2416694841' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 09:51:20 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 09:51:20 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2416694841' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 09:51:20 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:51:20 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2798121711' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:51:20 np0005588920 nova_compute[226886]: 2026-01-20 14:51:20.282 226890 DEBUG oslo_concurrency.processutils [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.560s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:51:20 np0005588920 nova_compute[226886]: 2026-01-20 14:51:20.320 226890 DEBUG nova.storage.rbd_utils [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] rbd image 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:51:20 np0005588920 nova_compute[226886]: 2026-01-20 14:51:20.324 226890 DEBUG oslo_concurrency.processutils [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:51:20 np0005588920 nova_compute[226886]: 2026-01-20 14:51:20.356 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:51:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:20.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:51:20 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:51:20 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3218029730' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:51:20 np0005588920 nova_compute[226886]: 2026-01-20 14:51:20.776 226890 DEBUG oslo_concurrency.processutils [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:51:20 np0005588920 nova_compute[226886]: 2026-01-20 14:51:20.780 226890 DEBUG nova.virt.libvirt.vif [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:51:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-1832777585',display_name='tempest-ServerStableDeviceRescueTest-server-1832777585',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-1832777585',id=112,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0a29915e0dd2403fbd7b7e847696b00a',ramdisk_id='',reservation_id='r-hox96xwk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-129078052',owner_user_name='tempest-ServerStableDeviceRescueTest-129078052-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:51:15Z,user_data=None,user_id='d85d286ce6224326a0f4a15a06afbfea',uuid=7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "362a0992-4e48-4999-a396-29fc2957fa09", "address": "fa:16:3e:83:1f:c0", "network": {"id": "79184781-1f23-4584-87de-08e262242488", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a29915e0dd2403fbd7b7e847696b00a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap362a0992-4e", "ovs_interfaceid": "362a0992-4e48-4999-a396-29fc2957fa09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:51:20 np0005588920 nova_compute[226886]: 2026-01-20 14:51:20.781 226890 DEBUG nova.network.os_vif_util [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Converting VIF {"id": "362a0992-4e48-4999-a396-29fc2957fa09", "address": "fa:16:3e:83:1f:c0", "network": {"id": "79184781-1f23-4584-87de-08e262242488", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a29915e0dd2403fbd7b7e847696b00a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap362a0992-4e", "ovs_interfaceid": "362a0992-4e48-4999-a396-29fc2957fa09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:51:20 np0005588920 nova_compute[226886]: 2026-01-20 14:51:20.783 226890 DEBUG nova.network.os_vif_util [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:83:1f:c0,bridge_name='br-int',has_traffic_filtering=True,id=362a0992-4e48-4999-a396-29fc2957fa09,network=Network(79184781-1f23-4584-87de-08e262242488),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap362a0992-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:51:20 np0005588920 nova_compute[226886]: 2026-01-20 14:51:20.785 226890 DEBUG nova.objects.instance [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lazy-loading 'pci_devices' on Instance uuid 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:51:20 np0005588920 nova_compute[226886]: 2026-01-20 14:51:20.813 226890 DEBUG nova.virt.libvirt.driver [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:51:20 np0005588920 nova_compute[226886]:  <uuid>7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b</uuid>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:  <name>instance-00000070</name>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:51:20 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:      <nova:name>tempest-ServerStableDeviceRescueTest-server-1832777585</nova:name>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:51:19</nova:creationTime>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:51:20 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:        <nova:user uuid="d85d286ce6224326a0f4a15a06afbfea">tempest-ServerStableDeviceRescueTest-129078052-project-member</nova:user>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:        <nova:project uuid="0a29915e0dd2403fbd7b7e847696b00a">tempest-ServerStableDeviceRescueTest-129078052</nova:project>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:        <nova:port uuid="362a0992-4e48-4999-a396-29fc2957fa09">
Jan 20 09:51:20 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:      <entry name="serial">7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b</entry>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:      <entry name="uuid">7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b</entry>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:51:20 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b_disk">
Jan 20 09:51:20 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:51:20 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:51:20 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b_disk.config">
Jan 20 09:51:20 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:51:20 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:51:20 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:83:1f:c0"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:      <target dev="tap362a0992-4e"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:51:20 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b/console.log" append="off"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:51:20 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:51:20 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:51:20 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:51:20 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:51:20 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:51:20 np0005588920 nova_compute[226886]: 2026-01-20 14:51:20.814 226890 DEBUG nova.compute.manager [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Preparing to wait for external event network-vif-plugged-362a0992-4e48-4999-a396-29fc2957fa09 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:51:20 np0005588920 nova_compute[226886]: 2026-01-20 14:51:20.814 226890 DEBUG oslo_concurrency.lockutils [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Acquiring lock "7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:20 np0005588920 nova_compute[226886]: 2026-01-20 14:51:20.815 226890 DEBUG oslo_concurrency.lockutils [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:20 np0005588920 nova_compute[226886]: 2026-01-20 14:51:20.815 226890 DEBUG oslo_concurrency.lockutils [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:20 np0005588920 nova_compute[226886]: 2026-01-20 14:51:20.816 226890 DEBUG nova.virt.libvirt.vif [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:51:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-1832777585',display_name='tempest-ServerStableDeviceRescueTest-server-1832777585',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-1832777585',id=112,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0a29915e0dd2403fbd7b7e847696b00a',ramdisk_id='',reservation_id='r-hox96xwk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-129078052',owner_user_name='tempest-ServerStableDeviceRescueTest-129078052-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:51:15Z,user_data=None,user_id='d85d286ce6224326a0f4a15a06afbfea',uuid=7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "362a0992-4e48-4999-a396-29fc2957fa09", "address": "fa:16:3e:83:1f:c0", "network": {"id": "79184781-1f23-4584-87de-08e262242488", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a29915e0dd2403fbd7b7e847696b00a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap362a0992-4e", "ovs_interfaceid": "362a0992-4e48-4999-a396-29fc2957fa09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:51:20 np0005588920 nova_compute[226886]: 2026-01-20 14:51:20.817 226890 DEBUG nova.network.os_vif_util [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Converting VIF {"id": "362a0992-4e48-4999-a396-29fc2957fa09", "address": "fa:16:3e:83:1f:c0", "network": {"id": "79184781-1f23-4584-87de-08e262242488", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a29915e0dd2403fbd7b7e847696b00a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap362a0992-4e", "ovs_interfaceid": "362a0992-4e48-4999-a396-29fc2957fa09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:51:20 np0005588920 nova_compute[226886]: 2026-01-20 14:51:20.818 226890 DEBUG nova.network.os_vif_util [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:83:1f:c0,bridge_name='br-int',has_traffic_filtering=True,id=362a0992-4e48-4999-a396-29fc2957fa09,network=Network(79184781-1f23-4584-87de-08e262242488),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap362a0992-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:51:20 np0005588920 nova_compute[226886]: 2026-01-20 14:51:20.819 226890 DEBUG os_vif [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:83:1f:c0,bridge_name='br-int',has_traffic_filtering=True,id=362a0992-4e48-4999-a396-29fc2957fa09,network=Network(79184781-1f23-4584-87de-08e262242488),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap362a0992-4e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:51:20 np0005588920 nova_compute[226886]: 2026-01-20 14:51:20.820 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:20 np0005588920 nova_compute[226886]: 2026-01-20 14:51:20.820 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:51:20 np0005588920 nova_compute[226886]: 2026-01-20 14:51:20.821 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:51:20 np0005588920 nova_compute[226886]: 2026-01-20 14:51:20.825 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:20 np0005588920 nova_compute[226886]: 2026-01-20 14:51:20.826 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap362a0992-4e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:51:20 np0005588920 nova_compute[226886]: 2026-01-20 14:51:20.826 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap362a0992-4e, col_values=(('external_ids', {'iface-id': '362a0992-4e48-4999-a396-29fc2957fa09', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:83:1f:c0', 'vm-uuid': '7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:51:20 np0005588920 nova_compute[226886]: 2026-01-20 14:51:20.829 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:20 np0005588920 NetworkManager[49076]: <info>  [1768920680.8303] manager: (tap362a0992-4e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/260)
Jan 20 09:51:20 np0005588920 nova_compute[226886]: 2026-01-20 14:51:20.831 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:51:20 np0005588920 nova_compute[226886]: 2026-01-20 14:51:20.838 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:20 np0005588920 nova_compute[226886]: 2026-01-20 14:51:20.840 226890 INFO os_vif [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:83:1f:c0,bridge_name='br-int',has_traffic_filtering=True,id=362a0992-4e48-4999-a396-29fc2957fa09,network=Network(79184781-1f23-4584-87de-08e262242488),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap362a0992-4e')#033[00m
Jan 20 09:51:20 np0005588920 nova_compute[226886]: 2026-01-20 14:51:20.907 226890 DEBUG nova.virt.libvirt.driver [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:51:20 np0005588920 nova_compute[226886]: 2026-01-20 14:51:20.907 226890 DEBUG nova.virt.libvirt.driver [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:51:20 np0005588920 nova_compute[226886]: 2026-01-20 14:51:20.908 226890 DEBUG nova.virt.libvirt.driver [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] No VIF found with MAC fa:16:3e:83:1f:c0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:51:20 np0005588920 nova_compute[226886]: 2026-01-20 14:51:20.909 226890 INFO nova.virt.libvirt.driver [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Using config drive#033[00m
Jan 20 09:51:20 np0005588920 nova_compute[226886]: 2026-01-20 14:51:20.945 226890 DEBUG nova.storage.rbd_utils [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] rbd image 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:51:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:20.948 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '36'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:51:20 np0005588920 nova_compute[226886]: 2026-01-20 14:51:20.953 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:21 np0005588920 nova_compute[226886]: 2026-01-20 14:51:21.036 226890 DEBUG nova.compute.manager [req-580508bd-b039-4018-9cd2-46302e0086b6 req-cdc66ab4-5f90-40a2-8118-b43b3473f041 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Received event network-vif-plugged-2c9f3e71-2562-4ae0-bf22-d56553a40405 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:51:21 np0005588920 nova_compute[226886]: 2026-01-20 14:51:21.037 226890 DEBUG oslo_concurrency.lockutils [req-580508bd-b039-4018-9cd2-46302e0086b6 req-cdc66ab4-5f90-40a2-8118-b43b3473f041 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "91701d8b-36b9-42fe-a5ae-bf6c9c74fc14-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:21 np0005588920 nova_compute[226886]: 2026-01-20 14:51:21.037 226890 DEBUG oslo_concurrency.lockutils [req-580508bd-b039-4018-9cd2-46302e0086b6 req-cdc66ab4-5f90-40a2-8118-b43b3473f041 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "91701d8b-36b9-42fe-a5ae-bf6c9c74fc14-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:21 np0005588920 nova_compute[226886]: 2026-01-20 14:51:21.038 226890 DEBUG oslo_concurrency.lockutils [req-580508bd-b039-4018-9cd2-46302e0086b6 req-cdc66ab4-5f90-40a2-8118-b43b3473f041 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "91701d8b-36b9-42fe-a5ae-bf6c9c74fc14-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:21 np0005588920 nova_compute[226886]: 2026-01-20 14:51:21.038 226890 DEBUG nova.compute.manager [req-580508bd-b039-4018-9cd2-46302e0086b6 req-cdc66ab4-5f90-40a2-8118-b43b3473f041 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] No waiting events found dispatching network-vif-plugged-2c9f3e71-2562-4ae0-bf22-d56553a40405 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:51:21 np0005588920 nova_compute[226886]: 2026-01-20 14:51:21.038 226890 WARNING nova.compute.manager [req-580508bd-b039-4018-9cd2-46302e0086b6 req-cdc66ab4-5f90-40a2-8118-b43b3473f041 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Received unexpected event network-vif-plugged-2c9f3e71-2562-4ae0-bf22-d56553a40405 for instance with vm_state suspended and task_state resuming.#033[00m
Jan 20 09:51:21 np0005588920 nova_compute[226886]: 2026-01-20 14:51:21.392 226890 DEBUG nova.network.neutron [req-3b8a970e-65b5-4101-91eb-643d759fc8ed req-76ba2e7c-b415-46a8-b3a3-7afe30d7040f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Updated VIF entry in instance network info cache for port 362a0992-4e48-4999-a396-29fc2957fa09. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:51:21 np0005588920 nova_compute[226886]: 2026-01-20 14:51:21.393 226890 DEBUG nova.network.neutron [req-3b8a970e-65b5-4101-91eb-643d759fc8ed req-76ba2e7c-b415-46a8-b3a3-7afe30d7040f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Updating instance_info_cache with network_info: [{"id": "362a0992-4e48-4999-a396-29fc2957fa09", "address": "fa:16:3e:83:1f:c0", "network": {"id": "79184781-1f23-4584-87de-08e262242488", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a29915e0dd2403fbd7b7e847696b00a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap362a0992-4e", "ovs_interfaceid": "362a0992-4e48-4999-a396-29fc2957fa09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:51:21 np0005588920 nova_compute[226886]: 2026-01-20 14:51:21.428 226890 DEBUG oslo_concurrency.lockutils [req-3b8a970e-65b5-4101-91eb-643d759fc8ed req-76ba2e7c-b415-46a8-b3a3-7afe30d7040f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:51:21 np0005588920 nova_compute[226886]: 2026-01-20 14:51:21.472 226890 INFO nova.virt.libvirt.driver [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Creating config drive at /var/lib/nova/instances/7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b/disk.config#033[00m
Jan 20 09:51:21 np0005588920 nova_compute[226886]: 2026-01-20 14:51:21.476 226890 DEBUG oslo_concurrency.processutils [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpuxtrf_t4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:51:21 np0005588920 nova_compute[226886]: 2026-01-20 14:51:21.620 226890 DEBUG oslo_concurrency.processutils [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpuxtrf_t4" returned: 0 in 0.144s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:51:21 np0005588920 nova_compute[226886]: 2026-01-20 14:51:21.661 226890 DEBUG nova.storage.rbd_utils [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] rbd image 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:51:21 np0005588920 nova_compute[226886]: 2026-01-20 14:51:21.666 226890 DEBUG oslo_concurrency.processutils [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b/disk.config 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:51:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:51:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:21.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:51:21 np0005588920 nova_compute[226886]: 2026-01-20 14:51:21.938 226890 DEBUG nova.network.neutron [None req-8e00b004-ef93-4102-ac69-7eb6ce068196 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Updating instance_info_cache with network_info: [{"id": "2c9f3e71-2562-4ae0-bf22-d56553a40405", "address": "fa:16:3e:cb:72:0c", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c9f3e71-25", "ovs_interfaceid": "2c9f3e71-2562-4ae0-bf22-d56553a40405", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:51:21 np0005588920 nova_compute[226886]: 2026-01-20 14:51:21.958 226890 DEBUG oslo_concurrency.lockutils [None req-8e00b004-ef93-4102-ac69-7eb6ce068196 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Releasing lock "refresh_cache-91701d8b-36b9-42fe-a5ae-bf6c9c74fc14" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:51:21 np0005588920 nova_compute[226886]: 2026-01-20 14:51:21.965 226890 DEBUG nova.virt.libvirt.vif [None req-8e00b004-ef93-4102-ac69-7eb6ce068196 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:50:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1254775729',display_name='tempest-ServerActionsTestJSON-server-1254775729',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1254775729',id=111,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLgEzx5mLsSqRL7L9WKOzM+WdeJ40U103wY9H3VMZ41G/sN5tQtQSt9lXWKTyc6pt00bfKD0E9GPugNMpy+dzSSpK23o3CgadkzAfAvjQCeCPOSp3fX13FGApomGd1HRCQ==',key_name='tempest-keypair-1602241722',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:50:40Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='1c5f03d46c0c4162a3b2f1530850bb6c',ramdisk_id='',reservation_id='r-f09hwh77',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-1020442335',owner_user_name='tempest-ServerActionsTestJSON-1020442335-project-member'},tags=<?>,task_state='resuming',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:51:18Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e9278fdb9e645b7938f3edb20c4d3cf',uuid=91701d8b-36b9-42fe-a5ae-bf6c9c74fc14,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "2c9f3e71-2562-4ae0-bf22-d56553a40405", "address": "fa:16:3e:cb:72:0c", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c9f3e71-25", "ovs_interfaceid": "2c9f3e71-2562-4ae0-bf22-d56553a40405", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:51:21 np0005588920 nova_compute[226886]: 2026-01-20 14:51:21.966 226890 DEBUG nova.network.os_vif_util [None req-8e00b004-ef93-4102-ac69-7eb6ce068196 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converting VIF {"id": "2c9f3e71-2562-4ae0-bf22-d56553a40405", "address": "fa:16:3e:cb:72:0c", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c9f3e71-25", "ovs_interfaceid": "2c9f3e71-2562-4ae0-bf22-d56553a40405", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:51:21 np0005588920 nova_compute[226886]: 2026-01-20 14:51:21.966 226890 DEBUG nova.network.os_vif_util [None req-8e00b004-ef93-4102-ac69-7eb6ce068196 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cb:72:0c,bridge_name='br-int',has_traffic_filtering=True,id=2c9f3e71-2562-4ae0-bf22-d56553a40405,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c9f3e71-25') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:51:21 np0005588920 nova_compute[226886]: 2026-01-20 14:51:21.967 226890 DEBUG os_vif [None req-8e00b004-ef93-4102-ac69-7eb6ce068196 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cb:72:0c,bridge_name='br-int',has_traffic_filtering=True,id=2c9f3e71-2562-4ae0-bf22-d56553a40405,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c9f3e71-25') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:51:21 np0005588920 nova_compute[226886]: 2026-01-20 14:51:21.967 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:21 np0005588920 nova_compute[226886]: 2026-01-20 14:51:21.968 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:51:21 np0005588920 nova_compute[226886]: 2026-01-20 14:51:21.968 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:51:21 np0005588920 nova_compute[226886]: 2026-01-20 14:51:21.970 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:21 np0005588920 nova_compute[226886]: 2026-01-20 14:51:21.970 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2c9f3e71-25, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:51:21 np0005588920 nova_compute[226886]: 2026-01-20 14:51:21.970 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2c9f3e71-25, col_values=(('external_ids', {'iface-id': '2c9f3e71-2562-4ae0-bf22-d56553a40405', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cb:72:0c', 'vm-uuid': '91701d8b-36b9-42fe-a5ae-bf6c9c74fc14'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:51:21 np0005588920 nova_compute[226886]: 2026-01-20 14:51:21.971 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:51:21 np0005588920 nova_compute[226886]: 2026-01-20 14:51:21.971 226890 INFO os_vif [None req-8e00b004-ef93-4102-ac69-7eb6ce068196 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cb:72:0c,bridge_name='br-int',has_traffic_filtering=True,id=2c9f3e71-2562-4ae0-bf22-d56553a40405,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c9f3e71-25')#033[00m
Jan 20 09:51:21 np0005588920 nova_compute[226886]: 2026-01-20 14:51:21.994 226890 DEBUG nova.objects.instance [None req-8e00b004-ef93-4102-ac69-7eb6ce068196 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'numa_topology' on Instance uuid 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:51:22 np0005588920 nova_compute[226886]: 2026-01-20 14:51:22.065 226890 DEBUG oslo_concurrency.processutils [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b/disk.config 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.399s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:51:22 np0005588920 nova_compute[226886]: 2026-01-20 14:51:22.066 226890 INFO nova.virt.libvirt.driver [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Deleting local config drive /var/lib/nova/instances/7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b/disk.config because it was imported into RBD.#033[00m
Jan 20 09:51:22 np0005588920 kernel: tap2c9f3e71-25: entered promiscuous mode
Jan 20 09:51:22 np0005588920 NetworkManager[49076]: <info>  [1768920682.0851] manager: (tap2c9f3e71-25): new Tun device (/org/freedesktop/NetworkManager/Devices/261)
Jan 20 09:51:22 np0005588920 ovn_controller[133971]: 2026-01-20T14:51:22Z|00518|binding|INFO|Claiming lport 2c9f3e71-2562-4ae0-bf22-d56553a40405 for this chassis.
Jan 20 09:51:22 np0005588920 ovn_controller[133971]: 2026-01-20T14:51:22Z|00519|binding|INFO|2c9f3e71-2562-4ae0-bf22-d56553a40405: Claiming fa:16:3e:cb:72:0c 10.100.0.14
Jan 20 09:51:22 np0005588920 nova_compute[226886]: 2026-01-20 14:51:22.087 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:22.100 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cb:72:0c 10.100.0.14'], port_security=['fa:16:3e:cb:72:0c 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '91701d8b-36b9-42fe-a5ae-bf6c9c74fc14', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-762e1859-4db4-4d9e-b66f-d50316f80df4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1c5f03d46c0c4162a3b2f1530850bb6c', 'neutron:revision_number': '7', 'neutron:security_group_ids': '80535eda-fa59-4edc-8e3d-9bfea6517730', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.180'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2474a8ca-bb96-4cae-9133-23419b81a9fc, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=2c9f3e71-2562-4ae0-bf22-d56553a40405) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:22.102 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 2c9f3e71-2562-4ae0-bf22-d56553a40405 in datapath 762e1859-4db4-4d9e-b66f-d50316f80df4 bound to our chassis#033[00m
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:22.103 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 762e1859-4db4-4d9e-b66f-d50316f80df4#033[00m
Jan 20 09:51:22 np0005588920 ovn_controller[133971]: 2026-01-20T14:51:22Z|00520|binding|INFO|Setting lport 2c9f3e71-2562-4ae0-bf22-d56553a40405 ovn-installed in OVS
Jan 20 09:51:22 np0005588920 ovn_controller[133971]: 2026-01-20T14:51:22Z|00521|binding|INFO|Setting lport 2c9f3e71-2562-4ae0-bf22-d56553a40405 up in Southbound
Jan 20 09:51:22 np0005588920 systemd-udevd[266576]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:51:22 np0005588920 nova_compute[226886]: 2026-01-20 14:51:22.116 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:22 np0005588920 nova_compute[226886]: 2026-01-20 14:51:22.121 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:22.122 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[fcbec797-866b-4b4b-b8e8-3bf154ffdf55]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:22.123 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap762e1859-41 in ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:51:22 np0005588920 NetworkManager[49076]: <info>  [1768920682.1270] device (tap2c9f3e71-25): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:51:22 np0005588920 NetworkManager[49076]: <info>  [1768920682.1297] device (tap2c9f3e71-25): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:22.130 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap762e1859-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:22.130 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[cc855411-92f7-4da4-863e-66fda4daffee]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:22.131 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[effbee1c-9659-4eca-9912-de78ae0025d5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:22 np0005588920 kernel: tap362a0992-4e: entered promiscuous mode
Jan 20 09:51:22 np0005588920 systemd-udevd[266587]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:51:22 np0005588920 NetworkManager[49076]: <info>  [1768920682.1370] manager: (tap362a0992-4e): new Tun device (/org/freedesktop/NetworkManager/Devices/262)
Jan 20 09:51:22 np0005588920 nova_compute[226886]: 2026-01-20 14:51:22.138 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:22 np0005588920 ovn_controller[133971]: 2026-01-20T14:51:22Z|00522|binding|INFO|Claiming lport 362a0992-4e48-4999-a396-29fc2957fa09 for this chassis.
Jan 20 09:51:22 np0005588920 ovn_controller[133971]: 2026-01-20T14:51:22Z|00523|binding|INFO|362a0992-4e48-4999-a396-29fc2957fa09: Claiming fa:16:3e:83:1f:c0 10.100.0.5
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:22.149 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[f72f02cc-35d9-4047-9170-b03557851aad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:22 np0005588920 systemd-machined[196121]: New machine qemu-50-instance-0000006f.
Jan 20 09:51:22 np0005588920 ovn_controller[133971]: 2026-01-20T14:51:22Z|00524|binding|INFO|Setting lport 362a0992-4e48-4999-a396-29fc2957fa09 ovn-installed in OVS
Jan 20 09:51:22 np0005588920 NetworkManager[49076]: <info>  [1768920682.1600] device (tap362a0992-4e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:51:22 np0005588920 NetworkManager[49076]: <info>  [1768920682.1608] device (tap362a0992-4e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:51:22 np0005588920 ovn_controller[133971]: 2026-01-20T14:51:22Z|00525|binding|INFO|Setting lport 362a0992-4e48-4999-a396-29fc2957fa09 up in Southbound
Jan 20 09:51:22 np0005588920 nova_compute[226886]: 2026-01-20 14:51:22.156 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:22.161 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:83:1f:c0 10.100.0.5'], port_security=['fa:16:3e:83:1f:c0 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-79184781-1f23-4584-87de-08e262242488', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0a29915e0dd2403fbd7b7e847696b00a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '30ec24b7-15ba-4aeb-9785-539071729f77', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6b73ab05-b29f-401a-84a5-ea1a96103f33, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=362a0992-4e48-4999-a396-29fc2957fa09) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:22.163 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[45bb6936-ac2b-4748-b300-8468e264fd92]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:22 np0005588920 nova_compute[226886]: 2026-01-20 14:51:22.167 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:22 np0005588920 systemd[1]: Started Virtual Machine qemu-50-instance-0000006f.
Jan 20 09:51:22 np0005588920 systemd-machined[196121]: New machine qemu-51-instance-00000070.
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:22.201 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[bca9c17d-9b84-41fd-b38c-ab76dd8d8d11]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:22 np0005588920 systemd[1]: Started Virtual Machine qemu-51-instance-00000070.
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:22.206 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[8f2008a2-b2bd-46f1-bb3e-ff387e982781]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:22 np0005588920 NetworkManager[49076]: <info>  [1768920682.2074] manager: (tap762e1859-40): new Veth device (/org/freedesktop/NetworkManager/Devices/263)
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:22.241 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[978cb914-280b-4c85-ab5b-f805d358ea32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:22.245 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[f4c1942e-f6a3-440a-8044-b5f70495841a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:22 np0005588920 NetworkManager[49076]: <info>  [1768920682.2663] device (tap762e1859-40): carrier: link connected
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:22.272 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[4fff6abd-423e-4e21-9688-b5ad69777b92]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:22.287 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[04c2f3e5-b9c4-4f3a-8033-5bfe8be82fee]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap762e1859-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:69:f1:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 168], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 572395, 'reachable_time': 23327, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 266629, 'error': None, 'target': 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:22.307 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ef6812b1-e950-49f4-946f-67102f36984b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe69:f1da'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 572395, 'tstamp': 572395}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 266631, 'error': None, 'target': 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:22.321 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[30182e2d-ef1b-4f0e-83b5-0cecd0784acd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap762e1859-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:69:f1:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 168], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 572395, 'reachable_time': 23327, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 266632, 'error': None, 'target': 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:22.354 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[66ea4100-16a7-4c96-bec7-6c703718700a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:22.421 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[126f0d87-b310-4c9c-b5ea-7a5102836e63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:22.422 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap762e1859-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:22.423 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:22.423 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap762e1859-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:51:22 np0005588920 nova_compute[226886]: 2026-01-20 14:51:22.425 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:22 np0005588920 kernel: tap762e1859-40: entered promiscuous mode
Jan 20 09:51:22 np0005588920 NetworkManager[49076]: <info>  [1768920682.4261] manager: (tap762e1859-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/264)
Jan 20 09:51:22 np0005588920 nova_compute[226886]: 2026-01-20 14:51:22.427 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:22.433 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap762e1859-40, col_values=(('external_ids', {'iface-id': '9e775c45-1646-436d-a0cb-a5b5ec356e1b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:51:22 np0005588920 nova_compute[226886]: 2026-01-20 14:51:22.434 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:22 np0005588920 ovn_controller[133971]: 2026-01-20T14:51:22Z|00526|binding|INFO|Releasing lport 9e775c45-1646-436d-a0cb-a5b5ec356e1b from this chassis (sb_readonly=0)
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:22.449 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/762e1859-4db4-4d9e-b66f-d50316f80df4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/762e1859-4db4-4d9e-b66f-d50316f80df4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:51:22 np0005588920 nova_compute[226886]: 2026-01-20 14:51:22.449 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:22.454 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[443bbd84-d49f-4409-95fb-8858e934524e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:22.455 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-762e1859-4db4-4d9e-b66f-d50316f80df4
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/762e1859-4db4-4d9e-b66f-d50316f80df4.pid.haproxy
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID 762e1859-4db4-4d9e-b66f-d50316f80df4
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:51:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:22.457 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'env', 'PROCESS_TAG=haproxy-762e1859-4db4-4d9e-b66f-d50316f80df4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/762e1859-4db4-4d9e-b66f-d50316f80df4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:51:22 np0005588920 nova_compute[226886]: 2026-01-20 14:51:22.465 226890 DEBUG nova.compute.manager [req-24c47fe0-4574-4014-8cf9-8200b499b8aa req-50fec8d0-e83c-4fed-b7f6-99410af1befc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Received event network-vif-plugged-362a0992-4e48-4999-a396-29fc2957fa09 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:51:22 np0005588920 nova_compute[226886]: 2026-01-20 14:51:22.466 226890 DEBUG oslo_concurrency.lockutils [req-24c47fe0-4574-4014-8cf9-8200b499b8aa req-50fec8d0-e83c-4fed-b7f6-99410af1befc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:22 np0005588920 nova_compute[226886]: 2026-01-20 14:51:22.466 226890 DEBUG oslo_concurrency.lockutils [req-24c47fe0-4574-4014-8cf9-8200b499b8aa req-50fec8d0-e83c-4fed-b7f6-99410af1befc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:22 np0005588920 nova_compute[226886]: 2026-01-20 14:51:22.467 226890 DEBUG oslo_concurrency.lockutils [req-24c47fe0-4574-4014-8cf9-8200b499b8aa req-50fec8d0-e83c-4fed-b7f6-99410af1befc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:22 np0005588920 nova_compute[226886]: 2026-01-20 14:51:22.467 226890 DEBUG nova.compute.manager [req-24c47fe0-4574-4014-8cf9-8200b499b8aa req-50fec8d0-e83c-4fed-b7f6-99410af1befc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Processing event network-vif-plugged-362a0992-4e48-4999-a396-29fc2957fa09 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:51:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:51:22 np0005588920 nova_compute[226886]: 2026-01-20 14:51:22.656 226890 DEBUG nova.virt.libvirt.host [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Removed pending event for 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 20 09:51:22 np0005588920 nova_compute[226886]: 2026-01-20 14:51:22.657 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920682.6561444, 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:51:22 np0005588920 nova_compute[226886]: 2026-01-20 14:51:22.657 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] VM Started (Lifecycle Event)#033[00m
Jan 20 09:51:22 np0005588920 nova_compute[226886]: 2026-01-20 14:51:22.677 226890 DEBUG nova.compute.manager [None req-8e00b004-ef93-4102-ac69-7eb6ce068196 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:51:22 np0005588920 nova_compute[226886]: 2026-01-20 14:51:22.678 226890 DEBUG nova.objects.instance [None req-8e00b004-ef93-4102-ac69-7eb6ce068196 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'pci_devices' on Instance uuid 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:51:22 np0005588920 nova_compute[226886]: 2026-01-20 14:51:22.682 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:51:22 np0005588920 nova_compute[226886]: 2026-01-20 14:51:22.688 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Synchronizing instance power state after lifecycle event "Started"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:51:22 np0005588920 nova_compute[226886]: 2026-01-20 14:51:22.698 226890 INFO nova.virt.libvirt.driver [-] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Instance running successfully.#033[00m
Jan 20 09:51:22 np0005588920 virtqemud[226436]: argument unsupported: QEMU guest agent is not configured
Jan 20 09:51:22 np0005588920 nova_compute[226886]: 2026-01-20 14:51:22.707 226890 DEBUG nova.virt.libvirt.guest [None req-8e00b004-ef93-4102-ac69-7eb6ce068196 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 20 09:51:22 np0005588920 nova_compute[226886]: 2026-01-20 14:51:22.708 226890 DEBUG nova.compute.manager [None req-8e00b004-ef93-4102-ac69-7eb6ce068196 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:51:22 np0005588920 nova_compute[226886]: 2026-01-20 14:51:22.720 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] During sync_power_state the instance has a pending task (resuming). Skip.#033[00m
Jan 20 09:51:22 np0005588920 nova_compute[226886]: 2026-01-20 14:51:22.720 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920682.6637292, 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:51:22 np0005588920 nova_compute[226886]: 2026-01-20 14:51:22.721 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:51:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:51:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:22.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:51:22 np0005588920 nova_compute[226886]: 2026-01-20 14:51:22.748 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:51:22 np0005588920 nova_compute[226886]: 2026-01-20 14:51:22.752 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:51:22 np0005588920 podman[266706]: 2026-01-20 14:51:22.868661245 +0000 UTC m=+0.060533658 container create eed52451f70360ddef9bb114a1dc606e806713a3315d49579349a90e7ab2d596 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 20 09:51:22 np0005588920 systemd[1]: Started libpod-conmon-eed52451f70360ddef9bb114a1dc606e806713a3315d49579349a90e7ab2d596.scope.
Jan 20 09:51:22 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:51:22 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0f14188490562c0f8a0c96a324dbe61dda8c6a16a7e5016a4d6913ba3977a70e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:51:22 np0005588920 podman[266706]: 2026-01-20 14:51:22.838755551 +0000 UTC m=+0.030627974 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:51:22 np0005588920 podman[266706]: 2026-01-20 14:51:22.949641942 +0000 UTC m=+0.141514355 container init eed52451f70360ddef9bb114a1dc606e806713a3315d49579349a90e7ab2d596 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 20 09:51:22 np0005588920 podman[266706]: 2026-01-20 14:51:22.955597018 +0000 UTC m=+0.147469441 container start eed52451f70360ddef9bb114a1dc606e806713a3315d49579349a90e7ab2d596 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 20 09:51:22 np0005588920 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[266721]: [NOTICE]   (266725) : New worker (266727) forked
Jan 20 09:51:22 np0005588920 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[266721]: [NOTICE]   (266725) : Loading success.
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:23.014 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 362a0992-4e48-4999-a396-29fc2957fa09 in datapath 79184781-1f23-4584-87de-08e262242488 unbound from our chassis#033[00m
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:23.016 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 79184781-1f23-4584-87de-08e262242488#033[00m
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:23.024 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[26fedf5a-909a-48b0-a90a-52206a1d61f8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:23.025 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap79184781-11 in ovnmeta-79184781-1f23-4584-87de-08e262242488 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:23.028 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap79184781-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:23.028 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d5055c80-a23c-4b4c-9bb0-7c42cbfc5ab8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:23.029 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[398e883d-c33d-40e4-90ce-3454de2c36ce]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:23.041 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[6456b709-cbcf-4745-8d33-c4b7c57e62d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:23.062 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[66081d0b-df47-4328-a977-889364d46bde]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:23.089 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[172081f7-b037-4116-b633-e4a72cf2b8d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:23.096 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a88b2463-8eec-4371-9472-a6115c19ca23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:23 np0005588920 systemd-udevd[266619]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:51:23 np0005588920 NetworkManager[49076]: <info>  [1768920683.0977] manager: (tap79184781-10): new Veth device (/org/freedesktop/NetworkManager/Devices/265)
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:23.135 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[7952b82b-0460-4823-a444-d44daed78100]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:23.139 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[eb4d26a9-4ce0-4275-b16a-e9a0511cc65c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:23 np0005588920 NetworkManager[49076]: <info>  [1768920683.1669] device (tap79184781-10): carrier: link connected
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:23.171 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[873d0fad-3bc1-4812-a588-52fae1e325f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:23.188 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[85c31de5-b6e3-47c3-a951-cd90434e6aeb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap79184781-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:38:7c:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 169], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 572485, 'reachable_time': 20250, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 266772, 'error': None, 'target': 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:23.207 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0cda4b9f-dc5d-4f01-9f80-de223589f981]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe38:7c2a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 572485, 'tstamp': 572485}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 266781, 'error': None, 'target': 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:23.228 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f9d79674-7884-4fbc-9a1f-d6e92281195f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap79184781-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:38:7c:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 169], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 572485, 'reachable_time': 20250, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 266785, 'error': None, 'target': 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:23.260 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f119a3a2-47ce-4904-911b-bd3c25f86dde]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:23.308 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[91cf5e3a-083c-4959-b7f2-bd3713bf54b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:23 np0005588920 nova_compute[226886]: 2026-01-20 14:51:23.308 226890 DEBUG nova.compute.manager [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:51:23 np0005588920 nova_compute[226886]: 2026-01-20 14:51:23.309 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920683.3086882, 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:51:23 np0005588920 nova_compute[226886]: 2026-01-20 14:51:23.309 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] VM Started (Lifecycle Event)#033[00m
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:23.310 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap79184781-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:23.310 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:23.310 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap79184781-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:51:23 np0005588920 nova_compute[226886]: 2026-01-20 14:51:23.312 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:23 np0005588920 NetworkManager[49076]: <info>  [1768920683.3136] manager: (tap79184781-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/266)
Jan 20 09:51:23 np0005588920 kernel: tap79184781-10: entered promiscuous mode
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:23.314 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap79184781-10, col_values=(('external_ids', {'iface-id': 'b033e9e6-9781-4424-a20f-7b48a14e2c80'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:51:23 np0005588920 nova_compute[226886]: 2026-01-20 14:51:23.316 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:23 np0005588920 ovn_controller[133971]: 2026-01-20T14:51:23Z|00527|binding|INFO|Releasing lport b033e9e6-9781-4424-a20f-7b48a14e2c80 from this chassis (sb_readonly=0)
Jan 20 09:51:23 np0005588920 nova_compute[226886]: 2026-01-20 14:51:23.316 226890 DEBUG nova.virt.libvirt.driver [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:51:23 np0005588920 nova_compute[226886]: 2026-01-20 14:51:23.320 226890 INFO nova.virt.libvirt.driver [-] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Instance spawned successfully.#033[00m
Jan 20 09:51:23 np0005588920 nova_compute[226886]: 2026-01-20 14:51:23.320 226890 DEBUG nova.virt.libvirt.driver [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:51:23 np0005588920 nova_compute[226886]: 2026-01-20 14:51:23.329 226890 DEBUG nova.compute.manager [req-ae3ee24b-fd6b-4e03-8977-7eee5c8284be req-d5f0bbe6-e0b3-45ef-b04e-051d9ceebaae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Received event network-vif-plugged-2c9f3e71-2562-4ae0-bf22-d56553a40405 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:51:23 np0005588920 nova_compute[226886]: 2026-01-20 14:51:23.329 226890 DEBUG oslo_concurrency.lockutils [req-ae3ee24b-fd6b-4e03-8977-7eee5c8284be req-d5f0bbe6-e0b3-45ef-b04e-051d9ceebaae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "91701d8b-36b9-42fe-a5ae-bf6c9c74fc14-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:23 np0005588920 nova_compute[226886]: 2026-01-20 14:51:23.330 226890 DEBUG oslo_concurrency.lockutils [req-ae3ee24b-fd6b-4e03-8977-7eee5c8284be req-d5f0bbe6-e0b3-45ef-b04e-051d9ceebaae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "91701d8b-36b9-42fe-a5ae-bf6c9c74fc14-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:23 np0005588920 nova_compute[226886]: 2026-01-20 14:51:23.330 226890 DEBUG oslo_concurrency.lockutils [req-ae3ee24b-fd6b-4e03-8977-7eee5c8284be req-d5f0bbe6-e0b3-45ef-b04e-051d9ceebaae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "91701d8b-36b9-42fe-a5ae-bf6c9c74fc14-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:23 np0005588920 nova_compute[226886]: 2026-01-20 14:51:23.330 226890 DEBUG nova.compute.manager [req-ae3ee24b-fd6b-4e03-8977-7eee5c8284be req-d5f0bbe6-e0b3-45ef-b04e-051d9ceebaae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] No waiting events found dispatching network-vif-plugged-2c9f3e71-2562-4ae0-bf22-d56553a40405 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:51:23 np0005588920 nova_compute[226886]: 2026-01-20 14:51:23.330 226890 WARNING nova.compute.manager [req-ae3ee24b-fd6b-4e03-8977-7eee5c8284be req-d5f0bbe6-e0b3-45ef-b04e-051d9ceebaae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Received unexpected event network-vif-plugged-2c9f3e71-2562-4ae0-bf22-d56553a40405 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:51:23 np0005588920 nova_compute[226886]: 2026-01-20 14:51:23.331 226890 DEBUG nova.compute.manager [req-ae3ee24b-fd6b-4e03-8977-7eee5c8284be req-d5f0bbe6-e0b3-45ef-b04e-051d9ceebaae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Received event network-vif-plugged-2c9f3e71-2562-4ae0-bf22-d56553a40405 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:51:23 np0005588920 nova_compute[226886]: 2026-01-20 14:51:23.331 226890 DEBUG oslo_concurrency.lockutils [req-ae3ee24b-fd6b-4e03-8977-7eee5c8284be req-d5f0bbe6-e0b3-45ef-b04e-051d9ceebaae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "91701d8b-36b9-42fe-a5ae-bf6c9c74fc14-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:23 np0005588920 nova_compute[226886]: 2026-01-20 14:51:23.331 226890 DEBUG oslo_concurrency.lockutils [req-ae3ee24b-fd6b-4e03-8977-7eee5c8284be req-d5f0bbe6-e0b3-45ef-b04e-051d9ceebaae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "91701d8b-36b9-42fe-a5ae-bf6c9c74fc14-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:23 np0005588920 nova_compute[226886]: 2026-01-20 14:51:23.331 226890 DEBUG oslo_concurrency.lockutils [req-ae3ee24b-fd6b-4e03-8977-7eee5c8284be req-d5f0bbe6-e0b3-45ef-b04e-051d9ceebaae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "91701d8b-36b9-42fe-a5ae-bf6c9c74fc14-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:23 np0005588920 nova_compute[226886]: 2026-01-20 14:51:23.331 226890 DEBUG nova.compute.manager [req-ae3ee24b-fd6b-4e03-8977-7eee5c8284be req-d5f0bbe6-e0b3-45ef-b04e-051d9ceebaae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] No waiting events found dispatching network-vif-plugged-2c9f3e71-2562-4ae0-bf22-d56553a40405 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:51:23 np0005588920 nova_compute[226886]: 2026-01-20 14:51:23.332 226890 WARNING nova.compute.manager [req-ae3ee24b-fd6b-4e03-8977-7eee5c8284be req-d5f0bbe6-e0b3-45ef-b04e-051d9ceebaae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Received unexpected event network-vif-plugged-2c9f3e71-2562-4ae0-bf22-d56553a40405 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:23.332 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/79184781-1f23-4584-87de-08e262242488.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/79184781-1f23-4584-87de-08e262242488.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:51:23 np0005588920 nova_compute[226886]: 2026-01-20 14:51:23.333 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:23 np0005588920 nova_compute[226886]: 2026-01-20 14:51:23.334 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:23.334 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e6cb2b4d-fdf5-4c83-b3a6-de1342a7b9c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:23.335 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-79184781-1f23-4584-87de-08e262242488
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/79184781-1f23-4584-87de-08e262242488.pid.haproxy
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID 79184781-1f23-4584-87de-08e262242488
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:51:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:23.336 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'env', 'PROCESS_TAG=haproxy-79184781-1f23-4584-87de-08e262242488', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/79184781-1f23-4584-87de-08e262242488.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:51:23 np0005588920 nova_compute[226886]: 2026-01-20 14:51:23.339 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:51:23 np0005588920 nova_compute[226886]: 2026-01-20 14:51:23.341 226890 DEBUG nova.virt.libvirt.driver [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:51:23 np0005588920 nova_compute[226886]: 2026-01-20 14:51:23.342 226890 DEBUG nova.virt.libvirt.driver [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:51:23 np0005588920 nova_compute[226886]: 2026-01-20 14:51:23.342 226890 DEBUG nova.virt.libvirt.driver [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:51:23 np0005588920 nova_compute[226886]: 2026-01-20 14:51:23.342 226890 DEBUG nova.virt.libvirt.driver [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:51:23 np0005588920 nova_compute[226886]: 2026-01-20 14:51:23.343 226890 DEBUG nova.virt.libvirt.driver [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:51:23 np0005588920 nova_compute[226886]: 2026-01-20 14:51:23.343 226890 DEBUG nova.virt.libvirt.driver [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:51:23 np0005588920 nova_compute[226886]: 2026-01-20 14:51:23.372 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:51:23 np0005588920 nova_compute[226886]: 2026-01-20 14:51:23.373 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920683.3094425, 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:51:23 np0005588920 nova_compute[226886]: 2026-01-20 14:51:23.373 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:51:23 np0005588920 nova_compute[226886]: 2026-01-20 14:51:23.397 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:51:23 np0005588920 nova_compute[226886]: 2026-01-20 14:51:23.399 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920683.3150775, 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:51:23 np0005588920 nova_compute[226886]: 2026-01-20 14:51:23.400 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:51:23 np0005588920 nova_compute[226886]: 2026-01-20 14:51:23.408 226890 INFO nova.compute.manager [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Took 7.80 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:51:23 np0005588920 nova_compute[226886]: 2026-01-20 14:51:23.408 226890 DEBUG nova.compute.manager [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:51:23 np0005588920 nova_compute[226886]: 2026-01-20 14:51:23.418 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:51:23 np0005588920 nova_compute[226886]: 2026-01-20 14:51:23.420 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:51:23 np0005588920 nova_compute[226886]: 2026-01-20 14:51:23.447 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:51:23 np0005588920 nova_compute[226886]: 2026-01-20 14:51:23.474 226890 INFO nova.compute.manager [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Took 9.13 seconds to build instance.#033[00m
Jan 20 09:51:23 np0005588920 nova_compute[226886]: 2026-01-20 14:51:23.496 226890 DEBUG oslo_concurrency.lockutils [None req-9ad077f3-3d2e-41b2-834e-7db53e363247 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.272s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:23 np0005588920 podman[266821]: 2026-01-20 14:51:23.655449148 +0000 UTC m=+0.023925958 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:51:23 np0005588920 podman[266821]: 2026-01-20 14:51:23.749913639 +0000 UTC m=+0.118390439 container create 177a2d65588afa5f9821a00e72d1cf9b268e79fdff77ac6cd5a5e814951e2167 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 20 09:51:23 np0005588920 systemd[1]: Started libpod-conmon-177a2d65588afa5f9821a00e72d1cf9b268e79fdff77ac6cd5a5e814951e2167.scope.
Jan 20 09:51:23 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:51:23 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/66c35a5a166c46ed33af94f34adc99656e782d03ab2ef8e292480c40aa5fc27f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:51:23 np0005588920 podman[266821]: 2026-01-20 14:51:23.85317664 +0000 UTC m=+0.221653470 container init 177a2d65588afa5f9821a00e72d1cf9b268e79fdff77ac6cd5a5e814951e2167 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 20 09:51:23 np0005588920 podman[266821]: 2026-01-20 14:51:23.858901429 +0000 UTC m=+0.227378229 container start 177a2d65588afa5f9821a00e72d1cf9b268e79fdff77ac6cd5a5e814951e2167 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:51:23 np0005588920 neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488[266836]: [NOTICE]   (266840) : New worker (266842) forked
Jan 20 09:51:23 np0005588920 neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488[266836]: [NOTICE]   (266840) : Loading success.
Jan 20 09:51:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:23.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:24 np0005588920 nova_compute[226886]: 2026-01-20 14:51:24.614 226890 DEBUG nova.compute.manager [req-2caa280f-e8d5-4396-a140-04d15ceb9c99 req-bb03a7b0-0952-44a1-9ca6-e4555758b381 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Received event network-vif-plugged-362a0992-4e48-4999-a396-29fc2957fa09 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:51:24 np0005588920 nova_compute[226886]: 2026-01-20 14:51:24.616 226890 DEBUG oslo_concurrency.lockutils [req-2caa280f-e8d5-4396-a140-04d15ceb9c99 req-bb03a7b0-0952-44a1-9ca6-e4555758b381 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:24 np0005588920 nova_compute[226886]: 2026-01-20 14:51:24.616 226890 DEBUG oslo_concurrency.lockutils [req-2caa280f-e8d5-4396-a140-04d15ceb9c99 req-bb03a7b0-0952-44a1-9ca6-e4555758b381 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:24 np0005588920 nova_compute[226886]: 2026-01-20 14:51:24.616 226890 DEBUG oslo_concurrency.lockutils [req-2caa280f-e8d5-4396-a140-04d15ceb9c99 req-bb03a7b0-0952-44a1-9ca6-e4555758b381 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:24 np0005588920 nova_compute[226886]: 2026-01-20 14:51:24.617 226890 DEBUG nova.compute.manager [req-2caa280f-e8d5-4396-a140-04d15ceb9c99 req-bb03a7b0-0952-44a1-9ca6-e4555758b381 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] No waiting events found dispatching network-vif-plugged-362a0992-4e48-4999-a396-29fc2957fa09 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:51:24 np0005588920 nova_compute[226886]: 2026-01-20 14:51:24.617 226890 WARNING nova.compute.manager [req-2caa280f-e8d5-4396-a140-04d15ceb9c99 req-bb03a7b0-0952-44a1-9ca6-e4555758b381 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Received unexpected event network-vif-plugged-362a0992-4e48-4999-a396-29fc2957fa09 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:51:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:24.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:24 np0005588920 nova_compute[226886]: 2026-01-20 14:51:24.861 226890 DEBUG oslo_concurrency.lockutils [None req-9e42fbe4-070c-4644-bdaa-b461546f0e4d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "91701d8b-36b9-42fe-a5ae-bf6c9c74fc14" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:24 np0005588920 nova_compute[226886]: 2026-01-20 14:51:24.862 226890 DEBUG oslo_concurrency.lockutils [None req-9e42fbe4-070c-4644-bdaa-b461546f0e4d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "91701d8b-36b9-42fe-a5ae-bf6c9c74fc14" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:24 np0005588920 nova_compute[226886]: 2026-01-20 14:51:24.862 226890 DEBUG oslo_concurrency.lockutils [None req-9e42fbe4-070c-4644-bdaa-b461546f0e4d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "91701d8b-36b9-42fe-a5ae-bf6c9c74fc14-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:24 np0005588920 nova_compute[226886]: 2026-01-20 14:51:24.862 226890 DEBUG oslo_concurrency.lockutils [None req-9e42fbe4-070c-4644-bdaa-b461546f0e4d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "91701d8b-36b9-42fe-a5ae-bf6c9c74fc14-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:24 np0005588920 nova_compute[226886]: 2026-01-20 14:51:24.863 226890 DEBUG oslo_concurrency.lockutils [None req-9e42fbe4-070c-4644-bdaa-b461546f0e4d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "91701d8b-36b9-42fe-a5ae-bf6c9c74fc14-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:24 np0005588920 nova_compute[226886]: 2026-01-20 14:51:24.864 226890 INFO nova.compute.manager [None req-9e42fbe4-070c-4644-bdaa-b461546f0e4d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Terminating instance#033[00m
Jan 20 09:51:24 np0005588920 nova_compute[226886]: 2026-01-20 14:51:24.865 226890 DEBUG nova.compute.manager [None req-9e42fbe4-070c-4644-bdaa-b461546f0e4d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:51:24 np0005588920 kernel: tap2c9f3e71-25 (unregistering): left promiscuous mode
Jan 20 09:51:24 np0005588920 NetworkManager[49076]: <info>  [1768920684.9024] device (tap2c9f3e71-25): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:51:24 np0005588920 ovn_controller[133971]: 2026-01-20T14:51:24Z|00528|binding|INFO|Releasing lport 2c9f3e71-2562-4ae0-bf22-d56553a40405 from this chassis (sb_readonly=0)
Jan 20 09:51:24 np0005588920 ovn_controller[133971]: 2026-01-20T14:51:24Z|00529|binding|INFO|Setting lport 2c9f3e71-2562-4ae0-bf22-d56553a40405 down in Southbound
Jan 20 09:51:24 np0005588920 nova_compute[226886]: 2026-01-20 14:51:24.910 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:24 np0005588920 ovn_controller[133971]: 2026-01-20T14:51:24Z|00530|binding|INFO|Removing iface tap2c9f3e71-25 ovn-installed in OVS
Jan 20 09:51:24 np0005588920 nova_compute[226886]: 2026-01-20 14:51:24.913 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:24.918 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cb:72:0c 10.100.0.14'], port_security=['fa:16:3e:cb:72:0c 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '91701d8b-36b9-42fe-a5ae-bf6c9c74fc14', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-762e1859-4db4-4d9e-b66f-d50316f80df4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1c5f03d46c0c4162a3b2f1530850bb6c', 'neutron:revision_number': '8', 'neutron:security_group_ids': '80535eda-fa59-4edc-8e3d-9bfea6517730', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.180', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2474a8ca-bb96-4cae-9133-23419b81a9fc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=2c9f3e71-2562-4ae0-bf22-d56553a40405) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:51:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:24.919 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 2c9f3e71-2562-4ae0-bf22-d56553a40405 in datapath 762e1859-4db4-4d9e-b66f-d50316f80df4 unbound from our chassis#033[00m
Jan 20 09:51:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:24.921 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 762e1859-4db4-4d9e-b66f-d50316f80df4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:51:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:24.922 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a4b56f83-8606-4b66-8f94-11bbabb4a237]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:24.925 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 namespace which is not needed anymore#033[00m
Jan 20 09:51:24 np0005588920 nova_compute[226886]: 2026-01-20 14:51:24.930 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:24 np0005588920 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d0000006f.scope: Deactivated successfully.
Jan 20 09:51:24 np0005588920 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d0000006f.scope: Consumed 2.704s CPU time.
Jan 20 09:51:24 np0005588920 systemd-machined[196121]: Machine qemu-50-instance-0000006f terminated.
Jan 20 09:51:24 np0005588920 podman[266851]: 2026-01-20 14:51:24.992907541 +0000 UTC m=+0.065805715 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 20 09:51:25 np0005588920 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[266721]: [NOTICE]   (266725) : haproxy version is 2.8.14-c23fe91
Jan 20 09:51:25 np0005588920 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[266721]: [NOTICE]   (266725) : path to executable is /usr/sbin/haproxy
Jan 20 09:51:25 np0005588920 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[266721]: [WARNING]  (266725) : Exiting Master process...
Jan 20 09:51:25 np0005588920 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[266721]: [WARNING]  (266725) : Exiting Master process...
Jan 20 09:51:25 np0005588920 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[266721]: [ALERT]    (266725) : Current worker (266727) exited with code 143 (Terminated)
Jan 20 09:51:25 np0005588920 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[266721]: [WARNING]  (266725) : All workers exited. Exiting... (0)
Jan 20 09:51:25 np0005588920 systemd[1]: libpod-eed52451f70360ddef9bb114a1dc606e806713a3315d49579349a90e7ab2d596.scope: Deactivated successfully.
Jan 20 09:51:25 np0005588920 podman[266889]: 2026-01-20 14:51:25.053421368 +0000 UTC m=+0.042803144 container died eed52451f70360ddef9bb114a1dc606e806713a3315d49579349a90e7ab2d596 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 20 09:51:25 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-eed52451f70360ddef9bb114a1dc606e806713a3315d49579349a90e7ab2d596-userdata-shm.mount: Deactivated successfully.
Jan 20 09:51:25 np0005588920 kernel: tap2c9f3e71-25: entered promiscuous mode
Jan 20 09:51:25 np0005588920 systemd[1]: var-lib-containers-storage-overlay-0f14188490562c0f8a0c96a324dbe61dda8c6a16a7e5016a4d6913ba3977a70e-merged.mount: Deactivated successfully.
Jan 20 09:51:25 np0005588920 NetworkManager[49076]: <info>  [1768920685.0854] manager: (tap2c9f3e71-25): new Tun device (/org/freedesktop/NetworkManager/Devices/267)
Jan 20 09:51:25 np0005588920 ovn_controller[133971]: 2026-01-20T14:51:25Z|00531|binding|INFO|Claiming lport 2c9f3e71-2562-4ae0-bf22-d56553a40405 for this chassis.
Jan 20 09:51:25 np0005588920 ovn_controller[133971]: 2026-01-20T14:51:25Z|00532|binding|INFO|2c9f3e71-2562-4ae0-bf22-d56553a40405: Claiming fa:16:3e:cb:72:0c 10.100.0.14
Jan 20 09:51:25 np0005588920 kernel: tap2c9f3e71-25 (unregistering): left promiscuous mode
Jan 20 09:51:25 np0005588920 nova_compute[226886]: 2026-01-20 14:51:25.087 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:25.096 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cb:72:0c 10.100.0.14'], port_security=['fa:16:3e:cb:72:0c 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '91701d8b-36b9-42fe-a5ae-bf6c9c74fc14', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-762e1859-4db4-4d9e-b66f-d50316f80df4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1c5f03d46c0c4162a3b2f1530850bb6c', 'neutron:revision_number': '8', 'neutron:security_group_ids': '80535eda-fa59-4edc-8e3d-9bfea6517730', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.180', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2474a8ca-bb96-4cae-9133-23419b81a9fc, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=2c9f3e71-2562-4ae0-bf22-d56553a40405) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:51:25 np0005588920 podman[266889]: 2026-01-20 14:51:25.10555626 +0000 UTC m=+0.094938036 container cleanup eed52451f70360ddef9bb114a1dc606e806713a3315d49579349a90e7ab2d596 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 20 09:51:25 np0005588920 nova_compute[226886]: 2026-01-20 14:51:25.107 226890 INFO nova.virt.libvirt.driver [-] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Instance destroyed successfully.#033[00m
Jan 20 09:51:25 np0005588920 nova_compute[226886]: 2026-01-20 14:51:25.108 226890 DEBUG nova.objects.instance [None req-9e42fbe4-070c-4644-bdaa-b461546f0e4d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'resources' on Instance uuid 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:51:25 np0005588920 systemd[1]: libpod-conmon-eed52451f70360ddef9bb114a1dc606e806713a3315d49579349a90e7ab2d596.scope: Deactivated successfully.
Jan 20 09:51:25 np0005588920 nova_compute[226886]: 2026-01-20 14:51:25.124 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:25 np0005588920 nova_compute[226886]: 2026-01-20 14:51:25.126 226890 DEBUG nova.virt.libvirt.vif [None req-9e42fbe4-070c-4644-bdaa-b461546f0e4d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:50:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1254775729',display_name='tempest-ServerActionsTestJSON-server-1254775729',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1254775729',id=111,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLgEzx5mLsSqRL7L9WKOzM+WdeJ40U103wY9H3VMZ41G/sN5tQtQSt9lXWKTyc6pt00bfKD0E9GPugNMpy+dzSSpK23o3CgadkzAfAvjQCeCPOSp3fX13FGApomGd1HRCQ==',key_name='tempest-keypair-1602241722',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:50:40Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1c5f03d46c0c4162a3b2f1530850bb6c',ramdisk_id='',reservation_id='r-f09hwh77',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1020442335',owner_user_name='tempest-ServerActionsTestJSON-1020442335-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:51:22Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e9278fdb9e645b7938f3edb20c4d3cf',uuid=91701d8b-36b9-42fe-a5ae-bf6c9c74fc14,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2c9f3e71-2562-4ae0-bf22-d56553a40405", "address": "fa:16:3e:cb:72:0c", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c9f3e71-25", "ovs_interfaceid": "2c9f3e71-2562-4ae0-bf22-d56553a40405", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:51:25 np0005588920 nova_compute[226886]: 2026-01-20 14:51:25.126 226890 DEBUG nova.network.os_vif_util [None req-9e42fbe4-070c-4644-bdaa-b461546f0e4d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converting VIF {"id": "2c9f3e71-2562-4ae0-bf22-d56553a40405", "address": "fa:16:3e:cb:72:0c", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c9f3e71-25", "ovs_interfaceid": "2c9f3e71-2562-4ae0-bf22-d56553a40405", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:51:25 np0005588920 ovn_controller[133971]: 2026-01-20T14:51:25Z|00533|binding|INFO|Releasing lport 2c9f3e71-2562-4ae0-bf22-d56553a40405 from this chassis (sb_readonly=0)
Jan 20 09:51:25 np0005588920 nova_compute[226886]: 2026-01-20 14:51:25.127 226890 DEBUG nova.network.os_vif_util [None req-9e42fbe4-070c-4644-bdaa-b461546f0e4d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cb:72:0c,bridge_name='br-int',has_traffic_filtering=True,id=2c9f3e71-2562-4ae0-bf22-d56553a40405,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c9f3e71-25') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:51:25 np0005588920 nova_compute[226886]: 2026-01-20 14:51:25.127 226890 DEBUG os_vif [None req-9e42fbe4-070c-4644-bdaa-b461546f0e4d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cb:72:0c,bridge_name='br-int',has_traffic_filtering=True,id=2c9f3e71-2562-4ae0-bf22-d56553a40405,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c9f3e71-25') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:51:25 np0005588920 nova_compute[226886]: 2026-01-20 14:51:25.129 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:25 np0005588920 nova_compute[226886]: 2026-01-20 14:51:25.129 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2c9f3e71-25, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:51:25 np0005588920 nova_compute[226886]: 2026-01-20 14:51:25.130 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:25 np0005588920 nova_compute[226886]: 2026-01-20 14:51:25.132 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:51:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:25.138 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cb:72:0c 10.100.0.14'], port_security=['fa:16:3e:cb:72:0c 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '91701d8b-36b9-42fe-a5ae-bf6c9c74fc14', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-762e1859-4db4-4d9e-b66f-d50316f80df4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1c5f03d46c0c4162a3b2f1530850bb6c', 'neutron:revision_number': '9', 'neutron:security_group_ids': '80535eda-fa59-4edc-8e3d-9bfea6517730', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.180', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2474a8ca-bb96-4cae-9133-23419b81a9fc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=2c9f3e71-2562-4ae0-bf22-d56553a40405) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:51:25 np0005588920 nova_compute[226886]: 2026-01-20 14:51:25.143 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:25 np0005588920 nova_compute[226886]: 2026-01-20 14:51:25.146 226890 INFO os_vif [None req-9e42fbe4-070c-4644-bdaa-b461546f0e4d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cb:72:0c,bridge_name='br-int',has_traffic_filtering=True,id=2c9f3e71-2562-4ae0-bf22-d56553a40405,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c9f3e71-25')#033[00m
Jan 20 09:51:25 np0005588920 podman[266920]: 2026-01-20 14:51:25.183595246 +0000 UTC m=+0.043544875 container remove eed52451f70360ddef9bb114a1dc606e806713a3315d49579349a90e7ab2d596 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 20 09:51:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:25.189 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f81d2c70-6ff1-471a-b194-9bcbab2fc0bc]: (4, ('Tue Jan 20 02:51:24 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 (eed52451f70360ddef9bb114a1dc606e806713a3315d49579349a90e7ab2d596)\need52451f70360ddef9bb114a1dc606e806713a3315d49579349a90e7ab2d596\nTue Jan 20 02:51:25 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 (eed52451f70360ddef9bb114a1dc606e806713a3315d49579349a90e7ab2d596)\need52451f70360ddef9bb114a1dc606e806713a3315d49579349a90e7ab2d596\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:25.191 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e78680d2-24ce-4332-9cf9-29803bb9eabd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:25.192 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap762e1859-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:51:25 np0005588920 nova_compute[226886]: 2026-01-20 14:51:25.194 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:25 np0005588920 kernel: tap762e1859-40: left promiscuous mode
Jan 20 09:51:25 np0005588920 nova_compute[226886]: 2026-01-20 14:51:25.208 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:25.212 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[620ac582-079b-46ce-a978-545c829ba232]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:25.229 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e3500103-0e3f-40cb-b37e-246f7162b213]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:25.230 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[4beaf696-7208-42f5-a9e0-3097e079b511]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:25 np0005588920 ovn_controller[133971]: 2026-01-20T14:51:25Z|00534|binding|INFO|Releasing lport b033e9e6-9781-4424-a20f-7b48a14e2c80 from this chassis (sb_readonly=0)
Jan 20 09:51:25 np0005588920 nova_compute[226886]: 2026-01-20 14:51:25.245 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:25.260 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[39660d3e-c535-4712-b056-ce66a602f721]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 572387, 'reachable_time': 27756, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 266955, 'error': None, 'target': 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:25 np0005588920 systemd[1]: run-netns-ovnmeta\x2d762e1859\x2d4db4\x2d4d9e\x2db66f\x2dd50316f80df4.mount: Deactivated successfully.
Jan 20 09:51:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:25.265 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:51:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:25.266 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[c8f7d772-f876-41a0-9884-519f63742527]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:25.266 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 2c9f3e71-2562-4ae0-bf22-d56553a40405 in datapath 762e1859-4db4-4d9e-b66f-d50316f80df4 unbound from our chassis#033[00m
Jan 20 09:51:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:25.268 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 762e1859-4db4-4d9e-b66f-d50316f80df4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:51:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:25.269 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[1284a0fc-87ef-4d2e-9ae6-e02c67d85f16]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:25.270 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 2c9f3e71-2562-4ae0-bf22-d56553a40405 in datapath 762e1859-4db4-4d9e-b66f-d50316f80df4 unbound from our chassis#033[00m
Jan 20 09:51:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:25.271 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 762e1859-4db4-4d9e-b66f-d50316f80df4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:51:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:25.273 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b2d6cd0f-b699-4310-a741-d28b0cebe30d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:25 np0005588920 ovn_controller[133971]: 2026-01-20T14:51:25Z|00535|binding|INFO|Releasing lport b033e9e6-9781-4424-a20f-7b48a14e2c80 from this chassis (sb_readonly=0)
Jan 20 09:51:25 np0005588920 nova_compute[226886]: 2026-01-20 14:51:25.479 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:25 np0005588920 nova_compute[226886]: 2026-01-20 14:51:25.618 226890 INFO nova.virt.libvirt.driver [None req-9e42fbe4-070c-4644-bdaa-b461546f0e4d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Deleting instance files /var/lib/nova/instances/91701d8b-36b9-42fe-a5ae-bf6c9c74fc14_del#033[00m
Jan 20 09:51:25 np0005588920 nova_compute[226886]: 2026-01-20 14:51:25.619 226890 INFO nova.virt.libvirt.driver [None req-9e42fbe4-070c-4644-bdaa-b461546f0e4d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Deletion of /var/lib/nova/instances/91701d8b-36b9-42fe-a5ae-bf6c9c74fc14_del complete#033[00m
Jan 20 09:51:25 np0005588920 nova_compute[226886]: 2026-01-20 14:51:25.704 226890 DEBUG nova.compute.manager [req-6e59d4fb-7795-4649-8a2c-43df59fa1a77 req-90ace9ef-b176-46f3-995b-77d7a6cbed11 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Received event network-vif-unplugged-2c9f3e71-2562-4ae0-bf22-d56553a40405 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:51:25 np0005588920 nova_compute[226886]: 2026-01-20 14:51:25.704 226890 DEBUG oslo_concurrency.lockutils [req-6e59d4fb-7795-4649-8a2c-43df59fa1a77 req-90ace9ef-b176-46f3-995b-77d7a6cbed11 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "91701d8b-36b9-42fe-a5ae-bf6c9c74fc14-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:25 np0005588920 nova_compute[226886]: 2026-01-20 14:51:25.705 226890 DEBUG oslo_concurrency.lockutils [req-6e59d4fb-7795-4649-8a2c-43df59fa1a77 req-90ace9ef-b176-46f3-995b-77d7a6cbed11 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "91701d8b-36b9-42fe-a5ae-bf6c9c74fc14-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:25 np0005588920 nova_compute[226886]: 2026-01-20 14:51:25.706 226890 DEBUG oslo_concurrency.lockutils [req-6e59d4fb-7795-4649-8a2c-43df59fa1a77 req-90ace9ef-b176-46f3-995b-77d7a6cbed11 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "91701d8b-36b9-42fe-a5ae-bf6c9c74fc14-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:25 np0005588920 nova_compute[226886]: 2026-01-20 14:51:25.706 226890 DEBUG nova.compute.manager [req-6e59d4fb-7795-4649-8a2c-43df59fa1a77 req-90ace9ef-b176-46f3-995b-77d7a6cbed11 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] No waiting events found dispatching network-vif-unplugged-2c9f3e71-2562-4ae0-bf22-d56553a40405 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:51:25 np0005588920 nova_compute[226886]: 2026-01-20 14:51:25.707 226890 DEBUG nova.compute.manager [req-6e59d4fb-7795-4649-8a2c-43df59fa1a77 req-90ace9ef-b176-46f3-995b-77d7a6cbed11 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Received event network-vif-unplugged-2c9f3e71-2562-4ae0-bf22-d56553a40405 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:51:25 np0005588920 nova_compute[226886]: 2026-01-20 14:51:25.707 226890 DEBUG nova.compute.manager [req-6e59d4fb-7795-4649-8a2c-43df59fa1a77 req-90ace9ef-b176-46f3-995b-77d7a6cbed11 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Received event network-vif-plugged-2c9f3e71-2562-4ae0-bf22-d56553a40405 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:51:25 np0005588920 nova_compute[226886]: 2026-01-20 14:51:25.709 226890 DEBUG oslo_concurrency.lockutils [req-6e59d4fb-7795-4649-8a2c-43df59fa1a77 req-90ace9ef-b176-46f3-995b-77d7a6cbed11 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "91701d8b-36b9-42fe-a5ae-bf6c9c74fc14-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:25 np0005588920 nova_compute[226886]: 2026-01-20 14:51:25.709 226890 DEBUG oslo_concurrency.lockutils [req-6e59d4fb-7795-4649-8a2c-43df59fa1a77 req-90ace9ef-b176-46f3-995b-77d7a6cbed11 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "91701d8b-36b9-42fe-a5ae-bf6c9c74fc14-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:25 np0005588920 nova_compute[226886]: 2026-01-20 14:51:25.710 226890 DEBUG oslo_concurrency.lockutils [req-6e59d4fb-7795-4649-8a2c-43df59fa1a77 req-90ace9ef-b176-46f3-995b-77d7a6cbed11 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "91701d8b-36b9-42fe-a5ae-bf6c9c74fc14-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:25 np0005588920 nova_compute[226886]: 2026-01-20 14:51:25.711 226890 DEBUG nova.compute.manager [req-6e59d4fb-7795-4649-8a2c-43df59fa1a77 req-90ace9ef-b176-46f3-995b-77d7a6cbed11 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] No waiting events found dispatching network-vif-plugged-2c9f3e71-2562-4ae0-bf22-d56553a40405 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:51:25 np0005588920 nova_compute[226886]: 2026-01-20 14:51:25.711 226890 WARNING nova.compute.manager [req-6e59d4fb-7795-4649-8a2c-43df59fa1a77 req-90ace9ef-b176-46f3-995b-77d7a6cbed11 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Received unexpected event network-vif-plugged-2c9f3e71-2562-4ae0-bf22-d56553a40405 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:51:25 np0005588920 nova_compute[226886]: 2026-01-20 14:51:25.722 226890 INFO nova.compute.manager [None req-9e42fbe4-070c-4644-bdaa-b461546f0e4d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Took 0.86 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:51:25 np0005588920 nova_compute[226886]: 2026-01-20 14:51:25.724 226890 DEBUG oslo.service.loopingcall [None req-9e42fbe4-070c-4644-bdaa-b461546f0e4d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:51:25 np0005588920 nova_compute[226886]: 2026-01-20 14:51:25.725 226890 DEBUG nova.compute.manager [-] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:51:25 np0005588920 nova_compute[226886]: 2026-01-20 14:51:25.726 226890 DEBUG nova.network.neutron [-] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:51:25 np0005588920 nova_compute[226886]: 2026-01-20 14:51:25.921 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:25.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:26 np0005588920 nova_compute[226886]: 2026-01-20 14:51:26.031 226890 DEBUG nova.compute.manager [None req-3bde963b-b0d2-4a85-8277-e7debe2ee4dd d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:51:26 np0005588920 nova_compute[226886]: 2026-01-20 14:51:26.074 226890 INFO nova.compute.manager [None req-3bde963b-b0d2-4a85-8277-e7debe2ee4dd d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] instance snapshotting#033[00m
Jan 20 09:51:26 np0005588920 nova_compute[226886]: 2026-01-20 14:51:26.454 226890 INFO nova.virt.libvirt.driver [None req-3bde963b-b0d2-4a85-8277-e7debe2ee4dd d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Beginning live snapshot process#033[00m
Jan 20 09:51:26 np0005588920 nova_compute[226886]: 2026-01-20 14:51:26.580 226890 DEBUG nova.virt.libvirt.imagebackend [None req-3bde963b-b0d2-4a85-8277-e7debe2ee4dd d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] No parent info for a32b3e07-16d8-46fd-9a7b-c242c432fcf9; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Jan 20 09:51:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:26.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:26 np0005588920 nova_compute[226886]: 2026-01-20 14:51:26.864 226890 DEBUG nova.storage.rbd_utils [None req-3bde963b-b0d2-4a85-8277-e7debe2ee4dd d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] creating snapshot(5841d8a50cda4158baa23d1c4bfbacfb) on rbd image(7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 20 09:51:26 np0005588920 nova_compute[226886]: 2026-01-20 14:51:26.894 226890 DEBUG nova.network.neutron [-] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:51:26 np0005588920 nova_compute[226886]: 2026-01-20 14:51:26.910 226890 INFO nova.compute.manager [-] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Took 1.18 seconds to deallocate network for instance.#033[00m
Jan 20 09:51:26 np0005588920 nova_compute[226886]: 2026-01-20 14:51:26.967 226890 DEBUG oslo_concurrency.lockutils [None req-9e42fbe4-070c-4644-bdaa-b461546f0e4d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:26 np0005588920 nova_compute[226886]: 2026-01-20 14:51:26.968 226890 DEBUG oslo_concurrency.lockutils [None req-9e42fbe4-070c-4644-bdaa-b461546f0e4d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:26 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e258 e258: 3 total, 3 up, 3 in
Jan 20 09:51:27 np0005588920 nova_compute[226886]: 2026-01-20 14:51:27.046 226890 DEBUG oslo_concurrency.processutils [None req-9e42fbe4-070c-4644-bdaa-b461546f0e4d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:51:27 np0005588920 nova_compute[226886]: 2026-01-20 14:51:27.078 226890 DEBUG nova.storage.rbd_utils [None req-3bde963b-b0d2-4a85-8277-e7debe2ee4dd d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] cloning vms/7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b_disk@5841d8a50cda4158baa23d1c4bfbacfb to images/ae5aa156-02f2-4cbe-8c3c-a41c7fd78b5c clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 20 09:51:27 np0005588920 nova_compute[226886]: 2026-01-20 14:51:27.221 226890 DEBUG nova.storage.rbd_utils [None req-3bde963b-b0d2-4a85-8277-e7debe2ee4dd d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] flattening images/ae5aa156-02f2-4cbe-8c3c-a41c7fd78b5c flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 20 09:51:27 np0005588920 nova_compute[226886]: 2026-01-20 14:51:27.437 226890 DEBUG nova.storage.rbd_utils [None req-3bde963b-b0d2-4a85-8277-e7debe2ee4dd d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] removing snapshot(5841d8a50cda4158baa23d1c4bfbacfb) on rbd image(7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 20 09:51:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:51:27 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/890546304' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:51:27 np0005588920 nova_compute[226886]: 2026-01-20 14:51:27.495 226890 DEBUG oslo_concurrency.processutils [None req-9e42fbe4-070c-4644-bdaa-b461546f0e4d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:51:27 np0005588920 nova_compute[226886]: 2026-01-20 14:51:27.501 226890 DEBUG nova.compute.provider_tree [None req-9e42fbe4-070c-4644-bdaa-b461546f0e4d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:51:27 np0005588920 nova_compute[226886]: 2026-01-20 14:51:27.536 226890 DEBUG nova.scheduler.client.report [None req-9e42fbe4-070c-4644-bdaa-b461546f0e4d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:51:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e258 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:51:27 np0005588920 nova_compute[226886]: 2026-01-20 14:51:27.579 226890 DEBUG oslo_concurrency.lockutils [None req-9e42fbe4-070c-4644-bdaa-b461546f0e4d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.611s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:27 np0005588920 nova_compute[226886]: 2026-01-20 14:51:27.618 226890 INFO nova.scheduler.client.report [None req-9e42fbe4-070c-4644-bdaa-b461546f0e4d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Deleted allocations for instance 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14#033[00m
Jan 20 09:51:27 np0005588920 nova_compute[226886]: 2026-01-20 14:51:27.688 226890 DEBUG oslo_concurrency.lockutils [None req-9e42fbe4-070c-4644-bdaa-b461546f0e4d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "91701d8b-36b9-42fe-a5ae-bf6c9c74fc14" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.827s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:27 np0005588920 nova_compute[226886]: 2026-01-20 14:51:27.785 226890 DEBUG nova.compute.manager [req-a2123559-42d7-49c4-acf4-059e9ed28f31 req-c8540b68-37c2-450d-a2d7-0101e96b7c79 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Received event network-vif-deleted-2c9f3e71-2562-4ae0-bf22-d56553a40405 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:51:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:27.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:28 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e259 e259: 3 total, 3 up, 3 in
Jan 20 09:51:28 np0005588920 nova_compute[226886]: 2026-01-20 14:51:28.063 226890 DEBUG nova.storage.rbd_utils [None req-3bde963b-b0d2-4a85-8277-e7debe2ee4dd d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] creating snapshot(snap) on rbd image(ae5aa156-02f2-4cbe-8c3c-a41c7fd78b5c) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 20 09:51:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:51:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:28.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:51:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e260 e260: 3 total, 3 up, 3 in
Jan 20 09:51:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:29.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:30 np0005588920 nova_compute[226886]: 2026-01-20 14:51:30.131 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:30.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:30 np0005588920 nova_compute[226886]: 2026-01-20 14:51:30.805 226890 INFO nova.virt.libvirt.driver [None req-3bde963b-b0d2-4a85-8277-e7debe2ee4dd d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Snapshot image upload complete#033[00m
Jan 20 09:51:30 np0005588920 nova_compute[226886]: 2026-01-20 14:51:30.805 226890 INFO nova.compute.manager [None req-3bde963b-b0d2-4a85-8277-e7debe2ee4dd d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Took 4.73 seconds to snapshot the instance on the hypervisor.#033[00m
Jan 20 09:51:30 np0005588920 nova_compute[226886]: 2026-01-20 14:51:30.927 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:31.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e260 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:51:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:51:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:32.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:51:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:33.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:34 np0005588920 nova_compute[226886]: 2026-01-20 14:51:34.143 226890 INFO nova.compute.manager [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Rescuing#033[00m
Jan 20 09:51:34 np0005588920 nova_compute[226886]: 2026-01-20 14:51:34.143 226890 DEBUG oslo_concurrency.lockutils [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Acquiring lock "refresh_cache-7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:51:34 np0005588920 nova_compute[226886]: 2026-01-20 14:51:34.143 226890 DEBUG oslo_concurrency.lockutils [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Acquired lock "refresh_cache-7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:51:34 np0005588920 nova_compute[226886]: 2026-01-20 14:51:34.144 226890 DEBUG nova.network.neutron [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:51:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:34.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:35 np0005588920 nova_compute[226886]: 2026-01-20 14:51:35.134 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:35 np0005588920 nova_compute[226886]: 2026-01-20 14:51:35.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:51:35 np0005588920 nova_compute[226886]: 2026-01-20 14:51:35.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:51:35 np0005588920 nova_compute[226886]: 2026-01-20 14:51:35.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 09:51:35 np0005588920 nova_compute[226886]: 2026-01-20 14:51:35.750 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "refresh_cache-7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:51:35 np0005588920 nova_compute[226886]: 2026-01-20 14:51:35.926 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:35.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:36 np0005588920 nova_compute[226886]: 2026-01-20 14:51:36.053 226890 DEBUG nova.network.neutron [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Updating instance_info_cache with network_info: [{"id": "362a0992-4e48-4999-a396-29fc2957fa09", "address": "fa:16:3e:83:1f:c0", "network": {"id": "79184781-1f23-4584-87de-08e262242488", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a29915e0dd2403fbd7b7e847696b00a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap362a0992-4e", "ovs_interfaceid": "362a0992-4e48-4999-a396-29fc2957fa09", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:51:36 np0005588920 nova_compute[226886]: 2026-01-20 14:51:36.075 226890 DEBUG oslo_concurrency.lockutils [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Releasing lock "refresh_cache-7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:51:36 np0005588920 nova_compute[226886]: 2026-01-20 14:51:36.082 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquired lock "refresh_cache-7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:51:36 np0005588920 nova_compute[226886]: 2026-01-20 14:51:36.082 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 09:51:36 np0005588920 nova_compute[226886]: 2026-01-20 14:51:36.082 226890 DEBUG nova.objects.instance [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:51:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:51:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:36.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:51:36 np0005588920 nova_compute[226886]: 2026-01-20 14:51:36.868 226890 DEBUG nova.virt.libvirt.driver [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 20 09:51:36 np0005588920 ovn_controller[133971]: 2026-01-20T14:51:36Z|00075|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:83:1f:c0 10.100.0.5
Jan 20 09:51:36 np0005588920 ovn_controller[133971]: 2026-01-20T14:51:36Z|00076|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:83:1f:c0 10.100.0.5
Jan 20 09:51:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e260 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:51:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:37.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:38.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:39 np0005588920 nova_compute[226886]: 2026-01-20 14:51:39.021 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Updating instance_info_cache with network_info: [{"id": "362a0992-4e48-4999-a396-29fc2957fa09", "address": "fa:16:3e:83:1f:c0", "network": {"id": "79184781-1f23-4584-87de-08e262242488", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a29915e0dd2403fbd7b7e847696b00a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap362a0992-4e", "ovs_interfaceid": "362a0992-4e48-4999-a396-29fc2957fa09", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:51:39 np0005588920 nova_compute[226886]: 2026-01-20 14:51:39.049 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Releasing lock "refresh_cache-7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:51:39 np0005588920 nova_compute[226886]: 2026-01-20 14:51:39.049 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 09:51:39 np0005588920 nova_compute[226886]: 2026-01-20 14:51:39.049 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:51:39 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e261 e261: 3 total, 3 up, 3 in
Jan 20 09:51:39 np0005588920 nova_compute[226886]: 2026-01-20 14:51:39.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:51:39 np0005588920 nova_compute[226886]: 2026-01-20 14:51:39.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:51:39 np0005588920 kernel: tap362a0992-4e (unregistering): left promiscuous mode
Jan 20 09:51:39 np0005588920 NetworkManager[49076]: <info>  [1768920699.8574] device (tap362a0992-4e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:51:39 np0005588920 nova_compute[226886]: 2026-01-20 14:51:39.920 226890 INFO nova.virt.libvirt.driver [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Instance shutdown successfully after 3 seconds.#033[00m
Jan 20 09:51:39 np0005588920 ovn_controller[133971]: 2026-01-20T14:51:39Z|00536|binding|INFO|Releasing lport 362a0992-4e48-4999-a396-29fc2957fa09 from this chassis (sb_readonly=0)
Jan 20 09:51:39 np0005588920 nova_compute[226886]: 2026-01-20 14:51:39.920 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:39 np0005588920 ovn_controller[133971]: 2026-01-20T14:51:39Z|00537|binding|INFO|Setting lport 362a0992-4e48-4999-a396-29fc2957fa09 down in Southbound
Jan 20 09:51:39 np0005588920 ovn_controller[133971]: 2026-01-20T14:51:39Z|00538|binding|INFO|Removing iface tap362a0992-4e ovn-installed in OVS
Jan 20 09:51:39 np0005588920 nova_compute[226886]: 2026-01-20 14:51:39.922 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:39.926 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:83:1f:c0 10.100.0.5'], port_security=['fa:16:3e:83:1f:c0 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-79184781-1f23-4584-87de-08e262242488', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0a29915e0dd2403fbd7b7e847696b00a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '30ec24b7-15ba-4aeb-9785-539071729f77', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6b73ab05-b29f-401a-84a5-ea1a96103f33, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=362a0992-4e48-4999-a396-29fc2957fa09) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:51:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:39.928 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 362a0992-4e48-4999-a396-29fc2957fa09 in datapath 79184781-1f23-4584-87de-08e262242488 unbound from our chassis#033[00m
Jan 20 09:51:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:39.929 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 79184781-1f23-4584-87de-08e262242488, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:51:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:39.930 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[6ab8925c-5c93-40b6-b1d4-c5ec95581f56]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:39.931 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-79184781-1f23-4584-87de-08e262242488 namespace which is not needed anymore#033[00m
Jan 20 09:51:39 np0005588920 nova_compute[226886]: 2026-01-20 14:51:39.937 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:39.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:39 np0005588920 systemd[1]: machine-qemu\x2d51\x2dinstance\x2d00000070.scope: Deactivated successfully.
Jan 20 09:51:39 np0005588920 systemd[1]: machine-qemu\x2d51\x2dinstance\x2d00000070.scope: Consumed 14.691s CPU time.
Jan 20 09:51:39 np0005588920 systemd-machined[196121]: Machine qemu-51-instance-00000070 terminated.
Jan 20 09:51:40 np0005588920 neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488[266836]: [NOTICE]   (266840) : haproxy version is 2.8.14-c23fe91
Jan 20 09:51:40 np0005588920 neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488[266836]: [NOTICE]   (266840) : path to executable is /usr/sbin/haproxy
Jan 20 09:51:40 np0005588920 neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488[266836]: [WARNING]  (266840) : Exiting Master process...
Jan 20 09:51:40 np0005588920 neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488[266836]: [WARNING]  (266840) : Exiting Master process...
Jan 20 09:51:40 np0005588920 neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488[266836]: [ALERT]    (266840) : Current worker (266842) exited with code 143 (Terminated)
Jan 20 09:51:40 np0005588920 neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488[266836]: [WARNING]  (266840) : All workers exited. Exiting... (0)
Jan 20 09:51:40 np0005588920 systemd[1]: libpod-177a2d65588afa5f9821a00e72d1cf9b268e79fdff77ac6cd5a5e814951e2167.scope: Deactivated successfully.
Jan 20 09:51:40 np0005588920 podman[267144]: 2026-01-20 14:51:40.050811338 +0000 UTC m=+0.040724188 container died 177a2d65588afa5f9821a00e72d1cf9b268e79fdff77ac6cd5a5e814951e2167 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:51:40 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-177a2d65588afa5f9821a00e72d1cf9b268e79fdff77ac6cd5a5e814951e2167-userdata-shm.mount: Deactivated successfully.
Jan 20 09:51:40 np0005588920 systemd[1]: var-lib-containers-storage-overlay-66c35a5a166c46ed33af94f34adc99656e782d03ab2ef8e292480c40aa5fc27f-merged.mount: Deactivated successfully.
Jan 20 09:51:40 np0005588920 podman[267144]: 2026-01-20 14:51:40.084084955 +0000 UTC m=+0.073997825 container cleanup 177a2d65588afa5f9821a00e72d1cf9b268e79fdff77ac6cd5a5e814951e2167 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:51:40 np0005588920 systemd[1]: libpod-conmon-177a2d65588afa5f9821a00e72d1cf9b268e79fdff77ac6cd5a5e814951e2167.scope: Deactivated successfully.
Jan 20 09:51:40 np0005588920 nova_compute[226886]: 2026-01-20 14:51:40.103 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768920685.1014354, 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:51:40 np0005588920 nova_compute[226886]: 2026-01-20 14:51:40.103 226890 INFO nova.compute.manager [-] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:51:40 np0005588920 nova_compute[226886]: 2026-01-20 14:51:40.121 226890 DEBUG nova.compute.manager [None req-9056a90e-f088-4ae1-8e06-364fbc457355 - - - - - -] [instance: 91701d8b-36b9-42fe-a5ae-bf6c9c74fc14] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:51:40 np0005588920 nova_compute[226886]: 2026-01-20 14:51:40.136 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:40 np0005588920 nova_compute[226886]: 2026-01-20 14:51:40.137 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:40 np0005588920 nova_compute[226886]: 2026-01-20 14:51:40.144 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:40 np0005588920 nova_compute[226886]: 2026-01-20 14:51:40.149 226890 INFO nova.virt.libvirt.driver [-] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Instance destroyed successfully.#033[00m
Jan 20 09:51:40 np0005588920 nova_compute[226886]: 2026-01-20 14:51:40.149 226890 DEBUG nova.objects.instance [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lazy-loading 'numa_topology' on Instance uuid 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:51:40 np0005588920 podman[267174]: 2026-01-20 14:51:40.155623781 +0000 UTC m=+0.048867338 container remove 177a2d65588afa5f9821a00e72d1cf9b268e79fdff77ac6cd5a5e814951e2167 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 20 09:51:40 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:40.160 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[33a098ac-af27-4d16-96ef-66b23b62a35d]: (4, ('Tue Jan 20 02:51:39 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488 (177a2d65588afa5f9821a00e72d1cf9b268e79fdff77ac6cd5a5e814951e2167)\n177a2d65588afa5f9821a00e72d1cf9b268e79fdff77ac6cd5a5e814951e2167\nTue Jan 20 02:51:40 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488 (177a2d65588afa5f9821a00e72d1cf9b268e79fdff77ac6cd5a5e814951e2167)\n177a2d65588afa5f9821a00e72d1cf9b268e79fdff77ac6cd5a5e814951e2167\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:40 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:40.163 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a8f83e6f-bf69-400d-972a-a38963f38db8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:40 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:40.164 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap79184781-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:51:40 np0005588920 nova_compute[226886]: 2026-01-20 14:51:40.166 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:40 np0005588920 kernel: tap79184781-10: left promiscuous mode
Jan 20 09:51:40 np0005588920 nova_compute[226886]: 2026-01-20 14:51:40.170 226890 INFO nova.virt.libvirt.driver [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Attempting a stable device rescue#033[00m
Jan 20 09:51:40 np0005588920 nova_compute[226886]: 2026-01-20 14:51:40.182 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:40 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:40.184 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[78615f58-c351-4b85-b997-2330d98dfb08]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:40 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:40.199 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[75da7215-84bc-4792-a4bc-bb14c1c4f06c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:40 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:40.200 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0bfd80f4-71db-461b-b890-ba37f759312d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:40 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:40.217 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[53de83df-804a-4705-8242-9e21016a0386]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 572477, 'reachable_time': 41777, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 267202, 'error': None, 'target': 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:40 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:40.220 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-79184781-1f23-4584-87de-08e262242488 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:51:40 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:40.220 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[c7a1ea53-5af0-434b-abed-cfa464b390d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:40 np0005588920 systemd[1]: run-netns-ovnmeta\x2d79184781\x2d1f23\x2d4584\x2d87de\x2d08e262242488.mount: Deactivated successfully.
Jan 20 09:51:40 np0005588920 nova_compute[226886]: 2026-01-20 14:51:40.300 226890 DEBUG nova.compute.manager [req-832cf56e-5c11-4d8b-ad79-10a09982e479 req-c98b501d-efbd-4cd8-a55b-e5ff78b09d4c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Received event network-vif-unplugged-362a0992-4e48-4999-a396-29fc2957fa09 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:51:40 np0005588920 nova_compute[226886]: 2026-01-20 14:51:40.300 226890 DEBUG oslo_concurrency.lockutils [req-832cf56e-5c11-4d8b-ad79-10a09982e479 req-c98b501d-efbd-4cd8-a55b-e5ff78b09d4c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:40 np0005588920 nova_compute[226886]: 2026-01-20 14:51:40.300 226890 DEBUG oslo_concurrency.lockutils [req-832cf56e-5c11-4d8b-ad79-10a09982e479 req-c98b501d-efbd-4cd8-a55b-e5ff78b09d4c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:40 np0005588920 nova_compute[226886]: 2026-01-20 14:51:40.300 226890 DEBUG oslo_concurrency.lockutils [req-832cf56e-5c11-4d8b-ad79-10a09982e479 req-c98b501d-efbd-4cd8-a55b-e5ff78b09d4c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:40 np0005588920 nova_compute[226886]: 2026-01-20 14:51:40.301 226890 DEBUG nova.compute.manager [req-832cf56e-5c11-4d8b-ad79-10a09982e479 req-c98b501d-efbd-4cd8-a55b-e5ff78b09d4c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] No waiting events found dispatching network-vif-unplugged-362a0992-4e48-4999-a396-29fc2957fa09 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:51:40 np0005588920 nova_compute[226886]: 2026-01-20 14:51:40.301 226890 WARNING nova.compute.manager [req-832cf56e-5c11-4d8b-ad79-10a09982e479 req-c98b501d-efbd-4cd8-a55b-e5ff78b09d4c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Received unexpected event network-vif-unplugged-362a0992-4e48-4999-a396-29fc2957fa09 for instance with vm_state active and task_state rescuing.#033[00m
Jan 20 09:51:40 np0005588920 nova_compute[226886]: 2026-01-20 14:51:40.656 226890 DEBUG nova.virt.libvirt.driver [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'usb', 'dev': 'sdb', 'type': 'disk'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314#033[00m
Jan 20 09:51:40 np0005588920 nova_compute[226886]: 2026-01-20 14:51:40.660 226890 DEBUG nova.virt.libvirt.driver [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 20 09:51:40 np0005588920 nova_compute[226886]: 2026-01-20 14:51:40.661 226890 INFO nova.virt.libvirt.driver [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Creating image(s)#033[00m
Jan 20 09:51:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:40.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:40 np0005588920 nova_compute[226886]: 2026-01-20 14:51:40.783 226890 DEBUG nova.storage.rbd_utils [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] rbd image 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:51:40 np0005588920 nova_compute[226886]: 2026-01-20 14:51:40.786 226890 DEBUG nova.objects.instance [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lazy-loading 'trusted_certs' on Instance uuid 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:51:40 np0005588920 nova_compute[226886]: 2026-01-20 14:51:40.787 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:51:40 np0005588920 nova_compute[226886]: 2026-01-20 14:51:40.821 226890 DEBUG nova.storage.rbd_utils [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] rbd image 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:51:40 np0005588920 nova_compute[226886]: 2026-01-20 14:51:40.842 226890 DEBUG nova.storage.rbd_utils [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] rbd image 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:51:40 np0005588920 nova_compute[226886]: 2026-01-20 14:51:40.845 226890 DEBUG oslo_concurrency.lockutils [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Acquiring lock "8d322d741942590afd18efd9dc678ba29b0416d3" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:40 np0005588920 nova_compute[226886]: 2026-01-20 14:51:40.846 226890 DEBUG oslo_concurrency.lockutils [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "8d322d741942590afd18efd9dc678ba29b0416d3" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:40 np0005588920 nova_compute[226886]: 2026-01-20 14:51:40.849 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:40 np0005588920 nova_compute[226886]: 2026-01-20 14:51:40.850 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:40 np0005588920 nova_compute[226886]: 2026-01-20 14:51:40.850 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:40 np0005588920 nova_compute[226886]: 2026-01-20 14:51:40.850 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:51:40 np0005588920 nova_compute[226886]: 2026-01-20 14:51:40.850 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:51:40 np0005588920 nova_compute[226886]: 2026-01-20 14:51:40.928 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:41 np0005588920 nova_compute[226886]: 2026-01-20 14:51:41.173 226890 DEBUG nova.virt.libvirt.imagebackend [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Image locations are: [{'url': 'rbd://e399cf45-e6b6-5393-99f1-75c601d3f188/images/ae5aa156-02f2-4cbe-8c3c-a41c7fd78b5c/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://e399cf45-e6b6-5393-99f1-75c601d3f188/images/ae5aa156-02f2-4cbe-8c3c-a41c7fd78b5c/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Jan 20 09:51:41 np0005588920 nova_compute[226886]: 2026-01-20 14:51:41.221 226890 DEBUG nova.virt.libvirt.imagebackend [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Selected location: {'url': 'rbd://e399cf45-e6b6-5393-99f1-75c601d3f188/images/ae5aa156-02f2-4cbe-8c3c-a41c7fd78b5c/snap', 'metadata': {'store': 'default_backend'}} clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1094#033[00m
Jan 20 09:51:41 np0005588920 nova_compute[226886]: 2026-01-20 14:51:41.221 226890 DEBUG nova.storage.rbd_utils [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] cloning images/ae5aa156-02f2-4cbe-8c3c-a41c7fd78b5c@snap to None/7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b_disk.rescue clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 20 09:51:41 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:51:41 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2869754477' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:51:41 np0005588920 nova_compute[226886]: 2026-01-20 14:51:41.268 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.418s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:51:41 np0005588920 nova_compute[226886]: 2026-01-20 14:51:41.364 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-00000070 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:51:41 np0005588920 nova_compute[226886]: 2026-01-20 14:51:41.365 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-00000070 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:51:41 np0005588920 nova_compute[226886]: 2026-01-20 14:51:41.423 226890 DEBUG oslo_concurrency.lockutils [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "8d322d741942590afd18efd9dc678ba29b0416d3" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.577s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:41 np0005588920 nova_compute[226886]: 2026-01-20 14:51:41.467 226890 DEBUG nova.objects.instance [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lazy-loading 'migration_context' on Instance uuid 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:51:41 np0005588920 nova_compute[226886]: 2026-01-20 14:51:41.480 226890 DEBUG nova.virt.libvirt.driver [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:51:41 np0005588920 nova_compute[226886]: 2026-01-20 14:51:41.482 226890 DEBUG nova.virt.libvirt.driver [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Start _get_guest_xml network_info=[{"id": "362a0992-4e48-4999-a396-29fc2957fa09", "address": "fa:16:3e:83:1f:c0", "network": {"id": "79184781-1f23-4584-87de-08e262242488", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "vif_mac": "fa:16:3e:83:1f:c0"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a29915e0dd2403fbd7b7e847696b00a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap362a0992-4e", "ovs_interfaceid": "362a0992-4e48-4999-a396-29fc2957fa09", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'usb', 'dev': 'sdb', 'type': 'disk'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': 'ae5aa156-02f2-4cbe-8c3c-a41c7fd78b5c', 'kernel_id': '', 'ramdisk_id': ''} block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:51:41 np0005588920 nova_compute[226886]: 2026-01-20 14:51:41.482 226890 DEBUG nova.objects.instance [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lazy-loading 'resources' on Instance uuid 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:51:41 np0005588920 nova_compute[226886]: 2026-01-20 14:51:41.497 226890 WARNING nova.virt.libvirt.driver [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:51:41 np0005588920 nova_compute[226886]: 2026-01-20 14:51:41.504 226890 DEBUG nova.virt.libvirt.host [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:51:41 np0005588920 nova_compute[226886]: 2026-01-20 14:51:41.504 226890 DEBUG nova.virt.libvirt.host [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:51:41 np0005588920 nova_compute[226886]: 2026-01-20 14:51:41.507 226890 DEBUG nova.virt.libvirt.host [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:51:41 np0005588920 nova_compute[226886]: 2026-01-20 14:51:41.507 226890 DEBUG nova.virt.libvirt.host [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:51:41 np0005588920 nova_compute[226886]: 2026-01-20 14:51:41.509 226890 DEBUG nova.virt.libvirt.driver [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:51:41 np0005588920 nova_compute[226886]: 2026-01-20 14:51:41.509 226890 DEBUG nova.virt.hardware [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:51:41 np0005588920 nova_compute[226886]: 2026-01-20 14:51:41.510 226890 DEBUG nova.virt.hardware [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:51:41 np0005588920 nova_compute[226886]: 2026-01-20 14:51:41.510 226890 DEBUG nova.virt.hardware [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:51:41 np0005588920 nova_compute[226886]: 2026-01-20 14:51:41.510 226890 DEBUG nova.virt.hardware [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:51:41 np0005588920 nova_compute[226886]: 2026-01-20 14:51:41.511 226890 DEBUG nova.virt.hardware [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:51:41 np0005588920 nova_compute[226886]: 2026-01-20 14:51:41.511 226890 DEBUG nova.virt.hardware [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:51:41 np0005588920 nova_compute[226886]: 2026-01-20 14:51:41.511 226890 DEBUG nova.virt.hardware [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:51:41 np0005588920 nova_compute[226886]: 2026-01-20 14:51:41.512 226890 DEBUG nova.virt.hardware [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:51:41 np0005588920 nova_compute[226886]: 2026-01-20 14:51:41.512 226890 DEBUG nova.virt.hardware [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:51:41 np0005588920 nova_compute[226886]: 2026-01-20 14:51:41.512 226890 DEBUG nova.virt.hardware [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:51:41 np0005588920 nova_compute[226886]: 2026-01-20 14:51:41.513 226890 DEBUG nova.virt.hardware [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:51:41 np0005588920 nova_compute[226886]: 2026-01-20 14:51:41.513 226890 DEBUG nova.objects.instance [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lazy-loading 'vcpu_model' on Instance uuid 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:51:41 np0005588920 nova_compute[226886]: 2026-01-20 14:51:41.533 226890 DEBUG oslo_concurrency.processutils [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:51:41 np0005588920 nova_compute[226886]: 2026-01-20 14:51:41.632 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:51:41 np0005588920 nova_compute[226886]: 2026-01-20 14:51:41.634 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4422MB free_disk=20.90787124633789GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:51:41 np0005588920 nova_compute[226886]: 2026-01-20 14:51:41.634 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:41 np0005588920 nova_compute[226886]: 2026-01-20 14:51:41.634 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:41 np0005588920 nova_compute[226886]: 2026-01-20 14:51:41.692 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:51:41 np0005588920 nova_compute[226886]: 2026-01-20 14:51:41.693 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:51:41 np0005588920 nova_compute[226886]: 2026-01-20 14:51:41.693 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:51:41 np0005588920 nova_compute[226886]: 2026-01-20 14:51:41.727 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:51:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:51:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:41.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:51:41 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:51:41 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/166196684' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:51:41 np0005588920 nova_compute[226886]: 2026-01-20 14:51:41.968 226890 DEBUG oslo_concurrency.processutils [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:51:42 np0005588920 nova_compute[226886]: 2026-01-20 14:51:42.002 226890 DEBUG oslo_concurrency.processutils [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:51:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:51:42 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/29521595' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:51:42 np0005588920 nova_compute[226886]: 2026-01-20 14:51:42.162 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:51:42 np0005588920 nova_compute[226886]: 2026-01-20 14:51:42.168 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:51:42 np0005588920 nova_compute[226886]: 2026-01-20 14:51:42.195 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:51:42 np0005588920 nova_compute[226886]: 2026-01-20 14:51:42.267 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:51:42 np0005588920 nova_compute[226886]: 2026-01-20 14:51:42.268 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.634s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:51:42 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2877049591' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:51:42 np0005588920 nova_compute[226886]: 2026-01-20 14:51:42.455 226890 DEBUG oslo_concurrency.processutils [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:51:42 np0005588920 nova_compute[226886]: 2026-01-20 14:51:42.456 226890 DEBUG oslo_concurrency.processutils [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:51:42 np0005588920 nova_compute[226886]: 2026-01-20 14:51:42.487 226890 DEBUG nova.compute.manager [req-37e248b1-265f-4d1c-ab0c-abb46ea8d9f3 req-d998bc36-5895-4711-b220-a4d03b080e22 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Received event network-vif-plugged-362a0992-4e48-4999-a396-29fc2957fa09 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:51:42 np0005588920 nova_compute[226886]: 2026-01-20 14:51:42.488 226890 DEBUG oslo_concurrency.lockutils [req-37e248b1-265f-4d1c-ab0c-abb46ea8d9f3 req-d998bc36-5895-4711-b220-a4d03b080e22 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:42 np0005588920 nova_compute[226886]: 2026-01-20 14:51:42.488 226890 DEBUG oslo_concurrency.lockutils [req-37e248b1-265f-4d1c-ab0c-abb46ea8d9f3 req-d998bc36-5895-4711-b220-a4d03b080e22 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:42 np0005588920 nova_compute[226886]: 2026-01-20 14:51:42.488 226890 DEBUG oslo_concurrency.lockutils [req-37e248b1-265f-4d1c-ab0c-abb46ea8d9f3 req-d998bc36-5895-4711-b220-a4d03b080e22 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:42 np0005588920 nova_compute[226886]: 2026-01-20 14:51:42.489 226890 DEBUG nova.compute.manager [req-37e248b1-265f-4d1c-ab0c-abb46ea8d9f3 req-d998bc36-5895-4711-b220-a4d03b080e22 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] No waiting events found dispatching network-vif-plugged-362a0992-4e48-4999-a396-29fc2957fa09 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:51:42 np0005588920 nova_compute[226886]: 2026-01-20 14:51:42.489 226890 WARNING nova.compute.manager [req-37e248b1-265f-4d1c-ab0c-abb46ea8d9f3 req-d998bc36-5895-4711-b220-a4d03b080e22 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Received unexpected event network-vif-plugged-362a0992-4e48-4999-a396-29fc2957fa09 for instance with vm_state active and task_state rescuing.#033[00m
Jan 20 09:51:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e261 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:51:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:42.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:51:42 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2407416010' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:51:42 np0005588920 nova_compute[226886]: 2026-01-20 14:51:42.870 226890 DEBUG oslo_concurrency.processutils [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.413s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:51:42 np0005588920 nova_compute[226886]: 2026-01-20 14:51:42.871 226890 DEBUG nova.virt.libvirt.vif [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:51:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-1832777585',display_name='tempest-ServerStableDeviceRescueTest-server-1832777585',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-1832777585',id=112,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:51:23Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0a29915e0dd2403fbd7b7e847696b00a',ramdisk_id='',reservation_id='r-hox96xwk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-129078052',owner_user_name='tempest-ServerStableDeviceRescueTest-129078052-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:51:30Z,user_data=None,user_id='d85d286ce6224326a0f4a15a06afbfea',uuid=7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "362a0992-4e48-4999-a396-29fc2957fa09", "address": "fa:16:3e:83:1f:c0", "network": {"id": "79184781-1f23-4584-87de-08e262242488", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "vif_mac": "fa:16:3e:83:1f:c0"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a29915e0dd2403fbd7b7e847696b00a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap362a0992-4e", "ovs_interfaceid": "362a0992-4e48-4999-a396-29fc2957fa09", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:51:42 np0005588920 nova_compute[226886]: 2026-01-20 14:51:42.872 226890 DEBUG nova.network.os_vif_util [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Converting VIF {"id": "362a0992-4e48-4999-a396-29fc2957fa09", "address": "fa:16:3e:83:1f:c0", "network": {"id": "79184781-1f23-4584-87de-08e262242488", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "vif_mac": "fa:16:3e:83:1f:c0"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a29915e0dd2403fbd7b7e847696b00a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap362a0992-4e", "ovs_interfaceid": "362a0992-4e48-4999-a396-29fc2957fa09", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:51:42 np0005588920 nova_compute[226886]: 2026-01-20 14:51:42.873 226890 DEBUG nova.network.os_vif_util [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:83:1f:c0,bridge_name='br-int',has_traffic_filtering=True,id=362a0992-4e48-4999-a396-29fc2957fa09,network=Network(79184781-1f23-4584-87de-08e262242488),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap362a0992-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:51:42 np0005588920 nova_compute[226886]: 2026-01-20 14:51:42.874 226890 DEBUG nova.objects.instance [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lazy-loading 'pci_devices' on Instance uuid 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:51:42 np0005588920 nova_compute[226886]: 2026-01-20 14:51:42.897 226890 DEBUG nova.virt.libvirt.driver [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:51:42 np0005588920 nova_compute[226886]:  <uuid>7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b</uuid>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:  <name>instance-00000070</name>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:51:42 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:      <nova:name>tempest-ServerStableDeviceRescueTest-server-1832777585</nova:name>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:51:41</nova:creationTime>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:51:42 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:        <nova:user uuid="d85d286ce6224326a0f4a15a06afbfea">tempest-ServerStableDeviceRescueTest-129078052-project-member</nova:user>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:        <nova:project uuid="0a29915e0dd2403fbd7b7e847696b00a">tempest-ServerStableDeviceRescueTest-129078052</nova:project>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:        <nova:port uuid="362a0992-4e48-4999-a396-29fc2957fa09">
Jan 20 09:51:42 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:      <entry name="serial">7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b</entry>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:      <entry name="uuid">7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b</entry>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:51:42 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b_disk">
Jan 20 09:51:42 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:51:42 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:51:42 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b_disk.config">
Jan 20 09:51:42 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:51:42 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:51:42 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b_disk.rescue">
Jan 20 09:51:42 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:51:42 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:      <target dev="sdb" bus="usb"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:      <boot order="1"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:51:42 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:83:1f:c0"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:      <target dev="tap362a0992-4e"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:51:42 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b/console.log" append="off"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:51:42 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:51:42 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:51:42 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:51:42 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:51:42 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:51:42 np0005588920 nova_compute[226886]: 2026-01-20 14:51:42.904 226890 INFO nova.virt.libvirt.driver [-] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Instance destroyed successfully.#033[00m
Jan 20 09:51:42 np0005588920 nova_compute[226886]: 2026-01-20 14:51:42.954 226890 DEBUG nova.virt.libvirt.driver [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:51:42 np0005588920 nova_compute[226886]: 2026-01-20 14:51:42.954 226890 DEBUG nova.virt.libvirt.driver [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:51:42 np0005588920 nova_compute[226886]: 2026-01-20 14:51:42.954 226890 DEBUG nova.virt.libvirt.driver [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] No BDM found with device name sdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:51:42 np0005588920 nova_compute[226886]: 2026-01-20 14:51:42.955 226890 DEBUG nova.virt.libvirt.driver [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] No VIF found with MAC fa:16:3e:83:1f:c0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:51:42 np0005588920 nova_compute[226886]: 2026-01-20 14:51:42.955 226890 INFO nova.virt.libvirt.driver [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Using config drive#033[00m
Jan 20 09:51:42 np0005588920 nova_compute[226886]: 2026-01-20 14:51:42.978 226890 DEBUG nova.storage.rbd_utils [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] rbd image 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:51:43 np0005588920 nova_compute[226886]: 2026-01-20 14:51:43.002 226890 DEBUG nova.objects.instance [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lazy-loading 'ec2_ids' on Instance uuid 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:51:43 np0005588920 nova_compute[226886]: 2026-01-20 14:51:43.046 226890 DEBUG nova.objects.instance [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lazy-loading 'keypairs' on Instance uuid 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:51:43 np0005588920 nova_compute[226886]: 2026-01-20 14:51:43.206 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:51:43 np0005588920 nova_compute[226886]: 2026-01-20 14:51:43.207 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:51:43 np0005588920 nova_compute[226886]: 2026-01-20 14:51:43.207 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:51:43 np0005588920 nova_compute[226886]: 2026-01-20 14:51:43.605 226890 INFO nova.virt.libvirt.driver [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Creating config drive at /var/lib/nova/instances/7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b/disk.config.rescue#033[00m
Jan 20 09:51:43 np0005588920 nova_compute[226886]: 2026-01-20 14:51:43.609 226890 DEBUG oslo_concurrency.processutils [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphho31eti execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:51:43 np0005588920 nova_compute[226886]: 2026-01-20 14:51:43.739 226890 DEBUG oslo_concurrency.processutils [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphho31eti" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:51:43 np0005588920 nova_compute[226886]: 2026-01-20 14:51:43.764 226890 DEBUG nova.storage.rbd_utils [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] rbd image 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:51:43 np0005588920 nova_compute[226886]: 2026-01-20 14:51:43.768 226890 DEBUG oslo_concurrency.processutils [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b/disk.config.rescue 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:51:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:43.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:44 np0005588920 nova_compute[226886]: 2026-01-20 14:51:44.096 226890 DEBUG oslo_concurrency.processutils [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b/disk.config.rescue 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.329s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:51:44 np0005588920 nova_compute[226886]: 2026-01-20 14:51:44.097 226890 INFO nova.virt.libvirt.driver [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Deleting local config drive /var/lib/nova/instances/7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b/disk.config.rescue because it was imported into RBD.#033[00m
Jan 20 09:51:44 np0005588920 kernel: tap362a0992-4e: entered promiscuous mode
Jan 20 09:51:44 np0005588920 NetworkManager[49076]: <info>  [1768920704.1475] manager: (tap362a0992-4e): new Tun device (/org/freedesktop/NetworkManager/Devices/268)
Jan 20 09:51:44 np0005588920 ovn_controller[133971]: 2026-01-20T14:51:44Z|00539|binding|INFO|Claiming lport 362a0992-4e48-4999-a396-29fc2957fa09 for this chassis.
Jan 20 09:51:44 np0005588920 ovn_controller[133971]: 2026-01-20T14:51:44Z|00540|binding|INFO|362a0992-4e48-4999-a396-29fc2957fa09: Claiming fa:16:3e:83:1f:c0 10.100.0.5
Jan 20 09:51:44 np0005588920 nova_compute[226886]: 2026-01-20 14:51:44.147 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:44.154 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:83:1f:c0 10.100.0.5'], port_security=['fa:16:3e:83:1f:c0 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-79184781-1f23-4584-87de-08e262242488', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0a29915e0dd2403fbd7b7e847696b00a', 'neutron:revision_number': '5', 'neutron:security_group_ids': '30ec24b7-15ba-4aeb-9785-539071729f77', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6b73ab05-b29f-401a-84a5-ea1a96103f33, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=362a0992-4e48-4999-a396-29fc2957fa09) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:44.155 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 362a0992-4e48-4999-a396-29fc2957fa09 in datapath 79184781-1f23-4584-87de-08e262242488 bound to our chassis#033[00m
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:44.156 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 79184781-1f23-4584-87de-08e262242488#033[00m
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:44.169 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a411d5fb-38d7-4213-820a-a81456ebd543]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:44.170 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap79184781-11 in ovnmeta-79184781-1f23-4584-87de-08e262242488 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:51:44 np0005588920 ovn_controller[133971]: 2026-01-20T14:51:44Z|00541|binding|INFO|Setting lport 362a0992-4e48-4999-a396-29fc2957fa09 up in Southbound
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:44.171 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap79184781-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:44.171 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[497872a3-3ac6-42e7-8138-00e7242eb78b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:44 np0005588920 ovn_controller[133971]: 2026-01-20T14:51:44Z|00542|binding|INFO|Setting lport 362a0992-4e48-4999-a396-29fc2957fa09 ovn-installed in OVS
Jan 20 09:51:44 np0005588920 nova_compute[226886]: 2026-01-20 14:51:44.172 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:44.172 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[67dcb35e-b391-45ff-b2b5-64f74aa61686]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:44 np0005588920 systemd-udevd[267544]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:51:44 np0005588920 nova_compute[226886]: 2026-01-20 14:51:44.180 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:44 np0005588920 systemd-machined[196121]: New machine qemu-52-instance-00000070.
Jan 20 09:51:44 np0005588920 NetworkManager[49076]: <info>  [1768920704.1876] device (tap362a0992-4e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:51:44 np0005588920 NetworkManager[49076]: <info>  [1768920704.1885] device (tap362a0992-4e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:44.189 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[61e68933-0fb6-4f71-b3bf-934ba237b06b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:44 np0005588920 systemd[1]: Started Virtual Machine qemu-52-instance-00000070.
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:44.211 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[dcf17310-33e3-4a68-8b3b-1a98649ff99f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:44.243 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[abe5d91d-3177-4469-9031-7e6499caced9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:44 np0005588920 NetworkManager[49076]: <info>  [1768920704.2508] manager: (tap79184781-10): new Veth device (/org/freedesktop/NetworkManager/Devices/269)
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:44.249 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[40860b36-feca-4336-b5e2-cbad32e91efc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:44.284 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[5d034de1-a835-4ceb-8cf9-0404ec145735]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:44.288 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[6f989854-a3f0-4cba-9abd-78ba54e87fda]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:44 np0005588920 NetworkManager[49076]: <info>  [1768920704.3135] device (tap79184781-10): carrier: link connected
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:44.319 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[895d3d5a-2497-4790-a517-ef161a4f60ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:44.338 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e2e0c9b6-3dcb-4551-9a30-69f991afac6b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap79184781-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:38:7c:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 173], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 574599, 'reachable_time': 43023, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 267577, 'error': None, 'target': 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:44.354 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[dae9af1e-b2fc-48e6-afbc-446b48bf7140]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe38:7c2a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 574599, 'tstamp': 574599}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 267578, 'error': None, 'target': 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:44.371 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a1c7a951-e975-4d16-a109-2be39884a983]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap79184781-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:38:7c:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 173], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 574599, 'reachable_time': 43023, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 267579, 'error': None, 'target': 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:44.405 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[9f26a6a9-e3cc-4126-b18d-d631e5a830a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:44.462 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c0577aa4-4dfb-47b5-a7cf-a29f7989414b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:44.463 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap79184781-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:44.463 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:44.464 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap79184781-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:51:44 np0005588920 nova_compute[226886]: 2026-01-20 14:51:44.465 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:44 np0005588920 kernel: tap79184781-10: entered promiscuous mode
Jan 20 09:51:44 np0005588920 NetworkManager[49076]: <info>  [1768920704.4661] manager: (tap79184781-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/270)
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:44.468 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap79184781-10, col_values=(('external_ids', {'iface-id': 'b033e9e6-9781-4424-a20f-7b48a14e2c80'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:51:44 np0005588920 nova_compute[226886]: 2026-01-20 14:51:44.471 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:44 np0005588920 ovn_controller[133971]: 2026-01-20T14:51:44Z|00543|binding|INFO|Releasing lport b033e9e6-9781-4424-a20f-7b48a14e2c80 from this chassis (sb_readonly=0)
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:44.472 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/79184781-1f23-4584-87de-08e262242488.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/79184781-1f23-4584-87de-08e262242488.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:44.483 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b666eeed-7bf9-4073-84a7-205040ae913e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:44.483 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-79184781-1f23-4584-87de-08e262242488
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/79184781-1f23-4584-87de-08e262242488.pid.haproxy
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID 79184781-1f23-4584-87de-08e262242488
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:51:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:44.484 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'env', 'PROCESS_TAG=haproxy-79184781-1f23-4584-87de-08e262242488', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/79184781-1f23-4584-87de-08e262242488.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:51:44 np0005588920 nova_compute[226886]: 2026-01-20 14:51:44.490 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:44 np0005588920 nova_compute[226886]: 2026-01-20 14:51:44.610 226890 DEBUG nova.compute.manager [req-bf14409e-59d6-47b8-9f5d-a94275e3e096 req-78067c5c-8fad-44b6-b2c5-ddf5dc2fddab 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Received event network-vif-plugged-362a0992-4e48-4999-a396-29fc2957fa09 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:51:44 np0005588920 nova_compute[226886]: 2026-01-20 14:51:44.610 226890 DEBUG oslo_concurrency.lockutils [req-bf14409e-59d6-47b8-9f5d-a94275e3e096 req-78067c5c-8fad-44b6-b2c5-ddf5dc2fddab 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:44 np0005588920 nova_compute[226886]: 2026-01-20 14:51:44.610 226890 DEBUG oslo_concurrency.lockutils [req-bf14409e-59d6-47b8-9f5d-a94275e3e096 req-78067c5c-8fad-44b6-b2c5-ddf5dc2fddab 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:44 np0005588920 nova_compute[226886]: 2026-01-20 14:51:44.611 226890 DEBUG oslo_concurrency.lockutils [req-bf14409e-59d6-47b8-9f5d-a94275e3e096 req-78067c5c-8fad-44b6-b2c5-ddf5dc2fddab 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:44 np0005588920 nova_compute[226886]: 2026-01-20 14:51:44.611 226890 DEBUG nova.compute.manager [req-bf14409e-59d6-47b8-9f5d-a94275e3e096 req-78067c5c-8fad-44b6-b2c5-ddf5dc2fddab 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] No waiting events found dispatching network-vif-plugged-362a0992-4e48-4999-a396-29fc2957fa09 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:51:44 np0005588920 nova_compute[226886]: 2026-01-20 14:51:44.611 226890 WARNING nova.compute.manager [req-bf14409e-59d6-47b8-9f5d-a94275e3e096 req-78067c5c-8fad-44b6-b2c5-ddf5dc2fddab 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Received unexpected event network-vif-plugged-362a0992-4e48-4999-a396-29fc2957fa09 for instance with vm_state active and task_state rescuing.#033[00m
Jan 20 09:51:44 np0005588920 nova_compute[226886]: 2026-01-20 14:51:44.722 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:51:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:51:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:44.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:51:44 np0005588920 podman[267611]: 2026-01-20 14:51:44.857449371 +0000 UTC m=+0.046534931 container create c6c0251cb3cb975bcc7c94109bfd207afed0b6a223ffe706ccb34dbd3292549b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 20 09:51:44 np0005588920 systemd[1]: Started libpod-conmon-c6c0251cb3cb975bcc7c94109bfd207afed0b6a223ffe706ccb34dbd3292549b.scope.
Jan 20 09:51:44 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:51:44 np0005588920 podman[267611]: 2026-01-20 14:51:44.833014273 +0000 UTC m=+0.022099863 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:51:44 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0078a81d35150b9e73e264890387800c3108bc45c7d16dead1145c15ff5c574b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:51:44 np0005588920 podman[267611]: 2026-01-20 14:51:44.945334097 +0000 UTC m=+0.134419687 container init c6c0251cb3cb975bcc7c94109bfd207afed0b6a223ffe706ccb34dbd3292549b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 20 09:51:44 np0005588920 podman[267611]: 2026-01-20 14:51:44.951700846 +0000 UTC m=+0.140786416 container start c6c0251cb3cb975bcc7c94109bfd207afed0b6a223ffe706ccb34dbd3292549b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:51:44 np0005588920 neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488[267626]: [NOTICE]   (267630) : New worker (267632) forked
Jan 20 09:51:44 np0005588920 neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488[267626]: [NOTICE]   (267630) : Loading success.
Jan 20 09:51:45 np0005588920 nova_compute[226886]: 2026-01-20 14:51:45.137 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:45 np0005588920 nova_compute[226886]: 2026-01-20 14:51:45.277 226890 DEBUG nova.virt.libvirt.host [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Removed pending event for 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 20 09:51:45 np0005588920 nova_compute[226886]: 2026-01-20 14:51:45.278 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920705.2766314, 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:51:45 np0005588920 nova_compute[226886]: 2026-01-20 14:51:45.278 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:51:45 np0005588920 nova_compute[226886]: 2026-01-20 14:51:45.284 226890 DEBUG nova.compute.manager [None req-0d2329d7-d842-48b6-bffa-a4d6660d9973 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:51:45 np0005588920 nova_compute[226886]: 2026-01-20 14:51:45.331 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:51:45 np0005588920 nova_compute[226886]: 2026-01-20 14:51:45.337 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:51:45 np0005588920 nova_compute[226886]: 2026-01-20 14:51:45.374 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] During sync_power_state the instance has a pending task (rescuing). Skip.#033[00m
Jan 20 09:51:45 np0005588920 nova_compute[226886]: 2026-01-20 14:51:45.375 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920705.2789006, 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:51:45 np0005588920 nova_compute[226886]: 2026-01-20 14:51:45.375 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] VM Started (Lifecycle Event)#033[00m
Jan 20 09:51:45 np0005588920 nova_compute[226886]: 2026-01-20 14:51:45.401 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:51:45 np0005588920 nova_compute[226886]: 2026-01-20 14:51:45.406 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:51:45 np0005588920 nova_compute[226886]: 2026-01-20 14:51:45.930 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:45.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:46 np0005588920 podman[267701]: 2026-01-20 14:51:46.002947578 +0000 UTC m=+0.084044429 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:51:46 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:51:46 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2312341428' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:51:46 np0005588920 nova_compute[226886]: 2026-01-20 14:51:46.555 226890 INFO nova.compute.manager [None req-b14f2ddc-e98b-4d9c-8fd6-36acf70918d2 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Unrescuing#033[00m
Jan 20 09:51:46 np0005588920 nova_compute[226886]: 2026-01-20 14:51:46.556 226890 DEBUG oslo_concurrency.lockutils [None req-b14f2ddc-e98b-4d9c-8fd6-36acf70918d2 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Acquiring lock "refresh_cache-7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:51:46 np0005588920 nova_compute[226886]: 2026-01-20 14:51:46.556 226890 DEBUG oslo_concurrency.lockutils [None req-b14f2ddc-e98b-4d9c-8fd6-36acf70918d2 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Acquired lock "refresh_cache-7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:51:46 np0005588920 nova_compute[226886]: 2026-01-20 14:51:46.556 226890 DEBUG nova.network.neutron [None req-b14f2ddc-e98b-4d9c-8fd6-36acf70918d2 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:51:46 np0005588920 nova_compute[226886]: 2026-01-20 14:51:46.703 226890 DEBUG nova.compute.manager [req-7916f7ba-a4f7-4384-9f48-8e6d211666af req-c1c5d9cc-11c1-44ad-8fcd-b15f4f56a50f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Received event network-vif-plugged-362a0992-4e48-4999-a396-29fc2957fa09 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:51:46 np0005588920 nova_compute[226886]: 2026-01-20 14:51:46.704 226890 DEBUG oslo_concurrency.lockutils [req-7916f7ba-a4f7-4384-9f48-8e6d211666af req-c1c5d9cc-11c1-44ad-8fcd-b15f4f56a50f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:46 np0005588920 nova_compute[226886]: 2026-01-20 14:51:46.704 226890 DEBUG oslo_concurrency.lockutils [req-7916f7ba-a4f7-4384-9f48-8e6d211666af req-c1c5d9cc-11c1-44ad-8fcd-b15f4f56a50f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:46 np0005588920 nova_compute[226886]: 2026-01-20 14:51:46.704 226890 DEBUG oslo_concurrency.lockutils [req-7916f7ba-a4f7-4384-9f48-8e6d211666af req-c1c5d9cc-11c1-44ad-8fcd-b15f4f56a50f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:46 np0005588920 nova_compute[226886]: 2026-01-20 14:51:46.705 226890 DEBUG nova.compute.manager [req-7916f7ba-a4f7-4384-9f48-8e6d211666af req-c1c5d9cc-11c1-44ad-8fcd-b15f4f56a50f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] No waiting events found dispatching network-vif-plugged-362a0992-4e48-4999-a396-29fc2957fa09 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:51:46 np0005588920 nova_compute[226886]: 2026-01-20 14:51:46.705 226890 WARNING nova.compute.manager [req-7916f7ba-a4f7-4384-9f48-8e6d211666af req-c1c5d9cc-11c1-44ad-8fcd-b15f4f56a50f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Received unexpected event network-vif-plugged-362a0992-4e48-4999-a396-29fc2957fa09 for instance with vm_state rescued and task_state unrescuing.#033[00m
Jan 20 09:51:46 np0005588920 nova_compute[226886]: 2026-01-20 14:51:46.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:51:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:46.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e261 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:51:47 np0005588920 nova_compute[226886]: 2026-01-20 14:51:47.896 226890 DEBUG nova.network.neutron [None req-b14f2ddc-e98b-4d9c-8fd6-36acf70918d2 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Updating instance_info_cache with network_info: [{"id": "362a0992-4e48-4999-a396-29fc2957fa09", "address": "fa:16:3e:83:1f:c0", "network": {"id": "79184781-1f23-4584-87de-08e262242488", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a29915e0dd2403fbd7b7e847696b00a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap362a0992-4e", "ovs_interfaceid": "362a0992-4e48-4999-a396-29fc2957fa09", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:51:47 np0005588920 nova_compute[226886]: 2026-01-20 14:51:47.918 226890 DEBUG oslo_concurrency.lockutils [None req-b14f2ddc-e98b-4d9c-8fd6-36acf70918d2 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Releasing lock "refresh_cache-7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:51:47 np0005588920 nova_compute[226886]: 2026-01-20 14:51:47.919 226890 DEBUG nova.objects.instance [None req-b14f2ddc-e98b-4d9c-8fd6-36acf70918d2 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lazy-loading 'flavor' on Instance uuid 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:51:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:47.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:47 np0005588920 kernel: tap362a0992-4e (unregistering): left promiscuous mode
Jan 20 09:51:47 np0005588920 NetworkManager[49076]: <info>  [1768920707.9945] device (tap362a0992-4e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:51:48 np0005588920 ovn_controller[133971]: 2026-01-20T14:51:48Z|00544|binding|INFO|Releasing lport 362a0992-4e48-4999-a396-29fc2957fa09 from this chassis (sb_readonly=0)
Jan 20 09:51:48 np0005588920 ovn_controller[133971]: 2026-01-20T14:51:48Z|00545|binding|INFO|Setting lport 362a0992-4e48-4999-a396-29fc2957fa09 down in Southbound
Jan 20 09:51:48 np0005588920 ovn_controller[133971]: 2026-01-20T14:51:48Z|00546|binding|INFO|Removing iface tap362a0992-4e ovn-installed in OVS
Jan 20 09:51:48 np0005588920 nova_compute[226886]: 2026-01-20 14:51:48.004 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:48.016 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:83:1f:c0 10.100.0.5'], port_security=['fa:16:3e:83:1f:c0 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-79184781-1f23-4584-87de-08e262242488', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0a29915e0dd2403fbd7b7e847696b00a', 'neutron:revision_number': '6', 'neutron:security_group_ids': '30ec24b7-15ba-4aeb-9785-539071729f77', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6b73ab05-b29f-401a-84a5-ea1a96103f33, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=362a0992-4e48-4999-a396-29fc2957fa09) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:48.018 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 362a0992-4e48-4999-a396-29fc2957fa09 in datapath 79184781-1f23-4584-87de-08e262242488 unbound from our chassis#033[00m
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:48.019 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 79184781-1f23-4584-87de-08e262242488, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:48.021 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e75b7597-eccb-4b41-b433-e792339d0239]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:48.021 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-79184781-1f23-4584-87de-08e262242488 namespace which is not needed anymore#033[00m
Jan 20 09:51:48 np0005588920 nova_compute[226886]: 2026-01-20 14:51:48.024 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:48 np0005588920 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d00000070.scope: Deactivated successfully.
Jan 20 09:51:48 np0005588920 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d00000070.scope: Consumed 3.680s CPU time.
Jan 20 09:51:48 np0005588920 systemd-machined[196121]: Machine qemu-52-instance-00000070 terminated.
Jan 20 09:51:48 np0005588920 neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488[267626]: [NOTICE]   (267630) : haproxy version is 2.8.14-c23fe91
Jan 20 09:51:48 np0005588920 neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488[267626]: [NOTICE]   (267630) : path to executable is /usr/sbin/haproxy
Jan 20 09:51:48 np0005588920 neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488[267626]: [WARNING]  (267630) : Exiting Master process...
Jan 20 09:51:48 np0005588920 neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488[267626]: [WARNING]  (267630) : Exiting Master process...
Jan 20 09:51:48 np0005588920 neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488[267626]: [ALERT]    (267630) : Current worker (267632) exited with code 143 (Terminated)
Jan 20 09:51:48 np0005588920 neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488[267626]: [WARNING]  (267630) : All workers exited. Exiting... (0)
Jan 20 09:51:48 np0005588920 systemd[1]: libpod-c6c0251cb3cb975bcc7c94109bfd207afed0b6a223ffe706ccb34dbd3292549b.scope: Deactivated successfully.
Jan 20 09:51:48 np0005588920 podman[267751]: 2026-01-20 14:51:48.141433314 +0000 UTC m=+0.040800741 container died c6c0251cb3cb975bcc7c94109bfd207afed0b6a223ffe706ccb34dbd3292549b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 20 09:51:48 np0005588920 nova_compute[226886]: 2026-01-20 14:51:48.157 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:48 np0005588920 nova_compute[226886]: 2026-01-20 14:51:48.164 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:48 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c6c0251cb3cb975bcc7c94109bfd207afed0b6a223ffe706ccb34dbd3292549b-userdata-shm.mount: Deactivated successfully.
Jan 20 09:51:48 np0005588920 systemd[1]: var-lib-containers-storage-overlay-0078a81d35150b9e73e264890387800c3108bc45c7d16dead1145c15ff5c574b-merged.mount: Deactivated successfully.
Jan 20 09:51:48 np0005588920 nova_compute[226886]: 2026-01-20 14:51:48.176 226890 INFO nova.virt.libvirt.driver [-] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Instance destroyed successfully.#033[00m
Jan 20 09:51:48 np0005588920 nova_compute[226886]: 2026-01-20 14:51:48.176 226890 DEBUG nova.objects.instance [None req-b14f2ddc-e98b-4d9c-8fd6-36acf70918d2 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lazy-loading 'numa_topology' on Instance uuid 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:51:48 np0005588920 podman[267751]: 2026-01-20 14:51:48.17751476 +0000 UTC m=+0.076882167 container cleanup c6c0251cb3cb975bcc7c94109bfd207afed0b6a223ffe706ccb34dbd3292549b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 20 09:51:48 np0005588920 systemd[1]: libpod-conmon-c6c0251cb3cb975bcc7c94109bfd207afed0b6a223ffe706ccb34dbd3292549b.scope: Deactivated successfully.
Jan 20 09:51:48 np0005588920 podman[267790]: 2026-01-20 14:51:48.240051041 +0000 UTC m=+0.040557643 container remove c6c0251cb3cb975bcc7c94109bfd207afed0b6a223ffe706ccb34dbd3292549b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:48.246 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[841cbd62-c017-43b8-9fa0-d09773849418]: (4, ('Tue Jan 20 02:51:48 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488 (c6c0251cb3cb975bcc7c94109bfd207afed0b6a223ffe706ccb34dbd3292549b)\nc6c0251cb3cb975bcc7c94109bfd207afed0b6a223ffe706ccb34dbd3292549b\nTue Jan 20 02:51:48 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488 (c6c0251cb3cb975bcc7c94109bfd207afed0b6a223ffe706ccb34dbd3292549b)\nc6c0251cb3cb975bcc7c94109bfd207afed0b6a223ffe706ccb34dbd3292549b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:48.248 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[79c86e7b-f901-43c1-a40d-0baf7dc55c06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:48.249 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap79184781-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:51:48 np0005588920 nova_compute[226886]: 2026-01-20 14:51:48.250 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:48 np0005588920 kernel: tap79184781-10: left promiscuous mode
Jan 20 09:51:48 np0005588920 NetworkManager[49076]: <info>  [1768920708.2572] manager: (tap362a0992-4e): new Tun device (/org/freedesktop/NetworkManager/Devices/271)
Jan 20 09:51:48 np0005588920 systemd-udevd[267730]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:51:48 np0005588920 nova_compute[226886]: 2026-01-20 14:51:48.266 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:48 np0005588920 kernel: tap362a0992-4e: entered promiscuous mode
Jan 20 09:51:48 np0005588920 NetworkManager[49076]: <info>  [1768920708.2700] device (tap362a0992-4e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:51:48 np0005588920 ovn_controller[133971]: 2026-01-20T14:51:48Z|00547|binding|INFO|Claiming lport 362a0992-4e48-4999-a396-29fc2957fa09 for this chassis.
Jan 20 09:51:48 np0005588920 NetworkManager[49076]: <info>  [1768920708.2714] device (tap362a0992-4e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:51:48 np0005588920 ovn_controller[133971]: 2026-01-20T14:51:48Z|00548|binding|INFO|362a0992-4e48-4999-a396-29fc2957fa09: Claiming fa:16:3e:83:1f:c0 10.100.0.5
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:48.270 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[8b1a7353-773c-487f-85fa-993ce21390eb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:48 np0005588920 nova_compute[226886]: 2026-01-20 14:51:48.272 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:48 np0005588920 systemd-machined[196121]: New machine qemu-53-instance-00000070.
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:48.288 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[205893d5-ac0a-4d77-b12b-934a19b132fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:48.289 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d207d0fb-3351-4289-b88f-c47982cb72c9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:48 np0005588920 ovn_controller[133971]: 2026-01-20T14:51:48Z|00549|binding|INFO|Setting lport 362a0992-4e48-4999-a396-29fc2957fa09 ovn-installed in OVS
Jan 20 09:51:48 np0005588920 nova_compute[226886]: 2026-01-20 14:51:48.293 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:48 np0005588920 systemd[1]: Started Virtual Machine qemu-53-instance-00000070.
Jan 20 09:51:48 np0005588920 ovn_controller[133971]: 2026-01-20T14:51:48Z|00550|binding|INFO|Setting lport 362a0992-4e48-4999-a396-29fc2957fa09 up in Southbound
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:48.303 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:83:1f:c0 10.100.0.5'], port_security=['fa:16:3e:83:1f:c0 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-79184781-1f23-4584-87de-08e262242488', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0a29915e0dd2403fbd7b7e847696b00a', 'neutron:revision_number': '6', 'neutron:security_group_ids': '30ec24b7-15ba-4aeb-9785-539071729f77', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6b73ab05-b29f-401a-84a5-ea1a96103f33, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=362a0992-4e48-4999-a396-29fc2957fa09) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:48.305 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0f63e9eb-e4cd-4ef9-884e-6c6713e42a96]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 574592, 'reachable_time': 28348, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 267818, 'error': None, 'target': 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:48.307 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-79184781-1f23-4584-87de-08e262242488 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:51:48 np0005588920 systemd[1]: run-netns-ovnmeta\x2d79184781\x2d1f23\x2d4584\x2d87de\x2d08e262242488.mount: Deactivated successfully.
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:48.308 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[03d6e3ed-0252-457e-875d-f1d916345601]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:48.309 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 362a0992-4e48-4999-a396-29fc2957fa09 in datapath 79184781-1f23-4584-87de-08e262242488 bound to our chassis#033[00m
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:48.310 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 79184781-1f23-4584-87de-08e262242488#033[00m
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:48.320 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d2c84886-2a4f-4b14-9342-b56a901fa08a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:48.321 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap79184781-11 in ovnmeta-79184781-1f23-4584-87de-08e262242488 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:48.324 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap79184781-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:48.324 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0b641637-b2d4-4fda-bf7b-2f94000945ce]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:48.325 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b6c2fe95-3b3a-47cb-8a93-bc98065c5709]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:48.335 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[f7725d2a-04cb-46ae-9968-b997df081bde]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:48.358 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b8c8be6e-63e1-4ea9-860f-74fff0ed1a61]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:48.388 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[fb610b85-6719-49c9-b183-efea2ad65213]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:48.393 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d90386c9-b845-4453-92bc-81acf98b5f7f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:48 np0005588920 NetworkManager[49076]: <info>  [1768920708.3998] manager: (tap79184781-10): new Veth device (/org/freedesktop/NetworkManager/Devices/272)
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:48.429 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[d6513f7b-940d-46ee-9264-c74e7f81300d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:48.432 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[91ea97e6-6658-43bb-8660-11aadf29abc8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:48 np0005588920 NetworkManager[49076]: <info>  [1768920708.4503] device (tap79184781-10): carrier: link connected
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:48.457 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[ec97f81b-f2ab-473c-88a0-f93349fbf295]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:48.474 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[244a142e-c32c-4a88-8ef3-45e09b2495c5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap79184781-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:38:7c:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 176], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 575013, 'reachable_time': 35587, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 267849, 'error': None, 'target': 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:48.489 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d4c96bac-661f-44c6-bad3-36d82988d890]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe38:7c2a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 575013, 'tstamp': 575013}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 267850, 'error': None, 'target': 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:48.505 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5ca728a6-bad7-4c62-a615-f4e62029556f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap79184781-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:38:7c:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 176], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 575013, 'reachable_time': 35587, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 267851, 'error': None, 'target': 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:48.531 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f4ba082f-a1a3-4f71-8c30-9a29195501ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:48.583 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[81dd0ee4-8ade-4b9b-a686-ccb9f7bedeaf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:48.584 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap79184781-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:48.584 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:48.585 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap79184781-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:51:48 np0005588920 nova_compute[226886]: 2026-01-20 14:51:48.586 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:48 np0005588920 NetworkManager[49076]: <info>  [1768920708.5872] manager: (tap79184781-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/273)
Jan 20 09:51:48 np0005588920 kernel: tap79184781-10: entered promiscuous mode
Jan 20 09:51:48 np0005588920 nova_compute[226886]: 2026-01-20 14:51:48.590 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:48.590 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap79184781-10, col_values=(('external_ids', {'iface-id': 'b033e9e6-9781-4424-a20f-7b48a14e2c80'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:51:48 np0005588920 nova_compute[226886]: 2026-01-20 14:51:48.591 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:48 np0005588920 ovn_controller[133971]: 2026-01-20T14:51:48Z|00551|binding|INFO|Releasing lport b033e9e6-9781-4424-a20f-7b48a14e2c80 from this chassis (sb_readonly=0)
Jan 20 09:51:48 np0005588920 nova_compute[226886]: 2026-01-20 14:51:48.608 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:48 np0005588920 nova_compute[226886]: 2026-01-20 14:51:48.608 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:48.608 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/79184781-1f23-4584-87de-08e262242488.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/79184781-1f23-4584-87de-08e262242488.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:48.609 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[adf6e5c0-8d7f-44d2-a5da-5746378b1650]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:48.609 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-79184781-1f23-4584-87de-08e262242488
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/79184781-1f23-4584-87de-08e262242488.pid.haproxy
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID 79184781-1f23-4584-87de-08e262242488
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:51:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:48.610 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'env', 'PROCESS_TAG=haproxy-79184781-1f23-4584-87de-08e262242488', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/79184781-1f23-4584-87de-08e262242488.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:51:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:51:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:48.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:51:48 np0005588920 nova_compute[226886]: 2026-01-20 14:51:48.852 226890 DEBUG nova.compute.manager [req-62c94edd-98e8-4226-aabc-133ee28005a5 req-216f66bc-728f-4b06-98b0-7295afb9c186 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Received event network-vif-unplugged-362a0992-4e48-4999-a396-29fc2957fa09 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:51:48 np0005588920 nova_compute[226886]: 2026-01-20 14:51:48.853 226890 DEBUG oslo_concurrency.lockutils [req-62c94edd-98e8-4226-aabc-133ee28005a5 req-216f66bc-728f-4b06-98b0-7295afb9c186 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:48 np0005588920 nova_compute[226886]: 2026-01-20 14:51:48.854 226890 DEBUG oslo_concurrency.lockutils [req-62c94edd-98e8-4226-aabc-133ee28005a5 req-216f66bc-728f-4b06-98b0-7295afb9c186 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:48 np0005588920 nova_compute[226886]: 2026-01-20 14:51:48.854 226890 DEBUG oslo_concurrency.lockutils [req-62c94edd-98e8-4226-aabc-133ee28005a5 req-216f66bc-728f-4b06-98b0-7295afb9c186 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:48 np0005588920 nova_compute[226886]: 2026-01-20 14:51:48.854 226890 DEBUG nova.compute.manager [req-62c94edd-98e8-4226-aabc-133ee28005a5 req-216f66bc-728f-4b06-98b0-7295afb9c186 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] No waiting events found dispatching network-vif-unplugged-362a0992-4e48-4999-a396-29fc2957fa09 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:51:48 np0005588920 nova_compute[226886]: 2026-01-20 14:51:48.854 226890 WARNING nova.compute.manager [req-62c94edd-98e8-4226-aabc-133ee28005a5 req-216f66bc-728f-4b06-98b0-7295afb9c186 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Received unexpected event network-vif-unplugged-362a0992-4e48-4999-a396-29fc2957fa09 for instance with vm_state rescued and task_state unrescuing.#033[00m
Jan 20 09:51:48 np0005588920 nova_compute[226886]: 2026-01-20 14:51:48.855 226890 DEBUG nova.compute.manager [req-62c94edd-98e8-4226-aabc-133ee28005a5 req-216f66bc-728f-4b06-98b0-7295afb9c186 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Received event network-vif-plugged-362a0992-4e48-4999-a396-29fc2957fa09 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:51:48 np0005588920 nova_compute[226886]: 2026-01-20 14:51:48.855 226890 DEBUG oslo_concurrency.lockutils [req-62c94edd-98e8-4226-aabc-133ee28005a5 req-216f66bc-728f-4b06-98b0-7295afb9c186 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:48 np0005588920 nova_compute[226886]: 2026-01-20 14:51:48.855 226890 DEBUG oslo_concurrency.lockutils [req-62c94edd-98e8-4226-aabc-133ee28005a5 req-216f66bc-728f-4b06-98b0-7295afb9c186 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:48 np0005588920 nova_compute[226886]: 2026-01-20 14:51:48.855 226890 DEBUG oslo_concurrency.lockutils [req-62c94edd-98e8-4226-aabc-133ee28005a5 req-216f66bc-728f-4b06-98b0-7295afb9c186 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:48 np0005588920 nova_compute[226886]: 2026-01-20 14:51:48.855 226890 DEBUG nova.compute.manager [req-62c94edd-98e8-4226-aabc-133ee28005a5 req-216f66bc-728f-4b06-98b0-7295afb9c186 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] No waiting events found dispatching network-vif-plugged-362a0992-4e48-4999-a396-29fc2957fa09 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:51:48 np0005588920 nova_compute[226886]: 2026-01-20 14:51:48.856 226890 WARNING nova.compute.manager [req-62c94edd-98e8-4226-aabc-133ee28005a5 req-216f66bc-728f-4b06-98b0-7295afb9c186 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Received unexpected event network-vif-plugged-362a0992-4e48-4999-a396-29fc2957fa09 for instance with vm_state rescued and task_state unrescuing.#033[00m
Jan 20 09:51:48 np0005588920 podman[267883]: 2026-01-20 14:51:48.964018844 +0000 UTC m=+0.049045242 container create cd81c17c6c59caeea41e6722ce5e4f159c187fc770455fb6f30ae6bf2d86cb67 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 20 09:51:49 np0005588920 systemd[1]: Started libpod-conmon-cd81c17c6c59caeea41e6722ce5e4f159c187fc770455fb6f30ae6bf2d86cb67.scope.
Jan 20 09:51:49 np0005588920 podman[267883]: 2026-01-20 14:51:48.938565117 +0000 UTC m=+0.023591535 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:51:49 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:51:49 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/afbbe09682e2542425445b6878ca6fc4cfb68106ae1f735574f10d3acdca50c9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:51:49 np0005588920 podman[267883]: 2026-01-20 14:51:49.063014883 +0000 UTC m=+0.148041301 container init cd81c17c6c59caeea41e6722ce5e4f159c187fc770455fb6f30ae6bf2d86cb67 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 20 09:51:49 np0005588920 podman[267883]: 2026-01-20 14:51:49.070546835 +0000 UTC m=+0.155573233 container start cd81c17c6c59caeea41e6722ce5e4f159c187fc770455fb6f30ae6bf2d86cb67 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true)
Jan 20 09:51:49 np0005588920 neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488[267898]: [NOTICE]   (267902) : New worker (267904) forked
Jan 20 09:51:49 np0005588920 neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488[267898]: [NOTICE]   (267902) : Loading success.
Jan 20 09:51:49 np0005588920 nova_compute[226886]: 2026-01-20 14:51:49.433 226890 DEBUG nova.virt.libvirt.host [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Removed pending event for 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 20 09:51:49 np0005588920 nova_compute[226886]: 2026-01-20 14:51:49.434 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920709.4329967, 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:51:49 np0005588920 nova_compute[226886]: 2026-01-20 14:51:49.434 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:51:49 np0005588920 nova_compute[226886]: 2026-01-20 14:51:49.457 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:51:49 np0005588920 nova_compute[226886]: 2026-01-20 14:51:49.463 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:51:49 np0005588920 nova_compute[226886]: 2026-01-20 14:51:49.483 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] During sync_power_state the instance has a pending task (unrescuing). Skip.#033[00m
Jan 20 09:51:49 np0005588920 nova_compute[226886]: 2026-01-20 14:51:49.483 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920709.4364202, 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:51:49 np0005588920 nova_compute[226886]: 2026-01-20 14:51:49.483 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] VM Started (Lifecycle Event)#033[00m
Jan 20 09:51:49 np0005588920 nova_compute[226886]: 2026-01-20 14:51:49.503 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:51:49 np0005588920 nova_compute[226886]: 2026-01-20 14:51:49.506 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:51:49 np0005588920 nova_compute[226886]: 2026-01-20 14:51:49.552 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] During sync_power_state the instance has a pending task (unrescuing). Skip.#033[00m
Jan 20 09:51:49 np0005588920 nova_compute[226886]: 2026-01-20 14:51:49.806 226890 DEBUG nova.compute.manager [None req-b14f2ddc-e98b-4d9c-8fd6-36acf70918d2 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:51:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:49.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:50 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #88. Immutable memtables: 0.
Jan 20 09:51:50 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:51:50.023772) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 09:51:50 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 53] Flushing memtable with next log file: 88
Jan 20 09:51:50 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920710023824, "job": 53, "event": "flush_started", "num_memtables": 1, "num_entries": 2497, "num_deletes": 258, "total_data_size": 5544581, "memory_usage": 5603288, "flush_reason": "Manual Compaction"}
Jan 20 09:51:50 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 53] Level-0 flush table #89: started
Jan 20 09:51:50 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920710064131, "cf_name": "default", "job": 53, "event": "table_file_creation", "file_number": 89, "file_size": 3643008, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 44501, "largest_seqno": 46993, "table_properties": {"data_size": 3632724, "index_size": 6522, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2693, "raw_key_size": 22189, "raw_average_key_size": 21, "raw_value_size": 3611937, "raw_average_value_size": 3430, "num_data_blocks": 281, "num_entries": 1053, "num_filter_entries": 1053, "num_deletions": 258, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768920523, "oldest_key_time": 1768920523, "file_creation_time": 1768920710, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 89, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:51:50 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 53] Flush lasted 40400 microseconds, and 8439 cpu microseconds.
Jan 20 09:51:50 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:51:50 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:51:50.064170) [db/flush_job.cc:967] [default] [JOB 53] Level-0 flush table #89: 3643008 bytes OK
Jan 20 09:51:50 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:51:50.064186) [db/memtable_list.cc:519] [default] Level-0 commit table #89 started
Jan 20 09:51:50 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:51:50.066813) [db/memtable_list.cc:722] [default] Level-0 commit table #89: memtable #1 done
Jan 20 09:51:50 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:51:50.066827) EVENT_LOG_v1 {"time_micros": 1768920710066822, "job": 53, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 09:51:50 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:51:50.066843) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 09:51:50 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 53] Try to delete WAL files size 5533444, prev total WAL file size 5533444, number of live WAL files 2.
Jan 20 09:51:50 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000085.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:51:50 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:51:50.067962) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033373635' seq:72057594037927935, type:22 .. '7061786F730034303137' seq:0, type:0; will stop at (end)
Jan 20 09:51:50 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 54] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 09:51:50 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 53 Base level 0, inputs: [89(3557KB)], [87(8750KB)]
Jan 20 09:51:50 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920710068028, "job": 54, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [89], "files_L6": [87], "score": -1, "input_data_size": 12603663, "oldest_snapshot_seqno": -1}
Jan 20 09:51:50 np0005588920 nova_compute[226886]: 2026-01-20 14:51:50.141 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:50 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 54] Generated table #90: 7211 keys, 10794874 bytes, temperature: kUnknown
Jan 20 09:51:50 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920710184954, "cf_name": "default", "job": 54, "event": "table_file_creation", "file_number": 90, "file_size": 10794874, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10746976, "index_size": 28778, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 18053, "raw_key_size": 185891, "raw_average_key_size": 25, "raw_value_size": 10618356, "raw_average_value_size": 1472, "num_data_blocks": 1139, "num_entries": 7211, "num_filter_entries": 7211, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768920710, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 90, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:51:50 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:51:50 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:51:50.185278) [db/compaction/compaction_job.cc:1663] [default] [JOB 54] Compacted 1@0 + 1@6 files to L6 => 10794874 bytes
Jan 20 09:51:50 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:51:50.186272) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 107.7 rd, 92.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.5, 8.5 +0.0 blob) out(10.3 +0.0 blob), read-write-amplify(6.4) write-amplify(3.0) OK, records in: 7742, records dropped: 531 output_compression: NoCompression
Jan 20 09:51:50 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:51:50.186290) EVENT_LOG_v1 {"time_micros": 1768920710186281, "job": 54, "event": "compaction_finished", "compaction_time_micros": 117038, "compaction_time_cpu_micros": 24456, "output_level": 6, "num_output_files": 1, "total_output_size": 10794874, "num_input_records": 7742, "num_output_records": 7211, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 09:51:50 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000089.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:51:50 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920710186912, "job": 54, "event": "table_file_deletion", "file_number": 89}
Jan 20 09:51:50 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000087.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:51:50 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920710188131, "job": 54, "event": "table_file_deletion", "file_number": 87}
Jan 20 09:51:50 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:51:50.067857) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:51:50 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:51:50.188169) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:51:50 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:51:50.188173) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:51:50 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:51:50.188175) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:51:50 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:51:50.188177) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:51:50 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:51:50.188178) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:51:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:51:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:50.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:51:50 np0005588920 nova_compute[226886]: 2026-01-20 14:51:50.818 226890 DEBUG oslo_concurrency.lockutils [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Acquiring lock "39b00621-bcfb-4aab-b143-42e00c43c0dd" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:50 np0005588920 nova_compute[226886]: 2026-01-20 14:51:50.819 226890 DEBUG oslo_concurrency.lockutils [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Lock "39b00621-bcfb-4aab-b143-42e00c43c0dd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:50 np0005588920 nova_compute[226886]: 2026-01-20 14:51:50.835 226890 DEBUG nova.compute.manager [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:51:50 np0005588920 nova_compute[226886]: 2026-01-20 14:51:50.917 226890 DEBUG oslo_concurrency.lockutils [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:50 np0005588920 nova_compute[226886]: 2026-01-20 14:51:50.918 226890 DEBUG oslo_concurrency.lockutils [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:50 np0005588920 nova_compute[226886]: 2026-01-20 14:51:50.928 226890 DEBUG nova.virt.hardware [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:51:50 np0005588920 nova_compute[226886]: 2026-01-20 14:51:50.928 226890 INFO nova.compute.claims [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 09:51:50 np0005588920 nova_compute[226886]: 2026-01-20 14:51:50.935 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:50 np0005588920 nova_compute[226886]: 2026-01-20 14:51:50.954 226890 DEBUG nova.compute.manager [req-d46d7551-da32-497b-af87-9cb9e95cf263 req-63d9d7f2-ae36-4923-9ed4-9f79a526787f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Received event network-vif-plugged-362a0992-4e48-4999-a396-29fc2957fa09 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:51:50 np0005588920 nova_compute[226886]: 2026-01-20 14:51:50.955 226890 DEBUG oslo_concurrency.lockutils [req-d46d7551-da32-497b-af87-9cb9e95cf263 req-63d9d7f2-ae36-4923-9ed4-9f79a526787f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:50 np0005588920 nova_compute[226886]: 2026-01-20 14:51:50.955 226890 DEBUG oslo_concurrency.lockutils [req-d46d7551-da32-497b-af87-9cb9e95cf263 req-63d9d7f2-ae36-4923-9ed4-9f79a526787f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:50 np0005588920 nova_compute[226886]: 2026-01-20 14:51:50.955 226890 DEBUG oslo_concurrency.lockutils [req-d46d7551-da32-497b-af87-9cb9e95cf263 req-63d9d7f2-ae36-4923-9ed4-9f79a526787f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:50 np0005588920 nova_compute[226886]: 2026-01-20 14:51:50.955 226890 DEBUG nova.compute.manager [req-d46d7551-da32-497b-af87-9cb9e95cf263 req-63d9d7f2-ae36-4923-9ed4-9f79a526787f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] No waiting events found dispatching network-vif-plugged-362a0992-4e48-4999-a396-29fc2957fa09 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:51:50 np0005588920 nova_compute[226886]: 2026-01-20 14:51:50.956 226890 WARNING nova.compute.manager [req-d46d7551-da32-497b-af87-9cb9e95cf263 req-63d9d7f2-ae36-4923-9ed4-9f79a526787f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Received unexpected event network-vif-plugged-362a0992-4e48-4999-a396-29fc2957fa09 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:51:50 np0005588920 nova_compute[226886]: 2026-01-20 14:51:50.956 226890 DEBUG nova.compute.manager [req-d46d7551-da32-497b-af87-9cb9e95cf263 req-63d9d7f2-ae36-4923-9ed4-9f79a526787f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Received event network-vif-plugged-362a0992-4e48-4999-a396-29fc2957fa09 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:51:50 np0005588920 nova_compute[226886]: 2026-01-20 14:51:50.956 226890 DEBUG oslo_concurrency.lockutils [req-d46d7551-da32-497b-af87-9cb9e95cf263 req-63d9d7f2-ae36-4923-9ed4-9f79a526787f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:50 np0005588920 nova_compute[226886]: 2026-01-20 14:51:50.956 226890 DEBUG oslo_concurrency.lockutils [req-d46d7551-da32-497b-af87-9cb9e95cf263 req-63d9d7f2-ae36-4923-9ed4-9f79a526787f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:50 np0005588920 nova_compute[226886]: 2026-01-20 14:51:50.956 226890 DEBUG oslo_concurrency.lockutils [req-d46d7551-da32-497b-af87-9cb9e95cf263 req-63d9d7f2-ae36-4923-9ed4-9f79a526787f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:50 np0005588920 nova_compute[226886]: 2026-01-20 14:51:50.956 226890 DEBUG nova.compute.manager [req-d46d7551-da32-497b-af87-9cb9e95cf263 req-63d9d7f2-ae36-4923-9ed4-9f79a526787f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] No waiting events found dispatching network-vif-plugged-362a0992-4e48-4999-a396-29fc2957fa09 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:51:50 np0005588920 nova_compute[226886]: 2026-01-20 14:51:50.957 226890 WARNING nova.compute.manager [req-d46d7551-da32-497b-af87-9cb9e95cf263 req-63d9d7f2-ae36-4923-9ed4-9f79a526787f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Received unexpected event network-vif-plugged-362a0992-4e48-4999-a396-29fc2957fa09 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:51:51 np0005588920 nova_compute[226886]: 2026-01-20 14:51:51.131 226890 DEBUG oslo_concurrency.processutils [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:51:51 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:51:51 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2764376824' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:51:51 np0005588920 nova_compute[226886]: 2026-01-20 14:51:51.549 226890 DEBUG oslo_concurrency.processutils [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.418s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:51:51 np0005588920 nova_compute[226886]: 2026-01-20 14:51:51.555 226890 DEBUG nova.compute.provider_tree [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:51:51 np0005588920 nova_compute[226886]: 2026-01-20 14:51:51.582 226890 DEBUG nova.scheduler.client.report [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:51:51 np0005588920 nova_compute[226886]: 2026-01-20 14:51:51.608 226890 DEBUG oslo_concurrency.lockutils [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.690s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:51 np0005588920 nova_compute[226886]: 2026-01-20 14:51:51.609 226890 DEBUG nova.compute.manager [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:51:51 np0005588920 nova_compute[226886]: 2026-01-20 14:51:51.654 226890 DEBUG nova.compute.manager [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:51:51 np0005588920 nova_compute[226886]: 2026-01-20 14:51:51.655 226890 DEBUG nova.network.neutron [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:51:51 np0005588920 nova_compute[226886]: 2026-01-20 14:51:51.688 226890 INFO nova.virt.libvirt.driver [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:51:51 np0005588920 nova_compute[226886]: 2026-01-20 14:51:51.703 226890 DEBUG nova.compute.manager [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:51:51 np0005588920 nova_compute[226886]: 2026-01-20 14:51:51.827 226890 DEBUG nova.compute.manager [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:51:51 np0005588920 nova_compute[226886]: 2026-01-20 14:51:51.828 226890 DEBUG nova.virt.libvirt.driver [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:51:51 np0005588920 nova_compute[226886]: 2026-01-20 14:51:51.829 226890 INFO nova.virt.libvirt.driver [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Creating image(s)#033[00m
Jan 20 09:51:51 np0005588920 nova_compute[226886]: 2026-01-20 14:51:51.852 226890 DEBUG nova.storage.rbd_utils [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] rbd image 39b00621-bcfb-4aab-b143-42e00c43c0dd_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:51:51 np0005588920 nova_compute[226886]: 2026-01-20 14:51:51.879 226890 DEBUG nova.storage.rbd_utils [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] rbd image 39b00621-bcfb-4aab-b143-42e00c43c0dd_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:51:51 np0005588920 nova_compute[226886]: 2026-01-20 14:51:51.904 226890 DEBUG nova.storage.rbd_utils [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] rbd image 39b00621-bcfb-4aab-b143-42e00c43c0dd_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:51:51 np0005588920 nova_compute[226886]: 2026-01-20 14:51:51.908 226890 DEBUG oslo_concurrency.processutils [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:51:51 np0005588920 nova_compute[226886]: 2026-01-20 14:51:51.943 226890 DEBUG nova.policy [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'aa11200594ea436083cff39ea4deb712', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '49b57d772e7445cb96a38758bdb38839', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:51:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:51.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:51 np0005588920 nova_compute[226886]: 2026-01-20 14:51:51.976 226890 DEBUG oslo_concurrency.processutils [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:51:51 np0005588920 nova_compute[226886]: 2026-01-20 14:51:51.977 226890 DEBUG oslo_concurrency.lockutils [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:51 np0005588920 nova_compute[226886]: 2026-01-20 14:51:51.978 226890 DEBUG oslo_concurrency.lockutils [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:51 np0005588920 nova_compute[226886]: 2026-01-20 14:51:51.978 226890 DEBUG oslo_concurrency.lockutils [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:52 np0005588920 nova_compute[226886]: 2026-01-20 14:51:52.001 226890 DEBUG nova.storage.rbd_utils [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] rbd image 39b00621-bcfb-4aab-b143-42e00c43c0dd_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:51:52 np0005588920 nova_compute[226886]: 2026-01-20 14:51:52.005 226890 DEBUG oslo_concurrency.processutils [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 39b00621-bcfb-4aab-b143-42e00c43c0dd_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:51:52 np0005588920 nova_compute[226886]: 2026-01-20 14:51:52.243 226890 DEBUG oslo_concurrency.processutils [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 39b00621-bcfb-4aab-b143-42e00c43c0dd_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.239s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:51:52 np0005588920 nova_compute[226886]: 2026-01-20 14:51:52.326 226890 DEBUG nova.storage.rbd_utils [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] resizing rbd image 39b00621-bcfb-4aab-b143-42e00c43c0dd_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:51:52 np0005588920 nova_compute[226886]: 2026-01-20 14:51:52.433 226890 DEBUG nova.objects.instance [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Lazy-loading 'migration_context' on Instance uuid 39b00621-bcfb-4aab-b143-42e00c43c0dd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:51:52 np0005588920 nova_compute[226886]: 2026-01-20 14:51:52.447 226890 DEBUG nova.virt.libvirt.driver [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:51:52 np0005588920 nova_compute[226886]: 2026-01-20 14:51:52.448 226890 DEBUG nova.virt.libvirt.driver [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Ensure instance console log exists: /var/lib/nova/instances/39b00621-bcfb-4aab-b143-42e00c43c0dd/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:51:52 np0005588920 nova_compute[226886]: 2026-01-20 14:51:52.448 226890 DEBUG oslo_concurrency.lockutils [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:52 np0005588920 nova_compute[226886]: 2026-01-20 14:51:52.448 226890 DEBUG oslo_concurrency.lockutils [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:52 np0005588920 nova_compute[226886]: 2026-01-20 14:51:52.448 226890 DEBUG oslo_concurrency.lockutils [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e261 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:51:52 np0005588920 nova_compute[226886]: 2026-01-20 14:51:52.582 226890 DEBUG nova.network.neutron [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Successfully created port: e29d0d86-36df-4494-820c-f61e5065ddc3 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:51:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:52.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:52 np0005588920 podman[268335]: 2026-01-20 14:51:52.855726355 +0000 UTC m=+0.053032645 container exec 6d4eaf8659f13902200468a4ae46f22a0824452b9353ff1491898f5d3b0130c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mon-compute-2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507)
Jan 20 09:51:52 np0005588920 podman[268335]: 2026-01-20 14:51:52.949251699 +0000 UTC m=+0.146557949 container exec_died 6d4eaf8659f13902200468a4ae46f22a0824452b9353ff1491898f5d3b0130c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mon-compute-2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef)
Jan 20 09:51:53 np0005588920 nova_compute[226886]: 2026-01-20 14:51:53.387 226890 DEBUG nova.network.neutron [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Successfully updated port: e29d0d86-36df-4494-820c-f61e5065ddc3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:51:53 np0005588920 nova_compute[226886]: 2026-01-20 14:51:53.655 226890 DEBUG nova.compute.manager [req-06c1eb35-1135-42f5-8aa9-0c6a63ccf90c req-ce6846ea-7b33-4368-a54b-a4199b013fe5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Received event network-changed-e29d0d86-36df-4494-820c-f61e5065ddc3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:51:53 np0005588920 nova_compute[226886]: 2026-01-20 14:51:53.655 226890 DEBUG nova.compute.manager [req-06c1eb35-1135-42f5-8aa9-0c6a63ccf90c req-ce6846ea-7b33-4368-a54b-a4199b013fe5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Refreshing instance network info cache due to event network-changed-e29d0d86-36df-4494-820c-f61e5065ddc3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:51:53 np0005588920 nova_compute[226886]: 2026-01-20 14:51:53.656 226890 DEBUG oslo_concurrency.lockutils [req-06c1eb35-1135-42f5-8aa9-0c6a63ccf90c req-ce6846ea-7b33-4368-a54b-a4199b013fe5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-39b00621-bcfb-4aab-b143-42e00c43c0dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:51:53 np0005588920 nova_compute[226886]: 2026-01-20 14:51:53.656 226890 DEBUG oslo_concurrency.lockutils [req-06c1eb35-1135-42f5-8aa9-0c6a63ccf90c req-ce6846ea-7b33-4368-a54b-a4199b013fe5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-39b00621-bcfb-4aab-b143-42e00c43c0dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:51:53 np0005588920 nova_compute[226886]: 2026-01-20 14:51:53.656 226890 DEBUG nova.network.neutron [req-06c1eb35-1135-42f5-8aa9-0c6a63ccf90c req-ce6846ea-7b33-4368-a54b-a4199b013fe5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Refreshing network info cache for port e29d0d86-36df-4494-820c-f61e5065ddc3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:51:53 np0005588920 nova_compute[226886]: 2026-01-20 14:51:53.668 226890 DEBUG oslo_concurrency.lockutils [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Acquiring lock "refresh_cache-39b00621-bcfb-4aab-b143-42e00c43c0dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:51:53 np0005588920 nova_compute[226886]: 2026-01-20 14:51:53.898 226890 DEBUG nova.network.neutron [req-06c1eb35-1135-42f5-8aa9-0c6a63ccf90c req-ce6846ea-7b33-4368-a54b-a4199b013fe5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:51:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:53.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:54 np0005588920 nova_compute[226886]: 2026-01-20 14:51:54.369 226890 DEBUG nova.network.neutron [req-06c1eb35-1135-42f5-8aa9-0c6a63ccf90c req-ce6846ea-7b33-4368-a54b-a4199b013fe5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:51:54 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:51:54 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:51:54 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:51:54 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:51:54 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:51:54 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:51:54 np0005588920 nova_compute[226886]: 2026-01-20 14:51:54.388 226890 DEBUG oslo_concurrency.lockutils [req-06c1eb35-1135-42f5-8aa9-0c6a63ccf90c req-ce6846ea-7b33-4368-a54b-a4199b013fe5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-39b00621-bcfb-4aab-b143-42e00c43c0dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:51:54 np0005588920 nova_compute[226886]: 2026-01-20 14:51:54.389 226890 DEBUG oslo_concurrency.lockutils [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Acquired lock "refresh_cache-39b00621-bcfb-4aab-b143-42e00c43c0dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:51:54 np0005588920 nova_compute[226886]: 2026-01-20 14:51:54.389 226890 DEBUG nova.network.neutron [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:51:54 np0005588920 nova_compute[226886]: 2026-01-20 14:51:54.594 226890 DEBUG nova.network.neutron [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:51:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:54.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:55 np0005588920 nova_compute[226886]: 2026-01-20 14:51:55.143 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:55 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:51:55 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:51:55 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:51:55 np0005588920 nova_compute[226886]: 2026-01-20 14:51:55.936 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:55.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:55 np0005588920 podman[268590]: 2026-01-20 14:51:55.979524785 +0000 UTC m=+0.057557863 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 20 09:51:56 np0005588920 nova_compute[226886]: 2026-01-20 14:51:56.082 226890 DEBUG nova.network.neutron [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Updating instance_info_cache with network_info: [{"id": "e29d0d86-36df-4494-820c-f61e5065ddc3", "address": "fa:16:3e:3a:55:44", "network": {"id": "c347502b-6836-4cd9-a846-16c6c878d910", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-882734659-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "49b57d772e7445cb96a38758bdb38839", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape29d0d86-36", "ovs_interfaceid": "e29d0d86-36df-4494-820c-f61e5065ddc3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:51:56 np0005588920 nova_compute[226886]: 2026-01-20 14:51:56.100 226890 DEBUG oslo_concurrency.lockutils [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Releasing lock "refresh_cache-39b00621-bcfb-4aab-b143-42e00c43c0dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:51:56 np0005588920 nova_compute[226886]: 2026-01-20 14:51:56.100 226890 DEBUG nova.compute.manager [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Instance network_info: |[{"id": "e29d0d86-36df-4494-820c-f61e5065ddc3", "address": "fa:16:3e:3a:55:44", "network": {"id": "c347502b-6836-4cd9-a846-16c6c878d910", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-882734659-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "49b57d772e7445cb96a38758bdb38839", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape29d0d86-36", "ovs_interfaceid": "e29d0d86-36df-4494-820c-f61e5065ddc3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:51:56 np0005588920 nova_compute[226886]: 2026-01-20 14:51:56.102 226890 DEBUG nova.virt.libvirt.driver [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Start _get_guest_xml network_info=[{"id": "e29d0d86-36df-4494-820c-f61e5065ddc3", "address": "fa:16:3e:3a:55:44", "network": {"id": "c347502b-6836-4cd9-a846-16c6c878d910", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-882734659-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "49b57d772e7445cb96a38758bdb38839", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape29d0d86-36", "ovs_interfaceid": "e29d0d86-36df-4494-820c-f61e5065ddc3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:51:56 np0005588920 nova_compute[226886]: 2026-01-20 14:51:56.110 226890 WARNING nova.virt.libvirt.driver [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:51:56 np0005588920 nova_compute[226886]: 2026-01-20 14:51:56.116 226890 DEBUG nova.virt.libvirt.host [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:51:56 np0005588920 nova_compute[226886]: 2026-01-20 14:51:56.117 226890 DEBUG nova.virt.libvirt.host [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:51:56 np0005588920 nova_compute[226886]: 2026-01-20 14:51:56.121 226890 DEBUG nova.virt.libvirt.host [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:51:56 np0005588920 nova_compute[226886]: 2026-01-20 14:51:56.122 226890 DEBUG nova.virt.libvirt.host [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:51:56 np0005588920 nova_compute[226886]: 2026-01-20 14:51:56.124 226890 DEBUG nova.virt.libvirt.driver [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:51:56 np0005588920 nova_compute[226886]: 2026-01-20 14:51:56.124 226890 DEBUG nova.virt.hardware [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:51:56 np0005588920 nova_compute[226886]: 2026-01-20 14:51:56.125 226890 DEBUG nova.virt.hardware [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:51:56 np0005588920 nova_compute[226886]: 2026-01-20 14:51:56.126 226890 DEBUG nova.virt.hardware [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:51:56 np0005588920 nova_compute[226886]: 2026-01-20 14:51:56.126 226890 DEBUG nova.virt.hardware [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:51:56 np0005588920 nova_compute[226886]: 2026-01-20 14:51:56.127 226890 DEBUG nova.virt.hardware [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:51:56 np0005588920 nova_compute[226886]: 2026-01-20 14:51:56.127 226890 DEBUG nova.virt.hardware [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:51:56 np0005588920 nova_compute[226886]: 2026-01-20 14:51:56.128 226890 DEBUG nova.virt.hardware [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:51:56 np0005588920 nova_compute[226886]: 2026-01-20 14:51:56.129 226890 DEBUG nova.virt.hardware [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:51:56 np0005588920 nova_compute[226886]: 2026-01-20 14:51:56.129 226890 DEBUG nova.virt.hardware [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:51:56 np0005588920 nova_compute[226886]: 2026-01-20 14:51:56.129 226890 DEBUG nova.virt.hardware [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:51:56 np0005588920 nova_compute[226886]: 2026-01-20 14:51:56.130 226890 DEBUG nova.virt.hardware [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:51:56 np0005588920 nova_compute[226886]: 2026-01-20 14:51:56.133 226890 DEBUG oslo_concurrency.processutils [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:51:56 np0005588920 nova_compute[226886]: 2026-01-20 14:51:56.562 226890 DEBUG oslo_concurrency.processutils [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:51:56 np0005588920 nova_compute[226886]: 2026-01-20 14:51:56.600 226890 DEBUG nova.storage.rbd_utils [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] rbd image 39b00621-bcfb-4aab-b143-42e00c43c0dd_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:51:56 np0005588920 nova_compute[226886]: 2026-01-20 14:51:56.606 226890 DEBUG oslo_concurrency.processutils [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:51:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:51:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:56.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:51:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:51:57 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3771881679' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:51:57 np0005588920 nova_compute[226886]: 2026-01-20 14:51:57.061 226890 DEBUG oslo_concurrency.processutils [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:51:57 np0005588920 nova_compute[226886]: 2026-01-20 14:51:57.063 226890 DEBUG nova.virt.libvirt.vif [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:51:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-InstanceActionsV221TestJSON-server-1192903785',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-instanceactionsv221testjson-server-1192903785',id=116,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='49b57d772e7445cb96a38758bdb38839',ramdisk_id='',reservation_id='r-buooy2x3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-InstanceActionsV221TestJSON-158243655',owner_user_name='tempest-InstanceActionsV221TestJSON-158243655-project-
member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:51:51Z,user_data=None,user_id='aa11200594ea436083cff39ea4deb712',uuid=39b00621-bcfb-4aab-b143-42e00c43c0dd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e29d0d86-36df-4494-820c-f61e5065ddc3", "address": "fa:16:3e:3a:55:44", "network": {"id": "c347502b-6836-4cd9-a846-16c6c878d910", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-882734659-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "49b57d772e7445cb96a38758bdb38839", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape29d0d86-36", "ovs_interfaceid": "e29d0d86-36df-4494-820c-f61e5065ddc3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:51:57 np0005588920 nova_compute[226886]: 2026-01-20 14:51:57.063 226890 DEBUG nova.network.os_vif_util [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Converting VIF {"id": "e29d0d86-36df-4494-820c-f61e5065ddc3", "address": "fa:16:3e:3a:55:44", "network": {"id": "c347502b-6836-4cd9-a846-16c6c878d910", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-882734659-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "49b57d772e7445cb96a38758bdb38839", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape29d0d86-36", "ovs_interfaceid": "e29d0d86-36df-4494-820c-f61e5065ddc3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:51:57 np0005588920 nova_compute[226886]: 2026-01-20 14:51:57.064 226890 DEBUG nova.network.os_vif_util [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3a:55:44,bridge_name='br-int',has_traffic_filtering=True,id=e29d0d86-36df-4494-820c-f61e5065ddc3,network=Network(c347502b-6836-4cd9-a846-16c6c878d910),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape29d0d86-36') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:51:57 np0005588920 nova_compute[226886]: 2026-01-20 14:51:57.066 226890 DEBUG nova.objects.instance [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Lazy-loading 'pci_devices' on Instance uuid 39b00621-bcfb-4aab-b143-42e00c43c0dd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:51:57 np0005588920 nova_compute[226886]: 2026-01-20 14:51:57.087 226890 DEBUG nova.virt.libvirt.driver [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:51:57 np0005588920 nova_compute[226886]:  <uuid>39b00621-bcfb-4aab-b143-42e00c43c0dd</uuid>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:  <name>instance-00000074</name>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:51:57 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:      <nova:name>tempest-InstanceActionsV221TestJSON-server-1192903785</nova:name>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:51:56</nova:creationTime>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:51:57 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:        <nova:user uuid="aa11200594ea436083cff39ea4deb712">tempest-InstanceActionsV221TestJSON-158243655-project-member</nova:user>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:        <nova:project uuid="49b57d772e7445cb96a38758bdb38839">tempest-InstanceActionsV221TestJSON-158243655</nova:project>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:        <nova:port uuid="e29d0d86-36df-4494-820c-f61e5065ddc3">
Jan 20 09:51:57 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:      <entry name="serial">39b00621-bcfb-4aab-b143-42e00c43c0dd</entry>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:      <entry name="uuid">39b00621-bcfb-4aab-b143-42e00c43c0dd</entry>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:51:57 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/39b00621-bcfb-4aab-b143-42e00c43c0dd_disk">
Jan 20 09:51:57 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:51:57 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:51:57 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/39b00621-bcfb-4aab-b143-42e00c43c0dd_disk.config">
Jan 20 09:51:57 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:51:57 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:51:57 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:3a:55:44"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:      <target dev="tape29d0d86-36"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:51:57 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/39b00621-bcfb-4aab-b143-42e00c43c0dd/console.log" append="off"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:51:57 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:51:57 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:51:57 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:51:57 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:51:57 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:51:57 np0005588920 nova_compute[226886]: 2026-01-20 14:51:57.088 226890 DEBUG nova.compute.manager [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Preparing to wait for external event network-vif-plugged-e29d0d86-36df-4494-820c-f61e5065ddc3 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:51:57 np0005588920 nova_compute[226886]: 2026-01-20 14:51:57.089 226890 DEBUG oslo_concurrency.lockutils [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Acquiring lock "39b00621-bcfb-4aab-b143-42e00c43c0dd-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:57 np0005588920 nova_compute[226886]: 2026-01-20 14:51:57.089 226890 DEBUG oslo_concurrency.lockutils [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Lock "39b00621-bcfb-4aab-b143-42e00c43c0dd-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:57 np0005588920 nova_compute[226886]: 2026-01-20 14:51:57.089 226890 DEBUG oslo_concurrency.lockutils [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Lock "39b00621-bcfb-4aab-b143-42e00c43c0dd-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:57 np0005588920 nova_compute[226886]: 2026-01-20 14:51:57.090 226890 DEBUG nova.virt.libvirt.vif [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:51:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-InstanceActionsV221TestJSON-server-1192903785',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-instanceactionsv221testjson-server-1192903785',id=116,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='49b57d772e7445cb96a38758bdb38839',ramdisk_id='',reservation_id='r-buooy2x3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-InstanceActionsV221TestJSON-158243655',owner_user_name='tempest-InstanceActionsV221TestJSON-158243655-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:51:51Z,user_data=None,user_id='aa11200594ea436083cff39ea4deb712',uuid=39b00621-bcfb-4aab-b143-42e00c43c0dd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e29d0d86-36df-4494-820c-f61e5065ddc3", "address": "fa:16:3e:3a:55:44", "network": {"id": "c347502b-6836-4cd9-a846-16c6c878d910", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-882734659-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "49b57d772e7445cb96a38758bdb38839", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape29d0d86-36", "ovs_interfaceid": "e29d0d86-36df-4494-820c-f61e5065ddc3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:51:57 np0005588920 nova_compute[226886]: 2026-01-20 14:51:57.090 226890 DEBUG nova.network.os_vif_util [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Converting VIF {"id": "e29d0d86-36df-4494-820c-f61e5065ddc3", "address": "fa:16:3e:3a:55:44", "network": {"id": "c347502b-6836-4cd9-a846-16c6c878d910", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-882734659-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "49b57d772e7445cb96a38758bdb38839", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape29d0d86-36", "ovs_interfaceid": "e29d0d86-36df-4494-820c-f61e5065ddc3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:51:57 np0005588920 nova_compute[226886]: 2026-01-20 14:51:57.091 226890 DEBUG nova.network.os_vif_util [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3a:55:44,bridge_name='br-int',has_traffic_filtering=True,id=e29d0d86-36df-4494-820c-f61e5065ddc3,network=Network(c347502b-6836-4cd9-a846-16c6c878d910),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape29d0d86-36') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:51:57 np0005588920 nova_compute[226886]: 2026-01-20 14:51:57.092 226890 DEBUG os_vif [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3a:55:44,bridge_name='br-int',has_traffic_filtering=True,id=e29d0d86-36df-4494-820c-f61e5065ddc3,network=Network(c347502b-6836-4cd9-a846-16c6c878d910),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape29d0d86-36') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:51:57 np0005588920 nova_compute[226886]: 2026-01-20 14:51:57.093 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:57 np0005588920 nova_compute[226886]: 2026-01-20 14:51:57.094 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:51:57 np0005588920 nova_compute[226886]: 2026-01-20 14:51:57.095 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:51:57 np0005588920 nova_compute[226886]: 2026-01-20 14:51:57.100 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:57 np0005588920 nova_compute[226886]: 2026-01-20 14:51:57.100 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape29d0d86-36, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:51:57 np0005588920 nova_compute[226886]: 2026-01-20 14:51:57.101 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape29d0d86-36, col_values=(('external_ids', {'iface-id': 'e29d0d86-36df-4494-820c-f61e5065ddc3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3a:55:44', 'vm-uuid': '39b00621-bcfb-4aab-b143-42e00c43c0dd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:51:57 np0005588920 NetworkManager[49076]: <info>  [1768920717.1032] manager: (tape29d0d86-36): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/274)
Jan 20 09:51:57 np0005588920 nova_compute[226886]: 2026-01-20 14:51:57.103 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:51:57 np0005588920 nova_compute[226886]: 2026-01-20 14:51:57.109 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:57 np0005588920 nova_compute[226886]: 2026-01-20 14:51:57.111 226890 INFO os_vif [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3a:55:44,bridge_name='br-int',has_traffic_filtering=True,id=e29d0d86-36df-4494-820c-f61e5065ddc3,network=Network(c347502b-6836-4cd9-a846-16c6c878d910),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape29d0d86-36')#033[00m
Jan 20 09:51:57 np0005588920 nova_compute[226886]: 2026-01-20 14:51:57.171 226890 DEBUG nova.virt.libvirt.driver [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:51:57 np0005588920 nova_compute[226886]: 2026-01-20 14:51:57.172 226890 DEBUG nova.virt.libvirt.driver [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:51:57 np0005588920 nova_compute[226886]: 2026-01-20 14:51:57.172 226890 DEBUG nova.virt.libvirt.driver [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] No VIF found with MAC fa:16:3e:3a:55:44, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:51:57 np0005588920 nova_compute[226886]: 2026-01-20 14:51:57.172 226890 INFO nova.virt.libvirt.driver [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Using config drive#033[00m
Jan 20 09:51:57 np0005588920 nova_compute[226886]: 2026-01-20 14:51:57.204 226890 DEBUG nova.storage.rbd_utils [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] rbd image 39b00621-bcfb-4aab-b143-42e00c43c0dd_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:51:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e261 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:51:57 np0005588920 nova_compute[226886]: 2026-01-20 14:51:57.905 226890 INFO nova.virt.libvirt.driver [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Creating config drive at /var/lib/nova/instances/39b00621-bcfb-4aab-b143-42e00c43c0dd/disk.config#033[00m
Jan 20 09:51:57 np0005588920 nova_compute[226886]: 2026-01-20 14:51:57.913 226890 DEBUG oslo_concurrency.processutils [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/39b00621-bcfb-4aab-b143-42e00c43c0dd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdv51isjf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:51:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:51:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:57.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:51:58 np0005588920 nova_compute[226886]: 2026-01-20 14:51:58.046 226890 DEBUG oslo_concurrency.processutils [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/39b00621-bcfb-4aab-b143-42e00c43c0dd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdv51isjf" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:51:58 np0005588920 nova_compute[226886]: 2026-01-20 14:51:58.081 226890 DEBUG nova.storage.rbd_utils [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] rbd image 39b00621-bcfb-4aab-b143-42e00c43c0dd_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:51:58 np0005588920 nova_compute[226886]: 2026-01-20 14:51:58.085 226890 DEBUG oslo_concurrency.processutils [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/39b00621-bcfb-4aab-b143-42e00c43c0dd/disk.config 39b00621-bcfb-4aab-b143-42e00c43c0dd_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:51:58 np0005588920 nova_compute[226886]: 2026-01-20 14:51:58.226 226890 DEBUG oslo_concurrency.processutils [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/39b00621-bcfb-4aab-b143-42e00c43c0dd/disk.config 39b00621-bcfb-4aab-b143-42e00c43c0dd_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:51:58 np0005588920 nova_compute[226886]: 2026-01-20 14:51:58.227 226890 INFO nova.virt.libvirt.driver [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Deleting local config drive /var/lib/nova/instances/39b00621-bcfb-4aab-b143-42e00c43c0dd/disk.config because it was imported into RBD.#033[00m
Jan 20 09:51:58 np0005588920 kernel: tape29d0d86-36: entered promiscuous mode
Jan 20 09:51:58 np0005588920 NetworkManager[49076]: <info>  [1768920718.2675] manager: (tape29d0d86-36): new Tun device (/org/freedesktop/NetworkManager/Devices/275)
Jan 20 09:51:58 np0005588920 nova_compute[226886]: 2026-01-20 14:51:58.274 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:58 np0005588920 ovn_controller[133971]: 2026-01-20T14:51:58Z|00552|binding|INFO|Claiming lport e29d0d86-36df-4494-820c-f61e5065ddc3 for this chassis.
Jan 20 09:51:58 np0005588920 ovn_controller[133971]: 2026-01-20T14:51:58Z|00553|binding|INFO|e29d0d86-36df-4494-820c-f61e5065ddc3: Claiming fa:16:3e:3a:55:44 10.100.0.3
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:58.290 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3a:55:44 10.100.0.3'], port_security=['fa:16:3e:3a:55:44 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '39b00621-bcfb-4aab-b143-42e00c43c0dd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c347502b-6836-4cd9-a846-16c6c878d910', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '49b57d772e7445cb96a38758bdb38839', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0c167a5b-5102-4fa0-ab52-aad817b51b21', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=acd04ce9-6917-4246-8adc-41a149c9acb0, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=e29d0d86-36df-4494-820c-f61e5065ddc3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:58.292 144128 INFO neutron.agent.ovn.metadata.agent [-] Port e29d0d86-36df-4494-820c-f61e5065ddc3 in datapath c347502b-6836-4cd9-a846-16c6c878d910 bound to our chassis#033[00m
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:58.293 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c347502b-6836-4cd9-a846-16c6c878d910#033[00m
Jan 20 09:51:58 np0005588920 systemd-machined[196121]: New machine qemu-54-instance-00000074.
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:58.309 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0a9602da-a8b6-4d89-9492-69989e9d868d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:58.311 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc347502b-61 in ovnmeta-c347502b-6836-4cd9-a846-16c6c878d910 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:58.313 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc347502b-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:58.313 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[7e99a301-69b1-4791-9f6a-65a0a5a5b53a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:58.314 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2676aa41-d520-41a5-a669-28ef177d9ddf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:58 np0005588920 systemd[1]: Started Virtual Machine qemu-54-instance-00000074.
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:58.325 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[db224a0a-3311-4b20-8af9-e9cd2c031b9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:58 np0005588920 systemd-udevd[268748]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:51:58 np0005588920 NetworkManager[49076]: <info>  [1768920718.3479] device (tape29d0d86-36): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:51:58 np0005588920 NetworkManager[49076]: <info>  [1768920718.3484] device (tape29d0d86-36): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:58.351 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[6bfe0817-8605-4cc2-ae08-19b9b806ee60]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:58 np0005588920 nova_compute[226886]: 2026-01-20 14:51:58.357 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:58 np0005588920 ovn_controller[133971]: 2026-01-20T14:51:58Z|00554|binding|INFO|Setting lport e29d0d86-36df-4494-820c-f61e5065ddc3 ovn-installed in OVS
Jan 20 09:51:58 np0005588920 ovn_controller[133971]: 2026-01-20T14:51:58Z|00555|binding|INFO|Setting lport e29d0d86-36df-4494-820c-f61e5065ddc3 up in Southbound
Jan 20 09:51:58 np0005588920 nova_compute[226886]: 2026-01-20 14:51:58.362 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:58.381 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[473dde66-a591-4c0e-b2e2-3faa5dc1c95f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:58.387 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[8b8edff9-f758-4aeb-b66c-0495776a9163]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:58 np0005588920 NetworkManager[49076]: <info>  [1768920718.3876] manager: (tapc347502b-60): new Veth device (/org/freedesktop/NetworkManager/Devices/276)
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:58.425 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[fee6eb65-bb66-4ba8-9047-f0d8cb35181d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:58.428 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[cc0ab79a-5f19-4468-be95-54b60bea7b9d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:58 np0005588920 NetworkManager[49076]: <info>  [1768920718.4664] device (tapc347502b-60): carrier: link connected
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:58.472 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[c212ebe6-e1d0-4ddf-93ec-f42c33138882]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:58.491 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[3e3482d5-e48a-416f-b898-9083aa621b6a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc347502b-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:23:c2:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 178], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 576015, 'reachable_time': 42273, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 268779, 'error': None, 'target': 'ovnmeta-c347502b-6836-4cd9-a846-16c6c878d910', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:58.506 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[48bf1875-ccef-4eeb-befb-617c9960c918]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe23:c281'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 576015, 'tstamp': 576015}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 268780, 'error': None, 'target': 'ovnmeta-c347502b-6836-4cd9-a846-16c6c878d910', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:58 np0005588920 nova_compute[226886]: 2026-01-20 14:51:58.522 226890 DEBUG nova.compute.manager [req-bb4221b1-72d2-4df6-b799-b47fcf826b44 req-b114c729-1f6f-47d9-bf28-e8106bac11cb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Received event network-vif-plugged-e29d0d86-36df-4494-820c-f61e5065ddc3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:51:58 np0005588920 nova_compute[226886]: 2026-01-20 14:51:58.523 226890 DEBUG oslo_concurrency.lockutils [req-bb4221b1-72d2-4df6-b799-b47fcf826b44 req-b114c729-1f6f-47d9-bf28-e8106bac11cb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "39b00621-bcfb-4aab-b143-42e00c43c0dd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:58 np0005588920 nova_compute[226886]: 2026-01-20 14:51:58.523 226890 DEBUG oslo_concurrency.lockutils [req-bb4221b1-72d2-4df6-b799-b47fcf826b44 req-b114c729-1f6f-47d9-bf28-e8106bac11cb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "39b00621-bcfb-4aab-b143-42e00c43c0dd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:58 np0005588920 nova_compute[226886]: 2026-01-20 14:51:58.524 226890 DEBUG oslo_concurrency.lockutils [req-bb4221b1-72d2-4df6-b799-b47fcf826b44 req-b114c729-1f6f-47d9-bf28-e8106bac11cb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "39b00621-bcfb-4aab-b143-42e00c43c0dd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:58 np0005588920 nova_compute[226886]: 2026-01-20 14:51:58.524 226890 DEBUG nova.compute.manager [req-bb4221b1-72d2-4df6-b799-b47fcf826b44 req-b114c729-1f6f-47d9-bf28-e8106bac11cb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Processing event network-vif-plugged-e29d0d86-36df-4494-820c-f61e5065ddc3 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:58.530 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5a188d61-1f2b-4b57-9a29-c4b49da79023]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc347502b-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:23:c2:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 178], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 576015, 'reachable_time': 42273, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 268781, 'error': None, 'target': 'ovnmeta-c347502b-6836-4cd9-a846-16c6c878d910', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:58.565 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b3e6d9df-1c02-4ae8-9aba-f18f7ca5c6b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:58.625 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[828da399-ed76-489b-8717-b894909c1b86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:58.627 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc347502b-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:58.627 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:58.628 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc347502b-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:51:58 np0005588920 nova_compute[226886]: 2026-01-20 14:51:58.630 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:58 np0005588920 kernel: tapc347502b-60: entered promiscuous mode
Jan 20 09:51:58 np0005588920 NetworkManager[49076]: <info>  [1768920718.6329] manager: (tapc347502b-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/277)
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:58.634 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc347502b-60, col_values=(('external_ids', {'iface-id': '97800f4d-9382-463e-9764-32d1417e471f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:51:58 np0005588920 nova_compute[226886]: 2026-01-20 14:51:58.635 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:58 np0005588920 ovn_controller[133971]: 2026-01-20T14:51:58Z|00556|binding|INFO|Releasing lport 97800f4d-9382-463e-9764-32d1417e471f from this chassis (sb_readonly=0)
Jan 20 09:51:58 np0005588920 nova_compute[226886]: 2026-01-20 14:51:58.649 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:58.650 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c347502b-6836-4cd9-a846-16c6c878d910.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c347502b-6836-4cd9-a846-16c6c878d910.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:58.651 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f5639ce2-541d-4bb3-96bf-25c40761f2b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:58.651 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-c347502b-6836-4cd9-a846-16c6c878d910
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/c347502b-6836-4cd9-a846-16c6c878d910.pid.haproxy
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID c347502b-6836-4cd9-a846-16c6c878d910
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:51:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:51:58.652 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c347502b-6836-4cd9-a846-16c6c878d910', 'env', 'PROCESS_TAG=haproxy-c347502b-6836-4cd9-a846-16c6c878d910', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c347502b-6836-4cd9-a846-16c6c878d910.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:51:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:58.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:59 np0005588920 podman[268813]: 2026-01-20 14:51:59.071327074 +0000 UTC m=+0.068281025 container create 109facd10dd2c902f3dc61abae8a6726adf3a29ce9aac4647a3973ced086f769 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c347502b-6836-4cd9-a846-16c6c878d910, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 20 09:51:59 np0005588920 systemd[1]: Started libpod-conmon-109facd10dd2c902f3dc61abae8a6726adf3a29ce9aac4647a3973ced086f769.scope.
Jan 20 09:51:59 np0005588920 podman[268813]: 2026-01-20 14:51:59.036919895 +0000 UTC m=+0.033873886 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:51:59 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:51:59 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/38c5dc830394aead65e430fe859accad889d171b8e4c71f3d1c93e164b1bac5c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:51:59 np0005588920 podman[268813]: 2026-01-20 14:51:59.164171779 +0000 UTC m=+0.161125760 container init 109facd10dd2c902f3dc61abae8a6726adf3a29ce9aac4647a3973ced086f769 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c347502b-6836-4cd9-a846-16c6c878d910, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:51:59 np0005588920 podman[268813]: 2026-01-20 14:51:59.170218079 +0000 UTC m=+0.167172030 container start 109facd10dd2c902f3dc61abae8a6726adf3a29ce9aac4647a3973ced086f769 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c347502b-6836-4cd9-a846-16c6c878d910, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 20 09:51:59 np0005588920 neutron-haproxy-ovnmeta-c347502b-6836-4cd9-a846-16c6c878d910[268828]: [NOTICE]   (268833) : New worker (268835) forked
Jan 20 09:51:59 np0005588920 neutron-haproxy-ovnmeta-c347502b-6836-4cd9-a846-16c6c878d910[268828]: [NOTICE]   (268833) : Loading success.
Jan 20 09:51:59 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:51:59 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/256008764' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:51:59 np0005588920 nova_compute[226886]: 2026-01-20 14:51:59.797 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920719.7971911, 39b00621-bcfb-4aab-b143-42e00c43c0dd => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:51:59 np0005588920 nova_compute[226886]: 2026-01-20 14:51:59.798 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] VM Started (Lifecycle Event)#033[00m
Jan 20 09:51:59 np0005588920 nova_compute[226886]: 2026-01-20 14:51:59.800 226890 DEBUG nova.compute.manager [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:51:59 np0005588920 nova_compute[226886]: 2026-01-20 14:51:59.805 226890 DEBUG nova.virt.libvirt.driver [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:51:59 np0005588920 nova_compute[226886]: 2026-01-20 14:51:59.808 226890 INFO nova.virt.libvirt.driver [-] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Instance spawned successfully.#033[00m
Jan 20 09:51:59 np0005588920 nova_compute[226886]: 2026-01-20 14:51:59.809 226890 DEBUG nova.virt.libvirt.driver [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:51:59 np0005588920 nova_compute[226886]: 2026-01-20 14:51:59.830 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:51:59 np0005588920 nova_compute[226886]: 2026-01-20 14:51:59.835 226890 DEBUG nova.virt.libvirt.driver [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:51:59 np0005588920 nova_compute[226886]: 2026-01-20 14:51:59.836 226890 DEBUG nova.virt.libvirt.driver [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:51:59 np0005588920 nova_compute[226886]: 2026-01-20 14:51:59.836 226890 DEBUG nova.virt.libvirt.driver [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:51:59 np0005588920 nova_compute[226886]: 2026-01-20 14:51:59.837 226890 DEBUG nova.virt.libvirt.driver [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:51:59 np0005588920 nova_compute[226886]: 2026-01-20 14:51:59.838 226890 DEBUG nova.virt.libvirt.driver [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:51:59 np0005588920 nova_compute[226886]: 2026-01-20 14:51:59.838 226890 DEBUG nova.virt.libvirt.driver [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:51:59 np0005588920 nova_compute[226886]: 2026-01-20 14:51:59.842 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:51:59 np0005588920 nova_compute[226886]: 2026-01-20 14:51:59.876 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:51:59 np0005588920 nova_compute[226886]: 2026-01-20 14:51:59.876 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920719.7973099, 39b00621-bcfb-4aab-b143-42e00c43c0dd => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:51:59 np0005588920 nova_compute[226886]: 2026-01-20 14:51:59.876 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:51:59 np0005588920 nova_compute[226886]: 2026-01-20 14:51:59.897 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:51:59 np0005588920 nova_compute[226886]: 2026-01-20 14:51:59.901 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920719.8026617, 39b00621-bcfb-4aab-b143-42e00c43c0dd => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:51:59 np0005588920 nova_compute[226886]: 2026-01-20 14:51:59.902 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:51:59 np0005588920 nova_compute[226886]: 2026-01-20 14:51:59.904 226890 INFO nova.compute.manager [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Took 8.08 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:51:59 np0005588920 nova_compute[226886]: 2026-01-20 14:51:59.905 226890 DEBUG nova.compute.manager [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:51:59 np0005588920 nova_compute[226886]: 2026-01-20 14:51:59.928 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:51:59 np0005588920 nova_compute[226886]: 2026-01-20 14:51:59.931 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:51:59 np0005588920 nova_compute[226886]: 2026-01-20 14:51:59.955 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:51:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:51:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:59.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:59 np0005588920 nova_compute[226886]: 2026-01-20 14:51:59.968 226890 INFO nova.compute.manager [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Took 9.08 seconds to build instance.#033[00m
Jan 20 09:51:59 np0005588920 nova_compute[226886]: 2026-01-20 14:51:59.985 226890 DEBUG oslo_concurrency.lockutils [None req-142b018b-6559-4fa6-beef-b46dcbf3c506 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Lock "39b00621-bcfb-4aab-b143-42e00c43c0dd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.166s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:52:00 np0005588920 nova_compute[226886]: 2026-01-20 14:52:00.628 226890 DEBUG nova.compute.manager [req-1fbd54d9-2f1d-4ee6-9fce-7f7031ed738e req-631733cc-cdb5-438b-aab3-dfd88914456e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Received event network-vif-plugged-e29d0d86-36df-4494-820c-f61e5065ddc3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:52:00 np0005588920 nova_compute[226886]: 2026-01-20 14:52:00.628 226890 DEBUG oslo_concurrency.lockutils [req-1fbd54d9-2f1d-4ee6-9fce-7f7031ed738e req-631733cc-cdb5-438b-aab3-dfd88914456e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "39b00621-bcfb-4aab-b143-42e00c43c0dd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:52:00 np0005588920 nova_compute[226886]: 2026-01-20 14:52:00.629 226890 DEBUG oslo_concurrency.lockutils [req-1fbd54d9-2f1d-4ee6-9fce-7f7031ed738e req-631733cc-cdb5-438b-aab3-dfd88914456e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "39b00621-bcfb-4aab-b143-42e00c43c0dd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:52:00 np0005588920 nova_compute[226886]: 2026-01-20 14:52:00.629 226890 DEBUG oslo_concurrency.lockutils [req-1fbd54d9-2f1d-4ee6-9fce-7f7031ed738e req-631733cc-cdb5-438b-aab3-dfd88914456e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "39b00621-bcfb-4aab-b143-42e00c43c0dd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:52:00 np0005588920 nova_compute[226886]: 2026-01-20 14:52:00.630 226890 DEBUG nova.compute.manager [req-1fbd54d9-2f1d-4ee6-9fce-7f7031ed738e req-631733cc-cdb5-438b-aab3-dfd88914456e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] No waiting events found dispatching network-vif-plugged-e29d0d86-36df-4494-820c-f61e5065ddc3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:52:00 np0005588920 nova_compute[226886]: 2026-01-20 14:52:00.630 226890 WARNING nova.compute.manager [req-1fbd54d9-2f1d-4ee6-9fce-7f7031ed738e req-631733cc-cdb5-438b-aab3-dfd88914456e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Received unexpected event network-vif-plugged-e29d0d86-36df-4494-820c-f61e5065ddc3 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:52:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:00.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:00 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:52:00 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:52:00 np0005588920 nova_compute[226886]: 2026-01-20 14:52:00.939 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:01 np0005588920 nova_compute[226886]: 2026-01-20 14:52:01.363 226890 DEBUG oslo_concurrency.lockutils [None req-1abb8113-5703-4ed6-9dbf-794ffaaa4619 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Acquiring lock "39b00621-bcfb-4aab-b143-42e00c43c0dd" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:52:01 np0005588920 nova_compute[226886]: 2026-01-20 14:52:01.364 226890 DEBUG oslo_concurrency.lockutils [None req-1abb8113-5703-4ed6-9dbf-794ffaaa4619 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Lock "39b00621-bcfb-4aab-b143-42e00c43c0dd" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:52:01 np0005588920 nova_compute[226886]: 2026-01-20 14:52:01.364 226890 DEBUG oslo_concurrency.lockutils [None req-1abb8113-5703-4ed6-9dbf-794ffaaa4619 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Acquiring lock "39b00621-bcfb-4aab-b143-42e00c43c0dd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:52:01 np0005588920 nova_compute[226886]: 2026-01-20 14:52:01.365 226890 DEBUG oslo_concurrency.lockutils [None req-1abb8113-5703-4ed6-9dbf-794ffaaa4619 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Lock "39b00621-bcfb-4aab-b143-42e00c43c0dd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:52:01 np0005588920 nova_compute[226886]: 2026-01-20 14:52:01.365 226890 DEBUG oslo_concurrency.lockutils [None req-1abb8113-5703-4ed6-9dbf-794ffaaa4619 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Lock "39b00621-bcfb-4aab-b143-42e00c43c0dd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:52:01 np0005588920 nova_compute[226886]: 2026-01-20 14:52:01.366 226890 INFO nova.compute.manager [None req-1abb8113-5703-4ed6-9dbf-794ffaaa4619 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Terminating instance#033[00m
Jan 20 09:52:01 np0005588920 nova_compute[226886]: 2026-01-20 14:52:01.367 226890 DEBUG nova.compute.manager [None req-1abb8113-5703-4ed6-9dbf-794ffaaa4619 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:52:01 np0005588920 kernel: tape29d0d86-36 (unregistering): left promiscuous mode
Jan 20 09:52:01 np0005588920 NetworkManager[49076]: <info>  [1768920721.4136] device (tape29d0d86-36): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:52:01 np0005588920 nova_compute[226886]: 2026-01-20 14:52:01.470 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:01 np0005588920 ovn_controller[133971]: 2026-01-20T14:52:01Z|00557|binding|INFO|Releasing lport e29d0d86-36df-4494-820c-f61e5065ddc3 from this chassis (sb_readonly=0)
Jan 20 09:52:01 np0005588920 ovn_controller[133971]: 2026-01-20T14:52:01Z|00558|binding|INFO|Setting lport e29d0d86-36df-4494-820c-f61e5065ddc3 down in Southbound
Jan 20 09:52:01 np0005588920 ovn_controller[133971]: 2026-01-20T14:52:01Z|00559|binding|INFO|Removing iface tape29d0d86-36 ovn-installed in OVS
Jan 20 09:52:01 np0005588920 nova_compute[226886]: 2026-01-20 14:52:01.474 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:01 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:52:01.478 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3a:55:44 10.100.0.3'], port_security=['fa:16:3e:3a:55:44 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '39b00621-bcfb-4aab-b143-42e00c43c0dd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c347502b-6836-4cd9-a846-16c6c878d910', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '49b57d772e7445cb96a38758bdb38839', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0c167a5b-5102-4fa0-ab52-aad817b51b21', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=acd04ce9-6917-4246-8adc-41a149c9acb0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=e29d0d86-36df-4494-820c-f61e5065ddc3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:52:01 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:52:01.479 144128 INFO neutron.agent.ovn.metadata.agent [-] Port e29d0d86-36df-4494-820c-f61e5065ddc3 in datapath c347502b-6836-4cd9-a846-16c6c878d910 unbound from our chassis#033[00m
Jan 20 09:52:01 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:52:01.480 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c347502b-6836-4cd9-a846-16c6c878d910, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:52:01 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:52:01.481 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0eefccf1-a934-412c-8bfe-9745c17a7261]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:01 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:52:01.482 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c347502b-6836-4cd9-a846-16c6c878d910 namespace which is not needed anymore#033[00m
Jan 20 09:52:01 np0005588920 nova_compute[226886]: 2026-01-20 14:52:01.487 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:01 np0005588920 systemd[1]: machine-qemu\x2d54\x2dinstance\x2d00000074.scope: Deactivated successfully.
Jan 20 09:52:01 np0005588920 systemd[1]: machine-qemu\x2d54\x2dinstance\x2d00000074.scope: Consumed 3.008s CPU time.
Jan 20 09:52:01 np0005588920 systemd-machined[196121]: Machine qemu-54-instance-00000074 terminated.
Jan 20 09:52:01 np0005588920 nova_compute[226886]: 2026-01-20 14:52:01.598 226890 INFO nova.virt.libvirt.driver [-] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Instance destroyed successfully.#033[00m
Jan 20 09:52:01 np0005588920 nova_compute[226886]: 2026-01-20 14:52:01.599 226890 DEBUG nova.objects.instance [None req-1abb8113-5703-4ed6-9dbf-794ffaaa4619 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Lazy-loading 'resources' on Instance uuid 39b00621-bcfb-4aab-b143-42e00c43c0dd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:52:01 np0005588920 neutron-haproxy-ovnmeta-c347502b-6836-4cd9-a846-16c6c878d910[268828]: [NOTICE]   (268833) : haproxy version is 2.8.14-c23fe91
Jan 20 09:52:01 np0005588920 neutron-haproxy-ovnmeta-c347502b-6836-4cd9-a846-16c6c878d910[268828]: [NOTICE]   (268833) : path to executable is /usr/sbin/haproxy
Jan 20 09:52:01 np0005588920 neutron-haproxy-ovnmeta-c347502b-6836-4cd9-a846-16c6c878d910[268828]: [WARNING]  (268833) : Exiting Master process...
Jan 20 09:52:01 np0005588920 neutron-haproxy-ovnmeta-c347502b-6836-4cd9-a846-16c6c878d910[268828]: [WARNING]  (268833) : Exiting Master process...
Jan 20 09:52:01 np0005588920 neutron-haproxy-ovnmeta-c347502b-6836-4cd9-a846-16c6c878d910[268828]: [ALERT]    (268833) : Current worker (268835) exited with code 143 (Terminated)
Jan 20 09:52:01 np0005588920 neutron-haproxy-ovnmeta-c347502b-6836-4cd9-a846-16c6c878d910[268828]: [WARNING]  (268833) : All workers exited. Exiting... (0)
Jan 20 09:52:01 np0005588920 systemd[1]: libpod-109facd10dd2c902f3dc61abae8a6726adf3a29ce9aac4647a3973ced086f769.scope: Deactivated successfully.
Jan 20 09:52:01 np0005588920 podman[268957]: 2026-01-20 14:52:01.637634861 +0000 UTC m=+0.075097187 container died 109facd10dd2c902f3dc61abae8a6726adf3a29ce9aac4647a3973ced086f769 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c347502b-6836-4cd9-a846-16c6c878d910, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 20 09:52:01 np0005588920 nova_compute[226886]: 2026-01-20 14:52:01.640 226890 DEBUG nova.virt.libvirt.vif [None req-1abb8113-5703-4ed6-9dbf-794ffaaa4619 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:51:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-InstanceActionsV221TestJSON-server-1192903785',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-instanceactionsv221testjson-server-1192903785',id=116,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:51:59Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='49b57d772e7445cb96a38758bdb38839',ramdisk_id='',reservation_id='r-buooy2x3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-InstanceActionsV221TestJSON-158243655',owner_user_name='tempest-InstanceActionsV221TestJSON-158243655-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:51:59Z,user_data=None,user_id='aa11200594ea436083cff39ea4deb712',uuid=39b00621-bcfb-4aab-b143-42e00c43c0dd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e29d0d86-36df-4494-820c-f61e5065ddc3", "address": "fa:16:3e:3a:55:44", "network": {"id": "c347502b-6836-4cd9-a846-16c6c878d910", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-882734659-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "49b57d772e7445cb96a38758bdb38839", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape29d0d86-36", "ovs_interfaceid": "e29d0d86-36df-4494-820c-f61e5065ddc3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:52:01 np0005588920 nova_compute[226886]: 2026-01-20 14:52:01.641 226890 DEBUG nova.network.os_vif_util [None req-1abb8113-5703-4ed6-9dbf-794ffaaa4619 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Converting VIF {"id": "e29d0d86-36df-4494-820c-f61e5065ddc3", "address": "fa:16:3e:3a:55:44", "network": {"id": "c347502b-6836-4cd9-a846-16c6c878d910", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-882734659-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "49b57d772e7445cb96a38758bdb38839", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape29d0d86-36", "ovs_interfaceid": "e29d0d86-36df-4494-820c-f61e5065ddc3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:52:01 np0005588920 nova_compute[226886]: 2026-01-20 14:52:01.642 226890 DEBUG nova.network.os_vif_util [None req-1abb8113-5703-4ed6-9dbf-794ffaaa4619 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3a:55:44,bridge_name='br-int',has_traffic_filtering=True,id=e29d0d86-36df-4494-820c-f61e5065ddc3,network=Network(c347502b-6836-4cd9-a846-16c6c878d910),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape29d0d86-36') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:52:01 np0005588920 nova_compute[226886]: 2026-01-20 14:52:01.642 226890 DEBUG os_vif [None req-1abb8113-5703-4ed6-9dbf-794ffaaa4619 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3a:55:44,bridge_name='br-int',has_traffic_filtering=True,id=e29d0d86-36df-4494-820c-f61e5065ddc3,network=Network(c347502b-6836-4cd9-a846-16c6c878d910),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape29d0d86-36') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:52:01 np0005588920 nova_compute[226886]: 2026-01-20 14:52:01.644 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:01 np0005588920 nova_compute[226886]: 2026-01-20 14:52:01.645 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape29d0d86-36, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:52:01 np0005588920 nova_compute[226886]: 2026-01-20 14:52:01.647 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:01 np0005588920 nova_compute[226886]: 2026-01-20 14:52:01.648 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:01 np0005588920 nova_compute[226886]: 2026-01-20 14:52:01.650 226890 INFO os_vif [None req-1abb8113-5703-4ed6-9dbf-794ffaaa4619 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3a:55:44,bridge_name='br-int',has_traffic_filtering=True,id=e29d0d86-36df-4494-820c-f61e5065ddc3,network=Network(c347502b-6836-4cd9-a846-16c6c878d910),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape29d0d86-36')#033[00m
Jan 20 09:52:01 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-109facd10dd2c902f3dc61abae8a6726adf3a29ce9aac4647a3973ced086f769-userdata-shm.mount: Deactivated successfully.
Jan 20 09:52:01 np0005588920 systemd[1]: var-lib-containers-storage-overlay-38c5dc830394aead65e430fe859accad889d171b8e4c71f3d1c93e164b1bac5c-merged.mount: Deactivated successfully.
Jan 20 09:52:01 np0005588920 podman[268957]: 2026-01-20 14:52:01.681753593 +0000 UTC m=+0.119215919 container cleanup 109facd10dd2c902f3dc61abae8a6726adf3a29ce9aac4647a3973ced086f769 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c347502b-6836-4cd9-a846-16c6c878d910, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 20 09:52:01 np0005588920 systemd[1]: libpod-conmon-109facd10dd2c902f3dc61abae8a6726adf3a29ce9aac4647a3973ced086f769.scope: Deactivated successfully.
Jan 20 09:52:01 np0005588920 podman[269014]: 2026-01-20 14:52:01.736052293 +0000 UTC m=+0.035692857 container remove 109facd10dd2c902f3dc61abae8a6726adf3a29ce9aac4647a3973ced086f769 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c347502b-6836-4cd9-a846-16c6c878d910, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:52:01 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:52:01.741 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[507900e0-7bb8-498c-82e9-0294782a1a44]: (4, ('Tue Jan 20 02:52:01 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-c347502b-6836-4cd9-a846-16c6c878d910 (109facd10dd2c902f3dc61abae8a6726adf3a29ce9aac4647a3973ced086f769)\n109facd10dd2c902f3dc61abae8a6726adf3a29ce9aac4647a3973ced086f769\nTue Jan 20 02:52:01 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-c347502b-6836-4cd9-a846-16c6c878d910 (109facd10dd2c902f3dc61abae8a6726adf3a29ce9aac4647a3973ced086f769)\n109facd10dd2c902f3dc61abae8a6726adf3a29ce9aac4647a3973ced086f769\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:01 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:52:01.743 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[db4f35a8-193b-4bf0-b265-7aabd57228d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:01 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:52:01.744 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc347502b-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:52:01 np0005588920 nova_compute[226886]: 2026-01-20 14:52:01.746 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:01 np0005588920 kernel: tapc347502b-60: left promiscuous mode
Jan 20 09:52:01 np0005588920 nova_compute[226886]: 2026-01-20 14:52:01.761 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:01 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:52:01.765 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f1da16b3-5e61-4228-b7b0-46a1fbe8d16e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:01 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:52:01.778 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0997e1c5-16d7-4e1f-8728-ec32fba8b066]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:01 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:52:01.779 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[60d8b2b5-578b-498a-a932-d8147f1546a9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:01 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:52:01.795 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d553db02-a315-486b-8781-6b1dd10f53a2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 576006, 'reachable_time': 15941, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 269029, 'error': None, 'target': 'ovnmeta-c347502b-6836-4cd9-a846-16c6c878d910', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:01 np0005588920 systemd[1]: run-netns-ovnmeta\x2dc347502b\x2d6836\x2d4cd9\x2da846\x2d16c6c878d910.mount: Deactivated successfully.
Jan 20 09:52:01 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:52:01.799 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c347502b-6836-4cd9-a846-16c6c878d910 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:52:01 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:52:01.799 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[3399d203-7891-4dec-ac06-2d942c643322]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:01 np0005588920 ovn_controller[133971]: 2026-01-20T14:52:01Z|00077|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:83:1f:c0 10.100.0.5
Jan 20 09:52:01 np0005588920 ovn_controller[133971]: 2026-01-20T14:52:01Z|00078|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:83:1f:c0 10.100.0.5
Jan 20 09:52:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:01.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:01 np0005588920 nova_compute[226886]: 2026-01-20 14:52:01.990 226890 INFO nova.virt.libvirt.driver [None req-1abb8113-5703-4ed6-9dbf-794ffaaa4619 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Deleting instance files /var/lib/nova/instances/39b00621-bcfb-4aab-b143-42e00c43c0dd_del#033[00m
Jan 20 09:52:01 np0005588920 nova_compute[226886]: 2026-01-20 14:52:01.991 226890 INFO nova.virt.libvirt.driver [None req-1abb8113-5703-4ed6-9dbf-794ffaaa4619 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Deletion of /var/lib/nova/instances/39b00621-bcfb-4aab-b143-42e00c43c0dd_del complete#033[00m
Jan 20 09:52:02 np0005588920 nova_compute[226886]: 2026-01-20 14:52:02.053 226890 INFO nova.compute.manager [None req-1abb8113-5703-4ed6-9dbf-794ffaaa4619 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Took 0.69 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:52:02 np0005588920 nova_compute[226886]: 2026-01-20 14:52:02.053 226890 DEBUG oslo.service.loopingcall [None req-1abb8113-5703-4ed6-9dbf-794ffaaa4619 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:52:02 np0005588920 nova_compute[226886]: 2026-01-20 14:52:02.054 226890 DEBUG nova.compute.manager [-] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:52:02 np0005588920 nova_compute[226886]: 2026-01-20 14:52:02.054 226890 DEBUG nova.network.neutron [-] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:52:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e261 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:52:02 np0005588920 nova_compute[226886]: 2026-01-20 14:52:02.757 226890 DEBUG nova.compute.manager [req-b3bf3c1c-1fed-4f57-ba57-b10398f2bec4 req-b67be893-9ebb-4432-9147-ae90394bf5af 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Received event network-vif-unplugged-e29d0d86-36df-4494-820c-f61e5065ddc3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:52:02 np0005588920 nova_compute[226886]: 2026-01-20 14:52:02.757 226890 DEBUG oslo_concurrency.lockutils [req-b3bf3c1c-1fed-4f57-ba57-b10398f2bec4 req-b67be893-9ebb-4432-9147-ae90394bf5af 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "39b00621-bcfb-4aab-b143-42e00c43c0dd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:52:02 np0005588920 nova_compute[226886]: 2026-01-20 14:52:02.757 226890 DEBUG oslo_concurrency.lockutils [req-b3bf3c1c-1fed-4f57-ba57-b10398f2bec4 req-b67be893-9ebb-4432-9147-ae90394bf5af 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "39b00621-bcfb-4aab-b143-42e00c43c0dd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:52:02 np0005588920 nova_compute[226886]: 2026-01-20 14:52:02.758 226890 DEBUG oslo_concurrency.lockutils [req-b3bf3c1c-1fed-4f57-ba57-b10398f2bec4 req-b67be893-9ebb-4432-9147-ae90394bf5af 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "39b00621-bcfb-4aab-b143-42e00c43c0dd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:52:02 np0005588920 nova_compute[226886]: 2026-01-20 14:52:02.758 226890 DEBUG nova.compute.manager [req-b3bf3c1c-1fed-4f57-ba57-b10398f2bec4 req-b67be893-9ebb-4432-9147-ae90394bf5af 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] No waiting events found dispatching network-vif-unplugged-e29d0d86-36df-4494-820c-f61e5065ddc3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:52:02 np0005588920 nova_compute[226886]: 2026-01-20 14:52:02.758 226890 DEBUG nova.compute.manager [req-b3bf3c1c-1fed-4f57-ba57-b10398f2bec4 req-b67be893-9ebb-4432-9147-ae90394bf5af 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Received event network-vif-unplugged-e29d0d86-36df-4494-820c-f61e5065ddc3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:52:02 np0005588920 nova_compute[226886]: 2026-01-20 14:52:02.758 226890 DEBUG nova.compute.manager [req-b3bf3c1c-1fed-4f57-ba57-b10398f2bec4 req-b67be893-9ebb-4432-9147-ae90394bf5af 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Received event network-vif-plugged-e29d0d86-36df-4494-820c-f61e5065ddc3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:52:02 np0005588920 nova_compute[226886]: 2026-01-20 14:52:02.759 226890 DEBUG oslo_concurrency.lockutils [req-b3bf3c1c-1fed-4f57-ba57-b10398f2bec4 req-b67be893-9ebb-4432-9147-ae90394bf5af 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "39b00621-bcfb-4aab-b143-42e00c43c0dd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:52:02 np0005588920 nova_compute[226886]: 2026-01-20 14:52:02.759 226890 DEBUG oslo_concurrency.lockutils [req-b3bf3c1c-1fed-4f57-ba57-b10398f2bec4 req-b67be893-9ebb-4432-9147-ae90394bf5af 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "39b00621-bcfb-4aab-b143-42e00c43c0dd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:52:02 np0005588920 nova_compute[226886]: 2026-01-20 14:52:02.759 226890 DEBUG oslo_concurrency.lockutils [req-b3bf3c1c-1fed-4f57-ba57-b10398f2bec4 req-b67be893-9ebb-4432-9147-ae90394bf5af 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "39b00621-bcfb-4aab-b143-42e00c43c0dd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:52:02 np0005588920 nova_compute[226886]: 2026-01-20 14:52:02.759 226890 DEBUG nova.compute.manager [req-b3bf3c1c-1fed-4f57-ba57-b10398f2bec4 req-b67be893-9ebb-4432-9147-ae90394bf5af 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] No waiting events found dispatching network-vif-plugged-e29d0d86-36df-4494-820c-f61e5065ddc3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:52:02 np0005588920 nova_compute[226886]: 2026-01-20 14:52:02.760 226890 WARNING nova.compute.manager [req-b3bf3c1c-1fed-4f57-ba57-b10398f2bec4 req-b67be893-9ebb-4432-9147-ae90394bf5af 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Received unexpected event network-vif-plugged-e29d0d86-36df-4494-820c-f61e5065ddc3 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:52:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:02.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:02 np0005588920 nova_compute[226886]: 2026-01-20 14:52:02.935 226890 DEBUG nova.network.neutron [-] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:52:02 np0005588920 nova_compute[226886]: 2026-01-20 14:52:02.956 226890 INFO nova.compute.manager [-] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Took 0.90 seconds to deallocate network for instance.#033[00m
Jan 20 09:52:03 np0005588920 nova_compute[226886]: 2026-01-20 14:52:03.047 226890 DEBUG oslo_concurrency.lockutils [None req-1abb8113-5703-4ed6-9dbf-794ffaaa4619 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:52:03 np0005588920 nova_compute[226886]: 2026-01-20 14:52:03.048 226890 DEBUG oslo_concurrency.lockutils [None req-1abb8113-5703-4ed6-9dbf-794ffaaa4619 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:52:03 np0005588920 nova_compute[226886]: 2026-01-20 14:52:03.134 226890 DEBUG oslo_concurrency.processutils [None req-1abb8113-5703-4ed6-9dbf-794ffaaa4619 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:52:03 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:52:03 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3728354836' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:52:03 np0005588920 nova_compute[226886]: 2026-01-20 14:52:03.564 226890 DEBUG oslo_concurrency.processutils [None req-1abb8113-5703-4ed6-9dbf-794ffaaa4619 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.430s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:52:03 np0005588920 nova_compute[226886]: 2026-01-20 14:52:03.574 226890 DEBUG nova.compute.provider_tree [None req-1abb8113-5703-4ed6-9dbf-794ffaaa4619 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:52:03 np0005588920 nova_compute[226886]: 2026-01-20 14:52:03.596 226890 DEBUG nova.scheduler.client.report [None req-1abb8113-5703-4ed6-9dbf-794ffaaa4619 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:52:03 np0005588920 nova_compute[226886]: 2026-01-20 14:52:03.623 226890 DEBUG oslo_concurrency.lockutils [None req-1abb8113-5703-4ed6-9dbf-794ffaaa4619 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.575s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:52:03 np0005588920 nova_compute[226886]: 2026-01-20 14:52:03.645 226890 INFO nova.scheduler.client.report [None req-1abb8113-5703-4ed6-9dbf-794ffaaa4619 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Deleted allocations for instance 39b00621-bcfb-4aab-b143-42e00c43c0dd#033[00m
Jan 20 09:52:03 np0005588920 nova_compute[226886]: 2026-01-20 14:52:03.716 226890 DEBUG oslo_concurrency.lockutils [None req-1abb8113-5703-4ed6-9dbf-794ffaaa4619 aa11200594ea436083cff39ea4deb712 49b57d772e7445cb96a38758bdb38839 - - default default] Lock "39b00621-bcfb-4aab-b143-42e00c43c0dd" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.352s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:52:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:03.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:04.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:05 np0005588920 nova_compute[226886]: 2026-01-20 14:52:05.178 226890 DEBUG nova.compute.manager [req-183cc910-17bd-4518-b418-c94abc6b867a req-81f0d1c5-19f7-4d65-8247-53c451152f16 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Received event network-vif-deleted-e29d0d86-36df-4494-820c-f61e5065ddc3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:52:05 np0005588920 nova_compute[226886]: 2026-01-20 14:52:05.941 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:05.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:06 np0005588920 nova_compute[226886]: 2026-01-20 14:52:06.648 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:52:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:06.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:52:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e262 e262: 3 total, 3 up, 3 in
Jan 20 09:52:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:52:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:52:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:07.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:52:08 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e263 e263: 3 total, 3 up, 3 in
Jan 20 09:52:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:08.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e264 e264: 3 total, 3 up, 3 in
Jan 20 09:52:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:52:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:09.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:52:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:10.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:10 np0005588920 nova_compute[226886]: 2026-01-20 14:52:10.945 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:11 np0005588920 nova_compute[226886]: 2026-01-20 14:52:11.649 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:52:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:11.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:52:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e264 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:52:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:12.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:52:12.810 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=37, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=36) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:52:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:52:12.811 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:52:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:52:12.812 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '37'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:52:12 np0005588920 nova_compute[226886]: 2026-01-20 14:52:12.812 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:13 np0005588920 ovn_controller[133971]: 2026-01-20T14:52:13Z|00560|binding|INFO|Releasing lport b033e9e6-9781-4424-a20f-7b48a14e2c80 from this chassis (sb_readonly=0)
Jan 20 09:52:13 np0005588920 nova_compute[226886]: 2026-01-20 14:52:13.600 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:13.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e265 e265: 3 total, 3 up, 3 in
Jan 20 09:52:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:14.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:15 np0005588920 nova_compute[226886]: 2026-01-20 14:52:15.947 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:15.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:52:16.453 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:52:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:52:16.454 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:52:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:52:16.455 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:52:16 np0005588920 nova_compute[226886]: 2026-01-20 14:52:16.597 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768920721.5961032, 39b00621-bcfb-4aab-b143-42e00c43c0dd => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:52:16 np0005588920 nova_compute[226886]: 2026-01-20 14:52:16.598 226890 INFO nova.compute.manager [-] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:52:16 np0005588920 nova_compute[226886]: 2026-01-20 14:52:16.651 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:16.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:17 np0005588920 podman[269053]: 2026-01-20 14:52:17.042997147 +0000 UTC m=+0.131639149 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 20 09:52:17 np0005588920 nova_compute[226886]: 2026-01-20 14:52:17.127 226890 DEBUG nova.compute.manager [None req-38f3fb09-1937-45bf-ac53-8c21572e2446 - - - - - -] [instance: 39b00621-bcfb-4aab-b143-42e00c43c0dd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:52:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e265 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:52:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:17.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:52:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:18.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:52:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:19.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:52:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:20.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:52:20 np0005588920 nova_compute[226886]: 2026-01-20 14:52:20.949 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:21 np0005588920 nova_compute[226886]: 2026-01-20 14:52:21.653 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:21.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e265 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:52:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:22.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:52:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:23.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:52:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:24.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:25 np0005588920 nova_compute[226886]: 2026-01-20 14:52:25.952 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:25.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:26 np0005588920 nova_compute[226886]: 2026-01-20 14:52:26.654 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:52:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:26.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:52:27 np0005588920 podman[269080]: 2026-01-20 14:52:27.003011896 +0000 UTC m=+0.075218620 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 20 09:52:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e265 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:52:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:27.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:28.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:52:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:30.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:52:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:30.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:30 np0005588920 nova_compute[226886]: 2026-01-20 14:52:30.953 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:31 np0005588920 nova_compute[226886]: 2026-01-20 14:52:31.656 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:32.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e265 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:52:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:52:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:32.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:52:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:52:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:34.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:52:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:34.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:35 np0005588920 nova_compute[226886]: 2026-01-20 14:52:35.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:52:35 np0005588920 nova_compute[226886]: 2026-01-20 14:52:35.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:52:35 np0005588920 nova_compute[226886]: 2026-01-20 14:52:35.726 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 09:52:35 np0005588920 nova_compute[226886]: 2026-01-20 14:52:35.955 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:36.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:36 np0005588920 nova_compute[226886]: 2026-01-20 14:52:36.657 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:36 np0005588920 nova_compute[226886]: 2026-01-20 14:52:36.813 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "refresh_cache-7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:52:36 np0005588920 nova_compute[226886]: 2026-01-20 14:52:36.813 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquired lock "refresh_cache-7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:52:36 np0005588920 nova_compute[226886]: 2026-01-20 14:52:36.813 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 09:52:36 np0005588920 nova_compute[226886]: 2026-01-20 14:52:36.813 226890 DEBUG nova.objects.instance [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:52:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:52:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:36.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:52:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e265 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:52:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e266 e266: 3 total, 3 up, 3 in
Jan 20 09:52:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:38.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:38 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e267 e267: 3 total, 3 up, 3 in
Jan 20 09:52:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:38.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:39 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e268 e268: 3 total, 3 up, 3 in
Jan 20 09:52:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:40.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:52:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:40.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:52:40 np0005588920 nova_compute[226886]: 2026-01-20 14:52:40.957 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:41 np0005588920 nova_compute[226886]: 2026-01-20 14:52:41.487 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Updating instance_info_cache with network_info: [{"id": "362a0992-4e48-4999-a396-29fc2957fa09", "address": "fa:16:3e:83:1f:c0", "network": {"id": "79184781-1f23-4584-87de-08e262242488", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a29915e0dd2403fbd7b7e847696b00a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap362a0992-4e", "ovs_interfaceid": "362a0992-4e48-4999-a396-29fc2957fa09", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:52:41 np0005588920 nova_compute[226886]: 2026-01-20 14:52:41.503 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Releasing lock "refresh_cache-7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:52:41 np0005588920 nova_compute[226886]: 2026-01-20 14:52:41.504 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 09:52:41 np0005588920 nova_compute[226886]: 2026-01-20 14:52:41.504 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:52:41 np0005588920 nova_compute[226886]: 2026-01-20 14:52:41.504 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:52:41 np0005588920 nova_compute[226886]: 2026-01-20 14:52:41.505 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:52:41 np0005588920 nova_compute[226886]: 2026-01-20 14:52:41.505 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:52:41 np0005588920 nova_compute[226886]: 2026-01-20 14:52:41.544 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:52:41 np0005588920 nova_compute[226886]: 2026-01-20 14:52:41.545 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:52:41 np0005588920 nova_compute[226886]: 2026-01-20 14:52:41.545 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:52:41 np0005588920 nova_compute[226886]: 2026-01-20 14:52:41.545 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:52:41 np0005588920 nova_compute[226886]: 2026-01-20 14:52:41.545 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:52:41 np0005588920 nova_compute[226886]: 2026-01-20 14:52:41.659 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:41 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:52:41 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2883729981' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:52:41 np0005588920 nova_compute[226886]: 2026-01-20 14:52:41.974 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:52:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:52:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:42.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:52:42 np0005588920 nova_compute[226886]: 2026-01-20 14:52:42.058 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-00000070 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:52:42 np0005588920 nova_compute[226886]: 2026-01-20 14:52:42.059 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-00000070 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:52:42 np0005588920 nova_compute[226886]: 2026-01-20 14:52:42.202 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:52:42 np0005588920 nova_compute[226886]: 2026-01-20 14:52:42.203 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4226MB free_disk=20.806049346923828GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:52:42 np0005588920 nova_compute[226886]: 2026-01-20 14:52:42.204 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:52:42 np0005588920 nova_compute[226886]: 2026-01-20 14:52:42.204 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:52:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:52:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:42.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:43 np0005588920 nova_compute[226886]: 2026-01-20 14:52:43.801 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:52:43 np0005588920 nova_compute[226886]: 2026-01-20 14:52:43.802 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:52:43 np0005588920 nova_compute[226886]: 2026-01-20 14:52:43.802 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:52:43 np0005588920 nova_compute[226886]: 2026-01-20 14:52:43.890 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:52:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:52:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:44.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:52:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:52:44 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1761720748' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:52:44 np0005588920 nova_compute[226886]: 2026-01-20 14:52:44.351 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:52:44 np0005588920 nova_compute[226886]: 2026-01-20 14:52:44.358 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:52:44 np0005588920 nova_compute[226886]: 2026-01-20 14:52:44.441 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:52:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e269 e269: 3 total, 3 up, 3 in
Jan 20 09:52:44 np0005588920 nova_compute[226886]: 2026-01-20 14:52:44.714 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:52:44 np0005588920 nova_compute[226886]: 2026-01-20 14:52:44.714 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.510s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:52:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:44.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:45 np0005588920 nova_compute[226886]: 2026-01-20 14:52:45.959 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:46.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:46 np0005588920 nova_compute[226886]: 2026-01-20 14:52:46.662 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:46.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:46 np0005588920 nova_compute[226886]: 2026-01-20 14:52:46.935 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:52:46 np0005588920 nova_compute[226886]: 2026-01-20 14:52:46.936 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:52:47 np0005588920 nova_compute[226886]: 2026-01-20 14:52:47.044 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:52:47 np0005588920 nova_compute[226886]: 2026-01-20 14:52:47.045 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:52:47 np0005588920 nova_compute[226886]: 2026-01-20 14:52:47.045 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:52:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e269 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:52:47 np0005588920 nova_compute[226886]: 2026-01-20 14:52:47.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:52:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:52:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:48.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:52:48 np0005588920 podman[269145]: 2026-01-20 14:52:48.056031791 +0000 UTC m=+0.134918531 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 20 09:52:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:48.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:52:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:50.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:52:50 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e270 e270: 3 total, 3 up, 3 in
Jan 20 09:52:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:52:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:50.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:52:50 np0005588920 nova_compute[226886]: 2026-01-20 14:52:50.964 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:51 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e271 e271: 3 total, 3 up, 3 in
Jan 20 09:52:51 np0005588920 nova_compute[226886]: 2026-01-20 14:52:51.664 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:52.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e272 e272: 3 total, 3 up, 3 in
Jan 20 09:52:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e272 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:52:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:52:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:52.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:52:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:54.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:54.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:55 np0005588920 nova_compute[226886]: 2026-01-20 14:52:55.968 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:52:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:56.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:52:56 np0005588920 nova_compute[226886]: 2026-01-20 14:52:56.666 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:56.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e272 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:52:57 np0005588920 podman[269171]: 2026-01-20 14:52:57.978762198 +0000 UTC m=+0.068873951 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 20 09:52:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:58.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:52:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:52:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:58.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:52:59 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e273 e273: 3 total, 3 up, 3 in
Jan 20 09:53:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:00.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:00 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e274 e274: 3 total, 3 up, 3 in
Jan 20 09:53:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:00.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:00 np0005588920 nova_compute[226886]: 2026-01-20 14:53:00.972 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:01 np0005588920 nova_compute[226886]: 2026-01-20 14:53:01.690 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:01 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e275 e275: 3 total, 3 up, 3 in
Jan 20 09:53:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:02.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e275 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:53:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e276 e276: 3 total, 3 up, 3 in
Jan 20 09:53:02 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:53:02 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:53:02 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:53:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:02.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:04.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:04.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:05 np0005588920 nova_compute[226886]: 2026-01-20 14:53:05.973 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:06.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:06 np0005588920 nova_compute[226886]: 2026-01-20 14:53:06.693 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:06.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e277 e277: 3 total, 3 up, 3 in
Jan 20 09:53:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e277 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:53:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:53:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:08.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:53:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:08.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:09 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:53:09 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:53:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e278 e278: 3 total, 3 up, 3 in
Jan 20 09:53:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:10.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:10 np0005588920 nova_compute[226886]: 2026-01-20 14:53:10.643 226890 DEBUG oslo_concurrency.lockutils [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Acquiring lock "49919d3f-fab0-404f-a0a0-82610973a254" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:53:10 np0005588920 nova_compute[226886]: 2026-01-20 14:53:10.644 226890 DEBUG oslo_concurrency.lockutils [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "49919d3f-fab0-404f-a0a0-82610973a254" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:53:10 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e279 e279: 3 total, 3 up, 3 in
Jan 20 09:53:10 np0005588920 nova_compute[226886]: 2026-01-20 14:53:10.675 226890 DEBUG nova.compute.manager [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:53:10 np0005588920 nova_compute[226886]: 2026-01-20 14:53:10.767 226890 DEBUG oslo_concurrency.lockutils [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:53:10 np0005588920 nova_compute[226886]: 2026-01-20 14:53:10.767 226890 DEBUG oslo_concurrency.lockutils [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:53:10 np0005588920 nova_compute[226886]: 2026-01-20 14:53:10.774 226890 DEBUG nova.virt.hardware [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:53:10 np0005588920 nova_compute[226886]: 2026-01-20 14:53:10.774 226890 INFO nova.compute.claims [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 09:53:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:53:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:10.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:53:10 np0005588920 nova_compute[226886]: 2026-01-20 14:53:10.958 226890 DEBUG oslo_concurrency.processutils [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:53:10 np0005588920 nova_compute[226886]: 2026-01-20 14:53:10.981 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:11 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:53:11 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1108883090' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:53:11 np0005588920 nova_compute[226886]: 2026-01-20 14:53:11.404 226890 DEBUG oslo_concurrency.processutils [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:53:11 np0005588920 nova_compute[226886]: 2026-01-20 14:53:11.410 226890 DEBUG nova.compute.provider_tree [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:53:11 np0005588920 nova_compute[226886]: 2026-01-20 14:53:11.484 226890 DEBUG nova.scheduler.client.report [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:53:11 np0005588920 nova_compute[226886]: 2026-01-20 14:53:11.546 226890 DEBUG oslo_concurrency.lockutils [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.779s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:53:11 np0005588920 nova_compute[226886]: 2026-01-20 14:53:11.547 226890 DEBUG nova.compute.manager [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:53:11 np0005588920 nova_compute[226886]: 2026-01-20 14:53:11.686 226890 DEBUG nova.compute.manager [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:53:11 np0005588920 nova_compute[226886]: 2026-01-20 14:53:11.687 226890 DEBUG nova.network.neutron [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:53:11 np0005588920 nova_compute[226886]: 2026-01-20 14:53:11.694 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:11 np0005588920 nova_compute[226886]: 2026-01-20 14:53:11.785 226890 INFO nova.virt.libvirt.driver [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:53:11 np0005588920 nova_compute[226886]: 2026-01-20 14:53:11.849 226890 DEBUG nova.compute.manager [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:53:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:12.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:12 np0005588920 nova_compute[226886]: 2026-01-20 14:53:12.142 226890 DEBUG nova.compute.manager [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:53:12 np0005588920 nova_compute[226886]: 2026-01-20 14:53:12.144 226890 DEBUG nova.virt.libvirt.driver [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:53:12 np0005588920 nova_compute[226886]: 2026-01-20 14:53:12.144 226890 INFO nova.virt.libvirt.driver [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Creating image(s)#033[00m
Jan 20 09:53:12 np0005588920 nova_compute[226886]: 2026-01-20 14:53:12.167 226890 DEBUG nova.storage.rbd_utils [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] rbd image 49919d3f-fab0-404f-a0a0-82610973a254_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:53:12 np0005588920 nova_compute[226886]: 2026-01-20 14:53:12.195 226890 DEBUG nova.storage.rbd_utils [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] rbd image 49919d3f-fab0-404f-a0a0-82610973a254_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:53:12 np0005588920 nova_compute[226886]: 2026-01-20 14:53:12.221 226890 DEBUG nova.storage.rbd_utils [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] rbd image 49919d3f-fab0-404f-a0a0-82610973a254_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:53:12 np0005588920 nova_compute[226886]: 2026-01-20 14:53:12.225 226890 DEBUG oslo_concurrency.processutils [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:53:12 np0005588920 nova_compute[226886]: 2026-01-20 14:53:12.296 226890 DEBUG oslo_concurrency.processutils [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:53:12 np0005588920 nova_compute[226886]: 2026-01-20 14:53:12.299 226890 DEBUG oslo_concurrency.lockutils [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:53:12 np0005588920 nova_compute[226886]: 2026-01-20 14:53:12.300 226890 DEBUG oslo_concurrency.lockutils [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:53:12 np0005588920 nova_compute[226886]: 2026-01-20 14:53:12.300 226890 DEBUG oslo_concurrency.lockutils [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:53:12 np0005588920 nova_compute[226886]: 2026-01-20 14:53:12.329 226890 DEBUG nova.storage.rbd_utils [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] rbd image 49919d3f-fab0-404f-a0a0-82610973a254_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:53:12 np0005588920 nova_compute[226886]: 2026-01-20 14:53:12.333 226890 DEBUG oslo_concurrency.processutils [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 49919d3f-fab0-404f-a0a0-82610973a254_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:53:12 np0005588920 nova_compute[226886]: 2026-01-20 14:53:12.384 226890 DEBUG nova.policy [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '34eb73f628994c11801d447148d5f142', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b1e83af992c94112a965575784639d77', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:53:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:53:12 np0005588920 nova_compute[226886]: 2026-01-20 14:53:12.626 226890 DEBUG oslo_concurrency.processutils [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 49919d3f-fab0-404f-a0a0-82610973a254_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.293s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:53:12 np0005588920 nova_compute[226886]: 2026-01-20 14:53:12.699 226890 DEBUG nova.storage.rbd_utils [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] resizing rbd image 49919d3f-fab0-404f-a0a0-82610973a254_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:53:12 np0005588920 nova_compute[226886]: 2026-01-20 14:53:12.800 226890 DEBUG nova.objects.instance [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lazy-loading 'migration_context' on Instance uuid 49919d3f-fab0-404f-a0a0-82610973a254 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:53:12 np0005588920 nova_compute[226886]: 2026-01-20 14:53:12.847 226890 DEBUG nova.virt.libvirt.driver [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:53:12 np0005588920 nova_compute[226886]: 2026-01-20 14:53:12.848 226890 DEBUG nova.virt.libvirt.driver [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Ensure instance console log exists: /var/lib/nova/instances/49919d3f-fab0-404f-a0a0-82610973a254/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:53:12 np0005588920 nova_compute[226886]: 2026-01-20 14:53:12.848 226890 DEBUG oslo_concurrency.lockutils [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:53:12 np0005588920 nova_compute[226886]: 2026-01-20 14:53:12.848 226890 DEBUG oslo_concurrency.lockutils [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:53:12 np0005588920 nova_compute[226886]: 2026-01-20 14:53:12.849 226890 DEBUG oslo_concurrency.lockutils [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:53:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:12.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e280 e280: 3 total, 3 up, 3 in
Jan 20 09:53:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:13.727 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=38, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=37) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:53:13 np0005588920 nova_compute[226886]: 2026-01-20 14:53:13.728 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:13.729 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:53:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:14.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:14 np0005588920 nova_compute[226886]: 2026-01-20 14:53:14.184 226890 DEBUG nova.network.neutron [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Successfully created port: 54d190c9-33af-46a2-a141-ff83769d93c6 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:53:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e281 e281: 3 total, 3 up, 3 in
Jan 20 09:53:14 np0005588920 nova_compute[226886]: 2026-01-20 14:53:14.675 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:14 np0005588920 NetworkManager[49076]: <info>  [1768920794.6765] manager: (patch-br-int-to-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/278)
Jan 20 09:53:14 np0005588920 NetworkManager[49076]: <info>  [1768920794.6776] manager: (patch-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/279)
Jan 20 09:53:14 np0005588920 nova_compute[226886]: 2026-01-20 14:53:14.854 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:14 np0005588920 ovn_controller[133971]: 2026-01-20T14:53:14Z|00561|binding|INFO|Releasing lport b033e9e6-9781-4424-a20f-7b48a14e2c80 from this chassis (sb_readonly=0)
Jan 20 09:53:14 np0005588920 nova_compute[226886]: 2026-01-20 14:53:14.875 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:53:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:14.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:53:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:15.730 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '38'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:53:15 np0005588920 nova_compute[226886]: 2026-01-20 14:53:15.823 226890 DEBUG nova.network.neutron [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Successfully updated port: 54d190c9-33af-46a2-a141-ff83769d93c6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:53:15 np0005588920 nova_compute[226886]: 2026-01-20 14:53:15.849 226890 DEBUG oslo_concurrency.lockutils [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Acquiring lock "refresh_cache-49919d3f-fab0-404f-a0a0-82610973a254" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:53:15 np0005588920 nova_compute[226886]: 2026-01-20 14:53:15.849 226890 DEBUG oslo_concurrency.lockutils [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Acquired lock "refresh_cache-49919d3f-fab0-404f-a0a0-82610973a254" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:53:15 np0005588920 nova_compute[226886]: 2026-01-20 14:53:15.849 226890 DEBUG nova.network.neutron [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:53:15 np0005588920 nova_compute[226886]: 2026-01-20 14:53:15.978 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:16.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:16 np0005588920 nova_compute[226886]: 2026-01-20 14:53:16.075 226890 DEBUG nova.network.neutron [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:53:16 np0005588920 nova_compute[226886]: 2026-01-20 14:53:16.184 226890 DEBUG nova.compute.manager [req-0027fdfe-4d78-4817-a65c-1d7937403715 req-35c0e047-c81d-40fe-a8e7-e5d55e0928eb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Received event network-changed-54d190c9-33af-46a2-a141-ff83769d93c6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:53:16 np0005588920 nova_compute[226886]: 2026-01-20 14:53:16.185 226890 DEBUG nova.compute.manager [req-0027fdfe-4d78-4817-a65c-1d7937403715 req-35c0e047-c81d-40fe-a8e7-e5d55e0928eb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Refreshing instance network info cache due to event network-changed-54d190c9-33af-46a2-a141-ff83769d93c6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:53:16 np0005588920 nova_compute[226886]: 2026-01-20 14:53:16.185 226890 DEBUG oslo_concurrency.lockutils [req-0027fdfe-4d78-4817-a65c-1d7937403715 req-35c0e047-c81d-40fe-a8e7-e5d55e0928eb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-49919d3f-fab0-404f-a0a0-82610973a254" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:53:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:16.455 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:53:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:16.455 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:53:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:16.456 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:53:16 np0005588920 nova_compute[226886]: 2026-01-20 14:53:16.696 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:16.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:53:17 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/720863058' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:53:17 np0005588920 nova_compute[226886]: 2026-01-20 14:53:17.396 226890 DEBUG nova.network.neutron [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Updating instance_info_cache with network_info: [{"id": "54d190c9-33af-46a2-a141-ff83769d93c6", "address": "fa:16:3e:1f:7c:1d", "network": {"id": "e9589011-b728-4b79-9945-aa6c52dd0fc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1143668360-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1e83af992c94112a965575784639d77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54d190c9-33", "ovs_interfaceid": "54d190c9-33af-46a2-a141-ff83769d93c6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:53:17 np0005588920 nova_compute[226886]: 2026-01-20 14:53:17.426 226890 DEBUG oslo_concurrency.lockutils [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Releasing lock "refresh_cache-49919d3f-fab0-404f-a0a0-82610973a254" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:53:17 np0005588920 nova_compute[226886]: 2026-01-20 14:53:17.426 226890 DEBUG nova.compute.manager [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Instance network_info: |[{"id": "54d190c9-33af-46a2-a141-ff83769d93c6", "address": "fa:16:3e:1f:7c:1d", "network": {"id": "e9589011-b728-4b79-9945-aa6c52dd0fc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1143668360-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1e83af992c94112a965575784639d77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54d190c9-33", "ovs_interfaceid": "54d190c9-33af-46a2-a141-ff83769d93c6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:53:17 np0005588920 nova_compute[226886]: 2026-01-20 14:53:17.427 226890 DEBUG oslo_concurrency.lockutils [req-0027fdfe-4d78-4817-a65c-1d7937403715 req-35c0e047-c81d-40fe-a8e7-e5d55e0928eb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-49919d3f-fab0-404f-a0a0-82610973a254" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:53:17 np0005588920 nova_compute[226886]: 2026-01-20 14:53:17.427 226890 DEBUG nova.network.neutron [req-0027fdfe-4d78-4817-a65c-1d7937403715 req-35c0e047-c81d-40fe-a8e7-e5d55e0928eb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Refreshing network info cache for port 54d190c9-33af-46a2-a141-ff83769d93c6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:53:17 np0005588920 nova_compute[226886]: 2026-01-20 14:53:17.429 226890 DEBUG nova.virt.libvirt.driver [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Start _get_guest_xml network_info=[{"id": "54d190c9-33af-46a2-a141-ff83769d93c6", "address": "fa:16:3e:1f:7c:1d", "network": {"id": "e9589011-b728-4b79-9945-aa6c52dd0fc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1143668360-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1e83af992c94112a965575784639d77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54d190c9-33", "ovs_interfaceid": "54d190c9-33af-46a2-a141-ff83769d93c6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:53:17 np0005588920 nova_compute[226886]: 2026-01-20 14:53:17.437 226890 WARNING nova.virt.libvirt.driver [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:53:17 np0005588920 nova_compute[226886]: 2026-01-20 14:53:17.442 226890 DEBUG nova.virt.libvirt.host [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:53:17 np0005588920 nova_compute[226886]: 2026-01-20 14:53:17.443 226890 DEBUG nova.virt.libvirt.host [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:53:17 np0005588920 nova_compute[226886]: 2026-01-20 14:53:17.452 226890 DEBUG nova.virt.libvirt.host [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:53:17 np0005588920 nova_compute[226886]: 2026-01-20 14:53:17.454 226890 DEBUG nova.virt.libvirt.host [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:53:17 np0005588920 nova_compute[226886]: 2026-01-20 14:53:17.456 226890 DEBUG nova.virt.libvirt.driver [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:53:17 np0005588920 nova_compute[226886]: 2026-01-20 14:53:17.457 226890 DEBUG nova.virt.hardware [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:53:17 np0005588920 nova_compute[226886]: 2026-01-20 14:53:17.458 226890 DEBUG nova.virt.hardware [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:53:17 np0005588920 nova_compute[226886]: 2026-01-20 14:53:17.458 226890 DEBUG nova.virt.hardware [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:53:17 np0005588920 nova_compute[226886]: 2026-01-20 14:53:17.459 226890 DEBUG nova.virt.hardware [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:53:17 np0005588920 nova_compute[226886]: 2026-01-20 14:53:17.459 226890 DEBUG nova.virt.hardware [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:53:17 np0005588920 nova_compute[226886]: 2026-01-20 14:53:17.460 226890 DEBUG nova.virt.hardware [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:53:17 np0005588920 nova_compute[226886]: 2026-01-20 14:53:17.460 226890 DEBUG nova.virt.hardware [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:53:17 np0005588920 nova_compute[226886]: 2026-01-20 14:53:17.461 226890 DEBUG nova.virt.hardware [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:53:17 np0005588920 nova_compute[226886]: 2026-01-20 14:53:17.462 226890 DEBUG nova.virt.hardware [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:53:17 np0005588920 nova_compute[226886]: 2026-01-20 14:53:17.462 226890 DEBUG nova.virt.hardware [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:53:17 np0005588920 nova_compute[226886]: 2026-01-20 14:53:17.463 226890 DEBUG nova.virt.hardware [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:53:17 np0005588920 nova_compute[226886]: 2026-01-20 14:53:17.468 226890 DEBUG oslo_concurrency.processutils [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:53:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:53:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:53:17 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/894774041' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:53:17 np0005588920 nova_compute[226886]: 2026-01-20 14:53:17.898 226890 DEBUG oslo_concurrency.processutils [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.430s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:53:17 np0005588920 nova_compute[226886]: 2026-01-20 14:53:17.931 226890 DEBUG nova.storage.rbd_utils [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] rbd image 49919d3f-fab0-404f-a0a0-82610973a254_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:53:17 np0005588920 nova_compute[226886]: 2026-01-20 14:53:17.934 226890 DEBUG oslo_concurrency.processutils [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:53:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:18.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:18 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:53:18 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1212389849' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:53:18 np0005588920 nova_compute[226886]: 2026-01-20 14:53:18.351 226890 DEBUG oslo_concurrency.processutils [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.417s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:53:18 np0005588920 nova_compute[226886]: 2026-01-20 14:53:18.353 226890 DEBUG nova.virt.libvirt.vif [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:53:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-1885581249',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-1885581249',id=120,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGm3YXyDls3m8lgQsMp7i5z2Ji2kt+QoAKyNgN4cUeQAncl8sITzJAcvU8MP7QXpcIT5PJILYBp9zVzJhusCSqycT+8/Be6bl9GRyoq123x5/AtCBhaSdlyObjct+Gfsjw==',key_name='tempest-keypair-99379181',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b1e83af992c94112a965575784639d77',ramdisk_id='',reservation_id='r-8azvvs4u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-896995479',owner_user_name='tempest-AttachVolumeShelveTestJSON-896995479-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:53:11Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='34eb73f628994c11801d447148d5f142',uuid=49919d3f-fab0-404f-a0a0-82610973a254,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "54d190c9-33af-46a2-a141-ff83769d93c6", "address": "fa:16:3e:1f:7c:1d", "network": {"id": "e9589011-b728-4b79-9945-aa6c52dd0fc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1143668360-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1e83af992c94112a965575784639d77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54d190c9-33", "ovs_interfaceid": "54d190c9-33af-46a2-a141-ff83769d93c6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:53:18 np0005588920 nova_compute[226886]: 2026-01-20 14:53:18.354 226890 DEBUG nova.network.os_vif_util [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Converting VIF {"id": "54d190c9-33af-46a2-a141-ff83769d93c6", "address": "fa:16:3e:1f:7c:1d", "network": {"id": "e9589011-b728-4b79-9945-aa6c52dd0fc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1143668360-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1e83af992c94112a965575784639d77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54d190c9-33", "ovs_interfaceid": "54d190c9-33af-46a2-a141-ff83769d93c6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:53:18 np0005588920 nova_compute[226886]: 2026-01-20 14:53:18.355 226890 DEBUG nova.network.os_vif_util [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1f:7c:1d,bridge_name='br-int',has_traffic_filtering=True,id=54d190c9-33af-46a2-a141-ff83769d93c6,network=Network(e9589011-b728-4b79-9945-aa6c52dd0fc2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap54d190c9-33') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:53:18 np0005588920 nova_compute[226886]: 2026-01-20 14:53:18.356 226890 DEBUG nova.objects.instance [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lazy-loading 'pci_devices' on Instance uuid 49919d3f-fab0-404f-a0a0-82610973a254 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:53:18 np0005588920 nova_compute[226886]: 2026-01-20 14:53:18.385 226890 DEBUG nova.virt.libvirt.driver [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:53:18 np0005588920 nova_compute[226886]:  <uuid>49919d3f-fab0-404f-a0a0-82610973a254</uuid>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:  <name>instance-00000078</name>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:53:18 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:      <nova:name>tempest-AttachVolumeShelveTestJSON-server-1885581249</nova:name>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:53:17</nova:creationTime>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:53:18 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:        <nova:user uuid="34eb73f628994c11801d447148d5f142">tempest-AttachVolumeShelveTestJSON-896995479-project-member</nova:user>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:        <nova:project uuid="b1e83af992c94112a965575784639d77">tempest-AttachVolumeShelveTestJSON-896995479</nova:project>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:        <nova:port uuid="54d190c9-33af-46a2-a141-ff83769d93c6">
Jan 20 09:53:18 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:      <entry name="serial">49919d3f-fab0-404f-a0a0-82610973a254</entry>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:      <entry name="uuid">49919d3f-fab0-404f-a0a0-82610973a254</entry>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:53:18 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/49919d3f-fab0-404f-a0a0-82610973a254_disk">
Jan 20 09:53:18 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:53:18 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:53:18 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/49919d3f-fab0-404f-a0a0-82610973a254_disk.config">
Jan 20 09:53:18 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:53:18 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:53:18 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:1f:7c:1d"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:      <target dev="tap54d190c9-33"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:53:18 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/49919d3f-fab0-404f-a0a0-82610973a254/console.log" append="off"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:53:18 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:53:18 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:53:18 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:53:18 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:53:18 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:53:18 np0005588920 nova_compute[226886]: 2026-01-20 14:53:18.386 226890 DEBUG nova.compute.manager [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Preparing to wait for external event network-vif-plugged-54d190c9-33af-46a2-a141-ff83769d93c6 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:53:18 np0005588920 nova_compute[226886]: 2026-01-20 14:53:18.387 226890 DEBUG oslo_concurrency.lockutils [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Acquiring lock "49919d3f-fab0-404f-a0a0-82610973a254-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:53:18 np0005588920 nova_compute[226886]: 2026-01-20 14:53:18.387 226890 DEBUG oslo_concurrency.lockutils [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "49919d3f-fab0-404f-a0a0-82610973a254-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:53:18 np0005588920 nova_compute[226886]: 2026-01-20 14:53:18.387 226890 DEBUG oslo_concurrency.lockutils [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "49919d3f-fab0-404f-a0a0-82610973a254-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:53:18 np0005588920 nova_compute[226886]: 2026-01-20 14:53:18.388 226890 DEBUG nova.virt.libvirt.vif [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:53:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-1885581249',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-1885581249',id=120,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGm3YXyDls3m8lgQsMp7i5z2Ji2kt+QoAKyNgN4cUeQAncl8sITzJAcvU8MP7QXpcIT5PJILYBp9zVzJhusCSqycT+8/Be6bl9GRyoq123x5/AtCBhaSdlyObjct+Gfsjw==',key_name='tempest-keypair-99379181',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b1e83af992c94112a965575784639d77',ramdisk_id='',reservation_id='r-8azvvs4u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-896995479',owner_user_name='tempest-AttachVolumeShelveTestJSON-896995479-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:53:11Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='34eb73f628994c11801d447148d5f142',uuid=49919d3f-fab0-404f-a0a0-82610973a254,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "54d190c9-33af-46a2-a141-ff83769d93c6", "address": "fa:16:3e:1f:7c:1d", "network": {"id": "e9589011-b728-4b79-9945-aa6c52dd0fc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1143668360-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1e83af992c94112a965575784639d77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54d190c9-33", "ovs_interfaceid": "54d190c9-33af-46a2-a141-ff83769d93c6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:53:18 np0005588920 nova_compute[226886]: 2026-01-20 14:53:18.388 226890 DEBUG nova.network.os_vif_util [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Converting VIF {"id": "54d190c9-33af-46a2-a141-ff83769d93c6", "address": "fa:16:3e:1f:7c:1d", "network": {"id": "e9589011-b728-4b79-9945-aa6c52dd0fc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1143668360-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1e83af992c94112a965575784639d77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54d190c9-33", "ovs_interfaceid": "54d190c9-33af-46a2-a141-ff83769d93c6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:53:18 np0005588920 nova_compute[226886]: 2026-01-20 14:53:18.389 226890 DEBUG nova.network.os_vif_util [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1f:7c:1d,bridge_name='br-int',has_traffic_filtering=True,id=54d190c9-33af-46a2-a141-ff83769d93c6,network=Network(e9589011-b728-4b79-9945-aa6c52dd0fc2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap54d190c9-33') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:53:18 np0005588920 nova_compute[226886]: 2026-01-20 14:53:18.389 226890 DEBUG os_vif [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1f:7c:1d,bridge_name='br-int',has_traffic_filtering=True,id=54d190c9-33af-46a2-a141-ff83769d93c6,network=Network(e9589011-b728-4b79-9945-aa6c52dd0fc2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap54d190c9-33') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:53:18 np0005588920 nova_compute[226886]: 2026-01-20 14:53:18.390 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:18 np0005588920 nova_compute[226886]: 2026-01-20 14:53:18.390 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:53:18 np0005588920 nova_compute[226886]: 2026-01-20 14:53:18.391 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:53:18 np0005588920 nova_compute[226886]: 2026-01-20 14:53:18.396 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:18 np0005588920 nova_compute[226886]: 2026-01-20 14:53:18.396 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap54d190c9-33, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:53:18 np0005588920 nova_compute[226886]: 2026-01-20 14:53:18.397 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap54d190c9-33, col_values=(('external_ids', {'iface-id': '54d190c9-33af-46a2-a141-ff83769d93c6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1f:7c:1d', 'vm-uuid': '49919d3f-fab0-404f-a0a0-82610973a254'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:53:18 np0005588920 NetworkManager[49076]: <info>  [1768920798.3992] manager: (tap54d190c9-33): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/280)
Jan 20 09:53:18 np0005588920 nova_compute[226886]: 2026-01-20 14:53:18.400 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:53:18 np0005588920 nova_compute[226886]: 2026-01-20 14:53:18.404 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:18 np0005588920 nova_compute[226886]: 2026-01-20 14:53:18.405 226890 INFO os_vif [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1f:7c:1d,bridge_name='br-int',has_traffic_filtering=True,id=54d190c9-33af-46a2-a141-ff83769d93c6,network=Network(e9589011-b728-4b79-9945-aa6c52dd0fc2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap54d190c9-33')#033[00m
Jan 20 09:53:18 np0005588920 nova_compute[226886]: 2026-01-20 14:53:18.449 226890 DEBUG nova.virt.libvirt.driver [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:53:18 np0005588920 nova_compute[226886]: 2026-01-20 14:53:18.450 226890 DEBUG nova.virt.libvirt.driver [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:53:18 np0005588920 nova_compute[226886]: 2026-01-20 14:53:18.450 226890 DEBUG nova.virt.libvirt.driver [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] No VIF found with MAC fa:16:3e:1f:7c:1d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:53:18 np0005588920 nova_compute[226886]: 2026-01-20 14:53:18.450 226890 INFO nova.virt.libvirt.driver [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Using config drive#033[00m
Jan 20 09:53:18 np0005588920 nova_compute[226886]: 2026-01-20 14:53:18.474 226890 DEBUG nova.storage.rbd_utils [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] rbd image 49919d3f-fab0-404f-a0a0-82610973a254_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:53:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:18.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:18 np0005588920 nova_compute[226886]: 2026-01-20 14:53:18.950 226890 INFO nova.virt.libvirt.driver [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Creating config drive at /var/lib/nova/instances/49919d3f-fab0-404f-a0a0-82610973a254/disk.config#033[00m
Jan 20 09:53:18 np0005588920 nova_compute[226886]: 2026-01-20 14:53:18.955 226890 DEBUG oslo_concurrency.processutils [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/49919d3f-fab0-404f-a0a0-82610973a254/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqcqiu9qp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:53:19 np0005588920 podman[269644]: 2026-01-20 14:53:19.039789997 +0000 UTC m=+0.120257489 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_controller, 
managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 20 09:53:19 np0005588920 nova_compute[226886]: 2026-01-20 14:53:19.086 226890 DEBUG oslo_concurrency.processutils [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/49919d3f-fab0-404f-a0a0-82610973a254/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqcqiu9qp" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:53:19 np0005588920 nova_compute[226886]: 2026-01-20 14:53:19.118 226890 DEBUG nova.storage.rbd_utils [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] rbd image 49919d3f-fab0-404f-a0a0-82610973a254_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:53:19 np0005588920 nova_compute[226886]: 2026-01-20 14:53:19.123 226890 DEBUG oslo_concurrency.processutils [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/49919d3f-fab0-404f-a0a0-82610973a254/disk.config 49919d3f-fab0-404f-a0a0-82610973a254_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:53:19 np0005588920 nova_compute[226886]: 2026-01-20 14:53:19.290 226890 DEBUG oslo_concurrency.processutils [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/49919d3f-fab0-404f-a0a0-82610973a254/disk.config 49919d3f-fab0-404f-a0a0-82610973a254_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.167s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:53:19 np0005588920 nova_compute[226886]: 2026-01-20 14:53:19.291 226890 INFO nova.virt.libvirt.driver [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Deleting local config drive /var/lib/nova/instances/49919d3f-fab0-404f-a0a0-82610973a254/disk.config because it was imported into RBD.#033[00m
Jan 20 09:53:19 np0005588920 kernel: tap54d190c9-33: entered promiscuous mode
Jan 20 09:53:19 np0005588920 NetworkManager[49076]: <info>  [1768920799.3384] manager: (tap54d190c9-33): new Tun device (/org/freedesktop/NetworkManager/Devices/281)
Jan 20 09:53:19 np0005588920 nova_compute[226886]: 2026-01-20 14:53:19.338 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:19 np0005588920 ovn_controller[133971]: 2026-01-20T14:53:19Z|00562|binding|INFO|Claiming lport 54d190c9-33af-46a2-a141-ff83769d93c6 for this chassis.
Jan 20 09:53:19 np0005588920 ovn_controller[133971]: 2026-01-20T14:53:19Z|00563|binding|INFO|54d190c9-33af-46a2-a141-ff83769d93c6: Claiming fa:16:3e:1f:7c:1d 10.100.0.8
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:19.347 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1f:7c:1d 10.100.0.8'], port_security=['fa:16:3e:1f:7c:1d 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '49919d3f-fab0-404f-a0a0-82610973a254', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e9589011-b728-4b79-9945-aa6c52dd0fc2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b1e83af992c94112a965575784639d77', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'da0e73ee-8414-4d81-a0bf-09363bb8a1b8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1dc6df7d-3e57-4779-8232-af1ccf413403, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=54d190c9-33af-46a2-a141-ff83769d93c6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:19.348 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 54d190c9-33af-46a2-a141-ff83769d93c6 in datapath e9589011-b728-4b79-9945-aa6c52dd0fc2 bound to our chassis#033[00m
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:19.349 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e9589011-b728-4b79-9945-aa6c52dd0fc2#033[00m
Jan 20 09:53:19 np0005588920 ovn_controller[133971]: 2026-01-20T14:53:19Z|00564|binding|INFO|Setting lport 54d190c9-33af-46a2-a141-ff83769d93c6 ovn-installed in OVS
Jan 20 09:53:19 np0005588920 ovn_controller[133971]: 2026-01-20T14:53:19Z|00565|binding|INFO|Setting lport 54d190c9-33af-46a2-a141-ff83769d93c6 up in Southbound
Jan 20 09:53:19 np0005588920 nova_compute[226886]: 2026-01-20 14:53:19.358 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:19.360 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c523dddc-61fd-4d25-8807-2928ebc42c32]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:19 np0005588920 nova_compute[226886]: 2026-01-20 14:53:19.361 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:19.362 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape9589011-b1 in ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:19.364 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape9589011-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:19.365 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[46aa9f31-a283-4bc4-8a02-93c9a3fd8894]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:19.366 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[cf652414-d61b-4bbc-8fa4-a1b0bf5d5177]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:19 np0005588920 systemd-machined[196121]: New machine qemu-55-instance-00000078.
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:19.377 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[91de3fee-91ee-414a-8270-c8ed5dfffc61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:19 np0005588920 systemd[1]: Started Virtual Machine qemu-55-instance-00000078.
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:19.393 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0fb0beb9-fa37-4854-9a1f-82cc2b0fc136]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:19 np0005588920 systemd-udevd[269725]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:53:19 np0005588920 NetworkManager[49076]: <info>  [1768920799.4120] device (tap54d190c9-33): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:53:19 np0005588920 NetworkManager[49076]: <info>  [1768920799.4127] device (tap54d190c9-33): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:19.426 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[bcb17303-c435-4982-975a-07b965de1b0e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:19.434 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[28e2272e-07af-4993-9f74-7f387db4a80e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:19 np0005588920 NetworkManager[49076]: <info>  [1768920799.4359] manager: (tape9589011-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/282)
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:19.470 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[2ce830e1-13d8-47a3-95aa-4c7ac45a47b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:19.473 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[29e700e8-4fcc-44ab-b12b-38415fea4a4c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:19 np0005588920 NetworkManager[49076]: <info>  [1768920799.4988] device (tape9589011-b0): carrier: link connected
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:19.505 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[91a0762f-4174-497a-b263-f0d2759a26da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:19.521 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f4c48858-1861-4d0b-9533-ad00867235b9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape9589011-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a0:5a:14'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 181], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 584118, 'reachable_time': 26397, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 269755, 'error': None, 'target': 'ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:19.534 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c02a02e1-f64c-41cb-a279-947491fa9608]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea0:5a14'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 584118, 'tstamp': 584118}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 269756, 'error': None, 'target': 'ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:19.552 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[6b7212e2-381c-4373-b981-18f05e737a1b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape9589011-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a0:5a:14'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 181], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 584118, 'reachable_time': 26397, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 269757, 'error': None, 'target': 'ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:19.585 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[55eef644-20cb-4b64-8f90-e2cb6c97db18]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:19.644 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2bc63327-041b-4686-bcb8-9bec4d660419]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:19.645 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape9589011-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:19.645 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:19.646 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape9589011-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:53:19 np0005588920 NetworkManager[49076]: <info>  [1768920799.6484] manager: (tape9589011-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/283)
Jan 20 09:53:19 np0005588920 nova_compute[226886]: 2026-01-20 14:53:19.647 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:19 np0005588920 kernel: tape9589011-b0: entered promiscuous mode
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:19.650 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape9589011-b0, col_values=(('external_ids', {'iface-id': '9ca9d06a-9365-4769-a2c4-7322625683ac'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:53:19 np0005588920 ovn_controller[133971]: 2026-01-20T14:53:19Z|00566|binding|INFO|Releasing lport 9ca9d06a-9365-4769-a2c4-7322625683ac from this chassis (sb_readonly=0)
Jan 20 09:53:19 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e282 e282: 3 total, 3 up, 3 in
Jan 20 09:53:19 np0005588920 nova_compute[226886]: 2026-01-20 14:53:19.668 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:19.669 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e9589011-b728-4b79-9945-aa6c52dd0fc2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e9589011-b728-4b79-9945-aa6c52dd0fc2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:19.670 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[052e406e-44e3-43b8-8207-8bb2fb71d243]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:19.671 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-e9589011-b728-4b79-9945-aa6c52dd0fc2
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/e9589011-b728-4b79-9945-aa6c52dd0fc2.pid.haproxy
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID e9589011-b728-4b79-9945-aa6c52dd0fc2
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:53:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:19.671 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2', 'env', 'PROCESS_TAG=haproxy-e9589011-b728-4b79-9945-aa6c52dd0fc2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e9589011-b728-4b79-9945-aa6c52dd0fc2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:53:20 np0005588920 podman[269829]: 2026-01-20 14:53:20.051091152 +0000 UTC m=+0.053938420 container create c339d9a29b39cecda57dd59a5681d306e7c63d519488225a9b03b77f32b8c809 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_managed=true)
Jan 20 09:53:20 np0005588920 nova_compute[226886]: 2026-01-20 14:53:20.063 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920800.062766, 49919d3f-fab0-404f-a0a0-82610973a254 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:53:20 np0005588920 nova_compute[226886]: 2026-01-20 14:53:20.063 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] VM Started (Lifecycle Event)#033[00m
Jan 20 09:53:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:20.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:20 np0005588920 systemd[1]: Started libpod-conmon-c339d9a29b39cecda57dd59a5681d306e7c63d519488225a9b03b77f32b8c809.scope.
Jan 20 09:53:20 np0005588920 nova_compute[226886]: 2026-01-20 14:53:20.091 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:53:20 np0005588920 nova_compute[226886]: 2026-01-20 14:53:20.098 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920800.0655758, 49919d3f-fab0-404f-a0a0-82610973a254 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:53:20 np0005588920 nova_compute[226886]: 2026-01-20 14:53:20.098 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:53:20 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:53:20 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c715ea318109a6ff06825ed6abfff1b2d73cd44b23f7f2047d0a53d5ddf310ca/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:53:20 np0005588920 podman[269829]: 2026-01-20 14:53:20.02508918 +0000 UTC m=+0.027936468 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:53:20 np0005588920 nova_compute[226886]: 2026-01-20 14:53:20.121 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:53:20 np0005588920 nova_compute[226886]: 2026-01-20 14:53:20.126 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:53:20 np0005588920 podman[269829]: 2026-01-20 14:53:20.128719399 +0000 UTC m=+0.131566687 container init c339d9a29b39cecda57dd59a5681d306e7c63d519488225a9b03b77f32b8c809 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 20 09:53:20 np0005588920 podman[269829]: 2026-01-20 14:53:20.135393987 +0000 UTC m=+0.138241255 container start c339d9a29b39cecda57dd59a5681d306e7c63d519488225a9b03b77f32b8c809 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 20 09:53:20 np0005588920 nova_compute[226886]: 2026-01-20 14:53:20.153 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:53:20 np0005588920 neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2[269846]: [NOTICE]   (269850) : New worker (269852) forked
Jan 20 09:53:20 np0005588920 neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2[269846]: [NOTICE]   (269850) : Loading success.
Jan 20 09:53:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:20.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:20 np0005588920 nova_compute[226886]: 2026-01-20 14:53:20.981 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:22.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:22 np0005588920 nova_compute[226886]: 2026-01-20 14:53:22.089 226890 DEBUG nova.network.neutron [req-0027fdfe-4d78-4817-a65c-1d7937403715 req-35c0e047-c81d-40fe-a8e7-e5d55e0928eb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Updated VIF entry in instance network info cache for port 54d190c9-33af-46a2-a141-ff83769d93c6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:53:22 np0005588920 nova_compute[226886]: 2026-01-20 14:53:22.090 226890 DEBUG nova.network.neutron [req-0027fdfe-4d78-4817-a65c-1d7937403715 req-35c0e047-c81d-40fe-a8e7-e5d55e0928eb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Updating instance_info_cache with network_info: [{"id": "54d190c9-33af-46a2-a141-ff83769d93c6", "address": "fa:16:3e:1f:7c:1d", "network": {"id": "e9589011-b728-4b79-9945-aa6c52dd0fc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1143668360-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1e83af992c94112a965575784639d77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54d190c9-33", "ovs_interfaceid": "54d190c9-33af-46a2-a141-ff83769d93c6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:53:22 np0005588920 nova_compute[226886]: 2026-01-20 14:53:22.154 226890 DEBUG oslo_concurrency.lockutils [req-0027fdfe-4d78-4817-a65c-1d7937403715 req-35c0e047-c81d-40fe-a8e7-e5d55e0928eb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-49919d3f-fab0-404f-a0a0-82610973a254" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:53:22 np0005588920 nova_compute[226886]: 2026-01-20 14:53:22.554 226890 DEBUG nova.compute.manager [req-5a89d480-49ec-412c-8a78-965b793e4cd6 req-78fc98ae-4d92-40d5-88b2-0be8ff697c3e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Received event network-vif-plugged-54d190c9-33af-46a2-a141-ff83769d93c6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:53:22 np0005588920 nova_compute[226886]: 2026-01-20 14:53:22.554 226890 DEBUG oslo_concurrency.lockutils [req-5a89d480-49ec-412c-8a78-965b793e4cd6 req-78fc98ae-4d92-40d5-88b2-0be8ff697c3e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "49919d3f-fab0-404f-a0a0-82610973a254-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:53:22 np0005588920 nova_compute[226886]: 2026-01-20 14:53:22.554 226890 DEBUG oslo_concurrency.lockutils [req-5a89d480-49ec-412c-8a78-965b793e4cd6 req-78fc98ae-4d92-40d5-88b2-0be8ff697c3e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "49919d3f-fab0-404f-a0a0-82610973a254-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:53:22 np0005588920 nova_compute[226886]: 2026-01-20 14:53:22.555 226890 DEBUG oslo_concurrency.lockutils [req-5a89d480-49ec-412c-8a78-965b793e4cd6 req-78fc98ae-4d92-40d5-88b2-0be8ff697c3e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "49919d3f-fab0-404f-a0a0-82610973a254-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:53:22 np0005588920 nova_compute[226886]: 2026-01-20 14:53:22.555 226890 DEBUG nova.compute.manager [req-5a89d480-49ec-412c-8a78-965b793e4cd6 req-78fc98ae-4d92-40d5-88b2-0be8ff697c3e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Processing event network-vif-plugged-54d190c9-33af-46a2-a141-ff83769d93c6 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:53:22 np0005588920 nova_compute[226886]: 2026-01-20 14:53:22.555 226890 DEBUG nova.compute.manager [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:53:22 np0005588920 nova_compute[226886]: 2026-01-20 14:53:22.559 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920802.5588498, 49919d3f-fab0-404f-a0a0-82610973a254 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:53:22 np0005588920 nova_compute[226886]: 2026-01-20 14:53:22.559 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:53:22 np0005588920 nova_compute[226886]: 2026-01-20 14:53:22.560 226890 DEBUG nova.virt.libvirt.driver [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:53:22 np0005588920 nova_compute[226886]: 2026-01-20 14:53:22.564 226890 INFO nova.virt.libvirt.driver [-] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Instance spawned successfully.#033[00m
Jan 20 09:53:22 np0005588920 nova_compute[226886]: 2026-01-20 14:53:22.564 226890 DEBUG nova.virt.libvirt.driver [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:53:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e282 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:53:22 np0005588920 nova_compute[226886]: 2026-01-20 14:53:22.601 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:53:22 np0005588920 nova_compute[226886]: 2026-01-20 14:53:22.608 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:53:22 np0005588920 nova_compute[226886]: 2026-01-20 14:53:22.612 226890 DEBUG nova.virt.libvirt.driver [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:53:22 np0005588920 nova_compute[226886]: 2026-01-20 14:53:22.612 226890 DEBUG nova.virt.libvirt.driver [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:53:22 np0005588920 nova_compute[226886]: 2026-01-20 14:53:22.613 226890 DEBUG nova.virt.libvirt.driver [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:53:22 np0005588920 nova_compute[226886]: 2026-01-20 14:53:22.613 226890 DEBUG nova.virt.libvirt.driver [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:53:22 np0005588920 nova_compute[226886]: 2026-01-20 14:53:22.613 226890 DEBUG nova.virt.libvirt.driver [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:53:22 np0005588920 nova_compute[226886]: 2026-01-20 14:53:22.614 226890 DEBUG nova.virt.libvirt.driver [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:53:22 np0005588920 nova_compute[226886]: 2026-01-20 14:53:22.659 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:53:22 np0005588920 nova_compute[226886]: 2026-01-20 14:53:22.708 226890 INFO nova.compute.manager [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Took 10.57 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:53:22 np0005588920 nova_compute[226886]: 2026-01-20 14:53:22.708 226890 DEBUG nova.compute.manager [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:53:22 np0005588920 nova_compute[226886]: 2026-01-20 14:53:22.774 226890 INFO nova.compute.manager [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Took 12.04 seconds to build instance.#033[00m
Jan 20 09:53:22 np0005588920 nova_compute[226886]: 2026-01-20 14:53:22.805 226890 DEBUG oslo_concurrency.lockutils [None req-f9f2eecc-a062-40d5-b6ae-3be27c189955 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "49919d3f-fab0-404f-a0a0-82610973a254" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.161s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:53:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:53:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:22.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:53:23 np0005588920 nova_compute[226886]: 2026-01-20 14:53:23.400 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:24.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:24 np0005588920 nova_compute[226886]: 2026-01-20 14:53:24.648 226890 DEBUG nova.compute.manager [req-8dc6ea3e-d86a-4f79-944e-b586ba2deee5 req-7bdedb62-7641-4a8a-960d-e4aa1d60e881 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Received event network-vif-plugged-54d190c9-33af-46a2-a141-ff83769d93c6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:53:24 np0005588920 nova_compute[226886]: 2026-01-20 14:53:24.649 226890 DEBUG oslo_concurrency.lockutils [req-8dc6ea3e-d86a-4f79-944e-b586ba2deee5 req-7bdedb62-7641-4a8a-960d-e4aa1d60e881 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "49919d3f-fab0-404f-a0a0-82610973a254-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:53:24 np0005588920 nova_compute[226886]: 2026-01-20 14:53:24.649 226890 DEBUG oslo_concurrency.lockutils [req-8dc6ea3e-d86a-4f79-944e-b586ba2deee5 req-7bdedb62-7641-4a8a-960d-e4aa1d60e881 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "49919d3f-fab0-404f-a0a0-82610973a254-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:53:24 np0005588920 nova_compute[226886]: 2026-01-20 14:53:24.649 226890 DEBUG oslo_concurrency.lockutils [req-8dc6ea3e-d86a-4f79-944e-b586ba2deee5 req-7bdedb62-7641-4a8a-960d-e4aa1d60e881 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "49919d3f-fab0-404f-a0a0-82610973a254-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:53:24 np0005588920 nova_compute[226886]: 2026-01-20 14:53:24.649 226890 DEBUG nova.compute.manager [req-8dc6ea3e-d86a-4f79-944e-b586ba2deee5 req-7bdedb62-7641-4a8a-960d-e4aa1d60e881 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] No waiting events found dispatching network-vif-plugged-54d190c9-33af-46a2-a141-ff83769d93c6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:53:24 np0005588920 nova_compute[226886]: 2026-01-20 14:53:24.650 226890 WARNING nova.compute.manager [req-8dc6ea3e-d86a-4f79-944e-b586ba2deee5 req-7bdedb62-7641-4a8a-960d-e4aa1d60e881 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Received unexpected event network-vif-plugged-54d190c9-33af-46a2-a141-ff83769d93c6 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:53:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:53:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:24.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:53:25 np0005588920 nova_compute[226886]: 2026-01-20 14:53:25.583 226890 DEBUG oslo_concurrency.lockutils [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Acquiring lock "ce0152a6-7d4d-4eac-9587-a43ad934d9cc" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:53:25 np0005588920 nova_compute[226886]: 2026-01-20 14:53:25.584 226890 DEBUG oslo_concurrency.lockutils [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "ce0152a6-7d4d-4eac-9587-a43ad934d9cc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:53:25 np0005588920 nova_compute[226886]: 2026-01-20 14:53:25.609 226890 DEBUG nova.compute.manager [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:53:25 np0005588920 nova_compute[226886]: 2026-01-20 14:53:25.685 226890 DEBUG oslo_concurrency.lockutils [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:53:25 np0005588920 nova_compute[226886]: 2026-01-20 14:53:25.686 226890 DEBUG oslo_concurrency.lockutils [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:53:25 np0005588920 nova_compute[226886]: 2026-01-20 14:53:25.694 226890 DEBUG nova.virt.hardware [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:53:25 np0005588920 nova_compute[226886]: 2026-01-20 14:53:25.695 226890 INFO nova.compute.claims [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 09:53:25 np0005588920 nova_compute[226886]: 2026-01-20 14:53:25.864 226890 DEBUG oslo_concurrency.processutils [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:53:25 np0005588920 nova_compute[226886]: 2026-01-20 14:53:25.982 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:26.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:26 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:53:26 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1500097760' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:53:26 np0005588920 nova_compute[226886]: 2026-01-20 14:53:26.291 226890 DEBUG oslo_concurrency.processutils [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:53:26 np0005588920 nova_compute[226886]: 2026-01-20 14:53:26.297 226890 DEBUG nova.compute.provider_tree [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:53:26 np0005588920 nova_compute[226886]: 2026-01-20 14:53:26.321 226890 DEBUG nova.scheduler.client.report [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:53:26 np0005588920 nova_compute[226886]: 2026-01-20 14:53:26.341 226890 DEBUG oslo_concurrency.lockutils [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.655s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:53:26 np0005588920 nova_compute[226886]: 2026-01-20 14:53:26.342 226890 DEBUG nova.compute.manager [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:53:26 np0005588920 nova_compute[226886]: 2026-01-20 14:53:26.416 226890 DEBUG nova.compute.manager [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:53:26 np0005588920 nova_compute[226886]: 2026-01-20 14:53:26.416 226890 DEBUG nova.network.neutron [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:53:26 np0005588920 nova_compute[226886]: 2026-01-20 14:53:26.436 226890 INFO nova.virt.libvirt.driver [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:53:26 np0005588920 nova_compute[226886]: 2026-01-20 14:53:26.454 226890 DEBUG nova.compute.manager [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:53:26 np0005588920 nova_compute[226886]: 2026-01-20 14:53:26.571 226890 DEBUG nova.compute.manager [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:53:26 np0005588920 nova_compute[226886]: 2026-01-20 14:53:26.573 226890 DEBUG nova.virt.libvirt.driver [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:53:26 np0005588920 nova_compute[226886]: 2026-01-20 14:53:26.573 226890 INFO nova.virt.libvirt.driver [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Creating image(s)#033[00m
Jan 20 09:53:26 np0005588920 nova_compute[226886]: 2026-01-20 14:53:26.596 226890 DEBUG nova.storage.rbd_utils [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] rbd image ce0152a6-7d4d-4eac-9587-a43ad934d9cc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:53:26 np0005588920 nova_compute[226886]: 2026-01-20 14:53:26.623 226890 DEBUG nova.storage.rbd_utils [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] rbd image ce0152a6-7d4d-4eac-9587-a43ad934d9cc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:53:26 np0005588920 nova_compute[226886]: 2026-01-20 14:53:26.651 226890 DEBUG nova.storage.rbd_utils [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] rbd image ce0152a6-7d4d-4eac-9587-a43ad934d9cc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:53:26 np0005588920 nova_compute[226886]: 2026-01-20 14:53:26.655 226890 DEBUG oslo_concurrency.processutils [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:53:26 np0005588920 nova_compute[226886]: 2026-01-20 14:53:26.717 226890 DEBUG oslo_concurrency.processutils [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:53:26 np0005588920 nova_compute[226886]: 2026-01-20 14:53:26.719 226890 DEBUG oslo_concurrency.lockutils [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:53:26 np0005588920 nova_compute[226886]: 2026-01-20 14:53:26.720 226890 DEBUG oslo_concurrency.lockutils [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:53:26 np0005588920 nova_compute[226886]: 2026-01-20 14:53:26.720 226890 DEBUG oslo_concurrency.lockutils [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:53:26 np0005588920 nova_compute[226886]: 2026-01-20 14:53:26.751 226890 DEBUG nova.storage.rbd_utils [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] rbd image ce0152a6-7d4d-4eac-9587-a43ad934d9cc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:53:26 np0005588920 nova_compute[226886]: 2026-01-20 14:53:26.756 226890 DEBUG oslo_concurrency.processutils [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 ce0152a6-7d4d-4eac-9587-a43ad934d9cc_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:53:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:26.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:27 np0005588920 nova_compute[226886]: 2026-01-20 14:53:27.055 226890 DEBUG nova.compute.manager [req-9566e6c2-33bb-4fd3-aea9-7a22dbbeb7ec req-b344a46f-8f35-42af-9873-4b73da64d95b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Received event network-changed-54d190c9-33af-46a2-a141-ff83769d93c6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:53:27 np0005588920 nova_compute[226886]: 2026-01-20 14:53:27.055 226890 DEBUG nova.compute.manager [req-9566e6c2-33bb-4fd3-aea9-7a22dbbeb7ec req-b344a46f-8f35-42af-9873-4b73da64d95b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Refreshing instance network info cache due to event network-changed-54d190c9-33af-46a2-a141-ff83769d93c6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:53:27 np0005588920 nova_compute[226886]: 2026-01-20 14:53:27.057 226890 DEBUG oslo_concurrency.lockutils [req-9566e6c2-33bb-4fd3-aea9-7a22dbbeb7ec req-b344a46f-8f35-42af-9873-4b73da64d95b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-49919d3f-fab0-404f-a0a0-82610973a254" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:53:27 np0005588920 nova_compute[226886]: 2026-01-20 14:53:27.057 226890 DEBUG oslo_concurrency.lockutils [req-9566e6c2-33bb-4fd3-aea9-7a22dbbeb7ec req-b344a46f-8f35-42af-9873-4b73da64d95b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-49919d3f-fab0-404f-a0a0-82610973a254" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:53:27 np0005588920 nova_compute[226886]: 2026-01-20 14:53:27.057 226890 DEBUG nova.network.neutron [req-9566e6c2-33bb-4fd3-aea9-7a22dbbeb7ec req-b344a46f-8f35-42af-9873-4b73da64d95b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Refreshing network info cache for port 54d190c9-33af-46a2-a141-ff83769d93c6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:53:27 np0005588920 nova_compute[226886]: 2026-01-20 14:53:27.167 226890 DEBUG nova.policy [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '215db37373dc4ae5a75cbd6866f471da', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b3b1b7f5b4f84b5abbc401eb577c85c0', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:53:27 np0005588920 nova_compute[226886]: 2026-01-20 14:53:27.212 226890 DEBUG oslo_concurrency.processutils [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 ce0152a6-7d4d-4eac-9587-a43ad934d9cc_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:53:27 np0005588920 nova_compute[226886]: 2026-01-20 14:53:27.293 226890 DEBUG nova.storage.rbd_utils [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] resizing rbd image ce0152a6-7d4d-4eac-9587-a43ad934d9cc_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:53:27 np0005588920 nova_compute[226886]: 2026-01-20 14:53:27.392 226890 DEBUG nova.objects.instance [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lazy-loading 'migration_context' on Instance uuid ce0152a6-7d4d-4eac-9587-a43ad934d9cc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:53:27 np0005588920 nova_compute[226886]: 2026-01-20 14:53:27.404 226890 DEBUG nova.virt.libvirt.driver [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:53:27 np0005588920 nova_compute[226886]: 2026-01-20 14:53:27.404 226890 DEBUG nova.virt.libvirt.driver [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Ensure instance console log exists: /var/lib/nova/instances/ce0152a6-7d4d-4eac-9587-a43ad934d9cc/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:53:27 np0005588920 nova_compute[226886]: 2026-01-20 14:53:27.404 226890 DEBUG oslo_concurrency.lockutils [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:53:27 np0005588920 nova_compute[226886]: 2026-01-20 14:53:27.405 226890 DEBUG oslo_concurrency.lockutils [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:53:27 np0005588920 nova_compute[226886]: 2026-01-20 14:53:27.405 226890 DEBUG oslo_concurrency.lockutils [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:53:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e282 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:53:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:28.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:28 np0005588920 nova_compute[226886]: 2026-01-20 14:53:28.401 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:53:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:28.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:53:28 np0005588920 podman[270050]: 2026-01-20 14:53:28.961427768 +0000 UTC m=+0.052692865 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 20 09:53:29 np0005588920 nova_compute[226886]: 2026-01-20 14:53:29.954 226890 DEBUG nova.network.neutron [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Successfully created port: 083e3cc0-e665-4049-a47b-233abf07b9d5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:53:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:30.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:30 np0005588920 nova_compute[226886]: 2026-01-20 14:53:30.601 226890 DEBUG nova.network.neutron [req-9566e6c2-33bb-4fd3-aea9-7a22dbbeb7ec req-b344a46f-8f35-42af-9873-4b73da64d95b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Updated VIF entry in instance network info cache for port 54d190c9-33af-46a2-a141-ff83769d93c6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:53:30 np0005588920 nova_compute[226886]: 2026-01-20 14:53:30.602 226890 DEBUG nova.network.neutron [req-9566e6c2-33bb-4fd3-aea9-7a22dbbeb7ec req-b344a46f-8f35-42af-9873-4b73da64d95b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Updating instance_info_cache with network_info: [{"id": "54d190c9-33af-46a2-a141-ff83769d93c6", "address": "fa:16:3e:1f:7c:1d", "network": {"id": "e9589011-b728-4b79-9945-aa6c52dd0fc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1143668360-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1e83af992c94112a965575784639d77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54d190c9-33", "ovs_interfaceid": "54d190c9-33af-46a2-a141-ff83769d93c6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:53:30 np0005588920 nova_compute[226886]: 2026-01-20 14:53:30.620 226890 DEBUG oslo_concurrency.lockutils [req-9566e6c2-33bb-4fd3-aea9-7a22dbbeb7ec req-b344a46f-8f35-42af-9873-4b73da64d95b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-49919d3f-fab0-404f-a0a0-82610973a254" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:53:30 np0005588920 nova_compute[226886]: 2026-01-20 14:53:30.855 226890 DEBUG nova.network.neutron [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Successfully updated port: 083e3cc0-e665-4049-a47b-233abf07b9d5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:53:30 np0005588920 nova_compute[226886]: 2026-01-20 14:53:30.875 226890 DEBUG oslo_concurrency.lockutils [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Acquiring lock "refresh_cache-ce0152a6-7d4d-4eac-9587-a43ad934d9cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:53:30 np0005588920 nova_compute[226886]: 2026-01-20 14:53:30.875 226890 DEBUG oslo_concurrency.lockutils [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Acquired lock "refresh_cache-ce0152a6-7d4d-4eac-9587-a43ad934d9cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:53:30 np0005588920 nova_compute[226886]: 2026-01-20 14:53:30.875 226890 DEBUG nova.network.neutron [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:53:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:53:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:30.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:53:30 np0005588920 nova_compute[226886]: 2026-01-20 14:53:30.986 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:31 np0005588920 nova_compute[226886]: 2026-01-20 14:53:31.029 226890 DEBUG nova.network.neutron [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:53:31 np0005588920 nova_compute[226886]: 2026-01-20 14:53:31.475 226890 DEBUG nova.compute.manager [req-6eb40d2b-d39c-4319-b3c8-e9ea422248ec req-96f523c0-16d3-4a92-889e-5dd73c8de79b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Received event network-changed-083e3cc0-e665-4049-a47b-233abf07b9d5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:53:31 np0005588920 nova_compute[226886]: 2026-01-20 14:53:31.476 226890 DEBUG nova.compute.manager [req-6eb40d2b-d39c-4319-b3c8-e9ea422248ec req-96f523c0-16d3-4a92-889e-5dd73c8de79b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Refreshing instance network info cache due to event network-changed-083e3cc0-e665-4049-a47b-233abf07b9d5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:53:31 np0005588920 nova_compute[226886]: 2026-01-20 14:53:31.476 226890 DEBUG oslo_concurrency.lockutils [req-6eb40d2b-d39c-4319-b3c8-e9ea422248ec req-96f523c0-16d3-4a92-889e-5dd73c8de79b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-ce0152a6-7d4d-4eac-9587-a43ad934d9cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:53:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:32.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e282 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:53:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:33.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:33 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e283 e283: 3 total, 3 up, 3 in
Jan 20 09:53:33 np0005588920 nova_compute[226886]: 2026-01-20 14:53:33.071 226890 DEBUG nova.network.neutron [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Updating instance_info_cache with network_info: [{"id": "083e3cc0-e665-4049-a47b-233abf07b9d5", "address": "fa:16:3e:6a:15:6d", "network": {"id": "41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1445030024-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3b1b7f5b4f84b5abbc401eb577c85c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap083e3cc0-e6", "ovs_interfaceid": "083e3cc0-e665-4049-a47b-233abf07b9d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:53:33 np0005588920 nova_compute[226886]: 2026-01-20 14:53:33.103 226890 DEBUG oslo_concurrency.lockutils [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Releasing lock "refresh_cache-ce0152a6-7d4d-4eac-9587-a43ad934d9cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:53:33 np0005588920 nova_compute[226886]: 2026-01-20 14:53:33.103 226890 DEBUG nova.compute.manager [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Instance network_info: |[{"id": "083e3cc0-e665-4049-a47b-233abf07b9d5", "address": "fa:16:3e:6a:15:6d", "network": {"id": "41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1445030024-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3b1b7f5b4f84b5abbc401eb577c85c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap083e3cc0-e6", "ovs_interfaceid": "083e3cc0-e665-4049-a47b-233abf07b9d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:53:33 np0005588920 nova_compute[226886]: 2026-01-20 14:53:33.104 226890 DEBUG oslo_concurrency.lockutils [req-6eb40d2b-d39c-4319-b3c8-e9ea422248ec req-96f523c0-16d3-4a92-889e-5dd73c8de79b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-ce0152a6-7d4d-4eac-9587-a43ad934d9cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:53:33 np0005588920 nova_compute[226886]: 2026-01-20 14:53:33.104 226890 DEBUG nova.network.neutron [req-6eb40d2b-d39c-4319-b3c8-e9ea422248ec req-96f523c0-16d3-4a92-889e-5dd73c8de79b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Refreshing network info cache for port 083e3cc0-e665-4049-a47b-233abf07b9d5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:53:33 np0005588920 nova_compute[226886]: 2026-01-20 14:53:33.107 226890 DEBUG nova.virt.libvirt.driver [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Start _get_guest_xml network_info=[{"id": "083e3cc0-e665-4049-a47b-233abf07b9d5", "address": "fa:16:3e:6a:15:6d", "network": {"id": "41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1445030024-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3b1b7f5b4f84b5abbc401eb577c85c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap083e3cc0-e6", "ovs_interfaceid": "083e3cc0-e665-4049-a47b-233abf07b9d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:53:33 np0005588920 nova_compute[226886]: 2026-01-20 14:53:33.112 226890 WARNING nova.virt.libvirt.driver [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:53:33 np0005588920 nova_compute[226886]: 2026-01-20 14:53:33.116 226890 DEBUG nova.virt.libvirt.host [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:53:33 np0005588920 nova_compute[226886]: 2026-01-20 14:53:33.117 226890 DEBUG nova.virt.libvirt.host [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:53:33 np0005588920 nova_compute[226886]: 2026-01-20 14:53:33.123 226890 DEBUG nova.virt.libvirt.host [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:53:33 np0005588920 nova_compute[226886]: 2026-01-20 14:53:33.123 226890 DEBUG nova.virt.libvirt.host [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:53:33 np0005588920 nova_compute[226886]: 2026-01-20 14:53:33.125 226890 DEBUG nova.virt.libvirt.driver [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:53:33 np0005588920 nova_compute[226886]: 2026-01-20 14:53:33.125 226890 DEBUG nova.virt.hardware [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:53:33 np0005588920 nova_compute[226886]: 2026-01-20 14:53:33.125 226890 DEBUG nova.virt.hardware [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:53:33 np0005588920 nova_compute[226886]: 2026-01-20 14:53:33.126 226890 DEBUG nova.virt.hardware [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:53:33 np0005588920 nova_compute[226886]: 2026-01-20 14:53:33.126 226890 DEBUG nova.virt.hardware [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:53:33 np0005588920 nova_compute[226886]: 2026-01-20 14:53:33.126 226890 DEBUG nova.virt.hardware [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:53:33 np0005588920 nova_compute[226886]: 2026-01-20 14:53:33.126 226890 DEBUG nova.virt.hardware [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:53:33 np0005588920 nova_compute[226886]: 2026-01-20 14:53:33.126 226890 DEBUG nova.virt.hardware [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:53:33 np0005588920 nova_compute[226886]: 2026-01-20 14:53:33.127 226890 DEBUG nova.virt.hardware [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:53:33 np0005588920 nova_compute[226886]: 2026-01-20 14:53:33.127 226890 DEBUG nova.virt.hardware [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:53:33 np0005588920 nova_compute[226886]: 2026-01-20 14:53:33.127 226890 DEBUG nova.virt.hardware [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:53:33 np0005588920 nova_compute[226886]: 2026-01-20 14:53:33.127 226890 DEBUG nova.virt.hardware [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:53:33 np0005588920 nova_compute[226886]: 2026-01-20 14:53:33.130 226890 DEBUG oslo_concurrency.processutils [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:53:33 np0005588920 nova_compute[226886]: 2026-01-20 14:53:33.405 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:33 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:53:33 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3836508932' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:53:33 np0005588920 nova_compute[226886]: 2026-01-20 14:53:33.725 226890 DEBUG oslo_concurrency.processutils [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.595s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:53:33 np0005588920 nova_compute[226886]: 2026-01-20 14:53:33.753 226890 DEBUG nova.storage.rbd_utils [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] rbd image ce0152a6-7d4d-4eac-9587-a43ad934d9cc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:53:33 np0005588920 nova_compute[226886]: 2026-01-20 14:53:33.759 226890 DEBUG oslo_concurrency.processutils [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:53:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:34.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:34 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:53:34 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/654798905' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:53:34 np0005588920 nova_compute[226886]: 2026-01-20 14:53:34.181 226890 DEBUG oslo_concurrency.processutils [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:53:34 np0005588920 nova_compute[226886]: 2026-01-20 14:53:34.184 226890 DEBUG nova.virt.libvirt.vif [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:53:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1580304238',display_name='tempest-ServerActionsTestOtherB-server-1580304238',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1580304238',id=122,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b3b1b7f5b4f84b5abbc401eb577c85c0',ramdisk_id='',reservation_id='r-932d0bwg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1136521362',owner_user_name='tempest-ServerActionsTestOtherB-1136521362-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:53:26Z,user_data=None,user_id='215db37373dc4ae5a75cbd6866f471da',uuid=ce0152a6-7d4d-4eac-9587-a43ad934d9cc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "083e3cc0-e665-4049-a47b-233abf07b9d5", "address": "fa:16:3e:6a:15:6d", "network": {"id": "41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1445030024-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3b1b7f5b4f84b5abbc401eb577c85c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap083e3cc0-e6", "ovs_interfaceid": "083e3cc0-e665-4049-a47b-233abf07b9d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:53:34 np0005588920 nova_compute[226886]: 2026-01-20 14:53:34.184 226890 DEBUG nova.network.os_vif_util [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Converting VIF {"id": "083e3cc0-e665-4049-a47b-233abf07b9d5", "address": "fa:16:3e:6a:15:6d", "network": {"id": "41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1445030024-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3b1b7f5b4f84b5abbc401eb577c85c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap083e3cc0-e6", "ovs_interfaceid": "083e3cc0-e665-4049-a47b-233abf07b9d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:53:34 np0005588920 nova_compute[226886]: 2026-01-20 14:53:34.186 226890 DEBUG nova.network.os_vif_util [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6a:15:6d,bridge_name='br-int',has_traffic_filtering=True,id=083e3cc0-e665-4049-a47b-233abf07b9d5,network=Network(41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap083e3cc0-e6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:53:34 np0005588920 nova_compute[226886]: 2026-01-20 14:53:34.187 226890 DEBUG nova.objects.instance [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lazy-loading 'pci_devices' on Instance uuid ce0152a6-7d4d-4eac-9587-a43ad934d9cc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:53:34 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e284 e284: 3 total, 3 up, 3 in
Jan 20 09:53:34 np0005588920 nova_compute[226886]: 2026-01-20 14:53:34.209 226890 DEBUG nova.virt.libvirt.driver [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:53:34 np0005588920 nova_compute[226886]:  <uuid>ce0152a6-7d4d-4eac-9587-a43ad934d9cc</uuid>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:  <name>instance-0000007a</name>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:53:34 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:      <nova:name>tempest-ServerActionsTestOtherB-server-1580304238</nova:name>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:53:33</nova:creationTime>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:53:34 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:        <nova:user uuid="215db37373dc4ae5a75cbd6866f471da">tempest-ServerActionsTestOtherB-1136521362-project-member</nova:user>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:        <nova:project uuid="b3b1b7f5b4f84b5abbc401eb577c85c0">tempest-ServerActionsTestOtherB-1136521362</nova:project>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:        <nova:port uuid="083e3cc0-e665-4049-a47b-233abf07b9d5">
Jan 20 09:53:34 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:      <entry name="serial">ce0152a6-7d4d-4eac-9587-a43ad934d9cc</entry>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:      <entry name="uuid">ce0152a6-7d4d-4eac-9587-a43ad934d9cc</entry>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:53:34 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/ce0152a6-7d4d-4eac-9587-a43ad934d9cc_disk">
Jan 20 09:53:34 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:53:34 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:53:34 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/ce0152a6-7d4d-4eac-9587-a43ad934d9cc_disk.config">
Jan 20 09:53:34 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:53:34 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:53:34 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:6a:15:6d"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:      <target dev="tap083e3cc0-e6"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:53:34 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/ce0152a6-7d4d-4eac-9587-a43ad934d9cc/console.log" append="off"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:53:34 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:53:34 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:53:34 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:53:34 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:53:34 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:53:34 np0005588920 nova_compute[226886]: 2026-01-20 14:53:34.211 226890 DEBUG nova.compute.manager [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Preparing to wait for external event network-vif-plugged-083e3cc0-e665-4049-a47b-233abf07b9d5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:53:34 np0005588920 nova_compute[226886]: 2026-01-20 14:53:34.212 226890 DEBUG oslo_concurrency.lockutils [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Acquiring lock "ce0152a6-7d4d-4eac-9587-a43ad934d9cc-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:53:34 np0005588920 nova_compute[226886]: 2026-01-20 14:53:34.212 226890 DEBUG oslo_concurrency.lockutils [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "ce0152a6-7d4d-4eac-9587-a43ad934d9cc-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:53:34 np0005588920 nova_compute[226886]: 2026-01-20 14:53:34.212 226890 DEBUG oslo_concurrency.lockutils [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "ce0152a6-7d4d-4eac-9587-a43ad934d9cc-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:53:34 np0005588920 nova_compute[226886]: 2026-01-20 14:53:34.213 226890 DEBUG nova.virt.libvirt.vif [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:53:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1580304238',display_name='tempest-ServerActionsTestOtherB-server-1580304238',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1580304238',id=122,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b3b1b7f5b4f84b5abbc401eb577c85c0',ramdisk_id='',reservation_id='r-932d0bwg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1136521362',owner_user_name='tempest-
ServerActionsTestOtherB-1136521362-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:53:26Z,user_data=None,user_id='215db37373dc4ae5a75cbd6866f471da',uuid=ce0152a6-7d4d-4eac-9587-a43ad934d9cc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "083e3cc0-e665-4049-a47b-233abf07b9d5", "address": "fa:16:3e:6a:15:6d", "network": {"id": "41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1445030024-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3b1b7f5b4f84b5abbc401eb577c85c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap083e3cc0-e6", "ovs_interfaceid": "083e3cc0-e665-4049-a47b-233abf07b9d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:53:34 np0005588920 nova_compute[226886]: 2026-01-20 14:53:34.213 226890 DEBUG nova.network.os_vif_util [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Converting VIF {"id": "083e3cc0-e665-4049-a47b-233abf07b9d5", "address": "fa:16:3e:6a:15:6d", "network": {"id": "41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1445030024-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3b1b7f5b4f84b5abbc401eb577c85c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap083e3cc0-e6", "ovs_interfaceid": "083e3cc0-e665-4049-a47b-233abf07b9d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:53:34 np0005588920 nova_compute[226886]: 2026-01-20 14:53:34.214 226890 DEBUG nova.network.os_vif_util [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6a:15:6d,bridge_name='br-int',has_traffic_filtering=True,id=083e3cc0-e665-4049-a47b-233abf07b9d5,network=Network(41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap083e3cc0-e6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:53:34 np0005588920 nova_compute[226886]: 2026-01-20 14:53:34.214 226890 DEBUG os_vif [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:15:6d,bridge_name='br-int',has_traffic_filtering=True,id=083e3cc0-e665-4049-a47b-233abf07b9d5,network=Network(41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap083e3cc0-e6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:53:34 np0005588920 nova_compute[226886]: 2026-01-20 14:53:34.215 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:34 np0005588920 nova_compute[226886]: 2026-01-20 14:53:34.216 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:53:34 np0005588920 nova_compute[226886]: 2026-01-20 14:53:34.217 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:53:34 np0005588920 nova_compute[226886]: 2026-01-20 14:53:34.221 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:34 np0005588920 nova_compute[226886]: 2026-01-20 14:53:34.221 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap083e3cc0-e6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:53:34 np0005588920 nova_compute[226886]: 2026-01-20 14:53:34.222 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap083e3cc0-e6, col_values=(('external_ids', {'iface-id': '083e3cc0-e665-4049-a47b-233abf07b9d5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6a:15:6d', 'vm-uuid': 'ce0152a6-7d4d-4eac-9587-a43ad934d9cc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:53:34 np0005588920 nova_compute[226886]: 2026-01-20 14:53:34.224 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:34 np0005588920 NetworkManager[49076]: <info>  [1768920814.2250] manager: (tap083e3cc0-e6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/284)
Jan 20 09:53:34 np0005588920 nova_compute[226886]: 2026-01-20 14:53:34.228 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:53:34 np0005588920 nova_compute[226886]: 2026-01-20 14:53:34.235 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:34 np0005588920 nova_compute[226886]: 2026-01-20 14:53:34.237 226890 INFO os_vif [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:15:6d,bridge_name='br-int',has_traffic_filtering=True,id=083e3cc0-e665-4049-a47b-233abf07b9d5,network=Network(41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap083e3cc0-e6')#033[00m
Jan 20 09:53:34 np0005588920 nova_compute[226886]: 2026-01-20 14:53:34.422 226890 DEBUG nova.virt.libvirt.driver [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:53:34 np0005588920 nova_compute[226886]: 2026-01-20 14:53:34.423 226890 DEBUG nova.virt.libvirt.driver [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:53:34 np0005588920 nova_compute[226886]: 2026-01-20 14:53:34.423 226890 DEBUG nova.virt.libvirt.driver [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] No VIF found with MAC fa:16:3e:6a:15:6d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:53:34 np0005588920 nova_compute[226886]: 2026-01-20 14:53:34.423 226890 INFO nova.virt.libvirt.driver [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Using config drive#033[00m
Jan 20 09:53:34 np0005588920 nova_compute[226886]: 2026-01-20 14:53:34.452 226890 DEBUG nova.storage.rbd_utils [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] rbd image ce0152a6-7d4d-4eac-9587-a43ad934d9cc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:53:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:35.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:35 np0005588920 nova_compute[226886]: 2026-01-20 14:53:35.094 226890 DEBUG nova.network.neutron [req-6eb40d2b-d39c-4319-b3c8-e9ea422248ec req-96f523c0-16d3-4a92-889e-5dd73c8de79b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Updated VIF entry in instance network info cache for port 083e3cc0-e665-4049-a47b-233abf07b9d5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:53:35 np0005588920 nova_compute[226886]: 2026-01-20 14:53:35.095 226890 DEBUG nova.network.neutron [req-6eb40d2b-d39c-4319-b3c8-e9ea422248ec req-96f523c0-16d3-4a92-889e-5dd73c8de79b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Updating instance_info_cache with network_info: [{"id": "083e3cc0-e665-4049-a47b-233abf07b9d5", "address": "fa:16:3e:6a:15:6d", "network": {"id": "41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1445030024-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3b1b7f5b4f84b5abbc401eb577c85c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap083e3cc0-e6", "ovs_interfaceid": "083e3cc0-e665-4049-a47b-233abf07b9d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:53:35 np0005588920 nova_compute[226886]: 2026-01-20 14:53:35.114 226890 DEBUG oslo_concurrency.lockutils [req-6eb40d2b-d39c-4319-b3c8-e9ea422248ec req-96f523c0-16d3-4a92-889e-5dd73c8de79b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-ce0152a6-7d4d-4eac-9587-a43ad934d9cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:53:35 np0005588920 nova_compute[226886]: 2026-01-20 14:53:35.184 226890 INFO nova.virt.libvirt.driver [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Creating config drive at /var/lib/nova/instances/ce0152a6-7d4d-4eac-9587-a43ad934d9cc/disk.config#033[00m
Jan 20 09:53:35 np0005588920 nova_compute[226886]: 2026-01-20 14:53:35.192 226890 DEBUG oslo_concurrency.processutils [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ce0152a6-7d4d-4eac-9587-a43ad934d9cc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp52ogfxdp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:53:35 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e285 e285: 3 total, 3 up, 3 in
Jan 20 09:53:35 np0005588920 ovn_controller[133971]: 2026-01-20T14:53:35Z|00079|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:1f:7c:1d 10.100.0.8
Jan 20 09:53:35 np0005588920 ovn_controller[133971]: 2026-01-20T14:53:35Z|00080|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1f:7c:1d 10.100.0.8
Jan 20 09:53:35 np0005588920 nova_compute[226886]: 2026-01-20 14:53:35.334 226890 DEBUG oslo_concurrency.processutils [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ce0152a6-7d4d-4eac-9587-a43ad934d9cc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp52ogfxdp" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:53:35 np0005588920 nova_compute[226886]: 2026-01-20 14:53:35.361 226890 DEBUG nova.storage.rbd_utils [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] rbd image ce0152a6-7d4d-4eac-9587-a43ad934d9cc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:53:35 np0005588920 nova_compute[226886]: 2026-01-20 14:53:35.365 226890 DEBUG oslo_concurrency.processutils [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/ce0152a6-7d4d-4eac-9587-a43ad934d9cc/disk.config ce0152a6-7d4d-4eac-9587-a43ad934d9cc_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:53:35 np0005588920 nova_compute[226886]: 2026-01-20 14:53:35.504 226890 DEBUG oslo_concurrency.processutils [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/ce0152a6-7d4d-4eac-9587-a43ad934d9cc/disk.config ce0152a6-7d4d-4eac-9587-a43ad934d9cc_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:53:35 np0005588920 nova_compute[226886]: 2026-01-20 14:53:35.505 226890 INFO nova.virt.libvirt.driver [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Deleting local config drive /var/lib/nova/instances/ce0152a6-7d4d-4eac-9587-a43ad934d9cc/disk.config because it was imported into RBD.#033[00m
Jan 20 09:53:35 np0005588920 kernel: tap083e3cc0-e6: entered promiscuous mode
Jan 20 09:53:35 np0005588920 NetworkManager[49076]: <info>  [1768920815.5500] manager: (tap083e3cc0-e6): new Tun device (/org/freedesktop/NetworkManager/Devices/285)
Jan 20 09:53:35 np0005588920 ovn_controller[133971]: 2026-01-20T14:53:35Z|00567|binding|INFO|Claiming lport 083e3cc0-e665-4049-a47b-233abf07b9d5 for this chassis.
Jan 20 09:53:35 np0005588920 ovn_controller[133971]: 2026-01-20T14:53:35Z|00568|binding|INFO|083e3cc0-e665-4049-a47b-233abf07b9d5: Claiming fa:16:3e:6a:15:6d 10.100.0.5
Jan 20 09:53:35 np0005588920 nova_compute[226886]: 2026-01-20 14:53:35.551 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:35.558 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6a:15:6d 10.100.0.5'], port_security=['fa:16:3e:6a:15:6d 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'ce0152a6-7d4d-4eac-9587-a43ad934d9cc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b3b1b7f5b4f84b5abbc401eb577c85c0', 'neutron:revision_number': '2', 'neutron:security_group_ids': '800ce09e-d4c4-4be1-b862-b09f6926701e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3273589e-5585-406c-9611-87f758b0e521, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=083e3cc0-e665-4049-a47b-233abf07b9d5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:35.560 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 083e3cc0-e665-4049-a47b-233abf07b9d5 in datapath 41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce bound to our chassis#033[00m
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:35.562 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce#033[00m
Jan 20 09:53:35 np0005588920 ovn_controller[133971]: 2026-01-20T14:53:35Z|00569|binding|INFO|Setting lport 083e3cc0-e665-4049-a47b-233abf07b9d5 ovn-installed in OVS
Jan 20 09:53:35 np0005588920 ovn_controller[133971]: 2026-01-20T14:53:35Z|00570|binding|INFO|Setting lport 083e3cc0-e665-4049-a47b-233abf07b9d5 up in Southbound
Jan 20 09:53:35 np0005588920 nova_compute[226886]: 2026-01-20 14:53:35.569 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:35 np0005588920 nova_compute[226886]: 2026-01-20 14:53:35.572 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:35.574 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c2256f8d-8aa1-4ae2-b954-220139dc9f1a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:35.575 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap41a1a3fe-f1 in ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:35.577 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap41a1a3fe-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:35.577 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ae9a40a0-6c03-4e39-86ca-c87718b5f997]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:35.577 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5328ac9d-04d5-4846-9337-3abfbd6e052b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:35 np0005588920 systemd-udevd[270206]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:35.590 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[74ff6e2c-bd4f-42cc-93e2-89381bed5144]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:35 np0005588920 NetworkManager[49076]: <info>  [1768920815.5920] device (tap083e3cc0-e6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:53:35 np0005588920 NetworkManager[49076]: <info>  [1768920815.5930] device (tap083e3cc0-e6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:53:35 np0005588920 systemd-machined[196121]: New machine qemu-56-instance-0000007a.
Jan 20 09:53:35 np0005588920 systemd[1]: Started Virtual Machine qemu-56-instance-0000007a.
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:35.616 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[7ef7e375-7cf3-4011-a51b-70372e1f5beb]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:35.649 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[217b147f-04fa-4c0c-ab2e-d8c3d28644d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:35.654 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[1ec6f8ae-920a-4176-ad97-675da4871f66]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:35 np0005588920 NetworkManager[49076]: <info>  [1768920815.6559] manager: (tap41a1a3fe-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/286)
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:35.681 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[a042c6a8-03fe-46de-abad-4a70dd6c6f7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:35.684 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[7ba49cef-e643-4539-8ddc-bf4a938190d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:35 np0005588920 NetworkManager[49076]: <info>  [1768920815.7097] device (tap41a1a3fe-f0): carrier: link connected
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:35.715 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[1c06e630-02f6-4103-a31d-d5be531edcef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:35.731 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[99f6fbe1-a0a2-42bf-a7f0-0675783de196]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap41a1a3fe-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3c:1f:b5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 183], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 585739, 'reachable_time': 32128, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 270240, 'error': None, 'target': 'ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:35.750 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b394102d-1f89-48c0-8b34-cb5005bf9955]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe3c:1fb5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 585739, 'tstamp': 585739}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 270241, 'error': None, 'target': 'ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:35.767 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[bf17cbf3-4116-4991-8927-0cd9c047023c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap41a1a3fe-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3c:1f:b5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 183], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 585739, 'reachable_time': 32128, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 270256, 'error': None, 'target': 'ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:35.806 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[67217f15-c858-4f3e-bb53-e6b39f37290b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:35.876 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[26c6cb58-d7b8-4ed8-a5da-220aef03967f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:35.878 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap41a1a3fe-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:35.878 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:35.878 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap41a1a3fe-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:53:35 np0005588920 kernel: tap41a1a3fe-f0: entered promiscuous mode
Jan 20 09:53:35 np0005588920 NetworkManager[49076]: <info>  [1768920815.8823] manager: (tap41a1a3fe-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/287)
Jan 20 09:53:35 np0005588920 nova_compute[226886]: 2026-01-20 14:53:35.880 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:35 np0005588920 nova_compute[226886]: 2026-01-20 14:53:35.883 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:35.884 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap41a1a3fe-f0, col_values=(('external_ids', {'iface-id': '3fa2df7b-42b2-4a3b-a33b-ab37b5d6aef3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:53:35 np0005588920 nova_compute[226886]: 2026-01-20 14:53:35.885 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:35 np0005588920 ovn_controller[133971]: 2026-01-20T14:53:35Z|00571|binding|INFO|Releasing lport 3fa2df7b-42b2-4a3b-a33b-ab37b5d6aef3 from this chassis (sb_readonly=0)
Jan 20 09:53:35 np0005588920 nova_compute[226886]: 2026-01-20 14:53:35.886 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:35.886 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:35.887 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[3eb8ca04-6851-46a3-99f9-f3d2f08266e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:35.888 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce.pid.haproxy
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID 41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:53:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:35.889 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce', 'env', 'PROCESS_TAG=haproxy-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:53:35 np0005588920 nova_compute[226886]: 2026-01-20 14:53:35.899 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:35 np0005588920 nova_compute[226886]: 2026-01-20 14:53:35.944 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920815.9439597, ce0152a6-7d4d-4eac-9587-a43ad934d9cc => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:53:35 np0005588920 nova_compute[226886]: 2026-01-20 14:53:35.945 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] VM Started (Lifecycle Event)#033[00m
Jan 20 09:53:35 np0005588920 nova_compute[226886]: 2026-01-20 14:53:35.967 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:53:35 np0005588920 nova_compute[226886]: 2026-01-20 14:53:35.971 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920815.944245, ce0152a6-7d4d-4eac-9587-a43ad934d9cc => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:53:35 np0005588920 nova_compute[226886]: 2026-01-20 14:53:35.971 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:53:35 np0005588920 nova_compute[226886]: 2026-01-20 14:53:35.989 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:35 np0005588920 nova_compute[226886]: 2026-01-20 14:53:35.993 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:53:35 np0005588920 nova_compute[226886]: 2026-01-20 14:53:35.996 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:53:36 np0005588920 nova_compute[226886]: 2026-01-20 14:53:36.015 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:53:36 np0005588920 nova_compute[226886]: 2026-01-20 14:53:36.049 226890 DEBUG nova.compute.manager [req-e970fef0-b0ab-4a5e-ab8f-41926281a7a1 req-ebcd1483-53b1-456f-bc13-037a817f6cba 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Received event network-vif-plugged-083e3cc0-e665-4049-a47b-233abf07b9d5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:53:36 np0005588920 nova_compute[226886]: 2026-01-20 14:53:36.049 226890 DEBUG oslo_concurrency.lockutils [req-e970fef0-b0ab-4a5e-ab8f-41926281a7a1 req-ebcd1483-53b1-456f-bc13-037a817f6cba 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "ce0152a6-7d4d-4eac-9587-a43ad934d9cc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:53:36 np0005588920 nova_compute[226886]: 2026-01-20 14:53:36.049 226890 DEBUG oslo_concurrency.lockutils [req-e970fef0-b0ab-4a5e-ab8f-41926281a7a1 req-ebcd1483-53b1-456f-bc13-037a817f6cba 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "ce0152a6-7d4d-4eac-9587-a43ad934d9cc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:53:36 np0005588920 nova_compute[226886]: 2026-01-20 14:53:36.050 226890 DEBUG oslo_concurrency.lockutils [req-e970fef0-b0ab-4a5e-ab8f-41926281a7a1 req-ebcd1483-53b1-456f-bc13-037a817f6cba 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "ce0152a6-7d4d-4eac-9587-a43ad934d9cc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:53:36 np0005588920 nova_compute[226886]: 2026-01-20 14:53:36.050 226890 DEBUG nova.compute.manager [req-e970fef0-b0ab-4a5e-ab8f-41926281a7a1 req-ebcd1483-53b1-456f-bc13-037a817f6cba 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Processing event network-vif-plugged-083e3cc0-e665-4049-a47b-233abf07b9d5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:53:36 np0005588920 nova_compute[226886]: 2026-01-20 14:53:36.051 226890 DEBUG nova.compute.manager [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:53:36 np0005588920 nova_compute[226886]: 2026-01-20 14:53:36.054 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920816.0537233, ce0152a6-7d4d-4eac-9587-a43ad934d9cc => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:53:36 np0005588920 nova_compute[226886]: 2026-01-20 14:53:36.054 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:53:36 np0005588920 nova_compute[226886]: 2026-01-20 14:53:36.060 226890 DEBUG nova.virt.libvirt.driver [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:53:36 np0005588920 nova_compute[226886]: 2026-01-20 14:53:36.063 226890 INFO nova.virt.libvirt.driver [-] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Instance spawned successfully.#033[00m
Jan 20 09:53:36 np0005588920 nova_compute[226886]: 2026-01-20 14:53:36.064 226890 DEBUG nova.virt.libvirt.driver [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:53:36 np0005588920 nova_compute[226886]: 2026-01-20 14:53:36.075 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:53:36 np0005588920 nova_compute[226886]: 2026-01-20 14:53:36.081 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 20 09:53:36 np0005588920 nova_compute[226886]: 2026-01-20 14:53:36.085 226890 DEBUG nova.virt.libvirt.driver [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 09:53:36 np0005588920 nova_compute[226886]: 2026-01-20 14:53:36.085 226890 DEBUG nova.virt.libvirt.driver [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 09:53:36 np0005588920 nova_compute[226886]: 2026-01-20 14:53:36.085 226890 DEBUG nova.virt.libvirt.driver [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 09:53:36 np0005588920 nova_compute[226886]: 2026-01-20 14:53:36.086 226890 DEBUG nova.virt.libvirt.driver [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 09:53:36 np0005588920 nova_compute[226886]: 2026-01-20 14:53:36.086 226890 DEBUG nova.virt.libvirt.driver [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 09:53:36 np0005588920 nova_compute[226886]: 2026-01-20 14:53:36.086 226890 DEBUG nova.virt.libvirt.driver [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 09:53:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:36.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:36 np0005588920 nova_compute[226886]: 2026-01-20 14:53:36.112 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 20 09:53:36 np0005588920 nova_compute[226886]: 2026-01-20 14:53:36.151 226890 INFO nova.compute.manager [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Took 9.58 seconds to spawn the instance on the hypervisor.
Jan 20 09:53:36 np0005588920 nova_compute[226886]: 2026-01-20 14:53:36.151 226890 DEBUG nova.compute.manager [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 09:53:36 np0005588920 nova_compute[226886]: 2026-01-20 14:53:36.217 226890 INFO nova.compute.manager [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Took 10.56 seconds to build instance.
Jan 20 09:53:36 np0005588920 nova_compute[226886]: 2026-01-20 14:53:36.233 226890 DEBUG oslo_concurrency.lockutils [None req-0e636d62-4008-4730-9cdc-290c8cc9f9ca 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "ce0152a6-7d4d-4eac-9587-a43ad934d9cc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.649s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:53:36 np0005588920 podman[270314]: 2026-01-20 14:53:36.279819801 +0000 UTC m=+0.051233705 container create 303ceca778e56624825691efee2a35e048a5758452b5324748fe3f3234f98eb8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:53:36 np0005588920 systemd[1]: Started libpod-conmon-303ceca778e56624825691efee2a35e048a5758452b5324748fe3f3234f98eb8.scope.
Jan 20 09:53:36 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:53:36 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/33682d6e7c63feaddac6a82c3caf43b99746288732ae100d8bacf369bba08d8a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:53:36 np0005588920 podman[270314]: 2026-01-20 14:53:36.255152786 +0000 UTC m=+0.026566710 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:53:36 np0005588920 podman[270314]: 2026-01-20 14:53:36.354517375 +0000 UTC m=+0.125931309 container init 303ceca778e56624825691efee2a35e048a5758452b5324748fe3f3234f98eb8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 20 09:53:36 np0005588920 podman[270314]: 2026-01-20 14:53:36.359540276 +0000 UTC m=+0.130954180 container start 303ceca778e56624825691efee2a35e048a5758452b5324748fe3f3234f98eb8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 20 09:53:36 np0005588920 neutron-haproxy-ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce[270328]: [NOTICE]   (270332) : New worker (270334) forked
Jan 20 09:53:36 np0005588920 neutron-haproxy-ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce[270328]: [NOTICE]   (270332) : Loading success.
Jan 20 09:53:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:37.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e285 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:53:37 np0005588920 nova_compute[226886]: 2026-01-20 14:53:37.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 09:53:37 np0005588920 nova_compute[226886]: 2026-01-20 14:53:37.727 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 20 09:53:37 np0005588920 nova_compute[226886]: 2026-01-20 14:53:37.727 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 20 09:53:38 np0005588920 nova_compute[226886]: 2026-01-20 14:53:38.059 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "refresh_cache-7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 20 09:53:38 np0005588920 nova_compute[226886]: 2026-01-20 14:53:38.059 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquired lock "refresh_cache-7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 20 09:53:38 np0005588920 nova_compute[226886]: 2026-01-20 14:53:38.059 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 20 09:53:38 np0005588920 nova_compute[226886]: 2026-01-20 14:53:38.060 226890 DEBUG nova.objects.instance [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 09:53:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:38.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:38 np0005588920 nova_compute[226886]: 2026-01-20 14:53:38.173 226890 DEBUG nova.compute.manager [req-fde28adc-7ccc-45a6-8a00-14f949aebc35 req-dd7ee047-df59-4ac1-bf3c-1779039d6a14 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Received event network-vif-plugged-083e3cc0-e665-4049-a47b-233abf07b9d5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 09:53:38 np0005588920 nova_compute[226886]: 2026-01-20 14:53:38.174 226890 DEBUG oslo_concurrency.lockutils [req-fde28adc-7ccc-45a6-8a00-14f949aebc35 req-dd7ee047-df59-4ac1-bf3c-1779039d6a14 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "ce0152a6-7d4d-4eac-9587-a43ad934d9cc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:53:38 np0005588920 nova_compute[226886]: 2026-01-20 14:53:38.174 226890 DEBUG oslo_concurrency.lockutils [req-fde28adc-7ccc-45a6-8a00-14f949aebc35 req-dd7ee047-df59-4ac1-bf3c-1779039d6a14 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "ce0152a6-7d4d-4eac-9587-a43ad934d9cc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:53:38 np0005588920 nova_compute[226886]: 2026-01-20 14:53:38.174 226890 DEBUG oslo_concurrency.lockutils [req-fde28adc-7ccc-45a6-8a00-14f949aebc35 req-dd7ee047-df59-4ac1-bf3c-1779039d6a14 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "ce0152a6-7d4d-4eac-9587-a43ad934d9cc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:53:38 np0005588920 nova_compute[226886]: 2026-01-20 14:53:38.174 226890 DEBUG nova.compute.manager [req-fde28adc-7ccc-45a6-8a00-14f949aebc35 req-dd7ee047-df59-4ac1-bf3c-1779039d6a14 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] No waiting events found dispatching network-vif-plugged-083e3cc0-e665-4049-a47b-233abf07b9d5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 20 09:53:38 np0005588920 nova_compute[226886]: 2026-01-20 14:53:38.175 226890 WARNING nova.compute.manager [req-fde28adc-7ccc-45a6-8a00-14f949aebc35 req-dd7ee047-df59-4ac1-bf3c-1779039d6a14 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Received unexpected event network-vif-plugged-083e3cc0-e665-4049-a47b-233abf07b9d5 for instance with vm_state active and task_state None.
Jan 20 09:53:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:53:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:39.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:53:39 np0005588920 nova_compute[226886]: 2026-01-20 14:53:39.173 226890 INFO nova.compute.manager [None req-83d273ed-942f-4f15-885b-094e2b52d34a 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Get console output
Jan 20 09:53:39 np0005588920 nova_compute[226886]: 2026-01-20 14:53:39.178 260344 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 20 09:53:39 np0005588920 nova_compute[226886]: 2026-01-20 14:53:39.224 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:53:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:40.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:40 np0005588920 nova_compute[226886]: 2026-01-20 14:53:40.989 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:53:41 np0005588920 nova_compute[226886]: 2026-01-20 14:53:41.018 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Updating instance_info_cache with network_info: [{"id": "362a0992-4e48-4999-a396-29fc2957fa09", "address": "fa:16:3e:83:1f:c0", "network": {"id": "79184781-1f23-4584-87de-08e262242488", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a29915e0dd2403fbd7b7e847696b00a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap362a0992-4e", "ovs_interfaceid": "362a0992-4e48-4999-a396-29fc2957fa09", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 20 09:53:41 np0005588920 nova_compute[226886]: 2026-01-20 14:53:41.042 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Releasing lock "refresh_cache-7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 20 09:53:41 np0005588920 nova_compute[226886]: 2026-01-20 14:53:41.042 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 20 09:53:41 np0005588920 nova_compute[226886]: 2026-01-20 14:53:41.042 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 09:53:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:53:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:41.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:53:41 np0005588920 nova_compute[226886]: 2026-01-20 14:53:41.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 09:53:41 np0005588920 nova_compute[226886]: 2026-01-20 14:53:41.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 09:53:41 np0005588920 nova_compute[226886]: 2026-01-20 14:53:41.746 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:53:41 np0005588920 nova_compute[226886]: 2026-01-20 14:53:41.747 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:53:41 np0005588920 nova_compute[226886]: 2026-01-20 14:53:41.747 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:53:41 np0005588920 nova_compute[226886]: 2026-01-20 14:53:41.747 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 20 09:53:41 np0005588920 nova_compute[226886]: 2026-01-20 14:53:41.748 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:53:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:42.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:53:42 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3957894210' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:53:42 np0005588920 nova_compute[226886]: 2026-01-20 14:53:42.247 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.499s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:53:42 np0005588920 nova_compute[226886]: 2026-01-20 14:53:42.345 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-00000078 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 20 09:53:42 np0005588920 nova_compute[226886]: 2026-01-20 14:53:42.346 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-00000078 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 20 09:53:42 np0005588920 nova_compute[226886]: 2026-01-20 14:53:42.350 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-0000007a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 20 09:53:42 np0005588920 nova_compute[226886]: 2026-01-20 14:53:42.350 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-0000007a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 20 09:53:42 np0005588920 nova_compute[226886]: 2026-01-20 14:53:42.353 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-00000070 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 20 09:53:42 np0005588920 nova_compute[226886]: 2026-01-20 14:53:42.353 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-00000070 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 20 09:53:42 np0005588920 nova_compute[226886]: 2026-01-20 14:53:42.521 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 20 09:53:42 np0005588920 nova_compute[226886]: 2026-01-20 14:53:42.523 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3865MB free_disk=20.693958282470703GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 20 09:53:42 np0005588920 nova_compute[226886]: 2026-01-20 14:53:42.523 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:53:42 np0005588920 nova_compute[226886]: 2026-01-20 14:53:42.523 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:53:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e285 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:53:42 np0005588920 nova_compute[226886]: 2026-01-20 14:53:42.666 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 20 09:53:42 np0005588920 nova_compute[226886]: 2026-01-20 14:53:42.667 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance 49919d3f-fab0-404f-a0a0-82610973a254 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 20 09:53:42 np0005588920 nova_compute[226886]: 2026-01-20 14:53:42.667 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance ce0152a6-7d4d-4eac-9587-a43ad934d9cc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 20 09:53:42 np0005588920 nova_compute[226886]: 2026-01-20 14:53:42.667 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 20 09:53:42 np0005588920 nova_compute[226886]: 2026-01-20 14:53:42.667 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=20GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 20 09:53:42 np0005588920 nova_compute[226886]: 2026-01-20 14:53:42.757 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Refreshing inventories for resource provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 20 09:53:42 np0005588920 nova_compute[226886]: 2026-01-20 14:53:42.826 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Updating ProviderTree inventory for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 20 09:53:42 np0005588920 nova_compute[226886]: 2026-01-20 14:53:42.826 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Updating inventory in ProviderTree for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 20 09:53:42 np0005588920 nova_compute[226886]: 2026-01-20 14:53:42.849 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Refreshing aggregate associations for resource provider ff38e91c-3320-4831-90ac-bcffc89ba7b6, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 20 09:53:42 np0005588920 nova_compute[226886]: 2026-01-20 14:53:42.944 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Refreshing trait associations for resource provider ff38e91c-3320-4831-90ac-bcffc89ba7b6, traits: COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_DEVICE_TAGGING,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 20 09:53:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:43.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:43 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:53:43 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2236801101' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:53:43 np0005588920 nova_compute[226886]: 2026-01-20 14:53:43.474 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:53:43 np0005588920 nova_compute[226886]: 2026-01-20 14:53:43.480 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:53:43 np0005588920 nova_compute[226886]: 2026-01-20 14:53:43.495 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:53:43 np0005588920 nova_compute[226886]: 2026-01-20 14:53:43.820 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:53:43 np0005588920 nova_compute[226886]: 2026-01-20 14:53:43.820 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.297s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:53:43 np0005588920 nova_compute[226886]: 2026-01-20 14:53:43.821 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:53:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:44.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:44 np0005588920 nova_compute[226886]: 2026-01-20 14:53:44.227 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e286 e286: 3 total, 3 up, 3 in
Jan 20 09:53:44 np0005588920 nova_compute[226886]: 2026-01-20 14:53:44.842 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:53:44 np0005588920 nova_compute[226886]: 2026-01-20 14:53:44.842 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:53:44 np0005588920 nova_compute[226886]: 2026-01-20 14:53:44.876 226890 DEBUG oslo_concurrency.lockutils [None req-46eeafb9-3aa7-4580-a8e5-9c0a235f1310 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Acquiring lock "49919d3f-fab0-404f-a0a0-82610973a254" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:53:44 np0005588920 nova_compute[226886]: 2026-01-20 14:53:44.876 226890 DEBUG oslo_concurrency.lockutils [None req-46eeafb9-3aa7-4580-a8e5-9c0a235f1310 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "49919d3f-fab0-404f-a0a0-82610973a254" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:53:44 np0005588920 nova_compute[226886]: 2026-01-20 14:53:44.877 226890 INFO nova.compute.manager [None req-46eeafb9-3aa7-4580-a8e5-9c0a235f1310 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Shelving#033[00m
Jan 20 09:53:44 np0005588920 nova_compute[226886]: 2026-01-20 14:53:44.895 226890 DEBUG nova.virt.libvirt.driver [None req-46eeafb9-3aa7-4580-a8e5-9c0a235f1310 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 20 09:53:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:45.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:45 np0005588920 nova_compute[226886]: 2026-01-20 14:53:45.721 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:53:45 np0005588920 nova_compute[226886]: 2026-01-20 14:53:45.992 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:46.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:46 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #51. Immutable memtables: 8.
Jan 20 09:53:46 np0005588920 nova_compute[226886]: 2026-01-20 14:53:46.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:53:46 np0005588920 nova_compute[226886]: 2026-01-20 14:53:46.726 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:53:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:53:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:47.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:53:47 np0005588920 kernel: tap54d190c9-33 (unregistering): left promiscuous mode
Jan 20 09:53:47 np0005588920 NetworkManager[49076]: <info>  [1768920827.1255] device (tap54d190c9-33): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:53:47 np0005588920 nova_compute[226886]: 2026-01-20 14:53:47.139 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:47 np0005588920 ovn_controller[133971]: 2026-01-20T14:53:47Z|00572|binding|INFO|Releasing lport 54d190c9-33af-46a2-a141-ff83769d93c6 from this chassis (sb_readonly=0)
Jan 20 09:53:47 np0005588920 ovn_controller[133971]: 2026-01-20T14:53:47Z|00573|binding|INFO|Setting lport 54d190c9-33af-46a2-a141-ff83769d93c6 down in Southbound
Jan 20 09:53:47 np0005588920 ovn_controller[133971]: 2026-01-20T14:53:47Z|00574|binding|INFO|Removing iface tap54d190c9-33 ovn-installed in OVS
Jan 20 09:53:47 np0005588920 nova_compute[226886]: 2026-01-20 14:53:47.141 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:47.145 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1f:7c:1d 10.100.0.8'], port_security=['fa:16:3e:1f:7c:1d 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '49919d3f-fab0-404f-a0a0-82610973a254', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e9589011-b728-4b79-9945-aa6c52dd0fc2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b1e83af992c94112a965575784639d77', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'da0e73ee-8414-4d81-a0bf-09363bb8a1b8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.210'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1dc6df7d-3e57-4779-8232-af1ccf413403, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=54d190c9-33af-46a2-a141-ff83769d93c6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:53:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:47.146 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 54d190c9-33af-46a2-a141-ff83769d93c6 in datapath e9589011-b728-4b79-9945-aa6c52dd0fc2 unbound from our chassis#033[00m
Jan 20 09:53:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:47.148 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e9589011-b728-4b79-9945-aa6c52dd0fc2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:53:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:47.149 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[39610efa-0db9-46da-b38f-5472fbf106c8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:47.150 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2 namespace which is not needed anymore#033[00m
Jan 20 09:53:47 np0005588920 nova_compute[226886]: 2026-01-20 14:53:47.156 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:47 np0005588920 systemd[1]: machine-qemu\x2d55\x2dinstance\x2d00000078.scope: Deactivated successfully.
Jan 20 09:53:47 np0005588920 systemd[1]: machine-qemu\x2d55\x2dinstance\x2d00000078.scope: Consumed 14.107s CPU time.
Jan 20 09:53:47 np0005588920 systemd-machined[196121]: Machine qemu-55-instance-00000078 terminated.
Jan 20 09:53:47 np0005588920 neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2[269846]: [NOTICE]   (269850) : haproxy version is 2.8.14-c23fe91
Jan 20 09:53:47 np0005588920 neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2[269846]: [NOTICE]   (269850) : path to executable is /usr/sbin/haproxy
Jan 20 09:53:47 np0005588920 neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2[269846]: [WARNING]  (269850) : Exiting Master process...
Jan 20 09:53:47 np0005588920 neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2[269846]: [ALERT]    (269850) : Current worker (269852) exited with code 143 (Terminated)
Jan 20 09:53:47 np0005588920 neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2[269846]: [WARNING]  (269850) : All workers exited. Exiting... (0)
Jan 20 09:53:47 np0005588920 systemd[1]: libpod-c339d9a29b39cecda57dd59a5681d306e7c63d519488225a9b03b77f32b8c809.scope: Deactivated successfully.
Jan 20 09:53:47 np0005588920 podman[270412]: 2026-01-20 14:53:47.299653634 +0000 UTC m=+0.045771110 container died c339d9a29b39cecda57dd59a5681d306e7c63d519488225a9b03b77f32b8c809 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 20 09:53:47 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c339d9a29b39cecda57dd59a5681d306e7c63d519488225a9b03b77f32b8c809-userdata-shm.mount: Deactivated successfully.
Jan 20 09:53:47 np0005588920 systemd[1]: var-lib-containers-storage-overlay-c715ea318109a6ff06825ed6abfff1b2d73cd44b23f7f2047d0a53d5ddf310ca-merged.mount: Deactivated successfully.
Jan 20 09:53:47 np0005588920 podman[270412]: 2026-01-20 14:53:47.338436247 +0000 UTC m=+0.084553723 container cleanup c339d9a29b39cecda57dd59a5681d306e7c63d519488225a9b03b77f32b8c809 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:53:47 np0005588920 systemd[1]: libpod-conmon-c339d9a29b39cecda57dd59a5681d306e7c63d519488225a9b03b77f32b8c809.scope: Deactivated successfully.
Jan 20 09:53:47 np0005588920 podman[270442]: 2026-01-20 14:53:47.407463621 +0000 UTC m=+0.044381601 container remove c339d9a29b39cecda57dd59a5681d306e7c63d519488225a9b03b77f32b8c809 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:53:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:47.414 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5d538266-75dc-41c4-86b7-edfbf631cb28]: (4, ('Tue Jan 20 02:53:47 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2 (c339d9a29b39cecda57dd59a5681d306e7c63d519488225a9b03b77f32b8c809)\nc339d9a29b39cecda57dd59a5681d306e7c63d519488225a9b03b77f32b8c809\nTue Jan 20 02:53:47 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2 (c339d9a29b39cecda57dd59a5681d306e7c63d519488225a9b03b77f32b8c809)\nc339d9a29b39cecda57dd59a5681d306e7c63d519488225a9b03b77f32b8c809\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:47.416 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[be796c49-5e31-443b-8a3a-faa92cad5299]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:47.417 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape9589011-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:53:47 np0005588920 nova_compute[226886]: 2026-01-20 14:53:47.419 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:47 np0005588920 nova_compute[226886]: 2026-01-20 14:53:47.434 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:47 np0005588920 kernel: tape9589011-b0: left promiscuous mode
Jan 20 09:53:47 np0005588920 nova_compute[226886]: 2026-01-20 14:53:47.441 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:47.444 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[7cbc0988-b855-49c9-9f8c-c4123a895b0a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:47.460 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[69f9696d-ff58-47db-9a3c-4ede1af049e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:47.462 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[6e70a6bf-94dd-4342-adb5-2fe3ea04d69e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:47.479 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b74996d2-a587-4336-a7dc-659a26796e38]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 584110, 'reachable_time': 39442, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 270474, 'error': None, 'target': 'ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:47 np0005588920 systemd[1]: run-netns-ovnmeta\x2de9589011\x2db728\x2d4b79\x2d9945\x2daa6c52dd0fc2.mount: Deactivated successfully.
Jan 20 09:53:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:47.484 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:53:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:53:47.484 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[0008d334-7dbe-439e-826d-c9aa9f785443]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e286 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:53:47 np0005588920 nova_compute[226886]: 2026-01-20 14:53:47.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:53:47 np0005588920 nova_compute[226886]: 2026-01-20 14:53:47.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:53:47 np0005588920 nova_compute[226886]: 2026-01-20 14:53:47.726 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 20 09:53:47 np0005588920 nova_compute[226886]: 2026-01-20 14:53:47.744 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 20 09:53:47 np0005588920 nova_compute[226886]: 2026-01-20 14:53:47.909 226890 INFO nova.virt.libvirt.driver [None req-46eeafb9-3aa7-4580-a8e5-9c0a235f1310 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Instance shutdown successfully after 3 seconds.#033[00m
Jan 20 09:53:47 np0005588920 nova_compute[226886]: 2026-01-20 14:53:47.916 226890 INFO nova.virt.libvirt.driver [-] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Instance destroyed successfully.#033[00m
Jan 20 09:53:47 np0005588920 nova_compute[226886]: 2026-01-20 14:53:47.916 226890 DEBUG nova.objects.instance [None req-46eeafb9-3aa7-4580-a8e5-9c0a235f1310 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lazy-loading 'numa_topology' on Instance uuid 49919d3f-fab0-404f-a0a0-82610973a254 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:53:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:48.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:48 np0005588920 nova_compute[226886]: 2026-01-20 14:53:48.450 226890 INFO nova.virt.libvirt.driver [None req-46eeafb9-3aa7-4580-a8e5-9c0a235f1310 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Beginning cold snapshot process#033[00m
Jan 20 09:53:48 np0005588920 nova_compute[226886]: 2026-01-20 14:53:48.637 226890 DEBUG nova.virt.libvirt.imagebackend [None req-46eeafb9-3aa7-4580-a8e5-9c0a235f1310 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] No parent info for a32b3e07-16d8-46fd-9a7b-c242c432fcf9; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Jan 20 09:53:48 np0005588920 nova_compute[226886]: 2026-01-20 14:53:48.725 226890 DEBUG nova.compute.manager [req-886cb4f4-7442-4fb0-81cb-7149624e3c42 req-2d87c1ee-2bd8-4a96-abee-aa4b0faa3f56 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Received event network-vif-unplugged-54d190c9-33af-46a2-a141-ff83769d93c6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:53:48 np0005588920 nova_compute[226886]: 2026-01-20 14:53:48.726 226890 DEBUG oslo_concurrency.lockutils [req-886cb4f4-7442-4fb0-81cb-7149624e3c42 req-2d87c1ee-2bd8-4a96-abee-aa4b0faa3f56 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "49919d3f-fab0-404f-a0a0-82610973a254-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:53:48 np0005588920 nova_compute[226886]: 2026-01-20 14:53:48.726 226890 DEBUG oslo_concurrency.lockutils [req-886cb4f4-7442-4fb0-81cb-7149624e3c42 req-2d87c1ee-2bd8-4a96-abee-aa4b0faa3f56 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "49919d3f-fab0-404f-a0a0-82610973a254-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:53:48 np0005588920 nova_compute[226886]: 2026-01-20 14:53:48.726 226890 DEBUG oslo_concurrency.lockutils [req-886cb4f4-7442-4fb0-81cb-7149624e3c42 req-2d87c1ee-2bd8-4a96-abee-aa4b0faa3f56 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "49919d3f-fab0-404f-a0a0-82610973a254-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:53:48 np0005588920 nova_compute[226886]: 2026-01-20 14:53:48.727 226890 DEBUG nova.compute.manager [req-886cb4f4-7442-4fb0-81cb-7149624e3c42 req-2d87c1ee-2bd8-4a96-abee-aa4b0faa3f56 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] No waiting events found dispatching network-vif-unplugged-54d190c9-33af-46a2-a141-ff83769d93c6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:53:48 np0005588920 nova_compute[226886]: 2026-01-20 14:53:48.727 226890 WARNING nova.compute.manager [req-886cb4f4-7442-4fb0-81cb-7149624e3c42 req-2d87c1ee-2bd8-4a96-abee-aa4b0faa3f56 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Received unexpected event network-vif-unplugged-54d190c9-33af-46a2-a141-ff83769d93c6 for instance with vm_state active and task_state shelving_image_uploading.#033[00m
Jan 20 09:53:48 np0005588920 nova_compute[226886]: 2026-01-20 14:53:48.886 226890 DEBUG nova.storage.rbd_utils [None req-46eeafb9-3aa7-4580-a8e5-9c0a235f1310 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] creating snapshot(c8ffdd4946494fcebd789799e555d0e6) on rbd image(49919d3f-fab0-404f-a0a0-82610973a254_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 20 09:53:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:49.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:49 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e287 e287: 3 total, 3 up, 3 in
Jan 20 09:53:49 np0005588920 nova_compute[226886]: 2026-01-20 14:53:49.146 226890 DEBUG nova.storage.rbd_utils [None req-46eeafb9-3aa7-4580-a8e5-9c0a235f1310 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] cloning vms/49919d3f-fab0-404f-a0a0-82610973a254_disk@c8ffdd4946494fcebd789799e555d0e6 to images/6e97db8b-a462-4791-9edf-594ed0f547e6 clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 20 09:53:49 np0005588920 nova_compute[226886]: 2026-01-20 14:53:49.229 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:49 np0005588920 nova_compute[226886]: 2026-01-20 14:53:49.268 226890 DEBUG nova.storage.rbd_utils [None req-46eeafb9-3aa7-4580-a8e5-9c0a235f1310 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] flattening images/6e97db8b-a462-4791-9edf-594ed0f547e6 flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 20 09:53:49 np0005588920 nova_compute[226886]: 2026-01-20 14:53:49.678 226890 DEBUG nova.storage.rbd_utils [None req-46eeafb9-3aa7-4580-a8e5-9c0a235f1310 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] removing snapshot(c8ffdd4946494fcebd789799e555d0e6) on rbd image(49919d3f-fab0-404f-a0a0-82610973a254_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 20 09:53:50 np0005588920 podman[270598]: 2026-01-20 14:53:50.015350169 +0000 UTC m=+0.092708672 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 20 09:53:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:50.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:50 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e288 e288: 3 total, 3 up, 3 in
Jan 20 09:53:50 np0005588920 nova_compute[226886]: 2026-01-20 14:53:50.158 226890 DEBUG nova.storage.rbd_utils [None req-46eeafb9-3aa7-4580-a8e5-9c0a235f1310 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] creating snapshot(snap) on rbd image(6e97db8b-a462-4791-9edf-594ed0f547e6) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 20 09:53:50 np0005588920 ovn_controller[133971]: 2026-01-20T14:53:50Z|00081|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:6a:15:6d 10.100.0.5
Jan 20 09:53:50 np0005588920 ovn_controller[133971]: 2026-01-20T14:53:50Z|00082|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6a:15:6d 10.100.0.5
Jan 20 09:53:50 np0005588920 nova_compute[226886]: 2026-01-20 14:53:50.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:53:50 np0005588920 nova_compute[226886]: 2026-01-20 14:53:50.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 20 09:53:50 np0005588920 nova_compute[226886]: 2026-01-20 14:53:50.993 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:53:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:51.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:53:51 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e289 e289: 3 total, 3 up, 3 in
Jan 20 09:53:51 np0005588920 nova_compute[226886]: 2026-01-20 14:53:51.245 226890 DEBUG nova.compute.manager [req-c31ea950-efca-40f7-ae8f-f0488c5ea543 req-5e0dad00-be0b-4d4c-a468-200f6d40eeb0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Received event network-vif-plugged-54d190c9-33af-46a2-a141-ff83769d93c6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:53:51 np0005588920 nova_compute[226886]: 2026-01-20 14:53:51.245 226890 DEBUG oslo_concurrency.lockutils [req-c31ea950-efca-40f7-ae8f-f0488c5ea543 req-5e0dad00-be0b-4d4c-a468-200f6d40eeb0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "49919d3f-fab0-404f-a0a0-82610973a254-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:53:51 np0005588920 nova_compute[226886]: 2026-01-20 14:53:51.246 226890 DEBUG oslo_concurrency.lockutils [req-c31ea950-efca-40f7-ae8f-f0488c5ea543 req-5e0dad00-be0b-4d4c-a468-200f6d40eeb0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "49919d3f-fab0-404f-a0a0-82610973a254-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:53:51 np0005588920 nova_compute[226886]: 2026-01-20 14:53:51.246 226890 DEBUG oslo_concurrency.lockutils [req-c31ea950-efca-40f7-ae8f-f0488c5ea543 req-5e0dad00-be0b-4d4c-a468-200f6d40eeb0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "49919d3f-fab0-404f-a0a0-82610973a254-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:53:51 np0005588920 nova_compute[226886]: 2026-01-20 14:53:51.247 226890 DEBUG nova.compute.manager [req-c31ea950-efca-40f7-ae8f-f0488c5ea543 req-5e0dad00-be0b-4d4c-a468-200f6d40eeb0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] No waiting events found dispatching network-vif-plugged-54d190c9-33af-46a2-a141-ff83769d93c6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:53:51 np0005588920 nova_compute[226886]: 2026-01-20 14:53:51.247 226890 WARNING nova.compute.manager [req-c31ea950-efca-40f7-ae8f-f0488c5ea543 req-5e0dad00-be0b-4d4c-a468-200f6d40eeb0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Received unexpected event network-vif-plugged-54d190c9-33af-46a2-a141-ff83769d93c6 for instance with vm_state active and task_state shelving_image_uploading.#033[00m
Jan 20 09:53:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:52.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e289 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:53:52 np0005588920 nova_compute[226886]: 2026-01-20 14:53:52.669 226890 INFO nova.virt.libvirt.driver [None req-46eeafb9-3aa7-4580-a8e5-9c0a235f1310 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Snapshot image upload complete#033[00m
Jan 20 09:53:52 np0005588920 nova_compute[226886]: 2026-01-20 14:53:52.670 226890 DEBUG nova.compute.manager [None req-46eeafb9-3aa7-4580-a8e5-9c0a235f1310 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:53:52 np0005588920 nova_compute[226886]: 2026-01-20 14:53:52.724 226890 INFO nova.compute.manager [None req-46eeafb9-3aa7-4580-a8e5-9c0a235f1310 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Shelve offloading#033[00m
Jan 20 09:53:52 np0005588920 nova_compute[226886]: 2026-01-20 14:53:52.735 226890 INFO nova.virt.libvirt.driver [-] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Instance destroyed successfully.#033[00m
Jan 20 09:53:52 np0005588920 nova_compute[226886]: 2026-01-20 14:53:52.736 226890 DEBUG nova.compute.manager [None req-46eeafb9-3aa7-4580-a8e5-9c0a235f1310 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:53:52 np0005588920 nova_compute[226886]: 2026-01-20 14:53:52.739 226890 DEBUG oslo_concurrency.lockutils [None req-46eeafb9-3aa7-4580-a8e5-9c0a235f1310 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Acquiring lock "refresh_cache-49919d3f-fab0-404f-a0a0-82610973a254" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:53:52 np0005588920 nova_compute[226886]: 2026-01-20 14:53:52.740 226890 DEBUG oslo_concurrency.lockutils [None req-46eeafb9-3aa7-4580-a8e5-9c0a235f1310 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Acquired lock "refresh_cache-49919d3f-fab0-404f-a0a0-82610973a254" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:53:52 np0005588920 nova_compute[226886]: 2026-01-20 14:53:52.740 226890 DEBUG nova.network.neutron [None req-46eeafb9-3aa7-4580-a8e5-9c0a235f1310 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:53:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:53.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:54.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:54 np0005588920 nova_compute[226886]: 2026-01-20 14:53:54.231 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:54 np0005588920 nova_compute[226886]: 2026-01-20 14:53:54.681 226890 DEBUG nova.network.neutron [None req-46eeafb9-3aa7-4580-a8e5-9c0a235f1310 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Updating instance_info_cache with network_info: [{"id": "54d190c9-33af-46a2-a141-ff83769d93c6", "address": "fa:16:3e:1f:7c:1d", "network": {"id": "e9589011-b728-4b79-9945-aa6c52dd0fc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1143668360-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1e83af992c94112a965575784639d77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54d190c9-33", "ovs_interfaceid": "54d190c9-33af-46a2-a141-ff83769d93c6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:53:54 np0005588920 nova_compute[226886]: 2026-01-20 14:53:54.745 226890 DEBUG oslo_concurrency.lockutils [None req-46eeafb9-3aa7-4580-a8e5-9c0a235f1310 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Releasing lock "refresh_cache-49919d3f-fab0-404f-a0a0-82610973a254" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:53:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:55.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:55 np0005588920 nova_compute[226886]: 2026-01-20 14:53:55.996 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:53:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:56.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:53:56 np0005588920 nova_compute[226886]: 2026-01-20 14:53:56.322 226890 INFO nova.virt.libvirt.driver [-] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Instance destroyed successfully.#033[00m
Jan 20 09:53:56 np0005588920 nova_compute[226886]: 2026-01-20 14:53:56.322 226890 DEBUG nova.objects.instance [None req-46eeafb9-3aa7-4580-a8e5-9c0a235f1310 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lazy-loading 'resources' on Instance uuid 49919d3f-fab0-404f-a0a0-82610973a254 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:53:56 np0005588920 nova_compute[226886]: 2026-01-20 14:53:56.338 226890 DEBUG nova.virt.libvirt.vif [None req-46eeafb9-3aa7-4580-a8e5-9c0a235f1310 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:53:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-1885581249',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-1885581249',id=120,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGm3YXyDls3m8lgQsMp7i5z2Ji2kt+QoAKyNgN4cUeQAncl8sITzJAcvU8MP7QXpcIT5PJILYBp9zVzJhusCSqycT+8/Be6bl9GRyoq123x5/AtCBhaSdlyObjct+Gfsjw==',key_name='tempest-keypair-99379181',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:53:22Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='b1e83af992c94112a965575784639d77',ramdisk_id='',reservation_id='r-8azvvs4u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeShelveTestJSON-896995479',owner_user_name='tempest-AttachVolumeShelveTestJSON-896995479-project-member',shelved_at='2026-01-20T14:53:52.670018',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='6e97db8b-a462-4791-9edf-594ed0f547e6'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:53:48Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='34eb73f628994c11801d447148d5f142',uuid=49919d3f-fab0-404f-a0a0-82610973a254,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "54d190c9-33af-46a2-a141-ff83769d93c6", "address": "fa:16:3e:1f:7c:1d", "network": {"id": "e9589011-b728-4b79-9945-aa6c52dd0fc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1143668360-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1e83af992c94112a965575784639d77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54d190c9-33", "ovs_interfaceid": "54d190c9-33af-46a2-a141-ff83769d93c6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:53:56 np0005588920 nova_compute[226886]: 2026-01-20 14:53:56.339 226890 DEBUG nova.network.os_vif_util [None req-46eeafb9-3aa7-4580-a8e5-9c0a235f1310 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Converting VIF {"id": "54d190c9-33af-46a2-a141-ff83769d93c6", "address": "fa:16:3e:1f:7c:1d", "network": {"id": "e9589011-b728-4b79-9945-aa6c52dd0fc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1143668360-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1e83af992c94112a965575784639d77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54d190c9-33", "ovs_interfaceid": "54d190c9-33af-46a2-a141-ff83769d93c6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:53:56 np0005588920 nova_compute[226886]: 2026-01-20 14:53:56.340 226890 DEBUG nova.network.os_vif_util [None req-46eeafb9-3aa7-4580-a8e5-9c0a235f1310 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1f:7c:1d,bridge_name='br-int',has_traffic_filtering=True,id=54d190c9-33af-46a2-a141-ff83769d93c6,network=Network(e9589011-b728-4b79-9945-aa6c52dd0fc2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap54d190c9-33') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:53:56 np0005588920 nova_compute[226886]: 2026-01-20 14:53:56.341 226890 DEBUG os_vif [None req-46eeafb9-3aa7-4580-a8e5-9c0a235f1310 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1f:7c:1d,bridge_name='br-int',has_traffic_filtering=True,id=54d190c9-33af-46a2-a141-ff83769d93c6,network=Network(e9589011-b728-4b79-9945-aa6c52dd0fc2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap54d190c9-33') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:53:56 np0005588920 nova_compute[226886]: 2026-01-20 14:53:56.345 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:56 np0005588920 nova_compute[226886]: 2026-01-20 14:53:56.346 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap54d190c9-33, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:53:56 np0005588920 nova_compute[226886]: 2026-01-20 14:53:56.349 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:56 np0005588920 nova_compute[226886]: 2026-01-20 14:53:56.353 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:53:56 np0005588920 nova_compute[226886]: 2026-01-20 14:53:56.355 226890 INFO os_vif [None req-46eeafb9-3aa7-4580-a8e5-9c0a235f1310 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1f:7c:1d,bridge_name='br-int',has_traffic_filtering=True,id=54d190c9-33af-46a2-a141-ff83769d93c6,network=Network(e9589011-b728-4b79-9945-aa6c52dd0fc2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap54d190c9-33')#033[00m
Jan 20 09:53:56 np0005588920 nova_compute[226886]: 2026-01-20 14:53:56.756 226890 INFO nova.virt.libvirt.driver [None req-46eeafb9-3aa7-4580-a8e5-9c0a235f1310 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Deleting instance files /var/lib/nova/instances/49919d3f-fab0-404f-a0a0-82610973a254_del#033[00m
Jan 20 09:53:56 np0005588920 nova_compute[226886]: 2026-01-20 14:53:56.758 226890 INFO nova.virt.libvirt.driver [None req-46eeafb9-3aa7-4580-a8e5-9c0a235f1310 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Deletion of /var/lib/nova/instances/49919d3f-fab0-404f-a0a0-82610973a254_del complete#033[00m
Jan 20 09:53:56 np0005588920 nova_compute[226886]: 2026-01-20 14:53:56.850 226890 INFO nova.scheduler.client.report [None req-46eeafb9-3aa7-4580-a8e5-9c0a235f1310 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Deleted allocations for instance 49919d3f-fab0-404f-a0a0-82610973a254#033[00m
Jan 20 09:53:56 np0005588920 nova_compute[226886]: 2026-01-20 14:53:56.906 226890 DEBUG oslo_concurrency.lockutils [None req-46eeafb9-3aa7-4580-a8e5-9c0a235f1310 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:53:56 np0005588920 nova_compute[226886]: 2026-01-20 14:53:56.906 226890 DEBUG oslo_concurrency.lockutils [None req-46eeafb9-3aa7-4580-a8e5-9c0a235f1310 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:53:56 np0005588920 nova_compute[226886]: 2026-01-20 14:53:56.990 226890 DEBUG oslo_concurrency.processutils [None req-46eeafb9-3aa7-4580-a8e5-9c0a235f1310 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:53:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:57.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:53:57 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/433259130' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:53:57 np0005588920 nova_compute[226886]: 2026-01-20 14:53:57.435 226890 DEBUG oslo_concurrency.processutils [None req-46eeafb9-3aa7-4580-a8e5-9c0a235f1310 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:53:57 np0005588920 nova_compute[226886]: 2026-01-20 14:53:57.444 226890 DEBUG nova.compute.provider_tree [None req-46eeafb9-3aa7-4580-a8e5-9c0a235f1310 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:53:57 np0005588920 nova_compute[226886]: 2026-01-20 14:53:57.465 226890 DEBUG nova.scheduler.client.report [None req-46eeafb9-3aa7-4580-a8e5-9c0a235f1310 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:53:57 np0005588920 nova_compute[226886]: 2026-01-20 14:53:57.488 226890 DEBUG oslo_concurrency.lockutils [None req-46eeafb9-3aa7-4580-a8e5-9c0a235f1310 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.582s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:53:57 np0005588920 nova_compute[226886]: 2026-01-20 14:53:57.571 226890 DEBUG oslo_concurrency.lockutils [None req-46eeafb9-3aa7-4580-a8e5-9c0a235f1310 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "49919d3f-fab0-404f-a0a0-82610973a254" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 12.694s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:53:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e289 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:53:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:53:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:58.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:53:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:53:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:53:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:59.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:53:59 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e290 e290: 3 total, 3 up, 3 in
Jan 20 09:53:59 np0005588920 podman[270683]: 2026-01-20 14:53:59.996304628 +0000 UTC m=+0.085259243 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:54:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:00.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:00 np0005588920 nova_compute[226886]: 2026-01-20 14:54:00.998 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:54:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:01.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:54:01 np0005588920 nova_compute[226886]: 2026-01-20 14:54:01.348 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:01 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e291 e291: 3 total, 3 up, 3 in
Jan 20 09:54:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:02.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:02 np0005588920 nova_compute[226886]: 2026-01-20 14:54:02.379 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768920827.3780966, 49919d3f-fab0-404f-a0a0-82610973a254 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:54:02 np0005588920 nova_compute[226886]: 2026-01-20 14:54:02.380 226890 INFO nova.compute.manager [-] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:54:02 np0005588920 nova_compute[226886]: 2026-01-20 14:54:02.415 226890 DEBUG nova.compute.manager [None req-fce8f790-c2d8-4404-a1c1-85dc4c6c4d0d - - - - - -] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:54:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e291 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:54:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:03.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:03 np0005588920 nova_compute[226886]: 2026-01-20 14:54:03.498 226890 DEBUG oslo_concurrency.lockutils [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Acquiring lock "49919d3f-fab0-404f-a0a0-82610973a254" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:03 np0005588920 nova_compute[226886]: 2026-01-20 14:54:03.499 226890 DEBUG oslo_concurrency.lockutils [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "49919d3f-fab0-404f-a0a0-82610973a254" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:03 np0005588920 nova_compute[226886]: 2026-01-20 14:54:03.499 226890 INFO nova.compute.manager [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Unshelving#033[00m
Jan 20 09:54:03 np0005588920 nova_compute[226886]: 2026-01-20 14:54:03.588 226890 INFO nova.virt.block_device [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Booting with volume 763a536d-0897-462a-8acb-fbf9e84e31ea at /dev/vdc#033[00m
Jan 20 09:54:03 np0005588920 nova_compute[226886]: 2026-01-20 14:54:03.726 226890 DEBUG os_brick.utils [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 20 09:54:03 np0005588920 nova_compute[226886]: 2026-01-20 14:54:03.728 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:54:03 np0005588920 nova_compute[226886]: 2026-01-20 14:54:03.741 231437 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.014s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:54:03 np0005588920 nova_compute[226886]: 2026-01-20 14:54:03.742 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[1e7f3896-e96e-4dd6-b1cd-dce94e85dcca]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:03 np0005588920 nova_compute[226886]: 2026-01-20 14:54:03.744 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:54:03 np0005588920 nova_compute[226886]: 2026-01-20 14:54:03.753 231437 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:54:03 np0005588920 nova_compute[226886]: 2026-01-20 14:54:03.754 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[28dd086d-c103-42ed-9536-713aaa5d8964]: (4, ('InitiatorName=iqn.1994-05.com.redhat:f7e7b2e5d28e', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:03 np0005588920 nova_compute[226886]: 2026-01-20 14:54:03.757 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:54:03 np0005588920 nova_compute[226886]: 2026-01-20 14:54:03.765 231437 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:54:03 np0005588920 nova_compute[226886]: 2026-01-20 14:54:03.765 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[e46b9f8d-a568-47f0-aee9-a6a35230b12d]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:03 np0005588920 nova_compute[226886]: 2026-01-20 14:54:03.767 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[b9dfd810-4ef5-4465-96bf-e3847eaaa1cf]: (4, '1190ec40-2b89-4358-8dec-733c5829fbed') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:03 np0005588920 nova_compute[226886]: 2026-01-20 14:54:03.768 226890 DEBUG oslo_concurrency.processutils [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:54:03 np0005588920 nova_compute[226886]: 2026-01-20 14:54:03.796 226890 DEBUG oslo_concurrency.processutils [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] CMD "nvme version" returned: 0 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:54:03 np0005588920 nova_compute[226886]: 2026-01-20 14:54:03.798 226890 DEBUG os_brick.initiator.connectors.lightos [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 20 09:54:03 np0005588920 nova_compute[226886]: 2026-01-20 14:54:03.798 226890 DEBUG os_brick.initiator.connectors.lightos [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 20 09:54:03 np0005588920 nova_compute[226886]: 2026-01-20 14:54:03.799 226890 DEBUG os_brick.initiator.connectors.lightos [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 20 09:54:03 np0005588920 nova_compute[226886]: 2026-01-20 14:54:03.799 226890 DEBUG os_brick.utils [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] <== get_connector_properties: return (72ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:f7e7b2e5d28e', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '1190ec40-2b89-4358-8dec-733c5829fbed', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 20 09:54:03 np0005588920 nova_compute[226886]: 2026-01-20 14:54:03.799 226890 DEBUG nova.virt.block_device [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Updating existing volume attachment record: 45e245a3-2c83-4138-8fd9-f41ae224cc33 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 20 09:54:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:04.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:04 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:54:04 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1756112089' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:54:04 np0005588920 nova_compute[226886]: 2026-01-20 14:54:04.762 226890 DEBUG oslo_concurrency.lockutils [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:04 np0005588920 nova_compute[226886]: 2026-01-20 14:54:04.763 226890 DEBUG oslo_concurrency.lockutils [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:04 np0005588920 nova_compute[226886]: 2026-01-20 14:54:04.766 226890 DEBUG nova.objects.instance [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lazy-loading 'pci_requests' on Instance uuid 49919d3f-fab0-404f-a0a0-82610973a254 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:54:04 np0005588920 nova_compute[226886]: 2026-01-20 14:54:04.788 226890 DEBUG nova.objects.instance [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lazy-loading 'numa_topology' on Instance uuid 49919d3f-fab0-404f-a0a0-82610973a254 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:54:04 np0005588920 nova_compute[226886]: 2026-01-20 14:54:04.798 226890 DEBUG nova.virt.hardware [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:54:04 np0005588920 nova_compute[226886]: 2026-01-20 14:54:04.799 226890 INFO nova.compute.claims [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 09:54:04 np0005588920 nova_compute[226886]: 2026-01-20 14:54:04.961 226890 DEBUG oslo_concurrency.processutils [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:54:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:05.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:05 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:54:05 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3370297134' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:54:05 np0005588920 nova_compute[226886]: 2026-01-20 14:54:05.391 226890 DEBUG oslo_concurrency.processutils [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.430s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:54:05 np0005588920 nova_compute[226886]: 2026-01-20 14:54:05.398 226890 DEBUG nova.compute.provider_tree [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:54:05 np0005588920 nova_compute[226886]: 2026-01-20 14:54:05.421 226890 DEBUG nova.scheduler.client.report [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:54:05 np0005588920 nova_compute[226886]: 2026-01-20 14:54:05.449 226890 DEBUG oslo_concurrency.lockutils [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.686s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:05 np0005588920 nova_compute[226886]: 2026-01-20 14:54:05.653 226890 INFO nova.network.neutron [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Updating port 54d190c9-33af-46a2-a141-ff83769d93c6 with attributes {'binding:host_id': 'compute-2.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Jan 20 09:54:05 np0005588920 ovn_controller[133971]: 2026-01-20T14:54:05Z|00575|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 20 09:54:06 np0005588920 nova_compute[226886]: 2026-01-20 14:54:06.001 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:06.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:06 np0005588920 nova_compute[226886]: 2026-01-20 14:54:06.350 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:06 np0005588920 nova_compute[226886]: 2026-01-20 14:54:06.901 226890 DEBUG oslo_concurrency.lockutils [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Acquiring lock "refresh_cache-49919d3f-fab0-404f-a0a0-82610973a254" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:54:06 np0005588920 nova_compute[226886]: 2026-01-20 14:54:06.902 226890 DEBUG oslo_concurrency.lockutils [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Acquired lock "refresh_cache-49919d3f-fab0-404f-a0a0-82610973a254" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:54:06 np0005588920 nova_compute[226886]: 2026-01-20 14:54:06.902 226890 DEBUG nova.network.neutron [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:54:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:07.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e291 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:54:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e292 e292: 3 total, 3 up, 3 in
Jan 20 09:54:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:08.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:08 np0005588920 nova_compute[226886]: 2026-01-20 14:54:08.231 226890 DEBUG nova.network.neutron [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Updating instance_info_cache with network_info: [{"id": "54d190c9-33af-46a2-a141-ff83769d93c6", "address": "fa:16:3e:1f:7c:1d", "network": {"id": "e9589011-b728-4b79-9945-aa6c52dd0fc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1143668360-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1e83af992c94112a965575784639d77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54d190c9-33", "ovs_interfaceid": "54d190c9-33af-46a2-a141-ff83769d93c6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:54:08 np0005588920 nova_compute[226886]: 2026-01-20 14:54:08.246 226890 DEBUG oslo_concurrency.lockutils [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Releasing lock "refresh_cache-49919d3f-fab0-404f-a0a0-82610973a254" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:54:08 np0005588920 nova_compute[226886]: 2026-01-20 14:54:08.248 226890 DEBUG nova.virt.libvirt.driver [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:54:08 np0005588920 nova_compute[226886]: 2026-01-20 14:54:08.249 226890 INFO nova.virt.libvirt.driver [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Creating image(s)#033[00m
Jan 20 09:54:08 np0005588920 nova_compute[226886]: 2026-01-20 14:54:08.290 226890 DEBUG nova.storage.rbd_utils [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] rbd image 49919d3f-fab0-404f-a0a0-82610973a254_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:54:08 np0005588920 nova_compute[226886]: 2026-01-20 14:54:08.295 226890 DEBUG nova.objects.instance [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 49919d3f-fab0-404f-a0a0-82610973a254 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:54:08 np0005588920 nova_compute[226886]: 2026-01-20 14:54:08.336 226890 DEBUG nova.storage.rbd_utils [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] rbd image 49919d3f-fab0-404f-a0a0-82610973a254_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:54:08 np0005588920 nova_compute[226886]: 2026-01-20 14:54:08.363 226890 DEBUG nova.storage.rbd_utils [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] rbd image 49919d3f-fab0-404f-a0a0-82610973a254_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:54:08 np0005588920 nova_compute[226886]: 2026-01-20 14:54:08.366 226890 DEBUG oslo_concurrency.lockutils [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Acquiring lock "59d720980a49a885262e73c123c16bd0a36e933d" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:08 np0005588920 nova_compute[226886]: 2026-01-20 14:54:08.367 226890 DEBUG oslo_concurrency.lockutils [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "59d720980a49a885262e73c123c16bd0a36e933d" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:08 np0005588920 nova_compute[226886]: 2026-01-20 14:54:08.557 226890 DEBUG nova.compute.manager [req-7ab0d581-82a7-4f0f-9677-5b9810449324 req-26e6a5b2-fbc5-4b92-a389-df9129a74848 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Received event network-changed-54d190c9-33af-46a2-a141-ff83769d93c6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:54:08 np0005588920 nova_compute[226886]: 2026-01-20 14:54:08.557 226890 DEBUG nova.compute.manager [req-7ab0d581-82a7-4f0f-9677-5b9810449324 req-26e6a5b2-fbc5-4b92-a389-df9129a74848 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Refreshing instance network info cache due to event network-changed-54d190c9-33af-46a2-a141-ff83769d93c6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:54:08 np0005588920 nova_compute[226886]: 2026-01-20 14:54:08.557 226890 DEBUG oslo_concurrency.lockutils [req-7ab0d581-82a7-4f0f-9677-5b9810449324 req-26e6a5b2-fbc5-4b92-a389-df9129a74848 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-49919d3f-fab0-404f-a0a0-82610973a254" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:54:08 np0005588920 nova_compute[226886]: 2026-01-20 14:54:08.557 226890 DEBUG oslo_concurrency.lockutils [req-7ab0d581-82a7-4f0f-9677-5b9810449324 req-26e6a5b2-fbc5-4b92-a389-df9129a74848 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-49919d3f-fab0-404f-a0a0-82610973a254" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:54:08 np0005588920 nova_compute[226886]: 2026-01-20 14:54:08.558 226890 DEBUG nova.network.neutron [req-7ab0d581-82a7-4f0f-9677-5b9810449324 req-26e6a5b2-fbc5-4b92-a389-df9129a74848 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Refreshing network info cache for port 54d190c9-33af-46a2-a141-ff83769d93c6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:54:08 np0005588920 nova_compute[226886]: 2026-01-20 14:54:08.584 226890 DEBUG nova.virt.libvirt.imagebackend [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Image locations are: [{'url': 'rbd://e399cf45-e6b6-5393-99f1-75c601d3f188/images/6e97db8b-a462-4791-9edf-594ed0f547e6/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://e399cf45-e6b6-5393-99f1-75c601d3f188/images/6e97db8b-a462-4791-9edf-594ed0f547e6/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Jan 20 09:54:08 np0005588920 nova_compute[226886]: 2026-01-20 14:54:08.638 226890 DEBUG nova.virt.libvirt.imagebackend [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Selected location: {'url': 'rbd://e399cf45-e6b6-5393-99f1-75c601d3f188/images/6e97db8b-a462-4791-9edf-594ed0f547e6/snap', 'metadata': {'store': 'default_backend'}} clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1094#033[00m
Jan 20 09:54:08 np0005588920 nova_compute[226886]: 2026-01-20 14:54:08.640 226890 DEBUG nova.storage.rbd_utils [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] cloning images/6e97db8b-a462-4791-9edf-594ed0f547e6@snap to None/49919d3f-fab0-404f-a0a0-82610973a254_disk clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 20 09:54:08 np0005588920 nova_compute[226886]: 2026-01-20 14:54:08.757 226890 DEBUG oslo_concurrency.lockutils [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "59d720980a49a885262e73c123c16bd0a36e933d" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.390s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:08 np0005588920 nova_compute[226886]: 2026-01-20 14:54:08.896 226890 DEBUG nova.objects.instance [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lazy-loading 'migration_context' on Instance uuid 49919d3f-fab0-404f-a0a0-82610973a254 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:54:08 np0005588920 nova_compute[226886]: 2026-01-20 14:54:08.957 226890 DEBUG nova.storage.rbd_utils [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] flattening vms/49919d3f-fab0-404f-a0a0-82610973a254_disk flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 20 09:54:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:54:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:09.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:54:09 np0005588920 nova_compute[226886]: 2026-01-20 14:54:09.399 226890 DEBUG nova.virt.libvirt.driver [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Image rbd:vms/49919d3f-fab0-404f-a0a0-82610973a254_disk:id=openstack:conf=/etc/ceph/ceph.conf flattened successfully while unshelving instance. _try_fetch_image_cache /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11007#033[00m
Jan 20 09:54:09 np0005588920 nova_compute[226886]: 2026-01-20 14:54:09.399 226890 DEBUG nova.virt.libvirt.driver [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:54:09 np0005588920 nova_compute[226886]: 2026-01-20 14:54:09.400 226890 DEBUG nova.virt.libvirt.driver [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Ensure instance console log exists: /var/lib/nova/instances/49919d3f-fab0-404f-a0a0-82610973a254/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:54:09 np0005588920 nova_compute[226886]: 2026-01-20 14:54:09.400 226890 DEBUG oslo_concurrency.lockutils [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:09 np0005588920 nova_compute[226886]: 2026-01-20 14:54:09.401 226890 DEBUG oslo_concurrency.lockutils [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:09 np0005588920 nova_compute[226886]: 2026-01-20 14:54:09.401 226890 DEBUG oslo_concurrency.lockutils [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:09 np0005588920 nova_compute[226886]: 2026-01-20 14:54:09.404 226890 DEBUG nova.virt.libvirt.driver [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Start _get_guest_xml network_info=[{"id": "54d190c9-33af-46a2-a141-ff83769d93c6", "address": "fa:16:3e:1f:7c:1d", "network": {"id": "e9589011-b728-4b79-9945-aa6c52dd0fc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1143668360-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1e83af992c94112a965575784639d77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54d190c9-33", "ovs_interfaceid": "54d190c9-33af-46a2-a141-ff83769d93c6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vdc': {'bus': 'virtio', 'dev': 'vdc', 'type': 'disk'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='',container_format='bare',created_at=2026-01-20T14:53:44Z,direct_url=<?>,disk_format='raw',id=6e97db8b-a462-4791-9edf-594ed0f547e6,min_disk=1,min_ram=0,name='tempest-AttachVolumeShelveTestJSON-server-1885581249-shelved',owner='b1e83af992c94112a965575784639d77',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2026-01-20T14:53:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [{'device_type': 'disk', 'boot_index': None, 'delete_on_termination': False, 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-763a536d-0897-462a-8acb-fbf9e84e31ea', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '763a536d-0897-462a-8acb-fbf9e84e31ea', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'attached', 'instance': '49919d3f-fab0-404f-a0a0-82610973a254', 'attached_at': '', 'detached_at': '', 'volume_id': '763a536d-0897-462a-8acb-fbf9e84e31ea', 'serial': '763a536d-0897-462a-8acb-fbf9e84e31ea'}, 'mount_device': '/dev/vdc', 'guest_format': None, 'disk_bus': 'virtio', 'attachment_id': '45e245a3-2c83-4138-8fd9-f41ae224cc33', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:54:09 np0005588920 nova_compute[226886]: 2026-01-20 14:54:09.408 226890 WARNING nova.virt.libvirt.driver [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:54:09 np0005588920 nova_compute[226886]: 2026-01-20 14:54:09.412 226890 DEBUG nova.virt.libvirt.host [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:54:09 np0005588920 nova_compute[226886]: 2026-01-20 14:54:09.413 226890 DEBUG nova.virt.libvirt.host [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:54:09 np0005588920 nova_compute[226886]: 2026-01-20 14:54:09.415 226890 DEBUG nova.virt.libvirt.host [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:54:09 np0005588920 nova_compute[226886]: 2026-01-20 14:54:09.415 226890 DEBUG nova.virt.libvirt.host [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:54:09 np0005588920 nova_compute[226886]: 2026-01-20 14:54:09.417 226890 DEBUG nova.virt.libvirt.driver [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:54:09 np0005588920 nova_compute[226886]: 2026-01-20 14:54:09.417 226890 DEBUG nova.virt.hardware [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='',container_format='bare',created_at=2026-01-20T14:53:44Z,direct_url=<?>,disk_format='raw',id=6e97db8b-a462-4791-9edf-594ed0f547e6,min_disk=1,min_ram=0,name='tempest-AttachVolumeShelveTestJSON-server-1885581249-shelved',owner='b1e83af992c94112a965575784639d77',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2026-01-20T14:53:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:54:09 np0005588920 nova_compute[226886]: 2026-01-20 14:54:09.417 226890 DEBUG nova.virt.hardware [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:54:09 np0005588920 nova_compute[226886]: 2026-01-20 14:54:09.418 226890 DEBUG nova.virt.hardware [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:54:09 np0005588920 nova_compute[226886]: 2026-01-20 14:54:09.418 226890 DEBUG nova.virt.hardware [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:54:09 np0005588920 nova_compute[226886]: 2026-01-20 14:54:09.418 226890 DEBUG nova.virt.hardware [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:54:09 np0005588920 nova_compute[226886]: 2026-01-20 14:54:09.418 226890 DEBUG nova.virt.hardware [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:54:09 np0005588920 nova_compute[226886]: 2026-01-20 14:54:09.419 226890 DEBUG nova.virt.hardware [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:54:09 np0005588920 nova_compute[226886]: 2026-01-20 14:54:09.419 226890 DEBUG nova.virt.hardware [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:54:09 np0005588920 nova_compute[226886]: 2026-01-20 14:54:09.419 226890 DEBUG nova.virt.hardware [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:54:09 np0005588920 nova_compute[226886]: 2026-01-20 14:54:09.419 226890 DEBUG nova.virt.hardware [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:54:09 np0005588920 nova_compute[226886]: 2026-01-20 14:54:09.419 226890 DEBUG nova.virt.hardware [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:54:09 np0005588920 nova_compute[226886]: 2026-01-20 14:54:09.420 226890 DEBUG nova.objects.instance [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 49919d3f-fab0-404f-a0a0-82610973a254 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:54:09 np0005588920 nova_compute[226886]: 2026-01-20 14:54:09.433 226890 DEBUG oslo_concurrency.processutils [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:54:09 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #91. Immutable memtables: 0.
Jan 20 09:54:09 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:54:09.740171) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 09:54:09 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 55] Flushing memtable with next log file: 91
Jan 20 09:54:09 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920849740319, "job": 55, "event": "flush_started", "num_memtables": 1, "num_entries": 2156, "num_deletes": 265, "total_data_size": 4584616, "memory_usage": 4652144, "flush_reason": "Manual Compaction"}
Jan 20 09:54:09 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 55] Level-0 flush table #92: started
Jan 20 09:54:09 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920849771768, "cf_name": "default", "job": 55, "event": "table_file_creation", "file_number": 92, "file_size": 2997055, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 46998, "largest_seqno": 49149, "table_properties": {"data_size": 2987998, "index_size": 5615, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2373, "raw_key_size": 19297, "raw_average_key_size": 20, "raw_value_size": 2969678, "raw_average_value_size": 3203, "num_data_blocks": 241, "num_entries": 927, "num_filter_entries": 927, "num_deletions": 265, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768920711, "oldest_key_time": 1768920711, "file_creation_time": 1768920849, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 92, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:54:09 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 55] Flush lasted 31622 microseconds, and 9599 cpu microseconds.
Jan 20 09:54:09 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:54:09 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:54:09.771829) [db/flush_job.cc:967] [default] [JOB 55] Level-0 flush table #92: 2997055 bytes OK
Jan 20 09:54:09 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:54:09.771855) [db/memtable_list.cc:519] [default] Level-0 commit table #92 started
Jan 20 09:54:09 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:54:09.773383) [db/memtable_list.cc:722] [default] Level-0 commit table #92: memtable #1 done
Jan 20 09:54:09 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:54:09.773398) EVENT_LOG_v1 {"time_micros": 1768920849773393, "job": 55, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 09:54:09 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:54:09.773418) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 09:54:09 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 55] Try to delete WAL files size 4574827, prev total WAL file size 4574827, number of live WAL files 2.
Jan 20 09:54:09 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000088.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:54:09 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:54:09.774722) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0031353133' seq:72057594037927935, type:22 .. '6C6F676D0031373635' seq:0, type:0; will stop at (end)
Jan 20 09:54:09 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 56] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 09:54:09 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 55 Base level 0, inputs: [92(2926KB)], [90(10MB)]
Jan 20 09:54:09 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920849774768, "job": 56, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [92], "files_L6": [90], "score": -1, "input_data_size": 13791929, "oldest_snapshot_seqno": -1}
Jan 20 09:54:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:54:09 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/912464231' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:54:09 np0005588920 nova_compute[226886]: 2026-01-20 14:54:09.893 226890 DEBUG oslo_concurrency.processutils [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:54:09 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 56] Generated table #93: 7595 keys, 13632833 bytes, temperature: kUnknown
Jan 20 09:54:09 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920849929799, "cf_name": "default", "job": 56, "event": "table_file_creation", "file_number": 93, "file_size": 13632833, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13578966, "index_size": 33787, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 19013, "raw_key_size": 195215, "raw_average_key_size": 25, "raw_value_size": 13440287, "raw_average_value_size": 1769, "num_data_blocks": 1347, "num_entries": 7595, "num_filter_entries": 7595, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768920849, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 93, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:54:09 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:54:09 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:54:09.930095) [db/compaction/compaction_job.cc:1663] [default] [JOB 56] Compacted 1@0 + 1@6 files to L6 => 13632833 bytes
Jan 20 09:54:09 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:54:09.931479) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 88.9 rd, 87.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.9, 10.3 +0.0 blob) out(13.0 +0.0 blob), read-write-amplify(9.2) write-amplify(4.5) OK, records in: 8138, records dropped: 543 output_compression: NoCompression
Jan 20 09:54:09 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:54:09.931514) EVENT_LOG_v1 {"time_micros": 1768920849931498, "job": 56, "event": "compaction_finished", "compaction_time_micros": 155120, "compaction_time_cpu_micros": 27800, "output_level": 6, "num_output_files": 1, "total_output_size": 13632833, "num_input_records": 8138, "num_output_records": 7595, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 09:54:09 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000092.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:54:09 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920849932645, "job": 56, "event": "table_file_deletion", "file_number": 92}
Jan 20 09:54:09 np0005588920 nova_compute[226886]: 2026-01-20 14:54:09.932 226890 DEBUG nova.storage.rbd_utils [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] rbd image 49919d3f-fab0-404f-a0a0-82610973a254_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:54:09 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000090.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:54:09 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920849937180, "job": 56, "event": "table_file_deletion", "file_number": 90}
Jan 20 09:54:09 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:54:09.774629) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:54:09 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:54:09.937308) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:54:09 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:54:09.937315) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:54:09 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:54:09.937318) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:54:09 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:54:09.937320) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:54:09 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:54:09.937322) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:54:09 np0005588920 nova_compute[226886]: 2026-01-20 14:54:09.938 226890 DEBUG oslo_concurrency.processutils [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:54:10 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:54:10 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:54:10 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:54:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:10.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:10 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:54:10 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2126179071' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:54:10 np0005588920 nova_compute[226886]: 2026-01-20 14:54:10.386 226890 DEBUG oslo_concurrency.processutils [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:54:10 np0005588920 nova_compute[226886]: 2026-01-20 14:54:10.424 226890 DEBUG nova.virt.libvirt.vif [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-20T14:53:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-1885581249',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-1885581249',id=120,image_ref='6e97db8b-a462-4791-9edf-594ed0f547e6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-keypair-99379181',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:53:22Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='b1e83af992c94112a965575784639d77',ramdisk_id='',reservation_id='r-8azvvs4u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeShelveTestJSON-896995479',owner_user_name='tempest-AttachVolumeShelveTestJSON-896995479-project-member',shelved_at='2026-01-20T14:53:52.670018',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='6e97db8b-a462-4791-9edf-594ed0f547e6'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:54:03Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='34eb73f628994c11801d447148d5f142',uuid=49919d3f-fab0-404f-a0a0-82610973a254,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "54d190c9-33af-46a2-a141-ff83769d93c6", "address": "fa:16:3e:1f:7c:1d", "network": {"id": "e9589011-b728-4b79-9945-aa6c52dd0fc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1143668360-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1e83af992c94112a965575784639d77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54d190c9-33", "ovs_interfaceid": "54d190c9-33af-46a2-a141-ff83769d93c6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:54:10 np0005588920 nova_compute[226886]: 2026-01-20 14:54:10.424 226890 DEBUG nova.network.os_vif_util [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Converting VIF {"id": "54d190c9-33af-46a2-a141-ff83769d93c6", "address": "fa:16:3e:1f:7c:1d", "network": {"id": "e9589011-b728-4b79-9945-aa6c52dd0fc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1143668360-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1e83af992c94112a965575784639d77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54d190c9-33", "ovs_interfaceid": "54d190c9-33af-46a2-a141-ff83769d93c6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:54:10 np0005588920 nova_compute[226886]: 2026-01-20 14:54:10.425 226890 DEBUG nova.network.os_vif_util [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1f:7c:1d,bridge_name='br-int',has_traffic_filtering=True,id=54d190c9-33af-46a2-a141-ff83769d93c6,network=Network(e9589011-b728-4b79-9945-aa6c52dd0fc2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap54d190c9-33') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:54:10 np0005588920 nova_compute[226886]: 2026-01-20 14:54:10.426 226890 DEBUG nova.objects.instance [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lazy-loading 'pci_devices' on Instance uuid 49919d3f-fab0-404f-a0a0-82610973a254 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:54:10 np0005588920 nova_compute[226886]: 2026-01-20 14:54:10.450 226890 DEBUG nova.virt.libvirt.driver [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:54:10 np0005588920 nova_compute[226886]:  <uuid>49919d3f-fab0-404f-a0a0-82610973a254</uuid>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:  <name>instance-00000078</name>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:54:10 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:      <nova:name>tempest-AttachVolumeShelveTestJSON-server-1885581249</nova:name>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:54:09</nova:creationTime>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:54:10 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:        <nova:user uuid="34eb73f628994c11801d447148d5f142">tempest-AttachVolumeShelveTestJSON-896995479-project-member</nova:user>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:        <nova:project uuid="b1e83af992c94112a965575784639d77">tempest-AttachVolumeShelveTestJSON-896995479</nova:project>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="6e97db8b-a462-4791-9edf-594ed0f547e6"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:        <nova:port uuid="54d190c9-33af-46a2-a141-ff83769d93c6">
Jan 20 09:54:10 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:      <entry name="serial">49919d3f-fab0-404f-a0a0-82610973a254</entry>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:      <entry name="uuid">49919d3f-fab0-404f-a0a0-82610973a254</entry>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:54:10 np0005588920 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:54:10 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/49919d3f-fab0-404f-a0a0-82610973a254_disk">
Jan 20 09:54:10 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:54:10 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:54:10 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/49919d3f-fab0-404f-a0a0-82610973a254_disk.config">
Jan 20 09:54:10 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:54:10 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:54:10 np0005588920 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 20 09:54:10 np0005588920 nova_compute[226886]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="volumes/volume-763a536d-0897-462a-8acb-fbf9e84e31ea">
Jan 20 09:54:10 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:54:10 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:      <target dev="vdc" bus="virtio"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:      <serial>763a536d-0897-462a-8acb-fbf9e84e31ea</serial>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:54:10 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:1f:7c:1d"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:      <target dev="tap54d190c9-33"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:54:10 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/49919d3f-fab0-404f-a0a0-82610973a254/console.log" append="off"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    <input type="keyboard" bus="usb"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:54:10 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:54:10 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:54:10 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:54:10 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:54:10 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:54:10 np0005588920 nova_compute[226886]: 2026-01-20 14:54:10.451 226890 DEBUG nova.compute.manager [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Preparing to wait for external event network-vif-plugged-54d190c9-33af-46a2-a141-ff83769d93c6 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:54:10 np0005588920 nova_compute[226886]: 2026-01-20 14:54:10.451 226890 DEBUG oslo_concurrency.lockutils [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Acquiring lock "49919d3f-fab0-404f-a0a0-82610973a254-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:10 np0005588920 nova_compute[226886]: 2026-01-20 14:54:10.452 226890 DEBUG oslo_concurrency.lockutils [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "49919d3f-fab0-404f-a0a0-82610973a254-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:10 np0005588920 nova_compute[226886]: 2026-01-20 14:54:10.452 226890 DEBUG oslo_concurrency.lockutils [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "49919d3f-fab0-404f-a0a0-82610973a254-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:10 np0005588920 nova_compute[226886]: 2026-01-20 14:54:10.453 226890 DEBUG nova.virt.libvirt.vif [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-20T14:53:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-1885581249',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-1885581249',id=120,image_ref='6e97db8b-a462-4791-9edf-594ed0f547e6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-keypair-99379181',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:53:22Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='b1e83af992c94112a965575784639d77',ramdisk_id='',reservation_id='r-8azvvs4u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeShelveTestJSON-896995479',owner_user_name='tempest-AttachVolumeShelveTestJSON-896995479-project-member',shelved_at='2026-01-20T14:53:52.670018',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='6e97db8b-a462-4791-9edf-594ed0f547e6'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:54:03Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='34eb73f628994c11801d447148d5f142',uuid=49919d3f-fab0-404f-a0a0-82610973a254,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "54d190c9-33af-46a2-a141-ff83769d93c6", "address": "fa:16:3e:1f:7c:1d", "network": {"id": "e9589011-b728-4b79-9945-aa6c52dd0fc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1143668360-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1e83af992c94112a965575784639d77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54d190c9-33", "ovs_interfaceid": "54d190c9-33af-46a2-a141-ff83769d93c6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:54:10 np0005588920 nova_compute[226886]: 2026-01-20 14:54:10.454 226890 DEBUG nova.network.os_vif_util [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Converting VIF {"id": "54d190c9-33af-46a2-a141-ff83769d93c6", "address": "fa:16:3e:1f:7c:1d", "network": {"id": "e9589011-b728-4b79-9945-aa6c52dd0fc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1143668360-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1e83af992c94112a965575784639d77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54d190c9-33", "ovs_interfaceid": "54d190c9-33af-46a2-a141-ff83769d93c6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:54:10 np0005588920 nova_compute[226886]: 2026-01-20 14:54:10.455 226890 DEBUG nova.network.os_vif_util [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1f:7c:1d,bridge_name='br-int',has_traffic_filtering=True,id=54d190c9-33af-46a2-a141-ff83769d93c6,network=Network(e9589011-b728-4b79-9945-aa6c52dd0fc2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap54d190c9-33') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:54:10 np0005588920 nova_compute[226886]: 2026-01-20 14:54:10.456 226890 DEBUG os_vif [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1f:7c:1d,bridge_name='br-int',has_traffic_filtering=True,id=54d190c9-33af-46a2-a141-ff83769d93c6,network=Network(e9589011-b728-4b79-9945-aa6c52dd0fc2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap54d190c9-33') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:54:10 np0005588920 nova_compute[226886]: 2026-01-20 14:54:10.457 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:10 np0005588920 nova_compute[226886]: 2026-01-20 14:54:10.458 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:54:10 np0005588920 nova_compute[226886]: 2026-01-20 14:54:10.458 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:54:10 np0005588920 nova_compute[226886]: 2026-01-20 14:54:10.461 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:10 np0005588920 nova_compute[226886]: 2026-01-20 14:54:10.461 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap54d190c9-33, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:54:10 np0005588920 nova_compute[226886]: 2026-01-20 14:54:10.462 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap54d190c9-33, col_values=(('external_ids', {'iface-id': '54d190c9-33af-46a2-a141-ff83769d93c6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1f:7c:1d', 'vm-uuid': '49919d3f-fab0-404f-a0a0-82610973a254'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:54:10 np0005588920 NetworkManager[49076]: <info>  [1768920850.4649] manager: (tap54d190c9-33): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/288)
Jan 20 09:54:10 np0005588920 nova_compute[226886]: 2026-01-20 14:54:10.466 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 20 09:54:10 np0005588920 nova_compute[226886]: 2026-01-20 14:54:10.472 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:54:10 np0005588920 nova_compute[226886]: 2026-01-20 14:54:10.473 226890 INFO os_vif [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1f:7c:1d,bridge_name='br-int',has_traffic_filtering=True,id=54d190c9-33af-46a2-a141-ff83769d93c6,network=Network(e9589011-b728-4b79-9945-aa6c52dd0fc2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap54d190c9-33')
Jan 20 09:54:10 np0005588920 nova_compute[226886]: 2026-01-20 14:54:10.554 226890 DEBUG nova.virt.libvirt.driver [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 20 09:54:10 np0005588920 nova_compute[226886]: 2026-01-20 14:54:10.555 226890 DEBUG nova.virt.libvirt.driver [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] No BDM found with device name vdc, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 20 09:54:10 np0005588920 nova_compute[226886]: 2026-01-20 14:54:10.555 226890 DEBUG nova.virt.libvirt.driver [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 20 09:54:10 np0005588920 nova_compute[226886]: 2026-01-20 14:54:10.555 226890 DEBUG nova.virt.libvirt.driver [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] No VIF found with MAC fa:16:3e:1f:7c:1d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 20 09:54:10 np0005588920 nova_compute[226886]: 2026-01-20 14:54:10.556 226890 INFO nova.virt.libvirt.driver [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Using config drive
Jan 20 09:54:10 np0005588920 nova_compute[226886]: 2026-01-20 14:54:10.590 226890 DEBUG nova.storage.rbd_utils [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] rbd image 49919d3f-fab0-404f-a0a0-82610973a254_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:54:10 np0005588920 nova_compute[226886]: 2026-01-20 14:54:10.614 226890 DEBUG nova.objects.instance [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 49919d3f-fab0-404f-a0a0-82610973a254 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 09:54:10 np0005588920 nova_compute[226886]: 2026-01-20 14:54:10.645 226890 DEBUG nova.objects.instance [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lazy-loading 'keypairs' on Instance uuid 49919d3f-fab0-404f-a0a0-82610973a254 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 09:54:11 np0005588920 nova_compute[226886]: 2026-01-20 14:54:11.003 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:54:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:54:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:11.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:54:12 np0005588920 nova_compute[226886]: 2026-01-20 14:54:12.027 226890 INFO nova.virt.libvirt.driver [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Creating config drive at /var/lib/nova/instances/49919d3f-fab0-404f-a0a0-82610973a254/disk.config
Jan 20 09:54:12 np0005588920 nova_compute[226886]: 2026-01-20 14:54:12.033 226890 DEBUG oslo_concurrency.processutils [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/49919d3f-fab0-404f-a0a0-82610973a254/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpumoz7q4g execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:54:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:12.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:12 np0005588920 nova_compute[226886]: 2026-01-20 14:54:12.165 226890 DEBUG oslo_concurrency.processutils [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/49919d3f-fab0-404f-a0a0-82610973a254/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpumoz7q4g" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:54:12 np0005588920 nova_compute[226886]: 2026-01-20 14:54:12.194 226890 DEBUG nova.storage.rbd_utils [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] rbd image 49919d3f-fab0-404f-a0a0-82610973a254_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:54:12 np0005588920 nova_compute[226886]: 2026-01-20 14:54:12.197 226890 DEBUG oslo_concurrency.processutils [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/49919d3f-fab0-404f-a0a0-82610973a254/disk.config 49919d3f-fab0-404f-a0a0-82610973a254_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:54:12 np0005588920 nova_compute[226886]: 2026-01-20 14:54:12.365 226890 DEBUG oslo_concurrency.processutils [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/49919d3f-fab0-404f-a0a0-82610973a254/disk.config 49919d3f-fab0-404f-a0a0-82610973a254_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.167s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:54:12 np0005588920 nova_compute[226886]: 2026-01-20 14:54:12.366 226890 INFO nova.virt.libvirt.driver [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Deleting local config drive /var/lib/nova/instances/49919d3f-fab0-404f-a0a0-82610973a254/disk.config because it was imported into RBD.
Jan 20 09:54:12 np0005588920 kernel: tap54d190c9-33: entered promiscuous mode
Jan 20 09:54:12 np0005588920 NetworkManager[49076]: <info>  [1768920852.4098] manager: (tap54d190c9-33): new Tun device (/org/freedesktop/NetworkManager/Devices/289)
Jan 20 09:54:12 np0005588920 nova_compute[226886]: 2026-01-20 14:54:12.411 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:54:12 np0005588920 ovn_controller[133971]: 2026-01-20T14:54:12Z|00576|binding|INFO|Claiming lport 54d190c9-33af-46a2-a141-ff83769d93c6 for this chassis.
Jan 20 09:54:12 np0005588920 ovn_controller[133971]: 2026-01-20T14:54:12Z|00577|binding|INFO|54d190c9-33af-46a2-a141-ff83769d93c6: Claiming fa:16:3e:1f:7c:1d 10.100.0.8
Jan 20 09:54:12 np0005588920 ovn_controller[133971]: 2026-01-20T14:54:12Z|00578|binding|INFO|Setting lport 54d190c9-33af-46a2-a141-ff83769d93c6 ovn-installed in OVS
Jan 20 09:54:12 np0005588920 nova_compute[226886]: 2026-01-20 14:54:12.428 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:54:12 np0005588920 nova_compute[226886]: 2026-01-20 14:54:12.432 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:54:12 np0005588920 ovn_controller[133971]: 2026-01-20T14:54:12Z|00579|binding|INFO|Setting lport 54d190c9-33af-46a2-a141-ff83769d93c6 up in Southbound
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:12.436 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1f:7c:1d 10.100.0.8'], port_security=['fa:16:3e:1f:7c:1d 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '49919d3f-fab0-404f-a0a0-82610973a254', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e9589011-b728-4b79-9945-aa6c52dd0fc2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b1e83af992c94112a965575784639d77', 'neutron:revision_number': '7', 'neutron:security_group_ids': 'da0e73ee-8414-4d81-a0bf-09363bb8a1b8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.210'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1dc6df7d-3e57-4779-8232-af1ccf413403, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=54d190c9-33af-46a2-a141-ff83769d93c6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:12.437 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 54d190c9-33af-46a2-a141-ff83769d93c6 in datapath e9589011-b728-4b79-9945-aa6c52dd0fc2 bound to our chassis
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:12.439 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e9589011-b728-4b79-9945-aa6c52dd0fc2
Jan 20 09:54:12 np0005588920 systemd-udevd[271210]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:12.450 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b1bfac6d-d35d-4df7-a0c2-7be49897ab4d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:12.453 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape9589011-b1 in ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 20 09:54:12 np0005588920 systemd-machined[196121]: New machine qemu-57-instance-00000078.
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:12.455 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape9589011-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:12.455 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[82d09765-715c-4f52-82b9-0f34ac4fc91c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:54:12 np0005588920 NetworkManager[49076]: <info>  [1768920852.4568] device (tap54d190c9-33): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:54:12 np0005588920 NetworkManager[49076]: <info>  [1768920852.4572] device (tap54d190c9-33): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:12.458 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d9ec1b6f-51a5-46dd-8b85-0c0e8e188fe6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:54:12 np0005588920 systemd[1]: Started Virtual Machine qemu-57-instance-00000078.
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:12.470 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[f28b32cb-00a0-4642-a93b-c00ea492d760]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:12.494 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[8f457d23-b928-4221-bfa5-32aa32a820a2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:12.536 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[53badfe4-eecf-461e-a9cc-0858251ff757]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:12.543 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[067dc8c3-f61a-4c46-afbe-6b3731ac32a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:54:12 np0005588920 NetworkManager[49076]: <info>  [1768920852.5439] manager: (tape9589011-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/290)
Jan 20 09:54:12 np0005588920 nova_compute[226886]: 2026-01-20 14:54:12.572 226890 DEBUG nova.network.neutron [req-7ab0d581-82a7-4f0f-9677-5b9810449324 req-26e6a5b2-fbc5-4b92-a389-df9129a74848 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Updated VIF entry in instance network info cache for port 54d190c9-33af-46a2-a141-ff83769d93c6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:12.571 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[24d585ac-f43c-4c09-959a-f004b2124025]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:54:12 np0005588920 nova_compute[226886]: 2026-01-20 14:54:12.572 226890 DEBUG nova.network.neutron [req-7ab0d581-82a7-4f0f-9677-5b9810449324 req-26e6a5b2-fbc5-4b92-a389-df9129a74848 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Updating instance_info_cache with network_info: [{"id": "54d190c9-33af-46a2-a141-ff83769d93c6", "address": "fa:16:3e:1f:7c:1d", "network": {"id": "e9589011-b728-4b79-9945-aa6c52dd0fc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1143668360-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1e83af992c94112a965575784639d77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54d190c9-33", "ovs_interfaceid": "54d190c9-33af-46a2-a141-ff83769d93c6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:12.574 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[c7d99334-2d9e-49aa-91d3-88561fdf3ed3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:54:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e292 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:54:12 np0005588920 NetworkManager[49076]: <info>  [1768920852.6004] device (tape9589011-b0): carrier: link connected
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:12.605 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[165e4612-bdd0-4d0c-8b1f-fe3adbfe246d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:12.620 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[9b942010-c7b2-4bd3-b3b5-7826716875bf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape9589011-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a0:5a:14'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 186], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 589428, 'reachable_time': 41139, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 271244, 'error': None, 'target': 'ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:54:12 np0005588920 nova_compute[226886]: 2026-01-20 14:54:12.621 226890 DEBUG oslo_concurrency.lockutils [req-7ab0d581-82a7-4f0f-9677-5b9810449324 req-26e6a5b2-fbc5-4b92-a389-df9129a74848 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-49919d3f-fab0-404f-a0a0-82610973a254" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:12.636 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[4f42705d-9912-4b31-b9e9-98e2aabc775a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea0:5a14'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 589428, 'tstamp': 589428}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 271245, 'error': None, 'target': 'ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:12.650 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[9bbeae8c-14f9-4cbb-94ab-0e1cbc336384]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape9589011-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a0:5a:14'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 186], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 589428, 'reachable_time': 41139, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 271246, 'error': None, 'target': 'ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:12.679 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[78cb0e18-15d9-4984-b641-acba602932e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:12.737 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e68db4f4-7d71-4238-a945-c03efff9983e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:12.738 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape9589011-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:12.738 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:12.739 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape9589011-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 09:54:12 np0005588920 kernel: tape9589011-b0: entered promiscuous mode
Jan 20 09:54:12 np0005588920 NetworkManager[49076]: <info>  [1768920852.7414] manager: (tape9589011-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/291)
Jan 20 09:54:12 np0005588920 nova_compute[226886]: 2026-01-20 14:54:12.740 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:12.744 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape9589011-b0, col_values=(('external_ids', {'iface-id': '9ca9d06a-9365-4769-a2c4-7322625683ac'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:54:12 np0005588920 nova_compute[226886]: 2026-01-20 14:54:12.745 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:12 np0005588920 ovn_controller[133971]: 2026-01-20T14:54:12Z|00580|binding|INFO|Releasing lport 9ca9d06a-9365-4769-a2c4-7322625683ac from this chassis (sb_readonly=0)
Jan 20 09:54:12 np0005588920 nova_compute[226886]: 2026-01-20 14:54:12.763 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:12.764 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e9589011-b728-4b79-9945-aa6c52dd0fc2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e9589011-b728-4b79-9945-aa6c52dd0fc2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:12.765 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0d81b4cc-e1de-437f-bba3-ac40d016c960]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:12.765 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-e9589011-b728-4b79-9945-aa6c52dd0fc2
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/e9589011-b728-4b79-9945-aa6c52dd0fc2.pid.haproxy
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID e9589011-b728-4b79-9945-aa6c52dd0fc2
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:54:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:12.766 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2', 'env', 'PROCESS_TAG=haproxy-e9589011-b728-4b79-9945-aa6c52dd0fc2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e9589011-b728-4b79-9945-aa6c52dd0fc2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:54:12 np0005588920 nova_compute[226886]: 2026-01-20 14:54:12.876 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920852.8760517, 49919d3f-fab0-404f-a0a0-82610973a254 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:54:12 np0005588920 nova_compute[226886]: 2026-01-20 14:54:12.876 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] VM Started (Lifecycle Event)#033[00m
Jan 20 09:54:12 np0005588920 nova_compute[226886]: 2026-01-20 14:54:12.897 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:54:12 np0005588920 nova_compute[226886]: 2026-01-20 14:54:12.901 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920852.8762658, 49919d3f-fab0-404f-a0a0-82610973a254 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:54:12 np0005588920 nova_compute[226886]: 2026-01-20 14:54:12.902 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:54:12 np0005588920 nova_compute[226886]: 2026-01-20 14:54:12.918 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:54:12 np0005588920 nova_compute[226886]: 2026-01-20 14:54:12.922 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:54:12 np0005588920 nova_compute[226886]: 2026-01-20 14:54:12.925 226890 DEBUG nova.compute.manager [req-ed7d94a8-9241-4185-93f9-5d23eb7ba27a req-fc28c66a-d6ff-4ce2-99ec-dd3f220a61e4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Received event network-vif-plugged-54d190c9-33af-46a2-a141-ff83769d93c6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:54:12 np0005588920 nova_compute[226886]: 2026-01-20 14:54:12.926 226890 DEBUG oslo_concurrency.lockutils [req-ed7d94a8-9241-4185-93f9-5d23eb7ba27a req-fc28c66a-d6ff-4ce2-99ec-dd3f220a61e4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "49919d3f-fab0-404f-a0a0-82610973a254-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:12 np0005588920 nova_compute[226886]: 2026-01-20 14:54:12.926 226890 DEBUG oslo_concurrency.lockutils [req-ed7d94a8-9241-4185-93f9-5d23eb7ba27a req-fc28c66a-d6ff-4ce2-99ec-dd3f220a61e4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "49919d3f-fab0-404f-a0a0-82610973a254-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:12 np0005588920 nova_compute[226886]: 2026-01-20 14:54:12.926 226890 DEBUG oslo_concurrency.lockutils [req-ed7d94a8-9241-4185-93f9-5d23eb7ba27a req-fc28c66a-d6ff-4ce2-99ec-dd3f220a61e4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "49919d3f-fab0-404f-a0a0-82610973a254-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:12 np0005588920 nova_compute[226886]: 2026-01-20 14:54:12.926 226890 DEBUG nova.compute.manager [req-ed7d94a8-9241-4185-93f9-5d23eb7ba27a req-fc28c66a-d6ff-4ce2-99ec-dd3f220a61e4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Processing event network-vif-plugged-54d190c9-33af-46a2-a141-ff83769d93c6 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:54:12 np0005588920 nova_compute[226886]: 2026-01-20 14:54:12.927 226890 DEBUG nova.compute.manager [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:54:12 np0005588920 nova_compute[226886]: 2026-01-20 14:54:12.931 226890 DEBUG nova.virt.libvirt.driver [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:54:12 np0005588920 nova_compute[226886]: 2026-01-20 14:54:12.934 226890 INFO nova.virt.libvirt.driver [-] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Instance spawned successfully.#033[00m
Jan 20 09:54:12 np0005588920 nova_compute[226886]: 2026-01-20 14:54:12.940 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:54:12 np0005588920 nova_compute[226886]: 2026-01-20 14:54:12.940 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920852.930944, 49919d3f-fab0-404f-a0a0-82610973a254 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:54:12 np0005588920 nova_compute[226886]: 2026-01-20 14:54:12.941 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:54:12 np0005588920 nova_compute[226886]: 2026-01-20 14:54:12.969 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:54:12 np0005588920 nova_compute[226886]: 2026-01-20 14:54:12.972 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:54:12 np0005588920 nova_compute[226886]: 2026-01-20 14:54:12.996 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:54:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:13.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:13 np0005588920 podman[271335]: 2026-01-20 14:54:13.157424027 +0000 UTC m=+0.075323053 container create ac488393e56a96995862284da5e0dccaae43e5634793282b7dc9af4bb0a9fb05 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:54:13 np0005588920 systemd[1]: Started libpod-conmon-ac488393e56a96995862284da5e0dccaae43e5634793282b7dc9af4bb0a9fb05.scope.
Jan 20 09:54:13 np0005588920 podman[271335]: 2026-01-20 14:54:13.11174479 +0000 UTC m=+0.029644196 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:54:13 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:54:13 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ac631e97b2df31c4e6be68edab955c472772e4db3d7e45a234653d6eeb8e3552/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:54:13 np0005588920 podman[271335]: 2026-01-20 14:54:13.234063016 +0000 UTC m=+0.151962062 container init ac488393e56a96995862284da5e0dccaae43e5634793282b7dc9af4bb0a9fb05 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 20 09:54:13 np0005588920 podman[271335]: 2026-01-20 14:54:13.244447228 +0000 UTC m=+0.162346254 container start ac488393e56a96995862284da5e0dccaae43e5634793282b7dc9af4bb0a9fb05 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 20 09:54:13 np0005588920 neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2[271350]: [NOTICE]   (271354) : New worker (271356) forked
Jan 20 09:54:13 np0005588920 neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2[271350]: [NOTICE]   (271354) : Loading success.
Jan 20 09:54:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:14.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e293 e293: 3 total, 3 up, 3 in
Jan 20 09:54:14 np0005588920 nova_compute[226886]: 2026-01-20 14:54:14.703 226890 DEBUG nova.compute.manager [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:54:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e294 e294: 3 total, 3 up, 3 in
Jan 20 09:54:14 np0005588920 nova_compute[226886]: 2026-01-20 14:54:14.793 226890 DEBUG oslo_concurrency.lockutils [None req-3c1a6e31-e3a3-41e7-8344-b9c689ceeb52 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "49919d3f-fab0-404f-a0a0-82610973a254" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 11.294s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:15 np0005588920 nova_compute[226886]: 2026-01-20 14:54:15.044 226890 DEBUG nova.compute.manager [req-79f0d874-5a1f-4a26-a8e0-bf8a00de8fbb req-0c226ffa-1edb-45ac-9a56-c5529525992c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Received event network-vif-plugged-54d190c9-33af-46a2-a141-ff83769d93c6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:54:15 np0005588920 nova_compute[226886]: 2026-01-20 14:54:15.045 226890 DEBUG oslo_concurrency.lockutils [req-79f0d874-5a1f-4a26-a8e0-bf8a00de8fbb req-0c226ffa-1edb-45ac-9a56-c5529525992c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "49919d3f-fab0-404f-a0a0-82610973a254-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:15 np0005588920 nova_compute[226886]: 2026-01-20 14:54:15.045 226890 DEBUG oslo_concurrency.lockutils [req-79f0d874-5a1f-4a26-a8e0-bf8a00de8fbb req-0c226ffa-1edb-45ac-9a56-c5529525992c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "49919d3f-fab0-404f-a0a0-82610973a254-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:15 np0005588920 nova_compute[226886]: 2026-01-20 14:54:15.045 226890 DEBUG oslo_concurrency.lockutils [req-79f0d874-5a1f-4a26-a8e0-bf8a00de8fbb req-0c226ffa-1edb-45ac-9a56-c5529525992c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "49919d3f-fab0-404f-a0a0-82610973a254-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:15 np0005588920 nova_compute[226886]: 2026-01-20 14:54:15.045 226890 DEBUG nova.compute.manager [req-79f0d874-5a1f-4a26-a8e0-bf8a00de8fbb req-0c226ffa-1edb-45ac-9a56-c5529525992c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] No waiting events found dispatching network-vif-plugged-54d190c9-33af-46a2-a141-ff83769d93c6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:54:15 np0005588920 nova_compute[226886]: 2026-01-20 14:54:15.045 226890 WARNING nova.compute.manager [req-79f0d874-5a1f-4a26-a8e0-bf8a00de8fbb req-0c226ffa-1edb-45ac-9a56-c5529525992c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Received unexpected event network-vif-plugged-54d190c9-33af-46a2-a141-ff83769d93c6 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:54:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:54:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:15.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:54:15 np0005588920 nova_compute[226886]: 2026-01-20 14:54:15.465 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:16 np0005588920 nova_compute[226886]: 2026-01-20 14:54:16.004 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:16.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:16.456 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:16.457 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:16.457 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:17.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:17 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:54:17 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:54:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:54:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:54:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:18.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:54:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:19.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:20.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:20 np0005588920 nova_compute[226886]: 2026-01-20 14:54:20.467 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:21 np0005588920 nova_compute[226886]: 2026-01-20 14:54:21.006 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:21 np0005588920 podman[271415]: 2026-01-20 14:54:21.008110282 +0000 UTC m=+0.089092300 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:54:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:21.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:22.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:54:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:23.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:24.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:24 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e295 e295: 3 total, 3 up, 3 in
Jan 20 09:54:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:25.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:25 np0005588920 nova_compute[226886]: 2026-01-20 14:54:25.514 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:25 np0005588920 ovn_controller[133971]: 2026-01-20T14:54:25Z|00083|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1f:7c:1d 10.100.0.8
Jan 20 09:54:26 np0005588920 nova_compute[226886]: 2026-01-20 14:54:26.009 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:54:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:26.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:54:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:27.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e295 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:54:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:28.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:29.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:54:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:30.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:54:30 np0005588920 nova_compute[226886]: 2026-01-20 14:54:30.557 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:31 np0005588920 nova_compute[226886]: 2026-01-20 14:54:31.013 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:31 np0005588920 podman[271441]: 2026-01-20 14:54:31.036026166 +0000 UTC m=+0.098728902 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 20 09:54:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:31.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:32.088 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=39, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=38) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:54:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:32.089 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:54:32 np0005588920 nova_compute[226886]: 2026-01-20 14:54:32.089 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:32.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e295 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:54:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e296 e296: 3 total, 3 up, 3 in
Jan 20 09:54:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:33.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:33 np0005588920 nova_compute[226886]: 2026-01-20 14:54:33.263 226890 DEBUG oslo_concurrency.lockutils [None req-8c3fa36c-dd9b-4db0-b716-76911398958d 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Acquiring lock "49919d3f-fab0-404f-a0a0-82610973a254" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:33 np0005588920 nova_compute[226886]: 2026-01-20 14:54:33.264 226890 DEBUG oslo_concurrency.lockutils [None req-8c3fa36c-dd9b-4db0-b716-76911398958d 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "49919d3f-fab0-404f-a0a0-82610973a254" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:33 np0005588920 nova_compute[226886]: 2026-01-20 14:54:33.282 226890 INFO nova.compute.manager [None req-8c3fa36c-dd9b-4db0-b716-76911398958d 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Detaching volume 763a536d-0897-462a-8acb-fbf9e84e31ea#033[00m
Jan 20 09:54:33 np0005588920 nova_compute[226886]: 2026-01-20 14:54:33.469 226890 INFO nova.virt.block_device [None req-8c3fa36c-dd9b-4db0-b716-76911398958d 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Attempting to driver detach volume 763a536d-0897-462a-8acb-fbf9e84e31ea from mountpoint /dev/vdc#033[00m
Jan 20 09:54:33 np0005588920 nova_compute[226886]: 2026-01-20 14:54:33.481 226890 DEBUG nova.virt.libvirt.driver [None req-8c3fa36c-dd9b-4db0-b716-76911398958d 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Attempting to detach device vdc from instance 49919d3f-fab0-404f-a0a0-82610973a254 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 20 09:54:33 np0005588920 nova_compute[226886]: 2026-01-20 14:54:33.481 226890 DEBUG nova.virt.libvirt.guest [None req-8c3fa36c-dd9b-4db0-b716-76911398958d 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] detach device xml: <disk type="network" device="disk">
Jan 20 09:54:33 np0005588920 nova_compute[226886]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 09:54:33 np0005588920 nova_compute[226886]:  <source protocol="rbd" name="volumes/volume-763a536d-0897-462a-8acb-fbf9e84e31ea">
Jan 20 09:54:33 np0005588920 nova_compute[226886]:    <host name="192.168.122.100" port="6789"/>
Jan 20 09:54:33 np0005588920 nova_compute[226886]:    <host name="192.168.122.102" port="6789"/>
Jan 20 09:54:33 np0005588920 nova_compute[226886]:    <host name="192.168.122.101" port="6789"/>
Jan 20 09:54:33 np0005588920 nova_compute[226886]:  </source>
Jan 20 09:54:33 np0005588920 nova_compute[226886]:  <target dev="vdc" bus="virtio"/>
Jan 20 09:54:33 np0005588920 nova_compute[226886]:  <serial>763a536d-0897-462a-8acb-fbf9e84e31ea</serial>
Jan 20 09:54:33 np0005588920 nova_compute[226886]:  <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 20 09:54:33 np0005588920 nova_compute[226886]: </disk>
Jan 20 09:54:33 np0005588920 nova_compute[226886]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 20 09:54:33 np0005588920 nova_compute[226886]: 2026-01-20 14:54:33.490 226890 INFO nova.virt.libvirt.driver [None req-8c3fa36c-dd9b-4db0-b716-76911398958d 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Successfully detached device vdc from instance 49919d3f-fab0-404f-a0a0-82610973a254 from the persistent domain config.#033[00m
Jan 20 09:54:33 np0005588920 nova_compute[226886]: 2026-01-20 14:54:33.491 226890 DEBUG nova.virt.libvirt.driver [None req-8c3fa36c-dd9b-4db0-b716-76911398958d 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] (1/8): Attempting to detach device vdc with device alias virtio-disk2 from instance 49919d3f-fab0-404f-a0a0-82610973a254 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 20 09:54:33 np0005588920 nova_compute[226886]: 2026-01-20 14:54:33.492 226890 DEBUG nova.virt.libvirt.guest [None req-8c3fa36c-dd9b-4db0-b716-76911398958d 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] detach device xml: <disk type="network" device="disk">
Jan 20 09:54:33 np0005588920 nova_compute[226886]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 09:54:33 np0005588920 nova_compute[226886]:  <source protocol="rbd" name="volumes/volume-763a536d-0897-462a-8acb-fbf9e84e31ea">
Jan 20 09:54:33 np0005588920 nova_compute[226886]:    <host name="192.168.122.100" port="6789"/>
Jan 20 09:54:33 np0005588920 nova_compute[226886]:    <host name="192.168.122.102" port="6789"/>
Jan 20 09:54:33 np0005588920 nova_compute[226886]:    <host name="192.168.122.101" port="6789"/>
Jan 20 09:54:33 np0005588920 nova_compute[226886]:  </source>
Jan 20 09:54:33 np0005588920 nova_compute[226886]:  <target dev="vdc" bus="virtio"/>
Jan 20 09:54:33 np0005588920 nova_compute[226886]:  <serial>763a536d-0897-462a-8acb-fbf9e84e31ea</serial>
Jan 20 09:54:33 np0005588920 nova_compute[226886]:  <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 20 09:54:33 np0005588920 nova_compute[226886]: </disk>
Jan 20 09:54:33 np0005588920 nova_compute[226886]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 20 09:54:33 np0005588920 nova_compute[226886]: 2026-01-20 14:54:33.604 226890 DEBUG nova.virt.libvirt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Received event <DeviceRemovedEvent: 1768920873.6042657, 49919d3f-fab0-404f-a0a0-82610973a254 => virtio-disk2> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 20 09:54:33 np0005588920 nova_compute[226886]: 2026-01-20 14:54:33.606 226890 DEBUG nova.virt.libvirt.driver [None req-8c3fa36c-dd9b-4db0-b716-76911398958d 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Start waiting for the detach event from libvirt for device vdc with device alias virtio-disk2 for instance 49919d3f-fab0-404f-a0a0-82610973a254 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 20 09:54:33 np0005588920 nova_compute[226886]: 2026-01-20 14:54:33.608 226890 INFO nova.virt.libvirt.driver [None req-8c3fa36c-dd9b-4db0-b716-76911398958d 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Successfully detached device vdc from instance 49919d3f-fab0-404f-a0a0-82610973a254 from the live domain config.#033[00m
Jan 20 09:54:33 np0005588920 nova_compute[226886]: 2026-01-20 14:54:33.857 226890 DEBUG nova.objects.instance [None req-8c3fa36c-dd9b-4db0-b716-76911398958d 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lazy-loading 'flavor' on Instance uuid 49919d3f-fab0-404f-a0a0-82610973a254 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:54:33 np0005588920 nova_compute[226886]: 2026-01-20 14:54:33.907 226890 DEBUG oslo_concurrency.lockutils [None req-8c3fa36c-dd9b-4db0-b716-76911398958d 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "49919d3f-fab0-404f-a0a0-82610973a254" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 0.643s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:34.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:34 np0005588920 nova_compute[226886]: 2026-01-20 14:54:34.704 226890 DEBUG oslo_concurrency.lockutils [None req-46ff694f-7006-47a4-872a-8654467e5b0b 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Acquiring lock "49919d3f-fab0-404f-a0a0-82610973a254" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:34 np0005588920 nova_compute[226886]: 2026-01-20 14:54:34.705 226890 DEBUG oslo_concurrency.lockutils [None req-46ff694f-7006-47a4-872a-8654467e5b0b 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "49919d3f-fab0-404f-a0a0-82610973a254" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:34 np0005588920 nova_compute[226886]: 2026-01-20 14:54:34.706 226890 DEBUG oslo_concurrency.lockutils [None req-46ff694f-7006-47a4-872a-8654467e5b0b 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Acquiring lock "49919d3f-fab0-404f-a0a0-82610973a254-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:34 np0005588920 nova_compute[226886]: 2026-01-20 14:54:34.706 226890 DEBUG oslo_concurrency.lockutils [None req-46ff694f-7006-47a4-872a-8654467e5b0b 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "49919d3f-fab0-404f-a0a0-82610973a254-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:34 np0005588920 nova_compute[226886]: 2026-01-20 14:54:34.706 226890 DEBUG oslo_concurrency.lockutils [None req-46ff694f-7006-47a4-872a-8654467e5b0b 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "49919d3f-fab0-404f-a0a0-82610973a254-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:34 np0005588920 nova_compute[226886]: 2026-01-20 14:54:34.707 226890 INFO nova.compute.manager [None req-46ff694f-7006-47a4-872a-8654467e5b0b 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Terminating instance#033[00m
Jan 20 09:54:34 np0005588920 nova_compute[226886]: 2026-01-20 14:54:34.708 226890 DEBUG nova.compute.manager [None req-46ff694f-7006-47a4-872a-8654467e5b0b 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:54:34 np0005588920 kernel: tap54d190c9-33 (unregistering): left promiscuous mode
Jan 20 09:54:34 np0005588920 NetworkManager[49076]: <info>  [1768920874.7521] device (tap54d190c9-33): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:54:34 np0005588920 nova_compute[226886]: 2026-01-20 14:54:34.762 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:34 np0005588920 ovn_controller[133971]: 2026-01-20T14:54:34Z|00581|binding|INFO|Releasing lport 54d190c9-33af-46a2-a141-ff83769d93c6 from this chassis (sb_readonly=0)
Jan 20 09:54:34 np0005588920 ovn_controller[133971]: 2026-01-20T14:54:34Z|00582|binding|INFO|Setting lport 54d190c9-33af-46a2-a141-ff83769d93c6 down in Southbound
Jan 20 09:54:34 np0005588920 ovn_controller[133971]: 2026-01-20T14:54:34Z|00583|binding|INFO|Removing iface tap54d190c9-33 ovn-installed in OVS
Jan 20 09:54:34 np0005588920 nova_compute[226886]: 2026-01-20 14:54:34.765 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:34 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:34.771 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1f:7c:1d 10.100.0.8'], port_security=['fa:16:3e:1f:7c:1d 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '49919d3f-fab0-404f-a0a0-82610973a254', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e9589011-b728-4b79-9945-aa6c52dd0fc2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b1e83af992c94112a965575784639d77', 'neutron:revision_number': '9', 'neutron:security_group_ids': 'da0e73ee-8414-4d81-a0bf-09363bb8a1b8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.210', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1dc6df7d-3e57-4779-8232-af1ccf413403, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=54d190c9-33af-46a2-a141-ff83769d93c6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:54:34 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:34.774 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 54d190c9-33af-46a2-a141-ff83769d93c6 in datapath e9589011-b728-4b79-9945-aa6c52dd0fc2 unbound from our chassis#033[00m
Jan 20 09:54:34 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:34.776 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e9589011-b728-4b79-9945-aa6c52dd0fc2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:54:34 np0005588920 nova_compute[226886]: 2026-01-20 14:54:34.781 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:34 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:34.778 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f031916d-5598-4e1e-b27f-ceaa1c2bbae6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:34 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:34.780 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2 namespace which is not needed anymore#033[00m
Jan 20 09:54:34 np0005588920 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d00000078.scope: Deactivated successfully.
Jan 20 09:54:34 np0005588920 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d00000078.scope: Consumed 14.049s CPU time.
Jan 20 09:54:34 np0005588920 systemd-machined[196121]: Machine qemu-57-instance-00000078 terminated.
Jan 20 09:54:34 np0005588920 neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2[271350]: [NOTICE]   (271354) : haproxy version is 2.8.14-c23fe91
Jan 20 09:54:34 np0005588920 neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2[271350]: [NOTICE]   (271354) : path to executable is /usr/sbin/haproxy
Jan 20 09:54:34 np0005588920 neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2[271350]: [WARNING]  (271354) : Exiting Master process...
Jan 20 09:54:34 np0005588920 neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2[271350]: [WARNING]  (271354) : Exiting Master process...
Jan 20 09:54:34 np0005588920 neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2[271350]: [ALERT]    (271354) : Current worker (271356) exited with code 143 (Terminated)
Jan 20 09:54:34 np0005588920 neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2[271350]: [WARNING]  (271354) : All workers exited. Exiting... (0)
Jan 20 09:54:34 np0005588920 systemd[1]: libpod-ac488393e56a96995862284da5e0dccaae43e5634793282b7dc9af4bb0a9fb05.scope: Deactivated successfully.
Jan 20 09:54:34 np0005588920 podman[271487]: 2026-01-20 14:54:34.915371297 +0000 UTC m=+0.049221538 container died ac488393e56a96995862284da5e0dccaae43e5634793282b7dc9af4bb0a9fb05 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 20 09:54:34 np0005588920 nova_compute[226886]: 2026-01-20 14:54:34.943 226890 INFO nova.virt.libvirt.driver [-] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Instance destroyed successfully.#033[00m
Jan 20 09:54:34 np0005588920 nova_compute[226886]: 2026-01-20 14:54:34.944 226890 DEBUG nova.objects.instance [None req-46ff694f-7006-47a4-872a-8654467e5b0b 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lazy-loading 'resources' on Instance uuid 49919d3f-fab0-404f-a0a0-82610973a254 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:54:34 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ac488393e56a96995862284da5e0dccaae43e5634793282b7dc9af4bb0a9fb05-userdata-shm.mount: Deactivated successfully.
Jan 20 09:54:34 np0005588920 systemd[1]: var-lib-containers-storage-overlay-ac631e97b2df31c4e6be68edab955c472772e4db3d7e45a234653d6eeb8e3552-merged.mount: Deactivated successfully.
Jan 20 09:54:34 np0005588920 podman[271487]: 2026-01-20 14:54:34.954435247 +0000 UTC m=+0.088285508 container cleanup ac488393e56a96995862284da5e0dccaae43e5634793282b7dc9af4bb0a9fb05 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 20 09:54:34 np0005588920 nova_compute[226886]: 2026-01-20 14:54:34.959 226890 DEBUG nova.virt.libvirt.vif [None req-46ff694f-7006-47a4-872a-8654467e5b0b 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-20T14:53:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-1885581249',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-1885581249',id=120,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGm3YXyDls3m8lgQsMp7i5z2Ji2kt+QoAKyNgN4cUeQAncl8sITzJAcvU8MP7QXpcIT5PJILYBp9zVzJhusCSqycT+8/Be6bl9GRyoq123x5/AtCBhaSdlyObjct+Gfsjw==',key_name='tempest-keypair-99379181',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:54:14Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b1e83af992c94112a965575784639d77',ramdisk_id='',reservation_id='r-8azvvs4u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeShelveTestJSON-896995479',owner_user_name='tempest-AttachVolumeShelveTestJSON-896995479-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:54:14Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='34eb73f628994c11801d447148d5f142',uuid=49919d3f-fab0-404f-a0a0-82610973a254,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "54d190c9-33af-46a2-a141-ff83769d93c6", "address": "fa:16:3e:1f:7c:1d", "network": {"id": "e9589011-b728-4b79-9945-aa6c52dd0fc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1143668360-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1e83af992c94112a965575784639d77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54d190c9-33", "ovs_interfaceid": "54d190c9-33af-46a2-a141-ff83769d93c6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:54:34 np0005588920 nova_compute[226886]: 2026-01-20 14:54:34.960 226890 DEBUG nova.network.os_vif_util [None req-46ff694f-7006-47a4-872a-8654467e5b0b 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Converting VIF {"id": "54d190c9-33af-46a2-a141-ff83769d93c6", "address": "fa:16:3e:1f:7c:1d", "network": {"id": "e9589011-b728-4b79-9945-aa6c52dd0fc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1143668360-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1e83af992c94112a965575784639d77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54d190c9-33", "ovs_interfaceid": "54d190c9-33af-46a2-a141-ff83769d93c6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:54:34 np0005588920 nova_compute[226886]: 2026-01-20 14:54:34.961 226890 DEBUG nova.network.os_vif_util [None req-46ff694f-7006-47a4-872a-8654467e5b0b 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1f:7c:1d,bridge_name='br-int',has_traffic_filtering=True,id=54d190c9-33af-46a2-a141-ff83769d93c6,network=Network(e9589011-b728-4b79-9945-aa6c52dd0fc2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap54d190c9-33') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:54:34 np0005588920 nova_compute[226886]: 2026-01-20 14:54:34.962 226890 DEBUG os_vif [None req-46ff694f-7006-47a4-872a-8654467e5b0b 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1f:7c:1d,bridge_name='br-int',has_traffic_filtering=True,id=54d190c9-33af-46a2-a141-ff83769d93c6,network=Network(e9589011-b728-4b79-9945-aa6c52dd0fc2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap54d190c9-33') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:54:34 np0005588920 nova_compute[226886]: 2026-01-20 14:54:34.966 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:34 np0005588920 systemd[1]: libpod-conmon-ac488393e56a96995862284da5e0dccaae43e5634793282b7dc9af4bb0a9fb05.scope: Deactivated successfully.
Jan 20 09:54:34 np0005588920 nova_compute[226886]: 2026-01-20 14:54:34.967 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap54d190c9-33, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:54:34 np0005588920 nova_compute[226886]: 2026-01-20 14:54:34.971 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:34 np0005588920 nova_compute[226886]: 2026-01-20 14:54:34.973 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:54:34 np0005588920 nova_compute[226886]: 2026-01-20 14:54:34.976 226890 INFO os_vif [None req-46ff694f-7006-47a4-872a-8654467e5b0b 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1f:7c:1d,bridge_name='br-int',has_traffic_filtering=True,id=54d190c9-33af-46a2-a141-ff83769d93c6,network=Network(e9589011-b728-4b79-9945-aa6c52dd0fc2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap54d190c9-33')#033[00m
Jan 20 09:54:35 np0005588920 podman[271525]: 2026-01-20 14:54:35.034260946 +0000 UTC m=+0.049624959 container remove ac488393e56a96995862284da5e0dccaae43e5634793282b7dc9af4bb0a9fb05 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 20 09:54:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:35.041 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[28c06a47-d70c-40a0-80e8-dc513e115e50]: (4, ('Tue Jan 20 02:54:34 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2 (ac488393e56a96995862284da5e0dccaae43e5634793282b7dc9af4bb0a9fb05)\nac488393e56a96995862284da5e0dccaae43e5634793282b7dc9af4bb0a9fb05\nTue Jan 20 02:54:34 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2 (ac488393e56a96995862284da5e0dccaae43e5634793282b7dc9af4bb0a9fb05)\nac488393e56a96995862284da5e0dccaae43e5634793282b7dc9af4bb0a9fb05\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:35.043 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[889bd786-9569-4d64-8789-ca622a69e2e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:35.044 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape9589011-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:54:35 np0005588920 nova_compute[226886]: 2026-01-20 14:54:35.046 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:35 np0005588920 kernel: tape9589011-b0: left promiscuous mode
Jan 20 09:54:35 np0005588920 nova_compute[226886]: 2026-01-20 14:54:35.059 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:35.063 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[6772bbdd-9c0e-4382-bbf2-fead65141d29]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:35.082 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[fa0627e0-b198-4044-b66d-55d41adc6e1c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:35.084 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[971c3c7e-eb48-49c6-8742-2e8b6c96b849]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:35.108 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a0b811f0-060a-4464-a7b4-d68c5114703a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 589421, 'reachable_time': 44045, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 271557, 'error': None, 'target': 'ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:35.111 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:54:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:35.111 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[7845e876-2f87-40bd-8918-59e7fbc507b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:35 np0005588920 systemd[1]: run-netns-ovnmeta\x2de9589011\x2db728\x2d4b79\x2d9945\x2daa6c52dd0fc2.mount: Deactivated successfully.
Jan 20 09:54:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:54:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:35.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:54:35 np0005588920 nova_compute[226886]: 2026-01-20 14:54:35.615 226890 DEBUG nova.compute.manager [req-f39074d1-d3a2-45f3-9fca-a17623d67ff2 req-30c8c638-184f-4a67-9fb3-6381aa83bfec 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Received event network-vif-unplugged-54d190c9-33af-46a2-a141-ff83769d93c6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:54:35 np0005588920 nova_compute[226886]: 2026-01-20 14:54:35.616 226890 DEBUG oslo_concurrency.lockutils [req-f39074d1-d3a2-45f3-9fca-a17623d67ff2 req-30c8c638-184f-4a67-9fb3-6381aa83bfec 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "49919d3f-fab0-404f-a0a0-82610973a254-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:35 np0005588920 nova_compute[226886]: 2026-01-20 14:54:35.617 226890 DEBUG oslo_concurrency.lockutils [req-f39074d1-d3a2-45f3-9fca-a17623d67ff2 req-30c8c638-184f-4a67-9fb3-6381aa83bfec 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "49919d3f-fab0-404f-a0a0-82610973a254-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:35 np0005588920 nova_compute[226886]: 2026-01-20 14:54:35.617 226890 DEBUG oslo_concurrency.lockutils [req-f39074d1-d3a2-45f3-9fca-a17623d67ff2 req-30c8c638-184f-4a67-9fb3-6381aa83bfec 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "49919d3f-fab0-404f-a0a0-82610973a254-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:35 np0005588920 nova_compute[226886]: 2026-01-20 14:54:35.618 226890 DEBUG nova.compute.manager [req-f39074d1-d3a2-45f3-9fca-a17623d67ff2 req-30c8c638-184f-4a67-9fb3-6381aa83bfec 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] No waiting events found dispatching network-vif-unplugged-54d190c9-33af-46a2-a141-ff83769d93c6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:54:35 np0005588920 nova_compute[226886]: 2026-01-20 14:54:35.618 226890 DEBUG nova.compute.manager [req-f39074d1-d3a2-45f3-9fca-a17623d67ff2 req-30c8c638-184f-4a67-9fb3-6381aa83bfec 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Received event network-vif-unplugged-54d190c9-33af-46a2-a141-ff83769d93c6 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:54:36 np0005588920 nova_compute[226886]: 2026-01-20 14:54:36.035 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:36.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:36 np0005588920 nova_compute[226886]: 2026-01-20 14:54:36.158 226890 INFO nova.virt.libvirt.driver [None req-46ff694f-7006-47a4-872a-8654467e5b0b 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Deleting instance files /var/lib/nova/instances/49919d3f-fab0-404f-a0a0-82610973a254_del#033[00m
Jan 20 09:54:36 np0005588920 nova_compute[226886]: 2026-01-20 14:54:36.158 226890 INFO nova.virt.libvirt.driver [None req-46ff694f-7006-47a4-872a-8654467e5b0b 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Deletion of /var/lib/nova/instances/49919d3f-fab0-404f-a0a0-82610973a254_del complete#033[00m
Jan 20 09:54:36 np0005588920 nova_compute[226886]: 2026-01-20 14:54:36.233 226890 INFO nova.compute.manager [None req-46ff694f-7006-47a4-872a-8654467e5b0b 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Took 1.52 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:54:36 np0005588920 nova_compute[226886]: 2026-01-20 14:54:36.234 226890 DEBUG oslo.service.loopingcall [None req-46ff694f-7006-47a4-872a-8654467e5b0b 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:54:36 np0005588920 nova_compute[226886]: 2026-01-20 14:54:36.234 226890 DEBUG nova.compute.manager [-] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:54:36 np0005588920 nova_compute[226886]: 2026-01-20 14:54:36.234 226890 DEBUG nova.network.neutron [-] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:54:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:54:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:37.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:54:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e296 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:54:37 np0005588920 nova_compute[226886]: 2026-01-20 14:54:37.747 226890 DEBUG nova.compute.manager [req-2d467a36-0745-442c-aa68-ba7a234fa296 req-2db8f229-36e3-4b20-ad61-976374fb188b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Received event network-vif-plugged-54d190c9-33af-46a2-a141-ff83769d93c6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:54:37 np0005588920 nova_compute[226886]: 2026-01-20 14:54:37.747 226890 DEBUG oslo_concurrency.lockutils [req-2d467a36-0745-442c-aa68-ba7a234fa296 req-2db8f229-36e3-4b20-ad61-976374fb188b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "49919d3f-fab0-404f-a0a0-82610973a254-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:37 np0005588920 nova_compute[226886]: 2026-01-20 14:54:37.748 226890 DEBUG oslo_concurrency.lockutils [req-2d467a36-0745-442c-aa68-ba7a234fa296 req-2db8f229-36e3-4b20-ad61-976374fb188b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "49919d3f-fab0-404f-a0a0-82610973a254-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:37 np0005588920 nova_compute[226886]: 2026-01-20 14:54:37.748 226890 DEBUG oslo_concurrency.lockutils [req-2d467a36-0745-442c-aa68-ba7a234fa296 req-2db8f229-36e3-4b20-ad61-976374fb188b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "49919d3f-fab0-404f-a0a0-82610973a254-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:37 np0005588920 nova_compute[226886]: 2026-01-20 14:54:37.748 226890 DEBUG nova.compute.manager [req-2d467a36-0745-442c-aa68-ba7a234fa296 req-2db8f229-36e3-4b20-ad61-976374fb188b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] No waiting events found dispatching network-vif-plugged-54d190c9-33af-46a2-a141-ff83769d93c6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:54:37 np0005588920 nova_compute[226886]: 2026-01-20 14:54:37.748 226890 WARNING nova.compute.manager [req-2d467a36-0745-442c-aa68-ba7a234fa296 req-2db8f229-36e3-4b20-ad61-976374fb188b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Received unexpected event network-vif-plugged-54d190c9-33af-46a2-a141-ff83769d93c6 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:54:38 np0005588920 nova_compute[226886]: 2026-01-20 14:54:38.032 226890 DEBUG nova.network.neutron [-] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:54:38 np0005588920 nova_compute[226886]: 2026-01-20 14:54:38.056 226890 INFO nova.compute.manager [-] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Took 1.82 seconds to deallocate network for instance.#033[00m
Jan 20 09:54:38 np0005588920 nova_compute[226886]: 2026-01-20 14:54:38.104 226890 DEBUG oslo_concurrency.lockutils [None req-46ff694f-7006-47a4-872a-8654467e5b0b 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:38 np0005588920 nova_compute[226886]: 2026-01-20 14:54:38.105 226890 DEBUG oslo_concurrency.lockutils [None req-46ff694f-7006-47a4-872a-8654467e5b0b 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:38.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:38 np0005588920 nova_compute[226886]: 2026-01-20 14:54:38.202 226890 DEBUG oslo_concurrency.processutils [None req-46ff694f-7006-47a4-872a-8654467e5b0b 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:54:38 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:54:38 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3424035681' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:54:38 np0005588920 nova_compute[226886]: 2026-01-20 14:54:38.668 226890 DEBUG oslo_concurrency.processutils [None req-46ff694f-7006-47a4-872a-8654467e5b0b 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:54:38 np0005588920 nova_compute[226886]: 2026-01-20 14:54:38.675 226890 DEBUG nova.compute.provider_tree [None req-46ff694f-7006-47a4-872a-8654467e5b0b 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:54:38 np0005588920 nova_compute[226886]: 2026-01-20 14:54:38.692 226890 DEBUG nova.scheduler.client.report [None req-46ff694f-7006-47a4-872a-8654467e5b0b 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:54:38 np0005588920 nova_compute[226886]: 2026-01-20 14:54:38.723 226890 DEBUG oslo_concurrency.lockutils [None req-46ff694f-7006-47a4-872a-8654467e5b0b 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.618s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:38 np0005588920 nova_compute[226886]: 2026-01-20 14:54:38.740 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:54:38 np0005588920 nova_compute[226886]: 2026-01-20 14:54:38.777 226890 INFO nova.scheduler.client.report [None req-46ff694f-7006-47a4-872a-8654467e5b0b 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Deleted allocations for instance 49919d3f-fab0-404f-a0a0-82610973a254#033[00m
Jan 20 09:54:38 np0005588920 nova_compute[226886]: 2026-01-20 14:54:38.886 226890 DEBUG oslo_concurrency.lockutils [None req-46ff694f-7006-47a4-872a-8654467e5b0b 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "49919d3f-fab0-404f-a0a0-82610973a254" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.181s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:54:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:39.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:54:39 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e297 e297: 3 total, 3 up, 3 in
Jan 20 09:54:39 np0005588920 nova_compute[226886]: 2026-01-20 14:54:39.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:54:39 np0005588920 nova_compute[226886]: 2026-01-20 14:54:39.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:54:39 np0005588920 nova_compute[226886]: 2026-01-20 14:54:39.970 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:40 np0005588920 nova_compute[226886]: 2026-01-20 14:54:40.080 226890 DEBUG nova.compute.manager [req-c65566d0-3ac0-4ec7-bcc8-79c6796c328d req-e823954f-6437-4922-bb5d-693ca5c54b3d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Received event network-vif-deleted-54d190c9-33af-46a2-a141-ff83769d93c6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:54:40 np0005588920 nova_compute[226886]: 2026-01-20 14:54:40.129 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "refresh_cache-ce0152a6-7d4d-4eac-9587-a43ad934d9cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:54:40 np0005588920 nova_compute[226886]: 2026-01-20 14:54:40.130 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquired lock "refresh_cache-ce0152a6-7d4d-4eac-9587-a43ad934d9cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:54:40 np0005588920 nova_compute[226886]: 2026-01-20 14:54:40.130 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 09:54:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:40.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:40 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e298 e298: 3 total, 3 up, 3 in
Jan 20 09:54:40 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #94. Immutable memtables: 0.
Jan 20 09:54:40 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:54:40.733565) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 09:54:40 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 57] Flushing memtable with next log file: 94
Jan 20 09:54:40 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920880733621, "job": 57, "event": "flush_started", "num_memtables": 1, "num_entries": 673, "num_deletes": 255, "total_data_size": 1044116, "memory_usage": 1057296, "flush_reason": "Manual Compaction"}
Jan 20 09:54:40 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 57] Level-0 flush table #95: started
Jan 20 09:54:40 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920880743517, "cf_name": "default", "job": 57, "event": "table_file_creation", "file_number": 95, "file_size": 687669, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 49154, "largest_seqno": 49822, "table_properties": {"data_size": 684234, "index_size": 1279, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 8344, "raw_average_key_size": 20, "raw_value_size": 677213, "raw_average_value_size": 1631, "num_data_blocks": 55, "num_entries": 415, "num_filter_entries": 415, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768920850, "oldest_key_time": 1768920850, "file_creation_time": 1768920880, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 95, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:54:40 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 57] Flush lasted 10017 microseconds, and 3316 cpu microseconds.
Jan 20 09:54:40 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:54:40 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:54:40.743577) [db/flush_job.cc:967] [default] [JOB 57] Level-0 flush table #95: 687669 bytes OK
Jan 20 09:54:40 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:54:40.743602) [db/memtable_list.cc:519] [default] Level-0 commit table #95 started
Jan 20 09:54:40 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:54:40.747508) [db/memtable_list.cc:722] [default] Level-0 commit table #95: memtable #1 done
Jan 20 09:54:40 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:54:40.747553) EVENT_LOG_v1 {"time_micros": 1768920880747548, "job": 57, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 09:54:40 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:54:40.747569) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 09:54:40 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 57] Try to delete WAL files size 1040362, prev total WAL file size 1040362, number of live WAL files 2.
Jan 20 09:54:40 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000091.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:54:40 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:54:40.748254) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730034303136' seq:72057594037927935, type:22 .. '7061786F730034323638' seq:0, type:0; will stop at (end)
Jan 20 09:54:40 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 58] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 09:54:40 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 57 Base level 0, inputs: [95(671KB)], [93(13MB)]
Jan 20 09:54:40 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920880748341, "job": 58, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [95], "files_L6": [93], "score": -1, "input_data_size": 14320502, "oldest_snapshot_seqno": -1}
Jan 20 09:54:40 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 58] Generated table #96: 7486 keys, 12472602 bytes, temperature: kUnknown
Jan 20 09:54:40 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920880898021, "cf_name": "default", "job": 58, "event": "table_file_creation", "file_number": 96, "file_size": 12472602, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12420292, "index_size": 32486, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 18757, "raw_key_size": 193735, "raw_average_key_size": 25, "raw_value_size": 12284249, "raw_average_value_size": 1640, "num_data_blocks": 1286, "num_entries": 7486, "num_filter_entries": 7486, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768920880, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 96, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:54:40 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:54:40 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:54:40.898374) [db/compaction/compaction_job.cc:1663] [default] [JOB 58] Compacted 1@0 + 1@6 files to L6 => 12472602 bytes
Jan 20 09:54:40 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:54:40.917873) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 95.6 rd, 83.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.7, 13.0 +0.0 blob) out(11.9 +0.0 blob), read-write-amplify(39.0) write-amplify(18.1) OK, records in: 8010, records dropped: 524 output_compression: NoCompression
Jan 20 09:54:40 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:54:40.917905) EVENT_LOG_v1 {"time_micros": 1768920880917893, "job": 58, "event": "compaction_finished", "compaction_time_micros": 149874, "compaction_time_cpu_micros": 30286, "output_level": 6, "num_output_files": 1, "total_output_size": 12472602, "num_input_records": 8010, "num_output_records": 7486, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 09:54:40 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000095.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:54:40 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920880918173, "job": 58, "event": "table_file_deletion", "file_number": 95}
Jan 20 09:54:40 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000093.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:54:40 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920880920728, "job": 58, "event": "table_file_deletion", "file_number": 93}
Jan 20 09:54:40 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:54:40.748078) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:54:40 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:54:40.920819) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:54:40 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:54:40.920825) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:54:40 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:54:40.920826) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:54:40 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:54:40.920828) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:54:40 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:54:40.920830) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:54:41 np0005588920 nova_compute[226886]: 2026-01-20 14:54:41.035 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:41.091 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '39'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:54:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:54:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:41.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:54:41 np0005588920 nova_compute[226886]: 2026-01-20 14:54:41.486 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Updating instance_info_cache with network_info: [{"id": "083e3cc0-e665-4049-a47b-233abf07b9d5", "address": "fa:16:3e:6a:15:6d", "network": {"id": "41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1445030024-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3b1b7f5b4f84b5abbc401eb577c85c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap083e3cc0-e6", "ovs_interfaceid": "083e3cc0-e665-4049-a47b-233abf07b9d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:54:41 np0005588920 nova_compute[226886]: 2026-01-20 14:54:41.512 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Releasing lock "refresh_cache-ce0152a6-7d4d-4eac-9587-a43ad934d9cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:54:41 np0005588920 nova_compute[226886]: 2026-01-20 14:54:41.513 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 09:54:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:42.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e298 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:54:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:43.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:43 np0005588920 nova_compute[226886]: 2026-01-20 14:54:43.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:54:43 np0005588920 nova_compute[226886]: 2026-01-20 14:54:43.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:54:43 np0005588920 nova_compute[226886]: 2026-01-20 14:54:43.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:54:43 np0005588920 nova_compute[226886]: 2026-01-20 14:54:43.746 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:43 np0005588920 nova_compute[226886]: 2026-01-20 14:54:43.746 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:43 np0005588920 nova_compute[226886]: 2026-01-20 14:54:43.746 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:43 np0005588920 nova_compute[226886]: 2026-01-20 14:54:43.747 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:54:43 np0005588920 nova_compute[226886]: 2026-01-20 14:54:43.747 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:54:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:54:44 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2381813736' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:54:44 np0005588920 nova_compute[226886]: 2026-01-20 14:54:44.151 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.404s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:54:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:44.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:44 np0005588920 nova_compute[226886]: 2026-01-20 14:54:44.258 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-0000007a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:54:44 np0005588920 nova_compute[226886]: 2026-01-20 14:54:44.258 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-0000007a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:54:44 np0005588920 nova_compute[226886]: 2026-01-20 14:54:44.261 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-00000070 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:54:44 np0005588920 nova_compute[226886]: 2026-01-20 14:54:44.261 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-00000070 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:54:44 np0005588920 nova_compute[226886]: 2026-01-20 14:54:44.424 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:54:44 np0005588920 nova_compute[226886]: 2026-01-20 14:54:44.425 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4005MB free_disk=20.78514862060547GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:54:44 np0005588920 nova_compute[226886]: 2026-01-20 14:54:44.426 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:44 np0005588920 nova_compute[226886]: 2026-01-20 14:54:44.426 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:44 np0005588920 nova_compute[226886]: 2026-01-20 14:54:44.510 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:54:44 np0005588920 nova_compute[226886]: 2026-01-20 14:54:44.510 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance ce0152a6-7d4d-4eac-9587-a43ad934d9cc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:54:44 np0005588920 nova_compute[226886]: 2026-01-20 14:54:44.510 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:54:44 np0005588920 nova_compute[226886]: 2026-01-20 14:54:44.510 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:54:44 np0005588920 nova_compute[226886]: 2026-01-20 14:54:44.588 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:54:44 np0005588920 nova_compute[226886]: 2026-01-20 14:54:44.973 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:54:44 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2620888822' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:54:45 np0005588920 nova_compute[226886]: 2026-01-20 14:54:45.011 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:54:45 np0005588920 nova_compute[226886]: 2026-01-20 14:54:45.017 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:54:45 np0005588920 nova_compute[226886]: 2026-01-20 14:54:45.035 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:54:45 np0005588920 nova_compute[226886]: 2026-01-20 14:54:45.057 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:54:45 np0005588920 nova_compute[226886]: 2026-01-20 14:54:45.057 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.631s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:45.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:46 np0005588920 nova_compute[226886]: 2026-01-20 14:54:46.036 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:54:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:46.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:54:47 np0005588920 nova_compute[226886]: 2026-01-20 14:54:47.053 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:54:47 np0005588920 nova_compute[226886]: 2026-01-20 14:54:47.054 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:54:47 np0005588920 nova_compute[226886]: 2026-01-20 14:54:47.111 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:54:47 np0005588920 nova_compute[226886]: 2026-01-20 14:54:47.112 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:54:47 np0005588920 nova_compute[226886]: 2026-01-20 14:54:47.112 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:54:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:47.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:47 np0005588920 ovn_controller[133971]: 2026-01-20T14:54:47Z|00584|binding|INFO|Releasing lport 3fa2df7b-42b2-4a3b-a33b-ab37b5d6aef3 from this chassis (sb_readonly=0)
Jan 20 09:54:47 np0005588920 ovn_controller[133971]: 2026-01-20T14:54:47Z|00585|binding|INFO|Releasing lport b033e9e6-9781-4424-a20f-7b48a14e2c80 from this chassis (sb_readonly=0)
Jan 20 09:54:47 np0005588920 nova_compute[226886]: 2026-01-20 14:54:47.417 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e298 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:54:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e299 e299: 3 total, 3 up, 3 in
Jan 20 09:54:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:48.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:48 np0005588920 nova_compute[226886]: 2026-01-20 14:54:48.610 226890 DEBUG oslo_concurrency.lockutils [None req-470246a5-eb6c-4728-905c-393336011d05 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Acquiring lock "7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:48 np0005588920 nova_compute[226886]: 2026-01-20 14:54:48.610 226890 DEBUG oslo_concurrency.lockutils [None req-470246a5-eb6c-4728-905c-393336011d05 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:48 np0005588920 nova_compute[226886]: 2026-01-20 14:54:48.611 226890 DEBUG oslo_concurrency.lockutils [None req-470246a5-eb6c-4728-905c-393336011d05 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Acquiring lock "7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:48 np0005588920 nova_compute[226886]: 2026-01-20 14:54:48.611 226890 DEBUG oslo_concurrency.lockutils [None req-470246a5-eb6c-4728-905c-393336011d05 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:48 np0005588920 nova_compute[226886]: 2026-01-20 14:54:48.611 226890 DEBUG oslo_concurrency.lockutils [None req-470246a5-eb6c-4728-905c-393336011d05 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:48 np0005588920 nova_compute[226886]: 2026-01-20 14:54:48.612 226890 INFO nova.compute.manager [None req-470246a5-eb6c-4728-905c-393336011d05 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Terminating instance#033[00m
Jan 20 09:54:48 np0005588920 nova_compute[226886]: 2026-01-20 14:54:48.613 226890 DEBUG nova.compute.manager [None req-470246a5-eb6c-4728-905c-393336011d05 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:54:48 np0005588920 kernel: tap362a0992-4e (unregistering): left promiscuous mode
Jan 20 09:54:48 np0005588920 NetworkManager[49076]: <info>  [1768920888.6724] device (tap362a0992-4e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:54:48 np0005588920 nova_compute[226886]: 2026-01-20 14:54:48.682 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:48 np0005588920 ovn_controller[133971]: 2026-01-20T14:54:48Z|00586|binding|INFO|Releasing lport 362a0992-4e48-4999-a396-29fc2957fa09 from this chassis (sb_readonly=0)
Jan 20 09:54:48 np0005588920 ovn_controller[133971]: 2026-01-20T14:54:48Z|00587|binding|INFO|Setting lport 362a0992-4e48-4999-a396-29fc2957fa09 down in Southbound
Jan 20 09:54:48 np0005588920 ovn_controller[133971]: 2026-01-20T14:54:48Z|00588|binding|INFO|Removing iface tap362a0992-4e ovn-installed in OVS
Jan 20 09:54:48 np0005588920 nova_compute[226886]: 2026-01-20 14:54:48.685 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:48.689 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:83:1f:c0 10.100.0.5'], port_security=['fa:16:3e:83:1f:c0 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-79184781-1f23-4584-87de-08e262242488', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0a29915e0dd2403fbd7b7e847696b00a', 'neutron:revision_number': '8', 'neutron:security_group_ids': '30ec24b7-15ba-4aeb-9785-539071729f77', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6b73ab05-b29f-401a-84a5-ea1a96103f33, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=362a0992-4e48-4999-a396-29fc2957fa09) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:54:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:48.689 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 362a0992-4e48-4999-a396-29fc2957fa09 in datapath 79184781-1f23-4584-87de-08e262242488 unbound from our chassis#033[00m
Jan 20 09:54:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:48.691 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 79184781-1f23-4584-87de-08e262242488, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:54:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:48.693 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0aea42e5-5820-4e94-af31-aa8c0616c494]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:48.694 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-79184781-1f23-4584-87de-08e262242488 namespace which is not needed anymore#033[00m
Jan 20 09:54:48 np0005588920 nova_compute[226886]: 2026-01-20 14:54:48.725 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:48 np0005588920 nova_compute[226886]: 2026-01-20 14:54:48.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:54:48 np0005588920 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d00000070.scope: Deactivated successfully.
Jan 20 09:54:48 np0005588920 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d00000070.scope: Consumed 19.850s CPU time.
Jan 20 09:54:48 np0005588920 systemd-machined[196121]: Machine qemu-53-instance-00000070 terminated.
Jan 20 09:54:48 np0005588920 nova_compute[226886]: 2026-01-20 14:54:48.844 226890 INFO nova.virt.libvirt.driver [-] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Instance destroyed successfully.#033[00m
Jan 20 09:54:48 np0005588920 nova_compute[226886]: 2026-01-20 14:54:48.845 226890 DEBUG nova.objects.instance [None req-470246a5-eb6c-4728-905c-393336011d05 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lazy-loading 'resources' on Instance uuid 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:54:48 np0005588920 nova_compute[226886]: 2026-01-20 14:54:48.863 226890 DEBUG nova.virt.libvirt.vif [None req-470246a5-eb6c-4728-905c-393336011d05 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:51:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-1832777585',display_name='tempest-ServerStableDeviceRescueTest-server-1832777585',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-1832777585',id=112,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:51:45Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0a29915e0dd2403fbd7b7e847696b00a',ramdisk_id='',reservation_id='r-hox96xwk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif
_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-129078052',owner_user_name='tempest-ServerStableDeviceRescueTest-129078052-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:51:49Z,user_data=None,user_id='d85d286ce6224326a0f4a15a06afbfea',uuid=7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "362a0992-4e48-4999-a396-29fc2957fa09", "address": "fa:16:3e:83:1f:c0", "network": {"id": "79184781-1f23-4584-87de-08e262242488", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a29915e0dd2403fbd7b7e847696b00a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap362a0992-4e", "ovs_interfaceid": "362a0992-4e48-4999-a396-29fc2957fa09", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:54:48 np0005588920 nova_compute[226886]: 2026-01-20 14:54:48.864 226890 DEBUG nova.network.os_vif_util [None req-470246a5-eb6c-4728-905c-393336011d05 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Converting VIF {"id": "362a0992-4e48-4999-a396-29fc2957fa09", "address": "fa:16:3e:83:1f:c0", "network": {"id": "79184781-1f23-4584-87de-08e262242488", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a29915e0dd2403fbd7b7e847696b00a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap362a0992-4e", "ovs_interfaceid": "362a0992-4e48-4999-a396-29fc2957fa09", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:54:48 np0005588920 nova_compute[226886]: 2026-01-20 14:54:48.865 226890 DEBUG nova.network.os_vif_util [None req-470246a5-eb6c-4728-905c-393336011d05 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:83:1f:c0,bridge_name='br-int',has_traffic_filtering=True,id=362a0992-4e48-4999-a396-29fc2957fa09,network=Network(79184781-1f23-4584-87de-08e262242488),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap362a0992-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:54:48 np0005588920 nova_compute[226886]: 2026-01-20 14:54:48.865 226890 DEBUG os_vif [None req-470246a5-eb6c-4728-905c-393336011d05 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:83:1f:c0,bridge_name='br-int',has_traffic_filtering=True,id=362a0992-4e48-4999-a396-29fc2957fa09,network=Network(79184781-1f23-4584-87de-08e262242488),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap362a0992-4e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:54:48 np0005588920 nova_compute[226886]: 2026-01-20 14:54:48.867 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:48 np0005588920 nova_compute[226886]: 2026-01-20 14:54:48.867 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap362a0992-4e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:54:48 np0005588920 nova_compute[226886]: 2026-01-20 14:54:48.869 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:48 np0005588920 nova_compute[226886]: 2026-01-20 14:54:48.871 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:54:48 np0005588920 nova_compute[226886]: 2026-01-20 14:54:48.874 226890 INFO os_vif [None req-470246a5-eb6c-4728-905c-393336011d05 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:83:1f:c0,bridge_name='br-int',has_traffic_filtering=True,id=362a0992-4e48-4999-a396-29fc2957fa09,network=Network(79184781-1f23-4584-87de-08e262242488),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap362a0992-4e')#033[00m
Jan 20 09:54:48 np0005588920 neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488[267898]: [NOTICE]   (267902) : haproxy version is 2.8.14-c23fe91
Jan 20 09:54:48 np0005588920 neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488[267898]: [NOTICE]   (267902) : path to executable is /usr/sbin/haproxy
Jan 20 09:54:48 np0005588920 neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488[267898]: [WARNING]  (267902) : Exiting Master process...
Jan 20 09:54:48 np0005588920 neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488[267898]: [ALERT]    (267902) : Current worker (267904) exited with code 143 (Terminated)
Jan 20 09:54:48 np0005588920 neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488[267898]: [WARNING]  (267902) : All workers exited. Exiting... (0)
Jan 20 09:54:48 np0005588920 systemd[1]: libpod-cd81c17c6c59caeea41e6722ce5e4f159c187fc770455fb6f30ae6bf2d86cb67.scope: Deactivated successfully.
Jan 20 09:54:48 np0005588920 podman[271650]: 2026-01-20 14:54:48.886176414 +0000 UTC m=+0.060050713 container died cd81c17c6c59caeea41e6722ce5e4f159c187fc770455fb6f30ae6bf2d86cb67 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 20 09:54:48 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cd81c17c6c59caeea41e6722ce5e4f159c187fc770455fb6f30ae6bf2d86cb67-userdata-shm.mount: Deactivated successfully.
Jan 20 09:54:48 np0005588920 systemd[1]: var-lib-containers-storage-overlay-afbbe09682e2542425445b6878ca6fc4cfb68106ae1f735574f10d3acdca50c9-merged.mount: Deactivated successfully.
Jan 20 09:54:48 np0005588920 podman[271650]: 2026-01-20 14:54:48.925365228 +0000 UTC m=+0.099239527 container cleanup cd81c17c6c59caeea41e6722ce5e4f159c187fc770455fb6f30ae6bf2d86cb67 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 20 09:54:48 np0005588920 systemd[1]: libpod-conmon-cd81c17c6c59caeea41e6722ce5e4f159c187fc770455fb6f30ae6bf2d86cb67.scope: Deactivated successfully.
Jan 20 09:54:49 np0005588920 podman[271709]: 2026-01-20 14:54:49.009312193 +0000 UTC m=+0.056972366 container remove cd81c17c6c59caeea41e6722ce5e4f159c187fc770455fb6f30ae6bf2d86cb67 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:54:49 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:49.015 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0c5b6c61-f540-4168-bbca-de2b3e293f7e]: (4, ('Tue Jan 20 02:54:48 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488 (cd81c17c6c59caeea41e6722ce5e4f159c187fc770455fb6f30ae6bf2d86cb67)\ncd81c17c6c59caeea41e6722ce5e4f159c187fc770455fb6f30ae6bf2d86cb67\nTue Jan 20 02:54:48 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488 (cd81c17c6c59caeea41e6722ce5e4f159c187fc770455fb6f30ae6bf2d86cb67)\ncd81c17c6c59caeea41e6722ce5e4f159c187fc770455fb6f30ae6bf2d86cb67\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:49 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:49.017 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[685ac7b5-329f-4cac-9dc6-af5948ad9d60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:49 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:49.017 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap79184781-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:54:49 np0005588920 nova_compute[226886]: 2026-01-20 14:54:49.020 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:49 np0005588920 kernel: tap79184781-10: left promiscuous mode
Jan 20 09:54:49 np0005588920 nova_compute[226886]: 2026-01-20 14:54:49.035 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:49 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:49.037 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c4c73771-4e95-4b31-856c-a7501e96a7aa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:49 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:49.055 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d59c2aff-045f-4f56-bd5f-914b2c888d76]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:49 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:49.057 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2afe58b0-423b-4d71-bdb6-da144bff260b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:49 np0005588920 nova_compute[226886]: 2026-01-20 14:54:49.065 226890 DEBUG nova.compute.manager [req-4fbc4498-2da8-4059-9bfc-856618a738a4 req-6755a050-0022-4eda-b353-f998c1c44a82 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Received event network-vif-unplugged-362a0992-4e48-4999-a396-29fc2957fa09 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:54:49 np0005588920 nova_compute[226886]: 2026-01-20 14:54:49.066 226890 DEBUG oslo_concurrency.lockutils [req-4fbc4498-2da8-4059-9bfc-856618a738a4 req-6755a050-0022-4eda-b353-f998c1c44a82 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:49 np0005588920 nova_compute[226886]: 2026-01-20 14:54:49.066 226890 DEBUG oslo_concurrency.lockutils [req-4fbc4498-2da8-4059-9bfc-856618a738a4 req-6755a050-0022-4eda-b353-f998c1c44a82 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:49 np0005588920 nova_compute[226886]: 2026-01-20 14:54:49.066 226890 DEBUG oslo_concurrency.lockutils [req-4fbc4498-2da8-4059-9bfc-856618a738a4 req-6755a050-0022-4eda-b353-f998c1c44a82 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:49 np0005588920 nova_compute[226886]: 2026-01-20 14:54:49.067 226890 DEBUG nova.compute.manager [req-4fbc4498-2da8-4059-9bfc-856618a738a4 req-6755a050-0022-4eda-b353-f998c1c44a82 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] No waiting events found dispatching network-vif-unplugged-362a0992-4e48-4999-a396-29fc2957fa09 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:54:49 np0005588920 nova_compute[226886]: 2026-01-20 14:54:49.067 226890 DEBUG nova.compute.manager [req-4fbc4498-2da8-4059-9bfc-856618a738a4 req-6755a050-0022-4eda-b353-f998c1c44a82 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Received event network-vif-unplugged-362a0992-4e48-4999-a396-29fc2957fa09 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:54:49 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:49.071 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[488024f1-58f0-4b75-bb3c-9a27042a7fd4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 575006, 'reachable_time': 42972, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 271724, 'error': None, 'target': 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:49 np0005588920 systemd[1]: run-netns-ovnmeta\x2d79184781\x2d1f23\x2d4584\x2d87de\x2d08e262242488.mount: Deactivated successfully.
Jan 20 09:54:49 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:49.075 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-79184781-1f23-4584-87de-08e262242488 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:54:49 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:54:49.075 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[088da75e-a2c4-44cd-bcc6-efca1abde6d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:49.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:49 np0005588920 nova_compute[226886]: 2026-01-20 14:54:49.328 226890 INFO nova.virt.libvirt.driver [None req-470246a5-eb6c-4728-905c-393336011d05 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Deleting instance files /var/lib/nova/instances/7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b_del#033[00m
Jan 20 09:54:49 np0005588920 nova_compute[226886]: 2026-01-20 14:54:49.329 226890 INFO nova.virt.libvirt.driver [None req-470246a5-eb6c-4728-905c-393336011d05 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Deletion of /var/lib/nova/instances/7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b_del complete#033[00m
Jan 20 09:54:49 np0005588920 nova_compute[226886]: 2026-01-20 14:54:49.392 226890 INFO nova.compute.manager [None req-470246a5-eb6c-4728-905c-393336011d05 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Took 0.78 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:54:49 np0005588920 nova_compute[226886]: 2026-01-20 14:54:49.392 226890 DEBUG oslo.service.loopingcall [None req-470246a5-eb6c-4728-905c-393336011d05 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:54:49 np0005588920 nova_compute[226886]: 2026-01-20 14:54:49.392 226890 DEBUG nova.compute.manager [-] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:54:49 np0005588920 nova_compute[226886]: 2026-01-20 14:54:49.393 226890 DEBUG nova.network.neutron [-] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:54:49 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e300 e300: 3 total, 3 up, 3 in
Jan 20 09:54:49 np0005588920 nova_compute[226886]: 2026-01-20 14:54:49.941 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768920874.9391823, 49919d3f-fab0-404f-a0a0-82610973a254 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:54:49 np0005588920 nova_compute[226886]: 2026-01-20 14:54:49.941 226890 INFO nova.compute.manager [-] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:54:49 np0005588920 nova_compute[226886]: 2026-01-20 14:54:49.962 226890 DEBUG nova.compute.manager [None req-11c72ed8-45df-487a-9891-6a55cfb43d13 - - - - - -] [instance: 49919d3f-fab0-404f-a0a0-82610973a254] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:54:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:50.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:50 np0005588920 nova_compute[226886]: 2026-01-20 14:54:50.170 226890 DEBUG nova.network.neutron [-] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:54:50 np0005588920 nova_compute[226886]: 2026-01-20 14:54:50.186 226890 INFO nova.compute.manager [-] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Took 0.79 seconds to deallocate network for instance.#033[00m
Jan 20 09:54:50 np0005588920 nova_compute[226886]: 2026-01-20 14:54:50.244 226890 DEBUG oslo_concurrency.lockutils [None req-470246a5-eb6c-4728-905c-393336011d05 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:50 np0005588920 nova_compute[226886]: 2026-01-20 14:54:50.244 226890 DEBUG oslo_concurrency.lockutils [None req-470246a5-eb6c-4728-905c-393336011d05 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:50 np0005588920 nova_compute[226886]: 2026-01-20 14:54:50.269 226890 DEBUG nova.compute.manager [req-085bf5b4-4845-40bb-9397-754a369213e2 req-681e370d-6bd1-47ef-9414-735638df1b49 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Received event network-vif-deleted-362a0992-4e48-4999-a396-29fc2957fa09 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:54:50 np0005588920 nova_compute[226886]: 2026-01-20 14:54:50.367 226890 DEBUG oslo_concurrency.processutils [None req-470246a5-eb6c-4728-905c-393336011d05 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:54:50 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:54:50 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1206076325' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:54:50 np0005588920 nova_compute[226886]: 2026-01-20 14:54:50.805 226890 DEBUG oslo_concurrency.processutils [None req-470246a5-eb6c-4728-905c-393336011d05 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:54:50 np0005588920 nova_compute[226886]: 2026-01-20 14:54:50.812 226890 DEBUG nova.compute.provider_tree [None req-470246a5-eb6c-4728-905c-393336011d05 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:54:50 np0005588920 nova_compute[226886]: 2026-01-20 14:54:50.829 226890 DEBUG nova.scheduler.client.report [None req-470246a5-eb6c-4728-905c-393336011d05 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:54:50 np0005588920 nova_compute[226886]: 2026-01-20 14:54:50.848 226890 DEBUG oslo_concurrency.lockutils [None req-470246a5-eb6c-4728-905c-393336011d05 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.604s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:50 np0005588920 nova_compute[226886]: 2026-01-20 14:54:50.874 226890 INFO nova.scheduler.client.report [None req-470246a5-eb6c-4728-905c-393336011d05 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Deleted allocations for instance 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b#033[00m
Jan 20 09:54:50 np0005588920 nova_compute[226886]: 2026-01-20 14:54:50.937 226890 DEBUG oslo_concurrency.lockutils [None req-470246a5-eb6c-4728-905c-393336011d05 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.327s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:51 np0005588920 nova_compute[226886]: 2026-01-20 14:54:51.039 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:51.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:51 np0005588920 nova_compute[226886]: 2026-01-20 14:54:51.327 226890 DEBUG nova.compute.manager [req-7a705a00-f74c-44f2-902b-e551d8c0bc6e req-53eb0847-9a23-41a6-b061-6dc6e88c36d7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Received event network-vif-plugged-362a0992-4e48-4999-a396-29fc2957fa09 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:54:51 np0005588920 nova_compute[226886]: 2026-01-20 14:54:51.328 226890 DEBUG oslo_concurrency.lockutils [req-7a705a00-f74c-44f2-902b-e551d8c0bc6e req-53eb0847-9a23-41a6-b061-6dc6e88c36d7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:51 np0005588920 nova_compute[226886]: 2026-01-20 14:54:51.328 226890 DEBUG oslo_concurrency.lockutils [req-7a705a00-f74c-44f2-902b-e551d8c0bc6e req-53eb0847-9a23-41a6-b061-6dc6e88c36d7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:51 np0005588920 nova_compute[226886]: 2026-01-20 14:54:51.328 226890 DEBUG oslo_concurrency.lockutils [req-7a705a00-f74c-44f2-902b-e551d8c0bc6e req-53eb0847-9a23-41a6-b061-6dc6e88c36d7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:51 np0005588920 nova_compute[226886]: 2026-01-20 14:54:51.329 226890 DEBUG nova.compute.manager [req-7a705a00-f74c-44f2-902b-e551d8c0bc6e req-53eb0847-9a23-41a6-b061-6dc6e88c36d7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] No waiting events found dispatching network-vif-plugged-362a0992-4e48-4999-a396-29fc2957fa09 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:54:51 np0005588920 nova_compute[226886]: 2026-01-20 14:54:51.329 226890 WARNING nova.compute.manager [req-7a705a00-f74c-44f2-902b-e551d8c0bc6e req-53eb0847-9a23-41a6-b061-6dc6e88c36d7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Received unexpected event network-vif-plugged-362a0992-4e48-4999-a396-29fc2957fa09 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 09:54:52 np0005588920 podman[271748]: 2026-01-20 14:54:52.009584813 +0000 UTC m=+0.093038091 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 20 09:54:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:52.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e300 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:54:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e301 e301: 3 total, 3 up, 3 in
Jan 20 09:54:53 np0005588920 nova_compute[226886]: 2026-01-20 14:54:53.014 226890 DEBUG oslo_concurrency.lockutils [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Acquiring lock "c8a40dd4-910a-4389-b1ef-48037cf3c09d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:53 np0005588920 nova_compute[226886]: 2026-01-20 14:54:53.015 226890 DEBUG oslo_concurrency.lockutils [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Lock "c8a40dd4-910a-4389-b1ef-48037cf3c09d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:53 np0005588920 nova_compute[226886]: 2026-01-20 14:54:53.035 226890 DEBUG nova.compute.manager [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:54:53 np0005588920 nova_compute[226886]: 2026-01-20 14:54:53.100 226890 DEBUG oslo_concurrency.lockutils [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:53 np0005588920 nova_compute[226886]: 2026-01-20 14:54:53.100 226890 DEBUG oslo_concurrency.lockutils [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:53 np0005588920 nova_compute[226886]: 2026-01-20 14:54:53.105 226890 DEBUG nova.virt.hardware [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:54:53 np0005588920 nova_compute[226886]: 2026-01-20 14:54:53.106 226890 INFO nova.compute.claims [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 09:54:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:53.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:53 np0005588920 nova_compute[226886]: 2026-01-20 14:54:53.449 226890 DEBUG oslo_concurrency.processutils [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:54:53 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:54:53 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/276321979' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:54:53 np0005588920 nova_compute[226886]: 2026-01-20 14:54:53.870 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:53 np0005588920 nova_compute[226886]: 2026-01-20 14:54:53.874 226890 DEBUG oslo_concurrency.processutils [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.425s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:54:53 np0005588920 nova_compute[226886]: 2026-01-20 14:54:53.880 226890 DEBUG nova.compute.provider_tree [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:54:53 np0005588920 nova_compute[226886]: 2026-01-20 14:54:53.896 226890 DEBUG nova.scheduler.client.report [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:54:53 np0005588920 nova_compute[226886]: 2026-01-20 14:54:53.930 226890 DEBUG oslo_concurrency.lockutils [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.830s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:53 np0005588920 nova_compute[226886]: 2026-01-20 14:54:53.931 226890 DEBUG nova.compute.manager [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:54:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:54.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:54 np0005588920 nova_compute[226886]: 2026-01-20 14:54:54.241 226890 DEBUG nova.compute.manager [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948#033[00m
Jan 20 09:54:54 np0005588920 nova_compute[226886]: 2026-01-20 14:54:54.300 226890 INFO nova.virt.libvirt.driver [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:54:54 np0005588920 nova_compute[226886]: 2026-01-20 14:54:54.322 226890 DEBUG nova.compute.manager [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:54:54 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [P] New memtable created with log file: #52. Immutable memtables: 0.
Jan 20 09:54:54 np0005588920 nova_compute[226886]: 2026-01-20 14:54:54.422 226890 DEBUG nova.compute.manager [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:54:54 np0005588920 nova_compute[226886]: 2026-01-20 14:54:54.424 226890 DEBUG nova.virt.libvirt.driver [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:54:54 np0005588920 nova_compute[226886]: 2026-01-20 14:54:54.425 226890 INFO nova.virt.libvirt.driver [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] Creating image(s)#033[00m
Jan 20 09:54:54 np0005588920 nova_compute[226886]: 2026-01-20 14:54:54.458 226890 DEBUG nova.storage.rbd_utils [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] rbd image c8a40dd4-910a-4389-b1ef-48037cf3c09d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:54:54 np0005588920 nova_compute[226886]: 2026-01-20 14:54:54.489 226890 DEBUG nova.storage.rbd_utils [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] rbd image c8a40dd4-910a-4389-b1ef-48037cf3c09d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:54:54 np0005588920 nova_compute[226886]: 2026-01-20 14:54:54.518 226890 DEBUG nova.storage.rbd_utils [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] rbd image c8a40dd4-910a-4389-b1ef-48037cf3c09d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:54:54 np0005588920 nova_compute[226886]: 2026-01-20 14:54:54.522 226890 DEBUG oslo_concurrency.processutils [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:54:54 np0005588920 nova_compute[226886]: 2026-01-20 14:54:54.583 226890 DEBUG oslo_concurrency.processutils [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:54:54 np0005588920 nova_compute[226886]: 2026-01-20 14:54:54.584 226890 DEBUG oslo_concurrency.lockutils [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:54 np0005588920 nova_compute[226886]: 2026-01-20 14:54:54.585 226890 DEBUG oslo_concurrency.lockutils [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:54 np0005588920 nova_compute[226886]: 2026-01-20 14:54:54.585 226890 DEBUG oslo_concurrency.lockutils [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:54 np0005588920 nova_compute[226886]: 2026-01-20 14:54:54.614 226890 DEBUG nova.storage.rbd_utils [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] rbd image c8a40dd4-910a-4389-b1ef-48037cf3c09d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:54:54 np0005588920 nova_compute[226886]: 2026-01-20 14:54:54.619 226890 DEBUG oslo_concurrency.processutils [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 c8a40dd4-910a-4389-b1ef-48037cf3c09d_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:54:54 np0005588920 nova_compute[226886]: 2026-01-20 14:54:54.978 226890 DEBUG oslo_concurrency.processutils [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 c8a40dd4-910a-4389-b1ef-48037cf3c09d_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.359s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:54:55 np0005588920 nova_compute[226886]: 2026-01-20 14:54:55.043 226890 DEBUG nova.storage.rbd_utils [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] resizing rbd image c8a40dd4-910a-4389-b1ef-48037cf3c09d_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:54:55 np0005588920 nova_compute[226886]: 2026-01-20 14:54:55.146 226890 DEBUG nova.objects.instance [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Lazy-loading 'migration_context' on Instance uuid c8a40dd4-910a-4389-b1ef-48037cf3c09d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:54:55 np0005588920 nova_compute[226886]: 2026-01-20 14:54:55.175 226890 DEBUG nova.virt.libvirt.driver [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:54:55 np0005588920 nova_compute[226886]: 2026-01-20 14:54:55.176 226890 DEBUG nova.virt.libvirt.driver [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] Ensure instance console log exists: /var/lib/nova/instances/c8a40dd4-910a-4389-b1ef-48037cf3c09d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:54:55 np0005588920 nova_compute[226886]: 2026-01-20 14:54:55.176 226890 DEBUG oslo_concurrency.lockutils [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:55 np0005588920 nova_compute[226886]: 2026-01-20 14:54:55.176 226890 DEBUG oslo_concurrency.lockutils [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:55 np0005588920 nova_compute[226886]: 2026-01-20 14:54:55.177 226890 DEBUG oslo_concurrency.lockutils [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:55 np0005588920 nova_compute[226886]: 2026-01-20 14:54:55.178 226890 DEBUG nova.virt.libvirt.driver [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:54:55 np0005588920 nova_compute[226886]: 2026-01-20 14:54:55.181 226890 WARNING nova.virt.libvirt.driver [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:54:55 np0005588920 nova_compute[226886]: 2026-01-20 14:54:55.187 226890 DEBUG nova.virt.libvirt.host [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:54:55 np0005588920 nova_compute[226886]: 2026-01-20 14:54:55.187 226890 DEBUG nova.virt.libvirt.host [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:54:55 np0005588920 nova_compute[226886]: 2026-01-20 14:54:55.191 226890 DEBUG nova.virt.libvirt.host [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:54:55 np0005588920 nova_compute[226886]: 2026-01-20 14:54:55.192 226890 DEBUG nova.virt.libvirt.host [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:54:55 np0005588920 nova_compute[226886]: 2026-01-20 14:54:55.193 226890 DEBUG nova.virt.libvirt.driver [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:54:55 np0005588920 nova_compute[226886]: 2026-01-20 14:54:55.194 226890 DEBUG nova.virt.hardware [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:54:55 np0005588920 nova_compute[226886]: 2026-01-20 14:54:55.194 226890 DEBUG nova.virt.hardware [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:54:55 np0005588920 nova_compute[226886]: 2026-01-20 14:54:55.194 226890 DEBUG nova.virt.hardware [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:54:55 np0005588920 nova_compute[226886]: 2026-01-20 14:54:55.194 226890 DEBUG nova.virt.hardware [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:54:55 np0005588920 nova_compute[226886]: 2026-01-20 14:54:55.195 226890 DEBUG nova.virt.hardware [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:54:55 np0005588920 nova_compute[226886]: 2026-01-20 14:54:55.195 226890 DEBUG nova.virt.hardware [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:54:55 np0005588920 nova_compute[226886]: 2026-01-20 14:54:55.195 226890 DEBUG nova.virt.hardware [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:54:55 np0005588920 nova_compute[226886]: 2026-01-20 14:54:55.195 226890 DEBUG nova.virt.hardware [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:54:55 np0005588920 nova_compute[226886]: 2026-01-20 14:54:55.196 226890 DEBUG nova.virt.hardware [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:54:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:55 np0005588920 nova_compute[226886]: 2026-01-20 14:54:55.196 226890 DEBUG nova.virt.hardware [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:54:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:55.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:55 np0005588920 nova_compute[226886]: 2026-01-20 14:54:55.196 226890 DEBUG nova.virt.hardware [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:54:55 np0005588920 nova_compute[226886]: 2026-01-20 14:54:55.199 226890 DEBUG oslo_concurrency.processutils [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:54:55 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:54:55 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2457285435' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:54:55 np0005588920 nova_compute[226886]: 2026-01-20 14:54:55.610 226890 DEBUG oslo_concurrency.processutils [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.410s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:54:55 np0005588920 nova_compute[226886]: 2026-01-20 14:54:55.632 226890 DEBUG nova.storage.rbd_utils [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] rbd image c8a40dd4-910a-4389-b1ef-48037cf3c09d_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:54:55 np0005588920 nova_compute[226886]: 2026-01-20 14:54:55.635 226890 DEBUG oslo_concurrency.processutils [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:54:56 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:54:56 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3520046388' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:54:56 np0005588920 nova_compute[226886]: 2026-01-20 14:54:56.042 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:56 np0005588920 nova_compute[226886]: 2026-01-20 14:54:56.058 226890 DEBUG oslo_concurrency.processutils [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:54:56 np0005588920 nova_compute[226886]: 2026-01-20 14:54:56.061 226890 DEBUG nova.objects.instance [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Lazy-loading 'pci_devices' on Instance uuid c8a40dd4-910a-4389-b1ef-48037cf3c09d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:54:56 np0005588920 nova_compute[226886]: 2026-01-20 14:54:56.080 226890 DEBUG nova.virt.libvirt.driver [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:54:56 np0005588920 nova_compute[226886]:  <uuid>c8a40dd4-910a-4389-b1ef-48037cf3c09d</uuid>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:  <name>instance-0000007d</name>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:54:56 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:      <nova:name>tempest-ServersAaction247Test-server-581292439</nova:name>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:54:55</nova:creationTime>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:54:56 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:        <nova:user uuid="50108de2458b4a0988111e0a0e271937">tempest-ServersAaction247Test-136177538-project-member</nova:user>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:        <nova:project uuid="45810a5a04b6407f90b37499538f63ba">tempest-ServersAaction247Test-136177538</nova:project>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:      <nova:ports/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:      <entry name="serial">c8a40dd4-910a-4389-b1ef-48037cf3c09d</entry>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:      <entry name="uuid">c8a40dd4-910a-4389-b1ef-48037cf3c09d</entry>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:54:56 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/c8a40dd4-910a-4389-b1ef-48037cf3c09d_disk">
Jan 20 09:54:56 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:54:56 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:54:56 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/c8a40dd4-910a-4389-b1ef-48037cf3c09d_disk.config">
Jan 20 09:54:56 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:54:56 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:54:56 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/c8a40dd4-910a-4389-b1ef-48037cf3c09d/console.log" append="off"/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:54:56 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:54:56 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:54:56 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:54:56 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:54:56 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:54:56 np0005588920 nova_compute[226886]: 2026-01-20 14:54:56.137 226890 DEBUG nova.virt.libvirt.driver [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:54:56 np0005588920 nova_compute[226886]: 2026-01-20 14:54:56.137 226890 DEBUG nova.virt.libvirt.driver [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:54:56 np0005588920 nova_compute[226886]: 2026-01-20 14:54:56.138 226890 INFO nova.virt.libvirt.driver [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] Using config drive#033[00m
Jan 20 09:54:56 np0005588920 nova_compute[226886]: 2026-01-20 14:54:56.162 226890 DEBUG nova.storage.rbd_utils [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] rbd image c8a40dd4-910a-4389-b1ef-48037cf3c09d_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:54:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:56.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:56 np0005588920 nova_compute[226886]: 2026-01-20 14:54:56.513 226890 INFO nova.virt.libvirt.driver [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] Creating config drive at /var/lib/nova/instances/c8a40dd4-910a-4389-b1ef-48037cf3c09d/disk.config#033[00m
Jan 20 09:54:56 np0005588920 nova_compute[226886]: 2026-01-20 14:54:56.518 226890 DEBUG oslo_concurrency.processutils [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c8a40dd4-910a-4389-b1ef-48037cf3c09d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfwycbmng execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:54:56 np0005588920 nova_compute[226886]: 2026-01-20 14:54:56.651 226890 DEBUG oslo_concurrency.processutils [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c8a40dd4-910a-4389-b1ef-48037cf3c09d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfwycbmng" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:54:56 np0005588920 nova_compute[226886]: 2026-01-20 14:54:56.683 226890 DEBUG nova.storage.rbd_utils [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] rbd image c8a40dd4-910a-4389-b1ef-48037cf3c09d_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:54:56 np0005588920 nova_compute[226886]: 2026-01-20 14:54:56.689 226890 DEBUG oslo_concurrency.processutils [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/c8a40dd4-910a-4389-b1ef-48037cf3c09d/disk.config c8a40dd4-910a-4389-b1ef-48037cf3c09d_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:54:56 np0005588920 nova_compute[226886]: 2026-01-20 14:54:56.852 226890 DEBUG oslo_concurrency.processutils [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/c8a40dd4-910a-4389-b1ef-48037cf3c09d/disk.config c8a40dd4-910a-4389-b1ef-48037cf3c09d_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.163s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:54:56 np0005588920 nova_compute[226886]: 2026-01-20 14:54:56.853 226890 INFO nova.virt.libvirt.driver [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] Deleting local config drive /var/lib/nova/instances/c8a40dd4-910a-4389-b1ef-48037cf3c09d/disk.config because it was imported into RBD.#033[00m
Jan 20 09:54:56 np0005588920 systemd-machined[196121]: New machine qemu-58-instance-0000007d.
Jan 20 09:54:56 np0005588920 systemd[1]: Started Virtual Machine qemu-58-instance-0000007d.
Jan 20 09:54:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:57.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:57 np0005588920 nova_compute[226886]: 2026-01-20 14:54:57.333 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920897.332855, c8a40dd4-910a-4389-b1ef-48037cf3c09d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:54:57 np0005588920 nova_compute[226886]: 2026-01-20 14:54:57.333 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:54:57 np0005588920 nova_compute[226886]: 2026-01-20 14:54:57.336 226890 DEBUG nova.compute.manager [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:54:57 np0005588920 nova_compute[226886]: 2026-01-20 14:54:57.336 226890 DEBUG nova.virt.libvirt.driver [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:54:57 np0005588920 nova_compute[226886]: 2026-01-20 14:54:57.340 226890 INFO nova.virt.libvirt.driver [-] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] Instance spawned successfully.
Jan 20 09:54:57 np0005588920 nova_compute[226886]: 2026-01-20 14:54:57.341 226890 DEBUG nova.virt.libvirt.driver [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 20 09:54:57 np0005588920 nova_compute[226886]: 2026-01-20 14:54:57.366 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 09:54:57 np0005588920 nova_compute[226886]: 2026-01-20 14:54:57.373 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 20 09:54:57 np0005588920 nova_compute[226886]: 2026-01-20 14:54:57.377 226890 DEBUG nova.virt.libvirt.driver [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 09:54:57 np0005588920 nova_compute[226886]: 2026-01-20 14:54:57.377 226890 DEBUG nova.virt.libvirt.driver [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 09:54:57 np0005588920 nova_compute[226886]: 2026-01-20 14:54:57.378 226890 DEBUG nova.virt.libvirt.driver [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 09:54:57 np0005588920 nova_compute[226886]: 2026-01-20 14:54:57.378 226890 DEBUG nova.virt.libvirt.driver [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 09:54:57 np0005588920 nova_compute[226886]: 2026-01-20 14:54:57.379 226890 DEBUG nova.virt.libvirt.driver [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 09:54:57 np0005588920 nova_compute[226886]: 2026-01-20 14:54:57.379 226890 DEBUG nova.virt.libvirt.driver [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 09:54:57 np0005588920 nova_compute[226886]: 2026-01-20 14:54:57.416 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 20 09:54:57 np0005588920 nova_compute[226886]: 2026-01-20 14:54:57.416 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920897.3353097, c8a40dd4-910a-4389-b1ef-48037cf3c09d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 20 09:54:57 np0005588920 nova_compute[226886]: 2026-01-20 14:54:57.416 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] VM Started (Lifecycle Event)
Jan 20 09:54:57 np0005588920 nova_compute[226886]: 2026-01-20 14:54:57.450 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 09:54:57 np0005588920 nova_compute[226886]: 2026-01-20 14:54:57.454 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 20 09:54:57 np0005588920 nova_compute[226886]: 2026-01-20 14:54:57.459 226890 INFO nova.compute.manager [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] Took 3.04 seconds to spawn the instance on the hypervisor.
Jan 20 09:54:57 np0005588920 nova_compute[226886]: 2026-01-20 14:54:57.460 226890 DEBUG nova.compute.manager [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 09:54:57 np0005588920 nova_compute[226886]: 2026-01-20 14:54:57.471 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 20 09:54:57 np0005588920 nova_compute[226886]: 2026-01-20 14:54:57.509 226890 INFO nova.compute.manager [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] Took 4.42 seconds to build instance.
Jan 20 09:54:57 np0005588920 nova_compute[226886]: 2026-01-20 14:54:57.537 226890 DEBUG oslo_concurrency.lockutils [None req-0d806f3f-748b-44d3-bc72-56cae229f7a8 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Lock "c8a40dd4-910a-4389-b1ef-48037cf3c09d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.522s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:54:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e301 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:54:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:58.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:58 np0005588920 nova_compute[226886]: 2026-01-20 14:54:58.873 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:54:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:54:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:59.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:59 np0005588920 nova_compute[226886]: 2026-01-20 14:54:59.253 226890 DEBUG nova.compute.manager [None req-ed3a631e-b291-4a54-83f4-35f25a318c0f 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 09:54:59 np0005588920 nova_compute[226886]: 2026-01-20 14:54:59.313 226890 INFO nova.compute.manager [None req-ed3a631e-b291-4a54-83f4-35f25a318c0f 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] instance snapshotting
Jan 20 09:54:59 np0005588920 nova_compute[226886]: 2026-01-20 14:54:59.314 226890 DEBUG nova.objects.instance [None req-ed3a631e-b291-4a54-83f4-35f25a318c0f 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Lazy-loading 'flavor' on Instance uuid c8a40dd4-910a-4389-b1ef-48037cf3c09d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 09:54:59 np0005588920 nova_compute[226886]: 2026-01-20 14:54:59.437 226890 DEBUG oslo_concurrency.lockutils [None req-58b41c6b-4049-440d-b96e-146b2fac3b27 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Acquiring lock "c8a40dd4-910a-4389-b1ef-48037cf3c09d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:54:59 np0005588920 nova_compute[226886]: 2026-01-20 14:54:59.438 226890 DEBUG oslo_concurrency.lockutils [None req-58b41c6b-4049-440d-b96e-146b2fac3b27 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Lock "c8a40dd4-910a-4389-b1ef-48037cf3c09d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:54:59 np0005588920 nova_compute[226886]: 2026-01-20 14:54:59.438 226890 DEBUG oslo_concurrency.lockutils [None req-58b41c6b-4049-440d-b96e-146b2fac3b27 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Acquiring lock "c8a40dd4-910a-4389-b1ef-48037cf3c09d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:54:59 np0005588920 nova_compute[226886]: 2026-01-20 14:54:59.439 226890 DEBUG oslo_concurrency.lockutils [None req-58b41c6b-4049-440d-b96e-146b2fac3b27 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Lock "c8a40dd4-910a-4389-b1ef-48037cf3c09d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:54:59 np0005588920 nova_compute[226886]: 2026-01-20 14:54:59.439 226890 DEBUG oslo_concurrency.lockutils [None req-58b41c6b-4049-440d-b96e-146b2fac3b27 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Lock "c8a40dd4-910a-4389-b1ef-48037cf3c09d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:54:59 np0005588920 nova_compute[226886]: 2026-01-20 14:54:59.440 226890 INFO nova.compute.manager [None req-58b41c6b-4049-440d-b96e-146b2fac3b27 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] Terminating instance
Jan 20 09:54:59 np0005588920 nova_compute[226886]: 2026-01-20 14:54:59.441 226890 DEBUG oslo_concurrency.lockutils [None req-58b41c6b-4049-440d-b96e-146b2fac3b27 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Acquiring lock "refresh_cache-c8a40dd4-910a-4389-b1ef-48037cf3c09d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 20 09:54:59 np0005588920 nova_compute[226886]: 2026-01-20 14:54:59.441 226890 DEBUG oslo_concurrency.lockutils [None req-58b41c6b-4049-440d-b96e-146b2fac3b27 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Acquired lock "refresh_cache-c8a40dd4-910a-4389-b1ef-48037cf3c09d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 20 09:54:59 np0005588920 nova_compute[226886]: 2026-01-20 14:54:59.441 226890 DEBUG nova.network.neutron [None req-58b41c6b-4049-440d-b96e-146b2fac3b27 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 20 09:54:59 np0005588920 nova_compute[226886]: 2026-01-20 14:54:59.661 226890 DEBUG nova.network.neutron [None req-58b41c6b-4049-440d-b96e-146b2fac3b27 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 20 09:54:59 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e302 e302: 3 total, 3 up, 3 in
Jan 20 09:54:59 np0005588920 nova_compute[226886]: 2026-01-20 14:54:59.738 226890 INFO nova.virt.libvirt.driver [None req-ed3a631e-b291-4a54-83f4-35f25a318c0f 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] Beginning live snapshot process
Jan 20 09:54:59 np0005588920 nova_compute[226886]: 2026-01-20 14:54:59.801 226890 DEBUG nova.compute.manager [None req-ed3a631e-b291-4a54-83f4-35f25a318c0f 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] Instance disappeared during snapshot _snapshot_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:4390
Jan 20 09:54:59 np0005588920 nova_compute[226886]: 2026-01-20 14:54:59.929 226890 DEBUG nova.network.neutron [None req-58b41c6b-4049-440d-b96e-146b2fac3b27 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 20 09:54:59 np0005588920 nova_compute[226886]: 2026-01-20 14:54:59.943 226890 DEBUG oslo_concurrency.lockutils [None req-58b41c6b-4049-440d-b96e-146b2fac3b27 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Releasing lock "refresh_cache-c8a40dd4-910a-4389-b1ef-48037cf3c09d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 20 09:54:59 np0005588920 nova_compute[226886]: 2026-01-20 14:54:59.944 226890 DEBUG nova.compute.manager [None req-58b41c6b-4049-440d-b96e-146b2fac3b27 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 20 09:55:00 np0005588920 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d0000007d.scope: Deactivated successfully.
Jan 20 09:55:00 np0005588920 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d0000007d.scope: Consumed 3.138s CPU time.
Jan 20 09:55:00 np0005588920 systemd-machined[196121]: Machine qemu-58-instance-0000007d terminated.
Jan 20 09:55:00 np0005588920 nova_compute[226886]: 2026-01-20 14:55:00.162 226890 INFO nova.virt.libvirt.driver [-] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] Instance destroyed successfully.
Jan 20 09:55:00 np0005588920 nova_compute[226886]: 2026-01-20 14:55:00.162 226890 DEBUG nova.objects.instance [None req-58b41c6b-4049-440d-b96e-146b2fac3b27 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Lazy-loading 'resources' on Instance uuid c8a40dd4-910a-4389-b1ef-48037cf3c09d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 09:55:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:00.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:00 np0005588920 nova_compute[226886]: 2026-01-20 14:55:00.436 226890 DEBUG nova.compute.manager [None req-ed3a631e-b291-4a54-83f4-35f25a318c0f 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] Found 0 images (rotation: 2) _rotate_backups /usr/lib/python3.9/site-packages/nova/compute/manager.py:4450
Jan 20 09:55:00 np0005588920 nova_compute[226886]: 2026-01-20 14:55:00.564 226890 INFO nova.virt.libvirt.driver [None req-58b41c6b-4049-440d-b96e-146b2fac3b27 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] Deleting instance files /var/lib/nova/instances/c8a40dd4-910a-4389-b1ef-48037cf3c09d_del
Jan 20 09:55:00 np0005588920 nova_compute[226886]: 2026-01-20 14:55:00.565 226890 INFO nova.virt.libvirt.driver [None req-58b41c6b-4049-440d-b96e-146b2fac3b27 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] Deletion of /var/lib/nova/instances/c8a40dd4-910a-4389-b1ef-48037cf3c09d_del complete
Jan 20 09:55:00 np0005588920 nova_compute[226886]: 2026-01-20 14:55:00.634 226890 INFO nova.compute.manager [None req-58b41c6b-4049-440d-b96e-146b2fac3b27 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] Took 0.69 seconds to destroy the instance on the hypervisor.
Jan 20 09:55:00 np0005588920 nova_compute[226886]: 2026-01-20 14:55:00.635 226890 DEBUG oslo.service.loopingcall [None req-58b41c6b-4049-440d-b96e-146b2fac3b27 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 20 09:55:00 np0005588920 nova_compute[226886]: 2026-01-20 14:55:00.635 226890 DEBUG nova.compute.manager [-] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 20 09:55:00 np0005588920 nova_compute[226886]: 2026-01-20 14:55:00.636 226890 DEBUG nova.network.neutron [-] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 20 09:55:00 np0005588920 nova_compute[226886]: 2026-01-20 14:55:00.848 226890 DEBUG nova.network.neutron [-] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 20 09:55:00 np0005588920 nova_compute[226886]: 2026-01-20 14:55:00.863 226890 DEBUG nova.network.neutron [-] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 20 09:55:00 np0005588920 nova_compute[226886]: 2026-01-20 14:55:00.878 226890 INFO nova.compute.manager [-] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] Took 0.24 seconds to deallocate network for instance.
Jan 20 09:55:00 np0005588920 nova_compute[226886]: 2026-01-20 14:55:00.914 226890 DEBUG oslo_concurrency.lockutils [None req-58b41c6b-4049-440d-b96e-146b2fac3b27 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:55:00 np0005588920 nova_compute[226886]: 2026-01-20 14:55:00.915 226890 DEBUG oslo_concurrency.lockutils [None req-58b41c6b-4049-440d-b96e-146b2fac3b27 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:55:01 np0005588920 nova_compute[226886]: 2026-01-20 14:55:01.028 226890 DEBUG oslo_concurrency.processutils [None req-58b41c6b-4049-440d-b96e-146b2fac3b27 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:55:01 np0005588920 nova_compute[226886]: 2026-01-20 14:55:01.058 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:55:01 np0005588920 podman[272165]: 2026-01-20 14:55:01.177897324 +0000 UTC m=+0.058784727 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 20 09:55:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:01.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:01 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:55:01 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1360851754' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:55:01 np0005588920 nova_compute[226886]: 2026-01-20 14:55:01.561 226890 DEBUG oslo_concurrency.processutils [None req-58b41c6b-4049-440d-b96e-146b2fac3b27 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.533s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:55:01 np0005588920 nova_compute[226886]: 2026-01-20 14:55:01.567 226890 DEBUG nova.compute.provider_tree [None req-58b41c6b-4049-440d-b96e-146b2fac3b27 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 20 09:55:01 np0005588920 nova_compute[226886]: 2026-01-20 14:55:01.650 226890 DEBUG nova.scheduler.client.report [None req-58b41c6b-4049-440d-b96e-146b2fac3b27 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 20 09:55:01 np0005588920 nova_compute[226886]: 2026-01-20 14:55:01.684 226890 DEBUG oslo_concurrency.lockutils [None req-58b41c6b-4049-440d-b96e-146b2fac3b27 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.769s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:55:01 np0005588920 nova_compute[226886]: 2026-01-20 14:55:01.714 226890 INFO nova.scheduler.client.report [None req-58b41c6b-4049-440d-b96e-146b2fac3b27 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Deleted allocations for instance c8a40dd4-910a-4389-b1ef-48037cf3c09d
Jan 20 09:55:01 np0005588920 ovn_controller[133971]: 2026-01-20T14:55:01Z|00589|binding|INFO|Releasing lport 3fa2df7b-42b2-4a3b-a33b-ab37b5d6aef3 from this chassis (sb_readonly=0)
Jan 20 09:55:01 np0005588920 nova_compute[226886]: 2026-01-20 14:55:01.797 226890 DEBUG oslo_concurrency.lockutils [None req-58b41c6b-4049-440d-b96e-146b2fac3b27 50108de2458b4a0988111e0a0e271937 45810a5a04b6407f90b37499538f63ba - - default default] Lock "c8a40dd4-910a-4389-b1ef-48037cf3c09d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.359s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:55:01 np0005588920 nova_compute[226886]: 2026-01-20 14:55:01.812 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:55:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:02.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e302 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:55:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:03.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:03 np0005588920 nova_compute[226886]: 2026-01-20 14:55:03.843 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768920888.842084, 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 20 09:55:03 np0005588920 nova_compute[226886]: 2026-01-20 14:55:03.844 226890 INFO nova.compute.manager [-] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] VM Stopped (Lifecycle Event)
Jan 20 09:55:03 np0005588920 nova_compute[226886]: 2026-01-20 14:55:03.875 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:55:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:04.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:04 np0005588920 nova_compute[226886]: 2026-01-20 14:55:04.850 226890 DEBUG nova.compute.manager [None req-06480404-fa4b-469c-b4c9-b3dd563b229c - - - - - -] [instance: 7dbd47ad-b9b3-4861-8a8b-f9051fe7a27b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 09:55:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:05.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:06 np0005588920 nova_compute[226886]: 2026-01-20 14:55:06.044 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:55:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:06.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:55:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:07.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:55:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e302 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:55:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:08.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:08 np0005588920 nova_compute[226886]: 2026-01-20 14:55:08.878 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:09.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:10.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:11 np0005588920 nova_compute[226886]: 2026-01-20 14:55:11.046 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:11.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:55:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:12.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:55:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e302 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:55:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:13.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 09:55:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2253935145' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 09:55:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 09:55:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2253935145' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 09:55:13 np0005588920 nova_compute[226886]: 2026-01-20 14:55:13.882 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:55:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:14.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:55:15 np0005588920 nova_compute[226886]: 2026-01-20 14:55:15.160 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768920900.1601586, c8a40dd4-910a-4389-b1ef-48037cf3c09d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:55:15 np0005588920 nova_compute[226886]: 2026-01-20 14:55:15.161 226890 INFO nova.compute.manager [-] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:55:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:15.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:15 np0005588920 nova_compute[226886]: 2026-01-20 14:55:15.409 226890 DEBUG nova.compute.manager [None req-9d8384c3-115c-49c0-a4ef-9df8213d0805 - - - - - -] [instance: c8a40dd4-910a-4389-b1ef-48037cf3c09d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:55:15 np0005588920 nova_compute[226886]: 2026-01-20 14:55:15.454 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:16 np0005588920 nova_compute[226886]: 2026-01-20 14:55:16.048 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:16.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:16.457 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:55:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:16.458 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:55:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:16.458 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:55:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:55:17 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/784021205' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:55:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:17.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e302 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:55:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:18.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:18 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:55:18 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:55:18 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:55:18 np0005588920 nova_compute[226886]: 2026-01-20 14:55:18.886 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:55:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:19.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:55:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:20.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:20 np0005588920 nova_compute[226886]: 2026-01-20 14:55:20.230 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:21 np0005588920 nova_compute[226886]: 2026-01-20 14:55:21.050 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:21 np0005588920 nova_compute[226886]: 2026-01-20 14:55:21.113 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:55:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:21.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:55:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:22.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e302 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:55:22 np0005588920 podman[272338]: 2026-01-20 14:55:22.986094539 +0000 UTC m=+0.076830456 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:55:23 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e303 e303: 3 total, 3 up, 3 in
Jan 20 09:55:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:55:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:23.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:55:23 np0005588920 nova_compute[226886]: 2026-01-20 14:55:23.888 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:24 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e304 e304: 3 total, 3 up, 3 in
Jan 20 09:55:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:24.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:24 np0005588920 nova_compute[226886]: 2026-01-20 14:55:24.458 226890 DEBUG oslo_concurrency.lockutils [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Acquiring lock "68e2c62d-7883-4f68-a2c6-da2265b01c93" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:55:24 np0005588920 nova_compute[226886]: 2026-01-20 14:55:24.458 226890 DEBUG oslo_concurrency.lockutils [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Lock "68e2c62d-7883-4f68-a2c6-da2265b01c93" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:55:24 np0005588920 nova_compute[226886]: 2026-01-20 14:55:24.510 226890 DEBUG nova.compute.manager [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:55:24 np0005588920 nova_compute[226886]: 2026-01-20 14:55:24.668 226890 DEBUG oslo_concurrency.lockutils [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:55:24 np0005588920 nova_compute[226886]: 2026-01-20 14:55:24.669 226890 DEBUG oslo_concurrency.lockutils [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:55:24 np0005588920 nova_compute[226886]: 2026-01-20 14:55:24.676 226890 DEBUG nova.virt.hardware [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:55:24 np0005588920 nova_compute[226886]: 2026-01-20 14:55:24.676 226890 INFO nova.compute.claims [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 09:55:24 np0005588920 nova_compute[226886]: 2026-01-20 14:55:24.894 226890 DEBUG oslo_concurrency.processutils [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:55:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:55:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:25.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:55:25 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:55:25 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:55:25 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:55:25 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2689986105' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:55:25 np0005588920 nova_compute[226886]: 2026-01-20 14:55:25.330 226890 DEBUG oslo_concurrency.processutils [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:55:25 np0005588920 nova_compute[226886]: 2026-01-20 14:55:25.336 226890 DEBUG nova.compute.provider_tree [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:55:25 np0005588920 nova_compute[226886]: 2026-01-20 14:55:25.454 226890 DEBUG nova.scheduler.client.report [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:55:25 np0005588920 nova_compute[226886]: 2026-01-20 14:55:25.467 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:25 np0005588920 nova_compute[226886]: 2026-01-20 14:55:25.485 226890 DEBUG oslo_concurrency.lockutils [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.816s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:55:25 np0005588920 nova_compute[226886]: 2026-01-20 14:55:25.486 226890 DEBUG nova.compute.manager [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:55:25 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e305 e305: 3 total, 3 up, 3 in
Jan 20 09:55:25 np0005588920 nova_compute[226886]: 2026-01-20 14:55:25.575 226890 DEBUG nova.compute.manager [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:55:25 np0005588920 nova_compute[226886]: 2026-01-20 14:55:25.576 226890 DEBUG nova.network.neutron [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:55:25 np0005588920 nova_compute[226886]: 2026-01-20 14:55:25.606 226890 INFO nova.virt.libvirt.driver [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:55:25 np0005588920 nova_compute[226886]: 2026-01-20 14:55:25.895 226890 DEBUG nova.compute.manager [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:55:26 np0005588920 nova_compute[226886]: 2026-01-20 14:55:26.016 226890 DEBUG nova.compute.manager [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:55:26 np0005588920 nova_compute[226886]: 2026-01-20 14:55:26.017 226890 DEBUG nova.virt.libvirt.driver [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:55:26 np0005588920 nova_compute[226886]: 2026-01-20 14:55:26.017 226890 INFO nova.virt.libvirt.driver [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Creating image(s)#033[00m
Jan 20 09:55:26 np0005588920 nova_compute[226886]: 2026-01-20 14:55:26.045 226890 DEBUG nova.storage.rbd_utils [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] rbd image 68e2c62d-7883-4f68-a2c6-da2265b01c93_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:55:26 np0005588920 nova_compute[226886]: 2026-01-20 14:55:26.075 226890 DEBUG nova.storage.rbd_utils [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] rbd image 68e2c62d-7883-4f68-a2c6-da2265b01c93_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:55:26 np0005588920 nova_compute[226886]: 2026-01-20 14:55:26.103 226890 DEBUG nova.storage.rbd_utils [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] rbd image 68e2c62d-7883-4f68-a2c6-da2265b01c93_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:55:26 np0005588920 nova_compute[226886]: 2026-01-20 14:55:26.108 226890 DEBUG oslo_concurrency.processutils [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:55:26 np0005588920 nova_compute[226886]: 2026-01-20 14:55:26.132 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:26 np0005588920 nova_compute[226886]: 2026-01-20 14:55:26.175 226890 DEBUG oslo_concurrency.processutils [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:55:26 np0005588920 nova_compute[226886]: 2026-01-20 14:55:26.176 226890 DEBUG oslo_concurrency.lockutils [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:55:26 np0005588920 nova_compute[226886]: 2026-01-20 14:55:26.176 226890 DEBUG oslo_concurrency.lockutils [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:55:26 np0005588920 nova_compute[226886]: 2026-01-20 14:55:26.177 226890 DEBUG oslo_concurrency.lockutils [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:55:26 np0005588920 nova_compute[226886]: 2026-01-20 14:55:26.201 226890 DEBUG nova.storage.rbd_utils [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] rbd image 68e2c62d-7883-4f68-a2c6-da2265b01c93_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:55:26 np0005588920 nova_compute[226886]: 2026-01-20 14:55:26.204 226890 DEBUG oslo_concurrency.processutils [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 68e2c62d-7883-4f68-a2c6-da2265b01c93_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:55:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:26.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:26 np0005588920 nova_compute[226886]: 2026-01-20 14:55:26.234 226890 DEBUG nova.policy [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '87bcc22682984b40b43e0246ea142695', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ec388b65a7fc480f99d0ceb5451725ea', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:55:26 np0005588920 nova_compute[226886]: 2026-01-20 14:55:26.700 226890 DEBUG oslo_concurrency.processutils [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 68e2c62d-7883-4f68-a2c6-da2265b01c93_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.496s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:55:26 np0005588920 nova_compute[226886]: 2026-01-20 14:55:26.781 226890 DEBUG nova.storage.rbd_utils [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] resizing rbd image 68e2c62d-7883-4f68-a2c6-da2265b01c93_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:55:26 np0005588920 nova_compute[226886]: 2026-01-20 14:55:26.907 226890 DEBUG nova.objects.instance [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Lazy-loading 'migration_context' on Instance uuid 68e2c62d-7883-4f68-a2c6-da2265b01c93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:55:27 np0005588920 nova_compute[226886]: 2026-01-20 14:55:27.093 226890 DEBUG nova.virt.libvirt.driver [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:55:27 np0005588920 nova_compute[226886]: 2026-01-20 14:55:27.093 226890 DEBUG nova.virt.libvirt.driver [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Ensure instance console log exists: /var/lib/nova/instances/68e2c62d-7883-4f68-a2c6-da2265b01c93/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:55:27 np0005588920 nova_compute[226886]: 2026-01-20 14:55:27.094 226890 DEBUG oslo_concurrency.lockutils [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:55:27 np0005588920 nova_compute[226886]: 2026-01-20 14:55:27.094 226890 DEBUG oslo_concurrency.lockutils [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:55:27 np0005588920 nova_compute[226886]: 2026-01-20 14:55:27.094 226890 DEBUG oslo_concurrency.lockutils [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:55:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:55:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:27.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:55:27 np0005588920 nova_compute[226886]: 2026-01-20 14:55:27.415 226890 DEBUG nova.network.neutron [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Successfully created port: c62c62c5-b0d3-4c19-bbf8-453f86405984 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:55:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e305 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:55:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:28.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:28 np0005588920 nova_compute[226886]: 2026-01-20 14:55:28.704 226890 DEBUG nova.network.neutron [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Successfully updated port: c62c62c5-b0d3-4c19-bbf8-453f86405984 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:55:28 np0005588920 nova_compute[226886]: 2026-01-20 14:55:28.729 226890 DEBUG oslo_concurrency.lockutils [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Acquiring lock "refresh_cache-68e2c62d-7883-4f68-a2c6-da2265b01c93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:55:28 np0005588920 nova_compute[226886]: 2026-01-20 14:55:28.729 226890 DEBUG oslo_concurrency.lockutils [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Acquired lock "refresh_cache-68e2c62d-7883-4f68-a2c6-da2265b01c93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:55:28 np0005588920 nova_compute[226886]: 2026-01-20 14:55:28.729 226890 DEBUG nova.network.neutron [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:55:28 np0005588920 nova_compute[226886]: 2026-01-20 14:55:28.869 226890 DEBUG nova.compute.manager [req-ff2192f3-99c3-49b3-896b-6da906b4a523 req-37759d9e-8a90-4159-a1ea-6100d848cecd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Received event network-changed-c62c62c5-b0d3-4c19-bbf8-453f86405984 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:55:28 np0005588920 nova_compute[226886]: 2026-01-20 14:55:28.870 226890 DEBUG nova.compute.manager [req-ff2192f3-99c3-49b3-896b-6da906b4a523 req-37759d9e-8a90-4159-a1ea-6100d848cecd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Refreshing instance network info cache due to event network-changed-c62c62c5-b0d3-4c19-bbf8-453f86405984. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:55:28 np0005588920 nova_compute[226886]: 2026-01-20 14:55:28.870 226890 DEBUG oslo_concurrency.lockutils [req-ff2192f3-99c3-49b3-896b-6da906b4a523 req-37759d9e-8a90-4159-a1ea-6100d848cecd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-68e2c62d-7883-4f68-a2c6-da2265b01c93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:55:28 np0005588920 nova_compute[226886]: 2026-01-20 14:55:28.892 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:29 np0005588920 nova_compute[226886]: 2026-01-20 14:55:29.236 226890 DEBUG nova.network.neutron [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:55:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:29.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:55:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:30.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:55:30 np0005588920 nova_compute[226886]: 2026-01-20 14:55:30.539 226890 DEBUG nova.network.neutron [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Updating instance_info_cache with network_info: [{"id": "c62c62c5-b0d3-4c19-bbf8-453f86405984", "address": "fa:16:3e:1a:c8:4a", "network": {"id": "e572bd57-633e-4abc-ba06-33f2d3fe513c", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-412148297-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec388b65a7fc480f99d0ceb5451725ea", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc62c62c5-b0", "ovs_interfaceid": "c62c62c5-b0d3-4c19-bbf8-453f86405984", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:55:30 np0005588920 nova_compute[226886]: 2026-01-20 14:55:30.565 226890 DEBUG oslo_concurrency.lockutils [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Releasing lock "refresh_cache-68e2c62d-7883-4f68-a2c6-da2265b01c93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:55:30 np0005588920 nova_compute[226886]: 2026-01-20 14:55:30.565 226890 DEBUG nova.compute.manager [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Instance network_info: |[{"id": "c62c62c5-b0d3-4c19-bbf8-453f86405984", "address": "fa:16:3e:1a:c8:4a", "network": {"id": "e572bd57-633e-4abc-ba06-33f2d3fe513c", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-412148297-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec388b65a7fc480f99d0ceb5451725ea", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc62c62c5-b0", "ovs_interfaceid": "c62c62c5-b0d3-4c19-bbf8-453f86405984", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:55:30 np0005588920 nova_compute[226886]: 2026-01-20 14:55:30.566 226890 DEBUG oslo_concurrency.lockutils [req-ff2192f3-99c3-49b3-896b-6da906b4a523 req-37759d9e-8a90-4159-a1ea-6100d848cecd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-68e2c62d-7883-4f68-a2c6-da2265b01c93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:55:30 np0005588920 nova_compute[226886]: 2026-01-20 14:55:30.566 226890 DEBUG nova.network.neutron [req-ff2192f3-99c3-49b3-896b-6da906b4a523 req-37759d9e-8a90-4159-a1ea-6100d848cecd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Refreshing network info cache for port c62c62c5-b0d3-4c19-bbf8-453f86405984 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:55:30 np0005588920 nova_compute[226886]: 2026-01-20 14:55:30.569 226890 DEBUG nova.virt.libvirt.driver [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Start _get_guest_xml network_info=[{"id": "c62c62c5-b0d3-4c19-bbf8-453f86405984", "address": "fa:16:3e:1a:c8:4a", "network": {"id": "e572bd57-633e-4abc-ba06-33f2d3fe513c", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-412148297-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec388b65a7fc480f99d0ceb5451725ea", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc62c62c5-b0", "ovs_interfaceid": "c62c62c5-b0d3-4c19-bbf8-453f86405984", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:55:30 np0005588920 nova_compute[226886]: 2026-01-20 14:55:30.572 226890 WARNING nova.virt.libvirt.driver [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:55:30 np0005588920 nova_compute[226886]: 2026-01-20 14:55:30.579 226890 DEBUG nova.virt.libvirt.host [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:55:30 np0005588920 nova_compute[226886]: 2026-01-20 14:55:30.580 226890 DEBUG nova.virt.libvirt.host [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:55:30 np0005588920 nova_compute[226886]: 2026-01-20 14:55:30.585 226890 DEBUG nova.virt.libvirt.host [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:55:30 np0005588920 nova_compute[226886]: 2026-01-20 14:55:30.585 226890 DEBUG nova.virt.libvirt.host [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:55:30 np0005588920 nova_compute[226886]: 2026-01-20 14:55:30.586 226890 DEBUG nova.virt.libvirt.driver [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:55:30 np0005588920 nova_compute[226886]: 2026-01-20 14:55:30.586 226890 DEBUG nova.virt.hardware [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:55:30 np0005588920 nova_compute[226886]: 2026-01-20 14:55:30.587 226890 DEBUG nova.virt.hardware [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:55:30 np0005588920 nova_compute[226886]: 2026-01-20 14:55:30.587 226890 DEBUG nova.virt.hardware [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:55:30 np0005588920 nova_compute[226886]: 2026-01-20 14:55:30.587 226890 DEBUG nova.virt.hardware [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:55:30 np0005588920 nova_compute[226886]: 2026-01-20 14:55:30.587 226890 DEBUG nova.virt.hardware [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:55:30 np0005588920 nova_compute[226886]: 2026-01-20 14:55:30.587 226890 DEBUG nova.virt.hardware [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:55:30 np0005588920 nova_compute[226886]: 2026-01-20 14:55:30.587 226890 DEBUG nova.virt.hardware [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:55:30 np0005588920 nova_compute[226886]: 2026-01-20 14:55:30.588 226890 DEBUG nova.virt.hardware [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:55:30 np0005588920 nova_compute[226886]: 2026-01-20 14:55:30.588 226890 DEBUG nova.virt.hardware [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:55:30 np0005588920 nova_compute[226886]: 2026-01-20 14:55:30.588 226890 DEBUG nova.virt.hardware [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:55:30 np0005588920 nova_compute[226886]: 2026-01-20 14:55:30.588 226890 DEBUG nova.virt.hardware [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:55:30 np0005588920 nova_compute[226886]: 2026-01-20 14:55:30.591 226890 DEBUG oslo_concurrency.processutils [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:55:31 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:55:31 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/683276483' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:55:31 np0005588920 nova_compute[226886]: 2026-01-20 14:55:31.090 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:31 np0005588920 nova_compute[226886]: 2026-01-20 14:55:31.104 226890 DEBUG oslo_concurrency.processutils [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.513s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:55:31 np0005588920 nova_compute[226886]: 2026-01-20 14:55:31.128 226890 DEBUG nova.storage.rbd_utils [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] rbd image 68e2c62d-7883-4f68-a2c6-da2265b01c93_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:55:31 np0005588920 nova_compute[226886]: 2026-01-20 14:55:31.132 226890 DEBUG oslo_concurrency.processutils [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:55:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:31.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:31 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:55:31 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3071191747' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:55:31 np0005588920 nova_compute[226886]: 2026-01-20 14:55:31.601 226890 DEBUG oslo_concurrency.processutils [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:55:31 np0005588920 nova_compute[226886]: 2026-01-20 14:55:31.602 226890 DEBUG nova.virt.libvirt.vif [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:55:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerAddressesNegativeTestJSON-server-1059424819',display_name='tempest-ServerAddressesNegativeTestJSON-server-1059424819',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveraddressesnegativetestjson-server-1059424819',id=126,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ec388b65a7fc480f99d0ceb5451725ea',ramdisk_id='',reservation_id='r-2f29ynea',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerAddressesNegativeTestJSON-1149598493',own
er_user_name='tempest-ServerAddressesNegativeTestJSON-1149598493-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:55:25Z,user_data=None,user_id='87bcc22682984b40b43e0246ea142695',uuid=68e2c62d-7883-4f68-a2c6-da2265b01c93,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c62c62c5-b0d3-4c19-bbf8-453f86405984", "address": "fa:16:3e:1a:c8:4a", "network": {"id": "e572bd57-633e-4abc-ba06-33f2d3fe513c", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-412148297-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec388b65a7fc480f99d0ceb5451725ea", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc62c62c5-b0", "ovs_interfaceid": "c62c62c5-b0d3-4c19-bbf8-453f86405984", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:55:31 np0005588920 nova_compute[226886]: 2026-01-20 14:55:31.602 226890 DEBUG nova.network.os_vif_util [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Converting VIF {"id": "c62c62c5-b0d3-4c19-bbf8-453f86405984", "address": "fa:16:3e:1a:c8:4a", "network": {"id": "e572bd57-633e-4abc-ba06-33f2d3fe513c", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-412148297-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec388b65a7fc480f99d0ceb5451725ea", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc62c62c5-b0", "ovs_interfaceid": "c62c62c5-b0d3-4c19-bbf8-453f86405984", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:55:31 np0005588920 nova_compute[226886]: 2026-01-20 14:55:31.603 226890 DEBUG nova.network.os_vif_util [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1a:c8:4a,bridge_name='br-int',has_traffic_filtering=True,id=c62c62c5-b0d3-4c19-bbf8-453f86405984,network=Network(e572bd57-633e-4abc-ba06-33f2d3fe513c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc62c62c5-b0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:55:31 np0005588920 nova_compute[226886]: 2026-01-20 14:55:31.604 226890 DEBUG nova.objects.instance [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Lazy-loading 'pci_devices' on Instance uuid 68e2c62d-7883-4f68-a2c6-da2265b01c93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:55:31 np0005588920 nova_compute[226886]: 2026-01-20 14:55:31.630 226890 DEBUG nova.virt.libvirt.driver [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:55:31 np0005588920 nova_compute[226886]:  <uuid>68e2c62d-7883-4f68-a2c6-da2265b01c93</uuid>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:  <name>instance-0000007e</name>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:55:31 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:      <nova:name>tempest-ServerAddressesNegativeTestJSON-server-1059424819</nova:name>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:55:30</nova:creationTime>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:55:31 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:        <nova:user uuid="87bcc22682984b40b43e0246ea142695">tempest-ServerAddressesNegativeTestJSON-1149598493-project-member</nova:user>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:        <nova:project uuid="ec388b65a7fc480f99d0ceb5451725ea">tempest-ServerAddressesNegativeTestJSON-1149598493</nova:project>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:        <nova:port uuid="c62c62c5-b0d3-4c19-bbf8-453f86405984">
Jan 20 09:55:31 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:      <entry name="serial">68e2c62d-7883-4f68-a2c6-da2265b01c93</entry>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:      <entry name="uuid">68e2c62d-7883-4f68-a2c6-da2265b01c93</entry>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:55:31 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/68e2c62d-7883-4f68-a2c6-da2265b01c93_disk">
Jan 20 09:55:31 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:55:31 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:55:31 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/68e2c62d-7883-4f68-a2c6-da2265b01c93_disk.config">
Jan 20 09:55:31 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:55:31 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:55:31 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:1a:c8:4a"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:      <target dev="tapc62c62c5-b0"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:55:31 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/68e2c62d-7883-4f68-a2c6-da2265b01c93/console.log" append="off"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:55:31 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:55:31 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:55:31 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:55:31 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:55:31 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
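The domain XML dumped above can be sanity-checked offline: libvirt's `<memory>` element is in KiB, while the `<nova:flavor>` metadata records MiB, so the two should differ by exactly a factor of 1024. A minimal sketch (the XML snippet is an excerpt copied from the log; the helper name is illustrative):

```python
# Cross-check libvirt domain XML values against the Nova flavor metadata
# embedded in the same document. <memory> is KiB (libvirt default unit);
# <nova:memory> is MiB. Excerpt taken from the _get_guest_xml debug dump.
import xml.etree.ElementTree as ET

DOMAIN_XML = """<domain type="kvm">
 <uuid>68e2c62d-7883-4f68-a2c6-da2265b01c93</uuid>
 <name>instance-0000007e</name>
 <memory>131072</memory>
 <vcpu>1</vcpu>
 <metadata>
   <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
     <nova:flavor name="m1.nano">
       <nova:memory>128</nova:memory>
       <nova:vcpus>1</nova:vcpus>
     </nova:flavor>
   </nova:instance>
 </metadata>
</domain>"""

NS = {"nova": "http://openstack.org/xmlns/libvirt/nova/1.1"}

def check_domain(xml_text: str) -> dict:
    root = ET.fromstring(xml_text)
    mem_kib = int(root.findtext("memory"))
    vcpus = int(root.findtext("vcpu"))
    flavor_mib = int(root.findtext(".//nova:flavor/nova:memory", namespaces=NS))
    flavor_vcpus = int(root.findtext(".//nova:flavor/nova:vcpus", namespaces=NS))
    return {
        "memory_matches": mem_kib == flavor_mib * 1024,  # 131072 KiB == 128 MiB
        "vcpus_match": vcpus == flavor_vcpus,
    }
```

For this instance both checks pass, confirming the m1.nano flavor (128 MiB, 1 vCPU) was rendered correctly into the guest definition.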
Jan 20 09:55:31 np0005588920 nova_compute[226886]: 2026-01-20 14:55:31.630 226890 DEBUG nova.compute.manager [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Preparing to wait for external event network-vif-plugged-c62c62c5-b0d3-4c19-bbf8-453f86405984 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:55:31 np0005588920 nova_compute[226886]: 2026-01-20 14:55:31.630 226890 DEBUG oslo_concurrency.lockutils [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Acquiring lock "68e2c62d-7883-4f68-a2c6-da2265b01c93-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:55:31 np0005588920 nova_compute[226886]: 2026-01-20 14:55:31.631 226890 DEBUG oslo_concurrency.lockutils [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Lock "68e2c62d-7883-4f68-a2c6-da2265b01c93-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:55:31 np0005588920 nova_compute[226886]: 2026-01-20 14:55:31.631 226890 DEBUG oslo_concurrency.lockutils [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Lock "68e2c62d-7883-4f68-a2c6-da2265b01c93-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:55:31 np0005588920 nova_compute[226886]: 2026-01-20 14:55:31.631 226890 DEBUG nova.virt.libvirt.vif [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:55:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerAddressesNegativeTestJSON-server-1059424819',display_name='tempest-ServerAddressesNegativeTestJSON-server-1059424819',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveraddressesnegativetestjson-server-1059424819',id=126,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ec388b65a7fc480f99d0ceb5451725ea',ramdisk_id='',reservation_id='r-2f29ynea',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerAddressesNegativeTestJSON-11495
98493',owner_user_name='tempest-ServerAddressesNegativeTestJSON-1149598493-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:55:25Z,user_data=None,user_id='87bcc22682984b40b43e0246ea142695',uuid=68e2c62d-7883-4f68-a2c6-da2265b01c93,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c62c62c5-b0d3-4c19-bbf8-453f86405984", "address": "fa:16:3e:1a:c8:4a", "network": {"id": "e572bd57-633e-4abc-ba06-33f2d3fe513c", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-412148297-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec388b65a7fc480f99d0ceb5451725ea", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc62c62c5-b0", "ovs_interfaceid": "c62c62c5-b0d3-4c19-bbf8-453f86405984", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:55:31 np0005588920 nova_compute[226886]: 2026-01-20 14:55:31.632 226890 DEBUG nova.network.os_vif_util [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Converting VIF {"id": "c62c62c5-b0d3-4c19-bbf8-453f86405984", "address": "fa:16:3e:1a:c8:4a", "network": {"id": "e572bd57-633e-4abc-ba06-33f2d3fe513c", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-412148297-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec388b65a7fc480f99d0ceb5451725ea", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc62c62c5-b0", "ovs_interfaceid": "c62c62c5-b0d3-4c19-bbf8-453f86405984", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:55:31 np0005588920 nova_compute[226886]: 2026-01-20 14:55:31.632 226890 DEBUG nova.network.os_vif_util [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1a:c8:4a,bridge_name='br-int',has_traffic_filtering=True,id=c62c62c5-b0d3-4c19-bbf8-453f86405984,network=Network(e572bd57-633e-4abc-ba06-33f2d3fe513c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc62c62c5-b0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:55:31 np0005588920 nova_compute[226886]: 2026-01-20 14:55:31.632 226890 DEBUG os_vif [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1a:c8:4a,bridge_name='br-int',has_traffic_filtering=True,id=c62c62c5-b0d3-4c19-bbf8-453f86405984,network=Network(e572bd57-633e-4abc-ba06-33f2d3fe513c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc62c62c5-b0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:55:31 np0005588920 nova_compute[226886]: 2026-01-20 14:55:31.633 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:31 np0005588920 nova_compute[226886]: 2026-01-20 14:55:31.633 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:55:31 np0005588920 nova_compute[226886]: 2026-01-20 14:55:31.634 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:55:31 np0005588920 nova_compute[226886]: 2026-01-20 14:55:31.636 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:31 np0005588920 nova_compute[226886]: 2026-01-20 14:55:31.636 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc62c62c5-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:55:31 np0005588920 nova_compute[226886]: 2026-01-20 14:55:31.637 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc62c62c5-b0, col_values=(('external_ids', {'iface-id': 'c62c62c5-b0d3-4c19-bbf8-453f86405984', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1a:c8:4a', 'vm-uuid': '68e2c62d-7883-4f68-a2c6-da2265b01c93'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:55:31 np0005588920 nova_compute[226886]: 2026-01-20 14:55:31.638 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:31 np0005588920 NetworkManager[49076]: <info>  [1768920931.6395] manager: (tapc62c62c5-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/292)
Jan 20 09:55:31 np0005588920 nova_compute[226886]: 2026-01-20 14:55:31.643 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:55:31 np0005588920 nova_compute[226886]: 2026-01-20 14:55:31.648 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:31 np0005588920 nova_compute[226886]: 2026-01-20 14:55:31.649 226890 INFO os_vif [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1a:c8:4a,bridge_name='br-int',has_traffic_filtering=True,id=c62c62c5-b0d3-4c19-bbf8-453f86405984,network=Network(e572bd57-633e-4abc-ba06-33f2d3fe513c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc62c62c5-b0')#033[00m
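The plug succeeds once the `DbSetCommand` above has written `external_ids` onto the OVS `Interface` record: ovn-controller watches for an `iface-id` matching a logical port UUID and then claims the port (the "Claiming lport" messages that follow). A sketch of the mapping being written, under the assumption that only these four keys matter for the OVN binding (helper name is illustrative, not an os-vif API):

```python
# external_ids written onto the OVS Interface record by os-vif's ovsdbapp
# transaction (values mirror the DbSetCommand in the log). ovn-controller
# keys the port binding off 'iface-id'.
def build_ovs_external_ids(port_id: str, mac: str, instance_uuid: str) -> dict:
    return {
        "iface-id": port_id,        # Neutron port UUID == OVN logical port name
        "iface-status": "active",
        "attached-mac": mac,        # guest MAC, used for the L2 binding
        "vm-uuid": instance_uuid,   # Nova instance UUID, informational
    }
```

The same state can be produced by hand with something like `ovs-vsctl --may-exist add-port br-int tapc62c62c5-b0 -- set Interface tapc62c62c5-b0 external_ids:iface-id=c62c62c5-b0d3-4c19-bbf8-453f86405984`, which is useful when debugging a port that never reaches the claimed state.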
Jan 20 09:55:31 np0005588920 nova_compute[226886]: 2026-01-20 14:55:31.711 226890 DEBUG nova.virt.libvirt.driver [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:55:31 np0005588920 nova_compute[226886]: 2026-01-20 14:55:31.712 226890 DEBUG nova.virt.libvirt.driver [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:55:31 np0005588920 nova_compute[226886]: 2026-01-20 14:55:31.712 226890 DEBUG nova.virt.libvirt.driver [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] No VIF found with MAC fa:16:3e:1a:c8:4a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:55:31 np0005588920 nova_compute[226886]: 2026-01-20 14:55:31.713 226890 INFO nova.virt.libvirt.driver [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Using config drive#033[00m
Jan 20 09:55:31 np0005588920 podman[272667]: 2026-01-20 14:55:31.747308372 +0000 UTC m=+0.062787119 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true)
Jan 20 09:55:31 np0005588920 nova_compute[226886]: 2026-01-20 14:55:31.753 226890 DEBUG nova.storage.rbd_utils [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] rbd image 68e2c62d-7883-4f68-a2c6-da2265b01c93_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:55:32 np0005588920 nova_compute[226886]: 2026-01-20 14:55:32.182 226890 INFO nova.virt.libvirt.driver [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Creating config drive at /var/lib/nova/instances/68e2c62d-7883-4f68-a2c6-da2265b01c93/disk.config#033[00m
Jan 20 09:55:32 np0005588920 nova_compute[226886]: 2026-01-20 14:55:32.186 226890 DEBUG oslo_concurrency.processutils [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/68e2c62d-7883-4f68-a2c6-da2265b01c93/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_nb0t4b9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:55:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:55:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:32.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:55:32 np0005588920 nova_compute[226886]: 2026-01-20 14:55:32.315 226890 DEBUG oslo_concurrency.processutils [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/68e2c62d-7883-4f68-a2c6-da2265b01c93/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_nb0t4b9" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:55:32 np0005588920 nova_compute[226886]: 2026-01-20 14:55:32.340 226890 DEBUG nova.storage.rbd_utils [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] rbd image 68e2c62d-7883-4f68-a2c6-da2265b01c93_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:55:32 np0005588920 nova_compute[226886]: 2026-01-20 14:55:32.344 226890 DEBUG oslo_concurrency.processutils [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/68e2c62d-7883-4f68-a2c6-da2265b01c93/disk.config 68e2c62d-7883-4f68-a2c6-da2265b01c93_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:55:32 np0005588920 nova_compute[226886]: 2026-01-20 14:55:32.506 226890 DEBUG oslo_concurrency.processutils [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/68e2c62d-7883-4f68-a2c6-da2265b01c93/disk.config 68e2c62d-7883-4f68-a2c6-da2265b01c93_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.162s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:55:32 np0005588920 nova_compute[226886]: 2026-01-20 14:55:32.507 226890 INFO nova.virt.libvirt.driver [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Deleting local config drive /var/lib/nova/instances/68e2c62d-7883-4f68-a2c6-da2265b01c93/disk.config because it was imported into RBD.#033[00m
Jan 20 09:55:32 np0005588920 kernel: tapc62c62c5-b0: entered promiscuous mode
Jan 20 09:55:32 np0005588920 NetworkManager[49076]: <info>  [1768920932.5749] manager: (tapc62c62c5-b0): new Tun device (/org/freedesktop/NetworkManager/Devices/293)
Jan 20 09:55:32 np0005588920 nova_compute[226886]: 2026-01-20 14:55:32.574 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:32 np0005588920 ovn_controller[133971]: 2026-01-20T14:55:32Z|00590|binding|INFO|Claiming lport c62c62c5-b0d3-4c19-bbf8-453f86405984 for this chassis.
Jan 20 09:55:32 np0005588920 ovn_controller[133971]: 2026-01-20T14:55:32Z|00591|binding|INFO|c62c62c5-b0d3-4c19-bbf8-453f86405984: Claiming fa:16:3e:1a:c8:4a 10.100.0.3
Jan 20 09:55:32 np0005588920 ovn_controller[133971]: 2026-01-20T14:55:32Z|00592|binding|INFO|Setting lport c62c62c5-b0d3-4c19-bbf8-453f86405984 ovn-installed in OVS
Jan 20 09:55:32 np0005588920 nova_compute[226886]: 2026-01-20 14:55:32.593 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:32 np0005588920 nova_compute[226886]: 2026-01-20 14:55:32.597 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e305 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:55:32 np0005588920 systemd-udevd[272757]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:55:32 np0005588920 ovn_controller[133971]: 2026-01-20T14:55:32Z|00593|binding|INFO|Setting lport c62c62c5-b0d3-4c19-bbf8-453f86405984 up in Southbound
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:32.608 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1a:c8:4a 10.100.0.3'], port_security=['fa:16:3e:1a:c8:4a 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '68e2c62d-7883-4f68-a2c6-da2265b01c93', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e572bd57-633e-4abc-ba06-33f2d3fe513c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ec388b65a7fc480f99d0ceb5451725ea', 'neutron:revision_number': '2', 'neutron:security_group_ids': '846c21f4-5f7f-485f-9452-467159a0ce42', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=34521a81-102e-4b98-93a9-758f2637a83e, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=c62c62c5-b0d3-4c19-bbf8-453f86405984) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:32.610 144128 INFO neutron.agent.ovn.metadata.agent [-] Port c62c62c5-b0d3-4c19-bbf8-453f86405984 in datapath e572bd57-633e-4abc-ba06-33f2d3fe513c bound to our chassis#033[00m
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:32.612 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e572bd57-633e-4abc-ba06-33f2d3fe513c#033[00m
Jan 20 09:55:32 np0005588920 systemd-machined[196121]: New machine qemu-59-instance-0000007e.
Jan 20 09:55:32 np0005588920 NetworkManager[49076]: <info>  [1768920932.6214] device (tapc62c62c5-b0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:55:32 np0005588920 NetworkManager[49076]: <info>  [1768920932.6221] device (tapc62c62c5-b0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:32.622 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f0f17286-a16f-4736-8925-3695af93fce7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:32.623 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape572bd57-61 in ovnmeta-e572bd57-633e-4abc-ba06-33f2d3fe513c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:32.625 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape572bd57-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:32.625 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b74cb66d-35fc-4603-be83-b59b00112d68]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:32.626 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e024efdc-8178-4a34-92c3-a88fdb23e6ae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:32 np0005588920 systemd[1]: Started Virtual Machine qemu-59-instance-0000007e.
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:32.642 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[ef77d2f2-cbc6-4a88-910d-92004920283a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:32.666 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b384af98-0601-4167-85de-5bca42cdb601]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:32.697 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[33bfa84e-f37c-48f0-b1ce-b591dcd0fee3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:32 np0005588920 systemd-udevd[272761]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:55:32 np0005588920 NetworkManager[49076]: <info>  [1768920932.7032] manager: (tape572bd57-60): new Veth device (/org/freedesktop/NetworkManager/Devices/294)
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:32.702 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[304cec0c-af03-4447-baea-da89c3189665]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:32.739 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[134726bb-2143-4add-b41b-341c1c45ca45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:32.742 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[73902e18-1743-480d-840e-6ee49084f0a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:32 np0005588920 NetworkManager[49076]: <info>  [1768920932.7699] device (tape572bd57-60): carrier: link connected
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:32.780 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[ef5384f7-acd5-4a8e-baa7-a30124f19cf6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:32.803 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[4d8db83b-d7b7-48ee-984b-33ca444a7fb0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape572bd57-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:22:68:62'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 190], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 597445, 'reachable_time': 29142, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 272791, 'error': None, 'target': 'ovnmeta-e572bd57-633e-4abc-ba06-33f2d3fe513c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:32.827 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[4a46522d-7e0d-4d73-8b76-146203e5ec7f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe22:6862'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 597445, 'tstamp': 597445}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 272792, 'error': None, 'target': 'ovnmeta-e572bd57-633e-4abc-ba06-33f2d3fe513c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:32.853 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[9cc20c4b-8e92-40b7-a2c8-43cbd6793f77]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape572bd57-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:22:68:62'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 190], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 597445, 'reachable_time': 29142, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 272793, 'error': None, 'target': 'ovnmeta-e572bd57-633e-4abc-ba06-33f2d3fe513c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:32.903 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f0aa0d5a-5f2c-4a1f-9975-e37b2468803f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:32.969 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0694c9fd-0bf3-4bad-ae03-4281a9ea73f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:32.971 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape572bd57-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:32.971 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:32.972 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape572bd57-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:55:32 np0005588920 NetworkManager[49076]: <info>  [1768920932.9745] manager: (tape572bd57-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/295)
Jan 20 09:55:32 np0005588920 kernel: tape572bd57-60: entered promiscuous mode
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:32.978 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape572bd57-60, col_values=(('external_ids', {'iface-id': '4b364928-500a-4da7-8ba1-4f8606ec4c64'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:55:32 np0005588920 ovn_controller[133971]: 2026-01-20T14:55:32Z|00594|binding|INFO|Releasing lport 4b364928-500a-4da7-8ba1-4f8606ec4c64 from this chassis (sb_readonly=0)
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:32.982 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e572bd57-633e-4abc-ba06-33f2d3fe513c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e572bd57-633e-4abc-ba06-33f2d3fe513c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:32.983 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2e3fea77-ba85-463b-b1dd-17657082c3e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:32.983 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-e572bd57-633e-4abc-ba06-33f2d3fe513c
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/e572bd57-633e-4abc-ba06-33f2d3fe513c.pid.haproxy
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID e572bd57-633e-4abc-ba06-33f2d3fe513c
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:55:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:32.984 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e572bd57-633e-4abc-ba06-33f2d3fe513c', 'env', 'PROCESS_TAG=haproxy-e572bd57-633e-4abc-ba06-33f2d3fe513c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e572bd57-633e-4abc-ba06-33f2d3fe513c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:55:32 np0005588920 nova_compute[226886]: 2026-01-20 14:55:32.994 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:33 np0005588920 nova_compute[226886]: 2026-01-20 14:55:33.124 226890 DEBUG nova.network.neutron [req-ff2192f3-99c3-49b3-896b-6da906b4a523 req-37759d9e-8a90-4159-a1ea-6100d848cecd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Updated VIF entry in instance network info cache for port c62c62c5-b0d3-4c19-bbf8-453f86405984. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:55:33 np0005588920 nova_compute[226886]: 2026-01-20 14:55:33.124 226890 DEBUG nova.network.neutron [req-ff2192f3-99c3-49b3-896b-6da906b4a523 req-37759d9e-8a90-4159-a1ea-6100d848cecd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Updating instance_info_cache with network_info: [{"id": "c62c62c5-b0d3-4c19-bbf8-453f86405984", "address": "fa:16:3e:1a:c8:4a", "network": {"id": "e572bd57-633e-4abc-ba06-33f2d3fe513c", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-412148297-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec388b65a7fc480f99d0ceb5451725ea", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc62c62c5-b0", "ovs_interfaceid": "c62c62c5-b0d3-4c19-bbf8-453f86405984", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:55:33 np0005588920 nova_compute[226886]: 2026-01-20 14:55:33.147 226890 DEBUG oslo_concurrency.lockutils [req-ff2192f3-99c3-49b3-896b-6da906b4a523 req-37759d9e-8a90-4159-a1ea-6100d848cecd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-68e2c62d-7883-4f68-a2c6-da2265b01c93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:55:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:33.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:33 np0005588920 podman[272841]: 2026-01-20 14:55:33.305749589 +0000 UTC m=+0.020009704 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:55:33 np0005588920 nova_compute[226886]: 2026-01-20 14:55:33.843 226890 DEBUG nova.compute.manager [req-04e6e6f4-4d3d-4ea7-9577-9519a7cf1204 req-87ea0d8f-be96-4c9d-95a3-46159d72657c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Received event network-vif-plugged-c62c62c5-b0d3-4c19-bbf8-453f86405984 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:55:33 np0005588920 nova_compute[226886]: 2026-01-20 14:55:33.843 226890 DEBUG oslo_concurrency.lockutils [req-04e6e6f4-4d3d-4ea7-9577-9519a7cf1204 req-87ea0d8f-be96-4c9d-95a3-46159d72657c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "68e2c62d-7883-4f68-a2c6-da2265b01c93-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:55:33 np0005588920 nova_compute[226886]: 2026-01-20 14:55:33.844 226890 DEBUG oslo_concurrency.lockutils [req-04e6e6f4-4d3d-4ea7-9577-9519a7cf1204 req-87ea0d8f-be96-4c9d-95a3-46159d72657c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "68e2c62d-7883-4f68-a2c6-da2265b01c93-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:55:33 np0005588920 nova_compute[226886]: 2026-01-20 14:55:33.844 226890 DEBUG oslo_concurrency.lockutils [req-04e6e6f4-4d3d-4ea7-9577-9519a7cf1204 req-87ea0d8f-be96-4c9d-95a3-46159d72657c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "68e2c62d-7883-4f68-a2c6-da2265b01c93-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:55:33 np0005588920 nova_compute[226886]: 2026-01-20 14:55:33.844 226890 DEBUG nova.compute.manager [req-04e6e6f4-4d3d-4ea7-9577-9519a7cf1204 req-87ea0d8f-be96-4c9d-95a3-46159d72657c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Processing event network-vif-plugged-c62c62c5-b0d3-4c19-bbf8-453f86405984 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:55:33 np0005588920 podman[272841]: 2026-01-20 14:55:33.860749173 +0000 UTC m=+0.575009258 container create c920961fa2df697e7a3c52f6efddaccc74c1b5f8e2c989d5c4f383084601a2f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e572bd57-633e-4abc-ba06-33f2d3fe513c, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 20 09:55:33 np0005588920 systemd[1]: Started libpod-conmon-c920961fa2df697e7a3c52f6efddaccc74c1b5f8e2c989d5c4f383084601a2f7.scope.
Jan 20 09:55:33 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:55:33 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b6dcc0a5130390589ec8e9e7b82f8f9bc0b03deca349b16d6d4501365dd6ff61/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:55:33 np0005588920 podman[272841]: 2026-01-20 14:55:33.96043428 +0000 UTC m=+0.674694415 container init c920961fa2df697e7a3c52f6efddaccc74c1b5f8e2c989d5c4f383084601a2f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e572bd57-633e-4abc-ba06-33f2d3fe513c, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 20 09:55:33 np0005588920 podman[272841]: 2026-01-20 14:55:33.968426376 +0000 UTC m=+0.682686481 container start c920961fa2df697e7a3c52f6efddaccc74c1b5f8e2c989d5c4f383084601a2f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e572bd57-633e-4abc-ba06-33f2d3fe513c, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:55:33 np0005588920 neutron-haproxy-ovnmeta-e572bd57-633e-4abc-ba06-33f2d3fe513c[272875]: [NOTICE]   (272879) : New worker (272881) forked
Jan 20 09:55:33 np0005588920 neutron-haproxy-ovnmeta-e572bd57-633e-4abc-ba06-33f2d3fe513c[272875]: [NOTICE]   (272879) : Loading success.
Jan 20 09:55:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:55:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:34.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:55:34 np0005588920 nova_compute[226886]: 2026-01-20 14:55:34.305 226890 DEBUG nova.compute.manager [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:55:34 np0005588920 nova_compute[226886]: 2026-01-20 14:55:34.307 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920934.306171, 68e2c62d-7883-4f68-a2c6-da2265b01c93 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:55:34 np0005588920 nova_compute[226886]: 2026-01-20 14:55:34.307 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] VM Started (Lifecycle Event)#033[00m
Jan 20 09:55:34 np0005588920 nova_compute[226886]: 2026-01-20 14:55:34.313 226890 DEBUG nova.virt.libvirt.driver [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:55:34 np0005588920 nova_compute[226886]: 2026-01-20 14:55:34.317 226890 INFO nova.virt.libvirt.driver [-] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Instance spawned successfully.#033[00m
Jan 20 09:55:34 np0005588920 nova_compute[226886]: 2026-01-20 14:55:34.317 226890 DEBUG nova.virt.libvirt.driver [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:55:34 np0005588920 nova_compute[226886]: 2026-01-20 14:55:34.355 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:55:34 np0005588920 nova_compute[226886]: 2026-01-20 14:55:34.362 226890 DEBUG nova.virt.libvirt.driver [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:55:34 np0005588920 nova_compute[226886]: 2026-01-20 14:55:34.363 226890 DEBUG nova.virt.libvirt.driver [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:55:34 np0005588920 nova_compute[226886]: 2026-01-20 14:55:34.363 226890 DEBUG nova.virt.libvirt.driver [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:55:34 np0005588920 nova_compute[226886]: 2026-01-20 14:55:34.364 226890 DEBUG nova.virt.libvirt.driver [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:55:34 np0005588920 nova_compute[226886]: 2026-01-20 14:55:34.364 226890 DEBUG nova.virt.libvirt.driver [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:55:34 np0005588920 nova_compute[226886]: 2026-01-20 14:55:34.365 226890 DEBUG nova.virt.libvirt.driver [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:55:34 np0005588920 nova_compute[226886]: 2026-01-20 14:55:34.369 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:55:34 np0005588920 nova_compute[226886]: 2026-01-20 14:55:34.400 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:55:34 np0005588920 nova_compute[226886]: 2026-01-20 14:55:34.400 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920934.3063347, 68e2c62d-7883-4f68-a2c6-da2265b01c93 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:55:34 np0005588920 nova_compute[226886]: 2026-01-20 14:55:34.401 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:55:34 np0005588920 nova_compute[226886]: 2026-01-20 14:55:34.421 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:55:34 np0005588920 nova_compute[226886]: 2026-01-20 14:55:34.425 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920934.313123, 68e2c62d-7883-4f68-a2c6-da2265b01c93 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:55:34 np0005588920 nova_compute[226886]: 2026-01-20 14:55:34.426 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:55:34 np0005588920 nova_compute[226886]: 2026-01-20 14:55:34.430 226890 INFO nova.compute.manager [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Took 8.41 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:55:34 np0005588920 nova_compute[226886]: 2026-01-20 14:55:34.430 226890 DEBUG nova.compute.manager [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:55:34 np0005588920 nova_compute[226886]: 2026-01-20 14:55:34.455 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:55:34 np0005588920 nova_compute[226886]: 2026-01-20 14:55:34.458 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:55:34 np0005588920 nova_compute[226886]: 2026-01-20 14:55:34.477 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:55:34 np0005588920 nova_compute[226886]: 2026-01-20 14:55:34.499 226890 INFO nova.compute.manager [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Took 9.86 seconds to build instance.#033[00m
Jan 20 09:55:34 np0005588920 nova_compute[226886]: 2026-01-20 14:55:34.583 226890 DEBUG oslo_concurrency.lockutils [None req-8cb132d4-88be-41ed-9dea-a15766cf7ffd 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Lock "68e2c62d-7883-4f68-a2c6-da2265b01c93" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:55:34 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e306 e306: 3 total, 3 up, 3 in
Jan 20 09:55:35 np0005588920 nova_compute[226886]: 2026-01-20 14:55:35.192 226890 DEBUG oslo_concurrency.lockutils [None req-580ce056-84e3-485f-9e25-329ec825933b 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Acquiring lock "68e2c62d-7883-4f68-a2c6-da2265b01c93" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:55:35 np0005588920 nova_compute[226886]: 2026-01-20 14:55:35.193 226890 DEBUG oslo_concurrency.lockutils [None req-580ce056-84e3-485f-9e25-329ec825933b 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Lock "68e2c62d-7883-4f68-a2c6-da2265b01c93" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:55:35 np0005588920 nova_compute[226886]: 2026-01-20 14:55:35.193 226890 DEBUG oslo_concurrency.lockutils [None req-580ce056-84e3-485f-9e25-329ec825933b 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Acquiring lock "68e2c62d-7883-4f68-a2c6-da2265b01c93-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:55:35 np0005588920 nova_compute[226886]: 2026-01-20 14:55:35.193 226890 DEBUG oslo_concurrency.lockutils [None req-580ce056-84e3-485f-9e25-329ec825933b 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Lock "68e2c62d-7883-4f68-a2c6-da2265b01c93-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:55:35 np0005588920 nova_compute[226886]: 2026-01-20 14:55:35.194 226890 DEBUG oslo_concurrency.lockutils [None req-580ce056-84e3-485f-9e25-329ec825933b 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Lock "68e2c62d-7883-4f68-a2c6-da2265b01c93-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:55:35 np0005588920 nova_compute[226886]: 2026-01-20 14:55:35.194 226890 INFO nova.compute.manager [None req-580ce056-84e3-485f-9e25-329ec825933b 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Terminating instance#033[00m
Jan 20 09:55:35 np0005588920 nova_compute[226886]: 2026-01-20 14:55:35.195 226890 DEBUG nova.compute.manager [None req-580ce056-84e3-485f-9e25-329ec825933b 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:55:35 np0005588920 kernel: tapc62c62c5-b0 (unregistering): left promiscuous mode
Jan 20 09:55:35 np0005588920 NetworkManager[49076]: <info>  [1768920935.2368] device (tapc62c62c5-b0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:55:35 np0005588920 nova_compute[226886]: 2026-01-20 14:55:35.240 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:35 np0005588920 ovn_controller[133971]: 2026-01-20T14:55:35Z|00595|binding|INFO|Releasing lport c62c62c5-b0d3-4c19-bbf8-453f86405984 from this chassis (sb_readonly=0)
Jan 20 09:55:35 np0005588920 ovn_controller[133971]: 2026-01-20T14:55:35Z|00596|binding|INFO|Setting lport c62c62c5-b0d3-4c19-bbf8-453f86405984 down in Southbound
Jan 20 09:55:35 np0005588920 ovn_controller[133971]: 2026-01-20T14:55:35Z|00597|binding|INFO|Removing iface tapc62c62c5-b0 ovn-installed in OVS
Jan 20 09:55:35 np0005588920 nova_compute[226886]: 2026-01-20 14:55:35.242 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:35.247 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1a:c8:4a 10.100.0.3'], port_security=['fa:16:3e:1a:c8:4a 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '68e2c62d-7883-4f68-a2c6-da2265b01c93', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e572bd57-633e-4abc-ba06-33f2d3fe513c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ec388b65a7fc480f99d0ceb5451725ea', 'neutron:revision_number': '4', 'neutron:security_group_ids': '846c21f4-5f7f-485f-9452-467159a0ce42', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=34521a81-102e-4b98-93a9-758f2637a83e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=c62c62c5-b0d3-4c19-bbf8-453f86405984) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:55:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:35.248 144128 INFO neutron.agent.ovn.metadata.agent [-] Port c62c62c5-b0d3-4c19-bbf8-453f86405984 in datapath e572bd57-633e-4abc-ba06-33f2d3fe513c unbound from our chassis#033[00m
Jan 20 09:55:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:35.250 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e572bd57-633e-4abc-ba06-33f2d3fe513c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:55:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:35.251 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[1d336397-79b2-40c2-a2f6-b28a5f4595d0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:35.252 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e572bd57-633e-4abc-ba06-33f2d3fe513c namespace which is not needed anymore#033[00m
Jan 20 09:55:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:35.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:35 np0005588920 nova_compute[226886]: 2026-01-20 14:55:35.260 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:35 np0005588920 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d0000007e.scope: Deactivated successfully.
Jan 20 09:55:35 np0005588920 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d0000007e.scope: Consumed 1.547s CPU time.
Jan 20 09:55:35 np0005588920 systemd-machined[196121]: Machine qemu-59-instance-0000007e terminated.
Jan 20 09:55:35 np0005588920 neutron-haproxy-ovnmeta-e572bd57-633e-4abc-ba06-33f2d3fe513c[272875]: [NOTICE]   (272879) : haproxy version is 2.8.14-c23fe91
Jan 20 09:55:35 np0005588920 neutron-haproxy-ovnmeta-e572bd57-633e-4abc-ba06-33f2d3fe513c[272875]: [NOTICE]   (272879) : path to executable is /usr/sbin/haproxy
Jan 20 09:55:35 np0005588920 neutron-haproxy-ovnmeta-e572bd57-633e-4abc-ba06-33f2d3fe513c[272875]: [WARNING]  (272879) : Exiting Master process...
Jan 20 09:55:35 np0005588920 neutron-haproxy-ovnmeta-e572bd57-633e-4abc-ba06-33f2d3fe513c[272875]: [ALERT]    (272879) : Current worker (272881) exited with code 143 (Terminated)
Jan 20 09:55:35 np0005588920 neutron-haproxy-ovnmeta-e572bd57-633e-4abc-ba06-33f2d3fe513c[272875]: [WARNING]  (272879) : All workers exited. Exiting... (0)
Jan 20 09:55:35 np0005588920 systemd[1]: libpod-c920961fa2df697e7a3c52f6efddaccc74c1b5f8e2c989d5c4f383084601a2f7.scope: Deactivated successfully.
Jan 20 09:55:35 np0005588920 conmon[272875]: conmon c920961fa2df697e7a3c <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-c920961fa2df697e7a3c52f6efddaccc74c1b5f8e2c989d5c4f383084601a2f7.scope/container/memory.events
Jan 20 09:55:35 np0005588920 podman[272918]: 2026-01-20 14:55:35.375770187 +0000 UTC m=+0.041536591 container died c920961fa2df697e7a3c52f6efddaccc74c1b5f8e2c989d5c4f383084601a2f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e572bd57-633e-4abc-ba06-33f2d3fe513c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 20 09:55:35 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c920961fa2df697e7a3c52f6efddaccc74c1b5f8e2c989d5c4f383084601a2f7-userdata-shm.mount: Deactivated successfully.
Jan 20 09:55:35 np0005588920 systemd[1]: var-lib-containers-storage-overlay-b6dcc0a5130390589ec8e9e7b82f8f9bc0b03deca349b16d6d4501365dd6ff61-merged.mount: Deactivated successfully.
Jan 20 09:55:35 np0005588920 nova_compute[226886]: 2026-01-20 14:55:35.415 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:35 np0005588920 podman[272918]: 2026-01-20 14:55:35.416790313 +0000 UTC m=+0.082556707 container cleanup c920961fa2df697e7a3c52f6efddaccc74c1b5f8e2c989d5c4f383084601a2f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e572bd57-633e-4abc-ba06-33f2d3fe513c, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 20 09:55:35 np0005588920 nova_compute[226886]: 2026-01-20 14:55:35.420 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:35 np0005588920 systemd[1]: libpod-conmon-c920961fa2df697e7a3c52f6efddaccc74c1b5f8e2c989d5c4f383084601a2f7.scope: Deactivated successfully.
Jan 20 09:55:35 np0005588920 nova_compute[226886]: 2026-01-20 14:55:35.433 226890 INFO nova.virt.libvirt.driver [-] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Instance destroyed successfully.#033[00m
Jan 20 09:55:35 np0005588920 nova_compute[226886]: 2026-01-20 14:55:35.433 226890 DEBUG nova.objects.instance [None req-580ce056-84e3-485f-9e25-329ec825933b 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Lazy-loading 'resources' on Instance uuid 68e2c62d-7883-4f68-a2c6-da2265b01c93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:55:35 np0005588920 nova_compute[226886]: 2026-01-20 14:55:35.449 226890 DEBUG nova.virt.libvirt.vif [None req-580ce056-84e3-485f-9e25-329ec825933b 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:55:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerAddressesNegativeTestJSON-server-1059424819',display_name='tempest-ServerAddressesNegativeTestJSON-server-1059424819',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveraddressesnegativetestjson-server-1059424819',id=126,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:55:34Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ec388b65a7fc480f99d0ceb5451725ea',ramdisk_id='',reservation_id='r-2f29ynea',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerAddressesNegativeTestJSON-1149598493',owner_user_name='tempest-ServerAddressesNegativeTestJSON-1149598493-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:55:34Z,user_data=None,user_id='87bcc22682984b40b43e0246ea142695',uuid=68e2c62d-7883-4f68-a2c6-da2265b01c93,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c62c62c5-b0d3-4c19-bbf8-453f86405984", "address": "fa:16:3e:1a:c8:4a", "network": {"id": "e572bd57-633e-4abc-ba06-33f2d3fe513c", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-412148297-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec388b65a7fc480f99d0ceb5451725ea", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc62c62c5-b0", "ovs_interfaceid": "c62c62c5-b0d3-4c19-bbf8-453f86405984", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:55:35 np0005588920 nova_compute[226886]: 2026-01-20 14:55:35.450 226890 DEBUG nova.network.os_vif_util [None req-580ce056-84e3-485f-9e25-329ec825933b 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Converting VIF {"id": "c62c62c5-b0d3-4c19-bbf8-453f86405984", "address": "fa:16:3e:1a:c8:4a", "network": {"id": "e572bd57-633e-4abc-ba06-33f2d3fe513c", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-412148297-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec388b65a7fc480f99d0ceb5451725ea", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc62c62c5-b0", "ovs_interfaceid": "c62c62c5-b0d3-4c19-bbf8-453f86405984", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:55:35 np0005588920 nova_compute[226886]: 2026-01-20 14:55:35.451 226890 DEBUG nova.network.os_vif_util [None req-580ce056-84e3-485f-9e25-329ec825933b 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1a:c8:4a,bridge_name='br-int',has_traffic_filtering=True,id=c62c62c5-b0d3-4c19-bbf8-453f86405984,network=Network(e572bd57-633e-4abc-ba06-33f2d3fe513c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc62c62c5-b0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:55:35 np0005588920 nova_compute[226886]: 2026-01-20 14:55:35.451 226890 DEBUG os_vif [None req-580ce056-84e3-485f-9e25-329ec825933b 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1a:c8:4a,bridge_name='br-int',has_traffic_filtering=True,id=c62c62c5-b0d3-4c19-bbf8-453f86405984,network=Network(e572bd57-633e-4abc-ba06-33f2d3fe513c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc62c62c5-b0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:55:35 np0005588920 nova_compute[226886]: 2026-01-20 14:55:35.453 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:35 np0005588920 nova_compute[226886]: 2026-01-20 14:55:35.454 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc62c62c5-b0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:55:35 np0005588920 nova_compute[226886]: 2026-01-20 14:55:35.455 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:35 np0005588920 nova_compute[226886]: 2026-01-20 14:55:35.456 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:35 np0005588920 nova_compute[226886]: 2026-01-20 14:55:35.458 226890 INFO os_vif [None req-580ce056-84e3-485f-9e25-329ec825933b 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1a:c8:4a,bridge_name='br-int',has_traffic_filtering=True,id=c62c62c5-b0d3-4c19-bbf8-453f86405984,network=Network(e572bd57-633e-4abc-ba06-33f2d3fe513c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc62c62c5-b0')#033[00m
Jan 20 09:55:35 np0005588920 podman[272956]: 2026-01-20 14:55:35.478221973 +0000 UTC m=+0.040097020 container remove c920961fa2df697e7a3c52f6efddaccc74c1b5f8e2c989d5c4f383084601a2f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e572bd57-633e-4abc-ba06-33f2d3fe513c, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 20 09:55:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:35.483 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[34c39caf-45a1-4cbe-a536-15fd1d104569]: (4, ('Tue Jan 20 02:55:35 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-e572bd57-633e-4abc-ba06-33f2d3fe513c (c920961fa2df697e7a3c52f6efddaccc74c1b5f8e2c989d5c4f383084601a2f7)\nc920961fa2df697e7a3c52f6efddaccc74c1b5f8e2c989d5c4f383084601a2f7\nTue Jan 20 02:55:35 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-e572bd57-633e-4abc-ba06-33f2d3fe513c (c920961fa2df697e7a3c52f6efddaccc74c1b5f8e2c989d5c4f383084601a2f7)\nc920961fa2df697e7a3c52f6efddaccc74c1b5f8e2c989d5c4f383084601a2f7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:35.484 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[3e1ca5d9-5f46-4831-9247-6d1a3d0b5f58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:35.485 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape572bd57-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:55:35 np0005588920 nova_compute[226886]: 2026-01-20 14:55:35.487 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:35 np0005588920 kernel: tape572bd57-60: left promiscuous mode
Jan 20 09:55:35 np0005588920 nova_compute[226886]: 2026-01-20 14:55:35.501 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:35.503 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[078fef0f-3d99-4410-8804-ec6569e6b1cf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:35.515 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e29cd88b-52fd-4844-8797-f973e674e8b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:35.517 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d6e9d8ea-8d63-4503-b36a-5f84fd28aa8b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:35.530 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[92711598-7a9a-4fb5-a0f3-4153cbb62b60]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 597437, 'reachable_time': 18777, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 272993, 'error': None, 'target': 'ovnmeta-e572bd57-633e-4abc-ba06-33f2d3fe513c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:35 np0005588920 systemd[1]: run-netns-ovnmeta\x2de572bd57\x2d633e\x2d4abc\x2dba06\x2d33f2d3fe513c.mount: Deactivated successfully.
Jan 20 09:55:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:35.533 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e572bd57-633e-4abc-ba06-33f2d3fe513c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:55:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:35.533 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[95638d7f-43d1-4f83-9f25-5573a30be3cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:35 np0005588920 nova_compute[226886]: 2026-01-20 14:55:35.791 226890 DEBUG nova.compute.manager [req-166a64bc-7356-43d7-869b-7b5fcc5eeda9 req-ef2f8b41-032e-4064-a6a2-375b7257c4a2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Received event network-vif-plugged-c62c62c5-b0d3-4c19-bbf8-453f86405984 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:55:35 np0005588920 nova_compute[226886]: 2026-01-20 14:55:35.792 226890 DEBUG oslo_concurrency.lockutils [req-166a64bc-7356-43d7-869b-7b5fcc5eeda9 req-ef2f8b41-032e-4064-a6a2-375b7257c4a2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "68e2c62d-7883-4f68-a2c6-da2265b01c93-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:55:35 np0005588920 nova_compute[226886]: 2026-01-20 14:55:35.792 226890 DEBUG oslo_concurrency.lockutils [req-166a64bc-7356-43d7-869b-7b5fcc5eeda9 req-ef2f8b41-032e-4064-a6a2-375b7257c4a2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "68e2c62d-7883-4f68-a2c6-da2265b01c93-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:55:35 np0005588920 nova_compute[226886]: 2026-01-20 14:55:35.792 226890 DEBUG oslo_concurrency.lockutils [req-166a64bc-7356-43d7-869b-7b5fcc5eeda9 req-ef2f8b41-032e-4064-a6a2-375b7257c4a2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "68e2c62d-7883-4f68-a2c6-da2265b01c93-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:55:35 np0005588920 nova_compute[226886]: 2026-01-20 14:55:35.792 226890 DEBUG nova.compute.manager [req-166a64bc-7356-43d7-869b-7b5fcc5eeda9 req-ef2f8b41-032e-4064-a6a2-375b7257c4a2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] No waiting events found dispatching network-vif-plugged-c62c62c5-b0d3-4c19-bbf8-453f86405984 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:55:35 np0005588920 nova_compute[226886]: 2026-01-20 14:55:35.793 226890 WARNING nova.compute.manager [req-166a64bc-7356-43d7-869b-7b5fcc5eeda9 req-ef2f8b41-032e-4064-a6a2-375b7257c4a2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Received unexpected event network-vif-plugged-c62c62c5-b0d3-4c19-bbf8-453f86405984 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:55:35 np0005588920 nova_compute[226886]: 2026-01-20 14:55:35.817 226890 INFO nova.virt.libvirt.driver [None req-580ce056-84e3-485f-9e25-329ec825933b 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Deleting instance files /var/lib/nova/instances/68e2c62d-7883-4f68-a2c6-da2265b01c93_del#033[00m
Jan 20 09:55:35 np0005588920 nova_compute[226886]: 2026-01-20 14:55:35.818 226890 INFO nova.virt.libvirt.driver [None req-580ce056-84e3-485f-9e25-329ec825933b 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Deletion of /var/lib/nova/instances/68e2c62d-7883-4f68-a2c6-da2265b01c93_del complete#033[00m
Jan 20 09:55:35 np0005588920 nova_compute[226886]: 2026-01-20 14:55:35.879 226890 INFO nova.compute.manager [None req-580ce056-84e3-485f-9e25-329ec825933b 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Took 0.68 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:55:35 np0005588920 nova_compute[226886]: 2026-01-20 14:55:35.880 226890 DEBUG oslo.service.loopingcall [None req-580ce056-84e3-485f-9e25-329ec825933b 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:55:35 np0005588920 nova_compute[226886]: 2026-01-20 14:55:35.880 226890 DEBUG nova.compute.manager [-] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:55:35 np0005588920 nova_compute[226886]: 2026-01-20 14:55:35.881 226890 DEBUG nova.network.neutron [-] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:55:36 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:36.025 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=40, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=39) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:55:36 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:36.026 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:55:36 np0005588920 nova_compute[226886]: 2026-01-20 14:55:36.061 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:36 np0005588920 nova_compute[226886]: 2026-01-20 14:55:36.092 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:36.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:37 np0005588920 nova_compute[226886]: 2026-01-20 14:55:37.199 226890 DEBUG nova.network.neutron [-] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:55:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:55:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:37.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:55:37 np0005588920 nova_compute[226886]: 2026-01-20 14:55:37.307 226890 INFO nova.compute.manager [-] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Took 1.43 seconds to deallocate network for instance.#033[00m
Jan 20 09:55:37 np0005588920 nova_compute[226886]: 2026-01-20 14:55:37.348 226890 DEBUG oslo_concurrency.lockutils [None req-580ce056-84e3-485f-9e25-329ec825933b 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:55:37 np0005588920 nova_compute[226886]: 2026-01-20 14:55:37.349 226890 DEBUG oslo_concurrency.lockutils [None req-580ce056-84e3-485f-9e25-329ec825933b 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:55:37 np0005588920 nova_compute[226886]: 2026-01-20 14:55:37.435 226890 DEBUG oslo_concurrency.processutils [None req-580ce056-84e3-485f-9e25-329ec825933b 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:55:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e306 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:55:37 np0005588920 nova_compute[226886]: 2026-01-20 14:55:37.700 226890 DEBUG nova.compute.manager [req-c95d534d-3b65-4a6f-a7ce-f1f8bb2556de req-ffac4040-a82a-43b6-af16-af5cc83bf6dc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Received event network-vif-deleted-c62c62c5-b0d3-4c19-bbf8-453f86405984 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:55:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:55:37 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2887191558' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:55:37 np0005588920 nova_compute[226886]: 2026-01-20 14:55:37.879 226890 DEBUG nova.compute.manager [req-731228e8-80f0-4723-a2b7-7192e2df5d35 req-853b2a95-79aa-4b59-bb8a-7a0b4bded01f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Received event network-vif-unplugged-c62c62c5-b0d3-4c19-bbf8-453f86405984 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:55:37 np0005588920 nova_compute[226886]: 2026-01-20 14:55:37.880 226890 DEBUG oslo_concurrency.lockutils [req-731228e8-80f0-4723-a2b7-7192e2df5d35 req-853b2a95-79aa-4b59-bb8a-7a0b4bded01f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "68e2c62d-7883-4f68-a2c6-da2265b01c93-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:55:37 np0005588920 nova_compute[226886]: 2026-01-20 14:55:37.880 226890 DEBUG oslo_concurrency.lockutils [req-731228e8-80f0-4723-a2b7-7192e2df5d35 req-853b2a95-79aa-4b59-bb8a-7a0b4bded01f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "68e2c62d-7883-4f68-a2c6-da2265b01c93-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:55:37 np0005588920 nova_compute[226886]: 2026-01-20 14:55:37.881 226890 DEBUG oslo_concurrency.lockutils [req-731228e8-80f0-4723-a2b7-7192e2df5d35 req-853b2a95-79aa-4b59-bb8a-7a0b4bded01f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "68e2c62d-7883-4f68-a2c6-da2265b01c93-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:55:37 np0005588920 nova_compute[226886]: 2026-01-20 14:55:37.881 226890 DEBUG nova.compute.manager [req-731228e8-80f0-4723-a2b7-7192e2df5d35 req-853b2a95-79aa-4b59-bb8a-7a0b4bded01f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] No waiting events found dispatching network-vif-unplugged-c62c62c5-b0d3-4c19-bbf8-453f86405984 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:55:37 np0005588920 nova_compute[226886]: 2026-01-20 14:55:37.881 226890 WARNING nova.compute.manager [req-731228e8-80f0-4723-a2b7-7192e2df5d35 req-853b2a95-79aa-4b59-bb8a-7a0b4bded01f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Received unexpected event network-vif-unplugged-c62c62c5-b0d3-4c19-bbf8-453f86405984 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 09:55:37 np0005588920 nova_compute[226886]: 2026-01-20 14:55:37.882 226890 DEBUG nova.compute.manager [req-731228e8-80f0-4723-a2b7-7192e2df5d35 req-853b2a95-79aa-4b59-bb8a-7a0b4bded01f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Received event network-vif-plugged-c62c62c5-b0d3-4c19-bbf8-453f86405984 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:55:37 np0005588920 nova_compute[226886]: 2026-01-20 14:55:37.882 226890 DEBUG oslo_concurrency.lockutils [req-731228e8-80f0-4723-a2b7-7192e2df5d35 req-853b2a95-79aa-4b59-bb8a-7a0b4bded01f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "68e2c62d-7883-4f68-a2c6-da2265b01c93-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:55:37 np0005588920 nova_compute[226886]: 2026-01-20 14:55:37.882 226890 DEBUG oslo_concurrency.lockutils [req-731228e8-80f0-4723-a2b7-7192e2df5d35 req-853b2a95-79aa-4b59-bb8a-7a0b4bded01f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "68e2c62d-7883-4f68-a2c6-da2265b01c93-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:55:37 np0005588920 nova_compute[226886]: 2026-01-20 14:55:37.882 226890 DEBUG oslo_concurrency.lockutils [req-731228e8-80f0-4723-a2b7-7192e2df5d35 req-853b2a95-79aa-4b59-bb8a-7a0b4bded01f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "68e2c62d-7883-4f68-a2c6-da2265b01c93-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:55:37 np0005588920 nova_compute[226886]: 2026-01-20 14:55:37.883 226890 DEBUG nova.compute.manager [req-731228e8-80f0-4723-a2b7-7192e2df5d35 req-853b2a95-79aa-4b59-bb8a-7a0b4bded01f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] No waiting events found dispatching network-vif-plugged-c62c62c5-b0d3-4c19-bbf8-453f86405984 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:55:37 np0005588920 nova_compute[226886]: 2026-01-20 14:55:37.883 226890 WARNING nova.compute.manager [req-731228e8-80f0-4723-a2b7-7192e2df5d35 req-853b2a95-79aa-4b59-bb8a-7a0b4bded01f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Received unexpected event network-vif-plugged-c62c62c5-b0d3-4c19-bbf8-453f86405984 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 09:55:37 np0005588920 nova_compute[226886]: 2026-01-20 14:55:37.890 226890 DEBUG oslo_concurrency.processutils [None req-580ce056-84e3-485f-9e25-329ec825933b 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:55:37 np0005588920 nova_compute[226886]: 2026-01-20 14:55:37.895 226890 DEBUG nova.compute.provider_tree [None req-580ce056-84e3-485f-9e25-329ec825933b 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:55:37 np0005588920 nova_compute[226886]: 2026-01-20 14:55:37.909 226890 DEBUG nova.scheduler.client.report [None req-580ce056-84e3-485f-9e25-329ec825933b 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:55:37 np0005588920 nova_compute[226886]: 2026-01-20 14:55:37.955 226890 DEBUG oslo_concurrency.lockutils [None req-580ce056-84e3-485f-9e25-329ec825933b 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.606s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:55:37 np0005588920 nova_compute[226886]: 2026-01-20 14:55:37.994 226890 INFO nova.scheduler.client.report [None req-580ce056-84e3-485f-9e25-329ec825933b 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Deleted allocations for instance 68e2c62d-7883-4f68-a2c6-da2265b01c93#033[00m
Jan 20 09:55:38 np0005588920 nova_compute[226886]: 2026-01-20 14:55:38.049 226890 DEBUG oslo_concurrency.lockutils [None req-580ce056-84e3-485f-9e25-329ec825933b 87bcc22682984b40b43e0246ea142695 ec388b65a7fc480f99d0ceb5451725ea - - default default] Lock "68e2c62d-7883-4f68-a2c6-da2265b01c93" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.856s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:55:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:38.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:38 np0005588920 nova_compute[226886]: 2026-01-20 14:55:38.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:55:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:39.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:40.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:40 np0005588920 nova_compute[226886]: 2026-01-20 14:55:40.456 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:41 np0005588920 nova_compute[226886]: 2026-01-20 14:55:41.096 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:41.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:41 np0005588920 nova_compute[226886]: 2026-01-20 14:55:41.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:55:41 np0005588920 nova_compute[226886]: 2026-01-20 14:55:41.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:55:41 np0005588920 nova_compute[226886]: 2026-01-20 14:55:41.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 09:55:41 np0005588920 nova_compute[226886]: 2026-01-20 14:55:41.915 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "refresh_cache-ce0152a6-7d4d-4eac-9587-a43ad934d9cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:55:41 np0005588920 nova_compute[226886]: 2026-01-20 14:55:41.915 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquired lock "refresh_cache-ce0152a6-7d4d-4eac-9587-a43ad934d9cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:55:41 np0005588920 nova_compute[226886]: 2026-01-20 14:55:41.915 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 09:55:41 np0005588920 nova_compute[226886]: 2026-01-20 14:55:41.915 226890 DEBUG nova.objects.instance [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid ce0152a6-7d4d-4eac-9587-a43ad934d9cc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:55:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:42.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e306 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:55:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:43.027 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '40'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:55:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:55:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:43.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:55:43 np0005588920 ovn_controller[133971]: 2026-01-20T14:55:43Z|00598|binding|INFO|Releasing lport 3fa2df7b-42b2-4a3b-a33b-ab37b5d6aef3 from this chassis (sb_readonly=0)
Jan 20 09:55:43 np0005588920 nova_compute[226886]: 2026-01-20 14:55:43.917 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:44 np0005588920 nova_compute[226886]: 2026-01-20 14:55:44.090 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Updating instance_info_cache with network_info: [{"id": "083e3cc0-e665-4049-a47b-233abf07b9d5", "address": "fa:16:3e:6a:15:6d", "network": {"id": "41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1445030024-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3b1b7f5b4f84b5abbc401eb577c85c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap083e3cc0-e6", "ovs_interfaceid": "083e3cc0-e665-4049-a47b-233abf07b9d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:55:44 np0005588920 nova_compute[226886]: 2026-01-20 14:55:44.106 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Releasing lock "refresh_cache-ce0152a6-7d4d-4eac-9587-a43ad934d9cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:55:44 np0005588920 nova_compute[226886]: 2026-01-20 14:55:44.107 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 09:55:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:55:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:44.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:55:44 np0005588920 nova_compute[226886]: 2026-01-20 14:55:44.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:55:44 np0005588920 nova_compute[226886]: 2026-01-20 14:55:44.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:55:44 np0005588920 nova_compute[226886]: 2026-01-20 14:55:44.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:55:44 np0005588920 nova_compute[226886]: 2026-01-20 14:55:44.744 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:55:44 np0005588920 nova_compute[226886]: 2026-01-20 14:55:44.744 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:55:44 np0005588920 nova_compute[226886]: 2026-01-20 14:55:44.744 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:55:44 np0005588920 nova_compute[226886]: 2026-01-20 14:55:44.744 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:55:44 np0005588920 nova_compute[226886]: 2026-01-20 14:55:44.745 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:55:45 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:55:45 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3485196193' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:55:45 np0005588920 nova_compute[226886]: 2026-01-20 14:55:45.218 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:55:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:45.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:45 np0005588920 nova_compute[226886]: 2026-01-20 14:55:45.372 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-0000007a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:55:45 np0005588920 nova_compute[226886]: 2026-01-20 14:55:45.372 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-0000007a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:55:45 np0005588920 nova_compute[226886]: 2026-01-20 14:55:45.458 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:45 np0005588920 nova_compute[226886]: 2026-01-20 14:55:45.526 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:55:45 np0005588920 nova_compute[226886]: 2026-01-20 14:55:45.527 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4120MB free_disk=20.87616729736328GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:55:45 np0005588920 nova_compute[226886]: 2026-01-20 14:55:45.527 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:55:45 np0005588920 nova_compute[226886]: 2026-01-20 14:55:45.528 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:55:45 np0005588920 nova_compute[226886]: 2026-01-20 14:55:45.628 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance ce0152a6-7d4d-4eac-9587-a43ad934d9cc actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:55:45 np0005588920 nova_compute[226886]: 2026-01-20 14:55:45.629 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:55:45 np0005588920 nova_compute[226886]: 2026-01-20 14:55:45.629 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:55:45 np0005588920 nova_compute[226886]: 2026-01-20 14:55:45.669 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:55:46 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:55:46 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4122561463' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:55:46 np0005588920 nova_compute[226886]: 2026-01-20 14:55:46.086 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.417s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:55:46 np0005588920 nova_compute[226886]: 2026-01-20 14:55:46.092 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:55:46 np0005588920 nova_compute[226886]: 2026-01-20 14:55:46.097 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:46 np0005588920 nova_compute[226886]: 2026-01-20 14:55:46.115 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:55:46 np0005588920 nova_compute[226886]: 2026-01-20 14:55:46.157 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:55:46 np0005588920 nova_compute[226886]: 2026-01-20 14:55:46.157 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.630s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:55:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:46.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:47 np0005588920 nova_compute[226886]: 2026-01-20 14:55:47.153 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:55:47 np0005588920 nova_compute[226886]: 2026-01-20 14:55:47.154 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:55:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:47.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e306 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:55:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:48.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:48 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:55:48 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3137524621' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:55:48 np0005588920 nova_compute[226886]: 2026-01-20 14:55:48.651 226890 DEBUG oslo_concurrency.lockutils [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Acquiring lock "3339fe13-fddd-4233-9eac-bb4dbce1c777" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:55:48 np0005588920 nova_compute[226886]: 2026-01-20 14:55:48.651 226890 DEBUG oslo_concurrency.lockutils [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lock "3339fe13-fddd-4233-9eac-bb4dbce1c777" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:55:48 np0005588920 nova_compute[226886]: 2026-01-20 14:55:48.681 226890 DEBUG nova.compute.manager [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:55:48 np0005588920 nova_compute[226886]: 2026-01-20 14:55:48.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:55:48 np0005588920 nova_compute[226886]: 2026-01-20 14:55:48.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:55:48 np0005588920 nova_compute[226886]: 2026-01-20 14:55:48.770 226890 DEBUG oslo_concurrency.lockutils [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:55:48 np0005588920 nova_compute[226886]: 2026-01-20 14:55:48.770 226890 DEBUG oslo_concurrency.lockutils [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:55:48 np0005588920 nova_compute[226886]: 2026-01-20 14:55:48.775 226890 DEBUG nova.virt.hardware [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:55:48 np0005588920 nova_compute[226886]: 2026-01-20 14:55:48.776 226890 INFO nova.compute.claims [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 09:55:48 np0005588920 nova_compute[226886]: 2026-01-20 14:55:48.875 226890 DEBUG oslo_concurrency.processutils [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:55:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:55:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:49.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:55:49 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:55:49 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/858933901' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:55:49 np0005588920 nova_compute[226886]: 2026-01-20 14:55:49.311 226890 DEBUG oslo_concurrency.processutils [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:55:49 np0005588920 nova_compute[226886]: 2026-01-20 14:55:49.317 226890 DEBUG nova.compute.provider_tree [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:55:49 np0005588920 nova_compute[226886]: 2026-01-20 14:55:49.330 226890 DEBUG nova.scheduler.client.report [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:55:49 np0005588920 nova_compute[226886]: 2026-01-20 14:55:49.350 226890 DEBUG oslo_concurrency.lockutils [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.579s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:55:49 np0005588920 nova_compute[226886]: 2026-01-20 14:55:49.350 226890 DEBUG nova.compute.manager [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:55:49 np0005588920 nova_compute[226886]: 2026-01-20 14:55:49.403 226890 DEBUG nova.compute.manager [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:55:49 np0005588920 nova_compute[226886]: 2026-01-20 14:55:49.404 226890 DEBUG nova.network.neutron [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:55:49 np0005588920 nova_compute[226886]: 2026-01-20 14:55:49.420 226890 INFO nova.virt.libvirt.driver [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:55:49 np0005588920 nova_compute[226886]: 2026-01-20 14:55:49.440 226890 DEBUG nova.compute.manager [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:55:49 np0005588920 nova_compute[226886]: 2026-01-20 14:55:49.519 226890 DEBUG nova.compute.manager [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:55:49 np0005588920 nova_compute[226886]: 2026-01-20 14:55:49.521 226890 DEBUG nova.virt.libvirt.driver [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:55:49 np0005588920 nova_compute[226886]: 2026-01-20 14:55:49.521 226890 INFO nova.virt.libvirt.driver [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Creating image(s)#033[00m
Jan 20 09:55:49 np0005588920 nova_compute[226886]: 2026-01-20 14:55:49.561 226890 DEBUG nova.storage.rbd_utils [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] rbd image 3339fe13-fddd-4233-9eac-bb4dbce1c777_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:55:49 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e307 e307: 3 total, 3 up, 3 in
Jan 20 09:55:49 np0005588920 nova_compute[226886]: 2026-01-20 14:55:49.590 226890 DEBUG nova.storage.rbd_utils [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] rbd image 3339fe13-fddd-4233-9eac-bb4dbce1c777_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:55:49 np0005588920 nova_compute[226886]: 2026-01-20 14:55:49.618 226890 DEBUG nova.storage.rbd_utils [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] rbd image 3339fe13-fddd-4233-9eac-bb4dbce1c777_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:55:49 np0005588920 nova_compute[226886]: 2026-01-20 14:55:49.622 226890 DEBUG oslo_concurrency.processutils [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:55:49 np0005588920 nova_compute[226886]: 2026-01-20 14:55:49.649 226890 DEBUG nova.policy [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '395a5c503218411284bc94c45263d1fb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ca6cd0afe0ab41e3ab36d21a4129f734', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:55:49 np0005588920 nova_compute[226886]: 2026-01-20 14:55:49.690 226890 DEBUG oslo_concurrency.processutils [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:55:49 np0005588920 nova_compute[226886]: 2026-01-20 14:55:49.691 226890 DEBUG oslo_concurrency.lockutils [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:55:49 np0005588920 nova_compute[226886]: 2026-01-20 14:55:49.691 226890 DEBUG oslo_concurrency.lockutils [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:55:49 np0005588920 nova_compute[226886]: 2026-01-20 14:55:49.692 226890 DEBUG oslo_concurrency.lockutils [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:55:49 np0005588920 nova_compute[226886]: 2026-01-20 14:55:49.718 226890 DEBUG nova.storage.rbd_utils [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] rbd image 3339fe13-fddd-4233-9eac-bb4dbce1c777_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:55:49 np0005588920 nova_compute[226886]: 2026-01-20 14:55:49.721 226890 DEBUG oslo_concurrency.processutils [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 3339fe13-fddd-4233-9eac-bb4dbce1c777_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:55:49 np0005588920 nova_compute[226886]: 2026-01-20 14:55:49.745 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:55:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:50.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:50 np0005588920 nova_compute[226886]: 2026-01-20 14:55:50.431 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768920935.4305096, 68e2c62d-7883-4f68-a2c6-da2265b01c93 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:55:50 np0005588920 nova_compute[226886]: 2026-01-20 14:55:50.432 226890 INFO nova.compute.manager [-] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:55:50 np0005588920 nova_compute[226886]: 2026-01-20 14:55:50.459 226890 DEBUG oslo_concurrency.processutils [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 3339fe13-fddd-4233-9eac-bb4dbce1c777_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.738s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:55:50 np0005588920 nova_compute[226886]: 2026-01-20 14:55:50.487 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:50 np0005588920 nova_compute[226886]: 2026-01-20 14:55:50.526 226890 DEBUG nova.storage.rbd_utils [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] resizing rbd image 3339fe13-fddd-4233-9eac-bb4dbce1c777_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:55:50 np0005588920 nova_compute[226886]: 2026-01-20 14:55:50.631 226890 DEBUG nova.compute.manager [None req-3215e4e8-67ba-414a-ad5b-7767644443ec - - - - - -] [instance: 68e2c62d-7883-4f68-a2c6-da2265b01c93] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:55:50 np0005588920 nova_compute[226886]: 2026-01-20 14:55:50.636 226890 DEBUG nova.objects.instance [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lazy-loading 'migration_context' on Instance uuid 3339fe13-fddd-4233-9eac-bb4dbce1c777 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:55:50 np0005588920 nova_compute[226886]: 2026-01-20 14:55:50.683 226890 DEBUG nova.virt.libvirt.driver [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:55:50 np0005588920 nova_compute[226886]: 2026-01-20 14:55:50.684 226890 DEBUG nova.virt.libvirt.driver [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Ensure instance console log exists: /var/lib/nova/instances/3339fe13-fddd-4233-9eac-bb4dbce1c777/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:55:50 np0005588920 nova_compute[226886]: 2026-01-20 14:55:50.684 226890 DEBUG oslo_concurrency.lockutils [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:55:50 np0005588920 nova_compute[226886]: 2026-01-20 14:55:50.684 226890 DEBUG oslo_concurrency.lockutils [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:55:50 np0005588920 nova_compute[226886]: 2026-01-20 14:55:50.684 226890 DEBUG oslo_concurrency.lockutils [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:55:50 np0005588920 nova_compute[226886]: 2026-01-20 14:55:50.798 226890 DEBUG nova.network.neutron [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Successfully created port: 852888ae-cf7b-4cbe-b96c-3b75b073d386 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:55:51 np0005588920 nova_compute[226886]: 2026-01-20 14:55:51.141 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:55:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:51.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:55:51 np0005588920 nova_compute[226886]: 2026-01-20 14:55:51.905 226890 DEBUG nova.network.neutron [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Successfully updated port: 852888ae-cf7b-4cbe-b96c-3b75b073d386 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:55:51 np0005588920 nova_compute[226886]: 2026-01-20 14:55:51.926 226890 DEBUG oslo_concurrency.lockutils [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Acquiring lock "refresh_cache-3339fe13-fddd-4233-9eac-bb4dbce1c777" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:55:51 np0005588920 nova_compute[226886]: 2026-01-20 14:55:51.927 226890 DEBUG oslo_concurrency.lockutils [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Acquired lock "refresh_cache-3339fe13-fddd-4233-9eac-bb4dbce1c777" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:55:51 np0005588920 nova_compute[226886]: 2026-01-20 14:55:51.927 226890 DEBUG nova.network.neutron [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:55:52 np0005588920 nova_compute[226886]: 2026-01-20 14:55:52.109 226890 DEBUG nova.network.neutron [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:55:52 np0005588920 nova_compute[226886]: 2026-01-20 14:55:52.129 226890 DEBUG nova.compute.manager [req-46c79971-ebc7-4111-af41-ac6df84a3c0b req-be66cd4f-eda2-40f9-ad13-d1ff79013e6e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Received event network-changed-852888ae-cf7b-4cbe-b96c-3b75b073d386 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:55:52 np0005588920 nova_compute[226886]: 2026-01-20 14:55:52.129 226890 DEBUG nova.compute.manager [req-46c79971-ebc7-4111-af41-ac6df84a3c0b req-be66cd4f-eda2-40f9-ad13-d1ff79013e6e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Refreshing instance network info cache due to event network-changed-852888ae-cf7b-4cbe-b96c-3b75b073d386. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:55:52 np0005588920 nova_compute[226886]: 2026-01-20 14:55:52.130 226890 DEBUG oslo_concurrency.lockutils [req-46c79971-ebc7-4111-af41-ac6df84a3c0b req-be66cd4f-eda2-40f9-ad13-d1ff79013e6e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-3339fe13-fddd-4233-9eac-bb4dbce1c777" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:55:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:52.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e307 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:55:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:53.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:53 np0005588920 nova_compute[226886]: 2026-01-20 14:55:53.395 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:53 np0005588920 nova_compute[226886]: 2026-01-20 14:55:53.504 226890 DEBUG nova.network.neutron [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Updating instance_info_cache with network_info: [{"id": "852888ae-cf7b-4cbe-b96c-3b75b073d386", "address": "fa:16:3e:fe:3f:5f", "network": {"id": "f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c", "bridge": "br-int", "label": "tempest-ServersTestJSON-1745321011-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ca6cd0afe0ab41e3ab36d21a4129f734", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap852888ae-cf", "ovs_interfaceid": "852888ae-cf7b-4cbe-b96c-3b75b073d386", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:55:53 np0005588920 nova_compute[226886]: 2026-01-20 14:55:53.740 226890 DEBUG oslo_concurrency.lockutils [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Releasing lock "refresh_cache-3339fe13-fddd-4233-9eac-bb4dbce1c777" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:55:53 np0005588920 nova_compute[226886]: 2026-01-20 14:55:53.740 226890 DEBUG nova.compute.manager [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Instance network_info: |[{"id": "852888ae-cf7b-4cbe-b96c-3b75b073d386", "address": "fa:16:3e:fe:3f:5f", "network": {"id": "f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c", "bridge": "br-int", "label": "tempest-ServersTestJSON-1745321011-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ca6cd0afe0ab41e3ab36d21a4129f734", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap852888ae-cf", "ovs_interfaceid": "852888ae-cf7b-4cbe-b96c-3b75b073d386", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:55:53 np0005588920 nova_compute[226886]: 2026-01-20 14:55:53.740 226890 DEBUG oslo_concurrency.lockutils [req-46c79971-ebc7-4111-af41-ac6df84a3c0b req-be66cd4f-eda2-40f9-ad13-d1ff79013e6e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-3339fe13-fddd-4233-9eac-bb4dbce1c777" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:55:53 np0005588920 nova_compute[226886]: 2026-01-20 14:55:53.741 226890 DEBUG nova.network.neutron [req-46c79971-ebc7-4111-af41-ac6df84a3c0b req-be66cd4f-eda2-40f9-ad13-d1ff79013e6e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Refreshing network info cache for port 852888ae-cf7b-4cbe-b96c-3b75b073d386 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:55:53 np0005588920 nova_compute[226886]: 2026-01-20 14:55:53.743 226890 DEBUG nova.virt.libvirt.driver [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Start _get_guest_xml network_info=[{"id": "852888ae-cf7b-4cbe-b96c-3b75b073d386", "address": "fa:16:3e:fe:3f:5f", "network": {"id": "f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c", "bridge": "br-int", "label": "tempest-ServersTestJSON-1745321011-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ca6cd0afe0ab41e3ab36d21a4129f734", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap852888ae-cf", "ovs_interfaceid": "852888ae-cf7b-4cbe-b96c-3b75b073d386", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:55:53 np0005588920 nova_compute[226886]: 2026-01-20 14:55:53.747 226890 WARNING nova.virt.libvirt.driver [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:55:53 np0005588920 nova_compute[226886]: 2026-01-20 14:55:53.753 226890 DEBUG nova.virt.libvirt.host [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:55:53 np0005588920 nova_compute[226886]: 2026-01-20 14:55:53.754 226890 DEBUG nova.virt.libvirt.host [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:55:53 np0005588920 nova_compute[226886]: 2026-01-20 14:55:53.759 226890 DEBUG nova.virt.libvirt.host [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:55:53 np0005588920 nova_compute[226886]: 2026-01-20 14:55:53.760 226890 DEBUG nova.virt.libvirt.host [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:55:53 np0005588920 nova_compute[226886]: 2026-01-20 14:55:53.761 226890 DEBUG nova.virt.libvirt.driver [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:55:53 np0005588920 nova_compute[226886]: 2026-01-20 14:55:53.761 226890 DEBUG nova.virt.hardware [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:55:53 np0005588920 nova_compute[226886]: 2026-01-20 14:55:53.761 226890 DEBUG nova.virt.hardware [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:55:53 np0005588920 nova_compute[226886]: 2026-01-20 14:55:53.761 226890 DEBUG nova.virt.hardware [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:55:53 np0005588920 nova_compute[226886]: 2026-01-20 14:55:53.761 226890 DEBUG nova.virt.hardware [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:55:53 np0005588920 nova_compute[226886]: 2026-01-20 14:55:53.761 226890 DEBUG nova.virt.hardware [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:55:53 np0005588920 nova_compute[226886]: 2026-01-20 14:55:53.762 226890 DEBUG nova.virt.hardware [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:55:53 np0005588920 nova_compute[226886]: 2026-01-20 14:55:53.762 226890 DEBUG nova.virt.hardware [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:55:53 np0005588920 nova_compute[226886]: 2026-01-20 14:55:53.762 226890 DEBUG nova.virt.hardware [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:55:53 np0005588920 nova_compute[226886]: 2026-01-20 14:55:53.762 226890 DEBUG nova.virt.hardware [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:55:53 np0005588920 nova_compute[226886]: 2026-01-20 14:55:53.762 226890 DEBUG nova.virt.hardware [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:55:53 np0005588920 nova_compute[226886]: 2026-01-20 14:55:53.762 226890 DEBUG nova.virt.hardware [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:55:53 np0005588920 nova_compute[226886]: 2026-01-20 14:55:53.765 226890 DEBUG oslo_concurrency.processutils [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:55:53 np0005588920 podman[273270]: 2026-01-20 14:55:53.990282939 +0000 UTC m=+0.076146026 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 20 09:55:54 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:55:54 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2206619567' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:55:54 np0005588920 nova_compute[226886]: 2026-01-20 14:55:54.196 226890 DEBUG oslo_concurrency.processutils [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:55:54 np0005588920 nova_compute[226886]: 2026-01-20 14:55:54.223 226890 DEBUG nova.storage.rbd_utils [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] rbd image 3339fe13-fddd-4233-9eac-bb4dbce1c777_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:55:54 np0005588920 nova_compute[226886]: 2026-01-20 14:55:54.230 226890 DEBUG oslo_concurrency.processutils [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:55:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:54.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:54 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:55:54 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/251820846' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:55:54 np0005588920 nova_compute[226886]: 2026-01-20 14:55:54.724 226890 DEBUG oslo_concurrency.processutils [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.495s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:55:54 np0005588920 nova_compute[226886]: 2026-01-20 14:55:54.726 226890 DEBUG nova.virt.libvirt.vif [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=2001:2001::3,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:55:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-343728074',display_name='tempest-ServersTestJSON-server-343728074',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-343728074',id=128,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ca6cd0afe0ab41e3ab36d21a4129f734',ramdisk_id='',reservation_id='r-act430l1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-405461620',owner_user_name='tempest-ServersTestJSON-405461620-project-memb
er'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:55:49Z,user_data=None,user_id='395a5c503218411284bc94c45263d1fb',uuid=3339fe13-fddd-4233-9eac-bb4dbce1c777,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "852888ae-cf7b-4cbe-b96c-3b75b073d386", "address": "fa:16:3e:fe:3f:5f", "network": {"id": "f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c", "bridge": "br-int", "label": "tempest-ServersTestJSON-1745321011-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ca6cd0afe0ab41e3ab36d21a4129f734", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap852888ae-cf", "ovs_interfaceid": "852888ae-cf7b-4cbe-b96c-3b75b073d386", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:55:54 np0005588920 nova_compute[226886]: 2026-01-20 14:55:54.726 226890 DEBUG nova.network.os_vif_util [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Converting VIF {"id": "852888ae-cf7b-4cbe-b96c-3b75b073d386", "address": "fa:16:3e:fe:3f:5f", "network": {"id": "f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c", "bridge": "br-int", "label": "tempest-ServersTestJSON-1745321011-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ca6cd0afe0ab41e3ab36d21a4129f734", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap852888ae-cf", "ovs_interfaceid": "852888ae-cf7b-4cbe-b96c-3b75b073d386", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:55:54 np0005588920 nova_compute[226886]: 2026-01-20 14:55:54.727 226890 DEBUG nova.network.os_vif_util [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fe:3f:5f,bridge_name='br-int',has_traffic_filtering=True,id=852888ae-cf7b-4cbe-b96c-3b75b073d386,network=Network(f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap852888ae-cf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:55:54 np0005588920 nova_compute[226886]: 2026-01-20 14:55:54.729 226890 DEBUG nova.objects.instance [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3339fe13-fddd-4233-9eac-bb4dbce1c777 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:55:54 np0005588920 nova_compute[226886]: 2026-01-20 14:55:54.742 226890 DEBUG nova.virt.libvirt.driver [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:55:54 np0005588920 nova_compute[226886]:  <uuid>3339fe13-fddd-4233-9eac-bb4dbce1c777</uuid>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:  <name>instance-00000080</name>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:55:54 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:      <nova:name>tempest-ServersTestJSON-server-343728074</nova:name>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:55:53</nova:creationTime>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:55:54 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:        <nova:user uuid="395a5c503218411284bc94c45263d1fb">tempest-ServersTestJSON-405461620-project-member</nova:user>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:        <nova:project uuid="ca6cd0afe0ab41e3ab36d21a4129f734">tempest-ServersTestJSON-405461620</nova:project>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:        <nova:port uuid="852888ae-cf7b-4cbe-b96c-3b75b073d386">
Jan 20 09:55:54 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:      <entry name="serial">3339fe13-fddd-4233-9eac-bb4dbce1c777</entry>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:      <entry name="uuid">3339fe13-fddd-4233-9eac-bb4dbce1c777</entry>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:55:54 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/3339fe13-fddd-4233-9eac-bb4dbce1c777_disk">
Jan 20 09:55:54 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:55:54 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:55:54 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/3339fe13-fddd-4233-9eac-bb4dbce1c777_disk.config">
Jan 20 09:55:54 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:55:54 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:55:54 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:fe:3f:5f"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:      <target dev="tap852888ae-cf"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:55:54 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/3339fe13-fddd-4233-9eac-bb4dbce1c777/console.log" append="off"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:55:54 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:55:54 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:55:54 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:55:54 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:55:54 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:55:54 np0005588920 nova_compute[226886]: 2026-01-20 14:55:54.743 226890 DEBUG nova.compute.manager [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Preparing to wait for external event network-vif-plugged-852888ae-cf7b-4cbe-b96c-3b75b073d386 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:55:54 np0005588920 nova_compute[226886]: 2026-01-20 14:55:54.744 226890 DEBUG oslo_concurrency.lockutils [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Acquiring lock "3339fe13-fddd-4233-9eac-bb4dbce1c777-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:55:54 np0005588920 nova_compute[226886]: 2026-01-20 14:55:54.744 226890 DEBUG oslo_concurrency.lockutils [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lock "3339fe13-fddd-4233-9eac-bb4dbce1c777-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:55:54 np0005588920 nova_compute[226886]: 2026-01-20 14:55:54.745 226890 DEBUG oslo_concurrency.lockutils [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lock "3339fe13-fddd-4233-9eac-bb4dbce1c777-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:55:54 np0005588920 nova_compute[226886]: 2026-01-20 14:55:54.746 226890 DEBUG nova.virt.libvirt.vif [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=2001:2001::3,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:55:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-343728074',display_name='tempest-ServersTestJSON-server-343728074',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-343728074',id=128,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ca6cd0afe0ab41e3ab36d21a4129f734',ramdisk_id='',reservation_id='r-act430l1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-405461620',owner_user_name='tempest-ServersTestJSON-405461620-pr
oject-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:55:49Z,user_data=None,user_id='395a5c503218411284bc94c45263d1fb',uuid=3339fe13-fddd-4233-9eac-bb4dbce1c777,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "852888ae-cf7b-4cbe-b96c-3b75b073d386", "address": "fa:16:3e:fe:3f:5f", "network": {"id": "f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c", "bridge": "br-int", "label": "tempest-ServersTestJSON-1745321011-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ca6cd0afe0ab41e3ab36d21a4129f734", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap852888ae-cf", "ovs_interfaceid": "852888ae-cf7b-4cbe-b96c-3b75b073d386", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:55:54 np0005588920 nova_compute[226886]: 2026-01-20 14:55:54.746 226890 DEBUG nova.network.os_vif_util [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Converting VIF {"id": "852888ae-cf7b-4cbe-b96c-3b75b073d386", "address": "fa:16:3e:fe:3f:5f", "network": {"id": "f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c", "bridge": "br-int", "label": "tempest-ServersTestJSON-1745321011-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ca6cd0afe0ab41e3ab36d21a4129f734", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap852888ae-cf", "ovs_interfaceid": "852888ae-cf7b-4cbe-b96c-3b75b073d386", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:55:54 np0005588920 nova_compute[226886]: 2026-01-20 14:55:54.746 226890 DEBUG nova.network.os_vif_util [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fe:3f:5f,bridge_name='br-int',has_traffic_filtering=True,id=852888ae-cf7b-4cbe-b96c-3b75b073d386,network=Network(f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap852888ae-cf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:55:54 np0005588920 nova_compute[226886]: 2026-01-20 14:55:54.747 226890 DEBUG os_vif [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fe:3f:5f,bridge_name='br-int',has_traffic_filtering=True,id=852888ae-cf7b-4cbe-b96c-3b75b073d386,network=Network(f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap852888ae-cf') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:55:54 np0005588920 nova_compute[226886]: 2026-01-20 14:55:54.748 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:54 np0005588920 nova_compute[226886]: 2026-01-20 14:55:54.748 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:55:54 np0005588920 nova_compute[226886]: 2026-01-20 14:55:54.749 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:55:54 np0005588920 nova_compute[226886]: 2026-01-20 14:55:54.752 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:54 np0005588920 nova_compute[226886]: 2026-01-20 14:55:54.752 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap852888ae-cf, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:55:54 np0005588920 nova_compute[226886]: 2026-01-20 14:55:54.753 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap852888ae-cf, col_values=(('external_ids', {'iface-id': '852888ae-cf7b-4cbe-b96c-3b75b073d386', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fe:3f:5f', 'vm-uuid': '3339fe13-fddd-4233-9eac-bb4dbce1c777'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:55:54 np0005588920 nova_compute[226886]: 2026-01-20 14:55:54.755 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:54 np0005588920 NetworkManager[49076]: <info>  [1768920954.7558] manager: (tap852888ae-cf): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/296)
Jan 20 09:55:54 np0005588920 nova_compute[226886]: 2026-01-20 14:55:54.757 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:55:54 np0005588920 nova_compute[226886]: 2026-01-20 14:55:54.760 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:54 np0005588920 nova_compute[226886]: 2026-01-20 14:55:54.761 226890 INFO os_vif [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fe:3f:5f,bridge_name='br-int',has_traffic_filtering=True,id=852888ae-cf7b-4cbe-b96c-3b75b073d386,network=Network(f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap852888ae-cf')#033[00m
Jan 20 09:55:54 np0005588920 nova_compute[226886]: 2026-01-20 14:55:54.806 226890 DEBUG nova.virt.libvirt.driver [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:55:54 np0005588920 nova_compute[226886]: 2026-01-20 14:55:54.806 226890 DEBUG nova.virt.libvirt.driver [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:55:54 np0005588920 nova_compute[226886]: 2026-01-20 14:55:54.807 226890 DEBUG nova.virt.libvirt.driver [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] No VIF found with MAC fa:16:3e:fe:3f:5f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:55:54 np0005588920 nova_compute[226886]: 2026-01-20 14:55:54.808 226890 INFO nova.virt.libvirt.driver [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Using config drive#033[00m
Jan 20 09:55:54 np0005588920 nova_compute[226886]: 2026-01-20 14:55:54.839 226890 DEBUG nova.storage.rbd_utils [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] rbd image 3339fe13-fddd-4233-9eac-bb4dbce1c777_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:55:54 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #97. Immutable memtables: 0.
Jan 20 09:55:54 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:55:54.935895) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 09:55:54 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 59] Flushing memtable with next log file: 97
Jan 20 09:55:54 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920954935921, "job": 59, "event": "flush_started", "num_memtables": 1, "num_entries": 1180, "num_deletes": 254, "total_data_size": 2204638, "memory_usage": 2254584, "flush_reason": "Manual Compaction"}
Jan 20 09:55:54 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 59] Level-0 flush table #98: started
Jan 20 09:55:54 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920954943944, "cf_name": "default", "job": 59, "event": "table_file_creation", "file_number": 98, "file_size": 977776, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 49827, "largest_seqno": 51002, "table_properties": {"data_size": 973325, "index_size": 1911, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1477, "raw_key_size": 12214, "raw_average_key_size": 21, "raw_value_size": 963598, "raw_average_value_size": 1702, "num_data_blocks": 83, "num_entries": 566, "num_filter_entries": 566, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768920881, "oldest_key_time": 1768920881, "file_creation_time": 1768920954, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 98, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:55:54 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 59] Flush lasted 8108 microseconds, and 2926 cpu microseconds.
Jan 20 09:55:54 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:55:54 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:55:54.944000) [db/flush_job.cc:967] [default] [JOB 59] Level-0 flush table #98: 977776 bytes OK
Jan 20 09:55:54 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:55:54.944020) [db/memtable_list.cc:519] [default] Level-0 commit table #98 started
Jan 20 09:55:54 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:55:54.945913) [db/memtable_list.cc:722] [default] Level-0 commit table #98: memtable #1 done
Jan 20 09:55:54 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:55:54.945936) EVENT_LOG_v1 {"time_micros": 1768920954945929, "job": 59, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 09:55:54 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:55:54.945958) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 09:55:54 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 59] Try to delete WAL files size 2198836, prev total WAL file size 2198836, number of live WAL files 2.
Jan 20 09:55:54 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000094.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:55:54 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:55:54.947477) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031353035' seq:72057594037927935, type:22 .. '6D6772737461740031373536' seq:0, type:0; will stop at (end)
Jan 20 09:55:54 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 60] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 09:55:54 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 59 Base level 0, inputs: [98(954KB)], [96(11MB)]
Jan 20 09:55:54 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920954947512, "job": 60, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [98], "files_L6": [96], "score": -1, "input_data_size": 13450378, "oldest_snapshot_seqno": -1}
Jan 20 09:55:55 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 60] Generated table #99: 7559 keys, 10085785 bytes, temperature: kUnknown
Jan 20 09:55:55 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920955056184, "cf_name": "default", "job": 60, "event": "table_file_creation", "file_number": 99, "file_size": 10085785, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10036562, "index_size": 29207, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 18949, "raw_key_size": 195704, "raw_average_key_size": 25, "raw_value_size": 9902856, "raw_average_value_size": 1310, "num_data_blocks": 1149, "num_entries": 7559, "num_filter_entries": 7559, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768920954, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 99, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:55:55 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:55:55 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:55:55.056464) [db/compaction/compaction_job.cc:1663] [default] [JOB 60] Compacted 1@0 + 1@6 files to L6 => 10085785 bytes
Jan 20 09:55:55 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:55:55.057451) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 123.6 rd, 92.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.9, 11.9 +0.0 blob) out(9.6 +0.0 blob), read-write-amplify(24.1) write-amplify(10.3) OK, records in: 8052, records dropped: 493 output_compression: NoCompression
Jan 20 09:55:55 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:55:55.057467) EVENT_LOG_v1 {"time_micros": 1768920955057459, "job": 60, "event": "compaction_finished", "compaction_time_micros": 108809, "compaction_time_cpu_micros": 26496, "output_level": 6, "num_output_files": 1, "total_output_size": 10085785, "num_input_records": 8052, "num_output_records": 7559, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 09:55:55 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000098.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:55:55 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920955057702, "job": 60, "event": "table_file_deletion", "file_number": 98}
Jan 20 09:55:55 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000096.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:55:55 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920955059782, "job": 60, "event": "table_file_deletion", "file_number": 96}
Jan 20 09:55:55 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:55:54.947359) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:55:55 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:55:55.059851) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:55:55 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:55:55.059857) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:55:55 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:55:55.059858) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:55:55 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:55:55.059859) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:55:55 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:55:55.059861) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:55:55 np0005588920 nova_compute[226886]: 2026-01-20 14:55:55.239 226890 INFO nova.virt.libvirt.driver [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Creating config drive at /var/lib/nova/instances/3339fe13-fddd-4233-9eac-bb4dbce1c777/disk.config#033[00m
Jan 20 09:55:55 np0005588920 nova_compute[226886]: 2026-01-20 14:55:55.244 226890 DEBUG oslo_concurrency.processutils [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3339fe13-fddd-4233-9eac-bb4dbce1c777/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsfcinl19 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:55:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:55:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:55.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:55:55 np0005588920 nova_compute[226886]: 2026-01-20 14:55:55.374 226890 DEBUG oslo_concurrency.processutils [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3339fe13-fddd-4233-9eac-bb4dbce1c777/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsfcinl19" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:55:55 np0005588920 nova_compute[226886]: 2026-01-20 14:55:55.405 226890 DEBUG nova.storage.rbd_utils [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] rbd image 3339fe13-fddd-4233-9eac-bb4dbce1c777_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:55:55 np0005588920 nova_compute[226886]: 2026-01-20 14:55:55.408 226890 DEBUG oslo_concurrency.processutils [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/3339fe13-fddd-4233-9eac-bb4dbce1c777/disk.config 3339fe13-fddd-4233-9eac-bb4dbce1c777_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:55:55 np0005588920 nova_compute[226886]: 2026-01-20 14:55:55.475 226890 DEBUG nova.network.neutron [req-46c79971-ebc7-4111-af41-ac6df84a3c0b req-be66cd4f-eda2-40f9-ad13-d1ff79013e6e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Updated VIF entry in instance network info cache for port 852888ae-cf7b-4cbe-b96c-3b75b073d386. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:55:55 np0005588920 nova_compute[226886]: 2026-01-20 14:55:55.476 226890 DEBUG nova.network.neutron [req-46c79971-ebc7-4111-af41-ac6df84a3c0b req-be66cd4f-eda2-40f9-ad13-d1ff79013e6e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Updating instance_info_cache with network_info: [{"id": "852888ae-cf7b-4cbe-b96c-3b75b073d386", "address": "fa:16:3e:fe:3f:5f", "network": {"id": "f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c", "bridge": "br-int", "label": "tempest-ServersTestJSON-1745321011-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ca6cd0afe0ab41e3ab36d21a4129f734", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap852888ae-cf", "ovs_interfaceid": "852888ae-cf7b-4cbe-b96c-3b75b073d386", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:55:55 np0005588920 nova_compute[226886]: 2026-01-20 14:55:55.498 226890 DEBUG oslo_concurrency.lockutils [req-46c79971-ebc7-4111-af41-ac6df84a3c0b req-be66cd4f-eda2-40f9-ad13-d1ff79013e6e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-3339fe13-fddd-4233-9eac-bb4dbce1c777" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:55:55 np0005588920 nova_compute[226886]: 2026-01-20 14:55:55.577 226890 DEBUG oslo_concurrency.processutils [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/3339fe13-fddd-4233-9eac-bb4dbce1c777/disk.config 3339fe13-fddd-4233-9eac-bb4dbce1c777_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.169s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:55:55 np0005588920 nova_compute[226886]: 2026-01-20 14:55:55.577 226890 INFO nova.virt.libvirt.driver [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Deleting local config drive /var/lib/nova/instances/3339fe13-fddd-4233-9eac-bb4dbce1c777/disk.config because it was imported into RBD.#033[00m
Jan 20 09:55:55 np0005588920 kernel: tap852888ae-cf: entered promiscuous mode
Jan 20 09:55:55 np0005588920 ovn_controller[133971]: 2026-01-20T14:55:55Z|00599|binding|INFO|Claiming lport 852888ae-cf7b-4cbe-b96c-3b75b073d386 for this chassis.
Jan 20 09:55:55 np0005588920 ovn_controller[133971]: 2026-01-20T14:55:55Z|00600|binding|INFO|852888ae-cf7b-4cbe-b96c-3b75b073d386: Claiming fa:16:3e:fe:3f:5f 10.100.0.12
Jan 20 09:55:55 np0005588920 NetworkManager[49076]: <info>  [1768920955.6216] manager: (tap852888ae-cf): new Tun device (/org/freedesktop/NetworkManager/Devices/297)
Jan 20 09:55:55 np0005588920 nova_compute[226886]: 2026-01-20 14:55:55.622 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:55.628 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fe:3f:5f 10.100.0.12'], port_security=['fa:16:3e:fe:3f:5f 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '3339fe13-fddd-4233-9eac-bb4dbce1c777', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ca6cd0afe0ab41e3ab36d21a4129f734', 'neutron:revision_number': '2', 'neutron:security_group_ids': '819ea4ae-b994-44d1-9da3-8b0ca609fb2a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee620e3e-ef7e-4826-b394-b8a89442b353, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=852888ae-cf7b-4cbe-b96c-3b75b073d386) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:55.629 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 852888ae-cf7b-4cbe-b96c-3b75b073d386 in datapath f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c bound to our chassis#033[00m
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:55.631 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c#033[00m
Jan 20 09:55:55 np0005588920 ovn_controller[133971]: 2026-01-20T14:55:55Z|00601|binding|INFO|Setting lport 852888ae-cf7b-4cbe-b96c-3b75b073d386 ovn-installed in OVS
Jan 20 09:55:55 np0005588920 ovn_controller[133971]: 2026-01-20T14:55:55Z|00602|binding|INFO|Setting lport 852888ae-cf7b-4cbe-b96c-3b75b073d386 up in Southbound
Jan 20 09:55:55 np0005588920 nova_compute[226886]: 2026-01-20 14:55:55.641 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:55 np0005588920 nova_compute[226886]: 2026-01-20 14:55:55.643 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:55.643 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b154daa8-9dc5-4735-81ab-0774d315fd19]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:55.644 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf4c8474b-01 in ovnmeta-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:55.645 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf4c8474b-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:55.646 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[6941b8f0-fbab-4125-a6eb-41f1bed7d292]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:55.648 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[8ce8be8b-f235-4a03-b55c-598926c263f5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:55 np0005588920 systemd-udevd[273412]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:55:55 np0005588920 systemd-machined[196121]: New machine qemu-60-instance-00000080.
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:55.659 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[2cf615a6-263b-4157-9a4d-b57aa683eb26]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:55 np0005588920 NetworkManager[49076]: <info>  [1768920955.6635] device (tap852888ae-cf): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:55:55 np0005588920 NetworkManager[49076]: <info>  [1768920955.6640] device (tap852888ae-cf): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:55:55 np0005588920 systemd[1]: Started Virtual Machine qemu-60-instance-00000080.
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:55.680 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[374a0522-20a2-4106-83d0-1c65058ab85d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:55.707 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[cbf9080b-8868-41a1-8f40-67dc5fdff654]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:55.711 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d1b3dbc8-6caa-4d84-bbdf-23feaecf670e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:55 np0005588920 NetworkManager[49076]: <info>  [1768920955.7127] manager: (tapf4c8474b-00): new Veth device (/org/freedesktop/NetworkManager/Devices/298)
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:55.740 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[5af7ecea-f0f4-45a4-94c0-dfb8196f7fca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:55.742 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[b3228754-1918-4712-98bd-4f04f4f904b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:55 np0005588920 NetworkManager[49076]: <info>  [1768920955.7659] device (tapf4c8474b-00): carrier: link connected
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:55.772 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[60abc9fd-62d3-4b01-8874-40bbcea1ef00]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:55.789 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[305c2a76-1031-439a-8956-5e970e9962c4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf4c8474b-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:14:a2:5f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 193], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 599745, 'reachable_time': 41025, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 273444, 'error': None, 'target': 'ovnmeta-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:55.804 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2667acfa-6924-43aa-92a8-a0554d758df3]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe14:a25f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 599745, 'tstamp': 599745}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 273445, 'error': None, 'target': 'ovnmeta-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:55.824 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[90c60ad8-cae1-4c85-91e5-55ca298e8f7e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf4c8474b-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:14:a2:5f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 193], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 599745, 'reachable_time': 41025, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 273446, 'error': None, 'target': 'ovnmeta-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:55 np0005588920 nova_compute[226886]: 2026-01-20 14:55:55.860 226890 DEBUG nova.compute.manager [req-02848c1e-9839-45ba-b20a-fe9b436fb3ec req-6a7e0d0e-2050-49ed-9e9d-d64978215b17 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Received event network-vif-plugged-852888ae-cf7b-4cbe-b96c-3b75b073d386 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:55:55 np0005588920 nova_compute[226886]: 2026-01-20 14:55:55.860 226890 DEBUG oslo_concurrency.lockutils [req-02848c1e-9839-45ba-b20a-fe9b436fb3ec req-6a7e0d0e-2050-49ed-9e9d-d64978215b17 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "3339fe13-fddd-4233-9eac-bb4dbce1c777-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:55:55 np0005588920 nova_compute[226886]: 2026-01-20 14:55:55.861 226890 DEBUG oslo_concurrency.lockutils [req-02848c1e-9839-45ba-b20a-fe9b436fb3ec req-6a7e0d0e-2050-49ed-9e9d-d64978215b17 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3339fe13-fddd-4233-9eac-bb4dbce1c777-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:55:55 np0005588920 nova_compute[226886]: 2026-01-20 14:55:55.861 226890 DEBUG oslo_concurrency.lockutils [req-02848c1e-9839-45ba-b20a-fe9b436fb3ec req-6a7e0d0e-2050-49ed-9e9d-d64978215b17 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3339fe13-fddd-4233-9eac-bb4dbce1c777-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:55:55 np0005588920 nova_compute[226886]: 2026-01-20 14:55:55.861 226890 DEBUG nova.compute.manager [req-02848c1e-9839-45ba-b20a-fe9b436fb3ec req-6a7e0d0e-2050-49ed-9e9d-d64978215b17 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Processing event network-vif-plugged-852888ae-cf7b-4cbe-b96c-3b75b073d386 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:55.862 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b3a5df37-9aa2-4492-8338-6713c82691ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:55.929 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[fad29226-49dc-4566-8cf9-e8593d0f5c37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:55.931 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf4c8474b-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:55.931 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:55.931 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf4c8474b-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:55:55 np0005588920 kernel: tapf4c8474b-00: entered promiscuous mode
Jan 20 09:55:55 np0005588920 NetworkManager[49076]: <info>  [1768920955.9341] manager: (tapf4c8474b-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/299)
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:55.936 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf4c8474b-00, col_values=(('external_ids', {'iface-id': '8c6fd3ab-70a8-4e63-99de-f2e15ac0207f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:55:55 np0005588920 ovn_controller[133971]: 2026-01-20T14:55:55Z|00603|binding|INFO|Releasing lport 8c6fd3ab-70a8-4e63-99de-f2e15ac0207f from this chassis (sb_readonly=0)
Jan 20 09:55:55 np0005588920 nova_compute[226886]: 2026-01-20 14:55:55.949 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:55 np0005588920 nova_compute[226886]: 2026-01-20 14:55:55.961 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:55 np0005588920 nova_compute[226886]: 2026-01-20 14:55:55.962 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:55.963 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:55.964 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f68ce92e-0307-421d-99b1-3d9879ffb922]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:55.964 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c.pid.haproxy
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:55:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:55.965 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c', 'env', 'PROCESS_TAG=haproxy-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:55:56 np0005588920 nova_compute[226886]: 2026-01-20 14:55:56.144 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:56 np0005588920 nova_compute[226886]: 2026-01-20 14:55:56.178 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920956.178124, 3339fe13-fddd-4233-9eac-bb4dbce1c777 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:55:56 np0005588920 nova_compute[226886]: 2026-01-20 14:55:56.179 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] VM Started (Lifecycle Event)#033[00m
Jan 20 09:55:56 np0005588920 nova_compute[226886]: 2026-01-20 14:55:56.181 226890 DEBUG nova.compute.manager [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:55:56 np0005588920 nova_compute[226886]: 2026-01-20 14:55:56.187 226890 DEBUG nova.virt.libvirt.driver [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:55:56 np0005588920 nova_compute[226886]: 2026-01-20 14:55:56.191 226890 INFO nova.virt.libvirt.driver [-] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Instance spawned successfully.#033[00m
Jan 20 09:55:56 np0005588920 nova_compute[226886]: 2026-01-20 14:55:56.191 226890 DEBUG nova.virt.libvirt.driver [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:55:56 np0005588920 nova_compute[226886]: 2026-01-20 14:55:56.201 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:55:56 np0005588920 nova_compute[226886]: 2026-01-20 14:55:56.204 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:55:56 np0005588920 nova_compute[226886]: 2026-01-20 14:55:56.213 226890 DEBUG nova.virt.libvirt.driver [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:55:56 np0005588920 nova_compute[226886]: 2026-01-20 14:55:56.213 226890 DEBUG nova.virt.libvirt.driver [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:55:56 np0005588920 nova_compute[226886]: 2026-01-20 14:55:56.214 226890 DEBUG nova.virt.libvirt.driver [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:55:56 np0005588920 nova_compute[226886]: 2026-01-20 14:55:56.214 226890 DEBUG nova.virt.libvirt.driver [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:55:56 np0005588920 nova_compute[226886]: 2026-01-20 14:55:56.215 226890 DEBUG nova.virt.libvirt.driver [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:55:56 np0005588920 nova_compute[226886]: 2026-01-20 14:55:56.215 226890 DEBUG nova.virt.libvirt.driver [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:55:56 np0005588920 nova_compute[226886]: 2026-01-20 14:55:56.241 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:55:56 np0005588920 nova_compute[226886]: 2026-01-20 14:55:56.242 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920956.179125, 3339fe13-fddd-4233-9eac-bb4dbce1c777 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:55:56 np0005588920 nova_compute[226886]: 2026-01-20 14:55:56.242 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:55:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:56.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:56 np0005588920 nova_compute[226886]: 2026-01-20 14:55:56.299 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:55:56 np0005588920 nova_compute[226886]: 2026-01-20 14:55:56.303 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920956.186462, 3339fe13-fddd-4233-9eac-bb4dbce1c777 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:55:56 np0005588920 nova_compute[226886]: 2026-01-20 14:55:56.303 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:55:56 np0005588920 podman[273520]: 2026-01-20 14:55:56.309714792 +0000 UTC m=+0.046285395 container create 36d327421dc6837cd04ba43ca73d84b1fb221c378f5f5b36d1249238e73ab4d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:55:56 np0005588920 nova_compute[226886]: 2026-01-20 14:55:56.324 226890 INFO nova.compute.manager [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Took 6.80 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:55:56 np0005588920 nova_compute[226886]: 2026-01-20 14:55:56.324 226890 DEBUG nova.compute.manager [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:55:56 np0005588920 nova_compute[226886]: 2026-01-20 14:55:56.332 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:55:56 np0005588920 nova_compute[226886]: 2026-01-20 14:55:56.334 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:55:56 np0005588920 systemd[1]: Started libpod-conmon-36d327421dc6837cd04ba43ca73d84b1fb221c378f5f5b36d1249238e73ab4d4.scope.
Jan 20 09:55:56 np0005588920 nova_compute[226886]: 2026-01-20 14:55:56.361 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:55:56 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:55:56 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f574cffe46fd22208529b0ed0267a497dd3cf9e850aca8d6529cc979fcbf33fe/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:55:56 np0005588920 podman[273520]: 2026-01-20 14:55:56.283234136 +0000 UTC m=+0.019804789 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:55:56 np0005588920 podman[273520]: 2026-01-20 14:55:56.395270762 +0000 UTC m=+0.131841385 container init 36d327421dc6837cd04ba43ca73d84b1fb221c378f5f5b36d1249238e73ab4d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 20 09:55:56 np0005588920 nova_compute[226886]: 2026-01-20 14:55:56.397 226890 INFO nova.compute.manager [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Took 7.66 seconds to build instance.#033[00m
Jan 20 09:55:56 np0005588920 podman[273520]: 2026-01-20 14:55:56.404128892 +0000 UTC m=+0.140699495 container start 36d327421dc6837cd04ba43ca73d84b1fb221c378f5f5b36d1249238e73ab4d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Jan 20 09:55:56 np0005588920 nova_compute[226886]: 2026-01-20 14:55:56.412 226890 DEBUG oslo_concurrency.lockutils [None req-bda0b4e2-ae68-493b-86c3-2744f9a7b5f3 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lock "3339fe13-fddd-4233-9eac-bb4dbce1c777" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.760s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:55:56 np0005588920 neutron-haproxy-ovnmeta-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c[273535]: [NOTICE]   (273539) : New worker (273541) forked
Jan 20 09:55:56 np0005588920 neutron-haproxy-ovnmeta-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c[273535]: [NOTICE]   (273539) : Loading success.
Jan 20 09:55:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:57.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e307 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:55:57 np0005588920 nova_compute[226886]: 2026-01-20 14:55:57.805 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:57 np0005588920 nova_compute[226886]: 2026-01-20 14:55:57.934 226890 DEBUG nova.compute.manager [req-866334ab-fcb6-402d-b709-d5b8b7905dae req-fb949326-1243-4f23-ac81-74b053ecd298 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Received event network-vif-plugged-852888ae-cf7b-4cbe-b96c-3b75b073d386 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:55:57 np0005588920 nova_compute[226886]: 2026-01-20 14:55:57.935 226890 DEBUG oslo_concurrency.lockutils [req-866334ab-fcb6-402d-b709-d5b8b7905dae req-fb949326-1243-4f23-ac81-74b053ecd298 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "3339fe13-fddd-4233-9eac-bb4dbce1c777-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:55:57 np0005588920 nova_compute[226886]: 2026-01-20 14:55:57.935 226890 DEBUG oslo_concurrency.lockutils [req-866334ab-fcb6-402d-b709-d5b8b7905dae req-fb949326-1243-4f23-ac81-74b053ecd298 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3339fe13-fddd-4233-9eac-bb4dbce1c777-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:55:57 np0005588920 nova_compute[226886]: 2026-01-20 14:55:57.935 226890 DEBUG oslo_concurrency.lockutils [req-866334ab-fcb6-402d-b709-d5b8b7905dae req-fb949326-1243-4f23-ac81-74b053ecd298 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3339fe13-fddd-4233-9eac-bb4dbce1c777-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:55:57 np0005588920 nova_compute[226886]: 2026-01-20 14:55:57.935 226890 DEBUG nova.compute.manager [req-866334ab-fcb6-402d-b709-d5b8b7905dae req-fb949326-1243-4f23-ac81-74b053ecd298 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] No waiting events found dispatching network-vif-plugged-852888ae-cf7b-4cbe-b96c-3b75b073d386 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:55:57 np0005588920 nova_compute[226886]: 2026-01-20 14:55:57.935 226890 WARNING nova.compute.manager [req-866334ab-fcb6-402d-b709-d5b8b7905dae req-fb949326-1243-4f23-ac81-74b053ecd298 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Received unexpected event network-vif-plugged-852888ae-cf7b-4cbe-b96c-3b75b073d386 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:55:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:58.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:58 np0005588920 nova_compute[226886]: 2026-01-20 14:55:58.824 226890 DEBUG oslo_concurrency.lockutils [None req-144217ae-eb83-4fdd-bd02-5ba1397028a9 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Acquiring lock "3339fe13-fddd-4233-9eac-bb4dbce1c777" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:55:58 np0005588920 nova_compute[226886]: 2026-01-20 14:55:58.825 226890 DEBUG oslo_concurrency.lockutils [None req-144217ae-eb83-4fdd-bd02-5ba1397028a9 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lock "3339fe13-fddd-4233-9eac-bb4dbce1c777" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:55:58 np0005588920 nova_compute[226886]: 2026-01-20 14:55:58.825 226890 DEBUG oslo_concurrency.lockutils [None req-144217ae-eb83-4fdd-bd02-5ba1397028a9 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Acquiring lock "3339fe13-fddd-4233-9eac-bb4dbce1c777-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:55:58 np0005588920 nova_compute[226886]: 2026-01-20 14:55:58.825 226890 DEBUG oslo_concurrency.lockutils [None req-144217ae-eb83-4fdd-bd02-5ba1397028a9 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lock "3339fe13-fddd-4233-9eac-bb4dbce1c777-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:55:58 np0005588920 nova_compute[226886]: 2026-01-20 14:55:58.825 226890 DEBUG oslo_concurrency.lockutils [None req-144217ae-eb83-4fdd-bd02-5ba1397028a9 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lock "3339fe13-fddd-4233-9eac-bb4dbce1c777-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:55:58 np0005588920 nova_compute[226886]: 2026-01-20 14:55:58.826 226890 INFO nova.compute.manager [None req-144217ae-eb83-4fdd-bd02-5ba1397028a9 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Terminating instance#033[00m
Jan 20 09:55:58 np0005588920 nova_compute[226886]: 2026-01-20 14:55:58.827 226890 DEBUG nova.compute.manager [None req-144217ae-eb83-4fdd-bd02-5ba1397028a9 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:55:58 np0005588920 kernel: tap852888ae-cf (unregistering): left promiscuous mode
Jan 20 09:55:58 np0005588920 NetworkManager[49076]: <info>  [1768920958.8682] device (tap852888ae-cf): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:55:58 np0005588920 nova_compute[226886]: 2026-01-20 14:55:58.875 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:58 np0005588920 ovn_controller[133971]: 2026-01-20T14:55:58Z|00604|binding|INFO|Releasing lport 852888ae-cf7b-4cbe-b96c-3b75b073d386 from this chassis (sb_readonly=0)
Jan 20 09:55:58 np0005588920 ovn_controller[133971]: 2026-01-20T14:55:58Z|00605|binding|INFO|Setting lport 852888ae-cf7b-4cbe-b96c-3b75b073d386 down in Southbound
Jan 20 09:55:58 np0005588920 ovn_controller[133971]: 2026-01-20T14:55:58Z|00606|binding|INFO|Removing iface tap852888ae-cf ovn-installed in OVS
Jan 20 09:55:58 np0005588920 nova_compute[226886]: 2026-01-20 14:55:58.877 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:58.883 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fe:3f:5f 10.100.0.12'], port_security=['fa:16:3e:fe:3f:5f 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '3339fe13-fddd-4233-9eac-bb4dbce1c777', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ca6cd0afe0ab41e3ab36d21a4129f734', 'neutron:revision_number': '4', 'neutron:security_group_ids': '819ea4ae-b994-44d1-9da3-8b0ca609fb2a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee620e3e-ef7e-4826-b394-b8a89442b353, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=852888ae-cf7b-4cbe-b96c-3b75b073d386) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:55:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:58.884 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 852888ae-cf7b-4cbe-b96c-3b75b073d386 in datapath f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c unbound from our chassis#033[00m
Jan 20 09:55:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:58.886 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:55:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:58.888 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[555b155b-f04b-4b09-aa27-3525a6ac31ae]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:58.888 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c namespace which is not needed anymore#033[00m
Jan 20 09:55:58 np0005588920 nova_compute[226886]: 2026-01-20 14:55:58.899 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:58 np0005588920 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d00000080.scope: Deactivated successfully.
Jan 20 09:55:58 np0005588920 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d00000080.scope: Consumed 3.303s CPU time.
Jan 20 09:55:58 np0005588920 systemd-machined[196121]: Machine qemu-60-instance-00000080 terminated.
Jan 20 09:55:59 np0005588920 neutron-haproxy-ovnmeta-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c[273535]: [NOTICE]   (273539) : haproxy version is 2.8.14-c23fe91
Jan 20 09:55:59 np0005588920 neutron-haproxy-ovnmeta-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c[273535]: [NOTICE]   (273539) : path to executable is /usr/sbin/haproxy
Jan 20 09:55:59 np0005588920 neutron-haproxy-ovnmeta-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c[273535]: [WARNING]  (273539) : Exiting Master process...
Jan 20 09:55:59 np0005588920 neutron-haproxy-ovnmeta-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c[273535]: [ALERT]    (273539) : Current worker (273541) exited with code 143 (Terminated)
Jan 20 09:55:59 np0005588920 neutron-haproxy-ovnmeta-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c[273535]: [WARNING]  (273539) : All workers exited. Exiting... (0)
Jan 20 09:55:59 np0005588920 systemd[1]: libpod-36d327421dc6837cd04ba43ca73d84b1fb221c378f5f5b36d1249238e73ab4d4.scope: Deactivated successfully.
Jan 20 09:55:59 np0005588920 podman[273574]: 2026-01-20 14:55:59.028070132 +0000 UTC m=+0.043073584 container died 36d327421dc6837cd04ba43ca73d84b1fb221c378f5f5b36d1249238e73ab4d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:55:59 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-36d327421dc6837cd04ba43ca73d84b1fb221c378f5f5b36d1249238e73ab4d4-userdata-shm.mount: Deactivated successfully.
Jan 20 09:55:59 np0005588920 systemd[1]: var-lib-containers-storage-overlay-f574cffe46fd22208529b0ed0267a497dd3cf9e850aca8d6529cc979fcbf33fe-merged.mount: Deactivated successfully.
Jan 20 09:55:59 np0005588920 nova_compute[226886]: 2026-01-20 14:55:59.062 226890 INFO nova.virt.libvirt.driver [-] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Instance destroyed successfully.#033[00m
Jan 20 09:55:59 np0005588920 nova_compute[226886]: 2026-01-20 14:55:59.062 226890 DEBUG nova.objects.instance [None req-144217ae-eb83-4fdd-bd02-5ba1397028a9 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lazy-loading 'resources' on Instance uuid 3339fe13-fddd-4233-9eac-bb4dbce1c777 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:55:59 np0005588920 podman[273574]: 2026-01-20 14:55:59.064462007 +0000 UTC m=+0.079465459 container cleanup 36d327421dc6837cd04ba43ca73d84b1fb221c378f5f5b36d1249238e73ab4d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 20 09:55:59 np0005588920 systemd[1]: libpod-conmon-36d327421dc6837cd04ba43ca73d84b1fb221c378f5f5b36d1249238e73ab4d4.scope: Deactivated successfully.
Jan 20 09:55:59 np0005588920 nova_compute[226886]: 2026-01-20 14:55:59.076 226890 DEBUG nova.virt.libvirt.vif [None req-144217ae-eb83-4fdd-bd02-5ba1397028a9 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=2001:2001::3,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:55:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-343728074',display_name='tempest-ServersTestJSON-server-343728074',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-343728074',id=128,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:55:56Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ca6cd0afe0ab41e3ab36d21a4129f734',ramdisk_id='',reservation_id='r-act430l1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1'
,image_min_ram='0',owner_project_name='tempest-ServersTestJSON-405461620',owner_user_name='tempest-ServersTestJSON-405461620-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:55:56Z,user_data=None,user_id='395a5c503218411284bc94c45263d1fb',uuid=3339fe13-fddd-4233-9eac-bb4dbce1c777,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "852888ae-cf7b-4cbe-b96c-3b75b073d386", "address": "fa:16:3e:fe:3f:5f", "network": {"id": "f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c", "bridge": "br-int", "label": "tempest-ServersTestJSON-1745321011-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ca6cd0afe0ab41e3ab36d21a4129f734", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap852888ae-cf", "ovs_interfaceid": "852888ae-cf7b-4cbe-b96c-3b75b073d386", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:55:59 np0005588920 nova_compute[226886]: 2026-01-20 14:55:59.077 226890 DEBUG nova.network.os_vif_util [None req-144217ae-eb83-4fdd-bd02-5ba1397028a9 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Converting VIF {"id": "852888ae-cf7b-4cbe-b96c-3b75b073d386", "address": "fa:16:3e:fe:3f:5f", "network": {"id": "f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c", "bridge": "br-int", "label": "tempest-ServersTestJSON-1745321011-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ca6cd0afe0ab41e3ab36d21a4129f734", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap852888ae-cf", "ovs_interfaceid": "852888ae-cf7b-4cbe-b96c-3b75b073d386", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:55:59 np0005588920 nova_compute[226886]: 2026-01-20 14:55:59.077 226890 DEBUG nova.network.os_vif_util [None req-144217ae-eb83-4fdd-bd02-5ba1397028a9 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fe:3f:5f,bridge_name='br-int',has_traffic_filtering=True,id=852888ae-cf7b-4cbe-b96c-3b75b073d386,network=Network(f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap852888ae-cf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:55:59 np0005588920 nova_compute[226886]: 2026-01-20 14:55:59.078 226890 DEBUG os_vif [None req-144217ae-eb83-4fdd-bd02-5ba1397028a9 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fe:3f:5f,bridge_name='br-int',has_traffic_filtering=True,id=852888ae-cf7b-4cbe-b96c-3b75b073d386,network=Network(f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap852888ae-cf') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:55:59 np0005588920 nova_compute[226886]: 2026-01-20 14:55:59.079 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:59 np0005588920 nova_compute[226886]: 2026-01-20 14:55:59.080 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap852888ae-cf, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:55:59 np0005588920 nova_compute[226886]: 2026-01-20 14:55:59.081 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:59 np0005588920 nova_compute[226886]: 2026-01-20 14:55:59.082 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:59 np0005588920 nova_compute[226886]: 2026-01-20 14:55:59.084 226890 INFO os_vif [None req-144217ae-eb83-4fdd-bd02-5ba1397028a9 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fe:3f:5f,bridge_name='br-int',has_traffic_filtering=True,id=852888ae-cf7b-4cbe-b96c-3b75b073d386,network=Network(f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap852888ae-cf')#033[00m
Jan 20 09:55:59 np0005588920 podman[273614]: 2026-01-20 14:55:59.124520889 +0000 UTC m=+0.037989651 container remove 36d327421dc6837cd04ba43ca73d84b1fb221c378f5f5b36d1249238e73ab4d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 20 09:55:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:59.130 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[3d372cee-be24-40e0-9060-9cf9732d4d58]: (4, ('Tue Jan 20 02:55:58 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c (36d327421dc6837cd04ba43ca73d84b1fb221c378f5f5b36d1249238e73ab4d4)\n36d327421dc6837cd04ba43ca73d84b1fb221c378f5f5b36d1249238e73ab4d4\nTue Jan 20 02:55:59 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c (36d327421dc6837cd04ba43ca73d84b1fb221c378f5f5b36d1249238e73ab4d4)\n36d327421dc6837cd04ba43ca73d84b1fb221c378f5f5b36d1249238e73ab4d4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:59.131 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[801503c5-31e4-473b-84ad-f257d9745d5f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:59.132 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf4c8474b-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:55:59 np0005588920 kernel: tapf4c8474b-00: left promiscuous mode
Jan 20 09:55:59 np0005588920 nova_compute[226886]: 2026-01-20 14:55:59.134 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:59.137 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a9ec3fe0-67dd-4e5b-ac53-cb33b4da35e8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:59 np0005588920 nova_compute[226886]: 2026-01-20 14:55:59.149 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:59.150 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[80bcc322-5b1e-4a21-b276-78405fc813a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:59.151 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[7d7e0c15-2e89-4d3b-942d-7c38e5858529]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:59.167 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a3be465f-338b-4f0d-b788-d71c40c50bd4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 599738, 'reachable_time': 24348, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 273647, 'error': None, 'target': 'ovnmeta-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:59 np0005588920 systemd[1]: run-netns-ovnmeta\x2df4c8474b\x2d0ca3\x2d4cb0\x2db6dd\x2de6aa302def5c.mount: Deactivated successfully.
Jan 20 09:55:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:59.170 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:55:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:55:59.170 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[ecdf6048-2acf-4431-87b7-3c5e56fbb927]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:55:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:59.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:59 np0005588920 nova_compute[226886]: 2026-01-20 14:55:59.444 226890 INFO nova.virt.libvirt.driver [None req-144217ae-eb83-4fdd-bd02-5ba1397028a9 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Deleting instance files /var/lib/nova/instances/3339fe13-fddd-4233-9eac-bb4dbce1c777_del#033[00m
Jan 20 09:55:59 np0005588920 nova_compute[226886]: 2026-01-20 14:55:59.445 226890 INFO nova.virt.libvirt.driver [None req-144217ae-eb83-4fdd-bd02-5ba1397028a9 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Deletion of /var/lib/nova/instances/3339fe13-fddd-4233-9eac-bb4dbce1c777_del complete#033[00m
Jan 20 09:55:59 np0005588920 nova_compute[226886]: 2026-01-20 14:55:59.497 226890 INFO nova.compute.manager [None req-144217ae-eb83-4fdd-bd02-5ba1397028a9 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Took 0.67 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:55:59 np0005588920 nova_compute[226886]: 2026-01-20 14:55:59.497 226890 DEBUG oslo.service.loopingcall [None req-144217ae-eb83-4fdd-bd02-5ba1397028a9 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:55:59 np0005588920 nova_compute[226886]: 2026-01-20 14:55:59.497 226890 DEBUG nova.compute.manager [-] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:55:59 np0005588920 nova_compute[226886]: 2026-01-20 14:55:59.497 226890 DEBUG nova.network.neutron [-] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:55:59 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e308 e308: 3 total, 3 up, 3 in
Jan 20 09:56:00 np0005588920 nova_compute[226886]: 2026-01-20 14:56:00.014 226890 DEBUG nova.compute.manager [req-2655dbae-6ed0-4247-9bab-f4cc79532348 req-3fce3807-b9be-423f-845c-b0de4ea542aa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Received event network-vif-unplugged-852888ae-cf7b-4cbe-b96c-3b75b073d386 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 09:56:00 np0005588920 nova_compute[226886]: 2026-01-20 14:56:00.015 226890 DEBUG oslo_concurrency.lockutils [req-2655dbae-6ed0-4247-9bab-f4cc79532348 req-3fce3807-b9be-423f-845c-b0de4ea542aa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "3339fe13-fddd-4233-9eac-bb4dbce1c777-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:56:00 np0005588920 nova_compute[226886]: 2026-01-20 14:56:00.015 226890 DEBUG oslo_concurrency.lockutils [req-2655dbae-6ed0-4247-9bab-f4cc79532348 req-3fce3807-b9be-423f-845c-b0de4ea542aa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3339fe13-fddd-4233-9eac-bb4dbce1c777-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:56:00 np0005588920 nova_compute[226886]: 2026-01-20 14:56:00.015 226890 DEBUG oslo_concurrency.lockutils [req-2655dbae-6ed0-4247-9bab-f4cc79532348 req-3fce3807-b9be-423f-845c-b0de4ea542aa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3339fe13-fddd-4233-9eac-bb4dbce1c777-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:56:00 np0005588920 nova_compute[226886]: 2026-01-20 14:56:00.015 226890 DEBUG nova.compute.manager [req-2655dbae-6ed0-4247-9bab-f4cc79532348 req-3fce3807-b9be-423f-845c-b0de4ea542aa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] No waiting events found dispatching network-vif-unplugged-852888ae-cf7b-4cbe-b96c-3b75b073d386 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 20 09:56:00 np0005588920 nova_compute[226886]: 2026-01-20 14:56:00.015 226890 DEBUG nova.compute.manager [req-2655dbae-6ed0-4247-9bab-f4cc79532348 req-3fce3807-b9be-423f-845c-b0de4ea542aa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Received event network-vif-unplugged-852888ae-cf7b-4cbe-b96c-3b75b073d386 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 20 09:56:00 np0005588920 nova_compute[226886]: 2026-01-20 14:56:00.015 226890 DEBUG nova.compute.manager [req-2655dbae-6ed0-4247-9bab-f4cc79532348 req-3fce3807-b9be-423f-845c-b0de4ea542aa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Received event network-vif-plugged-852888ae-cf7b-4cbe-b96c-3b75b073d386 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 09:56:00 np0005588920 nova_compute[226886]: 2026-01-20 14:56:00.016 226890 DEBUG oslo_concurrency.lockutils [req-2655dbae-6ed0-4247-9bab-f4cc79532348 req-3fce3807-b9be-423f-845c-b0de4ea542aa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "3339fe13-fddd-4233-9eac-bb4dbce1c777-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:56:00 np0005588920 nova_compute[226886]: 2026-01-20 14:56:00.016 226890 DEBUG oslo_concurrency.lockutils [req-2655dbae-6ed0-4247-9bab-f4cc79532348 req-3fce3807-b9be-423f-845c-b0de4ea542aa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3339fe13-fddd-4233-9eac-bb4dbce1c777-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:56:00 np0005588920 nova_compute[226886]: 2026-01-20 14:56:00.016 226890 DEBUG oslo_concurrency.lockutils [req-2655dbae-6ed0-4247-9bab-f4cc79532348 req-3fce3807-b9be-423f-845c-b0de4ea542aa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3339fe13-fddd-4233-9eac-bb4dbce1c777-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:56:00 np0005588920 nova_compute[226886]: 2026-01-20 14:56:00.016 226890 DEBUG nova.compute.manager [req-2655dbae-6ed0-4247-9bab-f4cc79532348 req-3fce3807-b9be-423f-845c-b0de4ea542aa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] No waiting events found dispatching network-vif-plugged-852888ae-cf7b-4cbe-b96c-3b75b073d386 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 20 09:56:00 np0005588920 nova_compute[226886]: 2026-01-20 14:56:00.016 226890 WARNING nova.compute.manager [req-2655dbae-6ed0-4247-9bab-f4cc79532348 req-3fce3807-b9be-423f-845c-b0de4ea542aa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Received unexpected event network-vif-plugged-852888ae-cf7b-4cbe-b96c-3b75b073d386 for instance with vm_state active and task_state deleting.
Jan 20 09:56:00 np0005588920 nova_compute[226886]: 2026-01-20 14:56:00.174 226890 DEBUG nova.network.neutron [-] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 20 09:56:00 np0005588920 nova_compute[226886]: 2026-01-20 14:56:00.195 226890 INFO nova.compute.manager [-] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Took 0.70 seconds to deallocate network for instance.
Jan 20 09:56:00 np0005588920 nova_compute[226886]: 2026-01-20 14:56:00.205 226890 DEBUG nova.compute.manager [req-892a9de7-fab2-489c-b7d9-b9b899375ea2 req-a8b8b219-1a42-4419-a27d-108102498b46 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Received event network-vif-deleted-852888ae-cf7b-4cbe-b96c-3b75b073d386 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 09:56:00 np0005588920 nova_compute[226886]: 2026-01-20 14:56:00.205 226890 INFO nova.compute.manager [req-892a9de7-fab2-489c-b7d9-b9b899375ea2 req-a8b8b219-1a42-4419-a27d-108102498b46 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Neutron deleted interface 852888ae-cf7b-4cbe-b96c-3b75b073d386; detaching it from the instance and deleting it from the info cache
Jan 20 09:56:00 np0005588920 nova_compute[226886]: 2026-01-20 14:56:00.206 226890 DEBUG nova.network.neutron [req-892a9de7-fab2-489c-b7d9-b9b899375ea2 req-a8b8b219-1a42-4419-a27d-108102498b46 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 20 09:56:00 np0005588920 nova_compute[226886]: 2026-01-20 14:56:00.228 226890 DEBUG nova.compute.manager [req-892a9de7-fab2-489c-b7d9-b9b899375ea2 req-a8b8b219-1a42-4419-a27d-108102498b46 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Detach interface failed, port_id=852888ae-cf7b-4cbe-b96c-3b75b073d386, reason: Instance 3339fe13-fddd-4233-9eac-bb4dbce1c777 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 20 09:56:00 np0005588920 nova_compute[226886]: 2026-01-20 14:56:00.245 226890 DEBUG oslo_concurrency.lockutils [None req-144217ae-eb83-4fdd-bd02-5ba1397028a9 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:56:00 np0005588920 nova_compute[226886]: 2026-01-20 14:56:00.246 226890 DEBUG oslo_concurrency.lockutils [None req-144217ae-eb83-4fdd-bd02-5ba1397028a9 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:56:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:00.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:00 np0005588920 nova_compute[226886]: 2026-01-20 14:56:00.341 226890 DEBUG oslo_concurrency.processutils [None req-144217ae-eb83-4fdd-bd02-5ba1397028a9 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:56:00 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:56:00 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/242987236' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:56:00 np0005588920 nova_compute[226886]: 2026-01-20 14:56:00.783 226890 DEBUG oslo_concurrency.processutils [None req-144217ae-eb83-4fdd-bd02-5ba1397028a9 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:56:00 np0005588920 nova_compute[226886]: 2026-01-20 14:56:00.789 226890 DEBUG nova.compute.provider_tree [None req-144217ae-eb83-4fdd-bd02-5ba1397028a9 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 20 09:56:00 np0005588920 nova_compute[226886]: 2026-01-20 14:56:00.804 226890 DEBUG nova.scheduler.client.report [None req-144217ae-eb83-4fdd-bd02-5ba1397028a9 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 20 09:56:00 np0005588920 nova_compute[226886]: 2026-01-20 14:56:00.826 226890 DEBUG oslo_concurrency.lockutils [None req-144217ae-eb83-4fdd-bd02-5ba1397028a9 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.580s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:56:00 np0005588920 nova_compute[226886]: 2026-01-20 14:56:00.848 226890 INFO nova.scheduler.client.report [None req-144217ae-eb83-4fdd-bd02-5ba1397028a9 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Deleted allocations for instance 3339fe13-fddd-4233-9eac-bb4dbce1c777
Jan 20 09:56:00 np0005588920 nova_compute[226886]: 2026-01-20 14:56:00.908 226890 DEBUG oslo_concurrency.lockutils [None req-144217ae-eb83-4fdd-bd02-5ba1397028a9 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lock "3339fe13-fddd-4233-9eac-bb4dbce1c777" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.083s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:56:01 np0005588920 nova_compute[226886]: 2026-01-20 14:56:01.069 226890 DEBUG oslo_concurrency.lockutils [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Acquiring lock "cf24bde1-0912-4d63-8959-6799ae8ab043" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:56:01 np0005588920 nova_compute[226886]: 2026-01-20 14:56:01.069 226890 DEBUG oslo_concurrency.lockutils [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "cf24bde1-0912-4d63-8959-6799ae8ab043" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:56:01 np0005588920 nova_compute[226886]: 2026-01-20 14:56:01.086 226890 DEBUG nova.compute.manager [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 20 09:56:01 np0005588920 nova_compute[226886]: 2026-01-20 14:56:01.146 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:56:01 np0005588920 nova_compute[226886]: 2026-01-20 14:56:01.191 226890 DEBUG oslo_concurrency.lockutils [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:56:01 np0005588920 nova_compute[226886]: 2026-01-20 14:56:01.192 226890 DEBUG oslo_concurrency.lockutils [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:56:01 np0005588920 nova_compute[226886]: 2026-01-20 14:56:01.199 226890 DEBUG nova.virt.hardware [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 20 09:56:01 np0005588920 nova_compute[226886]: 2026-01-20 14:56:01.200 226890 INFO nova.compute.claims [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Claim successful on node compute-2.ctlplane.example.com
Jan 20 09:56:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:01.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:01 np0005588920 nova_compute[226886]: 2026-01-20 14:56:01.354 226890 DEBUG oslo_concurrency.processutils [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:56:01 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:56:01 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2444160729' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:56:01 np0005588920 nova_compute[226886]: 2026-01-20 14:56:01.801 226890 DEBUG oslo_concurrency.processutils [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:56:01 np0005588920 nova_compute[226886]: 2026-01-20 14:56:01.809 226890 DEBUG nova.compute.provider_tree [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 20 09:56:01 np0005588920 nova_compute[226886]: 2026-01-20 14:56:01.827 226890 DEBUG nova.scheduler.client.report [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 20 09:56:01 np0005588920 nova_compute[226886]: 2026-01-20 14:56:01.861 226890 DEBUG oslo_concurrency.lockutils [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.670s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:56:01 np0005588920 nova_compute[226886]: 2026-01-20 14:56:01.862 226890 DEBUG nova.compute.manager [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 20 09:56:01 np0005588920 nova_compute[226886]: 2026-01-20 14:56:01.927 226890 DEBUG nova.compute.manager [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 20 09:56:01 np0005588920 nova_compute[226886]: 2026-01-20 14:56:01.928 226890 DEBUG nova.network.neutron [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 20 09:56:01 np0005588920 nova_compute[226886]: 2026-01-20 14:56:01.947 226890 INFO nova.virt.libvirt.driver [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 20 09:56:01 np0005588920 podman[273693]: 2026-01-20 14:56:01.955186482 +0000 UTC m=+0.044867395 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:56:01 np0005588920 nova_compute[226886]: 2026-01-20 14:56:01.966 226890 DEBUG nova.compute.manager [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 20 09:56:02 np0005588920 nova_compute[226886]: 2026-01-20 14:56:02.057 226890 DEBUG nova.compute.manager [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 20 09:56:02 np0005588920 nova_compute[226886]: 2026-01-20 14:56:02.060 226890 DEBUG nova.virt.libvirt.driver [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 20 09:56:02 np0005588920 nova_compute[226886]: 2026-01-20 14:56:02.061 226890 INFO nova.virt.libvirt.driver [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Creating image(s)
Jan 20 09:56:02 np0005588920 nova_compute[226886]: 2026-01-20 14:56:02.091 226890 DEBUG nova.storage.rbd_utils [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] rbd image cf24bde1-0912-4d63-8959-6799ae8ab043_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:56:02 np0005588920 nova_compute[226886]: 2026-01-20 14:56:02.119 226890 DEBUG nova.storage.rbd_utils [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] rbd image cf24bde1-0912-4d63-8959-6799ae8ab043_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:56:02 np0005588920 nova_compute[226886]: 2026-01-20 14:56:02.147 226890 DEBUG nova.storage.rbd_utils [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] rbd image cf24bde1-0912-4d63-8959-6799ae8ab043_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:56:02 np0005588920 nova_compute[226886]: 2026-01-20 14:56:02.151 226890 DEBUG oslo_concurrency.processutils [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:56:02 np0005588920 nova_compute[226886]: 2026-01-20 14:56:02.174 226890 DEBUG nova.policy [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '215db37373dc4ae5a75cbd6866f471da', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b3b1b7f5b4f84b5abbc401eb577c85c0', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 20 09:56:02 np0005588920 nova_compute[226886]: 2026-01-20 14:56:02.208 226890 DEBUG oslo_concurrency.processutils [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:56:02 np0005588920 nova_compute[226886]: 2026-01-20 14:56:02.209 226890 DEBUG oslo_concurrency.lockutils [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:56:02 np0005588920 nova_compute[226886]: 2026-01-20 14:56:02.209 226890 DEBUG oslo_concurrency.lockutils [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:56:02 np0005588920 nova_compute[226886]: 2026-01-20 14:56:02.209 226890 DEBUG oslo_concurrency.lockutils [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:56:02 np0005588920 nova_compute[226886]: 2026-01-20 14:56:02.233 226890 DEBUG nova.storage.rbd_utils [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] rbd image cf24bde1-0912-4d63-8959-6799ae8ab043_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:56:02 np0005588920 nova_compute[226886]: 2026-01-20 14:56:02.237 226890 DEBUG oslo_concurrency.processutils [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 cf24bde1-0912-4d63-8959-6799ae8ab043_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:56:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:02.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:02 np0005588920 nova_compute[226886]: 2026-01-20 14:56:02.558 226890 DEBUG oslo_concurrency.processutils [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 cf24bde1-0912-4d63-8959-6799ae8ab043_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.321s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:56:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e308 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:56:02 np0005588920 nova_compute[226886]: 2026-01-20 14:56:02.616 226890 DEBUG nova.storage.rbd_utils [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] resizing rbd image cf24bde1-0912-4d63-8959-6799ae8ab043_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 20 09:56:02 np0005588920 nova_compute[226886]: 2026-01-20 14:56:02.707 226890 DEBUG nova.objects.instance [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lazy-loading 'migration_context' on Instance uuid cf24bde1-0912-4d63-8959-6799ae8ab043 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 09:56:02 np0005588920 nova_compute[226886]: 2026-01-20 14:56:02.721 226890 DEBUG nova.virt.libvirt.driver [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 20 09:56:02 np0005588920 nova_compute[226886]: 2026-01-20 14:56:02.721 226890 DEBUG nova.virt.libvirt.driver [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Ensure instance console log exists: /var/lib/nova/instances/cf24bde1-0912-4d63-8959-6799ae8ab043/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 20 09:56:02 np0005588920 nova_compute[226886]: 2026-01-20 14:56:02.722 226890 DEBUG oslo_concurrency.lockutils [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:56:02 np0005588920 nova_compute[226886]: 2026-01-20 14:56:02.722 226890 DEBUG oslo_concurrency.lockutils [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:56:02 np0005588920 nova_compute[226886]: 2026-01-20 14:56:02.722 226890 DEBUG oslo_concurrency.lockutils [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:56:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:56:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:03.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:56:03 np0005588920 nova_compute[226886]: 2026-01-20 14:56:03.967 226890 DEBUG nova.network.neutron [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Successfully created port: 0293f4ad-1248-4899-81ef-32e616d9a754 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:56:04 np0005588920 nova_compute[226886]: 2026-01-20 14:56:04.083 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:04.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:04 np0005588920 nova_compute[226886]: 2026-01-20 14:56:04.777 226890 DEBUG nova.network.neutron [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Successfully updated port: 0293f4ad-1248-4899-81ef-32e616d9a754 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:56:04 np0005588920 nova_compute[226886]: 2026-01-20 14:56:04.795 226890 DEBUG oslo_concurrency.lockutils [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Acquiring lock "refresh_cache-cf24bde1-0912-4d63-8959-6799ae8ab043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:56:04 np0005588920 nova_compute[226886]: 2026-01-20 14:56:04.796 226890 DEBUG oslo_concurrency.lockutils [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Acquired lock "refresh_cache-cf24bde1-0912-4d63-8959-6799ae8ab043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:56:04 np0005588920 nova_compute[226886]: 2026-01-20 14:56:04.796 226890 DEBUG nova.network.neutron [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:56:04 np0005588920 nova_compute[226886]: 2026-01-20 14:56:04.942 226890 DEBUG nova.compute.manager [req-21ebf925-20ed-44ef-992d-bef82cb29d4c req-74457d88-e6cb-4fec-9c14-232c3f7989fd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Received event network-changed-0293f4ad-1248-4899-81ef-32e616d9a754 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:56:04 np0005588920 nova_compute[226886]: 2026-01-20 14:56:04.943 226890 DEBUG nova.compute.manager [req-21ebf925-20ed-44ef-992d-bef82cb29d4c req-74457d88-e6cb-4fec-9c14-232c3f7989fd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Refreshing instance network info cache due to event network-changed-0293f4ad-1248-4899-81ef-32e616d9a754. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:56:04 np0005588920 nova_compute[226886]: 2026-01-20 14:56:04.944 226890 DEBUG oslo_concurrency.lockutils [req-21ebf925-20ed-44ef-992d-bef82cb29d4c req-74457d88-e6cb-4fec-9c14-232c3f7989fd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-cf24bde1-0912-4d63-8959-6799ae8ab043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:56:05 np0005588920 nova_compute[226886]: 2026-01-20 14:56:05.001 226890 DEBUG nova.network.neutron [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:56:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:56:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:05.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:56:06 np0005588920 nova_compute[226886]: 2026-01-20 14:56:06.039 226890 DEBUG nova.network.neutron [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Updating instance_info_cache with network_info: [{"id": "0293f4ad-1248-4899-81ef-32e616d9a754", "address": "fa:16:3e:a8:aa:73", "network": {"id": "41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1445030024-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3b1b7f5b4f84b5abbc401eb577c85c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0293f4ad-12", "ovs_interfaceid": "0293f4ad-1248-4899-81ef-32e616d9a754", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:56:06 np0005588920 nova_compute[226886]: 2026-01-20 14:56:06.073 226890 DEBUG oslo_concurrency.lockutils [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Releasing lock "refresh_cache-cf24bde1-0912-4d63-8959-6799ae8ab043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:56:06 np0005588920 nova_compute[226886]: 2026-01-20 14:56:06.074 226890 DEBUG nova.compute.manager [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Instance network_info: |[{"id": "0293f4ad-1248-4899-81ef-32e616d9a754", "address": "fa:16:3e:a8:aa:73", "network": {"id": "41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1445030024-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3b1b7f5b4f84b5abbc401eb577c85c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0293f4ad-12", "ovs_interfaceid": "0293f4ad-1248-4899-81ef-32e616d9a754", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:56:06 np0005588920 nova_compute[226886]: 2026-01-20 14:56:06.074 226890 DEBUG oslo_concurrency.lockutils [req-21ebf925-20ed-44ef-992d-bef82cb29d4c req-74457d88-e6cb-4fec-9c14-232c3f7989fd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-cf24bde1-0912-4d63-8959-6799ae8ab043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:56:06 np0005588920 nova_compute[226886]: 2026-01-20 14:56:06.074 226890 DEBUG nova.network.neutron [req-21ebf925-20ed-44ef-992d-bef82cb29d4c req-74457d88-e6cb-4fec-9c14-232c3f7989fd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Refreshing network info cache for port 0293f4ad-1248-4899-81ef-32e616d9a754 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:56:06 np0005588920 nova_compute[226886]: 2026-01-20 14:56:06.077 226890 DEBUG nova.virt.libvirt.driver [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Start _get_guest_xml network_info=[{"id": "0293f4ad-1248-4899-81ef-32e616d9a754", "address": "fa:16:3e:a8:aa:73", "network": {"id": "41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1445030024-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3b1b7f5b4f84b5abbc401eb577c85c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0293f4ad-12", "ovs_interfaceid": "0293f4ad-1248-4899-81ef-32e616d9a754", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:56:06 np0005588920 nova_compute[226886]: 2026-01-20 14:56:06.081 226890 WARNING nova.virt.libvirt.driver [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:56:06 np0005588920 nova_compute[226886]: 2026-01-20 14:56:06.087 226890 DEBUG nova.virt.libvirt.host [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:56:06 np0005588920 nova_compute[226886]: 2026-01-20 14:56:06.087 226890 DEBUG nova.virt.libvirt.host [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:56:06 np0005588920 nova_compute[226886]: 2026-01-20 14:56:06.090 226890 DEBUG nova.virt.libvirt.host [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:56:06 np0005588920 nova_compute[226886]: 2026-01-20 14:56:06.090 226890 DEBUG nova.virt.libvirt.host [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:56:06 np0005588920 nova_compute[226886]: 2026-01-20 14:56:06.091 226890 DEBUG nova.virt.libvirt.driver [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:56:06 np0005588920 nova_compute[226886]: 2026-01-20 14:56:06.092 226890 DEBUG nova.virt.hardware [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:56:06 np0005588920 nova_compute[226886]: 2026-01-20 14:56:06.092 226890 DEBUG nova.virt.hardware [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:56:06 np0005588920 nova_compute[226886]: 2026-01-20 14:56:06.092 226890 DEBUG nova.virt.hardware [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:56:06 np0005588920 nova_compute[226886]: 2026-01-20 14:56:06.092 226890 DEBUG nova.virt.hardware [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:56:06 np0005588920 nova_compute[226886]: 2026-01-20 14:56:06.093 226890 DEBUG nova.virt.hardware [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:56:06 np0005588920 nova_compute[226886]: 2026-01-20 14:56:06.093 226890 DEBUG nova.virt.hardware [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:56:06 np0005588920 nova_compute[226886]: 2026-01-20 14:56:06.093 226890 DEBUG nova.virt.hardware [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:56:06 np0005588920 nova_compute[226886]: 2026-01-20 14:56:06.093 226890 DEBUG nova.virt.hardware [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:56:06 np0005588920 nova_compute[226886]: 2026-01-20 14:56:06.094 226890 DEBUG nova.virt.hardware [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:56:06 np0005588920 nova_compute[226886]: 2026-01-20 14:56:06.094 226890 DEBUG nova.virt.hardware [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:56:06 np0005588920 nova_compute[226886]: 2026-01-20 14:56:06.094 226890 DEBUG nova.virt.hardware [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:56:06 np0005588920 nova_compute[226886]: 2026-01-20 14:56:06.096 226890 DEBUG oslo_concurrency.processutils [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:56:06 np0005588920 nova_compute[226886]: 2026-01-20 14:56:06.149 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:06.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:06 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:56:06 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1903058217' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:56:06 np0005588920 nova_compute[226886]: 2026-01-20 14:56:06.542 226890 DEBUG oslo_concurrency.processutils [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:56:06 np0005588920 nova_compute[226886]: 2026-01-20 14:56:06.577 226890 DEBUG nova.storage.rbd_utils [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] rbd image cf24bde1-0912-4d63-8959-6799ae8ab043_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:56:06 np0005588920 nova_compute[226886]: 2026-01-20 14:56:06.582 226890 DEBUG oslo_concurrency.processutils [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:56:06 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:56:06 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3304163422' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:56:07 np0005588920 nova_compute[226886]: 2026-01-20 14:56:07.003 226890 DEBUG oslo_concurrency.processutils [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.421s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:56:07 np0005588920 nova_compute[226886]: 2026-01-20 14:56:07.005 226890 DEBUG nova.virt.libvirt.vif [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:56:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1938504336',display_name='tempest-ServerActionsTestOtherB-server-1938504336',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1938504336',id=130,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO99uJ9+FwgjxRb/9u+f3Mj9/VKSDM+OKd66Ygsg8lEO+7bGpDEQrC5BIaSV+Na5YF+3DqUwLNmAYSN9IkTSGbRPw5y8813A+KsiNHebrpnZ7oReyT+5/zNQYafCHVAfGA==',key_name='tempest-keypair-302882914',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b3b1b7f5b4f84b5abbc401eb577c85c0',ramdisk_id='',reservation_id='r-n1l51qfd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1136521362',owner_user_name='tempest-ServerActionsTestOtherB-1136521362-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:56:02Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='215db37373dc4ae5a75cbd6866f471da',uuid=cf24bde1-0912-4d63-8959-6799ae8ab043,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0293f4ad-1248-4899-81ef-32e616d9a754", "address": "fa:16:3e:a8:aa:73", "network": {"id": "41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1445030024-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3b1b7f5b4f84b5abbc401eb577c85c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0293f4ad-12", "ovs_interfaceid": "0293f4ad-1248-4899-81ef-32e616d9a754", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:56:07 np0005588920 nova_compute[226886]: 2026-01-20 14:56:07.005 226890 DEBUG nova.network.os_vif_util [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Converting VIF {"id": "0293f4ad-1248-4899-81ef-32e616d9a754", "address": "fa:16:3e:a8:aa:73", "network": {"id": "41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1445030024-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3b1b7f5b4f84b5abbc401eb577c85c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0293f4ad-12", "ovs_interfaceid": "0293f4ad-1248-4899-81ef-32e616d9a754", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:56:07 np0005588920 nova_compute[226886]: 2026-01-20 14:56:07.006 226890 DEBUG nova.network.os_vif_util [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a8:aa:73,bridge_name='br-int',has_traffic_filtering=True,id=0293f4ad-1248-4899-81ef-32e616d9a754,network=Network(41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0293f4ad-12') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:56:07 np0005588920 nova_compute[226886]: 2026-01-20 14:56:07.007 226890 DEBUG nova.objects.instance [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lazy-loading 'pci_devices' on Instance uuid cf24bde1-0912-4d63-8959-6799ae8ab043 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:56:07 np0005588920 nova_compute[226886]: 2026-01-20 14:56:07.023 226890 DEBUG nova.virt.libvirt.driver [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:56:07 np0005588920 nova_compute[226886]:  <uuid>cf24bde1-0912-4d63-8959-6799ae8ab043</uuid>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:  <name>instance-00000082</name>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:56:07 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:      <nova:name>tempest-ServerActionsTestOtherB-server-1938504336</nova:name>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:56:06</nova:creationTime>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:56:07 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:        <nova:user uuid="215db37373dc4ae5a75cbd6866f471da">tempest-ServerActionsTestOtherB-1136521362-project-member</nova:user>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:        <nova:project uuid="b3b1b7f5b4f84b5abbc401eb577c85c0">tempest-ServerActionsTestOtherB-1136521362</nova:project>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:        <nova:port uuid="0293f4ad-1248-4899-81ef-32e616d9a754">
Jan 20 09:56:07 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:      <entry name="serial">cf24bde1-0912-4d63-8959-6799ae8ab043</entry>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:      <entry name="uuid">cf24bde1-0912-4d63-8959-6799ae8ab043</entry>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:56:07 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/cf24bde1-0912-4d63-8959-6799ae8ab043_disk">
Jan 20 09:56:07 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:56:07 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:56:07 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/cf24bde1-0912-4d63-8959-6799ae8ab043_disk.config">
Jan 20 09:56:07 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:56:07 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:56:07 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:a8:aa:73"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:      <target dev="tap0293f4ad-12"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:56:07 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/cf24bde1-0912-4d63-8959-6799ae8ab043/console.log" append="off"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:56:07 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:56:07 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:56:07 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:56:07 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:56:07 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:56:07 np0005588920 nova_compute[226886]: 2026-01-20 14:56:07.026 226890 DEBUG nova.compute.manager [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Preparing to wait for external event network-vif-plugged-0293f4ad-1248-4899-81ef-32e616d9a754 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:56:07 np0005588920 nova_compute[226886]: 2026-01-20 14:56:07.027 226890 DEBUG oslo_concurrency.lockutils [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Acquiring lock "cf24bde1-0912-4d63-8959-6799ae8ab043-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:07 np0005588920 nova_compute[226886]: 2026-01-20 14:56:07.027 226890 DEBUG oslo_concurrency.lockutils [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "cf24bde1-0912-4d63-8959-6799ae8ab043-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:07 np0005588920 nova_compute[226886]: 2026-01-20 14:56:07.028 226890 DEBUG oslo_concurrency.lockutils [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "cf24bde1-0912-4d63-8959-6799ae8ab043-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:07 np0005588920 nova_compute[226886]: 2026-01-20 14:56:07.030 226890 DEBUG nova.virt.libvirt.vif [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:56:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1938504336',display_name='tempest-ServerActionsTestOtherB-server-1938504336',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1938504336',id=130,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO99uJ9+FwgjxRb/9u+f3Mj9/VKSDM+OKd66Ygsg8lEO+7bGpDEQrC5BIaSV+Na5YF+3DqUwLNmAYSN9IkTSGbRPw5y8813A+KsiNHebrpnZ7oReyT+5/zNQYafCHVAfGA==',key_name='tempest-keypair-302882914',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b3b1b7f5b4f84b5abbc401eb577c85c0',ramdisk_id='',reservation_id='r-n1l51qfd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1136521362',owner_user_name='tempest-ServerActionsTestOtherB-1136521362-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:56:02Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='215db37373dc4ae5a75cbd6866f471da',uuid=cf24bde1-0912-4d63-8959-6799ae8ab043,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0293f4ad-1248-4899-81ef-32e616d9a754", "address": "fa:16:3e:a8:aa:73", "network": {"id": "41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1445030024-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3b1b7f5b4f84b5abbc401eb577c85c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0293f4ad-12", "ovs_interfaceid": "0293f4ad-1248-4899-81ef-32e616d9a754", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:56:07 np0005588920 nova_compute[226886]: 2026-01-20 14:56:07.030 226890 DEBUG nova.network.os_vif_util [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Converting VIF {"id": "0293f4ad-1248-4899-81ef-32e616d9a754", "address": "fa:16:3e:a8:aa:73", "network": {"id": "41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1445030024-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3b1b7f5b4f84b5abbc401eb577c85c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0293f4ad-12", "ovs_interfaceid": "0293f4ad-1248-4899-81ef-32e616d9a754", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:56:07 np0005588920 nova_compute[226886]: 2026-01-20 14:56:07.032 226890 DEBUG nova.network.os_vif_util [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a8:aa:73,bridge_name='br-int',has_traffic_filtering=True,id=0293f4ad-1248-4899-81ef-32e616d9a754,network=Network(41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0293f4ad-12') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:56:07 np0005588920 nova_compute[226886]: 2026-01-20 14:56:07.033 226890 DEBUG os_vif [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:aa:73,bridge_name='br-int',has_traffic_filtering=True,id=0293f4ad-1248-4899-81ef-32e616d9a754,network=Network(41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0293f4ad-12') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:56:07 np0005588920 nova_compute[226886]: 2026-01-20 14:56:07.034 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:07 np0005588920 nova_compute[226886]: 2026-01-20 14:56:07.035 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:56:07 np0005588920 nova_compute[226886]: 2026-01-20 14:56:07.036 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:56:07 np0005588920 nova_compute[226886]: 2026-01-20 14:56:07.043 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:07 np0005588920 nova_compute[226886]: 2026-01-20 14:56:07.044 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0293f4ad-12, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:56:07 np0005588920 nova_compute[226886]: 2026-01-20 14:56:07.045 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0293f4ad-12, col_values=(('external_ids', {'iface-id': '0293f4ad-1248-4899-81ef-32e616d9a754', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a8:aa:73', 'vm-uuid': 'cf24bde1-0912-4d63-8959-6799ae8ab043'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:56:07 np0005588920 nova_compute[226886]: 2026-01-20 14:56:07.047 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:07 np0005588920 NetworkManager[49076]: <info>  [1768920967.0487] manager: (tap0293f4ad-12): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/300)
Jan 20 09:56:07 np0005588920 nova_compute[226886]: 2026-01-20 14:56:07.050 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:56:07 np0005588920 nova_compute[226886]: 2026-01-20 14:56:07.056 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:07 np0005588920 nova_compute[226886]: 2026-01-20 14:56:07.059 226890 INFO os_vif [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:aa:73,bridge_name='br-int',has_traffic_filtering=True,id=0293f4ad-1248-4899-81ef-32e616d9a754,network=Network(41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0293f4ad-12')#033[00m
Jan 20 09:56:07 np0005588920 nova_compute[226886]: 2026-01-20 14:56:07.126 226890 DEBUG nova.virt.libvirt.driver [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:56:07 np0005588920 nova_compute[226886]: 2026-01-20 14:56:07.127 226890 DEBUG nova.virt.libvirt.driver [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:56:07 np0005588920 nova_compute[226886]: 2026-01-20 14:56:07.127 226890 DEBUG nova.virt.libvirt.driver [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] No VIF found with MAC fa:16:3e:a8:aa:73, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:56:07 np0005588920 nova_compute[226886]: 2026-01-20 14:56:07.127 226890 INFO nova.virt.libvirt.driver [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Using config drive#033[00m
Jan 20 09:56:07 np0005588920 nova_compute[226886]: 2026-01-20 14:56:07.155 226890 DEBUG nova.storage.rbd_utils [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] rbd image cf24bde1-0912-4d63-8959-6799ae8ab043_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:56:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:07.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:07 np0005588920 nova_compute[226886]: 2026-01-20 14:56:07.557 226890 DEBUG nova.network.neutron [req-21ebf925-20ed-44ef-992d-bef82cb29d4c req-74457d88-e6cb-4fec-9c14-232c3f7989fd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Updated VIF entry in instance network info cache for port 0293f4ad-1248-4899-81ef-32e616d9a754. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:56:07 np0005588920 nova_compute[226886]: 2026-01-20 14:56:07.558 226890 DEBUG nova.network.neutron [req-21ebf925-20ed-44ef-992d-bef82cb29d4c req-74457d88-e6cb-4fec-9c14-232c3f7989fd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Updating instance_info_cache with network_info: [{"id": "0293f4ad-1248-4899-81ef-32e616d9a754", "address": "fa:16:3e:a8:aa:73", "network": {"id": "41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1445030024-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3b1b7f5b4f84b5abbc401eb577c85c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0293f4ad-12", "ovs_interfaceid": "0293f4ad-1248-4899-81ef-32e616d9a754", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:56:07 np0005588920 nova_compute[226886]: 2026-01-20 14:56:07.591 226890 DEBUG oslo_concurrency.lockutils [req-21ebf925-20ed-44ef-992d-bef82cb29d4c req-74457d88-e6cb-4fec-9c14-232c3f7989fd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-cf24bde1-0912-4d63-8959-6799ae8ab043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:56:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e308 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:56:08 np0005588920 nova_compute[226886]: 2026-01-20 14:56:08.228 226890 INFO nova.virt.libvirt.driver [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Creating config drive at /var/lib/nova/instances/cf24bde1-0912-4d63-8959-6799ae8ab043/disk.config#033[00m
Jan 20 09:56:08 np0005588920 nova_compute[226886]: 2026-01-20 14:56:08.240 226890 DEBUG oslo_concurrency.processutils [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/cf24bde1-0912-4d63-8959-6799ae8ab043/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2_nupyiw execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:56:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:08.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:08 np0005588920 nova_compute[226886]: 2026-01-20 14:56:08.395 226890 DEBUG oslo_concurrency.processutils [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/cf24bde1-0912-4d63-8959-6799ae8ab043/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2_nupyiw" returned: 0 in 0.155s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:56:08 np0005588920 nova_compute[226886]: 2026-01-20 14:56:08.426 226890 DEBUG nova.storage.rbd_utils [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] rbd image cf24bde1-0912-4d63-8959-6799ae8ab043_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:56:08 np0005588920 nova_compute[226886]: 2026-01-20 14:56:08.430 226890 DEBUG oslo_concurrency.processutils [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/cf24bde1-0912-4d63-8959-6799ae8ab043/disk.config cf24bde1-0912-4d63-8959-6799ae8ab043_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:56:08 np0005588920 nova_compute[226886]: 2026-01-20 14:56:08.578 226890 DEBUG oslo_concurrency.processutils [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/cf24bde1-0912-4d63-8959-6799ae8ab043/disk.config cf24bde1-0912-4d63-8959-6799ae8ab043_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.149s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:56:08 np0005588920 nova_compute[226886]: 2026-01-20 14:56:08.580 226890 INFO nova.virt.libvirt.driver [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Deleting local config drive /var/lib/nova/instances/cf24bde1-0912-4d63-8959-6799ae8ab043/disk.config because it was imported into RBD.#033[00m
Jan 20 09:56:08 np0005588920 kernel: tap0293f4ad-12: entered promiscuous mode
Jan 20 09:56:08 np0005588920 NetworkManager[49076]: <info>  [1768920968.6283] manager: (tap0293f4ad-12): new Tun device (/org/freedesktop/NetworkManager/Devices/301)
Jan 20 09:56:08 np0005588920 ovn_controller[133971]: 2026-01-20T14:56:08Z|00607|binding|INFO|Claiming lport 0293f4ad-1248-4899-81ef-32e616d9a754 for this chassis.
Jan 20 09:56:08 np0005588920 ovn_controller[133971]: 2026-01-20T14:56:08Z|00608|binding|INFO|0293f4ad-1248-4899-81ef-32e616d9a754: Claiming fa:16:3e:a8:aa:73 10.100.0.12
Jan 20 09:56:08 np0005588920 nova_compute[226886]: 2026-01-20 14:56:08.629 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:56:08.635 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a8:aa:73 10.100.0.12'], port_security=['fa:16:3e:a8:aa:73 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'cf24bde1-0912-4d63-8959-6799ae8ab043', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b3b1b7f5b4f84b5abbc401eb577c85c0', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8b11f3fb-2601-4eca-a1b6-838549d7750c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3273589e-5585-406c-9611-87f758b0e521, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=0293f4ad-1248-4899-81ef-32e616d9a754) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:56:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:56:08.636 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 0293f4ad-1248-4899-81ef-32e616d9a754 in datapath 41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce bound to our chassis#033[00m
Jan 20 09:56:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:56:08.638 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce#033[00m
Jan 20 09:56:08 np0005588920 ovn_controller[133971]: 2026-01-20T14:56:08Z|00609|binding|INFO|Setting lport 0293f4ad-1248-4899-81ef-32e616d9a754 ovn-installed in OVS
Jan 20 09:56:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:56:08.654 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[1457de94-e69f-4541-ba17-abbf9eca8a2e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:08 np0005588920 ovn_controller[133971]: 2026-01-20T14:56:08Z|00610|binding|INFO|Setting lport 0293f4ad-1248-4899-81ef-32e616d9a754 up in Southbound
Jan 20 09:56:08 np0005588920 systemd-udevd[274016]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:56:08 np0005588920 NetworkManager[49076]: <info>  [1768920968.6776] device (tap0293f4ad-12): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:56:08 np0005588920 NetworkManager[49076]: <info>  [1768920968.6794] device (tap0293f4ad-12): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:56:08 np0005588920 nova_compute[226886]: 2026-01-20 14:56:08.703 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:08 np0005588920 systemd-machined[196121]: New machine qemu-61-instance-00000082.
Jan 20 09:56:08 np0005588920 systemd[1]: Started Virtual Machine qemu-61-instance-00000082.
Jan 20 09:56:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:56:08.733 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[4b664c97-5558-4c4f-bcd7-df76ae17689e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:56:08.735 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[824fe440-6a27-4e01-9dcb-924c0c76a028]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:56:08.765 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[e4102772-a853-42f7-a3ed-b760c4cd3fe0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:56:08.781 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e27353b1-ad96-4a55-8eed-de2458dad9ee]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap41a1a3fe-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3c:1f:b5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 183], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 585739, 'reachable_time': 32128, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 274028, 'error': None, 'target': 'ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:56:08.795 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c6ad6e95-43a6-4a65-a355-38d0560e8f01]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap41a1a3fe-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 585752, 'tstamp': 585752}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 274030, 'error': None, 'target': 'ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap41a1a3fe-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 585755, 'tstamp': 585755}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 274030, 'error': None, 'target': 'ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:56:08.796 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap41a1a3fe-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:56:08 np0005588920 nova_compute[226886]: 2026-01-20 14:56:08.797 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:08 np0005588920 nova_compute[226886]: 2026-01-20 14:56:08.798 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:56:08.798 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap41a1a3fe-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:56:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:56:08.799 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:56:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:56:08.799 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap41a1a3fe-f0, col_values=(('external_ids', {'iface-id': '3fa2df7b-42b2-4a3b-a33b-ab37b5d6aef3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:56:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:56:08.799 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:56:09 np0005588920 nova_compute[226886]: 2026-01-20 14:56:09.256 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920969.2558162, cf24bde1-0912-4d63-8959-6799ae8ab043 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:56:09 np0005588920 nova_compute[226886]: 2026-01-20 14:56:09.256 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] VM Started (Lifecycle Event)#033[00m
Jan 20 09:56:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:09.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:09 np0005588920 nova_compute[226886]: 2026-01-20 14:56:09.303 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:56:09 np0005588920 nova_compute[226886]: 2026-01-20 14:56:09.307 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920969.2560205, cf24bde1-0912-4d63-8959-6799ae8ab043 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:56:09 np0005588920 nova_compute[226886]: 2026-01-20 14:56:09.307 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:56:09 np0005588920 nova_compute[226886]: 2026-01-20 14:56:09.330 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:56:09 np0005588920 nova_compute[226886]: 2026-01-20 14:56:09.333 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:56:09 np0005588920 nova_compute[226886]: 2026-01-20 14:56:09.362 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:56:09 np0005588920 nova_compute[226886]: 2026-01-20 14:56:09.423 226890 DEBUG nova.compute.manager [req-0eab3435-dad4-4d5f-aa99-e145ee9b5547 req-c9a3889a-40c3-4b10-ab54-c0ebe43055bc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Received event network-vif-plugged-0293f4ad-1248-4899-81ef-32e616d9a754 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:56:09 np0005588920 nova_compute[226886]: 2026-01-20 14:56:09.424 226890 DEBUG oslo_concurrency.lockutils [req-0eab3435-dad4-4d5f-aa99-e145ee9b5547 req-c9a3889a-40c3-4b10-ab54-c0ebe43055bc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "cf24bde1-0912-4d63-8959-6799ae8ab043-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:09 np0005588920 nova_compute[226886]: 2026-01-20 14:56:09.424 226890 DEBUG oslo_concurrency.lockutils [req-0eab3435-dad4-4d5f-aa99-e145ee9b5547 req-c9a3889a-40c3-4b10-ab54-c0ebe43055bc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "cf24bde1-0912-4d63-8959-6799ae8ab043-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:09 np0005588920 nova_compute[226886]: 2026-01-20 14:56:09.424 226890 DEBUG oslo_concurrency.lockutils [req-0eab3435-dad4-4d5f-aa99-e145ee9b5547 req-c9a3889a-40c3-4b10-ab54-c0ebe43055bc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "cf24bde1-0912-4d63-8959-6799ae8ab043-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:09 np0005588920 nova_compute[226886]: 2026-01-20 14:56:09.425 226890 DEBUG nova.compute.manager [req-0eab3435-dad4-4d5f-aa99-e145ee9b5547 req-c9a3889a-40c3-4b10-ab54-c0ebe43055bc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Processing event network-vif-plugged-0293f4ad-1248-4899-81ef-32e616d9a754 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:56:09 np0005588920 nova_compute[226886]: 2026-01-20 14:56:09.425 226890 DEBUG nova.compute.manager [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:56:09 np0005588920 nova_compute[226886]: 2026-01-20 14:56:09.430 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768920969.4289038, cf24bde1-0912-4d63-8959-6799ae8ab043 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:56:09 np0005588920 nova_compute[226886]: 2026-01-20 14:56:09.430 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:56:09 np0005588920 nova_compute[226886]: 2026-01-20 14:56:09.431 226890 DEBUG nova.virt.libvirt.driver [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:56:09 np0005588920 nova_compute[226886]: 2026-01-20 14:56:09.434 226890 INFO nova.virt.libvirt.driver [-] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Instance spawned successfully.#033[00m
Jan 20 09:56:09 np0005588920 nova_compute[226886]: 2026-01-20 14:56:09.435 226890 DEBUG nova.virt.libvirt.driver [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:56:09 np0005588920 nova_compute[226886]: 2026-01-20 14:56:09.454 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:56:09 np0005588920 nova_compute[226886]: 2026-01-20 14:56:09.461 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:56:09 np0005588920 nova_compute[226886]: 2026-01-20 14:56:09.465 226890 DEBUG nova.virt.libvirt.driver [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:56:09 np0005588920 nova_compute[226886]: 2026-01-20 14:56:09.466 226890 DEBUG nova.virt.libvirt.driver [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:56:09 np0005588920 nova_compute[226886]: 2026-01-20 14:56:09.466 226890 DEBUG nova.virt.libvirt.driver [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:56:09 np0005588920 nova_compute[226886]: 2026-01-20 14:56:09.467 226890 DEBUG nova.virt.libvirt.driver [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:56:09 np0005588920 nova_compute[226886]: 2026-01-20 14:56:09.467 226890 DEBUG nova.virt.libvirt.driver [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:56:09 np0005588920 nova_compute[226886]: 2026-01-20 14:56:09.468 226890 DEBUG nova.virt.libvirt.driver [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:56:09 np0005588920 nova_compute[226886]: 2026-01-20 14:56:09.529 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:56:09 np0005588920 nova_compute[226886]: 2026-01-20 14:56:09.632 226890 INFO nova.compute.manager [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Took 7.57 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:56:09 np0005588920 nova_compute[226886]: 2026-01-20 14:56:09.633 226890 DEBUG nova.compute.manager [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:56:09 np0005588920 nova_compute[226886]: 2026-01-20 14:56:09.715 226890 INFO nova.compute.manager [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Took 8.55 seconds to build instance.#033[00m
Jan 20 09:56:09 np0005588920 nova_compute[226886]: 2026-01-20 14:56:09.754 226890 DEBUG oslo_concurrency.lockutils [None req-567e97a5-792b-42f8-80dc-d8bace5cf166 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "cf24bde1-0912-4d63-8959-6799ae8ab043" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.685s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:10.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:11 np0005588920 nova_compute[226886]: 2026-01-20 14:56:11.191 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:11.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:12 np0005588920 nova_compute[226886]: 2026-01-20 14:56:12.047 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:12.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:12 np0005588920 nova_compute[226886]: 2026-01-20 14:56:12.448 226890 DEBUG nova.compute.manager [req-35fb815a-9fba-461a-8f8a-d244a36878e2 req-a835dafa-1400-44d5-98fb-517427180ca2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Received event network-vif-plugged-0293f4ad-1248-4899-81ef-32e616d9a754 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:56:12 np0005588920 nova_compute[226886]: 2026-01-20 14:56:12.448 226890 DEBUG oslo_concurrency.lockutils [req-35fb815a-9fba-461a-8f8a-d244a36878e2 req-a835dafa-1400-44d5-98fb-517427180ca2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "cf24bde1-0912-4d63-8959-6799ae8ab043-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:12 np0005588920 nova_compute[226886]: 2026-01-20 14:56:12.448 226890 DEBUG oslo_concurrency.lockutils [req-35fb815a-9fba-461a-8f8a-d244a36878e2 req-a835dafa-1400-44d5-98fb-517427180ca2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "cf24bde1-0912-4d63-8959-6799ae8ab043-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:12 np0005588920 nova_compute[226886]: 2026-01-20 14:56:12.448 226890 DEBUG oslo_concurrency.lockutils [req-35fb815a-9fba-461a-8f8a-d244a36878e2 req-a835dafa-1400-44d5-98fb-517427180ca2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "cf24bde1-0912-4d63-8959-6799ae8ab043-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:12 np0005588920 nova_compute[226886]: 2026-01-20 14:56:12.449 226890 DEBUG nova.compute.manager [req-35fb815a-9fba-461a-8f8a-d244a36878e2 req-a835dafa-1400-44d5-98fb-517427180ca2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] No waiting events found dispatching network-vif-plugged-0293f4ad-1248-4899-81ef-32e616d9a754 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:56:12 np0005588920 nova_compute[226886]: 2026-01-20 14:56:12.449 226890 WARNING nova.compute.manager [req-35fb815a-9fba-461a-8f8a-d244a36878e2 req-a835dafa-1400-44d5-98fb-517427180ca2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Received unexpected event network-vif-plugged-0293f4ad-1248-4899-81ef-32e616d9a754 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:56:12 np0005588920 nova_compute[226886]: 2026-01-20 14:56:12.605 226890 DEBUG nova.compute.manager [req-fa658955-b7f0-40a8-874d-abfa22909e06 req-a45a90b5-2aaa-4fe8-8bb9-1cc1d5138d3a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Received event network-changed-0293f4ad-1248-4899-81ef-32e616d9a754 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:56:12 np0005588920 nova_compute[226886]: 2026-01-20 14:56:12.605 226890 DEBUG nova.compute.manager [req-fa658955-b7f0-40a8-874d-abfa22909e06 req-a45a90b5-2aaa-4fe8-8bb9-1cc1d5138d3a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Refreshing instance network info cache due to event network-changed-0293f4ad-1248-4899-81ef-32e616d9a754. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:56:12 np0005588920 nova_compute[226886]: 2026-01-20 14:56:12.605 226890 DEBUG oslo_concurrency.lockutils [req-fa658955-b7f0-40a8-874d-abfa22909e06 req-a45a90b5-2aaa-4fe8-8bb9-1cc1d5138d3a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-cf24bde1-0912-4d63-8959-6799ae8ab043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:56:12 np0005588920 nova_compute[226886]: 2026-01-20 14:56:12.605 226890 DEBUG oslo_concurrency.lockutils [req-fa658955-b7f0-40a8-874d-abfa22909e06 req-a45a90b5-2aaa-4fe8-8bb9-1cc1d5138d3a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-cf24bde1-0912-4d63-8959-6799ae8ab043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:56:12 np0005588920 nova_compute[226886]: 2026-01-20 14:56:12.606 226890 DEBUG nova.network.neutron [req-fa658955-b7f0-40a8-874d-abfa22909e06 req-a45a90b5-2aaa-4fe8-8bb9-1cc1d5138d3a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Refreshing network info cache for port 0293f4ad-1248-4899-81ef-32e616d9a754 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:56:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e308 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:56:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:56:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:13.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:56:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 09:56:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1356210779' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 09:56:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 09:56:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1356210779' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 09:56:14 np0005588920 nova_compute[226886]: 2026-01-20 14:56:14.061 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768920959.0606306, 3339fe13-fddd-4233-9eac-bb4dbce1c777 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:56:14 np0005588920 nova_compute[226886]: 2026-01-20 14:56:14.062 226890 INFO nova.compute.manager [-] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:56:14 np0005588920 nova_compute[226886]: 2026-01-20 14:56:14.094 226890 DEBUG nova.compute.manager [None req-d3133cd6-7a32-40c5-b8f1-512c6ebdc508 - - - - - -] [instance: 3339fe13-fddd-4233-9eac-bb4dbce1c777] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:56:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:56:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:14.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:56:14 np0005588920 nova_compute[226886]: 2026-01-20 14:56:14.471 226890 DEBUG nova.network.neutron [req-fa658955-b7f0-40a8-874d-abfa22909e06 req-a45a90b5-2aaa-4fe8-8bb9-1cc1d5138d3a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Updated VIF entry in instance network info cache for port 0293f4ad-1248-4899-81ef-32e616d9a754. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:56:14 np0005588920 nova_compute[226886]: 2026-01-20 14:56:14.472 226890 DEBUG nova.network.neutron [req-fa658955-b7f0-40a8-874d-abfa22909e06 req-a45a90b5-2aaa-4fe8-8bb9-1cc1d5138d3a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Updating instance_info_cache with network_info: [{"id": "0293f4ad-1248-4899-81ef-32e616d9a754", "address": "fa:16:3e:a8:aa:73", "network": {"id": "41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1445030024-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3b1b7f5b4f84b5abbc401eb577c85c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0293f4ad-12", "ovs_interfaceid": "0293f4ad-1248-4899-81ef-32e616d9a754", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:56:14 np0005588920 nova_compute[226886]: 2026-01-20 14:56:14.495 226890 DEBUG oslo_concurrency.lockutils [req-fa658955-b7f0-40a8-874d-abfa22909e06 req-a45a90b5-2aaa-4fe8-8bb9-1cc1d5138d3a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-cf24bde1-0912-4d63-8959-6799ae8ab043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:56:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:15.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:16 np0005588920 nova_compute[226886]: 2026-01-20 14:56:16.230 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:16.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:56:16.458 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:56:16.458 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:56:16.458 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:16 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 09:56:16 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3749428790' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 09:56:16 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 09:56:16 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3749428790' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 09:56:17 np0005588920 nova_compute[226886]: 2026-01-20 14:56:17.048 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:17.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e308 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:56:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:56:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:18.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:56:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:56:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:19.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:56:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:20.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:20 np0005588920 ovn_controller[133971]: 2026-01-20T14:56:20Z|00611|binding|INFO|Releasing lport 3fa2df7b-42b2-4a3b-a33b-ab37b5d6aef3 from this chassis (sb_readonly=0)
Jan 20 09:56:20 np0005588920 nova_compute[226886]: 2026-01-20 14:56:20.868 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:21 np0005588920 nova_compute[226886]: 2026-01-20 14:56:21.232 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:21.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:22 np0005588920 nova_compute[226886]: 2026-01-20 14:56:22.051 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:56:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:22.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:56:22 np0005588920 ovn_controller[133971]: 2026-01-20T14:56:22Z|00084|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a8:aa:73 10.100.0.12
Jan 20 09:56:22 np0005588920 ovn_controller[133971]: 2026-01-20T14:56:22Z|00085|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a8:aa:73 10.100.0.12
Jan 20 09:56:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e308 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:56:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:56:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:23.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:56:24 np0005588920 ovn_controller[133971]: 2026-01-20T14:56:24Z|00612|binding|INFO|Releasing lport 3fa2df7b-42b2-4a3b-a33b-ab37b5d6aef3 from this chassis (sb_readonly=0)
Jan 20 09:56:24 np0005588920 nova_compute[226886]: 2026-01-20 14:56:24.090 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:24.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:24 np0005588920 podman[274099]: 2026-01-20 14:56:24.804078739 +0000 UTC m=+0.093288689 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, config_id=ovn_controller, container_name=ovn_controller)
Jan 20 09:56:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:56:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:25.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:56:25 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:56:25 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:56:25 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:56:26 np0005588920 nova_compute[226886]: 2026-01-20 14:56:26.235 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:56:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:26.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:56:27 np0005588920 nova_compute[226886]: 2026-01-20 14:56:27.052 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:27.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e308 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:56:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:28.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:56:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:29.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:56:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:30.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:31 np0005588920 nova_compute[226886]: 2026-01-20 14:56:31.238 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:31.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:32 np0005588920 nova_compute[226886]: 2026-01-20 14:56:32.054 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:32.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:32 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:56:32 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:56:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e308 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:56:32 np0005588920 podman[274282]: 2026-01-20 14:56:32.968090743 +0000 UTC m=+0.054238199 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 20 09:56:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:56:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:33.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:56:33 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 09:56:33 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/863845729' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 09:56:33 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 09:56:33 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/863845729' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 09:56:33 np0005588920 nova_compute[226886]: 2026-01-20 14:56:33.863 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:34.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:56:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:35.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:56:36 np0005588920 nova_compute[226886]: 2026-01-20 14:56:36.241 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:56:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:36.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:56:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:56:37.017 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=41, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=40) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:56:37 np0005588920 nova_compute[226886]: 2026-01-20 14:56:37.017 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:56:37.019 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:56:37 np0005588920 nova_compute[226886]: 2026-01-20 14:56:37.055 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:56:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:37.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:56:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e308 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:56:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 09:56:37 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3611847006' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 09:56:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 09:56:37 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3611847006' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 09:56:38 np0005588920 nova_compute[226886]: 2026-01-20 14:56:38.036 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:38.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:38 np0005588920 nova_compute[226886]: 2026-01-20 14:56:38.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:56:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:56:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:39.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:56:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:40.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:41 np0005588920 nova_compute[226886]: 2026-01-20 14:56:41.243 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:41.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:42 np0005588920 nova_compute[226886]: 2026-01-20 14:56:42.058 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:42.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e308 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:56:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:43.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:43 np0005588920 nova_compute[226886]: 2026-01-20 14:56:43.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:56:43 np0005588920 nova_compute[226886]: 2026-01-20 14:56:43.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:56:43 np0005588920 nova_compute[226886]: 2026-01-20 14:56:43.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 09:56:43 np0005588920 nova_compute[226886]: 2026-01-20 14:56:43.991 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "refresh_cache-ce0152a6-7d4d-4eac-9587-a43ad934d9cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:56:43 np0005588920 nova_compute[226886]: 2026-01-20 14:56:43.991 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquired lock "refresh_cache-ce0152a6-7d4d-4eac-9587-a43ad934d9cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:56:43 np0005588920 nova_compute[226886]: 2026-01-20 14:56:43.991 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 09:56:43 np0005588920 nova_compute[226886]: 2026-01-20 14:56:43.991 226890 DEBUG nova.objects.instance [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid ce0152a6-7d4d-4eac-9587-a43ad934d9cc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:56:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:44.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:56:45.020 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '41'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:56:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:45.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:46 np0005588920 nova_compute[226886]: 2026-01-20 14:56:46.238 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Updating instance_info_cache with network_info: [{"id": "083e3cc0-e665-4049-a47b-233abf07b9d5", "address": "fa:16:3e:6a:15:6d", "network": {"id": "41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1445030024-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3b1b7f5b4f84b5abbc401eb577c85c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap083e3cc0-e6", "ovs_interfaceid": "083e3cc0-e665-4049-a47b-233abf07b9d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:56:46 np0005588920 nova_compute[226886]: 2026-01-20 14:56:46.245 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:46 np0005588920 nova_compute[226886]: 2026-01-20 14:56:46.264 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Releasing lock "refresh_cache-ce0152a6-7d4d-4eac-9587-a43ad934d9cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:56:46 np0005588920 nova_compute[226886]: 2026-01-20 14:56:46.264 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 09:56:46 np0005588920 nova_compute[226886]: 2026-01-20 14:56:46.265 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:56:46 np0005588920 nova_compute[226886]: 2026-01-20 14:56:46.265 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:56:46 np0005588920 nova_compute[226886]: 2026-01-20 14:56:46.298 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:46 np0005588920 nova_compute[226886]: 2026-01-20 14:56:46.300 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:46 np0005588920 nova_compute[226886]: 2026-01-20 14:56:46.300 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:46 np0005588920 nova_compute[226886]: 2026-01-20 14:56:46.300 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:56:46 np0005588920 nova_compute[226886]: 2026-01-20 14:56:46.301 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:56:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:46.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:46 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:56:46 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1572009590' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:56:46 np0005588920 nova_compute[226886]: 2026-01-20 14:56:46.801 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.501s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:56:46 np0005588920 nova_compute[226886]: 2026-01-20 14:56:46.887 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-0000007a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:56:46 np0005588920 nova_compute[226886]: 2026-01-20 14:56:46.888 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-0000007a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:56:46 np0005588920 nova_compute[226886]: 2026-01-20 14:56:46.891 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-00000082 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:56:46 np0005588920 nova_compute[226886]: 2026-01-20 14:56:46.891 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-00000082 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:56:47 np0005588920 nova_compute[226886]: 2026-01-20 14:56:47.049 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:56:47 np0005588920 nova_compute[226886]: 2026-01-20 14:56:47.050 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4023MB free_disk=20.781208038330078GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:56:47 np0005588920 nova_compute[226886]: 2026-01-20 14:56:47.050 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:47 np0005588920 nova_compute[226886]: 2026-01-20 14:56:47.050 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:47 np0005588920 nova_compute[226886]: 2026-01-20 14:56:47.060 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:47 np0005588920 nova_compute[226886]: 2026-01-20 14:56:47.174 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance ce0152a6-7d4d-4eac-9587-a43ad934d9cc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:56:47 np0005588920 nova_compute[226886]: 2026-01-20 14:56:47.174 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance cf24bde1-0912-4d63-8959-6799ae8ab043 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:56:47 np0005588920 nova_compute[226886]: 2026-01-20 14:56:47.175 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:56:47 np0005588920 nova_compute[226886]: 2026-01-20 14:56:47.175 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:56:47 np0005588920 nova_compute[226886]: 2026-01-20 14:56:47.253 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:56:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:47.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e308 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:56:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:56:47 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1427266879' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:56:47 np0005588920 nova_compute[226886]: 2026-01-20 14:56:47.701 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:56:47 np0005588920 nova_compute[226886]: 2026-01-20 14:56:47.705 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:56:47 np0005588920 nova_compute[226886]: 2026-01-20 14:56:47.726 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:56:47 np0005588920 nova_compute[226886]: 2026-01-20 14:56:47.753 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:56:47 np0005588920 nova_compute[226886]: 2026-01-20 14:56:47.753 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.703s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:48 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e309 e309: 3 total, 3 up, 3 in
Jan 20 09:56:48 np0005588920 nova_compute[226886]: 2026-01-20 14:56:48.214 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:56:48 np0005588920 nova_compute[226886]: 2026-01-20 14:56:48.214 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:56:48 np0005588920 nova_compute[226886]: 2026-01-20 14:56:48.214 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:56:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:48.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:48 np0005588920 nova_compute[226886]: 2026-01-20 14:56:48.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:56:48 np0005588920 nova_compute[226886]: 2026-01-20 14:56:48.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:56:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:56:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:49.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:56:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:50.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:50 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e310 e310: 3 total, 3 up, 3 in
Jan 20 09:56:50 np0005588920 nova_compute[226886]: 2026-01-20 14:56:50.721 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:56:50 np0005588920 nova_compute[226886]: 2026-01-20 14:56:50.742 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:56:51 np0005588920 nova_compute[226886]: 2026-01-20 14:56:51.247 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:56:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:51.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:56:51 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e311 e311: 3 total, 3 up, 3 in
Jan 20 09:56:52 np0005588920 nova_compute[226886]: 2026-01-20 14:56:52.063 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:56:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:52.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:56:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e311 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:56:52 np0005588920 nova_compute[226886]: 2026-01-20 14:56:52.813 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:56:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:53.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:56:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:54.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:55 np0005588920 podman[274347]: 2026-01-20 14:56:55.046595116 +0000 UTC m=+0.126160015 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:56:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:55.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:56 np0005588920 nova_compute[226886]: 2026-01-20 14:56:56.248 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:56.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:57 np0005588920 nova_compute[226886]: 2026-01-20 14:56:57.065 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:57.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e311 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:56:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:58.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:58 np0005588920 nova_compute[226886]: 2026-01-20 14:56:58.917 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:56:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:59.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:59 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e312 e312: 3 total, 3 up, 3 in
Jan 20 09:57:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:00.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:01 np0005588920 nova_compute[226886]: 2026-01-20 14:57:01.249 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:01.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:02 np0005588920 nova_compute[226886]: 2026-01-20 14:57:02.068 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:02 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #53. Immutable memtables: 9.
Jan 20 09:57:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:02.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:57:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:03.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:03 np0005588920 podman[274373]: 2026-01-20 14:57:03.980035553 +0000 UTC m=+0.052122429 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Jan 20 09:57:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:57:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:04.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:57:05 np0005588920 nova_compute[226886]: 2026-01-20 14:57:05.034 226890 DEBUG oslo_concurrency.lockutils [None req-7e50f743-e681-4f69-a71b-6050b489dde2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Acquiring lock "cf24bde1-0912-4d63-8959-6799ae8ab043" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:05 np0005588920 nova_compute[226886]: 2026-01-20 14:57:05.035 226890 DEBUG oslo_concurrency.lockutils [None req-7e50f743-e681-4f69-a71b-6050b489dde2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "cf24bde1-0912-4d63-8959-6799ae8ab043" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:05 np0005588920 nova_compute[226886]: 2026-01-20 14:57:05.036 226890 INFO nova.compute.manager [None req-7e50f743-e681-4f69-a71b-6050b489dde2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Shelving#033[00m
Jan 20 09:57:05 np0005588920 nova_compute[226886]: 2026-01-20 14:57:05.071 226890 DEBUG nova.virt.libvirt.driver [None req-7e50f743-e681-4f69-a71b-6050b489dde2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 20 09:57:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:05.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:06 np0005588920 nova_compute[226886]: 2026-01-20 14:57:06.250 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:57:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:06.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:57:07 np0005588920 nova_compute[226886]: 2026-01-20 14:57:07.069 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:57:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:07.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:57:07 np0005588920 kernel: tap0293f4ad-12 (unregistering): left promiscuous mode
Jan 20 09:57:07 np0005588920 NetworkManager[49076]: <info>  [1768921027.4997] device (tap0293f4ad-12): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:57:07 np0005588920 ovn_controller[133971]: 2026-01-20T14:57:07Z|00613|binding|INFO|Releasing lport 0293f4ad-1248-4899-81ef-32e616d9a754 from this chassis (sb_readonly=0)
Jan 20 09:57:07 np0005588920 ovn_controller[133971]: 2026-01-20T14:57:07Z|00614|binding|INFO|Setting lport 0293f4ad-1248-4899-81ef-32e616d9a754 down in Southbound
Jan 20 09:57:07 np0005588920 nova_compute[226886]: 2026-01-20 14:57:07.511 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:07 np0005588920 ovn_controller[133971]: 2026-01-20T14:57:07Z|00615|binding|INFO|Removing iface tap0293f4ad-12 ovn-installed in OVS
Jan 20 09:57:07 np0005588920 nova_compute[226886]: 2026-01-20 14:57:07.513 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:07 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:07.521 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a8:aa:73 10.100.0.12'], port_security=['fa:16:3e:a8:aa:73 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'cf24bde1-0912-4d63-8959-6799ae8ab043', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b3b1b7f5b4f84b5abbc401eb577c85c0', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8b11f3fb-2601-4eca-a1b6-838549d7750c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.184'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3273589e-5585-406c-9611-87f758b0e521, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=0293f4ad-1248-4899-81ef-32e616d9a754) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:57:07 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:07.523 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 0293f4ad-1248-4899-81ef-32e616d9a754 in datapath 41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce unbound from our chassis#033[00m
Jan 20 09:57:07 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:07.525 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce#033[00m
Jan 20 09:57:07 np0005588920 nova_compute[226886]: 2026-01-20 14:57:07.525 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:07 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:07.544 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[da230207-35a3-4922-8d4f-ef769386c898]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:07 np0005588920 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d00000082.scope: Deactivated successfully.
Jan 20 09:57:07 np0005588920 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d00000082.scope: Consumed 15.281s CPU time.
Jan 20 09:57:07 np0005588920 systemd-machined[196121]: Machine qemu-61-instance-00000082 terminated.
Jan 20 09:57:07 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:07.577 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[ad7e133e-cbd3-4537-a7a7-9c5323ada949]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:07 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:07.580 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[9f16e60e-ea1b-410f-b1a0-a48cfee070f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:07 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:07.608 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[1094a552-d369-4d86-9472-f3b508cef620]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:07 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:07.626 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b1c9ae9e-9e3d-49d0-8249-8b6b07ed55c4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap41a1a3fe-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3c:1f:b5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 183], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 585739, 'reachable_time': 32128, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 274403, 'error': None, 'target': 'ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:57:07 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:07.645 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[79945ab8-b308-41c6-b839-f25f1978a0f2]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap41a1a3fe-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 585752, 'tstamp': 585752}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 274404, 'error': None, 'target': 'ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap41a1a3fe-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 585755, 'tstamp': 585755}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 274404, 'error': None, 'target': 'ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:07 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:07.647 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap41a1a3fe-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:57:07 np0005588920 nova_compute[226886]: 2026-01-20 14:57:07.648 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:07 np0005588920 nova_compute[226886]: 2026-01-20 14:57:07.653 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:07 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:07.654 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap41a1a3fe-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:57:07 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:07.654 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:57:07 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:07.654 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap41a1a3fe-f0, col_values=(('external_ids', {'iface-id': '3fa2df7b-42b2-4a3b-a33b-ab37b5d6aef3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:57:07 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:07.654 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:57:08 np0005588920 nova_compute[226886]: 2026-01-20 14:57:08.094 226890 INFO nova.virt.libvirt.driver [None req-7e50f743-e681-4f69-a71b-6050b489dde2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Instance shutdown successfully after 3 seconds.#033[00m
Jan 20 09:57:08 np0005588920 nova_compute[226886]: 2026-01-20 14:57:08.103 226890 INFO nova.virt.libvirt.driver [-] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Instance destroyed successfully.#033[00m
Jan 20 09:57:08 np0005588920 nova_compute[226886]: 2026-01-20 14:57:08.104 226890 DEBUG nova.objects.instance [None req-7e50f743-e681-4f69-a71b-6050b489dde2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lazy-loading 'numa_topology' on Instance uuid cf24bde1-0912-4d63-8959-6799ae8ab043 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:57:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:08.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:08 np0005588920 nova_compute[226886]: 2026-01-20 14:57:08.359 226890 INFO nova.virt.libvirt.driver [None req-7e50f743-e681-4f69-a71b-6050b489dde2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Beginning cold snapshot process#033[00m
Jan 20 09:57:08 np0005588920 nova_compute[226886]: 2026-01-20 14:57:08.506 226890 DEBUG nova.compute.manager [req-accca793-cef5-4fa7-a6d6-7b279ac286d2 req-38481675-beaa-4e7b-8860-53b0eb9d583b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Received event network-vif-unplugged-0293f4ad-1248-4899-81ef-32e616d9a754 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:57:08 np0005588920 nova_compute[226886]: 2026-01-20 14:57:08.507 226890 DEBUG oslo_concurrency.lockutils [req-accca793-cef5-4fa7-a6d6-7b279ac286d2 req-38481675-beaa-4e7b-8860-53b0eb9d583b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "cf24bde1-0912-4d63-8959-6799ae8ab043-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:08 np0005588920 nova_compute[226886]: 2026-01-20 14:57:08.507 226890 DEBUG oslo_concurrency.lockutils [req-accca793-cef5-4fa7-a6d6-7b279ac286d2 req-38481675-beaa-4e7b-8860-53b0eb9d583b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "cf24bde1-0912-4d63-8959-6799ae8ab043-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:08 np0005588920 nova_compute[226886]: 2026-01-20 14:57:08.507 226890 DEBUG oslo_concurrency.lockutils [req-accca793-cef5-4fa7-a6d6-7b279ac286d2 req-38481675-beaa-4e7b-8860-53b0eb9d583b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "cf24bde1-0912-4d63-8959-6799ae8ab043-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:08 np0005588920 nova_compute[226886]: 2026-01-20 14:57:08.507 226890 DEBUG nova.compute.manager [req-accca793-cef5-4fa7-a6d6-7b279ac286d2 req-38481675-beaa-4e7b-8860-53b0eb9d583b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] No waiting events found dispatching network-vif-unplugged-0293f4ad-1248-4899-81ef-32e616d9a754 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:57:08 np0005588920 nova_compute[226886]: 2026-01-20 14:57:08.507 226890 WARNING nova.compute.manager [req-accca793-cef5-4fa7-a6d6-7b279ac286d2 req-38481675-beaa-4e7b-8860-53b0eb9d583b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Received unexpected event network-vif-unplugged-0293f4ad-1248-4899-81ef-32e616d9a754 for instance with vm_state active and task_state shelving_image_uploading.#033[00m
Jan 20 09:57:08 np0005588920 nova_compute[226886]: 2026-01-20 14:57:08.512 226890 DEBUG nova.virt.libvirt.imagebackend [None req-7e50f743-e681-4f69-a71b-6050b489dde2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] No parent info for a32b3e07-16d8-46fd-9a7b-c242c432fcf9; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Jan 20 09:57:08 np0005588920 nova_compute[226886]: 2026-01-20 14:57:08.774 226890 DEBUG nova.storage.rbd_utils [None req-7e50f743-e681-4f69-a71b-6050b489dde2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] creating snapshot(1b20e66174b444eb88c88d653a2afb11) on rbd image(cf24bde1-0912-4d63-8959-6799ae8ab043_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 20 09:57:09 np0005588920 nova_compute[226886]: 2026-01-20 14:57:09.079 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e313 e313: 3 total, 3 up, 3 in
Jan 20 09:57:09 np0005588920 nova_compute[226886]: 2026-01-20 14:57:09.221 226890 DEBUG nova.storage.rbd_utils [None req-7e50f743-e681-4f69-a71b-6050b489dde2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] cloning vms/cf24bde1-0912-4d63-8959-6799ae8ab043_disk@1b20e66174b444eb88c88d653a2afb11 to images/1f2389f5-ba66-4c2e-9f07-9051cd05b976 clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 20 09:57:09 np0005588920 nova_compute[226886]: 2026-01-20 14:57:09.358 226890 DEBUG nova.storage.rbd_utils [None req-7e50f743-e681-4f69-a71b-6050b489dde2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] flattening images/1f2389f5-ba66-4c2e-9f07-9051cd05b976 flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 20 09:57:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:57:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:09.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:57:09 np0005588920 nova_compute[226886]: 2026-01-20 14:57:09.754 226890 DEBUG nova.storage.rbd_utils [None req-7e50f743-e681-4f69-a71b-6050b489dde2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] removing snapshot(1b20e66174b444eb88c88d653a2afb11) on rbd image(cf24bde1-0912-4d63-8959-6799ae8ab043_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 20 09:57:10 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e314 e314: 3 total, 3 up, 3 in
Jan 20 09:57:10 np0005588920 nova_compute[226886]: 2026-01-20 14:57:10.236 226890 DEBUG nova.storage.rbd_utils [None req-7e50f743-e681-4f69-a71b-6050b489dde2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] creating snapshot(snap) on rbd image(1f2389f5-ba66-4c2e-9f07-9051cd05b976) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 20 09:57:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:10.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:10 np0005588920 nova_compute[226886]: 2026-01-20 14:57:10.623 226890 DEBUG nova.compute.manager [req-7116e72f-d323-41a8-aeb1-c558708fd280 req-0cb06df1-7157-4c87-acf1-ffe7960ad2a5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Received event network-vif-plugged-0293f4ad-1248-4899-81ef-32e616d9a754 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:57:10 np0005588920 nova_compute[226886]: 2026-01-20 14:57:10.624 226890 DEBUG oslo_concurrency.lockutils [req-7116e72f-d323-41a8-aeb1-c558708fd280 req-0cb06df1-7157-4c87-acf1-ffe7960ad2a5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "cf24bde1-0912-4d63-8959-6799ae8ab043-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:10 np0005588920 nova_compute[226886]: 2026-01-20 14:57:10.624 226890 DEBUG oslo_concurrency.lockutils [req-7116e72f-d323-41a8-aeb1-c558708fd280 req-0cb06df1-7157-4c87-acf1-ffe7960ad2a5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "cf24bde1-0912-4d63-8959-6799ae8ab043-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:10 np0005588920 nova_compute[226886]: 2026-01-20 14:57:10.625 226890 DEBUG oslo_concurrency.lockutils [req-7116e72f-d323-41a8-aeb1-c558708fd280 req-0cb06df1-7157-4c87-acf1-ffe7960ad2a5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "cf24bde1-0912-4d63-8959-6799ae8ab043-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:10 np0005588920 nova_compute[226886]: 2026-01-20 14:57:10.626 226890 DEBUG nova.compute.manager [req-7116e72f-d323-41a8-aeb1-c558708fd280 req-0cb06df1-7157-4c87-acf1-ffe7960ad2a5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] No waiting events found dispatching network-vif-plugged-0293f4ad-1248-4899-81ef-32e616d9a754 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:57:10 np0005588920 nova_compute[226886]: 2026-01-20 14:57:10.626 226890 WARNING nova.compute.manager [req-7116e72f-d323-41a8-aeb1-c558708fd280 req-0cb06df1-7157-4c87-acf1-ffe7960ad2a5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Received unexpected event network-vif-plugged-0293f4ad-1248-4899-81ef-32e616d9a754 for instance with vm_state active and task_state shelving_image_uploading.#033[00m
Jan 20 09:57:11 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e315 e315: 3 total, 3 up, 3 in
Jan 20 09:57:11 np0005588920 nova_compute[226886]: 2026-01-20 14:57:11.254 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:11.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:12 np0005588920 nova_compute[226886]: 2026-01-20 14:57:12.072 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:57:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:12.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:57:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:57:12 np0005588920 nova_compute[226886]: 2026-01-20 14:57:12.902 226890 INFO nova.virt.libvirt.driver [None req-7e50f743-e681-4f69-a71b-6050b489dde2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Snapshot image upload complete#033[00m
Jan 20 09:57:12 np0005588920 nova_compute[226886]: 2026-01-20 14:57:12.903 226890 DEBUG nova.compute.manager [None req-7e50f743-e681-4f69-a71b-6050b489dde2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:57:12 np0005588920 nova_compute[226886]: 2026-01-20 14:57:12.965 226890 INFO nova.compute.manager [None req-7e50f743-e681-4f69-a71b-6050b489dde2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Shelve offloading#033[00m
Jan 20 09:57:12 np0005588920 nova_compute[226886]: 2026-01-20 14:57:12.972 226890 INFO nova.virt.libvirt.driver [-] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Instance destroyed successfully.#033[00m
Jan 20 09:57:12 np0005588920 nova_compute[226886]: 2026-01-20 14:57:12.972 226890 DEBUG nova.compute.manager [None req-7e50f743-e681-4f69-a71b-6050b489dde2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:57:12 np0005588920 nova_compute[226886]: 2026-01-20 14:57:12.975 226890 DEBUG oslo_concurrency.lockutils [None req-7e50f743-e681-4f69-a71b-6050b489dde2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Acquiring lock "refresh_cache-cf24bde1-0912-4d63-8959-6799ae8ab043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:57:12 np0005588920 nova_compute[226886]: 2026-01-20 14:57:12.975 226890 DEBUG oslo_concurrency.lockutils [None req-7e50f743-e681-4f69-a71b-6050b489dde2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Acquired lock "refresh_cache-cf24bde1-0912-4d63-8959-6799ae8ab043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:57:12 np0005588920 nova_compute[226886]: 2026-01-20 14:57:12.975 226890 DEBUG nova.network.neutron [None req-7e50f743-e681-4f69-a71b-6050b489dde2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:57:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:57:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:13.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:57:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:14.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:57:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:15.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:57:15 np0005588920 nova_compute[226886]: 2026-01-20 14:57:15.705 226890 DEBUG nova.network.neutron [None req-7e50f743-e681-4f69-a71b-6050b489dde2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Updating instance_info_cache with network_info: [{"id": "0293f4ad-1248-4899-81ef-32e616d9a754", "address": "fa:16:3e:a8:aa:73", "network": {"id": "41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1445030024-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3b1b7f5b4f84b5abbc401eb577c85c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0293f4ad-12", "ovs_interfaceid": "0293f4ad-1248-4899-81ef-32e616d9a754", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:57:15 np0005588920 nova_compute[226886]: 2026-01-20 14:57:15.707 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:15 np0005588920 nova_compute[226886]: 2026-01-20 14:57:15.727 226890 DEBUG oslo_concurrency.lockutils [None req-7e50f743-e681-4f69-a71b-6050b489dde2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Releasing lock "refresh_cache-cf24bde1-0912-4d63-8959-6799ae8ab043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:57:16 np0005588920 nova_compute[226886]: 2026-01-20 14:57:16.255 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:57:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:16.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:57:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:16.458 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:16.459 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:16.459 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:16 np0005588920 nova_compute[226886]: 2026-01-20 14:57:16.846 226890 INFO nova.virt.libvirt.driver [-] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Instance destroyed successfully.#033[00m
Jan 20 09:57:16 np0005588920 nova_compute[226886]: 2026-01-20 14:57:16.847 226890 DEBUG nova.objects.instance [None req-7e50f743-e681-4f69-a71b-6050b489dde2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lazy-loading 'resources' on Instance uuid cf24bde1-0912-4d63-8959-6799ae8ab043 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:57:16 np0005588920 nova_compute[226886]: 2026-01-20 14:57:16.864 226890 DEBUG nova.virt.libvirt.vif [None req-7e50f743-e681-4f69-a71b-6050b489dde2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:56:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1938504336',display_name='tempest-ServerActionsTestOtherB-server-1938504336',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1938504336',id=130,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO99uJ9+FwgjxRb/9u+f3Mj9/VKSDM+OKd66Ygsg8lEO+7bGpDEQrC5BIaSV+Na5YF+3DqUwLNmAYSN9IkTSGbRPw5y8813A+KsiNHebrpnZ7oReyT+5/zNQYafCHVAfGA==',key_name='tempest-keypair-302882914',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:56:09Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='b3b1b7f5b4f84b5abbc401eb577c85c0',ramdisk_id='',reservation_id='r-n1l51qfd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1136521362',owner_user_name='tempest-ServerActionsTestOtherB-1136521362-project-member',shelved_at='2026-01-20T14:57:12.903385',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='1f2389f5-ba66-4c2e-9f07-9051cd05b976'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:57:08Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='215db37373dc4ae5a75cbd6866f471da',uuid=cf24bde1-0912-4d63-8959-6799ae8ab043,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "0293f4ad-1248-4899-81ef-32e616d9a754", "address": "fa:16:3e:a8:aa:73", "network": {"id": 
"41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1445030024-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3b1b7f5b4f84b5abbc401eb577c85c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0293f4ad-12", "ovs_interfaceid": "0293f4ad-1248-4899-81ef-32e616d9a754", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:57:16 np0005588920 nova_compute[226886]: 2026-01-20 14:57:16.865 226890 DEBUG nova.network.os_vif_util [None req-7e50f743-e681-4f69-a71b-6050b489dde2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Converting VIF {"id": "0293f4ad-1248-4899-81ef-32e616d9a754", "address": "fa:16:3e:a8:aa:73", "network": {"id": "41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1445030024-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3b1b7f5b4f84b5abbc401eb577c85c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0293f4ad-12", "ovs_interfaceid": "0293f4ad-1248-4899-81ef-32e616d9a754", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:57:16 np0005588920 nova_compute[226886]: 2026-01-20 14:57:16.865 226890 DEBUG nova.network.os_vif_util [None req-7e50f743-e681-4f69-a71b-6050b489dde2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a8:aa:73,bridge_name='br-int',has_traffic_filtering=True,id=0293f4ad-1248-4899-81ef-32e616d9a754,network=Network(41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0293f4ad-12') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:57:16 np0005588920 nova_compute[226886]: 2026-01-20 14:57:16.866 226890 DEBUG os_vif [None req-7e50f743-e681-4f69-a71b-6050b489dde2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:aa:73,bridge_name='br-int',has_traffic_filtering=True,id=0293f4ad-1248-4899-81ef-32e616d9a754,network=Network(41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0293f4ad-12') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:57:16 np0005588920 nova_compute[226886]: 2026-01-20 14:57:16.867 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:16 np0005588920 nova_compute[226886]: 2026-01-20 14:57:16.867 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0293f4ad-12, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:57:16 np0005588920 nova_compute[226886]: 2026-01-20 14:57:16.869 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:16 np0005588920 nova_compute[226886]: 2026-01-20 14:57:16.871 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:16 np0005588920 nova_compute[226886]: 2026-01-20 14:57:16.874 226890 INFO os_vif [None req-7e50f743-e681-4f69-a71b-6050b489dde2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:aa:73,bridge_name='br-int',has_traffic_filtering=True,id=0293f4ad-1248-4899-81ef-32e616d9a754,network=Network(41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0293f4ad-12')#033[00m
Jan 20 09:57:16 np0005588920 nova_compute[226886]: 2026-01-20 14:57:16.935 226890 DEBUG nova.compute.manager [req-2b0974a4-12ae-4a4a-a350-e7def9f2bc4f req-f463a294-623d-4a44-a693-359e7460a226 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Received event network-changed-0293f4ad-1248-4899-81ef-32e616d9a754 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:57:16 np0005588920 nova_compute[226886]: 2026-01-20 14:57:16.936 226890 DEBUG nova.compute.manager [req-2b0974a4-12ae-4a4a-a350-e7def9f2bc4f req-f463a294-623d-4a44-a693-359e7460a226 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Refreshing instance network info cache due to event network-changed-0293f4ad-1248-4899-81ef-32e616d9a754. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:57:16 np0005588920 nova_compute[226886]: 2026-01-20 14:57:16.936 226890 DEBUG oslo_concurrency.lockutils [req-2b0974a4-12ae-4a4a-a350-e7def9f2bc4f req-f463a294-623d-4a44-a693-359e7460a226 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-cf24bde1-0912-4d63-8959-6799ae8ab043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:57:16 np0005588920 nova_compute[226886]: 2026-01-20 14:57:16.936 226890 DEBUG oslo_concurrency.lockutils [req-2b0974a4-12ae-4a4a-a350-e7def9f2bc4f req-f463a294-623d-4a44-a693-359e7460a226 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-cf24bde1-0912-4d63-8959-6799ae8ab043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:57:16 np0005588920 nova_compute[226886]: 2026-01-20 14:57:16.936 226890 DEBUG nova.network.neutron [req-2b0974a4-12ae-4a4a-a350-e7def9f2bc4f req-f463a294-623d-4a44-a693-359e7460a226 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Refreshing network info cache for port 0293f4ad-1248-4899-81ef-32e616d9a754 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:57:17 np0005588920 nova_compute[226886]: 2026-01-20 14:57:17.362 226890 INFO nova.virt.libvirt.driver [None req-7e50f743-e681-4f69-a71b-6050b489dde2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Deleting instance files /var/lib/nova/instances/cf24bde1-0912-4d63-8959-6799ae8ab043_del#033[00m
Jan 20 09:57:17 np0005588920 nova_compute[226886]: 2026-01-20 14:57:17.362 226890 INFO nova.virt.libvirt.driver [None req-7e50f743-e681-4f69-a71b-6050b489dde2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Deletion of /var/lib/nova/instances/cf24bde1-0912-4d63-8959-6799ae8ab043_del complete#033[00m
Jan 20 09:57:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:57:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:17.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:57:17 np0005588920 nova_compute[226886]: 2026-01-20 14:57:17.441 226890 INFO nova.scheduler.client.report [None req-7e50f743-e681-4f69-a71b-6050b489dde2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Deleted allocations for instance cf24bde1-0912-4d63-8959-6799ae8ab043#033[00m
Jan 20 09:57:17 np0005588920 nova_compute[226886]: 2026-01-20 14:57:17.498 226890 DEBUG oslo_concurrency.lockutils [None req-7e50f743-e681-4f69-a71b-6050b489dde2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:17 np0005588920 nova_compute[226886]: 2026-01-20 14:57:17.499 226890 DEBUG oslo_concurrency.lockutils [None req-7e50f743-e681-4f69-a71b-6050b489dde2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:17 np0005588920 nova_compute[226886]: 2026-01-20 14:57:17.581 226890 DEBUG oslo_concurrency.processutils [None req-7e50f743-e681-4f69-a71b-6050b489dde2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:57:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:57:18 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:57:18 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4255108125' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:57:18 np0005588920 nova_compute[226886]: 2026-01-20 14:57:18.044 226890 DEBUG oslo_concurrency.processutils [None req-7e50f743-e681-4f69-a71b-6050b489dde2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:57:18 np0005588920 nova_compute[226886]: 2026-01-20 14:57:18.050 226890 DEBUG nova.compute.provider_tree [None req-7e50f743-e681-4f69-a71b-6050b489dde2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:57:18 np0005588920 nova_compute[226886]: 2026-01-20 14:57:18.069 226890 DEBUG nova.scheduler.client.report [None req-7e50f743-e681-4f69-a71b-6050b489dde2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:57:18 np0005588920 nova_compute[226886]: 2026-01-20 14:57:18.120 226890 DEBUG oslo_concurrency.lockutils [None req-7e50f743-e681-4f69-a71b-6050b489dde2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.621s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:18 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e316 e316: 3 total, 3 up, 3 in
Jan 20 09:57:18 np0005588920 nova_compute[226886]: 2026-01-20 14:57:18.204 226890 DEBUG oslo_concurrency.lockutils [None req-7e50f743-e681-4f69-a71b-6050b489dde2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "cf24bde1-0912-4d63-8959-6799ae8ab043" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 13.169s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:18.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:19.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:19 np0005588920 nova_compute[226886]: 2026-01-20 14:57:19.876 226890 DEBUG nova.network.neutron [req-2b0974a4-12ae-4a4a-a350-e7def9f2bc4f req-f463a294-623d-4a44-a693-359e7460a226 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Updated VIF entry in instance network info cache for port 0293f4ad-1248-4899-81ef-32e616d9a754. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:57:19 np0005588920 nova_compute[226886]: 2026-01-20 14:57:19.877 226890 DEBUG nova.network.neutron [req-2b0974a4-12ae-4a4a-a350-e7def9f2bc4f req-f463a294-623d-4a44-a693-359e7460a226 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Updating instance_info_cache with network_info: [{"id": "0293f4ad-1248-4899-81ef-32e616d9a754", "address": "fa:16:3e:a8:aa:73", "network": {"id": "41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce", "bridge": null, "label": "tempest-ServerActionsTestOtherB-1445030024-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3b1b7f5b4f84b5abbc401eb577c85c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "unbound", "details": {}, "devname": "tap0293f4ad-12", "ovs_interfaceid": null, "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:57:19 np0005588920 nova_compute[226886]: 2026-01-20 14:57:19.899 226890 DEBUG oslo_concurrency.lockutils [req-2b0974a4-12ae-4a4a-a350-e7def9f2bc4f req-f463a294-623d-4a44-a693-359e7460a226 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-cf24bde1-0912-4d63-8959-6799ae8ab043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:57:19 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e317 e317: 3 total, 3 up, 3 in
Jan 20 09:57:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:20.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:21 np0005588920 nova_compute[226886]: 2026-01-20 14:57:21.256 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:21.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:21 np0005588920 nova_compute[226886]: 2026-01-20 14:57:21.870 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:22.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e317 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:57:22 np0005588920 nova_compute[226886]: 2026-01-20 14:57:22.771 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768921027.770343, cf24bde1-0912-4d63-8959-6799ae8ab043 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:57:22 np0005588920 nova_compute[226886]: 2026-01-20 14:57:22.771 226890 INFO nova.compute.manager [-] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:57:22 np0005588920 nova_compute[226886]: 2026-01-20 14:57:22.798 226890 DEBUG nova.compute.manager [None req-2e0927e2-f16d-4aba-ab97-004e25f2939e - - - - - -] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:57:23 np0005588920 nova_compute[226886]: 2026-01-20 14:57:23.180 226890 DEBUG oslo_concurrency.lockutils [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Acquiring lock "cf24bde1-0912-4d63-8959-6799ae8ab043" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:23 np0005588920 nova_compute[226886]: 2026-01-20 14:57:23.181 226890 DEBUG oslo_concurrency.lockutils [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "cf24bde1-0912-4d63-8959-6799ae8ab043" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:23 np0005588920 nova_compute[226886]: 2026-01-20 14:57:23.181 226890 INFO nova.compute.manager [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Unshelving#033[00m
Jan 20 09:57:23 np0005588920 nova_compute[226886]: 2026-01-20 14:57:23.284 226890 DEBUG oslo_concurrency.lockutils [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:23 np0005588920 nova_compute[226886]: 2026-01-20 14:57:23.284 226890 DEBUG oslo_concurrency.lockutils [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:23 np0005588920 nova_compute[226886]: 2026-01-20 14:57:23.289 226890 DEBUG nova.objects.instance [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lazy-loading 'pci_requests' on Instance uuid cf24bde1-0912-4d63-8959-6799ae8ab043 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:57:23 np0005588920 nova_compute[226886]: 2026-01-20 14:57:23.303 226890 DEBUG nova.objects.instance [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lazy-loading 'numa_topology' on Instance uuid cf24bde1-0912-4d63-8959-6799ae8ab043 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:57:23 np0005588920 nova_compute[226886]: 2026-01-20 14:57:23.323 226890 DEBUG nova.virt.hardware [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:57:23 np0005588920 nova_compute[226886]: 2026-01-20 14:57:23.324 226890 INFO nova.compute.claims [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 09:57:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:23.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:23 np0005588920 nova_compute[226886]: 2026-01-20 14:57:23.489 226890 DEBUG oslo_concurrency.processutils [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:57:23 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:57:23 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4028243345' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:57:23 np0005588920 nova_compute[226886]: 2026-01-20 14:57:23.919 226890 DEBUG oslo_concurrency.processutils [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.430s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:57:23 np0005588920 nova_compute[226886]: 2026-01-20 14:57:23.927 226890 DEBUG nova.compute.provider_tree [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:57:23 np0005588920 nova_compute[226886]: 2026-01-20 14:57:23.956 226890 DEBUG nova.scheduler.client.report [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:57:23 np0005588920 nova_compute[226886]: 2026-01-20 14:57:23.984 226890 DEBUG oslo_concurrency.lockutils [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.699s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:24 np0005588920 nova_compute[226886]: 2026-01-20 14:57:24.250 226890 INFO nova.network.neutron [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Updating port 0293f4ad-1248-4899-81ef-32e616d9a754 with attributes {'binding:host_id': 'compute-2.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Jan 20 09:57:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:24.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:25 np0005588920 nova_compute[226886]: 2026-01-20 14:57:25.147 226890 DEBUG oslo_concurrency.lockutils [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Acquiring lock "refresh_cache-cf24bde1-0912-4d63-8959-6799ae8ab043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:57:25 np0005588920 nova_compute[226886]: 2026-01-20 14:57:25.148 226890 DEBUG oslo_concurrency.lockutils [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Acquired lock "refresh_cache-cf24bde1-0912-4d63-8959-6799ae8ab043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:57:25 np0005588920 nova_compute[226886]: 2026-01-20 14:57:25.148 226890 DEBUG nova.network.neutron [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:57:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:25.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:25 np0005588920 nova_compute[226886]: 2026-01-20 14:57:25.495 226890 DEBUG nova.compute.manager [req-f4718a15-cde7-4e38-ae03-2cfac926ba5b req-a94a2c68-9914-47fb-89ae-617fc949dc96 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Received event network-changed-0293f4ad-1248-4899-81ef-32e616d9a754 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:57:25 np0005588920 nova_compute[226886]: 2026-01-20 14:57:25.496 226890 DEBUG nova.compute.manager [req-f4718a15-cde7-4e38-ae03-2cfac926ba5b req-a94a2c68-9914-47fb-89ae-617fc949dc96 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Refreshing instance network info cache due to event network-changed-0293f4ad-1248-4899-81ef-32e616d9a754. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:57:25 np0005588920 nova_compute[226886]: 2026-01-20 14:57:25.496 226890 DEBUG oslo_concurrency.lockutils [req-f4718a15-cde7-4e38-ae03-2cfac926ba5b req-a94a2c68-9914-47fb-89ae-617fc949dc96 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-cf24bde1-0912-4d63-8959-6799ae8ab043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:57:25 np0005588920 podman[274622]: 2026-01-20 14:57:25.993241688 +0000 UTC m=+0.081755734 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:57:26 np0005588920 nova_compute[226886]: 2026-01-20 14:57:26.258 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:26.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:26 np0005588920 nova_compute[226886]: 2026-01-20 14:57:26.872 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:57:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:27.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:57:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e317 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:57:27 np0005588920 nova_compute[226886]: 2026-01-20 14:57:27.822 226890 DEBUG nova.network.neutron [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Updating instance_info_cache with network_info: [{"id": "0293f4ad-1248-4899-81ef-32e616d9a754", "address": "fa:16:3e:a8:aa:73", "network": {"id": "41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1445030024-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3b1b7f5b4f84b5abbc401eb577c85c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0293f4ad-12", "ovs_interfaceid": "0293f4ad-1248-4899-81ef-32e616d9a754", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:57:27 np0005588920 nova_compute[226886]: 2026-01-20 14:57:27.843 226890 DEBUG oslo_concurrency.lockutils [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Releasing lock "refresh_cache-cf24bde1-0912-4d63-8959-6799ae8ab043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:57:27 np0005588920 nova_compute[226886]: 2026-01-20 14:57:27.845 226890 DEBUG nova.virt.libvirt.driver [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:57:27 np0005588920 nova_compute[226886]: 2026-01-20 14:57:27.845 226890 INFO nova.virt.libvirt.driver [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Creating image(s)#033[00m
Jan 20 09:57:27 np0005588920 nova_compute[226886]: 2026-01-20 14:57:27.870 226890 DEBUG nova.storage.rbd_utils [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] rbd image cf24bde1-0912-4d63-8959-6799ae8ab043_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:57:27 np0005588920 nova_compute[226886]: 2026-01-20 14:57:27.874 226890 DEBUG nova.objects.instance [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lazy-loading 'trusted_certs' on Instance uuid cf24bde1-0912-4d63-8959-6799ae8ab043 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:57:27 np0005588920 nova_compute[226886]: 2026-01-20 14:57:27.875 226890 DEBUG oslo_concurrency.lockutils [req-f4718a15-cde7-4e38-ae03-2cfac926ba5b req-a94a2c68-9914-47fb-89ae-617fc949dc96 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-cf24bde1-0912-4d63-8959-6799ae8ab043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:57:27 np0005588920 nova_compute[226886]: 2026-01-20 14:57:27.876 226890 DEBUG nova.network.neutron [req-f4718a15-cde7-4e38-ae03-2cfac926ba5b req-a94a2c68-9914-47fb-89ae-617fc949dc96 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Refreshing network info cache for port 0293f4ad-1248-4899-81ef-32e616d9a754 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:57:27 np0005588920 nova_compute[226886]: 2026-01-20 14:57:27.920 226890 DEBUG nova.storage.rbd_utils [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] rbd image cf24bde1-0912-4d63-8959-6799ae8ab043_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:57:27 np0005588920 nova_compute[226886]: 2026-01-20 14:57:27.943 226890 DEBUG nova.storage.rbd_utils [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] rbd image cf24bde1-0912-4d63-8959-6799ae8ab043_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:57:27 np0005588920 nova_compute[226886]: 2026-01-20 14:57:27.946 226890 DEBUG oslo_concurrency.lockutils [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Acquiring lock "3a32ce23ec2d5c4a57b0ad674e3fb972f597b06f" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:27 np0005588920 nova_compute[226886]: 2026-01-20 14:57:27.947 226890 DEBUG oslo_concurrency.lockutils [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "3a32ce23ec2d5c4a57b0ad674e3fb972f597b06f" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:57:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:28.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:57:28 np0005588920 nova_compute[226886]: 2026-01-20 14:57:28.361 226890 DEBUG nova.virt.libvirt.imagebackend [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Image locations are: [{'url': 'rbd://e399cf45-e6b6-5393-99f1-75c601d3f188/images/1f2389f5-ba66-4c2e-9f07-9051cd05b976/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://e399cf45-e6b6-5393-99f1-75c601d3f188/images/1f2389f5-ba66-4c2e-9f07-9051cd05b976/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Jan 20 09:57:28 np0005588920 nova_compute[226886]: 2026-01-20 14:57:28.436 226890 DEBUG nova.virt.libvirt.imagebackend [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Selected location: {'url': 'rbd://e399cf45-e6b6-5393-99f1-75c601d3f188/images/1f2389f5-ba66-4c2e-9f07-9051cd05b976/snap', 'metadata': {'store': 'default_backend'}} clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1094#033[00m
Jan 20 09:57:28 np0005588920 nova_compute[226886]: 2026-01-20 14:57:28.437 226890 DEBUG nova.storage.rbd_utils [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] cloning images/1f2389f5-ba66-4c2e-9f07-9051cd05b976@snap to None/cf24bde1-0912-4d63-8959-6799ae8ab043_disk clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 20 09:57:28 np0005588920 nova_compute[226886]: 2026-01-20 14:57:28.546 226890 DEBUG oslo_concurrency.lockutils [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "3a32ce23ec2d5c4a57b0ad674e3fb972f597b06f" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.599s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:28 np0005588920 nova_compute[226886]: 2026-01-20 14:57:28.657 226890 DEBUG nova.objects.instance [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lazy-loading 'migration_context' on Instance uuid cf24bde1-0912-4d63-8959-6799ae8ab043 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:57:28 np0005588920 nova_compute[226886]: 2026-01-20 14:57:28.713 226890 DEBUG nova.storage.rbd_utils [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] flattening vms/cf24bde1-0912-4d63-8959-6799ae8ab043_disk flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 20 09:57:29 np0005588920 nova_compute[226886]: 2026-01-20 14:57:29.103 226890 DEBUG nova.virt.libvirt.driver [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Image rbd:vms/cf24bde1-0912-4d63-8959-6799ae8ab043_disk:id=openstack:conf=/etc/ceph/ceph.conf flattened successfully while unshelving instance. _try_fetch_image_cache /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11007#033[00m
Jan 20 09:57:29 np0005588920 nova_compute[226886]: 2026-01-20 14:57:29.104 226890 DEBUG nova.virt.libvirt.driver [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:57:29 np0005588920 nova_compute[226886]: 2026-01-20 14:57:29.104 226890 DEBUG nova.virt.libvirt.driver [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Ensure instance console log exists: /var/lib/nova/instances/cf24bde1-0912-4d63-8959-6799ae8ab043/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:57:29 np0005588920 nova_compute[226886]: 2026-01-20 14:57:29.104 226890 DEBUG oslo_concurrency.lockutils [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:29 np0005588920 nova_compute[226886]: 2026-01-20 14:57:29.105 226890 DEBUG oslo_concurrency.lockutils [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:29 np0005588920 nova_compute[226886]: 2026-01-20 14:57:29.105 226890 DEBUG oslo_concurrency.lockutils [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:29 np0005588920 nova_compute[226886]: 2026-01-20 14:57:29.107 226890 DEBUG nova.virt.libvirt.driver [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Start _get_guest_xml network_info=[{"id": "0293f4ad-1248-4899-81ef-32e616d9a754", "address": "fa:16:3e:a8:aa:73", "network": {"id": "41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1445030024-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3b1b7f5b4f84b5abbc401eb577c85c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0293f4ad-12", "ovs_interfaceid": "0293f4ad-1248-4899-81ef-32e616d9a754", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='',container_format='bare',created_at=2026-01-20T14:57:04Z,direct_url=<?>,disk_format='raw',id=1f2389f5-ba66-4c2e-9f07-9051cd05b976,min_disk=1,min_ram=0,name='tempest-ServerActionsTestOtherB-server-1938504336-shelved',owner='b3b1b7f5b4f84b5abbc401eb577c85c0',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2026-01-20T14:57:12Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:57:29 np0005588920 nova_compute[226886]: 2026-01-20 14:57:29.116 226890 WARNING nova.virt.libvirt.driver [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:57:29 np0005588920 nova_compute[226886]: 2026-01-20 14:57:29.121 226890 DEBUG nova.virt.libvirt.host [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:57:29 np0005588920 nova_compute[226886]: 2026-01-20 14:57:29.123 226890 DEBUG nova.virt.libvirt.host [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:57:29 np0005588920 nova_compute[226886]: 2026-01-20 14:57:29.126 226890 DEBUG nova.virt.libvirt.host [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:57:29 np0005588920 nova_compute[226886]: 2026-01-20 14:57:29.127 226890 DEBUG nova.virt.libvirt.host [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:57:29 np0005588920 nova_compute[226886]: 2026-01-20 14:57:29.129 226890 DEBUG nova.virt.libvirt.driver [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:57:29 np0005588920 nova_compute[226886]: 2026-01-20 14:57:29.129 226890 DEBUG nova.virt.hardware [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='',container_format='bare',created_at=2026-01-20T14:57:04Z,direct_url=<?>,disk_format='raw',id=1f2389f5-ba66-4c2e-9f07-9051cd05b976,min_disk=1,min_ram=0,name='tempest-ServerActionsTestOtherB-server-1938504336-shelved',owner='b3b1b7f5b4f84b5abbc401eb577c85c0',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2026-01-20T14:57:12Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:57:29 np0005588920 nova_compute[226886]: 2026-01-20 14:57:29.130 226890 DEBUG nova.virt.hardware [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:57:29 np0005588920 nova_compute[226886]: 2026-01-20 14:57:29.130 226890 DEBUG nova.virt.hardware [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:57:29 np0005588920 nova_compute[226886]: 2026-01-20 14:57:29.130 226890 DEBUG nova.virt.hardware [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:57:29 np0005588920 nova_compute[226886]: 2026-01-20 14:57:29.131 226890 DEBUG nova.virt.hardware [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:57:29 np0005588920 nova_compute[226886]: 2026-01-20 14:57:29.131 226890 DEBUG nova.virt.hardware [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:57:29 np0005588920 nova_compute[226886]: 2026-01-20 14:57:29.131 226890 DEBUG nova.virt.hardware [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:57:29 np0005588920 nova_compute[226886]: 2026-01-20 14:57:29.132 226890 DEBUG nova.virt.hardware [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:57:29 np0005588920 nova_compute[226886]: 2026-01-20 14:57:29.132 226890 DEBUG nova.virt.hardware [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:57:29 np0005588920 nova_compute[226886]: 2026-01-20 14:57:29.132 226890 DEBUG nova.virt.hardware [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:57:29 np0005588920 nova_compute[226886]: 2026-01-20 14:57:29.133 226890 DEBUG nova.virt.hardware [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:57:29 np0005588920 nova_compute[226886]: 2026-01-20 14:57:29.133 226890 DEBUG nova.objects.instance [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lazy-loading 'vcpu_model' on Instance uuid cf24bde1-0912-4d63-8959-6799ae8ab043 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:57:29 np0005588920 nova_compute[226886]: 2026-01-20 14:57:29.155 226890 DEBUG oslo_concurrency.processutils [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:57:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:29.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:57:29 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2445595473' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:57:29 np0005588920 nova_compute[226886]: 2026-01-20 14:57:29.606 226890 DEBUG oslo_concurrency.processutils [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:57:29 np0005588920 nova_compute[226886]: 2026-01-20 14:57:29.633 226890 DEBUG nova.storage.rbd_utils [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] rbd image cf24bde1-0912-4d63-8959-6799ae8ab043_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:57:29 np0005588920 nova_compute[226886]: 2026-01-20 14:57:29.638 226890 DEBUG oslo_concurrency.processutils [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:57:29 np0005588920 nova_compute[226886]: 2026-01-20 14:57:29.681 226890 DEBUG nova.network.neutron [req-f4718a15-cde7-4e38-ae03-2cfac926ba5b req-a94a2c68-9914-47fb-89ae-617fc949dc96 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Updated VIF entry in instance network info cache for port 0293f4ad-1248-4899-81ef-32e616d9a754. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:57:29 np0005588920 nova_compute[226886]: 2026-01-20 14:57:29.682 226890 DEBUG nova.network.neutron [req-f4718a15-cde7-4e38-ae03-2cfac926ba5b req-a94a2c68-9914-47fb-89ae-617fc949dc96 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Updating instance_info_cache with network_info: [{"id": "0293f4ad-1248-4899-81ef-32e616d9a754", "address": "fa:16:3e:a8:aa:73", "network": {"id": "41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1445030024-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3b1b7f5b4f84b5abbc401eb577c85c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0293f4ad-12", "ovs_interfaceid": "0293f4ad-1248-4899-81ef-32e616d9a754", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:57:29 np0005588920 nova_compute[226886]: 2026-01-20 14:57:29.703 226890 DEBUG oslo_concurrency.lockutils [req-f4718a15-cde7-4e38-ae03-2cfac926ba5b req-a94a2c68-9914-47fb-89ae-617fc949dc96 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-cf24bde1-0912-4d63-8959-6799ae8ab043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:57:30 np0005588920 nova_compute[226886]: 2026-01-20 14:57:30.109 226890 DEBUG oslo_concurrency.processutils [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:57:30 np0005588920 nova_compute[226886]: 2026-01-20 14:57:30.111 226890 DEBUG nova.virt.libvirt.vif [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-20T14:56:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1938504336',display_name='tempest-ServerActionsTestOtherB-server-1938504336',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1938504336',id=130,image_ref='1f2389f5-ba66-4c2e-9f07-9051cd05b976',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-keypair-302882914',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:56:09Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='b3b1b7f5b4f84b5abbc401eb577c85c0',ramdisk_id='',reservation_id='r-n1l51qfd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio'
,image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1136521362',owner_user_name='tempest-ServerActionsTestOtherB-1136521362-project-member',shelved_at='2026-01-20T14:57:12.903385',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='1f2389f5-ba66-4c2e-9f07-9051cd05b976'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:57:23Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='215db37373dc4ae5a75cbd6866f471da',uuid=cf24bde1-0912-4d63-8959-6799ae8ab043,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "0293f4ad-1248-4899-81ef-32e616d9a754", "address": "fa:16:3e:a8:aa:73", "network": {"id": "41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1445030024-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3b1b7f5b4f84b5abbc401eb577c85c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0293f4ad-12", "ovs_interfaceid": "0293f4ad-1248-4899-81ef-32e616d9a754", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:57:30 np0005588920 nova_compute[226886]: 2026-01-20 14:57:30.111 226890 DEBUG nova.network.os_vif_util [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Converting VIF {"id": "0293f4ad-1248-4899-81ef-32e616d9a754", "address": "fa:16:3e:a8:aa:73", "network": {"id": "41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1445030024-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3b1b7f5b4f84b5abbc401eb577c85c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0293f4ad-12", "ovs_interfaceid": "0293f4ad-1248-4899-81ef-32e616d9a754", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:57:30 np0005588920 nova_compute[226886]: 2026-01-20 14:57:30.112 226890 DEBUG nova.network.os_vif_util [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a8:aa:73,bridge_name='br-int',has_traffic_filtering=True,id=0293f4ad-1248-4899-81ef-32e616d9a754,network=Network(41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0293f4ad-12') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:57:30 np0005588920 nova_compute[226886]: 2026-01-20 14:57:30.113 226890 DEBUG nova.objects.instance [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lazy-loading 'pci_devices' on Instance uuid cf24bde1-0912-4d63-8959-6799ae8ab043 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:57:30 np0005588920 nova_compute[226886]: 2026-01-20 14:57:30.134 226890 DEBUG nova.virt.libvirt.driver [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:57:30 np0005588920 nova_compute[226886]:  <uuid>cf24bde1-0912-4d63-8959-6799ae8ab043</uuid>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:  <name>instance-00000082</name>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:57:30 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:      <nova:name>tempest-ServerActionsTestOtherB-server-1938504336</nova:name>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:57:29</nova:creationTime>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:57:30 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:        <nova:user uuid="215db37373dc4ae5a75cbd6866f471da">tempest-ServerActionsTestOtherB-1136521362-project-member</nova:user>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:        <nova:project uuid="b3b1b7f5b4f84b5abbc401eb577c85c0">tempest-ServerActionsTestOtherB-1136521362</nova:project>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="1f2389f5-ba66-4c2e-9f07-9051cd05b976"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:        <nova:port uuid="0293f4ad-1248-4899-81ef-32e616d9a754">
Jan 20 09:57:30 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:      <entry name="serial">cf24bde1-0912-4d63-8959-6799ae8ab043</entry>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:      <entry name="uuid">cf24bde1-0912-4d63-8959-6799ae8ab043</entry>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:57:30 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/cf24bde1-0912-4d63-8959-6799ae8ab043_disk">
Jan 20 09:57:30 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:57:30 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:57:30 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/cf24bde1-0912-4d63-8959-6799ae8ab043_disk.config">
Jan 20 09:57:30 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:57:30 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:57:30 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:a8:aa:73"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:      <target dev="tap0293f4ad-12"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:57:30 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/cf24bde1-0912-4d63-8959-6799ae8ab043/console.log" append="off"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    <input type="keyboard" bus="usb"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:57:30 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:57:30 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:57:30 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:57:30 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:57:30 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:57:30 np0005588920 nova_compute[226886]: 2026-01-20 14:57:30.136 226890 DEBUG nova.compute.manager [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Preparing to wait for external event network-vif-plugged-0293f4ad-1248-4899-81ef-32e616d9a754 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:57:30 np0005588920 nova_compute[226886]: 2026-01-20 14:57:30.137 226890 DEBUG oslo_concurrency.lockutils [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Acquiring lock "cf24bde1-0912-4d63-8959-6799ae8ab043-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:30 np0005588920 nova_compute[226886]: 2026-01-20 14:57:30.137 226890 DEBUG oslo_concurrency.lockutils [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "cf24bde1-0912-4d63-8959-6799ae8ab043-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:30 np0005588920 nova_compute[226886]: 2026-01-20 14:57:30.137 226890 DEBUG oslo_concurrency.lockutils [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "cf24bde1-0912-4d63-8959-6799ae8ab043-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:30 np0005588920 nova_compute[226886]: 2026-01-20 14:57:30.138 226890 DEBUG nova.virt.libvirt.vif [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-20T14:56:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1938504336',display_name='tempest-ServerActionsTestOtherB-server-1938504336',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1938504336',id=130,image_ref='1f2389f5-ba66-4c2e-9f07-9051cd05b976',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-keypair-302882914',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:56:09Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='b3b1b7f5b4f84b5abbc401eb577c85c0',ramdisk_id='',reservation_id='r-n1l51qfd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1136521362',owner_user_name='tempest-ServerActionsTestOtherB-1136521362-project-member',shelved_at='2026-01-20T14:57:12.903385',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='1f2389f5-ba66-4c2e-9f07-9051cd05b976'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:57:23Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='215db37373dc4ae5a75cbd6866f471da',uuid=cf24bde1-0912-4d63-8959-6799ae8ab043,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "0293f4ad-1248-4899-81ef-32e616d9a754", "address": "fa:16:3e:a8:aa:73", "network": {"id": "41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1445030024-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3b1b7f5b4f84b5abbc401eb577c85c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0293f4ad-12", "ovs_interfaceid": "0293f4ad-1248-4899-81ef-32e616d9a754", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:57:30 np0005588920 nova_compute[226886]: 2026-01-20 14:57:30.138 226890 DEBUG nova.network.os_vif_util [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Converting VIF {"id": "0293f4ad-1248-4899-81ef-32e616d9a754", "address": "fa:16:3e:a8:aa:73", "network": {"id": "41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1445030024-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3b1b7f5b4f84b5abbc401eb577c85c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0293f4ad-12", "ovs_interfaceid": "0293f4ad-1248-4899-81ef-32e616d9a754", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:57:30 np0005588920 nova_compute[226886]: 2026-01-20 14:57:30.139 226890 DEBUG nova.network.os_vif_util [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a8:aa:73,bridge_name='br-int',has_traffic_filtering=True,id=0293f4ad-1248-4899-81ef-32e616d9a754,network=Network(41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0293f4ad-12') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:57:30 np0005588920 nova_compute[226886]: 2026-01-20 14:57:30.139 226890 DEBUG os_vif [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:aa:73,bridge_name='br-int',has_traffic_filtering=True,id=0293f4ad-1248-4899-81ef-32e616d9a754,network=Network(41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0293f4ad-12') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:57:30 np0005588920 nova_compute[226886]: 2026-01-20 14:57:30.141 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:30 np0005588920 nova_compute[226886]: 2026-01-20 14:57:30.141 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:57:30 np0005588920 nova_compute[226886]: 2026-01-20 14:57:30.142 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:57:30 np0005588920 nova_compute[226886]: 2026-01-20 14:57:30.145 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:30 np0005588920 nova_compute[226886]: 2026-01-20 14:57:30.145 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0293f4ad-12, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:57:30 np0005588920 nova_compute[226886]: 2026-01-20 14:57:30.146 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0293f4ad-12, col_values=(('external_ids', {'iface-id': '0293f4ad-1248-4899-81ef-32e616d9a754', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a8:aa:73', 'vm-uuid': 'cf24bde1-0912-4d63-8959-6799ae8ab043'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:57:30 np0005588920 NetworkManager[49076]: <info>  [1768921050.1489] manager: (tap0293f4ad-12): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/302)
Jan 20 09:57:30 np0005588920 nova_compute[226886]: 2026-01-20 14:57:30.150 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:57:30 np0005588920 nova_compute[226886]: 2026-01-20 14:57:30.153 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:30 np0005588920 nova_compute[226886]: 2026-01-20 14:57:30.153 226890 INFO os_vif [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:aa:73,bridge_name='br-int',has_traffic_filtering=True,id=0293f4ad-1248-4899-81ef-32e616d9a754,network=Network(41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0293f4ad-12')#033[00m
Jan 20 09:57:30 np0005588920 nova_compute[226886]: 2026-01-20 14:57:30.218 226890 DEBUG nova.virt.libvirt.driver [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:57:30 np0005588920 nova_compute[226886]: 2026-01-20 14:57:30.219 226890 DEBUG nova.virt.libvirt.driver [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:57:30 np0005588920 nova_compute[226886]: 2026-01-20 14:57:30.219 226890 DEBUG nova.virt.libvirt.driver [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] No VIF found with MAC fa:16:3e:a8:aa:73, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:57:30 np0005588920 nova_compute[226886]: 2026-01-20 14:57:30.220 226890 INFO nova.virt.libvirt.driver [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Using config drive#033[00m
Jan 20 09:57:30 np0005588920 nova_compute[226886]: 2026-01-20 14:57:30.252 226890 DEBUG nova.storage.rbd_utils [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] rbd image cf24bde1-0912-4d63-8959-6799ae8ab043_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:57:30 np0005588920 nova_compute[226886]: 2026-01-20 14:57:30.279 226890 DEBUG nova.objects.instance [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lazy-loading 'ec2_ids' on Instance uuid cf24bde1-0912-4d63-8959-6799ae8ab043 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:57:30 np0005588920 nova_compute[226886]: 2026-01-20 14:57:30.332 226890 DEBUG nova.objects.instance [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lazy-loading 'keypairs' on Instance uuid cf24bde1-0912-4d63-8959-6799ae8ab043 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:57:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:30.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:30 np0005588920 nova_compute[226886]: 2026-01-20 14:57:30.712 226890 INFO nova.virt.libvirt.driver [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Creating config drive at /var/lib/nova/instances/cf24bde1-0912-4d63-8959-6799ae8ab043/disk.config#033[00m
Jan 20 09:57:30 np0005588920 nova_compute[226886]: 2026-01-20 14:57:30.717 226890 DEBUG oslo_concurrency.processutils [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/cf24bde1-0912-4d63-8959-6799ae8ab043/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcnndltj4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:57:30 np0005588920 nova_compute[226886]: 2026-01-20 14:57:30.854 226890 DEBUG oslo_concurrency.processutils [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/cf24bde1-0912-4d63-8959-6799ae8ab043/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcnndltj4" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:57:30 np0005588920 nova_compute[226886]: 2026-01-20 14:57:30.891 226890 DEBUG nova.storage.rbd_utils [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] rbd image cf24bde1-0912-4d63-8959-6799ae8ab043_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:57:30 np0005588920 nova_compute[226886]: 2026-01-20 14:57:30.895 226890 DEBUG oslo_concurrency.processutils [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/cf24bde1-0912-4d63-8959-6799ae8ab043/disk.config cf24bde1-0912-4d63-8959-6799ae8ab043_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:57:31 np0005588920 nova_compute[226886]: 2026-01-20 14:57:31.069 226890 DEBUG oslo_concurrency.processutils [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/cf24bde1-0912-4d63-8959-6799ae8ab043/disk.config cf24bde1-0912-4d63-8959-6799ae8ab043_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.174s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:57:31 np0005588920 nova_compute[226886]: 2026-01-20 14:57:31.070 226890 INFO nova.virt.libvirt.driver [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Deleting local config drive /var/lib/nova/instances/cf24bde1-0912-4d63-8959-6799ae8ab043/disk.config because it was imported into RBD.#033[00m
Jan 20 09:57:31 np0005588920 kernel: tap0293f4ad-12: entered promiscuous mode
Jan 20 09:57:31 np0005588920 NetworkManager[49076]: <info>  [1768921051.1192] manager: (tap0293f4ad-12): new Tun device (/org/freedesktop/NetworkManager/Devices/303)
Jan 20 09:57:31 np0005588920 nova_compute[226886]: 2026-01-20 14:57:31.117 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:57:31Z|00616|binding|INFO|Claiming lport 0293f4ad-1248-4899-81ef-32e616d9a754 for this chassis.
Jan 20 09:57:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:57:31Z|00617|binding|INFO|0293f4ad-1248-4899-81ef-32e616d9a754: Claiming fa:16:3e:a8:aa:73 10.100.0.12
Jan 20 09:57:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:31.129 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a8:aa:73 10.100.0.12'], port_security=['fa:16:3e:a8:aa:73 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'cf24bde1-0912-4d63-8959-6799ae8ab043', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b3b1b7f5b4f84b5abbc401eb577c85c0', 'neutron:revision_number': '7', 'neutron:security_group_ids': '8b11f3fb-2601-4eca-a1b6-838549d7750c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.184'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3273589e-5585-406c-9611-87f758b0e521, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=0293f4ad-1248-4899-81ef-32e616d9a754) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:57:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:31.131 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 0293f4ad-1248-4899-81ef-32e616d9a754 in datapath 41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce bound to our chassis#033[00m
Jan 20 09:57:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:31.133 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce#033[00m
Jan 20 09:57:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:57:31Z|00618|binding|INFO|Setting lport 0293f4ad-1248-4899-81ef-32e616d9a754 ovn-installed in OVS
Jan 20 09:57:31 np0005588920 ovn_controller[133971]: 2026-01-20T14:57:31Z|00619|binding|INFO|Setting lport 0293f4ad-1248-4899-81ef-32e616d9a754 up in Southbound
Jan 20 09:57:31 np0005588920 nova_compute[226886]: 2026-01-20 14:57:31.135 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:31 np0005588920 nova_compute[226886]: 2026-01-20 14:57:31.141 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:31 np0005588920 systemd-udevd[274999]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:57:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:31.154 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[fc169889-b4d5-43c3-bee7-f22ad49b7eef]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:31 np0005588920 systemd-machined[196121]: New machine qemu-62-instance-00000082.
Jan 20 09:57:31 np0005588920 NetworkManager[49076]: <info>  [1768921051.1685] device (tap0293f4ad-12): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:57:31 np0005588920 NetworkManager[49076]: <info>  [1768921051.1700] device (tap0293f4ad-12): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:57:31 np0005588920 systemd[1]: Started Virtual Machine qemu-62-instance-00000082.
Jan 20 09:57:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:31.185 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[c3a907b2-5589-4c73-9506-511c01df2d68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:31.189 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[36eb2fbe-4dd7-4a60-9ae5-a9e9fd00b997]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:31.218 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[e7520d37-87e5-4567-b33c-b593aab95b76]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:31.233 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2fd288e0-2948-4600-9635-63b31bd78fb2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap41a1a3fe-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3c:1f:b5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 9, 'rx_bytes': 700, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 9, 'rx_bytes': 700, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 183], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 585739, 'reachable_time': 32128, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 275011, 'error': None, 'target': 'ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:31.248 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[7b284b21-9121-46c4-84c3-421e54113e5c]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap41a1a3fe-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 585752, 'tstamp': 585752}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 275013, 'error': None, 'target': 'ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap41a1a3fe-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 585755, 'tstamp': 585755}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 275013, 'error': None, 'target': 'ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:31.249 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap41a1a3fe-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:57:31 np0005588920 nova_compute[226886]: 2026-01-20 14:57:31.250 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:31 np0005588920 nova_compute[226886]: 2026-01-20 14:57:31.251 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:31.252 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap41a1a3fe-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:57:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:31.252 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:57:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:31.252 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap41a1a3fe-f0, col_values=(('external_ids', {'iface-id': '3fa2df7b-42b2-4a3b-a33b-ab37b5d6aef3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:57:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:31.253 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:57:31 np0005588920 nova_compute[226886]: 2026-01-20 14:57:31.265 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:31.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:31 np0005588920 nova_compute[226886]: 2026-01-20 14:57:31.783 226890 DEBUG nova.compute.manager [req-32af0cd6-0095-459d-915f-3a13ba90f906 req-923ae16b-6ef8-420a-9e4b-4a63febf6b4a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Received event network-vif-plugged-0293f4ad-1248-4899-81ef-32e616d9a754 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:57:31 np0005588920 nova_compute[226886]: 2026-01-20 14:57:31.783 226890 DEBUG oslo_concurrency.lockutils [req-32af0cd6-0095-459d-915f-3a13ba90f906 req-923ae16b-6ef8-420a-9e4b-4a63febf6b4a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "cf24bde1-0912-4d63-8959-6799ae8ab043-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:31 np0005588920 nova_compute[226886]: 2026-01-20 14:57:31.784 226890 DEBUG oslo_concurrency.lockutils [req-32af0cd6-0095-459d-915f-3a13ba90f906 req-923ae16b-6ef8-420a-9e4b-4a63febf6b4a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "cf24bde1-0912-4d63-8959-6799ae8ab043-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:31 np0005588920 nova_compute[226886]: 2026-01-20 14:57:31.784 226890 DEBUG oslo_concurrency.lockutils [req-32af0cd6-0095-459d-915f-3a13ba90f906 req-923ae16b-6ef8-420a-9e4b-4a63febf6b4a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "cf24bde1-0912-4d63-8959-6799ae8ab043-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:31 np0005588920 nova_compute[226886]: 2026-01-20 14:57:31.784 226890 DEBUG nova.compute.manager [req-32af0cd6-0095-459d-915f-3a13ba90f906 req-923ae16b-6ef8-420a-9e4b-4a63febf6b4a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Processing event network-vif-plugged-0293f4ad-1248-4899-81ef-32e616d9a754 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:57:32 np0005588920 nova_compute[226886]: 2026-01-20 14:57:32.066 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921052.0659683, cf24bde1-0912-4d63-8959-6799ae8ab043 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:57:32 np0005588920 nova_compute[226886]: 2026-01-20 14:57:32.067 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] VM Started (Lifecycle Event)#033[00m
Jan 20 09:57:32 np0005588920 nova_compute[226886]: 2026-01-20 14:57:32.069 226890 DEBUG nova.compute.manager [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:57:32 np0005588920 nova_compute[226886]: 2026-01-20 14:57:32.073 226890 DEBUG nova.virt.libvirt.driver [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:57:32 np0005588920 nova_compute[226886]: 2026-01-20 14:57:32.076 226890 INFO nova.virt.libvirt.driver [-] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Instance spawned successfully.#033[00m
Jan 20 09:57:32 np0005588920 nova_compute[226886]: 2026-01-20 14:57:32.096 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:57:32 np0005588920 nova_compute[226886]: 2026-01-20 14:57:32.099 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Synchronizing instance power state after lifecycle event "Started"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:57:32 np0005588920 nova_compute[226886]: 2026-01-20 14:57:32.118 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:57:32 np0005588920 nova_compute[226886]: 2026-01-20 14:57:32.119 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921052.0661862, cf24bde1-0912-4d63-8959-6799ae8ab043 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:57:32 np0005588920 nova_compute[226886]: 2026-01-20 14:57:32.119 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:57:32 np0005588920 nova_compute[226886]: 2026-01-20 14:57:32.142 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:57:32 np0005588920 nova_compute[226886]: 2026-01-20 14:57:32.146 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921052.071933, cf24bde1-0912-4d63-8959-6799ae8ab043 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:57:32 np0005588920 nova_compute[226886]: 2026-01-20 14:57:32.146 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:57:32 np0005588920 nova_compute[226886]: 2026-01-20 14:57:32.170 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:57:32 np0005588920 nova_compute[226886]: 2026-01-20 14:57:32.176 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:57:32 np0005588920 nova_compute[226886]: 2026-01-20 14:57:32.209 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:57:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:32.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e317 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:57:32 np0005588920 nova_compute[226886]: 2026-01-20 14:57:32.812 226890 DEBUG oslo_concurrency.lockutils [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Acquiring lock "a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:32 np0005588920 nova_compute[226886]: 2026-01-20 14:57:32.812 226890 DEBUG oslo_concurrency.lockutils [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:32 np0005588920 nova_compute[226886]: 2026-01-20 14:57:32.838 226890 DEBUG nova.compute.manager [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:57:32 np0005588920 nova_compute[226886]: 2026-01-20 14:57:32.960 226890 DEBUG oslo_concurrency.lockutils [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:32 np0005588920 nova_compute[226886]: 2026-01-20 14:57:32.960 226890 DEBUG oslo_concurrency.lockutils [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:32 np0005588920 nova_compute[226886]: 2026-01-20 14:57:32.983 226890 DEBUG nova.virt.hardware [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:57:32 np0005588920 nova_compute[226886]: 2026-01-20 14:57:32.983 226890 INFO nova.compute.claims [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 09:57:33 np0005588920 nova_compute[226886]: 2026-01-20 14:57:33.148 226890 DEBUG oslo_concurrency.processutils [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:57:33 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e318 e318: 3 total, 3 up, 3 in
Jan 20 09:57:33 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:57:33 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:57:33 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:57:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:33.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:33 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:57:33 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3524889482' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:57:33 np0005588920 nova_compute[226886]: 2026-01-20 14:57:33.603 226890 DEBUG oslo_concurrency.processutils [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:57:33 np0005588920 nova_compute[226886]: 2026-01-20 14:57:33.608 226890 DEBUG nova.compute.provider_tree [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:57:33 np0005588920 nova_compute[226886]: 2026-01-20 14:57:33.625 226890 DEBUG nova.scheduler.client.report [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:57:33 np0005588920 nova_compute[226886]: 2026-01-20 14:57:33.650 226890 DEBUG oslo_concurrency.lockutils [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.690s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:33 np0005588920 nova_compute[226886]: 2026-01-20 14:57:33.651 226890 DEBUG nova.compute.manager [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:57:33 np0005588920 nova_compute[226886]: 2026-01-20 14:57:33.703 226890 DEBUG nova.compute.manager [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:57:33 np0005588920 nova_compute[226886]: 2026-01-20 14:57:33.704 226890 DEBUG nova.network.neutron [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:57:33 np0005588920 nova_compute[226886]: 2026-01-20 14:57:33.726 226890 INFO nova.virt.libvirt.driver [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:57:33 np0005588920 nova_compute[226886]: 2026-01-20 14:57:33.743 226890 DEBUG nova.compute.manager [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:57:33 np0005588920 nova_compute[226886]: 2026-01-20 14:57:33.785 226890 DEBUG nova.compute.manager [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:57:33 np0005588920 nova_compute[226886]: 2026-01-20 14:57:33.795 226890 INFO nova.virt.block_device [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Booting with volume 94858b1a-370f-4ee8-b017-647fe5082382 at /dev/vda#033[00m
Jan 20 09:57:33 np0005588920 nova_compute[226886]: 2026-01-20 14:57:33.857 226890 DEBUG oslo_concurrency.lockutils [None req-abf6f964-8975-43dc-a29a-bc9b7b524cc3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "cf24bde1-0912-4d63-8959-6799ae8ab043" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 10.677s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:33 np0005588920 nova_compute[226886]: 2026-01-20 14:57:33.911 226890 DEBUG nova.compute.manager [req-559b044e-78df-473b-85df-b624d1fd284a req-12db5d70-896f-4bcc-9a31-8a8e532fc4e6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Received event network-vif-plugged-0293f4ad-1248-4899-81ef-32e616d9a754 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:57:33 np0005588920 nova_compute[226886]: 2026-01-20 14:57:33.911 226890 DEBUG oslo_concurrency.lockutils [req-559b044e-78df-473b-85df-b624d1fd284a req-12db5d70-896f-4bcc-9a31-8a8e532fc4e6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "cf24bde1-0912-4d63-8959-6799ae8ab043-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:33 np0005588920 nova_compute[226886]: 2026-01-20 14:57:33.911 226890 DEBUG oslo_concurrency.lockutils [req-559b044e-78df-473b-85df-b624d1fd284a req-12db5d70-896f-4bcc-9a31-8a8e532fc4e6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "cf24bde1-0912-4d63-8959-6799ae8ab043-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:33 np0005588920 nova_compute[226886]: 2026-01-20 14:57:33.912 226890 DEBUG oslo_concurrency.lockutils [req-559b044e-78df-473b-85df-b624d1fd284a req-12db5d70-896f-4bcc-9a31-8a8e532fc4e6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "cf24bde1-0912-4d63-8959-6799ae8ab043-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:33 np0005588920 nova_compute[226886]: 2026-01-20 14:57:33.912 226890 DEBUG nova.compute.manager [req-559b044e-78df-473b-85df-b624d1fd284a req-12db5d70-896f-4bcc-9a31-8a8e532fc4e6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] No waiting events found dispatching network-vif-plugged-0293f4ad-1248-4899-81ef-32e616d9a754 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:57:33 np0005588920 nova_compute[226886]: 2026-01-20 14:57:33.912 226890 WARNING nova.compute.manager [req-559b044e-78df-473b-85df-b624d1fd284a req-12db5d70-896f-4bcc-9a31-8a8e532fc4e6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Received unexpected event network-vif-plugged-0293f4ad-1248-4899-81ef-32e616d9a754 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:57:34 np0005588920 nova_compute[226886]: 2026-01-20 14:57:34.023 226890 DEBUG os_brick.utils [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 20 09:57:34 np0005588920 nova_compute[226886]: 2026-01-20 14:57:34.025 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:57:34 np0005588920 nova_compute[226886]: 2026-01-20 14:57:34.053 226890 DEBUG nova.policy [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ed2c9bd268d1491fa3484d86bcdb9ec6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '107c1f3b5b7b413d9a389ca1166e331f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:57:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:34.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:34 np0005588920 nova_compute[226886]: 2026-01-20 14:57:34.853 231437 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.828s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:57:34 np0005588920 nova_compute[226886]: 2026-01-20 14:57:34.853 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[ab08ffeb-bcd6-417c-93c8-4ac086211d47]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:34 np0005588920 nova_compute[226886]: 2026-01-20 14:57:34.855 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:57:34 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:57:34 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:57:34 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:57:34 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:57:34 np0005588920 nova_compute[226886]: 2026-01-20 14:57:34.863 231437 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:57:34 np0005588920 nova_compute[226886]: 2026-01-20 14:57:34.864 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[c41d7ccb-3979-4b86-bdb3-20e6e84ca9b9]: (4, ('InitiatorName=iqn.1994-05.com.redhat:f7e7b2e5d28e', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:34 np0005588920 nova_compute[226886]: 2026-01-20 14:57:34.865 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:57:34 np0005588920 nova_compute[226886]: 2026-01-20 14:57:34.874 231437 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:57:34 np0005588920 nova_compute[226886]: 2026-01-20 14:57:34.874 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[7aff089a-6879-47b6-ad36-448ba29c1c33]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:34 np0005588920 nova_compute[226886]: 2026-01-20 14:57:34.876 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[a2d65e57-e068-455f-bf28-f81bf211e432]: (4, '1190ec40-2b89-4358-8dec-733c5829fbed') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:34 np0005588920 nova_compute[226886]: 2026-01-20 14:57:34.877 226890 DEBUG oslo_concurrency.processutils [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:57:34 np0005588920 nova_compute[226886]: 2026-01-20 14:57:34.903 226890 DEBUG oslo_concurrency.processutils [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] CMD "nvme version" returned: 0 in 0.026s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:57:34 np0005588920 nova_compute[226886]: 2026-01-20 14:57:34.907 226890 DEBUG os_brick.initiator.connectors.lightos [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 20 09:57:34 np0005588920 nova_compute[226886]: 2026-01-20 14:57:34.908 226890 DEBUG os_brick.initiator.connectors.lightos [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 20 09:57:34 np0005588920 nova_compute[226886]: 2026-01-20 14:57:34.908 226890 DEBUG os_brick.initiator.connectors.lightos [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 20 09:57:34 np0005588920 nova_compute[226886]: 2026-01-20 14:57:34.909 226890 DEBUG os_brick.utils [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] <== get_connector_properties: return (884ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:f7e7b2e5d28e', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '1190ec40-2b89-4358-8dec-733c5829fbed', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 20 09:57:34 np0005588920 nova_compute[226886]: 2026-01-20 14:57:34.909 226890 DEBUG nova.virt.block_device [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Updating existing volume attachment record: 86aad614-44a2-43a7-a5c0-6a1d947269d9 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 20 09:57:34 np0005588920 podman[275216]: 2026-01-20 14:57:34.976935177 +0000 UTC m=+0.065096455 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 20 09:57:35 np0005588920 nova_compute[226886]: 2026-01-20 14:57:35.149 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:57:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:35.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:57:36 np0005588920 nova_compute[226886]: 2026-01-20 14:57:36.266 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:36 np0005588920 nova_compute[226886]: 2026-01-20 14:57:36.277 226890 DEBUG nova.compute.manager [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:57:36 np0005588920 nova_compute[226886]: 2026-01-20 14:57:36.279 226890 DEBUG nova.virt.libvirt.driver [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:57:36 np0005588920 nova_compute[226886]: 2026-01-20 14:57:36.280 226890 INFO nova.virt.libvirt.driver [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Creating image(s)#033[00m
Jan 20 09:57:36 np0005588920 nova_compute[226886]: 2026-01-20 14:57:36.280 226890 DEBUG nova.virt.libvirt.driver [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 20 09:57:36 np0005588920 nova_compute[226886]: 2026-01-20 14:57:36.280 226890 DEBUG nova.virt.libvirt.driver [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Ensure instance console log exists: /var/lib/nova/instances/a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:57:36 np0005588920 nova_compute[226886]: 2026-01-20 14:57:36.281 226890 DEBUG oslo_concurrency.lockutils [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:36 np0005588920 nova_compute[226886]: 2026-01-20 14:57:36.281 226890 DEBUG oslo_concurrency.lockutils [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:36 np0005588920 nova_compute[226886]: 2026-01-20 14:57:36.282 226890 DEBUG oslo_concurrency.lockutils [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:57:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:36.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:57:37 np0005588920 nova_compute[226886]: 2026-01-20 14:57:37.069 226890 DEBUG nova.network.neutron [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Successfully created port: c0ac6308-ae73-4b17-95fa-47f3df3c4f97 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:57:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:57:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:37.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:57:37 np0005588920 nova_compute[226886]: 2026-01-20 14:57:37.538 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:37.538 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=42, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=41) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:57:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:37.540 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:57:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e318 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:57:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e319 e319: 3 total, 3 up, 3 in
Jan 20 09:57:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:38.542 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '42'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:57:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:38.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:39 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:57:39 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1873084837' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:57:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:39.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:39 np0005588920 nova_compute[226886]: 2026-01-20 14:57:39.709 226890 DEBUG nova.network.neutron [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Successfully updated port: c0ac6308-ae73-4b17-95fa-47f3df3c4f97 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:57:39 np0005588920 nova_compute[226886]: 2026-01-20 14:57:39.730 226890 DEBUG oslo_concurrency.lockutils [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Acquiring lock "refresh_cache-a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:57:39 np0005588920 nova_compute[226886]: 2026-01-20 14:57:39.731 226890 DEBUG oslo_concurrency.lockutils [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Acquired lock "refresh_cache-a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:57:39 np0005588920 nova_compute[226886]: 2026-01-20 14:57:39.731 226890 DEBUG nova.network.neutron [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:57:39 np0005588920 nova_compute[226886]: 2026-01-20 14:57:39.958 226890 DEBUG nova.network.neutron [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:57:39 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e320 e320: 3 total, 3 up, 3 in
Jan 20 09:57:39 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #100. Immutable memtables: 0.
Jan 20 09:57:39 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:57:39.988120) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 09:57:39 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 61] Flushing memtable with next log file: 100
Jan 20 09:57:39 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921059988230, "job": 61, "event": "flush_started", "num_memtables": 1, "num_entries": 1600, "num_deletes": 256, "total_data_size": 3286196, "memory_usage": 3359552, "flush_reason": "Manual Compaction"}
Jan 20 09:57:39 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 61] Level-0 flush table #101: started
Jan 20 09:57:40 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921060010721, "cf_name": "default", "job": 61, "event": "table_file_creation", "file_number": 101, "file_size": 2143187, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 51007, "largest_seqno": 52602, "table_properties": {"data_size": 2136383, "index_size": 3811, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1925, "raw_key_size": 15753, "raw_average_key_size": 20, "raw_value_size": 2122188, "raw_average_value_size": 2822, "num_data_blocks": 166, "num_entries": 752, "num_filter_entries": 752, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768920955, "oldest_key_time": 1768920955, "file_creation_time": 1768921059, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 101, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:57:40 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 61] Flush lasted 22640 microseconds, and 5278 cpu microseconds.
Jan 20 09:57:40 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:57:40 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:57:40.010767) [db/flush_job.cc:967] [default] [JOB 61] Level-0 flush table #101: 2143187 bytes OK
Jan 20 09:57:40 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:57:40.010785) [db/memtable_list.cc:519] [default] Level-0 commit table #101 started
Jan 20 09:57:40 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:57:40.012439) [db/memtable_list.cc:722] [default] Level-0 commit table #101: memtable #1 done
Jan 20 09:57:40 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:57:40.012456) EVENT_LOG_v1 {"time_micros": 1768921060012450, "job": 61, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 09:57:40 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:57:40.012473) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 09:57:40 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 61] Try to delete WAL files size 3278640, prev total WAL file size 3278640, number of live WAL files 2.
Jan 20 09:57:40 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000097.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:57:40 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:57:40.013448) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730034323637' seq:72057594037927935, type:22 .. '7061786F730034353139' seq:0, type:0; will stop at (end)
Jan 20 09:57:40 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 62] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 09:57:40 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 61 Base level 0, inputs: [101(2092KB)], [99(9849KB)]
Jan 20 09:57:40 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921060013506, "job": 62, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [101], "files_L6": [99], "score": -1, "input_data_size": 12228972, "oldest_snapshot_seqno": -1}
Jan 20 09:57:40 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 62] Generated table #102: 7778 keys, 10351051 bytes, temperature: kUnknown
Jan 20 09:57:40 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921060125506, "cf_name": "default", "job": 62, "event": "table_file_creation", "file_number": 102, "file_size": 10351051, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10300239, "index_size": 30284, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 19461, "raw_key_size": 201283, "raw_average_key_size": 25, "raw_value_size": 10162549, "raw_average_value_size": 1306, "num_data_blocks": 1188, "num_entries": 7778, "num_filter_entries": 7778, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768921060, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 102, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:57:40 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:57:40 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:57:40.125856) [db/compaction/compaction_job.cc:1663] [default] [JOB 62] Compacted 1@0 + 1@6 files to L6 => 10351051 bytes
Jan 20 09:57:40 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:57:40.127529) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 109.1 rd, 92.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.0, 9.6 +0.0 blob) out(9.9 +0.0 blob), read-write-amplify(10.5) write-amplify(4.8) OK, records in: 8311, records dropped: 533 output_compression: NoCompression
Jan 20 09:57:40 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:57:40.127550) EVENT_LOG_v1 {"time_micros": 1768921060127540, "job": 62, "event": "compaction_finished", "compaction_time_micros": 112131, "compaction_time_cpu_micros": 24419, "output_level": 6, "num_output_files": 1, "total_output_size": 10351051, "num_input_records": 8311, "num_output_records": 7778, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 09:57:40 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000101.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:57:40 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921060128091, "job": 62, "event": "table_file_deletion", "file_number": 101}
Jan 20 09:57:40 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000099.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:57:40 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921060130723, "job": 62, "event": "table_file_deletion", "file_number": 99}
Jan 20 09:57:40 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:57:40.013356) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:57:40 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:57:40.130832) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:57:40 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:57:40.130839) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:57:40 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:57:40.130841) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:57:40 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:57:40.130843) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:57:40 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-14:57:40.130845) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:57:40 np0005588920 nova_compute[226886]: 2026-01-20 14:57:40.150 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:40 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:57:40 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:57:40 np0005588920 nova_compute[226886]: 2026-01-20 14:57:40.368 226890 DEBUG oslo_concurrency.lockutils [None req-8d851848-d4d9-4fcf-b88a-62a37c25fc9c 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Acquiring lock "cf24bde1-0912-4d63-8959-6799ae8ab043" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:40 np0005588920 nova_compute[226886]: 2026-01-20 14:57:40.368 226890 DEBUG oslo_concurrency.lockutils [None req-8d851848-d4d9-4fcf-b88a-62a37c25fc9c 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "cf24bde1-0912-4d63-8959-6799ae8ab043" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:40 np0005588920 nova_compute[226886]: 2026-01-20 14:57:40.369 226890 DEBUG oslo_concurrency.lockutils [None req-8d851848-d4d9-4fcf-b88a-62a37c25fc9c 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Acquiring lock "cf24bde1-0912-4d63-8959-6799ae8ab043-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:40 np0005588920 nova_compute[226886]: 2026-01-20 14:57:40.369 226890 DEBUG oslo_concurrency.lockutils [None req-8d851848-d4d9-4fcf-b88a-62a37c25fc9c 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "cf24bde1-0912-4d63-8959-6799ae8ab043-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:40 np0005588920 nova_compute[226886]: 2026-01-20 14:57:40.369 226890 DEBUG oslo_concurrency.lockutils [None req-8d851848-d4d9-4fcf-b88a-62a37c25fc9c 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "cf24bde1-0912-4d63-8959-6799ae8ab043-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:40 np0005588920 nova_compute[226886]: 2026-01-20 14:57:40.371 226890 INFO nova.compute.manager [None req-8d851848-d4d9-4fcf-b88a-62a37c25fc9c 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Terminating instance#033[00m
Jan 20 09:57:40 np0005588920 nova_compute[226886]: 2026-01-20 14:57:40.372 226890 DEBUG nova.compute.manager [None req-8d851848-d4d9-4fcf-b88a-62a37c25fc9c 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:57:40 np0005588920 kernel: tap0293f4ad-12 (unregistering): left promiscuous mode
Jan 20 09:57:40 np0005588920 NetworkManager[49076]: <info>  [1768921060.4080] device (tap0293f4ad-12): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:57:40 np0005588920 ovn_controller[133971]: 2026-01-20T14:57:40Z|00620|binding|INFO|Releasing lport 0293f4ad-1248-4899-81ef-32e616d9a754 from this chassis (sb_readonly=0)
Jan 20 09:57:40 np0005588920 ovn_controller[133971]: 2026-01-20T14:57:40Z|00621|binding|INFO|Setting lport 0293f4ad-1248-4899-81ef-32e616d9a754 down in Southbound
Jan 20 09:57:40 np0005588920 nova_compute[226886]: 2026-01-20 14:57:40.417 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:40 np0005588920 ovn_controller[133971]: 2026-01-20T14:57:40Z|00622|binding|INFO|Removing iface tap0293f4ad-12 ovn-installed in OVS
Jan 20 09:57:40 np0005588920 nova_compute[226886]: 2026-01-20 14:57:40.419 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:40 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:40.424 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a8:aa:73 10.100.0.12'], port_security=['fa:16:3e:a8:aa:73 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'cf24bde1-0912-4d63-8959-6799ae8ab043', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b3b1b7f5b4f84b5abbc401eb577c85c0', 'neutron:revision_number': '9', 'neutron:security_group_ids': '8b11f3fb-2601-4eca-a1b6-838549d7750c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.184', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3273589e-5585-406c-9611-87f758b0e521, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=0293f4ad-1248-4899-81ef-32e616d9a754) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:57:40 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:40.425 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 0293f4ad-1248-4899-81ef-32e616d9a754 in datapath 41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce unbound from our chassis#033[00m
Jan 20 09:57:40 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:40.427 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce#033[00m
Jan 20 09:57:40 np0005588920 nova_compute[226886]: 2026-01-20 14:57:40.433 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:40 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:40.443 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d69fa2f7-a346-40c4-a2c3-f90d14359082]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:40 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:40.474 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[45cdef11-2667-4025-b048-cd024dc61af7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:40 np0005588920 systemd[1]: machine-qemu\x2d62\x2dinstance\x2d00000082.scope: Deactivated successfully.
Jan 20 09:57:40 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:40.477 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[310afb5f-4443-4b12-873b-9d0f125d63e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:40 np0005588920 systemd[1]: machine-qemu\x2d62\x2dinstance\x2d00000082.scope: Consumed 9.501s CPU time.
Jan 20 09:57:40 np0005588920 systemd-machined[196121]: Machine qemu-62-instance-00000082 terminated.
Jan 20 09:57:40 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:40.503 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[243e680a-05ee-4364-a4e5-30faa1466201]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:40 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:40.519 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[82a42666-d6de-4a27-b5fd-3bcd3d1cc101]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap41a1a3fe-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3c:1f:b5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 11, 'rx_bytes': 700, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 11, 'rx_bytes': 700, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 183], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 585739, 'reachable_time': 32128, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 275297, 'error': None, 'target': 'ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:40 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:40.536 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[47bc5a9d-9b7d-4fb4-ae7a-11af049ed59e]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap41a1a3fe-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 585752, 'tstamp': 585752}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 275298, 'error': None, 'target': 'ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap41a1a3fe-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 585755, 'tstamp': 585755}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 275298, 'error': None, 'target': 'ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:40 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:40.538 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap41a1a3fe-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:57:40 np0005588920 nova_compute[226886]: 2026-01-20 14:57:40.539 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:40 np0005588920 nova_compute[226886]: 2026-01-20 14:57:40.543 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:40 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:40.543 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap41a1a3fe-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:57:40 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:40.544 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:57:40 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:40.544 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap41a1a3fe-f0, col_values=(('external_ids', {'iface-id': '3fa2df7b-42b2-4a3b-a33b-ab37b5d6aef3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:57:40 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:40.544 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:57:40 np0005588920 nova_compute[226886]: 2026-01-20 14:57:40.594 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:40 np0005588920 nova_compute[226886]: 2026-01-20 14:57:40.603 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:40 np0005588920 nova_compute[226886]: 2026-01-20 14:57:40.609 226890 INFO nova.virt.libvirt.driver [-] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Instance destroyed successfully.#033[00m
Jan 20 09:57:40 np0005588920 nova_compute[226886]: 2026-01-20 14:57:40.610 226890 DEBUG nova.objects.instance [None req-8d851848-d4d9-4fcf-b88a-62a37c25fc9c 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lazy-loading 'resources' on Instance uuid cf24bde1-0912-4d63-8959-6799ae8ab043 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:57:40 np0005588920 nova_compute[226886]: 2026-01-20 14:57:40.632 226890 DEBUG nova.virt.libvirt.vif [None req-8d851848-d4d9-4fcf-b88a-62a37c25fc9c 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-20T14:56:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1938504336',display_name='tempest-ServerActionsTestOtherB-server-1938504336',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1938504336',id=130,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO99uJ9+FwgjxRb/9u+f3Mj9/VKSDM+OKd66Ygsg8lEO+7bGpDEQrC5BIaSV+Na5YF+3DqUwLNmAYSN9IkTSGbRPw5y8813A+KsiNHebrpnZ7oReyT+5/zNQYafCHVAfGA==',key_name='tempest-keypair-302882914',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:57:33Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b3b1b7f5b4f84b5abbc401eb577c85c0',ramdisk_id='',reservation_id='r-n1l51qfd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1136521362',owner_user_name='tempest-ServerActionsTestOtherB-1136521362-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:57:33Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='215db37373dc4ae5a75cbd6866f471da',uuid=cf24bde1-0912-4d63-8959-6799ae8ab043,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0293f4ad-1248-4899-81ef-32e616d9a754", "address": "fa:16:3e:a8:aa:73", "network": {"id": "41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1445030024-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3b1b7f5b4f84b5abbc401eb577c85c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0293f4ad-12", "ovs_interfaceid": "0293f4ad-1248-4899-81ef-32e616d9a754", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:57:40 np0005588920 nova_compute[226886]: 2026-01-20 14:57:40.632 226890 DEBUG nova.network.os_vif_util [None req-8d851848-d4d9-4fcf-b88a-62a37c25fc9c 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Converting VIF {"id": "0293f4ad-1248-4899-81ef-32e616d9a754", "address": "fa:16:3e:a8:aa:73", "network": {"id": "41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1445030024-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3b1b7f5b4f84b5abbc401eb577c85c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0293f4ad-12", "ovs_interfaceid": "0293f4ad-1248-4899-81ef-32e616d9a754", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:57:40 np0005588920 nova_compute[226886]: 2026-01-20 14:57:40.633 226890 DEBUG nova.network.os_vif_util [None req-8d851848-d4d9-4fcf-b88a-62a37c25fc9c 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a8:aa:73,bridge_name='br-int',has_traffic_filtering=True,id=0293f4ad-1248-4899-81ef-32e616d9a754,network=Network(41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0293f4ad-12') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:57:40 np0005588920 nova_compute[226886]: 2026-01-20 14:57:40.633 226890 DEBUG os_vif [None req-8d851848-d4d9-4fcf-b88a-62a37c25fc9c 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:aa:73,bridge_name='br-int',has_traffic_filtering=True,id=0293f4ad-1248-4899-81ef-32e616d9a754,network=Network(41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0293f4ad-12') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:57:40 np0005588920 nova_compute[226886]: 2026-01-20 14:57:40.635 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:40 np0005588920 nova_compute[226886]: 2026-01-20 14:57:40.635 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0293f4ad-12, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:57:40 np0005588920 nova_compute[226886]: 2026-01-20 14:57:40.636 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:40 np0005588920 nova_compute[226886]: 2026-01-20 14:57:40.638 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:40 np0005588920 nova_compute[226886]: 2026-01-20 14:57:40.640 226890 INFO os_vif [None req-8d851848-d4d9-4fcf-b88a-62a37c25fc9c 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:aa:73,bridge_name='br-int',has_traffic_filtering=True,id=0293f4ad-1248-4899-81ef-32e616d9a754,network=Network(41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0293f4ad-12')#033[00m
Jan 20 09:57:40 np0005588920 nova_compute[226886]: 2026-01-20 14:57:40.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:57:40 np0005588920 ovn_controller[133971]: 2026-01-20T14:57:40Z|00623|binding|INFO|Releasing lport 3fa2df7b-42b2-4a3b-a33b-ab37b5d6aef3 from this chassis (sb_readonly=0)
Jan 20 09:57:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:40.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:40 np0005588920 nova_compute[226886]: 2026-01-20 14:57:40.899 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.073 226890 DEBUG nova.network.neutron [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Updating instance_info_cache with network_info: [{"id": "c0ac6308-ae73-4b17-95fa-47f3df3c4f97", "address": "fa:16:3e:d8:23:95", "network": {"id": "58d966e1-4d26-414a-920e-0be2d77abb59", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1896990059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "107c1f3b5b7b413d9a389ca1166e331f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0ac6308-ae", "ovs_interfaceid": "c0ac6308-ae73-4b17-95fa-47f3df3c4f97", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.128 226890 DEBUG oslo_concurrency.lockutils [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Releasing lock "refresh_cache-a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.129 226890 DEBUG nova.compute.manager [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Instance network_info: |[{"id": "c0ac6308-ae73-4b17-95fa-47f3df3c4f97", "address": "fa:16:3e:d8:23:95", "network": {"id": "58d966e1-4d26-414a-920e-0be2d77abb59", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1896990059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "107c1f3b5b7b413d9a389ca1166e331f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0ac6308-ae", "ovs_interfaceid": "c0ac6308-ae73-4b17-95fa-47f3df3c4f97", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.131 226890 DEBUG nova.virt.libvirt.driver [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Start _get_guest_xml network_info=[{"id": "c0ac6308-ae73-4b17-95fa-47f3df3c4f97", "address": "fa:16:3e:d8:23:95", "network": {"id": "58d966e1-4d26-414a-920e-0be2d77abb59", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1896990059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "107c1f3b5b7b413d9a389ca1166e331f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0ac6308-ae", "ovs_interfaceid": "c0ac6308-ae73-4b17-95fa-47f3df3c4f97", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None 
block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'device_type': 'disk', 'boot_index': 0, 'delete_on_termination': False, 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-94858b1a-370f-4ee8-b017-647fe5082382', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '94858b1a-370f-4ee8-b017-647fe5082382', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f', 'attached_at': '', 'detached_at': '', 'volume_id': '94858b1a-370f-4ee8-b017-647fe5082382', 'serial': '94858b1a-370f-4ee8-b017-647fe5082382'}, 'mount_device': '/dev/vda', 'guest_format': None, 'disk_bus': 'virtio', 'attachment_id': '86aad614-44a2-43a7-a5c0-6a1d947269d9', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.135 226890 WARNING nova.virt.libvirt.driver [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.140 226890 DEBUG nova.virt.libvirt.host [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.141 226890 DEBUG nova.virt.libvirt.host [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.144 226890 DEBUG nova.virt.libvirt.host [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.145 226890 DEBUG nova.virt.libvirt.host [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.146 226890 DEBUG nova.virt.libvirt.driver [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.146 226890 DEBUG nova.virt.hardware [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.147 226890 DEBUG nova.virt.hardware [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.147 226890 DEBUG nova.virt.hardware [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.147 226890 DEBUG nova.virt.hardware [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.147 226890 DEBUG nova.virt.hardware [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.148 226890 DEBUG nova.virt.hardware [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.148 226890 DEBUG nova.virt.hardware [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.148 226890 DEBUG nova.virt.hardware [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.148 226890 DEBUG nova.virt.hardware [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.148 226890 DEBUG nova.virt.hardware [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.149 226890 DEBUG nova.virt.hardware [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.176 226890 DEBUG nova.storage.rbd_utils [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] rbd image a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.183 226890 DEBUG oslo_concurrency.processutils [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.248 226890 INFO nova.virt.libvirt.driver [None req-8d851848-d4d9-4fcf-b88a-62a37c25fc9c 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Deleting instance files /var/lib/nova/instances/cf24bde1-0912-4d63-8959-6799ae8ab043_del#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.249 226890 INFO nova.virt.libvirt.driver [None req-8d851848-d4d9-4fcf-b88a-62a37c25fc9c 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Deletion of /var/lib/nova/instances/cf24bde1-0912-4d63-8959-6799ae8ab043_del complete#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.270 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.335 226890 INFO nova.compute.manager [None req-8d851848-d4d9-4fcf-b88a-62a37c25fc9c 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Took 0.96 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.336 226890 DEBUG oslo.service.loopingcall [None req-8d851848-d4d9-4fcf-b88a-62a37c25fc9c 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.336 226890 DEBUG nova.compute.manager [-] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.337 226890 DEBUG nova.network.neutron [-] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.367 226890 DEBUG nova.compute.manager [req-c0c5229d-1ab9-4583-9ba8-79fe11768edd req-150e2247-905f-46b8-b55f-ff0d4bc22ca3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Received event network-changed-c0ac6308-ae73-4b17-95fa-47f3df3c4f97 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.367 226890 DEBUG nova.compute.manager [req-c0c5229d-1ab9-4583-9ba8-79fe11768edd req-150e2247-905f-46b8-b55f-ff0d4bc22ca3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Refreshing instance network info cache due to event network-changed-c0ac6308-ae73-4b17-95fa-47f3df3c4f97. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.368 226890 DEBUG oslo_concurrency.lockutils [req-c0c5229d-1ab9-4583-9ba8-79fe11768edd req-150e2247-905f-46b8-b55f-ff0d4bc22ca3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.368 226890 DEBUG oslo_concurrency.lockutils [req-c0c5229d-1ab9-4583-9ba8-79fe11768edd req-150e2247-905f-46b8-b55f-ff0d4bc22ca3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.368 226890 DEBUG nova.network.neutron [req-c0c5229d-1ab9-4583-9ba8-79fe11768edd req-150e2247-905f-46b8-b55f-ff0d4bc22ca3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Refreshing network info cache for port c0ac6308-ae73-4b17-95fa-47f3df3c4f97 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:57:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:57:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:41.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:57:41 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:57:41 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2972576168' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.630 226890 DEBUG oslo_concurrency.processutils [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.659 226890 DEBUG nova.virt.libvirt.vif [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:57:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestInstancesWithCinderVolumes-server-95666363',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testinstanceswithcindervolumes-server-95666363',id=138,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGBbRGX2xZT3D1ftqdKpZwTwb/ukXbRv/O5UyYYLjii3gk46qsw4SNMi6p0GpNIY5l/f9OSIg9UlRsUFQqLszBoQT2vJic2iOBlI6VLyxyg71obcHOZQEGpjfcTfqUsJeQ==',key_name='tempest-TestInstancesWithCinderVolumes-1812188149',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='107c1f3b5b7b413d9a389ca1166e331f',ramdisk_id='',reservation_id='r-1fx4exun',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestInstancesWithCinderVolumes-1174033615',owner_user_name='tempest-TestInstancesWithCinderVolumes-1174033615-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:57:33Z,user_data=None,user_id='ed2c9bd268d1491fa3484d86bcdb9ec6',uuid=a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c0ac6308-ae73-4b17-95fa-47f3df3c4f97", "address": "fa:16:3e:d8:23:95", "network": {"id": "58d966e1-4d26-414a-920e-0be2d77abb59", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1896990059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "107c1f3b5b7b413d9a389ca1166e331f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0ac6308-ae", "ovs_interfaceid": "c0ac6308-ae73-4b17-95fa-47f3df3c4f97", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.660 226890 DEBUG nova.network.os_vif_util [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Converting VIF {"id": "c0ac6308-ae73-4b17-95fa-47f3df3c4f97", "address": "fa:16:3e:d8:23:95", "network": {"id": "58d966e1-4d26-414a-920e-0be2d77abb59", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1896990059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "107c1f3b5b7b413d9a389ca1166e331f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0ac6308-ae", "ovs_interfaceid": "c0ac6308-ae73-4b17-95fa-47f3df3c4f97", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.661 226890 DEBUG nova.network.os_vif_util [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d8:23:95,bridge_name='br-int',has_traffic_filtering=True,id=c0ac6308-ae73-4b17-95fa-47f3df3c4f97,network=Network(58d966e1-4d26-414a-920e-0be2d77abb59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0ac6308-ae') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.664 226890 DEBUG nova.objects.instance [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lazy-loading 'pci_devices' on Instance uuid a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.677 226890 DEBUG nova.virt.libvirt.driver [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:57:41 np0005588920 nova_compute[226886]:  <uuid>a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f</uuid>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:  <name>instance-0000008a</name>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:57:41 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:      <nova:name>tempest-TestInstancesWithCinderVolumes-server-95666363</nova:name>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:57:41</nova:creationTime>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:57:41 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:        <nova:user uuid="ed2c9bd268d1491fa3484d86bcdb9ec6">tempest-TestInstancesWithCinderVolumes-1174033615-project-member</nova:user>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:        <nova:project uuid="107c1f3b5b7b413d9a389ca1166e331f">tempest-TestInstancesWithCinderVolumes-1174033615</nova:project>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:        <nova:port uuid="c0ac6308-ae73-4b17-95fa-47f3df3c4f97">
Jan 20 09:57:41 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:      <entry name="serial">a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f</entry>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:      <entry name="uuid">a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f</entry>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:57:41 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f_disk.config">
Jan 20 09:57:41 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:57:41 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:57:41 np0005588920 nova_compute[226886]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="volumes/volume-94858b1a-370f-4ee8-b017-647fe5082382">
Jan 20 09:57:41 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:57:41 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:      <serial>94858b1a-370f-4ee8-b017-647fe5082382</serial>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:57:41 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:d8:23:95"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:      <target dev="tapc0ac6308-ae"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:57:41 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f/console.log" append="off"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:57:41 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:57:41 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:57:41 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:57:41 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:57:41 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.677 226890 DEBUG nova.compute.manager [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Preparing to wait for external event network-vif-plugged-c0ac6308-ae73-4b17-95fa-47f3df3c4f97 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.678 226890 DEBUG oslo_concurrency.lockutils [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Acquiring lock "a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.678 226890 DEBUG oslo_concurrency.lockutils [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.678 226890 DEBUG oslo_concurrency.lockutils [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.679 226890 DEBUG nova.virt.libvirt.vif [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:57:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestInstancesWithCinderVolumes-server-95666363',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testinstanceswithcindervolumes-server-95666363',id=138,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGBbRGX2xZT3D1ftqdKpZwTwb/ukXbRv/O5UyYYLjii3gk46qsw4SNMi6p0GpNIY5l/f9OSIg9UlRsUFQqLszBoQT2vJic2iOBlI6VLyxyg71obcHOZQEGpjfcTfqUsJeQ==',key_name='tempest-TestInstancesWithCinderVolumes-1812188149',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='107c1f3b5b7b413d9a389ca1166e331f',ramdisk_id='',reservation_id='r-1fx4exun',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestInstancesWithCinderVolumes-1174033615',owner_user_name='tempest-TestInstancesWithCinderVolumes-1174033615-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:57:33Z,user_data=None,user_id='ed2c9bd268d1491fa3484d86bcdb9ec6',uuid=a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c0ac6308-ae73-4b17-95fa-47f3df3c4f97", "address": "fa:16:3e:d8:23:95", "network": {"id": "58d966e1-4d26-414a-920e-0be2d77abb59", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1896990059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "107c1f3b5b7b413d9a389ca1166e331f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0ac6308-ae", "ovs_interfaceid": "c0ac6308-ae73-4b17-95fa-47f3df3c4f97", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.679 226890 DEBUG nova.network.os_vif_util [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Converting VIF {"id": "c0ac6308-ae73-4b17-95fa-47f3df3c4f97", "address": "fa:16:3e:d8:23:95", "network": {"id": "58d966e1-4d26-414a-920e-0be2d77abb59", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1896990059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "107c1f3b5b7b413d9a389ca1166e331f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0ac6308-ae", "ovs_interfaceid": "c0ac6308-ae73-4b17-95fa-47f3df3c4f97", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.679 226890 DEBUG nova.network.os_vif_util [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d8:23:95,bridge_name='br-int',has_traffic_filtering=True,id=c0ac6308-ae73-4b17-95fa-47f3df3c4f97,network=Network(58d966e1-4d26-414a-920e-0be2d77abb59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0ac6308-ae') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.680 226890 DEBUG os_vif [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d8:23:95,bridge_name='br-int',has_traffic_filtering=True,id=c0ac6308-ae73-4b17-95fa-47f3df3c4f97,network=Network(58d966e1-4d26-414a-920e-0be2d77abb59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0ac6308-ae') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.680 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.680 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.681 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.683 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.683 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc0ac6308-ae, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.683 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc0ac6308-ae, col_values=(('external_ids', {'iface-id': 'c0ac6308-ae73-4b17-95fa-47f3df3c4f97', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d8:23:95', 'vm-uuid': 'a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.719 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:41 np0005588920 NetworkManager[49076]: <info>  [1768921061.7206] manager: (tapc0ac6308-ae): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/304)
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.723 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.724 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.724 226890 INFO os_vif [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d8:23:95,bridge_name='br-int',has_traffic_filtering=True,id=c0ac6308-ae73-4b17-95fa-47f3df3c4f97,network=Network(58d966e1-4d26-414a-920e-0be2d77abb59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0ac6308-ae')#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.805 226890 DEBUG nova.virt.libvirt.driver [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.806 226890 DEBUG nova.virt.libvirt.driver [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.806 226890 DEBUG nova.virt.libvirt.driver [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] No VIF found with MAC fa:16:3e:d8:23:95, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.806 226890 INFO nova.virt.libvirt.driver [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Using config drive#033[00m
Jan 20 09:57:41 np0005588920 nova_compute[226886]: 2026-01-20 14:57:41.831 226890 DEBUG nova.storage.rbd_utils [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] rbd image a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:57:42 np0005588920 nova_compute[226886]: 2026-01-20 14:57:42.446 226890 INFO nova.virt.libvirt.driver [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Creating config drive at /var/lib/nova/instances/a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f/disk.config#033[00m
Jan 20 09:57:42 np0005588920 nova_compute[226886]: 2026-01-20 14:57:42.457 226890 DEBUG oslo_concurrency.processutils [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpeh4zycut execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:57:42 np0005588920 nova_compute[226886]: 2026-01-20 14:57:42.596 226890 DEBUG oslo_concurrency.processutils [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpeh4zycut" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:57:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e320 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:57:42 np0005588920 nova_compute[226886]: 2026-01-20 14:57:42.638 226890 DEBUG nova.storage.rbd_utils [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] rbd image a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:57:42 np0005588920 nova_compute[226886]: 2026-01-20 14:57:42.643 226890 DEBUG oslo_concurrency.processutils [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f/disk.config a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:57:42 np0005588920 nova_compute[226886]: 2026-01-20 14:57:42.817 226890 DEBUG nova.network.neutron [-] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:57:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:42.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:42 np0005588920 nova_compute[226886]: 2026-01-20 14:57:42.971 226890 INFO nova.compute.manager [-] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Took 1.63 seconds to deallocate network for instance.#033[00m
Jan 20 09:57:42 np0005588920 nova_compute[226886]: 2026-01-20 14:57:42.976 226890 DEBUG oslo_concurrency.processutils [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f/disk.config a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.333s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:57:42 np0005588920 nova_compute[226886]: 2026-01-20 14:57:42.977 226890 INFO nova.virt.libvirt.driver [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Deleting local config drive /var/lib/nova/instances/a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f/disk.config because it was imported into RBD.#033[00m
Jan 20 09:57:43 np0005588920 kernel: tapc0ac6308-ae: entered promiscuous mode
Jan 20 09:57:43 np0005588920 systemd-udevd[275289]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:57:43 np0005588920 NetworkManager[49076]: <info>  [1768921063.0254] manager: (tapc0ac6308-ae): new Tun device (/org/freedesktop/NetworkManager/Devices/305)
Jan 20 09:57:43 np0005588920 nova_compute[226886]: 2026-01-20 14:57:43.025 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:43 np0005588920 ovn_controller[133971]: 2026-01-20T14:57:43Z|00624|binding|INFO|Claiming lport c0ac6308-ae73-4b17-95fa-47f3df3c4f97 for this chassis.
Jan 20 09:57:43 np0005588920 ovn_controller[133971]: 2026-01-20T14:57:43Z|00625|binding|INFO|c0ac6308-ae73-4b17-95fa-47f3df3c4f97: Claiming fa:16:3e:d8:23:95 10.100.0.6
Jan 20 09:57:43 np0005588920 NetworkManager[49076]: <info>  [1768921063.0367] device (tapc0ac6308-ae): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:43.036 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d8:23:95 10.100.0.6'], port_security=['fa:16:3e:d8:23:95 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58d966e1-4d26-414a-920e-0be2d77abb59', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '107c1f3b5b7b413d9a389ca1166e331f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '207accdf-2d5c-48e9-bf02-5dfcc7d28063', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9a2edf59-0338-43ad-aa77-d6a806c781a6, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=c0ac6308-ae73-4b17-95fa-47f3df3c4f97) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:57:43 np0005588920 NetworkManager[49076]: <info>  [1768921063.0377] device (tapc0ac6308-ae): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:43.037 144128 INFO neutron.agent.ovn.metadata.agent [-] Port c0ac6308-ae73-4b17-95fa-47f3df3c4f97 in datapath 58d966e1-4d26-414a-920e-0be2d77abb59 bound to our chassis#033[00m
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:43.039 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 58d966e1-4d26-414a-920e-0be2d77abb59#033[00m
Jan 20 09:57:43 np0005588920 ovn_controller[133971]: 2026-01-20T14:57:43Z|00626|binding|INFO|Setting lport c0ac6308-ae73-4b17-95fa-47f3df3c4f97 ovn-installed in OVS
Jan 20 09:57:43 np0005588920 ovn_controller[133971]: 2026-01-20T14:57:43Z|00627|binding|INFO|Setting lport c0ac6308-ae73-4b17-95fa-47f3df3c4f97 up in Southbound
Jan 20 09:57:43 np0005588920 nova_compute[226886]: 2026-01-20 14:57:43.046 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:43 np0005588920 nova_compute[226886]: 2026-01-20 14:57:43.049 226890 DEBUG oslo_concurrency.lockutils [None req-8d851848-d4d9-4fcf-b88a-62a37c25fc9c 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:43 np0005588920 nova_compute[226886]: 2026-01-20 14:57:43.050 226890 DEBUG oslo_concurrency.lockutils [None req-8d851848-d4d9-4fcf-b88a-62a37c25fc9c 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:43 np0005588920 nova_compute[226886]: 2026-01-20 14:57:43.050 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:43.053 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[cbbcb3fd-06a1-4915-9535-5a162907ce95]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:43.054 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap58d966e1-41 in ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:43.056 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap58d966e1-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:43.056 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[14776075-e9f5-4291-9187-8b6c9f37dc66]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:43.058 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[27774a71-dcf9-4a20-88df-c0db83d50d41]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:43 np0005588920 systemd-machined[196121]: New machine qemu-63-instance-0000008a.
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:43.068 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[60939438-46b2-4f12-8b4e-c1ba023bdff9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:43 np0005588920 systemd[1]: Started Virtual Machine qemu-63-instance-0000008a.
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:43.093 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[3350c624-eed6-4e96-ada4-f60147da526a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:43.123 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[2aebe03a-67aa-4aa4-a9b0-9560288fa3d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:43 np0005588920 NetworkManager[49076]: <info>  [1768921063.1341] manager: (tap58d966e1-40): new Veth device (/org/freedesktop/NetworkManager/Devices/306)
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:43.132 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[850d4bf1-ead1-4cec-9c5a-6364eca884fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:43.170 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[d6c82252-bb71-4966-81cf-ed55a1c44e9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:43.172 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[eebe4d09-ff2a-475b-8f53-83bc270be32a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:43 np0005588920 NetworkManager[49076]: <info>  [1768921063.1968] device (tap58d966e1-40): carrier: link connected
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:43.203 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[8e6c16a8-a2e0-4573-b56a-527becb776e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:43 np0005588920 nova_compute[226886]: 2026-01-20 14:57:43.218 226890 DEBUG oslo_concurrency.processutils [None req-8d851848-d4d9-4fcf-b88a-62a37c25fc9c 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:43.219 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[9b1724a7-95a9-4a6b-9451-208d096d0659]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap58d966e1-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c9:c8:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 200], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 610488, 'reachable_time': 33355, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 275474, 'error': None, 'target': 'ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:43.243 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b53f9944-d93f-405e-acce-a17c8aaf1194]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec9:c82a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 610488, 'tstamp': 610488}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 275476, 'error': None, 'target': 'ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:43.259 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[41b2cbb7-e1fa-46b0-9a5e-5e055d31b48e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap58d966e1-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c9:c8:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 200], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 610488, 'reachable_time': 33355, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 275477, 'error': None, 'target': 'ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:43.292 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[fc510e0f-d8a2-4208-a28f-2ae8e998dcd2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:43.349 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c78d9a0d-9b29-4261-a93c-890a3b1e7da3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:43.351 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58d966e1-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:43.351 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:43.352 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap58d966e1-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:57:43 np0005588920 NetworkManager[49076]: <info>  [1768921063.3545] manager: (tap58d966e1-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/307)
Jan 20 09:57:43 np0005588920 nova_compute[226886]: 2026-01-20 14:57:43.355 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:43 np0005588920 kernel: tap58d966e1-40: entered promiscuous mode
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:43.357 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap58d966e1-40, col_values=(('external_ids', {'iface-id': '1623097d-35b0-4d71-9dc2-c4d659492102'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:57:43 np0005588920 ovn_controller[133971]: 2026-01-20T14:57:43Z|00628|binding|INFO|Releasing lport 1623097d-35b0-4d71-9dc2-c4d659492102 from this chassis (sb_readonly=0)
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:43.376 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/58d966e1-4d26-414a-920e-0be2d77abb59.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/58d966e1-4d26-414a-920e-0be2d77abb59.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:43.377 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[6e910628-3bb6-4b1d-9dc1-1f09563a8aad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:43.377 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-58d966e1-4d26-414a-920e-0be2d77abb59
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/58d966e1-4d26-414a-920e-0be2d77abb59.pid.haproxy
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID 58d966e1-4d26-414a-920e-0be2d77abb59
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:57:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:43.379 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59', 'env', 'PROCESS_TAG=haproxy-58d966e1-4d26-414a-920e-0be2d77abb59', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/58d966e1-4d26-414a-920e-0be2d77abb59.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:57:43 np0005588920 nova_compute[226886]: 2026-01-20 14:57:43.377 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:43.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:43 np0005588920 nova_compute[226886]: 2026-01-20 14:57:43.458 226890 DEBUG nova.network.neutron [req-c0c5229d-1ab9-4583-9ba8-79fe11768edd req-150e2247-905f-46b8-b55f-ff0d4bc22ca3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Updated VIF entry in instance network info cache for port c0ac6308-ae73-4b17-95fa-47f3df3c4f97. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:57:43 np0005588920 nova_compute[226886]: 2026-01-20 14:57:43.459 226890 DEBUG nova.network.neutron [req-c0c5229d-1ab9-4583-9ba8-79fe11768edd req-150e2247-905f-46b8-b55f-ff0d4bc22ca3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Updating instance_info_cache with network_info: [{"id": "c0ac6308-ae73-4b17-95fa-47f3df3c4f97", "address": "fa:16:3e:d8:23:95", "network": {"id": "58d966e1-4d26-414a-920e-0be2d77abb59", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1896990059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "107c1f3b5b7b413d9a389ca1166e331f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0ac6308-ae", "ovs_interfaceid": "c0ac6308-ae73-4b17-95fa-47f3df3c4f97", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:57:43 np0005588920 nova_compute[226886]: 2026-01-20 14:57:43.480 226890 DEBUG oslo_concurrency.lockutils [req-c0c5229d-1ab9-4583-9ba8-79fe11768edd req-150e2247-905f-46b8-b55f-ff0d4bc22ca3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:57:43 np0005588920 nova_compute[226886]: 2026-01-20 14:57:43.481 226890 DEBUG nova.compute.manager [req-c0c5229d-1ab9-4583-9ba8-79fe11768edd req-150e2247-905f-46b8-b55f-ff0d4bc22ca3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Received event network-vif-unplugged-0293f4ad-1248-4899-81ef-32e616d9a754 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:57:43 np0005588920 nova_compute[226886]: 2026-01-20 14:57:43.481 226890 DEBUG oslo_concurrency.lockutils [req-c0c5229d-1ab9-4583-9ba8-79fe11768edd req-150e2247-905f-46b8-b55f-ff0d4bc22ca3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "cf24bde1-0912-4d63-8959-6799ae8ab043-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:43 np0005588920 nova_compute[226886]: 2026-01-20 14:57:43.482 226890 DEBUG oslo_concurrency.lockutils [req-c0c5229d-1ab9-4583-9ba8-79fe11768edd req-150e2247-905f-46b8-b55f-ff0d4bc22ca3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "cf24bde1-0912-4d63-8959-6799ae8ab043-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:43 np0005588920 nova_compute[226886]: 2026-01-20 14:57:43.482 226890 DEBUG oslo_concurrency.lockutils [req-c0c5229d-1ab9-4583-9ba8-79fe11768edd req-150e2247-905f-46b8-b55f-ff0d4bc22ca3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "cf24bde1-0912-4d63-8959-6799ae8ab043-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:43 np0005588920 nova_compute[226886]: 2026-01-20 14:57:43.482 226890 DEBUG nova.compute.manager [req-c0c5229d-1ab9-4583-9ba8-79fe11768edd req-150e2247-905f-46b8-b55f-ff0d4bc22ca3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] No waiting events found dispatching network-vif-unplugged-0293f4ad-1248-4899-81ef-32e616d9a754 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:57:43 np0005588920 nova_compute[226886]: 2026-01-20 14:57:43.482 226890 DEBUG nova.compute.manager [req-c0c5229d-1ab9-4583-9ba8-79fe11768edd req-150e2247-905f-46b8-b55f-ff0d4bc22ca3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Received event network-vif-unplugged-0293f4ad-1248-4899-81ef-32e616d9a754 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:57:43 np0005588920 nova_compute[226886]: 2026-01-20 14:57:43.483 226890 DEBUG nova.compute.manager [req-c0c5229d-1ab9-4583-9ba8-79fe11768edd req-150e2247-905f-46b8-b55f-ff0d4bc22ca3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Received event network-vif-plugged-0293f4ad-1248-4899-81ef-32e616d9a754 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:57:43 np0005588920 nova_compute[226886]: 2026-01-20 14:57:43.483 226890 DEBUG oslo_concurrency.lockutils [req-c0c5229d-1ab9-4583-9ba8-79fe11768edd req-150e2247-905f-46b8-b55f-ff0d4bc22ca3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "cf24bde1-0912-4d63-8959-6799ae8ab043-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:43 np0005588920 nova_compute[226886]: 2026-01-20 14:57:43.483 226890 DEBUG oslo_concurrency.lockutils [req-c0c5229d-1ab9-4583-9ba8-79fe11768edd req-150e2247-905f-46b8-b55f-ff0d4bc22ca3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "cf24bde1-0912-4d63-8959-6799ae8ab043-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:43 np0005588920 nova_compute[226886]: 2026-01-20 14:57:43.483 226890 DEBUG oslo_concurrency.lockutils [req-c0c5229d-1ab9-4583-9ba8-79fe11768edd req-150e2247-905f-46b8-b55f-ff0d4bc22ca3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "cf24bde1-0912-4d63-8959-6799ae8ab043-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:43 np0005588920 nova_compute[226886]: 2026-01-20 14:57:43.484 226890 DEBUG nova.compute.manager [req-c0c5229d-1ab9-4583-9ba8-79fe11768edd req-150e2247-905f-46b8-b55f-ff0d4bc22ca3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] No waiting events found dispatching network-vif-plugged-0293f4ad-1248-4899-81ef-32e616d9a754 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:57:43 np0005588920 nova_compute[226886]: 2026-01-20 14:57:43.484 226890 WARNING nova.compute.manager [req-c0c5229d-1ab9-4583-9ba8-79fe11768edd req-150e2247-905f-46b8-b55f-ff0d4bc22ca3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Received unexpected event network-vif-plugged-0293f4ad-1248-4899-81ef-32e616d9a754 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:57:43 np0005588920 nova_compute[226886]: 2026-01-20 14:57:43.502 226890 DEBUG nova.compute.manager [req-9499b9ef-0a22-4ac2-af55-764cdd89a45c req-995d2167-b391-4a12-ac88-28721a7ac7f9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Received event network-vif-deleted-0293f4ad-1248-4899-81ef-32e616d9a754 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:57:43 np0005588920 nova_compute[226886]: 2026-01-20 14:57:43.541 226890 DEBUG nova.compute.manager [req-9788d130-e67b-40bc-9e4d-91b9bff4754f req-ba641286-7fe0-4984-baff-0ee707249432 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Received event network-vif-plugged-c0ac6308-ae73-4b17-95fa-47f3df3c4f97 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:57:43 np0005588920 nova_compute[226886]: 2026-01-20 14:57:43.541 226890 DEBUG oslo_concurrency.lockutils [req-9788d130-e67b-40bc-9e4d-91b9bff4754f req-ba641286-7fe0-4984-baff-0ee707249432 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:43 np0005588920 nova_compute[226886]: 2026-01-20 14:57:43.542 226890 DEBUG oslo_concurrency.lockutils [req-9788d130-e67b-40bc-9e4d-91b9bff4754f req-ba641286-7fe0-4984-baff-0ee707249432 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:43 np0005588920 nova_compute[226886]: 2026-01-20 14:57:43.542 226890 DEBUG oslo_concurrency.lockutils [req-9788d130-e67b-40bc-9e4d-91b9bff4754f req-ba641286-7fe0-4984-baff-0ee707249432 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:43 np0005588920 nova_compute[226886]: 2026-01-20 14:57:43.542 226890 DEBUG nova.compute.manager [req-9788d130-e67b-40bc-9e4d-91b9bff4754f req-ba641286-7fe0-4984-baff-0ee707249432 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Processing event network-vif-plugged-c0ac6308-ae73-4b17-95fa-47f3df3c4f97 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:57:43 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:57:43 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2874127425' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:57:43 np0005588920 nova_compute[226886]: 2026-01-20 14:57:43.687 226890 DEBUG oslo_concurrency.processutils [None req-8d851848-d4d9-4fcf-b88a-62a37c25fc9c 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:57:43 np0005588920 nova_compute[226886]: 2026-01-20 14:57:43.695 226890 DEBUG nova.compute.provider_tree [None req-8d851848-d4d9-4fcf-b88a-62a37c25fc9c 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:57:43 np0005588920 nova_compute[226886]: 2026-01-20 14:57:43.713 226890 DEBUG nova.scheduler.client.report [None req-8d851848-d4d9-4fcf-b88a-62a37c25fc9c 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:57:43 np0005588920 nova_compute[226886]: 2026-01-20 14:57:43.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:57:43 np0005588920 nova_compute[226886]: 2026-01-20 14:57:43.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:57:43 np0005588920 nova_compute[226886]: 2026-01-20 14:57:43.734 226890 DEBUG oslo_concurrency.lockutils [None req-8d851848-d4d9-4fcf-b88a-62a37c25fc9c 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.684s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:43 np0005588920 nova_compute[226886]: 2026-01-20 14:57:43.766 226890 INFO nova.scheduler.client.report [None req-8d851848-d4d9-4fcf-b88a-62a37c25fc9c 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Deleted allocations for instance cf24bde1-0912-4d63-8959-6799ae8ab043#033[00m
Jan 20 09:57:43 np0005588920 podman[275529]: 2026-01-20 14:57:43.785984488 +0000 UTC m=+0.060027132 container create 61e705983ad1f62d8e596e0b0feb82f96a1255da48bc536405d21c46838f3ba7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 20 09:57:43 np0005588920 systemd[1]: Started libpod-conmon-61e705983ad1f62d8e596e0b0feb82f96a1255da48bc536405d21c46838f3ba7.scope.
Jan 20 09:57:43 np0005588920 podman[275529]: 2026-01-20 14:57:43.752009951 +0000 UTC m=+0.026052615 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:57:43 np0005588920 nova_compute[226886]: 2026-01-20 14:57:43.861 226890 DEBUG oslo_concurrency.lockutils [None req-8d851848-d4d9-4fcf-b88a-62a37c25fc9c 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "cf24bde1-0912-4d63-8959-6799ae8ab043" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.493s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:43 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:57:43 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a52d1ebc81a53cb7bde009370b81e65f5cbc9d610279afd6d55f011114b6cc21/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:57:43 np0005588920 podman[275529]: 2026-01-20 14:57:43.891639534 +0000 UTC m=+0.165682198 container init 61e705983ad1f62d8e596e0b0feb82f96a1255da48bc536405d21c46838f3ba7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 20 09:57:43 np0005588920 podman[275529]: 2026-01-20 14:57:43.901922284 +0000 UTC m=+0.175964928 container start 61e705983ad1f62d8e596e0b0feb82f96a1255da48bc536405d21c46838f3ba7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:57:43 np0005588920 neutron-haproxy-ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59[275544]: [NOTICE]   (275548) : New worker (275550) forked
Jan 20 09:57:43 np0005588920 neutron-haproxy-ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59[275544]: [NOTICE]   (275548) : Loading success.
Jan 20 09:57:43 np0005588920 nova_compute[226886]: 2026-01-20 14:57:43.960 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "refresh_cache-cf24bde1-0912-4d63-8959-6799ae8ab043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:57:43 np0005588920 nova_compute[226886]: 2026-01-20 14:57:43.960 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquired lock "refresh_cache-cf24bde1-0912-4d63-8959-6799ae8ab043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:57:43 np0005588920 nova_compute[226886]: 2026-01-20 14:57:43.960 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 09:57:43 np0005588920 nova_compute[226886]: 2026-01-20 14:57:43.979 226890 DEBUG nova.compute.utils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Can not refresh info_cache because instance was not found refresh_info_cache_for_instance /usr/lib/python3.9/site-packages/nova/compute/utils.py:1010#033[00m
Jan 20 09:57:44 np0005588920 nova_compute[226886]: 2026-01-20 14:57:44.154 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:57:44 np0005588920 nova_compute[226886]: 2026-01-20 14:57:44.184 226890 DEBUG nova.compute.manager [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:57:44 np0005588920 nova_compute[226886]: 2026-01-20 14:57:44.185 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921064.184022, a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:57:44 np0005588920 nova_compute[226886]: 2026-01-20 14:57:44.186 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] VM Started (Lifecycle Event)#033[00m
Jan 20 09:57:44 np0005588920 nova_compute[226886]: 2026-01-20 14:57:44.188 226890 DEBUG nova.virt.libvirt.driver [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:57:44 np0005588920 nova_compute[226886]: 2026-01-20 14:57:44.190 226890 INFO nova.virt.libvirt.driver [-] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Instance spawned successfully.#033[00m
Jan 20 09:57:44 np0005588920 nova_compute[226886]: 2026-01-20 14:57:44.190 226890 DEBUG nova.virt.libvirt.driver [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:57:44 np0005588920 nova_compute[226886]: 2026-01-20 14:57:44.209 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:57:44 np0005588920 nova_compute[226886]: 2026-01-20 14:57:44.211 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:57:44 np0005588920 nova_compute[226886]: 2026-01-20 14:57:44.218 226890 DEBUG nova.virt.libvirt.driver [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:57:44 np0005588920 nova_compute[226886]: 2026-01-20 14:57:44.219 226890 DEBUG nova.virt.libvirt.driver [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:57:44 np0005588920 nova_compute[226886]: 2026-01-20 14:57:44.219 226890 DEBUG nova.virt.libvirt.driver [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:57:44 np0005588920 nova_compute[226886]: 2026-01-20 14:57:44.219 226890 DEBUG nova.virt.libvirt.driver [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:57:44 np0005588920 nova_compute[226886]: 2026-01-20 14:57:44.219 226890 DEBUG nova.virt.libvirt.driver [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:57:44 np0005588920 nova_compute[226886]: 2026-01-20 14:57:44.220 226890 DEBUG nova.virt.libvirt.driver [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:57:44 np0005588920 nova_compute[226886]: 2026-01-20 14:57:44.258 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:57:44 np0005588920 nova_compute[226886]: 2026-01-20 14:57:44.258 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921064.185208, a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:57:44 np0005588920 nova_compute[226886]: 2026-01-20 14:57:44.259 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:57:44 np0005588920 nova_compute[226886]: 2026-01-20 14:57:44.285 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:57:44 np0005588920 nova_compute[226886]: 2026-01-20 14:57:44.288 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921064.1873212, a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:57:44 np0005588920 nova_compute[226886]: 2026-01-20 14:57:44.288 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:57:44 np0005588920 nova_compute[226886]: 2026-01-20 14:57:44.377 226890 INFO nova.compute.manager [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Took 8.10 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:57:44 np0005588920 nova_compute[226886]: 2026-01-20 14:57:44.378 226890 DEBUG nova.compute.manager [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:57:44 np0005588920 nova_compute[226886]: 2026-01-20 14:57:44.386 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:57:44 np0005588920 nova_compute[226886]: 2026-01-20 14:57:44.388 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:57:44 np0005588920 nova_compute[226886]: 2026-01-20 14:57:44.577 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:57:44 np0005588920 nova_compute[226886]: 2026-01-20 14:57:44.636 226890 INFO nova.compute.manager [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Took 11.72 seconds to build instance.#033[00m
Jan 20 09:57:44 np0005588920 nova_compute[226886]: 2026-01-20 14:57:44.657 226890 DEBUG oslo_concurrency.lockutils [None req-1abb3938-366f-4c48-ae74-92acbf471b68 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.846s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:44.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e321 e321: 3 total, 3 up, 3 in
Jan 20 09:57:45 np0005588920 nova_compute[226886]: 2026-01-20 14:57:45.013 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:57:45 np0005588920 nova_compute[226886]: 2026-01-20 14:57:45.054 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Releasing lock "refresh_cache-cf24bde1-0912-4d63-8959-6799ae8ab043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:57:45 np0005588920 nova_compute[226886]: 2026-01-20 14:57:45.054 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 09:57:45 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 09:57:45 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/607676245' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 09:57:45 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 09:57:45 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/607676245' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 09:57:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:45.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:45 np0005588920 nova_compute[226886]: 2026-01-20 14:57:45.673 226890 DEBUG nova.compute.manager [req-dc91b8b2-5d3a-48c2-b1ab-75df75a81c6d req-508e028c-521d-4ff1-b647-609e3f6b57e0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Received event network-vif-plugged-c0ac6308-ae73-4b17-95fa-47f3df3c4f97 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:57:45 np0005588920 nova_compute[226886]: 2026-01-20 14:57:45.673 226890 DEBUG oslo_concurrency.lockutils [req-dc91b8b2-5d3a-48c2-b1ab-75df75a81c6d req-508e028c-521d-4ff1-b647-609e3f6b57e0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:45 np0005588920 nova_compute[226886]: 2026-01-20 14:57:45.673 226890 DEBUG oslo_concurrency.lockutils [req-dc91b8b2-5d3a-48c2-b1ab-75df75a81c6d req-508e028c-521d-4ff1-b647-609e3f6b57e0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:45 np0005588920 nova_compute[226886]: 2026-01-20 14:57:45.674 226890 DEBUG oslo_concurrency.lockutils [req-dc91b8b2-5d3a-48c2-b1ab-75df75a81c6d req-508e028c-521d-4ff1-b647-609e3f6b57e0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:45 np0005588920 nova_compute[226886]: 2026-01-20 14:57:45.674 226890 DEBUG nova.compute.manager [req-dc91b8b2-5d3a-48c2-b1ab-75df75a81c6d req-508e028c-521d-4ff1-b647-609e3f6b57e0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] No waiting events found dispatching network-vif-plugged-c0ac6308-ae73-4b17-95fa-47f3df3c4f97 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:57:45 np0005588920 nova_compute[226886]: 2026-01-20 14:57:45.674 226890 WARNING nova.compute.manager [req-dc91b8b2-5d3a-48c2-b1ab-75df75a81c6d req-508e028c-521d-4ff1-b647-609e3f6b57e0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Received unexpected event network-vif-plugged-c0ac6308-ae73-4b17-95fa-47f3df3c4f97 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:57:45 np0005588920 nova_compute[226886]: 2026-01-20 14:57:45.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:57:45 np0005588920 nova_compute[226886]: 2026-01-20 14:57:45.750 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:45 np0005588920 nova_compute[226886]: 2026-01-20 14:57:45.751 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:45 np0005588920 nova_compute[226886]: 2026-01-20 14:57:45.751 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:45 np0005588920 nova_compute[226886]: 2026-01-20 14:57:45.752 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:57:45 np0005588920 nova_compute[226886]: 2026-01-20 14:57:45.753 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:57:45 np0005588920 nova_compute[226886]: 2026-01-20 14:57:45.853 226890 DEBUG oslo_concurrency.lockutils [None req-73ebe117-101a-4304-9249-08037d3181f3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Acquiring lock "ce0152a6-7d4d-4eac-9587-a43ad934d9cc" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:45 np0005588920 nova_compute[226886]: 2026-01-20 14:57:45.854 226890 DEBUG oslo_concurrency.lockutils [None req-73ebe117-101a-4304-9249-08037d3181f3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "ce0152a6-7d4d-4eac-9587-a43ad934d9cc" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:45 np0005588920 nova_compute[226886]: 2026-01-20 14:57:45.854 226890 DEBUG oslo_concurrency.lockutils [None req-73ebe117-101a-4304-9249-08037d3181f3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Acquiring lock "ce0152a6-7d4d-4eac-9587-a43ad934d9cc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:45 np0005588920 nova_compute[226886]: 2026-01-20 14:57:45.854 226890 DEBUG oslo_concurrency.lockutils [None req-73ebe117-101a-4304-9249-08037d3181f3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "ce0152a6-7d4d-4eac-9587-a43ad934d9cc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:45 np0005588920 nova_compute[226886]: 2026-01-20 14:57:45.854 226890 DEBUG oslo_concurrency.lockutils [None req-73ebe117-101a-4304-9249-08037d3181f3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "ce0152a6-7d4d-4eac-9587-a43ad934d9cc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:45 np0005588920 nova_compute[226886]: 2026-01-20 14:57:45.856 226890 INFO nova.compute.manager [None req-73ebe117-101a-4304-9249-08037d3181f3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Terminating instance#033[00m
Jan 20 09:57:45 np0005588920 nova_compute[226886]: 2026-01-20 14:57:45.856 226890 DEBUG nova.compute.manager [None req-73ebe117-101a-4304-9249-08037d3181f3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:57:45 np0005588920 kernel: tap083e3cc0-e6 (unregistering): left promiscuous mode
Jan 20 09:57:45 np0005588920 NetworkManager[49076]: <info>  [1768921065.9124] device (tap083e3cc0-e6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:57:45 np0005588920 ovn_controller[133971]: 2026-01-20T14:57:45Z|00629|binding|INFO|Releasing lport 083e3cc0-e665-4049-a47b-233abf07b9d5 from this chassis (sb_readonly=0)
Jan 20 09:57:45 np0005588920 ovn_controller[133971]: 2026-01-20T14:57:45Z|00630|binding|INFO|Setting lport 083e3cc0-e665-4049-a47b-233abf07b9d5 down in Southbound
Jan 20 09:57:45 np0005588920 ovn_controller[133971]: 2026-01-20T14:57:45Z|00631|binding|INFO|Removing iface tap083e3cc0-e6 ovn-installed in OVS
Jan 20 09:57:45 np0005588920 nova_compute[226886]: 2026-01-20 14:57:45.922 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:45.932 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6a:15:6d 10.100.0.5'], port_security=['fa:16:3e:6a:15:6d 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'ce0152a6-7d4d-4eac-9587-a43ad934d9cc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b3b1b7f5b4f84b5abbc401eb577c85c0', 'neutron:revision_number': '4', 'neutron:security_group_ids': '800ce09e-d4c4-4be1-b862-b09f6926701e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3273589e-5585-406c-9611-87f758b0e521, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=083e3cc0-e665-4049-a47b-233abf07b9d5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:57:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:45.933 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 083e3cc0-e665-4049-a47b-233abf07b9d5 in datapath 41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce unbound from our chassis#033[00m
Jan 20 09:57:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:45.935 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:57:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:45.937 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c2b67ec9-55a1-402d-be31-27b701460928]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:45.937 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce namespace which is not needed anymore#033[00m
Jan 20 09:57:45 np0005588920 nova_compute[226886]: 2026-01-20 14:57:45.947 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:45 np0005588920 systemd[1]: machine-qemu\x2d56\x2dinstance\x2d0000007a.scope: Deactivated successfully.
Jan 20 09:57:45 np0005588920 systemd[1]: machine-qemu\x2d56\x2dinstance\x2d0000007a.scope: Consumed 23.006s CPU time.
Jan 20 09:57:45 np0005588920 systemd-machined[196121]: Machine qemu-56-instance-0000007a terminated.
Jan 20 09:57:46 np0005588920 neutron-haproxy-ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce[270328]: [NOTICE]   (270332) : haproxy version is 2.8.14-c23fe91
Jan 20 09:57:46 np0005588920 neutron-haproxy-ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce[270328]: [NOTICE]   (270332) : path to executable is /usr/sbin/haproxy
Jan 20 09:57:46 np0005588920 neutron-haproxy-ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce[270328]: [WARNING]  (270332) : Exiting Master process...
Jan 20 09:57:46 np0005588920 neutron-haproxy-ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce[270328]: [ALERT]    (270332) : Current worker (270334) exited with code 143 (Terminated)
Jan 20 09:57:46 np0005588920 neutron-haproxy-ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce[270328]: [WARNING]  (270332) : All workers exited. Exiting... (0)
Jan 20 09:57:46 np0005588920 systemd[1]: libpod-303ceca778e56624825691efee2a35e048a5758452b5324748fe3f3234f98eb8.scope: Deactivated successfully.
Jan 20 09:57:46 np0005588920 conmon[270328]: conmon 303ceca778e566248256 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-303ceca778e56624825691efee2a35e048a5758452b5324748fe3f3234f98eb8.scope/container/memory.events
Jan 20 09:57:46 np0005588920 podman[275642]: 2026-01-20 14:57:46.073852392 +0000 UTC m=+0.048736304 container died 303ceca778e56624825691efee2a35e048a5758452b5324748fe3f3234f98eb8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 20 09:57:46 np0005588920 nova_compute[226886]: 2026-01-20 14:57:46.097 226890 INFO nova.virt.libvirt.driver [-] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Instance destroyed successfully.#033[00m
Jan 20 09:57:46 np0005588920 nova_compute[226886]: 2026-01-20 14:57:46.099 226890 DEBUG nova.objects.instance [None req-73ebe117-101a-4304-9249-08037d3181f3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lazy-loading 'resources' on Instance uuid ce0152a6-7d4d-4eac-9587-a43ad934d9cc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:57:46 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-303ceca778e56624825691efee2a35e048a5758452b5324748fe3f3234f98eb8-userdata-shm.mount: Deactivated successfully.
Jan 20 09:57:46 np0005588920 systemd[1]: var-lib-containers-storage-overlay-33682d6e7c63feaddac6a82c3caf43b99746288732ae100d8bacf369bba08d8a-merged.mount: Deactivated successfully.
Jan 20 09:57:46 np0005588920 podman[275642]: 2026-01-20 14:57:46.12278345 +0000 UTC m=+0.097667322 container cleanup 303ceca778e56624825691efee2a35e048a5758452b5324748fe3f3234f98eb8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 20 09:57:46 np0005588920 nova_compute[226886]: 2026-01-20 14:57:46.127 226890 DEBUG nova.virt.libvirt.vif [None req-73ebe117-101a-4304-9249-08037d3181f3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:53:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1580304238',display_name='tempest-ServerActionsTestOtherB-server-1580304238',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1580304238',id=122,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:53:36Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b3b1b7f5b4f84b5abbc401eb577c85c0',ramdisk_id='',reservation_id='r-932d0bwg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio'
,image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1136521362',owner_user_name='tempest-ServerActionsTestOtherB-1136521362-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:53:36Z,user_data=None,user_id='215db37373dc4ae5a75cbd6866f471da',uuid=ce0152a6-7d4d-4eac-9587-a43ad934d9cc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "083e3cc0-e665-4049-a47b-233abf07b9d5", "address": "fa:16:3e:6a:15:6d", "network": {"id": "41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1445030024-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3b1b7f5b4f84b5abbc401eb577c85c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap083e3cc0-e6", "ovs_interfaceid": "083e3cc0-e665-4049-a47b-233abf07b9d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:57:46 np0005588920 nova_compute[226886]: 2026-01-20 14:57:46.127 226890 DEBUG nova.network.os_vif_util [None req-73ebe117-101a-4304-9249-08037d3181f3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Converting VIF {"id": "083e3cc0-e665-4049-a47b-233abf07b9d5", "address": "fa:16:3e:6a:15:6d", "network": {"id": "41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1445030024-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3b1b7f5b4f84b5abbc401eb577c85c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap083e3cc0-e6", "ovs_interfaceid": "083e3cc0-e665-4049-a47b-233abf07b9d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:57:46 np0005588920 nova_compute[226886]: 2026-01-20 14:57:46.128 226890 DEBUG nova.network.os_vif_util [None req-73ebe117-101a-4304-9249-08037d3181f3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6a:15:6d,bridge_name='br-int',has_traffic_filtering=True,id=083e3cc0-e665-4049-a47b-233abf07b9d5,network=Network(41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap083e3cc0-e6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:57:46 np0005588920 nova_compute[226886]: 2026-01-20 14:57:46.129 226890 DEBUG os_vif [None req-73ebe117-101a-4304-9249-08037d3181f3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:6a:15:6d,bridge_name='br-int',has_traffic_filtering=True,id=083e3cc0-e665-4049-a47b-233abf07b9d5,network=Network(41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap083e3cc0-e6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:57:46 np0005588920 nova_compute[226886]: 2026-01-20 14:57:46.131 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:46 np0005588920 nova_compute[226886]: 2026-01-20 14:57:46.131 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap083e3cc0-e6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:57:46 np0005588920 systemd[1]: libpod-conmon-303ceca778e56624825691efee2a35e048a5758452b5324748fe3f3234f98eb8.scope: Deactivated successfully.
Jan 20 09:57:46 np0005588920 nova_compute[226886]: 2026-01-20 14:57:46.150 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:46 np0005588920 nova_compute[226886]: 2026-01-20 14:57:46.153 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:46 np0005588920 nova_compute[226886]: 2026-01-20 14:57:46.155 226890 INFO os_vif [None req-73ebe117-101a-4304-9249-08037d3181f3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:6a:15:6d,bridge_name='br-int',has_traffic_filtering=True,id=083e3cc0-e665-4049-a47b-233abf07b9d5,network=Network(41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap083e3cc0-e6')#033[00m
Jan 20 09:57:46 np0005588920 podman[275684]: 2026-01-20 14:57:46.192544495 +0000 UTC m=+0.046257794 container remove 303ceca778e56624825691efee2a35e048a5758452b5324748fe3f3234f98eb8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 20 09:57:46 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:46.197 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[276a67e6-9de8-41bc-a85c-65dc31950ae4]: (4, ('Tue Jan 20 02:57:46 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce (303ceca778e56624825691efee2a35e048a5758452b5324748fe3f3234f98eb8)\n303ceca778e56624825691efee2a35e048a5758452b5324748fe3f3234f98eb8\nTue Jan 20 02:57:46 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce (303ceca778e56624825691efee2a35e048a5758452b5324748fe3f3234f98eb8)\n303ceca778e56624825691efee2a35e048a5758452b5324748fe3f3234f98eb8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:46 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:46.199 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[313dbb88-8cc6-4aec-94a7-392e6dad04d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:46 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:46.200 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap41a1a3fe-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:57:46 np0005588920 nova_compute[226886]: 2026-01-20 14:57:46.202 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:46 np0005588920 kernel: tap41a1a3fe-f0: left promiscuous mode
Jan 20 09:57:46 np0005588920 nova_compute[226886]: 2026-01-20 14:57:46.205 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:46 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:57:46 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3342654723' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:57:46 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:46.207 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e1a88536-d5ca-4f83-903c-6f5a25d6f4fb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:46 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:46.217 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[52902c09-1955-4921-940b-84f7353bd8e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:46 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:46.219 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[853b985f-7d38-46fc-a313-b3fac7feadfb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:46 np0005588920 nova_compute[226886]: 2026-01-20 14:57:46.223 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:46 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:46.233 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ef63906f-8898-4338-bb31-f26fb52ce218]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 585732, 'reachable_time': 31113, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 275716, 'error': None, 'target': 'ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:46 np0005588920 nova_compute[226886]: 2026-01-20 14:57:46.239 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:57:46 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:46.240 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:57:46 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:57:46.240 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[f91ad8ac-98cf-4ec3-9985-7614dfe52075]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:46 np0005588920 systemd[1]: run-netns-ovnmeta\x2d41a1a3fe\x2df6f8\x2d4375\x2d9b0f\x2da4d4bb269cce.mount: Deactivated successfully.
Jan 20 09:57:46 np0005588920 nova_compute[226886]: 2026-01-20 14:57:46.270 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:46 np0005588920 nova_compute[226886]: 2026-01-20 14:57:46.332 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-0000008a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:57:46 np0005588920 nova_compute[226886]: 2026-01-20 14:57:46.333 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-0000008a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:57:46 np0005588920 nova_compute[226886]: 2026-01-20 14:57:46.335 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-0000007a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:57:46 np0005588920 nova_compute[226886]: 2026-01-20 14:57:46.335 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-0000007a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:57:46 np0005588920 nova_compute[226886]: 2026-01-20 14:57:46.507 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:57:46 np0005588920 nova_compute[226886]: 2026-01-20 14:57:46.508 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4198MB free_disk=20.83990478515625GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:57:46 np0005588920 nova_compute[226886]: 2026-01-20 14:57:46.509 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:46 np0005588920 nova_compute[226886]: 2026-01-20 14:57:46.509 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:46 np0005588920 nova_compute[226886]: 2026-01-20 14:57:46.587 226890 INFO nova.virt.libvirt.driver [None req-73ebe117-101a-4304-9249-08037d3181f3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Deleting instance files /var/lib/nova/instances/ce0152a6-7d4d-4eac-9587-a43ad934d9cc_del#033[00m
Jan 20 09:57:46 np0005588920 nova_compute[226886]: 2026-01-20 14:57:46.588 226890 INFO nova.virt.libvirt.driver [None req-73ebe117-101a-4304-9249-08037d3181f3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Deletion of /var/lib/nova/instances/ce0152a6-7d4d-4eac-9587-a43ad934d9cc_del complete#033[00m
Jan 20 09:57:46 np0005588920 nova_compute[226886]: 2026-01-20 14:57:46.660 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance ce0152a6-7d4d-4eac-9587-a43ad934d9cc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:57:46 np0005588920 nova_compute[226886]: 2026-01-20 14:57:46.660 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:57:46 np0005588920 nova_compute[226886]: 2026-01-20 14:57:46.660 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:57:46 np0005588920 nova_compute[226886]: 2026-01-20 14:57:46.661 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:57:46 np0005588920 nova_compute[226886]: 2026-01-20 14:57:46.666 226890 INFO nova.compute.manager [None req-73ebe117-101a-4304-9249-08037d3181f3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Took 0.81 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:57:46 np0005588920 nova_compute[226886]: 2026-01-20 14:57:46.666 226890 DEBUG oslo.service.loopingcall [None req-73ebe117-101a-4304-9249-08037d3181f3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:57:46 np0005588920 nova_compute[226886]: 2026-01-20 14:57:46.667 226890 DEBUG nova.compute.manager [-] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:57:46 np0005588920 nova_compute[226886]: 2026-01-20 14:57:46.667 226890 DEBUG nova.network.neutron [-] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:57:46 np0005588920 nova_compute[226886]: 2026-01-20 14:57:46.699 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:57:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:46.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:57:47 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1505514556' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:57:47 np0005588920 nova_compute[226886]: 2026-01-20 14:57:47.187 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:57:47 np0005588920 nova_compute[226886]: 2026-01-20 14:57:47.194 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:57:47 np0005588920 nova_compute[226886]: 2026-01-20 14:57:47.266 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:57:47 np0005588920 nova_compute[226886]: 2026-01-20 14:57:47.315 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:57:47 np0005588920 nova_compute[226886]: 2026-01-20 14:57:47.316 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.807s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:57:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:47.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:57:47 np0005588920 nova_compute[226886]: 2026-01-20 14:57:47.447 226890 DEBUG nova.network.neutron [-] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:57:47 np0005588920 nova_compute[226886]: 2026-01-20 14:57:47.515 226890 INFO nova.compute.manager [-] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Took 0.85 seconds to deallocate network for instance.#033[00m
Jan 20 09:57:47 np0005588920 nova_compute[226886]: 2026-01-20 14:57:47.619 226890 DEBUG oslo_concurrency.lockutils [None req-36f487dc-db76-4e2c-b521-f3f0bb5c5364 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Acquiring lock "a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:47 np0005588920 nova_compute[226886]: 2026-01-20 14:57:47.620 226890 DEBUG oslo_concurrency.lockutils [None req-36f487dc-db76-4e2c-b521-f3f0bb5c5364 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:57:47 np0005588920 nova_compute[226886]: 2026-01-20 14:57:47.744 226890 DEBUG oslo_concurrency.lockutils [None req-73ebe117-101a-4304-9249-08037d3181f3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:47 np0005588920 nova_compute[226886]: 2026-01-20 14:57:47.745 226890 DEBUG oslo_concurrency.lockutils [None req-73ebe117-101a-4304-9249-08037d3181f3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:47 np0005588920 nova_compute[226886]: 2026-01-20 14:57:47.749 226890 DEBUG nova.objects.instance [None req-36f487dc-db76-4e2c-b521-f3f0bb5c5364 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lazy-loading 'flavor' on Instance uuid a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:57:47 np0005588920 nova_compute[226886]: 2026-01-20 14:57:47.828 226890 DEBUG oslo_concurrency.lockutils [None req-36f487dc-db76-4e2c-b521-f3f0bb5c5364 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.208s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:47 np0005588920 nova_compute[226886]: 2026-01-20 14:57:47.840 226890 DEBUG oslo_concurrency.processutils [None req-73ebe117-101a-4304-9249-08037d3181f3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:57:48 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:57:48 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4255159192' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:57:48 np0005588920 nova_compute[226886]: 2026-01-20 14:57:48.308 226890 DEBUG oslo_concurrency.processutils [None req-73ebe117-101a-4304-9249-08037d3181f3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:57:48 np0005588920 nova_compute[226886]: 2026-01-20 14:57:48.315 226890 DEBUG oslo_concurrency.lockutils [None req-36f487dc-db76-4e2c-b521-f3f0bb5c5364 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Acquiring lock "a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:48 np0005588920 nova_compute[226886]: 2026-01-20 14:57:48.315 226890 DEBUG oslo_concurrency.lockutils [None req-36f487dc-db76-4e2c-b521-f3f0bb5c5364 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:48 np0005588920 nova_compute[226886]: 2026-01-20 14:57:48.316 226890 INFO nova.compute.manager [None req-36f487dc-db76-4e2c-b521-f3f0bb5c5364 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Attaching volume 06b5ccf9-bd72-4532-983a-75b3b42cbfe9 to /dev/vdb#033[00m
Jan 20 09:57:48 np0005588920 nova_compute[226886]: 2026-01-20 14:57:48.322 226890 DEBUG nova.compute.provider_tree [None req-73ebe117-101a-4304-9249-08037d3181f3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:57:48 np0005588920 nova_compute[226886]: 2026-01-20 14:57:48.349 226890 DEBUG nova.scheduler.client.report [None req-73ebe117-101a-4304-9249-08037d3181f3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:57:48 np0005588920 nova_compute[226886]: 2026-01-20 14:57:48.397 226890 DEBUG oslo_concurrency.lockutils [None req-73ebe117-101a-4304-9249-08037d3181f3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.652s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:48 np0005588920 nova_compute[226886]: 2026-01-20 14:57:48.476 226890 INFO nova.scheduler.client.report [None req-73ebe117-101a-4304-9249-08037d3181f3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Deleted allocations for instance ce0152a6-7d4d-4eac-9587-a43ad934d9cc#033[00m
Jan 20 09:57:48 np0005588920 nova_compute[226886]: 2026-01-20 14:57:48.650 226890 DEBUG nova.compute.manager [req-1350f10c-1b0b-4f1d-a345-3370b069d7a1 req-996e6784-114d-4411-926e-652d81a75f2f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Received event network-vif-unplugged-083e3cc0-e665-4049-a47b-233abf07b9d5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:57:48 np0005588920 nova_compute[226886]: 2026-01-20 14:57:48.652 226890 DEBUG oslo_concurrency.lockutils [req-1350f10c-1b0b-4f1d-a345-3370b069d7a1 req-996e6784-114d-4411-926e-652d81a75f2f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "ce0152a6-7d4d-4eac-9587-a43ad934d9cc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:48 np0005588920 nova_compute[226886]: 2026-01-20 14:57:48.652 226890 DEBUG oslo_concurrency.lockutils [req-1350f10c-1b0b-4f1d-a345-3370b069d7a1 req-996e6784-114d-4411-926e-652d81a75f2f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "ce0152a6-7d4d-4eac-9587-a43ad934d9cc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:48 np0005588920 nova_compute[226886]: 2026-01-20 14:57:48.652 226890 DEBUG oslo_concurrency.lockutils [req-1350f10c-1b0b-4f1d-a345-3370b069d7a1 req-996e6784-114d-4411-926e-652d81a75f2f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "ce0152a6-7d4d-4eac-9587-a43ad934d9cc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:48 np0005588920 nova_compute[226886]: 2026-01-20 14:57:48.652 226890 DEBUG nova.compute.manager [req-1350f10c-1b0b-4f1d-a345-3370b069d7a1 req-996e6784-114d-4411-926e-652d81a75f2f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] No waiting events found dispatching network-vif-unplugged-083e3cc0-e665-4049-a47b-233abf07b9d5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:57:48 np0005588920 nova_compute[226886]: 2026-01-20 14:57:48.653 226890 WARNING nova.compute.manager [req-1350f10c-1b0b-4f1d-a345-3370b069d7a1 req-996e6784-114d-4411-926e-652d81a75f2f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Received unexpected event network-vif-unplugged-083e3cc0-e665-4049-a47b-233abf07b9d5 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 09:57:48 np0005588920 nova_compute[226886]: 2026-01-20 14:57:48.653 226890 DEBUG nova.compute.manager [req-1350f10c-1b0b-4f1d-a345-3370b069d7a1 req-996e6784-114d-4411-926e-652d81a75f2f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Received event network-vif-plugged-083e3cc0-e665-4049-a47b-233abf07b9d5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:57:48 np0005588920 nova_compute[226886]: 2026-01-20 14:57:48.653 226890 DEBUG oslo_concurrency.lockutils [req-1350f10c-1b0b-4f1d-a345-3370b069d7a1 req-996e6784-114d-4411-926e-652d81a75f2f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "ce0152a6-7d4d-4eac-9587-a43ad934d9cc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:48 np0005588920 nova_compute[226886]: 2026-01-20 14:57:48.653 226890 DEBUG oslo_concurrency.lockutils [req-1350f10c-1b0b-4f1d-a345-3370b069d7a1 req-996e6784-114d-4411-926e-652d81a75f2f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "ce0152a6-7d4d-4eac-9587-a43ad934d9cc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:48 np0005588920 nova_compute[226886]: 2026-01-20 14:57:48.654 226890 DEBUG oslo_concurrency.lockutils [req-1350f10c-1b0b-4f1d-a345-3370b069d7a1 req-996e6784-114d-4411-926e-652d81a75f2f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "ce0152a6-7d4d-4eac-9587-a43ad934d9cc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:48 np0005588920 nova_compute[226886]: 2026-01-20 14:57:48.654 226890 DEBUG nova.compute.manager [req-1350f10c-1b0b-4f1d-a345-3370b069d7a1 req-996e6784-114d-4411-926e-652d81a75f2f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] No waiting events found dispatching network-vif-plugged-083e3cc0-e665-4049-a47b-233abf07b9d5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:57:48 np0005588920 nova_compute[226886]: 2026-01-20 14:57:48.654 226890 WARNING nova.compute.manager [req-1350f10c-1b0b-4f1d-a345-3370b069d7a1 req-996e6784-114d-4411-926e-652d81a75f2f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Received unexpected event network-vif-plugged-083e3cc0-e665-4049-a47b-233abf07b9d5 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 09:57:48 np0005588920 nova_compute[226886]: 2026-01-20 14:57:48.654 226890 DEBUG nova.compute.manager [req-1350f10c-1b0b-4f1d-a345-3370b069d7a1 req-996e6784-114d-4411-926e-652d81a75f2f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Received event network-vif-deleted-083e3cc0-e665-4049-a47b-233abf07b9d5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:57:48 np0005588920 nova_compute[226886]: 2026-01-20 14:57:48.722 226890 DEBUG oslo_concurrency.lockutils [None req-73ebe117-101a-4304-9249-08037d3181f3 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "ce0152a6-7d4d-4eac-9587-a43ad934d9cc" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.868s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:48 np0005588920 nova_compute[226886]: 2026-01-20 14:57:48.832 226890 DEBUG os_brick.utils [None req-36f487dc-db76-4e2c-b521-f3f0bb5c5364 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 20 09:57:48 np0005588920 nova_compute[226886]: 2026-01-20 14:57:48.834 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:57:48 np0005588920 nova_compute[226886]: 2026-01-20 14:57:48.851 231437 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.016s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:57:48 np0005588920 nova_compute[226886]: 2026-01-20 14:57:48.851 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[6a71c758-5882-440c-baa1-75ebf4652544]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:48 np0005588920 nova_compute[226886]: 2026-01-20 14:57:48.853 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:57:48 np0005588920 nova_compute[226886]: 2026-01-20 14:57:48.862 231437 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:57:48 np0005588920 nova_compute[226886]: 2026-01-20 14:57:48.863 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[4859afad-bce9-45a7-9889-7fb4ae392647]: (4, ('InitiatorName=iqn.1994-05.com.redhat:f7e7b2e5d28e', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:48 np0005588920 nova_compute[226886]: 2026-01-20 14:57:48.865 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:57:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:57:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:48.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:57:48 np0005588920 nova_compute[226886]: 2026-01-20 14:57:48.875 231437 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:57:48 np0005588920 nova_compute[226886]: 2026-01-20 14:57:48.876 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[b3f7a6d3-3cbd-4aec-aa09-6aa95dd4c767]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:48 np0005588920 nova_compute[226886]: 2026-01-20 14:57:48.879 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[a78492ec-b68e-40b5-9c9f-8cd9d3f61053]: (4, '1190ec40-2b89-4358-8dec-733c5829fbed') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:48 np0005588920 nova_compute[226886]: 2026-01-20 14:57:48.880 226890 DEBUG oslo_concurrency.processutils [None req-36f487dc-db76-4e2c-b521-f3f0bb5c5364 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:57:48 np0005588920 nova_compute[226886]: 2026-01-20 14:57:48.922 226890 DEBUG oslo_concurrency.processutils [None req-36f487dc-db76-4e2c-b521-f3f0bb5c5364 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] CMD "nvme version" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:57:48 np0005588920 nova_compute[226886]: 2026-01-20 14:57:48.926 226890 DEBUG os_brick.initiator.connectors.lightos [None req-36f487dc-db76-4e2c-b521-f3f0bb5c5364 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 20 09:57:48 np0005588920 nova_compute[226886]: 2026-01-20 14:57:48.927 226890 DEBUG os_brick.initiator.connectors.lightos [None req-36f487dc-db76-4e2c-b521-f3f0bb5c5364 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 20 09:57:48 np0005588920 nova_compute[226886]: 2026-01-20 14:57:48.928 226890 DEBUG os_brick.initiator.connectors.lightos [None req-36f487dc-db76-4e2c-b521-f3f0bb5c5364 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 20 09:57:48 np0005588920 nova_compute[226886]: 2026-01-20 14:57:48.929 226890 DEBUG os_brick.utils [None req-36f487dc-db76-4e2c-b521-f3f0bb5c5364 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] <== get_connector_properties: return (95ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:f7e7b2e5d28e', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '1190ec40-2b89-4358-8dec-733c5829fbed', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 20 09:57:48 np0005588920 nova_compute[226886]: 2026-01-20 14:57:48.929 226890 DEBUG nova.virt.block_device [None req-36f487dc-db76-4e2c-b521-f3f0bb5c5364 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Updating existing volume attachment record: 2c0a4684-4e14-4aaa-82b1-ba92b18dafde _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 20 09:57:49 np0005588920 nova_compute[226886]: 2026-01-20 14:57:49.312 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:57:49 np0005588920 nova_compute[226886]: 2026-01-20 14:57:49.314 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:57:49 np0005588920 nova_compute[226886]: 2026-01-20 14:57:49.314 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:57:49 np0005588920 nova_compute[226886]: 2026-01-20 14:57:49.315 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:57:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:49.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:50 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:57:50 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/319160311' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:57:50 np0005588920 nova_compute[226886]: 2026-01-20 14:57:50.519 226890 DEBUG nova.objects.instance [None req-36f487dc-db76-4e2c-b521-f3f0bb5c5364 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lazy-loading 'flavor' on Instance uuid a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:57:50 np0005588920 nova_compute[226886]: 2026-01-20 14:57:50.626 226890 DEBUG nova.virt.libvirt.driver [None req-36f487dc-db76-4e2c-b521-f3f0bb5c5364 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Attempting to attach volume 06b5ccf9-bd72-4532-983a-75b3b42cbfe9 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 20 09:57:50 np0005588920 nova_compute[226886]: 2026-01-20 14:57:50.629 226890 DEBUG nova.virt.libvirt.guest [None req-36f487dc-db76-4e2c-b521-f3f0bb5c5364 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] attach device xml: <disk type="network" device="disk">
Jan 20 09:57:50 np0005588920 nova_compute[226886]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 09:57:50 np0005588920 nova_compute[226886]:  <source protocol="rbd" name="volumes/volume-06b5ccf9-bd72-4532-983a-75b3b42cbfe9">
Jan 20 09:57:50 np0005588920 nova_compute[226886]:    <host name="192.168.122.100" port="6789"/>
Jan 20 09:57:50 np0005588920 nova_compute[226886]:    <host name="192.168.122.102" port="6789"/>
Jan 20 09:57:50 np0005588920 nova_compute[226886]:    <host name="192.168.122.101" port="6789"/>
Jan 20 09:57:50 np0005588920 nova_compute[226886]:  </source>
Jan 20 09:57:50 np0005588920 nova_compute[226886]:  <auth username="openstack">
Jan 20 09:57:50 np0005588920 nova_compute[226886]:    <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:57:50 np0005588920 nova_compute[226886]:  </auth>
Jan 20 09:57:50 np0005588920 nova_compute[226886]:  <target dev="vdb" bus="virtio"/>
Jan 20 09:57:50 np0005588920 nova_compute[226886]:  <serial>06b5ccf9-bd72-4532-983a-75b3b42cbfe9</serial>
Jan 20 09:57:50 np0005588920 nova_compute[226886]: </disk>
Jan 20 09:57:50 np0005588920 nova_compute[226886]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 20 09:57:50 np0005588920 nova_compute[226886]: 2026-01-20 14:57:50.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:57:50 np0005588920 nova_compute[226886]: 2026-01-20 14:57:50.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:57:50 np0005588920 nova_compute[226886]: 2026-01-20 14:57:50.870 226890 DEBUG nova.virt.libvirt.driver [None req-36f487dc-db76-4e2c-b521-f3f0bb5c5364 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:57:50 np0005588920 nova_compute[226886]: 2026-01-20 14:57:50.870 226890 DEBUG nova.virt.libvirt.driver [None req-36f487dc-db76-4e2c-b521-f3f0bb5c5364 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:57:50 np0005588920 nova_compute[226886]: 2026-01-20 14:57:50.871 226890 DEBUG nova.virt.libvirt.driver [None req-36f487dc-db76-4e2c-b521-f3f0bb5c5364 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:57:50 np0005588920 nova_compute[226886]: 2026-01-20 14:57:50.871 226890 DEBUG nova.virt.libvirt.driver [None req-36f487dc-db76-4e2c-b521-f3f0bb5c5364 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] No VIF found with MAC fa:16:3e:d8:23:95, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:57:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:57:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:50.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:57:51 np0005588920 nova_compute[226886]: 2026-01-20 14:57:51.152 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:51 np0005588920 nova_compute[226886]: 2026-01-20 14:57:51.272 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:51.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:51 np0005588920 nova_compute[226886]: 2026-01-20 14:57:51.520 226890 DEBUG oslo_concurrency.lockutils [None req-36f487dc-db76-4e2c-b521-f3f0bb5c5364 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 3.204s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:51 np0005588920 nova_compute[226886]: 2026-01-20 14:57:51.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:57:52 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 09:57:52 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3600.0 total, 600.0 interval#012Cumulative writes: 10K writes, 52K keys, 10K commit groups, 1.0 writes per commit group, ingest: 0.10 GB, 0.03 MB/s#012Cumulative WAL: 10K writes, 10K syncs, 1.00 writes per sync, written: 0.10 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1853 writes, 9380 keys, 1853 commit groups, 1.0 writes per commit group, ingest: 17.64 MB, 0.03 MB/s#012Interval WAL: 1853 writes, 1853 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     76.5      0.82              0.25        31    0.026       0      0       0.0       0.0#012  L6      1/0    9.87 MB   0.0      0.3     0.1      0.3       0.3      0.0       0.0   4.4    111.4     93.4      2.94              0.91        30    0.098    183K    16K       0.0       0.0#012 Sum      1/0    9.87 MB   0.0      0.3     0.1      0.3       0.3      0.1       0.0   5.4     87.1     89.7      3.75              1.15        61    0.062    183K    16K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   7.5     91.9     92.6      0.91              0.21        14    0.065     54K   3667       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.3     0.1      0.3       0.3      0.0       0.0   0.0    111.4     93.4      2.94              0.91        30    0.098    183K    16K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     76.7      0.82              0.25        30    0.027       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 3600.0 total, 600.0 interval#012Flush(GB): cumulative 0.061, interval 0.011#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.33 GB write, 0.09 MB/s write, 0.32 GB read, 0.09 MB/s read, 3.8 seconds#012Interval compaction: 0.08 GB write, 0.14 MB/s write, 0.08 GB read, 0.14 MB/s read, 0.9 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x564a2f9711f0#2 capacity: 304.00 MB usage: 38.24 MB table_size: 0 occupancy: 18446744073709551615 collections: 7 last_copies: 0 last_secs: 0.000258 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(2196,36.88 MB,12.1302%) FilterBlock(61,519.36 KB,0.166838%) IndexBlock(61,877.77 KB,0.281971%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 20 09:57:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:57:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:52.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:53.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:53 np0005588920 nova_compute[226886]: 2026-01-20 14:57:53.849 226890 DEBUG oslo_concurrency.lockutils [None req-d371a49b-0907-4112-8c21-96a6a825787b ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Acquiring lock "a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:53 np0005588920 nova_compute[226886]: 2026-01-20 14:57:53.850 226890 DEBUG oslo_concurrency.lockutils [None req-d371a49b-0907-4112-8c21-96a6a825787b ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:53 np0005588920 nova_compute[226886]: 2026-01-20 14:57:53.888 226890 DEBUG nova.objects.instance [None req-d371a49b-0907-4112-8c21-96a6a825787b ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lazy-loading 'flavor' on Instance uuid a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:57:53 np0005588920 nova_compute[226886]: 2026-01-20 14:57:53.959 226890 DEBUG oslo_concurrency.lockutils [None req-d371a49b-0907-4112-8c21-96a6a825787b ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:54 np0005588920 nova_compute[226886]: 2026-01-20 14:57:54.407 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:54 np0005588920 nova_compute[226886]: 2026-01-20 14:57:54.615 226890 DEBUG oslo_concurrency.lockutils [None req-d371a49b-0907-4112-8c21-96a6a825787b ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Acquiring lock "a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:54 np0005588920 nova_compute[226886]: 2026-01-20 14:57:54.615 226890 DEBUG oslo_concurrency.lockutils [None req-d371a49b-0907-4112-8c21-96a6a825787b ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:54 np0005588920 nova_compute[226886]: 2026-01-20 14:57:54.616 226890 INFO nova.compute.manager [None req-d371a49b-0907-4112-8c21-96a6a825787b ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Attaching volume 0668acc6-1fde-4609-addf-37a73f135900 to /dev/vdc#033[00m
Jan 20 09:57:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:54.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:55 np0005588920 nova_compute[226886]: 2026-01-20 14:57:55.052 226890 DEBUG os_brick.utils [None req-d371a49b-0907-4112-8c21-96a6a825787b ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 20 09:57:55 np0005588920 nova_compute[226886]: 2026-01-20 14:57:55.054 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:57:55 np0005588920 nova_compute[226886]: 2026-01-20 14:57:55.073 231437 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.019s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:57:55 np0005588920 nova_compute[226886]: 2026-01-20 14:57:55.074 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[8b2bd421-f1fa-4d9e-b479-39c22a165b90]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:55 np0005588920 nova_compute[226886]: 2026-01-20 14:57:55.075 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:57:55 np0005588920 nova_compute[226886]: 2026-01-20 14:57:55.089 231437 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.014s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:57:55 np0005588920 nova_compute[226886]: 2026-01-20 14:57:55.089 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[118a109a-8bea-4ec0-b5bf-9a62375785b7]: (4, ('InitiatorName=iqn.1994-05.com.redhat:f7e7b2e5d28e', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:55 np0005588920 nova_compute[226886]: 2026-01-20 14:57:55.091 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:57:55 np0005588920 nova_compute[226886]: 2026-01-20 14:57:55.105 231437 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.014s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:57:55 np0005588920 nova_compute[226886]: 2026-01-20 14:57:55.105 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[9047d594-217e-4347-9baf-ae43c4e80f65]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:55 np0005588920 nova_compute[226886]: 2026-01-20 14:57:55.111 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[a64e2b14-f30e-44aa-99f5-801f13a32db7]: (4, '1190ec40-2b89-4358-8dec-733c5829fbed') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:55 np0005588920 nova_compute[226886]: 2026-01-20 14:57:55.112 226890 DEBUG oslo_concurrency.processutils [None req-d371a49b-0907-4112-8c21-96a6a825787b ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:57:55 np0005588920 nova_compute[226886]: 2026-01-20 14:57:55.157 226890 DEBUG oslo_concurrency.processutils [None req-d371a49b-0907-4112-8c21-96a6a825787b ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] CMD "nvme version" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:57:55 np0005588920 nova_compute[226886]: 2026-01-20 14:57:55.162 226890 DEBUG os_brick.initiator.connectors.lightos [None req-d371a49b-0907-4112-8c21-96a6a825787b ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 20 09:57:55 np0005588920 nova_compute[226886]: 2026-01-20 14:57:55.162 226890 DEBUG os_brick.initiator.connectors.lightos [None req-d371a49b-0907-4112-8c21-96a6a825787b ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 20 09:57:55 np0005588920 nova_compute[226886]: 2026-01-20 14:57:55.163 226890 DEBUG os_brick.initiator.connectors.lightos [None req-d371a49b-0907-4112-8c21-96a6a825787b ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 20 09:57:55 np0005588920 nova_compute[226886]: 2026-01-20 14:57:55.163 226890 DEBUG os_brick.utils [None req-d371a49b-0907-4112-8c21-96a6a825787b ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] <== get_connector_properties: return (109ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:f7e7b2e5d28e', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '1190ec40-2b89-4358-8dec-733c5829fbed', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 20 09:57:55 np0005588920 nova_compute[226886]: 2026-01-20 14:57:55.164 226890 DEBUG nova.virt.block_device [None req-d371a49b-0907-4112-8c21-96a6a825787b ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Updating existing volume attachment record: b60b56a6-2e45-4be4-9be9-aed60a3c8da7 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 20 09:57:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:55.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:55 np0005588920 nova_compute[226886]: 2026-01-20 14:57:55.608 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768921060.6073518, cf24bde1-0912-4d63-8959-6799ae8ab043 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:57:55 np0005588920 nova_compute[226886]: 2026-01-20 14:57:55.608 226890 INFO nova.compute.manager [-] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:57:55 np0005588920 nova_compute[226886]: 2026-01-20 14:57:55.635 226890 DEBUG nova.compute.manager [None req-8b75cf82-58dc-411b-888e-504d091a7ec5 - - - - - -] [instance: cf24bde1-0912-4d63-8959-6799ae8ab043] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:57:56 np0005588920 nova_compute[226886]: 2026-01-20 14:57:56.156 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:56 np0005588920 nova_compute[226886]: 2026-01-20 14:57:56.274 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:56 np0005588920 nova_compute[226886]: 2026-01-20 14:57:56.339 226890 DEBUG nova.objects.instance [None req-d371a49b-0907-4112-8c21-96a6a825787b ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lazy-loading 'flavor' on Instance uuid a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:57:56 np0005588920 nova_compute[226886]: 2026-01-20 14:57:56.387 226890 DEBUG nova.virt.libvirt.driver [None req-d371a49b-0907-4112-8c21-96a6a825787b ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Attempting to attach volume 0668acc6-1fde-4609-addf-37a73f135900 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 20 09:57:56 np0005588920 nova_compute[226886]: 2026-01-20 14:57:56.389 226890 DEBUG nova.virt.libvirt.guest [None req-d371a49b-0907-4112-8c21-96a6a825787b ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] attach device xml: <disk type="network" device="disk">
Jan 20 09:57:56 np0005588920 nova_compute[226886]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 09:57:56 np0005588920 nova_compute[226886]:  <source protocol="rbd" name="volumes/volume-0668acc6-1fde-4609-addf-37a73f135900">
Jan 20 09:57:56 np0005588920 nova_compute[226886]:    <host name="192.168.122.100" port="6789"/>
Jan 20 09:57:56 np0005588920 nova_compute[226886]:    <host name="192.168.122.102" port="6789"/>
Jan 20 09:57:56 np0005588920 nova_compute[226886]:    <host name="192.168.122.101" port="6789"/>
Jan 20 09:57:56 np0005588920 nova_compute[226886]:  </source>
Jan 20 09:57:56 np0005588920 nova_compute[226886]:  <auth username="openstack">
Jan 20 09:57:56 np0005588920 nova_compute[226886]:    <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:57:56 np0005588920 nova_compute[226886]:  </auth>
Jan 20 09:57:56 np0005588920 nova_compute[226886]:  <target dev="vdc" bus="virtio"/>
Jan 20 09:57:56 np0005588920 nova_compute[226886]:  <serial>0668acc6-1fde-4609-addf-37a73f135900</serial>
Jan 20 09:57:56 np0005588920 nova_compute[226886]: </disk>
Jan 20 09:57:56 np0005588920 nova_compute[226886]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 20 09:57:56 np0005588920 nova_compute[226886]: 2026-01-20 14:57:56.506 226890 DEBUG nova.virt.libvirt.driver [None req-d371a49b-0907-4112-8c21-96a6a825787b ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:57:56 np0005588920 nova_compute[226886]: 2026-01-20 14:57:56.507 226890 DEBUG nova.virt.libvirt.driver [None req-d371a49b-0907-4112-8c21-96a6a825787b ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:57:56 np0005588920 nova_compute[226886]: 2026-01-20 14:57:56.508 226890 DEBUG nova.virt.libvirt.driver [None req-d371a49b-0907-4112-8c21-96a6a825787b ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:57:56 np0005588920 nova_compute[226886]: 2026-01-20 14:57:56.508 226890 DEBUG nova.virt.libvirt.driver [None req-d371a49b-0907-4112-8c21-96a6a825787b ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] No BDM found with device name vdc, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:57:56 np0005588920 nova_compute[226886]: 2026-01-20 14:57:56.508 226890 DEBUG nova.virt.libvirt.driver [None req-d371a49b-0907-4112-8c21-96a6a825787b ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] No VIF found with MAC fa:16:3e:d8:23:95, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:57:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:56.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:56 np0005588920 nova_compute[226886]: 2026-01-20 14:57:56.947 226890 DEBUG oslo_concurrency.lockutils [None req-d371a49b-0907-4112-8c21-96a6a825787b ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 2.331s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:57 np0005588920 podman[275817]: 2026-01-20 14:57:57.038615944 +0000 UTC m=+0.122501361 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 20 09:57:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:57.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:57:58 np0005588920 ovn_controller[133971]: 2026-01-20T14:57:58Z|00086|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d8:23:95 10.100.0.6
Jan 20 09:57:58 np0005588920 ovn_controller[133971]: 2026-01-20T14:57:58Z|00087|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d8:23:95 10.100.0.6
Jan 20 09:57:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:57:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:58.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:57:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:57:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:59.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:58:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:00.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:58:00 np0005588920 ovn_controller[133971]: 2026-01-20T14:58:00Z|00632|binding|INFO|Releasing lport 1623097d-35b0-4d71-9dc2-c4d659492102 from this chassis (sb_readonly=0)
Jan 20 09:58:01 np0005588920 nova_compute[226886]: 2026-01-20 14:58:01.046 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:01 np0005588920 nova_compute[226886]: 2026-01-20 14:58:01.073 226890 DEBUG nova.compute.manager [req-748e95ee-8cf8-47e7-b902-0a9380bba2ed req-ffed399a-51c8-45a3-aeb4-3b2ab2169c41 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Received event network-changed-c0ac6308-ae73-4b17-95fa-47f3df3c4f97 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:58:01 np0005588920 nova_compute[226886]: 2026-01-20 14:58:01.074 226890 DEBUG nova.compute.manager [req-748e95ee-8cf8-47e7-b902-0a9380bba2ed req-ffed399a-51c8-45a3-aeb4-3b2ab2169c41 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Refreshing instance network info cache due to event network-changed-c0ac6308-ae73-4b17-95fa-47f3df3c4f97. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:58:01 np0005588920 nova_compute[226886]: 2026-01-20 14:58:01.074 226890 DEBUG oslo_concurrency.lockutils [req-748e95ee-8cf8-47e7-b902-0a9380bba2ed req-ffed399a-51c8-45a3-aeb4-3b2ab2169c41 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:58:01 np0005588920 nova_compute[226886]: 2026-01-20 14:58:01.074 226890 DEBUG oslo_concurrency.lockutils [req-748e95ee-8cf8-47e7-b902-0a9380bba2ed req-ffed399a-51c8-45a3-aeb4-3b2ab2169c41 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:58:01 np0005588920 nova_compute[226886]: 2026-01-20 14:58:01.074 226890 DEBUG nova.network.neutron [req-748e95ee-8cf8-47e7-b902-0a9380bba2ed req-ffed399a-51c8-45a3-aeb4-3b2ab2169c41 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Refreshing network info cache for port c0ac6308-ae73-4b17-95fa-47f3df3c4f97 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:58:01 np0005588920 nova_compute[226886]: 2026-01-20 14:58:01.092 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768921066.0883899, ce0152a6-7d4d-4eac-9587-a43ad934d9cc => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:58:01 np0005588920 nova_compute[226886]: 2026-01-20 14:58:01.093 226890 INFO nova.compute.manager [-] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:58:01 np0005588920 nova_compute[226886]: 2026-01-20 14:58:01.095 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:01 np0005588920 nova_compute[226886]: 2026-01-20 14:58:01.156 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:01 np0005588920 nova_compute[226886]: 2026-01-20 14:58:01.162 226890 DEBUG nova.compute.manager [None req-630ff55f-72fc-46ac-86f9-4b1744d1f3ee - - - - - -] [instance: ce0152a6-7d4d-4eac-9587-a43ad934d9cc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:58:01 np0005588920 nova_compute[226886]: 2026-01-20 14:58:01.276 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:01.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:58:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:02.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:03.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:04 np0005588920 nova_compute[226886]: 2026-01-20 14:58:04.561 226890 DEBUG nova.compute.manager [req-2b52bae1-9eb6-4692-b42b-129d1f50720f req-82f2c928-faa0-47d9-b709-c074d8a30ce4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Received event network-changed-c0ac6308-ae73-4b17-95fa-47f3df3c4f97 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:58:04 np0005588920 nova_compute[226886]: 2026-01-20 14:58:04.561 226890 DEBUG nova.compute.manager [req-2b52bae1-9eb6-4692-b42b-129d1f50720f req-82f2c928-faa0-47d9-b709-c074d8a30ce4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Refreshing instance network info cache due to event network-changed-c0ac6308-ae73-4b17-95fa-47f3df3c4f97. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:58:04 np0005588920 nova_compute[226886]: 2026-01-20 14:58:04.562 226890 DEBUG oslo_concurrency.lockutils [req-2b52bae1-9eb6-4692-b42b-129d1f50720f req-82f2c928-faa0-47d9-b709-c074d8a30ce4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:58:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:04.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:05 np0005588920 nova_compute[226886]: 2026-01-20 14:58:05.403 226890 DEBUG nova.network.neutron [req-748e95ee-8cf8-47e7-b902-0a9380bba2ed req-ffed399a-51c8-45a3-aeb4-3b2ab2169c41 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Updated VIF entry in instance network info cache for port c0ac6308-ae73-4b17-95fa-47f3df3c4f97. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:58:05 np0005588920 nova_compute[226886]: 2026-01-20 14:58:05.403 226890 DEBUG nova.network.neutron [req-748e95ee-8cf8-47e7-b902-0a9380bba2ed req-ffed399a-51c8-45a3-aeb4-3b2ab2169c41 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Updating instance_info_cache with network_info: [{"id": "c0ac6308-ae73-4b17-95fa-47f3df3c4f97", "address": "fa:16:3e:d8:23:95", "network": {"id": "58d966e1-4d26-414a-920e-0be2d77abb59", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1896990059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.227", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "107c1f3b5b7b413d9a389ca1166e331f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0ac6308-ae", "ovs_interfaceid": "c0ac6308-ae73-4b17-95fa-47f3df3c4f97", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:58:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:05.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:05 np0005588920 podman[275843]: 2026-01-20 14:58:05.961255152 +0000 UTC m=+0.050698029 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 20 09:58:06 np0005588920 nova_compute[226886]: 2026-01-20 14:58:06.160 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:06 np0005588920 nova_compute[226886]: 2026-01-20 14:58:06.278 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:06 np0005588920 nova_compute[226886]: 2026-01-20 14:58:06.465 226890 DEBUG oslo_concurrency.lockutils [req-748e95ee-8cf8-47e7-b902-0a9380bba2ed req-ffed399a-51c8-45a3-aeb4-3b2ab2169c41 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:58:06 np0005588920 nova_compute[226886]: 2026-01-20 14:58:06.466 226890 DEBUG oslo_concurrency.lockutils [req-2b52bae1-9eb6-4692-b42b-129d1f50720f req-82f2c928-faa0-47d9-b709-c074d8a30ce4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:58:06 np0005588920 nova_compute[226886]: 2026-01-20 14:58:06.467 226890 DEBUG nova.network.neutron [req-2b52bae1-9eb6-4692-b42b-129d1f50720f req-82f2c928-faa0-47d9-b709-c074d8a30ce4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Refreshing network info cache for port c0ac6308-ae73-4b17-95fa-47f3df3c4f97 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:58:06 np0005588920 nova_compute[226886]: 2026-01-20 14:58:06.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:58:06 np0005588920 nova_compute[226886]: 2026-01-20 14:58:06.726 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:58:06 np0005588920 nova_compute[226886]: 2026-01-20 14:58:06.726 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:58:06 np0005588920 nova_compute[226886]: 2026-01-20 14:58:06.727 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:58:06 np0005588920 nova_compute[226886]: 2026-01-20 14:58:06.727 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:58:06 np0005588920 nova_compute[226886]: 2026-01-20 14:58:06.727 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:58:06 np0005588920 nova_compute[226886]: 2026-01-20 14:58:06.727 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:58:06 np0005588920 nova_compute[226886]: 2026-01-20 14:58:06.864 226890 DEBUG nova.virt.libvirt.imagecache [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Adding ephemeral_1_0706d66 into backend ephemeral images _store_ephemeral_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:100#033[00m
Jan 20 09:58:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:06.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:06 np0005588920 nova_compute[226886]: 2026-01-20 14:58:06.908 226890 DEBUG nova.virt.libvirt.imagecache [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314#033[00m
Jan 20 09:58:06 np0005588920 nova_compute[226886]: 2026-01-20 14:58:06.909 226890 DEBUG nova.virt.libvirt.imagecache [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Image id  yields fingerprint da39a3ee5e6b4b0d3255bfef95601890afd80709 _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:319#033[00m
Jan 20 09:58:06 np0005588920 nova_compute[226886]: 2026-01-20 14:58:06.909 226890 DEBUG nova.virt.libvirt.imagecache [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f is a valid instance name _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:126#033[00m
Jan 20 09:58:06 np0005588920 nova_compute[226886]: 2026-01-20 14:58:06.910 226890 WARNING nova.virt.libvirt.imagecache [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462#033[00m
Jan 20 09:58:06 np0005588920 nova_compute[226886]: 2026-01-20 14:58:06.910 226890 WARNING nova.virt.libvirt.imagecache [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c#033[00m
Jan 20 09:58:06 np0005588920 nova_compute[226886]: 2026-01-20 14:58:06.910 226890 INFO nova.virt.libvirt.imagecache [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Removable base files: /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c#033[00m
Jan 20 09:58:06 np0005588920 nova_compute[226886]: 2026-01-20 14:58:06.911 226890 INFO nova.virt.libvirt.imagecache [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462#033[00m
Jan 20 09:58:06 np0005588920 nova_compute[226886]: 2026-01-20 14:58:06.911 226890 INFO nova.virt.libvirt.imagecache [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c#033[00m
Jan 20 09:58:06 np0005588920 nova_compute[226886]: 2026-01-20 14:58:06.911 226890 DEBUG nova.virt.libvirt.imagecache [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350#033[00m
Jan 20 09:58:06 np0005588920 nova_compute[226886]: 2026-01-20 14:58:06.911 226890 DEBUG nova.virt.libvirt.imagecache [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299#033[00m
Jan 20 09:58:06 np0005588920 nova_compute[226886]: 2026-01-20 14:58:06.911 226890 DEBUG nova.virt.libvirt.imagecache [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284#033[00m
Jan 20 09:58:06 np0005588920 nova_compute[226886]: 2026-01-20 14:58:06.912 226890 INFO nova.virt.libvirt.imagecache [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/ephemeral_1_0706d66#033[00m
Jan 20 09:58:07 np0005588920 nova_compute[226886]: 2026-01-20 14:58:07.144 226890 DEBUG nova.compute.manager [req-c4c1ffa1-9911-4d6a-8314-bea6b5a9674a req-cf5ba3d4-835e-4575-a442-39bf33689cbd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Received event network-changed-c0ac6308-ae73-4b17-95fa-47f3df3c4f97 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:58:07 np0005588920 nova_compute[226886]: 2026-01-20 14:58:07.145 226890 DEBUG nova.compute.manager [req-c4c1ffa1-9911-4d6a-8314-bea6b5a9674a req-cf5ba3d4-835e-4575-a442-39bf33689cbd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Refreshing instance network info cache due to event network-changed-c0ac6308-ae73-4b17-95fa-47f3df3c4f97. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:58:07 np0005588920 nova_compute[226886]: 2026-01-20 14:58:07.145 226890 DEBUG oslo_concurrency.lockutils [req-c4c1ffa1-9911-4d6a-8314-bea6b5a9674a req-cf5ba3d4-835e-4575-a442-39bf33689cbd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:58:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:07.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:58:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:08.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:09.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:10 np0005588920 nova_compute[226886]: 2026-01-20 14:58:10.435 226890 DEBUG nova.network.neutron [req-2b52bae1-9eb6-4692-b42b-129d1f50720f req-82f2c928-faa0-47d9-b709-c074d8a30ce4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Updated VIF entry in instance network info cache for port c0ac6308-ae73-4b17-95fa-47f3df3c4f97. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:58:10 np0005588920 nova_compute[226886]: 2026-01-20 14:58:10.435 226890 DEBUG nova.network.neutron [req-2b52bae1-9eb6-4692-b42b-129d1f50720f req-82f2c928-faa0-47d9-b709-c074d8a30ce4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Updating instance_info_cache with network_info: [{"id": "c0ac6308-ae73-4b17-95fa-47f3df3c4f97", "address": "fa:16:3e:d8:23:95", "network": {"id": "58d966e1-4d26-414a-920e-0be2d77abb59", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1896990059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.227", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "107c1f3b5b7b413d9a389ca1166e331f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0ac6308-ae", "ovs_interfaceid": "c0ac6308-ae73-4b17-95fa-47f3df3c4f97", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:58:10 np0005588920 nova_compute[226886]: 2026-01-20 14:58:10.465 226890 DEBUG oslo_concurrency.lockutils [req-2b52bae1-9eb6-4692-b42b-129d1f50720f req-82f2c928-faa0-47d9-b709-c074d8a30ce4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:58:10 np0005588920 nova_compute[226886]: 2026-01-20 14:58:10.465 226890 DEBUG oslo_concurrency.lockutils [req-c4c1ffa1-9911-4d6a-8314-bea6b5a9674a req-cf5ba3d4-835e-4575-a442-39bf33689cbd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:58:10 np0005588920 nova_compute[226886]: 2026-01-20 14:58:10.466 226890 DEBUG nova.network.neutron [req-c4c1ffa1-9911-4d6a-8314-bea6b5a9674a req-cf5ba3d4-835e-4575-a442-39bf33689cbd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Refreshing network info cache for port c0ac6308-ae73-4b17-95fa-47f3df3c4f97 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:58:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:58:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:10.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:58:11 np0005588920 nova_compute[226886]: 2026-01-20 14:58:11.163 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:11 np0005588920 nova_compute[226886]: 2026-01-20 14:58:11.279 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:11.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:12 np0005588920 nova_compute[226886]: 2026-01-20 14:58:12.537 226890 DEBUG nova.compute.manager [req-abe9352a-b11d-464c-9421-0fcd6f31ce70 req-fd620e3c-0232-4081-8112-2815c22042f5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Received event network-changed-c0ac6308-ae73-4b17-95fa-47f3df3c4f97 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:58:12 np0005588920 nova_compute[226886]: 2026-01-20 14:58:12.538 226890 DEBUG nova.compute.manager [req-abe9352a-b11d-464c-9421-0fcd6f31ce70 req-fd620e3c-0232-4081-8112-2815c22042f5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Refreshing instance network info cache due to event network-changed-c0ac6308-ae73-4b17-95fa-47f3df3c4f97. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:58:12 np0005588920 nova_compute[226886]: 2026-01-20 14:58:12.538 226890 DEBUG oslo_concurrency.lockutils [req-abe9352a-b11d-464c-9421-0fcd6f31ce70 req-fd620e3c-0232-4081-8112-2815c22042f5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:58:12 np0005588920 nova_compute[226886]: 2026-01-20 14:58:12.618 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:58:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:12.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:13 np0005588920 nova_compute[226886]: 2026-01-20 14:58:13.251 226890 DEBUG nova.network.neutron [req-c4c1ffa1-9911-4d6a-8314-bea6b5a9674a req-cf5ba3d4-835e-4575-a442-39bf33689cbd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Updated VIF entry in instance network info cache for port c0ac6308-ae73-4b17-95fa-47f3df3c4f97. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:58:13 np0005588920 nova_compute[226886]: 2026-01-20 14:58:13.251 226890 DEBUG nova.network.neutron [req-c4c1ffa1-9911-4d6a-8314-bea6b5a9674a req-cf5ba3d4-835e-4575-a442-39bf33689cbd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Updating instance_info_cache with network_info: [{"id": "c0ac6308-ae73-4b17-95fa-47f3df3c4f97", "address": "fa:16:3e:d8:23:95", "network": {"id": "58d966e1-4d26-414a-920e-0be2d77abb59", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1896990059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.227", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "107c1f3b5b7b413d9a389ca1166e331f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0ac6308-ae", "ovs_interfaceid": "c0ac6308-ae73-4b17-95fa-47f3df3c4f97", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:58:13 np0005588920 nova_compute[226886]: 2026-01-20 14:58:13.271 226890 DEBUG oslo_concurrency.lockutils [req-c4c1ffa1-9911-4d6a-8314-bea6b5a9674a req-cf5ba3d4-835e-4575-a442-39bf33689cbd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:58:13 np0005588920 nova_compute[226886]: 2026-01-20 14:58:13.271 226890 DEBUG oslo_concurrency.lockutils [req-abe9352a-b11d-464c-9421-0fcd6f31ce70 req-fd620e3c-0232-4081-8112-2815c22042f5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:58:13 np0005588920 nova_compute[226886]: 2026-01-20 14:58:13.271 226890 DEBUG nova.network.neutron [req-abe9352a-b11d-464c-9421-0fcd6f31ce70 req-fd620e3c-0232-4081-8112-2815c22042f5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Refreshing network info cache for port c0ac6308-ae73-4b17-95fa-47f3df3c4f97 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:58:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:13.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 09:58:14 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1767471805' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 09:58:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 09:58:14 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1767471805' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 09:58:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:14.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:15.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:15 np0005588920 nova_compute[226886]: 2026-01-20 14:58:15.923 226890 DEBUG oslo_concurrency.lockutils [None req-387b4b3a-fde9-406c-baaa-8fb27771c3ab ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Acquiring lock "a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:58:15 np0005588920 nova_compute[226886]: 2026-01-20 14:58:15.923 226890 DEBUG oslo_concurrency.lockutils [None req-387b4b3a-fde9-406c-baaa-8fb27771c3ab ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:58:15 np0005588920 nova_compute[226886]: 2026-01-20 14:58:15.948 226890 INFO nova.compute.manager [None req-387b4b3a-fde9-406c-baaa-8fb27771c3ab ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Detaching volume 06b5ccf9-bd72-4532-983a-75b3b42cbfe9#033[00m
Jan 20 09:58:16 np0005588920 nova_compute[226886]: 2026-01-20 14:58:16.131 226890 INFO nova.virt.block_device [None req-387b4b3a-fde9-406c-baaa-8fb27771c3ab ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Attempting to driver detach volume 06b5ccf9-bd72-4532-983a-75b3b42cbfe9 from mountpoint /dev/vdb#033[00m
Jan 20 09:58:16 np0005588920 nova_compute[226886]: 2026-01-20 14:58:16.141 226890 DEBUG nova.virt.libvirt.driver [None req-387b4b3a-fde9-406c-baaa-8fb27771c3ab ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Attempting to detach device vdb from instance a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 20 09:58:16 np0005588920 nova_compute[226886]: 2026-01-20 14:58:16.142 226890 DEBUG nova.virt.libvirt.guest [None req-387b4b3a-fde9-406c-baaa-8fb27771c3ab ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] detach device xml: <disk type="network" device="disk">
Jan 20 09:58:16 np0005588920 nova_compute[226886]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 09:58:16 np0005588920 nova_compute[226886]:  <source protocol="rbd" name="volumes/volume-06b5ccf9-bd72-4532-983a-75b3b42cbfe9">
Jan 20 09:58:16 np0005588920 nova_compute[226886]:    <host name="192.168.122.100" port="6789"/>
Jan 20 09:58:16 np0005588920 nova_compute[226886]:    <host name="192.168.122.102" port="6789"/>
Jan 20 09:58:16 np0005588920 nova_compute[226886]:    <host name="192.168.122.101" port="6789"/>
Jan 20 09:58:16 np0005588920 nova_compute[226886]:  </source>
Jan 20 09:58:16 np0005588920 nova_compute[226886]:  <target dev="vdb" bus="virtio"/>
Jan 20 09:58:16 np0005588920 nova_compute[226886]:  <serial>06b5ccf9-bd72-4532-983a-75b3b42cbfe9</serial>
Jan 20 09:58:16 np0005588920 nova_compute[226886]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 20 09:58:16 np0005588920 nova_compute[226886]: </disk>
Jan 20 09:58:16 np0005588920 nova_compute[226886]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 20 09:58:16 np0005588920 nova_compute[226886]: 2026-01-20 14:58:16.158 226890 INFO nova.virt.libvirt.driver [None req-387b4b3a-fde9-406c-baaa-8fb27771c3ab ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Successfully detached device vdb from instance a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f from the persistent domain config.#033[00m
Jan 20 09:58:16 np0005588920 nova_compute[226886]: 2026-01-20 14:58:16.159 226890 DEBUG nova.virt.libvirt.driver [None req-387b4b3a-fde9-406c-baaa-8fb27771c3ab ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 20 09:58:16 np0005588920 nova_compute[226886]: 2026-01-20 14:58:16.159 226890 DEBUG nova.virt.libvirt.guest [None req-387b4b3a-fde9-406c-baaa-8fb27771c3ab ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] detach device xml: <disk type="network" device="disk">
Jan 20 09:58:16 np0005588920 nova_compute[226886]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 09:58:16 np0005588920 nova_compute[226886]:  <source protocol="rbd" name="volumes/volume-06b5ccf9-bd72-4532-983a-75b3b42cbfe9">
Jan 20 09:58:16 np0005588920 nova_compute[226886]:    <host name="192.168.122.100" port="6789"/>
Jan 20 09:58:16 np0005588920 nova_compute[226886]:    <host name="192.168.122.102" port="6789"/>
Jan 20 09:58:16 np0005588920 nova_compute[226886]:    <host name="192.168.122.101" port="6789"/>
Jan 20 09:58:16 np0005588920 nova_compute[226886]:  </source>
Jan 20 09:58:16 np0005588920 nova_compute[226886]:  <target dev="vdb" bus="virtio"/>
Jan 20 09:58:16 np0005588920 nova_compute[226886]:  <serial>06b5ccf9-bd72-4532-983a-75b3b42cbfe9</serial>
Jan 20 09:58:16 np0005588920 nova_compute[226886]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 20 09:58:16 np0005588920 nova_compute[226886]: </disk>
Jan 20 09:58:16 np0005588920 nova_compute[226886]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 20 09:58:16 np0005588920 nova_compute[226886]: 2026-01-20 14:58:16.164 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:16 np0005588920 nova_compute[226886]: 2026-01-20 14:58:16.281 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:16 np0005588920 nova_compute[226886]: 2026-01-20 14:58:16.284 226890 DEBUG nova.virt.libvirt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Received event <DeviceRemovedEvent: 1768921096.2842603, a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 20 09:58:16 np0005588920 nova_compute[226886]: 2026-01-20 14:58:16.286 226890 DEBUG nova.virt.libvirt.driver [None req-387b4b3a-fde9-406c-baaa-8fb27771c3ab ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 20 09:58:16 np0005588920 nova_compute[226886]: 2026-01-20 14:58:16.288 226890 INFO nova.virt.libvirt.driver [None req-387b4b3a-fde9-406c-baaa-8fb27771c3ab ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Successfully detached device vdb from instance a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f from the live domain config.#033[00m
Jan 20 09:58:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:58:16.459 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:58:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:58:16.460 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:58:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:58:16.460 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:58:16 np0005588920 nova_compute[226886]: 2026-01-20 14:58:16.746 226890 DEBUG nova.objects.instance [None req-387b4b3a-fde9-406c-baaa-8fb27771c3ab ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lazy-loading 'flavor' on Instance uuid a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:58:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:16.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:16 np0005588920 nova_compute[226886]: 2026-01-20 14:58:16.929 226890 DEBUG oslo_concurrency.lockutils [None req-387b4b3a-fde9-406c-baaa-8fb27771c3ab ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 1.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:58:17 np0005588920 nova_compute[226886]: 2026-01-20 14:58:17.218 226890 DEBUG nova.network.neutron [req-abe9352a-b11d-464c-9421-0fcd6f31ce70 req-fd620e3c-0232-4081-8112-2815c22042f5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Updated VIF entry in instance network info cache for port c0ac6308-ae73-4b17-95fa-47f3df3c4f97. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:58:17 np0005588920 nova_compute[226886]: 2026-01-20 14:58:17.219 226890 DEBUG nova.network.neutron [req-abe9352a-b11d-464c-9421-0fcd6f31ce70 req-fd620e3c-0232-4081-8112-2815c22042f5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Updating instance_info_cache with network_info: [{"id": "c0ac6308-ae73-4b17-95fa-47f3df3c4f97", "address": "fa:16:3e:d8:23:95", "network": {"id": "58d966e1-4d26-414a-920e-0be2d77abb59", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1896990059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.227", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "107c1f3b5b7b413d9a389ca1166e331f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0ac6308-ae", "ovs_interfaceid": "c0ac6308-ae73-4b17-95fa-47f3df3c4f97", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:58:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:58:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:17.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:58:17 np0005588920 nova_compute[226886]: 2026-01-20 14:58:17.626 226890 DEBUG oslo_concurrency.lockutils [req-abe9352a-b11d-464c-9421-0fcd6f31ce70 req-fd620e3c-0232-4081-8112-2815c22042f5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:58:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:58:18 np0005588920 nova_compute[226886]: 2026-01-20 14:58:18.595 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:58:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:18.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:58:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:58:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:19.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:58:20 np0005588920 nova_compute[226886]: 2026-01-20 14:58:19.999 226890 DEBUG oslo_concurrency.lockutils [None req-166d046f-578f-401e-aac4-9aee2a6718aa ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Acquiring lock "a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:58:20 np0005588920 nova_compute[226886]: 2026-01-20 14:58:19.999 226890 DEBUG oslo_concurrency.lockutils [None req-166d046f-578f-401e-aac4-9aee2a6718aa ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:58:20 np0005588920 nova_compute[226886]: 2026-01-20 14:58:20.037 226890 INFO nova.compute.manager [None req-166d046f-578f-401e-aac4-9aee2a6718aa ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Detaching volume 0668acc6-1fde-4609-addf-37a73f135900#033[00m
Jan 20 09:58:20 np0005588920 nova_compute[226886]: 2026-01-20 14:58:20.302 226890 INFO nova.virt.block_device [None req-166d046f-578f-401e-aac4-9aee2a6718aa ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Attempting to driver detach volume 0668acc6-1fde-4609-addf-37a73f135900 from mountpoint /dev/vdc#033[00m
Jan 20 09:58:20 np0005588920 nova_compute[226886]: 2026-01-20 14:58:20.312 226890 DEBUG nova.virt.libvirt.driver [None req-166d046f-578f-401e-aac4-9aee2a6718aa ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Attempting to detach device vdc from instance a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 20 09:58:20 np0005588920 nova_compute[226886]: 2026-01-20 14:58:20.313 226890 DEBUG nova.virt.libvirt.guest [None req-166d046f-578f-401e-aac4-9aee2a6718aa ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] detach device xml: <disk type="network" device="disk">
Jan 20 09:58:20 np0005588920 nova_compute[226886]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 09:58:20 np0005588920 nova_compute[226886]:  <source protocol="rbd" name="volumes/volume-0668acc6-1fde-4609-addf-37a73f135900">
Jan 20 09:58:20 np0005588920 nova_compute[226886]:    <host name="192.168.122.100" port="6789"/>
Jan 20 09:58:20 np0005588920 nova_compute[226886]:    <host name="192.168.122.102" port="6789"/>
Jan 20 09:58:20 np0005588920 nova_compute[226886]:    <host name="192.168.122.101" port="6789"/>
Jan 20 09:58:20 np0005588920 nova_compute[226886]:  </source>
Jan 20 09:58:20 np0005588920 nova_compute[226886]:  <target dev="vdc" bus="virtio"/>
Jan 20 09:58:20 np0005588920 nova_compute[226886]:  <serial>0668acc6-1fde-4609-addf-37a73f135900</serial>
Jan 20 09:58:20 np0005588920 nova_compute[226886]:  <address type="pci" domain="0x0000" bus="0x07" slot="0x00" function="0x0"/>
Jan 20 09:58:20 np0005588920 nova_compute[226886]: </disk>
Jan 20 09:58:20 np0005588920 nova_compute[226886]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 20 09:58:20 np0005588920 nova_compute[226886]: 2026-01-20 14:58:20.326 226890 INFO nova.virt.libvirt.driver [None req-166d046f-578f-401e-aac4-9aee2a6718aa ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Successfully detached device vdc from instance a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f from the persistent domain config.#033[00m
Jan 20 09:58:20 np0005588920 nova_compute[226886]: 2026-01-20 14:58:20.327 226890 DEBUG nova.virt.libvirt.driver [None req-166d046f-578f-401e-aac4-9aee2a6718aa ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] (1/8): Attempting to detach device vdc with device alias virtio-disk2 from instance a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 20 09:58:20 np0005588920 nova_compute[226886]: 2026-01-20 14:58:20.328 226890 DEBUG nova.virt.libvirt.guest [None req-166d046f-578f-401e-aac4-9aee2a6718aa ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] detach device xml: <disk type="network" device="disk">
Jan 20 09:58:20 np0005588920 nova_compute[226886]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 09:58:20 np0005588920 nova_compute[226886]:  <source protocol="rbd" name="volumes/volume-0668acc6-1fde-4609-addf-37a73f135900">
Jan 20 09:58:20 np0005588920 nova_compute[226886]:    <host name="192.168.122.100" port="6789"/>
Jan 20 09:58:20 np0005588920 nova_compute[226886]:    <host name="192.168.122.102" port="6789"/>
Jan 20 09:58:20 np0005588920 nova_compute[226886]:    <host name="192.168.122.101" port="6789"/>
Jan 20 09:58:20 np0005588920 nova_compute[226886]:  </source>
Jan 20 09:58:20 np0005588920 nova_compute[226886]:  <target dev="vdc" bus="virtio"/>
Jan 20 09:58:20 np0005588920 nova_compute[226886]:  <serial>0668acc6-1fde-4609-addf-37a73f135900</serial>
Jan 20 09:58:20 np0005588920 nova_compute[226886]:  <address type="pci" domain="0x0000" bus="0x07" slot="0x00" function="0x0"/>
Jan 20 09:58:20 np0005588920 nova_compute[226886]: </disk>
Jan 20 09:58:20 np0005588920 nova_compute[226886]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 20 09:58:20 np0005588920 nova_compute[226886]: 2026-01-20 14:58:20.390 226890 DEBUG nova.virt.libvirt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Received event <DeviceRemovedEvent: 1768921100.3905988, a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f => virtio-disk2> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 20 09:58:20 np0005588920 nova_compute[226886]: 2026-01-20 14:58:20.394 226890 DEBUG nova.virt.libvirt.driver [None req-166d046f-578f-401e-aac4-9aee2a6718aa ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Start waiting for the detach event from libvirt for device vdc with device alias virtio-disk2 for instance a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 20 09:58:20 np0005588920 nova_compute[226886]: 2026-01-20 14:58:20.396 226890 INFO nova.virt.libvirt.driver [None req-166d046f-578f-401e-aac4-9aee2a6718aa ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Successfully detached device vdc from instance a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f from the live domain config.#033[00m
Jan 20 09:58:20 np0005588920 nova_compute[226886]: 2026-01-20 14:58:20.710 226890 DEBUG nova.objects.instance [None req-166d046f-578f-401e-aac4-9aee2a6718aa ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lazy-loading 'flavor' on Instance uuid a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:58:20 np0005588920 nova_compute[226886]: 2026-01-20 14:58:20.767 226890 DEBUG oslo_concurrency.lockutils [None req-166d046f-578f-401e-aac4-9aee2a6718aa ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 0.767s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:58:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:58:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:20.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:58:21 np0005588920 nova_compute[226886]: 2026-01-20 14:58:21.168 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:21 np0005588920 nova_compute[226886]: 2026-01-20 14:58:21.283 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:58:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:21.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:58:22 np0005588920 nova_compute[226886]: 2026-01-20 14:58:22.227 226890 DEBUG nova.compute.manager [req-03fb3ec7-7555-4246-8404-01fbae682ede req-6d84eb33-b3f3-437c-b5d3-92fc76238000 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Received event network-changed-c0ac6308-ae73-4b17-95fa-47f3df3c4f97 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:58:22 np0005588920 nova_compute[226886]: 2026-01-20 14:58:22.227 226890 DEBUG nova.compute.manager [req-03fb3ec7-7555-4246-8404-01fbae682ede req-6d84eb33-b3f3-437c-b5d3-92fc76238000 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Refreshing instance network info cache due to event network-changed-c0ac6308-ae73-4b17-95fa-47f3df3c4f97. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:58:22 np0005588920 nova_compute[226886]: 2026-01-20 14:58:22.228 226890 DEBUG oslo_concurrency.lockutils [req-03fb3ec7-7555-4246-8404-01fbae682ede req-6d84eb33-b3f3-437c-b5d3-92fc76238000 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:58:22 np0005588920 nova_compute[226886]: 2026-01-20 14:58:22.228 226890 DEBUG oslo_concurrency.lockutils [req-03fb3ec7-7555-4246-8404-01fbae682ede req-6d84eb33-b3f3-437c-b5d3-92fc76238000 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:58:22 np0005588920 nova_compute[226886]: 2026-01-20 14:58:22.228 226890 DEBUG nova.network.neutron [req-03fb3ec7-7555-4246-8404-01fbae682ede req-6d84eb33-b3f3-437c-b5d3-92fc76238000 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Refreshing network info cache for port c0ac6308-ae73-4b17-95fa-47f3df3c4f97 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:58:22 np0005588920 nova_compute[226886]: 2026-01-20 14:58:22.512 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:58:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:22.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 09:58:23 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3600.1 total, 600.0 interval#012Cumulative writes: 49K writes, 202K keys, 49K commit groups, 1.0 writes per commit group, ingest: 0.20 GB, 0.06 MB/s#012Cumulative WAL: 49K writes, 18K syncs, 2.76 writes per sync, written: 0.20 GB, 0.06 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 13K writes, 58K keys, 13K commit groups, 1.0 writes per commit group, ingest: 59.00 MB, 0.10 MB/s#012Interval WAL: 13K writes, 5391 syncs, 2.59 writes per sync, written: 0.06 GB, 0.10 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 20 09:58:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:58:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:23.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:58:23 np0005588920 nova_compute[226886]: 2026-01-20 14:58:23.522 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:23 np0005588920 ovn_controller[133971]: 2026-01-20T14:58:23Z|00633|binding|INFO|Releasing lport 1623097d-35b0-4d71-9dc2-c4d659492102 from this chassis (sb_readonly=0)
Jan 20 09:58:23 np0005588920 nova_compute[226886]: 2026-01-20 14:58:23.552 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:23 np0005588920 ovn_controller[133971]: 2026-01-20T14:58:23Z|00634|binding|INFO|Releasing lport 1623097d-35b0-4d71-9dc2-c4d659492102 from this chassis (sb_readonly=0)
Jan 20 09:58:23 np0005588920 nova_compute[226886]: 2026-01-20 14:58:23.847 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:24 np0005588920 nova_compute[226886]: 2026-01-20 14:58:24.416 226890 DEBUG nova.network.neutron [req-03fb3ec7-7555-4246-8404-01fbae682ede req-6d84eb33-b3f3-437c-b5d3-92fc76238000 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Updated VIF entry in instance network info cache for port c0ac6308-ae73-4b17-95fa-47f3df3c4f97. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:58:24 np0005588920 nova_compute[226886]: 2026-01-20 14:58:24.417 226890 DEBUG nova.network.neutron [req-03fb3ec7-7555-4246-8404-01fbae682ede req-6d84eb33-b3f3-437c-b5d3-92fc76238000 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Updating instance_info_cache with network_info: [{"id": "c0ac6308-ae73-4b17-95fa-47f3df3c4f97", "address": "fa:16:3e:d8:23:95", "network": {"id": "58d966e1-4d26-414a-920e-0be2d77abb59", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1896990059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "107c1f3b5b7b413d9a389ca1166e331f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0ac6308-ae", "ovs_interfaceid": "c0ac6308-ae73-4b17-95fa-47f3df3c4f97", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:58:24 np0005588920 nova_compute[226886]: 2026-01-20 14:58:24.440 226890 DEBUG oslo_concurrency.lockutils [req-03fb3ec7-7555-4246-8404-01fbae682ede req-6d84eb33-b3f3-437c-b5d3-92fc76238000 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:58:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:24.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:25.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:26 np0005588920 nova_compute[226886]: 2026-01-20 14:58:26.213 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:26 np0005588920 nova_compute[226886]: 2026-01-20 14:58:26.287 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:26.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:27.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:58:28 np0005588920 podman[275867]: 2026-01-20 14:58:28.044029131 +0000 UTC m=+0.118044026 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 20 09:58:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:58:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:28.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:58:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:29.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:30.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:31 np0005588920 nova_compute[226886]: 2026-01-20 14:58:31.214 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:31 np0005588920 nova_compute[226886]: 2026-01-20 14:58:31.288 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:31.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:58:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:58:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:32.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:58:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:33.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:34.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:35.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:36 np0005588920 nova_compute[226886]: 2026-01-20 14:58:36.215 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:36 np0005588920 nova_compute[226886]: 2026-01-20 14:58:36.291 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:58:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:36.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:58:36 np0005588920 podman[275894]: 2026-01-20 14:58:36.964151073 +0000 UTC m=+0.052239202 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:58:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:58:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:37.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:58:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:58:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:58:37.934 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=43, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=42) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:58:37 np0005588920 nova_compute[226886]: 2026-01-20 14:58:37.934 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:58:37.935 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:58:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:38.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:39.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:40.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:41 np0005588920 nova_compute[226886]: 2026-01-20 14:58:41.217 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:41 np0005588920 nova_compute[226886]: 2026-01-20 14:58:41.317 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:41.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:41 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:58:41 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:58:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:58:42 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:58:42 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:58:42 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:58:42 np0005588920 nova_compute[226886]: 2026-01-20 14:58:42.912 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:58:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:42.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:58:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:43.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:58:43 np0005588920 nova_compute[226886]: 2026-01-20 14:58:43.614 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:43 np0005588920 NetworkManager[49076]: <info>  [1768921123.6149] manager: (patch-br-int-to-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/308)
Jan 20 09:58:43 np0005588920 NetworkManager[49076]: <info>  [1768921123.6160] manager: (patch-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/309)
Jan 20 09:58:43 np0005588920 nova_compute[226886]: 2026-01-20 14:58:43.859 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:43 np0005588920 ovn_controller[133971]: 2026-01-20T14:58:43Z|00635|binding|INFO|Releasing lport 1623097d-35b0-4d71-9dc2-c4d659492102 from this chassis (sb_readonly=0)
Jan 20 09:58:43 np0005588920 nova_compute[226886]: 2026-01-20 14:58:43.882 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:44 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:58:44 np0005588920 nova_compute[226886]: 2026-01-20 14:58:44.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:58:44 np0005588920 nova_compute[226886]: 2026-01-20 14:58:44.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:58:44 np0005588920 nova_compute[226886]: 2026-01-20 14:58:44.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 09:58:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:44.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:45 np0005588920 nova_compute[226886]: 2026-01-20 14:58:45.411 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "refresh_cache-a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:58:45 np0005588920 nova_compute[226886]: 2026-01-20 14:58:45.411 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquired lock "refresh_cache-a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:58:45 np0005588920 nova_compute[226886]: 2026-01-20 14:58:45.412 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 09:58:45 np0005588920 nova_compute[226886]: 2026-01-20 14:58:45.412 226890 DEBUG nova.objects.instance [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:58:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:45.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:58:45.936 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '43'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:58:46 np0005588920 nova_compute[226886]: 2026-01-20 14:58:46.219 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:46 np0005588920 nova_compute[226886]: 2026-01-20 14:58:46.320 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:46.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:47 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:58:47 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:58:47 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:58:47 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:58:47 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:58:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:47.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:58:48 np0005588920 nova_compute[226886]: 2026-01-20 14:58:48.451 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Updating instance_info_cache with network_info: [{"id": "c0ac6308-ae73-4b17-95fa-47f3df3c4f97", "address": "fa:16:3e:d8:23:95", "network": {"id": "58d966e1-4d26-414a-920e-0be2d77abb59", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1896990059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "107c1f3b5b7b413d9a389ca1166e331f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0ac6308-ae", "ovs_interfaceid": "c0ac6308-ae73-4b17-95fa-47f3df3c4f97", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:58:48 np0005588920 nova_compute[226886]: 2026-01-20 14:58:48.490 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Releasing lock "refresh_cache-a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:58:48 np0005588920 nova_compute[226886]: 2026-01-20 14:58:48.490 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 09:58:48 np0005588920 nova_compute[226886]: 2026-01-20 14:58:48.490 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:58:48 np0005588920 nova_compute[226886]: 2026-01-20 14:58:48.512 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:58:48 np0005588920 nova_compute[226886]: 2026-01-20 14:58:48.512 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:58:48 np0005588920 nova_compute[226886]: 2026-01-20 14:58:48.512 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:58:48 np0005588920 nova_compute[226886]: 2026-01-20 14:58:48.513 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:58:48 np0005588920 nova_compute[226886]: 2026-01-20 14:58:48.513 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:58:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:58:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:48.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:58:48 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:58:48 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4234632430' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:58:48 np0005588920 nova_compute[226886]: 2026-01-20 14:58:48.992 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.479s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:58:49 np0005588920 nova_compute[226886]: 2026-01-20 14:58:49.059 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-0000008a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:58:49 np0005588920 nova_compute[226886]: 2026-01-20 14:58:49.060 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-0000008a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:58:49 np0005588920 nova_compute[226886]: 2026-01-20 14:58:49.228 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:58:49 np0005588920 nova_compute[226886]: 2026-01-20 14:58:49.229 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4131MB free_disk=20.94619369506836GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:58:49 np0005588920 nova_compute[226886]: 2026-01-20 14:58:49.230 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:58:49 np0005588920 nova_compute[226886]: 2026-01-20 14:58:49.230 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:58:49 np0005588920 nova_compute[226886]: 2026-01-20 14:58:49.448 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:58:49 np0005588920 nova_compute[226886]: 2026-01-20 14:58:49.449 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:58:49 np0005588920 nova_compute[226886]: 2026-01-20 14:58:49.449 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:58:49 np0005588920 nova_compute[226886]: 2026-01-20 14:58:49.548 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Refreshing inventories for resource provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 20 09:58:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:58:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:49.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:58:49 np0005588920 nova_compute[226886]: 2026-01-20 14:58:49.665 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Updating ProviderTree inventory for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 20 09:58:49 np0005588920 nova_compute[226886]: 2026-01-20 14:58:49.666 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Updating inventory in ProviderTree for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 20 09:58:49 np0005588920 nova_compute[226886]: 2026-01-20 14:58:49.690 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Refreshing aggregate associations for resource provider ff38e91c-3320-4831-90ac-bcffc89ba7b6, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 20 09:58:49 np0005588920 nova_compute[226886]: 2026-01-20 14:58:49.719 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Refreshing trait associations for resource provider ff38e91c-3320-4831-90ac-bcffc89ba7b6, traits: COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_DEVICE_TAGGING,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 20 09:58:49 np0005588920 nova_compute[226886]: 2026-01-20 14:58:49.777 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:58:50 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:58:50 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2990147587' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:58:50 np0005588920 nova_compute[226886]: 2026-01-20 14:58:50.196 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.419s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:58:50 np0005588920 nova_compute[226886]: 2026-01-20 14:58:50.202 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:58:50 np0005588920 nova_compute[226886]: 2026-01-20 14:58:50.231 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:58:50 np0005588920 nova_compute[226886]: 2026-01-20 14:58:50.263 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:58:50 np0005588920 nova_compute[226886]: 2026-01-20 14:58:50.263 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.034s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:58:50 np0005588920 nova_compute[226886]: 2026-01-20 14:58:50.264 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:58:50 np0005588920 nova_compute[226886]: 2026-01-20 14:58:50.265 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 20 09:58:50 np0005588920 nova_compute[226886]: 2026-01-20 14:58:50.293 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 20 09:58:50 np0005588920 nova_compute[226886]: 2026-01-20 14:58:50.294 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:58:50 np0005588920 nova_compute[226886]: 2026-01-20 14:58:50.540 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:58:50 np0005588920 nova_compute[226886]: 2026-01-20 14:58:50.541 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:58:50 np0005588920 nova_compute[226886]: 2026-01-20 14:58:50.541 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:58:50 np0005588920 nova_compute[226886]: 2026-01-20 14:58:50.541 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:58:50 np0005588920 nova_compute[226886]: 2026-01-20 14:58:50.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:58:50 np0005588920 nova_compute[226886]: 2026-01-20 14:58:50.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:58:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:50.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:51 np0005588920 nova_compute[226886]: 2026-01-20 14:58:51.221 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:51 np0005588920 nova_compute[226886]: 2026-01-20 14:58:51.321 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:51.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:58:52 np0005588920 nova_compute[226886]: 2026-01-20 14:58:52.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:58:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:52.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:53 np0005588920 nova_compute[226886]: 2026-01-20 14:58:53.090 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:58:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:53.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:58:53 np0005588920 nova_compute[226886]: 2026-01-20 14:58:53.720 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:58:54 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:58:54 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:58:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:54.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:55.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:55 np0005588920 nova_compute[226886]: 2026-01-20 14:58:55.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:58:55 np0005588920 nova_compute[226886]: 2026-01-20 14:58:55.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 20 09:58:56 np0005588920 nova_compute[226886]: 2026-01-20 14:58:56.224 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:56 np0005588920 nova_compute[226886]: 2026-01-20 14:58:56.322 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:56.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.003000085s ======
Jan 20 09:58:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:57.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000085s
Jan 20 09:58:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:58:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:58.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:59 np0005588920 podman[276259]: 2026-01-20 14:58:59.004770426 +0000 UTC m=+0.079381307 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller)
Jan 20 09:58:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:58:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:58:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:59.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:59:00 np0005588920 ovn_controller[133971]: 2026-01-20T14:59:00Z|00636|binding|INFO|Releasing lport 1623097d-35b0-4d71-9dc2-c4d659492102 from this chassis (sb_readonly=0)
Jan 20 09:59:00 np0005588920 nova_compute[226886]: 2026-01-20 14:59:00.887 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:59:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:00.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:59:01 np0005588920 ovn_controller[133971]: 2026-01-20T14:59:01Z|00637|binding|INFO|Releasing lport 1623097d-35b0-4d71-9dc2-c4d659492102 from this chassis (sb_readonly=0)
Jan 20 09:59:01 np0005588920 nova_compute[226886]: 2026-01-20 14:59:01.200 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:01 np0005588920 nova_compute[226886]: 2026-01-20 14:59:01.225 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:01 np0005588920 nova_compute[226886]: 2026-01-20 14:59:01.324 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:01.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:59:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:59:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:02.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:59:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:03.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:04.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:59:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:05.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:59:06 np0005588920 nova_compute[226886]: 2026-01-20 14:59:06.227 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:06 np0005588920 nova_compute[226886]: 2026-01-20 14:59:06.327 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:06.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:07.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:59:07 np0005588920 podman[276284]: 2026-01-20 14:59:07.961971048 +0000 UTC m=+0.048119956 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 20 09:59:08 np0005588920 ovn_controller[133971]: 2026-01-20T14:59:08Z|00638|binding|INFO|Releasing lport 1623097d-35b0-4d71-9dc2-c4d659492102 from this chassis (sb_readonly=0)
Jan 20 09:59:08 np0005588920 nova_compute[226886]: 2026-01-20 14:59:08.516 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:08 np0005588920 ovn_controller[133971]: 2026-01-20T14:59:08Z|00639|binding|INFO|Releasing lport 1623097d-35b0-4d71-9dc2-c4d659492102 from this chassis (sb_readonly=0)
Jan 20 09:59:08 np0005588920 nova_compute[226886]: 2026-01-20 14:59:08.695 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:59:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:08.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:59:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:09.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:59:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:10.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:59:11 np0005588920 nova_compute[226886]: 2026-01-20 14:59:11.229 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:11 np0005588920 nova_compute[226886]: 2026-01-20 14:59:11.329 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:59:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:11.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:59:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:59:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:59:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:12.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:59:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:13.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:14.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:15.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:16 np0005588920 nova_compute[226886]: 2026-01-20 14:59:16.230 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:16 np0005588920 nova_compute[226886]: 2026-01-20 14:59:16.331 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:16.460 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:59:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:16.460 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:59:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:16.461 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:59:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:59:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:16.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:59:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:17.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:59:18 np0005588920 nova_compute[226886]: 2026-01-20 14:59:18.557 226890 DEBUG oslo_concurrency.lockutils [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Acquiring lock "bd615b1c-4aa9-4b2e-8d4e-04286e5e1134" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:59:18 np0005588920 nova_compute[226886]: 2026-01-20 14:59:18.557 226890 DEBUG oslo_concurrency.lockutils [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Lock "bd615b1c-4aa9-4b2e-8d4e-04286e5e1134" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:59:18 np0005588920 nova_compute[226886]: 2026-01-20 14:59:18.597 226890 DEBUG nova.compute.manager [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:59:18 np0005588920 nova_compute[226886]: 2026-01-20 14:59:18.704 226890 DEBUG oslo_concurrency.lockutils [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:59:18 np0005588920 nova_compute[226886]: 2026-01-20 14:59:18.705 226890 DEBUG oslo_concurrency.lockutils [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:59:18 np0005588920 nova_compute[226886]: 2026-01-20 14:59:18.713 226890 DEBUG nova.virt.hardware [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:59:18 np0005588920 nova_compute[226886]: 2026-01-20 14:59:18.713 226890 INFO nova.compute.claims [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 09:59:18 np0005588920 nova_compute[226886]: 2026-01-20 14:59:18.860 226890 DEBUG oslo_concurrency.processutils [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:59:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:19.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:19 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:59:19 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2897357433' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:59:19 np0005588920 nova_compute[226886]: 2026-01-20 14:59:19.318 226890 DEBUG oslo_concurrency.processutils [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:59:19 np0005588920 nova_compute[226886]: 2026-01-20 14:59:19.324 226890 DEBUG nova.compute.provider_tree [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:59:19 np0005588920 nova_compute[226886]: 2026-01-20 14:59:19.341 226890 DEBUG nova.scheduler.client.report [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:59:19 np0005588920 nova_compute[226886]: 2026-01-20 14:59:19.365 226890 DEBUG oslo_concurrency.lockutils [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.660s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:59:19 np0005588920 nova_compute[226886]: 2026-01-20 14:59:19.366 226890 DEBUG nova.compute.manager [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:59:19 np0005588920 nova_compute[226886]: 2026-01-20 14:59:19.413 226890 DEBUG nova.compute.manager [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:59:19 np0005588920 nova_compute[226886]: 2026-01-20 14:59:19.414 226890 DEBUG nova.network.neutron [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:59:19 np0005588920 nova_compute[226886]: 2026-01-20 14:59:19.435 226890 INFO nova.virt.libvirt.driver [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:59:19 np0005588920 nova_compute[226886]: 2026-01-20 14:59:19.454 226890 DEBUG nova.compute.manager [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:59:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:19.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:19 np0005588920 nova_compute[226886]: 2026-01-20 14:59:19.800 226890 DEBUG nova.compute.manager [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 20 09:59:19 np0005588920 nova_compute[226886]: 2026-01-20 14:59:19.801 226890 DEBUG nova.virt.libvirt.driver [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 20 09:59:19 np0005588920 nova_compute[226886]: 2026-01-20 14:59:19.801 226890 INFO nova.virt.libvirt.driver [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Creating image(s)
Jan 20 09:59:19 np0005588920 nova_compute[226886]: 2026-01-20 14:59:19.828 226890 DEBUG nova.storage.rbd_utils [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] rbd image bd615b1c-4aa9-4b2e-8d4e-04286e5e1134_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:59:19 np0005588920 nova_compute[226886]: 2026-01-20 14:59:19.856 226890 DEBUG nova.storage.rbd_utils [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] rbd image bd615b1c-4aa9-4b2e-8d4e-04286e5e1134_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:59:19 np0005588920 nova_compute[226886]: 2026-01-20 14:59:19.882 226890 DEBUG nova.storage.rbd_utils [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] rbd image bd615b1c-4aa9-4b2e-8d4e-04286e5e1134_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:59:19 np0005588920 nova_compute[226886]: 2026-01-20 14:59:19.886 226890 DEBUG oslo_concurrency.processutils [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:59:19 np0005588920 nova_compute[226886]: 2026-01-20 14:59:19.912 226890 DEBUG nova.policy [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd77d3db3cf924683a608d10efefcd156', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '105e56abe3804424885c7aa8d1216d12', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 20 09:59:19 np0005588920 nova_compute[226886]: 2026-01-20 14:59:19.948 226890 DEBUG oslo_concurrency.processutils [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:59:19 np0005588920 nova_compute[226886]: 2026-01-20 14:59:19.949 226890 DEBUG oslo_concurrency.lockutils [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:59:19 np0005588920 nova_compute[226886]: 2026-01-20 14:59:19.950 226890 DEBUG oslo_concurrency.lockutils [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:59:19 np0005588920 nova_compute[226886]: 2026-01-20 14:59:19.950 226890 DEBUG oslo_concurrency.lockutils [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:59:19 np0005588920 nova_compute[226886]: 2026-01-20 14:59:19.977 226890 DEBUG nova.storage.rbd_utils [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] rbd image bd615b1c-4aa9-4b2e-8d4e-04286e5e1134_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:59:19 np0005588920 nova_compute[226886]: 2026-01-20 14:59:19.981 226890 DEBUG oslo_concurrency.processutils [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 bd615b1c-4aa9-4b2e-8d4e-04286e5e1134_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:59:20 np0005588920 nova_compute[226886]: 2026-01-20 14:59:20.615 226890 DEBUG nova.network.neutron [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Successfully created port: e887ca59-5aa6-4f7e-a8f7-3c48c8829f25 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 20 09:59:20 np0005588920 nova_compute[226886]: 2026-01-20 14:59:20.676 226890 DEBUG oslo_concurrency.processutils [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 bd615b1c-4aa9-4b2e-8d4e-04286e5e1134_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.695s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:59:20 np0005588920 nova_compute[226886]: 2026-01-20 14:59:20.745 226890 DEBUG nova.storage.rbd_utils [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] resizing rbd image bd615b1c-4aa9-4b2e-8d4e-04286e5e1134_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 20 09:59:20 np0005588920 nova_compute[226886]: 2026-01-20 14:59:20.853 226890 DEBUG nova.objects.instance [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Lazy-loading 'migration_context' on Instance uuid bd615b1c-4aa9-4b2e-8d4e-04286e5e1134 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 09:59:20 np0005588920 nova_compute[226886]: 2026-01-20 14:59:20.867 226890 DEBUG nova.virt.libvirt.driver [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 20 09:59:20 np0005588920 nova_compute[226886]: 2026-01-20 14:59:20.868 226890 DEBUG nova.virt.libvirt.driver [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Ensure instance console log exists: /var/lib/nova/instances/bd615b1c-4aa9-4b2e-8d4e-04286e5e1134/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 20 09:59:20 np0005588920 nova_compute[226886]: 2026-01-20 14:59:20.868 226890 DEBUG oslo_concurrency.lockutils [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:59:20 np0005588920 nova_compute[226886]: 2026-01-20 14:59:20.869 226890 DEBUG oslo_concurrency.lockutils [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:59:20 np0005588920 nova_compute[226886]: 2026-01-20 14:59:20.869 226890 DEBUG oslo_concurrency.lockutils [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:59:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:59:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:21.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:59:21 np0005588920 nova_compute[226886]: 2026-01-20 14:59:21.233 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:59:21 np0005588920 nova_compute[226886]: 2026-01-20 14:59:21.333 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:59:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:21.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:21 np0005588920 nova_compute[226886]: 2026-01-20 14:59:21.663 226890 DEBUG nova.network.neutron [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Successfully updated port: e887ca59-5aa6-4f7e-a8f7-3c48c8829f25 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 20 09:59:21 np0005588920 nova_compute[226886]: 2026-01-20 14:59:21.704 226890 DEBUG oslo_concurrency.lockutils [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Acquiring lock "refresh_cache-bd615b1c-4aa9-4b2e-8d4e-04286e5e1134" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 20 09:59:21 np0005588920 nova_compute[226886]: 2026-01-20 14:59:21.705 226890 DEBUG oslo_concurrency.lockutils [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Acquired lock "refresh_cache-bd615b1c-4aa9-4b2e-8d4e-04286e5e1134" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 20 09:59:21 np0005588920 nova_compute[226886]: 2026-01-20 14:59:21.705 226890 DEBUG nova.network.neutron [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 20 09:59:21 np0005588920 nova_compute[226886]: 2026-01-20 14:59:21.809 226890 DEBUG nova.compute.manager [req-87ade745-0d41-4467-a9f7-2df74f101e80 req-15415dbf-a020-47e5-a5c3-d005c21ad195 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Received event network-changed-e887ca59-5aa6-4f7e-a8f7-3c48c8829f25 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 09:59:21 np0005588920 nova_compute[226886]: 2026-01-20 14:59:21.809 226890 DEBUG nova.compute.manager [req-87ade745-0d41-4467-a9f7-2df74f101e80 req-15415dbf-a020-47e5-a5c3-d005c21ad195 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Refreshing instance network info cache due to event network-changed-e887ca59-5aa6-4f7e-a8f7-3c48c8829f25. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 20 09:59:21 np0005588920 nova_compute[226886]: 2026-01-20 14:59:21.810 226890 DEBUG oslo_concurrency.lockutils [req-87ade745-0d41-4467-a9f7-2df74f101e80 req-15415dbf-a020-47e5-a5c3-d005c21ad195 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-bd615b1c-4aa9-4b2e-8d4e-04286e5e1134" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 20 09:59:21 np0005588920 nova_compute[226886]: 2026-01-20 14:59:21.911 226890 DEBUG nova.network.neutron [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 20 09:59:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:59:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:23.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:23 np0005588920 nova_compute[226886]: 2026-01-20 14:59:23.136 226890 DEBUG nova.network.neutron [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Updating instance_info_cache with network_info: [{"id": "e887ca59-5aa6-4f7e-a8f7-3c48c8829f25", "address": "fa:16:3e:35:64:8f", "network": {"id": "3aad5d71-9bbf-496d-805e-819d17c4343e", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1714826441-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "105e56abe3804424885c7aa8d1216d12", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape887ca59-5a", "ovs_interfaceid": "e887ca59-5aa6-4f7e-a8f7-3c48c8829f25", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 20 09:59:23 np0005588920 nova_compute[226886]: 2026-01-20 14:59:23.154 226890 DEBUG oslo_concurrency.lockutils [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Releasing lock "refresh_cache-bd615b1c-4aa9-4b2e-8d4e-04286e5e1134" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 20 09:59:23 np0005588920 nova_compute[226886]: 2026-01-20 14:59:23.155 226890 DEBUG nova.compute.manager [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Instance network_info: |[{"id": "e887ca59-5aa6-4f7e-a8f7-3c48c8829f25", "address": "fa:16:3e:35:64:8f", "network": {"id": "3aad5d71-9bbf-496d-805e-819d17c4343e", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1714826441-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "105e56abe3804424885c7aa8d1216d12", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape887ca59-5a", "ovs_interfaceid": "e887ca59-5aa6-4f7e-a8f7-3c48c8829f25", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 20 09:59:23 np0005588920 nova_compute[226886]: 2026-01-20 14:59:23.156 226890 DEBUG oslo_concurrency.lockutils [req-87ade745-0d41-4467-a9f7-2df74f101e80 req-15415dbf-a020-47e5-a5c3-d005c21ad195 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-bd615b1c-4aa9-4b2e-8d4e-04286e5e1134" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 20 09:59:23 np0005588920 nova_compute[226886]: 2026-01-20 14:59:23.156 226890 DEBUG nova.network.neutron [req-87ade745-0d41-4467-a9f7-2df74f101e80 req-15415dbf-a020-47e5-a5c3-d005c21ad195 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Refreshing network info cache for port e887ca59-5aa6-4f7e-a8f7-3c48c8829f25 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 20 09:59:23 np0005588920 nova_compute[226886]: 2026-01-20 14:59:23.158 226890 DEBUG nova.virt.libvirt.driver [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Start _get_guest_xml network_info=[{"id": "e887ca59-5aa6-4f7e-a8f7-3c48c8829f25", "address": "fa:16:3e:35:64:8f", "network": {"id": "3aad5d71-9bbf-496d-805e-819d17c4343e", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1714826441-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "105e56abe3804424885c7aa8d1216d12", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape887ca59-5a", "ovs_interfaceid": "e887ca59-5aa6-4f7e-a8f7-3c48c8829f25", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 20 09:59:23 np0005588920 nova_compute[226886]: 2026-01-20 14:59:23.162 226890 WARNING nova.virt.libvirt.driver [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 20 09:59:23 np0005588920 nova_compute[226886]: 2026-01-20 14:59:23.170 226890 DEBUG nova.virt.libvirt.host [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 20 09:59:23 np0005588920 nova_compute[226886]: 2026-01-20 14:59:23.171 226890 DEBUG nova.virt.libvirt.host [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 20 09:59:23 np0005588920 nova_compute[226886]: 2026-01-20 14:59:23.180 226890 DEBUG nova.virt.libvirt.host [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 20 09:59:23 np0005588920 nova_compute[226886]: 2026-01-20 14:59:23.180 226890 DEBUG nova.virt.libvirt.host [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 20 09:59:23 np0005588920 nova_compute[226886]: 2026-01-20 14:59:23.181 226890 DEBUG nova.virt.libvirt.driver [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 20 09:59:23 np0005588920 nova_compute[226886]: 2026-01-20 14:59:23.182 226890 DEBUG nova.virt.hardware [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 20 09:59:23 np0005588920 nova_compute[226886]: 2026-01-20 14:59:23.182 226890 DEBUG nova.virt.hardware [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 20 09:59:23 np0005588920 nova_compute[226886]: 2026-01-20 14:59:23.182 226890 DEBUG nova.virt.hardware [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 20 09:59:23 np0005588920 nova_compute[226886]: 2026-01-20 14:59:23.183 226890 DEBUG nova.virt.hardware [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 20 09:59:23 np0005588920 nova_compute[226886]: 2026-01-20 14:59:23.183 226890 DEBUG nova.virt.hardware [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 20 09:59:23 np0005588920 nova_compute[226886]: 2026-01-20 14:59:23.183 226890 DEBUG nova.virt.hardware [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 20 09:59:23 np0005588920 nova_compute[226886]: 2026-01-20 14:59:23.183 226890 DEBUG nova.virt.hardware [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 20 09:59:23 np0005588920 nova_compute[226886]: 2026-01-20 14:59:23.184 226890 DEBUG nova.virt.hardware [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 20 09:59:23 np0005588920 nova_compute[226886]: 2026-01-20 14:59:23.184 226890 DEBUG nova.virt.hardware [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 20 09:59:23 np0005588920 nova_compute[226886]: 2026-01-20 14:59:23.184 226890 DEBUG nova.virt.hardware [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 20 09:59:23 np0005588920 nova_compute[226886]: 2026-01-20 14:59:23.184 226890 DEBUG nova.virt.hardware [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 20 09:59:23 np0005588920 nova_compute[226886]: 2026-01-20 14:59:23.187 226890 DEBUG oslo_concurrency.processutils [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:59:23 np0005588920 NetworkManager[49076]: <info>  [1768921163.5331] manager: (patch-br-int-to-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/310)
Jan 20 09:59:23 np0005588920 nova_compute[226886]: 2026-01-20 14:59:23.532 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:59:23 np0005588920 NetworkManager[49076]: <info>  [1768921163.5342] manager: (patch-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/311)
Jan 20 09:59:23 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:59:23 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2388556621' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:59:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:23.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:23 np0005588920 nova_compute[226886]: 2026-01-20 14:59:23.618 226890 DEBUG oslo_concurrency.processutils [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:59:23 np0005588920 nova_compute[226886]: 2026-01-20 14:59:23.646 226890 DEBUG nova.storage.rbd_utils [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] rbd image bd615b1c-4aa9-4b2e-8d4e-04286e5e1134_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:59:23 np0005588920 nova_compute[226886]: 2026-01-20 14:59:23.649 226890 DEBUG oslo_concurrency.processutils [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:59:23 np0005588920 nova_compute[226886]: 2026-01-20 14:59:23.705 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:59:23 np0005588920 ovn_controller[133971]: 2026-01-20T14:59:23Z|00640|binding|INFO|Releasing lport 1623097d-35b0-4d71-9dc2-c4d659492102 from this chassis (sb_readonly=0)
Jan 20 09:59:23 np0005588920 nova_compute[226886]: 2026-01-20 14:59:23.725 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:24 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:59:24 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1151045178' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:59:24 np0005588920 nova_compute[226886]: 2026-01-20 14:59:24.094 226890 DEBUG oslo_concurrency.processutils [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:59:24 np0005588920 nova_compute[226886]: 2026-01-20 14:59:24.096 226890 DEBUG nova.virt.libvirt.vif [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:59:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-134692852',display_name='tempest-ServersNegativeTestJSON-server-134692852',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-134692852',id=144,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='105e56abe3804424885c7aa8d1216d12',ramdisk_id='',reservation_id='r-mhw0te8y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-1233513591',owner_user_name='tempest-ServersNegativeTestJSON-1233513591-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:59:19Z,user_data=None,user_id='d77d3db3cf924683a608d10efefcd156',uuid=bd615b1c-4aa9-4b2e-8d4e-04286e5e1134,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e887ca59-5aa6-4f7e-a8f7-3c48c8829f25", "address": "fa:16:3e:35:64:8f", "network": {"id": "3aad5d71-9bbf-496d-805e-819d17c4343e", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1714826441-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "105e56abe3804424885c7aa8d1216d12", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape887ca59-5a", "ovs_interfaceid": "e887ca59-5aa6-4f7e-a8f7-3c48c8829f25", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:59:24 np0005588920 nova_compute[226886]: 2026-01-20 14:59:24.097 226890 DEBUG nova.network.os_vif_util [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Converting VIF {"id": "e887ca59-5aa6-4f7e-a8f7-3c48c8829f25", "address": "fa:16:3e:35:64:8f", "network": {"id": "3aad5d71-9bbf-496d-805e-819d17c4343e", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1714826441-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "105e56abe3804424885c7aa8d1216d12", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape887ca59-5a", "ovs_interfaceid": "e887ca59-5aa6-4f7e-a8f7-3c48c8829f25", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:59:24 np0005588920 nova_compute[226886]: 2026-01-20 14:59:24.098 226890 DEBUG nova.network.os_vif_util [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:64:8f,bridge_name='br-int',has_traffic_filtering=True,id=e887ca59-5aa6-4f7e-a8f7-3c48c8829f25,network=Network(3aad5d71-9bbf-496d-805e-819d17c4343e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape887ca59-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
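The pair of `Converting VIF … / Converted object VIFOpenVSwitch(…)` lines above show nova mapping its VIF dict onto the much smaller os-vif object. A hedged sketch of that mapping, under the assumption that only the logged fields matter; `SimpleOVSVif` and `vif_dict_to_ovs` are illustrative stand-ins, not the real os-vif classes:

```python
from dataclasses import dataclass

@dataclass
class SimpleOVSVif:
    # Hypothetical miniature of os_vif VIFOpenVSwitch: only fields the OVS
    # plugin acts on, with names taken from the VIF dict in the log above.
    id: str
    address: str
    bridge_name: str
    vif_name: str
    has_traffic_filtering: bool

def vif_dict_to_ovs(vif: dict) -> SimpleOVSVif:
    """Collapse a nova VIF dict to the subset os-vif needs to plug it."""
    details = vif.get("details", {})
    return SimpleOVSVif(
        id=vif["id"],
        address=vif["address"],
        # details carries the binding-time bridge; fall back to the network's
        bridge_name=details.get("bridge_name", vif["network"]["bridge"]),
        vif_name=vif["devname"],
        has_traffic_filtering=details.get("port_filter", False),
    )
```

Note how `port_filter: true` in the dict surfaces as `has_traffic_filtering=True` in the converted object, which is why nova skips its own firewall driver for this port.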
Jan 20 09:59:24 np0005588920 nova_compute[226886]: 2026-01-20 14:59:24.099 226890 DEBUG nova.objects.instance [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Lazy-loading 'pci_devices' on Instance uuid bd615b1c-4aa9-4b2e-8d4e-04286e5e1134 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:59:24 np0005588920 nova_compute[226886]: 2026-01-20 14:59:24.115 226890 DEBUG nova.virt.libvirt.driver [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:59:24 np0005588920 nova_compute[226886]:  <uuid>bd615b1c-4aa9-4b2e-8d4e-04286e5e1134</uuid>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:  <name>instance-00000090</name>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:59:24 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:      <nova:name>tempest-ServersNegativeTestJSON-server-134692852</nova:name>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:59:23</nova:creationTime>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:59:24 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:        <nova:user uuid="d77d3db3cf924683a608d10efefcd156">tempest-ServersNegativeTestJSON-1233513591-project-member</nova:user>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:        <nova:project uuid="105e56abe3804424885c7aa8d1216d12">tempest-ServersNegativeTestJSON-1233513591</nova:project>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:        <nova:port uuid="e887ca59-5aa6-4f7e-a8f7-3c48c8829f25">
Jan 20 09:59:24 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:      <entry name="serial">bd615b1c-4aa9-4b2e-8d4e-04286e5e1134</entry>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:      <entry name="uuid">bd615b1c-4aa9-4b2e-8d4e-04286e5e1134</entry>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:59:24 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/bd615b1c-4aa9-4b2e-8d4e-04286e5e1134_disk">
Jan 20 09:59:24 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:59:24 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:59:24 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/bd615b1c-4aa9-4b2e-8d4e-04286e5e1134_disk.config">
Jan 20 09:59:24 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:59:24 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:59:24 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:35:64:8f"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:      <target dev="tape887ca59-5a"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:59:24 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/bd615b1c-4aa9-4b2e-8d4e-04286e5e1134/console.log" append="off"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:59:24 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:59:24 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:59:24 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:59:24 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:59:24 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:59:24 np0005588920 nova_compute[226886]: 2026-01-20 14:59:24.117 226890 DEBUG nova.compute.manager [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Preparing to wait for external event network-vif-plugged-e887ca59-5aa6-4f7e-a8f7-3c48c8829f25 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:59:24 np0005588920 nova_compute[226886]: 2026-01-20 14:59:24.117 226890 DEBUG oslo_concurrency.lockutils [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Acquiring lock "bd615b1c-4aa9-4b2e-8d4e-04286e5e1134-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:59:24 np0005588920 nova_compute[226886]: 2026-01-20 14:59:24.117 226890 DEBUG oslo_concurrency.lockutils [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Lock "bd615b1c-4aa9-4b2e-8d4e-04286e5e1134-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:59:24 np0005588920 nova_compute[226886]: 2026-01-20 14:59:24.118 226890 DEBUG oslo_concurrency.lockutils [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Lock "bd615b1c-4aa9-4b2e-8d4e-04286e5e1134-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:59:24 np0005588920 nova_compute[226886]: 2026-01-20 14:59:24.118 226890 DEBUG nova.virt.libvirt.vif [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:59:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-134692852',display_name='tempest-ServersNegativeTestJSON-server-134692852',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-134692852',id=144,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='105e56abe3804424885c7aa8d1216d12',ramdisk_id='',reservation_id='r-mhw0te8y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-1233513591',owner_user_name='tempest-ServersNegativeTestJSON-1233513591-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:59:19Z,user_data=None,user_id='d77d3db3cf924683a608d10efefcd156',uuid=bd615b1c-4aa9-4b2e-8d4e-04286e5e1134,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e887ca59-5aa6-4f7e-a8f7-3c48c8829f25", "address": "fa:16:3e:35:64:8f", "network": {"id": "3aad5d71-9bbf-496d-805e-819d17c4343e", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1714826441-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "105e56abe3804424885c7aa8d1216d12", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape887ca59-5a", "ovs_interfaceid": "e887ca59-5aa6-4f7e-a8f7-3c48c8829f25", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:59:24 np0005588920 nova_compute[226886]: 2026-01-20 14:59:24.119 226890 DEBUG nova.network.os_vif_util [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Converting VIF {"id": "e887ca59-5aa6-4f7e-a8f7-3c48c8829f25", "address": "fa:16:3e:35:64:8f", "network": {"id": "3aad5d71-9bbf-496d-805e-819d17c4343e", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1714826441-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "105e56abe3804424885c7aa8d1216d12", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape887ca59-5a", "ovs_interfaceid": "e887ca59-5aa6-4f7e-a8f7-3c48c8829f25", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:59:24 np0005588920 nova_compute[226886]: 2026-01-20 14:59:24.120 226890 DEBUG nova.network.os_vif_util [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:64:8f,bridge_name='br-int',has_traffic_filtering=True,id=e887ca59-5aa6-4f7e-a8f7-3c48c8829f25,network=Network(3aad5d71-9bbf-496d-805e-819d17c4343e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape887ca59-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:59:24 np0005588920 nova_compute[226886]: 2026-01-20 14:59:24.120 226890 DEBUG os_vif [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:64:8f,bridge_name='br-int',has_traffic_filtering=True,id=e887ca59-5aa6-4f7e-a8f7-3c48c8829f25,network=Network(3aad5d71-9bbf-496d-805e-819d17c4343e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape887ca59-5a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:59:24 np0005588920 nova_compute[226886]: 2026-01-20 14:59:24.121 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:24 np0005588920 nova_compute[226886]: 2026-01-20 14:59:24.121 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:59:24 np0005588920 nova_compute[226886]: 2026-01-20 14:59:24.122 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:59:24 np0005588920 nova_compute[226886]: 2026-01-20 14:59:24.125 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:24 np0005588920 nova_compute[226886]: 2026-01-20 14:59:24.126 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape887ca59-5a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:59:24 np0005588920 nova_compute[226886]: 2026-01-20 14:59:24.126 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape887ca59-5a, col_values=(('external_ids', {'iface-id': 'e887ca59-5aa6-4f7e-a8f7-3c48c8829f25', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:35:64:8f', 'vm-uuid': 'bd615b1c-4aa9-4b2e-8d4e-04286e5e1134'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
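The `DbSetCommand` above writes the `external_ids` that tie the OVS interface to the Neutron port; `iface-id` is the key ovn-controller matches against the logical switch port to bind it to this chassis. A hypothetical reconstruction of that mapping (illustrative helper, not nova's or os-vif's actual code):

```python
def ovs_external_ids(port_id: str, mac: str, instance_uuid: str) -> dict:
    """Build the external_ids set on the OVS Interface row for a nova VIF.

    Hypothetical sketch mirroring the DbSetCommand in the log above.
    """
    return {
        "iface-id": port_id,      # Neutron port UUID; OVN's binding key
        "iface-status": "active",
        "attached-mac": mac,      # instance MAC, for port security checks
        "vm-uuid": instance_uuid, # lets operators trace the port to the VM
    }
```

Once this row lands in the local ovsdb, ovn-controller claims the lport and Neutron eventually emits the `network-vif-plugged` event nova registered for a few lines earlier.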
Jan 20 09:59:24 np0005588920 nova_compute[226886]: 2026-01-20 14:59:24.127 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:24 np0005588920 NetworkManager[49076]: <info>  [1768921164.1287] manager: (tape887ca59-5a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/312)
Jan 20 09:59:24 np0005588920 nova_compute[226886]: 2026-01-20 14:59:24.130 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:59:24 np0005588920 nova_compute[226886]: 2026-01-20 14:59:24.134 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:24 np0005588920 nova_compute[226886]: 2026-01-20 14:59:24.136 226890 INFO os_vif [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:64:8f,bridge_name='br-int',has_traffic_filtering=True,id=e887ca59-5aa6-4f7e-a8f7-3c48c8829f25,network=Network(3aad5d71-9bbf-496d-805e-819d17c4343e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape887ca59-5a')#033[00m
Jan 20 09:59:24 np0005588920 nova_compute[226886]: 2026-01-20 14:59:24.191 226890 DEBUG nova.virt.libvirt.driver [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:59:24 np0005588920 nova_compute[226886]: 2026-01-20 14:59:24.192 226890 DEBUG nova.virt.libvirt.driver [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:59:24 np0005588920 nova_compute[226886]: 2026-01-20 14:59:24.192 226890 DEBUG nova.virt.libvirt.driver [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] No VIF found with MAC fa:16:3e:35:64:8f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:59:24 np0005588920 nova_compute[226886]: 2026-01-20 14:59:24.193 226890 INFO nova.virt.libvirt.driver [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Using config drive#033[00m
Jan 20 09:59:24 np0005588920 nova_compute[226886]: 2026-01-20 14:59:24.217 226890 DEBUG nova.storage.rbd_utils [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] rbd image bd615b1c-4aa9-4b2e-8d4e-04286e5e1134_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:59:24 np0005588920 nova_compute[226886]: 2026-01-20 14:59:24.768 226890 INFO nova.virt.libvirt.driver [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Creating config drive at /var/lib/nova/instances/bd615b1c-4aa9-4b2e-8d4e-04286e5e1134/disk.config#033[00m
Jan 20 09:59:24 np0005588920 nova_compute[226886]: 2026-01-20 14:59:24.773 226890 DEBUG oslo_concurrency.processutils [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/bd615b1c-4aa9-4b2e-8d4e-04286e5e1134/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_qoi1k2x execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:59:24 np0005588920 nova_compute[226886]: 2026-01-20 14:59:24.904 226890 DEBUG oslo_concurrency.processutils [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/bd615b1c-4aa9-4b2e-8d4e-04286e5e1134/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_qoi1k2x" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:59:24 np0005588920 nova_compute[226886]: 2026-01-20 14:59:24.932 226890 DEBUG nova.storage.rbd_utils [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] rbd image bd615b1c-4aa9-4b2e-8d4e-04286e5e1134_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:59:24 np0005588920 nova_compute[226886]: 2026-01-20 14:59:24.937 226890 DEBUG oslo_concurrency.processutils [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/bd615b1c-4aa9-4b2e-8d4e-04286e5e1134/disk.config bd615b1c-4aa9-4b2e-8d4e-04286e5e1134_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:59:24 np0005588920 nova_compute[226886]: 2026-01-20 14:59:24.965 226890 DEBUG nova.network.neutron [req-87ade745-0d41-4467-a9f7-2df74f101e80 req-15415dbf-a020-47e5-a5c3-d005c21ad195 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Updated VIF entry in instance network info cache for port e887ca59-5aa6-4f7e-a8f7-3c48c8829f25. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:59:24 np0005588920 nova_compute[226886]: 2026-01-20 14:59:24.966 226890 DEBUG nova.network.neutron [req-87ade745-0d41-4467-a9f7-2df74f101e80 req-15415dbf-a020-47e5-a5c3-d005c21ad195 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Updating instance_info_cache with network_info: [{"id": "e887ca59-5aa6-4f7e-a8f7-3c48c8829f25", "address": "fa:16:3e:35:64:8f", "network": {"id": "3aad5d71-9bbf-496d-805e-819d17c4343e", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1714826441-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "105e56abe3804424885c7aa8d1216d12", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape887ca59-5a", "ovs_interfaceid": "e887ca59-5aa6-4f7e-a8f7-3c48c8829f25", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:59:24 np0005588920 nova_compute[226886]: 2026-01-20 14:59:24.995 226890 DEBUG oslo_concurrency.lockutils [req-87ade745-0d41-4467-a9f7-2df74f101e80 req-15415dbf-a020-47e5-a5c3-d005c21ad195 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-bd615b1c-4aa9-4b2e-8d4e-04286e5e1134" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:59:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:25.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:25 np0005588920 nova_compute[226886]: 2026-01-20 14:59:25.193 226890 DEBUG oslo_concurrency.processutils [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/bd615b1c-4aa9-4b2e-8d4e-04286e5e1134/disk.config bd615b1c-4aa9-4b2e-8d4e-04286e5e1134_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.256s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:59:25 np0005588920 nova_compute[226886]: 2026-01-20 14:59:25.193 226890 INFO nova.virt.libvirt.driver [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Deleting local config drive /var/lib/nova/instances/bd615b1c-4aa9-4b2e-8d4e-04286e5e1134/disk.config because it was imported into RBD.#033[00m
Jan 20 09:59:25 np0005588920 kernel: tape887ca59-5a: entered promiscuous mode
Jan 20 09:59:25 np0005588920 NetworkManager[49076]: <info>  [1768921165.2494] manager: (tape887ca59-5a): new Tun device (/org/freedesktop/NetworkManager/Devices/313)
Jan 20 09:59:25 np0005588920 ovn_controller[133971]: 2026-01-20T14:59:25Z|00641|binding|INFO|Claiming lport e887ca59-5aa6-4f7e-a8f7-3c48c8829f25 for this chassis.
Jan 20 09:59:25 np0005588920 ovn_controller[133971]: 2026-01-20T14:59:25Z|00642|binding|INFO|e887ca59-5aa6-4f7e-a8f7-3c48c8829f25: Claiming fa:16:3e:35:64:8f 10.100.0.11
Jan 20 09:59:25 np0005588920 nova_compute[226886]: 2026-01-20 14:59:25.251 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:25.260 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:35:64:8f 10.100.0.11'], port_security=['fa:16:3e:35:64:8f 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'bd615b1c-4aa9-4b2e-8d4e-04286e5e1134', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3aad5d71-9bbf-496d-805e-819d17c4343e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '105e56abe3804424885c7aa8d1216d12', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5c26cf5d-4215-4bd2-8a4b-3ad6a5f65504', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=298e3802-e88f-473c-a925-fb8c9f7cfd27, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=e887ca59-5aa6-4f7e-a8f7-3c48c8829f25) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:25.262 144128 INFO neutron.agent.ovn.metadata.agent [-] Port e887ca59-5aa6-4f7e-a8f7-3c48c8829f25 in datapath 3aad5d71-9bbf-496d-805e-819d17c4343e bound to our chassis#033[00m
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:25.263 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3aad5d71-9bbf-496d-805e-819d17c4343e#033[00m
Jan 20 09:59:25 np0005588920 ovn_controller[133971]: 2026-01-20T14:59:25Z|00643|binding|INFO|Setting lport e887ca59-5aa6-4f7e-a8f7-3c48c8829f25 ovn-installed in OVS
Jan 20 09:59:25 np0005588920 ovn_controller[133971]: 2026-01-20T14:59:25Z|00644|binding|INFO|Setting lport e887ca59-5aa6-4f7e-a8f7-3c48c8829f25 up in Southbound
Jan 20 09:59:25 np0005588920 nova_compute[226886]: 2026-01-20 14:59:25.268 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:25 np0005588920 nova_compute[226886]: 2026-01-20 14:59:25.270 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:25.274 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0cbb8948-3fa6-4552-8f67-badc33873e97]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:25.275 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3aad5d71-91 in ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:25.276 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3aad5d71-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:25.277 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[68a1938d-b775-4362-9ac1-95cbb5dba0d0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:25.277 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[68aadf1c-2a7f-4880-8391-fc361422fcd3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:25 np0005588920 systemd-udevd[276629]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:59:25 np0005588920 systemd-machined[196121]: New machine qemu-64-instance-00000090.
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:25.290 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[abf037f5-e596-4960-97f7-14868aa5cc61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:25.302 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[7499acf6-4b07-4f4a-8a7f-fa5fd8cdee57]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:25 np0005588920 NetworkManager[49076]: <info>  [1768921165.3053] device (tape887ca59-5a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:59:25 np0005588920 NetworkManager[49076]: <info>  [1768921165.3062] device (tape887ca59-5a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:59:25 np0005588920 systemd[1]: Started Virtual Machine qemu-64-instance-00000090.
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:25.332 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[42ddbd2e-344c-4fa7-805d-d50877fa51c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:25.337 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[fdc37c6e-51c3-4e7f-9b7d-6c00678b456c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:25 np0005588920 NetworkManager[49076]: <info>  [1768921165.3389] manager: (tap3aad5d71-90): new Veth device (/org/freedesktop/NetworkManager/Devices/314)
Jan 20 09:59:25 np0005588920 systemd-udevd[276632]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:25.369 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[c20e66d4-ec4f-4533-884e-417b25c7c55f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:25.372 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[badd9fe9-f456-47d2-b351-83755c20f37b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:25 np0005588920 NetworkManager[49076]: <info>  [1768921165.3978] device (tap3aad5d71-90): carrier: link connected
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:25.402 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[83a334f7-b6a3-43f6-b01d-37a7b496394a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:25.421 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2692d2e9-734e-4781-97ab-10eaab8e24af]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3aad5d71-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a8:0d:1c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 203], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 620708, 'reachable_time': 28056, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 276660, 'error': None, 'target': 'ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:25.439 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e4be4322-7553-4156-b344-fffe9c9e8c3a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea8:d1c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 620708, 'tstamp': 620708}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 276661, 'error': None, 'target': 'ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:25.458 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c5b1f3c9-92bf-4c12-9d8b-a31f79aa3558]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3aad5d71-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a8:0d:1c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 203], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 620708, 'reachable_time': 28056, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 276662, 'error': None, 'target': 'ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:25.492 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c20f8b9f-ba89-4cf6-8406-d49395bbced5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:25.546 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f734246a-ae0e-43a7-a72e-1fc5f42bb36b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:25.547 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3aad5d71-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:25.548 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:25.548 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3aad5d71-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:59:25 np0005588920 kernel: tap3aad5d71-90: entered promiscuous mode
Jan 20 09:59:25 np0005588920 NetworkManager[49076]: <info>  [1768921165.5510] manager: (tap3aad5d71-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/315)
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:25.552 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3aad5d71-90, col_values=(('external_ids', {'iface-id': '326d4a7f-b98b-4d21-8fb2-256cf03a3e6a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:59:25 np0005588920 ovn_controller[133971]: 2026-01-20T14:59:25Z|00645|binding|INFO|Releasing lport 326d4a7f-b98b-4d21-8fb2-256cf03a3e6a from this chassis (sb_readonly=0)
Jan 20 09:59:25 np0005588920 nova_compute[226886]: 2026-01-20 14:59:25.563 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:25.570 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3aad5d71-9bbf-496d-805e-819d17c4343e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3aad5d71-9bbf-496d-805e-819d17c4343e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:59:25 np0005588920 nova_compute[226886]: 2026-01-20 14:59:25.570 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:25.571 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0424aa01-8172-48fd-9b81-c1c923f9ee89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:25.572 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-3aad5d71-9bbf-496d-805e-819d17c4343e
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/3aad5d71-9bbf-496d-805e-819d17c4343e.pid.haproxy
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID 3aad5d71-9bbf-496d-805e-819d17c4343e
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:59:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:25.573 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e', 'env', 'PROCESS_TAG=haproxy-3aad5d71-9bbf-496d-805e-819d17c4343e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3aad5d71-9bbf-496d-805e-819d17c4343e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:59:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:25.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:25 np0005588920 podman[276694]: 2026-01-20 14:59:25.92190614 +0000 UTC m=+0.048571760 container create a25cb92421dc82542f91ed9fe8d6b0eb2b50489eba0e6683a00e4473281438a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:59:25 np0005588920 systemd[1]: Started libpod-conmon-a25cb92421dc82542f91ed9fe8d6b0eb2b50489eba0e6683a00e4473281438a9.scope.
Jan 20 09:59:25 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:59:25 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/45548df73b11105e9ba84dbfeecf71f87f6769a4355ce7d9eee1f94314c67000/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:59:25 np0005588920 podman[276694]: 2026-01-20 14:59:25.893848779 +0000 UTC m=+0.020514419 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:59:25 np0005588920 podman[276694]: 2026-01-20 14:59:25.994792313 +0000 UTC m=+0.121457953 container init a25cb92421dc82542f91ed9fe8d6b0eb2b50489eba0e6683a00e4473281438a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 20 09:59:26 np0005588920 podman[276694]: 2026-01-20 14:59:26.002010436 +0000 UTC m=+0.128676056 container start a25cb92421dc82542f91ed9fe8d6b0eb2b50489eba0e6683a00e4473281438a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:59:26 np0005588920 neutron-haproxy-ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e[276709]: [NOTICE]   (276713) : New worker (276715) forked
Jan 20 09:59:26 np0005588920 neutron-haproxy-ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e[276709]: [NOTICE]   (276713) : Loading success.
Jan 20 09:59:26 np0005588920 nova_compute[226886]: 2026-01-20 14:59:26.194 226890 DEBUG nova.compute.manager [req-ec4a3ed5-7dcf-4b3d-b3c6-4f5a5ef094af req-60b66ffd-f466-4b04-92aa-8cfce7ac73b6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Received event network-vif-plugged-e887ca59-5aa6-4f7e-a8f7-3c48c8829f25 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:59:26 np0005588920 nova_compute[226886]: 2026-01-20 14:59:26.195 226890 DEBUG oslo_concurrency.lockutils [req-ec4a3ed5-7dcf-4b3d-b3c6-4f5a5ef094af req-60b66ffd-f466-4b04-92aa-8cfce7ac73b6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "bd615b1c-4aa9-4b2e-8d4e-04286e5e1134-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:59:26 np0005588920 nova_compute[226886]: 2026-01-20 14:59:26.195 226890 DEBUG oslo_concurrency.lockutils [req-ec4a3ed5-7dcf-4b3d-b3c6-4f5a5ef094af req-60b66ffd-f466-4b04-92aa-8cfce7ac73b6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "bd615b1c-4aa9-4b2e-8d4e-04286e5e1134-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:59:26 np0005588920 nova_compute[226886]: 2026-01-20 14:59:26.195 226890 DEBUG oslo_concurrency.lockutils [req-ec4a3ed5-7dcf-4b3d-b3c6-4f5a5ef094af req-60b66ffd-f466-4b04-92aa-8cfce7ac73b6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "bd615b1c-4aa9-4b2e-8d4e-04286e5e1134-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:59:26 np0005588920 nova_compute[226886]: 2026-01-20 14:59:26.196 226890 DEBUG nova.compute.manager [req-ec4a3ed5-7dcf-4b3d-b3c6-4f5a5ef094af req-60b66ffd-f466-4b04-92aa-8cfce7ac73b6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Processing event network-vif-plugged-e887ca59-5aa6-4f7e-a8f7-3c48c8829f25 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:59:26 np0005588920 nova_compute[226886]: 2026-01-20 14:59:26.340 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:26 np0005588920 nova_compute[226886]: 2026-01-20 14:59:26.394 226890 DEBUG nova.compute.manager [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:59:26 np0005588920 nova_compute[226886]: 2026-01-20 14:59:26.395 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921166.3937976, bd615b1c-4aa9-4b2e-8d4e-04286e5e1134 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:59:26 np0005588920 nova_compute[226886]: 2026-01-20 14:59:26.395 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] VM Started (Lifecycle Event)#033[00m
Jan 20 09:59:26 np0005588920 nova_compute[226886]: 2026-01-20 14:59:26.398 226890 DEBUG nova.virt.libvirt.driver [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:59:26 np0005588920 nova_compute[226886]: 2026-01-20 14:59:26.401 226890 INFO nova.virt.libvirt.driver [-] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Instance spawned successfully.#033[00m
Jan 20 09:59:26 np0005588920 nova_compute[226886]: 2026-01-20 14:59:26.402 226890 DEBUG nova.virt.libvirt.driver [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:59:26 np0005588920 nova_compute[226886]: 2026-01-20 14:59:26.415 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:59:26 np0005588920 nova_compute[226886]: 2026-01-20 14:59:26.421 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:59:26 np0005588920 nova_compute[226886]: 2026-01-20 14:59:26.426 226890 DEBUG nova.virt.libvirt.driver [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:59:26 np0005588920 nova_compute[226886]: 2026-01-20 14:59:26.426 226890 DEBUG nova.virt.libvirt.driver [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:59:26 np0005588920 nova_compute[226886]: 2026-01-20 14:59:26.427 226890 DEBUG nova.virt.libvirt.driver [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:59:26 np0005588920 nova_compute[226886]: 2026-01-20 14:59:26.427 226890 DEBUG nova.virt.libvirt.driver [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:59:26 np0005588920 nova_compute[226886]: 2026-01-20 14:59:26.427 226890 DEBUG nova.virt.libvirt.driver [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:59:26 np0005588920 nova_compute[226886]: 2026-01-20 14:59:26.428 226890 DEBUG nova.virt.libvirt.driver [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:59:26 np0005588920 nova_compute[226886]: 2026-01-20 14:59:26.450 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:59:26 np0005588920 nova_compute[226886]: 2026-01-20 14:59:26.452 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921166.3947513, bd615b1c-4aa9-4b2e-8d4e-04286e5e1134 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:59:26 np0005588920 nova_compute[226886]: 2026-01-20 14:59:26.453 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:59:26 np0005588920 nova_compute[226886]: 2026-01-20 14:59:26.477 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:59:26 np0005588920 nova_compute[226886]: 2026-01-20 14:59:26.481 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921166.397779, bd615b1c-4aa9-4b2e-8d4e-04286e5e1134 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:59:26 np0005588920 nova_compute[226886]: 2026-01-20 14:59:26.481 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:59:26 np0005588920 nova_compute[226886]: 2026-01-20 14:59:26.484 226890 INFO nova.compute.manager [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Took 6.68 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:59:26 np0005588920 nova_compute[226886]: 2026-01-20 14:59:26.484 226890 DEBUG nova.compute.manager [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:59:26 np0005588920 nova_compute[226886]: 2026-01-20 14:59:26.495 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:59:26 np0005588920 nova_compute[226886]: 2026-01-20 14:59:26.498 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:59:26 np0005588920 nova_compute[226886]: 2026-01-20 14:59:26.541 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:59:26 np0005588920 nova_compute[226886]: 2026-01-20 14:59:26.559 226890 INFO nova.compute.manager [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Took 7.91 seconds to build instance.#033[00m
Jan 20 09:59:26 np0005588920 nova_compute[226886]: 2026-01-20 14:59:26.574 226890 DEBUG oslo_concurrency.lockutils [None req-2412764e-614f-43ed-80dd-83bc5bd765a8 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Lock "bd615b1c-4aa9-4b2e-8d4e-04286e5e1134" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.017s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:59:26 np0005588920 ovn_controller[133971]: 2026-01-20T14:59:26Z|00646|binding|INFO|Releasing lport 326d4a7f-b98b-4d21-8fb2-256cf03a3e6a from this chassis (sb_readonly=0)
Jan 20 09:59:26 np0005588920 ovn_controller[133971]: 2026-01-20T14:59:26Z|00647|binding|INFO|Releasing lport 1623097d-35b0-4d71-9dc2-c4d659492102 from this chassis (sb_readonly=0)
Jan 20 09:59:26 np0005588920 nova_compute[226886]: 2026-01-20 14:59:26.650 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:27.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:27.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:59:28 np0005588920 nova_compute[226886]: 2026-01-20 14:59:28.251 226890 DEBUG oslo_concurrency.lockutils [None req-b543acd9-b7f5-483b-9a65-0a83950abb4b d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Acquiring lock "bd615b1c-4aa9-4b2e-8d4e-04286e5e1134" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:59:28 np0005588920 nova_compute[226886]: 2026-01-20 14:59:28.252 226890 DEBUG oslo_concurrency.lockutils [None req-b543acd9-b7f5-483b-9a65-0a83950abb4b d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Lock "bd615b1c-4aa9-4b2e-8d4e-04286e5e1134" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:59:28 np0005588920 nova_compute[226886]: 2026-01-20 14:59:28.253 226890 DEBUG oslo_concurrency.lockutils [None req-b543acd9-b7f5-483b-9a65-0a83950abb4b d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Acquiring lock "bd615b1c-4aa9-4b2e-8d4e-04286e5e1134-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:59:28 np0005588920 nova_compute[226886]: 2026-01-20 14:59:28.253 226890 DEBUG oslo_concurrency.lockutils [None req-b543acd9-b7f5-483b-9a65-0a83950abb4b d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Lock "bd615b1c-4aa9-4b2e-8d4e-04286e5e1134-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:59:28 np0005588920 nova_compute[226886]: 2026-01-20 14:59:28.253 226890 DEBUG oslo_concurrency.lockutils [None req-b543acd9-b7f5-483b-9a65-0a83950abb4b d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Lock "bd615b1c-4aa9-4b2e-8d4e-04286e5e1134-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:59:28 np0005588920 nova_compute[226886]: 2026-01-20 14:59:28.254 226890 INFO nova.compute.manager [None req-b543acd9-b7f5-483b-9a65-0a83950abb4b d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Terminating instance#033[00m
Jan 20 09:59:28 np0005588920 nova_compute[226886]: 2026-01-20 14:59:28.255 226890 DEBUG nova.compute.manager [None req-b543acd9-b7f5-483b-9a65-0a83950abb4b d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:59:28 np0005588920 kernel: tape887ca59-5a (unregistering): left promiscuous mode
Jan 20 09:59:28 np0005588920 NetworkManager[49076]: <info>  [1768921168.2951] device (tape887ca59-5a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:59:28 np0005588920 nova_compute[226886]: 2026-01-20 14:59:28.302 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:28 np0005588920 ovn_controller[133971]: 2026-01-20T14:59:28Z|00648|binding|INFO|Releasing lport e887ca59-5aa6-4f7e-a8f7-3c48c8829f25 from this chassis (sb_readonly=0)
Jan 20 09:59:28 np0005588920 ovn_controller[133971]: 2026-01-20T14:59:28Z|00649|binding|INFO|Setting lport e887ca59-5aa6-4f7e-a8f7-3c48c8829f25 down in Southbound
Jan 20 09:59:28 np0005588920 ovn_controller[133971]: 2026-01-20T14:59:28Z|00650|binding|INFO|Removing iface tape887ca59-5a ovn-installed in OVS
Jan 20 09:59:28 np0005588920 nova_compute[226886]: 2026-01-20 14:59:28.304 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:28 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:28.313 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:35:64:8f 10.100.0.11'], port_security=['fa:16:3e:35:64:8f 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'bd615b1c-4aa9-4b2e-8d4e-04286e5e1134', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3aad5d71-9bbf-496d-805e-819d17c4343e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '105e56abe3804424885c7aa8d1216d12', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5c26cf5d-4215-4bd2-8a4b-3ad6a5f65504', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=298e3802-e88f-473c-a925-fb8c9f7cfd27, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=e887ca59-5aa6-4f7e-a8f7-3c48c8829f25) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:59:28 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:28.314 144128 INFO neutron.agent.ovn.metadata.agent [-] Port e887ca59-5aa6-4f7e-a8f7-3c48c8829f25 in datapath 3aad5d71-9bbf-496d-805e-819d17c4343e unbound from our chassis#033[00m
Jan 20 09:59:28 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:28.316 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3aad5d71-9bbf-496d-805e-819d17c4343e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:59:28 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:28.317 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[92194ed6-19de-428c-a6db-cb365b6c2e91]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:28 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:28.317 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e namespace which is not needed anymore#033[00m
Jan 20 09:59:28 np0005588920 nova_compute[226886]: 2026-01-20 14:59:28.321 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:28 np0005588920 nova_compute[226886]: 2026-01-20 14:59:28.343 226890 DEBUG nova.compute.manager [req-9ed08908-82f6-4ed0-89fc-d4c8b535aae4 req-2675fa3f-1640-47e1-99f8-b735829379c4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Received event network-vif-plugged-e887ca59-5aa6-4f7e-a8f7-3c48c8829f25 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:59:28 np0005588920 nova_compute[226886]: 2026-01-20 14:59:28.344 226890 DEBUG oslo_concurrency.lockutils [req-9ed08908-82f6-4ed0-89fc-d4c8b535aae4 req-2675fa3f-1640-47e1-99f8-b735829379c4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "bd615b1c-4aa9-4b2e-8d4e-04286e5e1134-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:59:28 np0005588920 nova_compute[226886]: 2026-01-20 14:59:28.344 226890 DEBUG oslo_concurrency.lockutils [req-9ed08908-82f6-4ed0-89fc-d4c8b535aae4 req-2675fa3f-1640-47e1-99f8-b735829379c4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "bd615b1c-4aa9-4b2e-8d4e-04286e5e1134-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:59:28 np0005588920 nova_compute[226886]: 2026-01-20 14:59:28.344 226890 DEBUG oslo_concurrency.lockutils [req-9ed08908-82f6-4ed0-89fc-d4c8b535aae4 req-2675fa3f-1640-47e1-99f8-b735829379c4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "bd615b1c-4aa9-4b2e-8d4e-04286e5e1134-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:59:28 np0005588920 nova_compute[226886]: 2026-01-20 14:59:28.344 226890 DEBUG nova.compute.manager [req-9ed08908-82f6-4ed0-89fc-d4c8b535aae4 req-2675fa3f-1640-47e1-99f8-b735829379c4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] No waiting events found dispatching network-vif-plugged-e887ca59-5aa6-4f7e-a8f7-3c48c8829f25 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:59:28 np0005588920 nova_compute[226886]: 2026-01-20 14:59:28.345 226890 WARNING nova.compute.manager [req-9ed08908-82f6-4ed0-89fc-d4c8b535aae4 req-2675fa3f-1640-47e1-99f8-b735829379c4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Received unexpected event network-vif-plugged-e887ca59-5aa6-4f7e-a8f7-3c48c8829f25 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:59:28 np0005588920 systemd[1]: machine-qemu\x2d64\x2dinstance\x2d00000090.scope: Deactivated successfully.
Jan 20 09:59:28 np0005588920 systemd[1]: machine-qemu\x2d64\x2dinstance\x2d00000090.scope: Consumed 3.003s CPU time.
Jan 20 09:59:28 np0005588920 systemd-machined[196121]: Machine qemu-64-instance-00000090 terminated.
Jan 20 09:59:28 np0005588920 neutron-haproxy-ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e[276709]: [NOTICE]   (276713) : haproxy version is 2.8.14-c23fe91
Jan 20 09:59:28 np0005588920 neutron-haproxy-ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e[276709]: [NOTICE]   (276713) : path to executable is /usr/sbin/haproxy
Jan 20 09:59:28 np0005588920 neutron-haproxy-ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e[276709]: [WARNING]  (276713) : Exiting Master process...
Jan 20 09:59:28 np0005588920 neutron-haproxy-ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e[276709]: [WARNING]  (276713) : Exiting Master process...
Jan 20 09:59:28 np0005588920 neutron-haproxy-ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e[276709]: [ALERT]    (276713) : Current worker (276715) exited with code 143 (Terminated)
Jan 20 09:59:28 np0005588920 neutron-haproxy-ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e[276709]: [WARNING]  (276713) : All workers exited. Exiting... (0)
Jan 20 09:59:28 np0005588920 systemd[1]: libpod-a25cb92421dc82542f91ed9fe8d6b0eb2b50489eba0e6683a00e4473281438a9.scope: Deactivated successfully.
Jan 20 09:59:28 np0005588920 podman[276791]: 2026-01-20 14:59:28.445898564 +0000 UTC m=+0.046351086 container died a25cb92421dc82542f91ed9fe8d6b0eb2b50489eba0e6683a00e4473281438a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 20 09:59:28 np0005588920 nova_compute[226886]: 2026-01-20 14:59:28.488 226890 INFO nova.virt.libvirt.driver [-] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Instance destroyed successfully.#033[00m
Jan 20 09:59:28 np0005588920 nova_compute[226886]: 2026-01-20 14:59:28.490 226890 DEBUG nova.objects.instance [None req-b543acd9-b7f5-483b-9a65-0a83950abb4b d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Lazy-loading 'resources' on Instance uuid bd615b1c-4aa9-4b2e-8d4e-04286e5e1134 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:59:28 np0005588920 nova_compute[226886]: 2026-01-20 14:59:28.506 226890 DEBUG nova.virt.libvirt.vif [None req-b543acd9-b7f5-483b-9a65-0a83950abb4b d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:59:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-134692852',display_name='tempest-ServersNegativeTestJSON-server-134692852',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-134692852',id=144,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:59:26Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='105e56abe3804424885c7aa8d1216d12',ramdisk_id='',reservation_id='r-mhw0te8y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',im
age_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestJSON-1233513591',owner_user_name='tempest-ServersNegativeTestJSON-1233513591-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:59:26Z,user_data=None,user_id='d77d3db3cf924683a608d10efefcd156',uuid=bd615b1c-4aa9-4b2e-8d4e-04286e5e1134,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e887ca59-5aa6-4f7e-a8f7-3c48c8829f25", "address": "fa:16:3e:35:64:8f", "network": {"id": "3aad5d71-9bbf-496d-805e-819d17c4343e", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1714826441-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "105e56abe3804424885c7aa8d1216d12", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape887ca59-5a", "ovs_interfaceid": "e887ca59-5aa6-4f7e-a8f7-3c48c8829f25", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:59:28 np0005588920 nova_compute[226886]: 2026-01-20 14:59:28.507 226890 DEBUG nova.network.os_vif_util [None req-b543acd9-b7f5-483b-9a65-0a83950abb4b d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Converting VIF {"id": "e887ca59-5aa6-4f7e-a8f7-3c48c8829f25", "address": "fa:16:3e:35:64:8f", "network": {"id": "3aad5d71-9bbf-496d-805e-819d17c4343e", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1714826441-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "105e56abe3804424885c7aa8d1216d12", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape887ca59-5a", "ovs_interfaceid": "e887ca59-5aa6-4f7e-a8f7-3c48c8829f25", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:59:28 np0005588920 nova_compute[226886]: 2026-01-20 14:59:28.508 226890 DEBUG nova.network.os_vif_util [None req-b543acd9-b7f5-483b-9a65-0a83950abb4b d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:64:8f,bridge_name='br-int',has_traffic_filtering=True,id=e887ca59-5aa6-4f7e-a8f7-3c48c8829f25,network=Network(3aad5d71-9bbf-496d-805e-819d17c4343e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape887ca59-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:59:28 np0005588920 nova_compute[226886]: 2026-01-20 14:59:28.508 226890 DEBUG os_vif [None req-b543acd9-b7f5-483b-9a65-0a83950abb4b d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:64:8f,bridge_name='br-int',has_traffic_filtering=True,id=e887ca59-5aa6-4f7e-a8f7-3c48c8829f25,network=Network(3aad5d71-9bbf-496d-805e-819d17c4343e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape887ca59-5a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:59:28 np0005588920 nova_compute[226886]: 2026-01-20 14:59:28.511 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:28 np0005588920 nova_compute[226886]: 2026-01-20 14:59:28.511 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape887ca59-5a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:59:28 np0005588920 nova_compute[226886]: 2026-01-20 14:59:28.513 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:28 np0005588920 nova_compute[226886]: 2026-01-20 14:59:28.514 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:28 np0005588920 nova_compute[226886]: 2026-01-20 14:59:28.516 226890 INFO os_vif [None req-b543acd9-b7f5-483b-9a65-0a83950abb4b d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:64:8f,bridge_name='br-int',has_traffic_filtering=True,id=e887ca59-5aa6-4f7e-a8f7-3c48c8829f25,network=Network(3aad5d71-9bbf-496d-805e-819d17c4343e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape887ca59-5a')#033[00m
Jan 20 09:59:28 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a25cb92421dc82542f91ed9fe8d6b0eb2b50489eba0e6683a00e4473281438a9-userdata-shm.mount: Deactivated successfully.
Jan 20 09:59:28 np0005588920 systemd[1]: var-lib-containers-storage-overlay-45548df73b11105e9ba84dbfeecf71f87f6769a4355ce7d9eee1f94314c67000-merged.mount: Deactivated successfully.
Jan 20 09:59:28 np0005588920 podman[276791]: 2026-01-20 14:59:28.557078916 +0000 UTC m=+0.157531418 container cleanup a25cb92421dc82542f91ed9fe8d6b0eb2b50489eba0e6683a00e4473281438a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 20 09:59:28 np0005588920 systemd[1]: libpod-conmon-a25cb92421dc82542f91ed9fe8d6b0eb2b50489eba0e6683a00e4473281438a9.scope: Deactivated successfully.
Jan 20 09:59:28 np0005588920 podman[276849]: 2026-01-20 14:59:28.632905532 +0000 UTC m=+0.051480471 container remove a25cb92421dc82542f91ed9fe8d6b0eb2b50489eba0e6683a00e4473281438a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Jan 20 09:59:28 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:28.638 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b326d3cf-6725-4dc8-a08c-16c9c8c2d0fc]: (4, ('Tue Jan 20 02:59:28 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e (a25cb92421dc82542f91ed9fe8d6b0eb2b50489eba0e6683a00e4473281438a9)\na25cb92421dc82542f91ed9fe8d6b0eb2b50489eba0e6683a00e4473281438a9\nTue Jan 20 02:59:28 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e (a25cb92421dc82542f91ed9fe8d6b0eb2b50489eba0e6683a00e4473281438a9)\na25cb92421dc82542f91ed9fe8d6b0eb2b50489eba0e6683a00e4473281438a9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:28 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:28.640 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c42b8d4a-642f-4597-abea-309e0399e63f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:28 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:28.641 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3aad5d71-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:59:28 np0005588920 kernel: tap3aad5d71-90: left promiscuous mode
Jan 20 09:59:28 np0005588920 nova_compute[226886]: 2026-01-20 14:59:28.643 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:28 np0005588920 nova_compute[226886]: 2026-01-20 14:59:28.657 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:28 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:28.661 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0e434d3f-cbad-4d03-9eab-a5f235795f6c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:28 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:28.675 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[7e56a117-44d0-41ed-b8ae-e21b500c089a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:28 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:28.677 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[54e98273-02ed-4c1c-9043-dcad3059da12]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:28 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:28.695 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c9ee7b46-508e-4ff9-8a74-23b471d361c9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 620701, 'reachable_time': 19029, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 276866, 'error': None, 'target': 'ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:28 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:28.698 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:59:28 np0005588920 systemd[1]: run-netns-ovnmeta\x2d3aad5d71\x2d9bbf\x2d496d\x2d805e\x2d819d17c4343e.mount: Deactivated successfully.
Jan 20 09:59:28 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:28.699 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[c5e42c0a-1d5c-4d19-9b58-6e75159ec694]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:59:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:29.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:59:29 np0005588920 nova_compute[226886]: 2026-01-20 14:59:29.093 226890 INFO nova.virt.libvirt.driver [None req-b543acd9-b7f5-483b-9a65-0a83950abb4b d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Deleting instance files /var/lib/nova/instances/bd615b1c-4aa9-4b2e-8d4e-04286e5e1134_del#033[00m
Jan 20 09:59:29 np0005588920 nova_compute[226886]: 2026-01-20 14:59:29.094 226890 INFO nova.virt.libvirt.driver [None req-b543acd9-b7f5-483b-9a65-0a83950abb4b d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Deletion of /var/lib/nova/instances/bd615b1c-4aa9-4b2e-8d4e-04286e5e1134_del complete#033[00m
Jan 20 09:59:29 np0005588920 nova_compute[226886]: 2026-01-20 14:59:29.205 226890 INFO nova.compute.manager [None req-b543acd9-b7f5-483b-9a65-0a83950abb4b d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Took 0.95 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:59:29 np0005588920 nova_compute[226886]: 2026-01-20 14:59:29.205 226890 DEBUG oslo.service.loopingcall [None req-b543acd9-b7f5-483b-9a65-0a83950abb4b d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:59:29 np0005588920 nova_compute[226886]: 2026-01-20 14:59:29.206 226890 DEBUG nova.compute.manager [-] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:59:29 np0005588920 nova_compute[226886]: 2026-01-20 14:59:29.206 226890 DEBUG nova.network.neutron [-] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:59:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:29.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:29 np0005588920 nova_compute[226886]: 2026-01-20 14:59:29.997 226890 DEBUG nova.network.neutron [-] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:59:30 np0005588920 podman[276868]: 2026-01-20 14:59:30.010226598 +0000 UTC m=+0.096672644 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack 
Kubernetes Operator team, container_name=ovn_controller)
Jan 20 09:59:30 np0005588920 nova_compute[226886]: 2026-01-20 14:59:30.016 226890 INFO nova.compute.manager [-] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Took 0.81 seconds to deallocate network for instance.#033[00m
Jan 20 09:59:30 np0005588920 nova_compute[226886]: 2026-01-20 14:59:30.067 226890 DEBUG oslo_concurrency.lockutils [None req-b543acd9-b7f5-483b-9a65-0a83950abb4b d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:59:30 np0005588920 nova_compute[226886]: 2026-01-20 14:59:30.068 226890 DEBUG oslo_concurrency.lockutils [None req-b543acd9-b7f5-483b-9a65-0a83950abb4b d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:59:30 np0005588920 nova_compute[226886]: 2026-01-20 14:59:30.071 226890 DEBUG nova.compute.manager [req-f67e0ff2-fd51-45b5-806d-5137904a8a9b req-0043c0d2-34b3-44c1-af58-61e6d2c5a313 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Received event network-vif-deleted-e887ca59-5aa6-4f7e-a8f7-3c48c8829f25 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:59:30 np0005588920 nova_compute[226886]: 2026-01-20 14:59:30.146 226890 DEBUG oslo_concurrency.processutils [None req-b543acd9-b7f5-483b-9a65-0a83950abb4b d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:59:30 np0005588920 nova_compute[226886]: 2026-01-20 14:59:30.451 226890 DEBUG nova.compute.manager [req-508fd9c7-0505-4b65-9fbd-fc8bd004ffc0 req-208abddf-2d33-497e-987b-b2394255fd73 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Received event network-vif-unplugged-e887ca59-5aa6-4f7e-a8f7-3c48c8829f25 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:59:30 np0005588920 nova_compute[226886]: 2026-01-20 14:59:30.452 226890 DEBUG oslo_concurrency.lockutils [req-508fd9c7-0505-4b65-9fbd-fc8bd004ffc0 req-208abddf-2d33-497e-987b-b2394255fd73 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "bd615b1c-4aa9-4b2e-8d4e-04286e5e1134-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:59:30 np0005588920 nova_compute[226886]: 2026-01-20 14:59:30.452 226890 DEBUG oslo_concurrency.lockutils [req-508fd9c7-0505-4b65-9fbd-fc8bd004ffc0 req-208abddf-2d33-497e-987b-b2394255fd73 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "bd615b1c-4aa9-4b2e-8d4e-04286e5e1134-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:59:30 np0005588920 nova_compute[226886]: 2026-01-20 14:59:30.452 226890 DEBUG oslo_concurrency.lockutils [req-508fd9c7-0505-4b65-9fbd-fc8bd004ffc0 req-208abddf-2d33-497e-987b-b2394255fd73 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "bd615b1c-4aa9-4b2e-8d4e-04286e5e1134-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:59:30 np0005588920 nova_compute[226886]: 2026-01-20 14:59:30.453 226890 DEBUG nova.compute.manager [req-508fd9c7-0505-4b65-9fbd-fc8bd004ffc0 req-208abddf-2d33-497e-987b-b2394255fd73 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] No waiting events found dispatching network-vif-unplugged-e887ca59-5aa6-4f7e-a8f7-3c48c8829f25 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:59:30 np0005588920 nova_compute[226886]: 2026-01-20 14:59:30.453 226890 WARNING nova.compute.manager [req-508fd9c7-0505-4b65-9fbd-fc8bd004ffc0 req-208abddf-2d33-497e-987b-b2394255fd73 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Received unexpected event network-vif-unplugged-e887ca59-5aa6-4f7e-a8f7-3c48c8829f25 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 09:59:30 np0005588920 nova_compute[226886]: 2026-01-20 14:59:30.453 226890 DEBUG nova.compute.manager [req-508fd9c7-0505-4b65-9fbd-fc8bd004ffc0 req-208abddf-2d33-497e-987b-b2394255fd73 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Received event network-vif-plugged-e887ca59-5aa6-4f7e-a8f7-3c48c8829f25 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:59:30 np0005588920 nova_compute[226886]: 2026-01-20 14:59:30.454 226890 DEBUG oslo_concurrency.lockutils [req-508fd9c7-0505-4b65-9fbd-fc8bd004ffc0 req-208abddf-2d33-497e-987b-b2394255fd73 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "bd615b1c-4aa9-4b2e-8d4e-04286e5e1134-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:59:30 np0005588920 nova_compute[226886]: 2026-01-20 14:59:30.454 226890 DEBUG oslo_concurrency.lockutils [req-508fd9c7-0505-4b65-9fbd-fc8bd004ffc0 req-208abddf-2d33-497e-987b-b2394255fd73 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "bd615b1c-4aa9-4b2e-8d4e-04286e5e1134-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:59:30 np0005588920 nova_compute[226886]: 2026-01-20 14:59:30.454 226890 DEBUG oslo_concurrency.lockutils [req-508fd9c7-0505-4b65-9fbd-fc8bd004ffc0 req-208abddf-2d33-497e-987b-b2394255fd73 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "bd615b1c-4aa9-4b2e-8d4e-04286e5e1134-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:59:30 np0005588920 nova_compute[226886]: 2026-01-20 14:59:30.454 226890 DEBUG nova.compute.manager [req-508fd9c7-0505-4b65-9fbd-fc8bd004ffc0 req-208abddf-2d33-497e-987b-b2394255fd73 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] No waiting events found dispatching network-vif-plugged-e887ca59-5aa6-4f7e-a8f7-3c48c8829f25 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:59:30 np0005588920 nova_compute[226886]: 2026-01-20 14:59:30.455 226890 WARNING nova.compute.manager [req-508fd9c7-0505-4b65-9fbd-fc8bd004ffc0 req-208abddf-2d33-497e-987b-b2394255fd73 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Received unexpected event network-vif-plugged-e887ca59-5aa6-4f7e-a8f7-3c48c8829f25 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 09:59:30 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:59:30 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2951671161' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:59:30 np0005588920 nova_compute[226886]: 2026-01-20 14:59:30.558 226890 DEBUG oslo_concurrency.processutils [None req-b543acd9-b7f5-483b-9a65-0a83950abb4b d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.412s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:59:30 np0005588920 nova_compute[226886]: 2026-01-20 14:59:30.566 226890 DEBUG nova.compute.provider_tree [None req-b543acd9-b7f5-483b-9a65-0a83950abb4b d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:59:30 np0005588920 nova_compute[226886]: 2026-01-20 14:59:30.583 226890 DEBUG nova.scheduler.client.report [None req-b543acd9-b7f5-483b-9a65-0a83950abb4b d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:59:30 np0005588920 nova_compute[226886]: 2026-01-20 14:59:30.604 226890 DEBUG oslo_concurrency.lockutils [None req-b543acd9-b7f5-483b-9a65-0a83950abb4b d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.535s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:59:30 np0005588920 nova_compute[226886]: 2026-01-20 14:59:30.639 226890 INFO nova.scheduler.client.report [None req-b543acd9-b7f5-483b-9a65-0a83950abb4b d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Deleted allocations for instance bd615b1c-4aa9-4b2e-8d4e-04286e5e1134#033[00m
Jan 20 09:59:30 np0005588920 nova_compute[226886]: 2026-01-20 14:59:30.714 226890 DEBUG oslo_concurrency.lockutils [None req-b543acd9-b7f5-483b-9a65-0a83950abb4b d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Lock "bd615b1c-4aa9-4b2e-8d4e-04286e5e1134" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.461s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:59:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:31.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:31 np0005588920 nova_compute[226886]: 2026-01-20 14:59:31.343 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:31.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:32 np0005588920 nova_compute[226886]: 2026-01-20 14:59:32.496 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:59:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:33.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:33 np0005588920 nova_compute[226886]: 2026-01-20 14:59:33.514 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:33.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:35.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:35 np0005588920 nova_compute[226886]: 2026-01-20 14:59:35.440 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:35.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:36 np0005588920 nova_compute[226886]: 2026-01-20 14:59:36.015 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:59:36 np0005588920 nova_compute[226886]: 2026-01-20 14:59:36.035 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Triggering sync for uuid a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 20 09:59:36 np0005588920 nova_compute[226886]: 2026-01-20 14:59:36.036 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:59:36 np0005588920 nova_compute[226886]: 2026-01-20 14:59:36.036 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:59:36 np0005588920 nova_compute[226886]: 2026-01-20 14:59:36.070 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.034s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:59:36 np0005588920 nova_compute[226886]: 2026-01-20 14:59:36.345 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:59:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:37.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:59:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:37.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:59:38 np0005588920 nova_compute[226886]: 2026-01-20 14:59:38.517 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:38 np0005588920 nova_compute[226886]: 2026-01-20 14:59:38.856 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:38 np0005588920 podman[276919]: 2026-01-20 14:59:38.955114096 +0000 UTC m=+0.046520932 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Jan 20 09:59:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:39.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:39 np0005588920 nova_compute[226886]: 2026-01-20 14:59:39.199 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:39.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:41.017 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=44, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=43) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:59:41 np0005588920 nova_compute[226886]: 2026-01-20 14:59:41.017 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:41.018 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:59:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:41.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:41 np0005588920 nova_compute[226886]: 2026-01-20 14:59:41.347 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:41.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:59:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:43.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:43 np0005588920 nova_compute[226886]: 2026-01-20 14:59:43.157 226890 DEBUG oslo_concurrency.lockutils [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Acquiring lock "4960a060-fded-4f60-af04-b810330687b7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:59:43 np0005588920 nova_compute[226886]: 2026-01-20 14:59:43.158 226890 DEBUG oslo_concurrency.lockutils [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Lock "4960a060-fded-4f60-af04-b810330687b7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:59:43 np0005588920 nova_compute[226886]: 2026-01-20 14:59:43.228 226890 DEBUG nova.compute.manager [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:59:43 np0005588920 nova_compute[226886]: 2026-01-20 14:59:43.315 226890 DEBUG oslo_concurrency.lockutils [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:59:43 np0005588920 nova_compute[226886]: 2026-01-20 14:59:43.316 226890 DEBUG oslo_concurrency.lockutils [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:59:43 np0005588920 nova_compute[226886]: 2026-01-20 14:59:43.323 226890 DEBUG nova.virt.hardware [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:59:43 np0005588920 nova_compute[226886]: 2026-01-20 14:59:43.323 226890 INFO nova.compute.claims [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 09:59:43 np0005588920 nova_compute[226886]: 2026-01-20 14:59:43.486 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768921168.4856656, bd615b1c-4aa9-4b2e-8d4e-04286e5e1134 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:59:43 np0005588920 nova_compute[226886]: 2026-01-20 14:59:43.486 226890 INFO nova.compute.manager [-] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:59:43 np0005588920 nova_compute[226886]: 2026-01-20 14:59:43.506 226890 DEBUG nova.compute.manager [None req-eb0a08cd-a116-4a34-913a-5f6c97131b79 - - - - - -] [instance: bd615b1c-4aa9-4b2e-8d4e-04286e5e1134] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:59:43 np0005588920 nova_compute[226886]: 2026-01-20 14:59:43.515 226890 DEBUG oslo_concurrency.processutils [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:59:43 np0005588920 nova_compute[226886]: 2026-01-20 14:59:43.543 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:43 np0005588920 nova_compute[226886]: 2026-01-20 14:59:43.640 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:43.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:43 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:59:43 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/764431920' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:59:43 np0005588920 nova_compute[226886]: 2026-01-20 14:59:43.975 226890 DEBUG oslo_concurrency.processutils [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:59:43 np0005588920 nova_compute[226886]: 2026-01-20 14:59:43.980 226890 DEBUG nova.compute.provider_tree [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:59:43 np0005588920 nova_compute[226886]: 2026-01-20 14:59:43.999 226890 DEBUG nova.scheduler.client.report [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:59:44 np0005588920 nova_compute[226886]: 2026-01-20 14:59:44.015 226890 DEBUG oslo_concurrency.lockutils [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.700s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:59:44 np0005588920 nova_compute[226886]: 2026-01-20 14:59:44.016 226890 DEBUG nova.compute.manager [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:59:44 np0005588920 nova_compute[226886]: 2026-01-20 14:59:44.066 226890 DEBUG nova.compute.manager [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:59:44 np0005588920 nova_compute[226886]: 2026-01-20 14:59:44.066 226890 DEBUG nova.network.neutron [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:59:44 np0005588920 nova_compute[226886]: 2026-01-20 14:59:44.084 226890 INFO nova.virt.libvirt.driver [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:59:44 np0005588920 nova_compute[226886]: 2026-01-20 14:59:44.100 226890 DEBUG nova.compute.manager [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:59:44 np0005588920 nova_compute[226886]: 2026-01-20 14:59:44.185 226890 DEBUG nova.compute.manager [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:59:44 np0005588920 nova_compute[226886]: 2026-01-20 14:59:44.187 226890 DEBUG nova.virt.libvirt.driver [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:59:44 np0005588920 nova_compute[226886]: 2026-01-20 14:59:44.187 226890 INFO nova.virt.libvirt.driver [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] Creating image(s)#033[00m
Jan 20 09:59:44 np0005588920 nova_compute[226886]: 2026-01-20 14:59:44.209 226890 DEBUG nova.storage.rbd_utils [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] rbd image 4960a060-fded-4f60-af04-b810330687b7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:59:44 np0005588920 nova_compute[226886]: 2026-01-20 14:59:44.232 226890 DEBUG nova.storage.rbd_utils [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] rbd image 4960a060-fded-4f60-af04-b810330687b7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:59:44 np0005588920 nova_compute[226886]: 2026-01-20 14:59:44.256 226890 DEBUG nova.storage.rbd_utils [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] rbd image 4960a060-fded-4f60-af04-b810330687b7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:59:44 np0005588920 nova_compute[226886]: 2026-01-20 14:59:44.259 226890 DEBUG oslo_concurrency.processutils [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:59:44 np0005588920 nova_compute[226886]: 2026-01-20 14:59:44.320 226890 DEBUG oslo_concurrency.processutils [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:59:44 np0005588920 nova_compute[226886]: 2026-01-20 14:59:44.321 226890 DEBUG oslo_concurrency.lockutils [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:59:44 np0005588920 nova_compute[226886]: 2026-01-20 14:59:44.322 226890 DEBUG oslo_concurrency.lockutils [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:59:44 np0005588920 nova_compute[226886]: 2026-01-20 14:59:44.322 226890 DEBUG oslo_concurrency.lockutils [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:59:44 np0005588920 nova_compute[226886]: 2026-01-20 14:59:44.348 226890 DEBUG nova.storage.rbd_utils [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] rbd image 4960a060-fded-4f60-af04-b810330687b7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:59:44 np0005588920 nova_compute[226886]: 2026-01-20 14:59:44.351 226890 DEBUG oslo_concurrency.processutils [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 4960a060-fded-4f60-af04-b810330687b7_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:59:44 np0005588920 nova_compute[226886]: 2026-01-20 14:59:44.380 226890 DEBUG nova.policy [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2a9592dd9fc5492a92e3b21c894f6443', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '19d56573457a4a0ba86eaae5ca9f2e17', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:59:44 np0005588920 nova_compute[226886]: 2026-01-20 14:59:44.699 226890 DEBUG oslo_concurrency.processutils [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 4960a060-fded-4f60-af04-b810330687b7_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.348s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:59:44 np0005588920 nova_compute[226886]: 2026-01-20 14:59:44.764 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:59:44 np0005588920 nova_compute[226886]: 2026-01-20 14:59:44.770 226890 DEBUG nova.storage.rbd_utils [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] resizing rbd image 4960a060-fded-4f60-af04-b810330687b7_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:59:44 np0005588920 nova_compute[226886]: 2026-01-20 14:59:44.891 226890 DEBUG nova.objects.instance [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Lazy-loading 'migration_context' on Instance uuid 4960a060-fded-4f60-af04-b810330687b7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:59:44 np0005588920 nova_compute[226886]: 2026-01-20 14:59:44.909 226890 DEBUG nova.virt.libvirt.driver [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:59:44 np0005588920 nova_compute[226886]: 2026-01-20 14:59:44.910 226890 DEBUG nova.virt.libvirt.driver [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] Ensure instance console log exists: /var/lib/nova/instances/4960a060-fded-4f60-af04-b810330687b7/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:59:44 np0005588920 nova_compute[226886]: 2026-01-20 14:59:44.911 226890 DEBUG oslo_concurrency.lockutils [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:59:44 np0005588920 nova_compute[226886]: 2026-01-20 14:59:44.911 226890 DEBUG oslo_concurrency.lockutils [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:59:44 np0005588920 nova_compute[226886]: 2026-01-20 14:59:44.911 226890 DEBUG oslo_concurrency.lockutils [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:59:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:45.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:45.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:46 np0005588920 nova_compute[226886]: 2026-01-20 14:59:46.059 226890 DEBUG nova.network.neutron [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] Successfully created port: d6df22c3-25fa-42bf-b37a-fec8793d372e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:59:46 np0005588920 nova_compute[226886]: 2026-01-20 14:59:46.348 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:46 np0005588920 nova_compute[226886]: 2026-01-20 14:59:46.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:59:46 np0005588920 nova_compute[226886]: 2026-01-20 14:59:46.726 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:59:46 np0005588920 nova_compute[226886]: 2026-01-20 14:59:46.726 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 09:59:46 np0005588920 nova_compute[226886]: 2026-01-20 14:59:46.756 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 4960a060-fded-4f60-af04-b810330687b7] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 20 09:59:46 np0005588920 nova_compute[226886]: 2026-01-20 14:59:46.934 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "refresh_cache-a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:59:46 np0005588920 nova_compute[226886]: 2026-01-20 14:59:46.934 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquired lock "refresh_cache-a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:59:46 np0005588920 nova_compute[226886]: 2026-01-20 14:59:46.934 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 09:59:46 np0005588920 nova_compute[226886]: 2026-01-20 14:59:46.935 226890 DEBUG nova.objects.instance [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:59:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:47.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:59:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:59:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:47.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:59:48 np0005588920 nova_compute[226886]: 2026-01-20 14:59:48.546 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:48 np0005588920 nova_compute[226886]: 2026-01-20 14:59:48.609 226890 DEBUG nova.network.neutron [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] Successfully updated port: d6df22c3-25fa-42bf-b37a-fec8793d372e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:59:48 np0005588920 nova_compute[226886]: 2026-01-20 14:59:48.640 226890 DEBUG oslo_concurrency.lockutils [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Acquiring lock "refresh_cache-4960a060-fded-4f60-af04-b810330687b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:59:48 np0005588920 nova_compute[226886]: 2026-01-20 14:59:48.641 226890 DEBUG oslo_concurrency.lockutils [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Acquired lock "refresh_cache-4960a060-fded-4f60-af04-b810330687b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:59:48 np0005588920 nova_compute[226886]: 2026-01-20 14:59:48.641 226890 DEBUG nova.network.neutron [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:59:49 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:49.019 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '44'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:59:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:59:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:49.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:59:49 np0005588920 nova_compute[226886]: 2026-01-20 14:59:49.513 226890 DEBUG nova.network.neutron [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:59:49 np0005588920 nova_compute[226886]: 2026-01-20 14:59:49.579 226890 DEBUG nova.compute.manager [req-327268a6-0b92-4d8a-a187-7fc320fb9537 req-aa1ce7cc-0670-4e07-b233-03d5008e07a9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] Received event network-changed-d6df22c3-25fa-42bf-b37a-fec8793d372e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:59:49 np0005588920 nova_compute[226886]: 2026-01-20 14:59:49.580 226890 DEBUG nova.compute.manager [req-327268a6-0b92-4d8a-a187-7fc320fb9537 req-aa1ce7cc-0670-4e07-b233-03d5008e07a9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] Refreshing instance network info cache due to event network-changed-d6df22c3-25fa-42bf-b37a-fec8793d372e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:59:49 np0005588920 nova_compute[226886]: 2026-01-20 14:59:49.581 226890 DEBUG oslo_concurrency.lockutils [req-327268a6-0b92-4d8a-a187-7fc320fb9537 req-aa1ce7cc-0670-4e07-b233-03d5008e07a9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-4960a060-fded-4f60-af04-b810330687b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:59:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:59:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:49.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:59:49 np0005588920 nova_compute[226886]: 2026-01-20 14:59:49.981 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Updating instance_info_cache with network_info: [{"id": "c0ac6308-ae73-4b17-95fa-47f3df3c4f97", "address": "fa:16:3e:d8:23:95", "network": {"id": "58d966e1-4d26-414a-920e-0be2d77abb59", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1896990059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "107c1f3b5b7b413d9a389ca1166e331f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0ac6308-ae", "ovs_interfaceid": "c0ac6308-ae73-4b17-95fa-47f3df3c4f97", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:59:49 np0005588920 nova_compute[226886]: 2026-01-20 14:59:49.994 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Releasing lock "refresh_cache-a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:59:49 np0005588920 nova_compute[226886]: 2026-01-20 14:59:49.995 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 09:59:49 np0005588920 nova_compute[226886]: 2026-01-20 14:59:49.995 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:59:49 np0005588920 nova_compute[226886]: 2026-01-20 14:59:49.995 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:59:50 np0005588920 nova_compute[226886]: 2026-01-20 14:59:50.015 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:59:50 np0005588920 nova_compute[226886]: 2026-01-20 14:59:50.016 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:59:50 np0005588920 nova_compute[226886]: 2026-01-20 14:59:50.016 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:59:50 np0005588920 nova_compute[226886]: 2026-01-20 14:59:50.016 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:59:50 np0005588920 nova_compute[226886]: 2026-01-20 14:59:50.017 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:59:50 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:59:50 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2081012519' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:59:50 np0005588920 nova_compute[226886]: 2026-01-20 14:59:50.442 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.426s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:59:50 np0005588920 nova_compute[226886]: 2026-01-20 14:59:50.508 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-0000008a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:59:50 np0005588920 nova_compute[226886]: 2026-01-20 14:59:50.508 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-0000008a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:59:50 np0005588920 nova_compute[226886]: 2026-01-20 14:59:50.669 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:59:50 np0005588920 nova_compute[226886]: 2026-01-20 14:59:50.671 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4059MB free_disk=20.92159652709961GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:59:50 np0005588920 nova_compute[226886]: 2026-01-20 14:59:50.671 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:59:50 np0005588920 nova_compute[226886]: 2026-01-20 14:59:50.671 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:59:50 np0005588920 nova_compute[226886]: 2026-01-20 14:59:50.730 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:59:50 np0005588920 nova_compute[226886]: 2026-01-20 14:59:50.730 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance 4960a060-fded-4f60-af04-b810330687b7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:59:50 np0005588920 nova_compute[226886]: 2026-01-20 14:59:50.731 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:59:50 np0005588920 nova_compute[226886]: 2026-01-20 14:59:50.731 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:59:50 np0005588920 nova_compute[226886]: 2026-01-20 14:59:50.795 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:59:50 np0005588920 nova_compute[226886]: 2026-01-20 14:59:50.825 226890 DEBUG nova.network.neutron [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] Updating instance_info_cache with network_info: [{"id": "d6df22c3-25fa-42bf-b37a-fec8793d372e", "address": "fa:16:3e:e2:c2:7f", "network": {"id": "80d184b1-c656-4714-a34b-5760d08308df", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-828529844-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "19d56573457a4a0ba86eaae5ca9f2e17", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6df22c3-25", "ovs_interfaceid": "d6df22c3-25fa-42bf-b37a-fec8793d372e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:59:50 np0005588920 nova_compute[226886]: 2026-01-20 14:59:50.845 226890 DEBUG oslo_concurrency.lockutils [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Releasing lock "refresh_cache-4960a060-fded-4f60-af04-b810330687b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:59:50 np0005588920 nova_compute[226886]: 2026-01-20 14:59:50.846 226890 DEBUG nova.compute.manager [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] Instance network_info: |[{"id": "d6df22c3-25fa-42bf-b37a-fec8793d372e", "address": "fa:16:3e:e2:c2:7f", "network": {"id": "80d184b1-c656-4714-a34b-5760d08308df", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-828529844-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "19d56573457a4a0ba86eaae5ca9f2e17", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6df22c3-25", "ovs_interfaceid": "d6df22c3-25fa-42bf-b37a-fec8793d372e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:59:50 np0005588920 nova_compute[226886]: 2026-01-20 14:59:50.846 226890 DEBUG oslo_concurrency.lockutils [req-327268a6-0b92-4d8a-a187-7fc320fb9537 req-aa1ce7cc-0670-4e07-b233-03d5008e07a9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-4960a060-fded-4f60-af04-b810330687b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:59:50 np0005588920 nova_compute[226886]: 2026-01-20 14:59:50.847 226890 DEBUG nova.network.neutron [req-327268a6-0b92-4d8a-a187-7fc320fb9537 req-aa1ce7cc-0670-4e07-b233-03d5008e07a9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] Refreshing network info cache for port d6df22c3-25fa-42bf-b37a-fec8793d372e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:59:50 np0005588920 nova_compute[226886]: 2026-01-20 14:59:50.850 226890 DEBUG nova.virt.libvirt.driver [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] Start _get_guest_xml network_info=[{"id": "d6df22c3-25fa-42bf-b37a-fec8793d372e", "address": "fa:16:3e:e2:c2:7f", "network": {"id": "80d184b1-c656-4714-a34b-5760d08308df", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-828529844-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "19d56573457a4a0ba86eaae5ca9f2e17", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6df22c3-25", "ovs_interfaceid": "d6df22c3-25fa-42bf-b37a-fec8793d372e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:59:50 np0005588920 nova_compute[226886]: 2026-01-20 14:59:50.855 226890 WARNING nova.virt.libvirt.driver [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:59:50 np0005588920 nova_compute[226886]: 2026-01-20 14:59:50.858 226890 DEBUG nova.virt.libvirt.host [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:59:50 np0005588920 nova_compute[226886]: 2026-01-20 14:59:50.859 226890 DEBUG nova.virt.libvirt.host [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:59:50 np0005588920 nova_compute[226886]: 2026-01-20 14:59:50.865 226890 DEBUG nova.virt.libvirt.host [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:59:50 np0005588920 nova_compute[226886]: 2026-01-20 14:59:50.865 226890 DEBUG nova.virt.libvirt.host [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:59:50 np0005588920 nova_compute[226886]: 2026-01-20 14:59:50.866 226890 DEBUG nova.virt.libvirt.driver [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:59:50 np0005588920 nova_compute[226886]: 2026-01-20 14:59:50.867 226890 DEBUG nova.virt.hardware [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:59:50 np0005588920 nova_compute[226886]: 2026-01-20 14:59:50.867 226890 DEBUG nova.virt.hardware [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:59:50 np0005588920 nova_compute[226886]: 2026-01-20 14:59:50.867 226890 DEBUG nova.virt.hardware [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:59:50 np0005588920 nova_compute[226886]: 2026-01-20 14:59:50.867 226890 DEBUG nova.virt.hardware [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:59:50 np0005588920 nova_compute[226886]: 2026-01-20 14:59:50.868 226890 DEBUG nova.virt.hardware [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:59:50 np0005588920 nova_compute[226886]: 2026-01-20 14:59:50.868 226890 DEBUG nova.virt.hardware [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:59:50 np0005588920 nova_compute[226886]: 2026-01-20 14:59:50.868 226890 DEBUG nova.virt.hardware [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:59:50 np0005588920 nova_compute[226886]: 2026-01-20 14:59:50.868 226890 DEBUG nova.virt.hardware [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:59:50 np0005588920 nova_compute[226886]: 2026-01-20 14:59:50.868 226890 DEBUG nova.virt.hardware [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:59:50 np0005588920 nova_compute[226886]: 2026-01-20 14:59:50.869 226890 DEBUG nova.virt.hardware [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:59:50 np0005588920 nova_compute[226886]: 2026-01-20 14:59:50.869 226890 DEBUG nova.virt.hardware [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:59:50 np0005588920 nova_compute[226886]: 2026-01-20 14:59:50.872 226890 DEBUG oslo_concurrency.processutils [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:59:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:51.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:51 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:59:51 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1540140882' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:59:51 np0005588920 nova_compute[226886]: 2026-01-20 14:59:51.238 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:59:51 np0005588920 nova_compute[226886]: 2026-01-20 14:59:51.246 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:59:51 np0005588920 nova_compute[226886]: 2026-01-20 14:59:51.261 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:59:51 np0005588920 nova_compute[226886]: 2026-01-20 14:59:51.309 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:59:51 np0005588920 nova_compute[226886]: 2026-01-20 14:59:51.310 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.638s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:59:51 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:59:51 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4015297812' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:59:51 np0005588920 nova_compute[226886]: 2026-01-20 14:59:51.329 226890 DEBUG oslo_concurrency.processutils [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:59:51 np0005588920 nova_compute[226886]: 2026-01-20 14:59:51.355 226890 DEBUG nova.storage.rbd_utils [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] rbd image 4960a060-fded-4f60-af04-b810330687b7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:59:51 np0005588920 nova_compute[226886]: 2026-01-20 14:59:51.361 226890 DEBUG oslo_concurrency.processutils [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:59:51 np0005588920 nova_compute[226886]: 2026-01-20 14:59:51.386 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:51.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:51 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:59:51 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2401738277' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:59:51 np0005588920 nova_compute[226886]: 2026-01-20 14:59:51.809 226890 DEBUG oslo_concurrency.processutils [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:59:51 np0005588920 nova_compute[226886]: 2026-01-20 14:59:51.811 226890 DEBUG nova.virt.libvirt.vif [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:59:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerMetadataNegativeTestJSON-server-2065391372',display_name='tempest-ServerMetadataNegativeTestJSON-server-2065391372',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-servermetadatanegativetestjson-server-2065391372',id=145,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='19d56573457a4a0ba86eaae5ca9f2e17',ramdisk_id='',reservation_id='r-ms154a1j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerMetadataNegativeTestJSON-150423748',owner_us
er_name='tempest-ServerMetadataNegativeTestJSON-150423748-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:59:44Z,user_data=None,user_id='2a9592dd9fc5492a92e3b21c894f6443',uuid=4960a060-fded-4f60-af04-b810330687b7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d6df22c3-25fa-42bf-b37a-fec8793d372e", "address": "fa:16:3e:e2:c2:7f", "network": {"id": "80d184b1-c656-4714-a34b-5760d08308df", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-828529844-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "19d56573457a4a0ba86eaae5ca9f2e17", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6df22c3-25", "ovs_interfaceid": "d6df22c3-25fa-42bf-b37a-fec8793d372e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:59:51 np0005588920 nova_compute[226886]: 2026-01-20 14:59:51.811 226890 DEBUG nova.network.os_vif_util [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Converting VIF {"id": "d6df22c3-25fa-42bf-b37a-fec8793d372e", "address": "fa:16:3e:e2:c2:7f", "network": {"id": "80d184b1-c656-4714-a34b-5760d08308df", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-828529844-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "19d56573457a4a0ba86eaae5ca9f2e17", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6df22c3-25", "ovs_interfaceid": "d6df22c3-25fa-42bf-b37a-fec8793d372e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:59:51 np0005588920 nova_compute[226886]: 2026-01-20 14:59:51.812 226890 DEBUG nova.network.os_vif_util [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e2:c2:7f,bridge_name='br-int',has_traffic_filtering=True,id=d6df22c3-25fa-42bf-b37a-fec8793d372e,network=Network(80d184b1-c656-4714-a34b-5760d08308df),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd6df22c3-25') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:59:51 np0005588920 nova_compute[226886]: 2026-01-20 14:59:51.813 226890 DEBUG nova.objects.instance [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4960a060-fded-4f60-af04-b810330687b7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:59:51 np0005588920 nova_compute[226886]: 2026-01-20 14:59:51.839 226890 DEBUG nova.virt.libvirt.driver [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:59:51 np0005588920 nova_compute[226886]:  <uuid>4960a060-fded-4f60-af04-b810330687b7</uuid>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:  <name>instance-00000091</name>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:59:51 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:      <nova:name>tempest-ServerMetadataNegativeTestJSON-server-2065391372</nova:name>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 14:59:50</nova:creationTime>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 09:59:51 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:        <nova:user uuid="2a9592dd9fc5492a92e3b21c894f6443">tempest-ServerMetadataNegativeTestJSON-150423748-project-member</nova:user>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:        <nova:project uuid="19d56573457a4a0ba86eaae5ca9f2e17">tempest-ServerMetadataNegativeTestJSON-150423748</nova:project>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:        <nova:port uuid="d6df22c3-25fa-42bf-b37a-fec8793d372e">
Jan 20 09:59:51 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    <system>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:      <entry name="serial">4960a060-fded-4f60-af04-b810330687b7</entry>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:      <entry name="uuid">4960a060-fded-4f60-af04-b810330687b7</entry>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    </system>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:  <os>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:  </os>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:  <features>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:  </features>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:  </clock>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:  <devices>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 09:59:51 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/4960a060-fded-4f60-af04-b810330687b7_disk">
Jan 20 09:59:51 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:59:51 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 09:59:51 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/4960a060-fded-4f60-af04-b810330687b7_disk.config">
Jan 20 09:59:51 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:      </source>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 09:59:51 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:      </auth>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    </disk>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 09:59:51 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:e2:c2:7f"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:      <target dev="tapd6df22c3-25"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    </interface>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 09:59:51 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/4960a060-fded-4f60-af04-b810330687b7/console.log" append="off"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    </serial>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    <video>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    </video>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 09:59:51 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    </rng>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 09:59:51 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 09:59:51 np0005588920 nova_compute[226886]:  </devices>
Jan 20 09:59:51 np0005588920 nova_compute[226886]: </domain>
Jan 20 09:59:51 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:59:51 np0005588920 nova_compute[226886]: 2026-01-20 14:59:51.841 226890 DEBUG nova.compute.manager [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] Preparing to wait for external event network-vif-plugged-d6df22c3-25fa-42bf-b37a-fec8793d372e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:59:51 np0005588920 nova_compute[226886]: 2026-01-20 14:59:51.841 226890 DEBUG oslo_concurrency.lockutils [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Acquiring lock "4960a060-fded-4f60-af04-b810330687b7-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:59:51 np0005588920 nova_compute[226886]: 2026-01-20 14:59:51.841 226890 DEBUG oslo_concurrency.lockutils [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Lock "4960a060-fded-4f60-af04-b810330687b7-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:59:51 np0005588920 nova_compute[226886]: 2026-01-20 14:59:51.841 226890 DEBUG oslo_concurrency.lockutils [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Lock "4960a060-fded-4f60-af04-b810330687b7-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:59:51 np0005588920 nova_compute[226886]: 2026-01-20 14:59:51.842 226890 DEBUG nova.virt.libvirt.vif [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:59:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerMetadataNegativeTestJSON-server-2065391372',display_name='tempest-ServerMetadataNegativeTestJSON-server-2065391372',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-servermetadatanegativetestjson-server-2065391372',id=145,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='19d56573457a4a0ba86eaae5ca9f2e17',ramdisk_id='',reservation_id='r-ms154a1j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerMetadataNegativeTestJSON-150423748
',owner_user_name='tempest-ServerMetadataNegativeTestJSON-150423748-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:59:44Z,user_data=None,user_id='2a9592dd9fc5492a92e3b21c894f6443',uuid=4960a060-fded-4f60-af04-b810330687b7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d6df22c3-25fa-42bf-b37a-fec8793d372e", "address": "fa:16:3e:e2:c2:7f", "network": {"id": "80d184b1-c656-4714-a34b-5760d08308df", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-828529844-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "19d56573457a4a0ba86eaae5ca9f2e17", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6df22c3-25", "ovs_interfaceid": "d6df22c3-25fa-42bf-b37a-fec8793d372e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:59:51 np0005588920 nova_compute[226886]: 2026-01-20 14:59:51.842 226890 DEBUG nova.network.os_vif_util [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Converting VIF {"id": "d6df22c3-25fa-42bf-b37a-fec8793d372e", "address": "fa:16:3e:e2:c2:7f", "network": {"id": "80d184b1-c656-4714-a34b-5760d08308df", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-828529844-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "19d56573457a4a0ba86eaae5ca9f2e17", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6df22c3-25", "ovs_interfaceid": "d6df22c3-25fa-42bf-b37a-fec8793d372e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:59:51 np0005588920 nova_compute[226886]: 2026-01-20 14:59:51.843 226890 DEBUG nova.network.os_vif_util [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e2:c2:7f,bridge_name='br-int',has_traffic_filtering=True,id=d6df22c3-25fa-42bf-b37a-fec8793d372e,network=Network(80d184b1-c656-4714-a34b-5760d08308df),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd6df22c3-25') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:59:51 np0005588920 nova_compute[226886]: 2026-01-20 14:59:51.843 226890 DEBUG os_vif [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e2:c2:7f,bridge_name='br-int',has_traffic_filtering=True,id=d6df22c3-25fa-42bf-b37a-fec8793d372e,network=Network(80d184b1-c656-4714-a34b-5760d08308df),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd6df22c3-25') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:59:51 np0005588920 nova_compute[226886]: 2026-01-20 14:59:51.844 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:51 np0005588920 nova_compute[226886]: 2026-01-20 14:59:51.844 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:59:51 np0005588920 nova_compute[226886]: 2026-01-20 14:59:51.844 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:59:51 np0005588920 nova_compute[226886]: 2026-01-20 14:59:51.847 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:51 np0005588920 nova_compute[226886]: 2026-01-20 14:59:51.847 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd6df22c3-25, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:59:51 np0005588920 nova_compute[226886]: 2026-01-20 14:59:51.847 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd6df22c3-25, col_values=(('external_ids', {'iface-id': 'd6df22c3-25fa-42bf-b37a-fec8793d372e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e2:c2:7f', 'vm-uuid': '4960a060-fded-4f60-af04-b810330687b7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:59:51 np0005588920 nova_compute[226886]: 2026-01-20 14:59:51.848 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:51 np0005588920 NetworkManager[49076]: <info>  [1768921191.8495] manager: (tapd6df22c3-25): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/316)
Jan 20 09:59:51 np0005588920 nova_compute[226886]: 2026-01-20 14:59:51.851 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:59:51 np0005588920 nova_compute[226886]: 2026-01-20 14:59:51.855 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:51 np0005588920 nova_compute[226886]: 2026-01-20 14:59:51.856 226890 INFO os_vif [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e2:c2:7f,bridge_name='br-int',has_traffic_filtering=True,id=d6df22c3-25fa-42bf-b37a-fec8793d372e,network=Network(80d184b1-c656-4714-a34b-5760d08308df),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd6df22c3-25')#033[00m
Jan 20 09:59:51 np0005588920 nova_compute[226886]: 2026-01-20 14:59:51.859 226890 DEBUG nova.compute.manager [req-2fed5cae-6871-4f0d-af83-6ddf752eb838 req-9e99cac0-32e3-4f4f-9427-f77271b25672 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Received event network-changed-c0ac6308-ae73-4b17-95fa-47f3df3c4f97 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:59:51 np0005588920 nova_compute[226886]: 2026-01-20 14:59:51.859 226890 DEBUG nova.compute.manager [req-2fed5cae-6871-4f0d-af83-6ddf752eb838 req-9e99cac0-32e3-4f4f-9427-f77271b25672 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Refreshing instance network info cache due to event network-changed-c0ac6308-ae73-4b17-95fa-47f3df3c4f97. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:59:51 np0005588920 nova_compute[226886]: 2026-01-20 14:59:51.860 226890 DEBUG oslo_concurrency.lockutils [req-2fed5cae-6871-4f0d-af83-6ddf752eb838 req-9e99cac0-32e3-4f4f-9427-f77271b25672 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:59:51 np0005588920 nova_compute[226886]: 2026-01-20 14:59:51.860 226890 DEBUG oslo_concurrency.lockutils [req-2fed5cae-6871-4f0d-af83-6ddf752eb838 req-9e99cac0-32e3-4f4f-9427-f77271b25672 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:59:51 np0005588920 nova_compute[226886]: 2026-01-20 14:59:51.860 226890 DEBUG nova.network.neutron [req-2fed5cae-6871-4f0d-af83-6ddf752eb838 req-9e99cac0-32e3-4f4f-9427-f77271b25672 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Refreshing network info cache for port c0ac6308-ae73-4b17-95fa-47f3df3c4f97 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:59:51 np0005588920 nova_compute[226886]: 2026-01-20 14:59:51.914 226890 DEBUG nova.virt.libvirt.driver [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:59:51 np0005588920 nova_compute[226886]: 2026-01-20 14:59:51.914 226890 DEBUG nova.virt.libvirt.driver [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:59:51 np0005588920 nova_compute[226886]: 2026-01-20 14:59:51.915 226890 DEBUG nova.virt.libvirt.driver [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] No VIF found with MAC fa:16:3e:e2:c2:7f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:59:51 np0005588920 nova_compute[226886]: 2026-01-20 14:59:51.915 226890 INFO nova.virt.libvirt.driver [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] Using config drive#033[00m
Jan 20 09:59:51 np0005588920 nova_compute[226886]: 2026-01-20 14:59:51.939 226890 DEBUG nova.storage.rbd_utils [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] rbd image 4960a060-fded-4f60-af04-b810330687b7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:59:52 np0005588920 nova_compute[226886]: 2026-01-20 14:59:52.040 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:59:52 np0005588920 nova_compute[226886]: 2026-01-20 14:59:52.040 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:59:52 np0005588920 nova_compute[226886]: 2026-01-20 14:59:52.041 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:59:52 np0005588920 nova_compute[226886]: 2026-01-20 14:59:52.041 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:59:52 np0005588920 nova_compute[226886]: 2026-01-20 14:59:52.041 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:59:52 np0005588920 nova_compute[226886]: 2026-01-20 14:59:52.166 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:52 np0005588920 nova_compute[226886]: 2026-01-20 14:59:52.298 226890 INFO nova.virt.libvirt.driver [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] Creating config drive at /var/lib/nova/instances/4960a060-fded-4f60-af04-b810330687b7/disk.config#033[00m
Jan 20 09:59:52 np0005588920 nova_compute[226886]: 2026-01-20 14:59:52.302 226890 DEBUG oslo_concurrency.processutils [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4960a060-fded-4f60-af04-b810330687b7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6mntj8ud execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:59:52 np0005588920 nova_compute[226886]: 2026-01-20 14:59:52.435 226890 DEBUG oslo_concurrency.processutils [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4960a060-fded-4f60-af04-b810330687b7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6mntj8ud" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:59:52 np0005588920 nova_compute[226886]: 2026-01-20 14:59:52.462 226890 DEBUG nova.storage.rbd_utils [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] rbd image 4960a060-fded-4f60-af04-b810330687b7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:59:52 np0005588920 nova_compute[226886]: 2026-01-20 14:59:52.465 226890 DEBUG oslo_concurrency.processutils [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/4960a060-fded-4f60-af04-b810330687b7/disk.config 4960a060-fded-4f60-af04-b810330687b7_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:59:52 np0005588920 nova_compute[226886]: 2026-01-20 14:59:52.634 226890 DEBUG oslo_concurrency.processutils [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/4960a060-fded-4f60-af04-b810330687b7/disk.config 4960a060-fded-4f60-af04-b810330687b7_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.168s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:59:52 np0005588920 nova_compute[226886]: 2026-01-20 14:59:52.635 226890 INFO nova.virt.libvirt.driver [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] Deleting local config drive /var/lib/nova/instances/4960a060-fded-4f60-af04-b810330687b7/disk.config because it was imported into RBD.#033[00m
Jan 20 09:59:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:59:52 np0005588920 kernel: tapd6df22c3-25: entered promiscuous mode
Jan 20 09:59:52 np0005588920 ovn_controller[133971]: 2026-01-20T14:59:52Z|00651|binding|INFO|Claiming lport d6df22c3-25fa-42bf-b37a-fec8793d372e for this chassis.
Jan 20 09:59:52 np0005588920 ovn_controller[133971]: 2026-01-20T14:59:52Z|00652|binding|INFO|d6df22c3-25fa-42bf-b37a-fec8793d372e: Claiming fa:16:3e:e2:c2:7f 10.100.0.13
Jan 20 09:59:52 np0005588920 NetworkManager[49076]: <info>  [1768921192.6727] manager: (tapd6df22c3-25): new Tun device (/org/freedesktop/NetworkManager/Devices/317)
Jan 20 09:59:52 np0005588920 nova_compute[226886]: 2026-01-20 14:59:52.673 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:52 np0005588920 ovn_controller[133971]: 2026-01-20T14:59:52Z|00653|binding|INFO|Setting lport d6df22c3-25fa-42bf-b37a-fec8793d372e ovn-installed in OVS
Jan 20 09:59:52 np0005588920 nova_compute[226886]: 2026-01-20 14:59:52.688 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:52 np0005588920 ovn_controller[133971]: 2026-01-20T14:59:52Z|00654|binding|INFO|Setting lport d6df22c3-25fa-42bf-b37a-fec8793d372e up in Southbound
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:52.691 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e2:c2:7f 10.100.0.13'], port_security=['fa:16:3e:e2:c2:7f 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '4960a060-fded-4f60-af04-b810330687b7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-80d184b1-c656-4714-a34b-5760d08308df', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '19d56573457a4a0ba86eaae5ca9f2e17', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ec26b1b3-5cbb-4ff5-9c21-5a0a7d34abba', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4ab706a4-0e9d-4e28-b102-4d921960880a, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=d6df22c3-25fa-42bf-b37a-fec8793d372e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:59:52 np0005588920 nova_compute[226886]: 2026-01-20 14:59:52.692 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:52.692 144128 INFO neutron.agent.ovn.metadata.agent [-] Port d6df22c3-25fa-42bf-b37a-fec8793d372e in datapath 80d184b1-c656-4714-a34b-5760d08308df bound to our chassis#033[00m
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:52.693 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 80d184b1-c656-4714-a34b-5760d08308df#033[00m
Jan 20 09:59:52 np0005588920 systemd-udevd[277304]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:52.705 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0cfb52ce-3e00-48a4-b5ed-9a0f9bad5b17]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:52.706 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap80d184b1-c1 in ovnmeta-80d184b1-c656-4714-a34b-5760d08308df namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:59:52 np0005588920 systemd-machined[196121]: New machine qemu-65-instance-00000091.
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:52.708 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap80d184b1-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:52.708 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ac8760b6-7f31-4ce5-94cf-2cab386d4abc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:52.708 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[376efbc8-c885-4fe7-8181-cf121dda7955]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:52 np0005588920 NetworkManager[49076]: <info>  [1768921192.7109] device (tapd6df22c3-25): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:59:52 np0005588920 NetworkManager[49076]: <info>  [1768921192.7114] device (tapd6df22c3-25): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:52.719 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[8b1628ac-4abb-4636-958f-297e8204606a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:52 np0005588920 systemd[1]: Started Virtual Machine qemu-65-instance-00000091.
Jan 20 09:59:52 np0005588920 nova_compute[226886]: 2026-01-20 14:59:52.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:52.742 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[46cb68d2-35b8-4f7e-af7c-a11f4fb0b952]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:52.765 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[e127a767-4f25-40d0-ba62-c008606cf009]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:52.770 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c7a4c7b8-74b4-4288-9daa-66965182019c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:52 np0005588920 NetworkManager[49076]: <info>  [1768921192.7714] manager: (tap80d184b1-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/318)
Jan 20 09:59:52 np0005588920 systemd-udevd[277309]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:52.800 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[79703fe7-ea63-4285-9063-df2262c7c6a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:52.802 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[756b4b0b-5e21-4bdc-b26d-b4a17bbb96f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:52 np0005588920 NetworkManager[49076]: <info>  [1768921192.8212] device (tap80d184b1-c0): carrier: link connected
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:52.826 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[84699815-bb59-4fd3-a158-d6aa5c6bc18e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:52.840 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ed387445-1b5b-44ac-95c0-0bc2314bbd38]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap80d184b1-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1f:c9:ca'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 206], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 623450, 'reachable_time': 39684, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 277339, 'error': None, 'target': 'ovnmeta-80d184b1-c656-4714-a34b-5760d08308df', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:52.853 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5d8bf211-6773-4b66-9f28-d443251d0598]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1f:c9ca'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 623450, 'tstamp': 623450}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 277340, 'error': None, 'target': 'ovnmeta-80d184b1-c656-4714-a34b-5760d08308df', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:52.868 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[02a43ae0-6652-4c33-a846-f130f6309e26]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap80d184b1-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1f:c9:ca'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 206], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 623450, 'reachable_time': 39684, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 277341, 'error': None, 'target': 'ovnmeta-80d184b1-c656-4714-a34b-5760d08308df', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:52.896 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[80ae3677-44f7-4235-92a3-38e2e3c083e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:52.955 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[8370b323-d8d9-417c-831b-9050cc693cb6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:52.957 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap80d184b1-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:52.957 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:52.958 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap80d184b1-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:59:52 np0005588920 nova_compute[226886]: 2026-01-20 14:59:52.959 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:52 np0005588920 NetworkManager[49076]: <info>  [1768921192.9602] manager: (tap80d184b1-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/319)
Jan 20 09:59:52 np0005588920 kernel: tap80d184b1-c0: entered promiscuous mode
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:52.961 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap80d184b1-c0, col_values=(('external_ids', {'iface-id': '5d679212-2351-4b11-b72d-42281796753a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:59:52 np0005588920 nova_compute[226886]: 2026-01-20 14:59:52.963 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:52 np0005588920 ovn_controller[133971]: 2026-01-20T14:59:52Z|00655|binding|INFO|Releasing lport 5d679212-2351-4b11-b72d-42281796753a from this chassis (sb_readonly=0)
Jan 20 09:59:52 np0005588920 nova_compute[226886]: 2026-01-20 14:59:52.980 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:52.981 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/80d184b1-c656-4714-a34b-5760d08308df.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/80d184b1-c656-4714-a34b-5760d08308df.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:52.982 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d2494854-fe9f-4a77-9d5a-5e27d90976d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:52.982 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-80d184b1-c656-4714-a34b-5760d08308df
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/80d184b1-c656-4714-a34b-5760d08308df.pid.haproxy
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID 80d184b1-c656-4714-a34b-5760d08308df
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:59:52 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:52.984 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-80d184b1-c656-4714-a34b-5760d08308df', 'env', 'PROCESS_TAG=haproxy-80d184b1-c656-4714-a34b-5760d08308df', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/80d184b1-c656-4714-a34b-5760d08308df.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:59:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:53.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:53 np0005588920 nova_compute[226886]: 2026-01-20 14:59:53.110 226890 DEBUG nova.network.neutron [req-327268a6-0b92-4d8a-a187-7fc320fb9537 req-aa1ce7cc-0670-4e07-b233-03d5008e07a9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] Updated VIF entry in instance network info cache for port d6df22c3-25fa-42bf-b37a-fec8793d372e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:59:53 np0005588920 nova_compute[226886]: 2026-01-20 14:59:53.110 226890 DEBUG nova.network.neutron [req-327268a6-0b92-4d8a-a187-7fc320fb9537 req-aa1ce7cc-0670-4e07-b233-03d5008e07a9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] Updating instance_info_cache with network_info: [{"id": "d6df22c3-25fa-42bf-b37a-fec8793d372e", "address": "fa:16:3e:e2:c2:7f", "network": {"id": "80d184b1-c656-4714-a34b-5760d08308df", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-828529844-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "19d56573457a4a0ba86eaae5ca9f2e17", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6df22c3-25", "ovs_interfaceid": "d6df22c3-25fa-42bf-b37a-fec8793d372e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:59:53 np0005588920 nova_compute[226886]: 2026-01-20 14:59:53.131 226890 DEBUG oslo_concurrency.lockutils [req-327268a6-0b92-4d8a-a187-7fc320fb9537 req-aa1ce7cc-0670-4e07-b233-03d5008e07a9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-4960a060-fded-4f60-af04-b810330687b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:59:53 np0005588920 nova_compute[226886]: 2026-01-20 14:59:53.321 226890 DEBUG nova.compute.manager [req-a6488b07-8dc4-4831-9301-64fee7aec341 req-ac9d71bc-3a30-4624-990f-934d99e1b831 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] Received event network-vif-plugged-d6df22c3-25fa-42bf-b37a-fec8793d372e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:59:53 np0005588920 nova_compute[226886]: 2026-01-20 14:59:53.322 226890 DEBUG oslo_concurrency.lockutils [req-a6488b07-8dc4-4831-9301-64fee7aec341 req-ac9d71bc-3a30-4624-990f-934d99e1b831 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "4960a060-fded-4f60-af04-b810330687b7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:59:53 np0005588920 nova_compute[226886]: 2026-01-20 14:59:53.322 226890 DEBUG oslo_concurrency.lockutils [req-a6488b07-8dc4-4831-9301-64fee7aec341 req-ac9d71bc-3a30-4624-990f-934d99e1b831 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "4960a060-fded-4f60-af04-b810330687b7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:59:53 np0005588920 nova_compute[226886]: 2026-01-20 14:59:53.323 226890 DEBUG oslo_concurrency.lockutils [req-a6488b07-8dc4-4831-9301-64fee7aec341 req-ac9d71bc-3a30-4624-990f-934d99e1b831 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "4960a060-fded-4f60-af04-b810330687b7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:59:53 np0005588920 nova_compute[226886]: 2026-01-20 14:59:53.323 226890 DEBUG nova.compute.manager [req-a6488b07-8dc4-4831-9301-64fee7aec341 req-ac9d71bc-3a30-4624-990f-934d99e1b831 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] Processing event network-vif-plugged-d6df22c3-25fa-42bf-b37a-fec8793d372e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:59:53 np0005588920 podman[277389]: 2026-01-20 14:59:53.370770249 +0000 UTC m=+0.052472079 container create 7d77636f6f5bb6bf315fedfb4bcb374ecdae0b863c1127f66f0a3d5831bb0cc7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-80d184b1-c656-4714-a34b-5760d08308df, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 20 09:59:53 np0005588920 systemd[1]: Started libpod-conmon-7d77636f6f5bb6bf315fedfb4bcb374ecdae0b863c1127f66f0a3d5831bb0cc7.scope.
Jan 20 09:59:53 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:59:53 np0005588920 podman[277389]: 2026-01-20 14:59:53.341392391 +0000 UTC m=+0.023094261 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:59:53 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56759175f4209a3a6b8fc6d9b4818a453a16e28cd1e23bba9095b9a296546b04/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:59:53 np0005588920 podman[277389]: 2026-01-20 14:59:53.455573837 +0000 UTC m=+0.137275677 container init 7d77636f6f5bb6bf315fedfb4bcb374ecdae0b863c1127f66f0a3d5831bb0cc7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-80d184b1-c656-4714-a34b-5760d08308df, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 20 09:59:53 np0005588920 nova_compute[226886]: 2026-01-20 14:59:53.455 226890 DEBUG nova.compute.manager [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:59:53 np0005588920 nova_compute[226886]: 2026-01-20 14:59:53.456 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921193.455363, 4960a060-fded-4f60-af04-b810330687b7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:59:53 np0005588920 nova_compute[226886]: 2026-01-20 14:59:53.456 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 4960a060-fded-4f60-af04-b810330687b7] VM Started (Lifecycle Event)#033[00m
Jan 20 09:59:53 np0005588920 nova_compute[226886]: 2026-01-20 14:59:53.459 226890 DEBUG nova.virt.libvirt.driver [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:59:53 np0005588920 nova_compute[226886]: 2026-01-20 14:59:53.463 226890 INFO nova.virt.libvirt.driver [-] [instance: 4960a060-fded-4f60-af04-b810330687b7] Instance spawned successfully.#033[00m
Jan 20 09:59:53 np0005588920 nova_compute[226886]: 2026-01-20 14:59:53.463 226890 DEBUG nova.virt.libvirt.driver [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:59:53 np0005588920 podman[277389]: 2026-01-20 14:59:53.464341544 +0000 UTC m=+0.146043384 container start 7d77636f6f5bb6bf315fedfb4bcb374ecdae0b863c1127f66f0a3d5831bb0cc7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-80d184b1-c656-4714-a34b-5760d08308df, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 20 09:59:53 np0005588920 nova_compute[226886]: 2026-01-20 14:59:53.486 226890 DEBUG nova.network.neutron [req-2fed5cae-6871-4f0d-af83-6ddf752eb838 req-9e99cac0-32e3-4f4f-9427-f77271b25672 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Updated VIF entry in instance network info cache for port c0ac6308-ae73-4b17-95fa-47f3df3c4f97. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:59:53 np0005588920 nova_compute[226886]: 2026-01-20 14:59:53.486 226890 DEBUG nova.network.neutron [req-2fed5cae-6871-4f0d-af83-6ddf752eb838 req-9e99cac0-32e3-4f4f-9427-f77271b25672 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Updating instance_info_cache with network_info: [{"id": "c0ac6308-ae73-4b17-95fa-47f3df3c4f97", "address": "fa:16:3e:d8:23:95", "network": {"id": "58d966e1-4d26-414a-920e-0be2d77abb59", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1896990059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "107c1f3b5b7b413d9a389ca1166e331f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0ac6308-ae", "ovs_interfaceid": "c0ac6308-ae73-4b17-95fa-47f3df3c4f97", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:59:53 np0005588920 neutron-haproxy-ovnmeta-80d184b1-c656-4714-a34b-5760d08308df[277430]: [NOTICE]   (277458) : New worker (277465) forked
Jan 20 09:59:53 np0005588920 neutron-haproxy-ovnmeta-80d184b1-c656-4714-a34b-5760d08308df[277430]: [NOTICE]   (277458) : Loading success.
Jan 20 09:59:53 np0005588920 nova_compute[226886]: 2026-01-20 14:59:53.552 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 4960a060-fded-4f60-af04-b810330687b7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:59:53 np0005588920 nova_compute[226886]: 2026-01-20 14:59:53.553 226890 DEBUG oslo_concurrency.lockutils [req-2fed5cae-6871-4f0d-af83-6ddf752eb838 req-9e99cac0-32e3-4f4f-9427-f77271b25672 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:59:53 np0005588920 nova_compute[226886]: 2026-01-20 14:59:53.557 226890 DEBUG nova.virt.libvirt.driver [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:59:53 np0005588920 nova_compute[226886]: 2026-01-20 14:59:53.557 226890 DEBUG nova.virt.libvirt.driver [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:59:53 np0005588920 nova_compute[226886]: 2026-01-20 14:59:53.558 226890 DEBUG nova.virt.libvirt.driver [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:59:53 np0005588920 nova_compute[226886]: 2026-01-20 14:59:53.558 226890 DEBUG nova.virt.libvirt.driver [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:59:53 np0005588920 nova_compute[226886]: 2026-01-20 14:59:53.558 226890 DEBUG nova.virt.libvirt.driver [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:59:53 np0005588920 nova_compute[226886]: 2026-01-20 14:59:53.559 226890 DEBUG nova.virt.libvirt.driver [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:59:53 np0005588920 nova_compute[226886]: 2026-01-20 14:59:53.563 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 4960a060-fded-4f60-af04-b810330687b7] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:59:53 np0005588920 nova_compute[226886]: 2026-01-20 14:59:53.648 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 4960a060-fded-4f60-af04-b810330687b7] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:59:53 np0005588920 nova_compute[226886]: 2026-01-20 14:59:53.648 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921193.4563768, 4960a060-fded-4f60-af04-b810330687b7 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:59:53 np0005588920 nova_compute[226886]: 2026-01-20 14:59:53.649 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 4960a060-fded-4f60-af04-b810330687b7] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:59:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:59:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:53.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:59:53 np0005588920 nova_compute[226886]: 2026-01-20 14:59:53.675 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 4960a060-fded-4f60-af04-b810330687b7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:59:53 np0005588920 nova_compute[226886]: 2026-01-20 14:59:53.678 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921193.4585521, 4960a060-fded-4f60-af04-b810330687b7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:59:53 np0005588920 nova_compute[226886]: 2026-01-20 14:59:53.678 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 4960a060-fded-4f60-af04-b810330687b7] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:59:53 np0005588920 nova_compute[226886]: 2026-01-20 14:59:53.685 226890 INFO nova.compute.manager [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] Took 9.50 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:59:53 np0005588920 nova_compute[226886]: 2026-01-20 14:59:53.686 226890 DEBUG nova.compute.manager [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:59:53 np0005588920 nova_compute[226886]: 2026-01-20 14:59:53.693 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 4960a060-fded-4f60-af04-b810330687b7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:59:53 np0005588920 nova_compute[226886]: 2026-01-20 14:59:53.696 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 4960a060-fded-4f60-af04-b810330687b7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:59:53 np0005588920 nova_compute[226886]: 2026-01-20 14:59:53.717 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 4960a060-fded-4f60-af04-b810330687b7] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:59:53 np0005588920 nova_compute[226886]: 2026-01-20 14:59:53.743 226890 INFO nova.compute.manager [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] Took 10.46 seconds to build instance.#033[00m
Jan 20 09:59:53 np0005588920 nova_compute[226886]: 2026-01-20 14:59:53.760 226890 DEBUG oslo_concurrency.lockutils [None req-4d689e08-c366-43b8-a5aa-b3e4467449e1 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Lock "4960a060-fded-4f60-af04-b810330687b7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.602s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:59:54 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:59:54 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:59:54 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:59:54 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:59:54 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 20 09:59:54 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 20 09:59:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:55.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:55 np0005588920 podman[277838]: 2026-01-20 14:59:55.223301561 +0000 UTC m=+0.038621359 container create 5c65045d9909edf27e7a9231c9ecbd0763b66b8042dece313b146ea74103e63d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_heyrovsky, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_REF=reef, ceph=True, OSD_FLAVOR=default)
Jan 20 09:59:55 np0005588920 systemd[1]: Started libpod-conmon-5c65045d9909edf27e7a9231c9ecbd0763b66b8042dece313b146ea74103e63d.scope.
Jan 20 09:59:55 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:59:55 np0005588920 podman[277838]: 2026-01-20 14:59:55.20587698 +0000 UTC m=+0.021196798 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 09:59:55 np0005588920 podman[277838]: 2026-01-20 14:59:55.333814425 +0000 UTC m=+0.149134233 container init 5c65045d9909edf27e7a9231c9ecbd0763b66b8042dece313b146ea74103e63d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_heyrovsky, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507)
Jan 20 09:59:55 np0005588920 podman[277838]: 2026-01-20 14:59:55.347595803 +0000 UTC m=+0.162915601 container start 5c65045d9909edf27e7a9231c9ecbd0763b66b8042dece313b146ea74103e63d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_heyrovsky, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Jan 20 09:59:55 np0005588920 podman[277838]: 2026-01-20 14:59:55.351213345 +0000 UTC m=+0.166533143 container attach 5c65045d9909edf27e7a9231c9ecbd0763b66b8042dece313b146ea74103e63d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_heyrovsky, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 20 09:59:55 np0005588920 suspicious_heyrovsky[277855]: 167 167
Jan 20 09:59:55 np0005588920 systemd[1]: libpod-5c65045d9909edf27e7a9231c9ecbd0763b66b8042dece313b146ea74103e63d.scope: Deactivated successfully.
Jan 20 09:59:55 np0005588920 podman[277860]: 2026-01-20 14:59:55.39433087 +0000 UTC m=+0.026630612 container died 5c65045d9909edf27e7a9231c9ecbd0763b66b8042dece313b146ea74103e63d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_heyrovsky, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Jan 20 09:59:55 np0005588920 systemd[1]: var-lib-containers-storage-overlay-fb8d7849bb64f36be3abe8e1d70c64fdacad75727584416174abe45ccb457027-merged.mount: Deactivated successfully.
Jan 20 09:59:55 np0005588920 podman[277860]: 2026-01-20 14:59:55.434533282 +0000 UTC m=+0.066833014 container remove 5c65045d9909edf27e7a9231c9ecbd0763b66b8042dece313b146ea74103e63d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_heyrovsky, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 20 09:59:55 np0005588920 systemd[1]: libpod-conmon-5c65045d9909edf27e7a9231c9ecbd0763b66b8042dece313b146ea74103e63d.scope: Deactivated successfully.
Jan 20 09:59:55 np0005588920 podman[277882]: 2026-01-20 14:59:55.61621444 +0000 UTC m=+0.044613798 container create 1ef39a1a5f306184dec7b7ad1fed0cbff30c583f279d33db144e61226156df74 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_thompson, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Jan 20 09:59:55 np0005588920 systemd[1]: Started libpod-conmon-1ef39a1a5f306184dec7b7ad1fed0cbff30c583f279d33db144e61226156df74.scope.
Jan 20 09:59:55 np0005588920 nova_compute[226886]: 2026-01-20 14:59:55.659 226890 DEBUG nova.compute.manager [req-ec799af3-e83c-44b9-a109-65ed562c71e3 req-66c103d4-34c3-4acc-9db8-5e84bc6c3911 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] Received event network-vif-plugged-d6df22c3-25fa-42bf-b37a-fec8793d372e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:59:55 np0005588920 nova_compute[226886]: 2026-01-20 14:59:55.661 226890 DEBUG oslo_concurrency.lockutils [req-ec799af3-e83c-44b9-a109-65ed562c71e3 req-66c103d4-34c3-4acc-9db8-5e84bc6c3911 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "4960a060-fded-4f60-af04-b810330687b7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:59:55 np0005588920 nova_compute[226886]: 2026-01-20 14:59:55.662 226890 DEBUG oslo_concurrency.lockutils [req-ec799af3-e83c-44b9-a109-65ed562c71e3 req-66c103d4-34c3-4acc-9db8-5e84bc6c3911 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "4960a060-fded-4f60-af04-b810330687b7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:59:55 np0005588920 nova_compute[226886]: 2026-01-20 14:59:55.662 226890 DEBUG oslo_concurrency.lockutils [req-ec799af3-e83c-44b9-a109-65ed562c71e3 req-66c103d4-34c3-4acc-9db8-5e84bc6c3911 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "4960a060-fded-4f60-af04-b810330687b7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:59:55 np0005588920 nova_compute[226886]: 2026-01-20 14:59:55.662 226890 DEBUG nova.compute.manager [req-ec799af3-e83c-44b9-a109-65ed562c71e3 req-66c103d4-34c3-4acc-9db8-5e84bc6c3911 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] No waiting events found dispatching network-vif-plugged-d6df22c3-25fa-42bf-b37a-fec8793d372e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:59:55 np0005588920 nova_compute[226886]: 2026-01-20 14:59:55.662 226890 WARNING nova.compute.manager [req-ec799af3-e83c-44b9-a109-65ed562c71e3 req-66c103d4-34c3-4acc-9db8-5e84bc6c3911 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] Received unexpected event network-vif-plugged-d6df22c3-25fa-42bf-b37a-fec8793d372e for instance with vm_state active and task_state None.#033[00m
Jan 20 09:59:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:59:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:55.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:59:55 np0005588920 systemd[1]: Started libcrun container.
Jan 20 09:59:55 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/310dbbd3b63c2c38d1a2374373d7c024a1613eb434e520242c2f43e01466c9db/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 20 09:59:55 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/310dbbd3b63c2c38d1a2374373d7c024a1613eb434e520242c2f43e01466c9db/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 20 09:59:55 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/310dbbd3b63c2c38d1a2374373d7c024a1613eb434e520242c2f43e01466c9db/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 20 09:59:55 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/310dbbd3b63c2c38d1a2374373d7c024a1613eb434e520242c2f43e01466c9db/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 20 09:59:55 np0005588920 podman[277882]: 2026-01-20 14:59:55.59527564 +0000 UTC m=+0.023675018 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 09:59:55 np0005588920 podman[277882]: 2026-01-20 14:59:55.701763849 +0000 UTC m=+0.130163227 container init 1ef39a1a5f306184dec7b7ad1fed0cbff30c583f279d33db144e61226156df74 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_thompson, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 20 09:59:55 np0005588920 podman[277882]: 2026-01-20 14:59:55.707683756 +0000 UTC m=+0.136083114 container start 1ef39a1a5f306184dec7b7ad1fed0cbff30c583f279d33db144e61226156df74 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_thompson, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 20 09:59:55 np0005588920 podman[277882]: 2026-01-20 14:59:55.710453574 +0000 UTC m=+0.138852932 container attach 1ef39a1a5f306184dec7b7ad1fed0cbff30c583f279d33db144e61226156df74 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_thompson, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, OSD_FLAVOR=default)
Jan 20 09:59:56 np0005588920 nova_compute[226886]: 2026-01-20 14:59:56.354 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:56 np0005588920 nova_compute[226886]: 2026-01-20 14:59:56.849 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:56 np0005588920 boring_thompson[277899]: [
Jan 20 09:59:56 np0005588920 boring_thompson[277899]:    {
Jan 20 09:59:56 np0005588920 boring_thompson[277899]:        "available": false,
Jan 20 09:59:56 np0005588920 boring_thompson[277899]:        "ceph_device": false,
Jan 20 09:59:56 np0005588920 boring_thompson[277899]:        "device_id": "QEMU_DVD-ROM_QM00001",
Jan 20 09:59:56 np0005588920 boring_thompson[277899]:        "lsm_data": {},
Jan 20 09:59:56 np0005588920 boring_thompson[277899]:        "lvs": [],
Jan 20 09:59:56 np0005588920 boring_thompson[277899]:        "path": "/dev/sr0",
Jan 20 09:59:56 np0005588920 boring_thompson[277899]:        "rejected_reasons": [
Jan 20 09:59:56 np0005588920 boring_thompson[277899]:            "Has a FileSystem",
Jan 20 09:59:56 np0005588920 boring_thompson[277899]:            "Insufficient space (<5GB)"
Jan 20 09:59:56 np0005588920 boring_thompson[277899]:        ],
Jan 20 09:59:56 np0005588920 boring_thompson[277899]:        "sys_api": {
Jan 20 09:59:56 np0005588920 boring_thompson[277899]:            "actuators": null,
Jan 20 09:59:56 np0005588920 boring_thompson[277899]:            "device_nodes": "sr0",
Jan 20 09:59:56 np0005588920 boring_thompson[277899]:            "devname": "sr0",
Jan 20 09:59:56 np0005588920 boring_thompson[277899]:            "human_readable_size": "482.00 KB",
Jan 20 09:59:56 np0005588920 boring_thompson[277899]:            "id_bus": "ata",
Jan 20 09:59:56 np0005588920 boring_thompson[277899]:            "model": "QEMU DVD-ROM",
Jan 20 09:59:56 np0005588920 boring_thompson[277899]:            "nr_requests": "2",
Jan 20 09:59:56 np0005588920 boring_thompson[277899]:            "parent": "/dev/sr0",
Jan 20 09:59:56 np0005588920 boring_thompson[277899]:            "partitions": {},
Jan 20 09:59:56 np0005588920 boring_thompson[277899]:            "path": "/dev/sr0",
Jan 20 09:59:56 np0005588920 boring_thompson[277899]:            "removable": "1",
Jan 20 09:59:56 np0005588920 boring_thompson[277899]:            "rev": "2.5+",
Jan 20 09:59:56 np0005588920 boring_thompson[277899]:            "ro": "0",
Jan 20 09:59:56 np0005588920 boring_thompson[277899]:            "rotational": "1",
Jan 20 09:59:56 np0005588920 boring_thompson[277899]:            "sas_address": "",
Jan 20 09:59:56 np0005588920 boring_thompson[277899]:            "sas_device_handle": "",
Jan 20 09:59:56 np0005588920 boring_thompson[277899]:            "scheduler_mode": "mq-deadline",
Jan 20 09:59:56 np0005588920 boring_thompson[277899]:            "sectors": 0,
Jan 20 09:59:56 np0005588920 boring_thompson[277899]:            "sectorsize": "2048",
Jan 20 09:59:56 np0005588920 boring_thompson[277899]:            "size": 493568.0,
Jan 20 09:59:56 np0005588920 boring_thompson[277899]:            "support_discard": "2048",
Jan 20 09:59:56 np0005588920 boring_thompson[277899]:            "type": "disk",
Jan 20 09:59:56 np0005588920 boring_thompson[277899]:            "vendor": "QEMU"
Jan 20 09:59:56 np0005588920 boring_thompson[277899]:        }
Jan 20 09:59:56 np0005588920 boring_thompson[277899]:    }
Jan 20 09:59:56 np0005588920 boring_thompson[277899]: ]
Jan 20 09:59:56 np0005588920 systemd[1]: libpod-1ef39a1a5f306184dec7b7ad1fed0cbff30c583f279d33db144e61226156df74.scope: Deactivated successfully.
Jan 20 09:59:56 np0005588920 systemd[1]: libpod-1ef39a1a5f306184dec7b7ad1fed0cbff30c583f279d33db144e61226156df74.scope: Consumed 1.193s CPU time.
Jan 20 09:59:56 np0005588920 podman[277882]: 2026-01-20 14:59:56.898539661 +0000 UTC m=+1.326939019 container died 1ef39a1a5f306184dec7b7ad1fed0cbff30c583f279d33db144e61226156df74 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_thompson, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 20 09:59:56 np0005588920 systemd[1]: var-lib-containers-storage-overlay-310dbbd3b63c2c38d1a2374373d7c024a1613eb434e520242c2f43e01466c9db-merged.mount: Deactivated successfully.
Jan 20 09:59:56 np0005588920 podman[277882]: 2026-01-20 14:59:56.992524407 +0000 UTC m=+1.420923765 container remove 1ef39a1a5f306184dec7b7ad1fed0cbff30c583f279d33db144e61226156df74 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_thompson, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 20 09:59:57 np0005588920 systemd[1]: libpod-conmon-1ef39a1a5f306184dec7b7ad1fed0cbff30c583f279d33db144e61226156df74.scope: Deactivated successfully.
Jan 20 09:59:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:59:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:57.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:59:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:59:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:57.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:58 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:59:58 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:59:58 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:59:58 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:59:58 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:59:58 np0005588920 nova_compute[226886]: 2026-01-20 14:59:58.784 226890 DEBUG oslo_concurrency.lockutils [None req-5a446681-5d30-44f6-b9e5-e8ad59bffa35 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Acquiring lock "4960a060-fded-4f60-af04-b810330687b7" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:59:58 np0005588920 nova_compute[226886]: 2026-01-20 14:59:58.785 226890 DEBUG oslo_concurrency.lockutils [None req-5a446681-5d30-44f6-b9e5-e8ad59bffa35 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Lock "4960a060-fded-4f60-af04-b810330687b7" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:59:58 np0005588920 nova_compute[226886]: 2026-01-20 14:59:58.785 226890 DEBUG oslo_concurrency.lockutils [None req-5a446681-5d30-44f6-b9e5-e8ad59bffa35 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Acquiring lock "4960a060-fded-4f60-af04-b810330687b7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:59:58 np0005588920 nova_compute[226886]: 2026-01-20 14:59:58.785 226890 DEBUG oslo_concurrency.lockutils [None req-5a446681-5d30-44f6-b9e5-e8ad59bffa35 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Lock "4960a060-fded-4f60-af04-b810330687b7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:59:58 np0005588920 nova_compute[226886]: 2026-01-20 14:59:58.786 226890 DEBUG oslo_concurrency.lockutils [None req-5a446681-5d30-44f6-b9e5-e8ad59bffa35 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Lock "4960a060-fded-4f60-af04-b810330687b7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:59:58 np0005588920 nova_compute[226886]: 2026-01-20 14:59:58.787 226890 INFO nova.compute.manager [None req-5a446681-5d30-44f6-b9e5-e8ad59bffa35 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] Terminating instance#033[00m
Jan 20 09:59:58 np0005588920 nova_compute[226886]: 2026-01-20 14:59:58.788 226890 DEBUG nova.compute.manager [None req-5a446681-5d30-44f6-b9e5-e8ad59bffa35 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:59:58 np0005588920 kernel: tapd6df22c3-25 (unregistering): left promiscuous mode
Jan 20 09:59:58 np0005588920 NetworkManager[49076]: <info>  [1768921198.8283] device (tapd6df22c3-25): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:59:58 np0005588920 ovn_controller[133971]: 2026-01-20T14:59:58Z|00656|binding|INFO|Releasing lport d6df22c3-25fa-42bf-b37a-fec8793d372e from this chassis (sb_readonly=0)
Jan 20 09:59:58 np0005588920 ovn_controller[133971]: 2026-01-20T14:59:58Z|00657|binding|INFO|Setting lport d6df22c3-25fa-42bf-b37a-fec8793d372e down in Southbound
Jan 20 09:59:58 np0005588920 nova_compute[226886]: 2026-01-20 14:59:58.840 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:58 np0005588920 ovn_controller[133971]: 2026-01-20T14:59:58Z|00658|binding|INFO|Removing iface tapd6df22c3-25 ovn-installed in OVS
Jan 20 09:59:58 np0005588920 nova_compute[226886]: 2026-01-20 14:59:58.842 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:58.846 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e2:c2:7f 10.100.0.13'], port_security=['fa:16:3e:e2:c2:7f 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '4960a060-fded-4f60-af04-b810330687b7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-80d184b1-c656-4714-a34b-5760d08308df', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '19d56573457a4a0ba86eaae5ca9f2e17', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ec26b1b3-5cbb-4ff5-9c21-5a0a7d34abba', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4ab706a4-0e9d-4e28-b102-4d921960880a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=d6df22c3-25fa-42bf-b37a-fec8793d372e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:59:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:58.847 144128 INFO neutron.agent.ovn.metadata.agent [-] Port d6df22c3-25fa-42bf-b37a-fec8793d372e in datapath 80d184b1-c656-4714-a34b-5760d08308df unbound from our chassis#033[00m
Jan 20 09:59:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:58.849 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 80d184b1-c656-4714-a34b-5760d08308df, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:59:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:58.850 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[28264eca-e3b1-44cf-a7b6-007e856402d4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:58.851 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-80d184b1-c656-4714-a34b-5760d08308df namespace which is not needed anymore#033[00m
Jan 20 09:59:58 np0005588920 nova_compute[226886]: 2026-01-20 14:59:58.867 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:58 np0005588920 systemd[1]: machine-qemu\x2d65\x2dinstance\x2d00000091.scope: Deactivated successfully.
Jan 20 09:59:58 np0005588920 systemd[1]: machine-qemu\x2d65\x2dinstance\x2d00000091.scope: Consumed 6.158s CPU time.
Jan 20 09:59:58 np0005588920 systemd-machined[196121]: Machine qemu-65-instance-00000091 terminated.
Jan 20 09:59:58 np0005588920 neutron-haproxy-ovnmeta-80d184b1-c656-4714-a34b-5760d08308df[277430]: [NOTICE]   (277458) : haproxy version is 2.8.14-c23fe91
Jan 20 09:59:58 np0005588920 neutron-haproxy-ovnmeta-80d184b1-c656-4714-a34b-5760d08308df[277430]: [NOTICE]   (277458) : path to executable is /usr/sbin/haproxy
Jan 20 09:59:58 np0005588920 neutron-haproxy-ovnmeta-80d184b1-c656-4714-a34b-5760d08308df[277430]: [WARNING]  (277458) : Exiting Master process...
Jan 20 09:59:58 np0005588920 neutron-haproxy-ovnmeta-80d184b1-c656-4714-a34b-5760d08308df[277430]: [WARNING]  (277458) : Exiting Master process...
Jan 20 09:59:58 np0005588920 neutron-haproxy-ovnmeta-80d184b1-c656-4714-a34b-5760d08308df[277430]: [ALERT]    (277458) : Current worker (277465) exited with code 143 (Terminated)
Jan 20 09:59:58 np0005588920 neutron-haproxy-ovnmeta-80d184b1-c656-4714-a34b-5760d08308df[277430]: [WARNING]  (277458) : All workers exited. Exiting... (0)
Jan 20 09:59:58 np0005588920 systemd[1]: libpod-7d77636f6f5bb6bf315fedfb4bcb374ecdae0b863c1127f66f0a3d5831bb0cc7.scope: Deactivated successfully.
Jan 20 09:59:58 np0005588920 podman[279171]: 2026-01-20 14:59:58.988905221 +0000 UTC m=+0.044119943 container died 7d77636f6f5bb6bf315fedfb4bcb374ecdae0b863c1127f66f0a3d5831bb0cc7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-80d184b1-c656-4714-a34b-5760d08308df, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 20 09:59:59 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7d77636f6f5bb6bf315fedfb4bcb374ecdae0b863c1127f66f0a3d5831bb0cc7-userdata-shm.mount: Deactivated successfully.
Jan 20 09:59:59 np0005588920 systemd[1]: var-lib-containers-storage-overlay-56759175f4209a3a6b8fc6d9b4818a453a16e28cd1e23bba9095b9a296546b04-merged.mount: Deactivated successfully.
Jan 20 09:59:59 np0005588920 nova_compute[226886]: 2026-01-20 14:59:59.024 226890 INFO nova.virt.libvirt.driver [-] [instance: 4960a060-fded-4f60-af04-b810330687b7] Instance destroyed successfully.#033[00m
Jan 20 09:59:59 np0005588920 nova_compute[226886]: 2026-01-20 14:59:59.025 226890 DEBUG nova.objects.instance [None req-5a446681-5d30-44f6-b9e5-e8ad59bffa35 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Lazy-loading 'resources' on Instance uuid 4960a060-fded-4f60-af04-b810330687b7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:59:59 np0005588920 podman[279171]: 2026-01-20 14:59:59.030667088 +0000 UTC m=+0.085881810 container cleanup 7d77636f6f5bb6bf315fedfb4bcb374ecdae0b863c1127f66f0a3d5831bb0cc7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-80d184b1-c656-4714-a34b-5760d08308df, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 20 09:59:59 np0005588920 systemd[1]: libpod-conmon-7d77636f6f5bb6bf315fedfb4bcb374ecdae0b863c1127f66f0a3d5831bb0cc7.scope: Deactivated successfully.
Jan 20 09:59:59 np0005588920 nova_compute[226886]: 2026-01-20 14:59:59.041 226890 DEBUG nova.virt.libvirt.vif [None req-5a446681-5d30-44f6-b9e5-e8ad59bffa35 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:59:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerMetadataNegativeTestJSON-server-2065391372',display_name='tempest-ServerMetadataNegativeTestJSON-server-2065391372',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-servermetadatanegativetestjson-server-2065391372',id=145,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:59:53Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='19d56573457a4a0ba86eaae5ca9f2e17',ramdisk_id='',reservation_id='r-ms154a1j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_
hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerMetadataNegativeTestJSON-150423748',owner_user_name='tempest-ServerMetadataNegativeTestJSON-150423748-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:59:53Z,user_data=None,user_id='2a9592dd9fc5492a92e3b21c894f6443',uuid=4960a060-fded-4f60-af04-b810330687b7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d6df22c3-25fa-42bf-b37a-fec8793d372e", "address": "fa:16:3e:e2:c2:7f", "network": {"id": "80d184b1-c656-4714-a34b-5760d08308df", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-828529844-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "19d56573457a4a0ba86eaae5ca9f2e17", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6df22c3-25", "ovs_interfaceid": "d6df22c3-25fa-42bf-b37a-fec8793d372e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:59:59 np0005588920 nova_compute[226886]: 2026-01-20 14:59:59.042 226890 DEBUG nova.network.os_vif_util [None req-5a446681-5d30-44f6-b9e5-e8ad59bffa35 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Converting VIF {"id": "d6df22c3-25fa-42bf-b37a-fec8793d372e", "address": "fa:16:3e:e2:c2:7f", "network": {"id": "80d184b1-c656-4714-a34b-5760d08308df", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-828529844-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "19d56573457a4a0ba86eaae5ca9f2e17", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6df22c3-25", "ovs_interfaceid": "d6df22c3-25fa-42bf-b37a-fec8793d372e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:59:59 np0005588920 nova_compute[226886]: 2026-01-20 14:59:59.043 226890 DEBUG nova.network.os_vif_util [None req-5a446681-5d30-44f6-b9e5-e8ad59bffa35 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e2:c2:7f,bridge_name='br-int',has_traffic_filtering=True,id=d6df22c3-25fa-42bf-b37a-fec8793d372e,network=Network(80d184b1-c656-4714-a34b-5760d08308df),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd6df22c3-25') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:59:59 np0005588920 nova_compute[226886]: 2026-01-20 14:59:59.043 226890 DEBUG os_vif [None req-5a446681-5d30-44f6-b9e5-e8ad59bffa35 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e2:c2:7f,bridge_name='br-int',has_traffic_filtering=True,id=d6df22c3-25fa-42bf-b37a-fec8793d372e,network=Network(80d184b1-c656-4714-a34b-5760d08308df),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd6df22c3-25') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:59:59 np0005588920 nova_compute[226886]: 2026-01-20 14:59:59.044 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:59 np0005588920 nova_compute[226886]: 2026-01-20 14:59:59.045 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd6df22c3-25, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:59:59 np0005588920 nova_compute[226886]: 2026-01-20 14:59:59.046 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:59 np0005588920 nova_compute[226886]: 2026-01-20 14:59:59.047 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:59 np0005588920 nova_compute[226886]: 2026-01-20 14:59:59.050 226890 INFO os_vif [None req-5a446681-5d30-44f6-b9e5-e8ad59bffa35 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e2:c2:7f,bridge_name='br-int',has_traffic_filtering=True,id=d6df22c3-25fa-42bf-b37a-fec8793d372e,network=Network(80d184b1-c656-4714-a34b-5760d08308df),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd6df22c3-25')#033[00m
Jan 20 09:59:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:59:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:59.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:59:59 np0005588920 podman[279213]: 2026-01-20 14:59:59.100304529 +0000 UTC m=+0.045871263 container remove 7d77636f6f5bb6bf315fedfb4bcb374ecdae0b863c1127f66f0a3d5831bb0cc7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-80d184b1-c656-4714-a34b-5760d08308df, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 20 09:59:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:59.106 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[fb5adb37-3613-42bc-b7a2-d152ae77b525]: (4, ('Tue Jan 20 02:59:58 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-80d184b1-c656-4714-a34b-5760d08308df (7d77636f6f5bb6bf315fedfb4bcb374ecdae0b863c1127f66f0a3d5831bb0cc7)\n7d77636f6f5bb6bf315fedfb4bcb374ecdae0b863c1127f66f0a3d5831bb0cc7\nTue Jan 20 02:59:59 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-80d184b1-c656-4714-a34b-5760d08308df (7d77636f6f5bb6bf315fedfb4bcb374ecdae0b863c1127f66f0a3d5831bb0cc7)\n7d77636f6f5bb6bf315fedfb4bcb374ecdae0b863c1127f66f0a3d5831bb0cc7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:59.108 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b098ba9d-3253-4a56-a0d5-755849d7d00d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:59.109 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap80d184b1-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:59:59 np0005588920 nova_compute[226886]: 2026-01-20 14:59:59.110 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:59 np0005588920 kernel: tap80d184b1-c0: left promiscuous mode
Jan 20 09:59:59 np0005588920 nova_compute[226886]: 2026-01-20 14:59:59.124 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:59.128 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d102842c-bfe0-4a66-8262-6835ec5d87fa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:59.149 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c148565b-bd51-4916-aa4b-2e7cef24512a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:59.151 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a1a6c631-ef31-48df-aad4-c66d1c2e8ed8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:59.166 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[264ee91d-16d0-44b7-963f-48ce2099a118]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 623444, 'reachable_time': 24743, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 279246, 'error': None, 'target': 'ovnmeta-80d184b1-c656-4714-a34b-5760d08308df', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:59.169 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-80d184b1-c656-4714-a34b-5760d08308df deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:59:59 np0005588920 systemd[1]: run-netns-ovnmeta\x2d80d184b1\x2dc656\x2d4714\x2da34b\x2d5760d08308df.mount: Deactivated successfully.
Jan 20 09:59:59 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 14:59:59.169 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[d560e0c4-2872-425a-832e-1c94cdb01c97]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:59 np0005588920 nova_compute[226886]: 2026-01-20 14:59:59.484 226890 INFO nova.virt.libvirt.driver [None req-5a446681-5d30-44f6-b9e5-e8ad59bffa35 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] Deleting instance files /var/lib/nova/instances/4960a060-fded-4f60-af04-b810330687b7_del#033[00m
Jan 20 09:59:59 np0005588920 nova_compute[226886]: 2026-01-20 14:59:59.485 226890 INFO nova.virt.libvirt.driver [None req-5a446681-5d30-44f6-b9e5-e8ad59bffa35 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] Deletion of /var/lib/nova/instances/4960a060-fded-4f60-af04-b810330687b7_del complete#033[00m
Jan 20 09:59:59 np0005588920 nova_compute[226886]: 2026-01-20 14:59:59.566 226890 DEBUG nova.compute.manager [req-599213ca-98be-4124-b24b-c1041a18e458 req-2b86f050-2204-4b82-be53-5b0aa677dff1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] Received event network-vif-unplugged-d6df22c3-25fa-42bf-b37a-fec8793d372e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:59:59 np0005588920 nova_compute[226886]: 2026-01-20 14:59:59.567 226890 DEBUG oslo_concurrency.lockutils [req-599213ca-98be-4124-b24b-c1041a18e458 req-2b86f050-2204-4b82-be53-5b0aa677dff1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "4960a060-fded-4f60-af04-b810330687b7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:59:59 np0005588920 nova_compute[226886]: 2026-01-20 14:59:59.567 226890 DEBUG oslo_concurrency.lockutils [req-599213ca-98be-4124-b24b-c1041a18e458 req-2b86f050-2204-4b82-be53-5b0aa677dff1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "4960a060-fded-4f60-af04-b810330687b7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:59:59 np0005588920 nova_compute[226886]: 2026-01-20 14:59:59.567 226890 DEBUG oslo_concurrency.lockutils [req-599213ca-98be-4124-b24b-c1041a18e458 req-2b86f050-2204-4b82-be53-5b0aa677dff1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "4960a060-fded-4f60-af04-b810330687b7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:59:59 np0005588920 nova_compute[226886]: 2026-01-20 14:59:59.568 226890 DEBUG nova.compute.manager [req-599213ca-98be-4124-b24b-c1041a18e458 req-2b86f050-2204-4b82-be53-5b0aa677dff1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] No waiting events found dispatching network-vif-unplugged-d6df22c3-25fa-42bf-b37a-fec8793d372e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:59:59 np0005588920 nova_compute[226886]: 2026-01-20 14:59:59.568 226890 DEBUG nova.compute.manager [req-599213ca-98be-4124-b24b-c1041a18e458 req-2b86f050-2204-4b82-be53-5b0aa677dff1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] Received event network-vif-unplugged-d6df22c3-25fa-42bf-b37a-fec8793d372e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:59:59 np0005588920 nova_compute[226886]: 2026-01-20 14:59:59.568 226890 INFO nova.compute.manager [None req-5a446681-5d30-44f6-b9e5-e8ad59bffa35 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] Took 0.78 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:59:59 np0005588920 nova_compute[226886]: 2026-01-20 14:59:59.569 226890 DEBUG oslo.service.loopingcall [None req-5a446681-5d30-44f6-b9e5-e8ad59bffa35 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:59:59 np0005588920 nova_compute[226886]: 2026-01-20 14:59:59.569 226890 DEBUG nova.compute.manager [-] [instance: 4960a060-fded-4f60-af04-b810330687b7] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:59:59 np0005588920 nova_compute[226886]: 2026-01-20 14:59:59.569 226890 DEBUG nova.network.neutron [-] [instance: 4960a060-fded-4f60-af04-b810330687b7] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:59:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 09:59:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:59.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:00 np0005588920 ceph-mon[77148]: overall HEALTH_OK
Jan 20 10:00:00 np0005588920 nova_compute[226886]: 2026-01-20 15:00:00.405 226890 DEBUG nova.network.neutron [-] [instance: 4960a060-fded-4f60-af04-b810330687b7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:00:00 np0005588920 nova_compute[226886]: 2026-01-20 15:00:00.425 226890 INFO nova.compute.manager [-] [instance: 4960a060-fded-4f60-af04-b810330687b7] Took 0.86 seconds to deallocate network for instance.#033[00m
Jan 20 10:00:00 np0005588920 nova_compute[226886]: 2026-01-20 15:00:00.470 226890 DEBUG oslo_concurrency.lockutils [None req-5a446681-5d30-44f6-b9e5-e8ad59bffa35 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:00:00 np0005588920 nova_compute[226886]: 2026-01-20 15:00:00.471 226890 DEBUG oslo_concurrency.lockutils [None req-5a446681-5d30-44f6-b9e5-e8ad59bffa35 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:00:00 np0005588920 nova_compute[226886]: 2026-01-20 15:00:00.530 226890 DEBUG oslo_concurrency.lockutils [None req-238be0ef-f680-4356-8719-ea03f4a5f5e4 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Acquiring lock "a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:00:00 np0005588920 nova_compute[226886]: 2026-01-20 15:00:00.531 226890 DEBUG oslo_concurrency.lockutils [None req-238be0ef-f680-4356-8719-ea03f4a5f5e4 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:00:00 np0005588920 nova_compute[226886]: 2026-01-20 15:00:00.531 226890 DEBUG oslo_concurrency.lockutils [None req-238be0ef-f680-4356-8719-ea03f4a5f5e4 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Acquiring lock "a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:00:00 np0005588920 nova_compute[226886]: 2026-01-20 15:00:00.531 226890 DEBUG oslo_concurrency.lockutils [None req-238be0ef-f680-4356-8719-ea03f4a5f5e4 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:00:00 np0005588920 nova_compute[226886]: 2026-01-20 15:00:00.532 226890 DEBUG oslo_concurrency.lockutils [None req-238be0ef-f680-4356-8719-ea03f4a5f5e4 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:00:00 np0005588920 nova_compute[226886]: 2026-01-20 15:00:00.533 226890 INFO nova.compute.manager [None req-238be0ef-f680-4356-8719-ea03f4a5f5e4 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Terminating instance#033[00m
Jan 20 10:00:00 np0005588920 nova_compute[226886]: 2026-01-20 15:00:00.534 226890 DEBUG nova.compute.manager [None req-238be0ef-f680-4356-8719-ea03f4a5f5e4 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:00:00 np0005588920 nova_compute[226886]: 2026-01-20 15:00:00.560 226890 DEBUG oslo_concurrency.processutils [None req-5a446681-5d30-44f6-b9e5-e8ad59bffa35 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:00:00 np0005588920 kernel: tapc0ac6308-ae (unregistering): left promiscuous mode
Jan 20 10:00:00 np0005588920 NetworkManager[49076]: <info>  [1768921200.5963] device (tapc0ac6308-ae): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:00:00 np0005588920 ovn_controller[133971]: 2026-01-20T15:00:00Z|00659|binding|INFO|Releasing lport c0ac6308-ae73-4b17-95fa-47f3df3c4f97 from this chassis (sb_readonly=0)
Jan 20 10:00:00 np0005588920 ovn_controller[133971]: 2026-01-20T15:00:00Z|00660|binding|INFO|Setting lport c0ac6308-ae73-4b17-95fa-47f3df3c4f97 down in Southbound
Jan 20 10:00:00 np0005588920 ovn_controller[133971]: 2026-01-20T15:00:00Z|00661|binding|INFO|Removing iface tapc0ac6308-ae ovn-installed in OVS
Jan 20 10:00:00 np0005588920 nova_compute[226886]: 2026-01-20 15:00:00.608 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:00.618 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d8:23:95 10.100.0.6'], port_security=['fa:16:3e:d8:23:95 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58d966e1-4d26-414a-920e-0be2d77abb59', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '107c1f3b5b7b413d9a389ca1166e331f', 'neutron:revision_number': '6', 'neutron:security_group_ids': '207accdf-2d5c-48e9-bf02-5dfcc7d28063', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9a2edf59-0338-43ad-aa77-d6a806c781a6, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=c0ac6308-ae73-4b17-95fa-47f3df3c4f97) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:00:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:00.619 144128 INFO neutron.agent.ovn.metadata.agent [-] Port c0ac6308-ae73-4b17-95fa-47f3df3c4f97 in datapath 58d966e1-4d26-414a-920e-0be2d77abb59 unbound from our chassis#033[00m
Jan 20 10:00:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:00.621 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 58d966e1-4d26-414a-920e-0be2d77abb59, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:00:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:00.622 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[6eeafcfa-8a43-4566-864e-9e42863a2bd7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:00.623 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59 namespace which is not needed anymore#033[00m
Jan 20 10:00:00 np0005588920 nova_compute[226886]: 2026-01-20 15:00:00.634 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:00 np0005588920 systemd[1]: machine-qemu\x2d63\x2dinstance\x2d0000008a.scope: Deactivated successfully.
Jan 20 10:00:00 np0005588920 systemd[1]: machine-qemu\x2d63\x2dinstance\x2d0000008a.scope: Consumed 19.990s CPU time.
Jan 20 10:00:00 np0005588920 systemd-machined[196121]: Machine qemu-63-instance-0000008a terminated.
Jan 20 10:00:00 np0005588920 nova_compute[226886]: 2026-01-20 15:00:00.718 226890 DEBUG nova.compute.manager [req-600a2719-bd70-4646-8f7d-bb63731a1162 req-104d09a8-f232-473c-8c32-941b7800d830 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] Received event network-vif-deleted-d6df22c3-25fa-42bf-b37a-fec8793d372e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:00:00 np0005588920 podman[279251]: 2026-01-20 15:00:00.72425405 +0000 UTC m=+0.093898426 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 20 10:00:00 np0005588920 NetworkManager[49076]: <info>  [1768921200.7509] manager: (tapc0ac6308-ae): new Tun device (/org/freedesktop/NetworkManager/Devices/320)
Jan 20 10:00:00 np0005588920 neutron-haproxy-ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59[275544]: [NOTICE]   (275548) : haproxy version is 2.8.14-c23fe91
Jan 20 10:00:00 np0005588920 neutron-haproxy-ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59[275544]: [NOTICE]   (275548) : path to executable is /usr/sbin/haproxy
Jan 20 10:00:00 np0005588920 neutron-haproxy-ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59[275544]: [WARNING]  (275548) : Exiting Master process...
Jan 20 10:00:00 np0005588920 neutron-haproxy-ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59[275544]: [WARNING]  (275548) : Exiting Master process...
Jan 20 10:00:00 np0005588920 neutron-haproxy-ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59[275544]: [ALERT]    (275548) : Current worker (275550) exited with code 143 (Terminated)
Jan 20 10:00:00 np0005588920 neutron-haproxy-ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59[275544]: [WARNING]  (275548) : All workers exited. Exiting... (0)
Jan 20 10:00:00 np0005588920 systemd[1]: libpod-61e705983ad1f62d8e596e0b0feb82f96a1255da48bc536405d21c46838f3ba7.scope: Deactivated successfully.
Jan 20 10:00:00 np0005588920 nova_compute[226886]: 2026-01-20 15:00:00.768 226890 INFO nova.virt.libvirt.driver [-] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Instance destroyed successfully.#033[00m
Jan 20 10:00:00 np0005588920 nova_compute[226886]: 2026-01-20 15:00:00.769 226890 DEBUG nova.objects.instance [None req-238be0ef-f680-4356-8719-ea03f4a5f5e4 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lazy-loading 'resources' on Instance uuid a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:00:00 np0005588920 podman[279310]: 2026-01-20 15:00:00.772352855 +0000 UTC m=+0.049835905 container died 61e705983ad1f62d8e596e0b0feb82f96a1255da48bc536405d21c46838f3ba7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 20 10:00:00 np0005588920 nova_compute[226886]: 2026-01-20 15:00:00.782 226890 DEBUG nova.virt.libvirt.vif [None req-238be0ef-f680-4356-8719-ea03f4a5f5e4 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:57:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestInstancesWithCinderVolumes-server-95666363',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testinstanceswithcindervolumes-server-95666363',id=138,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGBbRGX2xZT3D1ftqdKpZwTwb/ukXbRv/O5UyYYLjii3gk46qsw4SNMi6p0GpNIY5l/f9OSIg9UlRsUFQqLszBoQT2vJic2iOBlI6VLyxyg71obcHOZQEGpjfcTfqUsJeQ==',key_name='tempest-TestInstancesWithCinderVolumes-1812188149',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:57:44Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='107c1f3b5b7b413d9a389ca1166e331f',ramdisk_id='',reservation_id='r-1fx4exun',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-TestInstancesWithCinderVolumes-1174033615',owner_user_name='tempest-TestInstancesWithCinderVolumes-1174033615-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:57:44Z,user_data=None,user_id='ed2c9bd268d1491fa3484d86bcdb9ec6',uuid=a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c0ac6308-ae73-4b17-95fa-47f3df3c4f97", "address": "fa:16:3e:d8:23:95", "network": {"id": "58d966e1-4d26-414a-920e-0be2d77abb59", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1896990059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "107c1f3b5b7b413d9a389ca1166e331f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0ac6308-ae", "ovs_interfaceid": "c0ac6308-ae73-4b17-95fa-47f3df3c4f97", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:00:00 np0005588920 nova_compute[226886]: 2026-01-20 15:00:00.783 226890 DEBUG nova.network.os_vif_util [None req-238be0ef-f680-4356-8719-ea03f4a5f5e4 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Converting VIF {"id": "c0ac6308-ae73-4b17-95fa-47f3df3c4f97", "address": "fa:16:3e:d8:23:95", "network": {"id": "58d966e1-4d26-414a-920e-0be2d77abb59", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1896990059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "107c1f3b5b7b413d9a389ca1166e331f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0ac6308-ae", "ovs_interfaceid": "c0ac6308-ae73-4b17-95fa-47f3df3c4f97", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:00:00 np0005588920 nova_compute[226886]: 2026-01-20 15:00:00.784 226890 DEBUG nova.network.os_vif_util [None req-238be0ef-f680-4356-8719-ea03f4a5f5e4 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d8:23:95,bridge_name='br-int',has_traffic_filtering=True,id=c0ac6308-ae73-4b17-95fa-47f3df3c4f97,network=Network(58d966e1-4d26-414a-920e-0be2d77abb59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0ac6308-ae') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:00:00 np0005588920 nova_compute[226886]: 2026-01-20 15:00:00.785 226890 DEBUG os_vif [None req-238be0ef-f680-4356-8719-ea03f4a5f5e4 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d8:23:95,bridge_name='br-int',has_traffic_filtering=True,id=c0ac6308-ae73-4b17-95fa-47f3df3c4f97,network=Network(58d966e1-4d26-414a-920e-0be2d77abb59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0ac6308-ae') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:00:00 np0005588920 nova_compute[226886]: 2026-01-20 15:00:00.787 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:00 np0005588920 nova_compute[226886]: 2026-01-20 15:00:00.787 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc0ac6308-ae, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:00:00 np0005588920 nova_compute[226886]: 2026-01-20 15:00:00.789 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:00 np0005588920 nova_compute[226886]: 2026-01-20 15:00:00.790 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:00:00 np0005588920 nova_compute[226886]: 2026-01-20 15:00:00.792 226890 INFO os_vif [None req-238be0ef-f680-4356-8719-ea03f4a5f5e4 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d8:23:95,bridge_name='br-int',has_traffic_filtering=True,id=c0ac6308-ae73-4b17-95fa-47f3df3c4f97,network=Network(58d966e1-4d26-414a-920e-0be2d77abb59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0ac6308-ae')#033[00m
Jan 20 10:00:00 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-61e705983ad1f62d8e596e0b0feb82f96a1255da48bc536405d21c46838f3ba7-userdata-shm.mount: Deactivated successfully.
Jan 20 10:00:00 np0005588920 systemd[1]: var-lib-containers-storage-overlay-a52d1ebc81a53cb7bde009370b81e65f5cbc9d610279afd6d55f011114b6cc21-merged.mount: Deactivated successfully.
Jan 20 10:00:00 np0005588920 podman[279310]: 2026-01-20 15:00:00.810401777 +0000 UTC m=+0.087884827 container cleanup 61e705983ad1f62d8e596e0b0feb82f96a1255da48bc536405d21c46838f3ba7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:00:00 np0005588920 systemd[1]: libpod-conmon-61e705983ad1f62d8e596e0b0feb82f96a1255da48bc536405d21c46838f3ba7.scope: Deactivated successfully.
Jan 20 10:00:00 np0005588920 podman[279360]: 2026-01-20 15:00:00.874537353 +0000 UTC m=+0.041571172 container remove 61e705983ad1f62d8e596e0b0feb82f96a1255da48bc536405d21c46838f3ba7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 20 10:00:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:00.883 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[eebee8c0-95cf-45a3-b727-2f4568de7103]: (4, ('Tue Jan 20 03:00:00 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59 (61e705983ad1f62d8e596e0b0feb82f96a1255da48bc536405d21c46838f3ba7)\n61e705983ad1f62d8e596e0b0feb82f96a1255da48bc536405d21c46838f3ba7\nTue Jan 20 03:00:00 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59 (61e705983ad1f62d8e596e0b0feb82f96a1255da48bc536405d21c46838f3ba7)\n61e705983ad1f62d8e596e0b0feb82f96a1255da48bc536405d21c46838f3ba7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:00.885 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[8072a909-3c6d-441b-a4a9-cc8bcfabc29e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:00.886 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58d966e1-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:00:00 np0005588920 nova_compute[226886]: 2026-01-20 15:00:00.888 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:00 np0005588920 kernel: tap58d966e1-40: left promiscuous mode
Jan 20 10:00:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:00.901 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[4de04528-4ce3-46af-b968-4e1cddceb13b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:00 np0005588920 nova_compute[226886]: 2026-01-20 15:00:00.907 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:00.920 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[db188aed-f7a7-463b-9b34-d79161efca6d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:00.921 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[7bc1838c-a51f-4b21-b242-0f62d3cb3de8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:00.936 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ebbfd9bd-200a-4cbc-a470-b1c7a089ca04]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 610480, 'reachable_time': 20999, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 279377, 'error': None, 'target': 'ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:00 np0005588920 systemd[1]: run-netns-ovnmeta\x2d58d966e1\x2d4d26\x2d414a\x2d920e\x2d0be2d77abb59.mount: Deactivated successfully.
Jan 20 10:00:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:00.939 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:00:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:00.939 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[13c2f26e-de83-4998-aa0c-23424faee09a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:01 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:00:01 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3689656102' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:00:01 np0005588920 nova_compute[226886]: 2026-01-20 15:00:01.032 226890 INFO nova.virt.libvirt.driver [None req-238be0ef-f680-4356-8719-ea03f4a5f5e4 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Deleting instance files /var/lib/nova/instances/a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f_del#033[00m
Jan 20 10:00:01 np0005588920 nova_compute[226886]: 2026-01-20 15:00:01.033 226890 INFO nova.virt.libvirt.driver [None req-238be0ef-f680-4356-8719-ea03f4a5f5e4 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Deletion of /var/lib/nova/instances/a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f_del complete#033[00m
Jan 20 10:00:01 np0005588920 nova_compute[226886]: 2026-01-20 15:00:01.051 226890 DEBUG oslo_concurrency.processutils [None req-5a446681-5d30-44f6-b9e5-e8ad59bffa35 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.491s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:00:01 np0005588920 nova_compute[226886]: 2026-01-20 15:00:01.056 226890 DEBUG nova.compute.provider_tree [None req-5a446681-5d30-44f6-b9e5-e8ad59bffa35 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:00:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:01.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:01 np0005588920 nova_compute[226886]: 2026-01-20 15:00:01.079 226890 DEBUG nova.scheduler.client.report [None req-5a446681-5d30-44f6-b9e5-e8ad59bffa35 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:00:01 np0005588920 nova_compute[226886]: 2026-01-20 15:00:01.086 226890 INFO nova.compute.manager [None req-238be0ef-f680-4356-8719-ea03f4a5f5e4 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Took 0.55 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:00:01 np0005588920 nova_compute[226886]: 2026-01-20 15:00:01.086 226890 DEBUG oslo.service.loopingcall [None req-238be0ef-f680-4356-8719-ea03f4a5f5e4 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:00:01 np0005588920 nova_compute[226886]: 2026-01-20 15:00:01.086 226890 DEBUG nova.compute.manager [-] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:00:01 np0005588920 nova_compute[226886]: 2026-01-20 15:00:01.087 226890 DEBUG nova.network.neutron [-] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:00:01 np0005588920 nova_compute[226886]: 2026-01-20 15:00:01.115 226890 DEBUG oslo_concurrency.lockutils [None req-5a446681-5d30-44f6-b9e5-e8ad59bffa35 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.644s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:00:01 np0005588920 nova_compute[226886]: 2026-01-20 15:00:01.143 226890 INFO nova.scheduler.client.report [None req-5a446681-5d30-44f6-b9e5-e8ad59bffa35 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Deleted allocations for instance 4960a060-fded-4f60-af04-b810330687b7#033[00m
Jan 20 10:00:01 np0005588920 nova_compute[226886]: 2026-01-20 15:00:01.225 226890 DEBUG oslo_concurrency.lockutils [None req-5a446681-5d30-44f6-b9e5-e8ad59bffa35 2a9592dd9fc5492a92e3b21c894f6443 19d56573457a4a0ba86eaae5ca9f2e17 - - default default] Lock "4960a060-fded-4f60-af04-b810330687b7" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.440s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:00:01 np0005588920 nova_compute[226886]: 2026-01-20 15:00:01.354 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:01.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:01 np0005588920 nova_compute[226886]: 2026-01-20 15:00:01.709 226890 DEBUG nova.compute.manager [req-d9d2deb7-4fc4-4bd2-b1fc-7a0610d7d0d5 req-261af3d7-a410-4dd8-b613-8b3c53e93ded 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] Received event network-vif-plugged-d6df22c3-25fa-42bf-b37a-fec8793d372e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:00:01 np0005588920 nova_compute[226886]: 2026-01-20 15:00:01.709 226890 DEBUG oslo_concurrency.lockutils [req-d9d2deb7-4fc4-4bd2-b1fc-7a0610d7d0d5 req-261af3d7-a410-4dd8-b613-8b3c53e93ded 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "4960a060-fded-4f60-af04-b810330687b7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:00:01 np0005588920 nova_compute[226886]: 2026-01-20 15:00:01.710 226890 DEBUG oslo_concurrency.lockutils [req-d9d2deb7-4fc4-4bd2-b1fc-7a0610d7d0d5 req-261af3d7-a410-4dd8-b613-8b3c53e93ded 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "4960a060-fded-4f60-af04-b810330687b7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:00:01 np0005588920 nova_compute[226886]: 2026-01-20 15:00:01.710 226890 DEBUG oslo_concurrency.lockutils [req-d9d2deb7-4fc4-4bd2-b1fc-7a0610d7d0d5 req-261af3d7-a410-4dd8-b613-8b3c53e93ded 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "4960a060-fded-4f60-af04-b810330687b7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:00:01 np0005588920 nova_compute[226886]: 2026-01-20 15:00:01.710 226890 DEBUG nova.compute.manager [req-d9d2deb7-4fc4-4bd2-b1fc-7a0610d7d0d5 req-261af3d7-a410-4dd8-b613-8b3c53e93ded 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] No waiting events found dispatching network-vif-plugged-d6df22c3-25fa-42bf-b37a-fec8793d372e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:00:01 np0005588920 nova_compute[226886]: 2026-01-20 15:00:01.710 226890 WARNING nova.compute.manager [req-d9d2deb7-4fc4-4bd2-b1fc-7a0610d7d0d5 req-261af3d7-a410-4dd8-b613-8b3c53e93ded 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4960a060-fded-4f60-af04-b810330687b7] Received unexpected event network-vif-plugged-d6df22c3-25fa-42bf-b37a-fec8793d372e for instance with vm_state deleted and task_state None.#033[00m
Jan 20 10:00:02 np0005588920 nova_compute[226886]: 2026-01-20 15:00:02.051 226890 DEBUG nova.network.neutron [-] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:00:02 np0005588920 nova_compute[226886]: 2026-01-20 15:00:02.073 226890 INFO nova.compute.manager [-] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Took 0.99 seconds to deallocate network for instance.#033[00m
Jan 20 10:00:02 np0005588920 nova_compute[226886]: 2026-01-20 15:00:02.268 226890 INFO nova.compute.manager [None req-238be0ef-f680-4356-8719-ea03f4a5f5e4 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Took 0.19 seconds to detach 1 volumes for instance.#033[00m
Jan 20 10:00:02 np0005588920 nova_compute[226886]: 2026-01-20 15:00:02.311 226890 DEBUG oslo_concurrency.lockutils [None req-238be0ef-f680-4356-8719-ea03f4a5f5e4 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:00:02 np0005588920 nova_compute[226886]: 2026-01-20 15:00:02.311 226890 DEBUG oslo_concurrency.lockutils [None req-238be0ef-f680-4356-8719-ea03f4a5f5e4 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:00:02 np0005588920 nova_compute[226886]: 2026-01-20 15:00:02.382 226890 DEBUG oslo_concurrency.processutils [None req-238be0ef-f680-4356-8719-ea03f4a5f5e4 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:00:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:00:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:00:02 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2499902095' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:00:02 np0005588920 nova_compute[226886]: 2026-01-20 15:00:02.823 226890 DEBUG oslo_concurrency.processutils [None req-238be0ef-f680-4356-8719-ea03f4a5f5e4 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:00:02 np0005588920 nova_compute[226886]: 2026-01-20 15:00:02.828 226890 DEBUG nova.compute.provider_tree [None req-238be0ef-f680-4356-8719-ea03f4a5f5e4 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:00:02 np0005588920 nova_compute[226886]: 2026-01-20 15:00:02.839 226890 DEBUG nova.compute.manager [req-02b58228-cddf-46a2-86c4-40359d2136af req-a5bb62d7-24b6-413d-b986-319dc50cb7bd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Received event network-vif-unplugged-c0ac6308-ae73-4b17-95fa-47f3df3c4f97 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:00:02 np0005588920 nova_compute[226886]: 2026-01-20 15:00:02.840 226890 DEBUG oslo_concurrency.lockutils [req-02b58228-cddf-46a2-86c4-40359d2136af req-a5bb62d7-24b6-413d-b986-319dc50cb7bd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:00:02 np0005588920 nova_compute[226886]: 2026-01-20 15:00:02.841 226890 DEBUG oslo_concurrency.lockutils [req-02b58228-cddf-46a2-86c4-40359d2136af req-a5bb62d7-24b6-413d-b986-319dc50cb7bd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:00:02 np0005588920 nova_compute[226886]: 2026-01-20 15:00:02.841 226890 DEBUG oslo_concurrency.lockutils [req-02b58228-cddf-46a2-86c4-40359d2136af req-a5bb62d7-24b6-413d-b986-319dc50cb7bd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:00:02 np0005588920 nova_compute[226886]: 2026-01-20 15:00:02.841 226890 DEBUG nova.compute.manager [req-02b58228-cddf-46a2-86c4-40359d2136af req-a5bb62d7-24b6-413d-b986-319dc50cb7bd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] No waiting events found dispatching network-vif-unplugged-c0ac6308-ae73-4b17-95fa-47f3df3c4f97 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:00:02 np0005588920 nova_compute[226886]: 2026-01-20 15:00:02.841 226890 WARNING nova.compute.manager [req-02b58228-cddf-46a2-86c4-40359d2136af req-a5bb62d7-24b6-413d-b986-319dc50cb7bd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Received unexpected event network-vif-unplugged-c0ac6308-ae73-4b17-95fa-47f3df3c4f97 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 10:00:02 np0005588920 nova_compute[226886]: 2026-01-20 15:00:02.842 226890 DEBUG nova.compute.manager [req-02b58228-cddf-46a2-86c4-40359d2136af req-a5bb62d7-24b6-413d-b986-319dc50cb7bd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Received event network-vif-plugged-c0ac6308-ae73-4b17-95fa-47f3df3c4f97 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:00:02 np0005588920 nova_compute[226886]: 2026-01-20 15:00:02.842 226890 DEBUG oslo_concurrency.lockutils [req-02b58228-cddf-46a2-86c4-40359d2136af req-a5bb62d7-24b6-413d-b986-319dc50cb7bd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:00:02 np0005588920 nova_compute[226886]: 2026-01-20 15:00:02.842 226890 DEBUG oslo_concurrency.lockutils [req-02b58228-cddf-46a2-86c4-40359d2136af req-a5bb62d7-24b6-413d-b986-319dc50cb7bd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:00:02 np0005588920 nova_compute[226886]: 2026-01-20 15:00:02.842 226890 DEBUG oslo_concurrency.lockutils [req-02b58228-cddf-46a2-86c4-40359d2136af req-a5bb62d7-24b6-413d-b986-319dc50cb7bd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:00:02 np0005588920 nova_compute[226886]: 2026-01-20 15:00:02.843 226890 DEBUG nova.compute.manager [req-02b58228-cddf-46a2-86c4-40359d2136af req-a5bb62d7-24b6-413d-b986-319dc50cb7bd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] No waiting events found dispatching network-vif-plugged-c0ac6308-ae73-4b17-95fa-47f3df3c4f97 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:00:02 np0005588920 nova_compute[226886]: 2026-01-20 15:00:02.843 226890 WARNING nova.compute.manager [req-02b58228-cddf-46a2-86c4-40359d2136af req-a5bb62d7-24b6-413d-b986-319dc50cb7bd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Received unexpected event network-vif-plugged-c0ac6308-ae73-4b17-95fa-47f3df3c4f97 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 10:00:02 np0005588920 nova_compute[226886]: 2026-01-20 15:00:02.843 226890 DEBUG nova.compute.manager [req-02b58228-cddf-46a2-86c4-40359d2136af req-a5bb62d7-24b6-413d-b986-319dc50cb7bd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Received event network-vif-deleted-c0ac6308-ae73-4b17-95fa-47f3df3c4f97 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:00:02 np0005588920 nova_compute[226886]: 2026-01-20 15:00:02.846 226890 DEBUG nova.scheduler.client.report [None req-238be0ef-f680-4356-8719-ea03f4a5f5e4 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:00:02 np0005588920 nova_compute[226886]: 2026-01-20 15:00:02.869 226890 DEBUG oslo_concurrency.lockutils [None req-238be0ef-f680-4356-8719-ea03f4a5f5e4 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.558s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:00:02 np0005588920 nova_compute[226886]: 2026-01-20 15:00:02.914 226890 INFO nova.scheduler.client.report [None req-238be0ef-f680-4356-8719-ea03f4a5f5e4 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Deleted allocations for instance a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f#033[00m
Jan 20 10:00:03 np0005588920 nova_compute[226886]: 2026-01-20 15:00:03.011 226890 DEBUG oslo_concurrency.lockutils [None req-238be0ef-f680-4356-8719-ea03f4a5f5e4 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.480s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:00:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:03.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:03 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:00:03 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:00:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:00:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:03.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:00:05 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #103. Immutable memtables: 0.
Jan 20 10:00:05 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:00:05.016675) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:00:05 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 63] Flushing memtable with next log file: 103
Jan 20 10:00:05 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921205016768, "job": 63, "event": "flush_started", "num_memtables": 1, "num_entries": 1917, "num_deletes": 259, "total_data_size": 4383928, "memory_usage": 4429056, "flush_reason": "Manual Compaction"}
Jan 20 10:00:05 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 63] Level-0 flush table #104: started
Jan 20 10:00:05 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921205038148, "cf_name": "default", "job": 63, "event": "table_file_creation", "file_number": 104, "file_size": 2830046, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 52607, "largest_seqno": 54519, "table_properties": {"data_size": 2822148, "index_size": 4648, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2181, "raw_key_size": 17420, "raw_average_key_size": 20, "raw_value_size": 2805888, "raw_average_value_size": 3277, "num_data_blocks": 203, "num_entries": 856, "num_filter_entries": 856, "num_deletions": 259, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768921060, "oldest_key_time": 1768921060, "file_creation_time": 1768921205, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 104, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:00:05 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 63] Flush lasted 21570 microseconds, and 6677 cpu microseconds.
Jan 20 10:00:05 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:00:05 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:00:05.038251) [db/flush_job.cc:967] [default] [JOB 63] Level-0 flush table #104: 2830046 bytes OK
Jan 20 10:00:05 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:00:05.038274) [db/memtable_list.cc:519] [default] Level-0 commit table #104 started
Jan 20 10:00:05 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:00:05.041257) [db/memtable_list.cc:722] [default] Level-0 commit table #104: memtable #1 done
Jan 20 10:00:05 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:00:05.041282) EVENT_LOG_v1 {"time_micros": 1768921205041274, "job": 63, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:00:05 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:00:05.041301) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:00:05 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 63] Try to delete WAL files size 4375122, prev total WAL file size 4375122, number of live WAL files 2.
Jan 20 10:00:05 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000100.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:00:05 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:00:05.042582) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0031373634' seq:72057594037927935, type:22 .. '6C6F676D0032303138' seq:0, type:0; will stop at (end)
Jan 20 10:00:05 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 64] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:00:05 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 63 Base level 0, inputs: [104(2763KB)], [102(10108KB)]
Jan 20 10:00:05 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921205042648, "job": 64, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [104], "files_L6": [102], "score": -1, "input_data_size": 13181097, "oldest_snapshot_seqno": -1}
Jan 20 10:00:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:05.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:05 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 64] Generated table #105: 8094 keys, 13022757 bytes, temperature: kUnknown
Jan 20 10:00:05 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921205160930, "cf_name": "default", "job": 64, "event": "table_file_creation", "file_number": 105, "file_size": 13022757, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12967085, "index_size": 34312, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 20293, "raw_key_size": 208991, "raw_average_key_size": 25, "raw_value_size": 12821300, "raw_average_value_size": 1584, "num_data_blocks": 1357, "num_entries": 8094, "num_filter_entries": 8094, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768921205, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 105, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:00:05 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:00:05 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:00:05.161141) [db/compaction/compaction_job.cc:1663] [default] [JOB 64] Compacted 1@0 + 1@6 files to L6 => 13022757 bytes
Jan 20 10:00:05 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:00:05.162532) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 111.4 rd, 110.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.7, 9.9 +0.0 blob) out(12.4 +0.0 blob), read-write-amplify(9.3) write-amplify(4.6) OK, records in: 8634, records dropped: 540 output_compression: NoCompression
Jan 20 10:00:05 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:00:05.162552) EVENT_LOG_v1 {"time_micros": 1768921205162543, "job": 64, "event": "compaction_finished", "compaction_time_micros": 118345, "compaction_time_cpu_micros": 28106, "output_level": 6, "num_output_files": 1, "total_output_size": 13022757, "num_input_records": 8634, "num_output_records": 8094, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:00:05 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000104.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:00:05 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921205163160, "job": 64, "event": "table_file_deletion", "file_number": 104}
Jan 20 10:00:05 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000102.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:00:05 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921205165155, "job": 64, "event": "table_file_deletion", "file_number": 102}
Jan 20 10:00:05 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:00:05.042447) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:00:05 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:00:05.165208) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:00:05 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:00:05.165352) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:00:05 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:00:05.165354) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:00:05 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:00:05.165356) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:00:05 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:00:05.165357) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:00:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:05.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:05 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:00:05 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3768612646' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:00:05 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:00:05 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3768612646' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:00:05 np0005588920 nova_compute[226886]: 2026-01-20 15:00:05.790 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:05 np0005588920 nova_compute[226886]: 2026-01-20 15:00:05.994 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:06 np0005588920 nova_compute[226886]: 2026-01-20 15:00:06.306 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:06 np0005588920 nova_compute[226886]: 2026-01-20 15:00:06.356 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:07.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:00:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:07.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:00:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:09.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:00:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:09.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e322 e322: 3 total, 3 up, 3 in
Jan 20 10:00:09 np0005588920 podman[279454]: 2026-01-20 15:00:09.971837194 +0000 UTC m=+0.053802246 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Jan 20 10:00:10 np0005588920 nova_compute[226886]: 2026-01-20 15:00:10.792 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:00:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:11.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:00:11 np0005588920 nova_compute[226886]: 2026-01-20 15:00:11.358 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:00:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:11.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:00:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e322 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:00:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:00:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:13.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:00:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:00:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3702275476' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:00:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:00:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3702275476' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:00:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:00:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:13.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:00:14 np0005588920 nova_compute[226886]: 2026-01-20 15:00:14.019 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768921199.0175345, 4960a060-fded-4f60-af04-b810330687b7 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:00:14 np0005588920 nova_compute[226886]: 2026-01-20 15:00:14.019 226890 INFO nova.compute.manager [-] [instance: 4960a060-fded-4f60-af04-b810330687b7] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:00:14 np0005588920 nova_compute[226886]: 2026-01-20 15:00:14.047 226890 DEBUG nova.compute.manager [None req-a8e12aec-8675-4396-b2e2-9d5d49fb5fb7 - - - - - -] [instance: 4960a060-fded-4f60-af04-b810330687b7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:00:15 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e323 e323: 3 total, 3 up, 3 in
Jan 20 10:00:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:15.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:00:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:15.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:00:15 np0005588920 nova_compute[226886]: 2026-01-20 15:00:15.766 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768921200.76456, a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:00:15 np0005588920 nova_compute[226886]: 2026-01-20 15:00:15.767 226890 INFO nova.compute.manager [-] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:00:15 np0005588920 nova_compute[226886]: 2026-01-20 15:00:15.788 226890 DEBUG nova.compute.manager [None req-458c11a6-e921-4903-9fdd-14ce1ada8d95 - - - - - -] [instance: a9aa8bf0-76a1-45f7-bd38-a7b9f7d0b64f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:00:15 np0005588920 nova_compute[226886]: 2026-01-20 15:00:15.794 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:16 np0005588920 nova_compute[226886]: 2026-01-20 15:00:16.361 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:16.460 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:00:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:16.461 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:00:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:16.461 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:00:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:17.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e323 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:00:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:17.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:19.084 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=45, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=44) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:00:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:19.084 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:00:19 np0005588920 nova_compute[226886]: 2026-01-20 15:00:19.085 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:19.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:19.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:20 np0005588920 nova_compute[226886]: 2026-01-20 15:00:20.797 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:21.087 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '45'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:00:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:00:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:21.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:00:21 np0005588920 nova_compute[226886]: 2026-01-20 15:00:21.362 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:00:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:21.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:00:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e323 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:00:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:23.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:23.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:25.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:25.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:25 np0005588920 nova_compute[226886]: 2026-01-20 15:00:25.799 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:26 np0005588920 nova_compute[226886]: 2026-01-20 15:00:26.363 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:00:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:27.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:00:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e323 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:00:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:00:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:27.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:00:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:00:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:29.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:00:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:00:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:29.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:00:29 np0005588920 nova_compute[226886]: 2026-01-20 15:00:29.771 226890 DEBUG oslo_concurrency.lockutils [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Acquiring lock "531b2c8b-4608-4bd8-b0d7-01164d2e0b35" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:00:29 np0005588920 nova_compute[226886]: 2026-01-20 15:00:29.771 226890 DEBUG oslo_concurrency.lockutils [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Lock "531b2c8b-4608-4bd8-b0d7-01164d2e0b35" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:00:29 np0005588920 nova_compute[226886]: 2026-01-20 15:00:29.847 226890 DEBUG nova.compute.manager [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 10:00:29 np0005588920 nova_compute[226886]: 2026-01-20 15:00:29.934 226890 DEBUG oslo_concurrency.lockutils [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:00:29 np0005588920 nova_compute[226886]: 2026-01-20 15:00:29.935 226890 DEBUG oslo_concurrency.lockutils [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:00:29 np0005588920 nova_compute[226886]: 2026-01-20 15:00:29.943 226890 DEBUG nova.virt.hardware [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 10:00:29 np0005588920 nova_compute[226886]: 2026-01-20 15:00:29.944 226890 INFO nova.compute.claims [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 10:00:30 np0005588920 nova_compute[226886]: 2026-01-20 15:00:30.062 226890 DEBUG oslo_concurrency.processutils [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:00:30 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:00:30 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3920558439' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:00:30 np0005588920 nova_compute[226886]: 2026-01-20 15:00:30.491 226890 DEBUG oslo_concurrency.processutils [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:00:30 np0005588920 nova_compute[226886]: 2026-01-20 15:00:30.498 226890 DEBUG nova.compute.provider_tree [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:00:30 np0005588920 nova_compute[226886]: 2026-01-20 15:00:30.530 226890 DEBUG nova.scheduler.client.report [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:00:30 np0005588920 nova_compute[226886]: 2026-01-20 15:00:30.564 226890 DEBUG oslo_concurrency.lockutils [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.629s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:00:30 np0005588920 nova_compute[226886]: 2026-01-20 15:00:30.565 226890 DEBUG nova.compute.manager [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 10:00:30 np0005588920 nova_compute[226886]: 2026-01-20 15:00:30.622 226890 DEBUG nova.compute.manager [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 10:00:30 np0005588920 nova_compute[226886]: 2026-01-20 15:00:30.623 226890 DEBUG nova.network.neutron [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 10:00:30 np0005588920 nova_compute[226886]: 2026-01-20 15:00:30.640 226890 INFO nova.virt.libvirt.driver [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 10:00:30 np0005588920 nova_compute[226886]: 2026-01-20 15:00:30.660 226890 DEBUG nova.compute.manager [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 10:00:30 np0005588920 nova_compute[226886]: 2026-01-20 15:00:30.745 226890 DEBUG nova.compute.manager [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 10:00:30 np0005588920 nova_compute[226886]: 2026-01-20 15:00:30.747 226890 DEBUG nova.virt.libvirt.driver [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 10:00:30 np0005588920 nova_compute[226886]: 2026-01-20 15:00:30.747 226890 INFO nova.virt.libvirt.driver [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Creating image(s)#033[00m
Jan 20 10:00:30 np0005588920 nova_compute[226886]: 2026-01-20 15:00:30.773 226890 DEBUG nova.storage.rbd_utils [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] rbd image 531b2c8b-4608-4bd8-b0d7-01164d2e0b35_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:00:30 np0005588920 nova_compute[226886]: 2026-01-20 15:00:30.800 226890 DEBUG nova.storage.rbd_utils [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] rbd image 531b2c8b-4608-4bd8-b0d7-01164d2e0b35_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:00:30 np0005588920 nova_compute[226886]: 2026-01-20 15:00:30.822 226890 DEBUG nova.storage.rbd_utils [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] rbd image 531b2c8b-4608-4bd8-b0d7-01164d2e0b35_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:00:30 np0005588920 nova_compute[226886]: 2026-01-20 15:00:30.826 226890 DEBUG oslo_concurrency.processutils [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:00:30 np0005588920 nova_compute[226886]: 2026-01-20 15:00:30.848 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:30 np0005588920 nova_compute[226886]: 2026-01-20 15:00:30.888 226890 DEBUG oslo_concurrency.processutils [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:00:30 np0005588920 nova_compute[226886]: 2026-01-20 15:00:30.889 226890 DEBUG oslo_concurrency.lockutils [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:00:30 np0005588920 nova_compute[226886]: 2026-01-20 15:00:30.890 226890 DEBUG oslo_concurrency.lockutils [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:00:30 np0005588920 nova_compute[226886]: 2026-01-20 15:00:30.890 226890 DEBUG oslo_concurrency.lockutils [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:00:30 np0005588920 nova_compute[226886]: 2026-01-20 15:00:30.915 226890 DEBUG nova.storage.rbd_utils [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] rbd image 531b2c8b-4608-4bd8-b0d7-01164d2e0b35_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:00:30 np0005588920 nova_compute[226886]: 2026-01-20 15:00:30.918 226890 DEBUG oslo_concurrency.processutils [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 531b2c8b-4608-4bd8-b0d7-01164d2e0b35_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:00:30 np0005588920 nova_compute[226886]: 2026-01-20 15:00:30.945 226890 DEBUG nova.policy [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '48a567f3890d43dbbcd9ee3c302b1772', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0d9bc40379a44b2eaa53062a0c0385d5', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 10:00:30 np0005588920 podman[279553]: 2026-01-20 15:00:30.987390744 +0000 UTC m=+0.074964921 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Jan 20 10:00:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:00:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:31.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:00:31 np0005588920 nova_compute[226886]: 2026-01-20 15:00:31.364 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:31 np0005588920 nova_compute[226886]: 2026-01-20 15:00:31.387 226890 DEBUG oslo_concurrency.processutils [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 531b2c8b-4608-4bd8-b0d7-01164d2e0b35_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:00:31 np0005588920 nova_compute[226886]: 2026-01-20 15:00:31.481 226890 DEBUG nova.storage.rbd_utils [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] resizing rbd image 531b2c8b-4608-4bd8-b0d7-01164d2e0b35_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 10:00:31 np0005588920 nova_compute[226886]: 2026-01-20 15:00:31.585 226890 DEBUG nova.objects.instance [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Lazy-loading 'migration_context' on Instance uuid 531b2c8b-4608-4bd8-b0d7-01164d2e0b35 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:00:31 np0005588920 nova_compute[226886]: 2026-01-20 15:00:31.602 226890 DEBUG nova.virt.libvirt.driver [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 10:00:31 np0005588920 nova_compute[226886]: 2026-01-20 15:00:31.603 226890 DEBUG nova.virt.libvirt.driver [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Ensure instance console log exists: /var/lib/nova/instances/531b2c8b-4608-4bd8-b0d7-01164d2e0b35/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:00:31 np0005588920 nova_compute[226886]: 2026-01-20 15:00:31.603 226890 DEBUG oslo_concurrency.lockutils [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:00:31 np0005588920 nova_compute[226886]: 2026-01-20 15:00:31.604 226890 DEBUG oslo_concurrency.lockutils [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:00:31 np0005588920 nova_compute[226886]: 2026-01-20 15:00:31.604 226890 DEBUG oslo_concurrency.lockutils [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:00:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:31.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:32 np0005588920 nova_compute[226886]: 2026-01-20 15:00:32.642 226890 DEBUG nova.network.neutron [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Successfully created port: 33faa655-6c91-45e8-915c-003e0c036257 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 10:00:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e323 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:00:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:33.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:33 np0005588920 nova_compute[226886]: 2026-01-20 15:00:33.604 226890 DEBUG nova.network.neutron [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Successfully updated port: 33faa655-6c91-45e8-915c-003e0c036257 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 10:00:33 np0005588920 nova_compute[226886]: 2026-01-20 15:00:33.623 226890 DEBUG oslo_concurrency.lockutils [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Acquiring lock "refresh_cache-531b2c8b-4608-4bd8-b0d7-01164d2e0b35" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:00:33 np0005588920 nova_compute[226886]: 2026-01-20 15:00:33.624 226890 DEBUG oslo_concurrency.lockutils [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Acquired lock "refresh_cache-531b2c8b-4608-4bd8-b0d7-01164d2e0b35" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:00:33 np0005588920 nova_compute[226886]: 2026-01-20 15:00:33.624 226890 DEBUG nova.network.neutron [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:00:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:33.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:34 np0005588920 nova_compute[226886]: 2026-01-20 15:00:34.012 226890 DEBUG nova.compute.manager [req-07d979a9-ef37-4d06-af7d-7af8010d7a4f req-e3833a48-7760-4fc9-8cf5-5980b67074f8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Received event network-changed-33faa655-6c91-45e8-915c-003e0c036257 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:00:34 np0005588920 nova_compute[226886]: 2026-01-20 15:00:34.012 226890 DEBUG nova.compute.manager [req-07d979a9-ef37-4d06-af7d-7af8010d7a4f req-e3833a48-7760-4fc9-8cf5-5980b67074f8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Refreshing instance network info cache due to event network-changed-33faa655-6c91-45e8-915c-003e0c036257. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:00:34 np0005588920 nova_compute[226886]: 2026-01-20 15:00:34.013 226890 DEBUG oslo_concurrency.lockutils [req-07d979a9-ef37-4d06-af7d-7af8010d7a4f req-e3833a48-7760-4fc9-8cf5-5980b67074f8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-531b2c8b-4608-4bd8-b0d7-01164d2e0b35" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:00:34 np0005588920 nova_compute[226886]: 2026-01-20 15:00:34.014 226890 DEBUG nova.network.neutron [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:00:34 np0005588920 nova_compute[226886]: 2026-01-20 15:00:34.957 226890 DEBUG nova.network.neutron [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Updating instance_info_cache with network_info: [{"id": "33faa655-6c91-45e8-915c-003e0c036257", "address": "fa:16:3e:46:30:5c", "network": {"id": "db07aec0-e1f0-4aa7-8c5d-ff6cc5298a11", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1536424175-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0d9bc40379a44b2eaa53062a0c0385d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33faa655-6c", "ovs_interfaceid": "33faa655-6c91-45e8-915c-003e0c036257", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:00:34 np0005588920 nova_compute[226886]: 2026-01-20 15:00:34.994 226890 DEBUG oslo_concurrency.lockutils [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Releasing lock "refresh_cache-531b2c8b-4608-4bd8-b0d7-01164d2e0b35" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:00:34 np0005588920 nova_compute[226886]: 2026-01-20 15:00:34.994 226890 DEBUG nova.compute.manager [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Instance network_info: |[{"id": "33faa655-6c91-45e8-915c-003e0c036257", "address": "fa:16:3e:46:30:5c", "network": {"id": "db07aec0-e1f0-4aa7-8c5d-ff6cc5298a11", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1536424175-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0d9bc40379a44b2eaa53062a0c0385d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33faa655-6c", "ovs_interfaceid": "33faa655-6c91-45e8-915c-003e0c036257", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 10:00:34 np0005588920 nova_compute[226886]: 2026-01-20 15:00:34.995 226890 DEBUG oslo_concurrency.lockutils [req-07d979a9-ef37-4d06-af7d-7af8010d7a4f req-e3833a48-7760-4fc9-8cf5-5980b67074f8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-531b2c8b-4608-4bd8-b0d7-01164d2e0b35" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:00:34 np0005588920 nova_compute[226886]: 2026-01-20 15:00:34.995 226890 DEBUG nova.network.neutron [req-07d979a9-ef37-4d06-af7d-7af8010d7a4f req-e3833a48-7760-4fc9-8cf5-5980b67074f8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Refreshing network info cache for port 33faa655-6c91-45e8-915c-003e0c036257 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:00:34 np0005588920 nova_compute[226886]: 2026-01-20 15:00:34.999 226890 DEBUG nova.virt.libvirt.driver [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Start _get_guest_xml network_info=[{"id": "33faa655-6c91-45e8-915c-003e0c036257", "address": "fa:16:3e:46:30:5c", "network": {"id": "db07aec0-e1f0-4aa7-8c5d-ff6cc5298a11", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1536424175-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0d9bc40379a44b2eaa53062a0c0385d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33faa655-6c", "ovs_interfaceid": "33faa655-6c91-45e8-915c-003e0c036257", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:00:35 np0005588920 nova_compute[226886]: 2026-01-20 15:00:35.003 226890 WARNING nova.virt.libvirt.driver [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:00:35 np0005588920 nova_compute[226886]: 2026-01-20 15:00:35.007 226890 DEBUG nova.virt.libvirt.host [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:00:35 np0005588920 nova_compute[226886]: 2026-01-20 15:00:35.008 226890 DEBUG nova.virt.libvirt.host [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:00:35 np0005588920 nova_compute[226886]: 2026-01-20 15:00:35.014 226890 DEBUG nova.virt.libvirt.host [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:00:35 np0005588920 nova_compute[226886]: 2026-01-20 15:00:35.014 226890 DEBUG nova.virt.libvirt.host [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:00:35 np0005588920 nova_compute[226886]: 2026-01-20 15:00:35.015 226890 DEBUG nova.virt.libvirt.driver [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:00:35 np0005588920 nova_compute[226886]: 2026-01-20 15:00:35.016 226890 DEBUG nova.virt.hardware [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:00:35 np0005588920 nova_compute[226886]: 2026-01-20 15:00:35.016 226890 DEBUG nova.virt.hardware [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:00:35 np0005588920 nova_compute[226886]: 2026-01-20 15:00:35.016 226890 DEBUG nova.virt.hardware [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:00:35 np0005588920 nova_compute[226886]: 2026-01-20 15:00:35.017 226890 DEBUG nova.virt.hardware [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:00:35 np0005588920 nova_compute[226886]: 2026-01-20 15:00:35.017 226890 DEBUG nova.virt.hardware [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:00:35 np0005588920 nova_compute[226886]: 2026-01-20 15:00:35.017 226890 DEBUG nova.virt.hardware [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:00:35 np0005588920 nova_compute[226886]: 2026-01-20 15:00:35.017 226890 DEBUG nova.virt.hardware [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:00:35 np0005588920 nova_compute[226886]: 2026-01-20 15:00:35.018 226890 DEBUG nova.virt.hardware [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:00:35 np0005588920 nova_compute[226886]: 2026-01-20 15:00:35.018 226890 DEBUG nova.virt.hardware [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:00:35 np0005588920 nova_compute[226886]: 2026-01-20 15:00:35.018 226890 DEBUG nova.virt.hardware [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:00:35 np0005588920 nova_compute[226886]: 2026-01-20 15:00:35.018 226890 DEBUG nova.virt.hardware [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:00:35 np0005588920 nova_compute[226886]: 2026-01-20 15:00:35.021 226890 DEBUG oslo_concurrency.processutils [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:00:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:00:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:35.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:00:35 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:00:35 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/987469144' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:00:35 np0005588920 nova_compute[226886]: 2026-01-20 15:00:35.450 226890 DEBUG oslo_concurrency.processutils [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:00:35 np0005588920 nova_compute[226886]: 2026-01-20 15:00:35.483 226890 DEBUG nova.storage.rbd_utils [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] rbd image 531b2c8b-4608-4bd8-b0d7-01164d2e0b35_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:00:35 np0005588920 nova_compute[226886]: 2026-01-20 15:00:35.487 226890 DEBUG oslo_concurrency.processutils [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:00:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:35.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:35 np0005588920 nova_compute[226886]: 2026-01-20 15:00:35.852 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:35 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:00:35 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3787191888' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:00:35 np0005588920 nova_compute[226886]: 2026-01-20 15:00:35.985 226890 DEBUG oslo_concurrency.processutils [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.498s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:00:35 np0005588920 nova_compute[226886]: 2026-01-20 15:00:35.987 226890 DEBUG nova.virt.libvirt.vif [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:00:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerPasswordTestJSON-server-1424320596',display_name='tempest-ServerPasswordTestJSON-server-1424320596',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverpasswordtestjson-server-1424320596',id=147,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0d9bc40379a44b2eaa53062a0c0385d5',ramdisk_id='',reservation_id='r-ln604x7d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerPasswordTestJSON-336768685',owner_user_name='tempest-ServerPasswordTestJSON-336768685-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:00:30Z,user_data=None,user_id='48a567f3890d43dbbcd9ee3c302b1772',uuid=531b2c8b-4608-4bd8-b0d7-01164d2e0b35,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "33faa655-6c91-45e8-915c-003e0c036257", "address": "fa:16:3e:46:30:5c", "network": {"id": "db07aec0-e1f0-4aa7-8c5d-ff6cc5298a11", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1536424175-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0d9bc40379a44b2eaa53062a0c0385d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33faa655-6c", "ovs_interfaceid": "33faa655-6c91-45e8-915c-003e0c036257", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:00:35 np0005588920 nova_compute[226886]: 2026-01-20 15:00:35.987 226890 DEBUG nova.network.os_vif_util [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Converting VIF {"id": "33faa655-6c91-45e8-915c-003e0c036257", "address": "fa:16:3e:46:30:5c", "network": {"id": "db07aec0-e1f0-4aa7-8c5d-ff6cc5298a11", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1536424175-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0d9bc40379a44b2eaa53062a0c0385d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33faa655-6c", "ovs_interfaceid": "33faa655-6c91-45e8-915c-003e0c036257", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:00:35 np0005588920 nova_compute[226886]: 2026-01-20 15:00:35.988 226890 DEBUG nova.network.os_vif_util [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:46:30:5c,bridge_name='br-int',has_traffic_filtering=True,id=33faa655-6c91-45e8-915c-003e0c036257,network=Network(db07aec0-e1f0-4aa7-8c5d-ff6cc5298a11),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33faa655-6c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:00:35 np0005588920 nova_compute[226886]: 2026-01-20 15:00:35.989 226890 DEBUG nova.objects.instance [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Lazy-loading 'pci_devices' on Instance uuid 531b2c8b-4608-4bd8-b0d7-01164d2e0b35 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:00:36 np0005588920 nova_compute[226886]: 2026-01-20 15:00:36.014 226890 DEBUG nova.virt.libvirt.driver [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:00:36 np0005588920 nova_compute[226886]:  <uuid>531b2c8b-4608-4bd8-b0d7-01164d2e0b35</uuid>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:  <name>instance-00000093</name>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:00:36 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:      <nova:name>tempest-ServerPasswordTestJSON-server-1424320596</nova:name>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 15:00:35</nova:creationTime>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 10:00:36 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:        <nova:user uuid="48a567f3890d43dbbcd9ee3c302b1772">tempest-ServerPasswordTestJSON-336768685-project-member</nova:user>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:        <nova:project uuid="0d9bc40379a44b2eaa53062a0c0385d5">tempest-ServerPasswordTestJSON-336768685</nova:project>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:        <nova:port uuid="33faa655-6c91-45e8-915c-003e0c036257">
Jan 20 10:00:36 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    <system>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:      <entry name="serial">531b2c8b-4608-4bd8-b0d7-01164d2e0b35</entry>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:      <entry name="uuid">531b2c8b-4608-4bd8-b0d7-01164d2e0b35</entry>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    </system>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:  <os>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:  </os>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:  <features>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:  </features>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:  </clock>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:  <devices>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 10:00:36 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/531b2c8b-4608-4bd8-b0d7-01164d2e0b35_disk">
Jan 20 10:00:36 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:00:36 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 10:00:36 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/531b2c8b-4608-4bd8-b0d7-01164d2e0b35_disk.config">
Jan 20 10:00:36 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:00:36 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 10:00:36 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:46:30:5c"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:      <target dev="tap33faa655-6c"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    </interface>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 10:00:36 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/531b2c8b-4608-4bd8-b0d7-01164d2e0b35/console.log" append="off"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    </serial>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    <video>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    </video>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 10:00:36 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    </rng>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 10:00:36 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 10:00:36 np0005588920 nova_compute[226886]:  </devices>
Jan 20 10:00:36 np0005588920 nova_compute[226886]: </domain>
Jan 20 10:00:36 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:00:36 np0005588920 nova_compute[226886]: 2026-01-20 15:00:36.015 226890 DEBUG nova.compute.manager [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Preparing to wait for external event network-vif-plugged-33faa655-6c91-45e8-915c-003e0c036257 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:00:36 np0005588920 nova_compute[226886]: 2026-01-20 15:00:36.015 226890 DEBUG oslo_concurrency.lockutils [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Acquiring lock "531b2c8b-4608-4bd8-b0d7-01164d2e0b35-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:00:36 np0005588920 nova_compute[226886]: 2026-01-20 15:00:36.016 226890 DEBUG oslo_concurrency.lockutils [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Lock "531b2c8b-4608-4bd8-b0d7-01164d2e0b35-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:00:36 np0005588920 nova_compute[226886]: 2026-01-20 15:00:36.016 226890 DEBUG oslo_concurrency.lockutils [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Lock "531b2c8b-4608-4bd8-b0d7-01164d2e0b35-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:00:36 np0005588920 nova_compute[226886]: 2026-01-20 15:00:36.017 226890 DEBUG nova.virt.libvirt.vif [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:00:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerPasswordTestJSON-server-1424320596',display_name='tempest-ServerPasswordTestJSON-server-1424320596',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverpasswordtestjson-server-1424320596',id=147,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0d9bc40379a44b2eaa53062a0c0385d5',ramdisk_id='',reservation_id='r-ln604x7d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerPasswordTestJSON-336768685',owner_user_name='tempest-ServerPasswordTestJSON-336768685-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:00:30Z,user_data=None,user_id='48a567f3890d43dbbcd9ee3c302b1772',uuid=531b2c8b-4608-4bd8-b0d7-01164d2e0b35,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "33faa655-6c91-45e8-915c-003e0c036257", "address": "fa:16:3e:46:30:5c", "network": {"id": "db07aec0-e1f0-4aa7-8c5d-ff6cc5298a11", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1536424175-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0d9bc40379a44b2eaa53062a0c0385d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33faa655-6c", "ovs_interfaceid": "33faa655-6c91-45e8-915c-003e0c036257", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:00:36 np0005588920 nova_compute[226886]: 2026-01-20 15:00:36.017 226890 DEBUG nova.network.os_vif_util [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Converting VIF {"id": "33faa655-6c91-45e8-915c-003e0c036257", "address": "fa:16:3e:46:30:5c", "network": {"id": "db07aec0-e1f0-4aa7-8c5d-ff6cc5298a11", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1536424175-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0d9bc40379a44b2eaa53062a0c0385d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33faa655-6c", "ovs_interfaceid": "33faa655-6c91-45e8-915c-003e0c036257", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:00:36 np0005588920 nova_compute[226886]: 2026-01-20 15:00:36.017 226890 DEBUG nova.network.os_vif_util [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:46:30:5c,bridge_name='br-int',has_traffic_filtering=True,id=33faa655-6c91-45e8-915c-003e0c036257,network=Network(db07aec0-e1f0-4aa7-8c5d-ff6cc5298a11),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33faa655-6c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:00:36 np0005588920 nova_compute[226886]: 2026-01-20 15:00:36.018 226890 DEBUG os_vif [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:46:30:5c,bridge_name='br-int',has_traffic_filtering=True,id=33faa655-6c91-45e8-915c-003e0c036257,network=Network(db07aec0-e1f0-4aa7-8c5d-ff6cc5298a11),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33faa655-6c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:00:36 np0005588920 nova_compute[226886]: 2026-01-20 15:00:36.018 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:36 np0005588920 nova_compute[226886]: 2026-01-20 15:00:36.019 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:00:36 np0005588920 nova_compute[226886]: 2026-01-20 15:00:36.019 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:00:36 np0005588920 nova_compute[226886]: 2026-01-20 15:00:36.021 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:36 np0005588920 nova_compute[226886]: 2026-01-20 15:00:36.022 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap33faa655-6c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:00:36 np0005588920 nova_compute[226886]: 2026-01-20 15:00:36.022 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap33faa655-6c, col_values=(('external_ids', {'iface-id': '33faa655-6c91-45e8-915c-003e0c036257', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:46:30:5c', 'vm-uuid': '531b2c8b-4608-4bd8-b0d7-01164d2e0b35'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:00:36 np0005588920 nova_compute[226886]: 2026-01-20 15:00:36.024 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:36 np0005588920 NetworkManager[49076]: <info>  [1768921236.0250] manager: (tap33faa655-6c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/321)
Jan 20 10:00:36 np0005588920 nova_compute[226886]: 2026-01-20 15:00:36.026 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:00:36 np0005588920 nova_compute[226886]: 2026-01-20 15:00:36.030 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:36 np0005588920 nova_compute[226886]: 2026-01-20 15:00:36.031 226890 INFO os_vif [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:46:30:5c,bridge_name='br-int',has_traffic_filtering=True,id=33faa655-6c91-45e8-915c-003e0c036257,network=Network(db07aec0-e1f0-4aa7-8c5d-ff6cc5298a11),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33faa655-6c')#033[00m
Jan 20 10:00:36 np0005588920 nova_compute[226886]: 2026-01-20 15:00:36.227 226890 DEBUG nova.virt.libvirt.driver [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:00:36 np0005588920 nova_compute[226886]: 2026-01-20 15:00:36.228 226890 DEBUG nova.virt.libvirt.driver [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:00:36 np0005588920 nova_compute[226886]: 2026-01-20 15:00:36.228 226890 DEBUG nova.virt.libvirt.driver [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] No VIF found with MAC fa:16:3e:46:30:5c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:00:36 np0005588920 nova_compute[226886]: 2026-01-20 15:00:36.229 226890 INFO nova.virt.libvirt.driver [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Using config drive#033[00m
Jan 20 10:00:36 np0005588920 nova_compute[226886]: 2026-01-20 15:00:36.255 226890 DEBUG nova.storage.rbd_utils [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] rbd image 531b2c8b-4608-4bd8-b0d7-01164d2e0b35_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:00:36 np0005588920 nova_compute[226886]: 2026-01-20 15:00:36.367 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:36 np0005588920 nova_compute[226886]: 2026-01-20 15:00:36.531 226890 DEBUG nova.network.neutron [req-07d979a9-ef37-4d06-af7d-7af8010d7a4f req-e3833a48-7760-4fc9-8cf5-5980b67074f8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Updated VIF entry in instance network info cache for port 33faa655-6c91-45e8-915c-003e0c036257. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:00:36 np0005588920 nova_compute[226886]: 2026-01-20 15:00:36.532 226890 DEBUG nova.network.neutron [req-07d979a9-ef37-4d06-af7d-7af8010d7a4f req-e3833a48-7760-4fc9-8cf5-5980b67074f8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Updating instance_info_cache with network_info: [{"id": "33faa655-6c91-45e8-915c-003e0c036257", "address": "fa:16:3e:46:30:5c", "network": {"id": "db07aec0-e1f0-4aa7-8c5d-ff6cc5298a11", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1536424175-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0d9bc40379a44b2eaa53062a0c0385d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33faa655-6c", "ovs_interfaceid": "33faa655-6c91-45e8-915c-003e0c036257", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:00:36 np0005588920 nova_compute[226886]: 2026-01-20 15:00:36.558 226890 DEBUG oslo_concurrency.lockutils [req-07d979a9-ef37-4d06-af7d-7af8010d7a4f req-e3833a48-7760-4fc9-8cf5-5980b67074f8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-531b2c8b-4608-4bd8-b0d7-01164d2e0b35" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:00:36 np0005588920 nova_compute[226886]: 2026-01-20 15:00:36.662 226890 INFO nova.virt.libvirt.driver [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Creating config drive at /var/lib/nova/instances/531b2c8b-4608-4bd8-b0d7-01164d2e0b35/disk.config#033[00m
Jan 20 10:00:36 np0005588920 nova_compute[226886]: 2026-01-20 15:00:36.666 226890 DEBUG oslo_concurrency.processutils [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/531b2c8b-4608-4bd8-b0d7-01164d2e0b35/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppidzlbvh execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:00:36 np0005588920 nova_compute[226886]: 2026-01-20 15:00:36.797 226890 DEBUG oslo_concurrency.processutils [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/531b2c8b-4608-4bd8-b0d7-01164d2e0b35/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppidzlbvh" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:00:36 np0005588920 nova_compute[226886]: 2026-01-20 15:00:36.824 226890 DEBUG nova.storage.rbd_utils [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] rbd image 531b2c8b-4608-4bd8-b0d7-01164d2e0b35_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:00:36 np0005588920 nova_compute[226886]: 2026-01-20 15:00:36.827 226890 DEBUG oslo_concurrency.processutils [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/531b2c8b-4608-4bd8-b0d7-01164d2e0b35/disk.config 531b2c8b-4608-4bd8-b0d7-01164d2e0b35_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:00:36 np0005588920 nova_compute[226886]: 2026-01-20 15:00:36.979 226890 DEBUG oslo_concurrency.processutils [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/531b2c8b-4608-4bd8-b0d7-01164d2e0b35/disk.config 531b2c8b-4608-4bd8-b0d7-01164d2e0b35_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.152s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:00:36 np0005588920 nova_compute[226886]: 2026-01-20 15:00:36.979 226890 INFO nova.virt.libvirt.driver [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Deleting local config drive /var/lib/nova/instances/531b2c8b-4608-4bd8-b0d7-01164d2e0b35/disk.config because it was imported into RBD.#033[00m
Jan 20 10:00:37 np0005588920 kernel: tap33faa655-6c: entered promiscuous mode
Jan 20 10:00:37 np0005588920 NetworkManager[49076]: <info>  [1768921237.0327] manager: (tap33faa655-6c): new Tun device (/org/freedesktop/NetworkManager/Devices/322)
Jan 20 10:00:37 np0005588920 ovn_controller[133971]: 2026-01-20T15:00:37Z|00662|binding|INFO|Claiming lport 33faa655-6c91-45e8-915c-003e0c036257 for this chassis.
Jan 20 10:00:37 np0005588920 ovn_controller[133971]: 2026-01-20T15:00:37Z|00663|binding|INFO|33faa655-6c91-45e8-915c-003e0c036257: Claiming fa:16:3e:46:30:5c 10.100.0.12
Jan 20 10:00:37 np0005588920 nova_compute[226886]: 2026-01-20 15:00:37.032 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:37 np0005588920 nova_compute[226886]: 2026-01-20 15:00:37.037 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:37.045 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:46:30:5c 10.100.0.12'], port_security=['fa:16:3e:46:30:5c 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '531b2c8b-4608-4bd8-b0d7-01164d2e0b35', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-db07aec0-e1f0-4aa7-8c5d-ff6cc5298a11', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0d9bc40379a44b2eaa53062a0c0385d5', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a0eec58c-efe5-4615-b819-ee5e26cf4d38', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=53f6ca6b-4923-4565-a0c9-c48519a6fc3b, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=33faa655-6c91-45e8-915c-003e0c036257) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:37.046 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 33faa655-6c91-45e8-915c-003e0c036257 in datapath db07aec0-e1f0-4aa7-8c5d-ff6cc5298a11 bound to our chassis#033[00m
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:37.047 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network db07aec0-e1f0-4aa7-8c5d-ff6cc5298a11#033[00m
Jan 20 10:00:37 np0005588920 systemd-udevd[279823]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:37.058 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[26cc9f25-20a4-44cb-b382-ddde8e692dbf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:37.059 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapdb07aec0-e1 in ovnmeta-db07aec0-e1f0-4aa7-8c5d-ff6cc5298a11 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:37.061 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapdb07aec0-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:37.061 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[45d247ce-7717-4566-ac56-5905a8e395a3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:37.062 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0af3f742-399a-4fd8-a1df-312d9fa945d2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:37 np0005588920 systemd-machined[196121]: New machine qemu-66-instance-00000093.
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:37.073 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[60bafe77-7206-435f-a57d-462a1f42c116]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:37 np0005588920 NetworkManager[49076]: <info>  [1768921237.0812] device (tap33faa655-6c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:00:37 np0005588920 NetworkManager[49076]: <info>  [1768921237.0825] device (tap33faa655-6c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:00:37 np0005588920 systemd[1]: Started Virtual Machine qemu-66-instance-00000093.
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:37.097 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[4643a81a-e41f-4848-82ce-947ad531b711]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:37 np0005588920 nova_compute[226886]: 2026-01-20 15:00:37.107 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:37 np0005588920 ovn_controller[133971]: 2026-01-20T15:00:37Z|00664|binding|INFO|Setting lport 33faa655-6c91-45e8-915c-003e0c036257 ovn-installed in OVS
Jan 20 10:00:37 np0005588920 ovn_controller[133971]: 2026-01-20T15:00:37Z|00665|binding|INFO|Setting lport 33faa655-6c91-45e8-915c-003e0c036257 up in Southbound
Jan 20 10:00:37 np0005588920 nova_compute[226886]: 2026-01-20 15:00:37.117 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:37.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:37.125 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[15544495-948a-47de-82b4-e1ed126706f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:37.129 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[bf710a81-2007-4c2c-83ac-c80b80ed2fc5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:37 np0005588920 systemd-udevd[279828]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:00:37 np0005588920 NetworkManager[49076]: <info>  [1768921237.1305] manager: (tapdb07aec0-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/323)
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:37.162 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[6079b73b-16f4-4b1a-b6af-6756d312e38e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:37.164 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[d323e237-819f-43fe-baf1-1ba44f99130d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:37 np0005588920 NetworkManager[49076]: <info>  [1768921237.1853] device (tapdb07aec0-e0): carrier: link connected
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:37.190 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[2d3d82ef-5cb5-4760-acbb-b8d03dfd1051]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:37.205 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[dd9fb814-59a8-4a74-9e80-e04baa1eb765]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdb07aec0-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ba:07:82'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 210], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 627887, 'reachable_time': 41349, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 279856, 'error': None, 'target': 'ovnmeta-db07aec0-e1f0-4aa7-8c5d-ff6cc5298a11', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:37.220 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b62505da-7384-4e08-966e-bd330c1d5edd]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feba:782'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 627887, 'tstamp': 627887}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 279857, 'error': None, 'target': 'ovnmeta-db07aec0-e1f0-4aa7-8c5d-ff6cc5298a11', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:37.237 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[071ae1d1-6b8c-4470-bb53-4111fbc90a96]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdb07aec0-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ba:07:82'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 210], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 627887, 'reachable_time': 41349, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 279858, 'error': None, 'target': 'ovnmeta-db07aec0-e1f0-4aa7-8c5d-ff6cc5298a11', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:37.267 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e06ce4fd-a8bc-4213-8c45-9e57ff6cc1de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:37.324 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[98dd3670-f561-47f3-8ce3-5342cbfc957a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:37.326 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdb07aec0-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:37.326 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:37.326 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdb07aec0-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:00:37 np0005588920 nova_compute[226886]: 2026-01-20 15:00:37.328 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:37 np0005588920 kernel: tapdb07aec0-e0: entered promiscuous mode
Jan 20 10:00:37 np0005588920 NetworkManager[49076]: <info>  [1768921237.3291] manager: (tapdb07aec0-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/324)
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:37.330 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdb07aec0-e0, col_values=(('external_ids', {'iface-id': 'e9a89236-88e5-4975-a2fa-32b3cb3d40c8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:00:37 np0005588920 ovn_controller[133971]: 2026-01-20T15:00:37Z|00666|binding|INFO|Releasing lport e9a89236-88e5-4975-a2fa-32b3cb3d40c8 from this chassis (sb_readonly=0)
Jan 20 10:00:37 np0005588920 nova_compute[226886]: 2026-01-20 15:00:37.345 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:37.346 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/db07aec0-e1f0-4aa7-8c5d-ff6cc5298a11.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/db07aec0-e1f0-4aa7-8c5d-ff6cc5298a11.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:37.347 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c6e61638-5de1-4f62-b23e-230beff7ded1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:37.348 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-db07aec0-e1f0-4aa7-8c5d-ff6cc5298a11
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/db07aec0-e1f0-4aa7-8c5d-ff6cc5298a11.pid.haproxy
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID db07aec0-e1f0-4aa7-8c5d-ff6cc5298a11
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:00:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:37.350 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-db07aec0-e1f0-4aa7-8c5d-ff6cc5298a11', 'env', 'PROCESS_TAG=haproxy-db07aec0-e1f0-4aa7-8c5d-ff6cc5298a11', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/db07aec0-e1f0-4aa7-8c5d-ff6cc5298a11.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:00:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e323 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:00:37 np0005588920 nova_compute[226886]: 2026-01-20 15:00:37.693 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921237.6930208, 531b2c8b-4608-4bd8-b0d7-01164d2e0b35 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:00:37 np0005588920 nova_compute[226886]: 2026-01-20 15:00:37.695 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] VM Started (Lifecycle Event)#033[00m
Jan 20 10:00:37 np0005588920 podman[279931]: 2026-01-20 15:00:37.715437854 +0000 UTC m=+0.058653486 container create 293cc00097871f80bfbaaf7631c648b3eee35d2d8e032c7eb3b74108cb676619 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-db07aec0-e1f0-4aa7-8c5d-ff6cc5298a11, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:00:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:37.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:37 np0005588920 nova_compute[226886]: 2026-01-20 15:00:37.730 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:00:37 np0005588920 nova_compute[226886]: 2026-01-20 15:00:37.734 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921237.693248, 531b2c8b-4608-4bd8-b0d7-01164d2e0b35 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:00:37 np0005588920 nova_compute[226886]: 2026-01-20 15:00:37.734 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:00:37 np0005588920 systemd[1]: Started libpod-conmon-293cc00097871f80bfbaaf7631c648b3eee35d2d8e032c7eb3b74108cb676619.scope.
Jan 20 10:00:37 np0005588920 nova_compute[226886]: 2026-01-20 15:00:37.752 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:00:37 np0005588920 nova_compute[226886]: 2026-01-20 15:00:37.756 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:00:37 np0005588920 nova_compute[226886]: 2026-01-20 15:00:37.775 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:00:37 np0005588920 podman[279931]: 2026-01-20 15:00:37.685375575 +0000 UTC m=+0.028591227 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:00:37 np0005588920 systemd[1]: Started libcrun container.
Jan 20 10:00:37 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0e5d10a6f91e85047b86ba0322b5c9f75ba93da76e6b95dade4f6c193132435a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:00:37 np0005588920 podman[279931]: 2026-01-20 15:00:37.797827986 +0000 UTC m=+0.141043618 container init 293cc00097871f80bfbaaf7631c648b3eee35d2d8e032c7eb3b74108cb676619 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-db07aec0-e1f0-4aa7-8c5d-ff6cc5298a11, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 20 10:00:37 np0005588920 podman[279931]: 2026-01-20 15:00:37.802506519 +0000 UTC m=+0.145722151 container start 293cc00097871f80bfbaaf7631c648b3eee35d2d8e032c7eb3b74108cb676619 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-db07aec0-e1f0-4aa7-8c5d-ff6cc5298a11, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:00:37 np0005588920 neutron-haproxy-ovnmeta-db07aec0-e1f0-4aa7-8c5d-ff6cc5298a11[279947]: [NOTICE]   (279951) : New worker (279953) forked
Jan 20 10:00:37 np0005588920 neutron-haproxy-ovnmeta-db07aec0-e1f0-4aa7-8c5d-ff6cc5298a11[279947]: [NOTICE]   (279951) : Loading success.
Jan 20 10:00:38 np0005588920 nova_compute[226886]: 2026-01-20 15:00:38.876 226890 DEBUG nova.compute.manager [req-522d8237-f10e-411b-b0c3-a305cee240f4 req-772d1398-e054-42b9-8153-4bf31d12d75c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Received event network-vif-plugged-33faa655-6c91-45e8-915c-003e0c036257 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:00:38 np0005588920 nova_compute[226886]: 2026-01-20 15:00:38.877 226890 DEBUG oslo_concurrency.lockutils [req-522d8237-f10e-411b-b0c3-a305cee240f4 req-772d1398-e054-42b9-8153-4bf31d12d75c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "531b2c8b-4608-4bd8-b0d7-01164d2e0b35-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:00:38 np0005588920 nova_compute[226886]: 2026-01-20 15:00:38.878 226890 DEBUG oslo_concurrency.lockutils [req-522d8237-f10e-411b-b0c3-a305cee240f4 req-772d1398-e054-42b9-8153-4bf31d12d75c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "531b2c8b-4608-4bd8-b0d7-01164d2e0b35-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:00:38 np0005588920 nova_compute[226886]: 2026-01-20 15:00:38.878 226890 DEBUG oslo_concurrency.lockutils [req-522d8237-f10e-411b-b0c3-a305cee240f4 req-772d1398-e054-42b9-8153-4bf31d12d75c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "531b2c8b-4608-4bd8-b0d7-01164d2e0b35-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:00:38 np0005588920 nova_compute[226886]: 2026-01-20 15:00:38.878 226890 DEBUG nova.compute.manager [req-522d8237-f10e-411b-b0c3-a305cee240f4 req-772d1398-e054-42b9-8153-4bf31d12d75c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Processing event network-vif-plugged-33faa655-6c91-45e8-915c-003e0c036257 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 10:00:38 np0005588920 nova_compute[226886]: 2026-01-20 15:00:38.878 226890 DEBUG nova.compute.manager [req-522d8237-f10e-411b-b0c3-a305cee240f4 req-772d1398-e054-42b9-8153-4bf31d12d75c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Received event network-vif-plugged-33faa655-6c91-45e8-915c-003e0c036257 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:00:38 np0005588920 nova_compute[226886]: 2026-01-20 15:00:38.879 226890 DEBUG oslo_concurrency.lockutils [req-522d8237-f10e-411b-b0c3-a305cee240f4 req-772d1398-e054-42b9-8153-4bf31d12d75c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "531b2c8b-4608-4bd8-b0d7-01164d2e0b35-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:00:38 np0005588920 nova_compute[226886]: 2026-01-20 15:00:38.879 226890 DEBUG oslo_concurrency.lockutils [req-522d8237-f10e-411b-b0c3-a305cee240f4 req-772d1398-e054-42b9-8153-4bf31d12d75c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "531b2c8b-4608-4bd8-b0d7-01164d2e0b35-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:00:38 np0005588920 nova_compute[226886]: 2026-01-20 15:00:38.879 226890 DEBUG oslo_concurrency.lockutils [req-522d8237-f10e-411b-b0c3-a305cee240f4 req-772d1398-e054-42b9-8153-4bf31d12d75c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "531b2c8b-4608-4bd8-b0d7-01164d2e0b35-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:00:38 np0005588920 nova_compute[226886]: 2026-01-20 15:00:38.880 226890 DEBUG nova.compute.manager [req-522d8237-f10e-411b-b0c3-a305cee240f4 req-772d1398-e054-42b9-8153-4bf31d12d75c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] No waiting events found dispatching network-vif-plugged-33faa655-6c91-45e8-915c-003e0c036257 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:00:38 np0005588920 nova_compute[226886]: 2026-01-20 15:00:38.880 226890 WARNING nova.compute.manager [req-522d8237-f10e-411b-b0c3-a305cee240f4 req-772d1398-e054-42b9-8153-4bf31d12d75c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Received unexpected event network-vif-plugged-33faa655-6c91-45e8-915c-003e0c036257 for instance with vm_state building and task_state spawning.#033[00m
Jan 20 10:00:38 np0005588920 nova_compute[226886]: 2026-01-20 15:00:38.881 226890 DEBUG nova.compute.manager [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:00:38 np0005588920 nova_compute[226886]: 2026-01-20 15:00:38.884 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921238.8843586, 531b2c8b-4608-4bd8-b0d7-01164d2e0b35 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:00:38 np0005588920 nova_compute[226886]: 2026-01-20 15:00:38.884 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:00:38 np0005588920 nova_compute[226886]: 2026-01-20 15:00:38.886 226890 DEBUG nova.virt.libvirt.driver [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:00:38 np0005588920 nova_compute[226886]: 2026-01-20 15:00:38.889 226890 INFO nova.virt.libvirt.driver [-] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Instance spawned successfully.#033[00m
Jan 20 10:00:38 np0005588920 nova_compute[226886]: 2026-01-20 15:00:38.889 226890 DEBUG nova.virt.libvirt.driver [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 10:00:38 np0005588920 nova_compute[226886]: 2026-01-20 15:00:38.904 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:00:38 np0005588920 nova_compute[226886]: 2026-01-20 15:00:38.908 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:00:38 np0005588920 nova_compute[226886]: 2026-01-20 15:00:38.912 226890 DEBUG nova.virt.libvirt.driver [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:00:38 np0005588920 nova_compute[226886]: 2026-01-20 15:00:38.912 226890 DEBUG nova.virt.libvirt.driver [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:00:38 np0005588920 nova_compute[226886]: 2026-01-20 15:00:38.913 226890 DEBUG nova.virt.libvirt.driver [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:00:38 np0005588920 nova_compute[226886]: 2026-01-20 15:00:38.913 226890 DEBUG nova.virt.libvirt.driver [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:00:38 np0005588920 nova_compute[226886]: 2026-01-20 15:00:38.913 226890 DEBUG nova.virt.libvirt.driver [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:00:38 np0005588920 nova_compute[226886]: 2026-01-20 15:00:38.914 226890 DEBUG nova.virt.libvirt.driver [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:00:38 np0005588920 nova_compute[226886]: 2026-01-20 15:00:38.936 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:00:38 np0005588920 nova_compute[226886]: 2026-01-20 15:00:38.966 226890 INFO nova.compute.manager [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Took 8.22 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 10:00:38 np0005588920 nova_compute[226886]: 2026-01-20 15:00:38.966 226890 DEBUG nova.compute.manager [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:00:39 np0005588920 nova_compute[226886]: 2026-01-20 15:00:39.031 226890 INFO nova.compute.manager [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Took 9.14 seconds to build instance.#033[00m
Jan 20 10:00:39 np0005588920 nova_compute[226886]: 2026-01-20 15:00:39.055 226890 DEBUG oslo_concurrency.lockutils [None req-1f0b69c1-ac1e-4d13-a631-7aa4fe64ede0 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Lock "531b2c8b-4608-4bd8-b0d7-01164d2e0b35" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.283s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:00:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:39.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:39.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:40 np0005588920 podman[279962]: 2026-01-20 15:00:40.968330455 +0000 UTC m=+0.053997473 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 20 10:00:41 np0005588920 nova_compute[226886]: 2026-01-20 15:00:41.025 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:41.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:41 np0005588920 nova_compute[226886]: 2026-01-20 15:00:41.287 226890 DEBUG oslo_concurrency.lockutils [None req-86b5da06-9ffb-449a-8fdb-3d8373a70044 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Acquiring lock "531b2c8b-4608-4bd8-b0d7-01164d2e0b35" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:00:41 np0005588920 nova_compute[226886]: 2026-01-20 15:00:41.287 226890 DEBUG oslo_concurrency.lockutils [None req-86b5da06-9ffb-449a-8fdb-3d8373a70044 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Lock "531b2c8b-4608-4bd8-b0d7-01164d2e0b35" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:00:41 np0005588920 nova_compute[226886]: 2026-01-20 15:00:41.288 226890 DEBUG oslo_concurrency.lockutils [None req-86b5da06-9ffb-449a-8fdb-3d8373a70044 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Acquiring lock "531b2c8b-4608-4bd8-b0d7-01164d2e0b35-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:00:41 np0005588920 nova_compute[226886]: 2026-01-20 15:00:41.288 226890 DEBUG oslo_concurrency.lockutils [None req-86b5da06-9ffb-449a-8fdb-3d8373a70044 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Lock "531b2c8b-4608-4bd8-b0d7-01164d2e0b35-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:00:41 np0005588920 nova_compute[226886]: 2026-01-20 15:00:41.288 226890 DEBUG oslo_concurrency.lockutils [None req-86b5da06-9ffb-449a-8fdb-3d8373a70044 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Lock "531b2c8b-4608-4bd8-b0d7-01164d2e0b35-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:00:41 np0005588920 nova_compute[226886]: 2026-01-20 15:00:41.289 226890 INFO nova.compute.manager [None req-86b5da06-9ffb-449a-8fdb-3d8373a70044 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Terminating instance#033[00m
Jan 20 10:00:41 np0005588920 nova_compute[226886]: 2026-01-20 15:00:41.290 226890 DEBUG nova.compute.manager [None req-86b5da06-9ffb-449a-8fdb-3d8373a70044 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:00:41 np0005588920 kernel: tap33faa655-6c (unregistering): left promiscuous mode
Jan 20 10:00:41 np0005588920 NetworkManager[49076]: <info>  [1768921241.3361] device (tap33faa655-6c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:00:41 np0005588920 nova_compute[226886]: 2026-01-20 15:00:41.341 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:41 np0005588920 ovn_controller[133971]: 2026-01-20T15:00:41Z|00667|binding|INFO|Releasing lport 33faa655-6c91-45e8-915c-003e0c036257 from this chassis (sb_readonly=0)
Jan 20 10:00:41 np0005588920 ovn_controller[133971]: 2026-01-20T15:00:41Z|00668|binding|INFO|Setting lport 33faa655-6c91-45e8-915c-003e0c036257 down in Southbound
Jan 20 10:00:41 np0005588920 ovn_controller[133971]: 2026-01-20T15:00:41Z|00669|binding|INFO|Removing iface tap33faa655-6c ovn-installed in OVS
Jan 20 10:00:41 np0005588920 nova_compute[226886]: 2026-01-20 15:00:41.343 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:41.349 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:46:30:5c 10.100.0.12'], port_security=['fa:16:3e:46:30:5c 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '531b2c8b-4608-4bd8-b0d7-01164d2e0b35', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-db07aec0-e1f0-4aa7-8c5d-ff6cc5298a11', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0d9bc40379a44b2eaa53062a0c0385d5', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a0eec58c-efe5-4615-b819-ee5e26cf4d38', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=53f6ca6b-4923-4565-a0c9-c48519a6fc3b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=33faa655-6c91-45e8-915c-003e0c036257) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:00:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:41.352 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 33faa655-6c91-45e8-915c-003e0c036257 in datapath db07aec0-e1f0-4aa7-8c5d-ff6cc5298a11 unbound from our chassis#033[00m
Jan 20 10:00:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:41.353 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network db07aec0-e1f0-4aa7-8c5d-ff6cc5298a11, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:00:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:41.354 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[77d8d1f1-7225-460b-9ad6-1bba5f5826a2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:41.355 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-db07aec0-e1f0-4aa7-8c5d-ff6cc5298a11 namespace which is not needed anymore#033[00m
Jan 20 10:00:41 np0005588920 nova_compute[226886]: 2026-01-20 15:00:41.361 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:41 np0005588920 nova_compute[226886]: 2026-01-20 15:00:41.368 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:41 np0005588920 systemd[1]: machine-qemu\x2d66\x2dinstance\x2d00000093.scope: Deactivated successfully.
Jan 20 10:00:41 np0005588920 systemd[1]: machine-qemu\x2d66\x2dinstance\x2d00000093.scope: Consumed 3.141s CPU time.
Jan 20 10:00:41 np0005588920 systemd-machined[196121]: Machine qemu-66-instance-00000093 terminated.
Jan 20 10:00:41 np0005588920 neutron-haproxy-ovnmeta-db07aec0-e1f0-4aa7-8c5d-ff6cc5298a11[279947]: [NOTICE]   (279951) : haproxy version is 2.8.14-c23fe91
Jan 20 10:00:41 np0005588920 neutron-haproxy-ovnmeta-db07aec0-e1f0-4aa7-8c5d-ff6cc5298a11[279947]: [NOTICE]   (279951) : path to executable is /usr/sbin/haproxy
Jan 20 10:00:41 np0005588920 neutron-haproxy-ovnmeta-db07aec0-e1f0-4aa7-8c5d-ff6cc5298a11[279947]: [WARNING]  (279951) : Exiting Master process...
Jan 20 10:00:41 np0005588920 neutron-haproxy-ovnmeta-db07aec0-e1f0-4aa7-8c5d-ff6cc5298a11[279947]: [ALERT]    (279951) : Current worker (279953) exited with code 143 (Terminated)
Jan 20 10:00:41 np0005588920 neutron-haproxy-ovnmeta-db07aec0-e1f0-4aa7-8c5d-ff6cc5298a11[279947]: [WARNING]  (279951) : All workers exited. Exiting... (0)
Jan 20 10:00:41 np0005588920 systemd[1]: libpod-293cc00097871f80bfbaaf7631c648b3eee35d2d8e032c7eb3b74108cb676619.scope: Deactivated successfully.
Jan 20 10:00:41 np0005588920 podman[280007]: 2026-01-20 15:00:41.485667524 +0000 UTC m=+0.051260645 container died 293cc00097871f80bfbaaf7631c648b3eee35d2d8e032c7eb3b74108cb676619 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-db07aec0-e1f0-4aa7-8c5d-ff6cc5298a11, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 20 10:00:41 np0005588920 nova_compute[226886]: 2026-01-20 15:00:41.509 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:41 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-293cc00097871f80bfbaaf7631c648b3eee35d2d8e032c7eb3b74108cb676619-userdata-shm.mount: Deactivated successfully.
Jan 20 10:00:41 np0005588920 nova_compute[226886]: 2026-01-20 15:00:41.516 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:41 np0005588920 systemd[1]: var-lib-containers-storage-overlay-0e5d10a6f91e85047b86ba0322b5c9f75ba93da76e6b95dade4f6c193132435a-merged.mount: Deactivated successfully.
Jan 20 10:00:41 np0005588920 nova_compute[226886]: 2026-01-20 15:00:41.525 226890 INFO nova.virt.libvirt.driver [-] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Instance destroyed successfully.#033[00m
Jan 20 10:00:41 np0005588920 nova_compute[226886]: 2026-01-20 15:00:41.526 226890 DEBUG nova.objects.instance [None req-86b5da06-9ffb-449a-8fdb-3d8373a70044 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Lazy-loading 'resources' on Instance uuid 531b2c8b-4608-4bd8-b0d7-01164d2e0b35 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:00:41 np0005588920 podman[280007]: 2026-01-20 15:00:41.530820363 +0000 UTC m=+0.096413494 container cleanup 293cc00097871f80bfbaaf7631c648b3eee35d2d8e032c7eb3b74108cb676619 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-db07aec0-e1f0-4aa7-8c5d-ff6cc5298a11, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 20 10:00:41 np0005588920 systemd[1]: libpod-conmon-293cc00097871f80bfbaaf7631c648b3eee35d2d8e032c7eb3b74108cb676619.scope: Deactivated successfully.
Jan 20 10:00:41 np0005588920 nova_compute[226886]: 2026-01-20 15:00:41.557 226890 DEBUG nova.virt.libvirt.vif [None req-86b5da06-9ffb-449a-8fdb-3d8373a70044 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:00:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerPasswordTestJSON-server-1424320596',display_name='tempest-ServerPasswordTestJSON-server-1424320596',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverpasswordtestjson-server-1424320596',id=147,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:00:38Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0d9bc40379a44b2eaa53062a0c0385d5',ramdisk_id='',reservation_id='r-ln604x7d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',im
age_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerPasswordTestJSON-336768685',owner_user_name='tempest-ServerPasswordTestJSON-336768685-project-member',password_0='',password_1='',password_2='',password_3=''},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:00:40Z,user_data=None,user_id='48a567f3890d43dbbcd9ee3c302b1772',uuid=531b2c8b-4608-4bd8-b0d7-01164d2e0b35,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "33faa655-6c91-45e8-915c-003e0c036257", "address": "fa:16:3e:46:30:5c", "network": {"id": "db07aec0-e1f0-4aa7-8c5d-ff6cc5298a11", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1536424175-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0d9bc40379a44b2eaa53062a0c0385d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33faa655-6c", "ovs_interfaceid": "33faa655-6c91-45e8-915c-003e0c036257", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:00:41 np0005588920 nova_compute[226886]: 2026-01-20 15:00:41.557 226890 DEBUG nova.network.os_vif_util [None req-86b5da06-9ffb-449a-8fdb-3d8373a70044 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Converting VIF {"id": "33faa655-6c91-45e8-915c-003e0c036257", "address": "fa:16:3e:46:30:5c", "network": {"id": "db07aec0-e1f0-4aa7-8c5d-ff6cc5298a11", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1536424175-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0d9bc40379a44b2eaa53062a0c0385d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33faa655-6c", "ovs_interfaceid": "33faa655-6c91-45e8-915c-003e0c036257", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:00:41 np0005588920 nova_compute[226886]: 2026-01-20 15:00:41.558 226890 DEBUG nova.network.os_vif_util [None req-86b5da06-9ffb-449a-8fdb-3d8373a70044 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:46:30:5c,bridge_name='br-int',has_traffic_filtering=True,id=33faa655-6c91-45e8-915c-003e0c036257,network=Network(db07aec0-e1f0-4aa7-8c5d-ff6cc5298a11),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33faa655-6c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:00:41 np0005588920 nova_compute[226886]: 2026-01-20 15:00:41.558 226890 DEBUG os_vif [None req-86b5da06-9ffb-449a-8fdb-3d8373a70044 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:46:30:5c,bridge_name='br-int',has_traffic_filtering=True,id=33faa655-6c91-45e8-915c-003e0c036257,network=Network(db07aec0-e1f0-4aa7-8c5d-ff6cc5298a11),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33faa655-6c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:00:41 np0005588920 nova_compute[226886]: 2026-01-20 15:00:41.559 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:41 np0005588920 nova_compute[226886]: 2026-01-20 15:00:41.560 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap33faa655-6c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:00:41 np0005588920 nova_compute[226886]: 2026-01-20 15:00:41.561 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:41 np0005588920 nova_compute[226886]: 2026-01-20 15:00:41.564 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:00:41 np0005588920 nova_compute[226886]: 2026-01-20 15:00:41.566 226890 INFO os_vif [None req-86b5da06-9ffb-449a-8fdb-3d8373a70044 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:46:30:5c,bridge_name='br-int',has_traffic_filtering=True,id=33faa655-6c91-45e8-915c-003e0c036257,network=Network(db07aec0-e1f0-4aa7-8c5d-ff6cc5298a11),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33faa655-6c')#033[00m
Jan 20 10:00:41 np0005588920 podman[280045]: 2026-01-20 15:00:41.596860478 +0000 UTC m=+0.044247784 container remove 293cc00097871f80bfbaaf7631c648b3eee35d2d8e032c7eb3b74108cb676619 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-db07aec0-e1f0-4aa7-8c5d-ff6cc5298a11, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 20 10:00:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:41.602 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[4fac1070-6c41-4e4d-a6e5-313118386597]: (4, ('Tue Jan 20 03:00:41 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-db07aec0-e1f0-4aa7-8c5d-ff6cc5298a11 (293cc00097871f80bfbaaf7631c648b3eee35d2d8e032c7eb3b74108cb676619)\n293cc00097871f80bfbaaf7631c648b3eee35d2d8e032c7eb3b74108cb676619\nTue Jan 20 03:00:41 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-db07aec0-e1f0-4aa7-8c5d-ff6cc5298a11 (293cc00097871f80bfbaaf7631c648b3eee35d2d8e032c7eb3b74108cb676619)\n293cc00097871f80bfbaaf7631c648b3eee35d2d8e032c7eb3b74108cb676619\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:41.603 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d5c60db0-1d87-4fbf-9ca4-729255cbe4e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:41.604 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdb07aec0-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:00:41 np0005588920 nova_compute[226886]: 2026-01-20 15:00:41.606 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:41 np0005588920 kernel: tapdb07aec0-e0: left promiscuous mode
Jan 20 10:00:41 np0005588920 nova_compute[226886]: 2026-01-20 15:00:41.621 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:41.624 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[260cae7e-eff7-4d1d-a2ba-2e0b8e11bcf9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:41.643 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e2304c85-1e4c-4a27-9c6c-e6902b2cae74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:41.645 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[323a38dc-962c-4601-8e9a-096cb48cee29]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:41.659 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[29e9f2cb-1bce-4dd3-b04d-52171aabb56e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 627880, 'reachable_time': 23496, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 280076, 'error': None, 'target': 'ovnmeta-db07aec0-e1f0-4aa7-8c5d-ff6cc5298a11', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:41.661 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-db07aec0-e1f0-4aa7-8c5d-ff6cc5298a11 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:00:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:00:41.661 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[9205de56-51f7-4e98-8c98-5b6c2859764f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:41 np0005588920 systemd[1]: run-netns-ovnmeta\x2ddb07aec0\x2de1f0\x2d4aa7\x2d8c5d\x2dff6cc5298a11.mount: Deactivated successfully.
Jan 20 10:00:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:00:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:41.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:00:42 np0005588920 nova_compute[226886]: 2026-01-20 15:00:42.269 226890 INFO nova.virt.libvirt.driver [None req-86b5da06-9ffb-449a-8fdb-3d8373a70044 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Deleting instance files /var/lib/nova/instances/531b2c8b-4608-4bd8-b0d7-01164d2e0b35_del#033[00m
Jan 20 10:00:42 np0005588920 nova_compute[226886]: 2026-01-20 15:00:42.269 226890 INFO nova.virt.libvirt.driver [None req-86b5da06-9ffb-449a-8fdb-3d8373a70044 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Deletion of /var/lib/nova/instances/531b2c8b-4608-4bd8-b0d7-01164d2e0b35_del complete#033[00m
Jan 20 10:00:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e323 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:00:42 np0005588920 nova_compute[226886]: 2026-01-20 15:00:42.678 226890 INFO nova.compute.manager [None req-86b5da06-9ffb-449a-8fdb-3d8373a70044 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Took 1.39 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:00:42 np0005588920 nova_compute[226886]: 2026-01-20 15:00:42.679 226890 DEBUG oslo.service.loopingcall [None req-86b5da06-9ffb-449a-8fdb-3d8373a70044 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:00:42 np0005588920 nova_compute[226886]: 2026-01-20 15:00:42.679 226890 DEBUG nova.compute.manager [-] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:00:42 np0005588920 nova_compute[226886]: 2026-01-20 15:00:42.680 226890 DEBUG nova.network.neutron [-] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:00:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:00:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:43.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:00:43 np0005588920 nova_compute[226886]: 2026-01-20 15:00:43.512 226890 DEBUG nova.network.neutron [-] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:00:43 np0005588920 nova_compute[226886]: 2026-01-20 15:00:43.649 226890 DEBUG nova.compute.manager [req-50e4aa2c-5253-412d-91df-441718d33fe2 req-a1393166-c85f-4df3-ae41-4a319bacfa08 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Received event network-vif-unplugged-33faa655-6c91-45e8-915c-003e0c036257 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:00:43 np0005588920 nova_compute[226886]: 2026-01-20 15:00:43.649 226890 DEBUG oslo_concurrency.lockutils [req-50e4aa2c-5253-412d-91df-441718d33fe2 req-a1393166-c85f-4df3-ae41-4a319bacfa08 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "531b2c8b-4608-4bd8-b0d7-01164d2e0b35-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:00:43 np0005588920 nova_compute[226886]: 2026-01-20 15:00:43.649 226890 DEBUG oslo_concurrency.lockutils [req-50e4aa2c-5253-412d-91df-441718d33fe2 req-a1393166-c85f-4df3-ae41-4a319bacfa08 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "531b2c8b-4608-4bd8-b0d7-01164d2e0b35-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:00:43 np0005588920 nova_compute[226886]: 2026-01-20 15:00:43.650 226890 DEBUG oslo_concurrency.lockutils [req-50e4aa2c-5253-412d-91df-441718d33fe2 req-a1393166-c85f-4df3-ae41-4a319bacfa08 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "531b2c8b-4608-4bd8-b0d7-01164d2e0b35-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:00:43 np0005588920 nova_compute[226886]: 2026-01-20 15:00:43.650 226890 DEBUG nova.compute.manager [req-50e4aa2c-5253-412d-91df-441718d33fe2 req-a1393166-c85f-4df3-ae41-4a319bacfa08 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] No waiting events found dispatching network-vif-unplugged-33faa655-6c91-45e8-915c-003e0c036257 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:00:43 np0005588920 nova_compute[226886]: 2026-01-20 15:00:43.650 226890 DEBUG nova.compute.manager [req-50e4aa2c-5253-412d-91df-441718d33fe2 req-a1393166-c85f-4df3-ae41-4a319bacfa08 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Received event network-vif-unplugged-33faa655-6c91-45e8-915c-003e0c036257 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:00:43 np0005588920 nova_compute[226886]: 2026-01-20 15:00:43.651 226890 DEBUG nova.compute.manager [req-50e4aa2c-5253-412d-91df-441718d33fe2 req-a1393166-c85f-4df3-ae41-4a319bacfa08 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Received event network-vif-plugged-33faa655-6c91-45e8-915c-003e0c036257 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:00:43 np0005588920 nova_compute[226886]: 2026-01-20 15:00:43.651 226890 DEBUG oslo_concurrency.lockutils [req-50e4aa2c-5253-412d-91df-441718d33fe2 req-a1393166-c85f-4df3-ae41-4a319bacfa08 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "531b2c8b-4608-4bd8-b0d7-01164d2e0b35-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:00:43 np0005588920 nova_compute[226886]: 2026-01-20 15:00:43.651 226890 DEBUG oslo_concurrency.lockutils [req-50e4aa2c-5253-412d-91df-441718d33fe2 req-a1393166-c85f-4df3-ae41-4a319bacfa08 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "531b2c8b-4608-4bd8-b0d7-01164d2e0b35-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:00:43 np0005588920 nova_compute[226886]: 2026-01-20 15:00:43.651 226890 DEBUG oslo_concurrency.lockutils [req-50e4aa2c-5253-412d-91df-441718d33fe2 req-a1393166-c85f-4df3-ae41-4a319bacfa08 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "531b2c8b-4608-4bd8-b0d7-01164d2e0b35-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:00:43 np0005588920 nova_compute[226886]: 2026-01-20 15:00:43.652 226890 DEBUG nova.compute.manager [req-50e4aa2c-5253-412d-91df-441718d33fe2 req-a1393166-c85f-4df3-ae41-4a319bacfa08 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] No waiting events found dispatching network-vif-plugged-33faa655-6c91-45e8-915c-003e0c036257 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:00:43 np0005588920 nova_compute[226886]: 2026-01-20 15:00:43.652 226890 WARNING nova.compute.manager [req-50e4aa2c-5253-412d-91df-441718d33fe2 req-a1393166-c85f-4df3-ae41-4a319bacfa08 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Received unexpected event network-vif-plugged-33faa655-6c91-45e8-915c-003e0c036257 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 10:00:43 np0005588920 nova_compute[226886]: 2026-01-20 15:00:43.653 226890 INFO nova.compute.manager [-] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Took 0.97 seconds to deallocate network for instance.#033[00m
Jan 20 10:00:43 np0005588920 nova_compute[226886]: 2026-01-20 15:00:43.694 226890 DEBUG nova.compute.manager [req-209b53f6-8b54-417c-9bc2-723a231b5b0e req-57f43e16-4d88-4231-8cd4-3066692e5ce1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Received event network-vif-deleted-33faa655-6c91-45e8-915c-003e0c036257 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:00:43 np0005588920 nova_compute[226886]: 2026-01-20 15:00:43.703 226890 DEBUG oslo_concurrency.lockutils [None req-86b5da06-9ffb-449a-8fdb-3d8373a70044 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:00:43 np0005588920 nova_compute[226886]: 2026-01-20 15:00:43.703 226890 DEBUG oslo_concurrency.lockutils [None req-86b5da06-9ffb-449a-8fdb-3d8373a70044 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:00:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:00:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:43.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:00:43 np0005588920 nova_compute[226886]: 2026-01-20 15:00:43.745 226890 DEBUG oslo_concurrency.processutils [None req-86b5da06-9ffb-449a-8fdb-3d8373a70044 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:00:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:00:44 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2942301129' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:00:44 np0005588920 nova_compute[226886]: 2026-01-20 15:00:44.191 226890 DEBUG oslo_concurrency.processutils [None req-86b5da06-9ffb-449a-8fdb-3d8373a70044 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:00:44 np0005588920 nova_compute[226886]: 2026-01-20 15:00:44.196 226890 DEBUG nova.compute.provider_tree [None req-86b5da06-9ffb-449a-8fdb-3d8373a70044 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:00:44 np0005588920 nova_compute[226886]: 2026-01-20 15:00:44.266 226890 DEBUG nova.scheduler.client.report [None req-86b5da06-9ffb-449a-8fdb-3d8373a70044 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:00:44 np0005588920 nova_compute[226886]: 2026-01-20 15:00:44.310 226890 DEBUG oslo_concurrency.lockutils [None req-86b5da06-9ffb-449a-8fdb-3d8373a70044 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.607s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:00:44 np0005588920 nova_compute[226886]: 2026-01-20 15:00:44.347 226890 INFO nova.scheduler.client.report [None req-86b5da06-9ffb-449a-8fdb-3d8373a70044 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Deleted allocations for instance 531b2c8b-4608-4bd8-b0d7-01164d2e0b35#033[00m
Jan 20 10:00:44 np0005588920 nova_compute[226886]: 2026-01-20 15:00:44.440 226890 DEBUG oslo_concurrency.lockutils [None req-86b5da06-9ffb-449a-8fdb-3d8373a70044 48a567f3890d43dbbcd9ee3c302b1772 0d9bc40379a44b2eaa53062a0c0385d5 - - default default] Lock "531b2c8b-4608-4bd8-b0d7-01164d2e0b35" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:00:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:45.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:45 np0005588920 nova_compute[226886]: 2026-01-20 15:00:45.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:00:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:45.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:46 np0005588920 nova_compute[226886]: 2026-01-20 15:00:46.370 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:46 np0005588920 nova_compute[226886]: 2026-01-20 15:00:46.562 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:46 np0005588920 nova_compute[226886]: 2026-01-20 15:00:46.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:00:46 np0005588920 nova_compute[226886]: 2026-01-20 15:00:46.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:00:46 np0005588920 nova_compute[226886]: 2026-01-20 15:00:46.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:00:46 np0005588920 nova_compute[226886]: 2026-01-20 15:00:46.814 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:00:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:47.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e324 e324: 3 total, 3 up, 3 in
Jan 20 10:00:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e324 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:00:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:00:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:47.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:00:48 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e325 e325: 3 total, 3 up, 3 in
Jan 20 10:00:48 np0005588920 nova_compute[226886]: 2026-01-20 15:00:48.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:00:48 np0005588920 nova_compute[226886]: 2026-01-20 15:00:48.820 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:00:48 np0005588920 nova_compute[226886]: 2026-01-20 15:00:48.820 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:00:48 np0005588920 nova_compute[226886]: 2026-01-20 15:00:48.821 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:00:48 np0005588920 nova_compute[226886]: 2026-01-20 15:00:48.821 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:00:48 np0005588920 nova_compute[226886]: 2026-01-20 15:00:48.821 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:00:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:49.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:49 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:00:49 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2122915934' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:00:49 np0005588920 nova_compute[226886]: 2026-01-20 15:00:49.225 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.403s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:00:49 np0005588920 nova_compute[226886]: 2026-01-20 15:00:49.369 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:00:49 np0005588920 nova_compute[226886]: 2026-01-20 15:00:49.370 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4304MB free_disk=20.921974182128906GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:00:49 np0005588920 nova_compute[226886]: 2026-01-20 15:00:49.371 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:00:49 np0005588920 nova_compute[226886]: 2026-01-20 15:00:49.371 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:00:49 np0005588920 nova_compute[226886]: 2026-01-20 15:00:49.434 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:00:49 np0005588920 nova_compute[226886]: 2026-01-20 15:00:49.434 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:00:49 np0005588920 nova_compute[226886]: 2026-01-20 15:00:49.450 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:00:49 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e326 e326: 3 total, 3 up, 3 in
Jan 20 10:00:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:49.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:49 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:00:49 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4026246915' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:00:49 np0005588920 nova_compute[226886]: 2026-01-20 15:00:49.892 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:00:49 np0005588920 nova_compute[226886]: 2026-01-20 15:00:49.898 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:00:49 np0005588920 nova_compute[226886]: 2026-01-20 15:00:49.939 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:00:49 np0005588920 nova_compute[226886]: 2026-01-20 15:00:49.968 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:00:49 np0005588920 nova_compute[226886]: 2026-01-20 15:00:49.968 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.597s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:00:51 np0005588920 nova_compute[226886]: 2026-01-20 15:00:51.095 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:51.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:51 np0005588920 nova_compute[226886]: 2026-01-20 15:00:51.371 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:51 np0005588920 nova_compute[226886]: 2026-01-20 15:00:51.564 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:51 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #106. Immutable memtables: 0.
Jan 20 10:00:51 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:00:51.640089) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:00:51 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 65] Flushing memtable with next log file: 106
Jan 20 10:00:51 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921251640157, "job": 65, "event": "flush_started", "num_memtables": 1, "num_entries": 773, "num_deletes": 253, "total_data_size": 1297822, "memory_usage": 1312736, "flush_reason": "Manual Compaction"}
Jan 20 10:00:51 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 65] Level-0 flush table #107: started
Jan 20 10:00:51 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921251647945, "cf_name": "default", "job": 65, "event": "table_file_creation", "file_number": 107, "file_size": 855033, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 54524, "largest_seqno": 55292, "table_properties": {"data_size": 851339, "index_size": 1474, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1157, "raw_key_size": 8784, "raw_average_key_size": 19, "raw_value_size": 843815, "raw_average_value_size": 1904, "num_data_blocks": 66, "num_entries": 443, "num_filter_entries": 443, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768921205, "oldest_key_time": 1768921205, "file_creation_time": 1768921251, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 107, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:00:51 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 65] Flush lasted 7885 microseconds, and 3224 cpu microseconds.
Jan 20 10:00:51 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:00:51 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:00:51.647987) [db/flush_job.cc:967] [default] [JOB 65] Level-0 flush table #107: 855033 bytes OK
Jan 20 10:00:51 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:00:51.648004) [db/memtable_list.cc:519] [default] Level-0 commit table #107 started
Jan 20 10:00:51 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:00:51.649553) [db/memtable_list.cc:722] [default] Level-0 commit table #107: memtable #1 done
Jan 20 10:00:51 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:00:51.649567) EVENT_LOG_v1 {"time_micros": 1768921251649562, "job": 65, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:00:51 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:00:51.649584) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:00:51 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 65] Try to delete WAL files size 1293718, prev total WAL file size 1293718, number of live WAL files 2.
Jan 20 10:00:51 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000103.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:00:51 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:00:51.650258) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730034353138' seq:72057594037927935, type:22 .. '7061786F730034373730' seq:0, type:0; will stop at (end)
Jan 20 10:00:51 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 66] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:00:51 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 65 Base level 0, inputs: [107(834KB)], [105(12MB)]
Jan 20 10:00:51 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921251650319, "job": 66, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [107], "files_L6": [105], "score": -1, "input_data_size": 13877790, "oldest_snapshot_seqno": -1}
Jan 20 10:00:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:00:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:51.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:00:51 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 66] Generated table #108: 8019 keys, 11993191 bytes, temperature: kUnknown
Jan 20 10:00:51 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921251781799, "cf_name": "default", "job": 66, "event": "table_file_creation", "file_number": 108, "file_size": 11993191, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11939021, "index_size": 33002, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 20101, "raw_key_size": 208225, "raw_average_key_size": 25, "raw_value_size": 11795460, "raw_average_value_size": 1470, "num_data_blocks": 1295, "num_entries": 8019, "num_filter_entries": 8019, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768921251, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 108, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:00:51 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:00:51 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:00:51.782071) [db/compaction/compaction_job.cc:1663] [default] [JOB 66] Compacted 1@0 + 1@6 files to L6 => 11993191 bytes
Jan 20 10:00:51 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:00:51.785127) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 105.5 rd, 91.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.8, 12.4 +0.0 blob) out(11.4 +0.0 blob), read-write-amplify(30.3) write-amplify(14.0) OK, records in: 8537, records dropped: 518 output_compression: NoCompression
Jan 20 10:00:51 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:00:51.785143) EVENT_LOG_v1 {"time_micros": 1768921251785136, "job": 66, "event": "compaction_finished", "compaction_time_micros": 131578, "compaction_time_cpu_micros": 26937, "output_level": 6, "num_output_files": 1, "total_output_size": 11993191, "num_input_records": 8537, "num_output_records": 8019, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:00:51 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000107.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:00:51 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921251785561, "job": 66, "event": "table_file_deletion", "file_number": 107}
Jan 20 10:00:51 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000105.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:00:51 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921251787388, "job": 66, "event": "table_file_deletion", "file_number": 105}
Jan 20 10:00:51 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:00:51.650135) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:00:51 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:00:51.787526) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:00:51 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:00:51.787533) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:00:51 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:00:51.787535) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:00:51 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:00:51.787537) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:00:51 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:00:51.787538) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:00:51 np0005588920 nova_compute[226886]: 2026-01-20 15:00:51.969 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:00:51 np0005588920 nova_compute[226886]: 2026-01-20 15:00:51.970 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:00:51 np0005588920 nova_compute[226886]: 2026-01-20 15:00:51.970 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:00:51 np0005588920 nova_compute[226886]: 2026-01-20 15:00:51.970 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:00:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e326 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:00:52 np0005588920 nova_compute[226886]: 2026-01-20 15:00:52.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:00:52 np0005588920 nova_compute[226886]: 2026-01-20 15:00:52.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:00:52 np0005588920 nova_compute[226886]: 2026-01-20 15:00:52.724 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:00:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:53.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:53.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:55 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e327 e327: 3 total, 3 up, 3 in
Jan 20 10:00:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:00:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:55.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:00:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:55.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:56 np0005588920 nova_compute[226886]: 2026-01-20 15:00:56.373 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:56 np0005588920 nova_compute[226886]: 2026-01-20 15:00:56.525 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768921241.5241268, 531b2c8b-4608-4bd8-b0d7-01164d2e0b35 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:00:56 np0005588920 nova_compute[226886]: 2026-01-20 15:00:56.525 226890 INFO nova.compute.manager [-] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:00:56 np0005588920 nova_compute[226886]: 2026-01-20 15:00:56.566 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:56 np0005588920 nova_compute[226886]: 2026-01-20 15:00:56.571 226890 DEBUG nova.compute.manager [None req-fb38e89f-5da8-44ae-a518-1a45c87d9d18 - - - - - -] [instance: 531b2c8b-4608-4bd8-b0d7-01164d2e0b35] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:00:56 np0005588920 nova_compute[226886]: 2026-01-20 15:00:56.720 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:00:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:57.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e327 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:00:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:00:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:57.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:00:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:59.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:00:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:00:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:59.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:01:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:01.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:01 np0005588920 nova_compute[226886]: 2026-01-20 15:01:01.375 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:01 np0005588920 nova_compute[226886]: 2026-01-20 15:01:01.568 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:01.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:01 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:01.963 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=46, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=45) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:01:01 np0005588920 nova_compute[226886]: 2026-01-20 15:01:01.963 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:01 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:01.965 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:01:01 np0005588920 podman[280156]: 2026-01-20 15:01:01.992030308 +0000 UTC m=+0.082265499 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:01:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e327 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:01:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:03.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:03.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:03.967 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '46'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:01:04 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 20 10:01:04 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:01:04 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:01:04 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:01:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:05.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:05.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:06 np0005588920 nova_compute[226886]: 2026-01-20 15:01:06.376 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:06 np0005588920 nova_compute[226886]: 2026-01-20 15:01:06.570 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:07.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e327 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:01:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:01:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:07.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:01:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:09.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:09.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:10 np0005588920 nova_compute[226886]: 2026-01-20 15:01:10.020 226890 DEBUG oslo_concurrency.lockutils [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Acquiring lock "61ae2c61-01df-4ef1-8aa3-0527a43b1798" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:01:10 np0005588920 nova_compute[226886]: 2026-01-20 15:01:10.021 226890 DEBUG oslo_concurrency.lockutils [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Lock "61ae2c61-01df-4ef1-8aa3-0527a43b1798" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:01:10 np0005588920 nova_compute[226886]: 2026-01-20 15:01:10.021 226890 INFO nova.compute.manager [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Unshelving#033[00m
Jan 20 10:01:10 np0005588920 nova_compute[226886]: 2026-01-20 15:01:10.142 226890 DEBUG oslo_concurrency.lockutils [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:01:10 np0005588920 nova_compute[226886]: 2026-01-20 15:01:10.143 226890 DEBUG oslo_concurrency.lockutils [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:01:10 np0005588920 nova_compute[226886]: 2026-01-20 15:01:10.147 226890 DEBUG nova.objects.instance [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Lazy-loading 'pci_requests' on Instance uuid 61ae2c61-01df-4ef1-8aa3-0527a43b1798 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:01:10 np0005588920 nova_compute[226886]: 2026-01-20 15:01:10.176 226890 DEBUG nova.objects.instance [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Lazy-loading 'numa_topology' on Instance uuid 61ae2c61-01df-4ef1-8aa3-0527a43b1798 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:01:10 np0005588920 nova_compute[226886]: 2026-01-20 15:01:10.193 226890 DEBUG nova.virt.hardware [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 10:01:10 np0005588920 nova_compute[226886]: 2026-01-20 15:01:10.194 226890 INFO nova.compute.claims [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 10:01:10 np0005588920 nova_compute[226886]: 2026-01-20 15:01:10.325 226890 DEBUG oslo_concurrency.processutils [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:01:10 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:01:10 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1687987888' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:01:10 np0005588920 nova_compute[226886]: 2026-01-20 15:01:10.794 226890 DEBUG oslo_concurrency.processutils [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:01:10 np0005588920 nova_compute[226886]: 2026-01-20 15:01:10.803 226890 DEBUG nova.compute.provider_tree [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:01:10 np0005588920 nova_compute[226886]: 2026-01-20 15:01:10.833 226890 DEBUG nova.scheduler.client.report [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:01:10 np0005588920 nova_compute[226886]: 2026-01-20 15:01:10.866 226890 DEBUG oslo_concurrency.lockutils [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.724s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:01:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:11.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:11 np0005588920 nova_compute[226886]: 2026-01-20 15:01:11.218 226890 INFO nova.network.neutron [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Updating port ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7 with attributes {'binding:host_id': 'compute-2.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Jan 20 10:01:11 np0005588920 nova_compute[226886]: 2026-01-20 15:01:11.379 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:11 np0005588920 nova_compute[226886]: 2026-01-20 15:01:11.585 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:11.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:11 np0005588920 podman[280334]: 2026-01-20 15:01:11.963347375 +0000 UTC m=+0.050093061 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 20 10:01:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e327 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:01:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:13.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:01:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3160512075' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:01:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:01:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3160512075' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:01:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:13.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:14 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:01:14 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:01:14 np0005588920 nova_compute[226886]: 2026-01-20 15:01:14.501 226890 DEBUG oslo_concurrency.lockutils [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Acquiring lock "refresh_cache-61ae2c61-01df-4ef1-8aa3-0527a43b1798" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:01:14 np0005588920 nova_compute[226886]: 2026-01-20 15:01:14.502 226890 DEBUG oslo_concurrency.lockutils [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Acquired lock "refresh_cache-61ae2c61-01df-4ef1-8aa3-0527a43b1798" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:01:14 np0005588920 nova_compute[226886]: 2026-01-20 15:01:14.502 226890 DEBUG nova.network.neutron [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:01:14 np0005588920 nova_compute[226886]: 2026-01-20 15:01:14.705 226890 DEBUG nova.compute.manager [req-194aa528-3a07-4239-9fbc-6fdc033b7875 req-cd9b6e18-be0b-4ca7-b904-31f8122f87f9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Received event network-changed-ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:01:14 np0005588920 nova_compute[226886]: 2026-01-20 15:01:14.706 226890 DEBUG nova.compute.manager [req-194aa528-3a07-4239-9fbc-6fdc033b7875 req-cd9b6e18-be0b-4ca7-b904-31f8122f87f9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Refreshing instance network info cache due to event network-changed-ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:01:14 np0005588920 nova_compute[226886]: 2026-01-20 15:01:14.706 226890 DEBUG oslo_concurrency.lockutils [req-194aa528-3a07-4239-9fbc-6fdc033b7875 req-cd9b6e18-be0b-4ca7-b904-31f8122f87f9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-61ae2c61-01df-4ef1-8aa3-0527a43b1798" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:01:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:15.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:15.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:16 np0005588920 nova_compute[226886]: 2026-01-20 15:01:16.380 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:16.462 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:01:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:16.462 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:01:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:16.462 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:01:16 np0005588920 nova_compute[226886]: 2026-01-20 15:01:16.464 226890 DEBUG nova.network.neutron [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Updating instance_info_cache with network_info: [{"id": "ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7", "address": "fa:16:3e:8c:76:cf", "network": {"id": "3aad5d71-9bbf-496d-805e-819d17c4343e", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1714826441-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "105e56abe3804424885c7aa8d1216d12", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee1c78ce-d0", "ovs_interfaceid": "ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:01:16 np0005588920 nova_compute[226886]: 2026-01-20 15:01:16.490 226890 DEBUG oslo_concurrency.lockutils [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Releasing lock "refresh_cache-61ae2c61-01df-4ef1-8aa3-0527a43b1798" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:01:16 np0005588920 nova_compute[226886]: 2026-01-20 15:01:16.492 226890 DEBUG nova.virt.libvirt.driver [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 10:01:16 np0005588920 nova_compute[226886]: 2026-01-20 15:01:16.492 226890 INFO nova.virt.libvirt.driver [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Creating image(s)#033[00m
Jan 20 10:01:16 np0005588920 nova_compute[226886]: 2026-01-20 15:01:16.520 226890 DEBUG nova.storage.rbd_utils [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] rbd image 61ae2c61-01df-4ef1-8aa3-0527a43b1798_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:01:16 np0005588920 nova_compute[226886]: 2026-01-20 15:01:16.525 226890 DEBUG nova.objects.instance [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 61ae2c61-01df-4ef1-8aa3-0527a43b1798 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:01:16 np0005588920 nova_compute[226886]: 2026-01-20 15:01:16.527 226890 DEBUG oslo_concurrency.lockutils [req-194aa528-3a07-4239-9fbc-6fdc033b7875 req-cd9b6e18-be0b-4ca7-b904-31f8122f87f9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-61ae2c61-01df-4ef1-8aa3-0527a43b1798" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:01:16 np0005588920 nova_compute[226886]: 2026-01-20 15:01:16.527 226890 DEBUG nova.network.neutron [req-194aa528-3a07-4239-9fbc-6fdc033b7875 req-cd9b6e18-be0b-4ca7-b904-31f8122f87f9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Refreshing network info cache for port ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:01:16 np0005588920 nova_compute[226886]: 2026-01-20 15:01:16.586 226890 DEBUG nova.storage.rbd_utils [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] rbd image 61ae2c61-01df-4ef1-8aa3-0527a43b1798_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:01:16 np0005588920 nova_compute[226886]: 2026-01-20 15:01:16.618 226890 DEBUG nova.storage.rbd_utils [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] rbd image 61ae2c61-01df-4ef1-8aa3-0527a43b1798_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:01:16 np0005588920 nova_compute[226886]: 2026-01-20 15:01:16.621 226890 DEBUG oslo_concurrency.lockutils [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Acquiring lock "109be4915bfee240c27dea531cf46b9d730699d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:01:16 np0005588920 nova_compute[226886]: 2026-01-20 15:01:16.622 226890 DEBUG oslo_concurrency.lockutils [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Lock "109be4915bfee240c27dea531cf46b9d730699d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:01:16 np0005588920 nova_compute[226886]: 2026-01-20 15:01:16.624 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:17 np0005588920 nova_compute[226886]: 2026-01-20 15:01:17.003 226890 DEBUG nova.virt.libvirt.imagebackend [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Image locations are: [{'url': 'rbd://e399cf45-e6b6-5393-99f1-75c601d3f188/images/8976571c-92ae-42ce-94dd-a05ec6e308b3/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://e399cf45-e6b6-5393-99f1-75c601d3f188/images/8976571c-92ae-42ce-94dd-a05ec6e308b3/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Jan 20 10:01:17 np0005588920 nova_compute[226886]: 2026-01-20 15:01:17.063 226890 DEBUG nova.virt.libvirt.imagebackend [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Selected location: {'url': 'rbd://e399cf45-e6b6-5393-99f1-75c601d3f188/images/8976571c-92ae-42ce-94dd-a05ec6e308b3/snap', 'metadata': {'store': 'default_backend'}} clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1094#033[00m
Jan 20 10:01:17 np0005588920 nova_compute[226886]: 2026-01-20 15:01:17.064 226890 DEBUG nova.storage.rbd_utils [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] cloning images/8976571c-92ae-42ce-94dd-a05ec6e308b3@snap to None/61ae2c61-01df-4ef1-8aa3-0527a43b1798_disk clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 20 10:01:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:01:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:17.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:01:17 np0005588920 nova_compute[226886]: 2026-01-20 15:01:17.277 226890 DEBUG oslo_concurrency.lockutils [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Lock "109be4915bfee240c27dea531cf46b9d730699d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.655s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:01:17 np0005588920 nova_compute[226886]: 2026-01-20 15:01:17.456 226890 DEBUG nova.objects.instance [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Lazy-loading 'migration_context' on Instance uuid 61ae2c61-01df-4ef1-8aa3-0527a43b1798 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:01:17 np0005588920 nova_compute[226886]: 2026-01-20 15:01:17.526 226890 DEBUG nova.storage.rbd_utils [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] flattening vms/61ae2c61-01df-4ef1-8aa3-0527a43b1798_disk flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 20 10:01:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e327 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:01:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:01:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:17.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:01:18 np0005588920 nova_compute[226886]: 2026-01-20 15:01:18.344 226890 DEBUG nova.virt.libvirt.driver [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Image rbd:vms/61ae2c61-01df-4ef1-8aa3-0527a43b1798_disk:id=openstack:conf=/etc/ceph/ceph.conf flattened successfully while unshelving instance. _try_fetch_image_cache /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11007#033[00m
Jan 20 10:01:18 np0005588920 nova_compute[226886]: 2026-01-20 15:01:18.345 226890 DEBUG nova.virt.libvirt.driver [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 10:01:18 np0005588920 nova_compute[226886]: 2026-01-20 15:01:18.345 226890 DEBUG nova.virt.libvirt.driver [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Ensure instance console log exists: /var/lib/nova/instances/61ae2c61-01df-4ef1-8aa3-0527a43b1798/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:01:18 np0005588920 nova_compute[226886]: 2026-01-20 15:01:18.345 226890 DEBUG oslo_concurrency.lockutils [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:01:18 np0005588920 nova_compute[226886]: 2026-01-20 15:01:18.345 226890 DEBUG oslo_concurrency.lockutils [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:01:18 np0005588920 nova_compute[226886]: 2026-01-20 15:01:18.346 226890 DEBUG oslo_concurrency.lockutils [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:01:18 np0005588920 nova_compute[226886]: 2026-01-20 15:01:18.347 226890 DEBUG nova.virt.libvirt.driver [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Start _get_guest_xml network_info=[{"id": "ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7", "address": "fa:16:3e:8c:76:cf", "network": {"id": "3aad5d71-9bbf-496d-805e-819d17c4343e", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1714826441-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "105e56abe3804424885c7aa8d1216d12", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee1c78ce-d0", "ovs_interfaceid": "ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='',container_format='bare',created_at=2026-01-20T15:00:42Z,direct_url=<?>,disk_format='raw',id=8976571c-92ae-42ce-94dd-a05ec6e308b3,min_disk=1,min_ram=0,name='tempest-ServersNegativeTestJSON-server-1098898119-shelved',owner='105e56abe3804424885c7aa8d1216d12',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2026-01-20T15:00:51Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:01:18 np0005588920 nova_compute[226886]: 2026-01-20 15:01:18.352 226890 WARNING nova.virt.libvirt.driver [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:01:18 np0005588920 nova_compute[226886]: 2026-01-20 15:01:18.358 226890 DEBUG nova.virt.libvirt.host [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:01:18 np0005588920 nova_compute[226886]: 2026-01-20 15:01:18.358 226890 DEBUG nova.virt.libvirt.host [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:01:18 np0005588920 nova_compute[226886]: 2026-01-20 15:01:18.361 226890 DEBUG nova.virt.libvirt.host [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:01:18 np0005588920 nova_compute[226886]: 2026-01-20 15:01:18.361 226890 DEBUG nova.virt.libvirt.host [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:01:18 np0005588920 nova_compute[226886]: 2026-01-20 15:01:18.362 226890 DEBUG nova.virt.libvirt.driver [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:01:18 np0005588920 nova_compute[226886]: 2026-01-20 15:01:18.363 226890 DEBUG nova.virt.hardware [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='',container_format='bare',created_at=2026-01-20T15:00:42Z,direct_url=<?>,disk_format='raw',id=8976571c-92ae-42ce-94dd-a05ec6e308b3,min_disk=1,min_ram=0,name='tempest-ServersNegativeTestJSON-server-1098898119-shelved',owner='105e56abe3804424885c7aa8d1216d12',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2026-01-20T15:00:51Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:01:18 np0005588920 nova_compute[226886]: 2026-01-20 15:01:18.363 226890 DEBUG nova.virt.hardware [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:01:18 np0005588920 nova_compute[226886]: 2026-01-20 15:01:18.363 226890 DEBUG nova.virt.hardware [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:01:18 np0005588920 nova_compute[226886]: 2026-01-20 15:01:18.364 226890 DEBUG nova.virt.hardware [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:01:18 np0005588920 nova_compute[226886]: 2026-01-20 15:01:18.364 226890 DEBUG nova.virt.hardware [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:01:18 np0005588920 nova_compute[226886]: 2026-01-20 15:01:18.364 226890 DEBUG nova.virt.hardware [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:01:18 np0005588920 nova_compute[226886]: 2026-01-20 15:01:18.364 226890 DEBUG nova.virt.hardware [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:01:18 np0005588920 nova_compute[226886]: 2026-01-20 15:01:18.365 226890 DEBUG nova.virt.hardware [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:01:18 np0005588920 nova_compute[226886]: 2026-01-20 15:01:18.365 226890 DEBUG nova.virt.hardware [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:01:18 np0005588920 nova_compute[226886]: 2026-01-20 15:01:18.365 226890 DEBUG nova.virt.hardware [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:01:18 np0005588920 nova_compute[226886]: 2026-01-20 15:01:18.365 226890 DEBUG nova.virt.hardware [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:01:18 np0005588920 nova_compute[226886]: 2026-01-20 15:01:18.366 226890 DEBUG nova.objects.instance [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 61ae2c61-01df-4ef1-8aa3-0527a43b1798 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:01:18 np0005588920 nova_compute[226886]: 2026-01-20 15:01:18.381 226890 DEBUG oslo_concurrency.processutils [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:01:18 np0005588920 nova_compute[226886]: 2026-01-20 15:01:18.573 226890 DEBUG nova.network.neutron [req-194aa528-3a07-4239-9fbc-6fdc033b7875 req-cd9b6e18-be0b-4ca7-b904-31f8122f87f9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Updated VIF entry in instance network info cache for port ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:01:18 np0005588920 nova_compute[226886]: 2026-01-20 15:01:18.574 226890 DEBUG nova.network.neutron [req-194aa528-3a07-4239-9fbc-6fdc033b7875 req-cd9b6e18-be0b-4ca7-b904-31f8122f87f9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Updating instance_info_cache with network_info: [{"id": "ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7", "address": "fa:16:3e:8c:76:cf", "network": {"id": "3aad5d71-9bbf-496d-805e-819d17c4343e", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1714826441-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "105e56abe3804424885c7aa8d1216d12", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee1c78ce-d0", "ovs_interfaceid": "ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:01:18 np0005588920 nova_compute[226886]: 2026-01-20 15:01:18.595 226890 DEBUG oslo_concurrency.lockutils [req-194aa528-3a07-4239-9fbc-6fdc033b7875 req-cd9b6e18-be0b-4ca7-b904-31f8122f87f9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-61ae2c61-01df-4ef1-8aa3-0527a43b1798" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:01:18 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:01:18 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1583981785' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:01:18 np0005588920 nova_compute[226886]: 2026-01-20 15:01:18.927 226890 DEBUG oslo_concurrency.processutils [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.547s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:01:18 np0005588920 nova_compute[226886]: 2026-01-20 15:01:18.954 226890 DEBUG nova.storage.rbd_utils [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] rbd image 61ae2c61-01df-4ef1-8aa3-0527a43b1798_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:01:18 np0005588920 nova_compute[226886]: 2026-01-20 15:01:18.958 226890 DEBUG oslo_concurrency.processutils [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:01:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:19.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:19 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:01:19 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2262638477' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:01:19 np0005588920 nova_compute[226886]: 2026-01-20 15:01:19.406 226890 DEBUG oslo_concurrency.processutils [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:01:19 np0005588920 nova_compute[226886]: 2026-01-20 15:01:19.408 226890 DEBUG nova.virt.libvirt.vif [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-20T14:58:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1098898119',display_name='tempest-ServersNegativeTestJSON-server-1098898119',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-1098898119',id=143,image_ref='8976571c-92ae-42ce-94dd-a05ec6e308b3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:59:08Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='105e56abe3804424885c7aa8d1216d12',ramdisk_id='',reservation_id='r-yf5960ix',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestJSON-1233513591',owner_user_name='tempest-ServersNegativeTestJSON-1233513591-project-member',shelved_at='2026-01-20T15:00:51.478483',shelved_host='compute-0.ctlplane.example.com',shelved_image_id='8976571c-92ae-42ce-94dd-a05ec6e308b3'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:01:10Z,user_data=None,user_id='d77d3db3cf924683a608d10efefcd156',uuid=61ae2c61-01df-4ef1-8aa3-0527a43b1798,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7", "address": "fa:16:3e:8c:76:cf", "network": {"id": "3aad5d71-9bbf-496d-805e-819d17c4343e", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1714826441-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "105e56abe3804424885c7aa8d1216d12", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee1c78ce-d0", "ovs_interfaceid": "ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:01:19 np0005588920 nova_compute[226886]: 2026-01-20 15:01:19.408 226890 DEBUG nova.network.os_vif_util [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Converting VIF {"id": "ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7", "address": "fa:16:3e:8c:76:cf", "network": {"id": "3aad5d71-9bbf-496d-805e-819d17c4343e", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1714826441-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "105e56abe3804424885c7aa8d1216d12", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee1c78ce-d0", "ovs_interfaceid": "ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:01:19 np0005588920 nova_compute[226886]: 2026-01-20 15:01:19.409 226890 DEBUG nova.network.os_vif_util [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8c:76:cf,bridge_name='br-int',has_traffic_filtering=True,id=ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7,network=Network(3aad5d71-9bbf-496d-805e-819d17c4343e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapee1c78ce-d0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:01:19 np0005588920 nova_compute[226886]: 2026-01-20 15:01:19.411 226890 DEBUG nova.objects.instance [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Lazy-loading 'pci_devices' on Instance uuid 61ae2c61-01df-4ef1-8aa3-0527a43b1798 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:01:19 np0005588920 nova_compute[226886]: 2026-01-20 15:01:19.428 226890 DEBUG nova.virt.libvirt.driver [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:01:19 np0005588920 nova_compute[226886]:  <uuid>61ae2c61-01df-4ef1-8aa3-0527a43b1798</uuid>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:  <name>instance-0000008f</name>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:01:19 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:      <nova:name>tempest-ServersNegativeTestJSON-server-1098898119</nova:name>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 15:01:18</nova:creationTime>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 10:01:19 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:        <nova:user uuid="d77d3db3cf924683a608d10efefcd156">tempest-ServersNegativeTestJSON-1233513591-project-member</nova:user>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:        <nova:project uuid="105e56abe3804424885c7aa8d1216d12">tempest-ServersNegativeTestJSON-1233513591</nova:project>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="8976571c-92ae-42ce-94dd-a05ec6e308b3"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:        <nova:port uuid="ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7">
Jan 20 10:01:19 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    <system>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:      <entry name="serial">61ae2c61-01df-4ef1-8aa3-0527a43b1798</entry>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:      <entry name="uuid">61ae2c61-01df-4ef1-8aa3-0527a43b1798</entry>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    </system>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:  <os>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:  </os>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:  <features>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:  </features>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:  </clock>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:  <devices>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 10:01:19 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/61ae2c61-01df-4ef1-8aa3-0527a43b1798_disk">
Jan 20 10:01:19 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:01:19 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 10:01:19 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/61ae2c61-01df-4ef1-8aa3-0527a43b1798_disk.config">
Jan 20 10:01:19 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:01:19 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 10:01:19 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:8c:76:cf"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:      <target dev="tapee1c78ce-d0"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    </interface>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 10:01:19 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/61ae2c61-01df-4ef1-8aa3-0527a43b1798/console.log" append="off"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    </serial>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    <video>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    </video>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    <input type="keyboard" bus="usb"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 10:01:19 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    </rng>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 10:01:19 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 10:01:19 np0005588920 nova_compute[226886]:  </devices>
Jan 20 10:01:19 np0005588920 nova_compute[226886]: </domain>
Jan 20 10:01:19 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:01:19 np0005588920 nova_compute[226886]: 2026-01-20 15:01:19.429 226890 DEBUG nova.compute.manager [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Preparing to wait for external event network-vif-plugged-ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:01:19 np0005588920 nova_compute[226886]: 2026-01-20 15:01:19.430 226890 DEBUG oslo_concurrency.lockutils [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Acquiring lock "61ae2c61-01df-4ef1-8aa3-0527a43b1798-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:01:19 np0005588920 nova_compute[226886]: 2026-01-20 15:01:19.430 226890 DEBUG oslo_concurrency.lockutils [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Lock "61ae2c61-01df-4ef1-8aa3-0527a43b1798-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:01:19 np0005588920 nova_compute[226886]: 2026-01-20 15:01:19.430 226890 DEBUG oslo_concurrency.lockutils [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Lock "61ae2c61-01df-4ef1-8aa3-0527a43b1798-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:01:19 np0005588920 nova_compute[226886]: 2026-01-20 15:01:19.431 226890 DEBUG nova.virt.libvirt.vif [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-20T14:58:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1098898119',display_name='tempest-ServersNegativeTestJSON-server-1098898119',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-1098898119',id=143,image_ref='8976571c-92ae-42ce-94dd-a05ec6e308b3',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:59:08Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='105e56abe3804424885c7aa8d1216d12',ramdisk_id='',reservation_id='r-yf5960ix',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestJSON-1233513591',owner_user_name='tempest-ServersNegativeTestJSON-1233513591-project-member',shelved_at='2026-01-20T15:00:51.478483',shelved_host='compute-0.ctlplane.example.com',shelved_image_id='8976571c-92ae-42ce-94dd-a05ec6e308b3'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:01:10Z,user_data=None,user_id='d77d3db3cf924683a608d10efefcd156',uuid=61ae2c61-01df-4ef1-8aa3-0527a43b1798,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7", "address": "fa:16:3e:8c:76:cf", "network": {"id": "3aad5d71-9bbf-496d-805e-819d17c4343e", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1714826441-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "105e56abe3804424885c7aa8d1216d12", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee1c78ce-d0", "ovs_interfaceid": "ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:01:19 np0005588920 nova_compute[226886]: 2026-01-20 15:01:19.431 226890 DEBUG nova.network.os_vif_util [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Converting VIF {"id": "ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7", "address": "fa:16:3e:8c:76:cf", "network": {"id": "3aad5d71-9bbf-496d-805e-819d17c4343e", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1714826441-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "105e56abe3804424885c7aa8d1216d12", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee1c78ce-d0", "ovs_interfaceid": "ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:01:19 np0005588920 nova_compute[226886]: 2026-01-20 15:01:19.432 226890 DEBUG nova.network.os_vif_util [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8c:76:cf,bridge_name='br-int',has_traffic_filtering=True,id=ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7,network=Network(3aad5d71-9bbf-496d-805e-819d17c4343e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapee1c78ce-d0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:01:19 np0005588920 nova_compute[226886]: 2026-01-20 15:01:19.432 226890 DEBUG os_vif [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8c:76:cf,bridge_name='br-int',has_traffic_filtering=True,id=ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7,network=Network(3aad5d71-9bbf-496d-805e-819d17c4343e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapee1c78ce-d0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:01:19 np0005588920 nova_compute[226886]: 2026-01-20 15:01:19.433 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:19 np0005588920 nova_compute[226886]: 2026-01-20 15:01:19.434 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:01:19 np0005588920 nova_compute[226886]: 2026-01-20 15:01:19.434 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:01:19 np0005588920 nova_compute[226886]: 2026-01-20 15:01:19.437 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:19 np0005588920 nova_compute[226886]: 2026-01-20 15:01:19.438 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee1c78ce-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:01:19 np0005588920 nova_compute[226886]: 2026-01-20 15:01:19.439 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapee1c78ce-d0, col_values=(('external_ids', {'iface-id': 'ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8c:76:cf', 'vm-uuid': '61ae2c61-01df-4ef1-8aa3-0527a43b1798'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:01:19 np0005588920 nova_compute[226886]: 2026-01-20 15:01:19.440 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:19 np0005588920 NetworkManager[49076]: <info>  [1768921279.4412] manager: (tapee1c78ce-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/325)
Jan 20 10:01:19 np0005588920 nova_compute[226886]: 2026-01-20 15:01:19.442 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:01:19 np0005588920 nova_compute[226886]: 2026-01-20 15:01:19.447 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:19 np0005588920 nova_compute[226886]: 2026-01-20 15:01:19.448 226890 INFO os_vif [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8c:76:cf,bridge_name='br-int',has_traffic_filtering=True,id=ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7,network=Network(3aad5d71-9bbf-496d-805e-819d17c4343e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapee1c78ce-d0')#033[00m
Jan 20 10:01:19 np0005588920 nova_compute[226886]: 2026-01-20 15:01:19.528 226890 DEBUG nova.virt.libvirt.driver [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:01:19 np0005588920 nova_compute[226886]: 2026-01-20 15:01:19.529 226890 DEBUG nova.virt.libvirt.driver [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:01:19 np0005588920 nova_compute[226886]: 2026-01-20 15:01:19.529 226890 DEBUG nova.virt.libvirt.driver [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] No VIF found with MAC fa:16:3e:8c:76:cf, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:01:19 np0005588920 nova_compute[226886]: 2026-01-20 15:01:19.530 226890 INFO nova.virt.libvirt.driver [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Using config drive#033[00m
Jan 20 10:01:19 np0005588920 nova_compute[226886]: 2026-01-20 15:01:19.553 226890 DEBUG nova.storage.rbd_utils [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] rbd image 61ae2c61-01df-4ef1-8aa3-0527a43b1798_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:01:19 np0005588920 nova_compute[226886]: 2026-01-20 15:01:19.580 226890 DEBUG nova.objects.instance [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 61ae2c61-01df-4ef1-8aa3-0527a43b1798 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:01:19 np0005588920 nova_compute[226886]: 2026-01-20 15:01:19.630 226890 DEBUG nova.objects.instance [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Lazy-loading 'keypairs' on Instance uuid 61ae2c61-01df-4ef1-8aa3-0527a43b1798 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:01:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:19.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:20 np0005588920 nova_compute[226886]: 2026-01-20 15:01:20.014 226890 INFO nova.virt.libvirt.driver [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Creating config drive at /var/lib/nova/instances/61ae2c61-01df-4ef1-8aa3-0527a43b1798/disk.config#033[00m
Jan 20 10:01:20 np0005588920 nova_compute[226886]: 2026-01-20 15:01:20.019 226890 DEBUG oslo_concurrency.processutils [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/61ae2c61-01df-4ef1-8aa3-0527a43b1798/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppo54vo33 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:01:20 np0005588920 nova_compute[226886]: 2026-01-20 15:01:20.153 226890 DEBUG oslo_concurrency.processutils [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/61ae2c61-01df-4ef1-8aa3-0527a43b1798/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppo54vo33" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:01:20 np0005588920 nova_compute[226886]: 2026-01-20 15:01:20.181 226890 DEBUG nova.storage.rbd_utils [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] rbd image 61ae2c61-01df-4ef1-8aa3-0527a43b1798_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:01:20 np0005588920 nova_compute[226886]: 2026-01-20 15:01:20.185 226890 DEBUG oslo_concurrency.processutils [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/61ae2c61-01df-4ef1-8aa3-0527a43b1798/disk.config 61ae2c61-01df-4ef1-8aa3-0527a43b1798_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:01:20 np0005588920 nova_compute[226886]: 2026-01-20 15:01:20.842 226890 DEBUG oslo_concurrency.processutils [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/61ae2c61-01df-4ef1-8aa3-0527a43b1798/disk.config 61ae2c61-01df-4ef1-8aa3-0527a43b1798_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.657s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:01:20 np0005588920 nova_compute[226886]: 2026-01-20 15:01:20.843 226890 INFO nova.virt.libvirt.driver [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Deleting local config drive /var/lib/nova/instances/61ae2c61-01df-4ef1-8aa3-0527a43b1798/disk.config because it was imported into RBD.#033[00m
Jan 20 10:01:20 np0005588920 kernel: tapee1c78ce-d0: entered promiscuous mode
Jan 20 10:01:20 np0005588920 NetworkManager[49076]: <info>  [1768921280.8958] manager: (tapee1c78ce-d0): new Tun device (/org/freedesktop/NetworkManager/Devices/326)
Jan 20 10:01:20 np0005588920 ovn_controller[133971]: 2026-01-20T15:01:20Z|00670|binding|INFO|Claiming lport ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7 for this chassis.
Jan 20 10:01:20 np0005588920 ovn_controller[133971]: 2026-01-20T15:01:20Z|00671|binding|INFO|ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7: Claiming fa:16:3e:8c:76:cf 10.100.0.3
Jan 20 10:01:20 np0005588920 nova_compute[226886]: 2026-01-20 15:01:20.897 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:20 np0005588920 nova_compute[226886]: 2026-01-20 15:01:20.902 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:20.917 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8c:76:cf 10.100.0.3'], port_security=['fa:16:3e:8c:76:cf 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '61ae2c61-01df-4ef1-8aa3-0527a43b1798', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3aad5d71-9bbf-496d-805e-819d17c4343e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '105e56abe3804424885c7aa8d1216d12', 'neutron:revision_number': '7', 'neutron:security_group_ids': '5c26cf5d-4215-4bd2-8a4b-3ad6a5f65504', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=298e3802-e88f-473c-a925-fb8c9f7cfd27, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:01:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:20.918 144128 INFO neutron.agent.ovn.metadata.agent [-] Port ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7 in datapath 3aad5d71-9bbf-496d-805e-819d17c4343e bound to our chassis#033[00m
Jan 20 10:01:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:20.919 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3aad5d71-9bbf-496d-805e-819d17c4343e#033[00m
Jan 20 10:01:20 np0005588920 systemd-machined[196121]: New machine qemu-67-instance-0000008f.
Jan 20 10:01:20 np0005588920 systemd-udevd[280754]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:01:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:20.930 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[77e32cf8-585b-4c22-ac11-754348c175a8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:20.931 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3aad5d71-91 in ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:01:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:20.933 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3aad5d71-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:01:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:20.934 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[6a673482-b30e-4e78-b046-a1df4a1ad255]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:20.934 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[307167c7-aa3a-418a-8bcf-cf1cced87c1d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:20.944 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[77813c93-d7db-4177-8d77-aa3417951144]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:20 np0005588920 NetworkManager[49076]: <info>  [1768921280.9468] device (tapee1c78ce-d0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:01:20 np0005588920 NetworkManager[49076]: <info>  [1768921280.9475] device (tapee1c78ce-d0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:01:20 np0005588920 systemd[1]: Started Virtual Machine qemu-67-instance-0000008f.
Jan 20 10:01:20 np0005588920 nova_compute[226886]: 2026-01-20 15:01:20.967 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:20.971 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[286b7fe8-135d-48ec-b9ac-5fed8c4a1039]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:20 np0005588920 nova_compute[226886]: 2026-01-20 15:01:20.974 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:20 np0005588920 ovn_controller[133971]: 2026-01-20T15:01:20Z|00672|binding|INFO|Setting lport ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7 ovn-installed in OVS
Jan 20 10:01:20 np0005588920 ovn_controller[133971]: 2026-01-20T15:01:20Z|00673|binding|INFO|Setting lport ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7 up in Southbound
Jan 20 10:01:20 np0005588920 nova_compute[226886]: 2026-01-20 15:01:20.979 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:21.004 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[d2d56e32-96d1-44a4-804f-7371b88d1cf3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:21.010 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[588bf531-724e-43b2-8c09-944ee5e55ffc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:21 np0005588920 NetworkManager[49076]: <info>  [1768921281.0114] manager: (tap3aad5d71-90): new Veth device (/org/freedesktop/NetworkManager/Devices/327)
Jan 20 10:01:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:21.047 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[b4c0a98c-72b9-4e3a-82bb-a9fc8fca7682]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:21.051 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[aa32583d-d206-4066-b852-6e8344661149]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:21 np0005588920 NetworkManager[49076]: <info>  [1768921281.0750] device (tap3aad5d71-90): carrier: link connected
Jan 20 10:01:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:21.081 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[bb66a289-cc59-43ac-9937-e6cd250b163e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:21.099 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[6d653c49-4694-4464-aaf9-6084c300d360]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3aad5d71-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a8:0d:1c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 213], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 632276, 'reachable_time': 31436, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 280786, 'error': None, 'target': 'ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:21.114 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ed7f43d9-def4-4f25-95f3-4de8e5740b8f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea8:d1c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 632276, 'tstamp': 632276}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 280787, 'error': None, 'target': 'ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:21.130 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a9c48a9e-4c6c-4efd-9bbe-e2dd195201aa]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3aad5d71-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a8:0d:1c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 213], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 632276, 'reachable_time': 31436, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 280788, 'error': None, 'target': 'ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:21.166 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e235acad-8a61-4b32-8ddd-993118612480]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:21.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:21.237 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5f0824ee-75db-41ba-978f-be9077d55c3e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:21.239 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3aad5d71-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:01:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:21.239 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:01:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:21.239 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3aad5d71-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:01:21 np0005588920 NetworkManager[49076]: <info>  [1768921281.2420] manager: (tap3aad5d71-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/328)
Jan 20 10:01:21 np0005588920 kernel: tap3aad5d71-90: entered promiscuous mode
Jan 20 10:01:21 np0005588920 nova_compute[226886]: 2026-01-20 15:01:21.241 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:21.244 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3aad5d71-90, col_values=(('external_ids', {'iface-id': '326d4a7f-b98b-4d21-8fb2-256cf03a3e6a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:01:21 np0005588920 ovn_controller[133971]: 2026-01-20T15:01:21Z|00674|binding|INFO|Releasing lport 326d4a7f-b98b-4d21-8fb2-256cf03a3e6a from this chassis (sb_readonly=0)
Jan 20 10:01:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:21.247 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3aad5d71-9bbf-496d-805e-819d17c4343e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3aad5d71-9bbf-496d-805e-819d17c4343e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:01:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:21.248 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b57947bc-af77-4ae7-b19a-1d84502377ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:21.248 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:01:21 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 10:01:21 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 10:01:21 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-3aad5d71-9bbf-496d-805e-819d17c4343e
Jan 20 10:01:21 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 10:01:21 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 10:01:21 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 10:01:21 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/3aad5d71-9bbf-496d-805e-819d17c4343e.pid.haproxy
Jan 20 10:01:21 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 10:01:21 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:01:21 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 10:01:21 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 10:01:21 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 10:01:21 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 10:01:21 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 10:01:21 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 10:01:21 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 10:01:21 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 10:01:21 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 10:01:21 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 10:01:21 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 10:01:21 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 10:01:21 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 10:01:21 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:01:21 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:01:21 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 10:01:21 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 10:01:21 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:01:21 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID 3aad5d71-9bbf-496d-805e-819d17c4343e
Jan 20 10:01:21 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:01:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:21.249 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e', 'env', 'PROCESS_TAG=haproxy-3aad5d71-9bbf-496d-805e-819d17c4343e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3aad5d71-9bbf-496d-805e-819d17c4343e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:01:21 np0005588920 nova_compute[226886]: 2026-01-20 15:01:21.262 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:21 np0005588920 nova_compute[226886]: 2026-01-20 15:01:21.337 226890 DEBUG nova.compute.manager [req-2be23ad9-8421-4f1f-9309-01afbeba85d8 req-2bd0f4d9-1d4a-48a9-97b0-5c77319886a9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Received event network-vif-plugged-ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:01:21 np0005588920 nova_compute[226886]: 2026-01-20 15:01:21.337 226890 DEBUG oslo_concurrency.lockutils [req-2be23ad9-8421-4f1f-9309-01afbeba85d8 req-2bd0f4d9-1d4a-48a9-97b0-5c77319886a9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "61ae2c61-01df-4ef1-8aa3-0527a43b1798-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:01:21 np0005588920 nova_compute[226886]: 2026-01-20 15:01:21.338 226890 DEBUG oslo_concurrency.lockutils [req-2be23ad9-8421-4f1f-9309-01afbeba85d8 req-2bd0f4d9-1d4a-48a9-97b0-5c77319886a9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "61ae2c61-01df-4ef1-8aa3-0527a43b1798-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:01:21 np0005588920 nova_compute[226886]: 2026-01-20 15:01:21.338 226890 DEBUG oslo_concurrency.lockutils [req-2be23ad9-8421-4f1f-9309-01afbeba85d8 req-2bd0f4d9-1d4a-48a9-97b0-5c77319886a9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "61ae2c61-01df-4ef1-8aa3-0527a43b1798-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:01:21 np0005588920 nova_compute[226886]: 2026-01-20 15:01:21.338 226890 DEBUG nova.compute.manager [req-2be23ad9-8421-4f1f-9309-01afbeba85d8 req-2bd0f4d9-1d4a-48a9-97b0-5c77319886a9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Processing event network-vif-plugged-ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 10:01:21 np0005588920 nova_compute[226886]: 2026-01-20 15:01:21.381 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:21 np0005588920 nova_compute[226886]: 2026-01-20 15:01:21.609 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921281.608411, 61ae2c61-01df-4ef1-8aa3-0527a43b1798 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:01:21 np0005588920 nova_compute[226886]: 2026-01-20 15:01:21.610 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] VM Started (Lifecycle Event)#033[00m
Jan 20 10:01:21 np0005588920 nova_compute[226886]: 2026-01-20 15:01:21.612 226890 DEBUG nova.compute.manager [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:01:21 np0005588920 nova_compute[226886]: 2026-01-20 15:01:21.615 226890 DEBUG nova.virt.libvirt.driver [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:01:21 np0005588920 nova_compute[226886]: 2026-01-20 15:01:21.619 226890 INFO nova.virt.libvirt.driver [-] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Instance spawned successfully.#033[00m
Jan 20 10:01:21 np0005588920 nova_compute[226886]: 2026-01-20 15:01:21.633 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:01:21 np0005588920 nova_compute[226886]: 2026-01-20 15:01:21.637 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Synchronizing instance power state after lifecycle event "Started"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:01:21 np0005588920 podman[280859]: 2026-01-20 15:01:21.566561452 +0000 UTC m=+0.025451107 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:01:21 np0005588920 podman[280859]: 2026-01-20 15:01:21.672701312 +0000 UTC m=+0.131590947 container create 58434293c9967c4c9244ca2cd6687ba36d5701781b9d176ccfb289e33e5bec7f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 20 10:01:21 np0005588920 systemd[1]: Started libpod-conmon-58434293c9967c4c9244ca2cd6687ba36d5701781b9d176ccfb289e33e5bec7f.scope.
Jan 20 10:01:21 np0005588920 nova_compute[226886]: 2026-01-20 15:01:21.725 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:01:21 np0005588920 nova_compute[226886]: 2026-01-20 15:01:21.727 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921281.6087606, 61ae2c61-01df-4ef1-8aa3-0527a43b1798 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:01:21 np0005588920 nova_compute[226886]: 2026-01-20 15:01:21.727 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:01:21 np0005588920 systemd[1]: Started libcrun container.
Jan 20 10:01:21 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/30761209e3c1cb8cd042048e8ac1f5d57210597d824940d422fa901a51db4fe3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:01:21 np0005588920 nova_compute[226886]: 2026-01-20 15:01:21.758 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:01:21 np0005588920 nova_compute[226886]: 2026-01-20 15:01:21.762 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921281.61512, 61ae2c61-01df-4ef1-8aa3-0527a43b1798 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:01:21 np0005588920 nova_compute[226886]: 2026-01-20 15:01:21.763 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:01:21 np0005588920 podman[280859]: 2026-01-20 15:01:21.764974297 +0000 UTC m=+0.223863962 container init 58434293c9967c4c9244ca2cd6687ba36d5701781b9d176ccfb289e33e5bec7f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 20 10:01:21 np0005588920 podman[280859]: 2026-01-20 15:01:21.771062 +0000 UTC m=+0.229951685 container start 58434293c9967c4c9244ca2cd6687ba36d5701781b9d176ccfb289e33e5bec7f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 20 10:01:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:01:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:21.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:01:21 np0005588920 nova_compute[226886]: 2026-01-20 15:01:21.788 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:01:21 np0005588920 nova_compute[226886]: 2026-01-20 15:01:21.791 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:01:21 np0005588920 neutron-haproxy-ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e[280877]: [NOTICE]   (280881) : New worker (280883) forked
Jan 20 10:01:21 np0005588920 neutron-haproxy-ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e[280877]: [NOTICE]   (280881) : Loading success.
Jan 20 10:01:21 np0005588920 nova_compute[226886]: 2026-01-20 15:01:21.826 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:01:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e327 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:01:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e328 e328: 3 total, 3 up, 3 in
Jan 20 10:01:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:01:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:23.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:01:23 np0005588920 nova_compute[226886]: 2026-01-20 15:01:23.238 226890 DEBUG nova.compute.manager [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:01:23 np0005588920 nova_compute[226886]: 2026-01-20 15:01:23.331 226890 DEBUG oslo_concurrency.lockutils [None req-fd63cff8-ceff-4eb7-8514-e0112e4777ce d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Lock "61ae2c61-01df-4ef1-8aa3-0527a43b1798" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 13.310s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:01:23 np0005588920 nova_compute[226886]: 2026-01-20 15:01:23.416 226890 DEBUG nova.compute.manager [req-e408b6a3-f1b3-4274-a7dd-477ed03730ab req-db89ea5f-52aa-41b9-a2e6-37cb92ccd45e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Received event network-vif-plugged-ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:01:23 np0005588920 nova_compute[226886]: 2026-01-20 15:01:23.417 226890 DEBUG oslo_concurrency.lockutils [req-e408b6a3-f1b3-4274-a7dd-477ed03730ab req-db89ea5f-52aa-41b9-a2e6-37cb92ccd45e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "61ae2c61-01df-4ef1-8aa3-0527a43b1798-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:01:23 np0005588920 nova_compute[226886]: 2026-01-20 15:01:23.417 226890 DEBUG oslo_concurrency.lockutils [req-e408b6a3-f1b3-4274-a7dd-477ed03730ab req-db89ea5f-52aa-41b9-a2e6-37cb92ccd45e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "61ae2c61-01df-4ef1-8aa3-0527a43b1798-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:01:23 np0005588920 nova_compute[226886]: 2026-01-20 15:01:23.418 226890 DEBUG oslo_concurrency.lockutils [req-e408b6a3-f1b3-4274-a7dd-477ed03730ab req-db89ea5f-52aa-41b9-a2e6-37cb92ccd45e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "61ae2c61-01df-4ef1-8aa3-0527a43b1798-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:01:23 np0005588920 nova_compute[226886]: 2026-01-20 15:01:23.418 226890 DEBUG nova.compute.manager [req-e408b6a3-f1b3-4274-a7dd-477ed03730ab req-db89ea5f-52aa-41b9-a2e6-37cb92ccd45e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] No waiting events found dispatching network-vif-plugged-ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:01:23 np0005588920 nova_compute[226886]: 2026-01-20 15:01:23.418 226890 WARNING nova.compute.manager [req-e408b6a3-f1b3-4274-a7dd-477ed03730ab req-db89ea5f-52aa-41b9-a2e6-37cb92ccd45e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Received unexpected event network-vif-plugged-ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7 for instance with vm_state active and task_state None.#033[00m
Jan 20 10:01:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:23.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:24 np0005588920 nova_compute[226886]: 2026-01-20 15:01:24.442 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:01:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:25.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:01:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:25.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:26 np0005588920 nova_compute[226886]: 2026-01-20 15:01:26.382 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:27.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e328 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:01:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:01:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:27.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:01:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:29.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:29 np0005588920 nova_compute[226886]: 2026-01-20 15:01:29.443 226890 DEBUG nova.objects.instance [None req-da36202a-50bf-4f71-bb50-51dba453f256 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Lazy-loading 'pci_devices' on Instance uuid 61ae2c61-01df-4ef1-8aa3-0527a43b1798 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:01:29 np0005588920 nova_compute[226886]: 2026-01-20 15:01:29.508 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:29 np0005588920 nova_compute[226886]: 2026-01-20 15:01:29.516 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921289.516256, 61ae2c61-01df-4ef1-8aa3-0527a43b1798 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:01:29 np0005588920 nova_compute[226886]: 2026-01-20 15:01:29.517 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:01:29 np0005588920 nova_compute[226886]: 2026-01-20 15:01:29.535 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:01:29 np0005588920 nova_compute[226886]: 2026-01-20 15:01:29.539 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:01:29 np0005588920 nova_compute[226886]: 2026-01-20 15:01:29.565 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] During sync_power_state the instance has a pending task (suspending). Skip.#033[00m
Jan 20 10:01:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:01:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:29.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:01:30 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e329 e329: 3 total, 3 up, 3 in
Jan 20 10:01:30 np0005588920 kernel: tapee1c78ce-d0 (unregistering): left promiscuous mode
Jan 20 10:01:30 np0005588920 NetworkManager[49076]: <info>  [1768921290.0948] device (tapee1c78ce-d0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:01:30 np0005588920 ovn_controller[133971]: 2026-01-20T15:01:30Z|00675|binding|INFO|Releasing lport ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7 from this chassis (sb_readonly=0)
Jan 20 10:01:30 np0005588920 ovn_controller[133971]: 2026-01-20T15:01:30Z|00676|binding|INFO|Setting lport ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7 down in Southbound
Jan 20 10:01:30 np0005588920 ovn_controller[133971]: 2026-01-20T15:01:30Z|00677|binding|INFO|Removing iface tapee1c78ce-d0 ovn-installed in OVS
Jan 20 10:01:30 np0005588920 nova_compute[226886]: 2026-01-20 15:01:30.106 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:30 np0005588920 nova_compute[226886]: 2026-01-20 15:01:30.109 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:30.116 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8c:76:cf 10.100.0.3'], port_security=['fa:16:3e:8c:76:cf 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '61ae2c61-01df-4ef1-8aa3-0527a43b1798', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3aad5d71-9bbf-496d-805e-819d17c4343e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '105e56abe3804424885c7aa8d1216d12', 'neutron:revision_number': '9', 'neutron:security_group_ids': '5c26cf5d-4215-4bd2-8a4b-3ad6a5f65504', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=298e3802-e88f-473c-a925-fb8c9f7cfd27, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:01:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:30.118 144128 INFO neutron.agent.ovn.metadata.agent [-] Port ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7 in datapath 3aad5d71-9bbf-496d-805e-819d17c4343e unbound from our chassis#033[00m
Jan 20 10:01:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:30.119 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3aad5d71-9bbf-496d-805e-819d17c4343e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:01:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:30.120 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d36d8b08-89a3-48f4-a0a2-fa42f423e467]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:30.121 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e namespace which is not needed anymore#033[00m
Jan 20 10:01:30 np0005588920 nova_compute[226886]: 2026-01-20 15:01:30.153 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:30 np0005588920 systemd[1]: machine-qemu\x2d67\x2dinstance\x2d0000008f.scope: Deactivated successfully.
Jan 20 10:01:30 np0005588920 systemd[1]: machine-qemu\x2d67\x2dinstance\x2d0000008f.scope: Consumed 8.932s CPU time.
Jan 20 10:01:30 np0005588920 systemd-machined[196121]: Machine qemu-67-instance-0000008f terminated.
Jan 20 10:01:30 np0005588920 nova_compute[226886]: 2026-01-20 15:01:30.281 226890 DEBUG nova.compute.manager [None req-da36202a-50bf-4f71-bb50-51dba453f256 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:01:30 np0005588920 neutron-haproxy-ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e[280877]: [NOTICE]   (280881) : haproxy version is 2.8.14-c23fe91
Jan 20 10:01:30 np0005588920 neutron-haproxy-ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e[280877]: [NOTICE]   (280881) : path to executable is /usr/sbin/haproxy
Jan 20 10:01:30 np0005588920 neutron-haproxy-ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e[280877]: [WARNING]  (280881) : Exiting Master process...
Jan 20 10:01:30 np0005588920 neutron-haproxy-ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e[280877]: [ALERT]    (280881) : Current worker (280883) exited with code 143 (Terminated)
Jan 20 10:01:30 np0005588920 neutron-haproxy-ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e[280877]: [WARNING]  (280881) : All workers exited. Exiting... (0)
Jan 20 10:01:30 np0005588920 systemd[1]: libpod-58434293c9967c4c9244ca2cd6687ba36d5701781b9d176ccfb289e33e5bec7f.scope: Deactivated successfully.
Jan 20 10:01:30 np0005588920 podman[280922]: 2026-01-20 15:01:30.300454842 +0000 UTC m=+0.060861128 container died 58434293c9967c4c9244ca2cd6687ba36d5701781b9d176ccfb289e33e5bec7f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:01:30 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-58434293c9967c4c9244ca2cd6687ba36d5701781b9d176ccfb289e33e5bec7f-userdata-shm.mount: Deactivated successfully.
Jan 20 10:01:30 np0005588920 systemd[1]: var-lib-containers-storage-overlay-30761209e3c1cb8cd042048e8ac1f5d57210597d824940d422fa901a51db4fe3-merged.mount: Deactivated successfully.
Jan 20 10:01:30 np0005588920 podman[280922]: 2026-01-20 15:01:30.341230236 +0000 UTC m=+0.101636502 container cleanup 58434293c9967c4c9244ca2cd6687ba36d5701781b9d176ccfb289e33e5bec7f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 20 10:01:30 np0005588920 systemd[1]: libpod-conmon-58434293c9967c4c9244ca2cd6687ba36d5701781b9d176ccfb289e33e5bec7f.scope: Deactivated successfully.
Jan 20 10:01:30 np0005588920 podman[280964]: 2026-01-20 15:01:30.408151177 +0000 UTC m=+0.041791514 container remove 58434293c9967c4c9244ca2cd6687ba36d5701781b9d176ccfb289e33e5bec7f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:01:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:30.414 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f5d5a5ff-434d-459e-b149-b8bd75458700]: (4, ('Tue Jan 20 03:01:30 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e (58434293c9967c4c9244ca2cd6687ba36d5701781b9d176ccfb289e33e5bec7f)\n58434293c9967c4c9244ca2cd6687ba36d5701781b9d176ccfb289e33e5bec7f\nTue Jan 20 03:01:30 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e (58434293c9967c4c9244ca2cd6687ba36d5701781b9d176ccfb289e33e5bec7f)\n58434293c9967c4c9244ca2cd6687ba36d5701781b9d176ccfb289e33e5bec7f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:30.416 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[439d1380-9a2f-4ada-8d21-396dc0334596]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:30.417 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3aad5d71-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:01:30 np0005588920 nova_compute[226886]: 2026-01-20 15:01:30.420 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:30 np0005588920 kernel: tap3aad5d71-90: left promiscuous mode
Jan 20 10:01:30 np0005588920 nova_compute[226886]: 2026-01-20 15:01:30.438 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:30.441 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[17785ae6-1906-4ed5-9c98-fc7554bb3e57]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:30.461 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a491776a-80ae-4dff-a656-a16b2c5e3ac5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:30.462 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[99fb4b56-d49e-4f8f-806f-284007b579b7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:30.479 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0f94c7c7-6dac-4da9-bf39-f4bd0f29c9fb]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 632268, 'reachable_time': 20577, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 280982, 'error': None, 'target': 'ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:30 np0005588920 systemd[1]: run-netns-ovnmeta\x2d3aad5d71\x2d9bbf\x2d496d\x2d805e\x2d819d17c4343e.mount: Deactivated successfully.
Jan 20 10:01:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:30.482 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:01:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:30.482 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[81fbcaa4-c1e7-44fd-9326-1bfc01dfdb7e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:31.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:31 np0005588920 nova_compute[226886]: 2026-01-20 15:01:31.385 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:01:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:31.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:01:32 np0005588920 nova_compute[226886]: 2026-01-20 15:01:32.369 226890 INFO nova.compute.manager [None req-3b776f86-1ec9-4485-bff5-98da937cdf2e d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Resuming#033[00m
Jan 20 10:01:32 np0005588920 nova_compute[226886]: 2026-01-20 15:01:32.370 226890 DEBUG nova.objects.instance [None req-3b776f86-1ec9-4485-bff5-98da937cdf2e d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Lazy-loading 'flavor' on Instance uuid 61ae2c61-01df-4ef1-8aa3-0527a43b1798 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:01:32 np0005588920 nova_compute[226886]: 2026-01-20 15:01:32.418 226890 DEBUG oslo_concurrency.lockutils [None req-3b776f86-1ec9-4485-bff5-98da937cdf2e d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Acquiring lock "refresh_cache-61ae2c61-01df-4ef1-8aa3-0527a43b1798" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:01:32 np0005588920 nova_compute[226886]: 2026-01-20 15:01:32.419 226890 DEBUG oslo_concurrency.lockutils [None req-3b776f86-1ec9-4485-bff5-98da937cdf2e d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Acquired lock "refresh_cache-61ae2c61-01df-4ef1-8aa3-0527a43b1798" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:01:32 np0005588920 nova_compute[226886]: 2026-01-20 15:01:32.419 226890 DEBUG nova.network.neutron [None req-3b776f86-1ec9-4485-bff5-98da937cdf2e d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:01:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e329 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:01:33 np0005588920 podman[280983]: 2026-01-20 15:01:33.061235996 +0000 UTC m=+0.140630916 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller)
Jan 20 10:01:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:33.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:33 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e330 e330: 3 total, 3 up, 3 in
Jan 20 10:01:33 np0005588920 nova_compute[226886]: 2026-01-20 15:01:33.517 226890 DEBUG nova.compute.manager [req-622f8822-d355-4f4d-b8ea-706e229fa31d req-6226acac-704a-4fc8-bbc2-d718217c8c52 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Received event network-vif-unplugged-ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:01:33 np0005588920 nova_compute[226886]: 2026-01-20 15:01:33.518 226890 DEBUG oslo_concurrency.lockutils [req-622f8822-d355-4f4d-b8ea-706e229fa31d req-6226acac-704a-4fc8-bbc2-d718217c8c52 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "61ae2c61-01df-4ef1-8aa3-0527a43b1798-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:01:33 np0005588920 nova_compute[226886]: 2026-01-20 15:01:33.518 226890 DEBUG oslo_concurrency.lockutils [req-622f8822-d355-4f4d-b8ea-706e229fa31d req-6226acac-704a-4fc8-bbc2-d718217c8c52 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "61ae2c61-01df-4ef1-8aa3-0527a43b1798-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:01:33 np0005588920 nova_compute[226886]: 2026-01-20 15:01:33.518 226890 DEBUG oslo_concurrency.lockutils [req-622f8822-d355-4f4d-b8ea-706e229fa31d req-6226acac-704a-4fc8-bbc2-d718217c8c52 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "61ae2c61-01df-4ef1-8aa3-0527a43b1798-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:01:33 np0005588920 nova_compute[226886]: 2026-01-20 15:01:33.518 226890 DEBUG nova.compute.manager [req-622f8822-d355-4f4d-b8ea-706e229fa31d req-6226acac-704a-4fc8-bbc2-d718217c8c52 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] No waiting events found dispatching network-vif-unplugged-ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:01:33 np0005588920 nova_compute[226886]: 2026-01-20 15:01:33.518 226890 WARNING nova.compute.manager [req-622f8822-d355-4f4d-b8ea-706e229fa31d req-6226acac-704a-4fc8-bbc2-d718217c8c52 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Received unexpected event network-vif-unplugged-ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7 for instance with vm_state suspended and task_state resuming.#033[00m
Jan 20 10:01:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:34.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:34 np0005588920 nova_compute[226886]: 2026-01-20 15:01:34.510 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:34 np0005588920 nova_compute[226886]: 2026-01-20 15:01:34.659 226890 DEBUG nova.network.neutron [None req-3b776f86-1ec9-4485-bff5-98da937cdf2e d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Updating instance_info_cache with network_info: [{"id": "ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7", "address": "fa:16:3e:8c:76:cf", "network": {"id": "3aad5d71-9bbf-496d-805e-819d17c4343e", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1714826441-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "105e56abe3804424885c7aa8d1216d12", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee1c78ce-d0", "ovs_interfaceid": "ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:01:34 np0005588920 nova_compute[226886]: 2026-01-20 15:01:34.688 226890 DEBUG oslo_concurrency.lockutils [None req-3b776f86-1ec9-4485-bff5-98da937cdf2e d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Releasing lock "refresh_cache-61ae2c61-01df-4ef1-8aa3-0527a43b1798" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:01:34 np0005588920 nova_compute[226886]: 2026-01-20 15:01:34.693 226890 DEBUG nova.virt.libvirt.vif [None req-3b776f86-1ec9-4485-bff5-98da937cdf2e d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-20T14:58:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1098898119',display_name='tempest-ServersNegativeTestJSON-server-1098898119',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-1098898119',id=143,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:01:23Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='105e56abe3804424885c7aa8d1216d12',ramdisk_id='',reservation_id='r-yf5960ix',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vi
f_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServersNegativeTestJSON-1233513591',owner_user_name='tempest-ServersNegativeTestJSON-1233513591-project-member'},tags=<?>,task_state='resuming',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:01:30Z,user_data=None,user_id='d77d3db3cf924683a608d10efefcd156',uuid=61ae2c61-01df-4ef1-8aa3-0527a43b1798,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7", "address": "fa:16:3e:8c:76:cf", "network": {"id": "3aad5d71-9bbf-496d-805e-819d17c4343e", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1714826441-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "105e56abe3804424885c7aa8d1216d12", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee1c78ce-d0", "ovs_interfaceid": "ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:01:34 np0005588920 nova_compute[226886]: 2026-01-20 15:01:34.694 226890 DEBUG nova.network.os_vif_util [None req-3b776f86-1ec9-4485-bff5-98da937cdf2e d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Converting VIF {"id": "ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7", "address": "fa:16:3e:8c:76:cf", "network": {"id": "3aad5d71-9bbf-496d-805e-819d17c4343e", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1714826441-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "105e56abe3804424885c7aa8d1216d12", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee1c78ce-d0", "ovs_interfaceid": "ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:01:34 np0005588920 nova_compute[226886]: 2026-01-20 15:01:34.695 226890 DEBUG nova.network.os_vif_util [None req-3b776f86-1ec9-4485-bff5-98da937cdf2e d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8c:76:cf,bridge_name='br-int',has_traffic_filtering=True,id=ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7,network=Network(3aad5d71-9bbf-496d-805e-819d17c4343e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapee1c78ce-d0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:01:34 np0005588920 nova_compute[226886]: 2026-01-20 15:01:34.695 226890 DEBUG os_vif [None req-3b776f86-1ec9-4485-bff5-98da937cdf2e d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8c:76:cf,bridge_name='br-int',has_traffic_filtering=True,id=ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7,network=Network(3aad5d71-9bbf-496d-805e-819d17c4343e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapee1c78ce-d0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:01:34 np0005588920 nova_compute[226886]: 2026-01-20 15:01:34.696 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:34 np0005588920 nova_compute[226886]: 2026-01-20 15:01:34.697 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:01:34 np0005588920 nova_compute[226886]: 2026-01-20 15:01:34.697 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:01:34 np0005588920 nova_compute[226886]: 2026-01-20 15:01:34.701 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:34 np0005588920 nova_compute[226886]: 2026-01-20 15:01:34.702 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee1c78ce-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:01:34 np0005588920 nova_compute[226886]: 2026-01-20 15:01:34.702 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapee1c78ce-d0, col_values=(('external_ids', {'iface-id': 'ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8c:76:cf', 'vm-uuid': '61ae2c61-01df-4ef1-8aa3-0527a43b1798'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:01:34 np0005588920 nova_compute[226886]: 2026-01-20 15:01:34.703 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:01:34 np0005588920 nova_compute[226886]: 2026-01-20 15:01:34.703 226890 INFO os_vif [None req-3b776f86-1ec9-4485-bff5-98da937cdf2e d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8c:76:cf,bridge_name='br-int',has_traffic_filtering=True,id=ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7,network=Network(3aad5d71-9bbf-496d-805e-819d17c4343e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapee1c78ce-d0')#033[00m
Jan 20 10:01:34 np0005588920 nova_compute[226886]: 2026-01-20 15:01:34.726 226890 DEBUG nova.objects.instance [None req-3b776f86-1ec9-4485-bff5-98da937cdf2e d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Lazy-loading 'numa_topology' on Instance uuid 61ae2c61-01df-4ef1-8aa3-0527a43b1798 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:01:34 np0005588920 kernel: tapee1c78ce-d0: entered promiscuous mode
Jan 20 10:01:34 np0005588920 NetworkManager[49076]: <info>  [1768921294.7860] manager: (tapee1c78ce-d0): new Tun device (/org/freedesktop/NetworkManager/Devices/329)
Jan 20 10:01:34 np0005588920 nova_compute[226886]: 2026-01-20 15:01:34.785 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:34 np0005588920 ovn_controller[133971]: 2026-01-20T15:01:34Z|00678|binding|INFO|Claiming lport ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7 for this chassis.
Jan 20 10:01:34 np0005588920 ovn_controller[133971]: 2026-01-20T15:01:34Z|00679|binding|INFO|ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7: Claiming fa:16:3e:8c:76:cf 10.100.0.3
Jan 20 10:01:34 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:34.792 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8c:76:cf 10.100.0.3'], port_security=['fa:16:3e:8c:76:cf 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '61ae2c61-01df-4ef1-8aa3-0527a43b1798', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3aad5d71-9bbf-496d-805e-819d17c4343e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '105e56abe3804424885c7aa8d1216d12', 'neutron:revision_number': '10', 'neutron:security_group_ids': '5c26cf5d-4215-4bd2-8a4b-3ad6a5f65504', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=298e3802-e88f-473c-a925-fb8c9f7cfd27, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:01:34 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:34.793 144128 INFO neutron.agent.ovn.metadata.agent [-] Port ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7 in datapath 3aad5d71-9bbf-496d-805e-819d17c4343e bound to our chassis#033[00m
Jan 20 10:01:34 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:34.794 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3aad5d71-9bbf-496d-805e-819d17c4343e#033[00m
Jan 20 10:01:34 np0005588920 ovn_controller[133971]: 2026-01-20T15:01:34Z|00680|binding|INFO|Setting lport ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7 ovn-installed in OVS
Jan 20 10:01:34 np0005588920 ovn_controller[133971]: 2026-01-20T15:01:34Z|00681|binding|INFO|Setting lport ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7 up in Southbound
Jan 20 10:01:34 np0005588920 nova_compute[226886]: 2026-01-20 15:01:34.804 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:34 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:34.806 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a8716ef2-7359-486f-b8c6-2b3f13284489]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:34 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:34.806 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3aad5d71-91 in ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:01:34 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:34.808 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3aad5d71-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:01:34 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:34.808 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e60054dd-72c5-4886-afa8-c4cdca3b432e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:34 np0005588920 nova_compute[226886]: 2026-01-20 15:01:34.808 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:34 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:34.809 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e63ae411-a3a1-46c1-9423-25a7c3290034]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:34 np0005588920 systemd-udevd[281027]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:01:34 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:34.820 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[7eb12648-5e3a-4f5e-bd69-ff00c0ce4bf9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:34 np0005588920 systemd-machined[196121]: New machine qemu-68-instance-0000008f.
Jan 20 10:01:34 np0005588920 NetworkManager[49076]: <info>  [1768921294.8298] device (tapee1c78ce-d0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:01:34 np0005588920 NetworkManager[49076]: <info>  [1768921294.8312] device (tapee1c78ce-d0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:01:34 np0005588920 systemd[1]: Started Virtual Machine qemu-68-instance-0000008f.
Jan 20 10:01:34 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:34.832 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[489ef016-001c-4628-b2a0-8758afc064a6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:34 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:34.861 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[bb7b391e-9d0c-42cf-9395-af85e3760a99]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:34 np0005588920 NetworkManager[49076]: <info>  [1768921294.8677] manager: (tap3aad5d71-90): new Veth device (/org/freedesktop/NetworkManager/Devices/330)
Jan 20 10:01:34 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:34.866 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[05839609-053a-4c11-a02e-6305ebe55428]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:34 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:34.894 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[c4dc5171-ada6-4d6d-9e1d-d9f37a12d6f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:34 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:34.897 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[b3dddfa1-44e0-4aa0-8f0a-f8b170de47c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:34 np0005588920 NetworkManager[49076]: <info>  [1768921294.9209] device (tap3aad5d71-90): carrier: link connected
Jan 20 10:01:34 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:34.925 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[9bbe2aac-7909-457b-ae89-f3c6f1d2b052]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:34 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:34.941 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[1e14190d-e452-404a-8ada-ed3af3a04bae]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3aad5d71-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a8:0d:1c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 216], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 633660, 'reachable_time': 43593, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 281058, 'error': None, 'target': 'ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:34 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:34.958 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[dbffdbcc-d6cf-4c65-9043-870c1eb8cf87]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea8:d1c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 633660, 'tstamp': 633660}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 281059, 'error': None, 'target': 'ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:34 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:34.975 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0941f5ad-95c6-4380-a42d-a97890d6efda]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3aad5d71-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a8:0d:1c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 216], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 633660, 'reachable_time': 43593, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 281060, 'error': None, 'target': 'ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:35.003 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c0e816c2-4904-4464-ae7c-81ba4e9163e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:35.060 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c241b497-35b5-42aa-9c1e-5c8b0647eb25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:35.061 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3aad5d71-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:01:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:35.061 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:01:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:35.062 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3aad5d71-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:01:35 np0005588920 nova_compute[226886]: 2026-01-20 15:01:35.063 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:35 np0005588920 NetworkManager[49076]: <info>  [1768921295.0643] manager: (tap3aad5d71-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/331)
Jan 20 10:01:35 np0005588920 kernel: tap3aad5d71-90: entered promiscuous mode
Jan 20 10:01:35 np0005588920 nova_compute[226886]: 2026-01-20 15:01:35.065 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:35.069 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3aad5d71-90, col_values=(('external_ids', {'iface-id': '326d4a7f-b98b-4d21-8fb2-256cf03a3e6a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:01:35 np0005588920 nova_compute[226886]: 2026-01-20 15:01:35.070 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:35 np0005588920 ovn_controller[133971]: 2026-01-20T15:01:35Z|00682|binding|INFO|Releasing lport 326d4a7f-b98b-4d21-8fb2-256cf03a3e6a from this chassis (sb_readonly=0)
Jan 20 10:01:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:35.074 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3aad5d71-9bbf-496d-805e-819d17c4343e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3aad5d71-9bbf-496d-805e-819d17c4343e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:01:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:35.077 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[38a77daf-ea49-4d9b-8b76-410b91e1c83a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:35.078 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:01:35 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 10:01:35 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 10:01:35 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-3aad5d71-9bbf-496d-805e-819d17c4343e
Jan 20 10:01:35 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 10:01:35 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 10:01:35 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 10:01:35 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/3aad5d71-9bbf-496d-805e-819d17c4343e.pid.haproxy
Jan 20 10:01:35 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 10:01:35 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:01:35 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 10:01:35 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 10:01:35 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 10:01:35 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 10:01:35 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 10:01:35 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 10:01:35 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 10:01:35 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 10:01:35 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 10:01:35 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 10:01:35 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 10:01:35 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 10:01:35 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 10:01:35 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:01:35 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:01:35 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 10:01:35 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 10:01:35 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:01:35 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID 3aad5d71-9bbf-496d-805e-819d17c4343e
Jan 20 10:01:35 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:01:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:35.079 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e', 'env', 'PROCESS_TAG=haproxy-3aad5d71-9bbf-496d-805e-819d17c4343e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3aad5d71-9bbf-496d-805e-819d17c4343e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:01:35 np0005588920 nova_compute[226886]: 2026-01-20 15:01:35.087 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:35.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:35 np0005588920 podman[281110]: 2026-01-20 15:01:35.465218314 +0000 UTC m=+0.060292952 container create 32676c1e6492ef78087b361ba2284a73d69fcd4f8f8f370b0b7701ee42eeff81 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 20 10:01:35 np0005588920 podman[281110]: 2026-01-20 15:01:35.430879234 +0000 UTC m=+0.025953892 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:01:35 np0005588920 systemd[1]: Started libpod-conmon-32676c1e6492ef78087b361ba2284a73d69fcd4f8f8f370b0b7701ee42eeff81.scope.
Jan 20 10:01:35 np0005588920 systemd[1]: Started libcrun container.
Jan 20 10:01:35 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf70729eb1c3f91959e090fac6269215bbe17670d4e80b50683f47548b3ced72/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:01:35 np0005588920 podman[281110]: 2026-01-20 15:01:35.579951439 +0000 UTC m=+0.175026077 container init 32676c1e6492ef78087b361ba2284a73d69fcd4f8f8f370b0b7701ee42eeff81 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:01:35 np0005588920 podman[281110]: 2026-01-20 15:01:35.585110887 +0000 UTC m=+0.180185525 container start 32676c1e6492ef78087b361ba2284a73d69fcd4f8f8f370b0b7701ee42eeff81 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 20 10:01:35 np0005588920 neutron-haproxy-ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e[281149]: [NOTICE]   (281153) : New worker (281155) forked
Jan 20 10:01:35 np0005588920 neutron-haproxy-ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e[281149]: [NOTICE]   (281153) : Loading success.
Jan 20 10:01:35 np0005588920 nova_compute[226886]: 2026-01-20 15:01:35.637 226890 DEBUG nova.virt.libvirt.host [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Removed pending event for 61ae2c61-01df-4ef1-8aa3-0527a43b1798 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 20 10:01:35 np0005588920 nova_compute[226886]: 2026-01-20 15:01:35.638 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921295.6365948, 61ae2c61-01df-4ef1-8aa3-0527a43b1798 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:01:35 np0005588920 nova_compute[226886]: 2026-01-20 15:01:35.638 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] VM Started (Lifecycle Event)#033[00m
Jan 20 10:01:35 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e331 e331: 3 total, 3 up, 3 in
Jan 20 10:01:35 np0005588920 nova_compute[226886]: 2026-01-20 15:01:35.663 226890 DEBUG nova.compute.manager [None req-3b776f86-1ec9-4485-bff5-98da937cdf2e d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:01:35 np0005588920 nova_compute[226886]: 2026-01-20 15:01:35.664 226890 DEBUG nova.objects.instance [None req-3b776f86-1ec9-4485-bff5-98da937cdf2e d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Lazy-loading 'pci_devices' on Instance uuid 61ae2c61-01df-4ef1-8aa3-0527a43b1798 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:01:35 np0005588920 nova_compute[226886]: 2026-01-20 15:01:35.716 226890 DEBUG nova.compute.manager [req-54474457-6aa1-48be-97a9-bbb5c8d13f54 req-84de6932-0b8f-41fd-aaa5-f6c892afeb1a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Received event network-vif-plugged-ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:01:35 np0005588920 nova_compute[226886]: 2026-01-20 15:01:35.717 226890 DEBUG oslo_concurrency.lockutils [req-54474457-6aa1-48be-97a9-bbb5c8d13f54 req-84de6932-0b8f-41fd-aaa5-f6c892afeb1a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "61ae2c61-01df-4ef1-8aa3-0527a43b1798-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:01:35 np0005588920 nova_compute[226886]: 2026-01-20 15:01:35.717 226890 DEBUG oslo_concurrency.lockutils [req-54474457-6aa1-48be-97a9-bbb5c8d13f54 req-84de6932-0b8f-41fd-aaa5-f6c892afeb1a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "61ae2c61-01df-4ef1-8aa3-0527a43b1798-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:01:35 np0005588920 nova_compute[226886]: 2026-01-20 15:01:35.717 226890 DEBUG oslo_concurrency.lockutils [req-54474457-6aa1-48be-97a9-bbb5c8d13f54 req-84de6932-0b8f-41fd-aaa5-f6c892afeb1a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "61ae2c61-01df-4ef1-8aa3-0527a43b1798-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:01:35 np0005588920 nova_compute[226886]: 2026-01-20 15:01:35.718 226890 DEBUG nova.compute.manager [req-54474457-6aa1-48be-97a9-bbb5c8d13f54 req-84de6932-0b8f-41fd-aaa5-f6c892afeb1a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] No waiting events found dispatching network-vif-plugged-ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:01:35 np0005588920 nova_compute[226886]: 2026-01-20 15:01:35.718 226890 WARNING nova.compute.manager [req-54474457-6aa1-48be-97a9-bbb5c8d13f54 req-84de6932-0b8f-41fd-aaa5-f6c892afeb1a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Received unexpected event network-vif-plugged-ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7 for instance with vm_state suspended and task_state resuming.#033[00m
Jan 20 10:01:35 np0005588920 nova_compute[226886]: 2026-01-20 15:01:35.824 226890 INFO nova.virt.libvirt.driver [-] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Instance running successfully.#033[00m
Jan 20 10:01:35 np0005588920 nova_compute[226886]: 2026-01-20 15:01:35.826 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:01:35 np0005588920 virtqemud[226436]: argument unsupported: QEMU guest agent is not configured
Jan 20 10:01:35 np0005588920 nova_compute[226886]: 2026-01-20 15:01:35.829 226890 DEBUG nova.virt.libvirt.guest [None req-3b776f86-1ec9-4485-bff5-98da937cdf2e d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Jan 20 10:01:35 np0005588920 nova_compute[226886]: 2026-01-20 15:01:35.829 226890 DEBUG nova.compute.manager [None req-3b776f86-1ec9-4485-bff5-98da937cdf2e d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 10:01:35 np0005588920 nova_compute[226886]: 2026-01-20 15:01:35.831 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Synchronizing instance power state after lifecycle event "Started"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 20 10:01:36 np0005588920 nova_compute[226886]: 2026-01-20 15:01:36.064 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] During sync_power_state the instance has a pending task (resuming). Skip.
Jan 20 10:01:36 np0005588920 nova_compute[226886]: 2026-01-20 15:01:36.065 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921295.6476617, 61ae2c61-01df-4ef1-8aa3-0527a43b1798 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 20 10:01:36 np0005588920 nova_compute[226886]: 2026-01-20 15:01:36.066 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] VM Resumed (Lifecycle Event)
Jan 20 10:01:36 np0005588920 nova_compute[226886]: 2026-01-20 15:01:36.099 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 10:01:36 np0005588920 nova_compute[226886]: 2026-01-20 15:01:36.102 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 20 10:01:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:01:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:36.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:01:36 np0005588920 nova_compute[226886]: 2026-01-20 15:01:36.432 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:01:36 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e332 e332: 3 total, 3 up, 3 in
Jan 20 10:01:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:37.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e332 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:01:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:38.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:39.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:39 np0005588920 nova_compute[226886]: 2026-01-20 15:01:39.512 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:01:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:01:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:40.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:01:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:41.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:41 np0005588920 nova_compute[226886]: 2026-01-20 15:01:41.437 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:01:41 np0005588920 ovn_controller[133971]: 2026-01-20T15:01:41Z|00088|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8c:76:cf 10.100.0.3
Jan 20 10:01:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:42.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e332 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:01:42 np0005588920 nova_compute[226886]: 2026-01-20 15:01:42.806 226890 DEBUG nova.compute.manager [req-d5ac2e0f-b5be-4368-8880-ac5a487e4859 req-e8588893-2d41-478a-bdea-c140e315ccf4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Received event network-vif-plugged-ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 10:01:42 np0005588920 nova_compute[226886]: 2026-01-20 15:01:42.807 226890 DEBUG oslo_concurrency.lockutils [req-d5ac2e0f-b5be-4368-8880-ac5a487e4859 req-e8588893-2d41-478a-bdea-c140e315ccf4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "61ae2c61-01df-4ef1-8aa3-0527a43b1798-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:01:42 np0005588920 nova_compute[226886]: 2026-01-20 15:01:42.807 226890 DEBUG oslo_concurrency.lockutils [req-d5ac2e0f-b5be-4368-8880-ac5a487e4859 req-e8588893-2d41-478a-bdea-c140e315ccf4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "61ae2c61-01df-4ef1-8aa3-0527a43b1798-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:01:42 np0005588920 nova_compute[226886]: 2026-01-20 15:01:42.807 226890 DEBUG oslo_concurrency.lockutils [req-d5ac2e0f-b5be-4368-8880-ac5a487e4859 req-e8588893-2d41-478a-bdea-c140e315ccf4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "61ae2c61-01df-4ef1-8aa3-0527a43b1798-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:01:42 np0005588920 nova_compute[226886]: 2026-01-20 15:01:42.807 226890 DEBUG nova.compute.manager [req-d5ac2e0f-b5be-4368-8880-ac5a487e4859 req-e8588893-2d41-478a-bdea-c140e315ccf4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] No waiting events found dispatching network-vif-plugged-ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 20 10:01:42 np0005588920 nova_compute[226886]: 2026-01-20 15:01:42.808 226890 WARNING nova.compute.manager [req-d5ac2e0f-b5be-4368-8880-ac5a487e4859 req-e8588893-2d41-478a-bdea-c140e315ccf4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Received unexpected event network-vif-plugged-ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7 for instance with vm_state active and task_state None.
Jan 20 10:01:42 np0005588920 podman[281165]: 2026-01-20 15:01:42.972575658 +0000 UTC m=+0.051425900 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 20 10:01:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:43.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:44.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:44 np0005588920 nova_compute[226886]: 2026-01-20 15:01:44.549 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:01:44 np0005588920 nova_compute[226886]: 2026-01-20 15:01:44.868 226890 DEBUG nova.compute.manager [req-126f2b22-bb63-4690-9dac-20028ab212cb req-2a73ab53-95c2-4cc9-8365-c3d345727ad3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Received event network-vif-plugged-ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 10:01:44 np0005588920 nova_compute[226886]: 2026-01-20 15:01:44.868 226890 DEBUG oslo_concurrency.lockutils [req-126f2b22-bb63-4690-9dac-20028ab212cb req-2a73ab53-95c2-4cc9-8365-c3d345727ad3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "61ae2c61-01df-4ef1-8aa3-0527a43b1798-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:01:44 np0005588920 nova_compute[226886]: 2026-01-20 15:01:44.868 226890 DEBUG oslo_concurrency.lockutils [req-126f2b22-bb63-4690-9dac-20028ab212cb req-2a73ab53-95c2-4cc9-8365-c3d345727ad3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "61ae2c61-01df-4ef1-8aa3-0527a43b1798-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:01:44 np0005588920 nova_compute[226886]: 2026-01-20 15:01:44.868 226890 DEBUG oslo_concurrency.lockutils [req-126f2b22-bb63-4690-9dac-20028ab212cb req-2a73ab53-95c2-4cc9-8365-c3d345727ad3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "61ae2c61-01df-4ef1-8aa3-0527a43b1798-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:01:44 np0005588920 nova_compute[226886]: 2026-01-20 15:01:44.869 226890 DEBUG nova.compute.manager [req-126f2b22-bb63-4690-9dac-20028ab212cb req-2a73ab53-95c2-4cc9-8365-c3d345727ad3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] No waiting events found dispatching network-vif-plugged-ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 20 10:01:44 np0005588920 nova_compute[226886]: 2026-01-20 15:01:44.869 226890 WARNING nova.compute.manager [req-126f2b22-bb63-4690-9dac-20028ab212cb req-2a73ab53-95c2-4cc9-8365-c3d345727ad3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Received unexpected event network-vif-plugged-ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7 for instance with vm_state active and task_state None.
Jan 20 10:01:45 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e333 e333: 3 total, 3 up, 3 in
Jan 20 10:01:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:45.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:01:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:46.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:01:46 np0005588920 nova_compute[226886]: 2026-01-20 15:01:46.438 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:01:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:47.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e333 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:01:47 np0005588920 nova_compute[226886]: 2026-01-20 15:01:47.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:01:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:48.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:48 np0005588920 nova_compute[226886]: 2026-01-20 15:01:48.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:01:48 np0005588920 nova_compute[226886]: 2026-01-20 15:01:48.727 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 20 10:01:48 np0005588920 nova_compute[226886]: 2026-01-20 15:01:48.727 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 20 10:01:48 np0005588920 nova_compute[226886]: 2026-01-20 15:01:48.964 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "refresh_cache-61ae2c61-01df-4ef1-8aa3-0527a43b1798" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 20 10:01:48 np0005588920 nova_compute[226886]: 2026-01-20 15:01:48.964 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquired lock "refresh_cache-61ae2c61-01df-4ef1-8aa3-0527a43b1798" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 20 10:01:48 np0005588920 nova_compute[226886]: 2026-01-20 15:01:48.964 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 20 10:01:48 np0005588920 nova_compute[226886]: 2026-01-20 15:01:48.964 226890 DEBUG nova.objects.instance [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 61ae2c61-01df-4ef1-8aa3-0527a43b1798 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 10:01:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:49.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:49 np0005588920 nova_compute[226886]: 2026-01-20 15:01:49.553 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:01:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:50.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:50 np0005588920 nova_compute[226886]: 2026-01-20 15:01:50.805 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Updating instance_info_cache with network_info: [{"id": "ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7", "address": "fa:16:3e:8c:76:cf", "network": {"id": "3aad5d71-9bbf-496d-805e-819d17c4343e", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1714826441-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "105e56abe3804424885c7aa8d1216d12", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee1c78ce-d0", "ovs_interfaceid": "ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 20 10:01:50 np0005588920 nova_compute[226886]: 2026-01-20 15:01:50.824 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Releasing lock "refresh_cache-61ae2c61-01df-4ef1-8aa3-0527a43b1798" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 20 10:01:50 np0005588920 nova_compute[226886]: 2026-01-20 15:01:50.824 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 20 10:01:50 np0005588920 nova_compute[226886]: 2026-01-20 15:01:50.825 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:01:50 np0005588920 nova_compute[226886]: 2026-01-20 15:01:50.863 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:01:50 np0005588920 nova_compute[226886]: 2026-01-20 15:01:50.863 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:01:50 np0005588920 nova_compute[226886]: 2026-01-20 15:01:50.864 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:01:50 np0005588920 nova_compute[226886]: 2026-01-20 15:01:50.864 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 20 10:01:50 np0005588920 nova_compute[226886]: 2026-01-20 15:01:50.864 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:01:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:51.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:51 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:01:51 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3482231016' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:01:51 np0005588920 nova_compute[226886]: 2026-01-20 15:01:51.298 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:01:51 np0005588920 nova_compute[226886]: 2026-01-20 15:01:51.381 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-0000008f as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 20 10:01:51 np0005588920 nova_compute[226886]: 2026-01-20 15:01:51.382 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-0000008f as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 20 10:01:51 np0005588920 nova_compute[226886]: 2026-01-20 15:01:51.440 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:01:51 np0005588920 nova_compute[226886]: 2026-01-20 15:01:51.559 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 20 10:01:51 np0005588920 nova_compute[226886]: 2026-01-20 15:01:51.561 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4127MB free_disk=20.885765075683594GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 20 10:01:51 np0005588920 nova_compute[226886]: 2026-01-20 15:01:51.561 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:01:51 np0005588920 nova_compute[226886]: 2026-01-20 15:01:51.561 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:01:51 np0005588920 nova_compute[226886]: 2026-01-20 15:01:51.667 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance 61ae2c61-01df-4ef1-8aa3-0527a43b1798 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 20 10:01:51 np0005588920 nova_compute[226886]: 2026-01-20 15:01:51.667 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 20 10:01:51 np0005588920 nova_compute[226886]: 2026-01-20 15:01:51.667 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 20 10:01:51 np0005588920 nova_compute[226886]: 2026-01-20 15:01:51.707 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:01:51 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:01:51 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2888294495' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:01:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:52.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:01:52 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2513427491' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:01:52 np0005588920 nova_compute[226886]: 2026-01-20 15:01:52.234 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.527s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:01:52 np0005588920 nova_compute[226886]: 2026-01-20 15:01:52.240 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:01:52 np0005588920 nova_compute[226886]: 2026-01-20 15:01:52.262 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:01:52 np0005588920 nova_compute[226886]: 2026-01-20 15:01:52.284 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:01:52 np0005588920 nova_compute[226886]: 2026-01-20 15:01:52.284 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.723s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:01:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e333 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:01:52 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:52.728 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=47, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=46) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:01:52 np0005588920 nova_compute[226886]: 2026-01-20 15:01:52.729 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:52 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:52.729 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:01:53 np0005588920 nova_compute[226886]: 2026-01-20 15:01:53.184 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:01:53 np0005588920 nova_compute[226886]: 2026-01-20 15:01:53.185 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:01:53 np0005588920 nova_compute[226886]: 2026-01-20 15:01:53.185 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:01:53 np0005588920 nova_compute[226886]: 2026-01-20 15:01:53.185 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:01:53 np0005588920 nova_compute[226886]: 2026-01-20 15:01:53.185 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:01:53 np0005588920 nova_compute[226886]: 2026-01-20 15:01:53.185 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:01:53 np0005588920 nova_compute[226886]: 2026-01-20 15:01:53.185 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:01:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:53.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:54 np0005588920 nova_compute[226886]: 2026-01-20 15:01:54.033 226890 DEBUG oslo_concurrency.lockutils [None req-9e25e724-923a-422d-8eda-837ded4bdcc9 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Acquiring lock "61ae2c61-01df-4ef1-8aa3-0527a43b1798" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:01:54 np0005588920 nova_compute[226886]: 2026-01-20 15:01:54.034 226890 DEBUG oslo_concurrency.lockutils [None req-9e25e724-923a-422d-8eda-837ded4bdcc9 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Lock "61ae2c61-01df-4ef1-8aa3-0527a43b1798" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:01:54 np0005588920 nova_compute[226886]: 2026-01-20 15:01:54.034 226890 DEBUG oslo_concurrency.lockutils [None req-9e25e724-923a-422d-8eda-837ded4bdcc9 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Acquiring lock "61ae2c61-01df-4ef1-8aa3-0527a43b1798-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:01:54 np0005588920 nova_compute[226886]: 2026-01-20 15:01:54.034 226890 DEBUG oslo_concurrency.lockutils [None req-9e25e724-923a-422d-8eda-837ded4bdcc9 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Lock "61ae2c61-01df-4ef1-8aa3-0527a43b1798-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:01:54 np0005588920 nova_compute[226886]: 2026-01-20 15:01:54.034 226890 DEBUG oslo_concurrency.lockutils [None req-9e25e724-923a-422d-8eda-837ded4bdcc9 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Lock "61ae2c61-01df-4ef1-8aa3-0527a43b1798-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:01:54 np0005588920 nova_compute[226886]: 2026-01-20 15:01:54.036 226890 INFO nova.compute.manager [None req-9e25e724-923a-422d-8eda-837ded4bdcc9 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Terminating instance#033[00m
Jan 20 10:01:54 np0005588920 nova_compute[226886]: 2026-01-20 15:01:54.037 226890 DEBUG nova.compute.manager [None req-9e25e724-923a-422d-8eda-837ded4bdcc9 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:01:54 np0005588920 kernel: tapee1c78ce-d0 (unregistering): left promiscuous mode
Jan 20 10:01:54 np0005588920 NetworkManager[49076]: <info>  [1768921314.0827] device (tapee1c78ce-d0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:01:54 np0005588920 nova_compute[226886]: 2026-01-20 15:01:54.092 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:54 np0005588920 ovn_controller[133971]: 2026-01-20T15:01:54Z|00683|binding|INFO|Releasing lport ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7 from this chassis (sb_readonly=0)
Jan 20 10:01:54 np0005588920 ovn_controller[133971]: 2026-01-20T15:01:54Z|00684|binding|INFO|Setting lport ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7 down in Southbound
Jan 20 10:01:54 np0005588920 ovn_controller[133971]: 2026-01-20T15:01:54Z|00685|binding|INFO|Removing iface tapee1c78ce-d0 ovn-installed in OVS
Jan 20 10:01:54 np0005588920 nova_compute[226886]: 2026-01-20 15:01:54.094 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:54.101 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8c:76:cf 10.100.0.3'], port_security=['fa:16:3e:8c:76:cf 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '61ae2c61-01df-4ef1-8aa3-0527a43b1798', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3aad5d71-9bbf-496d-805e-819d17c4343e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '105e56abe3804424885c7aa8d1216d12', 'neutron:revision_number': '11', 'neutron:security_group_ids': '5c26cf5d-4215-4bd2-8a4b-3ad6a5f65504', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=298e3802-e88f-473c-a925-fb8c9f7cfd27, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:01:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:54.102 144128 INFO neutron.agent.ovn.metadata.agent [-] Port ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7 in datapath 3aad5d71-9bbf-496d-805e-819d17c4343e unbound from our chassis#033[00m
Jan 20 10:01:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:54.104 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3aad5d71-9bbf-496d-805e-819d17c4343e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:01:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:54.105 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5d823b4a-4547-419f-9ffb-0da9bbf0122c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:54.105 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e namespace which is not needed anymore#033[00m
Jan 20 10:01:54 np0005588920 nova_compute[226886]: 2026-01-20 15:01:54.118 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:54 np0005588920 systemd[1]: machine-qemu\x2d68\x2dinstance\x2d0000008f.scope: Deactivated successfully.
Jan 20 10:01:54 np0005588920 systemd[1]: machine-qemu\x2d68\x2dinstance\x2d0000008f.scope: Consumed 6.830s CPU time.
Jan 20 10:01:54 np0005588920 systemd-machined[196121]: Machine qemu-68-instance-0000008f terminated.
Jan 20 10:01:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:01:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:54.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:01:54 np0005588920 neutron-haproxy-ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e[281149]: [NOTICE]   (281153) : haproxy version is 2.8.14-c23fe91
Jan 20 10:01:54 np0005588920 neutron-haproxy-ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e[281149]: [NOTICE]   (281153) : path to executable is /usr/sbin/haproxy
Jan 20 10:01:54 np0005588920 neutron-haproxy-ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e[281149]: [WARNING]  (281153) : Exiting Master process...
Jan 20 10:01:54 np0005588920 neutron-haproxy-ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e[281149]: [WARNING]  (281153) : Exiting Master process...
Jan 20 10:01:54 np0005588920 neutron-haproxy-ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e[281149]: [ALERT]    (281153) : Current worker (281155) exited with code 143 (Terminated)
Jan 20 10:01:54 np0005588920 neutron-haproxy-ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e[281149]: [WARNING]  (281153) : All workers exited. Exiting... (0)
Jan 20 10:01:54 np0005588920 systemd[1]: libpod-32676c1e6492ef78087b361ba2284a73d69fcd4f8f8f370b0b7701ee42eeff81.scope: Deactivated successfully.
Jan 20 10:01:54 np0005588920 podman[281256]: 2026-01-20 15:01:54.228818853 +0000 UTC m=+0.044504291 container died 32676c1e6492ef78087b361ba2284a73d69fcd4f8f8f370b0b7701ee42eeff81 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 20 10:01:54 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-32676c1e6492ef78087b361ba2284a73d69fcd4f8f8f370b0b7701ee42eeff81-userdata-shm.mount: Deactivated successfully.
Jan 20 10:01:54 np0005588920 systemd[1]: var-lib-containers-storage-overlay-cf70729eb1c3f91959e090fac6269215bbe17670d4e80b50683f47548b3ced72-merged.mount: Deactivated successfully.
Jan 20 10:01:54 np0005588920 nova_compute[226886]: 2026-01-20 15:01:54.257 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:54 np0005588920 nova_compute[226886]: 2026-01-20 15:01:54.263 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:54 np0005588920 podman[281256]: 2026-01-20 15:01:54.265405558 +0000 UTC m=+0.081090986 container cleanup 32676c1e6492ef78087b361ba2284a73d69fcd4f8f8f370b0b7701ee42eeff81 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 20 10:01:54 np0005588920 nova_compute[226886]: 2026-01-20 15:01:54.271 226890 INFO nova.virt.libvirt.driver [-] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Instance destroyed successfully.#033[00m
Jan 20 10:01:54 np0005588920 nova_compute[226886]: 2026-01-20 15:01:54.272 226890 DEBUG nova.objects.instance [None req-9e25e724-923a-422d-8eda-837ded4bdcc9 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Lazy-loading 'resources' on Instance uuid 61ae2c61-01df-4ef1-8aa3-0527a43b1798 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:01:54 np0005588920 systemd[1]: libpod-conmon-32676c1e6492ef78087b361ba2284a73d69fcd4f8f8f370b0b7701ee42eeff81.scope: Deactivated successfully.
Jan 20 10:01:54 np0005588920 nova_compute[226886]: 2026-01-20 15:01:54.293 226890 DEBUG nova.virt.libvirt.vif [None req-9e25e724-923a-422d-8eda-837ded4bdcc9 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-20T14:58:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1098898119',display_name='tempest-ServersNegativeTestJSON-server-1098898119',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-1098898119',id=143,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:01:23Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='105e56abe3804424885c7aa8d1216d12',ramdisk_id='',reservation_id='r-yf5960ix',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_
vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestJSON-1233513591',owner_user_name='tempest-ServersNegativeTestJSON-1233513591-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:01:36Z,user_data=None,user_id='d77d3db3cf924683a608d10efefcd156',uuid=61ae2c61-01df-4ef1-8aa3-0527a43b1798,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7", "address": "fa:16:3e:8c:76:cf", "network": {"id": "3aad5d71-9bbf-496d-805e-819d17c4343e", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1714826441-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "105e56abe3804424885c7aa8d1216d12", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee1c78ce-d0", "ovs_interfaceid": "ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:01:54 np0005588920 nova_compute[226886]: 2026-01-20 15:01:54.293 226890 DEBUG nova.network.os_vif_util [None req-9e25e724-923a-422d-8eda-837ded4bdcc9 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Converting VIF {"id": "ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7", "address": "fa:16:3e:8c:76:cf", "network": {"id": "3aad5d71-9bbf-496d-805e-819d17c4343e", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1714826441-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "105e56abe3804424885c7aa8d1216d12", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee1c78ce-d0", "ovs_interfaceid": "ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:01:54 np0005588920 nova_compute[226886]: 2026-01-20 15:01:54.294 226890 DEBUG nova.network.os_vif_util [None req-9e25e724-923a-422d-8eda-837ded4bdcc9 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8c:76:cf,bridge_name='br-int',has_traffic_filtering=True,id=ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7,network=Network(3aad5d71-9bbf-496d-805e-819d17c4343e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapee1c78ce-d0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:01:54 np0005588920 nova_compute[226886]: 2026-01-20 15:01:54.295 226890 DEBUG os_vif [None req-9e25e724-923a-422d-8eda-837ded4bdcc9 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8c:76:cf,bridge_name='br-int',has_traffic_filtering=True,id=ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7,network=Network(3aad5d71-9bbf-496d-805e-819d17c4343e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapee1c78ce-d0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:01:54 np0005588920 nova_compute[226886]: 2026-01-20 15:01:54.296 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:54 np0005588920 nova_compute[226886]: 2026-01-20 15:01:54.296 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee1c78ce-d0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:01:54 np0005588920 nova_compute[226886]: 2026-01-20 15:01:54.299 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:54 np0005588920 nova_compute[226886]: 2026-01-20 15:01:54.300 226890 INFO os_vif [None req-9e25e724-923a-422d-8eda-837ded4bdcc9 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8c:76:cf,bridge_name='br-int',has_traffic_filtering=True,id=ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7,network=Network(3aad5d71-9bbf-496d-805e-819d17c4343e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapee1c78ce-d0')#033[00m
Jan 20 10:01:54 np0005588920 podman[281294]: 2026-01-20 15:01:54.332830293 +0000 UTC m=+0.041391363 container remove 32676c1e6492ef78087b361ba2284a73d69fcd4f8f8f370b0b7701ee42eeff81 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:01:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:54.338 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[85d6a5bc-17dc-4ce9-ace9-c93298d36062]: (4, ('Tue Jan 20 03:01:54 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e (32676c1e6492ef78087b361ba2284a73d69fcd4f8f8f370b0b7701ee42eeff81)\n32676c1e6492ef78087b361ba2284a73d69fcd4f8f8f370b0b7701ee42eeff81\nTue Jan 20 03:01:54 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e (32676c1e6492ef78087b361ba2284a73d69fcd4f8f8f370b0b7701ee42eeff81)\n32676c1e6492ef78087b361ba2284a73d69fcd4f8f8f370b0b7701ee42eeff81\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:54.339 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[30ed391d-218e-4a4d-b563-240f2618ce2e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:54.340 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3aad5d71-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:01:54 np0005588920 nova_compute[226886]: 2026-01-20 15:01:54.342 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:54 np0005588920 kernel: tap3aad5d71-90: left promiscuous mode
Jan 20 10:01:54 np0005588920 nova_compute[226886]: 2026-01-20 15:01:54.356 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:54.358 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[bda45fe3-922d-40eb-a2fd-fa1d5ab7db5a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:54.379 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[abead289-1609-43e7-85eb-d4c0f81dfba1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:54.380 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[415acce5-0c0c-48b3-824c-366c2f3fa95e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:54.397 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ea3da44d-b0ee-4e38-a6fa-01e9142cf1a9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 633654, 'reachable_time': 31506, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 281324, 'error': None, 'target': 'ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:54 np0005588920 systemd[1]: run-netns-ovnmeta\x2d3aad5d71\x2d9bbf\x2d496d\x2d805e\x2d819d17c4343e.mount: Deactivated successfully.
Jan 20 10:01:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:54.400 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3aad5d71-9bbf-496d-805e-819d17c4343e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:01:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:54.400 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[753811ab-200e-4be0-a89c-16fec1cb9d73]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:54 np0005588920 nova_compute[226886]: 2026-01-20 15:01:54.838 226890 DEBUG nova.compute.manager [req-74b09329-5bb4-4ea3-98e0-dfd8311d9345 req-afd7ad17-36f5-4d0a-b4a3-d0ac36dc345e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Received event network-vif-unplugged-ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:01:54 np0005588920 nova_compute[226886]: 2026-01-20 15:01:54.839 226890 DEBUG oslo_concurrency.lockutils [req-74b09329-5bb4-4ea3-98e0-dfd8311d9345 req-afd7ad17-36f5-4d0a-b4a3-d0ac36dc345e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "61ae2c61-01df-4ef1-8aa3-0527a43b1798-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:01:54 np0005588920 nova_compute[226886]: 2026-01-20 15:01:54.840 226890 DEBUG oslo_concurrency.lockutils [req-74b09329-5bb4-4ea3-98e0-dfd8311d9345 req-afd7ad17-36f5-4d0a-b4a3-d0ac36dc345e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "61ae2c61-01df-4ef1-8aa3-0527a43b1798-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:01:54 np0005588920 nova_compute[226886]: 2026-01-20 15:01:54.840 226890 DEBUG oslo_concurrency.lockutils [req-74b09329-5bb4-4ea3-98e0-dfd8311d9345 req-afd7ad17-36f5-4d0a-b4a3-d0ac36dc345e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "61ae2c61-01df-4ef1-8aa3-0527a43b1798-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:01:54 np0005588920 nova_compute[226886]: 2026-01-20 15:01:54.841 226890 DEBUG nova.compute.manager [req-74b09329-5bb4-4ea3-98e0-dfd8311d9345 req-afd7ad17-36f5-4d0a-b4a3-d0ac36dc345e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] No waiting events found dispatching network-vif-unplugged-ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:01:54 np0005588920 nova_compute[226886]: 2026-01-20 15:01:54.841 226890 DEBUG nova.compute.manager [req-74b09329-5bb4-4ea3-98e0-dfd8311d9345 req-afd7ad17-36f5-4d0a-b4a3-d0ac36dc345e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Received event network-vif-unplugged-ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:01:55 np0005588920 nova_compute[226886]: 2026-01-20 15:01:55.064 226890 INFO nova.virt.libvirt.driver [None req-9e25e724-923a-422d-8eda-837ded4bdcc9 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Deleting instance files /var/lib/nova/instances/61ae2c61-01df-4ef1-8aa3-0527a43b1798_del#033[00m
Jan 20 10:01:55 np0005588920 nova_compute[226886]: 2026-01-20 15:01:55.065 226890 INFO nova.virt.libvirt.driver [None req-9e25e724-923a-422d-8eda-837ded4bdcc9 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Deletion of /var/lib/nova/instances/61ae2c61-01df-4ef1-8aa3-0527a43b1798_del complete#033[00m
Jan 20 10:01:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:01:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:55.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:01:55 np0005588920 nova_compute[226886]: 2026-01-20 15:01:55.249 226890 INFO nova.compute.manager [None req-9e25e724-923a-422d-8eda-837ded4bdcc9 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Took 1.21 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:01:55 np0005588920 nova_compute[226886]: 2026-01-20 15:01:55.250 226890 DEBUG oslo.service.loopingcall [None req-9e25e724-923a-422d-8eda-837ded4bdcc9 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:01:55 np0005588920 nova_compute[226886]: 2026-01-20 15:01:55.251 226890 DEBUG nova.compute.manager [-] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:01:55 np0005588920 nova_compute[226886]: 2026-01-20 15:01:55.251 226890 DEBUG nova.network.neutron [-] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:01:55 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:01:55.731 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '47'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:01:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:01:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:56.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:01:56 np0005588920 nova_compute[226886]: 2026-01-20 15:01:56.442 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:56 np0005588920 nova_compute[226886]: 2026-01-20 15:01:56.959 226890 DEBUG nova.compute.manager [req-d0b83272-6fa3-493a-a79a-4f6e9b002fa3 req-a401c2a8-1e38-463b-98a2-bd5933803f2c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Received event network-vif-plugged-ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:01:56 np0005588920 nova_compute[226886]: 2026-01-20 15:01:56.959 226890 DEBUG oslo_concurrency.lockutils [req-d0b83272-6fa3-493a-a79a-4f6e9b002fa3 req-a401c2a8-1e38-463b-98a2-bd5933803f2c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "61ae2c61-01df-4ef1-8aa3-0527a43b1798-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:01:56 np0005588920 nova_compute[226886]: 2026-01-20 15:01:56.959 226890 DEBUG oslo_concurrency.lockutils [req-d0b83272-6fa3-493a-a79a-4f6e9b002fa3 req-a401c2a8-1e38-463b-98a2-bd5933803f2c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "61ae2c61-01df-4ef1-8aa3-0527a43b1798-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:01:56 np0005588920 nova_compute[226886]: 2026-01-20 15:01:56.960 226890 DEBUG oslo_concurrency.lockutils [req-d0b83272-6fa3-493a-a79a-4f6e9b002fa3 req-a401c2a8-1e38-463b-98a2-bd5933803f2c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "61ae2c61-01df-4ef1-8aa3-0527a43b1798-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:01:56 np0005588920 nova_compute[226886]: 2026-01-20 15:01:56.960 226890 DEBUG nova.compute.manager [req-d0b83272-6fa3-493a-a79a-4f6e9b002fa3 req-a401c2a8-1e38-463b-98a2-bd5933803f2c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] No waiting events found dispatching network-vif-plugged-ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:01:56 np0005588920 nova_compute[226886]: 2026-01-20 15:01:56.960 226890 WARNING nova.compute.manager [req-d0b83272-6fa3-493a-a79a-4f6e9b002fa3 req-a401c2a8-1e38-463b-98a2-bd5933803f2c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Received unexpected event network-vif-plugged-ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 10:01:56 np0005588920 nova_compute[226886]: 2026-01-20 15:01:56.992 226890 DEBUG nova.network.neutron [-] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:01:57 np0005588920 nova_compute[226886]: 2026-01-20 15:01:57.014 226890 INFO nova.compute.manager [-] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Took 1.76 seconds to deallocate network for instance.#033[00m
Jan 20 10:01:57 np0005588920 nova_compute[226886]: 2026-01-20 15:01:57.076 226890 DEBUG oslo_concurrency.lockutils [None req-9e25e724-923a-422d-8eda-837ded4bdcc9 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:01:57 np0005588920 nova_compute[226886]: 2026-01-20 15:01:57.076 226890 DEBUG oslo_concurrency.lockutils [None req-9e25e724-923a-422d-8eda-837ded4bdcc9 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:01:57 np0005588920 nova_compute[226886]: 2026-01-20 15:01:57.091 226890 DEBUG nova.compute.manager [req-0c2d4ebc-9805-46ee-ad06-b5f1a5ead999 req-7626d102-a146-49ce-99fc-733eb9ac2011 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Received event network-vif-deleted-ee1c78ce-d0fd-4b6b-8a7c-e3aff97e74d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:01:57 np0005588920 nova_compute[226886]: 2026-01-20 15:01:57.148 226890 DEBUG oslo_concurrency.processutils [None req-9e25e724-923a-422d-8eda-837ded4bdcc9 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:01:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:57.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:01:57 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2765111203' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:01:57 np0005588920 nova_compute[226886]: 2026-01-20 15:01:57.571 226890 DEBUG oslo_concurrency.processutils [None req-9e25e724-923a-422d-8eda-837ded4bdcc9 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:01:57 np0005588920 nova_compute[226886]: 2026-01-20 15:01:57.579 226890 DEBUG nova.compute.provider_tree [None req-9e25e724-923a-422d-8eda-837ded4bdcc9 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:01:57 np0005588920 nova_compute[226886]: 2026-01-20 15:01:57.600 226890 DEBUG nova.scheduler.client.report [None req-9e25e724-923a-422d-8eda-837ded4bdcc9 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:01:57 np0005588920 nova_compute[226886]: 2026-01-20 15:01:57.623 226890 DEBUG oslo_concurrency.lockutils [None req-9e25e724-923a-422d-8eda-837ded4bdcc9 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.547s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:01:57 np0005588920 nova_compute[226886]: 2026-01-20 15:01:57.653 226890 INFO nova.scheduler.client.report [None req-9e25e724-923a-422d-8eda-837ded4bdcc9 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Deleted allocations for instance 61ae2c61-01df-4ef1-8aa3-0527a43b1798#033[00m
Jan 20 10:01:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e333 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:01:57 np0005588920 nova_compute[226886]: 2026-01-20 15:01:57.735 226890 DEBUG oslo_concurrency.lockutils [None req-9e25e724-923a-422d-8eda-837ded4bdcc9 d77d3db3cf924683a608d10efefcd156 105e56abe3804424885c7aa8d1216d12 - - default default] Lock "61ae2c61-01df-4ef1-8aa3-0527a43b1798" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.701s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:01:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:58.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:01:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:59.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:59 np0005588920 nova_compute[226886]: 2026-01-20 15:01:59.300 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:00.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:01.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:01 np0005588920 nova_compute[226886]: 2026-01-20 15:02:01.445 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:02.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e333 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:02:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:03.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:04 np0005588920 podman[281348]: 2026-01-20 15:02:04.040386898 +0000 UTC m=+0.126344898 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, 
container_name=ovn_controller, managed_by=edpm_ansible)
Jan 20 10:02:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:04.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:04 np0005588920 nova_compute[226886]: 2026-01-20 15:02:04.302 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:04 np0005588920 nova_compute[226886]: 2026-01-20 15:02:04.530 226890 DEBUG oslo_concurrency.lockutils [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Acquiring lock "18f2cf64-c2d5-4f0b-a16e-48cf2f558c10" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:02:04 np0005588920 nova_compute[226886]: 2026-01-20 15:02:04.531 226890 DEBUG oslo_concurrency.lockutils [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Lock "18f2cf64-c2d5-4f0b-a16e-48cf2f558c10" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:02:04 np0005588920 nova_compute[226886]: 2026-01-20 15:02:04.564 226890 DEBUG nova.compute.manager [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 10:02:04 np0005588920 nova_compute[226886]: 2026-01-20 15:02:04.572 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:04 np0005588920 nova_compute[226886]: 2026-01-20 15:02:04.643 226890 DEBUG oslo_concurrency.lockutils [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:02:04 np0005588920 nova_compute[226886]: 2026-01-20 15:02:04.644 226890 DEBUG oslo_concurrency.lockutils [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:02:04 np0005588920 nova_compute[226886]: 2026-01-20 15:02:04.651 226890 DEBUG nova.virt.hardware [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 10:02:04 np0005588920 nova_compute[226886]: 2026-01-20 15:02:04.651 226890 INFO nova.compute.claims [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 10:02:04 np0005588920 nova_compute[226886]: 2026-01-20 15:02:04.784 226890 DEBUG oslo_concurrency.processutils [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:02:05 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:02:05 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1790087133' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:02:05 np0005588920 nova_compute[226886]: 2026-01-20 15:02:05.224 226890 DEBUG oslo_concurrency.processutils [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:02:05 np0005588920 nova_compute[226886]: 2026-01-20 15:02:05.231 226890 DEBUG nova.compute.provider_tree [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:02:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:02:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:05.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:02:05 np0005588920 nova_compute[226886]: 2026-01-20 15:02:05.252 226890 DEBUG nova.scheduler.client.report [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:02:05 np0005588920 nova_compute[226886]: 2026-01-20 15:02:05.284 226890 DEBUG oslo_concurrency.lockutils [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.640s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:02:05 np0005588920 nova_compute[226886]: 2026-01-20 15:02:05.285 226890 DEBUG nova.compute.manager [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 10:02:05 np0005588920 nova_compute[226886]: 2026-01-20 15:02:05.370 226890 DEBUG nova.compute.manager [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 10:02:05 np0005588920 nova_compute[226886]: 2026-01-20 15:02:05.370 226890 DEBUG nova.network.neutron [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 10:02:05 np0005588920 nova_compute[226886]: 2026-01-20 15:02:05.389 226890 INFO nova.virt.libvirt.driver [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 10:02:05 np0005588920 nova_compute[226886]: 2026-01-20 15:02:05.415 226890 DEBUG nova.compute.manager [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 10:02:05 np0005588920 nova_compute[226886]: 2026-01-20 15:02:05.518 226890 DEBUG nova.compute.manager [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 10:02:05 np0005588920 nova_compute[226886]: 2026-01-20 15:02:05.519 226890 DEBUG nova.virt.libvirt.driver [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 10:02:05 np0005588920 nova_compute[226886]: 2026-01-20 15:02:05.520 226890 INFO nova.virt.libvirt.driver [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Creating image(s)#033[00m
Jan 20 10:02:05 np0005588920 nova_compute[226886]: 2026-01-20 15:02:05.542 226890 DEBUG nova.storage.rbd_utils [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] rbd image 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:02:05 np0005588920 nova_compute[226886]: 2026-01-20 15:02:05.570 226890 DEBUG nova.storage.rbd_utils [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] rbd image 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:02:05 np0005588920 nova_compute[226886]: 2026-01-20 15:02:05.591 226890 DEBUG nova.storage.rbd_utils [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] rbd image 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:02:05 np0005588920 nova_compute[226886]: 2026-01-20 15:02:05.594 226890 DEBUG oslo_concurrency.processutils [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:02:05 np0005588920 nova_compute[226886]: 2026-01-20 15:02:05.660 226890 DEBUG oslo_concurrency.processutils [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:02:05 np0005588920 nova_compute[226886]: 2026-01-20 15:02:05.660 226890 DEBUG oslo_concurrency.lockutils [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:02:05 np0005588920 nova_compute[226886]: 2026-01-20 15:02:05.661 226890 DEBUG oslo_concurrency.lockutils [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:02:05 np0005588920 nova_compute[226886]: 2026-01-20 15:02:05.661 226890 DEBUG oslo_concurrency.lockutils [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:02:05 np0005588920 nova_compute[226886]: 2026-01-20 15:02:05.683 226890 DEBUG nova.storage.rbd_utils [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] rbd image 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:02:05 np0005588920 nova_compute[226886]: 2026-01-20 15:02:05.686 226890 DEBUG oslo_concurrency.processutils [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:02:05 np0005588920 nova_compute[226886]: 2026-01-20 15:02:05.982 226890 DEBUG oslo_concurrency.processutils [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.296s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:02:06 np0005588920 nova_compute[226886]: 2026-01-20 15:02:06.061 226890 DEBUG nova.storage.rbd_utils [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] resizing rbd image 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 10:02:06 np0005588920 nova_compute[226886]: 2026-01-20 15:02:06.157 226890 DEBUG nova.objects.instance [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Lazy-loading 'migration_context' on Instance uuid 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:02:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:06.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:06 np0005588920 nova_compute[226886]: 2026-01-20 15:02:06.194 226890 DEBUG nova.virt.libvirt.driver [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 10:02:06 np0005588920 nova_compute[226886]: 2026-01-20 15:02:06.195 226890 DEBUG nova.virt.libvirt.driver [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Ensure instance console log exists: /var/lib/nova/instances/18f2cf64-c2d5-4f0b-a16e-48cf2f558c10/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:02:06 np0005588920 nova_compute[226886]: 2026-01-20 15:02:06.195 226890 DEBUG oslo_concurrency.lockutils [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:02:06 np0005588920 nova_compute[226886]: 2026-01-20 15:02:06.196 226890 DEBUG oslo_concurrency.lockutils [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:02:06 np0005588920 nova_compute[226886]: 2026-01-20 15:02:06.196 226890 DEBUG oslo_concurrency.lockutils [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:02:06 np0005588920 nova_compute[226886]: 2026-01-20 15:02:06.448 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:06 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #54. Immutable memtables: 10.
Jan 20 10:02:07 np0005588920 nova_compute[226886]: 2026-01-20 15:02:07.056 226890 DEBUG nova.network.neutron [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Successfully created port: 98da0555-6cf7-43cd-8528-cb59391f5674 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 10:02:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:07.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e333 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:02:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:08.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:08 np0005588920 nova_compute[226886]: 2026-01-20 15:02:08.324 226890 DEBUG nova.network.neutron [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Successfully updated port: 98da0555-6cf7-43cd-8528-cb59391f5674 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 10:02:08 np0005588920 nova_compute[226886]: 2026-01-20 15:02:08.355 226890 DEBUG oslo_concurrency.lockutils [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Acquiring lock "refresh_cache-18f2cf64-c2d5-4f0b-a16e-48cf2f558c10" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:02:08 np0005588920 nova_compute[226886]: 2026-01-20 15:02:08.355 226890 DEBUG oslo_concurrency.lockutils [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Acquired lock "refresh_cache-18f2cf64-c2d5-4f0b-a16e-48cf2f558c10" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:02:08 np0005588920 nova_compute[226886]: 2026-01-20 15:02:08.356 226890 DEBUG nova.network.neutron [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:02:08 np0005588920 nova_compute[226886]: 2026-01-20 15:02:08.485 226890 DEBUG nova.compute.manager [req-4c87b232-3539-469f-a9bf-ea90f3cc027b req-846a4539-98ea-4626-a236-9e1b6a41ae57 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Received event network-changed-98da0555-6cf7-43cd-8528-cb59391f5674 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:02:08 np0005588920 nova_compute[226886]: 2026-01-20 15:02:08.485 226890 DEBUG nova.compute.manager [req-4c87b232-3539-469f-a9bf-ea90f3cc027b req-846a4539-98ea-4626-a236-9e1b6a41ae57 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Refreshing instance network info cache due to event network-changed-98da0555-6cf7-43cd-8528-cb59391f5674. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:02:08 np0005588920 nova_compute[226886]: 2026-01-20 15:02:08.486 226890 DEBUG oslo_concurrency.lockutils [req-4c87b232-3539-469f-a9bf-ea90f3cc027b req-846a4539-98ea-4626-a236-9e1b6a41ae57 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-18f2cf64-c2d5-4f0b-a16e-48cf2f558c10" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:02:08 np0005588920 nova_compute[226886]: 2026-01-20 15:02:08.569 226890 DEBUG nova.network.neutron [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:02:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:02:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:09.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:02:09 np0005588920 nova_compute[226886]: 2026-01-20 15:02:09.268 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768921314.2675498, 61ae2c61-01df-4ef1-8aa3-0527a43b1798 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:02:09 np0005588920 nova_compute[226886]: 2026-01-20 15:02:09.269 226890 INFO nova.compute.manager [-] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:02:09 np0005588920 nova_compute[226886]: 2026-01-20 15:02:09.293 226890 DEBUG nova.compute.manager [None req-16d01c61-a4b9-4720-b524-1d8c95b3a8a4 - - - - - -] [instance: 61ae2c61-01df-4ef1-8aa3-0527a43b1798] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:02:09 np0005588920 nova_compute[226886]: 2026-01-20 15:02:09.305 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:10.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:10 np0005588920 nova_compute[226886]: 2026-01-20 15:02:10.284 226890 DEBUG nova.network.neutron [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Updating instance_info_cache with network_info: [{"id": "98da0555-6cf7-43cd-8528-cb59391f5674", "address": "fa:16:3e:f3:d9:29", "network": {"id": "0296a21f-6ec4-43a7-8731-1d3692a5de4a", "bridge": "br-int", "label": "tempest-TestServerMultinode-1878354210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "908b5ba217ab458e8c9aa0e5a471c194", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap98da0555-6c", "ovs_interfaceid": "98da0555-6cf7-43cd-8528-cb59391f5674", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:02:10 np0005588920 nova_compute[226886]: 2026-01-20 15:02:10.354 226890 DEBUG oslo_concurrency.lockutils [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Releasing lock "refresh_cache-18f2cf64-c2d5-4f0b-a16e-48cf2f558c10" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:02:10 np0005588920 nova_compute[226886]: 2026-01-20 15:02:10.354 226890 DEBUG nova.compute.manager [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Instance network_info: |[{"id": "98da0555-6cf7-43cd-8528-cb59391f5674", "address": "fa:16:3e:f3:d9:29", "network": {"id": "0296a21f-6ec4-43a7-8731-1d3692a5de4a", "bridge": "br-int", "label": "tempest-TestServerMultinode-1878354210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "908b5ba217ab458e8c9aa0e5a471c194", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap98da0555-6c", "ovs_interfaceid": "98da0555-6cf7-43cd-8528-cb59391f5674", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 10:02:10 np0005588920 nova_compute[226886]: 2026-01-20 15:02:10.355 226890 DEBUG oslo_concurrency.lockutils [req-4c87b232-3539-469f-a9bf-ea90f3cc027b req-846a4539-98ea-4626-a236-9e1b6a41ae57 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-18f2cf64-c2d5-4f0b-a16e-48cf2f558c10" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:02:10 np0005588920 nova_compute[226886]: 2026-01-20 15:02:10.355 226890 DEBUG nova.network.neutron [req-4c87b232-3539-469f-a9bf-ea90f3cc027b req-846a4539-98ea-4626-a236-9e1b6a41ae57 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Refreshing network info cache for port 98da0555-6cf7-43cd-8528-cb59391f5674 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:02:10 np0005588920 nova_compute[226886]: 2026-01-20 15:02:10.358 226890 DEBUG nova.virt.libvirt.driver [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Start _get_guest_xml network_info=[{"id": "98da0555-6cf7-43cd-8528-cb59391f5674", "address": "fa:16:3e:f3:d9:29", "network": {"id": "0296a21f-6ec4-43a7-8731-1d3692a5de4a", "bridge": "br-int", "label": "tempest-TestServerMultinode-1878354210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "908b5ba217ab458e8c9aa0e5a471c194", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap98da0555-6c", "ovs_interfaceid": "98da0555-6cf7-43cd-8528-cb59391f5674", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:02:10 np0005588920 nova_compute[226886]: 2026-01-20 15:02:10.363 226890 WARNING nova.virt.libvirt.driver [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:02:10 np0005588920 nova_compute[226886]: 2026-01-20 15:02:10.375 226890 DEBUG nova.virt.libvirt.host [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:02:10 np0005588920 nova_compute[226886]: 2026-01-20 15:02:10.377 226890 DEBUG nova.virt.libvirt.host [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:02:10 np0005588920 nova_compute[226886]: 2026-01-20 15:02:10.388 226890 DEBUG nova.virt.libvirt.host [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:02:10 np0005588920 nova_compute[226886]: 2026-01-20 15:02:10.389 226890 DEBUG nova.virt.libvirt.host [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:02:10 np0005588920 nova_compute[226886]: 2026-01-20 15:02:10.390 226890 DEBUG nova.virt.libvirt.driver [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:02:10 np0005588920 nova_compute[226886]: 2026-01-20 15:02:10.390 226890 DEBUG nova.virt.hardware [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:02:10 np0005588920 nova_compute[226886]: 2026-01-20 15:02:10.391 226890 DEBUG nova.virt.hardware [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:02:10 np0005588920 nova_compute[226886]: 2026-01-20 15:02:10.391 226890 DEBUG nova.virt.hardware [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:02:10 np0005588920 nova_compute[226886]: 2026-01-20 15:02:10.392 226890 DEBUG nova.virt.hardware [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:02:10 np0005588920 nova_compute[226886]: 2026-01-20 15:02:10.392 226890 DEBUG nova.virt.hardware [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:02:10 np0005588920 nova_compute[226886]: 2026-01-20 15:02:10.392 226890 DEBUG nova.virt.hardware [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:02:10 np0005588920 nova_compute[226886]: 2026-01-20 15:02:10.393 226890 DEBUG nova.virt.hardware [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:02:10 np0005588920 nova_compute[226886]: 2026-01-20 15:02:10.393 226890 DEBUG nova.virt.hardware [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:02:10 np0005588920 nova_compute[226886]: 2026-01-20 15:02:10.393 226890 DEBUG nova.virt.hardware [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:02:10 np0005588920 nova_compute[226886]: 2026-01-20 15:02:10.393 226890 DEBUG nova.virt.hardware [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:02:10 np0005588920 nova_compute[226886]: 2026-01-20 15:02:10.394 226890 DEBUG nova.virt.hardware [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:02:10 np0005588920 nova_compute[226886]: 2026-01-20 15:02:10.396 226890 DEBUG oslo_concurrency.processutils [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:02:10 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:02:10 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1448665557' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:02:10 np0005588920 nova_compute[226886]: 2026-01-20 15:02:10.862 226890 DEBUG oslo_concurrency.processutils [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:02:10 np0005588920 nova_compute[226886]: 2026-01-20 15:02:10.891 226890 DEBUG nova.storage.rbd_utils [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] rbd image 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:02:10 np0005588920 nova_compute[226886]: 2026-01-20 15:02:10.895 226890 DEBUG oslo_concurrency.processutils [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:02:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:11.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:11 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:02:11 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3771773152' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:02:11 np0005588920 nova_compute[226886]: 2026-01-20 15:02:11.348 226890 DEBUG oslo_concurrency.processutils [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:02:11 np0005588920 nova_compute[226886]: 2026-01-20 15:02:11.350 226890 DEBUG nova.virt.libvirt.vif [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:02:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-995385633',display_name='tempest-TestServerMultinode-server-995385633',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testservermultinode-server-995385633',id=153,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='654b3ce7b3644fc58f8dc9f60529320b',ramdisk_id='',reservation_id='r-uzo0s30g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-1071973011',owner_user_name='tempest-TestServerMultinode-107
1973011-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:02:05Z,user_data=None,user_id='158563a99d4a420890aaa00b05c8bb57',uuid=18f2cf64-c2d5-4f0b-a16e-48cf2f558c10,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "98da0555-6cf7-43cd-8528-cb59391f5674", "address": "fa:16:3e:f3:d9:29", "network": {"id": "0296a21f-6ec4-43a7-8731-1d3692a5de4a", "bridge": "br-int", "label": "tempest-TestServerMultinode-1878354210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "908b5ba217ab458e8c9aa0e5a471c194", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap98da0555-6c", "ovs_interfaceid": "98da0555-6cf7-43cd-8528-cb59391f5674", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:02:11 np0005588920 nova_compute[226886]: 2026-01-20 15:02:11.350 226890 DEBUG nova.network.os_vif_util [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Converting VIF {"id": "98da0555-6cf7-43cd-8528-cb59391f5674", "address": "fa:16:3e:f3:d9:29", "network": {"id": "0296a21f-6ec4-43a7-8731-1d3692a5de4a", "bridge": "br-int", "label": "tempest-TestServerMultinode-1878354210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "908b5ba217ab458e8c9aa0e5a471c194", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap98da0555-6c", "ovs_interfaceid": "98da0555-6cf7-43cd-8528-cb59391f5674", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:02:11 np0005588920 nova_compute[226886]: 2026-01-20 15:02:11.351 226890 DEBUG nova.network.os_vif_util [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f3:d9:29,bridge_name='br-int',has_traffic_filtering=True,id=98da0555-6cf7-43cd-8528-cb59391f5674,network=Network(0296a21f-6ec4-43a7-8731-1d3692a5de4a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap98da0555-6c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:02:11 np0005588920 nova_compute[226886]: 2026-01-20 15:02:11.353 226890 DEBUG nova.objects.instance [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Lazy-loading 'pci_devices' on Instance uuid 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:02:11 np0005588920 nova_compute[226886]: 2026-01-20 15:02:11.370 226890 DEBUG nova.virt.libvirt.driver [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:02:11 np0005588920 nova_compute[226886]:  <uuid>18f2cf64-c2d5-4f0b-a16e-48cf2f558c10</uuid>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:  <name>instance-00000099</name>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:02:11 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:      <nova:name>tempest-TestServerMultinode-server-995385633</nova:name>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 15:02:10</nova:creationTime>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 10:02:11 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:        <nova:user uuid="158563a99d4a420890aaa00b05c8bb57">tempest-TestServerMultinode-1071973011-project-admin</nova:user>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:        <nova:project uuid="654b3ce7b3644fc58f8dc9f60529320b">tempest-TestServerMultinode-1071973011</nova:project>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:        <nova:port uuid="98da0555-6cf7-43cd-8528-cb59391f5674">
Jan 20 10:02:11 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    <system>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:      <entry name="serial">18f2cf64-c2d5-4f0b-a16e-48cf2f558c10</entry>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:      <entry name="uuid">18f2cf64-c2d5-4f0b-a16e-48cf2f558c10</entry>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    </system>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:  <os>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:  </os>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:  <features>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:  </features>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:  </clock>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:  <devices>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 10:02:11 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/18f2cf64-c2d5-4f0b-a16e-48cf2f558c10_disk">
Jan 20 10:02:11 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:02:11 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 10:02:11 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/18f2cf64-c2d5-4f0b-a16e-48cf2f558c10_disk.config">
Jan 20 10:02:11 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:02:11 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 10:02:11 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:f3:d9:29"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:      <target dev="tap98da0555-6c"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    </interface>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 10:02:11 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/18f2cf64-c2d5-4f0b-a16e-48cf2f558c10/console.log" append="off"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    </serial>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    <video>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    </video>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 10:02:11 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    </rng>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 10:02:11 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 10:02:11 np0005588920 nova_compute[226886]:  </devices>
Jan 20 10:02:11 np0005588920 nova_compute[226886]: </domain>
Jan 20 10:02:11 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:02:11 np0005588920 nova_compute[226886]: 2026-01-20 15:02:11.371 226890 DEBUG nova.compute.manager [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Preparing to wait for external event network-vif-plugged-98da0555-6cf7-43cd-8528-cb59391f5674 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:02:11 np0005588920 nova_compute[226886]: 2026-01-20 15:02:11.372 226890 DEBUG oslo_concurrency.lockutils [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Acquiring lock "18f2cf64-c2d5-4f0b-a16e-48cf2f558c10-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:02:11 np0005588920 nova_compute[226886]: 2026-01-20 15:02:11.372 226890 DEBUG oslo_concurrency.lockutils [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Lock "18f2cf64-c2d5-4f0b-a16e-48cf2f558c10-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:02:11 np0005588920 nova_compute[226886]: 2026-01-20 15:02:11.372 226890 DEBUG oslo_concurrency.lockutils [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Lock "18f2cf64-c2d5-4f0b-a16e-48cf2f558c10-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:02:11 np0005588920 nova_compute[226886]: 2026-01-20 15:02:11.373 226890 DEBUG nova.virt.libvirt.vif [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:02:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-995385633',display_name='tempest-TestServerMultinode-server-995385633',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testservermultinode-server-995385633',id=153,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='654b3ce7b3644fc58f8dc9f60529320b',ramdisk_id='',reservation_id='r-uzo0s30g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-1071973011',owner_user_name='tempest-TestServerMul
tinode-1071973011-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:02:05Z,user_data=None,user_id='158563a99d4a420890aaa00b05c8bb57',uuid=18f2cf64-c2d5-4f0b-a16e-48cf2f558c10,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "98da0555-6cf7-43cd-8528-cb59391f5674", "address": "fa:16:3e:f3:d9:29", "network": {"id": "0296a21f-6ec4-43a7-8731-1d3692a5de4a", "bridge": "br-int", "label": "tempest-TestServerMultinode-1878354210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "908b5ba217ab458e8c9aa0e5a471c194", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap98da0555-6c", "ovs_interfaceid": "98da0555-6cf7-43cd-8528-cb59391f5674", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:02:11 np0005588920 nova_compute[226886]: 2026-01-20 15:02:11.373 226890 DEBUG nova.network.os_vif_util [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Converting VIF {"id": "98da0555-6cf7-43cd-8528-cb59391f5674", "address": "fa:16:3e:f3:d9:29", "network": {"id": "0296a21f-6ec4-43a7-8731-1d3692a5de4a", "bridge": "br-int", "label": "tempest-TestServerMultinode-1878354210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "908b5ba217ab458e8c9aa0e5a471c194", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap98da0555-6c", "ovs_interfaceid": "98da0555-6cf7-43cd-8528-cb59391f5674", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:02:11 np0005588920 nova_compute[226886]: 2026-01-20 15:02:11.374 226890 DEBUG nova.network.os_vif_util [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f3:d9:29,bridge_name='br-int',has_traffic_filtering=True,id=98da0555-6cf7-43cd-8528-cb59391f5674,network=Network(0296a21f-6ec4-43a7-8731-1d3692a5de4a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap98da0555-6c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 20 10:02:11 np0005588920 nova_compute[226886]: 2026-01-20 15:02:11.374 226890 DEBUG os_vif [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f3:d9:29,bridge_name='br-int',has_traffic_filtering=True,id=98da0555-6cf7-43cd-8528-cb59391f5674,network=Network(0296a21f-6ec4-43a7-8731-1d3692a5de4a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap98da0555-6c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 20 10:02:11 np0005588920 nova_compute[226886]: 2026-01-20 15:02:11.375 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:02:11 np0005588920 nova_compute[226886]: 2026-01-20 15:02:11.376 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 10:02:11 np0005588920 nova_compute[226886]: 2026-01-20 15:02:11.376 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 20 10:02:11 np0005588920 nova_compute[226886]: 2026-01-20 15:02:11.378 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:02:11 np0005588920 nova_compute[226886]: 2026-01-20 15:02:11.379 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap98da0555-6c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 10:02:11 np0005588920 nova_compute[226886]: 2026-01-20 15:02:11.379 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap98da0555-6c, col_values=(('external_ids', {'iface-id': '98da0555-6cf7-43cd-8528-cb59391f5674', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f3:d9:29', 'vm-uuid': '18f2cf64-c2d5-4f0b-a16e-48cf2f558c10'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 10:02:11 np0005588920 nova_compute[226886]: 2026-01-20 15:02:11.380 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:02:11 np0005588920 NetworkManager[49076]: <info>  [1768921331.3815] manager: (tap98da0555-6c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/332)
Jan 20 10:02:11 np0005588920 nova_compute[226886]: 2026-01-20 15:02:11.384 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 20 10:02:11 np0005588920 nova_compute[226886]: 2026-01-20 15:02:11.386 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:02:11 np0005588920 nova_compute[226886]: 2026-01-20 15:02:11.387 226890 INFO os_vif [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f3:d9:29,bridge_name='br-int',has_traffic_filtering=True,id=98da0555-6cf7-43cd-8528-cb59391f5674,network=Network(0296a21f-6ec4-43a7-8731-1d3692a5de4a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap98da0555-6c')
Jan 20 10:02:11 np0005588920 nova_compute[226886]: 2026-01-20 15:02:11.448 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:02:11 np0005588920 nova_compute[226886]: 2026-01-20 15:02:11.476 226890 DEBUG nova.virt.libvirt.driver [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 20 10:02:11 np0005588920 nova_compute[226886]: 2026-01-20 15:02:11.476 226890 DEBUG nova.virt.libvirt.driver [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 20 10:02:11 np0005588920 nova_compute[226886]: 2026-01-20 15:02:11.476 226890 DEBUG nova.virt.libvirt.driver [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] No VIF found with MAC fa:16:3e:f3:d9:29, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 20 10:02:11 np0005588920 nova_compute[226886]: 2026-01-20 15:02:11.477 226890 INFO nova.virt.libvirt.driver [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Using config drive
Jan 20 10:02:11 np0005588920 nova_compute[226886]: 2026-01-20 15:02:11.502 226890 DEBUG nova.storage.rbd_utils [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] rbd image 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:02:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:02:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:12.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:02:12 np0005588920 nova_compute[226886]: 2026-01-20 15:02:12.554 226890 INFO nova.virt.libvirt.driver [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Creating config drive at /var/lib/nova/instances/18f2cf64-c2d5-4f0b-a16e-48cf2f558c10/disk.config
Jan 20 10:02:12 np0005588920 nova_compute[226886]: 2026-01-20 15:02:12.558 226890 DEBUG oslo_concurrency.processutils [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/18f2cf64-c2d5-4f0b-a16e-48cf2f558c10/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjwozynqn execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:02:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e333 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:02:12 np0005588920 nova_compute[226886]: 2026-01-20 15:02:12.689 226890 DEBUG oslo_concurrency.processutils [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/18f2cf64-c2d5-4f0b-a16e-48cf2f558c10/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjwozynqn" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:02:12 np0005588920 nova_compute[226886]: 2026-01-20 15:02:12.721 226890 DEBUG nova.storage.rbd_utils [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] rbd image 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:02:12 np0005588920 nova_compute[226886]: 2026-01-20 15:02:12.725 226890 DEBUG oslo_concurrency.processutils [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/18f2cf64-c2d5-4f0b-a16e-48cf2f558c10/disk.config 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:02:12 np0005588920 nova_compute[226886]: 2026-01-20 15:02:12.809 226890 DEBUG nova.network.neutron [req-4c87b232-3539-469f-a9bf-ea90f3cc027b req-846a4539-98ea-4626-a236-9e1b6a41ae57 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Updated VIF entry in instance network info cache for port 98da0555-6cf7-43cd-8528-cb59391f5674. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 20 10:02:12 np0005588920 nova_compute[226886]: 2026-01-20 15:02:12.810 226890 DEBUG nova.network.neutron [req-4c87b232-3539-469f-a9bf-ea90f3cc027b req-846a4539-98ea-4626-a236-9e1b6a41ae57 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Updating instance_info_cache with network_info: [{"id": "98da0555-6cf7-43cd-8528-cb59391f5674", "address": "fa:16:3e:f3:d9:29", "network": {"id": "0296a21f-6ec4-43a7-8731-1d3692a5de4a", "bridge": "br-int", "label": "tempest-TestServerMultinode-1878354210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "908b5ba217ab458e8c9aa0e5a471c194", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap98da0555-6c", "ovs_interfaceid": "98da0555-6cf7-43cd-8528-cb59391f5674", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:02:12 np0005588920 nova_compute[226886]: 2026-01-20 15:02:12.839 226890 DEBUG oslo_concurrency.lockutils [req-4c87b232-3539-469f-a9bf-ea90f3cc027b req-846a4539-98ea-4626-a236-9e1b6a41ae57 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-18f2cf64-c2d5-4f0b-a16e-48cf2f558c10" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 20 10:02:12 np0005588920 nova_compute[226886]: 2026-01-20 15:02:12.903 226890 DEBUG oslo_concurrency.processutils [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/18f2cf64-c2d5-4f0b-a16e-48cf2f558c10/disk.config 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.178s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:02:12 np0005588920 nova_compute[226886]: 2026-01-20 15:02:12.903 226890 INFO nova.virt.libvirt.driver [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Deleting local config drive /var/lib/nova/instances/18f2cf64-c2d5-4f0b-a16e-48cf2f558c10/disk.config because it was imported into RBD.
Jan 20 10:02:12 np0005588920 kernel: tap98da0555-6c: entered promiscuous mode
Jan 20 10:02:12 np0005588920 NetworkManager[49076]: <info>  [1768921332.9474] manager: (tap98da0555-6c): new Tun device (/org/freedesktop/NetworkManager/Devices/333)
Jan 20 10:02:12 np0005588920 nova_compute[226886]: 2026-01-20 15:02:12.947 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:02:12 np0005588920 ovn_controller[133971]: 2026-01-20T15:02:12Z|00686|binding|INFO|Claiming lport 98da0555-6cf7-43cd-8528-cb59391f5674 for this chassis.
Jan 20 10:02:12 np0005588920 ovn_controller[133971]: 2026-01-20T15:02:12Z|00687|binding|INFO|98da0555-6cf7-43cd-8528-cb59391f5674: Claiming fa:16:3e:f3:d9:29 10.100.0.3
Jan 20 10:02:12 np0005588920 systemd-machined[196121]: New machine qemu-69-instance-00000099.
Jan 20 10:02:12 np0005588920 systemd-udevd[281698]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:02:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:12.984 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f3:d9:29 10.100.0.3'], port_security=['fa:16:3e:f3:d9:29 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '18f2cf64-c2d5-4f0b-a16e-48cf2f558c10', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0296a21f-6ec4-43a7-8731-1d3692a5de4a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '654b3ce7b3644fc58f8dc9f60529320b', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ff1c5b6a-5ab6-401e-b333-7f359193e012', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6a3d5928-255d-4c0c-af70-f26be5196416, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=98da0555-6cf7-43cd-8528-cb59391f5674) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:02:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:12.985 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 98da0555-6cf7-43cd-8528-cb59391f5674 in datapath 0296a21f-6ec4-43a7-8731-1d3692a5de4a bound to our chassis
Jan 20 10:02:12 np0005588920 NetworkManager[49076]: <info>  [1768921332.9877] device (tap98da0555-6c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:02:12 np0005588920 NetworkManager[49076]: <info>  [1768921332.9886] device (tap98da0555-6c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:02:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:12.987 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0296a21f-6ec4-43a7-8731-1d3692a5de4a
Jan 20 10:02:12 np0005588920 systemd[1]: Started Virtual Machine qemu-69-instance-00000099.
Jan 20 10:02:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:12.997 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[06866053-c704-47e0-9c3c-587e8297e9fa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:02:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:12.998 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0296a21f-61 in ovnmeta-0296a21f-6ec4-43a7-8731-1d3692a5de4a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 20 10:02:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:13.000 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0296a21f-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 20 10:02:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:13.000 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[cd069be0-2fed-4f69-94e3-aaf0ed4ca033]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:02:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:13.001 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d0d04b64-3b02-4165-8843-1ca43259d2a4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:02:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:13.011 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[4b0d2e3f-bf32-4462-b0bb-1cf2967d555a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:02:13 np0005588920 ovn_controller[133971]: 2026-01-20T15:02:13Z|00688|binding|INFO|Setting lport 98da0555-6cf7-43cd-8528-cb59391f5674 ovn-installed in OVS
Jan 20 10:02:13 np0005588920 ovn_controller[133971]: 2026-01-20T15:02:13Z|00689|binding|INFO|Setting lport 98da0555-6cf7-43cd-8528-cb59391f5674 up in Southbound
Jan 20 10:02:13 np0005588920 nova_compute[226886]: 2026-01-20 15:02:13.051 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:02:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:13.063 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2e107008-64ad-4dec-8081-b6a28c7890d4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:02:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:13.092 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[6e0c0e52-b2e4-4e68-b327-2b0eebd6cdd8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:02:13 np0005588920 podman[281702]: 2026-01-20 15:02:13.092909814 +0000 UTC m=+0.081898259 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true)
Jan 20 10:02:13 np0005588920 systemd-udevd[281700]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:02:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:13.097 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5f3ed2ee-a454-4b97-90c2-5f77ac314d92]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:02:13 np0005588920 NetworkManager[49076]: <info>  [1768921333.0989] manager: (tap0296a21f-60): new Veth device (/org/freedesktop/NetworkManager/Devices/334)
Jan 20 10:02:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:13.132 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[9b04d78e-bc72-4399-b422-bef03a7500b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:02:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:13.135 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[4c425a30-0f56-45a9-809d-15be08e4b432]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:02:13 np0005588920 NetworkManager[49076]: <info>  [1768921333.1567] device (tap0296a21f-60): carrier: link connected
Jan 20 10:02:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:13.161 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[991134c7-a09f-4321-acbb-ac2b584c0462]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:02:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:13.177 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[33cea53b-581f-467d-b46a-11695d5a4e56]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0296a21f-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e3:1c:68'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 219], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 637484, 'reachable_time': 40558, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 281749, 'error': None, 'target': 'ovnmeta-0296a21f-6ec4-43a7-8731-1d3692a5de4a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:13.192 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b07c9a85-590b-4807-85c5-d8b39334210e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee3:1c68'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 637484, 'tstamp': 637484}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 281750, 'error': None, 'target': 'ovnmeta-0296a21f-6ec4-43a7-8731-1d3692a5de4a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:13.210 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[31a2c9df-cede-4253-ad8c-65947427004c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0296a21f-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e3:1c:68'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 219], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 637484, 'reachable_time': 40558, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 281751, 'error': None, 'target': 'ovnmeta-0296a21f-6ec4-43a7-8731-1d3692a5de4a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:13.241 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ad86680a-630a-457e-85eb-4d7b83930700]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:02:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:13.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:13.297 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[76d72290-6dba-4756-b4f9-63151ba1ac98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:13.298 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0296a21f-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:02:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:13.299 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:02:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:13.299 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0296a21f-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:02:13 np0005588920 kernel: tap0296a21f-60: entered promiscuous mode
Jan 20 10:02:13 np0005588920 nova_compute[226886]: 2026-01-20 15:02:13.300 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:13 np0005588920 NetworkManager[49076]: <info>  [1768921333.3012] manager: (tap0296a21f-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/335)
Jan 20 10:02:13 np0005588920 nova_compute[226886]: 2026-01-20 15:02:13.303 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:13.304 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0296a21f-60, col_values=(('external_ids', {'iface-id': 'a6fccd00-2fdb-4d49-8d76-4860c81e4a5f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:02:13 np0005588920 nova_compute[226886]: 2026-01-20 15:02:13.305 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:13 np0005588920 ovn_controller[133971]: 2026-01-20T15:02:13Z|00690|binding|INFO|Releasing lport a6fccd00-2fdb-4d49-8d76-4860c81e4a5f from this chassis (sb_readonly=0)
Jan 20 10:02:13 np0005588920 nova_compute[226886]: 2026-01-20 15:02:13.307 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:13.308 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0296a21f-6ec4-43a7-8731-1d3692a5de4a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0296a21f-6ec4-43a7-8731-1d3692a5de4a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:02:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:13.316 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[9b4b1f65-0fac-493e-a7d4-79b48af6ef86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:13.317 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:02:13 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 10:02:13 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 10:02:13 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-0296a21f-6ec4-43a7-8731-1d3692a5de4a
Jan 20 10:02:13 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 10:02:13 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 10:02:13 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 10:02:13 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/0296a21f-6ec4-43a7-8731-1d3692a5de4a.pid.haproxy
Jan 20 10:02:13 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 10:02:13 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:02:13 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 10:02:13 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 10:02:13 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 10:02:13 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 10:02:13 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 10:02:13 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 10:02:13 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 10:02:13 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 10:02:13 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 10:02:13 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 10:02:13 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 10:02:13 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 10:02:13 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 10:02:13 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:02:13 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:02:13 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 10:02:13 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 10:02:13 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:02:13 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID 0296a21f-6ec4-43a7-8731-1d3692a5de4a
Jan 20 10:02:13 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:02:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:13.317 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0296a21f-6ec4-43a7-8731-1d3692a5de4a', 'env', 'PROCESS_TAG=haproxy-0296a21f-6ec4-43a7-8731-1d3692a5de4a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0296a21f-6ec4-43a7-8731-1d3692a5de4a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:02:13 np0005588920 nova_compute[226886]: 2026-01-20 15:02:13.323 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:13 np0005588920 nova_compute[226886]: 2026-01-20 15:02:13.540 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921333.5402594, 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:02:13 np0005588920 nova_compute[226886]: 2026-01-20 15:02:13.540 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] VM Started (Lifecycle Event)#033[00m
Jan 20 10:02:13 np0005588920 nova_compute[226886]: 2026-01-20 15:02:13.572 226890 DEBUG nova.compute.manager [req-b61d85bb-0ed4-43ae-95f0-3f94c5c9f4ab req-5eb75fa5-6901-4c3f-8d80-01b13fba9673 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Received event network-vif-plugged-98da0555-6cf7-43cd-8528-cb59391f5674 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:02:13 np0005588920 nova_compute[226886]: 2026-01-20 15:02:13.573 226890 DEBUG oslo_concurrency.lockutils [req-b61d85bb-0ed4-43ae-95f0-3f94c5c9f4ab req-5eb75fa5-6901-4c3f-8d80-01b13fba9673 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "18f2cf64-c2d5-4f0b-a16e-48cf2f558c10-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:02:13 np0005588920 nova_compute[226886]: 2026-01-20 15:02:13.573 226890 DEBUG oslo_concurrency.lockutils [req-b61d85bb-0ed4-43ae-95f0-3f94c5c9f4ab req-5eb75fa5-6901-4c3f-8d80-01b13fba9673 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "18f2cf64-c2d5-4f0b-a16e-48cf2f558c10-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:02:13 np0005588920 nova_compute[226886]: 2026-01-20 15:02:13.573 226890 DEBUG oslo_concurrency.lockutils [req-b61d85bb-0ed4-43ae-95f0-3f94c5c9f4ab req-5eb75fa5-6901-4c3f-8d80-01b13fba9673 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "18f2cf64-c2d5-4f0b-a16e-48cf2f558c10-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:02:13 np0005588920 nova_compute[226886]: 2026-01-20 15:02:13.574 226890 DEBUG nova.compute.manager [req-b61d85bb-0ed4-43ae-95f0-3f94c5c9f4ab req-5eb75fa5-6901-4c3f-8d80-01b13fba9673 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Processing event network-vif-plugged-98da0555-6cf7-43cd-8528-cb59391f5674 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 10:02:13 np0005588920 nova_compute[226886]: 2026-01-20 15:02:13.575 226890 DEBUG nova.compute.manager [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:02:13 np0005588920 nova_compute[226886]: 2026-01-20 15:02:13.576 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:02:13 np0005588920 nova_compute[226886]: 2026-01-20 15:02:13.581 226890 DEBUG nova.virt.libvirt.driver [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:02:13 np0005588920 nova_compute[226886]: 2026-01-20 15:02:13.583 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:02:13 np0005588920 nova_compute[226886]: 2026-01-20 15:02:13.588 226890 INFO nova.virt.libvirt.driver [-] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Instance spawned successfully.#033[00m
Jan 20 10:02:13 np0005588920 nova_compute[226886]: 2026-01-20 15:02:13.588 226890 DEBUG nova.virt.libvirt.driver [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 10:02:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:02:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3642955436' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:02:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:02:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3642955436' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:02:13 np0005588920 nova_compute[226886]: 2026-01-20 15:02:13.630 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:02:13 np0005588920 nova_compute[226886]: 2026-01-20 15:02:13.631 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921333.542431, 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:02:13 np0005588920 nova_compute[226886]: 2026-01-20 15:02:13.631 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:02:13 np0005588920 nova_compute[226886]: 2026-01-20 15:02:13.637 226890 DEBUG nova.virt.libvirt.driver [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:02:13 np0005588920 nova_compute[226886]: 2026-01-20 15:02:13.637 226890 DEBUG nova.virt.libvirt.driver [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:02:13 np0005588920 nova_compute[226886]: 2026-01-20 15:02:13.638 226890 DEBUG nova.virt.libvirt.driver [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:02:13 np0005588920 nova_compute[226886]: 2026-01-20 15:02:13.638 226890 DEBUG nova.virt.libvirt.driver [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:02:13 np0005588920 nova_compute[226886]: 2026-01-20 15:02:13.639 226890 DEBUG nova.virt.libvirt.driver [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:02:13 np0005588920 nova_compute[226886]: 2026-01-20 15:02:13.639 226890 DEBUG nova.virt.libvirt.driver [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:02:13 np0005588920 nova_compute[226886]: 2026-01-20 15:02:13.667 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:02:13 np0005588920 nova_compute[226886]: 2026-01-20 15:02:13.671 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921333.5806146, 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:02:13 np0005588920 nova_compute[226886]: 2026-01-20 15:02:13.672 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:02:13 np0005588920 podman[281825]: 2026-01-20 15:02:13.69563254 +0000 UTC m=+0.045236402 container create 231fec163cee61ce8b8727dc0b5751d1233a5b0369268aa6a7b2b09d32d88f8d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0296a21f-6ec4-43a7-8731-1d3692a5de4a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 20 10:02:13 np0005588920 nova_compute[226886]: 2026-01-20 15:02:13.702 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:02:13 np0005588920 nova_compute[226886]: 2026-01-20 15:02:13.709 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:02:13 np0005588920 nova_compute[226886]: 2026-01-20 15:02:13.715 226890 INFO nova.compute.manager [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Took 8.20 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 10:02:13 np0005588920 nova_compute[226886]: 2026-01-20 15:02:13.717 226890 DEBUG nova.compute.manager [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:02:13 np0005588920 nova_compute[226886]: 2026-01-20 15:02:13.729 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:02:13 np0005588920 systemd[1]: Started libpod-conmon-231fec163cee61ce8b8727dc0b5751d1233a5b0369268aa6a7b2b09d32d88f8d.scope.
Jan 20 10:02:13 np0005588920 systemd[1]: Started libcrun container.
Jan 20 10:02:13 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/17eada987acc79465a7de7d2d593d678a16573d5b651f192aae2ddd9ff0f0471/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:02:13 np0005588920 podman[281825]: 2026-01-20 15:02:13.671373008 +0000 UTC m=+0.020976900 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:02:13 np0005588920 podman[281825]: 2026-01-20 15:02:13.774832651 +0000 UTC m=+0.124436543 container init 231fec163cee61ce8b8727dc0b5751d1233a5b0369268aa6a7b2b09d32d88f8d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0296a21f-6ec4-43a7-8731-1d3692a5de4a, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:02:13 np0005588920 podman[281825]: 2026-01-20 15:02:13.780653588 +0000 UTC m=+0.130257450 container start 231fec163cee61ce8b8727dc0b5751d1233a5b0369268aa6a7b2b09d32d88f8d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0296a21f-6ec4-43a7-8731-1d3692a5de4a, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 20 10:02:13 np0005588920 nova_compute[226886]: 2026-01-20 15:02:13.783 226890 INFO nova.compute.manager [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Took 9.17 seconds to build instance.#033[00m
Jan 20 10:02:13 np0005588920 neutron-haproxy-ovnmeta-0296a21f-6ec4-43a7-8731-1d3692a5de4a[281839]: [NOTICE]   (281844) : New worker (281846) forked
Jan 20 10:02:13 np0005588920 neutron-haproxy-ovnmeta-0296a21f-6ec4-43a7-8731-1d3692a5de4a[281839]: [NOTICE]   (281844) : Loading success.
Jan 20 10:02:13 np0005588920 nova_compute[226886]: 2026-01-20 15:02:13.804 226890 DEBUG oslo_concurrency.lockutils [None req-3f7dce95-1849-415d-84d9-6a657dbcdf81 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Lock "18f2cf64-c2d5-4f0b-a16e-48cf2f558c10" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.274s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:02:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:02:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:14.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:02:14 np0005588920 podman[282023]: 2026-01-20 15:02:14.678677334 +0000 UTC m=+0.074992242 container exec 6d4eaf8659f13902200468a4ae46f22a0824452b9353ff1491898f5d3b0130c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mon-compute-2, CEPH_REF=reef, org.label-schema.build-date=20250507, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 20 10:02:14 np0005588920 podman[282023]: 2026-01-20 15:02:14.782627262 +0000 UTC m=+0.178942170 container exec_died 6d4eaf8659f13902200468a4ae46f22a0824452b9353ff1491898f5d3b0130c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mon-compute-2, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Jan 20 10:02:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:15.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:15 np0005588920 nova_compute[226886]: 2026-01-20 15:02:15.700 226890 DEBUG nova.compute.manager [req-19f37828-ae95-479e-b522-76130adc86f3 req-86850f08-e050-4cfc-9c4c-f54f5b9a3a01 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Received event network-vif-plugged-98da0555-6cf7-43cd-8528-cb59391f5674 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:02:15 np0005588920 nova_compute[226886]: 2026-01-20 15:02:15.702 226890 DEBUG oslo_concurrency.lockutils [req-19f37828-ae95-479e-b522-76130adc86f3 req-86850f08-e050-4cfc-9c4c-f54f5b9a3a01 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "18f2cf64-c2d5-4f0b-a16e-48cf2f558c10-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:02:15 np0005588920 nova_compute[226886]: 2026-01-20 15:02:15.703 226890 DEBUG oslo_concurrency.lockutils [req-19f37828-ae95-479e-b522-76130adc86f3 req-86850f08-e050-4cfc-9c4c-f54f5b9a3a01 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "18f2cf64-c2d5-4f0b-a16e-48cf2f558c10-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:02:15 np0005588920 nova_compute[226886]: 2026-01-20 15:02:15.703 226890 DEBUG oslo_concurrency.lockutils [req-19f37828-ae95-479e-b522-76130adc86f3 req-86850f08-e050-4cfc-9c4c-f54f5b9a3a01 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "18f2cf64-c2d5-4f0b-a16e-48cf2f558c10-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:02:15 np0005588920 nova_compute[226886]: 2026-01-20 15:02:15.703 226890 DEBUG nova.compute.manager [req-19f37828-ae95-479e-b522-76130adc86f3 req-86850f08-e050-4cfc-9c4c-f54f5b9a3a01 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] No waiting events found dispatching network-vif-plugged-98da0555-6cf7-43cd-8528-cb59391f5674 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:02:15 np0005588920 nova_compute[226886]: 2026-01-20 15:02:15.703 226890 WARNING nova.compute.manager [req-19f37828-ae95-479e-b522-76130adc86f3 req-86850f08-e050-4cfc-9c4c-f54f5b9a3a01 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Received unexpected event network-vif-plugged-98da0555-6cf7-43cd-8528-cb59391f5674 for instance with vm_state active and task_state None.#033[00m
Jan 20 10:02:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:16.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:16 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:02:16 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:02:16 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:02:16 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:02:16 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:02:16 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:02:16 np0005588920 nova_compute[226886]: 2026-01-20 15:02:16.383 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:16 np0005588920 nova_compute[226886]: 2026-01-20 15:02:16.452 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:16.463 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:02:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:16.464 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:02:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:16.464 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:02:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:17.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e333 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:02:17 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:02:17 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:02:17 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:02:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:18.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:19.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:20.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:21.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:21 np0005588920 nova_compute[226886]: 2026-01-20 15:02:21.386 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:21 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e334 e334: 3 total, 3 up, 3 in
Jan 20 10:02:21 np0005588920 nova_compute[226886]: 2026-01-20 15:02:21.454 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:02:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:22.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:02:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e335 e335: 3 total, 3 up, 3 in
Jan 20 10:02:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e335 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:02:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:23.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:23 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e336 e336: 3 total, 3 up, 3 in
Jan 20 10:02:23 np0005588920 nova_compute[226886]: 2026-01-20 15:02:23.803 226890 DEBUG oslo_concurrency.lockutils [None req-aca3ea61-9863-47e7-9352-be122317ecf6 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Acquiring lock "18f2cf64-c2d5-4f0b-a16e-48cf2f558c10" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:02:23 np0005588920 nova_compute[226886]: 2026-01-20 15:02:23.804 226890 DEBUG oslo_concurrency.lockutils [None req-aca3ea61-9863-47e7-9352-be122317ecf6 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Lock "18f2cf64-c2d5-4f0b-a16e-48cf2f558c10" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:02:23 np0005588920 nova_compute[226886]: 2026-01-20 15:02:23.804 226890 DEBUG oslo_concurrency.lockutils [None req-aca3ea61-9863-47e7-9352-be122317ecf6 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Acquiring lock "18f2cf64-c2d5-4f0b-a16e-48cf2f558c10-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:02:23 np0005588920 nova_compute[226886]: 2026-01-20 15:02:23.805 226890 DEBUG oslo_concurrency.lockutils [None req-aca3ea61-9863-47e7-9352-be122317ecf6 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Lock "18f2cf64-c2d5-4f0b-a16e-48cf2f558c10-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:02:23 np0005588920 nova_compute[226886]: 2026-01-20 15:02:23.805 226890 DEBUG oslo_concurrency.lockutils [None req-aca3ea61-9863-47e7-9352-be122317ecf6 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Lock "18f2cf64-c2d5-4f0b-a16e-48cf2f558c10-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:02:23 np0005588920 nova_compute[226886]: 2026-01-20 15:02:23.807 226890 INFO nova.compute.manager [None req-aca3ea61-9863-47e7-9352-be122317ecf6 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Terminating instance#033[00m
Jan 20 10:02:23 np0005588920 nova_compute[226886]: 2026-01-20 15:02:23.808 226890 DEBUG nova.compute.manager [None req-aca3ea61-9863-47e7-9352-be122317ecf6 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:02:23 np0005588920 kernel: tap98da0555-6c (unregistering): left promiscuous mode
Jan 20 10:02:23 np0005588920 NetworkManager[49076]: <info>  [1768921343.8557] device (tap98da0555-6c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:02:23 np0005588920 nova_compute[226886]: 2026-01-20 15:02:23.913 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:23 np0005588920 ovn_controller[133971]: 2026-01-20T15:02:23Z|00691|binding|INFO|Releasing lport 98da0555-6cf7-43cd-8528-cb59391f5674 from this chassis (sb_readonly=0)
Jan 20 10:02:23 np0005588920 ovn_controller[133971]: 2026-01-20T15:02:23Z|00692|binding|INFO|Setting lport 98da0555-6cf7-43cd-8528-cb59391f5674 down in Southbound
Jan 20 10:02:23 np0005588920 ovn_controller[133971]: 2026-01-20T15:02:23Z|00693|binding|INFO|Removing iface tap98da0555-6c ovn-installed in OVS
Jan 20 10:02:23 np0005588920 nova_compute[226886]: 2026-01-20 15:02:23.915 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:23.921 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f3:d9:29 10.100.0.3'], port_security=['fa:16:3e:f3:d9:29 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '18f2cf64-c2d5-4f0b-a16e-48cf2f558c10', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0296a21f-6ec4-43a7-8731-1d3692a5de4a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '654b3ce7b3644fc58f8dc9f60529320b', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ff1c5b6a-5ab6-401e-b333-7f359193e012', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6a3d5928-255d-4c0c-af70-f26be5196416, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=98da0555-6cf7-43cd-8528-cb59391f5674) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:02:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:23.922 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 98da0555-6cf7-43cd-8528-cb59391f5674 in datapath 0296a21f-6ec4-43a7-8731-1d3692a5de4a unbound from our chassis#033[00m
Jan 20 10:02:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:23.923 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0296a21f-6ec4-43a7-8731-1d3692a5de4a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:02:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:23.925 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[3751f5bd-550d-413d-89cf-970bbc9c2d3f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:23.925 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0296a21f-6ec4-43a7-8731-1d3692a5de4a namespace which is not needed anymore#033[00m
Jan 20 10:02:23 np0005588920 nova_compute[226886]: 2026-01-20 15:02:23.934 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:23 np0005588920 systemd[1]: machine-qemu\x2d69\x2dinstance\x2d00000099.scope: Deactivated successfully.
Jan 20 10:02:23 np0005588920 systemd[1]: machine-qemu\x2d69\x2dinstance\x2d00000099.scope: Consumed 10.997s CPU time.
Jan 20 10:02:23 np0005588920 systemd-machined[196121]: Machine qemu-69-instance-00000099 terminated.
Jan 20 10:02:24 np0005588920 nova_compute[226886]: 2026-01-20 15:02:24.029 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:24 np0005588920 nova_compute[226886]: 2026-01-20 15:02:24.035 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:24 np0005588920 nova_compute[226886]: 2026-01-20 15:02:24.043 226890 INFO nova.virt.libvirt.driver [-] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Instance destroyed successfully.#033[00m
Jan 20 10:02:24 np0005588920 nova_compute[226886]: 2026-01-20 15:02:24.044 226890 DEBUG nova.objects.instance [None req-aca3ea61-9863-47e7-9352-be122317ecf6 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Lazy-loading 'resources' on Instance uuid 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:02:24 np0005588920 nova_compute[226886]: 2026-01-20 15:02:24.063 226890 DEBUG nova.virt.libvirt.vif [None req-aca3ea61-9863-47e7-9352-be122317ecf6 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:02:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerMultinode-server-995385633',display_name='tempest-TestServerMultinode-server-995385633',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testservermultinode-server-995385633',id=153,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:02:13Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='654b3ce7b3644fc58f8dc9f60529320b',ramdisk_id='',reservation_id='r-uzo0s30g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestServerMultinode-1071973011',owner_user_name='tempest-TestServerMultinode-1071973011-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:02:13Z,user_data=None,user_id='158563a99d4a420890aaa00b05c8bb57',uuid=18f2cf64-c2d5-4f0b-a16e-48cf2f558c10,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "98da0555-6cf7-43cd-8528-cb59391f5674", "address": "fa:16:3e:f3:d9:29", "network": {"id": "0296a21f-6ec4-43a7-8731-1d3692a5de4a", "bridge": "br-int", "label": "tempest-TestServerMultinode-1878354210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "908b5ba217ab458e8c9aa0e5a471c194", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap98da0555-6c", "ovs_interfaceid": "98da0555-6cf7-43cd-8528-cb59391f5674", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:02:24 np0005588920 nova_compute[226886]: 2026-01-20 15:02:24.064 226890 DEBUG nova.network.os_vif_util [None req-aca3ea61-9863-47e7-9352-be122317ecf6 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Converting VIF {"id": "98da0555-6cf7-43cd-8528-cb59391f5674", "address": "fa:16:3e:f3:d9:29", "network": {"id": "0296a21f-6ec4-43a7-8731-1d3692a5de4a", "bridge": "br-int", "label": "tempest-TestServerMultinode-1878354210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "908b5ba217ab458e8c9aa0e5a471c194", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap98da0555-6c", "ovs_interfaceid": "98da0555-6cf7-43cd-8528-cb59391f5674", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:02:24 np0005588920 nova_compute[226886]: 2026-01-20 15:02:24.065 226890 DEBUG nova.network.os_vif_util [None req-aca3ea61-9863-47e7-9352-be122317ecf6 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f3:d9:29,bridge_name='br-int',has_traffic_filtering=True,id=98da0555-6cf7-43cd-8528-cb59391f5674,network=Network(0296a21f-6ec4-43a7-8731-1d3692a5de4a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap98da0555-6c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:02:24 np0005588920 nova_compute[226886]: 2026-01-20 15:02:24.065 226890 DEBUG os_vif [None req-aca3ea61-9863-47e7-9352-be122317ecf6 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f3:d9:29,bridge_name='br-int',has_traffic_filtering=True,id=98da0555-6cf7-43cd-8528-cb59391f5674,network=Network(0296a21f-6ec4-43a7-8731-1d3692a5de4a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap98da0555-6c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:02:24 np0005588920 nova_compute[226886]: 2026-01-20 15:02:24.066 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:24 np0005588920 nova_compute[226886]: 2026-01-20 15:02:24.067 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap98da0555-6c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:02:24 np0005588920 nova_compute[226886]: 2026-01-20 15:02:24.070 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:24 np0005588920 nova_compute[226886]: 2026-01-20 15:02:24.072 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:02:24 np0005588920 neutron-haproxy-ovnmeta-0296a21f-6ec4-43a7-8731-1d3692a5de4a[281839]: [NOTICE]   (281844) : haproxy version is 2.8.14-c23fe91
Jan 20 10:02:24 np0005588920 neutron-haproxy-ovnmeta-0296a21f-6ec4-43a7-8731-1d3692a5de4a[281839]: [NOTICE]   (281844) : path to executable is /usr/sbin/haproxy
Jan 20 10:02:24 np0005588920 neutron-haproxy-ovnmeta-0296a21f-6ec4-43a7-8731-1d3692a5de4a[281839]: [WARNING]  (281844) : Exiting Master process...
Jan 20 10:02:24 np0005588920 neutron-haproxy-ovnmeta-0296a21f-6ec4-43a7-8731-1d3692a5de4a[281839]: [WARNING]  (281844) : Exiting Master process...
Jan 20 10:02:24 np0005588920 neutron-haproxy-ovnmeta-0296a21f-6ec4-43a7-8731-1d3692a5de4a[281839]: [ALERT]    (281844) : Current worker (281846) exited with code 143 (Terminated)
Jan 20 10:02:24 np0005588920 neutron-haproxy-ovnmeta-0296a21f-6ec4-43a7-8731-1d3692a5de4a[281839]: [WARNING]  (281844) : All workers exited. Exiting... (0)
Jan 20 10:02:24 np0005588920 nova_compute[226886]: 2026-01-20 15:02:24.074 226890 INFO os_vif [None req-aca3ea61-9863-47e7-9352-be122317ecf6 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f3:d9:29,bridge_name='br-int',has_traffic_filtering=True,id=98da0555-6cf7-43cd-8528-cb59391f5674,network=Network(0296a21f-6ec4-43a7-8731-1d3692a5de4a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap98da0555-6c')#033[00m
Jan 20 10:02:24 np0005588920 systemd[1]: libpod-231fec163cee61ce8b8727dc0b5751d1233a5b0369268aa6a7b2b09d32d88f8d.scope: Deactivated successfully.
Jan 20 10:02:24 np0005588920 podman[282350]: 2026-01-20 15:02:24.085019999 +0000 UTC m=+0.053685333 container died 231fec163cee61ce8b8727dc0b5751d1233a5b0369268aa6a7b2b09d32d88f8d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0296a21f-6ec4-43a7-8731-1d3692a5de4a, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 20 10:02:24 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-231fec163cee61ce8b8727dc0b5751d1233a5b0369268aa6a7b2b09d32d88f8d-userdata-shm.mount: Deactivated successfully.
Jan 20 10:02:24 np0005588920 systemd[1]: var-lib-containers-storage-overlay-17eada987acc79465a7de7d2d593d678a16573d5b651f192aae2ddd9ff0f0471-merged.mount: Deactivated successfully.
Jan 20 10:02:24 np0005588920 podman[282350]: 2026-01-20 15:02:24.129715285 +0000 UTC m=+0.098380599 container cleanup 231fec163cee61ce8b8727dc0b5751d1233a5b0369268aa6a7b2b09d32d88f8d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0296a21f-6ec4-43a7-8731-1d3692a5de4a, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 20 10:02:24 np0005588920 systemd[1]: libpod-conmon-231fec163cee61ce8b8727dc0b5751d1233a5b0369268aa6a7b2b09d32d88f8d.scope: Deactivated successfully.
Jan 20 10:02:24 np0005588920 podman[282409]: 2026-01-20 15:02:24.195006239 +0000 UTC m=+0.042912776 container remove 231fec163cee61ce8b8727dc0b5751d1233a5b0369268aa6a7b2b09d32d88f8d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0296a21f-6ec4-43a7-8731-1d3692a5de4a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:02:24 np0005588920 nova_compute[226886]: 2026-01-20 15:02:24.200 226890 DEBUG nova.compute.manager [req-eab1c83e-0d9b-42a4-9e41-3723fdd9909c req-709f9b3a-57fc-41d2-a779-c90bf3676d81 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Received event network-vif-unplugged-98da0555-6cf7-43cd-8528-cb59391f5674 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:02:24 np0005588920 nova_compute[226886]: 2026-01-20 15:02:24.201 226890 DEBUG oslo_concurrency.lockutils [req-eab1c83e-0d9b-42a4-9e41-3723fdd9909c req-709f9b3a-57fc-41d2-a779-c90bf3676d81 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "18f2cf64-c2d5-4f0b-a16e-48cf2f558c10-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:02:24 np0005588920 nova_compute[226886]: 2026-01-20 15:02:24.201 226890 DEBUG oslo_concurrency.lockutils [req-eab1c83e-0d9b-42a4-9e41-3723fdd9909c req-709f9b3a-57fc-41d2-a779-c90bf3676d81 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "18f2cf64-c2d5-4f0b-a16e-48cf2f558c10-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:02:24 np0005588920 nova_compute[226886]: 2026-01-20 15:02:24.201 226890 DEBUG oslo_concurrency.lockutils [req-eab1c83e-0d9b-42a4-9e41-3723fdd9909c req-709f9b3a-57fc-41d2-a779-c90bf3676d81 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "18f2cf64-c2d5-4f0b-a16e-48cf2f558c10-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:02:24 np0005588920 nova_compute[226886]: 2026-01-20 15:02:24.201 226890 DEBUG nova.compute.manager [req-eab1c83e-0d9b-42a4-9e41-3723fdd9909c req-709f9b3a-57fc-41d2-a779-c90bf3676d81 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] No waiting events found dispatching network-vif-unplugged-98da0555-6cf7-43cd-8528-cb59391f5674 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:02:24 np0005588920 nova_compute[226886]: 2026-01-20 15:02:24.202 226890 DEBUG nova.compute.manager [req-eab1c83e-0d9b-42a4-9e41-3723fdd9909c req-709f9b3a-57fc-41d2-a779-c90bf3676d81 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Received event network-vif-unplugged-98da0555-6cf7-43cd-8528-cb59391f5674 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:02:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:24.201 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f5702d9d-ca28-4771-b75d-b04111af54bc]: (4, ('Tue Jan 20 03:02:24 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-0296a21f-6ec4-43a7-8731-1d3692a5de4a (231fec163cee61ce8b8727dc0b5751d1233a5b0369268aa6a7b2b09d32d88f8d)\n231fec163cee61ce8b8727dc0b5751d1233a5b0369268aa6a7b2b09d32d88f8d\nTue Jan 20 03:02:24 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-0296a21f-6ec4-43a7-8731-1d3692a5de4a (231fec163cee61ce8b8727dc0b5751d1233a5b0369268aa6a7b2b09d32d88f8d)\n231fec163cee61ce8b8727dc0b5751d1233a5b0369268aa6a7b2b09d32d88f8d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:24.203 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[96ae3911-6ab3-4287-9db7-9911482cbe39]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:24.204 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0296a21f-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:02:24 np0005588920 nova_compute[226886]: 2026-01-20 15:02:24.205 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:24 np0005588920 kernel: tap0296a21f-60: left promiscuous mode
Jan 20 10:02:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:24.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:24 np0005588920 nova_compute[226886]: 2026-01-20 15:02:24.218 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:24 np0005588920 nova_compute[226886]: 2026-01-20 15:02:24.220 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:24.221 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[7241995d-0c12-4ea3-9936-8dff6e9a1682]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:24.238 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[87d8e8ba-7255-47e9-bdc3-33110691db43]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:24.240 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[1d644070-0076-4395-a30e-53a5b5f34c80]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:24.255 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[37f541f6-ab86-4751-b567-63ff04848efd]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 637477, 'reachable_time': 20857, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 282424, 'error': None, 'target': 'ovnmeta-0296a21f-6ec4-43a7-8731-1d3692a5de4a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:24.257 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0296a21f-6ec4-43a7-8731-1d3692a5de4a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:02:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:24.258 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[2c10a3fc-80fd-4ee9-a270-6f420611cfe2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:24 np0005588920 systemd[1]: run-netns-ovnmeta\x2d0296a21f\x2d6ec4\x2d43a7\x2d8731\x2d1d3692a5de4a.mount: Deactivated successfully.
Jan 20 10:02:24 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:02:24 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:02:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:25.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:25 np0005588920 nova_compute[226886]: 2026-01-20 15:02:25.990 226890 INFO nova.virt.libvirt.driver [None req-aca3ea61-9863-47e7-9352-be122317ecf6 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Deleting instance files /var/lib/nova/instances/18f2cf64-c2d5-4f0b-a16e-48cf2f558c10_del#033[00m
Jan 20 10:02:25 np0005588920 nova_compute[226886]: 2026-01-20 15:02:25.990 226890 INFO nova.virt.libvirt.driver [None req-aca3ea61-9863-47e7-9352-be122317ecf6 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Deletion of /var/lib/nova/instances/18f2cf64-c2d5-4f0b-a16e-48cf2f558c10_del complete#033[00m
Jan 20 10:02:26 np0005588920 nova_compute[226886]: 2026-01-20 15:02:26.036 226890 INFO nova.compute.manager [None req-aca3ea61-9863-47e7-9352-be122317ecf6 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Took 2.23 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:02:26 np0005588920 nova_compute[226886]: 2026-01-20 15:02:26.037 226890 DEBUG oslo.service.loopingcall [None req-aca3ea61-9863-47e7-9352-be122317ecf6 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:02:26 np0005588920 nova_compute[226886]: 2026-01-20 15:02:26.037 226890 DEBUG nova.compute.manager [-] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:02:26 np0005588920 nova_compute[226886]: 2026-01-20 15:02:26.037 226890 DEBUG nova.network.neutron [-] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:02:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:26.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:26 np0005588920 nova_compute[226886]: 2026-01-20 15:02:26.283 226890 DEBUG nova.compute.manager [req-1143712f-e818-4c12-9c6f-a6a82ac2599e req-a18f4e88-7342-4704-b706-d9e747745767 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Received event network-vif-plugged-98da0555-6cf7-43cd-8528-cb59391f5674 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:02:26 np0005588920 nova_compute[226886]: 2026-01-20 15:02:26.284 226890 DEBUG oslo_concurrency.lockutils [req-1143712f-e818-4c12-9c6f-a6a82ac2599e req-a18f4e88-7342-4704-b706-d9e747745767 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "18f2cf64-c2d5-4f0b-a16e-48cf2f558c10-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:02:26 np0005588920 nova_compute[226886]: 2026-01-20 15:02:26.284 226890 DEBUG oslo_concurrency.lockutils [req-1143712f-e818-4c12-9c6f-a6a82ac2599e req-a18f4e88-7342-4704-b706-d9e747745767 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "18f2cf64-c2d5-4f0b-a16e-48cf2f558c10-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:02:26 np0005588920 nova_compute[226886]: 2026-01-20 15:02:26.284 226890 DEBUG oslo_concurrency.lockutils [req-1143712f-e818-4c12-9c6f-a6a82ac2599e req-a18f4e88-7342-4704-b706-d9e747745767 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "18f2cf64-c2d5-4f0b-a16e-48cf2f558c10-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:02:26 np0005588920 nova_compute[226886]: 2026-01-20 15:02:26.284 226890 DEBUG nova.compute.manager [req-1143712f-e818-4c12-9c6f-a6a82ac2599e req-a18f4e88-7342-4704-b706-d9e747745767 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] No waiting events found dispatching network-vif-plugged-98da0555-6cf7-43cd-8528-cb59391f5674 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:02:26 np0005588920 nova_compute[226886]: 2026-01-20 15:02:26.284 226890 WARNING nova.compute.manager [req-1143712f-e818-4c12-9c6f-a6a82ac2599e req-a18f4e88-7342-4704-b706-d9e747745767 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Received unexpected event network-vif-plugged-98da0555-6cf7-43cd-8528-cb59391f5674 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 10:02:26 np0005588920 nova_compute[226886]: 2026-01-20 15:02:26.457 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:27 np0005588920 nova_compute[226886]: 2026-01-20 15:02:27.032 226890 DEBUG oslo_concurrency.lockutils [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Acquiring lock "75368220-ff38-456b-a0e6-ae1c02625514" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:02:27 np0005588920 nova_compute[226886]: 2026-01-20 15:02:27.032 226890 DEBUG oslo_concurrency.lockutils [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "75368220-ff38-456b-a0e6-ae1c02625514" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:02:27 np0005588920 nova_compute[226886]: 2026-01-20 15:02:27.056 226890 DEBUG nova.compute.manager [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 10:02:27 np0005588920 nova_compute[226886]: 2026-01-20 15:02:27.139 226890 DEBUG oslo_concurrency.lockutils [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:02:27 np0005588920 nova_compute[226886]: 2026-01-20 15:02:27.140 226890 DEBUG oslo_concurrency.lockutils [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:02:27 np0005588920 nova_compute[226886]: 2026-01-20 15:02:27.148 226890 DEBUG nova.virt.hardware [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 10:02:27 np0005588920 nova_compute[226886]: 2026-01-20 15:02:27.149 226890 INFO nova.compute.claims [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 10:02:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:27.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:27 np0005588920 nova_compute[226886]: 2026-01-20 15:02:27.271 226890 DEBUG nova.network.neutron [-] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:02:27 np0005588920 nova_compute[226886]: 2026-01-20 15:02:27.288 226890 INFO nova.compute.manager [-] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Took 1.25 seconds to deallocate network for instance.#033[00m
Jan 20 10:02:27 np0005588920 nova_compute[226886]: 2026-01-20 15:02:27.343 226890 DEBUG oslo_concurrency.processutils [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:02:27 np0005588920 nova_compute[226886]: 2026-01-20 15:02:27.374 226890 DEBUG oslo_concurrency.lockutils [None req-aca3ea61-9863-47e7-9352-be122317ecf6 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:02:27 np0005588920 nova_compute[226886]: 2026-01-20 15:02:27.426 226890 DEBUG nova.compute.manager [req-ee903b3e-b5e2-4b37-9409-6dadc12f4758 req-f1d73400-6a69-4945-a6b2-c2d690288f59 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Received event network-vif-deleted-98da0555-6cf7-43cd-8528-cb59391f5674 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:02:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e336 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:02:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:02:27 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/841351583' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:02:27 np0005588920 nova_compute[226886]: 2026-01-20 15:02:27.839 226890 DEBUG oslo_concurrency.processutils [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.497s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:02:27 np0005588920 nova_compute[226886]: 2026-01-20 15:02:27.847 226890 DEBUG nova.compute.provider_tree [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:02:27 np0005588920 nova_compute[226886]: 2026-01-20 15:02:27.863 226890 DEBUG nova.scheduler.client.report [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:02:27 np0005588920 nova_compute[226886]: 2026-01-20 15:02:27.887 226890 DEBUG oslo_concurrency.lockutils [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.747s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:02:27 np0005588920 nova_compute[226886]: 2026-01-20 15:02:27.888 226890 DEBUG nova.compute.manager [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 10:02:27 np0005588920 nova_compute[226886]: 2026-01-20 15:02:27.891 226890 DEBUG oslo_concurrency.lockutils [None req-aca3ea61-9863-47e7-9352-be122317ecf6 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.517s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:02:27 np0005588920 nova_compute[226886]: 2026-01-20 15:02:27.947 226890 DEBUG oslo_concurrency.processutils [None req-aca3ea61-9863-47e7-9352-be122317ecf6 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:02:27 np0005588920 nova_compute[226886]: 2026-01-20 15:02:27.982 226890 DEBUG nova.compute.manager [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 10:02:27 np0005588920 nova_compute[226886]: 2026-01-20 15:02:27.983 226890 DEBUG nova.network.neutron [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 10:02:28 np0005588920 nova_compute[226886]: 2026-01-20 15:02:28.006 226890 INFO nova.virt.libvirt.driver [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 10:02:28 np0005588920 nova_compute[226886]: 2026-01-20 15:02:28.028 226890 DEBUG nova.compute.manager [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 10:02:28 np0005588920 nova_compute[226886]: 2026-01-20 15:02:28.113 226890 DEBUG nova.compute.manager [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 10:02:28 np0005588920 nova_compute[226886]: 2026-01-20 15:02:28.115 226890 DEBUG nova.virt.libvirt.driver [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 10:02:28 np0005588920 nova_compute[226886]: 2026-01-20 15:02:28.116 226890 INFO nova.virt.libvirt.driver [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Creating image(s)#033[00m
Jan 20 10:02:28 np0005588920 nova_compute[226886]: 2026-01-20 15:02:28.142 226890 DEBUG nova.storage.rbd_utils [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] rbd image 75368220-ff38-456b-a0e6-ae1c02625514_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:02:28 np0005588920 nova_compute[226886]: 2026-01-20 15:02:28.168 226890 DEBUG nova.storage.rbd_utils [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] rbd image 75368220-ff38-456b-a0e6-ae1c02625514_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:02:28 np0005588920 nova_compute[226886]: 2026-01-20 15:02:28.195 226890 DEBUG nova.storage.rbd_utils [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] rbd image 75368220-ff38-456b-a0e6-ae1c02625514_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:02:28 np0005588920 nova_compute[226886]: 2026-01-20 15:02:28.199 226890 DEBUG oslo_concurrency.processutils [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:02:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:28.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:28 np0005588920 nova_compute[226886]: 2026-01-20 15:02:28.272 226890 DEBUG oslo_concurrency.processutils [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:02:28 np0005588920 nova_compute[226886]: 2026-01-20 15:02:28.273 226890 DEBUG oslo_concurrency.lockutils [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:02:28 np0005588920 nova_compute[226886]: 2026-01-20 15:02:28.273 226890 DEBUG oslo_concurrency.lockutils [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:02:28 np0005588920 nova_compute[226886]: 2026-01-20 15:02:28.274 226890 DEBUG oslo_concurrency.lockutils [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:02:28 np0005588920 nova_compute[226886]: 2026-01-20 15:02:28.306 226890 DEBUG nova.storage.rbd_utils [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] rbd image 75368220-ff38-456b-a0e6-ae1c02625514_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:02:28 np0005588920 nova_compute[226886]: 2026-01-20 15:02:28.309 226890 DEBUG oslo_concurrency.processutils [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 75368220-ff38-456b-a0e6-ae1c02625514_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:02:28 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:02:28 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3390699459' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:02:28 np0005588920 nova_compute[226886]: 2026-01-20 15:02:28.430 226890 DEBUG oslo_concurrency.processutils [None req-aca3ea61-9863-47e7-9352-be122317ecf6 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:02:28 np0005588920 nova_compute[226886]: 2026-01-20 15:02:28.436 226890 DEBUG nova.compute.provider_tree [None req-aca3ea61-9863-47e7-9352-be122317ecf6 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:02:28 np0005588920 nova_compute[226886]: 2026-01-20 15:02:28.459 226890 DEBUG nova.scheduler.client.report [None req-aca3ea61-9863-47e7-9352-be122317ecf6 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:02:28 np0005588920 nova_compute[226886]: 2026-01-20 15:02:28.481 226890 DEBUG oslo_concurrency.lockutils [None req-aca3ea61-9863-47e7-9352-be122317ecf6 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.590s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:02:28 np0005588920 nova_compute[226886]: 2026-01-20 15:02:28.513 226890 INFO nova.scheduler.client.report [None req-aca3ea61-9863-47e7-9352-be122317ecf6 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Deleted allocations for instance 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10#033[00m
Jan 20 10:02:28 np0005588920 nova_compute[226886]: 2026-01-20 15:02:28.608 226890 DEBUG nova.policy [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '912329b1a6ad42bdb72e952c03983bdf', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '96f7b14c2a9348f08305fe232df2a603', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 10:02:28 np0005588920 nova_compute[226886]: 2026-01-20 15:02:28.613 226890 DEBUG oslo_concurrency.processutils [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 75368220-ff38-456b-a0e6-ae1c02625514_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.304s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:02:28 np0005588920 nova_compute[226886]: 2026-01-20 15:02:28.653 226890 DEBUG oslo_concurrency.lockutils [None req-aca3ea61-9863-47e7-9352-be122317ecf6 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Lock "18f2cf64-c2d5-4f0b-a16e-48cf2f558c10" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.848s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:02:28 np0005588920 nova_compute[226886]: 2026-01-20 15:02:28.704 226890 DEBUG nova.storage.rbd_utils [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] resizing rbd image 75368220-ff38-456b-a0e6-ae1c02625514_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 10:02:28 np0005588920 nova_compute[226886]: 2026-01-20 15:02:28.843 226890 DEBUG nova.objects.instance [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lazy-loading 'migration_context' on Instance uuid 75368220-ff38-456b-a0e6-ae1c02625514 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:02:28 np0005588920 nova_compute[226886]: 2026-01-20 15:02:28.858 226890 DEBUG nova.virt.libvirt.driver [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 10:02:28 np0005588920 nova_compute[226886]: 2026-01-20 15:02:28.858 226890 DEBUG nova.virt.libvirt.driver [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Ensure instance console log exists: /var/lib/nova/instances/75368220-ff38-456b-a0e6-ae1c02625514/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:02:28 np0005588920 nova_compute[226886]: 2026-01-20 15:02:28.859 226890 DEBUG oslo_concurrency.lockutils [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:02:28 np0005588920 nova_compute[226886]: 2026-01-20 15:02:28.859 226890 DEBUG oslo_concurrency.lockutils [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:02:28 np0005588920 nova_compute[226886]: 2026-01-20 15:02:28.859 226890 DEBUG oslo_concurrency.lockutils [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:02:28 np0005588920 nova_compute[226886]: 2026-01-20 15:02:28.914 226890 DEBUG oslo_concurrency.lockutils [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Acquiring lock "c2a7aae5-0ef8-400a-acfe-2fbf83144560" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:02:28 np0005588920 nova_compute[226886]: 2026-01-20 15:02:28.914 226890 DEBUG oslo_concurrency.lockutils [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Lock "c2a7aae5-0ef8-400a-acfe-2fbf83144560" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:02:28 np0005588920 nova_compute[226886]: 2026-01-20 15:02:28.930 226890 DEBUG nova.compute.manager [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 10:02:28 np0005588920 nova_compute[226886]: 2026-01-20 15:02:28.988 226890 DEBUG oslo_concurrency.lockutils [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:02:28 np0005588920 nova_compute[226886]: 2026-01-20 15:02:28.989 226890 DEBUG oslo_concurrency.lockutils [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:02:28 np0005588920 nova_compute[226886]: 2026-01-20 15:02:28.994 226890 DEBUG nova.virt.hardware [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 10:02:28 np0005588920 nova_compute[226886]: 2026-01-20 15:02:28.994 226890 INFO nova.compute.claims [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 10:02:29 np0005588920 nova_compute[226886]: 2026-01-20 15:02:29.070 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:29 np0005588920 nova_compute[226886]: 2026-01-20 15:02:29.126 226890 DEBUG oslo_concurrency.processutils [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:02:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:02:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:29.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:02:29 np0005588920 nova_compute[226886]: 2026-01-20 15:02:29.287 226890 DEBUG nova.network.neutron [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Successfully created port: 27ba7c79-863a-4084-a5df-ee7a70ec6e0d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 10:02:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:02:29 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3652761978' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:02:29 np0005588920 nova_compute[226886]: 2026-01-20 15:02:29.604 226890 DEBUG oslo_concurrency.processutils [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:02:29 np0005588920 nova_compute[226886]: 2026-01-20 15:02:29.611 226890 DEBUG nova.compute.provider_tree [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:02:29 np0005588920 nova_compute[226886]: 2026-01-20 15:02:29.636 226890 DEBUG nova.scheduler.client.report [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:02:29 np0005588920 nova_compute[226886]: 2026-01-20 15:02:29.663 226890 DEBUG oslo_concurrency.lockutils [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.675s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:02:29 np0005588920 nova_compute[226886]: 2026-01-20 15:02:29.664 226890 DEBUG nova.compute.manager [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 10:02:29 np0005588920 nova_compute[226886]: 2026-01-20 15:02:29.718 226890 DEBUG nova.compute.manager [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 10:02:29 np0005588920 nova_compute[226886]: 2026-01-20 15:02:29.718 226890 DEBUG nova.network.neutron [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 10:02:29 np0005588920 nova_compute[226886]: 2026-01-20 15:02:29.751 226890 INFO nova.virt.libvirt.driver [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 10:02:29 np0005588920 nova_compute[226886]: 2026-01-20 15:02:29.785 226890 DEBUG nova.compute.manager [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 10:02:29 np0005588920 nova_compute[226886]: 2026-01-20 15:02:29.881 226890 DEBUG nova.compute.manager [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 10:02:29 np0005588920 nova_compute[226886]: 2026-01-20 15:02:29.883 226890 DEBUG nova.virt.libvirt.driver [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 10:02:29 np0005588920 nova_compute[226886]: 2026-01-20 15:02:29.884 226890 INFO nova.virt.libvirt.driver [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Creating image(s)#033[00m
Jan 20 10:02:29 np0005588920 nova_compute[226886]: 2026-01-20 15:02:29.909 226890 DEBUG nova.storage.rbd_utils [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] rbd image c2a7aae5-0ef8-400a-acfe-2fbf83144560_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:02:30 np0005588920 nova_compute[226886]: 2026-01-20 15:02:30.009 226890 DEBUG nova.storage.rbd_utils [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] rbd image c2a7aae5-0ef8-400a-acfe-2fbf83144560_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:02:30 np0005588920 nova_compute[226886]: 2026-01-20 15:02:30.031 226890 DEBUG nova.storage.rbd_utils [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] rbd image c2a7aae5-0ef8-400a-acfe-2fbf83144560_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:02:30 np0005588920 nova_compute[226886]: 2026-01-20 15:02:30.035 226890 DEBUG oslo_concurrency.lockutils [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Acquiring lock "40b133d2dcbcba45d41e32c281e6ed8df52446f8" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:02:30 np0005588920 nova_compute[226886]: 2026-01-20 15:02:30.036 226890 DEBUG oslo_concurrency.lockutils [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Lock "40b133d2dcbcba45d41e32c281e6ed8df52446f8" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:02:30 np0005588920 nova_compute[226886]: 2026-01-20 15:02:30.041 226890 DEBUG nova.policy [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1654794111844ca88666b3529173e9a7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3a1d679d5c954662a271e842fe2f2c05', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 10:02:30 np0005588920 nova_compute[226886]: 2026-01-20 15:02:30.194 226890 DEBUG nova.network.neutron [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Successfully updated port: 27ba7c79-863a-4084-a5df-ee7a70ec6e0d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 10:02:30 np0005588920 nova_compute[226886]: 2026-01-20 15:02:30.213 226890 DEBUG oslo_concurrency.lockutils [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Acquiring lock "refresh_cache-75368220-ff38-456b-a0e6-ae1c02625514" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:02:30 np0005588920 nova_compute[226886]: 2026-01-20 15:02:30.213 226890 DEBUG oslo_concurrency.lockutils [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Acquired lock "refresh_cache-75368220-ff38-456b-a0e6-ae1c02625514" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:02:30 np0005588920 nova_compute[226886]: 2026-01-20 15:02:30.214 226890 DEBUG nova.network.neutron [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:02:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:30.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:30 np0005588920 nova_compute[226886]: 2026-01-20 15:02:30.294 226890 DEBUG nova.compute.manager [req-89dd1231-cbaa-496a-8b20-7f123604700f req-65b34c98-6db8-467e-ac7c-0c7d1db75167 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Received event network-changed-27ba7c79-863a-4084-a5df-ee7a70ec6e0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:02:30 np0005588920 nova_compute[226886]: 2026-01-20 15:02:30.295 226890 DEBUG nova.compute.manager [req-89dd1231-cbaa-496a-8b20-7f123604700f req-65b34c98-6db8-467e-ac7c-0c7d1db75167 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Refreshing instance network info cache due to event network-changed-27ba7c79-863a-4084-a5df-ee7a70ec6e0d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:02:30 np0005588920 nova_compute[226886]: 2026-01-20 15:02:30.295 226890 DEBUG oslo_concurrency.lockutils [req-89dd1231-cbaa-496a-8b20-7f123604700f req-65b34c98-6db8-467e-ac7c-0c7d1db75167 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-75368220-ff38-456b-a0e6-ae1c02625514" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:02:30 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e337 e337: 3 total, 3 up, 3 in
Jan 20 10:02:30 np0005588920 nova_compute[226886]: 2026-01-20 15:02:30.324 226890 DEBUG nova.virt.libvirt.imagebackend [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Image locations are: [{'url': 'rbd://e399cf45-e6b6-5393-99f1-75c601d3f188/images/97fb0fa0-6803-480b-96d2-4a219153376d/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://e399cf45-e6b6-5393-99f1-75c601d3f188/images/97fb0fa0-6803-480b-96d2-4a219153376d/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Jan 20 10:02:30 np0005588920 nova_compute[226886]: 2026-01-20 15:02:30.427 226890 DEBUG nova.network.neutron [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:02:30 np0005588920 nova_compute[226886]: 2026-01-20 15:02:30.433 226890 DEBUG nova.virt.libvirt.imagebackend [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Selected location: {'url': 'rbd://e399cf45-e6b6-5393-99f1-75c601d3f188/images/97fb0fa0-6803-480b-96d2-4a219153376d/snap', 'metadata': {'store': 'default_backend'}} clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1094#033[00m
Jan 20 10:02:30 np0005588920 nova_compute[226886]: 2026-01-20 15:02:30.434 226890 DEBUG nova.storage.rbd_utils [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] cloning images/97fb0fa0-6803-480b-96d2-4a219153376d@snap to None/c2a7aae5-0ef8-400a-acfe-2fbf83144560_disk clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 20 10:02:30 np0005588920 nova_compute[226886]: 2026-01-20 15:02:30.543 226890 DEBUG oslo_concurrency.lockutils [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Lock "40b133d2dcbcba45d41e32c281e6ed8df52446f8" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.507s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:02:30 np0005588920 nova_compute[226886]: 2026-01-20 15:02:30.695 226890 DEBUG nova.objects.instance [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Lazy-loading 'migration_context' on Instance uuid c2a7aae5-0ef8-400a-acfe-2fbf83144560 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:02:30 np0005588920 nova_compute[226886]: 2026-01-20 15:02:30.718 226890 DEBUG nova.virt.libvirt.driver [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 10:02:30 np0005588920 nova_compute[226886]: 2026-01-20 15:02:30.719 226890 DEBUG nova.virt.libvirt.driver [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Ensure instance console log exists: /var/lib/nova/instances/c2a7aae5-0ef8-400a-acfe-2fbf83144560/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:02:30 np0005588920 nova_compute[226886]: 2026-01-20 15:02:30.719 226890 DEBUG oslo_concurrency.lockutils [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:02:30 np0005588920 nova_compute[226886]: 2026-01-20 15:02:30.719 226890 DEBUG oslo_concurrency.lockutils [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:02:30 np0005588920 nova_compute[226886]: 2026-01-20 15:02:30.720 226890 DEBUG oslo_concurrency.lockutils [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:02:31 np0005588920 nova_compute[226886]: 2026-01-20 15:02:31.164 226890 DEBUG nova.network.neutron [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Successfully created port: 4c88fb15-8276-4e15-8d48-e7ff7412f9be _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 10:02:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:31.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:31 np0005588920 nova_compute[226886]: 2026-01-20 15:02:31.367 226890 DEBUG nova.network.neutron [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Updating instance_info_cache with network_info: [{"id": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "address": "fa:16:3e:3d:87:66", "network": {"id": "89fdd65f-3dd2-4375-a946-3c5de73cc24a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1916368729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96f7b14c2a9348f08305fe232df2a603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27ba7c79-86", "ovs_interfaceid": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:02:31 np0005588920 nova_compute[226886]: 2026-01-20 15:02:31.400 226890 DEBUG oslo_concurrency.lockutils [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Releasing lock "refresh_cache-75368220-ff38-456b-a0e6-ae1c02625514" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:02:31 np0005588920 nova_compute[226886]: 2026-01-20 15:02:31.400 226890 DEBUG nova.compute.manager [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Instance network_info: |[{"id": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "address": "fa:16:3e:3d:87:66", "network": {"id": "89fdd65f-3dd2-4375-a946-3c5de73cc24a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1916368729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96f7b14c2a9348f08305fe232df2a603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27ba7c79-86", "ovs_interfaceid": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 10:02:31 np0005588920 nova_compute[226886]: 2026-01-20 15:02:31.401 226890 DEBUG oslo_concurrency.lockutils [req-89dd1231-cbaa-496a-8b20-7f123604700f req-65b34c98-6db8-467e-ac7c-0c7d1db75167 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-75368220-ff38-456b-a0e6-ae1c02625514" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:02:31 np0005588920 nova_compute[226886]: 2026-01-20 15:02:31.401 226890 DEBUG nova.network.neutron [req-89dd1231-cbaa-496a-8b20-7f123604700f req-65b34c98-6db8-467e-ac7c-0c7d1db75167 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Refreshing network info cache for port 27ba7c79-863a-4084-a5df-ee7a70ec6e0d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:02:31 np0005588920 nova_compute[226886]: 2026-01-20 15:02:31.403 226890 DEBUG nova.virt.libvirt.driver [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Start _get_guest_xml network_info=[{"id": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "address": "fa:16:3e:3d:87:66", "network": {"id": "89fdd65f-3dd2-4375-a946-3c5de73cc24a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1916368729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96f7b14c2a9348f08305fe232df2a603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27ba7c79-86", "ovs_interfaceid": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:02:31 np0005588920 nova_compute[226886]: 2026-01-20 15:02:31.407 226890 WARNING nova.virt.libvirt.driver [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 20 10:02:31 np0005588920 nova_compute[226886]: 2026-01-20 15:02:31.411 226890 DEBUG nova.virt.libvirt.host [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 20 10:02:31 np0005588920 nova_compute[226886]: 2026-01-20 15:02:31.411 226890 DEBUG nova.virt.libvirt.host [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 20 10:02:31 np0005588920 nova_compute[226886]: 2026-01-20 15:02:31.414 226890 DEBUG nova.virt.libvirt.host [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 20 10:02:31 np0005588920 nova_compute[226886]: 2026-01-20 15:02:31.414 226890 DEBUG nova.virt.libvirt.host [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 20 10:02:31 np0005588920 nova_compute[226886]: 2026-01-20 15:02:31.415 226890 DEBUG nova.virt.libvirt.driver [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 20 10:02:31 np0005588920 nova_compute[226886]: 2026-01-20 15:02:31.415 226890 DEBUG nova.virt.hardware [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:02:31 np0005588920 nova_compute[226886]: 2026-01-20 15:02:31.416 226890 DEBUG nova.virt.hardware [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 20 10:02:31 np0005588920 nova_compute[226886]: 2026-01-20 15:02:31.416 226890 DEBUG nova.virt.hardware [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 20 10:02:31 np0005588920 nova_compute[226886]: 2026-01-20 15:02:31.416 226890 DEBUG nova.virt.hardware [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 20 10:02:31 np0005588920 nova_compute[226886]: 2026-01-20 15:02:31.416 226890 DEBUG nova.virt.hardware [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 20 10:02:31 np0005588920 nova_compute[226886]: 2026-01-20 15:02:31.417 226890 DEBUG nova.virt.hardware [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 20 10:02:31 np0005588920 nova_compute[226886]: 2026-01-20 15:02:31.417 226890 DEBUG nova.virt.hardware [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 20 10:02:31 np0005588920 nova_compute[226886]: 2026-01-20 15:02:31.417 226890 DEBUG nova.virt.hardware [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 20 10:02:31 np0005588920 nova_compute[226886]: 2026-01-20 15:02:31.417 226890 DEBUG nova.virt.hardware [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 20 10:02:31 np0005588920 nova_compute[226886]: 2026-01-20 15:02:31.417 226890 DEBUG nova.virt.hardware [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 20 10:02:31 np0005588920 nova_compute[226886]: 2026-01-20 15:02:31.418 226890 DEBUG nova.virt.hardware [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 20 10:02:31 np0005588920 nova_compute[226886]: 2026-01-20 15:02:31.420 226890 DEBUG oslo_concurrency.processutils [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:02:31 np0005588920 nova_compute[226886]: 2026-01-20 15:02:31.458 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:02:31 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:02:31 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3110069501' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:02:31 np0005588920 nova_compute[226886]: 2026-01-20 15:02:31.874 226890 DEBUG oslo_concurrency.processutils [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:02:31 np0005588920 nova_compute[226886]: 2026-01-20 15:02:31.901 226890 DEBUG nova.storage.rbd_utils [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] rbd image 75368220-ff38-456b-a0e6-ae1c02625514_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:02:31 np0005588920 nova_compute[226886]: 2026-01-20 15:02:31.905 226890 DEBUG oslo_concurrency.processutils [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:02:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:32.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:02:32 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2716342833' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:02:32 np0005588920 nova_compute[226886]: 2026-01-20 15:02:32.349 226890 DEBUG oslo_concurrency.processutils [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:02:32 np0005588920 nova_compute[226886]: 2026-01-20 15:02:32.351 226890 DEBUG nova.virt.libvirt.vif [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:02:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-284183767',display_name='tempest-AttachVolumeTestJSON-server-284183767',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumetestjson-server-284183767',id=155,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFZMMa9Fn48cT13jyLNVxZBqG2NAPc4g1Znb9IEN8J7OPuEySAWtPNC9EMH4uWUG8OO1N+YGXE5zrWJgSxgzur/4qS1UEfGQON2+xLOpRFvKfmgmBUr46iCGe8EkNjED6w==',key_name='tempest-keypair-988338540',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='96f7b14c2a9348f08305fe232df2a603',ramdisk_id='',reservation_id='r-qucgebjt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachVolumeTestJSON-583320363',owner_user_name='tempest-AttachVolumeTestJSON-583320363-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:02:28Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='912329b1a6ad42bdb72e952c03983bdf',uuid=75368220-ff38-456b-a0e6-ae1c02625514,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "address": "fa:16:3e:3d:87:66", "network": {"id": "89fdd65f-3dd2-4375-a946-3c5de73cc24a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1916368729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96f7b14c2a9348f08305fe232df2a603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27ba7c79-86", "ovs_interfaceid": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:02:32 np0005588920 nova_compute[226886]: 2026-01-20 15:02:32.352 226890 DEBUG nova.network.os_vif_util [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Converting VIF {"id": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "address": "fa:16:3e:3d:87:66", "network": {"id": "89fdd65f-3dd2-4375-a946-3c5de73cc24a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1916368729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96f7b14c2a9348f08305fe232df2a603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27ba7c79-86", "ovs_interfaceid": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:02:32 np0005588920 nova_compute[226886]: 2026-01-20 15:02:32.353 226890 DEBUG nova.network.os_vif_util [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3d:87:66,bridge_name='br-int',has_traffic_filtering=True,id=27ba7c79-863a-4084-a5df-ee7a70ec6e0d,network=Network(89fdd65f-3dd2-4375-a946-3c5de73cc24a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27ba7c79-86') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 20 10:02:32 np0005588920 nova_compute[226886]: 2026-01-20 15:02:32.358 226890 DEBUG nova.objects.instance [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lazy-loading 'pci_devices' on Instance uuid 75368220-ff38-456b-a0e6-ae1c02625514 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 10:02:32 np0005588920 nova_compute[226886]: 2026-01-20 15:02:32.373 226890 DEBUG nova.virt.libvirt.driver [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:02:32 np0005588920 nova_compute[226886]:  <uuid>75368220-ff38-456b-a0e6-ae1c02625514</uuid>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:  <name>instance-0000009b</name>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:02:32 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:      <nova:name>tempest-AttachVolumeTestJSON-server-284183767</nova:name>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 15:02:31</nova:creationTime>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 10:02:32 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:        <nova:user uuid="912329b1a6ad42bdb72e952c03983bdf">tempest-AttachVolumeTestJSON-583320363-project-member</nova:user>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:        <nova:project uuid="96f7b14c2a9348f08305fe232df2a603">tempest-AttachVolumeTestJSON-583320363</nova:project>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:        <nova:port uuid="27ba7c79-863a-4084-a5df-ee7a70ec6e0d">
Jan 20 10:02:32 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    <system>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:      <entry name="serial">75368220-ff38-456b-a0e6-ae1c02625514</entry>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:      <entry name="uuid">75368220-ff38-456b-a0e6-ae1c02625514</entry>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    </system>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:  <os>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:  </os>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:  <features>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:  </features>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:  </clock>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:  <devices>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 10:02:32 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/75368220-ff38-456b-a0e6-ae1c02625514_disk">
Jan 20 10:02:32 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:02:32 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 10:02:32 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/75368220-ff38-456b-a0e6-ae1c02625514_disk.config">
Jan 20 10:02:32 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:02:32 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 10:02:32 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:3d:87:66"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:      <target dev="tap27ba7c79-86"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    </interface>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 10:02:32 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/75368220-ff38-456b-a0e6-ae1c02625514/console.log" append="off"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    </serial>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    <video>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    </video>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 10:02:32 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    </rng>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 10:02:32 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 10:02:32 np0005588920 nova_compute[226886]:  </devices>
Jan 20 10:02:32 np0005588920 nova_compute[226886]: </domain>
Jan 20 10:02:32 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 20 10:02:32 np0005588920 nova_compute[226886]: 2026-01-20 15:02:32.375 226890 DEBUG nova.compute.manager [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Preparing to wait for external event network-vif-plugged-27ba7c79-863a-4084-a5df-ee7a70ec6e0d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 20 10:02:32 np0005588920 nova_compute[226886]: 2026-01-20 15:02:32.375 226890 DEBUG oslo_concurrency.lockutils [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Acquiring lock "75368220-ff38-456b-a0e6-ae1c02625514-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:02:32 np0005588920 nova_compute[226886]: 2026-01-20 15:02:32.375 226890 DEBUG oslo_concurrency.lockutils [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "75368220-ff38-456b-a0e6-ae1c02625514-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:02:32 np0005588920 nova_compute[226886]: 2026-01-20 15:02:32.375 226890 DEBUG oslo_concurrency.lockutils [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "75368220-ff38-456b-a0e6-ae1c02625514-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:02:32 np0005588920 nova_compute[226886]: 2026-01-20 15:02:32.376 226890 DEBUG nova.virt.libvirt.vif [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:02:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-284183767',display_name='tempest-AttachVolumeTestJSON-server-284183767',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumetestjson-server-284183767',id=155,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFZMMa9Fn48cT13jyLNVxZBqG2NAPc4g1Znb9IEN8J7OPuEySAWtPNC9EMH4uWUG8OO1N+YGXE5zrWJgSxgzur/4qS1UEfGQON2+xLOpRFvKfmgmBUr46iCGe8EkNjED6w==',key_name='tempest-keypair-988338540',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='96f7b14c2a9348f08305fe232df2a603',ramdisk_id='',reservation_id='r-qucgebjt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachVolumeTestJSON-583320363',owner_user_name='tempest-AttachVolumeTestJSON-583320363-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:02:28Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='912329b1a6ad42bdb72e952c03983bdf',uuid=75368220-ff38-456b-a0e6-ae1c02625514,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "address": "fa:16:3e:3d:87:66", "network": {"id": "89fdd65f-3dd2-4375-a946-3c5de73cc24a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1916368729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96f7b14c2a9348f08305fe232df2a603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27ba7c79-86", "ovs_interfaceid": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:02:32 np0005588920 nova_compute[226886]: 2026-01-20 15:02:32.376 226890 DEBUG nova.network.os_vif_util [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Converting VIF {"id": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "address": "fa:16:3e:3d:87:66", "network": {"id": "89fdd65f-3dd2-4375-a946-3c5de73cc24a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1916368729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96f7b14c2a9348f08305fe232df2a603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27ba7c79-86", "ovs_interfaceid": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:02:32 np0005588920 nova_compute[226886]: 2026-01-20 15:02:32.377 226890 DEBUG nova.network.os_vif_util [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3d:87:66,bridge_name='br-int',has_traffic_filtering=True,id=27ba7c79-863a-4084-a5df-ee7a70ec6e0d,network=Network(89fdd65f-3dd2-4375-a946-3c5de73cc24a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27ba7c79-86') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:02:32 np0005588920 nova_compute[226886]: 2026-01-20 15:02:32.377 226890 DEBUG os_vif [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3d:87:66,bridge_name='br-int',has_traffic_filtering=True,id=27ba7c79-863a-4084-a5df-ee7a70ec6e0d,network=Network(89fdd65f-3dd2-4375-a946-3c5de73cc24a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27ba7c79-86') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:02:32 np0005588920 nova_compute[226886]: 2026-01-20 15:02:32.378 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:32 np0005588920 nova_compute[226886]: 2026-01-20 15:02:32.378 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:02:32 np0005588920 nova_compute[226886]: 2026-01-20 15:02:32.379 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:02:32 np0005588920 nova_compute[226886]: 2026-01-20 15:02:32.381 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:32 np0005588920 nova_compute[226886]: 2026-01-20 15:02:32.381 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap27ba7c79-86, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:02:32 np0005588920 nova_compute[226886]: 2026-01-20 15:02:32.381 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap27ba7c79-86, col_values=(('external_ids', {'iface-id': '27ba7c79-863a-4084-a5df-ee7a70ec6e0d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3d:87:66', 'vm-uuid': '75368220-ff38-456b-a0e6-ae1c02625514'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:02:32 np0005588920 nova_compute[226886]: 2026-01-20 15:02:32.383 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:32 np0005588920 NetworkManager[49076]: <info>  [1768921352.3842] manager: (tap27ba7c79-86): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/336)
Jan 20 10:02:32 np0005588920 nova_compute[226886]: 2026-01-20 15:02:32.385 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:02:32 np0005588920 nova_compute[226886]: 2026-01-20 15:02:32.387 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:32 np0005588920 nova_compute[226886]: 2026-01-20 15:02:32.388 226890 INFO os_vif [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3d:87:66,bridge_name='br-int',has_traffic_filtering=True,id=27ba7c79-863a-4084-a5df-ee7a70ec6e0d,network=Network(89fdd65f-3dd2-4375-a946-3c5de73cc24a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27ba7c79-86')#033[00m
Jan 20 10:02:32 np0005588920 nova_compute[226886]: 2026-01-20 15:02:32.443 226890 DEBUG nova.virt.libvirt.driver [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:02:32 np0005588920 nova_compute[226886]: 2026-01-20 15:02:32.444 226890 DEBUG nova.virt.libvirt.driver [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:02:32 np0005588920 nova_compute[226886]: 2026-01-20 15:02:32.444 226890 DEBUG nova.virt.libvirt.driver [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] No VIF found with MAC fa:16:3e:3d:87:66, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:02:32 np0005588920 nova_compute[226886]: 2026-01-20 15:02:32.445 226890 INFO nova.virt.libvirt.driver [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Using config drive#033[00m
Jan 20 10:02:32 np0005588920 nova_compute[226886]: 2026-01-20 15:02:32.491 226890 DEBUG nova.storage.rbd_utils [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] rbd image 75368220-ff38-456b-a0e6-ae1c02625514_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:02:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e337 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:02:32 np0005588920 nova_compute[226886]: 2026-01-20 15:02:32.802 226890 DEBUG nova.network.neutron [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Successfully updated port: 4c88fb15-8276-4e15-8d48-e7ff7412f9be _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 10:02:32 np0005588920 nova_compute[226886]: 2026-01-20 15:02:32.820 226890 DEBUG oslo_concurrency.lockutils [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Acquiring lock "refresh_cache-c2a7aae5-0ef8-400a-acfe-2fbf83144560" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:02:32 np0005588920 nova_compute[226886]: 2026-01-20 15:02:32.821 226890 DEBUG oslo_concurrency.lockutils [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Acquired lock "refresh_cache-c2a7aae5-0ef8-400a-acfe-2fbf83144560" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:02:32 np0005588920 nova_compute[226886]: 2026-01-20 15:02:32.821 226890 DEBUG nova.network.neutron [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:02:32 np0005588920 nova_compute[226886]: 2026-01-20 15:02:32.843 226890 INFO nova.virt.libvirt.driver [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Creating config drive at /var/lib/nova/instances/75368220-ff38-456b-a0e6-ae1c02625514/disk.config#033[00m
Jan 20 10:02:32 np0005588920 nova_compute[226886]: 2026-01-20 15:02:32.849 226890 DEBUG oslo_concurrency.processutils [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/75368220-ff38-456b-a0e6-ae1c02625514/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8fjoudml execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:02:32 np0005588920 nova_compute[226886]: 2026-01-20 15:02:32.897 226890 DEBUG nova.compute.manager [req-bf2d0ad0-db15-45a9-af8b-eb339d0241c9 req-1bd8faa9-4540-48e9-8507-dc5a2a657377 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Received event network-changed-4c88fb15-8276-4e15-8d48-e7ff7412f9be external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:02:32 np0005588920 nova_compute[226886]: 2026-01-20 15:02:32.898 226890 DEBUG nova.compute.manager [req-bf2d0ad0-db15-45a9-af8b-eb339d0241c9 req-1bd8faa9-4540-48e9-8507-dc5a2a657377 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Refreshing instance network info cache due to event network-changed-4c88fb15-8276-4e15-8d48-e7ff7412f9be. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:02:32 np0005588920 nova_compute[226886]: 2026-01-20 15:02:32.898 226890 DEBUG oslo_concurrency.lockutils [req-bf2d0ad0-db15-45a9-af8b-eb339d0241c9 req-1bd8faa9-4540-48e9-8507-dc5a2a657377 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-c2a7aae5-0ef8-400a-acfe-2fbf83144560" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:02:32 np0005588920 nova_compute[226886]: 2026-01-20 15:02:32.983 226890 DEBUG oslo_concurrency.processutils [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/75368220-ff38-456b-a0e6-ae1c02625514/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8fjoudml" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:02:33 np0005588920 nova_compute[226886]: 2026-01-20 15:02:33.017 226890 DEBUG nova.storage.rbd_utils [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] rbd image 75368220-ff38-456b-a0e6-ae1c02625514_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:02:33 np0005588920 nova_compute[226886]: 2026-01-20 15:02:33.022 226890 DEBUG oslo_concurrency.processutils [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/75368220-ff38-456b-a0e6-ae1c02625514/disk.config 75368220-ff38-456b-a0e6-ae1c02625514_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:02:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:33.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:33 np0005588920 nova_compute[226886]: 2026-01-20 15:02:33.566 226890 DEBUG nova.network.neutron [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:02:33 np0005588920 nova_compute[226886]: 2026-01-20 15:02:33.643 226890 DEBUG oslo_concurrency.processutils [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/75368220-ff38-456b-a0e6-ae1c02625514/disk.config 75368220-ff38-456b-a0e6-ae1c02625514_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.621s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:02:33 np0005588920 nova_compute[226886]: 2026-01-20 15:02:33.644 226890 INFO nova.virt.libvirt.driver [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Deleting local config drive /var/lib/nova/instances/75368220-ff38-456b-a0e6-ae1c02625514/disk.config because it was imported into RBD.#033[00m
Jan 20 10:02:33 np0005588920 kernel: tap27ba7c79-86: entered promiscuous mode
Jan 20 10:02:33 np0005588920 ovn_controller[133971]: 2026-01-20T15:02:33Z|00694|binding|INFO|Claiming lport 27ba7c79-863a-4084-a5df-ee7a70ec6e0d for this chassis.
Jan 20 10:02:33 np0005588920 NetworkManager[49076]: <info>  [1768921353.6934] manager: (tap27ba7c79-86): new Tun device (/org/freedesktop/NetworkManager/Devices/337)
Jan 20 10:02:33 np0005588920 ovn_controller[133971]: 2026-01-20T15:02:33Z|00695|binding|INFO|27ba7c79-863a-4084-a5df-ee7a70ec6e0d: Claiming fa:16:3e:3d:87:66 10.100.0.5
Jan 20 10:02:33 np0005588920 nova_compute[226886]: 2026-01-20 15:02:33.693 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:33 np0005588920 nova_compute[226886]: 2026-01-20 15:02:33.697 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:33.706 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3d:87:66 10.100.0.5'], port_security=['fa:16:3e:3d:87:66 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '75368220-ff38-456b-a0e6-ae1c02625514', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-89fdd65f-3dd2-4375-a946-3c5de73cc24a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '96f7b14c2a9348f08305fe232df2a603', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9aa52617-8217-40d2-b2b6-31674dd65078', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=07b54ff9-b8ec-4b9d-ab83-0d9fa6361dd1, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=27ba7c79-863a-4084-a5df-ee7a70ec6e0d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:02:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:33.708 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 27ba7c79-863a-4084-a5df-ee7a70ec6e0d in datapath 89fdd65f-3dd2-4375-a946-3c5de73cc24a bound to our chassis#033[00m
Jan 20 10:02:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:33.709 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 89fdd65f-3dd2-4375-a946-3c5de73cc24a#033[00m
Jan 20 10:02:33 np0005588920 systemd-udevd[282971]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:02:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:33.720 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0f92aa94-c2d7-4881-a5da-b6eb1e450515]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:33.721 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap89fdd65f-31 in ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:02:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:33.722 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap89fdd65f-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:02:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:33.723 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5d6d7a18-73d2-4f7f-b400-a91a821bd087]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:33.723 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[1bbf7001-73cc-4cf1-969e-2b52a4f7bdca]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:33 np0005588920 systemd-machined[196121]: New machine qemu-70-instance-0000009b.
Jan 20 10:02:33 np0005588920 NetworkManager[49076]: <info>  [1768921353.7309] device (tap27ba7c79-86): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:02:33 np0005588920 NetworkManager[49076]: <info>  [1768921353.7321] device (tap27ba7c79-86): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:02:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:33.734 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[a3c7e36a-0aa2-41ee-8a4b-365baa80ecf8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:33.758 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2004e71d-bfbb-4025-aa62-4ce5b2cfa76e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:33 np0005588920 systemd[1]: Started Virtual Machine qemu-70-instance-0000009b.
Jan 20 10:02:33 np0005588920 nova_compute[226886]: 2026-01-20 15:02:33.764 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:33 np0005588920 ovn_controller[133971]: 2026-01-20T15:02:33Z|00696|binding|INFO|Setting lport 27ba7c79-863a-4084-a5df-ee7a70ec6e0d ovn-installed in OVS
Jan 20 10:02:33 np0005588920 ovn_controller[133971]: 2026-01-20T15:02:33Z|00697|binding|INFO|Setting lport 27ba7c79-863a-4084-a5df-ee7a70ec6e0d up in Southbound
Jan 20 10:02:33 np0005588920 nova_compute[226886]: 2026-01-20 15:02:33.772 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:33.787 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[28e1438d-5a00-4dc5-b99c-116b45a473c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:33.794 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[4009ca3d-07ba-4f07-9f6a-67975782df43]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:33 np0005588920 NetworkManager[49076]: <info>  [1768921353.7947] manager: (tap89fdd65f-30): new Veth device (/org/freedesktop/NetworkManager/Devices/338)
Jan 20 10:02:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:33.828 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[b74910f6-5c1a-4b5e-b244-b1b0d9118d95]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:33.831 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[62e9178c-f487-4a39-964a-cedcf7c5b6ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:33 np0005588920 NetworkManager[49076]: <info>  [1768921353.8539] device (tap89fdd65f-30): carrier: link connected
Jan 20 10:02:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:33.859 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[f3c6ed3d-84b7-4c38-9b25-00d7ff01e011]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:33.873 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f48df1d1-dfac-4ec4-8ef1-68fbda8182bb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap89fdd65f-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:02:d3:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 222], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 639554, 'reachable_time': 27326, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 283004, 'error': None, 'target': 'ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:33.888 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[8dca9f90-41cb-4683-afba-0dbd6780ed4b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe02:d33d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 639554, 'tstamp': 639554}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 283005, 'error': None, 'target': 'ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:33 np0005588920 nova_compute[226886]: 2026-01-20 15:02:33.897 226890 DEBUG nova.network.neutron [req-89dd1231-cbaa-496a-8b20-7f123604700f req-65b34c98-6db8-467e-ac7c-0c7d1db75167 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Updated VIF entry in instance network info cache for port 27ba7c79-863a-4084-a5df-ee7a70ec6e0d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:02:33 np0005588920 nova_compute[226886]: 2026-01-20 15:02:33.898 226890 DEBUG nova.network.neutron [req-89dd1231-cbaa-496a-8b20-7f123604700f req-65b34c98-6db8-467e-ac7c-0c7d1db75167 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Updating instance_info_cache with network_info: [{"id": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "address": "fa:16:3e:3d:87:66", "network": {"id": "89fdd65f-3dd2-4375-a946-3c5de73cc24a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1916368729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96f7b14c2a9348f08305fe232df2a603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27ba7c79-86", "ovs_interfaceid": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:02:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:33.904 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[51017b00-bfd9-4adb-8c55-51c462ffc464]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap89fdd65f-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:02:d3:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 222], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 639554, 'reachable_time': 27326, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 283006, 'error': None, 'target': 'ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:33 np0005588920 nova_compute[226886]: 2026-01-20 15:02:33.922 226890 DEBUG oslo_concurrency.lockutils [req-89dd1231-cbaa-496a-8b20-7f123604700f req-65b34c98-6db8-467e-ac7c-0c7d1db75167 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-75368220-ff38-456b-a0e6-ae1c02625514" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:02:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:33.939 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d42c7570-a260-4b74-a608-726fb3a617f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:33.993 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[efa11416-b1f0-40e6-893f-b6d2f179a068]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:33.994 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap89fdd65f-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:02:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:33.994 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:02:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:33.994 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap89fdd65f-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:02:33 np0005588920 nova_compute[226886]: 2026-01-20 15:02:33.996 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:33 np0005588920 NetworkManager[49076]: <info>  [1768921353.9969] manager: (tap89fdd65f-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/339)
Jan 20 10:02:33 np0005588920 kernel: tap89fdd65f-30: entered promiscuous mode
Jan 20 10:02:33 np0005588920 nova_compute[226886]: 2026-01-20 15:02:33.999 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:34 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:34.000 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap89fdd65f-30, col_values=(('external_ids', {'iface-id': '58f1013f-2d8d-46a7-97e6-2062537e7f1a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.001 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:34 np0005588920 ovn_controller[133971]: 2026-01-20T15:02:34Z|00698|binding|INFO|Releasing lport 58f1013f-2d8d-46a7-97e6-2062537e7f1a from this chassis (sb_readonly=0)
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.017 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:34 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:34.019 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/89fdd65f-3dd2-4375-a946-3c5de73cc24a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/89fdd65f-3dd2-4375-a946-3c5de73cc24a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:02:34 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:34.020 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[fb484b7f-b645-4287-93fb-0b2920dcfab8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:34 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:34.020 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:02:34 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 10:02:34 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 10:02:34 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-89fdd65f-3dd2-4375-a946-3c5de73cc24a
Jan 20 10:02:34 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 10:02:34 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 10:02:34 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 10:02:34 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/89fdd65f-3dd2-4375-a946-3c5de73cc24a.pid.haproxy
Jan 20 10:02:34 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 10:02:34 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:02:34 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 10:02:34 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 10:02:34 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 10:02:34 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 10:02:34 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 10:02:34 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 10:02:34 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 10:02:34 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 10:02:34 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 10:02:34 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 10:02:34 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 10:02:34 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 10:02:34 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 10:02:34 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:02:34 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:02:34 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 10:02:34 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 10:02:34 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:02:34 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID 89fdd65f-3dd2-4375-a946-3c5de73cc24a
Jan 20 10:02:34 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:02:34 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:34.021 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a', 'env', 'PROCESS_TAG=haproxy-89fdd65f-3dd2-4375-a946-3c5de73cc24a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/89fdd65f-3dd2-4375-a946-3c5de73cc24a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.051 226890 DEBUG nova.compute.manager [req-b6336549-173c-4855-814c-344acde6eea4 req-8d5e3d56-79ad-46f7-a1db-0a28a47560af 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Received event network-vif-plugged-27ba7c79-863a-4084-a5df-ee7a70ec6e0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.052 226890 DEBUG oslo_concurrency.lockutils [req-b6336549-173c-4855-814c-344acde6eea4 req-8d5e3d56-79ad-46f7-a1db-0a28a47560af 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "75368220-ff38-456b-a0e6-ae1c02625514-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.053 226890 DEBUG oslo_concurrency.lockutils [req-b6336549-173c-4855-814c-344acde6eea4 req-8d5e3d56-79ad-46f7-a1db-0a28a47560af 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75368220-ff38-456b-a0e6-ae1c02625514-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.053 226890 DEBUG oslo_concurrency.lockutils [req-b6336549-173c-4855-814c-344acde6eea4 req-8d5e3d56-79ad-46f7-a1db-0a28a47560af 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75368220-ff38-456b-a0e6-ae1c02625514-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.053 226890 DEBUG nova.compute.manager [req-b6336549-173c-4855-814c-344acde6eea4 req-8d5e3d56-79ad-46f7-a1db-0a28a47560af 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Processing event network-vif-plugged-27ba7c79-863a-4084-a5df-ee7a70ec6e0d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 10:02:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:34.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.229 226890 DEBUG nova.compute.manager [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.230 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921354.228457, 75368220-ff38-456b-a0e6-ae1c02625514 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.230 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] VM Started (Lifecycle Event)#033[00m
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.233 226890 DEBUG nova.virt.libvirt.driver [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.236 226890 INFO nova.virt.libvirt.driver [-] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Instance spawned successfully.#033[00m
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.237 226890 DEBUG nova.virt.libvirt.driver [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.267 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.272 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.276 226890 DEBUG nova.virt.libvirt.driver [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.276 226890 DEBUG nova.virt.libvirt.driver [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.277 226890 DEBUG nova.virt.libvirt.driver [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.277 226890 DEBUG nova.virt.libvirt.driver [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.278 226890 DEBUG nova.virt.libvirt.driver [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.278 226890 DEBUG nova.virt.libvirt.driver [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.302 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.303 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921354.2296946, 75368220-ff38-456b-a0e6-ae1c02625514 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.303 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.340 226890 INFO nova.compute.manager [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Took 6.23 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.341 226890 DEBUG nova.compute.manager [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.356 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.359 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921354.233599, 75368220-ff38-456b-a0e6-ae1c02625514 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.359 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.395 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.398 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:02:34 np0005588920 podman[283078]: 2026-01-20 15:02:34.39974464 +0000 UTC m=+0.061148676 container create dde1dedbf45cb3997ee97375206ce36af75b3cba83b284fd11c92992fa82fe70 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.433 226890 INFO nova.compute.manager [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Took 7.32 seconds to build instance.#033[00m
Jan 20 10:02:34 np0005588920 systemd[1]: Started libpod-conmon-dde1dedbf45cb3997ee97375206ce36af75b3cba83b284fd11c92992fa82fe70.scope.
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.448 226890 DEBUG oslo_concurrency.lockutils [None req-2420454e-fa89-43fc-a4e6-4066325225f6 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "75368220-ff38-456b-a0e6-ae1c02625514" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.416s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:02:34 np0005588920 podman[283078]: 2026-01-20 15:02:34.363774543 +0000 UTC m=+0.025178609 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:02:34 np0005588920 systemd[1]: Started libcrun container.
Jan 20 10:02:34 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6b84031d376d7fae8ada5d377b268129920649b24c89b3cd883f2bb142c08a97/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:02:34 np0005588920 podman[283078]: 2026-01-20 15:02:34.49399258 +0000 UTC m=+0.155396616 container init dde1dedbf45cb3997ee97375206ce36af75b3cba83b284fd11c92992fa82fe70 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 20 10:02:34 np0005588920 podman[283078]: 2026-01-20 15:02:34.503474721 +0000 UTC m=+0.164878757 container start dde1dedbf45cb3997ee97375206ce36af75b3cba83b284fd11c92992fa82fe70 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 20 10:02:34 np0005588920 podman[283091]: 2026-01-20 15:02:34.521058463 +0000 UTC m=+0.091618357 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 20 10:02:34 np0005588920 neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a[283104]: [NOTICE]   (283120) : New worker (283125) forked
Jan 20 10:02:34 np0005588920 neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a[283104]: [NOTICE]   (283120) : Loading success.
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.903 226890 DEBUG nova.network.neutron [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Updating instance_info_cache with network_info: [{"id": "4c88fb15-8276-4e15-8d48-e7ff7412f9be", "address": "fa:16:3e:47:19:22", "network": {"id": "43d3be8f-9be1-4892-bbfe-d0ba2d7157ad", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1740636070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a1d679d5c954662a271e842fe2f2c05", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c88fb15-82", "ovs_interfaceid": "4c88fb15-8276-4e15-8d48-e7ff7412f9be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.940 226890 DEBUG oslo_concurrency.lockutils [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Releasing lock "refresh_cache-c2a7aae5-0ef8-400a-acfe-2fbf83144560" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.941 226890 DEBUG nova.compute.manager [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Instance network_info: |[{"id": "4c88fb15-8276-4e15-8d48-e7ff7412f9be", "address": "fa:16:3e:47:19:22", "network": {"id": "43d3be8f-9be1-4892-bbfe-d0ba2d7157ad", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1740636070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a1d679d5c954662a271e842fe2f2c05", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c88fb15-82", "ovs_interfaceid": "4c88fb15-8276-4e15-8d48-e7ff7412f9be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.942 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.942 226890 DEBUG oslo_concurrency.lockutils [req-bf2d0ad0-db15-45a9-af8b-eb339d0241c9 req-1bd8faa9-4540-48e9-8507-dc5a2a657377 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-c2a7aae5-0ef8-400a-acfe-2fbf83144560" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.943 226890 DEBUG nova.network.neutron [req-bf2d0ad0-db15-45a9-af8b-eb339d0241c9 req-1bd8faa9-4540-48e9-8507-dc5a2a657377 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Refreshing network info cache for port 4c88fb15-8276-4e15-8d48-e7ff7412f9be _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:02:34 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:34.942 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=48, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=47) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:02:34 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:34.943 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.945 226890 DEBUG nova.virt.libvirt.driver [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Start _get_guest_xml network_info=[{"id": "4c88fb15-8276-4e15-8d48-e7ff7412f9be", "address": "fa:16:3e:47:19:22", "network": {"id": "43d3be8f-9be1-4892-bbfe-d0ba2d7157ad", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1740636070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a1d679d5c954662a271e842fe2f2c05", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c88fb15-82", "ovs_interfaceid": "4c88fb15-8276-4e15-8d48-e7ff7412f9be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='',container_format='bare',created_at=2026-01-20T15:02:18Z,direct_url=<?>,disk_format='raw',id=97fb0fa0-6803-480b-96d2-4a219153376d,min_disk=1,min_ram=0,name='tempest-TestSnapshotPatternsnapshot-539580309',owner='3a1d679d5c954662a271e842fe2f2c05',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2026-01-20T15:02:25Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': '97fb0fa0-6803-480b-96d2-4a219153376d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.948 226890 WARNING nova.virt.libvirt.driver [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.952 226890 DEBUG nova.virt.libvirt.host [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.953 226890 DEBUG nova.virt.libvirt.host [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.956 226890 DEBUG nova.virt.libvirt.host [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.957 226890 DEBUG nova.virt.libvirt.host [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.958 226890 DEBUG nova.virt.libvirt.driver [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.958 226890 DEBUG nova.virt.hardware [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='',container_format='bare',created_at=2026-01-20T15:02:18Z,direct_url=<?>,disk_format='raw',id=97fb0fa0-6803-480b-96d2-4a219153376d,min_disk=1,min_ram=0,name='tempest-TestSnapshotPatternsnapshot-539580309',owner='3a1d679d5c954662a271e842fe2f2c05',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2026-01-20T15:02:25Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.959 226890 DEBUG nova.virt.hardware [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.959 226890 DEBUG nova.virt.hardware [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.959 226890 DEBUG nova.virt.hardware [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.959 226890 DEBUG nova.virt.hardware [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.959 226890 DEBUG nova.virt.hardware [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.960 226890 DEBUG nova.virt.hardware [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.960 226890 DEBUG nova.virt.hardware [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.960 226890 DEBUG nova.virt.hardware [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.960 226890 DEBUG nova.virt.hardware [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.960 226890 DEBUG nova.virt.hardware [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:02:34 np0005588920 nova_compute[226886]: 2026-01-20 15:02:34.963 226890 DEBUG oslo_concurrency.processutils [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:02:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:35.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:35 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:02:35 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1771894953' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:02:35 np0005588920 nova_compute[226886]: 2026-01-20 15:02:35.436 226890 DEBUG oslo_concurrency.processutils [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:02:35 np0005588920 nova_compute[226886]: 2026-01-20 15:02:35.466 226890 DEBUG nova.storage.rbd_utils [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] rbd image c2a7aae5-0ef8-400a-acfe-2fbf83144560_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:02:35 np0005588920 nova_compute[226886]: 2026-01-20 15:02:35.471 226890 DEBUG oslo_concurrency.processutils [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:02:35 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:02:35 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3456037100' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:02:35 np0005588920 nova_compute[226886]: 2026-01-20 15:02:35.931 226890 DEBUG oslo_concurrency.processutils [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:02:35 np0005588920 nova_compute[226886]: 2026-01-20 15:02:35.933 226890 DEBUG nova.virt.libvirt.vif [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:02:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-2006426229',display_name='tempest-TestSnapshotPattern-server-2006426229',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-2006426229',id=156,image_ref='97fb0fa0-6803-480b-96d2-4a219153376d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHt2Pjp5fO1h9ikmCXDj2fSFlpzjIfjh7jCgXMa0An0AiWgQhFRQBExuSvqHDwsNMcN7FUPQzPGoYvUkqz0I21jbk9kMja07pP6W664P26WxVinBA8YoIkVl5tlHownM8g==',key_name='tempest-TestSnapshotPattern-503298877',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3a1d679d5c954662a271e842fe2f2c05',ramdisk_id='',reservation_id='r-rv0mb1db',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_boot_roles='member,reader',image_container_format='bare',image_disk_format='raw',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1',image_min_disk='1',image_min_ram='0',image_owner_id='3a1d679d5c954662a271e842fe2f2c05',image_owner_project_name='tempest-TestSnapshotPattern-1341092631',image_owner_user_name='tempest-TestSnapshotPattern-1341092631-project-member',image_user_id='1654794111844ca88666b3529173e9a7',image_version='8.0',network_allocated='True',owner_project_name='tempest-TestSnapshotPattern-1341092631',owner_user_name='tempest-TestSnapshotPattern-1341092631-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:02:29Z,user_data=None,user_id='1654794111844ca88666b3529173e9a7',uuid=c2a7aae5-0ef8-400a-acfe-2fbf83144560,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4c88fb15-8276-4e15-8d48-e7ff7412f9be", "address": "fa:16:3e:47:19:22", "network": {"id": "43d3be8f-9be1-4892-bbfe-d0ba2d7157ad", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1740636070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a1d679d5c954662a271e842fe2f2c05", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c88fb15-82", "ovs_interfaceid": "4c88fb15-8276-4e15-8d48-e7ff7412f9be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:02:35 np0005588920 nova_compute[226886]: 2026-01-20 15:02:35.933 226890 DEBUG nova.network.os_vif_util [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Converting VIF {"id": "4c88fb15-8276-4e15-8d48-e7ff7412f9be", "address": "fa:16:3e:47:19:22", "network": {"id": "43d3be8f-9be1-4892-bbfe-d0ba2d7157ad", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1740636070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a1d679d5c954662a271e842fe2f2c05", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c88fb15-82", "ovs_interfaceid": "4c88fb15-8276-4e15-8d48-e7ff7412f9be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:02:35 np0005588920 nova_compute[226886]: 2026-01-20 15:02:35.934 226890 DEBUG nova.network.os_vif_util [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:47:19:22,bridge_name='br-int',has_traffic_filtering=True,id=4c88fb15-8276-4e15-8d48-e7ff7412f9be,network=Network(43d3be8f-9be1-4892-bbfe-d0ba2d7157ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c88fb15-82') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:02:35 np0005588920 nova_compute[226886]: 2026-01-20 15:02:35.936 226890 DEBUG nova.objects.instance [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Lazy-loading 'pci_devices' on Instance uuid c2a7aae5-0ef8-400a-acfe-2fbf83144560 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:02:35 np0005588920 nova_compute[226886]: 2026-01-20 15:02:35.951 226890 DEBUG nova.virt.libvirt.driver [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:02:35 np0005588920 nova_compute[226886]:  <uuid>c2a7aae5-0ef8-400a-acfe-2fbf83144560</uuid>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:  <name>instance-0000009c</name>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:02:35 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:      <nova:name>tempest-TestSnapshotPattern-server-2006426229</nova:name>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 15:02:34</nova:creationTime>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 10:02:35 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:        <nova:user uuid="1654794111844ca88666b3529173e9a7">tempest-TestSnapshotPattern-1341092631-project-member</nova:user>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:        <nova:project uuid="3a1d679d5c954662a271e842fe2f2c05">tempest-TestSnapshotPattern-1341092631</nova:project>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="97fb0fa0-6803-480b-96d2-4a219153376d"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:        <nova:port uuid="4c88fb15-8276-4e15-8d48-e7ff7412f9be">
Jan 20 10:02:35 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    <system>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:      <entry name="serial">c2a7aae5-0ef8-400a-acfe-2fbf83144560</entry>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:      <entry name="uuid">c2a7aae5-0ef8-400a-acfe-2fbf83144560</entry>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    </system>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:  <os>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:  </os>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:  <features>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:  </features>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:  </clock>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:  <devices>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 10:02:35 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/c2a7aae5-0ef8-400a-acfe-2fbf83144560_disk">
Jan 20 10:02:35 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:02:35 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 10:02:35 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/c2a7aae5-0ef8-400a-acfe-2fbf83144560_disk.config">
Jan 20 10:02:35 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:02:35 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 10:02:35 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:47:19:22"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:      <target dev="tap4c88fb15-82"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    </interface>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 10:02:35 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/c2a7aae5-0ef8-400a-acfe-2fbf83144560/console.log" append="off"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    </serial>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    <video>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    </video>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    <input type="keyboard" bus="usb"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 10:02:35 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    </rng>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 10:02:35 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 10:02:35 np0005588920 nova_compute[226886]:  </devices>
Jan 20 10:02:35 np0005588920 nova_compute[226886]: </domain>
Jan 20 10:02:35 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:02:35 np0005588920 nova_compute[226886]: 2026-01-20 15:02:35.957 226890 DEBUG nova.compute.manager [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Preparing to wait for external event network-vif-plugged-4c88fb15-8276-4e15-8d48-e7ff7412f9be prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:02:35 np0005588920 nova_compute[226886]: 2026-01-20 15:02:35.958 226890 DEBUG oslo_concurrency.lockutils [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Acquiring lock "c2a7aae5-0ef8-400a-acfe-2fbf83144560-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:02:35 np0005588920 nova_compute[226886]: 2026-01-20 15:02:35.958 226890 DEBUG oslo_concurrency.lockutils [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Lock "c2a7aae5-0ef8-400a-acfe-2fbf83144560-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:02:35 np0005588920 nova_compute[226886]: 2026-01-20 15:02:35.958 226890 DEBUG oslo_concurrency.lockutils [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Lock "c2a7aae5-0ef8-400a-acfe-2fbf83144560-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:02:35 np0005588920 nova_compute[226886]: 2026-01-20 15:02:35.959 226890 DEBUG nova.virt.libvirt.vif [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:02:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-2006426229',display_name='tempest-TestSnapshotPattern-server-2006426229',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-2006426229',id=156,image_ref='97fb0fa0-6803-480b-96d2-4a219153376d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHt2Pjp5fO1h9ikmCXDj2fSFlpzjIfjh7jCgXMa0An0AiWgQhFRQBExuSvqHDwsNMcN7FUPQzPGoYvUkqz0I21jbk9kMja07pP6W664P26WxVinBA8YoIkVl5tlHownM8g==',key_name='tempest-TestSnapshotPattern-503298877',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3a1d679d5c954662a271e842fe2f2c05',ramdisk_id='',reservation_id='r-rv0mb1db',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_boot_roles='member,reader',image_container_format='bare',image_disk_format='raw',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1',image_min_disk='1',image_min_ram='0',image_owner_id='3a1d679d5c954662a271e842fe2f2c05',image_owner_project_name='tempest-TestSnapshotPattern-1341092631',image_owner_user_name='tempest-TestSnapshotPattern-1341092631-project-member',image_user_id='1654794111844ca88666b3529173e9a7',image_version='8.0',network_allocated='True',owner_project_name='tempest-TestSnapshotPattern-1341092631',owner_user_name='tempest-TestSnapshotPattern-1341092631-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:02:29Z,user_data=None,user_id='1654794111844ca88666b3529173e9a7',uui
d=c2a7aae5-0ef8-400a-acfe-2fbf83144560,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4c88fb15-8276-4e15-8d48-e7ff7412f9be", "address": "fa:16:3e:47:19:22", "network": {"id": "43d3be8f-9be1-4892-bbfe-d0ba2d7157ad", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1740636070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a1d679d5c954662a271e842fe2f2c05", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c88fb15-82", "ovs_interfaceid": "4c88fb15-8276-4e15-8d48-e7ff7412f9be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:02:35 np0005588920 nova_compute[226886]: 2026-01-20 15:02:35.960 226890 DEBUG nova.network.os_vif_util [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Converting VIF {"id": "4c88fb15-8276-4e15-8d48-e7ff7412f9be", "address": "fa:16:3e:47:19:22", "network": {"id": "43d3be8f-9be1-4892-bbfe-d0ba2d7157ad", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1740636070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a1d679d5c954662a271e842fe2f2c05", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c88fb15-82", "ovs_interfaceid": "4c88fb15-8276-4e15-8d48-e7ff7412f9be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:02:35 np0005588920 nova_compute[226886]: 2026-01-20 15:02:35.960 226890 DEBUG nova.network.os_vif_util [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:47:19:22,bridge_name='br-int',has_traffic_filtering=True,id=4c88fb15-8276-4e15-8d48-e7ff7412f9be,network=Network(43d3be8f-9be1-4892-bbfe-d0ba2d7157ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c88fb15-82') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:02:35 np0005588920 nova_compute[226886]: 2026-01-20 15:02:35.961 226890 DEBUG os_vif [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:47:19:22,bridge_name='br-int',has_traffic_filtering=True,id=4c88fb15-8276-4e15-8d48-e7ff7412f9be,network=Network(43d3be8f-9be1-4892-bbfe-d0ba2d7157ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c88fb15-82') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:02:35 np0005588920 nova_compute[226886]: 2026-01-20 15:02:35.962 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:35 np0005588920 nova_compute[226886]: 2026-01-20 15:02:35.962 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:02:35 np0005588920 nova_compute[226886]: 2026-01-20 15:02:35.963 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:02:35 np0005588920 nova_compute[226886]: 2026-01-20 15:02:35.965 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:35 np0005588920 nova_compute[226886]: 2026-01-20 15:02:35.966 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4c88fb15-82, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:02:35 np0005588920 nova_compute[226886]: 2026-01-20 15:02:35.966 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4c88fb15-82, col_values=(('external_ids', {'iface-id': '4c88fb15-8276-4e15-8d48-e7ff7412f9be', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:47:19:22', 'vm-uuid': 'c2a7aae5-0ef8-400a-acfe-2fbf83144560'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:02:35 np0005588920 nova_compute[226886]: 2026-01-20 15:02:35.968 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:35 np0005588920 NetworkManager[49076]: <info>  [1768921355.9691] manager: (tap4c88fb15-82): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/340)
Jan 20 10:02:35 np0005588920 nova_compute[226886]: 2026-01-20 15:02:35.971 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:02:35 np0005588920 nova_compute[226886]: 2026-01-20 15:02:35.974 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:35 np0005588920 nova_compute[226886]: 2026-01-20 15:02:35.975 226890 INFO os_vif [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:47:19:22,bridge_name='br-int',has_traffic_filtering=True,id=4c88fb15-8276-4e15-8d48-e7ff7412f9be,network=Network(43d3be8f-9be1-4892-bbfe-d0ba2d7157ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c88fb15-82')#033[00m
Jan 20 10:02:36 np0005588920 nova_compute[226886]: 2026-01-20 15:02:36.027 226890 DEBUG nova.virt.libvirt.driver [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:02:36 np0005588920 nova_compute[226886]: 2026-01-20 15:02:36.028 226890 DEBUG nova.virt.libvirt.driver [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:02:36 np0005588920 nova_compute[226886]: 2026-01-20 15:02:36.028 226890 DEBUG nova.virt.libvirt.driver [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] No VIF found with MAC fa:16:3e:47:19:22, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:02:36 np0005588920 nova_compute[226886]: 2026-01-20 15:02:36.028 226890 INFO nova.virt.libvirt.driver [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Using config drive#033[00m
Jan 20 10:02:36 np0005588920 nova_compute[226886]: 2026-01-20 15:02:36.082 226890 DEBUG nova.storage.rbd_utils [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] rbd image c2a7aae5-0ef8-400a-acfe-2fbf83144560_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:02:36 np0005588920 nova_compute[226886]: 2026-01-20 15:02:36.172 226890 DEBUG nova.compute.manager [req-afd84a90-7df0-49ea-9641-e1a7bed1c9d6 req-7485c6c7-ac7a-43e1-a531-131d4be29e25 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Received event network-vif-plugged-27ba7c79-863a-4084-a5df-ee7a70ec6e0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:02:36 np0005588920 nova_compute[226886]: 2026-01-20 15:02:36.173 226890 DEBUG oslo_concurrency.lockutils [req-afd84a90-7df0-49ea-9641-e1a7bed1c9d6 req-7485c6c7-ac7a-43e1-a531-131d4be29e25 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "75368220-ff38-456b-a0e6-ae1c02625514-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:02:36 np0005588920 nova_compute[226886]: 2026-01-20 15:02:36.173 226890 DEBUG oslo_concurrency.lockutils [req-afd84a90-7df0-49ea-9641-e1a7bed1c9d6 req-7485c6c7-ac7a-43e1-a531-131d4be29e25 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75368220-ff38-456b-a0e6-ae1c02625514-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:02:36 np0005588920 nova_compute[226886]: 2026-01-20 15:02:36.173 226890 DEBUG oslo_concurrency.lockutils [req-afd84a90-7df0-49ea-9641-e1a7bed1c9d6 req-7485c6c7-ac7a-43e1-a531-131d4be29e25 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75368220-ff38-456b-a0e6-ae1c02625514-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:02:36 np0005588920 nova_compute[226886]: 2026-01-20 15:02:36.173 226890 DEBUG nova.compute.manager [req-afd84a90-7df0-49ea-9641-e1a7bed1c9d6 req-7485c6c7-ac7a-43e1-a531-131d4be29e25 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] No waiting events found dispatching network-vif-plugged-27ba7c79-863a-4084-a5df-ee7a70ec6e0d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:02:36 np0005588920 nova_compute[226886]: 2026-01-20 15:02:36.173 226890 WARNING nova.compute.manager [req-afd84a90-7df0-49ea-9641-e1a7bed1c9d6 req-7485c6c7-ac7a-43e1-a531-131d4be29e25 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Received unexpected event network-vif-plugged-27ba7c79-863a-4084-a5df-ee7a70ec6e0d for instance with vm_state active and task_state None.#033[00m
Jan 20 10:02:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:36.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:36 np0005588920 nova_compute[226886]: 2026-01-20 15:02:36.460 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:36 np0005588920 nova_compute[226886]: 2026-01-20 15:02:36.946 226890 INFO nova.virt.libvirt.driver [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Creating config drive at /var/lib/nova/instances/c2a7aae5-0ef8-400a-acfe-2fbf83144560/disk.config#033[00m
Jan 20 10:02:36 np0005588920 nova_compute[226886]: 2026-01-20 15:02:36.951 226890 DEBUG oslo_concurrency.processutils [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c2a7aae5-0ef8-400a-acfe-2fbf83144560/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppg37rxqd execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:02:37 np0005588920 nova_compute[226886]: 2026-01-20 15:02:37.083 226890 DEBUG oslo_concurrency.processutils [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c2a7aae5-0ef8-400a-acfe-2fbf83144560/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppg37rxqd" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:02:37 np0005588920 nova_compute[226886]: 2026-01-20 15:02:37.116 226890 DEBUG nova.storage.rbd_utils [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] rbd image c2a7aae5-0ef8-400a-acfe-2fbf83144560_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:02:37 np0005588920 nova_compute[226886]: 2026-01-20 15:02:37.120 226890 DEBUG oslo_concurrency.processutils [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/c2a7aae5-0ef8-400a-acfe-2fbf83144560/disk.config c2a7aae5-0ef8-400a-acfe-2fbf83144560_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:02:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:37.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:37 np0005588920 nova_compute[226886]: 2026-01-20 15:02:37.300 226890 DEBUG oslo_concurrency.processutils [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/c2a7aae5-0ef8-400a-acfe-2fbf83144560/disk.config c2a7aae5-0ef8-400a-acfe-2fbf83144560_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.179s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:02:37 np0005588920 nova_compute[226886]: 2026-01-20 15:02:37.301 226890 INFO nova.virt.libvirt.driver [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Deleting local config drive /var/lib/nova/instances/c2a7aae5-0ef8-400a-acfe-2fbf83144560/disk.config because it was imported into RBD.#033[00m
Jan 20 10:02:37 np0005588920 NetworkManager[49076]: <info>  [1768921357.3479] manager: (tap4c88fb15-82): new Tun device (/org/freedesktop/NetworkManager/Devices/341)
Jan 20 10:02:37 np0005588920 kernel: tap4c88fb15-82: entered promiscuous mode
Jan 20 10:02:37 np0005588920 nova_compute[226886]: 2026-01-20 15:02:37.354 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:37 np0005588920 ovn_controller[133971]: 2026-01-20T15:02:37Z|00699|binding|INFO|Claiming lport 4c88fb15-8276-4e15-8d48-e7ff7412f9be for this chassis.
Jan 20 10:02:37 np0005588920 ovn_controller[133971]: 2026-01-20T15:02:37Z|00700|binding|INFO|4c88fb15-8276-4e15-8d48-e7ff7412f9be: Claiming fa:16:3e:47:19:22 10.100.0.3
Jan 20 10:02:37 np0005588920 nova_compute[226886]: 2026-01-20 15:02:37.366 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:37 np0005588920 NetworkManager[49076]: <info>  [1768921357.3758] manager: (patch-br-int-to-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/342)
Jan 20 10:02:37 np0005588920 nova_compute[226886]: 2026-01-20 15:02:37.375 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:37 np0005588920 NetworkManager[49076]: <info>  [1768921357.3766] manager: (patch-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/343)
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:37.379 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:47:19:22 10.100.0.3'], port_security=['fa:16:3e:47:19:22 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'c2a7aae5-0ef8-400a-acfe-2fbf83144560', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3a1d679d5c954662a271e842fe2f2c05', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f11f0ae2-6b78-4d57-a9ea-5a7c52439262', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=773a665f-440e-445e-8ca6-20a8b67e017a, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=4c88fb15-8276-4e15-8d48-e7ff7412f9be) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:37.380 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 4c88fb15-8276-4e15-8d48-e7ff7412f9be in datapath 43d3be8f-9be1-4892-bbfe-d0ba2d7157ad bound to our chassis#033[00m
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:37.382 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 43d3be8f-9be1-4892-bbfe-d0ba2d7157ad#033[00m
Jan 20 10:02:37 np0005588920 systemd-udevd[283271]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:02:37 np0005588920 systemd-machined[196121]: New machine qemu-71-instance-0000009c.
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:37.393 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[cc1f1fb4-ae0c-47e3-bfb9-de78f842cf47]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:37.394 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap43d3be8f-91 in ovnmeta-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:02:37 np0005588920 NetworkManager[49076]: <info>  [1768921357.3961] device (tap4c88fb15-82): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:37.396 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap43d3be8f-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:37.396 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[9a0968aa-8c2c-4280-92c2-88cc2d77611b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:37 np0005588920 NetworkManager[49076]: <info>  [1768921357.3976] device (tap4c88fb15-82): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:37.397 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[3da566eb-ce79-4d6e-914b-476d49c88344]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:37 np0005588920 systemd[1]: Started Virtual Machine qemu-71-instance-0000009c.
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:37.407 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[fd2ef229-c725-4bc4-85e7-0c0d637e038d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:37.433 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c50ab2e7-7fd6-413a-9bd6-4e23c8b3cf0d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:37.462 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[e0067bc6-e79e-4ec8-b691-b38f52653240]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:37 np0005588920 NetworkManager[49076]: <info>  [1768921357.4712] manager: (tap43d3be8f-90): new Veth device (/org/freedesktop/NetworkManager/Devices/344)
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:37.469 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[3818de61-497f-4206-b06c-0d54b54ae2f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:37.500 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[a4db2fff-6beb-4999-b516-d5bb9679bcdb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:37.504 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[d78e9034-950d-49ae-bacc-3118197843f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:37 np0005588920 NetworkManager[49076]: <info>  [1768921357.5275] device (tap43d3be8f-90): carrier: link connected
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:37.534 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[b8926bf5-aa5f-421b-8946-ddcdee05faac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:37.551 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[6dcdcb07-1326-49c2-84fe-dd51d8a02df9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap43d3be8f-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0d:0f:60'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 224], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 639921, 'reachable_time': 26305, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 283304, 'error': None, 'target': 'ovnmeta-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:37.566 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[6f927aa4-8adb-440a-a614-3057558d3815]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0d:f60'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 639921, 'tstamp': 639921}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 283305, 'error': None, 'target': 'ovnmeta-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:37.583 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5d6011c8-6530-4060-aa93-b24d882db0c7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap43d3be8f-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0d:0f:60'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 224], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 639921, 'reachable_time': 26305, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 283306, 'error': None, 'target': 'ovnmeta-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:37.615 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[6c87313e-bdd9-48f4-9d7c-78f44807a5b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:37 np0005588920 nova_compute[226886]: 2026-01-20 15:02:37.621 226890 DEBUG nova.network.neutron [req-bf2d0ad0-db15-45a9-af8b-eb339d0241c9 req-1bd8faa9-4540-48e9-8507-dc5a2a657377 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Updated VIF entry in instance network info cache for port 4c88fb15-8276-4e15-8d48-e7ff7412f9be. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:02:37 np0005588920 nova_compute[226886]: 2026-01-20 15:02:37.622 226890 DEBUG nova.network.neutron [req-bf2d0ad0-db15-45a9-af8b-eb339d0241c9 req-1bd8faa9-4540-48e9-8507-dc5a2a657377 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Updating instance_info_cache with network_info: [{"id": "4c88fb15-8276-4e15-8d48-e7ff7412f9be", "address": "fa:16:3e:47:19:22", "network": {"id": "43d3be8f-9be1-4892-bbfe-d0ba2d7157ad", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1740636070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a1d679d5c954662a271e842fe2f2c05", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c88fb15-82", "ovs_interfaceid": "4c88fb15-8276-4e15-8d48-e7ff7412f9be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:02:37 np0005588920 nova_compute[226886]: 2026-01-20 15:02:37.624 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:37 np0005588920 ovn_controller[133971]: 2026-01-20T15:02:37Z|00701|binding|INFO|Releasing lport 58f1013f-2d8d-46a7-97e6-2062537e7f1a from this chassis (sb_readonly=0)
Jan 20 10:02:37 np0005588920 nova_compute[226886]: 2026-01-20 15:02:37.653 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:37 np0005588920 ovn_controller[133971]: 2026-01-20T15:02:37Z|00702|binding|INFO|Setting lport 4c88fb15-8276-4e15-8d48-e7ff7412f9be ovn-installed in OVS
Jan 20 10:02:37 np0005588920 ovn_controller[133971]: 2026-01-20T15:02:37Z|00703|binding|INFO|Setting lport 4c88fb15-8276-4e15-8d48-e7ff7412f9be up in Southbound
Jan 20 10:02:37 np0005588920 nova_compute[226886]: 2026-01-20 15:02:37.667 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e337 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:37.673 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[3daaea52-9c1b-4d1d-8854-23a1049a551e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:37.674 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap43d3be8f-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:37.675 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:37.675 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap43d3be8f-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:02:37 np0005588920 nova_compute[226886]: 2026-01-20 15:02:37.677 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:37 np0005588920 NetworkManager[49076]: <info>  [1768921357.6778] manager: (tap43d3be8f-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/345)
Jan 20 10:02:37 np0005588920 kernel: tap43d3be8f-90: entered promiscuous mode
Jan 20 10:02:37 np0005588920 nova_compute[226886]: 2026-01-20 15:02:37.680 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:37.682 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap43d3be8f-90, col_values=(('external_ids', {'iface-id': '32afa112-2ec4-4d59-b6eb-a77db2858bd4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:02:37 np0005588920 nova_compute[226886]: 2026-01-20 15:02:37.683 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:37 np0005588920 ovn_controller[133971]: 2026-01-20T15:02:37Z|00704|binding|INFO|Releasing lport 32afa112-2ec4-4d59-b6eb-a77db2858bd4 from this chassis (sb_readonly=1)
Jan 20 10:02:37 np0005588920 nova_compute[226886]: 2026-01-20 15:02:37.699 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:37.700 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/43d3be8f-9be1-4892-bbfe-d0ba2d7157ad.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/43d3be8f-9be1-4892-bbfe-d0ba2d7157ad.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:37.701 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[8064958b-e355-46c2-b2cb-05a3218d8eba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:37.702 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/43d3be8f-9be1-4892-bbfe-d0ba2d7157ad.pid.haproxy
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID 43d3be8f-9be1-4892-bbfe-d0ba2d7157ad
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:02:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:37.705 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad', 'env', 'PROCESS_TAG=haproxy-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/43d3be8f-9be1-4892-bbfe-d0ba2d7157ad.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:02:37 np0005588920 nova_compute[226886]: 2026-01-20 15:02:37.840 226890 DEBUG oslo_concurrency.lockutils [req-bf2d0ad0-db15-45a9-af8b-eb339d0241c9 req-1bd8faa9-4540-48e9-8507-dc5a2a657377 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-c2a7aae5-0ef8-400a-acfe-2fbf83144560" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:02:37 np0005588920 nova_compute[226886]: 2026-01-20 15:02:37.883 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921357.8828156, c2a7aae5-0ef8-400a-acfe-2fbf83144560 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:02:37 np0005588920 nova_compute[226886]: 2026-01-20 15:02:37.883 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] VM Started (Lifecycle Event)#033[00m
Jan 20 10:02:37 np0005588920 nova_compute[226886]: 2026-01-20 15:02:37.907 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:02:37 np0005588920 nova_compute[226886]: 2026-01-20 15:02:37.909 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921357.8837616, c2a7aae5-0ef8-400a-acfe-2fbf83144560 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:02:37 np0005588920 nova_compute[226886]: 2026-01-20 15:02:37.910 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:02:37 np0005588920 nova_compute[226886]: 2026-01-20 15:02:37.932 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:02:37 np0005588920 nova_compute[226886]: 2026-01-20 15:02:37.935 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:02:38 np0005588920 nova_compute[226886]: 2026-01-20 15:02:38.062 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:02:38 np0005588920 podman[283380]: 2026-01-20 15:02:38.085860099 +0000 UTC m=+0.050569355 container create c01e92c24ad744b352303ff0c774e8663fd2a77c45834af5aff120271e973838 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 20 10:02:38 np0005588920 systemd[1]: Started libpod-conmon-c01e92c24ad744b352303ff0c774e8663fd2a77c45834af5aff120271e973838.scope.
Jan 20 10:02:38 np0005588920 podman[283380]: 2026-01-20 15:02:38.056540622 +0000 UTC m=+0.021249898 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:02:38 np0005588920 systemd[1]: Started libcrun container.
Jan 20 10:02:38 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c974dfea2b9a98e0d33c53a4d1ce5575851301b359ed99b0ecc17fa80578d8e9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:02:38 np0005588920 podman[283380]: 2026-01-20 15:02:38.187338286 +0000 UTC m=+0.152047572 container init c01e92c24ad744b352303ff0c774e8663fd2a77c45834af5aff120271e973838 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 20 10:02:38 np0005588920 podman[283380]: 2026-01-20 15:02:38.19275895 +0000 UTC m=+0.157468206 container start c01e92c24ad744b352303ff0c774e8663fd2a77c45834af5aff120271e973838 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 20 10:02:38 np0005588920 neutron-haproxy-ovnmeta-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad[283396]: [NOTICE]   (283400) : New worker (283402) forked
Jan 20 10:02:38 np0005588920 neutron-haproxy-ovnmeta-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad[283396]: [NOTICE]   (283400) : Loading success.
Jan 20 10:02:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:38.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:38 np0005588920 nova_compute[226886]: 2026-01-20 15:02:38.908 226890 DEBUG nova.compute.manager [req-63bbbcc0-c966-4618-8e1f-85c87ac6e037 req-e300f903-5918-4a15-af1f-72f13381c7b7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Received event network-vif-plugged-4c88fb15-8276-4e15-8d48-e7ff7412f9be external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:02:38 np0005588920 nova_compute[226886]: 2026-01-20 15:02:38.909 226890 DEBUG oslo_concurrency.lockutils [req-63bbbcc0-c966-4618-8e1f-85c87ac6e037 req-e300f903-5918-4a15-af1f-72f13381c7b7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "c2a7aae5-0ef8-400a-acfe-2fbf83144560-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:02:38 np0005588920 nova_compute[226886]: 2026-01-20 15:02:38.909 226890 DEBUG oslo_concurrency.lockutils [req-63bbbcc0-c966-4618-8e1f-85c87ac6e037 req-e300f903-5918-4a15-af1f-72f13381c7b7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c2a7aae5-0ef8-400a-acfe-2fbf83144560-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:02:38 np0005588920 nova_compute[226886]: 2026-01-20 15:02:38.909 226890 DEBUG oslo_concurrency.lockutils [req-63bbbcc0-c966-4618-8e1f-85c87ac6e037 req-e300f903-5918-4a15-af1f-72f13381c7b7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c2a7aae5-0ef8-400a-acfe-2fbf83144560-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:02:38 np0005588920 nova_compute[226886]: 2026-01-20 15:02:38.910 226890 DEBUG nova.compute.manager [req-63bbbcc0-c966-4618-8e1f-85c87ac6e037 req-e300f903-5918-4a15-af1f-72f13381c7b7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Processing event network-vif-plugged-4c88fb15-8276-4e15-8d48-e7ff7412f9be _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 10:02:38 np0005588920 nova_compute[226886]: 2026-01-20 15:02:38.910 226890 DEBUG nova.compute.manager [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:02:38 np0005588920 nova_compute[226886]: 2026-01-20 15:02:38.914 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921358.9143617, c2a7aae5-0ef8-400a-acfe-2fbf83144560 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:02:38 np0005588920 nova_compute[226886]: 2026-01-20 15:02:38.914 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:02:38 np0005588920 nova_compute[226886]: 2026-01-20 15:02:38.916 226890 DEBUG nova.virt.libvirt.driver [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:02:38 np0005588920 nova_compute[226886]: 2026-01-20 15:02:38.920 226890 INFO nova.virt.libvirt.driver [-] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Instance spawned successfully.#033[00m
Jan 20 10:02:38 np0005588920 nova_compute[226886]: 2026-01-20 15:02:38.920 226890 INFO nova.compute.manager [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Took 9.04 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 10:02:38 np0005588920 nova_compute[226886]: 2026-01-20 15:02:38.921 226890 DEBUG nova.compute.manager [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:02:38 np0005588920 nova_compute[226886]: 2026-01-20 15:02:38.933 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:02:38 np0005588920 nova_compute[226886]: 2026-01-20 15:02:38.935 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:02:38 np0005588920 nova_compute[226886]: 2026-01-20 15:02:38.964 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:02:38 np0005588920 nova_compute[226886]: 2026-01-20 15:02:38.989 226890 INFO nova.compute.manager [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Took 10.01 seconds to build instance.#033[00m
Jan 20 10:02:39 np0005588920 nova_compute[226886]: 2026-01-20 15:02:39.004 226890 DEBUG oslo_concurrency.lockutils [None req-7d0e8bdf-ca8c-4056-8059-5e67f5d1e239 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Lock "c2a7aae5-0ef8-400a-acfe-2fbf83144560" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.089s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:02:39 np0005588920 nova_compute[226886]: 2026-01-20 15:02:39.041 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768921344.0399916, 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:02:39 np0005588920 nova_compute[226886]: 2026-01-20 15:02:39.042 226890 INFO nova.compute.manager [-] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:02:39 np0005588920 nova_compute[226886]: 2026-01-20 15:02:39.058 226890 DEBUG nova.compute.manager [None req-b3411587-769d-42a7-81ef-b89ab08dc859 - - - - - -] [instance: 18f2cf64-c2d5-4f0b-a16e-48cf2f558c10] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:02:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:39.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:40.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:40 np0005588920 nova_compute[226886]: 2026-01-20 15:02:40.968 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:40 np0005588920 nova_compute[226886]: 2026-01-20 15:02:40.993 226890 DEBUG nova.compute.manager [req-a38d50c7-a70e-424b-89cb-4750db73f175 req-cf864243-19f5-4795-9f94-1190000d2aea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Received event network-changed-27ba7c79-863a-4084-a5df-ee7a70ec6e0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:02:40 np0005588920 nova_compute[226886]: 2026-01-20 15:02:40.994 226890 DEBUG nova.compute.manager [req-a38d50c7-a70e-424b-89cb-4750db73f175 req-cf864243-19f5-4795-9f94-1190000d2aea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Refreshing instance network info cache due to event network-changed-27ba7c79-863a-4084-a5df-ee7a70ec6e0d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:02:40 np0005588920 nova_compute[226886]: 2026-01-20 15:02:40.994 226890 DEBUG oslo_concurrency.lockutils [req-a38d50c7-a70e-424b-89cb-4750db73f175 req-cf864243-19f5-4795-9f94-1190000d2aea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-75368220-ff38-456b-a0e6-ae1c02625514" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:02:40 np0005588920 nova_compute[226886]: 2026-01-20 15:02:40.994 226890 DEBUG oslo_concurrency.lockutils [req-a38d50c7-a70e-424b-89cb-4750db73f175 req-cf864243-19f5-4795-9f94-1190000d2aea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-75368220-ff38-456b-a0e6-ae1c02625514" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:02:40 np0005588920 nova_compute[226886]: 2026-01-20 15:02:40.995 226890 DEBUG nova.network.neutron [req-a38d50c7-a70e-424b-89cb-4750db73f175 req-cf864243-19f5-4795-9f94-1190000d2aea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Refreshing network info cache for port 27ba7c79-863a-4084-a5df-ee7a70ec6e0d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:02:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:02:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:41.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:02:41 np0005588920 nova_compute[226886]: 2026-01-20 15:02:41.462 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:42.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:42 np0005588920 ovn_controller[133971]: 2026-01-20T15:02:42Z|00705|binding|INFO|Releasing lport 58f1013f-2d8d-46a7-97e6-2062537e7f1a from this chassis (sb_readonly=0)
Jan 20 10:02:42 np0005588920 ovn_controller[133971]: 2026-01-20T15:02:42Z|00706|binding|INFO|Releasing lport 32afa112-2ec4-4d59-b6eb-a77db2858bd4 from this chassis (sb_readonly=0)
Jan 20 10:02:42 np0005588920 nova_compute[226886]: 2026-01-20 15:02:42.659 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e337 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:02:43 np0005588920 nova_compute[226886]: 2026-01-20 15:02:43.082 226890 DEBUG nova.network.neutron [req-a38d50c7-a70e-424b-89cb-4750db73f175 req-cf864243-19f5-4795-9f94-1190000d2aea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Updated VIF entry in instance network info cache for port 27ba7c79-863a-4084-a5df-ee7a70ec6e0d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:02:43 np0005588920 nova_compute[226886]: 2026-01-20 15:02:43.082 226890 DEBUG nova.network.neutron [req-a38d50c7-a70e-424b-89cb-4750db73f175 req-cf864243-19f5-4795-9f94-1190000d2aea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Updating instance_info_cache with network_info: [{"id": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "address": "fa:16:3e:3d:87:66", "network": {"id": "89fdd65f-3dd2-4375-a946-3c5de73cc24a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1916368729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96f7b14c2a9348f08305fe232df2a603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27ba7c79-86", "ovs_interfaceid": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:02:43 np0005588920 nova_compute[226886]: 2026-01-20 15:02:43.106 226890 DEBUG oslo_concurrency.lockutils [req-a38d50c7-a70e-424b-89cb-4750db73f175 req-cf864243-19f5-4795-9f94-1190000d2aea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-75368220-ff38-456b-a0e6-ae1c02625514" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:02:43 np0005588920 nova_compute[226886]: 2026-01-20 15:02:43.107 226890 DEBUG nova.compute.manager [req-a38d50c7-a70e-424b-89cb-4750db73f175 req-cf864243-19f5-4795-9f94-1190000d2aea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Received event network-vif-plugged-4c88fb15-8276-4e15-8d48-e7ff7412f9be external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:02:43 np0005588920 nova_compute[226886]: 2026-01-20 15:02:43.108 226890 DEBUG oslo_concurrency.lockutils [req-a38d50c7-a70e-424b-89cb-4750db73f175 req-cf864243-19f5-4795-9f94-1190000d2aea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "c2a7aae5-0ef8-400a-acfe-2fbf83144560-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:02:43 np0005588920 nova_compute[226886]: 2026-01-20 15:02:43.108 226890 DEBUG oslo_concurrency.lockutils [req-a38d50c7-a70e-424b-89cb-4750db73f175 req-cf864243-19f5-4795-9f94-1190000d2aea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c2a7aae5-0ef8-400a-acfe-2fbf83144560-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:02:43 np0005588920 nova_compute[226886]: 2026-01-20 15:02:43.109 226890 DEBUG oslo_concurrency.lockutils [req-a38d50c7-a70e-424b-89cb-4750db73f175 req-cf864243-19f5-4795-9f94-1190000d2aea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c2a7aae5-0ef8-400a-acfe-2fbf83144560-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:02:43 np0005588920 nova_compute[226886]: 2026-01-20 15:02:43.109 226890 DEBUG nova.compute.manager [req-a38d50c7-a70e-424b-89cb-4750db73f175 req-cf864243-19f5-4795-9f94-1190000d2aea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] No waiting events found dispatching network-vif-plugged-4c88fb15-8276-4e15-8d48-e7ff7412f9be pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:02:43 np0005588920 nova_compute[226886]: 2026-01-20 15:02:43.110 226890 WARNING nova.compute.manager [req-a38d50c7-a70e-424b-89cb-4750db73f175 req-cf864243-19f5-4795-9f94-1190000d2aea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Received unexpected event network-vif-plugged-4c88fb15-8276-4e15-8d48-e7ff7412f9be for instance with vm_state active and task_state None.#033[00m
Jan 20 10:02:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:02:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:43.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:02:43 np0005588920 nova_compute[226886]: 2026-01-20 15:02:43.447 226890 DEBUG nova.compute.manager [req-cda81840-5dc9-4817-9813-ff2f20c98ed4 req-16947659-a910-4135-b703-1590eb0e9f4b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Received event network-changed-4c88fb15-8276-4e15-8d48-e7ff7412f9be external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:02:43 np0005588920 nova_compute[226886]: 2026-01-20 15:02:43.448 226890 DEBUG nova.compute.manager [req-cda81840-5dc9-4817-9813-ff2f20c98ed4 req-16947659-a910-4135-b703-1590eb0e9f4b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Refreshing instance network info cache due to event network-changed-4c88fb15-8276-4e15-8d48-e7ff7412f9be. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:02:43 np0005588920 nova_compute[226886]: 2026-01-20 15:02:43.448 226890 DEBUG oslo_concurrency.lockutils [req-cda81840-5dc9-4817-9813-ff2f20c98ed4 req-16947659-a910-4135-b703-1590eb0e9f4b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-c2a7aae5-0ef8-400a-acfe-2fbf83144560" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:02:43 np0005588920 nova_compute[226886]: 2026-01-20 15:02:43.448 226890 DEBUG oslo_concurrency.lockutils [req-cda81840-5dc9-4817-9813-ff2f20c98ed4 req-16947659-a910-4135-b703-1590eb0e9f4b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-c2a7aae5-0ef8-400a-acfe-2fbf83144560" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:02:43 np0005588920 nova_compute[226886]: 2026-01-20 15:02:43.449 226890 DEBUG nova.network.neutron [req-cda81840-5dc9-4817-9813-ff2f20c98ed4 req-16947659-a910-4135-b703-1590eb0e9f4b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Refreshing network info cache for port 4c88fb15-8276-4e15-8d48-e7ff7412f9be _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:02:44 np0005588920 podman[283411]: 2026-01-20 15:02:43.999799157 +0000 UTC m=+0.084431121 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:02:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:44.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:02:44.944 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '48'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:02:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:45.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:45 np0005588920 nova_compute[226886]: 2026-01-20 15:02:45.970 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:46.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:46 np0005588920 nova_compute[226886]: 2026-01-20 15:02:46.465 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:46 np0005588920 nova_compute[226886]: 2026-01-20 15:02:46.676 226890 DEBUG nova.network.neutron [req-cda81840-5dc9-4817-9813-ff2f20c98ed4 req-16947659-a910-4135-b703-1590eb0e9f4b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Updated VIF entry in instance network info cache for port 4c88fb15-8276-4e15-8d48-e7ff7412f9be. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:02:46 np0005588920 nova_compute[226886]: 2026-01-20 15:02:46.677 226890 DEBUG nova.network.neutron [req-cda81840-5dc9-4817-9813-ff2f20c98ed4 req-16947659-a910-4135-b703-1590eb0e9f4b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Updating instance_info_cache with network_info: [{"id": "4c88fb15-8276-4e15-8d48-e7ff7412f9be", "address": "fa:16:3e:47:19:22", "network": {"id": "43d3be8f-9be1-4892-bbfe-d0ba2d7157ad", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1740636070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a1d679d5c954662a271e842fe2f2c05", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c88fb15-82", "ovs_interfaceid": "4c88fb15-8276-4e15-8d48-e7ff7412f9be", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:02:46 np0005588920 nova_compute[226886]: 2026-01-20 15:02:46.700 226890 DEBUG oslo_concurrency.lockutils [req-cda81840-5dc9-4817-9813-ff2f20c98ed4 req-16947659-a910-4135-b703-1590eb0e9f4b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-c2a7aae5-0ef8-400a-acfe-2fbf83144560" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:02:47 np0005588920 ovn_controller[133971]: 2026-01-20T15:02:47Z|00089|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:3d:87:66 10.100.0.5
Jan 20 10:02:47 np0005588920 ovn_controller[133971]: 2026-01-20T15:02:47Z|00090|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:3d:87:66 10.100.0.5
Jan 20 10:02:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:47.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:47 np0005588920 ovn_controller[133971]: 2026-01-20T15:02:47Z|00707|binding|INFO|Releasing lport 58f1013f-2d8d-46a7-97e6-2062537e7f1a from this chassis (sb_readonly=0)
Jan 20 10:02:47 np0005588920 ovn_controller[133971]: 2026-01-20T15:02:47Z|00708|binding|INFO|Releasing lport 32afa112-2ec4-4d59-b6eb-a77db2858bd4 from this chassis (sb_readonly=0)
Jan 20 10:02:47 np0005588920 nova_compute[226886]: 2026-01-20 15:02:47.512 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e337 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:02:47 np0005588920 nova_compute[226886]: 2026-01-20 15:02:47.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:02:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:48.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:48 np0005588920 nova_compute[226886]: 2026-01-20 15:02:48.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:02:48 np0005588920 nova_compute[226886]: 2026-01-20 15:02:48.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:02:48 np0005588920 nova_compute[226886]: 2026-01-20 15:02:48.726 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:02:49 np0005588920 nova_compute[226886]: 2026-01-20 15:02:48.997 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "refresh_cache-75368220-ff38-456b-a0e6-ae1c02625514" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:02:49 np0005588920 nova_compute[226886]: 2026-01-20 15:02:48.998 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquired lock "refresh_cache-75368220-ff38-456b-a0e6-ae1c02625514" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:02:49 np0005588920 nova_compute[226886]: 2026-01-20 15:02:48.998 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 10:02:49 np0005588920 nova_compute[226886]: 2026-01-20 15:02:48.998 226890 DEBUG nova.objects.instance [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 75368220-ff38-456b-a0e6-ae1c02625514 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:02:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:02:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:49.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:02:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:50.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:50 np0005588920 nova_compute[226886]: 2026-01-20 15:02:50.973 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:02:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:51.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:02:51 np0005588920 nova_compute[226886]: 2026-01-20 15:02:51.467 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:51 np0005588920 nova_compute[226886]: 2026-01-20 15:02:51.839 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Updating instance_info_cache with network_info: [{"id": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "address": "fa:16:3e:3d:87:66", "network": {"id": "89fdd65f-3dd2-4375-a946-3c5de73cc24a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1916368729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96f7b14c2a9348f08305fe232df2a603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27ba7c79-86", "ovs_interfaceid": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:02:51 np0005588920 nova_compute[226886]: 2026-01-20 15:02:51.861 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Releasing lock "refresh_cache-75368220-ff38-456b-a0e6-ae1c02625514" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:02:51 np0005588920 nova_compute[226886]: 2026-01-20 15:02:51.861 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 10:02:51 np0005588920 nova_compute[226886]: 2026-01-20 15:02:51.862 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:02:51 np0005588920 nova_compute[226886]: 2026-01-20 15:02:51.886 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:02:51 np0005588920 nova_compute[226886]: 2026-01-20 15:02:51.887 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:02:51 np0005588920 nova_compute[226886]: 2026-01-20 15:02:51.887 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:02:51 np0005588920 nova_compute[226886]: 2026-01-20 15:02:51.887 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:02:51 np0005588920 nova_compute[226886]: 2026-01-20 15:02:51.888 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:02:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:52.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:02:52 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/51400553' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:02:52 np0005588920 nova_compute[226886]: 2026-01-20 15:02:52.336 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:02:52 np0005588920 nova_compute[226886]: 2026-01-20 15:02:52.438 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-0000009c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:02:52 np0005588920 nova_compute[226886]: 2026-01-20 15:02:52.438 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-0000009c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:02:52 np0005588920 nova_compute[226886]: 2026-01-20 15:02:52.442 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-0000009b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:02:52 np0005588920 nova_compute[226886]: 2026-01-20 15:02:52.443 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-0000009b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:02:52 np0005588920 ovn_controller[133971]: 2026-01-20T15:02:52Z|00091|pinctrl(ovn_pinctrl0)|WARN|DHCPREQUEST requested IP 10.100.0.12 does not match offer 10.100.0.3
Jan 20 10:02:52 np0005588920 ovn_controller[133971]: 2026-01-20T15:02:52Z|00092|pinctrl(ovn_pinctrl0)|INFO|DHCPNAK fa:16:3e:47:19:22 10.100.0.3
Jan 20 10:02:52 np0005588920 nova_compute[226886]: 2026-01-20 15:02:52.605 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:02:52 np0005588920 nova_compute[226886]: 2026-01-20 15:02:52.606 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3937MB free_disk=20.854991912841797GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:02:52 np0005588920 nova_compute[226886]: 2026-01-20 15:02:52.606 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:02:52 np0005588920 nova_compute[226886]: 2026-01-20 15:02:52.607 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:02:52 np0005588920 nova_compute[226886]: 2026-01-20 15:02:52.667 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance 75368220-ff38-456b-a0e6-ae1c02625514 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:02:52 np0005588920 nova_compute[226886]: 2026-01-20 15:02:52.667 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance c2a7aae5-0ef8-400a-acfe-2fbf83144560 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:02:52 np0005588920 nova_compute[226886]: 2026-01-20 15:02:52.667 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:02:52 np0005588920 nova_compute[226886]: 2026-01-20 15:02:52.667 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:02:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e337 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:02:52 np0005588920 nova_compute[226886]: 2026-01-20 15:02:52.705 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:02:53 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:02:53 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2057302993' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:02:53 np0005588920 nova_compute[226886]: 2026-01-20 15:02:53.160 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:02:53 np0005588920 nova_compute[226886]: 2026-01-20 15:02:53.165 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:02:53 np0005588920 nova_compute[226886]: 2026-01-20 15:02:53.179 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:02:53 np0005588920 nova_compute[226886]: 2026-01-20 15:02:53.198 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:02:53 np0005588920 nova_compute[226886]: 2026-01-20 15:02:53.199 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.592s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:02:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:53.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:54 np0005588920 nova_compute[226886]: 2026-01-20 15:02:54.062 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:02:54 np0005588920 nova_compute[226886]: 2026-01-20 15:02:54.062 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:02:54 np0005588920 nova_compute[226886]: 2026-01-20 15:02:54.063 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:02:54 np0005588920 nova_compute[226886]: 2026-01-20 15:02:54.063 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:02:54 np0005588920 nova_compute[226886]: 2026-01-20 15:02:54.063 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:02:54 np0005588920 nova_compute[226886]: 2026-01-20 15:02:54.063 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:02:54 np0005588920 nova_compute[226886]: 2026-01-20 15:02:54.063 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:02:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:54.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:55.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:55 np0005588920 ovn_controller[133971]: 2026-01-20T15:02:55Z|00093|pinctrl(ovn_pinctrl0)|WARN|DHCPREQUEST requested IP 10.100.0.12 does not match offer 10.100.0.3
Jan 20 10:02:55 np0005588920 ovn_controller[133971]: 2026-01-20T15:02:55Z|00094|pinctrl(ovn_pinctrl0)|INFO|DHCPNAK fa:16:3e:47:19:22 10.100.0.3
Jan 20 10:02:55 np0005588920 nova_compute[226886]: 2026-01-20 15:02:55.975 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:02:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:56.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:02:56 np0005588920 nova_compute[226886]: 2026-01-20 15:02:56.468 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:56 np0005588920 nova_compute[226886]: 2026-01-20 15:02:56.519 226890 DEBUG oslo_concurrency.lockutils [None req-125a2181-f264-477b-84d9-2d6f1393d2d2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Acquiring lock "75368220-ff38-456b-a0e6-ae1c02625514" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:02:56 np0005588920 nova_compute[226886]: 2026-01-20 15:02:56.520 226890 DEBUG oslo_concurrency.lockutils [None req-125a2181-f264-477b-84d9-2d6f1393d2d2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "75368220-ff38-456b-a0e6-ae1c02625514" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:02:56 np0005588920 nova_compute[226886]: 2026-01-20 15:02:56.533 226890 DEBUG nova.objects.instance [None req-125a2181-f264-477b-84d9-2d6f1393d2d2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lazy-loading 'flavor' on Instance uuid 75368220-ff38-456b-a0e6-ae1c02625514 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:02:56 np0005588920 nova_compute[226886]: 2026-01-20 15:02:56.567 226890 DEBUG oslo_concurrency.lockutils [None req-125a2181-f264-477b-84d9-2d6f1393d2d2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "75368220-ff38-456b-a0e6-ae1c02625514" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.048s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:02:56 np0005588920 nova_compute[226886]: 2026-01-20 15:02:56.800 226890 DEBUG oslo_concurrency.lockutils [None req-125a2181-f264-477b-84d9-2d6f1393d2d2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Acquiring lock "75368220-ff38-456b-a0e6-ae1c02625514" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:02:56 np0005588920 nova_compute[226886]: 2026-01-20 15:02:56.800 226890 DEBUG oslo_concurrency.lockutils [None req-125a2181-f264-477b-84d9-2d6f1393d2d2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "75368220-ff38-456b-a0e6-ae1c02625514" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:02:56 np0005588920 nova_compute[226886]: 2026-01-20 15:02:56.801 226890 INFO nova.compute.manager [None req-125a2181-f264-477b-84d9-2d6f1393d2d2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Attaching volume d50f7be5-09c1-4898-894c-704176a797ac to /dev/vdb#033[00m
Jan 20 10:02:56 np0005588920 nova_compute[226886]: 2026-01-20 15:02:56.965 226890 DEBUG os_brick.utils [None req-125a2181-f264-477b-84d9-2d6f1393d2d2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 20 10:02:56 np0005588920 nova_compute[226886]: 2026-01-20 15:02:56.966 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:02:56 np0005588920 nova_compute[226886]: 2026-01-20 15:02:56.978 231437 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.012s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:02:56 np0005588920 nova_compute[226886]: 2026-01-20 15:02:56.979 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[31ea1315-dc11-491f-a69c-3fac3b06b7a2]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:56 np0005588920 nova_compute[226886]: 2026-01-20 15:02:56.980 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:02:56 np0005588920 nova_compute[226886]: 2026-01-20 15:02:56.987 231437 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.007s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:02:56 np0005588920 nova_compute[226886]: 2026-01-20 15:02:56.987 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[cd7231eb-0a2d-419e-b02d-a8b39fdc754e]: (4, ('InitiatorName=iqn.1994-05.com.redhat:f7e7b2e5d28e', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:56 np0005588920 nova_compute[226886]: 2026-01-20 15:02:56.988 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:02:56 np0005588920 nova_compute[226886]: 2026-01-20 15:02:56.996 231437 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:02:56 np0005588920 nova_compute[226886]: 2026-01-20 15:02:56.996 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[0cf2151f-40f8-450d-aa29-2288458f6cf7]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:56 np0005588920 nova_compute[226886]: 2026-01-20 15:02:56.998 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[85590d1e-d6d8-462a-93f1-940f0233b08c]: (4, '1190ec40-2b89-4358-8dec-733c5829fbed') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:56 np0005588920 nova_compute[226886]: 2026-01-20 15:02:56.998 226890 DEBUG oslo_concurrency.processutils [None req-125a2181-f264-477b-84d9-2d6f1393d2d2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:02:57 np0005588920 nova_compute[226886]: 2026-01-20 15:02:57.025 226890 DEBUG oslo_concurrency.processutils [None req-125a2181-f264-477b-84d9-2d6f1393d2d2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] CMD "nvme version" returned: 0 in 0.027s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:02:57 np0005588920 nova_compute[226886]: 2026-01-20 15:02:57.029 226890 DEBUG os_brick.initiator.connectors.lightos [None req-125a2181-f264-477b-84d9-2d6f1393d2d2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 20 10:02:57 np0005588920 nova_compute[226886]: 2026-01-20 15:02:57.029 226890 DEBUG os_brick.initiator.connectors.lightos [None req-125a2181-f264-477b-84d9-2d6f1393d2d2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 20 10:02:57 np0005588920 nova_compute[226886]: 2026-01-20 15:02:57.029 226890 DEBUG os_brick.initiator.connectors.lightos [None req-125a2181-f264-477b-84d9-2d6f1393d2d2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 20 10:02:57 np0005588920 nova_compute[226886]: 2026-01-20 15:02:57.030 226890 DEBUG os_brick.utils [None req-125a2181-f264-477b-84d9-2d6f1393d2d2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] <== get_connector_properties: return (64ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:f7e7b2e5d28e', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '1190ec40-2b89-4358-8dec-733c5829fbed', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 20 10:02:57 np0005588920 nova_compute[226886]: 2026-01-20 15:02:57.030 226890 DEBUG nova.virt.block_device [None req-125a2181-f264-477b-84d9-2d6f1393d2d2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Updating existing volume attachment record: 7fdba019-6b66-4b1c-9d5b-15c02e510fba _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 20 10:02:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:57.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:57 np0005588920 ovn_controller[133971]: 2026-01-20T15:02:57Z|00095|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:47:19:22 10.100.0.3
Jan 20 10:02:57 np0005588920 ovn_controller[133971]: 2026-01-20T15:02:57Z|00096|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:47:19:22 10.100.0.3
Jan 20 10:02:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e337 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:02:57 np0005588920 nova_compute[226886]: 2026-01-20 15:02:57.721 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:02:57 np0005588920 nova_compute[226886]: 2026-01-20 15:02:57.861 226890 DEBUG nova.objects.instance [None req-125a2181-f264-477b-84d9-2d6f1393d2d2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lazy-loading 'flavor' on Instance uuid 75368220-ff38-456b-a0e6-ae1c02625514 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:02:57 np0005588920 nova_compute[226886]: 2026-01-20 15:02:57.889 226890 DEBUG nova.virt.libvirt.driver [None req-125a2181-f264-477b-84d9-2d6f1393d2d2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Attempting to attach volume d50f7be5-09c1-4898-894c-704176a797ac with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 20 10:02:57 np0005588920 nova_compute[226886]: 2026-01-20 15:02:57.892 226890 DEBUG nova.virt.libvirt.guest [None req-125a2181-f264-477b-84d9-2d6f1393d2d2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] attach device xml: <disk type="network" device="disk">
Jan 20 10:02:57 np0005588920 nova_compute[226886]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 10:02:57 np0005588920 nova_compute[226886]:  <source protocol="rbd" name="volumes/volume-d50f7be5-09c1-4898-894c-704176a797ac">
Jan 20 10:02:57 np0005588920 nova_compute[226886]:    <host name="192.168.122.100" port="6789"/>
Jan 20 10:02:57 np0005588920 nova_compute[226886]:    <host name="192.168.122.102" port="6789"/>
Jan 20 10:02:57 np0005588920 nova_compute[226886]:    <host name="192.168.122.101" port="6789"/>
Jan 20 10:02:57 np0005588920 nova_compute[226886]:  </source>
Jan 20 10:02:57 np0005588920 nova_compute[226886]:  <auth username="openstack">
Jan 20 10:02:57 np0005588920 nova_compute[226886]:    <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:02:57 np0005588920 nova_compute[226886]:  </auth>
Jan 20 10:02:57 np0005588920 nova_compute[226886]:  <target dev="vdb" bus="virtio"/>
Jan 20 10:02:57 np0005588920 nova_compute[226886]:  <serial>d50f7be5-09c1-4898-894c-704176a797ac</serial>
Jan 20 10:02:57 np0005588920 nova_compute[226886]: </disk>
Jan 20 10:02:57 np0005588920 nova_compute[226886]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 20 10:02:57 np0005588920 nova_compute[226886]: 2026-01-20 15:02:57.957 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:58 np0005588920 nova_compute[226886]: 2026-01-20 15:02:58.000 226890 DEBUG nova.virt.libvirt.driver [None req-125a2181-f264-477b-84d9-2d6f1393d2d2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:02:58 np0005588920 nova_compute[226886]: 2026-01-20 15:02:58.001 226890 DEBUG nova.virt.libvirt.driver [None req-125a2181-f264-477b-84d9-2d6f1393d2d2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:02:58 np0005588920 nova_compute[226886]: 2026-01-20 15:02:58.001 226890 DEBUG nova.virt.libvirt.driver [None req-125a2181-f264-477b-84d9-2d6f1393d2d2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:02:58 np0005588920 nova_compute[226886]: 2026-01-20 15:02:58.001 226890 DEBUG nova.virt.libvirt.driver [None req-125a2181-f264-477b-84d9-2d6f1393d2d2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] No VIF found with MAC fa:16:3e:3d:87:66, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:02:58 np0005588920 nova_compute[226886]: 2026-01-20 15:02:58.158 226890 DEBUG oslo_concurrency.lockutils [None req-125a2181-f264-477b-84d9-2d6f1393d2d2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "75368220-ff38-456b-a0e6-ae1c02625514" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 1.358s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:02:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:58.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:02:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:02:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:59.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:02:59 np0005588920 nova_compute[226886]: 2026-01-20 15:02:59.547 226890 DEBUG oslo_concurrency.lockutils [None req-636724e1-1c93-429f-baae-f4ad802e48d1 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Acquiring lock "75368220-ff38-456b-a0e6-ae1c02625514" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:02:59 np0005588920 nova_compute[226886]: 2026-01-20 15:02:59.548 226890 DEBUG oslo_concurrency.lockutils [None req-636724e1-1c93-429f-baae-f4ad802e48d1 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "75368220-ff38-456b-a0e6-ae1c02625514" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:02:59 np0005588920 nova_compute[226886]: 2026-01-20 15:02:59.548 226890 DEBUG nova.compute.manager [None req-636724e1-1c93-429f-baae-f4ad802e48d1 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:02:59 np0005588920 nova_compute[226886]: 2026-01-20 15:02:59.551 226890 DEBUG nova.compute.manager [None req-636724e1-1c93-429f-baae-f4ad802e48d1 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m
Jan 20 10:02:59 np0005588920 nova_compute[226886]: 2026-01-20 15:02:59.551 226890 DEBUG nova.objects.instance [None req-636724e1-1c93-429f-baae-f4ad802e48d1 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lazy-loading 'flavor' on Instance uuid 75368220-ff38-456b-a0e6-ae1c02625514 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:02:59 np0005588920 nova_compute[226886]: 2026-01-20 15:02:59.573 226890 DEBUG nova.virt.libvirt.driver [None req-636724e1-1c93-429f-baae-f4ad802e48d1 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 20 10:03:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:00.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:00 np0005588920 nova_compute[226886]: 2026-01-20 15:03:00.978 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:01.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:01 np0005588920 nova_compute[226886]: 2026-01-20 15:03:01.470 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:01 np0005588920 kernel: tap27ba7c79-86 (unregistering): left promiscuous mode
Jan 20 10:03:01 np0005588920 NetworkManager[49076]: <info>  [1768921381.8508] device (tap27ba7c79-86): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:03:01 np0005588920 ovn_controller[133971]: 2026-01-20T15:03:01Z|00709|binding|INFO|Releasing lport 27ba7c79-863a-4084-a5df-ee7a70ec6e0d from this chassis (sb_readonly=0)
Jan 20 10:03:01 np0005588920 nova_compute[226886]: 2026-01-20 15:03:01.905 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:01 np0005588920 ovn_controller[133971]: 2026-01-20T15:03:01Z|00710|binding|INFO|Setting lport 27ba7c79-863a-4084-a5df-ee7a70ec6e0d down in Southbound
Jan 20 10:03:01 np0005588920 ovn_controller[133971]: 2026-01-20T15:03:01Z|00711|binding|INFO|Removing iface tap27ba7c79-86 ovn-installed in OVS
Jan 20 10:03:01 np0005588920 nova_compute[226886]: 2026-01-20 15:03:01.906 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:01 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:01.911 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3d:87:66 10.100.0.5'], port_security=['fa:16:3e:3d:87:66 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '75368220-ff38-456b-a0e6-ae1c02625514', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-89fdd65f-3dd2-4375-a946-3c5de73cc24a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '96f7b14c2a9348f08305fe232df2a603', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9aa52617-8217-40d2-b2b6-31674dd65078', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.229'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=07b54ff9-b8ec-4b9d-ab83-0d9fa6361dd1, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=27ba7c79-863a-4084-a5df-ee7a70ec6e0d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:03:01 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:01.912 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 27ba7c79-863a-4084-a5df-ee7a70ec6e0d in datapath 89fdd65f-3dd2-4375-a946-3c5de73cc24a unbound from our chassis#033[00m
Jan 20 10:03:01 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:01.913 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 89fdd65f-3dd2-4375-a946-3c5de73cc24a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:03:01 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:01.914 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ba433970-6129-4dba-9674-3d3f6fb40452]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:01 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:01.915 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a namespace which is not needed anymore#033[00m
Jan 20 10:03:01 np0005588920 nova_compute[226886]: 2026-01-20 15:03:01.922 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:01 np0005588920 systemd[1]: machine-qemu\x2d70\x2dinstance\x2d0000009b.scope: Deactivated successfully.
Jan 20 10:03:01 np0005588920 systemd[1]: machine-qemu\x2d70\x2dinstance\x2d0000009b.scope: Consumed 13.617s CPU time.
Jan 20 10:03:01 np0005588920 systemd-machined[196121]: Machine qemu-70-instance-0000009b terminated.
Jan 20 10:03:02 np0005588920 neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a[283104]: [NOTICE]   (283120) : haproxy version is 2.8.14-c23fe91
Jan 20 10:03:02 np0005588920 neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a[283104]: [NOTICE]   (283120) : path to executable is /usr/sbin/haproxy
Jan 20 10:03:02 np0005588920 neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a[283104]: [WARNING]  (283120) : Exiting Master process...
Jan 20 10:03:02 np0005588920 neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a[283104]: [ALERT]    (283120) : Current worker (283125) exited with code 143 (Terminated)
Jan 20 10:03:02 np0005588920 neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a[283104]: [WARNING]  (283120) : All workers exited. Exiting... (0)
Jan 20 10:03:02 np0005588920 systemd[1]: libpod-dde1dedbf45cb3997ee97375206ce36af75b3cba83b284fd11c92992fa82fe70.scope: Deactivated successfully.
Jan 20 10:03:02 np0005588920 podman[283527]: 2026-01-20 15:03:02.071108018 +0000 UTC m=+0.048387712 container died dde1dedbf45cb3997ee97375206ce36af75b3cba83b284fd11c92992fa82fe70 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:03:02 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-dde1dedbf45cb3997ee97375206ce36af75b3cba83b284fd11c92992fa82fe70-userdata-shm.mount: Deactivated successfully.
Jan 20 10:03:02 np0005588920 systemd[1]: var-lib-containers-storage-overlay-6b84031d376d7fae8ada5d377b268129920649b24c89b3cd883f2bb142c08a97-merged.mount: Deactivated successfully.
Jan 20 10:03:02 np0005588920 podman[283527]: 2026-01-20 15:03:02.111496941 +0000 UTC m=+0.088776635 container cleanup dde1dedbf45cb3997ee97375206ce36af75b3cba83b284fd11c92992fa82fe70 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:03:02 np0005588920 systemd[1]: libpod-conmon-dde1dedbf45cb3997ee97375206ce36af75b3cba83b284fd11c92992fa82fe70.scope: Deactivated successfully.
Jan 20 10:03:02 np0005588920 nova_compute[226886]: 2026-01-20 15:03:02.125 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:02 np0005588920 nova_compute[226886]: 2026-01-20 15:03:02.131 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:03:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:02.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:03:02 np0005588920 podman[283559]: 2026-01-20 15:03:02.292844508 +0000 UTC m=+0.159796903 container remove dde1dedbf45cb3997ee97375206ce36af75b3cba83b284fd11c92992fa82fe70 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 20 10:03:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:02.300 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[af6ee80b-762f-4048-9aa4-d76cbdd3b1d6]: (4, ('Tue Jan 20 03:03:02 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a (dde1dedbf45cb3997ee97375206ce36af75b3cba83b284fd11c92992fa82fe70)\ndde1dedbf45cb3997ee97375206ce36af75b3cba83b284fd11c92992fa82fe70\nTue Jan 20 03:03:02 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a (dde1dedbf45cb3997ee97375206ce36af75b3cba83b284fd11c92992fa82fe70)\ndde1dedbf45cb3997ee97375206ce36af75b3cba83b284fd11c92992fa82fe70\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:02.302 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[91e21d73-a5df-4380-9063-e8dab22b490c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:02.303 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap89fdd65f-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:03:02 np0005588920 nova_compute[226886]: 2026-01-20 15:03:02.304 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:02 np0005588920 kernel: tap89fdd65f-30: left promiscuous mode
Jan 20 10:03:02 np0005588920 nova_compute[226886]: 2026-01-20 15:03:02.323 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:02.326 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c10c9d5a-7b57-42f9-b091-06413c36127e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:02.343 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[dc538020-bc1f-41fb-92e0-b9ada43648ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:02.345 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ab196658-209d-40ab-8f20-d47fffd7b03f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:02.361 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c3984154-b686-44e9-a469-2d246f81f68a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 639547, 'reachable_time': 40616, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 283583, 'error': None, 'target': 'ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:02.363 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:03:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:02.364 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[1a803e01-d881-47a1-a8d3-0f6bdb6d7345]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:02 np0005588920 systemd[1]: run-netns-ovnmeta\x2d89fdd65f\x2d3dd2\x2d4375\x2da946\x2d3c5de73cc24a.mount: Deactivated successfully.
Jan 20 10:03:02 np0005588920 nova_compute[226886]: 2026-01-20 15:03:02.588 226890 INFO nova.virt.libvirt.driver [None req-636724e1-1c93-429f-baae-f4ad802e48d1 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Instance shutdown successfully after 3 seconds.#033[00m
Jan 20 10:03:02 np0005588920 nova_compute[226886]: 2026-01-20 15:03:02.594 226890 INFO nova.virt.libvirt.driver [-] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Instance destroyed successfully.#033[00m
Jan 20 10:03:02 np0005588920 nova_compute[226886]: 2026-01-20 15:03:02.594 226890 DEBUG nova.objects.instance [None req-636724e1-1c93-429f-baae-f4ad802e48d1 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lazy-loading 'numa_topology' on Instance uuid 75368220-ff38-456b-a0e6-ae1c02625514 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:03:02 np0005588920 nova_compute[226886]: 2026-01-20 15:03:02.614 226890 DEBUG nova.compute.manager [None req-636724e1-1c93-429f-baae-f4ad802e48d1 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:03:02 np0005588920 nova_compute[226886]: 2026-01-20 15:03:02.658 226890 DEBUG oslo_concurrency.lockutils [None req-636724e1-1c93-429f-baae-f4ad802e48d1 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "75368220-ff38-456b-a0e6-ae1c02625514" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 3.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:03:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e337 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:03:02 np0005588920 nova_compute[226886]: 2026-01-20 15:03:02.719 226890 DEBUG nova.compute.manager [req-1e73c217-59a7-4194-be6f-e32e33a71874 req-f825c1aa-f8ad-457c-9428-bce1a425188b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Received event network-vif-unplugged-27ba7c79-863a-4084-a5df-ee7a70ec6e0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:03:02 np0005588920 nova_compute[226886]: 2026-01-20 15:03:02.719 226890 DEBUG oslo_concurrency.lockutils [req-1e73c217-59a7-4194-be6f-e32e33a71874 req-f825c1aa-f8ad-457c-9428-bce1a425188b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "75368220-ff38-456b-a0e6-ae1c02625514-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:03:02 np0005588920 nova_compute[226886]: 2026-01-20 15:03:02.719 226890 DEBUG oslo_concurrency.lockutils [req-1e73c217-59a7-4194-be6f-e32e33a71874 req-f825c1aa-f8ad-457c-9428-bce1a425188b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75368220-ff38-456b-a0e6-ae1c02625514-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:03:02 np0005588920 nova_compute[226886]: 2026-01-20 15:03:02.720 226890 DEBUG oslo_concurrency.lockutils [req-1e73c217-59a7-4194-be6f-e32e33a71874 req-f825c1aa-f8ad-457c-9428-bce1a425188b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75368220-ff38-456b-a0e6-ae1c02625514-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:03:02 np0005588920 nova_compute[226886]: 2026-01-20 15:03:02.720 226890 DEBUG nova.compute.manager [req-1e73c217-59a7-4194-be6f-e32e33a71874 req-f825c1aa-f8ad-457c-9428-bce1a425188b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] No waiting events found dispatching network-vif-unplugged-27ba7c79-863a-4084-a5df-ee7a70ec6e0d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:03:02 np0005588920 nova_compute[226886]: 2026-01-20 15:03:02.720 226890 WARNING nova.compute.manager [req-1e73c217-59a7-4194-be6f-e32e33a71874 req-f825c1aa-f8ad-457c-9428-bce1a425188b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Received unexpected event network-vif-unplugged-27ba7c79-863a-4084-a5df-ee7a70ec6e0d for instance with vm_state stopped and task_state None.#033[00m
Jan 20 10:03:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:03.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:03 np0005588920 nova_compute[226886]: 2026-01-20 15:03:03.633 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:04.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:04 np0005588920 nova_compute[226886]: 2026-01-20 15:03:04.640 226890 DEBUG nova.objects.instance [None req-3289cd4a-0b86-49af-8f77-782b68c64d4d 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lazy-loading 'flavor' on Instance uuid 75368220-ff38-456b-a0e6-ae1c02625514 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:03:04 np0005588920 nova_compute[226886]: 2026-01-20 15:03:04.661 226890 DEBUG oslo_concurrency.lockutils [None req-3289cd4a-0b86-49af-8f77-782b68c64d4d 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Acquiring lock "refresh_cache-75368220-ff38-456b-a0e6-ae1c02625514" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:03:04 np0005588920 nova_compute[226886]: 2026-01-20 15:03:04.662 226890 DEBUG oslo_concurrency.lockutils [None req-3289cd4a-0b86-49af-8f77-782b68c64d4d 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Acquired lock "refresh_cache-75368220-ff38-456b-a0e6-ae1c02625514" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:03:04 np0005588920 nova_compute[226886]: 2026-01-20 15:03:04.662 226890 DEBUG nova.network.neutron [None req-3289cd4a-0b86-49af-8f77-782b68c64d4d 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:03:04 np0005588920 nova_compute[226886]: 2026-01-20 15:03:04.662 226890 DEBUG nova.objects.instance [None req-3289cd4a-0b86-49af-8f77-782b68c64d4d 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lazy-loading 'info_cache' on Instance uuid 75368220-ff38-456b-a0e6-ae1c02625514 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:03:04 np0005588920 nova_compute[226886]: 2026-01-20 15:03:04.905 226890 DEBUG nova.compute.manager [req-82046c7e-f503-4a2e-a6ba-4f4f68de7d49 req-436ba07f-2bd8-4dd2-bdbe-dbd21a9212d6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Received event network-vif-plugged-27ba7c79-863a-4084-a5df-ee7a70ec6e0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:03:04 np0005588920 nova_compute[226886]: 2026-01-20 15:03:04.905 226890 DEBUG oslo_concurrency.lockutils [req-82046c7e-f503-4a2e-a6ba-4f4f68de7d49 req-436ba07f-2bd8-4dd2-bdbe-dbd21a9212d6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "75368220-ff38-456b-a0e6-ae1c02625514-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:03:04 np0005588920 nova_compute[226886]: 2026-01-20 15:03:04.906 226890 DEBUG oslo_concurrency.lockutils [req-82046c7e-f503-4a2e-a6ba-4f4f68de7d49 req-436ba07f-2bd8-4dd2-bdbe-dbd21a9212d6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75368220-ff38-456b-a0e6-ae1c02625514-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:03:04 np0005588920 nova_compute[226886]: 2026-01-20 15:03:04.906 226890 DEBUG oslo_concurrency.lockutils [req-82046c7e-f503-4a2e-a6ba-4f4f68de7d49 req-436ba07f-2bd8-4dd2-bdbe-dbd21a9212d6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75368220-ff38-456b-a0e6-ae1c02625514-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:03:04 np0005588920 nova_compute[226886]: 2026-01-20 15:03:04.906 226890 DEBUG nova.compute.manager [req-82046c7e-f503-4a2e-a6ba-4f4f68de7d49 req-436ba07f-2bd8-4dd2-bdbe-dbd21a9212d6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] No waiting events found dispatching network-vif-plugged-27ba7c79-863a-4084-a5df-ee7a70ec6e0d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:03:04 np0005588920 nova_compute[226886]: 2026-01-20 15:03:04.906 226890 WARNING nova.compute.manager [req-82046c7e-f503-4a2e-a6ba-4f4f68de7d49 req-436ba07f-2bd8-4dd2-bdbe-dbd21a9212d6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Received unexpected event network-vif-plugged-27ba7c79-863a-4084-a5df-ee7a70ec6e0d for instance with vm_state stopped and task_state powering-on.#033[00m
Jan 20 10:03:04 np0005588920 podman[283584]: 2026-01-20 15:03:04.986019422 +0000 UTC m=+0.076527866 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 20 10:03:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:03:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:05.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:03:05 np0005588920 nova_compute[226886]: 2026-01-20 15:03:05.979 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:03:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:06.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:03:06 np0005588920 nova_compute[226886]: 2026-01-20 15:03:06.473 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:06 np0005588920 nova_compute[226886]: 2026-01-20 15:03:06.679 226890 DEBUG nova.network.neutron [None req-3289cd4a-0b86-49af-8f77-782b68c64d4d 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Updating instance_info_cache with network_info: [{"id": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "address": "fa:16:3e:3d:87:66", "network": {"id": "89fdd65f-3dd2-4375-a946-3c5de73cc24a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1916368729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96f7b14c2a9348f08305fe232df2a603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27ba7c79-86", "ovs_interfaceid": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:03:06 np0005588920 nova_compute[226886]: 2026-01-20 15:03:06.720 226890 DEBUG oslo_concurrency.lockutils [None req-3289cd4a-0b86-49af-8f77-782b68c64d4d 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Releasing lock "refresh_cache-75368220-ff38-456b-a0e6-ae1c02625514" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:03:06 np0005588920 nova_compute[226886]: 2026-01-20 15:03:06.752 226890 INFO nova.virt.libvirt.driver [-] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Instance destroyed successfully.#033[00m
Jan 20 10:03:06 np0005588920 nova_compute[226886]: 2026-01-20 15:03:06.753 226890 DEBUG nova.objects.instance [None req-3289cd4a-0b86-49af-8f77-782b68c64d4d 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lazy-loading 'numa_topology' on Instance uuid 75368220-ff38-456b-a0e6-ae1c02625514 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:03:06 np0005588920 nova_compute[226886]: 2026-01-20 15:03:06.778 226890 DEBUG nova.objects.instance [None req-3289cd4a-0b86-49af-8f77-782b68c64d4d 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lazy-loading 'resources' on Instance uuid 75368220-ff38-456b-a0e6-ae1c02625514 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:03:06 np0005588920 nova_compute[226886]: 2026-01-20 15:03:06.791 226890 DEBUG nova.virt.libvirt.vif [None req-3289cd4a-0b86-49af-8f77-782b68c64d4d 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:02:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-284183767',display_name='tempest-AttachVolumeTestJSON-server-284183767',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumetestjson-server-284183767',id=155,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFZMMa9Fn48cT13jyLNVxZBqG2NAPc4g1Znb9IEN8J7OPuEySAWtPNC9EMH4uWUG8OO1N+YGXE5zrWJgSxgzur/4qS1UEfGQON2+xLOpRFvKfmgmBUr46iCGe8EkNjED6w==',key_name='tempest-keypair-988338540',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:02:34Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='96f7b14c2a9348f08305fe232df2a603',ramdisk_id='',reservation_id='r-qucgebjt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeTestJSON-583320363',owner_user_name='tempest-AttachVolumeTestJSON-583320363-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:03:02Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='912329b1a6ad42bdb72e952c03983bdf',uuid=75368220-ff38-456b-a0e6-ae1c02625514,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "address": "fa:16:3e:3d:87:66", "network": {"id": "89fdd65f-3dd2-4375-a946-3c5de73cc24a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1916368729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96f7b14c2a9348f08305fe232df2a603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27ba7c79-86", "ovs_interfaceid": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:03:06 np0005588920 nova_compute[226886]: 2026-01-20 15:03:06.791 226890 DEBUG nova.network.os_vif_util [None req-3289cd4a-0b86-49af-8f77-782b68c64d4d 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Converting VIF {"id": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "address": "fa:16:3e:3d:87:66", "network": {"id": "89fdd65f-3dd2-4375-a946-3c5de73cc24a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1916368729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96f7b14c2a9348f08305fe232df2a603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27ba7c79-86", "ovs_interfaceid": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:03:06 np0005588920 nova_compute[226886]: 2026-01-20 15:03:06.792 226890 DEBUG nova.network.os_vif_util [None req-3289cd4a-0b86-49af-8f77-782b68c64d4d 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3d:87:66,bridge_name='br-int',has_traffic_filtering=True,id=27ba7c79-863a-4084-a5df-ee7a70ec6e0d,network=Network(89fdd65f-3dd2-4375-a946-3c5de73cc24a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27ba7c79-86') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:03:06 np0005588920 nova_compute[226886]: 2026-01-20 15:03:06.792 226890 DEBUG os_vif [None req-3289cd4a-0b86-49af-8f77-782b68c64d4d 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3d:87:66,bridge_name='br-int',has_traffic_filtering=True,id=27ba7c79-863a-4084-a5df-ee7a70ec6e0d,network=Network(89fdd65f-3dd2-4375-a946-3c5de73cc24a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27ba7c79-86') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:03:06 np0005588920 nova_compute[226886]: 2026-01-20 15:03:06.794 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:06 np0005588920 nova_compute[226886]: 2026-01-20 15:03:06.795 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap27ba7c79-86, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:03:06 np0005588920 nova_compute[226886]: 2026-01-20 15:03:06.796 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:06 np0005588920 nova_compute[226886]: 2026-01-20 15:03:06.797 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:06 np0005588920 nova_compute[226886]: 2026-01-20 15:03:06.800 226890 INFO os_vif [None req-3289cd4a-0b86-49af-8f77-782b68c64d4d 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3d:87:66,bridge_name='br-int',has_traffic_filtering=True,id=27ba7c79-863a-4084-a5df-ee7a70ec6e0d,network=Network(89fdd65f-3dd2-4375-a946-3c5de73cc24a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27ba7c79-86')#033[00m
Jan 20 10:03:06 np0005588920 nova_compute[226886]: 2026-01-20 15:03:06.809 226890 DEBUG nova.virt.libvirt.driver [None req-3289cd4a-0b86-49af-8f77-782b68c64d4d 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Start _get_guest_xml network_info=[{"id": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "address": "fa:16:3e:3d:87:66", "network": {"id": "89fdd65f-3dd2-4375-a946-3c5de73cc24a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1916368729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96f7b14c2a9348f08305fe232df2a603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27ba7c79-86", "ovs_interfaceid": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vdb': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [{'device_type': 'disk', 'boot_index': None, 'delete_on_termination': False, 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-d50f7be5-09c1-4898-894c-704176a797ac', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': 'd50f7be5-09c1-4898-894c-704176a797ac', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '75368220-ff38-456b-a0e6-ae1c02625514', 'attached_at': '', 'detached_at': '', 'volume_id': 'd50f7be5-09c1-4898-894c-704176a797ac', 'serial': 'd50f7be5-09c1-4898-894c-704176a797ac'}, 'mount_device': '/dev/vdb', 'guest_format': None, 'disk_bus': 'virtio', 'attachment_id': '7fdba019-6b66-4b1c-9d5b-15c02e510fba', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:03:06 np0005588920 nova_compute[226886]: 2026-01-20 15:03:06.813 226890 WARNING nova.virt.libvirt.driver [None req-3289cd4a-0b86-49af-8f77-782b68c64d4d 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:03:06 np0005588920 nova_compute[226886]: 2026-01-20 15:03:06.821 226890 DEBUG nova.virt.libvirt.host [None req-3289cd4a-0b86-49af-8f77-782b68c64d4d 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:03:06 np0005588920 nova_compute[226886]: 2026-01-20 15:03:06.821 226890 DEBUG nova.virt.libvirt.host [None req-3289cd4a-0b86-49af-8f77-782b68c64d4d 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:03:06 np0005588920 nova_compute[226886]: 2026-01-20 15:03:06.824 226890 DEBUG nova.virt.libvirt.host [None req-3289cd4a-0b86-49af-8f77-782b68c64d4d 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:03:06 np0005588920 nova_compute[226886]: 2026-01-20 15:03:06.825 226890 DEBUG nova.virt.libvirt.host [None req-3289cd4a-0b86-49af-8f77-782b68c64d4d 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:03:06 np0005588920 nova_compute[226886]: 2026-01-20 15:03:06.826 226890 DEBUG nova.virt.libvirt.driver [None req-3289cd4a-0b86-49af-8f77-782b68c64d4d 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:03:06 np0005588920 nova_compute[226886]: 2026-01-20 15:03:06.826 226890 DEBUG nova.virt.hardware [None req-3289cd4a-0b86-49af-8f77-782b68c64d4d 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:03:06 np0005588920 nova_compute[226886]: 2026-01-20 15:03:06.827 226890 DEBUG nova.virt.hardware [None req-3289cd4a-0b86-49af-8f77-782b68c64d4d 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:03:06 np0005588920 nova_compute[226886]: 2026-01-20 15:03:06.827 226890 DEBUG nova.virt.hardware [None req-3289cd4a-0b86-49af-8f77-782b68c64d4d 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:03:06 np0005588920 nova_compute[226886]: 2026-01-20 15:03:06.827 226890 DEBUG nova.virt.hardware [None req-3289cd4a-0b86-49af-8f77-782b68c64d4d 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:03:06 np0005588920 nova_compute[226886]: 2026-01-20 15:03:06.828 226890 DEBUG nova.virt.hardware [None req-3289cd4a-0b86-49af-8f77-782b68c64d4d 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:03:06 np0005588920 nova_compute[226886]: 2026-01-20 15:03:06.828 226890 DEBUG nova.virt.hardware [None req-3289cd4a-0b86-49af-8f77-782b68c64d4d 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:03:06 np0005588920 nova_compute[226886]: 2026-01-20 15:03:06.828 226890 DEBUG nova.virt.hardware [None req-3289cd4a-0b86-49af-8f77-782b68c64d4d 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:03:06 np0005588920 nova_compute[226886]: 2026-01-20 15:03:06.829 226890 DEBUG nova.virt.hardware [None req-3289cd4a-0b86-49af-8f77-782b68c64d4d 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:03:06 np0005588920 nova_compute[226886]: 2026-01-20 15:03:06.829 226890 DEBUG nova.virt.hardware [None req-3289cd4a-0b86-49af-8f77-782b68c64d4d 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:03:06 np0005588920 nova_compute[226886]: 2026-01-20 15:03:06.829 226890 DEBUG nova.virt.hardware [None req-3289cd4a-0b86-49af-8f77-782b68c64d4d 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:03:06 np0005588920 nova_compute[226886]: 2026-01-20 15:03:06.830 226890 DEBUG nova.virt.hardware [None req-3289cd4a-0b86-49af-8f77-782b68c64d4d 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:03:06 np0005588920 nova_compute[226886]: 2026-01-20 15:03:06.830 226890 DEBUG nova.objects.instance [None req-3289cd4a-0b86-49af-8f77-782b68c64d4d 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 75368220-ff38-456b-a0e6-ae1c02625514 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:03:06 np0005588920 nova_compute[226886]: 2026-01-20 15:03:06.846 226890 DEBUG oslo_concurrency.processutils [None req-3289cd4a-0b86-49af-8f77-782b68c64d4d 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:03:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:03:07 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4071570443' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:03:07 np0005588920 nova_compute[226886]: 2026-01-20 15:03:07.329 226890 DEBUG oslo_concurrency.processutils [None req-3289cd4a-0b86-49af-8f77-782b68c64d4d 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:03:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:07.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:07 np0005588920 nova_compute[226886]: 2026-01-20 15:03:07.360 226890 DEBUG oslo_concurrency.processutils [None req-3289cd4a-0b86-49af-8f77-782b68c64d4d 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:03:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e337 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:03:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:03:07 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/424647466' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:03:07 np0005588920 nova_compute[226886]: 2026-01-20 15:03:07.799 226890 DEBUG oslo_concurrency.processutils [None req-3289cd4a-0b86-49af-8f77-782b68c64d4d 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:03:07 np0005588920 nova_compute[226886]: 2026-01-20 15:03:07.822 226890 DEBUG nova.virt.libvirt.vif [None req-3289cd4a-0b86-49af-8f77-782b68c64d4d 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:02:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-284183767',display_name='tempest-AttachVolumeTestJSON-server-284183767',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumetestjson-server-284183767',id=155,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFZMMa9Fn48cT13jyLNVxZBqG2NAPc4g1Znb9IEN8J7OPuEySAWtPNC9EMH4uWUG8OO1N+YGXE5zrWJgSxgzur/4qS1UEfGQON2+xLOpRFvKfmgmBUr46iCGe8EkNjED6w==',key_name='tempest-keypair-988338540',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:02:34Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='96f7b14c2a9348f08305fe232df2a603',ramdisk_id='',reservation_id='r-qucgebjt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeTestJSON-583320363',owner_user_name='tempest-AttachVolumeTestJSON-583320363-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:03:02Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='912329b1a6ad42bdb72e952c03983bdf',uuid=75368220-ff38-456b-a0e6-ae1c02625514,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "address": "fa:16:3e:3d:87:66", "network": {"id": "89fdd65f-3dd2-4375-a946-3c5de73cc24a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1916368729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96f7b14c2a9348f08305fe232df2a603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27ba7c79-86", "ovs_interfaceid": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:03:07 np0005588920 nova_compute[226886]: 2026-01-20 15:03:07.822 226890 DEBUG nova.network.os_vif_util [None req-3289cd4a-0b86-49af-8f77-782b68c64d4d 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Converting VIF {"id": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "address": "fa:16:3e:3d:87:66", "network": {"id": "89fdd65f-3dd2-4375-a946-3c5de73cc24a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1916368729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96f7b14c2a9348f08305fe232df2a603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27ba7c79-86", "ovs_interfaceid": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:03:07 np0005588920 nova_compute[226886]: 2026-01-20 15:03:07.823 226890 DEBUG nova.network.os_vif_util [None req-3289cd4a-0b86-49af-8f77-782b68c64d4d 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3d:87:66,bridge_name='br-int',has_traffic_filtering=True,id=27ba7c79-863a-4084-a5df-ee7a70ec6e0d,network=Network(89fdd65f-3dd2-4375-a946-3c5de73cc24a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27ba7c79-86') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:03:07 np0005588920 nova_compute[226886]: 2026-01-20 15:03:07.824 226890 DEBUG nova.objects.instance [None req-3289cd4a-0b86-49af-8f77-782b68c64d4d 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lazy-loading 'pci_devices' on Instance uuid 75368220-ff38-456b-a0e6-ae1c02625514 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:03:07 np0005588920 nova_compute[226886]: 2026-01-20 15:03:07.853 226890 DEBUG nova.virt.libvirt.driver [None req-3289cd4a-0b86-49af-8f77-782b68c64d4d 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:03:07 np0005588920 nova_compute[226886]:  <uuid>75368220-ff38-456b-a0e6-ae1c02625514</uuid>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:  <name>instance-0000009b</name>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:03:07 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:      <nova:name>tempest-AttachVolumeTestJSON-server-284183767</nova:name>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 15:03:06</nova:creationTime>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 10:03:07 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:        <nova:user uuid="912329b1a6ad42bdb72e952c03983bdf">tempest-AttachVolumeTestJSON-583320363-project-member</nova:user>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:        <nova:project uuid="96f7b14c2a9348f08305fe232df2a603">tempest-AttachVolumeTestJSON-583320363</nova:project>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:        <nova:port uuid="27ba7c79-863a-4084-a5df-ee7a70ec6e0d">
Jan 20 10:03:07 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    <system>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:      <entry name="serial">75368220-ff38-456b-a0e6-ae1c02625514</entry>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:      <entry name="uuid">75368220-ff38-456b-a0e6-ae1c02625514</entry>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    </system>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:  <os>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:  </os>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:  <features>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:  </features>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:  </clock>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:  <devices>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 10:03:07 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/75368220-ff38-456b-a0e6-ae1c02625514_disk">
Jan 20 10:03:07 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:03:07 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 10:03:07 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/75368220-ff38-456b-a0e6-ae1c02625514_disk.config">
Jan 20 10:03:07 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:03:07 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 10:03:07 np0005588920 nova_compute[226886]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="volumes/volume-d50f7be5-09c1-4898-894c-704176a797ac">
Jan 20 10:03:07 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:03:07 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:      <target dev="vdb" bus="virtio"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:      <serial>d50f7be5-09c1-4898-894c-704176a797ac</serial>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 10:03:07 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:3d:87:66"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:      <target dev="tap27ba7c79-86"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    </interface>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 10:03:07 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/75368220-ff38-456b-a0e6-ae1c02625514/console.log" append="off"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    </serial>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    <video>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    </video>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    <input type="keyboard" bus="usb"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 10:03:07 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    </rng>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 10:03:07 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 10:03:07 np0005588920 nova_compute[226886]:  </devices>
Jan 20 10:03:07 np0005588920 nova_compute[226886]: </domain>
Jan 20 10:03:07 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:03:07 np0005588920 nova_compute[226886]: 2026-01-20 15:03:07.855 226890 DEBUG nova.virt.libvirt.driver [None req-3289cd4a-0b86-49af-8f77-782b68c64d4d 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] skipping disk for instance-0000009b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:03:07 np0005588920 nova_compute[226886]: 2026-01-20 15:03:07.855 226890 DEBUG nova.virt.libvirt.driver [None req-3289cd4a-0b86-49af-8f77-782b68c64d4d 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] skipping disk for instance-0000009b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:03:07 np0005588920 nova_compute[226886]: 2026-01-20 15:03:07.856 226890 DEBUG nova.virt.libvirt.driver [None req-3289cd4a-0b86-49af-8f77-782b68c64d4d 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] skipping disk for instance-0000009b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:03:07 np0005588920 nova_compute[226886]: 2026-01-20 15:03:07.856 226890 DEBUG nova.virt.libvirt.vif [None req-3289cd4a-0b86-49af-8f77-782b68c64d4d 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:02:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-284183767',display_name='tempest-AttachVolumeTestJSON-server-284183767',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumetestjson-server-284183767',id=155,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFZMMa9Fn48cT13jyLNVxZBqG2NAPc4g1Znb9IEN8J7OPuEySAWtPNC9EMH4uWUG8OO1N+YGXE5zrWJgSxgzur/4qS1UEfGQON2+xLOpRFvKfmgmBUr46iCGe8EkNjED6w==',key_name='tempest-keypair-988338540',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:02:34Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='96f7b14c2a9348f08305fe232df2a603',ramdisk_id='',reservation_id='r-qucgebjt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeTestJSON-583320363',owner_user_name='tempest-AttachVolumeTestJSON-583320363-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:03:02Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='912329b1a6ad42bdb72e952c03983bdf',uuid=75368220-ff38-456b-a0e6-ae1c02625514,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "address": "fa:16:3e:3d:87:66", "network": {"id": "89fdd65f-3dd2-4375-a946-3c5de73cc24a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1916368729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96f7b14c2a9348f08305fe232df2a603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27ba7c79-86", "ovs_interfaceid": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:03:07 np0005588920 nova_compute[226886]: 2026-01-20 15:03:07.857 226890 DEBUG nova.network.os_vif_util [None req-3289cd4a-0b86-49af-8f77-782b68c64d4d 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Converting VIF {"id": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "address": "fa:16:3e:3d:87:66", "network": {"id": "89fdd65f-3dd2-4375-a946-3c5de73cc24a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1916368729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96f7b14c2a9348f08305fe232df2a603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27ba7c79-86", "ovs_interfaceid": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:03:07 np0005588920 nova_compute[226886]: 2026-01-20 15:03:07.857 226890 DEBUG nova.network.os_vif_util [None req-3289cd4a-0b86-49af-8f77-782b68c64d4d 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3d:87:66,bridge_name='br-int',has_traffic_filtering=True,id=27ba7c79-863a-4084-a5df-ee7a70ec6e0d,network=Network(89fdd65f-3dd2-4375-a946-3c5de73cc24a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27ba7c79-86') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:03:07 np0005588920 nova_compute[226886]: 2026-01-20 15:03:07.858 226890 DEBUG os_vif [None req-3289cd4a-0b86-49af-8f77-782b68c64d4d 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3d:87:66,bridge_name='br-int',has_traffic_filtering=True,id=27ba7c79-863a-4084-a5df-ee7a70ec6e0d,network=Network(89fdd65f-3dd2-4375-a946-3c5de73cc24a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27ba7c79-86') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:03:07 np0005588920 nova_compute[226886]: 2026-01-20 15:03:07.858 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:07 np0005588920 nova_compute[226886]: 2026-01-20 15:03:07.859 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:03:07 np0005588920 nova_compute[226886]: 2026-01-20 15:03:07.859 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:03:07 np0005588920 nova_compute[226886]: 2026-01-20 15:03:07.861 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:07 np0005588920 nova_compute[226886]: 2026-01-20 15:03:07.861 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap27ba7c79-86, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:03:07 np0005588920 nova_compute[226886]: 2026-01-20 15:03:07.862 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap27ba7c79-86, col_values=(('external_ids', {'iface-id': '27ba7c79-863a-4084-a5df-ee7a70ec6e0d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3d:87:66', 'vm-uuid': '75368220-ff38-456b-a0e6-ae1c02625514'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:03:07 np0005588920 nova_compute[226886]: 2026-01-20 15:03:07.864 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:07 np0005588920 NetworkManager[49076]: <info>  [1768921387.8647] manager: (tap27ba7c79-86): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/346)
Jan 20 10:03:07 np0005588920 nova_compute[226886]: 2026-01-20 15:03:07.868 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:03:07 np0005588920 nova_compute[226886]: 2026-01-20 15:03:07.869 226890 INFO os_vif [None req-3289cd4a-0b86-49af-8f77-782b68c64d4d 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3d:87:66,bridge_name='br-int',has_traffic_filtering=True,id=27ba7c79-863a-4084-a5df-ee7a70ec6e0d,network=Network(89fdd65f-3dd2-4375-a946-3c5de73cc24a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27ba7c79-86')#033[00m
Jan 20 10:03:07 np0005588920 kernel: tap27ba7c79-86: entered promiscuous mode
Jan 20 10:03:07 np0005588920 NetworkManager[49076]: <info>  [1768921387.9454] manager: (tap27ba7c79-86): new Tun device (/org/freedesktop/NetworkManager/Devices/347)
Jan 20 10:03:07 np0005588920 nova_compute[226886]: 2026-01-20 15:03:07.946 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:07 np0005588920 ovn_controller[133971]: 2026-01-20T15:03:07Z|00712|binding|INFO|Claiming lport 27ba7c79-863a-4084-a5df-ee7a70ec6e0d for this chassis.
Jan 20 10:03:07 np0005588920 ovn_controller[133971]: 2026-01-20T15:03:07Z|00713|binding|INFO|27ba7c79-863a-4084-a5df-ee7a70ec6e0d: Claiming fa:16:3e:3d:87:66 10.100.0.5
Jan 20 10:03:07 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:07.953 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3d:87:66 10.100.0.5'], port_security=['fa:16:3e:3d:87:66 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '75368220-ff38-456b-a0e6-ae1c02625514', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-89fdd65f-3dd2-4375-a946-3c5de73cc24a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '96f7b14c2a9348f08305fe232df2a603', 'neutron:revision_number': '5', 'neutron:security_group_ids': '9aa52617-8217-40d2-b2b6-31674dd65078', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.229'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=07b54ff9-b8ec-4b9d-ab83-0d9fa6361dd1, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=27ba7c79-863a-4084-a5df-ee7a70ec6e0d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:03:07 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:07.954 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 27ba7c79-863a-4084-a5df-ee7a70ec6e0d in datapath 89fdd65f-3dd2-4375-a946-3c5de73cc24a bound to our chassis#033[00m
Jan 20 10:03:07 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:07.956 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 89fdd65f-3dd2-4375-a946-3c5de73cc24a#033[00m
Jan 20 10:03:07 np0005588920 nova_compute[226886]: 2026-01-20 15:03:07.964 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:07 np0005588920 ovn_controller[133971]: 2026-01-20T15:03:07Z|00714|binding|INFO|Setting lport 27ba7c79-863a-4084-a5df-ee7a70ec6e0d up in Southbound
Jan 20 10:03:07 np0005588920 ovn_controller[133971]: 2026-01-20T15:03:07Z|00715|binding|INFO|Setting lport 27ba7c79-863a-4084-a5df-ee7a70ec6e0d ovn-installed in OVS
Jan 20 10:03:07 np0005588920 nova_compute[226886]: 2026-01-20 15:03:07.965 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:07 np0005588920 nova_compute[226886]: 2026-01-20 15:03:07.967 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:07 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:07.968 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[391c388d-376f-4666-9ab8-87dbef1a3e28]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:07 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:07.970 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap89fdd65f-31 in ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:03:07 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:07.972 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap89fdd65f-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:03:07 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:07.972 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e9a4016c-5e9c-4251-8782-eac4a21eb6b7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:07 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:07.973 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c173fe48-ecb0-42aa-8e8f-87202912eca2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:07 np0005588920 systemd-udevd[283687]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:03:07 np0005588920 systemd-machined[196121]: New machine qemu-72-instance-0000009b.
Jan 20 10:03:07 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:07.986 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[3e58def7-c691-4e40-8978-c55f848e3e42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:07 np0005588920 systemd[1]: Started Virtual Machine qemu-72-instance-0000009b.
Jan 20 10:03:07 np0005588920 NetworkManager[49076]: <info>  [1768921387.9942] device (tap27ba7c79-86): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:03:07 np0005588920 NetworkManager[49076]: <info>  [1768921387.9949] device (tap27ba7c79-86): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:03:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:08.012 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b4876abb-256e-4df1-a6ab-53608f172bea]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:08.044 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[54a3c480-f29c-44a5-a084-5228fff633f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:08.050 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[adf1cd71-46a5-4329-988b-cd4ad26b9c71]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:08 np0005588920 NetworkManager[49076]: <info>  [1768921388.0520] manager: (tap89fdd65f-30): new Veth device (/org/freedesktop/NetworkManager/Devices/348)
Jan 20 10:03:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:08.079 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[245b2020-f14e-4ab2-a597-7a9685889868]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:08.082 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[9061dcab-88f6-48d0-a97f-b929f804d8be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:08 np0005588920 NetworkManager[49076]: <info>  [1768921388.1026] device (tap89fdd65f-30): carrier: link connected
Jan 20 10:03:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:08.107 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[b1178288-6c36-4cc3-9abe-e1f500d4e587]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:08.124 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[46155e06-79c0-4245-a584-3066df85b50c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap89fdd65f-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:02:d3:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 227], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 642978, 'reachable_time': 28804, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 283720, 'error': None, 'target': 'ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:08.143 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ef302a93-cb88-4d6c-98b8-64459b5c7b8f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe02:d33d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 642978, 'tstamp': 642978}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 283721, 'error': None, 'target': 'ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:08.161 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f1bf9a64-f042-4968-a528-c5ff6b952c53]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap89fdd65f-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:02:d3:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 227], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 642978, 'reachable_time': 28804, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 283722, 'error': None, 'target': 'ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:08.191 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[292aaf35-4a57-4054-8675-066a8ebd4aa0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:08.251 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e138c04d-c838-4053-a628-1898e4c424db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:08.256 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap89fdd65f-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:03:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:08.257 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:03:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:08.257 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap89fdd65f-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:03:08 np0005588920 kernel: tap89fdd65f-30: entered promiscuous mode
Jan 20 10:03:08 np0005588920 NetworkManager[49076]: <info>  [1768921388.2599] manager: (tap89fdd65f-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/349)
Jan 20 10:03:08 np0005588920 nova_compute[226886]: 2026-01-20 15:03:08.259 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:08 np0005588920 nova_compute[226886]: 2026-01-20 15:03:08.261 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:08.262 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap89fdd65f-30, col_values=(('external_ids', {'iface-id': '58f1013f-2d8d-46a7-97e6-2062537e7f1a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:03:08 np0005588920 nova_compute[226886]: 2026-01-20 15:03:08.263 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:08 np0005588920 ovn_controller[133971]: 2026-01-20T15:03:08Z|00716|binding|INFO|Releasing lport 58f1013f-2d8d-46a7-97e6-2062537e7f1a from this chassis (sb_readonly=0)
Jan 20 10:03:08 np0005588920 nova_compute[226886]: 2026-01-20 15:03:08.277 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:08.278 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/89fdd65f-3dd2-4375-a946-3c5de73cc24a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/89fdd65f-3dd2-4375-a946-3c5de73cc24a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:03:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:08.279 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[33b27b19-1fa2-4f59-988b-d53eaf2fcec9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:08.280 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:03:08 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 10:03:08 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 10:03:08 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-89fdd65f-3dd2-4375-a946-3c5de73cc24a
Jan 20 10:03:08 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 10:03:08 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 10:03:08 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 10:03:08 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/89fdd65f-3dd2-4375-a946-3c5de73cc24a.pid.haproxy
Jan 20 10:03:08 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 10:03:08 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:03:08 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 10:03:08 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 10:03:08 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 10:03:08 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 10:03:08 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 10:03:08 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 10:03:08 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 10:03:08 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 10:03:08 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 10:03:08 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 10:03:08 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 10:03:08 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 10:03:08 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 10:03:08 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:03:08 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:03:08 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 10:03:08 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 10:03:08 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:03:08 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID 89fdd65f-3dd2-4375-a946-3c5de73cc24a
Jan 20 10:03:08 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:03:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:08.281 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a', 'env', 'PROCESS_TAG=haproxy-89fdd65f-3dd2-4375-a946-3c5de73cc24a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/89fdd65f-3dd2-4375-a946-3c5de73cc24a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:03:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:03:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:08.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:03:08 np0005588920 nova_compute[226886]: 2026-01-20 15:03:08.548 226890 DEBUG nova.compute.manager [req-c30c0504-e595-4784-b7fc-0320df0d8c56 req-313c9bd4-8ae6-45ac-9982-51ce2abbe212 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Received event network-vif-plugged-27ba7c79-863a-4084-a5df-ee7a70ec6e0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:03:08 np0005588920 nova_compute[226886]: 2026-01-20 15:03:08.548 226890 DEBUG oslo_concurrency.lockutils [req-c30c0504-e595-4784-b7fc-0320df0d8c56 req-313c9bd4-8ae6-45ac-9982-51ce2abbe212 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "75368220-ff38-456b-a0e6-ae1c02625514-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:03:08 np0005588920 nova_compute[226886]: 2026-01-20 15:03:08.549 226890 DEBUG oslo_concurrency.lockutils [req-c30c0504-e595-4784-b7fc-0320df0d8c56 req-313c9bd4-8ae6-45ac-9982-51ce2abbe212 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75368220-ff38-456b-a0e6-ae1c02625514-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:03:08 np0005588920 nova_compute[226886]: 2026-01-20 15:03:08.550 226890 DEBUG oslo_concurrency.lockutils [req-c30c0504-e595-4784-b7fc-0320df0d8c56 req-313c9bd4-8ae6-45ac-9982-51ce2abbe212 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75368220-ff38-456b-a0e6-ae1c02625514-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:03:08 np0005588920 nova_compute[226886]: 2026-01-20 15:03:08.550 226890 DEBUG nova.compute.manager [req-c30c0504-e595-4784-b7fc-0320df0d8c56 req-313c9bd4-8ae6-45ac-9982-51ce2abbe212 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] No waiting events found dispatching network-vif-plugged-27ba7c79-863a-4084-a5df-ee7a70ec6e0d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:03:08 np0005588920 nova_compute[226886]: 2026-01-20 15:03:08.551 226890 WARNING nova.compute.manager [req-c30c0504-e595-4784-b7fc-0320df0d8c56 req-313c9bd4-8ae6-45ac-9982-51ce2abbe212 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Received unexpected event network-vif-plugged-27ba7c79-863a-4084-a5df-ee7a70ec6e0d for instance with vm_state stopped and task_state powering-on.#033[00m
Jan 20 10:03:08 np0005588920 podman[283813]: 2026-01-20 15:03:08.669686341 +0000 UTC m=+0.049898036 container create 7aa36b0b9ea3938022efe5bac49ee7a5bb77df16b26d948ad3873a9c3d6dcabc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:03:08 np0005588920 nova_compute[226886]: 2026-01-20 15:03:08.694 226890 DEBUG nova.compute.manager [None req-3289cd4a-0b86-49af-8f77-782b68c64d4d 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:03:08 np0005588920 nova_compute[226886]: 2026-01-20 15:03:08.694 226890 DEBUG nova.virt.libvirt.host [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Removed pending event for 75368220-ff38-456b-a0e6-ae1c02625514 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 20 10:03:08 np0005588920 nova_compute[226886]: 2026-01-20 15:03:08.694 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921388.6923966, 75368220-ff38-456b-a0e6-ae1c02625514 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:03:08 np0005588920 nova_compute[226886]: 2026-01-20 15:03:08.694 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:03:08 np0005588920 nova_compute[226886]: 2026-01-20 15:03:08.703 226890 INFO nova.virt.libvirt.driver [-] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Instance rebooted successfully.#033[00m
Jan 20 10:03:08 np0005588920 nova_compute[226886]: 2026-01-20 15:03:08.703 226890 DEBUG nova.compute.manager [None req-3289cd4a-0b86-49af-8f77-782b68c64d4d 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:03:08 np0005588920 systemd[1]: Started libpod-conmon-7aa36b0b9ea3938022efe5bac49ee7a5bb77df16b26d948ad3873a9c3d6dcabc.scope.
Jan 20 10:03:08 np0005588920 nova_compute[226886]: 2026-01-20 15:03:08.738 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:03:08 np0005588920 podman[283813]: 2026-01-20 15:03:08.646068416 +0000 UTC m=+0.026280121 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:03:08 np0005588920 nova_compute[226886]: 2026-01-20 15:03:08.741 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:03:08 np0005588920 systemd[1]: Started libcrun container.
Jan 20 10:03:08 np0005588920 nova_compute[226886]: 2026-01-20 15:03:08.766 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] During sync_power_state the instance has a pending task (powering-on). Skip.#033[00m
Jan 20 10:03:08 np0005588920 nova_compute[226886]: 2026-01-20 15:03:08.766 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921388.6928895, 75368220-ff38-456b-a0e6-ae1c02625514 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:03:08 np0005588920 nova_compute[226886]: 2026-01-20 15:03:08.766 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] VM Started (Lifecycle Event)#033[00m
Jan 20 10:03:08 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7ddbd2a1692e68c4f9d445f60b5eb8fdf5eb24aa122bc4b76b42eafde183bea0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:03:08 np0005588920 nova_compute[226886]: 2026-01-20 15:03:08.797 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:03:08 np0005588920 nova_compute[226886]: 2026-01-20 15:03:08.800 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:03:08 np0005588920 podman[283813]: 2026-01-20 15:03:08.808919706 +0000 UTC m=+0.189131431 container init 7aa36b0b9ea3938022efe5bac49ee7a5bb77df16b26d948ad3873a9c3d6dcabc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:03:08 np0005588920 podman[283813]: 2026-01-20 15:03:08.814388102 +0000 UTC m=+0.194599797 container start 7aa36b0b9ea3938022efe5bac49ee7a5bb77df16b26d948ad3873a9c3d6dcabc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:03:08 np0005588920 neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a[283829]: [NOTICE]   (283833) : New worker (283835) forked
Jan 20 10:03:08 np0005588920 neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a[283829]: [NOTICE]   (283833) : Loading success.
Jan 20 10:03:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:09.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:03:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:10.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:03:10 np0005588920 nova_compute[226886]: 2026-01-20 15:03:10.706 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:10 np0005588920 nova_compute[226886]: 2026-01-20 15:03:10.726 226890 DEBUG nova.compute.manager [req-eb554bea-2eca-4ed0-afbc-98602630ebf3 req-3e24594d-2d16-4e6b-ad03-956bc65f663c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Received event network-vif-plugged-27ba7c79-863a-4084-a5df-ee7a70ec6e0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:03:10 np0005588920 nova_compute[226886]: 2026-01-20 15:03:10.727 226890 DEBUG oslo_concurrency.lockutils [req-eb554bea-2eca-4ed0-afbc-98602630ebf3 req-3e24594d-2d16-4e6b-ad03-956bc65f663c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "75368220-ff38-456b-a0e6-ae1c02625514-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:03:10 np0005588920 nova_compute[226886]: 2026-01-20 15:03:10.727 226890 DEBUG oslo_concurrency.lockutils [req-eb554bea-2eca-4ed0-afbc-98602630ebf3 req-3e24594d-2d16-4e6b-ad03-956bc65f663c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75368220-ff38-456b-a0e6-ae1c02625514-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:03:10 np0005588920 nova_compute[226886]: 2026-01-20 15:03:10.727 226890 DEBUG oslo_concurrency.lockutils [req-eb554bea-2eca-4ed0-afbc-98602630ebf3 req-3e24594d-2d16-4e6b-ad03-956bc65f663c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75368220-ff38-456b-a0e6-ae1c02625514-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:03:10 np0005588920 nova_compute[226886]: 2026-01-20 15:03:10.727 226890 DEBUG nova.compute.manager [req-eb554bea-2eca-4ed0-afbc-98602630ebf3 req-3e24594d-2d16-4e6b-ad03-956bc65f663c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] No waiting events found dispatching network-vif-plugged-27ba7c79-863a-4084-a5df-ee7a70ec6e0d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:03:10 np0005588920 nova_compute[226886]: 2026-01-20 15:03:10.727 226890 WARNING nova.compute.manager [req-eb554bea-2eca-4ed0-afbc-98602630ebf3 req-3e24594d-2d16-4e6b-ad03-956bc65f663c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Received unexpected event network-vif-plugged-27ba7c79-863a-4084-a5df-ee7a70ec6e0d for instance with vm_state active and task_state None.#033[00m
Jan 20 10:03:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:11.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:11 np0005588920 nova_compute[226886]: 2026-01-20 15:03:11.476 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:12.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e337 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:03:12 np0005588920 nova_compute[226886]: 2026-01-20 15:03:12.864 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:13.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:03:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:14.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:03:14 np0005588920 podman[283844]: 2026-01-20 15:03:14.961124805 +0000 UTC m=+0.046151679 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 20 10:03:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:15.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:15 np0005588920 nova_compute[226886]: 2026-01-20 15:03:15.412 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:16.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:16 np0005588920 nova_compute[226886]: 2026-01-20 15:03:16.420 226890 DEBUG nova.compute.manager [None req-947b6369-a007-40f8-8dfd-b6087065b7f6 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:03:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:16.463 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:03:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:16.464 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:03:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:16.465 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:03:16 np0005588920 nova_compute[226886]: 2026-01-20 15:03:16.479 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:16 np0005588920 nova_compute[226886]: 2026-01-20 15:03:16.493 226890 INFO nova.compute.manager [None req-947b6369-a007-40f8-8dfd-b6087065b7f6 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] instance snapshotting#033[00m
Jan 20 10:03:16 np0005588920 nova_compute[226886]: 2026-01-20 15:03:16.890 226890 INFO nova.virt.libvirt.driver [None req-947b6369-a007-40f8-8dfd-b6087065b7f6 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Beginning live snapshot process#033[00m
Jan 20 10:03:17 np0005588920 nova_compute[226886]: 2026-01-20 15:03:17.290 226890 DEBUG nova.storage.rbd_utils [None req-947b6369-a007-40f8-8dfd-b6087065b7f6 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] creating snapshot(2a0a202939dd46cb8f014530d081d565) on rbd image(c2a7aae5-0ef8-400a-acfe-2fbf83144560_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 20 10:03:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:03:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:17.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:03:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e338 e338: 3 total, 3 up, 3 in
Jan 20 10:03:17 np0005588920 nova_compute[226886]: 2026-01-20 15:03:17.562 226890 DEBUG nova.storage.rbd_utils [None req-947b6369-a007-40f8-8dfd-b6087065b7f6 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] cloning vms/c2a7aae5-0ef8-400a-acfe-2fbf83144560_disk@2a0a202939dd46cb8f014530d081d565 to images/fc0edfd5-d120-40a0-889d-d3576a58edb6 clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 20 10:03:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e338 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:03:17 np0005588920 nova_compute[226886]: 2026-01-20 15:03:17.761 226890 DEBUG nova.storage.rbd_utils [None req-947b6369-a007-40f8-8dfd-b6087065b7f6 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] flattening images/fc0edfd5-d120-40a0-889d-d3576a58edb6 flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 20 10:03:17 np0005588920 nova_compute[226886]: 2026-01-20 15:03:17.892 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:18.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:18 np0005588920 nova_compute[226886]: 2026-01-20 15:03:18.332 226890 DEBUG nova.storage.rbd_utils [None req-947b6369-a007-40f8-8dfd-b6087065b7f6 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] removing snapshot(2a0a202939dd46cb8f014530d081d565) on rbd image(c2a7aae5-0ef8-400a-acfe-2fbf83144560_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 20 10:03:18 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e339 e339: 3 total, 3 up, 3 in
Jan 20 10:03:18 np0005588920 nova_compute[226886]: 2026-01-20 15:03:18.556 226890 DEBUG nova.storage.rbd_utils [None req-947b6369-a007-40f8-8dfd-b6087065b7f6 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] creating snapshot(snap) on rbd image(fc0edfd5-d120-40a0-889d-d3576a58edb6) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 20 10:03:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:03:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:19.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:03:19 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e340 e340: 3 total, 3 up, 3 in
Jan 20 10:03:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:03:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:20.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:03:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:03:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:21.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:03:21 np0005588920 nova_compute[226886]: 2026-01-20 15:03:21.481 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:21 np0005588920 ovn_controller[133971]: 2026-01-20T15:03:21Z|00097|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:3d:87:66 10.100.0.5
Jan 20 10:03:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:22.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:22 np0005588920 nova_compute[226886]: 2026-01-20 15:03:22.384 226890 INFO nova.virt.libvirt.driver [None req-947b6369-a007-40f8-8dfd-b6087065b7f6 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Snapshot image upload complete#033[00m
Jan 20 10:03:22 np0005588920 nova_compute[226886]: 2026-01-20 15:03:22.385 226890 INFO nova.compute.manager [None req-947b6369-a007-40f8-8dfd-b6087065b7f6 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Took 5.89 seconds to snapshot the instance on the hypervisor.#033[00m
Jan 20 10:03:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e340 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:03:22 np0005588920 nova_compute[226886]: 2026-01-20 15:03:22.894 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:03:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:23.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:03:23 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e341 e341: 3 total, 3 up, 3 in
Jan 20 10:03:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:24.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:25 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e342 e342: 3 total, 3 up, 3 in
Jan 20 10:03:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:25.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:25 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:03:25 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:03:25 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:03:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:26.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:26 np0005588920 nova_compute[226886]: 2026-01-20 15:03:26.483 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:26 np0005588920 nova_compute[226886]: 2026-01-20 15:03:26.730 226890 DEBUG oslo_concurrency.lockutils [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Acquiring lock "a0ce16c6-2b75-472f-a785-890fbb0d748e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:03:26 np0005588920 nova_compute[226886]: 2026-01-20 15:03:26.731 226890 DEBUG oslo_concurrency.lockutils [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lock "a0ce16c6-2b75-472f-a785-890fbb0d748e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:03:26 np0005588920 nova_compute[226886]: 2026-01-20 15:03:26.752 226890 DEBUG nova.compute.manager [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 10:03:26 np0005588920 nova_compute[226886]: 2026-01-20 15:03:26.990 226890 DEBUG oslo_concurrency.lockutils [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:03:26 np0005588920 nova_compute[226886]: 2026-01-20 15:03:26.991 226890 DEBUG oslo_concurrency.lockutils [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:03:27 np0005588920 nova_compute[226886]: 2026-01-20 15:03:27.031 226890 DEBUG nova.virt.hardware [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 10:03:27 np0005588920 nova_compute[226886]: 2026-01-20 15:03:27.031 226890 INFO nova.compute.claims [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 10:03:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:27.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:27 np0005588920 nova_compute[226886]: 2026-01-20 15:03:27.418 226890 DEBUG oslo_concurrency.processutils [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:03:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e342 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:03:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:03:27 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/817872846' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:03:27 np0005588920 nova_compute[226886]: 2026-01-20 15:03:27.889 226890 DEBUG oslo_concurrency.processutils [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:03:27 np0005588920 nova_compute[226886]: 2026-01-20 15:03:27.896 226890 DEBUG nova.compute.provider_tree [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:03:27 np0005588920 nova_compute[226886]: 2026-01-20 15:03:27.898 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:27 np0005588920 nova_compute[226886]: 2026-01-20 15:03:27.927 226890 DEBUG nova.scheduler.client.report [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:03:27 np0005588920 nova_compute[226886]: 2026-01-20 15:03:27.992 226890 DEBUG oslo_concurrency.lockutils [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:03:27 np0005588920 nova_compute[226886]: 2026-01-20 15:03:27.992 226890 DEBUG nova.compute.manager [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 10:03:28 np0005588920 nova_compute[226886]: 2026-01-20 15:03:28.025 226890 DEBUG oslo_concurrency.lockutils [None req-547dcaed-ed2c-41f2-84cc-96b6bf59d815 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Acquiring lock "c2a7aae5-0ef8-400a-acfe-2fbf83144560" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:03:28 np0005588920 nova_compute[226886]: 2026-01-20 15:03:28.026 226890 DEBUG oslo_concurrency.lockutils [None req-547dcaed-ed2c-41f2-84cc-96b6bf59d815 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Lock "c2a7aae5-0ef8-400a-acfe-2fbf83144560" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:03:28 np0005588920 nova_compute[226886]: 2026-01-20 15:03:28.026 226890 DEBUG oslo_concurrency.lockutils [None req-547dcaed-ed2c-41f2-84cc-96b6bf59d815 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Acquiring lock "c2a7aae5-0ef8-400a-acfe-2fbf83144560-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:03:28 np0005588920 nova_compute[226886]: 2026-01-20 15:03:28.026 226890 DEBUG oslo_concurrency.lockutils [None req-547dcaed-ed2c-41f2-84cc-96b6bf59d815 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Lock "c2a7aae5-0ef8-400a-acfe-2fbf83144560-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:03:28 np0005588920 nova_compute[226886]: 2026-01-20 15:03:28.026 226890 DEBUG oslo_concurrency.lockutils [None req-547dcaed-ed2c-41f2-84cc-96b6bf59d815 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Lock "c2a7aae5-0ef8-400a-acfe-2fbf83144560-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:03:28 np0005588920 nova_compute[226886]: 2026-01-20 15:03:28.028 226890 INFO nova.compute.manager [None req-547dcaed-ed2c-41f2-84cc-96b6bf59d815 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Terminating instance#033[00m
Jan 20 10:03:28 np0005588920 nova_compute[226886]: 2026-01-20 15:03:28.029 226890 DEBUG nova.compute.manager [None req-547dcaed-ed2c-41f2-84cc-96b6bf59d815 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:03:28 np0005588920 nova_compute[226886]: 2026-01-20 15:03:28.078 226890 DEBUG nova.compute.manager [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 10:03:28 np0005588920 nova_compute[226886]: 2026-01-20 15:03:28.078 226890 DEBUG nova.network.neutron [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 10:03:28 np0005588920 nova_compute[226886]: 2026-01-20 15:03:28.153 226890 INFO nova.virt.libvirt.driver [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 10:03:28 np0005588920 kernel: tap4c88fb15-82 (unregistering): left promiscuous mode
Jan 20 10:03:28 np0005588920 NetworkManager[49076]: <info>  [1768921408.2456] device (tap4c88fb15-82): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:03:28 np0005588920 nova_compute[226886]: 2026-01-20 15:03:28.255 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:28 np0005588920 ovn_controller[133971]: 2026-01-20T15:03:28Z|00717|binding|INFO|Releasing lport 4c88fb15-8276-4e15-8d48-e7ff7412f9be from this chassis (sb_readonly=0)
Jan 20 10:03:28 np0005588920 ovn_controller[133971]: 2026-01-20T15:03:28Z|00718|binding|INFO|Setting lport 4c88fb15-8276-4e15-8d48-e7ff7412f9be down in Southbound
Jan 20 10:03:28 np0005588920 ovn_controller[133971]: 2026-01-20T15:03:28Z|00719|binding|INFO|Removing iface tap4c88fb15-82 ovn-installed in OVS
Jan 20 10:03:28 np0005588920 nova_compute[226886]: 2026-01-20 15:03:28.258 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:28 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:28.268 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:47:19:22 10.100.0.3'], port_security=['fa:16:3e:47:19:22 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'c2a7aae5-0ef8-400a-acfe-2fbf83144560', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3a1d679d5c954662a271e842fe2f2c05', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f11f0ae2-6b78-4d57-a9ea-5a7c52439262', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=773a665f-440e-445e-8ca6-20a8b67e017a, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=4c88fb15-8276-4e15-8d48-e7ff7412f9be) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:03:28 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:28.269 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 4c88fb15-8276-4e15-8d48-e7ff7412f9be in datapath 43d3be8f-9be1-4892-bbfe-d0ba2d7157ad unbound from our chassis#033[00m
Jan 20 10:03:28 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:28.271 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 43d3be8f-9be1-4892-bbfe-d0ba2d7157ad, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:03:28 np0005588920 nova_compute[226886]: 2026-01-20 15:03:28.272 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:28 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:28.272 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f22e01d2-907b-4f8d-b067-65c84269ac24]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:28 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:28.272 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad namespace which is not needed anymore#033[00m
Jan 20 10:03:28 np0005588920 systemd[1]: machine-qemu\x2d71\x2dinstance\x2d0000009c.scope: Deactivated successfully.
Jan 20 10:03:28 np0005588920 systemd[1]: machine-qemu\x2d71\x2dinstance\x2d0000009c.scope: Consumed 15.947s CPU time.
Jan 20 10:03:28 np0005588920 systemd-machined[196121]: Machine qemu-71-instance-0000009c terminated.
Jan 20 10:03:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:28.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:28 np0005588920 nova_compute[226886]: 2026-01-20 15:03:28.326 226890 DEBUG nova.compute.manager [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 10:03:28 np0005588920 nova_compute[226886]: 2026-01-20 15:03:28.365 226890 DEBUG nova.compute.manager [req-b831ff9f-8d3a-4aa8-93a9-8b7c7e227240 req-df7d2cd0-a28a-49e7-9ab9-a9122fd6faef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Received event network-changed-4c88fb15-8276-4e15-8d48-e7ff7412f9be external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:03:28 np0005588920 nova_compute[226886]: 2026-01-20 15:03:28.366 226890 DEBUG nova.compute.manager [req-b831ff9f-8d3a-4aa8-93a9-8b7c7e227240 req-df7d2cd0-a28a-49e7-9ab9-a9122fd6faef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Refreshing instance network info cache due to event network-changed-4c88fb15-8276-4e15-8d48-e7ff7412f9be. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:03:28 np0005588920 nova_compute[226886]: 2026-01-20 15:03:28.366 226890 DEBUG oslo_concurrency.lockutils [req-b831ff9f-8d3a-4aa8-93a9-8b7c7e227240 req-df7d2cd0-a28a-49e7-9ab9-a9122fd6faef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-c2a7aae5-0ef8-400a-acfe-2fbf83144560" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:03:28 np0005588920 nova_compute[226886]: 2026-01-20 15:03:28.366 226890 DEBUG oslo_concurrency.lockutils [req-b831ff9f-8d3a-4aa8-93a9-8b7c7e227240 req-df7d2cd0-a28a-49e7-9ab9-a9122fd6faef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-c2a7aae5-0ef8-400a-acfe-2fbf83144560" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:03:28 np0005588920 nova_compute[226886]: 2026-01-20 15:03:28.366 226890 DEBUG nova.network.neutron [req-b831ff9f-8d3a-4aa8-93a9-8b7c7e227240 req-df7d2cd0-a28a-49e7-9ab9-a9122fd6faef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Refreshing network info cache for port 4c88fb15-8276-4e15-8d48-e7ff7412f9be _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:03:28 np0005588920 neutron-haproxy-ovnmeta-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad[283396]: [NOTICE]   (283400) : haproxy version is 2.8.14-c23fe91
Jan 20 10:03:28 np0005588920 neutron-haproxy-ovnmeta-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad[283396]: [NOTICE]   (283400) : path to executable is /usr/sbin/haproxy
Jan 20 10:03:28 np0005588920 neutron-haproxy-ovnmeta-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad[283396]: [WARNING]  (283400) : Exiting Master process...
Jan 20 10:03:28 np0005588920 neutron-haproxy-ovnmeta-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad[283396]: [WARNING]  (283400) : Exiting Master process...
Jan 20 10:03:28 np0005588920 neutron-haproxy-ovnmeta-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad[283396]: [ALERT]    (283400) : Current worker (283402) exited with code 143 (Terminated)
Jan 20 10:03:28 np0005588920 neutron-haproxy-ovnmeta-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad[283396]: [WARNING]  (283400) : All workers exited. Exiting... (0)
Jan 20 10:03:28 np0005588920 systemd[1]: libpod-c01e92c24ad744b352303ff0c774e8663fd2a77c45834af5aff120271e973838.scope: Deactivated successfully.
Jan 20 10:03:28 np0005588920 podman[284180]: 2026-01-20 15:03:28.403689296 +0000 UTC m=+0.044200062 container died c01e92c24ad744b352303ff0c774e8663fd2a77c45834af5aff120271e973838 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 20 10:03:28 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c01e92c24ad744b352303ff0c774e8663fd2a77c45834af5aff120271e973838-userdata-shm.mount: Deactivated successfully.
Jan 20 10:03:28 np0005588920 systemd[1]: var-lib-containers-storage-overlay-c974dfea2b9a98e0d33c53a4d1ce5575851301b359ed99b0ecc17fa80578d8e9-merged.mount: Deactivated successfully.
Jan 20 10:03:28 np0005588920 podman[284180]: 2026-01-20 15:03:28.453069576 +0000 UTC m=+0.093580352 container cleanup c01e92c24ad744b352303ff0c774e8663fd2a77c45834af5aff120271e973838 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:03:28 np0005588920 systemd[1]: libpod-conmon-c01e92c24ad744b352303ff0c774e8663fd2a77c45834af5aff120271e973838.scope: Deactivated successfully.
Jan 20 10:03:28 np0005588920 nova_compute[226886]: 2026-01-20 15:03:28.472 226890 INFO nova.virt.libvirt.driver [-] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Instance destroyed successfully.#033[00m
Jan 20 10:03:28 np0005588920 nova_compute[226886]: 2026-01-20 15:03:28.473 226890 DEBUG nova.objects.instance [None req-547dcaed-ed2c-41f2-84cc-96b6bf59d815 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Lazy-loading 'resources' on Instance uuid c2a7aae5-0ef8-400a-acfe-2fbf83144560 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:03:28 np0005588920 nova_compute[226886]: 2026-01-20 15:03:28.485 226890 DEBUG nova.virt.libvirt.vif [None req-547dcaed-ed2c-41f2-84cc-96b6bf59d815 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:02:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-2006426229',display_name='tempest-TestSnapshotPattern-server-2006426229',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-2006426229',id=156,image_ref='97fb0fa0-6803-480b-96d2-4a219153376d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHt2Pjp5fO1h9ikmCXDj2fSFlpzjIfjh7jCgXMa0An0AiWgQhFRQBExuSvqHDwsNMcN7FUPQzPGoYvUkqz0I21jbk9kMja07pP6W664P26WxVinBA8YoIkVl5tlHownM8g==',key_name='tempest-TestSnapshotPattern-503298877',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:02:38Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3a1d679d5c954662a271e842fe2f2c05',ramdisk_id='',reservation_id='r-rv0mb1db',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_boot_roles='member,reader',image_container_format='bare',image_disk_format='raw',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1',image_min_disk='1',image_min_ram='0',image_owner_id='3a1d679d5c954662a271e842fe2f2c05',image_owner_project_name='tempest-TestSnapshotPattern-1341092631',image_owner_user_name='tempest-TestSnapshotPattern-1341092631-project-member',image_user_id='1654794111844ca88666b3529173e9a7',image_version='8.0',owner_project_name='tempest-TestSnapshotPattern-1341092631',owner_user_name='tempest-TestSnapshotPattern-1341092631-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:03:22Z,user_data=None,user_id='1654794111844ca88666b3529173e9a7',uuid=c2a7aae5-0ef8-400a-acfe-2fbf83144560,vcpu_model
=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4c88fb15-8276-4e15-8d48-e7ff7412f9be", "address": "fa:16:3e:47:19:22", "network": {"id": "43d3be8f-9be1-4892-bbfe-d0ba2d7157ad", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1740636070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a1d679d5c954662a271e842fe2f2c05", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c88fb15-82", "ovs_interfaceid": "4c88fb15-8276-4e15-8d48-e7ff7412f9be", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:03:28 np0005588920 nova_compute[226886]: 2026-01-20 15:03:28.486 226890 DEBUG nova.network.os_vif_util [None req-547dcaed-ed2c-41f2-84cc-96b6bf59d815 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Converting VIF {"id": "4c88fb15-8276-4e15-8d48-e7ff7412f9be", "address": "fa:16:3e:47:19:22", "network": {"id": "43d3be8f-9be1-4892-bbfe-d0ba2d7157ad", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1740636070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a1d679d5c954662a271e842fe2f2c05", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c88fb15-82", "ovs_interfaceid": "4c88fb15-8276-4e15-8d48-e7ff7412f9be", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:03:28 np0005588920 nova_compute[226886]: 2026-01-20 15:03:28.486 226890 DEBUG nova.network.os_vif_util [None req-547dcaed-ed2c-41f2-84cc-96b6bf59d815 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:47:19:22,bridge_name='br-int',has_traffic_filtering=True,id=4c88fb15-8276-4e15-8d48-e7ff7412f9be,network=Network(43d3be8f-9be1-4892-bbfe-d0ba2d7157ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c88fb15-82') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:03:28 np0005588920 nova_compute[226886]: 2026-01-20 15:03:28.486 226890 DEBUG os_vif [None req-547dcaed-ed2c-41f2-84cc-96b6bf59d815 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:47:19:22,bridge_name='br-int',has_traffic_filtering=True,id=4c88fb15-8276-4e15-8d48-e7ff7412f9be,network=Network(43d3be8f-9be1-4892-bbfe-d0ba2d7157ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c88fb15-82') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:03:28 np0005588920 nova_compute[226886]: 2026-01-20 15:03:28.489 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:28 np0005588920 nova_compute[226886]: 2026-01-20 15:03:28.489 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4c88fb15-82, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:03:28 np0005588920 nova_compute[226886]: 2026-01-20 15:03:28.492 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:28 np0005588920 nova_compute[226886]: 2026-01-20 15:03:28.494 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:28 np0005588920 nova_compute[226886]: 2026-01-20 15:03:28.496 226890 INFO os_vif [None req-547dcaed-ed2c-41f2-84cc-96b6bf59d815 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:47:19:22,bridge_name='br-int',has_traffic_filtering=True,id=4c88fb15-8276-4e15-8d48-e7ff7412f9be,network=Network(43d3be8f-9be1-4892-bbfe-d0ba2d7157ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c88fb15-82')#033[00m
Jan 20 10:03:28 np0005588920 nova_compute[226886]: 2026-01-20 15:03:28.521 226890 DEBUG nova.compute.manager [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 10:03:28 np0005588920 nova_compute[226886]: 2026-01-20 15:03:28.522 226890 DEBUG nova.virt.libvirt.driver [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 10:03:28 np0005588920 nova_compute[226886]: 2026-01-20 15:03:28.522 226890 INFO nova.virt.libvirt.driver [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Creating image(s)#033[00m
Jan 20 10:03:28 np0005588920 podman[284215]: 2026-01-20 15:03:28.540112141 +0000 UTC m=+0.059979323 container remove c01e92c24ad744b352303ff0c774e8663fd2a77c45834af5aff120271e973838 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:03:28 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:28.545 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[7cc68bfa-50dd-43a7-b66d-582e7b514184]: (4, ('Tue Jan 20 03:03:28 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad (c01e92c24ad744b352303ff0c774e8663fd2a77c45834af5aff120271e973838)\nc01e92c24ad744b352303ff0c774e8663fd2a77c45834af5aff120271e973838\nTue Jan 20 03:03:28 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad (c01e92c24ad744b352303ff0c774e8663fd2a77c45834af5aff120271e973838)\nc01e92c24ad744b352303ff0c774e8663fd2a77c45834af5aff120271e973838\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:28 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:28.548 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[409d9073-ea70-4e20-a831-9818b2d71f32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:28 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:28.549 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap43d3be8f-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:03:28 np0005588920 nova_compute[226886]: 2026-01-20 15:03:28.550 226890 DEBUG nova.storage.rbd_utils [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] rbd image a0ce16c6-2b75-472f-a785-890fbb0d748e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:03:28 np0005588920 kernel: tap43d3be8f-90: left promiscuous mode
Jan 20 10:03:28 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:28.555 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c53b7b17-4c9f-47f6-8a22-c0a653177c8b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:28 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:28.570 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e2047901-5fd3-480b-b948-6024b377c749]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:28 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:28.571 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[bc8d657e-18ff-4b3d-991d-a334d0161549]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:28 np0005588920 nova_compute[226886]: 2026-01-20 15:03:28.585 226890 DEBUG nova.storage.rbd_utils [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] rbd image a0ce16c6-2b75-472f-a785-890fbb0d748e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:03:28 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:28.586 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[46607445-443b-4f6d-a5b6-6a8441c761c8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 639914, 'reachable_time': 22149, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 284281, 'error': None, 'target': 'ovnmeta-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:28 np0005588920 systemd[1]: run-netns-ovnmeta\x2d43d3be8f\x2d9be1\x2d4892\x2dbbfe\x2dd0ba2d7157ad.mount: Deactivated successfully.
Jan 20 10:03:28 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:28.590 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:03:28 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:28.590 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[040b6328-6b33-4069-9509-bafec043de5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:28 np0005588920 nova_compute[226886]: 2026-01-20 15:03:28.614 226890 DEBUG nova.storage.rbd_utils [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] rbd image a0ce16c6-2b75-472f-a785-890fbb0d748e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:03:28 np0005588920 nova_compute[226886]: 2026-01-20 15:03:28.618 226890 DEBUG oslo_concurrency.processutils [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:03:28 np0005588920 nova_compute[226886]: 2026-01-20 15:03:28.647 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:28 np0005588920 nova_compute[226886]: 2026-01-20 15:03:28.690 226890 DEBUG oslo_concurrency.processutils [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:03:28 np0005588920 nova_compute[226886]: 2026-01-20 15:03:28.690 226890 DEBUG oslo_concurrency.lockutils [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:03:28 np0005588920 nova_compute[226886]: 2026-01-20 15:03:28.691 226890 DEBUG oslo_concurrency.lockutils [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:03:28 np0005588920 nova_compute[226886]: 2026-01-20 15:03:28.691 226890 DEBUG oslo_concurrency.lockutils [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:03:28 np0005588920 nova_compute[226886]: 2026-01-20 15:03:28.717 226890 DEBUG nova.storage.rbd_utils [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] rbd image a0ce16c6-2b75-472f-a785-890fbb0d748e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:03:28 np0005588920 nova_compute[226886]: 2026-01-20 15:03:28.720 226890 DEBUG oslo_concurrency.processutils [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 a0ce16c6-2b75-472f-a785-890fbb0d748e_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:03:28 np0005588920 nova_compute[226886]: 2026-01-20 15:03:28.816 226890 DEBUG nova.policy [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b02a8ef6cc3946ceb2c8846aae2eae68', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0fc924d2df984301897e81920c5e192f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 10:03:29 np0005588920 nova_compute[226886]: 2026-01-20 15:03:29.083 226890 DEBUG oslo_concurrency.processutils [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 a0ce16c6-2b75-472f-a785-890fbb0d748e_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.363s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:03:29 np0005588920 nova_compute[226886]: 2026-01-20 15:03:29.169 226890 DEBUG nova.storage.rbd_utils [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] resizing rbd image a0ce16c6-2b75-472f-a785-890fbb0d748e_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 10:03:29 np0005588920 nova_compute[226886]: 2026-01-20 15:03:29.207 226890 DEBUG nova.compute.manager [req-49adffcd-4428-4fca-98fe-2e2bbee9ea77 req-7493f303-ffdc-4de5-b596-f601f5bf2803 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Received event network-vif-unplugged-4c88fb15-8276-4e15-8d48-e7ff7412f9be external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:03:29 np0005588920 nova_compute[226886]: 2026-01-20 15:03:29.208 226890 DEBUG oslo_concurrency.lockutils [req-49adffcd-4428-4fca-98fe-2e2bbee9ea77 req-7493f303-ffdc-4de5-b596-f601f5bf2803 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "c2a7aae5-0ef8-400a-acfe-2fbf83144560-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:03:29 np0005588920 nova_compute[226886]: 2026-01-20 15:03:29.208 226890 DEBUG oslo_concurrency.lockutils [req-49adffcd-4428-4fca-98fe-2e2bbee9ea77 req-7493f303-ffdc-4de5-b596-f601f5bf2803 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c2a7aae5-0ef8-400a-acfe-2fbf83144560-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:03:29 np0005588920 nova_compute[226886]: 2026-01-20 15:03:29.209 226890 DEBUG oslo_concurrency.lockutils [req-49adffcd-4428-4fca-98fe-2e2bbee9ea77 req-7493f303-ffdc-4de5-b596-f601f5bf2803 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c2a7aae5-0ef8-400a-acfe-2fbf83144560-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:03:29 np0005588920 nova_compute[226886]: 2026-01-20 15:03:29.209 226890 DEBUG nova.compute.manager [req-49adffcd-4428-4fca-98fe-2e2bbee9ea77 req-7493f303-ffdc-4de5-b596-f601f5bf2803 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] No waiting events found dispatching network-vif-unplugged-4c88fb15-8276-4e15-8d48-e7ff7412f9be pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:03:29 np0005588920 nova_compute[226886]: 2026-01-20 15:03:29.209 226890 DEBUG nova.compute.manager [req-49adffcd-4428-4fca-98fe-2e2bbee9ea77 req-7493f303-ffdc-4de5-b596-f601f5bf2803 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Received event network-vif-unplugged-4c88fb15-8276-4e15-8d48-e7ff7412f9be for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:03:29 np0005588920 nova_compute[226886]: 2026-01-20 15:03:29.312 226890 DEBUG nova.objects.instance [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lazy-loading 'migration_context' on Instance uuid a0ce16c6-2b75-472f-a785-890fbb0d748e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:03:29 np0005588920 nova_compute[226886]: 2026-01-20 15:03:29.344 226890 INFO nova.virt.libvirt.driver [None req-547dcaed-ed2c-41f2-84cc-96b6bf59d815 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Deleting instance files /var/lib/nova/instances/c2a7aae5-0ef8-400a-acfe-2fbf83144560_del#033[00m
Jan 20 10:03:29 np0005588920 nova_compute[226886]: 2026-01-20 15:03:29.345 226890 INFO nova.virt.libvirt.driver [None req-547dcaed-ed2c-41f2-84cc-96b6bf59d815 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Deletion of /var/lib/nova/instances/c2a7aae5-0ef8-400a-acfe-2fbf83144560_del complete#033[00m
Jan 20 10:03:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:03:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:29.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:03:29 np0005588920 nova_compute[226886]: 2026-01-20 15:03:29.422 226890 DEBUG nova.virt.libvirt.driver [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 10:03:29 np0005588920 nova_compute[226886]: 2026-01-20 15:03:29.423 226890 DEBUG nova.virt.libvirt.driver [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Ensure instance console log exists: /var/lib/nova/instances/a0ce16c6-2b75-472f-a785-890fbb0d748e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:03:29 np0005588920 nova_compute[226886]: 2026-01-20 15:03:29.423 226890 DEBUG oslo_concurrency.lockutils [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:03:29 np0005588920 nova_compute[226886]: 2026-01-20 15:03:29.423 226890 DEBUG oslo_concurrency.lockutils [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:03:29 np0005588920 nova_compute[226886]: 2026-01-20 15:03:29.424 226890 DEBUG oslo_concurrency.lockutils [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:03:29 np0005588920 nova_compute[226886]: 2026-01-20 15:03:29.884 226890 INFO nova.compute.manager [None req-547dcaed-ed2c-41f2-84cc-96b6bf59d815 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Took 1.86 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:03:29 np0005588920 nova_compute[226886]: 2026-01-20 15:03:29.885 226890 DEBUG oslo.service.loopingcall [None req-547dcaed-ed2c-41f2-84cc-96b6bf59d815 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:03:29 np0005588920 nova_compute[226886]: 2026-01-20 15:03:29.885 226890 DEBUG nova.compute.manager [-] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:03:29 np0005588920 nova_compute[226886]: 2026-01-20 15:03:29.885 226890 DEBUG nova.network.neutron [-] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:03:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:30.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:31 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e343 e343: 3 total, 3 up, 3 in
Jan 20 10:03:31 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:03:31 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:03:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:31.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:31 np0005588920 nova_compute[226886]: 2026-01-20 15:03:31.485 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:31 np0005588920 nova_compute[226886]: 2026-01-20 15:03:31.495 226890 DEBUG nova.compute.manager [req-77e6b6ae-9e4c-481b-bf98-91e197b8edbf req-03ad7342-a09f-4ce6-9c44-3f51eaf0d4c5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Received event network-vif-plugged-4c88fb15-8276-4e15-8d48-e7ff7412f9be external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:03:31 np0005588920 nova_compute[226886]: 2026-01-20 15:03:31.496 226890 DEBUG oslo_concurrency.lockutils [req-77e6b6ae-9e4c-481b-bf98-91e197b8edbf req-03ad7342-a09f-4ce6-9c44-3f51eaf0d4c5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "c2a7aae5-0ef8-400a-acfe-2fbf83144560-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:03:31 np0005588920 nova_compute[226886]: 2026-01-20 15:03:31.496 226890 DEBUG oslo_concurrency.lockutils [req-77e6b6ae-9e4c-481b-bf98-91e197b8edbf req-03ad7342-a09f-4ce6-9c44-3f51eaf0d4c5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c2a7aae5-0ef8-400a-acfe-2fbf83144560-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:03:31 np0005588920 nova_compute[226886]: 2026-01-20 15:03:31.496 226890 DEBUG oslo_concurrency.lockutils [req-77e6b6ae-9e4c-481b-bf98-91e197b8edbf req-03ad7342-a09f-4ce6-9c44-3f51eaf0d4c5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c2a7aae5-0ef8-400a-acfe-2fbf83144560-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:03:31 np0005588920 nova_compute[226886]: 2026-01-20 15:03:31.496 226890 DEBUG nova.compute.manager [req-77e6b6ae-9e4c-481b-bf98-91e197b8edbf req-03ad7342-a09f-4ce6-9c44-3f51eaf0d4c5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] No waiting events found dispatching network-vif-plugged-4c88fb15-8276-4e15-8d48-e7ff7412f9be pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:03:31 np0005588920 nova_compute[226886]: 2026-01-20 15:03:31.497 226890 WARNING nova.compute.manager [req-77e6b6ae-9e4c-481b-bf98-91e197b8edbf req-03ad7342-a09f-4ce6-9c44-3f51eaf0d4c5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Received unexpected event network-vif-plugged-4c88fb15-8276-4e15-8d48-e7ff7412f9be for instance with vm_state active and task_state deleting.#033[00m
Jan 20 10:03:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:32.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e344 e344: 3 total, 3 up, 3 in
Jan 20 10:03:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e344 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:03:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:03:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:33.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:03:33 np0005588920 nova_compute[226886]: 2026-01-20 15:03:33.530 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:33 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e345 e345: 3 total, 3 up, 3 in
Jan 20 10:03:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:34.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:34 np0005588920 nova_compute[226886]: 2026-01-20 15:03:34.918 226890 DEBUG nova.network.neutron [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Successfully created port: 7b2aa669-8f25-4d67-b56d-f9a96e1774a4 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 10:03:34 np0005588920 nova_compute[226886]: 2026-01-20 15:03:34.926 226890 DEBUG nova.network.neutron [req-b831ff9f-8d3a-4aa8-93a9-8b7c7e227240 req-df7d2cd0-a28a-49e7-9ab9-a9122fd6faef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Updated VIF entry in instance network info cache for port 4c88fb15-8276-4e15-8d48-e7ff7412f9be. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:03:34 np0005588920 nova_compute[226886]: 2026-01-20 15:03:34.926 226890 DEBUG nova.network.neutron [req-b831ff9f-8d3a-4aa8-93a9-8b7c7e227240 req-df7d2cd0-a28a-49e7-9ab9-a9122fd6faef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Updating instance_info_cache with network_info: [{"id": "4c88fb15-8276-4e15-8d48-e7ff7412f9be", "address": "fa:16:3e:47:19:22", "network": {"id": "43d3be8f-9be1-4892-bbfe-d0ba2d7157ad", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1740636070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a1d679d5c954662a271e842fe2f2c05", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c88fb15-82", "ovs_interfaceid": "4c88fb15-8276-4e15-8d48-e7ff7412f9be", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:03:34 np0005588920 nova_compute[226886]: 2026-01-20 15:03:34.980 226890 DEBUG oslo_concurrency.lockutils [req-b831ff9f-8d3a-4aa8-93a9-8b7c7e227240 req-df7d2cd0-a28a-49e7-9ab9-a9122fd6faef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-c2a7aae5-0ef8-400a-acfe-2fbf83144560" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:03:35 np0005588920 nova_compute[226886]: 2026-01-20 15:03:35.070 226890 DEBUG nova.network.neutron [-] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:03:35 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e346 e346: 3 total, 3 up, 3 in
Jan 20 10:03:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:35.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:35 np0005588920 podman[284469]: 2026-01-20 15:03:35.998713096 +0000 UTC m=+0.085136982 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.license=GPLv2, config_id=ovn_controller)
Jan 20 10:03:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:36.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:36 np0005588920 nova_compute[226886]: 2026-01-20 15:03:36.488 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:36 np0005588920 nova_compute[226886]: 2026-01-20 15:03:36.716 226890 DEBUG nova.compute.manager [req-bb78abf0-dac7-43ba-810a-9fed6729ff6e req-6ecc9480-e59c-4970-8bc9-f556275b4a89 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Received event network-vif-deleted-4c88fb15-8276-4e15-8d48-e7ff7412f9be external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:03:36 np0005588920 nova_compute[226886]: 2026-01-20 15:03:36.716 226890 INFO nova.compute.manager [req-bb78abf0-dac7-43ba-810a-9fed6729ff6e req-6ecc9480-e59c-4970-8bc9-f556275b4a89 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Neutron deleted interface 4c88fb15-8276-4e15-8d48-e7ff7412f9be; detaching it from the instance and deleting it from the info cache#033[00m
Jan 20 10:03:36 np0005588920 nova_compute[226886]: 2026-01-20 15:03:36.716 226890 DEBUG nova.network.neutron [req-bb78abf0-dac7-43ba-810a-9fed6729ff6e req-6ecc9480-e59c-4970-8bc9-f556275b4a89 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:03:36 np0005588920 nova_compute[226886]: 2026-01-20 15:03:36.732 226890 INFO nova.compute.manager [-] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Took 6.85 seconds to deallocate network for instance.#033[00m
Jan 20 10:03:36 np0005588920 nova_compute[226886]: 2026-01-20 15:03:36.788 226890 DEBUG nova.compute.manager [req-bb78abf0-dac7-43ba-810a-9fed6729ff6e req-6ecc9480-e59c-4970-8bc9-f556275b4a89 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Detach interface failed, port_id=4c88fb15-8276-4e15-8d48-e7ff7412f9be, reason: Instance c2a7aae5-0ef8-400a-acfe-2fbf83144560 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 20 10:03:36 np0005588920 nova_compute[226886]: 2026-01-20 15:03:36.910 226890 DEBUG oslo_concurrency.lockutils [None req-547dcaed-ed2c-41f2-84cc-96b6bf59d815 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:03:36 np0005588920 nova_compute[226886]: 2026-01-20 15:03:36.911 226890 DEBUG oslo_concurrency.lockutils [None req-547dcaed-ed2c-41f2-84cc-96b6bf59d815 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:03:37 np0005588920 nova_compute[226886]: 2026-01-20 15:03:37.039 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:37.043 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=49, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=48) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:03:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:37.044 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:03:37 np0005588920 nova_compute[226886]: 2026-01-20 15:03:37.056 226890 DEBUG oslo_concurrency.processutils [None req-547dcaed-ed2c-41f2-84cc-96b6bf59d815 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:03:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:03:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:37.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:03:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:03:37 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1143980325' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:03:37 np0005588920 nova_compute[226886]: 2026-01-20 15:03:37.496 226890 DEBUG oslo_concurrency.processutils [None req-547dcaed-ed2c-41f2-84cc-96b6bf59d815 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:03:37 np0005588920 nova_compute[226886]: 2026-01-20 15:03:37.502 226890 DEBUG nova.compute.provider_tree [None req-547dcaed-ed2c-41f2-84cc-96b6bf59d815 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:03:37 np0005588920 nova_compute[226886]: 2026-01-20 15:03:37.586 226890 DEBUG nova.scheduler.client.report [None req-547dcaed-ed2c-41f2-84cc-96b6bf59d815 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:03:37 np0005588920 nova_compute[226886]: 2026-01-20 15:03:37.628 226890 DEBUG oslo_concurrency.lockutils [None req-547dcaed-ed2c-41f2-84cc-96b6bf59d815 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.717s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:03:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e346 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:03:37 np0005588920 nova_compute[226886]: 2026-01-20 15:03:37.804 226890 INFO nova.scheduler.client.report [None req-547dcaed-ed2c-41f2-84cc-96b6bf59d815 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Deleted allocations for instance c2a7aae5-0ef8-400a-acfe-2fbf83144560#033[00m
Jan 20 10:03:37 np0005588920 nova_compute[226886]: 2026-01-20 15:03:37.914 226890 DEBUG oslo_concurrency.lockutils [None req-547dcaed-ed2c-41f2-84cc-96b6bf59d815 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Lock "c2a7aae5-0ef8-400a-acfe-2fbf83144560" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 9.889s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:03:37 np0005588920 nova_compute[226886]: 2026-01-20 15:03:37.919 226890 DEBUG nova.network.neutron [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Successfully updated port: 7b2aa669-8f25-4d67-b56d-f9a96e1774a4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 10:03:37 np0005588920 nova_compute[226886]: 2026-01-20 15:03:37.941 226890 DEBUG oslo_concurrency.lockutils [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Acquiring lock "refresh_cache-a0ce16c6-2b75-472f-a785-890fbb0d748e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:03:37 np0005588920 nova_compute[226886]: 2026-01-20 15:03:37.941 226890 DEBUG oslo_concurrency.lockutils [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Acquired lock "refresh_cache-a0ce16c6-2b75-472f-a785-890fbb0d748e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:03:37 np0005588920 nova_compute[226886]: 2026-01-20 15:03:37.941 226890 DEBUG nova.network.neutron [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:03:38 np0005588920 nova_compute[226886]: 2026-01-20 15:03:38.064 226890 DEBUG nova.compute.manager [req-e58a2dab-2a56-420e-8d0f-73192d050edd req-679bdbcb-1d22-48be-8eb7-e48f2e40b98a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Received event network-changed-7b2aa669-8f25-4d67-b56d-f9a96e1774a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:03:38 np0005588920 nova_compute[226886]: 2026-01-20 15:03:38.065 226890 DEBUG nova.compute.manager [req-e58a2dab-2a56-420e-8d0f-73192d050edd req-679bdbcb-1d22-48be-8eb7-e48f2e40b98a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Refreshing instance network info cache due to event network-changed-7b2aa669-8f25-4d67-b56d-f9a96e1774a4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:03:38 np0005588920 nova_compute[226886]: 2026-01-20 15:03:38.065 226890 DEBUG oslo_concurrency.lockutils [req-e58a2dab-2a56-420e-8d0f-73192d050edd req-679bdbcb-1d22-48be-8eb7-e48f2e40b98a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-a0ce16c6-2b75-472f-a785-890fbb0d748e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:03:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:38.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:38 np0005588920 nova_compute[226886]: 2026-01-20 15:03:38.533 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:38 np0005588920 nova_compute[226886]: 2026-01-20 15:03:38.824 226890 DEBUG nova.network.neutron [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:03:39 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e347 e347: 3 total, 3 up, 3 in
Jan 20 10:03:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:39.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:40 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e348 e348: 3 total, 3 up, 3 in
Jan 20 10:03:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:03:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:40.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:03:41 np0005588920 nova_compute[226886]: 2026-01-20 15:03:41.090 226890 DEBUG nova.network.neutron [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Updating instance_info_cache with network_info: [{"id": "7b2aa669-8f25-4d67-b56d-f9a96e1774a4", "address": "fa:16:3e:f8:b3:b3", "network": {"id": "0f434e83-45c8-454d-820b-af39b696a1d5", "bridge": "br-int", "label": "tempest-TestShelveInstance-1862824485-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0fc924d2df984301897e81920c5e192f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b2aa669-8f", "ovs_interfaceid": "7b2aa669-8f25-4d67-b56d-f9a96e1774a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:03:41 np0005588920 nova_compute[226886]: 2026-01-20 15:03:41.147 226890 DEBUG oslo_concurrency.lockutils [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Releasing lock "refresh_cache-a0ce16c6-2b75-472f-a785-890fbb0d748e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:03:41 np0005588920 nova_compute[226886]: 2026-01-20 15:03:41.147 226890 DEBUG nova.compute.manager [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Instance network_info: |[{"id": "7b2aa669-8f25-4d67-b56d-f9a96e1774a4", "address": "fa:16:3e:f8:b3:b3", "network": {"id": "0f434e83-45c8-454d-820b-af39b696a1d5", "bridge": "br-int", "label": "tempest-TestShelveInstance-1862824485-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0fc924d2df984301897e81920c5e192f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b2aa669-8f", "ovs_interfaceid": "7b2aa669-8f25-4d67-b56d-f9a96e1774a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 10:03:41 np0005588920 nova_compute[226886]: 2026-01-20 15:03:41.148 226890 DEBUG oslo_concurrency.lockutils [req-e58a2dab-2a56-420e-8d0f-73192d050edd req-679bdbcb-1d22-48be-8eb7-e48f2e40b98a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-a0ce16c6-2b75-472f-a785-890fbb0d748e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:03:41 np0005588920 nova_compute[226886]: 2026-01-20 15:03:41.148 226890 DEBUG nova.network.neutron [req-e58a2dab-2a56-420e-8d0f-73192d050edd req-679bdbcb-1d22-48be-8eb7-e48f2e40b98a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Refreshing network info cache for port 7b2aa669-8f25-4d67-b56d-f9a96e1774a4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:03:41 np0005588920 nova_compute[226886]: 2026-01-20 15:03:41.151 226890 DEBUG nova.virt.libvirt.driver [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Start _get_guest_xml network_info=[{"id": "7b2aa669-8f25-4d67-b56d-f9a96e1774a4", "address": "fa:16:3e:f8:b3:b3", "network": {"id": "0f434e83-45c8-454d-820b-af39b696a1d5", "bridge": "br-int", "label": "tempest-TestShelveInstance-1862824485-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0fc924d2df984301897e81920c5e192f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b2aa669-8f", "ovs_interfaceid": "7b2aa669-8f25-4d67-b56d-f9a96e1774a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:03:41 np0005588920 nova_compute[226886]: 2026-01-20 15:03:41.157 226890 WARNING nova.virt.libvirt.driver [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:03:41 np0005588920 nova_compute[226886]: 2026-01-20 15:03:41.162 226890 DEBUG nova.virt.libvirt.host [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:03:41 np0005588920 nova_compute[226886]: 2026-01-20 15:03:41.163 226890 DEBUG nova.virt.libvirt.host [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:03:41 np0005588920 nova_compute[226886]: 2026-01-20 15:03:41.169 226890 DEBUG nova.virt.libvirt.host [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:03:41 np0005588920 nova_compute[226886]: 2026-01-20 15:03:41.169 226890 DEBUG nova.virt.libvirt.host [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:03:41 np0005588920 nova_compute[226886]: 2026-01-20 15:03:41.171 226890 DEBUG nova.virt.libvirt.driver [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:03:41 np0005588920 nova_compute[226886]: 2026-01-20 15:03:41.171 226890 DEBUG nova.virt.hardware [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:03:41 np0005588920 nova_compute[226886]: 2026-01-20 15:03:41.172 226890 DEBUG nova.virt.hardware [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:03:41 np0005588920 nova_compute[226886]: 2026-01-20 15:03:41.172 226890 DEBUG nova.virt.hardware [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:03:41 np0005588920 nova_compute[226886]: 2026-01-20 15:03:41.172 226890 DEBUG nova.virt.hardware [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:03:41 np0005588920 nova_compute[226886]: 2026-01-20 15:03:41.172 226890 DEBUG nova.virt.hardware [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:03:41 np0005588920 nova_compute[226886]: 2026-01-20 15:03:41.173 226890 DEBUG nova.virt.hardware [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:03:41 np0005588920 nova_compute[226886]: 2026-01-20 15:03:41.173 226890 DEBUG nova.virt.hardware [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:03:41 np0005588920 nova_compute[226886]: 2026-01-20 15:03:41.173 226890 DEBUG nova.virt.hardware [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:03:41 np0005588920 nova_compute[226886]: 2026-01-20 15:03:41.174 226890 DEBUG nova.virt.hardware [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:03:41 np0005588920 nova_compute[226886]: 2026-01-20 15:03:41.174 226890 DEBUG nova.virt.hardware [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:03:41 np0005588920 nova_compute[226886]: 2026-01-20 15:03:41.174 226890 DEBUG nova.virt.hardware [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:03:41 np0005588920 nova_compute[226886]: 2026-01-20 15:03:41.178 226890 DEBUG oslo_concurrency.processutils [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:03:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:41.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:41 np0005588920 nova_compute[226886]: 2026-01-20 15:03:41.489 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:41 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:03:41 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1625105208' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:03:41 np0005588920 nova_compute[226886]: 2026-01-20 15:03:41.664 226890 DEBUG oslo_concurrency.processutils [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:03:41 np0005588920 nova_compute[226886]: 2026-01-20 15:03:41.689 226890 DEBUG nova.storage.rbd_utils [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] rbd image a0ce16c6-2b75-472f-a785-890fbb0d748e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:03:41 np0005588920 nova_compute[226886]: 2026-01-20 15:03:41.692 226890 DEBUG oslo_concurrency.processutils [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:03:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:03:42 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1704404478' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:03:42 np0005588920 nova_compute[226886]: 2026-01-20 15:03:42.142 226890 DEBUG oslo_concurrency.processutils [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:03:42 np0005588920 nova_compute[226886]: 2026-01-20 15:03:42.143 226890 DEBUG nova.virt.libvirt.vif [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:03:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestShelveInstance-server-1036076849',display_name='tempest-TestShelveInstance-server-1036076849',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testshelveinstance-server-1036076849',id=158,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDKbSg+U5D2P4vAhN93N9KUHNV5uhMaQWWRL1/dgo18CRR+13PC7EHc+NfhsO3rchRXZsX8fKAmtn1X9kzXWRANuFYEKLsCK/cad6C56A1ZIn2STxVc8j8348CriP8hVdg==',key_name='tempest-TestShelveInstance-624896822',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0fc924d2df984301897e81920c5e192f',ramdisk_id='',reservation_id='r-18lfln0m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestShelveInstance-1425544575',owner_user_name='tempest-TestShelveInstance-1425544575-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:03:28Z,user_data=None,user_id='b02a8ef6cc3946ceb2c8846aae2eae68',uuid=a0ce16c6-2b75-472f-a785-890fbb0d748e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7b2aa669-8f25-4d67-b56d-f9a96e1774a4", "address": "fa:16:3e:f8:b3:b3", "network": {"id": "0f434e83-45c8-454d-820b-af39b696a1d5", "bridge": "br-int", "label": "tempest-TestShelveInstance-1862824485-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0fc924d2df984301897e81920c5e192f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b2aa669-8f", "ovs_interfaceid": "7b2aa669-8f25-4d67-b56d-f9a96e1774a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:03:42 np0005588920 nova_compute[226886]: 2026-01-20 15:03:42.144 226890 DEBUG nova.network.os_vif_util [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Converting VIF {"id": "7b2aa669-8f25-4d67-b56d-f9a96e1774a4", "address": "fa:16:3e:f8:b3:b3", "network": {"id": "0f434e83-45c8-454d-820b-af39b696a1d5", "bridge": "br-int", "label": "tempest-TestShelveInstance-1862824485-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0fc924d2df984301897e81920c5e192f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b2aa669-8f", "ovs_interfaceid": "7b2aa669-8f25-4d67-b56d-f9a96e1774a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:03:42 np0005588920 nova_compute[226886]: 2026-01-20 15:03:42.145 226890 DEBUG nova.network.os_vif_util [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f8:b3:b3,bridge_name='br-int',has_traffic_filtering=True,id=7b2aa669-8f25-4d67-b56d-f9a96e1774a4,network=Network(0f434e83-45c8-454d-820b-af39b696a1d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b2aa669-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:03:42 np0005588920 nova_compute[226886]: 2026-01-20 15:03:42.146 226890 DEBUG nova.objects.instance [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lazy-loading 'pci_devices' on Instance uuid a0ce16c6-2b75-472f-a785-890fbb0d748e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:03:42 np0005588920 nova_compute[226886]: 2026-01-20 15:03:42.166 226890 DEBUG nova.virt.libvirt.driver [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:03:42 np0005588920 nova_compute[226886]:  <uuid>a0ce16c6-2b75-472f-a785-890fbb0d748e</uuid>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:  <name>instance-0000009e</name>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:03:42 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:      <nova:name>tempest-TestShelveInstance-server-1036076849</nova:name>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 15:03:41</nova:creationTime>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 10:03:42 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:        <nova:user uuid="b02a8ef6cc3946ceb2c8846aae2eae68">tempest-TestShelveInstance-1425544575-project-member</nova:user>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:        <nova:project uuid="0fc924d2df984301897e81920c5e192f">tempest-TestShelveInstance-1425544575</nova:project>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:        <nova:port uuid="7b2aa669-8f25-4d67-b56d-f9a96e1774a4">
Jan 20 10:03:42 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    <system>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:      <entry name="serial">a0ce16c6-2b75-472f-a785-890fbb0d748e</entry>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:      <entry name="uuid">a0ce16c6-2b75-472f-a785-890fbb0d748e</entry>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    </system>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:  <os>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:  </os>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:  <features>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:  </features>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:  </clock>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:  <devices>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 10:03:42 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/a0ce16c6-2b75-472f-a785-890fbb0d748e_disk">
Jan 20 10:03:42 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:03:42 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 10:03:42 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/a0ce16c6-2b75-472f-a785-890fbb0d748e_disk.config">
Jan 20 10:03:42 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:03:42 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 10:03:42 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:f8:b3:b3"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:      <target dev="tap7b2aa669-8f"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    </interface>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 10:03:42 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/a0ce16c6-2b75-472f-a785-890fbb0d748e/console.log" append="off"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    </serial>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    <video>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    </video>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 10:03:42 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    </rng>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 10:03:42 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 10:03:42 np0005588920 nova_compute[226886]:  </devices>
Jan 20 10:03:42 np0005588920 nova_compute[226886]: </domain>
Jan 20 10:03:42 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:03:42 np0005588920 nova_compute[226886]: 2026-01-20 15:03:42.168 226890 DEBUG nova.compute.manager [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Preparing to wait for external event network-vif-plugged-7b2aa669-8f25-4d67-b56d-f9a96e1774a4 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:03:42 np0005588920 nova_compute[226886]: 2026-01-20 15:03:42.168 226890 DEBUG oslo_concurrency.lockutils [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Acquiring lock "a0ce16c6-2b75-472f-a785-890fbb0d748e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:03:42 np0005588920 nova_compute[226886]: 2026-01-20 15:03:42.168 226890 DEBUG oslo_concurrency.lockutils [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lock "a0ce16c6-2b75-472f-a785-890fbb0d748e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:03:42 np0005588920 nova_compute[226886]: 2026-01-20 15:03:42.169 226890 DEBUG oslo_concurrency.lockutils [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lock "a0ce16c6-2b75-472f-a785-890fbb0d748e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:03:42 np0005588920 nova_compute[226886]: 2026-01-20 15:03:42.169 226890 DEBUG nova.virt.libvirt.vif [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:03:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestShelveInstance-server-1036076849',display_name='tempest-TestShelveInstance-server-1036076849',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testshelveinstance-server-1036076849',id=158,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDKbSg+U5D2P4vAhN93N9KUHNV5uhMaQWWRL1/dgo18CRR+13PC7EHc+NfhsO3rchRXZsX8fKAmtn1X9kzXWRANuFYEKLsCK/cad6C56A1ZIn2STxVc8j8348CriP8hVdg==',key_name='tempest-TestShelveInstance-624896822',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0fc924d2df984301897e81920c5e192f',ramdisk_id='',reservation_id='r-18lfln0m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestShelveInstance-1425544575',owner_user_name='tempest-TestShelveInstance-1425544575-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:03:28Z,user_data=None,user_id='b02a8ef6cc3946ceb2c8846aae2eae68',uuid=a0ce16c6-2b75-472f-a785-890fbb0d748e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7b2aa669-8f25-4d67-b56d-f9a96e1774a4", "address": "fa:16:3e:f8:b3:b3", "network": {"id": "0f434e83-45c8-454d-820b-af39b696a1d5", "bridge": "br-int", "label": "tempest-TestShelveInstance-1862824485-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "0fc924d2df984301897e81920c5e192f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b2aa669-8f", "ovs_interfaceid": "7b2aa669-8f25-4d67-b56d-f9a96e1774a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:03:42 np0005588920 nova_compute[226886]: 2026-01-20 15:03:42.170 226890 DEBUG nova.network.os_vif_util [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Converting VIF {"id": "7b2aa669-8f25-4d67-b56d-f9a96e1774a4", "address": "fa:16:3e:f8:b3:b3", "network": {"id": "0f434e83-45c8-454d-820b-af39b696a1d5", "bridge": "br-int", "label": "tempest-TestShelveInstance-1862824485-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0fc924d2df984301897e81920c5e192f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b2aa669-8f", "ovs_interfaceid": "7b2aa669-8f25-4d67-b56d-f9a96e1774a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:03:42 np0005588920 nova_compute[226886]: 2026-01-20 15:03:42.170 226890 DEBUG nova.network.os_vif_util [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f8:b3:b3,bridge_name='br-int',has_traffic_filtering=True,id=7b2aa669-8f25-4d67-b56d-f9a96e1774a4,network=Network(0f434e83-45c8-454d-820b-af39b696a1d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b2aa669-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:03:42 np0005588920 nova_compute[226886]: 2026-01-20 15:03:42.171 226890 DEBUG os_vif [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f8:b3:b3,bridge_name='br-int',has_traffic_filtering=True,id=7b2aa669-8f25-4d67-b56d-f9a96e1774a4,network=Network(0f434e83-45c8-454d-820b-af39b696a1d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b2aa669-8f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:03:42 np0005588920 nova_compute[226886]: 2026-01-20 15:03:42.171 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:42 np0005588920 nova_compute[226886]: 2026-01-20 15:03:42.172 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:03:42 np0005588920 nova_compute[226886]: 2026-01-20 15:03:42.172 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:03:42 np0005588920 nova_compute[226886]: 2026-01-20 15:03:42.175 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:42 np0005588920 nova_compute[226886]: 2026-01-20 15:03:42.175 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7b2aa669-8f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:03:42 np0005588920 nova_compute[226886]: 2026-01-20 15:03:42.176 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7b2aa669-8f, col_values=(('external_ids', {'iface-id': '7b2aa669-8f25-4d67-b56d-f9a96e1774a4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f8:b3:b3', 'vm-uuid': 'a0ce16c6-2b75-472f-a785-890fbb0d748e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:03:42 np0005588920 nova_compute[226886]: 2026-01-20 15:03:42.223 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:42 np0005588920 NetworkManager[49076]: <info>  [1768921422.2243] manager: (tap7b2aa669-8f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/350)
Jan 20 10:03:42 np0005588920 nova_compute[226886]: 2026-01-20 15:03:42.227 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:03:42 np0005588920 nova_compute[226886]: 2026-01-20 15:03:42.230 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:42 np0005588920 nova_compute[226886]: 2026-01-20 15:03:42.231 226890 INFO os_vif [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f8:b3:b3,bridge_name='br-int',has_traffic_filtering=True,id=7b2aa669-8f25-4d67-b56d-f9a96e1774a4,network=Network(0f434e83-45c8-454d-820b-af39b696a1d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b2aa669-8f')#033[00m
Jan 20 10:03:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:42.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e348 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:03:42 np0005588920 nova_compute[226886]: 2026-01-20 15:03:42.791 226890 DEBUG nova.virt.libvirt.driver [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:03:42 np0005588920 nova_compute[226886]: 2026-01-20 15:03:42.791 226890 DEBUG nova.virt.libvirt.driver [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:03:42 np0005588920 nova_compute[226886]: 2026-01-20 15:03:42.791 226890 DEBUG nova.virt.libvirt.driver [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] No VIF found with MAC fa:16:3e:f8:b3:b3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:03:42 np0005588920 nova_compute[226886]: 2026-01-20 15:03:42.792 226890 INFO nova.virt.libvirt.driver [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Using config drive#033[00m
Jan 20 10:03:42 np0005588920 nova_compute[226886]: 2026-01-20 15:03:42.819 226890 DEBUG nova.storage.rbd_utils [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] rbd image a0ce16c6-2b75-472f-a785-890fbb0d748e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:03:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:43.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:43 np0005588920 nova_compute[226886]: 2026-01-20 15:03:43.470 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768921408.4689872, c2a7aae5-0ef8-400a-acfe-2fbf83144560 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:03:43 np0005588920 nova_compute[226886]: 2026-01-20 15:03:43.471 226890 INFO nova.compute.manager [-] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:03:43 np0005588920 nova_compute[226886]: 2026-01-20 15:03:43.500 226890 DEBUG nova.compute.manager [None req-dc3f65ac-5c87-4227-a28e-1e10f45f0b13 - - - - - -] [instance: c2a7aae5-0ef8-400a-acfe-2fbf83144560] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:03:43 np0005588920 nova_compute[226886]: 2026-01-20 15:03:43.789 226890 INFO nova.virt.libvirt.driver [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Creating config drive at /var/lib/nova/instances/a0ce16c6-2b75-472f-a785-890fbb0d748e/disk.config#033[00m
Jan 20 10:03:43 np0005588920 nova_compute[226886]: 2026-01-20 15:03:43.793 226890 DEBUG oslo_concurrency.processutils [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a0ce16c6-2b75-472f-a785-890fbb0d748e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgu_fsrha execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:03:43 np0005588920 nova_compute[226886]: 2026-01-20 15:03:43.943 226890 DEBUG oslo_concurrency.processutils [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a0ce16c6-2b75-472f-a785-890fbb0d748e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgu_fsrha" returned: 0 in 0.149s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:03:43 np0005588920 nova_compute[226886]: 2026-01-20 15:03:43.969 226890 DEBUG nova.storage.rbd_utils [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] rbd image a0ce16c6-2b75-472f-a785-890fbb0d748e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:03:43 np0005588920 nova_compute[226886]: 2026-01-20 15:03:43.972 226890 DEBUG oslo_concurrency.processutils [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/a0ce16c6-2b75-472f-a785-890fbb0d748e/disk.config a0ce16c6-2b75-472f-a785-890fbb0d748e_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:03:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:44.046 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '49'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:03:44 np0005588920 nova_compute[226886]: 2026-01-20 15:03:44.134 226890 DEBUG oslo_concurrency.processutils [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/a0ce16c6-2b75-472f-a785-890fbb0d748e/disk.config a0ce16c6-2b75-472f-a785-890fbb0d748e_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.162s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:03:44 np0005588920 nova_compute[226886]: 2026-01-20 15:03:44.135 226890 INFO nova.virt.libvirt.driver [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Deleting local config drive /var/lib/nova/instances/a0ce16c6-2b75-472f-a785-890fbb0d748e/disk.config because it was imported into RBD.#033[00m
Jan 20 10:03:44 np0005588920 kernel: tap7b2aa669-8f: entered promiscuous mode
Jan 20 10:03:44 np0005588920 NetworkManager[49076]: <info>  [1768921424.1796] manager: (tap7b2aa669-8f): new Tun device (/org/freedesktop/NetworkManager/Devices/351)
Jan 20 10:03:44 np0005588920 ovn_controller[133971]: 2026-01-20T15:03:44Z|00720|binding|INFO|Claiming lport 7b2aa669-8f25-4d67-b56d-f9a96e1774a4 for this chassis.
Jan 20 10:03:44 np0005588920 ovn_controller[133971]: 2026-01-20T15:03:44Z|00721|binding|INFO|7b2aa669-8f25-4d67-b56d-f9a96e1774a4: Claiming fa:16:3e:f8:b3:b3 10.100.0.12
Jan 20 10:03:44 np0005588920 nova_compute[226886]: 2026-01-20 15:03:44.179 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:44 np0005588920 ovn_controller[133971]: 2026-01-20T15:03:44Z|00722|binding|INFO|Setting lport 7b2aa669-8f25-4d67-b56d-f9a96e1774a4 ovn-installed in OVS
Jan 20 10:03:44 np0005588920 nova_compute[226886]: 2026-01-20 15:03:44.197 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:44 np0005588920 nova_compute[226886]: 2026-01-20 15:03:44.201 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:44 np0005588920 systemd-udevd[284655]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:03:44 np0005588920 systemd-machined[196121]: New machine qemu-73-instance-0000009e.
Jan 20 10:03:44 np0005588920 NetworkManager[49076]: <info>  [1768921424.2209] device (tap7b2aa669-8f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:03:44 np0005588920 NetworkManager[49076]: <info>  [1768921424.2216] device (tap7b2aa669-8f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:03:44 np0005588920 systemd[1]: Started Virtual Machine qemu-73-instance-0000009e.
Jan 20 10:03:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:44.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:44 np0005588920 nova_compute[226886]: 2026-01-20 15:03:44.551 226890 DEBUG nova.network.neutron [req-e58a2dab-2a56-420e-8d0f-73192d050edd req-679bdbcb-1d22-48be-8eb7-e48f2e40b98a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Updated VIF entry in instance network info cache for port 7b2aa669-8f25-4d67-b56d-f9a96e1774a4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:03:44 np0005588920 nova_compute[226886]: 2026-01-20 15:03:44.552 226890 DEBUG nova.network.neutron [req-e58a2dab-2a56-420e-8d0f-73192d050edd req-679bdbcb-1d22-48be-8eb7-e48f2e40b98a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Updating instance_info_cache with network_info: [{"id": "7b2aa669-8f25-4d67-b56d-f9a96e1774a4", "address": "fa:16:3e:f8:b3:b3", "network": {"id": "0f434e83-45c8-454d-820b-af39b696a1d5", "bridge": "br-int", "label": "tempest-TestShelveInstance-1862824485-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0fc924d2df984301897e81920c5e192f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b2aa669-8f", "ovs_interfaceid": "7b2aa669-8f25-4d67-b56d-f9a96e1774a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:03:44 np0005588920 nova_compute[226886]: 2026-01-20 15:03:44.708 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921424.7075918, a0ce16c6-2b75-472f-a785-890fbb0d748e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:03:44 np0005588920 nova_compute[226886]: 2026-01-20 15:03:44.708 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] VM Started (Lifecycle Event)#033[00m
Jan 20 10:03:44 np0005588920 ovn_controller[133971]: 2026-01-20T15:03:44Z|00723|binding|INFO|Setting lport 7b2aa669-8f25-4d67-b56d-f9a96e1774a4 up in Southbound
Jan 20 10:03:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:44.940 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:b3:b3 10.100.0.12'], port_security=['fa:16:3e:f8:b3:b3 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'a0ce16c6-2b75-472f-a785-890fbb0d748e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0f434e83-45c8-454d-820b-af39b696a1d5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0fc924d2df984301897e81920c5e192f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '73ae63f6-3a5a-4604-9d46-53d9b9e08225', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cf0669b1-9b02-4bfa-859e-dac906b93fdc, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=7b2aa669-8f25-4d67-b56d-f9a96e1774a4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:03:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:44.941 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 7b2aa669-8f25-4d67-b56d-f9a96e1774a4 in datapath 0f434e83-45c8-454d-820b-af39b696a1d5 bound to our chassis#033[00m
Jan 20 10:03:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:44.943 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0f434e83-45c8-454d-820b-af39b696a1d5#033[00m
Jan 20 10:03:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:44.955 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5d00fd62-9e9e-4b6c-9c74-37180c145ba0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:44.955 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0f434e83-41 in ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:03:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:44.959 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0f434e83-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:03:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:44.959 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c08c16ad-3d2e-4f95-9140-11ae733fcea0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:44.960 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[020125fc-09b7-42a0-bce6-2a41fbf10ef3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:44.971 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[8f0b50cf-f5d8-42ed-b104-523e145a81b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:44.985 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[71bc27ec-6188-46c7-8af7-80994f93e39e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:45.013 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[30cd497d-87cd-4f67-a32c-21924bde323a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:45 np0005588920 NetworkManager[49076]: <info>  [1768921425.0196] manager: (tap0f434e83-40): new Veth device (/org/freedesktop/NetworkManager/Devices/352)
Jan 20 10:03:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:45.018 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[48c3a318-774f-4679-85a7-ee16bdcdda95]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:45.052 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[a2406489-cfe4-4493-a66e-2cdb36411bda]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:45.054 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[162d385d-062c-46f7-ba55-a1797bbd74d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:45 np0005588920 podman[284709]: 2026-01-20 15:03:45.061481537 +0000 UTC m=+0.052624463 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 20 10:03:45 np0005588920 NetworkManager[49076]: <info>  [1768921425.0784] device (tap0f434e83-40): carrier: link connected
Jan 20 10:03:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:45.083 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[ae85cde7-1568-412e-915e-f003b41a75e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:45.099 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[531807b1-dd08-4889-a57c-6112737a22d3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0f434e83-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2e:12:8d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 230], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 646676, 'reachable_time': 42877, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 284747, 'error': None, 'target': 'ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:45.114 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[3a27e8c8-6382-44da-ada5-bc3c0448bcfb]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2e:128d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 646676, 'tstamp': 646676}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 284748, 'error': None, 'target': 'ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:45.133 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b4e3284b-fdb2-4b50-9557-83295531f24c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0f434e83-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2e:12:8d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 230], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 646676, 'reachable_time': 42877, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 284749, 'error': None, 'target': 'ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:45.165 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c1a56384-2f05-4c64-87eb-07973c0b8528]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:45 np0005588920 nova_compute[226886]: 2026-01-20 15:03:45.170 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:03:45 np0005588920 nova_compute[226886]: 2026-01-20 15:03:45.175 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921424.7097409, a0ce16c6-2b75-472f-a785-890fbb0d748e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:03:45 np0005588920 nova_compute[226886]: 2026-01-20 15:03:45.175 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:03:45 np0005588920 nova_compute[226886]: 2026-01-20 15:03:45.182 226890 DEBUG oslo_concurrency.lockutils [None req-1643914c-01d5-4224-9d63-2f835fc3f317 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Acquiring lock "75368220-ff38-456b-a0e6-ae1c02625514" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:03:45 np0005588920 nova_compute[226886]: 2026-01-20 15:03:45.182 226890 DEBUG oslo_concurrency.lockutils [None req-1643914c-01d5-4224-9d63-2f835fc3f317 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "75368220-ff38-456b-a0e6-ae1c02625514" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:03:45 np0005588920 nova_compute[226886]: 2026-01-20 15:03:45.184 226890 DEBUG oslo_concurrency.lockutils [req-e58a2dab-2a56-420e-8d0f-73192d050edd req-679bdbcb-1d22-48be-8eb7-e48f2e40b98a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-a0ce16c6-2b75-472f-a785-890fbb0d748e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:03:45 np0005588920 nova_compute[226886]: 2026-01-20 15:03:45.208 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:03:45 np0005588920 nova_compute[226886]: 2026-01-20 15:03:45.209 226890 INFO nova.compute.manager [None req-1643914c-01d5-4224-9d63-2f835fc3f317 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Detaching volume d50f7be5-09c1-4898-894c-704176a797ac#033[00m
Jan 20 10:03:45 np0005588920 nova_compute[226886]: 2026-01-20 15:03:45.215 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:03:45 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e349 e349: 3 total, 3 up, 3 in
Jan 20 10:03:45 np0005588920 nova_compute[226886]: 2026-01-20 15:03:45.273 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:03:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:45.333 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[21971abf-5221-43e5-bef7-4610435d8bf2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:45.335 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0f434e83-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:03:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:45.335 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:03:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:45.335 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0f434e83-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:03:45 np0005588920 nova_compute[226886]: 2026-01-20 15:03:45.337 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:45 np0005588920 kernel: tap0f434e83-40: entered promiscuous mode
Jan 20 10:03:45 np0005588920 NetworkManager[49076]: <info>  [1768921425.3381] manager: (tap0f434e83-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/353)
Jan 20 10:03:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:45.340 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0f434e83-40, col_values=(('external_ids', {'iface-id': '6133323e-bf50-4bbd-bc0b-9ecf135d8cd5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:03:45 np0005588920 ovn_controller[133971]: 2026-01-20T15:03:45Z|00724|binding|INFO|Releasing lport 6133323e-bf50-4bbd-bc0b-9ecf135d8cd5 from this chassis (sb_readonly=0)
Jan 20 10:03:45 np0005588920 nova_compute[226886]: 2026-01-20 15:03:45.341 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:45 np0005588920 nova_compute[226886]: 2026-01-20 15:03:45.355 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:45.356 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0f434e83-45c8-454d-820b-af39b696a1d5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0f434e83-45c8-454d-820b-af39b696a1d5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:03:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:45.357 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2d71c2e6-ddf0-40c9-a22b-65733316ad8b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:45.358 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:03:45 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 10:03:45 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 10:03:45 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-0f434e83-45c8-454d-820b-af39b696a1d5
Jan 20 10:03:45 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 10:03:45 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 10:03:45 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 10:03:45 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/0f434e83-45c8-454d-820b-af39b696a1d5.pid.haproxy
Jan 20 10:03:45 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 10:03:45 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:03:45 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 10:03:45 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 10:03:45 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 10:03:45 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 10:03:45 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 10:03:45 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 10:03:45 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 10:03:45 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 10:03:45 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 10:03:45 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 10:03:45 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 10:03:45 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 10:03:45 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 10:03:45 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:03:45 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:03:45 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 10:03:45 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 10:03:45 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:03:45 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID 0f434e83-45c8-454d-820b-af39b696a1d5
Jan 20 10:03:45 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:03:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:45.359 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5', 'env', 'PROCESS_TAG=haproxy-0f434e83-45c8-454d-820b-af39b696a1d5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0f434e83-45c8-454d-820b-af39b696a1d5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:03:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:03:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:45.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:03:45 np0005588920 podman[284782]: 2026-01-20 15:03:45.725696539 +0000 UTC m=+0.045231712 container create bda9e9de862354a390c4dd623c69e028897d02b844889ca71617cabd3954a9d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:03:45 np0005588920 nova_compute[226886]: 2026-01-20 15:03:45.748 226890 DEBUG nova.compute.manager [req-08ba86ea-08f7-4c44-9765-80dfe4788f5c req-64e87159-3121-4dd6-964a-2d2a2f208650 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Received event network-vif-plugged-7b2aa669-8f25-4d67-b56d-f9a96e1774a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:03:45 np0005588920 nova_compute[226886]: 2026-01-20 15:03:45.749 226890 DEBUG oslo_concurrency.lockutils [req-08ba86ea-08f7-4c44-9765-80dfe4788f5c req-64e87159-3121-4dd6-964a-2d2a2f208650 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "a0ce16c6-2b75-472f-a785-890fbb0d748e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:03:45 np0005588920 nova_compute[226886]: 2026-01-20 15:03:45.749 226890 DEBUG oslo_concurrency.lockutils [req-08ba86ea-08f7-4c44-9765-80dfe4788f5c req-64e87159-3121-4dd6-964a-2d2a2f208650 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "a0ce16c6-2b75-472f-a785-890fbb0d748e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:03:45 np0005588920 nova_compute[226886]: 2026-01-20 15:03:45.749 226890 DEBUG oslo_concurrency.lockutils [req-08ba86ea-08f7-4c44-9765-80dfe4788f5c req-64e87159-3121-4dd6-964a-2d2a2f208650 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "a0ce16c6-2b75-472f-a785-890fbb0d748e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:03:45 np0005588920 nova_compute[226886]: 2026-01-20 15:03:45.749 226890 DEBUG nova.compute.manager [req-08ba86ea-08f7-4c44-9765-80dfe4788f5c req-64e87159-3121-4dd6-964a-2d2a2f208650 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Processing event network-vif-plugged-7b2aa669-8f25-4d67-b56d-f9a96e1774a4 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 10:03:45 np0005588920 nova_compute[226886]: 2026-01-20 15:03:45.750 226890 DEBUG nova.compute.manager [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:03:45 np0005588920 nova_compute[226886]: 2026-01-20 15:03:45.752 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921425.7524493, a0ce16c6-2b75-472f-a785-890fbb0d748e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:03:45 np0005588920 nova_compute[226886]: 2026-01-20 15:03:45.753 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:03:45 np0005588920 nova_compute[226886]: 2026-01-20 15:03:45.754 226890 DEBUG nova.virt.libvirt.driver [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:03:45 np0005588920 systemd[1]: Started libpod-conmon-bda9e9de862354a390c4dd623c69e028897d02b844889ca71617cabd3954a9d5.scope.
Jan 20 10:03:45 np0005588920 nova_compute[226886]: 2026-01-20 15:03:45.757 226890 INFO nova.virt.libvirt.driver [-] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Instance spawned successfully.#033[00m
Jan 20 10:03:45 np0005588920 nova_compute[226886]: 2026-01-20 15:03:45.757 226890 DEBUG nova.virt.libvirt.driver [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 10:03:45 np0005588920 nova_compute[226886]: 2026-01-20 15:03:45.786 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:03:45 np0005588920 systemd[1]: Started libcrun container.
Jan 20 10:03:45 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3030b5d9e15e87195925e49258495c7afaaccaffb05a0f967add90fb4cab3b51/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:03:45 np0005588920 podman[284782]: 2026-01-20 15:03:45.702059994 +0000 UTC m=+0.021595187 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:03:45 np0005588920 nova_compute[226886]: 2026-01-20 15:03:45.798 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:03:45 np0005588920 nova_compute[226886]: 2026-01-20 15:03:45.802 226890 DEBUG nova.virt.libvirt.driver [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:03:45 np0005588920 nova_compute[226886]: 2026-01-20 15:03:45.802 226890 DEBUG nova.virt.libvirt.driver [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:03:45 np0005588920 nova_compute[226886]: 2026-01-20 15:03:45.803 226890 DEBUG nova.virt.libvirt.driver [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:03:45 np0005588920 nova_compute[226886]: 2026-01-20 15:03:45.803 226890 DEBUG nova.virt.libvirt.driver [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:03:45 np0005588920 podman[284782]: 2026-01-20 15:03:45.804034295 +0000 UTC m=+0.123569498 container init bda9e9de862354a390c4dd623c69e028897d02b844889ca71617cabd3954a9d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 20 10:03:45 np0005588920 nova_compute[226886]: 2026-01-20 15:03:45.804 226890 DEBUG nova.virt.libvirt.driver [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:03:45 np0005588920 nova_compute[226886]: 2026-01-20 15:03:45.804 226890 DEBUG nova.virt.libvirt.driver [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:03:45 np0005588920 nova_compute[226886]: 2026-01-20 15:03:45.809 226890 INFO nova.virt.block_device [None req-1643914c-01d5-4224-9d63-2f835fc3f317 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Attempting to driver detach volume d50f7be5-09c1-4898-894c-704176a797ac from mountpoint /dev/vdb#033[00m
Jan 20 10:03:45 np0005588920 podman[284782]: 2026-01-20 15:03:45.810944893 +0000 UTC m=+0.130480066 container start bda9e9de862354a390c4dd623c69e028897d02b844889ca71617cabd3954a9d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 20 10:03:45 np0005588920 nova_compute[226886]: 2026-01-20 15:03:45.817 226890 DEBUG nova.virt.libvirt.driver [None req-1643914c-01d5-4224-9d63-2f835fc3f317 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Attempting to detach device vdb from instance 75368220-ff38-456b-a0e6-ae1c02625514 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 20 10:03:45 np0005588920 nova_compute[226886]: 2026-01-20 15:03:45.818 226890 DEBUG nova.virt.libvirt.guest [None req-1643914c-01d5-4224-9d63-2f835fc3f317 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] detach device xml: <disk type="network" device="disk">
Jan 20 10:03:45 np0005588920 nova_compute[226886]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 10:03:45 np0005588920 nova_compute[226886]:  <source protocol="rbd" name="volumes/volume-d50f7be5-09c1-4898-894c-704176a797ac">
Jan 20 10:03:45 np0005588920 nova_compute[226886]:    <host name="192.168.122.100" port="6789"/>
Jan 20 10:03:45 np0005588920 nova_compute[226886]:    <host name="192.168.122.102" port="6789"/>
Jan 20 10:03:45 np0005588920 nova_compute[226886]:    <host name="192.168.122.101" port="6789"/>
Jan 20 10:03:45 np0005588920 nova_compute[226886]:  </source>
Jan 20 10:03:45 np0005588920 nova_compute[226886]:  <target dev="vdb" bus="virtio"/>
Jan 20 10:03:45 np0005588920 nova_compute[226886]:  <serial>d50f7be5-09c1-4898-894c-704176a797ac</serial>
Jan 20 10:03:45 np0005588920 nova_compute[226886]:  <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 20 10:03:45 np0005588920 nova_compute[226886]: </disk>
Jan 20 10:03:45 np0005588920 nova_compute[226886]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 20 10:03:45 np0005588920 nova_compute[226886]: 2026-01-20 15:03:45.825 226890 INFO nova.virt.libvirt.driver [None req-1643914c-01d5-4224-9d63-2f835fc3f317 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Successfully detached device vdb from instance 75368220-ff38-456b-a0e6-ae1c02625514 from the persistent domain config.#033[00m
Jan 20 10:03:45 np0005588920 nova_compute[226886]: 2026-01-20 15:03:45.825 226890 DEBUG nova.virt.libvirt.driver [None req-1643914c-01d5-4224-9d63-2f835fc3f317 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance 75368220-ff38-456b-a0e6-ae1c02625514 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 20 10:03:45 np0005588920 nova_compute[226886]: 2026-01-20 15:03:45.825 226890 DEBUG nova.virt.libvirt.guest [None req-1643914c-01d5-4224-9d63-2f835fc3f317 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] detach device xml: <disk type="network" device="disk">
Jan 20 10:03:45 np0005588920 nova_compute[226886]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 10:03:45 np0005588920 nova_compute[226886]:  <source protocol="rbd" name="volumes/volume-d50f7be5-09c1-4898-894c-704176a797ac">
Jan 20 10:03:45 np0005588920 nova_compute[226886]:    <host name="192.168.122.100" port="6789"/>
Jan 20 10:03:45 np0005588920 nova_compute[226886]:    <host name="192.168.122.102" port="6789"/>
Jan 20 10:03:45 np0005588920 nova_compute[226886]:    <host name="192.168.122.101" port="6789"/>
Jan 20 10:03:45 np0005588920 nova_compute[226886]:  </source>
Jan 20 10:03:45 np0005588920 nova_compute[226886]:  <target dev="vdb" bus="virtio"/>
Jan 20 10:03:45 np0005588920 nova_compute[226886]:  <serial>d50f7be5-09c1-4898-894c-704176a797ac</serial>
Jan 20 10:03:45 np0005588920 nova_compute[226886]:  <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 20 10:03:45 np0005588920 nova_compute[226886]: </disk>
Jan 20 10:03:45 np0005588920 nova_compute[226886]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 20 10:03:45 np0005588920 neutron-haproxy-ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5[284799]: [NOTICE]   (284803) : New worker (284805) forked
Jan 20 10:03:45 np0005588920 neutron-haproxy-ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5[284799]: [NOTICE]   (284803) : Loading success.
Jan 20 10:03:45 np0005588920 nova_compute[226886]: 2026-01-20 15:03:45.839 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:03:45 np0005588920 nova_compute[226886]: 2026-01-20 15:03:45.881 226890 DEBUG nova.virt.libvirt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Received event <DeviceRemovedEvent: 1768921425.880807, 75368220-ff38-456b-a0e6-ae1c02625514 => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 20 10:03:45 np0005588920 nova_compute[226886]: 2026-01-20 15:03:45.884 226890 DEBUG nova.virt.libvirt.driver [None req-1643914c-01d5-4224-9d63-2f835fc3f317 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance 75368220-ff38-456b-a0e6-ae1c02625514 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 20 10:03:45 np0005588920 nova_compute[226886]: 2026-01-20 15:03:45.886 226890 INFO nova.virt.libvirt.driver [None req-1643914c-01d5-4224-9d63-2f835fc3f317 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Successfully detached device vdb from instance 75368220-ff38-456b-a0e6-ae1c02625514 from the live domain config.#033[00m
Jan 20 10:03:45 np0005588920 nova_compute[226886]: 2026-01-20 15:03:45.925 226890 INFO nova.compute.manager [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Took 17.40 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 10:03:45 np0005588920 nova_compute[226886]: 2026-01-20 15:03:45.926 226890 DEBUG nova.compute.manager [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:03:46 np0005588920 nova_compute[226886]: 2026-01-20 15:03:46.096 226890 INFO nova.compute.manager [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Took 19.15 seconds to build instance.#033[00m
Jan 20 10:03:46 np0005588920 nova_compute[226886]: 2026-01-20 15:03:46.118 226890 DEBUG oslo_concurrency.lockutils [None req-d6638d9a-7b4c-4d3e-83dc-9479b22a61be b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lock "a0ce16c6-2b75-472f-a785-890fbb0d748e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.388s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:03:46 np0005588920 nova_compute[226886]: 2026-01-20 15:03:46.195 226890 DEBUG nova.objects.instance [None req-1643914c-01d5-4224-9d63-2f835fc3f317 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lazy-loading 'flavor' on Instance uuid 75368220-ff38-456b-a0e6-ae1c02625514 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:03:46 np0005588920 nova_compute[226886]: 2026-01-20 15:03:46.250 226890 DEBUG oslo_concurrency.lockutils [None req-1643914c-01d5-4224-9d63-2f835fc3f317 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "75368220-ff38-456b-a0e6-ae1c02625514" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 1.068s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:03:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:46.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:46 np0005588920 nova_compute[226886]: 2026-01-20 15:03:46.491 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:47 np0005588920 nova_compute[226886]: 2026-01-20 15:03:47.223 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:47.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e349 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:03:47 np0005588920 nova_compute[226886]: 2026-01-20 15:03:47.770 226890 DEBUG oslo_concurrency.lockutils [None req-af56f533-f455-4548-bb4d-b1396b2d824b 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Acquiring lock "75368220-ff38-456b-a0e6-ae1c02625514" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:03:47 np0005588920 nova_compute[226886]: 2026-01-20 15:03:47.770 226890 DEBUG oslo_concurrency.lockutils [None req-af56f533-f455-4548-bb4d-b1396b2d824b 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "75368220-ff38-456b-a0e6-ae1c02625514" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:03:47 np0005588920 nova_compute[226886]: 2026-01-20 15:03:47.771 226890 DEBUG nova.compute.manager [None req-af56f533-f455-4548-bb4d-b1396b2d824b 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:03:47 np0005588920 nova_compute[226886]: 2026-01-20 15:03:47.774 226890 DEBUG nova.compute.manager [None req-af56f533-f455-4548-bb4d-b1396b2d824b 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m
Jan 20 10:03:47 np0005588920 nova_compute[226886]: 2026-01-20 15:03:47.774 226890 DEBUG nova.objects.instance [None req-af56f533-f455-4548-bb4d-b1396b2d824b 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lazy-loading 'flavor' on Instance uuid 75368220-ff38-456b-a0e6-ae1c02625514 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:03:47 np0005588920 nova_compute[226886]: 2026-01-20 15:03:47.839 226890 DEBUG nova.virt.libvirt.driver [None req-af56f533-f455-4548-bb4d-b1396b2d824b 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 20 10:03:47 np0005588920 nova_compute[226886]: 2026-01-20 15:03:47.896 226890 DEBUG nova.compute.manager [req-d7f77a62-b6d6-4677-b8ae-2a7b36d7381f req-d92f3dc4-164a-4786-a7b3-59b0194de4bd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Received event network-vif-plugged-7b2aa669-8f25-4d67-b56d-f9a96e1774a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:03:47 np0005588920 nova_compute[226886]: 2026-01-20 15:03:47.897 226890 DEBUG oslo_concurrency.lockutils [req-d7f77a62-b6d6-4677-b8ae-2a7b36d7381f req-d92f3dc4-164a-4786-a7b3-59b0194de4bd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "a0ce16c6-2b75-472f-a785-890fbb0d748e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:03:47 np0005588920 nova_compute[226886]: 2026-01-20 15:03:47.897 226890 DEBUG oslo_concurrency.lockutils [req-d7f77a62-b6d6-4677-b8ae-2a7b36d7381f req-d92f3dc4-164a-4786-a7b3-59b0194de4bd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "a0ce16c6-2b75-472f-a785-890fbb0d748e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:03:47 np0005588920 nova_compute[226886]: 2026-01-20 15:03:47.897 226890 DEBUG oslo_concurrency.lockutils [req-d7f77a62-b6d6-4677-b8ae-2a7b36d7381f req-d92f3dc4-164a-4786-a7b3-59b0194de4bd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "a0ce16c6-2b75-472f-a785-890fbb0d748e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:03:47 np0005588920 nova_compute[226886]: 2026-01-20 15:03:47.898 226890 DEBUG nova.compute.manager [req-d7f77a62-b6d6-4677-b8ae-2a7b36d7381f req-d92f3dc4-164a-4786-a7b3-59b0194de4bd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] No waiting events found dispatching network-vif-plugged-7b2aa669-8f25-4d67-b56d-f9a96e1774a4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:03:47 np0005588920 nova_compute[226886]: 2026-01-20 15:03:47.898 226890 WARNING nova.compute.manager [req-d7f77a62-b6d6-4677-b8ae-2a7b36d7381f req-d92f3dc4-164a-4786-a7b3-59b0194de4bd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Received unexpected event network-vif-plugged-7b2aa669-8f25-4d67-b56d-f9a96e1774a4 for instance with vm_state active and task_state None.#033[00m
Jan 20 10:03:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:48.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:48 np0005588920 nova_compute[226886]: 2026-01-20 15:03:48.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:03:48 np0005588920 nova_compute[226886]: 2026-01-20 15:03:48.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:03:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:49.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:50 np0005588920 kernel: tap27ba7c79-86 (unregistering): left promiscuous mode
Jan 20 10:03:50 np0005588920 NetworkManager[49076]: <info>  [1768921430.1829] device (tap27ba7c79-86): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:03:50 np0005588920 nova_compute[226886]: 2026-01-20 15:03:50.189 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:50 np0005588920 ovn_controller[133971]: 2026-01-20T15:03:50Z|00725|binding|INFO|Releasing lport 27ba7c79-863a-4084-a5df-ee7a70ec6e0d from this chassis (sb_readonly=0)
Jan 20 10:03:50 np0005588920 ovn_controller[133971]: 2026-01-20T15:03:50Z|00726|binding|INFO|Setting lport 27ba7c79-863a-4084-a5df-ee7a70ec6e0d down in Southbound
Jan 20 10:03:50 np0005588920 ovn_controller[133971]: 2026-01-20T15:03:50Z|00727|binding|INFO|Removing iface tap27ba7c79-86 ovn-installed in OVS
Jan 20 10:03:50 np0005588920 nova_compute[226886]: 2026-01-20 15:03:50.206 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:50 np0005588920 systemd[1]: machine-qemu\x2d72\x2dinstance\x2d0000009b.scope: Deactivated successfully.
Jan 20 10:03:50 np0005588920 systemd[1]: machine-qemu\x2d72\x2dinstance\x2d0000009b.scope: Consumed 14.796s CPU time.
Jan 20 10:03:50 np0005588920 systemd-machined[196121]: Machine qemu-72-instance-0000009b terminated.
Jan 20 10:03:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:50.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:50.406 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3d:87:66 10.100.0.5'], port_security=['fa:16:3e:3d:87:66 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '75368220-ff38-456b-a0e6-ae1c02625514', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-89fdd65f-3dd2-4375-a946-3c5de73cc24a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '96f7b14c2a9348f08305fe232df2a603', 'neutron:revision_number': '6', 'neutron:security_group_ids': '9aa52617-8217-40d2-b2b6-31674dd65078', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.229', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=07b54ff9-b8ec-4b9d-ab83-0d9fa6361dd1, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=27ba7c79-863a-4084-a5df-ee7a70ec6e0d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:03:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:50.409 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 27ba7c79-863a-4084-a5df-ee7a70ec6e0d in datapath 89fdd65f-3dd2-4375-a946-3c5de73cc24a unbound from our chassis#033[00m
Jan 20 10:03:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:50.411 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 89fdd65f-3dd2-4375-a946-3c5de73cc24a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:03:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:50.412 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e6676137-ea10-41fe-98fc-52100e2073d9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:50.413 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a namespace which is not needed anymore#033[00m
Jan 20 10:03:50 np0005588920 nova_compute[226886]: 2026-01-20 15:03:50.421 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:50 np0005588920 nova_compute[226886]: 2026-01-20 15:03:50.427 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:50 np0005588920 nova_compute[226886]: 2026-01-20 15:03:50.444 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:03:50 np0005588920 nova_compute[226886]: 2026-01-20 15:03:50.444 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:03:50 np0005588920 neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a[283829]: [NOTICE]   (283833) : haproxy version is 2.8.14-c23fe91
Jan 20 10:03:50 np0005588920 neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a[283829]: [NOTICE]   (283833) : path to executable is /usr/sbin/haproxy
Jan 20 10:03:50 np0005588920 neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a[283829]: [WARNING]  (283833) : Exiting Master process...
Jan 20 10:03:50 np0005588920 neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a[283829]: [ALERT]    (283833) : Current worker (283835) exited with code 143 (Terminated)
Jan 20 10:03:50 np0005588920 neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a[283829]: [WARNING]  (283833) : All workers exited. Exiting... (0)
Jan 20 10:03:50 np0005588920 systemd[1]: libpod-7aa36b0b9ea3938022efe5bac49ee7a5bb77df16b26d948ad3873a9c3d6dcabc.scope: Deactivated successfully.
Jan 20 10:03:50 np0005588920 podman[284849]: 2026-01-20 15:03:50.552984455 +0000 UTC m=+0.050755210 container died 7aa36b0b9ea3938022efe5bac49ee7a5bb77df16b26d948ad3873a9c3d6dcabc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 20 10:03:50 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7aa36b0b9ea3938022efe5bac49ee7a5bb77df16b26d948ad3873a9c3d6dcabc-userdata-shm.mount: Deactivated successfully.
Jan 20 10:03:50 np0005588920 systemd[1]: var-lib-containers-storage-overlay-7ddbd2a1692e68c4f9d445f60b5eb8fdf5eb24aa122bc4b76b42eafde183bea0-merged.mount: Deactivated successfully.
Jan 20 10:03:50 np0005588920 podman[284849]: 2026-01-20 15:03:50.596423755 +0000 UTC m=+0.094194510 container cleanup 7aa36b0b9ea3938022efe5bac49ee7a5bb77df16b26d948ad3873a9c3d6dcabc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 20 10:03:50 np0005588920 systemd[1]: libpod-conmon-7aa36b0b9ea3938022efe5bac49ee7a5bb77df16b26d948ad3873a9c3d6dcabc.scope: Deactivated successfully.
Jan 20 10:03:50 np0005588920 podman[284880]: 2026-01-20 15:03:50.663356796 +0000 UTC m=+0.040272561 container remove 7aa36b0b9ea3938022efe5bac49ee7a5bb77df16b26d948ad3873a9c3d6dcabc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 20 10:03:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:50.669 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[70e18f11-da6f-44f2-b11d-0b7cd6317915]: (4, ('Tue Jan 20 03:03:50 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a (7aa36b0b9ea3938022efe5bac49ee7a5bb77df16b26d948ad3873a9c3d6dcabc)\n7aa36b0b9ea3938022efe5bac49ee7a5bb77df16b26d948ad3873a9c3d6dcabc\nTue Jan 20 03:03:50 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a (7aa36b0b9ea3938022efe5bac49ee7a5bb77df16b26d948ad3873a9c3d6dcabc)\n7aa36b0b9ea3938022efe5bac49ee7a5bb77df16b26d948ad3873a9c3d6dcabc\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:50.670 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[00e961e3-391a-430c-9e6c-a543e98d5309]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:50.671 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap89fdd65f-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:03:50 np0005588920 nova_compute[226886]: 2026-01-20 15:03:50.672 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:50 np0005588920 kernel: tap89fdd65f-30: left promiscuous mode
Jan 20 10:03:50 np0005588920 nova_compute[226886]: 2026-01-20 15:03:50.690 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:50.695 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[48efac86-3842-47a8-90be-b13a0fc02d83]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:50.712 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d96297bf-11c8-4092-94bd-3332cbe08f23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:50.713 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a0aa3c04-9e4f-4280-b042-4634baf859a7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:50.727 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[bc61dea8-21b0-41cf-a44e-a97de7438074]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 642972, 'reachable_time': 38637, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 284899, 'error': None, 'target': 'ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:50 np0005588920 systemd[1]: run-netns-ovnmeta\x2d89fdd65f\x2d3dd2\x2d4375\x2da946\x2d3c5de73cc24a.mount: Deactivated successfully.
Jan 20 10:03:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:50.729 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:03:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:03:50.729 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[179ac307-bd62-4a2f-aa46-8ce70e057391]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:50 np0005588920 nova_compute[226886]: 2026-01-20 15:03:50.857 226890 INFO nova.virt.libvirt.driver [None req-af56f533-f455-4548-bb4d-b1396b2d824b 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Instance shutdown successfully after 3 seconds.#033[00m
Jan 20 10:03:50 np0005588920 nova_compute[226886]: 2026-01-20 15:03:50.864 226890 INFO nova.virt.libvirt.driver [-] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Instance destroyed successfully.#033[00m
Jan 20 10:03:50 np0005588920 nova_compute[226886]: 2026-01-20 15:03:50.865 226890 DEBUG nova.objects.instance [None req-af56f533-f455-4548-bb4d-b1396b2d824b 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lazy-loading 'numa_topology' on Instance uuid 75368220-ff38-456b-a0e6-ae1c02625514 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:03:50 np0005588920 nova_compute[226886]: 2026-01-20 15:03:50.894 226890 DEBUG nova.compute.manager [None req-af56f533-f455-4548-bb4d-b1396b2d824b 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:03:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:51.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:51 np0005588920 nova_compute[226886]: 2026-01-20 15:03:51.493 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:52 np0005588920 nova_compute[226886]: 2026-01-20 15:03:52.213 226890 DEBUG nova.compute.manager [req-e596002f-114f-4dec-b1b0-3f0827cb0fac req-ca7454a2-f617-409a-8685-bd43ae4e8a50 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Received event network-vif-unplugged-27ba7c79-863a-4084-a5df-ee7a70ec6e0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:03:52 np0005588920 nova_compute[226886]: 2026-01-20 15:03:52.214 226890 DEBUG oslo_concurrency.lockutils [req-e596002f-114f-4dec-b1b0-3f0827cb0fac req-ca7454a2-f617-409a-8685-bd43ae4e8a50 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "75368220-ff38-456b-a0e6-ae1c02625514-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:03:52 np0005588920 nova_compute[226886]: 2026-01-20 15:03:52.214 226890 DEBUG oslo_concurrency.lockutils [req-e596002f-114f-4dec-b1b0-3f0827cb0fac req-ca7454a2-f617-409a-8685-bd43ae4e8a50 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75368220-ff38-456b-a0e6-ae1c02625514-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:03:52 np0005588920 nova_compute[226886]: 2026-01-20 15:03:52.215 226890 DEBUG oslo_concurrency.lockutils [req-e596002f-114f-4dec-b1b0-3f0827cb0fac req-ca7454a2-f617-409a-8685-bd43ae4e8a50 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75368220-ff38-456b-a0e6-ae1c02625514-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:03:52 np0005588920 nova_compute[226886]: 2026-01-20 15:03:52.215 226890 DEBUG nova.compute.manager [req-e596002f-114f-4dec-b1b0-3f0827cb0fac req-ca7454a2-f617-409a-8685-bd43ae4e8a50 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] No waiting events found dispatching network-vif-unplugged-27ba7c79-863a-4084-a5df-ee7a70ec6e0d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:03:52 np0005588920 nova_compute[226886]: 2026-01-20 15:03:52.215 226890 WARNING nova.compute.manager [req-e596002f-114f-4dec-b1b0-3f0827cb0fac req-ca7454a2-f617-409a-8685-bd43ae4e8a50 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Received unexpected event network-vif-unplugged-27ba7c79-863a-4084-a5df-ee7a70ec6e0d for instance with vm_state active and task_state powering-off.#033[00m
Jan 20 10:03:52 np0005588920 nova_compute[226886]: 2026-01-20 15:03:52.226 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:52 np0005588920 nova_compute[226886]: 2026-01-20 15:03:52.336 226890 DEBUG oslo_concurrency.lockutils [None req-af56f533-f455-4548-bb4d-b1396b2d824b 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "75368220-ff38-456b-a0e6-ae1c02625514" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 4.566s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:03:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:03:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:52.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:03:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e349 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:03:52 np0005588920 nova_compute[226886]: 2026-01-20 15:03:52.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:03:52 np0005588920 nova_compute[226886]: 2026-01-20 15:03:52.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:03:52 np0005588920 nova_compute[226886]: 2026-01-20 15:03:52.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:03:52 np0005588920 nova_compute[226886]: 2026-01-20 15:03:52.764 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:03:52 np0005588920 nova_compute[226886]: 2026-01-20 15:03:52.765 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:03:52 np0005588920 nova_compute[226886]: 2026-01-20 15:03:52.765 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:03:52 np0005588920 nova_compute[226886]: 2026-01-20 15:03:52.765 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:03:52 np0005588920 nova_compute[226886]: 2026-01-20 15:03:52.766 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:03:53 np0005588920 nova_compute[226886]: 2026-01-20 15:03:53.296 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.530s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:03:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:03:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:53.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:03:53 np0005588920 nova_compute[226886]: 2026-01-20 15:03:53.428 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-0000009e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:03:53 np0005588920 nova_compute[226886]: 2026-01-20 15:03:53.428 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-0000009e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:03:53 np0005588920 nova_compute[226886]: 2026-01-20 15:03:53.434 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-0000009b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:03:53 np0005588920 nova_compute[226886]: 2026-01-20 15:03:53.434 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-0000009b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:03:53 np0005588920 nova_compute[226886]: 2026-01-20 15:03:53.459 226890 DEBUG nova.objects.instance [None req-99ab8cc3-bde3-4d5a-b748-f08dc6a95cb2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lazy-loading 'flavor' on Instance uuid 75368220-ff38-456b-a0e6-ae1c02625514 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:03:53 np0005588920 nova_compute[226886]: 2026-01-20 15:03:53.481 226890 DEBUG oslo_concurrency.lockutils [None req-99ab8cc3-bde3-4d5a-b748-f08dc6a95cb2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Acquiring lock "refresh_cache-75368220-ff38-456b-a0e6-ae1c02625514" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:03:53 np0005588920 nova_compute[226886]: 2026-01-20 15:03:53.481 226890 DEBUG oslo_concurrency.lockutils [None req-99ab8cc3-bde3-4d5a-b748-f08dc6a95cb2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Acquired lock "refresh_cache-75368220-ff38-456b-a0e6-ae1c02625514" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:03:53 np0005588920 nova_compute[226886]: 2026-01-20 15:03:53.482 226890 DEBUG nova.network.neutron [None req-99ab8cc3-bde3-4d5a-b748-f08dc6a95cb2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:03:53 np0005588920 nova_compute[226886]: 2026-01-20 15:03:53.482 226890 DEBUG nova.objects.instance [None req-99ab8cc3-bde3-4d5a-b748-f08dc6a95cb2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lazy-loading 'info_cache' on Instance uuid 75368220-ff38-456b-a0e6-ae1c02625514 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:03:53 np0005588920 nova_compute[226886]: 2026-01-20 15:03:53.633 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:03:53 np0005588920 nova_compute[226886]: 2026-01-20 15:03:53.635 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4059MB free_disk=20.830543518066406GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:03:53 np0005588920 nova_compute[226886]: 2026-01-20 15:03:53.635 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:03:53 np0005588920 nova_compute[226886]: 2026-01-20 15:03:53.635 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:03:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:03:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:54.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:03:55 np0005588920 nova_compute[226886]: 2026-01-20 15:03:55.311 226890 DEBUG nova.compute.manager [req-d9770db7-f74d-4567-931d-950decf90e00 req-4e00c23a-6648-455a-bf9f-0c444b39cb7d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Received event network-vif-plugged-27ba7c79-863a-4084-a5df-ee7a70ec6e0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:03:55 np0005588920 nova_compute[226886]: 2026-01-20 15:03:55.311 226890 DEBUG oslo_concurrency.lockutils [req-d9770db7-f74d-4567-931d-950decf90e00 req-4e00c23a-6648-455a-bf9f-0c444b39cb7d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "75368220-ff38-456b-a0e6-ae1c02625514-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:03:55 np0005588920 nova_compute[226886]: 2026-01-20 15:03:55.311 226890 DEBUG oslo_concurrency.lockutils [req-d9770db7-f74d-4567-931d-950decf90e00 req-4e00c23a-6648-455a-bf9f-0c444b39cb7d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75368220-ff38-456b-a0e6-ae1c02625514-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:03:55 np0005588920 nova_compute[226886]: 2026-01-20 15:03:55.312 226890 DEBUG oslo_concurrency.lockutils [req-d9770db7-f74d-4567-931d-950decf90e00 req-4e00c23a-6648-455a-bf9f-0c444b39cb7d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75368220-ff38-456b-a0e6-ae1c02625514-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:03:55 np0005588920 nova_compute[226886]: 2026-01-20 15:03:55.312 226890 DEBUG nova.compute.manager [req-d9770db7-f74d-4567-931d-950decf90e00 req-4e00c23a-6648-455a-bf9f-0c444b39cb7d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] No waiting events found dispatching network-vif-plugged-27ba7c79-863a-4084-a5df-ee7a70ec6e0d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:03:55 np0005588920 nova_compute[226886]: 2026-01-20 15:03:55.312 226890 WARNING nova.compute.manager [req-d9770db7-f74d-4567-931d-950decf90e00 req-4e00c23a-6648-455a-bf9f-0c444b39cb7d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Received unexpected event network-vif-plugged-27ba7c79-863a-4084-a5df-ee7a70ec6e0d for instance with vm_state stopped and task_state powering-on.#033[00m
Jan 20 10:03:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:55.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:56.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:56 np0005588920 nova_compute[226886]: 2026-01-20 15:03:56.497 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:56 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #109. Immutable memtables: 0.
Jan 20 10:03:56 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:03:56.520805) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:03:56 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 67] Flushing memtable with next log file: 109
Jan 20 10:03:56 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921436520855, "job": 67, "event": "flush_started", "num_memtables": 1, "num_entries": 2502, "num_deletes": 257, "total_data_size": 5667805, "memory_usage": 5731184, "flush_reason": "Manual Compaction"}
Jan 20 10:03:56 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 67] Level-0 flush table #110: started
Jan 20 10:03:56 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921436556400, "cf_name": "default", "job": 67, "event": "table_file_creation", "file_number": 110, "file_size": 3702117, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 55298, "largest_seqno": 57794, "table_properties": {"data_size": 3691912, "index_size": 6507, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2629, "raw_key_size": 21936, "raw_average_key_size": 21, "raw_value_size": 3671190, "raw_average_value_size": 3519, "num_data_blocks": 281, "num_entries": 1043, "num_filter_entries": 1043, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768921252, "oldest_key_time": 1768921252, "file_creation_time": 1768921436, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 110, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:03:56 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 67] Flush lasted 35635 microseconds, and 7513 cpu microseconds.
Jan 20 10:03:56 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:03:56 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:03:56.556438) [db/flush_job.cc:967] [default] [JOB 67] Level-0 flush table #110: 3702117 bytes OK
Jan 20 10:03:56 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:03:56.556454) [db/memtable_list.cc:519] [default] Level-0 commit table #110 started
Jan 20 10:03:56 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:03:56.558336) [db/memtable_list.cc:722] [default] Level-0 commit table #110: memtable #1 done
Jan 20 10:03:56 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:03:56.558349) EVENT_LOG_v1 {"time_micros": 1768921436558345, "job": 67, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:03:56 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:03:56.558364) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:03:56 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 67] Try to delete WAL files size 5656670, prev total WAL file size 5656670, number of live WAL files 2.
Jan 20 10:03:56 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000106.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:03:56 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:03:56.559743) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730034373639' seq:72057594037927935, type:22 .. '7061786F730035303231' seq:0, type:0; will stop at (end)
Jan 20 10:03:56 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 68] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:03:56 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 67 Base level 0, inputs: [110(3615KB)], [108(11MB)]
Jan 20 10:03:56 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921436559837, "job": 68, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [110], "files_L6": [108], "score": -1, "input_data_size": 15695308, "oldest_snapshot_seqno": -1}
Jan 20 10:03:56 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 68] Generated table #111: 8530 keys, 13824863 bytes, temperature: kUnknown
Jan 20 10:03:56 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921436733847, "cf_name": "default", "job": 68, "event": "table_file_creation", "file_number": 111, "file_size": 13824863, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13765579, "index_size": 36836, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 21381, "raw_key_size": 219952, "raw_average_key_size": 25, "raw_value_size": 13611450, "raw_average_value_size": 1595, "num_data_blocks": 1451, "num_entries": 8530, "num_filter_entries": 8530, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768921436, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 111, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:03:56 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:03:56 np0005588920 nova_compute[226886]: 2026-01-20 15:03:56.736 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance 75368220-ff38-456b-a0e6-ae1c02625514 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:03:56 np0005588920 nova_compute[226886]: 2026-01-20 15:03:56.737 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance a0ce16c6-2b75-472f-a785-890fbb0d748e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:03:56 np0005588920 nova_compute[226886]: 2026-01-20 15:03:56.737 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:03:56 np0005588920 nova_compute[226886]: 2026-01-20 15:03:56.738 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:03:56 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:03:56.734087) [db/compaction/compaction_job.cc:1663] [default] [JOB 68] Compacted 1@0 + 1@6 files to L6 => 13824863 bytes
Jan 20 10:03:56 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:03:56.787121) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 90.2 rd, 79.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.5, 11.4 +0.0 blob) out(13.2 +0.0 blob), read-write-amplify(8.0) write-amplify(3.7) OK, records in: 9062, records dropped: 532 output_compression: NoCompression
Jan 20 10:03:56 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:03:56.787166) EVENT_LOG_v1 {"time_micros": 1768921436787150, "job": 68, "event": "compaction_finished", "compaction_time_micros": 174078, "compaction_time_cpu_micros": 33930, "output_level": 6, "num_output_files": 1, "total_output_size": 13824863, "num_input_records": 9062, "num_output_records": 8530, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:03:56 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000110.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:03:56 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921436788072, "job": 68, "event": "table_file_deletion", "file_number": 110}
Jan 20 10:03:56 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000108.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:03:56 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921436790364, "job": 68, "event": "table_file_deletion", "file_number": 108}
Jan 20 10:03:56 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:03:56.559451) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:03:56 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:03:56.790413) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:03:56 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:03:56.790422) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:03:56 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:03:56.790425) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:03:56 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:03:56.790426) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:03:56 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:03:56.790428) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:03:56 np0005588920 nova_compute[226886]: 2026-01-20 15:03:56.980 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Refreshing inventories for resource provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 20 10:03:57 np0005588920 nova_compute[226886]: 2026-01-20 15:03:57.012 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Updating ProviderTree inventory for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 20 10:03:57 np0005588920 nova_compute[226886]: 2026-01-20 15:03:57.013 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Updating inventory in ProviderTree for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 20 10:03:57 np0005588920 nova_compute[226886]: 2026-01-20 15:03:57.122 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Refreshing aggregate associations for resource provider ff38e91c-3320-4831-90ac-bcffc89ba7b6, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 20 10:03:57 np0005588920 nova_compute[226886]: 2026-01-20 15:03:57.152 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Refreshing trait associations for resource provider ff38e91c-3320-4831-90ac-bcffc89ba7b6, traits: COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_DEVICE_TAGGING,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 20 10:03:57 np0005588920 nova_compute[226886]: 2026-01-20 15:03:57.228 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:57 np0005588920 nova_compute[226886]: 2026-01-20 15:03:57.308 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:03:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:57.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e349 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:03:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:03:57 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/571537298' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:03:57 np0005588920 nova_compute[226886]: 2026-01-20 15:03:57.793 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.485s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:03:57 np0005588920 nova_compute[226886]: 2026-01-20 15:03:57.799 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:03:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:58.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:58 np0005588920 nova_compute[226886]: 2026-01-20 15:03:58.535 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:03:58 np0005588920 nova_compute[226886]: 2026-01-20 15:03:58.632 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:03:58 np0005588920 nova_compute[226886]: 2026-01-20 15:03:58.633 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.998s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:03:58 np0005588920 nova_compute[226886]: 2026-01-20 15:03:58.965 226890 DEBUG nova.compute.manager [req-7848d1e0-5eeb-46c1-9a65-04ad322b6995 req-61e692e9-a61d-4e7a-b9c7-fe05d89edf32 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Received event network-changed-7b2aa669-8f25-4d67-b56d-f9a96e1774a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:03:58 np0005588920 nova_compute[226886]: 2026-01-20 15:03:58.965 226890 DEBUG nova.compute.manager [req-7848d1e0-5eeb-46c1-9a65-04ad322b6995 req-61e692e9-a61d-4e7a-b9c7-fe05d89edf32 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Refreshing instance network info cache due to event network-changed-7b2aa669-8f25-4d67-b56d-f9a96e1774a4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:03:58 np0005588920 nova_compute[226886]: 2026-01-20 15:03:58.966 226890 DEBUG oslo_concurrency.lockutils [req-7848d1e0-5eeb-46c1-9a65-04ad322b6995 req-61e692e9-a61d-4e7a-b9c7-fe05d89edf32 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-a0ce16c6-2b75-472f-a785-890fbb0d748e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:03:58 np0005588920 nova_compute[226886]: 2026-01-20 15:03:58.966 226890 DEBUG oslo_concurrency.lockutils [req-7848d1e0-5eeb-46c1-9a65-04ad322b6995 req-61e692e9-a61d-4e7a-b9c7-fe05d89edf32 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-a0ce16c6-2b75-472f-a785-890fbb0d748e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:03:58 np0005588920 nova_compute[226886]: 2026-01-20 15:03:58.966 226890 DEBUG nova.network.neutron [req-7848d1e0-5eeb-46c1-9a65-04ad322b6995 req-61e692e9-a61d-4e7a-b9c7-fe05d89edf32 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Refreshing network info cache for port 7b2aa669-8f25-4d67-b56d-f9a96e1774a4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:03:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:03:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:03:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:59.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:03:59 np0005588920 nova_compute[226886]: 2026-01-20 15:03:59.634 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:03:59 np0005588920 nova_compute[226886]: 2026-01-20 15:03:59.634 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:03:59 np0005588920 nova_compute[226886]: 2026-01-20 15:03:59.634 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:03:59 np0005588920 nova_compute[226886]: 2026-01-20 15:03:59.634 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:03:59 np0005588920 nova_compute[226886]: 2026-01-20 15:03:59.635 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:03:59 np0005588920 nova_compute[226886]: 2026-01-20 15:03:59.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:03:59 np0005588920 nova_compute[226886]: 2026-01-20 15:03:59.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 20 10:03:59 np0005588920 nova_compute[226886]: 2026-01-20 15:03:59.745 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 20 10:04:00 np0005588920 ovn_controller[133971]: 2026-01-20T15:04:00Z|00098|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f8:b3:b3 10.100.0.12
Jan 20 10:04:00 np0005588920 ovn_controller[133971]: 2026-01-20T15:04:00Z|00099|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f8:b3:b3 10.100.0.12
Jan 20 10:04:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:00.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:00 np0005588920 nova_compute[226886]: 2026-01-20 15:04:00.936 226890 DEBUG nova.network.neutron [None req-99ab8cc3-bde3-4d5a-b748-f08dc6a95cb2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Updating instance_info_cache with network_info: [{"id": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "address": "fa:16:3e:3d:87:66", "network": {"id": "89fdd65f-3dd2-4375-a946-3c5de73cc24a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1916368729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96f7b14c2a9348f08305fe232df2a603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27ba7c79-86", "ovs_interfaceid": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:04:01 np0005588920 nova_compute[226886]: 2026-01-20 15:04:01.010 226890 DEBUG oslo_concurrency.lockutils [None req-99ab8cc3-bde3-4d5a-b748-f08dc6a95cb2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Releasing lock "refresh_cache-75368220-ff38-456b-a0e6-ae1c02625514" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:04:01 np0005588920 nova_compute[226886]: 2026-01-20 15:04:01.047 226890 INFO nova.virt.libvirt.driver [-] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Instance destroyed successfully.#033[00m
Jan 20 10:04:01 np0005588920 nova_compute[226886]: 2026-01-20 15:04:01.048 226890 DEBUG nova.objects.instance [None req-99ab8cc3-bde3-4d5a-b748-f08dc6a95cb2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lazy-loading 'numa_topology' on Instance uuid 75368220-ff38-456b-a0e6-ae1c02625514 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:04:01 np0005588920 nova_compute[226886]: 2026-01-20 15:04:01.120 226890 DEBUG nova.objects.instance [None req-99ab8cc3-bde3-4d5a-b748-f08dc6a95cb2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lazy-loading 'resources' on Instance uuid 75368220-ff38-456b-a0e6-ae1c02625514 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:04:01 np0005588920 nova_compute[226886]: 2026-01-20 15:04:01.135 226890 DEBUG nova.virt.libvirt.vif [None req-99ab8cc3-bde3-4d5a-b748-f08dc6a95cb2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:02:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-284183767',display_name='tempest-AttachVolumeTestJSON-server-284183767',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumetestjson-server-284183767',id=155,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFZMMa9Fn48cT13jyLNVxZBqG2NAPc4g1Znb9IEN8J7OPuEySAWtPNC9EMH4uWUG8OO1N+YGXE5zrWJgSxgzur/4qS1UEfGQON2+xLOpRFvKfmgmBUr46iCGe8EkNjED6w==',key_name='tempest-keypair-988338540',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:02:34Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='96f7b14c2a9348f08305fe232df2a603',ramdisk_id='',reservation_id='r-qucgebjt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeTestJSON-583320363',owner_user_name='tempest-AttachVolumeTestJSON-583320363-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:03:52Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='912329b1a6ad42bdb72e952c03983bdf',uuid=75368220-ff38-456b-a0e6-ae1c02625514,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "address": "fa:16:3e:3d:87:66", "network": {"id": "89fdd65f-3dd2-4375-a946-3c5de73cc24a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1916368729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96f7b14c2a9348f08305fe232df2a603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27ba7c79-86", "ovs_interfaceid": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:04:01 np0005588920 nova_compute[226886]: 2026-01-20 15:04:01.136 226890 DEBUG nova.network.os_vif_util [None req-99ab8cc3-bde3-4d5a-b748-f08dc6a95cb2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Converting VIF {"id": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "address": "fa:16:3e:3d:87:66", "network": {"id": "89fdd65f-3dd2-4375-a946-3c5de73cc24a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1916368729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96f7b14c2a9348f08305fe232df2a603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27ba7c79-86", "ovs_interfaceid": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:04:01 np0005588920 nova_compute[226886]: 2026-01-20 15:04:01.136 226890 DEBUG nova.network.os_vif_util [None req-99ab8cc3-bde3-4d5a-b748-f08dc6a95cb2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3d:87:66,bridge_name='br-int',has_traffic_filtering=True,id=27ba7c79-863a-4084-a5df-ee7a70ec6e0d,network=Network(89fdd65f-3dd2-4375-a946-3c5de73cc24a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27ba7c79-86') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:04:01 np0005588920 nova_compute[226886]: 2026-01-20 15:04:01.137 226890 DEBUG os_vif [None req-99ab8cc3-bde3-4d5a-b748-f08dc6a95cb2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3d:87:66,bridge_name='br-int',has_traffic_filtering=True,id=27ba7c79-863a-4084-a5df-ee7a70ec6e0d,network=Network(89fdd65f-3dd2-4375-a946-3c5de73cc24a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27ba7c79-86') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:04:01 np0005588920 nova_compute[226886]: 2026-01-20 15:04:01.139 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:01 np0005588920 nova_compute[226886]: 2026-01-20 15:04:01.139 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap27ba7c79-86, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:04:01 np0005588920 nova_compute[226886]: 2026-01-20 15:04:01.141 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:01 np0005588920 nova_compute[226886]: 2026-01-20 15:04:01.142 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:01 np0005588920 nova_compute[226886]: 2026-01-20 15:04:01.144 226890 INFO os_vif [None req-99ab8cc3-bde3-4d5a-b748-f08dc6a95cb2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3d:87:66,bridge_name='br-int',has_traffic_filtering=True,id=27ba7c79-863a-4084-a5df-ee7a70ec6e0d,network=Network(89fdd65f-3dd2-4375-a946-3c5de73cc24a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27ba7c79-86')#033[00m
Jan 20 10:04:01 np0005588920 nova_compute[226886]: 2026-01-20 15:04:01.150 226890 DEBUG nova.virt.libvirt.driver [None req-99ab8cc3-bde3-4d5a-b748-f08dc6a95cb2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Start _get_guest_xml network_info=[{"id": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "address": "fa:16:3e:3d:87:66", "network": {"id": "89fdd65f-3dd2-4375-a946-3c5de73cc24a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1916368729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96f7b14c2a9348f08305fe232df2a603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27ba7c79-86", "ovs_interfaceid": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:04:01 np0005588920 nova_compute[226886]: 2026-01-20 15:04:01.153 226890 WARNING nova.virt.libvirt.driver [None req-99ab8cc3-bde3-4d5a-b748-f08dc6a95cb2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:04:01 np0005588920 nova_compute[226886]: 2026-01-20 15:04:01.178 226890 DEBUG nova.virt.libvirt.host [None req-99ab8cc3-bde3-4d5a-b748-f08dc6a95cb2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:04:01 np0005588920 nova_compute[226886]: 2026-01-20 15:04:01.179 226890 DEBUG nova.virt.libvirt.host [None req-99ab8cc3-bde3-4d5a-b748-f08dc6a95cb2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:04:01 np0005588920 nova_compute[226886]: 2026-01-20 15:04:01.186 226890 DEBUG nova.virt.libvirt.host [None req-99ab8cc3-bde3-4d5a-b748-f08dc6a95cb2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:04:01 np0005588920 nova_compute[226886]: 2026-01-20 15:04:01.187 226890 DEBUG nova.virt.libvirt.host [None req-99ab8cc3-bde3-4d5a-b748-f08dc6a95cb2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:04:01 np0005588920 nova_compute[226886]: 2026-01-20 15:04:01.188 226890 DEBUG nova.virt.libvirt.driver [None req-99ab8cc3-bde3-4d5a-b748-f08dc6a95cb2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:04:01 np0005588920 nova_compute[226886]: 2026-01-20 15:04:01.188 226890 DEBUG nova.virt.hardware [None req-99ab8cc3-bde3-4d5a-b748-f08dc6a95cb2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:04:01 np0005588920 nova_compute[226886]: 2026-01-20 15:04:01.189 226890 DEBUG nova.virt.hardware [None req-99ab8cc3-bde3-4d5a-b748-f08dc6a95cb2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:04:01 np0005588920 nova_compute[226886]: 2026-01-20 15:04:01.189 226890 DEBUG nova.virt.hardware [None req-99ab8cc3-bde3-4d5a-b748-f08dc6a95cb2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:04:01 np0005588920 nova_compute[226886]: 2026-01-20 15:04:01.189 226890 DEBUG nova.virt.hardware [None req-99ab8cc3-bde3-4d5a-b748-f08dc6a95cb2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:04:01 np0005588920 nova_compute[226886]: 2026-01-20 15:04:01.189 226890 DEBUG nova.virt.hardware [None req-99ab8cc3-bde3-4d5a-b748-f08dc6a95cb2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:04:01 np0005588920 nova_compute[226886]: 2026-01-20 15:04:01.189 226890 DEBUG nova.virt.hardware [None req-99ab8cc3-bde3-4d5a-b748-f08dc6a95cb2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:04:01 np0005588920 nova_compute[226886]: 2026-01-20 15:04:01.190 226890 DEBUG nova.virt.hardware [None req-99ab8cc3-bde3-4d5a-b748-f08dc6a95cb2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:04:01 np0005588920 nova_compute[226886]: 2026-01-20 15:04:01.190 226890 DEBUG nova.virt.hardware [None req-99ab8cc3-bde3-4d5a-b748-f08dc6a95cb2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:04:01 np0005588920 nova_compute[226886]: 2026-01-20 15:04:01.190 226890 DEBUG nova.virt.hardware [None req-99ab8cc3-bde3-4d5a-b748-f08dc6a95cb2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:04:01 np0005588920 nova_compute[226886]: 2026-01-20 15:04:01.190 226890 DEBUG nova.virt.hardware [None req-99ab8cc3-bde3-4d5a-b748-f08dc6a95cb2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:04:01 np0005588920 nova_compute[226886]: 2026-01-20 15:04:01.191 226890 DEBUG nova.virt.hardware [None req-99ab8cc3-bde3-4d5a-b748-f08dc6a95cb2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:04:01 np0005588920 nova_compute[226886]: 2026-01-20 15:04:01.191 226890 DEBUG nova.objects.instance [None req-99ab8cc3-bde3-4d5a-b748-f08dc6a95cb2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 75368220-ff38-456b-a0e6-ae1c02625514 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:04:01 np0005588920 nova_compute[226886]: 2026-01-20 15:04:01.212 226890 DEBUG oslo_concurrency.processutils [None req-99ab8cc3-bde3-4d5a-b748-f08dc6a95cb2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:04:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:01.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:01 np0005588920 nova_compute[226886]: 2026-01-20 15:04:01.501 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:01 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:04:01 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3862250939' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:04:01 np0005588920 nova_compute[226886]: 2026-01-20 15:04:01.661 226890 DEBUG oslo_concurrency.processutils [None req-99ab8cc3-bde3-4d5a-b748-f08dc6a95cb2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:04:01 np0005588920 nova_compute[226886]: 2026-01-20 15:04:01.694 226890 DEBUG oslo_concurrency.processutils [None req-99ab8cc3-bde3-4d5a-b748-f08dc6a95cb2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:04:01 np0005588920 nova_compute[226886]: 2026-01-20 15:04:01.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:04:01 np0005588920 nova_compute[226886]: 2026-01-20 15:04:01.726 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 20 10:04:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:04:02 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/656038064' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:04:02 np0005588920 nova_compute[226886]: 2026-01-20 15:04:02.144 226890 DEBUG oslo_concurrency.processutils [None req-99ab8cc3-bde3-4d5a-b748-f08dc6a95cb2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:04:02 np0005588920 nova_compute[226886]: 2026-01-20 15:04:02.146 226890 DEBUG nova.virt.libvirt.vif [None req-99ab8cc3-bde3-4d5a-b748-f08dc6a95cb2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:02:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-284183767',display_name='tempest-AttachVolumeTestJSON-server-284183767',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumetestjson-server-284183767',id=155,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFZMMa9Fn48cT13jyLNVxZBqG2NAPc4g1Znb9IEN8J7OPuEySAWtPNC9EMH4uWUG8OO1N+YGXE5zrWJgSxgzur/4qS1UEfGQON2+xLOpRFvKfmgmBUr46iCGe8EkNjED6w==',key_name='tempest-keypair-988338540',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:02:34Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='96f7b14c2a9348f08305fe232df2a603',ramdisk_id='',reservation_id='r-qucgebjt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeTestJSON-583320363',owner_user_name='tempest-AttachVolumeTestJSON-583320363-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:03:52Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='912329b1a6ad42bdb72e952c03983bdf',uuid=75368220-ff38-456b-a0e6-ae1c02625514,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "address": "fa:16:3e:3d:87:66", "network": {"id": "89fdd65f-3dd2-4375-a946-3c5de73cc24a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1916368729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96f7b14c2a9348f08305fe232df2a603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27ba7c79-86", "ovs_interfaceid": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:04:02 np0005588920 nova_compute[226886]: 2026-01-20 15:04:02.147 226890 DEBUG nova.network.os_vif_util [None req-99ab8cc3-bde3-4d5a-b748-f08dc6a95cb2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Converting VIF {"id": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "address": "fa:16:3e:3d:87:66", "network": {"id": "89fdd65f-3dd2-4375-a946-3c5de73cc24a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1916368729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96f7b14c2a9348f08305fe232df2a603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27ba7c79-86", "ovs_interfaceid": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:04:02 np0005588920 nova_compute[226886]: 2026-01-20 15:04:02.148 226890 DEBUG nova.network.os_vif_util [None req-99ab8cc3-bde3-4d5a-b748-f08dc6a95cb2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3d:87:66,bridge_name='br-int',has_traffic_filtering=True,id=27ba7c79-863a-4084-a5df-ee7a70ec6e0d,network=Network(89fdd65f-3dd2-4375-a946-3c5de73cc24a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27ba7c79-86') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:04:02 np0005588920 nova_compute[226886]: 2026-01-20 15:04:02.150 226890 DEBUG nova.objects.instance [None req-99ab8cc3-bde3-4d5a-b748-f08dc6a95cb2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lazy-loading 'pci_devices' on Instance uuid 75368220-ff38-456b-a0e6-ae1c02625514 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:04:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:02.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e349 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:04:02 np0005588920 nova_compute[226886]: 2026-01-20 15:04:02.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:04:02 np0005588920 nova_compute[226886]: 2026-01-20 15:04:02.773 226890 DEBUG nova.virt.libvirt.driver [None req-99ab8cc3-bde3-4d5a-b748-f08dc6a95cb2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:04:02 np0005588920 nova_compute[226886]:  <uuid>75368220-ff38-456b-a0e6-ae1c02625514</uuid>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:  <name>instance-0000009b</name>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:04:02 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:      <nova:name>tempest-AttachVolumeTestJSON-server-284183767</nova:name>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 15:04:01</nova:creationTime>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 10:04:02 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:        <nova:user uuid="912329b1a6ad42bdb72e952c03983bdf">tempest-AttachVolumeTestJSON-583320363-project-member</nova:user>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:        <nova:project uuid="96f7b14c2a9348f08305fe232df2a603">tempest-AttachVolumeTestJSON-583320363</nova:project>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:        <nova:port uuid="27ba7c79-863a-4084-a5df-ee7a70ec6e0d">
Jan 20 10:04:02 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    <system>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:      <entry name="serial">75368220-ff38-456b-a0e6-ae1c02625514</entry>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:      <entry name="uuid">75368220-ff38-456b-a0e6-ae1c02625514</entry>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    </system>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:  <os>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:  </os>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:  <features>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:  </features>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:  </clock>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:  <devices>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 10:04:02 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/75368220-ff38-456b-a0e6-ae1c02625514_disk">
Jan 20 10:04:02 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:04:02 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 10:04:02 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/75368220-ff38-456b-a0e6-ae1c02625514_disk.config">
Jan 20 10:04:02 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:04:02 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 10:04:02 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:3d:87:66"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:      <target dev="tap27ba7c79-86"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    </interface>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 10:04:02 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/75368220-ff38-456b-a0e6-ae1c02625514/console.log" append="off"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    </serial>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    <video>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    </video>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    <input type="keyboard" bus="usb"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 10:04:02 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    </rng>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 10:04:02 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 10:04:02 np0005588920 nova_compute[226886]:  </devices>
Jan 20 10:04:02 np0005588920 nova_compute[226886]: </domain>
Jan 20 10:04:02 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:04:02 np0005588920 nova_compute[226886]: 2026-01-20 15:04:02.775 226890 DEBUG nova.virt.libvirt.driver [None req-99ab8cc3-bde3-4d5a-b748-f08dc6a95cb2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] skipping disk for instance-0000009b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:04:02 np0005588920 nova_compute[226886]: 2026-01-20 15:04:02.775 226890 DEBUG nova.virt.libvirt.driver [None req-99ab8cc3-bde3-4d5a-b748-f08dc6a95cb2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] skipping disk for instance-0000009b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:04:02 np0005588920 nova_compute[226886]: 2026-01-20 15:04:02.776 226890 DEBUG nova.virt.libvirt.vif [None req-99ab8cc3-bde3-4d5a-b748-f08dc6a95cb2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:02:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-284183767',display_name='tempest-AttachVolumeTestJSON-server-284183767',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumetestjson-server-284183767',id=155,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFZMMa9Fn48cT13jyLNVxZBqG2NAPc4g1Znb9IEN8J7OPuEySAWtPNC9EMH4uWUG8OO1N+YGXE5zrWJgSxgzur/4qS1UEfGQON2+xLOpRFvKfmgmBUr46iCGe8EkNjED6w==',key_name='tempest-keypair-988338540',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:02:34Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='96f7b14c2a9348f08305fe232df2a603',ramdisk_id='',reservation_id='r-qucgebjt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeTestJSON-583320363',owner_user_name='tempest-AttachVolumeTestJSON-583320363-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:03:52Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='912329b1a6ad42bdb72e952c03983bdf',uuid=75368220-ff38-456b-a0e6-ae1c02625514,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "address": "fa:16:3e:3d:87:66", "network": {"id": "89fdd65f-3dd2-4375-a946-3c5de73cc24a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1916368729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96f7b14c2a9348f08305fe232df2a603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27ba7c79-86", "ovs_interfaceid": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:04:02 np0005588920 nova_compute[226886]: 2026-01-20 15:04:02.777 226890 DEBUG nova.network.os_vif_util [None req-99ab8cc3-bde3-4d5a-b748-f08dc6a95cb2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Converting VIF {"id": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "address": "fa:16:3e:3d:87:66", "network": {"id": "89fdd65f-3dd2-4375-a946-3c5de73cc24a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1916368729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96f7b14c2a9348f08305fe232df2a603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27ba7c79-86", "ovs_interfaceid": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:04:02 np0005588920 nova_compute[226886]: 2026-01-20 15:04:02.778 226890 DEBUG nova.network.os_vif_util [None req-99ab8cc3-bde3-4d5a-b748-f08dc6a95cb2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3d:87:66,bridge_name='br-int',has_traffic_filtering=True,id=27ba7c79-863a-4084-a5df-ee7a70ec6e0d,network=Network(89fdd65f-3dd2-4375-a946-3c5de73cc24a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27ba7c79-86') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:04:02 np0005588920 nova_compute[226886]: 2026-01-20 15:04:02.778 226890 DEBUG os_vif [None req-99ab8cc3-bde3-4d5a-b748-f08dc6a95cb2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3d:87:66,bridge_name='br-int',has_traffic_filtering=True,id=27ba7c79-863a-4084-a5df-ee7a70ec6e0d,network=Network(89fdd65f-3dd2-4375-a946-3c5de73cc24a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27ba7c79-86') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:04:02 np0005588920 nova_compute[226886]: 2026-01-20 15:04:02.779 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:02 np0005588920 nova_compute[226886]: 2026-01-20 15:04:02.780 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:04:02 np0005588920 nova_compute[226886]: 2026-01-20 15:04:02.780 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:04:02 np0005588920 nova_compute[226886]: 2026-01-20 15:04:02.783 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:02 np0005588920 nova_compute[226886]: 2026-01-20 15:04:02.784 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap27ba7c79-86, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:04:02 np0005588920 nova_compute[226886]: 2026-01-20 15:04:02.785 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap27ba7c79-86, col_values=(('external_ids', {'iface-id': '27ba7c79-863a-4084-a5df-ee7a70ec6e0d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3d:87:66', 'vm-uuid': '75368220-ff38-456b-a0e6-ae1c02625514'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:04:02 np0005588920 NetworkManager[49076]: <info>  [1768921442.8219] manager: (tap27ba7c79-86): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/354)
Jan 20 10:04:02 np0005588920 nova_compute[226886]: 2026-01-20 15:04:02.821 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:02 np0005588920 nova_compute[226886]: 2026-01-20 15:04:02.824 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:04:02 np0005588920 nova_compute[226886]: 2026-01-20 15:04:02.825 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:02 np0005588920 nova_compute[226886]: 2026-01-20 15:04:02.827 226890 INFO os_vif [None req-99ab8cc3-bde3-4d5a-b748-f08dc6a95cb2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3d:87:66,bridge_name='br-int',has_traffic_filtering=True,id=27ba7c79-863a-4084-a5df-ee7a70ec6e0d,network=Network(89fdd65f-3dd2-4375-a946-3c5de73cc24a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27ba7c79-86')#033[00m
Jan 20 10:04:02 np0005588920 NetworkManager[49076]: <info>  [1768921442.9490] manager: (tap27ba7c79-86): new Tun device (/org/freedesktop/NetworkManager/Devices/355)
Jan 20 10:04:02 np0005588920 kernel: tap27ba7c79-86: entered promiscuous mode
Jan 20 10:04:02 np0005588920 ovn_controller[133971]: 2026-01-20T15:04:02Z|00728|binding|INFO|Claiming lport 27ba7c79-863a-4084-a5df-ee7a70ec6e0d for this chassis.
Jan 20 10:04:02 np0005588920 nova_compute[226886]: 2026-01-20 15:04:02.951 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:02 np0005588920 ovn_controller[133971]: 2026-01-20T15:04:02Z|00729|binding|INFO|27ba7c79-863a-4084-a5df-ee7a70ec6e0d: Claiming fa:16:3e:3d:87:66 10.100.0.5
Jan 20 10:04:02 np0005588920 ovn_controller[133971]: 2026-01-20T15:04:02Z|00730|binding|INFO|Setting lport 27ba7c79-863a-4084-a5df-ee7a70ec6e0d ovn-installed in OVS
Jan 20 10:04:02 np0005588920 nova_compute[226886]: 2026-01-20 15:04:02.968 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:02 np0005588920 nova_compute[226886]: 2026-01-20 15:04:02.971 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:02 np0005588920 systemd-udevd[285021]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:04:02 np0005588920 NetworkManager[49076]: <info>  [1768921442.9911] device (tap27ba7c79-86): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:04:02 np0005588920 NetworkManager[49076]: <info>  [1768921442.9919] device (tap27ba7c79-86): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:04:03 np0005588920 systemd-machined[196121]: New machine qemu-74-instance-0000009b.
Jan 20 10:04:03 np0005588920 systemd[1]: Started Virtual Machine qemu-74-instance-0000009b.
Jan 20 10:04:03 np0005588920 nova_compute[226886]: 2026-01-20 15:04:03.126 226890 DEBUG nova.network.neutron [req-7848d1e0-5eeb-46c1-9a65-04ad322b6995 req-61e692e9-a61d-4e7a-b9c7-fe05d89edf32 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Updated VIF entry in instance network info cache for port 7b2aa669-8f25-4d67-b56d-f9a96e1774a4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:04:03 np0005588920 nova_compute[226886]: 2026-01-20 15:04:03.127 226890 DEBUG nova.network.neutron [req-7848d1e0-5eeb-46c1-9a65-04ad322b6995 req-61e692e9-a61d-4e7a-b9c7-fe05d89edf32 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Updating instance_info_cache with network_info: [{"id": "7b2aa669-8f25-4d67-b56d-f9a96e1774a4", "address": "fa:16:3e:f8:b3:b3", "network": {"id": "0f434e83-45c8-454d-820b-af39b696a1d5", "bridge": "br-int", "label": "tempest-TestShelveInstance-1862824485-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0fc924d2df984301897e81920c5e192f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b2aa669-8f", "ovs_interfaceid": "7b2aa669-8f25-4d67-b56d-f9a96e1774a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:04:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:03.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:03 np0005588920 nova_compute[226886]: 2026-01-20 15:04:03.551 226890 DEBUG nova.virt.libvirt.host [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Removed pending event for 75368220-ff38-456b-a0e6-ae1c02625514 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 20 10:04:03 np0005588920 nova_compute[226886]: 2026-01-20 15:04:03.552 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921443.5514212, 75368220-ff38-456b-a0e6-ae1c02625514 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:04:03 np0005588920 nova_compute[226886]: 2026-01-20 15:04:03.552 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:04:03 np0005588920 nova_compute[226886]: 2026-01-20 15:04:03.554 226890 DEBUG nova.compute.manager [None req-99ab8cc3-bde3-4d5a-b748-f08dc6a95cb2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:04:03 np0005588920 nova_compute[226886]: 2026-01-20 15:04:03.560 226890 INFO nova.virt.libvirt.driver [-] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Instance rebooted successfully.#033[00m
Jan 20 10:04:03 np0005588920 nova_compute[226886]: 2026-01-20 15:04:03.561 226890 DEBUG nova.compute.manager [None req-99ab8cc3-bde3-4d5a-b748-f08dc6a95cb2 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:04.071 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3d:87:66 10.100.0.5'], port_security=['fa:16:3e:3d:87:66 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '75368220-ff38-456b-a0e6-ae1c02625514', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-89fdd65f-3dd2-4375-a946-3c5de73cc24a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '96f7b14c2a9348f08305fe232df2a603', 'neutron:revision_number': '7', 'neutron:security_group_ids': '9aa52617-8217-40d2-b2b6-31674dd65078', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.229'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=07b54ff9-b8ec-4b9d-ab83-0d9fa6361dd1, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=27ba7c79-863a-4084-a5df-ee7a70ec6e0d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:04:04 np0005588920 ovn_controller[133971]: 2026-01-20T15:04:04Z|00731|binding|INFO|Setting lport 27ba7c79-863a-4084-a5df-ee7a70ec6e0d up in Southbound
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:04.073 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 27ba7c79-863a-4084-a5df-ee7a70ec6e0d in datapath 89fdd65f-3dd2-4375-a946-3c5de73cc24a bound to our chassis#033[00m
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:04.075 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 89fdd65f-3dd2-4375-a946-3c5de73cc24a#033[00m
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:04.089 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f6466233-3a17-46b0-a741-b0c371fce94d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:04.090 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap89fdd65f-31 in ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:04.093 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap89fdd65f-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:04.093 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[9d1a78fc-de87-4d57-948c-70c04deb4593]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:04.094 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2084999d-87fb-4d9d-98d7-60eeb93ac75a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:04.106 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[302bd270-c014-4ed7-bb10-38ded0dab053]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:04.122 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[fbb2b2bc-0a90-4de6-90be-4e1c9d6dbe61]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:04.157 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[7a82a72d-b59c-4f7c-b1f0-7dcab758db23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:04 np0005588920 systemd-udevd[285023]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:04:04 np0005588920 NetworkManager[49076]: <info>  [1768921444.1645] manager: (tap89fdd65f-30): new Veth device (/org/freedesktop/NetworkManager/Devices/356)
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:04.166 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e3c3dd78-62b8-406b-9984-55ce6a51f79d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:04.210 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[d5fd2625-83db-45bc-891a-e735fee39bfa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:04.214 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[df0aed9c-e33d-46d4-8786-e79b904d4728]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:04 np0005588920 NetworkManager[49076]: <info>  [1768921444.2443] device (tap89fdd65f-30): carrier: link connected
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:04.253 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[3d1b6485-dc69-4c8b-9fca-f085f7ed7f11]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:04 np0005588920 nova_compute[226886]: 2026-01-20 15:04:04.266 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:04:04 np0005588920 nova_compute[226886]: 2026-01-20 15:04:04.270 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:04.274 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[7391a3a1-1de7-41b6-8066-812ef913b9da]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap89fdd65f-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:02:d3:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 233], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 648593, 'reachable_time': 24572, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 285099, 'error': None, 'target': 'ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:04.296 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f676e347-0b0b-4918-a86d-b7f9588b7d72]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe02:d33d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 648593, 'tstamp': 648593}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 285100, 'error': None, 'target': 'ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:04 np0005588920 nova_compute[226886]: 2026-01-20 15:04:04.318 226890 DEBUG oslo_concurrency.lockutils [req-7848d1e0-5eeb-46c1-9a65-04ad322b6995 req-61e692e9-a61d-4e7a-b9c7-fe05d89edf32 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-a0ce16c6-2b75-472f-a785-890fbb0d748e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:04.318 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[7771375e-4e99-4a27-a356-3c750ff23db1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap89fdd65f-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:02:d3:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 176, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 176, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 233], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 648593, 'reachable_time': 24572, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 285101, 'error': None, 'target': 'ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:04.355 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d71dae44-ba44-4a69-b893-02b99b7e9859]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:04:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:04.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:04.421 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[9ac1ed4b-6e52-462f-980e-680eb0a15b63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:04.422 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap89fdd65f-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:04.423 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:04.423 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap89fdd65f-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:04:04 np0005588920 nova_compute[226886]: 2026-01-20 15:04:04.425 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:04 np0005588920 NetworkManager[49076]: <info>  [1768921444.4260] manager: (tap89fdd65f-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/357)
Jan 20 10:04:04 np0005588920 kernel: tap89fdd65f-30: entered promiscuous mode
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:04.428 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap89fdd65f-30, col_values=(('external_ids', {'iface-id': '58f1013f-2d8d-46a7-97e6-2062537e7f1a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:04:04 np0005588920 nova_compute[226886]: 2026-01-20 15:04:04.429 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:04 np0005588920 ovn_controller[133971]: 2026-01-20T15:04:04Z|00732|binding|INFO|Releasing lport 58f1013f-2d8d-46a7-97e6-2062537e7f1a from this chassis (sb_readonly=0)
Jan 20 10:04:04 np0005588920 nova_compute[226886]: 2026-01-20 15:04:04.445 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:04.447 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/89fdd65f-3dd2-4375-a946-3c5de73cc24a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/89fdd65f-3dd2-4375-a946-3c5de73cc24a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:04.448 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5802dfa2-67dd-4524-be02-a3aaeab28a28]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:04.448 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-89fdd65f-3dd2-4375-a946-3c5de73cc24a
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/89fdd65f-3dd2-4375-a946-3c5de73cc24a.pid.haproxy
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID 89fdd65f-3dd2-4375-a946-3c5de73cc24a
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:04:04 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:04.450 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a', 'env', 'PROCESS_TAG=haproxy-89fdd65f-3dd2-4375-a946-3c5de73cc24a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/89fdd65f-3dd2-4375-a946-3c5de73cc24a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:04:04 np0005588920 nova_compute[226886]: 2026-01-20 15:04:04.718 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] During sync_power_state the instance has a pending task (powering-on). Skip.#033[00m
Jan 20 10:04:04 np0005588920 nova_compute[226886]: 2026-01-20 15:04:04.720 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921443.553826, 75368220-ff38-456b-a0e6-ae1c02625514 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:04:04 np0005588920 nova_compute[226886]: 2026-01-20 15:04:04.721 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] VM Started (Lifecycle Event)#033[00m
Jan 20 10:04:04 np0005588920 nova_compute[226886]: 2026-01-20 15:04:04.793 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:04:04 np0005588920 nova_compute[226886]: 2026-01-20 15:04:04.800 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Synchronizing instance power state after lifecycle event "Started"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:04:04 np0005588920 podman[285135]: 2026-01-20 15:04:04.875358961 +0000 UTC m=+0.070891834 container create e6bf006295fff8595b28f9c79e758887278bf3ac04a3429628ad98c15e4757d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 20 10:04:04 np0005588920 podman[285135]: 2026-01-20 15:04:04.834436943 +0000 UTC m=+0.029969846 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:04:04 np0005588920 systemd[1]: Started libpod-conmon-e6bf006295fff8595b28f9c79e758887278bf3ac04a3429628ad98c15e4757d7.scope.
Jan 20 10:04:04 np0005588920 systemd[1]: Started libcrun container.
Jan 20 10:04:04 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7974acc1c70817dbf985188f32174d4dcdcf0e5318a8ff7f03dd0a6fe7643be0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:04:05 np0005588920 podman[285135]: 2026-01-20 15:04:05.034102493 +0000 UTC m=+0.229635386 container init e6bf006295fff8595b28f9c79e758887278bf3ac04a3429628ad98c15e4757d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Jan 20 10:04:05 np0005588920 podman[285135]: 2026-01-20 15:04:05.039790146 +0000 UTC m=+0.235323019 container start e6bf006295fff8595b28f9c79e758887278bf3ac04a3429628ad98c15e4757d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 20 10:04:05 np0005588920 neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a[285150]: [NOTICE]   (285154) : New worker (285156) forked
Jan 20 10:04:05 np0005588920 neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a[285150]: [NOTICE]   (285154) : Loading success.
Jan 20 10:04:05 np0005588920 nova_compute[226886]: 2026-01-20 15:04:05.167 226890 DEBUG nova.compute.manager [req-27df6cf5-2d5d-4cd3-b09c-d32b69f6d31c req-db95be0f-8210-4886-9776-e017b0eae4b9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Received event network-vif-plugged-27ba7c79-863a-4084-a5df-ee7a70ec6e0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:04:05 np0005588920 nova_compute[226886]: 2026-01-20 15:04:05.167 226890 DEBUG oslo_concurrency.lockutils [req-27df6cf5-2d5d-4cd3-b09c-d32b69f6d31c req-db95be0f-8210-4886-9776-e017b0eae4b9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "75368220-ff38-456b-a0e6-ae1c02625514-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:04:05 np0005588920 nova_compute[226886]: 2026-01-20 15:04:05.168 226890 DEBUG oslo_concurrency.lockutils [req-27df6cf5-2d5d-4cd3-b09c-d32b69f6d31c req-db95be0f-8210-4886-9776-e017b0eae4b9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75368220-ff38-456b-a0e6-ae1c02625514-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:04:05 np0005588920 nova_compute[226886]: 2026-01-20 15:04:05.168 226890 DEBUG oslo_concurrency.lockutils [req-27df6cf5-2d5d-4cd3-b09c-d32b69f6d31c req-db95be0f-8210-4886-9776-e017b0eae4b9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75368220-ff38-456b-a0e6-ae1c02625514-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:04:05 np0005588920 nova_compute[226886]: 2026-01-20 15:04:05.168 226890 DEBUG nova.compute.manager [req-27df6cf5-2d5d-4cd3-b09c-d32b69f6d31c req-db95be0f-8210-4886-9776-e017b0eae4b9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] No waiting events found dispatching network-vif-plugged-27ba7c79-863a-4084-a5df-ee7a70ec6e0d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:04:05 np0005588920 nova_compute[226886]: 2026-01-20 15:04:05.168 226890 WARNING nova.compute.manager [req-27df6cf5-2d5d-4cd3-b09c-d32b69f6d31c req-db95be0f-8210-4886-9776-e017b0eae4b9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Received unexpected event network-vif-plugged-27ba7c79-863a-4084-a5df-ee7a70ec6e0d for instance with vm_state active and task_state None.#033[00m
Jan 20 10:04:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:05.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:06.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:06 np0005588920 nova_compute[226886]: 2026-01-20 15:04:06.503 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:07 np0005588920 podman[285165]: 2026-01-20 15:04:07.022087036 +0000 UTC m=+0.108493059 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller)
Jan 20 10:04:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:04:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:07.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:04:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e349 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:04:07 np0005588920 nova_compute[226886]: 2026-01-20 15:04:07.822 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:08 np0005588920 nova_compute[226886]: 2026-01-20 15:04:08.040 226890 DEBUG nova.compute.manager [req-f8a22397-500a-4643-b0f5-a51533d6936c req-b8c739da-de91-4dff-8788-f79fa4485e8d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Received event network-vif-plugged-27ba7c79-863a-4084-a5df-ee7a70ec6e0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:04:08 np0005588920 nova_compute[226886]: 2026-01-20 15:04:08.041 226890 DEBUG oslo_concurrency.lockutils [req-f8a22397-500a-4643-b0f5-a51533d6936c req-b8c739da-de91-4dff-8788-f79fa4485e8d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "75368220-ff38-456b-a0e6-ae1c02625514-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:04:08 np0005588920 nova_compute[226886]: 2026-01-20 15:04:08.041 226890 DEBUG oslo_concurrency.lockutils [req-f8a22397-500a-4643-b0f5-a51533d6936c req-b8c739da-de91-4dff-8788-f79fa4485e8d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75368220-ff38-456b-a0e6-ae1c02625514-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:04:08 np0005588920 nova_compute[226886]: 2026-01-20 15:04:08.041 226890 DEBUG oslo_concurrency.lockutils [req-f8a22397-500a-4643-b0f5-a51533d6936c req-b8c739da-de91-4dff-8788-f79fa4485e8d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75368220-ff38-456b-a0e6-ae1c02625514-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:04:08 np0005588920 nova_compute[226886]: 2026-01-20 15:04:08.042 226890 DEBUG nova.compute.manager [req-f8a22397-500a-4643-b0f5-a51533d6936c req-b8c739da-de91-4dff-8788-f79fa4485e8d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] No waiting events found dispatching network-vif-plugged-27ba7c79-863a-4084-a5df-ee7a70ec6e0d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:04:08 np0005588920 nova_compute[226886]: 2026-01-20 15:04:08.042 226890 WARNING nova.compute.manager [req-f8a22397-500a-4643-b0f5-a51533d6936c req-b8c739da-de91-4dff-8788-f79fa4485e8d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Received unexpected event network-vif-plugged-27ba7c79-863a-4084-a5df-ee7a70ec6e0d for instance with vm_state active and task_state None.#033[00m
Jan 20 10:04:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:08.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:08 np0005588920 ovn_controller[133971]: 2026-01-20T15:04:08Z|00733|binding|INFO|Releasing lport 58f1013f-2d8d-46a7-97e6-2062537e7f1a from this chassis (sb_readonly=0)
Jan 20 10:04:08 np0005588920 ovn_controller[133971]: 2026-01-20T15:04:08Z|00734|binding|INFO|Releasing lport 6133323e-bf50-4bbd-bc0b-9ecf135d8cd5 from this chassis (sb_readonly=0)
Jan 20 10:04:08 np0005588920 nova_compute[226886]: 2026-01-20 15:04:08.730 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:08 np0005588920 nova_compute[226886]: 2026-01-20 15:04:08.882 226890 DEBUG oslo_concurrency.lockutils [None req-f1b03289-89ea-4cdf-a7fd-0d415ab12552 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Acquiring lock "a0ce16c6-2b75-472f-a785-890fbb0d748e" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:04:08 np0005588920 nova_compute[226886]: 2026-01-20 15:04:08.883 226890 DEBUG oslo_concurrency.lockutils [None req-f1b03289-89ea-4cdf-a7fd-0d415ab12552 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lock "a0ce16c6-2b75-472f-a785-890fbb0d748e" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:04:08 np0005588920 nova_compute[226886]: 2026-01-20 15:04:08.884 226890 INFO nova.compute.manager [None req-f1b03289-89ea-4cdf-a7fd-0d415ab12552 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Shelving#033[00m
Jan 20 10:04:08 np0005588920 nova_compute[226886]: 2026-01-20 15:04:08.925 226890 DEBUG nova.virt.libvirt.driver [None req-f1b03289-89ea-4cdf-a7fd-0d415ab12552 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 20 10:04:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:04:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:09.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:04:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:04:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:10.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:04:11 np0005588920 kernel: tap7b2aa669-8f (unregistering): left promiscuous mode
Jan 20 10:04:11 np0005588920 NetworkManager[49076]: <info>  [1768921451.1883] device (tap7b2aa669-8f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:04:11 np0005588920 ovn_controller[133971]: 2026-01-20T15:04:11Z|00735|binding|INFO|Releasing lport 7b2aa669-8f25-4d67-b56d-f9a96e1774a4 from this chassis (sb_readonly=0)
Jan 20 10:04:11 np0005588920 ovn_controller[133971]: 2026-01-20T15:04:11Z|00736|binding|INFO|Setting lport 7b2aa669-8f25-4d67-b56d-f9a96e1774a4 down in Southbound
Jan 20 10:04:11 np0005588920 nova_compute[226886]: 2026-01-20 15:04:11.197 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:11 np0005588920 ovn_controller[133971]: 2026-01-20T15:04:11Z|00737|binding|INFO|Removing iface tap7b2aa669-8f ovn-installed in OVS
Jan 20 10:04:11 np0005588920 nova_compute[226886]: 2026-01-20 15:04:11.202 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:11 np0005588920 nova_compute[226886]: 2026-01-20 15:04:11.215 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:11 np0005588920 systemd[1]: machine-qemu\x2d73\x2dinstance\x2d0000009e.scope: Deactivated successfully.
Jan 20 10:04:11 np0005588920 systemd[1]: machine-qemu\x2d73\x2dinstance\x2d0000009e.scope: Consumed 13.850s CPU time.
Jan 20 10:04:11 np0005588920 systemd-machined[196121]: Machine qemu-73-instance-0000009e terminated.
Jan 20 10:04:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:04:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:11.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:04:11 np0005588920 nova_compute[226886]: 2026-01-20 15:04:11.505 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:11 np0005588920 nova_compute[226886]: 2026-01-20 15:04:11.942 226890 INFO nova.virt.libvirt.driver [None req-f1b03289-89ea-4cdf-a7fd-0d415ab12552 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Instance shutdown successfully after 3 seconds.#033[00m
Jan 20 10:04:11 np0005588920 nova_compute[226886]: 2026-01-20 15:04:11.947 226890 INFO nova.virt.libvirt.driver [-] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Instance destroyed successfully.#033[00m
Jan 20 10:04:11 np0005588920 nova_compute[226886]: 2026-01-20 15:04:11.947 226890 DEBUG nova.objects.instance [None req-f1b03289-89ea-4cdf-a7fd-0d415ab12552 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lazy-loading 'numa_topology' on Instance uuid a0ce16c6-2b75-472f-a785-890fbb0d748e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:04:11 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:11.958 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:b3:b3 10.100.0.12'], port_security=['fa:16:3e:f8:b3:b3 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'a0ce16c6-2b75-472f-a785-890fbb0d748e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0f434e83-45c8-454d-820b-af39b696a1d5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0fc924d2df984301897e81920c5e192f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '73ae63f6-3a5a-4604-9d46-53d9b9e08225', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.199'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cf0669b1-9b02-4bfa-859e-dac906b93fdc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=7b2aa669-8f25-4d67-b56d-f9a96e1774a4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:04:11 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:11.959 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 7b2aa669-8f25-4d67-b56d-f9a96e1774a4 in datapath 0f434e83-45c8-454d-820b-af39b696a1d5 unbound from our chassis#033[00m
Jan 20 10:04:11 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:11.961 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0f434e83-45c8-454d-820b-af39b696a1d5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:04:11 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:11.962 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[4b86de4b-ee0a-4da3-9608-36cc0597c19b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:11 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:11.963 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5 namespace which is not needed anymore#033[00m
Jan 20 10:04:12 np0005588920 neutron-haproxy-ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5[284799]: [NOTICE]   (284803) : haproxy version is 2.8.14-c23fe91
Jan 20 10:04:12 np0005588920 neutron-haproxy-ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5[284799]: [NOTICE]   (284803) : path to executable is /usr/sbin/haproxy
Jan 20 10:04:12 np0005588920 neutron-haproxy-ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5[284799]: [WARNING]  (284803) : Exiting Master process...
Jan 20 10:04:12 np0005588920 neutron-haproxy-ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5[284799]: [ALERT]    (284803) : Current worker (284805) exited with code 143 (Terminated)
Jan 20 10:04:12 np0005588920 neutron-haproxy-ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5[284799]: [WARNING]  (284803) : All workers exited. Exiting... (0)
Jan 20 10:04:12 np0005588920 systemd[1]: libpod-bda9e9de862354a390c4dd623c69e028897d02b844889ca71617cabd3954a9d5.scope: Deactivated successfully.
Jan 20 10:04:12 np0005588920 podman[285223]: 2026-01-20 15:04:12.096728863 +0000 UTC m=+0.044951764 container died bda9e9de862354a390c4dd623c69e028897d02b844889ca71617cabd3954a9d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 20 10:04:12 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bda9e9de862354a390c4dd623c69e028897d02b844889ca71617cabd3954a9d5-userdata-shm.mount: Deactivated successfully.
Jan 20 10:04:12 np0005588920 systemd[1]: var-lib-containers-storage-overlay-3030b5d9e15e87195925e49258495c7afaaccaffb05a0f967add90fb4cab3b51-merged.mount: Deactivated successfully.
Jan 20 10:04:12 np0005588920 podman[285223]: 2026-01-20 15:04:12.133281987 +0000 UTC m=+0.081504888 container cleanup bda9e9de862354a390c4dd623c69e028897d02b844889ca71617cabd3954a9d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 20 10:04:12 np0005588920 systemd[1]: libpod-conmon-bda9e9de862354a390c4dd623c69e028897d02b844889ca71617cabd3954a9d5.scope: Deactivated successfully.
Jan 20 10:04:12 np0005588920 podman[285251]: 2026-01-20 15:04:12.198949052 +0000 UTC m=+0.044417089 container remove bda9e9de862354a390c4dd623c69e028897d02b844889ca71617cabd3954a9d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 20 10:04:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:12.205 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d45439c3-9dff-4d29-8964-ddf716a22051]: (4, ('Tue Jan 20 03:04:12 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5 (bda9e9de862354a390c4dd623c69e028897d02b844889ca71617cabd3954a9d5)\nbda9e9de862354a390c4dd623c69e028897d02b844889ca71617cabd3954a9d5\nTue Jan 20 03:04:12 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5 (bda9e9de862354a390c4dd623c69e028897d02b844889ca71617cabd3954a9d5)\nbda9e9de862354a390c4dd623c69e028897d02b844889ca71617cabd3954a9d5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:12.207 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[39106440-368d-4449-bb63-75ef92b8fdd5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:12.208 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0f434e83-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:04:12 np0005588920 nova_compute[226886]: 2026-01-20 15:04:12.209 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:12 np0005588920 kernel: tap0f434e83-40: left promiscuous mode
Jan 20 10:04:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:12.228 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[bb9485be-7380-425e-8bcb-caeefd6797e2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:12 np0005588920 nova_compute[226886]: 2026-01-20 15:04:12.231 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:12.247 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[28f0b258-764b-476a-90c6-efa6237613ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:12.249 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[bebba6c9-e085-477e-8b2b-f09b3dd269a2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:12.265 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[44d3e069-dccf-4b79-b1f9-403f151351c1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 646669, 'reachable_time': 43354, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 285270, 'error': None, 'target': 'ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:12 np0005588920 systemd[1]: run-netns-ovnmeta\x2d0f434e83\x2d45c8\x2d454d\x2d820b\x2daf39b696a1d5.mount: Deactivated successfully.
Jan 20 10:04:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:12.267 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:04:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:12.268 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[5b9b1515-9e36-4283-997c-5959aef3ccb4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:12.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e349 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:04:12 np0005588920 nova_compute[226886]: 2026-01-20 15:04:12.824 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:12 np0005588920 nova_compute[226886]: 2026-01-20 15:04:12.978 226890 INFO nova.virt.libvirt.driver [None req-f1b03289-89ea-4cdf-a7fd-0d415ab12552 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Beginning cold snapshot process#033[00m
Jan 20 10:04:13 np0005588920 nova_compute[226886]: 2026-01-20 15:04:13.023 226890 DEBUG nova.compute.manager [req-da6ee9aa-e6d9-4d01-9433-a765e6363edd req-edd3d1b2-af86-4246-98d9-c0b125e7b96f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Received event network-vif-unplugged-7b2aa669-8f25-4d67-b56d-f9a96e1774a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:04:13 np0005588920 nova_compute[226886]: 2026-01-20 15:04:13.023 226890 DEBUG oslo_concurrency.lockutils [req-da6ee9aa-e6d9-4d01-9433-a765e6363edd req-edd3d1b2-af86-4246-98d9-c0b125e7b96f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "a0ce16c6-2b75-472f-a785-890fbb0d748e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:04:13 np0005588920 nova_compute[226886]: 2026-01-20 15:04:13.023 226890 DEBUG oslo_concurrency.lockutils [req-da6ee9aa-e6d9-4d01-9433-a765e6363edd req-edd3d1b2-af86-4246-98d9-c0b125e7b96f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "a0ce16c6-2b75-472f-a785-890fbb0d748e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:04:13 np0005588920 nova_compute[226886]: 2026-01-20 15:04:13.023 226890 DEBUG oslo_concurrency.lockutils [req-da6ee9aa-e6d9-4d01-9433-a765e6363edd req-edd3d1b2-af86-4246-98d9-c0b125e7b96f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "a0ce16c6-2b75-472f-a785-890fbb0d748e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:04:13 np0005588920 nova_compute[226886]: 2026-01-20 15:04:13.024 226890 DEBUG nova.compute.manager [req-da6ee9aa-e6d9-4d01-9433-a765e6363edd req-edd3d1b2-af86-4246-98d9-c0b125e7b96f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] No waiting events found dispatching network-vif-unplugged-7b2aa669-8f25-4d67-b56d-f9a96e1774a4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:04:13 np0005588920 nova_compute[226886]: 2026-01-20 15:04:13.024 226890 WARNING nova.compute.manager [req-da6ee9aa-e6d9-4d01-9433-a765e6363edd req-edd3d1b2-af86-4246-98d9-c0b125e7b96f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Received unexpected event network-vif-unplugged-7b2aa669-8f25-4d67-b56d-f9a96e1774a4 for instance with vm_state active and task_state shelving.#033[00m
Jan 20 10:04:13 np0005588920 nova_compute[226886]: 2026-01-20 15:04:13.262 226890 DEBUG nova.virt.libvirt.imagebackend [None req-f1b03289-89ea-4cdf-a7fd-0d415ab12552 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] No parent info for a32b3e07-16d8-46fd-9a7b-c242c432fcf9; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Jan 20 10:04:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:04:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:13.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:04:13 np0005588920 nova_compute[226886]: 2026-01-20 15:04:13.829 226890 DEBUG nova.storage.rbd_utils [None req-f1b03289-89ea-4cdf-a7fd-0d415ab12552 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] creating snapshot(f4dc1b0110394c469ddf3ab61c5e5900) on rbd image(a0ce16c6-2b75-472f-a785-890fbb0d748e_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 20 10:04:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:04:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:14.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:04:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e350 e350: 3 total, 3 up, 3 in
Jan 20 10:04:14 np0005588920 nova_compute[226886]: 2026-01-20 15:04:14.511 226890 DEBUG nova.storage.rbd_utils [None req-f1b03289-89ea-4cdf-a7fd-0d415ab12552 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] cloning vms/a0ce16c6-2b75-472f-a785-890fbb0d748e_disk@f4dc1b0110394c469ddf3ab61c5e5900 to images/54adca73-9204-4335-8f5e-a77f60750fc4 clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 20 10:04:14 np0005588920 nova_compute[226886]: 2026-01-20 15:04:14.635 226890 DEBUG nova.storage.rbd_utils [None req-f1b03289-89ea-4cdf-a7fd-0d415ab12552 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] flattening images/54adca73-9204-4335-8f5e-a77f60750fc4 flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 20 10:04:15 np0005588920 nova_compute[226886]: 2026-01-20 15:04:15.072 226890 DEBUG nova.storage.rbd_utils [None req-f1b03289-89ea-4cdf-a7fd-0d415ab12552 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] removing snapshot(f4dc1b0110394c469ddf3ab61c5e5900) on rbd image(a0ce16c6-2b75-472f-a785-890fbb0d748e_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 20 10:04:15 np0005588920 nova_compute[226886]: 2026-01-20 15:04:15.184 226890 DEBUG nova.compute.manager [req-90af8bfb-9505-4adc-b0fb-a4a960a36b9b req-89fd5d88-922e-4345-91a0-61e7e791d7cf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Received event network-vif-plugged-7b2aa669-8f25-4d67-b56d-f9a96e1774a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:04:15 np0005588920 nova_compute[226886]: 2026-01-20 15:04:15.185 226890 DEBUG oslo_concurrency.lockutils [req-90af8bfb-9505-4adc-b0fb-a4a960a36b9b req-89fd5d88-922e-4345-91a0-61e7e791d7cf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "a0ce16c6-2b75-472f-a785-890fbb0d748e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:04:15 np0005588920 nova_compute[226886]: 2026-01-20 15:04:15.185 226890 DEBUG oslo_concurrency.lockutils [req-90af8bfb-9505-4adc-b0fb-a4a960a36b9b req-89fd5d88-922e-4345-91a0-61e7e791d7cf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "a0ce16c6-2b75-472f-a785-890fbb0d748e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:04:15 np0005588920 nova_compute[226886]: 2026-01-20 15:04:15.185 226890 DEBUG oslo_concurrency.lockutils [req-90af8bfb-9505-4adc-b0fb-a4a960a36b9b req-89fd5d88-922e-4345-91a0-61e7e791d7cf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "a0ce16c6-2b75-472f-a785-890fbb0d748e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:04:15 np0005588920 nova_compute[226886]: 2026-01-20 15:04:15.185 226890 DEBUG nova.compute.manager [req-90af8bfb-9505-4adc-b0fb-a4a960a36b9b req-89fd5d88-922e-4345-91a0-61e7e791d7cf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] No waiting events found dispatching network-vif-plugged-7b2aa669-8f25-4d67-b56d-f9a96e1774a4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:04:15 np0005588920 nova_compute[226886]: 2026-01-20 15:04:15.186 226890 WARNING nova.compute.manager [req-90af8bfb-9505-4adc-b0fb-a4a960a36b9b req-89fd5d88-922e-4345-91a0-61e7e791d7cf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Received unexpected event network-vif-plugged-7b2aa669-8f25-4d67-b56d-f9a96e1774a4 for instance with vm_state active and task_state shelving_image_uploading.#033[00m
Jan 20 10:04:15 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #112. Immutable memtables: 0.
Jan 20 10:04:15 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:04:15.242712) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:04:15 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 69] Flushing memtable with next log file: 112
Jan 20 10:04:15 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921455242783, "job": 69, "event": "flush_started", "num_memtables": 1, "num_entries": 430, "num_deletes": 250, "total_data_size": 493335, "memory_usage": 501440, "flush_reason": "Manual Compaction"}
Jan 20 10:04:15 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 69] Level-0 flush table #113: started
Jan 20 10:04:15 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921455248357, "cf_name": "default", "job": 69, "event": "table_file_creation", "file_number": 113, "file_size": 278919, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 57799, "largest_seqno": 58224, "table_properties": {"data_size": 276551, "index_size": 468, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 837, "raw_key_size": 6502, "raw_average_key_size": 20, "raw_value_size": 271740, "raw_average_value_size": 854, "num_data_blocks": 21, "num_entries": 318, "num_filter_entries": 318, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768921437, "oldest_key_time": 1768921437, "file_creation_time": 1768921455, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 113, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:04:15 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 69] Flush lasted 5695 microseconds, and 1823 cpu microseconds.
Jan 20 10:04:15 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:04:15 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:04:15.248406) [db/flush_job.cc:967] [default] [JOB 69] Level-0 flush table #113: 278919 bytes OK
Jan 20 10:04:15 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:04:15.248427) [db/memtable_list.cc:519] [default] Level-0 commit table #113 started
Jan 20 10:04:15 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:04:15.251005) [db/memtable_list.cc:722] [default] Level-0 commit table #113: memtable #1 done
Jan 20 10:04:15 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:04:15.251029) EVENT_LOG_v1 {"time_micros": 1768921455251021, "job": 69, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:04:15 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:04:15.251049) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:04:15 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 69] Try to delete WAL files size 490627, prev total WAL file size 490627, number of live WAL files 2.
Jan 20 10:04:15 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000109.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:04:15 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:04:15.251633) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031373535' seq:72057594037927935, type:22 .. '6D6772737461740032303036' seq:0, type:0; will stop at (end)
Jan 20 10:04:15 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 70] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:04:15 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 69 Base level 0, inputs: [113(272KB)], [111(13MB)]
Jan 20 10:04:15 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921455251707, "job": 70, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [113], "files_L6": [111], "score": -1, "input_data_size": 14103782, "oldest_snapshot_seqno": -1}
Jan 20 10:04:15 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 70] Generated table #114: 8339 keys, 10313183 bytes, temperature: kUnknown
Jan 20 10:04:15 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921455370759, "cf_name": "default", "job": 70, "event": "table_file_creation", "file_number": 114, "file_size": 10313183, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10259854, "index_size": 31393, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 20869, "raw_key_size": 216186, "raw_average_key_size": 25, "raw_value_size": 10113694, "raw_average_value_size": 1212, "num_data_blocks": 1224, "num_entries": 8339, "num_filter_entries": 8339, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768921455, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 114, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:04:15 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:04:15 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:04:15.371041) [db/compaction/compaction_job.cc:1663] [default] [JOB 70] Compacted 1@0 + 1@6 files to L6 => 10313183 bytes
Jan 20 10:04:15 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:04:15.373684) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 118.4 rd, 86.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.3, 13.2 +0.0 blob) out(9.8 +0.0 blob), read-write-amplify(87.5) write-amplify(37.0) OK, records in: 8848, records dropped: 509 output_compression: NoCompression
Jan 20 10:04:15 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:04:15.373719) EVENT_LOG_v1 {"time_micros": 1768921455373706, "job": 70, "event": "compaction_finished", "compaction_time_micros": 119138, "compaction_time_cpu_micros": 40285, "output_level": 6, "num_output_files": 1, "total_output_size": 10313183, "num_input_records": 8848, "num_output_records": 8339, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:04:15 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000113.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:04:15 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921455373982, "job": 70, "event": "table_file_deletion", "file_number": 113}
Jan 20 10:04:15 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000111.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:04:15 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921455376463, "job": 70, "event": "table_file_deletion", "file_number": 111}
Jan 20 10:04:15 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:04:15.251483) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:04:15 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:04:15.376565) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:04:15 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:04:15.376570) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:04:15 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:04:15.376572) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:04:15 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:04:15.376574) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:04:15 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:04:15.376575) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:04:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:15.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:15 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e351 e351: 3 total, 3 up, 3 in
Jan 20 10:04:15 np0005588920 nova_compute[226886]: 2026-01-20 15:04:15.600 226890 DEBUG nova.storage.rbd_utils [None req-f1b03289-89ea-4cdf-a7fd-0d415ab12552 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] creating snapshot(snap) on rbd image(54adca73-9204-4335-8f5e-a77f60750fc4) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 20 10:04:15 np0005588920 systemd[1]: Starting dnf makecache...
Jan 20 10:04:15 np0005588920 podman[285416]: 2026-01-20 15:04:15.979053314 +0000 UTC m=+0.060756925 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:04:16 np0005588920 ovn_controller[133971]: 2026-01-20T15:04:16Z|00100|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:3d:87:66 10.100.0.5
Jan 20 10:04:16 np0005588920 dnf[285417]: Metadata cache refreshed recently.
Jan 20 10:04:16 np0005588920 systemd[1]: dnf-makecache.service: Deactivated successfully.
Jan 20 10:04:16 np0005588920 systemd[1]: Finished dnf makecache.
Jan 20 10:04:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:16.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:16.465 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:04:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:16.465 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:04:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:16.466 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:04:16 np0005588920 nova_compute[226886]: 2026-01-20 15:04:16.508 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:16 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e352 e352: 3 total, 3 up, 3 in
Jan 20 10:04:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:17.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e352 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:04:17 np0005588920 nova_compute[226886]: 2026-01-20 15:04:17.826 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:04:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:18.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:04:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:19.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:19 np0005588920 nova_compute[226886]: 2026-01-20 15:04:19.980 226890 INFO nova.virt.libvirt.driver [None req-f1b03289-89ea-4cdf-a7fd-0d415ab12552 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Snapshot image upload complete#033[00m
Jan 20 10:04:19 np0005588920 nova_compute[226886]: 2026-01-20 15:04:19.981 226890 DEBUG nova.compute.manager [None req-f1b03289-89ea-4cdf-a7fd-0d415ab12552 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:04:20 np0005588920 nova_compute[226886]: 2026-01-20 15:04:20.030 226890 INFO nova.compute.manager [None req-f1b03289-89ea-4cdf-a7fd-0d415ab12552 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Shelve offloading#033[00m
Jan 20 10:04:20 np0005588920 nova_compute[226886]: 2026-01-20 15:04:20.037 226890 INFO nova.virt.libvirt.driver [-] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Instance destroyed successfully.#033[00m
Jan 20 10:04:20 np0005588920 nova_compute[226886]: 2026-01-20 15:04:20.038 226890 DEBUG nova.compute.manager [None req-f1b03289-89ea-4cdf-a7fd-0d415ab12552 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:04:20 np0005588920 nova_compute[226886]: 2026-01-20 15:04:20.040 226890 DEBUG oslo_concurrency.lockutils [None req-f1b03289-89ea-4cdf-a7fd-0d415ab12552 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Acquiring lock "refresh_cache-a0ce16c6-2b75-472f-a785-890fbb0d748e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:04:20 np0005588920 nova_compute[226886]: 2026-01-20 15:04:20.041 226890 DEBUG oslo_concurrency.lockutils [None req-f1b03289-89ea-4cdf-a7fd-0d415ab12552 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Acquired lock "refresh_cache-a0ce16c6-2b75-472f-a785-890fbb0d748e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:04:20 np0005588920 nova_compute[226886]: 2026-01-20 15:04:20.041 226890 DEBUG nova.network.neutron [None req-f1b03289-89ea-4cdf-a7fd-0d415ab12552 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:04:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:04:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:20.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:04:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:21.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:21 np0005588920 nova_compute[226886]: 2026-01-20 15:04:21.509 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:04:21 np0005588920 ovn_controller[133971]: 2026-01-20T15:04:21Z|00738|binding|INFO|Releasing lport 58f1013f-2d8d-46a7-97e6-2062537e7f1a from this chassis (sb_readonly=0)
Jan 20 10:04:21 np0005588920 nova_compute[226886]: 2026-01-20 15:04:21.607 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:04:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:22.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:22 np0005588920 nova_compute[226886]: 2026-01-20 15:04:22.465 226890 DEBUG oslo_concurrency.lockutils [None req-46352cb5-854a-4541-9066-b4808bbc9bfe 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Acquiring lock "75368220-ff38-456b-a0e6-ae1c02625514" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:04:22 np0005588920 nova_compute[226886]: 2026-01-20 15:04:22.465 226890 DEBUG oslo_concurrency.lockutils [None req-46352cb5-854a-4541-9066-b4808bbc9bfe 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "75368220-ff38-456b-a0e6-ae1c02625514" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:04:22 np0005588920 nova_compute[226886]: 2026-01-20 15:04:22.466 226890 DEBUG oslo_concurrency.lockutils [None req-46352cb5-854a-4541-9066-b4808bbc9bfe 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Acquiring lock "75368220-ff38-456b-a0e6-ae1c02625514-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:04:22 np0005588920 nova_compute[226886]: 2026-01-20 15:04:22.466 226890 DEBUG oslo_concurrency.lockutils [None req-46352cb5-854a-4541-9066-b4808bbc9bfe 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "75368220-ff38-456b-a0e6-ae1c02625514-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:04:22 np0005588920 nova_compute[226886]: 2026-01-20 15:04:22.466 226890 DEBUG oslo_concurrency.lockutils [None req-46352cb5-854a-4541-9066-b4808bbc9bfe 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "75368220-ff38-456b-a0e6-ae1c02625514-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:04:22 np0005588920 nova_compute[226886]: 2026-01-20 15:04:22.468 226890 INFO nova.compute.manager [None req-46352cb5-854a-4541-9066-b4808bbc9bfe 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Terminating instance
Jan 20 10:04:22 np0005588920 nova_compute[226886]: 2026-01-20 15:04:22.469 226890 DEBUG nova.compute.manager [None req-46352cb5-854a-4541-9066-b4808bbc9bfe 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 20 10:04:22 np0005588920 kernel: tap27ba7c79-86 (unregistering): left promiscuous mode
Jan 20 10:04:22 np0005588920 NetworkManager[49076]: <info>  [1768921462.5177] device (tap27ba7c79-86): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:04:22 np0005588920 nova_compute[226886]: 2026-01-20 15:04:22.522 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:04:22 np0005588920 ovn_controller[133971]: 2026-01-20T15:04:22Z|00739|binding|INFO|Releasing lport 27ba7c79-863a-4084-a5df-ee7a70ec6e0d from this chassis (sb_readonly=0)
Jan 20 10:04:22 np0005588920 ovn_controller[133971]: 2026-01-20T15:04:22Z|00740|binding|INFO|Setting lport 27ba7c79-863a-4084-a5df-ee7a70ec6e0d down in Southbound
Jan 20 10:04:22 np0005588920 nova_compute[226886]: 2026-01-20 15:04:22.534 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:04:22 np0005588920 ovn_controller[133971]: 2026-01-20T15:04:22Z|00741|binding|INFO|Removing iface tap27ba7c79-86 ovn-installed in OVS
Jan 20 10:04:22 np0005588920 nova_compute[226886]: 2026-01-20 15:04:22.537 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:04:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:22.548 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3d:87:66 10.100.0.5'], port_security=['fa:16:3e:3d:87:66 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '75368220-ff38-456b-a0e6-ae1c02625514', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-89fdd65f-3dd2-4375-a946-3c5de73cc24a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '96f7b14c2a9348f08305fe232df2a603', 'neutron:revision_number': '8', 'neutron:security_group_ids': '9aa52617-8217-40d2-b2b6-31674dd65078', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.229', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=07b54ff9-b8ec-4b9d-ab83-0d9fa6361dd1, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=27ba7c79-863a-4084-a5df-ee7a70ec6e0d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 20 10:04:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:22.549 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 27ba7c79-863a-4084-a5df-ee7a70ec6e0d in datapath 89fdd65f-3dd2-4375-a946-3c5de73cc24a unbound from our chassis
Jan 20 10:04:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:22.550 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 89fdd65f-3dd2-4375-a946-3c5de73cc24a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 20 10:04:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:22.551 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[8c1a1703-ca1f-4e63-b0ce-d19f161ec647]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:04:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:22.551 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a namespace which is not needed anymore
Jan 20 10:04:22 np0005588920 nova_compute[226886]: 2026-01-20 15:04:22.553 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:04:22 np0005588920 systemd[1]: machine-qemu\x2d74\x2dinstance\x2d0000009b.scope: Deactivated successfully.
Jan 20 10:04:22 np0005588920 systemd[1]: machine-qemu\x2d74\x2dinstance\x2d0000009b.scope: Consumed 14.124s CPU time.
Jan 20 10:04:22 np0005588920 systemd-machined[196121]: Machine qemu-74-instance-0000009b terminated.
Jan 20 10:04:22 np0005588920 neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a[285150]: [NOTICE]   (285154) : haproxy version is 2.8.14-c23fe91
Jan 20 10:04:22 np0005588920 neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a[285150]: [NOTICE]   (285154) : path to executable is /usr/sbin/haproxy
Jan 20 10:04:22 np0005588920 neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a[285150]: [WARNING]  (285154) : Exiting Master process...
Jan 20 10:04:22 np0005588920 neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a[285150]: [WARNING]  (285154) : Exiting Master process...
Jan 20 10:04:22 np0005588920 neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a[285150]: [ALERT]    (285154) : Current worker (285156) exited with code 143 (Terminated)
Jan 20 10:04:22 np0005588920 neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a[285150]: [WARNING]  (285154) : All workers exited. Exiting... (0)
Jan 20 10:04:22 np0005588920 systemd[1]: libpod-e6bf006295fff8595b28f9c79e758887278bf3ac04a3429628ad98c15e4757d7.scope: Deactivated successfully.
Jan 20 10:04:22 np0005588920 podman[285464]: 2026-01-20 15:04:22.679645977 +0000 UTC m=+0.043790631 container died e6bf006295fff8595b28f9c79e758887278bf3ac04a3429628ad98c15e4757d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:04:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e352 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:04:22 np0005588920 nova_compute[226886]: 2026-01-20 15:04:22.703 226890 INFO nova.virt.libvirt.driver [-] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Instance destroyed successfully.
Jan 20 10:04:22 np0005588920 nova_compute[226886]: 2026-01-20 15:04:22.704 226890 DEBUG nova.objects.instance [None req-46352cb5-854a-4541-9066-b4808bbc9bfe 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lazy-loading 'resources' on Instance uuid 75368220-ff38-456b-a0e6-ae1c02625514 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 10:04:22 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e6bf006295fff8595b28f9c79e758887278bf3ac04a3429628ad98c15e4757d7-userdata-shm.mount: Deactivated successfully.
Jan 20 10:04:22 np0005588920 systemd[1]: var-lib-containers-storage-overlay-7974acc1c70817dbf985188f32174d4dcdcf0e5318a8ff7f03dd0a6fe7643be0-merged.mount: Deactivated successfully.
Jan 20 10:04:22 np0005588920 nova_compute[226886]: 2026-01-20 15:04:22.721 226890 DEBUG nova.virt.libvirt.vif [None req-46352cb5-854a-4541-9066-b4808bbc9bfe 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:02:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-284183767',display_name='tempest-AttachVolumeTestJSON-server-284183767',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumetestjson-server-284183767',id=155,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFZMMa9Fn48cT13jyLNVxZBqG2NAPc4g1Znb9IEN8J7OPuEySAWtPNC9EMH4uWUG8OO1N+YGXE5zrWJgSxgzur/4qS1UEfGQON2+xLOpRFvKfmgmBUr46iCGe8EkNjED6w==',key_name='tempest-keypair-988338540',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:02:34Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='96f7b14c2a9348f08305fe232df2a603',ramdisk_id='',reservation_id='r-qucgebjt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeTestJSON-583320363',owner_user_name='tempest-AttachVolumeTestJSON-583320363-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:04:04Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='912329b1a6ad42bdb72e952c03983bdf',uuid=75368220-ff38-456b-a0e6-ae1c02625514,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "address": "fa:16:3e:3d:87:66", "network": {"id": "89fdd65f-3dd2-4375-a946-3c5de73cc24a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1916368729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96f7b14c2a9348f08305fe232df2a603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27ba7c79-86", "ovs_interfaceid": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 20 10:04:22 np0005588920 nova_compute[226886]: 2026-01-20 15:04:22.722 226890 DEBUG nova.network.os_vif_util [None req-46352cb5-854a-4541-9066-b4808bbc9bfe 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Converting VIF {"id": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "address": "fa:16:3e:3d:87:66", "network": {"id": "89fdd65f-3dd2-4375-a946-3c5de73cc24a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1916368729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96f7b14c2a9348f08305fe232df2a603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27ba7c79-86", "ovs_interfaceid": "27ba7c79-863a-4084-a5df-ee7a70ec6e0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 20 10:04:22 np0005588920 nova_compute[226886]: 2026-01-20 15:04:22.723 226890 DEBUG nova.network.os_vif_util [None req-46352cb5-854a-4541-9066-b4808bbc9bfe 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3d:87:66,bridge_name='br-int',has_traffic_filtering=True,id=27ba7c79-863a-4084-a5df-ee7a70ec6e0d,network=Network(89fdd65f-3dd2-4375-a946-3c5de73cc24a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27ba7c79-86') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 20 10:04:22 np0005588920 nova_compute[226886]: 2026-01-20 15:04:22.724 226890 DEBUG os_vif [None req-46352cb5-854a-4541-9066-b4808bbc9bfe 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3d:87:66,bridge_name='br-int',has_traffic_filtering=True,id=27ba7c79-863a-4084-a5df-ee7a70ec6e0d,network=Network(89fdd65f-3dd2-4375-a946-3c5de73cc24a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27ba7c79-86') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 20 10:04:22 np0005588920 podman[285464]: 2026-01-20 15:04:22.724746825 +0000 UTC m=+0.088891469 container cleanup e6bf006295fff8595b28f9c79e758887278bf3ac04a3429628ad98c15e4757d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:04:22 np0005588920 nova_compute[226886]: 2026-01-20 15:04:22.725 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:04:22 np0005588920 nova_compute[226886]: 2026-01-20 15:04:22.726 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap27ba7c79-86, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 10:04:22 np0005588920 nova_compute[226886]: 2026-01-20 15:04:22.728 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:04:22 np0005588920 nova_compute[226886]: 2026-01-20 15:04:22.730 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 20 10:04:22 np0005588920 systemd[1]: libpod-conmon-e6bf006295fff8595b28f9c79e758887278bf3ac04a3429628ad98c15e4757d7.scope: Deactivated successfully.
Jan 20 10:04:22 np0005588920 nova_compute[226886]: 2026-01-20 15:04:22.733 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:04:22 np0005588920 nova_compute[226886]: 2026-01-20 15:04:22.736 226890 INFO os_vif [None req-46352cb5-854a-4541-9066-b4808bbc9bfe 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3d:87:66,bridge_name='br-int',has_traffic_filtering=True,id=27ba7c79-863a-4084-a5df-ee7a70ec6e0d,network=Network(89fdd65f-3dd2-4375-a946-3c5de73cc24a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27ba7c79-86')
Jan 20 10:04:22 np0005588920 podman[285509]: 2026-01-20 15:04:22.790998726 +0000 UTC m=+0.044433059 container remove e6bf006295fff8595b28f9c79e758887278bf3ac04a3429628ad98c15e4757d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 20 10:04:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:22.798 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f66bd670-c3aa-47b8-b2e4-a44af2947656]: (4, ('Tue Jan 20 03:04:22 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a (e6bf006295fff8595b28f9c79e758887278bf3ac04a3429628ad98c15e4757d7)\ne6bf006295fff8595b28f9c79e758887278bf3ac04a3429628ad98c15e4757d7\nTue Jan 20 03:04:22 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a (e6bf006295fff8595b28f9c79e758887278bf3ac04a3429628ad98c15e4757d7)\ne6bf006295fff8595b28f9c79e758887278bf3ac04a3429628ad98c15e4757d7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:04:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:22.800 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[9499e361-02c8-4a28-ac0d-a89fb5515497]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:04:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:22.801 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap89fdd65f-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 10:04:22 np0005588920 nova_compute[226886]: 2026-01-20 15:04:22.802 226890 DEBUG nova.compute.manager [req-91c20727-eaaf-440d-bf19-d456bb3a7fdf req-5f2f64d0-554d-478b-899d-c38f7bae9efb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Received event network-vif-unplugged-27ba7c79-863a-4084-a5df-ee7a70ec6e0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 10:04:22 np0005588920 nova_compute[226886]: 2026-01-20 15:04:22.803 226890 DEBUG oslo_concurrency.lockutils [req-91c20727-eaaf-440d-bf19-d456bb3a7fdf req-5f2f64d0-554d-478b-899d-c38f7bae9efb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "75368220-ff38-456b-a0e6-ae1c02625514-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:04:22 np0005588920 kernel: tap89fdd65f-30: left promiscuous mode
Jan 20 10:04:22 np0005588920 nova_compute[226886]: 2026-01-20 15:04:22.803 226890 DEBUG oslo_concurrency.lockutils [req-91c20727-eaaf-440d-bf19-d456bb3a7fdf req-5f2f64d0-554d-478b-899d-c38f7bae9efb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75368220-ff38-456b-a0e6-ae1c02625514-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:04:22 np0005588920 nova_compute[226886]: 2026-01-20 15:04:22.804 226890 DEBUG oslo_concurrency.lockutils [req-91c20727-eaaf-440d-bf19-d456bb3a7fdf req-5f2f64d0-554d-478b-899d-c38f7bae9efb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75368220-ff38-456b-a0e6-ae1c02625514-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:04:22 np0005588920 nova_compute[226886]: 2026-01-20 15:04:22.804 226890 DEBUG nova.compute.manager [req-91c20727-eaaf-440d-bf19-d456bb3a7fdf req-5f2f64d0-554d-478b-899d-c38f7bae9efb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] No waiting events found dispatching network-vif-unplugged-27ba7c79-863a-4084-a5df-ee7a70ec6e0d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 20 10:04:22 np0005588920 nova_compute[226886]: 2026-01-20 15:04:22.805 226890 DEBUG nova.compute.manager [req-91c20727-eaaf-440d-bf19-d456bb3a7fdf req-5f2f64d0-554d-478b-899d-c38f7bae9efb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Received event network-vif-unplugged-27ba7c79-863a-4084-a5df-ee7a70ec6e0d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 20 10:04:22 np0005588920 nova_compute[226886]: 2026-01-20 15:04:22.805 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:04:22 np0005588920 nova_compute[226886]: 2026-01-20 15:04:22.823 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:04:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:22.825 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b28ee86d-884a-4145-b30a-0025a3aa0a00]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:04:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:22.841 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[6ce7a9ac-6ed4-4c55-9c14-46d22f8666e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:04:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:22.843 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e86872db-c132-496b-a821-3a94c7aa6b8e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:04:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:22.857 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[1aea20a3-1369-40b1-8d74-e87fc5a896fa]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 648583, 'reachable_time': 28860, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 285545, 'error': None, 'target': 'ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:04:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:22.859 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:04:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:22.859 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[915a8cac-e273-4bfa-890a-c4cbd8f644e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:22 np0005588920 systemd[1]: run-netns-ovnmeta\x2d89fdd65f\x2d3dd2\x2d4375\x2da946\x2d3c5de73cc24a.mount: Deactivated successfully.
Jan 20 10:04:23 np0005588920 nova_compute[226886]: 2026-01-20 15:04:23.006 226890 DEBUG nova.network.neutron [None req-f1b03289-89ea-4cdf-a7fd-0d415ab12552 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Updating instance_info_cache with network_info: [{"id": "7b2aa669-8f25-4d67-b56d-f9a96e1774a4", "address": "fa:16:3e:f8:b3:b3", "network": {"id": "0f434e83-45c8-454d-820b-af39b696a1d5", "bridge": "br-int", "label": "tempest-TestShelveInstance-1862824485-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0fc924d2df984301897e81920c5e192f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b2aa669-8f", "ovs_interfaceid": "7b2aa669-8f25-4d67-b56d-f9a96e1774a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:04:23 np0005588920 nova_compute[226886]: 2026-01-20 15:04:23.072 226890 DEBUG oslo_concurrency.lockutils [None req-f1b03289-89ea-4cdf-a7fd-0d415ab12552 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Releasing lock "refresh_cache-a0ce16c6-2b75-472f-a785-890fbb0d748e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:04:23 np0005588920 nova_compute[226886]: 2026-01-20 15:04:23.134 226890 INFO nova.virt.libvirt.driver [None req-46352cb5-854a-4541-9066-b4808bbc9bfe 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Deleting instance files /var/lib/nova/instances/75368220-ff38-456b-a0e6-ae1c02625514_del#033[00m
Jan 20 10:04:23 np0005588920 nova_compute[226886]: 2026-01-20 15:04:23.134 226890 INFO nova.virt.libvirt.driver [None req-46352cb5-854a-4541-9066-b4808bbc9bfe 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Deletion of /var/lib/nova/instances/75368220-ff38-456b-a0e6-ae1c02625514_del complete#033[00m
Jan 20 10:04:23 np0005588920 nova_compute[226886]: 2026-01-20 15:04:23.232 226890 INFO nova.compute.manager [None req-46352cb5-854a-4541-9066-b4808bbc9bfe 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Took 0.76 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:04:23 np0005588920 nova_compute[226886]: 2026-01-20 15:04:23.233 226890 DEBUG oslo.service.loopingcall [None req-46352cb5-854a-4541-9066-b4808bbc9bfe 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:04:23 np0005588920 nova_compute[226886]: 2026-01-20 15:04:23.233 226890 DEBUG nova.compute.manager [-] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:04:23 np0005588920 nova_compute[226886]: 2026-01-20 15:04:23.233 226890 DEBUG nova.network.neutron [-] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:04:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:04:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:23.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:04:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:24.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:25 np0005588920 nova_compute[226886]: 2026-01-20 15:04:25.047 226890 DEBUG nova.compute.manager [req-e02663c9-af81-4474-be75-d48d2639f527 req-52ea5c92-fed1-46ac-9c26-00904457c944 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Received event network-vif-plugged-27ba7c79-863a-4084-a5df-ee7a70ec6e0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:04:25 np0005588920 nova_compute[226886]: 2026-01-20 15:04:25.048 226890 DEBUG oslo_concurrency.lockutils [req-e02663c9-af81-4474-be75-d48d2639f527 req-52ea5c92-fed1-46ac-9c26-00904457c944 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "75368220-ff38-456b-a0e6-ae1c02625514-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:04:25 np0005588920 nova_compute[226886]: 2026-01-20 15:04:25.048 226890 DEBUG oslo_concurrency.lockutils [req-e02663c9-af81-4474-be75-d48d2639f527 req-52ea5c92-fed1-46ac-9c26-00904457c944 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75368220-ff38-456b-a0e6-ae1c02625514-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:04:25 np0005588920 nova_compute[226886]: 2026-01-20 15:04:25.048 226890 DEBUG oslo_concurrency.lockutils [req-e02663c9-af81-4474-be75-d48d2639f527 req-52ea5c92-fed1-46ac-9c26-00904457c944 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75368220-ff38-456b-a0e6-ae1c02625514-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:04:25 np0005588920 nova_compute[226886]: 2026-01-20 15:04:25.048 226890 DEBUG nova.compute.manager [req-e02663c9-af81-4474-be75-d48d2639f527 req-52ea5c92-fed1-46ac-9c26-00904457c944 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] No waiting events found dispatching network-vif-plugged-27ba7c79-863a-4084-a5df-ee7a70ec6e0d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:04:25 np0005588920 nova_compute[226886]: 2026-01-20 15:04:25.048 226890 WARNING nova.compute.manager [req-e02663c9-af81-4474-be75-d48d2639f527 req-52ea5c92-fed1-46ac-9c26-00904457c944 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Received unexpected event network-vif-plugged-27ba7c79-863a-4084-a5df-ee7a70ec6e0d for instance with vm_state active and task_state deleting.#033[00m
Jan 20 10:04:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:04:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:25.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:04:25 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e353 e353: 3 total, 3 up, 3 in
Jan 20 10:04:25 np0005588920 nova_compute[226886]: 2026-01-20 15:04:25.881 226890 INFO nova.virt.libvirt.driver [-] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Instance destroyed successfully.#033[00m
Jan 20 10:04:25 np0005588920 nova_compute[226886]: 2026-01-20 15:04:25.881 226890 DEBUG nova.objects.instance [None req-f1b03289-89ea-4cdf-a7fd-0d415ab12552 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lazy-loading 'resources' on Instance uuid a0ce16c6-2b75-472f-a785-890fbb0d748e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:04:25 np0005588920 nova_compute[226886]: 2026-01-20 15:04:25.912 226890 DEBUG nova.virt.libvirt.vif [None req-f1b03289-89ea-4cdf-a7fd-0d415ab12552 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:03:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestShelveInstance-server-1036076849',display_name='tempest-TestShelveInstance-server-1036076849',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testshelveinstance-server-1036076849',id=158,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDKbSg+U5D2P4vAhN93N9KUHNV5uhMaQWWRL1/dgo18CRR+13PC7EHc+NfhsO3rchRXZsX8fKAmtn1X9kzXWRANuFYEKLsCK/cad6C56A1ZIn2STxVc8j8348CriP8hVdg==',key_name='tempest-TestShelveInstance-624896822',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:03:45Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='0fc924d2df984301897e81920c5e192f',ramdisk_id='',reservation_id='r-18lfln0m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestShelveInstance-1425544575',owner_user_name='tempest-TestShelveInstance-1425544575-project-member',shelved_at='2026-01-20T15:04:19.981727',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='54adca73-9204-4335-8f5e-a77f60750fc4'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:04:13Z,user_data=None,user_id='b02a8ef6cc3946ceb2c8846aae2eae68',uuid=a0ce16c6-2b75-472f-a785-890fbb0d748e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "7b2aa669-8f25-4d67-b56d-f9a96e1774a4", "address": "fa:16:3e:f8:b3:b3", "network": {"id": "0f434e83-45c8-454d-820b-af39b696a1d5", "bridge": "br-int", "label": "tempest-TestShelveInstance-1862824485-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0fc924d2df984301897e81920c5e192f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b2aa669-8f", "ovs_interfaceid": "7b2aa669-8f25-4d67-b56d-f9a96e1774a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:04:25 np0005588920 nova_compute[226886]: 2026-01-20 15:04:25.913 226890 DEBUG nova.network.os_vif_util [None req-f1b03289-89ea-4cdf-a7fd-0d415ab12552 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Converting VIF {"id": "7b2aa669-8f25-4d67-b56d-f9a96e1774a4", "address": "fa:16:3e:f8:b3:b3", "network": {"id": "0f434e83-45c8-454d-820b-af39b696a1d5", "bridge": "br-int", "label": "tempest-TestShelveInstance-1862824485-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0fc924d2df984301897e81920c5e192f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b2aa669-8f", "ovs_interfaceid": "7b2aa669-8f25-4d67-b56d-f9a96e1774a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:04:25 np0005588920 nova_compute[226886]: 2026-01-20 15:04:25.913 226890 DEBUG nova.network.os_vif_util [None req-f1b03289-89ea-4cdf-a7fd-0d415ab12552 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f8:b3:b3,bridge_name='br-int',has_traffic_filtering=True,id=7b2aa669-8f25-4d67-b56d-f9a96e1774a4,network=Network(0f434e83-45c8-454d-820b-af39b696a1d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b2aa669-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:04:25 np0005588920 nova_compute[226886]: 2026-01-20 15:04:25.914 226890 DEBUG os_vif [None req-f1b03289-89ea-4cdf-a7fd-0d415ab12552 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f8:b3:b3,bridge_name='br-int',has_traffic_filtering=True,id=7b2aa669-8f25-4d67-b56d-f9a96e1774a4,network=Network(0f434e83-45c8-454d-820b-af39b696a1d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b2aa669-8f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:04:25 np0005588920 nova_compute[226886]: 2026-01-20 15:04:25.915 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:25 np0005588920 nova_compute[226886]: 2026-01-20 15:04:25.916 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b2aa669-8f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:04:25 np0005588920 nova_compute[226886]: 2026-01-20 15:04:25.917 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:25 np0005588920 nova_compute[226886]: 2026-01-20 15:04:25.919 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:25 np0005588920 nova_compute[226886]: 2026-01-20 15:04:25.922 226890 INFO os_vif [None req-f1b03289-89ea-4cdf-a7fd-0d415ab12552 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f8:b3:b3,bridge_name='br-int',has_traffic_filtering=True,id=7b2aa669-8f25-4d67-b56d-f9a96e1774a4,network=Network(0f434e83-45c8-454d-820b-af39b696a1d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b2aa669-8f')#033[00m
Jan 20 10:04:26 np0005588920 nova_compute[226886]: 2026-01-20 15:04:26.073 226890 DEBUG nova.network.neutron [-] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:04:26 np0005588920 nova_compute[226886]: 2026-01-20 15:04:26.101 226890 INFO nova.compute.manager [-] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Took 2.87 seconds to deallocate network for instance.#033[00m
Jan 20 10:04:26 np0005588920 nova_compute[226886]: 2026-01-20 15:04:26.174 226890 DEBUG oslo_concurrency.lockutils [None req-46352cb5-854a-4541-9066-b4808bbc9bfe 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:04:26 np0005588920 nova_compute[226886]: 2026-01-20 15:04:26.175 226890 DEBUG oslo_concurrency.lockutils [None req-46352cb5-854a-4541-9066-b4808bbc9bfe 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:04:26 np0005588920 nova_compute[226886]: 2026-01-20 15:04:26.325 226890 INFO nova.virt.libvirt.driver [None req-f1b03289-89ea-4cdf-a7fd-0d415ab12552 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Deleting instance files /var/lib/nova/instances/a0ce16c6-2b75-472f-a785-890fbb0d748e_del#033[00m
Jan 20 10:04:26 np0005588920 nova_compute[226886]: 2026-01-20 15:04:26.326 226890 INFO nova.virt.libvirt.driver [None req-f1b03289-89ea-4cdf-a7fd-0d415ab12552 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Deletion of /var/lib/nova/instances/a0ce16c6-2b75-472f-a785-890fbb0d748e_del complete#033[00m
Jan 20 10:04:26 np0005588920 nova_compute[226886]: 2026-01-20 15:04:26.333 226890 DEBUG oslo_concurrency.processutils [None req-46352cb5-854a-4541-9066-b4808bbc9bfe 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:04:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:26.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:26 np0005588920 nova_compute[226886]: 2026-01-20 15:04:26.434 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768921451.4326217, a0ce16c6-2b75-472f-a785-890fbb0d748e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:04:26 np0005588920 nova_compute[226886]: 2026-01-20 15:04:26.435 226890 INFO nova.compute.manager [-] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:04:26 np0005588920 nova_compute[226886]: 2026-01-20 15:04:26.479 226890 DEBUG nova.compute.manager [None req-bda43911-c13e-4432-a567-5bdcf5ab9e7d - - - - - -] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:04:26 np0005588920 nova_compute[226886]: 2026-01-20 15:04:26.509 226890 DEBUG nova.compute.manager [req-c59ca4ee-a335-4312-865e-f5ca283f6605 req-617df8c7-3415-47fb-a28f-c8c6fbf3eb5a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Received event network-vif-deleted-27ba7c79-863a-4084-a5df-ee7a70ec6e0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:04:26 np0005588920 nova_compute[226886]: 2026-01-20 15:04:26.512 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:26 np0005588920 nova_compute[226886]: 2026-01-20 15:04:26.538 226890 INFO nova.scheduler.client.report [None req-f1b03289-89ea-4cdf-a7fd-0d415ab12552 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Deleted allocations for instance a0ce16c6-2b75-472f-a785-890fbb0d748e#033[00m
Jan 20 10:04:26 np0005588920 nova_compute[226886]: 2026-01-20 15:04:26.583 226890 DEBUG oslo_concurrency.lockutils [None req-f1b03289-89ea-4cdf-a7fd-0d415ab12552 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:04:26 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:04:26 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3419700180' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:04:26 np0005588920 nova_compute[226886]: 2026-01-20 15:04:26.854 226890 DEBUG oslo_concurrency.processutils [None req-46352cb5-854a-4541-9066-b4808bbc9bfe 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.520s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:04:26 np0005588920 nova_compute[226886]: 2026-01-20 15:04:26.860 226890 DEBUG nova.compute.provider_tree [None req-46352cb5-854a-4541-9066-b4808bbc9bfe 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:04:26 np0005588920 nova_compute[226886]: 2026-01-20 15:04:26.899 226890 DEBUG nova.scheduler.client.report [None req-46352cb5-854a-4541-9066-b4808bbc9bfe 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:04:26 np0005588920 nova_compute[226886]: 2026-01-20 15:04:26.934 226890 DEBUG oslo_concurrency.lockutils [None req-46352cb5-854a-4541-9066-b4808bbc9bfe 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.759s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:04:26 np0005588920 nova_compute[226886]: 2026-01-20 15:04:26.937 226890 DEBUG oslo_concurrency.lockutils [None req-f1b03289-89ea-4cdf-a7fd-0d415ab12552 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.354s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:04:26 np0005588920 nova_compute[226886]: 2026-01-20 15:04:26.986 226890 DEBUG oslo_concurrency.processutils [None req-f1b03289-89ea-4cdf-a7fd-0d415ab12552 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:04:27 np0005588920 nova_compute[226886]: 2026-01-20 15:04:27.015 226890 INFO nova.scheduler.client.report [None req-46352cb5-854a-4541-9066-b4808bbc9bfe 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Deleted allocations for instance 75368220-ff38-456b-a0e6-ae1c02625514#033[00m
Jan 20 10:04:27 np0005588920 nova_compute[226886]: 2026-01-20 15:04:27.128 226890 DEBUG oslo_concurrency.lockutils [None req-46352cb5-854a-4541-9066-b4808bbc9bfe 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "75368220-ff38-456b-a0e6-ae1c02625514" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.663s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:04:27 np0005588920 nova_compute[226886]: 2026-01-20 15:04:27.369 226890 DEBUG nova.compute.manager [req-6fca2288-da5a-4f58-a613-1f6944fdbecc req-296b2429-b03b-434a-b6bb-6e4afbfa7864 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Received event network-changed-7b2aa669-8f25-4d67-b56d-f9a96e1774a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:04:27 np0005588920 nova_compute[226886]: 2026-01-20 15:04:27.369 226890 DEBUG nova.compute.manager [req-6fca2288-da5a-4f58-a613-1f6944fdbecc req-296b2429-b03b-434a-b6bb-6e4afbfa7864 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Refreshing instance network info cache due to event network-changed-7b2aa669-8f25-4d67-b56d-f9a96e1774a4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:04:27 np0005588920 nova_compute[226886]: 2026-01-20 15:04:27.370 226890 DEBUG oslo_concurrency.lockutils [req-6fca2288-da5a-4f58-a613-1f6944fdbecc req-296b2429-b03b-434a-b6bb-6e4afbfa7864 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-a0ce16c6-2b75-472f-a785-890fbb0d748e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:04:27 np0005588920 nova_compute[226886]: 2026-01-20 15:04:27.370 226890 DEBUG oslo_concurrency.lockutils [req-6fca2288-da5a-4f58-a613-1f6944fdbecc req-296b2429-b03b-434a-b6bb-6e4afbfa7864 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-a0ce16c6-2b75-472f-a785-890fbb0d748e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:04:27 np0005588920 nova_compute[226886]: 2026-01-20 15:04:27.371 226890 DEBUG nova.network.neutron [req-6fca2288-da5a-4f58-a613-1f6944fdbecc req-296b2429-b03b-434a-b6bb-6e4afbfa7864 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Refreshing network info cache for port 7b2aa669-8f25-4d67-b56d-f9a96e1774a4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:04:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:04:27 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/962284744' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:04:27 np0005588920 nova_compute[226886]: 2026-01-20 15:04:27.418 226890 DEBUG oslo_concurrency.processutils [None req-f1b03289-89ea-4cdf-a7fd-0d415ab12552 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:04:27 np0005588920 nova_compute[226886]: 2026-01-20 15:04:27.424 226890 DEBUG nova.compute.provider_tree [None req-f1b03289-89ea-4cdf-a7fd-0d415ab12552 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:04:27 np0005588920 nova_compute[226886]: 2026-01-20 15:04:27.444 226890 DEBUG nova.scheduler.client.report [None req-f1b03289-89ea-4cdf-a7fd-0d415ab12552 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:04:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:27.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:27 np0005588920 nova_compute[226886]: 2026-01-20 15:04:27.471 226890 DEBUG oslo_concurrency.lockutils [None req-f1b03289-89ea-4cdf-a7fd-0d415ab12552 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.534s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:04:27 np0005588920 nova_compute[226886]: 2026-01-20 15:04:27.594 226890 DEBUG oslo_concurrency.lockutils [None req-f1b03289-89ea-4cdf-a7fd-0d415ab12552 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lock "a0ce16c6-2b75-472f-a785-890fbb0d748e" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 18.711s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:04:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e353 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:04:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:28.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:29.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:04:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:30.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:04:30 np0005588920 nova_compute[226886]: 2026-01-20 15:04:30.514 226890 DEBUG nova.network.neutron [req-6fca2288-da5a-4f58-a613-1f6944fdbecc req-296b2429-b03b-434a-b6bb-6e4afbfa7864 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Updated VIF entry in instance network info cache for port 7b2aa669-8f25-4d67-b56d-f9a96e1774a4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:04:30 np0005588920 nova_compute[226886]: 2026-01-20 15:04:30.515 226890 DEBUG nova.network.neutron [req-6fca2288-da5a-4f58-a613-1f6944fdbecc req-296b2429-b03b-434a-b6bb-6e4afbfa7864 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Updating instance_info_cache with network_info: [{"id": "7b2aa669-8f25-4d67-b56d-f9a96e1774a4", "address": "fa:16:3e:f8:b3:b3", "network": {"id": "0f434e83-45c8-454d-820b-af39b696a1d5", "bridge": null, "label": "tempest-TestShelveInstance-1862824485-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0fc924d2df984301897e81920c5e192f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "unbound", "details": {}, "devname": "tap7b2aa669-8f", "ovs_interfaceid": null, "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:04:30 np0005588920 nova_compute[226886]: 2026-01-20 15:04:30.552 226890 DEBUG oslo_concurrency.lockutils [req-6fca2288-da5a-4f58-a613-1f6944fdbecc req-296b2429-b03b-434a-b6bb-6e4afbfa7864 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-a0ce16c6-2b75-472f-a785-890fbb0d748e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:04:30 np0005588920 nova_compute[226886]: 2026-01-20 15:04:30.918 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:04:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:31.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:04:31 np0005588920 nova_compute[226886]: 2026-01-20 15:04:31.514 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:32.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:32 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:04:32 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:04:32 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:04:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e353 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:04:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:04:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:33.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:04:34 np0005588920 nova_compute[226886]: 2026-01-20 15:04:34.290 226890 DEBUG oslo_concurrency.lockutils [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Acquiring lock "a0ce16c6-2b75-472f-a785-890fbb0d748e" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:04:34 np0005588920 nova_compute[226886]: 2026-01-20 15:04:34.290 226890 DEBUG oslo_concurrency.lockutils [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lock "a0ce16c6-2b75-472f-a785-890fbb0d748e" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:04:34 np0005588920 nova_compute[226886]: 2026-01-20 15:04:34.291 226890 INFO nova.compute.manager [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Unshelving#033[00m
Jan 20 10:04:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:34.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:34 np0005588920 nova_compute[226886]: 2026-01-20 15:04:34.446 226890 DEBUG oslo_concurrency.lockutils [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:04:34 np0005588920 nova_compute[226886]: 2026-01-20 15:04:34.446 226890 DEBUG oslo_concurrency.lockutils [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:04:34 np0005588920 nova_compute[226886]: 2026-01-20 15:04:34.451 226890 DEBUG nova.objects.instance [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lazy-loading 'pci_requests' on Instance uuid a0ce16c6-2b75-472f-a785-890fbb0d748e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:04:34 np0005588920 nova_compute[226886]: 2026-01-20 15:04:34.475 226890 DEBUG nova.objects.instance [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lazy-loading 'numa_topology' on Instance uuid a0ce16c6-2b75-472f-a785-890fbb0d748e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:04:34 np0005588920 nova_compute[226886]: 2026-01-20 15:04:34.542 226890 DEBUG nova.virt.hardware [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 10:04:34 np0005588920 nova_compute[226886]: 2026-01-20 15:04:34.543 226890 INFO nova.compute.claims [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 10:04:34 np0005588920 nova_compute[226886]: 2026-01-20 15:04:34.698 226890 DEBUG oslo_concurrency.processutils [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:04:35 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:04:35 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1897084' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:04:35 np0005588920 nova_compute[226886]: 2026-01-20 15:04:35.156 226890 DEBUG oslo_concurrency.processutils [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:04:35 np0005588920 nova_compute[226886]: 2026-01-20 15:04:35.164 226890 DEBUG nova.compute.provider_tree [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:04:35 np0005588920 nova_compute[226886]: 2026-01-20 15:04:35.198 226890 DEBUG nova.scheduler.client.report [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:04:35 np0005588920 nova_compute[226886]: 2026-01-20 15:04:35.223 226890 DEBUG oslo_concurrency.lockutils [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.777s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:04:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:35.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:35 np0005588920 nova_compute[226886]: 2026-01-20 15:04:35.543 226890 INFO nova.network.neutron [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Updating port 7b2aa669-8f25-4d67-b56d-f9a96e1774a4 with attributes {'binding:host_id': 'compute-2.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Jan 20 10:04:35 np0005588920 nova_compute[226886]: 2026-01-20 15:04:35.949 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:36.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:36 np0005588920 nova_compute[226886]: 2026-01-20 15:04:36.515 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:37 np0005588920 nova_compute[226886]: 2026-01-20 15:04:37.174 226890 DEBUG oslo_concurrency.lockutils [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Acquiring lock "refresh_cache-a0ce16c6-2b75-472f-a785-890fbb0d748e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:04:37 np0005588920 nova_compute[226886]: 2026-01-20 15:04:37.175 226890 DEBUG oslo_concurrency.lockutils [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Acquired lock "refresh_cache-a0ce16c6-2b75-472f-a785-890fbb0d748e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:04:37 np0005588920 nova_compute[226886]: 2026-01-20 15:04:37.175 226890 DEBUG nova.network.neutron [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:04:37 np0005588920 nova_compute[226886]: 2026-01-20 15:04:37.390 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:37 np0005588920 nova_compute[226886]: 2026-01-20 15:04:37.452 226890 DEBUG nova.compute.manager [req-291b610b-892f-4482-96b0-346a53b3a62e req-972a45cf-3f3a-4403-be29-8a9ff0dea55d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Received event network-changed-7b2aa669-8f25-4d67-b56d-f9a96e1774a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:04:37 np0005588920 nova_compute[226886]: 2026-01-20 15:04:37.453 226890 DEBUG nova.compute.manager [req-291b610b-892f-4482-96b0-346a53b3a62e req-972a45cf-3f3a-4403-be29-8a9ff0dea55d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Refreshing instance network info cache due to event network-changed-7b2aa669-8f25-4d67-b56d-f9a96e1774a4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:04:37 np0005588920 nova_compute[226886]: 2026-01-20 15:04:37.453 226890 DEBUG oslo_concurrency.lockutils [req-291b610b-892f-4482-96b0-346a53b3a62e req-972a45cf-3f3a-4403-be29-8a9ff0dea55d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-a0ce16c6-2b75-472f-a785-890fbb0d748e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:04:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:37.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e353 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:04:37 np0005588920 nova_compute[226886]: 2026-01-20 15:04:37.700 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768921462.6997058, 75368220-ff38-456b-a0e6-ae1c02625514 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:04:37 np0005588920 nova_compute[226886]: 2026-01-20 15:04:37.701 226890 INFO nova.compute.manager [-] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:04:37 np0005588920 nova_compute[226886]: 2026-01-20 15:04:37.732 226890 DEBUG nova.compute.manager [None req-9ad9dab8-0391-47dc-a4e1-fb1f996ee0f0 - - - - - -] [instance: 75368220-ff38-456b-a0e6-ae1c02625514] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:04:37 np0005588920 podman[285764]: 2026-01-20 15:04:37.992928153 +0000 UTC m=+0.082011692 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 20 10:04:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:38.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:38.828 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=50, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=49) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:04:38 np0005588920 nova_compute[226886]: 2026-01-20 15:04:38.829 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:38.829 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:04:39 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:04:39 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:04:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:39.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:40.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:40 np0005588920 nova_compute[226886]: 2026-01-20 15:04:40.549 226890 DEBUG oslo_concurrency.lockutils [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Acquiring lock "3e59fbab-2129-45b7-8fb1-997b2ccede64" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:04:40 np0005588920 nova_compute[226886]: 2026-01-20 15:04:40.550 226890 DEBUG oslo_concurrency.lockutils [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "3e59fbab-2129-45b7-8fb1-997b2ccede64" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:04:40 np0005588920 nova_compute[226886]: 2026-01-20 15:04:40.580 226890 DEBUG nova.compute.manager [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 10:04:40 np0005588920 nova_compute[226886]: 2026-01-20 15:04:40.632 226890 DEBUG nova.network.neutron [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Updating instance_info_cache with network_info: [{"id": "7b2aa669-8f25-4d67-b56d-f9a96e1774a4", "address": "fa:16:3e:f8:b3:b3", "network": {"id": "0f434e83-45c8-454d-820b-af39b696a1d5", "bridge": "br-int", "label": "tempest-TestShelveInstance-1862824485-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0fc924d2df984301897e81920c5e192f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b2aa669-8f", "ovs_interfaceid": "7b2aa669-8f25-4d67-b56d-f9a96e1774a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:04:40 np0005588920 nova_compute[226886]: 2026-01-20 15:04:40.655 226890 DEBUG oslo_concurrency.lockutils [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Releasing lock "refresh_cache-a0ce16c6-2b75-472f-a785-890fbb0d748e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:04:40 np0005588920 nova_compute[226886]: 2026-01-20 15:04:40.656 226890 DEBUG nova.virt.libvirt.driver [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 10:04:40 np0005588920 nova_compute[226886]: 2026-01-20 15:04:40.657 226890 INFO nova.virt.libvirt.driver [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Creating image(s)#033[00m
Jan 20 10:04:40 np0005588920 nova_compute[226886]: 2026-01-20 15:04:40.683 226890 DEBUG nova.storage.rbd_utils [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] rbd image a0ce16c6-2b75-472f-a785-890fbb0d748e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:04:40 np0005588920 nova_compute[226886]: 2026-01-20 15:04:40.688 226890 DEBUG nova.objects.instance [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lazy-loading 'trusted_certs' on Instance uuid a0ce16c6-2b75-472f-a785-890fbb0d748e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 10:04:40 np0005588920 nova_compute[226886]: 2026-01-20 15:04:40.690 226890 DEBUG oslo_concurrency.lockutils [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:04:40 np0005588920 nova_compute[226886]: 2026-01-20 15:04:40.691 226890 DEBUG oslo_concurrency.lockutils [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:04:40 np0005588920 nova_compute[226886]: 2026-01-20 15:04:40.691 226890 DEBUG oslo_concurrency.lockutils [req-291b610b-892f-4482-96b0-346a53b3a62e req-972a45cf-3f3a-4403-be29-8a9ff0dea55d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-a0ce16c6-2b75-472f-a785-890fbb0d748e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 20 10:04:40 np0005588920 nova_compute[226886]: 2026-01-20 15:04:40.692 226890 DEBUG nova.network.neutron [req-291b610b-892f-4482-96b0-346a53b3a62e req-972a45cf-3f3a-4403-be29-8a9ff0dea55d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Refreshing network info cache for port 7b2aa669-8f25-4d67-b56d-f9a96e1774a4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 20 10:04:40 np0005588920 nova_compute[226886]: 2026-01-20 15:04:40.699 226890 DEBUG nova.virt.hardware [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 20 10:04:40 np0005588920 nova_compute[226886]: 2026-01-20 15:04:40.700 226890 INFO nova.compute.claims [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Claim successful on node compute-2.ctlplane.example.com
Jan 20 10:04:40 np0005588920 nova_compute[226886]: 2026-01-20 15:04:40.750 226890 DEBUG nova.storage.rbd_utils [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] rbd image a0ce16c6-2b75-472f-a785-890fbb0d748e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:04:40 np0005588920 nova_compute[226886]: 2026-01-20 15:04:40.779 226890 DEBUG nova.storage.rbd_utils [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] rbd image a0ce16c6-2b75-472f-a785-890fbb0d748e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:04:40 np0005588920 nova_compute[226886]: 2026-01-20 15:04:40.783 226890 DEBUG oslo_concurrency.lockutils [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Acquiring lock "a756df35f2019cee1da461910b4c21ef288b5691" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:04:40 np0005588920 nova_compute[226886]: 2026-01-20 15:04:40.784 226890 DEBUG oslo_concurrency.lockutils [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lock "a756df35f2019cee1da461910b4c21ef288b5691" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:04:40 np0005588920 nova_compute[226886]: 2026-01-20 15:04:40.951 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:04:40 np0005588920 nova_compute[226886]: 2026-01-20 15:04:40.960 226890 DEBUG oslo_concurrency.processutils [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:04:41 np0005588920 nova_compute[226886]: 2026-01-20 15:04:41.256 226890 DEBUG nova.virt.libvirt.imagebackend [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Image locations are: [{'url': 'rbd://e399cf45-e6b6-5393-99f1-75c601d3f188/images/54adca73-9204-4335-8f5e-a77f60750fc4/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://e399cf45-e6b6-5393-99f1-75c601d3f188/images/54adca73-9204-4335-8f5e-a77f60750fc4/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085
Jan 20 10:04:41 np0005588920 nova_compute[226886]: 2026-01-20 15:04:41.319 226890 DEBUG nova.virt.libvirt.imagebackend [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Selected location: {'url': 'rbd://e399cf45-e6b6-5393-99f1-75c601d3f188/images/54adca73-9204-4335-8f5e-a77f60750fc4/snap', 'metadata': {'store': 'default_backend'}} clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1094
Jan 20 10:04:41 np0005588920 nova_compute[226886]: 2026-01-20 15:04:41.320 226890 DEBUG nova.storage.rbd_utils [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] cloning images/54adca73-9204-4335-8f5e-a77f60750fc4@snap to None/a0ce16c6-2b75-472f-a785-890fbb0d748e_disk clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261
Jan 20 10:04:41 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:04:41 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3769637538' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:04:41 np0005588920 nova_compute[226886]: 2026-01-20 15:04:41.410 226890 DEBUG oslo_concurrency.processutils [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:04:41 np0005588920 nova_compute[226886]: 2026-01-20 15:04:41.417 226890 DEBUG nova.compute.provider_tree [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 20 10:04:41 np0005588920 nova_compute[226886]: 2026-01-20 15:04:41.441 226890 DEBUG oslo_concurrency.lockutils [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lock "a756df35f2019cee1da461910b4c21ef288b5691" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.657s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:04:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:41.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:41 np0005588920 nova_compute[226886]: 2026-01-20 15:04:41.492 226890 DEBUG nova.scheduler.client.report [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 20 10:04:41 np0005588920 nova_compute[226886]: 2026-01-20 15:04:41.541 226890 DEBUG oslo_concurrency.lockutils [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.850s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:04:41 np0005588920 nova_compute[226886]: 2026-01-20 15:04:41.542 226890 DEBUG nova.compute.manager [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 20 10:04:41 np0005588920 nova_compute[226886]: 2026-01-20 15:04:41.545 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:04:41 np0005588920 nova_compute[226886]: 2026-01-20 15:04:41.606 226890 DEBUG nova.objects.instance [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lazy-loading 'migration_context' on Instance uuid a0ce16c6-2b75-472f-a785-890fbb0d748e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 10:04:41 np0005588920 nova_compute[226886]: 2026-01-20 15:04:41.612 226890 DEBUG nova.compute.manager [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 20 10:04:41 np0005588920 nova_compute[226886]: 2026-01-20 15:04:41.613 226890 DEBUG nova.network.neutron [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 20 10:04:41 np0005588920 nova_compute[226886]: 2026-01-20 15:04:41.677 226890 INFO nova.virt.libvirt.driver [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 20 10:04:41 np0005588920 nova_compute[226886]: 2026-01-20 15:04:41.685 226890 DEBUG nova.storage.rbd_utils [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] flattening vms/a0ce16c6-2b75-472f-a785-890fbb0d748e_disk flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314
Jan 20 10:04:41 np0005588920 nova_compute[226886]: 2026-01-20 15:04:41.775 226890 DEBUG nova.compute.manager [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 20 10:04:41 np0005588920 nova_compute[226886]: 2026-01-20 15:04:41.906 226890 DEBUG nova.compute.manager [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 20 10:04:41 np0005588920 nova_compute[226886]: 2026-01-20 15:04:41.907 226890 DEBUG nova.virt.libvirt.driver [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 20 10:04:41 np0005588920 nova_compute[226886]: 2026-01-20 15:04:41.908 226890 INFO nova.virt.libvirt.driver [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Creating image(s)
Jan 20 10:04:41 np0005588920 nova_compute[226886]: 2026-01-20 15:04:41.933 226890 DEBUG nova.storage.rbd_utils [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] rbd image 3e59fbab-2129-45b7-8fb1-997b2ccede64_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:04:41 np0005588920 nova_compute[226886]: 2026-01-20 15:04:41.977 226890 DEBUG nova.storage.rbd_utils [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] rbd image 3e59fbab-2129-45b7-8fb1-997b2ccede64_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:04:42 np0005588920 nova_compute[226886]: 2026-01-20 15:04:42.000 226890 DEBUG nova.storage.rbd_utils [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] rbd image 3e59fbab-2129-45b7-8fb1-997b2ccede64_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:04:42 np0005588920 nova_compute[226886]: 2026-01-20 15:04:42.003 226890 DEBUG oslo_concurrency.processutils [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:04:42 np0005588920 nova_compute[226886]: 2026-01-20 15:04:42.055 226890 DEBUG nova.policy [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '912329b1a6ad42bdb72e952c03983bdf', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '96f7b14c2a9348f08305fe232df2a603', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 20 10:04:42 np0005588920 nova_compute[226886]: 2026-01-20 15:04:42.084 226890 DEBUG oslo_concurrency.processutils [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:04:42 np0005588920 nova_compute[226886]: 2026-01-20 15:04:42.085 226890 DEBUG oslo_concurrency.lockutils [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:04:42 np0005588920 nova_compute[226886]: 2026-01-20 15:04:42.085 226890 DEBUG oslo_concurrency.lockutils [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:04:42 np0005588920 nova_compute[226886]: 2026-01-20 15:04:42.086 226890 DEBUG oslo_concurrency.lockutils [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:04:42 np0005588920 nova_compute[226886]: 2026-01-20 15:04:42.112 226890 DEBUG nova.storage.rbd_utils [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] rbd image 3e59fbab-2129-45b7-8fb1-997b2ccede64_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:04:42 np0005588920 nova_compute[226886]: 2026-01-20 15:04:42.116 226890 DEBUG oslo_concurrency.processutils [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 3e59fbab-2129-45b7-8fb1-997b2ccede64_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:04:42 np0005588920 nova_compute[226886]: 2026-01-20 15:04:42.293 226890 DEBUG nova.virt.libvirt.driver [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Image rbd:vms/a0ce16c6-2b75-472f-a785-890fbb0d748e_disk:id=openstack:conf=/etc/ceph/ceph.conf flattened successfully while unshelving instance. _try_fetch_image_cache /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11007
Jan 20 10:04:42 np0005588920 nova_compute[226886]: 2026-01-20 15:04:42.294 226890 DEBUG nova.virt.libvirt.driver [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 20 10:04:42 np0005588920 nova_compute[226886]: 2026-01-20 15:04:42.295 226890 DEBUG nova.virt.libvirt.driver [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Ensure instance console log exists: /var/lib/nova/instances/a0ce16c6-2b75-472f-a785-890fbb0d748e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 20 10:04:42 np0005588920 nova_compute[226886]: 2026-01-20 15:04:42.295 226890 DEBUG oslo_concurrency.lockutils [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:04:42 np0005588920 nova_compute[226886]: 2026-01-20 15:04:42.296 226890 DEBUG oslo_concurrency.lockutils [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:04:42 np0005588920 nova_compute[226886]: 2026-01-20 15:04:42.296 226890 DEBUG oslo_concurrency.lockutils [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:04:42 np0005588920 nova_compute[226886]: 2026-01-20 15:04:42.298 226890 DEBUG nova.virt.libvirt.driver [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Start _get_guest_xml network_info=[{"id": "7b2aa669-8f25-4d67-b56d-f9a96e1774a4", "address": "fa:16:3e:f8:b3:b3", "network": {"id": "0f434e83-45c8-454d-820b-af39b696a1d5", "bridge": "br-int", "label": "tempest-TestShelveInstance-1862824485-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0fc924d2df984301897e81920c5e192f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b2aa669-8f", "ovs_interfaceid": "7b2aa669-8f25-4d67-b56d-f9a96e1774a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='',container_format='bare',created_at=2026-01-20T15:04:08Z,direct_url=<?>,disk_format='raw',id=54adca73-9204-4335-8f5e-a77f60750fc4,min_disk=1,min_ram=0,name='tempest-TestShelveInstance-server-1036076849-shelved',owner='0fc924d2df984301897e81920c5e192f',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2026-01-20T15:04:19Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 20 10:04:42 np0005588920 nova_compute[226886]: 2026-01-20 15:04:42.302 226890 WARNING nova.virt.libvirt.driver [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 20 10:04:42 np0005588920 nova_compute[226886]: 2026-01-20 15:04:42.309 226890 DEBUG nova.virt.libvirt.host [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 20 10:04:42 np0005588920 nova_compute[226886]: 2026-01-20 15:04:42.309 226890 DEBUG nova.virt.libvirt.host [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 20 10:04:42 np0005588920 nova_compute[226886]: 2026-01-20 15:04:42.313 226890 DEBUG nova.virt.libvirt.host [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 20 10:04:42 np0005588920 nova_compute[226886]: 2026-01-20 15:04:42.313 226890 DEBUG nova.virt.libvirt.host [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 20 10:04:42 np0005588920 nova_compute[226886]: 2026-01-20 15:04:42.315 226890 DEBUG nova.virt.libvirt.driver [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 20 10:04:42 np0005588920 nova_compute[226886]: 2026-01-20 15:04:42.315 226890 DEBUG nova.virt.hardware [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='',container_format='bare',created_at=2026-01-20T15:04:08Z,direct_url=<?>,disk_format='raw',id=54adca73-9204-4335-8f5e-a77f60750fc4,min_disk=1,min_ram=0,name='tempest-TestShelveInstance-server-1036076849-shelved',owner='0fc924d2df984301897e81920c5e192f',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2026-01-20T15:04:19Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 20 10:04:42 np0005588920 nova_compute[226886]: 2026-01-20 15:04:42.316 226890 DEBUG nova.virt.hardware [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 20 10:04:42 np0005588920 nova_compute[226886]: 2026-01-20 15:04:42.316 226890 DEBUG nova.virt.hardware [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 20 10:04:42 np0005588920 nova_compute[226886]: 2026-01-20 15:04:42.316 226890 DEBUG nova.virt.hardware [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 20 10:04:42 np0005588920 nova_compute[226886]: 2026-01-20 15:04:42.316 226890 DEBUG nova.virt.hardware [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 20 10:04:42 np0005588920 nova_compute[226886]: 2026-01-20 15:04:42.316 226890 DEBUG nova.virt.hardware [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 20 10:04:42 np0005588920 nova_compute[226886]: 2026-01-20 15:04:42.317 226890 DEBUG nova.virt.hardware [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 20 10:04:42 np0005588920 nova_compute[226886]: 2026-01-20 15:04:42.317 226890 DEBUG nova.virt.hardware [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 20 10:04:42 np0005588920 nova_compute[226886]: 2026-01-20 15:04:42.317 226890 DEBUG nova.virt.hardware [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 20 10:04:42 np0005588920 nova_compute[226886]: 2026-01-20 15:04:42.318 226890 DEBUG nova.virt.hardware [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 20 10:04:42 np0005588920 nova_compute[226886]: 2026-01-20 15:04:42.318 226890 DEBUG nova.virt.hardware [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 20 10:04:42 np0005588920 nova_compute[226886]: 2026-01-20 15:04:42.318 226890 DEBUG nova.objects.instance [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lazy-loading 'vcpu_model' on Instance uuid a0ce16c6-2b75-472f-a785-890fbb0d748e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 10:04:42 np0005588920 nova_compute[226886]: 2026-01-20 15:04:42.336 226890 DEBUG oslo_concurrency.processutils [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:04:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:04:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:42.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:04:42 np0005588920 nova_compute[226886]: 2026-01-20 15:04:42.453 226890 DEBUG oslo_concurrency.processutils [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 3e59fbab-2129-45b7-8fb1-997b2ccede64_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.337s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:04:42 np0005588920 nova_compute[226886]: 2026-01-20 15:04:42.545 226890 DEBUG nova.storage.rbd_utils [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] resizing rbd image 3e59fbab-2129-45b7-8fb1-997b2ccede64_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 10:04:42 np0005588920 nova_compute[226886]: 2026-01-20 15:04:42.679 226890 DEBUG nova.objects.instance [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lazy-loading 'migration_context' on Instance uuid 3e59fbab-2129-45b7-8fb1-997b2ccede64 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:04:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e353 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:04:42 np0005588920 nova_compute[226886]: 2026-01-20 15:04:42.704 226890 DEBUG nova.virt.libvirt.driver [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 10:04:42 np0005588920 nova_compute[226886]: 2026-01-20 15:04:42.705 226890 DEBUG nova.virt.libvirt.driver [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Ensure instance console log exists: /var/lib/nova/instances/3e59fbab-2129-45b7-8fb1-997b2ccede64/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:04:42 np0005588920 nova_compute[226886]: 2026-01-20 15:04:42.706 226890 DEBUG oslo_concurrency.lockutils [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:04:42 np0005588920 nova_compute[226886]: 2026-01-20 15:04:42.706 226890 DEBUG oslo_concurrency.lockutils [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:04:42 np0005588920 nova_compute[226886]: 2026-01-20 15:04:42.707 226890 DEBUG oslo_concurrency.lockutils [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:04:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:04:42 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2179692119' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:04:42 np0005588920 nova_compute[226886]: 2026-01-20 15:04:42.813 226890 DEBUG oslo_concurrency.processutils [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.478s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:04:42 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:42.832 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '50'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:04:42 np0005588920 nova_compute[226886]: 2026-01-20 15:04:42.839 226890 DEBUG nova.storage.rbd_utils [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] rbd image a0ce16c6-2b75-472f-a785-890fbb0d748e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:04:42 np0005588920 nova_compute[226886]: 2026-01-20 15:04:42.844 226890 DEBUG oslo_concurrency.processutils [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:04:43 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:04:43 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3343341651' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:04:43 np0005588920 nova_compute[226886]: 2026-01-20 15:04:43.357 226890 DEBUG oslo_concurrency.processutils [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.513s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:04:43 np0005588920 nova_compute[226886]: 2026-01-20 15:04:43.359 226890 DEBUG nova.virt.libvirt.vif [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-20T15:03:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestShelveInstance-server-1036076849',display_name='tempest-TestShelveInstance-server-1036076849',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testshelveinstance-server-1036076849',id=158,image_ref='54adca73-9204-4335-8f5e-a77f60750fc4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-TestShelveInstance-624896822',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:03:45Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='0fc924d2df984301897e81920c5e192f',ramdisk_id='',reservation_id='r-18lfln0m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',ima
ge_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestShelveInstance-1425544575',owner_user_name='tempest-TestShelveInstance-1425544575-project-member',shelved_at='2026-01-20T15:04:19.981727',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='54adca73-9204-4335-8f5e-a77f60750fc4'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:04:34Z,user_data=None,user_id='b02a8ef6cc3946ceb2c8846aae2eae68',uuid=a0ce16c6-2b75-472f-a785-890fbb0d748e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "7b2aa669-8f25-4d67-b56d-f9a96e1774a4", "address": "fa:16:3e:f8:b3:b3", "network": {"id": "0f434e83-45c8-454d-820b-af39b696a1d5", "bridge": "br-int", "label": "tempest-TestShelveInstance-1862824485-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0fc924d2df984301897e81920c5e192f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b2aa669-8f", "ovs_interfaceid": "7b2aa669-8f25-4d67-b56d-f9a96e1774a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:04:43 np0005588920 nova_compute[226886]: 2026-01-20 15:04:43.359 226890 DEBUG nova.network.os_vif_util [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Converting VIF {"id": "7b2aa669-8f25-4d67-b56d-f9a96e1774a4", "address": "fa:16:3e:f8:b3:b3", "network": {"id": "0f434e83-45c8-454d-820b-af39b696a1d5", "bridge": "br-int", "label": "tempest-TestShelveInstance-1862824485-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0fc924d2df984301897e81920c5e192f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b2aa669-8f", "ovs_interfaceid": "7b2aa669-8f25-4d67-b56d-f9a96e1774a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:04:43 np0005588920 nova_compute[226886]: 2026-01-20 15:04:43.360 226890 DEBUG nova.network.os_vif_util [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f8:b3:b3,bridge_name='br-int',has_traffic_filtering=True,id=7b2aa669-8f25-4d67-b56d-f9a96e1774a4,network=Network(0f434e83-45c8-454d-820b-af39b696a1d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b2aa669-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:04:43 np0005588920 nova_compute[226886]: 2026-01-20 15:04:43.361 226890 DEBUG nova.objects.instance [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lazy-loading 'pci_devices' on Instance uuid a0ce16c6-2b75-472f-a785-890fbb0d748e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:04:43 np0005588920 nova_compute[226886]: 2026-01-20 15:04:43.389 226890 DEBUG nova.virt.libvirt.driver [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:04:43 np0005588920 nova_compute[226886]:  <uuid>a0ce16c6-2b75-472f-a785-890fbb0d748e</uuid>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:  <name>instance-0000009e</name>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:04:43 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:      <nova:name>tempest-TestShelveInstance-server-1036076849</nova:name>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 15:04:42</nova:creationTime>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 10:04:43 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:        <nova:user uuid="b02a8ef6cc3946ceb2c8846aae2eae68">tempest-TestShelveInstance-1425544575-project-member</nova:user>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:        <nova:project uuid="0fc924d2df984301897e81920c5e192f">tempest-TestShelveInstance-1425544575</nova:project>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="54adca73-9204-4335-8f5e-a77f60750fc4"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:        <nova:port uuid="7b2aa669-8f25-4d67-b56d-f9a96e1774a4">
Jan 20 10:04:43 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    <system>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:      <entry name="serial">a0ce16c6-2b75-472f-a785-890fbb0d748e</entry>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:      <entry name="uuid">a0ce16c6-2b75-472f-a785-890fbb0d748e</entry>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    </system>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:  <os>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:  </os>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:  <features>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:  </features>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:  </clock>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:  <devices>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 10:04:43 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/a0ce16c6-2b75-472f-a785-890fbb0d748e_disk">
Jan 20 10:04:43 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:04:43 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 10:04:43 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/a0ce16c6-2b75-472f-a785-890fbb0d748e_disk.config">
Jan 20 10:04:43 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:04:43 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 10:04:43 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:f8:b3:b3"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:      <target dev="tap7b2aa669-8f"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    </interface>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 10:04:43 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/a0ce16c6-2b75-472f-a785-890fbb0d748e/console.log" append="off"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    </serial>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    <video>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    </video>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    <input type="keyboard" bus="usb"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 10:04:43 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    </rng>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 10:04:43 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 10:04:43 np0005588920 nova_compute[226886]:  </devices>
Jan 20 10:04:43 np0005588920 nova_compute[226886]: </domain>
Jan 20 10:04:43 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:04:43 np0005588920 nova_compute[226886]: 2026-01-20 15:04:43.390 226890 DEBUG nova.compute.manager [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Preparing to wait for external event network-vif-plugged-7b2aa669-8f25-4d67-b56d-f9a96e1774a4 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:04:43 np0005588920 nova_compute[226886]: 2026-01-20 15:04:43.391 226890 DEBUG oslo_concurrency.lockutils [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Acquiring lock "a0ce16c6-2b75-472f-a785-890fbb0d748e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:04:43 np0005588920 nova_compute[226886]: 2026-01-20 15:04:43.391 226890 DEBUG oslo_concurrency.lockutils [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lock "a0ce16c6-2b75-472f-a785-890fbb0d748e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:04:43 np0005588920 nova_compute[226886]: 2026-01-20 15:04:43.391 226890 DEBUG oslo_concurrency.lockutils [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lock "a0ce16c6-2b75-472f-a785-890fbb0d748e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:04:43 np0005588920 nova_compute[226886]: 2026-01-20 15:04:43.392 226890 DEBUG nova.virt.libvirt.vif [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-20T15:03:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestShelveInstance-server-1036076849',display_name='tempest-TestShelveInstance-server-1036076849',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testshelveinstance-server-1036076849',id=158,image_ref='54adca73-9204-4335-8f5e-a77f60750fc4',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-TestShelveInstance-624896822',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:03:45Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='0fc924d2df984301897e81920c5e192f',ramdisk_id='',reservation_id='r-18lfln0m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='v
irtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestShelveInstance-1425544575',owner_user_name='tempest-TestShelveInstance-1425544575-project-member',shelved_at='2026-01-20T15:04:19.981727',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='54adca73-9204-4335-8f5e-a77f60750fc4'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:04:34Z,user_data=None,user_id='b02a8ef6cc3946ceb2c8846aae2eae68',uuid=a0ce16c6-2b75-472f-a785-890fbb0d748e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "7b2aa669-8f25-4d67-b56d-f9a96e1774a4", "address": "fa:16:3e:f8:b3:b3", "network": {"id": "0f434e83-45c8-454d-820b-af39b696a1d5", "bridge": "br-int", "label": "tempest-TestShelveInstance-1862824485-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0fc924d2df984301897e81920c5e192f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b2aa669-8f", "ovs_interfaceid": "7b2aa669-8f25-4d67-b56d-f9a96e1774a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:04:43 np0005588920 nova_compute[226886]: 2026-01-20 15:04:43.392 226890 DEBUG nova.network.os_vif_util [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Converting VIF {"id": "7b2aa669-8f25-4d67-b56d-f9a96e1774a4", "address": "fa:16:3e:f8:b3:b3", "network": {"id": "0f434e83-45c8-454d-820b-af39b696a1d5", "bridge": "br-int", "label": "tempest-TestShelveInstance-1862824485-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0fc924d2df984301897e81920c5e192f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b2aa669-8f", "ovs_interfaceid": "7b2aa669-8f25-4d67-b56d-f9a96e1774a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:04:43 np0005588920 nova_compute[226886]: 2026-01-20 15:04:43.393 226890 DEBUG nova.network.os_vif_util [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f8:b3:b3,bridge_name='br-int',has_traffic_filtering=True,id=7b2aa669-8f25-4d67-b56d-f9a96e1774a4,network=Network(0f434e83-45c8-454d-820b-af39b696a1d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b2aa669-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:04:43 np0005588920 nova_compute[226886]: 2026-01-20 15:04:43.393 226890 DEBUG os_vif [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f8:b3:b3,bridge_name='br-int',has_traffic_filtering=True,id=7b2aa669-8f25-4d67-b56d-f9a96e1774a4,network=Network(0f434e83-45c8-454d-820b-af39b696a1d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b2aa669-8f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:04:43 np0005588920 nova_compute[226886]: 2026-01-20 15:04:43.394 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:43 np0005588920 nova_compute[226886]: 2026-01-20 15:04:43.394 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:04:43 np0005588920 nova_compute[226886]: 2026-01-20 15:04:43.394 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:04:43 np0005588920 nova_compute[226886]: 2026-01-20 15:04:43.397 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:43 np0005588920 nova_compute[226886]: 2026-01-20 15:04:43.397 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7b2aa669-8f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:04:43 np0005588920 nova_compute[226886]: 2026-01-20 15:04:43.398 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7b2aa669-8f, col_values=(('external_ids', {'iface-id': '7b2aa669-8f25-4d67-b56d-f9a96e1774a4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f8:b3:b3', 'vm-uuid': 'a0ce16c6-2b75-472f-a785-890fbb0d748e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:04:43 np0005588920 nova_compute[226886]: 2026-01-20 15:04:43.400 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:43 np0005588920 NetworkManager[49076]: <info>  [1768921483.4008] manager: (tap7b2aa669-8f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/358)
Jan 20 10:04:43 np0005588920 nova_compute[226886]: 2026-01-20 15:04:43.402 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:04:43 np0005588920 nova_compute[226886]: 2026-01-20 15:04:43.406 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:43 np0005588920 nova_compute[226886]: 2026-01-20 15:04:43.407 226890 INFO os_vif [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f8:b3:b3,bridge_name='br-int',has_traffic_filtering=True,id=7b2aa669-8f25-4d67-b56d-f9a96e1774a4,network=Network(0f434e83-45c8-454d-820b-af39b696a1d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b2aa669-8f')#033[00m
Jan 20 10:04:43 np0005588920 nova_compute[226886]: 2026-01-20 15:04:43.467 226890 DEBUG nova.virt.libvirt.driver [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:04:43 np0005588920 nova_compute[226886]: 2026-01-20 15:04:43.468 226890 DEBUG nova.virt.libvirt.driver [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:04:43 np0005588920 nova_compute[226886]: 2026-01-20 15:04:43.468 226890 DEBUG nova.virt.libvirt.driver [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] No VIF found with MAC fa:16:3e:f8:b3:b3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:04:43 np0005588920 nova_compute[226886]: 2026-01-20 15:04:43.469 226890 INFO nova.virt.libvirt.driver [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Using config drive#033[00m
Jan 20 10:04:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:43.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:43 np0005588920 nova_compute[226886]: 2026-01-20 15:04:43.492 226890 DEBUG nova.storage.rbd_utils [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] rbd image a0ce16c6-2b75-472f-a785-890fbb0d748e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:04:43 np0005588920 nova_compute[226886]: 2026-01-20 15:04:43.521 226890 DEBUG nova.objects.instance [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lazy-loading 'ec2_ids' on Instance uuid a0ce16c6-2b75-472f-a785-890fbb0d748e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:04:43 np0005588920 nova_compute[226886]: 2026-01-20 15:04:43.628 226890 DEBUG nova.objects.instance [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lazy-loading 'keypairs' on Instance uuid a0ce16c6-2b75-472f-a785-890fbb0d748e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:04:43 np0005588920 nova_compute[226886]: 2026-01-20 15:04:43.688 226890 DEBUG nova.network.neutron [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Successfully created port: 8c5a8745-5b2f-47c8-9968-acd29a3f46c6 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 10:04:43 np0005588920 nova_compute[226886]: 2026-01-20 15:04:43.875 226890 DEBUG nova.network.neutron [req-291b610b-892f-4482-96b0-346a53b3a62e req-972a45cf-3f3a-4403-be29-8a9ff0dea55d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Updated VIF entry in instance network info cache for port 7b2aa669-8f25-4d67-b56d-f9a96e1774a4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:04:43 np0005588920 nova_compute[226886]: 2026-01-20 15:04:43.875 226890 DEBUG nova.network.neutron [req-291b610b-892f-4482-96b0-346a53b3a62e req-972a45cf-3f3a-4403-be29-8a9ff0dea55d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Updating instance_info_cache with network_info: [{"id": "7b2aa669-8f25-4d67-b56d-f9a96e1774a4", "address": "fa:16:3e:f8:b3:b3", "network": {"id": "0f434e83-45c8-454d-820b-af39b696a1d5", "bridge": "br-int", "label": "tempest-TestShelveInstance-1862824485-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0fc924d2df984301897e81920c5e192f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b2aa669-8f", "ovs_interfaceid": "7b2aa669-8f25-4d67-b56d-f9a96e1774a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:04:43 np0005588920 nova_compute[226886]: 2026-01-20 15:04:43.896 226890 DEBUG oslo_concurrency.lockutils [req-291b610b-892f-4482-96b0-346a53b3a62e req-972a45cf-3f3a-4403-be29-8a9ff0dea55d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-a0ce16c6-2b75-472f-a785-890fbb0d748e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:04:44 np0005588920 nova_compute[226886]: 2026-01-20 15:04:44.203 226890 INFO nova.virt.libvirt.driver [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Creating config drive at /var/lib/nova/instances/a0ce16c6-2b75-472f-a785-890fbb0d748e/disk.config#033[00m
Jan 20 10:04:44 np0005588920 nova_compute[226886]: 2026-01-20 15:04:44.207 226890 DEBUG oslo_concurrency.processutils [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a0ce16c6-2b75-472f-a785-890fbb0d748e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp25mr2ci1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:04:44 np0005588920 nova_compute[226886]: 2026-01-20 15:04:44.359 226890 DEBUG oslo_concurrency.processutils [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a0ce16c6-2b75-472f-a785-890fbb0d748e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp25mr2ci1" returned: 0 in 0.151s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:04:44 np0005588920 nova_compute[226886]: 2026-01-20 15:04:44.387 226890 DEBUG nova.storage.rbd_utils [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] rbd image a0ce16c6-2b75-472f-a785-890fbb0d748e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:04:44 np0005588920 nova_compute[226886]: 2026-01-20 15:04:44.391 226890 DEBUG oslo_concurrency.processutils [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/a0ce16c6-2b75-472f-a785-890fbb0d748e/disk.config a0ce16c6-2b75-472f-a785-890fbb0d748e_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:04:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:44.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:44 np0005588920 nova_compute[226886]: 2026-01-20 15:04:44.551 226890 DEBUG oslo_concurrency.processutils [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/a0ce16c6-2b75-472f-a785-890fbb0d748e/disk.config a0ce16c6-2b75-472f-a785-890fbb0d748e_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.160s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:04:44 np0005588920 nova_compute[226886]: 2026-01-20 15:04:44.552 226890 INFO nova.virt.libvirt.driver [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Deleting local config drive /var/lib/nova/instances/a0ce16c6-2b75-472f-a785-890fbb0d748e/disk.config because it was imported into RBD.#033[00m
Jan 20 10:04:44 np0005588920 kernel: tap7b2aa669-8f: entered promiscuous mode
Jan 20 10:04:44 np0005588920 NetworkManager[49076]: <info>  [1768921484.6045] manager: (tap7b2aa669-8f): new Tun device (/org/freedesktop/NetworkManager/Devices/359)
Jan 20 10:04:44 np0005588920 ovn_controller[133971]: 2026-01-20T15:04:44Z|00742|binding|INFO|Claiming lport 7b2aa669-8f25-4d67-b56d-f9a96e1774a4 for this chassis.
Jan 20 10:04:44 np0005588920 ovn_controller[133971]: 2026-01-20T15:04:44Z|00743|binding|INFO|7b2aa669-8f25-4d67-b56d-f9a96e1774a4: Claiming fa:16:3e:f8:b3:b3 10.100.0.12
Jan 20 10:04:44 np0005588920 nova_compute[226886]: 2026-01-20 15:04:44.605 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:44.611 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:b3:b3 10.100.0.12'], port_security=['fa:16:3e:f8:b3:b3 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'a0ce16c6-2b75-472f-a785-890fbb0d748e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0f434e83-45c8-454d-820b-af39b696a1d5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0fc924d2df984301897e81920c5e192f', 'neutron:revision_number': '7', 'neutron:security_group_ids': '73ae63f6-3a5a-4604-9d46-53d9b9e08225', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.199'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cf0669b1-9b02-4bfa-859e-dac906b93fdc, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=7b2aa669-8f25-4d67-b56d-f9a96e1774a4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:44.612 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 7b2aa669-8f25-4d67-b56d-f9a96e1774a4 in datapath 0f434e83-45c8-454d-820b-af39b696a1d5 bound to our chassis#033[00m
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:44.613 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0f434e83-45c8-454d-820b-af39b696a1d5#033[00m
Jan 20 10:04:44 np0005588920 ovn_controller[133971]: 2026-01-20T15:04:44Z|00744|binding|INFO|Setting lport 7b2aa669-8f25-4d67-b56d-f9a96e1774a4 ovn-installed in OVS
Jan 20 10:04:44 np0005588920 ovn_controller[133971]: 2026-01-20T15:04:44Z|00745|binding|INFO|Setting lport 7b2aa669-8f25-4d67-b56d-f9a96e1774a4 up in Southbound
Jan 20 10:04:44 np0005588920 nova_compute[226886]: 2026-01-20 15:04:44.623 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:44.624 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[7484a184-acb6-4151-a311-18cf7c448209]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:44.625 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0f434e83-41 in ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:04:44 np0005588920 nova_compute[226886]: 2026-01-20 15:04:44.627 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:44.626 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0f434e83-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:44.626 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d0dc47c3-e72e-40b0-bfdd-0686d534cee3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:44.628 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[63c68349-b7a7-4f1c-b22b-41a95414d286]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:44.639 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[fd95d33a-92c1-4777-8a92-c67fcce39c86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:44 np0005588920 systemd-machined[196121]: New machine qemu-75-instance-0000009e.
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:44.655 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ffdea988-81e8-49ea-a562-b8c7110e18dd]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:44 np0005588920 systemd[1]: Started Virtual Machine qemu-75-instance-0000009e.
Jan 20 10:04:44 np0005588920 systemd-udevd[286383]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:04:44 np0005588920 NetworkManager[49076]: <info>  [1768921484.6835] device (tap7b2aa669-8f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:04:44 np0005588920 NetworkManager[49076]: <info>  [1768921484.6847] device (tap7b2aa669-8f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:44.685 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[a4451139-6a54-4ad0-9636-26a4d88dfad6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:44.691 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[dfa9fabe-0b9f-4859-a486-7d7f25613819]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:44 np0005588920 NetworkManager[49076]: <info>  [1768921484.6918] manager: (tap0f434e83-40): new Veth device (/org/freedesktop/NetworkManager/Devices/360)
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:44.719 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[1cc41f3f-30cf-41bd-95a8-fe3bdfc9b291]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:44.722 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[0b5aa1fb-7c7c-486c-8d4b-a552d66287c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:44 np0005588920 NetworkManager[49076]: <info>  [1768921484.7475] device (tap0f434e83-40): carrier: link connected
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:44.753 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[9d8889f3-1858-4ac3-ba1d-066223f6a630]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:44.771 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[092aeb31-3c1a-4612-b8b6-b79b82081c37]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0f434e83-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2e:12:8d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 237], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 652643, 'reachable_time': 41623, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 286412, 'error': None, 'target': 'ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:44 np0005588920 nova_compute[226886]: 2026-01-20 15:04:44.817 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:44.829 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[086a656d-7946-4b4c-976b-eb04d168db92]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2e:128d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 652643, 'tstamp': 652643}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 286413, 'error': None, 'target': 'ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:44.846 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[78ca2045-7eef-4006-b57c-6df00a03a751]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0f434e83-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2e:12:8d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 237], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 652643, 'reachable_time': 41623, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 286414, 'error': None, 'target': 'ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:44.883 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[7c367772-7426-410e-94f6-9d164241904c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:44.943 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[9496efaa-8d46-42b1-a690-d055c60db670]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:44.945 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0f434e83-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:44.945 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:44.946 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0f434e83-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:04:44 np0005588920 nova_compute[226886]: 2026-01-20 15:04:44.948 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:44 np0005588920 NetworkManager[49076]: <info>  [1768921484.9491] manager: (tap0f434e83-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/361)
Jan 20 10:04:44 np0005588920 kernel: tap0f434e83-40: entered promiscuous mode
Jan 20 10:04:44 np0005588920 nova_compute[226886]: 2026-01-20 15:04:44.952 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:44.955 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0f434e83-40, col_values=(('external_ids', {'iface-id': '6133323e-bf50-4bbd-bc0b-9ecf135d8cd5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:04:44 np0005588920 nova_compute[226886]: 2026-01-20 15:04:44.957 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:44 np0005588920 ovn_controller[133971]: 2026-01-20T15:04:44Z|00746|binding|INFO|Releasing lport 6133323e-bf50-4bbd-bc0b-9ecf135d8cd5 from this chassis (sb_readonly=0)
Jan 20 10:04:44 np0005588920 nova_compute[226886]: 2026-01-20 15:04:44.958 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:44.960 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0f434e83-45c8-454d-820b-af39b696a1d5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0f434e83-45c8-454d-820b-af39b696a1d5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:44.962 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[1ead2acc-25f6-4292-b08a-413b81c3e715]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:44.963 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-0f434e83-45c8-454d-820b-af39b696a1d5
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/0f434e83-45c8-454d-820b-af39b696a1d5.pid.haproxy
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID 0f434e83-45c8-454d-820b-af39b696a1d5
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:04:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:44.964 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5', 'env', 'PROCESS_TAG=haproxy-0f434e83-45c8-454d-820b-af39b696a1d5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0f434e83-45c8-454d-820b-af39b696a1d5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:04:44 np0005588920 nova_compute[226886]: 2026-01-20 15:04:44.974 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:44 np0005588920 nova_compute[226886]: 2026-01-20 15:04:44.993 226890 DEBUG nova.compute.manager [req-595cc2f6-4d8d-40c9-b92c-fe891e119c91 req-b59a09bf-65f0-4f5b-ae4f-7f040be565f6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Received event network-vif-plugged-7b2aa669-8f25-4d67-b56d-f9a96e1774a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:04:44 np0005588920 nova_compute[226886]: 2026-01-20 15:04:44.993 226890 DEBUG oslo_concurrency.lockutils [req-595cc2f6-4d8d-40c9-b92c-fe891e119c91 req-b59a09bf-65f0-4f5b-ae4f-7f040be565f6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "a0ce16c6-2b75-472f-a785-890fbb0d748e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:04:44 np0005588920 nova_compute[226886]: 2026-01-20 15:04:44.993 226890 DEBUG oslo_concurrency.lockutils [req-595cc2f6-4d8d-40c9-b92c-fe891e119c91 req-b59a09bf-65f0-4f5b-ae4f-7f040be565f6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "a0ce16c6-2b75-472f-a785-890fbb0d748e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:04:44 np0005588920 nova_compute[226886]: 2026-01-20 15:04:44.994 226890 DEBUG oslo_concurrency.lockutils [req-595cc2f6-4d8d-40c9-b92c-fe891e119c91 req-b59a09bf-65f0-4f5b-ae4f-7f040be565f6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "a0ce16c6-2b75-472f-a785-890fbb0d748e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:04:44 np0005588920 nova_compute[226886]: 2026-01-20 15:04:44.994 226890 DEBUG nova.compute.manager [req-595cc2f6-4d8d-40c9-b92c-fe891e119c91 req-b59a09bf-65f0-4f5b-ae4f-7f040be565f6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Processing event network-vif-plugged-7b2aa669-8f25-4d67-b56d-f9a96e1774a4 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 10:04:45 np0005588920 nova_compute[226886]: 2026-01-20 15:04:45.020 226890 DEBUG nova.network.neutron [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Successfully updated port: 8c5a8745-5b2f-47c8-9968-acd29a3f46c6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 10:04:45 np0005588920 nova_compute[226886]: 2026-01-20 15:04:45.042 226890 DEBUG oslo_concurrency.lockutils [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Acquiring lock "refresh_cache-3e59fbab-2129-45b7-8fb1-997b2ccede64" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:04:45 np0005588920 nova_compute[226886]: 2026-01-20 15:04:45.042 226890 DEBUG oslo_concurrency.lockutils [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Acquired lock "refresh_cache-3e59fbab-2129-45b7-8fb1-997b2ccede64" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:04:45 np0005588920 nova_compute[226886]: 2026-01-20 15:04:45.043 226890 DEBUG nova.network.neutron [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:04:45 np0005588920 nova_compute[226886]: 2026-01-20 15:04:45.233 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921485.233096, a0ce16c6-2b75-472f-a785-890fbb0d748e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:04:45 np0005588920 nova_compute[226886]: 2026-01-20 15:04:45.234 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] VM Started (Lifecycle Event)#033[00m
Jan 20 10:04:45 np0005588920 nova_compute[226886]: 2026-01-20 15:04:45.236 226890 DEBUG nova.compute.manager [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:04:45 np0005588920 nova_compute[226886]: 2026-01-20 15:04:45.239 226890 DEBUG nova.virt.libvirt.driver [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:04:45 np0005588920 nova_compute[226886]: 2026-01-20 15:04:45.241 226890 INFO nova.virt.libvirt.driver [-] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Instance spawned successfully.#033[00m
Jan 20 10:04:45 np0005588920 nova_compute[226886]: 2026-01-20 15:04:45.270 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:04:45 np0005588920 nova_compute[226886]: 2026-01-20 15:04:45.274 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:04:45 np0005588920 nova_compute[226886]: 2026-01-20 15:04:45.302 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:04:45 np0005588920 nova_compute[226886]: 2026-01-20 15:04:45.302 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921485.2342358, a0ce16c6-2b75-472f-a785-890fbb0d748e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:04:45 np0005588920 nova_compute[226886]: 2026-01-20 15:04:45.303 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:04:45 np0005588920 podman[286489]: 2026-01-20 15:04:45.343138763 +0000 UTC m=+0.046911500 container create a0ad9a2bb38b8a92874cfa9e217ad93f9f590ce6cc0572aaacb22f4f27a95e95 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 20 10:04:45 np0005588920 nova_compute[226886]: 2026-01-20 15:04:45.348 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:04:45 np0005588920 nova_compute[226886]: 2026-01-20 15:04:45.354 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921485.2381399, a0ce16c6-2b75-472f-a785-890fbb0d748e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:04:45 np0005588920 nova_compute[226886]: 2026-01-20 15:04:45.354 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:04:45 np0005588920 nova_compute[226886]: 2026-01-20 15:04:45.379 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:04:45 np0005588920 nova_compute[226886]: 2026-01-20 15:04:45.383 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:04:45 np0005588920 systemd[1]: Started libpod-conmon-a0ad9a2bb38b8a92874cfa9e217ad93f9f590ce6cc0572aaacb22f4f27a95e95.scope.
Jan 20 10:04:45 np0005588920 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 20 10:04:45 np0005588920 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 20 10:04:45 np0005588920 podman[286489]: 2026-01-20 15:04:45.317144931 +0000 UTC m=+0.020917508 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:04:45 np0005588920 systemd[1]: Started libcrun container.
Jan 20 10:04:45 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3dbf459ea8215d305a748ef1d595e846fc039db812977366cc0f31b4e2093f26/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:04:45 np0005588920 nova_compute[226886]: 2026-01-20 15:04:45.430 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:04:45 np0005588920 podman[286489]: 2026-01-20 15:04:45.442110579 +0000 UTC m=+0.145883156 container init a0ad9a2bb38b8a92874cfa9e217ad93f9f590ce6cc0572aaacb22f4f27a95e95 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 20 10:04:45 np0005588920 podman[286489]: 2026-01-20 15:04:45.447738869 +0000 UTC m=+0.151511436 container start a0ad9a2bb38b8a92874cfa9e217ad93f9f590ce6cc0572aaacb22f4f27a95e95 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:04:45 np0005588920 neutron-haproxy-ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5[286504]: [NOTICE]   (286509) : New worker (286511) forked
Jan 20 10:04:45 np0005588920 neutron-haproxy-ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5[286504]: [NOTICE]   (286509) : Loading success.
Jan 20 10:04:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:45.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:45 np0005588920 nova_compute[226886]: 2026-01-20 15:04:45.521 226890 DEBUG nova.network.neutron [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:04:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:04:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:46.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:04:46 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e354 e354: 3 total, 3 up, 3 in
Jan 20 10:04:46 np0005588920 nova_compute[226886]: 2026-01-20 15:04:46.518 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:46 np0005588920 nova_compute[226886]: 2026-01-20 15:04:46.795 226890 DEBUG nova.compute.manager [req-4fb8ecaa-528f-49f2-b3ad-fc6025d2d459 req-43c2952b-a8cc-4969-9844-ecb74f861bcc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Received event network-changed-8c5a8745-5b2f-47c8-9968-acd29a3f46c6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:04:46 np0005588920 nova_compute[226886]: 2026-01-20 15:04:46.796 226890 DEBUG nova.compute.manager [req-4fb8ecaa-528f-49f2-b3ad-fc6025d2d459 req-43c2952b-a8cc-4969-9844-ecb74f861bcc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Refreshing instance network info cache due to event network-changed-8c5a8745-5b2f-47c8-9968-acd29a3f46c6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:04:46 np0005588920 nova_compute[226886]: 2026-01-20 15:04:46.797 226890 DEBUG oslo_concurrency.lockutils [req-4fb8ecaa-528f-49f2-b3ad-fc6025d2d459 req-43c2952b-a8cc-4969-9844-ecb74f861bcc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-3e59fbab-2129-45b7-8fb1-997b2ccede64" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:04:46 np0005588920 podman[286520]: 2026-01-20 15:04:46.966146615 +0000 UTC m=+0.051504311 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 20 10:04:47 np0005588920 nova_compute[226886]: 2026-01-20 15:04:47.274 226890 DEBUG nova.compute.manager [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:04:47 np0005588920 nova_compute[226886]: 2026-01-20 15:04:47.425 226890 DEBUG nova.compute.manager [req-62cd1e83-0ca3-4ab0-8603-3915d6ec43c7 req-1c07ea58-ad77-40db-b28e-b8a0926867e3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Received event network-vif-plugged-7b2aa669-8f25-4d67-b56d-f9a96e1774a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:04:47 np0005588920 nova_compute[226886]: 2026-01-20 15:04:47.425 226890 DEBUG oslo_concurrency.lockutils [req-62cd1e83-0ca3-4ab0-8603-3915d6ec43c7 req-1c07ea58-ad77-40db-b28e-b8a0926867e3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "a0ce16c6-2b75-472f-a785-890fbb0d748e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:04:47 np0005588920 nova_compute[226886]: 2026-01-20 15:04:47.425 226890 DEBUG oslo_concurrency.lockutils [req-62cd1e83-0ca3-4ab0-8603-3915d6ec43c7 req-1c07ea58-ad77-40db-b28e-b8a0926867e3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "a0ce16c6-2b75-472f-a785-890fbb0d748e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:04:47 np0005588920 nova_compute[226886]: 2026-01-20 15:04:47.425 226890 DEBUG oslo_concurrency.lockutils [req-62cd1e83-0ca3-4ab0-8603-3915d6ec43c7 req-1c07ea58-ad77-40db-b28e-b8a0926867e3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "a0ce16c6-2b75-472f-a785-890fbb0d748e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:04:47 np0005588920 nova_compute[226886]: 2026-01-20 15:04:47.426 226890 DEBUG nova.compute.manager [req-62cd1e83-0ca3-4ab0-8603-3915d6ec43c7 req-1c07ea58-ad77-40db-b28e-b8a0926867e3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] No waiting events found dispatching network-vif-plugged-7b2aa669-8f25-4d67-b56d-f9a96e1774a4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:04:47 np0005588920 nova_compute[226886]: 2026-01-20 15:04:47.426 226890 WARNING nova.compute.manager [req-62cd1e83-0ca3-4ab0-8603-3915d6ec43c7 req-1c07ea58-ad77-40db-b28e-b8a0926867e3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Received unexpected event network-vif-plugged-7b2aa669-8f25-4d67-b56d-f9a96e1774a4 for instance with vm_state shelved_offloaded and task_state spawning.#033[00m
Jan 20 10:04:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:47.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:47 np0005588920 nova_compute[226886]: 2026-01-20 15:04:47.543 226890 DEBUG oslo_concurrency.lockutils [None req-e50c3e12-d717-49ae-916c-2cbc2ec2ed6c b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lock "a0ce16c6-2b75-472f-a785-890fbb0d748e" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 13.252s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:04:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e354 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:04:47 np0005588920 nova_compute[226886]: 2026-01-20 15:04:47.939 226890 DEBUG nova.network.neutron [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Updating instance_info_cache with network_info: [{"id": "8c5a8745-5b2f-47c8-9968-acd29a3f46c6", "address": "fa:16:3e:98:e3:f6", "network": {"id": "89fdd65f-3dd2-4375-a946-3c5de73cc24a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1916368729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96f7b14c2a9348f08305fe232df2a603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c5a8745-5b", "ovs_interfaceid": "8c5a8745-5b2f-47c8-9968-acd29a3f46c6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:04:47 np0005588920 nova_compute[226886]: 2026-01-20 15:04:47.969 226890 DEBUG oslo_concurrency.lockutils [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Releasing lock "refresh_cache-3e59fbab-2129-45b7-8fb1-997b2ccede64" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:04:47 np0005588920 nova_compute[226886]: 2026-01-20 15:04:47.970 226890 DEBUG nova.compute.manager [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Instance network_info: |[{"id": "8c5a8745-5b2f-47c8-9968-acd29a3f46c6", "address": "fa:16:3e:98:e3:f6", "network": {"id": "89fdd65f-3dd2-4375-a946-3c5de73cc24a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1916368729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96f7b14c2a9348f08305fe232df2a603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c5a8745-5b", "ovs_interfaceid": "8c5a8745-5b2f-47c8-9968-acd29a3f46c6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 10:04:47 np0005588920 nova_compute[226886]: 2026-01-20 15:04:47.970 226890 DEBUG oslo_concurrency.lockutils [req-4fb8ecaa-528f-49f2-b3ad-fc6025d2d459 req-43c2952b-a8cc-4969-9844-ecb74f861bcc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-3e59fbab-2129-45b7-8fb1-997b2ccede64" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:04:47 np0005588920 nova_compute[226886]: 2026-01-20 15:04:47.970 226890 DEBUG nova.network.neutron [req-4fb8ecaa-528f-49f2-b3ad-fc6025d2d459 req-43c2952b-a8cc-4969-9844-ecb74f861bcc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Refreshing network info cache for port 8c5a8745-5b2f-47c8-9968-acd29a3f46c6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:04:47 np0005588920 nova_compute[226886]: 2026-01-20 15:04:47.973 226890 DEBUG nova.virt.libvirt.driver [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Start _get_guest_xml network_info=[{"id": "8c5a8745-5b2f-47c8-9968-acd29a3f46c6", "address": "fa:16:3e:98:e3:f6", "network": {"id": "89fdd65f-3dd2-4375-a946-3c5de73cc24a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1916368729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96f7b14c2a9348f08305fe232df2a603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c5a8745-5b", "ovs_interfaceid": "8c5a8745-5b2f-47c8-9968-acd29a3f46c6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:04:47 np0005588920 nova_compute[226886]: 2026-01-20 15:04:47.977 226890 WARNING nova.virt.libvirt.driver [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:04:47 np0005588920 nova_compute[226886]: 2026-01-20 15:04:47.984 226890 DEBUG nova.virt.libvirt.host [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:04:47 np0005588920 nova_compute[226886]: 2026-01-20 15:04:47.985 226890 DEBUG nova.virt.libvirt.host [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:04:48 np0005588920 nova_compute[226886]: 2026-01-20 15:04:48.001 226890 DEBUG nova.virt.libvirt.host [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:04:48 np0005588920 nova_compute[226886]: 2026-01-20 15:04:48.002 226890 DEBUG nova.virt.libvirt.host [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:04:48 np0005588920 nova_compute[226886]: 2026-01-20 15:04:48.003 226890 DEBUG nova.virt.libvirt.driver [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:04:48 np0005588920 nova_compute[226886]: 2026-01-20 15:04:48.004 226890 DEBUG nova.virt.hardware [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:04:48 np0005588920 nova_compute[226886]: 2026-01-20 15:04:48.004 226890 DEBUG nova.virt.hardware [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:04:48 np0005588920 nova_compute[226886]: 2026-01-20 15:04:48.004 226890 DEBUG nova.virt.hardware [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:04:48 np0005588920 nova_compute[226886]: 2026-01-20 15:04:48.005 226890 DEBUG nova.virt.hardware [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:04:48 np0005588920 nova_compute[226886]: 2026-01-20 15:04:48.005 226890 DEBUG nova.virt.hardware [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:04:48 np0005588920 nova_compute[226886]: 2026-01-20 15:04:48.005 226890 DEBUG nova.virt.hardware [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:04:48 np0005588920 nova_compute[226886]: 2026-01-20 15:04:48.006 226890 DEBUG nova.virt.hardware [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:04:48 np0005588920 nova_compute[226886]: 2026-01-20 15:04:48.006 226890 DEBUG nova.virt.hardware [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:04:48 np0005588920 nova_compute[226886]: 2026-01-20 15:04:48.006 226890 DEBUG nova.virt.hardware [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:04:48 np0005588920 nova_compute[226886]: 2026-01-20 15:04:48.007 226890 DEBUG nova.virt.hardware [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:04:48 np0005588920 nova_compute[226886]: 2026-01-20 15:04:48.007 226890 DEBUG nova.virt.hardware [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:04:48 np0005588920 nova_compute[226886]: 2026-01-20 15:04:48.010 226890 DEBUG oslo_concurrency.processutils [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:04:48 np0005588920 nova_compute[226886]: 2026-01-20 15:04:48.400 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:48.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:48 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:04:48 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/259831174' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:04:48 np0005588920 nova_compute[226886]: 2026-01-20 15:04:48.510 226890 DEBUG oslo_concurrency.processutils [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.501s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:04:48 np0005588920 nova_compute[226886]: 2026-01-20 15:04:48.538 226890 DEBUG nova.storage.rbd_utils [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] rbd image 3e59fbab-2129-45b7-8fb1-997b2ccede64_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:04:48 np0005588920 nova_compute[226886]: 2026-01-20 15:04:48.542 226890 DEBUG oslo_concurrency.processutils [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:04:48 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:04:48 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2211886453' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:04:49 np0005588920 nova_compute[226886]: 2026-01-20 15:04:49.008 226890 DEBUG oslo_concurrency.processutils [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:04:49 np0005588920 nova_compute[226886]: 2026-01-20 15:04:49.011 226890 DEBUG nova.virt.libvirt.vif [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:04:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-2146772163',display_name='tempest-AttachVolumeTestJSON-server-2146772163',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumetestjson-server-2146772163',id=160,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMShu/UKrpC8IxRbXkByseIIDJT578k3TS0wOkHyBL1Nfel3atiUiXbZZQd23fr6BcQS57L5ztA9MT+neK/RSmXp3/2MHpk0f5u9h29ogwqYigXBQGeq9oHbFQrdd9SSSQ==',key_name='tempest-keypair-865842409',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='96f7b14c2a9348f08305fe232df2a603',ramdisk_id='',reservation_id='r-cadrdpy6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachVolumeTestJSON-583320363',owner_user_name='tempest-AttachVolumeTestJSON-583320363-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:04:41Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='912329b1a6ad42bdb72e952c03983bdf',uuid=3e59fbab-2129-45b7-8fb1-997b2ccede64,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8c5a8745-5b2f-47c8-9968-acd29a3f46c6", "address": "fa:16:3e:98:e3:f6", "network": {"id": "89fdd65f-3dd2-4375-a946-3c5de73cc24a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1916368729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96f7b14c2a9348f08305fe232df2a603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c5a8745-5b", "ovs_interfaceid": "8c5a8745-5b2f-47c8-9968-acd29a3f46c6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:04:49 np0005588920 nova_compute[226886]: 2026-01-20 15:04:49.012 226890 DEBUG nova.network.os_vif_util [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Converting VIF {"id": "8c5a8745-5b2f-47c8-9968-acd29a3f46c6", "address": "fa:16:3e:98:e3:f6", "network": {"id": "89fdd65f-3dd2-4375-a946-3c5de73cc24a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1916368729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96f7b14c2a9348f08305fe232df2a603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c5a8745-5b", "ovs_interfaceid": "8c5a8745-5b2f-47c8-9968-acd29a3f46c6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:04:49 np0005588920 nova_compute[226886]: 2026-01-20 15:04:49.013 226890 DEBUG nova.network.os_vif_util [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:98:e3:f6,bridge_name='br-int',has_traffic_filtering=True,id=8c5a8745-5b2f-47c8-9968-acd29a3f46c6,network=Network(89fdd65f-3dd2-4375-a946-3c5de73cc24a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c5a8745-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:04:49 np0005588920 nova_compute[226886]: 2026-01-20 15:04:49.015 226890 DEBUG nova.objects.instance [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3e59fbab-2129-45b7-8fb1-997b2ccede64 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:04:49 np0005588920 nova_compute[226886]: 2026-01-20 15:04:49.056 226890 DEBUG nova.virt.libvirt.driver [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:04:49 np0005588920 nova_compute[226886]:  <uuid>3e59fbab-2129-45b7-8fb1-997b2ccede64</uuid>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:  <name>instance-000000a0</name>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:04:49 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:      <nova:name>tempest-AttachVolumeTestJSON-server-2146772163</nova:name>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 15:04:47</nova:creationTime>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 10:04:49 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:        <nova:user uuid="912329b1a6ad42bdb72e952c03983bdf">tempest-AttachVolumeTestJSON-583320363-project-member</nova:user>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:        <nova:project uuid="96f7b14c2a9348f08305fe232df2a603">tempest-AttachVolumeTestJSON-583320363</nova:project>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:        <nova:port uuid="8c5a8745-5b2f-47c8-9968-acd29a3f46c6">
Jan 20 10:04:49 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    <system>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:      <entry name="serial">3e59fbab-2129-45b7-8fb1-997b2ccede64</entry>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:      <entry name="uuid">3e59fbab-2129-45b7-8fb1-997b2ccede64</entry>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    </system>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:  <os>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:  </os>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:  <features>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:  </features>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:  </clock>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:  <devices>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 10:04:49 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/3e59fbab-2129-45b7-8fb1-997b2ccede64_disk">
Jan 20 10:04:49 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:04:49 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 10:04:49 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/3e59fbab-2129-45b7-8fb1-997b2ccede64_disk.config">
Jan 20 10:04:49 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:04:49 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 10:04:49 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:98:e3:f6"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:      <target dev="tap8c5a8745-5b"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    </interface>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 10:04:49 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/3e59fbab-2129-45b7-8fb1-997b2ccede64/console.log" append="off"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    </serial>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    <video>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    </video>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 10:04:49 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    </rng>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 10:04:49 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 10:04:49 np0005588920 nova_compute[226886]:  </devices>
Jan 20 10:04:49 np0005588920 nova_compute[226886]: </domain>
Jan 20 10:04:49 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:04:49 np0005588920 nova_compute[226886]: 2026-01-20 15:04:49.069 226890 DEBUG nova.compute.manager [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Preparing to wait for external event network-vif-plugged-8c5a8745-5b2f-47c8-9968-acd29a3f46c6 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:04:49 np0005588920 nova_compute[226886]: 2026-01-20 15:04:49.070 226890 DEBUG oslo_concurrency.lockutils [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Acquiring lock "3e59fbab-2129-45b7-8fb1-997b2ccede64-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:04:49 np0005588920 nova_compute[226886]: 2026-01-20 15:04:49.070 226890 DEBUG oslo_concurrency.lockutils [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "3e59fbab-2129-45b7-8fb1-997b2ccede64-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:04:49 np0005588920 nova_compute[226886]: 2026-01-20 15:04:49.071 226890 DEBUG oslo_concurrency.lockutils [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "3e59fbab-2129-45b7-8fb1-997b2ccede64-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:04:49 np0005588920 nova_compute[226886]: 2026-01-20 15:04:49.072 226890 DEBUG nova.virt.libvirt.vif [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:04:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-2146772163',display_name='tempest-AttachVolumeTestJSON-server-2146772163',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumetestjson-server-2146772163',id=160,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMShu/UKrpC8IxRbXkByseIIDJT578k3TS0wOkHyBL1Nfel3atiUiXbZZQd23fr6BcQS57L5ztA9MT+neK/RSmXp3/2MHpk0f5u9h29ogwqYigXBQGeq9oHbFQrdd9SSSQ==',key_name='tempest-keypair-865842409',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='96f7b14c2a9348f08305fe232df2a603',ramdisk_id='',reservation_id='r-cadrdpy6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachVolumeTestJSON-583320363',owner_user_name='tempest-AttachVolumeTestJSON-583320363-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:04:41Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='912329b1a6ad42bdb72e952c03983bdf',uuid=3e59fbab-2129-45b7-8fb1-997b2ccede64,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8c5a8745-5b2f-47c8-9968-acd29a3f46c6", "address": "fa:16:3e:98:e3:f6", "network": {"id": "89fdd65f-3dd2-4375-a946-3c5de73cc24a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1916368729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96f7b14c2a9348f08305fe232df2a603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c5a8745-5b", "ovs_interfaceid": "8c5a8745-5b2f-47c8-9968-acd29a3f46c6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:04:49 np0005588920 nova_compute[226886]: 2026-01-20 15:04:49.073 226890 DEBUG nova.network.os_vif_util [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Converting VIF {"id": "8c5a8745-5b2f-47c8-9968-acd29a3f46c6", "address": "fa:16:3e:98:e3:f6", "network": {"id": "89fdd65f-3dd2-4375-a946-3c5de73cc24a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1916368729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96f7b14c2a9348f08305fe232df2a603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c5a8745-5b", "ovs_interfaceid": "8c5a8745-5b2f-47c8-9968-acd29a3f46c6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:04:49 np0005588920 nova_compute[226886]: 2026-01-20 15:04:49.073 226890 DEBUG nova.network.os_vif_util [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:98:e3:f6,bridge_name='br-int',has_traffic_filtering=True,id=8c5a8745-5b2f-47c8-9968-acd29a3f46c6,network=Network(89fdd65f-3dd2-4375-a946-3c5de73cc24a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c5a8745-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:04:49 np0005588920 nova_compute[226886]: 2026-01-20 15:04:49.074 226890 DEBUG os_vif [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:98:e3:f6,bridge_name='br-int',has_traffic_filtering=True,id=8c5a8745-5b2f-47c8-9968-acd29a3f46c6,network=Network(89fdd65f-3dd2-4375-a946-3c5de73cc24a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c5a8745-5b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:04:49 np0005588920 nova_compute[226886]: 2026-01-20 15:04:49.075 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:49 np0005588920 nova_compute[226886]: 2026-01-20 15:04:49.076 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:04:49 np0005588920 nova_compute[226886]: 2026-01-20 15:04:49.077 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:04:49 np0005588920 nova_compute[226886]: 2026-01-20 15:04:49.080 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:49 np0005588920 nova_compute[226886]: 2026-01-20 15:04:49.081 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8c5a8745-5b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:04:49 np0005588920 nova_compute[226886]: 2026-01-20 15:04:49.082 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8c5a8745-5b, col_values=(('external_ids', {'iface-id': '8c5a8745-5b2f-47c8-9968-acd29a3f46c6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:98:e3:f6', 'vm-uuid': '3e59fbab-2129-45b7-8fb1-997b2ccede64'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:04:49 np0005588920 nova_compute[226886]: 2026-01-20 15:04:49.084 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:49 np0005588920 NetworkManager[49076]: <info>  [1768921489.0859] manager: (tap8c5a8745-5b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/362)
Jan 20 10:04:49 np0005588920 nova_compute[226886]: 2026-01-20 15:04:49.087 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:04:49 np0005588920 nova_compute[226886]: 2026-01-20 15:04:49.091 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:49 np0005588920 nova_compute[226886]: 2026-01-20 15:04:49.092 226890 INFO os_vif [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:98:e3:f6,bridge_name='br-int',has_traffic_filtering=True,id=8c5a8745-5b2f-47c8-9968-acd29a3f46c6,network=Network(89fdd65f-3dd2-4375-a946-3c5de73cc24a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c5a8745-5b')#033[00m
Jan 20 10:04:49 np0005588920 nova_compute[226886]: 2026-01-20 15:04:49.304 226890 DEBUG nova.virt.libvirt.driver [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:04:49 np0005588920 nova_compute[226886]: 2026-01-20 15:04:49.316 226890 DEBUG nova.virt.libvirt.driver [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:04:49 np0005588920 nova_compute[226886]: 2026-01-20 15:04:49.316 226890 DEBUG nova.virt.libvirt.driver [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] No VIF found with MAC fa:16:3e:98:e3:f6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:04:49 np0005588920 nova_compute[226886]: 2026-01-20 15:04:49.323 226890 INFO nova.virt.libvirt.driver [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Using config drive#033[00m
Jan 20 10:04:49 np0005588920 nova_compute[226886]: 2026-01-20 15:04:49.349 226890 DEBUG nova.storage.rbd_utils [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] rbd image 3e59fbab-2129-45b7-8fb1-997b2ccede64_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:04:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:04:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:49.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:04:50 np0005588920 nova_compute[226886]: 2026-01-20 15:04:50.208 226890 INFO nova.virt.libvirt.driver [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Creating config drive at /var/lib/nova/instances/3e59fbab-2129-45b7-8fb1-997b2ccede64/disk.config#033[00m
Jan 20 10:04:50 np0005588920 nova_compute[226886]: 2026-01-20 15:04:50.214 226890 DEBUG oslo_concurrency.processutils [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3e59fbab-2129-45b7-8fb1-997b2ccede64/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpb3exvm91 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:04:50 np0005588920 nova_compute[226886]: 2026-01-20 15:04:50.355 226890 DEBUG oslo_concurrency.processutils [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3e59fbab-2129-45b7-8fb1-997b2ccede64/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpb3exvm91" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:04:50 np0005588920 nova_compute[226886]: 2026-01-20 15:04:50.387 226890 DEBUG nova.storage.rbd_utils [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] rbd image 3e59fbab-2129-45b7-8fb1-997b2ccede64_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:04:50 np0005588920 nova_compute[226886]: 2026-01-20 15:04:50.391 226890 DEBUG oslo_concurrency.processutils [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/3e59fbab-2129-45b7-8fb1-997b2ccede64/disk.config 3e59fbab-2129-45b7-8fb1-997b2ccede64_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:04:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:50.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:50 np0005588920 nova_compute[226886]: 2026-01-20 15:04:50.533 226890 DEBUG oslo_concurrency.processutils [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/3e59fbab-2129-45b7-8fb1-997b2ccede64/disk.config 3e59fbab-2129-45b7-8fb1-997b2ccede64_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:04:50 np0005588920 nova_compute[226886]: 2026-01-20 15:04:50.534 226890 INFO nova.virt.libvirt.driver [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Deleting local config drive /var/lib/nova/instances/3e59fbab-2129-45b7-8fb1-997b2ccede64/disk.config because it was imported into RBD.#033[00m
Jan 20 10:04:50 np0005588920 NetworkManager[49076]: <info>  [1768921490.5778] manager: (tap8c5a8745-5b): new Tun device (/org/freedesktop/NetworkManager/Devices/363)
Jan 20 10:04:50 np0005588920 kernel: tap8c5a8745-5b: entered promiscuous mode
Jan 20 10:04:50 np0005588920 ovn_controller[133971]: 2026-01-20T15:04:50Z|00747|binding|INFO|Claiming lport 8c5a8745-5b2f-47c8-9968-acd29a3f46c6 for this chassis.
Jan 20 10:04:50 np0005588920 ovn_controller[133971]: 2026-01-20T15:04:50Z|00748|binding|INFO|8c5a8745-5b2f-47c8-9968-acd29a3f46c6: Claiming fa:16:3e:98:e3:f6 10.100.0.13
Jan 20 10:04:50 np0005588920 nova_compute[226886]: 2026-01-20 15:04:50.583 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:50.600 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:98:e3:f6 10.100.0.13'], port_security=['fa:16:3e:98:e3:f6 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '3e59fbab-2129-45b7-8fb1-997b2ccede64', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-89fdd65f-3dd2-4375-a946-3c5de73cc24a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '96f7b14c2a9348f08305fe232df2a603', 'neutron:revision_number': '2', 'neutron:security_group_ids': '14a85866-1e42-4f6c-80fa-7b6fb27c4433', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=07b54ff9-b8ec-4b9d-ab83-0d9fa6361dd1, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=8c5a8745-5b2f-47c8-9968-acd29a3f46c6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:50.602 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 8c5a8745-5b2f-47c8-9968-acd29a3f46c6 in datapath 89fdd65f-3dd2-4375-a946-3c5de73cc24a bound to our chassis#033[00m
Jan 20 10:04:50 np0005588920 ovn_controller[133971]: 2026-01-20T15:04:50Z|00749|binding|INFO|Setting lport 8c5a8745-5b2f-47c8-9968-acd29a3f46c6 ovn-installed in OVS
Jan 20 10:04:50 np0005588920 ovn_controller[133971]: 2026-01-20T15:04:50Z|00750|binding|INFO|Setting lport 8c5a8745-5b2f-47c8-9968-acd29a3f46c6 up in Southbound
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:50.603 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 89fdd65f-3dd2-4375-a946-3c5de73cc24a#033[00m
Jan 20 10:04:50 np0005588920 nova_compute[226886]: 2026-01-20 15:04:50.608 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:50.617 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[39ba7ed7-d9a9-4e2c-a6b1-e679c2cf38a5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:50.619 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap89fdd65f-31 in ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:50.620 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap89fdd65f-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:50.621 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f84c6de6-803c-4f5f-86a2-2ac50bacbaa8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:50.621 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[8a488ae9-65ea-4804-afa3-d213f9b008e9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:50 np0005588920 systemd-udevd[286677]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:04:50 np0005588920 systemd-machined[196121]: New machine qemu-76-instance-000000a0.
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:50.632 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[c3bf7b56-43cc-4ffa-9ba0-393de1845834]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:50 np0005588920 NetworkManager[49076]: <info>  [1768921490.6448] device (tap8c5a8745-5b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:04:50 np0005588920 NetworkManager[49076]: <info>  [1768921490.6454] device (tap8c5a8745-5b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:04:50 np0005588920 systemd[1]: Started Virtual Machine qemu-76-instance-000000a0.
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:50.652 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b4958123-3a7a-4c15-9a58-b4c07222135d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:50.685 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[bd883828-017c-42fa-b0d0-575492e2dfd5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:50.690 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5f668c51-f4a2-4a51-a165-96d73365f549]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:50 np0005588920 systemd-udevd[286680]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:04:50 np0005588920 NetworkManager[49076]: <info>  [1768921490.6924] manager: (tap89fdd65f-30): new Veth device (/org/freedesktop/NetworkManager/Devices/364)
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:50.731 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[78dc3993-4bec-47ae-83fb-0652eef5885b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:50.735 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[e41ddc28-aefe-4349-9653-8b25ce3a09df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:50 np0005588920 NetworkManager[49076]: <info>  [1768921490.7649] device (tap89fdd65f-30): carrier: link connected
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:50.778 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[d12dd8af-5e37-49cf-a392-5cfcab227fc4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:50 np0005588920 nova_compute[226886]: 2026-01-20 15:04:50.786 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:04:50 np0005588920 nova_compute[226886]: 2026-01-20 15:04:50.786 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:04:50 np0005588920 nova_compute[226886]: 2026-01-20 15:04:50.787 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:50.799 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[39950bc6-8913-4e35-93bd-47d996f3394e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap89fdd65f-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:02:d3:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 239], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 653245, 'reachable_time': 19188, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 286708, 'error': None, 'target': 'ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:50.819 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[90ea798c-4a24-48ff-a6eb-04978da258e3]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe02:d33d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 653245, 'tstamp': 653245}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 286709, 'error': None, 'target': 'ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:50.838 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[be93e18b-bbff-470b-ad4f-7495ec1ff8d9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap89fdd65f-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:02:d3:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 239], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 653245, 'reachable_time': 19188, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 286710, 'error': None, 'target': 'ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:50 np0005588920 nova_compute[226886]: 2026-01-20 15:04:50.853 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:50.879 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[3827c455-2cb6-456a-aaa3-ed4c5eee8e52]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:50.958 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[3e6c518f-9dea-4518-97cb-40ebc67c88bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:50.959 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap89fdd65f-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:50.960 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:50.960 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap89fdd65f-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:04:50 np0005588920 kernel: tap89fdd65f-30: entered promiscuous mode
Jan 20 10:04:50 np0005588920 NetworkManager[49076]: <info>  [1768921490.9631] manager: (tap89fdd65f-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/365)
Jan 20 10:04:50 np0005588920 nova_compute[226886]: 2026-01-20 15:04:50.965 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:50.967 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap89fdd65f-30, col_values=(('external_ids', {'iface-id': '58f1013f-2d8d-46a7-97e6-2062537e7f1a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:04:50 np0005588920 ovn_controller[133971]: 2026-01-20T15:04:50Z|00751|binding|INFO|Releasing lport 58f1013f-2d8d-46a7-97e6-2062537e7f1a from this chassis (sb_readonly=0)
Jan 20 10:04:50 np0005588920 nova_compute[226886]: 2026-01-20 15:04:50.969 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:50.970 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/89fdd65f-3dd2-4375-a946-3c5de73cc24a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/89fdd65f-3dd2-4375-a946-3c5de73cc24a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:50.971 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[9e8a20da-739f-4702-b5f3-5271e906253e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:50.972 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-89fdd65f-3dd2-4375-a946-3c5de73cc24a
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/89fdd65f-3dd2-4375-a946-3c5de73cc24a.pid.haproxy
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID 89fdd65f-3dd2-4375-a946-3c5de73cc24a
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:04:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:04:50.974 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a', 'env', 'PROCESS_TAG=haproxy-89fdd65f-3dd2-4375-a946-3c5de73cc24a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/89fdd65f-3dd2-4375-a946-3c5de73cc24a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:04:50 np0005588920 nova_compute[226886]: 2026-01-20 15:04:50.989 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:51 np0005588920 nova_compute[226886]: 2026-01-20 15:04:51.088 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "refresh_cache-a0ce16c6-2b75-472f-a785-890fbb0d748e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:04:51 np0005588920 nova_compute[226886]: 2026-01-20 15:04:51.089 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquired lock "refresh_cache-a0ce16c6-2b75-472f-a785-890fbb0d748e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:04:51 np0005588920 nova_compute[226886]: 2026-01-20 15:04:51.089 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 10:04:51 np0005588920 nova_compute[226886]: 2026-01-20 15:04:51.089 226890 DEBUG nova.objects.instance [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a0ce16c6-2b75-472f-a785-890fbb0d748e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:04:51 np0005588920 nova_compute[226886]: 2026-01-20 15:04:51.106 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921491.1056647, 3e59fbab-2129-45b7-8fb1-997b2ccede64 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:04:51 np0005588920 nova_compute[226886]: 2026-01-20 15:04:51.106 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] VM Started (Lifecycle Event)#033[00m
Jan 20 10:04:51 np0005588920 nova_compute[226886]: 2026-01-20 15:04:51.192 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:04:51 np0005588920 nova_compute[226886]: 2026-01-20 15:04:51.199 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921491.1059878, 3e59fbab-2129-45b7-8fb1-997b2ccede64 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:04:51 np0005588920 nova_compute[226886]: 2026-01-20 15:04:51.199 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:04:51 np0005588920 nova_compute[226886]: 2026-01-20 15:04:51.248 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:04:51 np0005588920 nova_compute[226886]: 2026-01-20 15:04:51.250 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:04:51 np0005588920 nova_compute[226886]: 2026-01-20 15:04:51.272 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:04:51 np0005588920 podman[286784]: 2026-01-20 15:04:51.339316898 +0000 UTC m=+0.044232443 container create 21d3f2073b32b7051b3f1a5fbb6a70f6e0d0336a1eacec845dd65e7d0809ffd9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:04:51 np0005588920 systemd[1]: Started libpod-conmon-21d3f2073b32b7051b3f1a5fbb6a70f6e0d0336a1eacec845dd65e7d0809ffd9.scope.
Jan 20 10:04:51 np0005588920 systemd[1]: Started libcrun container.
Jan 20 10:04:51 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad86864a5a531ad1de5aa2b8828511b47f464a558e851c605ee03237d7919f26/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:04:51 np0005588920 podman[286784]: 2026-01-20 15:04:51.31554823 +0000 UTC m=+0.020463795 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:04:51 np0005588920 podman[286784]: 2026-01-20 15:04:51.414180986 +0000 UTC m=+0.119096561 container init 21d3f2073b32b7051b3f1a5fbb6a70f6e0d0336a1eacec845dd65e7d0809ffd9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 20 10:04:51 np0005588920 podman[286784]: 2026-01-20 15:04:51.419661532 +0000 UTC m=+0.124577077 container start 21d3f2073b32b7051b3f1a5fbb6a70f6e0d0336a1eacec845dd65e7d0809ffd9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:04:51 np0005588920 neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a[286799]: [NOTICE]   (286803) : New worker (286805) forked
Jan 20 10:04:51 np0005588920 neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a[286799]: [NOTICE]   (286803) : Loading success.
Jan 20 10:04:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:51.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:51 np0005588920 nova_compute[226886]: 2026-01-20 15:04:51.521 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:52 np0005588920 nova_compute[226886]: 2026-01-20 15:04:52.038 226890 DEBUG nova.network.neutron [req-4fb8ecaa-528f-49f2-b3ad-fc6025d2d459 req-43c2952b-a8cc-4969-9844-ecb74f861bcc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Updated VIF entry in instance network info cache for port 8c5a8745-5b2f-47c8-9968-acd29a3f46c6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:04:52 np0005588920 nova_compute[226886]: 2026-01-20 15:04:52.039 226890 DEBUG nova.network.neutron [req-4fb8ecaa-528f-49f2-b3ad-fc6025d2d459 req-43c2952b-a8cc-4969-9844-ecb74f861bcc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Updating instance_info_cache with network_info: [{"id": "8c5a8745-5b2f-47c8-9968-acd29a3f46c6", "address": "fa:16:3e:98:e3:f6", "network": {"id": "89fdd65f-3dd2-4375-a946-3c5de73cc24a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1916368729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96f7b14c2a9348f08305fe232df2a603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c5a8745-5b", "ovs_interfaceid": "8c5a8745-5b2f-47c8-9968-acd29a3f46c6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:04:52 np0005588920 nova_compute[226886]: 2026-01-20 15:04:52.078 226890 DEBUG oslo_concurrency.lockutils [req-4fb8ecaa-528f-49f2-b3ad-fc6025d2d459 req-43c2952b-a8cc-4969-9844-ecb74f861bcc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-3e59fbab-2129-45b7-8fb1-997b2ccede64" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:04:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:52.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e354 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:04:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:53.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:53 np0005588920 nova_compute[226886]: 2026-01-20 15:04:53.546 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Updating instance_info_cache with network_info: [{"id": "7b2aa669-8f25-4d67-b56d-f9a96e1774a4", "address": "fa:16:3e:f8:b3:b3", "network": {"id": "0f434e83-45c8-454d-820b-af39b696a1d5", "bridge": "br-int", "label": "tempest-TestShelveInstance-1862824485-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0fc924d2df984301897e81920c5e192f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b2aa669-8f", "ovs_interfaceid": "7b2aa669-8f25-4d67-b56d-f9a96e1774a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:04:53 np0005588920 nova_compute[226886]: 2026-01-20 15:04:53.666 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Releasing lock "refresh_cache-a0ce16c6-2b75-472f-a785-890fbb0d748e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:04:53 np0005588920 nova_compute[226886]: 2026-01-20 15:04:53.667 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 10:04:53 np0005588920 nova_compute[226886]: 2026-01-20 15:04:53.667 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:04:53 np0005588920 nova_compute[226886]: 2026-01-20 15:04:53.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:04:53 np0005588920 nova_compute[226886]: 2026-01-20 15:04:53.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:04:54 np0005588920 nova_compute[226886]: 2026-01-20 15:04:54.084 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:04:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:54.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:04:54 np0005588920 nova_compute[226886]: 2026-01-20 15:04:54.867 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:04:54 np0005588920 nova_compute[226886]: 2026-01-20 15:04:54.868 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:04:54 np0005588920 nova_compute[226886]: 2026-01-20 15:04:54.868 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:04:54 np0005588920 nova_compute[226886]: 2026-01-20 15:04:54.869 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:04:54 np0005588920 nova_compute[226886]: 2026-01-20 15:04:54.869 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:04:55 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:04:55 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/943230783' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:04:55 np0005588920 nova_compute[226886]: 2026-01-20 15:04:55.311 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:04:55 np0005588920 nova_compute[226886]: 2026-01-20 15:04:55.328 226890 DEBUG nova.compute.manager [req-1e46f14a-5589-42a6-8996-75241031ac4c req-9593386c-83c8-4ecf-9426-67566f273212 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Received event network-vif-plugged-8c5a8745-5b2f-47c8-9968-acd29a3f46c6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:04:55 np0005588920 nova_compute[226886]: 2026-01-20 15:04:55.329 226890 DEBUG oslo_concurrency.lockutils [req-1e46f14a-5589-42a6-8996-75241031ac4c req-9593386c-83c8-4ecf-9426-67566f273212 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "3e59fbab-2129-45b7-8fb1-997b2ccede64-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:04:55 np0005588920 nova_compute[226886]: 2026-01-20 15:04:55.330 226890 DEBUG oslo_concurrency.lockutils [req-1e46f14a-5589-42a6-8996-75241031ac4c req-9593386c-83c8-4ecf-9426-67566f273212 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3e59fbab-2129-45b7-8fb1-997b2ccede64-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:04:55 np0005588920 nova_compute[226886]: 2026-01-20 15:04:55.330 226890 DEBUG oslo_concurrency.lockutils [req-1e46f14a-5589-42a6-8996-75241031ac4c req-9593386c-83c8-4ecf-9426-67566f273212 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3e59fbab-2129-45b7-8fb1-997b2ccede64-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:04:55 np0005588920 nova_compute[226886]: 2026-01-20 15:04:55.330 226890 DEBUG nova.compute.manager [req-1e46f14a-5589-42a6-8996-75241031ac4c req-9593386c-83c8-4ecf-9426-67566f273212 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Processing event network-vif-plugged-8c5a8745-5b2f-47c8-9968-acd29a3f46c6 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 10:04:55 np0005588920 nova_compute[226886]: 2026-01-20 15:04:55.331 226890 DEBUG nova.compute.manager [req-1e46f14a-5589-42a6-8996-75241031ac4c req-9593386c-83c8-4ecf-9426-67566f273212 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Received event network-vif-plugged-8c5a8745-5b2f-47c8-9968-acd29a3f46c6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:04:55 np0005588920 nova_compute[226886]: 2026-01-20 15:04:55.331 226890 DEBUG oslo_concurrency.lockutils [req-1e46f14a-5589-42a6-8996-75241031ac4c req-9593386c-83c8-4ecf-9426-67566f273212 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "3e59fbab-2129-45b7-8fb1-997b2ccede64-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:04:55 np0005588920 nova_compute[226886]: 2026-01-20 15:04:55.331 226890 DEBUG oslo_concurrency.lockutils [req-1e46f14a-5589-42a6-8996-75241031ac4c req-9593386c-83c8-4ecf-9426-67566f273212 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3e59fbab-2129-45b7-8fb1-997b2ccede64-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:04:55 np0005588920 nova_compute[226886]: 2026-01-20 15:04:55.332 226890 DEBUG oslo_concurrency.lockutils [req-1e46f14a-5589-42a6-8996-75241031ac4c req-9593386c-83c8-4ecf-9426-67566f273212 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3e59fbab-2129-45b7-8fb1-997b2ccede64-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:04:55 np0005588920 nova_compute[226886]: 2026-01-20 15:04:55.332 226890 DEBUG nova.compute.manager [req-1e46f14a-5589-42a6-8996-75241031ac4c req-9593386c-83c8-4ecf-9426-67566f273212 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] No waiting events found dispatching network-vif-plugged-8c5a8745-5b2f-47c8-9968-acd29a3f46c6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:04:55 np0005588920 nova_compute[226886]: 2026-01-20 15:04:55.332 226890 WARNING nova.compute.manager [req-1e46f14a-5589-42a6-8996-75241031ac4c req-9593386c-83c8-4ecf-9426-67566f273212 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Received unexpected event network-vif-plugged-8c5a8745-5b2f-47c8-9968-acd29a3f46c6 for instance with vm_state building and task_state spawning.#033[00m
Jan 20 10:04:55 np0005588920 nova_compute[226886]: 2026-01-20 15:04:55.333 226890 DEBUG nova.compute.manager [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Instance event wait completed in 4 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:04:55 np0005588920 nova_compute[226886]: 2026-01-20 15:04:55.338 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921495.3386643, 3e59fbab-2129-45b7-8fb1-997b2ccede64 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:04:55 np0005588920 nova_compute[226886]: 2026-01-20 15:04:55.339 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:04:55 np0005588920 nova_compute[226886]: 2026-01-20 15:04:55.342 226890 DEBUG nova.virt.libvirt.driver [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:04:55 np0005588920 nova_compute[226886]: 2026-01-20 15:04:55.350 226890 INFO nova.virt.libvirt.driver [-] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Instance spawned successfully.#033[00m
Jan 20 10:04:55 np0005588920 nova_compute[226886]: 2026-01-20 15:04:55.351 226890 DEBUG nova.virt.libvirt.driver [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 10:04:55 np0005588920 nova_compute[226886]: 2026-01-20 15:04:55.403 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:04:55 np0005588920 nova_compute[226886]: 2026-01-20 15:04:55.409 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:04:55 np0005588920 nova_compute[226886]: 2026-01-20 15:04:55.415 226890 DEBUG nova.virt.libvirt.driver [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:04:55 np0005588920 nova_compute[226886]: 2026-01-20 15:04:55.416 226890 DEBUG nova.virt.libvirt.driver [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:04:55 np0005588920 nova_compute[226886]: 2026-01-20 15:04:55.416 226890 DEBUG nova.virt.libvirt.driver [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:04:55 np0005588920 nova_compute[226886]: 2026-01-20 15:04:55.417 226890 DEBUG nova.virt.libvirt.driver [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:04:55 np0005588920 nova_compute[226886]: 2026-01-20 15:04:55.418 226890 DEBUG nova.virt.libvirt.driver [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:04:55 np0005588920 nova_compute[226886]: 2026-01-20 15:04:55.419 226890 DEBUG nova.virt.libvirt.driver [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:04:55 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e355 e355: 3 total, 3 up, 3 in
Jan 20 10:04:55 np0005588920 nova_compute[226886]: 2026-01-20 15:04:55.502 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-0000009e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:04:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:55.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:55 np0005588920 nova_compute[226886]: 2026-01-20 15:04:55.502 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-0000009e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:04:55 np0005588920 nova_compute[226886]: 2026-01-20 15:04:55.506 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-000000a0 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:04:55 np0005588920 nova_compute[226886]: 2026-01-20 15:04:55.507 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-000000a0 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:04:55 np0005588920 nova_compute[226886]: 2026-01-20 15:04:55.534 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:04:55 np0005588920 nova_compute[226886]: 2026-01-20 15:04:55.559 226890 INFO nova.compute.manager [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Took 13.65 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 10:04:55 np0005588920 nova_compute[226886]: 2026-01-20 15:04:55.560 226890 DEBUG nova.compute.manager [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:04:55 np0005588920 nova_compute[226886]: 2026-01-20 15:04:55.663 226890 INFO nova.compute.manager [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Took 15.02 seconds to build instance.#033[00m
Jan 20 10:04:55 np0005588920 nova_compute[226886]: 2026-01-20 15:04:55.697 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:04:55 np0005588920 nova_compute[226886]: 2026-01-20 15:04:55.698 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3968MB free_disk=20.830379486083984GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:04:55 np0005588920 nova_compute[226886]: 2026-01-20 15:04:55.699 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:04:55 np0005588920 nova_compute[226886]: 2026-01-20 15:04:55.699 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:04:55 np0005588920 nova_compute[226886]: 2026-01-20 15:04:55.915 226890 DEBUG oslo_concurrency.lockutils [None req-954fc60b-cd4e-4be4-be52-829d03e43cdb 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "3e59fbab-2129-45b7-8fb1-997b2ccede64" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.365s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:04:56 np0005588920 nova_compute[226886]: 2026-01-20 15:04:56.365 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance a0ce16c6-2b75-472f-a785-890fbb0d748e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:04:56 np0005588920 nova_compute[226886]: 2026-01-20 15:04:56.366 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance 3e59fbab-2129-45b7-8fb1-997b2ccede64 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:04:56 np0005588920 nova_compute[226886]: 2026-01-20 15:04:56.366 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:04:56 np0005588920 nova_compute[226886]: 2026-01-20 15:04:56.367 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:04:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:56.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:56 np0005588920 nova_compute[226886]: 2026-01-20 15:04:56.522 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:56 np0005588920 nova_compute[226886]: 2026-01-20 15:04:56.525 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:04:56 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:04:56 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3976258873' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:04:57 np0005588920 nova_compute[226886]: 2026-01-20 15:04:57.004 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.479s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:04:57 np0005588920 nova_compute[226886]: 2026-01-20 15:04:57.011 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:04:57 np0005588920 nova_compute[226886]: 2026-01-20 15:04:57.050 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:04:57 np0005588920 nova_compute[226886]: 2026-01-20 15:04:57.128 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:04:57 np0005588920 nova_compute[226886]: 2026-01-20 15:04:57.129 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.430s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:04:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:57.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e355 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:04:57 np0005588920 ovn_controller[133971]: 2026-01-20T15:04:57Z|00101|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f8:b3:b3 10.100.0.12
Jan 20 10:04:58 np0005588920 nova_compute[226886]: 2026-01-20 15:04:58.129 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:04:58 np0005588920 nova_compute[226886]: 2026-01-20 15:04:58.130 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:04:58 np0005588920 nova_compute[226886]: 2026-01-20 15:04:58.131 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:04:58 np0005588920 nova_compute[226886]: 2026-01-20 15:04:58.131 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:04:58 np0005588920 nova_compute[226886]: 2026-01-20 15:04:58.132 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:04:58 np0005588920 nova_compute[226886]: 2026-01-20 15:04:58.132 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:04:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:58.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:58 np0005588920 nova_compute[226886]: 2026-01-20 15:04:58.747 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:59 np0005588920 nova_compute[226886]: 2026-01-20 15:04:59.086 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:04:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:59.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:00 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e356 e356: 3 total, 3 up, 3 in
Jan 20 10:05:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:00.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:00 np0005588920 nova_compute[226886]: 2026-01-20 15:05:00.723 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:05:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:01.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:01 np0005588920 nova_compute[226886]: 2026-01-20 15:05:01.523 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:02.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:02 np0005588920 nova_compute[226886]: 2026-01-20 15:05:02.449 226890 DEBUG nova.compute.manager [req-ef8a7e66-e9f0-4fc1-9ce7-3da4bcc8ec23 req-2ffd9b01-efce-4db0-951c-d51407d7da1d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Received event network-changed-8c5a8745-5b2f-47c8-9968-acd29a3f46c6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:05:02 np0005588920 nova_compute[226886]: 2026-01-20 15:05:02.450 226890 DEBUG nova.compute.manager [req-ef8a7e66-e9f0-4fc1-9ce7-3da4bcc8ec23 req-2ffd9b01-efce-4db0-951c-d51407d7da1d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Refreshing instance network info cache due to event network-changed-8c5a8745-5b2f-47c8-9968-acd29a3f46c6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:05:02 np0005588920 nova_compute[226886]: 2026-01-20 15:05:02.450 226890 DEBUG oslo_concurrency.lockutils [req-ef8a7e66-e9f0-4fc1-9ce7-3da4bcc8ec23 req-2ffd9b01-efce-4db0-951c-d51407d7da1d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-3e59fbab-2129-45b7-8fb1-997b2ccede64" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:05:02 np0005588920 nova_compute[226886]: 2026-01-20 15:05:02.450 226890 DEBUG oslo_concurrency.lockutils [req-ef8a7e66-e9f0-4fc1-9ce7-3da4bcc8ec23 req-2ffd9b01-efce-4db0-951c-d51407d7da1d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-3e59fbab-2129-45b7-8fb1-997b2ccede64" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:05:02 np0005588920 nova_compute[226886]: 2026-01-20 15:05:02.450 226890 DEBUG nova.network.neutron [req-ef8a7e66-e9f0-4fc1-9ce7-3da4bcc8ec23 req-2ffd9b01-efce-4db0-951c-d51407d7da1d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Refreshing network info cache for port 8c5a8745-5b2f-47c8-9968-acd29a3f46c6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:05:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e356 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:05:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:03.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:04 np0005588920 nova_compute[226886]: 2026-01-20 15:05:04.089 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:05:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:04.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:05:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:05.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:06 np0005588920 nova_compute[226886]: 2026-01-20 15:05:06.072 226890 DEBUG nova.network.neutron [req-ef8a7e66-e9f0-4fc1-9ce7-3da4bcc8ec23 req-2ffd9b01-efce-4db0-951c-d51407d7da1d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Updated VIF entry in instance network info cache for port 8c5a8745-5b2f-47c8-9968-acd29a3f46c6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:05:06 np0005588920 nova_compute[226886]: 2026-01-20 15:05:06.072 226890 DEBUG nova.network.neutron [req-ef8a7e66-e9f0-4fc1-9ce7-3da4bcc8ec23 req-2ffd9b01-efce-4db0-951c-d51407d7da1d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Updating instance_info_cache with network_info: [{"id": "8c5a8745-5b2f-47c8-9968-acd29a3f46c6", "address": "fa:16:3e:98:e3:f6", "network": {"id": "89fdd65f-3dd2-4375-a946-3c5de73cc24a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1916368729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96f7b14c2a9348f08305fe232df2a603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c5a8745-5b", "ovs_interfaceid": "8c5a8745-5b2f-47c8-9968-acd29a3f46c6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:05:06 np0005588920 nova_compute[226886]: 2026-01-20 15:05:06.196 226890 DEBUG oslo_concurrency.lockutils [req-ef8a7e66-e9f0-4fc1-9ce7-3da4bcc8ec23 req-2ffd9b01-efce-4db0-951c-d51407d7da1d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-3e59fbab-2129-45b7-8fb1-997b2ccede64" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:05:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:05:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:06.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:05:06 np0005588920 nova_compute[226886]: 2026-01-20 15:05:06.526 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:07 np0005588920 nova_compute[226886]: 2026-01-20 15:05:07.505 226890 DEBUG nova.compute.manager [req-f753461a-c8d3-4482-b435-7c8824087068 req-21622176-3905-47a8-88d6-c4b76c0d7957 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Received event network-changed-7b2aa669-8f25-4d67-b56d-f9a96e1774a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:05:07 np0005588920 nova_compute[226886]: 2026-01-20 15:05:07.506 226890 DEBUG nova.compute.manager [req-f753461a-c8d3-4482-b435-7c8824087068 req-21622176-3905-47a8-88d6-c4b76c0d7957 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Refreshing instance network info cache due to event network-changed-7b2aa669-8f25-4d67-b56d-f9a96e1774a4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:05:07 np0005588920 nova_compute[226886]: 2026-01-20 15:05:07.506 226890 DEBUG oslo_concurrency.lockutils [req-f753461a-c8d3-4482-b435-7c8824087068 req-21622176-3905-47a8-88d6-c4b76c0d7957 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-a0ce16c6-2b75-472f-a785-890fbb0d748e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:05:07 np0005588920 nova_compute[226886]: 2026-01-20 15:05:07.506 226890 DEBUG oslo_concurrency.lockutils [req-f753461a-c8d3-4482-b435-7c8824087068 req-21622176-3905-47a8-88d6-c4b76c0d7957 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-a0ce16c6-2b75-472f-a785-890fbb0d748e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:05:07 np0005588920 nova_compute[226886]: 2026-01-20 15:05:07.506 226890 DEBUG nova.network.neutron [req-f753461a-c8d3-4482-b435-7c8824087068 req-21622176-3905-47a8-88d6-c4b76c0d7957 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Refreshing network info cache for port 7b2aa669-8f25-4d67-b56d-f9a96e1774a4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:05:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:05:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:07.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:05:07 np0005588920 nova_compute[226886]: 2026-01-20 15:05:07.630 226890 DEBUG oslo_concurrency.lockutils [None req-d7922561-ca89-4a1f-a739-c6a057fad098 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Acquiring lock "a0ce16c6-2b75-472f-a785-890fbb0d748e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:05:07 np0005588920 nova_compute[226886]: 2026-01-20 15:05:07.631 226890 DEBUG oslo_concurrency.lockutils [None req-d7922561-ca89-4a1f-a739-c6a057fad098 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lock "a0ce16c6-2b75-472f-a785-890fbb0d748e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:05:07 np0005588920 nova_compute[226886]: 2026-01-20 15:05:07.631 226890 DEBUG oslo_concurrency.lockutils [None req-d7922561-ca89-4a1f-a739-c6a057fad098 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Acquiring lock "a0ce16c6-2b75-472f-a785-890fbb0d748e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:05:07 np0005588920 nova_compute[226886]: 2026-01-20 15:05:07.631 226890 DEBUG oslo_concurrency.lockutils [None req-d7922561-ca89-4a1f-a739-c6a057fad098 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lock "a0ce16c6-2b75-472f-a785-890fbb0d748e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:05:07 np0005588920 nova_compute[226886]: 2026-01-20 15:05:07.632 226890 DEBUG oslo_concurrency.lockutils [None req-d7922561-ca89-4a1f-a739-c6a057fad098 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lock "a0ce16c6-2b75-472f-a785-890fbb0d748e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:05:07 np0005588920 nova_compute[226886]: 2026-01-20 15:05:07.633 226890 INFO nova.compute.manager [None req-d7922561-ca89-4a1f-a739-c6a057fad098 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Terminating instance#033[00m
Jan 20 10:05:07 np0005588920 nova_compute[226886]: 2026-01-20 15:05:07.634 226890 DEBUG nova.compute.manager [None req-d7922561-ca89-4a1f-a739-c6a057fad098 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:05:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e356 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:05:07 np0005588920 kernel: tap7b2aa669-8f (unregistering): left promiscuous mode
Jan 20 10:05:07 np0005588920 NetworkManager[49076]: <info>  [1768921507.9152] device (tap7b2aa669-8f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:05:07 np0005588920 ovn_controller[133971]: 2026-01-20T15:05:07Z|00752|binding|INFO|Releasing lport 7b2aa669-8f25-4d67-b56d-f9a96e1774a4 from this chassis (sb_readonly=0)
Jan 20 10:05:07 np0005588920 ovn_controller[133971]: 2026-01-20T15:05:07Z|00753|binding|INFO|Setting lport 7b2aa669-8f25-4d67-b56d-f9a96e1774a4 down in Southbound
Jan 20 10:05:07 np0005588920 nova_compute[226886]: 2026-01-20 15:05:07.925 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:07 np0005588920 ovn_controller[133971]: 2026-01-20T15:05:07Z|00754|binding|INFO|Removing iface tap7b2aa669-8f ovn-installed in OVS
Jan 20 10:05:07 np0005588920 nova_compute[226886]: 2026-01-20 15:05:07.928 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:07 np0005588920 nova_compute[226886]: 2026-01-20 15:05:07.943 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:07 np0005588920 systemd[1]: machine-qemu\x2d75\x2dinstance\x2d0000009e.scope: Deactivated successfully.
Jan 20 10:05:07 np0005588920 systemd[1]: machine-qemu\x2d75\x2dinstance\x2d0000009e.scope: Consumed 14.135s CPU time.
Jan 20 10:05:07 np0005588920 systemd-machined[196121]: Machine qemu-75-instance-0000009e terminated.
Jan 20 10:05:08 np0005588920 nova_compute[226886]: 2026-01-20 15:05:08.061 226890 INFO nova.virt.libvirt.driver [-] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Instance destroyed successfully.#033[00m
Jan 20 10:05:08 np0005588920 nova_compute[226886]: 2026-01-20 15:05:08.062 226890 DEBUG nova.objects.instance [None req-d7922561-ca89-4a1f-a739-c6a057fad098 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lazy-loading 'resources' on Instance uuid a0ce16c6-2b75-472f-a785-890fbb0d748e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:05:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:05:08.152 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:b3:b3 10.100.0.12'], port_security=['fa:16:3e:f8:b3:b3 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'a0ce16c6-2b75-472f-a785-890fbb0d748e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0f434e83-45c8-454d-820b-af39b696a1d5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0fc924d2df984301897e81920c5e192f', 'neutron:revision_number': '9', 'neutron:security_group_ids': '73ae63f6-3a5a-4604-9d46-53d9b9e08225', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cf0669b1-9b02-4bfa-859e-dac906b93fdc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=7b2aa669-8f25-4d67-b56d-f9a96e1774a4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:05:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:05:08.153 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 7b2aa669-8f25-4d67-b56d-f9a96e1774a4 in datapath 0f434e83-45c8-454d-820b-af39b696a1d5 unbound from our chassis#033[00m
Jan 20 10:05:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:05:08.155 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0f434e83-45c8-454d-820b-af39b696a1d5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:05:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:05:08.156 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0e811d47-e3a6-4e47-a834-0fc46f4a9310]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:05:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:05:08.157 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5 namespace which is not needed anymore#033[00m
Jan 20 10:05:08 np0005588920 nova_compute[226886]: 2026-01-20 15:05:08.180 226890 DEBUG nova.virt.libvirt.vif [None req-d7922561-ca89-4a1f-a739-c6a057fad098 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-20T15:03:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestShelveInstance-server-1036076849',display_name='tempest-TestShelveInstance-server-1036076849',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testshelveinstance-server-1036076849',id=158,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDKbSg+U5D2P4vAhN93N9KUHNV5uhMaQWWRL1/dgo18CRR+13PC7EHc+NfhsO3rchRXZsX8fKAmtn1X9kzXWRANuFYEKLsCK/cad6C56A1ZIn2STxVc8j8348CriP8hVdg==',key_name='tempest-TestShelveInstance-624896822',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:04:47Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0fc924d2df984301897e81920c5e192f',ramdisk_id='',reservation_id='r-18lfln0m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestShelveInstance-1425544575',owner_user_name='tempest-TestShelveInstance-1425544575-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:04:47Z,user_data=None,user_id='b02a8ef6cc3946ceb2c8846aae2eae68',uuid=a0ce16c6-2b75-472f-a785-890fbb0d748e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7b2aa669-8f25-4d67-b56d-f9a96e1774a4", "address": "fa:16:3e:f8:b3:b3", "network": {"id": "0f434e83-45c8-454d-820b-af39b696a1d5", "bridge": "br-int", "label": "tempest-TestShelveInstance-1862824485-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0fc924d2df984301897e81920c5e192f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b2aa669-8f", "ovs_interfaceid": "7b2aa669-8f25-4d67-b56d-f9a96e1774a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:05:08 np0005588920 nova_compute[226886]: 2026-01-20 15:05:08.180 226890 DEBUG nova.network.os_vif_util [None req-d7922561-ca89-4a1f-a739-c6a057fad098 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Converting VIF {"id": "7b2aa669-8f25-4d67-b56d-f9a96e1774a4", "address": "fa:16:3e:f8:b3:b3", "network": {"id": "0f434e83-45c8-454d-820b-af39b696a1d5", "bridge": "br-int", "label": "tempest-TestShelveInstance-1862824485-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0fc924d2df984301897e81920c5e192f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b2aa669-8f", "ovs_interfaceid": "7b2aa669-8f25-4d67-b56d-f9a96e1774a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:05:08 np0005588920 nova_compute[226886]: 2026-01-20 15:05:08.181 226890 DEBUG nova.network.os_vif_util [None req-d7922561-ca89-4a1f-a739-c6a057fad098 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f8:b3:b3,bridge_name='br-int',has_traffic_filtering=True,id=7b2aa669-8f25-4d67-b56d-f9a96e1774a4,network=Network(0f434e83-45c8-454d-820b-af39b696a1d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b2aa669-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:05:08 np0005588920 nova_compute[226886]: 2026-01-20 15:05:08.181 226890 DEBUG os_vif [None req-d7922561-ca89-4a1f-a739-c6a057fad098 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f8:b3:b3,bridge_name='br-int',has_traffic_filtering=True,id=7b2aa669-8f25-4d67-b56d-f9a96e1774a4,network=Network(0f434e83-45c8-454d-820b-af39b696a1d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b2aa669-8f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:05:08 np0005588920 nova_compute[226886]: 2026-01-20 15:05:08.182 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:08 np0005588920 nova_compute[226886]: 2026-01-20 15:05:08.183 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b2aa669-8f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:05:08 np0005588920 nova_compute[226886]: 2026-01-20 15:05:08.184 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:08 np0005588920 nova_compute[226886]: 2026-01-20 15:05:08.185 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:08 np0005588920 nova_compute[226886]: 2026-01-20 15:05:08.188 226890 INFO os_vif [None req-d7922561-ca89-4a1f-a739-c6a057fad098 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f8:b3:b3,bridge_name='br-int',has_traffic_filtering=True,id=7b2aa669-8f25-4d67-b56d-f9a96e1774a4,network=Network(0f434e83-45c8-454d-820b-af39b696a1d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b2aa669-8f')#033[00m
Jan 20 10:05:08 np0005588920 podman[286876]: 2026-01-20 15:05:08.208667467 +0000 UTC m=+0.106667667 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 20 10:05:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:05:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:08.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:05:08 np0005588920 nova_compute[226886]: 2026-01-20 15:05:08.661 226890 DEBUG nova.compute.manager [req-3fed6811-1864-4a7d-87c9-0665111ef62d req-80d87780-b1c8-4b0f-9104-c9dd79a03a0c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Received event network-vif-unplugged-7b2aa669-8f25-4d67-b56d-f9a96e1774a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:05:08 np0005588920 nova_compute[226886]: 2026-01-20 15:05:08.661 226890 DEBUG oslo_concurrency.lockutils [req-3fed6811-1864-4a7d-87c9-0665111ef62d req-80d87780-b1c8-4b0f-9104-c9dd79a03a0c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "a0ce16c6-2b75-472f-a785-890fbb0d748e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:05:08 np0005588920 nova_compute[226886]: 2026-01-20 15:05:08.661 226890 DEBUG oslo_concurrency.lockutils [req-3fed6811-1864-4a7d-87c9-0665111ef62d req-80d87780-b1c8-4b0f-9104-c9dd79a03a0c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "a0ce16c6-2b75-472f-a785-890fbb0d748e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:05:08 np0005588920 nova_compute[226886]: 2026-01-20 15:05:08.662 226890 DEBUG oslo_concurrency.lockutils [req-3fed6811-1864-4a7d-87c9-0665111ef62d req-80d87780-b1c8-4b0f-9104-c9dd79a03a0c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "a0ce16c6-2b75-472f-a785-890fbb0d748e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:05:08 np0005588920 nova_compute[226886]: 2026-01-20 15:05:08.662 226890 DEBUG nova.compute.manager [req-3fed6811-1864-4a7d-87c9-0665111ef62d req-80d87780-b1c8-4b0f-9104-c9dd79a03a0c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] No waiting events found dispatching network-vif-unplugged-7b2aa669-8f25-4d67-b56d-f9a96e1774a4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:05:08 np0005588920 nova_compute[226886]: 2026-01-20 15:05:08.662 226890 DEBUG nova.compute.manager [req-3fed6811-1864-4a7d-87c9-0665111ef62d req-80d87780-b1c8-4b0f-9104-c9dd79a03a0c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Received event network-vif-unplugged-7b2aa669-8f25-4d67-b56d-f9a96e1774a4 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:05:08 np0005588920 neutron-haproxy-ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5[286504]: [NOTICE]   (286509) : haproxy version is 2.8.14-c23fe91
Jan 20 10:05:08 np0005588920 neutron-haproxy-ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5[286504]: [NOTICE]   (286509) : path to executable is /usr/sbin/haproxy
Jan 20 10:05:08 np0005588920 neutron-haproxy-ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5[286504]: [WARNING]  (286509) : Exiting Master process...
Jan 20 10:05:08 np0005588920 neutron-haproxy-ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5[286504]: [ALERT]    (286509) : Current worker (286511) exited with code 143 (Terminated)
Jan 20 10:05:08 np0005588920 neutron-haproxy-ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5[286504]: [WARNING]  (286509) : All workers exited. Exiting... (0)
Jan 20 10:05:08 np0005588920 systemd[1]: libpod-a0ad9a2bb38b8a92874cfa9e217ad93f9f590ce6cc0572aaacb22f4f27a95e95.scope: Deactivated successfully.
Jan 20 10:05:08 np0005588920 podman[286934]: 2026-01-20 15:05:08.689951656 +0000 UTC m=+0.436353978 container died a0ad9a2bb38b8a92874cfa9e217ad93f9f590ce6cc0572aaacb22f4f27a95e95 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 20 10:05:08 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a0ad9a2bb38b8a92874cfa9e217ad93f9f590ce6cc0572aaacb22f4f27a95e95-userdata-shm.mount: Deactivated successfully.
Jan 20 10:05:08 np0005588920 systemd[1]: var-lib-containers-storage-overlay-3dbf459ea8215d305a748ef1d595e846fc039db812977366cc0f31b4e2093f26-merged.mount: Deactivated successfully.
Jan 20 10:05:08 np0005588920 podman[286934]: 2026-01-20 15:05:08.827140393 +0000 UTC m=+0.573542715 container cleanup a0ad9a2bb38b8a92874cfa9e217ad93f9f590ce6cc0572aaacb22f4f27a95e95 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:05:08 np0005588920 systemd[1]: libpod-conmon-a0ad9a2bb38b8a92874cfa9e217ad93f9f590ce6cc0572aaacb22f4f27a95e95.scope: Deactivated successfully.
Jan 20 10:05:09 np0005588920 podman[286969]: 2026-01-20 15:05:09.054468502 +0000 UTC m=+0.204528630 container remove a0ad9a2bb38b8a92874cfa9e217ad93f9f590ce6cc0572aaacb22f4f27a95e95 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 20 10:05:09 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:05:09.060 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[507de366-3568-4d78-ba38-d2dd5d8456e8]: (4, ('Tue Jan 20 03:05:08 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5 (a0ad9a2bb38b8a92874cfa9e217ad93f9f590ce6cc0572aaacb22f4f27a95e95)\na0ad9a2bb38b8a92874cfa9e217ad93f9f590ce6cc0572aaacb22f4f27a95e95\nTue Jan 20 03:05:08 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5 (a0ad9a2bb38b8a92874cfa9e217ad93f9f590ce6cc0572aaacb22f4f27a95e95)\na0ad9a2bb38b8a92874cfa9e217ad93f9f590ce6cc0572aaacb22f4f27a95e95\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:05:09 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:05:09.062 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[68310515-5424-4926-ae8a-88b964229f1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:05:09 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:05:09.064 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0f434e83-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:05:09 np0005588920 nova_compute[226886]: 2026-01-20 15:05:09.067 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:09 np0005588920 kernel: tap0f434e83-40: left promiscuous mode
Jan 20 10:05:09 np0005588920 nova_compute[226886]: 2026-01-20 15:05:09.083 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:09 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:05:09.086 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[53dde809-aa94-482d-bc92-468bf1e52906]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:05:09 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:05:09.096 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[24d4f939-7191-41b2-99a2-24fd2aa3e4ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:05:09 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:05:09.098 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2cec21ee-00c6-450d-b00e-24b07c3d2e58]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:05:09 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:05:09.115 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[4d56db99-64a9-4888-ae88-1cac68687773]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 652636, 'reachable_time': 19561, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 286986, 'error': None, 'target': 'ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:05:09 np0005588920 systemd[1]: run-netns-ovnmeta\x2d0f434e83\x2d45c8\x2d454d\x2d820b\x2daf39b696a1d5.mount: Deactivated successfully.
Jan 20 10:05:09 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:05:09.119 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:05:09 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:05:09.119 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[dad0c4cd-71dc-452f-89aa-8b17bc763f62]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:05:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:09.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:09 np0005588920 nova_compute[226886]: 2026-01-20 15:05:09.839 226890 INFO nova.virt.libvirt.driver [None req-d7922561-ca89-4a1f-a739-c6a057fad098 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Deleting instance files /var/lib/nova/instances/a0ce16c6-2b75-472f-a785-890fbb0d748e_del#033[00m
Jan 20 10:05:09 np0005588920 nova_compute[226886]: 2026-01-20 15:05:09.840 226890 INFO nova.virt.libvirt.driver [None req-d7922561-ca89-4a1f-a739-c6a057fad098 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Deletion of /var/lib/nova/instances/a0ce16c6-2b75-472f-a785-890fbb0d748e_del complete#033[00m
Jan 20 10:05:09 np0005588920 ovn_controller[133971]: 2026-01-20T15:05:09Z|00102|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:98:e3:f6 10.100.0.13
Jan 20 10:05:09 np0005588920 ovn_controller[133971]: 2026-01-20T15:05:09Z|00103|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:98:e3:f6 10.100.0.13
Jan 20 10:05:09 np0005588920 nova_compute[226886]: 2026-01-20 15:05:09.907 226890 INFO nova.compute.manager [None req-d7922561-ca89-4a1f-a739-c6a057fad098 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Took 2.27 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:05:09 np0005588920 nova_compute[226886]: 2026-01-20 15:05:09.908 226890 DEBUG oslo.service.loopingcall [None req-d7922561-ca89-4a1f-a739-c6a057fad098 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:05:09 np0005588920 nova_compute[226886]: 2026-01-20 15:05:09.908 226890 DEBUG nova.compute.manager [-] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:05:09 np0005588920 nova_compute[226886]: 2026-01-20 15:05:09.908 226890 DEBUG nova.network.neutron [-] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:05:10 np0005588920 nova_compute[226886]: 2026-01-20 15:05:10.227 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:10 np0005588920 nova_compute[226886]: 2026-01-20 15:05:10.280 226890 DEBUG nova.network.neutron [req-f753461a-c8d3-4482-b435-7c8824087068 req-21622176-3905-47a8-88d6-c4b76c0d7957 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Updated VIF entry in instance network info cache for port 7b2aa669-8f25-4d67-b56d-f9a96e1774a4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:05:10 np0005588920 nova_compute[226886]: 2026-01-20 15:05:10.280 226890 DEBUG nova.network.neutron [req-f753461a-c8d3-4482-b435-7c8824087068 req-21622176-3905-47a8-88d6-c4b76c0d7957 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Updating instance_info_cache with network_info: [{"id": "7b2aa669-8f25-4d67-b56d-f9a96e1774a4", "address": "fa:16:3e:f8:b3:b3", "network": {"id": "0f434e83-45c8-454d-820b-af39b696a1d5", "bridge": "br-int", "label": "tempest-TestShelveInstance-1862824485-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0fc924d2df984301897e81920c5e192f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b2aa669-8f", "ovs_interfaceid": "7b2aa669-8f25-4d67-b56d-f9a96e1774a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:05:10 np0005588920 nova_compute[226886]: 2026-01-20 15:05:10.329 226890 DEBUG oslo_concurrency.lockutils [req-f753461a-c8d3-4482-b435-7c8824087068 req-21622176-3905-47a8-88d6-c4b76c0d7957 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-a0ce16c6-2b75-472f-a785-890fbb0d748e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:05:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:10.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:10 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e357 e357: 3 total, 3 up, 3 in
Jan 20 10:05:10 np0005588920 nova_compute[226886]: 2026-01-20 15:05:10.797 226890 DEBUG nova.compute.manager [req-a8b5f4c0-2810-44ec-a395-79d27527a7b8 req-1dc026ef-cd92-4298-ae18-b7c0a8c3b387 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Received event network-vif-plugged-7b2aa669-8f25-4d67-b56d-f9a96e1774a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:05:10 np0005588920 nova_compute[226886]: 2026-01-20 15:05:10.798 226890 DEBUG oslo_concurrency.lockutils [req-a8b5f4c0-2810-44ec-a395-79d27527a7b8 req-1dc026ef-cd92-4298-ae18-b7c0a8c3b387 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "a0ce16c6-2b75-472f-a785-890fbb0d748e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:05:10 np0005588920 nova_compute[226886]: 2026-01-20 15:05:10.798 226890 DEBUG oslo_concurrency.lockutils [req-a8b5f4c0-2810-44ec-a395-79d27527a7b8 req-1dc026ef-cd92-4298-ae18-b7c0a8c3b387 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "a0ce16c6-2b75-472f-a785-890fbb0d748e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:05:10 np0005588920 nova_compute[226886]: 2026-01-20 15:05:10.798 226890 DEBUG oslo_concurrency.lockutils [req-a8b5f4c0-2810-44ec-a395-79d27527a7b8 req-1dc026ef-cd92-4298-ae18-b7c0a8c3b387 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "a0ce16c6-2b75-472f-a785-890fbb0d748e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:05:10 np0005588920 nova_compute[226886]: 2026-01-20 15:05:10.799 226890 DEBUG nova.compute.manager [req-a8b5f4c0-2810-44ec-a395-79d27527a7b8 req-1dc026ef-cd92-4298-ae18-b7c0a8c3b387 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] No waiting events found dispatching network-vif-plugged-7b2aa669-8f25-4d67-b56d-f9a96e1774a4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:05:10 np0005588920 nova_compute[226886]: 2026-01-20 15:05:10.799 226890 WARNING nova.compute.manager [req-a8b5f4c0-2810-44ec-a395-79d27527a7b8 req-1dc026ef-cd92-4298-ae18-b7c0a8c3b387 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Received unexpected event network-vif-plugged-7b2aa669-8f25-4d67-b56d-f9a96e1774a4 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 10:05:11 np0005588920 nova_compute[226886]: 2026-01-20 15:05:11.118 226890 DEBUG nova.network.neutron [-] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:05:11 np0005588920 nova_compute[226886]: 2026-01-20 15:05:11.261 226890 INFO nova.compute.manager [-] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Took 1.35 seconds to deallocate network for instance.#033[00m
Jan 20 10:05:11 np0005588920 nova_compute[226886]: 2026-01-20 15:05:11.321 226890 DEBUG oslo_concurrency.lockutils [None req-d7922561-ca89-4a1f-a739-c6a057fad098 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:05:11 np0005588920 nova_compute[226886]: 2026-01-20 15:05:11.322 226890 DEBUG oslo_concurrency.lockutils [None req-d7922561-ca89-4a1f-a739-c6a057fad098 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:05:11 np0005588920 nova_compute[226886]: 2026-01-20 15:05:11.447 226890 DEBUG oslo_concurrency.processutils [None req-d7922561-ca89-4a1f-a739-c6a057fad098 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:05:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:11.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:11 np0005588920 nova_compute[226886]: 2026-01-20 15:05:11.528 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:11 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:05:11 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2837469865' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:05:11 np0005588920 nova_compute[226886]: 2026-01-20 15:05:11.896 226890 DEBUG oslo_concurrency.processutils [None req-d7922561-ca89-4a1f-a739-c6a057fad098 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:05:11 np0005588920 nova_compute[226886]: 2026-01-20 15:05:11.902 226890 DEBUG nova.compute.provider_tree [None req-d7922561-ca89-4a1f-a739-c6a057fad098 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:05:11 np0005588920 nova_compute[226886]: 2026-01-20 15:05:11.959 226890 DEBUG nova.scheduler.client.report [None req-d7922561-ca89-4a1f-a739-c6a057fad098 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:05:12 np0005588920 nova_compute[226886]: 2026-01-20 15:05:12.014 226890 DEBUG nova.compute.manager [req-54e9d5fb-f8cc-4504-8837-54ec7c69f65e req-60d532cf-daaa-4023-a10a-87e4c147a419 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Received event network-vif-deleted-7b2aa669-8f25-4d67-b56d-f9a96e1774a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:05:12 np0005588920 nova_compute[226886]: 2026-01-20 15:05:12.016 226890 DEBUG oslo_concurrency.lockutils [None req-d7922561-ca89-4a1f-a739-c6a057fad098 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.694s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:05:12 np0005588920 nova_compute[226886]: 2026-01-20 15:05:12.092 226890 INFO nova.scheduler.client.report [None req-d7922561-ca89-4a1f-a739-c6a057fad098 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Deleted allocations for instance a0ce16c6-2b75-472f-a785-890fbb0d748e#033[00m
Jan 20 10:05:12 np0005588920 nova_compute[226886]: 2026-01-20 15:05:12.323 226890 DEBUG oslo_concurrency.lockutils [None req-d7922561-ca89-4a1f-a739-c6a057fad098 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lock "a0ce16c6-2b75-472f-a785-890fbb0d748e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.692s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:05:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:12.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e357 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:05:13 np0005588920 nova_compute[226886]: 2026-01-20 15:05:13.184 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:13.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:05:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1230677250' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:05:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:05:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1230677250' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:05:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e358 e358: 3 total, 3 up, 3 in
Jan 20 10:05:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:14.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:05:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:15.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:05:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:05:16.465 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:05:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:05:16.466 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:05:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:05:16.466 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:05:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:05:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:16.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:05:16 np0005588920 nova_compute[226886]: 2026-01-20 15:05:16.565 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:17.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:05:17 np0005588920 podman[287010]: 2026-01-20 15:05:17.956539862 +0000 UTC m=+0.048089114 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 20 10:05:18 np0005588920 nova_compute[226886]: 2026-01-20 15:05:18.187 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:05:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:18.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:05:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:19.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:19 np0005588920 nova_compute[226886]: 2026-01-20 15:05:19.932 226890 DEBUG oslo_concurrency.lockutils [None req-59782ff0-5433-4b02-a80c-20fa96626fcd 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Acquiring lock "3e59fbab-2129-45b7-8fb1-997b2ccede64" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:05:19 np0005588920 nova_compute[226886]: 2026-01-20 15:05:19.932 226890 DEBUG oslo_concurrency.lockutils [None req-59782ff0-5433-4b02-a80c-20fa96626fcd 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "3e59fbab-2129-45b7-8fb1-997b2ccede64" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:05:19 np0005588920 nova_compute[226886]: 2026-01-20 15:05:19.960 226890 DEBUG nova.objects.instance [None req-59782ff0-5433-4b02-a80c-20fa96626fcd 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lazy-loading 'flavor' on Instance uuid 3e59fbab-2129-45b7-8fb1-997b2ccede64 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:05:20 np0005588920 nova_compute[226886]: 2026-01-20 15:05:20.098 226890 DEBUG oslo_concurrency.lockutils [None req-59782ff0-5433-4b02-a80c-20fa96626fcd 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "3e59fbab-2129-45b7-8fb1-997b2ccede64" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.166s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:05:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:20.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:20 np0005588920 nova_compute[226886]: 2026-01-20 15:05:20.532 226890 DEBUG oslo_concurrency.lockutils [None req-59782ff0-5433-4b02-a80c-20fa96626fcd 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Acquiring lock "3e59fbab-2129-45b7-8fb1-997b2ccede64" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:05:20 np0005588920 nova_compute[226886]: 2026-01-20 15:05:20.532 226890 DEBUG oslo_concurrency.lockutils [None req-59782ff0-5433-4b02-a80c-20fa96626fcd 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "3e59fbab-2129-45b7-8fb1-997b2ccede64" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:05:20 np0005588920 nova_compute[226886]: 2026-01-20 15:05:20.533 226890 INFO nova.compute.manager [None req-59782ff0-5433-4b02-a80c-20fa96626fcd 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Attaching volume d93c7cb1-0f58-41fe-b846-7d4d85545927 to /dev/vdb#033[00m
Jan 20 10:05:20 np0005588920 nova_compute[226886]: 2026-01-20 15:05:20.796 226890 DEBUG os_brick.utils [None req-59782ff0-5433-4b02-a80c-20fa96626fcd 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 20 10:05:20 np0005588920 nova_compute[226886]: 2026-01-20 15:05:20.797 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:05:20 np0005588920 nova_compute[226886]: 2026-01-20 15:05:20.809 231437 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.012s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:05:20 np0005588920 nova_compute[226886]: 2026-01-20 15:05:20.810 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[2f019919-fd3d-4179-a3e8-f06881b8e6e0]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:05:20 np0005588920 nova_compute[226886]: 2026-01-20 15:05:20.812 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:05:20 np0005588920 nova_compute[226886]: 2026-01-20 15:05:20.821 231437 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:05:20 np0005588920 nova_compute[226886]: 2026-01-20 15:05:20.821 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[cb4a1d32-779d-44dc-8759-1fb8df56d906]: (4, ('InitiatorName=iqn.1994-05.com.redhat:f7e7b2e5d28e', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:05:20 np0005588920 nova_compute[226886]: 2026-01-20 15:05:20.823 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:05:20 np0005588920 nova_compute[226886]: 2026-01-20 15:05:20.832 231437 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:05:20 np0005588920 nova_compute[226886]: 2026-01-20 15:05:20.832 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[ee7830fe-bf62-4793-a6e7-2bcf7e1b8e27]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:05:20 np0005588920 nova_compute[226886]: 2026-01-20 15:05:20.833 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[f2f58425-f2c9-48a7-b6de-39aafc24cd6c]: (4, '1190ec40-2b89-4358-8dec-733c5829fbed') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:05:20 np0005588920 nova_compute[226886]: 2026-01-20 15:05:20.834 226890 DEBUG oslo_concurrency.processutils [None req-59782ff0-5433-4b02-a80c-20fa96626fcd 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:05:20 np0005588920 nova_compute[226886]: 2026-01-20 15:05:20.866 226890 DEBUG oslo_concurrency.processutils [None req-59782ff0-5433-4b02-a80c-20fa96626fcd 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] CMD "nvme version" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:05:20 np0005588920 nova_compute[226886]: 2026-01-20 15:05:20.868 226890 DEBUG os_brick.initiator.connectors.lightos [None req-59782ff0-5433-4b02-a80c-20fa96626fcd 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 20 10:05:20 np0005588920 nova_compute[226886]: 2026-01-20 15:05:20.868 226890 DEBUG os_brick.initiator.connectors.lightos [None req-59782ff0-5433-4b02-a80c-20fa96626fcd 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 20 10:05:20 np0005588920 nova_compute[226886]: 2026-01-20 15:05:20.869 226890 DEBUG os_brick.initiator.connectors.lightos [None req-59782ff0-5433-4b02-a80c-20fa96626fcd 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 20 10:05:20 np0005588920 nova_compute[226886]: 2026-01-20 15:05:20.869 226890 DEBUG os_brick.utils [None req-59782ff0-5433-4b02-a80c-20fa96626fcd 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] <== get_connector_properties: return (72ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:f7e7b2e5d28e', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '1190ec40-2b89-4358-8dec-733c5829fbed', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 20 10:05:20 np0005588920 nova_compute[226886]: 2026-01-20 15:05:20.869 226890 DEBUG nova.virt.block_device [None req-59782ff0-5433-4b02-a80c-20fa96626fcd 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Updating existing volume attachment record: de3ef725-be5d-49bc-982b-c687c272bd73 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 20 10:05:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:21.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:21 np0005588920 nova_compute[226886]: 2026-01-20 15:05:21.568 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:21 np0005588920 nova_compute[226886]: 2026-01-20 15:05:21.985 226890 DEBUG nova.objects.instance [None req-59782ff0-5433-4b02-a80c-20fa96626fcd 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lazy-loading 'flavor' on Instance uuid 3e59fbab-2129-45b7-8fb1-997b2ccede64 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:05:22 np0005588920 nova_compute[226886]: 2026-01-20 15:05:22.016 226890 DEBUG nova.virt.libvirt.driver [None req-59782ff0-5433-4b02-a80c-20fa96626fcd 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Attempting to attach volume d93c7cb1-0f58-41fe-b846-7d4d85545927 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 20 10:05:22 np0005588920 nova_compute[226886]: 2026-01-20 15:05:22.019 226890 DEBUG nova.virt.libvirt.guest [None req-59782ff0-5433-4b02-a80c-20fa96626fcd 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] attach device xml: <disk type="network" device="disk">
Jan 20 10:05:22 np0005588920 nova_compute[226886]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 10:05:22 np0005588920 nova_compute[226886]:  <source protocol="rbd" name="volumes/volume-d93c7cb1-0f58-41fe-b846-7d4d85545927">
Jan 20 10:05:22 np0005588920 nova_compute[226886]:    <host name="192.168.122.100" port="6789"/>
Jan 20 10:05:22 np0005588920 nova_compute[226886]:    <host name="192.168.122.102" port="6789"/>
Jan 20 10:05:22 np0005588920 nova_compute[226886]:    <host name="192.168.122.101" port="6789"/>
Jan 20 10:05:22 np0005588920 nova_compute[226886]:  </source>
Jan 20 10:05:22 np0005588920 nova_compute[226886]:  <auth username="openstack">
Jan 20 10:05:22 np0005588920 nova_compute[226886]:    <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:05:22 np0005588920 nova_compute[226886]:  </auth>
Jan 20 10:05:22 np0005588920 nova_compute[226886]:  <target dev="vdb" bus="virtio"/>
Jan 20 10:05:22 np0005588920 nova_compute[226886]:  <serial>d93c7cb1-0f58-41fe-b846-7d4d85545927</serial>
Jan 20 10:05:22 np0005588920 nova_compute[226886]: </disk>
Jan 20 10:05:22 np0005588920 nova_compute[226886]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 20 10:05:22 np0005588920 nova_compute[226886]: 2026-01-20 15:05:22.360 226890 DEBUG nova.virt.libvirt.driver [None req-59782ff0-5433-4b02-a80c-20fa96626fcd 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:05:22 np0005588920 nova_compute[226886]: 2026-01-20 15:05:22.361 226890 DEBUG nova.virt.libvirt.driver [None req-59782ff0-5433-4b02-a80c-20fa96626fcd 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:05:22 np0005588920 nova_compute[226886]: 2026-01-20 15:05:22.361 226890 DEBUG nova.virt.libvirt.driver [None req-59782ff0-5433-4b02-a80c-20fa96626fcd 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:05:22 np0005588920 nova_compute[226886]: 2026-01-20 15:05:22.362 226890 DEBUG nova.virt.libvirt.driver [None req-59782ff0-5433-4b02-a80c-20fa96626fcd 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] No VIF found with MAC fa:16:3e:98:e3:f6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:05:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:05:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:22.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:05:22 np0005588920 nova_compute[226886]: 2026-01-20 15:05:22.682 226890 DEBUG oslo_concurrency.lockutils [None req-59782ff0-5433-4b02-a80c-20fa96626fcd 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "3e59fbab-2129-45b7-8fb1-997b2ccede64" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 2.150s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:05:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:05:23 np0005588920 nova_compute[226886]: 2026-01-20 15:05:23.059 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768921508.0577655, a0ce16c6-2b75-472f-a785-890fbb0d748e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:05:23 np0005588920 nova_compute[226886]: 2026-01-20 15:05:23.060 226890 INFO nova.compute.manager [-] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:05:23 np0005588920 nova_compute[226886]: 2026-01-20 15:05:23.086 226890 DEBUG nova.compute.manager [None req-359691d0-29c7-4175-830a-b262b4d96aac - - - - - -] [instance: a0ce16c6-2b75-472f-a785-890fbb0d748e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:05:23 np0005588920 nova_compute[226886]: 2026-01-20 15:05:23.189 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:05:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:23.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:05:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:24.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:25.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:25 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e359 e359: 3 total, 3 up, 3 in
Jan 20 10:05:26 np0005588920 nova_compute[226886]: 2026-01-20 15:05:26.382 226890 DEBUG oslo_concurrency.lockutils [None req-7eb676e0-212c-4327-81bc-e9e2131f216a 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Acquiring lock "3e59fbab-2129-45b7-8fb1-997b2ccede64" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:05:26 np0005588920 nova_compute[226886]: 2026-01-20 15:05:26.382 226890 DEBUG oslo_concurrency.lockutils [None req-7eb676e0-212c-4327-81bc-e9e2131f216a 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "3e59fbab-2129-45b7-8fb1-997b2ccede64" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:05:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:05:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:26.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:05:26 np0005588920 nova_compute[226886]: 2026-01-20 15:05:26.571 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:26 np0005588920 nova_compute[226886]: 2026-01-20 15:05:26.587 226890 DEBUG nova.objects.instance [None req-7eb676e0-212c-4327-81bc-e9e2131f216a 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lazy-loading 'flavor' on Instance uuid 3e59fbab-2129-45b7-8fb1-997b2ccede64 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:05:26 np0005588920 nova_compute[226886]: 2026-01-20 15:05:26.657 226890 DEBUG oslo_concurrency.lockutils [None req-7eb676e0-212c-4327-81bc-e9e2131f216a 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "3e59fbab-2129-45b7-8fb1-997b2ccede64" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.275s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:05:27 np0005588920 nova_compute[226886]: 2026-01-20 15:05:27.011 226890 DEBUG oslo_concurrency.lockutils [None req-7eb676e0-212c-4327-81bc-e9e2131f216a 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Acquiring lock "3e59fbab-2129-45b7-8fb1-997b2ccede64" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:05:27 np0005588920 nova_compute[226886]: 2026-01-20 15:05:27.011 226890 DEBUG oslo_concurrency.lockutils [None req-7eb676e0-212c-4327-81bc-e9e2131f216a 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "3e59fbab-2129-45b7-8fb1-997b2ccede64" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:05:27 np0005588920 nova_compute[226886]: 2026-01-20 15:05:27.012 226890 INFO nova.compute.manager [None req-7eb676e0-212c-4327-81bc-e9e2131f216a 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Attaching volume b2752651-d07b-4c58-8781-1b28cd47b400 to /dev/vdc#033[00m
Jan 20 10:05:27 np0005588920 nova_compute[226886]: 2026-01-20 15:05:27.221 226890 DEBUG os_brick.utils [None req-7eb676e0-212c-4327-81bc-e9e2131f216a 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 20 10:05:27 np0005588920 nova_compute[226886]: 2026-01-20 15:05:27.222 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:05:27 np0005588920 nova_compute[226886]: 2026-01-20 15:05:27.231 231437 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:05:27 np0005588920 nova_compute[226886]: 2026-01-20 15:05:27.232 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[39a442e0-12f7-4d86-a4e8-98b2c8049f93]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:05:27 np0005588920 nova_compute[226886]: 2026-01-20 15:05:27.233 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:05:27 np0005588920 nova_compute[226886]: 2026-01-20 15:05:27.241 231437 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:05:27 np0005588920 nova_compute[226886]: 2026-01-20 15:05:27.241 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[5ec56580-fa86-45e3-bf8d-2edd46270be1]: (4, ('InitiatorName=iqn.1994-05.com.redhat:f7e7b2e5d28e', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:05:27 np0005588920 nova_compute[226886]: 2026-01-20 15:05:27.242 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:05:27 np0005588920 nova_compute[226886]: 2026-01-20 15:05:27.250 231437 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:05:27 np0005588920 nova_compute[226886]: 2026-01-20 15:05:27.251 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[300f49c0-57fd-42fb-b5fe-9b2ac5b6145d]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:05:27 np0005588920 nova_compute[226886]: 2026-01-20 15:05:27.253 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[52154116-e649-4389-a411-0a75372d5585]: (4, '1190ec40-2b89-4358-8dec-733c5829fbed') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:05:27 np0005588920 nova_compute[226886]: 2026-01-20 15:05:27.253 226890 DEBUG oslo_concurrency.processutils [None req-7eb676e0-212c-4327-81bc-e9e2131f216a 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:05:27 np0005588920 nova_compute[226886]: 2026-01-20 15:05:27.285 226890 DEBUG oslo_concurrency.processutils [None req-7eb676e0-212c-4327-81bc-e9e2131f216a 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] CMD "nvme version" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:05:27 np0005588920 nova_compute[226886]: 2026-01-20 15:05:27.288 226890 DEBUG os_brick.initiator.connectors.lightos [None req-7eb676e0-212c-4327-81bc-e9e2131f216a 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 20 10:05:27 np0005588920 nova_compute[226886]: 2026-01-20 15:05:27.288 226890 DEBUG os_brick.initiator.connectors.lightos [None req-7eb676e0-212c-4327-81bc-e9e2131f216a 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 20 10:05:27 np0005588920 nova_compute[226886]: 2026-01-20 15:05:27.289 226890 DEBUG os_brick.initiator.connectors.lightos [None req-7eb676e0-212c-4327-81bc-e9e2131f216a 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 20 10:05:27 np0005588920 nova_compute[226886]: 2026-01-20 15:05:27.289 226890 DEBUG os_brick.utils [None req-7eb676e0-212c-4327-81bc-e9e2131f216a 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] <== get_connector_properties: return (66ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:f7e7b2e5d28e', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '1190ec40-2b89-4358-8dec-733c5829fbed', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 20 10:05:27 np0005588920 nova_compute[226886]: 2026-01-20 15:05:27.290 226890 DEBUG nova.virt.block_device [None req-7eb676e0-212c-4327-81bc-e9e2131f216a 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Updating existing volume attachment record: 82c493aa-2be0-4e8e-8cb6-f8854e8da0d8 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 20 10:05:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:27.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e359 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:05:28 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:05:28 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3884054008' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:05:28 np0005588920 nova_compute[226886]: 2026-01-20 15:05:28.190 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:28 np0005588920 nova_compute[226886]: 2026-01-20 15:05:28.397 226890 DEBUG nova.objects.instance [None req-7eb676e0-212c-4327-81bc-e9e2131f216a 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lazy-loading 'flavor' on Instance uuid 3e59fbab-2129-45b7-8fb1-997b2ccede64 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:05:28 np0005588920 nova_compute[226886]: 2026-01-20 15:05:28.436 226890 DEBUG nova.virt.libvirt.driver [None req-7eb676e0-212c-4327-81bc-e9e2131f216a 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Attempting to attach volume b2752651-d07b-4c58-8781-1b28cd47b400 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 20 10:05:28 np0005588920 nova_compute[226886]: 2026-01-20 15:05:28.438 226890 DEBUG nova.virt.libvirt.guest [None req-7eb676e0-212c-4327-81bc-e9e2131f216a 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] attach device xml: <disk type="network" device="disk">
Jan 20 10:05:28 np0005588920 nova_compute[226886]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 10:05:28 np0005588920 nova_compute[226886]:  <source protocol="rbd" name="volumes/volume-b2752651-d07b-4c58-8781-1b28cd47b400">
Jan 20 10:05:28 np0005588920 nova_compute[226886]:    <host name="192.168.122.100" port="6789"/>
Jan 20 10:05:28 np0005588920 nova_compute[226886]:    <host name="192.168.122.102" port="6789"/>
Jan 20 10:05:28 np0005588920 nova_compute[226886]:    <host name="192.168.122.101" port="6789"/>
Jan 20 10:05:28 np0005588920 nova_compute[226886]:  </source>
Jan 20 10:05:28 np0005588920 nova_compute[226886]:  <auth username="openstack">
Jan 20 10:05:28 np0005588920 nova_compute[226886]:    <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:05:28 np0005588920 nova_compute[226886]:  </auth>
Jan 20 10:05:28 np0005588920 nova_compute[226886]:  <target dev="vdc" bus="virtio"/>
Jan 20 10:05:28 np0005588920 nova_compute[226886]:  <serial>b2752651-d07b-4c58-8781-1b28cd47b400</serial>
Jan 20 10:05:28 np0005588920 nova_compute[226886]: </disk>
Jan 20 10:05:28 np0005588920 nova_compute[226886]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Jan 20 10:05:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:28.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:28 np0005588920 nova_compute[226886]: 2026-01-20 15:05:28.610 226890 DEBUG nova.virt.libvirt.driver [None req-7eb676e0-212c-4327-81bc-e9e2131f216a 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 20 10:05:28 np0005588920 nova_compute[226886]: 2026-01-20 15:05:28.611 226890 DEBUG nova.virt.libvirt.driver [None req-7eb676e0-212c-4327-81bc-e9e2131f216a 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 20 10:05:28 np0005588920 nova_compute[226886]: 2026-01-20 15:05:28.611 226890 DEBUG nova.virt.libvirt.driver [None req-7eb676e0-212c-4327-81bc-e9e2131f216a 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] No BDM found with device name vdc, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 20 10:05:28 np0005588920 nova_compute[226886]: 2026-01-20 15:05:28.611 226890 DEBUG nova.virt.libvirt.driver [None req-7eb676e0-212c-4327-81bc-e9e2131f216a 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 20 10:05:28 np0005588920 nova_compute[226886]: 2026-01-20 15:05:28.611 226890 DEBUG nova.virt.libvirt.driver [None req-7eb676e0-212c-4327-81bc-e9e2131f216a 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] No VIF found with MAC fa:16:3e:98:e3:f6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 20 10:05:29 np0005588920 nova_compute[226886]: 2026-01-20 15:05:29.056 226890 DEBUG oslo_concurrency.lockutils [None req-7eb676e0-212c-4327-81bc-e9e2131f216a 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "3e59fbab-2129-45b7-8fb1-997b2ccede64" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 2.045s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:05:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:29.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:30.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:31 np0005588920 nova_compute[226886]: 2026-01-20 15:05:31.001 226890 DEBUG oslo_concurrency.lockutils [None req-20958361-b786-4c15-b4f9-77f3fdf0389a 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Acquiring lock "3e59fbab-2129-45b7-8fb1-997b2ccede64" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:05:31 np0005588920 nova_compute[226886]: 2026-01-20 15:05:31.002 226890 DEBUG oslo_concurrency.lockutils [None req-20958361-b786-4c15-b4f9-77f3fdf0389a 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "3e59fbab-2129-45b7-8fb1-997b2ccede64" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:05:31 np0005588920 nova_compute[226886]: 2026-01-20 15:05:31.023 226890 INFO nova.compute.manager [None req-20958361-b786-4c15-b4f9-77f3fdf0389a 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Detaching volume d93c7cb1-0f58-41fe-b846-7d4d85545927
Jan 20 10:05:31 np0005588920 nova_compute[226886]: 2026-01-20 15:05:31.242 226890 INFO nova.virt.block_device [None req-20958361-b786-4c15-b4f9-77f3fdf0389a 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Attempting to driver detach volume d93c7cb1-0f58-41fe-b846-7d4d85545927 from mountpoint /dev/vdb
Jan 20 10:05:31 np0005588920 nova_compute[226886]: 2026-01-20 15:05:31.251 226890 DEBUG nova.virt.libvirt.driver [None req-20958361-b786-4c15-b4f9-77f3fdf0389a 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Attempting to detach device vdb from instance 3e59fbab-2129-45b7-8fb1-997b2ccede64 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487
Jan 20 10:05:31 np0005588920 nova_compute[226886]: 2026-01-20 15:05:31.252 226890 DEBUG nova.virt.libvirt.guest [None req-20958361-b786-4c15-b4f9-77f3fdf0389a 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] detach device xml: <disk type="network" device="disk">
Jan 20 10:05:31 np0005588920 nova_compute[226886]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 10:05:31 np0005588920 nova_compute[226886]:  <source protocol="rbd" name="volumes/volume-d93c7cb1-0f58-41fe-b846-7d4d85545927">
Jan 20 10:05:31 np0005588920 nova_compute[226886]:    <host name="192.168.122.100" port="6789"/>
Jan 20 10:05:31 np0005588920 nova_compute[226886]:    <host name="192.168.122.102" port="6789"/>
Jan 20 10:05:31 np0005588920 nova_compute[226886]:    <host name="192.168.122.101" port="6789"/>
Jan 20 10:05:31 np0005588920 nova_compute[226886]:  </source>
Jan 20 10:05:31 np0005588920 nova_compute[226886]:  <target dev="vdb" bus="virtio"/>
Jan 20 10:05:31 np0005588920 nova_compute[226886]:  <serial>d93c7cb1-0f58-41fe-b846-7d4d85545927</serial>
Jan 20 10:05:31 np0005588920 nova_compute[226886]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 20 10:05:31 np0005588920 nova_compute[226886]: </disk>
Jan 20 10:05:31 np0005588920 nova_compute[226886]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 20 10:05:31 np0005588920 nova_compute[226886]: 2026-01-20 15:05:31.259 226890 INFO nova.virt.libvirt.driver [None req-20958361-b786-4c15-b4f9-77f3fdf0389a 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Successfully detached device vdb from instance 3e59fbab-2129-45b7-8fb1-997b2ccede64 from the persistent domain config.
Jan 20 10:05:31 np0005588920 nova_compute[226886]: 2026-01-20 15:05:31.260 226890 DEBUG nova.virt.libvirt.driver [None req-20958361-b786-4c15-b4f9-77f3fdf0389a 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance 3e59fbab-2129-45b7-8fb1-997b2ccede64 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Jan 20 10:05:31 np0005588920 nova_compute[226886]: 2026-01-20 15:05:31.260 226890 DEBUG nova.virt.libvirt.guest [None req-20958361-b786-4c15-b4f9-77f3fdf0389a 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] detach device xml: <disk type="network" device="disk">
Jan 20 10:05:31 np0005588920 nova_compute[226886]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 10:05:31 np0005588920 nova_compute[226886]:  <source protocol="rbd" name="volumes/volume-d93c7cb1-0f58-41fe-b846-7d4d85545927">
Jan 20 10:05:31 np0005588920 nova_compute[226886]:    <host name="192.168.122.100" port="6789"/>
Jan 20 10:05:31 np0005588920 nova_compute[226886]:    <host name="192.168.122.102" port="6789"/>
Jan 20 10:05:31 np0005588920 nova_compute[226886]:    <host name="192.168.122.101" port="6789"/>
Jan 20 10:05:31 np0005588920 nova_compute[226886]:  </source>
Jan 20 10:05:31 np0005588920 nova_compute[226886]:  <target dev="vdb" bus="virtio"/>
Jan 20 10:05:31 np0005588920 nova_compute[226886]:  <serial>d93c7cb1-0f58-41fe-b846-7d4d85545927</serial>
Jan 20 10:05:31 np0005588920 nova_compute[226886]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 20 10:05:31 np0005588920 nova_compute[226886]: </disk>
Jan 20 10:05:31 np0005588920 nova_compute[226886]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 20 10:05:31 np0005588920 nova_compute[226886]: 2026-01-20 15:05:31.310 226890 DEBUG nova.virt.libvirt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Received event <DeviceRemovedEvent: 1768921531.3099658, 3e59fbab-2129-45b7-8fb1-997b2ccede64 => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Jan 20 10:05:31 np0005588920 nova_compute[226886]: 2026-01-20 15:05:31.312 226890 DEBUG nova.virt.libvirt.driver [None req-20958361-b786-4c15-b4f9-77f3fdf0389a 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance 3e59fbab-2129-45b7-8fb1-997b2ccede64 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Jan 20 10:05:31 np0005588920 nova_compute[226886]: 2026-01-20 15:05:31.314 226890 INFO nova.virt.libvirt.driver [None req-20958361-b786-4c15-b4f9-77f3fdf0389a 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Successfully detached device vdb from instance 3e59fbab-2129-45b7-8fb1-997b2ccede64 from the live domain config.
Jan 20 10:05:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:05:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:31.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:05:31 np0005588920 nova_compute[226886]: 2026-01-20 15:05:31.573 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:05:31 np0005588920 nova_compute[226886]: 2026-01-20 15:05:31.623 226890 DEBUG nova.objects.instance [None req-20958361-b786-4c15-b4f9-77f3fdf0389a 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lazy-loading 'flavor' on Instance uuid 3e59fbab-2129-45b7-8fb1-997b2ccede64 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 10:05:31 np0005588920 nova_compute[226886]: 2026-01-20 15:05:31.689 226890 DEBUG oslo_concurrency.lockutils [None req-20958361-b786-4c15-b4f9-77f3fdf0389a 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "3e59fbab-2129-45b7-8fb1-997b2ccede64" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 0.687s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:05:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:32.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e359 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:05:33 np0005588920 ovn_controller[133971]: 2026-01-20T15:05:33Z|00755|binding|INFO|Releasing lport 58f1013f-2d8d-46a7-97e6-2062537e7f1a from this chassis (sb_readonly=0)
Jan 20 10:05:33 np0005588920 nova_compute[226886]: 2026-01-20 15:05:33.192 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:05:33 np0005588920 nova_compute[226886]: 2026-01-20 15:05:33.217 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:05:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:33.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:33 np0005588920 nova_compute[226886]: 2026-01-20 15:05:33.607 226890 DEBUG oslo_concurrency.lockutils [None req-57268b41-f2b1-4bd0-8a5f-7afa3c7d8edf 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Acquiring lock "3e59fbab-2129-45b7-8fb1-997b2ccede64" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:05:33 np0005588920 nova_compute[226886]: 2026-01-20 15:05:33.608 226890 DEBUG oslo_concurrency.lockutils [None req-57268b41-f2b1-4bd0-8a5f-7afa3c7d8edf 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "3e59fbab-2129-45b7-8fb1-997b2ccede64" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:05:33 np0005588920 nova_compute[226886]: 2026-01-20 15:05:33.627 226890 INFO nova.compute.manager [None req-57268b41-f2b1-4bd0-8a5f-7afa3c7d8edf 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Detaching volume b2752651-d07b-4c58-8781-1b28cd47b400
Jan 20 10:05:33 np0005588920 nova_compute[226886]: 2026-01-20 15:05:33.880 226890 INFO nova.virt.block_device [None req-57268b41-f2b1-4bd0-8a5f-7afa3c7d8edf 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Attempting to driver detach volume b2752651-d07b-4c58-8781-1b28cd47b400 from mountpoint /dev/vdc
Jan 20 10:05:33 np0005588920 nova_compute[226886]: 2026-01-20 15:05:33.886 226890 DEBUG nova.virt.libvirt.driver [None req-57268b41-f2b1-4bd0-8a5f-7afa3c7d8edf 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Attempting to detach device vdc from instance 3e59fbab-2129-45b7-8fb1-997b2ccede64 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487
Jan 20 10:05:33 np0005588920 nova_compute[226886]: 2026-01-20 15:05:33.887 226890 DEBUG nova.virt.libvirt.guest [None req-57268b41-f2b1-4bd0-8a5f-7afa3c7d8edf 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] detach device xml: <disk type="network" device="disk">
Jan 20 10:05:33 np0005588920 nova_compute[226886]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 10:05:33 np0005588920 nova_compute[226886]:  <source protocol="rbd" name="volumes/volume-b2752651-d07b-4c58-8781-1b28cd47b400">
Jan 20 10:05:33 np0005588920 nova_compute[226886]:    <host name="192.168.122.100" port="6789"/>
Jan 20 10:05:33 np0005588920 nova_compute[226886]:    <host name="192.168.122.102" port="6789"/>
Jan 20 10:05:33 np0005588920 nova_compute[226886]:    <host name="192.168.122.101" port="6789"/>
Jan 20 10:05:33 np0005588920 nova_compute[226886]:  </source>
Jan 20 10:05:33 np0005588920 nova_compute[226886]:  <target dev="vdc" bus="virtio"/>
Jan 20 10:05:33 np0005588920 nova_compute[226886]:  <serial>b2752651-d07b-4c58-8781-1b28cd47b400</serial>
Jan 20 10:05:33 np0005588920 nova_compute[226886]:  <address type="pci" domain="0x0000" bus="0x07" slot="0x00" function="0x0"/>
Jan 20 10:05:33 np0005588920 nova_compute[226886]: </disk>
Jan 20 10:05:33 np0005588920 nova_compute[226886]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 20 10:05:33 np0005588920 nova_compute[226886]: 2026-01-20 15:05:33.892 226890 INFO nova.virt.libvirt.driver [None req-57268b41-f2b1-4bd0-8a5f-7afa3c7d8edf 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Successfully detached device vdc from instance 3e59fbab-2129-45b7-8fb1-997b2ccede64 from the persistent domain config.
Jan 20 10:05:33 np0005588920 nova_compute[226886]: 2026-01-20 15:05:33.893 226890 DEBUG nova.virt.libvirt.driver [None req-57268b41-f2b1-4bd0-8a5f-7afa3c7d8edf 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] (1/8): Attempting to detach device vdc with device alias virtio-disk2 from instance 3e59fbab-2129-45b7-8fb1-997b2ccede64 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Jan 20 10:05:33 np0005588920 nova_compute[226886]: 2026-01-20 15:05:33.893 226890 DEBUG nova.virt.libvirt.guest [None req-57268b41-f2b1-4bd0-8a5f-7afa3c7d8edf 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] detach device xml: <disk type="network" device="disk">
Jan 20 10:05:33 np0005588920 nova_compute[226886]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 10:05:33 np0005588920 nova_compute[226886]:  <source protocol="rbd" name="volumes/volume-b2752651-d07b-4c58-8781-1b28cd47b400">
Jan 20 10:05:33 np0005588920 nova_compute[226886]:    <host name="192.168.122.100" port="6789"/>
Jan 20 10:05:33 np0005588920 nova_compute[226886]:    <host name="192.168.122.102" port="6789"/>
Jan 20 10:05:33 np0005588920 nova_compute[226886]:    <host name="192.168.122.101" port="6789"/>
Jan 20 10:05:33 np0005588920 nova_compute[226886]:  </source>
Jan 20 10:05:33 np0005588920 nova_compute[226886]:  <target dev="vdc" bus="virtio"/>
Jan 20 10:05:33 np0005588920 nova_compute[226886]:  <serial>b2752651-d07b-4c58-8781-1b28cd47b400</serial>
Jan 20 10:05:33 np0005588920 nova_compute[226886]:  <address type="pci" domain="0x0000" bus="0x07" slot="0x00" function="0x0"/>
Jan 20 10:05:33 np0005588920 nova_compute[226886]: </disk>
Jan 20 10:05:33 np0005588920 nova_compute[226886]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 20 10:05:33 np0005588920 nova_compute[226886]: 2026-01-20 15:05:33.943 226890 DEBUG nova.virt.libvirt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Received event <DeviceRemovedEvent: 1768921533.9433842, 3e59fbab-2129-45b7-8fb1-997b2ccede64 => virtio-disk2> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Jan 20 10:05:33 np0005588920 nova_compute[226886]: 2026-01-20 15:05:33.945 226890 DEBUG nova.virt.libvirt.driver [None req-57268b41-f2b1-4bd0-8a5f-7afa3c7d8edf 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Start waiting for the detach event from libvirt for device vdc with device alias virtio-disk2 for instance 3e59fbab-2129-45b7-8fb1-997b2ccede64 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Jan 20 10:05:33 np0005588920 nova_compute[226886]: 2026-01-20 15:05:33.947 226890 INFO nova.virt.libvirt.driver [None req-57268b41-f2b1-4bd0-8a5f-7afa3c7d8edf 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Successfully detached device vdc from instance 3e59fbab-2129-45b7-8fb1-997b2ccede64 from the live domain config.
Jan 20 10:05:34 np0005588920 nova_compute[226886]: 2026-01-20 15:05:34.244 226890 DEBUG nova.objects.instance [None req-57268b41-f2b1-4bd0-8a5f-7afa3c7d8edf 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lazy-loading 'flavor' on Instance uuid 3e59fbab-2129-45b7-8fb1-997b2ccede64 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 10:05:34 np0005588920 nova_compute[226886]: 2026-01-20 15:05:34.357 226890 DEBUG oslo_concurrency.lockutils [None req-57268b41-f2b1-4bd0-8a5f-7afa3c7d8edf 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "3e59fbab-2129-45b7-8fb1-997b2ccede64" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 0.749s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:05:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:34.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:35.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:36.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:36 np0005588920 nova_compute[226886]: 2026-01-20 15:05:36.544 226890 DEBUG oslo_concurrency.lockutils [None req-d39569d2-c772-47f0-9f0e-b4e84889a43b 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Acquiring lock "3e59fbab-2129-45b7-8fb1-997b2ccede64" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:05:36 np0005588920 nova_compute[226886]: 2026-01-20 15:05:36.544 226890 DEBUG oslo_concurrency.lockutils [None req-d39569d2-c772-47f0-9f0e-b4e84889a43b 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "3e59fbab-2129-45b7-8fb1-997b2ccede64" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:05:36 np0005588920 nova_compute[226886]: 2026-01-20 15:05:36.545 226890 DEBUG oslo_concurrency.lockutils [None req-d39569d2-c772-47f0-9f0e-b4e84889a43b 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Acquiring lock "3e59fbab-2129-45b7-8fb1-997b2ccede64-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:05:36 np0005588920 nova_compute[226886]: 2026-01-20 15:05:36.545 226890 DEBUG oslo_concurrency.lockutils [None req-d39569d2-c772-47f0-9f0e-b4e84889a43b 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "3e59fbab-2129-45b7-8fb1-997b2ccede64-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:05:36 np0005588920 nova_compute[226886]: 2026-01-20 15:05:36.545 226890 DEBUG oslo_concurrency.lockutils [None req-d39569d2-c772-47f0-9f0e-b4e84889a43b 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "3e59fbab-2129-45b7-8fb1-997b2ccede64-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:05:36 np0005588920 nova_compute[226886]: 2026-01-20 15:05:36.546 226890 INFO nova.compute.manager [None req-d39569d2-c772-47f0-9f0e-b4e84889a43b 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Terminating instance
Jan 20 10:05:36 np0005588920 nova_compute[226886]: 2026-01-20 15:05:36.547 226890 DEBUG nova.compute.manager [None req-d39569d2-c772-47f0-9f0e-b4e84889a43b 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 20 10:05:36 np0005588920 nova_compute[226886]: 2026-01-20 15:05:36.575 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:05:36 np0005588920 kernel: tap8c5a8745-5b (unregistering): left promiscuous mode
Jan 20 10:05:36 np0005588920 NetworkManager[49076]: <info>  [1768921536.6025] device (tap8c5a8745-5b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:05:36 np0005588920 nova_compute[226886]: 2026-01-20 15:05:36.611 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:05:36 np0005588920 ovn_controller[133971]: 2026-01-20T15:05:36Z|00756|binding|INFO|Releasing lport 8c5a8745-5b2f-47c8-9968-acd29a3f46c6 from this chassis (sb_readonly=0)
Jan 20 10:05:36 np0005588920 ovn_controller[133971]: 2026-01-20T15:05:36Z|00757|binding|INFO|Setting lport 8c5a8745-5b2f-47c8-9968-acd29a3f46c6 down in Southbound
Jan 20 10:05:36 np0005588920 ovn_controller[133971]: 2026-01-20T15:05:36Z|00758|binding|INFO|Removing iface tap8c5a8745-5b ovn-installed in OVS
Jan 20 10:05:36 np0005588920 nova_compute[226886]: 2026-01-20 15:05:36.614 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:05:36 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:05:36.624 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:98:e3:f6 10.100.0.13'], port_security=['fa:16:3e:98:e3:f6 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '3e59fbab-2129-45b7-8fb1-997b2ccede64', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-89fdd65f-3dd2-4375-a946-3c5de73cc24a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '96f7b14c2a9348f08305fe232df2a603', 'neutron:revision_number': '4', 'neutron:security_group_ids': '14a85866-1e42-4f6c-80fa-7b6fb27c4433', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.192'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=07b54ff9-b8ec-4b9d-ab83-0d9fa6361dd1, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=8c5a8745-5b2f-47c8-9968-acd29a3f46c6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 20 10:05:36 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:05:36.626 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 8c5a8745-5b2f-47c8-9968-acd29a3f46c6 in datapath 89fdd65f-3dd2-4375-a946-3c5de73cc24a unbound from our chassis
Jan 20 10:05:36 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:05:36.627 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 89fdd65f-3dd2-4375-a946-3c5de73cc24a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 20 10:05:36 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:05:36.628 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[30556961-cadf-4b7c-840c-0d331b60fb3c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:05:36 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:05:36.629 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a namespace which is not needed anymore
Jan 20 10:05:36 np0005588920 nova_compute[226886]: 2026-01-20 15:05:36.630 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:05:36 np0005588920 systemd[1]: machine-qemu\x2d76\x2dinstance\x2d000000a0.scope: Deactivated successfully.
Jan 20 10:05:36 np0005588920 systemd[1]: machine-qemu\x2d76\x2dinstance\x2d000000a0.scope: Consumed 15.039s CPU time.
Jan 20 10:05:36 np0005588920 systemd-machined[196121]: Machine qemu-76-instance-000000a0 terminated.
Jan 20 10:05:36 np0005588920 neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a[286799]: [NOTICE]   (286803) : haproxy version is 2.8.14-c23fe91
Jan 20 10:05:36 np0005588920 neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a[286799]: [NOTICE]   (286803) : path to executable is /usr/sbin/haproxy
Jan 20 10:05:36 np0005588920 neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a[286799]: [WARNING]  (286803) : Exiting Master process...
Jan 20 10:05:36 np0005588920 neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a[286799]: [WARNING]  (286803) : Exiting Master process...
Jan 20 10:05:36 np0005588920 neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a[286799]: [ALERT]    (286803) : Current worker (286805) exited with code 143 (Terminated)
Jan 20 10:05:36 np0005588920 neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a[286799]: [WARNING]  (286803) : All workers exited. Exiting... (0)
Jan 20 10:05:36 np0005588920 nova_compute[226886]: 2026-01-20 15:05:36.764 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:36 np0005588920 systemd[1]: libpod-21d3f2073b32b7051b3f1a5fbb6a70f6e0d0336a1eacec845dd65e7d0809ffd9.scope: Deactivated successfully.
Jan 20 10:05:36 np0005588920 nova_compute[226886]: 2026-01-20 15:05:36.769 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:36 np0005588920 podman[287112]: 2026-01-20 15:05:36.772479098 +0000 UTC m=+0.049567766 container died 21d3f2073b32b7051b3f1a5fbb6a70f6e0d0336a1eacec845dd65e7d0809ffd9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:05:36 np0005588920 nova_compute[226886]: 2026-01-20 15:05:36.782 226890 INFO nova.virt.libvirt.driver [-] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Instance destroyed successfully.#033[00m
Jan 20 10:05:36 np0005588920 nova_compute[226886]: 2026-01-20 15:05:36.782 226890 DEBUG nova.objects.instance [None req-d39569d2-c772-47f0-9f0e-b4e84889a43b 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lazy-loading 'resources' on Instance uuid 3e59fbab-2129-45b7-8fb1-997b2ccede64 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:05:36 np0005588920 nova_compute[226886]: 2026-01-20 15:05:36.802 226890 DEBUG nova.virt.libvirt.vif [None req-d39569d2-c772-47f0-9f0e-b4e84889a43b 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:04:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-2146772163',display_name='tempest-AttachVolumeTestJSON-server-2146772163',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumetestjson-server-2146772163',id=160,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMShu/UKrpC8IxRbXkByseIIDJT578k3TS0wOkHyBL1Nfel3atiUiXbZZQd23fr6BcQS57L5ztA9MT+neK/RSmXp3/2MHpk0f5u9h29ogwqYigXBQGeq9oHbFQrdd9SSSQ==',key_name='tempest-keypair-865842409',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:04:55Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='96f7b14c2a9348f08305fe232df2a603',ramdisk_id='',reservation_id='r-cadrdpy6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeTestJSON-583320363',owner_user_name='tempest-AttachVolumeTestJSON-583320363-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:04:55Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='912329b1a6ad42bdb72e952c03983bdf',uuid=3e59fbab-2129-45b7-8fb1-997b2ccede64,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8c5a8745-5b2f-47c8-9968-acd29a3f46c6", "address": "fa:16:3e:98:e3:f6", "network": {"id": "89fdd65f-3dd2-4375-a946-3c5de73cc24a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1916368729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96f7b14c2a9348f08305fe232df2a603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c5a8745-5b", "ovs_interfaceid": "8c5a8745-5b2f-47c8-9968-acd29a3f46c6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:05:36 np0005588920 nova_compute[226886]: 2026-01-20 15:05:36.803 226890 DEBUG nova.network.os_vif_util [None req-d39569d2-c772-47f0-9f0e-b4e84889a43b 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Converting VIF {"id": "8c5a8745-5b2f-47c8-9968-acd29a3f46c6", "address": "fa:16:3e:98:e3:f6", "network": {"id": "89fdd65f-3dd2-4375-a946-3c5de73cc24a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1916368729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96f7b14c2a9348f08305fe232df2a603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c5a8745-5b", "ovs_interfaceid": "8c5a8745-5b2f-47c8-9968-acd29a3f46c6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:05:36 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-21d3f2073b32b7051b3f1a5fbb6a70f6e0d0336a1eacec845dd65e7d0809ffd9-userdata-shm.mount: Deactivated successfully.
Jan 20 10:05:36 np0005588920 systemd[1]: var-lib-containers-storage-overlay-ad86864a5a531ad1de5aa2b8828511b47f464a558e851c605ee03237d7919f26-merged.mount: Deactivated successfully.
Jan 20 10:05:36 np0005588920 nova_compute[226886]: 2026-01-20 15:05:36.807 226890 DEBUG nova.network.os_vif_util [None req-d39569d2-c772-47f0-9f0e-b4e84889a43b 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:98:e3:f6,bridge_name='br-int',has_traffic_filtering=True,id=8c5a8745-5b2f-47c8-9968-acd29a3f46c6,network=Network(89fdd65f-3dd2-4375-a946-3c5de73cc24a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c5a8745-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:05:36 np0005588920 nova_compute[226886]: 2026-01-20 15:05:36.808 226890 DEBUG os_vif [None req-d39569d2-c772-47f0-9f0e-b4e84889a43b 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:98:e3:f6,bridge_name='br-int',has_traffic_filtering=True,id=8c5a8745-5b2f-47c8-9968-acd29a3f46c6,network=Network(89fdd65f-3dd2-4375-a946-3c5de73cc24a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c5a8745-5b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:05:36 np0005588920 nova_compute[226886]: 2026-01-20 15:05:36.810 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:36 np0005588920 nova_compute[226886]: 2026-01-20 15:05:36.810 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8c5a8745-5b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:05:36 np0005588920 nova_compute[226886]: 2026-01-20 15:05:36.811 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:36 np0005588920 nova_compute[226886]: 2026-01-20 15:05:36.812 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:36 np0005588920 podman[287112]: 2026-01-20 15:05:36.813363396 +0000 UTC m=+0.090452044 container cleanup 21d3f2073b32b7051b3f1a5fbb6a70f6e0d0336a1eacec845dd65e7d0809ffd9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:05:36 np0005588920 nova_compute[226886]: 2026-01-20 15:05:36.814 226890 INFO os_vif [None req-d39569d2-c772-47f0-9f0e-b4e84889a43b 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:98:e3:f6,bridge_name='br-int',has_traffic_filtering=True,id=8c5a8745-5b2f-47c8-9968-acd29a3f46c6,network=Network(89fdd65f-3dd2-4375-a946-3c5de73cc24a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c5a8745-5b')#033[00m
Jan 20 10:05:36 np0005588920 systemd[1]: libpod-conmon-21d3f2073b32b7051b3f1a5fbb6a70f6e0d0336a1eacec845dd65e7d0809ffd9.scope: Deactivated successfully.
Jan 20 10:05:36 np0005588920 podman[287159]: 2026-01-20 15:05:36.877131996 +0000 UTC m=+0.038706716 container remove 21d3f2073b32b7051b3f1a5fbb6a70f6e0d0336a1eacec845dd65e7d0809ffd9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Jan 20 10:05:36 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:05:36.882 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[872118d0-c0d3-4022-81ab-fd30218ca94f]: (4, ('Tue Jan 20 03:05:36 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a (21d3f2073b32b7051b3f1a5fbb6a70f6e0d0336a1eacec845dd65e7d0809ffd9)\n21d3f2073b32b7051b3f1a5fbb6a70f6e0d0336a1eacec845dd65e7d0809ffd9\nTue Jan 20 03:05:36 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a (21d3f2073b32b7051b3f1a5fbb6a70f6e0d0336a1eacec845dd65e7d0809ffd9)\n21d3f2073b32b7051b3f1a5fbb6a70f6e0d0336a1eacec845dd65e7d0809ffd9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:05:36 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:05:36.884 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[6628c1ca-573b-47dc-98de-a016c39fb06a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:05:36 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:05:36.884 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap89fdd65f-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:05:36 np0005588920 kernel: tap89fdd65f-30: left promiscuous mode
Jan 20 10:05:36 np0005588920 nova_compute[226886]: 2026-01-20 15:05:36.887 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:36 np0005588920 nova_compute[226886]: 2026-01-20 15:05:36.899 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:36 np0005588920 nova_compute[226886]: 2026-01-20 15:05:36.900 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:36 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:05:36.902 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[63d9465b-1303-4fd8-89d7-c22f6fb6fa2c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:05:36 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:05:36.920 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[22bee007-db82-4066-bcf5-f02f6358052d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:05:36 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:05:36.922 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c1b08a86-0460-4435-ac6b-a53819b30e78]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:05:36 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:05:36.938 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[20185a7f-636a-44b2-b981-0fb69887e043]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 653236, 'reachable_time': 16362, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 287185, 'error': None, 'target': 'ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:05:36 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:05:36.940 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-89fdd65f-3dd2-4375-a946-3c5de73cc24a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:05:36 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:05:36.941 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[d51b5c09-15d1-446a-9b67-6f1982d46f17]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:05:36 np0005588920 systemd[1]: run-netns-ovnmeta\x2d89fdd65f\x2d3dd2\x2d4375\x2da946\x2d3c5de73cc24a.mount: Deactivated successfully.
Jan 20 10:05:37 np0005588920 nova_compute[226886]: 2026-01-20 15:05:37.278 226890 INFO nova.virt.libvirt.driver [None req-d39569d2-c772-47f0-9f0e-b4e84889a43b 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Deleting instance files /var/lib/nova/instances/3e59fbab-2129-45b7-8fb1-997b2ccede64_del#033[00m
Jan 20 10:05:37 np0005588920 nova_compute[226886]: 2026-01-20 15:05:37.280 226890 INFO nova.virt.libvirt.driver [None req-d39569d2-c772-47f0-9f0e-b4e84889a43b 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Deletion of /var/lib/nova/instances/3e59fbab-2129-45b7-8fb1-997b2ccede64_del complete#033[00m
Jan 20 10:05:37 np0005588920 nova_compute[226886]: 2026-01-20 15:05:37.381 226890 INFO nova.compute.manager [None req-d39569d2-c772-47f0-9f0e-b4e84889a43b 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Took 0.83 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:05:37 np0005588920 nova_compute[226886]: 2026-01-20 15:05:37.382 226890 DEBUG oslo.service.loopingcall [None req-d39569d2-c772-47f0-9f0e-b4e84889a43b 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:05:37 np0005588920 nova_compute[226886]: 2026-01-20 15:05:37.382 226890 DEBUG nova.compute.manager [-] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:05:37 np0005588920 nova_compute[226886]: 2026-01-20 15:05:37.382 226890 DEBUG nova.network.neutron [-] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:05:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:37.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e359 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:05:38 np0005588920 nova_compute[226886]: 2026-01-20 15:05:38.183 226890 DEBUG nova.compute.manager [req-fbb92971-44d7-4ce9-b685-acb6762c5cb8 req-b4f7e989-a505-42ef-8116-ad7fe6357067 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Received event network-vif-unplugged-8c5a8745-5b2f-47c8-9968-acd29a3f46c6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:05:38 np0005588920 nova_compute[226886]: 2026-01-20 15:05:38.184 226890 DEBUG oslo_concurrency.lockutils [req-fbb92971-44d7-4ce9-b685-acb6762c5cb8 req-b4f7e989-a505-42ef-8116-ad7fe6357067 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "3e59fbab-2129-45b7-8fb1-997b2ccede64-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:05:38 np0005588920 nova_compute[226886]: 2026-01-20 15:05:38.184 226890 DEBUG oslo_concurrency.lockutils [req-fbb92971-44d7-4ce9-b685-acb6762c5cb8 req-b4f7e989-a505-42ef-8116-ad7fe6357067 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3e59fbab-2129-45b7-8fb1-997b2ccede64-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:05:38 np0005588920 nova_compute[226886]: 2026-01-20 15:05:38.185 226890 DEBUG oslo_concurrency.lockutils [req-fbb92971-44d7-4ce9-b685-acb6762c5cb8 req-b4f7e989-a505-42ef-8116-ad7fe6357067 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3e59fbab-2129-45b7-8fb1-997b2ccede64-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:05:38 np0005588920 nova_compute[226886]: 2026-01-20 15:05:38.185 226890 DEBUG nova.compute.manager [req-fbb92971-44d7-4ce9-b685-acb6762c5cb8 req-b4f7e989-a505-42ef-8116-ad7fe6357067 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] No waiting events found dispatching network-vif-unplugged-8c5a8745-5b2f-47c8-9968-acd29a3f46c6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:05:38 np0005588920 nova_compute[226886]: 2026-01-20 15:05:38.186 226890 DEBUG nova.compute.manager [req-fbb92971-44d7-4ce9-b685-acb6762c5cb8 req-b4f7e989-a505-42ef-8116-ad7fe6357067 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Received event network-vif-unplugged-8c5a8745-5b2f-47c8-9968-acd29a3f46c6 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:05:38 np0005588920 nova_compute[226886]: 2026-01-20 15:05:38.186 226890 DEBUG nova.compute.manager [req-fbb92971-44d7-4ce9-b685-acb6762c5cb8 req-b4f7e989-a505-42ef-8116-ad7fe6357067 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Received event network-vif-plugged-8c5a8745-5b2f-47c8-9968-acd29a3f46c6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:05:38 np0005588920 nova_compute[226886]: 2026-01-20 15:05:38.187 226890 DEBUG oslo_concurrency.lockutils [req-fbb92971-44d7-4ce9-b685-acb6762c5cb8 req-b4f7e989-a505-42ef-8116-ad7fe6357067 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "3e59fbab-2129-45b7-8fb1-997b2ccede64-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:05:38 np0005588920 nova_compute[226886]: 2026-01-20 15:05:38.187 226890 DEBUG oslo_concurrency.lockutils [req-fbb92971-44d7-4ce9-b685-acb6762c5cb8 req-b4f7e989-a505-42ef-8116-ad7fe6357067 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3e59fbab-2129-45b7-8fb1-997b2ccede64-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:05:38 np0005588920 nova_compute[226886]: 2026-01-20 15:05:38.187 226890 DEBUG oslo_concurrency.lockutils [req-fbb92971-44d7-4ce9-b685-acb6762c5cb8 req-b4f7e989-a505-42ef-8116-ad7fe6357067 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3e59fbab-2129-45b7-8fb1-997b2ccede64-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:05:38 np0005588920 nova_compute[226886]: 2026-01-20 15:05:38.188 226890 DEBUG nova.compute.manager [req-fbb92971-44d7-4ce9-b685-acb6762c5cb8 req-b4f7e989-a505-42ef-8116-ad7fe6357067 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] No waiting events found dispatching network-vif-plugged-8c5a8745-5b2f-47c8-9968-acd29a3f46c6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:05:38 np0005588920 nova_compute[226886]: 2026-01-20 15:05:38.188 226890 WARNING nova.compute.manager [req-fbb92971-44d7-4ce9-b685-acb6762c5cb8 req-b4f7e989-a505-42ef-8116-ad7fe6357067 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Received unexpected event network-vif-plugged-8c5a8745-5b2f-47c8-9968-acd29a3f46c6 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 10:05:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:05:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:38.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:05:38 np0005588920 podman[287211]: 2026-01-20 15:05:38.793923914 +0000 UTC m=+0.083951037 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 20 10:05:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:05:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:39.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:05:40 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:05:40 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:05:40 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:05:40 np0005588920 nova_compute[226886]: 2026-01-20 15:05:40.342 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:40 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:05:40.341 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=51, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=50) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:05:40 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:05:40.344 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:05:40 np0005588920 nova_compute[226886]: 2026-01-20 15:05:40.352 226890 DEBUG nova.network.neutron [-] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:05:40 np0005588920 nova_compute[226886]: 2026-01-20 15:05:40.395 226890 INFO nova.compute.manager [-] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Took 3.01 seconds to deallocate network for instance.#033[00m
Jan 20 10:05:40 np0005588920 nova_compute[226886]: 2026-01-20 15:05:40.464 226890 DEBUG oslo_concurrency.lockutils [None req-d39569d2-c772-47f0-9f0e-b4e84889a43b 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:05:40 np0005588920 nova_compute[226886]: 2026-01-20 15:05:40.465 226890 DEBUG oslo_concurrency.lockutils [None req-d39569d2-c772-47f0-9f0e-b4e84889a43b 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:05:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:40.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:40 np0005588920 nova_compute[226886]: 2026-01-20 15:05:40.534 226890 DEBUG oslo_concurrency.processutils [None req-d39569d2-c772-47f0-9f0e-b4e84889a43b 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:05:40 np0005588920 nova_compute[226886]: 2026-01-20 15:05:40.576 226890 DEBUG nova.compute.manager [req-2fff9bfe-6f73-42e5-a016-e65a9d443e8b req-e4d5b39c-d598-440c-a834-d30dae6decde 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Received event network-vif-deleted-8c5a8745-5b2f-47c8-9968-acd29a3f46c6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:05:40 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:05:40 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1135024478' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:05:40 np0005588920 nova_compute[226886]: 2026-01-20 15:05:40.986 226890 DEBUG oslo_concurrency.processutils [None req-d39569d2-c772-47f0-9f0e-b4e84889a43b 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:05:40 np0005588920 nova_compute[226886]: 2026-01-20 15:05:40.991 226890 DEBUG nova.compute.provider_tree [None req-d39569d2-c772-47f0-9f0e-b4e84889a43b 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:05:41 np0005588920 nova_compute[226886]: 2026-01-20 15:05:41.011 226890 DEBUG nova.scheduler.client.report [None req-d39569d2-c772-47f0-9f0e-b4e84889a43b 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:05:41 np0005588920 nova_compute[226886]: 2026-01-20 15:05:41.043 226890 DEBUG oslo_concurrency.lockutils [None req-d39569d2-c772-47f0-9f0e-b4e84889a43b 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.578s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:05:41 np0005588920 nova_compute[226886]: 2026-01-20 15:05:41.085 226890 INFO nova.scheduler.client.report [None req-d39569d2-c772-47f0-9f0e-b4e84889a43b 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Deleted allocations for instance 3e59fbab-2129-45b7-8fb1-997b2ccede64#033[00m
Jan 20 10:05:41 np0005588920 nova_compute[226886]: 2026-01-20 15:05:41.191 226890 DEBUG oslo_concurrency.lockutils [None req-d39569d2-c772-47f0-9f0e-b4e84889a43b 912329b1a6ad42bdb72e952c03983bdf 96f7b14c2a9348f08305fe232df2a603 - - default default] Lock "3e59fbab-2129-45b7-8fb1-997b2ccede64" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.647s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:05:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:41.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:41 np0005588920 nova_compute[226886]: 2026-01-20 15:05:41.578 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:41 np0005588920 nova_compute[226886]: 2026-01-20 15:05:41.813 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:05:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:42.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:05:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e359 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:05:43 np0005588920 nova_compute[226886]: 2026-01-20 15:05:43.101 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:43.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:05:44.349 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '51'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:05:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:44.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:45 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:05:45 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/186537975' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:05:45 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:05:45 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/186537975' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:05:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:45.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:46 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:05:46 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:05:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:46.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:46 np0005588920 nova_compute[226886]: 2026-01-20 15:05:46.580 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:46 np0005588920 nova_compute[226886]: 2026-01-20 15:05:46.815 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:47.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e359 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:05:48 np0005588920 nova_compute[226886]: 2026-01-20 15:05:48.044 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:05:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:48.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:05:48 np0005588920 podman[287416]: 2026-01-20 15:05:48.991095144 +0000 UTC m=+0.073841648 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 20 10:05:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:49.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:05:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:50.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:05:50 np0005588920 nova_compute[226886]: 2026-01-20 15:05:50.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:05:51 np0005588920 nova_compute[226886]: 2026-01-20 15:05:51.582 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:51.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:51 np0005588920 nova_compute[226886]: 2026-01-20 15:05:51.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:05:51 np0005588920 nova_compute[226886]: 2026-01-20 15:05:51.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:05:51 np0005588920 nova_compute[226886]: 2026-01-20 15:05:51.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:05:51 np0005588920 nova_compute[226886]: 2026-01-20 15:05:51.756 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:05:51 np0005588920 nova_compute[226886]: 2026-01-20 15:05:51.781 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768921536.7798412, 3e59fbab-2129-45b7-8fb1-997b2ccede64 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:05:51 np0005588920 nova_compute[226886]: 2026-01-20 15:05:51.781 226890 INFO nova.compute.manager [-] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:05:51 np0005588920 nova_compute[226886]: 2026-01-20 15:05:51.818 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:51 np0005588920 nova_compute[226886]: 2026-01-20 15:05:51.824 226890 DEBUG nova.compute.manager [None req-bef488fe-c7c4-4b18-b939-e006770b094c - - - - - -] [instance: 3e59fbab-2129-45b7-8fb1-997b2ccede64] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:05:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:52.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e359 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:05:53 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:05:53 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3737704240' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:05:53 np0005588920 nova_compute[226886]: 2026-01-20 15:05:53.354 226890 DEBUG oslo_concurrency.lockutils [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Acquiring lock "beaa17a1-aac9-450b-8036-a3b2b9e10bb3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:05:53 np0005588920 nova_compute[226886]: 2026-01-20 15:05:53.355 226890 DEBUG oslo_concurrency.lockutils [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lock "beaa17a1-aac9-450b-8036-a3b2b9e10bb3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:05:53 np0005588920 nova_compute[226886]: 2026-01-20 15:05:53.384 226890 DEBUG nova.compute.manager [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 10:05:53 np0005588920 nova_compute[226886]: 2026-01-20 15:05:53.463 226890 DEBUG oslo_concurrency.lockutils [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:05:53 np0005588920 nova_compute[226886]: 2026-01-20 15:05:53.463 226890 DEBUG oslo_concurrency.lockutils [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:05:53 np0005588920 nova_compute[226886]: 2026-01-20 15:05:53.470 226890 DEBUG nova.virt.hardware [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 10:05:53 np0005588920 nova_compute[226886]: 2026-01-20 15:05:53.471 226890 INFO nova.compute.claims [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 10:05:53 np0005588920 nova_compute[226886]: 2026-01-20 15:05:53.493 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:53.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:53 np0005588920 nova_compute[226886]: 2026-01-20 15:05:53.595 226890 DEBUG oslo_concurrency.processutils [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:05:53 np0005588920 nova_compute[226886]: 2026-01-20 15:05:53.811 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:54 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:05:54 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2589043131' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:05:54 np0005588920 nova_compute[226886]: 2026-01-20 15:05:54.041 226890 DEBUG oslo_concurrency.processutils [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:05:54 np0005588920 nova_compute[226886]: 2026-01-20 15:05:54.047 226890 DEBUG nova.compute.provider_tree [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:05:54 np0005588920 nova_compute[226886]: 2026-01-20 15:05:54.061 226890 DEBUG nova.scheduler.client.report [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:05:54 np0005588920 nova_compute[226886]: 2026-01-20 15:05:54.091 226890 DEBUG oslo_concurrency.lockutils [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.628s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:05:54 np0005588920 nova_compute[226886]: 2026-01-20 15:05:54.092 226890 DEBUG nova.compute.manager [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 10:05:54 np0005588920 nova_compute[226886]: 2026-01-20 15:05:54.159 226890 DEBUG nova.compute.manager [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 10:05:54 np0005588920 nova_compute[226886]: 2026-01-20 15:05:54.160 226890 DEBUG nova.network.neutron [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 10:05:54 np0005588920 nova_compute[226886]: 2026-01-20 15:05:54.184 226890 INFO nova.virt.libvirt.driver [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 10:05:54 np0005588920 nova_compute[226886]: 2026-01-20 15:05:54.210 226890 DEBUG nova.compute.manager [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 10:05:54 np0005588920 nova_compute[226886]: 2026-01-20 15:05:54.454 226890 DEBUG nova.compute.manager [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 10:05:54 np0005588920 nova_compute[226886]: 2026-01-20 15:05:54.456 226890 DEBUG nova.virt.libvirt.driver [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 10:05:54 np0005588920 nova_compute[226886]: 2026-01-20 15:05:54.456 226890 INFO nova.virt.libvirt.driver [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Creating image(s)#033[00m
Jan 20 10:05:54 np0005588920 nova_compute[226886]: 2026-01-20 15:05:54.481 226890 DEBUG nova.storage.rbd_utils [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] rbd image beaa17a1-aac9-450b-8036-a3b2b9e10bb3_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:05:54 np0005588920 nova_compute[226886]: 2026-01-20 15:05:54.511 226890 DEBUG nova.storage.rbd_utils [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] rbd image beaa17a1-aac9-450b-8036-a3b2b9e10bb3_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:05:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:54.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:54 np0005588920 nova_compute[226886]: 2026-01-20 15:05:54.536 226890 DEBUG nova.storage.rbd_utils [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] rbd image beaa17a1-aac9-450b-8036-a3b2b9e10bb3_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:05:54 np0005588920 nova_compute[226886]: 2026-01-20 15:05:54.539 226890 DEBUG oslo_concurrency.processutils [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:05:54 np0005588920 nova_compute[226886]: 2026-01-20 15:05:54.611 226890 DEBUG oslo_concurrency.processutils [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:05:54 np0005588920 nova_compute[226886]: 2026-01-20 15:05:54.612 226890 DEBUG oslo_concurrency.lockutils [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:05:54 np0005588920 nova_compute[226886]: 2026-01-20 15:05:54.612 226890 DEBUG oslo_concurrency.lockutils [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:05:54 np0005588920 nova_compute[226886]: 2026-01-20 15:05:54.613 226890 DEBUG oslo_concurrency.lockutils [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:05:54 np0005588920 nova_compute[226886]: 2026-01-20 15:05:54.639 226890 DEBUG nova.storage.rbd_utils [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] rbd image beaa17a1-aac9-450b-8036-a3b2b9e10bb3_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:05:54 np0005588920 nova_compute[226886]: 2026-01-20 15:05:54.642 226890 DEBUG oslo_concurrency.processutils [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 beaa17a1-aac9-450b-8036-a3b2b9e10bb3_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:05:54 np0005588920 nova_compute[226886]: 2026-01-20 15:05:54.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:05:54 np0005588920 nova_compute[226886]: 2026-01-20 15:05:54.972 226890 DEBUG nova.policy [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a2beb3d6247e457abd6e8d93cc602f02', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5e161d5a47f845fd89eb3f10627a0830', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 10:05:54 np0005588920 nova_compute[226886]: 2026-01-20 15:05:54.998 226890 DEBUG oslo_concurrency.processutils [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 beaa17a1-aac9-450b-8036-a3b2b9e10bb3_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.356s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:05:55 np0005588920 nova_compute[226886]: 2026-01-20 15:05:55.060 226890 DEBUG nova.storage.rbd_utils [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] resizing rbd image beaa17a1-aac9-450b-8036-a3b2b9e10bb3_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 10:05:55 np0005588920 nova_compute[226886]: 2026-01-20 15:05:55.341 226890 DEBUG nova.objects.instance [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lazy-loading 'migration_context' on Instance uuid beaa17a1-aac9-450b-8036-a3b2b9e10bb3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:05:55 np0005588920 nova_compute[226886]: 2026-01-20 15:05:55.366 226890 DEBUG nova.virt.libvirt.driver [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 10:05:55 np0005588920 nova_compute[226886]: 2026-01-20 15:05:55.367 226890 DEBUG nova.virt.libvirt.driver [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Ensure instance console log exists: /var/lib/nova/instances/beaa17a1-aac9-450b-8036-a3b2b9e10bb3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:05:55 np0005588920 nova_compute[226886]: 2026-01-20 15:05:55.367 226890 DEBUG oslo_concurrency.lockutils [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:05:55 np0005588920 nova_compute[226886]: 2026-01-20 15:05:55.368 226890 DEBUG oslo_concurrency.lockutils [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:05:55 np0005588920 nova_compute[226886]: 2026-01-20 15:05:55.368 226890 DEBUG oslo_concurrency.lockutils [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:05:55 np0005588920 nova_compute[226886]: 2026-01-20 15:05:55.590 226890 DEBUG nova.network.neutron [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Successfully created port: 69adac22-9a54-4fc6-a0ad-9775bd380e99 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 10:05:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:55.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:55 np0005588920 nova_compute[226886]: 2026-01-20 15:05:55.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:05:55 np0005588920 nova_compute[226886]: 2026-01-20 15:05:55.740 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:05:55 np0005588920 nova_compute[226886]: 2026-01-20 15:05:55.740 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:05:55 np0005588920 nova_compute[226886]: 2026-01-20 15:05:55.741 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:05:55 np0005588920 nova_compute[226886]: 2026-01-20 15:05:55.741 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:05:55 np0005588920 nova_compute[226886]: 2026-01-20 15:05:55.741 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:05:56 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:05:56 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2070363681' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:05:56 np0005588920 nova_compute[226886]: 2026-01-20 15:05:56.180 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:05:56 np0005588920 nova_compute[226886]: 2026-01-20 15:05:56.347 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:05:56 np0005588920 nova_compute[226886]: 2026-01-20 15:05:56.349 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4177MB free_disk=20.942607879638672GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:05:56 np0005588920 nova_compute[226886]: 2026-01-20 15:05:56.349 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:05:56 np0005588920 nova_compute[226886]: 2026-01-20 15:05:56.350 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:05:56 np0005588920 nova_compute[226886]: 2026-01-20 15:05:56.471 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance beaa17a1-aac9-450b-8036-a3b2b9e10bb3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:05:56 np0005588920 nova_compute[226886]: 2026-01-20 15:05:56.471 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:05:56 np0005588920 nova_compute[226886]: 2026-01-20 15:05:56.471 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:05:56 np0005588920 nova_compute[226886]: 2026-01-20 15:05:56.501 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:05:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:56.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:56 np0005588920 nova_compute[226886]: 2026-01-20 15:05:56.563 226890 DEBUG nova.network.neutron [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Successfully updated port: 69adac22-9a54-4fc6-a0ad-9775bd380e99 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 10:05:56 np0005588920 nova_compute[226886]: 2026-01-20 15:05:56.584 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:56 np0005588920 nova_compute[226886]: 2026-01-20 15:05:56.602 226890 DEBUG oslo_concurrency.lockutils [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Acquiring lock "refresh_cache-beaa17a1-aac9-450b-8036-a3b2b9e10bb3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:05:56 np0005588920 nova_compute[226886]: 2026-01-20 15:05:56.602 226890 DEBUG oslo_concurrency.lockutils [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Acquired lock "refresh_cache-beaa17a1-aac9-450b-8036-a3b2b9e10bb3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:05:56 np0005588920 nova_compute[226886]: 2026-01-20 15:05:56.602 226890 DEBUG nova.network.neutron [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:05:56 np0005588920 nova_compute[226886]: 2026-01-20 15:05:56.749 226890 DEBUG nova.compute.manager [req-563e2a55-dca3-41b2-8fca-014e9a26b7b8 req-ac6689d4-73fc-44b6-b9f9-5e46e5214d41 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Received event network-changed-69adac22-9a54-4fc6-a0ad-9775bd380e99 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:05:56 np0005588920 nova_compute[226886]: 2026-01-20 15:05:56.750 226890 DEBUG nova.compute.manager [req-563e2a55-dca3-41b2-8fca-014e9a26b7b8 req-ac6689d4-73fc-44b6-b9f9-5e46e5214d41 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Refreshing instance network info cache due to event network-changed-69adac22-9a54-4fc6-a0ad-9775bd380e99. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:05:56 np0005588920 nova_compute[226886]: 2026-01-20 15:05:56.750 226890 DEBUG oslo_concurrency.lockutils [req-563e2a55-dca3-41b2-8fca-014e9a26b7b8 req-ac6689d4-73fc-44b6-b9f9-5e46e5214d41 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-beaa17a1-aac9-450b-8036-a3b2b9e10bb3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:05:56 np0005588920 nova_compute[226886]: 2026-01-20 15:05:56.819 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:56 np0005588920 nova_compute[226886]: 2026-01-20 15:05:56.892 226890 DEBUG nova.network.neutron [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:05:56 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:05:56 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3670247231' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:05:56 np0005588920 nova_compute[226886]: 2026-01-20 15:05:56.954 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:05:56 np0005588920 nova_compute[226886]: 2026-01-20 15:05:56.960 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:05:56 np0005588920 nova_compute[226886]: 2026-01-20 15:05:56.973 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:05:56 np0005588920 nova_compute[226886]: 2026-01-20 15:05:56.995 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:05:56 np0005588920 nova_compute[226886]: 2026-01-20 15:05:56.996 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.646s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:05:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:57.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e359 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:05:57 np0005588920 nova_compute[226886]: 2026-01-20 15:05:57.996 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:05:57 np0005588920 nova_compute[226886]: 2026-01-20 15:05:57.997 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:05:57 np0005588920 nova_compute[226886]: 2026-01-20 15:05:57.997 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:05:57 np0005588920 nova_compute[226886]: 2026-01-20 15:05:57.997 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:05:57 np0005588920 nova_compute[226886]: 2026-01-20 15:05:57.997 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:05:58 np0005588920 nova_compute[226886]: 2026-01-20 15:05:58.180 226890 DEBUG nova.network.neutron [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Updating instance_info_cache with network_info: [{"id": "69adac22-9a54-4fc6-a0ad-9775bd380e99", "address": "fa:16:3e:c5:49:ce", "network": {"id": "5bac39b9-563a-456f-9168-fd10b1b28c21", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-3053955-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "5e161d5a47f845fd89eb3f10627a0830", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69adac22-9a", "ovs_interfaceid": "69adac22-9a54-4fc6-a0ad-9775bd380e99", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:05:58 np0005588920 nova_compute[226886]: 2026-01-20 15:05:58.209 226890 DEBUG oslo_concurrency.lockutils [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Releasing lock "refresh_cache-beaa17a1-aac9-450b-8036-a3b2b9e10bb3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:05:58 np0005588920 nova_compute[226886]: 2026-01-20 15:05:58.210 226890 DEBUG nova.compute.manager [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Instance network_info: |[{"id": "69adac22-9a54-4fc6-a0ad-9775bd380e99", "address": "fa:16:3e:c5:49:ce", "network": {"id": "5bac39b9-563a-456f-9168-fd10b1b28c21", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-3053955-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "5e161d5a47f845fd89eb3f10627a0830", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69adac22-9a", "ovs_interfaceid": "69adac22-9a54-4fc6-a0ad-9775bd380e99", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 10:05:58 np0005588920 nova_compute[226886]: 2026-01-20 15:05:58.210 226890 DEBUG oslo_concurrency.lockutils [req-563e2a55-dca3-41b2-8fca-014e9a26b7b8 req-ac6689d4-73fc-44b6-b9f9-5e46e5214d41 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-beaa17a1-aac9-450b-8036-a3b2b9e10bb3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:05:58 np0005588920 nova_compute[226886]: 2026-01-20 15:05:58.210 226890 DEBUG nova.network.neutron [req-563e2a55-dca3-41b2-8fca-014e9a26b7b8 req-ac6689d4-73fc-44b6-b9f9-5e46e5214d41 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Refreshing network info cache for port 69adac22-9a54-4fc6-a0ad-9775bd380e99 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:05:58 np0005588920 nova_compute[226886]: 2026-01-20 15:05:58.213 226890 DEBUG nova.virt.libvirt.driver [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Start _get_guest_xml network_info=[{"id": "69adac22-9a54-4fc6-a0ad-9775bd380e99", "address": "fa:16:3e:c5:49:ce", "network": {"id": "5bac39b9-563a-456f-9168-fd10b1b28c21", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-3053955-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "5e161d5a47f845fd89eb3f10627a0830", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69adac22-9a", "ovs_interfaceid": "69adac22-9a54-4fc6-a0ad-9775bd380e99", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:05:58 np0005588920 nova_compute[226886]: 2026-01-20 15:05:58.217 226890 WARNING nova.virt.libvirt.driver [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:05:58 np0005588920 nova_compute[226886]: 2026-01-20 15:05:58.220 226890 DEBUG nova.virt.libvirt.host [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:05:58 np0005588920 nova_compute[226886]: 2026-01-20 15:05:58.221 226890 DEBUG nova.virt.libvirt.host [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:05:58 np0005588920 nova_compute[226886]: 2026-01-20 15:05:58.224 226890 DEBUG nova.virt.libvirt.host [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:05:58 np0005588920 nova_compute[226886]: 2026-01-20 15:05:58.224 226890 DEBUG nova.virt.libvirt.host [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:05:58 np0005588920 nova_compute[226886]: 2026-01-20 15:05:58.225 226890 DEBUG nova.virt.libvirt.driver [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:05:58 np0005588920 nova_compute[226886]: 2026-01-20 15:05:58.225 226890 DEBUG nova.virt.hardware [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:05:58 np0005588920 nova_compute[226886]: 2026-01-20 15:05:58.226 226890 DEBUG nova.virt.hardware [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:05:58 np0005588920 nova_compute[226886]: 2026-01-20 15:05:58.226 226890 DEBUG nova.virt.hardware [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:05:58 np0005588920 nova_compute[226886]: 2026-01-20 15:05:58.226 226890 DEBUG nova.virt.hardware [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:05:58 np0005588920 nova_compute[226886]: 2026-01-20 15:05:58.226 226890 DEBUG nova.virt.hardware [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:05:58 np0005588920 nova_compute[226886]: 2026-01-20 15:05:58.226 226890 DEBUG nova.virt.hardware [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:05:58 np0005588920 nova_compute[226886]: 2026-01-20 15:05:58.227 226890 DEBUG nova.virt.hardware [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:05:58 np0005588920 nova_compute[226886]: 2026-01-20 15:05:58.227 226890 DEBUG nova.virt.hardware [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:05:58 np0005588920 nova_compute[226886]: 2026-01-20 15:05:58.227 226890 DEBUG nova.virt.hardware [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:05:58 np0005588920 nova_compute[226886]: 2026-01-20 15:05:58.227 226890 DEBUG nova.virt.hardware [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:05:58 np0005588920 nova_compute[226886]: 2026-01-20 15:05:58.228 226890 DEBUG nova.virt.hardware [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:05:58 np0005588920 nova_compute[226886]: 2026-01-20 15:05:58.230 226890 DEBUG oslo_concurrency.processutils [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:05:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:58.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:58 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:05:58 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3048424037' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:05:58 np0005588920 nova_compute[226886]: 2026-01-20 15:05:58.668 226890 DEBUG oslo_concurrency.processutils [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:05:58 np0005588920 nova_compute[226886]: 2026-01-20 15:05:58.690 226890 DEBUG nova.storage.rbd_utils [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] rbd image beaa17a1-aac9-450b-8036-a3b2b9e10bb3_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:05:58 np0005588920 nova_compute[226886]: 2026-01-20 15:05:58.694 226890 DEBUG oslo_concurrency.processutils [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:05:58 np0005588920 nova_compute[226886]: 2026-01-20 15:05:58.721 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:05:59 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:05:59 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2837693437' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:05:59 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e360 e360: 3 total, 3 up, 3 in
Jan 20 10:05:59 np0005588920 nova_compute[226886]: 2026-01-20 15:05:59.548 226890 DEBUG oslo_concurrency.processutils [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.855s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:05:59 np0005588920 nova_compute[226886]: 2026-01-20 15:05:59.550 226890 DEBUG nova.virt.libvirt.vif [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:05:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-2068211489',display_name='tempest-ServerRescueTestJSON-server-2068211489',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-2068211489',id=163,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5e161d5a47f845fd89eb3f10627a0830',ramdisk_id='',reservation_id='r-5fu5wq70',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSON-1151598672',owner_user_name='tempest-ServerRescueTestJSON-1
151598672-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:05:54Z,user_data=None,user_id='a2beb3d6247e457abd6e8d93cc602f02',uuid=beaa17a1-aac9-450b-8036-a3b2b9e10bb3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "69adac22-9a54-4fc6-a0ad-9775bd380e99", "address": "fa:16:3e:c5:49:ce", "network": {"id": "5bac39b9-563a-456f-9168-fd10b1b28c21", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-3053955-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "5e161d5a47f845fd89eb3f10627a0830", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69adac22-9a", "ovs_interfaceid": "69adac22-9a54-4fc6-a0ad-9775bd380e99", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:05:59 np0005588920 nova_compute[226886]: 2026-01-20 15:05:59.550 226890 DEBUG nova.network.os_vif_util [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Converting VIF {"id": "69adac22-9a54-4fc6-a0ad-9775bd380e99", "address": "fa:16:3e:c5:49:ce", "network": {"id": "5bac39b9-563a-456f-9168-fd10b1b28c21", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-3053955-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "5e161d5a47f845fd89eb3f10627a0830", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69adac22-9a", "ovs_interfaceid": "69adac22-9a54-4fc6-a0ad-9775bd380e99", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:05:59 np0005588920 nova_compute[226886]: 2026-01-20 15:05:59.551 226890 DEBUG nova.network.os_vif_util [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c5:49:ce,bridge_name='br-int',has_traffic_filtering=True,id=69adac22-9a54-4fc6-a0ad-9775bd380e99,network=Network(5bac39b9-563a-456f-9168-fd10b1b28c21),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69adac22-9a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:05:59 np0005588920 nova_compute[226886]: 2026-01-20 15:05:59.552 226890 DEBUG nova.objects.instance [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lazy-loading 'pci_devices' on Instance uuid beaa17a1-aac9-450b-8036-a3b2b9e10bb3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:05:59 np0005588920 nova_compute[226886]: 2026-01-20 15:05:59.572 226890 DEBUG nova.virt.libvirt.driver [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:05:59 np0005588920 nova_compute[226886]:  <uuid>beaa17a1-aac9-450b-8036-a3b2b9e10bb3</uuid>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:  <name>instance-000000a3</name>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:05:59 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:      <nova:name>tempest-ServerRescueTestJSON-server-2068211489</nova:name>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 15:05:58</nova:creationTime>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 10:05:59 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:        <nova:user uuid="a2beb3d6247e457abd6e8d93cc602f02">tempest-ServerRescueTestJSON-1151598672-project-member</nova:user>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:        <nova:project uuid="5e161d5a47f845fd89eb3f10627a0830">tempest-ServerRescueTestJSON-1151598672</nova:project>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:        <nova:port uuid="69adac22-9a54-4fc6-a0ad-9775bd380e99">
Jan 20 10:05:59 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    <system>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:      <entry name="serial">beaa17a1-aac9-450b-8036-a3b2b9e10bb3</entry>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:      <entry name="uuid">beaa17a1-aac9-450b-8036-a3b2b9e10bb3</entry>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    </system>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:  <os>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:  </os>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:  <features>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:  </features>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:  </clock>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:  <devices>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 10:05:59 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/beaa17a1-aac9-450b-8036-a3b2b9e10bb3_disk">
Jan 20 10:05:59 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:05:59 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 10:05:59 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/beaa17a1-aac9-450b-8036-a3b2b9e10bb3_disk.config">
Jan 20 10:05:59 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:05:59 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 10:05:59 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:c5:49:ce"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:      <target dev="tap69adac22-9a"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    </interface>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 10:05:59 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/beaa17a1-aac9-450b-8036-a3b2b9e10bb3/console.log" append="off"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    </serial>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    <video>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    </video>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 10:05:59 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    </rng>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 10:05:59 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 10:05:59 np0005588920 nova_compute[226886]:  </devices>
Jan 20 10:05:59 np0005588920 nova_compute[226886]: </domain>
Jan 20 10:05:59 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:05:59 np0005588920 nova_compute[226886]: 2026-01-20 15:05:59.574 226890 DEBUG nova.compute.manager [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Preparing to wait for external event network-vif-plugged-69adac22-9a54-4fc6-a0ad-9775bd380e99 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:05:59 np0005588920 nova_compute[226886]: 2026-01-20 15:05:59.574 226890 DEBUG oslo_concurrency.lockutils [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Acquiring lock "beaa17a1-aac9-450b-8036-a3b2b9e10bb3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:05:59 np0005588920 nova_compute[226886]: 2026-01-20 15:05:59.574 226890 DEBUG oslo_concurrency.lockutils [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lock "beaa17a1-aac9-450b-8036-a3b2b9e10bb3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:05:59 np0005588920 nova_compute[226886]: 2026-01-20 15:05:59.575 226890 DEBUG oslo_concurrency.lockutils [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lock "beaa17a1-aac9-450b-8036-a3b2b9e10bb3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:05:59 np0005588920 nova_compute[226886]: 2026-01-20 15:05:59.575 226890 DEBUG nova.virt.libvirt.vif [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:05:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-2068211489',display_name='tempest-ServerRescueTestJSON-server-2068211489',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-2068211489',id=163,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5e161d5a47f845fd89eb3f10627a0830',ramdisk_id='',reservation_id='r-5fu5wq70',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSON-1151598672',owner_user_name='tempest-ServerRescue
TestJSON-1151598672-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:05:54Z,user_data=None,user_id='a2beb3d6247e457abd6e8d93cc602f02',uuid=beaa17a1-aac9-450b-8036-a3b2b9e10bb3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "69adac22-9a54-4fc6-a0ad-9775bd380e99", "address": "fa:16:3e:c5:49:ce", "network": {"id": "5bac39b9-563a-456f-9168-fd10b1b28c21", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-3053955-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "5e161d5a47f845fd89eb3f10627a0830", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69adac22-9a", "ovs_interfaceid": "69adac22-9a54-4fc6-a0ad-9775bd380e99", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:05:59 np0005588920 nova_compute[226886]: 2026-01-20 15:05:59.576 226890 DEBUG nova.network.os_vif_util [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Converting VIF {"id": "69adac22-9a54-4fc6-a0ad-9775bd380e99", "address": "fa:16:3e:c5:49:ce", "network": {"id": "5bac39b9-563a-456f-9168-fd10b1b28c21", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-3053955-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "5e161d5a47f845fd89eb3f10627a0830", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69adac22-9a", "ovs_interfaceid": "69adac22-9a54-4fc6-a0ad-9775bd380e99", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:05:59 np0005588920 nova_compute[226886]: 2026-01-20 15:05:59.576 226890 DEBUG nova.network.os_vif_util [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c5:49:ce,bridge_name='br-int',has_traffic_filtering=True,id=69adac22-9a54-4fc6-a0ad-9775bd380e99,network=Network(5bac39b9-563a-456f-9168-fd10b1b28c21),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69adac22-9a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:05:59 np0005588920 nova_compute[226886]: 2026-01-20 15:05:59.577 226890 DEBUG os_vif [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c5:49:ce,bridge_name='br-int',has_traffic_filtering=True,id=69adac22-9a54-4fc6-a0ad-9775bd380e99,network=Network(5bac39b9-563a-456f-9168-fd10b1b28c21),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69adac22-9a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:05:59 np0005588920 nova_compute[226886]: 2026-01-20 15:05:59.577 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:59 np0005588920 nova_compute[226886]: 2026-01-20 15:05:59.578 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:05:59 np0005588920 nova_compute[226886]: 2026-01-20 15:05:59.578 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:05:59 np0005588920 nova_compute[226886]: 2026-01-20 15:05:59.580 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:59 np0005588920 nova_compute[226886]: 2026-01-20 15:05:59.581 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap69adac22-9a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:05:59 np0005588920 nova_compute[226886]: 2026-01-20 15:05:59.581 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap69adac22-9a, col_values=(('external_ids', {'iface-id': '69adac22-9a54-4fc6-a0ad-9775bd380e99', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c5:49:ce', 'vm-uuid': 'beaa17a1-aac9-450b-8036-a3b2b9e10bb3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:05:59 np0005588920 nova_compute[226886]: 2026-01-20 15:05:59.583 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:59 np0005588920 NetworkManager[49076]: <info>  [1768921559.5836] manager: (tap69adac22-9a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/366)
Jan 20 10:05:59 np0005588920 nova_compute[226886]: 2026-01-20 15:05:59.586 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:05:59 np0005588920 nova_compute[226886]: 2026-01-20 15:05:59.589 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:59 np0005588920 nova_compute[226886]: 2026-01-20 15:05:59.590 226890 INFO os_vif [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c5:49:ce,bridge_name='br-int',has_traffic_filtering=True,id=69adac22-9a54-4fc6-a0ad-9775bd380e99,network=Network(5bac39b9-563a-456f-9168-fd10b1b28c21),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69adac22-9a')#033[00m
Jan 20 10:05:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:05:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:59.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:59 np0005588920 nova_compute[226886]: 2026-01-20 15:05:59.650 226890 DEBUG nova.virt.libvirt.driver [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:05:59 np0005588920 nova_compute[226886]: 2026-01-20 15:05:59.650 226890 DEBUG nova.virt.libvirt.driver [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:05:59 np0005588920 nova_compute[226886]: 2026-01-20 15:05:59.650 226890 DEBUG nova.virt.libvirt.driver [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] No VIF found with MAC fa:16:3e:c5:49:ce, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:05:59 np0005588920 nova_compute[226886]: 2026-01-20 15:05:59.651 226890 INFO nova.virt.libvirt.driver [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Using config drive#033[00m
Jan 20 10:05:59 np0005588920 nova_compute[226886]: 2026-01-20 15:05:59.678 226890 DEBUG nova.storage.rbd_utils [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] rbd image beaa17a1-aac9-450b-8036-a3b2b9e10bb3_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:05:59 np0005588920 nova_compute[226886]: 2026-01-20 15:05:59.945 226890 DEBUG nova.network.neutron [req-563e2a55-dca3-41b2-8fca-014e9a26b7b8 req-ac6689d4-73fc-44b6-b9f9-5e46e5214d41 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Updated VIF entry in instance network info cache for port 69adac22-9a54-4fc6-a0ad-9775bd380e99. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:05:59 np0005588920 nova_compute[226886]: 2026-01-20 15:05:59.945 226890 DEBUG nova.network.neutron [req-563e2a55-dca3-41b2-8fca-014e9a26b7b8 req-ac6689d4-73fc-44b6-b9f9-5e46e5214d41 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Updating instance_info_cache with network_info: [{"id": "69adac22-9a54-4fc6-a0ad-9775bd380e99", "address": "fa:16:3e:c5:49:ce", "network": {"id": "5bac39b9-563a-456f-9168-fd10b1b28c21", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-3053955-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "5e161d5a47f845fd89eb3f10627a0830", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69adac22-9a", "ovs_interfaceid": "69adac22-9a54-4fc6-a0ad-9775bd380e99", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:06:00 np0005588920 nova_compute[226886]: 2026-01-20 15:06:00.159 226890 DEBUG oslo_concurrency.lockutils [req-563e2a55-dca3-41b2-8fca-014e9a26b7b8 req-ac6689d4-73fc-44b6-b9f9-5e46e5214d41 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-beaa17a1-aac9-450b-8036-a3b2b9e10bb3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:06:00 np0005588920 nova_compute[226886]: 2026-01-20 15:06:00.290 226890 INFO nova.virt.libvirt.driver [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Creating config drive at /var/lib/nova/instances/beaa17a1-aac9-450b-8036-a3b2b9e10bb3/disk.config#033[00m
Jan 20 10:06:00 np0005588920 nova_compute[226886]: 2026-01-20 15:06:00.294 226890 DEBUG oslo_concurrency.processutils [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/beaa17a1-aac9-450b-8036-a3b2b9e10bb3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxtob0ljx execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:06:00 np0005588920 nova_compute[226886]: 2026-01-20 15:06:00.428 226890 DEBUG oslo_concurrency.processutils [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/beaa17a1-aac9-450b-8036-a3b2b9e10bb3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxtob0ljx" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:06:00 np0005588920 nova_compute[226886]: 2026-01-20 15:06:00.464 226890 DEBUG nova.storage.rbd_utils [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] rbd image beaa17a1-aac9-450b-8036-a3b2b9e10bb3_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:06:00 np0005588920 nova_compute[226886]: 2026-01-20 15:06:00.469 226890 DEBUG oslo_concurrency.processutils [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/beaa17a1-aac9-450b-8036-a3b2b9e10bb3/disk.config beaa17a1-aac9-450b-8036-a3b2b9e10bb3_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:06:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:06:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:00.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:06:00 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e361 e361: 3 total, 3 up, 3 in
Jan 20 10:06:00 np0005588920 nova_compute[226886]: 2026-01-20 15:06:00.651 226890 DEBUG oslo_concurrency.processutils [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/beaa17a1-aac9-450b-8036-a3b2b9e10bb3/disk.config beaa17a1-aac9-450b-8036-a3b2b9e10bb3_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.182s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:06:00 np0005588920 nova_compute[226886]: 2026-01-20 15:06:00.652 226890 INFO nova.virt.libvirt.driver [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Deleting local config drive /var/lib/nova/instances/beaa17a1-aac9-450b-8036-a3b2b9e10bb3/disk.config because it was imported into RBD.#033[00m
Jan 20 10:06:00 np0005588920 kernel: tap69adac22-9a: entered promiscuous mode
Jan 20 10:06:00 np0005588920 NetworkManager[49076]: <info>  [1768921560.7109] manager: (tap69adac22-9a): new Tun device (/org/freedesktop/NetworkManager/Devices/367)
Jan 20 10:06:00 np0005588920 nova_compute[226886]: 2026-01-20 15:06:00.712 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:00 np0005588920 ovn_controller[133971]: 2026-01-20T15:06:00Z|00759|binding|INFO|Claiming lport 69adac22-9a54-4fc6-a0ad-9775bd380e99 for this chassis.
Jan 20 10:06:00 np0005588920 ovn_controller[133971]: 2026-01-20T15:06:00Z|00760|binding|INFO|69adac22-9a54-4fc6-a0ad-9775bd380e99: Claiming fa:16:3e:c5:49:ce 10.100.0.11
Jan 20 10:06:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:06:00.726 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c5:49:ce 10.100.0.11'], port_security=['fa:16:3e:c5:49:ce 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'beaa17a1-aac9-450b-8036-a3b2b9e10bb3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5bac39b9-563a-456f-9168-fd10b1b28c21', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e161d5a47f845fd89eb3f10627a0830', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cd72b979-cfcf-4dbd-bbff-8e22cd1b4096', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c853729a-de72-4ddb-be59-bc41e08984ce, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=69adac22-9a54-4fc6-a0ad-9775bd380e99) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:06:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:06:00.727 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 69adac22-9a54-4fc6-a0ad-9775bd380e99 in datapath 5bac39b9-563a-456f-9168-fd10b1b28c21 bound to our chassis#033[00m
Jan 20 10:06:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:06:00.728 144128 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 5bac39b9-563a-456f-9168-fd10b1b28c21 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 20 10:06:00 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:06:00.729 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b779b525-98ce-4aaa-8a06-8065ff0a1ef5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:06:00 np0005588920 systemd-machined[196121]: New machine qemu-77-instance-000000a3.
Jan 20 10:06:00 np0005588920 systemd[1]: Started Virtual Machine qemu-77-instance-000000a3.
Jan 20 10:06:00 np0005588920 nova_compute[226886]: 2026-01-20 15:06:00.799 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:00 np0005588920 ovn_controller[133971]: 2026-01-20T15:06:00Z|00761|binding|INFO|Setting lport 69adac22-9a54-4fc6-a0ad-9775bd380e99 ovn-installed in OVS
Jan 20 10:06:00 np0005588920 ovn_controller[133971]: 2026-01-20T15:06:00Z|00762|binding|INFO|Setting lport 69adac22-9a54-4fc6-a0ad-9775bd380e99 up in Southbound
Jan 20 10:06:00 np0005588920 nova_compute[226886]: 2026-01-20 15:06:00.807 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:00 np0005588920 systemd-udevd[287806]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:06:00 np0005588920 NetworkManager[49076]: <info>  [1768921560.8293] device (tap69adac22-9a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:06:00 np0005588920 NetworkManager[49076]: <info>  [1768921560.8299] device (tap69adac22-9a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:06:01 np0005588920 nova_compute[226886]: 2026-01-20 15:06:01.457 226890 DEBUG nova.compute.manager [req-a299a7ab-743d-4272-bbc7-34b008df6ba1 req-bce501d5-e13a-4d1b-8ada-c8433294dd44 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Received event network-vif-plugged-69adac22-9a54-4fc6-a0ad-9775bd380e99 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:06:01 np0005588920 nova_compute[226886]: 2026-01-20 15:06:01.459 226890 DEBUG oslo_concurrency.lockutils [req-a299a7ab-743d-4272-bbc7-34b008df6ba1 req-bce501d5-e13a-4d1b-8ada-c8433294dd44 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "beaa17a1-aac9-450b-8036-a3b2b9e10bb3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:06:01 np0005588920 nova_compute[226886]: 2026-01-20 15:06:01.459 226890 DEBUG oslo_concurrency.lockutils [req-a299a7ab-743d-4272-bbc7-34b008df6ba1 req-bce501d5-e13a-4d1b-8ada-c8433294dd44 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "beaa17a1-aac9-450b-8036-a3b2b9e10bb3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:06:01 np0005588920 nova_compute[226886]: 2026-01-20 15:06:01.460 226890 DEBUG oslo_concurrency.lockutils [req-a299a7ab-743d-4272-bbc7-34b008df6ba1 req-bce501d5-e13a-4d1b-8ada-c8433294dd44 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "beaa17a1-aac9-450b-8036-a3b2b9e10bb3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:06:01 np0005588920 nova_compute[226886]: 2026-01-20 15:06:01.460 226890 DEBUG nova.compute.manager [req-a299a7ab-743d-4272-bbc7-34b008df6ba1 req-bce501d5-e13a-4d1b-8ada-c8433294dd44 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Processing event network-vif-plugged-69adac22-9a54-4fc6-a0ad-9775bd380e99 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 10:06:01 np0005588920 nova_compute[226886]: 2026-01-20 15:06:01.461 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921561.4526122, beaa17a1-aac9-450b-8036-a3b2b9e10bb3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:06:01 np0005588920 nova_compute[226886]: 2026-01-20 15:06:01.462 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] VM Started (Lifecycle Event)#033[00m
Jan 20 10:06:01 np0005588920 nova_compute[226886]: 2026-01-20 15:06:01.464 226890 DEBUG nova.compute.manager [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:06:01 np0005588920 nova_compute[226886]: 2026-01-20 15:06:01.468 226890 DEBUG nova.virt.libvirt.driver [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:06:01 np0005588920 nova_compute[226886]: 2026-01-20 15:06:01.472 226890 INFO nova.virt.libvirt.driver [-] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Instance spawned successfully.#033[00m
Jan 20 10:06:01 np0005588920 nova_compute[226886]: 2026-01-20 15:06:01.473 226890 DEBUG nova.virt.libvirt.driver [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 10:06:01 np0005588920 nova_compute[226886]: 2026-01-20 15:06:01.482 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:06:01 np0005588920 nova_compute[226886]: 2026-01-20 15:06:01.489 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:06:01 np0005588920 nova_compute[226886]: 2026-01-20 15:06:01.497 226890 DEBUG nova.virt.libvirt.driver [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:06:01 np0005588920 nova_compute[226886]: 2026-01-20 15:06:01.498 226890 DEBUG nova.virt.libvirt.driver [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:06:01 np0005588920 nova_compute[226886]: 2026-01-20 15:06:01.499 226890 DEBUG nova.virt.libvirt.driver [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:06:01 np0005588920 nova_compute[226886]: 2026-01-20 15:06:01.499 226890 DEBUG nova.virt.libvirt.driver [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:06:01 np0005588920 nova_compute[226886]: 2026-01-20 15:06:01.500 226890 DEBUG nova.virt.libvirt.driver [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:06:01 np0005588920 nova_compute[226886]: 2026-01-20 15:06:01.500 226890 DEBUG nova.virt.libvirt.driver [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:06:01 np0005588920 nova_compute[226886]: 2026-01-20 15:06:01.507 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:06:01 np0005588920 nova_compute[226886]: 2026-01-20 15:06:01.507 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921561.4580438, beaa17a1-aac9-450b-8036-a3b2b9e10bb3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:06:01 np0005588920 nova_compute[226886]: 2026-01-20 15:06:01.508 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:06:01 np0005588920 nova_compute[226886]: 2026-01-20 15:06:01.536 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:06:01 np0005588920 nova_compute[226886]: 2026-01-20 15:06:01.540 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921561.468058, beaa17a1-aac9-450b-8036-a3b2b9e10bb3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:06:01 np0005588920 nova_compute[226886]: 2026-01-20 15:06:01.541 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:06:01 np0005588920 nova_compute[226886]: 2026-01-20 15:06:01.556 226890 INFO nova.compute.manager [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Took 7.10 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 10:06:01 np0005588920 nova_compute[226886]: 2026-01-20 15:06:01.556 226890 DEBUG nova.compute.manager [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:06:01 np0005588920 nova_compute[226886]: 2026-01-20 15:06:01.560 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:06:01 np0005588920 nova_compute[226886]: 2026-01-20 15:06:01.566 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:06:01 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e362 e362: 3 total, 3 up, 3 in
Jan 20 10:06:01 np0005588920 nova_compute[226886]: 2026-01-20 15:06:01.595 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:06:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:01.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:01 np0005588920 nova_compute[226886]: 2026-01-20 15:06:01.626 226890 INFO nova.compute.manager [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Took 8.19 seconds to build instance.#033[00m
Jan 20 10:06:01 np0005588920 nova_compute[226886]: 2026-01-20 15:06:01.630 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:01 np0005588920 nova_compute[226886]: 2026-01-20 15:06:01.676 226890 DEBUG oslo_concurrency.lockutils [None req-71d98195-54b3-4c1e-a499-c055c2b2b390 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lock "beaa17a1-aac9-450b-8036-a3b2b9e10bb3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.321s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:06:02 np0005588920 nova_compute[226886]: 2026-01-20 15:06:02.272 226890 INFO nova.compute.manager [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Rescuing#033[00m
Jan 20 10:06:02 np0005588920 nova_compute[226886]: 2026-01-20 15:06:02.273 226890 DEBUG oslo_concurrency.lockutils [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Acquiring lock "refresh_cache-beaa17a1-aac9-450b-8036-a3b2b9e10bb3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:06:02 np0005588920 nova_compute[226886]: 2026-01-20 15:06:02.273 226890 DEBUG oslo_concurrency.lockutils [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Acquired lock "refresh_cache-beaa17a1-aac9-450b-8036-a3b2b9e10bb3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:06:02 np0005588920 nova_compute[226886]: 2026-01-20 15:06:02.273 226890 DEBUG nova.network.neutron [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:06:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:02.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e363 e363: 3 total, 3 up, 3 in
Jan 20 10:06:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e363 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:06:03 np0005588920 nova_compute[226886]: 2026-01-20 15:06:03.413 226890 DEBUG nova.network.neutron [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Updating instance_info_cache with network_info: [{"id": "69adac22-9a54-4fc6-a0ad-9775bd380e99", "address": "fa:16:3e:c5:49:ce", "network": {"id": "5bac39b9-563a-456f-9168-fd10b1b28c21", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-3053955-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "5e161d5a47f845fd89eb3f10627a0830", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69adac22-9a", "ovs_interfaceid": "69adac22-9a54-4fc6-a0ad-9775bd380e99", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:06:03 np0005588920 nova_compute[226886]: 2026-01-20 15:06:03.435 226890 DEBUG oslo_concurrency.lockutils [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Releasing lock "refresh_cache-beaa17a1-aac9-450b-8036-a3b2b9e10bb3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:06:03 np0005588920 nova_compute[226886]: 2026-01-20 15:06:03.539 226890 DEBUG nova.compute.manager [req-51ed40c8-7563-4e6c-9233-8470cb0663c8 req-b3edc458-43fc-4f04-ba94-5df4512eaaf0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Received event network-vif-plugged-69adac22-9a54-4fc6-a0ad-9775bd380e99 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:06:03 np0005588920 nova_compute[226886]: 2026-01-20 15:06:03.539 226890 DEBUG oslo_concurrency.lockutils [req-51ed40c8-7563-4e6c-9233-8470cb0663c8 req-b3edc458-43fc-4f04-ba94-5df4512eaaf0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "beaa17a1-aac9-450b-8036-a3b2b9e10bb3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:06:03 np0005588920 nova_compute[226886]: 2026-01-20 15:06:03.539 226890 DEBUG oslo_concurrency.lockutils [req-51ed40c8-7563-4e6c-9233-8470cb0663c8 req-b3edc458-43fc-4f04-ba94-5df4512eaaf0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "beaa17a1-aac9-450b-8036-a3b2b9e10bb3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:06:03 np0005588920 nova_compute[226886]: 2026-01-20 15:06:03.539 226890 DEBUG oslo_concurrency.lockutils [req-51ed40c8-7563-4e6c-9233-8470cb0663c8 req-b3edc458-43fc-4f04-ba94-5df4512eaaf0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "beaa17a1-aac9-450b-8036-a3b2b9e10bb3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:06:03 np0005588920 nova_compute[226886]: 2026-01-20 15:06:03.540 226890 DEBUG nova.compute.manager [req-51ed40c8-7563-4e6c-9233-8470cb0663c8 req-b3edc458-43fc-4f04-ba94-5df4512eaaf0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] No waiting events found dispatching network-vif-plugged-69adac22-9a54-4fc6-a0ad-9775bd380e99 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:06:03 np0005588920 nova_compute[226886]: 2026-01-20 15:06:03.540 226890 WARNING nova.compute.manager [req-51ed40c8-7563-4e6c-9233-8470cb0663c8 req-b3edc458-43fc-4f04-ba94-5df4512eaaf0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Received unexpected event network-vif-plugged-69adac22-9a54-4fc6-a0ad-9775bd380e99 for instance with vm_state active and task_state rescuing.#033[00m
Jan 20 10:06:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:03.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:03 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e364 e364: 3 total, 3 up, 3 in
Jan 20 10:06:03 np0005588920 nova_compute[226886]: 2026-01-20 15:06:03.728 226890 DEBUG nova.virt.libvirt.driver [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 20 10:06:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:04.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:04 np0005588920 nova_compute[226886]: 2026-01-20 15:06:04.584 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:04 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e365 e365: 3 total, 3 up, 3 in
Jan 20 10:06:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:05.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:06.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:06 np0005588920 nova_compute[226886]: 2026-01-20 15:06:06.634 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:07.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e365 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:06:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:08.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:08 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #55. Immutable memtables: 11.
Jan 20 10:06:09 np0005588920 podman[287857]: 2026-01-20 15:06:09.044111619 +0000 UTC m=+0.115467947 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 20 10:06:09 np0005588920 nova_compute[226886]: 2026-01-20 15:06:09.587 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:09.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:10.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:10 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e366 e366: 3 total, 3 up, 3 in
Jan 20 10:06:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:11.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:11 np0005588920 nova_compute[226886]: 2026-01-20 15:06:11.638 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:12.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e366 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:06:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:13.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:13 np0005588920 nova_compute[226886]: 2026-01-20 15:06:13.768 226890 DEBUG nova.virt.libvirt.driver [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 20 10:06:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:14.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:14 np0005588920 nova_compute[226886]: 2026-01-20 15:06:14.630 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:15.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:06:16.466 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:06:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:06:16.467 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:06:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:06:16.467 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:06:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:16.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:16 np0005588920 nova_compute[226886]: 2026-01-20 15:06:16.641 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:06:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:17.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:06:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e366 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:06:17 np0005588920 kernel: tap69adac22-9a (unregistering): left promiscuous mode
Jan 20 10:06:17 np0005588920 NetworkManager[49076]: <info>  [1768921577.9210] device (tap69adac22-9a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:06:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:06:17.931 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=52, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=51) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:06:17 np0005588920 nova_compute[226886]: 2026-01-20 15:06:17.931 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:06:17.932 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:06:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:06:17.933 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '52'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:06:17 np0005588920 nova_compute[226886]: 2026-01-20 15:06:17.934 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:17 np0005588920 ovn_controller[133971]: 2026-01-20T15:06:17Z|00763|binding|INFO|Releasing lport 69adac22-9a54-4fc6-a0ad-9775bd380e99 from this chassis (sb_readonly=1)
Jan 20 10:06:17 np0005588920 ovn_controller[133971]: 2026-01-20T15:06:17Z|00764|binding|INFO|Removing iface tap69adac22-9a ovn-installed in OVS
Jan 20 10:06:17 np0005588920 ovn_controller[133971]: 2026-01-20T15:06:17Z|00765|if_status|INFO|Dropped 2 log messages in last 1158 seconds (most recently, 1158 seconds ago) due to excessive rate
Jan 20 10:06:17 np0005588920 ovn_controller[133971]: 2026-01-20T15:06:17Z|00766|if_status|INFO|Not setting lport 69adac22-9a54-4fc6-a0ad-9775bd380e99 down as sb is readonly
Jan 20 10:06:17 np0005588920 ovn_controller[133971]: 2026-01-20T15:06:17Z|00767|binding|INFO|Setting lport 69adac22-9a54-4fc6-a0ad-9775bd380e99 down in Southbound
Jan 20 10:06:17 np0005588920 nova_compute[226886]: 2026-01-20 15:06:17.949 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:06:17.961 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c5:49:ce 10.100.0.11'], port_security=['fa:16:3e:c5:49:ce 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'beaa17a1-aac9-450b-8036-a3b2b9e10bb3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5bac39b9-563a-456f-9168-fd10b1b28c21', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e161d5a47f845fd89eb3f10627a0830', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cd72b979-cfcf-4dbd-bbff-8e22cd1b4096', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c853729a-de72-4ddb-be59-bc41e08984ce, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=69adac22-9a54-4fc6-a0ad-9775bd380e99) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:06:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:06:17.964 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 69adac22-9a54-4fc6-a0ad-9775bd380e99 in datapath 5bac39b9-563a-456f-9168-fd10b1b28c21 unbound from our chassis#033[00m
Jan 20 10:06:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:06:17.966 144128 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 5bac39b9-563a-456f-9168-fd10b1b28c21 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 20 10:06:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:06:17.967 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[1bf560c9-94a7-44fa-90c5-875faf038555]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:06:17 np0005588920 systemd[1]: machine-qemu\x2d77\x2dinstance\x2d000000a3.scope: Deactivated successfully.
Jan 20 10:06:17 np0005588920 systemd[1]: machine-qemu\x2d77\x2dinstance\x2d000000a3.scope: Consumed 13.721s CPU time.
Jan 20 10:06:17 np0005588920 systemd-machined[196121]: Machine qemu-77-instance-000000a3 terminated.
Jan 20 10:06:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:06:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:18.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:06:18 np0005588920 nova_compute[226886]: 2026-01-20 15:06:18.792 226890 INFO nova.virt.libvirt.driver [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Instance shutdown successfully after 15 seconds.#033[00m
Jan 20 10:06:18 np0005588920 nova_compute[226886]: 2026-01-20 15:06:18.798 226890 INFO nova.virt.libvirt.driver [-] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Instance destroyed successfully.#033[00m
Jan 20 10:06:18 np0005588920 nova_compute[226886]: 2026-01-20 15:06:18.798 226890 DEBUG nova.objects.instance [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lazy-loading 'numa_topology' on Instance uuid beaa17a1-aac9-450b-8036-a3b2b9e10bb3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:06:18 np0005588920 nova_compute[226886]: 2026-01-20 15:06:18.815 226890 INFO nova.virt.libvirt.driver [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Attempting rescue#033[00m
Jan 20 10:06:18 np0005588920 nova_compute[226886]: 2026-01-20 15:06:18.816 226890 DEBUG nova.virt.libvirt.driver [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314#033[00m
Jan 20 10:06:18 np0005588920 nova_compute[226886]: 2026-01-20 15:06:18.819 226890 DEBUG nova.virt.libvirt.driver [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 20 10:06:18 np0005588920 nova_compute[226886]: 2026-01-20 15:06:18.819 226890 INFO nova.virt.libvirt.driver [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Creating image(s)#033[00m
Jan 20 10:06:18 np0005588920 nova_compute[226886]: 2026-01-20 15:06:18.842 226890 DEBUG nova.storage.rbd_utils [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] rbd image beaa17a1-aac9-450b-8036-a3b2b9e10bb3_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:06:18 np0005588920 nova_compute[226886]: 2026-01-20 15:06:18.845 226890 DEBUG nova.objects.instance [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lazy-loading 'trusted_certs' on Instance uuid beaa17a1-aac9-450b-8036-a3b2b9e10bb3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:06:18 np0005588920 nova_compute[226886]: 2026-01-20 15:06:18.880 226890 DEBUG nova.storage.rbd_utils [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] rbd image beaa17a1-aac9-450b-8036-a3b2b9e10bb3_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:06:18 np0005588920 nova_compute[226886]: 2026-01-20 15:06:18.903 226890 DEBUG nova.storage.rbd_utils [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] rbd image beaa17a1-aac9-450b-8036-a3b2b9e10bb3_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:06:18 np0005588920 nova_compute[226886]: 2026-01-20 15:06:18.906 226890 DEBUG oslo_concurrency.processutils [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:06:18 np0005588920 nova_compute[226886]: 2026-01-20 15:06:18.966 226890 DEBUG oslo_concurrency.processutils [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:06:18 np0005588920 nova_compute[226886]: 2026-01-20 15:06:18.967 226890 DEBUG oslo_concurrency.lockutils [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:06:18 np0005588920 nova_compute[226886]: 2026-01-20 15:06:18.968 226890 DEBUG oslo_concurrency.lockutils [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:06:18 np0005588920 nova_compute[226886]: 2026-01-20 15:06:18.968 226890 DEBUG oslo_concurrency.lockutils [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:06:18 np0005588920 nova_compute[226886]: 2026-01-20 15:06:18.989 226890 DEBUG nova.storage.rbd_utils [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] rbd image beaa17a1-aac9-450b-8036-a3b2b9e10bb3_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:06:18 np0005588920 nova_compute[226886]: 2026-01-20 15:06:18.992 226890 DEBUG oslo_concurrency.processutils [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 beaa17a1-aac9-450b-8036-a3b2b9e10bb3_disk.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:06:19 np0005588920 nova_compute[226886]: 2026-01-20 15:06:19.082 226890 DEBUG nova.compute.manager [req-d1776488-9eb7-43bd-b6e4-9ca6672cd45b req-b99b0e87-ea34-43a1-9c48-3ffc40f0d9c6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Received event network-vif-unplugged-69adac22-9a54-4fc6-a0ad-9775bd380e99 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:06:19 np0005588920 nova_compute[226886]: 2026-01-20 15:06:19.083 226890 DEBUG oslo_concurrency.lockutils [req-d1776488-9eb7-43bd-b6e4-9ca6672cd45b req-b99b0e87-ea34-43a1-9c48-3ffc40f0d9c6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "beaa17a1-aac9-450b-8036-a3b2b9e10bb3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:06:19 np0005588920 nova_compute[226886]: 2026-01-20 15:06:19.083 226890 DEBUG oslo_concurrency.lockutils [req-d1776488-9eb7-43bd-b6e4-9ca6672cd45b req-b99b0e87-ea34-43a1-9c48-3ffc40f0d9c6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "beaa17a1-aac9-450b-8036-a3b2b9e10bb3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:06:19 np0005588920 nova_compute[226886]: 2026-01-20 15:06:19.084 226890 DEBUG oslo_concurrency.lockutils [req-d1776488-9eb7-43bd-b6e4-9ca6672cd45b req-b99b0e87-ea34-43a1-9c48-3ffc40f0d9c6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "beaa17a1-aac9-450b-8036-a3b2b9e10bb3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:06:19 np0005588920 nova_compute[226886]: 2026-01-20 15:06:19.084 226890 DEBUG nova.compute.manager [req-d1776488-9eb7-43bd-b6e4-9ca6672cd45b req-b99b0e87-ea34-43a1-9c48-3ffc40f0d9c6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] No waiting events found dispatching network-vif-unplugged-69adac22-9a54-4fc6-a0ad-9775bd380e99 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:06:19 np0005588920 nova_compute[226886]: 2026-01-20 15:06:19.084 226890 WARNING nova.compute.manager [req-d1776488-9eb7-43bd-b6e4-9ca6672cd45b req-b99b0e87-ea34-43a1-9c48-3ffc40f0d9c6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Received unexpected event network-vif-unplugged-69adac22-9a54-4fc6-a0ad-9775bd380e99 for instance with vm_state active and task_state rescuing.#033[00m
Jan 20 10:06:19 np0005588920 nova_compute[226886]: 2026-01-20 15:06:19.084 226890 DEBUG nova.compute.manager [req-d1776488-9eb7-43bd-b6e4-9ca6672cd45b req-b99b0e87-ea34-43a1-9c48-3ffc40f0d9c6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Received event network-vif-plugged-69adac22-9a54-4fc6-a0ad-9775bd380e99 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:06:19 np0005588920 nova_compute[226886]: 2026-01-20 15:06:19.085 226890 DEBUG oslo_concurrency.lockutils [req-d1776488-9eb7-43bd-b6e4-9ca6672cd45b req-b99b0e87-ea34-43a1-9c48-3ffc40f0d9c6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "beaa17a1-aac9-450b-8036-a3b2b9e10bb3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:06:19 np0005588920 nova_compute[226886]: 2026-01-20 15:06:19.085 226890 DEBUG oslo_concurrency.lockutils [req-d1776488-9eb7-43bd-b6e4-9ca6672cd45b req-b99b0e87-ea34-43a1-9c48-3ffc40f0d9c6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "beaa17a1-aac9-450b-8036-a3b2b9e10bb3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:06:19 np0005588920 nova_compute[226886]: 2026-01-20 15:06:19.085 226890 DEBUG oslo_concurrency.lockutils [req-d1776488-9eb7-43bd-b6e4-9ca6672cd45b req-b99b0e87-ea34-43a1-9c48-3ffc40f0d9c6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "beaa17a1-aac9-450b-8036-a3b2b9e10bb3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:06:19 np0005588920 nova_compute[226886]: 2026-01-20 15:06:19.085 226890 DEBUG nova.compute.manager [req-d1776488-9eb7-43bd-b6e4-9ca6672cd45b req-b99b0e87-ea34-43a1-9c48-3ffc40f0d9c6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] No waiting events found dispatching network-vif-plugged-69adac22-9a54-4fc6-a0ad-9775bd380e99 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:06:19 np0005588920 nova_compute[226886]: 2026-01-20 15:06:19.086 226890 WARNING nova.compute.manager [req-d1776488-9eb7-43bd-b6e4-9ca6672cd45b req-b99b0e87-ea34-43a1-9c48-3ffc40f0d9c6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Received unexpected event network-vif-plugged-69adac22-9a54-4fc6-a0ad-9775bd380e99 for instance with vm_state active and task_state rescuing.#033[00m
Jan 20 10:06:19 np0005588920 nova_compute[226886]: 2026-01-20 15:06:19.281 226890 DEBUG oslo_concurrency.processutils [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 beaa17a1-aac9-450b-8036-a3b2b9e10bb3_disk.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.289s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:06:19 np0005588920 nova_compute[226886]: 2026-01-20 15:06:19.282 226890 DEBUG nova.objects.instance [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lazy-loading 'migration_context' on Instance uuid beaa17a1-aac9-450b-8036-a3b2b9e10bb3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:06:19 np0005588920 nova_compute[226886]: 2026-01-20 15:06:19.300 226890 DEBUG nova.virt.libvirt.driver [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 10:06:19 np0005588920 nova_compute[226886]: 2026-01-20 15:06:19.300 226890 DEBUG nova.virt.libvirt.driver [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Start _get_guest_xml network_info=[{"id": "69adac22-9a54-4fc6-a0ad-9775bd380e99", "address": "fa:16:3e:c5:49:ce", "network": {"id": "5bac39b9-563a-456f-9168-fd10b1b28c21", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-3053955-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSON-3053955-network", "vif_mac": "fa:16:3e:c5:49:ce"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "5e161d5a47f845fd89eb3f10627a0830", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69adac22-9a", "ovs_interfaceid": "69adac22-9a54-4fc6-a0ad-9775bd380e99", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:06:19 np0005588920 nova_compute[226886]: 2026-01-20 15:06:19.301 226890 DEBUG nova.objects.instance [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lazy-loading 'resources' on Instance uuid beaa17a1-aac9-450b-8036-a3b2b9e10bb3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:06:19 np0005588920 nova_compute[226886]: 2026-01-20 15:06:19.326 226890 WARNING nova.virt.libvirt.driver [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:06:19 np0005588920 nova_compute[226886]: 2026-01-20 15:06:19.335 226890 DEBUG nova.virt.libvirt.host [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:06:19 np0005588920 nova_compute[226886]: 2026-01-20 15:06:19.336 226890 DEBUG nova.virt.libvirt.host [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:06:19 np0005588920 nova_compute[226886]: 2026-01-20 15:06:19.341 226890 DEBUG nova.virt.libvirt.host [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:06:19 np0005588920 nova_compute[226886]: 2026-01-20 15:06:19.342 226890 DEBUG nova.virt.libvirt.host [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:06:19 np0005588920 nova_compute[226886]: 2026-01-20 15:06:19.343 226890 DEBUG nova.virt.libvirt.driver [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:06:19 np0005588920 nova_compute[226886]: 2026-01-20 15:06:19.343 226890 DEBUG nova.virt.hardware [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:06:19 np0005588920 nova_compute[226886]: 2026-01-20 15:06:19.344 226890 DEBUG nova.virt.hardware [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:06:19 np0005588920 nova_compute[226886]: 2026-01-20 15:06:19.345 226890 DEBUG nova.virt.hardware [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:06:19 np0005588920 nova_compute[226886]: 2026-01-20 15:06:19.345 226890 DEBUG nova.virt.hardware [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:06:19 np0005588920 nova_compute[226886]: 2026-01-20 15:06:19.345 226890 DEBUG nova.virt.hardware [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:06:19 np0005588920 nova_compute[226886]: 2026-01-20 15:06:19.346 226890 DEBUG nova.virt.hardware [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:06:19 np0005588920 nova_compute[226886]: 2026-01-20 15:06:19.346 226890 DEBUG nova.virt.hardware [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:06:19 np0005588920 nova_compute[226886]: 2026-01-20 15:06:19.347 226890 DEBUG nova.virt.hardware [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:06:19 np0005588920 nova_compute[226886]: 2026-01-20 15:06:19.347 226890 DEBUG nova.virt.hardware [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:06:19 np0005588920 nova_compute[226886]: 2026-01-20 15:06:19.347 226890 DEBUG nova.virt.hardware [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:06:19 np0005588920 nova_compute[226886]: 2026-01-20 15:06:19.348 226890 DEBUG nova.virt.hardware [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:06:19 np0005588920 nova_compute[226886]: 2026-01-20 15:06:19.348 226890 DEBUG nova.objects.instance [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lazy-loading 'vcpu_model' on Instance uuid beaa17a1-aac9-450b-8036-a3b2b9e10bb3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:06:19 np0005588920 nova_compute[226886]: 2026-01-20 15:06:19.375 226890 DEBUG oslo_concurrency.processutils [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:06:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:06:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:19.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:06:19 np0005588920 nova_compute[226886]: 2026-01-20 15:06:19.633 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:19 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:06:19 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2692737353' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:06:19 np0005588920 nova_compute[226886]: 2026-01-20 15:06:19.822 226890 DEBUG oslo_concurrency.processutils [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:06:19 np0005588920 nova_compute[226886]: 2026-01-20 15:06:19.823 226890 DEBUG oslo_concurrency.processutils [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:06:19 np0005588920 podman[288019]: 2026-01-20 15:06:19.977021868 +0000 UTC m=+0.056195095 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 20 10:06:20 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:06:20 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2544975497' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:06:20 np0005588920 nova_compute[226886]: 2026-01-20 15:06:20.312 226890 DEBUG oslo_concurrency.processutils [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.489s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:06:20 np0005588920 nova_compute[226886]: 2026-01-20 15:06:20.315 226890 DEBUG oslo_concurrency.processutils [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:06:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:20.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:20 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:06:20 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1540173418' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:06:20 np0005588920 nova_compute[226886]: 2026-01-20 15:06:20.754 226890 DEBUG oslo_concurrency.processutils [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:06:20 np0005588920 nova_compute[226886]: 2026-01-20 15:06:20.756 226890 DEBUG nova.virt.libvirt.vif [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:05:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-2068211489',display_name='tempest-ServerRescueTestJSON-server-2068211489',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-2068211489',id=163,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:06:01Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5e161d5a47f845fd89eb3f10627a0830',ramdisk_id='',reservation_id='r-5fu5wq70',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_m
in_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSON-1151598672',owner_user_name='tempest-ServerRescueTestJSON-1151598672-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:06:01Z,user_data=None,user_id='a2beb3d6247e457abd6e8d93cc602f02',uuid=beaa17a1-aac9-450b-8036-a3b2b9e10bb3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "69adac22-9a54-4fc6-a0ad-9775bd380e99", "address": "fa:16:3e:c5:49:ce", "network": {"id": "5bac39b9-563a-456f-9168-fd10b1b28c21", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-3053955-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSON-3053955-network", "vif_mac": "fa:16:3e:c5:49:ce"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "5e161d5a47f845fd89eb3f10627a0830", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69adac22-9a", "ovs_interfaceid": "69adac22-9a54-4fc6-a0ad-9775bd380e99", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:06:20 np0005588920 nova_compute[226886]: 2026-01-20 15:06:20.757 226890 DEBUG nova.network.os_vif_util [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Converting VIF {"id": "69adac22-9a54-4fc6-a0ad-9775bd380e99", "address": "fa:16:3e:c5:49:ce", "network": {"id": "5bac39b9-563a-456f-9168-fd10b1b28c21", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-3053955-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSON-3053955-network", "vif_mac": "fa:16:3e:c5:49:ce"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "5e161d5a47f845fd89eb3f10627a0830", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69adac22-9a", "ovs_interfaceid": "69adac22-9a54-4fc6-a0ad-9775bd380e99", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:06:20 np0005588920 nova_compute[226886]: 2026-01-20 15:06:20.758 226890 DEBUG nova.network.os_vif_util [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c5:49:ce,bridge_name='br-int',has_traffic_filtering=True,id=69adac22-9a54-4fc6-a0ad-9775bd380e99,network=Network(5bac39b9-563a-456f-9168-fd10b1b28c21),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69adac22-9a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:06:20 np0005588920 nova_compute[226886]: 2026-01-20 15:06:20.759 226890 DEBUG nova.objects.instance [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lazy-loading 'pci_devices' on Instance uuid beaa17a1-aac9-450b-8036-a3b2b9e10bb3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:06:20 np0005588920 nova_compute[226886]: 2026-01-20 15:06:20.779 226890 DEBUG nova.virt.libvirt.driver [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:06:20 np0005588920 nova_compute[226886]:  <uuid>beaa17a1-aac9-450b-8036-a3b2b9e10bb3</uuid>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:  <name>instance-000000a3</name>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:06:20 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:      <nova:name>tempest-ServerRescueTestJSON-server-2068211489</nova:name>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 15:06:19</nova:creationTime>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 10:06:20 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:        <nova:user uuid="a2beb3d6247e457abd6e8d93cc602f02">tempest-ServerRescueTestJSON-1151598672-project-member</nova:user>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:        <nova:project uuid="5e161d5a47f845fd89eb3f10627a0830">tempest-ServerRescueTestJSON-1151598672</nova:project>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:        <nova:port uuid="69adac22-9a54-4fc6-a0ad-9775bd380e99">
Jan 20 10:06:20 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    <system>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:      <entry name="serial">beaa17a1-aac9-450b-8036-a3b2b9e10bb3</entry>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:      <entry name="uuid">beaa17a1-aac9-450b-8036-a3b2b9e10bb3</entry>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    </system>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:  <os>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:  </os>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:  <features>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:  </features>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:  </clock>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:  <devices>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 10:06:20 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/beaa17a1-aac9-450b-8036-a3b2b9e10bb3_disk.rescue">
Jan 20 10:06:20 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:06:20 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 10:06:20 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/beaa17a1-aac9-450b-8036-a3b2b9e10bb3_disk">
Jan 20 10:06:20 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:06:20 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:      <target dev="vdb" bus="virtio"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 10:06:20 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/beaa17a1-aac9-450b-8036-a3b2b9e10bb3_disk.config.rescue">
Jan 20 10:06:20 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:06:20 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 10:06:20 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:c5:49:ce"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:      <target dev="tap69adac22-9a"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    </interface>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 10:06:20 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/beaa17a1-aac9-450b-8036-a3b2b9e10bb3/console.log" append="off"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    </serial>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    <video>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    </video>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 10:06:20 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    </rng>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 10:06:20 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 10:06:20 np0005588920 nova_compute[226886]:  </devices>
Jan 20 10:06:20 np0005588920 nova_compute[226886]: </domain>
Jan 20 10:06:20 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:06:20 np0005588920 nova_compute[226886]: 2026-01-20 15:06:20.788 226890 INFO nova.virt.libvirt.driver [-] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Instance destroyed successfully.#033[00m
Jan 20 10:06:20 np0005588920 nova_compute[226886]: 2026-01-20 15:06:20.836 226890 DEBUG nova.virt.libvirt.driver [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:06:20 np0005588920 nova_compute[226886]: 2026-01-20 15:06:20.837 226890 DEBUG nova.virt.libvirt.driver [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:06:20 np0005588920 nova_compute[226886]: 2026-01-20 15:06:20.837 226890 DEBUG nova.virt.libvirt.driver [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:06:20 np0005588920 nova_compute[226886]: 2026-01-20 15:06:20.838 226890 DEBUG nova.virt.libvirt.driver [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] No VIF found with MAC fa:16:3e:c5:49:ce, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:06:20 np0005588920 nova_compute[226886]: 2026-01-20 15:06:20.840 226890 INFO nova.virt.libvirt.driver [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Using config drive#033[00m
Jan 20 10:06:20 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #115. Immutable memtables: 0.
Jan 20 10:06:20 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:06:20.869351) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:06:20 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 71] Flushing memtable with next log file: 115
Jan 20 10:06:20 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921580869508, "job": 71, "event": "flush_started", "num_memtables": 1, "num_entries": 1764, "num_deletes": 265, "total_data_size": 3693740, "memory_usage": 3754272, "flush_reason": "Manual Compaction"}
Jan 20 10:06:20 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 71] Level-0 flush table #116: started
Jan 20 10:06:20 np0005588920 nova_compute[226886]: 2026-01-20 15:06:20.886 226890 DEBUG nova.storage.rbd_utils [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] rbd image beaa17a1-aac9-450b-8036-a3b2b9e10bb3_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:06:20 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921580891222, "cf_name": "default", "job": 71, "event": "table_file_creation", "file_number": 116, "file_size": 2422490, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 58229, "largest_seqno": 59988, "table_properties": {"data_size": 2415041, "index_size": 4327, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2053, "raw_key_size": 16450, "raw_average_key_size": 20, "raw_value_size": 2399765, "raw_average_value_size": 2999, "num_data_blocks": 189, "num_entries": 800, "num_filter_entries": 800, "num_deletions": 265, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768921455, "oldest_key_time": 1768921455, "file_creation_time": 1768921580, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 116, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:06:20 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 71] Flush lasted 21924 microseconds, and 12093 cpu microseconds.
Jan 20 10:06:20 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:06:20 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:06:20.891275) [db/flush_job.cc:967] [default] [JOB 71] Level-0 flush table #116: 2422490 bytes OK
Jan 20 10:06:20 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:06:20.891300) [db/memtable_list.cc:519] [default] Level-0 commit table #116 started
Jan 20 10:06:20 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:06:20.893604) [db/memtable_list.cc:722] [default] Level-0 commit table #116: memtable #1 done
Jan 20 10:06:20 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:06:20.893620) EVENT_LOG_v1 {"time_micros": 1768921580893615, "job": 71, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:06:20 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:06:20.893637) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:06:20 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 71] Try to delete WAL files size 3685531, prev total WAL file size 3685531, number of live WAL files 2.
Jan 20 10:06:20 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000112.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:06:20 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:06:20.894652) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0032303137' seq:72057594037927935, type:22 .. '6C6F676D0032323731' seq:0, type:0; will stop at (end)
Jan 20 10:06:20 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 72] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:06:20 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 71 Base level 0, inputs: [116(2365KB)], [114(10071KB)]
Jan 20 10:06:20 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921580894713, "job": 72, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [116], "files_L6": [114], "score": -1, "input_data_size": 12735673, "oldest_snapshot_seqno": -1}
Jan 20 10:06:20 np0005588920 nova_compute[226886]: 2026-01-20 15:06:20.906 226890 DEBUG nova.objects.instance [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lazy-loading 'ec2_ids' on Instance uuid beaa17a1-aac9-450b-8036-a3b2b9e10bb3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:06:20 np0005588920 nova_compute[226886]: 2026-01-20 15:06:20.949 226890 DEBUG nova.objects.instance [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lazy-loading 'keypairs' on Instance uuid beaa17a1-aac9-450b-8036-a3b2b9e10bb3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:06:21 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 72] Generated table #117: 8595 keys, 12588731 bytes, temperature: kUnknown
Jan 20 10:06:21 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921581013378, "cf_name": "default", "job": 72, "event": "table_file_creation", "file_number": 117, "file_size": 12588731, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12530975, "index_size": 35180, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 21509, "raw_key_size": 222764, "raw_average_key_size": 25, "raw_value_size": 12377693, "raw_average_value_size": 1440, "num_data_blocks": 1380, "num_entries": 8595, "num_filter_entries": 8595, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768921580, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 117, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:06:21 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:06:21 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:06:21.014059) [db/compaction/compaction_job.cc:1663] [default] [JOB 72] Compacted 1@0 + 1@6 files to L6 => 12588731 bytes
Jan 20 10:06:21 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:06:21.038943) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 106.9 rd, 105.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.3, 9.8 +0.0 blob) out(12.0 +0.0 blob), read-write-amplify(10.5) write-amplify(5.2) OK, records in: 9139, records dropped: 544 output_compression: NoCompression
Jan 20 10:06:21 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:06:21.038984) EVENT_LOG_v1 {"time_micros": 1768921581038969, "job": 72, "event": "compaction_finished", "compaction_time_micros": 119161, "compaction_time_cpu_micros": 28540, "output_level": 6, "num_output_files": 1, "total_output_size": 12588731, "num_input_records": 9139, "num_output_records": 8595, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:06:21 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000116.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:06:21 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921581040038, "job": 72, "event": "table_file_deletion", "file_number": 116}
Jan 20 10:06:21 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000114.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:06:21 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921581042119, "job": 72, "event": "table_file_deletion", "file_number": 114}
Jan 20 10:06:21 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:06:20.894565) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:06:21 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:06:21.042342) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:06:21 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:06:21.042348) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:06:21 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:06:21.042350) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:06:21 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:06:21.042353) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:06:21 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:06:21.042355) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:06:21 np0005588920 nova_compute[226886]: 2026-01-20 15:06:21.586 226890 INFO nova.virt.libvirt.driver [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Creating config drive at /var/lib/nova/instances/beaa17a1-aac9-450b-8036-a3b2b9e10bb3/disk.config.rescue#033[00m
Jan 20 10:06:21 np0005588920 nova_compute[226886]: 2026-01-20 15:06:21.595 226890 DEBUG oslo_concurrency.processutils [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/beaa17a1-aac9-450b-8036-a3b2b9e10bb3/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjqr3io9k execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:06:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:21.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:21 np0005588920 nova_compute[226886]: 2026-01-20 15:06:21.643 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:21 np0005588920 nova_compute[226886]: 2026-01-20 15:06:21.755 226890 DEBUG oslo_concurrency.processutils [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/beaa17a1-aac9-450b-8036-a3b2b9e10bb3/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjqr3io9k" returned: 0 in 0.160s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:06:21 np0005588920 nova_compute[226886]: 2026-01-20 15:06:21.781 226890 DEBUG nova.storage.rbd_utils [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] rbd image beaa17a1-aac9-450b-8036-a3b2b9e10bb3_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:06:21 np0005588920 nova_compute[226886]: 2026-01-20 15:06:21.785 226890 DEBUG oslo_concurrency.processutils [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/beaa17a1-aac9-450b-8036-a3b2b9e10bb3/disk.config.rescue beaa17a1-aac9-450b-8036-a3b2b9e10bb3_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:06:21 np0005588920 nova_compute[226886]: 2026-01-20 15:06:21.928 226890 DEBUG oslo_concurrency.processutils [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/beaa17a1-aac9-450b-8036-a3b2b9e10bb3/disk.config.rescue beaa17a1-aac9-450b-8036-a3b2b9e10bb3_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:06:21 np0005588920 nova_compute[226886]: 2026-01-20 15:06:21.929 226890 INFO nova.virt.libvirt.driver [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Deleting local config drive /var/lib/nova/instances/beaa17a1-aac9-450b-8036-a3b2b9e10bb3/disk.config.rescue because it was imported into RBD.#033[00m
Jan 20 10:06:21 np0005588920 NetworkManager[49076]: <info>  [1768921581.9763] manager: (tap69adac22-9a): new Tun device (/org/freedesktop/NetworkManager/Devices/368)
Jan 20 10:06:21 np0005588920 kernel: tap69adac22-9a: entered promiscuous mode
Jan 20 10:06:21 np0005588920 ovn_controller[133971]: 2026-01-20T15:06:21Z|00768|binding|INFO|Claiming lport 69adac22-9a54-4fc6-a0ad-9775bd380e99 for this chassis.
Jan 20 10:06:21 np0005588920 nova_compute[226886]: 2026-01-20 15:06:21.979 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:21 np0005588920 ovn_controller[133971]: 2026-01-20T15:06:21Z|00769|binding|INFO|69adac22-9a54-4fc6-a0ad-9775bd380e99: Claiming fa:16:3e:c5:49:ce 10.100.0.11
Jan 20 10:06:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:06:21.987 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c5:49:ce 10.100.0.11'], port_security=['fa:16:3e:c5:49:ce 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'beaa17a1-aac9-450b-8036-a3b2b9e10bb3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5bac39b9-563a-456f-9168-fd10b1b28c21', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e161d5a47f845fd89eb3f10627a0830', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'cd72b979-cfcf-4dbd-bbff-8e22cd1b4096', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c853729a-de72-4ddb-be59-bc41e08984ce, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=69adac22-9a54-4fc6-a0ad-9775bd380e99) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:06:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:06:21.989 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 69adac22-9a54-4fc6-a0ad-9775bd380e99 in datapath 5bac39b9-563a-456f-9168-fd10b1b28c21 bound to our chassis#033[00m
Jan 20 10:06:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:06:21.990 144128 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 5bac39b9-563a-456f-9168-fd10b1b28c21 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 20 10:06:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:06:21.991 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5fe7895b-24dd-47db-b57a-532aea655e52]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:06:21 np0005588920 nova_compute[226886]: 2026-01-20 15:06:21.996 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:21 np0005588920 ovn_controller[133971]: 2026-01-20T15:06:21Z|00770|binding|INFO|Setting lport 69adac22-9a54-4fc6-a0ad-9775bd380e99 ovn-installed in OVS
Jan 20 10:06:21 np0005588920 ovn_controller[133971]: 2026-01-20T15:06:21Z|00771|binding|INFO|Setting lport 69adac22-9a54-4fc6-a0ad-9775bd380e99 up in Southbound
Jan 20 10:06:21 np0005588920 nova_compute[226886]: 2026-01-20 15:06:21.998 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:22 np0005588920 nova_compute[226886]: 2026-01-20 15:06:22.001 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:22 np0005588920 systemd-udevd[288154]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:06:22 np0005588920 systemd-machined[196121]: New machine qemu-78-instance-000000a3.
Jan 20 10:06:22 np0005588920 NetworkManager[49076]: <info>  [1768921582.0189] device (tap69adac22-9a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:06:22 np0005588920 NetworkManager[49076]: <info>  [1768921582.0198] device (tap69adac22-9a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:06:22 np0005588920 systemd[1]: Started Virtual Machine qemu-78-instance-000000a3.
Jan 20 10:06:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:22.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e366 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:06:22 np0005588920 nova_compute[226886]: 2026-01-20 15:06:22.815 226890 DEBUG nova.compute.manager [req-58da2637-806c-4ec0-ae1f-12b863ba5b5b req-9307e0c1-61e4-4c28-a3fd-4748d8f677ff 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Received event network-vif-plugged-69adac22-9a54-4fc6-a0ad-9775bd380e99 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:06:22 np0005588920 nova_compute[226886]: 2026-01-20 15:06:22.816 226890 DEBUG oslo_concurrency.lockutils [req-58da2637-806c-4ec0-ae1f-12b863ba5b5b req-9307e0c1-61e4-4c28-a3fd-4748d8f677ff 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "beaa17a1-aac9-450b-8036-a3b2b9e10bb3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:06:22 np0005588920 nova_compute[226886]: 2026-01-20 15:06:22.816 226890 DEBUG oslo_concurrency.lockutils [req-58da2637-806c-4ec0-ae1f-12b863ba5b5b req-9307e0c1-61e4-4c28-a3fd-4748d8f677ff 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "beaa17a1-aac9-450b-8036-a3b2b9e10bb3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:06:22 np0005588920 nova_compute[226886]: 2026-01-20 15:06:22.816 226890 DEBUG oslo_concurrency.lockutils [req-58da2637-806c-4ec0-ae1f-12b863ba5b5b req-9307e0c1-61e4-4c28-a3fd-4748d8f677ff 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "beaa17a1-aac9-450b-8036-a3b2b9e10bb3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:06:22 np0005588920 nova_compute[226886]: 2026-01-20 15:06:22.816 226890 DEBUG nova.compute.manager [req-58da2637-806c-4ec0-ae1f-12b863ba5b5b req-9307e0c1-61e4-4c28-a3fd-4748d8f677ff 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] No waiting events found dispatching network-vif-plugged-69adac22-9a54-4fc6-a0ad-9775bd380e99 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:06:22 np0005588920 nova_compute[226886]: 2026-01-20 15:06:22.817 226890 WARNING nova.compute.manager [req-58da2637-806c-4ec0-ae1f-12b863ba5b5b req-9307e0c1-61e4-4c28-a3fd-4748d8f677ff 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Received unexpected event network-vif-plugged-69adac22-9a54-4fc6-a0ad-9775bd380e99 for instance with vm_state active and task_state rescuing.#033[00m
Jan 20 10:06:22 np0005588920 nova_compute[226886]: 2026-01-20 15:06:22.904 226890 DEBUG nova.virt.libvirt.host [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Removed pending event for beaa17a1-aac9-450b-8036-a3b2b9e10bb3 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 20 10:06:22 np0005588920 nova_compute[226886]: 2026-01-20 15:06:22.905 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921582.9040053, beaa17a1-aac9-450b-8036-a3b2b9e10bb3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:06:22 np0005588920 nova_compute[226886]: 2026-01-20 15:06:22.905 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:06:22 np0005588920 nova_compute[226886]: 2026-01-20 15:06:22.910 226890 DEBUG nova.compute.manager [None req-51cae1e4-6e15-4e0e-8e33-af4a0fd0751c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:06:22 np0005588920 nova_compute[226886]: 2026-01-20 15:06:22.946 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:06:22 np0005588920 nova_compute[226886]: 2026-01-20 15:06:22.949 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:06:22 np0005588920 nova_compute[226886]: 2026-01-20 15:06:22.984 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] During sync_power_state the instance has a pending task (rescuing). Skip.#033[00m
Jan 20 10:06:22 np0005588920 nova_compute[226886]: 2026-01-20 15:06:22.984 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921582.905049, beaa17a1-aac9-450b-8036-a3b2b9e10bb3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:06:22 np0005588920 nova_compute[226886]: 2026-01-20 15:06:22.985 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] VM Started (Lifecycle Event)#033[00m
Jan 20 10:06:23 np0005588920 nova_compute[226886]: 2026-01-20 15:06:23.050 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:06:23 np0005588920 nova_compute[226886]: 2026-01-20 15:06:23.053 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:06:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:23.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:24.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:24 np0005588920 nova_compute[226886]: 2026-01-20 15:06:24.688 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:25 np0005588920 nova_compute[226886]: 2026-01-20 15:06:25.018 226890 DEBUG nova.compute.manager [req-ff4c047d-807d-4211-a95a-86d64b115b45 req-839b7344-aee5-40e5-9d1a-f6cec583cf4a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Received event network-vif-plugged-69adac22-9a54-4fc6-a0ad-9775bd380e99 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:06:25 np0005588920 nova_compute[226886]: 2026-01-20 15:06:25.019 226890 DEBUG oslo_concurrency.lockutils [req-ff4c047d-807d-4211-a95a-86d64b115b45 req-839b7344-aee5-40e5-9d1a-f6cec583cf4a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "beaa17a1-aac9-450b-8036-a3b2b9e10bb3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:06:25 np0005588920 nova_compute[226886]: 2026-01-20 15:06:25.020 226890 DEBUG oslo_concurrency.lockutils [req-ff4c047d-807d-4211-a95a-86d64b115b45 req-839b7344-aee5-40e5-9d1a-f6cec583cf4a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "beaa17a1-aac9-450b-8036-a3b2b9e10bb3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:06:25 np0005588920 nova_compute[226886]: 2026-01-20 15:06:25.020 226890 DEBUG oslo_concurrency.lockutils [req-ff4c047d-807d-4211-a95a-86d64b115b45 req-839b7344-aee5-40e5-9d1a-f6cec583cf4a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "beaa17a1-aac9-450b-8036-a3b2b9e10bb3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:06:25 np0005588920 nova_compute[226886]: 2026-01-20 15:06:25.020 226890 DEBUG nova.compute.manager [req-ff4c047d-807d-4211-a95a-86d64b115b45 req-839b7344-aee5-40e5-9d1a-f6cec583cf4a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] No waiting events found dispatching network-vif-plugged-69adac22-9a54-4fc6-a0ad-9775bd380e99 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:06:25 np0005588920 nova_compute[226886]: 2026-01-20 15:06:25.021 226890 WARNING nova.compute.manager [req-ff4c047d-807d-4211-a95a-86d64b115b45 req-839b7344-aee5-40e5-9d1a-f6cec583cf4a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Received unexpected event network-vif-plugged-69adac22-9a54-4fc6-a0ad-9775bd380e99 for instance with vm_state rescued and task_state None.#033[00m
Jan 20 10:06:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:25.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:26.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:26 np0005588920 nova_compute[226886]: 2026-01-20 15:06:26.645 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:27.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e366 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:06:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:28.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:06:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:29.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:06:29 np0005588920 nova_compute[226886]: 2026-01-20 15:06:29.690 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:30.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:31.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:31 np0005588920 nova_compute[226886]: 2026-01-20 15:06:31.646 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:32.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e366 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:06:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:33.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:34.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:34 np0005588920 nova_compute[226886]: 2026-01-20 15:06:34.693 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:06:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:35.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:06:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:06:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:36.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:06:36 np0005588920 nova_compute[226886]: 2026-01-20 15:06:36.650 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:06:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:37.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:06:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e366 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:06:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:06:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:38.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:06:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:39.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:39 np0005588920 nova_compute[226886]: 2026-01-20 15:06:39.728 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:40 np0005588920 podman[288224]: 2026-01-20 15:06:40.002051671 +0000 UTC m=+0.082866447 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:06:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:06:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:40.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:06:41 np0005588920 nova_compute[226886]: 2026-01-20 15:06:41.651 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:41.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:06:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:42.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:06:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e366 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:06:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:43.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:44.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:44 np0005588920 nova_compute[226886]: 2026-01-20 15:06:44.730 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:45.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:46.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:46 np0005588920 nova_compute[226886]: 2026-01-20 15:06:46.653 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:47 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:06:47 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:06:47 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:06:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:47.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e366 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:06:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:48.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:49.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:49 np0005588920 nova_compute[226886]: 2026-01-20 15:06:49.732 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:06:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:50.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:06:50 np0005588920 podman[288383]: 2026-01-20 15:06:50.974972348 +0000 UTC m=+0.055630199 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:06:51 np0005588920 nova_compute[226886]: 2026-01-20 15:06:51.655 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:51.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:51 np0005588920 nova_compute[226886]: 2026-01-20 15:06:51.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:06:51 np0005588920 nova_compute[226886]: 2026-01-20 15:06:51.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:06:51 np0005588920 nova_compute[226886]: 2026-01-20 15:06:51.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:06:51 np0005588920 nova_compute[226886]: 2026-01-20 15:06:51.997 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "refresh_cache-beaa17a1-aac9-450b-8036-a3b2b9e10bb3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:06:51 np0005588920 nova_compute[226886]: 2026-01-20 15:06:51.998 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquired lock "refresh_cache-beaa17a1-aac9-450b-8036-a3b2b9e10bb3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:06:51 np0005588920 nova_compute[226886]: 2026-01-20 15:06:51.998 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 10:06:51 np0005588920 nova_compute[226886]: 2026-01-20 15:06:51.999 226890 DEBUG nova.objects.instance [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid beaa17a1-aac9-450b-8036-a3b2b9e10bb3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:06:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:52.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e366 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:06:53 np0005588920 nova_compute[226886]: 2026-01-20 15:06:53.115 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Updating instance_info_cache with network_info: [{"id": "69adac22-9a54-4fc6-a0ad-9775bd380e99", "address": "fa:16:3e:c5:49:ce", "network": {"id": "5bac39b9-563a-456f-9168-fd10b1b28c21", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-3053955-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "5e161d5a47f845fd89eb3f10627a0830", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69adac22-9a", "ovs_interfaceid": "69adac22-9a54-4fc6-a0ad-9775bd380e99", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:06:53 np0005588920 nova_compute[226886]: 2026-01-20 15:06:53.155 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Releasing lock "refresh_cache-beaa17a1-aac9-450b-8036-a3b2b9e10bb3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:06:53 np0005588920 nova_compute[226886]: 2026-01-20 15:06:53.156 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 10:06:53 np0005588920 nova_compute[226886]: 2026-01-20 15:06:53.156 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:06:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:06:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:53.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:06:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:54.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:54 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:06:54 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:06:54 np0005588920 nova_compute[226886]: 2026-01-20 15:06:54.734 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:55.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:55 np0005588920 nova_compute[226886]: 2026-01-20 15:06:55.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:06:55 np0005588920 nova_compute[226886]: 2026-01-20 15:06:55.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:06:55 np0005588920 nova_compute[226886]: 2026-01-20 15:06:55.745 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:06:55 np0005588920 nova_compute[226886]: 2026-01-20 15:06:55.746 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:06:55 np0005588920 nova_compute[226886]: 2026-01-20 15:06:55.746 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:06:55 np0005588920 nova_compute[226886]: 2026-01-20 15:06:55.746 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:06:55 np0005588920 nova_compute[226886]: 2026-01-20 15:06:55.746 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:06:56 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:06:56 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1344781968' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:06:56 np0005588920 nova_compute[226886]: 2026-01-20 15:06:56.210 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:06:56 np0005588920 nova_compute[226886]: 2026-01-20 15:06:56.285 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-000000a3 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:06:56 np0005588920 nova_compute[226886]: 2026-01-20 15:06:56.285 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-000000a3 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:06:56 np0005588920 nova_compute[226886]: 2026-01-20 15:06:56.285 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-000000a3 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:06:56 np0005588920 nova_compute[226886]: 2026-01-20 15:06:56.481 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:06:56 np0005588920 nova_compute[226886]: 2026-01-20 15:06:56.482 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4018MB free_disk=20.825679779052734GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:06:56 np0005588920 nova_compute[226886]: 2026-01-20 15:06:56.482 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:06:56 np0005588920 nova_compute[226886]: 2026-01-20 15:06:56.483 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:06:56 np0005588920 nova_compute[226886]: 2026-01-20 15:06:56.562 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance beaa17a1-aac9-450b-8036-a3b2b9e10bb3 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:06:56 np0005588920 nova_compute[226886]: 2026-01-20 15:06:56.562 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:06:56 np0005588920 nova_compute[226886]: 2026-01-20 15:06:56.563 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:06:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:56.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:56 np0005588920 nova_compute[226886]: 2026-01-20 15:06:56.619 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:06:56 np0005588920 nova_compute[226886]: 2026-01-20 15:06:56.658 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:06:57 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/485878596' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:06:57 np0005588920 nova_compute[226886]: 2026-01-20 15:06:57.062 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:06:57 np0005588920 nova_compute[226886]: 2026-01-20 15:06:57.070 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:06:57 np0005588920 nova_compute[226886]: 2026-01-20 15:06:57.107 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:06:57 np0005588920 nova_compute[226886]: 2026-01-20 15:06:57.135 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:06:57 np0005588920 nova_compute[226886]: 2026-01-20 15:06:57.135 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.653s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:06:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:57.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e366 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:06:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:06:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:58.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:06:59 np0005588920 nova_compute[226886]: 2026-01-20 15:06:59.136 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:06:59 np0005588920 nova_compute[226886]: 2026-01-20 15:06:59.137 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:06:59 np0005588920 nova_compute[226886]: 2026-01-20 15:06:59.137 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:06:59 np0005588920 nova_compute[226886]: 2026-01-20 15:06:59.138 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:06:59 np0005588920 nova_compute[226886]: 2026-01-20 15:06:59.138 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:06:59 np0005588920 nova_compute[226886]: 2026-01-20 15:06:59.138 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:06:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:06:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:59.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:59 np0005588920 nova_compute[226886]: 2026-01-20 15:06:59.738 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:00.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:00 np0005588920 nova_compute[226886]: 2026-01-20 15:07:00.722 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:07:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:01.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:01 np0005588920 nova_compute[226886]: 2026-01-20 15:07:01.712 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:02.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e366 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:07:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:03.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:07:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:04.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:07:04 np0005588920 nova_compute[226886]: 2026-01-20 15:07:04.740 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:05.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:07:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:06.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:07:06 np0005588920 nova_compute[226886]: 2026-01-20 15:07:06.715 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:07.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e366 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:07:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:08.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:09.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:09 np0005588920 nova_compute[226886]: 2026-01-20 15:07:09.776 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:10.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:11 np0005588920 podman[288497]: 2026-01-20 15:07:11.020235067 +0000 UTC m=+0.108426887 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller)
Jan 20 10:07:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:11.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:11 np0005588920 nova_compute[226886]: 2026-01-20 15:07:11.717 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:12 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #118. Immutable memtables: 0.
Jan 20 10:07:12 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:07:12.488546) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:07:12 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 73] Flushing memtable with next log file: 118
Jan 20 10:07:12 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921632488606, "job": 73, "event": "flush_started", "num_memtables": 1, "num_entries": 798, "num_deletes": 251, "total_data_size": 1410236, "memory_usage": 1432384, "flush_reason": "Manual Compaction"}
Jan 20 10:07:12 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 73] Level-0 flush table #119: started
Jan 20 10:07:12 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921632503779, "cf_name": "default", "job": 73, "event": "table_file_creation", "file_number": 119, "file_size": 930086, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 59993, "largest_seqno": 60786, "table_properties": {"data_size": 926352, "index_size": 1514, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1157, "raw_key_size": 8947, "raw_average_key_size": 19, "raw_value_size": 918689, "raw_average_value_size": 2028, "num_data_blocks": 67, "num_entries": 453, "num_filter_entries": 453, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768921581, "oldest_key_time": 1768921581, "file_creation_time": 1768921632, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 119, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:07:12 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 73] Flush lasted 15287 microseconds, and 3702 cpu microseconds.
Jan 20 10:07:12 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:07:12 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:07:12.503829) [db/flush_job.cc:967] [default] [JOB 73] Level-0 flush table #119: 930086 bytes OK
Jan 20 10:07:12 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:07:12.503849) [db/memtable_list.cc:519] [default] Level-0 commit table #119 started
Jan 20 10:07:12 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:07:12.505818) [db/memtable_list.cc:722] [default] Level-0 commit table #119: memtable #1 done
Jan 20 10:07:12 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:07:12.505832) EVENT_LOG_v1 {"time_micros": 1768921632505828, "job": 73, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:07:12 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:07:12.505849) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:07:12 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 73] Try to delete WAL files size 1406019, prev total WAL file size 1406019, number of live WAL files 2.
Jan 20 10:07:12 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000115.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:07:12 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:07:12.506461) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730035303230' seq:72057594037927935, type:22 .. '7061786F730035323732' seq:0, type:0; will stop at (end)
Jan 20 10:07:12 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 74] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:07:12 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 73 Base level 0, inputs: [119(908KB)], [117(12MB)]
Jan 20 10:07:12 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921632506524, "job": 74, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [119], "files_L6": [117], "score": -1, "input_data_size": 13518817, "oldest_snapshot_seqno": -1}
Jan 20 10:07:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:12.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:12 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 74] Generated table #120: 8533 keys, 11606073 bytes, temperature: kUnknown
Jan 20 10:07:12 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921632650032, "cf_name": "default", "job": 74, "event": "table_file_creation", "file_number": 120, "file_size": 11606073, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11549668, "index_size": 33957, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 21381, "raw_key_size": 222249, "raw_average_key_size": 26, "raw_value_size": 11398386, "raw_average_value_size": 1335, "num_data_blocks": 1323, "num_entries": 8533, "num_filter_entries": 8533, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768921632, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 120, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:07:12 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:07:12 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:07:12.650293) [db/compaction/compaction_job.cc:1663] [default] [JOB 74] Compacted 1@0 + 1@6 files to L6 => 11606073 bytes
Jan 20 10:07:12 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:07:12.654513) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 94.2 rd, 80.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.9, 12.0 +0.0 blob) out(11.1 +0.0 blob), read-write-amplify(27.0) write-amplify(12.5) OK, records in: 9048, records dropped: 515 output_compression: NoCompression
Jan 20 10:07:12 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:07:12.654564) EVENT_LOG_v1 {"time_micros": 1768921632654536, "job": 74, "event": "compaction_finished", "compaction_time_micros": 143571, "compaction_time_cpu_micros": 26164, "output_level": 6, "num_output_files": 1, "total_output_size": 11606073, "num_input_records": 9048, "num_output_records": 8533, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:07:12 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000119.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:07:12 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921632654877, "job": 74, "event": "table_file_deletion", "file_number": 119}
Jan 20 10:07:12 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000117.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:07:12 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921632656887, "job": 74, "event": "table_file_deletion", "file_number": 117}
Jan 20 10:07:12 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:07:12.506379) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:07:12 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:07:12.656918) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:07:12 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:07:12.656922) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:07:12 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:07:12.656924) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:07:12 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:07:12.656926) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:07:12 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:07:12.656927) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:07:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e366 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:07:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e367 e367: 3 total, 3 up, 3 in
Jan 20 10:07:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:07:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2028028369' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:07:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:07:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2028028369' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:07:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:13.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e368 e368: 3 total, 3 up, 3 in
Jan 20 10:07:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:07:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:14.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:07:14 np0005588920 nova_compute[226886]: 2026-01-20 15:07:14.780 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:07:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:15.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:07:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:07:16.298 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=53, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=52) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:07:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:07:16.299 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:07:16 np0005588920 nova_compute[226886]: 2026-01-20 15:07:16.299 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:07:16.467 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:07:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:07:16.467 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:07:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:07:16.467 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:07:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:16.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:16 np0005588920 nova_compute[226886]: 2026-01-20 15:07:16.720 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:17.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e368 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:07:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:07:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:18.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:07:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:07:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:19.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:07:19 np0005588920 nova_compute[226886]: 2026-01-20 15:07:19.784 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:20.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:20 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e369 e369: 3 total, 3 up, 3 in
Jan 20 10:07:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:07:21.302 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '53'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:07:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:07:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:21.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:07:21 np0005588920 nova_compute[226886]: 2026-01-20 15:07:21.721 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:21 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:07:21 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/777705666' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:07:21 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:07:21 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/777705666' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:07:21 np0005588920 podman[288526]: 2026-01-20 15:07:21.967993277 +0000 UTC m=+0.051573323 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 20 10:07:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:22.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e369 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:07:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:23.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:24.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:24 np0005588920 nova_compute[226886]: 2026-01-20 15:07:24.786 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:24 np0005588920 nova_compute[226886]: 2026-01-20 15:07:24.790 226890 DEBUG oslo_concurrency.lockutils [None req-f22b9bef-93e2-459b-81cf-fccb6c9e248d a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Acquiring lock "beaa17a1-aac9-450b-8036-a3b2b9e10bb3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:07:24 np0005588920 nova_compute[226886]: 2026-01-20 15:07:24.791 226890 DEBUG oslo_concurrency.lockutils [None req-f22b9bef-93e2-459b-81cf-fccb6c9e248d a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lock "beaa17a1-aac9-450b-8036-a3b2b9e10bb3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:07:24 np0005588920 nova_compute[226886]: 2026-01-20 15:07:24.791 226890 DEBUG oslo_concurrency.lockutils [None req-f22b9bef-93e2-459b-81cf-fccb6c9e248d a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Acquiring lock "beaa17a1-aac9-450b-8036-a3b2b9e10bb3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:07:24 np0005588920 nova_compute[226886]: 2026-01-20 15:07:24.791 226890 DEBUG oslo_concurrency.lockutils [None req-f22b9bef-93e2-459b-81cf-fccb6c9e248d a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lock "beaa17a1-aac9-450b-8036-a3b2b9e10bb3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:07:24 np0005588920 nova_compute[226886]: 2026-01-20 15:07:24.792 226890 DEBUG oslo_concurrency.lockutils [None req-f22b9bef-93e2-459b-81cf-fccb6c9e248d a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lock "beaa17a1-aac9-450b-8036-a3b2b9e10bb3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:07:24 np0005588920 nova_compute[226886]: 2026-01-20 15:07:24.793 226890 INFO nova.compute.manager [None req-f22b9bef-93e2-459b-81cf-fccb6c9e248d a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Terminating instance#033[00m
Jan 20 10:07:24 np0005588920 nova_compute[226886]: 2026-01-20 15:07:24.794 226890 DEBUG nova.compute.manager [None req-f22b9bef-93e2-459b-81cf-fccb6c9e248d a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:07:24 np0005588920 kernel: tap69adac22-9a (unregistering): left promiscuous mode
Jan 20 10:07:24 np0005588920 NetworkManager[49076]: <info>  [1768921644.8726] device (tap69adac22-9a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:07:24 np0005588920 nova_compute[226886]: 2026-01-20 15:07:24.886 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:24 np0005588920 ovn_controller[133971]: 2026-01-20T15:07:24Z|00772|binding|INFO|Releasing lport 69adac22-9a54-4fc6-a0ad-9775bd380e99 from this chassis (sb_readonly=0)
Jan 20 10:07:24 np0005588920 ovn_controller[133971]: 2026-01-20T15:07:24Z|00773|binding|INFO|Setting lport 69adac22-9a54-4fc6-a0ad-9775bd380e99 down in Southbound
Jan 20 10:07:24 np0005588920 ovn_controller[133971]: 2026-01-20T15:07:24Z|00774|binding|INFO|Removing iface tap69adac22-9a ovn-installed in OVS
Jan 20 10:07:24 np0005588920 nova_compute[226886]: 2026-01-20 15:07:24.890 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:07:24.897 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c5:49:ce 10.100.0.11'], port_security=['fa:16:3e:c5:49:ce 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'beaa17a1-aac9-450b-8036-a3b2b9e10bb3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5bac39b9-563a-456f-9168-fd10b1b28c21', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e161d5a47f845fd89eb3f10627a0830', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'cd72b979-cfcf-4dbd-bbff-8e22cd1b4096', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c853729a-de72-4ddb-be59-bc41e08984ce, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=69adac22-9a54-4fc6-a0ad-9775bd380e99) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:07:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:07:24.898 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 69adac22-9a54-4fc6-a0ad-9775bd380e99 in datapath 5bac39b9-563a-456f-9168-fd10b1b28c21 unbound from our chassis#033[00m
Jan 20 10:07:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:07:24.899 144128 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 5bac39b9-563a-456f-9168-fd10b1b28c21 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 20 10:07:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:07:24.900 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a9d87898-9390-42d9-ac2f-cea4bedee2ba]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:07:24 np0005588920 nova_compute[226886]: 2026-01-20 15:07:24.916 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:24 np0005588920 systemd[1]: machine-qemu\x2d78\x2dinstance\x2d000000a3.scope: Deactivated successfully.
Jan 20 10:07:24 np0005588920 systemd[1]: machine-qemu\x2d78\x2dinstance\x2d000000a3.scope: Consumed 15.184s CPU time.
Jan 20 10:07:24 np0005588920 systemd-machined[196121]: Machine qemu-78-instance-000000a3 terminated.
Jan 20 10:07:25 np0005588920 nova_compute[226886]: 2026-01-20 15:07:25.028 226890 INFO nova.virt.libvirt.driver [-] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Instance destroyed successfully.#033[00m
Jan 20 10:07:25 np0005588920 nova_compute[226886]: 2026-01-20 15:07:25.028 226890 DEBUG nova.objects.instance [None req-f22b9bef-93e2-459b-81cf-fccb6c9e248d a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lazy-loading 'resources' on Instance uuid beaa17a1-aac9-450b-8036-a3b2b9e10bb3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:07:25 np0005588920 nova_compute[226886]: 2026-01-20 15:07:25.042 226890 DEBUG nova.virt.libvirt.vif [None req-f22b9bef-93e2-459b-81cf-fccb6c9e248d a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:05:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-2068211489',display_name='tempest-ServerRescueTestJSON-server-2068211489',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-2068211489',id=163,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:06:22Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5e161d5a47f845fd89eb3f10627a0830',ramdisk_id='',reservation_id='r-5fu5wq70',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_mi
n_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSON-1151598672',owner_user_name='tempest-ServerRescueTestJSON-1151598672-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:06:22Z,user_data=None,user_id='a2beb3d6247e457abd6e8d93cc602f02',uuid=beaa17a1-aac9-450b-8036-a3b2b9e10bb3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='rescued') vif={"id": "69adac22-9a54-4fc6-a0ad-9775bd380e99", "address": "fa:16:3e:c5:49:ce", "network": {"id": "5bac39b9-563a-456f-9168-fd10b1b28c21", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-3053955-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "5e161d5a47f845fd89eb3f10627a0830", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69adac22-9a", "ovs_interfaceid": "69adac22-9a54-4fc6-a0ad-9775bd380e99", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:07:25 np0005588920 nova_compute[226886]: 2026-01-20 15:07:25.042 226890 DEBUG nova.network.os_vif_util [None req-f22b9bef-93e2-459b-81cf-fccb6c9e248d a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Converting VIF {"id": "69adac22-9a54-4fc6-a0ad-9775bd380e99", "address": "fa:16:3e:c5:49:ce", "network": {"id": "5bac39b9-563a-456f-9168-fd10b1b28c21", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-3053955-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "5e161d5a47f845fd89eb3f10627a0830", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69adac22-9a", "ovs_interfaceid": "69adac22-9a54-4fc6-a0ad-9775bd380e99", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:07:25 np0005588920 nova_compute[226886]: 2026-01-20 15:07:25.044 226890 DEBUG nova.network.os_vif_util [None req-f22b9bef-93e2-459b-81cf-fccb6c9e248d a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c5:49:ce,bridge_name='br-int',has_traffic_filtering=True,id=69adac22-9a54-4fc6-a0ad-9775bd380e99,network=Network(5bac39b9-563a-456f-9168-fd10b1b28c21),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69adac22-9a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:07:25 np0005588920 nova_compute[226886]: 2026-01-20 15:07:25.044 226890 DEBUG os_vif [None req-f22b9bef-93e2-459b-81cf-fccb6c9e248d a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c5:49:ce,bridge_name='br-int',has_traffic_filtering=True,id=69adac22-9a54-4fc6-a0ad-9775bd380e99,network=Network(5bac39b9-563a-456f-9168-fd10b1b28c21),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69adac22-9a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:07:25 np0005588920 nova_compute[226886]: 2026-01-20 15:07:25.048 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:25 np0005588920 nova_compute[226886]: 2026-01-20 15:07:25.049 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap69adac22-9a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:07:25 np0005588920 nova_compute[226886]: 2026-01-20 15:07:25.051 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:25 np0005588920 nova_compute[226886]: 2026-01-20 15:07:25.052 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:25 np0005588920 nova_compute[226886]: 2026-01-20 15:07:25.055 226890 INFO os_vif [None req-f22b9bef-93e2-459b-81cf-fccb6c9e248d a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c5:49:ce,bridge_name='br-int',has_traffic_filtering=True,id=69adac22-9a54-4fc6-a0ad-9775bd380e99,network=Network(5bac39b9-563a-456f-9168-fd10b1b28c21),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69adac22-9a')#033[00m
Jan 20 10:07:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:25.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:25 np0005588920 nova_compute[226886]: 2026-01-20 15:07:25.873 226890 DEBUG nova.compute.manager [req-5de2a7da-8919-4085-8ba7-2371e992fe3d req-4a7ea650-6e8f-42c0-9974-b90ac05e1684 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Received event network-vif-unplugged-69adac22-9a54-4fc6-a0ad-9775bd380e99 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:07:25 np0005588920 nova_compute[226886]: 2026-01-20 15:07:25.873 226890 DEBUG oslo_concurrency.lockutils [req-5de2a7da-8919-4085-8ba7-2371e992fe3d req-4a7ea650-6e8f-42c0-9974-b90ac05e1684 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "beaa17a1-aac9-450b-8036-a3b2b9e10bb3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:07:25 np0005588920 nova_compute[226886]: 2026-01-20 15:07:25.874 226890 DEBUG oslo_concurrency.lockutils [req-5de2a7da-8919-4085-8ba7-2371e992fe3d req-4a7ea650-6e8f-42c0-9974-b90ac05e1684 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "beaa17a1-aac9-450b-8036-a3b2b9e10bb3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:07:25 np0005588920 nova_compute[226886]: 2026-01-20 15:07:25.874 226890 DEBUG oslo_concurrency.lockutils [req-5de2a7da-8919-4085-8ba7-2371e992fe3d req-4a7ea650-6e8f-42c0-9974-b90ac05e1684 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "beaa17a1-aac9-450b-8036-a3b2b9e10bb3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:07:25 np0005588920 nova_compute[226886]: 2026-01-20 15:07:25.874 226890 DEBUG nova.compute.manager [req-5de2a7da-8919-4085-8ba7-2371e992fe3d req-4a7ea650-6e8f-42c0-9974-b90ac05e1684 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] No waiting events found dispatching network-vif-unplugged-69adac22-9a54-4fc6-a0ad-9775bd380e99 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:07:25 np0005588920 nova_compute[226886]: 2026-01-20 15:07:25.874 226890 DEBUG nova.compute.manager [req-5de2a7da-8919-4085-8ba7-2371e992fe3d req-4a7ea650-6e8f-42c0-9974-b90ac05e1684 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Received event network-vif-unplugged-69adac22-9a54-4fc6-a0ad-9775bd380e99 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:07:25 np0005588920 nova_compute[226886]: 2026-01-20 15:07:25.875 226890 DEBUG nova.compute.manager [req-5de2a7da-8919-4085-8ba7-2371e992fe3d req-4a7ea650-6e8f-42c0-9974-b90ac05e1684 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Received event network-vif-plugged-69adac22-9a54-4fc6-a0ad-9775bd380e99 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:07:25 np0005588920 nova_compute[226886]: 2026-01-20 15:07:25.875 226890 DEBUG oslo_concurrency.lockutils [req-5de2a7da-8919-4085-8ba7-2371e992fe3d req-4a7ea650-6e8f-42c0-9974-b90ac05e1684 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "beaa17a1-aac9-450b-8036-a3b2b9e10bb3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:07:25 np0005588920 nova_compute[226886]: 2026-01-20 15:07:25.875 226890 DEBUG oslo_concurrency.lockutils [req-5de2a7da-8919-4085-8ba7-2371e992fe3d req-4a7ea650-6e8f-42c0-9974-b90ac05e1684 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "beaa17a1-aac9-450b-8036-a3b2b9e10bb3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:07:25 np0005588920 nova_compute[226886]: 2026-01-20 15:07:25.875 226890 DEBUG oslo_concurrency.lockutils [req-5de2a7da-8919-4085-8ba7-2371e992fe3d req-4a7ea650-6e8f-42c0-9974-b90ac05e1684 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "beaa17a1-aac9-450b-8036-a3b2b9e10bb3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:07:25 np0005588920 nova_compute[226886]: 2026-01-20 15:07:25.876 226890 DEBUG nova.compute.manager [req-5de2a7da-8919-4085-8ba7-2371e992fe3d req-4a7ea650-6e8f-42c0-9974-b90ac05e1684 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] No waiting events found dispatching network-vif-plugged-69adac22-9a54-4fc6-a0ad-9775bd380e99 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:07:25 np0005588920 nova_compute[226886]: 2026-01-20 15:07:25.876 226890 WARNING nova.compute.manager [req-5de2a7da-8919-4085-8ba7-2371e992fe3d req-4a7ea650-6e8f-42c0-9974-b90ac05e1684 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Received unexpected event network-vif-plugged-69adac22-9a54-4fc6-a0ad-9775bd380e99 for instance with vm_state rescued and task_state deleting.#033[00m
Jan 20 10:07:25 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e370 e370: 3 total, 3 up, 3 in
Jan 20 10:07:26 np0005588920 nova_compute[226886]: 2026-01-20 15:07:25.999 226890 INFO nova.virt.libvirt.driver [None req-f22b9bef-93e2-459b-81cf-fccb6c9e248d a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Deleting instance files /var/lib/nova/instances/beaa17a1-aac9-450b-8036-a3b2b9e10bb3_del#033[00m
Jan 20 10:07:26 np0005588920 nova_compute[226886]: 2026-01-20 15:07:26.000 226890 INFO nova.virt.libvirt.driver [None req-f22b9bef-93e2-459b-81cf-fccb6c9e248d a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Deletion of /var/lib/nova/instances/beaa17a1-aac9-450b-8036-a3b2b9e10bb3_del complete#033[00m
Jan 20 10:07:26 np0005588920 nova_compute[226886]: 2026-01-20 15:07:26.065 226890 INFO nova.compute.manager [None req-f22b9bef-93e2-459b-81cf-fccb6c9e248d a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Took 1.27 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:07:26 np0005588920 nova_compute[226886]: 2026-01-20 15:07:26.066 226890 DEBUG oslo.service.loopingcall [None req-f22b9bef-93e2-459b-81cf-fccb6c9e248d a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:07:26 np0005588920 nova_compute[226886]: 2026-01-20 15:07:26.066 226890 DEBUG nova.compute.manager [-] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:07:26 np0005588920 nova_compute[226886]: 2026-01-20 15:07:26.066 226890 DEBUG nova.network.neutron [-] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:07:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:26.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:26 np0005588920 nova_compute[226886]: 2026-01-20 15:07:26.723 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e370 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:07:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:27.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:27 np0005588920 nova_compute[226886]: 2026-01-20 15:07:27.802 226890 DEBUG nova.network.neutron [-] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:07:27 np0005588920 nova_compute[226886]: 2026-01-20 15:07:27.820 226890 INFO nova.compute.manager [-] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Took 1.75 seconds to deallocate network for instance.#033[00m
Jan 20 10:07:27 np0005588920 nova_compute[226886]: 2026-01-20 15:07:27.865 226890 DEBUG oslo_concurrency.lockutils [None req-f22b9bef-93e2-459b-81cf-fccb6c9e248d a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:07:27 np0005588920 nova_compute[226886]: 2026-01-20 15:07:27.865 226890 DEBUG oslo_concurrency.lockutils [None req-f22b9bef-93e2-459b-81cf-fccb6c9e248d a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:07:27 np0005588920 nova_compute[226886]: 2026-01-20 15:07:27.931 226890 DEBUG oslo_concurrency.processutils [None req-f22b9bef-93e2-459b-81cf-fccb6c9e248d a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:07:27 np0005588920 nova_compute[226886]: 2026-01-20 15:07:27.970 226890 DEBUG nova.compute.manager [req-de2566bd-9fed-43bf-8a40-4b9e96148a11 req-f967cf2e-78f9-47c0-808b-5a5a2b98e8a6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Received event network-vif-deleted-69adac22-9a54-4fc6-a0ad-9775bd380e99 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:07:28 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:07:28 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/479322346' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:07:28 np0005588920 nova_compute[226886]: 2026-01-20 15:07:28.357 226890 DEBUG oslo_concurrency.processutils [None req-f22b9bef-93e2-459b-81cf-fccb6c9e248d a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:07:28 np0005588920 nova_compute[226886]: 2026-01-20 15:07:28.363 226890 DEBUG nova.compute.provider_tree [None req-f22b9bef-93e2-459b-81cf-fccb6c9e248d a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:07:28 np0005588920 nova_compute[226886]: 2026-01-20 15:07:28.392 226890 DEBUG nova.scheduler.client.report [None req-f22b9bef-93e2-459b-81cf-fccb6c9e248d a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:07:28 np0005588920 nova_compute[226886]: 2026-01-20 15:07:28.431 226890 DEBUG oslo_concurrency.lockutils [None req-f22b9bef-93e2-459b-81cf-fccb6c9e248d a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.566s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:07:28 np0005588920 nova_compute[226886]: 2026-01-20 15:07:28.460 226890 INFO nova.scheduler.client.report [None req-f22b9bef-93e2-459b-81cf-fccb6c9e248d a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Deleted allocations for instance beaa17a1-aac9-450b-8036-a3b2b9e10bb3#033[00m
Jan 20 10:07:28 np0005588920 nova_compute[226886]: 2026-01-20 15:07:28.526 226890 DEBUG oslo_concurrency.lockutils [None req-f22b9bef-93e2-459b-81cf-fccb6c9e248d a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lock "beaa17a1-aac9-450b-8036-a3b2b9e10bb3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.735s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:07:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:28.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:29.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:30 np0005588920 nova_compute[226886]: 2026-01-20 15:07:30.054 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:30.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:31.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:31 np0005588920 nova_compute[226886]: 2026-01-20 15:07:31.725 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:32.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e370 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:07:33 np0005588920 nova_compute[226886]: 2026-01-20 15:07:33.384 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:33.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:34.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:35 np0005588920 nova_compute[226886]: 2026-01-20 15:07:35.096 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:35.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:07:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:36.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:07:36 np0005588920 nova_compute[226886]: 2026-01-20 15:07:36.727 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e370 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:07:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:37.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:07:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:38.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:07:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:39.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:40 np0005588920 nova_compute[226886]: 2026-01-20 15:07:40.027 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768921645.0255458, beaa17a1-aac9-450b-8036-a3b2b9e10bb3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:07:40 np0005588920 nova_compute[226886]: 2026-01-20 15:07:40.027 226890 INFO nova.compute.manager [-] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:07:40 np0005588920 nova_compute[226886]: 2026-01-20 15:07:40.054 226890 DEBUG nova.compute.manager [None req-cbbe7209-4620-4072-9730-0a8d50091614 - - - - - -] [instance: beaa17a1-aac9-450b-8036-a3b2b9e10bb3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:07:40 np0005588920 nova_compute[226886]: 2026-01-20 15:07:40.100 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:07:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:40.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:07:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:07:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:41.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:07:41 np0005588920 nova_compute[226886]: 2026-01-20 15:07:41.729 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:42 np0005588920 podman[288607]: 2026-01-20 15:07:42.004158271 +0000 UTC m=+0.090534476 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 20 10:07:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:42.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e370 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:07:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:07:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:43.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:07:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:44.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:45 np0005588920 nova_compute[226886]: 2026-01-20 15:07:45.104 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:45.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:46.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:46 np0005588920 nova_compute[226886]: 2026-01-20 15:07:46.730 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e370 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:07:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:47.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:48.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:49.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:50 np0005588920 nova_compute[226886]: 2026-01-20 15:07:50.108 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:50.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:51 np0005588920 nova_compute[226886]: 2026-01-20 15:07:51.732 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:07:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:51.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:07:52 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 10:07:52 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4200.0 total, 600.0 interval#012Cumulative writes: 12K writes, 61K keys, 12K commit groups, 1.0 writes per commit group, ingest: 0.12 GB, 0.03 MB/s#012Cumulative WAL: 12K writes, 12K syncs, 1.00 writes per sync, written: 0.12 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1704 writes, 8439 keys, 1704 commit groups, 1.0 writes per commit group, ingest: 16.78 MB, 0.03 MB/s#012Interval WAL: 1704 writes, 1704 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     79.0      0.93              0.28        37    0.025       0      0       0.0       0.0#012  L6      1/0   11.07 MB   0.0      0.4     0.1      0.3       0.3      0.0       0.0   4.7    108.6     92.0      3.74              1.09        36    0.104    236K    19K       0.0       0.0#012 Sum      1/0   11.07 MB   0.0      0.4     0.1      0.3       0.4      0.1       0.0   5.7     87.0     89.4      4.67              1.37        73    0.064    236K    19K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   7.7     86.7     88.0      0.91              0.22        12    0.076     53K   3158       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.4     0.1      0.3       0.3      0.0       0.0   0.0    108.6     92.0      3.74              1.09        36    0.104    236K    19K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     79.1      0.93              0.28        36    0.026       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 4200.0 total, 600.0 interval#012Flush(GB): cumulative 0.072, interval 0.010#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.41 GB write, 0.10 MB/s write, 0.40 GB read, 0.10 MB/s read, 4.7 seconds#012Interval compaction: 0.08 GB write, 0.13 MB/s write, 0.08 GB read, 0.13 MB/s read, 0.9 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x564a2f9711f0#2 capacity: 304.00 MB usage: 46.40 MB table_size: 0 occupancy: 18446744073709551615 collections: 8 last_copies: 0 last_secs: 0.000365 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(2664,44.64 MB,14.685%) FilterBlock(73,666.36 KB,0.21406%) IndexBlock(73,1.10 MB,0.363054%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 20 10:07:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:52.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e370 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:07:52 np0005588920 nova_compute[226886]: 2026-01-20 15:07:52.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:07:52 np0005588920 nova_compute[226886]: 2026-01-20 15:07:52.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:07:52 np0005588920 nova_compute[226886]: 2026-01-20 15:07:52.726 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:07:52 np0005588920 nova_compute[226886]: 2026-01-20 15:07:52.757 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:07:52 np0005588920 podman[288633]: 2026-01-20 15:07:52.956180283 +0000 UTC m=+0.045597653 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 20 10:07:53 np0005588920 nova_compute[226886]: 2026-01-20 15:07:53.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:07:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:07:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:53.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:07:54 np0005588920 nova_compute[226886]: 2026-01-20 15:07:54.240 226890 DEBUG oslo_concurrency.lockutils [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Acquiring lock "2a67c102-89d1-4196-bc8e-663656945547" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:07:54 np0005588920 nova_compute[226886]: 2026-01-20 15:07:54.241 226890 DEBUG oslo_concurrency.lockutils [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "2a67c102-89d1-4196-bc8e-663656945547" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:07:54 np0005588920 nova_compute[226886]: 2026-01-20 15:07:54.276 226890 DEBUG nova.compute.manager [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 10:07:54 np0005588920 nova_compute[226886]: 2026-01-20 15:07:54.409 226890 DEBUG oslo_concurrency.lockutils [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:07:54 np0005588920 nova_compute[226886]: 2026-01-20 15:07:54.410 226890 DEBUG oslo_concurrency.lockutils [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:07:54 np0005588920 nova_compute[226886]: 2026-01-20 15:07:54.417 226890 DEBUG nova.virt.hardware [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 10:07:54 np0005588920 nova_compute[226886]: 2026-01-20 15:07:54.417 226890 INFO nova.compute.claims [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 10:07:54 np0005588920 nova_compute[226886]: 2026-01-20 15:07:54.509 226890 DEBUG oslo_concurrency.processutils [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:07:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:54.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:54 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:07:54 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1077560602' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:07:54 np0005588920 nova_compute[226886]: 2026-01-20 15:07:54.967 226890 DEBUG oslo_concurrency.processutils [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:07:54 np0005588920 nova_compute[226886]: 2026-01-20 15:07:54.976 226890 DEBUG nova.compute.provider_tree [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:07:54 np0005588920 nova_compute[226886]: 2026-01-20 15:07:54.995 226890 DEBUG nova.scheduler.client.report [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:07:55 np0005588920 nova_compute[226886]: 2026-01-20 15:07:55.016 226890 DEBUG oslo_concurrency.lockutils [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.606s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:07:55 np0005588920 nova_compute[226886]: 2026-01-20 15:07:55.017 226890 DEBUG nova.compute.manager [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 10:07:55 np0005588920 nova_compute[226886]: 2026-01-20 15:07:55.088 226890 DEBUG nova.compute.manager [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 10:07:55 np0005588920 nova_compute[226886]: 2026-01-20 15:07:55.089 226890 DEBUG nova.network.neutron [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 10:07:55 np0005588920 nova_compute[226886]: 2026-01-20 15:07:55.112 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:55 np0005588920 nova_compute[226886]: 2026-01-20 15:07:55.118 226890 INFO nova.virt.libvirt.driver [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 10:07:55 np0005588920 nova_compute[226886]: 2026-01-20 15:07:55.142 226890 DEBUG nova.compute.manager [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 10:07:55 np0005588920 nova_compute[226886]: 2026-01-20 15:07:55.246 226890 DEBUG nova.compute.manager [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 10:07:55 np0005588920 nova_compute[226886]: 2026-01-20 15:07:55.247 226890 DEBUG nova.virt.libvirt.driver [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 10:07:55 np0005588920 nova_compute[226886]: 2026-01-20 15:07:55.248 226890 INFO nova.virt.libvirt.driver [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Creating image(s)#033[00m
Jan 20 10:07:55 np0005588920 nova_compute[226886]: 2026-01-20 15:07:55.269 226890 DEBUG nova.storage.rbd_utils [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] rbd image 2a67c102-89d1-4196-bc8e-663656945547_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:07:55 np0005588920 nova_compute[226886]: 2026-01-20 15:07:55.294 226890 DEBUG nova.storage.rbd_utils [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] rbd image 2a67c102-89d1-4196-bc8e-663656945547_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:07:55 np0005588920 nova_compute[226886]: 2026-01-20 15:07:55.320 226890 DEBUG nova.storage.rbd_utils [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] rbd image 2a67c102-89d1-4196-bc8e-663656945547_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:07:55 np0005588920 nova_compute[226886]: 2026-01-20 15:07:55.324 226890 DEBUG oslo_concurrency.processutils [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:07:55 np0005588920 nova_compute[226886]: 2026-01-20 15:07:55.389 226890 DEBUG oslo_concurrency.processutils [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:07:55 np0005588920 nova_compute[226886]: 2026-01-20 15:07:55.390 226890 DEBUG oslo_concurrency.lockutils [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:07:55 np0005588920 nova_compute[226886]: 2026-01-20 15:07:55.391 226890 DEBUG oslo_concurrency.lockutils [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:07:55 np0005588920 nova_compute[226886]: 2026-01-20 15:07:55.391 226890 DEBUG oslo_concurrency.lockutils [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:07:55 np0005588920 nova_compute[226886]: 2026-01-20 15:07:55.416 226890 DEBUG nova.storage.rbd_utils [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] rbd image 2a67c102-89d1-4196-bc8e-663656945547_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:07:55 np0005588920 nova_compute[226886]: 2026-01-20 15:07:55.420 226890 DEBUG oslo_concurrency.processutils [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 2a67c102-89d1-4196-bc8e-663656945547_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:07:55 np0005588920 nova_compute[226886]: 2026-01-20 15:07:55.448 226890 DEBUG nova.policy [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '27658864f96d453586dd0846a4c55b7d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fc74c4a296554866969b05aef75252af', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 10:07:55 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:07:55 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:07:55 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:07:55 np0005588920 nova_compute[226886]: 2026-01-20 15:07:55.722 226890 DEBUG oslo_concurrency.processutils [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 2a67c102-89d1-4196-bc8e-663656945547_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.302s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:07:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:55.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:55 np0005588920 nova_compute[226886]: 2026-01-20 15:07:55.780 226890 DEBUG nova.storage.rbd_utils [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] resizing rbd image 2a67c102-89d1-4196-bc8e-663656945547_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 10:07:55 np0005588920 nova_compute[226886]: 2026-01-20 15:07:55.861 226890 DEBUG nova.objects.instance [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lazy-loading 'migration_context' on Instance uuid 2a67c102-89d1-4196-bc8e-663656945547 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:07:55 np0005588920 nova_compute[226886]: 2026-01-20 15:07:55.878 226890 DEBUG nova.virt.libvirt.driver [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 10:07:55 np0005588920 nova_compute[226886]: 2026-01-20 15:07:55.878 226890 DEBUG nova.virt.libvirt.driver [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Ensure instance console log exists: /var/lib/nova/instances/2a67c102-89d1-4196-bc8e-663656945547/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:07:55 np0005588920 nova_compute[226886]: 2026-01-20 15:07:55.878 226890 DEBUG oslo_concurrency.lockutils [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:07:55 np0005588920 nova_compute[226886]: 2026-01-20 15:07:55.879 226890 DEBUG oslo_concurrency.lockutils [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:07:55 np0005588920 nova_compute[226886]: 2026-01-20 15:07:55.879 226890 DEBUG oslo_concurrency.lockutils [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:07:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:56.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:56 np0005588920 nova_compute[226886]: 2026-01-20 15:07:56.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:07:56 np0005588920 nova_compute[226886]: 2026-01-20 15:07:56.727 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:07:56 np0005588920 nova_compute[226886]: 2026-01-20 15:07:56.734 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:56 np0005588920 nova_compute[226886]: 2026-01-20 15:07:56.748 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:07:56 np0005588920 nova_compute[226886]: 2026-01-20 15:07:56.749 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:07:56 np0005588920 nova_compute[226886]: 2026-01-20 15:07:56.749 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:07:56 np0005588920 nova_compute[226886]: 2026-01-20 15:07:56.750 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:07:56 np0005588920 nova_compute[226886]: 2026-01-20 15:07:56.750 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:07:57 np0005588920 nova_compute[226886]: 2026-01-20 15:07:57.049 226890 DEBUG nova.network.neutron [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Successfully created port: 12d4a8f6-904d-4ec5-8062-530c89300b7c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 10:07:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:07:57 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1565062573' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:07:57 np0005588920 nova_compute[226886]: 2026-01-20 15:07:57.166 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.415s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:07:57 np0005588920 nova_compute[226886]: 2026-01-20 15:07:57.332 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:07:57 np0005588920 nova_compute[226886]: 2026-01-20 15:07:57.333 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4195MB free_disk=20.986278533935547GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:07:57 np0005588920 nova_compute[226886]: 2026-01-20 15:07:57.334 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:07:57 np0005588920 nova_compute[226886]: 2026-01-20 15:07:57.334 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:07:57 np0005588920 nova_compute[226886]: 2026-01-20 15:07:57.395 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance 2a67c102-89d1-4196-bc8e-663656945547 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:07:57 np0005588920 nova_compute[226886]: 2026-01-20 15:07:57.396 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:07:57 np0005588920 nova_compute[226886]: 2026-01-20 15:07:57.396 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:07:57 np0005588920 nova_compute[226886]: 2026-01-20 15:07:57.434 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:07:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e370 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:07:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:57.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:07:57 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3322640165' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:07:57 np0005588920 nova_compute[226886]: 2026-01-20 15:07:57.881 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:07:57 np0005588920 nova_compute[226886]: 2026-01-20 15:07:57.887 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:07:57 np0005588920 nova_compute[226886]: 2026-01-20 15:07:57.911 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:07:57 np0005588920 nova_compute[226886]: 2026-01-20 15:07:57.934 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:07:57 np0005588920 nova_compute[226886]: 2026-01-20 15:07:57.934 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.600s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:07:58 np0005588920 nova_compute[226886]: 2026-01-20 15:07:58.448 226890 DEBUG nova.network.neutron [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Successfully updated port: 12d4a8f6-904d-4ec5-8062-530c89300b7c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 10:07:58 np0005588920 nova_compute[226886]: 2026-01-20 15:07:58.464 226890 DEBUG oslo_concurrency.lockutils [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Acquiring lock "refresh_cache-2a67c102-89d1-4196-bc8e-663656945547" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:07:58 np0005588920 nova_compute[226886]: 2026-01-20 15:07:58.465 226890 DEBUG oslo_concurrency.lockutils [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Acquired lock "refresh_cache-2a67c102-89d1-4196-bc8e-663656945547" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:07:58 np0005588920 nova_compute[226886]: 2026-01-20 15:07:58.465 226890 DEBUG nova.network.neutron [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:07:58 np0005588920 nova_compute[226886]: 2026-01-20 15:07:58.631 226890 DEBUG nova.compute.manager [req-ca148a8c-3a1b-4a1a-8bc4-289d53e9bc2b req-7f9cc099-1991-4239-9305-0bf8c8383956 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Received event network-changed-12d4a8f6-904d-4ec5-8062-530c89300b7c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:07:58 np0005588920 nova_compute[226886]: 2026-01-20 15:07:58.632 226890 DEBUG nova.compute.manager [req-ca148a8c-3a1b-4a1a-8bc4-289d53e9bc2b req-7f9cc099-1991-4239-9305-0bf8c8383956 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Refreshing instance network info cache due to event network-changed-12d4a8f6-904d-4ec5-8062-530c89300b7c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:07:58 np0005588920 nova_compute[226886]: 2026-01-20 15:07:58.632 226890 DEBUG oslo_concurrency.lockutils [req-ca148a8c-3a1b-4a1a-8bc4-289d53e9bc2b req-7f9cc099-1991-4239-9305-0bf8c8383956 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-2a67c102-89d1-4196-bc8e-663656945547" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:07:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:58.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:58 np0005588920 nova_compute[226886]: 2026-01-20 15:07:58.726 226890 DEBUG nova.network.neutron [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:07:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:07:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:59.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:59 np0005588920 nova_compute[226886]: 2026-01-20 15:07:59.933 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:07:59 np0005588920 nova_compute[226886]: 2026-01-20 15:07:59.934 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:07:59 np0005588920 nova_compute[226886]: 2026-01-20 15:07:59.934 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:07:59 np0005588920 nova_compute[226886]: 2026-01-20 15:07:59.934 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:07:59 np0005588920 nova_compute[226886]: 2026-01-20 15:07:59.935 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:08:00 np0005588920 nova_compute[226886]: 2026-01-20 15:08:00.116 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:00 np0005588920 nova_compute[226886]: 2026-01-20 15:08:00.199 226890 DEBUG nova.network.neutron [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Updating instance_info_cache with network_info: [{"id": "12d4a8f6-904d-4ec5-8062-530c89300b7c", "address": "fa:16:3e:d4:df:63", "network": {"id": "3967ae21-1590-4685-8881-8bd1bcf25258", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fc74c4a296554866969b05aef75252af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12d4a8f6-90", "ovs_interfaceid": "12d4a8f6-904d-4ec5-8062-530c89300b7c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:08:00 np0005588920 nova_compute[226886]: 2026-01-20 15:08:00.226 226890 DEBUG oslo_concurrency.lockutils [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Releasing lock "refresh_cache-2a67c102-89d1-4196-bc8e-663656945547" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:08:00 np0005588920 nova_compute[226886]: 2026-01-20 15:08:00.227 226890 DEBUG nova.compute.manager [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Instance network_info: |[{"id": "12d4a8f6-904d-4ec5-8062-530c89300b7c", "address": "fa:16:3e:d4:df:63", "network": {"id": "3967ae21-1590-4685-8881-8bd1bcf25258", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fc74c4a296554866969b05aef75252af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12d4a8f6-90", "ovs_interfaceid": "12d4a8f6-904d-4ec5-8062-530c89300b7c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 10:08:00 np0005588920 nova_compute[226886]: 2026-01-20 15:08:00.227 226890 DEBUG oslo_concurrency.lockutils [req-ca148a8c-3a1b-4a1a-8bc4-289d53e9bc2b req-7f9cc099-1991-4239-9305-0bf8c8383956 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-2a67c102-89d1-4196-bc8e-663656945547" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:08:00 np0005588920 nova_compute[226886]: 2026-01-20 15:08:00.227 226890 DEBUG nova.network.neutron [req-ca148a8c-3a1b-4a1a-8bc4-289d53e9bc2b req-7f9cc099-1991-4239-9305-0bf8c8383956 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Refreshing network info cache for port 12d4a8f6-904d-4ec5-8062-530c89300b7c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:08:00 np0005588920 nova_compute[226886]: 2026-01-20 15:08:00.230 226890 DEBUG nova.virt.libvirt.driver [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Start _get_guest_xml network_info=[{"id": "12d4a8f6-904d-4ec5-8062-530c89300b7c", "address": "fa:16:3e:d4:df:63", "network": {"id": "3967ae21-1590-4685-8881-8bd1bcf25258", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fc74c4a296554866969b05aef75252af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12d4a8f6-90", "ovs_interfaceid": "12d4a8f6-904d-4ec5-8062-530c89300b7c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:08:00 np0005588920 nova_compute[226886]: 2026-01-20 15:08:00.233 226890 WARNING nova.virt.libvirt.driver [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:08:00 np0005588920 nova_compute[226886]: 2026-01-20 15:08:00.237 226890 DEBUG nova.virt.libvirt.host [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:08:00 np0005588920 nova_compute[226886]: 2026-01-20 15:08:00.237 226890 DEBUG nova.virt.libvirt.host [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:08:00 np0005588920 nova_compute[226886]: 2026-01-20 15:08:00.251 226890 DEBUG nova.virt.libvirt.host [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:08:00 np0005588920 nova_compute[226886]: 2026-01-20 15:08:00.251 226890 DEBUG nova.virt.libvirt.host [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:08:00 np0005588920 nova_compute[226886]: 2026-01-20 15:08:00.253 226890 DEBUG nova.virt.libvirt.driver [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:08:00 np0005588920 nova_compute[226886]: 2026-01-20 15:08:00.253 226890 DEBUG nova.virt.hardware [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:08:00 np0005588920 nova_compute[226886]: 2026-01-20 15:08:00.253 226890 DEBUG nova.virt.hardware [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:08:00 np0005588920 nova_compute[226886]: 2026-01-20 15:08:00.254 226890 DEBUG nova.virt.hardware [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:08:00 np0005588920 nova_compute[226886]: 2026-01-20 15:08:00.254 226890 DEBUG nova.virt.hardware [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:08:00 np0005588920 nova_compute[226886]: 2026-01-20 15:08:00.254 226890 DEBUG nova.virt.hardware [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:08:00 np0005588920 nova_compute[226886]: 2026-01-20 15:08:00.255 226890 DEBUG nova.virt.hardware [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:08:00 np0005588920 nova_compute[226886]: 2026-01-20 15:08:00.255 226890 DEBUG nova.virt.hardware [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:08:00 np0005588920 nova_compute[226886]: 2026-01-20 15:08:00.255 226890 DEBUG nova.virt.hardware [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:08:00 np0005588920 nova_compute[226886]: 2026-01-20 15:08:00.256 226890 DEBUG nova.virt.hardware [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:08:00 np0005588920 nova_compute[226886]: 2026-01-20 15:08:00.256 226890 DEBUG nova.virt.hardware [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:08:00 np0005588920 nova_compute[226886]: 2026-01-20 15:08:00.256 226890 DEBUG nova.virt.hardware [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:08:00 np0005588920 nova_compute[226886]: 2026-01-20 15:08:00.259 226890 DEBUG oslo_concurrency.processutils [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:08:00 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:08:00 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/537198121' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:08:00 np0005588920 nova_compute[226886]: 2026-01-20 15:08:00.687 226890 DEBUG oslo_concurrency.processutils [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.428s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:08:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:00.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:00 np0005588920 nova_compute[226886]: 2026-01-20 15:08:00.711 226890 DEBUG nova.storage.rbd_utils [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] rbd image 2a67c102-89d1-4196-bc8e-663656945547_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:08:00 np0005588920 nova_compute[226886]: 2026-01-20 15:08:00.715 226890 DEBUG oslo_concurrency.processutils [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:08:00 np0005588920 nova_compute[226886]: 2026-01-20 15:08:00.741 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:08:00 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #121. Immutable memtables: 0.
Jan 20 10:08:00 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:08:00.904486) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:08:00 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 75] Flushing memtable with next log file: 121
Jan 20 10:08:00 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921680904516, "job": 75, "event": "flush_started", "num_memtables": 1, "num_entries": 806, "num_deletes": 253, "total_data_size": 1391325, "memory_usage": 1410168, "flush_reason": "Manual Compaction"}
Jan 20 10:08:00 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 75] Level-0 flush table #122: started
Jan 20 10:08:00 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921680914183, "cf_name": "default", "job": 75, "event": "table_file_creation", "file_number": 122, "file_size": 917779, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 60792, "largest_seqno": 61592, "table_properties": {"data_size": 913954, "index_size": 1605, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1157, "raw_key_size": 8198, "raw_average_key_size": 17, "raw_value_size": 906097, "raw_average_value_size": 1982, "num_data_blocks": 70, "num_entries": 457, "num_filter_entries": 457, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768921633, "oldest_key_time": 1768921633, "file_creation_time": 1768921680, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 122, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:08:00 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 75] Flush lasted 9763 microseconds, and 4252 cpu microseconds.
Jan 20 10:08:00 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:08:00 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:08:00.914250) [db/flush_job.cc:967] [default] [JOB 75] Level-0 flush table #122: 917779 bytes OK
Jan 20 10:08:00 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:08:00.914266) [db/memtable_list.cc:519] [default] Level-0 commit table #122 started
Jan 20 10:08:00 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:08:00.916463) [db/memtable_list.cc:722] [default] Level-0 commit table #122: memtable #1 done
Jan 20 10:08:00 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:08:00.916510) EVENT_LOG_v1 {"time_micros": 1768921680916500, "job": 75, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:08:00 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:08:00.916535) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:08:00 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 75] Try to delete WAL files size 1387086, prev total WAL file size 1387086, number of live WAL files 2.
Jan 20 10:08:00 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000118.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:08:00 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:08:00.917317) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B7600323536' seq:72057594037927935, type:22 .. '6B7600353037' seq:0, type:0; will stop at (end)
Jan 20 10:08:00 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 76] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:08:00 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 75 Base level 0, inputs: [122(896KB)], [120(11MB)]
Jan 20 10:08:00 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921680917359, "job": 76, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [122], "files_L6": [120], "score": -1, "input_data_size": 12523852, "oldest_snapshot_seqno": -1}
Jan 20 10:08:01 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 76] Generated table #123: 8468 keys, 11460705 bytes, temperature: kUnknown
Jan 20 10:08:01 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921681046895, "cf_name": "default", "job": 76, "event": "table_file_creation", "file_number": 123, "file_size": 11460705, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11404778, "index_size": 33671, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 21189, "raw_key_size": 222658, "raw_average_key_size": 26, "raw_value_size": 11254417, "raw_average_value_size": 1329, "num_data_blocks": 1293, "num_entries": 8468, "num_filter_entries": 8468, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768921680, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 123, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:08:01 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:08:01 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:08:01.047120) [db/compaction/compaction_job.cc:1663] [default] [JOB 76] Compacted 1@0 + 1@6 files to L6 => 11460705 bytes
Jan 20 10:08:01 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:08:01.049902) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 96.6 rd, 88.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.9, 11.1 +0.0 blob) out(10.9 +0.0 blob), read-write-amplify(26.1) write-amplify(12.5) OK, records in: 8990, records dropped: 522 output_compression: NoCompression
Jan 20 10:08:01 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:08:01.049956) EVENT_LOG_v1 {"time_micros": 1768921681049936, "job": 76, "event": "compaction_finished", "compaction_time_micros": 129598, "compaction_time_cpu_micros": 33536, "output_level": 6, "num_output_files": 1, "total_output_size": 11460705, "num_input_records": 8990, "num_output_records": 8468, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:08:01 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000122.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:08:01 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921681050566, "job": 76, "event": "table_file_deletion", "file_number": 122}
Jan 20 10:08:01 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000120.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:08:01 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921681053519, "job": 76, "event": "table_file_deletion", "file_number": 120}
Jan 20 10:08:01 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:08:00.917271) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:08:01 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:08:01.053556) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:08:01 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:08:01.053560) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:08:01 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:08:01.053562) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:08:01 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:08:01.053564) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:08:01 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:08:01.053566) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:08:01 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:08:01 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3470223084' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:08:01 np0005588920 nova_compute[226886]: 2026-01-20 15:08:01.172 226890 DEBUG oslo_concurrency.processutils [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:08:01 np0005588920 nova_compute[226886]: 2026-01-20 15:08:01.175 226890 DEBUG nova.virt.libvirt.vif [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:07:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1209466933',display_name='tempest-ServerRescueNegativeTestJSON-server-1209466933',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-1209466933',id=167,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fc74c4a296554866969b05aef75252af',ramdisk_id='',reservation_id='r-79z9txcr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-1649662639',owner_user_name
='tempest-ServerRescueNegativeTestJSON-1649662639-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:07:55Z,user_data=None,user_id='27658864f96d453586dd0846a4c55b7d',uuid=2a67c102-89d1-4196-bc8e-663656945547,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "12d4a8f6-904d-4ec5-8062-530c89300b7c", "address": "fa:16:3e:d4:df:63", "network": {"id": "3967ae21-1590-4685-8881-8bd1bcf25258", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fc74c4a296554866969b05aef75252af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12d4a8f6-90", "ovs_interfaceid": "12d4a8f6-904d-4ec5-8062-530c89300b7c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:08:01 np0005588920 nova_compute[226886]: 2026-01-20 15:08:01.176 226890 DEBUG nova.network.os_vif_util [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Converting VIF {"id": "12d4a8f6-904d-4ec5-8062-530c89300b7c", "address": "fa:16:3e:d4:df:63", "network": {"id": "3967ae21-1590-4685-8881-8bd1bcf25258", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fc74c4a296554866969b05aef75252af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12d4a8f6-90", "ovs_interfaceid": "12d4a8f6-904d-4ec5-8062-530c89300b7c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:08:01 np0005588920 nova_compute[226886]: 2026-01-20 15:08:01.177 226890 DEBUG nova.network.os_vif_util [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d4:df:63,bridge_name='br-int',has_traffic_filtering=True,id=12d4a8f6-904d-4ec5-8062-530c89300b7c,network=Network(3967ae21-1590-4685-8881-8bd1bcf25258),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap12d4a8f6-90') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:08:01 np0005588920 nova_compute[226886]: 2026-01-20 15:08:01.179 226890 DEBUG nova.objects.instance [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lazy-loading 'pci_devices' on Instance uuid 2a67c102-89d1-4196-bc8e-663656945547 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:08:01 np0005588920 nova_compute[226886]: 2026-01-20 15:08:01.203 226890 DEBUG nova.virt.libvirt.driver [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:08:01 np0005588920 nova_compute[226886]:  <uuid>2a67c102-89d1-4196-bc8e-663656945547</uuid>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:  <name>instance-000000a7</name>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:08:01 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:      <nova:name>tempest-ServerRescueNegativeTestJSON-server-1209466933</nova:name>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 15:08:00</nova:creationTime>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 10:08:01 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:        <nova:user uuid="27658864f96d453586dd0846a4c55b7d">tempest-ServerRescueNegativeTestJSON-1649662639-project-member</nova:user>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:        <nova:project uuid="fc74c4a296554866969b05aef75252af">tempest-ServerRescueNegativeTestJSON-1649662639</nova:project>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:        <nova:port uuid="12d4a8f6-904d-4ec5-8062-530c89300b7c">
Jan 20 10:08:01 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    <system>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:      <entry name="serial">2a67c102-89d1-4196-bc8e-663656945547</entry>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:      <entry name="uuid">2a67c102-89d1-4196-bc8e-663656945547</entry>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    </system>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:  <os>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:  </os>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:  <features>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:  </features>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:  </clock>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:  <devices>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 10:08:01 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/2a67c102-89d1-4196-bc8e-663656945547_disk">
Jan 20 10:08:01 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:08:01 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 10:08:01 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/2a67c102-89d1-4196-bc8e-663656945547_disk.config">
Jan 20 10:08:01 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:08:01 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 10:08:01 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:d4:df:63"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:      <target dev="tap12d4a8f6-90"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    </interface>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 10:08:01 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/2a67c102-89d1-4196-bc8e-663656945547/console.log" append="off"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    </serial>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    <video>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    </video>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 10:08:01 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    </rng>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 10:08:01 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 10:08:01 np0005588920 nova_compute[226886]:  </devices>
Jan 20 10:08:01 np0005588920 nova_compute[226886]: </domain>
Jan 20 10:08:01 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 20 10:08:01 np0005588920 nova_compute[226886]: 2026-01-20 15:08:01.205 226890 DEBUG nova.compute.manager [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Preparing to wait for external event network-vif-plugged-12d4a8f6-904d-4ec5-8062-530c89300b7c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 20 10:08:01 np0005588920 nova_compute[226886]: 2026-01-20 15:08:01.205 226890 DEBUG oslo_concurrency.lockutils [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Acquiring lock "2a67c102-89d1-4196-bc8e-663656945547-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:08:01 np0005588920 nova_compute[226886]: 2026-01-20 15:08:01.206 226890 DEBUG oslo_concurrency.lockutils [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "2a67c102-89d1-4196-bc8e-663656945547-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:08:01 np0005588920 nova_compute[226886]: 2026-01-20 15:08:01.206 226890 DEBUG oslo_concurrency.lockutils [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "2a67c102-89d1-4196-bc8e-663656945547-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:08:01 np0005588920 nova_compute[226886]: 2026-01-20 15:08:01.207 226890 DEBUG nova.virt.libvirt.vif [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:07:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1209466933',display_name='tempest-ServerRescueNegativeTestJSON-server-1209466933',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-1209466933',id=167,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fc74c4a296554866969b05aef75252af',ramdisk_id='',reservation_id='r-79z9txcr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-1649662639',owner_user_name='tempest-ServerRescueNegativeTestJSON-1649662639-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:07:55Z,user_data=None,user_id='27658864f96d453586dd0846a4c55b7d',uuid=2a67c102-89d1-4196-bc8e-663656945547,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "12d4a8f6-904d-4ec5-8062-530c89300b7c", "address": "fa:16:3e:d4:df:63", "network": {"id": "3967ae21-1590-4685-8881-8bd1bcf25258", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fc74c4a296554866969b05aef75252af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12d4a8f6-90", "ovs_interfaceid": "12d4a8f6-904d-4ec5-8062-530c89300b7c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 20 10:08:01 np0005588920 nova_compute[226886]: 2026-01-20 15:08:01.207 226890 DEBUG nova.network.os_vif_util [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Converting VIF {"id": "12d4a8f6-904d-4ec5-8062-530c89300b7c", "address": "fa:16:3e:d4:df:63", "network": {"id": "3967ae21-1590-4685-8881-8bd1bcf25258", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fc74c4a296554866969b05aef75252af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12d4a8f6-90", "ovs_interfaceid": "12d4a8f6-904d-4ec5-8062-530c89300b7c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 20 10:08:01 np0005588920 nova_compute[226886]: 2026-01-20 15:08:01.208 226890 DEBUG nova.network.os_vif_util [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d4:df:63,bridge_name='br-int',has_traffic_filtering=True,id=12d4a8f6-904d-4ec5-8062-530c89300b7c,network=Network(3967ae21-1590-4685-8881-8bd1bcf25258),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap12d4a8f6-90') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 20 10:08:01 np0005588920 nova_compute[226886]: 2026-01-20 15:08:01.208 226890 DEBUG os_vif [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d4:df:63,bridge_name='br-int',has_traffic_filtering=True,id=12d4a8f6-904d-4ec5-8062-530c89300b7c,network=Network(3967ae21-1590-4685-8881-8bd1bcf25258),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap12d4a8f6-90') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 20 10:08:01 np0005588920 nova_compute[226886]: 2026-01-20 15:08:01.209 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:08:01 np0005588920 nova_compute[226886]: 2026-01-20 15:08:01.209 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 10:08:01 np0005588920 nova_compute[226886]: 2026-01-20 15:08:01.210 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 20 10:08:01 np0005588920 nova_compute[226886]: 2026-01-20 15:08:01.213 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:08:01 np0005588920 nova_compute[226886]: 2026-01-20 15:08:01.213 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap12d4a8f6-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 10:08:01 np0005588920 nova_compute[226886]: 2026-01-20 15:08:01.214 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap12d4a8f6-90, col_values=(('external_ids', {'iface-id': '12d4a8f6-904d-4ec5-8062-530c89300b7c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d4:df:63', 'vm-uuid': '2a67c102-89d1-4196-bc8e-663656945547'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 10:08:01 np0005588920 nova_compute[226886]: 2026-01-20 15:08:01.215 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:08:01 np0005588920 NetworkManager[49076]: <info>  [1768921681.2160] manager: (tap12d4a8f6-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/369)
Jan 20 10:08:01 np0005588920 nova_compute[226886]: 2026-01-20 15:08:01.218 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 20 10:08:01 np0005588920 nova_compute[226886]: 2026-01-20 15:08:01.221 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:08:01 np0005588920 nova_compute[226886]: 2026-01-20 15:08:01.221 226890 INFO os_vif [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d4:df:63,bridge_name='br-int',has_traffic_filtering=True,id=12d4a8f6-904d-4ec5-8062-530c89300b7c,network=Network(3967ae21-1590-4685-8881-8bd1bcf25258),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap12d4a8f6-90')
Jan 20 10:08:01 np0005588920 nova_compute[226886]: 2026-01-20 15:08:01.267 226890 DEBUG nova.virt.libvirt.driver [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 20 10:08:01 np0005588920 nova_compute[226886]: 2026-01-20 15:08:01.268 226890 DEBUG nova.virt.libvirt.driver [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 20 10:08:01 np0005588920 nova_compute[226886]: 2026-01-20 15:08:01.268 226890 DEBUG nova.virt.libvirt.driver [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] No VIF found with MAC fa:16:3e:d4:df:63, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 20 10:08:01 np0005588920 nova_compute[226886]: 2026-01-20 15:08:01.268 226890 INFO nova.virt.libvirt.driver [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Using config drive
Jan 20 10:08:01 np0005588920 nova_compute[226886]: 2026-01-20 15:08:01.288 226890 DEBUG nova.storage.rbd_utils [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] rbd image 2a67c102-89d1-4196-bc8e-663656945547_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:08:01 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:08:01 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:08:01 np0005588920 nova_compute[226886]: 2026-01-20 15:08:01.736 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:08:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:01.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:01 np0005588920 nova_compute[226886]: 2026-01-20 15:08:01.794 226890 INFO nova.virt.libvirt.driver [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Creating config drive at /var/lib/nova/instances/2a67c102-89d1-4196-bc8e-663656945547/disk.config
Jan 20 10:08:01 np0005588920 nova_compute[226886]: 2026-01-20 15:08:01.798 226890 DEBUG oslo_concurrency.processutils [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2a67c102-89d1-4196-bc8e-663656945547/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjota7jhy execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:08:01 np0005588920 nova_compute[226886]: 2026-01-20 15:08:01.934 226890 DEBUG oslo_concurrency.processutils [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2a67c102-89d1-4196-bc8e-663656945547/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjota7jhy" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:08:01 np0005588920 nova_compute[226886]: 2026-01-20 15:08:01.961 226890 DEBUG nova.storage.rbd_utils [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] rbd image 2a67c102-89d1-4196-bc8e-663656945547_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:08:01 np0005588920 nova_compute[226886]: 2026-01-20 15:08:01.966 226890 DEBUG oslo_concurrency.processutils [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/2a67c102-89d1-4196-bc8e-663656945547/disk.config 2a67c102-89d1-4196-bc8e-663656945547_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:08:02 np0005588920 nova_compute[226886]: 2026-01-20 15:08:02.428 226890 DEBUG nova.network.neutron [req-ca148a8c-3a1b-4a1a-8bc4-289d53e9bc2b req-7f9cc099-1991-4239-9305-0bf8c8383956 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Updated VIF entry in instance network info cache for port 12d4a8f6-904d-4ec5-8062-530c89300b7c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 20 10:08:02 np0005588920 nova_compute[226886]: 2026-01-20 15:08:02.429 226890 DEBUG nova.network.neutron [req-ca148a8c-3a1b-4a1a-8bc4-289d53e9bc2b req-7f9cc099-1991-4239-9305-0bf8c8383956 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Updating instance_info_cache with network_info: [{"id": "12d4a8f6-904d-4ec5-8062-530c89300b7c", "address": "fa:16:3e:d4:df:63", "network": {"id": "3967ae21-1590-4685-8881-8bd1bcf25258", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fc74c4a296554866969b05aef75252af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12d4a8f6-90", "ovs_interfaceid": "12d4a8f6-904d-4ec5-8062-530c89300b7c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 20 10:08:02 np0005588920 nova_compute[226886]: 2026-01-20 15:08:02.456 226890 DEBUG oslo_concurrency.lockutils [req-ca148a8c-3a1b-4a1a-8bc4-289d53e9bc2b req-7f9cc099-1991-4239-9305-0bf8c8383956 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-2a67c102-89d1-4196-bc8e-663656945547" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 20 10:08:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:02.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e370 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:08:02 np0005588920 nova_compute[226886]: 2026-01-20 15:08:02.726 226890 DEBUG oslo_concurrency.processutils [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/2a67c102-89d1-4196-bc8e-663656945547/disk.config 2a67c102-89d1-4196-bc8e-663656945547_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.760s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:08:02 np0005588920 nova_compute[226886]: 2026-01-20 15:08:02.726 226890 INFO nova.virt.libvirt.driver [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Deleting local config drive /var/lib/nova/instances/2a67c102-89d1-4196-bc8e-663656945547/disk.config because it was imported into RBD.
Jan 20 10:08:02 np0005588920 kernel: tap12d4a8f6-90: entered promiscuous mode
Jan 20 10:08:02 np0005588920 NetworkManager[49076]: <info>  [1768921682.7770] manager: (tap12d4a8f6-90): new Tun device (/org/freedesktop/NetworkManager/Devices/370)
Jan 20 10:08:02 np0005588920 ovn_controller[133971]: 2026-01-20T15:08:02Z|00775|binding|INFO|Claiming lport 12d4a8f6-904d-4ec5-8062-530c89300b7c for this chassis.
Jan 20 10:08:02 np0005588920 ovn_controller[133971]: 2026-01-20T15:08:02Z|00776|binding|INFO|12d4a8f6-904d-4ec5-8062-530c89300b7c: Claiming fa:16:3e:d4:df:63 10.100.0.14
Jan 20 10:08:02 np0005588920 nova_compute[226886]: 2026-01-20 15:08:02.776 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:02 np0005588920 nova_compute[226886]: 2026-01-20 15:08:02.781 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:02 np0005588920 nova_compute[226886]: 2026-01-20 15:08:02.783 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:02.793 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d4:df:63 10.100.0.14'], port_security=['fa:16:3e:d4:df:63 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '2a67c102-89d1-4196-bc8e-663656945547', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3967ae21-1590-4685-8881-8bd1bcf25258', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fc74c4a296554866969b05aef75252af', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd56d9c6d-4bb5-4a73-ab07-9e0ee1fd3b93', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=89ced88f-b1ed-4329-8a53-1931e6b0e3e9, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=12d4a8f6-904d-4ec5-8062-530c89300b7c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:08:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:02.795 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 12d4a8f6-904d-4ec5-8062-530c89300b7c in datapath 3967ae21-1590-4685-8881-8bd1bcf25258 bound to our chassis#033[00m
Jan 20 10:08:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:02.796 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3967ae21-1590-4685-8881-8bd1bcf25258#033[00m
Jan 20 10:08:02 np0005588920 systemd-machined[196121]: New machine qemu-79-instance-000000a7.
Jan 20 10:08:02 np0005588920 systemd-udevd[289201]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:08:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:02.809 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[1bd9eb79-60da-4203-b477-a1569a1d2da9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:02.810 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3967ae21-11 in ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:08:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:02.812 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3967ae21-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:08:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:02.812 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[7dd6423f-8269-4cfe-ba0a-3d3aea30a77f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:02.813 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[caa895d9-8958-4aed-ac7c-bb0c576b3368]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:02 np0005588920 NetworkManager[49076]: <info>  [1768921682.8239] device (tap12d4a8f6-90): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:08:02 np0005588920 NetworkManager[49076]: <info>  [1768921682.8247] device (tap12d4a8f6-90): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:08:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:02.825 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[86fca3dc-e9c3-4703-97f6-16612197e7fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:02 np0005588920 systemd[1]: Started Virtual Machine qemu-79-instance-000000a7.
Jan 20 10:08:02 np0005588920 nova_compute[226886]: 2026-01-20 15:08:02.846 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:02 np0005588920 ovn_controller[133971]: 2026-01-20T15:08:02Z|00777|binding|INFO|Setting lport 12d4a8f6-904d-4ec5-8062-530c89300b7c ovn-installed in OVS
Jan 20 10:08:02 np0005588920 ovn_controller[133971]: 2026-01-20T15:08:02Z|00778|binding|INFO|Setting lport 12d4a8f6-904d-4ec5-8062-530c89300b7c up in Southbound
Jan 20 10:08:02 np0005588920 nova_compute[226886]: 2026-01-20 15:08:02.851 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:02.853 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e75ff9fc-f47e-42e7-b4c9-f349ae203a1e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:02.882 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[8f680712-acc0-4a74-82b8-71fe2f5c79fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:02.888 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[1e8e0f24-46dd-490b-9738-b13cdc555db5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:02 np0005588920 NetworkManager[49076]: <info>  [1768921682.8890] manager: (tap3967ae21-10): new Veth device (/org/freedesktop/NetworkManager/Devices/371)
Jan 20 10:08:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:02.915 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[520d413b-02d4-463d-8eb8-18a4ed38f2e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:02.918 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[c7b03199-8c52-462c-b10f-8ab303adb03d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:02 np0005588920 NetworkManager[49076]: <info>  [1768921682.9388] device (tap3967ae21-10): carrier: link connected
Jan 20 10:08:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:02.942 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[c2d97da1-eb83-4125-8b86-da708650ab45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:02.961 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[30f6bdfb-5982-4ef0-810f-d9d38b73c050]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3967ae21-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b5:ce:9d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 247], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 672462, 'reachable_time': 31426, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 289233, 'error': None, 'target': 'ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:02.977 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[8df0c26a-76f0-426c-9496-7e46fa7f3bd9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb5:ce9d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 672462, 'tstamp': 672462}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 289234, 'error': None, 'target': 'ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:02 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:02.995 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ae12ce01-cc48-4899-8279-7397e4dba88b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3967ae21-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b5:ce:9d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 247], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 672462, 'reachable_time': 31426, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 289235, 'error': None, 'target': 'ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:03.024 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c393b9a5-155b-479f-9c22-640a0ea76799]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:03.077 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[8b06ac5c-97ba-40d1-8514-4c875196c490]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:03.079 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3967ae21-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:08:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:03.079 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:08:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:03.080 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3967ae21-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:08:03 np0005588920 kernel: tap3967ae21-10: entered promiscuous mode
Jan 20 10:08:03 np0005588920 nova_compute[226886]: 2026-01-20 15:08:03.081 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:03 np0005588920 NetworkManager[49076]: <info>  [1768921683.0835] manager: (tap3967ae21-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/372)
Jan 20 10:08:03 np0005588920 nova_compute[226886]: 2026-01-20 15:08:03.085 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:03.086 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3967ae21-10, col_values=(('external_ids', {'iface-id': 'b19d6956-fc8d-42c8-af98-d0f2fe9fe3a3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:08:03 np0005588920 nova_compute[226886]: 2026-01-20 15:08:03.087 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:03 np0005588920 ovn_controller[133971]: 2026-01-20T15:08:03Z|00779|binding|INFO|Releasing lport b19d6956-fc8d-42c8-af98-d0f2fe9fe3a3 from this chassis (sb_readonly=0)
Jan 20 10:08:03 np0005588920 nova_compute[226886]: 2026-01-20 15:08:03.088 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:03.089 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3967ae21-1590-4685-8881-8bd1bcf25258.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3967ae21-1590-4685-8881-8bd1bcf25258.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:08:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:03.089 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5e7dde93-d862-4dfc-87ad-0f82e0d9ec35]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:03.090 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:08:03 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 10:08:03 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 10:08:03 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-3967ae21-1590-4685-8881-8bd1bcf25258
Jan 20 10:08:03 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 10:08:03 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 10:08:03 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 10:08:03 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/3967ae21-1590-4685-8881-8bd1bcf25258.pid.haproxy
Jan 20 10:08:03 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 10:08:03 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:08:03 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 10:08:03 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 10:08:03 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 10:08:03 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 10:08:03 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 10:08:03 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 10:08:03 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 10:08:03 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 10:08:03 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 10:08:03 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 10:08:03 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 10:08:03 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 10:08:03 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 10:08:03 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:08:03 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:08:03 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 10:08:03 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 10:08:03 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:08:03 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID 3967ae21-1590-4685-8881-8bd1bcf25258
Jan 20 10:08:03 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:08:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:03.091 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258', 'env', 'PROCESS_TAG=haproxy-3967ae21-1590-4685-8881-8bd1bcf25258', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3967ae21-1590-4685-8881-8bd1bcf25258.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:08:03 np0005588920 nova_compute[226886]: 2026-01-20 15:08:03.100 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:03 np0005588920 nova_compute[226886]: 2026-01-20 15:08:03.206 226890 DEBUG nova.compute.manager [req-a577ed52-2b7b-458e-b17a-fdc31d6c9cef req-2402fc78-bfc5-4390-8f9e-180ed4128723 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Received event network-vif-plugged-12d4a8f6-904d-4ec5-8062-530c89300b7c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:08:03 np0005588920 nova_compute[226886]: 2026-01-20 15:08:03.207 226890 DEBUG oslo_concurrency.lockutils [req-a577ed52-2b7b-458e-b17a-fdc31d6c9cef req-2402fc78-bfc5-4390-8f9e-180ed4128723 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "2a67c102-89d1-4196-bc8e-663656945547-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:08:03 np0005588920 nova_compute[226886]: 2026-01-20 15:08:03.207 226890 DEBUG oslo_concurrency.lockutils [req-a577ed52-2b7b-458e-b17a-fdc31d6c9cef req-2402fc78-bfc5-4390-8f9e-180ed4128723 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2a67c102-89d1-4196-bc8e-663656945547-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:08:03 np0005588920 nova_compute[226886]: 2026-01-20 15:08:03.208 226890 DEBUG oslo_concurrency.lockutils [req-a577ed52-2b7b-458e-b17a-fdc31d6c9cef req-2402fc78-bfc5-4390-8f9e-180ed4128723 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2a67c102-89d1-4196-bc8e-663656945547-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:08:03 np0005588920 nova_compute[226886]: 2026-01-20 15:08:03.208 226890 DEBUG nova.compute.manager [req-a577ed52-2b7b-458e-b17a-fdc31d6c9cef req-2402fc78-bfc5-4390-8f9e-180ed4128723 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Processing event network-vif-plugged-12d4a8f6-904d-4ec5-8062-530c89300b7c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 10:08:03 np0005588920 podman[289307]: 2026-01-20 15:08:03.442508701 +0000 UTC m=+0.043806691 container create f782aa9300d3d2344a9c219a960afbff57ed88876441d70622129f717544c10a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:08:03 np0005588920 nova_compute[226886]: 2026-01-20 15:08:03.464 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921683.4631226, 2a67c102-89d1-4196-bc8e-663656945547 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:08:03 np0005588920 nova_compute[226886]: 2026-01-20 15:08:03.465 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 2a67c102-89d1-4196-bc8e-663656945547] VM Started (Lifecycle Event)#033[00m
Jan 20 10:08:03 np0005588920 systemd[1]: Started libpod-conmon-f782aa9300d3d2344a9c219a960afbff57ed88876441d70622129f717544c10a.scope.
Jan 20 10:08:03 np0005588920 nova_compute[226886]: 2026-01-20 15:08:03.467 226890 DEBUG nova.compute.manager [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:08:03 np0005588920 nova_compute[226886]: 2026-01-20 15:08:03.470 226890 DEBUG nova.virt.libvirt.driver [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:08:03 np0005588920 nova_compute[226886]: 2026-01-20 15:08:03.472 226890 INFO nova.virt.libvirt.driver [-] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Instance spawned successfully.#033[00m
Jan 20 10:08:03 np0005588920 nova_compute[226886]: 2026-01-20 15:08:03.473 226890 DEBUG nova.virt.libvirt.driver [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 10:08:03 np0005588920 systemd[1]: Started libcrun container.
Jan 20 10:08:03 np0005588920 nova_compute[226886]: 2026-01-20 15:08:03.484 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:08:03 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/20ebc0dfcd12a51ccbf3c5aa9eae21c413fdf6a7773e0114add55d2a6824e565/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:08:03 np0005588920 nova_compute[226886]: 2026-01-20 15:08:03.491 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:08:03 np0005588920 nova_compute[226886]: 2026-01-20 15:08:03.494 226890 DEBUG nova.virt.libvirt.driver [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:08:03 np0005588920 nova_compute[226886]: 2026-01-20 15:08:03.494 226890 DEBUG nova.virt.libvirt.driver [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:08:03 np0005588920 nova_compute[226886]: 2026-01-20 15:08:03.495 226890 DEBUG nova.virt.libvirt.driver [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:08:03 np0005588920 nova_compute[226886]: 2026-01-20 15:08:03.496 226890 DEBUG nova.virt.libvirt.driver [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:08:03 np0005588920 podman[289307]: 2026-01-20 15:08:03.496313027 +0000 UTC m=+0.097611037 container init f782aa9300d3d2344a9c219a960afbff57ed88876441d70622129f717544c10a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 20 10:08:03 np0005588920 nova_compute[226886]: 2026-01-20 15:08:03.496 226890 DEBUG nova.virt.libvirt.driver [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:08:03 np0005588920 nova_compute[226886]: 2026-01-20 15:08:03.497 226890 DEBUG nova.virt.libvirt.driver [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:08:03 np0005588920 podman[289307]: 2026-01-20 15:08:03.501360932 +0000 UTC m=+0.102658922 container start f782aa9300d3d2344a9c219a960afbff57ed88876441d70622129f717544c10a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:08:03 np0005588920 podman[289307]: 2026-01-20 15:08:03.419042862 +0000 UTC m=+0.020340872 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:08:03 np0005588920 nova_compute[226886]: 2026-01-20 15:08:03.519 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 2a67c102-89d1-4196-bc8e-663656945547] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:08:03 np0005588920 nova_compute[226886]: 2026-01-20 15:08:03.520 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921683.4645472, 2a67c102-89d1-4196-bc8e-663656945547 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:08:03 np0005588920 neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258[289323]: [NOTICE]   (289327) : New worker (289329) forked
Jan 20 10:08:03 np0005588920 neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258[289323]: [NOTICE]   (289327) : Loading success.
Jan 20 10:08:03 np0005588920 nova_compute[226886]: 2026-01-20 15:08:03.520 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 2a67c102-89d1-4196-bc8e-663656945547] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:08:03 np0005588920 nova_compute[226886]: 2026-01-20 15:08:03.557 226890 INFO nova.compute.manager [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Took 8.31 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 10:08:03 np0005588920 nova_compute[226886]: 2026-01-20 15:08:03.558 226890 DEBUG nova.compute.manager [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:08:03 np0005588920 nova_compute[226886]: 2026-01-20 15:08:03.559 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:08:03 np0005588920 nova_compute[226886]: 2026-01-20 15:08:03.566 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921683.4694026, 2a67c102-89d1-4196-bc8e-663656945547 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:08:03 np0005588920 nova_compute[226886]: 2026-01-20 15:08:03.566 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 2a67c102-89d1-4196-bc8e-663656945547] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:08:03 np0005588920 nova_compute[226886]: 2026-01-20 15:08:03.599 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:08:03 np0005588920 nova_compute[226886]: 2026-01-20 15:08:03.602 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:08:03 np0005588920 nova_compute[226886]: 2026-01-20 15:08:03.616 226890 INFO nova.compute.manager [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Took 9.25 seconds to build instance.#033[00m
Jan 20 10:08:03 np0005588920 nova_compute[226886]: 2026-01-20 15:08:03.630 226890 DEBUG oslo_concurrency.lockutils [None req-25351e72-922b-4ea0-82a9-9363f8423e39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "2a67c102-89d1-4196-bc8e-663656945547" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.389s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:08:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:03.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:04.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:05 np0005588920 nova_compute[226886]: 2026-01-20 15:08:05.283 226890 DEBUG nova.compute.manager [req-cacf4b9a-11d1-4935-a86e-35c549cc2237 req-6d6cdbeb-8caf-4a0e-a37b-727f202f991b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Received event network-vif-plugged-12d4a8f6-904d-4ec5-8062-530c89300b7c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:08:05 np0005588920 nova_compute[226886]: 2026-01-20 15:08:05.283 226890 DEBUG oslo_concurrency.lockutils [req-cacf4b9a-11d1-4935-a86e-35c549cc2237 req-6d6cdbeb-8caf-4a0e-a37b-727f202f991b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "2a67c102-89d1-4196-bc8e-663656945547-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:08:05 np0005588920 nova_compute[226886]: 2026-01-20 15:08:05.283 226890 DEBUG oslo_concurrency.lockutils [req-cacf4b9a-11d1-4935-a86e-35c549cc2237 req-6d6cdbeb-8caf-4a0e-a37b-727f202f991b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2a67c102-89d1-4196-bc8e-663656945547-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:08:05 np0005588920 nova_compute[226886]: 2026-01-20 15:08:05.283 226890 DEBUG oslo_concurrency.lockutils [req-cacf4b9a-11d1-4935-a86e-35c549cc2237 req-6d6cdbeb-8caf-4a0e-a37b-727f202f991b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2a67c102-89d1-4196-bc8e-663656945547-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:08:05 np0005588920 nova_compute[226886]: 2026-01-20 15:08:05.284 226890 DEBUG nova.compute.manager [req-cacf4b9a-11d1-4935-a86e-35c549cc2237 req-6d6cdbeb-8caf-4a0e-a37b-727f202f991b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] No waiting events found dispatching network-vif-plugged-12d4a8f6-904d-4ec5-8062-530c89300b7c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:08:05 np0005588920 nova_compute[226886]: 2026-01-20 15:08:05.284 226890 WARNING nova.compute.manager [req-cacf4b9a-11d1-4935-a86e-35c549cc2237 req-6d6cdbeb-8caf-4a0e-a37b-727f202f991b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Received unexpected event network-vif-plugged-12d4a8f6-904d-4ec5-8062-530c89300b7c for instance with vm_state active and task_state None.#033[00m
Jan 20 10:08:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:05.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:06 np0005588920 nova_compute[226886]: 2026-01-20 15:08:06.215 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:08:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:06.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:08:06 np0005588920 nova_compute[226886]: 2026-01-20 15:08:06.899 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e370 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:08:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:08:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:07.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:08:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:08.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:09.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:10.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:11 np0005588920 nova_compute[226886]: 2026-01-20 15:08:11.217 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:11.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:11 np0005588920 nova_compute[226886]: 2026-01-20 15:08:11.900 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:12.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e370 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:08:12 np0005588920 podman[289338]: 2026-01-20 15:08:12.987799995 +0000 UTC m=+0.075201608 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 20 10:08:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e371 e371: 3 total, 3 up, 3 in
Jan 20 10:08:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:13.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e372 e372: 3 total, 3 up, 3 in
Jan 20 10:08:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:08:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:14.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:08:15 np0005588920 ovn_controller[133971]: 2026-01-20T15:08:15Z|00104|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d4:df:63 10.100.0.14
Jan 20 10:08:15 np0005588920 ovn_controller[133971]: 2026-01-20T15:08:15Z|00105|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d4:df:63 10.100.0.14
Jan 20 10:08:15 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e373 e373: 3 total, 3 up, 3 in
Jan 20 10:08:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:08:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:15.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:08:16 np0005588920 nova_compute[226886]: 2026-01-20 15:08:16.220 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:16.468 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:08:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:16.469 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:08:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:16.469 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:08:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:16.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:16 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e374 e374: 3 total, 3 up, 3 in
Jan 20 10:08:16 np0005588920 nova_compute[226886]: 2026-01-20 15:08:16.901 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:08:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:08:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:17.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:08:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:18.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:19.575 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=54, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=53) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:08:19 np0005588920 nova_compute[226886]: 2026-01-20 15:08:19.575 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:19.576 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:08:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:19.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:20.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:20 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e375 e375: 3 total, 3 up, 3 in
Jan 20 10:08:20 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e376 e376: 3 total, 3 up, 3 in
Jan 20 10:08:21 np0005588920 nova_compute[226886]: 2026-01-20 15:08:21.223 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:21.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:21 np0005588920 nova_compute[226886]: 2026-01-20 15:08:21.905 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e376 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:08:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:08:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:22.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:08:23 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 10:08:23 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4200.1 total, 600.0 interval#012Cumulative writes: 60K writes, 242K keys, 60K commit groups, 1.0 writes per commit group, ingest: 0.24 GB, 0.06 MB/s#012Cumulative WAL: 60K writes, 22K syncs, 2.73 writes per sync, written: 0.24 GB, 0.06 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 10K writes, 39K keys, 10K commit groups, 1.0 writes per commit group, ingest: 41.10 MB, 0.07 MB/s#012Interval WAL: 10K writes, 4024 syncs, 2.57 writes per sync, written: 0.04 GB, 0.07 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 20 10:08:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:23.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:23 np0005588920 podman[289369]: 2026-01-20 15:08:23.969093083 +0000 UTC m=+0.054686092 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 20 10:08:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:24.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:08:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:25.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:08:26 np0005588920 nova_compute[226886]: 2026-01-20 15:08:26.226 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:26.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:26 np0005588920 nova_compute[226886]: 2026-01-20 15:08:26.959 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:27 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:27.577 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '54'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:08:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e376 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:08:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:27.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:08:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:28.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:08:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:08:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:29.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:08:30 np0005588920 nova_compute[226886]: 2026-01-20 15:08:30.386 226890 INFO nova.compute.manager [None req-13d1cca5-0161-479e-aae8-5caf84387f6d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Pausing#033[00m
Jan 20 10:08:30 np0005588920 nova_compute[226886]: 2026-01-20 15:08:30.387 226890 DEBUG nova.objects.instance [None req-13d1cca5-0161-479e-aae8-5caf84387f6d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lazy-loading 'flavor' on Instance uuid 2a67c102-89d1-4196-bc8e-663656945547 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:08:30 np0005588920 nova_compute[226886]: 2026-01-20 15:08:30.419 226890 DEBUG nova.compute.manager [None req-13d1cca5-0161-479e-aae8-5caf84387f6d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:08:30 np0005588920 nova_compute[226886]: 2026-01-20 15:08:30.419 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921710.4187279, 2a67c102-89d1-4196-bc8e-663656945547 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:08:30 np0005588920 nova_compute[226886]: 2026-01-20 15:08:30.419 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 2a67c102-89d1-4196-bc8e-663656945547] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:08:30 np0005588920 nova_compute[226886]: 2026-01-20 15:08:30.447 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:08:30 np0005588920 nova_compute[226886]: 2026-01-20 15:08:30.451 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:08:30 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e377 e377: 3 total, 3 up, 3 in
Jan 20 10:08:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:30.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:31 np0005588920 nova_compute[226886]: 2026-01-20 15:08:31.227 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:31 np0005588920 nova_compute[226886]: 2026-01-20 15:08:31.514 226890 INFO nova.compute.manager [None req-2250e728-92e2-4173-a4e0-ca69cc039819 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Unpausing#033[00m
Jan 20 10:08:31 np0005588920 nova_compute[226886]: 2026-01-20 15:08:31.515 226890 DEBUG nova.objects.instance [None req-2250e728-92e2-4173-a4e0-ca69cc039819 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lazy-loading 'flavor' on Instance uuid 2a67c102-89d1-4196-bc8e-663656945547 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:08:31 np0005588920 nova_compute[226886]: 2026-01-20 15:08:31.545 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921711.545125, 2a67c102-89d1-4196-bc8e-663656945547 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:08:31 np0005588920 nova_compute[226886]: 2026-01-20 15:08:31.545 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 2a67c102-89d1-4196-bc8e-663656945547] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:08:31 np0005588920 virtqemud[226436]: argument unsupported: QEMU guest agent is not configured
Jan 20 10:08:31 np0005588920 nova_compute[226886]: 2026-01-20 15:08:31.549 226890 DEBUG nova.virt.libvirt.guest [None req-2250e728-92e2-4173-a4e0-ca69cc039819 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 20 10:08:31 np0005588920 nova_compute[226886]: 2026-01-20 15:08:31.550 226890 DEBUG nova.compute.manager [None req-2250e728-92e2-4173-a4e0-ca69cc039819 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:08:31 np0005588920 nova_compute[226886]: 2026-01-20 15:08:31.563 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:08:31 np0005588920 nova_compute[226886]: 2026-01-20 15:08:31.568 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: paused, current task_state: unpausing, current DB power_state: 3, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:08:31 np0005588920 nova_compute[226886]: 2026-01-20 15:08:31.597 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 2a67c102-89d1-4196-bc8e-663656945547] During sync_power_state the instance has a pending task (unpausing). Skip.#033[00m
Jan 20 10:08:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:31.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:31 np0005588920 nova_compute[226886]: 2026-01-20 15:08:31.961 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e377 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:08:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:32.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:08:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:33.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:08:33 np0005588920 nova_compute[226886]: 2026-01-20 15:08:33.807 226890 INFO nova.compute.manager [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Rescuing#033[00m
Jan 20 10:08:33 np0005588920 nova_compute[226886]: 2026-01-20 15:08:33.807 226890 DEBUG oslo_concurrency.lockutils [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Acquiring lock "refresh_cache-2a67c102-89d1-4196-bc8e-663656945547" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:08:33 np0005588920 nova_compute[226886]: 2026-01-20 15:08:33.808 226890 DEBUG oslo_concurrency.lockutils [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Acquired lock "refresh_cache-2a67c102-89d1-4196-bc8e-663656945547" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:08:33 np0005588920 nova_compute[226886]: 2026-01-20 15:08:33.808 226890 DEBUG nova.network.neutron [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:08:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:34.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:35 np0005588920 nova_compute[226886]: 2026-01-20 15:08:35.295 226890 DEBUG nova.network.neutron [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Updating instance_info_cache with network_info: [{"id": "12d4a8f6-904d-4ec5-8062-530c89300b7c", "address": "fa:16:3e:d4:df:63", "network": {"id": "3967ae21-1590-4685-8881-8bd1bcf25258", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fc74c4a296554866969b05aef75252af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12d4a8f6-90", "ovs_interfaceid": "12d4a8f6-904d-4ec5-8062-530c89300b7c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:08:35 np0005588920 nova_compute[226886]: 2026-01-20 15:08:35.315 226890 DEBUG oslo_concurrency.lockutils [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Releasing lock "refresh_cache-2a67c102-89d1-4196-bc8e-663656945547" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:08:35 np0005588920 nova_compute[226886]: 2026-01-20 15:08:35.626 226890 DEBUG nova.virt.libvirt.driver [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 20 10:08:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:35.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:36 np0005588920 nova_compute[226886]: 2026-01-20 15:08:36.230 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:36.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:36 np0005588920 nova_compute[226886]: 2026-01-20 15:08:36.962 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e377 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:08:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:37.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:37 np0005588920 kernel: tap12d4a8f6-90 (unregistering): left promiscuous mode
Jan 20 10:08:37 np0005588920 NetworkManager[49076]: <info>  [1768921717.9180] device (tap12d4a8f6-90): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:08:37 np0005588920 ovn_controller[133971]: 2026-01-20T15:08:37Z|00780|binding|INFO|Releasing lport 12d4a8f6-904d-4ec5-8062-530c89300b7c from this chassis (sb_readonly=0)
Jan 20 10:08:37 np0005588920 ovn_controller[133971]: 2026-01-20T15:08:37Z|00781|binding|INFO|Setting lport 12d4a8f6-904d-4ec5-8062-530c89300b7c down in Southbound
Jan 20 10:08:37 np0005588920 ovn_controller[133971]: 2026-01-20T15:08:37Z|00782|binding|INFO|Removing iface tap12d4a8f6-90 ovn-installed in OVS
Jan 20 10:08:37 np0005588920 nova_compute[226886]: 2026-01-20 15:08:37.929 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:37 np0005588920 nova_compute[226886]: 2026-01-20 15:08:37.931 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:37.938 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d4:df:63 10.100.0.14'], port_security=['fa:16:3e:d4:df:63 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '2a67c102-89d1-4196-bc8e-663656945547', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3967ae21-1590-4685-8881-8bd1bcf25258', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fc74c4a296554866969b05aef75252af', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd56d9c6d-4bb5-4a73-ab07-9e0ee1fd3b93', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=89ced88f-b1ed-4329-8a53-1931e6b0e3e9, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=12d4a8f6-904d-4ec5-8062-530c89300b7c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:08:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:37.939 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 12d4a8f6-904d-4ec5-8062-530c89300b7c in datapath 3967ae21-1590-4685-8881-8bd1bcf25258 unbound from our chassis#033[00m
Jan 20 10:08:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:37.941 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3967ae21-1590-4685-8881-8bd1bcf25258, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:08:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:37.942 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f49d4d65-2f16-486a-be89-d905648e5f10]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:37.943 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258 namespace which is not needed anymore#033[00m
Jan 20 10:08:37 np0005588920 nova_compute[226886]: 2026-01-20 15:08:37.953 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:38 np0005588920 systemd[1]: machine-qemu\x2d79\x2dinstance\x2d000000a7.scope: Deactivated successfully.
Jan 20 10:08:38 np0005588920 systemd[1]: machine-qemu\x2d79\x2dinstance\x2d000000a7.scope: Consumed 13.771s CPU time.
Jan 20 10:08:38 np0005588920 systemd-machined[196121]: Machine qemu-79-instance-000000a7 terminated.
Jan 20 10:08:38 np0005588920 neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258[289323]: [NOTICE]   (289327) : haproxy version is 2.8.14-c23fe91
Jan 20 10:08:38 np0005588920 neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258[289323]: [NOTICE]   (289327) : path to executable is /usr/sbin/haproxy
Jan 20 10:08:38 np0005588920 neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258[289323]: [WARNING]  (289327) : Exiting Master process...
Jan 20 10:08:38 np0005588920 neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258[289323]: [ALERT]    (289327) : Current worker (289329) exited with code 143 (Terminated)
Jan 20 10:08:38 np0005588920 neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258[289323]: [WARNING]  (289327) : All workers exited. Exiting... (0)
Jan 20 10:08:38 np0005588920 systemd[1]: libpod-f782aa9300d3d2344a9c219a960afbff57ed88876441d70622129f717544c10a.scope: Deactivated successfully.
Jan 20 10:08:38 np0005588920 podman[289412]: 2026-01-20 15:08:38.140628197 +0000 UTC m=+0.106182473 container died f782aa9300d3d2344a9c219a960afbff57ed88876441d70622129f717544c10a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:08:38 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f782aa9300d3d2344a9c219a960afbff57ed88876441d70622129f717544c10a-userdata-shm.mount: Deactivated successfully.
Jan 20 10:08:38 np0005588920 systemd[1]: var-lib-containers-storage-overlay-20ebc0dfcd12a51ccbf3c5aa9eae21c413fdf6a7773e0114add55d2a6824e565-merged.mount: Deactivated successfully.
Jan 20 10:08:38 np0005588920 podman[289412]: 2026-01-20 15:08:38.179087365 +0000 UTC m=+0.144641631 container cleanup f782aa9300d3d2344a9c219a960afbff57ed88876441d70622129f717544c10a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:08:38 np0005588920 systemd[1]: libpod-conmon-f782aa9300d3d2344a9c219a960afbff57ed88876441d70622129f717544c10a.scope: Deactivated successfully.
Jan 20 10:08:38 np0005588920 podman[289451]: 2026-01-20 15:08:38.246229021 +0000 UTC m=+0.040876378 container remove f782aa9300d3d2344a9c219a960afbff57ed88876441d70622129f717544c10a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 20 10:08:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:38.251 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d01eb0ee-5acb-4508-8c68-65b01d204a21]: (4, ('Tue Jan 20 03:08:38 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258 (f782aa9300d3d2344a9c219a960afbff57ed88876441d70622129f717544c10a)\nf782aa9300d3d2344a9c219a960afbff57ed88876441d70622129f717544c10a\nTue Jan 20 03:08:38 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258 (f782aa9300d3d2344a9c219a960afbff57ed88876441d70622129f717544c10a)\nf782aa9300d3d2344a9c219a960afbff57ed88876441d70622129f717544c10a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:38.254 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[931bfe35-8ac8-46c2-b92c-ca70099a7012]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:38.254 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3967ae21-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:08:38 np0005588920 nova_compute[226886]: 2026-01-20 15:08:38.256 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:38 np0005588920 kernel: tap3967ae21-10: left promiscuous mode
Jan 20 10:08:38 np0005588920 nova_compute[226886]: 2026-01-20 15:08:38.272 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:38.275 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[8984ea2e-3a7e-49f7-a133-3966ca92f6e6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:38.289 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[bdac5666-4410-4c9a-8324-c06a2cba4e92]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:38.290 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[3e3d7d45-5ceb-47d9-92ee-0367b1d028a8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:38.305 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c859a9cb-157e-41b9-8e1a-2c3f34a93763]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 672456, 'reachable_time': 20276, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 289471, 'error': None, 'target': 'ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:38.307 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:08:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:38.307 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[670b2d6e-1dd5-4200-a175-2da3aa633e3d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:38 np0005588920 systemd[1]: run-netns-ovnmeta\x2d3967ae21\x2d1590\x2d4685\x2d8881\x2d8bd1bcf25258.mount: Deactivated successfully.
Jan 20 10:08:38 np0005588920 nova_compute[226886]: 2026-01-20 15:08:38.547 226890 DEBUG nova.compute.manager [req-dbd2d613-abe3-4b2d-b6fe-f4729b843707 req-e9bd8d3b-d780-4ae6-b2a2-86f236e53be9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Received event network-vif-unplugged-12d4a8f6-904d-4ec5-8062-530c89300b7c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:08:38 np0005588920 nova_compute[226886]: 2026-01-20 15:08:38.548 226890 DEBUG oslo_concurrency.lockutils [req-dbd2d613-abe3-4b2d-b6fe-f4729b843707 req-e9bd8d3b-d780-4ae6-b2a2-86f236e53be9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "2a67c102-89d1-4196-bc8e-663656945547-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:08:38 np0005588920 nova_compute[226886]: 2026-01-20 15:08:38.548 226890 DEBUG oslo_concurrency.lockutils [req-dbd2d613-abe3-4b2d-b6fe-f4729b843707 req-e9bd8d3b-d780-4ae6-b2a2-86f236e53be9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2a67c102-89d1-4196-bc8e-663656945547-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:08:38 np0005588920 nova_compute[226886]: 2026-01-20 15:08:38.549 226890 DEBUG oslo_concurrency.lockutils [req-dbd2d613-abe3-4b2d-b6fe-f4729b843707 req-e9bd8d3b-d780-4ae6-b2a2-86f236e53be9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2a67c102-89d1-4196-bc8e-663656945547-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:08:38 np0005588920 nova_compute[226886]: 2026-01-20 15:08:38.549 226890 DEBUG nova.compute.manager [req-dbd2d613-abe3-4b2d-b6fe-f4729b843707 req-e9bd8d3b-d780-4ae6-b2a2-86f236e53be9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] No waiting events found dispatching network-vif-unplugged-12d4a8f6-904d-4ec5-8062-530c89300b7c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:08:38 np0005588920 nova_compute[226886]: 2026-01-20 15:08:38.549 226890 WARNING nova.compute.manager [req-dbd2d613-abe3-4b2d-b6fe-f4729b843707 req-e9bd8d3b-d780-4ae6-b2a2-86f236e53be9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Received unexpected event network-vif-unplugged-12d4a8f6-904d-4ec5-8062-530c89300b7c for instance with vm_state active and task_state rescuing.#033[00m
Jan 20 10:08:38 np0005588920 nova_compute[226886]: 2026-01-20 15:08:38.549 226890 DEBUG nova.compute.manager [req-dbd2d613-abe3-4b2d-b6fe-f4729b843707 req-e9bd8d3b-d780-4ae6-b2a2-86f236e53be9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Received event network-vif-plugged-12d4a8f6-904d-4ec5-8062-530c89300b7c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:08:38 np0005588920 nova_compute[226886]: 2026-01-20 15:08:38.550 226890 DEBUG oslo_concurrency.lockutils [req-dbd2d613-abe3-4b2d-b6fe-f4729b843707 req-e9bd8d3b-d780-4ae6-b2a2-86f236e53be9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "2a67c102-89d1-4196-bc8e-663656945547-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:08:38 np0005588920 nova_compute[226886]: 2026-01-20 15:08:38.550 226890 DEBUG oslo_concurrency.lockutils [req-dbd2d613-abe3-4b2d-b6fe-f4729b843707 req-e9bd8d3b-d780-4ae6-b2a2-86f236e53be9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2a67c102-89d1-4196-bc8e-663656945547-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:08:38 np0005588920 nova_compute[226886]: 2026-01-20 15:08:38.550 226890 DEBUG oslo_concurrency.lockutils [req-dbd2d613-abe3-4b2d-b6fe-f4729b843707 req-e9bd8d3b-d780-4ae6-b2a2-86f236e53be9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2a67c102-89d1-4196-bc8e-663656945547-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:08:38 np0005588920 nova_compute[226886]: 2026-01-20 15:08:38.551 226890 DEBUG nova.compute.manager [req-dbd2d613-abe3-4b2d-b6fe-f4729b843707 req-e9bd8d3b-d780-4ae6-b2a2-86f236e53be9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] No waiting events found dispatching network-vif-plugged-12d4a8f6-904d-4ec5-8062-530c89300b7c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:08:38 np0005588920 nova_compute[226886]: 2026-01-20 15:08:38.551 226890 WARNING nova.compute.manager [req-dbd2d613-abe3-4b2d-b6fe-f4729b843707 req-e9bd8d3b-d780-4ae6-b2a2-86f236e53be9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Received unexpected event network-vif-plugged-12d4a8f6-904d-4ec5-8062-530c89300b7c for instance with vm_state active and task_state rescuing.#033[00m
Jan 20 10:08:38 np0005588920 nova_compute[226886]: 2026-01-20 15:08:38.640 226890 INFO nova.virt.libvirt.driver [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Instance shutdown successfully after 3 seconds.#033[00m
Jan 20 10:08:38 np0005588920 nova_compute[226886]: 2026-01-20 15:08:38.644 226890 INFO nova.virt.libvirt.driver [-] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Instance destroyed successfully.#033[00m
Jan 20 10:08:38 np0005588920 nova_compute[226886]: 2026-01-20 15:08:38.645 226890 DEBUG nova.objects.instance [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lazy-loading 'numa_topology' on Instance uuid 2a67c102-89d1-4196-bc8e-663656945547 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:08:38 np0005588920 nova_compute[226886]: 2026-01-20 15:08:38.664 226890 INFO nova.virt.libvirt.driver [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Attempting rescue#033[00m
Jan 20 10:08:38 np0005588920 nova_compute[226886]: 2026-01-20 15:08:38.664 226890 DEBUG nova.virt.libvirt.driver [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314#033[00m
Jan 20 10:08:38 np0005588920 nova_compute[226886]: 2026-01-20 15:08:38.667 226890 DEBUG nova.virt.libvirt.driver [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 20 10:08:38 np0005588920 nova_compute[226886]: 2026-01-20 15:08:38.668 226890 INFO nova.virt.libvirt.driver [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Creating image(s)#033[00m
Jan 20 10:08:38 np0005588920 nova_compute[226886]: 2026-01-20 15:08:38.740 226890 DEBUG nova.storage.rbd_utils [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] rbd image 2a67c102-89d1-4196-bc8e-663656945547_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:08:38 np0005588920 nova_compute[226886]: 2026-01-20 15:08:38.743 226890 DEBUG nova.objects.instance [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lazy-loading 'trusted_certs' on Instance uuid 2a67c102-89d1-4196-bc8e-663656945547 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:08:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:38.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:38 np0005588920 nova_compute[226886]: 2026-01-20 15:08:38.779 226890 DEBUG nova.storage.rbd_utils [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] rbd image 2a67c102-89d1-4196-bc8e-663656945547_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:08:38 np0005588920 nova_compute[226886]: 2026-01-20 15:08:38.802 226890 DEBUG nova.storage.rbd_utils [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] rbd image 2a67c102-89d1-4196-bc8e-663656945547_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:08:38 np0005588920 nova_compute[226886]: 2026-01-20 15:08:38.806 226890 DEBUG oslo_concurrency.processutils [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:08:38 np0005588920 nova_compute[226886]: 2026-01-20 15:08:38.869 226890 DEBUG oslo_concurrency.processutils [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:08:38 np0005588920 nova_compute[226886]: 2026-01-20 15:08:38.870 226890 DEBUG oslo_concurrency.lockutils [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:08:38 np0005588920 nova_compute[226886]: 2026-01-20 15:08:38.871 226890 DEBUG oslo_concurrency.lockutils [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:08:38 np0005588920 nova_compute[226886]: 2026-01-20 15:08:38.871 226890 DEBUG oslo_concurrency.lockutils [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:08:38 np0005588920 nova_compute[226886]: 2026-01-20 15:08:38.897 226890 DEBUG nova.storage.rbd_utils [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] rbd image 2a67c102-89d1-4196-bc8e-663656945547_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:08:38 np0005588920 nova_compute[226886]: 2026-01-20 15:08:38.900 226890 DEBUG oslo_concurrency.processutils [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 2a67c102-89d1-4196-bc8e-663656945547_disk.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:08:39 np0005588920 nova_compute[226886]: 2026-01-20 15:08:39.283 226890 DEBUG oslo_concurrency.processutils [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 2a67c102-89d1-4196-bc8e-663656945547_disk.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.383s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:08:39 np0005588920 nova_compute[226886]: 2026-01-20 15:08:39.285 226890 DEBUG nova.objects.instance [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lazy-loading 'migration_context' on Instance uuid 2a67c102-89d1-4196-bc8e-663656945547 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:08:39 np0005588920 nova_compute[226886]: 2026-01-20 15:08:39.300 226890 DEBUG nova.virt.libvirt.driver [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 10:08:39 np0005588920 nova_compute[226886]: 2026-01-20 15:08:39.301 226890 DEBUG nova.virt.libvirt.driver [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Start _get_guest_xml network_info=[{"id": "12d4a8f6-904d-4ec5-8062-530c89300b7c", "address": "fa:16:3e:d4:df:63", "network": {"id": "3967ae21-1590-4685-8881-8bd1bcf25258", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "vif_mac": "fa:16:3e:d4:df:63"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fc74c4a296554866969b05aef75252af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12d4a8f6-90", "ovs_interfaceid": "12d4a8f6-904d-4ec5-8062-530c89300b7c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:08:39 np0005588920 nova_compute[226886]: 2026-01-20 15:08:39.301 226890 DEBUG nova.objects.instance [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lazy-loading 'resources' on Instance uuid 2a67c102-89d1-4196-bc8e-663656945547 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:08:39 np0005588920 nova_compute[226886]: 2026-01-20 15:08:39.317 226890 WARNING nova.virt.libvirt.driver [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:08:39 np0005588920 nova_compute[226886]: 2026-01-20 15:08:39.324 226890 DEBUG nova.virt.libvirt.host [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:08:39 np0005588920 nova_compute[226886]: 2026-01-20 15:08:39.324 226890 DEBUG nova.virt.libvirt.host [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:08:39 np0005588920 nova_compute[226886]: 2026-01-20 15:08:39.327 226890 DEBUG nova.virt.libvirt.host [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:08:39 np0005588920 nova_compute[226886]: 2026-01-20 15:08:39.328 226890 DEBUG nova.virt.libvirt.host [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:08:39 np0005588920 nova_compute[226886]: 2026-01-20 15:08:39.329 226890 DEBUG nova.virt.libvirt.driver [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:08:39 np0005588920 nova_compute[226886]: 2026-01-20 15:08:39.329 226890 DEBUG nova.virt.hardware [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:08:39 np0005588920 nova_compute[226886]: 2026-01-20 15:08:39.330 226890 DEBUG nova.virt.hardware [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:08:39 np0005588920 nova_compute[226886]: 2026-01-20 15:08:39.330 226890 DEBUG nova.virt.hardware [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:08:39 np0005588920 nova_compute[226886]: 2026-01-20 15:08:39.330 226890 DEBUG nova.virt.hardware [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:08:39 np0005588920 nova_compute[226886]: 2026-01-20 15:08:39.330 226890 DEBUG nova.virt.hardware [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:08:39 np0005588920 nova_compute[226886]: 2026-01-20 15:08:39.330 226890 DEBUG nova.virt.hardware [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:08:39 np0005588920 nova_compute[226886]: 2026-01-20 15:08:39.331 226890 DEBUG nova.virt.hardware [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:08:39 np0005588920 nova_compute[226886]: 2026-01-20 15:08:39.331 226890 DEBUG nova.virt.hardware [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:08:39 np0005588920 nova_compute[226886]: 2026-01-20 15:08:39.331 226890 DEBUG nova.virt.hardware [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:08:39 np0005588920 nova_compute[226886]: 2026-01-20 15:08:39.331 226890 DEBUG nova.virt.hardware [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:08:39 np0005588920 nova_compute[226886]: 2026-01-20 15:08:39.331 226890 DEBUG nova.virt.hardware [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:08:39 np0005588920 nova_compute[226886]: 2026-01-20 15:08:39.332 226890 DEBUG nova.objects.instance [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lazy-loading 'vcpu_model' on Instance uuid 2a67c102-89d1-4196-bc8e-663656945547 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:08:39 np0005588920 nova_compute[226886]: 2026-01-20 15:08:39.371 226890 DEBUG oslo_concurrency.processutils [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:08:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:39.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:39 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:08:39 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2722338201' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:08:39 np0005588920 nova_compute[226886]: 2026-01-20 15:08:39.849 226890 DEBUG oslo_concurrency.processutils [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.478s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:08:39 np0005588920 nova_compute[226886]: 2026-01-20 15:08:39.851 226890 DEBUG oslo_concurrency.processutils [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:08:40 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:08:40 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2140542339' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:08:40 np0005588920 nova_compute[226886]: 2026-01-20 15:08:40.371 226890 DEBUG oslo_concurrency.processutils [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.520s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:08:40 np0005588920 nova_compute[226886]: 2026-01-20 15:08:40.372 226890 DEBUG oslo_concurrency.processutils [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:08:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:40.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:40 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:08:40 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3093963373' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:08:40 np0005588920 nova_compute[226886]: 2026-01-20 15:08:40.812 226890 DEBUG oslo_concurrency.processutils [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:08:40 np0005588920 nova_compute[226886]: 2026-01-20 15:08:40.815 226890 DEBUG nova.virt.libvirt.vif [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:07:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1209466933',display_name='tempest-ServerRescueNegativeTestJSON-server-1209466933',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-1209466933',id=167,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:08:03Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fc74c4a296554866969b05aef75252af',ramdisk_id='',reservation_id='r-79z9txcr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vi
f_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueNegativeTestJSON-1649662639',owner_user_name='tempest-ServerRescueNegativeTestJSON-1649662639-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:08:31Z,user_data=None,user_id='27658864f96d453586dd0846a4c55b7d',uuid=2a67c102-89d1-4196-bc8e-663656945547,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "12d4a8f6-904d-4ec5-8062-530c89300b7c", "address": "fa:16:3e:d4:df:63", "network": {"id": "3967ae21-1590-4685-8881-8bd1bcf25258", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "vif_mac": "fa:16:3e:d4:df:63"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fc74c4a296554866969b05aef75252af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12d4a8f6-90", "ovs_interfaceid": "12d4a8f6-904d-4ec5-8062-530c89300b7c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:08:40 np0005588920 nova_compute[226886]: 2026-01-20 15:08:40.815 226890 DEBUG nova.network.os_vif_util [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Converting VIF {"id": "12d4a8f6-904d-4ec5-8062-530c89300b7c", "address": "fa:16:3e:d4:df:63", "network": {"id": "3967ae21-1590-4685-8881-8bd1bcf25258", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "vif_mac": "fa:16:3e:d4:df:63"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fc74c4a296554866969b05aef75252af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12d4a8f6-90", "ovs_interfaceid": "12d4a8f6-904d-4ec5-8062-530c89300b7c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:08:40 np0005588920 nova_compute[226886]: 2026-01-20 15:08:40.816 226890 DEBUG nova.network.os_vif_util [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d4:df:63,bridge_name='br-int',has_traffic_filtering=True,id=12d4a8f6-904d-4ec5-8062-530c89300b7c,network=Network(3967ae21-1590-4685-8881-8bd1bcf25258),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap12d4a8f6-90') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:08:40 np0005588920 nova_compute[226886]: 2026-01-20 15:08:40.818 226890 DEBUG nova.objects.instance [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lazy-loading 'pci_devices' on Instance uuid 2a67c102-89d1-4196-bc8e-663656945547 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:08:40 np0005588920 nova_compute[226886]: 2026-01-20 15:08:40.851 226890 DEBUG nova.virt.libvirt.driver [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:08:40 np0005588920 nova_compute[226886]:  <uuid>2a67c102-89d1-4196-bc8e-663656945547</uuid>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:  <name>instance-000000a7</name>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:08:40 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:      <nova:name>tempest-ServerRescueNegativeTestJSON-server-1209466933</nova:name>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 15:08:39</nova:creationTime>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 10:08:40 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:        <nova:user uuid="27658864f96d453586dd0846a4c55b7d">tempest-ServerRescueNegativeTestJSON-1649662639-project-member</nova:user>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:        <nova:project uuid="fc74c4a296554866969b05aef75252af">tempest-ServerRescueNegativeTestJSON-1649662639</nova:project>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:        <nova:port uuid="12d4a8f6-904d-4ec5-8062-530c89300b7c">
Jan 20 10:08:40 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    <system>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:      <entry name="serial">2a67c102-89d1-4196-bc8e-663656945547</entry>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:      <entry name="uuid">2a67c102-89d1-4196-bc8e-663656945547</entry>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    </system>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:  <os>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:  </os>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:  <features>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:  </features>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:  </clock>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:  <devices>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 10:08:40 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/2a67c102-89d1-4196-bc8e-663656945547_disk.rescue">
Jan 20 10:08:40 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:08:40 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 10:08:40 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/2a67c102-89d1-4196-bc8e-663656945547_disk">
Jan 20 10:08:40 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:08:40 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:      <target dev="vdb" bus="virtio"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 10:08:40 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/2a67c102-89d1-4196-bc8e-663656945547_disk.config.rescue">
Jan 20 10:08:40 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:08:40 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 10:08:40 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:d4:df:63"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:      <target dev="tap12d4a8f6-90"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    </interface>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 10:08:40 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/2a67c102-89d1-4196-bc8e-663656945547/console.log" append="off"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    </serial>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    <video>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    </video>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 10:08:40 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    </rng>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 10:08:40 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 10:08:40 np0005588920 nova_compute[226886]:  </devices>
Jan 20 10:08:40 np0005588920 nova_compute[226886]: </domain>
Jan 20 10:08:40 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:08:40 np0005588920 nova_compute[226886]: 2026-01-20 15:08:40.859 226890 INFO nova.virt.libvirt.driver [-] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Instance destroyed successfully.#033[00m
Jan 20 10:08:40 np0005588920 nova_compute[226886]: 2026-01-20 15:08:40.922 226890 DEBUG nova.virt.libvirt.driver [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:08:40 np0005588920 nova_compute[226886]: 2026-01-20 15:08:40.922 226890 DEBUG nova.virt.libvirt.driver [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:08:40 np0005588920 nova_compute[226886]: 2026-01-20 15:08:40.923 226890 DEBUG nova.virt.libvirt.driver [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:08:40 np0005588920 nova_compute[226886]: 2026-01-20 15:08:40.923 226890 DEBUG nova.virt.libvirt.driver [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] No VIF found with MAC fa:16:3e:d4:df:63, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:08:40 np0005588920 nova_compute[226886]: 2026-01-20 15:08:40.923 226890 INFO nova.virt.libvirt.driver [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Using config drive#033[00m
Jan 20 10:08:40 np0005588920 nova_compute[226886]: 2026-01-20 15:08:40.943 226890 DEBUG nova.storage.rbd_utils [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] rbd image 2a67c102-89d1-4196-bc8e-663656945547_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:08:40 np0005588920 nova_compute[226886]: 2026-01-20 15:08:40.965 226890 DEBUG nova.objects.instance [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lazy-loading 'ec2_ids' on Instance uuid 2a67c102-89d1-4196-bc8e-663656945547 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:08:41 np0005588920 nova_compute[226886]: 2026-01-20 15:08:41.002 226890 DEBUG nova.objects.instance [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lazy-loading 'keypairs' on Instance uuid 2a67c102-89d1-4196-bc8e-663656945547 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:08:41 np0005588920 nova_compute[226886]: 2026-01-20 15:08:41.234 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:41.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:41 np0005588920 nova_compute[226886]: 2026-01-20 15:08:41.889 226890 INFO nova.virt.libvirt.driver [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Creating config drive at /var/lib/nova/instances/2a67c102-89d1-4196-bc8e-663656945547/disk.config.rescue#033[00m
Jan 20 10:08:41 np0005588920 nova_compute[226886]: 2026-01-20 15:08:41.894 226890 DEBUG oslo_concurrency.processutils [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2a67c102-89d1-4196-bc8e-663656945547/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfkfcf27c execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:08:41 np0005588920 nova_compute[226886]: 2026-01-20 15:08:41.965 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:42 np0005588920 nova_compute[226886]: 2026-01-20 15:08:42.026 226890 DEBUG oslo_concurrency.processutils [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2a67c102-89d1-4196-bc8e-663656945547/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfkfcf27c" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:08:42 np0005588920 nova_compute[226886]: 2026-01-20 15:08:42.052 226890 DEBUG nova.storage.rbd_utils [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] rbd image 2a67c102-89d1-4196-bc8e-663656945547_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:08:42 np0005588920 nova_compute[226886]: 2026-01-20 15:08:42.056 226890 DEBUG oslo_concurrency.processutils [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/2a67c102-89d1-4196-bc8e-663656945547/disk.config.rescue 2a67c102-89d1-4196-bc8e-663656945547_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:08:42 np0005588920 nova_compute[226886]: 2026-01-20 15:08:42.217 226890 DEBUG oslo_concurrency.processutils [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/2a67c102-89d1-4196-bc8e-663656945547/disk.config.rescue 2a67c102-89d1-4196-bc8e-663656945547_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.161s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:08:42 np0005588920 nova_compute[226886]: 2026-01-20 15:08:42.218 226890 INFO nova.virt.libvirt.driver [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Deleting local config drive /var/lib/nova/instances/2a67c102-89d1-4196-bc8e-663656945547/disk.config.rescue because it was imported into RBD.#033[00m
Jan 20 10:08:42 np0005588920 kernel: tap12d4a8f6-90: entered promiscuous mode
Jan 20 10:08:42 np0005588920 NetworkManager[49076]: <info>  [1768921722.2707] manager: (tap12d4a8f6-90): new Tun device (/org/freedesktop/NetworkManager/Devices/373)
Jan 20 10:08:42 np0005588920 ovn_controller[133971]: 2026-01-20T15:08:42Z|00783|binding|INFO|Claiming lport 12d4a8f6-904d-4ec5-8062-530c89300b7c for this chassis.
Jan 20 10:08:42 np0005588920 ovn_controller[133971]: 2026-01-20T15:08:42Z|00784|binding|INFO|12d4a8f6-904d-4ec5-8062-530c89300b7c: Claiming fa:16:3e:d4:df:63 10.100.0.14
Jan 20 10:08:42 np0005588920 nova_compute[226886]: 2026-01-20 15:08:42.317 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:42.323 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d4:df:63 10.100.0.14'], port_security=['fa:16:3e:d4:df:63 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '2a67c102-89d1-4196-bc8e-663656945547', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3967ae21-1590-4685-8881-8bd1bcf25258', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fc74c4a296554866969b05aef75252af', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'd56d9c6d-4bb5-4a73-ab07-9e0ee1fd3b93', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=89ced88f-b1ed-4329-8a53-1931e6b0e3e9, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=12d4a8f6-904d-4ec5-8062-530c89300b7c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:42.324 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 12d4a8f6-904d-4ec5-8062-530c89300b7c in datapath 3967ae21-1590-4685-8881-8bd1bcf25258 bound to our chassis#033[00m
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:42.325 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3967ae21-1590-4685-8881-8bd1bcf25258#033[00m
Jan 20 10:08:42 np0005588920 ovn_controller[133971]: 2026-01-20T15:08:42Z|00785|binding|INFO|Setting lport 12d4a8f6-904d-4ec5-8062-530c89300b7c ovn-installed in OVS
Jan 20 10:08:42 np0005588920 ovn_controller[133971]: 2026-01-20T15:08:42Z|00786|binding|INFO|Setting lport 12d4a8f6-904d-4ec5-8062-530c89300b7c up in Southbound
Jan 20 10:08:42 np0005588920 nova_compute[226886]: 2026-01-20 15:08:42.336 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:42.341 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b430ad48-2bf1-4afc-83f0-9edd5fb25aca]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:42.342 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3967ae21-11 in ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:42.345 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3967ae21-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:08:42 np0005588920 systemd-machined[196121]: New machine qemu-80-instance-000000a7.
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:42.345 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[349bcfba-9186-4270-b485-174c848c1dbe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:42.347 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5de66f9b-69b0-4cc4-afd7-f1ddd28d5dd3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:42 np0005588920 systemd-udevd[289705]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:08:42 np0005588920 systemd[1]: Started Virtual Machine qemu-80-instance-000000a7.
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:42.363 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[c8cde2bf-3454-4323-abe1-0a891525b54d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:42 np0005588920 NetworkManager[49076]: <info>  [1768921722.3649] device (tap12d4a8f6-90): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:08:42 np0005588920 NetworkManager[49076]: <info>  [1768921722.3661] device (tap12d4a8f6-90): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:42.393 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[60d2d311-ffdf-48a8-a822-d3a3b4c25cbd]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:42.430 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[e8e6cd63-55b9-4a69-9197-3bc8d4e8d1f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:42 np0005588920 NetworkManager[49076]: <info>  [1768921722.4366] manager: (tap3967ae21-10): new Veth device (/org/freedesktop/NetworkManager/Devices/374)
Jan 20 10:08:42 np0005588920 systemd-udevd[289708]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:42.435 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0e69e1ed-4249-4541-8dc4-2401f27c68d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:42.477 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[045801b0-9f61-4dc6-9ed3-f97d08b9ef49]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:42.479 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[13632f2e-6be9-4ab9-bbc1-5def3bf6ce35]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:42 np0005588920 NetworkManager[49076]: <info>  [1768921722.5037] device (tap3967ae21-10): carrier: link connected
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:42.509 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[b39ac90e-6cf7-44d8-b777-1de0f9bfe2b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:42.528 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[3eb85004-7c41-4a13-a0ac-d76dd95ae6ee]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3967ae21-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b5:ce:9d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 250], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 676418, 'reachable_time': 37318, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 289737, 'error': None, 'target': 'ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:42.545 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[cc7b297b-16f2-458a-98c8-6c322f5f65f8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb5:ce9d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 676418, 'tstamp': 676418}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 289738, 'error': None, 'target': 'ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:42.562 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[1314512b-db69-4de9-ac00-ea7f3e296e95]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3967ae21-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b5:ce:9d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 250], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 676418, 'reachable_time': 37318, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 289739, 'error': None, 'target': 'ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:42.594 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[16962982-4c4d-4947-aa51-8b70a92b8fe4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:42.676 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[48134d4c-6e46-407a-8662-80cd7ede2dc5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:42.677 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3967ae21-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:42.677 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:42.678 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3967ae21-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:08:42 np0005588920 kernel: tap3967ae21-10: entered promiscuous mode
Jan 20 10:08:42 np0005588920 NetworkManager[49076]: <info>  [1768921722.6802] manager: (tap3967ae21-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/375)
Jan 20 10:08:42 np0005588920 nova_compute[226886]: 2026-01-20 15:08:42.679 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:42 np0005588920 nova_compute[226886]: 2026-01-20 15:08:42.683 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:42.683 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3967ae21-10, col_values=(('external_ids', {'iface-id': 'b19d6956-fc8d-42c8-af98-d0f2fe9fe3a3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:08:42 np0005588920 nova_compute[226886]: 2026-01-20 15:08:42.684 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:42 np0005588920 ovn_controller[133971]: 2026-01-20T15:08:42Z|00787|binding|INFO|Releasing lport b19d6956-fc8d-42c8-af98-d0f2fe9fe3a3 from this chassis (sb_readonly=0)
Jan 20 10:08:42 np0005588920 nova_compute[226886]: 2026-01-20 15:08:42.699 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:42.700 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3967ae21-1590-4685-8881-8bd1bcf25258.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3967ae21-1590-4685-8881-8bd1bcf25258.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:42.701 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[020916fa-1eb9-41e9-9b8d-1821385ed807]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:42.702 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-3967ae21-1590-4685-8881-8bd1bcf25258
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/3967ae21-1590-4685-8881-8bd1bcf25258.pid.haproxy
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID 3967ae21-1590-4685-8881-8bd1bcf25258
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:08:42 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:42.703 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258', 'env', 'PROCESS_TAG=haproxy-3967ae21-1590-4685-8881-8bd1bcf25258', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3967ae21-1590-4685-8881-8bd1bcf25258.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:08:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e377 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:08:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:42.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:42 np0005588920 nova_compute[226886]: 2026-01-20 15:08:42.962 226890 DEBUG nova.compute.manager [req-ea5a8483-f12b-48a9-83e6-b72e160a3fdc req-dd225d86-1663-486f-b3d7-f8a2a1549d1b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Received event network-vif-plugged-12d4a8f6-904d-4ec5-8062-530c89300b7c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:08:42 np0005588920 nova_compute[226886]: 2026-01-20 15:08:42.962 226890 DEBUG oslo_concurrency.lockutils [req-ea5a8483-f12b-48a9-83e6-b72e160a3fdc req-dd225d86-1663-486f-b3d7-f8a2a1549d1b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "2a67c102-89d1-4196-bc8e-663656945547-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:08:42 np0005588920 nova_compute[226886]: 2026-01-20 15:08:42.962 226890 DEBUG oslo_concurrency.lockutils [req-ea5a8483-f12b-48a9-83e6-b72e160a3fdc req-dd225d86-1663-486f-b3d7-f8a2a1549d1b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2a67c102-89d1-4196-bc8e-663656945547-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:08:42 np0005588920 nova_compute[226886]: 2026-01-20 15:08:42.963 226890 DEBUG oslo_concurrency.lockutils [req-ea5a8483-f12b-48a9-83e6-b72e160a3fdc req-dd225d86-1663-486f-b3d7-f8a2a1549d1b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2a67c102-89d1-4196-bc8e-663656945547-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:08:42 np0005588920 nova_compute[226886]: 2026-01-20 15:08:42.963 226890 DEBUG nova.compute.manager [req-ea5a8483-f12b-48a9-83e6-b72e160a3fdc req-dd225d86-1663-486f-b3d7-f8a2a1549d1b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] No waiting events found dispatching network-vif-plugged-12d4a8f6-904d-4ec5-8062-530c89300b7c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:08:42 np0005588920 nova_compute[226886]: 2026-01-20 15:08:42.963 226890 WARNING nova.compute.manager [req-ea5a8483-f12b-48a9-83e6-b72e160a3fdc req-dd225d86-1663-486f-b3d7-f8a2a1549d1b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Received unexpected event network-vif-plugged-12d4a8f6-904d-4ec5-8062-530c89300b7c for instance with vm_state active and task_state rescuing.#033[00m
Jan 20 10:08:43 np0005588920 podman[289830]: 2026-01-20 15:08:43.083559283 +0000 UTC m=+0.048965278 container create 1037123ffe6a0f6f62940c81d9a4984fa8481a6c6ccf6052d4c68f87881647d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:08:43 np0005588920 nova_compute[226886]: 2026-01-20 15:08:43.095 226890 DEBUG nova.virt.libvirt.host [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Removed pending event for 2a67c102-89d1-4196-bc8e-663656945547 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 20 10:08:43 np0005588920 nova_compute[226886]: 2026-01-20 15:08:43.096 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921723.0938656, 2a67c102-89d1-4196-bc8e-663656945547 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:08:43 np0005588920 nova_compute[226886]: 2026-01-20 15:08:43.096 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 2a67c102-89d1-4196-bc8e-663656945547] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:08:43 np0005588920 nova_compute[226886]: 2026-01-20 15:08:43.102 226890 DEBUG nova.compute.manager [None req-e7f34435-7335-4ace-aea5-2cf6bfec62b3 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:08:43 np0005588920 systemd[1]: Started libpod-conmon-1037123ffe6a0f6f62940c81d9a4984fa8481a6c6ccf6052d4c68f87881647d7.scope.
Jan 20 10:08:43 np0005588920 podman[289830]: 2026-01-20 15:08:43.05644044 +0000 UTC m=+0.021846455 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:08:43 np0005588920 nova_compute[226886]: 2026-01-20 15:08:43.155 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:08:43 np0005588920 systemd[1]: Started libcrun container.
Jan 20 10:08:43 np0005588920 nova_compute[226886]: 2026-01-20 15:08:43.159 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:08:43 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aabf5413fb6855246432b6bec9f171a63596d84218bbd83c9b0f21304c28bfc3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:08:43 np0005588920 podman[289830]: 2026-01-20 15:08:43.17451414 +0000 UTC m=+0.139920185 container init 1037123ffe6a0f6f62940c81d9a4984fa8481a6c6ccf6052d4c68f87881647d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:08:43 np0005588920 podman[289830]: 2026-01-20 15:08:43.181242482 +0000 UTC m=+0.146648487 container start 1037123ffe6a0f6f62940c81d9a4984fa8481a6c6ccf6052d4c68f87881647d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 20 10:08:43 np0005588920 podman[289845]: 2026-01-20 15:08:43.19657436 +0000 UTC m=+0.087549161 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 20 10:08:43 np0005588920 neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258[289857]: [NOTICE]   (289871) : New worker (289876) forked
Jan 20 10:08:43 np0005588920 neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258[289857]: [NOTICE]   (289871) : Loading success.
Jan 20 10:08:43 np0005588920 nova_compute[226886]: 2026-01-20 15:08:43.222 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 2a67c102-89d1-4196-bc8e-663656945547] During sync_power_state the instance has a pending task (rescuing). Skip.#033[00m
Jan 20 10:08:43 np0005588920 nova_compute[226886]: 2026-01-20 15:08:43.223 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921723.0955715, 2a67c102-89d1-4196-bc8e-663656945547 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:08:43 np0005588920 nova_compute[226886]: 2026-01-20 15:08:43.223 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 2a67c102-89d1-4196-bc8e-663656945547] VM Started (Lifecycle Event)#033[00m
Jan 20 10:08:43 np0005588920 nova_compute[226886]: 2026-01-20 15:08:43.251 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:08:43 np0005588920 nova_compute[226886]: 2026-01-20 15:08:43.255 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:08:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:43.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:08:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:44.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:08:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:45.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:46 np0005588920 nova_compute[226886]: 2026-01-20 15:08:46.237 226890 DEBUG nova.compute.manager [req-f1461e39-a634-4af3-ae93-cc016e3aa4c3 req-147e1f86-386f-4197-bfdf-82bdfc9c329a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Received event network-vif-plugged-12d4a8f6-904d-4ec5-8062-530c89300b7c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:08:46 np0005588920 nova_compute[226886]: 2026-01-20 15:08:46.237 226890 DEBUG oslo_concurrency.lockutils [req-f1461e39-a634-4af3-ae93-cc016e3aa4c3 req-147e1f86-386f-4197-bfdf-82bdfc9c329a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "2a67c102-89d1-4196-bc8e-663656945547-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:08:46 np0005588920 nova_compute[226886]: 2026-01-20 15:08:46.238 226890 DEBUG oslo_concurrency.lockutils [req-f1461e39-a634-4af3-ae93-cc016e3aa4c3 req-147e1f86-386f-4197-bfdf-82bdfc9c329a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2a67c102-89d1-4196-bc8e-663656945547-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:08:46 np0005588920 nova_compute[226886]: 2026-01-20 15:08:46.238 226890 DEBUG oslo_concurrency.lockutils [req-f1461e39-a634-4af3-ae93-cc016e3aa4c3 req-147e1f86-386f-4197-bfdf-82bdfc9c329a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2a67c102-89d1-4196-bc8e-663656945547-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:08:46 np0005588920 nova_compute[226886]: 2026-01-20 15:08:46.238 226890 DEBUG nova.compute.manager [req-f1461e39-a634-4af3-ae93-cc016e3aa4c3 req-147e1f86-386f-4197-bfdf-82bdfc9c329a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] No waiting events found dispatching network-vif-plugged-12d4a8f6-904d-4ec5-8062-530c89300b7c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:08:46 np0005588920 nova_compute[226886]: 2026-01-20 15:08:46.239 226890 WARNING nova.compute.manager [req-f1461e39-a634-4af3-ae93-cc016e3aa4c3 req-147e1f86-386f-4197-bfdf-82bdfc9c329a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Received unexpected event network-vif-plugged-12d4a8f6-904d-4ec5-8062-530c89300b7c for instance with vm_state rescued and task_state unrescuing.#033[00m
Jan 20 10:08:46 np0005588920 nova_compute[226886]: 2026-01-20 15:08:46.239 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:46 np0005588920 nova_compute[226886]: 2026-01-20 15:08:46.274 226890 INFO nova.compute.manager [None req-61a4d240-7edf-44e9-bf2b-df61551d9eef 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Unrescuing#033[00m
Jan 20 10:08:46 np0005588920 nova_compute[226886]: 2026-01-20 15:08:46.274 226890 DEBUG oslo_concurrency.lockutils [None req-61a4d240-7edf-44e9-bf2b-df61551d9eef 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Acquiring lock "refresh_cache-2a67c102-89d1-4196-bc8e-663656945547" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:08:46 np0005588920 nova_compute[226886]: 2026-01-20 15:08:46.274 226890 DEBUG oslo_concurrency.lockutils [None req-61a4d240-7edf-44e9-bf2b-df61551d9eef 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Acquired lock "refresh_cache-2a67c102-89d1-4196-bc8e-663656945547" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:08:46 np0005588920 nova_compute[226886]: 2026-01-20 15:08:46.275 226890 DEBUG nova.network.neutron [None req-61a4d240-7edf-44e9-bf2b-df61551d9eef 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:08:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:08:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:46.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:08:46 np0005588920 nova_compute[226886]: 2026-01-20 15:08:46.968 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e377 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:08:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:47.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:48.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:49.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:49 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:08:49 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2585228950' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:08:49 np0005588920 nova_compute[226886]: 2026-01-20 15:08:49.899 226890 DEBUG nova.network.neutron [None req-61a4d240-7edf-44e9-bf2b-df61551d9eef 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Updating instance_info_cache with network_info: [{"id": "12d4a8f6-904d-4ec5-8062-530c89300b7c", "address": "fa:16:3e:d4:df:63", "network": {"id": "3967ae21-1590-4685-8881-8bd1bcf25258", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fc74c4a296554866969b05aef75252af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12d4a8f6-90", "ovs_interfaceid": "12d4a8f6-904d-4ec5-8062-530c89300b7c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:08:49 np0005588920 nova_compute[226886]: 2026-01-20 15:08:49.934 226890 DEBUG oslo_concurrency.lockutils [None req-61a4d240-7edf-44e9-bf2b-df61551d9eef 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Releasing lock "refresh_cache-2a67c102-89d1-4196-bc8e-663656945547" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:08:49 np0005588920 nova_compute[226886]: 2026-01-20 15:08:49.935 226890 DEBUG nova.objects.instance [None req-61a4d240-7edf-44e9-bf2b-df61551d9eef 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lazy-loading 'flavor' on Instance uuid 2a67c102-89d1-4196-bc8e-663656945547 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:08:50 np0005588920 kernel: tap12d4a8f6-90 (unregistering): left promiscuous mode
Jan 20 10:08:50 np0005588920 NetworkManager[49076]: <info>  [1768921730.0090] device (tap12d4a8f6-90): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:08:50 np0005588920 ovn_controller[133971]: 2026-01-20T15:08:50Z|00788|binding|INFO|Releasing lport 12d4a8f6-904d-4ec5-8062-530c89300b7c from this chassis (sb_readonly=0)
Jan 20 10:08:50 np0005588920 ovn_controller[133971]: 2026-01-20T15:08:50Z|00789|binding|INFO|Setting lport 12d4a8f6-904d-4ec5-8062-530c89300b7c down in Southbound
Jan 20 10:08:50 np0005588920 nova_compute[226886]: 2026-01-20 15:08:50.017 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:50 np0005588920 ovn_controller[133971]: 2026-01-20T15:08:50Z|00790|binding|INFO|Removing iface tap12d4a8f6-90 ovn-installed in OVS
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:50.023 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d4:df:63 10.100.0.14'], port_security=['fa:16:3e:d4:df:63 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '2a67c102-89d1-4196-bc8e-663656945547', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3967ae21-1590-4685-8881-8bd1bcf25258', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fc74c4a296554866969b05aef75252af', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'd56d9c6d-4bb5-4a73-ab07-9e0ee1fd3b93', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=89ced88f-b1ed-4329-8a53-1931e6b0e3e9, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=12d4a8f6-904d-4ec5-8062-530c89300b7c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:50.024 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 12d4a8f6-904d-4ec5-8062-530c89300b7c in datapath 3967ae21-1590-4685-8881-8bd1bcf25258 unbound from our chassis#033[00m
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:50.025 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3967ae21-1590-4685-8881-8bd1bcf25258, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:50.026 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0523d581-08a3-4709-adb1-c53740c89dfd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:50.026 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258 namespace which is not needed anymore#033[00m
Jan 20 10:08:50 np0005588920 nova_compute[226886]: 2026-01-20 15:08:50.039 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:50 np0005588920 systemd[1]: machine-qemu\x2d80\x2dinstance\x2d000000a7.scope: Deactivated successfully.
Jan 20 10:08:50 np0005588920 systemd[1]: machine-qemu\x2d80\x2dinstance\x2d000000a7.scope: Consumed 7.711s CPU time.
Jan 20 10:08:50 np0005588920 systemd-machined[196121]: Machine qemu-80-instance-000000a7 terminated.
Jan 20 10:08:50 np0005588920 nova_compute[226886]: 2026-01-20 15:08:50.178 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:50 np0005588920 nova_compute[226886]: 2026-01-20 15:08:50.182 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:50 np0005588920 nova_compute[226886]: 2026-01-20 15:08:50.193 226890 INFO nova.virt.libvirt.driver [-] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Instance destroyed successfully.#033[00m
Jan 20 10:08:50 np0005588920 nova_compute[226886]: 2026-01-20 15:08:50.194 226890 DEBUG nova.objects.instance [None req-61a4d240-7edf-44e9-bf2b-df61551d9eef 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lazy-loading 'numa_topology' on Instance uuid 2a67c102-89d1-4196-bc8e-663656945547 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:08:50 np0005588920 nova_compute[226886]: 2026-01-20 15:08:50.323 226890 DEBUG nova.compute.manager [req-f99b8f86-386a-46a2-b278-5e9e021123dc req-aba26b1a-9150-4292-b03c-8aa1cd5ed256 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Received event network-vif-unplugged-12d4a8f6-904d-4ec5-8062-530c89300b7c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:08:50 np0005588920 nova_compute[226886]: 2026-01-20 15:08:50.324 226890 DEBUG oslo_concurrency.lockutils [req-f99b8f86-386a-46a2-b278-5e9e021123dc req-aba26b1a-9150-4292-b03c-8aa1cd5ed256 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "2a67c102-89d1-4196-bc8e-663656945547-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:08:50 np0005588920 nova_compute[226886]: 2026-01-20 15:08:50.325 226890 DEBUG oslo_concurrency.lockutils [req-f99b8f86-386a-46a2-b278-5e9e021123dc req-aba26b1a-9150-4292-b03c-8aa1cd5ed256 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2a67c102-89d1-4196-bc8e-663656945547-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:08:50 np0005588920 nova_compute[226886]: 2026-01-20 15:08:50.325 226890 DEBUG oslo_concurrency.lockutils [req-f99b8f86-386a-46a2-b278-5e9e021123dc req-aba26b1a-9150-4292-b03c-8aa1cd5ed256 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2a67c102-89d1-4196-bc8e-663656945547-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:08:50 np0005588920 nova_compute[226886]: 2026-01-20 15:08:50.325 226890 DEBUG nova.compute.manager [req-f99b8f86-386a-46a2-b278-5e9e021123dc req-aba26b1a-9150-4292-b03c-8aa1cd5ed256 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] No waiting events found dispatching network-vif-unplugged-12d4a8f6-904d-4ec5-8062-530c89300b7c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:08:50 np0005588920 nova_compute[226886]: 2026-01-20 15:08:50.325 226890 WARNING nova.compute.manager [req-f99b8f86-386a-46a2-b278-5e9e021123dc req-aba26b1a-9150-4292-b03c-8aa1cd5ed256 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Received unexpected event network-vif-unplugged-12d4a8f6-904d-4ec5-8062-530c89300b7c for instance with vm_state rescued and task_state unrescuing.#033[00m
Jan 20 10:08:50 np0005588920 neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258[289857]: [NOTICE]   (289871) : haproxy version is 2.8.14-c23fe91
Jan 20 10:08:50 np0005588920 neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258[289857]: [NOTICE]   (289871) : path to executable is /usr/sbin/haproxy
Jan 20 10:08:50 np0005588920 neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258[289857]: [WARNING]  (289871) : Exiting Master process...
Jan 20 10:08:50 np0005588920 neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258[289857]: [WARNING]  (289871) : Exiting Master process...
Jan 20 10:08:50 np0005588920 neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258[289857]: [ALERT]    (289871) : Current worker (289876) exited with code 143 (Terminated)
Jan 20 10:08:50 np0005588920 neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258[289857]: [WARNING]  (289871) : All workers exited. Exiting... (0)
Jan 20 10:08:50 np0005588920 systemd[1]: libpod-1037123ffe6a0f6f62940c81d9a4984fa8481a6c6ccf6052d4c68f87881647d7.scope: Deactivated successfully.
Jan 20 10:08:50 np0005588920 podman[289909]: 2026-01-20 15:08:50.421644877 +0000 UTC m=+0.313134059 container died 1037123ffe6a0f6f62940c81d9a4984fa8481a6c6ccf6052d4c68f87881647d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:08:50 np0005588920 kernel: tap12d4a8f6-90: entered promiscuous mode
Jan 20 10:08:50 np0005588920 systemd-udevd[289890]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:08:50 np0005588920 NetworkManager[49076]: <info>  [1768921730.4577] manager: (tap12d4a8f6-90): new Tun device (/org/freedesktop/NetworkManager/Devices/376)
Jan 20 10:08:50 np0005588920 ovn_controller[133971]: 2026-01-20T15:08:50Z|00791|binding|INFO|Claiming lport 12d4a8f6-904d-4ec5-8062-530c89300b7c for this chassis.
Jan 20 10:08:50 np0005588920 ovn_controller[133971]: 2026-01-20T15:08:50Z|00792|binding|INFO|12d4a8f6-904d-4ec5-8062-530c89300b7c: Claiming fa:16:3e:d4:df:63 10.100.0.14
Jan 20 10:08:50 np0005588920 nova_compute[226886]: 2026-01-20 15:08:50.457 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:50.466 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d4:df:63 10.100.0.14'], port_security=['fa:16:3e:d4:df:63 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '2a67c102-89d1-4196-bc8e-663656945547', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3967ae21-1590-4685-8881-8bd1bcf25258', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fc74c4a296554866969b05aef75252af', 'neutron:revision_number': '7', 'neutron:security_group_ids': 'd56d9c6d-4bb5-4a73-ab07-9e0ee1fd3b93', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=89ced88f-b1ed-4329-8a53-1931e6b0e3e9, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=12d4a8f6-904d-4ec5-8062-530c89300b7c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:08:50 np0005588920 NetworkManager[49076]: <info>  [1768921730.4705] device (tap12d4a8f6-90): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:08:50 np0005588920 NetworkManager[49076]: <info>  [1768921730.4713] device (tap12d4a8f6-90): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:08:50 np0005588920 ovn_controller[133971]: 2026-01-20T15:08:50Z|00793|binding|INFO|Setting lport 12d4a8f6-904d-4ec5-8062-530c89300b7c ovn-installed in OVS
Jan 20 10:08:50 np0005588920 ovn_controller[133971]: 2026-01-20T15:08:50Z|00794|binding|INFO|Setting lport 12d4a8f6-904d-4ec5-8062-530c89300b7c up in Southbound
Jan 20 10:08:50 np0005588920 nova_compute[226886]: 2026-01-20 15:08:50.478 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:50 np0005588920 nova_compute[226886]: 2026-01-20 15:08:50.482 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:50 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1037123ffe6a0f6f62940c81d9a4984fa8481a6c6ccf6052d4c68f87881647d7-userdata-shm.mount: Deactivated successfully.
Jan 20 10:08:50 np0005588920 systemd[1]: var-lib-containers-storage-overlay-aabf5413fb6855246432b6bec9f171a63596d84218bbd83c9b0f21304c28bfc3-merged.mount: Deactivated successfully.
Jan 20 10:08:50 np0005588920 systemd-machined[196121]: New machine qemu-81-instance-000000a7.
Jan 20 10:08:50 np0005588920 podman[289909]: 2026-01-20 15:08:50.49984747 +0000 UTC m=+0.391336652 container cleanup 1037123ffe6a0f6f62940c81d9a4984fa8481a6c6ccf6052d4c68f87881647d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 20 10:08:50 np0005588920 systemd[1]: Started Virtual Machine qemu-81-instance-000000a7.
Jan 20 10:08:50 np0005588920 systemd[1]: libpod-conmon-1037123ffe6a0f6f62940c81d9a4984fa8481a6c6ccf6052d4c68f87881647d7.scope: Deactivated successfully.
Jan 20 10:08:50 np0005588920 podman[289965]: 2026-01-20 15:08:50.572692899 +0000 UTC m=+0.050240025 container remove 1037123ffe6a0f6f62940c81d9a4984fa8481a6c6ccf6052d4c68f87881647d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:50.577 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[1e636663-3fa5-4dd4-862f-85a3c9a78d2c]: (4, ('Tue Jan 20 03:08:50 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258 (1037123ffe6a0f6f62940c81d9a4984fa8481a6c6ccf6052d4c68f87881647d7)\n1037123ffe6a0f6f62940c81d9a4984fa8481a6c6ccf6052d4c68f87881647d7\nTue Jan 20 03:08:50 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258 (1037123ffe6a0f6f62940c81d9a4984fa8481a6c6ccf6052d4c68f87881647d7)\n1037123ffe6a0f6f62940c81d9a4984fa8481a6c6ccf6052d4c68f87881647d7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:50.579 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[6769893e-5870-4868-9508-759068d98ab1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:50.580 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3967ae21-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:08:50 np0005588920 nova_compute[226886]: 2026-01-20 15:08:50.582 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:50 np0005588920 kernel: tap3967ae21-10: left promiscuous mode
Jan 20 10:08:50 np0005588920 nova_compute[226886]: 2026-01-20 15:08:50.595 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:50.599 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[44de54b5-db0d-418a-afcc-b6560d81aa43]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:50.612 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[bc3f9fdf-8bb8-495c-aeaf-dc7936b2418f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:50.613 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d7443a15-cb25-47b4-9cd6-e36eeb0f7334]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:50.629 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[70ecacb0-a4de-46f6-8649-faf8543364d8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 676410, 'reachable_time': 15285, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 289986, 'error': None, 'target': 'ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:50 np0005588920 systemd[1]: run-netns-ovnmeta\x2d3967ae21\x2d1590\x2d4685\x2d8881\x2d8bd1bcf25258.mount: Deactivated successfully.
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:50.633 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:50.633 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[9af57afa-d68c-4fcb-a123-50e8d7b7ae7e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:50.633 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 12d4a8f6-904d-4ec5-8062-530c89300b7c in datapath 3967ae21-1590-4685-8881-8bd1bcf25258 unbound from our chassis#033[00m
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:50.635 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3967ae21-1590-4685-8881-8bd1bcf25258#033[00m
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:50.648 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[951ca7d1-12ba-4404-906d-92b90dbefad5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:50.649 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3967ae21-11 in ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:50.650 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3967ae21-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:50.651 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[afa37c3d-97b3-4246-904d-2fbf595ad708]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:50.651 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[1d6385b4-7273-4e2f-a706-cb9dc3f396af]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:50.662 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[0582e67d-7e25-4ff5-9a15-bc76cf38f5b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:50.688 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0bcbe2d5-2927-4835-8656-6d4b4d48f528]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:50.718 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[663e1187-717a-4021-9fe5-4ee20ff78cd7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:50.723 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e2081ab2-269c-4b8f-a47a-4c1148cd76d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:50 np0005588920 NetworkManager[49076]: <info>  [1768921730.7249] manager: (tap3967ae21-10): new Veth device (/org/freedesktop/NetworkManager/Devices/377)
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:50.758 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[350a8e9b-be4f-4240-bd61-825268fd7b62]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:50.761 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[504c815b-adc9-4d73-9eb3-eb25cafa1916]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:50.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:50 np0005588920 NetworkManager[49076]: <info>  [1768921730.7867] device (tap3967ae21-10): carrier: link connected
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:50.790 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[c90075ab-5822-4a45-b71a-313efabacd16]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:50.806 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[96528f3b-f886-4094-a405-bd73838b54b6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3967ae21-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b5:ce:9d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 253], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 677247, 'reachable_time': 44418, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 290011, 'error': None, 'target': 'ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:50.822 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e2de68af-38b5-44a6-8504-5d105d5d6e50]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb5:ce9d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 677247, 'tstamp': 677247}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 290012, 'error': None, 'target': 'ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:50.835 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[dcb7a890-8491-4266-9f88-918b2b0ec043]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3967ae21-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b5:ce:9d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 253], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 677247, 'reachable_time': 44418, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 290013, 'error': None, 'target': 'ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:50.863 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[4444f295-b87a-4309-97ba-2b071ed9e16e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:50.923 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c2d27c06-9824-486e-8b5f-99e06f7325d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:50.924 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3967ae21-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:50.924 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:50.924 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3967ae21-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:08:50 np0005588920 nova_compute[226886]: 2026-01-20 15:08:50.926 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:50 np0005588920 kernel: tap3967ae21-10: entered promiscuous mode
Jan 20 10:08:50 np0005588920 NetworkManager[49076]: <info>  [1768921730.9269] manager: (tap3967ae21-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/378)
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:50.929 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3967ae21-10, col_values=(('external_ids', {'iface-id': 'b19d6956-fc8d-42c8-af98-d0f2fe9fe3a3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:08:50 np0005588920 nova_compute[226886]: 2026-01-20 15:08:50.930 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:50 np0005588920 ovn_controller[133971]: 2026-01-20T15:08:50Z|00795|binding|INFO|Releasing lport b19d6956-fc8d-42c8-af98-d0f2fe9fe3a3 from this chassis (sb_readonly=0)
Jan 20 10:08:50 np0005588920 nova_compute[226886]: 2026-01-20 15:08:50.931 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:50.931 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3967ae21-1590-4685-8881-8bd1bcf25258.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3967ae21-1590-4685-8881-8bd1bcf25258.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:50.932 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[4b46577f-067a-46c8-81dc-9d31ed6893c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:50.932 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-3967ae21-1590-4685-8881-8bd1bcf25258
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/3967ae21-1590-4685-8881-8bd1bcf25258.pid.haproxy
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID 3967ae21-1590-4685-8881-8bd1bcf25258
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:08:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:08:50.933 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258', 'env', 'PROCESS_TAG=haproxy-3967ae21-1590-4685-8881-8bd1bcf25258', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3967ae21-1590-4685-8881-8bd1bcf25258.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:08:50 np0005588920 nova_compute[226886]: 2026-01-20 15:08:50.943 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:51 np0005588920 nova_compute[226886]: 2026-01-20 15:08:51.295 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:51 np0005588920 podman[290085]: 2026-01-20 15:08:51.340144188 +0000 UTC m=+0.110066503 container create 7132772147ba22f582371404780f330d86d04acfa7514028c151d52a85ec372d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 20 10:08:51 np0005588920 nova_compute[226886]: 2026-01-20 15:08:51.349 226890 DEBUG nova.virt.libvirt.host [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Removed pending event for 2a67c102-89d1-4196-bc8e-663656945547 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 20 10:08:51 np0005588920 nova_compute[226886]: 2026-01-20 15:08:51.349 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921731.3492503, 2a67c102-89d1-4196-bc8e-663656945547 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:08:51 np0005588920 nova_compute[226886]: 2026-01-20 15:08:51.350 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 2a67c102-89d1-4196-bc8e-663656945547] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:08:51 np0005588920 nova_compute[226886]: 2026-01-20 15:08:51.383 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:08:51 np0005588920 nova_compute[226886]: 2026-01-20 15:08:51.389 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:08:51 np0005588920 systemd[1]: Started libpod-conmon-7132772147ba22f582371404780f330d86d04acfa7514028c151d52a85ec372d.scope.
Jan 20 10:08:51 np0005588920 podman[290085]: 2026-01-20 15:08:51.30661178 +0000 UTC m=+0.076534115 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:08:51 np0005588920 nova_compute[226886]: 2026-01-20 15:08:51.413 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 2a67c102-89d1-4196-bc8e-663656945547] During sync_power_state the instance has a pending task (unrescuing). Skip.#033[00m
Jan 20 10:08:51 np0005588920 nova_compute[226886]: 2026-01-20 15:08:51.413 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921731.3513167, 2a67c102-89d1-4196-bc8e-663656945547 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:08:51 np0005588920 nova_compute[226886]: 2026-01-20 15:08:51.413 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 2a67c102-89d1-4196-bc8e-663656945547] VM Started (Lifecycle Event)#033[00m
Jan 20 10:08:51 np0005588920 systemd[1]: Started libcrun container.
Jan 20 10:08:51 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8875d7849dc86efef31f11884c65956a5d5b32fb9ed08c052b43ce30db2f1d38/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:08:51 np0005588920 nova_compute[226886]: 2026-01-20 15:08:51.437 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:08:51 np0005588920 nova_compute[226886]: 2026-01-20 15:08:51.439 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:08:51 np0005588920 podman[290085]: 2026-01-20 15:08:51.453108043 +0000 UTC m=+0.223030378 container init 7132772147ba22f582371404780f330d86d04acfa7514028c151d52a85ec372d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 20 10:08:51 np0005588920 nova_compute[226886]: 2026-01-20 15:08:51.459 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 2a67c102-89d1-4196-bc8e-663656945547] During sync_power_state the instance has a pending task (unrescuing). Skip.#033[00m
Jan 20 10:08:51 np0005588920 podman[290085]: 2026-01-20 15:08:51.460034001 +0000 UTC m=+0.229956316 container start 7132772147ba22f582371404780f330d86d04acfa7514028c151d52a85ec372d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 20 10:08:51 np0005588920 neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258[290120]: [NOTICE]   (290124) : New worker (290126) forked
Jan 20 10:08:51 np0005588920 neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258[290120]: [NOTICE]   (290124) : Loading success.
Jan 20 10:08:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:51.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:51 np0005588920 nova_compute[226886]: 2026-01-20 15:08:51.970 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:52 np0005588920 nova_compute[226886]: 2026-01-20 15:08:52.056 226890 DEBUG nova.compute.manager [None req-61a4d240-7edf-44e9-bf2b-df61551d9eef 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:08:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e378 e378: 3 total, 3 up, 3 in
Jan 20 10:08:52 np0005588920 nova_compute[226886]: 2026-01-20 15:08:52.427 226890 DEBUG nova.compute.manager [req-28f0de75-9196-4a69-9311-f2fe2f3d69db req-5d94ca81-1a1d-49cb-abf7-7991cd49e362 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Received event network-vif-plugged-12d4a8f6-904d-4ec5-8062-530c89300b7c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:08:52 np0005588920 nova_compute[226886]: 2026-01-20 15:08:52.427 226890 DEBUG oslo_concurrency.lockutils [req-28f0de75-9196-4a69-9311-f2fe2f3d69db req-5d94ca81-1a1d-49cb-abf7-7991cd49e362 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "2a67c102-89d1-4196-bc8e-663656945547-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:08:52 np0005588920 nova_compute[226886]: 2026-01-20 15:08:52.428 226890 DEBUG oslo_concurrency.lockutils [req-28f0de75-9196-4a69-9311-f2fe2f3d69db req-5d94ca81-1a1d-49cb-abf7-7991cd49e362 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2a67c102-89d1-4196-bc8e-663656945547-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:08:52 np0005588920 nova_compute[226886]: 2026-01-20 15:08:52.428 226890 DEBUG oslo_concurrency.lockutils [req-28f0de75-9196-4a69-9311-f2fe2f3d69db req-5d94ca81-1a1d-49cb-abf7-7991cd49e362 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2a67c102-89d1-4196-bc8e-663656945547-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:08:52 np0005588920 nova_compute[226886]: 2026-01-20 15:08:52.429 226890 DEBUG nova.compute.manager [req-28f0de75-9196-4a69-9311-f2fe2f3d69db req-5d94ca81-1a1d-49cb-abf7-7991cd49e362 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] No waiting events found dispatching network-vif-plugged-12d4a8f6-904d-4ec5-8062-530c89300b7c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:08:52 np0005588920 nova_compute[226886]: 2026-01-20 15:08:52.429 226890 WARNING nova.compute.manager [req-28f0de75-9196-4a69-9311-f2fe2f3d69db req-5d94ca81-1a1d-49cb-abf7-7991cd49e362 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Received unexpected event network-vif-plugged-12d4a8f6-904d-4ec5-8062-530c89300b7c for instance with vm_state active and task_state None.#033[00m
Jan 20 10:08:52 np0005588920 nova_compute[226886]: 2026-01-20 15:08:52.429 226890 DEBUG nova.compute.manager [req-28f0de75-9196-4a69-9311-f2fe2f3d69db req-5d94ca81-1a1d-49cb-abf7-7991cd49e362 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Received event network-vif-plugged-12d4a8f6-904d-4ec5-8062-530c89300b7c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:08:52 np0005588920 nova_compute[226886]: 2026-01-20 15:08:52.430 226890 DEBUG oslo_concurrency.lockutils [req-28f0de75-9196-4a69-9311-f2fe2f3d69db req-5d94ca81-1a1d-49cb-abf7-7991cd49e362 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "2a67c102-89d1-4196-bc8e-663656945547-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:08:52 np0005588920 nova_compute[226886]: 2026-01-20 15:08:52.430 226890 DEBUG oslo_concurrency.lockutils [req-28f0de75-9196-4a69-9311-f2fe2f3d69db req-5d94ca81-1a1d-49cb-abf7-7991cd49e362 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2a67c102-89d1-4196-bc8e-663656945547-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:08:52 np0005588920 nova_compute[226886]: 2026-01-20 15:08:52.430 226890 DEBUG oslo_concurrency.lockutils [req-28f0de75-9196-4a69-9311-f2fe2f3d69db req-5d94ca81-1a1d-49cb-abf7-7991cd49e362 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2a67c102-89d1-4196-bc8e-663656945547-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:08:52 np0005588920 nova_compute[226886]: 2026-01-20 15:08:52.431 226890 DEBUG nova.compute.manager [req-28f0de75-9196-4a69-9311-f2fe2f3d69db req-5d94ca81-1a1d-49cb-abf7-7991cd49e362 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] No waiting events found dispatching network-vif-plugged-12d4a8f6-904d-4ec5-8062-530c89300b7c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:08:52 np0005588920 nova_compute[226886]: 2026-01-20 15:08:52.431 226890 WARNING nova.compute.manager [req-28f0de75-9196-4a69-9311-f2fe2f3d69db req-5d94ca81-1a1d-49cb-abf7-7991cd49e362 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Received unexpected event network-vif-plugged-12d4a8f6-904d-4ec5-8062-530c89300b7c for instance with vm_state active and task_state None.#033[00m
Jan 20 10:08:52 np0005588920 nova_compute[226886]: 2026-01-20 15:08:52.432 226890 DEBUG nova.compute.manager [req-28f0de75-9196-4a69-9311-f2fe2f3d69db req-5d94ca81-1a1d-49cb-abf7-7991cd49e362 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Received event network-vif-plugged-12d4a8f6-904d-4ec5-8062-530c89300b7c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:08:52 np0005588920 nova_compute[226886]: 2026-01-20 15:08:52.432 226890 DEBUG oslo_concurrency.lockutils [req-28f0de75-9196-4a69-9311-f2fe2f3d69db req-5d94ca81-1a1d-49cb-abf7-7991cd49e362 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "2a67c102-89d1-4196-bc8e-663656945547-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:08:52 np0005588920 nova_compute[226886]: 2026-01-20 15:08:52.433 226890 DEBUG oslo_concurrency.lockutils [req-28f0de75-9196-4a69-9311-f2fe2f3d69db req-5d94ca81-1a1d-49cb-abf7-7991cd49e362 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2a67c102-89d1-4196-bc8e-663656945547-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:08:52 np0005588920 nova_compute[226886]: 2026-01-20 15:08:52.433 226890 DEBUG oslo_concurrency.lockutils [req-28f0de75-9196-4a69-9311-f2fe2f3d69db req-5d94ca81-1a1d-49cb-abf7-7991cd49e362 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2a67c102-89d1-4196-bc8e-663656945547-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:08:52 np0005588920 nova_compute[226886]: 2026-01-20 15:08:52.434 226890 DEBUG nova.compute.manager [req-28f0de75-9196-4a69-9311-f2fe2f3d69db req-5d94ca81-1a1d-49cb-abf7-7991cd49e362 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] No waiting events found dispatching network-vif-plugged-12d4a8f6-904d-4ec5-8062-530c89300b7c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:08:52 np0005588920 nova_compute[226886]: 2026-01-20 15:08:52.435 226890 WARNING nova.compute.manager [req-28f0de75-9196-4a69-9311-f2fe2f3d69db req-5d94ca81-1a1d-49cb-abf7-7991cd49e362 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Received unexpected event network-vif-plugged-12d4a8f6-904d-4ec5-8062-530c89300b7c for instance with vm_state active and task_state None.#033[00m
Jan 20 10:08:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e378 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:08:52 np0005588920 nova_compute[226886]: 2026-01-20 15:08:52.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:08:52 np0005588920 nova_compute[226886]: 2026-01-20 15:08:52.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:08:52 np0005588920 nova_compute[226886]: 2026-01-20 15:08:52.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:08:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:52.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:53 np0005588920 nova_compute[226886]: 2026-01-20 15:08:53.121 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "refresh_cache-2a67c102-89d1-4196-bc8e-663656945547" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:08:53 np0005588920 nova_compute[226886]: 2026-01-20 15:08:53.121 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquired lock "refresh_cache-2a67c102-89d1-4196-bc8e-663656945547" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:08:53 np0005588920 nova_compute[226886]: 2026-01-20 15:08:53.121 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 10:08:53 np0005588920 nova_compute[226886]: 2026-01-20 15:08:53.121 226890 DEBUG nova.objects.instance [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 2a67c102-89d1-4196-bc8e-663656945547 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:08:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:53.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:54 np0005588920 nova_compute[226886]: 2026-01-20 15:08:54.597 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Updating instance_info_cache with network_info: [{"id": "12d4a8f6-904d-4ec5-8062-530c89300b7c", "address": "fa:16:3e:d4:df:63", "network": {"id": "3967ae21-1590-4685-8881-8bd1bcf25258", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fc74c4a296554866969b05aef75252af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12d4a8f6-90", "ovs_interfaceid": "12d4a8f6-904d-4ec5-8062-530c89300b7c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:08:54 np0005588920 nova_compute[226886]: 2026-01-20 15:08:54.611 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Releasing lock "refresh_cache-2a67c102-89d1-4196-bc8e-663656945547" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:08:54 np0005588920 nova_compute[226886]: 2026-01-20 15:08:54.611 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 10:08:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:54.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:54 np0005588920 podman[290136]: 2026-01-20 15:08:54.961932782 +0000 UTC m=+0.050675657 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 20 10:08:55 np0005588920 nova_compute[226886]: 2026-01-20 15:08:55.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:08:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:08:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:55.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:08:56 np0005588920 nova_compute[226886]: 2026-01-20 15:08:56.333 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:56.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:56 np0005588920 nova_compute[226886]: 2026-01-20 15:08:56.973 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e378 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:08:57 np0005588920 nova_compute[226886]: 2026-01-20 15:08:57.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:08:57 np0005588920 nova_compute[226886]: 2026-01-20 15:08:57.748 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:08:57 np0005588920 nova_compute[226886]: 2026-01-20 15:08:57.749 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:08:57 np0005588920 nova_compute[226886]: 2026-01-20 15:08:57.749 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:08:57 np0005588920 nova_compute[226886]: 2026-01-20 15:08:57.750 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:08:57 np0005588920 nova_compute[226886]: 2026-01-20 15:08:57.750 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:08:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:57.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:58 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:08:58 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4045448645' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:08:58 np0005588920 nova_compute[226886]: 2026-01-20 15:08:58.493 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.743s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:08:58 np0005588920 nova_compute[226886]: 2026-01-20 15:08:58.572 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-000000a7 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:08:58 np0005588920 nova_compute[226886]: 2026-01-20 15:08:58.572 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-000000a7 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:08:58 np0005588920 nova_compute[226886]: 2026-01-20 15:08:58.713 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:08:58 np0005588920 nova_compute[226886]: 2026-01-20 15:08:58.715 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3986MB free_disk=20.831497192382812GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:08:58 np0005588920 nova_compute[226886]: 2026-01-20 15:08:58.715 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:08:58 np0005588920 nova_compute[226886]: 2026-01-20 15:08:58.715 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:08:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 10:08:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:58.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 10:08:58 np0005588920 nova_compute[226886]: 2026-01-20 15:08:58.873 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance 2a67c102-89d1-4196-bc8e-663656945547 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:08:58 np0005588920 nova_compute[226886]: 2026-01-20 15:08:58.873 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:08:58 np0005588920 nova_compute[226886]: 2026-01-20 15:08:58.874 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 20 10:08:58 np0005588920 nova_compute[226886]: 2026-01-20 15:08:58.942 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Refreshing inventories for resource provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 20 10:08:59 np0005588920 nova_compute[226886]: 2026-01-20 15:08:59.002 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Updating ProviderTree inventory for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 20 10:08:59 np0005588920 nova_compute[226886]: 2026-01-20 15:08:59.003 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Updating inventory in ProviderTree for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 20 10:08:59 np0005588920 nova_compute[226886]: 2026-01-20 15:08:59.026 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Refreshing aggregate associations for resource provider ff38e91c-3320-4831-90ac-bcffc89ba7b6, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 20 10:08:59 np0005588920 nova_compute[226886]: 2026-01-20 15:08:59.049 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Refreshing trait associations for resource provider ff38e91c-3320-4831-90ac-bcffc89ba7b6, traits: COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_DEVICE_TAGGING,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 20 10:08:59 np0005588920 nova_compute[226886]: 2026-01-20 15:08:59.102 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:08:59 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:08:59 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2842859879' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:08:59 np0005588920 nova_compute[226886]: 2026-01-20 15:08:59.574 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:08:59 np0005588920 nova_compute[226886]: 2026-01-20 15:08:59.579 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 20 10:08:59 np0005588920 nova_compute[226886]: 2026-01-20 15:08:59.593 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 20 10:08:59 np0005588920 nova_compute[226886]: 2026-01-20 15:08:59.619 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 20 10:08:59 np0005588920 nova_compute[226886]: 2026-01-20 15:08:59.619 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.904s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:08:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:08:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:59.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:00 np0005588920 nova_compute[226886]: 2026-01-20 15:09:00.622 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:09:00 np0005588920 nova_compute[226886]: 2026-01-20 15:09:00.623 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:09:00 np0005588920 nova_compute[226886]: 2026-01-20 15:09:00.721 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:09:00 np0005588920 nova_compute[226886]: 2026-01-20 15:09:00.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:09:00 np0005588920 nova_compute[226886]: 2026-01-20 15:09:00.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:09:00 np0005588920 nova_compute[226886]: 2026-01-20 15:09:00.724 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 20 10:09:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:00.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:00 np0005588920 nova_compute[226886]: 2026-01-20 15:09:00.796 226890 DEBUG oslo_concurrency.lockutils [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Acquiring lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:09:00 np0005588920 nova_compute[226886]: 2026-01-20 15:09:00.796 226890 DEBUG oslo_concurrency.lockutils [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:09:00 np0005588920 nova_compute[226886]: 2026-01-20 15:09:00.814 226890 DEBUG nova.compute.manager [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 20 10:09:00 np0005588920 nova_compute[226886]: 2026-01-20 15:09:00.894 226890 DEBUG oslo_concurrency.lockutils [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:09:00 np0005588920 nova_compute[226886]: 2026-01-20 15:09:00.894 226890 DEBUG oslo_concurrency.lockutils [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:09:00 np0005588920 nova_compute[226886]: 2026-01-20 15:09:00.901 226890 DEBUG nova.virt.hardware [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 20 10:09:00 np0005588920 nova_compute[226886]: 2026-01-20 15:09:00.902 226890 INFO nova.compute.claims [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Claim successful on node compute-2.ctlplane.example.com
Jan 20 10:09:01 np0005588920 nova_compute[226886]: 2026-01-20 15:09:01.028 226890 DEBUG oslo_concurrency.processutils [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:09:01 np0005588920 nova_compute[226886]: 2026-01-20 15:09:01.335 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:09:01 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:09:01 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3777703864' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:09:01 np0005588920 nova_compute[226886]: 2026-01-20 15:09:01.515 226890 DEBUG oslo_concurrency.processutils [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:09:01 np0005588920 nova_compute[226886]: 2026-01-20 15:09:01.521 226890 DEBUG nova.compute.provider_tree [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 20 10:09:01 np0005588920 nova_compute[226886]: 2026-01-20 15:09:01.550 226890 DEBUG nova.scheduler.client.report [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 20 10:09:01 np0005588920 nova_compute[226886]: 2026-01-20 15:09:01.608 226890 DEBUG oslo_concurrency.lockutils [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.713s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:09:01 np0005588920 nova_compute[226886]: 2026-01-20 15:09:01.609 226890 DEBUG nova.compute.manager [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 20 10:09:01 np0005588920 nova_compute[226886]: 2026-01-20 15:09:01.664 226890 DEBUG nova.compute.manager [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 20 10:09:01 np0005588920 nova_compute[226886]: 2026-01-20 15:09:01.665 226890 DEBUG nova.network.neutron [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 20 10:09:01 np0005588920 nova_compute[226886]: 2026-01-20 15:09:01.689 226890 INFO nova.virt.libvirt.driver [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 20 10:09:01 np0005588920 nova_compute[226886]: 2026-01-20 15:09:01.714 226890 DEBUG nova.compute.manager [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 20 10:09:01 np0005588920 nova_compute[226886]: 2026-01-20 15:09:01.814 226890 DEBUG nova.compute.manager [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 20 10:09:01 np0005588920 nova_compute[226886]: 2026-01-20 15:09:01.816 226890 DEBUG nova.virt.libvirt.driver [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 20 10:09:01 np0005588920 nova_compute[226886]: 2026-01-20 15:09:01.816 226890 INFO nova.virt.libvirt.driver [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Creating image(s)
Jan 20 10:09:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:01.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:01 np0005588920 nova_compute[226886]: 2026-01-20 15:09:01.842 226890 DEBUG nova.storage.rbd_utils [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] rbd image f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:09:01 np0005588920 nova_compute[226886]: 2026-01-20 15:09:01.866 226890 DEBUG nova.storage.rbd_utils [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] rbd image f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:09:01 np0005588920 nova_compute[226886]: 2026-01-20 15:09:01.895 226890 DEBUG nova.storage.rbd_utils [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] rbd image f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:09:01 np0005588920 nova_compute[226886]: 2026-01-20 15:09:01.898 226890 DEBUG oslo_concurrency.processutils [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:09:01 np0005588920 nova_compute[226886]: 2026-01-20 15:09:01.940 226890 DEBUG nova.policy [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '27658864f96d453586dd0846a4c55b7d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fc74c4a296554866969b05aef75252af', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 20 10:09:02 np0005588920 nova_compute[226886]: 2026-01-20 15:09:02.027 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:09:02 np0005588920 nova_compute[226886]: 2026-01-20 15:09:02.029 226890 DEBUG oslo_concurrency.processutils [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:09:02 np0005588920 nova_compute[226886]: 2026-01-20 15:09:02.030 226890 DEBUG oslo_concurrency.lockutils [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:09:02 np0005588920 nova_compute[226886]: 2026-01-20 15:09:02.030 226890 DEBUG oslo_concurrency.lockutils [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:09:02 np0005588920 nova_compute[226886]: 2026-01-20 15:09:02.031 226890 DEBUG oslo_concurrency.lockutils [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:09:02 np0005588920 nova_compute[226886]: 2026-01-20 15:09:02.056 226890 DEBUG nova.storage.rbd_utils [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] rbd image f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:09:02 np0005588920 nova_compute[226886]: 2026-01-20 15:09:02.060 226890 DEBUG oslo_concurrency.processutils [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:09:02 np0005588920 nova_compute[226886]: 2026-01-20 15:09:02.612 226890 DEBUG oslo_concurrency.processutils [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.552s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:09:02 np0005588920 nova_compute[226886]: 2026-01-20 15:09:02.677 226890 DEBUG nova.storage.rbd_utils [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] resizing rbd image f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 20 10:09:02 np0005588920 nova_compute[226886]: 2026-01-20 15:09:02.720 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:09:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e378 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:09:02 np0005588920 nova_compute[226886]: 2026-01-20 15:09:02.745 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:09:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:02.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:02 np0005588920 nova_compute[226886]: 2026-01-20 15:09:02.939 226890 DEBUG nova.objects.instance [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lazy-loading 'migration_context' on Instance uuid f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 10:09:02 np0005588920 nova_compute[226886]: 2026-01-20 15:09:02.962 226890 DEBUG nova.virt.libvirt.driver [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 20 10:09:02 np0005588920 nova_compute[226886]: 2026-01-20 15:09:02.963 226890 DEBUG nova.virt.libvirt.driver [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Ensure instance console log exists: /var/lib/nova/instances/f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 20 10:09:02 np0005588920 nova_compute[226886]: 2026-01-20 15:09:02.964 226890 DEBUG oslo_concurrency.lockutils [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:09:02 np0005588920 nova_compute[226886]: 2026-01-20 15:09:02.964 226890 DEBUG oslo_concurrency.lockutils [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:09:02 np0005588920 nova_compute[226886]: 2026-01-20 15:09:02.964 226890 DEBUG oslo_concurrency.lockutils [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:09:02 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:09:02 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:09:03 np0005588920 nova_compute[226886]: 2026-01-20 15:09:03.101 226890 DEBUG nova.network.neutron [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Successfully created port: 5c6391be-1db0-417c-a94e-89eb0cdbd8e7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 20 10:09:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:09:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:03.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:09:03 np0005588920 ovn_controller[133971]: 2026-01-20T15:09:03Z|00106|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d4:df:63 10.100.0.14
Jan 20 10:09:04 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:09:04 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:09:04 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:09:04 np0005588920 nova_compute[226886]: 2026-01-20 15:09:04.111 226890 DEBUG nova.network.neutron [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Successfully updated port: 5c6391be-1db0-417c-a94e-89eb0cdbd8e7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 20 10:09:04 np0005588920 nova_compute[226886]: 2026-01-20 15:09:04.127 226890 DEBUG oslo_concurrency.lockutils [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Acquiring lock "refresh_cache-f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 20 10:09:04 np0005588920 nova_compute[226886]: 2026-01-20 15:09:04.128 226890 DEBUG oslo_concurrency.lockutils [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Acquired lock "refresh_cache-f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 20 10:09:04 np0005588920 nova_compute[226886]: 2026-01-20 15:09:04.128 226890 DEBUG nova.network.neutron [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 20 10:09:04 np0005588920 nova_compute[226886]: 2026-01-20 15:09:04.242 226890 DEBUG nova.compute.manager [req-90f9d67a-bd45-4c00-bbc8-31da84c78ee4 req-751923fa-7c16-4816-ae16-a9db89033b90 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Received event network-changed-5c6391be-1db0-417c-a94e-89eb0cdbd8e7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 10:09:04 np0005588920 nova_compute[226886]: 2026-01-20 15:09:04.242 226890 DEBUG nova.compute.manager [req-90f9d67a-bd45-4c00-bbc8-31da84c78ee4 req-751923fa-7c16-4816-ae16-a9db89033b90 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Refreshing instance network info cache due to event network-changed-5c6391be-1db0-417c-a94e-89eb0cdbd8e7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 20 10:09:04 np0005588920 nova_compute[226886]: 2026-01-20 15:09:04.243 226890 DEBUG oslo_concurrency.lockutils [req-90f9d67a-bd45-4c00-bbc8-31da84c78ee4 req-751923fa-7c16-4816-ae16-a9db89033b90 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 20 10:09:04 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e379 e379: 3 total, 3 up, 3 in
Jan 20 10:09:04 np0005588920 nova_compute[226886]: 2026-01-20 15:09:04.701 226890 DEBUG nova.network.neutron [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:09:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:04.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:05 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:09:05 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2658081429' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:09:05 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:09:05 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2658081429' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:09:05 np0005588920 nova_compute[226886]: 2026-01-20 15:09:05.512 226890 DEBUG nova.network.neutron [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Updating instance_info_cache with network_info: [{"id": "5c6391be-1db0-417c-a94e-89eb0cdbd8e7", "address": "fa:16:3e:9d:e5:5b", "network": {"id": "3967ae21-1590-4685-8881-8bd1bcf25258", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fc74c4a296554866969b05aef75252af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c6391be-1d", "ovs_interfaceid": "5c6391be-1db0-417c-a94e-89eb0cdbd8e7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:09:05 np0005588920 nova_compute[226886]: 2026-01-20 15:09:05.530 226890 DEBUG oslo_concurrency.lockutils [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Releasing lock "refresh_cache-f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:09:05 np0005588920 nova_compute[226886]: 2026-01-20 15:09:05.530 226890 DEBUG nova.compute.manager [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Instance network_info: |[{"id": "5c6391be-1db0-417c-a94e-89eb0cdbd8e7", "address": "fa:16:3e:9d:e5:5b", "network": {"id": "3967ae21-1590-4685-8881-8bd1bcf25258", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fc74c4a296554866969b05aef75252af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c6391be-1d", "ovs_interfaceid": "5c6391be-1db0-417c-a94e-89eb0cdbd8e7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 10:09:05 np0005588920 nova_compute[226886]: 2026-01-20 15:09:05.531 226890 DEBUG oslo_concurrency.lockutils [req-90f9d67a-bd45-4c00-bbc8-31da84c78ee4 req-751923fa-7c16-4816-ae16-a9db89033b90 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:09:05 np0005588920 nova_compute[226886]: 2026-01-20 15:09:05.531 226890 DEBUG nova.network.neutron [req-90f9d67a-bd45-4c00-bbc8-31da84c78ee4 req-751923fa-7c16-4816-ae16-a9db89033b90 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Refreshing network info cache for port 5c6391be-1db0-417c-a94e-89eb0cdbd8e7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:09:05 np0005588920 nova_compute[226886]: 2026-01-20 15:09:05.534 226890 DEBUG nova.virt.libvirt.driver [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Start _get_guest_xml network_info=[{"id": "5c6391be-1db0-417c-a94e-89eb0cdbd8e7", "address": "fa:16:3e:9d:e5:5b", "network": {"id": "3967ae21-1590-4685-8881-8bd1bcf25258", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fc74c4a296554866969b05aef75252af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c6391be-1d", "ovs_interfaceid": "5c6391be-1db0-417c-a94e-89eb0cdbd8e7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:09:05 np0005588920 nova_compute[226886]: 2026-01-20 15:09:05.538 226890 WARNING nova.virt.libvirt.driver [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:09:05 np0005588920 nova_compute[226886]: 2026-01-20 15:09:05.543 226890 DEBUG nova.virt.libvirt.host [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:09:05 np0005588920 nova_compute[226886]: 2026-01-20 15:09:05.543 226890 DEBUG nova.virt.libvirt.host [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:09:05 np0005588920 nova_compute[226886]: 2026-01-20 15:09:05.548 226890 DEBUG nova.virt.libvirt.host [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:09:05 np0005588920 nova_compute[226886]: 2026-01-20 15:09:05.548 226890 DEBUG nova.virt.libvirt.host [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:09:05 np0005588920 nova_compute[226886]: 2026-01-20 15:09:05.550 226890 DEBUG nova.virt.libvirt.driver [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:09:05 np0005588920 nova_compute[226886]: 2026-01-20 15:09:05.551 226890 DEBUG nova.virt.hardware [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:09:05 np0005588920 nova_compute[226886]: 2026-01-20 15:09:05.551 226890 DEBUG nova.virt.hardware [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:09:05 np0005588920 nova_compute[226886]: 2026-01-20 15:09:05.551 226890 DEBUG nova.virt.hardware [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:09:05 np0005588920 nova_compute[226886]: 2026-01-20 15:09:05.552 226890 DEBUG nova.virt.hardware [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:09:05 np0005588920 nova_compute[226886]: 2026-01-20 15:09:05.552 226890 DEBUG nova.virt.hardware [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:09:05 np0005588920 nova_compute[226886]: 2026-01-20 15:09:05.552 226890 DEBUG nova.virt.hardware [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:09:05 np0005588920 nova_compute[226886]: 2026-01-20 15:09:05.553 226890 DEBUG nova.virt.hardware [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:09:05 np0005588920 nova_compute[226886]: 2026-01-20 15:09:05.553 226890 DEBUG nova.virt.hardware [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:09:05 np0005588920 nova_compute[226886]: 2026-01-20 15:09:05.553 226890 DEBUG nova.virt.hardware [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:09:05 np0005588920 nova_compute[226886]: 2026-01-20 15:09:05.553 226890 DEBUG nova.virt.hardware [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:09:05 np0005588920 nova_compute[226886]: 2026-01-20 15:09:05.553 226890 DEBUG nova.virt.hardware [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:09:05 np0005588920 nova_compute[226886]: 2026-01-20 15:09:05.556 226890 DEBUG oslo_concurrency.processutils [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:09:05 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e380 e380: 3 total, 3 up, 3 in
Jan 20 10:09:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:05.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:05 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:09:05 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3460090808' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:09:06 np0005588920 nova_compute[226886]: 2026-01-20 15:09:06.013 226890 DEBUG oslo_concurrency.processutils [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:09:06 np0005588920 nova_compute[226886]: 2026-01-20 15:09:06.043 226890 DEBUG nova.storage.rbd_utils [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] rbd image f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:09:06 np0005588920 nova_compute[226886]: 2026-01-20 15:09:06.046 226890 DEBUG oslo_concurrency.processutils [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:09:06 np0005588920 nova_compute[226886]: 2026-01-20 15:09:06.383 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:06 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:09:06 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/330773330' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:09:06 np0005588920 nova_compute[226886]: 2026-01-20 15:09:06.492 226890 DEBUG oslo_concurrency.processutils [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:09:06 np0005588920 nova_compute[226886]: 2026-01-20 15:09:06.494 226890 DEBUG nova.virt.libvirt.vif [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:08:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1321021662',display_name='tempest-ServerRescueNegativeTestJSON-server-1321021662',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-1321021662',id=171,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB/7uXKxe9nAPl14X3fTp2ccJ5Be2YZUIKmP54MYpo0vFkLM4vJyo+K5ySbdF/GxgqpIyyKbMYSgP6x/brvrQahBSInMKWnh7cc52EbXbOHcGpmF2QhgzpimmyzN8oX4hw==',key_name='tempest-keypair-1299086936',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fc74c4a296554866969b05aef75252af',ramdisk_id='',reservation_id='r-4rgzfbcx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-1649662639',owner_user_name='tempest-ServerRescueNegativeTestJSON-1649662639-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:09:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='27658864f96d453586dd0846a4c55b7d',uuid=f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5c6391be-1db0-417c-a94e-89eb0cdbd8e7", "address": "fa:16:3e:9d:e5:5b", "network": {"id": "3967ae21-1590-4685-8881-8bd1bcf25258", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fc74c4a296554866969b05aef75252af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c6391be-1d", "ovs_interfaceid": "5c6391be-1db0-417c-a94e-89eb0cdbd8e7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:09:06 np0005588920 nova_compute[226886]: 2026-01-20 15:09:06.494 226890 DEBUG nova.network.os_vif_util [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Converting VIF {"id": "5c6391be-1db0-417c-a94e-89eb0cdbd8e7", "address": "fa:16:3e:9d:e5:5b", "network": {"id": "3967ae21-1590-4685-8881-8bd1bcf25258", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fc74c4a296554866969b05aef75252af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c6391be-1d", "ovs_interfaceid": "5c6391be-1db0-417c-a94e-89eb0cdbd8e7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:09:06 np0005588920 nova_compute[226886]: 2026-01-20 15:09:06.495 226890 DEBUG nova.network.os_vif_util [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9d:e5:5b,bridge_name='br-int',has_traffic_filtering=True,id=5c6391be-1db0-417c-a94e-89eb0cdbd8e7,network=Network(3967ae21-1590-4685-8881-8bd1bcf25258),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5c6391be-1d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:09:06 np0005588920 nova_compute[226886]: 2026-01-20 15:09:06.497 226890 DEBUG nova.objects.instance [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lazy-loading 'pci_devices' on Instance uuid f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:09:06 np0005588920 nova_compute[226886]: 2026-01-20 15:09:06.508 226890 DEBUG nova.virt.libvirt.driver [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:09:06 np0005588920 nova_compute[226886]:  <uuid>f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1</uuid>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:  <name>instance-000000ab</name>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:09:06 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:      <nova:name>tempest-ServerRescueNegativeTestJSON-server-1321021662</nova:name>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 15:09:05</nova:creationTime>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 10:09:06 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:        <nova:user uuid="27658864f96d453586dd0846a4c55b7d">tempest-ServerRescueNegativeTestJSON-1649662639-project-member</nova:user>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:        <nova:project uuid="fc74c4a296554866969b05aef75252af">tempest-ServerRescueNegativeTestJSON-1649662639</nova:project>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:        <nova:port uuid="5c6391be-1db0-417c-a94e-89eb0cdbd8e7">
Jan 20 10:09:06 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    <system>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:      <entry name="serial">f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1</entry>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:      <entry name="uuid">f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1</entry>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    </system>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:  <os>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:  </os>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:  <features>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:  </features>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:  </clock>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:  <devices>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 10:09:06 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1_disk">
Jan 20 10:09:06 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:09:06 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 10:09:06 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1_disk.config">
Jan 20 10:09:06 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:09:06 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 10:09:06 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:9d:e5:5b"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:      <target dev="tap5c6391be-1d"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    </interface>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 10:09:06 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1/console.log" append="off"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    </serial>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    <video>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    </video>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 10:09:06 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    </rng>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 10:09:06 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 10:09:06 np0005588920 nova_compute[226886]:  </devices>
Jan 20 10:09:06 np0005588920 nova_compute[226886]: </domain>
Jan 20 10:09:06 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:09:06 np0005588920 nova_compute[226886]: 2026-01-20 15:09:06.510 226890 DEBUG nova.compute.manager [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Preparing to wait for external event network-vif-plugged-5c6391be-1db0-417c-a94e-89eb0cdbd8e7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:09:06 np0005588920 nova_compute[226886]: 2026-01-20 15:09:06.511 226890 DEBUG oslo_concurrency.lockutils [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Acquiring lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:09:06 np0005588920 nova_compute[226886]: 2026-01-20 15:09:06.512 226890 DEBUG oslo_concurrency.lockutils [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:09:06 np0005588920 nova_compute[226886]: 2026-01-20 15:09:06.512 226890 DEBUG oslo_concurrency.lockutils [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:09:06 np0005588920 nova_compute[226886]: 2026-01-20 15:09:06.514 226890 DEBUG nova.virt.libvirt.vif [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:08:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1321021662',display_name='tempest-ServerRescueNegativeTestJSON-server-1321021662',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-1321021662',id=171,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB/7uXKxe9nAPl14X3fTp2ccJ5Be2YZUIKmP54MYpo0vFkLM4vJyo+K5ySbdF/GxgqpIyyKbMYSgP6x/brvrQahBSInMKWnh7cc52EbXbOHcGpmF2QhgzpimmyzN8oX4hw==',key_name='tempest-keypair-1299086936',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fc74c4a296554866969b05aef75252af',ramdisk_id='',reservation_id='r-4rgzfbcx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-1649662639',owner_user_name='tempest-ServerRescueNegativeTestJSON-1649662639-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:09:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='27658864f96d453586dd0846a4c55b7d',uuid=f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5c6391be-1db0-417c-a94e-89eb0cdbd8e7", "address": "fa:16:3e:9d:e5:5b", "network": {"id": "3967ae21-1590-4685-8881-8bd1bcf25258", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fc74c4a296554866969b05aef75252af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c6391be-1d", "ovs_interfaceid": "5c6391be-1db0-417c-a94e-89eb0cdbd8e7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:09:06 np0005588920 nova_compute[226886]: 2026-01-20 15:09:06.514 226890 DEBUG nova.network.os_vif_util [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Converting VIF {"id": "5c6391be-1db0-417c-a94e-89eb0cdbd8e7", "address": "fa:16:3e:9d:e5:5b", "network": {"id": "3967ae21-1590-4685-8881-8bd1bcf25258", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fc74c4a296554866969b05aef75252af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c6391be-1d", "ovs_interfaceid": "5c6391be-1db0-417c-a94e-89eb0cdbd8e7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:09:06 np0005588920 nova_compute[226886]: 2026-01-20 15:09:06.516 226890 DEBUG nova.network.os_vif_util [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9d:e5:5b,bridge_name='br-int',has_traffic_filtering=True,id=5c6391be-1db0-417c-a94e-89eb0cdbd8e7,network=Network(3967ae21-1590-4685-8881-8bd1bcf25258),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5c6391be-1d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:09:06 np0005588920 nova_compute[226886]: 2026-01-20 15:09:06.516 226890 DEBUG os_vif [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9d:e5:5b,bridge_name='br-int',has_traffic_filtering=True,id=5c6391be-1db0-417c-a94e-89eb0cdbd8e7,network=Network(3967ae21-1590-4685-8881-8bd1bcf25258),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5c6391be-1d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:09:06 np0005588920 nova_compute[226886]: 2026-01-20 15:09:06.518 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:06 np0005588920 nova_compute[226886]: 2026-01-20 15:09:06.519 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:09:06 np0005588920 nova_compute[226886]: 2026-01-20 15:09:06.519 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:09:06 np0005588920 nova_compute[226886]: 2026-01-20 15:09:06.524 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:06 np0005588920 nova_compute[226886]: 2026-01-20 15:09:06.525 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5c6391be-1d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:09:06 np0005588920 nova_compute[226886]: 2026-01-20 15:09:06.526 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5c6391be-1d, col_values=(('external_ids', {'iface-id': '5c6391be-1db0-417c-a94e-89eb0cdbd8e7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9d:e5:5b', 'vm-uuid': 'f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:09:06 np0005588920 nova_compute[226886]: 2026-01-20 15:09:06.529 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:06 np0005588920 NetworkManager[49076]: <info>  [1768921746.5297] manager: (tap5c6391be-1d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/379)
Jan 20 10:09:06 np0005588920 nova_compute[226886]: 2026-01-20 15:09:06.531 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:09:06 np0005588920 nova_compute[226886]: 2026-01-20 15:09:06.536 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:06 np0005588920 nova_compute[226886]: 2026-01-20 15:09:06.538 226890 INFO os_vif [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9d:e5:5b,bridge_name='br-int',has_traffic_filtering=True,id=5c6391be-1db0-417c-a94e-89eb0cdbd8e7,network=Network(3967ae21-1590-4685-8881-8bd1bcf25258),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5c6391be-1d')#033[00m
Jan 20 10:09:06 np0005588920 nova_compute[226886]: 2026-01-20 15:09:06.599 226890 DEBUG nova.virt.libvirt.driver [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:09:06 np0005588920 nova_compute[226886]: 2026-01-20 15:09:06.599 226890 DEBUG nova.virt.libvirt.driver [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:09:06 np0005588920 nova_compute[226886]: 2026-01-20 15:09:06.600 226890 DEBUG nova.virt.libvirt.driver [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] No VIF found with MAC fa:16:3e:9d:e5:5b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:09:06 np0005588920 nova_compute[226886]: 2026-01-20 15:09:06.600 226890 INFO nova.virt.libvirt.driver [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Using config drive#033[00m
Jan 20 10:09:06 np0005588920 nova_compute[226886]: 2026-01-20 15:09:06.619 226890 DEBUG nova.storage.rbd_utils [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] rbd image f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:09:06 np0005588920 nova_compute[226886]: 2026-01-20 15:09:06.740 226890 DEBUG nova.network.neutron [req-90f9d67a-bd45-4c00-bbc8-31da84c78ee4 req-751923fa-7c16-4816-ae16-a9db89033b90 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Updated VIF entry in instance network info cache for port 5c6391be-1db0-417c-a94e-89eb0cdbd8e7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:09:06 np0005588920 nova_compute[226886]: 2026-01-20 15:09:06.740 226890 DEBUG nova.network.neutron [req-90f9d67a-bd45-4c00-bbc8-31da84c78ee4 req-751923fa-7c16-4816-ae16-a9db89033b90 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Updating instance_info_cache with network_info: [{"id": "5c6391be-1db0-417c-a94e-89eb0cdbd8e7", "address": "fa:16:3e:9d:e5:5b", "network": {"id": "3967ae21-1590-4685-8881-8bd1bcf25258", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fc74c4a296554866969b05aef75252af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c6391be-1d", "ovs_interfaceid": "5c6391be-1db0-417c-a94e-89eb0cdbd8e7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:09:06 np0005588920 nova_compute[226886]: 2026-01-20 15:09:06.764 226890 DEBUG oslo_concurrency.lockutils [req-90f9d67a-bd45-4c00-bbc8-31da84c78ee4 req-751923fa-7c16-4816-ae16-a9db89033b90 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:09:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:06.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:06 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:09:06 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3862902750' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:09:06 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:09:06 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3862902750' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:09:07 np0005588920 nova_compute[226886]: 2026-01-20 15:09:07.027 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:07 np0005588920 nova_compute[226886]: 2026-01-20 15:09:07.058 226890 INFO nova.virt.libvirt.driver [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Creating config drive at /var/lib/nova/instances/f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1/disk.config#033[00m
Jan 20 10:09:07 np0005588920 nova_compute[226886]: 2026-01-20 15:09:07.063 226890 DEBUG oslo_concurrency.processutils [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2qbiovsg execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:09:07 np0005588920 nova_compute[226886]: 2026-01-20 15:09:07.198 226890 DEBUG oslo_concurrency.processutils [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2qbiovsg" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:09:07 np0005588920 nova_compute[226886]: 2026-01-20 15:09:07.227 226890 DEBUG nova.storage.rbd_utils [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] rbd image f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:09:07 np0005588920 nova_compute[226886]: 2026-01-20 15:09:07.231 226890 DEBUG oslo_concurrency.processutils [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1/disk.config f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:09:07 np0005588920 nova_compute[226886]: 2026-01-20 15:09:07.403 226890 DEBUG oslo_concurrency.processutils [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1/disk.config f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.172s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:09:07 np0005588920 nova_compute[226886]: 2026-01-20 15:09:07.404 226890 INFO nova.virt.libvirt.driver [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Deleting local config drive /var/lib/nova/instances/f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1/disk.config because it was imported into RBD.#033[00m
Jan 20 10:09:07 np0005588920 kernel: tap5c6391be-1d: entered promiscuous mode
Jan 20 10:09:07 np0005588920 NetworkManager[49076]: <info>  [1768921747.4471] manager: (tap5c6391be-1d): new Tun device (/org/freedesktop/NetworkManager/Devices/380)
Jan 20 10:09:07 np0005588920 ovn_controller[133971]: 2026-01-20T15:09:07Z|00796|binding|INFO|Claiming lport 5c6391be-1db0-417c-a94e-89eb0cdbd8e7 for this chassis.
Jan 20 10:09:07 np0005588920 ovn_controller[133971]: 2026-01-20T15:09:07Z|00797|binding|INFO|5c6391be-1db0-417c-a94e-89eb0cdbd8e7: Claiming fa:16:3e:9d:e5:5b 10.100.0.12
Jan 20 10:09:07 np0005588920 nova_compute[226886]: 2026-01-20 15:09:07.497 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:07 np0005588920 ovn_controller[133971]: 2026-01-20T15:09:07Z|00798|binding|INFO|Setting lport 5c6391be-1db0-417c-a94e-89eb0cdbd8e7 ovn-installed in OVS
Jan 20 10:09:07 np0005588920 nova_compute[226886]: 2026-01-20 15:09:07.515 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:07 np0005588920 systemd-udevd[290653]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:09:07 np0005588920 systemd-machined[196121]: New machine qemu-82-instance-000000ab.
Jan 20 10:09:07 np0005588920 ovn_controller[133971]: 2026-01-20T15:09:07Z|00799|binding|INFO|Setting lport 5c6391be-1db0-417c-a94e-89eb0cdbd8e7 up in Southbound
Jan 20 10:09:07 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:07.527 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9d:e5:5b 10.100.0.12'], port_security=['fa:16:3e:9d:e5:5b 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3967ae21-1590-4685-8881-8bd1bcf25258', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fc74c4a296554866969b05aef75252af', 'neutron:revision_number': '2', 'neutron:security_group_ids': '27c5bfa6-4744-47cb-ac6a-c4a6b0a5e776', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=89ced88f-b1ed-4329-8a53-1931e6b0e3e9, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=5c6391be-1db0-417c-a94e-89eb0cdbd8e7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:09:07 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:07.529 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 5c6391be-1db0-417c-a94e-89eb0cdbd8e7 in datapath 3967ae21-1590-4685-8881-8bd1bcf25258 bound to our chassis#033[00m
Jan 20 10:09:07 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:07.531 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3967ae21-1590-4685-8881-8bd1bcf25258#033[00m
Jan 20 10:09:07 np0005588920 systemd[1]: Started Virtual Machine qemu-82-instance-000000ab.
Jan 20 10:09:07 np0005588920 NetworkManager[49076]: <info>  [1768921747.5392] device (tap5c6391be-1d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:09:07 np0005588920 NetworkManager[49076]: <info>  [1768921747.5402] device (tap5c6391be-1d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:09:07 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:07.547 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[068124f8-c2a9-45b3-9c35-afae65cf9e68]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:07 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:07.580 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[3aed52e7-ec23-4eb4-905c-6f47430f0b00]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:07 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:07.584 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[8356b269-714d-4a60-b90b-921ccc98b75f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:07 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:07.612 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[9252cb66-795a-4a47-abf7-eeb481814490]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:07 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:07.628 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e68dcd93-dd9e-4fbd-bcb4-8b750a515bdb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3967ae21-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b5:ce:9d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 6, 'rx_bytes': 532, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 6, 'rx_bytes': 532, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 253], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 677247, 'reachable_time': 44418, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 290667, 'error': None, 'target': 'ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:07 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:07.641 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2378dfbd-ff52-43df-9b13-3a93122186f4]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap3967ae21-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 677257, 'tstamp': 677257}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 290668, 'error': None, 'target': 'ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap3967ae21-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 677260, 'tstamp': 677260}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 290668, 'error': None, 'target': 'ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:07 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:07.643 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3967ae21-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:09:07 np0005588920 nova_compute[226886]: 2026-01-20 15:09:07.644 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:07 np0005588920 nova_compute[226886]: 2026-01-20 15:09:07.645 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:07 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:07.645 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3967ae21-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:09:07 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:07.646 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:09:07 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:07.646 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3967ae21-10, col_values=(('external_ids', {'iface-id': 'b19d6956-fc8d-42c8-af98-d0f2fe9fe3a3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:09:07 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:07.647 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:09:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e380 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:09:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:07.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:07 np0005588920 nova_compute[226886]: 2026-01-20 15:09:07.979 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921747.9795723, f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:09:07 np0005588920 nova_compute[226886]: 2026-01-20 15:09:07.980 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] VM Started (Lifecycle Event)#033[00m
Jan 20 10:09:08 np0005588920 nova_compute[226886]: 2026-01-20 15:09:08.012 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:09:08 np0005588920 nova_compute[226886]: 2026-01-20 15:09:08.017 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921747.9825923, f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:09:08 np0005588920 nova_compute[226886]: 2026-01-20 15:09:08.017 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:09:08 np0005588920 nova_compute[226886]: 2026-01-20 15:09:08.035 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:09:08 np0005588920 nova_compute[226886]: 2026-01-20 15:09:08.038 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:09:08 np0005588920 nova_compute[226886]: 2026-01-20 15:09:08.060 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:09:08 np0005588920 nova_compute[226886]: 2026-01-20 15:09:08.407 226890 DEBUG nova.compute.manager [req-533300e2-5f19-4850-8e29-ba28c7a59a65 req-b91d24f1-493e-44a3-96eb-47761faab091 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Received event network-vif-plugged-5c6391be-1db0-417c-a94e-89eb0cdbd8e7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:09:08 np0005588920 nova_compute[226886]: 2026-01-20 15:09:08.407 226890 DEBUG oslo_concurrency.lockutils [req-533300e2-5f19-4850-8e29-ba28c7a59a65 req-b91d24f1-493e-44a3-96eb-47761faab091 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:09:08 np0005588920 nova_compute[226886]: 2026-01-20 15:09:08.408 226890 DEBUG oslo_concurrency.lockutils [req-533300e2-5f19-4850-8e29-ba28c7a59a65 req-b91d24f1-493e-44a3-96eb-47761faab091 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:09:08 np0005588920 nova_compute[226886]: 2026-01-20 15:09:08.408 226890 DEBUG oslo_concurrency.lockutils [req-533300e2-5f19-4850-8e29-ba28c7a59a65 req-b91d24f1-493e-44a3-96eb-47761faab091 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:09:08 np0005588920 nova_compute[226886]: 2026-01-20 15:09:08.408 226890 DEBUG nova.compute.manager [req-533300e2-5f19-4850-8e29-ba28c7a59a65 req-b91d24f1-493e-44a3-96eb-47761faab091 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Processing event network-vif-plugged-5c6391be-1db0-417c-a94e-89eb0cdbd8e7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 10:09:08 np0005588920 nova_compute[226886]: 2026-01-20 15:09:08.409 226890 DEBUG nova.compute.manager [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:09:08 np0005588920 nova_compute[226886]: 2026-01-20 15:09:08.412 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921748.411983, f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:09:08 np0005588920 nova_compute[226886]: 2026-01-20 15:09:08.413 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:09:08 np0005588920 nova_compute[226886]: 2026-01-20 15:09:08.415 226890 DEBUG nova.virt.libvirt.driver [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:09:08 np0005588920 nova_compute[226886]: 2026-01-20 15:09:08.418 226890 INFO nova.virt.libvirt.driver [-] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Instance spawned successfully.#033[00m
Jan 20 10:09:08 np0005588920 nova_compute[226886]: 2026-01-20 15:09:08.418 226890 DEBUG nova.virt.libvirt.driver [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 10:09:08 np0005588920 nova_compute[226886]: 2026-01-20 15:09:08.434 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:09:08 np0005588920 nova_compute[226886]: 2026-01-20 15:09:08.436 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:09:08 np0005588920 nova_compute[226886]: 2026-01-20 15:09:08.449 226890 DEBUG nova.virt.libvirt.driver [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:09:08 np0005588920 nova_compute[226886]: 2026-01-20 15:09:08.450 226890 DEBUG nova.virt.libvirt.driver [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:09:08 np0005588920 nova_compute[226886]: 2026-01-20 15:09:08.450 226890 DEBUG nova.virt.libvirt.driver [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:09:08 np0005588920 nova_compute[226886]: 2026-01-20 15:09:08.451 226890 DEBUG nova.virt.libvirt.driver [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:09:08 np0005588920 nova_compute[226886]: 2026-01-20 15:09:08.452 226890 DEBUG nova.virt.libvirt.driver [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:09:08 np0005588920 nova_compute[226886]: 2026-01-20 15:09:08.452 226890 DEBUG nova.virt.libvirt.driver [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:09:08 np0005588920 nova_compute[226886]: 2026-01-20 15:09:08.475 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:09:08 np0005588920 nova_compute[226886]: 2026-01-20 15:09:08.509 226890 INFO nova.compute.manager [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Took 6.69 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 10:09:08 np0005588920 nova_compute[226886]: 2026-01-20 15:09:08.509 226890 DEBUG nova.compute.manager [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:09:08 np0005588920 nova_compute[226886]: 2026-01-20 15:09:08.559 226890 INFO nova.compute.manager [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Took 7.69 seconds to build instance.#033[00m
Jan 20 10:09:08 np0005588920 nova_compute[226886]: 2026-01-20 15:09:08.585 226890 DEBUG oslo_concurrency.lockutils [None req-2395b2b9-b5c5-41fc-8487-a8923de9a62d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.789s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:09:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:08.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:09.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:10 np0005588920 nova_compute[226886]: 2026-01-20 15:09:10.484 226890 DEBUG nova.compute.manager [req-ef28d9cd-98dd-45c1-8a0d-ee34735098ed req-11f76b99-593e-40eb-bcfc-4649adec8268 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Received event network-vif-plugged-5c6391be-1db0-417c-a94e-89eb0cdbd8e7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:09:10 np0005588920 nova_compute[226886]: 2026-01-20 15:09:10.484 226890 DEBUG oslo_concurrency.lockutils [req-ef28d9cd-98dd-45c1-8a0d-ee34735098ed req-11f76b99-593e-40eb-bcfc-4649adec8268 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:09:10 np0005588920 nova_compute[226886]: 2026-01-20 15:09:10.484 226890 DEBUG oslo_concurrency.lockutils [req-ef28d9cd-98dd-45c1-8a0d-ee34735098ed req-11f76b99-593e-40eb-bcfc-4649adec8268 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:09:10 np0005588920 nova_compute[226886]: 2026-01-20 15:09:10.485 226890 DEBUG oslo_concurrency.lockutils [req-ef28d9cd-98dd-45c1-8a0d-ee34735098ed req-11f76b99-593e-40eb-bcfc-4649adec8268 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:09:10 np0005588920 nova_compute[226886]: 2026-01-20 15:09:10.485 226890 DEBUG nova.compute.manager [req-ef28d9cd-98dd-45c1-8a0d-ee34735098ed req-11f76b99-593e-40eb-bcfc-4649adec8268 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] No waiting events found dispatching network-vif-plugged-5c6391be-1db0-417c-a94e-89eb0cdbd8e7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:09:10 np0005588920 nova_compute[226886]: 2026-01-20 15:09:10.486 226890 WARNING nova.compute.manager [req-ef28d9cd-98dd-45c1-8a0d-ee34735098ed req-11f76b99-593e-40eb-bcfc-4649adec8268 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Received unexpected event network-vif-plugged-5c6391be-1db0-417c-a94e-89eb0cdbd8e7 for instance with vm_state active and task_state None.#033[00m
Jan 20 10:09:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:10.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:11 np0005588920 nova_compute[226886]: 2026-01-20 15:09:11.530 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:11 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e381 e381: 3 total, 3 up, 3 in
Jan 20 10:09:11 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:09:11 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:09:11 np0005588920 nova_compute[226886]: 2026-01-20 15:09:11.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:09:11 np0005588920 nova_compute[226886]: 2026-01-20 15:09:11.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 20 10:09:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:11.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:12 np0005588920 nova_compute[226886]: 2026-01-20 15:09:12.031 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:12.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e381 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:09:12 np0005588920 nova_compute[226886]: 2026-01-20 15:09:12.922 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:09:12 np0005588920 nova_compute[226886]: 2026-01-20 15:09:12.922 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 20 10:09:12 np0005588920 nova_compute[226886]: 2026-01-20 15:09:12.963 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 20 10:09:13 np0005588920 NetworkManager[49076]: <info>  [1768921753.0310] manager: (patch-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/381)
Jan 20 10:09:13 np0005588920 NetworkManager[49076]: <info>  [1768921753.0319] manager: (patch-br-int-to-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/382)
Jan 20 10:09:13 np0005588920 nova_compute[226886]: 2026-01-20 15:09:13.030 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:13 np0005588920 nova_compute[226886]: 2026-01-20 15:09:13.199 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:13 np0005588920 ovn_controller[133971]: 2026-01-20T15:09:13Z|00800|binding|INFO|Releasing lport b19d6956-fc8d-42c8-af98-d0f2fe9fe3a3 from this chassis (sb_readonly=0)
Jan 20 10:09:13 np0005588920 nova_compute[226886]: 2026-01-20 15:09:13.216 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:13 np0005588920 nova_compute[226886]: 2026-01-20 15:09:13.560 226890 DEBUG nova.compute.manager [req-f1a320d1-f20c-41ee-a6fc-268c35f958d1 req-25e69cbd-ee23-4ea1-95a0-4fd607642e3e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Received event network-changed-5c6391be-1db0-417c-a94e-89eb0cdbd8e7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:09:13 np0005588920 nova_compute[226886]: 2026-01-20 15:09:13.560 226890 DEBUG nova.compute.manager [req-f1a320d1-f20c-41ee-a6fc-268c35f958d1 req-25e69cbd-ee23-4ea1-95a0-4fd607642e3e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Refreshing instance network info cache due to event network-changed-5c6391be-1db0-417c-a94e-89eb0cdbd8e7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:09:13 np0005588920 nova_compute[226886]: 2026-01-20 15:09:13.560 226890 DEBUG oslo_concurrency.lockutils [req-f1a320d1-f20c-41ee-a6fc-268c35f958d1 req-25e69cbd-ee23-4ea1-95a0-4fd607642e3e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:09:13 np0005588920 nova_compute[226886]: 2026-01-20 15:09:13.561 226890 DEBUG oslo_concurrency.lockutils [req-f1a320d1-f20c-41ee-a6fc-268c35f958d1 req-25e69cbd-ee23-4ea1-95a0-4fd607642e3e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:09:13 np0005588920 nova_compute[226886]: 2026-01-20 15:09:13.561 226890 DEBUG nova.network.neutron [req-f1a320d1-f20c-41ee-a6fc-268c35f958d1 req-25e69cbd-ee23-4ea1-95a0-4fd607642e3e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Refreshing network info cache for port 5c6391be-1db0-417c-a94e-89eb0cdbd8e7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:09:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:09:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:13.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:09:14 np0005588920 podman[290764]: 2026-01-20 15:09:14.030691067 +0000 UTC m=+0.119944884 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Jan 20 10:09:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:09:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:14.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:09:15 np0005588920 nova_compute[226886]: 2026-01-20 15:09:15.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:09:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:15.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:16 np0005588920 nova_compute[226886]: 2026-01-20 15:09:16.122 226890 DEBUG nova.network.neutron [req-f1a320d1-f20c-41ee-a6fc-268c35f958d1 req-25e69cbd-ee23-4ea1-95a0-4fd607642e3e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Updated VIF entry in instance network info cache for port 5c6391be-1db0-417c-a94e-89eb0cdbd8e7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:09:16 np0005588920 nova_compute[226886]: 2026-01-20 15:09:16.123 226890 DEBUG nova.network.neutron [req-f1a320d1-f20c-41ee-a6fc-268c35f958d1 req-25e69cbd-ee23-4ea1-95a0-4fd607642e3e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Updating instance_info_cache with network_info: [{"id": "5c6391be-1db0-417c-a94e-89eb0cdbd8e7", "address": "fa:16:3e:9d:e5:5b", "network": {"id": "3967ae21-1590-4685-8881-8bd1bcf25258", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fc74c4a296554866969b05aef75252af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c6391be-1d", "ovs_interfaceid": "5c6391be-1db0-417c-a94e-89eb0cdbd8e7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:09:16 np0005588920 nova_compute[226886]: 2026-01-20 15:09:16.149 226890 DEBUG oslo_concurrency.lockutils [req-f1a320d1-f20c-41ee-a6fc-268c35f958d1 req-25e69cbd-ee23-4ea1-95a0-4fd607642e3e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:16.469 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:16.470 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:09:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:16.470 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:09:16 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e382 e382: 3 total, 3 up, 3 in
Jan 20 10:09:16 np0005588920 nova_compute[226886]: 2026-01-20 15:09:16.531 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:09:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:16.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:09:17 np0005588920 nova_compute[226886]: 2026-01-20 15:09:17.032 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e382 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:09:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:17.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:18.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:19 np0005588920 ovn_controller[133971]: 2026-01-20T15:09:19Z|00801|binding|INFO|Releasing lport b19d6956-fc8d-42c8-af98-d0f2fe9fe3a3 from this chassis (sb_readonly=0)
Jan 20 10:09:19 np0005588920 nova_compute[226886]: 2026-01-20 15:09:19.346 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 10:09:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:19.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 10:09:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:20.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:21.385 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=55, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=54) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:09:21 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:21.386 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:09:21 np0005588920 nova_compute[226886]: 2026-01-20 15:09:21.385 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:21 np0005588920 nova_compute[226886]: 2026-01-20 15:09:21.533 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:21.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:22 np0005588920 nova_compute[226886]: 2026-01-20 15:09:22.035 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:22.387 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '55'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:09:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e382 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:09:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:22.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:23 np0005588920 ovn_controller[133971]: 2026-01-20T15:09:23Z|00107|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:9d:e5:5b 10.100.0.12
Jan 20 10:09:23 np0005588920 ovn_controller[133971]: 2026-01-20T15:09:23Z|00108|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9d:e5:5b 10.100.0.12
Jan 20 10:09:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:23.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 10:09:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:24.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 10:09:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:25.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:25 np0005588920 podman[290793]: 2026-01-20 15:09:25.992770656 +0000 UTC m=+0.065697614 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:09:26 np0005588920 nova_compute[226886]: 2026-01-20 15:09:26.535 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:09:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:26.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:09:27 np0005588920 nova_compute[226886]: 2026-01-20 15:09:27.035 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e382 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:09:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:09:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:27.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:09:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:28.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:29 np0005588920 nova_compute[226886]: 2026-01-20 15:09:29.069 226890 DEBUG oslo_concurrency.lockutils [None req-ae52b400-8fdc-4b67-9f23-a80b08043c39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Acquiring lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:09:29 np0005588920 nova_compute[226886]: 2026-01-20 15:09:29.070 226890 DEBUG oslo_concurrency.lockutils [None req-ae52b400-8fdc-4b67-9f23-a80b08043c39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:09:29 np0005588920 nova_compute[226886]: 2026-01-20 15:09:29.085 226890 DEBUG nova.objects.instance [None req-ae52b400-8fdc-4b67-9f23-a80b08043c39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lazy-loading 'flavor' on Instance uuid f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:09:29 np0005588920 nova_compute[226886]: 2026-01-20 15:09:29.117 226890 DEBUG oslo_concurrency.lockutils [None req-ae52b400-8fdc-4b67-9f23-a80b08043c39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.048s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:09:29 np0005588920 nova_compute[226886]: 2026-01-20 15:09:29.420 226890 DEBUG oslo_concurrency.lockutils [None req-ae52b400-8fdc-4b67-9f23-a80b08043c39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Acquiring lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:09:29 np0005588920 nova_compute[226886]: 2026-01-20 15:09:29.420 226890 DEBUG oslo_concurrency.lockutils [None req-ae52b400-8fdc-4b67-9f23-a80b08043c39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:09:29 np0005588920 nova_compute[226886]: 2026-01-20 15:09:29.421 226890 INFO nova.compute.manager [None req-ae52b400-8fdc-4b67-9f23-a80b08043c39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Attaching volume 7d67106f-2f4c-4925-94d7-b60a4418b999 to /dev/vdb#033[00m
Jan 20 10:09:29 np0005588920 nova_compute[226886]: 2026-01-20 15:09:29.597 226890 DEBUG os_brick.utils [None req-ae52b400-8fdc-4b67-9f23-a80b08043c39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 20 10:09:29 np0005588920 nova_compute[226886]: 2026-01-20 15:09:29.598 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:09:29 np0005588920 nova_compute[226886]: 2026-01-20 15:09:29.608 231437 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:09:29 np0005588920 nova_compute[226886]: 2026-01-20 15:09:29.609 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[d6cd223a-ee90-4ddf-acc2-036057f154b2]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:29 np0005588920 nova_compute[226886]: 2026-01-20 15:09:29.610 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:09:29 np0005588920 nova_compute[226886]: 2026-01-20 15:09:29.618 231437 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:09:29 np0005588920 nova_compute[226886]: 2026-01-20 15:09:29.618 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[f4dae093-baf8-4d6f-a3e9-cb9c0e5f8b21]: (4, ('InitiatorName=iqn.1994-05.com.redhat:f7e7b2e5d28e', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:29 np0005588920 nova_compute[226886]: 2026-01-20 15:09:29.620 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:09:29 np0005588920 nova_compute[226886]: 2026-01-20 15:09:29.628 231437 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:09:29 np0005588920 nova_compute[226886]: 2026-01-20 15:09:29.628 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[83ae0410-e155-46fe-b1df-e04d6103a4f8]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:29 np0005588920 nova_compute[226886]: 2026-01-20 15:09:29.630 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[c100ade7-7d23-48f8-8063-f0dbc4ff470e]: (4, '1190ec40-2b89-4358-8dec-733c5829fbed') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:29 np0005588920 nova_compute[226886]: 2026-01-20 15:09:29.630 226890 DEBUG oslo_concurrency.processutils [None req-ae52b400-8fdc-4b67-9f23-a80b08043c39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:09:29 np0005588920 nova_compute[226886]: 2026-01-20 15:09:29.663 226890 DEBUG oslo_concurrency.processutils [None req-ae52b400-8fdc-4b67-9f23-a80b08043c39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CMD "nvme version" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:09:29 np0005588920 nova_compute[226886]: 2026-01-20 15:09:29.665 226890 DEBUG os_brick.initiator.connectors.lightos [None req-ae52b400-8fdc-4b67-9f23-a80b08043c39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 20 10:09:29 np0005588920 nova_compute[226886]: 2026-01-20 15:09:29.665 226890 DEBUG os_brick.initiator.connectors.lightos [None req-ae52b400-8fdc-4b67-9f23-a80b08043c39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 20 10:09:29 np0005588920 nova_compute[226886]: 2026-01-20 15:09:29.666 226890 DEBUG os_brick.initiator.connectors.lightos [None req-ae52b400-8fdc-4b67-9f23-a80b08043c39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 20 10:09:29 np0005588920 nova_compute[226886]: 2026-01-20 15:09:29.666 226890 DEBUG os_brick.utils [None req-ae52b400-8fdc-4b67-9f23-a80b08043c39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] <== get_connector_properties: return (68ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:f7e7b2e5d28e', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '1190ec40-2b89-4358-8dec-733c5829fbed', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 20 10:09:29 np0005588920 nova_compute[226886]: 2026-01-20 15:09:29.666 226890 DEBUG nova.virt.block_device [None req-ae52b400-8fdc-4b67-9f23-a80b08043c39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Updating existing volume attachment record: ce0e8f37-e6e6-4da1-81ab-5ab109435e13 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 20 10:09:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:09:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:29.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:09:30 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:09:30 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1946179170' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:09:30 np0005588920 nova_compute[226886]: 2026-01-20 15:09:30.448 226890 DEBUG nova.objects.instance [None req-ae52b400-8fdc-4b67-9f23-a80b08043c39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lazy-loading 'flavor' on Instance uuid f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:09:30 np0005588920 nova_compute[226886]: 2026-01-20 15:09:30.469 226890 DEBUG nova.virt.libvirt.driver [None req-ae52b400-8fdc-4b67-9f23-a80b08043c39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Attempting to attach volume 7d67106f-2f4c-4925-94d7-b60a4418b999 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 20 10:09:30 np0005588920 nova_compute[226886]: 2026-01-20 15:09:30.471 226890 DEBUG nova.virt.libvirt.guest [None req-ae52b400-8fdc-4b67-9f23-a80b08043c39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] attach device xml: <disk type="network" device="disk">
Jan 20 10:09:30 np0005588920 nova_compute[226886]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 10:09:30 np0005588920 nova_compute[226886]:  <source protocol="rbd" name="volumes/volume-7d67106f-2f4c-4925-94d7-b60a4418b999">
Jan 20 10:09:30 np0005588920 nova_compute[226886]:    <host name="192.168.122.100" port="6789"/>
Jan 20 10:09:30 np0005588920 nova_compute[226886]:    <host name="192.168.122.102" port="6789"/>
Jan 20 10:09:30 np0005588920 nova_compute[226886]:    <host name="192.168.122.101" port="6789"/>
Jan 20 10:09:30 np0005588920 nova_compute[226886]:  </source>
Jan 20 10:09:30 np0005588920 nova_compute[226886]:  <auth username="openstack">
Jan 20 10:09:30 np0005588920 nova_compute[226886]:    <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:09:30 np0005588920 nova_compute[226886]:  </auth>
Jan 20 10:09:30 np0005588920 nova_compute[226886]:  <target dev="vdb" bus="virtio"/>
Jan 20 10:09:30 np0005588920 nova_compute[226886]:  <serial>7d67106f-2f4c-4925-94d7-b60a4418b999</serial>
Jan 20 10:09:30 np0005588920 nova_compute[226886]: </disk>
Jan 20 10:09:30 np0005588920 nova_compute[226886]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 20 10:09:30 np0005588920 nova_compute[226886]: 2026-01-20 15:09:30.598 226890 DEBUG nova.virt.libvirt.driver [None req-ae52b400-8fdc-4b67-9f23-a80b08043c39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:09:30 np0005588920 nova_compute[226886]: 2026-01-20 15:09:30.598 226890 DEBUG nova.virt.libvirt.driver [None req-ae52b400-8fdc-4b67-9f23-a80b08043c39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:09:30 np0005588920 nova_compute[226886]: 2026-01-20 15:09:30.599 226890 DEBUG nova.virt.libvirt.driver [None req-ae52b400-8fdc-4b67-9f23-a80b08043c39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:09:30 np0005588920 nova_compute[226886]: 2026-01-20 15:09:30.599 226890 DEBUG nova.virt.libvirt.driver [None req-ae52b400-8fdc-4b67-9f23-a80b08043c39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] No VIF found with MAC fa:16:3e:9d:e5:5b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:09:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:09:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:30.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:09:30 np0005588920 nova_compute[226886]: 2026-01-20 15:09:30.821 226890 DEBUG oslo_concurrency.lockutils [None req-ae52b400-8fdc-4b67-9f23-a80b08043c39 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 1.400s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:09:31 np0005588920 nova_compute[226886]: 2026-01-20 15:09:31.537 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:31 np0005588920 nova_compute[226886]: 2026-01-20 15:09:31.560 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:09:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:31.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:09:32 np0005588920 nova_compute[226886]: 2026-01-20 15:09:32.038 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:32 np0005588920 nova_compute[226886]: 2026-01-20 15:09:32.233 226890 INFO nova.compute.manager [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Rescuing#033[00m
Jan 20 10:09:32 np0005588920 nova_compute[226886]: 2026-01-20 15:09:32.233 226890 DEBUG oslo_concurrency.lockutils [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Acquiring lock "refresh_cache-f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:09:32 np0005588920 nova_compute[226886]: 2026-01-20 15:09:32.234 226890 DEBUG oslo_concurrency.lockutils [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Acquired lock "refresh_cache-f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:09:32 np0005588920 nova_compute[226886]: 2026-01-20 15:09:32.234 226890 DEBUG nova.network.neutron [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:09:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e382 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:09:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:09:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:32.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:09:33 np0005588920 nova_compute[226886]: 2026-01-20 15:09:33.507 226890 DEBUG nova.network.neutron [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Updating instance_info_cache with network_info: [{"id": "5c6391be-1db0-417c-a94e-89eb0cdbd8e7", "address": "fa:16:3e:9d:e5:5b", "network": {"id": "3967ae21-1590-4685-8881-8bd1bcf25258", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fc74c4a296554866969b05aef75252af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c6391be-1d", "ovs_interfaceid": "5c6391be-1db0-417c-a94e-89eb0cdbd8e7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:09:33 np0005588920 nova_compute[226886]: 2026-01-20 15:09:33.524 226890 DEBUG oslo_concurrency.lockutils [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Releasing lock "refresh_cache-f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:09:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:09:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:33.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:09:33 np0005588920 nova_compute[226886]: 2026-01-20 15:09:33.893 226890 DEBUG nova.virt.libvirt.driver [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 20 10:09:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:34.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:35.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:35 np0005588920 nova_compute[226886]: 2026-01-20 15:09:35.980 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:36 np0005588920 kernel: tap5c6391be-1d (unregistering): left promiscuous mode
Jan 20 10:09:36 np0005588920 NetworkManager[49076]: <info>  [1768921776.3218] device (tap5c6391be-1d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:09:36 np0005588920 nova_compute[226886]: 2026-01-20 15:09:36.336 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:36 np0005588920 ovn_controller[133971]: 2026-01-20T15:09:36Z|00802|binding|INFO|Releasing lport 5c6391be-1db0-417c-a94e-89eb0cdbd8e7 from this chassis (sb_readonly=0)
Jan 20 10:09:36 np0005588920 ovn_controller[133971]: 2026-01-20T15:09:36Z|00803|binding|INFO|Setting lport 5c6391be-1db0-417c-a94e-89eb0cdbd8e7 down in Southbound
Jan 20 10:09:36 np0005588920 ovn_controller[133971]: 2026-01-20T15:09:36Z|00804|binding|INFO|Removing iface tap5c6391be-1d ovn-installed in OVS
Jan 20 10:09:36 np0005588920 nova_compute[226886]: 2026-01-20 15:09:36.338 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:36 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:36.343 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9d:e5:5b 10.100.0.12'], port_security=['fa:16:3e:9d:e5:5b 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3967ae21-1590-4685-8881-8bd1bcf25258', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fc74c4a296554866969b05aef75252af', 'neutron:revision_number': '4', 'neutron:security_group_ids': '27c5bfa6-4744-47cb-ac6a-c4a6b0a5e776', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.173'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=89ced88f-b1ed-4329-8a53-1931e6b0e3e9, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=5c6391be-1db0-417c-a94e-89eb0cdbd8e7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:09:36 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:36.344 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 5c6391be-1db0-417c-a94e-89eb0cdbd8e7 in datapath 3967ae21-1590-4685-8881-8bd1bcf25258 unbound from our chassis#033[00m
Jan 20 10:09:36 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:36.345 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3967ae21-1590-4685-8881-8bd1bcf25258#033[00m
Jan 20 10:09:36 np0005588920 nova_compute[226886]: 2026-01-20 15:09:36.351 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:36 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:36.364 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f67b909b-bb15-4307-83b4-906313dd5233]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:36 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:36.395 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[6531627c-7300-4531-bc9f-6700c12ac367]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:36 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:36.397 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[5419c7b3-7c11-4082-bf0a-1eb50bd8e692]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:36 np0005588920 systemd[1]: machine-qemu\x2d82\x2dinstance\x2d000000ab.scope: Deactivated successfully.
Jan 20 10:09:36 np0005588920 systemd[1]: machine-qemu\x2d82\x2dinstance\x2d000000ab.scope: Consumed 16.146s CPU time.
Jan 20 10:09:36 np0005588920 systemd-machined[196121]: Machine qemu-82-instance-000000ab terminated.
Jan 20 10:09:36 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:36.426 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[429b261a-c87d-41a0-8745-964dd119bf68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:36 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:36.441 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[64a3fc8e-fe90-4176-b17b-20a670e5df29]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3967ae21-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b5:ce:9d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 8, 'rx_bytes': 700, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 8, 'rx_bytes': 700, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 253], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 677247, 'reachable_time': 44418, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 290850, 'error': None, 'target': 'ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:36 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:36.460 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d3db3f7a-ba33-4539-9289-2b5e03f960e3]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap3967ae21-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 677257, 'tstamp': 677257}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 290851, 'error': None, 'target': 'ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap3967ae21-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 677260, 'tstamp': 677260}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 290851, 'error': None, 'target': 'ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:36 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:36.461 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3967ae21-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:09:36 np0005588920 nova_compute[226886]: 2026-01-20 15:09:36.463 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:36 np0005588920 nova_compute[226886]: 2026-01-20 15:09:36.466 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:36 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:36.467 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3967ae21-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:09:36 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:36.467 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:09:36 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:36.467 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3967ae21-10, col_values=(('external_ids', {'iface-id': 'b19d6956-fc8d-42c8-af98-d0f2fe9fe3a3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:09:36 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:36.468 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:09:36 np0005588920 nova_compute[226886]: 2026-01-20 15:09:36.539 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:36.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:36 np0005588920 nova_compute[226886]: 2026-01-20 15:09:36.905 226890 INFO nova.virt.libvirt.driver [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Instance shutdown successfully after 3 seconds.#033[00m
Jan 20 10:09:36 np0005588920 nova_compute[226886]: 2026-01-20 15:09:36.910 226890 INFO nova.virt.libvirt.driver [-] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Instance destroyed successfully.#033[00m
Jan 20 10:09:36 np0005588920 nova_compute[226886]: 2026-01-20 15:09:36.910 226890 DEBUG nova.objects.instance [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lazy-loading 'numa_topology' on Instance uuid f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:09:36 np0005588920 nova_compute[226886]: 2026-01-20 15:09:36.931 226890 INFO nova.virt.libvirt.driver [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Attempting rescue#033[00m
Jan 20 10:09:36 np0005588920 nova_compute[226886]: 2026-01-20 15:09:36.931 226890 DEBUG nova.virt.libvirt.driver [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314#033[00m
Jan 20 10:09:36 np0005588920 nova_compute[226886]: 2026-01-20 15:09:36.935 226890 DEBUG nova.virt.libvirt.driver [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 20 10:09:36 np0005588920 nova_compute[226886]: 2026-01-20 15:09:36.935 226890 INFO nova.virt.libvirt.driver [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Creating image(s)#033[00m
Jan 20 10:09:36 np0005588920 nova_compute[226886]: 2026-01-20 15:09:36.958 226890 DEBUG nova.storage.rbd_utils [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] rbd image f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:09:36 np0005588920 nova_compute[226886]: 2026-01-20 15:09:36.962 226890 DEBUG nova.objects.instance [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lazy-loading 'trusted_certs' on Instance uuid f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:09:37 np0005588920 nova_compute[226886]: 2026-01-20 15:09:37.003 226890 DEBUG nova.storage.rbd_utils [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] rbd image f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:09:37 np0005588920 nova_compute[226886]: 2026-01-20 15:09:37.030 226890 DEBUG nova.storage.rbd_utils [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] rbd image f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:09:37 np0005588920 nova_compute[226886]: 2026-01-20 15:09:37.034 226890 DEBUG oslo_concurrency.processutils [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:09:37 np0005588920 nova_compute[226886]: 2026-01-20 15:09:37.065 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:37 np0005588920 nova_compute[226886]: 2026-01-20 15:09:37.110 226890 DEBUG oslo_concurrency.processutils [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:09:37 np0005588920 nova_compute[226886]: 2026-01-20 15:09:37.111 226890 DEBUG oslo_concurrency.lockutils [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:09:37 np0005588920 nova_compute[226886]: 2026-01-20 15:09:37.112 226890 DEBUG oslo_concurrency.lockutils [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:09:37 np0005588920 nova_compute[226886]: 2026-01-20 15:09:37.112 226890 DEBUG oslo_concurrency.lockutils [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:09:37 np0005588920 nova_compute[226886]: 2026-01-20 15:09:37.136 226890 DEBUG nova.storage.rbd_utils [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] rbd image f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:09:37 np0005588920 nova_compute[226886]: 2026-01-20 15:09:37.139 226890 DEBUG oslo_concurrency.processutils [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1_disk.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:09:37 np0005588920 nova_compute[226886]: 2026-01-20 15:09:37.403 226890 DEBUG oslo_concurrency.processutils [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1_disk.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.264s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:09:37 np0005588920 nova_compute[226886]: 2026-01-20 15:09:37.404 226890 DEBUG nova.objects.instance [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lazy-loading 'migration_context' on Instance uuid f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:09:37 np0005588920 nova_compute[226886]: 2026-01-20 15:09:37.415 226890 DEBUG nova.virt.libvirt.driver [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 10:09:37 np0005588920 nova_compute[226886]: 2026-01-20 15:09:37.416 226890 DEBUG nova.virt.libvirt.driver [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Start _get_guest_xml network_info=[{"id": "5c6391be-1db0-417c-a94e-89eb0cdbd8e7", "address": "fa:16:3e:9d:e5:5b", "network": {"id": "3967ae21-1590-4685-8881-8bd1bcf25258", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "vif_mac": "fa:16:3e:9d:e5:5b"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fc74c4a296554866969b05aef75252af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c6391be-1d", "ovs_interfaceid": "5c6391be-1db0-417c-a94e-89eb0cdbd8e7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:09:37 np0005588920 nova_compute[226886]: 2026-01-20 15:09:37.416 226890 DEBUG nova.objects.instance [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lazy-loading 'resources' on Instance uuid f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:09:37 np0005588920 nova_compute[226886]: 2026-01-20 15:09:37.430 226890 WARNING nova.virt.libvirt.driver [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:09:37 np0005588920 nova_compute[226886]: 2026-01-20 15:09:37.438 226890 DEBUG nova.virt.libvirt.host [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:09:37 np0005588920 nova_compute[226886]: 2026-01-20 15:09:37.438 226890 DEBUG nova.virt.libvirt.host [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:09:37 np0005588920 nova_compute[226886]: 2026-01-20 15:09:37.441 226890 DEBUG nova.virt.libvirt.host [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:09:37 np0005588920 nova_compute[226886]: 2026-01-20 15:09:37.441 226890 DEBUG nova.virt.libvirt.host [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:09:37 np0005588920 nova_compute[226886]: 2026-01-20 15:09:37.442 226890 DEBUG nova.virt.libvirt.driver [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:09:37 np0005588920 nova_compute[226886]: 2026-01-20 15:09:37.443 226890 DEBUG nova.virt.hardware [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:09:37 np0005588920 nova_compute[226886]: 2026-01-20 15:09:37.443 226890 DEBUG nova.virt.hardware [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:09:37 np0005588920 nova_compute[226886]: 2026-01-20 15:09:37.443 226890 DEBUG nova.virt.hardware [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:09:37 np0005588920 nova_compute[226886]: 2026-01-20 15:09:37.444 226890 DEBUG nova.virt.hardware [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:09:37 np0005588920 nova_compute[226886]: 2026-01-20 15:09:37.444 226890 DEBUG nova.virt.hardware [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:09:37 np0005588920 nova_compute[226886]: 2026-01-20 15:09:37.444 226890 DEBUG nova.virt.hardware [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:09:37 np0005588920 nova_compute[226886]: 2026-01-20 15:09:37.444 226890 DEBUG nova.virt.hardware [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:09:37 np0005588920 nova_compute[226886]: 2026-01-20 15:09:37.445 226890 DEBUG nova.virt.hardware [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:09:37 np0005588920 nova_compute[226886]: 2026-01-20 15:09:37.445 226890 DEBUG nova.virt.hardware [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:09:37 np0005588920 nova_compute[226886]: 2026-01-20 15:09:37.445 226890 DEBUG nova.virt.hardware [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:09:37 np0005588920 nova_compute[226886]: 2026-01-20 15:09:37.445 226890 DEBUG nova.virt.hardware [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:09:37 np0005588920 nova_compute[226886]: 2026-01-20 15:09:37.446 226890 DEBUG nova.objects.instance [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lazy-loading 'vcpu_model' on Instance uuid f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:09:37 np0005588920 nova_compute[226886]: 2026-01-20 15:09:37.459 226890 DEBUG oslo_concurrency.processutils [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:09:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e382 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:09:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:37.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:09:37 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3710696462' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:09:37 np0005588920 nova_compute[226886]: 2026-01-20 15:09:37.902 226890 DEBUG oslo_concurrency.processutils [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:09:37 np0005588920 nova_compute[226886]: 2026-01-20 15:09:37.904 226890 DEBUG oslo_concurrency.processutils [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:09:38 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:09:38 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2564933822' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:09:38 np0005588920 nova_compute[226886]: 2026-01-20 15:09:38.357 226890 DEBUG oslo_concurrency.processutils [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:09:38 np0005588920 nova_compute[226886]: 2026-01-20 15:09:38.359 226890 DEBUG oslo_concurrency.processutils [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:09:38 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:09:38 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/628547865' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:09:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:38.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:38 np0005588920 nova_compute[226886]: 2026-01-20 15:09:38.849 226890 DEBUG oslo_concurrency.processutils [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.490s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:09:38 np0005588920 nova_compute[226886]: 2026-01-20 15:09:38.852 226890 DEBUG nova.virt.libvirt.vif [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:08:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1321021662',display_name='tempest-ServerRescueNegativeTestJSON-server-1321021662',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-1321021662',id=171,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB/7uXKxe9nAPl14X3fTp2ccJ5Be2YZUIKmP54MYpo0vFkLM4vJyo+K5ySbdF/GxgqpIyyKbMYSgP6x/brvrQahBSInMKWnh7cc52EbXbOHcGpmF2QhgzpimmyzN8oX4hw==',key_name='tempest-keypair-1299086936',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:09:08Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fc74c4a296554866969b05aef75252af',ramdisk_id='',reservation_id='r-4rgzfbcx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueNegativeTestJSON-1649662639',owner_user_name='tempest-ServerRescueNegativeTestJSON-1649662639-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:09:08Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='27658864f96d453586dd0846a4c55b7d',uuid=f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5c6391be-1db0-417c-a94e-89eb0cdbd8e7", "address": "fa:16:3e:9d:e5:5b", "network": {"id": "3967ae21-1590-4685-8881-8bd1bcf25258", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "vif_mac": "fa:16:3e:9d:e5:5b"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fc74c4a296554866969b05aef75252af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c6391be-1d", "ovs_interfaceid": "5c6391be-1db0-417c-a94e-89eb0cdbd8e7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:09:38 np0005588920 nova_compute[226886]: 2026-01-20 15:09:38.853 226890 DEBUG nova.network.os_vif_util [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Converting VIF {"id": "5c6391be-1db0-417c-a94e-89eb0cdbd8e7", "address": "fa:16:3e:9d:e5:5b", "network": {"id": "3967ae21-1590-4685-8881-8bd1bcf25258", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "vif_mac": "fa:16:3e:9d:e5:5b"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fc74c4a296554866969b05aef75252af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c6391be-1d", "ovs_interfaceid": "5c6391be-1db0-417c-a94e-89eb0cdbd8e7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:09:38 np0005588920 nova_compute[226886]: 2026-01-20 15:09:38.854 226890 DEBUG nova.network.os_vif_util [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9d:e5:5b,bridge_name='br-int',has_traffic_filtering=True,id=5c6391be-1db0-417c-a94e-89eb0cdbd8e7,network=Network(3967ae21-1590-4685-8881-8bd1bcf25258),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5c6391be-1d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:09:38 np0005588920 nova_compute[226886]: 2026-01-20 15:09:38.855 226890 DEBUG nova.objects.instance [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lazy-loading 'pci_devices' on Instance uuid f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:09:38 np0005588920 nova_compute[226886]: 2026-01-20 15:09:38.877 226890 DEBUG nova.virt.libvirt.driver [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:09:38 np0005588920 nova_compute[226886]:  <uuid>f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1</uuid>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:  <name>instance-000000ab</name>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:09:38 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:      <nova:name>tempest-ServerRescueNegativeTestJSON-server-1321021662</nova:name>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 15:09:37</nova:creationTime>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 10:09:38 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:        <nova:user uuid="27658864f96d453586dd0846a4c55b7d">tempest-ServerRescueNegativeTestJSON-1649662639-project-member</nova:user>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:        <nova:project uuid="fc74c4a296554866969b05aef75252af">tempest-ServerRescueNegativeTestJSON-1649662639</nova:project>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:        <nova:port uuid="5c6391be-1db0-417c-a94e-89eb0cdbd8e7">
Jan 20 10:09:38 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    <system>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:      <entry name="serial">f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1</entry>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:      <entry name="uuid">f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1</entry>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    </system>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:  <os>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:  </os>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:  <features>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:  </features>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:  </clock>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:  <devices>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 10:09:38 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1_disk.rescue">
Jan 20 10:09:38 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:09:38 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 10:09:38 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1_disk">
Jan 20 10:09:38 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:09:38 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:      <target dev="vdb" bus="virtio"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 10:09:38 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1_disk.config.rescue">
Jan 20 10:09:38 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:09:38 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 10:09:38 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:9d:e5:5b"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:      <target dev="tap5c6391be-1d"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    </interface>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 10:09:38 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1/console.log" append="off"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    </serial>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    <video>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    </video>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 10:09:38 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    </rng>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 10:09:38 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 10:09:38 np0005588920 nova_compute[226886]:  </devices>
Jan 20 10:09:38 np0005588920 nova_compute[226886]: </domain>
Jan 20 10:09:38 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 20 10:09:38 np0005588920 nova_compute[226886]: 2026-01-20 15:09:38.886 226890 INFO nova.virt.libvirt.driver [-] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Instance destroyed successfully.
Jan 20 10:09:38 np0005588920 nova_compute[226886]: 2026-01-20 15:09:38.939 226890 DEBUG nova.virt.libvirt.driver [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 20 10:09:38 np0005588920 nova_compute[226886]: 2026-01-20 15:09:38.940 226890 DEBUG nova.virt.libvirt.driver [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 20 10:09:38 np0005588920 nova_compute[226886]: 2026-01-20 15:09:38.940 226890 DEBUG nova.virt.libvirt.driver [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 20 10:09:38 np0005588920 nova_compute[226886]: 2026-01-20 15:09:38.940 226890 DEBUG nova.virt.libvirt.driver [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] No VIF found with MAC fa:16:3e:9d:e5:5b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 20 10:09:38 np0005588920 nova_compute[226886]: 2026-01-20 15:09:38.941 226890 INFO nova.virt.libvirt.driver [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Using config drive
Jan 20 10:09:38 np0005588920 nova_compute[226886]: 2026-01-20 15:09:38.962 226890 DEBUG nova.storage.rbd_utils [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] rbd image f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:09:38 np0005588920 nova_compute[226886]: 2026-01-20 15:09:38.982 226890 DEBUG nova.objects.instance [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lazy-loading 'ec2_ids' on Instance uuid f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 10:09:39 np0005588920 nova_compute[226886]: 2026-01-20 15:09:39.011 226890 DEBUG nova.objects.instance [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lazy-loading 'keypairs' on Instance uuid f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 10:09:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:39.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:40 np0005588920 nova_compute[226886]: 2026-01-20 15:09:40.724 226890 INFO nova.virt.libvirt.driver [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Creating config drive at /var/lib/nova/instances/f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1/disk.config.rescue
Jan 20 10:09:40 np0005588920 nova_compute[226886]: 2026-01-20 15:09:40.730 226890 DEBUG oslo_concurrency.processutils [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7m4nvmq6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:09:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:40.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:40 np0005588920 nova_compute[226886]: 2026-01-20 15:09:40.865 226890 DEBUG oslo_concurrency.processutils [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7m4nvmq6" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:09:40 np0005588920 nova_compute[226886]: 2026-01-20 15:09:40.892 226890 DEBUG nova.storage.rbd_utils [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] rbd image f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:09:40 np0005588920 nova_compute[226886]: 2026-01-20 15:09:40.896 226890 DEBUG oslo_concurrency.processutils [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1/disk.config.rescue f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:09:41 np0005588920 nova_compute[226886]: 2026-01-20 15:09:41.039 226890 DEBUG oslo_concurrency.processutils [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1/disk.config.rescue f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.143s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:09:41 np0005588920 nova_compute[226886]: 2026-01-20 15:09:41.040 226890 INFO nova.virt.libvirt.driver [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Deleting local config drive /var/lib/nova/instances/f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1/disk.config.rescue because it was imported into RBD.
Jan 20 10:09:41 np0005588920 kernel: tap5c6391be-1d: entered promiscuous mode
Jan 20 10:09:41 np0005588920 NetworkManager[49076]: <info>  [1768921781.0913] manager: (tap5c6391be-1d): new Tun device (/org/freedesktop/NetworkManager/Devices/383)
Jan 20 10:09:41 np0005588920 ovn_controller[133971]: 2026-01-20T15:09:41Z|00805|binding|INFO|Claiming lport 5c6391be-1db0-417c-a94e-89eb0cdbd8e7 for this chassis.
Jan 20 10:09:41 np0005588920 ovn_controller[133971]: 2026-01-20T15:09:41Z|00806|binding|INFO|5c6391be-1db0-417c-a94e-89eb0cdbd8e7: Claiming fa:16:3e:9d:e5:5b 10.100.0.12
Jan 20 10:09:41 np0005588920 nova_compute[226886]: 2026-01-20 15:09:41.093 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:09:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:41.100 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9d:e5:5b 10.100.0.12'], port_security=['fa:16:3e:9d:e5:5b 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3967ae21-1590-4685-8881-8bd1bcf25258', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fc74c4a296554866969b05aef75252af', 'neutron:revision_number': '4', 'neutron:security_group_ids': '27c5bfa6-4744-47cb-ac6a-c4a6b0a5e776', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.173'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=89ced88f-b1ed-4329-8a53-1931e6b0e3e9, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=5c6391be-1db0-417c-a94e-89eb0cdbd8e7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 20 10:09:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:41.102 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 5c6391be-1db0-417c-a94e-89eb0cdbd8e7 in datapath 3967ae21-1590-4685-8881-8bd1bcf25258 bound to our chassis
Jan 20 10:09:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:41.105 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3967ae21-1590-4685-8881-8bd1bcf25258
Jan 20 10:09:41 np0005588920 ovn_controller[133971]: 2026-01-20T15:09:41Z|00807|binding|INFO|Setting lport 5c6391be-1db0-417c-a94e-89eb0cdbd8e7 ovn-installed in OVS
Jan 20 10:09:41 np0005588920 ovn_controller[133971]: 2026-01-20T15:09:41Z|00808|binding|INFO|Setting lport 5c6391be-1db0-417c-a94e-89eb0cdbd8e7 up in Southbound
Jan 20 10:09:41 np0005588920 nova_compute[226886]: 2026-01-20 15:09:41.118 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:09:41 np0005588920 systemd-udevd[291097]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:09:41 np0005588920 systemd-machined[196121]: New machine qemu-83-instance-000000ab.
Jan 20 10:09:41 np0005588920 nova_compute[226886]: 2026-01-20 15:09:41.124 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:09:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:41.125 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[913291ab-090c-457f-9c3a-204876a7f019]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:09:41 np0005588920 NetworkManager[49076]: <info>  [1768921781.1345] device (tap5c6391be-1d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:09:41 np0005588920 NetworkManager[49076]: <info>  [1768921781.1356] device (tap5c6391be-1d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:09:41 np0005588920 systemd[1]: Started Virtual Machine qemu-83-instance-000000ab.
Jan 20 10:09:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:41.156 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[17f3782c-eb4e-4496-9477-47349a8e817c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:09:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:41.159 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[bc68e5be-5a76-447a-8b4f-6f2fb864c26a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:09:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:41.184 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[a1da12fb-deb5-486b-8543-e5b39b1cad1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:09:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:41.202 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[50cd6011-827e-4be2-8f06-9bf64ac3199c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3967ae21-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b5:ce:9d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 10, 'rx_bytes': 700, 'tx_bytes': 612, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 10, 'rx_bytes': 700, 'tx_bytes': 612, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 253], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 677247, 'reachable_time': 44418, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 291109, 'error': None, 'target': 'ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:09:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:41.216 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[cae06d88-9b7c-44e7-aa3a-0a0c0ac65029]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap3967ae21-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 677257, 'tstamp': 677257}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 291111, 'error': None, 'target': 'ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap3967ae21-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 677260, 'tstamp': 677260}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 291111, 'error': None, 'target': 'ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:09:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:41.217 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3967ae21-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 10:09:41 np0005588920 nova_compute[226886]: 2026-01-20 15:09:41.218 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:09:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:41.219 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3967ae21-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 10:09:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:41.219 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 20 10:09:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:41.220 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3967ae21-10, col_values=(('external_ids', {'iface-id': 'b19d6956-fc8d-42c8-af98-d0f2fe9fe3a3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 10:09:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:41.220 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 20 10:09:41 np0005588920 nova_compute[226886]: 2026-01-20 15:09:41.541 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:09:41 np0005588920 nova_compute[226886]: 2026-01-20 15:09:41.583 226890 DEBUG nova.compute.manager [req-01f600d0-4afb-492c-ab56-7c77022aa1cc req-3359d0f1-7007-4007-a729-2e5739c998c0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Received event network-vif-unplugged-5c6391be-1db0-417c-a94e-89eb0cdbd8e7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 10:09:41 np0005588920 nova_compute[226886]: 2026-01-20 15:09:41.583 226890 DEBUG oslo_concurrency.lockutils [req-01f600d0-4afb-492c-ab56-7c77022aa1cc req-3359d0f1-7007-4007-a729-2e5739c998c0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:09:41 np0005588920 nova_compute[226886]: 2026-01-20 15:09:41.583 226890 DEBUG oslo_concurrency.lockutils [req-01f600d0-4afb-492c-ab56-7c77022aa1cc req-3359d0f1-7007-4007-a729-2e5739c998c0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:09:41 np0005588920 nova_compute[226886]: 2026-01-20 15:09:41.584 226890 DEBUG oslo_concurrency.lockutils [req-01f600d0-4afb-492c-ab56-7c77022aa1cc req-3359d0f1-7007-4007-a729-2e5739c998c0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:09:41 np0005588920 nova_compute[226886]: 2026-01-20 15:09:41.584 226890 DEBUG nova.compute.manager [req-01f600d0-4afb-492c-ab56-7c77022aa1cc req-3359d0f1-7007-4007-a729-2e5739c998c0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] No waiting events found dispatching network-vif-unplugged-5c6391be-1db0-417c-a94e-89eb0cdbd8e7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 20 10:09:41 np0005588920 nova_compute[226886]: 2026-01-20 15:09:41.585 226890 WARNING nova.compute.manager [req-01f600d0-4afb-492c-ab56-7c77022aa1cc req-3359d0f1-7007-4007-a729-2e5739c998c0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Received unexpected event network-vif-unplugged-5c6391be-1db0-417c-a94e-89eb0cdbd8e7 for instance with vm_state active and task_state rescuing.
Jan 20 10:09:41 np0005588920 nova_compute[226886]: 2026-01-20 15:09:41.758 226890 DEBUG nova.virt.libvirt.host [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Removed pending event for f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 20 10:09:41 np0005588920 nova_compute[226886]: 2026-01-20 15:09:41.758 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921781.7574515, f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 20 10:09:41 np0005588920 nova_compute[226886]: 2026-01-20 15:09:41.759 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] VM Resumed (Lifecycle Event)
Jan 20 10:09:41 np0005588920 nova_compute[226886]: 2026-01-20 15:09:41.763 226890 DEBUG nova.compute.manager [None req-ab9b6a89-e9bc-4378-b6e3-dfdd48a157b1 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 10:09:41 np0005588920 nova_compute[226886]: 2026-01-20 15:09:41.813 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 10:09:41 np0005588920 nova_compute[226886]: 2026-01-20 15:09:41.817 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 20 10:09:41 np0005588920 nova_compute[226886]: 2026-01-20 15:09:41.856 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] During sync_power_state the instance has a pending task (rescuing). Skip.
Jan 20 10:09:41 np0005588920 nova_compute[226886]: 2026-01-20 15:09:41.856 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921781.7586198, f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 20 10:09:41 np0005588920 nova_compute[226886]: 2026-01-20 15:09:41.857 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] VM Started (Lifecycle Event)
Jan 20 10:09:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:41.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:42 np0005588920 nova_compute[226886]: 2026-01-20 15:09:42.029 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:09:42 np0005588920 nova_compute[226886]: 2026-01-20 15:09:42.032 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:09:42 np0005588920 nova_compute[226886]: 2026-01-20 15:09:42.041 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:42 np0005588920 nova_compute[226886]: 2026-01-20 15:09:42.123 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e382 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:09:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:42.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:43 np0005588920 nova_compute[226886]: 2026-01-20 15:09:43.828 226890 DEBUG nova.compute.manager [req-f3fd24cb-eaf5-4a24-be93-fb1e2028a014 req-3c80064d-91d6-405f-aaa5-91b50badb163 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Received event network-vif-plugged-5c6391be-1db0-417c-a94e-89eb0cdbd8e7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:09:43 np0005588920 nova_compute[226886]: 2026-01-20 15:09:43.829 226890 DEBUG oslo_concurrency.lockutils [req-f3fd24cb-eaf5-4a24-be93-fb1e2028a014 req-3c80064d-91d6-405f-aaa5-91b50badb163 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:09:43 np0005588920 nova_compute[226886]: 2026-01-20 15:09:43.829 226890 DEBUG oslo_concurrency.lockutils [req-f3fd24cb-eaf5-4a24-be93-fb1e2028a014 req-3c80064d-91d6-405f-aaa5-91b50badb163 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:09:43 np0005588920 nova_compute[226886]: 2026-01-20 15:09:43.830 226890 DEBUG oslo_concurrency.lockutils [req-f3fd24cb-eaf5-4a24-be93-fb1e2028a014 req-3c80064d-91d6-405f-aaa5-91b50badb163 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:09:43 np0005588920 nova_compute[226886]: 2026-01-20 15:09:43.830 226890 DEBUG nova.compute.manager [req-f3fd24cb-eaf5-4a24-be93-fb1e2028a014 req-3c80064d-91d6-405f-aaa5-91b50badb163 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] No waiting events found dispatching network-vif-plugged-5c6391be-1db0-417c-a94e-89eb0cdbd8e7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:09:43 np0005588920 nova_compute[226886]: 2026-01-20 15:09:43.830 226890 WARNING nova.compute.manager [req-f3fd24cb-eaf5-4a24-be93-fb1e2028a014 req-3c80064d-91d6-405f-aaa5-91b50badb163 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Received unexpected event network-vif-plugged-5c6391be-1db0-417c-a94e-89eb0cdbd8e7 for instance with vm_state rescued and task_state None.#033[00m
Jan 20 10:09:43 np0005588920 nova_compute[226886]: 2026-01-20 15:09:43.831 226890 DEBUG nova.compute.manager [req-f3fd24cb-eaf5-4a24-be93-fb1e2028a014 req-3c80064d-91d6-405f-aaa5-91b50badb163 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Received event network-vif-plugged-5c6391be-1db0-417c-a94e-89eb0cdbd8e7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:09:43 np0005588920 nova_compute[226886]: 2026-01-20 15:09:43.831 226890 DEBUG oslo_concurrency.lockutils [req-f3fd24cb-eaf5-4a24-be93-fb1e2028a014 req-3c80064d-91d6-405f-aaa5-91b50badb163 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:09:43 np0005588920 nova_compute[226886]: 2026-01-20 15:09:43.831 226890 DEBUG oslo_concurrency.lockutils [req-f3fd24cb-eaf5-4a24-be93-fb1e2028a014 req-3c80064d-91d6-405f-aaa5-91b50badb163 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:09:43 np0005588920 nova_compute[226886]: 2026-01-20 15:09:43.831 226890 DEBUG oslo_concurrency.lockutils [req-f3fd24cb-eaf5-4a24-be93-fb1e2028a014 req-3c80064d-91d6-405f-aaa5-91b50badb163 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:09:43 np0005588920 nova_compute[226886]: 2026-01-20 15:09:43.832 226890 DEBUG nova.compute.manager [req-f3fd24cb-eaf5-4a24-be93-fb1e2028a014 req-3c80064d-91d6-405f-aaa5-91b50badb163 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] No waiting events found dispatching network-vif-plugged-5c6391be-1db0-417c-a94e-89eb0cdbd8e7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:09:43 np0005588920 nova_compute[226886]: 2026-01-20 15:09:43.832 226890 WARNING nova.compute.manager [req-f3fd24cb-eaf5-4a24-be93-fb1e2028a014 req-3c80064d-91d6-405f-aaa5-91b50badb163 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Received unexpected event network-vif-plugged-5c6391be-1db0-417c-a94e-89eb0cdbd8e7 for instance with vm_state rescued and task_state None.#033[00m
Jan 20 10:09:43 np0005588920 nova_compute[226886]: 2026-01-20 15:09:43.832 226890 DEBUG nova.compute.manager [req-f3fd24cb-eaf5-4a24-be93-fb1e2028a014 req-3c80064d-91d6-405f-aaa5-91b50badb163 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Received event network-vif-plugged-5c6391be-1db0-417c-a94e-89eb0cdbd8e7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:09:43 np0005588920 nova_compute[226886]: 2026-01-20 15:09:43.833 226890 DEBUG oslo_concurrency.lockutils [req-f3fd24cb-eaf5-4a24-be93-fb1e2028a014 req-3c80064d-91d6-405f-aaa5-91b50badb163 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:09:43 np0005588920 nova_compute[226886]: 2026-01-20 15:09:43.833 226890 DEBUG oslo_concurrency.lockutils [req-f3fd24cb-eaf5-4a24-be93-fb1e2028a014 req-3c80064d-91d6-405f-aaa5-91b50badb163 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:09:43 np0005588920 nova_compute[226886]: 2026-01-20 15:09:43.833 226890 DEBUG oslo_concurrency.lockutils [req-f3fd24cb-eaf5-4a24-be93-fb1e2028a014 req-3c80064d-91d6-405f-aaa5-91b50badb163 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:09:43 np0005588920 nova_compute[226886]: 2026-01-20 15:09:43.834 226890 DEBUG nova.compute.manager [req-f3fd24cb-eaf5-4a24-be93-fb1e2028a014 req-3c80064d-91d6-405f-aaa5-91b50badb163 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] No waiting events found dispatching network-vif-plugged-5c6391be-1db0-417c-a94e-89eb0cdbd8e7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:09:43 np0005588920 nova_compute[226886]: 2026-01-20 15:09:43.834 226890 WARNING nova.compute.manager [req-f3fd24cb-eaf5-4a24-be93-fb1e2028a014 req-3c80064d-91d6-405f-aaa5-91b50badb163 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Received unexpected event network-vif-plugged-5c6391be-1db0-417c-a94e-89eb0cdbd8e7 for instance with vm_state rescued and task_state None.#033[00m
Jan 20 10:09:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:43.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:43 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e383 e383: 3 total, 3 up, 3 in
Jan 20 10:09:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:44.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:45 np0005588920 podman[291172]: 2026-01-20 15:09:45.047830293 +0000 UTC m=+0.118687647 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 20 10:09:45 np0005588920 nova_compute[226886]: 2026-01-20 15:09:45.353 226890 INFO nova.compute.manager [None req-6cab8748-1180-4b00-b21c-f7c5e5007ccf 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Unrescuing#033[00m
Jan 20 10:09:45 np0005588920 nova_compute[226886]: 2026-01-20 15:09:45.354 226890 DEBUG oslo_concurrency.lockutils [None req-6cab8748-1180-4b00-b21c-f7c5e5007ccf 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Acquiring lock "refresh_cache-f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:09:45 np0005588920 nova_compute[226886]: 2026-01-20 15:09:45.355 226890 DEBUG oslo_concurrency.lockutils [None req-6cab8748-1180-4b00-b21c-f7c5e5007ccf 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Acquired lock "refresh_cache-f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:09:45 np0005588920 nova_compute[226886]: 2026-01-20 15:09:45.355 226890 DEBUG nova.network.neutron [None req-6cab8748-1180-4b00-b21c-f7c5e5007ccf 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:09:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:45.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:45 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e384 e384: 3 total, 3 up, 3 in
Jan 20 10:09:46 np0005588920 nova_compute[226886]: 2026-01-20 15:09:46.543 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:46.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:47 np0005588920 nova_compute[226886]: 2026-01-20 15:09:47.044 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:47 np0005588920 nova_compute[226886]: 2026-01-20 15:09:47.697 226890 DEBUG nova.network.neutron [None req-6cab8748-1180-4b00-b21c-f7c5e5007ccf 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Updating instance_info_cache with network_info: [{"id": "5c6391be-1db0-417c-a94e-89eb0cdbd8e7", "address": "fa:16:3e:9d:e5:5b", "network": {"id": "3967ae21-1590-4685-8881-8bd1bcf25258", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fc74c4a296554866969b05aef75252af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c6391be-1d", "ovs_interfaceid": "5c6391be-1db0-417c-a94e-89eb0cdbd8e7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:09:47 np0005588920 nova_compute[226886]: 2026-01-20 15:09:47.709 226890 DEBUG oslo_concurrency.lockutils [None req-6cab8748-1180-4b00-b21c-f7c5e5007ccf 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Releasing lock "refresh_cache-f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:09:47 np0005588920 nova_compute[226886]: 2026-01-20 15:09:47.710 226890 DEBUG nova.objects.instance [None req-6cab8748-1180-4b00-b21c-f7c5e5007ccf 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lazy-loading 'flavor' on Instance uuid f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:09:47 np0005588920 kernel: tap5c6391be-1d (unregistering): left promiscuous mode
Jan 20 10:09:47 np0005588920 NetworkManager[49076]: <info>  [1768921787.7799] device (tap5c6391be-1d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:09:47 np0005588920 nova_compute[226886]: 2026-01-20 15:09:47.784 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:47 np0005588920 ovn_controller[133971]: 2026-01-20T15:09:47Z|00809|binding|INFO|Releasing lport 5c6391be-1db0-417c-a94e-89eb0cdbd8e7 from this chassis (sb_readonly=0)
Jan 20 10:09:47 np0005588920 ovn_controller[133971]: 2026-01-20T15:09:47Z|00810|binding|INFO|Setting lport 5c6391be-1db0-417c-a94e-89eb0cdbd8e7 down in Southbound
Jan 20 10:09:47 np0005588920 ovn_controller[133971]: 2026-01-20T15:09:47Z|00811|binding|INFO|Removing iface tap5c6391be-1d ovn-installed in OVS
Jan 20 10:09:47 np0005588920 nova_compute[226886]: 2026-01-20 15:09:47.787 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:47 np0005588920 nova_compute[226886]: 2026-01-20 15:09:47.806 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e384 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:09:47 np0005588920 systemd[1]: machine-qemu\x2d83\x2dinstance\x2d000000ab.scope: Deactivated successfully.
Jan 20 10:09:47 np0005588920 systemd[1]: machine-qemu\x2d83\x2dinstance\x2d000000ab.scope: Consumed 6.799s CPU time.
Jan 20 10:09:47 np0005588920 systemd-machined[196121]: Machine qemu-83-instance-000000ab terminated.
Jan 20 10:09:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:47.873 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9d:e5:5b 10.100.0.12'], port_security=['fa:16:3e:9d:e5:5b 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3967ae21-1590-4685-8881-8bd1bcf25258', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fc74c4a296554866969b05aef75252af', 'neutron:revision_number': '6', 'neutron:security_group_ids': '27c5bfa6-4744-47cb-ac6a-c4a6b0a5e776', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.173'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=89ced88f-b1ed-4329-8a53-1931e6b0e3e9, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=5c6391be-1db0-417c-a94e-89eb0cdbd8e7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:09:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:47.875 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 5c6391be-1db0-417c-a94e-89eb0cdbd8e7 in datapath 3967ae21-1590-4685-8881-8bd1bcf25258 unbound from our chassis#033[00m
Jan 20 10:09:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:47.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:47.876 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3967ae21-1590-4685-8881-8bd1bcf25258#033[00m
Jan 20 10:09:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:47.893 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[302316b3-b838-473f-8785-4ef4a17b77d6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:47.922 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[f7806ccd-ddc3-427f-ae43-30f84298127d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:47.925 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[3ca6b70f-25ce-4d5d-96e5-96b6d7f54cfc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:47.959 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[f6b9454d-11f9-4bea-830e-6446292adaee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:47 np0005588920 nova_compute[226886]: 2026-01-20 15:09:47.966 226890 INFO nova.virt.libvirt.driver [-] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Instance destroyed successfully.#033[00m
Jan 20 10:09:47 np0005588920 nova_compute[226886]: 2026-01-20 15:09:47.966 226890 DEBUG nova.objects.instance [None req-6cab8748-1180-4b00-b21c-f7c5e5007ccf 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lazy-loading 'numa_topology' on Instance uuid f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:09:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:47.979 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e23dd8dc-8cc3-4a25-b951-81e9258db1ee]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3967ae21-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b5:ce:9d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 12, 'rx_bytes': 700, 'tx_bytes': 696, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 12, 'rx_bytes': 700, 'tx_bytes': 696, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 253], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 677247, 'reachable_time': 44418, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 291224, 'error': None, 'target': 'ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:47.995 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d5f799bd-4b22-4709-86b9-3dc561d421f0]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap3967ae21-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 677257, 'tstamp': 677257}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 291225, 'error': None, 'target': 'ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap3967ae21-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 677260, 'tstamp': 677260}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 291225, 'error': None, 'target': 'ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:47.996 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3967ae21-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:09:47 np0005588920 nova_compute[226886]: 2026-01-20 15:09:47.997 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:48 np0005588920 nova_compute[226886]: 2026-01-20 15:09:48.001 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:48.002 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3967ae21-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:09:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:48.002 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:09:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:48.002 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3967ae21-10, col_values=(('external_ids', {'iface-id': 'b19d6956-fc8d-42c8-af98-d0f2fe9fe3a3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:09:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:48.003 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:09:48 np0005588920 systemd-udevd[291203]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:09:48 np0005588920 kernel: tap5c6391be-1d: entered promiscuous mode
Jan 20 10:09:48 np0005588920 NetworkManager[49076]: <info>  [1768921788.0614] manager: (tap5c6391be-1d): new Tun device (/org/freedesktop/NetworkManager/Devices/384)
Jan 20 10:09:48 np0005588920 nova_compute[226886]: 2026-01-20 15:09:48.061 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:48 np0005588920 ovn_controller[133971]: 2026-01-20T15:09:48Z|00812|binding|INFO|Claiming lport 5c6391be-1db0-417c-a94e-89eb0cdbd8e7 for this chassis.
Jan 20 10:09:48 np0005588920 ovn_controller[133971]: 2026-01-20T15:09:48Z|00813|binding|INFO|5c6391be-1db0-417c-a94e-89eb0cdbd8e7: Claiming fa:16:3e:9d:e5:5b 10.100.0.12
Jan 20 10:09:48 np0005588920 NetworkManager[49076]: <info>  [1768921788.0720] device (tap5c6391be-1d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:09:48 np0005588920 NetworkManager[49076]: <info>  [1768921788.0733] device (tap5c6391be-1d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:09:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:48.075 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9d:e5:5b 10.100.0.12'], port_security=['fa:16:3e:9d:e5:5b 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3967ae21-1590-4685-8881-8bd1bcf25258', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fc74c4a296554866969b05aef75252af', 'neutron:revision_number': '6', 'neutron:security_group_ids': '27c5bfa6-4744-47cb-ac6a-c4a6b0a5e776', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.173'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=89ced88f-b1ed-4329-8a53-1931e6b0e3e9, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=5c6391be-1db0-417c-a94e-89eb0cdbd8e7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:09:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:48.076 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 5c6391be-1db0-417c-a94e-89eb0cdbd8e7 in datapath 3967ae21-1590-4685-8881-8bd1bcf25258 bound to our chassis#033[00m
Jan 20 10:09:48 np0005588920 ovn_controller[133971]: 2026-01-20T15:09:48Z|00814|binding|INFO|Setting lport 5c6391be-1db0-417c-a94e-89eb0cdbd8e7 ovn-installed in OVS
Jan 20 10:09:48 np0005588920 ovn_controller[133971]: 2026-01-20T15:09:48Z|00815|binding|INFO|Setting lport 5c6391be-1db0-417c-a94e-89eb0cdbd8e7 up in Southbound
Jan 20 10:09:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:48.078 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3967ae21-1590-4685-8881-8bd1bcf25258#033[00m
Jan 20 10:09:48 np0005588920 nova_compute[226886]: 2026-01-20 15:09:48.078 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:48 np0005588920 nova_compute[226886]: 2026-01-20 15:09:48.082 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:48 np0005588920 systemd-machined[196121]: New machine qemu-84-instance-000000ab.
Jan 20 10:09:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:48.092 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[9c73b3c7-e027-4dfd-b00a-c84942786578]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:48 np0005588920 systemd[1]: Started Virtual Machine qemu-84-instance-000000ab.
Jan 20 10:09:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:48.121 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[a7f9236f-a8cb-4775-8774-f31596026723]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:48.124 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[bd6922dd-4665-4ef7-bf44-01010bafd953]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:48.152 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[1c2d27dc-f001-444c-ae90-e6a25fafb648]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:48.169 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[14d29d2d-16a3-4af1-8a15-d07e644e477a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3967ae21-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b5:ce:9d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 14, 'rx_bytes': 700, 'tx_bytes': 780, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 14, 'rx_bytes': 700, 'tx_bytes': 780, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 253], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 677247, 'reachable_time': 44418, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 291252, 'error': None, 'target': 'ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:48.185 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b2ef868e-6339-4122-9476-33a9b61ecdfc]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap3967ae21-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 677257, 'tstamp': 677257}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 291253, 'error': None, 'target': 'ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap3967ae21-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 677260, 'tstamp': 677260}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 291253, 'error': None, 'target': 'ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:48.186 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3967ae21-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:09:48 np0005588920 nova_compute[226886]: 2026-01-20 15:09:48.188 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:48 np0005588920 nova_compute[226886]: 2026-01-20 15:09:48.189 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:48.189 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3967ae21-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:09:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:48.190 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:09:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:48.190 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3967ae21-10, col_values=(('external_ids', {'iface-id': 'b19d6956-fc8d-42c8-af98-d0f2fe9fe3a3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:09:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:09:48.190 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:09:48 np0005588920 nova_compute[226886]: 2026-01-20 15:09:48.231 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:48 np0005588920 nova_compute[226886]: 2026-01-20 15:09:48.397 226890 DEBUG nova.compute.manager [req-4ad8e153-37f8-4410-97b8-f2000f9b6ea0 req-06bf39de-d176-4995-b129-3206b0990961 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Received event network-vif-unplugged-5c6391be-1db0-417c-a94e-89eb0cdbd8e7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:09:48 np0005588920 nova_compute[226886]: 2026-01-20 15:09:48.397 226890 DEBUG oslo_concurrency.lockutils [req-4ad8e153-37f8-4410-97b8-f2000f9b6ea0 req-06bf39de-d176-4995-b129-3206b0990961 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:09:48 np0005588920 nova_compute[226886]: 2026-01-20 15:09:48.398 226890 DEBUG oslo_concurrency.lockutils [req-4ad8e153-37f8-4410-97b8-f2000f9b6ea0 req-06bf39de-d176-4995-b129-3206b0990961 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:09:48 np0005588920 nova_compute[226886]: 2026-01-20 15:09:48.398 226890 DEBUG oslo_concurrency.lockutils [req-4ad8e153-37f8-4410-97b8-f2000f9b6ea0 req-06bf39de-d176-4995-b129-3206b0990961 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:09:48 np0005588920 nova_compute[226886]: 2026-01-20 15:09:48.398 226890 DEBUG nova.compute.manager [req-4ad8e153-37f8-4410-97b8-f2000f9b6ea0 req-06bf39de-d176-4995-b129-3206b0990961 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] No waiting events found dispatching network-vif-unplugged-5c6391be-1db0-417c-a94e-89eb0cdbd8e7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:09:48 np0005588920 nova_compute[226886]: 2026-01-20 15:09:48.398 226890 WARNING nova.compute.manager [req-4ad8e153-37f8-4410-97b8-f2000f9b6ea0 req-06bf39de-d176-4995-b129-3206b0990961 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Received unexpected event network-vif-unplugged-5c6391be-1db0-417c-a94e-89eb0cdbd8e7 for instance with vm_state rescued and task_state unrescuing.#033[00m
Jan 20 10:09:48 np0005588920 nova_compute[226886]: 2026-01-20 15:09:48.516 226890 DEBUG nova.virt.libvirt.host [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Removed pending event for f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 20 10:09:48 np0005588920 nova_compute[226886]: 2026-01-20 15:09:48.516 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921788.4985454, f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:09:48 np0005588920 nova_compute[226886]: 2026-01-20 15:09:48.517 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:09:48 np0005588920 nova_compute[226886]: 2026-01-20 15:09:48.553 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:09:48 np0005588920 nova_compute[226886]: 2026-01-20 15:09:48.557 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:09:48 np0005588920 nova_compute[226886]: 2026-01-20 15:09:48.578 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] During sync_power_state the instance has a pending task (unrescuing). Skip.#033[00m
Jan 20 10:09:48 np0005588920 nova_compute[226886]: 2026-01-20 15:09:48.579 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921788.499055, f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:09:48 np0005588920 nova_compute[226886]: 2026-01-20 15:09:48.579 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] VM Started (Lifecycle Event)#033[00m
Jan 20 10:09:48 np0005588920 nova_compute[226886]: 2026-01-20 15:09:48.598 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:09:48 np0005588920 nova_compute[226886]: 2026-01-20 15:09:48.601 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:09:48 np0005588920 nova_compute[226886]: 2026-01-20 15:09:48.620 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] During sync_power_state the instance has a pending task (unrescuing). Skip.#033[00m
Jan 20 10:09:48 np0005588920 nova_compute[226886]: 2026-01-20 15:09:48.818 226890 DEBUG nova.compute.manager [None req-6cab8748-1180-4b00-b21c-f7c5e5007ccf 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:09:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:48.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:49.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:50 np0005588920 nova_compute[226886]: 2026-01-20 15:09:50.497 226890 DEBUG nova.compute.manager [req-952b050c-a5c0-4150-ba6f-704e68a20628 req-dc33f7e6-70e3-483a-81f8-194354ae7100 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Received event network-vif-plugged-5c6391be-1db0-417c-a94e-89eb0cdbd8e7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:09:50 np0005588920 nova_compute[226886]: 2026-01-20 15:09:50.498 226890 DEBUG oslo_concurrency.lockutils [req-952b050c-a5c0-4150-ba6f-704e68a20628 req-dc33f7e6-70e3-483a-81f8-194354ae7100 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:09:50 np0005588920 nova_compute[226886]: 2026-01-20 15:09:50.498 226890 DEBUG oslo_concurrency.lockutils [req-952b050c-a5c0-4150-ba6f-704e68a20628 req-dc33f7e6-70e3-483a-81f8-194354ae7100 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:09:50 np0005588920 nova_compute[226886]: 2026-01-20 15:09:50.499 226890 DEBUG oslo_concurrency.lockutils [req-952b050c-a5c0-4150-ba6f-704e68a20628 req-dc33f7e6-70e3-483a-81f8-194354ae7100 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:09:50 np0005588920 nova_compute[226886]: 2026-01-20 15:09:50.499 226890 DEBUG nova.compute.manager [req-952b050c-a5c0-4150-ba6f-704e68a20628 req-dc33f7e6-70e3-483a-81f8-194354ae7100 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] No waiting events found dispatching network-vif-plugged-5c6391be-1db0-417c-a94e-89eb0cdbd8e7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:09:50 np0005588920 nova_compute[226886]: 2026-01-20 15:09:50.499 226890 WARNING nova.compute.manager [req-952b050c-a5c0-4150-ba6f-704e68a20628 req-dc33f7e6-70e3-483a-81f8-194354ae7100 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Received unexpected event network-vif-plugged-5c6391be-1db0-417c-a94e-89eb0cdbd8e7 for instance with vm_state active and task_state None.#033[00m
Jan 20 10:09:50 np0005588920 nova_compute[226886]: 2026-01-20 15:09:50.500 226890 DEBUG nova.compute.manager [req-952b050c-a5c0-4150-ba6f-704e68a20628 req-dc33f7e6-70e3-483a-81f8-194354ae7100 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Received event network-vif-plugged-5c6391be-1db0-417c-a94e-89eb0cdbd8e7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:09:50 np0005588920 nova_compute[226886]: 2026-01-20 15:09:50.500 226890 DEBUG oslo_concurrency.lockutils [req-952b050c-a5c0-4150-ba6f-704e68a20628 req-dc33f7e6-70e3-483a-81f8-194354ae7100 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:09:50 np0005588920 nova_compute[226886]: 2026-01-20 15:09:50.500 226890 DEBUG oslo_concurrency.lockutils [req-952b050c-a5c0-4150-ba6f-704e68a20628 req-dc33f7e6-70e3-483a-81f8-194354ae7100 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:09:50 np0005588920 nova_compute[226886]: 2026-01-20 15:09:50.500 226890 DEBUG oslo_concurrency.lockutils [req-952b050c-a5c0-4150-ba6f-704e68a20628 req-dc33f7e6-70e3-483a-81f8-194354ae7100 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:09:50 np0005588920 nova_compute[226886]: 2026-01-20 15:09:50.501 226890 DEBUG nova.compute.manager [req-952b050c-a5c0-4150-ba6f-704e68a20628 req-dc33f7e6-70e3-483a-81f8-194354ae7100 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] No waiting events found dispatching network-vif-plugged-5c6391be-1db0-417c-a94e-89eb0cdbd8e7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:09:50 np0005588920 nova_compute[226886]: 2026-01-20 15:09:50.501 226890 WARNING nova.compute.manager [req-952b050c-a5c0-4150-ba6f-704e68a20628 req-dc33f7e6-70e3-483a-81f8-194354ae7100 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Received unexpected event network-vif-plugged-5c6391be-1db0-417c-a94e-89eb0cdbd8e7 for instance with vm_state active and task_state None.#033[00m
Jan 20 10:09:50 np0005588920 nova_compute[226886]: 2026-01-20 15:09:50.501 226890 DEBUG nova.compute.manager [req-952b050c-a5c0-4150-ba6f-704e68a20628 req-dc33f7e6-70e3-483a-81f8-194354ae7100 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Received event network-vif-plugged-5c6391be-1db0-417c-a94e-89eb0cdbd8e7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:09:50 np0005588920 nova_compute[226886]: 2026-01-20 15:09:50.501 226890 DEBUG oslo_concurrency.lockutils [req-952b050c-a5c0-4150-ba6f-704e68a20628 req-dc33f7e6-70e3-483a-81f8-194354ae7100 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:09:50 np0005588920 nova_compute[226886]: 2026-01-20 15:09:50.502 226890 DEBUG oslo_concurrency.lockutils [req-952b050c-a5c0-4150-ba6f-704e68a20628 req-dc33f7e6-70e3-483a-81f8-194354ae7100 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:09:50 np0005588920 nova_compute[226886]: 2026-01-20 15:09:50.502 226890 DEBUG oslo_concurrency.lockutils [req-952b050c-a5c0-4150-ba6f-704e68a20628 req-dc33f7e6-70e3-483a-81f8-194354ae7100 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:09:50 np0005588920 nova_compute[226886]: 2026-01-20 15:09:50.502 226890 DEBUG nova.compute.manager [req-952b050c-a5c0-4150-ba6f-704e68a20628 req-dc33f7e6-70e3-483a-81f8-194354ae7100 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] No waiting events found dispatching network-vif-plugged-5c6391be-1db0-417c-a94e-89eb0cdbd8e7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:09:50 np0005588920 nova_compute[226886]: 2026-01-20 15:09:50.503 226890 WARNING nova.compute.manager [req-952b050c-a5c0-4150-ba6f-704e68a20628 req-dc33f7e6-70e3-483a-81f8-194354ae7100 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Received unexpected event network-vif-plugged-5c6391be-1db0-417c-a94e-89eb0cdbd8e7 for instance with vm_state active and task_state None.#033[00m
Jan 20 10:09:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:50.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:51 np0005588920 nova_compute[226886]: 2026-01-20 15:09:51.438 226890 DEBUG nova.compute.manager [req-0db5f874-9b56-4575-a838-4ef0ec8e5ade req-57049fa8-ff3d-4456-ba39-8c00903dec70 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Received event network-changed-5c6391be-1db0-417c-a94e-89eb0cdbd8e7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:09:51 np0005588920 nova_compute[226886]: 2026-01-20 15:09:51.438 226890 DEBUG nova.compute.manager [req-0db5f874-9b56-4575-a838-4ef0ec8e5ade req-57049fa8-ff3d-4456-ba39-8c00903dec70 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Refreshing instance network info cache due to event network-changed-5c6391be-1db0-417c-a94e-89eb0cdbd8e7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:09:51 np0005588920 nova_compute[226886]: 2026-01-20 15:09:51.439 226890 DEBUG oslo_concurrency.lockutils [req-0db5f874-9b56-4575-a838-4ef0ec8e5ade req-57049fa8-ff3d-4456-ba39-8c00903dec70 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:09:51 np0005588920 nova_compute[226886]: 2026-01-20 15:09:51.439 226890 DEBUG oslo_concurrency.lockutils [req-0db5f874-9b56-4575-a838-4ef0ec8e5ade req-57049fa8-ff3d-4456-ba39-8c00903dec70 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:09:51 np0005588920 nova_compute[226886]: 2026-01-20 15:09:51.439 226890 DEBUG nova.network.neutron [req-0db5f874-9b56-4575-a838-4ef0ec8e5ade req-57049fa8-ff3d-4456-ba39-8c00903dec70 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Refreshing network info cache for port 5c6391be-1db0-417c-a94e-89eb0cdbd8e7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:09:51 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e385 e385: 3 total, 3 up, 3 in
Jan 20 10:09:51 np0005588920 nova_compute[226886]: 2026-01-20 15:09:51.545 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.002000058s ======
Jan 20 10:09:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:51.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000058s
Jan 20 10:09:52 np0005588920 nova_compute[226886]: 2026-01-20 15:09:52.042 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:09:52 np0005588920 nova_compute[226886]: 2026-01-20 15:09:52.045 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:52 np0005588920 nova_compute[226886]: 2026-01-20 15:09:52.061 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Triggering sync for uuid 2a67c102-89d1-4196-bc8e-663656945547 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 20 10:09:52 np0005588920 nova_compute[226886]: 2026-01-20 15:09:52.061 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Triggering sync for uuid f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 20 10:09:52 np0005588920 nova_compute[226886]: 2026-01-20 15:09:52.062 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "2a67c102-89d1-4196-bc8e-663656945547" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:09:52 np0005588920 nova_compute[226886]: 2026-01-20 15:09:52.063 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "2a67c102-89d1-4196-bc8e-663656945547" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:09:52 np0005588920 nova_compute[226886]: 2026-01-20 15:09:52.063 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:09:52 np0005588920 nova_compute[226886]: 2026-01-20 15:09:52.064 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:09:52 np0005588920 nova_compute[226886]: 2026-01-20 15:09:52.097 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.034s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:09:52 np0005588920 nova_compute[226886]: 2026-01-20 15:09:52.098 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "2a67c102-89d1-4196-bc8e-663656945547" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.035s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:09:52 np0005588920 nova_compute[226886]: 2026-01-20 15:09:52.747 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:09:52 np0005588920 nova_compute[226886]: 2026-01-20 15:09:52.748 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:09:52 np0005588920 nova_compute[226886]: 2026-01-20 15:09:52.748 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:09:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e385 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:09:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:52.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:53 np0005588920 nova_compute[226886]: 2026-01-20 15:09:53.107 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "refresh_cache-2a67c102-89d1-4196-bc8e-663656945547" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:09:53 np0005588920 nova_compute[226886]: 2026-01-20 15:09:53.107 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquired lock "refresh_cache-2a67c102-89d1-4196-bc8e-663656945547" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:09:53 np0005588920 nova_compute[226886]: 2026-01-20 15:09:53.108 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 10:09:53 np0005588920 nova_compute[226886]: 2026-01-20 15:09:53.108 226890 DEBUG nova.objects.instance [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 2a67c102-89d1-4196-bc8e-663656945547 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:09:53 np0005588920 nova_compute[226886]: 2026-01-20 15:09:53.434 226890 DEBUG nova.network.neutron [req-0db5f874-9b56-4575-a838-4ef0ec8e5ade req-57049fa8-ff3d-4456-ba39-8c00903dec70 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Updated VIF entry in instance network info cache for port 5c6391be-1db0-417c-a94e-89eb0cdbd8e7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:09:53 np0005588920 nova_compute[226886]: 2026-01-20 15:09:53.435 226890 DEBUG nova.network.neutron [req-0db5f874-9b56-4575-a838-4ef0ec8e5ade req-57049fa8-ff3d-4456-ba39-8c00903dec70 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Updating instance_info_cache with network_info: [{"id": "5c6391be-1db0-417c-a94e-89eb0cdbd8e7", "address": "fa:16:3e:9d:e5:5b", "network": {"id": "3967ae21-1590-4685-8881-8bd1bcf25258", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fc74c4a296554866969b05aef75252af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c6391be-1d", "ovs_interfaceid": "5c6391be-1db0-417c-a94e-89eb0cdbd8e7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:09:53 np0005588920 nova_compute[226886]: 2026-01-20 15:09:53.448 226890 DEBUG oslo_concurrency.lockutils [req-0db5f874-9b56-4575-a838-4ef0ec8e5ade req-57049fa8-ff3d-4456-ba39-8c00903dec70 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:09:53 np0005588920 nova_compute[226886]: 2026-01-20 15:09:53.544 226890 DEBUG nova.compute.manager [req-0f4fc472-eb93-4efd-9ac5-3c0f79a6bb6e req-8afb58c1-48ac-4751-bb69-88f471aaed8c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Received event network-changed-5c6391be-1db0-417c-a94e-89eb0cdbd8e7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:09:53 np0005588920 nova_compute[226886]: 2026-01-20 15:09:53.544 226890 DEBUG nova.compute.manager [req-0f4fc472-eb93-4efd-9ac5-3c0f79a6bb6e req-8afb58c1-48ac-4751-bb69-88f471aaed8c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Refreshing instance network info cache due to event network-changed-5c6391be-1db0-417c-a94e-89eb0cdbd8e7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:09:53 np0005588920 nova_compute[226886]: 2026-01-20 15:09:53.545 226890 DEBUG oslo_concurrency.lockutils [req-0f4fc472-eb93-4efd-9ac5-3c0f79a6bb6e req-8afb58c1-48ac-4751-bb69-88f471aaed8c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:09:53 np0005588920 nova_compute[226886]: 2026-01-20 15:09:53.545 226890 DEBUG oslo_concurrency.lockutils [req-0f4fc472-eb93-4efd-9ac5-3c0f79a6bb6e req-8afb58c1-48ac-4751-bb69-88f471aaed8c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:09:53 np0005588920 nova_compute[226886]: 2026-01-20 15:09:53.545 226890 DEBUG nova.network.neutron [req-0f4fc472-eb93-4efd-9ac5-3c0f79a6bb6e req-8afb58c1-48ac-4751-bb69-88f471aaed8c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Refreshing network info cache for port 5c6391be-1db0-417c-a94e-89eb0cdbd8e7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:09:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.002000059s ======
Jan 20 10:09:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:53.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000059s
Jan 20 10:09:54 np0005588920 nova_compute[226886]: 2026-01-20 15:09:54.475 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Updating instance_info_cache with network_info: [{"id": "12d4a8f6-904d-4ec5-8062-530c89300b7c", "address": "fa:16:3e:d4:df:63", "network": {"id": "3967ae21-1590-4685-8881-8bd1bcf25258", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fc74c4a296554866969b05aef75252af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12d4a8f6-90", "ovs_interfaceid": "12d4a8f6-904d-4ec5-8062-530c89300b7c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:09:54 np0005588920 nova_compute[226886]: 2026-01-20 15:09:54.491 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Releasing lock "refresh_cache-2a67c102-89d1-4196-bc8e-663656945547" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:09:54 np0005588920 nova_compute[226886]: 2026-01-20 15:09:54.491 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 10:09:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:54.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:55 np0005588920 nova_compute[226886]: 2026-01-20 15:09:55.490 226890 DEBUG nova.network.neutron [req-0f4fc472-eb93-4efd-9ac5-3c0f79a6bb6e req-8afb58c1-48ac-4751-bb69-88f471aaed8c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Updated VIF entry in instance network info cache for port 5c6391be-1db0-417c-a94e-89eb0cdbd8e7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:09:55 np0005588920 nova_compute[226886]: 2026-01-20 15:09:55.492 226890 DEBUG nova.network.neutron [req-0f4fc472-eb93-4efd-9ac5-3c0f79a6bb6e req-8afb58c1-48ac-4751-bb69-88f471aaed8c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Updating instance_info_cache with network_info: [{"id": "5c6391be-1db0-417c-a94e-89eb0cdbd8e7", "address": "fa:16:3e:9d:e5:5b", "network": {"id": "3967ae21-1590-4685-8881-8bd1bcf25258", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fc74c4a296554866969b05aef75252af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c6391be-1d", "ovs_interfaceid": "5c6391be-1db0-417c-a94e-89eb0cdbd8e7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:09:55 np0005588920 nova_compute[226886]: 2026-01-20 15:09:55.516 226890 DEBUG oslo_concurrency.lockutils [req-0f4fc472-eb93-4efd-9ac5-3c0f79a6bb6e req-8afb58c1-48ac-4751-bb69-88f471aaed8c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:09:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:55.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:56 np0005588920 nova_compute[226886]: 2026-01-20 15:09:56.548 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:56.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:56 np0005588920 podman[291333]: 2026-01-20 15:09:56.976552102 +0000 UTC m=+0.058059031 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 20 10:09:57 np0005588920 nova_compute[226886]: 2026-01-20 15:09:57.047 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:57 np0005588920 nova_compute[226886]: 2026-01-20 15:09:57.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:09:57 np0005588920 nova_compute[226886]: 2026-01-20 15:09:57.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:09:57 np0005588920 nova_compute[226886]: 2026-01-20 15:09:57.768 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:09:57 np0005588920 nova_compute[226886]: 2026-01-20 15:09:57.768 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:09:57 np0005588920 nova_compute[226886]: 2026-01-20 15:09:57.769 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:09:57 np0005588920 nova_compute[226886]: 2026-01-20 15:09:57.769 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:09:57 np0005588920 nova_compute[226886]: 2026-01-20 15:09:57.769 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:09:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e385 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:09:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 10:09:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:57.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 10:09:58 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:09:58 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3512189121' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:09:58 np0005588920 nova_compute[226886]: 2026-01-20 15:09:58.212 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:09:58 np0005588920 nova_compute[226886]: 2026-01-20 15:09:58.287 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-000000a7 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:09:58 np0005588920 nova_compute[226886]: 2026-01-20 15:09:58.288 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-000000a7 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:09:58 np0005588920 nova_compute[226886]: 2026-01-20 15:09:58.291 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-000000ab as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:09:58 np0005588920 nova_compute[226886]: 2026-01-20 15:09:58.291 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-000000ab as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:09:58 np0005588920 nova_compute[226886]: 2026-01-20 15:09:58.292 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-000000ab as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:09:58 np0005588920 nova_compute[226886]: 2026-01-20 15:09:58.451 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:09:58 np0005588920 nova_compute[226886]: 2026-01-20 15:09:58.453 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3858MB free_disk=20.80986785888672GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:09:58 np0005588920 nova_compute[226886]: 2026-01-20 15:09:58.453 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:09:58 np0005588920 nova_compute[226886]: 2026-01-20 15:09:58.454 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:09:58 np0005588920 nova_compute[226886]: 2026-01-20 15:09:58.530 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance 2a67c102-89d1-4196-bc8e-663656945547 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:09:58 np0005588920 nova_compute[226886]: 2026-01-20 15:09:58.531 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:09:58 np0005588920 nova_compute[226886]: 2026-01-20 15:09:58.531 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:09:58 np0005588920 nova_compute[226886]: 2026-01-20 15:09:58.531 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:09:58 np0005588920 nova_compute[226886]: 2026-01-20 15:09:58.588 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:09:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:58.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:59 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:09:59 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/225497052' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:09:59 np0005588920 nova_compute[226886]: 2026-01-20 15:09:59.040 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:09:59 np0005588920 nova_compute[226886]: 2026-01-20 15:09:59.047 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:09:59 np0005588920 nova_compute[226886]: 2026-01-20 15:09:59.062 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:09:59 np0005588920 nova_compute[226886]: 2026-01-20 15:09:59.100 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:09:59 np0005588920 nova_compute[226886]: 2026-01-20 15:09:59.101 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.647s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:09:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:09:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:59.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:00 np0005588920 ceph-mon[77148]: overall HEALTH_OK
Jan 20 10:10:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:00.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:01 np0005588920 nova_compute[226886]: 2026-01-20 15:10:01.101 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:10:01 np0005588920 nova_compute[226886]: 2026-01-20 15:10:01.103 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:10:01 np0005588920 nova_compute[226886]: 2026-01-20 15:10:01.552 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:01 np0005588920 nova_compute[226886]: 2026-01-20 15:10:01.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:10:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:10:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:01.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:10:02 np0005588920 nova_compute[226886]: 2026-01-20 15:10:02.049 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:02 np0005588920 ovn_controller[133971]: 2026-01-20T15:10:02Z|00109|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9d:e5:5b 10.100.0.12
Jan 20 10:10:02 np0005588920 nova_compute[226886]: 2026-01-20 15:10:02.720 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:10:02 np0005588920 nova_compute[226886]: 2026-01-20 15:10:02.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:10:02 np0005588920 nova_compute[226886]: 2026-01-20 15:10:02.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:10:02 np0005588920 nova_compute[226886]: 2026-01-20 15:10:02.724 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:10:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e385 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:10:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:10:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:02.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:10:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:10:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:03.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:10:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:10:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:04.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:10:05 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:10:05 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2067592141' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:10:05 np0005588920 nova_compute[226886]: 2026-01-20 15:10:05.722 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:05 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:10:05.722 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=56, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=55) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:10:05 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:10:05.724 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:10:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:05.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:06 np0005588920 nova_compute[226886]: 2026-01-20 15:10:06.554 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:06.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:07 np0005588920 nova_compute[226886]: 2026-01-20 15:10:07.051 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e385 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:10:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:07.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 10:10:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:08.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 10:10:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:09.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:10:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:10.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:10:11 np0005588920 nova_compute[226886]: 2026-01-20 15:10:11.558 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:11 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:10:11.725 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '56'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:10:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:11.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:12 np0005588920 nova_compute[226886]: 2026-01-20 15:10:12.054 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:12 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:10:12 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:10:12 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:10:12 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:10:12 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 20 10:10:12 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 20 10:10:12 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:10:12 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:10:12 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:10:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e385 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:10:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:10:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:12.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:10:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 10:10:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:13.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 10:10:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:14.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:15.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:15 np0005588920 podman[291650]: 2026-01-20 15:10:15.995242071 +0000 UTC m=+0.085373097 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 20 10:10:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:10:16.470 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:10:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:10:16.471 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:10:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:10:16.471 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:10:16 np0005588920 nova_compute[226886]: 2026-01-20 15:10:16.559 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:16.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:17 np0005588920 nova_compute[226886]: 2026-01-20 15:10:17.055 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:17 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #124. Immutable memtables: 0.
Jan 20 10:10:17 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:10:17.698624) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:10:17 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 77] Flushing memtable with next log file: 124
Jan 20 10:10:17 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921817698700, "job": 77, "event": "flush_started", "num_memtables": 1, "num_entries": 1924, "num_deletes": 255, "total_data_size": 4164916, "memory_usage": 4221984, "flush_reason": "Manual Compaction"}
Jan 20 10:10:17 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 77] Level-0 flush table #125: started
Jan 20 10:10:17 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921817726834, "cf_name": "default", "job": 77, "event": "table_file_creation", "file_number": 125, "file_size": 2722827, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 61597, "largest_seqno": 63516, "table_properties": {"data_size": 2714791, "index_size": 4786, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2181, "raw_key_size": 17713, "raw_average_key_size": 20, "raw_value_size": 2698424, "raw_average_value_size": 3182, "num_data_blocks": 207, "num_entries": 848, "num_filter_entries": 848, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768921681, "oldest_key_time": 1768921681, "file_creation_time": 1768921817, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 125, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:10:17 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 77] Flush lasted 28248 microseconds, and 6677 cpu microseconds.
Jan 20 10:10:17 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:10:17 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:10:17.726879) [db/flush_job.cc:967] [default] [JOB 77] Level-0 flush table #125: 2722827 bytes OK
Jan 20 10:10:17 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:10:17.726897) [db/memtable_list.cc:519] [default] Level-0 commit table #125 started
Jan 20 10:10:17 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:10:17.733623) [db/memtable_list.cc:722] [default] Level-0 commit table #125: memtable #1 done
Jan 20 10:10:17 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:10:17.733642) EVENT_LOG_v1 {"time_micros": 1768921817733637, "job": 77, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:10:17 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:10:17.733659) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:10:17 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 77] Try to delete WAL files size 4156125, prev total WAL file size 4156125, number of live WAL files 2.
Jan 20 10:10:17 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000121.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:10:17 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:10:17.734810) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730035323731' seq:72057594037927935, type:22 .. '7061786F730035353233' seq:0, type:0; will stop at (end)
Jan 20 10:10:17 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 78] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:10:17 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 77 Base level 0, inputs: [125(2659KB)], [123(10MB)]
Jan 20 10:10:17 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921817734929, "job": 78, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [125], "files_L6": [123], "score": -1, "input_data_size": 14183532, "oldest_snapshot_seqno": -1}
Jan 20 10:10:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e385 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:10:17 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 78] Generated table #126: 8787 keys, 12255251 bytes, temperature: kUnknown
Jan 20 10:10:17 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921817856108, "cf_name": "default", "job": 78, "event": "table_file_creation", "file_number": 126, "file_size": 12255251, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12196493, "index_size": 35670, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 22021, "raw_key_size": 230303, "raw_average_key_size": 26, "raw_value_size": 12039912, "raw_average_value_size": 1370, "num_data_blocks": 1372, "num_entries": 8787, "num_filter_entries": 8787, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768921817, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 126, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:10:17 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:10:17 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:10:17.865504) [db/compaction/compaction_job.cc:1663] [default] [JOB 78] Compacted 1@0 + 1@6 files to L6 => 12255251 bytes
Jan 20 10:10:17 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:10:17.867404) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 117.0 rd, 101.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.6, 10.9 +0.0 blob) out(11.7 +0.0 blob), read-write-amplify(9.7) write-amplify(4.5) OK, records in: 9316, records dropped: 529 output_compression: NoCompression
Jan 20 10:10:17 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:10:17.867431) EVENT_LOG_v1 {"time_micros": 1768921817867421, "job": 78, "event": "compaction_finished", "compaction_time_micros": 121238, "compaction_time_cpu_micros": 36904, "output_level": 6, "num_output_files": 1, "total_output_size": 12255251, "num_input_records": 9316, "num_output_records": 8787, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:10:17 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000125.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:10:17 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921817868261, "job": 78, "event": "table_file_deletion", "file_number": 125}
Jan 20 10:10:17 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000123.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:10:17 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921817870085, "job": 78, "event": "table_file_deletion", "file_number": 123}
Jan 20 10:10:17 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:10:17.734677) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:10:17 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:10:17.870262) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:10:17 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:10:17.870273) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:10:17 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:10:17.870274) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:10:17 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:10:17.870276) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:10:17 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:10:17.870278) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:10:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:10:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:17.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:10:18 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:10:18 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:10:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:18.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:10:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:19.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:10:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:10:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:20.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:10:21 np0005588920 nova_compute[226886]: 2026-01-20 15:10:21.623 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:10:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:21.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:10:22 np0005588920 nova_compute[226886]: 2026-01-20 15:10:22.058 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e385 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:10:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:22.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:23 np0005588920 nova_compute[226886]: 2026-01-20 15:10:23.484 226890 DEBUG oslo_concurrency.lockutils [None req-39e65452-e610-4141-8b29-dc58f4ea4a4d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Acquiring lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:10:23 np0005588920 nova_compute[226886]: 2026-01-20 15:10:23.484 226890 DEBUG oslo_concurrency.lockutils [None req-39e65452-e610-4141-8b29-dc58f4ea4a4d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:10:23 np0005588920 nova_compute[226886]: 2026-01-20 15:10:23.501 226890 INFO nova.compute.manager [None req-39e65452-e610-4141-8b29-dc58f4ea4a4d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Detaching volume 7d67106f-2f4c-4925-94d7-b60a4418b999#033[00m
Jan 20 10:10:23 np0005588920 nova_compute[226886]: 2026-01-20 15:10:23.634 226890 INFO nova.virt.block_device [None req-39e65452-e610-4141-8b29-dc58f4ea4a4d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Attempting to driver detach volume 7d67106f-2f4c-4925-94d7-b60a4418b999 from mountpoint /dev/vdb#033[00m
Jan 20 10:10:23 np0005588920 nova_compute[226886]: 2026-01-20 15:10:23.645 226890 DEBUG nova.virt.libvirt.driver [None req-39e65452-e610-4141-8b29-dc58f4ea4a4d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Attempting to detach device vdb from instance f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 20 10:10:23 np0005588920 nova_compute[226886]: 2026-01-20 15:10:23.645 226890 DEBUG nova.virt.libvirt.guest [None req-39e65452-e610-4141-8b29-dc58f4ea4a4d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] detach device xml: <disk type="network" device="disk">
Jan 20 10:10:23 np0005588920 nova_compute[226886]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 10:10:23 np0005588920 nova_compute[226886]:  <source protocol="rbd" name="volumes/volume-7d67106f-2f4c-4925-94d7-b60a4418b999">
Jan 20 10:10:23 np0005588920 nova_compute[226886]:    <host name="192.168.122.100" port="6789"/>
Jan 20 10:10:23 np0005588920 nova_compute[226886]:    <host name="192.168.122.102" port="6789"/>
Jan 20 10:10:23 np0005588920 nova_compute[226886]:    <host name="192.168.122.101" port="6789"/>
Jan 20 10:10:23 np0005588920 nova_compute[226886]:  </source>
Jan 20 10:10:23 np0005588920 nova_compute[226886]:  <target dev="vdb" bus="virtio"/>
Jan 20 10:10:23 np0005588920 nova_compute[226886]:  <serial>7d67106f-2f4c-4925-94d7-b60a4418b999</serial>
Jan 20 10:10:23 np0005588920 nova_compute[226886]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 20 10:10:23 np0005588920 nova_compute[226886]: </disk>
Jan 20 10:10:23 np0005588920 nova_compute[226886]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 20 10:10:23 np0005588920 nova_compute[226886]: 2026-01-20 15:10:23.652 226890 INFO nova.virt.libvirt.driver [None req-39e65452-e610-4141-8b29-dc58f4ea4a4d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Successfully detached device vdb from instance f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1 from the persistent domain config.#033[00m
Jan 20 10:10:23 np0005588920 nova_compute[226886]: 2026-01-20 15:10:23.652 226890 DEBUG nova.virt.libvirt.driver [None req-39e65452-e610-4141-8b29-dc58f4ea4a4d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 20 10:10:23 np0005588920 nova_compute[226886]: 2026-01-20 15:10:23.653 226890 DEBUG nova.virt.libvirt.guest [None req-39e65452-e610-4141-8b29-dc58f4ea4a4d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] detach device xml: <disk type="network" device="disk">
Jan 20 10:10:23 np0005588920 nova_compute[226886]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 10:10:23 np0005588920 nova_compute[226886]:  <source protocol="rbd" name="volumes/volume-7d67106f-2f4c-4925-94d7-b60a4418b999">
Jan 20 10:10:23 np0005588920 nova_compute[226886]:    <host name="192.168.122.100" port="6789"/>
Jan 20 10:10:23 np0005588920 nova_compute[226886]:    <host name="192.168.122.102" port="6789"/>
Jan 20 10:10:23 np0005588920 nova_compute[226886]:    <host name="192.168.122.101" port="6789"/>
Jan 20 10:10:23 np0005588920 nova_compute[226886]:  </source>
Jan 20 10:10:23 np0005588920 nova_compute[226886]:  <target dev="vdb" bus="virtio"/>
Jan 20 10:10:23 np0005588920 nova_compute[226886]:  <serial>7d67106f-2f4c-4925-94d7-b60a4418b999</serial>
Jan 20 10:10:23 np0005588920 nova_compute[226886]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 20 10:10:23 np0005588920 nova_compute[226886]: </disk>
Jan 20 10:10:23 np0005588920 nova_compute[226886]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 20 10:10:23 np0005588920 nova_compute[226886]: 2026-01-20 15:10:23.703 226890 DEBUG nova.virt.libvirt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Received event <DeviceRemovedEvent: 1768921823.7028294, f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1 => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 20 10:10:23 np0005588920 nova_compute[226886]: 2026-01-20 15:10:23.705 226890 DEBUG nova.virt.libvirt.driver [None req-39e65452-e610-4141-8b29-dc58f4ea4a4d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 20 10:10:23 np0005588920 nova_compute[226886]: 2026-01-20 15:10:23.707 226890 INFO nova.virt.libvirt.driver [None req-39e65452-e610-4141-8b29-dc58f4ea4a4d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Successfully detached device vdb from instance f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1 from the live domain config.#033[00m
Jan 20 10:10:23 np0005588920 nova_compute[226886]: 2026-01-20 15:10:23.869 226890 DEBUG nova.objects.instance [None req-39e65452-e610-4141-8b29-dc58f4ea4a4d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lazy-loading 'flavor' on Instance uuid f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:10:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:23.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:23 np0005588920 nova_compute[226886]: 2026-01-20 15:10:23.932 226890 DEBUG oslo_concurrency.lockutils [None req-39e65452-e610-4141-8b29-dc58f4ea4a4d 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 0.447s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:10:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:10:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:24.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:10:25 np0005588920 nova_compute[226886]: 2026-01-20 15:10:25.075 226890 DEBUG oslo_concurrency.lockutils [None req-0baba652-f080-482b-95b7-e2c897294c2c 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Acquiring lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:10:25 np0005588920 nova_compute[226886]: 2026-01-20 15:10:25.076 226890 DEBUG oslo_concurrency.lockutils [None req-0baba652-f080-482b-95b7-e2c897294c2c 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:10:25 np0005588920 nova_compute[226886]: 2026-01-20 15:10:25.077 226890 DEBUG oslo_concurrency.lockutils [None req-0baba652-f080-482b-95b7-e2c897294c2c 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Acquiring lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:10:25 np0005588920 nova_compute[226886]: 2026-01-20 15:10:25.077 226890 DEBUG oslo_concurrency.lockutils [None req-0baba652-f080-482b-95b7-e2c897294c2c 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:10:25 np0005588920 nova_compute[226886]: 2026-01-20 15:10:25.077 226890 DEBUG oslo_concurrency.lockutils [None req-0baba652-f080-482b-95b7-e2c897294c2c 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:10:25 np0005588920 nova_compute[226886]: 2026-01-20 15:10:25.079 226890 INFO nova.compute.manager [None req-0baba652-f080-482b-95b7-e2c897294c2c 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Terminating instance#033[00m
Jan 20 10:10:25 np0005588920 nova_compute[226886]: 2026-01-20 15:10:25.081 226890 DEBUG nova.compute.manager [None req-0baba652-f080-482b-95b7-e2c897294c2c 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:10:25 np0005588920 kernel: tap5c6391be-1d (unregistering): left promiscuous mode
Jan 20 10:10:25 np0005588920 NetworkManager[49076]: <info>  [1768921825.1455] device (tap5c6391be-1d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:10:25 np0005588920 ovn_controller[133971]: 2026-01-20T15:10:25Z|00816|binding|INFO|Releasing lport 5c6391be-1db0-417c-a94e-89eb0cdbd8e7 from this chassis (sb_readonly=0)
Jan 20 10:10:25 np0005588920 nova_compute[226886]: 2026-01-20 15:10:25.161 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:25 np0005588920 ovn_controller[133971]: 2026-01-20T15:10:25Z|00817|binding|INFO|Setting lport 5c6391be-1db0-417c-a94e-89eb0cdbd8e7 down in Southbound
Jan 20 10:10:25 np0005588920 ovn_controller[133971]: 2026-01-20T15:10:25Z|00818|binding|INFO|Removing iface tap5c6391be-1d ovn-installed in OVS
Jan 20 10:10:25 np0005588920 nova_compute[226886]: 2026-01-20 15:10:25.165 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:25 np0005588920 nova_compute[226886]: 2026-01-20 15:10:25.183 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:10:25.184 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9d:e5:5b 10.100.0.12'], port_security=['fa:16:3e:9d:e5:5b 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3967ae21-1590-4685-8881-8bd1bcf25258', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fc74c4a296554866969b05aef75252af', 'neutron:revision_number': '8', 'neutron:security_group_ids': '27c5bfa6-4744-47cb-ac6a-c4a6b0a5e776', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.173'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=89ced88f-b1ed-4329-8a53-1931e6b0e3e9, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=5c6391be-1db0-417c-a94e-89eb0cdbd8e7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:10:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:10:25.185 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 5c6391be-1db0-417c-a94e-89eb0cdbd8e7 in datapath 3967ae21-1590-4685-8881-8bd1bcf25258 unbound from our chassis#033[00m
Jan 20 10:10:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:10:25.186 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3967ae21-1590-4685-8881-8bd1bcf25258#033[00m
Jan 20 10:10:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:10:25.199 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2f08fd3e-a909-4dcc-a8a8-7d33f711c2e3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:25 np0005588920 systemd[1]: machine-qemu\x2d84\x2dinstance\x2d000000ab.scope: Deactivated successfully.
Jan 20 10:10:25 np0005588920 systemd[1]: machine-qemu\x2d84\x2dinstance\x2d000000ab.scope: Consumed 13.814s CPU time.
Jan 20 10:10:25 np0005588920 systemd-machined[196121]: Machine qemu-84-instance-000000ab terminated.
Jan 20 10:10:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:10:25.224 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[844b1ef6-2631-453f-a945-7885876ef2fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:10:25.227 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[bc9514b0-43e7-4910-8657-6d65a5131db4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:10:25.250 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[bc393308-b6f0-4a15-9869-25ec0e19b458]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:10:25.267 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c5045d9d-d370-4507-a18d-c7d01ee42b0b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3967ae21-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b5:ce:9d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 16, 'rx_bytes': 784, 'tx_bytes': 864, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 16, 'rx_bytes': 784, 'tx_bytes': 864, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 253], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 677247, 'reachable_time': 44418, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 291741, 'error': None, 'target': 'ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:10:25.282 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0fe1e963-58c1-43a0-b7a0-edd1727281f8]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap3967ae21-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 677257, 'tstamp': 677257}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 291742, 'error': None, 'target': 'ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap3967ae21-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 677260, 'tstamp': 677260}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 291742, 'error': None, 'target': 'ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:10:25.283 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3967ae21-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:10:25 np0005588920 nova_compute[226886]: 2026-01-20 15:10:25.320 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:25 np0005588920 nova_compute[226886]: 2026-01-20 15:10:25.324 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:10:25.325 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3967ae21-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:10:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:10:25.325 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:10:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:10:25.325 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3967ae21-10, col_values=(('external_ids', {'iface-id': 'b19d6956-fc8d-42c8-af98-d0f2fe9fe3a3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:10:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:10:25.326 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:10:25 np0005588920 nova_compute[226886]: 2026-01-20 15:10:25.355 226890 INFO nova.virt.libvirt.driver [-] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Instance destroyed successfully.#033[00m
Jan 20 10:10:25 np0005588920 nova_compute[226886]: 2026-01-20 15:10:25.355 226890 DEBUG nova.objects.instance [None req-0baba652-f080-482b-95b7-e2c897294c2c 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lazy-loading 'resources' on Instance uuid f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:10:25 np0005588920 nova_compute[226886]: 2026-01-20 15:10:25.372 226890 DEBUG nova.virt.libvirt.vif [None req-0baba652-f080-482b-95b7-e2c897294c2c 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:08:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1321021662',display_name='tempest-ServerRescueNegativeTestJSON-server-1321021662',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-1321021662',id=171,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB/7uXKxe9nAPl14X3fTp2ccJ5Be2YZUIKmP54MYpo0vFkLM4vJyo+K5ySbdF/GxgqpIyyKbMYSgP6x/brvrQahBSInMKWnh7cc52EbXbOHcGpmF2QhgzpimmyzN8oX4hw==',key_name='tempest-keypair-1299086936',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:09:41Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fc74c4a296554866969b05aef75252af',ramdisk_id='',reservation_id='r-4rgzfbcx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueNegativeTestJSON-1649662639',owner_user_name='tempest-ServerRescueNegativeTestJSON-1649662639-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:09:48Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='27658864f96d453586dd0846a4c55b7d',uuid=f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5c6391be-1db0-417c-a94e-89eb0cdbd8e7", "address": "fa:16:3e:9d:e5:5b", "network": {"id": "3967ae21-1590-4685-8881-8bd1bcf25258", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fc74c4a296554866969b05aef75252af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c6391be-1d", "ovs_interfaceid": "5c6391be-1db0-417c-a94e-89eb0cdbd8e7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:10:25 np0005588920 nova_compute[226886]: 2026-01-20 15:10:25.373 226890 DEBUG nova.network.os_vif_util [None req-0baba652-f080-482b-95b7-e2c897294c2c 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Converting VIF {"id": "5c6391be-1db0-417c-a94e-89eb0cdbd8e7", "address": "fa:16:3e:9d:e5:5b", "network": {"id": "3967ae21-1590-4685-8881-8bd1bcf25258", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fc74c4a296554866969b05aef75252af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c6391be-1d", "ovs_interfaceid": "5c6391be-1db0-417c-a94e-89eb0cdbd8e7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:10:25 np0005588920 nova_compute[226886]: 2026-01-20 15:10:25.374 226890 DEBUG nova.network.os_vif_util [None req-0baba652-f080-482b-95b7-e2c897294c2c 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9d:e5:5b,bridge_name='br-int',has_traffic_filtering=True,id=5c6391be-1db0-417c-a94e-89eb0cdbd8e7,network=Network(3967ae21-1590-4685-8881-8bd1bcf25258),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5c6391be-1d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:10:25 np0005588920 nova_compute[226886]: 2026-01-20 15:10:25.374 226890 DEBUG os_vif [None req-0baba652-f080-482b-95b7-e2c897294c2c 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:9d:e5:5b,bridge_name='br-int',has_traffic_filtering=True,id=5c6391be-1db0-417c-a94e-89eb0cdbd8e7,network=Network(3967ae21-1590-4685-8881-8bd1bcf25258),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5c6391be-1d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:10:25 np0005588920 nova_compute[226886]: 2026-01-20 15:10:25.376 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:25 np0005588920 nova_compute[226886]: 2026-01-20 15:10:25.376 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5c6391be-1d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:10:25 np0005588920 nova_compute[226886]: 2026-01-20 15:10:25.379 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:25 np0005588920 nova_compute[226886]: 2026-01-20 15:10:25.380 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:10:25 np0005588920 nova_compute[226886]: 2026-01-20 15:10:25.383 226890 INFO os_vif [None req-0baba652-f080-482b-95b7-e2c897294c2c 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:9d:e5:5b,bridge_name='br-int',has_traffic_filtering=True,id=5c6391be-1db0-417c-a94e-89eb0cdbd8e7,network=Network(3967ae21-1590-4685-8881-8bd1bcf25258),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5c6391be-1d')#033[00m
Jan 20 10:10:25 np0005588920 nova_compute[226886]: 2026-01-20 15:10:25.482 226890 DEBUG nova.compute.manager [req-18b57bd7-a8c9-4463-8051-e1ac5302e897 req-1fe6b764-5886-47e3-9984-f6420076ec15 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Received event network-vif-unplugged-5c6391be-1db0-417c-a94e-89eb0cdbd8e7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:10:25 np0005588920 nova_compute[226886]: 2026-01-20 15:10:25.482 226890 DEBUG oslo_concurrency.lockutils [req-18b57bd7-a8c9-4463-8051-e1ac5302e897 req-1fe6b764-5886-47e3-9984-f6420076ec15 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:10:25 np0005588920 nova_compute[226886]: 2026-01-20 15:10:25.484 226890 DEBUG oslo_concurrency.lockutils [req-18b57bd7-a8c9-4463-8051-e1ac5302e897 req-1fe6b764-5886-47e3-9984-f6420076ec15 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:10:25 np0005588920 nova_compute[226886]: 2026-01-20 15:10:25.485 226890 DEBUG oslo_concurrency.lockutils [req-18b57bd7-a8c9-4463-8051-e1ac5302e897 req-1fe6b764-5886-47e3-9984-f6420076ec15 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:10:25 np0005588920 nova_compute[226886]: 2026-01-20 15:10:25.485 226890 DEBUG nova.compute.manager [req-18b57bd7-a8c9-4463-8051-e1ac5302e897 req-1fe6b764-5886-47e3-9984-f6420076ec15 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] No waiting events found dispatching network-vif-unplugged-5c6391be-1db0-417c-a94e-89eb0cdbd8e7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:10:25 np0005588920 nova_compute[226886]: 2026-01-20 15:10:25.486 226890 DEBUG nova.compute.manager [req-18b57bd7-a8c9-4463-8051-e1ac5302e897 req-1fe6b764-5886-47e3-9984-f6420076ec15 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Received event network-vif-unplugged-5c6391be-1db0-417c-a94e-89eb0cdbd8e7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:10:25 np0005588920 nova_compute[226886]: 2026-01-20 15:10:25.753 226890 INFO nova.virt.libvirt.driver [None req-0baba652-f080-482b-95b7-e2c897294c2c 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Deleting instance files /var/lib/nova/instances/f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1_del#033[00m
Jan 20 10:10:25 np0005588920 nova_compute[226886]: 2026-01-20 15:10:25.755 226890 INFO nova.virt.libvirt.driver [None req-0baba652-f080-482b-95b7-e2c897294c2c 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Deletion of /var/lib/nova/instances/f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1_del complete#033[00m
Jan 20 10:10:25 np0005588920 nova_compute[226886]: 2026-01-20 15:10:25.815 226890 INFO nova.compute.manager [None req-0baba652-f080-482b-95b7-e2c897294c2c 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Took 0.73 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:10:25 np0005588920 nova_compute[226886]: 2026-01-20 15:10:25.816 226890 DEBUG oslo.service.loopingcall [None req-0baba652-f080-482b-95b7-e2c897294c2c 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:10:25 np0005588920 nova_compute[226886]: 2026-01-20 15:10:25.816 226890 DEBUG nova.compute.manager [-] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:10:25 np0005588920 nova_compute[226886]: 2026-01-20 15:10:25.817 226890 DEBUG nova.network.neutron [-] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:10:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:25.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:26.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:27 np0005588920 nova_compute[226886]: 2026-01-20 15:10:27.086 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:27 np0005588920 nova_compute[226886]: 2026-01-20 15:10:27.709 226890 DEBUG nova.compute.manager [req-70acee2a-1511-420e-820c-2ace00fef983 req-57c72928-4749-43ae-9f00-3b0b39c42205 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Received event network-vif-plugged-5c6391be-1db0-417c-a94e-89eb0cdbd8e7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:10:27 np0005588920 nova_compute[226886]: 2026-01-20 15:10:27.710 226890 DEBUG oslo_concurrency.lockutils [req-70acee2a-1511-420e-820c-2ace00fef983 req-57c72928-4749-43ae-9f00-3b0b39c42205 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:10:27 np0005588920 nova_compute[226886]: 2026-01-20 15:10:27.710 226890 DEBUG oslo_concurrency.lockutils [req-70acee2a-1511-420e-820c-2ace00fef983 req-57c72928-4749-43ae-9f00-3b0b39c42205 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:10:27 np0005588920 nova_compute[226886]: 2026-01-20 15:10:27.710 226890 DEBUG oslo_concurrency.lockutils [req-70acee2a-1511-420e-820c-2ace00fef983 req-57c72928-4749-43ae-9f00-3b0b39c42205 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:10:27 np0005588920 nova_compute[226886]: 2026-01-20 15:10:27.710 226890 DEBUG nova.compute.manager [req-70acee2a-1511-420e-820c-2ace00fef983 req-57c72928-4749-43ae-9f00-3b0b39c42205 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] No waiting events found dispatching network-vif-plugged-5c6391be-1db0-417c-a94e-89eb0cdbd8e7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:10:27 np0005588920 nova_compute[226886]: 2026-01-20 15:10:27.711 226890 WARNING nova.compute.manager [req-70acee2a-1511-420e-820c-2ace00fef983 req-57c72928-4749-43ae-9f00-3b0b39c42205 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Received unexpected event network-vif-plugged-5c6391be-1db0-417c-a94e-89eb0cdbd8e7 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 10:10:27 np0005588920 nova_compute[226886]: 2026-01-20 15:10:27.723 226890 DEBUG nova.network.neutron [-] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:10:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e385 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:10:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:27.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:27 np0005588920 podman[291774]: 2026-01-20 15:10:27.968100715 +0000 UTC m=+0.051709667 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:10:28 np0005588920 nova_compute[226886]: 2026-01-20 15:10:28.014 226890 DEBUG nova.compute.manager [req-b9cba53d-a060-4fc9-bbd8-28e1a6ef7f2a req-10fec4b9-60c0-472c-a0de-283eca01b51a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Received event network-vif-deleted-5c6391be-1db0-417c-a94e-89eb0cdbd8e7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:10:28 np0005588920 nova_compute[226886]: 2026-01-20 15:10:28.015 226890 INFO nova.compute.manager [req-b9cba53d-a060-4fc9-bbd8-28e1a6ef7f2a req-10fec4b9-60c0-472c-a0de-283eca01b51a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Neutron deleted interface 5c6391be-1db0-417c-a94e-89eb0cdbd8e7; detaching it from the instance and deleting it from the info cache#033[00m
Jan 20 10:10:28 np0005588920 nova_compute[226886]: 2026-01-20 15:10:28.015 226890 DEBUG nova.network.neutron [req-b9cba53d-a060-4fc9-bbd8-28e1a6ef7f2a req-10fec4b9-60c0-472c-a0de-283eca01b51a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:10:28 np0005588920 nova_compute[226886]: 2026-01-20 15:10:28.017 226890 INFO nova.compute.manager [-] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Took 2.20 seconds to deallocate network for instance.#033[00m
Jan 20 10:10:28 np0005588920 nova_compute[226886]: 2026-01-20 15:10:28.056 226890 DEBUG nova.compute.manager [req-b9cba53d-a060-4fc9-bbd8-28e1a6ef7f2a req-10fec4b9-60c0-472c-a0de-283eca01b51a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Detach interface failed, port_id=5c6391be-1db0-417c-a94e-89eb0cdbd8e7, reason: Instance f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 20 10:10:28 np0005588920 nova_compute[226886]: 2026-01-20 15:10:28.128 226890 DEBUG oslo_concurrency.lockutils [None req-0baba652-f080-482b-95b7-e2c897294c2c 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:10:28 np0005588920 nova_compute[226886]: 2026-01-20 15:10:28.129 226890 DEBUG oslo_concurrency.lockutils [None req-0baba652-f080-482b-95b7-e2c897294c2c 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:10:28 np0005588920 nova_compute[226886]: 2026-01-20 15:10:28.205 226890 DEBUG oslo_concurrency.processutils [None req-0baba652-f080-482b-95b7-e2c897294c2c 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:10:28 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:10:28 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3315978454' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:10:28 np0005588920 nova_compute[226886]: 2026-01-20 15:10:28.628 226890 DEBUG oslo_concurrency.processutils [None req-0baba652-f080-482b-95b7-e2c897294c2c 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:10:28 np0005588920 nova_compute[226886]: 2026-01-20 15:10:28.634 226890 DEBUG nova.compute.provider_tree [None req-0baba652-f080-482b-95b7-e2c897294c2c 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:10:28 np0005588920 nova_compute[226886]: 2026-01-20 15:10:28.827 226890 DEBUG nova.scheduler.client.report [None req-0baba652-f080-482b-95b7-e2c897294c2c 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:10:28 np0005588920 nova_compute[226886]: 2026-01-20 15:10:28.860 226890 DEBUG oslo_concurrency.lockutils [None req-0baba652-f080-482b-95b7-e2c897294c2c 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.731s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:10:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:28.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:28 np0005588920 nova_compute[226886]: 2026-01-20 15:10:28.922 226890 INFO nova.scheduler.client.report [None req-0baba652-f080-482b-95b7-e2c897294c2c 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Deleted allocations for instance f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1#033[00m
Jan 20 10:10:29 np0005588920 nova_compute[226886]: 2026-01-20 15:10:29.028 226890 DEBUG oslo_concurrency.lockutils [None req-0baba652-f080-482b-95b7-e2c897294c2c 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.952s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:10:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:10:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:29.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:10:30 np0005588920 nova_compute[226886]: 2026-01-20 15:10:30.452 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:30.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:31.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:32 np0005588920 nova_compute[226886]: 2026-01-20 15:10:32.089 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e385 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:10:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:32.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:33.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:10:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:34.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:10:35 np0005588920 nova_compute[226886]: 2026-01-20 15:10:35.601 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:10:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:35.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:10:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 10:10:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:36.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 10:10:37 np0005588920 nova_compute[226886]: 2026-01-20 15:10:37.090 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e385 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:10:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:10:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:37.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:10:38 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e386 e386: 3 total, 3 up, 3 in
Jan 20 10:10:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:38.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:10:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:39.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:10:40 np0005588920 nova_compute[226886]: 2026-01-20 15:10:40.354 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768921825.3521457, f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:10:40 np0005588920 nova_compute[226886]: 2026-01-20 15:10:40.355 226890 INFO nova.compute.manager [-] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:10:40 np0005588920 nova_compute[226886]: 2026-01-20 15:10:40.603 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:40 np0005588920 nova_compute[226886]: 2026-01-20 15:10:40.670 226890 DEBUG nova.compute.manager [None req-a7934a47-4c31-439a-801a-cd0059f5eabc - - - - - -] [instance: f2e5fbe3-6cd4-45e7-a3ba-50021d42e4f1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:10:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:10:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:40.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:10:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:41.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:42 np0005588920 nova_compute[226886]: 2026-01-20 15:10:42.092 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:42 np0005588920 nova_compute[226886]: 2026-01-20 15:10:42.629 226890 DEBUG oslo_concurrency.lockutils [None req-dba94263-ba84-490e-ac6c-1651b5983ad8 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Acquiring lock "2a67c102-89d1-4196-bc8e-663656945547" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:10:42 np0005588920 nova_compute[226886]: 2026-01-20 15:10:42.630 226890 DEBUG oslo_concurrency.lockutils [None req-dba94263-ba84-490e-ac6c-1651b5983ad8 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "2a67c102-89d1-4196-bc8e-663656945547" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:10:42 np0005588920 nova_compute[226886]: 2026-01-20 15:10:42.630 226890 DEBUG oslo_concurrency.lockutils [None req-dba94263-ba84-490e-ac6c-1651b5983ad8 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Acquiring lock "2a67c102-89d1-4196-bc8e-663656945547-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:10:42 np0005588920 nova_compute[226886]: 2026-01-20 15:10:42.630 226890 DEBUG oslo_concurrency.lockutils [None req-dba94263-ba84-490e-ac6c-1651b5983ad8 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "2a67c102-89d1-4196-bc8e-663656945547-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:10:42 np0005588920 nova_compute[226886]: 2026-01-20 15:10:42.631 226890 DEBUG oslo_concurrency.lockutils [None req-dba94263-ba84-490e-ac6c-1651b5983ad8 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "2a67c102-89d1-4196-bc8e-663656945547-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:10:42 np0005588920 nova_compute[226886]: 2026-01-20 15:10:42.632 226890 INFO nova.compute.manager [None req-dba94263-ba84-490e-ac6c-1651b5983ad8 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Terminating instance#033[00m
Jan 20 10:10:42 np0005588920 nova_compute[226886]: 2026-01-20 15:10:42.633 226890 DEBUG nova.compute.manager [None req-dba94263-ba84-490e-ac6c-1651b5983ad8 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:10:42 np0005588920 kernel: tap12d4a8f6-90 (unregistering): left promiscuous mode
Jan 20 10:10:42 np0005588920 NetworkManager[49076]: <info>  [1768921842.6878] device (tap12d4a8f6-90): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:10:42 np0005588920 ovn_controller[133971]: 2026-01-20T15:10:42Z|00819|binding|INFO|Releasing lport 12d4a8f6-904d-4ec5-8062-530c89300b7c from this chassis (sb_readonly=0)
Jan 20 10:10:42 np0005588920 ovn_controller[133971]: 2026-01-20T15:10:42Z|00820|binding|INFO|Setting lport 12d4a8f6-904d-4ec5-8062-530c89300b7c down in Southbound
Jan 20 10:10:42 np0005588920 nova_compute[226886]: 2026-01-20 15:10:42.695 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:42 np0005588920 ovn_controller[133971]: 2026-01-20T15:10:42Z|00821|binding|INFO|Removing iface tap12d4a8f6-90 ovn-installed in OVS
Jan 20 10:10:42 np0005588920 nova_compute[226886]: 2026-01-20 15:10:42.715 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:42 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:10:42.731 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d4:df:63 10.100.0.14'], port_security=['fa:16:3e:d4:df:63 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '2a67c102-89d1-4196-bc8e-663656945547', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3967ae21-1590-4685-8881-8bd1bcf25258', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fc74c4a296554866969b05aef75252af', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'd56d9c6d-4bb5-4a73-ab07-9e0ee1fd3b93', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=89ced88f-b1ed-4329-8a53-1931e6b0e3e9, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=12d4a8f6-904d-4ec5-8062-530c89300b7c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:10:42 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:10:42.733 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 12d4a8f6-904d-4ec5-8062-530c89300b7c in datapath 3967ae21-1590-4685-8881-8bd1bcf25258 unbound from our chassis#033[00m
Jan 20 10:10:42 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:10:42.734 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3967ae21-1590-4685-8881-8bd1bcf25258, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:10:42 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:10:42.735 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ee3fad68-31b5-4b43-90c7-01db3c35a085]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:42 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:10:42.735 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258 namespace which is not needed anymore#033[00m
Jan 20 10:10:42 np0005588920 systemd[1]: machine-qemu\x2d81\x2dinstance\x2d000000a7.scope: Deactivated successfully.
Jan 20 10:10:42 np0005588920 systemd[1]: machine-qemu\x2d81\x2dinstance\x2d000000a7.scope: Consumed 16.660s CPU time.
Jan 20 10:10:42 np0005588920 systemd-machined[196121]: Machine qemu-81-instance-000000a7 terminated.
Jan 20 10:10:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e386 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:10:42 np0005588920 neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258[290120]: [NOTICE]   (290124) : haproxy version is 2.8.14-c23fe91
Jan 20 10:10:42 np0005588920 neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258[290120]: [NOTICE]   (290124) : path to executable is /usr/sbin/haproxy
Jan 20 10:10:42 np0005588920 neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258[290120]: [WARNING]  (290124) : Exiting Master process...
Jan 20 10:10:42 np0005588920 neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258[290120]: [ALERT]    (290124) : Current worker (290126) exited with code 143 (Terminated)
Jan 20 10:10:42 np0005588920 neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258[290120]: [WARNING]  (290124) : All workers exited. Exiting... (0)
Jan 20 10:10:42 np0005588920 systemd[1]: libpod-7132772147ba22f582371404780f330d86d04acfa7514028c151d52a85ec372d.scope: Deactivated successfully.
Jan 20 10:10:42 np0005588920 podman[291840]: 2026-01-20 15:10:42.865524753 +0000 UTC m=+0.044260200 container died 7132772147ba22f582371404780f330d86d04acfa7514028c151d52a85ec372d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 20 10:10:42 np0005588920 nova_compute[226886]: 2026-01-20 15:10:42.863 226890 INFO nova.virt.libvirt.driver [-] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Instance destroyed successfully.#033[00m
Jan 20 10:10:42 np0005588920 nova_compute[226886]: 2026-01-20 15:10:42.864 226890 DEBUG nova.objects.instance [None req-dba94263-ba84-490e-ac6c-1651b5983ad8 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lazy-loading 'resources' on Instance uuid 2a67c102-89d1-4196-bc8e-663656945547 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:10:42 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7132772147ba22f582371404780f330d86d04acfa7514028c151d52a85ec372d-userdata-shm.mount: Deactivated successfully.
Jan 20 10:10:42 np0005588920 systemd[1]: var-lib-containers-storage-overlay-8875d7849dc86efef31f11884c65956a5d5b32fb9ed08c052b43ce30db2f1d38-merged.mount: Deactivated successfully.
Jan 20 10:10:42 np0005588920 podman[291840]: 2026-01-20 15:10:42.906760794 +0000 UTC m=+0.085496211 container cleanup 7132772147ba22f582371404780f330d86d04acfa7514028c151d52a85ec372d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 20 10:10:42 np0005588920 systemd[1]: libpod-conmon-7132772147ba22f582371404780f330d86d04acfa7514028c151d52a85ec372d.scope: Deactivated successfully.
Jan 20 10:10:42 np0005588920 nova_compute[226886]: 2026-01-20 15:10:42.915 226890 DEBUG nova.virt.libvirt.vif [None req-dba94263-ba84-490e-ac6c-1651b5983ad8 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:07:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1209466933',display_name='tempest-ServerRescueNegativeTestJSON-server-1209466933',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-1209466933',id=167,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:08:43Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fc74c4a296554866969b05aef75252af',ramdisk_id='',reservation_id='r-79z9txcr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif
_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueNegativeTestJSON-1649662639',owner_user_name='tempest-ServerRescueNegativeTestJSON-1649662639-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:08:52Z,user_data=None,user_id='27658864f96d453586dd0846a4c55b7d',uuid=2a67c102-89d1-4196-bc8e-663656945547,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "12d4a8f6-904d-4ec5-8062-530c89300b7c", "address": "fa:16:3e:d4:df:63", "network": {"id": "3967ae21-1590-4685-8881-8bd1bcf25258", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fc74c4a296554866969b05aef75252af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12d4a8f6-90", "ovs_interfaceid": "12d4a8f6-904d-4ec5-8062-530c89300b7c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:10:42 np0005588920 nova_compute[226886]: 2026-01-20 15:10:42.916 226890 DEBUG nova.network.os_vif_util [None req-dba94263-ba84-490e-ac6c-1651b5983ad8 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Converting VIF {"id": "12d4a8f6-904d-4ec5-8062-530c89300b7c", "address": "fa:16:3e:d4:df:63", "network": {"id": "3967ae21-1590-4685-8881-8bd1bcf25258", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fc74c4a296554866969b05aef75252af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12d4a8f6-90", "ovs_interfaceid": "12d4a8f6-904d-4ec5-8062-530c89300b7c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:10:42 np0005588920 nova_compute[226886]: 2026-01-20 15:10:42.916 226890 DEBUG nova.network.os_vif_util [None req-dba94263-ba84-490e-ac6c-1651b5983ad8 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d4:df:63,bridge_name='br-int',has_traffic_filtering=True,id=12d4a8f6-904d-4ec5-8062-530c89300b7c,network=Network(3967ae21-1590-4685-8881-8bd1bcf25258),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap12d4a8f6-90') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:10:42 np0005588920 nova_compute[226886]: 2026-01-20 15:10:42.917 226890 DEBUG os_vif [None req-dba94263-ba84-490e-ac6c-1651b5983ad8 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d4:df:63,bridge_name='br-int',has_traffic_filtering=True,id=12d4a8f6-904d-4ec5-8062-530c89300b7c,network=Network(3967ae21-1590-4685-8881-8bd1bcf25258),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap12d4a8f6-90') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:10:42 np0005588920 nova_compute[226886]: 2026-01-20 15:10:42.918 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:42 np0005588920 nova_compute[226886]: 2026-01-20 15:10:42.918 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap12d4a8f6-90, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:10:42 np0005588920 nova_compute[226886]: 2026-01-20 15:10:42.919 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:42 np0005588920 nova_compute[226886]: 2026-01-20 15:10:42.921 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:42 np0005588920 nova_compute[226886]: 2026-01-20 15:10:42.923 226890 INFO os_vif [None req-dba94263-ba84-490e-ac6c-1651b5983ad8 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d4:df:63,bridge_name='br-int',has_traffic_filtering=True,id=12d4a8f6-904d-4ec5-8062-530c89300b7c,network=Network(3967ae21-1590-4685-8881-8bd1bcf25258),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap12d4a8f6-90')#033[00m
Jan 20 10:10:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:10:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:42.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:10:42 np0005588920 podman[291878]: 2026-01-20 15:10:42.972053415 +0000 UTC m=+0.042050265 container remove 7132772147ba22f582371404780f330d86d04acfa7514028c151d52a85ec372d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 20 10:10:42 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:10:42.977 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ce582f34-907e-44c4-8570-133aa86c7b76]: (4, ('Tue Jan 20 03:10:42 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258 (7132772147ba22f582371404780f330d86d04acfa7514028c151d52a85ec372d)\n7132772147ba22f582371404780f330d86d04acfa7514028c151d52a85ec372d\nTue Jan 20 03:10:42 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258 (7132772147ba22f582371404780f330d86d04acfa7514028c151d52a85ec372d)\n7132772147ba22f582371404780f330d86d04acfa7514028c151d52a85ec372d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:42 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:10:42.979 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[4196b3e7-8179-411b-a69b-40f6237fd2e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:42 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:10:42.980 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3967ae21-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:10:42 np0005588920 nova_compute[226886]: 2026-01-20 15:10:42.981 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:42 np0005588920 kernel: tap3967ae21-10: left promiscuous mode
Jan 20 10:10:42 np0005588920 nova_compute[226886]: 2026-01-20 15:10:42.995 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:42 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:10:42.997 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2dae4b6f-c56c-4c2b-ab49-b1038bc888ca]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:10:43.012 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[26e52c65-8578-40ce-83f2-35c892411da9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:10:43.013 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d4979966-52d7-4e8e-916d-35c1957d8f14]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:10:43.027 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[72ddb2a7-c10d-4d13-9286-9a7692bdcbfe]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 677239, 'reachable_time': 24289, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 291911, 'error': None, 'target': 'ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:43 np0005588920 systemd[1]: run-netns-ovnmeta\x2d3967ae21\x2d1590\x2d4685\x2d8881\x2d8bd1bcf25258.mount: Deactivated successfully.
Jan 20 10:10:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:10:43.031 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:10:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:10:43.032 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[572794da-cb73-4b4a-a0c0-bba7482eea47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:43 np0005588920 nova_compute[226886]: 2026-01-20 15:10:43.274 226890 INFO nova.virt.libvirt.driver [None req-dba94263-ba84-490e-ac6c-1651b5983ad8 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Deleting instance files /var/lib/nova/instances/2a67c102-89d1-4196-bc8e-663656945547_del#033[00m
Jan 20 10:10:43 np0005588920 nova_compute[226886]: 2026-01-20 15:10:43.275 226890 INFO nova.virt.libvirt.driver [None req-dba94263-ba84-490e-ac6c-1651b5983ad8 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Deletion of /var/lib/nova/instances/2a67c102-89d1-4196-bc8e-663656945547_del complete#033[00m
Jan 20 10:10:43 np0005588920 nova_compute[226886]: 2026-01-20 15:10:43.334 226890 DEBUG nova.compute.manager [req-ff09cf42-1f4f-4b29-8dd8-2f0e255ee918 req-e9b5bf17-cbe6-4226-bcb2-600654718690 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Received event network-vif-unplugged-12d4a8f6-904d-4ec5-8062-530c89300b7c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:10:43 np0005588920 nova_compute[226886]: 2026-01-20 15:10:43.335 226890 DEBUG oslo_concurrency.lockutils [req-ff09cf42-1f4f-4b29-8dd8-2f0e255ee918 req-e9b5bf17-cbe6-4226-bcb2-600654718690 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "2a67c102-89d1-4196-bc8e-663656945547-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:10:43 np0005588920 nova_compute[226886]: 2026-01-20 15:10:43.336 226890 DEBUG oslo_concurrency.lockutils [req-ff09cf42-1f4f-4b29-8dd8-2f0e255ee918 req-e9b5bf17-cbe6-4226-bcb2-600654718690 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2a67c102-89d1-4196-bc8e-663656945547-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:10:43 np0005588920 nova_compute[226886]: 2026-01-20 15:10:43.336 226890 DEBUG oslo_concurrency.lockutils [req-ff09cf42-1f4f-4b29-8dd8-2f0e255ee918 req-e9b5bf17-cbe6-4226-bcb2-600654718690 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2a67c102-89d1-4196-bc8e-663656945547-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:10:43 np0005588920 nova_compute[226886]: 2026-01-20 15:10:43.337 226890 DEBUG nova.compute.manager [req-ff09cf42-1f4f-4b29-8dd8-2f0e255ee918 req-e9b5bf17-cbe6-4226-bcb2-600654718690 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] No waiting events found dispatching network-vif-unplugged-12d4a8f6-904d-4ec5-8062-530c89300b7c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:10:43 np0005588920 nova_compute[226886]: 2026-01-20 15:10:43.337 226890 DEBUG nova.compute.manager [req-ff09cf42-1f4f-4b29-8dd8-2f0e255ee918 req-e9b5bf17-cbe6-4226-bcb2-600654718690 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Received event network-vif-unplugged-12d4a8f6-904d-4ec5-8062-530c89300b7c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:10:43 np0005588920 nova_compute[226886]: 2026-01-20 15:10:43.339 226890 INFO nova.compute.manager [None req-dba94263-ba84-490e-ac6c-1651b5983ad8 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Took 0.71 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:10:43 np0005588920 nova_compute[226886]: 2026-01-20 15:10:43.340 226890 DEBUG oslo.service.loopingcall [None req-dba94263-ba84-490e-ac6c-1651b5983ad8 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:10:43 np0005588920 nova_compute[226886]: 2026-01-20 15:10:43.340 226890 DEBUG nova.compute.manager [-] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:10:43 np0005588920 nova_compute[226886]: 2026-01-20 15:10:43.341 226890 DEBUG nova.network.neutron [-] [instance: 2a67c102-89d1-4196-bc8e-663656945547] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:10:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:43.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:44 np0005588920 nova_compute[226886]: 2026-01-20 15:10:44.167 226890 DEBUG nova.network.neutron [-] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:10:44 np0005588920 nova_compute[226886]: 2026-01-20 15:10:44.182 226890 INFO nova.compute.manager [-] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Took 0.84 seconds to deallocate network for instance.#033[00m
Jan 20 10:10:44 np0005588920 nova_compute[226886]: 2026-01-20 15:10:44.253 226890 DEBUG oslo_concurrency.lockutils [None req-dba94263-ba84-490e-ac6c-1651b5983ad8 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:10:44 np0005588920 nova_compute[226886]: 2026-01-20 15:10:44.253 226890 DEBUG oslo_concurrency.lockutils [None req-dba94263-ba84-490e-ac6c-1651b5983ad8 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:10:44 np0005588920 nova_compute[226886]: 2026-01-20 15:10:44.316 226890 DEBUG oslo_concurrency.processutils [None req-dba94263-ba84-490e-ac6c-1651b5983ad8 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:10:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:10:44 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1430308450' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:10:44 np0005588920 nova_compute[226886]: 2026-01-20 15:10:44.742 226890 DEBUG oslo_concurrency.processutils [None req-dba94263-ba84-490e-ac6c-1651b5983ad8 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.426s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:10:44 np0005588920 nova_compute[226886]: 2026-01-20 15:10:44.748 226890 DEBUG nova.compute.provider_tree [None req-dba94263-ba84-490e-ac6c-1651b5983ad8 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:10:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:10:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:44.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:10:45 np0005588920 nova_compute[226886]: 2026-01-20 15:10:45.024 226890 DEBUG nova.scheduler.client.report [None req-dba94263-ba84-490e-ac6c-1651b5983ad8 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:10:45 np0005588920 nova_compute[226886]: 2026-01-20 15:10:45.217 226890 DEBUG oslo_concurrency.lockutils [None req-dba94263-ba84-490e-ac6c-1651b5983ad8 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.964s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:10:45 np0005588920 nova_compute[226886]: 2026-01-20 15:10:45.322 226890 INFO nova.scheduler.client.report [None req-dba94263-ba84-490e-ac6c-1651b5983ad8 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Deleted allocations for instance 2a67c102-89d1-4196-bc8e-663656945547#033[00m
Jan 20 10:10:45 np0005588920 nova_compute[226886]: 2026-01-20 15:10:45.526 226890 DEBUG nova.compute.manager [req-40d7ece2-2f28-4a6e-89f1-2615faf93b3b req-a342747c-7d0d-4191-9e78-60ba3c756df3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Received event network-vif-plugged-12d4a8f6-904d-4ec5-8062-530c89300b7c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:10:45 np0005588920 nova_compute[226886]: 2026-01-20 15:10:45.527 226890 DEBUG oslo_concurrency.lockutils [req-40d7ece2-2f28-4a6e-89f1-2615faf93b3b req-a342747c-7d0d-4191-9e78-60ba3c756df3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "2a67c102-89d1-4196-bc8e-663656945547-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:10:45 np0005588920 nova_compute[226886]: 2026-01-20 15:10:45.527 226890 DEBUG oslo_concurrency.lockutils [req-40d7ece2-2f28-4a6e-89f1-2615faf93b3b req-a342747c-7d0d-4191-9e78-60ba3c756df3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2a67c102-89d1-4196-bc8e-663656945547-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:10:45 np0005588920 nova_compute[226886]: 2026-01-20 15:10:45.527 226890 DEBUG oslo_concurrency.lockutils [req-40d7ece2-2f28-4a6e-89f1-2615faf93b3b req-a342747c-7d0d-4191-9e78-60ba3c756df3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2a67c102-89d1-4196-bc8e-663656945547-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:10:45 np0005588920 nova_compute[226886]: 2026-01-20 15:10:45.527 226890 DEBUG nova.compute.manager [req-40d7ece2-2f28-4a6e-89f1-2615faf93b3b req-a342747c-7d0d-4191-9e78-60ba3c756df3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] No waiting events found dispatching network-vif-plugged-12d4a8f6-904d-4ec5-8062-530c89300b7c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:10:45 np0005588920 nova_compute[226886]: 2026-01-20 15:10:45.528 226890 WARNING nova.compute.manager [req-40d7ece2-2f28-4a6e-89f1-2615faf93b3b req-a342747c-7d0d-4191-9e78-60ba3c756df3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Received unexpected event network-vif-plugged-12d4a8f6-904d-4ec5-8062-530c89300b7c for instance with vm_state deleted and task_state None.#033[00m
Jan 20 10:10:45 np0005588920 nova_compute[226886]: 2026-01-20 15:10:45.528 226890 DEBUG nova.compute.manager [req-40d7ece2-2f28-4a6e-89f1-2615faf93b3b req-a342747c-7d0d-4191-9e78-60ba3c756df3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Received event network-vif-deleted-12d4a8f6-904d-4ec5-8062-530c89300b7c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:10:45 np0005588920 nova_compute[226886]: 2026-01-20 15:10:45.531 226890 DEBUG oslo_concurrency.lockutils [None req-dba94263-ba84-490e-ac6c-1651b5983ad8 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "2a67c102-89d1-4196-bc8e-663656945547" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.901s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:10:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:45.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:46.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:47 np0005588920 podman[291935]: 2026-01-20 15:10:47.022928308 +0000 UTC m=+0.098335715 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 20 10:10:47 np0005588920 nova_compute[226886]: 2026-01-20 15:10:47.094 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e386 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:10:47 np0005588920 nova_compute[226886]: 2026-01-20 15:10:47.921 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:47.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:48.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:49.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:50 np0005588920 nova_compute[226886]: 2026-01-20 15:10:50.275 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:50 np0005588920 nova_compute[226886]: 2026-01-20 15:10:50.563 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:50 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:10:50 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2095857021' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:10:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 10:10:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:50.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 10:10:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:51.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:52 np0005588920 nova_compute[226886]: 2026-01-20 15:10:52.097 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e386 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:10:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:10:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:52.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:10:52 np0005588920 nova_compute[226886]: 2026-01-20 15:10:52.963 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 10:10:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:53.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 10:10:54 np0005588920 nova_compute[226886]: 2026-01-20 15:10:54.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:10:54 np0005588920 nova_compute[226886]: 2026-01-20 15:10:54.726 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:10:54 np0005588920 nova_compute[226886]: 2026-01-20 15:10:54.758 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:10:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:54.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:10:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:55.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:10:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:10:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:56.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:10:57 np0005588920 nova_compute[226886]: 2026-01-20 15:10:57.099 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e386 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:10:57 np0005588920 nova_compute[226886]: 2026-01-20 15:10:57.862 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768921842.861576, 2a67c102-89d1-4196-bc8e-663656945547 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:10:57 np0005588920 nova_compute[226886]: 2026-01-20 15:10:57.862 226890 INFO nova.compute.manager [-] [instance: 2a67c102-89d1-4196-bc8e-663656945547] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:10:57 np0005588920 nova_compute[226886]: 2026-01-20 15:10:57.893 226890 DEBUG nova.compute.manager [None req-5f5a60a1-a0b1-4d38-8e57-a69b57abed96 - - - - - -] [instance: 2a67c102-89d1-4196-bc8e-663656945547] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:10:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:10:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:58.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:10:58 np0005588920 nova_compute[226886]: 2026-01-20 15:10:58.003 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:58 np0005588920 nova_compute[226886]: 2026-01-20 15:10:58.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:10:58 np0005588920 nova_compute[226886]: 2026-01-20 15:10:58.746 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:10:58 np0005588920 nova_compute[226886]: 2026-01-20 15:10:58.746 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:10:58 np0005588920 nova_compute[226886]: 2026-01-20 15:10:58.746 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:10:58 np0005588920 nova_compute[226886]: 2026-01-20 15:10:58.747 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:10:58 np0005588920 nova_compute[226886]: 2026-01-20 15:10:58.747 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:10:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:10:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:58.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:58 np0005588920 podman[291984]: 2026-01-20 15:10:58.968983268 +0000 UTC m=+0.051057728 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 20 10:10:59 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:10:59 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2215610935' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:10:59 np0005588920 nova_compute[226886]: 2026-01-20 15:10:59.189 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:10:59 np0005588920 nova_compute[226886]: 2026-01-20 15:10:59.367 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:10:59 np0005588920 nova_compute[226886]: 2026-01-20 15:10:59.369 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4205MB free_disk=20.94251251220703GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:10:59 np0005588920 nova_compute[226886]: 2026-01-20 15:10:59.369 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:10:59 np0005588920 nova_compute[226886]: 2026-01-20 15:10:59.369 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:10:59 np0005588920 nova_compute[226886]: 2026-01-20 15:10:59.462 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:10:59 np0005588920 nova_compute[226886]: 2026-01-20 15:10:59.463 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:10:59 np0005588920 nova_compute[226886]: 2026-01-20 15:10:59.493 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:10:59 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:10:59 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2920973194' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:10:59 np0005588920 nova_compute[226886]: 2026-01-20 15:10:59.954 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:10:59 np0005588920 nova_compute[226886]: 2026-01-20 15:10:59.960 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:10:59 np0005588920 nova_compute[226886]: 2026-01-20 15:10:59.981 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:11:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:00.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:00 np0005588920 nova_compute[226886]: 2026-01-20 15:11:00.017 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:11:00 np0005588920 nova_compute[226886]: 2026-01-20 15:11:00.017 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.648s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:11:00 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e387 e387: 3 total, 3 up, 3 in
Jan 20 10:11:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:00.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:01 np0005588920 nova_compute[226886]: 2026-01-20 15:11:01.017 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:11:01 np0005588920 nova_compute[226886]: 2026-01-20 15:11:01.019 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:11:01 np0005588920 nova_compute[226886]: 2026-01-20 15:11:01.019 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:11:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:02.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:02 np0005588920 nova_compute[226886]: 2026-01-20 15:11:02.100 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:02 np0005588920 nova_compute[226886]: 2026-01-20 15:11:02.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:11:02 np0005588920 nova_compute[226886]: 2026-01-20 15:11:02.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:11:02 np0005588920 nova_compute[226886]: 2026-01-20 15:11:02.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:11:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e387 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:11:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:02.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:03 np0005588920 nova_compute[226886]: 2026-01-20 15:11:03.005 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:03 np0005588920 nova_compute[226886]: 2026-01-20 15:11:03.721 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:11:03 np0005588920 nova_compute[226886]: 2026-01-20 15:11:03.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:11:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 10:11:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:04.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 10:11:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:04.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:06.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:06 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e388 e388: 3 total, 3 up, 3 in
Jan 20 10:11:06 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #56. Immutable memtables: 12.
Jan 20 10:11:06 np0005588920 nova_compute[226886]: 2026-01-20 15:11:06.721 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:11:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:06.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:07 np0005588920 nova_compute[226886]: 2026-01-20 15:11:07.139 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:11:08 np0005588920 nova_compute[226886]: 2026-01-20 15:11:08.007 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:08.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:08.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:09 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:11:09.220 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=57, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=56) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:11:09 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:11:09.221 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:11:09 np0005588920 nova_compute[226886]: 2026-01-20 15:11:09.221 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:11:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:10.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:11:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:10.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:11:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:12.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:11:12 np0005588920 nova_compute[226886]: 2026-01-20 15:11:12.141 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:11:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:11:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:12.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:11:13 np0005588920 nova_compute[226886]: 2026-01-20 15:11:13.065 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:11:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:14.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:11:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:11:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:14.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:11:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:11:15.223 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '57'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:11:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:11:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:16.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:11:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:11:16.471 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:11:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:11:16.471 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:11:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:11:16.471 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:11:16 np0005588920 nova_compute[226886]: 2026-01-20 15:11:16.589 226890 DEBUG oslo_concurrency.lockutils [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Acquiring lock "950f84b7-e9c2-415c-9946-315a443331c9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:11:16 np0005588920 nova_compute[226886]: 2026-01-20 15:11:16.589 226890 DEBUG oslo_concurrency.lockutils [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "950f84b7-e9c2-415c-9946-315a443331c9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:11:16 np0005588920 nova_compute[226886]: 2026-01-20 15:11:16.608 226890 DEBUG nova.compute.manager [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 10:11:16 np0005588920 nova_compute[226886]: 2026-01-20 15:11:16.681 226890 DEBUG oslo_concurrency.lockutils [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:11:16 np0005588920 nova_compute[226886]: 2026-01-20 15:11:16.682 226890 DEBUG oslo_concurrency.lockutils [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:11:16 np0005588920 nova_compute[226886]: 2026-01-20 15:11:16.688 226890 DEBUG nova.virt.hardware [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 10:11:16 np0005588920 nova_compute[226886]: 2026-01-20 15:11:16.688 226890 INFO nova.compute.claims [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 10:11:16 np0005588920 nova_compute[226886]: 2026-01-20 15:11:16.822 226890 DEBUG oslo_concurrency.processutils [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:11:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:16.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:17 np0005588920 nova_compute[226886]: 2026-01-20 15:11:17.142 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:11:17 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2499990403' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:11:17 np0005588920 nova_compute[226886]: 2026-01-20 15:11:17.306 226890 DEBUG oslo_concurrency.processutils [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:11:17 np0005588920 nova_compute[226886]: 2026-01-20 15:11:17.314 226890 DEBUG nova.compute.provider_tree [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:11:17 np0005588920 nova_compute[226886]: 2026-01-20 15:11:17.342 226890 DEBUG nova.scheduler.client.report [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:11:17 np0005588920 nova_compute[226886]: 2026-01-20 15:11:17.383 226890 DEBUG oslo_concurrency.lockutils [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.701s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:11:17 np0005588920 nova_compute[226886]: 2026-01-20 15:11:17.384 226890 DEBUG nova.compute.manager [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 10:11:17 np0005588920 nova_compute[226886]: 2026-01-20 15:11:17.463 226890 DEBUG nova.compute.manager [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 10:11:17 np0005588920 nova_compute[226886]: 2026-01-20 15:11:17.463 226890 DEBUG nova.network.neutron [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 10:11:17 np0005588920 nova_compute[226886]: 2026-01-20 15:11:17.489 226890 INFO nova.virt.libvirt.driver [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 10:11:17 np0005588920 nova_compute[226886]: 2026-01-20 15:11:17.533 226890 DEBUG nova.compute.manager [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 10:11:17 np0005588920 nova_compute[226886]: 2026-01-20 15:11:17.604 226890 INFO nova.virt.block_device [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Booting with volume fd11049d-6334-4d6c-ac5d-8cfeca690b75 at /dev/vda#033[00m
Jan 20 10:11:17 np0005588920 nova_compute[226886]: 2026-01-20 15:11:17.808 226890 DEBUG os_brick.utils [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 20 10:11:17 np0005588920 nova_compute[226886]: 2026-01-20 15:11:17.809 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:11:17 np0005588920 nova_compute[226886]: 2026-01-20 15:11:17.820 231437 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:11:17 np0005588920 nova_compute[226886]: 2026-01-20 15:11:17.821 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[470bea7d-6c46-43cc-9f57-6ebd7c19b7aa]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:11:17 np0005588920 nova_compute[226886]: 2026-01-20 15:11:17.822 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:11:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:11:17 np0005588920 nova_compute[226886]: 2026-01-20 15:11:17.830 231437 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:11:17 np0005588920 nova_compute[226886]: 2026-01-20 15:11:17.830 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[832bd940-fecb-4f07-a94a-fce65478a59d]: (4, ('InitiatorName=iqn.1994-05.com.redhat:f7e7b2e5d28e', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:11:17 np0005588920 nova_compute[226886]: 2026-01-20 15:11:17.832 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:11:17 np0005588920 nova_compute[226886]: 2026-01-20 15:11:17.840 231437 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:11:17 np0005588920 nova_compute[226886]: 2026-01-20 15:11:17.840 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[641eba69-9332-4078-a62c-1564876d432e]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:11:17 np0005588920 nova_compute[226886]: 2026-01-20 15:11:17.842 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[4640bab8-0412-4df2-8848-9d600ddafd0d]: (4, '1190ec40-2b89-4358-8dec-733c5829fbed') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:11:17 np0005588920 nova_compute[226886]: 2026-01-20 15:11:17.842 226890 DEBUG oslo_concurrency.processutils [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:11:17 np0005588920 nova_compute[226886]: 2026-01-20 15:11:17.872 226890 DEBUG oslo_concurrency.processutils [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] CMD "nvme version" returned: 0 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:11:17 np0005588920 nova_compute[226886]: 2026-01-20 15:11:17.874 226890 DEBUG os_brick.initiator.connectors.lightos [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 20 10:11:17 np0005588920 nova_compute[226886]: 2026-01-20 15:11:17.875 226890 DEBUG os_brick.initiator.connectors.lightos [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 20 10:11:17 np0005588920 nova_compute[226886]: 2026-01-20 15:11:17.875 226890 DEBUG os_brick.initiator.connectors.lightos [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 20 10:11:17 np0005588920 nova_compute[226886]: 2026-01-20 15:11:17.875 226890 DEBUG os_brick.utils [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] <== get_connector_properties: return (67ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:f7e7b2e5d28e', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '1190ec40-2b89-4358-8dec-733c5829fbed', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 20 10:11:17 np0005588920 nova_compute[226886]: 2026-01-20 15:11:17.876 226890 DEBUG nova.virt.block_device [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Updating existing volume attachment record: 20ae1978-9442-4aff-9d5a-ea601c28b117 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 20 10:11:17 np0005588920 podman[292057]: 2026-01-20 15:11:17.981245611 +0000 UTC m=+0.074144580 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 20 10:11:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:11:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:18.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:11:18 np0005588920 nova_compute[226886]: 2026-01-20 15:11:18.097 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:18 np0005588920 nova_compute[226886]: 2026-01-20 15:11:18.261 226890 DEBUG nova.policy [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bf422e55e158420cbdae75f07a3bb97a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a49638950e1543fa8e0d251af5479623', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 10:11:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:18.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:19 np0005588920 nova_compute[226886]: 2026-01-20 15:11:19.003 226890 DEBUG nova.compute.manager [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 10:11:19 np0005588920 nova_compute[226886]: 2026-01-20 15:11:19.005 226890 DEBUG nova.virt.libvirt.driver [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 10:11:19 np0005588920 nova_compute[226886]: 2026-01-20 15:11:19.006 226890 INFO nova.virt.libvirt.driver [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Creating image(s)#033[00m
Jan 20 10:11:19 np0005588920 nova_compute[226886]: 2026-01-20 15:11:19.007 226890 DEBUG nova.virt.libvirt.driver [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 20 10:11:19 np0005588920 nova_compute[226886]: 2026-01-20 15:11:19.008 226890 DEBUG nova.virt.libvirt.driver [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Ensure instance console log exists: /var/lib/nova/instances/950f84b7-e9c2-415c-9946-315a443331c9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:11:19 np0005588920 nova_compute[226886]: 2026-01-20 15:11:19.009 226890 DEBUG oslo_concurrency.lockutils [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:11:19 np0005588920 nova_compute[226886]: 2026-01-20 15:11:19.009 226890 DEBUG oslo_concurrency.lockutils [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:11:19 np0005588920 nova_compute[226886]: 2026-01-20 15:11:19.010 226890 DEBUG oslo_concurrency.lockutils [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:11:19 np0005588920 nova_compute[226886]: 2026-01-20 15:11:19.439 226890 DEBUG nova.network.neutron [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Successfully created port: c9a63e6d-7cf9-437e-8c19-8faadd126360 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 10:11:19 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 20 10:11:19 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:11:19 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:11:19 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:11:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:20.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:20 np0005588920 nova_compute[226886]: 2026-01-20 15:11:20.352 226890 DEBUG nova.network.neutron [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Successfully updated port: c9a63e6d-7cf9-437e-8c19-8faadd126360 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 10:11:20 np0005588920 nova_compute[226886]: 2026-01-20 15:11:20.370 226890 DEBUG oslo_concurrency.lockutils [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Acquiring lock "refresh_cache-950f84b7-e9c2-415c-9946-315a443331c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:11:20 np0005588920 nova_compute[226886]: 2026-01-20 15:11:20.370 226890 DEBUG oslo_concurrency.lockutils [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Acquired lock "refresh_cache-950f84b7-e9c2-415c-9946-315a443331c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:11:20 np0005588920 nova_compute[226886]: 2026-01-20 15:11:20.371 226890 DEBUG nova.network.neutron [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:11:20 np0005588920 nova_compute[226886]: 2026-01-20 15:11:20.503 226890 DEBUG nova.compute.manager [req-cd2837e3-c19d-4ac0-93d0-da7d0e2d6267 req-89d7295c-4208-4f9f-84a6-f390e2ca03bf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Received event network-changed-c9a63e6d-7cf9-437e-8c19-8faadd126360 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:11:20 np0005588920 nova_compute[226886]: 2026-01-20 15:11:20.503 226890 DEBUG nova.compute.manager [req-cd2837e3-c19d-4ac0-93d0-da7d0e2d6267 req-89d7295c-4208-4f9f-84a6-f390e2ca03bf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Refreshing instance network info cache due to event network-changed-c9a63e6d-7cf9-437e-8c19-8faadd126360. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:11:20 np0005588920 nova_compute[226886]: 2026-01-20 15:11:20.503 226890 DEBUG oslo_concurrency.lockutils [req-cd2837e3-c19d-4ac0-93d0-da7d0e2d6267 req-89d7295c-4208-4f9f-84a6-f390e2ca03bf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-950f84b7-e9c2-415c-9946-315a443331c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:11:20 np0005588920 nova_compute[226886]: 2026-01-20 15:11:20.533 226890 DEBUG nova.network.neutron [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:11:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:20.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:21 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e389 e389: 3 total, 3 up, 3 in
Jan 20 10:11:21 np0005588920 nova_compute[226886]: 2026-01-20 15:11:21.598 226890 DEBUG nova.network.neutron [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Updating instance_info_cache with network_info: [{"id": "c9a63e6d-7cf9-437e-8c19-8faadd126360", "address": "fa:16:3e:41:97:ec", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9a63e6d-7c", "ovs_interfaceid": "c9a63e6d-7cf9-437e-8c19-8faadd126360", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:11:21 np0005588920 nova_compute[226886]: 2026-01-20 15:11:21.619 226890 DEBUG oslo_concurrency.lockutils [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Releasing lock "refresh_cache-950f84b7-e9c2-415c-9946-315a443331c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:11:21 np0005588920 nova_compute[226886]: 2026-01-20 15:11:21.620 226890 DEBUG nova.compute.manager [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Instance network_info: |[{"id": "c9a63e6d-7cf9-437e-8c19-8faadd126360", "address": "fa:16:3e:41:97:ec", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9a63e6d-7c", "ovs_interfaceid": "c9a63e6d-7cf9-437e-8c19-8faadd126360", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 10:11:21 np0005588920 nova_compute[226886]: 2026-01-20 15:11:21.620 226890 DEBUG oslo_concurrency.lockutils [req-cd2837e3-c19d-4ac0-93d0-da7d0e2d6267 req-89d7295c-4208-4f9f-84a6-f390e2ca03bf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-950f84b7-e9c2-415c-9946-315a443331c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:11:21 np0005588920 nova_compute[226886]: 2026-01-20 15:11:21.621 226890 DEBUG nova.network.neutron [req-cd2837e3-c19d-4ac0-93d0-da7d0e2d6267 req-89d7295c-4208-4f9f-84a6-f390e2ca03bf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Refreshing network info cache for port c9a63e6d-7cf9-437e-8c19-8faadd126360 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:11:21 np0005588920 nova_compute[226886]: 2026-01-20 15:11:21.626 226890 DEBUG nova.virt.libvirt.driver [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Start _get_guest_xml network_info=[{"id": "c9a63e6d-7cf9-437e-8c19-8faadd126360", "address": "fa:16:3e:41:97:ec", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9a63e6d-7c", "ovs_interfaceid": "c9a63e6d-7cf9-437e-8c19-8faadd126360", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'device_type': 'disk', 'boot_index': 0, 'delete_on_termination': True, 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-fd11049d-6334-4d6c-ac5d-8cfeca690b75', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': 'fd11049d-6334-4d6c-ac5d-8cfeca690b75', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '950f84b7-e9c2-415c-9946-315a443331c9', 'attached_at': '', 'detached_at': '', 'volume_id': 'fd11049d-6334-4d6c-ac5d-8cfeca690b75', 'serial': 'fd11049d-6334-4d6c-ac5d-8cfeca690b75'}, 'mount_device': '/dev/vda', 'guest_format': None, 'disk_bus': 'virtio', 'attachment_id': '20ae1978-9442-4aff-9d5a-ea601c28b117', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:11:21 np0005588920 nova_compute[226886]: 2026-01-20 15:11:21.633 226890 WARNING nova.virt.libvirt.driver [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:11:21 np0005588920 nova_compute[226886]: 2026-01-20 15:11:21.644 226890 DEBUG nova.virt.libvirt.host [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:11:21 np0005588920 nova_compute[226886]: 2026-01-20 15:11:21.644 226890 DEBUG nova.virt.libvirt.host [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:11:21 np0005588920 nova_compute[226886]: 2026-01-20 15:11:21.649 226890 DEBUG nova.virt.libvirt.host [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:11:21 np0005588920 nova_compute[226886]: 2026-01-20 15:11:21.649 226890 DEBUG nova.virt.libvirt.host [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:11:21 np0005588920 nova_compute[226886]: 2026-01-20 15:11:21.650 226890 DEBUG nova.virt.libvirt.driver [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:11:21 np0005588920 nova_compute[226886]: 2026-01-20 15:11:21.651 226890 DEBUG nova.virt.hardware [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:11:21 np0005588920 nova_compute[226886]: 2026-01-20 15:11:21.651 226890 DEBUG nova.virt.hardware [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:11:21 np0005588920 nova_compute[226886]: 2026-01-20 15:11:21.651 226890 DEBUG nova.virt.hardware [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:11:21 np0005588920 nova_compute[226886]: 2026-01-20 15:11:21.651 226890 DEBUG nova.virt.hardware [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:11:21 np0005588920 nova_compute[226886]: 2026-01-20 15:11:21.652 226890 DEBUG nova.virt.hardware [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:11:21 np0005588920 nova_compute[226886]: 2026-01-20 15:11:21.652 226890 DEBUG nova.virt.hardware [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:11:21 np0005588920 nova_compute[226886]: 2026-01-20 15:11:21.652 226890 DEBUG nova.virt.hardware [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:11:21 np0005588920 nova_compute[226886]: 2026-01-20 15:11:21.652 226890 DEBUG nova.virt.hardware [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:11:21 np0005588920 nova_compute[226886]: 2026-01-20 15:11:21.653 226890 DEBUG nova.virt.hardware [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:11:21 np0005588920 nova_compute[226886]: 2026-01-20 15:11:21.653 226890 DEBUG nova.virt.hardware [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:11:21 np0005588920 nova_compute[226886]: 2026-01-20 15:11:21.653 226890 DEBUG nova.virt.hardware [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:11:21 np0005588920 nova_compute[226886]: 2026-01-20 15:11:21.679 226890 DEBUG nova.storage.rbd_utils [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] rbd image 950f84b7-e9c2-415c-9946-315a443331c9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:11:21 np0005588920 nova_compute[226886]: 2026-01-20 15:11:21.683 226890 DEBUG oslo_concurrency.processutils [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:11:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:22.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:11:22 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1118456220' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:11:22 np0005588920 nova_compute[226886]: 2026-01-20 15:11:22.114 226890 DEBUG oslo_concurrency.processutils [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:11:22 np0005588920 nova_compute[226886]: 2026-01-20 15:11:22.145 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:22 np0005588920 nova_compute[226886]: 2026-01-20 15:11:22.177 226890 DEBUG nova.virt.libvirt.vif [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:11:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-volume-backed-server-266549656',display_name='tempest-TestVolumeBootPattern-volume-backed-server-266549656',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testvolumebootpattern-volume-backed-server-266549656',id=176,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBuWqdSFN4v32AEaozku8d41vH/mRJUyoKyIct8fqpSkSqPno5OBW4av+JB51tPd+VsGiKVNTsGNsa05+ILtQUEp0CSB89twmlfd8wTE4PhvHzk2Ao6haoSQ8x9/IGME0Q==',key_name='tempest-keypair-1753747243',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a49638950e1543fa8e0d251af5479623',ramdisk_id='',reservation_id='r-l9ug1paz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestVolumeBootPattern-194644003',owner_user_name='tempest-TestVolumeBootPattern-194644003-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:11:17Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='bf422e55e158420cbdae75f07a3bb97a',uuid=950f84b7-e9c2-415c-9946-315a443331c9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c9a63e6d-7cf9-437e-8c19-8faadd126360", "address": "fa:16:3e:41:97:ec", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9a63e6d-7c", "ovs_interfaceid": "c9a63e6d-7cf9-437e-8c19-8faadd126360", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:11:22 np0005588920 nova_compute[226886]: 2026-01-20 15:11:22.177 226890 DEBUG nova.network.os_vif_util [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Converting VIF {"id": "c9a63e6d-7cf9-437e-8c19-8faadd126360", "address": "fa:16:3e:41:97:ec", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9a63e6d-7c", "ovs_interfaceid": "c9a63e6d-7cf9-437e-8c19-8faadd126360", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:11:22 np0005588920 nova_compute[226886]: 2026-01-20 15:11:22.178 226890 DEBUG nova.network.os_vif_util [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:41:97:ec,bridge_name='br-int',has_traffic_filtering=True,id=c9a63e6d-7cf9-437e-8c19-8faadd126360,network=Network(b677f1a9-dbaa-4373-8466-bd9ccf067b91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9a63e6d-7c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:11:22 np0005588920 nova_compute[226886]: 2026-01-20 15:11:22.180 226890 DEBUG nova.objects.instance [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lazy-loading 'pci_devices' on Instance uuid 950f84b7-e9c2-415c-9946-315a443331c9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:11:22 np0005588920 nova_compute[226886]: 2026-01-20 15:11:22.207 226890 DEBUG nova.virt.libvirt.driver [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:11:22 np0005588920 nova_compute[226886]:  <uuid>950f84b7-e9c2-415c-9946-315a443331c9</uuid>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:  <name>instance-000000b0</name>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:11:22 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:      <nova:name>tempest-TestVolumeBootPattern-volume-backed-server-266549656</nova:name>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 15:11:21</nova:creationTime>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 10:11:22 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:        <nova:user uuid="bf422e55e158420cbdae75f07a3bb97a">tempest-TestVolumeBootPattern-194644003-project-member</nova:user>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:        <nova:project uuid="a49638950e1543fa8e0d251af5479623">tempest-TestVolumeBootPattern-194644003</nova:project>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:        <nova:port uuid="c9a63e6d-7cf9-437e-8c19-8faadd126360">
Jan 20 10:11:22 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    <system>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:      <entry name="serial">950f84b7-e9c2-415c-9946-315a443331c9</entry>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:      <entry name="uuid">950f84b7-e9c2-415c-9946-315a443331c9</entry>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    </system>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:  <os>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:  </os>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:  <features>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:  </features>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:  </clock>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:  <devices>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 10:11:22 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/950f84b7-e9c2-415c-9946-315a443331c9_disk.config">
Jan 20 10:11:22 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:11:22 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 10:11:22 np0005588920 nova_compute[226886]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="volumes/volume-fd11049d-6334-4d6c-ac5d-8cfeca690b75">
Jan 20 10:11:22 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:11:22 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:      <serial>fd11049d-6334-4d6c-ac5d-8cfeca690b75</serial>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 10:11:22 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:41:97:ec"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:      <target dev="tapc9a63e6d-7c"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    </interface>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 10:11:22 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/950f84b7-e9c2-415c-9946-315a443331c9/console.log" append="off"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    </serial>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    <video>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    </video>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 10:11:22 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    </rng>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 10:11:22 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 10:11:22 np0005588920 nova_compute[226886]:  </devices>
Jan 20 10:11:22 np0005588920 nova_compute[226886]: </domain>
Jan 20 10:11:22 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:11:22 np0005588920 nova_compute[226886]: 2026-01-20 15:11:22.208 226890 DEBUG nova.compute.manager [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Preparing to wait for external event network-vif-plugged-c9a63e6d-7cf9-437e-8c19-8faadd126360 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:11:22 np0005588920 nova_compute[226886]: 2026-01-20 15:11:22.209 226890 DEBUG oslo_concurrency.lockutils [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Acquiring lock "950f84b7-e9c2-415c-9946-315a443331c9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:11:22 np0005588920 nova_compute[226886]: 2026-01-20 15:11:22.210 226890 DEBUG oslo_concurrency.lockutils [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "950f84b7-e9c2-415c-9946-315a443331c9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:11:22 np0005588920 nova_compute[226886]: 2026-01-20 15:11:22.210 226890 DEBUG oslo_concurrency.lockutils [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "950f84b7-e9c2-415c-9946-315a443331c9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:11:22 np0005588920 nova_compute[226886]: 2026-01-20 15:11:22.212 226890 DEBUG nova.virt.libvirt.vif [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:11:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-volume-backed-server-266549656',display_name='tempest-TestVolumeBootPattern-volume-backed-server-266549656',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testvolumebootpattern-volume-backed-server-266549656',id=176,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBuWqdSFN4v32AEaozku8d41vH/mRJUyoKyIct8fqpSkSqPno5OBW4av+JB51tPd+VsGiKVNTsGNsa05+ILtQUEp0CSB89twmlfd8wTE4PhvHzk2Ao6haoSQ8x9/IGME0Q==',key_name='tempest-keypair-1753747243',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a49638950e1543fa8e0d251af5479623',ramdisk_id='',reservation_id='r-l9ug1paz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestVolumeBootPattern-194644003',owner_user_name='tempest-TestVolumeBootPattern-194644003-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:11:17Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='bf422e55e158420cbdae75f07a3bb97a',uuid=950f84b7-e9c2-415c-9946-315a443331c9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c9a63e6d-7cf9-437e-8c19-8faadd126360", "address": "fa:16:3e:41:97:ec", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9a63e6d-7c", "ovs_interfaceid": "c9a63e6d-7cf9-437e-8c19-8faadd126360", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:11:22 np0005588920 nova_compute[226886]: 2026-01-20 15:11:22.212 226890 DEBUG nova.network.os_vif_util [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Converting VIF {"id": "c9a63e6d-7cf9-437e-8c19-8faadd126360", "address": "fa:16:3e:41:97:ec", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9a63e6d-7c", "ovs_interfaceid": "c9a63e6d-7cf9-437e-8c19-8faadd126360", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:11:22 np0005588920 nova_compute[226886]: 2026-01-20 15:11:22.213 226890 DEBUG nova.network.os_vif_util [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:41:97:ec,bridge_name='br-int',has_traffic_filtering=True,id=c9a63e6d-7cf9-437e-8c19-8faadd126360,network=Network(b677f1a9-dbaa-4373-8466-bd9ccf067b91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9a63e6d-7c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:11:22 np0005588920 nova_compute[226886]: 2026-01-20 15:11:22.214 226890 DEBUG os_vif [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:41:97:ec,bridge_name='br-int',has_traffic_filtering=True,id=c9a63e6d-7cf9-437e-8c19-8faadd126360,network=Network(b677f1a9-dbaa-4373-8466-bd9ccf067b91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9a63e6d-7c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:11:22 np0005588920 nova_compute[226886]: 2026-01-20 15:11:22.215 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:22 np0005588920 nova_compute[226886]: 2026-01-20 15:11:22.216 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:11:22 np0005588920 nova_compute[226886]: 2026-01-20 15:11:22.217 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:11:22 np0005588920 nova_compute[226886]: 2026-01-20 15:11:22.221 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:22 np0005588920 nova_compute[226886]: 2026-01-20 15:11:22.221 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc9a63e6d-7c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:11:22 np0005588920 nova_compute[226886]: 2026-01-20 15:11:22.222 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc9a63e6d-7c, col_values=(('external_ids', {'iface-id': 'c9a63e6d-7cf9-437e-8c19-8faadd126360', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:41:97:ec', 'vm-uuid': '950f84b7-e9c2-415c-9946-315a443331c9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:11:22 np0005588920 NetworkManager[49076]: <info>  [1768921882.2252] manager: (tapc9a63e6d-7c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/385)
Jan 20 10:11:22 np0005588920 nova_compute[226886]: 2026-01-20 15:11:22.224 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:22 np0005588920 nova_compute[226886]: 2026-01-20 15:11:22.228 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:11:22 np0005588920 nova_compute[226886]: 2026-01-20 15:11:22.232 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:22 np0005588920 nova_compute[226886]: 2026-01-20 15:11:22.233 226890 INFO os_vif [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:41:97:ec,bridge_name='br-int',has_traffic_filtering=True,id=c9a63e6d-7cf9-437e-8c19-8faadd126360,network=Network(b677f1a9-dbaa-4373-8466-bd9ccf067b91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9a63e6d-7c')#033[00m
Jan 20 10:11:22 np0005588920 nova_compute[226886]: 2026-01-20 15:11:22.304 226890 DEBUG nova.virt.libvirt.driver [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:11:22 np0005588920 nova_compute[226886]: 2026-01-20 15:11:22.305 226890 DEBUG nova.virt.libvirt.driver [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:11:22 np0005588920 nova_compute[226886]: 2026-01-20 15:11:22.305 226890 DEBUG nova.virt.libvirt.driver [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] No VIF found with MAC fa:16:3e:41:97:ec, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:11:22 np0005588920 nova_compute[226886]: 2026-01-20 15:11:22.305 226890 INFO nova.virt.libvirt.driver [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Using config drive#033[00m
Jan 20 10:11:22 np0005588920 nova_compute[226886]: 2026-01-20 15:11:22.331 226890 DEBUG nova.storage.rbd_utils [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] rbd image 950f84b7-e9c2-415c-9946-315a443331c9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:11:22 np0005588920 nova_compute[226886]: 2026-01-20 15:11:22.810 226890 INFO nova.virt.libvirt.driver [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Creating config drive at /var/lib/nova/instances/950f84b7-e9c2-415c-9946-315a443331c9/disk.config#033[00m
Jan 20 10:11:22 np0005588920 nova_compute[226886]: 2026-01-20 15:11:22.817 226890 DEBUG oslo_concurrency.processutils [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/950f84b7-e9c2-415c-9946-315a443331c9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpq6h2ze8b execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:11:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e389 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:11:22 np0005588920 nova_compute[226886]: 2026-01-20 15:11:22.952 226890 DEBUG oslo_concurrency.processutils [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/950f84b7-e9c2-415c-9946-315a443331c9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpq6h2ze8b" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:11:22 np0005588920 nova_compute[226886]: 2026-01-20 15:11:22.989 226890 DEBUG nova.storage.rbd_utils [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] rbd image 950f84b7-e9c2-415c-9946-315a443331c9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:11:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:22.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:22 np0005588920 nova_compute[226886]: 2026-01-20 15:11:22.993 226890 DEBUG oslo_concurrency.processutils [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/950f84b7-e9c2-415c-9946-315a443331c9/disk.config 950f84b7-e9c2-415c-9946-315a443331c9_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:11:23 np0005588920 nova_compute[226886]: 2026-01-20 15:11:23.159 226890 DEBUG oslo_concurrency.processutils [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/950f84b7-e9c2-415c-9946-315a443331c9/disk.config 950f84b7-e9c2-415c-9946-315a443331c9_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.166s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:11:23 np0005588920 nova_compute[226886]: 2026-01-20 15:11:23.161 226890 INFO nova.virt.libvirt.driver [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Deleting local config drive /var/lib/nova/instances/950f84b7-e9c2-415c-9946-315a443331c9/disk.config because it was imported into RBD.#033[00m
Jan 20 10:11:23 np0005588920 kernel: tapc9a63e6d-7c: entered promiscuous mode
Jan 20 10:11:23 np0005588920 NetworkManager[49076]: <info>  [1768921883.2202] manager: (tapc9a63e6d-7c): new Tun device (/org/freedesktop/NetworkManager/Devices/386)
Jan 20 10:11:23 np0005588920 ovn_controller[133971]: 2026-01-20T15:11:23Z|00822|binding|INFO|Claiming lport c9a63e6d-7cf9-437e-8c19-8faadd126360 for this chassis.
Jan 20 10:11:23 np0005588920 ovn_controller[133971]: 2026-01-20T15:11:23Z|00823|binding|INFO|c9a63e6d-7cf9-437e-8c19-8faadd126360: Claiming fa:16:3e:41:97:ec 10.100.0.6
Jan 20 10:11:23 np0005588920 nova_compute[226886]: 2026-01-20 15:11:23.223 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:23 np0005588920 nova_compute[226886]: 2026-01-20 15:11:23.229 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:11:23.239 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:41:97:ec 10.100.0.6'], port_security=['fa:16:3e:41:97:ec 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '950f84b7-e9c2-415c-9946-315a443331c9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b677f1a9-dbaa-4373-8466-bd9ccf067b91', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a49638950e1543fa8e0d251af5479623', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4b29dba7-f5e9-43b9-8689-0b93b990ce82', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=76ec1139-009f-49fe-bfde-07c0ef9e8b12, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=c9a63e6d-7cf9-437e-8c19-8faadd126360) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:11:23.240 144128 INFO neutron.agent.ovn.metadata.agent [-] Port c9a63e6d-7cf9-437e-8c19-8faadd126360 in datapath b677f1a9-dbaa-4373-8466-bd9ccf067b91 bound to our chassis#033[00m
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:11:23.242 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b677f1a9-dbaa-4373-8466-bd9ccf067b91#033[00m
Jan 20 10:11:23 np0005588920 systemd-machined[196121]: New machine qemu-85-instance-000000b0.
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:11:23.255 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[198e90a3-2afd-4d1c-a489-3bff026de0cf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:11:23.256 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb677f1a9-d1 in ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:11:23.258 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb677f1a9-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:11:23.258 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[319a05f8-06a3-49d7-bd33-e3b39e447ca5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:11:23.260 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[6b522b7a-00f0-4daa-988f-975a8ce4de76]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:11:23.277 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[d554b769-b18f-45ee-a055-cf68ebf55df7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:11:23 np0005588920 systemd[1]: Started Virtual Machine qemu-85-instance-000000b0.
Jan 20 10:11:23 np0005588920 nova_compute[226886]: 2026-01-20 15:11:23.288 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:23 np0005588920 systemd-udevd[292327]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:11:23 np0005588920 ovn_controller[133971]: 2026-01-20T15:11:23Z|00824|binding|INFO|Setting lport c9a63e6d-7cf9-437e-8c19-8faadd126360 ovn-installed in OVS
Jan 20 10:11:23 np0005588920 ovn_controller[133971]: 2026-01-20T15:11:23Z|00825|binding|INFO|Setting lport c9a63e6d-7cf9-437e-8c19-8faadd126360 up in Southbound
Jan 20 10:11:23 np0005588920 nova_compute[226886]: 2026-01-20 15:11:23.293 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:23 np0005588920 NetworkManager[49076]: <info>  [1768921883.3008] device (tapc9a63e6d-7c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:11:23 np0005588920 NetworkManager[49076]: <info>  [1768921883.3020] device (tapc9a63e6d-7c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:11:23.302 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[08f142c7-c5b9-468a-be0e-f6cca6ad3684]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:11:23.330 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[c9d59f06-b06a-4d0e-96b8-38aeff95342d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:11:23.334 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[4f8941f4-657f-4e05-a93b-c7f221bbd640]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:11:23 np0005588920 NetworkManager[49076]: <info>  [1768921883.3357] manager: (tapb677f1a9-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/387)
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:11:23.366 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[d32426a1-a474-4eb1-abd8-a423e77b1bfd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:11:23.369 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[edba862b-8fa4-41dd-9d35-b486e6d9ba10]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:11:23 np0005588920 NetworkManager[49076]: <info>  [1768921883.3916] device (tapb677f1a9-d0): carrier: link connected
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:11:23.397 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[ff3a2006-004c-4e23-9a98-30e7387b024f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:11:23.414 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e7bd9076-390c-4d97-9e66-7d60ca0f7fd9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb677f1a9-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a4:c8:34'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 262], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 692507, 'reachable_time': 15576, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 292357, 'error': None, 'target': 'ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:11:23.430 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0113f22f-ae73-46a1-a207-40a7ca26c3fc]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea4:c834'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 692507, 'tstamp': 692507}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 292358, 'error': None, 'target': 'ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:11:23.447 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[4fc6373b-bed0-4760-b65d-47a98c9d2790]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb677f1a9-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a4:c8:34'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 262], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 692507, 'reachable_time': 15576, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 292359, 'error': None, 'target': 'ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:11:23.480 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[025f8904-60ff-498e-b00f-37ad0fdcf931]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:11:23 np0005588920 nova_compute[226886]: 2026-01-20 15:11:23.510 226890 DEBUG nova.network.neutron [req-cd2837e3-c19d-4ac0-93d0-da7d0e2d6267 req-89d7295c-4208-4f9f-84a6-f390e2ca03bf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Updated VIF entry in instance network info cache for port c9a63e6d-7cf9-437e-8c19-8faadd126360. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:11:23 np0005588920 nova_compute[226886]: 2026-01-20 15:11:23.510 226890 DEBUG nova.network.neutron [req-cd2837e3-c19d-4ac0-93d0-da7d0e2d6267 req-89d7295c-4208-4f9f-84a6-f390e2ca03bf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Updating instance_info_cache with network_info: [{"id": "c9a63e6d-7cf9-437e-8c19-8faadd126360", "address": "fa:16:3e:41:97:ec", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9a63e6d-7c", "ovs_interfaceid": "c9a63e6d-7cf9-437e-8c19-8faadd126360", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:11:23.531 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[276a7f32-ad90-4ff4-9443-da3a3045d988]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:11:23.532 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb677f1a9-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:11:23.533 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:11:23.533 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb677f1a9-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:11:23 np0005588920 nova_compute[226886]: 2026-01-20 15:11:23.535 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:23 np0005588920 NetworkManager[49076]: <info>  [1768921883.5356] manager: (tapb677f1a9-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/388)
Jan 20 10:11:23 np0005588920 kernel: tapb677f1a9-d0: entered promiscuous mode
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:11:23.537 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb677f1a9-d0, col_values=(('external_ids', {'iface-id': '1aa285ce-a9ae-4d1e-b4b9-c72f4e0b8d65'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:11:23 np0005588920 ovn_controller[133971]: 2026-01-20T15:11:23Z|00826|binding|INFO|Releasing lport 1aa285ce-a9ae-4d1e-b4b9-c72f4e0b8d65 from this chassis (sb_readonly=0)
Jan 20 10:11:23 np0005588920 nova_compute[226886]: 2026-01-20 15:11:23.538 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:23 np0005588920 nova_compute[226886]: 2026-01-20 15:11:23.540 226890 DEBUG oslo_concurrency.lockutils [req-cd2837e3-c19d-4ac0-93d0-da7d0e2d6267 req-89d7295c-4208-4f9f-84a6-f390e2ca03bf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-950f84b7-e9c2-415c-9946-315a443331c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:11:23 np0005588920 nova_compute[226886]: 2026-01-20 15:11:23.552 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:11:23.554 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b677f1a9-dbaa-4373-8466-bd9ccf067b91.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b677f1a9-dbaa-4373-8466-bd9ccf067b91.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:11:23.555 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[9270f4ae-e5ca-44f9-89dd-9c522a4d3dfa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:11:23.555 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-b677f1a9-dbaa-4373-8466-bd9ccf067b91
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/b677f1a9-dbaa-4373-8466-bd9ccf067b91.pid.haproxy
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID b677f1a9-dbaa-4373-8466-bd9ccf067b91
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:11:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:11:23.557 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91', 'env', 'PROCESS_TAG=haproxy-b677f1a9-dbaa-4373-8466-bd9ccf067b91', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b677f1a9-dbaa-4373-8466-bd9ccf067b91.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:11:23 np0005588920 nova_compute[226886]: 2026-01-20 15:11:23.620 226890 DEBUG nova.compute.manager [req-010e1c9b-59ae-434e-9603-995d3ce9c70d req-44275694-a5ea-4ecd-81b9-35c2b0fe43ca 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Received event network-vif-plugged-c9a63e6d-7cf9-437e-8c19-8faadd126360 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:11:23 np0005588920 nova_compute[226886]: 2026-01-20 15:11:23.620 226890 DEBUG oslo_concurrency.lockutils [req-010e1c9b-59ae-434e-9603-995d3ce9c70d req-44275694-a5ea-4ecd-81b9-35c2b0fe43ca 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "950f84b7-e9c2-415c-9946-315a443331c9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:11:23 np0005588920 nova_compute[226886]: 2026-01-20 15:11:23.621 226890 DEBUG oslo_concurrency.lockutils [req-010e1c9b-59ae-434e-9603-995d3ce9c70d req-44275694-a5ea-4ecd-81b9-35c2b0fe43ca 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "950f84b7-e9c2-415c-9946-315a443331c9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:11:23 np0005588920 nova_compute[226886]: 2026-01-20 15:11:23.621 226890 DEBUG oslo_concurrency.lockutils [req-010e1c9b-59ae-434e-9603-995d3ce9c70d req-44275694-a5ea-4ecd-81b9-35c2b0fe43ca 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "950f84b7-e9c2-415c-9946-315a443331c9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:11:23 np0005588920 nova_compute[226886]: 2026-01-20 15:11:23.621 226890 DEBUG nova.compute.manager [req-010e1c9b-59ae-434e-9603-995d3ce9c70d req-44275694-a5ea-4ecd-81b9-35c2b0fe43ca 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Processing event network-vif-plugged-c9a63e6d-7cf9-437e-8c19-8faadd126360 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 10:11:23 np0005588920 podman[292391]: 2026-01-20 15:11:23.913730317 +0000 UTC m=+0.049326948 container create 887a9e776562ea1532af43735b5cb860209bcad687b3754496e6adee2f25e5de (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 20 10:11:23 np0005588920 systemd[1]: Started libpod-conmon-887a9e776562ea1532af43735b5cb860209bcad687b3754496e6adee2f25e5de.scope.
Jan 20 10:11:23 np0005588920 systemd[1]: Started libcrun container.
Jan 20 10:11:23 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0bec4fab7e0a405b732812f885e5ccb7e5e52485ab5d0d9c451a2c2759b1e8e2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:11:23 np0005588920 podman[292391]: 2026-01-20 15:11:23.887003008 +0000 UTC m=+0.022599689 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:11:23 np0005588920 podman[292391]: 2026-01-20 15:11:23.985827236 +0000 UTC m=+0.121423917 container init 887a9e776562ea1532af43735b5cb860209bcad687b3754496e6adee2f25e5de (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Jan 20 10:11:23 np0005588920 podman[292391]: 2026-01-20 15:11:23.990212394 +0000 UTC m=+0.125809045 container start 887a9e776562ea1532af43735b5cb860209bcad687b3754496e6adee2f25e5de (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:11:24 np0005588920 neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91[292443]: [NOTICE]   (292452) : New worker (292455) forked
Jan 20 10:11:24 np0005588920 neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91[292443]: [NOTICE]   (292452) : Loading success.
Jan 20 10:11:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:24.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:24 np0005588920 nova_compute[226886]: 2026-01-20 15:11:24.036 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921884.036335, 950f84b7-e9c2-415c-9946-315a443331c9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:11:24 np0005588920 nova_compute[226886]: 2026-01-20 15:11:24.037 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] VM Started (Lifecycle Event)#033[00m
Jan 20 10:11:24 np0005588920 nova_compute[226886]: 2026-01-20 15:11:24.038 226890 DEBUG nova.compute.manager [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:11:24 np0005588920 nova_compute[226886]: 2026-01-20 15:11:24.041 226890 DEBUG nova.virt.libvirt.driver [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:11:24 np0005588920 nova_compute[226886]: 2026-01-20 15:11:24.043 226890 INFO nova.virt.libvirt.driver [-] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Instance spawned successfully.#033[00m
Jan 20 10:11:24 np0005588920 nova_compute[226886]: 2026-01-20 15:11:24.044 226890 DEBUG nova.virt.libvirt.driver [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 10:11:24 np0005588920 nova_compute[226886]: 2026-01-20 15:11:24.058 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:11:24 np0005588920 nova_compute[226886]: 2026-01-20 15:11:24.064 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:11:24 np0005588920 nova_compute[226886]: 2026-01-20 15:11:24.068 226890 DEBUG nova.virt.libvirt.driver [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:11:24 np0005588920 nova_compute[226886]: 2026-01-20 15:11:24.069 226890 DEBUG nova.virt.libvirt.driver [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:11:24 np0005588920 nova_compute[226886]: 2026-01-20 15:11:24.069 226890 DEBUG nova.virt.libvirt.driver [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:11:24 np0005588920 nova_compute[226886]: 2026-01-20 15:11:24.069 226890 DEBUG nova.virt.libvirt.driver [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:11:24 np0005588920 nova_compute[226886]: 2026-01-20 15:11:24.070 226890 DEBUG nova.virt.libvirt.driver [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:11:24 np0005588920 nova_compute[226886]: 2026-01-20 15:11:24.070 226890 DEBUG nova.virt.libvirt.driver [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:11:24 np0005588920 nova_compute[226886]: 2026-01-20 15:11:24.093 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:11:24 np0005588920 nova_compute[226886]: 2026-01-20 15:11:24.093 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921884.0365953, 950f84b7-e9c2-415c-9946-315a443331c9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:11:24 np0005588920 nova_compute[226886]: 2026-01-20 15:11:24.093 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:11:24 np0005588920 nova_compute[226886]: 2026-01-20 15:11:24.121 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:11:24 np0005588920 nova_compute[226886]: 2026-01-20 15:11:24.125 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921884.040627, 950f84b7-e9c2-415c-9946-315a443331c9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:11:24 np0005588920 nova_compute[226886]: 2026-01-20 15:11:24.125 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:11:24 np0005588920 nova_compute[226886]: 2026-01-20 15:11:24.140 226890 INFO nova.compute.manager [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Took 5.14 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 10:11:24 np0005588920 nova_compute[226886]: 2026-01-20 15:11:24.141 226890 DEBUG nova.compute.manager [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:11:24 np0005588920 nova_compute[226886]: 2026-01-20 15:11:24.148 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:11:24 np0005588920 nova_compute[226886]: 2026-01-20 15:11:24.151 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:11:24 np0005588920 nova_compute[226886]: 2026-01-20 15:11:24.191 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:11:24 np0005588920 nova_compute[226886]: 2026-01-20 15:11:24.214 226890 INFO nova.compute.manager [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Took 7.56 seconds to build instance.#033[00m
Jan 20 10:11:24 np0005588920 nova_compute[226886]: 2026-01-20 15:11:24.231 226890 DEBUG oslo_concurrency.lockutils [None req-a40356cd-1d07-4d27-be32-aa0f4eb1ea70 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "950f84b7-e9c2-415c-9946-315a443331c9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.641s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:11:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:24.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:25 np0005588920 nova_compute[226886]: 2026-01-20 15:11:25.699 226890 DEBUG nova.compute.manager [req-30b5c7b2-7709-41fa-8958-75bc948c49a9 req-7e1ec963-857f-47ef-8fab-c81b677feb6c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Received event network-vif-plugged-c9a63e6d-7cf9-437e-8c19-8faadd126360 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:11:25 np0005588920 nova_compute[226886]: 2026-01-20 15:11:25.699 226890 DEBUG oslo_concurrency.lockutils [req-30b5c7b2-7709-41fa-8958-75bc948c49a9 req-7e1ec963-857f-47ef-8fab-c81b677feb6c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "950f84b7-e9c2-415c-9946-315a443331c9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:11:25 np0005588920 nova_compute[226886]: 2026-01-20 15:11:25.700 226890 DEBUG oslo_concurrency.lockutils [req-30b5c7b2-7709-41fa-8958-75bc948c49a9 req-7e1ec963-857f-47ef-8fab-c81b677feb6c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "950f84b7-e9c2-415c-9946-315a443331c9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:11:25 np0005588920 nova_compute[226886]: 2026-01-20 15:11:25.700 226890 DEBUG oslo_concurrency.lockutils [req-30b5c7b2-7709-41fa-8958-75bc948c49a9 req-7e1ec963-857f-47ef-8fab-c81b677feb6c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "950f84b7-e9c2-415c-9946-315a443331c9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:11:25 np0005588920 nova_compute[226886]: 2026-01-20 15:11:25.700 226890 DEBUG nova.compute.manager [req-30b5c7b2-7709-41fa-8958-75bc948c49a9 req-7e1ec963-857f-47ef-8fab-c81b677feb6c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] No waiting events found dispatching network-vif-plugged-c9a63e6d-7cf9-437e-8c19-8faadd126360 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:11:25 np0005588920 nova_compute[226886]: 2026-01-20 15:11:25.701 226890 WARNING nova.compute.manager [req-30b5c7b2-7709-41fa-8958-75bc948c49a9 req-7e1ec963-857f-47ef-8fab-c81b677feb6c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Received unexpected event network-vif-plugged-c9a63e6d-7cf9-437e-8c19-8faadd126360 for instance with vm_state active and task_state None.#033[00m
Jan 20 10:11:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:26.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:26 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:11:26 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:11:26 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e390 e390: 3 total, 3 up, 3 in
Jan 20 10:11:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:26.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:27 np0005588920 nova_compute[226886]: 2026-01-20 15:11:27.146 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:27 np0005588920 nova_compute[226886]: 2026-01-20 15:11:27.224 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:27 np0005588920 nova_compute[226886]: 2026-01-20 15:11:27.345 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:27 np0005588920 NetworkManager[49076]: <info>  [1768921887.3468] manager: (patch-br-int-to-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/389)
Jan 20 10:11:27 np0005588920 NetworkManager[49076]: <info>  [1768921887.3486] manager: (patch-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/390)
Jan 20 10:11:27 np0005588920 nova_compute[226886]: 2026-01-20 15:11:27.514 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:27 np0005588920 ovn_controller[133971]: 2026-01-20T15:11:27Z|00827|binding|INFO|Releasing lport 1aa285ce-a9ae-4d1e-b4b9-c72f4e0b8d65 from this chassis (sb_readonly=0)
Jan 20 10:11:27 np0005588920 nova_compute[226886]: 2026-01-20 15:11:27.526 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:11:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 10:11:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:28.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 10:11:28 np0005588920 nova_compute[226886]: 2026-01-20 15:11:28.190 226890 DEBUG nova.compute.manager [req-388f6fd8-608e-42f2-b406-dbd7f277c1b1 req-b969eab4-b6a6-46ae-b563-cfb83a8cbd20 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Received event network-changed-c9a63e6d-7cf9-437e-8c19-8faadd126360 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:11:28 np0005588920 nova_compute[226886]: 2026-01-20 15:11:28.192 226890 DEBUG nova.compute.manager [req-388f6fd8-608e-42f2-b406-dbd7f277c1b1 req-b969eab4-b6a6-46ae-b563-cfb83a8cbd20 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Refreshing instance network info cache due to event network-changed-c9a63e6d-7cf9-437e-8c19-8faadd126360. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:11:28 np0005588920 nova_compute[226886]: 2026-01-20 15:11:28.192 226890 DEBUG oslo_concurrency.lockutils [req-388f6fd8-608e-42f2-b406-dbd7f277c1b1 req-b969eab4-b6a6-46ae-b563-cfb83a8cbd20 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-950f84b7-e9c2-415c-9946-315a443331c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:11:28 np0005588920 nova_compute[226886]: 2026-01-20 15:11:28.193 226890 DEBUG oslo_concurrency.lockutils [req-388f6fd8-608e-42f2-b406-dbd7f277c1b1 req-b969eab4-b6a6-46ae-b563-cfb83a8cbd20 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-950f84b7-e9c2-415c-9946-315a443331c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:11:28 np0005588920 nova_compute[226886]: 2026-01-20 15:11:28.193 226890 DEBUG nova.network.neutron [req-388f6fd8-608e-42f2-b406-dbd7f277c1b1 req-b969eab4-b6a6-46ae-b563-cfb83a8cbd20 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Refreshing network info cache for port c9a63e6d-7cf9-437e-8c19-8faadd126360 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:11:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:28.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:29 np0005588920 nova_compute[226886]: 2026-01-20 15:11:29.718 226890 DEBUG nova.network.neutron [req-388f6fd8-608e-42f2-b406-dbd7f277c1b1 req-b969eab4-b6a6-46ae-b563-cfb83a8cbd20 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Updated VIF entry in instance network info cache for port c9a63e6d-7cf9-437e-8c19-8faadd126360. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:11:29 np0005588920 nova_compute[226886]: 2026-01-20 15:11:29.719 226890 DEBUG nova.network.neutron [req-388f6fd8-608e-42f2-b406-dbd7f277c1b1 req-b969eab4-b6a6-46ae-b563-cfb83a8cbd20 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Updating instance_info_cache with network_info: [{"id": "c9a63e6d-7cf9-437e-8c19-8faadd126360", "address": "fa:16:3e:41:97:ec", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9a63e6d-7c", "ovs_interfaceid": "c9a63e6d-7cf9-437e-8c19-8faadd126360", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:11:29 np0005588920 nova_compute[226886]: 2026-01-20 15:11:29.965 226890 DEBUG oslo_concurrency.lockutils [req-388f6fd8-608e-42f2-b406-dbd7f277c1b1 req-b969eab4-b6a6-46ae-b563-cfb83a8cbd20 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-950f84b7-e9c2-415c-9946-315a443331c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:11:29 np0005588920 podman[292515]: 2026-01-20 15:11:29.978230926 +0000 UTC m=+0.054405136 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:11:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:30.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:30 np0005588920 ovn_controller[133971]: 2026-01-20T15:11:30Z|00828|binding|INFO|Releasing lport 1aa285ce-a9ae-4d1e-b4b9-c72f4e0b8d65 from this chassis (sb_readonly=0)
Jan 20 10:11:30 np0005588920 nova_compute[226886]: 2026-01-20 15:11:30.704 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:11:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:31.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:11:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:32.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:32 np0005588920 nova_compute[226886]: 2026-01-20 15:11:32.147 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:32 np0005588920 nova_compute[226886]: 2026-01-20 15:11:32.226 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:11:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:33.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:34.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 10:11:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:35.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 10:11:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:11:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:36.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:11:36 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:11:36 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1084543734' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:11:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:37.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:37 np0005588920 ovn_controller[133971]: 2026-01-20T15:11:37Z|00110|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:41:97:ec 10.100.0.6
Jan 20 10:11:37 np0005588920 ovn_controller[133971]: 2026-01-20T15:11:37Z|00111|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:41:97:ec 10.100.0.6
Jan 20 10:11:37 np0005588920 nova_compute[226886]: 2026-01-20 15:11:37.149 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:37 np0005588920 nova_compute[226886]: 2026-01-20 15:11:37.227 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:11:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:38.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:11:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:39.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:11:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:40.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:41.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:42.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:42 np0005588920 nova_compute[226886]: 2026-01-20 15:11:42.150 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:42 np0005588920 nova_compute[226886]: 2026-01-20 15:11:42.229 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:42 np0005588920 nova_compute[226886]: 2026-01-20 15:11:42.665 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:11:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:43.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:44.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:11:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:45.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:11:45 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e391 e391: 3 total, 3 up, 3 in
Jan 20 10:11:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:46.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:46 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e392 e392: 3 total, 3 up, 3 in
Jan 20 10:11:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:11:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:47.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:11:47 np0005588920 nova_compute[226886]: 2026-01-20 15:11:47.154 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:47 np0005588920 nova_compute[226886]: 2026-01-20 15:11:47.230 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:47 np0005588920 nova_compute[226886]: 2026-01-20 15:11:47.687 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e392 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:11:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:48.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 10:11:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:49.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 10:11:49 np0005588920 podman[292539]: 2026-01-20 15:11:49.047792667 +0000 UTC m=+0.130013287 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 20 10:11:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:50.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:51.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:52.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:52 np0005588920 nova_compute[226886]: 2026-01-20 15:11:52.155 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:52 np0005588920 nova_compute[226886]: 2026-01-20 15:11:52.232 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e392 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:11:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:53.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:54.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:54 np0005588920 nova_compute[226886]: 2026-01-20 15:11:54.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:11:54 np0005588920 nova_compute[226886]: 2026-01-20 15:11:54.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:11:54 np0005588920 nova_compute[226886]: 2026-01-20 15:11:54.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:11:54 np0005588920 nova_compute[226886]: 2026-01-20 15:11:54.929 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "refresh_cache-950f84b7-e9c2-415c-9946-315a443331c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:11:54 np0005588920 nova_compute[226886]: 2026-01-20 15:11:54.930 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquired lock "refresh_cache-950f84b7-e9c2-415c-9946-315a443331c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:11:54 np0005588920 nova_compute[226886]: 2026-01-20 15:11:54.930 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 10:11:54 np0005588920 nova_compute[226886]: 2026-01-20 15:11:54.931 226890 DEBUG nova.objects.instance [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 950f84b7-e9c2-415c-9946-315a443331c9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:11:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:55.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:56.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 10:11:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:57.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 10:11:57 np0005588920 nova_compute[226886]: 2026-01-20 15:11:57.157 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:57 np0005588920 nova_compute[226886]: 2026-01-20 15:11:57.233 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e392 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:11:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:11:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:58.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:11:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:11:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:11:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:59.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:11:59 np0005588920 nova_compute[226886]: 2026-01-20 15:11:59.367 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Updating instance_info_cache with network_info: [{"id": "c9a63e6d-7cf9-437e-8c19-8faadd126360", "address": "fa:16:3e:41:97:ec", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9a63e6d-7c", "ovs_interfaceid": "c9a63e6d-7cf9-437e-8c19-8faadd126360", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:11:59 np0005588920 nova_compute[226886]: 2026-01-20 15:11:59.384 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Releasing lock "refresh_cache-950f84b7-e9c2-415c-9946-315a443331c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:11:59 np0005588920 nova_compute[226886]: 2026-01-20 15:11:59.385 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 10:11:59 np0005588920 nova_compute[226886]: 2026-01-20 15:11:59.385 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:11:59 np0005588920 nova_compute[226886]: 2026-01-20 15:11:59.405 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:11:59 np0005588920 nova_compute[226886]: 2026-01-20 15:11:59.405 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:11:59 np0005588920 nova_compute[226886]: 2026-01-20 15:11:59.406 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:11:59 np0005588920 nova_compute[226886]: 2026-01-20 15:11:59.406 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:11:59 np0005588920 nova_compute[226886]: 2026-01-20 15:11:59.407 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:11:59 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:11:59 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2331614767' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:11:59 np0005588920 nova_compute[226886]: 2026-01-20 15:11:59.958 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.551s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:12:00 np0005588920 nova_compute[226886]: 2026-01-20 15:12:00.028 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-000000b0 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:12:00 np0005588920 nova_compute[226886]: 2026-01-20 15:12:00.028 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-000000b0 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:12:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:00.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:00 np0005588920 nova_compute[226886]: 2026-01-20 15:12:00.198 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:12:00 np0005588920 nova_compute[226886]: 2026-01-20 15:12:00.199 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4011MB free_disk=20.937217712402344GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:12:00 np0005588920 nova_compute[226886]: 2026-01-20 15:12:00.199 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:12:00 np0005588920 nova_compute[226886]: 2026-01-20 15:12:00.200 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:12:00 np0005588920 nova_compute[226886]: 2026-01-20 15:12:00.277 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance 950f84b7-e9c2-415c-9946-315a443331c9 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:12:00 np0005588920 nova_compute[226886]: 2026-01-20 15:12:00.278 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:12:00 np0005588920 nova_compute[226886]: 2026-01-20 15:12:00.278 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:12:00 np0005588920 nova_compute[226886]: 2026-01-20 15:12:00.348 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:12:00 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:12:00 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1794335812' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:12:00 np0005588920 nova_compute[226886]: 2026-01-20 15:12:00.794 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:12:00 np0005588920 nova_compute[226886]: 2026-01-20 15:12:00.799 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:12:00 np0005588920 nova_compute[226886]: 2026-01-20 15:12:00.819 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:12:00 np0005588920 nova_compute[226886]: 2026-01-20 15:12:00.852 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:12:00 np0005588920 nova_compute[226886]: 2026-01-20 15:12:00.853 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.654s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:12:00 np0005588920 podman[292612]: 2026-01-20 15:12:00.973103863 +0000 UTC m=+0.049745169 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:12:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:12:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:01.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:12:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:02.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:02 np0005588920 nova_compute[226886]: 2026-01-20 15:12:02.188 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:02 np0005588920 nova_compute[226886]: 2026-01-20 15:12:02.193 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:12:02 np0005588920 nova_compute[226886]: 2026-01-20 15:12:02.194 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:12:02 np0005588920 nova_compute[226886]: 2026-01-20 15:12:02.194 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:12:02 np0005588920 nova_compute[226886]: 2026-01-20 15:12:02.234 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:02 np0005588920 nova_compute[226886]: 2026-01-20 15:12:02.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:12:02 np0005588920 nova_compute[226886]: 2026-01-20 15:12:02.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:12:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e392 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:12:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:12:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:03.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:12:03 np0005588920 nova_compute[226886]: 2026-01-20 15:12:03.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:12:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:04.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:04 np0005588920 nova_compute[226886]: 2026-01-20 15:12:04.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:12:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:05.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:05 np0005588920 nova_compute[226886]: 2026-01-20 15:12:05.720 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:12:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:12:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:06.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:12:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:07.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:07 np0005588920 nova_compute[226886]: 2026-01-20 15:12:07.190 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:07 np0005588920 nova_compute[226886]: 2026-01-20 15:12:07.235 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e392 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:12:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:08.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:09.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:10.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:11.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:12:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:12.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:12:12 np0005588920 nova_compute[226886]: 2026-01-20 15:12:12.191 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:12 np0005588920 nova_compute[226886]: 2026-01-20 15:12:12.237 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e392 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:12:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:12:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:13.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:12:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:12:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1857185143' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:12:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:12:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1857185143' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:12:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:14.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:15.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 10:12:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:16.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 10:12:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:12:16.472 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:12:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:12:16.473 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:12:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:12:16.473 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:12:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:12:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:17.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:12:17 np0005588920 nova_compute[226886]: 2026-01-20 15:12:17.192 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:17 np0005588920 nova_compute[226886]: 2026-01-20 15:12:17.238 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e392 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:12:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:12:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:18.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:12:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:19.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:20 np0005588920 podman[292631]: 2026-01-20 15:12:20.039110713 +0000 UTC m=+0.112465936 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:12:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:20.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:12:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:21.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:12:21 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #127. Immutable memtables: 0.
Jan 20 10:12:21 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:12:21.595234) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:12:21 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 79] Flushing memtable with next log file: 127
Jan 20 10:12:21 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921941595468, "job": 79, "event": "flush_started", "num_memtables": 1, "num_entries": 1634, "num_deletes": 257, "total_data_size": 3389258, "memory_usage": 3443760, "flush_reason": "Manual Compaction"}
Jan 20 10:12:21 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 79] Level-0 flush table #128: started
Jan 20 10:12:21 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921941610465, "cf_name": "default", "job": 79, "event": "table_file_creation", "file_number": 128, "file_size": 2233908, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 63521, "largest_seqno": 65150, "table_properties": {"data_size": 2227138, "index_size": 3776, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1925, "raw_key_size": 15019, "raw_average_key_size": 20, "raw_value_size": 2213210, "raw_average_value_size": 2970, "num_data_blocks": 166, "num_entries": 745, "num_filter_entries": 745, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768921818, "oldest_key_time": 1768921818, "file_creation_time": 1768921941, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 128, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:12:21 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 79] Flush lasted 15265 microseconds, and 5058 cpu microseconds.
Jan 20 10:12:21 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:12:21 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:12:21.610504) [db/flush_job.cc:967] [default] [JOB 79] Level-0 flush table #128: 2233908 bytes OK
Jan 20 10:12:21 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:12:21.610527) [db/memtable_list.cc:519] [default] Level-0 commit table #128 started
Jan 20 10:12:21 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:12:21.611995) [db/memtable_list.cc:722] [default] Level-0 commit table #128: memtable #1 done
Jan 20 10:12:21 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:12:21.612016) EVENT_LOG_v1 {"time_micros": 1768921941612004, "job": 79, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:12:21 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:12:21.612036) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:12:21 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 79] Try to delete WAL files size 3381691, prev total WAL file size 3381691, number of live WAL files 2.
Jan 20 10:12:21 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000124.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:12:21 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:12:21.612904) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0032323730' seq:72057594037927935, type:22 .. '6C6F676D0032353231' seq:0, type:0; will stop at (end)
Jan 20 10:12:21 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 80] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:12:21 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 79 Base level 0, inputs: [128(2181KB)], [126(11MB)]
Jan 20 10:12:21 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921941612964, "job": 80, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [128], "files_L6": [126], "score": -1, "input_data_size": 14489159, "oldest_snapshot_seqno": -1}
Jan 20 10:12:21 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 80] Generated table #129: 9001 keys, 14341387 bytes, temperature: kUnknown
Jan 20 10:12:21 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921941775628, "cf_name": "default", "job": 80, "event": "table_file_creation", "file_number": 129, "file_size": 14341387, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14278953, "index_size": 38833, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 22533, "raw_key_size": 235881, "raw_average_key_size": 26, "raw_value_size": 14116404, "raw_average_value_size": 1568, "num_data_blocks": 1503, "num_entries": 9001, "num_filter_entries": 9001, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768921941, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 129, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:12:21 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:12:21 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:12:21.775921) [db/compaction/compaction_job.cc:1663] [default] [JOB 80] Compacted 1@0 + 1@6 files to L6 => 14341387 bytes
Jan 20 10:12:21 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:12:21.788032) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 89.0 rd, 88.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.1, 11.7 +0.0 blob) out(13.7 +0.0 blob), read-write-amplify(12.9) write-amplify(6.4) OK, records in: 9532, records dropped: 531 output_compression: NoCompression
Jan 20 10:12:21 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:12:21.788054) EVENT_LOG_v1 {"time_micros": 1768921941788044, "job": 80, "event": "compaction_finished", "compaction_time_micros": 162787, "compaction_time_cpu_micros": 31634, "output_level": 6, "num_output_files": 1, "total_output_size": 14341387, "num_input_records": 9532, "num_output_records": 9001, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:12:21 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000128.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:12:21 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921941788561, "job": 80, "event": "table_file_deletion", "file_number": 128}
Jan 20 10:12:21 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000126.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:12:21 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921941790937, "job": 80, "event": "table_file_deletion", "file_number": 126}
Jan 20 10:12:21 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:12:21.612732) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:12:21 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:12:21.790981) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:12:21 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:12:21.790985) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:12:21 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:12:21.790987) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:12:21 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:12:21.790989) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:12:21 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:12:21.790990) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:12:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:22.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:22 np0005588920 nova_compute[226886]: 2026-01-20 15:12:22.196 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:22 np0005588920 nova_compute[226886]: 2026-01-20 15:12:22.239 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e392 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:12:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:23.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:24.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:25.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:12:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:26.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:12:26 np0005588920 podman[292824]: 2026-01-20 15:12:26.493776654 +0000 UTC m=+0.076432247 container exec 6d4eaf8659f13902200468a4ae46f22a0824452b9353ff1491898f5d3b0130c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mon-compute-2, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 20 10:12:26 np0005588920 podman[292824]: 2026-01-20 15:12:26.579463999 +0000 UTC m=+0.162119582 container exec_died 6d4eaf8659f13902200468a4ae46f22a0824452b9353ff1491898f5d3b0130c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mon-compute-2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 20 10:12:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 10:12:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:27.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 10:12:27 np0005588920 nova_compute[226886]: 2026-01-20 15:12:27.197 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:27 np0005588920 nova_compute[226886]: 2026-01-20 15:12:27.241 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e392 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:12:28 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:12:28 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:12:28 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:12:28 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:12:28 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:12:28 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:12:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:12:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:28.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:12:29 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:12:29 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:12:29 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:12:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 10:12:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:29.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 10:12:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:12:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:30.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:12:30 np0005588920 nova_compute[226886]: 2026-01-20 15:12:30.317 226890 DEBUG nova.compute.manager [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Jan 20 10:12:30 np0005588920 nova_compute[226886]: 2026-01-20 15:12:30.473 226890 DEBUG oslo_concurrency.lockutils [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:12:30 np0005588920 nova_compute[226886]: 2026-01-20 15:12:30.474 226890 DEBUG oslo_concurrency.lockutils [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:12:30 np0005588920 nova_compute[226886]: 2026-01-20 15:12:30.510 226890 DEBUG nova.objects.instance [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Lazy-loading 'pci_requests' on Instance uuid f3b0f200-2f57-4c25-bdf4-8d17165642fe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:12:30 np0005588920 nova_compute[226886]: 2026-01-20 15:12:30.537 226890 DEBUG nova.virt.hardware [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 10:12:30 np0005588920 nova_compute[226886]: 2026-01-20 15:12:30.537 226890 INFO nova.compute.claims [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 10:12:30 np0005588920 nova_compute[226886]: 2026-01-20 15:12:30.537 226890 DEBUG nova.objects.instance [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Lazy-loading 'resources' on Instance uuid f3b0f200-2f57-4c25-bdf4-8d17165642fe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:12:30 np0005588920 nova_compute[226886]: 2026-01-20 15:12:30.552 226890 DEBUG nova.objects.instance [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Lazy-loading 'numa_topology' on Instance uuid f3b0f200-2f57-4c25-bdf4-8d17165642fe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:12:30 np0005588920 nova_compute[226886]: 2026-01-20 15:12:30.570 226890 DEBUG nova.objects.instance [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Lazy-loading 'pci_devices' on Instance uuid f3b0f200-2f57-4c25-bdf4-8d17165642fe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:12:30 np0005588920 nova_compute[226886]: 2026-01-20 15:12:30.634 226890 INFO nova.compute.resource_tracker [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Updating resource usage from migration fb286472-618a-41fb-a80d-8579de56af31#033[00m
Jan 20 10:12:30 np0005588920 nova_compute[226886]: 2026-01-20 15:12:30.635 226890 DEBUG nova.compute.resource_tracker [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Starting to track incoming migration fb286472-618a-41fb-a80d-8579de56af31 with flavor 522deaab-a741-4dbb-932d-d8b13a211c33 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Jan 20 10:12:30 np0005588920 nova_compute[226886]: 2026-01-20 15:12:30.743 226890 DEBUG oslo_concurrency.processutils [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:12:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:31.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:31 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:12:31 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1500870900' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:12:31 np0005588920 nova_compute[226886]: 2026-01-20 15:12:31.197 226890 DEBUG oslo_concurrency.processutils [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:12:31 np0005588920 nova_compute[226886]: 2026-01-20 15:12:31.203 226890 DEBUG nova.compute.provider_tree [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:12:31 np0005588920 nova_compute[226886]: 2026-01-20 15:12:31.229 226890 DEBUG nova.scheduler.client.report [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:12:31 np0005588920 nova_compute[226886]: 2026-01-20 15:12:31.272 226890 DEBUG oslo_concurrency.lockutils [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.798s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:12:31 np0005588920 nova_compute[226886]: 2026-01-20 15:12:31.273 226890 INFO nova.compute.manager [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Migrating#033[00m
Jan 20 10:12:31 np0005588920 podman[293096]: 2026-01-20 15:12:31.992653962 +0000 UTC m=+0.071813532 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 20 10:12:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:32.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:32 np0005588920 nova_compute[226886]: 2026-01-20 15:12:32.200 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:32 np0005588920 nova_compute[226886]: 2026-01-20 15:12:32.242 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e392 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:12:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 10:12:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:33.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 10:12:33 np0005588920 nova_compute[226886]: 2026-01-20 15:12:33.317 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:12:33.317 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=58, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=57) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:12:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:12:33.318 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:12:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:12:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:34.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:12:34 np0005588920 systemd[1]: Created slice User Slice of UID 42436.
Jan 20 10:12:34 np0005588920 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 20 10:12:34 np0005588920 systemd-logind[783]: New session 57 of user nova.
Jan 20 10:12:34 np0005588920 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 20 10:12:34 np0005588920 systemd[1]: Starting User Manager for UID 42436...
Jan 20 10:12:34 np0005588920 systemd[293121]: Queued start job for default target Main User Target.
Jan 20 10:12:34 np0005588920 systemd[293121]: Created slice User Application Slice.
Jan 20 10:12:34 np0005588920 systemd[293121]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 20 10:12:34 np0005588920 systemd[293121]: Started Daily Cleanup of User's Temporary Directories.
Jan 20 10:12:34 np0005588920 systemd[293121]: Reached target Paths.
Jan 20 10:12:34 np0005588920 systemd[293121]: Reached target Timers.
Jan 20 10:12:34 np0005588920 systemd[293121]: Starting D-Bus User Message Bus Socket...
Jan 20 10:12:34 np0005588920 systemd[293121]: Starting Create User's Volatile Files and Directories...
Jan 20 10:12:34 np0005588920 systemd[293121]: Listening on D-Bus User Message Bus Socket.
Jan 20 10:12:34 np0005588920 systemd[293121]: Reached target Sockets.
Jan 20 10:12:34 np0005588920 systemd[293121]: Finished Create User's Volatile Files and Directories.
Jan 20 10:12:34 np0005588920 systemd[293121]: Reached target Basic System.
Jan 20 10:12:34 np0005588920 systemd[1]: Started User Manager for UID 42436.
Jan 20 10:12:34 np0005588920 systemd[293121]: Reached target Main User Target.
Jan 20 10:12:34 np0005588920 systemd[293121]: Startup finished in 165ms.
Jan 20 10:12:34 np0005588920 systemd[1]: Started Session 57 of User nova.
Jan 20 10:12:34 np0005588920 systemd-logind[783]: Session 57 logged out. Waiting for processes to exit.
Jan 20 10:12:34 np0005588920 systemd[1]: session-57.scope: Deactivated successfully.
Jan 20 10:12:34 np0005588920 systemd-logind[783]: Removed session 57.
Jan 20 10:12:34 np0005588920 systemd-logind[783]: New session 59 of user nova.
Jan 20 10:12:34 np0005588920 systemd[1]: Started Session 59 of User nova.
Jan 20 10:12:34 np0005588920 systemd[1]: session-59.scope: Deactivated successfully.
Jan 20 10:12:34 np0005588920 systemd-logind[783]: Session 59 logged out. Waiting for processes to exit.
Jan 20 10:12:34 np0005588920 systemd-logind[783]: Removed session 59.
Jan 20 10:12:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:35.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:35 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:12:35 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:12:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:12:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:36.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:12:37 np0005588920 nova_compute[226886]: 2026-01-20 15:12:37.247 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:37.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e392 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:12:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e393 e393: 3 total, 3 up, 3 in
Jan 20 10:12:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:38.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:12:38.320 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '58'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:12:38 np0005588920 nova_compute[226886]: 2026-01-20 15:12:38.434 226890 DEBUG nova.compute.manager [req-2d728edf-5f04-4aaa-b80e-e171d9196a0b req-8c0a20cb-e8cf-4665-90a0-c4ca7cb7cdf3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Received event network-vif-unplugged-25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:12:38 np0005588920 nova_compute[226886]: 2026-01-20 15:12:38.434 226890 DEBUG oslo_concurrency.lockutils [req-2d728edf-5f04-4aaa-b80e-e171d9196a0b req-8c0a20cb-e8cf-4665-90a0-c4ca7cb7cdf3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f3b0f200-2f57-4c25-bdf4-8d17165642fe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:12:38 np0005588920 nova_compute[226886]: 2026-01-20 15:12:38.435 226890 DEBUG oslo_concurrency.lockutils [req-2d728edf-5f04-4aaa-b80e-e171d9196a0b req-8c0a20cb-e8cf-4665-90a0-c4ca7cb7cdf3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f3b0f200-2f57-4c25-bdf4-8d17165642fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:12:38 np0005588920 nova_compute[226886]: 2026-01-20 15:12:38.435 226890 DEBUG oslo_concurrency.lockutils [req-2d728edf-5f04-4aaa-b80e-e171d9196a0b req-8c0a20cb-e8cf-4665-90a0-c4ca7cb7cdf3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f3b0f200-2f57-4c25-bdf4-8d17165642fe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:12:38 np0005588920 nova_compute[226886]: 2026-01-20 15:12:38.435 226890 DEBUG nova.compute.manager [req-2d728edf-5f04-4aaa-b80e-e171d9196a0b req-8c0a20cb-e8cf-4665-90a0-c4ca7cb7cdf3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] No waiting events found dispatching network-vif-unplugged-25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:12:38 np0005588920 nova_compute[226886]: 2026-01-20 15:12:38.436 226890 WARNING nova.compute.manager [req-2d728edf-5f04-4aaa-b80e-e171d9196a0b req-8c0a20cb-e8cf-4665-90a0-c4ca7cb7cdf3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Received unexpected event network-vif-unplugged-25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301 for instance with vm_state active and task_state resize_migrating.#033[00m
Jan 20 10:12:38 np0005588920 nova_compute[226886]: 2026-01-20 15:12:38.663 226890 DEBUG oslo_concurrency.lockutils [None req-17b94ea7-7ac4-4c98-9c7f-7f3472dbfab7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Acquiring lock "950f84b7-e9c2-415c-9946-315a443331c9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:12:38 np0005588920 nova_compute[226886]: 2026-01-20 15:12:38.664 226890 DEBUG oslo_concurrency.lockutils [None req-17b94ea7-7ac4-4c98-9c7f-7f3472dbfab7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "950f84b7-e9c2-415c-9946-315a443331c9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:12:38 np0005588920 nova_compute[226886]: 2026-01-20 15:12:38.664 226890 DEBUG oslo_concurrency.lockutils [None req-17b94ea7-7ac4-4c98-9c7f-7f3472dbfab7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Acquiring lock "950f84b7-e9c2-415c-9946-315a443331c9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:12:38 np0005588920 nova_compute[226886]: 2026-01-20 15:12:38.665 226890 DEBUG oslo_concurrency.lockutils [None req-17b94ea7-7ac4-4c98-9c7f-7f3472dbfab7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "950f84b7-e9c2-415c-9946-315a443331c9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:12:38 np0005588920 nova_compute[226886]: 2026-01-20 15:12:38.665 226890 DEBUG oslo_concurrency.lockutils [None req-17b94ea7-7ac4-4c98-9c7f-7f3472dbfab7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "950f84b7-e9c2-415c-9946-315a443331c9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:12:38 np0005588920 nova_compute[226886]: 2026-01-20 15:12:38.666 226890 INFO nova.compute.manager [None req-17b94ea7-7ac4-4c98-9c7f-7f3472dbfab7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Terminating instance#033[00m
Jan 20 10:12:38 np0005588920 nova_compute[226886]: 2026-01-20 15:12:38.668 226890 DEBUG nova.compute.manager [None req-17b94ea7-7ac4-4c98-9c7f-7f3472dbfab7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:12:38 np0005588920 kernel: tapc9a63e6d-7c (unregistering): left promiscuous mode
Jan 20 10:12:38 np0005588920 NetworkManager[49076]: <info>  [1768921958.7509] device (tapc9a63e6d-7c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:12:38 np0005588920 ovn_controller[133971]: 2026-01-20T15:12:38Z|00829|binding|INFO|Releasing lport c9a63e6d-7cf9-437e-8c19-8faadd126360 from this chassis (sb_readonly=0)
Jan 20 10:12:38 np0005588920 ovn_controller[133971]: 2026-01-20T15:12:38Z|00830|binding|INFO|Setting lport c9a63e6d-7cf9-437e-8c19-8faadd126360 down in Southbound
Jan 20 10:12:38 np0005588920 ovn_controller[133971]: 2026-01-20T15:12:38Z|00831|binding|INFO|Removing iface tapc9a63e6d-7c ovn-installed in OVS
Jan 20 10:12:38 np0005588920 nova_compute[226886]: 2026-01-20 15:12:38.763 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:12:38.770 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:41:97:ec 10.100.0.6'], port_security=['fa:16:3e:41:97:ec 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '950f84b7-e9c2-415c-9946-315a443331c9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b677f1a9-dbaa-4373-8466-bd9ccf067b91', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a49638950e1543fa8e0d251af5479623', 'neutron:revision_number': '4', 'neutron:security_group_ids': '4b29dba7-f5e9-43b9-8689-0b93b990ce82', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.246'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=76ec1139-009f-49fe-bfde-07c0ef9e8b12, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=c9a63e6d-7cf9-437e-8c19-8faadd126360) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:12:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:12:38.772 144128 INFO neutron.agent.ovn.metadata.agent [-] Port c9a63e6d-7cf9-437e-8c19-8faadd126360 in datapath b677f1a9-dbaa-4373-8466-bd9ccf067b91 unbound from our chassis#033[00m
Jan 20 10:12:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:12:38.773 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b677f1a9-dbaa-4373-8466-bd9ccf067b91, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:12:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:12:38.775 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[73be2aaf-282b-4e3b-880c-e672893db055]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:12:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:12:38.775 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91 namespace which is not needed anymore#033[00m
Jan 20 10:12:38 np0005588920 nova_compute[226886]: 2026-01-20 15:12:38.781 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:38 np0005588920 systemd[1]: machine-qemu\x2d85\x2dinstance\x2d000000b0.scope: Deactivated successfully.
Jan 20 10:12:38 np0005588920 systemd[1]: machine-qemu\x2d85\x2dinstance\x2d000000b0.scope: Consumed 16.427s CPU time.
Jan 20 10:12:38 np0005588920 systemd-machined[196121]: Machine qemu-85-instance-000000b0 terminated.
Jan 20 10:12:38 np0005588920 nova_compute[226886]: 2026-01-20 15:12:38.914 226890 INFO nova.virt.libvirt.driver [-] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Instance destroyed successfully.#033[00m
Jan 20 10:12:38 np0005588920 nova_compute[226886]: 2026-01-20 15:12:38.918 226890 DEBUG nova.objects.instance [None req-17b94ea7-7ac4-4c98-9c7f-7f3472dbfab7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lazy-loading 'resources' on Instance uuid 950f84b7-e9c2-415c-9946-315a443331c9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:12:38 np0005588920 nova_compute[226886]: 2026-01-20 15:12:38.934 226890 DEBUG nova.virt.libvirt.vif [None req-17b94ea7-7ac4-4c98-9c7f-7f3472dbfab7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:11:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-volume-backed-server-266549656',display_name='tempest-TestVolumeBootPattern-volume-backed-server-266549656',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testvolumebootpattern-volume-backed-server-266549656',id=176,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBuWqdSFN4v32AEaozku8d41vH/mRJUyoKyIct8fqpSkSqPno5OBW4av+JB51tPd+VsGiKVNTsGNsa05+ILtQUEp0CSB89twmlfd8wTE4PhvHzk2Ao6haoSQ8x9/IGME0Q==',key_name='tempest-keypair-1753747243',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:11:24Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a49638950e1543fa8e0d251af5479623',ramdisk_id='',reservation_id='r-l9ug1paz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-TestVolumeBootPattern-194644003',owner_user_name='tempest-TestVolumeBootPattern-194644003-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:11:24Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='bf422e55e158420cbdae75f07a3bb97a',uuid=950f84b7-e9c2-415c-9946-315a443331c9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c9a63e6d-7cf9-437e-8c19-8faadd126360", "address": "fa:16:3e:41:97:ec", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, 
"meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9a63e6d-7c", "ovs_interfaceid": "c9a63e6d-7cf9-437e-8c19-8faadd126360", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:12:38 np0005588920 nova_compute[226886]: 2026-01-20 15:12:38.934 226890 DEBUG nova.network.os_vif_util [None req-17b94ea7-7ac4-4c98-9c7f-7f3472dbfab7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Converting VIF {"id": "c9a63e6d-7cf9-437e-8c19-8faadd126360", "address": "fa:16:3e:41:97:ec", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9a63e6d-7c", "ovs_interfaceid": "c9a63e6d-7cf9-437e-8c19-8faadd126360", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:12:38 np0005588920 nova_compute[226886]: 2026-01-20 15:12:38.935 226890 DEBUG nova.network.os_vif_util [None req-17b94ea7-7ac4-4c98-9c7f-7f3472dbfab7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:41:97:ec,bridge_name='br-int',has_traffic_filtering=True,id=c9a63e6d-7cf9-437e-8c19-8faadd126360,network=Network(b677f1a9-dbaa-4373-8466-bd9ccf067b91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9a63e6d-7c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:12:38 np0005588920 nova_compute[226886]: 2026-01-20 15:12:38.935 226890 DEBUG os_vif [None req-17b94ea7-7ac4-4c98-9c7f-7f3472dbfab7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:41:97:ec,bridge_name='br-int',has_traffic_filtering=True,id=c9a63e6d-7cf9-437e-8c19-8faadd126360,network=Network(b677f1a9-dbaa-4373-8466-bd9ccf067b91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9a63e6d-7c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:12:38 np0005588920 nova_compute[226886]: 2026-01-20 15:12:38.938 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:38 np0005588920 nova_compute[226886]: 2026-01-20 15:12:38.938 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc9a63e6d-7c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:12:38 np0005588920 nova_compute[226886]: 2026-01-20 15:12:38.939 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:38 np0005588920 neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91[292443]: [NOTICE]   (292452) : haproxy version is 2.8.14-c23fe91
Jan 20 10:12:38 np0005588920 neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91[292443]: [NOTICE]   (292452) : path to executable is /usr/sbin/haproxy
Jan 20 10:12:38 np0005588920 neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91[292443]: [WARNING]  (292452) : Exiting Master process...
Jan 20 10:12:38 np0005588920 neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91[292443]: [WARNING]  (292452) : Exiting Master process...
Jan 20 10:12:38 np0005588920 nova_compute[226886]: 2026-01-20 15:12:38.941 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:38 np0005588920 neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91[292443]: [ALERT]    (292452) : Current worker (292455) exited with code 143 (Terminated)
Jan 20 10:12:38 np0005588920 neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91[292443]: [WARNING]  (292452) : All workers exited. Exiting... (0)
Jan 20 10:12:38 np0005588920 nova_compute[226886]: 2026-01-20 15:12:38.944 226890 INFO os_vif [None req-17b94ea7-7ac4-4c98-9c7f-7f3472dbfab7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:41:97:ec,bridge_name='br-int',has_traffic_filtering=True,id=c9a63e6d-7cf9-437e-8c19-8faadd126360,network=Network(b677f1a9-dbaa-4373-8466-bd9ccf067b91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9a63e6d-7c')#033[00m
Jan 20 10:12:38 np0005588920 systemd[1]: libpod-887a9e776562ea1532af43735b5cb860209bcad687b3754496e6adee2f25e5de.scope: Deactivated successfully.
Jan 20 10:12:38 np0005588920 podman[293218]: 2026-01-20 15:12:38.952930367 +0000 UTC m=+0.057939968 container died 887a9e776562ea1532af43735b5cb860209bcad687b3754496e6adee2f25e5de (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 20 10:12:38 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-887a9e776562ea1532af43735b5cb860209bcad687b3754496e6adee2f25e5de-userdata-shm.mount: Deactivated successfully.
Jan 20 10:12:38 np0005588920 systemd[1]: var-lib-containers-storage-overlay-0bec4fab7e0a405b732812f885e5ccb7e5e52485ab5d0d9c451a2c2759b1e8e2-merged.mount: Deactivated successfully.
Jan 20 10:12:38 np0005588920 podman[293218]: 2026-01-20 15:12:38.99424035 +0000 UTC m=+0.099249941 container cleanup 887a9e776562ea1532af43735b5cb860209bcad687b3754496e6adee2f25e5de (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 20 10:12:39 np0005588920 systemd[1]: libpod-conmon-887a9e776562ea1532af43735b5cb860209bcad687b3754496e6adee2f25e5de.scope: Deactivated successfully.
Jan 20 10:12:39 np0005588920 podman[293274]: 2026-01-20 15:12:39.062381734 +0000 UTC m=+0.046164525 container remove 887a9e776562ea1532af43735b5cb860209bcad687b3754496e6adee2f25e5de (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 20 10:12:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:12:39.071 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[efdb836c-2733-4677-91ca-a101e5a487a4]: (4, ('Tue Jan 20 03:12:38 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91 (887a9e776562ea1532af43735b5cb860209bcad687b3754496e6adee2f25e5de)\n887a9e776562ea1532af43735b5cb860209bcad687b3754496e6adee2f25e5de\nTue Jan 20 03:12:39 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91 (887a9e776562ea1532af43735b5cb860209bcad687b3754496e6adee2f25e5de)\n887a9e776562ea1532af43735b5cb860209bcad687b3754496e6adee2f25e5de\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:12:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:12:39.072 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[8e8ab442-f7e7-489f-8a55-ae59cce62603]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:12:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:12:39.073 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb677f1a9-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:12:39 np0005588920 nova_compute[226886]: 2026-01-20 15:12:39.075 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:39 np0005588920 kernel: tapb677f1a9-d0: left promiscuous mode
Jan 20 10:12:39 np0005588920 nova_compute[226886]: 2026-01-20 15:12:39.087 226890 DEBUG nova.compute.manager [req-3b10c1d8-979f-4180-8665-9db9f9955ac2 req-21c2fe29-3e82-4ff3-aba3-73cb07e216d3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Received event network-vif-unplugged-c9a63e6d-7cf9-437e-8c19-8faadd126360 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:12:39 np0005588920 nova_compute[226886]: 2026-01-20 15:12:39.088 226890 DEBUG oslo_concurrency.lockutils [req-3b10c1d8-979f-4180-8665-9db9f9955ac2 req-21c2fe29-3e82-4ff3-aba3-73cb07e216d3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "950f84b7-e9c2-415c-9946-315a443331c9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:12:39 np0005588920 nova_compute[226886]: 2026-01-20 15:12:39.089 226890 DEBUG oslo_concurrency.lockutils [req-3b10c1d8-979f-4180-8665-9db9f9955ac2 req-21c2fe29-3e82-4ff3-aba3-73cb07e216d3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "950f84b7-e9c2-415c-9946-315a443331c9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:12:39 np0005588920 nova_compute[226886]: 2026-01-20 15:12:39.089 226890 DEBUG oslo_concurrency.lockutils [req-3b10c1d8-979f-4180-8665-9db9f9955ac2 req-21c2fe29-3e82-4ff3-aba3-73cb07e216d3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "950f84b7-e9c2-415c-9946-315a443331c9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:12:39 np0005588920 nova_compute[226886]: 2026-01-20 15:12:39.090 226890 DEBUG nova.compute.manager [req-3b10c1d8-979f-4180-8665-9db9f9955ac2 req-21c2fe29-3e82-4ff3-aba3-73cb07e216d3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] No waiting events found dispatching network-vif-unplugged-c9a63e6d-7cf9-437e-8c19-8faadd126360 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:12:39 np0005588920 nova_compute[226886]: 2026-01-20 15:12:39.090 226890 DEBUG nova.compute.manager [req-3b10c1d8-979f-4180-8665-9db9f9955ac2 req-21c2fe29-3e82-4ff3-aba3-73cb07e216d3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Received event network-vif-unplugged-c9a63e6d-7cf9-437e-8c19-8faadd126360 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:12:39 np0005588920 nova_compute[226886]: 2026-01-20 15:12:39.090 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:12:39.092 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[abfd46d8-452c-4685-89ce-adc488194967]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:12:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:12:39.107 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[48a6b8a0-c623-45d9-bfac-88ce7e4f364a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:12:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:12:39.109 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0441dbf8-881a-4391-b611-934c1004b04e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:12:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:12:39.127 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2c8367d5-ddaa-4a6c-b3fa-d6d49d23c75e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 692501, 'reachable_time': 30002, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 293288, 'error': None, 'target': 'ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:12:39 np0005588920 systemd[1]: run-netns-ovnmeta\x2db677f1a9\x2ddbaa\x2d4373\x2d8466\x2dbd9ccf067b91.mount: Deactivated successfully.
Jan 20 10:12:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:12:39.130 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:12:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:12:39.131 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[dbde3cad-f71f-4a00-81ae-4c6699234769]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:12:39 np0005588920 nova_compute[226886]: 2026-01-20 15:12:39.162 226890 INFO nova.virt.libvirt.driver [None req-17b94ea7-7ac4-4c98-9c7f-7f3472dbfab7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Deleting instance files /var/lib/nova/instances/950f84b7-e9c2-415c-9946-315a443331c9_del#033[00m
Jan 20 10:12:39 np0005588920 nova_compute[226886]: 2026-01-20 15:12:39.163 226890 INFO nova.virt.libvirt.driver [None req-17b94ea7-7ac4-4c98-9c7f-7f3472dbfab7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Deletion of /var/lib/nova/instances/950f84b7-e9c2-415c-9946-315a443331c9_del complete#033[00m
Jan 20 10:12:39 np0005588920 nova_compute[226886]: 2026-01-20 15:12:39.294 226890 INFO nova.compute.manager [None req-17b94ea7-7ac4-4c98-9c7f-7f3472dbfab7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Took 0.63 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:12:39 np0005588920 nova_compute[226886]: 2026-01-20 15:12:39.294 226890 DEBUG oslo.service.loopingcall [None req-17b94ea7-7ac4-4c98-9c7f-7f3472dbfab7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:12:39 np0005588920 nova_compute[226886]: 2026-01-20 15:12:39.294 226890 DEBUG nova.compute.manager [-] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:12:39 np0005588920 nova_compute[226886]: 2026-01-20 15:12:39.295 226890 DEBUG nova.network.neutron [-] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:12:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:39.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:40 np0005588920 nova_compute[226886]: 2026-01-20 15:12:40.058 226890 INFO nova.network.neutron [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Updating port 25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301 with attributes {'binding:host_id': 'compute-2.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Jan 20 10:12:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:40.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:40 np0005588920 nova_compute[226886]: 2026-01-20 15:12:40.555 226890 DEBUG nova.compute.manager [req-b30fb206-28f5-44f1-8f3f-c1f28b5fd6e9 req-c71a8222-51b1-427b-9bae-e83229426101 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Received event network-vif-plugged-25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:12:40 np0005588920 nova_compute[226886]: 2026-01-20 15:12:40.556 226890 DEBUG oslo_concurrency.lockutils [req-b30fb206-28f5-44f1-8f3f-c1f28b5fd6e9 req-c71a8222-51b1-427b-9bae-e83229426101 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f3b0f200-2f57-4c25-bdf4-8d17165642fe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:12:40 np0005588920 nova_compute[226886]: 2026-01-20 15:12:40.556 226890 DEBUG oslo_concurrency.lockutils [req-b30fb206-28f5-44f1-8f3f-c1f28b5fd6e9 req-c71a8222-51b1-427b-9bae-e83229426101 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f3b0f200-2f57-4c25-bdf4-8d17165642fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:12:40 np0005588920 nova_compute[226886]: 2026-01-20 15:12:40.556 226890 DEBUG oslo_concurrency.lockutils [req-b30fb206-28f5-44f1-8f3f-c1f28b5fd6e9 req-c71a8222-51b1-427b-9bae-e83229426101 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f3b0f200-2f57-4c25-bdf4-8d17165642fe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:12:40 np0005588920 nova_compute[226886]: 2026-01-20 15:12:40.556 226890 DEBUG nova.compute.manager [req-b30fb206-28f5-44f1-8f3f-c1f28b5fd6e9 req-c71a8222-51b1-427b-9bae-e83229426101 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] No waiting events found dispatching network-vif-plugged-25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:12:40 np0005588920 nova_compute[226886]: 2026-01-20 15:12:40.556 226890 WARNING nova.compute.manager [req-b30fb206-28f5-44f1-8f3f-c1f28b5fd6e9 req-c71a8222-51b1-427b-9bae-e83229426101 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Received unexpected event network-vif-plugged-25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301 for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 20 10:12:40 np0005588920 nova_compute[226886]: 2026-01-20 15:12:40.893 226890 DEBUG oslo_concurrency.lockutils [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Acquiring lock "refresh_cache-f3b0f200-2f57-4c25-bdf4-8d17165642fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:12:40 np0005588920 nova_compute[226886]: 2026-01-20 15:12:40.893 226890 DEBUG oslo_concurrency.lockutils [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Acquired lock "refresh_cache-f3b0f200-2f57-4c25-bdf4-8d17165642fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:12:40 np0005588920 nova_compute[226886]: 2026-01-20 15:12:40.894 226890 DEBUG nova.network.neutron [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:12:41 np0005588920 nova_compute[226886]: 2026-01-20 15:12:41.296 226890 DEBUG nova.network.neutron [-] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:12:41 np0005588920 nova_compute[226886]: 2026-01-20 15:12:41.409 226890 DEBUG nova.compute.manager [req-6c603b11-14d3-40b9-96d7-246e412b2044 req-acda59cc-eb49-4a67-9909-d104426f4b05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Received event network-vif-plugged-c9a63e6d-7cf9-437e-8c19-8faadd126360 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:12:41 np0005588920 nova_compute[226886]: 2026-01-20 15:12:41.410 226890 DEBUG oslo_concurrency.lockutils [req-6c603b11-14d3-40b9-96d7-246e412b2044 req-acda59cc-eb49-4a67-9909-d104426f4b05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "950f84b7-e9c2-415c-9946-315a443331c9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:12:41 np0005588920 nova_compute[226886]: 2026-01-20 15:12:41.410 226890 DEBUG oslo_concurrency.lockutils [req-6c603b11-14d3-40b9-96d7-246e412b2044 req-acda59cc-eb49-4a67-9909-d104426f4b05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "950f84b7-e9c2-415c-9946-315a443331c9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:12:41 np0005588920 nova_compute[226886]: 2026-01-20 15:12:41.410 226890 DEBUG oslo_concurrency.lockutils [req-6c603b11-14d3-40b9-96d7-246e412b2044 req-acda59cc-eb49-4a67-9909-d104426f4b05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "950f84b7-e9c2-415c-9946-315a443331c9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:12:41 np0005588920 nova_compute[226886]: 2026-01-20 15:12:41.410 226890 DEBUG nova.compute.manager [req-6c603b11-14d3-40b9-96d7-246e412b2044 req-acda59cc-eb49-4a67-9909-d104426f4b05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] No waiting events found dispatching network-vif-plugged-c9a63e6d-7cf9-437e-8c19-8faadd126360 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:12:41 np0005588920 nova_compute[226886]: 2026-01-20 15:12:41.411 226890 WARNING nova.compute.manager [req-6c603b11-14d3-40b9-96d7-246e412b2044 req-acda59cc-eb49-4a67-9909-d104426f4b05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Received unexpected event network-vif-plugged-c9a63e6d-7cf9-437e-8c19-8faadd126360 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 10:12:41 np0005588920 nova_compute[226886]: 2026-01-20 15:12:41.411 226890 DEBUG nova.compute.manager [req-6c603b11-14d3-40b9-96d7-246e412b2044 req-acda59cc-eb49-4a67-9909-d104426f4b05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Received event network-changed-25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:12:41 np0005588920 nova_compute[226886]: 2026-01-20 15:12:41.411 226890 DEBUG nova.compute.manager [req-6c603b11-14d3-40b9-96d7-246e412b2044 req-acda59cc-eb49-4a67-9909-d104426f4b05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Refreshing instance network info cache due to event network-changed-25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:12:41 np0005588920 nova_compute[226886]: 2026-01-20 15:12:41.411 226890 DEBUG oslo_concurrency.lockutils [req-6c603b11-14d3-40b9-96d7-246e412b2044 req-acda59cc-eb49-4a67-9909-d104426f4b05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-f3b0f200-2f57-4c25-bdf4-8d17165642fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:12:41 np0005588920 nova_compute[226886]: 2026-01-20 15:12:41.413 226890 INFO nova.compute.manager [-] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Took 2.12 seconds to deallocate network for instance.#033[00m
Jan 20 10:12:41 np0005588920 nova_compute[226886]: 2026-01-20 15:12:41.472 226890 DEBUG nova.compute.manager [req-351114a4-f523-4e87-a744-d432accddabe req-14ab1925-74cd-4a58-a5f2-de4874e7eee7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Received event network-vif-deleted-c9a63e6d-7cf9-437e-8c19-8faadd126360 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:12:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:12:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:41.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:12:41 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #130. Immutable memtables: 0.
Jan 20 10:12:41 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:12:41.786579) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:12:41 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 81] Flushing memtable with next log file: 130
Jan 20 10:12:41 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921961786643, "job": 81, "event": "flush_started", "num_memtables": 1, "num_entries": 540, "num_deletes": 252, "total_data_size": 825110, "memory_usage": 836008, "flush_reason": "Manual Compaction"}
Jan 20 10:12:41 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 81] Level-0 flush table #131: started
Jan 20 10:12:41 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921961792088, "cf_name": "default", "job": 81, "event": "table_file_creation", "file_number": 131, "file_size": 472423, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 65155, "largest_seqno": 65690, "table_properties": {"data_size": 469520, "index_size": 874, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 965, "raw_key_size": 7759, "raw_average_key_size": 21, "raw_value_size": 463546, "raw_average_value_size": 1266, "num_data_blocks": 36, "num_entries": 366, "num_filter_entries": 366, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768921942, "oldest_key_time": 1768921942, "file_creation_time": 1768921961, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 131, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:12:41 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 81] Flush lasted 5540 microseconds, and 2116 cpu microseconds.
Jan 20 10:12:41 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:12:41 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:12:41.792128) [db/flush_job.cc:967] [default] [JOB 81] Level-0 flush table #131: 472423 bytes OK
Jan 20 10:12:41 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:12:41.792145) [db/memtable_list.cc:519] [default] Level-0 commit table #131 started
Jan 20 10:12:41 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:12:41.794237) [db/memtable_list.cc:722] [default] Level-0 commit table #131: memtable #1 done
Jan 20 10:12:41 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:12:41.794252) EVENT_LOG_v1 {"time_micros": 1768921961794246, "job": 81, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:12:41 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:12:41.794268) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:12:41 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 81] Try to delete WAL files size 821918, prev total WAL file size 821918, number of live WAL files 2.
Jan 20 10:12:41 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000127.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:12:41 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:12:41.794802) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740032303035' seq:72057594037927935, type:22 .. '6D6772737461740032323538' seq:0, type:0; will stop at (end)
Jan 20 10:12:41 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 82] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:12:41 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 81 Base level 0, inputs: [131(461KB)], [129(13MB)]
Jan 20 10:12:41 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921961794872, "job": 82, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [131], "files_L6": [129], "score": -1, "input_data_size": 14813810, "oldest_snapshot_seqno": -1}
Jan 20 10:12:41 np0005588920 nova_compute[226886]: 2026-01-20 15:12:41.797 226890 INFO nova.compute.manager [None req-17b94ea7-7ac4-4c98-9c7f-7f3472dbfab7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Took 0.38 seconds to detach 1 volumes for instance.#033[00m
Jan 20 10:12:41 np0005588920 nova_compute[226886]: 2026-01-20 15:12:41.799 226890 DEBUG nova.compute.manager [None req-17b94ea7-7ac4-4c98-9c7f-7f3472dbfab7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Deleting volume: fd11049d-6334-4d6c-ac5d-8cfeca690b75 _cleanup_volumes /usr/lib/python3.9/site-packages/nova/compute/manager.py:3217#033[00m
Jan 20 10:12:41 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 82] Generated table #132: 8847 keys, 10986394 bytes, temperature: kUnknown
Jan 20 10:12:41 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921961877591, "cf_name": "default", "job": 82, "event": "table_file_creation", "file_number": 132, "file_size": 10986394, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10929539, "index_size": 33638, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 22149, "raw_key_size": 232912, "raw_average_key_size": 26, "raw_value_size": 10774237, "raw_average_value_size": 1217, "num_data_blocks": 1287, "num_entries": 8847, "num_filter_entries": 8847, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768921961, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 132, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:12:41 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:12:41 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:12:41.877977) [db/compaction/compaction_job.cc:1663] [default] [JOB 82] Compacted 1@0 + 1@6 files to L6 => 10986394 bytes
Jan 20 10:12:41 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:12:41.879172) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 178.9 rd, 132.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.5, 13.7 +0.0 blob) out(10.5 +0.0 blob), read-write-amplify(54.6) write-amplify(23.3) OK, records in: 9367, records dropped: 520 output_compression: NoCompression
Jan 20 10:12:41 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:12:41.879240) EVENT_LOG_v1 {"time_micros": 1768921961879190, "job": 82, "event": "compaction_finished", "compaction_time_micros": 82817, "compaction_time_cpu_micros": 36767, "output_level": 6, "num_output_files": 1, "total_output_size": 10986394, "num_input_records": 9367, "num_output_records": 8847, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:12:41 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000131.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:12:41 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921961879550, "job": 82, "event": "table_file_deletion", "file_number": 131}
Jan 20 10:12:41 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000129.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:12:41 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921961884510, "job": 82, "event": "table_file_deletion", "file_number": 129}
Jan 20 10:12:41 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:12:41.794658) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:12:41 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:12:41.884607) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:12:41 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:12:41.884615) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:12:41 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:12:41.884618) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:12:41 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:12:41.884620) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:12:41 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:12:41.884623) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:12:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:12:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:42.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:12:42 np0005588920 nova_compute[226886]: 2026-01-20 15:12:42.254 226890 DEBUG oslo_concurrency.lockutils [None req-17b94ea7-7ac4-4c98-9c7f-7f3472dbfab7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:12:42 np0005588920 nova_compute[226886]: 2026-01-20 15:12:42.254 226890 DEBUG oslo_concurrency.lockutils [None req-17b94ea7-7ac4-4c98-9c7f-7f3472dbfab7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:12:42 np0005588920 nova_compute[226886]: 2026-01-20 15:12:42.299 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:42 np0005588920 nova_compute[226886]: 2026-01-20 15:12:42.420 226890 DEBUG oslo_concurrency.processutils [None req-17b94ea7-7ac4-4c98-9c7f-7f3472dbfab7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:12:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e393 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:12:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:12:42 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1046212052' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:12:42 np0005588920 nova_compute[226886]: 2026-01-20 15:12:42.887 226890 DEBUG oslo_concurrency.processutils [None req-17b94ea7-7ac4-4c98-9c7f-7f3472dbfab7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:12:42 np0005588920 nova_compute[226886]: 2026-01-20 15:12:42.893 226890 DEBUG nova.compute.provider_tree [None req-17b94ea7-7ac4-4c98-9c7f-7f3472dbfab7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:12:42 np0005588920 nova_compute[226886]: 2026-01-20 15:12:42.968 226890 DEBUG nova.scheduler.client.report [None req-17b94ea7-7ac4-4c98-9c7f-7f3472dbfab7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:12:43 np0005588920 nova_compute[226886]: 2026-01-20 15:12:43.082 226890 DEBUG oslo_concurrency.lockutils [None req-17b94ea7-7ac4-4c98-9c7f-7f3472dbfab7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.828s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:12:43 np0005588920 nova_compute[226886]: 2026-01-20 15:12:43.146 226890 INFO nova.scheduler.client.report [None req-17b94ea7-7ac4-4c98-9c7f-7f3472dbfab7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Deleted allocations for instance 950f84b7-e9c2-415c-9946-315a443331c9#033[00m
Jan 20 10:12:43 np0005588920 nova_compute[226886]: 2026-01-20 15:12:43.252 226890 DEBUG nova.network.neutron [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Updating instance_info_cache with network_info: [{"id": "25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301", "address": "fa:16:3e:ab:6b:07", "network": {"id": "ef6ea4cb-557a-4dec-844c-6c933ddba0b1", "bridge": "br-int", "label": "tempest-network-smoke--1896631991", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.226", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25ad2c72-7d", "ovs_interfaceid": "25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:12:43 np0005588920 nova_compute[226886]: 2026-01-20 15:12:43.299 226890 DEBUG oslo_concurrency.lockutils [None req-17b94ea7-7ac4-4c98-9c7f-7f3472dbfab7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "950f84b7-e9c2-415c-9946-315a443331c9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.635s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:12:43 np0005588920 nova_compute[226886]: 2026-01-20 15:12:43.365 226890 DEBUG oslo_concurrency.lockutils [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Releasing lock "refresh_cache-f3b0f200-2f57-4c25-bdf4-8d17165642fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:12:43 np0005588920 nova_compute[226886]: 2026-01-20 15:12:43.370 226890 DEBUG oslo_concurrency.lockutils [req-6c603b11-14d3-40b9-96d7-246e412b2044 req-acda59cc-eb49-4a67-9909-d104426f4b05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-f3b0f200-2f57-4c25-bdf4-8d17165642fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:12:43 np0005588920 nova_compute[226886]: 2026-01-20 15:12:43.370 226890 DEBUG nova.network.neutron [req-6c603b11-14d3-40b9-96d7-246e412b2044 req-acda59cc-eb49-4a67-9909-d104426f4b05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Refreshing network info cache for port 25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:12:43 np0005588920 nova_compute[226886]: 2026-01-20 15:12:43.487 226890 DEBUG nova.virt.libvirt.driver [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Jan 20 10:12:43 np0005588920 nova_compute[226886]: 2026-01-20 15:12:43.489 226890 DEBUG nova.virt.libvirt.driver [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 20 10:12:43 np0005588920 nova_compute[226886]: 2026-01-20 15:12:43.489 226890 INFO nova.virt.libvirt.driver [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Creating image(s)#033[00m
Jan 20 10:12:43 np0005588920 nova_compute[226886]: 2026-01-20 15:12:43.530 226890 DEBUG nova.storage.rbd_utils [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] creating snapshot(nova-resize) on rbd image(f3b0f200-2f57-4c25-bdf4-8d17165642fe_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 20 10:12:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:43.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:43 np0005588920 nova_compute[226886]: 2026-01-20 15:12:43.940 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e394 e394: 3 total, 3 up, 3 in
Jan 20 10:12:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:12:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:44.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:12:44 np0005588920 nova_compute[226886]: 2026-01-20 15:12:44.151 226890 DEBUG nova.objects.instance [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Lazy-loading 'trusted_certs' on Instance uuid f3b0f200-2f57-4c25-bdf4-8d17165642fe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:12:44 np0005588920 nova_compute[226886]: 2026-01-20 15:12:44.285 226890 DEBUG nova.virt.libvirt.driver [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 20 10:12:44 np0005588920 nova_compute[226886]: 2026-01-20 15:12:44.286 226890 DEBUG nova.virt.libvirt.driver [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Ensure instance console log exists: /var/lib/nova/instances/f3b0f200-2f57-4c25-bdf4-8d17165642fe/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:12:44 np0005588920 nova_compute[226886]: 2026-01-20 15:12:44.286 226890 DEBUG oslo_concurrency.lockutils [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:12:44 np0005588920 nova_compute[226886]: 2026-01-20 15:12:44.287 226890 DEBUG oslo_concurrency.lockutils [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:12:44 np0005588920 nova_compute[226886]: 2026-01-20 15:12:44.287 226890 DEBUG oslo_concurrency.lockutils [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:12:44 np0005588920 nova_compute[226886]: 2026-01-20 15:12:44.290 226890 DEBUG nova.virt.libvirt.driver [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Start _get_guest_xml network_info=[{"id": "25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301", "address": "fa:16:3e:ab:6b:07", "network": {"id": "ef6ea4cb-557a-4dec-844c-6c933ddba0b1", "bridge": "br-int", "label": "tempest-network-smoke--1896631991", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.226", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1896631991", "vif_mac": "fa:16:3e:ab:6b:07"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25ad2c72-7d", "ovs_interfaceid": "25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:12:44 np0005588920 nova_compute[226886]: 2026-01-20 15:12:44.297 226890 WARNING nova.virt.libvirt.driver [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:12:44 np0005588920 nova_compute[226886]: 2026-01-20 15:12:44.302 226890 DEBUG nova.virt.libvirt.host [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:12:44 np0005588920 nova_compute[226886]: 2026-01-20 15:12:44.302 226890 DEBUG nova.virt.libvirt.host [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:12:44 np0005588920 nova_compute[226886]: 2026-01-20 15:12:44.306 226890 DEBUG nova.virt.libvirt.host [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:12:44 np0005588920 nova_compute[226886]: 2026-01-20 15:12:44.306 226890 DEBUG nova.virt.libvirt.host [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:12:44 np0005588920 nova_compute[226886]: 2026-01-20 15:12:44.307 226890 DEBUG nova.virt.libvirt.driver [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:12:44 np0005588920 nova_compute[226886]: 2026-01-20 15:12:44.307 226890 DEBUG nova.virt.hardware [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:12:44 np0005588920 nova_compute[226886]: 2026-01-20 15:12:44.308 226890 DEBUG nova.virt.hardware [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:12:44 np0005588920 nova_compute[226886]: 2026-01-20 15:12:44.308 226890 DEBUG nova.virt.hardware [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:12:44 np0005588920 nova_compute[226886]: 2026-01-20 15:12:44.308 226890 DEBUG nova.virt.hardware [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:12:44 np0005588920 nova_compute[226886]: 2026-01-20 15:12:44.309 226890 DEBUG nova.virt.hardware [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:12:44 np0005588920 nova_compute[226886]: 2026-01-20 15:12:44.309 226890 DEBUG nova.virt.hardware [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:12:44 np0005588920 nova_compute[226886]: 2026-01-20 15:12:44.309 226890 DEBUG nova.virt.hardware [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:12:44 np0005588920 nova_compute[226886]: 2026-01-20 15:12:44.310 226890 DEBUG nova.virt.hardware [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:12:44 np0005588920 nova_compute[226886]: 2026-01-20 15:12:44.310 226890 DEBUG nova.virt.hardware [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:12:44 np0005588920 nova_compute[226886]: 2026-01-20 15:12:44.310 226890 DEBUG nova.virt.hardware [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:12:44 np0005588920 nova_compute[226886]: 2026-01-20 15:12:44.310 226890 DEBUG nova.virt.hardware [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:12:44 np0005588920 nova_compute[226886]: 2026-01-20 15:12:44.311 226890 DEBUG nova.objects.instance [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Lazy-loading 'vcpu_model' on Instance uuid f3b0f200-2f57-4c25-bdf4-8d17165642fe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:12:44 np0005588920 nova_compute[226886]: 2026-01-20 15:12:44.369 226890 DEBUG oslo_concurrency.processutils [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:12:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:12:44 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1692019823' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:12:44 np0005588920 systemd[1]: Stopping User Manager for UID 42436...
Jan 20 10:12:44 np0005588920 systemd[293121]: Activating special unit Exit the Session...
Jan 20 10:12:44 np0005588920 systemd[293121]: Stopped target Main User Target.
Jan 20 10:12:44 np0005588920 systemd[293121]: Stopped target Basic System.
Jan 20 10:12:44 np0005588920 systemd[293121]: Stopped target Paths.
Jan 20 10:12:44 np0005588920 systemd[293121]: Stopped target Sockets.
Jan 20 10:12:44 np0005588920 systemd[293121]: Stopped target Timers.
Jan 20 10:12:44 np0005588920 systemd[293121]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 20 10:12:44 np0005588920 systemd[293121]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 20 10:12:44 np0005588920 systemd[293121]: Closed D-Bus User Message Bus Socket.
Jan 20 10:12:44 np0005588920 systemd[293121]: Stopped Create User's Volatile Files and Directories.
Jan 20 10:12:44 np0005588920 systemd[293121]: Removed slice User Application Slice.
Jan 20 10:12:44 np0005588920 systemd[293121]: Reached target Shutdown.
Jan 20 10:12:44 np0005588920 systemd[293121]: Finished Exit the Session.
Jan 20 10:12:44 np0005588920 systemd[293121]: Reached target Exit the Session.
Jan 20 10:12:44 np0005588920 nova_compute[226886]: 2026-01-20 15:12:44.896 226890 DEBUG oslo_concurrency.processutils [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.527s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:12:44 np0005588920 systemd[1]: user@42436.service: Deactivated successfully.
Jan 20 10:12:44 np0005588920 systemd[1]: Stopped User Manager for UID 42436.
Jan 20 10:12:44 np0005588920 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 20 10:12:44 np0005588920 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 20 10:12:44 np0005588920 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 20 10:12:44 np0005588920 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 20 10:12:44 np0005588920 systemd[1]: Removed slice User Slice of UID 42436.
Jan 20 10:12:44 np0005588920 nova_compute[226886]: 2026-01-20 15:12:44.940 226890 DEBUG oslo_concurrency.processutils [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:12:45 np0005588920 nova_compute[226886]: 2026-01-20 15:12:45.272 226890 DEBUG nova.network.neutron [req-6c603b11-14d3-40b9-96d7-246e412b2044 req-acda59cc-eb49-4a67-9909-d104426f4b05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Updated VIF entry in instance network info cache for port 25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:12:45 np0005588920 nova_compute[226886]: 2026-01-20 15:12:45.274 226890 DEBUG nova.network.neutron [req-6c603b11-14d3-40b9-96d7-246e412b2044 req-acda59cc-eb49-4a67-9909-d104426f4b05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Updating instance_info_cache with network_info: [{"id": "25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301", "address": "fa:16:3e:ab:6b:07", "network": {"id": "ef6ea4cb-557a-4dec-844c-6c933ddba0b1", "bridge": "br-int", "label": "tempest-network-smoke--1896631991", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.226", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25ad2c72-7d", "ovs_interfaceid": "25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:12:45 np0005588920 nova_compute[226886]: 2026-01-20 15:12:45.295 226890 DEBUG oslo_concurrency.lockutils [req-6c603b11-14d3-40b9-96d7-246e412b2044 req-acda59cc-eb49-4a67-9909-d104426f4b05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-f3b0f200-2f57-4c25-bdf4-8d17165642fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:12:45 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:12:45 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/131253235' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:12:45 np0005588920 nova_compute[226886]: 2026-01-20 15:12:45.385 226890 DEBUG oslo_concurrency.processutils [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:12:45 np0005588920 nova_compute[226886]: 2026-01-20 15:12:45.387 226890 DEBUG nova.virt.libvirt.vif [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:11:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-755379148',display_name='tempest-TestNetworkAdvancedServerOps-server-755379148',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-755379148',id=178,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBP0NB6AtDQr7I3Hp0XR7ulJzBFloX/ApnUaNnswWSYzrrT8mFzgvIiFRhCWLiZ+TDOJfVtcwGCfevRbqTmLZ5wdo4P6v9G2NYca0swLwaNQ/zK8Zmxz5PIdul2BRm2ICrw==',key_name='tempest-TestNetworkAdvancedServerOps-1554431653',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:12:02Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1ed5feeeafe7448a8efb47ab975b0ead',ramdisk_id='',reservation_id='r-j0wavald',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-175282664',owner_user_name='tempest-TestNetworkAdvancedServerOps-175282664-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:12:39Z,user_data=None,user_id='442a7a5cb8ea426a82be9762b262d171',uuid=f3b0f200-2f57-4c25-bdf4-8d17165642fe,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301", "address": "fa:16:3e:ab:6b:07", "network": {"id": "ef6ea4cb-557a-4dec-844c-6c933ddba0b1", "bridge": "br-int", "label": "tempest-network-smoke--1896631991", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.226", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1896631991", "vif_mac": "fa:16:3e:ab:6b:07"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25ad2c72-7d", "ovs_interfaceid": "25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:12:45 np0005588920 nova_compute[226886]: 2026-01-20 15:12:45.387 226890 DEBUG nova.network.os_vif_util [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Converting VIF {"id": "25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301", "address": "fa:16:3e:ab:6b:07", "network": {"id": "ef6ea4cb-557a-4dec-844c-6c933ddba0b1", "bridge": "br-int", "label": "tempest-network-smoke--1896631991", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.226", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1896631991", "vif_mac": "fa:16:3e:ab:6b:07"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25ad2c72-7d", "ovs_interfaceid": "25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:12:45 np0005588920 nova_compute[226886]: 2026-01-20 15:12:45.389 226890 DEBUG nova.network.os_vif_util [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ab:6b:07,bridge_name='br-int',has_traffic_filtering=True,id=25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301,network=Network(ef6ea4cb-557a-4dec-844c-6c933ddba0b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25ad2c72-7d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:12:45 np0005588920 nova_compute[226886]: 2026-01-20 15:12:45.392 226890 DEBUG nova.virt.libvirt.driver [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:12:45 np0005588920 nova_compute[226886]:  <uuid>f3b0f200-2f57-4c25-bdf4-8d17165642fe</uuid>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:  <name>instance-000000b2</name>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:12:45 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-755379148</nova:name>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 15:12:44</nova:creationTime>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 10:12:45 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:        <nova:user uuid="442a7a5cb8ea426a82be9762b262d171">tempest-TestNetworkAdvancedServerOps-175282664-project-member</nova:user>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:        <nova:project uuid="1ed5feeeafe7448a8efb47ab975b0ead">tempest-TestNetworkAdvancedServerOps-175282664</nova:project>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:        <nova:port uuid="25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301">
Jan 20 10:12:45 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    <system>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:      <entry name="serial">f3b0f200-2f57-4c25-bdf4-8d17165642fe</entry>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:      <entry name="uuid">f3b0f200-2f57-4c25-bdf4-8d17165642fe</entry>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    </system>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:  <os>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:  </os>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:  <features>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:  </features>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:  </clock>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:  <devices>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 10:12:45 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/f3b0f200-2f57-4c25-bdf4-8d17165642fe_disk">
Jan 20 10:12:45 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:12:45 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 10:12:45 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/f3b0f200-2f57-4c25-bdf4-8d17165642fe_disk.config">
Jan 20 10:12:45 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:12:45 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 10:12:45 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:ab:6b:07"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:      <target dev="tap25ad2c72-7d"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    </interface>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 10:12:45 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/f3b0f200-2f57-4c25-bdf4-8d17165642fe/console.log" append="off"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    </serial>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    <video>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    </video>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 10:12:45 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    </rng>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 10:12:45 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 10:12:45 np0005588920 nova_compute[226886]:  </devices>
Jan 20 10:12:45 np0005588920 nova_compute[226886]: </domain>
Jan 20 10:12:45 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:12:45 np0005588920 nova_compute[226886]: 2026-01-20 15:12:45.394 226890 DEBUG nova.virt.libvirt.vif [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:11:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-755379148',display_name='tempest-TestNetworkAdvancedServerOps-server-755379148',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-755379148',id=178,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBP0NB6AtDQr7I3Hp0XR7ulJzBFloX/ApnUaNnswWSYzrrT8mFzgvIiFRhCWLiZ+TDOJfVtcwGCfevRbqTmLZ5wdo4P6v9G2NYca0swLwaNQ/zK8Zmxz5PIdul2BRm2ICrw==',key_name='tempest-TestNetworkAdvancedServerOps-1554431653',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:12:02Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1ed5feeeafe7448a8efb47ab975b0ead',ramdisk_id='',reservation_id='r-j0wavald',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-175282664',owner_user_name='tempest-TestNetworkAdvancedServerOps-175282664-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:12:39Z,user_data=None,user_id='442a7a5cb8ea426a82be9762b262d171',uuid=f3b0f200-2f57-4c25-bdf4-8d17165642fe,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301", "address": "fa:16:3e:ab:6b:07", "network": {"id": "ef6ea4cb-557a-4dec-844c-6c933ddba0b1", "bridge": "br-int", "label": "tempest-network-smoke--1896631991", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.226", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1896631991", "vif_mac": "fa:16:3e:ab:6b:07"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25ad2c72-7d", "ovs_interfaceid": "25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:12:45 np0005588920 nova_compute[226886]: 2026-01-20 15:12:45.395 226890 DEBUG nova.network.os_vif_util [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Converting VIF {"id": "25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301", "address": "fa:16:3e:ab:6b:07", "network": {"id": "ef6ea4cb-557a-4dec-844c-6c933ddba0b1", "bridge": "br-int", "label": "tempest-network-smoke--1896631991", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.226", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1896631991", "vif_mac": "fa:16:3e:ab:6b:07"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25ad2c72-7d", "ovs_interfaceid": "25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:12:45 np0005588920 nova_compute[226886]: 2026-01-20 15:12:45.395 226890 DEBUG nova.network.os_vif_util [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ab:6b:07,bridge_name='br-int',has_traffic_filtering=True,id=25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301,network=Network(ef6ea4cb-557a-4dec-844c-6c933ddba0b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25ad2c72-7d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:12:45 np0005588920 nova_compute[226886]: 2026-01-20 15:12:45.396 226890 DEBUG os_vif [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:6b:07,bridge_name='br-int',has_traffic_filtering=True,id=25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301,network=Network(ef6ea4cb-557a-4dec-844c-6c933ddba0b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25ad2c72-7d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:12:45 np0005588920 nova_compute[226886]: 2026-01-20 15:12:45.396 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:45 np0005588920 nova_compute[226886]: 2026-01-20 15:12:45.397 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:12:45 np0005588920 nova_compute[226886]: 2026-01-20 15:12:45.398 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:12:45 np0005588920 nova_compute[226886]: 2026-01-20 15:12:45.400 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:45 np0005588920 nova_compute[226886]: 2026-01-20 15:12:45.401 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap25ad2c72-7d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:12:45 np0005588920 nova_compute[226886]: 2026-01-20 15:12:45.401 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap25ad2c72-7d, col_values=(('external_ids', {'iface-id': '25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ab:6b:07', 'vm-uuid': 'f3b0f200-2f57-4c25-bdf4-8d17165642fe'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:12:45 np0005588920 nova_compute[226886]: 2026-01-20 15:12:45.403 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:45 np0005588920 NetworkManager[49076]: <info>  [1768921965.4041] manager: (tap25ad2c72-7d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/391)
Jan 20 10:12:45 np0005588920 nova_compute[226886]: 2026-01-20 15:12:45.406 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:12:45 np0005588920 nova_compute[226886]: 2026-01-20 15:12:45.409 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:45 np0005588920 nova_compute[226886]: 2026-01-20 15:12:45.410 226890 INFO os_vif [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:6b:07,bridge_name='br-int',has_traffic_filtering=True,id=25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301,network=Network(ef6ea4cb-557a-4dec-844c-6c933ddba0b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25ad2c72-7d')#033[00m
Jan 20 10:12:45 np0005588920 nova_compute[226886]: 2026-01-20 15:12:45.475 226890 DEBUG nova.virt.libvirt.driver [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:12:45 np0005588920 nova_compute[226886]: 2026-01-20 15:12:45.475 226890 DEBUG nova.virt.libvirt.driver [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:12:45 np0005588920 nova_compute[226886]: 2026-01-20 15:12:45.475 226890 DEBUG nova.virt.libvirt.driver [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] No VIF found with MAC fa:16:3e:ab:6b:07, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:12:45 np0005588920 nova_compute[226886]: 2026-01-20 15:12:45.476 226890 INFO nova.virt.libvirt.driver [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Using config drive#033[00m
Jan 20 10:12:45 np0005588920 kernel: tap25ad2c72-7d: entered promiscuous mode
Jan 20 10:12:45 np0005588920 NetworkManager[49076]: <info>  [1768921965.5522] manager: (tap25ad2c72-7d): new Tun device (/org/freedesktop/NetworkManager/Devices/392)
Jan 20 10:12:45 np0005588920 nova_compute[226886]: 2026-01-20 15:12:45.553 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:45 np0005588920 ovn_controller[133971]: 2026-01-20T15:12:45Z|00832|binding|INFO|Claiming lport 25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301 for this chassis.
Jan 20 10:12:45 np0005588920 ovn_controller[133971]: 2026-01-20T15:12:45Z|00833|binding|INFO|25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301: Claiming fa:16:3e:ab:6b:07 10.100.0.12
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:12:45.563 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ab:6b:07 10.100.0.12'], port_security=['fa:16:3e:ab:6b:07 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'f3b0f200-2f57-4c25-bdf4-8d17165642fe', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ef6ea4cb-557a-4dec-844c-6c933ddba0b1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ed5feeeafe7448a8efb47ab975b0ead', 'neutron:revision_number': '6', 'neutron:security_group_ids': '3875f94a-ec8d-4588-90ca-c7ebe4dc6a1a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.226'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9a649097-9411-41ae-8903-e778937a7e59, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:12:45.564 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301 in datapath ef6ea4cb-557a-4dec-844c-6c933ddba0b1 bound to our chassis#033[00m
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:12:45.565 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ef6ea4cb-557a-4dec-844c-6c933ddba0b1#033[00m
Jan 20 10:12:45 np0005588920 ovn_controller[133971]: 2026-01-20T15:12:45Z|00834|binding|INFO|Setting lport 25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301 ovn-installed in OVS
Jan 20 10:12:45 np0005588920 ovn_controller[133971]: 2026-01-20T15:12:45Z|00835|binding|INFO|Setting lport 25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301 up in Southbound
Jan 20 10:12:45 np0005588920 nova_compute[226886]: 2026-01-20 15:12:45.569 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:45 np0005588920 nova_compute[226886]: 2026-01-20 15:12:45.572 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:12:45.577 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c48bfc69-f9dc-4c9b-ad01-bdf709eea249]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:12:45.577 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapef6ea4cb-51 in ovnmeta-ef6ea4cb-557a-4dec-844c-6c933ddba0b1 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:12:45.579 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapef6ea4cb-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:12:45.579 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[1dd074ed-8a58-4a38-82ef-533fa279683f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:12:45.580 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[9aa11ff0-d8b0-4094-a13a-893442b64d10]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:12:45 np0005588920 systemd-machined[196121]: New machine qemu-86-instance-000000b2.
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:12:45.592 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[f7a57995-f29a-469a-8fa6-6907cac2e4b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:12:45 np0005588920 systemd[1]: Started Virtual Machine qemu-86-instance-000000b2.
Jan 20 10:12:45 np0005588920 systemd-udevd[293483]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:12:45.616 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[fdeb63fd-8508-4df3-a7e2-7216f7507666]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:12:45 np0005588920 NetworkManager[49076]: <info>  [1768921965.6229] device (tap25ad2c72-7d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:12:45 np0005588920 NetworkManager[49076]: <info>  [1768921965.6235] device (tap25ad2c72-7d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:12:45.645 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[a4a9ba19-4396-4d67-b511-8f49056798f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:12:45 np0005588920 NetworkManager[49076]: <info>  [1768921965.6513] manager: (tapef6ea4cb-50): new Veth device (/org/freedesktop/NetworkManager/Devices/393)
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:12:45.651 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0e507499-0c99-45e5-bed8-a58623f98426]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:12:45 np0005588920 systemd-udevd[293487]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:12:45.682 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[13a20a94-8023-4ebe-98cf-0f6cf710aed0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:12:45.685 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[33bdbad0-6393-4dc3-bc08-d462c8fdd2f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:12:45 np0005588920 NetworkManager[49076]: <info>  [1768921965.7174] device (tapef6ea4cb-50): carrier: link connected
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:12:45.725 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[b4b22fdc-12c7-49a7-a6b3-625090578197]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:12:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:45.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:12:45.744 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[24d51377-7b83-479f-b52c-538eea2f3d13]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapef6ea4cb-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ac:ba:b8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 265], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 700740, 'reachable_time': 23578, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 293517, 'error': None, 'target': 'ovnmeta-ef6ea4cb-557a-4dec-844c-6c933ddba0b1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:12:45.761 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[214252b3-56e1-450e-8237-0bc1e5e8815f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feac:bab8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 700740, 'tstamp': 700740}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 293529, 'error': None, 'target': 'ovnmeta-ef6ea4cb-557a-4dec-844c-6c933ddba0b1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:12:45.780 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ea1873f2-b5db-48f9-a12b-69a6d3d4b3a6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapef6ea4cb-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ac:ba:b8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 265], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 700740, 'reachable_time': 23578, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 293533, 'error': None, 'target': 'ovnmeta-ef6ea4cb-557a-4dec-844c-6c933ddba0b1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:12:45.813 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2b63ab4c-3939-458e-9a9a-86177ba54c28]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:12:45.875 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[81440f8d-3ea6-4667-96d4-a0992cc6560b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:12:45.876 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapef6ea4cb-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:12:45.877 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:12:45.877 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapef6ea4cb-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:12:45 np0005588920 nova_compute[226886]: 2026-01-20 15:12:45.879 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:45 np0005588920 kernel: tapef6ea4cb-50: entered promiscuous mode
Jan 20 10:12:45 np0005588920 NetworkManager[49076]: <info>  [1768921965.8802] manager: (tapef6ea4cb-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/394)
Jan 20 10:12:45 np0005588920 nova_compute[226886]: 2026-01-20 15:12:45.881 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:12:45.883 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapef6ea4cb-50, col_values=(('external_ids', {'iface-id': 'e83e13a6-6446-4245-ab88-80085510e40d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:12:45 np0005588920 nova_compute[226886]: 2026-01-20 15:12:45.884 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:45 np0005588920 ovn_controller[133971]: 2026-01-20T15:12:45Z|00836|binding|INFO|Releasing lport e83e13a6-6446-4245-ab88-80085510e40d from this chassis (sb_readonly=0)
Jan 20 10:12:45 np0005588920 nova_compute[226886]: 2026-01-20 15:12:45.885 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:12:45.886 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ef6ea4cb-557a-4dec-844c-6c933ddba0b1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ef6ea4cb-557a-4dec-844c-6c933ddba0b1.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:12:45.887 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[86286795-3c60-41a2-bcd8-f6a0e87dd95b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:12:45.887 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-ef6ea4cb-557a-4dec-844c-6c933ddba0b1
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/ef6ea4cb-557a-4dec-844c-6c933ddba0b1.pid.haproxy
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID ef6ea4cb-557a-4dec-844c-6c933ddba0b1
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:12:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:12:45.888 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ef6ea4cb-557a-4dec-844c-6c933ddba0b1', 'env', 'PROCESS_TAG=haproxy-ef6ea4cb-557a-4dec-844c-6c933ddba0b1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ef6ea4cb-557a-4dec-844c-6c933ddba0b1.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:12:45 np0005588920 nova_compute[226886]: 2026-01-20 15:12:45.896 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921965.8961084, f3b0f200-2f57-4c25-bdf4-8d17165642fe => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:12:45 np0005588920 nova_compute[226886]: 2026-01-20 15:12:45.897 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:12:45 np0005588920 nova_compute[226886]: 2026-01-20 15:12:45.900 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:45 np0005588920 nova_compute[226886]: 2026-01-20 15:12:45.901 226890 DEBUG nova.compute.manager [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:12:45 np0005588920 nova_compute[226886]: 2026-01-20 15:12:45.905 226890 INFO nova.virt.libvirt.driver [-] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Instance running successfully.#033[00m
Jan 20 10:12:45 np0005588920 virtqemud[226436]: argument unsupported: QEMU guest agent is not configured
Jan 20 10:12:45 np0005588920 nova_compute[226886]: 2026-01-20 15:12:45.907 226890 DEBUG nova.virt.libvirt.guest [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 20 10:12:45 np0005588920 nova_compute[226886]: 2026-01-20 15:12:45.907 226890 DEBUG nova.virt.libvirt.driver [None req-9e5c7bcf-e8da-46ec-8456-e937cc9e653a 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Jan 20 10:12:45 np0005588920 nova_compute[226886]: 2026-01-20 15:12:45.918 226890 DEBUG nova.compute.manager [req-54b4d772-0214-48fd-9fcc-d453e445096b req-412c24a1-2118-40af-8ec5-8eb9080fcc42 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Received event network-vif-plugged-25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:12:45 np0005588920 nova_compute[226886]: 2026-01-20 15:12:45.919 226890 DEBUG oslo_concurrency.lockutils [req-54b4d772-0214-48fd-9fcc-d453e445096b req-412c24a1-2118-40af-8ec5-8eb9080fcc42 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f3b0f200-2f57-4c25-bdf4-8d17165642fe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:12:45 np0005588920 nova_compute[226886]: 2026-01-20 15:12:45.919 226890 DEBUG oslo_concurrency.lockutils [req-54b4d772-0214-48fd-9fcc-d453e445096b req-412c24a1-2118-40af-8ec5-8eb9080fcc42 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f3b0f200-2f57-4c25-bdf4-8d17165642fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:12:45 np0005588920 nova_compute[226886]: 2026-01-20 15:12:45.919 226890 DEBUG oslo_concurrency.lockutils [req-54b4d772-0214-48fd-9fcc-d453e445096b req-412c24a1-2118-40af-8ec5-8eb9080fcc42 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f3b0f200-2f57-4c25-bdf4-8d17165642fe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:12:45 np0005588920 nova_compute[226886]: 2026-01-20 15:12:45.920 226890 DEBUG nova.compute.manager [req-54b4d772-0214-48fd-9fcc-d453e445096b req-412c24a1-2118-40af-8ec5-8eb9080fcc42 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] No waiting events found dispatching network-vif-plugged-25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:12:45 np0005588920 nova_compute[226886]: 2026-01-20 15:12:45.920 226890 WARNING nova.compute.manager [req-54b4d772-0214-48fd-9fcc-d453e445096b req-412c24a1-2118-40af-8ec5-8eb9080fcc42 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Received unexpected event network-vif-plugged-25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301 for instance with vm_state active and task_state resize_finish.#033[00m
Jan 20 10:12:45 np0005588920 nova_compute[226886]: 2026-01-20 15:12:45.934 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:12:45 np0005588920 nova_compute[226886]: 2026-01-20 15:12:45.943 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:12:45 np0005588920 nova_compute[226886]: 2026-01-20 15:12:45.989 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Jan 20 10:12:45 np0005588920 nova_compute[226886]: 2026-01-20 15:12:45.989 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921965.8997102, f3b0f200-2f57-4c25-bdf4-8d17165642fe => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:12:45 np0005588920 nova_compute[226886]: 2026-01-20 15:12:45.990 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] VM Started (Lifecycle Event)#033[00m
Jan 20 10:12:46 np0005588920 nova_compute[226886]: 2026-01-20 15:12:46.028 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:12:46 np0005588920 nova_compute[226886]: 2026-01-20 15:12:46.031 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:12:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:46.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:46 np0005588920 podman[293586]: 2026-01-20 15:12:46.269755185 +0000 UTC m=+0.059160103 container create e9b97d39fc63b95685d13fe3ace08026de797a9d3a57dfa927cdf9dcd3ac2225 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ef6ea4cb-557a-4dec-844c-6c933ddba0b1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:12:46 np0005588920 systemd[1]: Started libpod-conmon-e9b97d39fc63b95685d13fe3ace08026de797a9d3a57dfa927cdf9dcd3ac2225.scope.
Jan 20 10:12:46 np0005588920 podman[293586]: 2026-01-20 15:12:46.246023674 +0000 UTC m=+0.035428612 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:12:46 np0005588920 systemd[1]: Started libcrun container.
Jan 20 10:12:46 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d5e600ab71407c1380c296de3ada7f618645b4feccda54edd4adae3b94cdeb3c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:12:46 np0005588920 podman[293586]: 2026-01-20 15:12:46.363494045 +0000 UTC m=+0.152898983 container init e9b97d39fc63b95685d13fe3ace08026de797a9d3a57dfa927cdf9dcd3ac2225 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ef6ea4cb-557a-4dec-844c-6c933ddba0b1, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 20 10:12:46 np0005588920 podman[293586]: 2026-01-20 15:12:46.374038452 +0000 UTC m=+0.163443370 container start e9b97d39fc63b95685d13fe3ace08026de797a9d3a57dfa927cdf9dcd3ac2225 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ef6ea4cb-557a-4dec-844c-6c933ddba0b1, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 20 10:12:46 np0005588920 neutron-haproxy-ovnmeta-ef6ea4cb-557a-4dec-844c-6c933ddba0b1[293601]: [NOTICE]   (293605) : New worker (293607) forked
Jan 20 10:12:46 np0005588920 neutron-haproxy-ovnmeta-ef6ea4cb-557a-4dec-844c-6c933ddba0b1[293601]: [NOTICE]   (293605) : Loading success.
Jan 20 10:12:46 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e395 e395: 3 total, 3 up, 3 in
Jan 20 10:12:47 np0005588920 nova_compute[226886]: 2026-01-20 15:12:47.340 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:47.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e395 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:12:48 np0005588920 nova_compute[226886]: 2026-01-20 15:12:48.118 226890 DEBUG nova.compute.manager [req-b7ad8032-2fa8-4c69-b375-829fc64f3b62 req-7f1c082c-01b5-44f0-8ad2-a6e4d2c57dfc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Received event network-vif-plugged-25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:12:48 np0005588920 nova_compute[226886]: 2026-01-20 15:12:48.119 226890 DEBUG oslo_concurrency.lockutils [req-b7ad8032-2fa8-4c69-b375-829fc64f3b62 req-7f1c082c-01b5-44f0-8ad2-a6e4d2c57dfc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f3b0f200-2f57-4c25-bdf4-8d17165642fe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:12:48 np0005588920 nova_compute[226886]: 2026-01-20 15:12:48.119 226890 DEBUG oslo_concurrency.lockutils [req-b7ad8032-2fa8-4c69-b375-829fc64f3b62 req-7f1c082c-01b5-44f0-8ad2-a6e4d2c57dfc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f3b0f200-2f57-4c25-bdf4-8d17165642fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:12:48 np0005588920 nova_compute[226886]: 2026-01-20 15:12:48.119 226890 DEBUG oslo_concurrency.lockutils [req-b7ad8032-2fa8-4c69-b375-829fc64f3b62 req-7f1c082c-01b5-44f0-8ad2-a6e4d2c57dfc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f3b0f200-2f57-4c25-bdf4-8d17165642fe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:12:48 np0005588920 nova_compute[226886]: 2026-01-20 15:12:48.119 226890 DEBUG nova.compute.manager [req-b7ad8032-2fa8-4c69-b375-829fc64f3b62 req-7f1c082c-01b5-44f0-8ad2-a6e4d2c57dfc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] No waiting events found dispatching network-vif-plugged-25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:12:48 np0005588920 nova_compute[226886]: 2026-01-20 15:12:48.120 226890 WARNING nova.compute.manager [req-b7ad8032-2fa8-4c69-b375-829fc64f3b62 req-7f1c082c-01b5-44f0-8ad2-a6e4d2c57dfc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Received unexpected event network-vif-plugged-25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301 for instance with vm_state resized and task_state None.#033[00m
Jan 20 10:12:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:48.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:49.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:12:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:50.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:12:50 np0005588920 nova_compute[226886]: 2026-01-20 15:12:50.404 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:50 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e396 e396: 3 total, 3 up, 3 in
Jan 20 10:12:50 np0005588920 podman[293616]: 2026-01-20 15:12:50.992418959 +0000 UTC m=+0.079098504 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:12:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:51.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:52.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:52 np0005588920 nova_compute[226886]: 2026-01-20 15:12:52.391 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e396 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:12:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:53.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:53 np0005588920 nova_compute[226886]: 2026-01-20 15:12:53.912 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768921958.9108305, 950f84b7-e9c2-415c-9946-315a443331c9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:12:53 np0005588920 nova_compute[226886]: 2026-01-20 15:12:53.913 226890 INFO nova.compute.manager [-] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:12:53 np0005588920 nova_compute[226886]: 2026-01-20 15:12:53.964 226890 DEBUG nova.compute.manager [None req-f86d20af-f85e-4b72-bf0d-996030212c24 - - - - - -] [instance: 950f84b7-e9c2-415c-9946-315a443331c9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:12:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:54.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:55 np0005588920 nova_compute[226886]: 2026-01-20 15:12:55.408 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:55.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:56.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:56 np0005588920 nova_compute[226886]: 2026-01-20 15:12:56.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:12:56 np0005588920 nova_compute[226886]: 2026-01-20 15:12:56.726 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:12:56 np0005588920 nova_compute[226886]: 2026-01-20 15:12:56.727 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:12:56 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e397 e397: 3 total, 3 up, 3 in
Jan 20 10:12:57 np0005588920 nova_compute[226886]: 2026-01-20 15:12:57.111 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "refresh_cache-f3b0f200-2f57-4c25-bdf4-8d17165642fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:12:57 np0005588920 nova_compute[226886]: 2026-01-20 15:12:57.112 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquired lock "refresh_cache-f3b0f200-2f57-4c25-bdf4-8d17165642fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:12:57 np0005588920 nova_compute[226886]: 2026-01-20 15:12:57.112 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 10:12:57 np0005588920 nova_compute[226886]: 2026-01-20 15:12:57.113 226890 DEBUG nova.objects.instance [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f3b0f200-2f57-4c25-bdf4-8d17165642fe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:12:57 np0005588920 nova_compute[226886]: 2026-01-20 15:12:57.409 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:57.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e397 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:12:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:58.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:58 np0005588920 ovn_controller[133971]: 2026-01-20T15:12:58Z|00112|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ab:6b:07 10.100.0.12
Jan 20 10:12:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:12:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:59.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:00.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:00 np0005588920 nova_compute[226886]: 2026-01-20 15:13:00.411 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:00 np0005588920 nova_compute[226886]: 2026-01-20 15:13:00.561 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Updating instance_info_cache with network_info: [{"id": "25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301", "address": "fa:16:3e:ab:6b:07", "network": {"id": "ef6ea4cb-557a-4dec-844c-6c933ddba0b1", "bridge": "br-int", "label": "tempest-network-smoke--1896631991", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.226", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25ad2c72-7d", "ovs_interfaceid": "25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:13:00 np0005588920 nova_compute[226886]: 2026-01-20 15:13:00.582 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Releasing lock "refresh_cache-f3b0f200-2f57-4c25-bdf4-8d17165642fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:13:00 np0005588920 nova_compute[226886]: 2026-01-20 15:13:00.582 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 10:13:00 np0005588920 nova_compute[226886]: 2026-01-20 15:13:00.583 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:13:00 np0005588920 nova_compute[226886]: 2026-01-20 15:13:00.604 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:13:00 np0005588920 nova_compute[226886]: 2026-01-20 15:13:00.604 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:13:00 np0005588920 nova_compute[226886]: 2026-01-20 15:13:00.604 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:13:00 np0005588920 nova_compute[226886]: 2026-01-20 15:13:00.605 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:13:00 np0005588920 nova_compute[226886]: 2026-01-20 15:13:00.605 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:13:01 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:13:01 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2748790219' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:13:01 np0005588920 nova_compute[226886]: 2026-01-20 15:13:01.135 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.530s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:13:01 np0005588920 nova_compute[226886]: 2026-01-20 15:13:01.211 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-000000b2 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:13:01 np0005588920 nova_compute[226886]: 2026-01-20 15:13:01.211 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-000000b2 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:13:01 np0005588920 nova_compute[226886]: 2026-01-20 15:13:01.365 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:13:01 np0005588920 nova_compute[226886]: 2026-01-20 15:13:01.367 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4031MB free_disk=20.942672729492188GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:13:01 np0005588920 nova_compute[226886]: 2026-01-20 15:13:01.367 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:13:01 np0005588920 nova_compute[226886]: 2026-01-20 15:13:01.367 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:13:01 np0005588920 nova_compute[226886]: 2026-01-20 15:13:01.429 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance f3b0f200-2f57-4c25-bdf4-8d17165642fe actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:13:01 np0005588920 nova_compute[226886]: 2026-01-20 15:13:01.429 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:13:01 np0005588920 nova_compute[226886]: 2026-01-20 15:13:01.429 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:13:01 np0005588920 nova_compute[226886]: 2026-01-20 15:13:01.469 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:13:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:01.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:01 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:13:01 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/992957130' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:13:01 np0005588920 nova_compute[226886]: 2026-01-20 15:13:01.898 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:13:01 np0005588920 nova_compute[226886]: 2026-01-20 15:13:01.905 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:13:01 np0005588920 nova_compute[226886]: 2026-01-20 15:13:01.932 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:13:01 np0005588920 nova_compute[226886]: 2026-01-20 15:13:01.958 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:13:01 np0005588920 nova_compute[226886]: 2026-01-20 15:13:01.958 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.591s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:13:02 np0005588920 nova_compute[226886]: 2026-01-20 15:13:02.100 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:13:02 np0005588920 nova_compute[226886]: 2026-01-20 15:13:02.100 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:13:02 np0005588920 nova_compute[226886]: 2026-01-20 15:13:02.100 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:13:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:02.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:02 np0005588920 nova_compute[226886]: 2026-01-20 15:13:02.411 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e397 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:13:02 np0005588920 podman[293688]: 2026-01-20 15:13:02.962585824 +0000 UTC m=+0.049700319 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 20 10:13:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:03.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:03 np0005588920 nova_compute[226886]: 2026-01-20 15:13:03.826 226890 INFO nova.compute.manager [None req-66b90b3c-41ef-498b-b835-d1639096689f 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Get console output#033[00m
Jan 20 10:13:03 np0005588920 nova_compute[226886]: 2026-01-20 15:13:03.834 260344 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 20 10:13:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:04.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:04 np0005588920 nova_compute[226886]: 2026-01-20 15:13:04.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:13:04 np0005588920 nova_compute[226886]: 2026-01-20 15:13:04.726 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:13:04 np0005588920 nova_compute[226886]: 2026-01-20 15:13:04.853 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:05 np0005588920 nova_compute[226886]: 2026-01-20 15:13:05.414 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:05 np0005588920 nova_compute[226886]: 2026-01-20 15:13:05.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:13:05 np0005588920 nova_compute[226886]: 2026-01-20 15:13:05.744 226890 DEBUG nova.compute.manager [req-7253649a-97d7-4baf-b03d-65ef45ba3920 req-0ae8d675-64bf-47e4-bb47-03125ed31649 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Received event network-changed-25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:13:05 np0005588920 nova_compute[226886]: 2026-01-20 15:13:05.744 226890 DEBUG nova.compute.manager [req-7253649a-97d7-4baf-b03d-65ef45ba3920 req-0ae8d675-64bf-47e4-bb47-03125ed31649 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Refreshing instance network info cache due to event network-changed-25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:13:05 np0005588920 nova_compute[226886]: 2026-01-20 15:13:05.744 226890 DEBUG oslo_concurrency.lockutils [req-7253649a-97d7-4baf-b03d-65ef45ba3920 req-0ae8d675-64bf-47e4-bb47-03125ed31649 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-f3b0f200-2f57-4c25-bdf4-8d17165642fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:13:05 np0005588920 nova_compute[226886]: 2026-01-20 15:13:05.744 226890 DEBUG oslo_concurrency.lockutils [req-7253649a-97d7-4baf-b03d-65ef45ba3920 req-0ae8d675-64bf-47e4-bb47-03125ed31649 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-f3b0f200-2f57-4c25-bdf4-8d17165642fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:13:05 np0005588920 nova_compute[226886]: 2026-01-20 15:13:05.745 226890 DEBUG nova.network.neutron [req-7253649a-97d7-4baf-b03d-65ef45ba3920 req-0ae8d675-64bf-47e4-bb47-03125ed31649 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Refreshing network info cache for port 25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:13:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:13:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:05.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:13:05 np0005588920 nova_compute[226886]: 2026-01-20 15:13:05.800 226890 DEBUG oslo_concurrency.lockutils [None req-a7b5edf1-bfe2-4b99-b4c4-35425485a919 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "f3b0f200-2f57-4c25-bdf4-8d17165642fe" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:13:05 np0005588920 nova_compute[226886]: 2026-01-20 15:13:05.800 226890 DEBUG oslo_concurrency.lockutils [None req-a7b5edf1-bfe2-4b99-b4c4-35425485a919 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "f3b0f200-2f57-4c25-bdf4-8d17165642fe" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:13:05 np0005588920 nova_compute[226886]: 2026-01-20 15:13:05.801 226890 DEBUG oslo_concurrency.lockutils [None req-a7b5edf1-bfe2-4b99-b4c4-35425485a919 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "f3b0f200-2f57-4c25-bdf4-8d17165642fe-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:13:05 np0005588920 nova_compute[226886]: 2026-01-20 15:13:05.801 226890 DEBUG oslo_concurrency.lockutils [None req-a7b5edf1-bfe2-4b99-b4c4-35425485a919 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "f3b0f200-2f57-4c25-bdf4-8d17165642fe-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:13:05 np0005588920 nova_compute[226886]: 2026-01-20 15:13:05.801 226890 DEBUG oslo_concurrency.lockutils [None req-a7b5edf1-bfe2-4b99-b4c4-35425485a919 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "f3b0f200-2f57-4c25-bdf4-8d17165642fe-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:13:05 np0005588920 nova_compute[226886]: 2026-01-20 15:13:05.802 226890 INFO nova.compute.manager [None req-a7b5edf1-bfe2-4b99-b4c4-35425485a919 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Terminating instance#033[00m
Jan 20 10:13:05 np0005588920 nova_compute[226886]: 2026-01-20 15:13:05.803 226890 DEBUG nova.compute.manager [None req-a7b5edf1-bfe2-4b99-b4c4-35425485a919 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:13:05 np0005588920 kernel: tap25ad2c72-7d (unregistering): left promiscuous mode
Jan 20 10:13:05 np0005588920 NetworkManager[49076]: <info>  [1768921985.8447] device (tap25ad2c72-7d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:13:05 np0005588920 ovn_controller[133971]: 2026-01-20T15:13:05Z|00837|binding|INFO|Releasing lport 25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301 from this chassis (sb_readonly=0)
Jan 20 10:13:05 np0005588920 ovn_controller[133971]: 2026-01-20T15:13:05Z|00838|binding|INFO|Setting lport 25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301 down in Southbound
Jan 20 10:13:05 np0005588920 ovn_controller[133971]: 2026-01-20T15:13:05Z|00839|binding|INFO|Removing iface tap25ad2c72-7d ovn-installed in OVS
Jan 20 10:13:05 np0005588920 nova_compute[226886]: 2026-01-20 15:13:05.904 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:05 np0005588920 nova_compute[226886]: 2026-01-20 15:13:05.906 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:05 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:05.910 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ab:6b:07 10.100.0.12'], port_security=['fa:16:3e:ab:6b:07 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'f3b0f200-2f57-4c25-bdf4-8d17165642fe', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ef6ea4cb-557a-4dec-844c-6c933ddba0b1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ed5feeeafe7448a8efb47ab975b0ead', 'neutron:revision_number': '8', 'neutron:security_group_ids': '3875f94a-ec8d-4588-90ca-c7ebe4dc6a1a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9a649097-9411-41ae-8903-e778937a7e59, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:13:05 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:05.912 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301 in datapath ef6ea4cb-557a-4dec-844c-6c933ddba0b1 unbound from our chassis#033[00m
Jan 20 10:13:05 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:05.913 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ef6ea4cb-557a-4dec-844c-6c933ddba0b1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:13:05 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:05.914 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ba704dd3-164f-4dd3-ac7f-96c53a2bda78]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:05 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:05.914 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ef6ea4cb-557a-4dec-844c-6c933ddba0b1 namespace which is not needed anymore#033[00m
Jan 20 10:13:05 np0005588920 nova_compute[226886]: 2026-01-20 15:13:05.918 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:05 np0005588920 systemd[1]: machine-qemu\x2d86\x2dinstance\x2d000000b2.scope: Deactivated successfully.
Jan 20 10:13:05 np0005588920 systemd[1]: machine-qemu\x2d86\x2dinstance\x2d000000b2.scope: Consumed 12.506s CPU time.
Jan 20 10:13:05 np0005588920 systemd-machined[196121]: Machine qemu-86-instance-000000b2 terminated.
Jan 20 10:13:06 np0005588920 neutron-haproxy-ovnmeta-ef6ea4cb-557a-4dec-844c-6c933ddba0b1[293601]: [NOTICE]   (293605) : haproxy version is 2.8.14-c23fe91
Jan 20 10:13:06 np0005588920 neutron-haproxy-ovnmeta-ef6ea4cb-557a-4dec-844c-6c933ddba0b1[293601]: [NOTICE]   (293605) : path to executable is /usr/sbin/haproxy
Jan 20 10:13:06 np0005588920 neutron-haproxy-ovnmeta-ef6ea4cb-557a-4dec-844c-6c933ddba0b1[293601]: [WARNING]  (293605) : Exiting Master process...
Jan 20 10:13:06 np0005588920 neutron-haproxy-ovnmeta-ef6ea4cb-557a-4dec-844c-6c933ddba0b1[293601]: [ALERT]    (293605) : Current worker (293607) exited with code 143 (Terminated)
Jan 20 10:13:06 np0005588920 neutron-haproxy-ovnmeta-ef6ea4cb-557a-4dec-844c-6c933ddba0b1[293601]: [WARNING]  (293605) : All workers exited. Exiting... (0)
Jan 20 10:13:06 np0005588920 systemd[1]: libpod-e9b97d39fc63b95685d13fe3ace08026de797a9d3a57dfa927cdf9dcd3ac2225.scope: Deactivated successfully.
Jan 20 10:13:06 np0005588920 podman[293729]: 2026-01-20 15:13:06.039705019 +0000 UTC m=+0.046398992 container died e9b97d39fc63b95685d13fe3ace08026de797a9d3a57dfa927cdf9dcd3ac2225 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ef6ea4cb-557a-4dec-844c-6c933ddba0b1, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 20 10:13:06 np0005588920 nova_compute[226886]: 2026-01-20 15:13:06.040 226890 INFO nova.virt.libvirt.driver [-] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Instance destroyed successfully.#033[00m
Jan 20 10:13:06 np0005588920 nova_compute[226886]: 2026-01-20 15:13:06.041 226890 DEBUG nova.objects.instance [None req-a7b5edf1-bfe2-4b99-b4c4-35425485a919 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lazy-loading 'resources' on Instance uuid f3b0f200-2f57-4c25-bdf4-8d17165642fe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:13:06 np0005588920 nova_compute[226886]: 2026-01-20 15:13:06.055 226890 DEBUG nova.virt.libvirt.vif [None req-a7b5edf1-bfe2-4b99-b4c4-35425485a919 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:11:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-755379148',display_name='tempest-TestNetworkAdvancedServerOps-server-755379148',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-755379148',id=178,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBP0NB6AtDQr7I3Hp0XR7ulJzBFloX/ApnUaNnswWSYzrrT8mFzgvIiFRhCWLiZ+TDOJfVtcwGCfevRbqTmLZ5wdo4P6v9G2NYca0swLwaNQ/zK8Zmxz5PIdul2BRm2ICrw==',key_name='tempest-TestNetworkAdvancedServerOps-1554431653',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:12:45Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1ed5feeeafe7448a8efb47ab975b0ead',ramdisk_id='',reservation_id='r-j0wavald',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-175282664',owner_user_name='tempest-TestNetworkAdvancedServerOps-175282664-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:12:51Z,user_data=None,user_id='442a7a5cb8ea426a82be9762b262d171',uuid=f3b0f200-2f57-4c25-bdf4-8d17165642fe,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301", "address": "fa:16:3e:ab:6b:07", "network": {"id": "ef6ea4cb-557a-4dec-844c-6c933ddba0b1", "bridge": "br-int", "label": "tempest-network-smoke--1896631991", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.226", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25ad2c72-7d", "ovs_interfaceid": "25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:13:06 np0005588920 nova_compute[226886]: 2026-01-20 15:13:06.056 226890 DEBUG nova.network.os_vif_util [None req-a7b5edf1-bfe2-4b99-b4c4-35425485a919 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converting VIF {"id": "25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301", "address": "fa:16:3e:ab:6b:07", "network": {"id": "ef6ea4cb-557a-4dec-844c-6c933ddba0b1", "bridge": "br-int", "label": "tempest-network-smoke--1896631991", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.226", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25ad2c72-7d", "ovs_interfaceid": "25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:13:06 np0005588920 nova_compute[226886]: 2026-01-20 15:13:06.057 226890 DEBUG nova.network.os_vif_util [None req-a7b5edf1-bfe2-4b99-b4c4-35425485a919 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ab:6b:07,bridge_name='br-int',has_traffic_filtering=True,id=25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301,network=Network(ef6ea4cb-557a-4dec-844c-6c933ddba0b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25ad2c72-7d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:13:06 np0005588920 nova_compute[226886]: 2026-01-20 15:13:06.057 226890 DEBUG os_vif [None req-a7b5edf1-bfe2-4b99-b4c4-35425485a919 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ab:6b:07,bridge_name='br-int',has_traffic_filtering=True,id=25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301,network=Network(ef6ea4cb-557a-4dec-844c-6c933ddba0b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25ad2c72-7d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:13:06 np0005588920 nova_compute[226886]: 2026-01-20 15:13:06.059 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:06 np0005588920 nova_compute[226886]: 2026-01-20 15:13:06.060 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap25ad2c72-7d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:13:06 np0005588920 nova_compute[226886]: 2026-01-20 15:13:06.061 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:06 np0005588920 nova_compute[226886]: 2026-01-20 15:13:06.064 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:06 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e9b97d39fc63b95685d13fe3ace08026de797a9d3a57dfa927cdf9dcd3ac2225-userdata-shm.mount: Deactivated successfully.
Jan 20 10:13:06 np0005588920 systemd[1]: var-lib-containers-storage-overlay-d5e600ab71407c1380c296de3ada7f618645b4feccda54edd4adae3b94cdeb3c-merged.mount: Deactivated successfully.
Jan 20 10:13:06 np0005588920 nova_compute[226886]: 2026-01-20 15:13:06.070 226890 INFO os_vif [None req-a7b5edf1-bfe2-4b99-b4c4-35425485a919 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ab:6b:07,bridge_name='br-int',has_traffic_filtering=True,id=25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301,network=Network(ef6ea4cb-557a-4dec-844c-6c933ddba0b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25ad2c72-7d')#033[00m
Jan 20 10:13:06 np0005588920 podman[293729]: 2026-01-20 15:13:06.074136852 +0000 UTC m=+0.080830825 container cleanup e9b97d39fc63b95685d13fe3ace08026de797a9d3a57dfa927cdf9dcd3ac2225 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ef6ea4cb-557a-4dec-844c-6c933ddba0b1, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 20 10:13:06 np0005588920 systemd[1]: libpod-conmon-e9b97d39fc63b95685d13fe3ace08026de797a9d3a57dfa927cdf9dcd3ac2225.scope: Deactivated successfully.
Jan 20 10:13:06 np0005588920 podman[293785]: 2026-01-20 15:13:06.138124305 +0000 UTC m=+0.042567330 container remove e9b97d39fc63b95685d13fe3ace08026de797a9d3a57dfa927cdf9dcd3ac2225 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ef6ea4cb-557a-4dec-844c-6c933ddba0b1, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 20 10:13:06 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:06.146 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c5bd06bc-a3bc-4119-bc10-89098d6df990]: (4, ('Tue Jan 20 03:13:05 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ef6ea4cb-557a-4dec-844c-6c933ddba0b1 (e9b97d39fc63b95685d13fe3ace08026de797a9d3a57dfa927cdf9dcd3ac2225)\ne9b97d39fc63b95685d13fe3ace08026de797a9d3a57dfa927cdf9dcd3ac2225\nTue Jan 20 03:13:06 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ef6ea4cb-557a-4dec-844c-6c933ddba0b1 (e9b97d39fc63b95685d13fe3ace08026de797a9d3a57dfa927cdf9dcd3ac2225)\ne9b97d39fc63b95685d13fe3ace08026de797a9d3a57dfa927cdf9dcd3ac2225\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:06 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:06.148 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[97a03e09-2836-44ff-9e9e-89c3b6628fd3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:06 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:06.149 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapef6ea4cb-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:13:06 np0005588920 nova_compute[226886]: 2026-01-20 15:13:06.150 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:06 np0005588920 kernel: tapef6ea4cb-50: left promiscuous mode
Jan 20 10:13:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:06.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:06 np0005588920 nova_compute[226886]: 2026-01-20 15:13:06.163 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:06 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:06.166 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[40c37b2f-fbe2-4e10-807f-417a7d3d4b4d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:06 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:06.191 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[3a3170aa-4a74-459b-84bf-0c9ddd1f7ab1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:06 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:06.193 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[91c8ff74-e891-47ac-93b3-99cd92e0e932]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:06 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:06.211 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[8f9aac19-a11d-42c0-aa17-332e4d88cf49]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 700732, 'reachable_time': 29764, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 293804, 'error': None, 'target': 'ovnmeta-ef6ea4cb-557a-4dec-844c-6c933ddba0b1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:06 np0005588920 nova_compute[226886]: 2026-01-20 15:13:06.214 226890 DEBUG nova.compute.manager [req-a88ae81f-e126-4d14-a3ae-467dbafcf58a req-d1a93287-76cc-4752-9fc4-b99240d83f5c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Received event network-vif-unplugged-25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:13:06 np0005588920 nova_compute[226886]: 2026-01-20 15:13:06.215 226890 DEBUG oslo_concurrency.lockutils [req-a88ae81f-e126-4d14-a3ae-467dbafcf58a req-d1a93287-76cc-4752-9fc4-b99240d83f5c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f3b0f200-2f57-4c25-bdf4-8d17165642fe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:13:06 np0005588920 nova_compute[226886]: 2026-01-20 15:13:06.216 226890 DEBUG oslo_concurrency.lockutils [req-a88ae81f-e126-4d14-a3ae-467dbafcf58a req-d1a93287-76cc-4752-9fc4-b99240d83f5c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f3b0f200-2f57-4c25-bdf4-8d17165642fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:13:06 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:06.215 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ef6ea4cb-557a-4dec-844c-6c933ddba0b1 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:13:06 np0005588920 nova_compute[226886]: 2026-01-20 15:13:06.216 226890 DEBUG oslo_concurrency.lockutils [req-a88ae81f-e126-4d14-a3ae-467dbafcf58a req-d1a93287-76cc-4752-9fc4-b99240d83f5c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f3b0f200-2f57-4c25-bdf4-8d17165642fe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:13:06 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:06.215 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[97652472-5ea0-49f1-933f-61d1ab726331]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:06 np0005588920 nova_compute[226886]: 2026-01-20 15:13:06.216 226890 DEBUG nova.compute.manager [req-a88ae81f-e126-4d14-a3ae-467dbafcf58a req-d1a93287-76cc-4752-9fc4-b99240d83f5c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] No waiting events found dispatching network-vif-unplugged-25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:13:06 np0005588920 systemd[1]: run-netns-ovnmeta\x2def6ea4cb\x2d557a\x2d4dec\x2d844c\x2d6c933ddba0b1.mount: Deactivated successfully.
Jan 20 10:13:06 np0005588920 nova_compute[226886]: 2026-01-20 15:13:06.216 226890 DEBUG nova.compute.manager [req-a88ae81f-e126-4d14-a3ae-467dbafcf58a req-d1a93287-76cc-4752-9fc4-b99240d83f5c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Received event network-vif-unplugged-25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:13:06 np0005588920 nova_compute[226886]: 2026-01-20 15:13:06.456 226890 INFO nova.virt.libvirt.driver [None req-a7b5edf1-bfe2-4b99-b4c4-35425485a919 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Deleting instance files /var/lib/nova/instances/f3b0f200-2f57-4c25-bdf4-8d17165642fe_del#033[00m
Jan 20 10:13:06 np0005588920 nova_compute[226886]: 2026-01-20 15:13:06.456 226890 INFO nova.virt.libvirt.driver [None req-a7b5edf1-bfe2-4b99-b4c4-35425485a919 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Deletion of /var/lib/nova/instances/f3b0f200-2f57-4c25-bdf4-8d17165642fe_del complete#033[00m
Jan 20 10:13:06 np0005588920 nova_compute[226886]: 2026-01-20 15:13:06.512 226890 INFO nova.compute.manager [None req-a7b5edf1-bfe2-4b99-b4c4-35425485a919 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Took 0.71 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:13:06 np0005588920 nova_compute[226886]: 2026-01-20 15:13:06.513 226890 DEBUG oslo.service.loopingcall [None req-a7b5edf1-bfe2-4b99-b4c4-35425485a919 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:13:06 np0005588920 nova_compute[226886]: 2026-01-20 15:13:06.514 226890 DEBUG nova.compute.manager [-] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:13:06 np0005588920 nova_compute[226886]: 2026-01-20 15:13:06.514 226890 DEBUG nova.network.neutron [-] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:13:06 np0005588920 nova_compute[226886]: 2026-01-20 15:13:06.721 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:13:06 np0005588920 nova_compute[226886]: 2026-01-20 15:13:06.751 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:13:07 np0005588920 nova_compute[226886]: 2026-01-20 15:13:07.315 226890 DEBUG nova.network.neutron [-] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:13:07 np0005588920 nova_compute[226886]: 2026-01-20 15:13:07.331 226890 INFO nova.compute.manager [-] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Took 0.82 seconds to deallocate network for instance.#033[00m
Jan 20 10:13:07 np0005588920 nova_compute[226886]: 2026-01-20 15:13:07.351 226890 DEBUG nova.network.neutron [req-7253649a-97d7-4baf-b03d-65ef45ba3920 req-0ae8d675-64bf-47e4-bb47-03125ed31649 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Updated VIF entry in instance network info cache for port 25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:13:07 np0005588920 nova_compute[226886]: 2026-01-20 15:13:07.352 226890 DEBUG nova.network.neutron [req-7253649a-97d7-4baf-b03d-65ef45ba3920 req-0ae8d675-64bf-47e4-bb47-03125ed31649 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Updating instance_info_cache with network_info: [{"id": "25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301", "address": "fa:16:3e:ab:6b:07", "network": {"id": "ef6ea4cb-557a-4dec-844c-6c933ddba0b1", "bridge": "br-int", "label": "tempest-network-smoke--1896631991", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25ad2c72-7d", "ovs_interfaceid": "25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:13:07 np0005588920 nova_compute[226886]: 2026-01-20 15:13:07.397 226890 DEBUG oslo_concurrency.lockutils [None req-a7b5edf1-bfe2-4b99-b4c4-35425485a919 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:13:07 np0005588920 nova_compute[226886]: 2026-01-20 15:13:07.398 226890 DEBUG oslo_concurrency.lockutils [None req-a7b5edf1-bfe2-4b99-b4c4-35425485a919 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:13:07 np0005588920 nova_compute[226886]: 2026-01-20 15:13:07.411 226890 DEBUG oslo_concurrency.lockutils [req-7253649a-97d7-4baf-b03d-65ef45ba3920 req-0ae8d675-64bf-47e4-bb47-03125ed31649 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-f3b0f200-2f57-4c25-bdf4-8d17165642fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:13:07 np0005588920 nova_compute[226886]: 2026-01-20 15:13:07.414 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:07 np0005588920 nova_compute[226886]: 2026-01-20 15:13:07.469 226890 DEBUG oslo_concurrency.processutils [None req-a7b5edf1-bfe2-4b99-b4c4-35425485a919 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:13:07 np0005588920 nova_compute[226886]: 2026-01-20 15:13:07.751 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:13:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:13:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:07.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:13:07 np0005588920 nova_compute[226886]: 2026-01-20 15:13:07.834 226890 DEBUG nova.compute.manager [req-e752f275-931e-4450-93aa-c61b7a104608 req-002fc58b-d377-45d2-822f-dbb3fbb3975e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Received event network-vif-deleted-25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:13:07 np0005588920 nova_compute[226886]: 2026-01-20 15:13:07.835 226890 INFO nova.compute.manager [req-e752f275-931e-4450-93aa-c61b7a104608 req-002fc58b-d377-45d2-822f-dbb3fbb3975e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Neutron deleted interface 25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301; detaching it from the instance and deleting it from the info cache#033[00m
Jan 20 10:13:07 np0005588920 nova_compute[226886]: 2026-01-20 15:13:07.836 226890 DEBUG nova.network.neutron [req-e752f275-931e-4450-93aa-c61b7a104608 req-002fc58b-d377-45d2-822f-dbb3fbb3975e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:13:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e397 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:13:07 np0005588920 nova_compute[226886]: 2026-01-20 15:13:07.859 226890 DEBUG nova.compute.manager [req-e752f275-931e-4450-93aa-c61b7a104608 req-002fc58b-d377-45d2-822f-dbb3fbb3975e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Detach interface failed, port_id=25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301, reason: Instance f3b0f200-2f57-4c25-bdf4-8d17165642fe could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 20 10:13:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:13:07 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2234610671' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:13:07 np0005588920 nova_compute[226886]: 2026-01-20 15:13:07.944 226890 DEBUG oslo_concurrency.processutils [None req-a7b5edf1-bfe2-4b99-b4c4-35425485a919 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:13:07 np0005588920 nova_compute[226886]: 2026-01-20 15:13:07.949 226890 DEBUG nova.compute.provider_tree [None req-a7b5edf1-bfe2-4b99-b4c4-35425485a919 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:13:07 np0005588920 nova_compute[226886]: 2026-01-20 15:13:07.977 226890 DEBUG nova.scheduler.client.report [None req-a7b5edf1-bfe2-4b99-b4c4-35425485a919 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:13:08 np0005588920 nova_compute[226886]: 2026-01-20 15:13:08.007 226890 DEBUG oslo_concurrency.lockutils [None req-a7b5edf1-bfe2-4b99-b4c4-35425485a919 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.609s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:13:08 np0005588920 nova_compute[226886]: 2026-01-20 15:13:08.031 226890 INFO nova.scheduler.client.report [None req-a7b5edf1-bfe2-4b99-b4c4-35425485a919 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Deleted allocations for instance f3b0f200-2f57-4c25-bdf4-8d17165642fe#033[00m
Jan 20 10:13:08 np0005588920 nova_compute[226886]: 2026-01-20 15:13:08.109 226890 DEBUG oslo_concurrency.lockutils [None req-a7b5edf1-bfe2-4b99-b4c4-35425485a919 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "f3b0f200-2f57-4c25-bdf4-8d17165642fe" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.309s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:13:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:08.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:08 np0005588920 nova_compute[226886]: 2026-01-20 15:13:08.308 226890 DEBUG nova.compute.manager [req-4dbb4b8a-6b46-4cc7-a180-fc31eadace6c req-faa2726f-52e6-4366-aa38-f930a49d7272 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Received event network-vif-plugged-25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:13:08 np0005588920 nova_compute[226886]: 2026-01-20 15:13:08.308 226890 DEBUG oslo_concurrency.lockutils [req-4dbb4b8a-6b46-4cc7-a180-fc31eadace6c req-faa2726f-52e6-4366-aa38-f930a49d7272 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f3b0f200-2f57-4c25-bdf4-8d17165642fe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:13:08 np0005588920 nova_compute[226886]: 2026-01-20 15:13:08.308 226890 DEBUG oslo_concurrency.lockutils [req-4dbb4b8a-6b46-4cc7-a180-fc31eadace6c req-faa2726f-52e6-4366-aa38-f930a49d7272 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f3b0f200-2f57-4c25-bdf4-8d17165642fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:13:08 np0005588920 nova_compute[226886]: 2026-01-20 15:13:08.309 226890 DEBUG oslo_concurrency.lockutils [req-4dbb4b8a-6b46-4cc7-a180-fc31eadace6c req-faa2726f-52e6-4366-aa38-f930a49d7272 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f3b0f200-2f57-4c25-bdf4-8d17165642fe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:13:08 np0005588920 nova_compute[226886]: 2026-01-20 15:13:08.309 226890 DEBUG nova.compute.manager [req-4dbb4b8a-6b46-4cc7-a180-fc31eadace6c req-faa2726f-52e6-4366-aa38-f930a49d7272 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] No waiting events found dispatching network-vif-plugged-25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:13:08 np0005588920 nova_compute[226886]: 2026-01-20 15:13:08.309 226890 WARNING nova.compute.manager [req-4dbb4b8a-6b46-4cc7-a180-fc31eadace6c req-faa2726f-52e6-4366-aa38-f930a49d7272 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Received unexpected event network-vif-plugged-25ad2c72-7d4d-4eb9-bf00-a5c42aa9d301 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 10:13:08 np0005588920 nova_compute[226886]: 2026-01-20 15:13:08.451 226890 DEBUG oslo_concurrency.lockutils [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Acquiring lock "389ff8e0-c114-4960-9561-f6ffef743efa" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:13:08 np0005588920 nova_compute[226886]: 2026-01-20 15:13:08.452 226890 DEBUG oslo_concurrency.lockutils [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "389ff8e0-c114-4960-9561-f6ffef743efa" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:13:08 np0005588920 nova_compute[226886]: 2026-01-20 15:13:08.475 226890 DEBUG nova.compute.manager [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 10:13:08 np0005588920 nova_compute[226886]: 2026-01-20 15:13:08.538 226890 DEBUG oslo_concurrency.lockutils [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:13:08 np0005588920 nova_compute[226886]: 2026-01-20 15:13:08.539 226890 DEBUG oslo_concurrency.lockutils [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:13:08 np0005588920 nova_compute[226886]: 2026-01-20 15:13:08.545 226890 DEBUG nova.virt.hardware [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 10:13:08 np0005588920 nova_compute[226886]: 2026-01-20 15:13:08.545 226890 INFO nova.compute.claims [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 10:13:08 np0005588920 nova_compute[226886]: 2026-01-20 15:13:08.682 226890 DEBUG oslo_concurrency.processutils [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:13:08 np0005588920 nova_compute[226886]: 2026-01-20 15:13:08.730 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:13:09 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/221289373' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:13:09 np0005588920 nova_compute[226886]: 2026-01-20 15:13:09.100 226890 DEBUG oslo_concurrency.processutils [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.418s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:13:09 np0005588920 nova_compute[226886]: 2026-01-20 15:13:09.106 226890 DEBUG nova.compute.provider_tree [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:13:09 np0005588920 nova_compute[226886]: 2026-01-20 15:13:09.127 226890 DEBUG nova.scheduler.client.report [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:13:09 np0005588920 nova_compute[226886]: 2026-01-20 15:13:09.151 226890 DEBUG oslo_concurrency.lockutils [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.612s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:13:09 np0005588920 nova_compute[226886]: 2026-01-20 15:13:09.151 226890 DEBUG nova.compute.manager [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 10:13:09 np0005588920 nova_compute[226886]: 2026-01-20 15:13:09.205 226890 DEBUG nova.compute.manager [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 10:13:09 np0005588920 nova_compute[226886]: 2026-01-20 15:13:09.206 226890 DEBUG nova.network.neutron [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 10:13:09 np0005588920 nova_compute[226886]: 2026-01-20 15:13:09.229 226890 INFO nova.virt.libvirt.driver [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 10:13:09 np0005588920 nova_compute[226886]: 2026-01-20 15:13:09.267 226890 DEBUG nova.compute.manager [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 10:13:09 np0005588920 nova_compute[226886]: 2026-01-20 15:13:09.326 226890 INFO nova.virt.block_device [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Booting with volume 670aaaab-6a81-487d-a346-d03d445d8abe at /dev/vda#033[00m
Jan 20 10:13:09 np0005588920 nova_compute[226886]: 2026-01-20 15:13:09.456 226890 DEBUG nova.policy [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bf422e55e158420cbdae75f07a3bb97a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a49638950e1543fa8e0d251af5479623', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 10:13:09 np0005588920 nova_compute[226886]: 2026-01-20 15:13:09.483 226890 DEBUG os_brick.utils [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 20 10:13:09 np0005588920 nova_compute[226886]: 2026-01-20 15:13:09.484 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:13:09 np0005588920 nova_compute[226886]: 2026-01-20 15:13:09.504 231437 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.020s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:13:09 np0005588920 nova_compute[226886]: 2026-01-20 15:13:09.504 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[4e7b2463-32a4-4da0-81ac-caecf7658f52]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:09 np0005588920 nova_compute[226886]: 2026-01-20 15:13:09.505 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:13:09 np0005588920 nova_compute[226886]: 2026-01-20 15:13:09.519 231437 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.014s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:13:09 np0005588920 nova_compute[226886]: 2026-01-20 15:13:09.520 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[f3f0049c-f662-4731-a23a-b946c45dd9d3]: (4, ('InitiatorName=iqn.1994-05.com.redhat:f7e7b2e5d28e', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:09 np0005588920 nova_compute[226886]: 2026-01-20 15:13:09.521 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:13:09 np0005588920 nova_compute[226886]: 2026-01-20 15:13:09.535 231437 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.014s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:13:09 np0005588920 nova_compute[226886]: 2026-01-20 15:13:09.535 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[8d296620-07d8-4eaa-bfea-60ff2d52ee59]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:09 np0005588920 nova_compute[226886]: 2026-01-20 15:13:09.536 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[8b12d512-5d1f-4391-90b8-7441fff7332f]: (4, '1190ec40-2b89-4358-8dec-733c5829fbed') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:09 np0005588920 nova_compute[226886]: 2026-01-20 15:13:09.537 226890 DEBUG oslo_concurrency.processutils [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:13:09 np0005588920 nova_compute[226886]: 2026-01-20 15:13:09.583 226890 DEBUG oslo_concurrency.processutils [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] CMD "nvme version" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:13:09 np0005588920 nova_compute[226886]: 2026-01-20 15:13:09.585 226890 DEBUG os_brick.initiator.connectors.lightos [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 20 10:13:09 np0005588920 nova_compute[226886]: 2026-01-20 15:13:09.586 226890 DEBUG os_brick.initiator.connectors.lightos [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 20 10:13:09 np0005588920 nova_compute[226886]: 2026-01-20 15:13:09.586 226890 DEBUG os_brick.initiator.connectors.lightos [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 20 10:13:09 np0005588920 nova_compute[226886]: 2026-01-20 15:13:09.586 226890 DEBUG os_brick.utils [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] <== get_connector_properties: return (103ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:f7e7b2e5d28e', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '1190ec40-2b89-4358-8dec-733c5829fbed', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 20 10:13:09 np0005588920 nova_compute[226886]: 2026-01-20 15:13:09.587 226890 DEBUG nova.virt.block_device [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Updating existing volume attachment record: c84052e2-80a5-4b07-a114-b125b84e1168 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 20 10:13:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 10:13:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:09.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 10:13:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:10.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:10 np0005588920 nova_compute[226886]: 2026-01-20 15:13:10.164 226890 DEBUG nova.network.neutron [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Successfully created port: c453c6fa-f968-46b1-ae72-cd74d3c7dc02 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 10:13:10 np0005588920 nova_compute[226886]: 2026-01-20 15:13:10.640 226890 DEBUG nova.compute.manager [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 10:13:10 np0005588920 nova_compute[226886]: 2026-01-20 15:13:10.642 226890 DEBUG nova.virt.libvirt.driver [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 10:13:10 np0005588920 nova_compute[226886]: 2026-01-20 15:13:10.642 226890 INFO nova.virt.libvirt.driver [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Creating image(s)#033[00m
Jan 20 10:13:10 np0005588920 nova_compute[226886]: 2026-01-20 15:13:10.643 226890 DEBUG nova.virt.libvirt.driver [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 20 10:13:10 np0005588920 nova_compute[226886]: 2026-01-20 15:13:10.643 226890 DEBUG nova.virt.libvirt.driver [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Ensure instance console log exists: /var/lib/nova/instances/389ff8e0-c114-4960-9561-f6ffef743efa/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:13:10 np0005588920 nova_compute[226886]: 2026-01-20 15:13:10.643 226890 DEBUG oslo_concurrency.lockutils [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:13:10 np0005588920 nova_compute[226886]: 2026-01-20 15:13:10.644 226890 DEBUG oslo_concurrency.lockutils [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:13:10 np0005588920 nova_compute[226886]: 2026-01-20 15:13:10.644 226890 DEBUG oslo_concurrency.lockutils [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:13:10 np0005588920 nova_compute[226886]: 2026-01-20 15:13:10.865 226890 DEBUG nova.network.neutron [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Successfully updated port: c453c6fa-f968-46b1-ae72-cd74d3c7dc02 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 10:13:10 np0005588920 nova_compute[226886]: 2026-01-20 15:13:10.885 226890 DEBUG oslo_concurrency.lockutils [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Acquiring lock "refresh_cache-389ff8e0-c114-4960-9561-f6ffef743efa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:13:10 np0005588920 nova_compute[226886]: 2026-01-20 15:13:10.885 226890 DEBUG oslo_concurrency.lockutils [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Acquired lock "refresh_cache-389ff8e0-c114-4960-9561-f6ffef743efa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:13:10 np0005588920 nova_compute[226886]: 2026-01-20 15:13:10.886 226890 DEBUG nova.network.neutron [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:13:10 np0005588920 nova_compute[226886]: 2026-01-20 15:13:10.969 226890 DEBUG nova.compute.manager [req-39a0cc4d-9e8c-413a-8ba2-f73d0ef0a022 req-5e512d3f-af3d-47c7-a904-364e5888f6a6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Received event network-changed-c453c6fa-f968-46b1-ae72-cd74d3c7dc02 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:13:10 np0005588920 nova_compute[226886]: 2026-01-20 15:13:10.970 226890 DEBUG nova.compute.manager [req-39a0cc4d-9e8c-413a-8ba2-f73d0ef0a022 req-5e512d3f-af3d-47c7-a904-364e5888f6a6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Refreshing instance network info cache due to event network-changed-c453c6fa-f968-46b1-ae72-cd74d3c7dc02. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:13:10 np0005588920 nova_compute[226886]: 2026-01-20 15:13:10.970 226890 DEBUG oslo_concurrency.lockutils [req-39a0cc4d-9e8c-413a-8ba2-f73d0ef0a022 req-5e512d3f-af3d-47c7-a904-364e5888f6a6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-389ff8e0-c114-4960-9561-f6ffef743efa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:13:11 np0005588920 nova_compute[226886]: 2026-01-20 15:13:11.064 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:11 np0005588920 nova_compute[226886]: 2026-01-20 15:13:11.070 226890 DEBUG nova.network.neutron [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:13:11 np0005588920 nova_compute[226886]: 2026-01-20 15:13:11.320 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:11 np0005588920 nova_compute[226886]: 2026-01-20 15:13:11.563 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:11.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:13:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:12.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:13:12 np0005588920 nova_compute[226886]: 2026-01-20 15:13:12.417 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:12 np0005588920 nova_compute[226886]: 2026-01-20 15:13:12.478 226890 DEBUG nova.network.neutron [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Updating instance_info_cache with network_info: [{"id": "c453c6fa-f968-46b1-ae72-cd74d3c7dc02", "address": "fa:16:3e:57:1b:b8", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc453c6fa-f9", "ovs_interfaceid": "c453c6fa-f968-46b1-ae72-cd74d3c7dc02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:13:12 np0005588920 nova_compute[226886]: 2026-01-20 15:13:12.499 226890 DEBUG oslo_concurrency.lockutils [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Releasing lock "refresh_cache-389ff8e0-c114-4960-9561-f6ffef743efa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:13:12 np0005588920 nova_compute[226886]: 2026-01-20 15:13:12.499 226890 DEBUG nova.compute.manager [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Instance network_info: |[{"id": "c453c6fa-f968-46b1-ae72-cd74d3c7dc02", "address": "fa:16:3e:57:1b:b8", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc453c6fa-f9", "ovs_interfaceid": "c453c6fa-f968-46b1-ae72-cd74d3c7dc02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 10:13:12 np0005588920 nova_compute[226886]: 2026-01-20 15:13:12.499 226890 DEBUG oslo_concurrency.lockutils [req-39a0cc4d-9e8c-413a-8ba2-f73d0ef0a022 req-5e512d3f-af3d-47c7-a904-364e5888f6a6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-389ff8e0-c114-4960-9561-f6ffef743efa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:13:12 np0005588920 nova_compute[226886]: 2026-01-20 15:13:12.500 226890 DEBUG nova.network.neutron [req-39a0cc4d-9e8c-413a-8ba2-f73d0ef0a022 req-5e512d3f-af3d-47c7-a904-364e5888f6a6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Refreshing network info cache for port c453c6fa-f968-46b1-ae72-cd74d3c7dc02 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:13:12 np0005588920 nova_compute[226886]: 2026-01-20 15:13:12.502 226890 DEBUG nova.virt.libvirt.driver [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Start _get_guest_xml network_info=[{"id": "c453c6fa-f968-46b1-ae72-cd74d3c7dc02", "address": "fa:16:3e:57:1b:b8", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc453c6fa-f9", "ovs_interfaceid": "c453c6fa-f968-46b1-ae72-cd74d3c7dc02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': 
'/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'device_type': 'disk', 'boot_index': 0, 'delete_on_termination': False, 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-670aaaab-6a81-487d-a346-d03d445d8abe', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '670aaaab-6a81-487d-a346-d03d445d8abe', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '389ff8e0-c114-4960-9561-f6ffef743efa', 'attached_at': '', 'detached_at': '', 'volume_id': '670aaaab-6a81-487d-a346-d03d445d8abe', 'serial': '670aaaab-6a81-487d-a346-d03d445d8abe'}, 'mount_device': '/dev/vda', 'guest_format': None, 'disk_bus': 'virtio', 'attachment_id': 'c84052e2-80a5-4b07-a114-b125b84e1168', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:13:12 np0005588920 nova_compute[226886]: 2026-01-20 15:13:12.507 226890 WARNING nova.virt.libvirt.driver [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:13:12 np0005588920 nova_compute[226886]: 2026-01-20 15:13:12.511 226890 DEBUG nova.virt.libvirt.host [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:13:12 np0005588920 nova_compute[226886]: 2026-01-20 15:13:12.512 226890 DEBUG nova.virt.libvirt.host [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:13:12 np0005588920 nova_compute[226886]: 2026-01-20 15:13:12.516 226890 DEBUG nova.virt.libvirt.host [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:13:12 np0005588920 nova_compute[226886]: 2026-01-20 15:13:12.516 226890 DEBUG nova.virt.libvirt.host [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:13:12 np0005588920 nova_compute[226886]: 2026-01-20 15:13:12.518 226890 DEBUG nova.virt.libvirt.driver [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:13:12 np0005588920 nova_compute[226886]: 2026-01-20 15:13:12.518 226890 DEBUG nova.virt.hardware [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:13:12 np0005588920 nova_compute[226886]: 2026-01-20 15:13:12.518 226890 DEBUG nova.virt.hardware [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:13:12 np0005588920 nova_compute[226886]: 2026-01-20 15:13:12.519 226890 DEBUG nova.virt.hardware [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:13:12 np0005588920 nova_compute[226886]: 2026-01-20 15:13:12.519 226890 DEBUG nova.virt.hardware [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:13:12 np0005588920 nova_compute[226886]: 2026-01-20 15:13:12.519 226890 DEBUG nova.virt.hardware [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:13:12 np0005588920 nova_compute[226886]: 2026-01-20 15:13:12.519 226890 DEBUG nova.virt.hardware [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:13:12 np0005588920 nova_compute[226886]: 2026-01-20 15:13:12.519 226890 DEBUG nova.virt.hardware [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:13:12 np0005588920 nova_compute[226886]: 2026-01-20 15:13:12.520 226890 DEBUG nova.virt.hardware [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:13:12 np0005588920 nova_compute[226886]: 2026-01-20 15:13:12.520 226890 DEBUG nova.virt.hardware [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:13:12 np0005588920 nova_compute[226886]: 2026-01-20 15:13:12.520 226890 DEBUG nova.virt.hardware [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:13:12 np0005588920 nova_compute[226886]: 2026-01-20 15:13:12.520 226890 DEBUG nova.virt.hardware [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:13:12 np0005588920 nova_compute[226886]: 2026-01-20 15:13:12.542 226890 DEBUG nova.storage.rbd_utils [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] rbd image 389ff8e0-c114-4960-9561-f6ffef743efa_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:13:12 np0005588920 nova_compute[226886]: 2026-01-20 15:13:12.546 226890 DEBUG oslo_concurrency.processutils [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:13:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e397 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:13:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:13:12 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2024585432' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:13:12 np0005588920 nova_compute[226886]: 2026-01-20 15:13:12.970 226890 DEBUG oslo_concurrency.processutils [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.425s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:13:13 np0005588920 nova_compute[226886]: 2026-01-20 15:13:13.021 226890 DEBUG nova.virt.libvirt.vif [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:13:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-server-894482498',display_name='tempest-TestVolumeBootPattern-server-894482498',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testvolumebootpattern-server-894482498',id=181,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBjkv3PM31l7/LOeidHCDov4vvdGwqOT15IVVbWearXBCn3jQz2xB6ix8iz1XP+iiPXyhWuw0LpMPT9jQN2b0mvhqeZTHErGcz1VZLskRcT6iqcekmFxWykFxr44bv68XA==',key_name='tempest-TestVolumeBootPattern-474773317',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a49638950e1543fa8e0d251af5479623',ramdisk_id='',reservation_id='r-m1u5zzf4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestVolumeBootPattern-194644003',owner_user_name='tempest-TestVolumeBootPattern-194644003-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:13:09Z,user_data=None,user_id='bf422e55e158420cbdae75f07a3bb97a',uuid=389ff8e0-c114-4960-9561-f6ffef743efa,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c453c6fa-f968-46b1-ae72-cd74d3c7dc02", "address": "fa:16:3e:57:1b:b8", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc453c6fa-f9", "ovs_interfaceid": "c453c6fa-f968-46b1-ae72-cd74d3c7dc02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:13:13 np0005588920 nova_compute[226886]: 2026-01-20 15:13:13.022 226890 DEBUG nova.network.os_vif_util [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Converting VIF {"id": "c453c6fa-f968-46b1-ae72-cd74d3c7dc02", "address": "fa:16:3e:57:1b:b8", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc453c6fa-f9", "ovs_interfaceid": "c453c6fa-f968-46b1-ae72-cd74d3c7dc02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:13:13 np0005588920 nova_compute[226886]: 2026-01-20 15:13:13.024 226890 DEBUG nova.network.os_vif_util [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:57:1b:b8,bridge_name='br-int',has_traffic_filtering=True,id=c453c6fa-f968-46b1-ae72-cd74d3c7dc02,network=Network(b677f1a9-dbaa-4373-8466-bd9ccf067b91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc453c6fa-f9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:13:13 np0005588920 nova_compute[226886]: 2026-01-20 15:13:13.027 226890 DEBUG nova.objects.instance [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lazy-loading 'pci_devices' on Instance uuid 389ff8e0-c114-4960-9561-f6ffef743efa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:13:13 np0005588920 nova_compute[226886]: 2026-01-20 15:13:13.047 226890 DEBUG nova.virt.libvirt.driver [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:13:13 np0005588920 nova_compute[226886]:  <uuid>389ff8e0-c114-4960-9561-f6ffef743efa</uuid>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:  <name>instance-000000b5</name>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:13:13 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:      <nova:name>tempest-TestVolumeBootPattern-server-894482498</nova:name>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 15:13:12</nova:creationTime>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 10:13:13 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:        <nova:user uuid="bf422e55e158420cbdae75f07a3bb97a">tempest-TestVolumeBootPattern-194644003-project-member</nova:user>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:        <nova:project uuid="a49638950e1543fa8e0d251af5479623">tempest-TestVolumeBootPattern-194644003</nova:project>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:        <nova:port uuid="c453c6fa-f968-46b1-ae72-cd74d3c7dc02">
Jan 20 10:13:13 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    <system>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:      <entry name="serial">389ff8e0-c114-4960-9561-f6ffef743efa</entry>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:      <entry name="uuid">389ff8e0-c114-4960-9561-f6ffef743efa</entry>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    </system>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:  <os>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:  </os>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:  <features>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:  </features>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:  </clock>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:  <devices>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 10:13:13 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/389ff8e0-c114-4960-9561-f6ffef743efa_disk.config">
Jan 20 10:13:13 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:13:13 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 10:13:13 np0005588920 nova_compute[226886]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="volumes/volume-670aaaab-6a81-487d-a346-d03d445d8abe">
Jan 20 10:13:13 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:13:13 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:      <serial>670aaaab-6a81-487d-a346-d03d445d8abe</serial>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 10:13:13 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:57:1b:b8"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:      <target dev="tapc453c6fa-f9"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    </interface>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 10:13:13 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/389ff8e0-c114-4960-9561-f6ffef743efa/console.log" append="off"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    </serial>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    <video>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    </video>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 10:13:13 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    </rng>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 10:13:13 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 10:13:13 np0005588920 nova_compute[226886]:  </devices>
Jan 20 10:13:13 np0005588920 nova_compute[226886]: </domain>
Jan 20 10:13:13 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:13:13 np0005588920 nova_compute[226886]: 2026-01-20 15:13:13.049 226890 DEBUG nova.compute.manager [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Preparing to wait for external event network-vif-plugged-c453c6fa-f968-46b1-ae72-cd74d3c7dc02 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:13:13 np0005588920 nova_compute[226886]: 2026-01-20 15:13:13.049 226890 DEBUG oslo_concurrency.lockutils [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Acquiring lock "389ff8e0-c114-4960-9561-f6ffef743efa-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:13:13 np0005588920 nova_compute[226886]: 2026-01-20 15:13:13.050 226890 DEBUG oslo_concurrency.lockutils [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "389ff8e0-c114-4960-9561-f6ffef743efa-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:13:13 np0005588920 nova_compute[226886]: 2026-01-20 15:13:13.050 226890 DEBUG oslo_concurrency.lockutils [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "389ff8e0-c114-4960-9561-f6ffef743efa-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:13:13 np0005588920 nova_compute[226886]: 2026-01-20 15:13:13.051 226890 DEBUG nova.virt.libvirt.vif [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:13:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-server-894482498',display_name='tempest-TestVolumeBootPattern-server-894482498',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testvolumebootpattern-server-894482498',id=181,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBjkv3PM31l7/LOeidHCDov4vvdGwqOT15IVVbWearXBCn3jQz2xB6ix8iz1XP+iiPXyhWuw0LpMPT9jQN2b0mvhqeZTHErGcz1VZLskRcT6iqcekmFxWykFxr44bv68XA==',key_name='tempest-TestVolumeBootPattern-474773317',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a49638950e1543fa8e0d251af5479623',ramdisk_id='',reservation_id='r-m1u5zzf4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestVolumeBootPattern-194644003',owner_user_name='tempest-TestVolumeBootPattern-194644003-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:13:09Z,user_data=None,user_id='bf422e55e158420cbdae75f07a3bb97a',uuid=389ff8e0-c114-4960-9561-f6ffef743efa,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c453c6fa-f968-46b1-ae72-cd74d3c7dc02", "address": "fa:16:3e:57:1b:b8", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc453c6fa-f9", "ovs_interfaceid": "c453c6fa-f968-46b1-ae72-cd74d3c7dc02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:13:13 np0005588920 nova_compute[226886]: 2026-01-20 15:13:13.051 226890 DEBUG nova.network.os_vif_util [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Converting VIF {"id": "c453c6fa-f968-46b1-ae72-cd74d3c7dc02", "address": "fa:16:3e:57:1b:b8", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc453c6fa-f9", "ovs_interfaceid": "c453c6fa-f968-46b1-ae72-cd74d3c7dc02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:13:13 np0005588920 nova_compute[226886]: 2026-01-20 15:13:13.051 226890 DEBUG nova.network.os_vif_util [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:57:1b:b8,bridge_name='br-int',has_traffic_filtering=True,id=c453c6fa-f968-46b1-ae72-cd74d3c7dc02,network=Network(b677f1a9-dbaa-4373-8466-bd9ccf067b91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc453c6fa-f9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:13:13 np0005588920 nova_compute[226886]: 2026-01-20 15:13:13.052 226890 DEBUG os_vif [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:57:1b:b8,bridge_name='br-int',has_traffic_filtering=True,id=c453c6fa-f968-46b1-ae72-cd74d3c7dc02,network=Network(b677f1a9-dbaa-4373-8466-bd9ccf067b91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc453c6fa-f9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:13:13 np0005588920 nova_compute[226886]: 2026-01-20 15:13:13.052 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:13 np0005588920 nova_compute[226886]: 2026-01-20 15:13:13.053 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:13:13 np0005588920 nova_compute[226886]: 2026-01-20 15:13:13.053 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:13:13 np0005588920 nova_compute[226886]: 2026-01-20 15:13:13.056 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:13 np0005588920 nova_compute[226886]: 2026-01-20 15:13:13.057 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc453c6fa-f9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:13:13 np0005588920 nova_compute[226886]: 2026-01-20 15:13:13.057 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc453c6fa-f9, col_values=(('external_ids', {'iface-id': 'c453c6fa-f968-46b1-ae72-cd74d3c7dc02', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:57:1b:b8', 'vm-uuid': '389ff8e0-c114-4960-9561-f6ffef743efa'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:13:13 np0005588920 NetworkManager[49076]: <info>  [1768921993.1064] manager: (tapc453c6fa-f9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/395)
Jan 20 10:13:13 np0005588920 nova_compute[226886]: 2026-01-20 15:13:13.105 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:13 np0005588920 nova_compute[226886]: 2026-01-20 15:13:13.109 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:13:13 np0005588920 nova_compute[226886]: 2026-01-20 15:13:13.113 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:13 np0005588920 nova_compute[226886]: 2026-01-20 15:13:13.114 226890 INFO os_vif [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:57:1b:b8,bridge_name='br-int',has_traffic_filtering=True,id=c453c6fa-f968-46b1-ae72-cd74d3c7dc02,network=Network(b677f1a9-dbaa-4373-8466-bd9ccf067b91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc453c6fa-f9')#033[00m
Jan 20 10:13:13 np0005588920 nova_compute[226886]: 2026-01-20 15:13:13.176 226890 DEBUG nova.virt.libvirt.driver [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:13:13 np0005588920 nova_compute[226886]: 2026-01-20 15:13:13.177 226890 DEBUG nova.virt.libvirt.driver [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:13:13 np0005588920 nova_compute[226886]: 2026-01-20 15:13:13.177 226890 DEBUG nova.virt.libvirt.driver [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] No VIF found with MAC fa:16:3e:57:1b:b8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:13:13 np0005588920 nova_compute[226886]: 2026-01-20 15:13:13.178 226890 INFO nova.virt.libvirt.driver [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Using config drive#033[00m
Jan 20 10:13:13 np0005588920 nova_compute[226886]: 2026-01-20 15:13:13.205 226890 DEBUG nova.storage.rbd_utils [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] rbd image 389ff8e0-c114-4960-9561-f6ffef743efa_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:13:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:13:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2794425772' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:13:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:13:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2794425772' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:13:13 np0005588920 nova_compute[226886]: 2026-01-20 15:13:13.624 226890 INFO nova.virt.libvirt.driver [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Creating config drive at /var/lib/nova/instances/389ff8e0-c114-4960-9561-f6ffef743efa/disk.config#033[00m
Jan 20 10:13:13 np0005588920 nova_compute[226886]: 2026-01-20 15:13:13.630 226890 DEBUG oslo_concurrency.processutils [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/389ff8e0-c114-4960-9561-f6ffef743efa/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0rcm9j5q execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:13:13 np0005588920 nova_compute[226886]: 2026-01-20 15:13:13.776 226890 DEBUG oslo_concurrency.processutils [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/389ff8e0-c114-4960-9561-f6ffef743efa/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0rcm9j5q" returned: 0 in 0.146s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:13:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:13:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:13.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:13:13 np0005588920 nova_compute[226886]: 2026-01-20 15:13:13.799 226890 DEBUG nova.storage.rbd_utils [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] rbd image 389ff8e0-c114-4960-9561-f6ffef743efa_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:13:13 np0005588920 nova_compute[226886]: 2026-01-20 15:13:13.802 226890 DEBUG oslo_concurrency.processutils [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/389ff8e0-c114-4960-9561-f6ffef743efa/disk.config 389ff8e0-c114-4960-9561-f6ffef743efa_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:13:13 np0005588920 nova_compute[226886]: 2026-01-20 15:13:13.975 226890 DEBUG nova.network.neutron [req-39a0cc4d-9e8c-413a-8ba2-f73d0ef0a022 req-5e512d3f-af3d-47c7-a904-364e5888f6a6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Updated VIF entry in instance network info cache for port c453c6fa-f968-46b1-ae72-cd74d3c7dc02. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:13:13 np0005588920 nova_compute[226886]: 2026-01-20 15:13:13.976 226890 DEBUG nova.network.neutron [req-39a0cc4d-9e8c-413a-8ba2-f73d0ef0a022 req-5e512d3f-af3d-47c7-a904-364e5888f6a6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Updating instance_info_cache with network_info: [{"id": "c453c6fa-f968-46b1-ae72-cd74d3c7dc02", "address": "fa:16:3e:57:1b:b8", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc453c6fa-f9", "ovs_interfaceid": "c453c6fa-f968-46b1-ae72-cd74d3c7dc02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:13:13 np0005588920 nova_compute[226886]: 2026-01-20 15:13:13.992 226890 DEBUG oslo_concurrency.processutils [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/389ff8e0-c114-4960-9561-f6ffef743efa/disk.config 389ff8e0-c114-4960-9561-f6ffef743efa_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.190s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:13:13 np0005588920 nova_compute[226886]: 2026-01-20 15:13:13.993 226890 INFO nova.virt.libvirt.driver [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Deleting local config drive /var/lib/nova/instances/389ff8e0-c114-4960-9561-f6ffef743efa/disk.config because it was imported into RBD.#033[00m
Jan 20 10:13:13 np0005588920 nova_compute[226886]: 2026-01-20 15:13:13.996 226890 DEBUG oslo_concurrency.lockutils [req-39a0cc4d-9e8c-413a-8ba2-f73d0ef0a022 req-5e512d3f-af3d-47c7-a904-364e5888f6a6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-389ff8e0-c114-4960-9561-f6ffef743efa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:13:14 np0005588920 kernel: tapc453c6fa-f9: entered promiscuous mode
Jan 20 10:13:14 np0005588920 ovn_controller[133971]: 2026-01-20T15:13:14Z|00840|binding|INFO|Claiming lport c453c6fa-f968-46b1-ae72-cd74d3c7dc02 for this chassis.
Jan 20 10:13:14 np0005588920 ovn_controller[133971]: 2026-01-20T15:13:14Z|00841|binding|INFO|c453c6fa-f968-46b1-ae72-cd74d3c7dc02: Claiming fa:16:3e:57:1b:b8 10.100.0.3
Jan 20 10:13:14 np0005588920 NetworkManager[49076]: <info>  [1768921994.0466] manager: (tapc453c6fa-f9): new Tun device (/org/freedesktop/NetworkManager/Devices/396)
Jan 20 10:13:14 np0005588920 nova_compute[226886]: 2026-01-20 15:13:14.046 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:14.056 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:57:1b:b8 10.100.0.3'], port_security=['fa:16:3e:57:1b:b8 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '389ff8e0-c114-4960-9561-f6ffef743efa', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b677f1a9-dbaa-4373-8466-bd9ccf067b91', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a49638950e1543fa8e0d251af5479623', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c29da5ec-6cb2-4047-ba89-70fa67a96476', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=76ec1139-009f-49fe-bfde-07c0ef9e8b12, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=c453c6fa-f968-46b1-ae72-cd74d3c7dc02) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:14.057 144128 INFO neutron.agent.ovn.metadata.agent [-] Port c453c6fa-f968-46b1-ae72-cd74d3c7dc02 in datapath b677f1a9-dbaa-4373-8466-bd9ccf067b91 bound to our chassis#033[00m
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:14.058 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b677f1a9-dbaa-4373-8466-bd9ccf067b91#033[00m
Jan 20 10:13:14 np0005588920 systemd-udevd[293971]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:14.071 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f416f6f7-3cc0-4f5f-9b6b-9e47d93586d8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:14.072 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb677f1a9-d1 in ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:14.074 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb677f1a9-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:14.074 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[efe40f70-52aa-44fc-8c7e-217375eddb06]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:14.074 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[9294b5bf-b4fc-417f-93b0-312a9037f4c1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:14 np0005588920 systemd-machined[196121]: New machine qemu-87-instance-000000b5.
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:14.086 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[e8d83da4-f2b6-4c23-8d32-5bf61b76e64c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:14 np0005588920 NetworkManager[49076]: <info>  [1768921994.0891] device (tapc453c6fa-f9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:13:14 np0005588920 NetworkManager[49076]: <info>  [1768921994.0898] device (tapc453c6fa-f9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:13:14 np0005588920 nova_compute[226886]: 2026-01-20 15:13:14.106 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:14 np0005588920 systemd[1]: Started Virtual Machine qemu-87-instance-000000b5.
Jan 20 10:13:14 np0005588920 nova_compute[226886]: 2026-01-20 15:13:14.113 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:14.112 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[9696f7b3-577b-4a76-8be0-0dc0003fdf83]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:14 np0005588920 ovn_controller[133971]: 2026-01-20T15:13:14Z|00842|binding|INFO|Setting lport c453c6fa-f968-46b1-ae72-cd74d3c7dc02 ovn-installed in OVS
Jan 20 10:13:14 np0005588920 ovn_controller[133971]: 2026-01-20T15:13:14Z|00843|binding|INFO|Setting lport c453c6fa-f968-46b1-ae72-cd74d3c7dc02 up in Southbound
Jan 20 10:13:14 np0005588920 nova_compute[226886]: 2026-01-20 15:13:14.119 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:14.152 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[8b9e54c4-c4fc-4363-b199-19c98716c417]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:14 np0005588920 systemd-udevd[293975]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:13:14 np0005588920 NetworkManager[49076]: <info>  [1768921994.1591] manager: (tapb677f1a9-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/397)
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:14.158 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b297ea9e-e84e-41a5-9a9e-f5697c8fda83]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:14.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:14.192 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[67335ddb-b5c8-4e99-8a6d-db508983ecc6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:14.195 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[307a3da9-0f92-4396-8b72-b348a15e1ae4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:14 np0005588920 NetworkManager[49076]: <info>  [1768921994.2195] device (tapb677f1a9-d0): carrier: link connected
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:14.223 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[5b9c3da2-1444-4a11-b8d4-26db99eb356c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:14.240 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[6aed006e-f4ed-49ab-9393-a2b5d12b2d4c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb677f1a9-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a4:c8:34'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 268], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 703590, 'reachable_time': 33517, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 294004, 'error': None, 'target': 'ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:14.256 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[850e8627-d325-47dc-84e8-6de3fd7b4b0e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea4:c834'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 703590, 'tstamp': 703590}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 294005, 'error': None, 'target': 'ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:14.273 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a267e43f-6613-424c-8896-e081780a5742]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb677f1a9-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a4:c8:34'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 268], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 703590, 'reachable_time': 33517, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 294006, 'error': None, 'target': 'ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:14.301 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[656d77f1-c2ba-4e3c-9780-45e345b85fbc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:14.356 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c1c3865c-f929-4ec9-8851-90825d4ae4a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:14.358 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb677f1a9-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:14.358 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:14.358 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb677f1a9-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:13:14 np0005588920 nova_compute[226886]: 2026-01-20 15:13:14.406 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:14 np0005588920 kernel: tapb677f1a9-d0: entered promiscuous mode
Jan 20 10:13:14 np0005588920 nova_compute[226886]: 2026-01-20 15:13:14.411 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:14 np0005588920 NetworkManager[49076]: <info>  [1768921994.4125] manager: (tapb677f1a9-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/398)
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:14.413 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb677f1a9-d0, col_values=(('external_ids', {'iface-id': '1aa285ce-a9ae-4d1e-b4b9-c72f4e0b8d65'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:13:14 np0005588920 nova_compute[226886]: 2026-01-20 15:13:14.414 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:14 np0005588920 nova_compute[226886]: 2026-01-20 15:13:14.415 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:14 np0005588920 ovn_controller[133971]: 2026-01-20T15:13:14Z|00844|binding|INFO|Releasing lport 1aa285ce-a9ae-4d1e-b4b9-c72f4e0b8d65 from this chassis (sb_readonly=0)
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:14.416 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b677f1a9-dbaa-4373-8466-bd9ccf067b91.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b677f1a9-dbaa-4373-8466-bd9ccf067b91.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:14.417 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[6b3f7b06-9dd2-4dde-99aa-b0c8be8dae4c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:14.418 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-b677f1a9-dbaa-4373-8466-bd9ccf067b91
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/b677f1a9-dbaa-4373-8466-bd9ccf067b91.pid.haproxy
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID b677f1a9-dbaa-4373-8466-bd9ccf067b91
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:13:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:14.419 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91', 'env', 'PROCESS_TAG=haproxy-b677f1a9-dbaa-4373-8466-bd9ccf067b91', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b677f1a9-dbaa-4373-8466-bd9ccf067b91.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:13:14 np0005588920 nova_compute[226886]: 2026-01-20 15:13:14.428 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:14 np0005588920 nova_compute[226886]: 2026-01-20 15:13:14.544 226890 DEBUG nova.compute.manager [req-8fc8601e-7278-431f-9bc5-ac8baf88baba req-0b9f0c96-2fda-4fe2-9970-b0084dda1995 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Received event network-vif-plugged-c453c6fa-f968-46b1-ae72-cd74d3c7dc02 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:13:14 np0005588920 nova_compute[226886]: 2026-01-20 15:13:14.545 226890 DEBUG oslo_concurrency.lockutils [req-8fc8601e-7278-431f-9bc5-ac8baf88baba req-0b9f0c96-2fda-4fe2-9970-b0084dda1995 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "389ff8e0-c114-4960-9561-f6ffef743efa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:13:14 np0005588920 nova_compute[226886]: 2026-01-20 15:13:14.545 226890 DEBUG oslo_concurrency.lockutils [req-8fc8601e-7278-431f-9bc5-ac8baf88baba req-0b9f0c96-2fda-4fe2-9970-b0084dda1995 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "389ff8e0-c114-4960-9561-f6ffef743efa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:13:14 np0005588920 nova_compute[226886]: 2026-01-20 15:13:14.545 226890 DEBUG oslo_concurrency.lockutils [req-8fc8601e-7278-431f-9bc5-ac8baf88baba req-0b9f0c96-2fda-4fe2-9970-b0084dda1995 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "389ff8e0-c114-4960-9561-f6ffef743efa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:13:14 np0005588920 nova_compute[226886]: 2026-01-20 15:13:14.545 226890 DEBUG nova.compute.manager [req-8fc8601e-7278-431f-9bc5-ac8baf88baba req-0b9f0c96-2fda-4fe2-9970-b0084dda1995 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Processing event network-vif-plugged-c453c6fa-f968-46b1-ae72-cd74d3c7dc02 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 10:13:14 np0005588920 nova_compute[226886]: 2026-01-20 15:13:14.702 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921994.7021496, 389ff8e0-c114-4960-9561-f6ffef743efa => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:13:14 np0005588920 nova_compute[226886]: 2026-01-20 15:13:14.703 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] VM Started (Lifecycle Event)#033[00m
Jan 20 10:13:14 np0005588920 nova_compute[226886]: 2026-01-20 15:13:14.704 226890 DEBUG nova.compute.manager [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:13:14 np0005588920 nova_compute[226886]: 2026-01-20 15:13:14.709 226890 DEBUG nova.virt.libvirt.driver [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:13:14 np0005588920 nova_compute[226886]: 2026-01-20 15:13:14.713 226890 INFO nova.virt.libvirt.driver [-] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Instance spawned successfully.#033[00m
Jan 20 10:13:14 np0005588920 nova_compute[226886]: 2026-01-20 15:13:14.714 226890 DEBUG nova.virt.libvirt.driver [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 10:13:14 np0005588920 nova_compute[226886]: 2026-01-20 15:13:14.740 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:13:14 np0005588920 nova_compute[226886]: 2026-01-20 15:13:14.745 226890 DEBUG nova.virt.libvirt.driver [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:13:14 np0005588920 nova_compute[226886]: 2026-01-20 15:13:14.745 226890 DEBUG nova.virt.libvirt.driver [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:13:14 np0005588920 nova_compute[226886]: 2026-01-20 15:13:14.746 226890 DEBUG nova.virt.libvirt.driver [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:13:14 np0005588920 nova_compute[226886]: 2026-01-20 15:13:14.746 226890 DEBUG nova.virt.libvirt.driver [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:13:14 np0005588920 nova_compute[226886]: 2026-01-20 15:13:14.746 226890 DEBUG nova.virt.libvirt.driver [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:13:14 np0005588920 nova_compute[226886]: 2026-01-20 15:13:14.747 226890 DEBUG nova.virt.libvirt.driver [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:13:14 np0005588920 nova_compute[226886]: 2026-01-20 15:13:14.750 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:13:14 np0005588920 nova_compute[226886]: 2026-01-20 15:13:14.788 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:13:14 np0005588920 nova_compute[226886]: 2026-01-20 15:13:14.789 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921994.7023578, 389ff8e0-c114-4960-9561-f6ffef743efa => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:13:14 np0005588920 nova_compute[226886]: 2026-01-20 15:13:14.789 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:13:14 np0005588920 podman[294080]: 2026-01-20 15:13:14.801057321 +0000 UTC m=+0.056223328 container create 101b6185e50cc0ea2802594fd4e6f0d3156a37f8520a107586fabbd8f55a1bd9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:13:14 np0005588920 nova_compute[226886]: 2026-01-20 15:13:14.806 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:13:14 np0005588920 nova_compute[226886]: 2026-01-20 15:13:14.810 226890 INFO nova.compute.manager [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Took 4.17 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 10:13:14 np0005588920 nova_compute[226886]: 2026-01-20 15:13:14.810 226890 DEBUG nova.compute.manager [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:13:14 np0005588920 nova_compute[226886]: 2026-01-20 15:13:14.812 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768921994.707237, 389ff8e0-c114-4960-9561-f6ffef743efa => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:13:14 np0005588920 nova_compute[226886]: 2026-01-20 15:13:14.813 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:13:14 np0005588920 nova_compute[226886]: 2026-01-20 15:13:14.838 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:13:14 np0005588920 nova_compute[226886]: 2026-01-20 15:13:14.840 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:13:14 np0005588920 systemd[1]: Started libpod-conmon-101b6185e50cc0ea2802594fd4e6f0d3156a37f8520a107586fabbd8f55a1bd9.scope.
Jan 20 10:13:14 np0005588920 podman[294080]: 2026-01-20 15:13:14.772629103 +0000 UTC m=+0.027795130 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:13:14 np0005588920 nova_compute[226886]: 2026-01-20 15:13:14.868 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:13:14 np0005588920 systemd[1]: Started libcrun container.
Jan 20 10:13:14 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/187446bd27bae25bde7c05fdfe068635431d83f727e53f56f9107578c538690e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:13:14 np0005588920 nova_compute[226886]: 2026-01-20 15:13:14.889 226890 INFO nova.compute.manager [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Took 6.37 seconds to build instance.#033[00m
Jan 20 10:13:14 np0005588920 nova_compute[226886]: 2026-01-20 15:13:14.911 226890 DEBUG oslo_concurrency.lockutils [None req-c40cad4f-88b1-42cf-a9df-1351563488dd bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "389ff8e0-c114-4960-9561-f6ffef743efa" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.459s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:13:14 np0005588920 podman[294080]: 2026-01-20 15:13:14.970413983 +0000 UTC m=+0.225580010 container init 101b6185e50cc0ea2802594fd4e6f0d3156a37f8520a107586fabbd8f55a1bd9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 20 10:13:14 np0005588920 podman[294080]: 2026-01-20 15:13:14.97682054 +0000 UTC m=+0.231986587 container start 101b6185e50cc0ea2802594fd4e6f0d3156a37f8520a107586fabbd8f55a1bd9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 20 10:13:15 np0005588920 neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91[294094]: [NOTICE]   (294098) : New worker (294100) forked
Jan 20 10:13:15 np0005588920 neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91[294094]: [NOTICE]   (294098) : Loading success.
Jan 20 10:13:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:15.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:16.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:16.473 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:13:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:16.477 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:13:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:16.479 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:13:16 np0005588920 nova_compute[226886]: 2026-01-20 15:13:16.685 226890 DEBUG nova.compute.manager [req-4a64b566-d197-4d3f-8911-5bb961d32977 req-dcc0e855-c3ad-427a-b723-5021628505ed 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Received event network-vif-plugged-c453c6fa-f968-46b1-ae72-cd74d3c7dc02 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:13:16 np0005588920 nova_compute[226886]: 2026-01-20 15:13:16.686 226890 DEBUG oslo_concurrency.lockutils [req-4a64b566-d197-4d3f-8911-5bb961d32977 req-dcc0e855-c3ad-427a-b723-5021628505ed 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "389ff8e0-c114-4960-9561-f6ffef743efa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:13:16 np0005588920 nova_compute[226886]: 2026-01-20 15:13:16.686 226890 DEBUG oslo_concurrency.lockutils [req-4a64b566-d197-4d3f-8911-5bb961d32977 req-dcc0e855-c3ad-427a-b723-5021628505ed 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "389ff8e0-c114-4960-9561-f6ffef743efa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:13:16 np0005588920 nova_compute[226886]: 2026-01-20 15:13:16.686 226890 DEBUG oslo_concurrency.lockutils [req-4a64b566-d197-4d3f-8911-5bb961d32977 req-dcc0e855-c3ad-427a-b723-5021628505ed 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "389ff8e0-c114-4960-9561-f6ffef743efa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:13:16 np0005588920 nova_compute[226886]: 2026-01-20 15:13:16.686 226890 DEBUG nova.compute.manager [req-4a64b566-d197-4d3f-8911-5bb961d32977 req-dcc0e855-c3ad-427a-b723-5021628505ed 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] No waiting events found dispatching network-vif-plugged-c453c6fa-f968-46b1-ae72-cd74d3c7dc02 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:13:16 np0005588920 nova_compute[226886]: 2026-01-20 15:13:16.687 226890 WARNING nova.compute.manager [req-4a64b566-d197-4d3f-8911-5bb961d32977 req-dcc0e855-c3ad-427a-b723-5021628505ed 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Received unexpected event network-vif-plugged-c453c6fa-f968-46b1-ae72-cd74d3c7dc02 for instance with vm_state active and task_state None.#033[00m
Jan 20 10:13:17 np0005588920 nova_compute[226886]: 2026-01-20 15:13:17.418 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:17.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e397 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:13:18 np0005588920 nova_compute[226886]: 2026-01-20 15:13:18.107 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:13:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:18.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:13:18 np0005588920 nova_compute[226886]: 2026-01-20 15:13:18.639 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:18 np0005588920 NetworkManager[49076]: <info>  [1768921998.6422] manager: (patch-br-int-to-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/399)
Jan 20 10:13:18 np0005588920 NetworkManager[49076]: <info>  [1768921998.6434] manager: (patch-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/400)
Jan 20 10:13:18 np0005588920 ceph-osd[79820]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/18.2.7/rpm/el9/BUILD/ceph-18.2.7/src/cls/lock/cls_lock.cc:291: Could not read list of current lockers off disk: (2) No such file or directory
Jan 20 10:13:18 np0005588920 nova_compute[226886]: 2026-01-20 15:13:18.772 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:18 np0005588920 ovn_controller[133971]: 2026-01-20T15:13:18Z|00845|binding|INFO|Releasing lport 1aa285ce-a9ae-4d1e-b4b9-c72f4e0b8d65 from this chassis (sb_readonly=0)
Jan 20 10:13:18 np0005588920 nova_compute[226886]: 2026-01-20 15:13:18.786 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:19 np0005588920 nova_compute[226886]: 2026-01-20 15:13:19.465 226890 DEBUG nova.compute.manager [req-cce1c411-3c95-4098-9520-b6764788e2f2 req-cdc66655-6088-4aef-b70a-b1635a1802fc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Received event network-changed-c453c6fa-f968-46b1-ae72-cd74d3c7dc02 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:13:19 np0005588920 nova_compute[226886]: 2026-01-20 15:13:19.466 226890 DEBUG nova.compute.manager [req-cce1c411-3c95-4098-9520-b6764788e2f2 req-cdc66655-6088-4aef-b70a-b1635a1802fc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Refreshing instance network info cache due to event network-changed-c453c6fa-f968-46b1-ae72-cd74d3c7dc02. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:13:19 np0005588920 nova_compute[226886]: 2026-01-20 15:13:19.466 226890 DEBUG oslo_concurrency.lockutils [req-cce1c411-3c95-4098-9520-b6764788e2f2 req-cdc66655-6088-4aef-b70a-b1635a1802fc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-389ff8e0-c114-4960-9561-f6ffef743efa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:13:19 np0005588920 nova_compute[226886]: 2026-01-20 15:13:19.466 226890 DEBUG oslo_concurrency.lockutils [req-cce1c411-3c95-4098-9520-b6764788e2f2 req-cdc66655-6088-4aef-b70a-b1635a1802fc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-389ff8e0-c114-4960-9561-f6ffef743efa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:13:19 np0005588920 nova_compute[226886]: 2026-01-20 15:13:19.467 226890 DEBUG nova.network.neutron [req-cce1c411-3c95-4098-9520-b6764788e2f2 req-cdc66655-6088-4aef-b70a-b1635a1802fc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Refreshing network info cache for port c453c6fa-f968-46b1-ae72-cd74d3c7dc02 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:13:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:19.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:20.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:21 np0005588920 nova_compute[226886]: 2026-01-20 15:13:21.036 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768921986.0350227, f3b0f200-2f57-4c25-bdf4-8d17165642fe => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:13:21 np0005588920 nova_compute[226886]: 2026-01-20 15:13:21.037 226890 INFO nova.compute.manager [-] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:13:21 np0005588920 nova_compute[226886]: 2026-01-20 15:13:21.360 226890 DEBUG nova.compute.manager [None req-9483e4df-c18f-4e3e-a4b5-be5ee89d68be - - - - - -] [instance: f3b0f200-2f57-4c25-bdf4-8d17165642fe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:13:21 np0005588920 nova_compute[226886]: 2026-01-20 15:13:21.487 226890 DEBUG nova.network.neutron [req-cce1c411-3c95-4098-9520-b6764788e2f2 req-cdc66655-6088-4aef-b70a-b1635a1802fc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Updated VIF entry in instance network info cache for port c453c6fa-f968-46b1-ae72-cd74d3c7dc02. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:13:21 np0005588920 nova_compute[226886]: 2026-01-20 15:13:21.487 226890 DEBUG nova.network.neutron [req-cce1c411-3c95-4098-9520-b6764788e2f2 req-cdc66655-6088-4aef-b70a-b1635a1802fc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Updating instance_info_cache with network_info: [{"id": "c453c6fa-f968-46b1-ae72-cd74d3c7dc02", "address": "fa:16:3e:57:1b:b8", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc453c6fa-f9", "ovs_interfaceid": "c453c6fa-f968-46b1-ae72-cd74d3c7dc02", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:13:21 np0005588920 nova_compute[226886]: 2026-01-20 15:13:21.518 226890 DEBUG oslo_concurrency.lockutils [req-cce1c411-3c95-4098-9520-b6764788e2f2 req-cdc66655-6088-4aef-b70a-b1635a1802fc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-389ff8e0-c114-4960-9561-f6ffef743efa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:13:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:21.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:22 np0005588920 podman[294110]: 2026-01-20 15:13:22.033087594 +0000 UTC m=+0.115720551 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:13:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:13:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:22.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:13:22 np0005588920 nova_compute[226886]: 2026-01-20 15:13:22.480 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e397 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:13:23 np0005588920 nova_compute[226886]: 2026-01-20 15:13:23.031 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:23 np0005588920 nova_compute[226886]: 2026-01-20 15:13:23.109 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:23.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:13:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:24.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:13:25 np0005588920 ceph-mgr[77507]: client.0 ms_handle_reset on v2:192.168.122.100:6800/2542147622
Jan 20 10:13:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:13:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:25.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:13:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:26.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:26 np0005588920 ovn_controller[133971]: 2026-01-20T15:13:26Z|00113|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:57:1b:b8 10.100.0.3
Jan 20 10:13:26 np0005588920 ovn_controller[133971]: 2026-01-20T15:13:26Z|00114|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:57:1b:b8 10.100.0.3
Jan 20 10:13:27 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #133. Immutable memtables: 0.
Jan 20 10:13:27 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:13:27.402594) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:13:27 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 83] Flushing memtable with next log file: 133
Jan 20 10:13:27 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922007402670, "job": 83, "event": "flush_started", "num_memtables": 1, "num_entries": 771, "num_deletes": 254, "total_data_size": 1275267, "memory_usage": 1297664, "flush_reason": "Manual Compaction"}
Jan 20 10:13:27 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 83] Level-0 flush table #134: started
Jan 20 10:13:27 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922007410088, "cf_name": "default", "job": 83, "event": "table_file_creation", "file_number": 134, "file_size": 840017, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 65695, "largest_seqno": 66461, "table_properties": {"data_size": 836370, "index_size": 1426, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1157, "raw_key_size": 8828, "raw_average_key_size": 19, "raw_value_size": 828875, "raw_average_value_size": 1871, "num_data_blocks": 63, "num_entries": 443, "num_filter_entries": 443, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768921962, "oldest_key_time": 1768921962, "file_creation_time": 1768922007, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 134, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:13:27 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 83] Flush lasted 7540 microseconds, and 3270 cpu microseconds.
Jan 20 10:13:27 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:13:27 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:13:27.410138) [db/flush_job.cc:967] [default] [JOB 83] Level-0 flush table #134: 840017 bytes OK
Jan 20 10:13:27 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:13:27.410153) [db/memtable_list.cc:519] [default] Level-0 commit table #134 started
Jan 20 10:13:27 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:13:27.411737) [db/memtable_list.cc:722] [default] Level-0 commit table #134: memtable #1 done
Jan 20 10:13:27 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:13:27.411751) EVENT_LOG_v1 {"time_micros": 1768922007411745, "job": 83, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:13:27 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:13:27.411765) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:13:27 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 83] Try to delete WAL files size 1271164, prev total WAL file size 1271164, number of live WAL files 2.
Jan 20 10:13:27 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000130.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:13:27 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:13:27.412264) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730035353232' seq:72057594037927935, type:22 .. '7061786F730035373734' seq:0, type:0; will stop at (end)
Jan 20 10:13:27 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 84] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:13:27 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 83 Base level 0, inputs: [134(820KB)], [132(10MB)]
Jan 20 10:13:27 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922007412314, "job": 84, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [134], "files_L6": [132], "score": -1, "input_data_size": 11826411, "oldest_snapshot_seqno": -1}
Jan 20 10:13:27 np0005588920 nova_compute[226886]: 2026-01-20 15:13:27.482 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:27 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 84] Generated table #135: 8768 keys, 9951508 bytes, temperature: kUnknown
Jan 20 10:13:27 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922007496902, "cf_name": "default", "job": 84, "event": "table_file_creation", "file_number": 135, "file_size": 9951508, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9896161, "index_size": 32322, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 21957, "raw_key_size": 232025, "raw_average_key_size": 26, "raw_value_size": 9743160, "raw_average_value_size": 1111, "num_data_blocks": 1224, "num_entries": 8768, "num_filter_entries": 8768, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768922007, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 135, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:13:27 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:13:27 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:13:27.497358) [db/compaction/compaction_job.cc:1663] [default] [JOB 84] Compacted 1@0 + 1@6 files to L6 => 9951508 bytes
Jan 20 10:13:27 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:13:27.499355) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 139.6 rd, 117.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.8, 10.5 +0.0 blob) out(9.5 +0.0 blob), read-write-amplify(25.9) write-amplify(11.8) OK, records in: 9290, records dropped: 522 output_compression: NoCompression
Jan 20 10:13:27 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:13:27.499380) EVENT_LOG_v1 {"time_micros": 1768922007499368, "job": 84, "event": "compaction_finished", "compaction_time_micros": 84723, "compaction_time_cpu_micros": 24792, "output_level": 6, "num_output_files": 1, "total_output_size": 9951508, "num_input_records": 9290, "num_output_records": 8768, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:13:27 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000134.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:13:27 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922007499747, "job": 84, "event": "table_file_deletion", "file_number": 134}
Jan 20 10:13:27 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000132.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:13:27 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922007502343, "job": 84, "event": "table_file_deletion", "file_number": 132}
Jan 20 10:13:27 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:13:27.412179) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:13:27 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:13:27.502424) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:13:27 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:13:27.502433) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:13:27 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:13:27.502434) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:13:27 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:13:27.502436) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:13:27 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:13:27.502438) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:13:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:13:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:27.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:13:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e397 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:13:28 np0005588920 nova_compute[226886]: 2026-01-20 15:13:28.111 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:28.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:29.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:30.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:31.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:32.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:32 np0005588920 nova_compute[226886]: 2026-01-20 15:13:32.484 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e397 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:13:33 np0005588920 nova_compute[226886]: 2026-01-20 15:13:33.114 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:33.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:33 np0005588920 podman[294140]: 2026-01-20 15:13:33.990137775 +0000 UTC m=+0.075460699 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 20 10:13:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:34.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:35 np0005588920 nova_compute[226886]: 2026-01-20 15:13:35.774 226890 DEBUG oslo_concurrency.lockutils [None req-55e75d44-ae5d-4016-81f4-b4edb7c269d7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Acquiring lock "389ff8e0-c114-4960-9561-f6ffef743efa" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:13:35 np0005588920 nova_compute[226886]: 2026-01-20 15:13:35.775 226890 DEBUG oslo_concurrency.lockutils [None req-55e75d44-ae5d-4016-81f4-b4edb7c269d7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "389ff8e0-c114-4960-9561-f6ffef743efa" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:13:35 np0005588920 nova_compute[226886]: 2026-01-20 15:13:35.776 226890 DEBUG oslo_concurrency.lockutils [None req-55e75d44-ae5d-4016-81f4-b4edb7c269d7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Acquiring lock "389ff8e0-c114-4960-9561-f6ffef743efa-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:13:35 np0005588920 nova_compute[226886]: 2026-01-20 15:13:35.776 226890 DEBUG oslo_concurrency.lockutils [None req-55e75d44-ae5d-4016-81f4-b4edb7c269d7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "389ff8e0-c114-4960-9561-f6ffef743efa-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:13:35 np0005588920 nova_compute[226886]: 2026-01-20 15:13:35.776 226890 DEBUG oslo_concurrency.lockutils [None req-55e75d44-ae5d-4016-81f4-b4edb7c269d7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "389ff8e0-c114-4960-9561-f6ffef743efa-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:13:35 np0005588920 nova_compute[226886]: 2026-01-20 15:13:35.777 226890 INFO nova.compute.manager [None req-55e75d44-ae5d-4016-81f4-b4edb7c269d7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Terminating instance#033[00m
Jan 20 10:13:35 np0005588920 nova_compute[226886]: 2026-01-20 15:13:35.778 226890 DEBUG nova.compute.manager [None req-55e75d44-ae5d-4016-81f4-b4edb7c269d7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:13:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:35.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:35 np0005588920 kernel: tapc453c6fa-f9 (unregistering): left promiscuous mode
Jan 20 10:13:35 np0005588920 NetworkManager[49076]: <info>  [1768922015.8318] device (tapc453c6fa-f9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:13:35 np0005588920 ovn_controller[133971]: 2026-01-20T15:13:35Z|00846|binding|INFO|Releasing lport c453c6fa-f968-46b1-ae72-cd74d3c7dc02 from this chassis (sb_readonly=0)
Jan 20 10:13:35 np0005588920 ovn_controller[133971]: 2026-01-20T15:13:35Z|00847|binding|INFO|Setting lport c453c6fa-f968-46b1-ae72-cd74d3c7dc02 down in Southbound
Jan 20 10:13:35 np0005588920 nova_compute[226886]: 2026-01-20 15:13:35.874 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:35 np0005588920 ovn_controller[133971]: 2026-01-20T15:13:35Z|00848|binding|INFO|Removing iface tapc453c6fa-f9 ovn-installed in OVS
Jan 20 10:13:35 np0005588920 nova_compute[226886]: 2026-01-20 15:13:35.878 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:35.882 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:57:1b:b8 10.100.0.3'], port_security=['fa:16:3e:57:1b:b8 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '389ff8e0-c114-4960-9561-f6ffef743efa', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b677f1a9-dbaa-4373-8466-bd9ccf067b91', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a49638950e1543fa8e0d251af5479623', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c29da5ec-6cb2-4047-ba89-70fa67a96476', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.172'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=76ec1139-009f-49fe-bfde-07c0ef9e8b12, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=c453c6fa-f968-46b1-ae72-cd74d3c7dc02) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:13:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:35.883 144128 INFO neutron.agent.ovn.metadata.agent [-] Port c453c6fa-f968-46b1-ae72-cd74d3c7dc02 in datapath b677f1a9-dbaa-4373-8466-bd9ccf067b91 unbound from our chassis#033[00m
Jan 20 10:13:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:35.884 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b677f1a9-dbaa-4373-8466-bd9ccf067b91, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:13:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:35.885 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[45a20dab-010e-4695-aac0-b70568178205]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:35 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:35.885 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91 namespace which is not needed anymore#033[00m
Jan 20 10:13:35 np0005588920 nova_compute[226886]: 2026-01-20 15:13:35.895 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:35 np0005588920 systemd[1]: machine-qemu\x2d87\x2dinstance\x2d000000b5.scope: Deactivated successfully.
Jan 20 10:13:35 np0005588920 systemd[1]: machine-qemu\x2d87\x2dinstance\x2d000000b5.scope: Consumed 13.021s CPU time.
Jan 20 10:13:35 np0005588920 systemd-machined[196121]: Machine qemu-87-instance-000000b5 terminated.
Jan 20 10:13:36 np0005588920 neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91[294094]: [NOTICE]   (294098) : haproxy version is 2.8.14-c23fe91
Jan 20 10:13:36 np0005588920 neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91[294094]: [NOTICE]   (294098) : path to executable is /usr/sbin/haproxy
Jan 20 10:13:36 np0005588920 neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91[294094]: [WARNING]  (294098) : Exiting Master process...
Jan 20 10:13:36 np0005588920 neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91[294094]: [ALERT]    (294098) : Current worker (294100) exited with code 143 (Terminated)
Jan 20 10:13:36 np0005588920 neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91[294094]: [WARNING]  (294098) : All workers exited. Exiting... (0)
Jan 20 10:13:36 np0005588920 systemd[1]: libpod-101b6185e50cc0ea2802594fd4e6f0d3156a37f8520a107586fabbd8f55a1bd9.scope: Deactivated successfully.
Jan 20 10:13:36 np0005588920 podman[294314]: 2026-01-20 15:13:36.012974011 +0000 UTC m=+0.043953891 container died 101b6185e50cc0ea2802594fd4e6f0d3156a37f8520a107586fabbd8f55a1bd9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 20 10:13:36 np0005588920 nova_compute[226886]: 2026-01-20 15:13:36.014 226890 INFO nova.virt.libvirt.driver [-] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Instance destroyed successfully.#033[00m
Jan 20 10:13:36 np0005588920 nova_compute[226886]: 2026-01-20 15:13:36.015 226890 DEBUG nova.objects.instance [None req-55e75d44-ae5d-4016-81f4-b4edb7c269d7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lazy-loading 'resources' on Instance uuid 389ff8e0-c114-4960-9561-f6ffef743efa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:13:36 np0005588920 nova_compute[226886]: 2026-01-20 15:13:36.027 226890 DEBUG nova.virt.libvirt.vif [None req-55e75d44-ae5d-4016-81f4-b4edb7c269d7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:13:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-server-894482498',display_name='tempest-TestVolumeBootPattern-server-894482498',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testvolumebootpattern-server-894482498',id=181,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBjkv3PM31l7/LOeidHCDov4vvdGwqOT15IVVbWearXBCn3jQz2xB6ix8iz1XP+iiPXyhWuw0LpMPT9jQN2b0mvhqeZTHErGcz1VZLskRcT6iqcekmFxWykFxr44bv68XA==',key_name='tempest-TestVolumeBootPattern-474773317',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:13:14Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a49638950e1543fa8e0d251af5479623',ramdisk_id='',reservation_id='r-m1u5zzf4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-TestVolumeBootPattern-194644003',owner_user_name='tempest-TestVolumeBootPattern-194644003-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:13:14Z,user_data=None,user_id='bf422e55e158420cbdae75f07a3bb97a',uuid=389ff8e0-c114-4960-9561-f6ffef743efa,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c453c6fa-f968-46b1-ae72-cd74d3c7dc02", "address": "fa:16:3e:57:1b:b8", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": 
[{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc453c6fa-f9", "ovs_interfaceid": "c453c6fa-f968-46b1-ae72-cd74d3c7dc02", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:13:36 np0005588920 nova_compute[226886]: 2026-01-20 15:13:36.027 226890 DEBUG nova.network.os_vif_util [None req-55e75d44-ae5d-4016-81f4-b4edb7c269d7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Converting VIF {"id": "c453c6fa-f968-46b1-ae72-cd74d3c7dc02", "address": "fa:16:3e:57:1b:b8", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc453c6fa-f9", "ovs_interfaceid": "c453c6fa-f968-46b1-ae72-cd74d3c7dc02", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:13:36 np0005588920 nova_compute[226886]: 2026-01-20 15:13:36.028 226890 DEBUG nova.network.os_vif_util [None req-55e75d44-ae5d-4016-81f4-b4edb7c269d7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:57:1b:b8,bridge_name='br-int',has_traffic_filtering=True,id=c453c6fa-f968-46b1-ae72-cd74d3c7dc02,network=Network(b677f1a9-dbaa-4373-8466-bd9ccf067b91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc453c6fa-f9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:13:36 np0005588920 nova_compute[226886]: 2026-01-20 15:13:36.029 226890 DEBUG os_vif [None req-55e75d44-ae5d-4016-81f4-b4edb7c269d7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:57:1b:b8,bridge_name='br-int',has_traffic_filtering=True,id=c453c6fa-f968-46b1-ae72-cd74d3c7dc02,network=Network(b677f1a9-dbaa-4373-8466-bd9ccf067b91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc453c6fa-f9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:13:36 np0005588920 nova_compute[226886]: 2026-01-20 15:13:36.033 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:36 np0005588920 nova_compute[226886]: 2026-01-20 15:13:36.034 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc453c6fa-f9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:13:36 np0005588920 nova_compute[226886]: 2026-01-20 15:13:36.036 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:36 np0005588920 nova_compute[226886]: 2026-01-20 15:13:36.037 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:36 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-101b6185e50cc0ea2802594fd4e6f0d3156a37f8520a107586fabbd8f55a1bd9-userdata-shm.mount: Deactivated successfully.
Jan 20 10:13:36 np0005588920 nova_compute[226886]: 2026-01-20 15:13:36.043 226890 INFO os_vif [None req-55e75d44-ae5d-4016-81f4-b4edb7c269d7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:57:1b:b8,bridge_name='br-int',has_traffic_filtering=True,id=c453c6fa-f968-46b1-ae72-cd74d3c7dc02,network=Network(b677f1a9-dbaa-4373-8466-bd9ccf067b91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc453c6fa-f9')#033[00m
Jan 20 10:13:36 np0005588920 systemd[1]: var-lib-containers-storage-overlay-187446bd27bae25bde7c05fdfe068635431d83f727e53f56f9107578c538690e-merged.mount: Deactivated successfully.
Jan 20 10:13:36 np0005588920 podman[294314]: 2026-01-20 15:13:36.050833373 +0000 UTC m=+0.081813253 container cleanup 101b6185e50cc0ea2802594fd4e6f0d3156a37f8520a107586fabbd8f55a1bd9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 20 10:13:36 np0005588920 systemd[1]: libpod-conmon-101b6185e50cc0ea2802594fd4e6f0d3156a37f8520a107586fabbd8f55a1bd9.scope: Deactivated successfully.
Jan 20 10:13:36 np0005588920 podman[294367]: 2026-01-20 15:13:36.108571984 +0000 UTC m=+0.037003168 container remove 101b6185e50cc0ea2802594fd4e6f0d3156a37f8520a107586fabbd8f55a1bd9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 20 10:13:36 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:36.115 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[62027b32-4d2f-4c30-aece-aa903485fe72]: (4, ('Tue Jan 20 03:13:35 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91 (101b6185e50cc0ea2802594fd4e6f0d3156a37f8520a107586fabbd8f55a1bd9)\n101b6185e50cc0ea2802594fd4e6f0d3156a37f8520a107586fabbd8f55a1bd9\nTue Jan 20 03:13:36 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91 (101b6185e50cc0ea2802594fd4e6f0d3156a37f8520a107586fabbd8f55a1bd9)\n101b6185e50cc0ea2802594fd4e6f0d3156a37f8520a107586fabbd8f55a1bd9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:36 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:36.116 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[533b81d3-4103-4dea-a176-f94896c58655]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:36 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:36.117 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb677f1a9-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:13:36 np0005588920 kernel: tapb677f1a9-d0: left promiscuous mode
Jan 20 10:13:36 np0005588920 nova_compute[226886]: 2026-01-20 15:13:36.119 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:36 np0005588920 nova_compute[226886]: 2026-01-20 15:13:36.132 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:36 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:36.134 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5153cebb-ad19-431a-bff9-57588b1a891f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:36 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:36.148 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d03af53d-aefc-428f-90c6-84810eb9a016]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:36 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:36.150 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[75986c47-9ed4-4dd4-ab46-dedf47df42a5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:36 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:36.166 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[128042a7-f58c-477c-91ac-8621702ecae4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 703583, 'reachable_time': 30095, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 294386, 'error': None, 'target': 'ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:36 np0005588920 systemd[1]: run-netns-ovnmeta\x2db677f1a9\x2ddbaa\x2d4373\x2d8466\x2dbd9ccf067b91.mount: Deactivated successfully.
Jan 20 10:13:36 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:36.169 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:13:36 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:36.169 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[6cbc4b6e-9a28-4728-ae84-18229ddc1965]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:36.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:36 np0005588920 nova_compute[226886]: 2026-01-20 15:13:36.254 226890 INFO nova.virt.libvirt.driver [None req-55e75d44-ae5d-4016-81f4-b4edb7c269d7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Deleting instance files /var/lib/nova/instances/389ff8e0-c114-4960-9561-f6ffef743efa_del#033[00m
Jan 20 10:13:36 np0005588920 nova_compute[226886]: 2026-01-20 15:13:36.254 226890 INFO nova.virt.libvirt.driver [None req-55e75d44-ae5d-4016-81f4-b4edb7c269d7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Deletion of /var/lib/nova/instances/389ff8e0-c114-4960-9561-f6ffef743efa_del complete#033[00m
Jan 20 10:13:36 np0005588920 nova_compute[226886]: 2026-01-20 15:13:36.346 226890 INFO nova.compute.manager [None req-55e75d44-ae5d-4016-81f4-b4edb7c269d7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Took 0.57 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:13:36 np0005588920 nova_compute[226886]: 2026-01-20 15:13:36.347 226890 DEBUG oslo.service.loopingcall [None req-55e75d44-ae5d-4016-81f4-b4edb7c269d7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:13:36 np0005588920 nova_compute[226886]: 2026-01-20 15:13:36.348 226890 DEBUG nova.compute.manager [-] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:13:36 np0005588920 nova_compute[226886]: 2026-01-20 15:13:36.348 226890 DEBUG nova.network.neutron [-] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:13:36 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:13:36 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:13:36 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:13:36 np0005588920 nova_compute[226886]: 2026-01-20 15:13:36.452 226890 DEBUG nova.compute.manager [req-010f4833-51a2-4bc8-aa73-a8e0991b4476 req-79832a42-b5ed-4cf9-a046-684a26d4a521 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Received event network-vif-unplugged-c453c6fa-f968-46b1-ae72-cd74d3c7dc02 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:13:36 np0005588920 nova_compute[226886]: 2026-01-20 15:13:36.452 226890 DEBUG oslo_concurrency.lockutils [req-010f4833-51a2-4bc8-aa73-a8e0991b4476 req-79832a42-b5ed-4cf9-a046-684a26d4a521 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "389ff8e0-c114-4960-9561-f6ffef743efa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:13:36 np0005588920 nova_compute[226886]: 2026-01-20 15:13:36.452 226890 DEBUG oslo_concurrency.lockutils [req-010f4833-51a2-4bc8-aa73-a8e0991b4476 req-79832a42-b5ed-4cf9-a046-684a26d4a521 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "389ff8e0-c114-4960-9561-f6ffef743efa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:13:36 np0005588920 nova_compute[226886]: 2026-01-20 15:13:36.452 226890 DEBUG oslo_concurrency.lockutils [req-010f4833-51a2-4bc8-aa73-a8e0991b4476 req-79832a42-b5ed-4cf9-a046-684a26d4a521 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "389ff8e0-c114-4960-9561-f6ffef743efa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:13:36 np0005588920 nova_compute[226886]: 2026-01-20 15:13:36.453 226890 DEBUG nova.compute.manager [req-010f4833-51a2-4bc8-aa73-a8e0991b4476 req-79832a42-b5ed-4cf9-a046-684a26d4a521 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] No waiting events found dispatching network-vif-unplugged-c453c6fa-f968-46b1-ae72-cd74d3c7dc02 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:13:36 np0005588920 nova_compute[226886]: 2026-01-20 15:13:36.453 226890 DEBUG nova.compute.manager [req-010f4833-51a2-4bc8-aa73-a8e0991b4476 req-79832a42-b5ed-4cf9-a046-684a26d4a521 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Received event network-vif-unplugged-c453c6fa-f968-46b1-ae72-cd74d3c7dc02 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:13:37 np0005588920 nova_compute[226886]: 2026-01-20 15:13:37.128 226890 DEBUG nova.network.neutron [-] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:13:37 np0005588920 nova_compute[226886]: 2026-01-20 15:13:37.151 226890 INFO nova.compute.manager [-] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Took 0.80 seconds to deallocate network for instance.#033[00m
Jan 20 10:13:37 np0005588920 nova_compute[226886]: 2026-01-20 15:13:37.486 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:37 np0005588920 nova_compute[226886]: 2026-01-20 15:13:37.500 226890 DEBUG nova.compute.manager [req-41466812-7d4c-43f4-8339-79dab34429ed req-d01fd61b-676c-4662-a02b-d4fd7c81139c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Received event network-vif-deleted-c453c6fa-f968-46b1-ae72-cd74d3c7dc02 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:13:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:37.511 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=59, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=58) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:13:37 np0005588920 nova_compute[226886]: 2026-01-20 15:13:37.511 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:37.512 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:13:37 np0005588920 nova_compute[226886]: 2026-01-20 15:13:37.642 226890 INFO nova.compute.manager [None req-55e75d44-ae5d-4016-81f4-b4edb7c269d7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Took 0.49 seconds to detach 1 volumes for instance.#033[00m
Jan 20 10:13:37 np0005588920 nova_compute[226886]: 2026-01-20 15:13:37.683 226890 DEBUG oslo_concurrency.lockutils [None req-55e75d44-ae5d-4016-81f4-b4edb7c269d7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:13:37 np0005588920 nova_compute[226886]: 2026-01-20 15:13:37.684 226890 DEBUG oslo_concurrency.lockutils [None req-55e75d44-ae5d-4016-81f4-b4edb7c269d7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:13:37 np0005588920 nova_compute[226886]: 2026-01-20 15:13:37.754 226890 DEBUG oslo_concurrency.processutils [None req-55e75d44-ae5d-4016-81f4-b4edb7c269d7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:13:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:37.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e397 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:13:38 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:13:38 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/152969632' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:13:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:38.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:38 np0005588920 nova_compute[226886]: 2026-01-20 15:13:38.198 226890 DEBUG oslo_concurrency.processutils [None req-55e75d44-ae5d-4016-81f4-b4edb7c269d7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:13:38 np0005588920 nova_compute[226886]: 2026-01-20 15:13:38.203 226890 DEBUG nova.compute.provider_tree [None req-55e75d44-ae5d-4016-81f4-b4edb7c269d7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:13:38 np0005588920 nova_compute[226886]: 2026-01-20 15:13:38.223 226890 DEBUG nova.scheduler.client.report [None req-55e75d44-ae5d-4016-81f4-b4edb7c269d7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:13:38 np0005588920 nova_compute[226886]: 2026-01-20 15:13:38.248 226890 DEBUG oslo_concurrency.lockutils [None req-55e75d44-ae5d-4016-81f4-b4edb7c269d7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.564s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:13:38 np0005588920 nova_compute[226886]: 2026-01-20 15:13:38.269 226890 INFO nova.scheduler.client.report [None req-55e75d44-ae5d-4016-81f4-b4edb7c269d7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Deleted allocations for instance 389ff8e0-c114-4960-9561-f6ffef743efa#033[00m
Jan 20 10:13:38 np0005588920 nova_compute[226886]: 2026-01-20 15:13:38.338 226890 DEBUG oslo_concurrency.lockutils [None req-55e75d44-ae5d-4016-81f4-b4edb7c269d7 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "389ff8e0-c114-4960-9561-f6ffef743efa" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.562s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:13:38 np0005588920 nova_compute[226886]: 2026-01-20 15:13:38.533 226890 DEBUG nova.compute.manager [req-ea504aea-9948-4a76-ba7d-5abd2b8a9f73 req-9d2592cb-aa40-40a3-966b-d56a1567a3f5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Received event network-vif-plugged-c453c6fa-f968-46b1-ae72-cd74d3c7dc02 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:13:38 np0005588920 nova_compute[226886]: 2026-01-20 15:13:38.533 226890 DEBUG oslo_concurrency.lockutils [req-ea504aea-9948-4a76-ba7d-5abd2b8a9f73 req-9d2592cb-aa40-40a3-966b-d56a1567a3f5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "389ff8e0-c114-4960-9561-f6ffef743efa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:13:38 np0005588920 nova_compute[226886]: 2026-01-20 15:13:38.534 226890 DEBUG oslo_concurrency.lockutils [req-ea504aea-9948-4a76-ba7d-5abd2b8a9f73 req-9d2592cb-aa40-40a3-966b-d56a1567a3f5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "389ff8e0-c114-4960-9561-f6ffef743efa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:13:38 np0005588920 nova_compute[226886]: 2026-01-20 15:13:38.534 226890 DEBUG oslo_concurrency.lockutils [req-ea504aea-9948-4a76-ba7d-5abd2b8a9f73 req-9d2592cb-aa40-40a3-966b-d56a1567a3f5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "389ff8e0-c114-4960-9561-f6ffef743efa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:13:38 np0005588920 nova_compute[226886]: 2026-01-20 15:13:38.534 226890 DEBUG nova.compute.manager [req-ea504aea-9948-4a76-ba7d-5abd2b8a9f73 req-9d2592cb-aa40-40a3-966b-d56a1567a3f5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] No waiting events found dispatching network-vif-plugged-c453c6fa-f968-46b1-ae72-cd74d3c7dc02 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:13:38 np0005588920 nova_compute[226886]: 2026-01-20 15:13:38.534 226890 WARNING nova.compute.manager [req-ea504aea-9948-4a76-ba7d-5abd2b8a9f73 req-9d2592cb-aa40-40a3-966b-d56a1567a3f5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Received unexpected event network-vif-plugged-c453c6fa-f968-46b1-ae72-cd74d3c7dc02 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 10:13:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:13:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:39.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:13:40 np0005588920 nova_compute[226886]: 2026-01-20 15:13:40.154 226890 DEBUG oslo_concurrency.lockutils [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquiring lock "810e72a9-536d-4214-956b-9d5216cce8ff" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:13:40 np0005588920 nova_compute[226886]: 2026-01-20 15:13:40.154 226890 DEBUG oslo_concurrency.lockutils [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "810e72a9-536d-4214-956b-9d5216cce8ff" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:13:40 np0005588920 nova_compute[226886]: 2026-01-20 15:13:40.168 226890 DEBUG nova.compute.manager [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 10:13:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:40.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:40 np0005588920 nova_compute[226886]: 2026-01-20 15:13:40.234 226890 DEBUG oslo_concurrency.lockutils [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:13:40 np0005588920 nova_compute[226886]: 2026-01-20 15:13:40.234 226890 DEBUG oslo_concurrency.lockutils [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:13:40 np0005588920 nova_compute[226886]: 2026-01-20 15:13:40.239 226890 DEBUG nova.virt.hardware [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 10:13:40 np0005588920 nova_compute[226886]: 2026-01-20 15:13:40.239 226890 INFO nova.compute.claims [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 10:13:40 np0005588920 nova_compute[226886]: 2026-01-20 15:13:40.349 226890 DEBUG oslo_concurrency.processutils [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:13:40 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:40.514 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '59'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:13:40 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:13:40 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/40024272' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:13:40 np0005588920 nova_compute[226886]: 2026-01-20 15:13:40.812 226890 DEBUG oslo_concurrency.processutils [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:13:40 np0005588920 nova_compute[226886]: 2026-01-20 15:13:40.817 226890 DEBUG nova.compute.provider_tree [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:13:40 np0005588920 nova_compute[226886]: 2026-01-20 15:13:40.832 226890 DEBUG nova.scheduler.client.report [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:13:40 np0005588920 nova_compute[226886]: 2026-01-20 15:13:40.855 226890 DEBUG oslo_concurrency.lockutils [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.620s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:13:40 np0005588920 nova_compute[226886]: 2026-01-20 15:13:40.855 226890 DEBUG nova.compute.manager [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 10:13:40 np0005588920 nova_compute[226886]: 2026-01-20 15:13:40.894 226890 DEBUG nova.compute.manager [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 10:13:40 np0005588920 nova_compute[226886]: 2026-01-20 15:13:40.895 226890 DEBUG nova.network.neutron [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 10:13:40 np0005588920 nova_compute[226886]: 2026-01-20 15:13:40.910 226890 INFO nova.virt.libvirt.driver [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 10:13:40 np0005588920 nova_compute[226886]: 2026-01-20 15:13:40.925 226890 DEBUG nova.compute.manager [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 10:13:41 np0005588920 nova_compute[226886]: 2026-01-20 15:13:41.022 226890 DEBUG nova.compute.manager [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 10:13:41 np0005588920 nova_compute[226886]: 2026-01-20 15:13:41.024 226890 DEBUG nova.virt.libvirt.driver [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 10:13:41 np0005588920 nova_compute[226886]: 2026-01-20 15:13:41.024 226890 INFO nova.virt.libvirt.driver [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Creating image(s)#033[00m
Jan 20 10:13:41 np0005588920 nova_compute[226886]: 2026-01-20 15:13:41.054 226890 DEBUG nova.storage.rbd_utils [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] rbd image 810e72a9-536d-4214-956b-9d5216cce8ff_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:13:41 np0005588920 nova_compute[226886]: 2026-01-20 15:13:41.086 226890 DEBUG nova.storage.rbd_utils [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] rbd image 810e72a9-536d-4214-956b-9d5216cce8ff_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:13:41 np0005588920 nova_compute[226886]: 2026-01-20 15:13:41.114 226890 DEBUG nova.storage.rbd_utils [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] rbd image 810e72a9-536d-4214-956b-9d5216cce8ff_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:13:41 np0005588920 nova_compute[226886]: 2026-01-20 15:13:41.117 226890 DEBUG oslo_concurrency.processutils [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:13:41 np0005588920 nova_compute[226886]: 2026-01-20 15:13:41.144 226890 DEBUG nova.policy [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e9cc4ce3e069479ba9c789b378a68a1d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fff727019f86407498e83d7948d54962', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 10:13:41 np0005588920 nova_compute[226886]: 2026-01-20 15:13:41.147 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:41 np0005588920 nova_compute[226886]: 2026-01-20 15:13:41.181 226890 DEBUG oslo_concurrency.processutils [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:13:41 np0005588920 nova_compute[226886]: 2026-01-20 15:13:41.183 226890 DEBUG oslo_concurrency.lockutils [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:13:41 np0005588920 nova_compute[226886]: 2026-01-20 15:13:41.183 226890 DEBUG oslo_concurrency.lockutils [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:13:41 np0005588920 nova_compute[226886]: 2026-01-20 15:13:41.184 226890 DEBUG oslo_concurrency.lockutils [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:13:41 np0005588920 nova_compute[226886]: 2026-01-20 15:13:41.210 226890 DEBUG nova.storage.rbd_utils [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] rbd image 810e72a9-536d-4214-956b-9d5216cce8ff_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:13:41 np0005588920 nova_compute[226886]: 2026-01-20 15:13:41.215 226890 DEBUG oslo_concurrency.processutils [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 810e72a9-536d-4214-956b-9d5216cce8ff_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:13:41 np0005588920 nova_compute[226886]: 2026-01-20 15:13:41.489 226890 DEBUG oslo_concurrency.processutils [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 810e72a9-536d-4214-956b-9d5216cce8ff_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.273s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:13:41 np0005588920 nova_compute[226886]: 2026-01-20 15:13:41.573 226890 DEBUG nova.storage.rbd_utils [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] resizing rbd image 810e72a9-536d-4214-956b-9d5216cce8ff_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 10:13:41 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:13:41 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:13:41 np0005588920 nova_compute[226886]: 2026-01-20 15:13:41.670 226890 DEBUG nova.objects.instance [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lazy-loading 'migration_context' on Instance uuid 810e72a9-536d-4214-956b-9d5216cce8ff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:13:41 np0005588920 nova_compute[226886]: 2026-01-20 15:13:41.682 226890 DEBUG nova.virt.libvirt.driver [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 10:13:41 np0005588920 nova_compute[226886]: 2026-01-20 15:13:41.683 226890 DEBUG nova.virt.libvirt.driver [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Ensure instance console log exists: /var/lib/nova/instances/810e72a9-536d-4214-956b-9d5216cce8ff/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:13:41 np0005588920 nova_compute[226886]: 2026-01-20 15:13:41.683 226890 DEBUG oslo_concurrency.lockutils [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:13:41 np0005588920 nova_compute[226886]: 2026-01-20 15:13:41.683 226890 DEBUG oslo_concurrency.lockutils [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:13:41 np0005588920 nova_compute[226886]: 2026-01-20 15:13:41.684 226890 DEBUG oslo_concurrency.lockutils [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:13:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:13:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:41.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:13:42 np0005588920 nova_compute[226886]: 2026-01-20 15:13:42.035 226890 DEBUG nova.network.neutron [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Successfully created port: 0c07d11d-c06a-497a-9dbd-975adce07e97 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 10:13:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 10:13:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:42.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 10:13:42 np0005588920 nova_compute[226886]: 2026-01-20 15:13:42.489 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:42 np0005588920 nova_compute[226886]: 2026-01-20 15:13:42.796 226890 DEBUG nova.network.neutron [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Successfully updated port: 0c07d11d-c06a-497a-9dbd-975adce07e97 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 10:13:42 np0005588920 nova_compute[226886]: 2026-01-20 15:13:42.809 226890 DEBUG oslo_concurrency.lockutils [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquiring lock "refresh_cache-810e72a9-536d-4214-956b-9d5216cce8ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:13:42 np0005588920 nova_compute[226886]: 2026-01-20 15:13:42.809 226890 DEBUG oslo_concurrency.lockutils [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquired lock "refresh_cache-810e72a9-536d-4214-956b-9d5216cce8ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:13:42 np0005588920 nova_compute[226886]: 2026-01-20 15:13:42.809 226890 DEBUG nova.network.neutron [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:13:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e397 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:13:42 np0005588920 nova_compute[226886]: 2026-01-20 15:13:42.870 226890 DEBUG nova.compute.manager [req-a2878848-cc1b-4f91-b28d-04b8d229c4d4 req-c8c2a812-9ce8-4a54-8de4-ff61fc938ee4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Received event network-changed-0c07d11d-c06a-497a-9dbd-975adce07e97 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:13:42 np0005588920 nova_compute[226886]: 2026-01-20 15:13:42.871 226890 DEBUG nova.compute.manager [req-a2878848-cc1b-4f91-b28d-04b8d229c4d4 req-c8c2a812-9ce8-4a54-8de4-ff61fc938ee4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Refreshing instance network info cache due to event network-changed-0c07d11d-c06a-497a-9dbd-975adce07e97. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:13:42 np0005588920 nova_compute[226886]: 2026-01-20 15:13:42.871 226890 DEBUG oslo_concurrency.lockutils [req-a2878848-cc1b-4f91-b28d-04b8d229c4d4 req-c8c2a812-9ce8-4a54-8de4-ff61fc938ee4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-810e72a9-536d-4214-956b-9d5216cce8ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:13:42 np0005588920 nova_compute[226886]: 2026-01-20 15:13:42.930 226890 DEBUG nova.network.neutron [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:13:43 np0005588920 nova_compute[226886]: 2026-01-20 15:13:43.762 226890 DEBUG nova.network.neutron [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Updating instance_info_cache with network_info: [{"id": "0c07d11d-c06a-497a-9dbd-975adce07e97", "address": "fa:16:3e:cc:d4:e6", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c07d11d-c0", "ovs_interfaceid": "0c07d11d-c06a-497a-9dbd-975adce07e97", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:13:43 np0005588920 nova_compute[226886]: 2026-01-20 15:13:43.808 226890 DEBUG oslo_concurrency.lockutils [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Releasing lock "refresh_cache-810e72a9-536d-4214-956b-9d5216cce8ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:13:43 np0005588920 nova_compute[226886]: 2026-01-20 15:13:43.808 226890 DEBUG nova.compute.manager [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Instance network_info: |[{"id": "0c07d11d-c06a-497a-9dbd-975adce07e97", "address": "fa:16:3e:cc:d4:e6", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c07d11d-c0", "ovs_interfaceid": "0c07d11d-c06a-497a-9dbd-975adce07e97", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 10:13:43 np0005588920 nova_compute[226886]: 2026-01-20 15:13:43.808 226890 DEBUG oslo_concurrency.lockutils [req-a2878848-cc1b-4f91-b28d-04b8d229c4d4 req-c8c2a812-9ce8-4a54-8de4-ff61fc938ee4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-810e72a9-536d-4214-956b-9d5216cce8ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:13:43 np0005588920 nova_compute[226886]: 2026-01-20 15:13:43.809 226890 DEBUG nova.network.neutron [req-a2878848-cc1b-4f91-b28d-04b8d229c4d4 req-c8c2a812-9ce8-4a54-8de4-ff61fc938ee4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Refreshing network info cache for port 0c07d11d-c06a-497a-9dbd-975adce07e97 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:13:43 np0005588920 nova_compute[226886]: 2026-01-20 15:13:43.811 226890 DEBUG nova.virt.libvirt.driver [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Start _get_guest_xml network_info=[{"id": "0c07d11d-c06a-497a-9dbd-975adce07e97", "address": "fa:16:3e:cc:d4:e6", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c07d11d-c0", "ovs_interfaceid": "0c07d11d-c06a-497a-9dbd-975adce07e97", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:13:43 np0005588920 nova_compute[226886]: 2026-01-20 15:13:43.816 226890 WARNING nova.virt.libvirt.driver [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:13:43 np0005588920 nova_compute[226886]: 2026-01-20 15:13:43.821 226890 DEBUG nova.virt.libvirt.host [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:13:43 np0005588920 nova_compute[226886]: 2026-01-20 15:13:43.822 226890 DEBUG nova.virt.libvirt.host [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:13:43 np0005588920 nova_compute[226886]: 2026-01-20 15:13:43.828 226890 DEBUG nova.virt.libvirt.host [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:13:43 np0005588920 nova_compute[226886]: 2026-01-20 15:13:43.829 226890 DEBUG nova.virt.libvirt.host [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:13:43 np0005588920 nova_compute[226886]: 2026-01-20 15:13:43.830 226890 DEBUG nova.virt.libvirt.driver [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:13:43 np0005588920 nova_compute[226886]: 2026-01-20 15:13:43.830 226890 DEBUG nova.virt.hardware [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:13:43 np0005588920 nova_compute[226886]: 2026-01-20 15:13:43.830 226890 DEBUG nova.virt.hardware [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:13:43 np0005588920 nova_compute[226886]: 2026-01-20 15:13:43.830 226890 DEBUG nova.virt.hardware [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:13:43 np0005588920 nova_compute[226886]: 2026-01-20 15:13:43.831 226890 DEBUG nova.virt.hardware [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:13:43 np0005588920 nova_compute[226886]: 2026-01-20 15:13:43.831 226890 DEBUG nova.virt.hardware [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:13:43 np0005588920 nova_compute[226886]: 2026-01-20 15:13:43.831 226890 DEBUG nova.virt.hardware [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:13:43 np0005588920 nova_compute[226886]: 2026-01-20 15:13:43.831 226890 DEBUG nova.virt.hardware [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:13:43 np0005588920 nova_compute[226886]: 2026-01-20 15:13:43.831 226890 DEBUG nova.virt.hardware [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:13:43 np0005588920 nova_compute[226886]: 2026-01-20 15:13:43.832 226890 DEBUG nova.virt.hardware [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:13:43 np0005588920 nova_compute[226886]: 2026-01-20 15:13:43.832 226890 DEBUG nova.virt.hardware [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:13:43 np0005588920 nova_compute[226886]: 2026-01-20 15:13:43.832 226890 DEBUG nova.virt.hardware [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:13:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:43.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:43 np0005588920 nova_compute[226886]: 2026-01-20 15:13:43.835 226890 DEBUG oslo_concurrency.processutils [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:13:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:44.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:13:44 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/418648228' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:13:44 np0005588920 nova_compute[226886]: 2026-01-20 15:13:44.249 226890 DEBUG oslo_concurrency.processutils [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.414s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:13:44 np0005588920 nova_compute[226886]: 2026-01-20 15:13:44.283 226890 DEBUG nova.storage.rbd_utils [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] rbd image 810e72a9-536d-4214-956b-9d5216cce8ff_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:13:44 np0005588920 nova_compute[226886]: 2026-01-20 15:13:44.290 226890 DEBUG oslo_concurrency.processutils [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:13:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:13:44 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1877180579' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:13:44 np0005588920 nova_compute[226886]: 2026-01-20 15:13:44.738 226890 DEBUG oslo_concurrency.processutils [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:13:44 np0005588920 nova_compute[226886]: 2026-01-20 15:13:44.740 226890 DEBUG nova.virt.libvirt.vif [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:13:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='multiattach-server-0',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='multiattach-server-0',id=184,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL5L2o6o5dLcQyaIfhCZ5CKxQlecqNGmP68oHIQEsVoKIC2qfrMKjObT9GdMU8oznX9LVUwIWCShhlEJu9ZqPiutEL2afEJ1hQQamjERNcx9wWS2NfOgykA4yugQphfOtA==',key_name='tempest-keypair-1568469072',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fff727019f86407498e83d7948d54962',ramdisk_id='',reservation_id='r-fpkdlq3q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',net
work_allocated='True',owner_project_name='tempest-AttachVolumeMultiAttachTest-418194625',owner_user_name='tempest-AttachVolumeMultiAttachTest-418194625-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:13:40Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e9cc4ce3e069479ba9c789b378a68a1d',uuid=810e72a9-536d-4214-956b-9d5216cce8ff,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0c07d11d-c06a-497a-9dbd-975adce07e97", "address": "fa:16:3e:cc:d4:e6", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c07d11d-c0", "ovs_interfaceid": "0c07d11d-c06a-497a-9dbd-975adce07e97", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:13:44 np0005588920 nova_compute[226886]: 2026-01-20 15:13:44.740 226890 DEBUG nova.network.os_vif_util [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Converting VIF {"id": "0c07d11d-c06a-497a-9dbd-975adce07e97", "address": "fa:16:3e:cc:d4:e6", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c07d11d-c0", "ovs_interfaceid": "0c07d11d-c06a-497a-9dbd-975adce07e97", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:13:44 np0005588920 nova_compute[226886]: 2026-01-20 15:13:44.741 226890 DEBUG nova.network.os_vif_util [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cc:d4:e6,bridge_name='br-int',has_traffic_filtering=True,id=0c07d11d-c06a-497a-9dbd-975adce07e97,network=Network(c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0c07d11d-c0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:13:44 np0005588920 nova_compute[226886]: 2026-01-20 15:13:44.742 226890 DEBUG nova.objects.instance [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lazy-loading 'pci_devices' on Instance uuid 810e72a9-536d-4214-956b-9d5216cce8ff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:13:44 np0005588920 nova_compute[226886]: 2026-01-20 15:13:44.756 226890 DEBUG nova.virt.libvirt.driver [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:13:44 np0005588920 nova_compute[226886]:  <uuid>810e72a9-536d-4214-956b-9d5216cce8ff</uuid>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:  <name>instance-000000b8</name>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:13:44 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:      <nova:name>multiattach-server-0</nova:name>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 15:13:43</nova:creationTime>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 10:13:44 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:        <nova:user uuid="e9cc4ce3e069479ba9c789b378a68a1d">tempest-AttachVolumeMultiAttachTest-418194625-project-member</nova:user>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:        <nova:project uuid="fff727019f86407498e83d7948d54962">tempest-AttachVolumeMultiAttachTest-418194625</nova:project>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:        <nova:port uuid="0c07d11d-c06a-497a-9dbd-975adce07e97">
Jan 20 10:13:44 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    <system>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:      <entry name="serial">810e72a9-536d-4214-956b-9d5216cce8ff</entry>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:      <entry name="uuid">810e72a9-536d-4214-956b-9d5216cce8ff</entry>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    </system>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:  <os>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:  </os>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:  <features>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:  </features>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:  </clock>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:  <devices>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 10:13:44 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/810e72a9-536d-4214-956b-9d5216cce8ff_disk">
Jan 20 10:13:44 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:13:44 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 10:13:44 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/810e72a9-536d-4214-956b-9d5216cce8ff_disk.config">
Jan 20 10:13:44 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:13:44 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 10:13:44 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:cc:d4:e6"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:      <target dev="tap0c07d11d-c0"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    </interface>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 10:13:44 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/810e72a9-536d-4214-956b-9d5216cce8ff/console.log" append="off"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    </serial>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    <video>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    </video>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 10:13:44 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    </rng>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 10:13:44 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 10:13:44 np0005588920 nova_compute[226886]:  </devices>
Jan 20 10:13:44 np0005588920 nova_compute[226886]: </domain>
Jan 20 10:13:44 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:13:44 np0005588920 nova_compute[226886]: 2026-01-20 15:13:44.757 226890 DEBUG nova.compute.manager [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Preparing to wait for external event network-vif-plugged-0c07d11d-c06a-497a-9dbd-975adce07e97 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:13:44 np0005588920 nova_compute[226886]: 2026-01-20 15:13:44.758 226890 DEBUG oslo_concurrency.lockutils [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquiring lock "810e72a9-536d-4214-956b-9d5216cce8ff-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:13:44 np0005588920 nova_compute[226886]: 2026-01-20 15:13:44.758 226890 DEBUG oslo_concurrency.lockutils [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "810e72a9-536d-4214-956b-9d5216cce8ff-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:13:44 np0005588920 nova_compute[226886]: 2026-01-20 15:13:44.758 226890 DEBUG oslo_concurrency.lockutils [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "810e72a9-536d-4214-956b-9d5216cce8ff-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:13:44 np0005588920 nova_compute[226886]: 2026-01-20 15:13:44.759 226890 DEBUG nova.virt.libvirt.vif [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:13:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='multiattach-server-0',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='multiattach-server-0',id=184,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL5L2o6o5dLcQyaIfhCZ5CKxQlecqNGmP68oHIQEsVoKIC2qfrMKjObT9GdMU8oznX9LVUwIWCShhlEJu9ZqPiutEL2afEJ1hQQamjERNcx9wWS2NfOgykA4yugQphfOtA==',key_name='tempest-keypair-1568469072',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fff727019f86407498e83d7948d54962',ramdisk_id='',reservation_id='r-fpkdlq3q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_r
am='0',network_allocated='True',owner_project_name='tempest-AttachVolumeMultiAttachTest-418194625',owner_user_name='tempest-AttachVolumeMultiAttachTest-418194625-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:13:40Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e9cc4ce3e069479ba9c789b378a68a1d',uuid=810e72a9-536d-4214-956b-9d5216cce8ff,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0c07d11d-c06a-497a-9dbd-975adce07e97", "address": "fa:16:3e:cc:d4:e6", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c07d11d-c0", "ovs_interfaceid": "0c07d11d-c06a-497a-9dbd-975adce07e97", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:13:44 np0005588920 nova_compute[226886]: 2026-01-20 15:13:44.759 226890 DEBUG nova.network.os_vif_util [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Converting VIF {"id": "0c07d11d-c06a-497a-9dbd-975adce07e97", "address": "fa:16:3e:cc:d4:e6", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c07d11d-c0", "ovs_interfaceid": "0c07d11d-c06a-497a-9dbd-975adce07e97", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:13:44 np0005588920 nova_compute[226886]: 2026-01-20 15:13:44.760 226890 DEBUG nova.network.os_vif_util [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cc:d4:e6,bridge_name='br-int',has_traffic_filtering=True,id=0c07d11d-c06a-497a-9dbd-975adce07e97,network=Network(c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0c07d11d-c0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:13:44 np0005588920 nova_compute[226886]: 2026-01-20 15:13:44.760 226890 DEBUG os_vif [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cc:d4:e6,bridge_name='br-int',has_traffic_filtering=True,id=0c07d11d-c06a-497a-9dbd-975adce07e97,network=Network(c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0c07d11d-c0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:13:44 np0005588920 nova_compute[226886]: 2026-01-20 15:13:44.761 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:44 np0005588920 nova_compute[226886]: 2026-01-20 15:13:44.761 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:13:44 np0005588920 nova_compute[226886]: 2026-01-20 15:13:44.761 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:13:44 np0005588920 nova_compute[226886]: 2026-01-20 15:13:44.764 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:44 np0005588920 nova_compute[226886]: 2026-01-20 15:13:44.764 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0c07d11d-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:13:44 np0005588920 nova_compute[226886]: 2026-01-20 15:13:44.765 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0c07d11d-c0, col_values=(('external_ids', {'iface-id': '0c07d11d-c06a-497a-9dbd-975adce07e97', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cc:d4:e6', 'vm-uuid': '810e72a9-536d-4214-956b-9d5216cce8ff'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:13:44 np0005588920 nova_compute[226886]: 2026-01-20 15:13:44.766 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:44 np0005588920 NetworkManager[49076]: <info>  [1768922024.7667] manager: (tap0c07d11d-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/401)
Jan 20 10:13:44 np0005588920 nova_compute[226886]: 2026-01-20 15:13:44.768 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:13:44 np0005588920 nova_compute[226886]: 2026-01-20 15:13:44.770 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:44 np0005588920 nova_compute[226886]: 2026-01-20 15:13:44.771 226890 INFO os_vif [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cc:d4:e6,bridge_name='br-int',has_traffic_filtering=True,id=0c07d11d-c06a-497a-9dbd-975adce07e97,network=Network(c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0c07d11d-c0')#033[00m
Jan 20 10:13:44 np0005588920 nova_compute[226886]: 2026-01-20 15:13:44.900 226890 DEBUG nova.virt.libvirt.driver [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:13:44 np0005588920 nova_compute[226886]: 2026-01-20 15:13:44.900 226890 DEBUG nova.virt.libvirt.driver [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:13:44 np0005588920 nova_compute[226886]: 2026-01-20 15:13:44.901 226890 DEBUG nova.virt.libvirt.driver [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] No VIF found with MAC fa:16:3e:cc:d4:e6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:13:44 np0005588920 nova_compute[226886]: 2026-01-20 15:13:44.901 226890 INFO nova.virt.libvirt.driver [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Using config drive#033[00m
Jan 20 10:13:44 np0005588920 nova_compute[226886]: 2026-01-20 15:13:44.932 226890 DEBUG nova.storage.rbd_utils [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] rbd image 810e72a9-536d-4214-956b-9d5216cce8ff_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:13:44 np0005588920 nova_compute[226886]: 2026-01-20 15:13:44.991 226890 DEBUG nova.network.neutron [req-a2878848-cc1b-4f91-b28d-04b8d229c4d4 req-c8c2a812-9ce8-4a54-8de4-ff61fc938ee4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Updated VIF entry in instance network info cache for port 0c07d11d-c06a-497a-9dbd-975adce07e97. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:13:44 np0005588920 nova_compute[226886]: 2026-01-20 15:13:44.991 226890 DEBUG nova.network.neutron [req-a2878848-cc1b-4f91-b28d-04b8d229c4d4 req-c8c2a812-9ce8-4a54-8de4-ff61fc938ee4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Updating instance_info_cache with network_info: [{"id": "0c07d11d-c06a-497a-9dbd-975adce07e97", "address": "fa:16:3e:cc:d4:e6", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c07d11d-c0", "ovs_interfaceid": "0c07d11d-c06a-497a-9dbd-975adce07e97", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:13:45 np0005588920 nova_compute[226886]: 2026-01-20 15:13:45.151 226890 DEBUG oslo_concurrency.lockutils [req-a2878848-cc1b-4f91-b28d-04b8d229c4d4 req-c8c2a812-9ce8-4a54-8de4-ff61fc938ee4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-810e72a9-536d-4214-956b-9d5216cce8ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:13:45 np0005588920 nova_compute[226886]: 2026-01-20 15:13:45.486 226890 INFO nova.virt.libvirt.driver [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Creating config drive at /var/lib/nova/instances/810e72a9-536d-4214-956b-9d5216cce8ff/disk.config#033[00m
Jan 20 10:13:45 np0005588920 nova_compute[226886]: 2026-01-20 15:13:45.492 226890 DEBUG oslo_concurrency.processutils [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/810e72a9-536d-4214-956b-9d5216cce8ff/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphgxe7_06 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:13:45 np0005588920 nova_compute[226886]: 2026-01-20 15:13:45.625 226890 DEBUG oslo_concurrency.processutils [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/810e72a9-536d-4214-956b-9d5216cce8ff/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphgxe7_06" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:13:45 np0005588920 nova_compute[226886]: 2026-01-20 15:13:45.653 226890 DEBUG nova.storage.rbd_utils [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] rbd image 810e72a9-536d-4214-956b-9d5216cce8ff_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:13:45 np0005588920 nova_compute[226886]: 2026-01-20 15:13:45.657 226890 DEBUG oslo_concurrency.processutils [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/810e72a9-536d-4214-956b-9d5216cce8ff/disk.config 810e72a9-536d-4214-956b-9d5216cce8ff_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:13:45 np0005588920 nova_compute[226886]: 2026-01-20 15:13:45.817 226890 DEBUG oslo_concurrency.processutils [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/810e72a9-536d-4214-956b-9d5216cce8ff/disk.config 810e72a9-536d-4214-956b-9d5216cce8ff_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.160s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:13:45 np0005588920 nova_compute[226886]: 2026-01-20 15:13:45.819 226890 INFO nova.virt.libvirt.driver [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Deleting local config drive /var/lib/nova/instances/810e72a9-536d-4214-956b-9d5216cce8ff/disk.config because it was imported into RBD.#033[00m
Jan 20 10:13:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:13:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:45.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:13:45 np0005588920 kernel: tap0c07d11d-c0: entered promiscuous mode
Jan 20 10:13:45 np0005588920 NetworkManager[49076]: <info>  [1768922025.8678] manager: (tap0c07d11d-c0): new Tun device (/org/freedesktop/NetworkManager/Devices/402)
Jan 20 10:13:45 np0005588920 ovn_controller[133971]: 2026-01-20T15:13:45Z|00849|binding|INFO|Claiming lport 0c07d11d-c06a-497a-9dbd-975adce07e97 for this chassis.
Jan 20 10:13:45 np0005588920 ovn_controller[133971]: 2026-01-20T15:13:45Z|00850|binding|INFO|0c07d11d-c06a-497a-9dbd-975adce07e97: Claiming fa:16:3e:cc:d4:e6 10.100.0.14
Jan 20 10:13:45 np0005588920 nova_compute[226886]: 2026-01-20 15:13:45.869 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:45.874 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cc:d4:e6 10.100.0.14'], port_security=['fa:16:3e:cc:d4:e6 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '810e72a9-536d-4214-956b-9d5216cce8ff', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fff727019f86407498e83d7948d54962', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5ace6a2f-56c6-4679-bb81-70ccb27ab312', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=87d69a20-7690-494a-ac16-7c600840561a, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=0c07d11d-c06a-497a-9dbd-975adce07e97) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:13:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:45.875 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 0c07d11d-c06a-497a-9dbd-975adce07e97 in datapath c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab bound to our chassis#033[00m
Jan 20 10:13:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:45.876 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab#033[00m
Jan 20 10:13:45 np0005588920 ovn_controller[133971]: 2026-01-20T15:13:45Z|00851|binding|INFO|Setting lport 0c07d11d-c06a-497a-9dbd-975adce07e97 ovn-installed in OVS
Jan 20 10:13:45 np0005588920 ovn_controller[133971]: 2026-01-20T15:13:45Z|00852|binding|INFO|Setting lport 0c07d11d-c06a-497a-9dbd-975adce07e97 up in Southbound
Jan 20 10:13:45 np0005588920 nova_compute[226886]: 2026-01-20 15:13:45.886 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:45.886 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0e23802c-1a9a-4cee-9edc-428f2646e327]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:45.887 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc1f4a971-01 in ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:13:45 np0005588920 nova_compute[226886]: 2026-01-20 15:13:45.890 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:45.889 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc1f4a971-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:13:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:45.889 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e44b6a48-2781-4f1a-813a-7ca5c4a1fc19]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:45.891 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[6c59a3f4-9c2b-4447-b5e9-8fce6ec2c991]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:45 np0005588920 systemd-machined[196121]: New machine qemu-88-instance-000000b8.
Jan 20 10:13:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:45.903 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[7f2ea64c-9661-420a-96d4-6ada16e16f41]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:45 np0005588920 systemd[1]: Started Virtual Machine qemu-88-instance-000000b8.
Jan 20 10:13:45 np0005588920 systemd-udevd[294787]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:13:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:45.924 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[4960df1a-7275-427b-98b3-73be2db0abfb]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:45 np0005588920 NetworkManager[49076]: <info>  [1768922025.9326] device (tap0c07d11d-c0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:13:45 np0005588920 NetworkManager[49076]: <info>  [1768922025.9337] device (tap0c07d11d-c0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:13:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:45.952 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[7fb45e1c-699b-4ca9-b7bb-05fdffc0560d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:45 np0005588920 systemd-udevd[294790]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:13:45 np0005588920 NetworkManager[49076]: <info>  [1768922025.9583] manager: (tapc1f4a971-00): new Veth device (/org/freedesktop/NetworkManager/Devices/403)
Jan 20 10:13:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:45.957 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a6eda4e2-b75d-4e28-88ce-b5c31f671e91]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:45.991 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[cdd3fc9e-db80-4ac4-9fb0-368b236c21b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:45.994 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[d349c47a-bb2e-40bb-b762-dc375cd9165e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:46 np0005588920 NetworkManager[49076]: <info>  [1768922026.0141] device (tapc1f4a971-00): carrier: link connected
Jan 20 10:13:46 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:46.019 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[1d2b1756-3e5b-4940-badf-fde05e15f3f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:46 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:46.035 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[7c19b9e5-9e09-451b-bd1d-25d39c4c6e0a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc1f4a971-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fc:30:f0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 271], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 706770, 'reachable_time': 35529, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 294817, 'error': None, 'target': 'ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:46 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:46.048 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[7c24d95c-e718-43e9-a3b9-5c43c6a4637b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefc:30f0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 706770, 'tstamp': 706770}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 294818, 'error': None, 'target': 'ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:46 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:46.065 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ac13df73-71fd-4754-9079-5d1b28f7e8c4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc1f4a971-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fc:30:f0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 271], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 706770, 'reachable_time': 35529, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 294819, 'error': None, 'target': 'ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:46 np0005588920 nova_compute[226886]: 2026-01-20 15:13:46.085 226890 DEBUG nova.compute.manager [req-51e7dd8e-17db-4922-a2a5-86b900386cfe req-b430b5f8-6c3a-4dd8-8a43-f76f84d97cc4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Received event network-vif-plugged-0c07d11d-c06a-497a-9dbd-975adce07e97 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:13:46 np0005588920 nova_compute[226886]: 2026-01-20 15:13:46.086 226890 DEBUG oslo_concurrency.lockutils [req-51e7dd8e-17db-4922-a2a5-86b900386cfe req-b430b5f8-6c3a-4dd8-8a43-f76f84d97cc4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "810e72a9-536d-4214-956b-9d5216cce8ff-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:13:46 np0005588920 nova_compute[226886]: 2026-01-20 15:13:46.086 226890 DEBUG oslo_concurrency.lockutils [req-51e7dd8e-17db-4922-a2a5-86b900386cfe req-b430b5f8-6c3a-4dd8-8a43-f76f84d97cc4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "810e72a9-536d-4214-956b-9d5216cce8ff-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:13:46 np0005588920 nova_compute[226886]: 2026-01-20 15:13:46.086 226890 DEBUG oslo_concurrency.lockutils [req-51e7dd8e-17db-4922-a2a5-86b900386cfe req-b430b5f8-6c3a-4dd8-8a43-f76f84d97cc4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "810e72a9-536d-4214-956b-9d5216cce8ff-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:13:46 np0005588920 nova_compute[226886]: 2026-01-20 15:13:46.086 226890 DEBUG nova.compute.manager [req-51e7dd8e-17db-4922-a2a5-86b900386cfe req-b430b5f8-6c3a-4dd8-8a43-f76f84d97cc4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Processing event network-vif-plugged-0c07d11d-c06a-497a-9dbd-975adce07e97 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 10:13:46 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:46.096 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[467bdbe7-9d75-4998-b46f-cf0d6416490f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:46 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:46.156 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[3cd1249e-fff4-44b6-bb9b-76688d542a30]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:46 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:46.157 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc1f4a971-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:13:46 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:46.158 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:13:46 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:46.158 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc1f4a971-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:13:46 np0005588920 kernel: tapc1f4a971-00: entered promiscuous mode
Jan 20 10:13:46 np0005588920 nova_compute[226886]: 2026-01-20 15:13:46.160 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:46 np0005588920 NetworkManager[49076]: <info>  [1768922026.1606] manager: (tapc1f4a971-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/404)
Jan 20 10:13:46 np0005588920 nova_compute[226886]: 2026-01-20 15:13:46.161 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:46 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:46.165 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc1f4a971-00, col_values=(('external_ids', {'iface-id': 'b20b0e27-0b08-4316-b6df-6784416f44c0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:13:46 np0005588920 nova_compute[226886]: 2026-01-20 15:13:46.166 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:46 np0005588920 ovn_controller[133971]: 2026-01-20T15:13:46Z|00853|binding|INFO|Releasing lport b20b0e27-0b08-4316-b6df-6784416f44c0 from this chassis (sb_readonly=0)
Jan 20 10:13:46 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:46.169 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:13:46 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:46.170 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[817470a7-2b39-42ae-a79f-9df81aefc109]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:46 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:46.171 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:13:46 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 10:13:46 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 10:13:46 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab
Jan 20 10:13:46 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 10:13:46 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 10:13:46 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 10:13:46 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab.pid.haproxy
Jan 20 10:13:46 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 10:13:46 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:13:46 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 10:13:46 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 10:13:46 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 10:13:46 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 10:13:46 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 10:13:46 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 10:13:46 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 10:13:46 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 10:13:46 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 10:13:46 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 10:13:46 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 10:13:46 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 10:13:46 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 10:13:46 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:13:46 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:13:46 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 10:13:46 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 10:13:46 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:13:46 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab
Jan 20 10:13:46 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:13:46 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:13:46.172 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab', 'env', 'PROCESS_TAG=haproxy-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:13:46 np0005588920 nova_compute[226886]: 2026-01-20 15:13:46.179 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:46.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:46 np0005588920 podman[294887]: 2026-01-20 15:13:46.551026019 +0000 UTC m=+0.059577006 container create 83ef6973c2da9c866f75c083cdcdc4b45645ede9ee82bd9af95face55238cc58 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 20 10:13:46 np0005588920 systemd[1]: Started libpod-conmon-83ef6973c2da9c866f75c083cdcdc4b45645ede9ee82bd9af95face55238cc58.scope.
Jan 20 10:13:46 np0005588920 podman[294887]: 2026-01-20 15:13:46.513395903 +0000 UTC m=+0.021946670 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:13:46 np0005588920 systemd[1]: Started libcrun container.
Jan 20 10:13:46 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/45b4727d270238dc26ecc0e37388a7172e6bea5024b08b3011ccd3531d437d94/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:13:46 np0005588920 nova_compute[226886]: 2026-01-20 15:13:46.641 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768922026.6407802, 810e72a9-536d-4214-956b-9d5216cce8ff => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:13:46 np0005588920 nova_compute[226886]: 2026-01-20 15:13:46.642 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] VM Started (Lifecycle Event)#033[00m
Jan 20 10:13:46 np0005588920 nova_compute[226886]: 2026-01-20 15:13:46.645 226890 DEBUG nova.compute.manager [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:13:46 np0005588920 nova_compute[226886]: 2026-01-20 15:13:46.649 226890 DEBUG nova.virt.libvirt.driver [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:13:46 np0005588920 podman[294887]: 2026-01-20 15:13:46.649901468 +0000 UTC m=+0.158452245 container init 83ef6973c2da9c866f75c083cdcdc4b45645ede9ee82bd9af95face55238cc58 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 20 10:13:46 np0005588920 nova_compute[226886]: 2026-01-20 15:13:46.652 226890 INFO nova.virt.libvirt.driver [-] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Instance spawned successfully.#033[00m
Jan 20 10:13:46 np0005588920 nova_compute[226886]: 2026-01-20 15:13:46.652 226890 DEBUG nova.virt.libvirt.driver [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 10:13:46 np0005588920 podman[294887]: 2026-01-20 15:13:46.656096399 +0000 UTC m=+0.164647156 container start 83ef6973c2da9c866f75c083cdcdc4b45645ede9ee82bd9af95face55238cc58 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 20 10:13:46 np0005588920 nova_compute[226886]: 2026-01-20 15:13:46.671 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:13:46 np0005588920 nova_compute[226886]: 2026-01-20 15:13:46.676 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:13:46 np0005588920 nova_compute[226886]: 2026-01-20 15:13:46.678 226890 DEBUG nova.virt.libvirt.driver [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:13:46 np0005588920 nova_compute[226886]: 2026-01-20 15:13:46.679 226890 DEBUG nova.virt.libvirt.driver [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:13:46 np0005588920 nova_compute[226886]: 2026-01-20 15:13:46.679 226890 DEBUG nova.virt.libvirt.driver [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:13:46 np0005588920 nova_compute[226886]: 2026-01-20 15:13:46.679 226890 DEBUG nova.virt.libvirt.driver [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:13:46 np0005588920 nova_compute[226886]: 2026-01-20 15:13:46.680 226890 DEBUG nova.virt.libvirt.driver [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:13:46 np0005588920 nova_compute[226886]: 2026-01-20 15:13:46.680 226890 DEBUG nova.virt.libvirt.driver [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:13:46 np0005588920 neutron-haproxy-ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab[294907]: [NOTICE]   (294912) : New worker (294914) forked
Jan 20 10:13:46 np0005588920 neutron-haproxy-ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab[294907]: [NOTICE]   (294912) : Loading success.
Jan 20 10:13:46 np0005588920 nova_compute[226886]: 2026-01-20 15:13:46.706 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:13:46 np0005588920 nova_compute[226886]: 2026-01-20 15:13:46.707 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768922026.6408734, 810e72a9-536d-4214-956b-9d5216cce8ff => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:13:46 np0005588920 nova_compute[226886]: 2026-01-20 15:13:46.708 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:13:46 np0005588920 nova_compute[226886]: 2026-01-20 15:13:46.737 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:13:46 np0005588920 nova_compute[226886]: 2026-01-20 15:13:46.740 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768922026.6489086, 810e72a9-536d-4214-956b-9d5216cce8ff => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:13:46 np0005588920 nova_compute[226886]: 2026-01-20 15:13:46.740 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:13:46 np0005588920 nova_compute[226886]: 2026-01-20 15:13:46.763 226890 INFO nova.compute.manager [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Took 5.74 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 10:13:46 np0005588920 nova_compute[226886]: 2026-01-20 15:13:46.764 226890 DEBUG nova.compute.manager [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:13:46 np0005588920 nova_compute[226886]: 2026-01-20 15:13:46.805 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:13:46 np0005588920 nova_compute[226886]: 2026-01-20 15:13:46.809 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:13:46 np0005588920 nova_compute[226886]: 2026-01-20 15:13:46.833 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:13:46 np0005588920 nova_compute[226886]: 2026-01-20 15:13:46.848 226890 INFO nova.compute.manager [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Took 6.63 seconds to build instance.#033[00m
Jan 20 10:13:46 np0005588920 nova_compute[226886]: 2026-01-20 15:13:46.869 226890 DEBUG oslo_concurrency.lockutils [None req-f29704ca-8e0c-4021-9a41-8371308f852d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "810e72a9-536d-4214-956b-9d5216cce8ff" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.715s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:13:47 np0005588920 nova_compute[226886]: 2026-01-20 15:13:47.492 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:47.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e397 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:13:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:48.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:49 np0005588920 nova_compute[226886]: 2026-01-20 15:13:49.767 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:49.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:50 np0005588920 nova_compute[226886]: 2026-01-20 15:13:50.189 226890 DEBUG nova.compute.manager [req-8bb088e3-c6b6-486f-b9d3-7f31115bea12 req-075bfc11-3030-423a-ac9a-e995dcd3da8f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Received event network-vif-plugged-0c07d11d-c06a-497a-9dbd-975adce07e97 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:13:50 np0005588920 nova_compute[226886]: 2026-01-20 15:13:50.189 226890 DEBUG oslo_concurrency.lockutils [req-8bb088e3-c6b6-486f-b9d3-7f31115bea12 req-075bfc11-3030-423a-ac9a-e995dcd3da8f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "810e72a9-536d-4214-956b-9d5216cce8ff-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:13:50 np0005588920 nova_compute[226886]: 2026-01-20 15:13:50.189 226890 DEBUG oslo_concurrency.lockutils [req-8bb088e3-c6b6-486f-b9d3-7f31115bea12 req-075bfc11-3030-423a-ac9a-e995dcd3da8f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "810e72a9-536d-4214-956b-9d5216cce8ff-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:13:50 np0005588920 nova_compute[226886]: 2026-01-20 15:13:50.190 226890 DEBUG oslo_concurrency.lockutils [req-8bb088e3-c6b6-486f-b9d3-7f31115bea12 req-075bfc11-3030-423a-ac9a-e995dcd3da8f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "810e72a9-536d-4214-956b-9d5216cce8ff-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:13:50 np0005588920 nova_compute[226886]: 2026-01-20 15:13:50.190 226890 DEBUG nova.compute.manager [req-8bb088e3-c6b6-486f-b9d3-7f31115bea12 req-075bfc11-3030-423a-ac9a-e995dcd3da8f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] No waiting events found dispatching network-vif-plugged-0c07d11d-c06a-497a-9dbd-975adce07e97 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:13:50 np0005588920 nova_compute[226886]: 2026-01-20 15:13:50.190 226890 WARNING nova.compute.manager [req-8bb088e3-c6b6-486f-b9d3-7f31115bea12 req-075bfc11-3030-423a-ac9a-e995dcd3da8f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Received unexpected event network-vif-plugged-0c07d11d-c06a-497a-9dbd-975adce07e97 for instance with vm_state active and task_state None.#033[00m
Jan 20 10:13:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:50.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:51 np0005588920 nova_compute[226886]: 2026-01-20 15:13:51.011 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768922016.0107245, 389ff8e0-c114-4960-9561-f6ffef743efa => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:13:51 np0005588920 nova_compute[226886]: 2026-01-20 15:13:51.012 226890 INFO nova.compute.manager [-] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:13:51 np0005588920 nova_compute[226886]: 2026-01-20 15:13:51.029 226890 DEBUG nova.compute.manager [None req-01bec704-cc98-46c2-834d-af6231bb0ed4 - - - - - -] [instance: 389ff8e0-c114-4960-9561-f6ffef743efa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:13:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:13:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:51.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:13:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:52.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:52 np0005588920 nova_compute[226886]: 2026-01-20 15:13:52.494 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:52 np0005588920 nova_compute[226886]: 2026-01-20 15:13:52.717 226890 DEBUG nova.compute.manager [req-7dd96d5a-0551-4f54-a9ad-23b7f28586a7 req-56ad96f6-f6bf-4a9d-9332-f6e917c941b4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Received event network-changed-0c07d11d-c06a-497a-9dbd-975adce07e97 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:13:52 np0005588920 nova_compute[226886]: 2026-01-20 15:13:52.717 226890 DEBUG nova.compute.manager [req-7dd96d5a-0551-4f54-a9ad-23b7f28586a7 req-56ad96f6-f6bf-4a9d-9332-f6e917c941b4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Refreshing instance network info cache due to event network-changed-0c07d11d-c06a-497a-9dbd-975adce07e97. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:13:52 np0005588920 nova_compute[226886]: 2026-01-20 15:13:52.718 226890 DEBUG oslo_concurrency.lockutils [req-7dd96d5a-0551-4f54-a9ad-23b7f28586a7 req-56ad96f6-f6bf-4a9d-9332-f6e917c941b4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-810e72a9-536d-4214-956b-9d5216cce8ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:13:52 np0005588920 nova_compute[226886]: 2026-01-20 15:13:52.718 226890 DEBUG oslo_concurrency.lockutils [req-7dd96d5a-0551-4f54-a9ad-23b7f28586a7 req-56ad96f6-f6bf-4a9d-9332-f6e917c941b4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-810e72a9-536d-4214-956b-9d5216cce8ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:13:52 np0005588920 nova_compute[226886]: 2026-01-20 15:13:52.718 226890 DEBUG nova.network.neutron [req-7dd96d5a-0551-4f54-a9ad-23b7f28586a7 req-56ad96f6-f6bf-4a9d-9332-f6e917c941b4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Refreshing network info cache for port 0c07d11d-c06a-497a-9dbd-975adce07e97 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:13:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e397 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:13:52 np0005588920 podman[294923]: 2026-01-20 15:13:52.995313548 +0000 UTC m=+0.079602899 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 20 10:13:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:53.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:13:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:54.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:13:54 np0005588920 nova_compute[226886]: 2026-01-20 15:13:54.769 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:55 np0005588920 nova_compute[226886]: 2026-01-20 15:13:55.631 226890 DEBUG nova.network.neutron [req-7dd96d5a-0551-4f54-a9ad-23b7f28586a7 req-56ad96f6-f6bf-4a9d-9332-f6e917c941b4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Updated VIF entry in instance network info cache for port 0c07d11d-c06a-497a-9dbd-975adce07e97. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:13:55 np0005588920 nova_compute[226886]: 2026-01-20 15:13:55.632 226890 DEBUG nova.network.neutron [req-7dd96d5a-0551-4f54-a9ad-23b7f28586a7 req-56ad96f6-f6bf-4a9d-9332-f6e917c941b4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Updating instance_info_cache with network_info: [{"id": "0c07d11d-c06a-497a-9dbd-975adce07e97", "address": "fa:16:3e:cc:d4:e6", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c07d11d-c0", "ovs_interfaceid": "0c07d11d-c06a-497a-9dbd-975adce07e97", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:13:55 np0005588920 nova_compute[226886]: 2026-01-20 15:13:55.684 226890 DEBUG oslo_concurrency.lockutils [req-7dd96d5a-0551-4f54-a9ad-23b7f28586a7 req-56ad96f6-f6bf-4a9d-9332-f6e917c941b4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-810e72a9-536d-4214-956b-9d5216cce8ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:13:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:13:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:55.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:13:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:13:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:56.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:13:56 np0005588920 nova_compute[226886]: 2026-01-20 15:13:56.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:13:56 np0005588920 nova_compute[226886]: 2026-01-20 15:13:56.726 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:13:56 np0005588920 nova_compute[226886]: 2026-01-20 15:13:56.726 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:13:56 np0005588920 nova_compute[226886]: 2026-01-20 15:13:56.834 226890 DEBUG oslo_concurrency.lockutils [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquiring lock "54a13784-2a60-4b16-8208-d9b9d0e3033e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:13:56 np0005588920 nova_compute[226886]: 2026-01-20 15:13:56.834 226890 DEBUG oslo_concurrency.lockutils [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "54a13784-2a60-4b16-8208-d9b9d0e3033e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:13:56 np0005588920 nova_compute[226886]: 2026-01-20 15:13:56.848 226890 DEBUG nova.compute.manager [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 10:13:56 np0005588920 nova_compute[226886]: 2026-01-20 15:13:56.925 226890 DEBUG oslo_concurrency.lockutils [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:13:56 np0005588920 nova_compute[226886]: 2026-01-20 15:13:56.926 226890 DEBUG oslo_concurrency.lockutils [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:13:56 np0005588920 nova_compute[226886]: 2026-01-20 15:13:56.932 226890 DEBUG nova.virt.hardware [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 10:13:56 np0005588920 nova_compute[226886]: 2026-01-20 15:13:56.932 226890 INFO nova.compute.claims [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 10:13:57 np0005588920 nova_compute[226886]: 2026-01-20 15:13:57.037 226890 DEBUG oslo_concurrency.processutils [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:13:57 np0005588920 nova_compute[226886]: 2026-01-20 15:13:57.098 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "refresh_cache-810e72a9-536d-4214-956b-9d5216cce8ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:13:57 np0005588920 nova_compute[226886]: 2026-01-20 15:13:57.099 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquired lock "refresh_cache-810e72a9-536d-4214-956b-9d5216cce8ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:13:57 np0005588920 nova_compute[226886]: 2026-01-20 15:13:57.099 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 10:13:57 np0005588920 nova_compute[226886]: 2026-01-20 15:13:57.100 226890 DEBUG nova.objects.instance [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 810e72a9-536d-4214-956b-9d5216cce8ff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:13:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:13:57 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3414784352' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:13:57 np0005588920 nova_compute[226886]: 2026-01-20 15:13:57.477 226890 DEBUG oslo_concurrency.processutils [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:13:57 np0005588920 nova_compute[226886]: 2026-01-20 15:13:57.485 226890 DEBUG nova.compute.provider_tree [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:13:57 np0005588920 nova_compute[226886]: 2026-01-20 15:13:57.495 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:57 np0005588920 nova_compute[226886]: 2026-01-20 15:13:57.509 226890 DEBUG nova.scheduler.client.report [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:13:57 np0005588920 nova_compute[226886]: 2026-01-20 15:13:57.551 226890 DEBUG oslo_concurrency.lockutils [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.626s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:13:57 np0005588920 nova_compute[226886]: 2026-01-20 15:13:57.553 226890 DEBUG nova.compute.manager [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 10:13:57 np0005588920 nova_compute[226886]: 2026-01-20 15:13:57.614 226890 DEBUG nova.compute.manager [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 10:13:57 np0005588920 nova_compute[226886]: 2026-01-20 15:13:57.614 226890 DEBUG nova.network.neutron [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 10:13:57 np0005588920 nova_compute[226886]: 2026-01-20 15:13:57.631 226890 INFO nova.virt.libvirt.driver [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 10:13:57 np0005588920 nova_compute[226886]: 2026-01-20 15:13:57.647 226890 DEBUG nova.compute.manager [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 10:13:57 np0005588920 nova_compute[226886]: 2026-01-20 15:13:57.764 226890 DEBUG nova.compute.manager [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 10:13:57 np0005588920 nova_compute[226886]: 2026-01-20 15:13:57.765 226890 DEBUG nova.virt.libvirt.driver [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 10:13:57 np0005588920 nova_compute[226886]: 2026-01-20 15:13:57.766 226890 INFO nova.virt.libvirt.driver [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Creating image(s)#033[00m
Jan 20 10:13:57 np0005588920 nova_compute[226886]: 2026-01-20 15:13:57.792 226890 DEBUG nova.storage.rbd_utils [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] rbd image 54a13784-2a60-4b16-8208-d9b9d0e3033e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:13:57 np0005588920 nova_compute[226886]: 2026-01-20 15:13:57.817 226890 DEBUG nova.storage.rbd_utils [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] rbd image 54a13784-2a60-4b16-8208-d9b9d0e3033e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:13:57 np0005588920 nova_compute[226886]: 2026-01-20 15:13:57.840 226890 DEBUG nova.storage.rbd_utils [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] rbd image 54a13784-2a60-4b16-8208-d9b9d0e3033e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:13:57 np0005588920 nova_compute[226886]: 2026-01-20 15:13:57.843 226890 DEBUG oslo_concurrency.processutils [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:13:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e397 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:13:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.002000058s ======
Jan 20 10:13:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:57.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000058s
Jan 20 10:13:57 np0005588920 nova_compute[226886]: 2026-01-20 15:13:57.874 226890 DEBUG nova.policy [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e9cc4ce3e069479ba9c789b378a68a1d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fff727019f86407498e83d7948d54962', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 10:13:57 np0005588920 nova_compute[226886]: 2026-01-20 15:13:57.907 226890 DEBUG oslo_concurrency.processutils [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:13:57 np0005588920 nova_compute[226886]: 2026-01-20 15:13:57.907 226890 DEBUG oslo_concurrency.lockutils [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:13:57 np0005588920 nova_compute[226886]: 2026-01-20 15:13:57.908 226890 DEBUG oslo_concurrency.lockutils [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:13:57 np0005588920 nova_compute[226886]: 2026-01-20 15:13:57.908 226890 DEBUG oslo_concurrency.lockutils [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:13:57 np0005588920 nova_compute[226886]: 2026-01-20 15:13:57.932 226890 DEBUG nova.storage.rbd_utils [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] rbd image 54a13784-2a60-4b16-8208-d9b9d0e3033e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:13:57 np0005588920 nova_compute[226886]: 2026-01-20 15:13:57.935 226890 DEBUG oslo_concurrency.processutils [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 54a13784-2a60-4b16-8208-d9b9d0e3033e_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:13:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:58.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:58 np0005588920 nova_compute[226886]: 2026-01-20 15:13:58.391 226890 DEBUG oslo_concurrency.processutils [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 54a13784-2a60-4b16-8208-d9b9d0e3033e_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:13:58 np0005588920 nova_compute[226886]: 2026-01-20 15:13:58.475 226890 DEBUG nova.storage.rbd_utils [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] resizing rbd image 54a13784-2a60-4b16-8208-d9b9d0e3033e_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 10:13:58 np0005588920 nova_compute[226886]: 2026-01-20 15:13:58.571 226890 DEBUG nova.objects.instance [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lazy-loading 'migration_context' on Instance uuid 54a13784-2a60-4b16-8208-d9b9d0e3033e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:13:58 np0005588920 nova_compute[226886]: 2026-01-20 15:13:58.587 226890 DEBUG nova.virt.libvirt.driver [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 10:13:58 np0005588920 nova_compute[226886]: 2026-01-20 15:13:58.588 226890 DEBUG nova.virt.libvirt.driver [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Ensure instance console log exists: /var/lib/nova/instances/54a13784-2a60-4b16-8208-d9b9d0e3033e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:13:58 np0005588920 nova_compute[226886]: 2026-01-20 15:13:58.589 226890 DEBUG oslo_concurrency.lockutils [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:13:58 np0005588920 nova_compute[226886]: 2026-01-20 15:13:58.589 226890 DEBUG oslo_concurrency.lockutils [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:13:58 np0005588920 nova_compute[226886]: 2026-01-20 15:13:58.589 226890 DEBUG oslo_concurrency.lockutils [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:13:59 np0005588920 nova_compute[226886]: 2026-01-20 15:13:59.148 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Updating instance_info_cache with network_info: [{"id": "0c07d11d-c06a-497a-9dbd-975adce07e97", "address": "fa:16:3e:cc:d4:e6", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c07d11d-c0", "ovs_interfaceid": "0c07d11d-c06a-497a-9dbd-975adce07e97", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:13:59 np0005588920 nova_compute[226886]: 2026-01-20 15:13:59.178 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Releasing lock "refresh_cache-810e72a9-536d-4214-956b-9d5216cce8ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:13:59 np0005588920 nova_compute[226886]: 2026-01-20 15:13:59.179 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 10:13:59 np0005588920 nova_compute[226886]: 2026-01-20 15:13:59.179 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:13:59 np0005588920 nova_compute[226886]: 2026-01-20 15:13:59.202 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:13:59 np0005588920 nova_compute[226886]: 2026-01-20 15:13:59.202 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:13:59 np0005588920 nova_compute[226886]: 2026-01-20 15:13:59.203 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:13:59 np0005588920 nova_compute[226886]: 2026-01-20 15:13:59.203 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:13:59 np0005588920 nova_compute[226886]: 2026-01-20 15:13:59.203 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:13:59 np0005588920 nova_compute[226886]: 2026-01-20 15:13:59.440 226890 DEBUG nova.network.neutron [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Successfully created port: b8bc07e2-c826-408c-a1a5-f45ad76b5888 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 10:13:59 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:13:59 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2197494371' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:13:59 np0005588920 nova_compute[226886]: 2026-01-20 15:13:59.695 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.491s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:13:59 np0005588920 nova_compute[226886]: 2026-01-20 15:13:59.772 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:13:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:59.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:59 np0005588920 nova_compute[226886]: 2026-01-20 15:13:59.921 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-000000b8 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:13:59 np0005588920 nova_compute[226886]: 2026-01-20 15:13:59.922 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-000000b8 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:14:00 np0005588920 nova_compute[226886]: 2026-01-20 15:14:00.079 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:14:00 np0005588920 nova_compute[226886]: 2026-01-20 15:14:00.080 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3977MB free_disk=20.921680450439453GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:14:00 np0005588920 nova_compute[226886]: 2026-01-20 15:14:00.080 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:14:00 np0005588920 nova_compute[226886]: 2026-01-20 15:14:00.080 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:14:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 10:14:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:00.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 10:14:00 np0005588920 nova_compute[226886]: 2026-01-20 15:14:00.227 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance 810e72a9-536d-4214-956b-9d5216cce8ff actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:14:00 np0005588920 nova_compute[226886]: 2026-01-20 15:14:00.228 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance 54a13784-2a60-4b16-8208-d9b9d0e3033e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:14:00 np0005588920 nova_compute[226886]: 2026-01-20 15:14:00.228 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:14:00 np0005588920 nova_compute[226886]: 2026-01-20 15:14:00.228 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:14:00 np0005588920 ovn_controller[133971]: 2026-01-20T15:14:00Z|00115|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:cc:d4:e6 10.100.0.14
Jan 20 10:14:00 np0005588920 ovn_controller[133971]: 2026-01-20T15:14:00Z|00116|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:cc:d4:e6 10.100.0.14
Jan 20 10:14:00 np0005588920 nova_compute[226886]: 2026-01-20 15:14:00.344 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Refreshing inventories for resource provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 20 10:14:00 np0005588920 nova_compute[226886]: 2026-01-20 15:14:00.415 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Updating ProviderTree inventory for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 20 10:14:00 np0005588920 nova_compute[226886]: 2026-01-20 15:14:00.415 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Updating inventory in ProviderTree for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 20 10:14:00 np0005588920 nova_compute[226886]: 2026-01-20 15:14:00.429 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Refreshing aggregate associations for resource provider ff38e91c-3320-4831-90ac-bcffc89ba7b6, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 20 10:14:00 np0005588920 nova_compute[226886]: 2026-01-20 15:14:00.447 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Refreshing trait associations for resource provider ff38e91c-3320-4831-90ac-bcffc89ba7b6, traits: COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_DEVICE_TAGGING,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 20 10:14:00 np0005588920 nova_compute[226886]: 2026-01-20 15:14:00.502 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:14:00 np0005588920 nova_compute[226886]: 2026-01-20 15:14:00.887 226890 DEBUG nova.network.neutron [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Successfully updated port: b8bc07e2-c826-408c-a1a5-f45ad76b5888 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 10:14:00 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:14:00 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3060118312' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:14:00 np0005588920 nova_compute[226886]: 2026-01-20 15:14:00.934 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:14:00 np0005588920 nova_compute[226886]: 2026-01-20 15:14:00.938 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:14:00 np0005588920 nova_compute[226886]: 2026-01-20 15:14:00.960 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:14:00 np0005588920 nova_compute[226886]: 2026-01-20 15:14:00.991 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:14:00 np0005588920 nova_compute[226886]: 2026-01-20 15:14:00.991 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.911s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:14:01 np0005588920 nova_compute[226886]: 2026-01-20 15:14:01.040 226890 DEBUG nova.compute.manager [req-96828c49-b3f3-463f-ac3a-6cadc928d4ec req-6458ce2a-ce65-4f4a-b416-f5326fc740f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Received event network-changed-b8bc07e2-c826-408c-a1a5-f45ad76b5888 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:14:01 np0005588920 nova_compute[226886]: 2026-01-20 15:14:01.040 226890 DEBUG nova.compute.manager [req-96828c49-b3f3-463f-ac3a-6cadc928d4ec req-6458ce2a-ce65-4f4a-b416-f5326fc740f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Refreshing instance network info cache due to event network-changed-b8bc07e2-c826-408c-a1a5-f45ad76b5888. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:14:01 np0005588920 nova_compute[226886]: 2026-01-20 15:14:01.040 226890 DEBUG oslo_concurrency.lockutils [req-96828c49-b3f3-463f-ac3a-6cadc928d4ec req-6458ce2a-ce65-4f4a-b416-f5326fc740f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-54a13784-2a60-4b16-8208-d9b9d0e3033e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:14:01 np0005588920 nova_compute[226886]: 2026-01-20 15:14:01.040 226890 DEBUG oslo_concurrency.lockutils [req-96828c49-b3f3-463f-ac3a-6cadc928d4ec req-6458ce2a-ce65-4f4a-b416-f5326fc740f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-54a13784-2a60-4b16-8208-d9b9d0e3033e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:14:01 np0005588920 nova_compute[226886]: 2026-01-20 15:14:01.040 226890 DEBUG nova.network.neutron [req-96828c49-b3f3-463f-ac3a-6cadc928d4ec req-6458ce2a-ce65-4f4a-b416-f5326fc740f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Refreshing network info cache for port b8bc07e2-c826-408c-a1a5-f45ad76b5888 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:14:01 np0005588920 nova_compute[226886]: 2026-01-20 15:14:01.069 226890 DEBUG oslo_concurrency.lockutils [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquiring lock "refresh_cache-54a13784-2a60-4b16-8208-d9b9d0e3033e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:14:01 np0005588920 nova_compute[226886]: 2026-01-20 15:14:01.453 226890 DEBUG nova.network.neutron [req-96828c49-b3f3-463f-ac3a-6cadc928d4ec req-6458ce2a-ce65-4f4a-b416-f5326fc740f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:14:01 np0005588920 nova_compute[226886]: 2026-01-20 15:14:01.809 226890 DEBUG nova.network.neutron [req-96828c49-b3f3-463f-ac3a-6cadc928d4ec req-6458ce2a-ce65-4f4a-b416-f5326fc740f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:14:01 np0005588920 nova_compute[226886]: 2026-01-20 15:14:01.828 226890 DEBUG oslo_concurrency.lockutils [req-96828c49-b3f3-463f-ac3a-6cadc928d4ec req-6458ce2a-ce65-4f4a-b416-f5326fc740f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-54a13784-2a60-4b16-8208-d9b9d0e3033e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:14:01 np0005588920 nova_compute[226886]: 2026-01-20 15:14:01.829 226890 DEBUG oslo_concurrency.lockutils [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquired lock "refresh_cache-54a13784-2a60-4b16-8208-d9b9d0e3033e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:14:01 np0005588920 nova_compute[226886]: 2026-01-20 15:14:01.829 226890 DEBUG nova.network.neutron [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:14:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:01.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:01 np0005588920 nova_compute[226886]: 2026-01-20 15:14:01.967 226890 DEBUG nova.network.neutron [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:14:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:02.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:02 np0005588920 nova_compute[226886]: 2026-01-20 15:14:02.497 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e397 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:14:03 np0005588920 nova_compute[226886]: 2026-01-20 15:14:03.158 226890 DEBUG nova.network.neutron [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Updating instance_info_cache with network_info: [{"id": "b8bc07e2-c826-408c-a1a5-f45ad76b5888", "address": "fa:16:3e:ce:5d:51", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb8bc07e2-c8", "ovs_interfaceid": "b8bc07e2-c826-408c-a1a5-f45ad76b5888", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:14:03 np0005588920 nova_compute[226886]: 2026-01-20 15:14:03.179 226890 DEBUG oslo_concurrency.lockutils [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Releasing lock "refresh_cache-54a13784-2a60-4b16-8208-d9b9d0e3033e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:14:03 np0005588920 nova_compute[226886]: 2026-01-20 15:14:03.179 226890 DEBUG nova.compute.manager [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Instance network_info: |[{"id": "b8bc07e2-c826-408c-a1a5-f45ad76b5888", "address": "fa:16:3e:ce:5d:51", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb8bc07e2-c8", "ovs_interfaceid": "b8bc07e2-c826-408c-a1a5-f45ad76b5888", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 10:14:03 np0005588920 nova_compute[226886]: 2026-01-20 15:14:03.182 226890 DEBUG nova.virt.libvirt.driver [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Start _get_guest_xml network_info=[{"id": "b8bc07e2-c826-408c-a1a5-f45ad76b5888", "address": "fa:16:3e:ce:5d:51", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb8bc07e2-c8", "ovs_interfaceid": "b8bc07e2-c826-408c-a1a5-f45ad76b5888", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:14:03 np0005588920 nova_compute[226886]: 2026-01-20 15:14:03.185 226890 WARNING nova.virt.libvirt.driver [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:14:03 np0005588920 nova_compute[226886]: 2026-01-20 15:14:03.189 226890 DEBUG nova.virt.libvirt.host [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:14:03 np0005588920 nova_compute[226886]: 2026-01-20 15:14:03.190 226890 DEBUG nova.virt.libvirt.host [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:14:03 np0005588920 nova_compute[226886]: 2026-01-20 15:14:03.192 226890 DEBUG nova.virt.libvirt.host [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:14:03 np0005588920 nova_compute[226886]: 2026-01-20 15:14:03.193 226890 DEBUG nova.virt.libvirt.host [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:14:03 np0005588920 nova_compute[226886]: 2026-01-20 15:14:03.194 226890 DEBUG nova.virt.libvirt.driver [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:14:03 np0005588920 nova_compute[226886]: 2026-01-20 15:14:03.194 226890 DEBUG nova.virt.hardware [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:14:03 np0005588920 nova_compute[226886]: 2026-01-20 15:14:03.195 226890 DEBUG nova.virt.hardware [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:14:03 np0005588920 nova_compute[226886]: 2026-01-20 15:14:03.195 226890 DEBUG nova.virt.hardware [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:14:03 np0005588920 nova_compute[226886]: 2026-01-20 15:14:03.195 226890 DEBUG nova.virt.hardware [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:14:03 np0005588920 nova_compute[226886]: 2026-01-20 15:14:03.195 226890 DEBUG nova.virt.hardware [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:14:03 np0005588920 nova_compute[226886]: 2026-01-20 15:14:03.195 226890 DEBUG nova.virt.hardware [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:14:03 np0005588920 nova_compute[226886]: 2026-01-20 15:14:03.196 226890 DEBUG nova.virt.hardware [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:14:03 np0005588920 nova_compute[226886]: 2026-01-20 15:14:03.196 226890 DEBUG nova.virt.hardware [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:14:03 np0005588920 nova_compute[226886]: 2026-01-20 15:14:03.196 226890 DEBUG nova.virt.hardware [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:14:03 np0005588920 nova_compute[226886]: 2026-01-20 15:14:03.196 226890 DEBUG nova.virt.hardware [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:14:03 np0005588920 nova_compute[226886]: 2026-01-20 15:14:03.196 226890 DEBUG nova.virt.hardware [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:14:03 np0005588920 nova_compute[226886]: 2026-01-20 15:14:03.199 226890 DEBUG oslo_concurrency.processutils [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:14:03 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:14:03 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1568540187' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:14:03 np0005588920 nova_compute[226886]: 2026-01-20 15:14:03.632 226890 DEBUG oslo_concurrency.processutils [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:14:03 np0005588920 nova_compute[226886]: 2026-01-20 15:14:03.655 226890 DEBUG nova.storage.rbd_utils [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] rbd image 54a13784-2a60-4b16-8208-d9b9d0e3033e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:14:03 np0005588920 nova_compute[226886]: 2026-01-20 15:14:03.661 226890 DEBUG oslo_concurrency.processutils [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:14:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:14:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:03.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:14:04 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:14:04 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2257608367' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:14:04 np0005588920 nova_compute[226886]: 2026-01-20 15:14:04.113 226890 DEBUG oslo_concurrency.processutils [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:14:04 np0005588920 nova_compute[226886]: 2026-01-20 15:14:04.115 226890 DEBUG nova.virt.libvirt.vif [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:13:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='multiattach-server-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='multiattach-server-1',id=186,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL5L2o6o5dLcQyaIfhCZ5CKxQlecqNGmP68oHIQEsVoKIC2qfrMKjObT9GdMU8oznX9LVUwIWCShhlEJu9ZqPiutEL2afEJ1hQQamjERNcx9wWS2NfOgykA4yugQphfOtA==',key_name='tempest-keypair-1568469072',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fff727019f86407498e83d7948d54962',ramdisk_id='',reservation_id='r-otgoutp6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',net
work_allocated='True',owner_project_name='tempest-AttachVolumeMultiAttachTest-418194625',owner_user_name='tempest-AttachVolumeMultiAttachTest-418194625-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:13:57Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e9cc4ce3e069479ba9c789b378a68a1d',uuid=54a13784-2a60-4b16-8208-d9b9d0e3033e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b8bc07e2-c826-408c-a1a5-f45ad76b5888", "address": "fa:16:3e:ce:5d:51", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb8bc07e2-c8", "ovs_interfaceid": "b8bc07e2-c826-408c-a1a5-f45ad76b5888", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:14:04 np0005588920 nova_compute[226886]: 2026-01-20 15:14:04.115 226890 DEBUG nova.network.os_vif_util [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Converting VIF {"id": "b8bc07e2-c826-408c-a1a5-f45ad76b5888", "address": "fa:16:3e:ce:5d:51", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb8bc07e2-c8", "ovs_interfaceid": "b8bc07e2-c826-408c-a1a5-f45ad76b5888", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:14:04 np0005588920 nova_compute[226886]: 2026-01-20 15:14:04.116 226890 DEBUG nova.network.os_vif_util [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ce:5d:51,bridge_name='br-int',has_traffic_filtering=True,id=b8bc07e2-c826-408c-a1a5-f45ad76b5888,network=Network(c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb8bc07e2-c8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:14:04 np0005588920 nova_compute[226886]: 2026-01-20 15:14:04.117 226890 DEBUG nova.objects.instance [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lazy-loading 'pci_devices' on Instance uuid 54a13784-2a60-4b16-8208-d9b9d0e3033e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:14:04 np0005588920 nova_compute[226886]: 2026-01-20 15:14:04.139 226890 DEBUG nova.virt.libvirt.driver [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:14:04 np0005588920 nova_compute[226886]:  <uuid>54a13784-2a60-4b16-8208-d9b9d0e3033e</uuid>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:  <name>instance-000000ba</name>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:14:04 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:      <nova:name>multiattach-server-1</nova:name>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 15:14:03</nova:creationTime>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 10:14:04 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:        <nova:user uuid="e9cc4ce3e069479ba9c789b378a68a1d">tempest-AttachVolumeMultiAttachTest-418194625-project-member</nova:user>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:        <nova:project uuid="fff727019f86407498e83d7948d54962">tempest-AttachVolumeMultiAttachTest-418194625</nova:project>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:        <nova:port uuid="b8bc07e2-c826-408c-a1a5-f45ad76b5888">
Jan 20 10:14:04 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    <system>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:      <entry name="serial">54a13784-2a60-4b16-8208-d9b9d0e3033e</entry>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:      <entry name="uuid">54a13784-2a60-4b16-8208-d9b9d0e3033e</entry>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    </system>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:  <os>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:  </os>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:  <features>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:  </features>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:  </clock>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:  <devices>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 10:14:04 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/54a13784-2a60-4b16-8208-d9b9d0e3033e_disk">
Jan 20 10:14:04 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:14:04 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 10:14:04 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/54a13784-2a60-4b16-8208-d9b9d0e3033e_disk.config">
Jan 20 10:14:04 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:14:04 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 10:14:04 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:ce:5d:51"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:      <target dev="tapb8bc07e2-c8"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    </interface>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 10:14:04 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/54a13784-2a60-4b16-8208-d9b9d0e3033e/console.log" append="off"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    </serial>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    <video>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    </video>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 10:14:04 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    </rng>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 10:14:04 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 10:14:04 np0005588920 nova_compute[226886]:  </devices>
Jan 20 10:14:04 np0005588920 nova_compute[226886]: </domain>
Jan 20 10:14:04 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:14:04 np0005588920 nova_compute[226886]: 2026-01-20 15:14:04.140 226890 DEBUG nova.compute.manager [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Preparing to wait for external event network-vif-plugged-b8bc07e2-c826-408c-a1a5-f45ad76b5888 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:14:04 np0005588920 nova_compute[226886]: 2026-01-20 15:14:04.141 226890 DEBUG oslo_concurrency.lockutils [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquiring lock "54a13784-2a60-4b16-8208-d9b9d0e3033e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:14:04 np0005588920 nova_compute[226886]: 2026-01-20 15:14:04.141 226890 DEBUG oslo_concurrency.lockutils [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "54a13784-2a60-4b16-8208-d9b9d0e3033e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:14:04 np0005588920 nova_compute[226886]: 2026-01-20 15:14:04.142 226890 DEBUG oslo_concurrency.lockutils [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "54a13784-2a60-4b16-8208-d9b9d0e3033e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:14:04 np0005588920 nova_compute[226886]: 2026-01-20 15:14:04.142 226890 DEBUG nova.virt.libvirt.vif [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:13:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='multiattach-server-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='multiattach-server-1',id=186,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL5L2o6o5dLcQyaIfhCZ5CKxQlecqNGmP68oHIQEsVoKIC2qfrMKjObT9GdMU8oznX9LVUwIWCShhlEJu9ZqPiutEL2afEJ1hQQamjERNcx9wWS2NfOgykA4yugQphfOtA==',key_name='tempest-keypair-1568469072',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fff727019f86407498e83d7948d54962',ramdisk_id='',reservation_id='r-otgoutp6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_r
am='0',network_allocated='True',owner_project_name='tempest-AttachVolumeMultiAttachTest-418194625',owner_user_name='tempest-AttachVolumeMultiAttachTest-418194625-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:13:57Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e9cc4ce3e069479ba9c789b378a68a1d',uuid=54a13784-2a60-4b16-8208-d9b9d0e3033e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b8bc07e2-c826-408c-a1a5-f45ad76b5888", "address": "fa:16:3e:ce:5d:51", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb8bc07e2-c8", "ovs_interfaceid": "b8bc07e2-c826-408c-a1a5-f45ad76b5888", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:14:04 np0005588920 nova_compute[226886]: 2026-01-20 15:14:04.143 226890 DEBUG nova.network.os_vif_util [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Converting VIF {"id": "b8bc07e2-c826-408c-a1a5-f45ad76b5888", "address": "fa:16:3e:ce:5d:51", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb8bc07e2-c8", "ovs_interfaceid": "b8bc07e2-c826-408c-a1a5-f45ad76b5888", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:14:04 np0005588920 nova_compute[226886]: 2026-01-20 15:14:04.143 226890 DEBUG nova.network.os_vif_util [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ce:5d:51,bridge_name='br-int',has_traffic_filtering=True,id=b8bc07e2-c826-408c-a1a5-f45ad76b5888,network=Network(c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb8bc07e2-c8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:14:04 np0005588920 nova_compute[226886]: 2026-01-20 15:14:04.144 226890 DEBUG os_vif [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ce:5d:51,bridge_name='br-int',has_traffic_filtering=True,id=b8bc07e2-c826-408c-a1a5-f45ad76b5888,network=Network(c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb8bc07e2-c8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:14:04 np0005588920 nova_compute[226886]: 2026-01-20 15:14:04.144 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:14:04 np0005588920 nova_compute[226886]: 2026-01-20 15:14:04.145 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 10:14:04 np0005588920 nova_compute[226886]: 2026-01-20 15:14:04.145 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 20 10:14:04 np0005588920 nova_compute[226886]: 2026-01-20 15:14:04.149 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:14:04 np0005588920 nova_compute[226886]: 2026-01-20 15:14:04.149 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb8bc07e2-c8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 10:14:04 np0005588920 nova_compute[226886]: 2026-01-20 15:14:04.150 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb8bc07e2-c8, col_values=(('external_ids', {'iface-id': 'b8bc07e2-c826-408c-a1a5-f45ad76b5888', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ce:5d:51', 'vm-uuid': '54a13784-2a60-4b16-8208-d9b9d0e3033e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 10:14:04 np0005588920 NetworkManager[49076]: <info>  [1768922044.1521] manager: (tapb8bc07e2-c8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/405)
Jan 20 10:14:04 np0005588920 nova_compute[226886]: 2026-01-20 15:14:04.154 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 20 10:14:04 np0005588920 nova_compute[226886]: 2026-01-20 15:14:04.160 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:14:04 np0005588920 nova_compute[226886]: 2026-01-20 15:14:04.161 226890 INFO os_vif [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ce:5d:51,bridge_name='br-int',has_traffic_filtering=True,id=b8bc07e2-c826-408c-a1a5-f45ad76b5888,network=Network(c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb8bc07e2-c8')
Jan 20 10:14:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:04.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:04 np0005588920 nova_compute[226886]: 2026-01-20 15:14:04.276 226890 DEBUG nova.virt.libvirt.driver [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 20 10:14:04 np0005588920 nova_compute[226886]: 2026-01-20 15:14:04.277 226890 DEBUG nova.virt.libvirt.driver [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 20 10:14:04 np0005588920 nova_compute[226886]: 2026-01-20 15:14:04.277 226890 DEBUG nova.virt.libvirt.driver [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] No VIF found with MAC fa:16:3e:ce:5d:51, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 20 10:14:04 np0005588920 nova_compute[226886]: 2026-01-20 15:14:04.277 226890 INFO nova.virt.libvirt.driver [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Using config drive
Jan 20 10:14:04 np0005588920 nova_compute[226886]: 2026-01-20 15:14:04.304 226890 DEBUG nova.storage.rbd_utils [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] rbd image 54a13784-2a60-4b16-8208-d9b9d0e3033e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:14:04 np0005588920 nova_compute[226886]: 2026-01-20 15:14:04.537 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:14:04 np0005588920 nova_compute[226886]: 2026-01-20 15:14:04.538 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:14:04 np0005588920 nova_compute[226886]: 2026-01-20 15:14:04.538 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:14:04 np0005588920 nova_compute[226886]: 2026-01-20 15:14:04.664 226890 INFO nova.virt.libvirt.driver [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Creating config drive at /var/lib/nova/instances/54a13784-2a60-4b16-8208-d9b9d0e3033e/disk.config
Jan 20 10:14:04 np0005588920 nova_compute[226886]: 2026-01-20 15:14:04.671 226890 DEBUG oslo_concurrency.processutils [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/54a13784-2a60-4b16-8208-d9b9d0e3033e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnyfgi1ok execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:14:04 np0005588920 nova_compute[226886]: 2026-01-20 15:14:04.809 226890 DEBUG oslo_concurrency.processutils [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/54a13784-2a60-4b16-8208-d9b9d0e3033e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnyfgi1ok" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:14:04 np0005588920 nova_compute[226886]: 2026-01-20 15:14:04.839 226890 DEBUG nova.storage.rbd_utils [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] rbd image 54a13784-2a60-4b16-8208-d9b9d0e3033e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:14:04 np0005588920 nova_compute[226886]: 2026-01-20 15:14:04.843 226890 DEBUG oslo_concurrency.processutils [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/54a13784-2a60-4b16-8208-d9b9d0e3033e/disk.config 54a13784-2a60-4b16-8208-d9b9d0e3033e_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:14:04 np0005588920 podman[295287]: 2026-01-20 15:14:04.983029284 +0000 UTC m=+0.056186017 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 20 10:14:05 np0005588920 nova_compute[226886]: 2026-01-20 15:14:05.022 226890 DEBUG oslo_concurrency.processutils [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/54a13784-2a60-4b16-8208-d9b9d0e3033e/disk.config 54a13784-2a60-4b16-8208-d9b9d0e3033e_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.179s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:14:05 np0005588920 nova_compute[226886]: 2026-01-20 15:14:05.023 226890 INFO nova.virt.libvirt.driver [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Deleting local config drive /var/lib/nova/instances/54a13784-2a60-4b16-8208-d9b9d0e3033e/disk.config because it was imported into RBD.
Jan 20 10:14:05 np0005588920 kernel: tapb8bc07e2-c8: entered promiscuous mode
Jan 20 10:14:05 np0005588920 NetworkManager[49076]: <info>  [1768922045.0682] manager: (tapb8bc07e2-c8): new Tun device (/org/freedesktop/NetworkManager/Devices/406)
Jan 20 10:14:05 np0005588920 ovn_controller[133971]: 2026-01-20T15:14:05Z|00854|binding|INFO|Claiming lport b8bc07e2-c826-408c-a1a5-f45ad76b5888 for this chassis.
Jan 20 10:14:05 np0005588920 ovn_controller[133971]: 2026-01-20T15:14:05Z|00855|binding|INFO|b8bc07e2-c826-408c-a1a5-f45ad76b5888: Claiming fa:16:3e:ce:5d:51 10.100.0.6
Jan 20 10:14:05 np0005588920 nova_compute[226886]: 2026-01-20 15:14:05.070 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:14:05 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:14:05.077 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ce:5d:51 10.100.0.6'], port_security=['fa:16:3e:ce:5d:51 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '54a13784-2a60-4b16-8208-d9b9d0e3033e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fff727019f86407498e83d7948d54962', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5ace6a2f-56c6-4679-bb81-70ccb27ab312', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=87d69a20-7690-494a-ac16-7c600840561a, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=b8bc07e2-c826-408c-a1a5-f45ad76b5888) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:14:05 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:14:05.079 144128 INFO neutron.agent.ovn.metadata.agent [-] Port b8bc07e2-c826-408c-a1a5-f45ad76b5888 in datapath c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab bound to our chassis
Jan 20 10:14:05 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:14:05.080 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab
Jan 20 10:14:05 np0005588920 ovn_controller[133971]: 2026-01-20T15:14:05Z|00856|binding|INFO|Setting lport b8bc07e2-c826-408c-a1a5-f45ad76b5888 ovn-installed in OVS
Jan 20 10:14:05 np0005588920 ovn_controller[133971]: 2026-01-20T15:14:05Z|00857|binding|INFO|Setting lport b8bc07e2-c826-408c-a1a5-f45ad76b5888 up in Southbound
Jan 20 10:14:05 np0005588920 nova_compute[226886]: 2026-01-20 15:14:05.086 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:14:05 np0005588920 nova_compute[226886]: 2026-01-20 15:14:05.088 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:14:05 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:14:05.096 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c9449f40-f805-4e78-9349-7cf0a9c56a50]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:14:05 np0005588920 systemd-udevd[295338]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:14:05 np0005588920 systemd-machined[196121]: New machine qemu-89-instance-000000ba.
Jan 20 10:14:05 np0005588920 NetworkManager[49076]: <info>  [1768922045.1150] device (tapb8bc07e2-c8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:14:05 np0005588920 NetworkManager[49076]: <info>  [1768922045.1160] device (tapb8bc07e2-c8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:14:05 np0005588920 systemd[1]: Started Virtual Machine qemu-89-instance-000000ba.
Jan 20 10:14:05 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:14:05.127 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[8d7b6499-3bb9-487a-ac1d-dab5cd4d9d59]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:14:05 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:14:05.130 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[4d945804-1e84-4549-8c8d-ee64a553fdf4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:14:05 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:14:05.157 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[356351fe-cdff-42e7-bd1e-99fd8b3e951a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:14:05 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:14:05.173 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[6bcfb560-d47f-4cd6-ab9a-c11ed4479666]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc1f4a971-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fc:30:f0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 7, 'tx_packets': 5, 'rx_bytes': 574, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 7, 'tx_packets': 5, 'rx_bytes': 574, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 271], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 706770, 'reachable_time': 35529, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 295348, 'error': None, 'target': 'ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:14:05 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:14:05.189 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[48260da1-fc0e-4fde-bf49-f14af0c7997a]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapc1f4a971-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 706780, 'tstamp': 706780}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 295351, 'error': None, 'target': 'ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapc1f4a971-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 706783, 'tstamp': 706783}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 295351, 'error': None, 'target': 'ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:14:05 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:14:05.191 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc1f4a971-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 10:14:05 np0005588920 nova_compute[226886]: 2026-01-20 15:14:05.192 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:14:05 np0005588920 nova_compute[226886]: 2026-01-20 15:14:05.193 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:14:05 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:14:05.194 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc1f4a971-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 10:14:05 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:14:05.194 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 20 10:14:05 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:14:05.194 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc1f4a971-00, col_values=(('external_ids', {'iface-id': 'b20b0e27-0b08-4316-b6df-6784416f44c0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 10:14:05 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:14:05.195 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 20 10:14:05 np0005588920 nova_compute[226886]: 2026-01-20 15:14:05.297 226890 DEBUG nova.compute.manager [req-553365f5-17d8-4783-aa1e-c72a8ac6335f req-04c91864-a794-4792-8c35-b95930c2d26a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Received event network-vif-plugged-b8bc07e2-c826-408c-a1a5-f45ad76b5888 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 10:14:05 np0005588920 nova_compute[226886]: 2026-01-20 15:14:05.298 226890 DEBUG oslo_concurrency.lockutils [req-553365f5-17d8-4783-aa1e-c72a8ac6335f req-04c91864-a794-4792-8c35-b95930c2d26a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "54a13784-2a60-4b16-8208-d9b9d0e3033e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:14:05 np0005588920 nova_compute[226886]: 2026-01-20 15:14:05.298 226890 DEBUG oslo_concurrency.lockutils [req-553365f5-17d8-4783-aa1e-c72a8ac6335f req-04c91864-a794-4792-8c35-b95930c2d26a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "54a13784-2a60-4b16-8208-d9b9d0e3033e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:14:05 np0005588920 nova_compute[226886]: 2026-01-20 15:14:05.299 226890 DEBUG oslo_concurrency.lockutils [req-553365f5-17d8-4783-aa1e-c72a8ac6335f req-04c91864-a794-4792-8c35-b95930c2d26a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "54a13784-2a60-4b16-8208-d9b9d0e3033e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:14:05 np0005588920 nova_compute[226886]: 2026-01-20 15:14:05.299 226890 DEBUG nova.compute.manager [req-553365f5-17d8-4783-aa1e-c72a8ac6335f req-04c91864-a794-4792-8c35-b95930c2d26a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Processing event network-vif-plugged-b8bc07e2-c826-408c-a1a5-f45ad76b5888 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 20 10:14:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:14:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:05.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:14:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 10:14:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:06.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 10:14:06 np0005588920 nova_compute[226886]: 2026-01-20 15:14:06.459 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768922046.4594285, 54a13784-2a60-4b16-8208-d9b9d0e3033e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:14:06 np0005588920 nova_compute[226886]: 2026-01-20 15:14:06.460 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] VM Started (Lifecycle Event)#033[00m
Jan 20 10:14:06 np0005588920 nova_compute[226886]: 2026-01-20 15:14:06.462 226890 DEBUG nova.compute.manager [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:14:06 np0005588920 nova_compute[226886]: 2026-01-20 15:14:06.466 226890 DEBUG nova.virt.libvirt.driver [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:14:06 np0005588920 nova_compute[226886]: 2026-01-20 15:14:06.469 226890 INFO nova.virt.libvirt.driver [-] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Instance spawned successfully.#033[00m
Jan 20 10:14:06 np0005588920 nova_compute[226886]: 2026-01-20 15:14:06.469 226890 DEBUG nova.virt.libvirt.driver [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 10:14:06 np0005588920 nova_compute[226886]: 2026-01-20 15:14:06.482 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:14:06 np0005588920 nova_compute[226886]: 2026-01-20 15:14:06.489 226890 DEBUG nova.virt.libvirt.driver [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:14:06 np0005588920 nova_compute[226886]: 2026-01-20 15:14:06.490 226890 DEBUG nova.virt.libvirt.driver [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:14:06 np0005588920 nova_compute[226886]: 2026-01-20 15:14:06.490 226890 DEBUG nova.virt.libvirt.driver [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:14:06 np0005588920 nova_compute[226886]: 2026-01-20 15:14:06.491 226890 DEBUG nova.virt.libvirt.driver [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:14:06 np0005588920 nova_compute[226886]: 2026-01-20 15:14:06.491 226890 DEBUG nova.virt.libvirt.driver [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:14:06 np0005588920 nova_compute[226886]: 2026-01-20 15:14:06.492 226890 DEBUG nova.virt.libvirt.driver [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:14:06 np0005588920 nova_compute[226886]: 2026-01-20 15:14:06.494 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:14:06 np0005588920 nova_compute[226886]: 2026-01-20 15:14:06.551 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:14:06 np0005588920 nova_compute[226886]: 2026-01-20 15:14:06.551 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768922046.4596126, 54a13784-2a60-4b16-8208-d9b9d0e3033e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:14:06 np0005588920 nova_compute[226886]: 2026-01-20 15:14:06.552 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:14:06 np0005588920 nova_compute[226886]: 2026-01-20 15:14:06.575 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:14:06 np0005588920 nova_compute[226886]: 2026-01-20 15:14:06.579 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768922046.46502, 54a13784-2a60-4b16-8208-d9b9d0e3033e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:14:06 np0005588920 nova_compute[226886]: 2026-01-20 15:14:06.579 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:14:06 np0005588920 nova_compute[226886]: 2026-01-20 15:14:06.582 226890 INFO nova.compute.manager [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Took 8.82 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 10:14:06 np0005588920 nova_compute[226886]: 2026-01-20 15:14:06.583 226890 DEBUG nova.compute.manager [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:14:06 np0005588920 nova_compute[226886]: 2026-01-20 15:14:06.598 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:14:06 np0005588920 nova_compute[226886]: 2026-01-20 15:14:06.602 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:14:06 np0005588920 nova_compute[226886]: 2026-01-20 15:14:06.628 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:14:06 np0005588920 nova_compute[226886]: 2026-01-20 15:14:06.644 226890 INFO nova.compute.manager [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Took 9.74 seconds to build instance.#033[00m
Jan 20 10:14:06 np0005588920 nova_compute[226886]: 2026-01-20 15:14:06.658 226890 DEBUG oslo_concurrency.lockutils [None req-d5473b4f-cb01-4892-a7e2-0741286e298c e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "54a13784-2a60-4b16-8208-d9b9d0e3033e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.824s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:14:06 np0005588920 nova_compute[226886]: 2026-01-20 15:14:06.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:14:06 np0005588920 nova_compute[226886]: 2026-01-20 15:14:06.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:14:06 np0005588920 nova_compute[226886]: 2026-01-20 15:14:06.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:14:07 np0005588920 nova_compute[226886]: 2026-01-20 15:14:07.392 226890 DEBUG nova.compute.manager [req-94252570-f7ba-4bf5-beca-0c627f2e04a1 req-55684194-14cb-4208-aa88-b04c060e3c08 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Received event network-vif-plugged-b8bc07e2-c826-408c-a1a5-f45ad76b5888 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:14:07 np0005588920 nova_compute[226886]: 2026-01-20 15:14:07.392 226890 DEBUG oslo_concurrency.lockutils [req-94252570-f7ba-4bf5-beca-0c627f2e04a1 req-55684194-14cb-4208-aa88-b04c060e3c08 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "54a13784-2a60-4b16-8208-d9b9d0e3033e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:14:07 np0005588920 nova_compute[226886]: 2026-01-20 15:14:07.392 226890 DEBUG oslo_concurrency.lockutils [req-94252570-f7ba-4bf5-beca-0c627f2e04a1 req-55684194-14cb-4208-aa88-b04c060e3c08 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "54a13784-2a60-4b16-8208-d9b9d0e3033e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:14:07 np0005588920 nova_compute[226886]: 2026-01-20 15:14:07.392 226890 DEBUG oslo_concurrency.lockutils [req-94252570-f7ba-4bf5-beca-0c627f2e04a1 req-55684194-14cb-4208-aa88-b04c060e3c08 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "54a13784-2a60-4b16-8208-d9b9d0e3033e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:14:07 np0005588920 nova_compute[226886]: 2026-01-20 15:14:07.393 226890 DEBUG nova.compute.manager [req-94252570-f7ba-4bf5-beca-0c627f2e04a1 req-55684194-14cb-4208-aa88-b04c060e3c08 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] No waiting events found dispatching network-vif-plugged-b8bc07e2-c826-408c-a1a5-f45ad76b5888 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:14:07 np0005588920 nova_compute[226886]: 2026-01-20 15:14:07.393 226890 WARNING nova.compute.manager [req-94252570-f7ba-4bf5-beca-0c627f2e04a1 req-55684194-14cb-4208-aa88-b04c060e3c08 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Received unexpected event network-vif-plugged-b8bc07e2-c826-408c-a1a5-f45ad76b5888 for instance with vm_state active and task_state None.#033[00m
Jan 20 10:14:07 np0005588920 nova_compute[226886]: 2026-01-20 15:14:07.498 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:07 np0005588920 nova_compute[226886]: 2026-01-20 15:14:07.720 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:14:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e397 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:14:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:07.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:08.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:08 np0005588920 nova_compute[226886]: 2026-01-20 15:14:08.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:14:09 np0005588920 nova_compute[226886]: 2026-01-20 15:14:09.151 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e398 e398: 3 total, 3 up, 3 in
Jan 20 10:14:09 np0005588920 nova_compute[226886]: 2026-01-20 15:14:09.825 226890 DEBUG nova.compute.manager [req-236464de-88b7-4a7c-b687-9ab0e3240517 req-85761260-a792-4b62-bc02-374d093cb326 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Received event network-changed-0c07d11d-c06a-497a-9dbd-975adce07e97 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:14:09 np0005588920 nova_compute[226886]: 2026-01-20 15:14:09.826 226890 DEBUG nova.compute.manager [req-236464de-88b7-4a7c-b687-9ab0e3240517 req-85761260-a792-4b62-bc02-374d093cb326 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Refreshing instance network info cache due to event network-changed-0c07d11d-c06a-497a-9dbd-975adce07e97. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:14:09 np0005588920 nova_compute[226886]: 2026-01-20 15:14:09.826 226890 DEBUG oslo_concurrency.lockutils [req-236464de-88b7-4a7c-b687-9ab0e3240517 req-85761260-a792-4b62-bc02-374d093cb326 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-810e72a9-536d-4214-956b-9d5216cce8ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:14:09 np0005588920 nova_compute[226886]: 2026-01-20 15:14:09.826 226890 DEBUG oslo_concurrency.lockutils [req-236464de-88b7-4a7c-b687-9ab0e3240517 req-85761260-a792-4b62-bc02-374d093cb326 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-810e72a9-536d-4214-956b-9d5216cce8ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:14:09 np0005588920 nova_compute[226886]: 2026-01-20 15:14:09.827 226890 DEBUG nova.network.neutron [req-236464de-88b7-4a7c-b687-9ab0e3240517 req-85761260-a792-4b62-bc02-374d093cb326 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Refreshing network info cache for port 0c07d11d-c06a-497a-9dbd-975adce07e97 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:14:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:14:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:09.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:14:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:14:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:10.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:14:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:11.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:12 np0005588920 nova_compute[226886]: 2026-01-20 15:14:12.181 226890 DEBUG nova.network.neutron [req-236464de-88b7-4a7c-b687-9ab0e3240517 req-85761260-a792-4b62-bc02-374d093cb326 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Updated VIF entry in instance network info cache for port 0c07d11d-c06a-497a-9dbd-975adce07e97. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:14:12 np0005588920 nova_compute[226886]: 2026-01-20 15:14:12.182 226890 DEBUG nova.network.neutron [req-236464de-88b7-4a7c-b687-9ab0e3240517 req-85761260-a792-4b62-bc02-374d093cb326 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Updating instance_info_cache with network_info: [{"id": "0c07d11d-c06a-497a-9dbd-975adce07e97", "address": "fa:16:3e:cc:d4:e6", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c07d11d-c0", "ovs_interfaceid": "0c07d11d-c06a-497a-9dbd-975adce07e97", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:14:12 np0005588920 nova_compute[226886]: 2026-01-20 15:14:12.200 226890 DEBUG oslo_concurrency.lockutils [req-236464de-88b7-4a7c-b687-9ab0e3240517 req-85761260-a792-4b62-bc02-374d093cb326 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-810e72a9-536d-4214-956b-9d5216cce8ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:14:12 np0005588920 nova_compute[226886]: 2026-01-20 15:14:12.201 226890 DEBUG nova.compute.manager [req-236464de-88b7-4a7c-b687-9ab0e3240517 req-85761260-a792-4b62-bc02-374d093cb326 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Received event network-changed-b8bc07e2-c826-408c-a1a5-f45ad76b5888 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:14:12 np0005588920 nova_compute[226886]: 2026-01-20 15:14:12.201 226890 DEBUG nova.compute.manager [req-236464de-88b7-4a7c-b687-9ab0e3240517 req-85761260-a792-4b62-bc02-374d093cb326 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Refreshing instance network info cache due to event network-changed-b8bc07e2-c826-408c-a1a5-f45ad76b5888. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:14:12 np0005588920 nova_compute[226886]: 2026-01-20 15:14:12.201 226890 DEBUG oslo_concurrency.lockutils [req-236464de-88b7-4a7c-b687-9ab0e3240517 req-85761260-a792-4b62-bc02-374d093cb326 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-54a13784-2a60-4b16-8208-d9b9d0e3033e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:14:12 np0005588920 nova_compute[226886]: 2026-01-20 15:14:12.201 226890 DEBUG oslo_concurrency.lockutils [req-236464de-88b7-4a7c-b687-9ab0e3240517 req-85761260-a792-4b62-bc02-374d093cb326 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-54a13784-2a60-4b16-8208-d9b9d0e3033e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:14:12 np0005588920 nova_compute[226886]: 2026-01-20 15:14:12.202 226890 DEBUG nova.network.neutron [req-236464de-88b7-4a7c-b687-9ab0e3240517 req-85761260-a792-4b62-bc02-374d093cb326 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Refreshing network info cache for port b8bc07e2-c826-408c-a1a5-f45ad76b5888 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:14:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:12.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:12 np0005588920 nova_compute[226886]: 2026-01-20 15:14:12.500 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:12 np0005588920 nova_compute[226886]: 2026-01-20 15:14:12.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:14:12 np0005588920 nova_compute[226886]: 2026-01-20 15:14:12.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 20 10:14:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e398 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:14:13 np0005588920 nova_compute[226886]: 2026-01-20 15:14:13.292 226890 DEBUG oslo_concurrency.lockutils [None req-20744187-da75-465c-a96a-02951ddbf9b9 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquiring lock "810e72a9-536d-4214-956b-9d5216cce8ff" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:14:13 np0005588920 nova_compute[226886]: 2026-01-20 15:14:13.293 226890 DEBUG oslo_concurrency.lockutils [None req-20744187-da75-465c-a96a-02951ddbf9b9 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "810e72a9-536d-4214-956b-9d5216cce8ff" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:14:13 np0005588920 nova_compute[226886]: 2026-01-20 15:14:13.309 226890 DEBUG nova.objects.instance [None req-20744187-da75-465c-a96a-02951ddbf9b9 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lazy-loading 'flavor' on Instance uuid 810e72a9-536d-4214-956b-9d5216cce8ff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:14:13 np0005588920 nova_compute[226886]: 2026-01-20 15:14:13.349 226890 DEBUG oslo_concurrency.lockutils [None req-20744187-da75-465c-a96a-02951ddbf9b9 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "810e72a9-536d-4214-956b-9d5216cce8ff" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.056s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:14:13 np0005588920 nova_compute[226886]: 2026-01-20 15:14:13.634 226890 DEBUG nova.network.neutron [req-236464de-88b7-4a7c-b687-9ab0e3240517 req-85761260-a792-4b62-bc02-374d093cb326 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Updated VIF entry in instance network info cache for port b8bc07e2-c826-408c-a1a5-f45ad76b5888. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:14:13 np0005588920 nova_compute[226886]: 2026-01-20 15:14:13.634 226890 DEBUG nova.network.neutron [req-236464de-88b7-4a7c-b687-9ab0e3240517 req-85761260-a792-4b62-bc02-374d093cb326 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Updating instance_info_cache with network_info: [{"id": "b8bc07e2-c826-408c-a1a5-f45ad76b5888", "address": "fa:16:3e:ce:5d:51", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb8bc07e2-c8", "ovs_interfaceid": "b8bc07e2-c826-408c-a1a5-f45ad76b5888", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:14:13 np0005588920 nova_compute[226886]: 2026-01-20 15:14:13.655 226890 DEBUG oslo_concurrency.lockutils [req-236464de-88b7-4a7c-b687-9ab0e3240517 req-85761260-a792-4b62-bc02-374d093cb326 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-54a13784-2a60-4b16-8208-d9b9d0e3033e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:14:13 np0005588920 nova_compute[226886]: 2026-01-20 15:14:13.723 226890 DEBUG oslo_concurrency.lockutils [None req-20744187-da75-465c-a96a-02951ddbf9b9 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquiring lock "810e72a9-536d-4214-956b-9d5216cce8ff" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:14:13 np0005588920 nova_compute[226886]: 2026-01-20 15:14:13.724 226890 DEBUG oslo_concurrency.lockutils [None req-20744187-da75-465c-a96a-02951ddbf9b9 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "810e72a9-536d-4214-956b-9d5216cce8ff" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:14:13 np0005588920 nova_compute[226886]: 2026-01-20 15:14:13.724 226890 INFO nova.compute.manager [None req-20744187-da75-465c-a96a-02951ddbf9b9 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Attaching volume 8ac6f07e-10cd-4304-b732-9202123dbda4 to /dev/vdb#033[00m
Jan 20 10:14:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:13.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:13 np0005588920 nova_compute[226886]: 2026-01-20 15:14:13.958 226890 DEBUG os_brick.utils [None req-20744187-da75-465c-a96a-02951ddbf9b9 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 20 10:14:13 np0005588920 nova_compute[226886]: 2026-01-20 15:14:13.960 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:14:13 np0005588920 nova_compute[226886]: 2026-01-20 15:14:13.973 231437 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.013s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:14:13 np0005588920 nova_compute[226886]: 2026-01-20 15:14:13.973 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[2ace5ec1-15b7-49f7-9c51-086e09426b13]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:14:13 np0005588920 nova_compute[226886]: 2026-01-20 15:14:13.975 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:14:13 np0005588920 nova_compute[226886]: 2026-01-20 15:14:13.983 231437 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:14:13 np0005588920 nova_compute[226886]: 2026-01-20 15:14:13.983 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[83c390e8-9c0e-4244-93b4-c0da6b5ed438]: (4, ('InitiatorName=iqn.1994-05.com.redhat:f7e7b2e5d28e', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:14:13 np0005588920 nova_compute[226886]: 2026-01-20 15:14:13.985 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:14:13 np0005588920 nova_compute[226886]: 2026-01-20 15:14:13.993 231437 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:14:13 np0005588920 nova_compute[226886]: 2026-01-20 15:14:13.993 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[2b63af4f-fab3-47ce-b6c7-2c8fc9fbd026]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:14:13 np0005588920 nova_compute[226886]: 2026-01-20 15:14:13.994 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[42bbe0bd-ebf4-4d6e-893f-b411b075c7d4]: (4, '1190ec40-2b89-4358-8dec-733c5829fbed') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:14:13 np0005588920 nova_compute[226886]: 2026-01-20 15:14:13.995 226890 DEBUG oslo_concurrency.processutils [None req-20744187-da75-465c-a96a-02951ddbf9b9 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:14:14 np0005588920 nova_compute[226886]: 2026-01-20 15:14:14.026 226890 DEBUG oslo_concurrency.processutils [None req-20744187-da75-465c-a96a-02951ddbf9b9 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CMD "nvme version" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:14:14 np0005588920 nova_compute[226886]: 2026-01-20 15:14:14.028 226890 DEBUG os_brick.initiator.connectors.lightos [None req-20744187-da75-465c-a96a-02951ddbf9b9 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 20 10:14:14 np0005588920 nova_compute[226886]: 2026-01-20 15:14:14.028 226890 DEBUG os_brick.initiator.connectors.lightos [None req-20744187-da75-465c-a96a-02951ddbf9b9 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 20 10:14:14 np0005588920 nova_compute[226886]: 2026-01-20 15:14:14.029 226890 DEBUG os_brick.initiator.connectors.lightos [None req-20744187-da75-465c-a96a-02951ddbf9b9 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 20 10:14:14 np0005588920 nova_compute[226886]: 2026-01-20 15:14:14.029 226890 DEBUG os_brick.utils [None req-20744187-da75-465c-a96a-02951ddbf9b9 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] <== get_connector_properties: return (69ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:f7e7b2e5d28e', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '1190ec40-2b89-4358-8dec-733c5829fbed', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 20 10:14:14 np0005588920 nova_compute[226886]: 2026-01-20 15:14:14.029 226890 DEBUG nova.virt.block_device [None req-20744187-da75-465c-a96a-02951ddbf9b9 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Updating existing volume attachment record: 840bce26-18f1-490c-9a94-c132dde0458b _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 20 10:14:14 np0005588920 nova_compute[226886]: 2026-01-20 15:14:14.153 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:14.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:14 np0005588920 nova_compute[226886]: 2026-01-20 15:14:14.794 226890 DEBUG nova.objects.instance [None req-20744187-da75-465c-a96a-02951ddbf9b9 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lazy-loading 'flavor' on Instance uuid 810e72a9-536d-4214-956b-9d5216cce8ff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:14:14 np0005588920 nova_compute[226886]: 2026-01-20 15:14:14.831 226890 DEBUG nova.virt.libvirt.driver [None req-20744187-da75-465c-a96a-02951ddbf9b9 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Attempting to attach volume 8ac6f07e-10cd-4304-b732-9202123dbda4 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 20 10:14:14 np0005588920 nova_compute[226886]: 2026-01-20 15:14:14.834 226890 DEBUG nova.virt.libvirt.guest [None req-20744187-da75-465c-a96a-02951ddbf9b9 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] attach device xml: <disk type="network" device="disk">
Jan 20 10:14:14 np0005588920 nova_compute[226886]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 10:14:14 np0005588920 nova_compute[226886]:  <source protocol="rbd" name="volumes/volume-8ac6f07e-10cd-4304-b732-9202123dbda4">
Jan 20 10:14:14 np0005588920 nova_compute[226886]:    <host name="192.168.122.100" port="6789"/>
Jan 20 10:14:14 np0005588920 nova_compute[226886]:    <host name="192.168.122.102" port="6789"/>
Jan 20 10:14:14 np0005588920 nova_compute[226886]:    <host name="192.168.122.101" port="6789"/>
Jan 20 10:14:14 np0005588920 nova_compute[226886]:  </source>
Jan 20 10:14:14 np0005588920 nova_compute[226886]:  <auth username="openstack">
Jan 20 10:14:14 np0005588920 nova_compute[226886]:    <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:14:14 np0005588920 nova_compute[226886]:  </auth>
Jan 20 10:14:14 np0005588920 nova_compute[226886]:  <target dev="vdb" bus="virtio"/>
Jan 20 10:14:14 np0005588920 nova_compute[226886]:  <serial>8ac6f07e-10cd-4304-b732-9202123dbda4</serial>
Jan 20 10:14:14 np0005588920 nova_compute[226886]:  <shareable/>
Jan 20 10:14:14 np0005588920 nova_compute[226886]: </disk>
Jan 20 10:14:14 np0005588920 nova_compute[226886]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 20 10:14:14 np0005588920 nova_compute[226886]: 2026-01-20 15:14:14.965 226890 DEBUG nova.virt.libvirt.driver [None req-20744187-da75-465c-a96a-02951ddbf9b9 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:14:14 np0005588920 nova_compute[226886]: 2026-01-20 15:14:14.966 226890 DEBUG nova.virt.libvirt.driver [None req-20744187-da75-465c-a96a-02951ddbf9b9 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:14:14 np0005588920 nova_compute[226886]: 2026-01-20 15:14:14.966 226890 DEBUG nova.virt.libvirt.driver [None req-20744187-da75-465c-a96a-02951ddbf9b9 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:14:14 np0005588920 nova_compute[226886]: 2026-01-20 15:14:14.966 226890 DEBUG nova.virt.libvirt.driver [None req-20744187-da75-465c-a96a-02951ddbf9b9 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] No VIF found with MAC fa:16:3e:cc:d4:e6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:14:15 np0005588920 nova_compute[226886]: 2026-01-20 15:14:15.193 226890 DEBUG oslo_concurrency.lockutils [None req-20744187-da75-465c-a96a-02951ddbf9b9 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "810e72a9-536d-4214-956b-9d5216cce8ff" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 1.469s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:14:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:14:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:15.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:14:16 np0005588920 nova_compute[226886]: 2026-01-20 15:14:16.201 226890 DEBUG oslo_concurrency.lockutils [None req-f53ecd7d-4d9c-49e9-8b5b-09353eb87835 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquiring lock "54a13784-2a60-4b16-8208-d9b9d0e3033e" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:14:16 np0005588920 nova_compute[226886]: 2026-01-20 15:14:16.202 226890 DEBUG oslo_concurrency.lockutils [None req-f53ecd7d-4d9c-49e9-8b5b-09353eb87835 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "54a13784-2a60-4b16-8208-d9b9d0e3033e" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:14:16 np0005588920 nova_compute[226886]: 2026-01-20 15:14:16.219 226890 DEBUG nova.objects.instance [None req-f53ecd7d-4d9c-49e9-8b5b-09353eb87835 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lazy-loading 'flavor' on Instance uuid 54a13784-2a60-4b16-8208-d9b9d0e3033e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:14:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:14:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:16.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:14:16 np0005588920 nova_compute[226886]: 2026-01-20 15:14:16.261 226890 DEBUG oslo_concurrency.lockutils [None req-f53ecd7d-4d9c-49e9-8b5b-09353eb87835 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "54a13784-2a60-4b16-8208-d9b9d0e3033e" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.059s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:14:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:14:16.475 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:14:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:14:16.475 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:14:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:14:16.476 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:14:16 np0005588920 nova_compute[226886]: 2026-01-20 15:14:16.586 226890 DEBUG oslo_concurrency.lockutils [None req-f53ecd7d-4d9c-49e9-8b5b-09353eb87835 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquiring lock "54a13784-2a60-4b16-8208-d9b9d0e3033e" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:14:16 np0005588920 nova_compute[226886]: 2026-01-20 15:14:16.587 226890 DEBUG oslo_concurrency.lockutils [None req-f53ecd7d-4d9c-49e9-8b5b-09353eb87835 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "54a13784-2a60-4b16-8208-d9b9d0e3033e" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:14:16 np0005588920 nova_compute[226886]: 2026-01-20 15:14:16.587 226890 INFO nova.compute.manager [None req-f53ecd7d-4d9c-49e9-8b5b-09353eb87835 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Attaching volume 8ac6f07e-10cd-4304-b732-9202123dbda4 to /dev/vdb#033[00m
Jan 20 10:14:16 np0005588920 nova_compute[226886]: 2026-01-20 15:14:16.710 226890 DEBUG os_brick.utils [None req-f53ecd7d-4d9c-49e9-8b5b-09353eb87835 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 20 10:14:16 np0005588920 nova_compute[226886]: 2026-01-20 15:14:16.711 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:14:16 np0005588920 nova_compute[226886]: 2026-01-20 15:14:16.722 231437 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:14:16 np0005588920 nova_compute[226886]: 2026-01-20 15:14:16.722 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[bfc42081-e8b4-4c07-b4d7-db66f5faebaf]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:14:16 np0005588920 nova_compute[226886]: 2026-01-20 15:14:16.723 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:14:16 np0005588920 nova_compute[226886]: 2026-01-20 15:14:16.732 231437 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:14:16 np0005588920 nova_compute[226886]: 2026-01-20 15:14:16.732 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[624c66c4-a826-41f8-b228-baa0faa1a714]: (4, ('InitiatorName=iqn.1994-05.com.redhat:f7e7b2e5d28e', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:14:16 np0005588920 nova_compute[226886]: 2026-01-20 15:14:16.734 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:14:16 np0005588920 nova_compute[226886]: 2026-01-20 15:14:16.745 231437 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:14:16 np0005588920 nova_compute[226886]: 2026-01-20 15:14:16.745 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[4ab81628-bd5a-4cd6-8134-6176bd24cdcb]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:14:16 np0005588920 nova_compute[226886]: 2026-01-20 15:14:16.746 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[d4713c0b-5efd-4ab1-ad9f-1029233b82eb]: (4, '1190ec40-2b89-4358-8dec-733c5829fbed') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:14:16 np0005588920 nova_compute[226886]: 2026-01-20 15:14:16.747 226890 DEBUG oslo_concurrency.processutils [None req-f53ecd7d-4d9c-49e9-8b5b-09353eb87835 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:14:16 np0005588920 nova_compute[226886]: 2026-01-20 15:14:16.776 226890 DEBUG oslo_concurrency.processutils [None req-f53ecd7d-4d9c-49e9-8b5b-09353eb87835 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CMD "nvme version" returned: 0 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:14:16 np0005588920 nova_compute[226886]: 2026-01-20 15:14:16.779 226890 DEBUG os_brick.initiator.connectors.lightos [None req-f53ecd7d-4d9c-49e9-8b5b-09353eb87835 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 20 10:14:16 np0005588920 nova_compute[226886]: 2026-01-20 15:14:16.779 226890 DEBUG os_brick.initiator.connectors.lightos [None req-f53ecd7d-4d9c-49e9-8b5b-09353eb87835 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 20 10:14:16 np0005588920 nova_compute[226886]: 2026-01-20 15:14:16.779 226890 DEBUG os_brick.initiator.connectors.lightos [None req-f53ecd7d-4d9c-49e9-8b5b-09353eb87835 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 20 10:14:16 np0005588920 nova_compute[226886]: 2026-01-20 15:14:16.780 226890 DEBUG os_brick.utils [None req-f53ecd7d-4d9c-49e9-8b5b-09353eb87835 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] <== get_connector_properties: return (68ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:f7e7b2e5d28e', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '1190ec40-2b89-4358-8dec-733c5829fbed', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 20 10:14:16 np0005588920 nova_compute[226886]: 2026-01-20 15:14:16.780 226890 DEBUG nova.virt.block_device [None req-f53ecd7d-4d9c-49e9-8b5b-09353eb87835 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Updating existing volume attachment record: 907f5af7-31e3-4736-be4c-e2f5a674a9d7 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 20 10:14:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:14:17 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2888658016' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:14:17 np0005588920 nova_compute[226886]: 2026-01-20 15:14:17.429 226890 DEBUG nova.objects.instance [None req-f53ecd7d-4d9c-49e9-8b5b-09353eb87835 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lazy-loading 'flavor' on Instance uuid 54a13784-2a60-4b16-8208-d9b9d0e3033e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:14:17 np0005588920 nova_compute[226886]: 2026-01-20 15:14:17.460 226890 DEBUG nova.virt.libvirt.driver [None req-f53ecd7d-4d9c-49e9-8b5b-09353eb87835 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Attempting to attach volume 8ac6f07e-10cd-4304-b732-9202123dbda4 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 20 10:14:17 np0005588920 nova_compute[226886]: 2026-01-20 15:14:17.462 226890 DEBUG nova.virt.libvirt.guest [None req-f53ecd7d-4d9c-49e9-8b5b-09353eb87835 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] attach device xml: <disk type="network" device="disk">
Jan 20 10:14:17 np0005588920 nova_compute[226886]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 10:14:17 np0005588920 nova_compute[226886]:  <source protocol="rbd" name="volumes/volume-8ac6f07e-10cd-4304-b732-9202123dbda4">
Jan 20 10:14:17 np0005588920 nova_compute[226886]:    <host name="192.168.122.100" port="6789"/>
Jan 20 10:14:17 np0005588920 nova_compute[226886]:    <host name="192.168.122.102" port="6789"/>
Jan 20 10:14:17 np0005588920 nova_compute[226886]:    <host name="192.168.122.101" port="6789"/>
Jan 20 10:14:17 np0005588920 nova_compute[226886]:  </source>
Jan 20 10:14:17 np0005588920 nova_compute[226886]:  <auth username="openstack">
Jan 20 10:14:17 np0005588920 nova_compute[226886]:    <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:14:17 np0005588920 nova_compute[226886]:  </auth>
Jan 20 10:14:17 np0005588920 nova_compute[226886]:  <target dev="vdb" bus="virtio"/>
Jan 20 10:14:17 np0005588920 nova_compute[226886]:  <serial>8ac6f07e-10cd-4304-b732-9202123dbda4</serial>
Jan 20 10:14:17 np0005588920 nova_compute[226886]:  <shareable/>
Jan 20 10:14:17 np0005588920 nova_compute[226886]: </disk>
Jan 20 10:14:17 np0005588920 nova_compute[226886]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 20 10:14:17 np0005588920 nova_compute[226886]: 2026-01-20 15:14:17.502 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:17 np0005588920 nova_compute[226886]: 2026-01-20 15:14:17.568 226890 DEBUG nova.virt.libvirt.driver [None req-f53ecd7d-4d9c-49e9-8b5b-09353eb87835 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:14:17 np0005588920 nova_compute[226886]: 2026-01-20 15:14:17.569 226890 DEBUG nova.virt.libvirt.driver [None req-f53ecd7d-4d9c-49e9-8b5b-09353eb87835 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:14:17 np0005588920 nova_compute[226886]: 2026-01-20 15:14:17.569 226890 DEBUG nova.virt.libvirt.driver [None req-f53ecd7d-4d9c-49e9-8b5b-09353eb87835 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:14:17 np0005588920 nova_compute[226886]: 2026-01-20 15:14:17.569 226890 DEBUG nova.virt.libvirt.driver [None req-f53ecd7d-4d9c-49e9-8b5b-09353eb87835 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] No VIF found with MAC fa:16:3e:ce:5d:51, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:14:17 np0005588920 nova_compute[226886]: 2026-01-20 15:14:17.737 226890 DEBUG oslo_concurrency.lockutils [None req-f53ecd7d-4d9c-49e9-8b5b-09353eb87835 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "54a13784-2a60-4b16-8208-d9b9d0e3033e" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 1.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:14:17 np0005588920 nova_compute[226886]: 2026-01-20 15:14:17.740 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:14:17 np0005588920 nova_compute[226886]: 2026-01-20 15:14:17.740 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 20 10:14:17 np0005588920 nova_compute[226886]: 2026-01-20 15:14:17.772 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 20 10:14:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e398 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:14:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:17.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:18.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:19 np0005588920 nova_compute[226886]: 2026-01-20 15:14:19.155 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:19 np0005588920 nova_compute[226886]: 2026-01-20 15:14:19.542 226890 DEBUG oslo_concurrency.lockutils [None req-95b909fc-be4b-4a83-8621-c1692f1dcf9e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquiring lock "810e72a9-536d-4214-956b-9d5216cce8ff" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:14:19 np0005588920 nova_compute[226886]: 2026-01-20 15:14:19.542 226890 DEBUG oslo_concurrency.lockutils [None req-95b909fc-be4b-4a83-8621-c1692f1dcf9e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "810e72a9-536d-4214-956b-9d5216cce8ff" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:14:19 np0005588920 nova_compute[226886]: 2026-01-20 15:14:19.557 226890 INFO nova.compute.manager [None req-95b909fc-be4b-4a83-8621-c1692f1dcf9e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Detaching volume 8ac6f07e-10cd-4304-b732-9202123dbda4#033[00m
Jan 20 10:14:19 np0005588920 nova_compute[226886]: 2026-01-20 15:14:19.738 226890 INFO nova.virt.block_device [None req-95b909fc-be4b-4a83-8621-c1692f1dcf9e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Attempting to driver detach volume 8ac6f07e-10cd-4304-b732-9202123dbda4 from mountpoint /dev/vdb#033[00m
Jan 20 10:14:19 np0005588920 nova_compute[226886]: 2026-01-20 15:14:19.746 226890 DEBUG nova.virt.libvirt.driver [None req-95b909fc-be4b-4a83-8621-c1692f1dcf9e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Attempting to detach device vdb from instance 810e72a9-536d-4214-956b-9d5216cce8ff from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 20 10:14:19 np0005588920 nova_compute[226886]: 2026-01-20 15:14:19.746 226890 DEBUG nova.virt.libvirt.guest [None req-95b909fc-be4b-4a83-8621-c1692f1dcf9e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] detach device xml: <disk type="network" device="disk">
Jan 20 10:14:19 np0005588920 nova_compute[226886]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 10:14:19 np0005588920 nova_compute[226886]:  <source protocol="rbd" name="volumes/volume-8ac6f07e-10cd-4304-b732-9202123dbda4">
Jan 20 10:14:19 np0005588920 nova_compute[226886]:    <host name="192.168.122.100" port="6789"/>
Jan 20 10:14:19 np0005588920 nova_compute[226886]:    <host name="192.168.122.102" port="6789"/>
Jan 20 10:14:19 np0005588920 nova_compute[226886]:    <host name="192.168.122.101" port="6789"/>
Jan 20 10:14:19 np0005588920 nova_compute[226886]:  </source>
Jan 20 10:14:19 np0005588920 nova_compute[226886]:  <target dev="vdb" bus="virtio"/>
Jan 20 10:14:19 np0005588920 nova_compute[226886]:  <serial>8ac6f07e-10cd-4304-b732-9202123dbda4</serial>
Jan 20 10:14:19 np0005588920 nova_compute[226886]:  <shareable/>
Jan 20 10:14:19 np0005588920 nova_compute[226886]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 20 10:14:19 np0005588920 nova_compute[226886]: </disk>
Jan 20 10:14:19 np0005588920 nova_compute[226886]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 20 10:14:19 np0005588920 nova_compute[226886]: 2026-01-20 15:14:19.752 226890 INFO nova.virt.libvirt.driver [None req-95b909fc-be4b-4a83-8621-c1692f1dcf9e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Successfully detached device vdb from instance 810e72a9-536d-4214-956b-9d5216cce8ff from the persistent domain config.#033[00m
Jan 20 10:14:19 np0005588920 nova_compute[226886]: 2026-01-20 15:14:19.752 226890 DEBUG nova.virt.libvirt.driver [None req-95b909fc-be4b-4a83-8621-c1692f1dcf9e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance 810e72a9-536d-4214-956b-9d5216cce8ff from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 20 10:14:19 np0005588920 nova_compute[226886]: 2026-01-20 15:14:19.753 226890 DEBUG nova.virt.libvirt.guest [None req-95b909fc-be4b-4a83-8621-c1692f1dcf9e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] detach device xml: <disk type="network" device="disk">
Jan 20 10:14:19 np0005588920 nova_compute[226886]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 10:14:19 np0005588920 nova_compute[226886]:  <source protocol="rbd" name="volumes/volume-8ac6f07e-10cd-4304-b732-9202123dbda4">
Jan 20 10:14:19 np0005588920 nova_compute[226886]:    <host name="192.168.122.100" port="6789"/>
Jan 20 10:14:19 np0005588920 nova_compute[226886]:    <host name="192.168.122.102" port="6789"/>
Jan 20 10:14:19 np0005588920 nova_compute[226886]:    <host name="192.168.122.101" port="6789"/>
Jan 20 10:14:19 np0005588920 nova_compute[226886]:  </source>
Jan 20 10:14:19 np0005588920 nova_compute[226886]:  <target dev="vdb" bus="virtio"/>
Jan 20 10:14:19 np0005588920 nova_compute[226886]:  <serial>8ac6f07e-10cd-4304-b732-9202123dbda4</serial>
Jan 20 10:14:19 np0005588920 nova_compute[226886]:  <shareable/>
Jan 20 10:14:19 np0005588920 nova_compute[226886]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 20 10:14:19 np0005588920 nova_compute[226886]: </disk>
Jan 20 10:14:19 np0005588920 nova_compute[226886]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 20 10:14:19 np0005588920 nova_compute[226886]: 2026-01-20 15:14:19.804 226890 DEBUG nova.virt.libvirt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Received event <DeviceRemovedEvent: 1768922059.8039458, 810e72a9-536d-4214-956b-9d5216cce8ff => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 20 10:14:19 np0005588920 nova_compute[226886]: 2026-01-20 15:14:19.806 226890 DEBUG nova.virt.libvirt.driver [None req-95b909fc-be4b-4a83-8621-c1692f1dcf9e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance 810e72a9-536d-4214-956b-9d5216cce8ff _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 20 10:14:19 np0005588920 nova_compute[226886]: 2026-01-20 15:14:19.807 226890 INFO nova.virt.libvirt.driver [None req-95b909fc-be4b-4a83-8621-c1692f1dcf9e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Successfully detached device vdb from instance 810e72a9-536d-4214-956b-9d5216cce8ff from the live domain config.#033[00m
Jan 20 10:14:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:19.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:19 np0005588920 nova_compute[226886]: 2026-01-20 15:14:19.994 226890 INFO nova.virt.libvirt.driver [None req-95b909fc-be4b-4a83-8621-c1692f1dcf9e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Detected multiple connections on this host for volume: 8ac6f07e-10cd-4304-b732-9202123dbda4, skipping target disconnect.#033[00m
Jan 20 10:14:20 np0005588920 nova_compute[226886]: 2026-01-20 15:14:20.167 226890 DEBUG nova.objects.instance [None req-95b909fc-be4b-4a83-8621-c1692f1dcf9e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lazy-loading 'flavor' on Instance uuid 810e72a9-536d-4214-956b-9d5216cce8ff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:14:20 np0005588920 nova_compute[226886]: 2026-01-20 15:14:20.201 226890 DEBUG oslo_concurrency.lockutils [None req-95b909fc-be4b-4a83-8621-c1692f1dcf9e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "810e72a9-536d-4214-956b-9d5216cce8ff" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 0.658s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:14:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:14:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:20.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:14:20 np0005588920 ovn_controller[133971]: 2026-01-20T15:14:20Z|00117|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ce:5d:51 10.100.0.6
Jan 20 10:14:20 np0005588920 ovn_controller[133971]: 2026-01-20T15:14:20Z|00118|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ce:5d:51 10.100.0.6
Jan 20 10:14:21 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e399 e399: 3 total, 3 up, 3 in
Jan 20 10:14:21 np0005588920 nova_compute[226886]: 2026-01-20 15:14:21.392 226890 DEBUG oslo_concurrency.lockutils [None req-39766609-9b07-4f5c-8eba-fbbcd0f25685 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquiring lock "54a13784-2a60-4b16-8208-d9b9d0e3033e" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:14:21 np0005588920 nova_compute[226886]: 2026-01-20 15:14:21.393 226890 DEBUG oslo_concurrency.lockutils [None req-39766609-9b07-4f5c-8eba-fbbcd0f25685 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "54a13784-2a60-4b16-8208-d9b9d0e3033e" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:14:21 np0005588920 nova_compute[226886]: 2026-01-20 15:14:21.405 226890 INFO nova.compute.manager [None req-39766609-9b07-4f5c-8eba-fbbcd0f25685 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Detaching volume 8ac6f07e-10cd-4304-b732-9202123dbda4#033[00m
Jan 20 10:14:21 np0005588920 nova_compute[226886]: 2026-01-20 15:14:21.543 226890 INFO nova.virt.block_device [None req-39766609-9b07-4f5c-8eba-fbbcd0f25685 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Attempting to driver detach volume 8ac6f07e-10cd-4304-b732-9202123dbda4 from mountpoint /dev/vdb#033[00m
Jan 20 10:14:21 np0005588920 nova_compute[226886]: 2026-01-20 15:14:21.551 226890 DEBUG nova.virt.libvirt.driver [None req-39766609-9b07-4f5c-8eba-fbbcd0f25685 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Attempting to detach device vdb from instance 54a13784-2a60-4b16-8208-d9b9d0e3033e from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 20 10:14:21 np0005588920 nova_compute[226886]: 2026-01-20 15:14:21.552 226890 DEBUG nova.virt.libvirt.guest [None req-39766609-9b07-4f5c-8eba-fbbcd0f25685 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] detach device xml: <disk type="network" device="disk">
Jan 20 10:14:21 np0005588920 nova_compute[226886]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 10:14:21 np0005588920 nova_compute[226886]:  <source protocol="rbd" name="volumes/volume-8ac6f07e-10cd-4304-b732-9202123dbda4">
Jan 20 10:14:21 np0005588920 nova_compute[226886]:    <host name="192.168.122.100" port="6789"/>
Jan 20 10:14:21 np0005588920 nova_compute[226886]:    <host name="192.168.122.102" port="6789"/>
Jan 20 10:14:21 np0005588920 nova_compute[226886]:    <host name="192.168.122.101" port="6789"/>
Jan 20 10:14:21 np0005588920 nova_compute[226886]:  </source>
Jan 20 10:14:21 np0005588920 nova_compute[226886]:  <target dev="vdb" bus="virtio"/>
Jan 20 10:14:21 np0005588920 nova_compute[226886]:  <serial>8ac6f07e-10cd-4304-b732-9202123dbda4</serial>
Jan 20 10:14:21 np0005588920 nova_compute[226886]:  <shareable/>
Jan 20 10:14:21 np0005588920 nova_compute[226886]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 20 10:14:21 np0005588920 nova_compute[226886]: </disk>
Jan 20 10:14:21 np0005588920 nova_compute[226886]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 20 10:14:21 np0005588920 nova_compute[226886]: 2026-01-20 15:14:21.670 226890 INFO nova.virt.libvirt.driver [None req-39766609-9b07-4f5c-8eba-fbbcd0f25685 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Successfully detached device vdb from instance 54a13784-2a60-4b16-8208-d9b9d0e3033e from the persistent domain config.#033[00m
Jan 20 10:14:21 np0005588920 nova_compute[226886]: 2026-01-20 15:14:21.670 226890 DEBUG nova.virt.libvirt.driver [None req-39766609-9b07-4f5c-8eba-fbbcd0f25685 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance 54a13784-2a60-4b16-8208-d9b9d0e3033e from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 20 10:14:21 np0005588920 nova_compute[226886]: 2026-01-20 15:14:21.671 226890 DEBUG nova.virt.libvirt.guest [None req-39766609-9b07-4f5c-8eba-fbbcd0f25685 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] detach device xml: <disk type="network" device="disk">
Jan 20 10:14:21 np0005588920 nova_compute[226886]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 10:14:21 np0005588920 nova_compute[226886]:  <source protocol="rbd" name="volumes/volume-8ac6f07e-10cd-4304-b732-9202123dbda4">
Jan 20 10:14:21 np0005588920 nova_compute[226886]:    <host name="192.168.122.100" port="6789"/>
Jan 20 10:14:21 np0005588920 nova_compute[226886]:    <host name="192.168.122.102" port="6789"/>
Jan 20 10:14:21 np0005588920 nova_compute[226886]:    <host name="192.168.122.101" port="6789"/>
Jan 20 10:14:21 np0005588920 nova_compute[226886]:  </source>
Jan 20 10:14:21 np0005588920 nova_compute[226886]:  <target dev="vdb" bus="virtio"/>
Jan 20 10:14:21 np0005588920 nova_compute[226886]:  <serial>8ac6f07e-10cd-4304-b732-9202123dbda4</serial>
Jan 20 10:14:21 np0005588920 nova_compute[226886]:  <shareable/>
Jan 20 10:14:21 np0005588920 nova_compute[226886]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 20 10:14:21 np0005588920 nova_compute[226886]: </disk>
Jan 20 10:14:21 np0005588920 nova_compute[226886]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 20 10:14:21 np0005588920 nova_compute[226886]: 2026-01-20 15:14:21.868 226890 DEBUG nova.virt.libvirt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Received event <DeviceRemovedEvent: 1768922061.8679976, 54a13784-2a60-4b16-8208-d9b9d0e3033e => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 20 10:14:21 np0005588920 nova_compute[226886]: 2026-01-20 15:14:21.872 226890 DEBUG nova.virt.libvirt.driver [None req-39766609-9b07-4f5c-8eba-fbbcd0f25685 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance 54a13784-2a60-4b16-8208-d9b9d0e3033e _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 20 10:14:21 np0005588920 nova_compute[226886]: 2026-01-20 15:14:21.874 226890 INFO nova.virt.libvirt.driver [None req-39766609-9b07-4f5c-8eba-fbbcd0f25685 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Successfully detached device vdb from instance 54a13784-2a60-4b16-8208-d9b9d0e3033e from the live domain config.#033[00m
Jan 20 10:14:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:14:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:21.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:14:22 np0005588920 nova_compute[226886]: 2026-01-20 15:14:22.120 226890 DEBUG nova.objects.instance [None req-39766609-9b07-4f5c-8eba-fbbcd0f25685 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lazy-loading 'flavor' on Instance uuid 54a13784-2a60-4b16-8208-d9b9d0e3033e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:14:22 np0005588920 nova_compute[226886]: 2026-01-20 15:14:22.169 226890 DEBUG oslo_concurrency.lockutils [None req-39766609-9b07-4f5c-8eba-fbbcd0f25685 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "54a13784-2a60-4b16-8208-d9b9d0e3033e" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 0.776s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:14:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:14:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:22.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:14:22 np0005588920 nova_compute[226886]: 2026-01-20 15:14:22.504 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:22 np0005588920 nova_compute[226886]: 2026-01-20 15:14:22.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:14:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:14:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:14:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:23.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:14:24 np0005588920 podman[295457]: 2026-01-20 15:14:24.03503052 +0000 UTC m=+0.117438581 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20251202, config_id=ovn_controller)
Jan 20 10:14:24 np0005588920 nova_compute[226886]: 2026-01-20 15:14:24.157 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:24.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:25.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:26.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:26 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e400 e400: 3 total, 3 up, 3 in
Jan 20 10:14:27 np0005588920 nova_compute[226886]: 2026-01-20 15:14:27.506 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e400 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:14:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:27.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:28.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:29 np0005588920 nova_compute[226886]: 2026-01-20 15:14:29.159 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:29.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:14:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:30.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:14:31 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e401 e401: 3 total, 3 up, 3 in
Jan 20 10:14:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:31.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:32.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:32 np0005588920 nova_compute[226886]: 2026-01-20 15:14:32.510 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e401 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:14:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:33.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:34 np0005588920 nova_compute[226886]: 2026-01-20 15:14:34.161 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 10:14:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:34.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 10:14:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:36.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:36.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:36 np0005588920 podman[295484]: 2026-01-20 15:14:36.285506498 +0000 UTC m=+0.046540567 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:14:37 np0005588920 nova_compute[226886]: 2026-01-20 15:14:37.513 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e401 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:14:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:38.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:38.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:39 np0005588920 nova_compute[226886]: 2026-01-20 15:14:39.163 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:40.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:40.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:14:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:42.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:14:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:42.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:42 np0005588920 nova_compute[226886]: 2026-01-20 15:14:42.409 226890 DEBUG nova.compute.manager [req-310a5e96-2ee4-4540-9804-b78843c9f796 req-04da0502-44d3-467a-bb2b-cebbe31b2ce8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Received event network-changed-b8bc07e2-c826-408c-a1a5-f45ad76b5888 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:14:42 np0005588920 nova_compute[226886]: 2026-01-20 15:14:42.409 226890 DEBUG nova.compute.manager [req-310a5e96-2ee4-4540-9804-b78843c9f796 req-04da0502-44d3-467a-bb2b-cebbe31b2ce8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Refreshing instance network info cache due to event network-changed-b8bc07e2-c826-408c-a1a5-f45ad76b5888. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:14:42 np0005588920 nova_compute[226886]: 2026-01-20 15:14:42.410 226890 DEBUG oslo_concurrency.lockutils [req-310a5e96-2ee4-4540-9804-b78843c9f796 req-04da0502-44d3-467a-bb2b-cebbe31b2ce8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-54a13784-2a60-4b16-8208-d9b9d0e3033e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:14:42 np0005588920 nova_compute[226886]: 2026-01-20 15:14:42.410 226890 DEBUG oslo_concurrency.lockutils [req-310a5e96-2ee4-4540-9804-b78843c9f796 req-04da0502-44d3-467a-bb2b-cebbe31b2ce8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-54a13784-2a60-4b16-8208-d9b9d0e3033e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:14:42 np0005588920 nova_compute[226886]: 2026-01-20 15:14:42.410 226890 DEBUG nova.network.neutron [req-310a5e96-2ee4-4540-9804-b78843c9f796 req-04da0502-44d3-467a-bb2b-cebbe31b2ce8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Refreshing network info cache for port b8bc07e2-c826-408c-a1a5-f45ad76b5888 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:14:42 np0005588920 nova_compute[226886]: 2026-01-20 15:14:42.516 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e401 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:14:43 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:14:43 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:14:43 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:14:44 np0005588920 nova_compute[226886]: 2026-01-20 15:14:44.165 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:44.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:44.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:44 np0005588920 nova_compute[226886]: 2026-01-20 15:14:44.704 226890 DEBUG nova.network.neutron [req-310a5e96-2ee4-4540-9804-b78843c9f796 req-04da0502-44d3-467a-bb2b-cebbe31b2ce8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Updated VIF entry in instance network info cache for port b8bc07e2-c826-408c-a1a5-f45ad76b5888. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:14:44 np0005588920 nova_compute[226886]: 2026-01-20 15:14:44.705 226890 DEBUG nova.network.neutron [req-310a5e96-2ee4-4540-9804-b78843c9f796 req-04da0502-44d3-467a-bb2b-cebbe31b2ce8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Updating instance_info_cache with network_info: [{"id": "b8bc07e2-c826-408c-a1a5-f45ad76b5888", "address": "fa:16:3e:ce:5d:51", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb8bc07e2-c8", "ovs_interfaceid": "b8bc07e2-c826-408c-a1a5-f45ad76b5888", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:14:44 np0005588920 nova_compute[226886]: 2026-01-20 15:14:44.743 226890 DEBUG oslo_concurrency.lockutils [req-310a5e96-2ee4-4540-9804-b78843c9f796 req-04da0502-44d3-467a-bb2b-cebbe31b2ce8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-54a13784-2a60-4b16-8208-d9b9d0e3033e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:14:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:14:45.717 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=60, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=59) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:14:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:14:45.718 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:14:45 np0005588920 nova_compute[226886]: 2026-01-20 15:14:45.718 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:46.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:46.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:46 np0005588920 nova_compute[226886]: 2026-01-20 15:14:46.905 226890 DEBUG oslo_concurrency.lockutils [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquiring lock "b3504af3-390e-4ab0-8af6-15749a887d8f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:14:46 np0005588920 nova_compute[226886]: 2026-01-20 15:14:46.906 226890 DEBUG oslo_concurrency.lockutils [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "b3504af3-390e-4ab0-8af6-15749a887d8f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:14:46 np0005588920 nova_compute[226886]: 2026-01-20 15:14:46.920 226890 DEBUG nova.compute.manager [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 10:14:46 np0005588920 nova_compute[226886]: 2026-01-20 15:14:46.991 226890 DEBUG oslo_concurrency.lockutils [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:14:46 np0005588920 nova_compute[226886]: 2026-01-20 15:14:46.992 226890 DEBUG oslo_concurrency.lockutils [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:14:46 np0005588920 nova_compute[226886]: 2026-01-20 15:14:46.997 226890 DEBUG nova.virt.hardware [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 10:14:46 np0005588920 nova_compute[226886]: 2026-01-20 15:14:46.998 226890 INFO nova.compute.claims [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 10:14:47 np0005588920 nova_compute[226886]: 2026-01-20 15:14:47.183 226890 DEBUG oslo_concurrency.processutils [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:14:47 np0005588920 nova_compute[226886]: 2026-01-20 15:14:47.517 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:14:47 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1149942708' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:14:47 np0005588920 nova_compute[226886]: 2026-01-20 15:14:47.655 226890 DEBUG oslo_concurrency.processutils [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:14:47 np0005588920 nova_compute[226886]: 2026-01-20 15:14:47.661 226890 DEBUG nova.compute.provider_tree [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:14:47 np0005588920 nova_compute[226886]: 2026-01-20 15:14:47.678 226890 DEBUG nova.scheduler.client.report [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:14:47 np0005588920 nova_compute[226886]: 2026-01-20 15:14:47.701 226890 DEBUG oslo_concurrency.lockutils [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.710s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:14:47 np0005588920 nova_compute[226886]: 2026-01-20 15:14:47.703 226890 DEBUG nova.compute.manager [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 10:14:47 np0005588920 nova_compute[226886]: 2026-01-20 15:14:47.765 226890 DEBUG nova.compute.manager [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 10:14:47 np0005588920 nova_compute[226886]: 2026-01-20 15:14:47.766 226890 DEBUG nova.network.neutron [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 10:14:47 np0005588920 nova_compute[226886]: 2026-01-20 15:14:47.791 226890 INFO nova.virt.libvirt.driver [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 10:14:47 np0005588920 nova_compute[226886]: 2026-01-20 15:14:47.818 226890 DEBUG nova.compute.manager [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 10:14:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e401 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:14:47 np0005588920 nova_compute[226886]: 2026-01-20 15:14:47.940 226890 DEBUG nova.compute.manager [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 10:14:47 np0005588920 nova_compute[226886]: 2026-01-20 15:14:47.941 226890 DEBUG nova.virt.libvirt.driver [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 10:14:47 np0005588920 nova_compute[226886]: 2026-01-20 15:14:47.942 226890 INFO nova.virt.libvirt.driver [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Creating image(s)#033[00m
Jan 20 10:14:47 np0005588920 nova_compute[226886]: 2026-01-20 15:14:47.966 226890 DEBUG nova.storage.rbd_utils [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] rbd image b3504af3-390e-4ab0-8af6-15749a887d8f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:14:48 np0005588920 nova_compute[226886]: 2026-01-20 15:14:48.001 226890 DEBUG nova.storage.rbd_utils [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] rbd image b3504af3-390e-4ab0-8af6-15749a887d8f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:14:48 np0005588920 nova_compute[226886]: 2026-01-20 15:14:48.030 226890 DEBUG nova.storage.rbd_utils [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] rbd image b3504af3-390e-4ab0-8af6-15749a887d8f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:14:48 np0005588920 nova_compute[226886]: 2026-01-20 15:14:48.034 226890 DEBUG oslo_concurrency.processutils [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:14:48 np0005588920 nova_compute[226886]: 2026-01-20 15:14:48.103 226890 DEBUG oslo_concurrency.processutils [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:14:48 np0005588920 nova_compute[226886]: 2026-01-20 15:14:48.104 226890 DEBUG oslo_concurrency.lockutils [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:14:48 np0005588920 nova_compute[226886]: 2026-01-20 15:14:48.105 226890 DEBUG oslo_concurrency.lockutils [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:14:48 np0005588920 nova_compute[226886]: 2026-01-20 15:14:48.105 226890 DEBUG oslo_concurrency.lockutils [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:14:48 np0005588920 nova_compute[226886]: 2026-01-20 15:14:48.129 226890 DEBUG nova.storage.rbd_utils [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] rbd image b3504af3-390e-4ab0-8af6-15749a887d8f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:14:48 np0005588920 nova_compute[226886]: 2026-01-20 15:14:48.133 226890 DEBUG oslo_concurrency.processutils [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 b3504af3-390e-4ab0-8af6-15749a887d8f_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:14:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:14:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:48.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:14:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:48.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:48 np0005588920 nova_compute[226886]: 2026-01-20 15:14:48.423 226890 DEBUG oslo_concurrency.processutils [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 b3504af3-390e-4ab0-8af6-15749a887d8f_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.289s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:14:48 np0005588920 nova_compute[226886]: 2026-01-20 15:14:48.502 226890 DEBUG nova.storage.rbd_utils [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] resizing rbd image b3504af3-390e-4ab0-8af6-15749a887d8f_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 10:14:48 np0005588920 nova_compute[226886]: 2026-01-20 15:14:48.541 226890 DEBUG nova.policy [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e9cc4ce3e069479ba9c789b378a68a1d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fff727019f86407498e83d7948d54962', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 10:14:48 np0005588920 nova_compute[226886]: 2026-01-20 15:14:48.619 226890 DEBUG nova.objects.instance [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lazy-loading 'migration_context' on Instance uuid b3504af3-390e-4ab0-8af6-15749a887d8f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:14:48 np0005588920 nova_compute[226886]: 2026-01-20 15:14:48.637 226890 DEBUG nova.virt.libvirt.driver [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 10:14:48 np0005588920 nova_compute[226886]: 2026-01-20 15:14:48.637 226890 DEBUG nova.virt.libvirt.driver [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Ensure instance console log exists: /var/lib/nova/instances/b3504af3-390e-4ab0-8af6-15749a887d8f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:14:48 np0005588920 nova_compute[226886]: 2026-01-20 15:14:48.639 226890 DEBUG oslo_concurrency.lockutils [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:14:48 np0005588920 nova_compute[226886]: 2026-01-20 15:14:48.640 226890 DEBUG oslo_concurrency.lockutils [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:14:48 np0005588920 nova_compute[226886]: 2026-01-20 15:14:48.640 226890 DEBUG oslo_concurrency.lockutils [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:14:49 np0005588920 nova_compute[226886]: 2026-01-20 15:14:49.166 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:49 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:14:49 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:14:50 np0005588920 nova_compute[226886]: 2026-01-20 15:14:50.173 226890 DEBUG nova.network.neutron [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Successfully created port: 1b18c40e-cce7-4971-98d2-c95ec41c9040 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 10:14:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:14:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:50.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:14:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:50.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:51 np0005588920 nova_compute[226886]: 2026-01-20 15:14:51.050 226890 DEBUG nova.network.neutron [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Successfully updated port: 1b18c40e-cce7-4971-98d2-c95ec41c9040 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 10:14:51 np0005588920 nova_compute[226886]: 2026-01-20 15:14:51.064 226890 DEBUG oslo_concurrency.lockutils [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquiring lock "refresh_cache-b3504af3-390e-4ab0-8af6-15749a887d8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:14:51 np0005588920 nova_compute[226886]: 2026-01-20 15:14:51.064 226890 DEBUG oslo_concurrency.lockutils [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquired lock "refresh_cache-b3504af3-390e-4ab0-8af6-15749a887d8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:14:51 np0005588920 nova_compute[226886]: 2026-01-20 15:14:51.064 226890 DEBUG nova.network.neutron [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:14:51 np0005588920 nova_compute[226886]: 2026-01-20 15:14:51.144 226890 DEBUG nova.compute.manager [req-e63b77a7-e2e5-4070-a55b-e510da51fcae req-42861376-edc2-4aa2-ae19-3e3f7151923d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Received event network-changed-1b18c40e-cce7-4971-98d2-c95ec41c9040 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:14:51 np0005588920 nova_compute[226886]: 2026-01-20 15:14:51.144 226890 DEBUG nova.compute.manager [req-e63b77a7-e2e5-4070-a55b-e510da51fcae req-42861376-edc2-4aa2-ae19-3e3f7151923d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Refreshing instance network info cache due to event network-changed-1b18c40e-cce7-4971-98d2-c95ec41c9040. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:14:51 np0005588920 nova_compute[226886]: 2026-01-20 15:14:51.145 226890 DEBUG oslo_concurrency.lockutils [req-e63b77a7-e2e5-4070-a55b-e510da51fcae req-42861376-edc2-4aa2-ae19-3e3f7151923d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-b3504af3-390e-4ab0-8af6-15749a887d8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:14:51 np0005588920 nova_compute[226886]: 2026-01-20 15:14:51.220 226890 DEBUG nova.network.neutron [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:14:52 np0005588920 nova_compute[226886]: 2026-01-20 15:14:52.024 226890 DEBUG nova.network.neutron [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Updating instance_info_cache with network_info: [{"id": "1b18c40e-cce7-4971-98d2-c95ec41c9040", "address": "fa:16:3e:eb:05:c3", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b18c40e-cc", "ovs_interfaceid": "1b18c40e-cce7-4971-98d2-c95ec41c9040", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:14:52 np0005588920 nova_compute[226886]: 2026-01-20 15:14:52.074 226890 DEBUG oslo_concurrency.lockutils [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Releasing lock "refresh_cache-b3504af3-390e-4ab0-8af6-15749a887d8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:14:52 np0005588920 nova_compute[226886]: 2026-01-20 15:14:52.074 226890 DEBUG nova.compute.manager [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Instance network_info: |[{"id": "1b18c40e-cce7-4971-98d2-c95ec41c9040", "address": "fa:16:3e:eb:05:c3", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b18c40e-cc", "ovs_interfaceid": "1b18c40e-cce7-4971-98d2-c95ec41c9040", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 10:14:52 np0005588920 nova_compute[226886]: 2026-01-20 15:14:52.075 226890 DEBUG oslo_concurrency.lockutils [req-e63b77a7-e2e5-4070-a55b-e510da51fcae req-42861376-edc2-4aa2-ae19-3e3f7151923d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-b3504af3-390e-4ab0-8af6-15749a887d8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:14:52 np0005588920 nova_compute[226886]: 2026-01-20 15:14:52.075 226890 DEBUG nova.network.neutron [req-e63b77a7-e2e5-4070-a55b-e510da51fcae req-42861376-edc2-4aa2-ae19-3e3f7151923d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Refreshing network info cache for port 1b18c40e-cce7-4971-98d2-c95ec41c9040 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:14:52 np0005588920 nova_compute[226886]: 2026-01-20 15:14:52.079 226890 DEBUG nova.virt.libvirt.driver [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Start _get_guest_xml network_info=[{"id": "1b18c40e-cce7-4971-98d2-c95ec41c9040", "address": "fa:16:3e:eb:05:c3", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b18c40e-cc", "ovs_interfaceid": "1b18c40e-cce7-4971-98d2-c95ec41c9040", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:14:52 np0005588920 nova_compute[226886]: 2026-01-20 15:14:52.084 226890 WARNING nova.virt.libvirt.driver [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:14:52 np0005588920 nova_compute[226886]: 2026-01-20 15:14:52.123 226890 DEBUG nova.virt.libvirt.host [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:14:52 np0005588920 nova_compute[226886]: 2026-01-20 15:14:52.124 226890 DEBUG nova.virt.libvirt.host [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:14:52 np0005588920 nova_compute[226886]: 2026-01-20 15:14:52.133 226890 DEBUG nova.virt.libvirt.host [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:14:52 np0005588920 nova_compute[226886]: 2026-01-20 15:14:52.133 226890 DEBUG nova.virt.libvirt.host [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:14:52 np0005588920 nova_compute[226886]: 2026-01-20 15:14:52.134 226890 DEBUG nova.virt.libvirt.driver [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:14:52 np0005588920 nova_compute[226886]: 2026-01-20 15:14:52.134 226890 DEBUG nova.virt.hardware [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:14:52 np0005588920 nova_compute[226886]: 2026-01-20 15:14:52.135 226890 DEBUG nova.virt.hardware [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:14:52 np0005588920 nova_compute[226886]: 2026-01-20 15:14:52.135 226890 DEBUG nova.virt.hardware [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:14:52 np0005588920 nova_compute[226886]: 2026-01-20 15:14:52.135 226890 DEBUG nova.virt.hardware [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:14:52 np0005588920 nova_compute[226886]: 2026-01-20 15:14:52.135 226890 DEBUG nova.virt.hardware [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:14:52 np0005588920 nova_compute[226886]: 2026-01-20 15:14:52.136 226890 DEBUG nova.virt.hardware [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:14:52 np0005588920 nova_compute[226886]: 2026-01-20 15:14:52.136 226890 DEBUG nova.virt.hardware [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:14:52 np0005588920 nova_compute[226886]: 2026-01-20 15:14:52.136 226890 DEBUG nova.virt.hardware [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:14:52 np0005588920 nova_compute[226886]: 2026-01-20 15:14:52.136 226890 DEBUG nova.virt.hardware [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:14:52 np0005588920 nova_compute[226886]: 2026-01-20 15:14:52.136 226890 DEBUG nova.virt.hardware [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:14:52 np0005588920 nova_compute[226886]: 2026-01-20 15:14:52.137 226890 DEBUG nova.virt.hardware [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:14:52 np0005588920 nova_compute[226886]: 2026-01-20 15:14:52.139 226890 DEBUG oslo_concurrency.processutils [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:14:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:52.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:52.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:52 np0005588920 nova_compute[226886]: 2026-01-20 15:14:52.519 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:14:52 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1450403844' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:14:52 np0005588920 nova_compute[226886]: 2026-01-20 15:14:52.614 226890 DEBUG oslo_concurrency.processutils [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:14:52 np0005588920 nova_compute[226886]: 2026-01-20 15:14:52.638 226890 DEBUG nova.storage.rbd_utils [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] rbd image b3504af3-390e-4ab0-8af6-15749a887d8f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:14:52 np0005588920 nova_compute[226886]: 2026-01-20 15:14:52.643 226890 DEBUG oslo_concurrency.processutils [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:14:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e401 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:14:53 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:14:53 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2924848263' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:14:53 np0005588920 nova_compute[226886]: 2026-01-20 15:14:53.113 226890 DEBUG oslo_concurrency.processutils [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:14:53 np0005588920 nova_compute[226886]: 2026-01-20 15:14:53.116 226890 DEBUG nova.virt.libvirt.vif [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:14:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='multiattach-server-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='multiattach-server-1',id=189,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL5L2o6o5dLcQyaIfhCZ5CKxQlecqNGmP68oHIQEsVoKIC2qfrMKjObT9GdMU8oznX9LVUwIWCShhlEJu9ZqPiutEL2afEJ1hQQamjERNcx9wWS2NfOgykA4yugQphfOtA==',key_name='tempest-keypair-1568469072',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fff727019f86407498e83d7948d54962',ramdisk_id='',reservation_id='r-akp879yg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',net
work_allocated='True',owner_project_name='tempest-AttachVolumeMultiAttachTest-418194625',owner_user_name='tempest-AttachVolumeMultiAttachTest-418194625-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:14:47Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e9cc4ce3e069479ba9c789b378a68a1d',uuid=b3504af3-390e-4ab0-8af6-15749a887d8f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1b18c40e-cce7-4971-98d2-c95ec41c9040", "address": "fa:16:3e:eb:05:c3", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b18c40e-cc", "ovs_interfaceid": "1b18c40e-cce7-4971-98d2-c95ec41c9040", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:14:53 np0005588920 nova_compute[226886]: 2026-01-20 15:14:53.116 226890 DEBUG nova.network.os_vif_util [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Converting VIF {"id": "1b18c40e-cce7-4971-98d2-c95ec41c9040", "address": "fa:16:3e:eb:05:c3", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b18c40e-cc", "ovs_interfaceid": "1b18c40e-cce7-4971-98d2-c95ec41c9040", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:14:53 np0005588920 nova_compute[226886]: 2026-01-20 15:14:53.117 226890 DEBUG nova.network.os_vif_util [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:eb:05:c3,bridge_name='br-int',has_traffic_filtering=True,id=1b18c40e-cce7-4971-98d2-c95ec41c9040,network=Network(c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b18c40e-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:14:53 np0005588920 nova_compute[226886]: 2026-01-20 15:14:53.118 226890 DEBUG nova.objects.instance [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lazy-loading 'pci_devices' on Instance uuid b3504af3-390e-4ab0-8af6-15749a887d8f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:14:53 np0005588920 nova_compute[226886]: 2026-01-20 15:14:53.138 226890 DEBUG nova.virt.libvirt.driver [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:14:53 np0005588920 nova_compute[226886]:  <uuid>b3504af3-390e-4ab0-8af6-15749a887d8f</uuid>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:  <name>instance-000000bd</name>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:14:53 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:      <nova:name>multiattach-server-1</nova:name>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 15:14:52</nova:creationTime>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 10:14:53 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:        <nova:user uuid="e9cc4ce3e069479ba9c789b378a68a1d">tempest-AttachVolumeMultiAttachTest-418194625-project-member</nova:user>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:        <nova:project uuid="fff727019f86407498e83d7948d54962">tempest-AttachVolumeMultiAttachTest-418194625</nova:project>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:        <nova:port uuid="1b18c40e-cce7-4971-98d2-c95ec41c9040">
Jan 20 10:14:53 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    <system>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:      <entry name="serial">b3504af3-390e-4ab0-8af6-15749a887d8f</entry>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:      <entry name="uuid">b3504af3-390e-4ab0-8af6-15749a887d8f</entry>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    </system>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:  <os>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:  </os>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:  <features>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:  </features>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:  </clock>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:  <devices>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 10:14:53 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/b3504af3-390e-4ab0-8af6-15749a887d8f_disk">
Jan 20 10:14:53 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:14:53 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 10:14:53 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/b3504af3-390e-4ab0-8af6-15749a887d8f_disk.config">
Jan 20 10:14:53 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:14:53 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 10:14:53 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:eb:05:c3"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:      <target dev="tap1b18c40e-cc"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    </interface>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 10:14:53 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/b3504af3-390e-4ab0-8af6-15749a887d8f/console.log" append="off"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    </serial>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    <video>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    </video>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 10:14:53 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    </rng>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 10:14:53 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 10:14:53 np0005588920 nova_compute[226886]:  </devices>
Jan 20 10:14:53 np0005588920 nova_compute[226886]: </domain>
Jan 20 10:14:53 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:14:53 np0005588920 nova_compute[226886]: 2026-01-20 15:14:53.140 226890 DEBUG nova.compute.manager [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Preparing to wait for external event network-vif-plugged-1b18c40e-cce7-4971-98d2-c95ec41c9040 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:14:53 np0005588920 nova_compute[226886]: 2026-01-20 15:14:53.141 226890 DEBUG oslo_concurrency.lockutils [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquiring lock "b3504af3-390e-4ab0-8af6-15749a887d8f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:14:53 np0005588920 nova_compute[226886]: 2026-01-20 15:14:53.141 226890 DEBUG oslo_concurrency.lockutils [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "b3504af3-390e-4ab0-8af6-15749a887d8f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:14:53 np0005588920 nova_compute[226886]: 2026-01-20 15:14:53.141 226890 DEBUG oslo_concurrency.lockutils [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "b3504af3-390e-4ab0-8af6-15749a887d8f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:14:53 np0005588920 nova_compute[226886]: 2026-01-20 15:14:53.142 226890 DEBUG nova.virt.libvirt.vif [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:14:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='multiattach-server-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='multiattach-server-1',id=189,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL5L2o6o5dLcQyaIfhCZ5CKxQlecqNGmP68oHIQEsVoKIC2qfrMKjObT9GdMU8oznX9LVUwIWCShhlEJu9ZqPiutEL2afEJ1hQQamjERNcx9wWS2NfOgykA4yugQphfOtA==',key_name='tempest-keypair-1568469072',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fff727019f86407498e83d7948d54962',ramdisk_id='',reservation_id='r-akp879yg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_r
am='0',network_allocated='True',owner_project_name='tempest-AttachVolumeMultiAttachTest-418194625',owner_user_name='tempest-AttachVolumeMultiAttachTest-418194625-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:14:47Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e9cc4ce3e069479ba9c789b378a68a1d',uuid=b3504af3-390e-4ab0-8af6-15749a887d8f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1b18c40e-cce7-4971-98d2-c95ec41c9040", "address": "fa:16:3e:eb:05:c3", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b18c40e-cc", "ovs_interfaceid": "1b18c40e-cce7-4971-98d2-c95ec41c9040", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:14:53 np0005588920 nova_compute[226886]: 2026-01-20 15:14:53.142 226890 DEBUG nova.network.os_vif_util [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Converting VIF {"id": "1b18c40e-cce7-4971-98d2-c95ec41c9040", "address": "fa:16:3e:eb:05:c3", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b18c40e-cc", "ovs_interfaceid": "1b18c40e-cce7-4971-98d2-c95ec41c9040", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:14:53 np0005588920 nova_compute[226886]: 2026-01-20 15:14:53.143 226890 DEBUG nova.network.os_vif_util [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:eb:05:c3,bridge_name='br-int',has_traffic_filtering=True,id=1b18c40e-cce7-4971-98d2-c95ec41c9040,network=Network(c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b18c40e-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:14:53 np0005588920 nova_compute[226886]: 2026-01-20 15:14:53.143 226890 DEBUG os_vif [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:05:c3,bridge_name='br-int',has_traffic_filtering=True,id=1b18c40e-cce7-4971-98d2-c95ec41c9040,network=Network(c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b18c40e-cc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:14:53 np0005588920 nova_compute[226886]: 2026-01-20 15:14:53.144 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:53 np0005588920 nova_compute[226886]: 2026-01-20 15:14:53.144 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:14:53 np0005588920 nova_compute[226886]: 2026-01-20 15:14:53.146 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:14:53 np0005588920 nova_compute[226886]: 2026-01-20 15:14:53.151 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:53 np0005588920 nova_compute[226886]: 2026-01-20 15:14:53.151 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1b18c40e-cc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:14:53 np0005588920 nova_compute[226886]: 2026-01-20 15:14:53.152 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1b18c40e-cc, col_values=(('external_ids', {'iface-id': '1b18c40e-cce7-4971-98d2-c95ec41c9040', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:eb:05:c3', 'vm-uuid': 'b3504af3-390e-4ab0-8af6-15749a887d8f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:14:53 np0005588920 nova_compute[226886]: 2026-01-20 15:14:53.154 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:53 np0005588920 NetworkManager[49076]: <info>  [1768922093.1561] manager: (tap1b18c40e-cc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/407)
Jan 20 10:14:53 np0005588920 nova_compute[226886]: 2026-01-20 15:14:53.158 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:14:53 np0005588920 nova_compute[226886]: 2026-01-20 15:14:53.161 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:53 np0005588920 nova_compute[226886]: 2026-01-20 15:14:53.162 226890 INFO os_vif [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:05:c3,bridge_name='br-int',has_traffic_filtering=True,id=1b18c40e-cce7-4971-98d2-c95ec41c9040,network=Network(c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b18c40e-cc')#033[00m
Jan 20 10:14:53 np0005588920 nova_compute[226886]: 2026-01-20 15:14:53.236 226890 DEBUG nova.virt.libvirt.driver [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:14:53 np0005588920 nova_compute[226886]: 2026-01-20 15:14:53.237 226890 DEBUG nova.virt.libvirt.driver [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:14:53 np0005588920 nova_compute[226886]: 2026-01-20 15:14:53.237 226890 DEBUG nova.virt.libvirt.driver [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] No VIF found with MAC fa:16:3e:eb:05:c3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:14:53 np0005588920 nova_compute[226886]: 2026-01-20 15:14:53.238 226890 INFO nova.virt.libvirt.driver [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Using config drive#033[00m
Jan 20 10:14:53 np0005588920 nova_compute[226886]: 2026-01-20 15:14:53.262 226890 DEBUG nova.storage.rbd_utils [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] rbd image b3504af3-390e-4ab0-8af6-15749a887d8f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:14:53 np0005588920 nova_compute[226886]: 2026-01-20 15:14:53.739 226890 INFO nova.virt.libvirt.driver [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Creating config drive at /var/lib/nova/instances/b3504af3-390e-4ab0-8af6-15749a887d8f/disk.config#033[00m
Jan 20 10:14:53 np0005588920 nova_compute[226886]: 2026-01-20 15:14:53.745 226890 DEBUG oslo_concurrency.processutils [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b3504af3-390e-4ab0-8af6-15749a887d8f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp83c0tsxi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:14:53 np0005588920 nova_compute[226886]: 2026-01-20 15:14:53.881 226890 DEBUG oslo_concurrency.processutils [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b3504af3-390e-4ab0-8af6-15749a887d8f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp83c0tsxi" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:14:53 np0005588920 nova_compute[226886]: 2026-01-20 15:14:53.912 226890 DEBUG nova.storage.rbd_utils [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] rbd image b3504af3-390e-4ab0-8af6-15749a887d8f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:14:53 np0005588920 nova_compute[226886]: 2026-01-20 15:14:53.916 226890 DEBUG oslo_concurrency.processutils [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/b3504af3-390e-4ab0-8af6-15749a887d8f/disk.config b3504af3-390e-4ab0-8af6-15749a887d8f_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:14:54 np0005588920 nova_compute[226886]: 2026-01-20 15:14:54.062 226890 DEBUG oslo_concurrency.processutils [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/b3504af3-390e-4ab0-8af6-15749a887d8f/disk.config b3504af3-390e-4ab0-8af6-15749a887d8f_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.146s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:14:54 np0005588920 nova_compute[226886]: 2026-01-20 15:14:54.063 226890 INFO nova.virt.libvirt.driver [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Deleting local config drive /var/lib/nova/instances/b3504af3-390e-4ab0-8af6-15749a887d8f/disk.config because it was imported into RBD.#033[00m
Jan 20 10:14:54 np0005588920 NetworkManager[49076]: <info>  [1768922094.1135] manager: (tap1b18c40e-cc): new Tun device (/org/freedesktop/NetworkManager/Devices/408)
Jan 20 10:14:54 np0005588920 kernel: tap1b18c40e-cc: entered promiscuous mode
Jan 20 10:14:54 np0005588920 nova_compute[226886]: 2026-01-20 15:14:54.115 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:54 np0005588920 ovn_controller[133971]: 2026-01-20T15:14:54Z|00858|binding|INFO|Claiming lport 1b18c40e-cce7-4971-98d2-c95ec41c9040 for this chassis.
Jan 20 10:14:54 np0005588920 ovn_controller[133971]: 2026-01-20T15:14:54Z|00859|binding|INFO|1b18c40e-cce7-4971-98d2-c95ec41c9040: Claiming fa:16:3e:eb:05:c3 10.100.0.7
Jan 20 10:14:54 np0005588920 ovn_controller[133971]: 2026-01-20T15:14:54Z|00860|binding|INFO|Setting lport 1b18c40e-cce7-4971-98d2-c95ec41c9040 ovn-installed in OVS
Jan 20 10:14:54 np0005588920 nova_compute[226886]: 2026-01-20 15:14:54.131 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:54 np0005588920 nova_compute[226886]: 2026-01-20 15:14:54.134 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:54 np0005588920 systemd-machined[196121]: New machine qemu-90-instance-000000bd.
Jan 20 10:14:54 np0005588920 ovn_controller[133971]: 2026-01-20T15:14:54Z|00861|binding|INFO|Setting lport 1b18c40e-cce7-4971-98d2-c95ec41c9040 up in Southbound
Jan 20 10:14:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:14:54.152 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:eb:05:c3 10.100.0.7'], port_security=['fa:16:3e:eb:05:c3 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'b3504af3-390e-4ab0-8af6-15749a887d8f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fff727019f86407498e83d7948d54962', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5ace6a2f-56c6-4679-bb81-70ccb27ab312', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=87d69a20-7690-494a-ac16-7c600840561a, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=1b18c40e-cce7-4971-98d2-c95ec41c9040) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:14:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:14:54.154 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 1b18c40e-cce7-4971-98d2-c95ec41c9040 in datapath c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab bound to our chassis#033[00m
Jan 20 10:14:54 np0005588920 systemd[1]: Started Virtual Machine qemu-90-instance-000000bd.
Jan 20 10:14:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:14:54.155 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab#033[00m
Jan 20 10:14:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:14:54.171 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d4409f82-a981-4cfe-8a1e-c2f9cfa68c9f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:14:54 np0005588920 systemd-udevd[296021]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:14:54 np0005588920 NetworkManager[49076]: <info>  [1768922094.1868] device (tap1b18c40e-cc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:14:54 np0005588920 NetworkManager[49076]: <info>  [1768922094.1874] device (tap1b18c40e-cc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:14:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:14:54.202 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[48c625ca-3fa7-47da-838e-26f511aebb6d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:14:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:14:54.205 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[72e87d8d-b4ef-4bb1-b095-4cd6b4d38160]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:14:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:14:54.230 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[6781bd30-94b2-4f1f-95f5-01228379f6a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:14:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:14:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:54.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:14:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:14:54.245 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[4d99776f-3992-4a1a-a097-33c8f0753129]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc1f4a971-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fc:30:f0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 271], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 706770, 'reachable_time': 35529, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 296049, 'error': None, 'target': 'ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:14:54 np0005588920 podman[296007]: 2026-01-20 15:14:54.25865163 +0000 UTC m=+0.114539936 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:14:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:14:54.262 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[98350251-b5ac-4383-acdb-9747bc9284cd]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapc1f4a971-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 706780, 'tstamp': 706780}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 296050, 'error': None, 'target': 'ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapc1f4a971-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 706783, 'tstamp': 706783}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 296050, 'error': None, 'target': 'ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:14:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:14:54.263 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc1f4a971-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:14:54 np0005588920 nova_compute[226886]: 2026-01-20 15:14:54.265 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:54 np0005588920 nova_compute[226886]: 2026-01-20 15:14:54.266 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:14:54.267 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc1f4a971-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:14:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:14:54.267 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:14:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:14:54.267 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc1f4a971-00, col_values=(('external_ids', {'iface-id': 'b20b0e27-0b08-4316-b6df-6784416f44c0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:14:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:14:54.267 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:14:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:54.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:54 np0005588920 ovn_controller[133971]: 2026-01-20T15:14:54Z|00862|binding|INFO|Releasing lport b20b0e27-0b08-4316-b6df-6784416f44c0 from this chassis (sb_readonly=0)
Jan 20 10:14:54 np0005588920 nova_compute[226886]: 2026-01-20 15:14:54.513 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:54 np0005588920 nova_compute[226886]: 2026-01-20 15:14:54.564 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768922094.5638146, b3504af3-390e-4ab0-8af6-15749a887d8f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:14:54 np0005588920 nova_compute[226886]: 2026-01-20 15:14:54.565 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] VM Started (Lifecycle Event)#033[00m
Jan 20 10:14:54 np0005588920 nova_compute[226886]: 2026-01-20 15:14:54.572 226890 DEBUG nova.network.neutron [req-e63b77a7-e2e5-4070-a55b-e510da51fcae req-42861376-edc2-4aa2-ae19-3e3f7151923d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Updated VIF entry in instance network info cache for port 1b18c40e-cce7-4971-98d2-c95ec41c9040. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:14:54 np0005588920 nova_compute[226886]: 2026-01-20 15:14:54.572 226890 DEBUG nova.network.neutron [req-e63b77a7-e2e5-4070-a55b-e510da51fcae req-42861376-edc2-4aa2-ae19-3e3f7151923d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Updating instance_info_cache with network_info: [{"id": "1b18c40e-cce7-4971-98d2-c95ec41c9040", "address": "fa:16:3e:eb:05:c3", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b18c40e-cc", "ovs_interfaceid": "1b18c40e-cce7-4971-98d2-c95ec41c9040", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:14:54 np0005588920 nova_compute[226886]: 2026-01-20 15:14:54.591 226890 DEBUG oslo_concurrency.lockutils [req-e63b77a7-e2e5-4070-a55b-e510da51fcae req-42861376-edc2-4aa2-ae19-3e3f7151923d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-b3504af3-390e-4ab0-8af6-15749a887d8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:14:54 np0005588920 nova_compute[226886]: 2026-01-20 15:14:54.597 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:14:54 np0005588920 nova_compute[226886]: 2026-01-20 15:14:54.602 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768922094.5641506, b3504af3-390e-4ab0-8af6-15749a887d8f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:14:54 np0005588920 nova_compute[226886]: 2026-01-20 15:14:54.603 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:14:54 np0005588920 nova_compute[226886]: 2026-01-20 15:14:54.627 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:14:54 np0005588920 nova_compute[226886]: 2026-01-20 15:14:54.630 226890 DEBUG nova.compute.manager [req-b3cfa183-5dcf-438a-9ea3-07e8f4626bf1 req-b25938db-a511-47b2-bee0-cddcad74c5be 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Received event network-vif-plugged-1b18c40e-cce7-4971-98d2-c95ec41c9040 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:14:54 np0005588920 nova_compute[226886]: 2026-01-20 15:14:54.630 226890 DEBUG oslo_concurrency.lockutils [req-b3cfa183-5dcf-438a-9ea3-07e8f4626bf1 req-b25938db-a511-47b2-bee0-cddcad74c5be 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "b3504af3-390e-4ab0-8af6-15749a887d8f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:14:54 np0005588920 nova_compute[226886]: 2026-01-20 15:14:54.631 226890 DEBUG oslo_concurrency.lockutils [req-b3cfa183-5dcf-438a-9ea3-07e8f4626bf1 req-b25938db-a511-47b2-bee0-cddcad74c5be 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b3504af3-390e-4ab0-8af6-15749a887d8f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:14:54 np0005588920 nova_compute[226886]: 2026-01-20 15:14:54.631 226890 DEBUG oslo_concurrency.lockutils [req-b3cfa183-5dcf-438a-9ea3-07e8f4626bf1 req-b25938db-a511-47b2-bee0-cddcad74c5be 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b3504af3-390e-4ab0-8af6-15749a887d8f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:14:54 np0005588920 nova_compute[226886]: 2026-01-20 15:14:54.631 226890 DEBUG nova.compute.manager [req-b3cfa183-5dcf-438a-9ea3-07e8f4626bf1 req-b25938db-a511-47b2-bee0-cddcad74c5be 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Processing event network-vif-plugged-1b18c40e-cce7-4971-98d2-c95ec41c9040 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 10:14:54 np0005588920 nova_compute[226886]: 2026-01-20 15:14:54.632 226890 DEBUG nova.compute.manager [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:14:54 np0005588920 nova_compute[226886]: 2026-01-20 15:14:54.636 226890 DEBUG nova.virt.libvirt.driver [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:14:54 np0005588920 nova_compute[226886]: 2026-01-20 15:14:54.637 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768922094.6357903, b3504af3-390e-4ab0-8af6-15749a887d8f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:14:54 np0005588920 nova_compute[226886]: 2026-01-20 15:14:54.638 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:14:54 np0005588920 nova_compute[226886]: 2026-01-20 15:14:54.641 226890 INFO nova.virt.libvirt.driver [-] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Instance spawned successfully.#033[00m
Jan 20 10:14:54 np0005588920 nova_compute[226886]: 2026-01-20 15:14:54.641 226890 DEBUG nova.virt.libvirt.driver [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 10:14:54 np0005588920 nova_compute[226886]: 2026-01-20 15:14:54.666 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:14:54 np0005588920 nova_compute[226886]: 2026-01-20 15:14:54.673 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:14:54 np0005588920 nova_compute[226886]: 2026-01-20 15:14:54.678 226890 DEBUG nova.virt.libvirt.driver [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:14:54 np0005588920 nova_compute[226886]: 2026-01-20 15:14:54.678 226890 DEBUG nova.virt.libvirt.driver [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:14:54 np0005588920 nova_compute[226886]: 2026-01-20 15:14:54.679 226890 DEBUG nova.virt.libvirt.driver [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:14:54 np0005588920 nova_compute[226886]: 2026-01-20 15:14:54.679 226890 DEBUG nova.virt.libvirt.driver [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:14:54 np0005588920 nova_compute[226886]: 2026-01-20 15:14:54.680 226890 DEBUG nova.virt.libvirt.driver [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:14:54 np0005588920 nova_compute[226886]: 2026-01-20 15:14:54.680 226890 DEBUG nova.virt.libvirt.driver [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:14:54 np0005588920 nova_compute[226886]: 2026-01-20 15:14:54.710 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:14:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:14:54.719 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '60'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:14:54 np0005588920 nova_compute[226886]: 2026-01-20 15:14:54.749 226890 INFO nova.compute.manager [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Took 6.81 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 10:14:54 np0005588920 nova_compute[226886]: 2026-01-20 15:14:54.750 226890 DEBUG nova.compute.manager [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:14:54 np0005588920 nova_compute[226886]: 2026-01-20 15:14:54.824 226890 INFO nova.compute.manager [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Took 7.86 seconds to build instance.#033[00m
Jan 20 10:14:54 np0005588920 nova_compute[226886]: 2026-01-20 15:14:54.847 226890 DEBUG oslo_concurrency.lockutils [None req-b99905f3-61d6-438f-b1cb-66e0cefbed93 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "b3504af3-390e-4ab0-8af6-15749a887d8f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.941s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:14:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:56.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 10:14:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:56.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 10:14:56 np0005588920 nova_compute[226886]: 2026-01-20 15:14:56.735 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:14:56 np0005588920 nova_compute[226886]: 2026-01-20 15:14:56.735 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:14:56 np0005588920 nova_compute[226886]: 2026-01-20 15:14:56.736 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:14:57 np0005588920 nova_compute[226886]: 2026-01-20 15:14:57.542 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e401 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:14:58 np0005588920 nova_compute[226886]: 2026-01-20 15:14:58.154 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:14:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:58.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:14:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:14:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:58.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:59 np0005588920 nova_compute[226886]: 2026-01-20 15:14:59.489 226890 DEBUG nova.compute.manager [req-e2b13fe4-5879-4556-8c69-07d5fc918783 req-9ec1f0d8-4bb7-4153-ac41-3313114fcc9e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Received event network-vif-plugged-1b18c40e-cce7-4971-98d2-c95ec41c9040 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:14:59 np0005588920 nova_compute[226886]: 2026-01-20 15:14:59.490 226890 DEBUG oslo_concurrency.lockutils [req-e2b13fe4-5879-4556-8c69-07d5fc918783 req-9ec1f0d8-4bb7-4153-ac41-3313114fcc9e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "b3504af3-390e-4ab0-8af6-15749a887d8f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:14:59 np0005588920 nova_compute[226886]: 2026-01-20 15:14:59.490 226890 DEBUG oslo_concurrency.lockutils [req-e2b13fe4-5879-4556-8c69-07d5fc918783 req-9ec1f0d8-4bb7-4153-ac41-3313114fcc9e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b3504af3-390e-4ab0-8af6-15749a887d8f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:14:59 np0005588920 nova_compute[226886]: 2026-01-20 15:14:59.490 226890 DEBUG oslo_concurrency.lockutils [req-e2b13fe4-5879-4556-8c69-07d5fc918783 req-9ec1f0d8-4bb7-4153-ac41-3313114fcc9e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b3504af3-390e-4ab0-8af6-15749a887d8f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:14:59 np0005588920 nova_compute[226886]: 2026-01-20 15:14:59.491 226890 DEBUG nova.compute.manager [req-e2b13fe4-5879-4556-8c69-07d5fc918783 req-9ec1f0d8-4bb7-4153-ac41-3313114fcc9e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] No waiting events found dispatching network-vif-plugged-1b18c40e-cce7-4971-98d2-c95ec41c9040 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:14:59 np0005588920 nova_compute[226886]: 2026-01-20 15:14:59.491 226890 WARNING nova.compute.manager [req-e2b13fe4-5879-4556-8c69-07d5fc918783 req-9ec1f0d8-4bb7-4153-ac41-3313114fcc9e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Received unexpected event network-vif-plugged-1b18c40e-cce7-4971-98d2-c95ec41c9040 for instance with vm_state active and task_state None.#033[00m
Jan 20 10:14:59 np0005588920 nova_compute[226886]: 2026-01-20 15:14:59.928 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "refresh_cache-810e72a9-536d-4214-956b-9d5216cce8ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:14:59 np0005588920 nova_compute[226886]: 2026-01-20 15:14:59.929 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquired lock "refresh_cache-810e72a9-536d-4214-956b-9d5216cce8ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:14:59 np0005588920 nova_compute[226886]: 2026-01-20 15:14:59.930 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 10:14:59 np0005588920 nova_compute[226886]: 2026-01-20 15:14:59.930 226890 DEBUG nova.objects.instance [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 810e72a9-536d-4214-956b-9d5216cce8ff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:15:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:15:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:00.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:15:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:00.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:02 np0005588920 nova_compute[226886]: 2026-01-20 15:15:02.104 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Updating instance_info_cache with network_info: [{"id": "0c07d11d-c06a-497a-9dbd-975adce07e97", "address": "fa:16:3e:cc:d4:e6", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c07d11d-c0", "ovs_interfaceid": "0c07d11d-c06a-497a-9dbd-975adce07e97", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:15:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:02.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:15:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:02.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:15:02 np0005588920 nova_compute[226886]: 2026-01-20 15:15:02.544 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:02 np0005588920 nova_compute[226886]: 2026-01-20 15:15:02.709 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Releasing lock "refresh_cache-810e72a9-536d-4214-956b-9d5216cce8ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:15:02 np0005588920 nova_compute[226886]: 2026-01-20 15:15:02.710 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 10:15:02 np0005588920 nova_compute[226886]: 2026-01-20 15:15:02.710 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:15:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e401 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:15:03 np0005588920 nova_compute[226886]: 2026-01-20 15:15:03.156 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:03 np0005588920 nova_compute[226886]: 2026-01-20 15:15:03.773 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:15:03 np0005588920 nova_compute[226886]: 2026-01-20 15:15:03.774 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:15:03 np0005588920 nova_compute[226886]: 2026-01-20 15:15:03.774 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:15:03 np0005588920 nova_compute[226886]: 2026-01-20 15:15:03.774 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:15:03 np0005588920 nova_compute[226886]: 2026-01-20 15:15:03.775 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:15:04 np0005588920 nova_compute[226886]: 2026-01-20 15:15:04.126 226890 DEBUG nova.compute.manager [req-94caed2a-7438-4e1f-81b5-573e962e085c req-e8681e1d-6ad9-4064-a212-0c0d9c5ddca9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Received event network-changed-1b18c40e-cce7-4971-98d2-c95ec41c9040 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:15:04 np0005588920 nova_compute[226886]: 2026-01-20 15:15:04.126 226890 DEBUG nova.compute.manager [req-94caed2a-7438-4e1f-81b5-573e962e085c req-e8681e1d-6ad9-4064-a212-0c0d9c5ddca9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Refreshing instance network info cache due to event network-changed-1b18c40e-cce7-4971-98d2-c95ec41c9040. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:15:04 np0005588920 nova_compute[226886]: 2026-01-20 15:15:04.127 226890 DEBUG oslo_concurrency.lockutils [req-94caed2a-7438-4e1f-81b5-573e962e085c req-e8681e1d-6ad9-4064-a212-0c0d9c5ddca9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-b3504af3-390e-4ab0-8af6-15749a887d8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:15:04 np0005588920 nova_compute[226886]: 2026-01-20 15:15:04.127 226890 DEBUG oslo_concurrency.lockutils [req-94caed2a-7438-4e1f-81b5-573e962e085c req-e8681e1d-6ad9-4064-a212-0c0d9c5ddca9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-b3504af3-390e-4ab0-8af6-15749a887d8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:15:04 np0005588920 nova_compute[226886]: 2026-01-20 15:15:04.128 226890 DEBUG nova.network.neutron [req-94caed2a-7438-4e1f-81b5-573e962e085c req-e8681e1d-6ad9-4064-a212-0c0d9c5ddca9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Refreshing network info cache for port 1b18c40e-cce7-4971-98d2-c95ec41c9040 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:15:04 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:15:04 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1795334721' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:15:04 np0005588920 nova_compute[226886]: 2026-01-20 15:15:04.208 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:15:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:04.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:04.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:06 np0005588920 nova_compute[226886]: 2026-01-20 15:15:06.036 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-000000ba as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:15:06 np0005588920 nova_compute[226886]: 2026-01-20 15:15:06.037 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-000000ba as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:15:06 np0005588920 nova_compute[226886]: 2026-01-20 15:15:06.040 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-000000b8 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:15:06 np0005588920 nova_compute[226886]: 2026-01-20 15:15:06.041 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-000000b8 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:15:06 np0005588920 nova_compute[226886]: 2026-01-20 15:15:06.044 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-000000bd as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:15:06 np0005588920 nova_compute[226886]: 2026-01-20 15:15:06.045 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-000000bd as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:15:06 np0005588920 nova_compute[226886]: 2026-01-20 15:15:06.233 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:15:06 np0005588920 nova_compute[226886]: 2026-01-20 15:15:06.235 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3675MB free_disk=20.830204010009766GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:15:06 np0005588920 nova_compute[226886]: 2026-01-20 15:15:06.235 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:15:06 np0005588920 nova_compute[226886]: 2026-01-20 15:15:06.236 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:15:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:06.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:06.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:06 np0005588920 nova_compute[226886]: 2026-01-20 15:15:06.466 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance 810e72a9-536d-4214-956b-9d5216cce8ff actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:15:06 np0005588920 nova_compute[226886]: 2026-01-20 15:15:06.467 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance 54a13784-2a60-4b16-8208-d9b9d0e3033e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:15:06 np0005588920 nova_compute[226886]: 2026-01-20 15:15:06.468 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance b3504af3-390e-4ab0-8af6-15749a887d8f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:15:06 np0005588920 nova_compute[226886]: 2026-01-20 15:15:06.468 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:15:06 np0005588920 nova_compute[226886]: 2026-01-20 15:15:06.469 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=20GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:15:06 np0005588920 nova_compute[226886]: 2026-01-20 15:15:06.670 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:15:06 np0005588920 podman[296139]: 2026-01-20 15:15:06.989025862 +0000 UTC m=+0.056779135 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Jan 20 10:15:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:15:07 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2592459286' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:15:07 np0005588920 nova_compute[226886]: 2026-01-20 15:15:07.139 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:15:07 np0005588920 nova_compute[226886]: 2026-01-20 15:15:07.144 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:15:07 np0005588920 nova_compute[226886]: 2026-01-20 15:15:07.186 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:15:07 np0005588920 nova_compute[226886]: 2026-01-20 15:15:07.239 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:15:07 np0005588920 nova_compute[226886]: 2026-01-20 15:15:07.239 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:15:07 np0005588920 nova_compute[226886]: 2026-01-20 15:15:07.547 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:07 np0005588920 nova_compute[226886]: 2026-01-20 15:15:07.876 226890 DEBUG nova.network.neutron [req-94caed2a-7438-4e1f-81b5-573e962e085c req-e8681e1d-6ad9-4064-a212-0c0d9c5ddca9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Updated VIF entry in instance network info cache for port 1b18c40e-cce7-4971-98d2-c95ec41c9040. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:15:07 np0005588920 nova_compute[226886]: 2026-01-20 15:15:07.877 226890 DEBUG nova.network.neutron [req-94caed2a-7438-4e1f-81b5-573e962e085c req-e8681e1d-6ad9-4064-a212-0c0d9c5ddca9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Updating instance_info_cache with network_info: [{"id": "1b18c40e-cce7-4971-98d2-c95ec41c9040", "address": "fa:16:3e:eb:05:c3", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b18c40e-cc", "ovs_interfaceid": "1b18c40e-cce7-4971-98d2-c95ec41c9040", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:15:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e401 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:15:08 np0005588920 nova_compute[226886]: 2026-01-20 15:15:08.158 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:08 np0005588920 nova_compute[226886]: 2026-01-20 15:15:08.254 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:15:08 np0005588920 nova_compute[226886]: 2026-01-20 15:15:08.255 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:15:08 np0005588920 nova_compute[226886]: 2026-01-20 15:15:08.255 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:15:08 np0005588920 nova_compute[226886]: 2026-01-20 15:15:08.255 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:15:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:15:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:08.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:15:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:08.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:08 np0005588920 nova_compute[226886]: 2026-01-20 15:15:08.721 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:15:08 np0005588920 nova_compute[226886]: 2026-01-20 15:15:08.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:15:08 np0005588920 nova_compute[226886]: 2026-01-20 15:15:08.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:15:08 np0005588920 nova_compute[226886]: 2026-01-20 15:15:08.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:15:08 np0005588920 ovn_controller[133971]: 2026-01-20T15:15:08Z|00119|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:eb:05:c3 10.100.0.7
Jan 20 10:15:08 np0005588920 ovn_controller[133971]: 2026-01-20T15:15:08Z|00120|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:eb:05:c3 10.100.0.7
Jan 20 10:15:09 np0005588920 nova_compute[226886]: 2026-01-20 15:15:09.510 226890 DEBUG oslo_concurrency.lockutils [req-94caed2a-7438-4e1f-81b5-573e962e085c req-e8681e1d-6ad9-4064-a212-0c0d9c5ddca9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-b3504af3-390e-4ab0-8af6-15749a887d8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:15:09 np0005588920 nova_compute[226886]: 2026-01-20 15:15:09.720 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:15:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:15:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:10.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:15:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:10.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:15:12 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4149569028' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:15:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:15:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:12.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:15:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:12.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:12 np0005588920 nova_compute[226886]: 2026-01-20 15:15:12.549 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e401 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:15:13 np0005588920 nova_compute[226886]: 2026-01-20 15:15:13.160 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:15:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1223840242' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:15:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:15:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1223840242' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:15:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:14.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:14.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:16 np0005588920 nova_compute[226886]: 2026-01-20 15:15:16.070 226890 DEBUG oslo_concurrency.lockutils [None req-ee0c20a7-448e-4aa3-86cf-ca2c355e1dbb e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquiring lock "b3504af3-390e-4ab0-8af6-15749a887d8f" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:15:16 np0005588920 nova_compute[226886]: 2026-01-20 15:15:16.071 226890 DEBUG oslo_concurrency.lockutils [None req-ee0c20a7-448e-4aa3-86cf-ca2c355e1dbb e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "b3504af3-390e-4ab0-8af6-15749a887d8f" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:15:16 np0005588920 nova_compute[226886]: 2026-01-20 15:15:16.110 226890 DEBUG nova.objects.instance [None req-ee0c20a7-448e-4aa3-86cf-ca2c355e1dbb e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lazy-loading 'flavor' on Instance uuid b3504af3-390e-4ab0-8af6-15749a887d8f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:15:16 np0005588920 nova_compute[226886]: 2026-01-20 15:15:16.204 226890 DEBUG oslo_concurrency.lockutils [None req-ee0c20a7-448e-4aa3-86cf-ca2c355e1dbb e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "b3504af3-390e-4ab0-8af6-15749a887d8f" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:15:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:15:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:16.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:15:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:16.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:15:16.475 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:15:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:15:16.476 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:15:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:15:16.476 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:15:17 np0005588920 nova_compute[226886]: 2026-01-20 15:15:17.551 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e401 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:15:18 np0005588920 nova_compute[226886]: 2026-01-20 15:15:18.162 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:15:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:18.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:15:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:18.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:18 np0005588920 nova_compute[226886]: 2026-01-20 15:15:18.999 226890 DEBUG oslo_concurrency.lockutils [None req-ee0c20a7-448e-4aa3-86cf-ca2c355e1dbb e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquiring lock "b3504af3-390e-4ab0-8af6-15749a887d8f" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:15:19 np0005588920 nova_compute[226886]: 2026-01-20 15:15:18.999 226890 DEBUG oslo_concurrency.lockutils [None req-ee0c20a7-448e-4aa3-86cf-ca2c355e1dbb e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "b3504af3-390e-4ab0-8af6-15749a887d8f" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:15:19 np0005588920 nova_compute[226886]: 2026-01-20 15:15:19.000 226890 INFO nova.compute.manager [None req-ee0c20a7-448e-4aa3-86cf-ca2c355e1dbb e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Attaching volume 933c5c7a-f496-4bcc-b304-68156c235fe5 to /dev/vdb#033[00m
Jan 20 10:15:19 np0005588920 nova_compute[226886]: 2026-01-20 15:15:19.216 226890 DEBUG os_brick.utils [None req-ee0c20a7-448e-4aa3-86cf-ca2c355e1dbb e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 20 10:15:19 np0005588920 nova_compute[226886]: 2026-01-20 15:15:19.218 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:15:19 np0005588920 nova_compute[226886]: 2026-01-20 15:15:19.229 231437 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:15:19 np0005588920 nova_compute[226886]: 2026-01-20 15:15:19.229 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[f2cb4af0-199e-40aa-bc44-ecd0083e1b84]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:15:19 np0005588920 nova_compute[226886]: 2026-01-20 15:15:19.230 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:15:19 np0005588920 nova_compute[226886]: 2026-01-20 15:15:19.238 231437 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.007s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:15:19 np0005588920 nova_compute[226886]: 2026-01-20 15:15:19.238 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[f60fb80c-d98a-4453-9de3-95e382b03818]: (4, ('InitiatorName=iqn.1994-05.com.redhat:f7e7b2e5d28e', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:15:19 np0005588920 nova_compute[226886]: 2026-01-20 15:15:19.239 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:15:19 np0005588920 nova_compute[226886]: 2026-01-20 15:15:19.250 231437 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:15:19 np0005588920 nova_compute[226886]: 2026-01-20 15:15:19.250 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[6b030205-898b-4d98-98e7-a69b513a190f]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:15:19 np0005588920 nova_compute[226886]: 2026-01-20 15:15:19.251 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[3b52ad7e-3d27-43bc-a7d3-ad169c1df36f]: (4, '1190ec40-2b89-4358-8dec-733c5829fbed') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:15:19 np0005588920 nova_compute[226886]: 2026-01-20 15:15:19.251 226890 DEBUG oslo_concurrency.processutils [None req-ee0c20a7-448e-4aa3-86cf-ca2c355e1dbb e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:15:19 np0005588920 nova_compute[226886]: 2026-01-20 15:15:19.279 226890 DEBUG oslo_concurrency.processutils [None req-ee0c20a7-448e-4aa3-86cf-ca2c355e1dbb e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CMD "nvme version" returned: 0 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:15:19 np0005588920 nova_compute[226886]: 2026-01-20 15:15:19.281 226890 DEBUG os_brick.initiator.connectors.lightos [None req-ee0c20a7-448e-4aa3-86cf-ca2c355e1dbb e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 20 10:15:19 np0005588920 nova_compute[226886]: 2026-01-20 15:15:19.282 226890 DEBUG os_brick.initiator.connectors.lightos [None req-ee0c20a7-448e-4aa3-86cf-ca2c355e1dbb e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 20 10:15:19 np0005588920 nova_compute[226886]: 2026-01-20 15:15:19.282 226890 DEBUG os_brick.initiator.connectors.lightos [None req-ee0c20a7-448e-4aa3-86cf-ca2c355e1dbb e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 20 10:15:19 np0005588920 nova_compute[226886]: 2026-01-20 15:15:19.282 226890 DEBUG os_brick.utils [None req-ee0c20a7-448e-4aa3-86cf-ca2c355e1dbb e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] <== get_connector_properties: return (66ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:f7e7b2e5d28e', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '1190ec40-2b89-4358-8dec-733c5829fbed', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 20 10:15:19 np0005588920 nova_compute[226886]: 2026-01-20 15:15:19.283 226890 DEBUG nova.virt.block_device [None req-ee0c20a7-448e-4aa3-86cf-ca2c355e1dbb e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Updating existing volume attachment record: b874ec4a-e334-4a3a-9131-ace7a891cf94 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 20 10:15:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:20.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:20.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:20 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:15:20 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1049306304' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:15:21 np0005588920 nova_compute[226886]: 2026-01-20 15:15:21.392 226890 DEBUG nova.objects.instance [None req-ee0c20a7-448e-4aa3-86cf-ca2c355e1dbb e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lazy-loading 'flavor' on Instance uuid b3504af3-390e-4ab0-8af6-15749a887d8f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:15:21 np0005588920 nova_compute[226886]: 2026-01-20 15:15:21.435 226890 DEBUG nova.virt.libvirt.driver [None req-ee0c20a7-448e-4aa3-86cf-ca2c355e1dbb e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Attempting to attach volume 933c5c7a-f496-4bcc-b304-68156c235fe5 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 20 10:15:21 np0005588920 nova_compute[226886]: 2026-01-20 15:15:21.438 226890 DEBUG nova.virt.libvirt.guest [None req-ee0c20a7-448e-4aa3-86cf-ca2c355e1dbb e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] attach device xml: <disk type="network" device="disk">
Jan 20 10:15:21 np0005588920 nova_compute[226886]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 10:15:21 np0005588920 nova_compute[226886]:  <source protocol="rbd" name="volumes/volume-933c5c7a-f496-4bcc-b304-68156c235fe5">
Jan 20 10:15:21 np0005588920 nova_compute[226886]:    <host name="192.168.122.100" port="6789"/>
Jan 20 10:15:21 np0005588920 nova_compute[226886]:    <host name="192.168.122.102" port="6789"/>
Jan 20 10:15:21 np0005588920 nova_compute[226886]:    <host name="192.168.122.101" port="6789"/>
Jan 20 10:15:21 np0005588920 nova_compute[226886]:  </source>
Jan 20 10:15:21 np0005588920 nova_compute[226886]:  <auth username="openstack">
Jan 20 10:15:21 np0005588920 nova_compute[226886]:    <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:15:21 np0005588920 nova_compute[226886]:  </auth>
Jan 20 10:15:21 np0005588920 nova_compute[226886]:  <target dev="vdb" bus="virtio"/>
Jan 20 10:15:21 np0005588920 nova_compute[226886]:  <serial>933c5c7a-f496-4bcc-b304-68156c235fe5</serial>
Jan 20 10:15:21 np0005588920 nova_compute[226886]:  <shareable/>
Jan 20 10:15:21 np0005588920 nova_compute[226886]: </disk>
Jan 20 10:15:21 np0005588920 nova_compute[226886]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 20 10:15:21 np0005588920 nova_compute[226886]: 2026-01-20 15:15:21.576 226890 DEBUG nova.virt.libvirt.driver [None req-ee0c20a7-448e-4aa3-86cf-ca2c355e1dbb e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:15:21 np0005588920 nova_compute[226886]: 2026-01-20 15:15:21.577 226890 DEBUG nova.virt.libvirt.driver [None req-ee0c20a7-448e-4aa3-86cf-ca2c355e1dbb e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:15:21 np0005588920 nova_compute[226886]: 2026-01-20 15:15:21.577 226890 DEBUG nova.virt.libvirt.driver [None req-ee0c20a7-448e-4aa3-86cf-ca2c355e1dbb e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:15:21 np0005588920 nova_compute[226886]: 2026-01-20 15:15:21.577 226890 DEBUG nova.virt.libvirt.driver [None req-ee0c20a7-448e-4aa3-86cf-ca2c355e1dbb e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] No VIF found with MAC fa:16:3e:eb:05:c3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:15:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:15:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:22.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:15:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:22.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:22 np0005588920 nova_compute[226886]: 2026-01-20 15:15:22.590 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e401 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:15:23 np0005588920 nova_compute[226886]: 2026-01-20 15:15:23.014 226890 DEBUG oslo_concurrency.lockutils [None req-ee0c20a7-448e-4aa3-86cf-ca2c355e1dbb e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "b3504af3-390e-4ab0-8af6-15749a887d8f" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 4.014s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:15:23 np0005588920 nova_compute[226886]: 2026-01-20 15:15:23.164 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:23 np0005588920 nova_compute[226886]: 2026-01-20 15:15:23.871 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:24.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:24.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:24 np0005588920 podman[296188]: 2026-01-20 15:15:24.998128032 +0000 UTC m=+0.082070871 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:15:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 10:15:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:26.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 10:15:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:26.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:27 np0005588920 nova_compute[226886]: 2026-01-20 15:15:27.635 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e401 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:15:28 np0005588920 nova_compute[226886]: 2026-01-20 15:15:28.167 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:15:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:28.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:15:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:28.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:30.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:15:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:30.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:15:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:32.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:32.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:32 np0005588920 nova_compute[226886]: 2026-01-20 15:15:32.639 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e401 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:15:33 np0005588920 nova_compute[226886]: 2026-01-20 15:15:33.169 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:15:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:34.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:15:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:34.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:34 np0005588920 nova_compute[226886]: 2026-01-20 15:15:34.594 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:35 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:15:35 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1108637769' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:15:35 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:15:35 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1108637769' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:15:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:15:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:36.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:15:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:36.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:37 np0005588920 nova_compute[226886]: 2026-01-20 15:15:37.640 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e402 e402: 3 total, 3 up, 3 in
Jan 20 10:15:37 np0005588920 podman[296214]: 2026-01-20 15:15:37.975137166 +0000 UTC m=+0.053188860 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 20 10:15:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e402 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:15:38 np0005588920 nova_compute[226886]: 2026-01-20 15:15:38.171 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:15:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:38.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:15:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:15:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:38.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:15:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:15:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:40.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:15:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:40.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:41 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:15:41 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/810396698' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:15:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:42.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:15:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:42.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:15:42 np0005588920 nova_compute[226886]: 2026-01-20 15:15:42.642 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e403 e403: 3 total, 3 up, 3 in
Jan 20 10:15:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e403 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:15:43 np0005588920 nova_compute[226886]: 2026-01-20 15:15:43.174 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:44.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:15:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:44.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:15:46 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:15:46 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3184790100' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:15:46 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:15:46 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3184790100' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:15:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:15:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:46.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:15:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:46.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:46 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e404 e404: 3 total, 3 up, 3 in
Jan 20 10:15:47 np0005588920 nova_compute[226886]: 2026-01-20 15:15:47.667 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:47 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e404 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:15:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:15:48.162 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=61, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=60) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:15:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:15:48.163 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:15:48 np0005588920 nova_compute[226886]: 2026-01-20 15:15:48.162 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:48 np0005588920 nova_compute[226886]: 2026-01-20 15:15:48.175 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:48.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:48.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:50.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:50.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:50 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:15:50 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:15:50 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:15:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:15:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:52.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:15:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:52.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:52 np0005588920 nova_compute[226886]: 2026-01-20 15:15:52.669 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:52 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e404 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:15:53 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:15:53.164 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '61'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:15:53 np0005588920 nova_compute[226886]: 2026-01-20 15:15:53.176 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:53 np0005588920 ovn_controller[133971]: 2026-01-20T15:15:53Z|00863|binding|INFO|Releasing lport b20b0e27-0b08-4316-b6df-6784416f44c0 from this chassis (sb_readonly=0)
Jan 20 10:15:53 np0005588920 nova_compute[226886]: 2026-01-20 15:15:53.651 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:53 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e405 e405: 3 total, 3 up, 3 in
Jan 20 10:15:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:54.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:54.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:56 np0005588920 podman[296366]: 2026-01-20 15:15:56.001711493 +0000 UTC m=+0.085736797 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller)
Jan 20 10:15:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:56.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:56.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:57 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:15:57 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:15:57 np0005588920 nova_compute[226886]: 2026-01-20 15:15:57.671 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:57 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e405 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:15:58 np0005588920 nova_compute[226886]: 2026-01-20 15:15:58.178 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 10:15:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:58.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 10:15:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:15:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:58.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:58 np0005588920 nova_compute[226886]: 2026-01-20 15:15:58.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:15:58 np0005588920 nova_compute[226886]: 2026-01-20 15:15:58.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:15:59 np0005588920 nova_compute[226886]: 2026-01-20 15:15:59.310 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "refresh_cache-54a13784-2a60-4b16-8208-d9b9d0e3033e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:15:59 np0005588920 nova_compute[226886]: 2026-01-20 15:15:59.310 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquired lock "refresh_cache-54a13784-2a60-4b16-8208-d9b9d0e3033e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:15:59 np0005588920 nova_compute[226886]: 2026-01-20 15:15:59.310 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 10:16:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:00.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:00.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:01 np0005588920 nova_compute[226886]: 2026-01-20 15:16:01.379 226890 DEBUG nova.compute.manager [req-47cc341c-02d3-4165-bab1-9a584aa32d3e req-5e1e5ffa-e4e4-4c22-92c5-c2a2105d9bb8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Received event network-changed-1b18c40e-cce7-4971-98d2-c95ec41c9040 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:16:01 np0005588920 nova_compute[226886]: 2026-01-20 15:16:01.379 226890 DEBUG nova.compute.manager [req-47cc341c-02d3-4165-bab1-9a584aa32d3e req-5e1e5ffa-e4e4-4c22-92c5-c2a2105d9bb8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Refreshing instance network info cache due to event network-changed-1b18c40e-cce7-4971-98d2-c95ec41c9040. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:16:01 np0005588920 nova_compute[226886]: 2026-01-20 15:16:01.380 226890 DEBUG oslo_concurrency.lockutils [req-47cc341c-02d3-4165-bab1-9a584aa32d3e req-5e1e5ffa-e4e4-4c22-92c5-c2a2105d9bb8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-b3504af3-390e-4ab0-8af6-15749a887d8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:16:01 np0005588920 nova_compute[226886]: 2026-01-20 15:16:01.380 226890 DEBUG oslo_concurrency.lockutils [req-47cc341c-02d3-4165-bab1-9a584aa32d3e req-5e1e5ffa-e4e4-4c22-92c5-c2a2105d9bb8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-b3504af3-390e-4ab0-8af6-15749a887d8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:16:01 np0005588920 nova_compute[226886]: 2026-01-20 15:16:01.380 226890 DEBUG nova.network.neutron [req-47cc341c-02d3-4165-bab1-9a584aa32d3e req-5e1e5ffa-e4e4-4c22-92c5-c2a2105d9bb8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Refreshing network info cache for port 1b18c40e-cce7-4971-98d2-c95ec41c9040 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:16:01 np0005588920 nova_compute[226886]: 2026-01-20 15:16:01.695 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Updating instance_info_cache with network_info: [{"id": "b8bc07e2-c826-408c-a1a5-f45ad76b5888", "address": "fa:16:3e:ce:5d:51", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb8bc07e2-c8", "ovs_interfaceid": "b8bc07e2-c826-408c-a1a5-f45ad76b5888", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:16:01 np0005588920 nova_compute[226886]: 2026-01-20 15:16:01.709 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Releasing lock "refresh_cache-54a13784-2a60-4b16-8208-d9b9d0e3033e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:16:01 np0005588920 nova_compute[226886]: 2026-01-20 15:16:01.710 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 10:16:01 np0005588920 nova_compute[226886]: 2026-01-20 15:16:01.710 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:16:01 np0005588920 nova_compute[226886]: 2026-01-20 15:16:01.734 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:16:01 np0005588920 nova_compute[226886]: 2026-01-20 15:16:01.734 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:16:01 np0005588920 nova_compute[226886]: 2026-01-20 15:16:01.735 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:16:01 np0005588920 nova_compute[226886]: 2026-01-20 15:16:01.735 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:16:01 np0005588920 nova_compute[226886]: 2026-01-20 15:16:01.735 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:16:01 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e406 e406: 3 total, 3 up, 3 in
Jan 20 10:16:01 np0005588920 ovn_controller[133971]: 2026-01-20T15:16:01Z|00864|binding|INFO|Releasing lport b20b0e27-0b08-4316-b6df-6784416f44c0 from this chassis (sb_readonly=0)
Jan 20 10:16:02 np0005588920 nova_compute[226886]: 2026-01-20 15:16:02.031 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:16:02 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1509921417' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:16:02 np0005588920 nova_compute[226886]: 2026-01-20 15:16:02.195 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:16:02 np0005588920 nova_compute[226886]: 2026-01-20 15:16:02.312 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-000000ba as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:16:02 np0005588920 nova_compute[226886]: 2026-01-20 15:16:02.312 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-000000ba as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:16:02 np0005588920 nova_compute[226886]: 2026-01-20 15:16:02.316 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-000000b8 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:16:02 np0005588920 nova_compute[226886]: 2026-01-20 15:16:02.316 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-000000b8 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:16:02 np0005588920 nova_compute[226886]: 2026-01-20 15:16:02.320 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-000000bd as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:16:02 np0005588920 nova_compute[226886]: 2026-01-20 15:16:02.320 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-000000bd as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:16:02 np0005588920 nova_compute[226886]: 2026-01-20 15:16:02.321 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-000000bd as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:16:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:02.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:02.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:02 np0005588920 nova_compute[226886]: 2026-01-20 15:16:02.480 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:16:02 np0005588920 nova_compute[226886]: 2026-01-20 15:16:02.481 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3578MB free_disk=20.78484344482422GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:16:02 np0005588920 nova_compute[226886]: 2026-01-20 15:16:02.481 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:16:02 np0005588920 nova_compute[226886]: 2026-01-20 15:16:02.481 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:16:02 np0005588920 nova_compute[226886]: 2026-01-20 15:16:02.617 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance 810e72a9-536d-4214-956b-9d5216cce8ff actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:16:02 np0005588920 nova_compute[226886]: 2026-01-20 15:16:02.617 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance 54a13784-2a60-4b16-8208-d9b9d0e3033e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:16:02 np0005588920 nova_compute[226886]: 2026-01-20 15:16:02.618 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance b3504af3-390e-4ab0-8af6-15749a887d8f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:16:02 np0005588920 nova_compute[226886]: 2026-01-20 15:16:02.618 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:16:02 np0005588920 nova_compute[226886]: 2026-01-20 15:16:02.618 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=20GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:16:02 np0005588920 nova_compute[226886]: 2026-01-20 15:16:02.703 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:16:02 np0005588920 nova_compute[226886]: 2026-01-20 15:16:02.737 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e406 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:16:03 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:16:03 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3815396852' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:16:03 np0005588920 nova_compute[226886]: 2026-01-20 15:16:03.177 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:16:03 np0005588920 nova_compute[226886]: 2026-01-20 15:16:03.179 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:03 np0005588920 nova_compute[226886]: 2026-01-20 15:16:03.185 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:16:03 np0005588920 nova_compute[226886]: 2026-01-20 15:16:03.205 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:16:03 np0005588920 nova_compute[226886]: 2026-01-20 15:16:03.207 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:16:03 np0005588920 nova_compute[226886]: 2026-01-20 15:16:03.207 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.726s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:16:03 np0005588920 nova_compute[226886]: 2026-01-20 15:16:03.471 226890 DEBUG nova.network.neutron [req-47cc341c-02d3-4165-bab1-9a584aa32d3e req-5e1e5ffa-e4e4-4c22-92c5-c2a2105d9bb8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Updated VIF entry in instance network info cache for port 1b18c40e-cce7-4971-98d2-c95ec41c9040. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:16:03 np0005588920 nova_compute[226886]: 2026-01-20 15:16:03.472 226890 DEBUG nova.network.neutron [req-47cc341c-02d3-4165-bab1-9a584aa32d3e req-5e1e5ffa-e4e4-4c22-92c5-c2a2105d9bb8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Updating instance_info_cache with network_info: [{"id": "1b18c40e-cce7-4971-98d2-c95ec41c9040", "address": "fa:16:3e:eb:05:c3", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b18c40e-cc", "ovs_interfaceid": "1b18c40e-cce7-4971-98d2-c95ec41c9040", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:16:03 np0005588920 nova_compute[226886]: 2026-01-20 15:16:03.517 226890 DEBUG oslo_concurrency.lockutils [req-47cc341c-02d3-4165-bab1-9a584aa32d3e req-5e1e5ffa-e4e4-4c22-92c5-c2a2105d9bb8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-b3504af3-390e-4ab0-8af6-15749a887d8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:16:03 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #57. Immutable memtables: 13.
Jan 20 10:16:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:04.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:04.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:16:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:06.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:16:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:16:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:06.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:16:06 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:16:06 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2847299956' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:16:07 np0005588920 nova_compute[226886]: 2026-01-20 15:16:07.222 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:16:07 np0005588920 nova_compute[226886]: 2026-01-20 15:16:07.223 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:16:07 np0005588920 nova_compute[226886]: 2026-01-20 15:16:07.223 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:16:07 np0005588920 nova_compute[226886]: 2026-01-20 15:16:07.722 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:07 np0005588920 nova_compute[226886]: 2026-01-20 15:16:07.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:16:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e406 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:16:08 np0005588920 nova_compute[226886]: 2026-01-20 15:16:08.180 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:08.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:08.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:08 np0005588920 nova_compute[226886]: 2026-01-20 15:16:08.720 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:16:08 np0005588920 podman[296487]: 2026-01-20 15:16:08.954499162 +0000 UTC m=+0.041689555 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 20 10:16:09 np0005588920 nova_compute[226886]: 2026-01-20 15:16:09.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:16:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:10.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:10.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:10 np0005588920 nova_compute[226886]: 2026-01-20 15:16:10.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:16:10 np0005588920 nova_compute[226886]: 2026-01-20 15:16:10.726 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:16:11 np0005588920 nova_compute[226886]: 2026-01-20 15:16:11.270 226890 DEBUG oslo_concurrency.lockutils [None req-5364e26b-38e9-42a0-8c37-45b28590c7f3 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquiring lock "refresh_cache-b3504af3-390e-4ab0-8af6-15749a887d8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:16:11 np0005588920 nova_compute[226886]: 2026-01-20 15:16:11.270 226890 DEBUG oslo_concurrency.lockutils [None req-5364e26b-38e9-42a0-8c37-45b28590c7f3 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquired lock "refresh_cache-b3504af3-390e-4ab0-8af6-15749a887d8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:16:11 np0005588920 nova_compute[226886]: 2026-01-20 15:16:11.271 226890 DEBUG nova.network.neutron [None req-5364e26b-38e9-42a0-8c37-45b28590c7f3 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:16:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:16:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:12.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:16:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:12.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:12 np0005588920 nova_compute[226886]: 2026-01-20 15:16:12.723 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e406 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:16:13 np0005588920 nova_compute[226886]: 2026-01-20 15:16:13.182 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:14.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:14.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:14 np0005588920 nova_compute[226886]: 2026-01-20 15:16:14.613 226890 DEBUG nova.network.neutron [None req-5364e26b-38e9-42a0-8c37-45b28590c7f3 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Updating instance_info_cache with network_info: [{"id": "1b18c40e-cce7-4971-98d2-c95ec41c9040", "address": "fa:16:3e:eb:05:c3", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b18c40e-cc", "ovs_interfaceid": "1b18c40e-cce7-4971-98d2-c95ec41c9040", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:16:14 np0005588920 nova_compute[226886]: 2026-01-20 15:16:14.637 226890 DEBUG oslo_concurrency.lockutils [None req-5364e26b-38e9-42a0-8c37-45b28590c7f3 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Releasing lock "refresh_cache-b3504af3-390e-4ab0-8af6-15749a887d8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:16:14 np0005588920 nova_compute[226886]: 2026-01-20 15:16:14.800 226890 DEBUG nova.virt.libvirt.driver [None req-5364e26b-38e9-42a0-8c37-45b28590c7f3 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Jan 20 10:16:14 np0005588920 nova_compute[226886]: 2026-01-20 15:16:14.800 226890 DEBUG nova.virt.libvirt.volume.remotefs [None req-5364e26b-38e9-42a0-8c37-45b28590c7f3 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Creating file /var/lib/nova/instances/b3504af3-390e-4ab0-8af6-15749a887d8f/93e91fcabdb64d37abb00564a6042e0e.tmp on remote host 192.168.122.100 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Jan 20 10:16:14 np0005588920 nova_compute[226886]: 2026-01-20 15:16:14.801 226890 DEBUG oslo_concurrency.processutils [None req-5364e26b-38e9-42a0-8c37-45b28590c7f3 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/b3504af3-390e-4ab0-8af6-15749a887d8f/93e91fcabdb64d37abb00564a6042e0e.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:16:15 np0005588920 nova_compute[226886]: 2026-01-20 15:16:15.280 226890 DEBUG oslo_concurrency.processutils [None req-5364e26b-38e9-42a0-8c37-45b28590c7f3 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/b3504af3-390e-4ab0-8af6-15749a887d8f/93e91fcabdb64d37abb00564a6042e0e.tmp" returned: 1 in 0.479s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:16:15 np0005588920 nova_compute[226886]: 2026-01-20 15:16:15.282 226890 DEBUG oslo_concurrency.processutils [None req-5364e26b-38e9-42a0-8c37-45b28590c7f3 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] 'ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/b3504af3-390e-4ab0-8af6-15749a887d8f/93e91fcabdb64d37abb00564a6042e0e.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 20 10:16:15 np0005588920 nova_compute[226886]: 2026-01-20 15:16:15.283 226890 DEBUG nova.virt.libvirt.volume.remotefs [None req-5364e26b-38e9-42a0-8c37-45b28590c7f3 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Creating directory /var/lib/nova/instances/b3504af3-390e-4ab0-8af6-15749a887d8f on remote host 192.168.122.100 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Jan 20 10:16:15 np0005588920 nova_compute[226886]: 2026-01-20 15:16:15.283 226890 DEBUG oslo_concurrency.processutils [None req-5364e26b-38e9-42a0-8c37-45b28590c7f3 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/b3504af3-390e-4ab0-8af6-15749a887d8f execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:16:15 np0005588920 nova_compute[226886]: 2026-01-20 15:16:15.482 226890 DEBUG oslo_concurrency.processutils [None req-5364e26b-38e9-42a0-8c37-45b28590c7f3 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/b3504af3-390e-4ab0-8af6-15749a887d8f" returned: 0 in 0.198s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:16:15 np0005588920 nova_compute[226886]: 2026-01-20 15:16:15.485 226890 DEBUG nova.virt.libvirt.driver [None req-5364e26b-38e9-42a0-8c37-45b28590c7f3 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 20 10:16:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:16:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:16.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:16:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:16.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:16:16.477 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:16:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:16:16.477 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:16:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:16:16.478 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:16:17 np0005588920 nova_compute[226886]: 2026-01-20 15:16:17.771 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:17 np0005588920 kernel: tap1b18c40e-cc (unregistering): left promiscuous mode
Jan 20 10:16:17 np0005588920 NetworkManager[49076]: <info>  [1768922177.8406] device (tap1b18c40e-cc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:16:17 np0005588920 nova_compute[226886]: 2026-01-20 15:16:17.853 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:17 np0005588920 ovn_controller[133971]: 2026-01-20T15:16:17Z|00865|binding|INFO|Releasing lport 1b18c40e-cce7-4971-98d2-c95ec41c9040 from this chassis (sb_readonly=0)
Jan 20 10:16:17 np0005588920 ovn_controller[133971]: 2026-01-20T15:16:17Z|00866|binding|INFO|Setting lport 1b18c40e-cce7-4971-98d2-c95ec41c9040 down in Southbound
Jan 20 10:16:17 np0005588920 ovn_controller[133971]: 2026-01-20T15:16:17Z|00867|binding|INFO|Removing iface tap1b18c40e-cc ovn-installed in OVS
Jan 20 10:16:17 np0005588920 nova_compute[226886]: 2026-01-20 15:16:17.857 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:16:17.863 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:eb:05:c3 10.100.0.7'], port_security=['fa:16:3e:eb:05:c3 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'b3504af3-390e-4ab0-8af6-15749a887d8f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fff727019f86407498e83d7948d54962', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5ace6a2f-56c6-4679-bb81-70ccb27ab312', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=87d69a20-7690-494a-ac16-7c600840561a, chassis=[], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=1b18c40e-cce7-4971-98d2-c95ec41c9040) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:16:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:16:17.865 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 1b18c40e-cce7-4971-98d2-c95ec41c9040 in datapath c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab unbound from our chassis#033[00m
Jan 20 10:16:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:16:17.866 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab#033[00m
Jan 20 10:16:17 np0005588920 nova_compute[226886]: 2026-01-20 15:16:17.870 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:16:17.883 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d99ea5ae-f31a-446f-94b8-a156568b9c53]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:16:17 np0005588920 systemd[1]: machine-qemu\x2d90\x2dinstance\x2d000000bd.scope: Deactivated successfully.
Jan 20 10:16:17 np0005588920 systemd[1]: machine-qemu\x2d90\x2dinstance\x2d000000bd.scope: Consumed 17.062s CPU time.
Jan 20 10:16:17 np0005588920 systemd-machined[196121]: Machine qemu-90-instance-000000bd terminated.
Jan 20 10:16:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:16:17.917 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[36d529ff-6605-4144-a05b-b177f0f288b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:16:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:16:17.922 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[cc85046a-d948-4958-ad2f-6f837d3db2e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:16:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:16:17.959 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[522ca4d4-3b3e-4137-a381-d85b2eb58000]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:16:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:16:17.986 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e3d37d96-d363-4c78-a8ee-9ba1aaad2c8b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc1f4a971-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fc:30:f0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 9, 'rx_bytes': 784, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 9, 'rx_bytes': 784, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 271], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 706770, 'reachable_time': 35529, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 296519, 'error': None, 'target': 'ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:16:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e406 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:16:18 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:16:18.004 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[fbeddb88-7599-4999-a335-9b30587c3572]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapc1f4a971-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 706780, 'tstamp': 706780}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 296520, 'error': None, 'target': 'ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapc1f4a971-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 706783, 'tstamp': 706783}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 296520, 'error': None, 'target': 'ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:16:18 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:16:18.007 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc1f4a971-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:16:18 np0005588920 nova_compute[226886]: 2026-01-20 15:16:18.009 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:18 np0005588920 nova_compute[226886]: 2026-01-20 15:16:18.013 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:18 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:16:18.013 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc1f4a971-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:16:18 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:16:18.013 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:16:18 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:16:18.013 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc1f4a971-00, col_values=(('external_ids', {'iface-id': 'b20b0e27-0b08-4316-b6df-6784416f44c0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:16:18 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:16:18.014 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:16:18 np0005588920 nova_compute[226886]: 2026-01-20 15:16:18.068 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:18 np0005588920 nova_compute[226886]: 2026-01-20 15:16:18.073 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:18 np0005588920 nova_compute[226886]: 2026-01-20 15:16:18.184 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:18.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:16:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:18.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:16:18 np0005588920 nova_compute[226886]: 2026-01-20 15:16:18.502 226890 INFO nova.virt.libvirt.driver [None req-5364e26b-38e9-42a0-8c37-45b28590c7f3 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Instance shutdown successfully after 3 seconds.#033[00m
Jan 20 10:16:18 np0005588920 nova_compute[226886]: 2026-01-20 15:16:18.508 226890 INFO nova.virt.libvirt.driver [-] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Instance destroyed successfully.#033[00m
Jan 20 10:16:18 np0005588920 nova_compute[226886]: 2026-01-20 15:16:18.509 226890 DEBUG nova.virt.libvirt.vif [None req-5364e26b-38e9-42a0-8c37-45b28590c7f3 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:14:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='multiattach-server-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='multiattach-server-1',id=189,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL5L2o6o5dLcQyaIfhCZ5CKxQlecqNGmP68oHIQEsVoKIC2qfrMKjObT9GdMU8oznX9LVUwIWCShhlEJu9ZqPiutEL2afEJ1hQQamjERNcx9wWS2NfOgykA4yugQphfOtA==',key_name='tempest-keypair-1568469072',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:14:54Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='fff727019f86407498e83d7948d54962',ramdisk_id='',reservation_id='r-akp879yg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',
image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-AttachVolumeMultiAttachTest-418194625',owner_user_name='tempest-AttachVolumeMultiAttachTest-418194625-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:16:09Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e9cc4ce3e069479ba9c789b378a68a1d',uuid=b3504af3-390e-4ab0-8af6-15749a887d8f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1b18c40e-cce7-4971-98d2-c95ec41c9040", "address": "fa:16:3e:eb:05:c3", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "vif_mac": "fa:16:3e:eb:05:c3"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b18c40e-cc", "ovs_interfaceid": "1b18c40e-cce7-4971-98d2-c95ec41c9040", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:16:18 np0005588920 nova_compute[226886]: 2026-01-20 15:16:18.510 226890 DEBUG nova.network.os_vif_util [None req-5364e26b-38e9-42a0-8c37-45b28590c7f3 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Converting VIF {"id": "1b18c40e-cce7-4971-98d2-c95ec41c9040", "address": "fa:16:3e:eb:05:c3", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "vif_mac": "fa:16:3e:eb:05:c3"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b18c40e-cc", "ovs_interfaceid": "1b18c40e-cce7-4971-98d2-c95ec41c9040", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:16:18 np0005588920 nova_compute[226886]: 2026-01-20 15:16:18.511 226890 DEBUG nova.network.os_vif_util [None req-5364e26b-38e9-42a0-8c37-45b28590c7f3 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:eb:05:c3,bridge_name='br-int',has_traffic_filtering=True,id=1b18c40e-cce7-4971-98d2-c95ec41c9040,network=Network(c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b18c40e-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:16:18 np0005588920 nova_compute[226886]: 2026-01-20 15:16:18.512 226890 DEBUG os_vif [None req-5364e26b-38e9-42a0-8c37-45b28590c7f3 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:eb:05:c3,bridge_name='br-int',has_traffic_filtering=True,id=1b18c40e-cce7-4971-98d2-c95ec41c9040,network=Network(c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b18c40e-cc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:16:18 np0005588920 nova_compute[226886]: 2026-01-20 15:16:18.515 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:18 np0005588920 nova_compute[226886]: 2026-01-20 15:16:18.515 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1b18c40e-cc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:16:18 np0005588920 nova_compute[226886]: 2026-01-20 15:16:18.516 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:18 np0005588920 nova_compute[226886]: 2026-01-20 15:16:18.519 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:18 np0005588920 nova_compute[226886]: 2026-01-20 15:16:18.522 226890 INFO os_vif [None req-5364e26b-38e9-42a0-8c37-45b28590c7f3 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:eb:05:c3,bridge_name='br-int',has_traffic_filtering=True,id=1b18c40e-cce7-4971-98d2-c95ec41c9040,network=Network(c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b18c40e-cc')#033[00m
Jan 20 10:16:18 np0005588920 nova_compute[226886]: 2026-01-20 15:16:18.844 226890 DEBUG nova.virt.libvirt.driver [None req-5364e26b-38e9-42a0-8c37-45b28590c7f3 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] skipping disk for instance-000000bd as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:16:18 np0005588920 nova_compute[226886]: 2026-01-20 15:16:18.845 226890 DEBUG nova.virt.libvirt.driver [None req-5364e26b-38e9-42a0-8c37-45b28590c7f3 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] skipping disk for instance-000000bd as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:16:18 np0005588920 nova_compute[226886]: 2026-01-20 15:16:18.845 226890 DEBUG nova.virt.libvirt.driver [None req-5364e26b-38e9-42a0-8c37-45b28590c7f3 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] skipping disk for instance-000000bd as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:16:19 np0005588920 nova_compute[226886]: 2026-01-20 15:16:19.265 226890 DEBUG nova.compute.manager [req-8fa38861-3989-4ada-9654-1f0d33a06532 req-a3dad340-5358-4541-b7f8-e83b413f4838 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Received event network-vif-unplugged-1b18c40e-cce7-4971-98d2-c95ec41c9040 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:16:19 np0005588920 nova_compute[226886]: 2026-01-20 15:16:19.265 226890 DEBUG oslo_concurrency.lockutils [req-8fa38861-3989-4ada-9654-1f0d33a06532 req-a3dad340-5358-4541-b7f8-e83b413f4838 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "b3504af3-390e-4ab0-8af6-15749a887d8f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:16:19 np0005588920 nova_compute[226886]: 2026-01-20 15:16:19.265 226890 DEBUG oslo_concurrency.lockutils [req-8fa38861-3989-4ada-9654-1f0d33a06532 req-a3dad340-5358-4541-b7f8-e83b413f4838 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b3504af3-390e-4ab0-8af6-15749a887d8f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:16:19 np0005588920 nova_compute[226886]: 2026-01-20 15:16:19.265 226890 DEBUG oslo_concurrency.lockutils [req-8fa38861-3989-4ada-9654-1f0d33a06532 req-a3dad340-5358-4541-b7f8-e83b413f4838 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b3504af3-390e-4ab0-8af6-15749a887d8f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:16:19 np0005588920 nova_compute[226886]: 2026-01-20 15:16:19.266 226890 DEBUG nova.compute.manager [req-8fa38861-3989-4ada-9654-1f0d33a06532 req-a3dad340-5358-4541-b7f8-e83b413f4838 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] No waiting events found dispatching network-vif-unplugged-1b18c40e-cce7-4971-98d2-c95ec41c9040 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:16:19 np0005588920 nova_compute[226886]: 2026-01-20 15:16:19.266 226890 WARNING nova.compute.manager [req-8fa38861-3989-4ada-9654-1f0d33a06532 req-a3dad340-5358-4541-b7f8-e83b413f4838 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Received unexpected event network-vif-unplugged-1b18c40e-cce7-4971-98d2-c95ec41c9040 for instance with vm_state active and task_state resize_migrating.#033[00m
Jan 20 10:16:19 np0005588920 nova_compute[226886]: 2026-01-20 15:16:19.551 226890 DEBUG neutronclient.v2_0.client [None req-5364e26b-38e9-42a0-8c37-45b28590c7f3 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 1b18c40e-cce7-4971-98d2-c95ec41c9040 for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Jan 20 10:16:19 np0005588920 nova_compute[226886]: 2026-01-20 15:16:19.649 226890 DEBUG oslo_concurrency.lockutils [None req-5364e26b-38e9-42a0-8c37-45b28590c7f3 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquiring lock "b3504af3-390e-4ab0-8af6-15749a887d8f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:16:19 np0005588920 nova_compute[226886]: 2026-01-20 15:16:19.649 226890 DEBUG oslo_concurrency.lockutils [None req-5364e26b-38e9-42a0-8c37-45b28590c7f3 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "b3504af3-390e-4ab0-8af6-15749a887d8f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:16:19 np0005588920 nova_compute[226886]: 2026-01-20 15:16:19.649 226890 DEBUG oslo_concurrency.lockutils [None req-5364e26b-38e9-42a0-8c37-45b28590c7f3 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "b3504af3-390e-4ab0-8af6-15749a887d8f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:16:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:20.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:20.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:21 np0005588920 nova_compute[226886]: 2026-01-20 15:16:21.393 226890 DEBUG nova.compute.manager [req-aeb0375b-ef7d-43fa-93fe-3d0b035fe22d req-c5f9169a-8c31-4949-b3b4-3ea5ec07c753 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Received event network-vif-plugged-1b18c40e-cce7-4971-98d2-c95ec41c9040 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:16:21 np0005588920 nova_compute[226886]: 2026-01-20 15:16:21.393 226890 DEBUG oslo_concurrency.lockutils [req-aeb0375b-ef7d-43fa-93fe-3d0b035fe22d req-c5f9169a-8c31-4949-b3b4-3ea5ec07c753 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "b3504af3-390e-4ab0-8af6-15749a887d8f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:16:21 np0005588920 nova_compute[226886]: 2026-01-20 15:16:21.394 226890 DEBUG oslo_concurrency.lockutils [req-aeb0375b-ef7d-43fa-93fe-3d0b035fe22d req-c5f9169a-8c31-4949-b3b4-3ea5ec07c753 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b3504af3-390e-4ab0-8af6-15749a887d8f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:16:21 np0005588920 nova_compute[226886]: 2026-01-20 15:16:21.394 226890 DEBUG oslo_concurrency.lockutils [req-aeb0375b-ef7d-43fa-93fe-3d0b035fe22d req-c5f9169a-8c31-4949-b3b4-3ea5ec07c753 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b3504af3-390e-4ab0-8af6-15749a887d8f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:16:21 np0005588920 nova_compute[226886]: 2026-01-20 15:16:21.394 226890 DEBUG nova.compute.manager [req-aeb0375b-ef7d-43fa-93fe-3d0b035fe22d req-c5f9169a-8c31-4949-b3b4-3ea5ec07c753 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] No waiting events found dispatching network-vif-plugged-1b18c40e-cce7-4971-98d2-c95ec41c9040 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:16:21 np0005588920 nova_compute[226886]: 2026-01-20 15:16:21.394 226890 WARNING nova.compute.manager [req-aeb0375b-ef7d-43fa-93fe-3d0b035fe22d req-c5f9169a-8c31-4949-b3b4-3ea5ec07c753 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Received unexpected event network-vif-plugged-1b18c40e-cce7-4971-98d2-c95ec41c9040 for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 20 10:16:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:22.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:22.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:22 np0005588920 nova_compute[226886]: 2026-01-20 15:16:22.775 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e406 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:16:23 np0005588920 nova_compute[226886]: 2026-01-20 15:16:23.499 226890 DEBUG nova.compute.manager [req-bcc71ad0-6110-4605-a304-ec4c6225d80a req-5078ae13-45a0-45b6-85a8-1329f79908da 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Received event network-changed-1b18c40e-cce7-4971-98d2-c95ec41c9040 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:16:23 np0005588920 nova_compute[226886]: 2026-01-20 15:16:23.500 226890 DEBUG nova.compute.manager [req-bcc71ad0-6110-4605-a304-ec4c6225d80a req-5078ae13-45a0-45b6-85a8-1329f79908da 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Refreshing instance network info cache due to event network-changed-1b18c40e-cce7-4971-98d2-c95ec41c9040. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:16:23 np0005588920 nova_compute[226886]: 2026-01-20 15:16:23.500 226890 DEBUG oslo_concurrency.lockutils [req-bcc71ad0-6110-4605-a304-ec4c6225d80a req-5078ae13-45a0-45b6-85a8-1329f79908da 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-b3504af3-390e-4ab0-8af6-15749a887d8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:16:23 np0005588920 nova_compute[226886]: 2026-01-20 15:16:23.500 226890 DEBUG oslo_concurrency.lockutils [req-bcc71ad0-6110-4605-a304-ec4c6225d80a req-5078ae13-45a0-45b6-85a8-1329f79908da 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-b3504af3-390e-4ab0-8af6-15749a887d8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:16:23 np0005588920 nova_compute[226886]: 2026-01-20 15:16:23.501 226890 DEBUG nova.network.neutron [req-bcc71ad0-6110-4605-a304-ec4c6225d80a req-5078ae13-45a0-45b6-85a8-1329f79908da 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Refreshing network info cache for port 1b18c40e-cce7-4971-98d2-c95ec41c9040 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:16:23 np0005588920 nova_compute[226886]: 2026-01-20 15:16:23.517 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:24.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:24.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:25 np0005588920 nova_compute[226886]: 2026-01-20 15:16:25.294 226890 DEBUG nova.network.neutron [req-bcc71ad0-6110-4605-a304-ec4c6225d80a req-5078ae13-45a0-45b6-85a8-1329f79908da 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Updated VIF entry in instance network info cache for port 1b18c40e-cce7-4971-98d2-c95ec41c9040. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:16:25 np0005588920 nova_compute[226886]: 2026-01-20 15:16:25.295 226890 DEBUG nova.network.neutron [req-bcc71ad0-6110-4605-a304-ec4c6225d80a req-5078ae13-45a0-45b6-85a8-1329f79908da 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Updating instance_info_cache with network_info: [{"id": "1b18c40e-cce7-4971-98d2-c95ec41c9040", "address": "fa:16:3e:eb:05:c3", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b18c40e-cc", "ovs_interfaceid": "1b18c40e-cce7-4971-98d2-c95ec41c9040", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:16:25 np0005588920 nova_compute[226886]: 2026-01-20 15:16:25.320 226890 DEBUG oslo_concurrency.lockutils [req-bcc71ad0-6110-4605-a304-ec4c6225d80a req-5078ae13-45a0-45b6-85a8-1329f79908da 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-b3504af3-390e-4ab0-8af6-15749a887d8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:16:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:16:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:26.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:16:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:26.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:26 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e407 e407: 3 total, 3 up, 3 in
Jan 20 10:16:27 np0005588920 podman[296533]: 2026-01-20 15:16:27.003215342 +0000 UTC m=+0.093204665 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 20 10:16:27 np0005588920 nova_compute[226886]: 2026-01-20 15:16:27.777 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e407 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:16:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:28.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:28.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:28 np0005588920 nova_compute[226886]: 2026-01-20 15:16:28.553 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:29 np0005588920 nova_compute[226886]: 2026-01-20 15:16:29.082 226890 DEBUG nova.compute.manager [req-0262bb10-7234-4790-8a15-b7122d118195 req-a6c76d11-4074-483b-a5a3-a4506ecce1bd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Received event network-vif-plugged-1b18c40e-cce7-4971-98d2-c95ec41c9040 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:16:29 np0005588920 nova_compute[226886]: 2026-01-20 15:16:29.083 226890 DEBUG oslo_concurrency.lockutils [req-0262bb10-7234-4790-8a15-b7122d118195 req-a6c76d11-4074-483b-a5a3-a4506ecce1bd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "b3504af3-390e-4ab0-8af6-15749a887d8f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:16:29 np0005588920 nova_compute[226886]: 2026-01-20 15:16:29.083 226890 DEBUG oslo_concurrency.lockutils [req-0262bb10-7234-4790-8a15-b7122d118195 req-a6c76d11-4074-483b-a5a3-a4506ecce1bd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b3504af3-390e-4ab0-8af6-15749a887d8f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:16:29 np0005588920 nova_compute[226886]: 2026-01-20 15:16:29.083 226890 DEBUG oslo_concurrency.lockutils [req-0262bb10-7234-4790-8a15-b7122d118195 req-a6c76d11-4074-483b-a5a3-a4506ecce1bd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b3504af3-390e-4ab0-8af6-15749a887d8f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:16:29 np0005588920 nova_compute[226886]: 2026-01-20 15:16:29.083 226890 DEBUG nova.compute.manager [req-0262bb10-7234-4790-8a15-b7122d118195 req-a6c76d11-4074-483b-a5a3-a4506ecce1bd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] No waiting events found dispatching network-vif-plugged-1b18c40e-cce7-4971-98d2-c95ec41c9040 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:16:29 np0005588920 nova_compute[226886]: 2026-01-20 15:16:29.084 226890 WARNING nova.compute.manager [req-0262bb10-7234-4790-8a15-b7122d118195 req-a6c76d11-4074-483b-a5a3-a4506ecce1bd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Received unexpected event network-vif-plugged-1b18c40e-cce7-4971-98d2-c95ec41c9040 for instance with vm_state active and task_state resize_finish.#033[00m
Jan 20 10:16:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:16:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:30.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:16:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:30.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:31 np0005588920 nova_compute[226886]: 2026-01-20 15:16:31.210 226890 DEBUG nova.compute.manager [req-b54914f9-8397-4a36-85f4-29fe705e20f3 req-c13f15bc-9c4b-4a6b-a0b0-cef4e8d2c506 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Received event network-vif-plugged-1b18c40e-cce7-4971-98d2-c95ec41c9040 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:16:31 np0005588920 nova_compute[226886]: 2026-01-20 15:16:31.210 226890 DEBUG oslo_concurrency.lockutils [req-b54914f9-8397-4a36-85f4-29fe705e20f3 req-c13f15bc-9c4b-4a6b-a0b0-cef4e8d2c506 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "b3504af3-390e-4ab0-8af6-15749a887d8f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:16:31 np0005588920 nova_compute[226886]: 2026-01-20 15:16:31.210 226890 DEBUG oslo_concurrency.lockutils [req-b54914f9-8397-4a36-85f4-29fe705e20f3 req-c13f15bc-9c4b-4a6b-a0b0-cef4e8d2c506 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b3504af3-390e-4ab0-8af6-15749a887d8f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:16:31 np0005588920 nova_compute[226886]: 2026-01-20 15:16:31.211 226890 DEBUG oslo_concurrency.lockutils [req-b54914f9-8397-4a36-85f4-29fe705e20f3 req-c13f15bc-9c4b-4a6b-a0b0-cef4e8d2c506 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b3504af3-390e-4ab0-8af6-15749a887d8f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:16:31 np0005588920 nova_compute[226886]: 2026-01-20 15:16:31.211 226890 DEBUG nova.compute.manager [req-b54914f9-8397-4a36-85f4-29fe705e20f3 req-c13f15bc-9c4b-4a6b-a0b0-cef4e8d2c506 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] No waiting events found dispatching network-vif-plugged-1b18c40e-cce7-4971-98d2-c95ec41c9040 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:16:31 np0005588920 nova_compute[226886]: 2026-01-20 15:16:31.211 226890 WARNING nova.compute.manager [req-b54914f9-8397-4a36-85f4-29fe705e20f3 req-c13f15bc-9c4b-4a6b-a0b0-cef4e8d2c506 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Received unexpected event network-vif-plugged-1b18c40e-cce7-4971-98d2-c95ec41c9040 for instance with vm_state resized and task_state None.#033[00m
Jan 20 10:16:31 np0005588920 nova_compute[226886]: 2026-01-20 15:16:31.228 226890 DEBUG oslo_concurrency.lockutils [None req-6192b708-5692-47cf-8a20-6b4912e43001 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquiring lock "b3504af3-390e-4ab0-8af6-15749a887d8f" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:16:31 np0005588920 nova_compute[226886]: 2026-01-20 15:16:31.228 226890 DEBUG oslo_concurrency.lockutils [None req-6192b708-5692-47cf-8a20-6b4912e43001 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "b3504af3-390e-4ab0-8af6-15749a887d8f" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:16:31 np0005588920 nova_compute[226886]: 2026-01-20 15:16:31.228 226890 DEBUG nova.compute.manager [None req-6192b708-5692-47cf-8a20-6b4912e43001 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Going to confirm migration 22 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Jan 20 10:16:32 np0005588920 nova_compute[226886]: 2026-01-20 15:16:32.102 226890 DEBUG neutronclient.v2_0.client [None req-6192b708-5692-47cf-8a20-6b4912e43001 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 1b18c40e-cce7-4971-98d2-c95ec41c9040 for host compute-2.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Jan 20 10:16:32 np0005588920 nova_compute[226886]: 2026-01-20 15:16:32.103 226890 DEBUG oslo_concurrency.lockutils [None req-6192b708-5692-47cf-8a20-6b4912e43001 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquiring lock "refresh_cache-b3504af3-390e-4ab0-8af6-15749a887d8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:16:32 np0005588920 nova_compute[226886]: 2026-01-20 15:16:32.103 226890 DEBUG oslo_concurrency.lockutils [None req-6192b708-5692-47cf-8a20-6b4912e43001 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquired lock "refresh_cache-b3504af3-390e-4ab0-8af6-15749a887d8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:16:32 np0005588920 nova_compute[226886]: 2026-01-20 15:16:32.103 226890 DEBUG nova.network.neutron [None req-6192b708-5692-47cf-8a20-6b4912e43001 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:16:32 np0005588920 nova_compute[226886]: 2026-01-20 15:16:32.104 226890 DEBUG nova.objects.instance [None req-6192b708-5692-47cf-8a20-6b4912e43001 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lazy-loading 'info_cache' on Instance uuid b3504af3-390e-4ab0-8af6-15749a887d8f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:16:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:16:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:32.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:16:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:32.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:32 np0005588920 nova_compute[226886]: 2026-01-20 15:16:32.779 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e407 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:16:33 np0005588920 nova_compute[226886]: 2026-01-20 15:16:33.087 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768922178.0855103, b3504af3-390e-4ab0-8af6-15749a887d8f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:16:33 np0005588920 nova_compute[226886]: 2026-01-20 15:16:33.087 226890 INFO nova.compute.manager [-] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:16:33 np0005588920 nova_compute[226886]: 2026-01-20 15:16:33.119 226890 DEBUG nova.compute.manager [None req-ff951a02-f9d8-46ac-a60f-5c463685372c - - - - - -] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:16:33 np0005588920 nova_compute[226886]: 2026-01-20 15:16:33.124 226890 DEBUG nova.compute.manager [None req-ff951a02-f9d8-46ac-a60f-5c463685372c - - - - - -] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: resized, current task_state: None, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:16:33 np0005588920 nova_compute[226886]: 2026-01-20 15:16:33.145 226890 INFO nova.compute.manager [None req-ff951a02-f9d8-46ac-a60f-5c463685372c - - - - - -] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-2.ctlplane.example.com#033[00m
Jan 20 10:16:33 np0005588920 ovn_controller[133971]: 2026-01-20T15:16:33Z|00868|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 20 10:16:33 np0005588920 nova_compute[226886]: 2026-01-20 15:16:33.602 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:33 np0005588920 nova_compute[226886]: 2026-01-20 15:16:33.831 226890 DEBUG nova.network.neutron [None req-6192b708-5692-47cf-8a20-6b4912e43001 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Updating instance_info_cache with network_info: [{"id": "1b18c40e-cce7-4971-98d2-c95ec41c9040", "address": "fa:16:3e:eb:05:c3", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b18c40e-cc", "ovs_interfaceid": "1b18c40e-cce7-4971-98d2-c95ec41c9040", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:16:33 np0005588920 nova_compute[226886]: 2026-01-20 15:16:33.851 226890 DEBUG oslo_concurrency.lockutils [None req-6192b708-5692-47cf-8a20-6b4912e43001 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Releasing lock "refresh_cache-b3504af3-390e-4ab0-8af6-15749a887d8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:16:33 np0005588920 nova_compute[226886]: 2026-01-20 15:16:33.852 226890 DEBUG nova.objects.instance [None req-6192b708-5692-47cf-8a20-6b4912e43001 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lazy-loading 'migration_context' on Instance uuid b3504af3-390e-4ab0-8af6-15749a887d8f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:16:33 np0005588920 nova_compute[226886]: 2026-01-20 15:16:33.935 226890 DEBUG nova.storage.rbd_utils [None req-6192b708-5692-47cf-8a20-6b4912e43001 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] removing snapshot(nova-resize) on rbd image(b3504af3-390e-4ab0-8af6-15749a887d8f_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 20 10:16:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:34.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:34.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:34 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e408 e408: 3 total, 3 up, 3 in
Jan 20 10:16:34 np0005588920 nova_compute[226886]: 2026-01-20 15:16:34.720 226890 DEBUG nova.virt.libvirt.vif [None req-6192b708-5692-47cf-8a20-6b4912e43001 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:14:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='multiattach-server-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='multiattach-server-1',id=189,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL5L2o6o5dLcQyaIfhCZ5CKxQlecqNGmP68oHIQEsVoKIC2qfrMKjObT9GdMU8oznX9LVUwIWCShhlEJu9ZqPiutEL2afEJ1hQQamjERNcx9wWS2NfOgykA4yugQphfOtA==',key_name='tempest-keypair-1568469072',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:16:29Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fff727019f86407498e83d7948d54962',ramdisk_id='',reservation_id='r-akp879yg',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-AttachVolumeMultiAttachTest-418194625',owner_user_name='tempest-AttachVolumeMultiAttachTest-418194625-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:16:29Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e9cc4ce3e069479ba9c789b378a68a1d',uuid=b3504af3-390e-4ab0-8af6-15749a887d8f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "1b18c40e-cce7-4971-98d2-c95ec41c9040", "address": "fa:16:3e:eb:05:c3", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b18c40e-cc", "ovs_interfaceid": "1b18c40e-cce7-4971-98d2-c95ec41c9040", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:16:34 np0005588920 nova_compute[226886]: 2026-01-20 15:16:34.720 226890 DEBUG nova.network.os_vif_util [None req-6192b708-5692-47cf-8a20-6b4912e43001 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Converting VIF {"id": "1b18c40e-cce7-4971-98d2-c95ec41c9040", "address": "fa:16:3e:eb:05:c3", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b18c40e-cc", "ovs_interfaceid": "1b18c40e-cce7-4971-98d2-c95ec41c9040", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:16:34 np0005588920 nova_compute[226886]: 2026-01-20 15:16:34.721 226890 DEBUG nova.network.os_vif_util [None req-6192b708-5692-47cf-8a20-6b4912e43001 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:eb:05:c3,bridge_name='br-int',has_traffic_filtering=True,id=1b18c40e-cce7-4971-98d2-c95ec41c9040,network=Network(c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b18c40e-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:16:34 np0005588920 nova_compute[226886]: 2026-01-20 15:16:34.721 226890 DEBUG os_vif [None req-6192b708-5692-47cf-8a20-6b4912e43001 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:eb:05:c3,bridge_name='br-int',has_traffic_filtering=True,id=1b18c40e-cce7-4971-98d2-c95ec41c9040,network=Network(c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b18c40e-cc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:16:34 np0005588920 nova_compute[226886]: 2026-01-20 15:16:34.723 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:34 np0005588920 nova_compute[226886]: 2026-01-20 15:16:34.723 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1b18c40e-cc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:16:34 np0005588920 nova_compute[226886]: 2026-01-20 15:16:34.723 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:16:34 np0005588920 nova_compute[226886]: 2026-01-20 15:16:34.725 226890 INFO os_vif [None req-6192b708-5692-47cf-8a20-6b4912e43001 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:eb:05:c3,bridge_name='br-int',has_traffic_filtering=True,id=1b18c40e-cce7-4971-98d2-c95ec41c9040,network=Network(c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b18c40e-cc')#033[00m
Jan 20 10:16:34 np0005588920 nova_compute[226886]: 2026-01-20 15:16:34.725 226890 DEBUG oslo_concurrency.lockutils [None req-6192b708-5692-47cf-8a20-6b4912e43001 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:16:34 np0005588920 nova_compute[226886]: 2026-01-20 15:16:34.725 226890 DEBUG oslo_concurrency.lockutils [None req-6192b708-5692-47cf-8a20-6b4912e43001 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:16:35 np0005588920 nova_compute[226886]: 2026-01-20 15:16:35.560 226890 DEBUG oslo_concurrency.processutils [None req-6192b708-5692-47cf-8a20-6b4912e43001 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:16:36 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:16:36 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/796769356' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:16:36 np0005588920 nova_compute[226886]: 2026-01-20 15:16:36.120 226890 DEBUG oslo_concurrency.processutils [None req-6192b708-5692-47cf-8a20-6b4912e43001 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.560s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:16:36 np0005588920 nova_compute[226886]: 2026-01-20 15:16:36.127 226890 DEBUG nova.compute.provider_tree [None req-6192b708-5692-47cf-8a20-6b4912e43001 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:16:36 np0005588920 nova_compute[226886]: 2026-01-20 15:16:36.146 226890 DEBUG nova.scheduler.client.report [None req-6192b708-5692-47cf-8a20-6b4912e43001 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:16:36 np0005588920 nova_compute[226886]: 2026-01-20 15:16:36.203 226890 DEBUG oslo_concurrency.lockutils [None req-6192b708-5692-47cf-8a20-6b4912e43001 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 1.477s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:16:36 np0005588920 nova_compute[226886]: 2026-01-20 15:16:36.353 226890 INFO nova.scheduler.client.report [None req-6192b708-5692-47cf-8a20-6b4912e43001 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Deleted allocation for migration 80437f3e-3a02-4f66-bb65-e87ce242092f#033[00m
Jan 20 10:16:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:36.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:36 np0005588920 nova_compute[226886]: 2026-01-20 15:16:36.405 226890 DEBUG oslo_concurrency.lockutils [None req-6192b708-5692-47cf-8a20-6b4912e43001 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "b3504af3-390e-4ab0-8af6-15749a887d8f" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 5.176s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:16:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:36.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:37 np0005588920 nova_compute[226886]: 2026-01-20 15:16:37.782 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:38 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e408 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:16:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:38.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:38.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:38 np0005588920 nova_compute[226886]: 2026-01-20 15:16:38.605 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:39 np0005588920 podman[296619]: 2026-01-20 15:16:39.957834238 +0000 UTC m=+0.048100961 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:16:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:40.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:40.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:41 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e409 e409: 3 total, 3 up, 3 in
Jan 20 10:16:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:42.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:42.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:42 np0005588920 nova_compute[226886]: 2026-01-20 15:16:42.785 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:42 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #136. Immutable memtables: 0.
Jan 20 10:16:42 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:16:42.877616) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:16:42 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 85] Flushing memtable with next log file: 136
Jan 20 10:16:42 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922202877669, "job": 85, "event": "flush_started", "num_memtables": 1, "num_entries": 2434, "num_deletes": 255, "total_data_size": 5483757, "memory_usage": 5560240, "flush_reason": "Manual Compaction"}
Jan 20 10:16:42 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 85] Level-0 flush table #137: started
Jan 20 10:16:42 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922202901587, "cf_name": "default", "job": 85, "event": "table_file_creation", "file_number": 137, "file_size": 3592662, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 66466, "largest_seqno": 68895, "table_properties": {"data_size": 3582980, "index_size": 6047, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2565, "raw_key_size": 20909, "raw_average_key_size": 20, "raw_value_size": 3563296, "raw_average_value_size": 3524, "num_data_blocks": 263, "num_entries": 1011, "num_filter_entries": 1011, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768922008, "oldest_key_time": 1768922008, "file_creation_time": 1768922202, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 137, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:16:42 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 85] Flush lasted 24125 microseconds, and 7445 cpu microseconds.
Jan 20 10:16:42 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:16:42 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:16:42.901736) [db/flush_job.cc:967] [default] [JOB 85] Level-0 flush table #137: 3592662 bytes OK
Jan 20 10:16:42 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:16:42.901793) [db/memtable_list.cc:519] [default] Level-0 commit table #137 started
Jan 20 10:16:42 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:16:42.904610) [db/memtable_list.cc:722] [default] Level-0 commit table #137: memtable #1 done
Jan 20 10:16:42 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:16:42.904631) EVENT_LOG_v1 {"time_micros": 1768922202904624, "job": 85, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:16:42 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:16:42.904650) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:16:42 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 85] Try to delete WAL files size 5472980, prev total WAL file size 5472980, number of live WAL files 2.
Jan 20 10:16:42 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000133.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:16:42 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:16:42.906500) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730035373733' seq:72057594037927935, type:22 .. '7061786F730036303235' seq:0, type:0; will stop at (end)
Jan 20 10:16:42 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 86] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:16:42 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 85 Base level 0, inputs: [137(3508KB)], [135(9718KB)]
Jan 20 10:16:42 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922202906546, "job": 86, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [137], "files_L6": [135], "score": -1, "input_data_size": 13544170, "oldest_snapshot_seqno": -1}
Jan 20 10:16:42 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 86] Generated table #138: 9254 keys, 11672965 bytes, temperature: kUnknown
Jan 20 10:16:42 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922202981391, "cf_name": "default", "job": 86, "event": "table_file_creation", "file_number": 138, "file_size": 11672965, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11612952, "index_size": 35772, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 23173, "raw_key_size": 243082, "raw_average_key_size": 26, "raw_value_size": 11450166, "raw_average_value_size": 1237, "num_data_blocks": 1364, "num_entries": 9254, "num_filter_entries": 9254, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768922202, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 138, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:16:42 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:16:42 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:16:42.981632) [db/compaction/compaction_job.cc:1663] [default] [JOB 86] Compacted 1@0 + 1@6 files to L6 => 11672965 bytes
Jan 20 10:16:42 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:16:42.994316) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 180.8 rd, 155.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.4, 9.5 +0.0 blob) out(11.1 +0.0 blob), read-write-amplify(7.0) write-amplify(3.2) OK, records in: 9779, records dropped: 525 output_compression: NoCompression
Jan 20 10:16:42 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:16:42.994361) EVENT_LOG_v1 {"time_micros": 1768922202994344, "job": 86, "event": "compaction_finished", "compaction_time_micros": 74914, "compaction_time_cpu_micros": 26137, "output_level": 6, "num_output_files": 1, "total_output_size": 11672965, "num_input_records": 9779, "num_output_records": 9254, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:16:42 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000137.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:16:42 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922202995156, "job": 86, "event": "table_file_deletion", "file_number": 137}
Jan 20 10:16:42 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000135.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:16:42 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922202997331, "job": 86, "event": "table_file_deletion", "file_number": 135}
Jan 20 10:16:42 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:16:42.906394) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:16:42 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:16:42.997381) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:16:42 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:16:42.997386) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:16:42 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:16:42.997388) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:16:42 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:16:42.997389) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:16:42 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:16:42.997390) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:16:43 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e409 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:16:43 np0005588920 nova_compute[226886]: 2026-01-20 15:16:43.606 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:44.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:44.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:16:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:46.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:16:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:46.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:47 np0005588920 nova_compute[226886]: 2026-01-20 15:16:47.787 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:48 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e409 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:16:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:48.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:48.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:48 np0005588920 nova_compute[226886]: 2026-01-20 15:16:48.608 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:48 np0005588920 ovn_controller[133971]: 2026-01-20T15:16:48Z|00869|binding|INFO|Releasing lport b20b0e27-0b08-4316-b6df-6784416f44c0 from this chassis (sb_readonly=0)
Jan 20 10:16:48 np0005588920 nova_compute[226886]: 2026-01-20 15:16:48.693 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:50.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:50.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:16:50.536 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=62, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=61) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:16:50 np0005588920 nova_compute[226886]: 2026-01-20 15:16:50.537 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:16:50.537 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:16:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:52.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:52.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:52 np0005588920 nova_compute[226886]: 2026-01-20 15:16:52.790 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:53 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e409 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:16:53 np0005588920 nova_compute[226886]: 2026-01-20 15:16:53.386 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:53 np0005588920 nova_compute[226886]: 2026-01-20 15:16:53.610 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:54.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:16:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:54.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:16:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:16:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:56.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:16:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:56.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:56 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:16:56.539 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '62'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:16:56 np0005588920 ceph-osd[79820]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/18.2.7/rpm/el9/BUILD/ceph-18.2.7/src/cls/lock/cls_lock.cc:291: Could not read list of current lockers off disk: (2) No such file or directory
Jan 20 10:16:57 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:16:57 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:16:57 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:16:57 np0005588920 nova_compute[226886]: 2026-01-20 15:16:57.793 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:57 np0005588920 nova_compute[226886]: 2026-01-20 15:16:57.826 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:57 np0005588920 podman[296769]: 2026-01-20 15:16:57.994124566 +0000 UTC m=+0.078590852 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:16:58 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e409 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:16:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:16:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:58.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:16:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:16:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:58.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:58 np0005588920 nova_compute[226886]: 2026-01-20 15:16:58.612 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:59 np0005588920 nova_compute[226886]: 2026-01-20 15:16:59.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:16:59 np0005588920 nova_compute[226886]: 2026-01-20 15:16:59.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:16:59 np0005588920 nova_compute[226886]: 2026-01-20 15:16:59.750 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: b3504af3-390e-4ab0-8af6-15749a887d8f] Skipping network cache update for instance because it has been migrated to another host. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9902#033[00m
Jan 20 10:16:59 np0005588920 nova_compute[226886]: 2026-01-20 15:16:59.750 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:17:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:00.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:00.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:01 np0005588920 nova_compute[226886]: 2026-01-20 15:17:01.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:17:01 np0005588920 nova_compute[226886]: 2026-01-20 15:17:01.759 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:17:01 np0005588920 nova_compute[226886]: 2026-01-20 15:17:01.760 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:17:01 np0005588920 nova_compute[226886]: 2026-01-20 15:17:01.760 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:17:01 np0005588920 nova_compute[226886]: 2026-01-20 15:17:01.760 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:17:01 np0005588920 nova_compute[226886]: 2026-01-20 15:17:01.760 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:17:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:17:02 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1955683945' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:17:02 np0005588920 nova_compute[226886]: 2026-01-20 15:17:02.227 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:17:02 np0005588920 nova_compute[226886]: 2026-01-20 15:17:02.306 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-000000ba as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:17:02 np0005588920 nova_compute[226886]: 2026-01-20 15:17:02.307 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-000000ba as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:17:02 np0005588920 nova_compute[226886]: 2026-01-20 15:17:02.311 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-000000b8 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:17:02 np0005588920 nova_compute[226886]: 2026-01-20 15:17:02.311 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-000000b8 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:17:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:02.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:17:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:02.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:17:02 np0005588920 nova_compute[226886]: 2026-01-20 15:17:02.463 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:17:02 np0005588920 nova_compute[226886]: 2026-01-20 15:17:02.464 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3765MB free_disk=20.805618286132812GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:17:02 np0005588920 nova_compute[226886]: 2026-01-20 15:17:02.464 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:17:02 np0005588920 nova_compute[226886]: 2026-01-20 15:17:02.464 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:17:02 np0005588920 nova_compute[226886]: 2026-01-20 15:17:02.586 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance 810e72a9-536d-4214-956b-9d5216cce8ff actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:17:02 np0005588920 nova_compute[226886]: 2026-01-20 15:17:02.587 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance 54a13784-2a60-4b16-8208-d9b9d0e3033e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:17:02 np0005588920 nova_compute[226886]: 2026-01-20 15:17:02.587 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:17:02 np0005588920 nova_compute[226886]: 2026-01-20 15:17:02.587 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:17:02 np0005588920 nova_compute[226886]: 2026-01-20 15:17:02.632 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:17:02 np0005588920 nova_compute[226886]: 2026-01-20 15:17:02.795 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:03 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e409 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:17:03 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:17:03 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/548683424' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:17:03 np0005588920 nova_compute[226886]: 2026-01-20 15:17:03.086 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:17:03 np0005588920 nova_compute[226886]: 2026-01-20 15:17:03.094 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:17:03 np0005588920 nova_compute[226886]: 2026-01-20 15:17:03.117 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:17:03 np0005588920 nova_compute[226886]: 2026-01-20 15:17:03.142 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:17:03 np0005588920 nova_compute[226886]: 2026-01-20 15:17:03.142 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.678s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:17:03 np0005588920 nova_compute[226886]: 2026-01-20 15:17:03.616 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:03 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:17:03 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:17:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:04.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:04.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:05 np0005588920 nova_compute[226886]: 2026-01-20 15:17:05.098 226890 DEBUG oslo_concurrency.lockutils [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquiring lock "13d313ff-27f8-40d0-96d4-5ddb1605cad9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:17:05 np0005588920 nova_compute[226886]: 2026-01-20 15:17:05.099 226890 DEBUG oslo_concurrency.lockutils [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "13d313ff-27f8-40d0-96d4-5ddb1605cad9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:17:05 np0005588920 nova_compute[226886]: 2026-01-20 15:17:05.118 226890 DEBUG nova.compute.manager [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 10:17:05 np0005588920 nova_compute[226886]: 2026-01-20 15:17:05.190 226890 DEBUG oslo_concurrency.lockutils [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:17:05 np0005588920 nova_compute[226886]: 2026-01-20 15:17:05.190 226890 DEBUG oslo_concurrency.lockutils [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:17:05 np0005588920 nova_compute[226886]: 2026-01-20 15:17:05.196 226890 DEBUG nova.virt.hardware [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 10:17:05 np0005588920 nova_compute[226886]: 2026-01-20 15:17:05.197 226890 INFO nova.compute.claims [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 10:17:05 np0005588920 nova_compute[226886]: 2026-01-20 15:17:05.369 226890 DEBUG oslo_concurrency.processutils [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:17:05 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:17:05 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3529627429' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:17:05 np0005588920 nova_compute[226886]: 2026-01-20 15:17:05.825 226890 DEBUG oslo_concurrency.processutils [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:17:05 np0005588920 nova_compute[226886]: 2026-01-20 15:17:05.831 226890 DEBUG nova.compute.provider_tree [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:17:05 np0005588920 nova_compute[226886]: 2026-01-20 15:17:05.847 226890 DEBUG nova.scheduler.client.report [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:17:05 np0005588920 nova_compute[226886]: 2026-01-20 15:17:05.869 226890 DEBUG oslo_concurrency.lockutils [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.678s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:17:05 np0005588920 nova_compute[226886]: 2026-01-20 15:17:05.869 226890 DEBUG nova.compute.manager [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 10:17:05 np0005588920 nova_compute[226886]: 2026-01-20 15:17:05.917 226890 DEBUG nova.compute.manager [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 10:17:05 np0005588920 nova_compute[226886]: 2026-01-20 15:17:05.917 226890 DEBUG nova.network.neutron [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 10:17:05 np0005588920 nova_compute[226886]: 2026-01-20 15:17:05.938 226890 INFO nova.virt.libvirt.driver [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 10:17:05 np0005588920 nova_compute[226886]: 2026-01-20 15:17:05.953 226890 DEBUG nova.compute.manager [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 10:17:05 np0005588920 nova_compute[226886]: 2026-01-20 15:17:05.988 226890 INFO nova.virt.block_device [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Booting with volume 972ce456-91dd-4e78-8ae7-4dd3cb2257e8 at /dev/vda#033[00m
Jan 20 10:17:06 np0005588920 nova_compute[226886]: 2026-01-20 15:17:06.161 226890 DEBUG os_brick.utils [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 20 10:17:06 np0005588920 nova_compute[226886]: 2026-01-20 15:17:06.163 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:17:06 np0005588920 nova_compute[226886]: 2026-01-20 15:17:06.174 231437 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:17:06 np0005588920 nova_compute[226886]: 2026-01-20 15:17:06.174 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[79cc2d24-a96c-4295-b3d3-fb7412c5027e]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:06 np0005588920 nova_compute[226886]: 2026-01-20 15:17:06.175 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:17:06 np0005588920 nova_compute[226886]: 2026-01-20 15:17:06.182 231437 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.007s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:17:06 np0005588920 nova_compute[226886]: 2026-01-20 15:17:06.183 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[659469de-582b-4099-abd6-8460fd9b670a]: (4, ('InitiatorName=iqn.1994-05.com.redhat:f7e7b2e5d28e', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:06 np0005588920 nova_compute[226886]: 2026-01-20 15:17:06.184 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:17:06 np0005588920 nova_compute[226886]: 2026-01-20 15:17:06.192 231437 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:17:06 np0005588920 nova_compute[226886]: 2026-01-20 15:17:06.192 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[734bcdb8-01e2-4c29-9458-a6caf4841105]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:06 np0005588920 nova_compute[226886]: 2026-01-20 15:17:06.193 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[6377106e-6f26-4141-b470-9904caab2f77]: (4, '1190ec40-2b89-4358-8dec-733c5829fbed') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:06 np0005588920 nova_compute[226886]: 2026-01-20 15:17:06.193 226890 DEBUG oslo_concurrency.processutils [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:17:06 np0005588920 nova_compute[226886]: 2026-01-20 15:17:06.221 226890 DEBUG oslo_concurrency.processutils [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CMD "nvme version" returned: 0 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:17:06 np0005588920 nova_compute[226886]: 2026-01-20 15:17:06.223 226890 DEBUG os_brick.initiator.connectors.lightos [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 20 10:17:06 np0005588920 nova_compute[226886]: 2026-01-20 15:17:06.224 226890 DEBUG os_brick.initiator.connectors.lightos [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 20 10:17:06 np0005588920 nova_compute[226886]: 2026-01-20 15:17:06.224 226890 DEBUG os_brick.initiator.connectors.lightos [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 20 10:17:06 np0005588920 nova_compute[226886]: 2026-01-20 15:17:06.224 226890 DEBUG os_brick.utils [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] <== get_connector_properties: return (62ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:f7e7b2e5d28e', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '1190ec40-2b89-4358-8dec-733c5829fbed', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 20 10:17:06 np0005588920 nova_compute[226886]: 2026-01-20 15:17:06.224 226890 DEBUG nova.virt.block_device [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Updating existing volume attachment record: 2f967f97-fbb3-4eef-b805-b5ffb0cafc02 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 20 10:17:06 np0005588920 nova_compute[226886]: 2026-01-20 15:17:06.242 226890 DEBUG nova.policy [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e9cc4ce3e069479ba9c789b378a68a1d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fff727019f86407498e83d7948d54962', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 10:17:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:06.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:17:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:06.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:17:06 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:17:06 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3133455725' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:17:07 np0005588920 nova_compute[226886]: 2026-01-20 15:17:07.143 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:17:07 np0005588920 nova_compute[226886]: 2026-01-20 15:17:07.144 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:17:07 np0005588920 nova_compute[226886]: 2026-01-20 15:17:07.368 226890 DEBUG nova.compute.manager [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 10:17:07 np0005588920 nova_compute[226886]: 2026-01-20 15:17:07.370 226890 DEBUG nova.virt.libvirt.driver [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 10:17:07 np0005588920 nova_compute[226886]: 2026-01-20 15:17:07.371 226890 INFO nova.virt.libvirt.driver [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Creating image(s)#033[00m
Jan 20 10:17:07 np0005588920 nova_compute[226886]: 2026-01-20 15:17:07.371 226890 DEBUG nova.virt.libvirt.driver [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 20 10:17:07 np0005588920 nova_compute[226886]: 2026-01-20 15:17:07.371 226890 DEBUG nova.virt.libvirt.driver [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Ensure instance console log exists: /var/lib/nova/instances/13d313ff-27f8-40d0-96d4-5ddb1605cad9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:17:07 np0005588920 nova_compute[226886]: 2026-01-20 15:17:07.372 226890 DEBUG oslo_concurrency.lockutils [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:17:07 np0005588920 nova_compute[226886]: 2026-01-20 15:17:07.372 226890 DEBUG oslo_concurrency.lockutils [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:17:07 np0005588920 nova_compute[226886]: 2026-01-20 15:17:07.372 226890 DEBUG oslo_concurrency.lockutils [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:17:07 np0005588920 nova_compute[226886]: 2026-01-20 15:17:07.427 226890 DEBUG nova.network.neutron [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Successfully created port: 45db3ebd-1dc8-4c76-999b-2f2a1172317c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 10:17:07 np0005588920 nova_compute[226886]: 2026-01-20 15:17:07.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:17:07 np0005588920 nova_compute[226886]: 2026-01-20 15:17:07.798 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:08 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e409 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:17:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:08.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:17:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:08.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:17:08 np0005588920 nova_compute[226886]: 2026-01-20 15:17:08.653 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:08 np0005588920 nova_compute[226886]: 2026-01-20 15:17:08.897 226890 DEBUG nova.network.neutron [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Successfully updated port: 45db3ebd-1dc8-4c76-999b-2f2a1172317c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 10:17:08 np0005588920 nova_compute[226886]: 2026-01-20 15:17:08.912 226890 DEBUG oslo_concurrency.lockutils [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquiring lock "refresh_cache-13d313ff-27f8-40d0-96d4-5ddb1605cad9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:17:08 np0005588920 nova_compute[226886]: 2026-01-20 15:17:08.913 226890 DEBUG oslo_concurrency.lockutils [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquired lock "refresh_cache-13d313ff-27f8-40d0-96d4-5ddb1605cad9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:17:08 np0005588920 nova_compute[226886]: 2026-01-20 15:17:08.913 226890 DEBUG nova.network.neutron [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:17:09 np0005588920 nova_compute[226886]: 2026-01-20 15:17:09.342 226890 DEBUG nova.network.neutron [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:17:09 np0005588920 nova_compute[226886]: 2026-01-20 15:17:09.381 226890 DEBUG nova.compute.manager [req-961fa547-d8e4-4216-84a6-ede38312c782 req-48117880-70f1-49fd-a961-28d014d46688 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Received event network-changed-45db3ebd-1dc8-4c76-999b-2f2a1172317c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:17:09 np0005588920 nova_compute[226886]: 2026-01-20 15:17:09.382 226890 DEBUG nova.compute.manager [req-961fa547-d8e4-4216-84a6-ede38312c782 req-48117880-70f1-49fd-a961-28d014d46688 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Refreshing instance network info cache due to event network-changed-45db3ebd-1dc8-4c76-999b-2f2a1172317c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:17:09 np0005588920 nova_compute[226886]: 2026-01-20 15:17:09.382 226890 DEBUG oslo_concurrency.lockutils [req-961fa547-d8e4-4216-84a6-ede38312c782 req-48117880-70f1-49fd-a961-28d014d46688 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-13d313ff-27f8-40d0-96d4-5ddb1605cad9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:17:09 np0005588920 nova_compute[226886]: 2026-01-20 15:17:09.721 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:17:09 np0005588920 nova_compute[226886]: 2026-01-20 15:17:09.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:17:09 np0005588920 nova_compute[226886]: 2026-01-20 15:17:09.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:17:10 np0005588920 nova_compute[226886]: 2026-01-20 15:17:10.287 226890 DEBUG nova.network.neutron [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Updating instance_info_cache with network_info: [{"id": "45db3ebd-1dc8-4c76-999b-2f2a1172317c", "address": "fa:16:3e:0f:5a:02", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45db3ebd-1d", "ovs_interfaceid": "45db3ebd-1dc8-4c76-999b-2f2a1172317c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:17:10 np0005588920 nova_compute[226886]: 2026-01-20 15:17:10.329 226890 DEBUG oslo_concurrency.lockutils [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Releasing lock "refresh_cache-13d313ff-27f8-40d0-96d4-5ddb1605cad9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:17:10 np0005588920 nova_compute[226886]: 2026-01-20 15:17:10.329 226890 DEBUG nova.compute.manager [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Instance network_info: |[{"id": "45db3ebd-1dc8-4c76-999b-2f2a1172317c", "address": "fa:16:3e:0f:5a:02", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45db3ebd-1d", "ovs_interfaceid": "45db3ebd-1dc8-4c76-999b-2f2a1172317c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 10:17:10 np0005588920 nova_compute[226886]: 2026-01-20 15:17:10.329 226890 DEBUG oslo_concurrency.lockutils [req-961fa547-d8e4-4216-84a6-ede38312c782 req-48117880-70f1-49fd-a961-28d014d46688 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-13d313ff-27f8-40d0-96d4-5ddb1605cad9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:17:10 np0005588920 nova_compute[226886]: 2026-01-20 15:17:10.329 226890 DEBUG nova.network.neutron [req-961fa547-d8e4-4216-84a6-ede38312c782 req-48117880-70f1-49fd-a961-28d014d46688 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Refreshing network info cache for port 45db3ebd-1dc8-4c76-999b-2f2a1172317c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:17:10 np0005588920 nova_compute[226886]: 2026-01-20 15:17:10.333 226890 DEBUG nova.virt.libvirt.driver [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Start _get_guest_xml network_info=[{"id": "45db3ebd-1dc8-4c76-999b-2f2a1172317c", "address": "fa:16:3e:0f:5a:02", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45db3ebd-1d", "ovs_interfaceid": "45db3ebd-1dc8-4c76-999b-2f2a1172317c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None 
block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'device_type': 'disk', 'boot_index': 0, 'delete_on_termination': False, 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-972ce456-91dd-4e78-8ae7-4dd3cb2257e8', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '972ce456-91dd-4e78-8ae7-4dd3cb2257e8', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '13d313ff-27f8-40d0-96d4-5ddb1605cad9', 'attached_at': '', 'detached_at': '', 'volume_id': '972ce456-91dd-4e78-8ae7-4dd3cb2257e8', 'serial': '972ce456-91dd-4e78-8ae7-4dd3cb2257e8', 'multiattach': True}, 'mount_device': '/dev/vda', 'guest_format': None, 'disk_bus': 'virtio', 'attachment_id': '2f967f97-fbb3-4eef-b805-b5ffb0cafc02', 'volume_type': None}], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:17:10 np0005588920 nova_compute[226886]: 2026-01-20 15:17:10.338 226890 WARNING nova.virt.libvirt.driver [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:17:10 np0005588920 nova_compute[226886]: 2026-01-20 15:17:10.342 226890 DEBUG nova.virt.libvirt.host [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:17:10 np0005588920 nova_compute[226886]: 2026-01-20 15:17:10.343 226890 DEBUG nova.virt.libvirt.host [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:17:10 np0005588920 nova_compute[226886]: 2026-01-20 15:17:10.349 226890 DEBUG nova.virt.libvirt.host [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:17:10 np0005588920 nova_compute[226886]: 2026-01-20 15:17:10.349 226890 DEBUG nova.virt.libvirt.host [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:17:10 np0005588920 nova_compute[226886]: 2026-01-20 15:17:10.350 226890 DEBUG nova.virt.libvirt.driver [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:17:10 np0005588920 nova_compute[226886]: 2026-01-20 15:17:10.350 226890 DEBUG nova.virt.hardware [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:17:10 np0005588920 nova_compute[226886]: 2026-01-20 15:17:10.351 226890 DEBUG nova.virt.hardware [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:17:10 np0005588920 nova_compute[226886]: 2026-01-20 15:17:10.351 226890 DEBUG nova.virt.hardware [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:17:10 np0005588920 nova_compute[226886]: 2026-01-20 15:17:10.351 226890 DEBUG nova.virt.hardware [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:17:10 np0005588920 nova_compute[226886]: 2026-01-20 15:17:10.351 226890 DEBUG nova.virt.hardware [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:17:10 np0005588920 nova_compute[226886]: 2026-01-20 15:17:10.352 226890 DEBUG nova.virt.hardware [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:17:10 np0005588920 nova_compute[226886]: 2026-01-20 15:17:10.352 226890 DEBUG nova.virt.hardware [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:17:10 np0005588920 nova_compute[226886]: 2026-01-20 15:17:10.352 226890 DEBUG nova.virt.hardware [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:17:10 np0005588920 nova_compute[226886]: 2026-01-20 15:17:10.352 226890 DEBUG nova.virt.hardware [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:17:10 np0005588920 nova_compute[226886]: 2026-01-20 15:17:10.352 226890 DEBUG nova.virt.hardware [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:17:10 np0005588920 nova_compute[226886]: 2026-01-20 15:17:10.353 226890 DEBUG nova.virt.hardware [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:17:10 np0005588920 nova_compute[226886]: 2026-01-20 15:17:10.378 226890 DEBUG nova.storage.rbd_utils [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] rbd image 13d313ff-27f8-40d0-96d4-5ddb1605cad9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:17:10 np0005588920 nova_compute[226886]: 2026-01-20 15:17:10.382 226890 DEBUG oslo_concurrency.processutils [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:17:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:17:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:10.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:17:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:10.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:10 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:17:10 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3979331012' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:17:10 np0005588920 nova_compute[226886]: 2026-01-20 15:17:10.824 226890 DEBUG oslo_concurrency.processutils [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:17:10 np0005588920 nova_compute[226886]: 2026-01-20 15:17:10.849 226890 DEBUG nova.virt.libvirt.vif [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:17:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeMultiAttachTest-server-594388070',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumemultiattachtest-server-594388070',id=192,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fff727019f86407498e83d7948d54962',ramdisk_id='',reservation_id='r-1hpn9plq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-AttachVolumeMultiAttachTest-418194625',owner_user_name='tempest-AttachVolumeMultiAttachTest-418194625-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-
01-20T15:17:05Z,user_data=None,user_id='e9cc4ce3e069479ba9c789b378a68a1d',uuid=13d313ff-27f8-40d0-96d4-5ddb1605cad9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "45db3ebd-1dc8-4c76-999b-2f2a1172317c", "address": "fa:16:3e:0f:5a:02", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45db3ebd-1d", "ovs_interfaceid": "45db3ebd-1dc8-4c76-999b-2f2a1172317c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:17:10 np0005588920 nova_compute[226886]: 2026-01-20 15:17:10.850 226890 DEBUG nova.network.os_vif_util [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Converting VIF {"id": "45db3ebd-1dc8-4c76-999b-2f2a1172317c", "address": "fa:16:3e:0f:5a:02", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45db3ebd-1d", "ovs_interfaceid": "45db3ebd-1dc8-4c76-999b-2f2a1172317c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:17:10 np0005588920 nova_compute[226886]: 2026-01-20 15:17:10.851 226890 DEBUG nova.network.os_vif_util [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0f:5a:02,bridge_name='br-int',has_traffic_filtering=True,id=45db3ebd-1dc8-4c76-999b-2f2a1172317c,network=Network(c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap45db3ebd-1d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:17:10 np0005588920 nova_compute[226886]: 2026-01-20 15:17:10.853 226890 DEBUG nova.objects.instance [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lazy-loading 'pci_devices' on Instance uuid 13d313ff-27f8-40d0-96d4-5ddb1605cad9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:17:10 np0005588920 nova_compute[226886]: 2026-01-20 15:17:10.872 226890 DEBUG nova.virt.libvirt.driver [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:17:10 np0005588920 nova_compute[226886]:  <uuid>13d313ff-27f8-40d0-96d4-5ddb1605cad9</uuid>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:  <name>instance-000000c0</name>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:17:10 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:      <nova:name>tempest-AttachVolumeMultiAttachTest-server-594388070</nova:name>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 15:17:10</nova:creationTime>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 10:17:10 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:        <nova:user uuid="e9cc4ce3e069479ba9c789b378a68a1d">tempest-AttachVolumeMultiAttachTest-418194625-project-member</nova:user>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:        <nova:project uuid="fff727019f86407498e83d7948d54962">tempest-AttachVolumeMultiAttachTest-418194625</nova:project>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:        <nova:port uuid="45db3ebd-1dc8-4c76-999b-2f2a1172317c">
Jan 20 10:17:10 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    <system>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:      <entry name="serial">13d313ff-27f8-40d0-96d4-5ddb1605cad9</entry>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:      <entry name="uuid">13d313ff-27f8-40d0-96d4-5ddb1605cad9</entry>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    </system>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:  <os>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:  </os>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:  <features>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:  </features>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:  </clock>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:  <devices>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 10:17:10 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/13d313ff-27f8-40d0-96d4-5ddb1605cad9_disk.config">
Jan 20 10:17:10 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:17:10 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 10:17:10 np0005588920 nova_compute[226886]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="volumes/volume-972ce456-91dd-4e78-8ae7-4dd3cb2257e8">
Jan 20 10:17:10 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:17:10 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:      <serial>972ce456-91dd-4e78-8ae7-4dd3cb2257e8</serial>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:      <shareable/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 10:17:10 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:0f:5a:02"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:      <target dev="tap45db3ebd-1d"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    </interface>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 10:17:10 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/13d313ff-27f8-40d0-96d4-5ddb1605cad9/console.log" append="off"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    </serial>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    <video>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    </video>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 10:17:10 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    </rng>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 10:17:10 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 10:17:10 np0005588920 nova_compute[226886]:  </devices>
Jan 20 10:17:10 np0005588920 nova_compute[226886]: </domain>
Jan 20 10:17:10 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:17:10 np0005588920 nova_compute[226886]: 2026-01-20 15:17:10.872 226890 DEBUG nova.compute.manager [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Preparing to wait for external event network-vif-plugged-45db3ebd-1dc8-4c76-999b-2f2a1172317c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:17:10 np0005588920 nova_compute[226886]: 2026-01-20 15:17:10.873 226890 DEBUG oslo_concurrency.lockutils [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquiring lock "13d313ff-27f8-40d0-96d4-5ddb1605cad9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:17:10 np0005588920 nova_compute[226886]: 2026-01-20 15:17:10.873 226890 DEBUG oslo_concurrency.lockutils [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "13d313ff-27f8-40d0-96d4-5ddb1605cad9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:17:10 np0005588920 nova_compute[226886]: 2026-01-20 15:17:10.873 226890 DEBUG oslo_concurrency.lockutils [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "13d313ff-27f8-40d0-96d4-5ddb1605cad9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:17:10 np0005588920 nova_compute[226886]: 2026-01-20 15:17:10.874 226890 DEBUG nova.virt.libvirt.vif [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:17:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeMultiAttachTest-server-594388070',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumemultiattachtest-server-594388070',id=192,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fff727019f86407498e83d7948d54962',ramdisk_id='',reservation_id='r-1hpn9plq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-AttachVolumeMultiAttachTest-418194625',owner_user_name='tempest-AttachVolumeMultiAttachTest-418194625-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,update
d_at=2026-01-20T15:17:05Z,user_data=None,user_id='e9cc4ce3e069479ba9c789b378a68a1d',uuid=13d313ff-27f8-40d0-96d4-5ddb1605cad9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "45db3ebd-1dc8-4c76-999b-2f2a1172317c", "address": "fa:16:3e:0f:5a:02", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45db3ebd-1d", "ovs_interfaceid": "45db3ebd-1dc8-4c76-999b-2f2a1172317c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:17:10 np0005588920 nova_compute[226886]: 2026-01-20 15:17:10.874 226890 DEBUG nova.network.os_vif_util [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Converting VIF {"id": "45db3ebd-1dc8-4c76-999b-2f2a1172317c", "address": "fa:16:3e:0f:5a:02", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45db3ebd-1d", "ovs_interfaceid": "45db3ebd-1dc8-4c76-999b-2f2a1172317c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:17:10 np0005588920 nova_compute[226886]: 2026-01-20 15:17:10.875 226890 DEBUG nova.network.os_vif_util [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0f:5a:02,bridge_name='br-int',has_traffic_filtering=True,id=45db3ebd-1dc8-4c76-999b-2f2a1172317c,network=Network(c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap45db3ebd-1d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:17:10 np0005588920 nova_compute[226886]: 2026-01-20 15:17:10.875 226890 DEBUG os_vif [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0f:5a:02,bridge_name='br-int',has_traffic_filtering=True,id=45db3ebd-1dc8-4c76-999b-2f2a1172317c,network=Network(c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap45db3ebd-1d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:17:10 np0005588920 nova_compute[226886]: 2026-01-20 15:17:10.875 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:10 np0005588920 nova_compute[226886]: 2026-01-20 15:17:10.876 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:17:10 np0005588920 nova_compute[226886]: 2026-01-20 15:17:10.876 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:17:10 np0005588920 nova_compute[226886]: 2026-01-20 15:17:10.880 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:10 np0005588920 nova_compute[226886]: 2026-01-20 15:17:10.880 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap45db3ebd-1d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:17:10 np0005588920 nova_compute[226886]: 2026-01-20 15:17:10.881 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap45db3ebd-1d, col_values=(('external_ids', {'iface-id': '45db3ebd-1dc8-4c76-999b-2f2a1172317c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0f:5a:02', 'vm-uuid': '13d313ff-27f8-40d0-96d4-5ddb1605cad9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:17:10 np0005588920 NetworkManager[49076]: <info>  [1768922230.8831] manager: (tap45db3ebd-1d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/409)
Jan 20 10:17:10 np0005588920 nova_compute[226886]: 2026-01-20 15:17:10.885 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:17:10 np0005588920 nova_compute[226886]: 2026-01-20 15:17:10.892 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:10 np0005588920 nova_compute[226886]: 2026-01-20 15:17:10.894 226890 INFO os_vif [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0f:5a:02,bridge_name='br-int',has_traffic_filtering=True,id=45db3ebd-1dc8-4c76-999b-2f2a1172317c,network=Network(c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap45db3ebd-1d')#033[00m
Jan 20 10:17:10 np0005588920 podman[296961]: 2026-01-20 15:17:10.964024566 +0000 UTC m=+0.047898095 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 20 10:17:10 np0005588920 nova_compute[226886]: 2026-01-20 15:17:10.964 226890 DEBUG nova.virt.libvirt.driver [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:17:10 np0005588920 nova_compute[226886]: 2026-01-20 15:17:10.965 226890 DEBUG nova.virt.libvirt.driver [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:17:10 np0005588920 nova_compute[226886]: 2026-01-20 15:17:10.965 226890 DEBUG nova.virt.libvirt.driver [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] No VIF found with MAC fa:16:3e:0f:5a:02, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:17:10 np0005588920 nova_compute[226886]: 2026-01-20 15:17:10.965 226890 INFO nova.virt.libvirt.driver [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Using config drive#033[00m
Jan 20 10:17:11 np0005588920 nova_compute[226886]: 2026-01-20 15:17:10.999 226890 DEBUG nova.storage.rbd_utils [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] rbd image 13d313ff-27f8-40d0-96d4-5ddb1605cad9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:17:11 np0005588920 nova_compute[226886]: 2026-01-20 15:17:11.392 226890 INFO nova.virt.libvirt.driver [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Creating config drive at /var/lib/nova/instances/13d313ff-27f8-40d0-96d4-5ddb1605cad9/disk.config#033[00m
Jan 20 10:17:11 np0005588920 nova_compute[226886]: 2026-01-20 15:17:11.397 226890 DEBUG oslo_concurrency.processutils [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/13d313ff-27f8-40d0-96d4-5ddb1605cad9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp286zbecr execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:17:11 np0005588920 nova_compute[226886]: 2026-01-20 15:17:11.532 226890 DEBUG oslo_concurrency.processutils [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/13d313ff-27f8-40d0-96d4-5ddb1605cad9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp286zbecr" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:17:11 np0005588920 nova_compute[226886]: 2026-01-20 15:17:11.564 226890 DEBUG nova.storage.rbd_utils [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] rbd image 13d313ff-27f8-40d0-96d4-5ddb1605cad9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:17:11 np0005588920 nova_compute[226886]: 2026-01-20 15:17:11.567 226890 DEBUG oslo_concurrency.processutils [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/13d313ff-27f8-40d0-96d4-5ddb1605cad9/disk.config 13d313ff-27f8-40d0-96d4-5ddb1605cad9_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:17:11 np0005588920 nova_compute[226886]: 2026-01-20 15:17:11.596 226890 DEBUG nova.network.neutron [req-961fa547-d8e4-4216-84a6-ede38312c782 req-48117880-70f1-49fd-a961-28d014d46688 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Updated VIF entry in instance network info cache for port 45db3ebd-1dc8-4c76-999b-2f2a1172317c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:17:11 np0005588920 nova_compute[226886]: 2026-01-20 15:17:11.597 226890 DEBUG nova.network.neutron [req-961fa547-d8e4-4216-84a6-ede38312c782 req-48117880-70f1-49fd-a961-28d014d46688 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Updating instance_info_cache with network_info: [{"id": "45db3ebd-1dc8-4c76-999b-2f2a1172317c", "address": "fa:16:3e:0f:5a:02", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45db3ebd-1d", "ovs_interfaceid": "45db3ebd-1dc8-4c76-999b-2f2a1172317c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:17:11 np0005588920 nova_compute[226886]: 2026-01-20 15:17:11.619 226890 DEBUG oslo_concurrency.lockutils [req-961fa547-d8e4-4216-84a6-ede38312c782 req-48117880-70f1-49fd-a961-28d014d46688 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-13d313ff-27f8-40d0-96d4-5ddb1605cad9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:17:11 np0005588920 nova_compute[226886]: 2026-01-20 15:17:11.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:17:11 np0005588920 nova_compute[226886]: 2026-01-20 15:17:11.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:17:11 np0005588920 nova_compute[226886]: 2026-01-20 15:17:11.888 226890 DEBUG oslo_concurrency.processutils [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/13d313ff-27f8-40d0-96d4-5ddb1605cad9/disk.config 13d313ff-27f8-40d0-96d4-5ddb1605cad9_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.321s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:17:11 np0005588920 nova_compute[226886]: 2026-01-20 15:17:11.888 226890 INFO nova.virt.libvirt.driver [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Deleting local config drive /var/lib/nova/instances/13d313ff-27f8-40d0-96d4-5ddb1605cad9/disk.config because it was imported into RBD.#033[00m
Jan 20 10:17:11 np0005588920 kernel: tap45db3ebd-1d: entered promiscuous mode
Jan 20 10:17:11 np0005588920 NetworkManager[49076]: <info>  [1768922231.9328] manager: (tap45db3ebd-1d): new Tun device (/org/freedesktop/NetworkManager/Devices/410)
Jan 20 10:17:11 np0005588920 ovn_controller[133971]: 2026-01-20T15:17:11Z|00870|binding|INFO|Claiming lport 45db3ebd-1dc8-4c76-999b-2f2a1172317c for this chassis.
Jan 20 10:17:11 np0005588920 ovn_controller[133971]: 2026-01-20T15:17:11Z|00871|binding|INFO|45db3ebd-1dc8-4c76-999b-2f2a1172317c: Claiming fa:16:3e:0f:5a:02 10.100.0.11
Jan 20 10:17:11 np0005588920 nova_compute[226886]: 2026-01-20 15:17:11.935 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:11 np0005588920 nova_compute[226886]: 2026-01-20 15:17:11.953 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:11 np0005588920 ovn_controller[133971]: 2026-01-20T15:17:11Z|00872|binding|INFO|Setting lport 45db3ebd-1dc8-4c76-999b-2f2a1172317c ovn-installed in OVS
Jan 20 10:17:11 np0005588920 ovn_controller[133971]: 2026-01-20T15:17:11Z|00873|binding|INFO|Setting lport 45db3ebd-1dc8-4c76-999b-2f2a1172317c up in Southbound
Jan 20 10:17:11 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:11.955 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0f:5a:02 10.100.0.11'], port_security=['fa:16:3e:0f:5a:02 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '13d313ff-27f8-40d0-96d4-5ddb1605cad9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fff727019f86407498e83d7948d54962', 'neutron:revision_number': '2', 'neutron:security_group_ids': '278e6fc3-62b7-45c0-b1a3-c75cbe3171fc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=87d69a20-7690-494a-ac16-7c600840561a, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=8, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=45db3ebd-1dc8-4c76-999b-2f2a1172317c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:17:11 np0005588920 nova_compute[226886]: 2026-01-20 15:17:11.956 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:11 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:11.957 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 45db3ebd-1dc8-4c76-999b-2f2a1172317c in datapath c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab bound to our chassis#033[00m
Jan 20 10:17:11 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:11.959 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab#033[00m
Jan 20 10:17:11 np0005588920 systemd-machined[196121]: New machine qemu-91-instance-000000c0.
Jan 20 10:17:11 np0005588920 systemd-udevd[297053]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:17:11 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:11.975 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[67eb63a2-1e88-4f67-b980-fd3e59b95009]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:11 np0005588920 NetworkManager[49076]: <info>  [1768922231.9804] device (tap45db3ebd-1d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:17:11 np0005588920 NetworkManager[49076]: <info>  [1768922231.9815] device (tap45db3ebd-1d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:17:11 np0005588920 systemd[1]: Started Virtual Machine qemu-91-instance-000000c0.
Jan 20 10:17:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:12.003 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[5a494661-5903-46b5-910b-768a4a78da04]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:12.006 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[dfbc62f8-1973-4566-b16e-0e685a5dd11a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:12.034 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[c88c2bbe-51b4-4e9a-a4e0-405b3246d557]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:12.052 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a62a5881-2bc8-4ea7-8fdc-5a82acda8960]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc1f4a971-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fc:30:f0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 11, 'rx_bytes': 784, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 11, 'rx_bytes': 784, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 271], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 706770, 'reachable_time': 35529, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 297065, 'error': None, 'target': 'ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:12.070 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5a55d8bb-9f7b-4774-999b-fd7caf87f21a]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapc1f4a971-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 706780, 'tstamp': 706780}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 297067, 'error': None, 'target': 'ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapc1f4a971-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 706783, 'tstamp': 706783}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 297067, 'error': None, 'target': 'ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:12.072 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc1f4a971-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:17:12 np0005588920 nova_compute[226886]: 2026-01-20 15:17:12.073 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:12 np0005588920 nova_compute[226886]: 2026-01-20 15:17:12.074 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:12.074 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc1f4a971-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:17:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:12.075 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:17:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:12.075 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc1f4a971-00, col_values=(('external_ids', {'iface-id': 'b20b0e27-0b08-4316-b6df-6784416f44c0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:17:12 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:12.075 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:17:12 np0005588920 nova_compute[226886]: 2026-01-20 15:17:12.187 226890 DEBUG nova.compute.manager [req-55593d6e-30d0-475a-be9d-8a27e3ce679a req-e9671fe0-e5b3-4719-a2d5-70a202f0d29e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Received event network-vif-plugged-45db3ebd-1dc8-4c76-999b-2f2a1172317c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:17:12 np0005588920 nova_compute[226886]: 2026-01-20 15:17:12.188 226890 DEBUG oslo_concurrency.lockutils [req-55593d6e-30d0-475a-be9d-8a27e3ce679a req-e9671fe0-e5b3-4719-a2d5-70a202f0d29e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "13d313ff-27f8-40d0-96d4-5ddb1605cad9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:17:12 np0005588920 nova_compute[226886]: 2026-01-20 15:17:12.188 226890 DEBUG oslo_concurrency.lockutils [req-55593d6e-30d0-475a-be9d-8a27e3ce679a req-e9671fe0-e5b3-4719-a2d5-70a202f0d29e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "13d313ff-27f8-40d0-96d4-5ddb1605cad9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:17:12 np0005588920 nova_compute[226886]: 2026-01-20 15:17:12.188 226890 DEBUG oslo_concurrency.lockutils [req-55593d6e-30d0-475a-be9d-8a27e3ce679a req-e9671fe0-e5b3-4719-a2d5-70a202f0d29e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "13d313ff-27f8-40d0-96d4-5ddb1605cad9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:17:12 np0005588920 nova_compute[226886]: 2026-01-20 15:17:12.189 226890 DEBUG nova.compute.manager [req-55593d6e-30d0-475a-be9d-8a27e3ce679a req-e9671fe0-e5b3-4719-a2d5-70a202f0d29e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Processing event network-vif-plugged-45db3ebd-1dc8-4c76-999b-2f2a1172317c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 10:17:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:12.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:12.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:12 np0005588920 nova_compute[226886]: 2026-01-20 15:17:12.801 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:12 np0005588920 nova_compute[226886]: 2026-01-20 15:17:12.926 226890 DEBUG nova.compute.manager [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:17:12 np0005588920 nova_compute[226886]: 2026-01-20 15:17:12.927 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768922232.9259615, 13d313ff-27f8-40d0-96d4-5ddb1605cad9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:17:12 np0005588920 nova_compute[226886]: 2026-01-20 15:17:12.927 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] VM Started (Lifecycle Event)#033[00m
Jan 20 10:17:12 np0005588920 nova_compute[226886]: 2026-01-20 15:17:12.930 226890 DEBUG nova.virt.libvirt.driver [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:17:12 np0005588920 nova_compute[226886]: 2026-01-20 15:17:12.933 226890 INFO nova.virt.libvirt.driver [-] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Instance spawned successfully.#033[00m
Jan 20 10:17:12 np0005588920 nova_compute[226886]: 2026-01-20 15:17:12.934 226890 DEBUG nova.virt.libvirt.driver [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 10:17:12 np0005588920 nova_compute[226886]: 2026-01-20 15:17:12.957 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:17:12 np0005588920 nova_compute[226886]: 2026-01-20 15:17:12.963 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:17:12 np0005588920 nova_compute[226886]: 2026-01-20 15:17:12.966 226890 DEBUG nova.virt.libvirt.driver [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:17:12 np0005588920 nova_compute[226886]: 2026-01-20 15:17:12.967 226890 DEBUG nova.virt.libvirt.driver [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:17:12 np0005588920 nova_compute[226886]: 2026-01-20 15:17:12.967 226890 DEBUG nova.virt.libvirt.driver [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:17:12 np0005588920 nova_compute[226886]: 2026-01-20 15:17:12.967 226890 DEBUG nova.virt.libvirt.driver [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:17:12 np0005588920 nova_compute[226886]: 2026-01-20 15:17:12.968 226890 DEBUG nova.virt.libvirt.driver [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:17:12 np0005588920 nova_compute[226886]: 2026-01-20 15:17:12.968 226890 DEBUG nova.virt.libvirt.driver [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:17:13 np0005588920 nova_compute[226886]: 2026-01-20 15:17:13.007 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:17:13 np0005588920 nova_compute[226886]: 2026-01-20 15:17:13.007 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768922232.926815, 13d313ff-27f8-40d0-96d4-5ddb1605cad9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:17:13 np0005588920 nova_compute[226886]: 2026-01-20 15:17:13.007 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:17:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e409 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:17:13 np0005588920 nova_compute[226886]: 2026-01-20 15:17:13.037 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:17:13 np0005588920 nova_compute[226886]: 2026-01-20 15:17:13.040 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768922232.9296536, 13d313ff-27f8-40d0-96d4-5ddb1605cad9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:17:13 np0005588920 nova_compute[226886]: 2026-01-20 15:17:13.040 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:17:13 np0005588920 nova_compute[226886]: 2026-01-20 15:17:13.064 226890 INFO nova.compute.manager [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Took 5.70 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 10:17:13 np0005588920 nova_compute[226886]: 2026-01-20 15:17:13.065 226890 DEBUG nova.compute.manager [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:17:13 np0005588920 nova_compute[226886]: 2026-01-20 15:17:13.066 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:17:13 np0005588920 nova_compute[226886]: 2026-01-20 15:17:13.072 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:17:13 np0005588920 nova_compute[226886]: 2026-01-20 15:17:13.106 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:17:13 np0005588920 nova_compute[226886]: 2026-01-20 15:17:13.131 226890 INFO nova.compute.manager [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Took 7.96 seconds to build instance.#033[00m
Jan 20 10:17:13 np0005588920 nova_compute[226886]: 2026-01-20 15:17:13.144 226890 DEBUG oslo_concurrency.lockutils [None req-de440e9b-7d2e-4c6a-937c-9ebf140259a6 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "13d313ff-27f8-40d0-96d4-5ddb1605cad9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.045s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:17:13 np0005588920 nova_compute[226886]: 2026-01-20 15:17:13.721 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:17:14 np0005588920 nova_compute[226886]: 2026-01-20 15:17:14.325 226890 DEBUG nova.compute.manager [req-661c5a87-68cf-4a6b-ada9-eecf34cd38ad req-2d79c9a1-3449-46ac-8856-b029534bed8f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Received event network-vif-plugged-45db3ebd-1dc8-4c76-999b-2f2a1172317c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:17:14 np0005588920 nova_compute[226886]: 2026-01-20 15:17:14.325 226890 DEBUG oslo_concurrency.lockutils [req-661c5a87-68cf-4a6b-ada9-eecf34cd38ad req-2d79c9a1-3449-46ac-8856-b029534bed8f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "13d313ff-27f8-40d0-96d4-5ddb1605cad9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:17:14 np0005588920 nova_compute[226886]: 2026-01-20 15:17:14.326 226890 DEBUG oslo_concurrency.lockutils [req-661c5a87-68cf-4a6b-ada9-eecf34cd38ad req-2d79c9a1-3449-46ac-8856-b029534bed8f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "13d313ff-27f8-40d0-96d4-5ddb1605cad9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:17:14 np0005588920 nova_compute[226886]: 2026-01-20 15:17:14.326 226890 DEBUG oslo_concurrency.lockutils [req-661c5a87-68cf-4a6b-ada9-eecf34cd38ad req-2d79c9a1-3449-46ac-8856-b029534bed8f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "13d313ff-27f8-40d0-96d4-5ddb1605cad9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:17:14 np0005588920 nova_compute[226886]: 2026-01-20 15:17:14.326 226890 DEBUG nova.compute.manager [req-661c5a87-68cf-4a6b-ada9-eecf34cd38ad req-2d79c9a1-3449-46ac-8856-b029534bed8f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] No waiting events found dispatching network-vif-plugged-45db3ebd-1dc8-4c76-999b-2f2a1172317c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:17:14 np0005588920 nova_compute[226886]: 2026-01-20 15:17:14.326 226890 WARNING nova.compute.manager [req-661c5a87-68cf-4a6b-ada9-eecf34cd38ad req-2d79c9a1-3449-46ac-8856-b029534bed8f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Received unexpected event network-vif-plugged-45db3ebd-1dc8-4c76-999b-2f2a1172317c for instance with vm_state active and task_state None.#033[00m
Jan 20 10:17:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:17:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:14.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:17:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:14.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:15 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e410 e410: 3 total, 3 up, 3 in
Jan 20 10:17:15 np0005588920 nova_compute[226886]: 2026-01-20 15:17:15.883 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:16.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:17:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:16.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:17:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:16.478 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:17:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:16.478 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:17:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:16.479 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:17:16 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e411 e411: 3 total, 3 up, 3 in
Jan 20 10:17:17 np0005588920 nova_compute[226886]: 2026-01-20 15:17:17.803 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:18 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e411 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:17:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:18.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:18.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:19 np0005588920 nova_compute[226886]: 2026-01-20 15:17:19.994 226890 DEBUG oslo_concurrency.lockutils [None req-7bc78dea-1a2a-4916-b086-15204906edce e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquiring lock "13d313ff-27f8-40d0-96d4-5ddb1605cad9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:17:19 np0005588920 nova_compute[226886]: 2026-01-20 15:17:19.994 226890 DEBUG oslo_concurrency.lockutils [None req-7bc78dea-1a2a-4916-b086-15204906edce e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "13d313ff-27f8-40d0-96d4-5ddb1605cad9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:17:19 np0005588920 nova_compute[226886]: 2026-01-20 15:17:19.994 226890 DEBUG oslo_concurrency.lockutils [None req-7bc78dea-1a2a-4916-b086-15204906edce e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquiring lock "13d313ff-27f8-40d0-96d4-5ddb1605cad9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:17:19 np0005588920 nova_compute[226886]: 2026-01-20 15:17:19.994 226890 DEBUG oslo_concurrency.lockutils [None req-7bc78dea-1a2a-4916-b086-15204906edce e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "13d313ff-27f8-40d0-96d4-5ddb1605cad9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:17:19 np0005588920 nova_compute[226886]: 2026-01-20 15:17:19.994 226890 DEBUG oslo_concurrency.lockutils [None req-7bc78dea-1a2a-4916-b086-15204906edce e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "13d313ff-27f8-40d0-96d4-5ddb1605cad9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:17:19 np0005588920 nova_compute[226886]: 2026-01-20 15:17:19.995 226890 INFO nova.compute.manager [None req-7bc78dea-1a2a-4916-b086-15204906edce e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Terminating instance#033[00m
Jan 20 10:17:19 np0005588920 nova_compute[226886]: 2026-01-20 15:17:19.996 226890 DEBUG nova.compute.manager [None req-7bc78dea-1a2a-4916-b086-15204906edce e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:17:20 np0005588920 kernel: tap45db3ebd-1d (unregistering): left promiscuous mode
Jan 20 10:17:20 np0005588920 NetworkManager[49076]: <info>  [1768922240.0404] device (tap45db3ebd-1d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:17:20 np0005588920 nova_compute[226886]: 2026-01-20 15:17:20.051 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:20 np0005588920 ovn_controller[133971]: 2026-01-20T15:17:20Z|00874|binding|INFO|Releasing lport 45db3ebd-1dc8-4c76-999b-2f2a1172317c from this chassis (sb_readonly=0)
Jan 20 10:17:20 np0005588920 ovn_controller[133971]: 2026-01-20T15:17:20Z|00875|binding|INFO|Setting lport 45db3ebd-1dc8-4c76-999b-2f2a1172317c down in Southbound
Jan 20 10:17:20 np0005588920 ovn_controller[133971]: 2026-01-20T15:17:20Z|00876|binding|INFO|Removing iface tap45db3ebd-1d ovn-installed in OVS
Jan 20 10:17:20 np0005588920 nova_compute[226886]: 2026-01-20 15:17:20.053 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:20.060 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0f:5a:02 10.100.0.11'], port_security=['fa:16:3e:0f:5a:02 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '13d313ff-27f8-40d0-96d4-5ddb1605cad9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fff727019f86407498e83d7948d54962', 'neutron:revision_number': '4', 'neutron:security_group_ids': '278e6fc3-62b7-45c0-b1a3-c75cbe3171fc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=87d69a20-7690-494a-ac16-7c600840561a, chassis=[], tunnel_key=8, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=45db3ebd-1dc8-4c76-999b-2f2a1172317c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:17:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:20.061 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 45db3ebd-1dc8-4c76-999b-2f2a1172317c in datapath c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab unbound from our chassis#033[00m
Jan 20 10:17:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:20.062 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab#033[00m
Jan 20 10:17:20 np0005588920 nova_compute[226886]: 2026-01-20 15:17:20.067 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:20.078 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5d82c9a8-1ffb-42e3-866e-797d0671a562]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:20 np0005588920 systemd[1]: machine-qemu\x2d91\x2dinstance\x2d000000c0.scope: Deactivated successfully.
Jan 20 10:17:20 np0005588920 systemd[1]: machine-qemu\x2d91\x2dinstance\x2d000000c0.scope: Consumed 8.160s CPU time.
Jan 20 10:17:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:20.103 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[2ea4c8d5-1933-4dd5-b489-ac5b32d9fa0d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:20.105 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[764970fa-b761-4c51-aac5-d713a3926f97]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:20 np0005588920 systemd-machined[196121]: Machine qemu-91-instance-000000c0 terminated.
Jan 20 10:17:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:20.130 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[03323f60-2855-405e-91db-d0748fd8977e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:20.145 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[564e29e5-08ae-4682-b6d7-4b9b7663f5e1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc1f4a971-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fc:30:f0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 13, 'rx_bytes': 784, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 13, 'rx_bytes': 784, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 271], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 706770, 'reachable_time': 35529, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 297123, 'error': None, 'target': 'ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:20.159 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e264b514-fcda-45a5-aa24-6b1fbf088c33]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapc1f4a971-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 706780, 'tstamp': 706780}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 297124, 'error': None, 'target': 'ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapc1f4a971-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 706783, 'tstamp': 706783}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 297124, 'error': None, 'target': 'ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:20.160 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc1f4a971-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:17:20 np0005588920 nova_compute[226886]: 2026-01-20 15:17:20.162 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:20 np0005588920 nova_compute[226886]: 2026-01-20 15:17:20.165 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:20.165 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc1f4a971-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:17:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:20.166 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:17:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:20.166 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc1f4a971-00, col_values=(('external_ids', {'iface-id': 'b20b0e27-0b08-4316-b6df-6784416f44c0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:17:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:20.166 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:17:20 np0005588920 kernel: tap45db3ebd-1d: entered promiscuous mode
Jan 20 10:17:20 np0005588920 ovn_controller[133971]: 2026-01-20T15:17:20Z|00877|binding|INFO|Claiming lport 45db3ebd-1dc8-4c76-999b-2f2a1172317c for this chassis.
Jan 20 10:17:20 np0005588920 ovn_controller[133971]: 2026-01-20T15:17:20Z|00878|binding|INFO|45db3ebd-1dc8-4c76-999b-2f2a1172317c: Claiming fa:16:3e:0f:5a:02 10.100.0.11
Jan 20 10:17:20 np0005588920 nova_compute[226886]: 2026-01-20 15:17:20.213 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:20 np0005588920 kernel: tap45db3ebd-1d (unregistering): left promiscuous mode
Jan 20 10:17:20 np0005588920 NetworkManager[49076]: <info>  [1768922240.2183] manager: (tap45db3ebd-1d): new Tun device (/org/freedesktop/NetworkManager/Devices/411)
Jan 20 10:17:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:20.221 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0f:5a:02 10.100.0.11'], port_security=['fa:16:3e:0f:5a:02 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '13d313ff-27f8-40d0-96d4-5ddb1605cad9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fff727019f86407498e83d7948d54962', 'neutron:revision_number': '4', 'neutron:security_group_ids': '278e6fc3-62b7-45c0-b1a3-c75cbe3171fc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=87d69a20-7690-494a-ac16-7c600840561a, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=8, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=45db3ebd-1dc8-4c76-999b-2f2a1172317c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:17:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:20.222 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 45db3ebd-1dc8-4c76-999b-2f2a1172317c in datapath c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab bound to our chassis#033[00m
Jan 20 10:17:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:20.223 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab#033[00m
Jan 20 10:17:20 np0005588920 nova_compute[226886]: 2026-01-20 15:17:20.239 226890 INFO nova.virt.libvirt.driver [-] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Instance destroyed successfully.#033[00m
Jan 20 10:17:20 np0005588920 nova_compute[226886]: 2026-01-20 15:17:20.240 226890 DEBUG nova.objects.instance [None req-7bc78dea-1a2a-4916-b086-15204906edce e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lazy-loading 'resources' on Instance uuid 13d313ff-27f8-40d0-96d4-5ddb1605cad9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:17:20 np0005588920 ovn_controller[133971]: 2026-01-20T15:17:20Z|00879|binding|INFO|Setting lport 45db3ebd-1dc8-4c76-999b-2f2a1172317c ovn-installed in OVS
Jan 20 10:17:20 np0005588920 nova_compute[226886]: 2026-01-20 15:17:20.241 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:20 np0005588920 ovn_controller[133971]: 2026-01-20T15:17:20Z|00880|binding|INFO|Setting lport 45db3ebd-1dc8-4c76-999b-2f2a1172317c up in Southbound
Jan 20 10:17:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:20.241 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[dca074cb-9c1c-4004-b4f3-22c6585d2695]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:20 np0005588920 ovn_controller[133971]: 2026-01-20T15:17:20Z|00881|binding|INFO|Releasing lport 45db3ebd-1dc8-4c76-999b-2f2a1172317c from this chassis (sb_readonly=1)
Jan 20 10:17:20 np0005588920 nova_compute[226886]: 2026-01-20 15:17:20.242 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:20 np0005588920 ovn_controller[133971]: 2026-01-20T15:17:20Z|00882|if_status|INFO|Dropped 5 log messages in last 663 seconds (most recently, 663 seconds ago) due to excessive rate
Jan 20 10:17:20 np0005588920 ovn_controller[133971]: 2026-01-20T15:17:20Z|00883|if_status|INFO|Not setting lport 45db3ebd-1dc8-4c76-999b-2f2a1172317c down as sb is readonly
Jan 20 10:17:20 np0005588920 ovn_controller[133971]: 2026-01-20T15:17:20Z|00884|binding|INFO|Removing iface tap45db3ebd-1d ovn-installed in OVS
Jan 20 10:17:20 np0005588920 nova_compute[226886]: 2026-01-20 15:17:20.246 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:20 np0005588920 ovn_controller[133971]: 2026-01-20T15:17:20Z|00885|binding|INFO|Releasing lport 45db3ebd-1dc8-4c76-999b-2f2a1172317c from this chassis (sb_readonly=0)
Jan 20 10:17:20 np0005588920 ovn_controller[133971]: 2026-01-20T15:17:20Z|00886|binding|INFO|Setting lport 45db3ebd-1dc8-4c76-999b-2f2a1172317c down in Southbound
Jan 20 10:17:20 np0005588920 nova_compute[226886]: 2026-01-20 15:17:20.257 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:20.263 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0f:5a:02 10.100.0.11'], port_security=['fa:16:3e:0f:5a:02 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '13d313ff-27f8-40d0-96d4-5ddb1605cad9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fff727019f86407498e83d7948d54962', 'neutron:revision_number': '4', 'neutron:security_group_ids': '278e6fc3-62b7-45c0-b1a3-c75cbe3171fc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=87d69a20-7690-494a-ac16-7c600840561a, chassis=[], tunnel_key=8, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=45db3ebd-1dc8-4c76-999b-2f2a1172317c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:17:20 np0005588920 nova_compute[226886]: 2026-01-20 15:17:20.271 226890 DEBUG nova.virt.libvirt.vif [None req-7bc78dea-1a2a-4916-b086-15204906edce e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:17:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeMultiAttachTest-server-594388070',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumemultiattachtest-server-594388070',id=192,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:17:13Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fff727019f86407498e83d7948d54962',ramdisk_id='',reservation_id='r-1hpn9plq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-AttachVolumeMultiAttachTest-418194625',owner_user_name='tempest-Attach
VolumeMultiAttachTest-418194625-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:17:13Z,user_data=None,user_id='e9cc4ce3e069479ba9c789b378a68a1d',uuid=13d313ff-27f8-40d0-96d4-5ddb1605cad9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "45db3ebd-1dc8-4c76-999b-2f2a1172317c", "address": "fa:16:3e:0f:5a:02", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45db3ebd-1d", "ovs_interfaceid": "45db3ebd-1dc8-4c76-999b-2f2a1172317c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:17:20 np0005588920 nova_compute[226886]: 2026-01-20 15:17:20.271 226890 DEBUG nova.network.os_vif_util [None req-7bc78dea-1a2a-4916-b086-15204906edce e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Converting VIF {"id": "45db3ebd-1dc8-4c76-999b-2f2a1172317c", "address": "fa:16:3e:0f:5a:02", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45db3ebd-1d", "ovs_interfaceid": "45db3ebd-1dc8-4c76-999b-2f2a1172317c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:17:20 np0005588920 nova_compute[226886]: 2026-01-20 15:17:20.272 226890 DEBUG nova.network.os_vif_util [None req-7bc78dea-1a2a-4916-b086-15204906edce e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0f:5a:02,bridge_name='br-int',has_traffic_filtering=True,id=45db3ebd-1dc8-4c76-999b-2f2a1172317c,network=Network(c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap45db3ebd-1d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:17:20 np0005588920 nova_compute[226886]: 2026-01-20 15:17:20.272 226890 DEBUG os_vif [None req-7bc78dea-1a2a-4916-b086-15204906edce e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0f:5a:02,bridge_name='br-int',has_traffic_filtering=True,id=45db3ebd-1dc8-4c76-999b-2f2a1172317c,network=Network(c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap45db3ebd-1d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:17:20 np0005588920 nova_compute[226886]: 2026-01-20 15:17:20.274 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:20.274 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[627ea031-59f6-4a85-ae5c-e8df0929f650]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:20 np0005588920 nova_compute[226886]: 2026-01-20 15:17:20.274 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap45db3ebd-1d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:17:20 np0005588920 nova_compute[226886]: 2026-01-20 15:17:20.276 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:20 np0005588920 nova_compute[226886]: 2026-01-20 15:17:20.278 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:20.278 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[b38737d9-c9bb-4966-880d-8daaa4142623]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:20 np0005588920 nova_compute[226886]: 2026-01-20 15:17:20.280 226890 INFO os_vif [None req-7bc78dea-1a2a-4916-b086-15204906edce e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0f:5a:02,bridge_name='br-int',has_traffic_filtering=True,id=45db3ebd-1dc8-4c76-999b-2f2a1172317c,network=Network(c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap45db3ebd-1d')#033[00m
Jan 20 10:17:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:20.302 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[5f72a83b-fad8-4692-aff6-102a4054c0ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:20.320 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[63023389-5a99-4887-9c5a-eff572b58515]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc1f4a971-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fc:30:f0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 15, 'rx_bytes': 784, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 15, 'rx_bytes': 784, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 271], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 706770, 'reachable_time': 35529, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 297135, 'error': None, 'target': 'ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:20.337 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[9b7a9943-64b2-4bf3-ae51-1010b40fcf16]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapc1f4a971-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 706780, 'tstamp': 706780}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 297136, 'error': None, 'target': 'ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapc1f4a971-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 706783, 'tstamp': 706783}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 297136, 'error': None, 'target': 'ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:20.339 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc1f4a971-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:17:20 np0005588920 nova_compute[226886]: 2026-01-20 15:17:20.340 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:20 np0005588920 nova_compute[226886]: 2026-01-20 15:17:20.342 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:20.342 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc1f4a971-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:17:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:20.342 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:17:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:20.343 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc1f4a971-00, col_values=(('external_ids', {'iface-id': 'b20b0e27-0b08-4316-b6df-6784416f44c0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:17:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:20.343 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:17:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:20.344 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 45db3ebd-1dc8-4c76-999b-2f2a1172317c in datapath c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab unbound from our chassis#033[00m
Jan 20 10:17:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:20.345 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab#033[00m
Jan 20 10:17:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:20.359 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[8164e1d4-f91f-4dd7-954b-b4cbf41a479c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:20.386 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[8c00e006-5542-4b61-95d8-cd6ae265e818]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:20.389 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[1d277e5b-273e-43f1-b842-726fa3ddd365]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:17:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:20.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:17:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:20.417 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[573ee044-0510-4320-8d9d-c5c818c7de40]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:20 np0005588920 nova_compute[226886]: 2026-01-20 15:17:20.421 226890 DEBUG nova.compute.manager [req-2d83353b-8043-4fac-bc38-0549a225f767 req-304550b7-bdab-4feb-ad87-ea342e1bf143 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Received event network-vif-unplugged-45db3ebd-1dc8-4c76-999b-2f2a1172317c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:17:20 np0005588920 nova_compute[226886]: 2026-01-20 15:17:20.421 226890 DEBUG oslo_concurrency.lockutils [req-2d83353b-8043-4fac-bc38-0549a225f767 req-304550b7-bdab-4feb-ad87-ea342e1bf143 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "13d313ff-27f8-40d0-96d4-5ddb1605cad9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:17:20 np0005588920 nova_compute[226886]: 2026-01-20 15:17:20.422 226890 DEBUG oslo_concurrency.lockutils [req-2d83353b-8043-4fac-bc38-0549a225f767 req-304550b7-bdab-4feb-ad87-ea342e1bf143 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "13d313ff-27f8-40d0-96d4-5ddb1605cad9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:17:20 np0005588920 nova_compute[226886]: 2026-01-20 15:17:20.422 226890 DEBUG oslo_concurrency.lockutils [req-2d83353b-8043-4fac-bc38-0549a225f767 req-304550b7-bdab-4feb-ad87-ea342e1bf143 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "13d313ff-27f8-40d0-96d4-5ddb1605cad9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:17:20 np0005588920 nova_compute[226886]: 2026-01-20 15:17:20.422 226890 DEBUG nova.compute.manager [req-2d83353b-8043-4fac-bc38-0549a225f767 req-304550b7-bdab-4feb-ad87-ea342e1bf143 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] No waiting events found dispatching network-vif-unplugged-45db3ebd-1dc8-4c76-999b-2f2a1172317c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:17:20 np0005588920 nova_compute[226886]: 2026-01-20 15:17:20.422 226890 DEBUG nova.compute.manager [req-2d83353b-8043-4fac-bc38-0549a225f767 req-304550b7-bdab-4feb-ad87-ea342e1bf143 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Received event network-vif-unplugged-45db3ebd-1dc8-4c76-999b-2f2a1172317c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:17:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:20.433 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e9c056d4-2e25-4075-b0c3-93ec35815e4a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc1f4a971-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fc:30:f0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 17, 'rx_bytes': 784, 'tx_bytes': 858, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 17, 'rx_bytes': 784, 'tx_bytes': 858, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 271], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 706770, 'reachable_time': 35529, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 297142, 'error': None, 'target': 'ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:20.450 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f201aa28-ea3e-47bb-93d9-029f4b51c1b0]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapc1f4a971-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 706780, 'tstamp': 706780}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 297143, 'error': None, 'target': 'ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapc1f4a971-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 706783, 'tstamp': 706783}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 297143, 'error': None, 'target': 'ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:20.452 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc1f4a971-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:17:20 np0005588920 nova_compute[226886]: 2026-01-20 15:17:20.453 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:20 np0005588920 nova_compute[226886]: 2026-01-20 15:17:20.455 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:20.455 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc1f4a971-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:17:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:20.455 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:17:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:20.455 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc1f4a971-00, col_values=(('external_ids', {'iface-id': 'b20b0e27-0b08-4316-b6df-6784416f44c0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:17:20 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:20.456 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:17:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:20.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:21 np0005588920 nova_compute[226886]: 2026-01-20 15:17:21.140 226890 INFO nova.virt.libvirt.driver [None req-7bc78dea-1a2a-4916-b086-15204906edce e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Deleting instance files /var/lib/nova/instances/13d313ff-27f8-40d0-96d4-5ddb1605cad9_del#033[00m
Jan 20 10:17:21 np0005588920 nova_compute[226886]: 2026-01-20 15:17:21.140 226890 INFO nova.virt.libvirt.driver [None req-7bc78dea-1a2a-4916-b086-15204906edce e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Deletion of /var/lib/nova/instances/13d313ff-27f8-40d0-96d4-5ddb1605cad9_del complete#033[00m
Jan 20 10:17:21 np0005588920 nova_compute[226886]: 2026-01-20 15:17:21.199 226890 INFO nova.compute.manager [None req-7bc78dea-1a2a-4916-b086-15204906edce e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Took 1.20 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:17:21 np0005588920 nova_compute[226886]: 2026-01-20 15:17:21.199 226890 DEBUG oslo.service.loopingcall [None req-7bc78dea-1a2a-4916-b086-15204906edce e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:17:21 np0005588920 nova_compute[226886]: 2026-01-20 15:17:21.200 226890 DEBUG nova.compute.manager [-] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:17:21 np0005588920 nova_compute[226886]: 2026-01-20 15:17:21.200 226890 DEBUG nova.network.neutron [-] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:17:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:22.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:22.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:22 np0005588920 nova_compute[226886]: 2026-01-20 15:17:22.518 226890 DEBUG nova.compute.manager [req-dc8a943e-8ef5-4bed-97fc-124864c962d3 req-13734345-f4cd-4771-ac4d-874fe5515082 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Received event network-vif-plugged-45db3ebd-1dc8-4c76-999b-2f2a1172317c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:17:22 np0005588920 nova_compute[226886]: 2026-01-20 15:17:22.519 226890 DEBUG oslo_concurrency.lockutils [req-dc8a943e-8ef5-4bed-97fc-124864c962d3 req-13734345-f4cd-4771-ac4d-874fe5515082 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "13d313ff-27f8-40d0-96d4-5ddb1605cad9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:17:22 np0005588920 nova_compute[226886]: 2026-01-20 15:17:22.519 226890 DEBUG oslo_concurrency.lockutils [req-dc8a943e-8ef5-4bed-97fc-124864c962d3 req-13734345-f4cd-4771-ac4d-874fe5515082 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "13d313ff-27f8-40d0-96d4-5ddb1605cad9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:17:22 np0005588920 nova_compute[226886]: 2026-01-20 15:17:22.519 226890 DEBUG oslo_concurrency.lockutils [req-dc8a943e-8ef5-4bed-97fc-124864c962d3 req-13734345-f4cd-4771-ac4d-874fe5515082 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "13d313ff-27f8-40d0-96d4-5ddb1605cad9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:17:22 np0005588920 nova_compute[226886]: 2026-01-20 15:17:22.519 226890 DEBUG nova.compute.manager [req-dc8a943e-8ef5-4bed-97fc-124864c962d3 req-13734345-f4cd-4771-ac4d-874fe5515082 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] No waiting events found dispatching network-vif-plugged-45db3ebd-1dc8-4c76-999b-2f2a1172317c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:17:22 np0005588920 nova_compute[226886]: 2026-01-20 15:17:22.520 226890 WARNING nova.compute.manager [req-dc8a943e-8ef5-4bed-97fc-124864c962d3 req-13734345-f4cd-4771-ac4d-874fe5515082 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Received unexpected event network-vif-plugged-45db3ebd-1dc8-4c76-999b-2f2a1172317c for instance with vm_state active and task_state deleting.#033[00m
Jan 20 10:17:22 np0005588920 nova_compute[226886]: 2026-01-20 15:17:22.520 226890 DEBUG nova.compute.manager [req-dc8a943e-8ef5-4bed-97fc-124864c962d3 req-13734345-f4cd-4771-ac4d-874fe5515082 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Received event network-vif-plugged-45db3ebd-1dc8-4c76-999b-2f2a1172317c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:17:22 np0005588920 nova_compute[226886]: 2026-01-20 15:17:22.520 226890 DEBUG oslo_concurrency.lockutils [req-dc8a943e-8ef5-4bed-97fc-124864c962d3 req-13734345-f4cd-4771-ac4d-874fe5515082 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "13d313ff-27f8-40d0-96d4-5ddb1605cad9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:17:22 np0005588920 nova_compute[226886]: 2026-01-20 15:17:22.520 226890 DEBUG oslo_concurrency.lockutils [req-dc8a943e-8ef5-4bed-97fc-124864c962d3 req-13734345-f4cd-4771-ac4d-874fe5515082 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "13d313ff-27f8-40d0-96d4-5ddb1605cad9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:17:22 np0005588920 nova_compute[226886]: 2026-01-20 15:17:22.520 226890 DEBUG oslo_concurrency.lockutils [req-dc8a943e-8ef5-4bed-97fc-124864c962d3 req-13734345-f4cd-4771-ac4d-874fe5515082 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "13d313ff-27f8-40d0-96d4-5ddb1605cad9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:17:22 np0005588920 nova_compute[226886]: 2026-01-20 15:17:22.521 226890 DEBUG nova.compute.manager [req-dc8a943e-8ef5-4bed-97fc-124864c962d3 req-13734345-f4cd-4771-ac4d-874fe5515082 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] No waiting events found dispatching network-vif-plugged-45db3ebd-1dc8-4c76-999b-2f2a1172317c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:17:22 np0005588920 nova_compute[226886]: 2026-01-20 15:17:22.521 226890 WARNING nova.compute.manager [req-dc8a943e-8ef5-4bed-97fc-124864c962d3 req-13734345-f4cd-4771-ac4d-874fe5515082 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Received unexpected event network-vif-plugged-45db3ebd-1dc8-4c76-999b-2f2a1172317c for instance with vm_state active and task_state deleting.#033[00m
Jan 20 10:17:22 np0005588920 nova_compute[226886]: 2026-01-20 15:17:22.521 226890 DEBUG nova.compute.manager [req-dc8a943e-8ef5-4bed-97fc-124864c962d3 req-13734345-f4cd-4771-ac4d-874fe5515082 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Received event network-vif-plugged-45db3ebd-1dc8-4c76-999b-2f2a1172317c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:17:22 np0005588920 nova_compute[226886]: 2026-01-20 15:17:22.521 226890 DEBUG oslo_concurrency.lockutils [req-dc8a943e-8ef5-4bed-97fc-124864c962d3 req-13734345-f4cd-4771-ac4d-874fe5515082 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "13d313ff-27f8-40d0-96d4-5ddb1605cad9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:17:22 np0005588920 nova_compute[226886]: 2026-01-20 15:17:22.522 226890 DEBUG oslo_concurrency.lockutils [req-dc8a943e-8ef5-4bed-97fc-124864c962d3 req-13734345-f4cd-4771-ac4d-874fe5515082 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "13d313ff-27f8-40d0-96d4-5ddb1605cad9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:17:22 np0005588920 nova_compute[226886]: 2026-01-20 15:17:22.522 226890 DEBUG oslo_concurrency.lockutils [req-dc8a943e-8ef5-4bed-97fc-124864c962d3 req-13734345-f4cd-4771-ac4d-874fe5515082 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "13d313ff-27f8-40d0-96d4-5ddb1605cad9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:17:22 np0005588920 nova_compute[226886]: 2026-01-20 15:17:22.522 226890 DEBUG nova.compute.manager [req-dc8a943e-8ef5-4bed-97fc-124864c962d3 req-13734345-f4cd-4771-ac4d-874fe5515082 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] No waiting events found dispatching network-vif-plugged-45db3ebd-1dc8-4c76-999b-2f2a1172317c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:17:22 np0005588920 nova_compute[226886]: 2026-01-20 15:17:22.522 226890 WARNING nova.compute.manager [req-dc8a943e-8ef5-4bed-97fc-124864c962d3 req-13734345-f4cd-4771-ac4d-874fe5515082 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Received unexpected event network-vif-plugged-45db3ebd-1dc8-4c76-999b-2f2a1172317c for instance with vm_state active and task_state deleting.#033[00m
Jan 20 10:17:22 np0005588920 nova_compute[226886]: 2026-01-20 15:17:22.522 226890 DEBUG nova.compute.manager [req-dc8a943e-8ef5-4bed-97fc-124864c962d3 req-13734345-f4cd-4771-ac4d-874fe5515082 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Received event network-vif-unplugged-45db3ebd-1dc8-4c76-999b-2f2a1172317c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:17:22 np0005588920 nova_compute[226886]: 2026-01-20 15:17:22.523 226890 DEBUG oslo_concurrency.lockutils [req-dc8a943e-8ef5-4bed-97fc-124864c962d3 req-13734345-f4cd-4771-ac4d-874fe5515082 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "13d313ff-27f8-40d0-96d4-5ddb1605cad9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:17:22 np0005588920 nova_compute[226886]: 2026-01-20 15:17:22.523 226890 DEBUG oslo_concurrency.lockutils [req-dc8a943e-8ef5-4bed-97fc-124864c962d3 req-13734345-f4cd-4771-ac4d-874fe5515082 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "13d313ff-27f8-40d0-96d4-5ddb1605cad9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:17:22 np0005588920 nova_compute[226886]: 2026-01-20 15:17:22.523 226890 DEBUG oslo_concurrency.lockutils [req-dc8a943e-8ef5-4bed-97fc-124864c962d3 req-13734345-f4cd-4771-ac4d-874fe5515082 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "13d313ff-27f8-40d0-96d4-5ddb1605cad9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:17:22 np0005588920 nova_compute[226886]: 2026-01-20 15:17:22.523 226890 DEBUG nova.compute.manager [req-dc8a943e-8ef5-4bed-97fc-124864c962d3 req-13734345-f4cd-4771-ac4d-874fe5515082 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] No waiting events found dispatching network-vif-unplugged-45db3ebd-1dc8-4c76-999b-2f2a1172317c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:17:22 np0005588920 nova_compute[226886]: 2026-01-20 15:17:22.523 226890 DEBUG nova.compute.manager [req-dc8a943e-8ef5-4bed-97fc-124864c962d3 req-13734345-f4cd-4771-ac4d-874fe5515082 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Received event network-vif-unplugged-45db3ebd-1dc8-4c76-999b-2f2a1172317c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:17:22 np0005588920 nova_compute[226886]: 2026-01-20 15:17:22.524 226890 DEBUG nova.compute.manager [req-dc8a943e-8ef5-4bed-97fc-124864c962d3 req-13734345-f4cd-4771-ac4d-874fe5515082 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Received event network-vif-plugged-45db3ebd-1dc8-4c76-999b-2f2a1172317c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:17:22 np0005588920 nova_compute[226886]: 2026-01-20 15:17:22.524 226890 DEBUG oslo_concurrency.lockutils [req-dc8a943e-8ef5-4bed-97fc-124864c962d3 req-13734345-f4cd-4771-ac4d-874fe5515082 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "13d313ff-27f8-40d0-96d4-5ddb1605cad9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:17:22 np0005588920 nova_compute[226886]: 2026-01-20 15:17:22.524 226890 DEBUG oslo_concurrency.lockutils [req-dc8a943e-8ef5-4bed-97fc-124864c962d3 req-13734345-f4cd-4771-ac4d-874fe5515082 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "13d313ff-27f8-40d0-96d4-5ddb1605cad9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:17:22 np0005588920 nova_compute[226886]: 2026-01-20 15:17:22.524 226890 DEBUG oslo_concurrency.lockutils [req-dc8a943e-8ef5-4bed-97fc-124864c962d3 req-13734345-f4cd-4771-ac4d-874fe5515082 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "13d313ff-27f8-40d0-96d4-5ddb1605cad9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:17:22 np0005588920 nova_compute[226886]: 2026-01-20 15:17:22.524 226890 DEBUG nova.compute.manager [req-dc8a943e-8ef5-4bed-97fc-124864c962d3 req-13734345-f4cd-4771-ac4d-874fe5515082 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] No waiting events found dispatching network-vif-plugged-45db3ebd-1dc8-4c76-999b-2f2a1172317c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:17:22 np0005588920 nova_compute[226886]: 2026-01-20 15:17:22.524 226890 WARNING nova.compute.manager [req-dc8a943e-8ef5-4bed-97fc-124864c962d3 req-13734345-f4cd-4771-ac4d-874fe5515082 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Received unexpected event network-vif-plugged-45db3ebd-1dc8-4c76-999b-2f2a1172317c for instance with vm_state active and task_state deleting.#033[00m
Jan 20 10:17:22 np0005588920 nova_compute[226886]: 2026-01-20 15:17:22.785 226890 DEBUG nova.network.neutron [-] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:17:22 np0005588920 nova_compute[226886]: 2026-01-20 15:17:22.805 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:22 np0005588920 nova_compute[226886]: 2026-01-20 15:17:22.808 226890 INFO nova.compute.manager [-] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Took 1.61 seconds to deallocate network for instance.#033[00m
Jan 20 10:17:22 np0005588920 nova_compute[226886]: 2026-01-20 15:17:22.907 226890 DEBUG nova.compute.manager [req-b87e7ceb-f742-4d6d-8595-ff82195b4d5b req-b1fd6896-39b2-4abb-b746-c9306850fbeb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Received event network-vif-deleted-45db3ebd-1dc8-4c76-999b-2f2a1172317c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:17:23 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e411 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:17:23 np0005588920 nova_compute[226886]: 2026-01-20 15:17:23.244 226890 INFO nova.compute.manager [None req-7bc78dea-1a2a-4916-b086-15204906edce e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Took 0.44 seconds to detach 1 volumes for instance.#033[00m
Jan 20 10:17:23 np0005588920 nova_compute[226886]: 2026-01-20 15:17:23.282 226890 DEBUG oslo_concurrency.lockutils [None req-7bc78dea-1a2a-4916-b086-15204906edce e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:17:23 np0005588920 nova_compute[226886]: 2026-01-20 15:17:23.283 226890 DEBUG oslo_concurrency.lockutils [None req-7bc78dea-1a2a-4916-b086-15204906edce e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:17:23 np0005588920 nova_compute[226886]: 2026-01-20 15:17:23.368 226890 DEBUG oslo_concurrency.processutils [None req-7bc78dea-1a2a-4916-b086-15204906edce e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:17:23 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:17:23 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3656409958' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:17:23 np0005588920 nova_compute[226886]: 2026-01-20 15:17:23.810 226890 DEBUG oslo_concurrency.processutils [None req-7bc78dea-1a2a-4916-b086-15204906edce e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:17:23 np0005588920 nova_compute[226886]: 2026-01-20 15:17:23.816 226890 DEBUG nova.compute.provider_tree [None req-7bc78dea-1a2a-4916-b086-15204906edce e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:17:23 np0005588920 nova_compute[226886]: 2026-01-20 15:17:23.835 226890 DEBUG nova.scheduler.client.report [None req-7bc78dea-1a2a-4916-b086-15204906edce e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:17:23 np0005588920 nova_compute[226886]: 2026-01-20 15:17:23.860 226890 DEBUG oslo_concurrency.lockutils [None req-7bc78dea-1a2a-4916-b086-15204906edce e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.577s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:17:23 np0005588920 nova_compute[226886]: 2026-01-20 15:17:23.900 226890 INFO nova.scheduler.client.report [None req-7bc78dea-1a2a-4916-b086-15204906edce e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Deleted allocations for instance 13d313ff-27f8-40d0-96d4-5ddb1605cad9#033[00m
Jan 20 10:17:23 np0005588920 nova_compute[226886]: 2026-01-20 15:17:23.978 226890 DEBUG oslo_concurrency.lockutils [None req-7bc78dea-1a2a-4916-b086-15204906edce e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "13d313ff-27f8-40d0-96d4-5ddb1605cad9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.984s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:17:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:24.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:17:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:24.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:17:25 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e412 e412: 3 total, 3 up, 3 in
Jan 20 10:17:25 np0005588920 nova_compute[226886]: 2026-01-20 15:17:25.278 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:26 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:17:26 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2594788816' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:17:26 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:17:26 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2594788816' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:17:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:26.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:26.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:27 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e413 e413: 3 total, 3 up, 3 in
Jan 20 10:17:27 np0005588920 nova_compute[226886]: 2026-01-20 15:17:27.806 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:28 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:17:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:17:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:28.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:17:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:17:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:28.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:17:29 np0005588920 podman[297186]: 2026-01-20 15:17:29.035892399 +0000 UTC m=+0.117132902 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, 
container_name=ovn_controller, managed_by=edpm_ansible)
Jan 20 10:17:30 np0005588920 nova_compute[226886]: 2026-01-20 15:17:30.281 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:30.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:30.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:30 np0005588920 nova_compute[226886]: 2026-01-20 15:17:30.993 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:30.993 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=63, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=62) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:17:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:30.994 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:17:31 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e414 e414: 3 total, 3 up, 3 in
Jan 20 10:17:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:31.996 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '63'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:17:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:17:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:32.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:17:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:32.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:32 np0005588920 nova_compute[226886]: 2026-01-20 15:17:32.824 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:33 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e414 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:17:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:34.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:34.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:35 np0005588920 nova_compute[226886]: 2026-01-20 15:17:35.238 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768922240.2373803, 13d313ff-27f8-40d0-96d4-5ddb1605cad9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:17:35 np0005588920 nova_compute[226886]: 2026-01-20 15:17:35.239 226890 INFO nova.compute.manager [-] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:17:35 np0005588920 nova_compute[226886]: 2026-01-20 15:17:35.275 226890 DEBUG nova.compute.manager [None req-f91cd023-d059-4b48-814a-1752bd5e1c0d - - - - - -] [instance: 13d313ff-27f8-40d0-96d4-5ddb1605cad9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:17:35 np0005588920 nova_compute[226886]: 2026-01-20 15:17:35.284 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:36.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:36.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:36 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 e415: 3 total, 3 up, 3 in
Jan 20 10:17:37 np0005588920 nova_compute[226886]: 2026-01-20 15:17:37.241 226890 DEBUG oslo_concurrency.lockutils [None req-84ba954c-08fd-42bf-8574-e1dc3b6b62ff e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquiring lock "54a13784-2a60-4b16-8208-d9b9d0e3033e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:17:37 np0005588920 nova_compute[226886]: 2026-01-20 15:17:37.241 226890 DEBUG oslo_concurrency.lockutils [None req-84ba954c-08fd-42bf-8574-e1dc3b6b62ff e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "54a13784-2a60-4b16-8208-d9b9d0e3033e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:17:37 np0005588920 nova_compute[226886]: 2026-01-20 15:17:37.241 226890 DEBUG oslo_concurrency.lockutils [None req-84ba954c-08fd-42bf-8574-e1dc3b6b62ff e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquiring lock "54a13784-2a60-4b16-8208-d9b9d0e3033e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:17:37 np0005588920 nova_compute[226886]: 2026-01-20 15:17:37.241 226890 DEBUG oslo_concurrency.lockutils [None req-84ba954c-08fd-42bf-8574-e1dc3b6b62ff e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "54a13784-2a60-4b16-8208-d9b9d0e3033e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:17:37 np0005588920 nova_compute[226886]: 2026-01-20 15:17:37.242 226890 DEBUG oslo_concurrency.lockutils [None req-84ba954c-08fd-42bf-8574-e1dc3b6b62ff e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "54a13784-2a60-4b16-8208-d9b9d0e3033e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:17:37 np0005588920 nova_compute[226886]: 2026-01-20 15:17:37.243 226890 INFO nova.compute.manager [None req-84ba954c-08fd-42bf-8574-e1dc3b6b62ff e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Terminating instance#033[00m
Jan 20 10:17:37 np0005588920 nova_compute[226886]: 2026-01-20 15:17:37.243 226890 DEBUG nova.compute.manager [None req-84ba954c-08fd-42bf-8574-e1dc3b6b62ff e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:17:37 np0005588920 kernel: tapb8bc07e2-c8 (unregistering): left promiscuous mode
Jan 20 10:17:37 np0005588920 NetworkManager[49076]: <info>  [1768922257.3086] device (tapb8bc07e2-c8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:17:37 np0005588920 ovn_controller[133971]: 2026-01-20T15:17:37Z|00887|binding|INFO|Releasing lport b8bc07e2-c826-408c-a1a5-f45ad76b5888 from this chassis (sb_readonly=0)
Jan 20 10:17:37 np0005588920 ovn_controller[133971]: 2026-01-20T15:17:37Z|00888|binding|INFO|Setting lport b8bc07e2-c826-408c-a1a5-f45ad76b5888 down in Southbound
Jan 20 10:17:37 np0005588920 ovn_controller[133971]: 2026-01-20T15:17:37Z|00889|binding|INFO|Removing iface tapb8bc07e2-c8 ovn-installed in OVS
Jan 20 10:17:37 np0005588920 nova_compute[226886]: 2026-01-20 15:17:37.316 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:37 np0005588920 nova_compute[226886]: 2026-01-20 15:17:37.318 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:37 np0005588920 nova_compute[226886]: 2026-01-20 15:17:37.332 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:37.335 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ce:5d:51 10.100.0.6'], port_security=['fa:16:3e:ce:5d:51 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '54a13784-2a60-4b16-8208-d9b9d0e3033e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fff727019f86407498e83d7948d54962', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5ace6a2f-56c6-4679-bb81-70ccb27ab312', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=87d69a20-7690-494a-ac16-7c600840561a, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=b8bc07e2-c826-408c-a1a5-f45ad76b5888) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:17:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:37.336 144128 INFO neutron.agent.ovn.metadata.agent [-] Port b8bc07e2-c826-408c-a1a5-f45ad76b5888 in datapath c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab unbound from our chassis#033[00m
Jan 20 10:17:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:37.337 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab#033[00m
Jan 20 10:17:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:37.356 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ddb2a3b4-30a3-4e0b-8ff1-fa8b5665588f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:37 np0005588920 systemd[1]: machine-qemu\x2d89\x2dinstance\x2d000000ba.scope: Deactivated successfully.
Jan 20 10:17:37 np0005588920 systemd[1]: machine-qemu\x2d89\x2dinstance\x2d000000ba.scope: Consumed 21.078s CPU time.
Jan 20 10:17:37 np0005588920 systemd-machined[196121]: Machine qemu-89-instance-000000ba terminated.
Jan 20 10:17:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:37.389 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[95c6b316-c4ed-4040-9c16-a2e6c3e375da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:37.392 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[af08a647-e165-4702-8997-28bd88333d07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:37.420 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[a66dfd64-7f49-4493-9b8c-25b326c805a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:37.436 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2d402e10-1332-4b98-8d58-346d6ba053fc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc1f4a971-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fc:30:f0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 19, 'rx_bytes': 784, 'tx_bytes': 942, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 19, 'rx_bytes': 784, 'tx_bytes': 942, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 271], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 706770, 'reachable_time': 35529, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 297224, 'error': None, 'target': 'ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:37.452 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[aa485f40-3882-4046-a2fe-226c3258938e]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapc1f4a971-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 706780, 'tstamp': 706780}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 297225, 'error': None, 'target': 'ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapc1f4a971-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 706783, 'tstamp': 706783}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 297225, 'error': None, 'target': 'ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:37.454 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc1f4a971-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:17:37 np0005588920 nova_compute[226886]: 2026-01-20 15:17:37.455 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:37 np0005588920 nova_compute[226886]: 2026-01-20 15:17:37.460 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:37.461 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc1f4a971-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:17:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:37.462 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:17:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:37.462 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc1f4a971-00, col_values=(('external_ids', {'iface-id': 'b20b0e27-0b08-4316-b6df-6784416f44c0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:17:37 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:37.463 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:17:37 np0005588920 nova_compute[226886]: 2026-01-20 15:17:37.483 226890 INFO nova.virt.libvirt.driver [-] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Instance destroyed successfully.#033[00m
Jan 20 10:17:37 np0005588920 nova_compute[226886]: 2026-01-20 15:17:37.484 226890 DEBUG nova.objects.instance [None req-84ba954c-08fd-42bf-8574-e1dc3b6b62ff e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lazy-loading 'resources' on Instance uuid 54a13784-2a60-4b16-8208-d9b9d0e3033e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:17:37 np0005588920 nova_compute[226886]: 2026-01-20 15:17:37.505 226890 DEBUG nova.virt.libvirt.vif [None req-84ba954c-08fd-42bf-8574-e1dc3b6b62ff e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:13:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='multiattach-server-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='multiattach-server-1',id=186,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL5L2o6o5dLcQyaIfhCZ5CKxQlecqNGmP68oHIQEsVoKIC2qfrMKjObT9GdMU8oznX9LVUwIWCShhlEJu9ZqPiutEL2afEJ1hQQamjERNcx9wWS2NfOgykA4yugQphfOtA==',key_name='tempest-keypair-1568469072',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:14:06Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fff727019f86407498e83d7948d54962',ramdisk_id='',reservation_id='r-otgoutp6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35'
,image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeMultiAttachTest-418194625',owner_user_name='tempest-AttachVolumeMultiAttachTest-418194625-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:14:06Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e9cc4ce3e069479ba9c789b378a68a1d',uuid=54a13784-2a60-4b16-8208-d9b9d0e3033e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b8bc07e2-c826-408c-a1a5-f45ad76b5888", "address": "fa:16:3e:ce:5d:51", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb8bc07e2-c8", "ovs_interfaceid": "b8bc07e2-c826-408c-a1a5-f45ad76b5888", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:17:37 np0005588920 nova_compute[226886]: 2026-01-20 15:17:37.506 226890 DEBUG nova.network.os_vif_util [None req-84ba954c-08fd-42bf-8574-e1dc3b6b62ff e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Converting VIF {"id": "b8bc07e2-c826-408c-a1a5-f45ad76b5888", "address": "fa:16:3e:ce:5d:51", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb8bc07e2-c8", "ovs_interfaceid": "b8bc07e2-c826-408c-a1a5-f45ad76b5888", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:17:37 np0005588920 nova_compute[226886]: 2026-01-20 15:17:37.507 226890 DEBUG nova.network.os_vif_util [None req-84ba954c-08fd-42bf-8574-e1dc3b6b62ff e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ce:5d:51,bridge_name='br-int',has_traffic_filtering=True,id=b8bc07e2-c826-408c-a1a5-f45ad76b5888,network=Network(c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb8bc07e2-c8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:17:37 np0005588920 nova_compute[226886]: 2026-01-20 15:17:37.507 226890 DEBUG os_vif [None req-84ba954c-08fd-42bf-8574-e1dc3b6b62ff e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ce:5d:51,bridge_name='br-int',has_traffic_filtering=True,id=b8bc07e2-c826-408c-a1a5-f45ad76b5888,network=Network(c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb8bc07e2-c8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:17:37 np0005588920 nova_compute[226886]: 2026-01-20 15:17:37.509 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:37 np0005588920 nova_compute[226886]: 2026-01-20 15:17:37.509 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb8bc07e2-c8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:17:37 np0005588920 nova_compute[226886]: 2026-01-20 15:17:37.559 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:37 np0005588920 nova_compute[226886]: 2026-01-20 15:17:37.560 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:37 np0005588920 nova_compute[226886]: 2026-01-20 15:17:37.562 226890 INFO os_vif [None req-84ba954c-08fd-42bf-8574-e1dc3b6b62ff e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ce:5d:51,bridge_name='br-int',has_traffic_filtering=True,id=b8bc07e2-c826-408c-a1a5-f45ad76b5888,network=Network(c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb8bc07e2-c8')#033[00m
Jan 20 10:17:37 np0005588920 nova_compute[226886]: 2026-01-20 15:17:37.827 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:38 np0005588920 nova_compute[226886]: 2026-01-20 15:17:38.005 226890 DEBUG nova.compute.manager [req-745455fe-b291-4cf9-8427-6aa2ddcc17eb req-bbacfecc-5d21-429a-acb4-4f930b5e45ed 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Received event network-vif-unplugged-b8bc07e2-c826-408c-a1a5-f45ad76b5888 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:17:38 np0005588920 nova_compute[226886]: 2026-01-20 15:17:38.005 226890 DEBUG oslo_concurrency.lockutils [req-745455fe-b291-4cf9-8427-6aa2ddcc17eb req-bbacfecc-5d21-429a-acb4-4f930b5e45ed 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "54a13784-2a60-4b16-8208-d9b9d0e3033e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:17:38 np0005588920 nova_compute[226886]: 2026-01-20 15:17:38.006 226890 DEBUG oslo_concurrency.lockutils [req-745455fe-b291-4cf9-8427-6aa2ddcc17eb req-bbacfecc-5d21-429a-acb4-4f930b5e45ed 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "54a13784-2a60-4b16-8208-d9b9d0e3033e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:17:38 np0005588920 nova_compute[226886]: 2026-01-20 15:17:38.006 226890 DEBUG oslo_concurrency.lockutils [req-745455fe-b291-4cf9-8427-6aa2ddcc17eb req-bbacfecc-5d21-429a-acb4-4f930b5e45ed 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "54a13784-2a60-4b16-8208-d9b9d0e3033e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:17:38 np0005588920 nova_compute[226886]: 2026-01-20 15:17:38.006 226890 DEBUG nova.compute.manager [req-745455fe-b291-4cf9-8427-6aa2ddcc17eb req-bbacfecc-5d21-429a-acb4-4f930b5e45ed 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] No waiting events found dispatching network-vif-unplugged-b8bc07e2-c826-408c-a1a5-f45ad76b5888 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:17:38 np0005588920 nova_compute[226886]: 2026-01-20 15:17:38.007 226890 DEBUG nova.compute.manager [req-745455fe-b291-4cf9-8427-6aa2ddcc17eb req-bbacfecc-5d21-429a-acb4-4f930b5e45ed 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Received event network-vif-unplugged-b8bc07e2-c826-408c-a1a5-f45ad76b5888 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:17:38 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:17:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:17:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:38.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:17:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:17:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:38.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:17:38 np0005588920 nova_compute[226886]: 2026-01-20 15:17:38.635 226890 INFO nova.virt.libvirt.driver [None req-84ba954c-08fd-42bf-8574-e1dc3b6b62ff e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Deleting instance files /var/lib/nova/instances/54a13784-2a60-4b16-8208-d9b9d0e3033e_del#033[00m
Jan 20 10:17:38 np0005588920 nova_compute[226886]: 2026-01-20 15:17:38.635 226890 INFO nova.virt.libvirt.driver [None req-84ba954c-08fd-42bf-8574-e1dc3b6b62ff e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Deletion of /var/lib/nova/instances/54a13784-2a60-4b16-8208-d9b9d0e3033e_del complete#033[00m
Jan 20 10:17:38 np0005588920 nova_compute[226886]: 2026-01-20 15:17:38.715 226890 INFO nova.compute.manager [None req-84ba954c-08fd-42bf-8574-e1dc3b6b62ff e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Took 1.47 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:17:38 np0005588920 nova_compute[226886]: 2026-01-20 15:17:38.715 226890 DEBUG oslo.service.loopingcall [None req-84ba954c-08fd-42bf-8574-e1dc3b6b62ff e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:17:38 np0005588920 nova_compute[226886]: 2026-01-20 15:17:38.716 226890 DEBUG nova.compute.manager [-] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:17:38 np0005588920 nova_compute[226886]: 2026-01-20 15:17:38.716 226890 DEBUG nova.network.neutron [-] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:17:39 np0005588920 nova_compute[226886]: 2026-01-20 15:17:39.450 226890 DEBUG nova.network.neutron [-] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:17:39 np0005588920 nova_compute[226886]: 2026-01-20 15:17:39.468 226890 INFO nova.compute.manager [-] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Took 0.75 seconds to deallocate network for instance.#033[00m
Jan 20 10:17:39 np0005588920 nova_compute[226886]: 2026-01-20 15:17:39.514 226890 DEBUG oslo_concurrency.lockutils [None req-84ba954c-08fd-42bf-8574-e1dc3b6b62ff e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:17:39 np0005588920 nova_compute[226886]: 2026-01-20 15:17:39.515 226890 DEBUG oslo_concurrency.lockutils [None req-84ba954c-08fd-42bf-8574-e1dc3b6b62ff e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:17:39 np0005588920 nova_compute[226886]: 2026-01-20 15:17:39.520 226890 DEBUG nova.compute.manager [req-63262883-a912-4ad6-a4fb-fb97faac990d req-3433d5b3-de19-4f58-8aa0-13e626ab2dee 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Received event network-vif-deleted-b8bc07e2-c826-408c-a1a5-f45ad76b5888 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:17:39 np0005588920 nova_compute[226886]: 2026-01-20 15:17:39.577 226890 DEBUG oslo_concurrency.processutils [None req-84ba954c-08fd-42bf-8574-e1dc3b6b62ff e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:17:39 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:17:39 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1237641148' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:17:39 np0005588920 nova_compute[226886]: 2026-01-20 15:17:39.988 226890 DEBUG oslo_concurrency.processutils [None req-84ba954c-08fd-42bf-8574-e1dc3b6b62ff e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.411s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:17:39 np0005588920 nova_compute[226886]: 2026-01-20 15:17:39.993 226890 DEBUG nova.compute.provider_tree [None req-84ba954c-08fd-42bf-8574-e1dc3b6b62ff e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:17:40 np0005588920 nova_compute[226886]: 2026-01-20 15:17:40.009 226890 DEBUG nova.scheduler.client.report [None req-84ba954c-08fd-42bf-8574-e1dc3b6b62ff e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:17:40 np0005588920 nova_compute[226886]: 2026-01-20 15:17:40.032 226890 DEBUG oslo_concurrency.lockutils [None req-84ba954c-08fd-42bf-8574-e1dc3b6b62ff e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.517s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:17:40 np0005588920 nova_compute[226886]: 2026-01-20 15:17:40.070 226890 INFO nova.scheduler.client.report [None req-84ba954c-08fd-42bf-8574-e1dc3b6b62ff e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Deleted allocations for instance 54a13784-2a60-4b16-8208-d9b9d0e3033e#033[00m
Jan 20 10:17:40 np0005588920 nova_compute[226886]: 2026-01-20 15:17:40.088 226890 DEBUG nova.compute.manager [req-0980804b-9ce3-4094-b138-f39a41868cc1 req-bab2e77a-1f3b-481a-9756-e26b4749aadd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Received event network-vif-plugged-b8bc07e2-c826-408c-a1a5-f45ad76b5888 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:17:40 np0005588920 nova_compute[226886]: 2026-01-20 15:17:40.088 226890 DEBUG oslo_concurrency.lockutils [req-0980804b-9ce3-4094-b138-f39a41868cc1 req-bab2e77a-1f3b-481a-9756-e26b4749aadd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "54a13784-2a60-4b16-8208-d9b9d0e3033e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:17:40 np0005588920 nova_compute[226886]: 2026-01-20 15:17:40.089 226890 DEBUG oslo_concurrency.lockutils [req-0980804b-9ce3-4094-b138-f39a41868cc1 req-bab2e77a-1f3b-481a-9756-e26b4749aadd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "54a13784-2a60-4b16-8208-d9b9d0e3033e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:17:40 np0005588920 nova_compute[226886]: 2026-01-20 15:17:40.089 226890 DEBUG oslo_concurrency.lockutils [req-0980804b-9ce3-4094-b138-f39a41868cc1 req-bab2e77a-1f3b-481a-9756-e26b4749aadd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "54a13784-2a60-4b16-8208-d9b9d0e3033e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:17:40 np0005588920 nova_compute[226886]: 2026-01-20 15:17:40.089 226890 DEBUG nova.compute.manager [req-0980804b-9ce3-4094-b138-f39a41868cc1 req-bab2e77a-1f3b-481a-9756-e26b4749aadd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] No waiting events found dispatching network-vif-plugged-b8bc07e2-c826-408c-a1a5-f45ad76b5888 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:17:40 np0005588920 nova_compute[226886]: 2026-01-20 15:17:40.089 226890 WARNING nova.compute.manager [req-0980804b-9ce3-4094-b138-f39a41868cc1 req-bab2e77a-1f3b-481a-9756-e26b4749aadd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Received unexpected event network-vif-plugged-b8bc07e2-c826-408c-a1a5-f45ad76b5888 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 10:17:40 np0005588920 nova_compute[226886]: 2026-01-20 15:17:40.165 226890 DEBUG oslo_concurrency.lockutils [None req-84ba954c-08fd-42bf-8574-e1dc3b6b62ff e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "54a13784-2a60-4b16-8208-d9b9d0e3033e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.924s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:17:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:17:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:40.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:17:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:40.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:40 np0005588920 nova_compute[226886]: 2026-01-20 15:17:40.709 226890 DEBUG oslo_concurrency.lockutils [None req-345e8b76-cb1a-4c04-85a0-bb883d2826bb e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquiring lock "810e72a9-536d-4214-956b-9d5216cce8ff" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:17:40 np0005588920 nova_compute[226886]: 2026-01-20 15:17:40.709 226890 DEBUG oslo_concurrency.lockutils [None req-345e8b76-cb1a-4c04-85a0-bb883d2826bb e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "810e72a9-536d-4214-956b-9d5216cce8ff" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:17:40 np0005588920 nova_compute[226886]: 2026-01-20 15:17:40.710 226890 DEBUG oslo_concurrency.lockutils [None req-345e8b76-cb1a-4c04-85a0-bb883d2826bb e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquiring lock "810e72a9-536d-4214-956b-9d5216cce8ff-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:17:40 np0005588920 nova_compute[226886]: 2026-01-20 15:17:40.710 226890 DEBUG oslo_concurrency.lockutils [None req-345e8b76-cb1a-4c04-85a0-bb883d2826bb e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "810e72a9-536d-4214-956b-9d5216cce8ff-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:17:40 np0005588920 nova_compute[226886]: 2026-01-20 15:17:40.710 226890 DEBUG oslo_concurrency.lockutils [None req-345e8b76-cb1a-4c04-85a0-bb883d2826bb e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "810e72a9-536d-4214-956b-9d5216cce8ff-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:17:40 np0005588920 nova_compute[226886]: 2026-01-20 15:17:40.711 226890 INFO nova.compute.manager [None req-345e8b76-cb1a-4c04-85a0-bb883d2826bb e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Terminating instance#033[00m
Jan 20 10:17:40 np0005588920 nova_compute[226886]: 2026-01-20 15:17:40.713 226890 DEBUG nova.compute.manager [None req-345e8b76-cb1a-4c04-85a0-bb883d2826bb e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:17:40 np0005588920 kernel: tap0c07d11d-c0 (unregistering): left promiscuous mode
Jan 20 10:17:40 np0005588920 NetworkManager[49076]: <info>  [1768922260.9153] device (tap0c07d11d-c0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:17:40 np0005588920 ovn_controller[133971]: 2026-01-20T15:17:40Z|00890|binding|INFO|Releasing lport 0c07d11d-c06a-497a-9dbd-975adce07e97 from this chassis (sb_readonly=0)
Jan 20 10:17:40 np0005588920 ovn_controller[133971]: 2026-01-20T15:17:40Z|00891|binding|INFO|Setting lport 0c07d11d-c06a-497a-9dbd-975adce07e97 down in Southbound
Jan 20 10:17:40 np0005588920 ovn_controller[133971]: 2026-01-20T15:17:40Z|00892|binding|INFO|Removing iface tap0c07d11d-c0 ovn-installed in OVS
Jan 20 10:17:40 np0005588920 nova_compute[226886]: 2026-01-20 15:17:40.924 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:40 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:40.941 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cc:d4:e6 10.100.0.14'], port_security=['fa:16:3e:cc:d4:e6 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '810e72a9-536d-4214-956b-9d5216cce8ff', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fff727019f86407498e83d7948d54962', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5ace6a2f-56c6-4679-bb81-70ccb27ab312', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=87d69a20-7690-494a-ac16-7c600840561a, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=0c07d11d-c06a-497a-9dbd-975adce07e97) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:17:40 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:40.943 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 0c07d11d-c06a-497a-9dbd-975adce07e97 in datapath c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab unbound from our chassis#033[00m
Jan 20 10:17:40 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:40.944 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:17:40 np0005588920 nova_compute[226886]: 2026-01-20 15:17:40.945 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:40 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:40.945 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f79d6726-9c10-4eb7-a4e4-ca35cff4fc16]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:40 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:40.946 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab namespace which is not needed anymore#033[00m
Jan 20 10:17:40 np0005588920 systemd[1]: machine-qemu\x2d88\x2dinstance\x2d000000b8.scope: Deactivated successfully.
Jan 20 10:17:40 np0005588920 systemd[1]: machine-qemu\x2d88\x2dinstance\x2d000000b8.scope: Consumed 22.483s CPU time.
Jan 20 10:17:41 np0005588920 systemd-machined[196121]: Machine qemu-88-instance-000000b8 terminated.
Jan 20 10:17:41 np0005588920 neutron-haproxy-ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab[294907]: [NOTICE]   (294912) : haproxy version is 2.8.14-c23fe91
Jan 20 10:17:41 np0005588920 neutron-haproxy-ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab[294907]: [NOTICE]   (294912) : path to executable is /usr/sbin/haproxy
Jan 20 10:17:41 np0005588920 neutron-haproxy-ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab[294907]: [WARNING]  (294912) : Exiting Master process...
Jan 20 10:17:41 np0005588920 neutron-haproxy-ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab[294907]: [ALERT]    (294912) : Current worker (294914) exited with code 143 (Terminated)
Jan 20 10:17:41 np0005588920 neutron-haproxy-ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab[294907]: [WARNING]  (294912) : All workers exited. Exiting... (0)
Jan 20 10:17:41 np0005588920 systemd[1]: libpod-83ef6973c2da9c866f75c083cdcdc4b45645ede9ee82bd9af95face55238cc58.scope: Deactivated successfully.
Jan 20 10:17:41 np0005588920 podman[297303]: 2026-01-20 15:17:41.081816871 +0000 UTC m=+0.052304880 container died 83ef6973c2da9c866f75c083cdcdc4b45645ede9ee82bd9af95face55238cc58 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 20 10:17:41 np0005588920 podman[297289]: 2026-01-20 15:17:41.091754721 +0000 UTC m=+0.069942757 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:17:41 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-83ef6973c2da9c866f75c083cdcdc4b45645ede9ee82bd9af95face55238cc58-userdata-shm.mount: Deactivated successfully.
Jan 20 10:17:41 np0005588920 systemd[1]: var-lib-containers-storage-overlay-45b4727d270238dc26ecc0e37388a7172e6bea5024b08b3011ccd3531d437d94-merged.mount: Deactivated successfully.
Jan 20 10:17:41 np0005588920 nova_compute[226886]: 2026-01-20 15:17:41.149 226890 INFO nova.virt.libvirt.driver [-] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Instance destroyed successfully.#033[00m
Jan 20 10:17:41 np0005588920 nova_compute[226886]: 2026-01-20 15:17:41.151 226890 DEBUG nova.objects.instance [None req-345e8b76-cb1a-4c04-85a0-bb883d2826bb e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lazy-loading 'resources' on Instance uuid 810e72a9-536d-4214-956b-9d5216cce8ff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:17:41 np0005588920 podman[297303]: 2026-01-20 15:17:41.153557488 +0000 UTC m=+0.124045497 container cleanup 83ef6973c2da9c866f75c083cdcdc4b45645ede9ee82bd9af95face55238cc58 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:17:41 np0005588920 systemd[1]: libpod-conmon-83ef6973c2da9c866f75c083cdcdc4b45645ede9ee82bd9af95face55238cc58.scope: Deactivated successfully.
Jan 20 10:17:41 np0005588920 nova_compute[226886]: 2026-01-20 15:17:41.165 226890 DEBUG nova.virt.libvirt.vif [None req-345e8b76-cb1a-4c04-85a0-bb883d2826bb e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:13:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='multiattach-server-0',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='multiattach-server-0',id=184,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL5L2o6o5dLcQyaIfhCZ5CKxQlecqNGmP68oHIQEsVoKIC2qfrMKjObT9GdMU8oznX9LVUwIWCShhlEJu9ZqPiutEL2afEJ1hQQamjERNcx9wWS2NfOgykA4yugQphfOtA==',key_name='tempest-keypair-1568469072',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:13:46Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fff727019f86407498e83d7948d54962',ramdisk_id='',reservation_id='r-fpkdlq3q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35'
,image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeMultiAttachTest-418194625',owner_user_name='tempest-AttachVolumeMultiAttachTest-418194625-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:13:46Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e9cc4ce3e069479ba9c789b378a68a1d',uuid=810e72a9-536d-4214-956b-9d5216cce8ff,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0c07d11d-c06a-497a-9dbd-975adce07e97", "address": "fa:16:3e:cc:d4:e6", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c07d11d-c0", "ovs_interfaceid": "0c07d11d-c06a-497a-9dbd-975adce07e97", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:17:41 np0005588920 nova_compute[226886]: 2026-01-20 15:17:41.166 226890 DEBUG nova.network.os_vif_util [None req-345e8b76-cb1a-4c04-85a0-bb883d2826bb e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Converting VIF {"id": "0c07d11d-c06a-497a-9dbd-975adce07e97", "address": "fa:16:3e:cc:d4:e6", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c07d11d-c0", "ovs_interfaceid": "0c07d11d-c06a-497a-9dbd-975adce07e97", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:17:41 np0005588920 nova_compute[226886]: 2026-01-20 15:17:41.166 226890 DEBUG nova.network.os_vif_util [None req-345e8b76-cb1a-4c04-85a0-bb883d2826bb e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:cc:d4:e6,bridge_name='br-int',has_traffic_filtering=True,id=0c07d11d-c06a-497a-9dbd-975adce07e97,network=Network(c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0c07d11d-c0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:17:41 np0005588920 nova_compute[226886]: 2026-01-20 15:17:41.167 226890 DEBUG os_vif [None req-345e8b76-cb1a-4c04-85a0-bb883d2826bb e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:cc:d4:e6,bridge_name='br-int',has_traffic_filtering=True,id=0c07d11d-c06a-497a-9dbd-975adce07e97,network=Network(c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0c07d11d-c0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:17:41 np0005588920 nova_compute[226886]: 2026-01-20 15:17:41.168 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:41 np0005588920 nova_compute[226886]: 2026-01-20 15:17:41.168 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0c07d11d-c0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:17:41 np0005588920 nova_compute[226886]: 2026-01-20 15:17:41.170 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:41 np0005588920 nova_compute[226886]: 2026-01-20 15:17:41.171 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:41 np0005588920 nova_compute[226886]: 2026-01-20 15:17:41.173 226890 INFO os_vif [None req-345e8b76-cb1a-4c04-85a0-bb883d2826bb e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:cc:d4:e6,bridge_name='br-int',has_traffic_filtering=True,id=0c07d11d-c06a-497a-9dbd-975adce07e97,network=Network(c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0c07d11d-c0')#033[00m
Jan 20 10:17:41 np0005588920 podman[297361]: 2026-01-20 15:17:41.22436106 +0000 UTC m=+0.045661872 container remove 83ef6973c2da9c866f75c083cdcdc4b45645ede9ee82bd9af95face55238cc58 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:17:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:41.231 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[272ba797-1d15-423b-be51-241f46627a86]: (4, ('Tue Jan 20 03:17:41 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab (83ef6973c2da9c866f75c083cdcdc4b45645ede9ee82bd9af95face55238cc58)\n83ef6973c2da9c866f75c083cdcdc4b45645ede9ee82bd9af95face55238cc58\nTue Jan 20 03:17:41 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab (83ef6973c2da9c866f75c083cdcdc4b45645ede9ee82bd9af95face55238cc58)\n83ef6973c2da9c866f75c083cdcdc4b45645ede9ee82bd9af95face55238cc58\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:41.232 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[7a27ee33-c6ab-43f1-af18-fe85b1bcc2c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:41.233 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc1f4a971-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:17:41 np0005588920 nova_compute[226886]: 2026-01-20 15:17:41.234 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:41 np0005588920 kernel: tapc1f4a971-00: left promiscuous mode
Jan 20 10:17:41 np0005588920 nova_compute[226886]: 2026-01-20 15:17:41.248 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:41.250 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[418fdd84-5e76-4fa0-bcc4-f92966d72eb0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:41.267 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[7703f7d2-c854-4af4-9454-47776ca01748]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:41.268 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[71c92e6c-cdf7-45f7-8352-d9fe8ac2ee6f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:41.282 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0bd2bf31-c944-4903-b4eb-c870c2dcc7e6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 706763, 'reachable_time': 40884, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 297394, 'error': None, 'target': 'ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:41 np0005588920 systemd[1]: run-netns-ovnmeta\x2dc1f4a971\x2d0bd7\x2d41ce\x2dbdf6\x2d5acb2b1b4bab.mount: Deactivated successfully.
Jan 20 10:17:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:41.284 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:17:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:17:41.285 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[8ed14b18-01a7-4c7a-91b2-6a198c2ae7ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:41 np0005588920 nova_compute[226886]: 2026-01-20 15:17:41.918 226890 INFO nova.virt.libvirt.driver [None req-345e8b76-cb1a-4c04-85a0-bb883d2826bb e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Deleting instance files /var/lib/nova/instances/810e72a9-536d-4214-956b-9d5216cce8ff_del#033[00m
Jan 20 10:17:41 np0005588920 nova_compute[226886]: 2026-01-20 15:17:41.919 226890 INFO nova.virt.libvirt.driver [None req-345e8b76-cb1a-4c04-85a0-bb883d2826bb e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Deletion of /var/lib/nova/instances/810e72a9-536d-4214-956b-9d5216cce8ff_del complete#033[00m
Jan 20 10:17:42 np0005588920 nova_compute[226886]: 2026-01-20 15:17:42.089 226890 INFO nova.compute.manager [None req-345e8b76-cb1a-4c04-85a0-bb883d2826bb e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Took 1.38 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:17:42 np0005588920 nova_compute[226886]: 2026-01-20 15:17:42.089 226890 DEBUG oslo.service.loopingcall [None req-345e8b76-cb1a-4c04-85a0-bb883d2826bb e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:17:42 np0005588920 nova_compute[226886]: 2026-01-20 15:17:42.090 226890 DEBUG nova.compute.manager [-] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:17:42 np0005588920 nova_compute[226886]: 2026-01-20 15:17:42.090 226890 DEBUG nova.network.neutron [-] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:17:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:42.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:17:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:42.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:17:42 np0005588920 nova_compute[226886]: 2026-01-20 15:17:42.828 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:43 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:17:43 np0005588920 nova_compute[226886]: 2026-01-20 15:17:43.703 226890 DEBUG nova.compute.manager [req-7dbfa8a3-0def-4487-bad3-9bff4558b83a req-e8dc969b-8253-4a69-b71a-8f884e092177 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Received event network-vif-unplugged-0c07d11d-c06a-497a-9dbd-975adce07e97 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:17:43 np0005588920 nova_compute[226886]: 2026-01-20 15:17:43.703 226890 DEBUG oslo_concurrency.lockutils [req-7dbfa8a3-0def-4487-bad3-9bff4558b83a req-e8dc969b-8253-4a69-b71a-8f884e092177 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "810e72a9-536d-4214-956b-9d5216cce8ff-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:17:43 np0005588920 nova_compute[226886]: 2026-01-20 15:17:43.704 226890 DEBUG oslo_concurrency.lockutils [req-7dbfa8a3-0def-4487-bad3-9bff4558b83a req-e8dc969b-8253-4a69-b71a-8f884e092177 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "810e72a9-536d-4214-956b-9d5216cce8ff-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:17:43 np0005588920 nova_compute[226886]: 2026-01-20 15:17:43.704 226890 DEBUG oslo_concurrency.lockutils [req-7dbfa8a3-0def-4487-bad3-9bff4558b83a req-e8dc969b-8253-4a69-b71a-8f884e092177 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "810e72a9-536d-4214-956b-9d5216cce8ff-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:17:43 np0005588920 nova_compute[226886]: 2026-01-20 15:17:43.704 226890 DEBUG nova.compute.manager [req-7dbfa8a3-0def-4487-bad3-9bff4558b83a req-e8dc969b-8253-4a69-b71a-8f884e092177 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] No waiting events found dispatching network-vif-unplugged-0c07d11d-c06a-497a-9dbd-975adce07e97 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:17:43 np0005588920 nova_compute[226886]: 2026-01-20 15:17:43.704 226890 DEBUG nova.compute.manager [req-7dbfa8a3-0def-4487-bad3-9bff4558b83a req-e8dc969b-8253-4a69-b71a-8f884e092177 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Received event network-vif-unplugged-0c07d11d-c06a-497a-9dbd-975adce07e97 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:17:43 np0005588920 nova_compute[226886]: 2026-01-20 15:17:43.922 226890 DEBUG nova.network.neutron [-] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:17:43 np0005588920 nova_compute[226886]: 2026-01-20 15:17:43.940 226890 INFO nova.compute.manager [-] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Took 1.85 seconds to deallocate network for instance.#033[00m
Jan 20 10:17:44 np0005588920 nova_compute[226886]: 2026-01-20 15:17:44.144 226890 DEBUG oslo_concurrency.lockutils [None req-345e8b76-cb1a-4c04-85a0-bb883d2826bb e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:17:44 np0005588920 nova_compute[226886]: 2026-01-20 15:17:44.145 226890 DEBUG oslo_concurrency.lockutils [None req-345e8b76-cb1a-4c04-85a0-bb883d2826bb e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:17:44 np0005588920 nova_compute[226886]: 2026-01-20 15:17:44.210 226890 DEBUG oslo_concurrency.processutils [None req-345e8b76-cb1a-4c04-85a0-bb883d2826bb e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:17:44 np0005588920 nova_compute[226886]: 2026-01-20 15:17:44.367 226890 DEBUG nova.compute.manager [req-94b68991-8842-432f-be34-2fb9fd26e767 req-1a875c6f-bc06-4662-aa0a-6387039fc2bd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Received event network-vif-deleted-0c07d11d-c06a-497a-9dbd-975adce07e97 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
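The nova_compute lines above show Neutron's external-event flow: when an event such as `network-vif-unplugged-…` arrives, the compute manager pops a waiter registered for that (instance, event) pair, and logs "No waiting events found" when nothing registered first. A minimal stdlib sketch of that register/pop pattern (class and method names mirror the log, but this is a simplification, not Nova's actual implementation):

```python
import threading

class InstanceEvents:
    """Simplified sketch of a per-instance event registry."""
    def __init__(self):
        self._lock = threading.Lock()
        # (instance_uuid, event_name) -> threading.Event
        self._waiters = {}

    def prepare_for_event(self, instance_uuid, event_name):
        # A caller that expects an external event registers a waiter first.
        waiter = threading.Event()
        with self._lock:
            self._waiters[(instance_uuid, event_name)] = waiter
        return waiter

    def pop_instance_event(self, instance_uuid, event_name):
        # Incoming events pop the waiter; None corresponds to the
        # "No waiting events found dispatching ..." log message.
        with self._lock:
            return self._waiters.pop((instance_uuid, event_name), None)

events = InstanceEvents()
w = events.prepare_for_event("810e72a9", "network-vif-unplugged")
assert events.pop_instance_event("810e72a9", "network-vif-unplugged") is w
# A second pop finds nothing registered, like the log line above.
assert events.pop_instance_event("810e72a9", "network-vif-unplugged") is None
```

This also explains the later WARNING at 15:17:45: a `network-vif-plugged` event arrived after the instance was deleted, so no waiter existed and the event was unexpected.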
Jan 20 10:17:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:44.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:44.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:17:44 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3826035582' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:17:44 np0005588920 nova_compute[226886]: 2026-01-20 15:17:44.681 226890 DEBUG oslo_concurrency.processutils [None req-345e8b76-cb1a-4c04-85a0-bb883d2826bb e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
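The resource tracker shells out to `ceph df --format=json` (via oslo processutils, as logged above) and reads cluster totals from the JSON. A hedged sketch of parsing that output; the sample JSON below is an invented, heavily truncated stand-in for the real `ceph df` payload, which carries many more fields:

```python
import json

# Hypothetical, truncated sample of `ceph df --format=json` output.
sample = '''
{"stats": {"total_bytes": 21474836480,
           "total_used_bytes": 1073741824,
           "total_avail_bytes": 20401094656}}
'''

def ceph_usage_gb(raw):
    # Convert the top-level byte counters to GiB.
    stats = json.loads(raw)["stats"]
    gib = 1024 ** 3
    return {"total_gb": stats["total_bytes"] / gib,
            "used_gb": stats["total_used_bytes"] / gib,
            "avail_gb": stats["total_avail_bytes"] / gib}

usage = ceph_usage_gb(sample)
assert usage["total_gb"] == 20.0   # matches the 20 GB DISK_GB total logged below
assert usage["used_gb"] == 1.0
```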
Jan 20 10:17:44 np0005588920 nova_compute[226886]: 2026-01-20 15:17:44.687 226890 DEBUG nova.compute.provider_tree [None req-345e8b76-cb1a-4c04-85a0-bb883d2826bb e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:17:44 np0005588920 nova_compute[226886]: 2026-01-20 15:17:44.724 226890 DEBUG nova.scheduler.client.report [None req-345e8b76-cb1a-4c04-85a0-bb883d2826bb e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:17:44 np0005588920 nova_compute[226886]: 2026-01-20 15:17:44.757 226890 DEBUG oslo_concurrency.lockutils [None req-345e8b76-cb1a-4c04-85a0-bb883d2826bb e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.613s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:17:44 np0005588920 nova_compute[226886]: 2026-01-20 15:17:44.787 226890 INFO nova.scheduler.client.report [None req-345e8b76-cb1a-4c04-85a0-bb883d2826bb e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Deleted allocations for instance 810e72a9-536d-4214-956b-9d5216cce8ff#033[00m
Jan 20 10:17:44 np0005588920 nova_compute[226886]: 2026-01-20 15:17:44.868 226890 DEBUG oslo_concurrency.lockutils [None req-345e8b76-cb1a-4c04-85a0-bb883d2826bb e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "810e72a9-536d-4214-956b-9d5216cce8ff" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:17:45 np0005588920 nova_compute[226886]: 2026-01-20 15:17:45.815 226890 DEBUG nova.compute.manager [req-4976230b-868e-43af-9934-239e2e21efec req-081575c2-29f2-4d38-a29f-a072214620c6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Received event network-vif-plugged-0c07d11d-c06a-497a-9dbd-975adce07e97 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:17:45 np0005588920 nova_compute[226886]: 2026-01-20 15:17:45.815 226890 DEBUG oslo_concurrency.lockutils [req-4976230b-868e-43af-9934-239e2e21efec req-081575c2-29f2-4d38-a29f-a072214620c6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "810e72a9-536d-4214-956b-9d5216cce8ff-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:17:45 np0005588920 nova_compute[226886]: 2026-01-20 15:17:45.815 226890 DEBUG oslo_concurrency.lockutils [req-4976230b-868e-43af-9934-239e2e21efec req-081575c2-29f2-4d38-a29f-a072214620c6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "810e72a9-536d-4214-956b-9d5216cce8ff-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:17:45 np0005588920 nova_compute[226886]: 2026-01-20 15:17:45.816 226890 DEBUG oslo_concurrency.lockutils [req-4976230b-868e-43af-9934-239e2e21efec req-081575c2-29f2-4d38-a29f-a072214620c6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "810e72a9-536d-4214-956b-9d5216cce8ff-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:17:45 np0005588920 nova_compute[226886]: 2026-01-20 15:17:45.816 226890 DEBUG nova.compute.manager [req-4976230b-868e-43af-9934-239e2e21efec req-081575c2-29f2-4d38-a29f-a072214620c6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] No waiting events found dispatching network-vif-plugged-0c07d11d-c06a-497a-9dbd-975adce07e97 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:17:45 np0005588920 nova_compute[226886]: 2026-01-20 15:17:45.816 226890 WARNING nova.compute.manager [req-4976230b-868e-43af-9934-239e2e21efec req-081575c2-29f2-4d38-a29f-a072214620c6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Received unexpected event network-vif-plugged-0c07d11d-c06a-497a-9dbd-975adce07e97 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 10:17:46 np0005588920 nova_compute[226886]: 2026-01-20 15:17:46.216 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:17:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:46.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:17:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:46.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:46 np0005588920 nova_compute[226886]: 2026-01-20 15:17:46.977 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:47 np0005588920 nova_compute[226886]: 2026-01-20 15:17:47.097 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:47 np0005588920 nova_compute[226886]: 2026-01-20 15:17:47.831 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:48 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:17:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:48.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:48.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:17:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:50.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:17:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:50.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
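The radosgw beast frontend emits one access line per request, as in the lines above. A sketch regex for pulling out the client IP, status, and latency; the field layout is inferred from these samples, not taken from a format specification:

```python
import re

LOG = ('beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous '
       '[20/Jan/2026:15:17:50.444 +0000] "HEAD / HTTP/1.0" 200 0 '
       '- - - latency=0.001000028s')

# Fields inferred from the sample lines above (assumption, not a spec).
PAT = re.compile(r'beast: \S+: (?P<ip>\S+) - (?P<user>\S+) '
                 r'\[(?P<ts>[^\]]+)\] "(?P<req>[^"]+)" (?P<status>\d+) '
                 r'(?P<size>\d+) .* latency=(?P<latency>[\d.]+)s')

m = PAT.search(LOG)
assert m is not None
assert m.group("ip") == "192.168.122.100"
assert int(m.group("status")) == 200
assert float(m.group("latency")) == 0.001000028
```

The two clients polling every two seconds (192.168.122.100 and .101) are load-balancer style `HEAD /` health checks, which is why the same anonymous request repeats throughout the log.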
Jan 20 10:17:51 np0005588920 nova_compute[226886]: 2026-01-20 15:17:51.219 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:52 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 10:17:52 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4800.0 total, 600.0 interval#012Cumulative writes: 13K writes, 69K keys, 13K commit groups, 1.0 writes per commit group, ingest: 0.13 GB, 0.03 MB/s#012Cumulative WAL: 13K writes, 13K syncs, 1.00 writes per sync, written: 0.13 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1716 writes, 8443 keys, 1716 commit groups, 1.0 writes per commit group, ingest: 16.57 MB, 0.03 MB/s#012Interval WAL: 1716 writes, 1716 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     82.0      1.02              0.31        43    0.024       0      0       0.0       0.0#012  L6      1/0   11.13 MB   0.0      0.5     0.1      0.4       0.4      0.0       0.0   4.9    110.0     93.6      4.40              1.28        42    0.105    293K    22K       0.0       0.0#012 Sum      1/0   11.13 MB   0.0      0.5     0.1      0.4       0.5      0.1       0.0   5.9     89.3     91.4      5.42              1.59        85    0.064    293K    22K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   7.6    104.0    104.0      0.75              0.22        12    0.062     56K   3149       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.5     0.1      0.4       0.4      0.0       0.0   0.0    110.0     93.6      4.40              1.28        42    0.105    293K    22K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     82.2      1.02              0.31        42    0.024       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 4800.0 total, 600.0 interval#012Flush(GB): cumulative 0.082, interval 0.010#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.48 GB write, 0.10 MB/s write, 0.47 GB read, 0.10 MB/s read, 5.4 seconds#012Interval compaction: 0.08 GB write, 0.13 MB/s write, 0.08 GB read, 0.13 MB/s read, 0.7 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x564a2f9711f0#2 capacity: 304.00 MB usage: 54.48 MB table_size: 0 occupancy: 18446744073709551615 collections: 9 last_copies: 0 last_secs: 0.000402 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3134,52.32 MB,17.2121%) FilterBlock(85,821.23 KB,0.263811%) IndexBlock(85,1.35 MB,0.444026%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
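The `#012` and `#033` sequences throughout this log are the syslog pipeline's octal escapes for control bytes: `#012` is a newline (which is why the RocksDB stats dump above arrives as one long line) and `#033` is ESC, the start of an ANSI color reset. A small decoder, assuming only three-digit `#NNN` octal escapes occur:

```python
import re

def unescape_journald(line):
    # Replace each #NNN (three octal digits) with the byte it encodes.
    return re.sub(r"#([0-7]{3})", lambda m: chr(int(m.group(1), 8)), line)

assert unescape_journald("a#012b") == "a\nb"        # 012 octal = LF
assert unescape_journald("x#033[00m") == "x\x1b[00m"  # 033 octal = ESC
```

Running the RocksDB record above through this function restores the original multi-line stats table.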
Jan 20 10:17:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:52.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:52 np0005588920 nova_compute[226886]: 2026-01-20 15:17:52.483 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768922257.4819124, 54a13784-2a60-4b16-8208-d9b9d0e3033e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:17:52 np0005588920 nova_compute[226886]: 2026-01-20 15:17:52.483 226890 INFO nova.compute.manager [-] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:17:52 np0005588920 nova_compute[226886]: 2026-01-20 15:17:52.504 226890 DEBUG nova.compute.manager [None req-18b2b07b-9a33-4fb0-8a90-5653dad1d35a - - - - - -] [instance: 54a13784-2a60-4b16-8208-d9b9d0e3033e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:17:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:52.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:52 np0005588920 nova_compute[226886]: 2026-01-20 15:17:52.833 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:53 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:17:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:54.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:54.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:56 np0005588920 nova_compute[226886]: 2026-01-20 15:17:56.148 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768922261.1463966, 810e72a9-536d-4214-956b-9d5216cce8ff => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:17:56 np0005588920 nova_compute[226886]: 2026-01-20 15:17:56.149 226890 INFO nova.compute.manager [-] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:17:56 np0005588920 nova_compute[226886]: 2026-01-20 15:17:56.176 226890 DEBUG nova.compute.manager [None req-94155ca5-bc06-4f7d-8447-9ef34d45c312 - - - - - -] [instance: 810e72a9-536d-4214-956b-9d5216cce8ff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:17:56 np0005588920 nova_compute[226886]: 2026-01-20 15:17:56.269 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:56.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:56.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:57 np0005588920 nova_compute[226886]: 2026-01-20 15:17:57.835 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:58 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:17:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:58.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:17:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:58.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:00 np0005588920 podman[297420]: 2026-01-20 15:18:00.009253723 +0000 UTC m=+0.098840945 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 20 10:18:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:00.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:00.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:00 np0005588920 nova_compute[226886]: 2026-01-20 15:18:00.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:18:00 np0005588920 nova_compute[226886]: 2026-01-20 15:18:00.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:18:00 np0005588920 nova_compute[226886]: 2026-01-20 15:18:00.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:18:00 np0005588920 nova_compute[226886]: 2026-01-20 15:18:00.773 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
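The oslo_service lines above show the compute manager's periodic task machinery firing `_heal_instance_info_cache` and, a second later, `update_available_resource`. A stdlib sketch of that dispatch idea (hypothetical task names and a single scheduler pass rather than oslo.service's timer loop):

```python
import time

# Hypothetical stand-ins for the periodic tasks seen in the log.
def heal_instance_info_cache():
    return "Didn't find any instances for network info cache update."

def update_available_resource():
    return "Auditing locally available compute resources"

# task name -> (interval in seconds, callable)
PERIODIC_TASKS = {
    "_heal_instance_info_cache": (60, heal_instance_info_cache),
    "update_available_resource": (60, update_available_resource),
}

def run_periodic_tasks(last_run, now):
    # One scheduler pass: run each task whose interval has elapsed.
    ran = []
    for name, (interval, fn) in PERIODIC_TASKS.items():
        if now - last_run.get(name, 0) >= interval:
            fn()
            last_run[name] = now
            ran.append(name)
    return ran

state = {}
assert set(run_periodic_tasks(state, time.time())) == set(PERIODIC_TASKS)
assert run_periodic_tasks(state, time.time()) == []  # nothing due again yet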
Jan 20 10:18:01 np0005588920 nova_compute[226886]: 2026-01-20 15:18:01.271 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:01 np0005588920 nova_compute[226886]: 2026-01-20 15:18:01.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:18:01 np0005588920 nova_compute[226886]: 2026-01-20 15:18:01.746 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:18:01 np0005588920 nova_compute[226886]: 2026-01-20 15:18:01.747 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:18:01 np0005588920 nova_compute[226886]: 2026-01-20 15:18:01.747 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:18:01 np0005588920 nova_compute[226886]: 2026-01-20 15:18:01.747 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:18:01 np0005588920 nova_compute[226886]: 2026-01-20 15:18:01.747 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:18:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:18:02 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2471681010' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:18:02 np0005588920 nova_compute[226886]: 2026-01-20 15:18:02.180 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:18:02 np0005588920 nova_compute[226886]: 2026-01-20 15:18:02.336 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:18:02 np0005588920 nova_compute[226886]: 2026-01-20 15:18:02.337 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4185MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:18:02 np0005588920 nova_compute[226886]: 2026-01-20 15:18:02.338 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:18:02 np0005588920 nova_compute[226886]: 2026-01-20 15:18:02.338 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:18:02 np0005588920 nova_compute[226886]: 2026-01-20 15:18:02.412 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:18:02 np0005588920 nova_compute[226886]: 2026-01-20 15:18:02.413 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:18:02 np0005588920 nova_compute[226886]: 2026-01-20 15:18:02.428 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:18:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:02.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:18:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:02.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:18:02 np0005588920 nova_compute[226886]: 2026-01-20 15:18:02.836 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:18:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:18:02 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1995102232' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:18:02 np0005588920 nova_compute[226886]: 2026-01-20 15:18:02.885 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:18:02 np0005588920 nova_compute[226886]: 2026-01-20 15:18:02.890 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 20 10:18:03 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:18:03 np0005588920 nova_compute[226886]: 2026-01-20 15:18:03.040 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 20 10:18:03 np0005588920 nova_compute[226886]: 2026-01-20 15:18:03.122 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 20 10:18:03 np0005588920 nova_compute[226886]: 2026-01-20 15:18:03.123 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.785s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:18:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:04.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:04.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:04 np0005588920 nova_compute[226886]: 2026-01-20 15:18:04.640 226890 DEBUG oslo_concurrency.lockutils [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "fdf4cbee-a7b3-4d21-8289-63bc1e093b2c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:18:04 np0005588920 nova_compute[226886]: 2026-01-20 15:18:04.641 226890 DEBUG oslo_concurrency.lockutils [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "fdf4cbee-a7b3-4d21-8289-63bc1e093b2c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:18:04 np0005588920 nova_compute[226886]: 2026-01-20 15:18:04.658 226890 DEBUG nova.compute.manager [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 20 10:18:04 np0005588920 nova_compute[226886]: 2026-01-20 15:18:04.723 226890 DEBUG oslo_concurrency.lockutils [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:18:04 np0005588920 nova_compute[226886]: 2026-01-20 15:18:04.724 226890 DEBUG oslo_concurrency.lockutils [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:18:04 np0005588920 nova_compute[226886]: 2026-01-20 15:18:04.731 226890 DEBUG nova.virt.hardware [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 20 10:18:04 np0005588920 nova_compute[226886]: 2026-01-20 15:18:04.732 226890 INFO nova.compute.claims [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Claim successful on node compute-2.ctlplane.example.com
Jan 20 10:18:04 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:18:04 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:18:04 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:18:04 np0005588920 nova_compute[226886]: 2026-01-20 15:18:04.823 226890 DEBUG oslo_concurrency.processutils [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:18:05 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:18:05 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2904258732' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:18:05 np0005588920 nova_compute[226886]: 2026-01-20 15:18:05.291 226890 DEBUG oslo_concurrency.processutils [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:18:05 np0005588920 nova_compute[226886]: 2026-01-20 15:18:05.299 226890 DEBUG nova.compute.provider_tree [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 20 10:18:05 np0005588920 nova_compute[226886]: 2026-01-20 15:18:05.556 226890 DEBUG nova.scheduler.client.report [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 20 10:18:06 np0005588920 nova_compute[226886]: 2026-01-20 15:18:06.274 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:18:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:06.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:06.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:07 np0005588920 nova_compute[226886]: 2026-01-20 15:18:07.123 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:18:07 np0005588920 nova_compute[226886]: 2026-01-20 15:18:07.123 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_shelved_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:18:07 np0005588920 nova_compute[226886]: 2026-01-20 15:18:07.123 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:18:07 np0005588920 nova_compute[226886]: 2026-01-20 15:18:07.575 226890 DEBUG oslo_concurrency.lockutils [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.851s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:18:07 np0005588920 nova_compute[226886]: 2026-01-20 15:18:07.576 226890 DEBUG nova.compute.manager [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 20 10:18:07 np0005588920 nova_compute[226886]: 2026-01-20 15:18:07.643 226890 DEBUG nova.compute.manager [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 20 10:18:07 np0005588920 nova_compute[226886]: 2026-01-20 15:18:07.643 226890 DEBUG nova.network.neutron [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 20 10:18:07 np0005588920 nova_compute[226886]: 2026-01-20 15:18:07.659 226890 INFO nova.virt.libvirt.driver [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 20 10:18:07 np0005588920 nova_compute[226886]: 2026-01-20 15:18:07.828 226890 DEBUG nova.policy [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '442a7a5cb8ea426a82be9762b262d171', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1ed5feeeafe7448a8efb47ab975b0ead', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 20 10:18:07 np0005588920 nova_compute[226886]: 2026-01-20 15:18:07.837 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:18:07 np0005588920 nova_compute[226886]: 2026-01-20 15:18:07.877 226890 DEBUG nova.compute.manager [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 20 10:18:07 np0005588920 nova_compute[226886]: 2026-01-20 15:18:07.995 226890 DEBUG nova.compute.manager [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 20 10:18:07 np0005588920 nova_compute[226886]: 2026-01-20 15:18:07.996 226890 DEBUG nova.virt.libvirt.driver [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 20 10:18:07 np0005588920 nova_compute[226886]: 2026-01-20 15:18:07.997 226890 INFO nova.virt.libvirt.driver [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Creating image(s)
Jan 20 10:18:08 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:18:08 np0005588920 nova_compute[226886]: 2026-01-20 15:18:08.023 226890 DEBUG nova.storage.rbd_utils [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] rbd image fdf4cbee-a7b3-4d21-8289-63bc1e093b2c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:18:08 np0005588920 nova_compute[226886]: 2026-01-20 15:18:08.052 226890 DEBUG nova.storage.rbd_utils [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] rbd image fdf4cbee-a7b3-4d21-8289-63bc1e093b2c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:18:08 np0005588920 nova_compute[226886]: 2026-01-20 15:18:08.076 226890 DEBUG nova.storage.rbd_utils [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] rbd image fdf4cbee-a7b3-4d21-8289-63bc1e093b2c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:18:08 np0005588920 nova_compute[226886]: 2026-01-20 15:18:08.080 226890 DEBUG oslo_concurrency.processutils [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:18:08 np0005588920 nova_compute[226886]: 2026-01-20 15:18:08.147 226890 DEBUG oslo_concurrency.processutils [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:18:08 np0005588920 nova_compute[226886]: 2026-01-20 15:18:08.148 226890 DEBUG oslo_concurrency.lockutils [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:18:08 np0005588920 nova_compute[226886]: 2026-01-20 15:18:08.149 226890 DEBUG oslo_concurrency.lockutils [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:18:08 np0005588920 nova_compute[226886]: 2026-01-20 15:18:08.149 226890 DEBUG oslo_concurrency.lockutils [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:18:08 np0005588920 nova_compute[226886]: 2026-01-20 15:18:08.171 226890 DEBUG nova.storage.rbd_utils [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] rbd image fdf4cbee-a7b3-4d21-8289-63bc1e093b2c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:18:08 np0005588920 nova_compute[226886]: 2026-01-20 15:18:08.174 226890 DEBUG oslo_concurrency.processutils [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 fdf4cbee-a7b3-4d21-8289-63bc1e093b2c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:18:08 np0005588920 nova_compute[226886]: 2026-01-20 15:18:08.456 226890 DEBUG oslo_concurrency.processutils [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 fdf4cbee-a7b3-4d21-8289-63bc1e093b2c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.281s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:18:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:08.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:08.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:08 np0005588920 nova_compute[226886]: 2026-01-20 15:18:08.537 226890 DEBUG nova.storage.rbd_utils [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] resizing rbd image fdf4cbee-a7b3-4d21-8289-63bc1e093b2c_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 20 10:18:08 np0005588920 nova_compute[226886]: 2026-01-20 15:18:08.642 226890 DEBUG nova.objects.instance [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lazy-loading 'migration_context' on Instance uuid fdf4cbee-a7b3-4d21-8289-63bc1e093b2c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 10:18:08 np0005588920 nova_compute[226886]: 2026-01-20 15:18:08.663 226890 DEBUG nova.virt.libvirt.driver [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 20 10:18:08 np0005588920 nova_compute[226886]: 2026-01-20 15:18:08.664 226890 DEBUG nova.virt.libvirt.driver [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Ensure instance console log exists: /var/lib/nova/instances/fdf4cbee-a7b3-4d21-8289-63bc1e093b2c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 20 10:18:08 np0005588920 nova_compute[226886]: 2026-01-20 15:18:08.664 226890 DEBUG oslo_concurrency.lockutils [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:18:08 np0005588920 nova_compute[226886]: 2026-01-20 15:18:08.665 226890 DEBUG oslo_concurrency.lockutils [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:18:08 np0005588920 nova_compute[226886]: 2026-01-20 15:18:08.665 226890 DEBUG oslo_concurrency.lockutils [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:18:08 np0005588920 nova_compute[226886]: 2026-01-20 15:18:08.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:18:09 np0005588920 nova_compute[226886]: 2026-01-20 15:18:09.228 226890 DEBUG nova.network.neutron [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Successfully created port: e51f8101-df80-4071-a8bf-48b9012ee1ee _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 20 10:18:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:18:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:10.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:18:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:18:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:10.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:18:10 np0005588920 nova_compute[226886]: 2026-01-20 15:18:10.720 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:18:10 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:18:10 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:18:11 np0005588920 nova_compute[226886]: 2026-01-20 15:18:11.276 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:18:11 np0005588920 nova_compute[226886]: 2026-01-20 15:18:11.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:18:11 np0005588920 nova_compute[226886]: 2026-01-20 15:18:11.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:18:11 np0005588920 podman[297863]: 2026-01-20 15:18:11.98208641 +0000 UTC m=+0.068069705 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 20 10:18:12 np0005588920 nova_compute[226886]: 2026-01-20 15:18:12.323 226890 DEBUG nova.network.neutron [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Successfully updated port: e51f8101-df80-4071-a8bf-48b9012ee1ee _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 10:18:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:12.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:12.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:12 np0005588920 nova_compute[226886]: 2026-01-20 15:18:12.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:18:12 np0005588920 nova_compute[226886]: 2026-01-20 15:18:12.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:18:12 np0005588920 nova_compute[226886]: 2026-01-20 15:18:12.839 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:18:13 np0005588920 nova_compute[226886]: 2026-01-20 15:18:13.913 226890 DEBUG oslo_concurrency.lockutils [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "refresh_cache-fdf4cbee-a7b3-4d21-8289-63bc1e093b2c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:18:13 np0005588920 nova_compute[226886]: 2026-01-20 15:18:13.913 226890 DEBUG oslo_concurrency.lockutils [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquired lock "refresh_cache-fdf4cbee-a7b3-4d21-8289-63bc1e093b2c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:18:13 np0005588920 nova_compute[226886]: 2026-01-20 15:18:13.913 226890 DEBUG nova.network.neutron [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:18:14 np0005588920 nova_compute[226886]: 2026-01-20 15:18:14.004 226890 DEBUG nova.compute.manager [req-397572d7-233e-4795-82df-cedbb8fbf770 req-d1b9d7e1-3461-4862-8632-0efb5a17f7ec 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Received event network-changed-e51f8101-df80-4071-a8bf-48b9012ee1ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:18:14 np0005588920 nova_compute[226886]: 2026-01-20 15:18:14.004 226890 DEBUG nova.compute.manager [req-397572d7-233e-4795-82df-cedbb8fbf770 req-d1b9d7e1-3461-4862-8632-0efb5a17f7ec 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Refreshing instance network info cache due to event network-changed-e51f8101-df80-4071-a8bf-48b9012ee1ee. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:18:14 np0005588920 nova_compute[226886]: 2026-01-20 15:18:14.004 226890 DEBUG oslo_concurrency.lockutils [req-397572d7-233e-4795-82df-cedbb8fbf770 req-d1b9d7e1-3461-4862-8632-0efb5a17f7ec 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-fdf4cbee-a7b3-4d21-8289-63bc1e093b2c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:18:14 np0005588920 nova_compute[226886]: 2026-01-20 15:18:14.230 226890 DEBUG nova.network.neutron [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:18:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:14.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:18:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:14.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:18:15 np0005588920 nova_compute[226886]: 2026-01-20 15:18:15.074 226890 DEBUG nova.network.neutron [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Updating instance_info_cache with network_info: [{"id": "e51f8101-df80-4071-a8bf-48b9012ee1ee", "address": "fa:16:3e:0e:9c:3a", "network": {"id": "17f38b81-5055-40c5-bb34-1cecaae3cdc5", "bridge": "br-int", "label": "tempest-network-smoke--617424207", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape51f8101-df", "ovs_interfaceid": "e51f8101-df80-4071-a8bf-48b9012ee1ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:18:15 np0005588920 nova_compute[226886]: 2026-01-20 15:18:15.239 226890 DEBUG oslo_concurrency.lockutils [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Releasing lock "refresh_cache-fdf4cbee-a7b3-4d21-8289-63bc1e093b2c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:18:15 np0005588920 nova_compute[226886]: 2026-01-20 15:18:15.239 226890 DEBUG nova.compute.manager [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Instance network_info: |[{"id": "e51f8101-df80-4071-a8bf-48b9012ee1ee", "address": "fa:16:3e:0e:9c:3a", "network": {"id": "17f38b81-5055-40c5-bb34-1cecaae3cdc5", "bridge": "br-int", "label": "tempest-network-smoke--617424207", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape51f8101-df", "ovs_interfaceid": "e51f8101-df80-4071-a8bf-48b9012ee1ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 10:18:15 np0005588920 nova_compute[226886]: 2026-01-20 15:18:15.240 226890 DEBUG oslo_concurrency.lockutils [req-397572d7-233e-4795-82df-cedbb8fbf770 req-d1b9d7e1-3461-4862-8632-0efb5a17f7ec 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-fdf4cbee-a7b3-4d21-8289-63bc1e093b2c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:18:15 np0005588920 nova_compute[226886]: 2026-01-20 15:18:15.240 226890 DEBUG nova.network.neutron [req-397572d7-233e-4795-82df-cedbb8fbf770 req-d1b9d7e1-3461-4862-8632-0efb5a17f7ec 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Refreshing network info cache for port e51f8101-df80-4071-a8bf-48b9012ee1ee _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:18:15 np0005588920 nova_compute[226886]: 2026-01-20 15:18:15.246 226890 DEBUG nova.virt.libvirt.driver [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Start _get_guest_xml network_info=[{"id": "e51f8101-df80-4071-a8bf-48b9012ee1ee", "address": "fa:16:3e:0e:9c:3a", "network": {"id": "17f38b81-5055-40c5-bb34-1cecaae3cdc5", "bridge": "br-int", "label": "tempest-network-smoke--617424207", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape51f8101-df", "ovs_interfaceid": "e51f8101-df80-4071-a8bf-48b9012ee1ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:18:15 np0005588920 nova_compute[226886]: 2026-01-20 15:18:15.252 226890 WARNING nova.virt.libvirt.driver [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:18:15 np0005588920 nova_compute[226886]: 2026-01-20 15:18:15.260 226890 DEBUG nova.virt.libvirt.host [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:18:15 np0005588920 nova_compute[226886]: 2026-01-20 15:18:15.261 226890 DEBUG nova.virt.libvirt.host [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:18:15 np0005588920 nova_compute[226886]: 2026-01-20 15:18:15.264 226890 DEBUG nova.virt.libvirt.host [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:18:15 np0005588920 nova_compute[226886]: 2026-01-20 15:18:15.265 226890 DEBUG nova.virt.libvirt.host [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:18:15 np0005588920 nova_compute[226886]: 2026-01-20 15:18:15.266 226890 DEBUG nova.virt.libvirt.driver [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:18:15 np0005588920 nova_compute[226886]: 2026-01-20 15:18:15.266 226890 DEBUG nova.virt.hardware [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:18:15 np0005588920 nova_compute[226886]: 2026-01-20 15:18:15.267 226890 DEBUG nova.virt.hardware [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:18:15 np0005588920 nova_compute[226886]: 2026-01-20 15:18:15.267 226890 DEBUG nova.virt.hardware [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:18:15 np0005588920 nova_compute[226886]: 2026-01-20 15:18:15.267 226890 DEBUG nova.virt.hardware [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:18:15 np0005588920 nova_compute[226886]: 2026-01-20 15:18:15.268 226890 DEBUG nova.virt.hardware [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:18:15 np0005588920 nova_compute[226886]: 2026-01-20 15:18:15.268 226890 DEBUG nova.virt.hardware [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:18:15 np0005588920 nova_compute[226886]: 2026-01-20 15:18:15.268 226890 DEBUG nova.virt.hardware [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:18:15 np0005588920 nova_compute[226886]: 2026-01-20 15:18:15.268 226890 DEBUG nova.virt.hardware [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:18:15 np0005588920 nova_compute[226886]: 2026-01-20 15:18:15.269 226890 DEBUG nova.virt.hardware [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:18:15 np0005588920 nova_compute[226886]: 2026-01-20 15:18:15.269 226890 DEBUG nova.virt.hardware [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:18:15 np0005588920 nova_compute[226886]: 2026-01-20 15:18:15.269 226890 DEBUG nova.virt.hardware [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:18:15 np0005588920 nova_compute[226886]: 2026-01-20 15:18:15.272 226890 DEBUG oslo_concurrency.processutils [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:18:15 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:18:15 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2456508678' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:18:15 np0005588920 nova_compute[226886]: 2026-01-20 15:18:15.712 226890 DEBUG oslo_concurrency.processutils [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:18:15 np0005588920 nova_compute[226886]: 2026-01-20 15:18:15.739 226890 DEBUG nova.storage.rbd_utils [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] rbd image fdf4cbee-a7b3-4d21-8289-63bc1e093b2c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:18:15 np0005588920 nova_compute[226886]: 2026-01-20 15:18:15.744 226890 DEBUG oslo_concurrency.processutils [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:18:16 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:18:16 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/594431206' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:18:16 np0005588920 nova_compute[226886]: 2026-01-20 15:18:16.207 226890 DEBUG oslo_concurrency.processutils [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:18:16 np0005588920 nova_compute[226886]: 2026-01-20 15:18:16.209 226890 DEBUG nova.virt.libvirt.vif [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:18:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-114489246',display_name='tempest-TestNetworkAdvancedServerOps-server-114489246',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-114489246',id=193,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMnDU9FFWaxWuseGHaE+BPe//d38yXfU4+8sisL4CzgLAjb0pfT0BeSjc8Ibnw5FuNSoNKbz5ntbdnobuC7IAhZAPbu7IiK18CspR2Hzrodt5N5pzRYwj2BgGc4qb3t5+Q==',key_name='tempest-TestNetworkAdvancedServerOps-1311184721',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1ed5feeeafe7448a8efb47ab975b0ead',ramdisk_id='',reservation_id='r-9qwf5c0s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-175282664',owner_user_name='tempest-TestNetworkAdvancedServerOps-175282664-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:18:07Z,user_data=None,user_id='442a7a5cb8ea426a82be9762b262d171',uuid=fdf4cbee-a7b3-4d21-8289-63bc1e093b2c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e51f8101-df80-4071-a8bf-48b9012ee1ee", "address": "fa:16:3e:0e:9c:3a", "network": {"id": "17f38b81-5055-40c5-bb34-1cecaae3cdc5", "bridge": "br-int", "label": "tempest-network-smoke--617424207", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape51f8101-df", "ovs_interfaceid": "e51f8101-df80-4071-a8bf-48b9012ee1ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:18:16 np0005588920 nova_compute[226886]: 2026-01-20 15:18:16.210 226890 DEBUG nova.network.os_vif_util [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converting VIF {"id": "e51f8101-df80-4071-a8bf-48b9012ee1ee", "address": "fa:16:3e:0e:9c:3a", "network": {"id": "17f38b81-5055-40c5-bb34-1cecaae3cdc5", "bridge": "br-int", "label": "tempest-network-smoke--617424207", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape51f8101-df", "ovs_interfaceid": "e51f8101-df80-4071-a8bf-48b9012ee1ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:18:16 np0005588920 nova_compute[226886]: 2026-01-20 15:18:16.211 226890 DEBUG nova.network.os_vif_util [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0e:9c:3a,bridge_name='br-int',has_traffic_filtering=True,id=e51f8101-df80-4071-a8bf-48b9012ee1ee,network=Network(17f38b81-5055-40c5-bb34-1cecaae3cdc5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape51f8101-df') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:18:16 np0005588920 nova_compute[226886]: 2026-01-20 15:18:16.213 226890 DEBUG nova.objects.instance [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lazy-loading 'pci_devices' on Instance uuid fdf4cbee-a7b3-4d21-8289-63bc1e093b2c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:18:16 np0005588920 nova_compute[226886]: 2026-01-20 15:18:16.262 226890 DEBUG nova.virt.libvirt.driver [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:18:16 np0005588920 nova_compute[226886]:  <uuid>fdf4cbee-a7b3-4d21-8289-63bc1e093b2c</uuid>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:  <name>instance-000000c1</name>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:18:16 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-114489246</nova:name>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 15:18:15</nova:creationTime>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 10:18:16 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:        <nova:user uuid="442a7a5cb8ea426a82be9762b262d171">tempest-TestNetworkAdvancedServerOps-175282664-project-member</nova:user>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:        <nova:project uuid="1ed5feeeafe7448a8efb47ab975b0ead">tempest-TestNetworkAdvancedServerOps-175282664</nova:project>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:        <nova:port uuid="e51f8101-df80-4071-a8bf-48b9012ee1ee">
Jan 20 10:18:16 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    <system>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:      <entry name="serial">fdf4cbee-a7b3-4d21-8289-63bc1e093b2c</entry>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:      <entry name="uuid">fdf4cbee-a7b3-4d21-8289-63bc1e093b2c</entry>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    </system>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:  <os>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:  </os>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:  <features>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:  </features>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:  </clock>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:  <devices>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 10:18:16 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/fdf4cbee-a7b3-4d21-8289-63bc1e093b2c_disk">
Jan 20 10:18:16 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:18:16 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 10:18:16 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/fdf4cbee-a7b3-4d21-8289-63bc1e093b2c_disk.config">
Jan 20 10:18:16 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:18:16 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 10:18:16 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:0e:9c:3a"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:      <target dev="tape51f8101-df"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    </interface>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 10:18:16 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/fdf4cbee-a7b3-4d21-8289-63bc1e093b2c/console.log" append="off"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    </serial>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    <video>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    </video>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 10:18:16 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    </rng>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 10:18:16 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 10:18:16 np0005588920 nova_compute[226886]:  </devices>
Jan 20 10:18:16 np0005588920 nova_compute[226886]: </domain>
Jan 20 10:18:16 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:18:16 np0005588920 nova_compute[226886]: 2026-01-20 15:18:16.263 226890 DEBUG nova.compute.manager [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Preparing to wait for external event network-vif-plugged-e51f8101-df80-4071-a8bf-48b9012ee1ee prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:18:16 np0005588920 nova_compute[226886]: 2026-01-20 15:18:16.264 226890 DEBUG oslo_concurrency.lockutils [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "fdf4cbee-a7b3-4d21-8289-63bc1e093b2c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:18:16 np0005588920 nova_compute[226886]: 2026-01-20 15:18:16.264 226890 DEBUG oslo_concurrency.lockutils [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "fdf4cbee-a7b3-4d21-8289-63bc1e093b2c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:18:16 np0005588920 nova_compute[226886]: 2026-01-20 15:18:16.264 226890 DEBUG oslo_concurrency.lockutils [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "fdf4cbee-a7b3-4d21-8289-63bc1e093b2c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:18:16 np0005588920 nova_compute[226886]: 2026-01-20 15:18:16.265 226890 DEBUG nova.virt.libvirt.vif [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:18:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-114489246',display_name='tempest-TestNetworkAdvancedServerOps-server-114489246',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-114489246',id=193,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMnDU9FFWaxWuseGHaE+BPe//d38yXfU4+8sisL4CzgLAjb0pfT0BeSjc8Ibnw5FuNSoNKbz5ntbdnobuC7IAhZAPbu7IiK18CspR2Hzrodt5N5pzRYwj2BgGc4qb3t5+Q==',key_name='tempest-TestNetworkAdvancedServerOps-1311184721',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1ed5feeeafe7448a8efb47ab975b0ead',ramdisk_id='',reservation_id='r-9qwf5c0s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-175282664',owner_user_name='tempest-TestNetworkAdvancedServerOps-175282664-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:18:07Z,user_data=None,user_id='442a7a5cb8ea426a82be9762b262d171',uuid=fdf4cbee-a7b3-4d21-8289-63bc1e093b2c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e51f8101-df80-4071-a8bf-48b9012ee1ee", "address": "fa:16:3e:0e:9c:3a", "network": {"id": "17f38b81-5055-40c5-bb34-1cecaae3cdc5", "bridge": "br-int", "label": "tempest-network-smoke--617424207", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape51f8101-df", "ovs_interfaceid": "e51f8101-df80-4071-a8bf-48b9012ee1ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:18:16 np0005588920 nova_compute[226886]: 2026-01-20 15:18:16.265 226890 DEBUG nova.network.os_vif_util [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converting VIF {"id": "e51f8101-df80-4071-a8bf-48b9012ee1ee", "address": "fa:16:3e:0e:9c:3a", "network": {"id": "17f38b81-5055-40c5-bb34-1cecaae3cdc5", "bridge": "br-int", "label": "tempest-network-smoke--617424207", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape51f8101-df", "ovs_interfaceid": "e51f8101-df80-4071-a8bf-48b9012ee1ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:18:16 np0005588920 nova_compute[226886]: 2026-01-20 15:18:16.266 226890 DEBUG nova.network.os_vif_util [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0e:9c:3a,bridge_name='br-int',has_traffic_filtering=True,id=e51f8101-df80-4071-a8bf-48b9012ee1ee,network=Network(17f38b81-5055-40c5-bb34-1cecaae3cdc5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape51f8101-df') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:18:16 np0005588920 nova_compute[226886]: 2026-01-20 15:18:16.266 226890 DEBUG os_vif [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0e:9c:3a,bridge_name='br-int',has_traffic_filtering=True,id=e51f8101-df80-4071-a8bf-48b9012ee1ee,network=Network(17f38b81-5055-40c5-bb34-1cecaae3cdc5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape51f8101-df') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:18:16 np0005588920 nova_compute[226886]: 2026-01-20 15:18:16.267 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:16 np0005588920 nova_compute[226886]: 2026-01-20 15:18:16.267 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:18:16 np0005588920 nova_compute[226886]: 2026-01-20 15:18:16.268 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:18:16 np0005588920 nova_compute[226886]: 2026-01-20 15:18:16.271 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:16 np0005588920 nova_compute[226886]: 2026-01-20 15:18:16.272 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape51f8101-df, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:18:16 np0005588920 nova_compute[226886]: 2026-01-20 15:18:16.272 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape51f8101-df, col_values=(('external_ids', {'iface-id': 'e51f8101-df80-4071-a8bf-48b9012ee1ee', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0e:9c:3a', 'vm-uuid': 'fdf4cbee-a7b3-4d21-8289-63bc1e093b2c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:18:16 np0005588920 nova_compute[226886]: 2026-01-20 15:18:16.274 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:16 np0005588920 NetworkManager[49076]: <info>  [1768922296.2755] manager: (tape51f8101-df): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/412)
Jan 20 10:18:16 np0005588920 nova_compute[226886]: 2026-01-20 15:18:16.276 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:18:16 np0005588920 nova_compute[226886]: 2026-01-20 15:18:16.281 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:16 np0005588920 nova_compute[226886]: 2026-01-20 15:18:16.282 226890 INFO os_vif [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0e:9c:3a,bridge_name='br-int',has_traffic_filtering=True,id=e51f8101-df80-4071-a8bf-48b9012ee1ee,network=Network(17f38b81-5055-40c5-bb34-1cecaae3cdc5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape51f8101-df')#033[00m
Jan 20 10:18:16 np0005588920 nova_compute[226886]: 2026-01-20 15:18:16.421 226890 DEBUG nova.virt.libvirt.driver [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:18:16 np0005588920 nova_compute[226886]: 2026-01-20 15:18:16.422 226890 DEBUG nova.virt.libvirt.driver [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:18:16 np0005588920 nova_compute[226886]: 2026-01-20 15:18:16.422 226890 DEBUG nova.virt.libvirt.driver [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] No VIF found with MAC fa:16:3e:0e:9c:3a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:18:16 np0005588920 nova_compute[226886]: 2026-01-20 15:18:16.423 226890 INFO nova.virt.libvirt.driver [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Using config drive#033[00m
Jan 20 10:18:16 np0005588920 nova_compute[226886]: 2026-01-20 15:18:16.452 226890 DEBUG nova.storage.rbd_utils [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] rbd image fdf4cbee-a7b3-4d21-8289-63bc1e093b2c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:18:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:16.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:16.479 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:18:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:16.479 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:18:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:16.479 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:18:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:16.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:17 np0005588920 nova_compute[226886]: 2026-01-20 15:18:17.112 226890 INFO nova.virt.libvirt.driver [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Creating config drive at /var/lib/nova/instances/fdf4cbee-a7b3-4d21-8289-63bc1e093b2c/disk.config#033[00m
Jan 20 10:18:17 np0005588920 nova_compute[226886]: 2026-01-20 15:18:17.117 226890 DEBUG oslo_concurrency.processutils [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/fdf4cbee-a7b3-4d21-8289-63bc1e093b2c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7ljq23bw execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:18:17 np0005588920 nova_compute[226886]: 2026-01-20 15:18:17.251 226890 DEBUG oslo_concurrency.processutils [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/fdf4cbee-a7b3-4d21-8289-63bc1e093b2c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7ljq23bw" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:18:17 np0005588920 nova_compute[226886]: 2026-01-20 15:18:17.280 226890 DEBUG nova.storage.rbd_utils [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] rbd image fdf4cbee-a7b3-4d21-8289-63bc1e093b2c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:18:17 np0005588920 nova_compute[226886]: 2026-01-20 15:18:17.284 226890 DEBUG oslo_concurrency.processutils [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/fdf4cbee-a7b3-4d21-8289-63bc1e093b2c/disk.config fdf4cbee-a7b3-4d21-8289-63bc1e093b2c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:18:17 np0005588920 nova_compute[226886]: 2026-01-20 15:18:17.431 226890 DEBUG oslo_concurrency.processutils [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/fdf4cbee-a7b3-4d21-8289-63bc1e093b2c/disk.config fdf4cbee-a7b3-4d21-8289-63bc1e093b2c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.147s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:18:17 np0005588920 nova_compute[226886]: 2026-01-20 15:18:17.432 226890 INFO nova.virt.libvirt.driver [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Deleting local config drive /var/lib/nova/instances/fdf4cbee-a7b3-4d21-8289-63bc1e093b2c/disk.config because it was imported into RBD.#033[00m
Jan 20 10:18:17 np0005588920 kernel: tape51f8101-df: entered promiscuous mode
Jan 20 10:18:17 np0005588920 NetworkManager[49076]: <info>  [1768922297.4854] manager: (tape51f8101-df): new Tun device (/org/freedesktop/NetworkManager/Devices/413)
Jan 20 10:18:17 np0005588920 ovn_controller[133971]: 2026-01-20T15:18:17Z|00893|binding|INFO|Claiming lport e51f8101-df80-4071-a8bf-48b9012ee1ee for this chassis.
Jan 20 10:18:17 np0005588920 ovn_controller[133971]: 2026-01-20T15:18:17Z|00894|binding|INFO|e51f8101-df80-4071-a8bf-48b9012ee1ee: Claiming fa:16:3e:0e:9c:3a 10.100.0.7
Jan 20 10:18:17 np0005588920 nova_compute[226886]: 2026-01-20 15:18:17.486 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:17 np0005588920 nova_compute[226886]: 2026-01-20 15:18:17.490 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:17.507 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0e:9c:3a 10.100.0.7'], port_security=['fa:16:3e:0e:9c:3a 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'fdf4cbee-a7b3-4d21-8289-63bc1e093b2c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-17f38b81-5055-40c5-bb34-1cecaae3cdc5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ed5feeeafe7448a8efb47ab975b0ead', 'neutron:revision_number': '2', 'neutron:security_group_ids': '14b0b321-5bba-46b3-970a-be865093d05e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3de66f6f-5f8c-4d32-824b-e8203a3036b7, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=e51f8101-df80-4071-a8bf-48b9012ee1ee) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:17.509 144128 INFO neutron.agent.ovn.metadata.agent [-] Port e51f8101-df80-4071-a8bf-48b9012ee1ee in datapath 17f38b81-5055-40c5-bb34-1cecaae3cdc5 bound to our chassis#033[00m
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:17.510 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 17f38b81-5055-40c5-bb34-1cecaae3cdc5#033[00m
Jan 20 10:18:17 np0005588920 systemd-udevd[298015]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:17.520 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[205034e3-48ec-49f7-b8be-293155f4defa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:17.521 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap17f38b81-51 in ovnmeta-17f38b81-5055-40c5-bb34-1cecaae3cdc5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:17.523 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap17f38b81-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:17.523 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[9607dde9-45d7-4201-9920-39bad04978b3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:17.524 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[3c14192e-5e23-4775-b44f-6dab8897d35a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:17 np0005588920 NetworkManager[49076]: <info>  [1768922297.5314] device (tape51f8101-df): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:18:17 np0005588920 NetworkManager[49076]: <info>  [1768922297.5325] device (tape51f8101-df): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:18:17 np0005588920 systemd-machined[196121]: New machine qemu-92-instance-000000c1.
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:17.539 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[87105bd3-87a5-4705-87c2-42059527f4a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:17 np0005588920 systemd[1]: Started Virtual Machine qemu-92-instance-000000c1.
Jan 20 10:18:17 np0005588920 nova_compute[226886]: 2026-01-20 15:18:17.550 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:17.553 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[09aa5bd8-e49a-48c7-bf04-99b78807b512]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:17 np0005588920 nova_compute[226886]: 2026-01-20 15:18:17.558 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:17 np0005588920 ovn_controller[133971]: 2026-01-20T15:18:17Z|00895|binding|INFO|Setting lport e51f8101-df80-4071-a8bf-48b9012ee1ee ovn-installed in OVS
Jan 20 10:18:17 np0005588920 ovn_controller[133971]: 2026-01-20T15:18:17Z|00896|binding|INFO|Setting lport e51f8101-df80-4071-a8bf-48b9012ee1ee up in Southbound
Jan 20 10:18:17 np0005588920 nova_compute[226886]: 2026-01-20 15:18:17.562 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:17.579 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[ca52bebb-8620-46e5-b49f-a66c157df60f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:17.584 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f3a7a3a9-6094-454e-b45d-8d6317abcf47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:17 np0005588920 systemd-udevd[298021]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:18:17 np0005588920 NetworkManager[49076]: <info>  [1768922297.5855] manager: (tap17f38b81-50): new Veth device (/org/freedesktop/NetworkManager/Devices/414)
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:17.614 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[f5fa5320-bc52-4b10-9885-aeb17362cef3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:17.618 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[c539588e-35f4-49bc-a28b-b2d3c1916c57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:17 np0005588920 NetworkManager[49076]: <info>  [1768922297.6400] device (tap17f38b81-50): carrier: link connected
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:17.645 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[9236c9b4-1ffe-40e4-b871-1a70eb721b49]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:17.664 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[44df7ed2-fdcc-496c-8935-724eab0abb8e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap17f38b81-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:e2:29'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 280], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 733932, 'reachable_time': 20156, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 298050, 'error': None, 'target': 'ovnmeta-17f38b81-5055-40c5-bb34-1cecaae3cdc5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:17.679 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[1cca0663-2f9a-43c8-b5d0-46dc9c0d7e43]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe36:e229'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 733932, 'tstamp': 733932}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 298051, 'error': None, 'target': 'ovnmeta-17f38b81-5055-40c5-bb34-1cecaae3cdc5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:17.696 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[88888048-f547-4bd8-8b0a-f822c158a9d7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap17f38b81-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:e2:29'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 280], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 733932, 'reachable_time': 20156, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 298052, 'error': None, 'target': 'ovnmeta-17f38b81-5055-40c5-bb34-1cecaae3cdc5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:17.724 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[45e8968b-bb83-4665-9be3-70dc68fe78db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:17.776 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[3f516220-337c-406c-9043-68073979d83c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:17.777 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap17f38b81-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:17.778 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:17.778 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap17f38b81-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:18:17 np0005588920 nova_compute[226886]: 2026-01-20 15:18:17.780 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:17 np0005588920 NetworkManager[49076]: <info>  [1768922297.7805] manager: (tap17f38b81-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/415)
Jan 20 10:18:17 np0005588920 kernel: tap17f38b81-50: entered promiscuous mode
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:17.783 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap17f38b81-50, col_values=(('external_ids', {'iface-id': 'e996827b-c03f-4383-8e0d-59caa6e43a24'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:18:17 np0005588920 nova_compute[226886]: 2026-01-20 15:18:17.784 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:17 np0005588920 ovn_controller[133971]: 2026-01-20T15:18:17Z|00897|binding|INFO|Releasing lport e996827b-c03f-4383-8e0d-59caa6e43a24 from this chassis (sb_readonly=0)
Jan 20 10:18:17 np0005588920 nova_compute[226886]: 2026-01-20 15:18:17.785 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:17.786 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/17f38b81-5055-40c5-bb34-1cecaae3cdc5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/17f38b81-5055-40c5-bb34-1cecaae3cdc5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:17.786 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[7b45028f-92dd-4a3c-8022-1b81612a1e0e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:17.787 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-17f38b81-5055-40c5-bb34-1cecaae3cdc5
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/17f38b81-5055-40c5-bb34-1cecaae3cdc5.pid.haproxy
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID 17f38b81-5055-40c5-bb34-1cecaae3cdc5
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:18:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:17.788 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-17f38b81-5055-40c5-bb34-1cecaae3cdc5', 'env', 'PROCESS_TAG=haproxy-17f38b81-5055-40c5-bb34-1cecaae3cdc5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/17f38b81-5055-40c5-bb34-1cecaae3cdc5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:18:17 np0005588920 nova_compute[226886]: 2026-01-20 15:18:17.798 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:17 np0005588920 nova_compute[226886]: 2026-01-20 15:18:17.841 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:18 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:18:18 np0005588920 nova_compute[226886]: 2026-01-20 15:18:18.105 226890 DEBUG nova.compute.manager [req-07970eaf-8045-4176-90e7-968918b68743 req-05f32835-885b-4682-ab09-1d3a9676bdbe 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Received event network-vif-plugged-e51f8101-df80-4071-a8bf-48b9012ee1ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:18:18 np0005588920 nova_compute[226886]: 2026-01-20 15:18:18.106 226890 DEBUG oslo_concurrency.lockutils [req-07970eaf-8045-4176-90e7-968918b68743 req-05f32835-885b-4682-ab09-1d3a9676bdbe 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "fdf4cbee-a7b3-4d21-8289-63bc1e093b2c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:18:18 np0005588920 nova_compute[226886]: 2026-01-20 15:18:18.106 226890 DEBUG oslo_concurrency.lockutils [req-07970eaf-8045-4176-90e7-968918b68743 req-05f32835-885b-4682-ab09-1d3a9676bdbe 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fdf4cbee-a7b3-4d21-8289-63bc1e093b2c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:18:18 np0005588920 nova_compute[226886]: 2026-01-20 15:18:18.106 226890 DEBUG oslo_concurrency.lockutils [req-07970eaf-8045-4176-90e7-968918b68743 req-05f32835-885b-4682-ab09-1d3a9676bdbe 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fdf4cbee-a7b3-4d21-8289-63bc1e093b2c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:18:18 np0005588920 nova_compute[226886]: 2026-01-20 15:18:18.107 226890 DEBUG nova.compute.manager [req-07970eaf-8045-4176-90e7-968918b68743 req-05f32835-885b-4682-ab09-1d3a9676bdbe 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Processing event network-vif-plugged-e51f8101-df80-4071-a8bf-48b9012ee1ee _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 10:18:18 np0005588920 nova_compute[226886]: 2026-01-20 15:18:18.108 226890 DEBUG nova.network.neutron [req-397572d7-233e-4795-82df-cedbb8fbf770 req-d1b9d7e1-3461-4862-8632-0efb5a17f7ec 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Updated VIF entry in instance network info cache for port e51f8101-df80-4071-a8bf-48b9012ee1ee. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:18:18 np0005588920 nova_compute[226886]: 2026-01-20 15:18:18.108 226890 DEBUG nova.network.neutron [req-397572d7-233e-4795-82df-cedbb8fbf770 req-d1b9d7e1-3461-4862-8632-0efb5a17f7ec 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Updating instance_info_cache with network_info: [{"id": "e51f8101-df80-4071-a8bf-48b9012ee1ee", "address": "fa:16:3e:0e:9c:3a", "network": {"id": "17f38b81-5055-40c5-bb34-1cecaae3cdc5", "bridge": "br-int", "label": "tempest-network-smoke--617424207", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape51f8101-df", "ovs_interfaceid": "e51f8101-df80-4071-a8bf-48b9012ee1ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:18:18 np0005588920 nova_compute[226886]: 2026-01-20 15:18:18.124 226890 DEBUG oslo_concurrency.lockutils [req-397572d7-233e-4795-82df-cedbb8fbf770 req-d1b9d7e1-3461-4862-8632-0efb5a17f7ec 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-fdf4cbee-a7b3-4d21-8289-63bc1e093b2c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:18:18 np0005588920 podman[298082]: 2026-01-20 15:18:18.168032302 +0000 UTC m=+0.055179930 container create 8da641b120036b069b450d6c1e9e335f07a088fc424b2c39f3c39ee2fd6b51b7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17f38b81-5055-40c5-bb34-1cecaae3cdc5, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 20 10:18:18 np0005588920 systemd[1]: Started libpod-conmon-8da641b120036b069b450d6c1e9e335f07a088fc424b2c39f3c39ee2fd6b51b7.scope.
Jan 20 10:18:18 np0005588920 podman[298082]: 2026-01-20 15:18:18.139772383 +0000 UTC m=+0.026920041 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:18:18 np0005588920 systemd[1]: Started libcrun container.
Jan 20 10:18:18 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/477afe8d19c8645ebac010a93647e3e68c0763b194d3dbd6b5306361d36de34d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:18:18 np0005588920 podman[298082]: 2026-01-20 15:18:18.375641461 +0000 UTC m=+0.262789109 container init 8da641b120036b069b450d6c1e9e335f07a088fc424b2c39f3c39ee2fd6b51b7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17f38b81-5055-40c5-bb34-1cecaae3cdc5, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 20 10:18:18 np0005588920 podman[298082]: 2026-01-20 15:18:18.382508235 +0000 UTC m=+0.269655903 container start 8da641b120036b069b450d6c1e9e335f07a088fc424b2c39f3c39ee2fd6b51b7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17f38b81-5055-40c5-bb34-1cecaae3cdc5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 20 10:18:18 np0005588920 neutron-haproxy-ovnmeta-17f38b81-5055-40c5-bb34-1cecaae3cdc5[298096]: [NOTICE]   (298137) : New worker (298143) forked
Jan 20 10:18:18 np0005588920 neutron-haproxy-ovnmeta-17f38b81-5055-40c5-bb34-1cecaae3cdc5[298096]: [NOTICE]   (298137) : Loading success.
Jan 20 10:18:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:18:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:18.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:18:18 np0005588920 nova_compute[226886]: 2026-01-20 15:18:18.485 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768922298.48493, fdf4cbee-a7b3-4d21-8289-63bc1e093b2c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:18:18 np0005588920 nova_compute[226886]: 2026-01-20 15:18:18.486 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] VM Started (Lifecycle Event)#033[00m
Jan 20 10:18:18 np0005588920 nova_compute[226886]: 2026-01-20 15:18:18.489 226890 DEBUG nova.compute.manager [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:18:18 np0005588920 nova_compute[226886]: 2026-01-20 15:18:18.493 226890 DEBUG nova.virt.libvirt.driver [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:18:18 np0005588920 nova_compute[226886]: 2026-01-20 15:18:18.497 226890 INFO nova.virt.libvirt.driver [-] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Instance spawned successfully.#033[00m
Jan 20 10:18:18 np0005588920 nova_compute[226886]: 2026-01-20 15:18:18.498 226890 DEBUG nova.virt.libvirt.driver [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 10:18:18 np0005588920 nova_compute[226886]: 2026-01-20 15:18:18.505 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:18:18 np0005588920 nova_compute[226886]: 2026-01-20 15:18:18.509 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:18:18 np0005588920 nova_compute[226886]: 2026-01-20 15:18:18.519 226890 DEBUG nova.virt.libvirt.driver [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:18:18 np0005588920 nova_compute[226886]: 2026-01-20 15:18:18.520 226890 DEBUG nova.virt.libvirt.driver [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:18:18 np0005588920 nova_compute[226886]: 2026-01-20 15:18:18.521 226890 DEBUG nova.virt.libvirt.driver [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:18:18 np0005588920 nova_compute[226886]: 2026-01-20 15:18:18.521 226890 DEBUG nova.virt.libvirt.driver [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:18:18 np0005588920 nova_compute[226886]: 2026-01-20 15:18:18.522 226890 DEBUG nova.virt.libvirt.driver [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:18:18 np0005588920 nova_compute[226886]: 2026-01-20 15:18:18.522 226890 DEBUG nova.virt.libvirt.driver [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:18:18 np0005588920 nova_compute[226886]: 2026-01-20 15:18:18.527 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:18:18 np0005588920 nova_compute[226886]: 2026-01-20 15:18:18.527 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768922298.4852405, fdf4cbee-a7b3-4d21-8289-63bc1e093b2c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:18:18 np0005588920 nova_compute[226886]: 2026-01-20 15:18:18.527 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:18:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:18.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:18 np0005588920 nova_compute[226886]: 2026-01-20 15:18:18.551 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:18:18 np0005588920 nova_compute[226886]: 2026-01-20 15:18:18.554 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768922298.492721, fdf4cbee-a7b3-4d21-8289-63bc1e093b2c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:18:18 np0005588920 nova_compute[226886]: 2026-01-20 15:18:18.555 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:18:18 np0005588920 nova_compute[226886]: 2026-01-20 15:18:18.578 226890 INFO nova.compute.manager [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Took 10.58 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 10:18:18 np0005588920 nova_compute[226886]: 2026-01-20 15:18:18.579 226890 DEBUG nova.compute.manager [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:18:18 np0005588920 nova_compute[226886]: 2026-01-20 15:18:18.581 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:18:18 np0005588920 nova_compute[226886]: 2026-01-20 15:18:18.587 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:18:18 np0005588920 nova_compute[226886]: 2026-01-20 15:18:18.621 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:18:18 np0005588920 nova_compute[226886]: 2026-01-20 15:18:18.648 226890 INFO nova.compute.manager [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Took 13.94 seconds to build instance.#033[00m
Jan 20 10:18:18 np0005588920 nova_compute[226886]: 2026-01-20 15:18:18.663 226890 DEBUG oslo_concurrency.lockutils [None req-b1c64692-f0a5-4b26-9faa-9e5ddeef3bc9 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "fdf4cbee-a7b3-4d21-8289-63bc1e093b2c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.022s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:18:20 np0005588920 nova_compute[226886]: 2026-01-20 15:18:20.298 226890 DEBUG nova.compute.manager [req-d1230a2e-7e18-4d90-8931-e4a255d9f505 req-18cee659-de2e-4b51-9cdb-3b61bbe722c6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Received event network-vif-plugged-e51f8101-df80-4071-a8bf-48b9012ee1ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:18:20 np0005588920 nova_compute[226886]: 2026-01-20 15:18:20.298 226890 DEBUG oslo_concurrency.lockutils [req-d1230a2e-7e18-4d90-8931-e4a255d9f505 req-18cee659-de2e-4b51-9cdb-3b61bbe722c6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "fdf4cbee-a7b3-4d21-8289-63bc1e093b2c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:18:20 np0005588920 nova_compute[226886]: 2026-01-20 15:18:20.298 226890 DEBUG oslo_concurrency.lockutils [req-d1230a2e-7e18-4d90-8931-e4a255d9f505 req-18cee659-de2e-4b51-9cdb-3b61bbe722c6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fdf4cbee-a7b3-4d21-8289-63bc1e093b2c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:18:20 np0005588920 nova_compute[226886]: 2026-01-20 15:18:20.299 226890 DEBUG oslo_concurrency.lockutils [req-d1230a2e-7e18-4d90-8931-e4a255d9f505 req-18cee659-de2e-4b51-9cdb-3b61bbe722c6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fdf4cbee-a7b3-4d21-8289-63bc1e093b2c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:18:20 np0005588920 nova_compute[226886]: 2026-01-20 15:18:20.299 226890 DEBUG nova.compute.manager [req-d1230a2e-7e18-4d90-8931-e4a255d9f505 req-18cee659-de2e-4b51-9cdb-3b61bbe722c6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] No waiting events found dispatching network-vif-plugged-e51f8101-df80-4071-a8bf-48b9012ee1ee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:18:20 np0005588920 nova_compute[226886]: 2026-01-20 15:18:20.299 226890 WARNING nova.compute.manager [req-d1230a2e-7e18-4d90-8931-e4a255d9f505 req-18cee659-de2e-4b51-9cdb-3b61bbe722c6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Received unexpected event network-vif-plugged-e51f8101-df80-4071-a8bf-48b9012ee1ee for instance with vm_state active and task_state None.#033[00m
Jan 20 10:18:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:18:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:20.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:18:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:20.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:21 np0005588920 nova_compute[226886]: 2026-01-20 15:18:21.276 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:18:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:22.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:18:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:22.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:22 np0005588920 nova_compute[226886]: 2026-01-20 15:18:22.844 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:23 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:18:23 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 10:18:23 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4800.1 total, 600.0 interval#012Cumulative writes: 69K writes, 276K keys, 69K commit groups, 1.0 writes per commit group, ingest: 0.27 GB, 0.06 MB/s#012Cumulative WAL: 69K writes, 25K syncs, 2.70 writes per sync, written: 0.27 GB, 0.06 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 9113 writes, 33K keys, 9113 commit groups, 1.0 writes per commit group, ingest: 32.98 MB, 0.05 MB/s#012Interval WAL: 9113 writes, 3589 syncs, 2.54 writes per sync, written: 0.03 GB, 0.05 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 20 10:18:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:18:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:24.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:18:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:24.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:25 np0005588920 nova_compute[226886]: 2026-01-20 15:18:25.638 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:25 np0005588920 NetworkManager[49076]: <info>  [1768922305.6394] manager: (patch-br-int-to-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/416)
Jan 20 10:18:25 np0005588920 NetworkManager[49076]: <info>  [1768922305.6403] manager: (patch-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/417)
Jan 20 10:18:25 np0005588920 nova_compute[226886]: 2026-01-20 15:18:25.718 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:25 np0005588920 ovn_controller[133971]: 2026-01-20T15:18:25Z|00898|binding|INFO|Releasing lport e996827b-c03f-4383-8e0d-59caa6e43a24 from this chassis (sb_readonly=0)
Jan 20 10:18:25 np0005588920 ovn_controller[133971]: 2026-01-20T15:18:25Z|00899|binding|INFO|Releasing lport e996827b-c03f-4383-8e0d-59caa6e43a24 from this chassis (sb_readonly=0)
Jan 20 10:18:25 np0005588920 nova_compute[226886]: 2026-01-20 15:18:25.737 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:26 np0005588920 nova_compute[226886]: 2026-01-20 15:18:26.278 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:18:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:26.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:18:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:26.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:26 np0005588920 nova_compute[226886]: 2026-01-20 15:18:26.711 226890 DEBUG nova.compute.manager [req-6a7eab51-e455-42a2-99a5-c66343e6f246 req-7b441910-49af-42c5-ba74-789eaa28e6ec 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Received event network-changed-e51f8101-df80-4071-a8bf-48b9012ee1ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:18:26 np0005588920 nova_compute[226886]: 2026-01-20 15:18:26.711 226890 DEBUG nova.compute.manager [req-6a7eab51-e455-42a2-99a5-c66343e6f246 req-7b441910-49af-42c5-ba74-789eaa28e6ec 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Refreshing instance network info cache due to event network-changed-e51f8101-df80-4071-a8bf-48b9012ee1ee. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:18:26 np0005588920 nova_compute[226886]: 2026-01-20 15:18:26.711 226890 DEBUG oslo_concurrency.lockutils [req-6a7eab51-e455-42a2-99a5-c66343e6f246 req-7b441910-49af-42c5-ba74-789eaa28e6ec 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-fdf4cbee-a7b3-4d21-8289-63bc1e093b2c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:18:26 np0005588920 nova_compute[226886]: 2026-01-20 15:18:26.712 226890 DEBUG oslo_concurrency.lockutils [req-6a7eab51-e455-42a2-99a5-c66343e6f246 req-7b441910-49af-42c5-ba74-789eaa28e6ec 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-fdf4cbee-a7b3-4d21-8289-63bc1e093b2c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:18:26 np0005588920 nova_compute[226886]: 2026-01-20 15:18:26.712 226890 DEBUG nova.network.neutron [req-6a7eab51-e455-42a2-99a5-c66343e6f246 req-7b441910-49af-42c5-ba74-789eaa28e6ec 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Refreshing network info cache for port e51f8101-df80-4071-a8bf-48b9012ee1ee _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:18:27 np0005588920 nova_compute[226886]: 2026-01-20 15:18:27.846 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:28 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:18:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:18:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:28.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:18:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:28.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:30.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:30.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:31 np0005588920 podman[298155]: 2026-01-20 15:18:31.009270755 +0000 UTC m=+0.093982078 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:18:31 np0005588920 nova_compute[226886]: 2026-01-20 15:18:31.125 226890 DEBUG nova.network.neutron [req-6a7eab51-e455-42a2-99a5-c66343e6f246 req-7b441910-49af-42c5-ba74-789eaa28e6ec 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Updated VIF entry in instance network info cache for port e51f8101-df80-4071-a8bf-48b9012ee1ee. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:18:31 np0005588920 nova_compute[226886]: 2026-01-20 15:18:31.126 226890 DEBUG nova.network.neutron [req-6a7eab51-e455-42a2-99a5-c66343e6f246 req-7b441910-49af-42c5-ba74-789eaa28e6ec 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Updating instance_info_cache with network_info: [{"id": "e51f8101-df80-4071-a8bf-48b9012ee1ee", "address": "fa:16:3e:0e:9c:3a", "network": {"id": "17f38b81-5055-40c5-bb34-1cecaae3cdc5", "bridge": "br-int", "label": "tempest-network-smoke--617424207", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape51f8101-df", "ovs_interfaceid": "e51f8101-df80-4071-a8bf-48b9012ee1ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:18:31 np0005588920 nova_compute[226886]: 2026-01-20 15:18:31.178 226890 DEBUG oslo_concurrency.lockutils [req-6a7eab51-e455-42a2-99a5-c66343e6f246 req-7b441910-49af-42c5-ba74-789eaa28e6ec 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-fdf4cbee-a7b3-4d21-8289-63bc1e093b2c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:18:31 np0005588920 nova_compute[226886]: 2026-01-20 15:18:31.280 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:31 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #139. Immutable memtables: 0.
Jan 20 10:18:31 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:18:31.912060) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:18:31 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 87] Flushing memtable with next log file: 139
Jan 20 10:18:31 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922311913160, "job": 87, "event": "flush_started", "num_memtables": 1, "num_entries": 1435, "num_deletes": 258, "total_data_size": 3019498, "memory_usage": 3060048, "flush_reason": "Manual Compaction"}
Jan 20 10:18:31 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 87] Level-0 flush table #140: started
Jan 20 10:18:31 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922311932029, "cf_name": "default", "job": 87, "event": "table_file_creation", "file_number": 140, "file_size": 1979813, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 68900, "largest_seqno": 70330, "table_properties": {"data_size": 1973758, "index_size": 3257, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1733, "raw_key_size": 13517, "raw_average_key_size": 20, "raw_value_size": 1961240, "raw_average_value_size": 2905, "num_data_blocks": 144, "num_entries": 675, "num_filter_entries": 675, "num_deletions": 258, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768922204, "oldest_key_time": 1768922204, "file_creation_time": 1768922311, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 140, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:18:31 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 87] Flush lasted 19125 microseconds, and 12608 cpu microseconds.
Jan 20 10:18:31 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:18:31 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:18:31.932086) [db/flush_job.cc:967] [default] [JOB 87] Level-0 flush table #140: 1979813 bytes OK
Jan 20 10:18:31 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:18:31.932110) [db/memtable_list.cc:519] [default] Level-0 commit table #140 started
Jan 20 10:18:31 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:18:31.934282) [db/memtable_list.cc:722] [default] Level-0 commit table #140: memtable #1 done
Jan 20 10:18:31 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:18:31.934340) EVENT_LOG_v1 {"time_micros": 1768922311934327, "job": 87, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:18:31 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:18:31.934371) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:18:31 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 87] Try to delete WAL files size 3012711, prev total WAL file size 3012711, number of live WAL files 2.
Jan 20 10:18:31 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000136.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:18:31 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:18:31.935823) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0032353230' seq:72057594037927935, type:22 .. '6C6F676D0032373732' seq:0, type:0; will stop at (end)
Jan 20 10:18:31 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 88] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:18:31 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 87 Base level 0, inputs: [140(1933KB)], [138(11MB)]
Jan 20 10:18:31 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922311935895, "job": 88, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [140], "files_L6": [138], "score": -1, "input_data_size": 13652778, "oldest_snapshot_seqno": -1}
Jan 20 10:18:31 np0005588920 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 20 10:18:31 np0005588920 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 20 10:18:32 np0005588920 ovn_controller[133971]: 2026-01-20T15:18:32Z|00121|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:0e:9c:3a 10.100.0.7
Jan 20 10:18:32 np0005588920 ovn_controller[133971]: 2026-01-20T15:18:32Z|00122|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0e:9c:3a 10.100.0.7
Jan 20 10:18:32 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 88] Generated table #141: 9396 keys, 13514356 bytes, temperature: kUnknown
Jan 20 10:18:32 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922312037717, "cf_name": "default", "job": 88, "event": "table_file_creation", "file_number": 141, "file_size": 13514356, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13451196, "index_size": 38535, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 23557, "raw_key_size": 247124, "raw_average_key_size": 26, "raw_value_size": 13283837, "raw_average_value_size": 1413, "num_data_blocks": 1478, "num_entries": 9396, "num_filter_entries": 9396, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768922311, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 141, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:18:32 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:18:32 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:18:32.037987) [db/compaction/compaction_job.cc:1663] [default] [JOB 88] Compacted 1@0 + 1@6 files to L6 => 13514356 bytes
Jan 20 10:18:32 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:18:32.039477) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 134.0 rd, 132.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.9, 11.1 +0.0 blob) out(12.9 +0.0 blob), read-write-amplify(13.7) write-amplify(6.8) OK, records in: 9929, records dropped: 533 output_compression: NoCompression
Jan 20 10:18:32 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:18:32.039497) EVENT_LOG_v1 {"time_micros": 1768922312039488, "job": 88, "event": "compaction_finished", "compaction_time_micros": 101896, "compaction_time_cpu_micros": 59514, "output_level": 6, "num_output_files": 1, "total_output_size": 13514356, "num_input_records": 9929, "num_output_records": 9396, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:18:32 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000140.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:18:32 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922312040149, "job": 88, "event": "table_file_deletion", "file_number": 140}
Jan 20 10:18:32 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000138.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:18:32 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922312042678, "job": 88, "event": "table_file_deletion", "file_number": 138}
Jan 20 10:18:32 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:18:31.935739) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:18:32 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:18:32.042838) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:18:32 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:18:32.042845) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:18:32 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:18:32.042847) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:18:32 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:18:32.042848) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:18:32 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:18:32.042852) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:18:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:32.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:18:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:32.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:18:32 np0005588920 nova_compute[226886]: 2026-01-20 15:18:32.872 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:33 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:18:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:18:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:34.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:34 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:34.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:36 np0005588920 nova_compute[226886]: 2026-01-20 15:18:36.285 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:36.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:36.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:36 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:36.915 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=64, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=63) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:18:36 np0005588920 nova_compute[226886]: 2026-01-20 15:18:36.916 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:36 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:36.917 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:18:37 np0005588920 nova_compute[226886]: 2026-01-20 15:18:37.912 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:38 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:18:38 np0005588920 nova_compute[226886]: 2026-01-20 15:18:38.397 226890 INFO nova.compute.manager [None req-71274c6c-0b93-4311-8c01-803da4f1ae05 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Get console output#033[00m
Jan 20 10:18:38 np0005588920 nova_compute[226886]: 2026-01-20 15:18:38.403 260344 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 20 10:18:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:38.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:38.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:38 np0005588920 nova_compute[226886]: 2026-01-20 15:18:38.755 226890 DEBUG oslo_concurrency.lockutils [None req-1288071b-2f58-4c49-886a-3ac72c8693a0 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "fdf4cbee-a7b3-4d21-8289-63bc1e093b2c" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:18:38 np0005588920 nova_compute[226886]: 2026-01-20 15:18:38.755 226890 DEBUG oslo_concurrency.lockutils [None req-1288071b-2f58-4c49-886a-3ac72c8693a0 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "fdf4cbee-a7b3-4d21-8289-63bc1e093b2c" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:18:38 np0005588920 nova_compute[226886]: 2026-01-20 15:18:38.755 226890 INFO nova.compute.manager [None req-1288071b-2f58-4c49-886a-3ac72c8693a0 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Rebooting instance#033[00m
Jan 20 10:18:38 np0005588920 nova_compute[226886]: 2026-01-20 15:18:38.823 226890 DEBUG oslo_concurrency.lockutils [None req-1288071b-2f58-4c49-886a-3ac72c8693a0 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "refresh_cache-fdf4cbee-a7b3-4d21-8289-63bc1e093b2c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:18:38 np0005588920 nova_compute[226886]: 2026-01-20 15:18:38.824 226890 DEBUG oslo_concurrency.lockutils [None req-1288071b-2f58-4c49-886a-3ac72c8693a0 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquired lock "refresh_cache-fdf4cbee-a7b3-4d21-8289-63bc1e093b2c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:18:38 np0005588920 nova_compute[226886]: 2026-01-20 15:18:38.824 226890 DEBUG nova.network.neutron [None req-1288071b-2f58-4c49-886a-3ac72c8693a0 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:18:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:38.919 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '64'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:18:40 np0005588920 nova_compute[226886]: 2026-01-20 15:18:40.713 226890 DEBUG nova.network.neutron [None req-1288071b-2f58-4c49-886a-3ac72c8693a0 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Updating instance_info_cache with network_info: [{"id": "e51f8101-df80-4071-a8bf-48b9012ee1ee", "address": "fa:16:3e:0e:9c:3a", "network": {"id": "17f38b81-5055-40c5-bb34-1cecaae3cdc5", "bridge": "br-int", "label": "tempest-network-smoke--617424207", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape51f8101-df", "ovs_interfaceid": "e51f8101-df80-4071-a8bf-48b9012ee1ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:18:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:40.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:40 np0005588920 nova_compute[226886]: 2026-01-20 15:18:40.736 226890 DEBUG oslo_concurrency.lockutils [None req-1288071b-2f58-4c49-886a-3ac72c8693a0 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Releasing lock "refresh_cache-fdf4cbee-a7b3-4d21-8289-63bc1e093b2c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:18:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:40 np0005588920 nova_compute[226886]: 2026-01-20 15:18:40.737 226890 DEBUG nova.compute.manager [None req-1288071b-2f58-4c49-886a-3ac72c8693a0 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:18:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:18:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:40.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:18:41 np0005588920 nova_compute[226886]: 2026-01-20 15:18:41.288 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:42.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:42.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:42 np0005588920 nova_compute[226886]: 2026-01-20 15:18:42.914 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:42 np0005588920 podman[298185]: 2026-01-20 15:18:42.962353902 +0000 UTC m=+0.046003561 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 20 10:18:43 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:18:43 np0005588920 kernel: tape51f8101-df (unregistering): left promiscuous mode
Jan 20 10:18:43 np0005588920 NetworkManager[49076]: <info>  [1768922323.1077] device (tape51f8101-df): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:18:43 np0005588920 ovn_controller[133971]: 2026-01-20T15:18:43Z|00900|binding|INFO|Releasing lport e51f8101-df80-4071-a8bf-48b9012ee1ee from this chassis (sb_readonly=0)
Jan 20 10:18:43 np0005588920 ovn_controller[133971]: 2026-01-20T15:18:43Z|00901|binding|INFO|Setting lport e51f8101-df80-4071-a8bf-48b9012ee1ee down in Southbound
Jan 20 10:18:43 np0005588920 ovn_controller[133971]: 2026-01-20T15:18:43Z|00902|binding|INFO|Removing iface tape51f8101-df ovn-installed in OVS
Jan 20 10:18:43 np0005588920 nova_compute[226886]: 2026-01-20 15:18:43.154 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:43.160 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0e:9c:3a 10.100.0.7'], port_security=['fa:16:3e:0e:9c:3a 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'fdf4cbee-a7b3-4d21-8289-63bc1e093b2c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-17f38b81-5055-40c5-bb34-1cecaae3cdc5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ed5feeeafe7448a8efb47ab975b0ead', 'neutron:revision_number': '4', 'neutron:security_group_ids': '14b0b321-5bba-46b3-970a-be865093d05e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.234'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3de66f6f-5f8c-4d32-824b-e8203a3036b7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=e51f8101-df80-4071-a8bf-48b9012ee1ee) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:18:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:43.161 144128 INFO neutron.agent.ovn.metadata.agent [-] Port e51f8101-df80-4071-a8bf-48b9012ee1ee in datapath 17f38b81-5055-40c5-bb34-1cecaae3cdc5 unbound from our chassis#033[00m
Jan 20 10:18:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:43.162 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 17f38b81-5055-40c5-bb34-1cecaae3cdc5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:18:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:43.164 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e8d4810c-116d-435f-b0ed-a8a787b5fa19]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:43.165 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-17f38b81-5055-40c5-bb34-1cecaae3cdc5 namespace which is not needed anymore#033[00m
Jan 20 10:18:43 np0005588920 nova_compute[226886]: 2026-01-20 15:18:43.173 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:43 np0005588920 systemd[1]: machine-qemu\x2d92\x2dinstance\x2d000000c1.scope: Deactivated successfully.
Jan 20 10:18:43 np0005588920 systemd[1]: machine-qemu\x2d92\x2dinstance\x2d000000c1.scope: Consumed 13.758s CPU time.
Jan 20 10:18:43 np0005588920 systemd-machined[196121]: Machine qemu-92-instance-000000c1 terminated.
Jan 20 10:18:43 np0005588920 neutron-haproxy-ovnmeta-17f38b81-5055-40c5-bb34-1cecaae3cdc5[298096]: [NOTICE]   (298137) : haproxy version is 2.8.14-c23fe91
Jan 20 10:18:43 np0005588920 neutron-haproxy-ovnmeta-17f38b81-5055-40c5-bb34-1cecaae3cdc5[298096]: [NOTICE]   (298137) : path to executable is /usr/sbin/haproxy
Jan 20 10:18:43 np0005588920 neutron-haproxy-ovnmeta-17f38b81-5055-40c5-bb34-1cecaae3cdc5[298096]: [WARNING]  (298137) : Exiting Master process...
Jan 20 10:18:43 np0005588920 neutron-haproxy-ovnmeta-17f38b81-5055-40c5-bb34-1cecaae3cdc5[298096]: [ALERT]    (298137) : Current worker (298143) exited with code 143 (Terminated)
Jan 20 10:18:43 np0005588920 neutron-haproxy-ovnmeta-17f38b81-5055-40c5-bb34-1cecaae3cdc5[298096]: [WARNING]  (298137) : All workers exited. Exiting... (0)
Jan 20 10:18:43 np0005588920 systemd[1]: libpod-8da641b120036b069b450d6c1e9e335f07a088fc424b2c39f3c39ee2fd6b51b7.scope: Deactivated successfully.
Jan 20 10:18:43 np0005588920 podman[298228]: 2026-01-20 15:18:43.299903824 +0000 UTC m=+0.046179827 container died 8da641b120036b069b450d6c1e9e335f07a088fc424b2c39f3c39ee2fd6b51b7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17f38b81-5055-40c5-bb34-1cecaae3cdc5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 20 10:18:43 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8da641b120036b069b450d6c1e9e335f07a088fc424b2c39f3c39ee2fd6b51b7-userdata-shm.mount: Deactivated successfully.
Jan 20 10:18:43 np0005588920 systemd[1]: var-lib-containers-storage-overlay-477afe8d19c8645ebac010a93647e3e68c0763b194d3dbd6b5306361d36de34d-merged.mount: Deactivated successfully.
Jan 20 10:18:43 np0005588920 podman[298228]: 2026-01-20 15:18:43.330049006 +0000 UTC m=+0.076324989 container cleanup 8da641b120036b069b450d6c1e9e335f07a088fc424b2c39f3c39ee2fd6b51b7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17f38b81-5055-40c5-bb34-1cecaae3cdc5, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:18:43 np0005588920 systemd[1]: libpod-conmon-8da641b120036b069b450d6c1e9e335f07a088fc424b2c39f3c39ee2fd6b51b7.scope: Deactivated successfully.
Jan 20 10:18:43 np0005588920 nova_compute[226886]: 2026-01-20 15:18:43.366 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:43 np0005588920 nova_compute[226886]: 2026-01-20 15:18:43.371 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:43 np0005588920 podman[298259]: 2026-01-20 15:18:43.404161891 +0000 UTC m=+0.053312788 container remove 8da641b120036b069b450d6c1e9e335f07a088fc424b2c39f3c39ee2fd6b51b7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17f38b81-5055-40c5-bb34-1cecaae3cdc5, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 20 10:18:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:43.410 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[cb4e6f5f-9ed0-46ad-ac20-e89cf079a257]: (4, ('Tue Jan 20 03:18:43 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-17f38b81-5055-40c5-bb34-1cecaae3cdc5 (8da641b120036b069b450d6c1e9e335f07a088fc424b2c39f3c39ee2fd6b51b7)\n8da641b120036b069b450d6c1e9e335f07a088fc424b2c39f3c39ee2fd6b51b7\nTue Jan 20 03:18:43 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-17f38b81-5055-40c5-bb34-1cecaae3cdc5 (8da641b120036b069b450d6c1e9e335f07a088fc424b2c39f3c39ee2fd6b51b7)\n8da641b120036b069b450d6c1e9e335f07a088fc424b2c39f3c39ee2fd6b51b7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:43.413 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[cdc630e2-67dd-4908-90c5-48f2d332a3ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:43.414 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap17f38b81-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:18:43 np0005588920 nova_compute[226886]: 2026-01-20 15:18:43.416 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:43 np0005588920 kernel: tap17f38b81-50: left promiscuous mode
Jan 20 10:18:43 np0005588920 nova_compute[226886]: 2026-01-20 15:18:43.433 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:43.435 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[68f1e269-9971-41cf-99e5-b8c53d81b565]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:43.454 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b24095c3-765d-4558-be5c-d0e44895375d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:43.456 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[7624f04b-8499-47cb-afd8-d2b545008ff3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:43.470 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[efb91551-9dff-4197-a92f-30282355e71e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 733926, 'reachable_time': 43873, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 298288, 'error': None, 'target': 'ovnmeta-17f38b81-5055-40c5-bb34-1cecaae3cdc5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:43.473 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-17f38b81-5055-40c5-bb34-1cecaae3cdc5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:18:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:43.473 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[b3f23fcd-e9a5-4ff7-bf53-cc206a9c2ed5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:43 np0005588920 systemd[1]: run-netns-ovnmeta\x2d17f38b81\x2d5055\x2d40c5\x2dbb34\x2d1cecaae3cdc5.mount: Deactivated successfully.
Jan 20 10:18:43 np0005588920 nova_compute[226886]: 2026-01-20 15:18:43.875 226890 INFO nova.virt.libvirt.driver [None req-1288071b-2f58-4c49-886a-3ac72c8693a0 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Instance shutdown successfully.#033[00m
Jan 20 10:18:43 np0005588920 kernel: tape51f8101-df: entered promiscuous mode
Jan 20 10:18:43 np0005588920 systemd-udevd[298208]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:18:43 np0005588920 NetworkManager[49076]: <info>  [1768922323.9245] manager: (tape51f8101-df): new Tun device (/org/freedesktop/NetworkManager/Devices/418)
Jan 20 10:18:43 np0005588920 ovn_controller[133971]: 2026-01-20T15:18:43Z|00903|binding|INFO|Claiming lport e51f8101-df80-4071-a8bf-48b9012ee1ee for this chassis.
Jan 20 10:18:43 np0005588920 ovn_controller[133971]: 2026-01-20T15:18:43Z|00904|binding|INFO|e51f8101-df80-4071-a8bf-48b9012ee1ee: Claiming fa:16:3e:0e:9c:3a 10.100.0.7
Jan 20 10:18:43 np0005588920 nova_compute[226886]: 2026-01-20 15:18:43.924 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:43 np0005588920 NetworkManager[49076]: <info>  [1768922323.9350] device (tape51f8101-df): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:18:43 np0005588920 NetworkManager[49076]: <info>  [1768922323.9361] device (tape51f8101-df): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:18:43 np0005588920 ovn_controller[133971]: 2026-01-20T15:18:43Z|00905|binding|INFO|Setting lport e51f8101-df80-4071-a8bf-48b9012ee1ee ovn-installed in OVS
Jan 20 10:18:43 np0005588920 ovn_controller[133971]: 2026-01-20T15:18:43Z|00906|binding|INFO|Setting lport e51f8101-df80-4071-a8bf-48b9012ee1ee up in Southbound
Jan 20 10:18:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:43.941 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0e:9c:3a 10.100.0.7'], port_security=['fa:16:3e:0e:9c:3a 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'fdf4cbee-a7b3-4d21-8289-63bc1e093b2c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-17f38b81-5055-40c5-bb34-1cecaae3cdc5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ed5feeeafe7448a8efb47ab975b0ead', 'neutron:revision_number': '4', 'neutron:security_group_ids': '14b0b321-5bba-46b3-970a-be865093d05e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.234'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3de66f6f-5f8c-4d32-824b-e8203a3036b7, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=e51f8101-df80-4071-a8bf-48b9012ee1ee) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:18:43 np0005588920 nova_compute[226886]: 2026-01-20 15:18:43.941 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:43.943 144128 INFO neutron.agent.ovn.metadata.agent [-] Port e51f8101-df80-4071-a8bf-48b9012ee1ee in datapath 17f38b81-5055-40c5-bb34-1cecaae3cdc5 bound to our chassis#033[00m
Jan 20 10:18:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:43.944 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 17f38b81-5055-40c5-bb34-1cecaae3cdc5#033[00m
Jan 20 10:18:43 np0005588920 nova_compute[226886]: 2026-01-20 15:18:43.945 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:43.956 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a520fc3f-8eba-4395-b4fa-3938a3c3ed7c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:43.957 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap17f38b81-51 in ovnmeta-17f38b81-5055-40c5-bb34-1cecaae3cdc5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:18:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:43.959 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap17f38b81-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:18:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:43.959 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[1b7a9f30-5168-4d7b-b5a9-f72fc70c2630]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:43.962 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[bdfd244c-b455-466d-840b-f66da2785d27]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:43 np0005588920 systemd-machined[196121]: New machine qemu-93-instance-000000c1.
Jan 20 10:18:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:43.974 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[a45fe167-be43-4d0a-a446-094594dee550]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:43 np0005588920 systemd[1]: Started Virtual Machine qemu-93-instance-000000c1.
Jan 20 10:18:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:43.989 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[fdd5503f-652b-44de-9481-c0082d9e5424]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:44 np0005588920 nova_compute[226886]: 2026-01-20 15:18:44.019 226890 DEBUG nova.compute.manager [req-c3a7c639-dfa6-42e0-81ca-e3f2669dd36d req-1ca71054-760d-4b85-becb-f52c56268291 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Received event network-vif-unplugged-e51f8101-df80-4071-a8bf-48b9012ee1ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:18:44 np0005588920 nova_compute[226886]: 2026-01-20 15:18:44.020 226890 DEBUG oslo_concurrency.lockutils [req-c3a7c639-dfa6-42e0-81ca-e3f2669dd36d req-1ca71054-760d-4b85-becb-f52c56268291 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "fdf4cbee-a7b3-4d21-8289-63bc1e093b2c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:18:44 np0005588920 nova_compute[226886]: 2026-01-20 15:18:44.020 226890 DEBUG oslo_concurrency.lockutils [req-c3a7c639-dfa6-42e0-81ca-e3f2669dd36d req-1ca71054-760d-4b85-becb-f52c56268291 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fdf4cbee-a7b3-4d21-8289-63bc1e093b2c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:18:44 np0005588920 nova_compute[226886]: 2026-01-20 15:18:44.020 226890 DEBUG oslo_concurrency.lockutils [req-c3a7c639-dfa6-42e0-81ca-e3f2669dd36d req-1ca71054-760d-4b85-becb-f52c56268291 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fdf4cbee-a7b3-4d21-8289-63bc1e093b2c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:18:44 np0005588920 nova_compute[226886]: 2026-01-20 15:18:44.021 226890 DEBUG nova.compute.manager [req-c3a7c639-dfa6-42e0-81ca-e3f2669dd36d req-1ca71054-760d-4b85-becb-f52c56268291 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] No waiting events found dispatching network-vif-unplugged-e51f8101-df80-4071-a8bf-48b9012ee1ee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:18:44 np0005588920 nova_compute[226886]: 2026-01-20 15:18:44.021 226890 WARNING nova.compute.manager [req-c3a7c639-dfa6-42e0-81ca-e3f2669dd36d req-1ca71054-760d-4b85-becb-f52c56268291 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Received unexpected event network-vif-unplugged-e51f8101-df80-4071-a8bf-48b9012ee1ee for instance with vm_state active and task_state reboot_started.#033[00m
Jan 20 10:18:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:44.023 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[cdca6ccb-3b72-4053-91b7-ba3e256c6dd3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:44 np0005588920 NetworkManager[49076]: <info>  [1768922324.0311] manager: (tap17f38b81-50): new Veth device (/org/freedesktop/NetworkManager/Devices/419)
Jan 20 10:18:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:44.030 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[6ba59cb9-f70a-42d2-ab90-f640bc06046e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:44.066 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[ad518745-df1f-49f3-b327-ad2f43406dbd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:44.069 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[091bddd1-41f5-4798-b7c2-13821022ee16]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:44 np0005588920 NetworkManager[49076]: <info>  [1768922324.0931] device (tap17f38b81-50): carrier: link connected
Jan 20 10:18:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:44.099 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[71cfd2c7-5235-486c-865d-dbbc9d6043e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:44.115 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[561a888c-1191-40bd-a2bd-5085f20168a6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap17f38b81-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:e2:29'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 283], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 736577, 'reachable_time': 16085, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 298333, 'error': None, 'target': 'ovnmeta-17f38b81-5055-40c5-bb34-1cecaae3cdc5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:44.128 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[96ffa777-8e52-4101-bf1d-803511db1fff]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe36:e229'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 736577, 'tstamp': 736577}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 298334, 'error': None, 'target': 'ovnmeta-17f38b81-5055-40c5-bb34-1cecaae3cdc5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:44.142 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[aa2c08f7-19b9-42ee-a9a8-80cee665dea3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap17f38b81-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:e2:29'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 283], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 736577, 'reachable_time': 16085, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 298335, 'error': None, 'target': 'ovnmeta-17f38b81-5055-40c5-bb34-1cecaae3cdc5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:44.168 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d854d1d4-dcf5-48e5-98fc-ae542a66dc18]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:44.216 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[60adf8b2-afb4-4da3-854a-3e41ddc4c54e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:44.218 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap17f38b81-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:18:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:44.218 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:18:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:44.219 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap17f38b81-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:18:44 np0005588920 nova_compute[226886]: 2026-01-20 15:18:44.264 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:44 np0005588920 kernel: tap17f38b81-50: entered promiscuous mode
Jan 20 10:18:44 np0005588920 NetworkManager[49076]: <info>  [1768922324.2701] manager: (tap17f38b81-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/420)
Jan 20 10:18:44 np0005588920 nova_compute[226886]: 2026-01-20 15:18:44.270 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:44.271 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap17f38b81-50, col_values=(('external_ids', {'iface-id': 'e996827b-c03f-4383-8e0d-59caa6e43a24'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:18:44 np0005588920 nova_compute[226886]: 2026-01-20 15:18:44.272 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:44 np0005588920 ovn_controller[133971]: 2026-01-20T15:18:44Z|00907|binding|INFO|Releasing lport e996827b-c03f-4383-8e0d-59caa6e43a24 from this chassis (sb_readonly=0)
Jan 20 10:18:44 np0005588920 nova_compute[226886]: 2026-01-20 15:18:44.274 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:44.274 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/17f38b81-5055-40c5-bb34-1cecaae3cdc5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/17f38b81-5055-40c5-bb34-1cecaae3cdc5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:18:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:44.275 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[213c9d7a-eb2e-4f8f-adfc-6bc632779145]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:44.275 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:18:44 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 10:18:44 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 10:18:44 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-17f38b81-5055-40c5-bb34-1cecaae3cdc5
Jan 20 10:18:44 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 10:18:44 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 10:18:44 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 10:18:44 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/17f38b81-5055-40c5-bb34-1cecaae3cdc5.pid.haproxy
Jan 20 10:18:44 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 10:18:44 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:18:44 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 10:18:44 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 10:18:44 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 10:18:44 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 10:18:44 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 10:18:44 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 10:18:44 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 10:18:44 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 10:18:44 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 10:18:44 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 10:18:44 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 10:18:44 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 10:18:44 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 10:18:44 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:18:44 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:18:44 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 10:18:44 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 10:18:44 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:18:44 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID 17f38b81-5055-40c5-bb34-1cecaae3cdc5
Jan 20 10:18:44 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:18:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:18:44.276 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-17f38b81-5055-40c5-bb34-1cecaae3cdc5', 'env', 'PROCESS_TAG=haproxy-17f38b81-5055-40c5-bb34-1cecaae3cdc5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/17f38b81-5055-40c5-bb34-1cecaae3cdc5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:18:44 np0005588920 nova_compute[226886]: 2026-01-20 15:18:44.286 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:44 np0005588920 podman[298367]: 2026-01-20 15:18:44.643883852 +0000 UTC m=+0.045475756 container create 8b64dcacda8148454f1a4911216414f8d2a3dfa6ede052ccfea9ce5e88b9b6c2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17f38b81-5055-40c5-bb34-1cecaae3cdc5, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 20 10:18:44 np0005588920 systemd[1]: Started libpod-conmon-8b64dcacda8148454f1a4911216414f8d2a3dfa6ede052ccfea9ce5e88b9b6c2.scope.
Jan 20 10:18:44 np0005588920 systemd[1]: Started libcrun container.
Jan 20 10:18:44 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/663b6785e137b986f95157d77cdb05a9396d263fef74b572357146b5b8fc374b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:18:44 np0005588920 podman[298367]: 2026-01-20 15:18:44.61901975 +0000 UTC m=+0.020611674 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:18:44 np0005588920 podman[298367]: 2026-01-20 15:18:44.727159826 +0000 UTC m=+0.128751760 container init 8b64dcacda8148454f1a4911216414f8d2a3dfa6ede052ccfea9ce5e88b9b6c2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17f38b81-5055-40c5-bb34-1cecaae3cdc5, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 20 10:18:44 np0005588920 podman[298367]: 2026-01-20 15:18:44.732511808 +0000 UTC m=+0.134103712 container start 8b64dcacda8148454f1a4911216414f8d2a3dfa6ede052ccfea9ce5e88b9b6c2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17f38b81-5055-40c5-bb34-1cecaae3cdc5, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:18:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:44.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:18:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:44.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:18:44 np0005588920 neutron-haproxy-ovnmeta-17f38b81-5055-40c5-bb34-1cecaae3cdc5[298382]: [NOTICE]   (298386) : New worker (298388) forked
Jan 20 10:18:44 np0005588920 neutron-haproxy-ovnmeta-17f38b81-5055-40c5-bb34-1cecaae3cdc5[298382]: [NOTICE]   (298386) : Loading success.
Jan 20 10:18:45 np0005588920 nova_compute[226886]: 2026-01-20 15:18:45.118 226890 DEBUG nova.virt.libvirt.host [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Removed pending event for fdf4cbee-a7b3-4d21-8289-63bc1e093b2c due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 20 10:18:45 np0005588920 nova_compute[226886]: 2026-01-20 15:18:45.120 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768922325.1178668, fdf4cbee-a7b3-4d21-8289-63bc1e093b2c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:18:45 np0005588920 nova_compute[226886]: 2026-01-20 15:18:45.120 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:18:45 np0005588920 nova_compute[226886]: 2026-01-20 15:18:45.126 226890 INFO nova.virt.libvirt.driver [-] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Instance running successfully.#033[00m
Jan 20 10:18:45 np0005588920 nova_compute[226886]: 2026-01-20 15:18:45.126 226890 INFO nova.virt.libvirt.driver [None req-1288071b-2f58-4c49-886a-3ac72c8693a0 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Instance soft rebooted successfully.#033[00m
Jan 20 10:18:45 np0005588920 nova_compute[226886]: 2026-01-20 15:18:45.127 226890 DEBUG nova.compute.manager [None req-1288071b-2f58-4c49-886a-3ac72c8693a0 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:18:45 np0005588920 nova_compute[226886]: 2026-01-20 15:18:45.176 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:18:45 np0005588920 nova_compute[226886]: 2026-01-20 15:18:45.180 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:18:45 np0005588920 nova_compute[226886]: 2026-01-20 15:18:45.203 226890 DEBUG oslo_concurrency.lockutils [None req-1288071b-2f58-4c49-886a-3ac72c8693a0 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "fdf4cbee-a7b3-4d21-8289-63bc1e093b2c" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 6.448s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:18:45 np0005588920 nova_compute[226886]: 2026-01-20 15:18:45.206 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768922325.1206899, fdf4cbee-a7b3-4d21-8289-63bc1e093b2c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:18:45 np0005588920 nova_compute[226886]: 2026-01-20 15:18:45.207 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] VM Started (Lifecycle Event)#033[00m
Jan 20 10:18:45 np0005588920 nova_compute[226886]: 2026-01-20 15:18:45.239 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:18:45 np0005588920 nova_compute[226886]: 2026-01-20 15:18:45.244 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:18:46 np0005588920 nova_compute[226886]: 2026-01-20 15:18:46.107 226890 DEBUG nova.compute.manager [req-9a8e1ebb-5449-4edf-9326-33011fa07973 req-1bf73c08-2e5a-43c8-beab-0686aab007f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Received event network-vif-plugged-e51f8101-df80-4071-a8bf-48b9012ee1ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:18:46 np0005588920 nova_compute[226886]: 2026-01-20 15:18:46.107 226890 DEBUG oslo_concurrency.lockutils [req-9a8e1ebb-5449-4edf-9326-33011fa07973 req-1bf73c08-2e5a-43c8-beab-0686aab007f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "fdf4cbee-a7b3-4d21-8289-63bc1e093b2c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:18:46 np0005588920 nova_compute[226886]: 2026-01-20 15:18:46.107 226890 DEBUG oslo_concurrency.lockutils [req-9a8e1ebb-5449-4edf-9326-33011fa07973 req-1bf73c08-2e5a-43c8-beab-0686aab007f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fdf4cbee-a7b3-4d21-8289-63bc1e093b2c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:18:46 np0005588920 nova_compute[226886]: 2026-01-20 15:18:46.107 226890 DEBUG oslo_concurrency.lockutils [req-9a8e1ebb-5449-4edf-9326-33011fa07973 req-1bf73c08-2e5a-43c8-beab-0686aab007f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fdf4cbee-a7b3-4d21-8289-63bc1e093b2c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:18:46 np0005588920 nova_compute[226886]: 2026-01-20 15:18:46.108 226890 DEBUG nova.compute.manager [req-9a8e1ebb-5449-4edf-9326-33011fa07973 req-1bf73c08-2e5a-43c8-beab-0686aab007f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] No waiting events found dispatching network-vif-plugged-e51f8101-df80-4071-a8bf-48b9012ee1ee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:18:46 np0005588920 nova_compute[226886]: 2026-01-20 15:18:46.108 226890 WARNING nova.compute.manager [req-9a8e1ebb-5449-4edf-9326-33011fa07973 req-1bf73c08-2e5a-43c8-beab-0686aab007f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Received unexpected event network-vif-plugged-e51f8101-df80-4071-a8bf-48b9012ee1ee for instance with vm_state active and task_state None.#033[00m
Jan 20 10:18:46 np0005588920 nova_compute[226886]: 2026-01-20 15:18:46.108 226890 DEBUG nova.compute.manager [req-9a8e1ebb-5449-4edf-9326-33011fa07973 req-1bf73c08-2e5a-43c8-beab-0686aab007f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Received event network-vif-plugged-e51f8101-df80-4071-a8bf-48b9012ee1ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:18:46 np0005588920 nova_compute[226886]: 2026-01-20 15:18:46.108 226890 DEBUG oslo_concurrency.lockutils [req-9a8e1ebb-5449-4edf-9326-33011fa07973 req-1bf73c08-2e5a-43c8-beab-0686aab007f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "fdf4cbee-a7b3-4d21-8289-63bc1e093b2c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:18:46 np0005588920 nova_compute[226886]: 2026-01-20 15:18:46.108 226890 DEBUG oslo_concurrency.lockutils [req-9a8e1ebb-5449-4edf-9326-33011fa07973 req-1bf73c08-2e5a-43c8-beab-0686aab007f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fdf4cbee-a7b3-4d21-8289-63bc1e093b2c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:18:46 np0005588920 nova_compute[226886]: 2026-01-20 15:18:46.108 226890 DEBUG oslo_concurrency.lockutils [req-9a8e1ebb-5449-4edf-9326-33011fa07973 req-1bf73c08-2e5a-43c8-beab-0686aab007f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fdf4cbee-a7b3-4d21-8289-63bc1e093b2c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:18:46 np0005588920 nova_compute[226886]: 2026-01-20 15:18:46.109 226890 DEBUG nova.compute.manager [req-9a8e1ebb-5449-4edf-9326-33011fa07973 req-1bf73c08-2e5a-43c8-beab-0686aab007f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] No waiting events found dispatching network-vif-plugged-e51f8101-df80-4071-a8bf-48b9012ee1ee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:18:46 np0005588920 nova_compute[226886]: 2026-01-20 15:18:46.109 226890 WARNING nova.compute.manager [req-9a8e1ebb-5449-4edf-9326-33011fa07973 req-1bf73c08-2e5a-43c8-beab-0686aab007f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Received unexpected event network-vif-plugged-e51f8101-df80-4071-a8bf-48b9012ee1ee for instance with vm_state active and task_state None.#033[00m
Jan 20 10:18:46 np0005588920 nova_compute[226886]: 2026-01-20 15:18:46.109 226890 DEBUG nova.compute.manager [req-9a8e1ebb-5449-4edf-9326-33011fa07973 req-1bf73c08-2e5a-43c8-beab-0686aab007f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Received event network-vif-plugged-e51f8101-df80-4071-a8bf-48b9012ee1ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:18:46 np0005588920 nova_compute[226886]: 2026-01-20 15:18:46.109 226890 DEBUG oslo_concurrency.lockutils [req-9a8e1ebb-5449-4edf-9326-33011fa07973 req-1bf73c08-2e5a-43c8-beab-0686aab007f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "fdf4cbee-a7b3-4d21-8289-63bc1e093b2c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:18:46 np0005588920 nova_compute[226886]: 2026-01-20 15:18:46.109 226890 DEBUG oslo_concurrency.lockutils [req-9a8e1ebb-5449-4edf-9326-33011fa07973 req-1bf73c08-2e5a-43c8-beab-0686aab007f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fdf4cbee-a7b3-4d21-8289-63bc1e093b2c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:18:46 np0005588920 nova_compute[226886]: 2026-01-20 15:18:46.110 226890 DEBUG oslo_concurrency.lockutils [req-9a8e1ebb-5449-4edf-9326-33011fa07973 req-1bf73c08-2e5a-43c8-beab-0686aab007f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fdf4cbee-a7b3-4d21-8289-63bc1e093b2c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:18:46 np0005588920 nova_compute[226886]: 2026-01-20 15:18:46.110 226890 DEBUG nova.compute.manager [req-9a8e1ebb-5449-4edf-9326-33011fa07973 req-1bf73c08-2e5a-43c8-beab-0686aab007f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] No waiting events found dispatching network-vif-plugged-e51f8101-df80-4071-a8bf-48b9012ee1ee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:18:46 np0005588920 nova_compute[226886]: 2026-01-20 15:18:46.110 226890 WARNING nova.compute.manager [req-9a8e1ebb-5449-4edf-9326-33011fa07973 req-1bf73c08-2e5a-43c8-beab-0686aab007f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Received unexpected event network-vif-plugged-e51f8101-df80-4071-a8bf-48b9012ee1ee for instance with vm_state active and task_state None.#033[00m
Jan 20 10:18:46 np0005588920 nova_compute[226886]: 2026-01-20 15:18:46.291 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:46.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:18:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:46.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:18:47 np0005588920 nova_compute[226886]: 2026-01-20 15:18:47.915 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:48 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:18:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:48.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:18:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:48.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:18:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:50.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:50.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:51 np0005588920 nova_compute[226886]: 2026-01-20 15:18:51.295 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:52.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:52.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:52 np0005588920 nova_compute[226886]: 2026-01-20 15:18:52.917 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:53 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:18:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:54.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:18:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:54.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:18:56 np0005588920 nova_compute[226886]: 2026-01-20 15:18:56.299 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:56.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:56.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:57 np0005588920 nova_compute[226886]: 2026-01-20 15:18:57.921 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:58 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:18:58 np0005588920 ovn_controller[133971]: 2026-01-20T15:18:58Z|00123|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0e:9c:3a 10.100.0.7
Jan 20 10:18:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:18:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:58.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:18:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:18:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:58.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:19:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:00.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:19:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:00.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:01 np0005588920 nova_compute[226886]: 2026-01-20 15:19:01.305 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:01 np0005588920 nova_compute[226886]: 2026-01-20 15:19:01.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:19:01 np0005588920 nova_compute[226886]: 2026-01-20 15:19:01.756 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:19:01 np0005588920 nova_compute[226886]: 2026-01-20 15:19:01.757 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:19:01 np0005588920 nova_compute[226886]: 2026-01-20 15:19:01.758 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:19:01 np0005588920 nova_compute[226886]: 2026-01-20 15:19:01.758 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:19:01 np0005588920 nova_compute[226886]: 2026-01-20 15:19:01.758 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:19:02 np0005588920 podman[298443]: 2026-01-20 15:19:02.003515011 +0000 UTC m=+0.082430711 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Jan 20 10:19:02 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:19:02 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/412186716' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:19:02 np0005588920 nova_compute[226886]: 2026-01-20 15:19:02.230 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:19:02 np0005588920 nova_compute[226886]: 2026-01-20 15:19:02.325 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-000000c1 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:19:02 np0005588920 nova_compute[226886]: 2026-01-20 15:19:02.326 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-000000c1 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:19:02 np0005588920 nova_compute[226886]: 2026-01-20 15:19:02.475 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:19:02 np0005588920 nova_compute[226886]: 2026-01-20 15:19:02.476 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3968MB free_disk=20.90404510498047GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:19:02 np0005588920 nova_compute[226886]: 2026-01-20 15:19:02.477 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:19:02 np0005588920 nova_compute[226886]: 2026-01-20 15:19:02.477 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:19:02 np0005588920 nova_compute[226886]: 2026-01-20 15:19:02.598 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance fdf4cbee-a7b3-4d21-8289-63bc1e093b2c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:19:02 np0005588920 nova_compute[226886]: 2026-01-20 15:19:02.598 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:19:02 np0005588920 nova_compute[226886]: 2026-01-20 15:19:02.599 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:19:02 np0005588920 nova_compute[226886]: 2026-01-20 15:19:02.615 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Refreshing inventories for resource provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 20 10:19:02 np0005588920 nova_compute[226886]: 2026-01-20 15:19:02.641 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Updating ProviderTree inventory for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 20 10:19:02 np0005588920 nova_compute[226886]: 2026-01-20 15:19:02.641 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Updating inventory in ProviderTree for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 20 10:19:02 np0005588920 nova_compute[226886]: 2026-01-20 15:19:02.655 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Refreshing aggregate associations for resource provider ff38e91c-3320-4831-90ac-bcffc89ba7b6, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 20 10:19:02 np0005588920 nova_compute[226886]: 2026-01-20 15:19:02.688 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Refreshing trait associations for resource provider ff38e91c-3320-4831-90ac-bcffc89ba7b6, traits: COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_DEVICE_TAGGING,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 20 10:19:02 np0005588920 nova_compute[226886]: 2026-01-20 15:19:02.739 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:19:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:19:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:02.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:19:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:02.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:02 np0005588920 nova_compute[226886]: 2026-01-20 15:19:02.974 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:03 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:19:03 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:19:03 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/817734252' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:19:03 np0005588920 nova_compute[226886]: 2026-01-20 15:19:03.169 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.430s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:19:03 np0005588920 nova_compute[226886]: 2026-01-20 15:19:03.175 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:19:03 np0005588920 nova_compute[226886]: 2026-01-20 15:19:03.207 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:19:03 np0005588920 nova_compute[226886]: 2026-01-20 15:19:03.238 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:19:03 np0005588920 nova_compute[226886]: 2026-01-20 15:19:03.239 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.762s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:19:04 np0005588920 nova_compute[226886]: 2026-01-20 15:19:04.084 226890 INFO nova.compute.manager [None req-10763265-2b02-4422-8b66-d0c3ac3d8630 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Get console output#033[00m
Jan 20 10:19:04 np0005588920 nova_compute[226886]: 2026-01-20 15:19:04.091 260344 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 20 10:19:04 np0005588920 nova_compute[226886]: 2026-01-20 15:19:04.239 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:19:04 np0005588920 nova_compute[226886]: 2026-01-20 15:19:04.239 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:19:04 np0005588920 nova_compute[226886]: 2026-01-20 15:19:04.240 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:19:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:04.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:04.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:04 np0005588920 nova_compute[226886]: 2026-01-20 15:19:04.907 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "refresh_cache-fdf4cbee-a7b3-4d21-8289-63bc1e093b2c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:19:04 np0005588920 nova_compute[226886]: 2026-01-20 15:19:04.908 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquired lock "refresh_cache-fdf4cbee-a7b3-4d21-8289-63bc1e093b2c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:19:04 np0005588920 nova_compute[226886]: 2026-01-20 15:19:04.908 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 10:19:04 np0005588920 nova_compute[226886]: 2026-01-20 15:19:04.909 226890 DEBUG nova.objects.instance [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid fdf4cbee-a7b3-4d21-8289-63bc1e093b2c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:19:05 np0005588920 nova_compute[226886]: 2026-01-20 15:19:05.529 226890 DEBUG nova.compute.manager [req-b05e9f3e-df2a-40f9-afce-95a1e5e039c8 req-733c77dc-fedc-403e-8958-65df83313e80 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Received event network-changed-e51f8101-df80-4071-a8bf-48b9012ee1ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:19:05 np0005588920 nova_compute[226886]: 2026-01-20 15:19:05.530 226890 DEBUG nova.compute.manager [req-b05e9f3e-df2a-40f9-afce-95a1e5e039c8 req-733c77dc-fedc-403e-8958-65df83313e80 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Refreshing instance network info cache due to event network-changed-e51f8101-df80-4071-a8bf-48b9012ee1ee. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:19:05 np0005588920 nova_compute[226886]: 2026-01-20 15:19:05.530 226890 DEBUG oslo_concurrency.lockutils [req-b05e9f3e-df2a-40f9-afce-95a1e5e039c8 req-733c77dc-fedc-403e-8958-65df83313e80 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-fdf4cbee-a7b3-4d21-8289-63bc1e093b2c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:19:05 np0005588920 nova_compute[226886]: 2026-01-20 15:19:05.619 226890 DEBUG oslo_concurrency.lockutils [None req-8b67426c-d043-4e79-8bc0-b10fabdf06d7 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "fdf4cbee-a7b3-4d21-8289-63bc1e093b2c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:19:05 np0005588920 nova_compute[226886]: 2026-01-20 15:19:05.619 226890 DEBUG oslo_concurrency.lockutils [None req-8b67426c-d043-4e79-8bc0-b10fabdf06d7 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "fdf4cbee-a7b3-4d21-8289-63bc1e093b2c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:19:05 np0005588920 nova_compute[226886]: 2026-01-20 15:19:05.619 226890 DEBUG oslo_concurrency.lockutils [None req-8b67426c-d043-4e79-8bc0-b10fabdf06d7 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "fdf4cbee-a7b3-4d21-8289-63bc1e093b2c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:19:05 np0005588920 nova_compute[226886]: 2026-01-20 15:19:05.620 226890 DEBUG oslo_concurrency.lockutils [None req-8b67426c-d043-4e79-8bc0-b10fabdf06d7 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "fdf4cbee-a7b3-4d21-8289-63bc1e093b2c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:19:05 np0005588920 nova_compute[226886]: 2026-01-20 15:19:05.620 226890 DEBUG oslo_concurrency.lockutils [None req-8b67426c-d043-4e79-8bc0-b10fabdf06d7 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "fdf4cbee-a7b3-4d21-8289-63bc1e093b2c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:19:05 np0005588920 nova_compute[226886]: 2026-01-20 15:19:05.621 226890 INFO nova.compute.manager [None req-8b67426c-d043-4e79-8bc0-b10fabdf06d7 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Terminating instance#033[00m
Jan 20 10:19:05 np0005588920 nova_compute[226886]: 2026-01-20 15:19:05.623 226890 DEBUG nova.compute.manager [None req-8b67426c-d043-4e79-8bc0-b10fabdf06d7 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:19:05 np0005588920 kernel: tape51f8101-df (unregistering): left promiscuous mode
Jan 20 10:19:05 np0005588920 NetworkManager[49076]: <info>  [1768922345.6742] device (tape51f8101-df): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:19:05 np0005588920 ovn_controller[133971]: 2026-01-20T15:19:05Z|00908|binding|INFO|Releasing lport e51f8101-df80-4071-a8bf-48b9012ee1ee from this chassis (sb_readonly=0)
Jan 20 10:19:05 np0005588920 ovn_controller[133971]: 2026-01-20T15:19:05Z|00909|binding|INFO|Setting lport e51f8101-df80-4071-a8bf-48b9012ee1ee down in Southbound
Jan 20 10:19:05 np0005588920 nova_compute[226886]: 2026-01-20 15:19:05.683 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:05 np0005588920 ovn_controller[133971]: 2026-01-20T15:19:05Z|00910|binding|INFO|Removing iface tape51f8101-df ovn-installed in OVS
Jan 20 10:19:05 np0005588920 nova_compute[226886]: 2026-01-20 15:19:05.685 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:05 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:19:05.693 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0e:9c:3a 10.100.0.7'], port_security=['fa:16:3e:0e:9c:3a 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'fdf4cbee-a7b3-4d21-8289-63bc1e093b2c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-17f38b81-5055-40c5-bb34-1cecaae3cdc5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ed5feeeafe7448a8efb47ab975b0ead', 'neutron:revision_number': '6', 'neutron:security_group_ids': '14b0b321-5bba-46b3-970a-be865093d05e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3de66f6f-5f8c-4d32-824b-e8203a3036b7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=e51f8101-df80-4071-a8bf-48b9012ee1ee) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:19:05 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:19:05.694 144128 INFO neutron.agent.ovn.metadata.agent [-] Port e51f8101-df80-4071-a8bf-48b9012ee1ee in datapath 17f38b81-5055-40c5-bb34-1cecaae3cdc5 unbound from our chassis#033[00m
Jan 20 10:19:05 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:19:05.695 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 17f38b81-5055-40c5-bb34-1cecaae3cdc5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:19:05 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:19:05.697 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[3dbad3c5-702e-464d-a787-5f51d80b0535]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:19:05 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:19:05.697 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-17f38b81-5055-40c5-bb34-1cecaae3cdc5 namespace which is not needed anymore#033[00m
Jan 20 10:19:05 np0005588920 nova_compute[226886]: 2026-01-20 15:19:05.703 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:05 np0005588920 systemd[1]: machine-qemu\x2d93\x2dinstance\x2d000000c1.scope: Deactivated successfully.
Jan 20 10:19:05 np0005588920 systemd[1]: machine-qemu\x2d93\x2dinstance\x2d000000c1.scope: Consumed 14.728s CPU time.
Jan 20 10:19:05 np0005588920 systemd-machined[196121]: Machine qemu-93-instance-000000c1 terminated.
Jan 20 10:19:05 np0005588920 neutron-haproxy-ovnmeta-17f38b81-5055-40c5-bb34-1cecaae3cdc5[298382]: [NOTICE]   (298386) : haproxy version is 2.8.14-c23fe91
Jan 20 10:19:05 np0005588920 neutron-haproxy-ovnmeta-17f38b81-5055-40c5-bb34-1cecaae3cdc5[298382]: [NOTICE]   (298386) : path to executable is /usr/sbin/haproxy
Jan 20 10:19:05 np0005588920 neutron-haproxy-ovnmeta-17f38b81-5055-40c5-bb34-1cecaae3cdc5[298382]: [WARNING]  (298386) : Exiting Master process...
Jan 20 10:19:05 np0005588920 neutron-haproxy-ovnmeta-17f38b81-5055-40c5-bb34-1cecaae3cdc5[298382]: [ALERT]    (298386) : Current worker (298388) exited with code 143 (Terminated)
Jan 20 10:19:05 np0005588920 neutron-haproxy-ovnmeta-17f38b81-5055-40c5-bb34-1cecaae3cdc5[298382]: [WARNING]  (298386) : All workers exited. Exiting... (0)
Jan 20 10:19:05 np0005588920 systemd[1]: libpod-8b64dcacda8148454f1a4911216414f8d2a3dfa6ede052ccfea9ce5e88b9b6c2.scope: Deactivated successfully.
Jan 20 10:19:05 np0005588920 podman[298537]: 2026-01-20 15:19:05.838610315 +0000 UTC m=+0.051793895 container died 8b64dcacda8148454f1a4911216414f8d2a3dfa6ede052ccfea9ce5e88b9b6c2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17f38b81-5055-40c5-bb34-1cecaae3cdc5, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:19:05 np0005588920 nova_compute[226886]: 2026-01-20 15:19:05.843 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:05 np0005588920 nova_compute[226886]: 2026-01-20 15:19:05.850 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:05 np0005588920 nova_compute[226886]: 2026-01-20 15:19:05.861 226890 INFO nova.virt.libvirt.driver [-] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Instance destroyed successfully.#033[00m
Jan 20 10:19:05 np0005588920 nova_compute[226886]: 2026-01-20 15:19:05.862 226890 DEBUG nova.objects.instance [None req-8b67426c-d043-4e79-8bc0-b10fabdf06d7 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lazy-loading 'resources' on Instance uuid fdf4cbee-a7b3-4d21-8289-63bc1e093b2c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:19:05 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8b64dcacda8148454f1a4911216414f8d2a3dfa6ede052ccfea9ce5e88b9b6c2-userdata-shm.mount: Deactivated successfully.
Jan 20 10:19:05 np0005588920 systemd[1]: var-lib-containers-storage-overlay-663b6785e137b986f95157d77cdb05a9396d263fef74b572357146b5b8fc374b-merged.mount: Deactivated successfully.
Jan 20 10:19:05 np0005588920 podman[298537]: 2026-01-20 15:19:05.881362713 +0000 UTC m=+0.094546293 container cleanup 8b64dcacda8148454f1a4911216414f8d2a3dfa6ede052ccfea9ce5e88b9b6c2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17f38b81-5055-40c5-bb34-1cecaae3cdc5, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 20 10:19:05 np0005588920 systemd[1]: libpod-conmon-8b64dcacda8148454f1a4911216414f8d2a3dfa6ede052ccfea9ce5e88b9b6c2.scope: Deactivated successfully.
Jan 20 10:19:05 np0005588920 nova_compute[226886]: 2026-01-20 15:19:05.893 226890 DEBUG nova.virt.libvirt.vif [None req-8b67426c-d043-4e79-8bc0-b10fabdf06d7 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:18:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-114489246',display_name='tempest-TestNetworkAdvancedServerOps-server-114489246',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-114489246',id=193,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMnDU9FFWaxWuseGHaE+BPe//d38yXfU4+8sisL4CzgLAjb0pfT0BeSjc8Ibnw5FuNSoNKbz5ntbdnobuC7IAhZAPbu7IiK18CspR2Hzrodt5N5pzRYwj2BgGc4qb3t5+Q==',key_name='tempest-TestNetworkAdvancedServerOps-1311184721',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:18:18Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1ed5feeeafe7448a8efb47ab975b0ead',ramdisk_id='',reservation_id='r-9qwf5c0s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-175282664',owner_user_name='tempest-TestNetworkAdvancedServerOps-175282664-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:18:45Z,user_data=None,user_id='442a7a5cb8ea426a82be9762b262d171',uuid=fdf4cbee-a7b3-4d21-8289-63bc1e093b2c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e51f8101-df80-4071-a8bf-48b9012ee1ee", "address": "fa:16:3e:0e:9c:3a", "network": {"id": "17f38b81-5055-40c5-bb34-1cecaae3cdc5", "bridge": "br-int", "label": "tempest-network-smoke--617424207", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape51f8101-df", "ovs_interfaceid": "e51f8101-df80-4071-a8bf-48b9012ee1ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:19:05 np0005588920 nova_compute[226886]: 2026-01-20 15:19:05.896 226890 DEBUG nova.network.os_vif_util [None req-8b67426c-d043-4e79-8bc0-b10fabdf06d7 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converting VIF {"id": "e51f8101-df80-4071-a8bf-48b9012ee1ee", "address": "fa:16:3e:0e:9c:3a", "network": {"id": "17f38b81-5055-40c5-bb34-1cecaae3cdc5", "bridge": "br-int", "label": "tempest-network-smoke--617424207", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape51f8101-df", "ovs_interfaceid": "e51f8101-df80-4071-a8bf-48b9012ee1ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:19:05 np0005588920 nova_compute[226886]: 2026-01-20 15:19:05.897 226890 DEBUG nova.network.os_vif_util [None req-8b67426c-d043-4e79-8bc0-b10fabdf06d7 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0e:9c:3a,bridge_name='br-int',has_traffic_filtering=True,id=e51f8101-df80-4071-a8bf-48b9012ee1ee,network=Network(17f38b81-5055-40c5-bb34-1cecaae3cdc5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape51f8101-df') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:19:05 np0005588920 nova_compute[226886]: 2026-01-20 15:19:05.897 226890 DEBUG os_vif [None req-8b67426c-d043-4e79-8bc0-b10fabdf06d7 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0e:9c:3a,bridge_name='br-int',has_traffic_filtering=True,id=e51f8101-df80-4071-a8bf-48b9012ee1ee,network=Network(17f38b81-5055-40c5-bb34-1cecaae3cdc5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape51f8101-df') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:19:05 np0005588920 nova_compute[226886]: 2026-01-20 15:19:05.899 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:05 np0005588920 nova_compute[226886]: 2026-01-20 15:19:05.900 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape51f8101-df, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:19:05 np0005588920 nova_compute[226886]: 2026-01-20 15:19:05.901 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:05 np0005588920 nova_compute[226886]: 2026-01-20 15:19:05.904 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:19:05 np0005588920 nova_compute[226886]: 2026-01-20 15:19:05.907 226890 INFO os_vif [None req-8b67426c-d043-4e79-8bc0-b10fabdf06d7 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0e:9c:3a,bridge_name='br-int',has_traffic_filtering=True,id=e51f8101-df80-4071-a8bf-48b9012ee1ee,network=Network(17f38b81-5055-40c5-bb34-1cecaae3cdc5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape51f8101-df')#033[00m
Jan 20 10:19:05 np0005588920 podman[298575]: 2026-01-20 15:19:05.953487461 +0000 UTC m=+0.045190568 container remove 8b64dcacda8148454f1a4911216414f8d2a3dfa6ede052ccfea9ce5e88b9b6c2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17f38b81-5055-40c5-bb34-1cecaae3cdc5, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:19:05 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:19:05.959 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[3c2f1cf0-add8-4bb4-95a7-87bd03ff0f0f]: (4, ('Tue Jan 20 03:19:05 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-17f38b81-5055-40c5-bb34-1cecaae3cdc5 (8b64dcacda8148454f1a4911216414f8d2a3dfa6ede052ccfea9ce5e88b9b6c2)\n8b64dcacda8148454f1a4911216414f8d2a3dfa6ede052ccfea9ce5e88b9b6c2\nTue Jan 20 03:19:05 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-17f38b81-5055-40c5-bb34-1cecaae3cdc5 (8b64dcacda8148454f1a4911216414f8d2a3dfa6ede052ccfea9ce5e88b9b6c2)\n8b64dcacda8148454f1a4911216414f8d2a3dfa6ede052ccfea9ce5e88b9b6c2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:19:05 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:19:05.960 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b756be8b-42d2-473b-b133-e9e5915e42ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:19:05 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:19:05.961 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap17f38b81-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:19:05 np0005588920 nova_compute[226886]: 2026-01-20 15:19:05.964 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:05 np0005588920 kernel: tap17f38b81-50: left promiscuous mode
Jan 20 10:19:05 np0005588920 nova_compute[226886]: 2026-01-20 15:19:05.978 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:05 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:19:05.982 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[59c81465-889c-4457-bdb8-7f720fc608dc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:19:05 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:19:05.995 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[4cd29c4b-fd8f-4218-a504-f3c396e6a7ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:19:05 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:19:05.996 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[62efd8ae-4754-42db-aa42-68bb6f0bb0ec]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:19:06 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:19:06.012 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[8f38b0c6-2a3d-40fc-969f-634d8587123b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 736570, 'reachable_time': 24358, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 298609, 'error': None, 'target': 'ovnmeta-17f38b81-5055-40c5-bb34-1cecaae3cdc5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:19:06 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:19:06.014 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-17f38b81-5055-40c5-bb34-1cecaae3cdc5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:19:06 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:19:06.014 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[917aa1ea-89fa-4459-9feb-2def29afb6db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:19:06 np0005588920 systemd[1]: run-netns-ovnmeta\x2d17f38b81\x2d5055\x2d40c5\x2dbb34\x2d1cecaae3cdc5.mount: Deactivated successfully.
Jan 20 10:19:06 np0005588920 nova_compute[226886]: 2026-01-20 15:19:06.282 226890 INFO nova.virt.libvirt.driver [None req-8b67426c-d043-4e79-8bc0-b10fabdf06d7 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Deleting instance files /var/lib/nova/instances/fdf4cbee-a7b3-4d21-8289-63bc1e093b2c_del#033[00m
Jan 20 10:19:06 np0005588920 nova_compute[226886]: 2026-01-20 15:19:06.283 226890 INFO nova.virt.libvirt.driver [None req-8b67426c-d043-4e79-8bc0-b10fabdf06d7 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Deletion of /var/lib/nova/instances/fdf4cbee-a7b3-4d21-8289-63bc1e093b2c_del complete#033[00m
Jan 20 10:19:06 np0005588920 nova_compute[226886]: 2026-01-20 15:19:06.374 226890 INFO nova.compute.manager [None req-8b67426c-d043-4e79-8bc0-b10fabdf06d7 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Took 0.75 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:19:06 np0005588920 nova_compute[226886]: 2026-01-20 15:19:06.375 226890 DEBUG oslo.service.loopingcall [None req-8b67426c-d043-4e79-8bc0-b10fabdf06d7 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:19:06 np0005588920 nova_compute[226886]: 2026-01-20 15:19:06.375 226890 DEBUG nova.compute.manager [-] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:19:06 np0005588920 nova_compute[226886]: 2026-01-20 15:19:06.376 226890 DEBUG nova.network.neutron [-] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:19:06 np0005588920 nova_compute[226886]: 2026-01-20 15:19:06.484 226890 DEBUG nova.compute.manager [req-58858fe8-e481-4fd1-92e6-8505a932f464 req-c4b924e4-0fad-4172-9f52-fdd20e72d812 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Received event network-vif-unplugged-e51f8101-df80-4071-a8bf-48b9012ee1ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:19:06 np0005588920 nova_compute[226886]: 2026-01-20 15:19:06.485 226890 DEBUG oslo_concurrency.lockutils [req-58858fe8-e481-4fd1-92e6-8505a932f464 req-c4b924e4-0fad-4172-9f52-fdd20e72d812 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "fdf4cbee-a7b3-4d21-8289-63bc1e093b2c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:19:06 np0005588920 nova_compute[226886]: 2026-01-20 15:19:06.485 226890 DEBUG oslo_concurrency.lockutils [req-58858fe8-e481-4fd1-92e6-8505a932f464 req-c4b924e4-0fad-4172-9f52-fdd20e72d812 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fdf4cbee-a7b3-4d21-8289-63bc1e093b2c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:19:06 np0005588920 nova_compute[226886]: 2026-01-20 15:19:06.485 226890 DEBUG oslo_concurrency.lockutils [req-58858fe8-e481-4fd1-92e6-8505a932f464 req-c4b924e4-0fad-4172-9f52-fdd20e72d812 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fdf4cbee-a7b3-4d21-8289-63bc1e093b2c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:19:06 np0005588920 nova_compute[226886]: 2026-01-20 15:19:06.485 226890 DEBUG nova.compute.manager [req-58858fe8-e481-4fd1-92e6-8505a932f464 req-c4b924e4-0fad-4172-9f52-fdd20e72d812 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] No waiting events found dispatching network-vif-unplugged-e51f8101-df80-4071-a8bf-48b9012ee1ee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:19:06 np0005588920 nova_compute[226886]: 2026-01-20 15:19:06.486 226890 DEBUG nova.compute.manager [req-58858fe8-e481-4fd1-92e6-8505a932f464 req-c4b924e4-0fad-4172-9f52-fdd20e72d812 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Received event network-vif-unplugged-e51f8101-df80-4071-a8bf-48b9012ee1ee for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:19:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:06.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:19:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:06.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:19:07 np0005588920 nova_compute[226886]: 2026-01-20 15:19:07.262 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Updating instance_info_cache with network_info: [{"id": "e51f8101-df80-4071-a8bf-48b9012ee1ee", "address": "fa:16:3e:0e:9c:3a", "network": {"id": "17f38b81-5055-40c5-bb34-1cecaae3cdc5", "bridge": "br-int", "label": "tempest-network-smoke--617424207", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape51f8101-df", "ovs_interfaceid": "e51f8101-df80-4071-a8bf-48b9012ee1ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:19:07 np0005588920 nova_compute[226886]: 2026-01-20 15:19:07.351 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Releasing lock "refresh_cache-fdf4cbee-a7b3-4d21-8289-63bc1e093b2c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:19:07 np0005588920 nova_compute[226886]: 2026-01-20 15:19:07.351 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 10:19:07 np0005588920 nova_compute[226886]: 2026-01-20 15:19:07.352 226890 DEBUG oslo_concurrency.lockutils [req-b05e9f3e-df2a-40f9-afce-95a1e5e039c8 req-733c77dc-fedc-403e-8958-65df83313e80 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-fdf4cbee-a7b3-4d21-8289-63bc1e093b2c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:19:07 np0005588920 nova_compute[226886]: 2026-01-20 15:19:07.353 226890 DEBUG nova.network.neutron [req-b05e9f3e-df2a-40f9-afce-95a1e5e039c8 req-733c77dc-fedc-403e-8958-65df83313e80 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Refreshing network info cache for port e51f8101-df80-4071-a8bf-48b9012ee1ee _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:19:07 np0005588920 nova_compute[226886]: 2026-01-20 15:19:07.602 226890 DEBUG nova.network.neutron [-] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:19:07 np0005588920 nova_compute[226886]: 2026-01-20 15:19:07.627 226890 INFO nova.compute.manager [-] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Took 1.25 seconds to deallocate network for instance.
Jan 20 10:19:07 np0005588920 nova_compute[226886]: 2026-01-20 15:19:07.634 226890 INFO nova.network.neutron [req-b05e9f3e-df2a-40f9-afce-95a1e5e039c8 req-733c77dc-fedc-403e-8958-65df83313e80 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Port e51f8101-df80-4071-a8bf-48b9012ee1ee from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Jan 20 10:19:07 np0005588920 nova_compute[226886]: 2026-01-20 15:19:07.635 226890 DEBUG nova.network.neutron [req-b05e9f3e-df2a-40f9-afce-95a1e5e039c8 req-733c77dc-fedc-403e-8958-65df83313e80 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 20 10:19:07 np0005588920 nova_compute[226886]: 2026-01-20 15:19:07.673 226890 DEBUG oslo_concurrency.lockutils [req-b05e9f3e-df2a-40f9-afce-95a1e5e039c8 req-733c77dc-fedc-403e-8958-65df83313e80 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-fdf4cbee-a7b3-4d21-8289-63bc1e093b2c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 20 10:19:07 np0005588920 nova_compute[226886]: 2026-01-20 15:19:07.696 226890 DEBUG oslo_concurrency.lockutils [None req-8b67426c-d043-4e79-8bc0-b10fabdf06d7 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:19:07 np0005588920 nova_compute[226886]: 2026-01-20 15:19:07.696 226890 DEBUG oslo_concurrency.lockutils [None req-8b67426c-d043-4e79-8bc0-b10fabdf06d7 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:19:07 np0005588920 nova_compute[226886]: 2026-01-20 15:19:07.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:19:07 np0005588920 nova_compute[226886]: 2026-01-20 15:19:07.755 226890 DEBUG oslo_concurrency.processutils [None req-8b67426c-d043-4e79-8bc0-b10fabdf06d7 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:19:07 np0005588920 nova_compute[226886]: 2026-01-20 15:19:07.793 226890 DEBUG nova.compute.manager [req-637d827e-52f8-4077-9b27-52d550c0f91e req-ebbee8b3-dd6b-49a2-afb2-d7c961f53fcc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Received event network-vif-deleted-e51f8101-df80-4071-a8bf-48b9012ee1ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 10:19:07 np0005588920 nova_compute[226886]: 2026-01-20 15:19:07.975 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:19:08 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:19:08 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:19:08 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/750879127' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:19:08 np0005588920 nova_compute[226886]: 2026-01-20 15:19:08.224 226890 DEBUG oslo_concurrency.processutils [None req-8b67426c-d043-4e79-8bc0-b10fabdf06d7 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:19:08 np0005588920 nova_compute[226886]: 2026-01-20 15:19:08.233 226890 DEBUG nova.compute.provider_tree [None req-8b67426c-d043-4e79-8bc0-b10fabdf06d7 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 20 10:19:08 np0005588920 nova_compute[226886]: 2026-01-20 15:19:08.259 226890 DEBUG nova.scheduler.client.report [None req-8b67426c-d043-4e79-8bc0-b10fabdf06d7 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 20 10:19:08 np0005588920 nova_compute[226886]: 2026-01-20 15:19:08.283 226890 DEBUG oslo_concurrency.lockutils [None req-8b67426c-d043-4e79-8bc0-b10fabdf06d7 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.587s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:19:08 np0005588920 nova_compute[226886]: 2026-01-20 15:19:08.337 226890 INFO nova.scheduler.client.report [None req-8b67426c-d043-4e79-8bc0-b10fabdf06d7 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Deleted allocations for instance fdf4cbee-a7b3-4d21-8289-63bc1e093b2c
Jan 20 10:19:08 np0005588920 nova_compute[226886]: 2026-01-20 15:19:08.474 226890 DEBUG oslo_concurrency.lockutils [None req-8b67426c-d043-4e79-8bc0-b10fabdf06d7 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "fdf4cbee-a7b3-4d21-8289-63bc1e093b2c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.855s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:19:08 np0005588920 nova_compute[226886]: 2026-01-20 15:19:08.591 226890 DEBUG nova.compute.manager [req-f23a43ce-215c-4dc1-a5de-deea2dbe1efd req-a5975838-e328-4060-9723-1106dc812bba 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Received event network-vif-plugged-e51f8101-df80-4071-a8bf-48b9012ee1ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 10:19:08 np0005588920 nova_compute[226886]: 2026-01-20 15:19:08.591 226890 DEBUG oslo_concurrency.lockutils [req-f23a43ce-215c-4dc1-a5de-deea2dbe1efd req-a5975838-e328-4060-9723-1106dc812bba 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "fdf4cbee-a7b3-4d21-8289-63bc1e093b2c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:19:08 np0005588920 nova_compute[226886]: 2026-01-20 15:19:08.592 226890 DEBUG oslo_concurrency.lockutils [req-f23a43ce-215c-4dc1-a5de-deea2dbe1efd req-a5975838-e328-4060-9723-1106dc812bba 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fdf4cbee-a7b3-4d21-8289-63bc1e093b2c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:19:08 np0005588920 nova_compute[226886]: 2026-01-20 15:19:08.592 226890 DEBUG oslo_concurrency.lockutils [req-f23a43ce-215c-4dc1-a5de-deea2dbe1efd req-a5975838-e328-4060-9723-1106dc812bba 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fdf4cbee-a7b3-4d21-8289-63bc1e093b2c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:19:08 np0005588920 nova_compute[226886]: 2026-01-20 15:19:08.592 226890 DEBUG nova.compute.manager [req-f23a43ce-215c-4dc1-a5de-deea2dbe1efd req-a5975838-e328-4060-9723-1106dc812bba 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] No waiting events found dispatching network-vif-plugged-e51f8101-df80-4071-a8bf-48b9012ee1ee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 20 10:19:08 np0005588920 nova_compute[226886]: 2026-01-20 15:19:08.593 226890 WARNING nova.compute.manager [req-f23a43ce-215c-4dc1-a5de-deea2dbe1efd req-a5975838-e328-4060-9723-1106dc812bba 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Received unexpected event network-vif-plugged-e51f8101-df80-4071-a8bf-48b9012ee1ee for instance with vm_state deleted and task_state None.
Jan 20 10:19:08 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:19:08 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3200555813' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:19:08 np0005588920 nova_compute[226886]: 2026-01-20 15:19:08.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:19:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:08.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:19:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:08.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:19:09 np0005588920 nova_compute[226886]: 2026-01-20 15:19:09.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:19:10 np0005588920 nova_compute[226886]: 2026-01-20 15:19:10.232 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:19:10 np0005588920 nova_compute[226886]: 2026-01-20 15:19:10.723 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:19:10 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:19:10 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:19:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:10.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:10.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:10 np0005588920 nova_compute[226886]: 2026-01-20 15:19:10.904 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:19:11 np0005588920 nova_compute[226886]: 2026-01-20 15:19:11.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:19:11 np0005588920 nova_compute[226886]: 2026-01-20 15:19:11.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:19:11 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:19:11 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:19:11 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:19:12 np0005588920 nova_compute[226886]: 2026-01-20 15:19:12.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:19:12 np0005588920 nova_compute[226886]: 2026-01-20 15:19:12.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 20 10:19:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:12.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:12.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:12 np0005588920 nova_compute[226886]: 2026-01-20 15:19:12.977 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:19:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:19:13 np0005588920 podman[298764]: 2026-01-20 15:19:13.977546091 +0000 UTC m=+0.059714599 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 20 10:19:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:19:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:14.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:19:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:14.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:15 np0005588920 nova_compute[226886]: 2026-01-20 15:19:15.039 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:19:15 np0005588920 nova_compute[226886]: 2026-01-20 15:19:15.160 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:19:15 np0005588920 nova_compute[226886]: 2026-01-20 15:19:15.720 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:19:15 np0005588920 nova_compute[226886]: 2026-01-20 15:19:15.907 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:19:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:19:16.480 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:19:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:19:16.481 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:19:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:19:16.481 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:19:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:16.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:16.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:17 np0005588920 nova_compute[226886]: 2026-01-20 15:19:17.980 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:19:18 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:19:18 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:19:18 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:19:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:19:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:18.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:19:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:18.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:19:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:20.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:19:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:19:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:20.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:19:20 np0005588920 nova_compute[226886]: 2026-01-20 15:19:20.859 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768922345.8572364, fdf4cbee-a7b3-4d21-8289-63bc1e093b2c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 20 10:19:20 np0005588920 nova_compute[226886]: 2026-01-20 15:19:20.859 226890 INFO nova.compute.manager [-] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] VM Stopped (Lifecycle Event)
Jan 20 10:19:20 np0005588920 nova_compute[226886]: 2026-01-20 15:19:20.885 226890 DEBUG nova.compute.manager [None req-f5fa7b76-0ada-41d6-8f0b-ce5e031f2c55 - - - - - -] [instance: fdf4cbee-a7b3-4d21-8289-63bc1e093b2c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 10:19:20 np0005588920 nova_compute[226886]: 2026-01-20 15:19:20.910 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:19:22 np0005588920 nova_compute[226886]: 2026-01-20 15:19:22.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:19:22 np0005588920 nova_compute[226886]: 2026-01-20 15:19:22.726 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 20 10:19:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:22.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:19:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:22.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:19:22 np0005588920 nova_compute[226886]: 2026-01-20 15:19:22.809 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 20 10:19:22 np0005588920 nova_compute[226886]: 2026-01-20 15:19:22.809 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:19:23 np0005588920 nova_compute[226886]: 2026-01-20 15:19:23.021 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:19:23 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:19:23 np0005588920 nova_compute[226886]: 2026-01-20 15:19:23.770 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:19:23 np0005588920 nova_compute[226886]: 2026-01-20 15:19:23.770 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 20 10:19:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:24.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:24.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:25 np0005588920 nova_compute[226886]: 2026-01-20 15:19:25.913 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:19:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:26.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:26.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:28 np0005588920 nova_compute[226886]: 2026-01-20 15:19:28.022 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:19:28 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:19:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:19:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:28.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:19:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:28.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:30.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:30.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:30 np0005588920 nova_compute[226886]: 2026-01-20 15:19:30.917 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:19:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:32.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:32.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:33 np0005588920 podman[298836]: 2026-01-20 15:19:33.012743342 +0000 UTC m=+0.100851731 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 20 10:19:33 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:19:33 np0005588920 nova_compute[226886]: 2026-01-20 15:19:33.023 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:34.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:34.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:35 np0005588920 nova_compute[226886]: 2026-01-20 15:19:35.918 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:36.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:36.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:38 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:19:38 np0005588920 nova_compute[226886]: 2026-01-20 15:19:38.069 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:19:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:38.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:19:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:19:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:38.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:19:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:40.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:40.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:40 np0005588920 nova_compute[226886]: 2026-01-20 15:19:40.921 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:41 np0005588920 nova_compute[226886]: 2026-01-20 15:19:41.712 226890 DEBUG oslo_concurrency.lockutils [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "ab956bf8-a577-4777-ba2b-6d9dc5d035c3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:19:41 np0005588920 nova_compute[226886]: 2026-01-20 15:19:41.712 226890 DEBUG oslo_concurrency.lockutils [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "ab956bf8-a577-4777-ba2b-6d9dc5d035c3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:19:41 np0005588920 nova_compute[226886]: 2026-01-20 15:19:41.731 226890 DEBUG nova.compute.manager [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 10:19:41 np0005588920 nova_compute[226886]: 2026-01-20 15:19:41.878 226890 DEBUG oslo_concurrency.lockutils [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:19:41 np0005588920 nova_compute[226886]: 2026-01-20 15:19:41.879 226890 DEBUG oslo_concurrency.lockutils [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:19:41 np0005588920 nova_compute[226886]: 2026-01-20 15:19:41.889 226890 DEBUG nova.virt.hardware [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 10:19:41 np0005588920 nova_compute[226886]: 2026-01-20 15:19:41.889 226890 INFO nova.compute.claims [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 10:19:41 np0005588920 nova_compute[226886]: 2026-01-20 15:19:41.998 226890 DEBUG oslo_concurrency.processutils [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:19:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:19:42 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2107288519' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:19:42 np0005588920 nova_compute[226886]: 2026-01-20 15:19:42.420 226890 DEBUG oslo_concurrency.processutils [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:19:42 np0005588920 nova_compute[226886]: 2026-01-20 15:19:42.426 226890 DEBUG nova.compute.provider_tree [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:19:42 np0005588920 nova_compute[226886]: 2026-01-20 15:19:42.447 226890 DEBUG nova.scheduler.client.report [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:19:42 np0005588920 nova_compute[226886]: 2026-01-20 15:19:42.471 226890 DEBUG oslo_concurrency.lockutils [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.592s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:19:42 np0005588920 nova_compute[226886]: 2026-01-20 15:19:42.472 226890 DEBUG nova.compute.manager [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 10:19:42 np0005588920 nova_compute[226886]: 2026-01-20 15:19:42.559 226890 DEBUG nova.compute.manager [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 10:19:42 np0005588920 nova_compute[226886]: 2026-01-20 15:19:42.559 226890 DEBUG nova.network.neutron [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 10:19:42 np0005588920 nova_compute[226886]: 2026-01-20 15:19:42.578 226890 INFO nova.virt.libvirt.driver [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 10:19:42 np0005588920 nova_compute[226886]: 2026-01-20 15:19:42.596 226890 DEBUG nova.compute.manager [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 10:19:42 np0005588920 nova_compute[226886]: 2026-01-20 15:19:42.718 226890 DEBUG nova.compute.manager [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 10:19:42 np0005588920 nova_compute[226886]: 2026-01-20 15:19:42.719 226890 DEBUG nova.virt.libvirt.driver [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 10:19:42 np0005588920 nova_compute[226886]: 2026-01-20 15:19:42.719 226890 INFO nova.virt.libvirt.driver [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Creating image(s)#033[00m
Jan 20 10:19:42 np0005588920 nova_compute[226886]: 2026-01-20 15:19:42.742 226890 DEBUG nova.storage.rbd_utils [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] rbd image ab956bf8-a577-4777-ba2b-6d9dc5d035c3_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:19:42 np0005588920 nova_compute[226886]: 2026-01-20 15:19:42.766 226890 DEBUG nova.storage.rbd_utils [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] rbd image ab956bf8-a577-4777-ba2b-6d9dc5d035c3_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:19:42 np0005588920 nova_compute[226886]: 2026-01-20 15:19:42.790 226890 DEBUG nova.storage.rbd_utils [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] rbd image ab956bf8-a577-4777-ba2b-6d9dc5d035c3_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:19:42 np0005588920 nova_compute[226886]: 2026-01-20 15:19:42.793 226890 DEBUG oslo_concurrency.processutils [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:19:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:42.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:19:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:42.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:19:42 np0005588920 nova_compute[226886]: 2026-01-20 15:19:42.830 226890 DEBUG nova.policy [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '442a7a5cb8ea426a82be9762b262d171', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1ed5feeeafe7448a8efb47ab975b0ead', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 10:19:42 np0005588920 nova_compute[226886]: 2026-01-20 15:19:42.886 226890 DEBUG oslo_concurrency.processutils [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:19:42 np0005588920 nova_compute[226886]: 2026-01-20 15:19:42.887 226890 DEBUG oslo_concurrency.lockutils [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:19:42 np0005588920 nova_compute[226886]: 2026-01-20 15:19:42.888 226890 DEBUG oslo_concurrency.lockutils [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:19:42 np0005588920 nova_compute[226886]: 2026-01-20 15:19:42.888 226890 DEBUG oslo_concurrency.lockutils [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:19:42 np0005588920 nova_compute[226886]: 2026-01-20 15:19:42.918 226890 DEBUG nova.storage.rbd_utils [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] rbd image ab956bf8-a577-4777-ba2b-6d9dc5d035c3_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:19:42 np0005588920 nova_compute[226886]: 2026-01-20 15:19:42.922 226890 DEBUG oslo_concurrency.processutils [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 ab956bf8-a577-4777-ba2b-6d9dc5d035c3_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:19:43 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:19:43 np0005588920 nova_compute[226886]: 2026-01-20 15:19:43.071 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:43 np0005588920 nova_compute[226886]: 2026-01-20 15:19:43.801 226890 DEBUG oslo_concurrency.processutils [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 ab956bf8-a577-4777-ba2b-6d9dc5d035c3_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.879s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:19:43 np0005588920 nova_compute[226886]: 2026-01-20 15:19:43.866 226890 DEBUG nova.storage.rbd_utils [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] resizing rbd image ab956bf8-a577-4777-ba2b-6d9dc5d035c3_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 10:19:43 np0005588920 nova_compute[226886]: 2026-01-20 15:19:43.962 226890 DEBUG nova.objects.instance [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lazy-loading 'migration_context' on Instance uuid ab956bf8-a577-4777-ba2b-6d9dc5d035c3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:19:43 np0005588920 nova_compute[226886]: 2026-01-20 15:19:43.983 226890 DEBUG nova.virt.libvirt.driver [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 10:19:43 np0005588920 nova_compute[226886]: 2026-01-20 15:19:43.984 226890 DEBUG nova.virt.libvirt.driver [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Ensure instance console log exists: /var/lib/nova/instances/ab956bf8-a577-4777-ba2b-6d9dc5d035c3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:19:43 np0005588920 nova_compute[226886]: 2026-01-20 15:19:43.984 226890 DEBUG oslo_concurrency.lockutils [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:19:43 np0005588920 nova_compute[226886]: 2026-01-20 15:19:43.985 226890 DEBUG oslo_concurrency.lockutils [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:19:43 np0005588920 nova_compute[226886]: 2026-01-20 15:19:43.985 226890 DEBUG oslo_concurrency.lockutils [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:19:44 np0005588920 nova_compute[226886]: 2026-01-20 15:19:44.006 226890 DEBUG nova.network.neutron [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Successfully created port: 4f0144e3-a50d-4d4b-a5de-1df8e869d27b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 10:19:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:44.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:19:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:44.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:19:44 np0005588920 podman[299053]: 2026-01-20 15:19:44.962026853 +0000 UTC m=+0.049506990 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 20 10:19:45 np0005588920 nova_compute[226886]: 2026-01-20 15:19:45.197 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:19:45.197 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=65, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=64) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:19:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:19:45.199 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:19:45 np0005588920 nova_compute[226886]: 2026-01-20 15:19:45.924 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:46.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:46.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:47 np0005588920 nova_compute[226886]: 2026-01-20 15:19:47.030 226890 DEBUG nova.network.neutron [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Successfully updated port: 4f0144e3-a50d-4d4b-a5de-1df8e869d27b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 10:19:47 np0005588920 nova_compute[226886]: 2026-01-20 15:19:47.062 226890 DEBUG oslo_concurrency.lockutils [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "refresh_cache-ab956bf8-a577-4777-ba2b-6d9dc5d035c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:19:47 np0005588920 nova_compute[226886]: 2026-01-20 15:19:47.063 226890 DEBUG oslo_concurrency.lockutils [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquired lock "refresh_cache-ab956bf8-a577-4777-ba2b-6d9dc5d035c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:19:47 np0005588920 nova_compute[226886]: 2026-01-20 15:19:47.063 226890 DEBUG nova.network.neutron [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:19:47 np0005588920 nova_compute[226886]: 2026-01-20 15:19:47.230 226890 DEBUG nova.compute.manager [req-ac3432e2-899a-4c59-8266-639ff8d1da42 req-6434680b-0fbb-48e3-b41b-6f3e9d284fc1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Received event network-changed-4f0144e3-a50d-4d4b-a5de-1df8e869d27b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:19:47 np0005588920 nova_compute[226886]: 2026-01-20 15:19:47.231 226890 DEBUG nova.compute.manager [req-ac3432e2-899a-4c59-8266-639ff8d1da42 req-6434680b-0fbb-48e3-b41b-6f3e9d284fc1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Refreshing instance network info cache due to event network-changed-4f0144e3-a50d-4d4b-a5de-1df8e869d27b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:19:47 np0005588920 nova_compute[226886]: 2026-01-20 15:19:47.231 226890 DEBUG oslo_concurrency.lockutils [req-ac3432e2-899a-4c59-8266-639ff8d1da42 req-6434680b-0fbb-48e3-b41b-6f3e9d284fc1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-ab956bf8-a577-4777-ba2b-6d9dc5d035c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:19:47 np0005588920 nova_compute[226886]: 2026-01-20 15:19:47.929 226890 DEBUG nova.network.neutron [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:19:48 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:19:48 np0005588920 nova_compute[226886]: 2026-01-20 15:19:48.073 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:19:48.201 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '65'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:19:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:48.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:48.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:49 np0005588920 nova_compute[226886]: 2026-01-20 15:19:49.298 226890 DEBUG nova.network.neutron [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Updating instance_info_cache with network_info: [{"id": "4f0144e3-a50d-4d4b-a5de-1df8e869d27b", "address": "fa:16:3e:35:53:6a", "network": {"id": "82c13a27-83c9-4924-96ac-e10cebda58f4", "bridge": "br-int", "label": "tempest-network-smoke--271331337", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f0144e3-a5", "ovs_interfaceid": "4f0144e3-a50d-4d4b-a5de-1df8e869d27b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:19:49 np0005588920 nova_compute[226886]: 2026-01-20 15:19:49.353 226890 DEBUG oslo_concurrency.lockutils [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Releasing lock "refresh_cache-ab956bf8-a577-4777-ba2b-6d9dc5d035c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:19:49 np0005588920 nova_compute[226886]: 2026-01-20 15:19:49.354 226890 DEBUG nova.compute.manager [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Instance network_info: |[{"id": "4f0144e3-a50d-4d4b-a5de-1df8e869d27b", "address": "fa:16:3e:35:53:6a", "network": {"id": "82c13a27-83c9-4924-96ac-e10cebda58f4", "bridge": "br-int", "label": "tempest-network-smoke--271331337", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f0144e3-a5", "ovs_interfaceid": "4f0144e3-a50d-4d4b-a5de-1df8e869d27b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 10:19:49 np0005588920 nova_compute[226886]: 2026-01-20 15:19:49.354 226890 DEBUG oslo_concurrency.lockutils [req-ac3432e2-899a-4c59-8266-639ff8d1da42 req-6434680b-0fbb-48e3-b41b-6f3e9d284fc1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-ab956bf8-a577-4777-ba2b-6d9dc5d035c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:19:49 np0005588920 nova_compute[226886]: 2026-01-20 15:19:49.354 226890 DEBUG nova.network.neutron [req-ac3432e2-899a-4c59-8266-639ff8d1da42 req-6434680b-0fbb-48e3-b41b-6f3e9d284fc1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Refreshing network info cache for port 4f0144e3-a50d-4d4b-a5de-1df8e869d27b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:19:49 np0005588920 nova_compute[226886]: 2026-01-20 15:19:49.357 226890 DEBUG nova.virt.libvirt.driver [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Start _get_guest_xml network_info=[{"id": "4f0144e3-a50d-4d4b-a5de-1df8e869d27b", "address": "fa:16:3e:35:53:6a", "network": {"id": "82c13a27-83c9-4924-96ac-e10cebda58f4", "bridge": "br-int", "label": "tempest-network-smoke--271331337", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f0144e3-a5", "ovs_interfaceid": "4f0144e3-a50d-4d4b-a5de-1df8e869d27b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:19:49 np0005588920 nova_compute[226886]: 2026-01-20 15:19:49.362 226890 WARNING nova.virt.libvirt.driver [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:19:49 np0005588920 nova_compute[226886]: 2026-01-20 15:19:49.367 226890 DEBUG nova.virt.libvirt.host [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:19:49 np0005588920 nova_compute[226886]: 2026-01-20 15:19:49.368 226890 DEBUG nova.virt.libvirt.host [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:19:49 np0005588920 nova_compute[226886]: 2026-01-20 15:19:49.371 226890 DEBUG nova.virt.libvirt.host [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:19:49 np0005588920 nova_compute[226886]: 2026-01-20 15:19:49.371 226890 DEBUG nova.virt.libvirt.host [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:19:49 np0005588920 nova_compute[226886]: 2026-01-20 15:19:49.373 226890 DEBUG nova.virt.libvirt.driver [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:19:49 np0005588920 nova_compute[226886]: 2026-01-20 15:19:49.373 226890 DEBUG nova.virt.hardware [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:19:49 np0005588920 nova_compute[226886]: 2026-01-20 15:19:49.373 226890 DEBUG nova.virt.hardware [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:19:49 np0005588920 nova_compute[226886]: 2026-01-20 15:19:49.373 226890 DEBUG nova.virt.hardware [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:19:49 np0005588920 nova_compute[226886]: 2026-01-20 15:19:49.374 226890 DEBUG nova.virt.hardware [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:19:49 np0005588920 nova_compute[226886]: 2026-01-20 15:19:49.374 226890 DEBUG nova.virt.hardware [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:19:49 np0005588920 nova_compute[226886]: 2026-01-20 15:19:49.374 226890 DEBUG nova.virt.hardware [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:19:49 np0005588920 nova_compute[226886]: 2026-01-20 15:19:49.374 226890 DEBUG nova.virt.hardware [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:19:49 np0005588920 nova_compute[226886]: 2026-01-20 15:19:49.374 226890 DEBUG nova.virt.hardware [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:19:49 np0005588920 nova_compute[226886]: 2026-01-20 15:19:49.375 226890 DEBUG nova.virt.hardware [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:19:49 np0005588920 nova_compute[226886]: 2026-01-20 15:19:49.375 226890 DEBUG nova.virt.hardware [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:19:49 np0005588920 nova_compute[226886]: 2026-01-20 15:19:49.375 226890 DEBUG nova.virt.hardware [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:19:49 np0005588920 nova_compute[226886]: 2026-01-20 15:19:49.378 226890 DEBUG oslo_concurrency.processutils [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:19:49 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:19:49 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3438431235' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:19:49 np0005588920 nova_compute[226886]: 2026-01-20 15:19:49.784 226890 DEBUG oslo_concurrency.processutils [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.406s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:19:49 np0005588920 nova_compute[226886]: 2026-01-20 15:19:49.807 226890 DEBUG nova.storage.rbd_utils [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] rbd image ab956bf8-a577-4777-ba2b-6d9dc5d035c3_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:19:49 np0005588920 nova_compute[226886]: 2026-01-20 15:19:49.811 226890 DEBUG oslo_concurrency.processutils [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:19:50 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:19:50 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3654454720' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:19:50 np0005588920 nova_compute[226886]: 2026-01-20 15:19:50.224 226890 DEBUG oslo_concurrency.processutils [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.414s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:19:50 np0005588920 nova_compute[226886]: 2026-01-20 15:19:50.226 226890 DEBUG nova.virt.libvirt.vif [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:19:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1243081842',display_name='tempest-TestNetworkAdvancedServerOps-server-1243081842',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1243081842',id=196,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBkV5IF/PoEe/efZhnfKLeJldGZu0kKuhfpPH+OgoUHVkAG0n9Dg/cy0touqzeR1Y19Ga3k7oWWkeK3V3PfTtFX8AqRCXGo4UIGa3DLJqI8BlqSEN/MrM2dwEtiVEOyBPg==',key_name='tempest-TestNetworkAdvancedServerOps-339065355',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1ed5feeeafe7448a8efb47ab975b0ead',ramdisk_id='',reservation_id='r-f24dba2l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-175282664',owner_user_name='tempest-TestNetworkAdvancedServerOps-175282664-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:19:42Z,user_data=None,user_id='442a7a5cb8ea426a82be9762b262d171',uuid=ab956bf8-a577-4777-ba2b-6d9dc5d035c3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4f0144e3-a50d-4d4b-a5de-1df8e869d27b", "address": "fa:16:3e:35:53:6a", "network": {"id": "82c13a27-83c9-4924-96ac-e10cebda58f4", "bridge": "br-int", "label": "tempest-network-smoke--271331337", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f0144e3-a5", "ovs_interfaceid": "4f0144e3-a50d-4d4b-a5de-1df8e869d27b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:19:50 np0005588920 nova_compute[226886]: 2026-01-20 15:19:50.227 226890 DEBUG nova.network.os_vif_util [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converting VIF {"id": "4f0144e3-a50d-4d4b-a5de-1df8e869d27b", "address": "fa:16:3e:35:53:6a", "network": {"id": "82c13a27-83c9-4924-96ac-e10cebda58f4", "bridge": "br-int", "label": "tempest-network-smoke--271331337", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f0144e3-a5", "ovs_interfaceid": "4f0144e3-a50d-4d4b-a5de-1df8e869d27b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:19:50 np0005588920 nova_compute[226886]: 2026-01-20 15:19:50.228 226890 DEBUG nova.network.os_vif_util [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:53:6a,bridge_name='br-int',has_traffic_filtering=True,id=4f0144e3-a50d-4d4b-a5de-1df8e869d27b,network=Network(82c13a27-83c9-4924-96ac-e10cebda58f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f0144e3-a5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:19:50 np0005588920 nova_compute[226886]: 2026-01-20 15:19:50.229 226890 DEBUG nova.objects.instance [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lazy-loading 'pci_devices' on Instance uuid ab956bf8-a577-4777-ba2b-6d9dc5d035c3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:19:50 np0005588920 nova_compute[226886]: 2026-01-20 15:19:50.255 226890 DEBUG nova.virt.libvirt.driver [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:19:50 np0005588920 nova_compute[226886]:  <uuid>ab956bf8-a577-4777-ba2b-6d9dc5d035c3</uuid>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:  <name>instance-000000c4</name>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:19:50 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-1243081842</nova:name>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 15:19:49</nova:creationTime>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 10:19:50 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:        <nova:user uuid="442a7a5cb8ea426a82be9762b262d171">tempest-TestNetworkAdvancedServerOps-175282664-project-member</nova:user>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:        <nova:project uuid="1ed5feeeafe7448a8efb47ab975b0ead">tempest-TestNetworkAdvancedServerOps-175282664</nova:project>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:        <nova:port uuid="4f0144e3-a50d-4d4b-a5de-1df8e869d27b">
Jan 20 10:19:50 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    <system>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:      <entry name="serial">ab956bf8-a577-4777-ba2b-6d9dc5d035c3</entry>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:      <entry name="uuid">ab956bf8-a577-4777-ba2b-6d9dc5d035c3</entry>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    </system>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:  <os>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:  </os>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:  <features>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:  </features>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:  </clock>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:  <devices>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 10:19:50 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/ab956bf8-a577-4777-ba2b-6d9dc5d035c3_disk">
Jan 20 10:19:50 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:19:50 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 10:19:50 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/ab956bf8-a577-4777-ba2b-6d9dc5d035c3_disk.config">
Jan 20 10:19:50 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:19:50 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 10:19:50 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:35:53:6a"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:      <target dev="tap4f0144e3-a5"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    </interface>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 10:19:50 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/ab956bf8-a577-4777-ba2b-6d9dc5d035c3/console.log" append="off"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    </serial>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    <video>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    </video>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 10:19:50 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    </rng>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 10:19:50 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 10:19:50 np0005588920 nova_compute[226886]:  </devices>
Jan 20 10:19:50 np0005588920 nova_compute[226886]: </domain>
Jan 20 10:19:50 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:19:50 np0005588920 nova_compute[226886]: 2026-01-20 15:19:50.257 226890 DEBUG nova.compute.manager [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Preparing to wait for external event network-vif-plugged-4f0144e3-a50d-4d4b-a5de-1df8e869d27b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:19:50 np0005588920 nova_compute[226886]: 2026-01-20 15:19:50.257 226890 DEBUG oslo_concurrency.lockutils [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "ab956bf8-a577-4777-ba2b-6d9dc5d035c3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:19:50 np0005588920 nova_compute[226886]: 2026-01-20 15:19:50.258 226890 DEBUG oslo_concurrency.lockutils [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "ab956bf8-a577-4777-ba2b-6d9dc5d035c3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:19:50 np0005588920 nova_compute[226886]: 2026-01-20 15:19:50.258 226890 DEBUG oslo_concurrency.lockutils [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "ab956bf8-a577-4777-ba2b-6d9dc5d035c3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:19:50 np0005588920 nova_compute[226886]: 2026-01-20 15:19:50.259 226890 DEBUG nova.virt.libvirt.vif [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:19:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1243081842',display_name='tempest-TestNetworkAdvancedServerOps-server-1243081842',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1243081842',id=196,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBkV5IF/PoEe/efZhnfKLeJldGZu0kKuhfpPH+OgoUHVkAG0n9Dg/cy0touqzeR1Y19Ga3k7oWWkeK3V3PfTtFX8AqRCXGo4UIGa3DLJqI8BlqSEN/MrM2dwEtiVEOyBPg==',key_name='tempest-TestNetworkAdvancedServerOps-339065355',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1ed5feeeafe7448a8efb47ab975b0ead',ramdisk_id='',reservation_id='r-f24dba2l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-175282664',owner_user_name='tempest-TestNetworkAdvancedServerOps-175282664-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:19:42Z,user_data=None,user_id='442a7a5cb8ea426a82be9762b262d171',uuid=ab956bf8-a577-4777-ba2b-6d9dc5d035c3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4f0144e3-a50d-4d4b-a5de-1df8e869d27b", "address": "fa:16:3e:35:53:6a", "network": {"id": "82c13a27-83c9-4924-96ac-e10cebda58f4", "bridge": "br-int", "label": "tempest-network-smoke--271331337", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f0144e3-a5", "ovs_interfaceid": "4f0144e3-a50d-4d4b-a5de-1df8e869d27b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:19:50 np0005588920 nova_compute[226886]: 2026-01-20 15:19:50.259 226890 DEBUG nova.network.os_vif_util [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converting VIF {"id": "4f0144e3-a50d-4d4b-a5de-1df8e869d27b", "address": "fa:16:3e:35:53:6a", "network": {"id": "82c13a27-83c9-4924-96ac-e10cebda58f4", "bridge": "br-int", "label": "tempest-network-smoke--271331337", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f0144e3-a5", "ovs_interfaceid": "4f0144e3-a50d-4d4b-a5de-1df8e869d27b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:19:50 np0005588920 nova_compute[226886]: 2026-01-20 15:19:50.260 226890 DEBUG nova.network.os_vif_util [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:53:6a,bridge_name='br-int',has_traffic_filtering=True,id=4f0144e3-a50d-4d4b-a5de-1df8e869d27b,network=Network(82c13a27-83c9-4924-96ac-e10cebda58f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f0144e3-a5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:19:50 np0005588920 nova_compute[226886]: 2026-01-20 15:19:50.260 226890 DEBUG os_vif [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:53:6a,bridge_name='br-int',has_traffic_filtering=True,id=4f0144e3-a50d-4d4b-a5de-1df8e869d27b,network=Network(82c13a27-83c9-4924-96ac-e10cebda58f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f0144e3-a5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:19:50 np0005588920 nova_compute[226886]: 2026-01-20 15:19:50.261 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:50 np0005588920 nova_compute[226886]: 2026-01-20 15:19:50.261 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:19:50 np0005588920 nova_compute[226886]: 2026-01-20 15:19:50.262 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:19:50 np0005588920 nova_compute[226886]: 2026-01-20 15:19:50.264 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:50 np0005588920 nova_compute[226886]: 2026-01-20 15:19:50.264 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4f0144e3-a5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:19:50 np0005588920 nova_compute[226886]: 2026-01-20 15:19:50.265 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4f0144e3-a5, col_values=(('external_ids', {'iface-id': '4f0144e3-a50d-4d4b-a5de-1df8e869d27b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:35:53:6a', 'vm-uuid': 'ab956bf8-a577-4777-ba2b-6d9dc5d035c3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:19:50 np0005588920 nova_compute[226886]: 2026-01-20 15:19:50.266 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:50 np0005588920 NetworkManager[49076]: <info>  [1768922390.2670] manager: (tap4f0144e3-a5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/421)
Jan 20 10:19:50 np0005588920 nova_compute[226886]: 2026-01-20 15:19:50.268 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:19:50 np0005588920 nova_compute[226886]: 2026-01-20 15:19:50.272 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:50 np0005588920 nova_compute[226886]: 2026-01-20 15:19:50.273 226890 INFO os_vif [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:53:6a,bridge_name='br-int',has_traffic_filtering=True,id=4f0144e3-a50d-4d4b-a5de-1df8e869d27b,network=Network(82c13a27-83c9-4924-96ac-e10cebda58f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f0144e3-a5')#033[00m
Jan 20 10:19:50 np0005588920 nova_compute[226886]: 2026-01-20 15:19:50.376 226890 DEBUG nova.virt.libvirt.driver [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:19:50 np0005588920 nova_compute[226886]: 2026-01-20 15:19:50.376 226890 DEBUG nova.virt.libvirt.driver [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:19:50 np0005588920 nova_compute[226886]: 2026-01-20 15:19:50.376 226890 DEBUG nova.virt.libvirt.driver [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] No VIF found with MAC fa:16:3e:35:53:6a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:19:50 np0005588920 nova_compute[226886]: 2026-01-20 15:19:50.377 226890 INFO nova.virt.libvirt.driver [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Using config drive#033[00m
Jan 20 10:19:50 np0005588920 nova_compute[226886]: 2026-01-20 15:19:50.401 226890 DEBUG nova.storage.rbd_utils [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] rbd image ab956bf8-a577-4777-ba2b-6d9dc5d035c3_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:19:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:50.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:19:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:50.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:19:51 np0005588920 nova_compute[226886]: 2026-01-20 15:19:51.202 226890 INFO nova.virt.libvirt.driver [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Creating config drive at /var/lib/nova/instances/ab956bf8-a577-4777-ba2b-6d9dc5d035c3/disk.config#033[00m
Jan 20 10:19:51 np0005588920 nova_compute[226886]: 2026-01-20 15:19:51.206 226890 DEBUG oslo_concurrency.processutils [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ab956bf8-a577-4777-ba2b-6d9dc5d035c3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpp5iaf5z7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:19:51 np0005588920 nova_compute[226886]: 2026-01-20 15:19:51.343 226890 DEBUG oslo_concurrency.processutils [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ab956bf8-a577-4777-ba2b-6d9dc5d035c3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpp5iaf5z7" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:19:51 np0005588920 nova_compute[226886]: 2026-01-20 15:19:51.381 226890 DEBUG nova.storage.rbd_utils [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] rbd image ab956bf8-a577-4777-ba2b-6d9dc5d035c3_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:19:51 np0005588920 nova_compute[226886]: 2026-01-20 15:19:51.387 226890 DEBUG oslo_concurrency.processutils [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/ab956bf8-a577-4777-ba2b-6d9dc5d035c3/disk.config ab956bf8-a577-4777-ba2b-6d9dc5d035c3_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:19:51 np0005588920 nova_compute[226886]: 2026-01-20 15:19:51.430 226890 DEBUG nova.network.neutron [req-ac3432e2-899a-4c59-8266-639ff8d1da42 req-6434680b-0fbb-48e3-b41b-6f3e9d284fc1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Updated VIF entry in instance network info cache for port 4f0144e3-a50d-4d4b-a5de-1df8e869d27b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:19:51 np0005588920 nova_compute[226886]: 2026-01-20 15:19:51.431 226890 DEBUG nova.network.neutron [req-ac3432e2-899a-4c59-8266-639ff8d1da42 req-6434680b-0fbb-48e3-b41b-6f3e9d284fc1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Updating instance_info_cache with network_info: [{"id": "4f0144e3-a50d-4d4b-a5de-1df8e869d27b", "address": "fa:16:3e:35:53:6a", "network": {"id": "82c13a27-83c9-4924-96ac-e10cebda58f4", "bridge": "br-int", "label": "tempest-network-smoke--271331337", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f0144e3-a5", "ovs_interfaceid": "4f0144e3-a50d-4d4b-a5de-1df8e869d27b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:19:51 np0005588920 nova_compute[226886]: 2026-01-20 15:19:51.459 226890 DEBUG oslo_concurrency.lockutils [req-ac3432e2-899a-4c59-8266-639ff8d1da42 req-6434680b-0fbb-48e3-b41b-6f3e9d284fc1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-ab956bf8-a577-4777-ba2b-6d9dc5d035c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:19:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:19:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:52.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:19:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:52.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:53 np0005588920 nova_compute[226886]: 2026-01-20 15:19:53.122 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:53 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:19:53 np0005588920 nova_compute[226886]: 2026-01-20 15:19:53.966 226890 DEBUG oslo_concurrency.processutils [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/ab956bf8-a577-4777-ba2b-6d9dc5d035c3/disk.config ab956bf8-a577-4777-ba2b-6d9dc5d035c3_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.579s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:19:53 np0005588920 nova_compute[226886]: 2026-01-20 15:19:53.967 226890 INFO nova.virt.libvirt.driver [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Deleting local config drive /var/lib/nova/instances/ab956bf8-a577-4777-ba2b-6d9dc5d035c3/disk.config because it was imported into RBD.#033[00m
Jan 20 10:19:54 np0005588920 kernel: tap4f0144e3-a5: entered promiscuous mode
Jan 20 10:19:54 np0005588920 ovn_controller[133971]: 2026-01-20T15:19:54Z|00911|binding|INFO|Claiming lport 4f0144e3-a50d-4d4b-a5de-1df8e869d27b for this chassis.
Jan 20 10:19:54 np0005588920 ovn_controller[133971]: 2026-01-20T15:19:54Z|00912|binding|INFO|4f0144e3-a50d-4d4b-a5de-1df8e869d27b: Claiming fa:16:3e:35:53:6a 10.100.0.13
Jan 20 10:19:54 np0005588920 NetworkManager[49076]: <info>  [1768922394.0206] manager: (tap4f0144e3-a5): new Tun device (/org/freedesktop/NetworkManager/Devices/422)
Jan 20 10:19:54 np0005588920 nova_compute[226886]: 2026-01-20 15:19:54.020 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:54 np0005588920 nova_compute[226886]: 2026-01-20 15:19:54.024 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:54 np0005588920 nova_compute[226886]: 2026-01-20 15:19:54.026 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:54 np0005588920 systemd-machined[196121]: New machine qemu-94-instance-000000c4.
Jan 20 10:19:54 np0005588920 systemd-udevd[299209]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:19:54 np0005588920 NetworkManager[49076]: <info>  [1768922394.0616] device (tap4f0144e3-a5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:19:54 np0005588920 NetworkManager[49076]: <info>  [1768922394.0627] device (tap4f0144e3-a5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:19:54 np0005588920 systemd[1]: Started Virtual Machine qemu-94-instance-000000c4.
Jan 20 10:19:54 np0005588920 nova_compute[226886]: 2026-01-20 15:19:54.087 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:54 np0005588920 ovn_controller[133971]: 2026-01-20T15:19:54Z|00913|binding|INFO|Setting lport 4f0144e3-a50d-4d4b-a5de-1df8e869d27b ovn-installed in OVS
Jan 20 10:19:54 np0005588920 nova_compute[226886]: 2026-01-20 15:19:54.090 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:54 np0005588920 ovn_controller[133971]: 2026-01-20T15:19:54Z|00914|binding|INFO|Setting lport 4f0144e3-a50d-4d4b-a5de-1df8e869d27b up in Southbound
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:19:54.245 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:35:53:6a 10.100.0.13'], port_security=['fa:16:3e:35:53:6a 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'ab956bf8-a577-4777-ba2b-6d9dc5d035c3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-82c13a27-83c9-4924-96ac-e10cebda58f4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ed5feeeafe7448a8efb47ab975b0ead', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7b59f953-14ac-473e-b4dc-024834dd332d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=18447654-2510-49ce-8582-3111d554be36, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=4f0144e3-a50d-4d4b-a5de-1df8e869d27b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:19:54.246 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 4f0144e3-a50d-4d4b-a5de-1df8e869d27b in datapath 82c13a27-83c9-4924-96ac-e10cebda58f4 bound to our chassis#033[00m
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:19:54.248 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 82c13a27-83c9-4924-96ac-e10cebda58f4#033[00m
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:19:54.257 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[8654ea8a-bf32-415b-9b42-53691df9cfc9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:19:54.258 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap82c13a27-81 in ovnmeta-82c13a27-83c9-4924-96ac-e10cebda58f4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:19:54.262 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap82c13a27-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:19:54.262 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[36eae0ae-5bdb-4907-94d8-36207699e5a2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:19:54.264 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[cb3c09f9-32bc-493d-80c2-e8e05759a605]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:19:54.273 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[7f3984ba-824a-409d-b66c-7d519b2d2c45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:19:54.289 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[4a8a6025-f67d-4e9a-971d-ec4f5b6b2fab]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:19:54.321 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[370f9a28-77ac-4a39-8820-f26a5ea10fa4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:19:54 np0005588920 NetworkManager[49076]: <info>  [1768922394.3276] manager: (tap82c13a27-80): new Veth device (/org/freedesktop/NetworkManager/Devices/423)
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:19:54.327 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5a0893cd-b834-4978-bc0e-cd65b764b0ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:19:54.360 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[7c04f514-4e72-4b57-94d2-1136ddba1b37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:19:54.364 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[d1e67471-d992-4bbd-9d11-a654ca937af9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:19:54 np0005588920 NetworkManager[49076]: <info>  [1768922394.3886] device (tap82c13a27-80): carrier: link connected
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:19:54.393 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[1615ca81-e5ce-44f0-88a6-66cd380344e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:19:54.409 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[214c12b1-097e-4b20-8ffd-37564da0254d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap82c13a27-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:69:ce:23'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 286], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 743607, 'reachable_time': 38653, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 299242, 'error': None, 'target': 'ovnmeta-82c13a27-83c9-4924-96ac-e10cebda58f4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:19:54.423 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[42897888-1eec-43e2-84f2-58873d050462]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe69:ce23'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 743607, 'tstamp': 743607}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 299243, 'error': None, 'target': 'ovnmeta-82c13a27-83c9-4924-96ac-e10cebda58f4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:19:54.438 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[3e3a3e1f-bff4-41d9-bdeb-b6435e735fd4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap82c13a27-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:69:ce:23'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 286], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 743607, 'reachable_time': 38653, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 299244, 'error': None, 'target': 'ovnmeta-82c13a27-83c9-4924-96ac-e10cebda58f4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:19:54.466 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[268de6d2-f5ac-448a-bc32-9a2a718dc477]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:19:54.529 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[951b3b4c-e3a2-48ea-bbb4-ec557cfd8029]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:19:54.530 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap82c13a27-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:19:54.530 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:19:54.531 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap82c13a27-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:19:54 np0005588920 nova_compute[226886]: 2026-01-20 15:19:54.532 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:54 np0005588920 kernel: tap82c13a27-80: entered promiscuous mode
Jan 20 10:19:54 np0005588920 NetworkManager[49076]: <info>  [1768922394.5331] manager: (tap82c13a27-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/424)
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:19:54.535 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap82c13a27-80, col_values=(('external_ids', {'iface-id': 'c1d1ab2c-11f3-48a0-9554-4aff1d051db4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:19:54 np0005588920 nova_compute[226886]: 2026-01-20 15:19:54.536 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:54 np0005588920 ovn_controller[133971]: 2026-01-20T15:19:54Z|00915|binding|INFO|Releasing lport c1d1ab2c-11f3-48a0-9554-4aff1d051db4 from this chassis (sb_readonly=1)
Jan 20 10:19:54 np0005588920 nova_compute[226886]: 2026-01-20 15:19:54.551 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:19:54.551 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/82c13a27-83c9-4924-96ac-e10cebda58f4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/82c13a27-83c9-4924-96ac-e10cebda58f4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:19:54.552 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[de520789-d610-48dc-92cf-1dffe999f4fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:19:54.553 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-82c13a27-83c9-4924-96ac-e10cebda58f4
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/82c13a27-83c9-4924-96ac-e10cebda58f4.pid.haproxy
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID 82c13a27-83c9-4924-96ac-e10cebda58f4
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:19:54 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:19:54.554 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-82c13a27-83c9-4924-96ac-e10cebda58f4', 'env', 'PROCESS_TAG=haproxy-82c13a27-83c9-4924-96ac-e10cebda58f4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/82c13a27-83c9-4924-96ac-e10cebda58f4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:19:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:19:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:54.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:19:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:54.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:54 np0005588920 podman[299306]: 2026-01-20 15:19:54.874959944 +0000 UTC m=+0.022217989 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:19:54 np0005588920 podman[299306]: 2026-01-20 15:19:54.972475791 +0000 UTC m=+0.119733816 container create 0de1189be78d23fa0a5a4495657dca02f7bbaeedda2f1d5dd694afbeee55bbef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82c13a27-83c9-4924-96ac-e10cebda58f4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:19:55 np0005588920 systemd[1]: Started libpod-conmon-0de1189be78d23fa0a5a4495657dca02f7bbaeedda2f1d5dd694afbeee55bbef.scope.
Jan 20 10:19:55 np0005588920 systemd[1]: Started libcrun container.
Jan 20 10:19:55 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/22ca75e4da7e5c9c576cf5621054d82dcd3c7b0436a53a580f31bc41dc324a56/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:19:55 np0005588920 podman[299306]: 2026-01-20 15:19:55.042610903 +0000 UTC m=+0.189868948 container init 0de1189be78d23fa0a5a4495657dca02f7bbaeedda2f1d5dd694afbeee55bbef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82c13a27-83c9-4924-96ac-e10cebda58f4, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 20 10:19:55 np0005588920 podman[299306]: 2026-01-20 15:19:55.050068514 +0000 UTC m=+0.197326549 container start 0de1189be78d23fa0a5a4495657dca02f7bbaeedda2f1d5dd694afbeee55bbef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82c13a27-83c9-4924-96ac-e10cebda58f4, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:19:55 np0005588920 neutron-haproxy-ovnmeta-82c13a27-83c9-4924-96ac-e10cebda58f4[299324]: [NOTICE]   (299333) : New worker (299336) forked
Jan 20 10:19:55 np0005588920 neutron-haproxy-ovnmeta-82c13a27-83c9-4924-96ac-e10cebda58f4[299324]: [NOTICE]   (299333) : Loading success.
Jan 20 10:19:55 np0005588920 nova_compute[226886]: 2026-01-20 15:19:55.100 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768922395.0995996, ab956bf8-a577-4777-ba2b-6d9dc5d035c3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:19:55 np0005588920 nova_compute[226886]: 2026-01-20 15:19:55.100 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] VM Started (Lifecycle Event)#033[00m
Jan 20 10:19:55 np0005588920 nova_compute[226886]: 2026-01-20 15:19:55.153 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:19:55 np0005588920 nova_compute[226886]: 2026-01-20 15:19:55.158 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768922395.099864, ab956bf8-a577-4777-ba2b-6d9dc5d035c3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:19:55 np0005588920 nova_compute[226886]: 2026-01-20 15:19:55.159 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:19:55 np0005588920 nova_compute[226886]: 2026-01-20 15:19:55.191 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:19:55 np0005588920 nova_compute[226886]: 2026-01-20 15:19:55.195 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:19:55 np0005588920 nova_compute[226886]: 2026-01-20 15:19:55.266 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:55 np0005588920 nova_compute[226886]: 2026-01-20 15:19:55.275 226890 DEBUG nova.compute.manager [req-4a747ae6-63ed-4a6a-bb08-c3ef677f92fe req-670d201a-9c9a-4649-8d3b-d0561d16fd77 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Received event network-vif-plugged-4f0144e3-a50d-4d4b-a5de-1df8e869d27b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:19:55 np0005588920 nova_compute[226886]: 2026-01-20 15:19:55.275 226890 DEBUG oslo_concurrency.lockutils [req-4a747ae6-63ed-4a6a-bb08-c3ef677f92fe req-670d201a-9c9a-4649-8d3b-d0561d16fd77 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "ab956bf8-a577-4777-ba2b-6d9dc5d035c3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:19:55 np0005588920 nova_compute[226886]: 2026-01-20 15:19:55.276 226890 DEBUG oslo_concurrency.lockutils [req-4a747ae6-63ed-4a6a-bb08-c3ef677f92fe req-670d201a-9c9a-4649-8d3b-d0561d16fd77 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "ab956bf8-a577-4777-ba2b-6d9dc5d035c3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:19:55 np0005588920 nova_compute[226886]: 2026-01-20 15:19:55.276 226890 DEBUG oslo_concurrency.lockutils [req-4a747ae6-63ed-4a6a-bb08-c3ef677f92fe req-670d201a-9c9a-4649-8d3b-d0561d16fd77 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "ab956bf8-a577-4777-ba2b-6d9dc5d035c3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:19:55 np0005588920 nova_compute[226886]: 2026-01-20 15:19:55.276 226890 DEBUG nova.compute.manager [req-4a747ae6-63ed-4a6a-bb08-c3ef677f92fe req-670d201a-9c9a-4649-8d3b-d0561d16fd77 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Processing event network-vif-plugged-4f0144e3-a50d-4d4b-a5de-1df8e869d27b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 10:19:55 np0005588920 nova_compute[226886]: 2026-01-20 15:19:55.277 226890 DEBUG nova.compute.manager [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:19:55 np0005588920 nova_compute[226886]: 2026-01-20 15:19:55.280 226890 DEBUG nova.virt.libvirt.driver [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:19:55 np0005588920 nova_compute[226886]: 2026-01-20 15:19:55.283 226890 INFO nova.virt.libvirt.driver [-] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Instance spawned successfully.#033[00m
Jan 20 10:19:55 np0005588920 nova_compute[226886]: 2026-01-20 15:19:55.284 226890 DEBUG nova.virt.libvirt.driver [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 10:19:55 np0005588920 nova_compute[226886]: 2026-01-20 15:19:55.292 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:19:55 np0005588920 nova_compute[226886]: 2026-01-20 15:19:55.293 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768922395.2798145, ab956bf8-a577-4777-ba2b-6d9dc5d035c3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:19:55 np0005588920 nova_compute[226886]: 2026-01-20 15:19:55.293 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:19:55 np0005588920 nova_compute[226886]: 2026-01-20 15:19:55.307 226890 DEBUG nova.virt.libvirt.driver [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:19:55 np0005588920 nova_compute[226886]: 2026-01-20 15:19:55.307 226890 DEBUG nova.virt.libvirt.driver [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:19:55 np0005588920 nova_compute[226886]: 2026-01-20 15:19:55.308 226890 DEBUG nova.virt.libvirt.driver [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:19:55 np0005588920 nova_compute[226886]: 2026-01-20 15:19:55.308 226890 DEBUG nova.virt.libvirt.driver [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:19:55 np0005588920 nova_compute[226886]: 2026-01-20 15:19:55.309 226890 DEBUG nova.virt.libvirt.driver [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:19:55 np0005588920 nova_compute[226886]: 2026-01-20 15:19:55.309 226890 DEBUG nova.virt.libvirt.driver [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:19:55 np0005588920 nova_compute[226886]: 2026-01-20 15:19:55.344 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:19:55 np0005588920 nova_compute[226886]: 2026-01-20 15:19:55.348 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:19:55 np0005588920 nova_compute[226886]: 2026-01-20 15:19:55.390 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:19:55 np0005588920 nova_compute[226886]: 2026-01-20 15:19:55.415 226890 INFO nova.compute.manager [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Took 12.70 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 10:19:55 np0005588920 nova_compute[226886]: 2026-01-20 15:19:55.416 226890 DEBUG nova.compute.manager [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:19:55 np0005588920 nova_compute[226886]: 2026-01-20 15:19:55.509 226890 INFO nova.compute.manager [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Took 13.66 seconds to build instance.#033[00m
Jan 20 10:19:55 np0005588920 nova_compute[226886]: 2026-01-20 15:19:55.554 226890 DEBUG oslo_concurrency.lockutils [None req-1d27c928-080f-439f-bcb7-ef9614831023 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "ab956bf8-a577-4777-ba2b-6d9dc5d035c3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.842s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:19:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:19:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:56.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:19:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:19:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:56.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:19:57 np0005588920 nova_compute[226886]: 2026-01-20 15:19:57.423 226890 DEBUG nova.compute.manager [req-c0c55a8e-1824-4383-8bc8-27502993e57e req-738d4000-1bb0-4e59-84c6-bb377481a5f0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Received event network-vif-plugged-4f0144e3-a50d-4d4b-a5de-1df8e869d27b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:19:57 np0005588920 nova_compute[226886]: 2026-01-20 15:19:57.424 226890 DEBUG oslo_concurrency.lockutils [req-c0c55a8e-1824-4383-8bc8-27502993e57e req-738d4000-1bb0-4e59-84c6-bb377481a5f0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "ab956bf8-a577-4777-ba2b-6d9dc5d035c3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:19:57 np0005588920 nova_compute[226886]: 2026-01-20 15:19:57.424 226890 DEBUG oslo_concurrency.lockutils [req-c0c55a8e-1824-4383-8bc8-27502993e57e req-738d4000-1bb0-4e59-84c6-bb377481a5f0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "ab956bf8-a577-4777-ba2b-6d9dc5d035c3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:19:57 np0005588920 nova_compute[226886]: 2026-01-20 15:19:57.424 226890 DEBUG oslo_concurrency.lockutils [req-c0c55a8e-1824-4383-8bc8-27502993e57e req-738d4000-1bb0-4e59-84c6-bb377481a5f0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "ab956bf8-a577-4777-ba2b-6d9dc5d035c3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:19:57 np0005588920 nova_compute[226886]: 2026-01-20 15:19:57.424 226890 DEBUG nova.compute.manager [req-c0c55a8e-1824-4383-8bc8-27502993e57e req-738d4000-1bb0-4e59-84c6-bb377481a5f0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] No waiting events found dispatching network-vif-plugged-4f0144e3-a50d-4d4b-a5de-1df8e869d27b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:19:57 np0005588920 nova_compute[226886]: 2026-01-20 15:19:57.424 226890 WARNING nova.compute.manager [req-c0c55a8e-1824-4383-8bc8-27502993e57e req-738d4000-1bb0-4e59-84c6-bb377481a5f0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Received unexpected event network-vif-plugged-4f0144e3-a50d-4d4b-a5de-1df8e869d27b for instance with vm_state active and task_state None.#033[00m
Jan 20 10:19:58 np0005588920 nova_compute[226886]: 2026-01-20 15:19:58.124 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:58 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:19:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:58.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:19:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:58.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:00 np0005588920 ceph-mon[77148]: overall HEALTH_OK
Jan 20 10:20:00 np0005588920 nova_compute[226886]: 2026-01-20 15:20:00.268 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:20:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:00.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:20:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:00.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:00 np0005588920 nova_compute[226886]: 2026-01-20 15:20:00.919 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:00 np0005588920 NetworkManager[49076]: <info>  [1768922400.9201] manager: (patch-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/425)
Jan 20 10:20:00 np0005588920 NetworkManager[49076]: <info>  [1768922400.9209] manager: (patch-br-int-to-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/426)
Jan 20 10:20:00 np0005588920 nova_compute[226886]: 2026-01-20 15:20:00.995 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:00 np0005588920 ovn_controller[133971]: 2026-01-20T15:20:00Z|00916|binding|INFO|Releasing lport c1d1ab2c-11f3-48a0-9554-4aff1d051db4 from this chassis (sb_readonly=0)
Jan 20 10:20:01 np0005588920 nova_compute[226886]: 2026-01-20 15:20:01.006 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:01 np0005588920 nova_compute[226886]: 2026-01-20 15:20:01.920 226890 DEBUG nova.compute.manager [req-a51933c3-04e7-413b-8b2c-59ed1810a697 req-95ed1403-978f-4764-b461-937024afecb5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Received event network-changed-4f0144e3-a50d-4d4b-a5de-1df8e869d27b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:20:01 np0005588920 nova_compute[226886]: 2026-01-20 15:20:01.921 226890 DEBUG nova.compute.manager [req-a51933c3-04e7-413b-8b2c-59ed1810a697 req-95ed1403-978f-4764-b461-937024afecb5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Refreshing instance network info cache due to event network-changed-4f0144e3-a50d-4d4b-a5de-1df8e869d27b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:20:01 np0005588920 nova_compute[226886]: 2026-01-20 15:20:01.921 226890 DEBUG oslo_concurrency.lockutils [req-a51933c3-04e7-413b-8b2c-59ed1810a697 req-95ed1403-978f-4764-b461-937024afecb5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-ab956bf8-a577-4777-ba2b-6d9dc5d035c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:20:01 np0005588920 nova_compute[226886]: 2026-01-20 15:20:01.921 226890 DEBUG oslo_concurrency.lockutils [req-a51933c3-04e7-413b-8b2c-59ed1810a697 req-95ed1403-978f-4764-b461-937024afecb5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-ab956bf8-a577-4777-ba2b-6d9dc5d035c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:20:01 np0005588920 nova_compute[226886]: 2026-01-20 15:20:01.922 226890 DEBUG nova.network.neutron [req-a51933c3-04e7-413b-8b2c-59ed1810a697 req-95ed1403-978f-4764-b461-937024afecb5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Refreshing network info cache for port 4f0144e3-a50d-4d4b-a5de-1df8e869d27b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:20:02 np0005588920 nova_compute[226886]: 2026-01-20 15:20:02.763 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:20:02 np0005588920 nova_compute[226886]: 2026-01-20 15:20:02.764 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:20:02 np0005588920 nova_compute[226886]: 2026-01-20 15:20:02.764 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:20:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:20:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:20:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:02.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:20:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:20:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:02.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:02 np0005588920 nova_compute[226886]: 2026-01-20 15:20:02.920 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "refresh_cache-ab956bf8-a577-4777-ba2b-6d9dc5d035c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:20:03 np0005588920 nova_compute[226886]: 2026-01-20 15:20:03.125 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:03 np0005588920 nova_compute[226886]: 2026-01-20 15:20:03.343 226890 DEBUG nova.network.neutron [req-a51933c3-04e7-413b-8b2c-59ed1810a697 req-95ed1403-978f-4764-b461-937024afecb5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Updated VIF entry in instance network info cache for port 4f0144e3-a50d-4d4b-a5de-1df8e869d27b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:20:03 np0005588920 nova_compute[226886]: 2026-01-20 15:20:03.343 226890 DEBUG nova.network.neutron [req-a51933c3-04e7-413b-8b2c-59ed1810a697 req-95ed1403-978f-4764-b461-937024afecb5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Updating instance_info_cache with network_info: [{"id": "4f0144e3-a50d-4d4b-a5de-1df8e869d27b", "address": "fa:16:3e:35:53:6a", "network": {"id": "82c13a27-83c9-4924-96ac-e10cebda58f4", "bridge": "br-int", "label": "tempest-network-smoke--271331337", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f0144e3-a5", "ovs_interfaceid": "4f0144e3-a50d-4d4b-a5de-1df8e869d27b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:20:03 np0005588920 nova_compute[226886]: 2026-01-20 15:20:03.376 226890 DEBUG oslo_concurrency.lockutils [req-a51933c3-04e7-413b-8b2c-59ed1810a697 req-95ed1403-978f-4764-b461-937024afecb5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-ab956bf8-a577-4777-ba2b-6d9dc5d035c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:20:03 np0005588920 nova_compute[226886]: 2026-01-20 15:20:03.376 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquired lock "refresh_cache-ab956bf8-a577-4777-ba2b-6d9dc5d035c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:20:03 np0005588920 nova_compute[226886]: 2026-01-20 15:20:03.377 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 10:20:03 np0005588920 nova_compute[226886]: 2026-01-20 15:20:03.377 226890 DEBUG nova.objects.instance [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid ab956bf8-a577-4777-ba2b-6d9dc5d035c3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:20:03 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:20:03 np0005588920 podman[299346]: 2026-01-20 15:20:03.988491967 +0000 UTC m=+0.076114802 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 20 10:20:04 np0005588920 nova_compute[226886]: 2026-01-20 15:20:04.535 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Updating instance_info_cache with network_info: [{"id": "4f0144e3-a50d-4d4b-a5de-1df8e869d27b", "address": "fa:16:3e:35:53:6a", "network": {"id": "82c13a27-83c9-4924-96ac-e10cebda58f4", "bridge": "br-int", "label": "tempest-network-smoke--271331337", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f0144e3-a5", "ovs_interfaceid": "4f0144e3-a50d-4d4b-a5de-1df8e869d27b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:20:04 np0005588920 nova_compute[226886]: 2026-01-20 15:20:04.551 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Releasing lock "refresh_cache-ab956bf8-a577-4777-ba2b-6d9dc5d035c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:20:04 np0005588920 nova_compute[226886]: 2026-01-20 15:20:04.552 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 10:20:04 np0005588920 nova_compute[226886]: 2026-01-20 15:20:04.552 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:20:04 np0005588920 nova_compute[226886]: 2026-01-20 15:20:04.576 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:20:04 np0005588920 nova_compute[226886]: 2026-01-20 15:20:04.576 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:20:04 np0005588920 nova_compute[226886]: 2026-01-20 15:20:04.576 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:20:04 np0005588920 nova_compute[226886]: 2026-01-20 15:20:04.577 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:20:04 np0005588920 nova_compute[226886]: 2026-01-20 15:20:04.577 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:20:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:20:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:04.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:20:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:04.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:05 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:20:05 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3916580390' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:20:05 np0005588920 nova_compute[226886]: 2026-01-20 15:20:05.064 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:20:05 np0005588920 nova_compute[226886]: 2026-01-20 15:20:05.127 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-000000c4 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:20:05 np0005588920 nova_compute[226886]: 2026-01-20 15:20:05.129 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-000000c4 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:20:05 np0005588920 nova_compute[226886]: 2026-01-20 15:20:05.270 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:05 np0005588920 nova_compute[226886]: 2026-01-20 15:20:05.276 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:20:05 np0005588920 nova_compute[226886]: 2026-01-20 15:20:05.277 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3935MB free_disk=20.92178726196289GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:20:05 np0005588920 nova_compute[226886]: 2026-01-20 15:20:05.277 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:20:05 np0005588920 nova_compute[226886]: 2026-01-20 15:20:05.277 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:20:05 np0005588920 nova_compute[226886]: 2026-01-20 15:20:05.370 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance ab956bf8-a577-4777-ba2b-6d9dc5d035c3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:20:05 np0005588920 nova_compute[226886]: 2026-01-20 15:20:05.371 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:20:05 np0005588920 nova_compute[226886]: 2026-01-20 15:20:05.371 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:20:05 np0005588920 nova_compute[226886]: 2026-01-20 15:20:05.408 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:20:05 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:20:05 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/102669052' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:20:05 np0005588920 nova_compute[226886]: 2026-01-20 15:20:05.876 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:20:05 np0005588920 nova_compute[226886]: 2026-01-20 15:20:05.884 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:20:05 np0005588920 nova_compute[226886]: 2026-01-20 15:20:05.903 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:20:05 np0005588920 nova_compute[226886]: 2026-01-20 15:20:05.932 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:20:05 np0005588920 nova_compute[226886]: 2026-01-20 15:20:05.934 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.656s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:20:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:20:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:06.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:20:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:20:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:06.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:20:08 np0005588920 nova_compute[226886]: 2026-01-20 15:20:08.173 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:08 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:20:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:20:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:08.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:20:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:20:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:08.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:20:08 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #142. Immutable memtables: 0.
Jan 20 10:20:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:20:08.866393) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:20:08 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 89] Flushing memtable with next log file: 142
Jan 20 10:20:08 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922408866418, "job": 89, "event": "flush_started", "num_memtables": 1, "num_entries": 1219, "num_deletes": 251, "total_data_size": 2591145, "memory_usage": 2620752, "flush_reason": "Manual Compaction"}
Jan 20 10:20:08 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 89] Level-0 flush table #143: started
Jan 20 10:20:08 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922408891459, "cf_name": "default", "job": 89, "event": "table_file_creation", "file_number": 143, "file_size": 1698431, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 70335, "largest_seqno": 71549, "table_properties": {"data_size": 1693229, "index_size": 2661, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1477, "raw_key_size": 11553, "raw_average_key_size": 19, "raw_value_size": 1682677, "raw_average_value_size": 2891, "num_data_blocks": 118, "num_entries": 582, "num_filter_entries": 582, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768922312, "oldest_key_time": 1768922312, "file_creation_time": 1768922408, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 143, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:20:08 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 89] Flush lasted 25097 microseconds, and 4104 cpu microseconds.
Jan 20 10:20:08 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:20:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:20:08.891489) [db/flush_job.cc:967] [default] [JOB 89] Level-0 flush table #143: 1698431 bytes OK
Jan 20 10:20:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:20:08.891504) [db/memtable_list.cc:519] [default] Level-0 commit table #143 started
Jan 20 10:20:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:20:08.901091) [db/memtable_list.cc:722] [default] Level-0 commit table #143: memtable #1 done
Jan 20 10:20:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:20:08.901105) EVENT_LOG_v1 {"time_micros": 1768922408901100, "job": 89, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:20:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:20:08.901121) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:20:08 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 89] Try to delete WAL files size 2585313, prev total WAL file size 2585313, number of live WAL files 2.
Jan 20 10:20:08 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000139.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:20:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:20:08.901882) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730036303234' seq:72057594037927935, type:22 .. '7061786F730036323736' seq:0, type:0; will stop at (end)
Jan 20 10:20:08 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 90] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:20:08 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 89 Base level 0, inputs: [143(1658KB)], [141(12MB)]
Jan 20 10:20:08 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922408901929, "job": 90, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [143], "files_L6": [141], "score": -1, "input_data_size": 15212787, "oldest_snapshot_seqno": -1}
Jan 20 10:20:09 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 90] Generated table #144: 9463 keys, 13363300 bytes, temperature: kUnknown
Jan 20 10:20:09 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922409103444, "cf_name": "default", "job": 90, "event": "table_file_creation", "file_number": 144, "file_size": 13363300, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13299999, "index_size": 38548, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 23685, "raw_key_size": 249215, "raw_average_key_size": 26, "raw_value_size": 13131689, "raw_average_value_size": 1387, "num_data_blocks": 1475, "num_entries": 9463, "num_filter_entries": 9463, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768922408, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 144, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:20:09 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:20:09 np0005588920 ovn_controller[133971]: 2026-01-20T15:20:09Z|00124|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:35:53:6a 10.100.0.13
Jan 20 10:20:09 np0005588920 ovn_controller[133971]: 2026-01-20T15:20:09Z|00125|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:35:53:6a 10.100.0.13
Jan 20 10:20:09 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:20:09.103717) [db/compaction/compaction_job.cc:1663] [default] [JOB 90] Compacted 1@0 + 1@6 files to L6 => 13363300 bytes
Jan 20 10:20:09 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:20:09.340304) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 75.5 rd, 66.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.6, 12.9 +0.0 blob) out(12.7 +0.0 blob), read-write-amplify(16.8) write-amplify(7.9) OK, records in: 9978, records dropped: 515 output_compression: NoCompression
Jan 20 10:20:09 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:20:09.340350) EVENT_LOG_v1 {"time_micros": 1768922409340333, "job": 90, "event": "compaction_finished", "compaction_time_micros": 201596, "compaction_time_cpu_micros": 29593, "output_level": 6, "num_output_files": 1, "total_output_size": 13363300, "num_input_records": 9978, "num_output_records": 9463, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:20:09 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000143.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:20:09 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922409340867, "job": 90, "event": "table_file_deletion", "file_number": 143}
Jan 20 10:20:09 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000141.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:20:09 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922409343345, "job": 90, "event": "table_file_deletion", "file_number": 141}
Jan 20 10:20:09 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:20:08.901786) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:20:09 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:20:09.343425) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:20:09 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:20:09.343430) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:20:09 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:20:09.343432) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:20:09 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:20:09.343433) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:20:09 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:20:09.343435) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:20:10 np0005588920 nova_compute[226886]: 2026-01-20 15:20:10.316 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:20:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:10.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:20:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:10.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:11 np0005588920 nova_compute[226886]: 2026-01-20 15:20:11.107 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:20:11 np0005588920 nova_compute[226886]: 2026-01-20 15:20:11.108 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:20:11 np0005588920 nova_compute[226886]: 2026-01-20 15:20:11.108 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:20:11 np0005588920 nova_compute[226886]: 2026-01-20 15:20:11.108 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:20:11 np0005588920 nova_compute[226886]: 2026-01-20 15:20:11.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:20:12 np0005588920 nova_compute[226886]: 2026-01-20 15:20:12.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:20:12 np0005588920 nova_compute[226886]: 2026-01-20 15:20:12.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:20:12 np0005588920 nova_compute[226886]: 2026-01-20 15:20:12.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:20:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:20:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:20:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:12.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:20:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:20:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:12.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:13 np0005588920 nova_compute[226886]: 2026-01-20 15:20:13.175 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:20:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:20:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3382626955' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:20:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:20:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3382626955' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:20:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:20:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:14.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:20:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.002000057s ======
Jan 20 10:20:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:14.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Jan 20 10:20:15 np0005588920 nova_compute[226886]: 2026-01-20 15:20:15.319 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:15 np0005588920 nova_compute[226886]: 2026-01-20 15:20:15.340 226890 INFO nova.compute.manager [None req-ac728e55-ad3c-48aa-b2d7-b8e7bdb3eea7 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Get console output#033[00m
Jan 20 10:20:15 np0005588920 nova_compute[226886]: 2026-01-20 15:20:15.348 260344 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 20 10:20:15 np0005588920 podman[299419]: 2026-01-20 15:20:15.987929655 +0000 UTC m=+0.075177306 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 20 10:20:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:16.481 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:20:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:16.482 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:20:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:16.482 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:20:16 np0005588920 nova_compute[226886]: 2026-01-20 15:20:16.848 226890 INFO nova.compute.manager [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Rebuilding instance#033[00m
Jan 20 10:20:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:20:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:16.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:20:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:16.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:17 np0005588920 nova_compute[226886]: 2026-01-20 15:20:17.110 226890 DEBUG nova.objects.instance [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lazy-loading 'trusted_certs' on Instance uuid ab956bf8-a577-4777-ba2b-6d9dc5d035c3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:20:17 np0005588920 nova_compute[226886]: 2026-01-20 15:20:17.128 226890 DEBUG nova.compute.manager [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:20:17 np0005588920 nova_compute[226886]: 2026-01-20 15:20:17.192 226890 DEBUG nova.objects.instance [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lazy-loading 'pci_requests' on Instance uuid ab956bf8-a577-4777-ba2b-6d9dc5d035c3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:20:17 np0005588920 nova_compute[226886]: 2026-01-20 15:20:17.216 226890 DEBUG nova.objects.instance [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lazy-loading 'pci_devices' on Instance uuid ab956bf8-a577-4777-ba2b-6d9dc5d035c3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:20:17 np0005588920 nova_compute[226886]: 2026-01-20 15:20:17.241 226890 DEBUG nova.objects.instance [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lazy-loading 'resources' on Instance uuid ab956bf8-a577-4777-ba2b-6d9dc5d035c3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:20:17 np0005588920 nova_compute[226886]: 2026-01-20 15:20:17.264 226890 DEBUG nova.objects.instance [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lazy-loading 'migration_context' on Instance uuid ab956bf8-a577-4777-ba2b-6d9dc5d035c3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:20:17 np0005588920 nova_compute[226886]: 2026-01-20 15:20:17.289 226890 DEBUG nova.objects.instance [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 20 10:20:17 np0005588920 nova_compute[226886]: 2026-01-20 15:20:17.292 226890 DEBUG nova.virt.libvirt.driver [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 20 10:20:18 np0005588920 nova_compute[226886]: 2026-01-20 15:20:18.209 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:18 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:20:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:20:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:20:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:18.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:20:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:20:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:20:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:18.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:20:19 np0005588920 kernel: tap4f0144e3-a5 (unregistering): left promiscuous mode
Jan 20 10:20:19 np0005588920 NetworkManager[49076]: <info>  [1768922419.5098] device (tap4f0144e3-a5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:20:19 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:20:19 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 20 10:20:19 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:20:19 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:20:19 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:20:19 np0005588920 ovn_controller[133971]: 2026-01-20T15:20:19Z|00917|binding|INFO|Releasing lport 4f0144e3-a50d-4d4b-a5de-1df8e869d27b from this chassis (sb_readonly=0)
Jan 20 10:20:19 np0005588920 ovn_controller[133971]: 2026-01-20T15:20:19Z|00918|binding|INFO|Setting lport 4f0144e3-a50d-4d4b-a5de-1df8e869d27b down in Southbound
Jan 20 10:20:19 np0005588920 nova_compute[226886]: 2026-01-20 15:20:19.570 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:19 np0005588920 ovn_controller[133971]: 2026-01-20T15:20:19Z|00919|binding|INFO|Removing iface tap4f0144e3-a5 ovn-installed in OVS
Jan 20 10:20:19 np0005588920 nova_compute[226886]: 2026-01-20 15:20:19.573 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:19 np0005588920 nova_compute[226886]: 2026-01-20 15:20:19.590 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:19 np0005588920 systemd[1]: machine-qemu\x2d94\x2dinstance\x2d000000c4.scope: Deactivated successfully.
Jan 20 10:20:19 np0005588920 systemd[1]: machine-qemu\x2d94\x2dinstance\x2d000000c4.scope: Consumed 14.261s CPU time.
Jan 20 10:20:19 np0005588920 systemd-machined[196121]: Machine qemu-94-instance-000000c4 terminated.
Jan 20 10:20:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:19.672 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:35:53:6a 10.100.0.13'], port_security=['fa:16:3e:35:53:6a 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'ab956bf8-a577-4777-ba2b-6d9dc5d035c3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-82c13a27-83c9-4924-96ac-e10cebda58f4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ed5feeeafe7448a8efb47ab975b0ead', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7b59f953-14ac-473e-b4dc-024834dd332d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.219'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=18447654-2510-49ce-8582-3111d554be36, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=4f0144e3-a50d-4d4b-a5de-1df8e869d27b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:20:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:19.673 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 4f0144e3-a50d-4d4b-a5de-1df8e869d27b in datapath 82c13a27-83c9-4924-96ac-e10cebda58f4 unbound from our chassis#033[00m
Jan 20 10:20:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:19.674 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 82c13a27-83c9-4924-96ac-e10cebda58f4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:20:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:19.675 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b748c18e-7dc5-4f95-92d1-9a42de95965e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:19.675 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-82c13a27-83c9-4924-96ac-e10cebda58f4 namespace which is not needed anymore#033[00m
Jan 20 10:20:19 np0005588920 neutron-haproxy-ovnmeta-82c13a27-83c9-4924-96ac-e10cebda58f4[299324]: [NOTICE]   (299333) : haproxy version is 2.8.14-c23fe91
Jan 20 10:20:19 np0005588920 neutron-haproxy-ovnmeta-82c13a27-83c9-4924-96ac-e10cebda58f4[299324]: [NOTICE]   (299333) : path to executable is /usr/sbin/haproxy
Jan 20 10:20:19 np0005588920 neutron-haproxy-ovnmeta-82c13a27-83c9-4924-96ac-e10cebda58f4[299324]: [WARNING]  (299333) : Exiting Master process...
Jan 20 10:20:19 np0005588920 neutron-haproxy-ovnmeta-82c13a27-83c9-4924-96ac-e10cebda58f4[299324]: [ALERT]    (299333) : Current worker (299336) exited with code 143 (Terminated)
Jan 20 10:20:19 np0005588920 neutron-haproxy-ovnmeta-82c13a27-83c9-4924-96ac-e10cebda58f4[299324]: [WARNING]  (299333) : All workers exited. Exiting... (0)
Jan 20 10:20:19 np0005588920 systemd[1]: libpod-0de1189be78d23fa0a5a4495657dca02f7bbaeedda2f1d5dd694afbeee55bbef.scope: Deactivated successfully.
Jan 20 10:20:19 np0005588920 podman[299710]: 2026-01-20 15:20:19.808836797 +0000 UTC m=+0.046026182 container died 0de1189be78d23fa0a5a4495657dca02f7bbaeedda2f1d5dd694afbeee55bbef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82c13a27-83c9-4924-96ac-e10cebda58f4, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 20 10:20:19 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0de1189be78d23fa0a5a4495657dca02f7bbaeedda2f1d5dd694afbeee55bbef-userdata-shm.mount: Deactivated successfully.
Jan 20 10:20:19 np0005588920 systemd[1]: var-lib-containers-storage-overlay-22ca75e4da7e5c9c576cf5621054d82dcd3c7b0436a53a580f31bc41dc324a56-merged.mount: Deactivated successfully.
Jan 20 10:20:19 np0005588920 podman[299710]: 2026-01-20 15:20:19.844565247 +0000 UTC m=+0.081754642 container cleanup 0de1189be78d23fa0a5a4495657dca02f7bbaeedda2f1d5dd694afbeee55bbef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82c13a27-83c9-4924-96ac-e10cebda58f4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 20 10:20:19 np0005588920 systemd[1]: libpod-conmon-0de1189be78d23fa0a5a4495657dca02f7bbaeedda2f1d5dd694afbeee55bbef.scope: Deactivated successfully.
Jan 20 10:20:19 np0005588920 podman[299751]: 2026-01-20 15:20:19.905090838 +0000 UTC m=+0.039358673 container remove 0de1189be78d23fa0a5a4495657dca02f7bbaeedda2f1d5dd694afbeee55bbef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82c13a27-83c9-4924-96ac-e10cebda58f4, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 20 10:20:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:19.911 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[82d7c61b-a97b-4fab-baa5-38f74406b655]: (4, ('Tue Jan 20 03:20:19 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-82c13a27-83c9-4924-96ac-e10cebda58f4 (0de1189be78d23fa0a5a4495657dca02f7bbaeedda2f1d5dd694afbeee55bbef)\n0de1189be78d23fa0a5a4495657dca02f7bbaeedda2f1d5dd694afbeee55bbef\nTue Jan 20 03:20:19 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-82c13a27-83c9-4924-96ac-e10cebda58f4 (0de1189be78d23fa0a5a4495657dca02f7bbaeedda2f1d5dd694afbeee55bbef)\n0de1189be78d23fa0a5a4495657dca02f7bbaeedda2f1d5dd694afbeee55bbef\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:19.914 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[9efd908d-0447-4028-9b3f-37cb5bcba15c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:19.915 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap82c13a27-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:20:19 np0005588920 nova_compute[226886]: 2026-01-20 15:20:19.916 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:19 np0005588920 kernel: tap82c13a27-80: left promiscuous mode
Jan 20 10:20:19 np0005588920 nova_compute[226886]: 2026-01-20 15:20:19.931 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:19.936 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5482fe52-7687-4ba5-a015-370c085e377e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:19.951 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c8a6977b-9572-4db7-97ea-517028625a5d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:19.952 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[fe1ba8da-f24d-46eb-aa9b-3a10918815da]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:19.966 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ce20275c-274f-463d-8afa-db53c4118bd8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 743600, 'reachable_time': 26443, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 299771, 'error': None, 'target': 'ovnmeta-82c13a27-83c9-4924-96ac-e10cebda58f4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:19 np0005588920 systemd[1]: run-netns-ovnmeta\x2d82c13a27\x2d83c9\x2d4924\x2d96ac\x2de10cebda58f4.mount: Deactivated successfully.
Jan 20 10:20:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:19.970 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-82c13a27-83c9-4924-96ac-e10cebda58f4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:20:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:19.970 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[c91440a1-b555-4b32-8254-bf53e16b6dc7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:20 np0005588920 nova_compute[226886]: 2026-01-20 15:20:20.169 226890 DEBUG nova.compute.manager [req-fb8f5e77-b2df-46d7-b663-feb21ccf81f7 req-332c4886-9489-4020-8b20-4aec36c23d18 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Received event network-vif-unplugged-4f0144e3-a50d-4d4b-a5de-1df8e869d27b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:20:20 np0005588920 nova_compute[226886]: 2026-01-20 15:20:20.170 226890 DEBUG oslo_concurrency.lockutils [req-fb8f5e77-b2df-46d7-b663-feb21ccf81f7 req-332c4886-9489-4020-8b20-4aec36c23d18 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "ab956bf8-a577-4777-ba2b-6d9dc5d035c3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:20:20 np0005588920 nova_compute[226886]: 2026-01-20 15:20:20.170 226890 DEBUG oslo_concurrency.lockutils [req-fb8f5e77-b2df-46d7-b663-feb21ccf81f7 req-332c4886-9489-4020-8b20-4aec36c23d18 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "ab956bf8-a577-4777-ba2b-6d9dc5d035c3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:20:20 np0005588920 nova_compute[226886]: 2026-01-20 15:20:20.171 226890 DEBUG oslo_concurrency.lockutils [req-fb8f5e77-b2df-46d7-b663-feb21ccf81f7 req-332c4886-9489-4020-8b20-4aec36c23d18 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "ab956bf8-a577-4777-ba2b-6d9dc5d035c3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:20:20 np0005588920 nova_compute[226886]: 2026-01-20 15:20:20.171 226890 DEBUG nova.compute.manager [req-fb8f5e77-b2df-46d7-b663-feb21ccf81f7 req-332c4886-9489-4020-8b20-4aec36c23d18 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] No waiting events found dispatching network-vif-unplugged-4f0144e3-a50d-4d4b-a5de-1df8e869d27b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:20:20 np0005588920 nova_compute[226886]: 2026-01-20 15:20:20.171 226890 WARNING nova.compute.manager [req-fb8f5e77-b2df-46d7-b663-feb21ccf81f7 req-332c4886-9489-4020-8b20-4aec36c23d18 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Received unexpected event network-vif-unplugged-4f0144e3-a50d-4d4b-a5de-1df8e869d27b for instance with vm_state active and task_state rebuilding.#033[00m
Jan 20 10:20:20 np0005588920 nova_compute[226886]: 2026-01-20 15:20:20.308 226890 INFO nova.virt.libvirt.driver [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Instance shutdown successfully after 3 seconds.#033[00m
Jan 20 10:20:20 np0005588920 nova_compute[226886]: 2026-01-20 15:20:20.313 226890 INFO nova.virt.libvirt.driver [-] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Instance destroyed successfully.#033[00m
Jan 20 10:20:20 np0005588920 nova_compute[226886]: 2026-01-20 15:20:20.319 226890 INFO nova.virt.libvirt.driver [-] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Instance destroyed successfully.#033[00m
Jan 20 10:20:20 np0005588920 nova_compute[226886]: 2026-01-20 15:20:20.320 226890 DEBUG nova.virt.libvirt.vif [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:19:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1243081842',display_name='tempest-TestNetworkAdvancedServerOps-server-1243081842',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1243081842',id=196,image_ref='26699514-f465-4b50-98b7-36f2cfc6a308',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBkV5IF/PoEe/efZhnfKLeJldGZu0kKuhfpPH+OgoUHVkAG0n9Dg/cy0touqzeR1Y19Ga3k7oWWkeK3V3PfTtFX8AqRCXGo4UIGa3DLJqI8BlqSEN/MrM2dwEtiVEOyBPg==',key_name='tempest-TestNetworkAdvancedServerOps-339065355',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:19:55Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1ed5feeeafe7448a8efb47ab975b0ead',ramdisk_id='',reservation_id='r-f24dba2l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='26699514-f465-4b50-98b7-36f2cfc6a308',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-175282664',owner_user_name='tempest-TestNetworkAdvancedServerOps-175282664-project-member'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:20:15Z,user_data=None,user_id='442a7a5cb8ea426a82be9762b262d171',uuid=ab956bf8-a577-4777-ba2b-6d9dc5d035c3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4f0144e3-a50d-4d4b-a5de-1df8e869d27b", "address": "fa:16:3e:35:53:6a", "network": {"id": "82c13a27-83c9-4924-96ac-e10cebda58f4", "bridge": "br-int", "label": "tempest-network-smoke--271331337", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, 
"meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f0144e3-a5", "ovs_interfaceid": "4f0144e3-a50d-4d4b-a5de-1df8e869d27b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:20:20 np0005588920 nova_compute[226886]: 2026-01-20 15:20:20.320 226890 DEBUG nova.network.os_vif_util [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converting VIF {"id": "4f0144e3-a50d-4d4b-a5de-1df8e869d27b", "address": "fa:16:3e:35:53:6a", "network": {"id": "82c13a27-83c9-4924-96ac-e10cebda58f4", "bridge": "br-int", "label": "tempest-network-smoke--271331337", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f0144e3-a5", "ovs_interfaceid": "4f0144e3-a50d-4d4b-a5de-1df8e869d27b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:20:20 np0005588920 nova_compute[226886]: 2026-01-20 15:20:20.321 226890 DEBUG nova.network.os_vif_util [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:35:53:6a,bridge_name='br-int',has_traffic_filtering=True,id=4f0144e3-a50d-4d4b-a5de-1df8e869d27b,network=Network(82c13a27-83c9-4924-96ac-e10cebda58f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f0144e3-a5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:20:20 np0005588920 nova_compute[226886]: 2026-01-20 15:20:20.322 226890 DEBUG os_vif [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:35:53:6a,bridge_name='br-int',has_traffic_filtering=True,id=4f0144e3-a50d-4d4b-a5de-1df8e869d27b,network=Network(82c13a27-83c9-4924-96ac-e10cebda58f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f0144e3-a5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:20:20 np0005588920 nova_compute[226886]: 2026-01-20 15:20:20.324 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:20 np0005588920 nova_compute[226886]: 2026-01-20 15:20:20.324 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4f0144e3-a5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:20:20 np0005588920 nova_compute[226886]: 2026-01-20 15:20:20.326 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:20 np0005588920 nova_compute[226886]: 2026-01-20 15:20:20.327 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:20 np0005588920 nova_compute[226886]: 2026-01-20 15:20:20.329 226890 INFO os_vif [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:35:53:6a,bridge_name='br-int',has_traffic_filtering=True,id=4f0144e3-a50d-4d4b-a5de-1df8e869d27b,network=Network(82c13a27-83c9-4924-96ac-e10cebda58f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f0144e3-a5')#033[00m
Jan 20 10:20:20 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 20 10:20:20 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:20:20 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:20:20 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:20:20 np0005588920 nova_compute[226886]: 2026-01-20 15:20:20.699 226890 INFO nova.virt.libvirt.driver [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Deleting instance files /var/lib/nova/instances/ab956bf8-a577-4777-ba2b-6d9dc5d035c3_del#033[00m
Jan 20 10:20:20 np0005588920 nova_compute[226886]: 2026-01-20 15:20:20.700 226890 INFO nova.virt.libvirt.driver [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Deletion of /var/lib/nova/instances/ab956bf8-a577-4777-ba2b-6d9dc5d035c3_del complete#033[00m
Jan 20 10:20:20 np0005588920 nova_compute[226886]: 2026-01-20 15:20:20.831 226890 DEBUG nova.virt.libvirt.driver [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 10:20:20 np0005588920 nova_compute[226886]: 2026-01-20 15:20:20.831 226890 INFO nova.virt.libvirt.driver [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Creating image(s)#033[00m
Jan 20 10:20:20 np0005588920 nova_compute[226886]: 2026-01-20 15:20:20.854 226890 DEBUG nova.storage.rbd_utils [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] rbd image ab956bf8-a577-4777-ba2b-6d9dc5d035c3_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:20:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:20:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:20:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:20.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:20:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:20:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:20.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:20 np0005588920 nova_compute[226886]: 2026-01-20 15:20:20.879 226890 DEBUG nova.storage.rbd_utils [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] rbd image ab956bf8-a577-4777-ba2b-6d9dc5d035c3_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:20:20 np0005588920 nova_compute[226886]: 2026-01-20 15:20:20.903 226890 DEBUG nova.storage.rbd_utils [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] rbd image ab956bf8-a577-4777-ba2b-6d9dc5d035c3_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:20:20 np0005588920 nova_compute[226886]: 2026-01-20 15:20:20.907 226890 DEBUG oslo_concurrency.processutils [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:20:20 np0005588920 nova_compute[226886]: 2026-01-20 15:20:20.982 226890 DEBUG oslo_concurrency.processutils [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:20:20 np0005588920 nova_compute[226886]: 2026-01-20 15:20:20.982 226890 DEBUG oslo_concurrency.lockutils [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "a4ed0d2b98aa460c005e878d78a49ccb6f511f7c" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:20:20 np0005588920 nova_compute[226886]: 2026-01-20 15:20:20.983 226890 DEBUG oslo_concurrency.lockutils [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "a4ed0d2b98aa460c005e878d78a49ccb6f511f7c" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:20:20 np0005588920 nova_compute[226886]: 2026-01-20 15:20:20.983 226890 DEBUG oslo_concurrency.lockutils [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "a4ed0d2b98aa460c005e878d78a49ccb6f511f7c" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:20:21 np0005588920 nova_compute[226886]: 2026-01-20 15:20:21.004 226890 DEBUG nova.storage.rbd_utils [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] rbd image ab956bf8-a577-4777-ba2b-6d9dc5d035c3_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:20:21 np0005588920 nova_compute[226886]: 2026-01-20 15:20:21.007 226890 DEBUG oslo_concurrency.processutils [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c ab956bf8-a577-4777-ba2b-6d9dc5d035c3_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:20:21 np0005588920 nova_compute[226886]: 2026-01-20 15:20:21.034 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:20:21 np0005588920 nova_compute[226886]: 2026-01-20 15:20:21.153 226890 WARNING nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] While synchronizing instance power states, found 1 instances in the database and 0 instances on the hypervisor.#033[00m
Jan 20 10:20:21 np0005588920 nova_compute[226886]: 2026-01-20 15:20:21.154 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Triggering sync for uuid ab956bf8-a577-4777-ba2b-6d9dc5d035c3 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 20 10:20:21 np0005588920 nova_compute[226886]: 2026-01-20 15:20:21.155 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "ab956bf8-a577-4777-ba2b-6d9dc5d035c3" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:20:21 np0005588920 nova_compute[226886]: 2026-01-20 15:20:21.155 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "ab956bf8-a577-4777-ba2b-6d9dc5d035c3" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:20:21 np0005588920 nova_compute[226886]: 2026-01-20 15:20:21.156 226890 INFO nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 20 10:20:21 np0005588920 nova_compute[226886]: 2026-01-20 15:20:21.156 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "ab956bf8-a577-4777-ba2b-6d9dc5d035c3" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:20:21 np0005588920 nova_compute[226886]: 2026-01-20 15:20:21.678 226890 DEBUG oslo_concurrency.processutils [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c ab956bf8-a577-4777-ba2b-6d9dc5d035c3_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.671s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:20:21 np0005588920 nova_compute[226886]: 2026-01-20 15:20:21.758 226890 DEBUG nova.storage.rbd_utils [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] resizing rbd image ab956bf8-a577-4777-ba2b-6d9dc5d035c3_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 10:20:22 np0005588920 nova_compute[226886]: 2026-01-20 15:20:22.117 226890 DEBUG nova.virt.libvirt.driver [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 10:20:22 np0005588920 nova_compute[226886]: 2026-01-20 15:20:22.118 226890 DEBUG nova.virt.libvirt.driver [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Ensure instance console log exists: /var/lib/nova/instances/ab956bf8-a577-4777-ba2b-6d9dc5d035c3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:20:22 np0005588920 nova_compute[226886]: 2026-01-20 15:20:22.118 226890 DEBUG oslo_concurrency.lockutils [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:20:22 np0005588920 nova_compute[226886]: 2026-01-20 15:20:22.118 226890 DEBUG oslo_concurrency.lockutils [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:20:22 np0005588920 nova_compute[226886]: 2026-01-20 15:20:22.119 226890 DEBUG oslo_concurrency.lockutils [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:20:22 np0005588920 nova_compute[226886]: 2026-01-20 15:20:22.121 226890 DEBUG nova.virt.libvirt.driver [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Start _get_guest_xml network_info=[{"id": "4f0144e3-a50d-4d4b-a5de-1df8e869d27b", "address": "fa:16:3e:35:53:6a", "network": {"id": "82c13a27-83c9-4924-96ac-e10cebda58f4", "bridge": "br-int", "label": "tempest-network-smoke--271331337", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f0144e3-a5", "ovs_interfaceid": "4f0144e3-a50d-4d4b-a5de-1df8e869d27b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:22:02Z,direct_url=<?>,disk_format='qcow2',id=26699514-f465-4b50-98b7-36f2cfc6a308,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:04Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:20:22 np0005588920 nova_compute[226886]: 2026-01-20 15:20:22.127 226890 WARNING nova.virt.libvirt.driver [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Jan 20 10:20:22 np0005588920 nova_compute[226886]: 2026-01-20 15:20:22.138 226890 DEBUG nova.virt.libvirt.host [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:20:22 np0005588920 nova_compute[226886]: 2026-01-20 15:20:22.139 226890 DEBUG nova.virt.libvirt.host [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:20:22 np0005588920 nova_compute[226886]: 2026-01-20 15:20:22.142 226890 DEBUG nova.virt.libvirt.host [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:20:22 np0005588920 nova_compute[226886]: 2026-01-20 15:20:22.142 226890 DEBUG nova.virt.libvirt.host [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:20:22 np0005588920 nova_compute[226886]: 2026-01-20 15:20:22.143 226890 DEBUG nova.virt.libvirt.driver [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:20:22 np0005588920 nova_compute[226886]: 2026-01-20 15:20:22.144 226890 DEBUG nova.virt.hardware [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:22:02Z,direct_url=<?>,disk_format='qcow2',id=26699514-f465-4b50-98b7-36f2cfc6a308,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:04Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:20:22 np0005588920 nova_compute[226886]: 2026-01-20 15:20:22.144 226890 DEBUG nova.virt.hardware [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:20:22 np0005588920 nova_compute[226886]: 2026-01-20 15:20:22.144 226890 DEBUG nova.virt.hardware [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:20:22 np0005588920 nova_compute[226886]: 2026-01-20 15:20:22.144 226890 DEBUG nova.virt.hardware [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:20:22 np0005588920 nova_compute[226886]: 2026-01-20 15:20:22.145 226890 DEBUG nova.virt.hardware [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:20:22 np0005588920 nova_compute[226886]: 2026-01-20 15:20:22.145 226890 DEBUG nova.virt.hardware [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:20:22 np0005588920 nova_compute[226886]: 2026-01-20 15:20:22.145 226890 DEBUG nova.virt.hardware [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:20:22 np0005588920 nova_compute[226886]: 2026-01-20 15:20:22.145 226890 DEBUG nova.virt.hardware [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:20:22 np0005588920 nova_compute[226886]: 2026-01-20 15:20:22.145 226890 DEBUG nova.virt.hardware [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:20:22 np0005588920 nova_compute[226886]: 2026-01-20 15:20:22.145 226890 DEBUG nova.virt.hardware [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:20:22 np0005588920 nova_compute[226886]: 2026-01-20 15:20:22.146 226890 DEBUG nova.virt.hardware [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:20:22 np0005588920 nova_compute[226886]: 2026-01-20 15:20:22.146 226890 DEBUG nova.objects.instance [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lazy-loading 'vcpu_model' on Instance uuid ab956bf8-a577-4777-ba2b-6d9dc5d035c3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:20:22 np0005588920 nova_compute[226886]: 2026-01-20 15:20:22.169 226890 DEBUG oslo_concurrency.processutils [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:20:22 np0005588920 nova_compute[226886]: 2026-01-20 15:20:22.262 226890 DEBUG nova.compute.manager [req-0ac8400a-19f6-43a4-8974-d0a493346882 req-34f37de8-774c-45fd-991e-b0eccf97deed 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Received event network-vif-plugged-4f0144e3-a50d-4d4b-a5de-1df8e869d27b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:20:22 np0005588920 nova_compute[226886]: 2026-01-20 15:20:22.263 226890 DEBUG oslo_concurrency.lockutils [req-0ac8400a-19f6-43a4-8974-d0a493346882 req-34f37de8-774c-45fd-991e-b0eccf97deed 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "ab956bf8-a577-4777-ba2b-6d9dc5d035c3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:20:22 np0005588920 nova_compute[226886]: 2026-01-20 15:20:22.263 226890 DEBUG oslo_concurrency.lockutils [req-0ac8400a-19f6-43a4-8974-d0a493346882 req-34f37de8-774c-45fd-991e-b0eccf97deed 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "ab956bf8-a577-4777-ba2b-6d9dc5d035c3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:20:22 np0005588920 nova_compute[226886]: 2026-01-20 15:20:22.263 226890 DEBUG oslo_concurrency.lockutils [req-0ac8400a-19f6-43a4-8974-d0a493346882 req-34f37de8-774c-45fd-991e-b0eccf97deed 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "ab956bf8-a577-4777-ba2b-6d9dc5d035c3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:20:22 np0005588920 nova_compute[226886]: 2026-01-20 15:20:22.263 226890 DEBUG nova.compute.manager [req-0ac8400a-19f6-43a4-8974-d0a493346882 req-34f37de8-774c-45fd-991e-b0eccf97deed 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] No waiting events found dispatching network-vif-plugged-4f0144e3-a50d-4d4b-a5de-1df8e869d27b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:20:22 np0005588920 nova_compute[226886]: 2026-01-20 15:20:22.263 226890 WARNING nova.compute.manager [req-0ac8400a-19f6-43a4-8974-d0a493346882 req-34f37de8-774c-45fd-991e-b0eccf97deed 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Received unexpected event network-vif-plugged-4f0144e3-a50d-4d4b-a5de-1df8e869d27b for instance with vm_state active and task_state rebuild_spawning.#033[00m
Jan 20 10:20:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:20:22 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/409901699' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:20:22 np0005588920 nova_compute[226886]: 2026-01-20 15:20:22.654 226890 DEBUG oslo_concurrency.processutils [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.485s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:20:22 np0005588920 nova_compute[226886]: 2026-01-20 15:20:22.686 226890 DEBUG nova.storage.rbd_utils [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] rbd image ab956bf8-a577-4777-ba2b-6d9dc5d035c3_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:20:22 np0005588920 nova_compute[226886]: 2026-01-20 15:20:22.689 226890 DEBUG oslo_concurrency.processutils [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:20:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:20:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:20:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:22.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:20:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:20:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:22.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:23.124 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=66, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=65) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:20:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:23.125 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:20:23 np0005588920 nova_compute[226886]: 2026-01-20 15:20:23.175 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:23 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:20:23 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3883154021' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:20:23 np0005588920 nova_compute[226886]: 2026-01-20 15:20:23.202 226890 DEBUG oslo_concurrency.processutils [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.513s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:20:23 np0005588920 nova_compute[226886]: 2026-01-20 15:20:23.203 226890 DEBUG nova.virt.libvirt.vif [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-20T15:19:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1243081842',display_name='tempest-TestNetworkAdvancedServerOps-server-1243081842',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1243081842',id=196,image_ref='26699514-f465-4b50-98b7-36f2cfc6a308',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBkV5IF/PoEe/efZhnfKLeJldGZu0kKuhfpPH+OgoUHVkAG0n9Dg/cy0touqzeR1Y19Ga3k7oWWkeK3V3PfTtFX8AqRCXGo4UIGa3DLJqI8BlqSEN/MrM2dwEtiVEOyBPg==',key_name='tempest-TestNetworkAdvancedServerOps-339065355',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:19:55Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1ed5feeeafe7448a8efb47ab975b0ead',ramdisk_id='',reservation_id='r-f24dba2l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='26699514-f465-4b50-98b7-36f2cfc6a308',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-175282664',owner_user_name='tempest-TestNetworkAdvancedServerOps-175282664-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:20:20Z,user_data=None,user_id='442a7a5cb8ea426a82be9762b262d171',uuid=ab956bf8-a577-4777-ba2b-6d9dc5d035c3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4f0144e3-a50d-4d4b-a5de-1df8e869d27b", "address": "fa:16:3e:35:53:6a", "network": {"id": "82c13a27-83c9-4924-96ac-e10cebda58f4", "bridge": "br-int", "label": "tempest-network-smoke--271331337", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": 
"floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f0144e3-a5", "ovs_interfaceid": "4f0144e3-a50d-4d4b-a5de-1df8e869d27b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:20:23 np0005588920 nova_compute[226886]: 2026-01-20 15:20:23.204 226890 DEBUG nova.network.os_vif_util [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converting VIF {"id": "4f0144e3-a50d-4d4b-a5de-1df8e869d27b", "address": "fa:16:3e:35:53:6a", "network": {"id": "82c13a27-83c9-4924-96ac-e10cebda58f4", "bridge": "br-int", "label": "tempest-network-smoke--271331337", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f0144e3-a5", "ovs_interfaceid": "4f0144e3-a50d-4d4b-a5de-1df8e869d27b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:20:23 np0005588920 nova_compute[226886]: 2026-01-20 15:20:23.204 226890 DEBUG nova.network.os_vif_util [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:35:53:6a,bridge_name='br-int',has_traffic_filtering=True,id=4f0144e3-a50d-4d4b-a5de-1df8e869d27b,network=Network(82c13a27-83c9-4924-96ac-e10cebda58f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f0144e3-a5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:20:23 np0005588920 nova_compute[226886]: 2026-01-20 15:20:23.207 226890 DEBUG nova.virt.libvirt.driver [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:20:23 np0005588920 nova_compute[226886]:  <uuid>ab956bf8-a577-4777-ba2b-6d9dc5d035c3</uuid>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:  <name>instance-000000c4</name>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:20:23 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-1243081842</nova:name>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 15:20:22</nova:creationTime>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 10:20:23 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:        <nova:user uuid="442a7a5cb8ea426a82be9762b262d171">tempest-TestNetworkAdvancedServerOps-175282664-project-member</nova:user>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:        <nova:project uuid="1ed5feeeafe7448a8efb47ab975b0ead">tempest-TestNetworkAdvancedServerOps-175282664</nova:project>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="26699514-f465-4b50-98b7-36f2cfc6a308"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:        <nova:port uuid="4f0144e3-a50d-4d4b-a5de-1df8e869d27b">
Jan 20 10:20:23 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    <system>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:      <entry name="serial">ab956bf8-a577-4777-ba2b-6d9dc5d035c3</entry>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:      <entry name="uuid">ab956bf8-a577-4777-ba2b-6d9dc5d035c3</entry>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    </system>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:  <os>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:  </os>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:  <features>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:  </features>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:  </clock>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:  <devices>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 10:20:23 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/ab956bf8-a577-4777-ba2b-6d9dc5d035c3_disk">
Jan 20 10:20:23 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:20:23 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 10:20:23 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/ab956bf8-a577-4777-ba2b-6d9dc5d035c3_disk.config">
Jan 20 10:20:23 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:20:23 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 10:20:23 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:35:53:6a"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:      <target dev="tap4f0144e3-a5"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    </interface>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 10:20:23 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/ab956bf8-a577-4777-ba2b-6d9dc5d035c3/console.log" append="off"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    </serial>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    <video>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    </video>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 10:20:23 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    </rng>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 10:20:23 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 10:20:23 np0005588920 nova_compute[226886]:  </devices>
Jan 20 10:20:23 np0005588920 nova_compute[226886]: </domain>
Jan 20 10:20:23 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:20:23 np0005588920 nova_compute[226886]: 2026-01-20 15:20:23.208 226890 DEBUG nova.virt.libvirt.vif [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-20T15:19:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1243081842',display_name='tempest-TestNetworkAdvancedServerOps-server-1243081842',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1243081842',id=196,image_ref='26699514-f465-4b50-98b7-36f2cfc6a308',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBkV5IF/PoEe/efZhnfKLeJldGZu0kKuhfpPH+OgoUHVkAG0n9Dg/cy0touqzeR1Y19Ga3k7oWWkeK3V3PfTtFX8AqRCXGo4UIGa3DLJqI8BlqSEN/MrM2dwEtiVEOyBPg==',key_name='tempest-TestNetworkAdvancedServerOps-339065355',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:19:55Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1ed5feeeafe7448a8efb47ab975b0ead',ramdisk_id='',reservation_id='r-f24dba2l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='26699514-f465-4b50-98b7-36f2cfc6a308',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-175282664',owner_user_name='tempest-TestNetworkAdvancedServerOps-175282664-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:20:20Z,user_data=None,user_id='442a7a5cb8ea426a82be9762b262d171',uuid=ab956bf8-a577-4777-ba2b-6d9dc5d035c3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4f0144e3-a50d-4d4b-a5de-1df8e869d27b", "address": "fa:16:3e:35:53:6a", "network": {"id": "82c13a27-83c9-4924-96ac-e10cebda58f4", "bridge": "br-int", "label": "tempest-network-smoke--271331337", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": 
"floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f0144e3-a5", "ovs_interfaceid": "4f0144e3-a50d-4d4b-a5de-1df8e869d27b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:20:23 np0005588920 nova_compute[226886]: 2026-01-20 15:20:23.209 226890 DEBUG nova.network.os_vif_util [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converting VIF {"id": "4f0144e3-a50d-4d4b-a5de-1df8e869d27b", "address": "fa:16:3e:35:53:6a", "network": {"id": "82c13a27-83c9-4924-96ac-e10cebda58f4", "bridge": "br-int", "label": "tempest-network-smoke--271331337", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f0144e3-a5", "ovs_interfaceid": "4f0144e3-a50d-4d4b-a5de-1df8e869d27b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:20:23 np0005588920 nova_compute[226886]: 2026-01-20 15:20:23.209 226890 DEBUG nova.network.os_vif_util [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:35:53:6a,bridge_name='br-int',has_traffic_filtering=True,id=4f0144e3-a50d-4d4b-a5de-1df8e869d27b,network=Network(82c13a27-83c9-4924-96ac-e10cebda58f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f0144e3-a5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:20:23 np0005588920 nova_compute[226886]: 2026-01-20 15:20:23.210 226890 DEBUG os_vif [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:35:53:6a,bridge_name='br-int',has_traffic_filtering=True,id=4f0144e3-a50d-4d4b-a5de-1df8e869d27b,network=Network(82c13a27-83c9-4924-96ac-e10cebda58f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f0144e3-a5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:20:23 np0005588920 nova_compute[226886]: 2026-01-20 15:20:23.210 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:23 np0005588920 nova_compute[226886]: 2026-01-20 15:20:23.212 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:20:23 np0005588920 nova_compute[226886]: 2026-01-20 15:20:23.212 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:20:23 np0005588920 nova_compute[226886]: 2026-01-20 15:20:23.213 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:23 np0005588920 nova_compute[226886]: 2026-01-20 15:20:23.214 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:23 np0005588920 nova_compute[226886]: 2026-01-20 15:20:23.214 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4f0144e3-a5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:20:23 np0005588920 nova_compute[226886]: 2026-01-20 15:20:23.215 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4f0144e3-a5, col_values=(('external_ids', {'iface-id': '4f0144e3-a50d-4d4b-a5de-1df8e869d27b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:35:53:6a', 'vm-uuid': 'ab956bf8-a577-4777-ba2b-6d9dc5d035c3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:20:23 np0005588920 nova_compute[226886]: 2026-01-20 15:20:23.216 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:23 np0005588920 NetworkManager[49076]: <info>  [1768922423.2171] manager: (tap4f0144e3-a5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/427)
Jan 20 10:20:23 np0005588920 nova_compute[226886]: 2026-01-20 15:20:23.218 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:20:23 np0005588920 nova_compute[226886]: 2026-01-20 15:20:23.220 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:23 np0005588920 nova_compute[226886]: 2026-01-20 15:20:23.221 226890 INFO os_vif [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:35:53:6a,bridge_name='br-int',has_traffic_filtering=True,id=4f0144e3-a50d-4d4b-a5de-1df8e869d27b,network=Network(82c13a27-83c9-4924-96ac-e10cebda58f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f0144e3-a5')#033[00m
Jan 20 10:20:23 np0005588920 nova_compute[226886]: 2026-01-20 15:20:23.274 226890 DEBUG nova.virt.libvirt.driver [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:20:23 np0005588920 nova_compute[226886]: 2026-01-20 15:20:23.275 226890 DEBUG nova.virt.libvirt.driver [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:20:23 np0005588920 nova_compute[226886]: 2026-01-20 15:20:23.275 226890 DEBUG nova.virt.libvirt.driver [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] No VIF found with MAC fa:16:3e:35:53:6a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:20:23 np0005588920 nova_compute[226886]: 2026-01-20 15:20:23.276 226890 INFO nova.virt.libvirt.driver [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Using config drive#033[00m
Jan 20 10:20:23 np0005588920 nova_compute[226886]: 2026-01-20 15:20:23.299 226890 DEBUG nova.storage.rbd_utils [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] rbd image ab956bf8-a577-4777-ba2b-6d9dc5d035c3_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:20:23 np0005588920 nova_compute[226886]: 2026-01-20 15:20:23.327 226890 DEBUG nova.objects.instance [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lazy-loading 'ec2_ids' on Instance uuid ab956bf8-a577-4777-ba2b-6d9dc5d035c3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:20:23 np0005588920 nova_compute[226886]: 2026-01-20 15:20:23.362 226890 DEBUG nova.objects.instance [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lazy-loading 'keypairs' on Instance uuid ab956bf8-a577-4777-ba2b-6d9dc5d035c3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:20:23 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:20:23 np0005588920 nova_compute[226886]: 2026-01-20 15:20:23.718 226890 INFO nova.virt.libvirt.driver [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Creating config drive at /var/lib/nova/instances/ab956bf8-a577-4777-ba2b-6d9dc5d035c3/disk.config#033[00m
Jan 20 10:20:23 np0005588920 nova_compute[226886]: 2026-01-20 15:20:23.723 226890 DEBUG oslo_concurrency.processutils [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ab956bf8-a577-4777-ba2b-6d9dc5d035c3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpv06qn2ye execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:20:23 np0005588920 nova_compute[226886]: 2026-01-20 15:20:23.859 226890 DEBUG oslo_concurrency.processutils [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ab956bf8-a577-4777-ba2b-6d9dc5d035c3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpv06qn2ye" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:20:23 np0005588920 nova_compute[226886]: 2026-01-20 15:20:23.898 226890 DEBUG nova.storage.rbd_utils [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] rbd image ab956bf8-a577-4777-ba2b-6d9dc5d035c3_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:20:23 np0005588920 nova_compute[226886]: 2026-01-20 15:20:23.901 226890 DEBUG oslo_concurrency.processutils [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/ab956bf8-a577-4777-ba2b-6d9dc5d035c3/disk.config ab956bf8-a577-4777-ba2b-6d9dc5d035c3_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:20:24 np0005588920 nova_compute[226886]: 2026-01-20 15:20:24.055 226890 DEBUG oslo_concurrency.processutils [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/ab956bf8-a577-4777-ba2b-6d9dc5d035c3/disk.config ab956bf8-a577-4777-ba2b-6d9dc5d035c3_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.154s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:20:24 np0005588920 nova_compute[226886]: 2026-01-20 15:20:24.056 226890 INFO nova.virt.libvirt.driver [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Deleting local config drive /var/lib/nova/instances/ab956bf8-a577-4777-ba2b-6d9dc5d035c3/disk.config because it was imported into RBD.#033[00m
Jan 20 10:20:24 np0005588920 kernel: tap4f0144e3-a5: entered promiscuous mode
Jan 20 10:20:24 np0005588920 NetworkManager[49076]: <info>  [1768922424.1076] manager: (tap4f0144e3-a5): new Tun device (/org/freedesktop/NetworkManager/Devices/428)
Jan 20 10:20:24 np0005588920 ovn_controller[133971]: 2026-01-20T15:20:24Z|00920|binding|INFO|Claiming lport 4f0144e3-a50d-4d4b-a5de-1df8e869d27b for this chassis.
Jan 20 10:20:24 np0005588920 ovn_controller[133971]: 2026-01-20T15:20:24Z|00921|binding|INFO|4f0144e3-a50d-4d4b-a5de-1df8e869d27b: Claiming fa:16:3e:35:53:6a 10.100.0.13
Jan 20 10:20:24 np0005588920 nova_compute[226886]: 2026-01-20 15:20:24.111 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:24.116 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:35:53:6a 10.100.0.13'], port_security=['fa:16:3e:35:53:6a 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'ab956bf8-a577-4777-ba2b-6d9dc5d035c3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-82c13a27-83c9-4924-96ac-e10cebda58f4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ed5feeeafe7448a8efb47ab975b0ead', 'neutron:revision_number': '5', 'neutron:security_group_ids': '7b59f953-14ac-473e-b4dc-024834dd332d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.219'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=18447654-2510-49ce-8582-3111d554be36, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=4f0144e3-a50d-4d4b-a5de-1df8e869d27b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:24.118 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 4f0144e3-a50d-4d4b-a5de-1df8e869d27b in datapath 82c13a27-83c9-4924-96ac-e10cebda58f4 bound to our chassis#033[00m
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:24.119 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 82c13a27-83c9-4924-96ac-e10cebda58f4#033[00m
Jan 20 10:20:24 np0005588920 ovn_controller[133971]: 2026-01-20T15:20:24Z|00922|binding|INFO|Setting lport 4f0144e3-a50d-4d4b-a5de-1df8e869d27b ovn-installed in OVS
Jan 20 10:20:24 np0005588920 ovn_controller[133971]: 2026-01-20T15:20:24Z|00923|binding|INFO|Setting lport 4f0144e3-a50d-4d4b-a5de-1df8e869d27b up in Southbound
Jan 20 10:20:24 np0005588920 nova_compute[226886]: 2026-01-20 15:20:24.127 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:24 np0005588920 nova_compute[226886]: 2026-01-20 15:20:24.130 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:24.130 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[bbd7bac2-6183-4b2e-b2f9-e0977e34c861]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:24.132 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap82c13a27-81 in ovnmeta-82c13a27-83c9-4924-96ac-e10cebda58f4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:24.133 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap82c13a27-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:24.133 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a4711691-2151-4295-a463-a1515e01957d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:24.134 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[604c6d79-4cc2-4fe2-aa9a-c3633ab4c49c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:24.145 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[eaecb19e-a3ec-49ec-b769-efc7d5768a60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:24 np0005588920 systemd-udevd[300094]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:20:24 np0005588920 systemd-machined[196121]: New machine qemu-95-instance-000000c4.
Jan 20 10:20:24 np0005588920 NetworkManager[49076]: <info>  [1768922424.1584] device (tap4f0144e3-a5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:20:24 np0005588920 NetworkManager[49076]: <info>  [1768922424.1593] device (tap4f0144e3-a5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:24.167 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[6e1816cf-d92c-4356-b6b1-5ce4e3aa8a91]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:24 np0005588920 systemd[1]: Started Virtual Machine qemu-95-instance-000000c4.
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:24.195 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[2312a390-9016-4896-a3a7-f5905e8b0d1f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:24 np0005588920 NetworkManager[49076]: <info>  [1768922424.2442] manager: (tap82c13a27-80): new Veth device (/org/freedesktop/NetworkManager/Devices/429)
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:24.243 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f214c587-331d-4f49-b04b-2e1eac00ed90]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:24.274 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[7a6fb507-9489-4f7b-8171-a7228ab125cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:24.276 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[eb70ad69-2747-4609-892f-d35b839e2aa9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:24 np0005588920 NetworkManager[49076]: <info>  [1768922424.2984] device (tap82c13a27-80): carrier: link connected
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:24.305 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[064b5f69-d989-4d2b-bd3f-0cd80de483f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:24.322 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e7c8d008-3555-4270-965c-869747af9155]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap82c13a27-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:69:ce:23'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 289], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 746598, 'reachable_time': 29583, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 300125, 'error': None, 'target': 'ovnmeta-82c13a27-83c9-4924-96ac-e10cebda58f4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:24.338 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[963a527b-482d-4d4b-ba13-1b9db1f16eb6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe69:ce23'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 746598, 'tstamp': 746598}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 300126, 'error': None, 'target': 'ovnmeta-82c13a27-83c9-4924-96ac-e10cebda58f4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:24.359 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c4f79c2a-71c9-48b4-a391-13585f66e63a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap82c13a27-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:69:ce:23'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 289], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 746598, 'reachable_time': 29583, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 300127, 'error': None, 'target': 'ovnmeta-82c13a27-83c9-4924-96ac-e10cebda58f4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:24.394 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[889fb9df-4b03-4603-8168-d7d8efd2f80b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:24.444 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a8c1ed4f-098b-4bf7-affd-3b1d9a75e6d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:24.446 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap82c13a27-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:24.447 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:24.447 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap82c13a27-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:20:24 np0005588920 kernel: tap82c13a27-80: entered promiscuous mode
Jan 20 10:20:24 np0005588920 NetworkManager[49076]: <info>  [1768922424.4517] manager: (tap82c13a27-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/430)
Jan 20 10:20:24 np0005588920 nova_compute[226886]: 2026-01-20 15:20:24.453 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:24 np0005588920 nova_compute[226886]: 2026-01-20 15:20:24.463 226890 DEBUG nova.compute.manager [req-cb25079c-aeda-4193-81e0-9b5e7880a762 req-3a7ff731-76a5-43f6-a548-760fc6a53fd7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Received event network-vif-plugged-4f0144e3-a50d-4d4b-a5de-1df8e869d27b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:20:24 np0005588920 nova_compute[226886]: 2026-01-20 15:20:24.464 226890 DEBUG oslo_concurrency.lockutils [req-cb25079c-aeda-4193-81e0-9b5e7880a762 req-3a7ff731-76a5-43f6-a548-760fc6a53fd7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "ab956bf8-a577-4777-ba2b-6d9dc5d035c3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:20:24 np0005588920 nova_compute[226886]: 2026-01-20 15:20:24.465 226890 DEBUG oslo_concurrency.lockutils [req-cb25079c-aeda-4193-81e0-9b5e7880a762 req-3a7ff731-76a5-43f6-a548-760fc6a53fd7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "ab956bf8-a577-4777-ba2b-6d9dc5d035c3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:20:24 np0005588920 nova_compute[226886]: 2026-01-20 15:20:24.465 226890 DEBUG oslo_concurrency.lockutils [req-cb25079c-aeda-4193-81e0-9b5e7880a762 req-3a7ff731-76a5-43f6-a548-760fc6a53fd7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "ab956bf8-a577-4777-ba2b-6d9dc5d035c3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:20:24 np0005588920 nova_compute[226886]: 2026-01-20 15:20:24.465 226890 DEBUG nova.compute.manager [req-cb25079c-aeda-4193-81e0-9b5e7880a762 req-3a7ff731-76a5-43f6-a548-760fc6a53fd7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] No waiting events found dispatching network-vif-plugged-4f0144e3-a50d-4d4b-a5de-1df8e869d27b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:20:24 np0005588920 nova_compute[226886]: 2026-01-20 15:20:24.465 226890 WARNING nova.compute.manager [req-cb25079c-aeda-4193-81e0-9b5e7880a762 req-3a7ff731-76a5-43f6-a548-760fc6a53fd7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Received unexpected event network-vif-plugged-4f0144e3-a50d-4d4b-a5de-1df8e869d27b for instance with vm_state active and task_state rebuild_spawning.#033[00m
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:24.465 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap82c13a27-80, col_values=(('external_ids', {'iface-id': 'c1d1ab2c-11f3-48a0-9554-4aff1d051db4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:20:24 np0005588920 ovn_controller[133971]: 2026-01-20T15:20:24Z|00924|binding|INFO|Releasing lport c1d1ab2c-11f3-48a0-9554-4aff1d051db4 from this chassis (sb_readonly=0)
Jan 20 10:20:24 np0005588920 nova_compute[226886]: 2026-01-20 15:20:24.467 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:24.468 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/82c13a27-83c9-4924-96ac-e10cebda58f4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/82c13a27-83c9-4924-96ac-e10cebda58f4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:24.469 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[93bc0842-a9d7-43d4-bcf0-2fa007706d56]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:24.470 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-82c13a27-83c9-4924-96ac-e10cebda58f4
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/82c13a27-83c9-4924-96ac-e10cebda58f4.pid.haproxy
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID 82c13a27-83c9-4924-96ac-e10cebda58f4
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:20:24 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:24.470 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-82c13a27-83c9-4924-96ac-e10cebda58f4', 'env', 'PROCESS_TAG=haproxy-82c13a27-83c9-4924-96ac-e10cebda58f4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/82c13a27-83c9-4924-96ac-e10cebda58f4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:20:24 np0005588920 nova_compute[226886]: 2026-01-20 15:20:24.480 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:24 np0005588920 nova_compute[226886]: 2026-01-20 15:20:24.670 226890 DEBUG nova.virt.libvirt.host [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Removed pending event for ab956bf8-a577-4777-ba2b-6d9dc5d035c3 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 20 10:20:24 np0005588920 nova_compute[226886]: 2026-01-20 15:20:24.670 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768922424.669917, ab956bf8-a577-4777-ba2b-6d9dc5d035c3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:20:24 np0005588920 nova_compute[226886]: 2026-01-20 15:20:24.671 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:20:24 np0005588920 nova_compute[226886]: 2026-01-20 15:20:24.674 226890 DEBUG nova.compute.manager [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:20:24 np0005588920 nova_compute[226886]: 2026-01-20 15:20:24.674 226890 DEBUG nova.virt.libvirt.driver [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:20:24 np0005588920 nova_compute[226886]: 2026-01-20 15:20:24.678 226890 INFO nova.virt.libvirt.driver [-] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Instance spawned successfully.#033[00m
Jan 20 10:20:24 np0005588920 nova_compute[226886]: 2026-01-20 15:20:24.678 226890 DEBUG nova.virt.libvirt.driver [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 10:20:24 np0005588920 nova_compute[226886]: 2026-01-20 15:20:24.697 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:20:24 np0005588920 nova_compute[226886]: 2026-01-20 15:20:24.700 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:20:24 np0005588920 nova_compute[226886]: 2026-01-20 15:20:24.706 226890 DEBUG nova.virt.libvirt.driver [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:20:24 np0005588920 nova_compute[226886]: 2026-01-20 15:20:24.706 226890 DEBUG nova.virt.libvirt.driver [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:20:24 np0005588920 nova_compute[226886]: 2026-01-20 15:20:24.707 226890 DEBUG nova.virt.libvirt.driver [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:20:24 np0005588920 nova_compute[226886]: 2026-01-20 15:20:24.707 226890 DEBUG nova.virt.libvirt.driver [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:20:24 np0005588920 nova_compute[226886]: 2026-01-20 15:20:24.707 226890 DEBUG nova.virt.libvirt.driver [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:20:24 np0005588920 nova_compute[226886]: 2026-01-20 15:20:24.708 226890 DEBUG nova.virt.libvirt.driver [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:20:24 np0005588920 nova_compute[226886]: 2026-01-20 15:20:24.741 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 20 10:20:24 np0005588920 nova_compute[226886]: 2026-01-20 15:20:24.741 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768922424.673902, ab956bf8-a577-4777-ba2b-6d9dc5d035c3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:20:24 np0005588920 nova_compute[226886]: 2026-01-20 15:20:24.741 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] VM Started (Lifecycle Event)#033[00m
Jan 20 10:20:24 np0005588920 nova_compute[226886]: 2026-01-20 15:20:24.776 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:20:24 np0005588920 nova_compute[226886]: 2026-01-20 15:20:24.781 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:20:24 np0005588920 nova_compute[226886]: 2026-01-20 15:20:24.785 226890 DEBUG nova.compute.manager [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:20:24 np0005588920 nova_compute[226886]: 2026-01-20 15:20:24.811 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 20 10:20:24 np0005588920 podman[300199]: 2026-01-20 15:20:24.838577259 +0000 UTC m=+0.049567633 container create 435ac2112293f55bc4321e12051788edd023f88c902afcfbfba466902909548c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82c13a27-83c9-4924-96ac-e10cebda58f4, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 20 10:20:24 np0005588920 nova_compute[226886]: 2026-01-20 15:20:24.856 226890 DEBUG oslo_concurrency.lockutils [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:20:24 np0005588920 nova_compute[226886]: 2026-01-20 15:20:24.856 226890 DEBUG oslo_concurrency.lockutils [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:20:24 np0005588920 nova_compute[226886]: 2026-01-20 15:20:24.856 226890 DEBUG nova.objects.instance [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 20 10:20:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:20:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:24.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:20:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:24.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:24 np0005588920 systemd[1]: Started libpod-conmon-435ac2112293f55bc4321e12051788edd023f88c902afcfbfba466902909548c.scope.
Jan 20 10:20:24 np0005588920 systemd[1]: Started libcrun container.
Jan 20 10:20:24 np0005588920 podman[300199]: 2026-01-20 15:20:24.813235782 +0000 UTC m=+0.024226176 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:20:24 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/113ba012be44705bd12ff639b011e0e19afac245a3f8350b6188f0ad268e74cc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:20:24 np0005588920 nova_compute[226886]: 2026-01-20 15:20:24.923 226890 DEBUG oslo_concurrency.lockutils [None req-fffa88a0-be3a-4eae-bfbe-a32d5a1f20b5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.067s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:20:24 np0005588920 podman[300199]: 2026-01-20 15:20:24.927153652 +0000 UTC m=+0.138144056 container init 435ac2112293f55bc4321e12051788edd023f88c902afcfbfba466902909548c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82c13a27-83c9-4924-96ac-e10cebda58f4, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 20 10:20:24 np0005588920 podman[300199]: 2026-01-20 15:20:24.933846901 +0000 UTC m=+0.144837275 container start 435ac2112293f55bc4321e12051788edd023f88c902afcfbfba466902909548c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82c13a27-83c9-4924-96ac-e10cebda58f4, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:20:24 np0005588920 neutron-haproxy-ovnmeta-82c13a27-83c9-4924-96ac-e10cebda58f4[300214]: [NOTICE]   (300218) : New worker (300220) forked
Jan 20 10:20:24 np0005588920 neutron-haproxy-ovnmeta-82c13a27-83c9-4924-96ac-e10cebda58f4[300214]: [NOTICE]   (300218) : Loading success.
Jan 20 10:20:26 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:20:26 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:20:26 np0005588920 nova_compute[226886]: 2026-01-20 15:20:26.552 226890 DEBUG nova.compute.manager [req-41399ce9-8f73-43f6-9a36-2c5527dabaad req-aa23cef5-487e-4e64-a78f-3686067104c1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Received event network-vif-plugged-4f0144e3-a50d-4d4b-a5de-1df8e869d27b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:20:26 np0005588920 nova_compute[226886]: 2026-01-20 15:20:26.553 226890 DEBUG oslo_concurrency.lockutils [req-41399ce9-8f73-43f6-9a36-2c5527dabaad req-aa23cef5-487e-4e64-a78f-3686067104c1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "ab956bf8-a577-4777-ba2b-6d9dc5d035c3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:20:26 np0005588920 nova_compute[226886]: 2026-01-20 15:20:26.553 226890 DEBUG oslo_concurrency.lockutils [req-41399ce9-8f73-43f6-9a36-2c5527dabaad req-aa23cef5-487e-4e64-a78f-3686067104c1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "ab956bf8-a577-4777-ba2b-6d9dc5d035c3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:20:26 np0005588920 nova_compute[226886]: 2026-01-20 15:20:26.553 226890 DEBUG oslo_concurrency.lockutils [req-41399ce9-8f73-43f6-9a36-2c5527dabaad req-aa23cef5-487e-4e64-a78f-3686067104c1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "ab956bf8-a577-4777-ba2b-6d9dc5d035c3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:20:26 np0005588920 nova_compute[226886]: 2026-01-20 15:20:26.553 226890 DEBUG nova.compute.manager [req-41399ce9-8f73-43f6-9a36-2c5527dabaad req-aa23cef5-487e-4e64-a78f-3686067104c1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] No waiting events found dispatching network-vif-plugged-4f0144e3-a50d-4d4b-a5de-1df8e869d27b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:20:26 np0005588920 nova_compute[226886]: 2026-01-20 15:20:26.553 226890 WARNING nova.compute.manager [req-41399ce9-8f73-43f6-9a36-2c5527dabaad req-aa23cef5-487e-4e64-a78f-3686067104c1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Received unexpected event network-vif-plugged-4f0144e3-a50d-4d4b-a5de-1df8e869d27b for instance with vm_state active and task_state None.#033[00m
Jan 20 10:20:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:20:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:26.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:20:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:26.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:28 np0005588920 nova_compute[226886]: 2026-01-20 15:20:28.213 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:28 np0005588920 nova_compute[226886]: 2026-01-20 15:20:28.215 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:28 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:20:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:20:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:28.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:20:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:20:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:28.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:20:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:30.126 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '66'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:20:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:20:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:30.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:20:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:30.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:20:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:32.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:20:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:20:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:32.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:20:33 np0005588920 nova_compute[226886]: 2026-01-20 15:20:33.215 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:33 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:20:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:20:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:20:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:34 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:34.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:34.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:34 np0005588920 podman[300281]: 2026-01-20 15:20:34.996094762 +0000 UTC m=+0.086507146 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:20:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:20:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:36.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:20:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:36 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:36.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:37 np0005588920 ovn_controller[133971]: 2026-01-20T15:20:37Z|00126|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:35:53:6a 10.100.0.13
Jan 20 10:20:37 np0005588920 ovn_controller[133971]: 2026-01-20T15:20:37Z|00127|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:35:53:6a 10.100.0.13
Jan 20 10:20:38 np0005588920 nova_compute[226886]: 2026-01-20 15:20:38.216 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:38 np0005588920 nova_compute[226886]: 2026-01-20 15:20:38.218 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:38 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:20:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:20:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:38.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:20:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:20:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:38.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:20:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:20:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:20:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:40 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:40.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:40.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:20:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:20:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:42 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:42.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:42.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:43 np0005588920 nova_compute[226886]: 2026-01-20 15:20:43.217 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:43 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:20:44 np0005588920 nova_compute[226886]: 2026-01-20 15:20:44.027 226890 INFO nova.compute.manager [None req-89ccdc85-7125-4027-90df-d391fb84ff04 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Get console output#033[00m
Jan 20 10:20:44 np0005588920 nova_compute[226886]: 2026-01-20 15:20:44.034 260344 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 20 10:20:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:20:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:20:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:44 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:44.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:44.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:46 np0005588920 nova_compute[226886]: 2026-01-20 15:20:46.151 226890 DEBUG nova.compute.manager [req-5a45be29-d5e3-4a56-8691-9f0f7f80a4b9 req-6c6b14c8-9d14-4eca-9523-2da9a2a4ff0d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Received event network-changed-4f0144e3-a50d-4d4b-a5de-1df8e869d27b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:20:46 np0005588920 nova_compute[226886]: 2026-01-20 15:20:46.151 226890 DEBUG nova.compute.manager [req-5a45be29-d5e3-4a56-8691-9f0f7f80a4b9 req-6c6b14c8-9d14-4eca-9523-2da9a2a4ff0d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Refreshing instance network info cache due to event network-changed-4f0144e3-a50d-4d4b-a5de-1df8e869d27b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:20:46 np0005588920 nova_compute[226886]: 2026-01-20 15:20:46.151 226890 DEBUG oslo_concurrency.lockutils [req-5a45be29-d5e3-4a56-8691-9f0f7f80a4b9 req-6c6b14c8-9d14-4eca-9523-2da9a2a4ff0d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-ab956bf8-a577-4777-ba2b-6d9dc5d035c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:20:46 np0005588920 nova_compute[226886]: 2026-01-20 15:20:46.151 226890 DEBUG oslo_concurrency.lockutils [req-5a45be29-d5e3-4a56-8691-9f0f7f80a4b9 req-6c6b14c8-9d14-4eca-9523-2da9a2a4ff0d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-ab956bf8-a577-4777-ba2b-6d9dc5d035c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:20:46 np0005588920 nova_compute[226886]: 2026-01-20 15:20:46.151 226890 DEBUG nova.network.neutron [req-5a45be29-d5e3-4a56-8691-9f0f7f80a4b9 req-6c6b14c8-9d14-4eca-9523-2da9a2a4ff0d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Refreshing network info cache for port 4f0144e3-a50d-4d4b-a5de-1df8e869d27b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:20:46 np0005588920 nova_compute[226886]: 2026-01-20 15:20:46.244 226890 DEBUG oslo_concurrency.lockutils [None req-636308f1-404d-46c5-9f82-52914b0af26d 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "ab956bf8-a577-4777-ba2b-6d9dc5d035c3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:20:46 np0005588920 nova_compute[226886]: 2026-01-20 15:20:46.244 226890 DEBUG oslo_concurrency.lockutils [None req-636308f1-404d-46c5-9f82-52914b0af26d 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "ab956bf8-a577-4777-ba2b-6d9dc5d035c3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:20:46 np0005588920 nova_compute[226886]: 2026-01-20 15:20:46.244 226890 DEBUG oslo_concurrency.lockutils [None req-636308f1-404d-46c5-9f82-52914b0af26d 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "ab956bf8-a577-4777-ba2b-6d9dc5d035c3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:20:46 np0005588920 nova_compute[226886]: 2026-01-20 15:20:46.244 226890 DEBUG oslo_concurrency.lockutils [None req-636308f1-404d-46c5-9f82-52914b0af26d 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "ab956bf8-a577-4777-ba2b-6d9dc5d035c3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:20:46 np0005588920 nova_compute[226886]: 2026-01-20 15:20:46.245 226890 DEBUG oslo_concurrency.lockutils [None req-636308f1-404d-46c5-9f82-52914b0af26d 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "ab956bf8-a577-4777-ba2b-6d9dc5d035c3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:20:46 np0005588920 nova_compute[226886]: 2026-01-20 15:20:46.246 226890 INFO nova.compute.manager [None req-636308f1-404d-46c5-9f82-52914b0af26d 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Terminating instance#033[00m
Jan 20 10:20:46 np0005588920 nova_compute[226886]: 2026-01-20 15:20:46.246 226890 DEBUG nova.compute.manager [None req-636308f1-404d-46c5-9f82-52914b0af26d 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:20:46 np0005588920 kernel: tap4f0144e3-a5 (unregistering): left promiscuous mode
Jan 20 10:20:46 np0005588920 NetworkManager[49076]: <info>  [1768922446.3036] device (tap4f0144e3-a5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:20:46 np0005588920 ovn_controller[133971]: 2026-01-20T15:20:46Z|00925|binding|INFO|Releasing lport 4f0144e3-a50d-4d4b-a5de-1df8e869d27b from this chassis (sb_readonly=0)
Jan 20 10:20:46 np0005588920 nova_compute[226886]: 2026-01-20 15:20:46.310 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:46 np0005588920 ovn_controller[133971]: 2026-01-20T15:20:46Z|00926|binding|INFO|Setting lport 4f0144e3-a50d-4d4b-a5de-1df8e869d27b down in Southbound
Jan 20 10:20:46 np0005588920 ovn_controller[133971]: 2026-01-20T15:20:46Z|00927|binding|INFO|Removing iface tap4f0144e3-a5 ovn-installed in OVS
Jan 20 10:20:46 np0005588920 nova_compute[226886]: 2026-01-20 15:20:46.313 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:46 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:46.318 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:35:53:6a 10.100.0.13'], port_security=['fa:16:3e:35:53:6a 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'ab956bf8-a577-4777-ba2b-6d9dc5d035c3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-82c13a27-83c9-4924-96ac-e10cebda58f4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ed5feeeafe7448a8efb47ab975b0ead', 'neutron:revision_number': '6', 'neutron:security_group_ids': '7b59f953-14ac-473e-b4dc-024834dd332d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=18447654-2510-49ce-8582-3111d554be36, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=4f0144e3-a50d-4d4b-a5de-1df8e869d27b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:20:46 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:46.319 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 4f0144e3-a50d-4d4b-a5de-1df8e869d27b in datapath 82c13a27-83c9-4924-96ac-e10cebda58f4 unbound from our chassis#033[00m
Jan 20 10:20:46 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:46.320 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 82c13a27-83c9-4924-96ac-e10cebda58f4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:20:46 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:46.322 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[4381aa1f-296e-4650-b9ba-d60036b46a6f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:46 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:46.323 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-82c13a27-83c9-4924-96ac-e10cebda58f4 namespace which is not needed anymore#033[00m
Jan 20 10:20:46 np0005588920 nova_compute[226886]: 2026-01-20 15:20:46.336 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:46 np0005588920 systemd[1]: machine-qemu\x2d95\x2dinstance\x2d000000c4.scope: Deactivated successfully.
Jan 20 10:20:46 np0005588920 systemd[1]: machine-qemu\x2d95\x2dinstance\x2d000000c4.scope: Consumed 13.554s CPU time.
Jan 20 10:20:46 np0005588920 systemd-machined[196121]: Machine qemu-95-instance-000000c4 terminated.
Jan 20 10:20:46 np0005588920 podman[300307]: 2026-01-20 15:20:46.384453546 +0000 UTC m=+0.058273739 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 20 10:20:46 np0005588920 neutron-haproxy-ovnmeta-82c13a27-83c9-4924-96ac-e10cebda58f4[300214]: [NOTICE]   (300218) : haproxy version is 2.8.14-c23fe91
Jan 20 10:20:46 np0005588920 neutron-haproxy-ovnmeta-82c13a27-83c9-4924-96ac-e10cebda58f4[300214]: [NOTICE]   (300218) : path to executable is /usr/sbin/haproxy
Jan 20 10:20:46 np0005588920 neutron-haproxy-ovnmeta-82c13a27-83c9-4924-96ac-e10cebda58f4[300214]: [WARNING]  (300218) : Exiting Master process...
Jan 20 10:20:46 np0005588920 neutron-haproxy-ovnmeta-82c13a27-83c9-4924-96ac-e10cebda58f4[300214]: [ALERT]    (300218) : Current worker (300220) exited with code 143 (Terminated)
Jan 20 10:20:46 np0005588920 neutron-haproxy-ovnmeta-82c13a27-83c9-4924-96ac-e10cebda58f4[300214]: [WARNING]  (300218) : All workers exited. Exiting... (0)
Jan 20 10:20:46 np0005588920 systemd[1]: libpod-435ac2112293f55bc4321e12051788edd023f88c902afcfbfba466902909548c.scope: Deactivated successfully.
Jan 20 10:20:46 np0005588920 podman[300349]: 2026-01-20 15:20:46.450963426 +0000 UTC m=+0.040668971 container died 435ac2112293f55bc4321e12051788edd023f88c902afcfbfba466902909548c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82c13a27-83c9-4924-96ac-e10cebda58f4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:20:46 np0005588920 NetworkManager[49076]: <info>  [1768922446.4662] manager: (tap4f0144e3-a5): new Tun device (/org/freedesktop/NetworkManager/Devices/431)
Jan 20 10:20:46 np0005588920 nova_compute[226886]: 2026-01-20 15:20:46.467 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:46 np0005588920 nova_compute[226886]: 2026-01-20 15:20:46.473 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:46 np0005588920 systemd[1]: var-lib-containers-storage-overlay-113ba012be44705bd12ff639b011e0e19afac245a3f8350b6188f0ad268e74cc-merged.mount: Deactivated successfully.
Jan 20 10:20:46 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-435ac2112293f55bc4321e12051788edd023f88c902afcfbfba466902909548c-userdata-shm.mount: Deactivated successfully.
Jan 20 10:20:46 np0005588920 podman[300349]: 2026-01-20 15:20:46.482853937 +0000 UTC m=+0.072559472 container cleanup 435ac2112293f55bc4321e12051788edd023f88c902afcfbfba466902909548c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82c13a27-83c9-4924-96ac-e10cebda58f4, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 20 10:20:46 np0005588920 nova_compute[226886]: 2026-01-20 15:20:46.484 226890 INFO nova.virt.libvirt.driver [-] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Instance destroyed successfully.#033[00m
Jan 20 10:20:46 np0005588920 nova_compute[226886]: 2026-01-20 15:20:46.484 226890 DEBUG nova.objects.instance [None req-636308f1-404d-46c5-9f82-52914b0af26d 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lazy-loading 'resources' on Instance uuid ab956bf8-a577-4777-ba2b-6d9dc5d035c3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:20:46 np0005588920 systemd[1]: libpod-conmon-435ac2112293f55bc4321e12051788edd023f88c902afcfbfba466902909548c.scope: Deactivated successfully.
Jan 20 10:20:46 np0005588920 nova_compute[226886]: 2026-01-20 15:20:46.508 226890 DEBUG nova.virt.libvirt.vif [None req-636308f1-404d-46c5-9f82-52914b0af26d 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-20T15:19:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1243081842',display_name='tempest-TestNetworkAdvancedServerOps-server-1243081842',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1243081842',id=196,image_ref='26699514-f465-4b50-98b7-36f2cfc6a308',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBkV5IF/PoEe/efZhnfKLeJldGZu0kKuhfpPH+OgoUHVkAG0n9Dg/cy0touqzeR1Y19Ga3k7oWWkeK3V3PfTtFX8AqRCXGo4UIGa3DLJqI8BlqSEN/MrM2dwEtiVEOyBPg==',key_name='tempest-TestNetworkAdvancedServerOps-339065355',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:20:24Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1ed5feeeafe7448a8efb47ab975b0ead',ramdisk_id='',reservation_id='r-f24dba2l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='26699514-f465-4b50-98b7-36f2cfc6a308',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-175282664',owner_user_name='tempest-TestNetworkAdvancedServerOps-175282664-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:20:24Z,user_data=None,user_id='442a7a5cb8ea426a82be9762b262d171',uuid=ab956bf8-a577-4777-ba2b-6d9dc5d035c3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4f0144e3-a50d-4d4b-a5de-1df8e869d27b", "address": "fa:16:3e:35:53:6a", "network": {"id": "82c13a27-83c9-4924-96ac-e10cebda58f4", "bridge": "br-int", "label": "tempest-network-smoke--271331337", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f0144e3-a5", "ovs_interfaceid": "4f0144e3-a50d-4d4b-a5de-1df8e869d27b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:20:46 np0005588920 nova_compute[226886]: 2026-01-20 15:20:46.510 226890 DEBUG nova.network.os_vif_util [None req-636308f1-404d-46c5-9f82-52914b0af26d 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converting VIF {"id": "4f0144e3-a50d-4d4b-a5de-1df8e869d27b", "address": "fa:16:3e:35:53:6a", "network": {"id": "82c13a27-83c9-4924-96ac-e10cebda58f4", "bridge": "br-int", "label": "tempest-network-smoke--271331337", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f0144e3-a5", "ovs_interfaceid": "4f0144e3-a50d-4d4b-a5de-1df8e869d27b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:20:46 np0005588920 nova_compute[226886]: 2026-01-20 15:20:46.510 226890 DEBUG nova.network.os_vif_util [None req-636308f1-404d-46c5-9f82-52914b0af26d 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:35:53:6a,bridge_name='br-int',has_traffic_filtering=True,id=4f0144e3-a50d-4d4b-a5de-1df8e869d27b,network=Network(82c13a27-83c9-4924-96ac-e10cebda58f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f0144e3-a5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:20:46 np0005588920 nova_compute[226886]: 2026-01-20 15:20:46.511 226890 DEBUG os_vif [None req-636308f1-404d-46c5-9f82-52914b0af26d 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:35:53:6a,bridge_name='br-int',has_traffic_filtering=True,id=4f0144e3-a50d-4d4b-a5de-1df8e869d27b,network=Network(82c13a27-83c9-4924-96ac-e10cebda58f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f0144e3-a5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:20:46 np0005588920 nova_compute[226886]: 2026-01-20 15:20:46.514 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:46 np0005588920 nova_compute[226886]: 2026-01-20 15:20:46.514 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4f0144e3-a5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:20:46 np0005588920 nova_compute[226886]: 2026-01-20 15:20:46.515 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:46 np0005588920 nova_compute[226886]: 2026-01-20 15:20:46.518 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:20:46 np0005588920 nova_compute[226886]: 2026-01-20 15:20:46.520 226890 INFO os_vif [None req-636308f1-404d-46c5-9f82-52914b0af26d 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:35:53:6a,bridge_name='br-int',has_traffic_filtering=True,id=4f0144e3-a50d-4d4b-a5de-1df8e869d27b,network=Network(82c13a27-83c9-4924-96ac-e10cebda58f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f0144e3-a5')#033[00m
Jan 20 10:20:46 np0005588920 podman[300385]: 2026-01-20 15:20:46.55088093 +0000 UTC m=+0.041445863 container remove 435ac2112293f55bc4321e12051788edd023f88c902afcfbfba466902909548c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82c13a27-83c9-4924-96ac-e10cebda58f4, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 20 10:20:46 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:46.556 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e184ba34-401c-44e8-a543-edeed0745ba8]: (4, ('Tue Jan 20 03:20:46 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-82c13a27-83c9-4924-96ac-e10cebda58f4 (435ac2112293f55bc4321e12051788edd023f88c902afcfbfba466902909548c)\n435ac2112293f55bc4321e12051788edd023f88c902afcfbfba466902909548c\nTue Jan 20 03:20:46 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-82c13a27-83c9-4924-96ac-e10cebda58f4 (435ac2112293f55bc4321e12051788edd023f88c902afcfbfba466902909548c)\n435ac2112293f55bc4321e12051788edd023f88c902afcfbfba466902909548c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:46 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:46.558 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f5309150-8368-4cb0-ba0f-58d92a36d0a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:46 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:46.559 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap82c13a27-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:20:46 np0005588920 kernel: tap82c13a27-80: left promiscuous mode
Jan 20 10:20:46 np0005588920 nova_compute[226886]: 2026-01-20 15:20:46.561 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:46 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:46.564 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2f0f5107-b4c0-45ba-b51d-8421fbb189e9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:46 np0005588920 nova_compute[226886]: 2026-01-20 15:20:46.574 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:46 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:46.588 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5c04818d-0c67-4c2a-9027-7eeefa419c02]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:46 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:46.589 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[afb9aa31-0129-472a-934f-7a8bd77715e2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:46 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:46.603 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[04071247-f5a3-4799-a50d-5cd392e4a7a9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 746587, 'reachable_time': 36643, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 300418, 'error': None, 'target': 'ovnmeta-82c13a27-83c9-4924-96ac-e10cebda58f4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:46 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:46.605 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-82c13a27-83c9-4924-96ac-e10cebda58f4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:20:46 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:20:46.605 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[179e470c-4129-4040-be7d-ee4fc0faabef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:46 np0005588920 systemd[1]: run-netns-ovnmeta\x2d82c13a27\x2d83c9\x2d4924\x2d96ac\x2de10cebda58f4.mount: Deactivated successfully.
Jan 20 10:20:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:20:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:20:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:46.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:20:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:20:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:46.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:46 np0005588920 nova_compute[226886]: 2026-01-20 15:20:46.915 226890 INFO nova.virt.libvirt.driver [None req-636308f1-404d-46c5-9f82-52914b0af26d 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Deleting instance files /var/lib/nova/instances/ab956bf8-a577-4777-ba2b-6d9dc5d035c3_del#033[00m
Jan 20 10:20:46 np0005588920 nova_compute[226886]: 2026-01-20 15:20:46.916 226890 INFO nova.virt.libvirt.driver [None req-636308f1-404d-46c5-9f82-52914b0af26d 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Deletion of /var/lib/nova/instances/ab956bf8-a577-4777-ba2b-6d9dc5d035c3_del complete#033[00m
Jan 20 10:20:46 np0005588920 nova_compute[226886]: 2026-01-20 15:20:46.969 226890 INFO nova.compute.manager [None req-636308f1-404d-46c5-9f82-52914b0af26d 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Took 0.72 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:20:46 np0005588920 nova_compute[226886]: 2026-01-20 15:20:46.969 226890 DEBUG oslo.service.loopingcall [None req-636308f1-404d-46c5-9f82-52914b0af26d 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:20:46 np0005588920 nova_compute[226886]: 2026-01-20 15:20:46.970 226890 DEBUG nova.compute.manager [-] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:20:46 np0005588920 nova_compute[226886]: 2026-01-20 15:20:46.970 226890 DEBUG nova.network.neutron [-] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:20:47 np0005588920 nova_compute[226886]: 2026-01-20 15:20:47.199 226890 DEBUG nova.compute.manager [req-46fd1a22-e056-434d-9c3c-64d6141ab5a1 req-e0334a00-cc95-427c-9f5e-a7e4265e73ad 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Received event network-vif-unplugged-4f0144e3-a50d-4d4b-a5de-1df8e869d27b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:20:47 np0005588920 nova_compute[226886]: 2026-01-20 15:20:47.199 226890 DEBUG oslo_concurrency.lockutils [req-46fd1a22-e056-434d-9c3c-64d6141ab5a1 req-e0334a00-cc95-427c-9f5e-a7e4265e73ad 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "ab956bf8-a577-4777-ba2b-6d9dc5d035c3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:20:47 np0005588920 nova_compute[226886]: 2026-01-20 15:20:47.199 226890 DEBUG oslo_concurrency.lockutils [req-46fd1a22-e056-434d-9c3c-64d6141ab5a1 req-e0334a00-cc95-427c-9f5e-a7e4265e73ad 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "ab956bf8-a577-4777-ba2b-6d9dc5d035c3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:20:47 np0005588920 nova_compute[226886]: 2026-01-20 15:20:47.200 226890 DEBUG oslo_concurrency.lockutils [req-46fd1a22-e056-434d-9c3c-64d6141ab5a1 req-e0334a00-cc95-427c-9f5e-a7e4265e73ad 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "ab956bf8-a577-4777-ba2b-6d9dc5d035c3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:20:47 np0005588920 nova_compute[226886]: 2026-01-20 15:20:47.200 226890 DEBUG nova.compute.manager [req-46fd1a22-e056-434d-9c3c-64d6141ab5a1 req-e0334a00-cc95-427c-9f5e-a7e4265e73ad 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] No waiting events found dispatching network-vif-unplugged-4f0144e3-a50d-4d4b-a5de-1df8e869d27b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:20:47 np0005588920 nova_compute[226886]: 2026-01-20 15:20:47.200 226890 DEBUG nova.compute.manager [req-46fd1a22-e056-434d-9c3c-64d6141ab5a1 req-e0334a00-cc95-427c-9f5e-a7e4265e73ad 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Received event network-vif-unplugged-4f0144e3-a50d-4d4b-a5de-1df8e869d27b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:20:48 np0005588920 nova_compute[226886]: 2026-01-20 15:20:48.221 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:48 np0005588920 nova_compute[226886]: 2026-01-20 15:20:48.292 226890 DEBUG nova.network.neutron [-] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:20:48 np0005588920 nova_compute[226886]: 2026-01-20 15:20:48.315 226890 INFO nova.compute.manager [-] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Took 1.35 seconds to deallocate network for instance.#033[00m
Jan 20 10:20:48 np0005588920 nova_compute[226886]: 2026-01-20 15:20:48.375 226890 DEBUG oslo_concurrency.lockutils [None req-636308f1-404d-46c5-9f82-52914b0af26d 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:20:48 np0005588920 nova_compute[226886]: 2026-01-20 15:20:48.376 226890 DEBUG oslo_concurrency.lockutils [None req-636308f1-404d-46c5-9f82-52914b0af26d 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:20:48 np0005588920 nova_compute[226886]: 2026-01-20 15:20:48.422 226890 DEBUG nova.compute.manager [req-83748e5e-9dd1-443a-a482-a61cb84e32ca req-4ed23ec5-e4ad-4857-8ac3-1709caebf502 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Received event network-vif-deleted-4f0144e3-a50d-4d4b-a5de-1df8e869d27b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:20:48 np0005588920 nova_compute[226886]: 2026-01-20 15:20:48.445 226890 DEBUG oslo_concurrency.processutils [None req-636308f1-404d-46c5-9f82-52914b0af26d 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:20:48 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:20:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:20:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:48.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:20:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:48.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:48 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:20:48 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2647159744' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:20:48 np0005588920 nova_compute[226886]: 2026-01-20 15:20:48.934 226890 DEBUG oslo_concurrency.processutils [None req-636308f1-404d-46c5-9f82-52914b0af26d 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.489s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:20:48 np0005588920 nova_compute[226886]: 2026-01-20 15:20:48.940 226890 DEBUG nova.compute.provider_tree [None req-636308f1-404d-46c5-9f82-52914b0af26d 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:20:48 np0005588920 nova_compute[226886]: 2026-01-20 15:20:48.960 226890 DEBUG nova.scheduler.client.report [None req-636308f1-404d-46c5-9f82-52914b0af26d 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:20:48 np0005588920 nova_compute[226886]: 2026-01-20 15:20:48.994 226890 DEBUG oslo_concurrency.lockutils [None req-636308f1-404d-46c5-9f82-52914b0af26d 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.618s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:20:49 np0005588920 nova_compute[226886]: 2026-01-20 15:20:49.020 226890 INFO nova.scheduler.client.report [None req-636308f1-404d-46c5-9f82-52914b0af26d 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Deleted allocations for instance ab956bf8-a577-4777-ba2b-6d9dc5d035c3#033[00m
Jan 20 10:20:49 np0005588920 nova_compute[226886]: 2026-01-20 15:20:49.090 226890 DEBUG oslo_concurrency.lockutils [None req-636308f1-404d-46c5-9f82-52914b0af26d 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "ab956bf8-a577-4777-ba2b-6d9dc5d035c3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.846s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:20:49 np0005588920 nova_compute[226886]: 2026-01-20 15:20:49.305 226890 DEBUG nova.network.neutron [req-5a45be29-d5e3-4a56-8691-9f0f7f80a4b9 req-6c6b14c8-9d14-4eca-9523-2da9a2a4ff0d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Updated VIF entry in instance network info cache for port 4f0144e3-a50d-4d4b-a5de-1df8e869d27b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:20:49 np0005588920 nova_compute[226886]: 2026-01-20 15:20:49.305 226890 DEBUG nova.network.neutron [req-5a45be29-d5e3-4a56-8691-9f0f7f80a4b9 req-6c6b14c8-9d14-4eca-9523-2da9a2a4ff0d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Updating instance_info_cache with network_info: [{"id": "4f0144e3-a50d-4d4b-a5de-1df8e869d27b", "address": "fa:16:3e:35:53:6a", "network": {"id": "82c13a27-83c9-4924-96ac-e10cebda58f4", "bridge": "br-int", "label": "tempest-network-smoke--271331337", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f0144e3-a5", "ovs_interfaceid": "4f0144e3-a50d-4d4b-a5de-1df8e869d27b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:20:49 np0005588920 nova_compute[226886]: 2026-01-20 15:20:49.321 226890 DEBUG nova.compute.manager [req-8c0f9c6f-1ba5-4306-aa48-adcdf9b7e6ff req-787b6dce-1df6-465c-987f-6029f7520c1a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Received event network-vif-plugged-4f0144e3-a50d-4d4b-a5de-1df8e869d27b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:20:49 np0005588920 nova_compute[226886]: 2026-01-20 15:20:49.321 226890 DEBUG oslo_concurrency.lockutils [req-8c0f9c6f-1ba5-4306-aa48-adcdf9b7e6ff req-787b6dce-1df6-465c-987f-6029f7520c1a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "ab956bf8-a577-4777-ba2b-6d9dc5d035c3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:20:49 np0005588920 nova_compute[226886]: 2026-01-20 15:20:49.322 226890 DEBUG oslo_concurrency.lockutils [req-8c0f9c6f-1ba5-4306-aa48-adcdf9b7e6ff req-787b6dce-1df6-465c-987f-6029f7520c1a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "ab956bf8-a577-4777-ba2b-6d9dc5d035c3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:20:49 np0005588920 nova_compute[226886]: 2026-01-20 15:20:49.322 226890 DEBUG oslo_concurrency.lockutils [req-8c0f9c6f-1ba5-4306-aa48-adcdf9b7e6ff req-787b6dce-1df6-465c-987f-6029f7520c1a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "ab956bf8-a577-4777-ba2b-6d9dc5d035c3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:20:49 np0005588920 nova_compute[226886]: 2026-01-20 15:20:49.322 226890 DEBUG nova.compute.manager [req-8c0f9c6f-1ba5-4306-aa48-adcdf9b7e6ff req-787b6dce-1df6-465c-987f-6029f7520c1a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] No waiting events found dispatching network-vif-plugged-4f0144e3-a50d-4d4b-a5de-1df8e869d27b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:20:49 np0005588920 nova_compute[226886]: 2026-01-20 15:20:49.322 226890 WARNING nova.compute.manager [req-8c0f9c6f-1ba5-4306-aa48-adcdf9b7e6ff req-787b6dce-1df6-465c-987f-6029f7520c1a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Received unexpected event network-vif-plugged-4f0144e3-a50d-4d4b-a5de-1df8e869d27b for instance with vm_state deleted and task_state None.#033[00m
Jan 20 10:20:49 np0005588920 nova_compute[226886]: 2026-01-20 15:20:49.335 226890 DEBUG oslo_concurrency.lockutils [req-5a45be29-d5e3-4a56-8691-9f0f7f80a4b9 req-6c6b14c8-9d14-4eca-9523-2da9a2a4ff0d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-ab956bf8-a577-4777-ba2b-6d9dc5d035c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:20:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:20:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:50.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:20:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:50.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:51 np0005588920 nova_compute[226886]: 2026-01-20 15:20:51.518 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:20:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:20:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:52.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:20:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:20:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:52.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:53 np0005588920 nova_compute[226886]: 2026-01-20 15:20:53.222 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:53 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:20:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:20:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:54.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:20:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:54.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:55 np0005588920 nova_compute[226886]: 2026-01-20 15:20:55.350 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:55 np0005588920 nova_compute[226886]: 2026-01-20 15:20:55.465 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:56 np0005588920 nova_compute[226886]: 2026-01-20 15:20:56.520 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:20:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:20:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:56.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:56 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:56.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:58 np0005588920 nova_compute[226886]: 2026-01-20 15:20:58.224 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:58 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:20:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:20:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:20:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:20:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:58.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:20:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:58 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:58.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:21:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:00.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:21:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:00.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:01 np0005588920 nova_compute[226886]: 2026-01-20 15:21:01.482 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768922446.4810932, ab956bf8-a577-4777-ba2b-6d9dc5d035c3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:21:01 np0005588920 nova_compute[226886]: 2026-01-20 15:21:01.482 226890 INFO nova.compute.manager [-] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:21:01 np0005588920 nova_compute[226886]: 2026-01-20 15:21:01.505 226890 DEBUG nova.compute.manager [None req-01e27a3c-0a2a-4a3e-b789-67fc6752955c - - - - - -] [instance: ab956bf8-a577-4777-ba2b-6d9dc5d035c3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:21:01 np0005588920 nova_compute[226886]: 2026-01-20 15:21:01.524 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:01 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #145. Immutable memtables: 0.
Jan 20 10:21:01 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:21:01.953720) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:21:01 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 91] Flushing memtable with next log file: 145
Jan 20 10:21:01 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922461953790, "job": 91, "event": "flush_started", "num_memtables": 1, "num_entries": 812, "num_deletes": 250, "total_data_size": 1520801, "memory_usage": 1540808, "flush_reason": "Manual Compaction"}
Jan 20 10:21:01 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 91] Level-0 flush table #146: started
Jan 20 10:21:01 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922461961000, "cf_name": "default", "job": 91, "event": "table_file_creation", "file_number": 146, "file_size": 697690, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 71554, "largest_seqno": 72361, "table_properties": {"data_size": 694314, "index_size": 1219, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 9080, "raw_average_key_size": 21, "raw_value_size": 687182, "raw_average_value_size": 1590, "num_data_blocks": 52, "num_entries": 432, "num_filter_entries": 432, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768922409, "oldest_key_time": 1768922409, "file_creation_time": 1768922461, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 146, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:21:01 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 91] Flush lasted 7321 microseconds, and 3571 cpu microseconds.
Jan 20 10:21:01 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:21:01 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:21:01.961044) [db/flush_job.cc:967] [default] [JOB 91] Level-0 flush table #146: 697690 bytes OK
Jan 20 10:21:01 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:21:01.961062) [db/memtable_list.cc:519] [default] Level-0 commit table #146 started
Jan 20 10:21:01 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:21:01.963901) [db/memtable_list.cc:722] [default] Level-0 commit table #146: memtable #1 done
Jan 20 10:21:01 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:21:01.963921) EVENT_LOG_v1 {"time_micros": 1768922461963914, "job": 91, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:21:01 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:21:01.963943) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:21:01 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 91] Try to delete WAL files size 1516571, prev total WAL file size 1516571, number of live WAL files 2.
Jan 20 10:21:01 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000142.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:21:01 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:21:01.964815) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740032323537' seq:72057594037927935, type:22 .. '6D6772737461740032353038' seq:0, type:0; will stop at (end)
Jan 20 10:21:01 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 92] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:21:01 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 91 Base level 0, inputs: [146(681KB)], [144(12MB)]
Jan 20 10:21:01 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922461964915, "job": 92, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [146], "files_L6": [144], "score": -1, "input_data_size": 14060990, "oldest_snapshot_seqno": -1}
Jan 20 10:21:02 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 92] Generated table #147: 9398 keys, 10497689 bytes, temperature: kUnknown
Jan 20 10:21:02 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922462069621, "cf_name": "default", "job": 92, "event": "table_file_creation", "file_number": 147, "file_size": 10497689, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10438892, "index_size": 34172, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 23557, "raw_key_size": 248051, "raw_average_key_size": 26, "raw_value_size": 10275851, "raw_average_value_size": 1093, "num_data_blocks": 1293, "num_entries": 9398, "num_filter_entries": 9398, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768922461, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 147, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:21:02 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:21:02 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:21:02.069906) [db/compaction/compaction_job.cc:1663] [default] [JOB 92] Compacted 1@0 + 1@6 files to L6 => 10497689 bytes
Jan 20 10:21:02 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:21:02.071265) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 134.2 rd, 100.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.7, 12.7 +0.0 blob) out(10.0 +0.0 blob), read-write-amplify(35.2) write-amplify(15.0) OK, records in: 9895, records dropped: 497 output_compression: NoCompression
Jan 20 10:21:02 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:21:02.071283) EVENT_LOG_v1 {"time_micros": 1768922462071275, "job": 92, "event": "compaction_finished", "compaction_time_micros": 104782, "compaction_time_cpu_micros": 36453, "output_level": 6, "num_output_files": 1, "total_output_size": 10497689, "num_input_records": 9895, "num_output_records": 9398, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:21:02 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000146.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:21:02 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922462071479, "job": 92, "event": "table_file_deletion", "file_number": 146}
Jan 20 10:21:02 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000144.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:21:02 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922462073761, "job": 92, "event": "table_file_deletion", "file_number": 144}
Jan 20 10:21:02 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:21:01.964663) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:21:02 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:21:02.073806) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:21:02 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:21:02.073811) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:21:02 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:21:02.073813) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:21:02 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:21:02.073814) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:21:02 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:21:02.073815) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:21:02 np0005588920 nova_compute[226886]: 2026-01-20 15:21:02.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:21:02 np0005588920 nova_compute[226886]: 2026-01-20 15:21:02.760 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:21:02 np0005588920 nova_compute[226886]: 2026-01-20 15:21:02.760 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:21:02 np0005588920 nova_compute[226886]: 2026-01-20 15:21:02.761 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:21:02 np0005588920 nova_compute[226886]: 2026-01-20 15:21:02.761 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:21:02 np0005588920 nova_compute[226886]: 2026-01-20 15:21:02.762 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:21:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:21:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:02.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:21:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:02.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:03 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:21:03 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/373551683' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:21:03 np0005588920 nova_compute[226886]: 2026-01-20 15:21:03.211 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:21:03 np0005588920 nova_compute[226886]: 2026-01-20 15:21:03.225 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:03 np0005588920 nova_compute[226886]: 2026-01-20 15:21:03.373 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:21:03 np0005588920 nova_compute[226886]: 2026-01-20 15:21:03.374 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4169MB free_disk=20.942707061767578GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:21:03 np0005588920 nova_compute[226886]: 2026-01-20 15:21:03.374 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:21:03 np0005588920 nova_compute[226886]: 2026-01-20 15:21:03.375 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:21:03 np0005588920 nova_compute[226886]: 2026-01-20 15:21:03.504 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:21:03 np0005588920 nova_compute[226886]: 2026-01-20 15:21:03.504 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:21:03 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:21:03 np0005588920 nova_compute[226886]: 2026-01-20 15:21:03.542 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:21:03 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:21:03 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3237900193' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:21:03 np0005588920 nova_compute[226886]: 2026-01-20 15:21:03.995 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:21:04 np0005588920 nova_compute[226886]: 2026-01-20 15:21:04.001 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:21:04 np0005588920 nova_compute[226886]: 2026-01-20 15:21:04.128 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:21:04 np0005588920 nova_compute[226886]: 2026-01-20 15:21:04.341 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:21:04 np0005588920 nova_compute[226886]: 2026-01-20 15:21:04.342 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.967s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:21:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:21:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:04.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:21:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:04.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:05 np0005588920 nova_compute[226886]: 2026-01-20 15:21:05.342 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:21:05 np0005588920 nova_compute[226886]: 2026-01-20 15:21:05.343 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:21:05 np0005588920 nova_compute[226886]: 2026-01-20 15:21:05.343 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:21:05 np0005588920 nova_compute[226886]: 2026-01-20 15:21:05.373 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:21:06 np0005588920 podman[300488]: 2026-01-20 15:21:06.024338829 +0000 UTC m=+0.113723735 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 20 10:21:06 np0005588920 nova_compute[226886]: 2026-01-20 15:21:06.525 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:21:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:06.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:21:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:06.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:21:08.142 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=67, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=66) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:21:08 np0005588920 nova_compute[226886]: 2026-01-20 15:21:08.143 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:21:08.143 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:21:08 np0005588920 nova_compute[226886]: 2026-01-20 15:21:08.226 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:08 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:21:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:08.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:21:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:08.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:21:09 np0005588920 nova_compute[226886]: 2026-01-20 15:21:09.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:21:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:10.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:10.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:11 np0005588920 nova_compute[226886]: 2026-01-20 15:21:11.528 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:11 np0005588920 nova_compute[226886]: 2026-01-20 15:21:11.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:21:11 np0005588920 nova_compute[226886]: 2026-01-20 15:21:11.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:21:11 np0005588920 nova_compute[226886]: 2026-01-20 15:21:11.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:21:12 np0005588920 nova_compute[226886]: 2026-01-20 15:21:12.720 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:21:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:21:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:12.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:21:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:12.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:13 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:21:13.145 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '67'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:21:13 np0005588920 nova_compute[226886]: 2026-01-20 15:21:13.228 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:21:14 np0005588920 nova_compute[226886]: 2026-01-20 15:21:14.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:21:14 np0005588920 nova_compute[226886]: 2026-01-20 15:21:14.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:21:14 np0005588920 nova_compute[226886]: 2026-01-20 15:21:14.726 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:21:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:14.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:14.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:21:16.482 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:21:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:21:16.484 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:21:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:21:16.484 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:21:16 np0005588920 nova_compute[226886]: 2026-01-20 15:21:16.530 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:16.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:21:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:16.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:21:16 np0005588920 podman[300516]: 2026-01-20 15:21:16.961337044 +0000 UTC m=+0.047509824 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 20 10:21:18 np0005588920 nova_compute[226886]: 2026-01-20 15:21:18.231 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:18 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:21:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:18.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:18.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:19 np0005588920 nova_compute[226886]: 2026-01-20 15:21:19.547 226890 DEBUG oslo_concurrency.lockutils [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Acquiring lock "e6540d15-a33f-4638-b102-8a1629193c18" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:21:19 np0005588920 nova_compute[226886]: 2026-01-20 15:21:19.547 226890 DEBUG oslo_concurrency.lockutils [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "e6540d15-a33f-4638-b102-8a1629193c18" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:21:19 np0005588920 nova_compute[226886]: 2026-01-20 15:21:19.563 226890 DEBUG nova.compute.manager [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 10:21:19 np0005588920 nova_compute[226886]: 2026-01-20 15:21:19.631 226890 DEBUG oslo_concurrency.lockutils [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:21:19 np0005588920 nova_compute[226886]: 2026-01-20 15:21:19.632 226890 DEBUG oslo_concurrency.lockutils [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:21:19 np0005588920 nova_compute[226886]: 2026-01-20 15:21:19.638 226890 DEBUG nova.virt.hardware [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 10:21:19 np0005588920 nova_compute[226886]: 2026-01-20 15:21:19.638 226890 INFO nova.compute.claims [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 10:21:19 np0005588920 nova_compute[226886]: 2026-01-20 15:21:19.721 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:21:19 np0005588920 nova_compute[226886]: 2026-01-20 15:21:19.724 226890 DEBUG oslo_concurrency.processutils [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:21:20 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:21:20 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1618258914' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:21:20 np0005588920 nova_compute[226886]: 2026-01-20 15:21:20.132 226890 DEBUG oslo_concurrency.processutils [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.408s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:21:20 np0005588920 nova_compute[226886]: 2026-01-20 15:21:20.138 226890 DEBUG nova.compute.provider_tree [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:21:20 np0005588920 nova_compute[226886]: 2026-01-20 15:21:20.158 226890 DEBUG nova.scheduler.client.report [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:21:20 np0005588920 nova_compute[226886]: 2026-01-20 15:21:20.184 226890 DEBUG oslo_concurrency.lockutils [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.553s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:21:20 np0005588920 nova_compute[226886]: 2026-01-20 15:21:20.185 226890 DEBUG nova.compute.manager [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 10:21:20 np0005588920 nova_compute[226886]: 2026-01-20 15:21:20.240 226890 DEBUG nova.compute.manager [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 10:21:20 np0005588920 nova_compute[226886]: 2026-01-20 15:21:20.241 226890 DEBUG nova.network.neutron [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 10:21:20 np0005588920 nova_compute[226886]: 2026-01-20 15:21:20.270 226890 INFO nova.virt.libvirt.driver [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 10:21:20 np0005588920 nova_compute[226886]: 2026-01-20 15:21:20.287 226890 DEBUG nova.compute.manager [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 10:21:20 np0005588920 nova_compute[226886]: 2026-01-20 15:21:20.415 226890 DEBUG nova.compute.manager [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 10:21:20 np0005588920 nova_compute[226886]: 2026-01-20 15:21:20.416 226890 DEBUG nova.virt.libvirt.driver [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 10:21:20 np0005588920 nova_compute[226886]: 2026-01-20 15:21:20.416 226890 INFO nova.virt.libvirt.driver [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Creating image(s)#033[00m
Jan 20 10:21:20 np0005588920 nova_compute[226886]: 2026-01-20 15:21:20.444 226890 DEBUG nova.storage.rbd_utils [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] rbd image e6540d15-a33f-4638-b102-8a1629193c18_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:21:20 np0005588920 nova_compute[226886]: 2026-01-20 15:21:20.481 226890 DEBUG nova.storage.rbd_utils [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] rbd image e6540d15-a33f-4638-b102-8a1629193c18_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:21:20 np0005588920 nova_compute[226886]: 2026-01-20 15:21:20.508 226890 DEBUG nova.storage.rbd_utils [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] rbd image e6540d15-a33f-4638-b102-8a1629193c18_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:21:20 np0005588920 nova_compute[226886]: 2026-01-20 15:21:20.513 226890 DEBUG oslo_concurrency.processutils [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:21:20 np0005588920 nova_compute[226886]: 2026-01-20 15:21:20.594 226890 DEBUG oslo_concurrency.processutils [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:21:20 np0005588920 nova_compute[226886]: 2026-01-20 15:21:20.595 226890 DEBUG oslo_concurrency.lockutils [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:21:20 np0005588920 nova_compute[226886]: 2026-01-20 15:21:20.596 226890 DEBUG oslo_concurrency.lockutils [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:21:20 np0005588920 nova_compute[226886]: 2026-01-20 15:21:20.597 226890 DEBUG oslo_concurrency.lockutils [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:21:20 np0005588920 nova_compute[226886]: 2026-01-20 15:21:20.630 226890 DEBUG nova.storage.rbd_utils [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] rbd image e6540d15-a33f-4638-b102-8a1629193c18_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:21:20 np0005588920 nova_compute[226886]: 2026-01-20 15:21:20.634 226890 DEBUG oslo_concurrency.processutils [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 e6540d15-a33f-4638-b102-8a1629193c18_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:21:20 np0005588920 nova_compute[226886]: 2026-01-20 15:21:20.905 226890 DEBUG oslo_concurrency.processutils [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 e6540d15-a33f-4638-b102-8a1629193c18_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.271s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:21:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:21:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:20.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:21:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:21:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:20.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:21:20 np0005588920 nova_compute[226886]: 2026-01-20 15:21:20.970 226890 DEBUG nova.storage.rbd_utils [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] resizing rbd image e6540d15-a33f-4638-b102-8a1629193c18_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 10:21:21 np0005588920 nova_compute[226886]: 2026-01-20 15:21:21.059 226890 DEBUG nova.objects.instance [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lazy-loading 'migration_context' on Instance uuid e6540d15-a33f-4638-b102-8a1629193c18 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:21:21 np0005588920 nova_compute[226886]: 2026-01-20 15:21:21.075 226890 DEBUG nova.virt.libvirt.driver [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 10:21:21 np0005588920 nova_compute[226886]: 2026-01-20 15:21:21.075 226890 DEBUG nova.virt.libvirt.driver [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Ensure instance console log exists: /var/lib/nova/instances/e6540d15-a33f-4638-b102-8a1629193c18/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:21:21 np0005588920 nova_compute[226886]: 2026-01-20 15:21:21.075 226890 DEBUG oslo_concurrency.lockutils [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:21:21 np0005588920 nova_compute[226886]: 2026-01-20 15:21:21.076 226890 DEBUG oslo_concurrency.lockutils [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:21:21 np0005588920 nova_compute[226886]: 2026-01-20 15:21:21.076 226890 DEBUG oslo_concurrency.lockutils [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:21:21 np0005588920 nova_compute[226886]: 2026-01-20 15:21:21.108 226890 DEBUG nova.policy [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cd9a8f26b71f4631a387e555e6b18428', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9156c0a9920c4721843416b9a44404f9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 10:21:21 np0005588920 nova_compute[226886]: 2026-01-20 15:21:21.533 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:21 np0005588920 nova_compute[226886]: 2026-01-20 15:21:21.955 226890 DEBUG nova.network.neutron [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Successfully created port: 31dd33a4-8964-40e6-9bcf-0219c62004df _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 10:21:22 np0005588920 nova_compute[226886]: 2026-01-20 15:21:22.881 226890 DEBUG nova.network.neutron [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Successfully updated port: 31dd33a4-8964-40e6-9bcf-0219c62004df _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 10:21:22 np0005588920 nova_compute[226886]: 2026-01-20 15:21:22.896 226890 DEBUG oslo_concurrency.lockutils [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Acquiring lock "refresh_cache-e6540d15-a33f-4638-b102-8a1629193c18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:21:22 np0005588920 nova_compute[226886]: 2026-01-20 15:21:22.896 226890 DEBUG oslo_concurrency.lockutils [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Acquired lock "refresh_cache-e6540d15-a33f-4638-b102-8a1629193c18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:21:22 np0005588920 nova_compute[226886]: 2026-01-20 15:21:22.897 226890 DEBUG nova.network.neutron [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:21:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:21:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:22.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:21:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:22.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:22 np0005588920 nova_compute[226886]: 2026-01-20 15:21:22.990 226890 DEBUG nova.compute.manager [req-c795fcfe-715f-4a18-9471-049fab02d40f req-a3788703-c151-4cdc-be3d-a511704a8e9c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Received event network-changed-31dd33a4-8964-40e6-9bcf-0219c62004df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:21:22 np0005588920 nova_compute[226886]: 2026-01-20 15:21:22.990 226890 DEBUG nova.compute.manager [req-c795fcfe-715f-4a18-9471-049fab02d40f req-a3788703-c151-4cdc-be3d-a511704a8e9c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Refreshing instance network info cache due to event network-changed-31dd33a4-8964-40e6-9bcf-0219c62004df. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:21:22 np0005588920 nova_compute[226886]: 2026-01-20 15:21:22.990 226890 DEBUG oslo_concurrency.lockutils [req-c795fcfe-715f-4a18-9471-049fab02d40f req-a3788703-c151-4cdc-be3d-a511704a8e9c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-e6540d15-a33f-4638-b102-8a1629193c18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:21:23 np0005588920 nova_compute[226886]: 2026-01-20 15:21:23.119 226890 DEBUG nova.network.neutron [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:21:23 np0005588920 nova_compute[226886]: 2026-01-20 15:21:23.232 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:23 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:21:24 np0005588920 nova_compute[226886]: 2026-01-20 15:21:24.044 226890 DEBUG nova.network.neutron [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Updating instance_info_cache with network_info: [{"id": "31dd33a4-8964-40e6-9bcf-0219c62004df", "address": "fa:16:3e:6c:87:c3", "network": {"id": "76c2d716-7d14-4bc1-b83b-a3290ee99d9a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-782760714-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9156c0a9920c4721843416b9a44404f9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31dd33a4-89", "ovs_interfaceid": "31dd33a4-8964-40e6-9bcf-0219c62004df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:21:24 np0005588920 nova_compute[226886]: 2026-01-20 15:21:24.061 226890 DEBUG oslo_concurrency.lockutils [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Releasing lock "refresh_cache-e6540d15-a33f-4638-b102-8a1629193c18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:21:24 np0005588920 nova_compute[226886]: 2026-01-20 15:21:24.061 226890 DEBUG nova.compute.manager [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Instance network_info: |[{"id": "31dd33a4-8964-40e6-9bcf-0219c62004df", "address": "fa:16:3e:6c:87:c3", "network": {"id": "76c2d716-7d14-4bc1-b83b-a3290ee99d9a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-782760714-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9156c0a9920c4721843416b9a44404f9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31dd33a4-89", "ovs_interfaceid": "31dd33a4-8964-40e6-9bcf-0219c62004df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 10:21:24 np0005588920 nova_compute[226886]: 2026-01-20 15:21:24.062 226890 DEBUG oslo_concurrency.lockutils [req-c795fcfe-715f-4a18-9471-049fab02d40f req-a3788703-c151-4cdc-be3d-a511704a8e9c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-e6540d15-a33f-4638-b102-8a1629193c18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:21:24 np0005588920 nova_compute[226886]: 2026-01-20 15:21:24.062 226890 DEBUG nova.network.neutron [req-c795fcfe-715f-4a18-9471-049fab02d40f req-a3788703-c151-4cdc-be3d-a511704a8e9c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Refreshing network info cache for port 31dd33a4-8964-40e6-9bcf-0219c62004df _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:21:24 np0005588920 nova_compute[226886]: 2026-01-20 15:21:24.065 226890 DEBUG nova.virt.libvirt.driver [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Start _get_guest_xml network_info=[{"id": "31dd33a4-8964-40e6-9bcf-0219c62004df", "address": "fa:16:3e:6c:87:c3", "network": {"id": "76c2d716-7d14-4bc1-b83b-a3290ee99d9a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-782760714-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9156c0a9920c4721843416b9a44404f9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31dd33a4-89", "ovs_interfaceid": "31dd33a4-8964-40e6-9bcf-0219c62004df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:21:24 np0005588920 nova_compute[226886]: 2026-01-20 15:21:24.069 226890 WARNING nova.virt.libvirt.driver [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:21:24 np0005588920 nova_compute[226886]: 2026-01-20 15:21:24.072 226890 DEBUG nova.virt.libvirt.host [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:21:24 np0005588920 nova_compute[226886]: 2026-01-20 15:21:24.073 226890 DEBUG nova.virt.libvirt.host [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:21:24 np0005588920 nova_compute[226886]: 2026-01-20 15:21:24.076 226890 DEBUG nova.virt.libvirt.host [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:21:24 np0005588920 nova_compute[226886]: 2026-01-20 15:21:24.076 226890 DEBUG nova.virt.libvirt.host [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:21:24 np0005588920 nova_compute[226886]: 2026-01-20 15:21:24.078 226890 DEBUG nova.virt.libvirt.driver [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:21:24 np0005588920 nova_compute[226886]: 2026-01-20 15:21:24.078 226890 DEBUG nova.virt.hardware [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:21:24 np0005588920 nova_compute[226886]: 2026-01-20 15:21:24.078 226890 DEBUG nova.virt.hardware [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:21:24 np0005588920 nova_compute[226886]: 2026-01-20 15:21:24.079 226890 DEBUG nova.virt.hardware [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:21:24 np0005588920 nova_compute[226886]: 2026-01-20 15:21:24.079 226890 DEBUG nova.virt.hardware [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:21:24 np0005588920 nova_compute[226886]: 2026-01-20 15:21:24.079 226890 DEBUG nova.virt.hardware [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:21:24 np0005588920 nova_compute[226886]: 2026-01-20 15:21:24.079 226890 DEBUG nova.virt.hardware [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:21:24 np0005588920 nova_compute[226886]: 2026-01-20 15:21:24.080 226890 DEBUG nova.virt.hardware [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:21:24 np0005588920 nova_compute[226886]: 2026-01-20 15:21:24.080 226890 DEBUG nova.virt.hardware [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:21:24 np0005588920 nova_compute[226886]: 2026-01-20 15:21:24.080 226890 DEBUG nova.virt.hardware [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:21:24 np0005588920 nova_compute[226886]: 2026-01-20 15:21:24.081 226890 DEBUG nova.virt.hardware [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:21:24 np0005588920 nova_compute[226886]: 2026-01-20 15:21:24.081 226890 DEBUG nova.virt.hardware [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:21:24 np0005588920 nova_compute[226886]: 2026-01-20 15:21:24.083 226890 DEBUG oslo_concurrency.processutils [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:21:24 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:21:24 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/300643478' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:21:24 np0005588920 nova_compute[226886]: 2026-01-20 15:21:24.545 226890 DEBUG oslo_concurrency.processutils [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:21:24 np0005588920 nova_compute[226886]: 2026-01-20 15:21:24.575 226890 DEBUG nova.storage.rbd_utils [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] rbd image e6540d15-a33f-4638-b102-8a1629193c18_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:21:24 np0005588920 nova_compute[226886]: 2026-01-20 15:21:24.580 226890 DEBUG oslo_concurrency.processutils [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:21:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:21:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:24.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:21:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:24.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:25 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:21:25 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3508581801' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:21:25 np0005588920 nova_compute[226886]: 2026-01-20 15:21:25.026 226890 DEBUG oslo_concurrency.processutils [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:21:25 np0005588920 nova_compute[226886]: 2026-01-20 15:21:25.027 226890 DEBUG nova.virt.libvirt.vif [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:21:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1412753994',display_name='tempest-AttachVolumeNegativeTest-server-1412753994',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumenegativetest-server-1412753994',id=199,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJSD0uIE/eeiZTZtRE8MuMreHdWlysBXNmTJ+kR1VbAGP7zrBwpR9A0gwobx0kS0+2sZH/C0UGF0TEjCQgfGD7Qj2a/ny88m5Z02zvhjR09YJ67bFDf6iKxXKjBBzwCZZg==',key_name='tempest-keypair-1535435627',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9156c0a9920c4721843416b9a44404f9',ramdisk_id='',reservation_id='r-utp8rb16',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-1505789262',owner_user_name='tempest-AttachVolumeNegativeTest-1505789262-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:21:20Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='cd9a8f26b71f4631a387e555e6b18428',uuid=e6540d15-a33f-4638-b102-8a1629193c18,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "31dd33a4-8964-40e6-9bcf-0219c62004df", "address": "fa:16:3e:6c:87:c3", "network": {"id": "76c2d716-7d14-4bc1-b83b-a3290ee99d9a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-782760714-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9156c0a9920c4721843416b9a44404f9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31dd33a4-89", "ovs_interfaceid": "31dd33a4-8964-40e6-9bcf-0219c62004df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:21:25 np0005588920 nova_compute[226886]: 2026-01-20 15:21:25.028 226890 DEBUG nova.network.os_vif_util [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Converting VIF {"id": "31dd33a4-8964-40e6-9bcf-0219c62004df", "address": "fa:16:3e:6c:87:c3", "network": {"id": "76c2d716-7d14-4bc1-b83b-a3290ee99d9a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-782760714-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9156c0a9920c4721843416b9a44404f9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31dd33a4-89", "ovs_interfaceid": "31dd33a4-8964-40e6-9bcf-0219c62004df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:21:25 np0005588920 nova_compute[226886]: 2026-01-20 15:21:25.028 226890 DEBUG nova.network.os_vif_util [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6c:87:c3,bridge_name='br-int',has_traffic_filtering=True,id=31dd33a4-8964-40e6-9bcf-0219c62004df,network=Network(76c2d716-7d14-4bc1-b83b-a3290ee99d9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31dd33a4-89') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:21:25 np0005588920 nova_compute[226886]: 2026-01-20 15:21:25.030 226890 DEBUG nova.objects.instance [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lazy-loading 'pci_devices' on Instance uuid e6540d15-a33f-4638-b102-8a1629193c18 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:21:25 np0005588920 nova_compute[226886]: 2026-01-20 15:21:25.047 226890 DEBUG nova.virt.libvirt.driver [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:21:25 np0005588920 nova_compute[226886]:  <uuid>e6540d15-a33f-4638-b102-8a1629193c18</uuid>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:  <name>instance-000000c7</name>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:21:25 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:      <nova:name>tempest-AttachVolumeNegativeTest-server-1412753994</nova:name>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 15:21:24</nova:creationTime>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 10:21:25 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:        <nova:user uuid="cd9a8f26b71f4631a387e555e6b18428">tempest-AttachVolumeNegativeTest-1505789262-project-member</nova:user>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:        <nova:project uuid="9156c0a9920c4721843416b9a44404f9">tempest-AttachVolumeNegativeTest-1505789262</nova:project>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:        <nova:port uuid="31dd33a4-8964-40e6-9bcf-0219c62004df">
Jan 20 10:21:25 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    <system>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:      <entry name="serial">e6540d15-a33f-4638-b102-8a1629193c18</entry>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:      <entry name="uuid">e6540d15-a33f-4638-b102-8a1629193c18</entry>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    </system>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:  <os>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:  </os>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:  <features>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:  </features>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:  </clock>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:  <devices>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 10:21:25 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/e6540d15-a33f-4638-b102-8a1629193c18_disk">
Jan 20 10:21:25 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:21:25 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 10:21:25 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/e6540d15-a33f-4638-b102-8a1629193c18_disk.config">
Jan 20 10:21:25 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:21:25 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 10:21:25 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:6c:87:c3"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:      <target dev="tap31dd33a4-89"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    </interface>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 10:21:25 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/e6540d15-a33f-4638-b102-8a1629193c18/console.log" append="off"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    </serial>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    <video>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    </video>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 10:21:25 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    </rng>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 10:21:25 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 10:21:25 np0005588920 nova_compute[226886]:  </devices>
Jan 20 10:21:25 np0005588920 nova_compute[226886]: </domain>
Jan 20 10:21:25 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:21:25 np0005588920 nova_compute[226886]: 2026-01-20 15:21:25.048 226890 DEBUG nova.compute.manager [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Preparing to wait for external event network-vif-plugged-31dd33a4-8964-40e6-9bcf-0219c62004df prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:21:25 np0005588920 nova_compute[226886]: 2026-01-20 15:21:25.049 226890 DEBUG oslo_concurrency.lockutils [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Acquiring lock "e6540d15-a33f-4638-b102-8a1629193c18-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:21:25 np0005588920 nova_compute[226886]: 2026-01-20 15:21:25.049 226890 DEBUG oslo_concurrency.lockutils [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "e6540d15-a33f-4638-b102-8a1629193c18-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:21:25 np0005588920 nova_compute[226886]: 2026-01-20 15:21:25.049 226890 DEBUG oslo_concurrency.lockutils [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "e6540d15-a33f-4638-b102-8a1629193c18-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:21:25 np0005588920 nova_compute[226886]: 2026-01-20 15:21:25.050 226890 DEBUG nova.virt.libvirt.vif [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:21:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1412753994',display_name='tempest-AttachVolumeNegativeTest-server-1412753994',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumenegativetest-server-1412753994',id=199,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJSD0uIE/eeiZTZtRE8MuMreHdWlysBXNmTJ+kR1VbAGP7zrBwpR9A0gwobx0kS0+2sZH/C0UGF0TEjCQgfGD7Qj2a/ny88m5Z02zvhjR09YJ67bFDf6iKxXKjBBzwCZZg==',key_name='tempest-keypair-1535435627',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9156c0a9920c4721843416b9a44404f9',ramdisk_id='',reservation_id='r-utp8rb16',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-1505789262',owner_user_name='tempest-AttachVolumeNegativeTest-1505789262-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:21:20Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='cd9a8f26b71f4631a387e555e6b18428',uuid=e6540d15-a33f-4638-b102-8a1629193c18,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "31dd33a4-8964-40e6-9bcf-0219c62004df", "address": "fa:16:3e:6c:87:c3", "network": {"id": "76c2d716-7d14-4bc1-b83b-a3290ee99d9a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-782760714-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9156c0a9920c4721843416b9a44404f9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31dd33a4-89", "ovs_interfaceid": "31dd33a4-8964-40e6-9bcf-0219c62004df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:21:25 np0005588920 nova_compute[226886]: 2026-01-20 15:21:25.050 226890 DEBUG nova.network.os_vif_util [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Converting VIF {"id": "31dd33a4-8964-40e6-9bcf-0219c62004df", "address": "fa:16:3e:6c:87:c3", "network": {"id": "76c2d716-7d14-4bc1-b83b-a3290ee99d9a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-782760714-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9156c0a9920c4721843416b9a44404f9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31dd33a4-89", "ovs_interfaceid": "31dd33a4-8964-40e6-9bcf-0219c62004df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:21:25 np0005588920 nova_compute[226886]: 2026-01-20 15:21:25.050 226890 DEBUG nova.network.os_vif_util [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6c:87:c3,bridge_name='br-int',has_traffic_filtering=True,id=31dd33a4-8964-40e6-9bcf-0219c62004df,network=Network(76c2d716-7d14-4bc1-b83b-a3290ee99d9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31dd33a4-89') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:21:25 np0005588920 nova_compute[226886]: 2026-01-20 15:21:25.051 226890 DEBUG os_vif [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6c:87:c3,bridge_name='br-int',has_traffic_filtering=True,id=31dd33a4-8964-40e6-9bcf-0219c62004df,network=Network(76c2d716-7d14-4bc1-b83b-a3290ee99d9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31dd33a4-89') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:21:25 np0005588920 nova_compute[226886]: 2026-01-20 15:21:25.051 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:25 np0005588920 nova_compute[226886]: 2026-01-20 15:21:25.052 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:21:25 np0005588920 nova_compute[226886]: 2026-01-20 15:21:25.052 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:21:25 np0005588920 nova_compute[226886]: 2026-01-20 15:21:25.057 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:25 np0005588920 nova_compute[226886]: 2026-01-20 15:21:25.058 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap31dd33a4-89, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:21:25 np0005588920 nova_compute[226886]: 2026-01-20 15:21:25.058 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap31dd33a4-89, col_values=(('external_ids', {'iface-id': '31dd33a4-8964-40e6-9bcf-0219c62004df', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6c:87:c3', 'vm-uuid': 'e6540d15-a33f-4638-b102-8a1629193c18'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:21:25 np0005588920 nova_compute[226886]: 2026-01-20 15:21:25.060 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:25 np0005588920 NetworkManager[49076]: <info>  [1768922485.0609] manager: (tap31dd33a4-89): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/432)
Jan 20 10:21:25 np0005588920 nova_compute[226886]: 2026-01-20 15:21:25.062 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:21:25 np0005588920 nova_compute[226886]: 2026-01-20 15:21:25.065 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:25 np0005588920 nova_compute[226886]: 2026-01-20 15:21:25.066 226890 INFO os_vif [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6c:87:c3,bridge_name='br-int',has_traffic_filtering=True,id=31dd33a4-8964-40e6-9bcf-0219c62004df,network=Network(76c2d716-7d14-4bc1-b83b-a3290ee99d9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31dd33a4-89')#033[00m
Jan 20 10:21:25 np0005588920 nova_compute[226886]: 2026-01-20 15:21:25.123 226890 DEBUG nova.virt.libvirt.driver [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:21:25 np0005588920 nova_compute[226886]: 2026-01-20 15:21:25.123 226890 DEBUG nova.virt.libvirt.driver [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:21:25 np0005588920 nova_compute[226886]: 2026-01-20 15:21:25.124 226890 DEBUG nova.virt.libvirt.driver [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] No VIF found with MAC fa:16:3e:6c:87:c3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:21:25 np0005588920 nova_compute[226886]: 2026-01-20 15:21:25.124 226890 INFO nova.virt.libvirt.driver [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Using config drive#033[00m
Jan 20 10:21:25 np0005588920 nova_compute[226886]: 2026-01-20 15:21:25.149 226890 DEBUG nova.storage.rbd_utils [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] rbd image e6540d15-a33f-4638-b102-8a1629193c18_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:21:25 np0005588920 nova_compute[226886]: 2026-01-20 15:21:25.527 226890 INFO nova.virt.libvirt.driver [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Creating config drive at /var/lib/nova/instances/e6540d15-a33f-4638-b102-8a1629193c18/disk.config#033[00m
Jan 20 10:21:25 np0005588920 nova_compute[226886]: 2026-01-20 15:21:25.533 226890 DEBUG oslo_concurrency.processutils [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e6540d15-a33f-4638-b102-8a1629193c18/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptym6rif4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:21:25 np0005588920 nova_compute[226886]: 2026-01-20 15:21:25.669 226890 DEBUG oslo_concurrency.processutils [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e6540d15-a33f-4638-b102-8a1629193c18/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptym6rif4" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:21:25 np0005588920 nova_compute[226886]: 2026-01-20 15:21:25.703 226890 DEBUG nova.storage.rbd_utils [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] rbd image e6540d15-a33f-4638-b102-8a1629193c18_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:21:25 np0005588920 nova_compute[226886]: 2026-01-20 15:21:25.707 226890 DEBUG oslo_concurrency.processutils [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/e6540d15-a33f-4638-b102-8a1629193c18/disk.config e6540d15-a33f-4638-b102-8a1629193c18_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:21:25 np0005588920 nova_compute[226886]: 2026-01-20 15:21:25.772 226890 DEBUG nova.network.neutron [req-c795fcfe-715f-4a18-9471-049fab02d40f req-a3788703-c151-4cdc-be3d-a511704a8e9c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Updated VIF entry in instance network info cache for port 31dd33a4-8964-40e6-9bcf-0219c62004df. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:21:25 np0005588920 nova_compute[226886]: 2026-01-20 15:21:25.773 226890 DEBUG nova.network.neutron [req-c795fcfe-715f-4a18-9471-049fab02d40f req-a3788703-c151-4cdc-be3d-a511704a8e9c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Updating instance_info_cache with network_info: [{"id": "31dd33a4-8964-40e6-9bcf-0219c62004df", "address": "fa:16:3e:6c:87:c3", "network": {"id": "76c2d716-7d14-4bc1-b83b-a3290ee99d9a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-782760714-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9156c0a9920c4721843416b9a44404f9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31dd33a4-89", "ovs_interfaceid": "31dd33a4-8964-40e6-9bcf-0219c62004df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:21:25 np0005588920 nova_compute[226886]: 2026-01-20 15:21:25.787 226890 DEBUG oslo_concurrency.lockutils [req-c795fcfe-715f-4a18-9471-049fab02d40f req-a3788703-c151-4cdc-be3d-a511704a8e9c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-e6540d15-a33f-4638-b102-8a1629193c18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:21:25 np0005588920 nova_compute[226886]: 2026-01-20 15:21:25.882 226890 DEBUG oslo_concurrency.processutils [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/e6540d15-a33f-4638-b102-8a1629193c18/disk.config e6540d15-a33f-4638-b102-8a1629193c18_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.175s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:21:25 np0005588920 nova_compute[226886]: 2026-01-20 15:21:25.882 226890 INFO nova.virt.libvirt.driver [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Deleting local config drive /var/lib/nova/instances/e6540d15-a33f-4638-b102-8a1629193c18/disk.config because it was imported into RBD.#033[00m
Jan 20 10:21:25 np0005588920 kernel: tap31dd33a4-89: entered promiscuous mode
Jan 20 10:21:25 np0005588920 NetworkManager[49076]: <info>  [1768922485.9309] manager: (tap31dd33a4-89): new Tun device (/org/freedesktop/NetworkManager/Devices/433)
Jan 20 10:21:25 np0005588920 ovn_controller[133971]: 2026-01-20T15:21:25Z|00928|binding|INFO|Claiming lport 31dd33a4-8964-40e6-9bcf-0219c62004df for this chassis.
Jan 20 10:21:25 np0005588920 nova_compute[226886]: 2026-01-20 15:21:25.970 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:25 np0005588920 ovn_controller[133971]: 2026-01-20T15:21:25Z|00929|binding|INFO|31dd33a4-8964-40e6-9bcf-0219c62004df: Claiming fa:16:3e:6c:87:c3 10.100.0.7
Jan 20 10:21:25 np0005588920 nova_compute[226886]: 2026-01-20 15:21:25.977 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:21:25.982 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6c:87:c3 10.100.0.7'], port_security=['fa:16:3e:6c:87:c3 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'e6540d15-a33f-4638-b102-8a1629193c18', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-76c2d716-7d14-4bc1-b83b-a3290ee99d9a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9156c0a9920c4721843416b9a44404f9', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fdcc40ab-0e87-4bcf-a965-9808ad5ff106', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2bfc4e2a-eeed-480e-aa18-68fc6c8f2cc2, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=31dd33a4-8964-40e6-9bcf-0219c62004df) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:21:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:21:25.984 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 31dd33a4-8964-40e6-9bcf-0219c62004df in datapath 76c2d716-7d14-4bc1-b83b-a3290ee99d9a bound to our chassis#033[00m
Jan 20 10:21:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:21:25.985 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 76c2d716-7d14-4bc1-b83b-a3290ee99d9a#033[00m
Jan 20 10:21:25 np0005588920 systemd-machined[196121]: New machine qemu-96-instance-000000c7.
Jan 20 10:21:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:21:25.998 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a666d136-ab6d-4c42-9db9-afd4495c1116]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:21:25 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:21:25.998 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap76c2d716-71 in ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:21:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:21:26.001 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap76c2d716-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:21:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:21:26.001 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[8fb5cd44-4e3e-4fd5-880a-94cffe81ff96]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:21:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:21:26.002 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0ae8b63b-7131-4e32-ac05-25a0c2cb5f96]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:21:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:21:26.014 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[363c521a-2ac5-4479-9c72-d3edc67bfba6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:21:26 np0005588920 systemd[1]: Started Virtual Machine qemu-96-instance-000000c7.
Jan 20 10:21:26 np0005588920 ovn_controller[133971]: 2026-01-20T15:21:26Z|00930|binding|INFO|Setting lport 31dd33a4-8964-40e6-9bcf-0219c62004df ovn-installed in OVS
Jan 20 10:21:26 np0005588920 ovn_controller[133971]: 2026-01-20T15:21:26Z|00931|binding|INFO|Setting lport 31dd33a4-8964-40e6-9bcf-0219c62004df up in Southbound
Jan 20 10:21:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:21:26.041 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[6076805d-ddf0-43a5-90b4-91143b82c028]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:21:26 np0005588920 nova_compute[226886]: 2026-01-20 15:21:26.040 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:26 np0005588920 systemd-udevd[300937]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:21:26 np0005588920 NetworkManager[49076]: <info>  [1768922486.0609] device (tap31dd33a4-89): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:21:26 np0005588920 NetworkManager[49076]: <info>  [1768922486.0619] device (tap31dd33a4-89): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:21:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:21:26.073 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[dfaf0457-34a9-40ac-8dc6-4ec8ead3116a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:21:26 np0005588920 NetworkManager[49076]: <info>  [1768922486.0812] manager: (tap76c2d716-70): new Veth device (/org/freedesktop/NetworkManager/Devices/434)
Jan 20 10:21:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:21:26.080 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[8380207a-551b-4e60-93b2-9ccd916b231c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:21:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:21:26.123 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[23eac564-7d3d-4751-bbe1-a078a4a08c90]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:21:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:21:26.127 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[d3a2792f-a613-4a7d-8f16-c20c27376ec1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:21:26 np0005588920 NetworkManager[49076]: <info>  [1768922486.1524] device (tap76c2d716-70): carrier: link connected
Jan 20 10:21:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:21:26.159 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[06662fad-882c-4b6c-a6f4-40ce19962df2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:21:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:21:26.177 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[9ef6d286-2403-4561-b5e1-d3c5154032b4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap76c2d716-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2e:44:ab'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 292], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 752783, 'reachable_time': 26154, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 300992, 'error': None, 'target': 'ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:21:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:21:26.192 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[35c6d7b7-a47e-4ac8-8faa-dec930fa6aef]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2e:44ab'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 752783, 'tstamp': 752783}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 300993, 'error': None, 'target': 'ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:21:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:21:26.208 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[17915318-e1fa-4084-91f2-496e752c203d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap76c2d716-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2e:44:ab'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 292], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 752783, 'reachable_time': 26154, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 300994, 'error': None, 'target': 'ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:21:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:21:26.237 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b20522c0-4496-45b6-b251-b6c87f8115a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:21:26 np0005588920 nova_compute[226886]: 2026-01-20 15:21:26.250 226890 DEBUG nova.compute.manager [req-f65c0e61-664a-4ba4-b6b5-a1f14307558c req-0c65491a-4388-45d4-87a3-735df7fa653c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Received event network-vif-plugged-31dd33a4-8964-40e6-9bcf-0219c62004df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:21:26 np0005588920 nova_compute[226886]: 2026-01-20 15:21:26.252 226890 DEBUG oslo_concurrency.lockutils [req-f65c0e61-664a-4ba4-b6b5-a1f14307558c req-0c65491a-4388-45d4-87a3-735df7fa653c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "e6540d15-a33f-4638-b102-8a1629193c18-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:21:26 np0005588920 nova_compute[226886]: 2026-01-20 15:21:26.252 226890 DEBUG oslo_concurrency.lockutils [req-f65c0e61-664a-4ba4-b6b5-a1f14307558c req-0c65491a-4388-45d4-87a3-735df7fa653c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "e6540d15-a33f-4638-b102-8a1629193c18-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:21:26 np0005588920 nova_compute[226886]: 2026-01-20 15:21:26.252 226890 DEBUG oslo_concurrency.lockutils [req-f65c0e61-664a-4ba4-b6b5-a1f14307558c req-0c65491a-4388-45d4-87a3-735df7fa653c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "e6540d15-a33f-4638-b102-8a1629193c18-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:21:26 np0005588920 nova_compute[226886]: 2026-01-20 15:21:26.256 226890 DEBUG nova.compute.manager [req-f65c0e61-664a-4ba4-b6b5-a1f14307558c req-0c65491a-4388-45d4-87a3-735df7fa653c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Processing event network-vif-plugged-31dd33a4-8964-40e6-9bcf-0219c62004df _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 10:21:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:21:26.306 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ef9d3cdf-aba0-4051-b0ef-587ef7ed7d14]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:21:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:21:26.308 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap76c2d716-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:21:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:21:26.308 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:21:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:21:26.308 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap76c2d716-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:21:26 np0005588920 kernel: tap76c2d716-70: entered promiscuous mode
Jan 20 10:21:26 np0005588920 NetworkManager[49076]: <info>  [1768922486.3111] manager: (tap76c2d716-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/435)
Jan 20 10:21:26 np0005588920 nova_compute[226886]: 2026-01-20 15:21:26.311 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:21:26.315 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap76c2d716-70, col_values=(('external_ids', {'iface-id': '2c0bba0e-e9b6-4ece-8349-62642b94d91d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:21:26 np0005588920 nova_compute[226886]: 2026-01-20 15:21:26.316 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:26 np0005588920 ovn_controller[133971]: 2026-01-20T15:21:26Z|00932|binding|INFO|Releasing lport 2c0bba0e-e9b6-4ece-8349-62642b94d91d from this chassis (sb_readonly=0)
Jan 20 10:21:26 np0005588920 nova_compute[226886]: 2026-01-20 15:21:26.317 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:21:26.319 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/76c2d716-7d14-4bc1-b83b-a3290ee99d9a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/76c2d716-7d14-4bc1-b83b-a3290ee99d9a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:21:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:21:26.320 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5565dcfb-1ca8-4b55-b63a-12ca8305fb57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:21:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:21:26.321 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:21:26 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 10:21:26 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 10:21:26 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-76c2d716-7d14-4bc1-b83b-a3290ee99d9a
Jan 20 10:21:26 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 10:21:26 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 10:21:26 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 10:21:26 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/76c2d716-7d14-4bc1-b83b-a3290ee99d9a.pid.haproxy
Jan 20 10:21:26 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 10:21:26 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:21:26 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 10:21:26 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 10:21:26 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 10:21:26 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 10:21:26 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 10:21:26 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 10:21:26 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 10:21:26 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 10:21:26 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 10:21:26 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 10:21:26 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 10:21:26 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 10:21:26 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 10:21:26 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:21:26 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:21:26 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 10:21:26 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 10:21:26 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:21:26 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID 76c2d716-7d14-4bc1-b83b-a3290ee99d9a
Jan 20 10:21:26 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:21:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:21:26.322 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a', 'env', 'PROCESS_TAG=haproxy-76c2d716-7d14-4bc1-b83b-a3290ee99d9a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/76c2d716-7d14-4bc1-b83b-a3290ee99d9a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:21:26 np0005588920 nova_compute[226886]: 2026-01-20 15:21:26.330 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:26 np0005588920 nova_compute[226886]: 2026-01-20 15:21:26.452 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768922486.451327, e6540d15-a33f-4638-b102-8a1629193c18 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:21:26 np0005588920 nova_compute[226886]: 2026-01-20 15:21:26.452 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: e6540d15-a33f-4638-b102-8a1629193c18] VM Started (Lifecycle Event)#033[00m
Jan 20 10:21:26 np0005588920 nova_compute[226886]: 2026-01-20 15:21:26.454 226890 DEBUG nova.compute.manager [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:21:26 np0005588920 nova_compute[226886]: 2026-01-20 15:21:26.458 226890 DEBUG nova.virt.libvirt.driver [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:21:26 np0005588920 nova_compute[226886]: 2026-01-20 15:21:26.461 226890 INFO nova.virt.libvirt.driver [-] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Instance spawned successfully.#033[00m
Jan 20 10:21:26 np0005588920 nova_compute[226886]: 2026-01-20 15:21:26.462 226890 DEBUG nova.virt.libvirt.driver [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 10:21:26 np0005588920 nova_compute[226886]: 2026-01-20 15:21:26.475 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:21:26 np0005588920 nova_compute[226886]: 2026-01-20 15:21:26.481 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:21:26 np0005588920 nova_compute[226886]: 2026-01-20 15:21:26.484 226890 DEBUG nova.virt.libvirt.driver [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:21:26 np0005588920 nova_compute[226886]: 2026-01-20 15:21:26.485 226890 DEBUG nova.virt.libvirt.driver [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:21:26 np0005588920 nova_compute[226886]: 2026-01-20 15:21:26.485 226890 DEBUG nova.virt.libvirt.driver [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:21:26 np0005588920 nova_compute[226886]: 2026-01-20 15:21:26.486 226890 DEBUG nova.virt.libvirt.driver [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:21:26 np0005588920 nova_compute[226886]: 2026-01-20 15:21:26.486 226890 DEBUG nova.virt.libvirt.driver [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:21:26 np0005588920 nova_compute[226886]: 2026-01-20 15:21:26.487 226890 DEBUG nova.virt.libvirt.driver [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:21:26 np0005588920 nova_compute[226886]: 2026-01-20 15:21:26.497 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: e6540d15-a33f-4638-b102-8a1629193c18] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:21:26 np0005588920 nova_compute[226886]: 2026-01-20 15:21:26.498 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768922486.4525723, e6540d15-a33f-4638-b102-8a1629193c18 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:21:26 np0005588920 nova_compute[226886]: 2026-01-20 15:21:26.498 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: e6540d15-a33f-4638-b102-8a1629193c18] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:21:26 np0005588920 nova_compute[226886]: 2026-01-20 15:21:26.517 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:21:26 np0005588920 nova_compute[226886]: 2026-01-20 15:21:26.521 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768922486.4576313, e6540d15-a33f-4638-b102-8a1629193c18 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:21:26 np0005588920 nova_compute[226886]: 2026-01-20 15:21:26.521 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: e6540d15-a33f-4638-b102-8a1629193c18] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:21:26 np0005588920 nova_compute[226886]: 2026-01-20 15:21:26.539 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:21:26 np0005588920 nova_compute[226886]: 2026-01-20 15:21:26.544 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:21:26 np0005588920 nova_compute[226886]: 2026-01-20 15:21:26.549 226890 INFO nova.compute.manager [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Took 6.13 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 10:21:26 np0005588920 nova_compute[226886]: 2026-01-20 15:21:26.549 226890 DEBUG nova.compute.manager [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:21:26 np0005588920 nova_compute[226886]: 2026-01-20 15:21:26.561 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: e6540d15-a33f-4638-b102-8a1629193c18] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:21:26 np0005588920 nova_compute[226886]: 2026-01-20 15:21:26.602 226890 INFO nova.compute.manager [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Took 7.00 seconds to build instance.#033[00m
Jan 20 10:21:26 np0005588920 nova_compute[226886]: 2026-01-20 15:21:26.619 226890 DEBUG oslo_concurrency.lockutils [None req-49a5c790-837f-45e9-bfc9-d6bcdc7e61b0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "e6540d15-a33f-4638-b102-8a1629193c18" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.072s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:21:26 np0005588920 podman[301101]: 2026-01-20 15:21:26.696910569 +0000 UTC m=+0.044099847 container create c1388818b385b87a3b0742ce6077d4cac6f28e5062acf2551d26438e52e54ea9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:21:26 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 20 10:21:26 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:21:26 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:21:26 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:21:26 np0005588920 systemd[1]: Started libpod-conmon-c1388818b385b87a3b0742ce6077d4cac6f28e5062acf2551d26438e52e54ea9.scope.
Jan 20 10:21:26 np0005588920 podman[301101]: 2026-01-20 15:21:26.672848309 +0000 UTC m=+0.020037707 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:21:26 np0005588920 systemd[1]: Started libcrun container.
Jan 20 10:21:26 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/704aa8525b3c16f610c16236a7283a7ee80b019472f2abfe39ea6ea37db1f770/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:21:26 np0005588920 podman[301101]: 2026-01-20 15:21:26.78573199 +0000 UTC m=+0.132921298 container init c1388818b385b87a3b0742ce6077d4cac6f28e5062acf2551d26438e52e54ea9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 20 10:21:26 np0005588920 podman[301101]: 2026-01-20 15:21:26.791124523 +0000 UTC m=+0.138313801 container start c1388818b385b87a3b0742ce6077d4cac6f28e5062acf2551d26438e52e54ea9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:21:26 np0005588920 neutron-haproxy-ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a[301116]: [NOTICE]   (301120) : New worker (301122) forked
Jan 20 10:21:26 np0005588920 neutron-haproxy-ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a[301116]: [NOTICE]   (301120) : Loading success.
Jan 20 10:21:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:21:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:26.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:21:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:26.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:28 np0005588920 nova_compute[226886]: 2026-01-20 15:21:28.233 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:28 np0005588920 nova_compute[226886]: 2026-01-20 15:21:28.370 226890 DEBUG nova.compute.manager [req-3b578b4d-dc19-40fd-8ccc-b3f166f9bf3b req-f90b683d-239a-450b-8365-879ea491c226 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Received event network-vif-plugged-31dd33a4-8964-40e6-9bcf-0219c62004df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:21:28 np0005588920 nova_compute[226886]: 2026-01-20 15:21:28.370 226890 DEBUG oslo_concurrency.lockutils [req-3b578b4d-dc19-40fd-8ccc-b3f166f9bf3b req-f90b683d-239a-450b-8365-879ea491c226 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "e6540d15-a33f-4638-b102-8a1629193c18-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:21:28 np0005588920 nova_compute[226886]: 2026-01-20 15:21:28.371 226890 DEBUG oslo_concurrency.lockutils [req-3b578b4d-dc19-40fd-8ccc-b3f166f9bf3b req-f90b683d-239a-450b-8365-879ea491c226 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "e6540d15-a33f-4638-b102-8a1629193c18-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:21:28 np0005588920 nova_compute[226886]: 2026-01-20 15:21:28.371 226890 DEBUG oslo_concurrency.lockutils [req-3b578b4d-dc19-40fd-8ccc-b3f166f9bf3b req-f90b683d-239a-450b-8365-879ea491c226 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "e6540d15-a33f-4638-b102-8a1629193c18-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:21:28 np0005588920 nova_compute[226886]: 2026-01-20 15:21:28.371 226890 DEBUG nova.compute.manager [req-3b578b4d-dc19-40fd-8ccc-b3f166f9bf3b req-f90b683d-239a-450b-8365-879ea491c226 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] No waiting events found dispatching network-vif-plugged-31dd33a4-8964-40e6-9bcf-0219c62004df pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:21:28 np0005588920 nova_compute[226886]: 2026-01-20 15:21:28.371 226890 WARNING nova.compute.manager [req-3b578b4d-dc19-40fd-8ccc-b3f166f9bf3b req-f90b683d-239a-450b-8365-879ea491c226 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Received unexpected event network-vif-plugged-31dd33a4-8964-40e6-9bcf-0219c62004df for instance with vm_state active and task_state None.#033[00m
Jan 20 10:21:28 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:21:28 np0005588920 nova_compute[226886]: 2026-01-20 15:21:28.855 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:28 np0005588920 NetworkManager[49076]: <info>  [1768922488.8562] manager: (patch-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/436)
Jan 20 10:21:28 np0005588920 NetworkManager[49076]: <info>  [1768922488.8570] manager: (patch-br-int-to-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/437)
Jan 20 10:21:28 np0005588920 nova_compute[226886]: 2026-01-20 15:21:28.933 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:28 np0005588920 ovn_controller[133971]: 2026-01-20T15:21:28Z|00933|binding|INFO|Releasing lport 2c0bba0e-e9b6-4ece-8349-62642b94d91d from this chassis (sb_readonly=0)
Jan 20 10:21:28 np0005588920 nova_compute[226886]: 2026-01-20 15:21:28.942 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:21:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:28.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:21:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:28.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:29 np0005588920 nova_compute[226886]: 2026-01-20 15:21:29.489 226890 DEBUG nova.compute.manager [req-0bf90047-9b86-4d3c-a23f-1727d53f9c5f req-ac6c896b-8e31-40aa-afe4-3a265b508833 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Received event network-changed-31dd33a4-8964-40e6-9bcf-0219c62004df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:21:29 np0005588920 nova_compute[226886]: 2026-01-20 15:21:29.490 226890 DEBUG nova.compute.manager [req-0bf90047-9b86-4d3c-a23f-1727d53f9c5f req-ac6c896b-8e31-40aa-afe4-3a265b508833 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Refreshing instance network info cache due to event network-changed-31dd33a4-8964-40e6-9bcf-0219c62004df. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:21:29 np0005588920 nova_compute[226886]: 2026-01-20 15:21:29.490 226890 DEBUG oslo_concurrency.lockutils [req-0bf90047-9b86-4d3c-a23f-1727d53f9c5f req-ac6c896b-8e31-40aa-afe4-3a265b508833 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-e6540d15-a33f-4638-b102-8a1629193c18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:21:29 np0005588920 nova_compute[226886]: 2026-01-20 15:21:29.490 226890 DEBUG oslo_concurrency.lockutils [req-0bf90047-9b86-4d3c-a23f-1727d53f9c5f req-ac6c896b-8e31-40aa-afe4-3a265b508833 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-e6540d15-a33f-4638-b102-8a1629193c18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:21:29 np0005588920 nova_compute[226886]: 2026-01-20 15:21:29.490 226890 DEBUG nova.network.neutron [req-0bf90047-9b86-4d3c-a23f-1727d53f9c5f req-ac6c896b-8e31-40aa-afe4-3a265b508833 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Refreshing network info cache for port 31dd33a4-8964-40e6-9bcf-0219c62004df _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:21:30 np0005588920 nova_compute[226886]: 2026-01-20 15:21:30.060 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:30 np0005588920 nova_compute[226886]: 2026-01-20 15:21:30.896 226890 DEBUG nova.network.neutron [req-0bf90047-9b86-4d3c-a23f-1727d53f9c5f req-ac6c896b-8e31-40aa-afe4-3a265b508833 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Updated VIF entry in instance network info cache for port 31dd33a4-8964-40e6-9bcf-0219c62004df. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:21:30 np0005588920 nova_compute[226886]: 2026-01-20 15:21:30.897 226890 DEBUG nova.network.neutron [req-0bf90047-9b86-4d3c-a23f-1727d53f9c5f req-ac6c896b-8e31-40aa-afe4-3a265b508833 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Updating instance_info_cache with network_info: [{"id": "31dd33a4-8964-40e6-9bcf-0219c62004df", "address": "fa:16:3e:6c:87:c3", "network": {"id": "76c2d716-7d14-4bc1-b83b-a3290ee99d9a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-782760714-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9156c0a9920c4721843416b9a44404f9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31dd33a4-89", "ovs_interfaceid": "31dd33a4-8964-40e6-9bcf-0219c62004df", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:21:30 np0005588920 nova_compute[226886]: 2026-01-20 15:21:30.913 226890 DEBUG oslo_concurrency.lockutils [req-0bf90047-9b86-4d3c-a23f-1727d53f9c5f req-ac6c896b-8e31-40aa-afe4-3a265b508833 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-e6540d15-a33f-4638-b102-8a1629193c18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:21:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:21:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:30.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:21:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:30.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:21:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:32.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:21:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:32.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:33 np0005588920 nova_compute[226886]: 2026-01-20 15:21:33.236 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:33 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:21:33 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:21:33 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:21:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:34.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:34.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:35 np0005588920 nova_compute[226886]: 2026-01-20 15:21:35.060 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:21:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:36.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:21:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:21:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:36.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:21:37 np0005588920 podman[301182]: 2026-01-20 15:21:37.030299053 +0000 UTC m=+0.112579043 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:21:38 np0005588920 nova_compute[226886]: 2026-01-20 15:21:38.287 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:38 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:21:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:38.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:38.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:39 np0005588920 ovn_controller[133971]: 2026-01-20T15:21:39Z|00128|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:6c:87:c3 10.100.0.7
Jan 20 10:21:39 np0005588920 ovn_controller[133971]: 2026-01-20T15:21:39Z|00129|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6c:87:c3 10.100.0.7
Jan 20 10:21:40 np0005588920 nova_compute[226886]: 2026-01-20 15:21:40.062 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:40.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:40.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:21:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:42.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:21:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:42.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:43 np0005588920 nova_compute[226886]: 2026-01-20 15:21:43.289 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:43 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:21:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:44.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:44.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:45 np0005588920 nova_compute[226886]: 2026-01-20 15:21:45.064 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:21:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:46.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:21:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:46.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:47 np0005588920 podman[301209]: 2026-01-20 15:21:47.955144226 +0000 UTC m=+0.046067333 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:21:48 np0005588920 nova_compute[226886]: 2026-01-20 15:21:48.290 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:48 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:21:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:21:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:48.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:21:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:48.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:50 np0005588920 nova_compute[226886]: 2026-01-20 15:21:50.066 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:50.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:50.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:52.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:52.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:53 np0005588920 nova_compute[226886]: 2026-01-20 15:21:53.292 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:53 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:21:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:54.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:21:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:54.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:21:55 np0005588920 nova_compute[226886]: 2026-01-20 15:21:55.068 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:56.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:56.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:58 np0005588920 nova_compute[226886]: 2026-01-20 15:21:58.295 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:58 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:21:58 np0005588920 ovn_controller[133971]: 2026-01-20T15:21:58Z|00934|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 20 10:21:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:58.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:21:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:59.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:00 np0005588920 nova_compute[226886]: 2026-01-20 15:22:00.069 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:00.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:01.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:01 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e416 e416: 3 total, 3 up, 3 in
Jan 20 10:22:02 np0005588920 nova_compute[226886]: 2026-01-20 15:22:02.481 226890 DEBUG oslo_concurrency.lockutils [None req-f23772ed-41be-4d37-b02f-23219ad0d9b8 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Acquiring lock "e6540d15-a33f-4638-b102-8a1629193c18" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:22:02 np0005588920 nova_compute[226886]: 2026-01-20 15:22:02.482 226890 DEBUG oslo_concurrency.lockutils [None req-f23772ed-41be-4d37-b02f-23219ad0d9b8 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "e6540d15-a33f-4638-b102-8a1629193c18" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:22:02 np0005588920 nova_compute[226886]: 2026-01-20 15:22:02.498 226890 DEBUG nova.objects.instance [None req-f23772ed-41be-4d37-b02f-23219ad0d9b8 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lazy-loading 'flavor' on Instance uuid e6540d15-a33f-4638-b102-8a1629193c18 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:22:02 np0005588920 nova_compute[226886]: 2026-01-20 15:22:02.532 226890 DEBUG oslo_concurrency.lockutils [None req-f23772ed-41be-4d37-b02f-23219ad0d9b8 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "e6540d15-a33f-4638-b102-8a1629193c18" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.051s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:22:02 np0005588920 nova_compute[226886]: 2026-01-20 15:22:02.768 226890 DEBUG oslo_concurrency.lockutils [None req-f23772ed-41be-4d37-b02f-23219ad0d9b8 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Acquiring lock "e6540d15-a33f-4638-b102-8a1629193c18" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:22:02 np0005588920 nova_compute[226886]: 2026-01-20 15:22:02.769 226890 DEBUG oslo_concurrency.lockutils [None req-f23772ed-41be-4d37-b02f-23219ad0d9b8 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "e6540d15-a33f-4638-b102-8a1629193c18" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:22:02 np0005588920 nova_compute[226886]: 2026-01-20 15:22:02.769 226890 INFO nova.compute.manager [None req-f23772ed-41be-4d37-b02f-23219ad0d9b8 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Attaching volume 02088fae-4a9d-4027-a9b3-19d5b159fed0 to /dev/vdb#033[00m
Jan 20 10:22:02 np0005588920 nova_compute[226886]: 2026-01-20 15:22:02.966 226890 DEBUG os_brick.utils [None req-f23772ed-41be-4d37-b02f-23219ad0d9b8 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 20 10:22:02 np0005588920 nova_compute[226886]: 2026-01-20 15:22:02.968 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:22:02 np0005588920 nova_compute[226886]: 2026-01-20 15:22:02.981 231437 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.013s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:22:02 np0005588920 nova_compute[226886]: 2026-01-20 15:22:02.981 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[6276b8f3-97f9-440a-bdca-6067b8427867]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:22:02 np0005588920 nova_compute[226886]: 2026-01-20 15:22:02.983 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:22:02 np0005588920 nova_compute[226886]: 2026-01-20 15:22:02.991 231437 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:22:02 np0005588920 nova_compute[226886]: 2026-01-20 15:22:02.991 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[1fce3c2f-e5b9-41a7-8516-e40558cdc956]: (4, ('InitiatorName=iqn.1994-05.com.redhat:f7e7b2e5d28e', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:22:02 np0005588920 nova_compute[226886]: 2026-01-20 15:22:02.992 231437 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:22:03 np0005588920 nova_compute[226886]: 2026-01-20 15:22:03.001 231437 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:22:03 np0005588920 nova_compute[226886]: 2026-01-20 15:22:03.001 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[e7464f06-e00d-4175-bdb9-768cd5bba5a7]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:22:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:22:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:03.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:22:03 np0005588920 nova_compute[226886]: 2026-01-20 15:22:03.003 231437 DEBUG oslo.privsep.daemon [-] privsep: reply[3ec934e0-2e60-43c3-944c-587c335106ac]: (4, '1190ec40-2b89-4358-8dec-733c5829fbed') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:22:03 np0005588920 nova_compute[226886]: 2026-01-20 15:22:03.004 226890 DEBUG oslo_concurrency.processutils [None req-f23772ed-41be-4d37-b02f-23219ad0d9b8 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:22:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:22:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:03.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:22:03 np0005588920 nova_compute[226886]: 2026-01-20 15:22:03.042 226890 DEBUG oslo_concurrency.processutils [None req-f23772ed-41be-4d37-b02f-23219ad0d9b8 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] CMD "nvme version" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:22:03 np0005588920 nova_compute[226886]: 2026-01-20 15:22:03.044 226890 DEBUG os_brick.initiator.connectors.lightos [None req-f23772ed-41be-4d37-b02f-23219ad0d9b8 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 20 10:22:03 np0005588920 nova_compute[226886]: 2026-01-20 15:22:03.045 226890 DEBUG os_brick.initiator.connectors.lightos [None req-f23772ed-41be-4d37-b02f-23219ad0d9b8 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 20 10:22:03 np0005588920 nova_compute[226886]: 2026-01-20 15:22:03.046 226890 DEBUG os_brick.initiator.connectors.lightos [None req-f23772ed-41be-4d37-b02f-23219ad0d9b8 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 20 10:22:03 np0005588920 nova_compute[226886]: 2026-01-20 15:22:03.046 226890 DEBUG os_brick.utils [None req-f23772ed-41be-4d37-b02f-23219ad0d9b8 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] <== get_connector_properties: return (79ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:f7e7b2e5d28e', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '1190ec40-2b89-4358-8dec-733c5829fbed', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 20 10:22:03 np0005588920 nova_compute[226886]: 2026-01-20 15:22:03.047 226890 DEBUG nova.virt.block_device [None req-f23772ed-41be-4d37-b02f-23219ad0d9b8 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Updating existing volume attachment record: 9703280d-0054-47db-b0be-f4797e6f98b3 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 20 10:22:03 np0005588920 nova_compute[226886]: 2026-01-20 15:22:03.298 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:03 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e416 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:22:03 np0005588920 nova_compute[226886]: 2026-01-20 15:22:03.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:22:03 np0005588920 nova_compute[226886]: 2026-01-20 15:22:03.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:22:03 np0005588920 nova_compute[226886]: 2026-01-20 15:22:03.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:22:03 np0005588920 nova_compute[226886]: 2026-01-20 15:22:03.784 226890 DEBUG nova.objects.instance [None req-f23772ed-41be-4d37-b02f-23219ad0d9b8 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lazy-loading 'flavor' on Instance uuid e6540d15-a33f-4638-b102-8a1629193c18 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:22:03 np0005588920 nova_compute[226886]: 2026-01-20 15:22:03.813 226890 DEBUG nova.virt.libvirt.driver [None req-f23772ed-41be-4d37-b02f-23219ad0d9b8 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Attempting to attach volume 02088fae-4a9d-4027-a9b3-19d5b159fed0 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 20 10:22:03 np0005588920 nova_compute[226886]: 2026-01-20 15:22:03.817 226890 DEBUG nova.virt.libvirt.guest [None req-f23772ed-41be-4d37-b02f-23219ad0d9b8 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] attach device xml: <disk type="network" device="disk">
Jan 20 10:22:03 np0005588920 nova_compute[226886]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 10:22:03 np0005588920 nova_compute[226886]:  <source protocol="rbd" name="volumes/volume-02088fae-4a9d-4027-a9b3-19d5b159fed0">
Jan 20 10:22:03 np0005588920 nova_compute[226886]:    <host name="192.168.122.100" port="6789"/>
Jan 20 10:22:03 np0005588920 nova_compute[226886]:    <host name="192.168.122.102" port="6789"/>
Jan 20 10:22:03 np0005588920 nova_compute[226886]:    <host name="192.168.122.101" port="6789"/>
Jan 20 10:22:03 np0005588920 nova_compute[226886]:  </source>
Jan 20 10:22:03 np0005588920 nova_compute[226886]:  <auth username="openstack">
Jan 20 10:22:03 np0005588920 nova_compute[226886]:    <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:22:03 np0005588920 nova_compute[226886]:  </auth>
Jan 20 10:22:03 np0005588920 nova_compute[226886]:  <target dev="vdb" bus="virtio"/>
Jan 20 10:22:03 np0005588920 nova_compute[226886]:  <serial>02088fae-4a9d-4027-a9b3-19d5b159fed0</serial>
Jan 20 10:22:03 np0005588920 nova_compute[226886]: </disk>
Jan 20 10:22:03 np0005588920 nova_compute[226886]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 20 10:22:03 np0005588920 nova_compute[226886]: 2026-01-20 15:22:03.924 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "refresh_cache-e6540d15-a33f-4638-b102-8a1629193c18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:22:03 np0005588920 nova_compute[226886]: 2026-01-20 15:22:03.924 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquired lock "refresh_cache-e6540d15-a33f-4638-b102-8a1629193c18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:22:03 np0005588920 nova_compute[226886]: 2026-01-20 15:22:03.924 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 10:22:03 np0005588920 nova_compute[226886]: 2026-01-20 15:22:03.924 226890 DEBUG nova.objects.instance [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid e6540d15-a33f-4638-b102-8a1629193c18 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:22:04 np0005588920 nova_compute[226886]: 2026-01-20 15:22:04.076 226890 DEBUG nova.virt.libvirt.driver [None req-f23772ed-41be-4d37-b02f-23219ad0d9b8 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:22:04 np0005588920 nova_compute[226886]: 2026-01-20 15:22:04.076 226890 DEBUG nova.virt.libvirt.driver [None req-f23772ed-41be-4d37-b02f-23219ad0d9b8 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:22:04 np0005588920 nova_compute[226886]: 2026-01-20 15:22:04.076 226890 DEBUG nova.virt.libvirt.driver [None req-f23772ed-41be-4d37-b02f-23219ad0d9b8 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:22:04 np0005588920 nova_compute[226886]: 2026-01-20 15:22:04.076 226890 DEBUG nova.virt.libvirt.driver [None req-f23772ed-41be-4d37-b02f-23219ad0d9b8 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] No VIF found with MAC fa:16:3e:6c:87:c3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:22:04 np0005588920 nova_compute[226886]: 2026-01-20 15:22:04.442 226890 DEBUG oslo_concurrency.lockutils [None req-f23772ed-41be-4d37-b02f-23219ad0d9b8 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "e6540d15-a33f-4638-b102-8a1629193c18" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 1.674s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:22:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:05.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:05.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:05 np0005588920 nova_compute[226886]: 2026-01-20 15:22:05.070 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:06 np0005588920 nova_compute[226886]: 2026-01-20 15:22:06.090 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Updating instance_info_cache with network_info: [{"id": "31dd33a4-8964-40e6-9bcf-0219c62004df", "address": "fa:16:3e:6c:87:c3", "network": {"id": "76c2d716-7d14-4bc1-b83b-a3290ee99d9a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-782760714-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9156c0a9920c4721843416b9a44404f9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31dd33a4-89", "ovs_interfaceid": "31dd33a4-8964-40e6-9bcf-0219c62004df", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:22:06 np0005588920 nova_compute[226886]: 2026-01-20 15:22:06.154 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Releasing lock "refresh_cache-e6540d15-a33f-4638-b102-8a1629193c18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:22:06 np0005588920 nova_compute[226886]: 2026-01-20 15:22:06.154 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 10:22:06 np0005588920 nova_compute[226886]: 2026-01-20 15:22:06.155 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:22:06 np0005588920 nova_compute[226886]: 2026-01-20 15:22:06.170 226890 DEBUG oslo_concurrency.lockutils [None req-21f45e19-bf68-4c97-bfb4-a1c7ea64c42b cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Acquiring lock "e6540d15-a33f-4638-b102-8a1629193c18" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:22:06 np0005588920 nova_compute[226886]: 2026-01-20 15:22:06.171 226890 DEBUG oslo_concurrency.lockutils [None req-21f45e19-bf68-4c97-bfb4-a1c7ea64c42b cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "e6540d15-a33f-4638-b102-8a1629193c18" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:22:06 np0005588920 nova_compute[226886]: 2026-01-20 15:22:06.190 226890 INFO nova.compute.manager [None req-21f45e19-bf68-4c97-bfb4-a1c7ea64c42b cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Detaching volume 02088fae-4a9d-4027-a9b3-19d5b159fed0#033[00m
Jan 20 10:22:06 np0005588920 nova_compute[226886]: 2026-01-20 15:22:06.193 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:22:06 np0005588920 nova_compute[226886]: 2026-01-20 15:22:06.194 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:22:06 np0005588920 nova_compute[226886]: 2026-01-20 15:22:06.194 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:22:06 np0005588920 nova_compute[226886]: 2026-01-20 15:22:06.194 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:22:06 np0005588920 nova_compute[226886]: 2026-01-20 15:22:06.194 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:22:06 np0005588920 nova_compute[226886]: 2026-01-20 15:22:06.445 226890 INFO nova.virt.block_device [None req-21f45e19-bf68-4c97-bfb4-a1c7ea64c42b cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Attempting to driver detach volume 02088fae-4a9d-4027-a9b3-19d5b159fed0 from mountpoint /dev/vdb#033[00m
Jan 20 10:22:06 np0005588920 nova_compute[226886]: 2026-01-20 15:22:06.455 226890 DEBUG nova.virt.libvirt.driver [None req-21f45e19-bf68-4c97-bfb4-a1c7ea64c42b cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Attempting to detach device vdb from instance e6540d15-a33f-4638-b102-8a1629193c18 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 20 10:22:06 np0005588920 nova_compute[226886]: 2026-01-20 15:22:06.456 226890 DEBUG nova.virt.libvirt.guest [None req-21f45e19-bf68-4c97-bfb4-a1c7ea64c42b cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] detach device xml: <disk type="network" device="disk">
Jan 20 10:22:06 np0005588920 nova_compute[226886]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 10:22:06 np0005588920 nova_compute[226886]:  <source protocol="rbd" name="volumes/volume-02088fae-4a9d-4027-a9b3-19d5b159fed0">
Jan 20 10:22:06 np0005588920 nova_compute[226886]:    <host name="192.168.122.100" port="6789"/>
Jan 20 10:22:06 np0005588920 nova_compute[226886]:    <host name="192.168.122.102" port="6789"/>
Jan 20 10:22:06 np0005588920 nova_compute[226886]:    <host name="192.168.122.101" port="6789"/>
Jan 20 10:22:06 np0005588920 nova_compute[226886]:  </source>
Jan 20 10:22:06 np0005588920 nova_compute[226886]:  <target dev="vdb" bus="virtio"/>
Jan 20 10:22:06 np0005588920 nova_compute[226886]:  <serial>02088fae-4a9d-4027-a9b3-19d5b159fed0</serial>
Jan 20 10:22:06 np0005588920 nova_compute[226886]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 20 10:22:06 np0005588920 nova_compute[226886]: </disk>
Jan 20 10:22:06 np0005588920 nova_compute[226886]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 20 10:22:06 np0005588920 nova_compute[226886]: 2026-01-20 15:22:06.462 226890 INFO nova.virt.libvirt.driver [None req-21f45e19-bf68-4c97-bfb4-a1c7ea64c42b cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Successfully detached device vdb from instance e6540d15-a33f-4638-b102-8a1629193c18 from the persistent domain config.#033[00m
Jan 20 10:22:06 np0005588920 nova_compute[226886]: 2026-01-20 15:22:06.462 226890 DEBUG nova.virt.libvirt.driver [None req-21f45e19-bf68-4c97-bfb4-a1c7ea64c42b cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance e6540d15-a33f-4638-b102-8a1629193c18 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 20 10:22:06 np0005588920 nova_compute[226886]: 2026-01-20 15:22:06.463 226890 DEBUG nova.virt.libvirt.guest [None req-21f45e19-bf68-4c97-bfb4-a1c7ea64c42b cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] detach device xml: <disk type="network" device="disk">
Jan 20 10:22:06 np0005588920 nova_compute[226886]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 10:22:06 np0005588920 nova_compute[226886]:  <source protocol="rbd" name="volumes/volume-02088fae-4a9d-4027-a9b3-19d5b159fed0">
Jan 20 10:22:06 np0005588920 nova_compute[226886]:    <host name="192.168.122.100" port="6789"/>
Jan 20 10:22:06 np0005588920 nova_compute[226886]:    <host name="192.168.122.102" port="6789"/>
Jan 20 10:22:06 np0005588920 nova_compute[226886]:    <host name="192.168.122.101" port="6789"/>
Jan 20 10:22:06 np0005588920 nova_compute[226886]:  </source>
Jan 20 10:22:06 np0005588920 nova_compute[226886]:  <target dev="vdb" bus="virtio"/>
Jan 20 10:22:06 np0005588920 nova_compute[226886]:  <serial>02088fae-4a9d-4027-a9b3-19d5b159fed0</serial>
Jan 20 10:22:06 np0005588920 nova_compute[226886]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 20 10:22:06 np0005588920 nova_compute[226886]: </disk>
Jan 20 10:22:06 np0005588920 nova_compute[226886]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 20 10:22:06 np0005588920 nova_compute[226886]: 2026-01-20 15:22:06.509 226890 DEBUG nova.virt.libvirt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Received event <DeviceRemovedEvent: 1768922526.509015, e6540d15-a33f-4638-b102-8a1629193c18 => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 20 10:22:06 np0005588920 nova_compute[226886]: 2026-01-20 15:22:06.510 226890 DEBUG nova.virt.libvirt.driver [None req-21f45e19-bf68-4c97-bfb4-a1c7ea64c42b cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance e6540d15-a33f-4638-b102-8a1629193c18 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 20 10:22:06 np0005588920 nova_compute[226886]: 2026-01-20 15:22:06.512 226890 INFO nova.virt.libvirt.driver [None req-21f45e19-bf68-4c97-bfb4-a1c7ea64c42b cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Successfully detached device vdb from instance e6540d15-a33f-4638-b102-8a1629193c18 from the live domain config.#033[00m
Jan 20 10:22:06 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:22:06 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4076641553' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:22:06 np0005588920 nova_compute[226886]: 2026-01-20 15:22:06.644 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:22:06 np0005588920 nova_compute[226886]: 2026-01-20 15:22:06.918 226890 DEBUG nova.objects.instance [None req-21f45e19-bf68-4c97-bfb4-a1c7ea64c42b cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lazy-loading 'flavor' on Instance uuid e6540d15-a33f-4638-b102-8a1629193c18 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:22:06 np0005588920 nova_compute[226886]: 2026-01-20 15:22:06.937 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-000000c7 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:22:06 np0005588920 nova_compute[226886]: 2026-01-20 15:22:06.937 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-000000c7 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:22:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:07.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:07.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:07 np0005588920 nova_compute[226886]: 2026-01-20 15:22:07.090 226890 DEBUG oslo_concurrency.lockutils [None req-21f45e19-bf68-4c97-bfb4-a1c7ea64c42b cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "e6540d15-a33f-4638-b102-8a1629193c18" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 0.920s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:22:07 np0005588920 nova_compute[226886]: 2026-01-20 15:22:07.096 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:22:07 np0005588920 nova_compute[226886]: 2026-01-20 15:22:07.097 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3967MB free_disk=20.897098541259766GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:22:07 np0005588920 nova_compute[226886]: 2026-01-20 15:22:07.097 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:22:07 np0005588920 nova_compute[226886]: 2026-01-20 15:22:07.098 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:22:07 np0005588920 nova_compute[226886]: 2026-01-20 15:22:07.366 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance e6540d15-a33f-4638-b102-8a1629193c18 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:22:07 np0005588920 nova_compute[226886]: 2026-01-20 15:22:07.367 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:22:07 np0005588920 nova_compute[226886]: 2026-01-20 15:22:07.368 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:22:07 np0005588920 nova_compute[226886]: 2026-01-20 15:22:07.519 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:22:07 np0005588920 nova_compute[226886]: 2026-01-20 15:22:07.764 226890 DEBUG oslo_concurrency.lockutils [None req-4786df9f-a52f-41ba-96eb-6572963dd3a0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Acquiring lock "e6540d15-a33f-4638-b102-8a1629193c18" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:22:07 np0005588920 nova_compute[226886]: 2026-01-20 15:22:07.764 226890 DEBUG oslo_concurrency.lockutils [None req-4786df9f-a52f-41ba-96eb-6572963dd3a0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "e6540d15-a33f-4638-b102-8a1629193c18" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:22:07 np0005588920 nova_compute[226886]: 2026-01-20 15:22:07.765 226890 DEBUG oslo_concurrency.lockutils [None req-4786df9f-a52f-41ba-96eb-6572963dd3a0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Acquiring lock "e6540d15-a33f-4638-b102-8a1629193c18-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:22:07 np0005588920 nova_compute[226886]: 2026-01-20 15:22:07.765 226890 DEBUG oslo_concurrency.lockutils [None req-4786df9f-a52f-41ba-96eb-6572963dd3a0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "e6540d15-a33f-4638-b102-8a1629193c18-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:22:07 np0005588920 nova_compute[226886]: 2026-01-20 15:22:07.765 226890 DEBUG oslo_concurrency.lockutils [None req-4786df9f-a52f-41ba-96eb-6572963dd3a0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "e6540d15-a33f-4638-b102-8a1629193c18-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:22:07 np0005588920 nova_compute[226886]: 2026-01-20 15:22:07.767 226890 INFO nova.compute.manager [None req-4786df9f-a52f-41ba-96eb-6572963dd3a0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Terminating instance#033[00m
Jan 20 10:22:07 np0005588920 nova_compute[226886]: 2026-01-20 15:22:07.768 226890 DEBUG nova.compute.manager [None req-4786df9f-a52f-41ba-96eb-6572963dd3a0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:22:07 np0005588920 kernel: tap31dd33a4-89 (unregistering): left promiscuous mode
Jan 20 10:22:07 np0005588920 NetworkManager[49076]: <info>  [1768922527.8249] device (tap31dd33a4-89): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:22:07 np0005588920 ovn_controller[133971]: 2026-01-20T15:22:07Z|00935|binding|INFO|Releasing lport 31dd33a4-8964-40e6-9bcf-0219c62004df from this chassis (sb_readonly=0)
Jan 20 10:22:07 np0005588920 ovn_controller[133971]: 2026-01-20T15:22:07Z|00936|binding|INFO|Setting lport 31dd33a4-8964-40e6-9bcf-0219c62004df down in Southbound
Jan 20 10:22:07 np0005588920 ovn_controller[133971]: 2026-01-20T15:22:07Z|00937|binding|INFO|Removing iface tap31dd33a4-89 ovn-installed in OVS
Jan 20 10:22:07 np0005588920 nova_compute[226886]: 2026-01-20 15:22:07.838 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:07 np0005588920 nova_compute[226886]: 2026-01-20 15:22:07.852 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:07 np0005588920 systemd[1]: machine-qemu\x2d96\x2dinstance\x2d000000c7.scope: Deactivated successfully.
Jan 20 10:22:07 np0005588920 systemd[1]: machine-qemu\x2d96\x2dinstance\x2d000000c7.scope: Consumed 14.107s CPU time.
Jan 20 10:22:07 np0005588920 systemd-machined[196121]: Machine qemu-96-instance-000000c7 terminated.
Jan 20 10:22:07 np0005588920 podman[301303]: 2026-01-20 15:22:07.924223915 +0000 UTC m=+0.080149757 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base 
Image, tcib_managed=true, container_name=ovn_controller)
Jan 20 10:22:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:22:07 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3036926870' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:22:08 np0005588920 nova_compute[226886]: 2026-01-20 15:22:08.012 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.493s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:22:08 np0005588920 nova_compute[226886]: 2026-01-20 15:22:08.015 226890 INFO nova.virt.libvirt.driver [-] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Instance destroyed successfully.#033[00m
Jan 20 10:22:08 np0005588920 nova_compute[226886]: 2026-01-20 15:22:08.015 226890 DEBUG nova.objects.instance [None req-4786df9f-a52f-41ba-96eb-6572963dd3a0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lazy-loading 'resources' on Instance uuid e6540d15-a33f-4638-b102-8a1629193c18 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:22:08 np0005588920 nova_compute[226886]: 2026-01-20 15:22:08.019 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:22:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:22:08.190 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6c:87:c3 10.100.0.7'], port_security=['fa:16:3e:6c:87:c3 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'e6540d15-a33f-4638-b102-8a1629193c18', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-76c2d716-7d14-4bc1-b83b-a3290ee99d9a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9156c0a9920c4721843416b9a44404f9', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fdcc40ab-0e87-4bcf-a965-9808ad5ff106', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.245'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2bfc4e2a-eeed-480e-aa18-68fc6c8f2cc2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=31dd33a4-8964-40e6-9bcf-0219c62004df) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:22:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:22:08.192 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 31dd33a4-8964-40e6-9bcf-0219c62004df in datapath 76c2d716-7d14-4bc1-b83b-a3290ee99d9a unbound from our chassis#033[00m
Jan 20 10:22:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:22:08.192 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 76c2d716-7d14-4bc1-b83b-a3290ee99d9a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:22:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:22:08.194 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[089aafcf-35c4-4a1b-bb99-def44509d6d1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:22:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:22:08.194 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a namespace which is not needed anymore#033[00m
Jan 20 10:22:08 np0005588920 nova_compute[226886]: 2026-01-20 15:22:08.244 226890 DEBUG nova.virt.libvirt.vif [None req-4786df9f-a52f-41ba-96eb-6572963dd3a0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:21:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1412753994',display_name='tempest-AttachVolumeNegativeTest-server-1412753994',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumenegativetest-server-1412753994',id=199,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJSD0uIE/eeiZTZtRE8MuMreHdWlysBXNmTJ+kR1VbAGP7zrBwpR9A0gwobx0kS0+2sZH/C0UGF0TEjCQgfGD7Qj2a/ny88m5Z02zvhjR09YJ67bFDf6iKxXKjBBzwCZZg==',key_name='tempest-keypair-1535435627',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:21:26Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9156c0a9920c4721843416b9a44404f9',ramdisk_id='',reservation_id='r-utp8rb16',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeNegativeTest-1505789262',owner_user_name='tempest-AttachVolumeNegativeTest-1505789262-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:21:26Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='cd9a8f26b71f4631a387e555e6b18428',uuid=e6540d15-a33f-4638-b102-8a1629193c18,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "31dd33a4-8964-40e6-9bcf-0219c62004df", "address": "fa:16:3e:6c:87:c3", "network": {"id": "76c2d716-7d14-4bc1-b83b-a3290ee99d9a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-782760714-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9156c0a9920c4721843416b9a44404f9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31dd33a4-89", "ovs_interfaceid": "31dd33a4-8964-40e6-9bcf-0219c62004df", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:22:08 np0005588920 nova_compute[226886]: 2026-01-20 15:22:08.244 226890 DEBUG nova.network.os_vif_util [None req-4786df9f-a52f-41ba-96eb-6572963dd3a0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Converting VIF {"id": "31dd33a4-8964-40e6-9bcf-0219c62004df", "address": "fa:16:3e:6c:87:c3", "network": {"id": "76c2d716-7d14-4bc1-b83b-a3290ee99d9a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-782760714-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9156c0a9920c4721843416b9a44404f9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31dd33a4-89", "ovs_interfaceid": "31dd33a4-8964-40e6-9bcf-0219c62004df", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:22:08 np0005588920 nova_compute[226886]: 2026-01-20 15:22:08.245 226890 DEBUG nova.network.os_vif_util [None req-4786df9f-a52f-41ba-96eb-6572963dd3a0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6c:87:c3,bridge_name='br-int',has_traffic_filtering=True,id=31dd33a4-8964-40e6-9bcf-0219c62004df,network=Network(76c2d716-7d14-4bc1-b83b-a3290ee99d9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31dd33a4-89') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:22:08 np0005588920 nova_compute[226886]: 2026-01-20 15:22:08.245 226890 DEBUG os_vif [None req-4786df9f-a52f-41ba-96eb-6572963dd3a0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:6c:87:c3,bridge_name='br-int',has_traffic_filtering=True,id=31dd33a4-8964-40e6-9bcf-0219c62004df,network=Network(76c2d716-7d14-4bc1-b83b-a3290ee99d9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31dd33a4-89') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:22:08 np0005588920 nova_compute[226886]: 2026-01-20 15:22:08.248 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:08 np0005588920 nova_compute[226886]: 2026-01-20 15:22:08.248 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap31dd33a4-89, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:22:08 np0005588920 nova_compute[226886]: 2026-01-20 15:22:08.251 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:22:08 np0005588920 nova_compute[226886]: 2026-01-20 15:22:08.256 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:22:08 np0005588920 nova_compute[226886]: 2026-01-20 15:22:08.259 226890 INFO os_vif [None req-4786df9f-a52f-41ba-96eb-6572963dd3a0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:6c:87:c3,bridge_name='br-int',has_traffic_filtering=True,id=31dd33a4-8964-40e6-9bcf-0219c62004df,network=Network(76c2d716-7d14-4bc1-b83b-a3290ee99d9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31dd33a4-89')#033[00m
Jan 20 10:22:08 np0005588920 nova_compute[226886]: 2026-01-20 15:22:08.290 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:22:08 np0005588920 nova_compute[226886]: 2026-01-20 15:22:08.290 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.192s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:22:08 np0005588920 nova_compute[226886]: 2026-01-20 15:22:08.299 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:08 np0005588920 neutron-haproxy-ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a[301116]: [NOTICE]   (301120) : haproxy version is 2.8.14-c23fe91
Jan 20 10:22:08 np0005588920 neutron-haproxy-ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a[301116]: [NOTICE]   (301120) : path to executable is /usr/sbin/haproxy
Jan 20 10:22:08 np0005588920 neutron-haproxy-ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a[301116]: [WARNING]  (301120) : Exiting Master process...
Jan 20 10:22:08 np0005588920 neutron-haproxy-ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a[301116]: [ALERT]    (301120) : Current worker (301122) exited with code 143 (Terminated)
Jan 20 10:22:08 np0005588920 neutron-haproxy-ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a[301116]: [WARNING]  (301120) : All workers exited. Exiting... (0)
Jan 20 10:22:08 np0005588920 systemd[1]: libpod-c1388818b385b87a3b0742ce6077d4cac6f28e5062acf2551d26438e52e54ea9.scope: Deactivated successfully.
Jan 20 10:22:08 np0005588920 podman[301373]: 2026-01-20 15:22:08.325541818 +0000 UTC m=+0.043584202 container died c1388818b385b87a3b0742ce6077d4cac6f28e5062acf2551d26438e52e54ea9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 20 10:22:08 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c1388818b385b87a3b0742ce6077d4cac6f28e5062acf2551d26438e52e54ea9-userdata-shm.mount: Deactivated successfully.
Jan 20 10:22:08 np0005588920 systemd[1]: var-lib-containers-storage-overlay-704aa8525b3c16f610c16236a7283a7ee80b019472f2abfe39ea6ea37db1f770-merged.mount: Deactivated successfully.
Jan 20 10:22:08 np0005588920 podman[301373]: 2026-01-20 15:22:08.36203317 +0000 UTC m=+0.080075564 container cleanup c1388818b385b87a3b0742ce6077d4cac6f28e5062acf2551d26438e52e54ea9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 20 10:22:08 np0005588920 systemd[1]: libpod-conmon-c1388818b385b87a3b0742ce6077d4cac6f28e5062acf2551d26438e52e54ea9.scope: Deactivated successfully.
Jan 20 10:22:08 np0005588920 podman[301416]: 2026-01-20 15:22:08.419649708 +0000 UTC m=+0.036923404 container remove c1388818b385b87a3b0742ce6077d4cac6f28e5062acf2551d26438e52e54ea9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:22:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:22:08.427 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b1fe5997-65c4-451e-bb24-282ffcca2d84]: (4, ('Tue Jan 20 03:22:08 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a (c1388818b385b87a3b0742ce6077d4cac6f28e5062acf2551d26438e52e54ea9)\nc1388818b385b87a3b0742ce6077d4cac6f28e5062acf2551d26438e52e54ea9\nTue Jan 20 03:22:08 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a (c1388818b385b87a3b0742ce6077d4cac6f28e5062acf2551d26438e52e54ea9)\nc1388818b385b87a3b0742ce6077d4cac6f28e5062acf2551d26438e52e54ea9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:22:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:22:08.429 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a854adec-4f67-4e6e-85b4-f8f332323e43]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:22:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:22:08.430 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap76c2d716-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:22:08 np0005588920 kernel: tap76c2d716-70: left promiscuous mode
Jan 20 10:22:08 np0005588920 nova_compute[226886]: 2026-01-20 15:22:08.431 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:08 np0005588920 nova_compute[226886]: 2026-01-20 15:22:08.444 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:22:08.446 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[bb533a42-6e28-468f-8a9e-d238cbb9d47b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:22:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:22:08.459 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[92f30e18-0e56-40ee-b233-14bdc7bfe520]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:22:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:22:08.460 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ee2819e1-abc0-48d3-8612-aaff940db5cb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:22:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:22:08.475 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d1bfd352-9941-4f2c-b385-8ba56fa0cd47]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 752775, 'reachable_time': 25565, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 301432, 'error': None, 'target': 'ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:22:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:22:08.478 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:22:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:22:08.478 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[055b4806-2ab9-47c0-93f1-f821f3953135]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:22:08 np0005588920 systemd[1]: run-netns-ovnmeta\x2d76c2d716\x2d7d14\x2d4bc1\x2db83b\x2da3290ee99d9a.mount: Deactivated successfully.
Jan 20 10:22:08 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e416 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:22:08 np0005588920 nova_compute[226886]: 2026-01-20 15:22:08.614 226890 INFO nova.virt.libvirt.driver [None req-4786df9f-a52f-41ba-96eb-6572963dd3a0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Deleting instance files /var/lib/nova/instances/e6540d15-a33f-4638-b102-8a1629193c18_del#033[00m
Jan 20 10:22:08 np0005588920 nova_compute[226886]: 2026-01-20 15:22:08.615 226890 INFO nova.virt.libvirt.driver [None req-4786df9f-a52f-41ba-96eb-6572963dd3a0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Deletion of /var/lib/nova/instances/e6540d15-a33f-4638-b102-8a1629193c18_del complete#033[00m
Jan 20 10:22:08 np0005588920 nova_compute[226886]: 2026-01-20 15:22:08.674 226890 INFO nova.compute.manager [None req-4786df9f-a52f-41ba-96eb-6572963dd3a0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Took 0.91 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:22:08 np0005588920 nova_compute[226886]: 2026-01-20 15:22:08.675 226890 DEBUG oslo.service.loopingcall [None req-4786df9f-a52f-41ba-96eb-6572963dd3a0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:22:08 np0005588920 nova_compute[226886]: 2026-01-20 15:22:08.675 226890 DEBUG nova.compute.manager [-] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:22:08 np0005588920 nova_compute[226886]: 2026-01-20 15:22:08.675 226890 DEBUG nova.network.neutron [-] [instance: e6540d15-a33f-4638-b102-8a1629193c18] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:22:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:09.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:09.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:09 np0005588920 nova_compute[226886]: 2026-01-20 15:22:09.757 226890 DEBUG nova.compute.manager [req-3eb42224-c8da-4211-a64c-359672fc27c7 req-8e299d01-7fbd-400e-9c7e-cbe7af31aeef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Received event network-vif-unplugged-31dd33a4-8964-40e6-9bcf-0219c62004df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:22:09 np0005588920 nova_compute[226886]: 2026-01-20 15:22:09.757 226890 DEBUG oslo_concurrency.lockutils [req-3eb42224-c8da-4211-a64c-359672fc27c7 req-8e299d01-7fbd-400e-9c7e-cbe7af31aeef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "e6540d15-a33f-4638-b102-8a1629193c18-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:22:09 np0005588920 nova_compute[226886]: 2026-01-20 15:22:09.758 226890 DEBUG oslo_concurrency.lockutils [req-3eb42224-c8da-4211-a64c-359672fc27c7 req-8e299d01-7fbd-400e-9c7e-cbe7af31aeef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "e6540d15-a33f-4638-b102-8a1629193c18-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:22:09 np0005588920 nova_compute[226886]: 2026-01-20 15:22:09.759 226890 DEBUG oslo_concurrency.lockutils [req-3eb42224-c8da-4211-a64c-359672fc27c7 req-8e299d01-7fbd-400e-9c7e-cbe7af31aeef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "e6540d15-a33f-4638-b102-8a1629193c18-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:22:09 np0005588920 nova_compute[226886]: 2026-01-20 15:22:09.759 226890 DEBUG nova.compute.manager [req-3eb42224-c8da-4211-a64c-359672fc27c7 req-8e299d01-7fbd-400e-9c7e-cbe7af31aeef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] No waiting events found dispatching network-vif-unplugged-31dd33a4-8964-40e6-9bcf-0219c62004df pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:22:09 np0005588920 nova_compute[226886]: 2026-01-20 15:22:09.759 226890 DEBUG nova.compute.manager [req-3eb42224-c8da-4211-a64c-359672fc27c7 req-8e299d01-7fbd-400e-9c7e-cbe7af31aeef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Received event network-vif-unplugged-31dd33a4-8964-40e6-9bcf-0219c62004df for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:22:10 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:22:10.593 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=68, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=67) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:22:10 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:22:10.594 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:22:10 np0005588920 nova_compute[226886]: 2026-01-20 15:22:10.594 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:10 np0005588920 nova_compute[226886]: 2026-01-20 15:22:10.623 226890 DEBUG nova.network.neutron [-] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:22:10 np0005588920 nova_compute[226886]: 2026-01-20 15:22:10.641 226890 INFO nova.compute.manager [-] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Took 1.97 seconds to deallocate network for instance.#033[00m
Jan 20 10:22:10 np0005588920 nova_compute[226886]: 2026-01-20 15:22:10.687 226890 DEBUG oslo_concurrency.lockutils [None req-4786df9f-a52f-41ba-96eb-6572963dd3a0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:22:10 np0005588920 nova_compute[226886]: 2026-01-20 15:22:10.688 226890 DEBUG oslo_concurrency.lockutils [None req-4786df9f-a52f-41ba-96eb-6572963dd3a0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:22:10 np0005588920 nova_compute[226886]: 2026-01-20 15:22:10.701 226890 DEBUG nova.compute.manager [req-cd89c092-7c30-40cf-80fd-f772239817fa req-ae3efc23-57bd-4bad-8d80-a63d2d68ed98 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Received event network-vif-deleted-31dd33a4-8964-40e6-9bcf-0219c62004df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:22:10 np0005588920 nova_compute[226886]: 2026-01-20 15:22:10.729 226890 DEBUG oslo_concurrency.processutils [None req-4786df9f-a52f-41ba-96eb-6572963dd3a0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:22:10 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e417 e417: 3 total, 3 up, 3 in
Jan 20 10:22:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:11.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:11.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:11 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:22:11 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/971058922' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:22:11 np0005588920 nova_compute[226886]: 2026-01-20 15:22:11.168 226890 DEBUG oslo_concurrency.processutils [None req-4786df9f-a52f-41ba-96eb-6572963dd3a0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:22:11 np0005588920 nova_compute[226886]: 2026-01-20 15:22:11.173 226890 DEBUG nova.compute.provider_tree [None req-4786df9f-a52f-41ba-96eb-6572963dd3a0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:22:11 np0005588920 nova_compute[226886]: 2026-01-20 15:22:11.195 226890 DEBUG nova.scheduler.client.report [None req-4786df9f-a52f-41ba-96eb-6572963dd3a0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:22:11 np0005588920 nova_compute[226886]: 2026-01-20 15:22:11.221 226890 DEBUG oslo_concurrency.lockutils [None req-4786df9f-a52f-41ba-96eb-6572963dd3a0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.533s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:22:11 np0005588920 nova_compute[226886]: 2026-01-20 15:22:11.245 226890 INFO nova.scheduler.client.report [None req-4786df9f-a52f-41ba-96eb-6572963dd3a0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Deleted allocations for instance e6540d15-a33f-4638-b102-8a1629193c18#033[00m
Jan 20 10:22:11 np0005588920 nova_compute[226886]: 2026-01-20 15:22:11.313 226890 DEBUG oslo_concurrency.lockutils [None req-4786df9f-a52f-41ba-96eb-6572963dd3a0 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "e6540d15-a33f-4638-b102-8a1629193c18" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.549s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:22:11 np0005588920 nova_compute[226886]: 2026-01-20 15:22:11.859 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:22:11 np0005588920 nova_compute[226886]: 2026-01-20 15:22:11.860 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:22:11 np0005588920 nova_compute[226886]: 2026-01-20 15:22:11.873 226890 DEBUG nova.compute.manager [req-83d15537-f69d-4a84-a399-31d12643fcb2 req-f0d7b811-b673-468d-8ff3-70b28c4b27c9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Received event network-vif-plugged-31dd33a4-8964-40e6-9bcf-0219c62004df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:22:11 np0005588920 nova_compute[226886]: 2026-01-20 15:22:11.873 226890 DEBUG oslo_concurrency.lockutils [req-83d15537-f69d-4a84-a399-31d12643fcb2 req-f0d7b811-b673-468d-8ff3-70b28c4b27c9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "e6540d15-a33f-4638-b102-8a1629193c18-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:22:11 np0005588920 nova_compute[226886]: 2026-01-20 15:22:11.874 226890 DEBUG oslo_concurrency.lockutils [req-83d15537-f69d-4a84-a399-31d12643fcb2 req-f0d7b811-b673-468d-8ff3-70b28c4b27c9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "e6540d15-a33f-4638-b102-8a1629193c18-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:22:11 np0005588920 nova_compute[226886]: 2026-01-20 15:22:11.874 226890 DEBUG oslo_concurrency.lockutils [req-83d15537-f69d-4a84-a399-31d12643fcb2 req-f0d7b811-b673-468d-8ff3-70b28c4b27c9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "e6540d15-a33f-4638-b102-8a1629193c18-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:22:11 np0005588920 nova_compute[226886]: 2026-01-20 15:22:11.874 226890 DEBUG nova.compute.manager [req-83d15537-f69d-4a84-a399-31d12643fcb2 req-f0d7b811-b673-468d-8ff3-70b28c4b27c9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] No waiting events found dispatching network-vif-plugged-31dd33a4-8964-40e6-9bcf-0219c62004df pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:22:11 np0005588920 nova_compute[226886]: 2026-01-20 15:22:11.874 226890 WARNING nova.compute.manager [req-83d15537-f69d-4a84-a399-31d12643fcb2 req-f0d7b811-b673-468d-8ff3-70b28c4b27c9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Received unexpected event network-vif-plugged-31dd33a4-8964-40e6-9bcf-0219c62004df for instance with vm_state deleted and task_state None.#033[00m
Jan 20 10:22:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:22:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:13.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:22:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:13.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:13 np0005588920 nova_compute[226886]: 2026-01-20 15:22:13.260 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:13 np0005588920 nova_compute[226886]: 2026-01-20 15:22:13.302 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e417 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:22:13 np0005588920 nova_compute[226886]: 2026-01-20 15:22:13.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:22:13 np0005588920 nova_compute[226886]: 2026-01-20 15:22:13.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:22:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:22:14 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2030652367' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:22:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:22:14 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2030652367' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:22:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:22:14.595 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '68'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:22:14 np0005588920 nova_compute[226886]: 2026-01-20 15:22:14.721 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:22:14 np0005588920 nova_compute[226886]: 2026-01-20 15:22:14.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:22:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:22:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:22:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:22:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:15.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:22:15 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:15.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:22:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:22:16.483 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:22:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:22:16.484 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:22:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:22:16.484 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:22:16 np0005588920 nova_compute[226886]: 2026-01-20 15:22:16.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:22:16 np0005588920 nova_compute[226886]: 2026-01-20 15:22:16.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:22:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:22:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:17 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:17.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:22:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:17.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:22:18 np0005588920 nova_compute[226886]: 2026-01-20 15:22:18.262 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:18 np0005588920 nova_compute[226886]: 2026-01-20 15:22:18.304 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:18 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e417 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:22:18 np0005588920 podman[301455]: 2026-01-20 15:22:18.966564599 +0000 UTC m=+0.045138806 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Jan 20 10:22:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:22:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:19.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:22:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:19.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:21.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:21.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:22 np0005588920 nova_compute[226886]: 2026-01-20 15:22:22.430 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:22 np0005588920 nova_compute[226886]: 2026-01-20 15:22:22.509 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 e418: 3 total, 3 up, 3 in
Jan 20 10:22:23 np0005588920 nova_compute[226886]: 2026-01-20 15:22:23.010 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768922528.008351, e6540d15-a33f-4638-b102-8a1629193c18 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:22:23 np0005588920 nova_compute[226886]: 2026-01-20 15:22:23.010 226890 INFO nova.compute.manager [-] [instance: e6540d15-a33f-4638-b102-8a1629193c18] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:22:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:22:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:23.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:22:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:23.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:23 np0005588920 nova_compute[226886]: 2026-01-20 15:22:23.034 226890 DEBUG nova.compute.manager [None req-89363d5f-e21e-4169-959c-44de12c30f73 - - - - - -] [instance: e6540d15-a33f-4638-b102-8a1629193c18] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:22:23 np0005588920 nova_compute[226886]: 2026-01-20 15:22:23.265 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:23 np0005588920 nova_compute[226886]: 2026-01-20 15:22:23.306 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:23 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:22:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:22:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:25.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:22:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:25.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:27.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:27.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:28 np0005588920 nova_compute[226886]: 2026-01-20 15:22:28.266 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:28 np0005588920 nova_compute[226886]: 2026-01-20 15:22:28.307 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:28 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:22:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:29.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:29.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:31.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:31.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:33.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:33.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:33 np0005588920 nova_compute[226886]: 2026-01-20 15:22:33.268 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:33 np0005588920 nova_compute[226886]: 2026-01-20 15:22:33.309 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:33 np0005588920 podman[301648]: 2026-01-20 15:22:33.409565224 +0000 UTC m=+0.055518181 container exec 6d4eaf8659f13902200468a4ae46f22a0824452b9353ff1491898f5d3b0130c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mon-compute-2, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 20 10:22:33 np0005588920 podman[301648]: 2026-01-20 15:22:33.528556166 +0000 UTC m=+0.174509103 container exec_died 6d4eaf8659f13902200468a4ae46f22a0824452b9353ff1491898f5d3b0130c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mon-compute-2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_REF=reef, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 20 10:22:33 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:22:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:35.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:35.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:36 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:22:36 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:22:36 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:22:36 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:22:36 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:22:36 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:22:36 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:22:36 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:22:36 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:22:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:37.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:22:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:37.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:22:38 np0005588920 nova_compute[226886]: 2026-01-20 15:22:38.270 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:38 np0005588920 nova_compute[226886]: 2026-01-20 15:22:38.311 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:38 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:22:39 np0005588920 podman[301898]: 2026-01-20 15:22:39.009611954 +0000 UTC m=+0.086551327 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 20 10:22:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:39.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:39.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:41.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:41.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:42 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:22:42 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:22:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:43.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:43.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:43 np0005588920 nova_compute[226886]: 2026-01-20 15:22:43.272 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:43 np0005588920 nova_compute[226886]: 2026-01-20 15:22:43.313 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:43 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:22:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:22:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:45.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:22:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:45.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:22:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:47.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:22:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:47.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:48 np0005588920 nova_compute[226886]: 2026-01-20 15:22:48.275 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:48 np0005588920 nova_compute[226886]: 2026-01-20 15:22:48.314 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:48 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:22:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:49.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:49.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:49 np0005588920 podman[301977]: 2026-01-20 15:22:49.977133242 +0000 UTC m=+0.050620381 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 20 10:22:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:51.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:51.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:53.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:53.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:53 np0005588920 nova_compute[226886]: 2026-01-20 15:22:53.277 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:53 np0005588920 nova_compute[226886]: 2026-01-20 15:22:53.318 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:53 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:22:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:22:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:22:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:55.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:22:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:22:55 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:55.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:22:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:22:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:57 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:57.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:57.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:58 np0005588920 nova_compute[226886]: 2026-01-20 15:22:58.280 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:58 np0005588920 nova_compute[226886]: 2026-01-20 15:22:58.321 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:58 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:22:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:59.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:22:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:59.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:23:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:01.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:23:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:23:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:01.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:23:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:23:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:23:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:03.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:23:03 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:03.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:23:03 np0005588920 nova_compute[226886]: 2026-01-20 15:23:03.281 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:03 np0005588920 nova_compute[226886]: 2026-01-20 15:23:03.323 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:03 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:23:03 np0005588920 nova_compute[226886]: 2026-01-20 15:23:03.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:23:03 np0005588920 nova_compute[226886]: 2026-01-20 15:23:03.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:23:03 np0005588920 nova_compute[226886]: 2026-01-20 15:23:03.726 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:23:03 np0005588920 nova_compute[226886]: 2026-01-20 15:23:03.739 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:23:04 np0005588920 nova_compute[226886]: 2026-01-20 15:23:04.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:23:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:23:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:23:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:05.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:05 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:05.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:05 np0005588920 nova_compute[226886]: 2026-01-20 15:23:05.292 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:23:05 np0005588920 nova_compute[226886]: 2026-01-20 15:23:05.293 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:23:05 np0005588920 nova_compute[226886]: 2026-01-20 15:23:05.293 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:23:05 np0005588920 nova_compute[226886]: 2026-01-20 15:23:05.293 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:23:05 np0005588920 nova_compute[226886]: 2026-01-20 15:23:05.293 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:23:05 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:23:05 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/147687748' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:23:05 np0005588920 nova_compute[226886]: 2026-01-20 15:23:05.786 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.492s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:23:05 np0005588920 nova_compute[226886]: 2026-01-20 15:23:05.949 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:23:05 np0005588920 nova_compute[226886]: 2026-01-20 15:23:05.950 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4149MB free_disk=20.96752166748047GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:23:05 np0005588920 nova_compute[226886]: 2026-01-20 15:23:05.950 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:23:05 np0005588920 nova_compute[226886]: 2026-01-20 15:23:05.951 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:23:06 np0005588920 nova_compute[226886]: 2026-01-20 15:23:06.010 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:23:06 np0005588920 nova_compute[226886]: 2026-01-20 15:23:06.010 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:23:06 np0005588920 nova_compute[226886]: 2026-01-20 15:23:06.026 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:23:06 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:23:06 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2287325465' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:23:06 np0005588920 nova_compute[226886]: 2026-01-20 15:23:06.482 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:23:06 np0005588920 nova_compute[226886]: 2026-01-20 15:23:06.487 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:23:06 np0005588920 nova_compute[226886]: 2026-01-20 15:23:06.501 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:23:06 np0005588920 nova_compute[226886]: 2026-01-20 15:23:06.525 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:23:06 np0005588920 nova_compute[226886]: 2026-01-20 15:23:06.526 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.575s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:23:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:23:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:23:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:23:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:07.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:23:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:07 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:07.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:08 np0005588920 nova_compute[226886]: 2026-01-20 15:23:08.283 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:08 np0005588920 nova_compute[226886]: 2026-01-20 15:23:08.323 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:08 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:23:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:23:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:09.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:23:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:23:09 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:09.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:23:10 np0005588920 podman[302041]: 2026-01-20 15:23:10.003808728 +0000 UTC m=+0.091101206 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 20 10:23:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:23:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:23:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:23:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:11.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:23:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:23:11 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:11.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:23:11 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:23:11.201 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=69, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=68) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:23:11 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:23:11.202 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:23:11 np0005588920 nova_compute[226886]: 2026-01-20 15:23:11.203 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:11 np0005588920 nova_compute[226886]: 2026-01-20 15:23:11.526 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:23:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:23:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:23:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:23:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:13.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:23:13 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:13.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:13 np0005588920 nova_compute[226886]: 2026-01-20 15:23:13.286 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:13 np0005588920 nova_compute[226886]: 2026-01-20 15:23:13.325 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:23:13 np0005588920 nova_compute[226886]: 2026-01-20 15:23:13.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:23:14 np0005588920 nova_compute[226886]: 2026-01-20 15:23:14.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:23:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:23:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:15.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:23:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:15.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:15 np0005588920 nova_compute[226886]: 2026-01-20 15:23:15.720 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:23:15 np0005588920 nova_compute[226886]: 2026-01-20 15:23:15.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:23:15 np0005588920 nova_compute[226886]: 2026-01-20 15:23:15.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:23:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:23:16.484 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:23:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:23:16.484 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:23:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:23:16.484 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:23:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:23:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:23:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:23:17 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:17.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:23:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:23:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:17.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:23:17 np0005588920 ovn_controller[133971]: 2026-01-20T15:23:17Z|00938|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Jan 20 10:23:18 np0005588920 nova_compute[226886]: 2026-01-20 15:23:18.288 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:18 np0005588920 nova_compute[226886]: 2026-01-20 15:23:18.327 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:18 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:23:18 np0005588920 nova_compute[226886]: 2026-01-20 15:23:18.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:23:18 np0005588920 nova_compute[226886]: 2026-01-20 15:23:18.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:23:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:23:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:19.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:23:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:19 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:19.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:23:19.204 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '69'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 10:23:19 np0005588920 nova_compute[226886]: 2026-01-20 15:23:19.722 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:23:20 np0005588920 podman[302071]: 2026-01-20 15:23:20.961116609 +0000 UTC m=+0.053353578 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 20 10:23:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:23:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:23:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:21.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:23:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:23:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:21 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:21.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:23:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:23:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:23.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:23:23 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:23.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:23:23 np0005588920 nova_compute[226886]: 2026-01-20 15:23:23.289 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:23:23 np0005588920 nova_compute[226886]: 2026-01-20 15:23:23.330 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:23:23 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:23:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:23:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:23:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:23:25 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:25.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:23:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:23:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:25.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:23:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:23:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:23:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:27.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:23:27 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:27.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:23:28 np0005588920 nova_compute[226886]: 2026-01-20 15:23:28.292 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:23:28 np0005588920 nova_compute[226886]: 2026-01-20 15:23:28.331 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:23:28 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:23:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:23:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:29.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:23:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:29.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:30 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #148. Immutable memtables: 0.
Jan 20 10:23:30 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:23:30.525780) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:23:30 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 93] Flushing memtable with next log file: 148
Jan 20 10:23:30 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922610525879, "job": 93, "event": "flush_started", "num_memtables": 1, "num_entries": 1817, "num_deletes": 252, "total_data_size": 4124514, "memory_usage": 4182024, "flush_reason": "Manual Compaction"}
Jan 20 10:23:30 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 93] Level-0 flush table #149: started
Jan 20 10:23:30 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922610556399, "cf_name": "default", "job": 93, "event": "table_file_creation", "file_number": 149, "file_size": 2698868, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 72366, "largest_seqno": 74178, "table_properties": {"data_size": 2691378, "index_size": 4368, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2053, "raw_key_size": 16277, "raw_average_key_size": 20, "raw_value_size": 2676151, "raw_average_value_size": 3361, "num_data_blocks": 190, "num_entries": 796, "num_filter_entries": 796, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768922462, "oldest_key_time": 1768922462, "file_creation_time": 1768922610, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 149, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:23:30 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 93] Flush lasted 30753 microseconds, and 7140 cpu microseconds.
Jan 20 10:23:30 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:23:30 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:23:30.556539) [db/flush_job.cc:967] [default] [JOB 93] Level-0 flush table #149: 2698868 bytes OK
Jan 20 10:23:30 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:23:30.556586) [db/memtable_list.cc:519] [default] Level-0 commit table #149 started
Jan 20 10:23:30 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:23:30.559244) [db/memtable_list.cc:722] [default] Level-0 commit table #149: memtable #1 done
Jan 20 10:23:30 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:23:30.559259) EVENT_LOG_v1 {"time_micros": 1768922610559255, "job": 93, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:23:30 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:23:30.559278) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:23:30 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 93] Try to delete WAL files size 4116259, prev total WAL file size 4116259, number of live WAL files 2.
Jan 20 10:23:30 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000145.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:23:30 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:23:30.560821) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730036323735' seq:72057594037927935, type:22 .. '7061786F730036353237' seq:0, type:0; will stop at (end)
Jan 20 10:23:30 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 94] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:23:30 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 93 Base level 0, inputs: [149(2635KB)], [147(10MB)]
Jan 20 10:23:30 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922610560909, "job": 94, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [149], "files_L6": [147], "score": -1, "input_data_size": 13196557, "oldest_snapshot_seqno": -1}
Jan 20 10:23:30 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 94] Generated table #150: 9669 keys, 11330181 bytes, temperature: kUnknown
Jan 20 10:23:30 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922610680910, "cf_name": "default", "job": 94, "event": "table_file_creation", "file_number": 150, "file_size": 11330181, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11268991, "index_size": 35925, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 24197, "raw_key_size": 254448, "raw_average_key_size": 26, "raw_value_size": 11100622, "raw_average_value_size": 1148, "num_data_blocks": 1363, "num_entries": 9669, "num_filter_entries": 9669, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768922610, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 150, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:23:30 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:23:30 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:23:30.681181) [db/compaction/compaction_job.cc:1663] [default] [JOB 94] Compacted 1@0 + 1@6 files to L6 => 11330181 bytes
Jan 20 10:23:30 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:23:30.713005) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 109.9 rd, 94.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.6, 10.0 +0.0 blob) out(10.8 +0.0 blob), read-write-amplify(9.1) write-amplify(4.2) OK, records in: 10194, records dropped: 525 output_compression: NoCompression
Jan 20 10:23:30 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:23:30.713038) EVENT_LOG_v1 {"time_micros": 1768922610713026, "job": 94, "event": "compaction_finished", "compaction_time_micros": 120079, "compaction_time_cpu_micros": 26380, "output_level": 6, "num_output_files": 1, "total_output_size": 11330181, "num_input_records": 10194, "num_output_records": 9669, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:23:30 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000149.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:23:30 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922610713779, "job": 94, "event": "table_file_deletion", "file_number": 149}
Jan 20 10:23:30 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000147.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:23:30 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922610715574, "job": 94, "event": "table_file_deletion", "file_number": 147}
Jan 20 10:23:30 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:23:30.560677) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:23:30 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:23:30.715611) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:23:30 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:23:30.715615) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:23:30 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:23:30.715616) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:23:30 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:23:30.715617) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:23:30 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:23:30.715619) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:23:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:23:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:23:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:31 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:31.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:31.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:23:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:33.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:23:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:23:33 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:33.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:23:33 np0005588920 nova_compute[226886]: 2026-01-20 15:23:33.294 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:23:33 np0005588920 nova_compute[226886]: 2026-01-20 15:23:33.333 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:23:33 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:23:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:23:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:35.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:23:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:23:35 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:35.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:23:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:23:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:23:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:37.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:23:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:23:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:37.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:38 np0005588920 nova_compute[226886]: 2026-01-20 15:23:38.296 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:23:38 np0005588920 nova_compute[226886]: 2026-01-20 15:23:38.334 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:23:38 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:23:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:23:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:23:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:39.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:23:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:23:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:39.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:40 np0005588920 podman[302092]: 2026-01-20 15:23:40.988966417 +0000 UTC m=+0.077411259 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, 
container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 20 10:23:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:23:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:41.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:23:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:41.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:43 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:23:43 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:23:43 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:23:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:23:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:23:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:43.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:23:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:23:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:43.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:43 np0005588920 nova_compute[226886]: 2026-01-20 15:23:43.298 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:23:43 np0005588920 nova_compute[226886]: 2026-01-20 15:23:43.336 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:23:43 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:23:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:23:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:45.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:23:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:23:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:45.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:23:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:23:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:47.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:23:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:47.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:48 np0005588920 nova_compute[226886]: 2026-01-20 15:23:48.300 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:23:48 np0005588920 nova_compute[226886]: 2026-01-20 15:23:48.338 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:23:48 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:23:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:23:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:49.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:23:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:49.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:50 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:23:50 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:23:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:23:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:51.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:23:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:51.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:51 np0005588920 podman[302299]: 2026-01-20 15:23:51.959701116 +0000 UTC m=+0.049631834 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Jan 20 10:23:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:23:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:23:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:53.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:23:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:23:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:53.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:53 np0005588920 nova_compute[226886]: 2026-01-20 15:23:53.302 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:53 np0005588920 nova_compute[226886]: 2026-01-20 15:23:53.339 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:53 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:23:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:23:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:23:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:55.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:23:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:23:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:23:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:55.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:23:56 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:23:56.155 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=70, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=69) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:23:56 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:23:56.156 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:23:56 np0005588920 nova_compute[226886]: 2026-01-20 15:23:56.156 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:56 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:23:56.157 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '70'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:23:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:23:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:57.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:23:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:23:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:57.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:23:58 np0005588920 nova_compute[226886]: 2026-01-20 15:23:58.304 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:58 np0005588920 nova_compute[226886]: 2026-01-20 15:23:58.341 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:58 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:23:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:23:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:23:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:59.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:23:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:23:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:23:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:59.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:24:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:01.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:01.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:03.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:24:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:03.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:24:03 np0005588920 nova_compute[226886]: 2026-01-20 15:24:03.306 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:24:03 np0005588920 nova_compute[226886]: 2026-01-20 15:24:03.344 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:24:03 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:24:03 np0005588920 nova_compute[226886]: 2026-01-20 15:24:03.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:24:03 np0005588920 nova_compute[226886]: 2026-01-20 15:24:03.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:24:03 np0005588920 nova_compute[226886]: 2026-01-20 15:24:03.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:24:03 np0005588920 nova_compute[226886]: 2026-01-20 15:24:03.742 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:24:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:05.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:05.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:05 np0005588920 nova_compute[226886]: 2026-01-20 15:24:05.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:24:05 np0005588920 nova_compute[226886]: 2026-01-20 15:24:05.766 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:24:05 np0005588920 nova_compute[226886]: 2026-01-20 15:24:05.766 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:24:05 np0005588920 nova_compute[226886]: 2026-01-20 15:24:05.766 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:24:05 np0005588920 nova_compute[226886]: 2026-01-20 15:24:05.766 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:24:05 np0005588920 nova_compute[226886]: 2026-01-20 15:24:05.767 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:24:06 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:24:06 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/861187482' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:24:06 np0005588920 nova_compute[226886]: 2026-01-20 15:24:06.218 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:24:06 np0005588920 nova_compute[226886]: 2026-01-20 15:24:06.375 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:24:06 np0005588920 nova_compute[226886]: 2026-01-20 15:24:06.376 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4156MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:24:06 np0005588920 nova_compute[226886]: 2026-01-20 15:24:06.377 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:24:06 np0005588920 nova_compute[226886]: 2026-01-20 15:24:06.377 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:24:06 np0005588920 nova_compute[226886]: 2026-01-20 15:24:06.487 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:24:06 np0005588920 nova_compute[226886]: 2026-01-20 15:24:06.487 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:24:06 np0005588920 nova_compute[226886]: 2026-01-20 15:24:06.502 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Refreshing inventories for resource provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 20 10:24:06 np0005588920 nova_compute[226886]: 2026-01-20 15:24:06.525 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Updating ProviderTree inventory for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 20 10:24:06 np0005588920 nova_compute[226886]: 2026-01-20 15:24:06.525 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Updating inventory in ProviderTree for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 20 10:24:06 np0005588920 nova_compute[226886]: 2026-01-20 15:24:06.541 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Refreshing aggregate associations for resource provider ff38e91c-3320-4831-90ac-bcffc89ba7b6, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 20 10:24:06 np0005588920 nova_compute[226886]: 2026-01-20 15:24:06.565 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Refreshing trait associations for resource provider ff38e91c-3320-4831-90ac-bcffc89ba7b6, traits: COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_DEVICE_TAGGING,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 20 10:24:06 np0005588920 nova_compute[226886]: 2026-01-20 15:24:06.582 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:24:06 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:24:06 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/95525792' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:24:07 np0005588920 nova_compute[226886]: 2026-01-20 15:24:07.009 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.426s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:24:07 np0005588920 nova_compute[226886]: 2026-01-20 15:24:07.016 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:24:07 np0005588920 nova_compute[226886]: 2026-01-20 15:24:07.035 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:24:07 np0005588920 nova_compute[226886]: 2026-01-20 15:24:07.037 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:24:07 np0005588920 nova_compute[226886]: 2026-01-20 15:24:07.037 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.661s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:24:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:07.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:07.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:08 np0005588920 nova_compute[226886]: 2026-01-20 15:24:08.308 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:24:08 np0005588920 nova_compute[226886]: 2026-01-20 15:24:08.345 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:24:08 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:24:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:09.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:09.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:11.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:11.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:11 np0005588920 podman[302363]: 2026-01-20 15:24:11.996168908 +0000 UTC m=+0.089252003 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 20 10:24:12 np0005588920 nova_compute[226886]: 2026-01-20 15:24:12.038 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:24:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:13.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:24:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:13.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:24:13 np0005588920 nova_compute[226886]: 2026-01-20 15:24:13.312 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:24:13 np0005588920 nova_compute[226886]: 2026-01-20 15:24:13.347 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:24:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:24:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:24:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2614624376' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:24:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:24:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2614624376' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:24:14 np0005588920 nova_compute[226886]: 2026-01-20 15:24:14.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:24:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:15.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:24:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:15.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:24:15 np0005588920 nova_compute[226886]: 2026-01-20 15:24:15.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:24:15 np0005588920 nova_compute[226886]: 2026-01-20 15:24:15.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:24:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:24:16.485 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:24:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:24:16.485 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:24:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:24:16.485 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:24:16 np0005588920 nova_compute[226886]: 2026-01-20 15:24:16.721 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:24:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:17.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:17.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:17 np0005588920 nova_compute[226886]: 2026-01-20 15:24:17.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:24:18 np0005588920 nova_compute[226886]: 2026-01-20 15:24:18.314 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:24:18 np0005588920 nova_compute[226886]: 2026-01-20 15:24:18.350 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:24:18 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:24:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:19.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:19.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:19 np0005588920 nova_compute[226886]: 2026-01-20 15:24:19.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:24:19 np0005588920 nova_compute[226886]: 2026-01-20 15:24:19.726 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:24:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:21.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:21.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:23 np0005588920 podman[302394]: 2026-01-20 15:24:23.005135938 +0000 UTC m=+0.086636970 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 20 10:24:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:23.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:23.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:23 np0005588920 nova_compute[226886]: 2026-01-20 15:24:23.315 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:24:23 np0005588920 nova_compute[226886]: 2026-01-20 15:24:23.352 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:24:23 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:24:23 np0005588920 nova_compute[226886]: 2026-01-20 15:24:23.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:24:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:25.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:24:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:25.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:24:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:27.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:27.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:28 np0005588920 nova_compute[226886]: 2026-01-20 15:24:28.317 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:24:28 np0005588920 nova_compute[226886]: 2026-01-20 15:24:28.354 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:24:28 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:24:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:29.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:29.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:31.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:31.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:33.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:33.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:33 np0005588920 nova_compute[226886]: 2026-01-20 15:24:33.320 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:24:33 np0005588920 nova_compute[226886]: 2026-01-20 15:24:33.356 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:24:33 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:24:33 np0005588920 nova_compute[226886]: 2026-01-20 15:24:33.738 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:24:33 np0005588920 nova_compute[226886]: 2026-01-20 15:24:33.739 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 20 10:24:33 np0005588920 nova_compute[226886]: 2026-01-20 15:24:33.762 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 20 10:24:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:24:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:35.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:24:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:35.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:35 np0005588920 nova_compute[226886]: 2026-01-20 15:24:35.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:24:35 np0005588920 nova_compute[226886]: 2026-01-20 15:24:35.726 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 20 10:24:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:37.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:24:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:37.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:24:38 np0005588920 nova_compute[226886]: 2026-01-20 15:24:38.322 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:24:38 np0005588920 nova_compute[226886]: 2026-01-20 15:24:38.359 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:24:38 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:24:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:39.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:39.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:41.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:41.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:43 np0005588920 podman[302415]: 2026-01-20 15:24:43.000705313 +0000 UTC m=+0.080860476 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller)
Jan 20 10:24:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:43.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:43.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:43 np0005588920 nova_compute[226886]: 2026-01-20 15:24:43.323 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:24:43 np0005588920 nova_compute[226886]: 2026-01-20 15:24:43.360 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:24:43 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:24:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:24:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:45.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:24:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:24:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:45.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:24:45 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #58. Immutable memtables: 14.
Jan 20 10:24:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:47.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:47.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:48 np0005588920 nova_compute[226886]: 2026-01-20 15:24:48.325 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:24:48 np0005588920 nova_compute[226886]: 2026-01-20 15:24:48.362 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:24:48 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:24:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:24:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:49.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:24:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:24:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:49.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:24:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:51.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:51.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:51 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:24:51 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:24:51 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:24:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:53.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:53.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:53 np0005588920 nova_compute[226886]: 2026-01-20 15:24:53.328 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:24:53 np0005588920 nova_compute[226886]: 2026-01-20 15:24:53.365 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:24:53 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:24:53 np0005588920 podman[302574]: 2026-01-20 15:24:53.964288222 +0000 UTC m=+0.053523804 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 20 10:24:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:55.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:55.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:57.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:57.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:57 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:24:57 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:24:58 np0005588920 nova_compute[226886]: 2026-01-20 15:24:58.331 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:24:58 np0005588920 nova_compute[226886]: 2026-01-20 15:24:58.366 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:24:58 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:24:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:59.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:24:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:59.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:01.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:01.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:25:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:03.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:25:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:03.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:03 np0005588920 nova_compute[226886]: 2026-01-20 15:25:03.332 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:25:03.391 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=71, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=70) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:25:03 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:25:03.391 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:25:03 np0005588920 nova_compute[226886]: 2026-01-20 15:25:03.414 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:03 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:25:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:25:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:05.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:25:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:05.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:05 np0005588920 nova_compute[226886]: 2026-01-20 15:25:05.742 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:25:05 np0005588920 nova_compute[226886]: 2026-01-20 15:25:05.743 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:25:05 np0005588920 nova_compute[226886]: 2026-01-20 15:25:05.743 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:25:05 np0005588920 nova_compute[226886]: 2026-01-20 15:25:05.760 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:25:06 np0005588920 nova_compute[226886]: 2026-01-20 15:25:06.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:25:06 np0005588920 nova_compute[226886]: 2026-01-20 15:25:06.757 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:25:06 np0005588920 nova_compute[226886]: 2026-01-20 15:25:06.758 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:25:06 np0005588920 nova_compute[226886]: 2026-01-20 15:25:06.758 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:25:06 np0005588920 nova_compute[226886]: 2026-01-20 15:25:06.758 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:25:06 np0005588920 nova_compute[226886]: 2026-01-20 15:25:06.759 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:25:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:25:07 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1643207792' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:25:07 np0005588920 nova_compute[226886]: 2026-01-20 15:25:07.229 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:25:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:07.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:07.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:07 np0005588920 nova_compute[226886]: 2026-01-20 15:25:07.380 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:25:07 np0005588920 nova_compute[226886]: 2026-01-20 15:25:07.381 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4168MB free_disk=20.977684020996094GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:25:07 np0005588920 nova_compute[226886]: 2026-01-20 15:25:07.381 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:25:07 np0005588920 nova_compute[226886]: 2026-01-20 15:25:07.381 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:25:07 np0005588920 nova_compute[226886]: 2026-01-20 15:25:07.444 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:25:07 np0005588920 nova_compute[226886]: 2026-01-20 15:25:07.445 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:25:07 np0005588920 nova_compute[226886]: 2026-01-20 15:25:07.473 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:25:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:25:07 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/216630041' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:25:07 np0005588920 nova_compute[226886]: 2026-01-20 15:25:07.896 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:25:07 np0005588920 nova_compute[226886]: 2026-01-20 15:25:07.901 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:25:08 np0005588920 nova_compute[226886]: 2026-01-20 15:25:08.141 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:25:08 np0005588920 nova_compute[226886]: 2026-01-20 15:25:08.142 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:25:08 np0005588920 nova_compute[226886]: 2026-01-20 15:25:08.143 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.761s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:25:08 np0005588920 nova_compute[226886]: 2026-01-20 15:25:08.335 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:08 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:25:08.393 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '71'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:25:08 np0005588920 nova_compute[226886]: 2026-01-20 15:25:08.418 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:08 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:25:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:09.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:09.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:11.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:11.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:12 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #151. Immutable memtables: 0.
Jan 20 10:25:12 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:25:12.992314) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:25:12 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 95] Flushing memtable with next log file: 151
Jan 20 10:25:12 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922712992352, "job": 95, "event": "flush_started", "num_memtables": 1, "num_entries": 1229, "num_deletes": 256, "total_data_size": 2642663, "memory_usage": 2679472, "flush_reason": "Manual Compaction"}
Jan 20 10:25:12 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 95] Level-0 flush table #152: started
Jan 20 10:25:13 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922713007159, "cf_name": "default", "job": 95, "event": "table_file_creation", "file_number": 152, "file_size": 1732615, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 74183, "largest_seqno": 75407, "table_properties": {"data_size": 1727336, "index_size": 2738, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1477, "raw_key_size": 11445, "raw_average_key_size": 19, "raw_value_size": 1716618, "raw_average_value_size": 2919, "num_data_blocks": 122, "num_entries": 588, "num_filter_entries": 588, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768922611, "oldest_key_time": 1768922611, "file_creation_time": 1768922712, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 152, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:25:13 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 95] Flush lasted 14916 microseconds, and 3997 cpu microseconds.
Jan 20 10:25:13 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:25:13 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:25:13.007228) [db/flush_job.cc:967] [default] [JOB 95] Level-0 flush table #152: 1732615 bytes OK
Jan 20 10:25:13 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:25:13.007243) [db/memtable_list.cc:519] [default] Level-0 commit table #152 started
Jan 20 10:25:13 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:25:13.010673) [db/memtable_list.cc:722] [default] Level-0 commit table #152: memtable #1 done
Jan 20 10:25:13 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:25:13.010685) EVENT_LOG_v1 {"time_micros": 1768922713010682, "job": 95, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:25:13 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:25:13.010729) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:25:13 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 95] Try to delete WAL files size 2636780, prev total WAL file size 2653429, number of live WAL files 2.
Jan 20 10:25:13 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000148.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:25:13 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:25:13.011332) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0032373731' seq:72057594037927935, type:22 .. '6C6F676D0033303233' seq:0, type:0; will stop at (end)
Jan 20 10:25:13 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 96] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:25:13 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 95 Base level 0, inputs: [152(1692KB)], [150(10MB)]
Jan 20 10:25:13 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922713011359, "job": 96, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [152], "files_L6": [150], "score": -1, "input_data_size": 13062796, "oldest_snapshot_seqno": -1}
Jan 20 10:25:13 np0005588920 nova_compute[226886]: 2026-01-20 15:25:13.143 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:25:13 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 96] Generated table #153: 9732 keys, 12927240 bytes, temperature: kUnknown
Jan 20 10:25:13 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922713161339, "cf_name": "default", "job": 96, "event": "table_file_creation", "file_number": 153, "file_size": 12927240, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12863714, "index_size": 38069, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 24389, "raw_key_size": 256714, "raw_average_key_size": 26, "raw_value_size": 12692356, "raw_average_value_size": 1304, "num_data_blocks": 1454, "num_entries": 9732, "num_filter_entries": 9732, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768922713, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 153, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:25:13 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:25:13 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:25:13.161620) [db/compaction/compaction_job.cc:1663] [default] [JOB 96] Compacted 1@0 + 1@6 files to L6 => 12927240 bytes
Jan 20 10:25:13 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:25:13.163341) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 87.0 rd, 86.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 10.8 +0.0 blob) out(12.3 +0.0 blob), read-write-amplify(15.0) write-amplify(7.5) OK, records in: 10257, records dropped: 525 output_compression: NoCompression
Jan 20 10:25:13 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:25:13.163367) EVENT_LOG_v1 {"time_micros": 1768922713163355, "job": 96, "event": "compaction_finished", "compaction_time_micros": 150061, "compaction_time_cpu_micros": 30817, "output_level": 6, "num_output_files": 1, "total_output_size": 12927240, "num_input_records": 10257, "num_output_records": 9732, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:25:13 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000152.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:25:13 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922713163807, "job": 96, "event": "table_file_deletion", "file_number": 152}
Jan 20 10:25:13 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000150.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:25:13 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922713166065, "job": 96, "event": "table_file_deletion", "file_number": 150}
Jan 20 10:25:13 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:25:13.011271) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:25:13 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:25:13.166187) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:25:13 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:25:13.166240) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:25:13 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:25:13.166244) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:25:13 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:25:13.166247) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:25:13 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:25:13.166250) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:25:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:25:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:13.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:25:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:25:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:13.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:25:13 np0005588920 nova_compute[226886]: 2026-01-20 15:25:13.338 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:13 np0005588920 nova_compute[226886]: 2026-01-20 15:25:13.420 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:25:13 np0005588920 podman[302689]: 2026-01-20 15:25:13.992368549 +0000 UTC m=+0.073028496 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:25:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:15.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:15.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:15 np0005588920 nova_compute[226886]: 2026-01-20 15:25:15.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:25:15 np0005588920 nova_compute[226886]: 2026-01-20 15:25:15.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:25:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:25:16.485 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:25:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:25:16.486 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:25:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:25:16.486 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:25:16 np0005588920 nova_compute[226886]: 2026-01-20 15:25:16.720 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:25:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:17.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:17.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:17 np0005588920 nova_compute[226886]: 2026-01-20 15:25:17.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:25:18 np0005588920 nova_compute[226886]: 2026-01-20 15:25:18.340 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:18 np0005588920 nova_compute[226886]: 2026-01-20 15:25:18.421 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:18 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:25:18 np0005588920 nova_compute[226886]: 2026-01-20 15:25:18.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:25:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:25:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:19.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:25:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:19.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:19 np0005588920 nova_compute[226886]: 2026-01-20 15:25:19.720 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:25:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:25:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:21.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:25:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:21.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:21 np0005588920 nova_compute[226886]: 2026-01-20 15:25:21.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:25:21 np0005588920 nova_compute[226886]: 2026-01-20 15:25:21.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:25:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:23.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:25:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:23.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:25:23 np0005588920 nova_compute[226886]: 2026-01-20 15:25:23.342 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:23 np0005588920 nova_compute[226886]: 2026-01-20 15:25:23.422 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:23 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:25:24 np0005588920 podman[302715]: 2026-01-20 15:25:24.95322903 +0000 UTC m=+0.043103530 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:25:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:25.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:25.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:27.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:27.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:28 np0005588920 nova_compute[226886]: 2026-01-20 15:25:28.344 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:28 np0005588920 nova_compute[226886]: 2026-01-20 15:25:28.423 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:28 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:25:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:25:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:29.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:25:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:29.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:25:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:31.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:25:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:31.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:33.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:33.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:33 np0005588920 nova_compute[226886]: 2026-01-20 15:25:33.346 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:33 np0005588920 nova_compute[226886]: 2026-01-20 15:25:33.424 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:33 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:25:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:25:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:35 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:35.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:35.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:25:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:37.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:37 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:37.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:38 np0005588920 nova_compute[226886]: 2026-01-20 15:25:38.349 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:38 np0005588920 nova_compute[226886]: 2026-01-20 15:25:38.425 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:38 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:25:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:39.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:25:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:39.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:25:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:41.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:41.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:43 np0005588920 nova_compute[226886]: 2026-01-20 15:25:43.352 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:43.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:25:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:43.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:25:43 np0005588920 nova_compute[226886]: 2026-01-20 15:25:43.427 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:43 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:25:44 np0005588920 podman[302735]: 2026-01-20 15:25:44.984982819 +0000 UTC m=+0.073453678 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 20 10:25:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:45.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:45.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:25:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:47.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:25:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:25:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:47.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:25:48 np0005588920 nova_compute[226886]: 2026-01-20 15:25:48.353 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:48 np0005588920 nova_compute[226886]: 2026-01-20 15:25:48.471 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:48 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:25:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:25:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:49.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:25:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:25:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:49.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:25:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:25:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:51.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:25:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:51.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:53 np0005588920 nova_compute[226886]: 2026-01-20 15:25:53.356 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:53.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:53.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:53 np0005588920 nova_compute[226886]: 2026-01-20 15:25:53.474 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:53 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:25:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:25:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:55.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:25:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:55.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:55 np0005588920 podman[302764]: 2026-01-20 15:25:55.963114838 +0000 UTC m=+0.054112871 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 20 10:25:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:57.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:57.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:58 np0005588920 nova_compute[226886]: 2026-01-20 15:25:58.358 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:58 np0005588920 nova_compute[226886]: 2026-01-20 15:25:58.476 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:58 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:25:58 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:25:58 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:25:58 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:25:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:25:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:59.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:25:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:25:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:59.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:01.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:26:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:01.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:26:03 np0005588920 nova_compute[226886]: 2026-01-20 15:26:03.360 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:03.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:26:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:03.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:26:03 np0005588920 nova_compute[226886]: 2026-01-20 15:26:03.479 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:03 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:26:04 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:26:04 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:26:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:05.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:05.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:05 np0005588920 nova_compute[226886]: 2026-01-20 15:26:05.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:26:05 np0005588920 nova_compute[226886]: 2026-01-20 15:26:05.726 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:26:05 np0005588920 nova_compute[226886]: 2026-01-20 15:26:05.726 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:26:05 np0005588920 nova_compute[226886]: 2026-01-20 15:26:05.744 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:26:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:07.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:07.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:07 np0005588920 nova_compute[226886]: 2026-01-20 15:26:07.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:26:07 np0005588920 nova_compute[226886]: 2026-01-20 15:26:07.747 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:26:07 np0005588920 nova_compute[226886]: 2026-01-20 15:26:07.748 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:26:07 np0005588920 nova_compute[226886]: 2026-01-20 15:26:07.748 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:26:07 np0005588920 nova_compute[226886]: 2026-01-20 15:26:07.748 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:26:07 np0005588920 nova_compute[226886]: 2026-01-20 15:26:07.749 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:26:08 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:26:08 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2608940846' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:26:08 np0005588920 nova_compute[226886]: 2026-01-20 15:26:08.235 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:26:08 np0005588920 nova_compute[226886]: 2026-01-20 15:26:08.362 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:08 np0005588920 nova_compute[226886]: 2026-01-20 15:26:08.427 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:26:08 np0005588920 nova_compute[226886]: 2026-01-20 15:26:08.429 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4164MB free_disk=20.96738052368164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:26:08 np0005588920 nova_compute[226886]: 2026-01-20 15:26:08.429 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:26:08 np0005588920 nova_compute[226886]: 2026-01-20 15:26:08.429 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:26:08 np0005588920 nova_compute[226886]: 2026-01-20 15:26:08.480 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:08 np0005588920 nova_compute[226886]: 2026-01-20 15:26:08.592 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:26:08 np0005588920 nova_compute[226886]: 2026-01-20 15:26:08.593 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:26:08 np0005588920 nova_compute[226886]: 2026-01-20 15:26:08.607 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:26:08 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:26:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:26:09 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1694225511' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:26:09 np0005588920 nova_compute[226886]: 2026-01-20 15:26:09.017 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.411s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:26:09 np0005588920 nova_compute[226886]: 2026-01-20 15:26:09.023 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:26:09 np0005588920 nova_compute[226886]: 2026-01-20 15:26:09.041 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:26:09 np0005588920 nova_compute[226886]: 2026-01-20 15:26:09.042 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:26:09 np0005588920 nova_compute[226886]: 2026-01-20 15:26:09.043 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.613s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:26:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:09.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:09.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:11.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:26:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:11.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:26:13 np0005588920 nova_compute[226886]: 2026-01-20 15:26:13.043 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:26:13 np0005588920 nova_compute[226886]: 2026-01-20 15:26:13.365 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:13.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:13.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:13 np0005588920 nova_compute[226886]: 2026-01-20 15:26:13.483 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:26:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:15.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:15.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:15 np0005588920 nova_compute[226886]: 2026-01-20 15:26:15.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:26:15 np0005588920 podman[303010]: 2026-01-20 15:26:15.992549868 +0000 UTC m=+0.079966630 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 20 10:26:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:26:16.487 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:26:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:26:16.488 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:26:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:26:16.488 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:26:16 np0005588920 nova_compute[226886]: 2026-01-20 15:26:16.721 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:26:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:17.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:17.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:17 np0005588920 nova_compute[226886]: 2026-01-20 15:26:17.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:26:18 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:26:18.275 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=72, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=71) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:26:18 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:26:18.277 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:26:18 np0005588920 nova_compute[226886]: 2026-01-20 15:26:18.276 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:18 np0005588920 nova_compute[226886]: 2026-01-20 15:26:18.366 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:18 np0005588920 nova_compute[226886]: 2026-01-20 15:26:18.484 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:18 np0005588920 nova_compute[226886]: 2026-01-20 15:26:18.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:26:18 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:26:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:19.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:26:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:19.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:26:19 np0005588920 nova_compute[226886]: 2026-01-20 15:26:19.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:26:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:21.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:21.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:22 np0005588920 nova_compute[226886]: 2026-01-20 15:26:22.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:26:22 np0005588920 nova_compute[226886]: 2026-01-20 15:26:22.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:26:23 np0005588920 nova_compute[226886]: 2026-01-20 15:26:23.367 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:23.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:23.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:23 np0005588920 nova_compute[226886]: 2026-01-20 15:26:23.485 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:23 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:26:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:25.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:25.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:26 np0005588920 podman[303036]: 2026-01-20 15:26:26.957019061 +0000 UTC m=+0.044955477 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:26:27 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:26:27.278 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '72'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:26:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:27.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:27.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:28 np0005588920 nova_compute[226886]: 2026-01-20 15:26:28.369 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:28 np0005588920 nova_compute[226886]: 2026-01-20 15:26:28.486 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:28 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:26:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:29.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:29.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:31.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:31.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:33 np0005588920 nova_compute[226886]: 2026-01-20 15:26:33.371 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:33.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:33.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:33 np0005588920 nova_compute[226886]: 2026-01-20 15:26:33.488 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:33 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:26:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:35.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:35.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:26:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:37.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:26:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:26:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:37.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:26:38 np0005588920 nova_compute[226886]: 2026-01-20 15:26:38.372 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:38 np0005588920 nova_compute[226886]: 2026-01-20 15:26:38.489 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:38 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:26:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:39.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:39.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:41.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:41.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:42 np0005588920 nova_compute[226886]: 2026-01-20 15:26:42.004 226890 DEBUG oslo_concurrency.lockutils [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "544c12e1-c0df-4fe7-b50c-a6cc7bc56a51" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:26:42 np0005588920 nova_compute[226886]: 2026-01-20 15:26:42.004 226890 DEBUG oslo_concurrency.lockutils [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "544c12e1-c0df-4fe7-b50c-a6cc7bc56a51" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:26:42 np0005588920 nova_compute[226886]: 2026-01-20 15:26:42.021 226890 DEBUG nova.compute.manager [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 10:26:42 np0005588920 nova_compute[226886]: 2026-01-20 15:26:42.102 226890 DEBUG oslo_concurrency.lockutils [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:26:42 np0005588920 nova_compute[226886]: 2026-01-20 15:26:42.103 226890 DEBUG oslo_concurrency.lockutils [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:26:42 np0005588920 nova_compute[226886]: 2026-01-20 15:26:42.127 226890 DEBUG nova.virt.hardware [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 10:26:42 np0005588920 nova_compute[226886]: 2026-01-20 15:26:42.127 226890 INFO nova.compute.claims [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 10:26:42 np0005588920 nova_compute[226886]: 2026-01-20 15:26:42.310 226890 DEBUG oslo_concurrency.processutils [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:26:42 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:26:42 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3507037067' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:26:42 np0005588920 nova_compute[226886]: 2026-01-20 15:26:42.763 226890 DEBUG oslo_concurrency.processutils [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:26:42 np0005588920 nova_compute[226886]: 2026-01-20 15:26:42.773 226890 DEBUG nova.compute.provider_tree [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:26:42 np0005588920 nova_compute[226886]: 2026-01-20 15:26:42.798 226890 DEBUG nova.scheduler.client.report [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:26:42 np0005588920 nova_compute[226886]: 2026-01-20 15:26:42.827 226890 DEBUG oslo_concurrency.lockutils [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.724s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:26:42 np0005588920 nova_compute[226886]: 2026-01-20 15:26:42.828 226890 DEBUG nova.compute.manager [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 10:26:42 np0005588920 nova_compute[226886]: 2026-01-20 15:26:42.887 226890 DEBUG nova.compute.manager [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 10:26:42 np0005588920 nova_compute[226886]: 2026-01-20 15:26:42.888 226890 DEBUG nova.network.neutron [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 10:26:42 np0005588920 nova_compute[226886]: 2026-01-20 15:26:42.912 226890 INFO nova.virt.libvirt.driver [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 10:26:42 np0005588920 nova_compute[226886]: 2026-01-20 15:26:42.937 226890 DEBUG nova.compute.manager [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 10:26:43 np0005588920 nova_compute[226886]: 2026-01-20 15:26:43.113 226890 DEBUG nova.compute.manager [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 10:26:43 np0005588920 nova_compute[226886]: 2026-01-20 15:26:43.115 226890 DEBUG nova.virt.libvirt.driver [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 10:26:43 np0005588920 nova_compute[226886]: 2026-01-20 15:26:43.116 226890 INFO nova.virt.libvirt.driver [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Creating image(s)#033[00m
Jan 20 10:26:43 np0005588920 nova_compute[226886]: 2026-01-20 15:26:43.151 226890 DEBUG nova.storage.rbd_utils [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:26:43 np0005588920 nova_compute[226886]: 2026-01-20 15:26:43.187 226890 DEBUG nova.storage.rbd_utils [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:26:43 np0005588920 nova_compute[226886]: 2026-01-20 15:26:43.219 226890 DEBUG nova.storage.rbd_utils [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:26:43 np0005588920 nova_compute[226886]: 2026-01-20 15:26:43.225 226890 DEBUG oslo_concurrency.processutils [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:26:43 np0005588920 nova_compute[226886]: 2026-01-20 15:26:43.308 226890 DEBUG oslo_concurrency.processutils [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:26:43 np0005588920 nova_compute[226886]: 2026-01-20 15:26:43.309 226890 DEBUG oslo_concurrency.lockutils [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:26:43 np0005588920 nova_compute[226886]: 2026-01-20 15:26:43.310 226890 DEBUG oslo_concurrency.lockutils [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:26:43 np0005588920 nova_compute[226886]: 2026-01-20 15:26:43.310 226890 DEBUG oslo_concurrency.lockutils [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:26:43 np0005588920 nova_compute[226886]: 2026-01-20 15:26:43.346 226890 DEBUG nova.storage.rbd_utils [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:26:43 np0005588920 nova_compute[226886]: 2026-01-20 15:26:43.353 226890 DEBUG oslo_concurrency.processutils [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:26:43 np0005588920 nova_compute[226886]: 2026-01-20 15:26:43.392 226890 DEBUG nova.policy [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5338aa65dc0e4326a66ce79053787f14', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3168f57421fb49bfb94b85daedd1fe7d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 10:26:43 np0005588920 nova_compute[226886]: 2026-01-20 15:26:43.395 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:26:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:43.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:26:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:43.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:43 np0005588920 nova_compute[226886]: 2026-01-20 15:26:43.491 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:43 np0005588920 nova_compute[226886]: 2026-01-20 15:26:43.752 226890 DEBUG oslo_concurrency.processutils [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.399s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:26:43 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:26:43 np0005588920 nova_compute[226886]: 2026-01-20 15:26:43.836 226890 DEBUG nova.storage.rbd_utils [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] resizing rbd image 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 10:26:43 np0005588920 nova_compute[226886]: 2026-01-20 15:26:43.952 226890 DEBUG nova.objects.instance [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lazy-loading 'migration_context' on Instance uuid 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:26:45 np0005588920 nova_compute[226886]: 2026-01-20 15:26:45.066 226890 DEBUG nova.virt.libvirt.driver [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 10:26:45 np0005588920 nova_compute[226886]: 2026-01-20 15:26:45.067 226890 DEBUG nova.virt.libvirt.driver [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Ensure instance console log exists: /var/lib/nova/instances/544c12e1-c0df-4fe7-b50c-a6cc7bc56a51/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:26:45 np0005588920 nova_compute[226886]: 2026-01-20 15:26:45.068 226890 DEBUG oslo_concurrency.lockutils [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:26:45 np0005588920 nova_compute[226886]: 2026-01-20 15:26:45.068 226890 DEBUG oslo_concurrency.lockutils [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:26:45 np0005588920 nova_compute[226886]: 2026-01-20 15:26:45.068 226890 DEBUG oslo_concurrency.lockutils [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:26:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:45.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:45.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:47 np0005588920 podman[303244]: 2026-01-20 15:26:47.014711679 +0000 UTC m=+0.097848177 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 20 10:26:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:47.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:26:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:47.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:26:48 np0005588920 nova_compute[226886]: 2026-01-20 15:26:48.397 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:48 np0005588920 nova_compute[226886]: 2026-01-20 15:26:48.496 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:48 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:26:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:49.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:49.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:49 np0005588920 nova_compute[226886]: 2026-01-20 15:26:49.727 226890 DEBUG nova.network.neutron [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Successfully created port: 37bd0485-2332-4502-a378-fe29d66faf07 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 10:26:50 np0005588920 nova_compute[226886]: 2026-01-20 15:26:50.824 226890 DEBUG nova.network.neutron [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Successfully updated port: 37bd0485-2332-4502-a378-fe29d66faf07 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 10:26:50 np0005588920 nova_compute[226886]: 2026-01-20 15:26:50.844 226890 DEBUG oslo_concurrency.lockutils [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "refresh_cache-544c12e1-c0df-4fe7-b50c-a6cc7bc56a51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:26:50 np0005588920 nova_compute[226886]: 2026-01-20 15:26:50.845 226890 DEBUG oslo_concurrency.lockutils [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquired lock "refresh_cache-544c12e1-c0df-4fe7-b50c-a6cc7bc56a51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:26:50 np0005588920 nova_compute[226886]: 2026-01-20 15:26:50.845 226890 DEBUG nova.network.neutron [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:26:51 np0005588920 nova_compute[226886]: 2026-01-20 15:26:51.289 226890 DEBUG nova.network.neutron [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:26:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:51.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:51.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:52 np0005588920 nova_compute[226886]: 2026-01-20 15:26:52.335 226890 DEBUG nova.network.neutron [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Updating instance_info_cache with network_info: [{"id": "37bd0485-2332-4502-a378-fe29d66faf07", "address": "fa:16:3e:d9:38:5a", "network": {"id": "81a13790-ad63-4a4c-b2ec-2002264761fc", "bridge": "br-int", "label": "tempest-network-smoke--1617823073", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37bd0485-23", "ovs_interfaceid": "37bd0485-2332-4502-a378-fe29d66faf07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:26:53 np0005588920 nova_compute[226886]: 2026-01-20 15:26:53.258 226890 DEBUG nova.compute.manager [req-e2b8c62c-e2b6-4b32-a429-608c2be0f851 req-881e793c-fed8-47e9-8996-dfac4a97df6c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Received event network-changed-37bd0485-2332-4502-a378-fe29d66faf07 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:26:53 np0005588920 nova_compute[226886]: 2026-01-20 15:26:53.259 226890 DEBUG nova.compute.manager [req-e2b8c62c-e2b6-4b32-a429-608c2be0f851 req-881e793c-fed8-47e9-8996-dfac4a97df6c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Refreshing instance network info cache due to event network-changed-37bd0485-2332-4502-a378-fe29d66faf07. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:26:53 np0005588920 nova_compute[226886]: 2026-01-20 15:26:53.259 226890 DEBUG oslo_concurrency.lockutils [req-e2b8c62c-e2b6-4b32-a429-608c2be0f851 req-881e793c-fed8-47e9-8996-dfac4a97df6c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-544c12e1-c0df-4fe7-b50c-a6cc7bc56a51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:26:53 np0005588920 nova_compute[226886]: 2026-01-20 15:26:53.298 226890 DEBUG oslo_concurrency.lockutils [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Releasing lock "refresh_cache-544c12e1-c0df-4fe7-b50c-a6cc7bc56a51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:26:53 np0005588920 nova_compute[226886]: 2026-01-20 15:26:53.298 226890 DEBUG nova.compute.manager [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Instance network_info: |[{"id": "37bd0485-2332-4502-a378-fe29d66faf07", "address": "fa:16:3e:d9:38:5a", "network": {"id": "81a13790-ad63-4a4c-b2ec-2002264761fc", "bridge": "br-int", "label": "tempest-network-smoke--1617823073", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37bd0485-23", "ovs_interfaceid": "37bd0485-2332-4502-a378-fe29d66faf07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 10:26:53 np0005588920 nova_compute[226886]: 2026-01-20 15:26:53.299 226890 DEBUG oslo_concurrency.lockutils [req-e2b8c62c-e2b6-4b32-a429-608c2be0f851 req-881e793c-fed8-47e9-8996-dfac4a97df6c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-544c12e1-c0df-4fe7-b50c-a6cc7bc56a51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:26:53 np0005588920 nova_compute[226886]: 2026-01-20 15:26:53.299 226890 DEBUG nova.network.neutron [req-e2b8c62c-e2b6-4b32-a429-608c2be0f851 req-881e793c-fed8-47e9-8996-dfac4a97df6c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Refreshing network info cache for port 37bd0485-2332-4502-a378-fe29d66faf07 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:26:53 np0005588920 nova_compute[226886]: 2026-01-20 15:26:53.305 226890 DEBUG nova.virt.libvirt.driver [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Start _get_guest_xml network_info=[{"id": "37bd0485-2332-4502-a378-fe29d66faf07", "address": "fa:16:3e:d9:38:5a", "network": {"id": "81a13790-ad63-4a4c-b2ec-2002264761fc", "bridge": "br-int", "label": "tempest-network-smoke--1617823073", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37bd0485-23", "ovs_interfaceid": "37bd0485-2332-4502-a378-fe29d66faf07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:26:53 np0005588920 nova_compute[226886]: 2026-01-20 15:26:53.312 226890 WARNING nova.virt.libvirt.driver [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:26:53 np0005588920 nova_compute[226886]: 2026-01-20 15:26:53.323 226890 DEBUG nova.virt.libvirt.host [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:26:53 np0005588920 nova_compute[226886]: 2026-01-20 15:26:53.324 226890 DEBUG nova.virt.libvirt.host [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:26:53 np0005588920 nova_compute[226886]: 2026-01-20 15:26:53.329 226890 DEBUG nova.virt.libvirt.host [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:26:53 np0005588920 nova_compute[226886]: 2026-01-20 15:26:53.329 226890 DEBUG nova.virt.libvirt.host [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:26:53 np0005588920 nova_compute[226886]: 2026-01-20 15:26:53.331 226890 DEBUG nova.virt.libvirt.driver [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:26:53 np0005588920 nova_compute[226886]: 2026-01-20 15:26:53.333 226890 DEBUG nova.virt.hardware [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:26:53 np0005588920 nova_compute[226886]: 2026-01-20 15:26:53.334 226890 DEBUG nova.virt.hardware [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:26:53 np0005588920 nova_compute[226886]: 2026-01-20 15:26:53.334 226890 DEBUG nova.virt.hardware [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:26:53 np0005588920 nova_compute[226886]: 2026-01-20 15:26:53.334 226890 DEBUG nova.virt.hardware [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:26:53 np0005588920 nova_compute[226886]: 2026-01-20 15:26:53.334 226890 DEBUG nova.virt.hardware [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:26:53 np0005588920 nova_compute[226886]: 2026-01-20 15:26:53.334 226890 DEBUG nova.virt.hardware [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:26:53 np0005588920 nova_compute[226886]: 2026-01-20 15:26:53.335 226890 DEBUG nova.virt.hardware [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:26:53 np0005588920 nova_compute[226886]: 2026-01-20 15:26:53.335 226890 DEBUG nova.virt.hardware [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:26:53 np0005588920 nova_compute[226886]: 2026-01-20 15:26:53.335 226890 DEBUG nova.virt.hardware [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:26:53 np0005588920 nova_compute[226886]: 2026-01-20 15:26:53.335 226890 DEBUG nova.virt.hardware [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:26:53 np0005588920 nova_compute[226886]: 2026-01-20 15:26:53.336 226890 DEBUG nova.virt.hardware [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:26:53 np0005588920 nova_compute[226886]: 2026-01-20 15:26:53.339 226890 DEBUG oslo_concurrency.processutils [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:26:53 np0005588920 nova_compute[226886]: 2026-01-20 15:26:53.399 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:53.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:53 np0005588920 nova_compute[226886]: 2026-01-20 15:26:53.497 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:53.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:53 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:26:53 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:26:53 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1693400051' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:26:53 np0005588920 nova_compute[226886]: 2026-01-20 15:26:53.844 226890 DEBUG oslo_concurrency.processutils [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.505s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:26:53 np0005588920 nova_compute[226886]: 2026-01-20 15:26:53.877 226890 DEBUG nova.storage.rbd_utils [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:26:53 np0005588920 nova_compute[226886]: 2026-01-20 15:26:53.883 226890 DEBUG oslo_concurrency.processutils [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:26:54 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:26:54 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/252485601' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:26:54 np0005588920 nova_compute[226886]: 2026-01-20 15:26:54.395 226890 DEBUG oslo_concurrency.processutils [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.512s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:26:54 np0005588920 nova_compute[226886]: 2026-01-20 15:26:54.397 226890 DEBUG nova.virt.libvirt.vif [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:26:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-297427244',display_name='tempest-TestNetworkBasicOps-server-297427244',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-297427244',id=203,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBImYL0mCTNXW9ftBLYa85Wvk+l0iQ+nYUVyh0yE4uBC6ByE8kDB1WkMLdWBDGHH4oB7LsNBDEEmeX4CksGmjggSnyUHHyEpNaCOGpF4SYQ2i//PkmIvcz6OlB4/jRmkfZw==',key_name='tempest-TestNetworkBasicOps-1921746038',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3168f57421fb49bfb94b85daedd1fe7d',ramdisk_id='',reservation_id='r-3va01c3q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-807695970',owner_user_name='tempest-TestNetworkBasicOps-807695970-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:26:43Z,user_data=None,user_id='5338aa65dc0e4326a66ce79053787f14',uuid=544c12e1-c0df-4fe7-b50c-a6cc7bc56a51,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "37bd0485-2332-4502-a378-fe29d66faf07", "address": "fa:16:3e:d9:38:5a", "network": {"id": "81a13790-ad63-4a4c-b2ec-2002264761fc", "bridge": "br-int", "label": "tempest-network-smoke--1617823073", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37bd0485-23", "ovs_interfaceid": "37bd0485-2332-4502-a378-fe29d66faf07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:26:54 np0005588920 nova_compute[226886]: 2026-01-20 15:26:54.398 226890 DEBUG nova.network.os_vif_util [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converting VIF {"id": "37bd0485-2332-4502-a378-fe29d66faf07", "address": "fa:16:3e:d9:38:5a", "network": {"id": "81a13790-ad63-4a4c-b2ec-2002264761fc", "bridge": "br-int", "label": "tempest-network-smoke--1617823073", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37bd0485-23", "ovs_interfaceid": "37bd0485-2332-4502-a378-fe29d66faf07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:26:54 np0005588920 nova_compute[226886]: 2026-01-20 15:26:54.399 226890 DEBUG nova.network.os_vif_util [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d9:38:5a,bridge_name='br-int',has_traffic_filtering=True,id=37bd0485-2332-4502-a378-fe29d66faf07,network=Network(81a13790-ad63-4a4c-b2ec-2002264761fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37bd0485-23') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:26:54 np0005588920 nova_compute[226886]: 2026-01-20 15:26:54.401 226890 DEBUG nova.objects.instance [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lazy-loading 'pci_devices' on Instance uuid 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:26:54 np0005588920 nova_compute[226886]: 2026-01-20 15:26:54.425 226890 DEBUG nova.virt.libvirt.driver [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:26:54 np0005588920 nova_compute[226886]:  <uuid>544c12e1-c0df-4fe7-b50c-a6cc7bc56a51</uuid>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:  <name>instance-000000cb</name>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:26:54 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:      <nova:name>tempest-TestNetworkBasicOps-server-297427244</nova:name>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 15:26:53</nova:creationTime>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 10:26:54 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:        <nova:user uuid="5338aa65dc0e4326a66ce79053787f14">tempest-TestNetworkBasicOps-807695970-project-member</nova:user>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:        <nova:project uuid="3168f57421fb49bfb94b85daedd1fe7d">tempest-TestNetworkBasicOps-807695970</nova:project>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:        <nova:port uuid="37bd0485-2332-4502-a378-fe29d66faf07">
Jan 20 10:26:54 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.28" ipVersion="4"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    <system>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:      <entry name="serial">544c12e1-c0df-4fe7-b50c-a6cc7bc56a51</entry>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:      <entry name="uuid">544c12e1-c0df-4fe7-b50c-a6cc7bc56a51</entry>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    </system>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:  <os>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:  </os>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:  <features>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:  </features>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:  </clock>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:  <devices>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 10:26:54 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/544c12e1-c0df-4fe7-b50c-a6cc7bc56a51_disk">
Jan 20 10:26:54 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:26:54 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 10:26:54 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/544c12e1-c0df-4fe7-b50c-a6cc7bc56a51_disk.config">
Jan 20 10:26:54 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:26:54 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 10:26:54 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:d9:38:5a"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:      <target dev="tap37bd0485-23"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    </interface>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 10:26:54 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/544c12e1-c0df-4fe7-b50c-a6cc7bc56a51/console.log" append="off"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    </serial>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    <video>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    </video>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 10:26:54 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    </rng>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 10:26:54 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 10:26:54 np0005588920 nova_compute[226886]:  </devices>
Jan 20 10:26:54 np0005588920 nova_compute[226886]: </domain>
Jan 20 10:26:54 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:26:54 np0005588920 nova_compute[226886]: 2026-01-20 15:26:54.427 226890 DEBUG nova.compute.manager [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Preparing to wait for external event network-vif-plugged-37bd0485-2332-4502-a378-fe29d66faf07 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:26:54 np0005588920 nova_compute[226886]: 2026-01-20 15:26:54.428 226890 DEBUG oslo_concurrency.lockutils [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "544c12e1-c0df-4fe7-b50c-a6cc7bc56a51-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:26:54 np0005588920 nova_compute[226886]: 2026-01-20 15:26:54.428 226890 DEBUG oslo_concurrency.lockutils [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "544c12e1-c0df-4fe7-b50c-a6cc7bc56a51-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:26:54 np0005588920 nova_compute[226886]: 2026-01-20 15:26:54.428 226890 DEBUG oslo_concurrency.lockutils [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "544c12e1-c0df-4fe7-b50c-a6cc7bc56a51-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:26:54 np0005588920 nova_compute[226886]: 2026-01-20 15:26:54.429 226890 DEBUG nova.virt.libvirt.vif [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:26:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-297427244',display_name='tempest-TestNetworkBasicOps-server-297427244',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-297427244',id=203,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBImYL0mCTNXW9ftBLYa85Wvk+l0iQ+nYUVyh0yE4uBC6ByE8kDB1WkMLdWBDGHH4oB7LsNBDEEmeX4CksGmjggSnyUHHyEpNaCOGpF4SYQ2i//PkmIvcz6OlB4/jRmkfZw==',key_name='tempest-TestNetworkBasicOps-1921746038',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3168f57421fb49bfb94b85daedd1fe7d',ramdisk_id='',reservation_id='r-3va01c3q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-807695970',owner_user_name='tempest-TestNetworkBasicOps-807695970-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:26:43Z,user_data=None,user_id='5338aa65dc0e4326a66ce79053787f14',uuid=544c12e1-c0df-4fe7-b50c-a6cc7bc56a51,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "37bd0485-2332-4502-a378-fe29d66faf07", "address": "fa:16:3e:d9:38:5a", "network": {"id": "81a13790-ad63-4a4c-b2ec-2002264761fc", "bridge": "br-int", "label": "tempest-network-smoke--1617823073", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37bd0485-23", "ovs_interfaceid": "37bd0485-2332-4502-a378-fe29d66faf07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:26:54 np0005588920 nova_compute[226886]: 2026-01-20 15:26:54.430 226890 DEBUG nova.network.os_vif_util [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converting VIF {"id": "37bd0485-2332-4502-a378-fe29d66faf07", "address": "fa:16:3e:d9:38:5a", "network": {"id": "81a13790-ad63-4a4c-b2ec-2002264761fc", "bridge": "br-int", "label": "tempest-network-smoke--1617823073", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37bd0485-23", "ovs_interfaceid": "37bd0485-2332-4502-a378-fe29d66faf07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:26:54 np0005588920 nova_compute[226886]: 2026-01-20 15:26:54.430 226890 DEBUG nova.network.os_vif_util [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d9:38:5a,bridge_name='br-int',has_traffic_filtering=True,id=37bd0485-2332-4502-a378-fe29d66faf07,network=Network(81a13790-ad63-4a4c-b2ec-2002264761fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37bd0485-23') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:26:54 np0005588920 nova_compute[226886]: 2026-01-20 15:26:54.431 226890 DEBUG os_vif [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d9:38:5a,bridge_name='br-int',has_traffic_filtering=True,id=37bd0485-2332-4502-a378-fe29d66faf07,network=Network(81a13790-ad63-4a4c-b2ec-2002264761fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37bd0485-23') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:26:54 np0005588920 nova_compute[226886]: 2026-01-20 15:26:54.431 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:54 np0005588920 nova_compute[226886]: 2026-01-20 15:26:54.432 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:26:54 np0005588920 nova_compute[226886]: 2026-01-20 15:26:54.432 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:26:54 np0005588920 nova_compute[226886]: 2026-01-20 15:26:54.436 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:54 np0005588920 nova_compute[226886]: 2026-01-20 15:26:54.436 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap37bd0485-23, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:26:54 np0005588920 nova_compute[226886]: 2026-01-20 15:26:54.437 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap37bd0485-23, col_values=(('external_ids', {'iface-id': '37bd0485-2332-4502-a378-fe29d66faf07', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d9:38:5a', 'vm-uuid': '544c12e1-c0df-4fe7-b50c-a6cc7bc56a51'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:26:54 np0005588920 nova_compute[226886]: 2026-01-20 15:26:54.438 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:54 np0005588920 NetworkManager[49076]: <info>  [1768922814.4395] manager: (tap37bd0485-23): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/438)
Jan 20 10:26:54 np0005588920 nova_compute[226886]: 2026-01-20 15:26:54.441 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:26:54 np0005588920 nova_compute[226886]: 2026-01-20 15:26:54.446 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:54 np0005588920 nova_compute[226886]: 2026-01-20 15:26:54.446 226890 INFO os_vif [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d9:38:5a,bridge_name='br-int',has_traffic_filtering=True,id=37bd0485-2332-4502-a378-fe29d66faf07,network=Network(81a13790-ad63-4a4c-b2ec-2002264761fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37bd0485-23')#033[00m
Jan 20 10:26:54 np0005588920 nova_compute[226886]: 2026-01-20 15:26:54.670 226890 DEBUG nova.virt.libvirt.driver [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:26:54 np0005588920 nova_compute[226886]: 2026-01-20 15:26:54.671 226890 DEBUG nova.virt.libvirt.driver [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:26:54 np0005588920 nova_compute[226886]: 2026-01-20 15:26:54.671 226890 DEBUG nova.virt.libvirt.driver [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] No VIF found with MAC fa:16:3e:d9:38:5a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:26:54 np0005588920 nova_compute[226886]: 2026-01-20 15:26:54.672 226890 INFO nova.virt.libvirt.driver [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Using config drive#033[00m
Jan 20 10:26:54 np0005588920 nova_compute[226886]: 2026-01-20 15:26:54.692 226890 DEBUG nova.storage.rbd_utils [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:26:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:55.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:55.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:56 np0005588920 nova_compute[226886]: 2026-01-20 15:26:56.415 226890 INFO nova.virt.libvirt.driver [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Creating config drive at /var/lib/nova/instances/544c12e1-c0df-4fe7-b50c-a6cc7bc56a51/disk.config#033[00m
Jan 20 10:26:56 np0005588920 nova_compute[226886]: 2026-01-20 15:26:56.420 226890 DEBUG oslo_concurrency.processutils [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/544c12e1-c0df-4fe7-b50c-a6cc7bc56a51/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpts05qoqz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:26:56 np0005588920 nova_compute[226886]: 2026-01-20 15:26:56.557 226890 DEBUG oslo_concurrency.processutils [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/544c12e1-c0df-4fe7-b50c-a6cc7bc56a51/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpts05qoqz" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:26:56 np0005588920 nova_compute[226886]: 2026-01-20 15:26:56.588 226890 DEBUG nova.storage.rbd_utils [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:26:56 np0005588920 nova_compute[226886]: 2026-01-20 15:26:56.593 226890 DEBUG oslo_concurrency.processutils [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/544c12e1-c0df-4fe7-b50c-a6cc7bc56a51/disk.config 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:26:56 np0005588920 nova_compute[226886]: 2026-01-20 15:26:56.741 226890 DEBUG oslo_concurrency.processutils [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/544c12e1-c0df-4fe7-b50c-a6cc7bc56a51/disk.config 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.148s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:26:56 np0005588920 nova_compute[226886]: 2026-01-20 15:26:56.742 226890 INFO nova.virt.libvirt.driver [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Deleting local config drive /var/lib/nova/instances/544c12e1-c0df-4fe7-b50c-a6cc7bc56a51/disk.config because it was imported into RBD.#033[00m
Jan 20 10:26:56 np0005588920 kernel: tap37bd0485-23: entered promiscuous mode
Jan 20 10:26:56 np0005588920 nova_compute[226886]: 2026-01-20 15:26:56.816 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:56 np0005588920 ovn_controller[133971]: 2026-01-20T15:26:56Z|00939|binding|INFO|Claiming lport 37bd0485-2332-4502-a378-fe29d66faf07 for this chassis.
Jan 20 10:26:56 np0005588920 ovn_controller[133971]: 2026-01-20T15:26:56Z|00940|binding|INFO|37bd0485-2332-4502-a378-fe29d66faf07: Claiming fa:16:3e:d9:38:5a 10.100.0.28
Jan 20 10:26:56 np0005588920 NetworkManager[49076]: <info>  [1768922816.8179] manager: (tap37bd0485-23): new Tun device (/org/freedesktop/NetworkManager/Devices/439)
Jan 20 10:26:56 np0005588920 systemd-udevd[303404]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:26:56 np0005588920 nova_compute[226886]: 2026-01-20 15:26:56.852 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:56 np0005588920 systemd-machined[196121]: New machine qemu-97-instance-000000cb.
Jan 20 10:26:56 np0005588920 nova_compute[226886]: 2026-01-20 15:26:56.859 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:56 np0005588920 ovn_controller[133971]: 2026-01-20T15:26:56Z|00941|binding|INFO|Setting lport 37bd0485-2332-4502-a378-fe29d66faf07 ovn-installed in OVS
Jan 20 10:26:56 np0005588920 nova_compute[226886]: 2026-01-20 15:26:56.862 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:56 np0005588920 NetworkManager[49076]: <info>  [1768922816.8653] device (tap37bd0485-23): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:26:56 np0005588920 NetworkManager[49076]: <info>  [1768922816.8661] device (tap37bd0485-23): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:26:56 np0005588920 systemd[1]: Started Virtual Machine qemu-97-instance-000000cb.
Jan 20 10:26:57 np0005588920 nova_compute[226886]: 2026-01-20 15:26:57.246 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768922817.2457595, 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:26:57 np0005588920 nova_compute[226886]: 2026-01-20 15:26:57.248 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] VM Started (Lifecycle Event)#033[00m
Jan 20 10:26:57 np0005588920 ovn_controller[133971]: 2026-01-20T15:26:57Z|00942|binding|INFO|Setting lport 37bd0485-2332-4502-a378-fe29d66faf07 up in Southbound
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:26:57.277 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d9:38:5a 10.100.0.28'], port_security=['fa:16:3e:d9:38:5a 10.100.0.28'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.28/28', 'neutron:device_id': '544c12e1-c0df-4fe7-b50c-a6cc7bc56a51', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-81a13790-ad63-4a4c-b2ec-2002264761fc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3168f57421fb49bfb94b85daedd1fe7d', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a020ed12-71c4-4c9b-893f-46ee2c801b03', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fbbbd1dd-5c1d-4952-9676-2ad19dc96404, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=37bd0485-2332-4502-a378-fe29d66faf07) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:26:57.278 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 37bd0485-2332-4502-a378-fe29d66faf07 in datapath 81a13790-ad63-4a4c-b2ec-2002264761fc bound to our chassis#033[00m
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:26:57.279 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 81a13790-ad63-4a4c-b2ec-2002264761fc#033[00m
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:26:57.292 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[eeba1d77-5d70-4c53-bb23-a3cdec9041eb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:26:57.293 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap81a13790-a1 in ovnmeta-81a13790-ad63-4a4c-b2ec-2002264761fc namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:26:57.295 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap81a13790-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:26:57.296 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a76db107-0528-42c0-89c3-8be8519a5f0d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:26:57.297 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[479d2797-f162-4cc9-944f-602643645581]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:26:57.309 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[6c55fd8d-f225-4e12-b324-b5659b4d2dd6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:26:57.326 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[8f9d6516-3b17-4b78-822d-67c49fe3af1e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:26:57 np0005588920 nova_compute[226886]: 2026-01-20 15:26:57.347 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:26:57 np0005588920 nova_compute[226886]: 2026-01-20 15:26:57.352 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768922817.2460084, 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:26:57 np0005588920 nova_compute[226886]: 2026-01-20 15:26:57.352 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:26:57.357 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[669eb634-4a77-410c-bd80-d9991fdc978d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:26:57 np0005588920 NetworkManager[49076]: <info>  [1768922817.3645] manager: (tap81a13790-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/440)
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:26:57.363 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[4eec5f6c-9f3d-4695-b734-b484ffa67941]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:26:57 np0005588920 systemd-udevd[303407]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:26:57 np0005588920 nova_compute[226886]: 2026-01-20 15:26:57.367 226890 DEBUG nova.network.neutron [req-e2b8c62c-e2b6-4b32-a429-608c2be0f851 req-881e793c-fed8-47e9-8996-dfac4a97df6c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Updated VIF entry in instance network info cache for port 37bd0485-2332-4502-a378-fe29d66faf07. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:26:57 np0005588920 nova_compute[226886]: 2026-01-20 15:26:57.368 226890 DEBUG nova.network.neutron [req-e2b8c62c-e2b6-4b32-a429-608c2be0f851 req-881e793c-fed8-47e9-8996-dfac4a97df6c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Updating instance_info_cache with network_info: [{"id": "37bd0485-2332-4502-a378-fe29d66faf07", "address": "fa:16:3e:d9:38:5a", "network": {"id": "81a13790-ad63-4a4c-b2ec-2002264761fc", "bridge": "br-int", "label": "tempest-network-smoke--1617823073", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37bd0485-23", "ovs_interfaceid": "37bd0485-2332-4502-a378-fe29d66faf07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:26:57 np0005588920 nova_compute[226886]: 2026-01-20 15:26:57.379 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:26:57 np0005588920 nova_compute[226886]: 2026-01-20 15:26:57.384 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:26:57 np0005588920 nova_compute[226886]: 2026-01-20 15:26:57.394 226890 DEBUG oslo_concurrency.lockutils [req-e2b8c62c-e2b6-4b32-a429-608c2be0f851 req-881e793c-fed8-47e9-8996-dfac4a97df6c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-544c12e1-c0df-4fe7-b50c-a6cc7bc56a51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:26:57.396 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[e24cfc57-35e5-4c7e-9bab-304c60ccd03e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:26:57.400 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[b847e7d5-4d03-42f5-922b-c1de52336540]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:26:57 np0005588920 nova_compute[226886]: 2026-01-20 15:26:57.408 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:26:57 np0005588920 NetworkManager[49076]: <info>  [1768922817.4244] device (tap81a13790-a0): carrier: link connected
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:26:57.429 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[f2d88d1c-a323-4f29-9b0c-d3eb85859020]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:26:57 np0005588920 podman[303458]: 2026-01-20 15:26:57.431046727 +0000 UTC m=+0.089233603 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:26:57.445 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[8f5b197d-edc9-45a9-88ed-89744c0513be]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap81a13790-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a4:1a:dd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 295], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 785911, 'reachable_time': 24023, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 303500, 'error': None, 'target': 'ovnmeta-81a13790-ad63-4a4c-b2ec-2002264761fc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:26:57.458 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[eb7f03f8-3044-44f2-ae43-ad81f5d32cea]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea4:1add'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 785911, 'tstamp': 785911}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 303501, 'error': None, 'target': 'ovnmeta-81a13790-ad63-4a4c-b2ec-2002264761fc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:26:57.475 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5fa6919f-d05f-4e7f-bf74-b97cdb9d76d7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap81a13790-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a4:1a:dd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 295], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 785911, 'reachable_time': 24023, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 303502, 'error': None, 'target': 'ovnmeta-81a13790-ad63-4a4c-b2ec-2002264761fc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:26:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:57.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:26:57.503 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[4a705274-d16a-4fb6-b4cb-37d3912b7977]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:26:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:26:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:57.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:26:57.557 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[1fc2f30d-b827-4dde-8028-dffd40525b49]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:26:57.558 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap81a13790-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:26:57.559 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:26:57.559 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap81a13790-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:26:57 np0005588920 nova_compute[226886]: 2026-01-20 15:26:57.560 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:57 np0005588920 kernel: tap81a13790-a0: entered promiscuous mode
Jan 20 10:26:57 np0005588920 NetworkManager[49076]: <info>  [1768922817.5628] manager: (tap81a13790-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/441)
Jan 20 10:26:57 np0005588920 nova_compute[226886]: 2026-01-20 15:26:57.562 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:26:57.563 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap81a13790-a0, col_values=(('external_ids', {'iface-id': '43f52608-03fc-4a66-8415-e9b9829d078c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:26:57 np0005588920 nova_compute[226886]: 2026-01-20 15:26:57.564 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:57 np0005588920 ovn_controller[133971]: 2026-01-20T15:26:57Z|00943|binding|INFO|Releasing lport 43f52608-03fc-4a66-8415-e9b9829d078c from this chassis (sb_readonly=0)
Jan 20 10:26:57 np0005588920 nova_compute[226886]: 2026-01-20 15:26:57.576 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:57 np0005588920 nova_compute[226886]: 2026-01-20 15:26:57.576 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:26:57.577 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/81a13790-ad63-4a4c-b2ec-2002264761fc.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/81a13790-ad63-4a4c-b2ec-2002264761fc.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:26:57.578 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ed510960-fd11-4b01-a8a5-d8d61b84b9bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:26:57.579 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-81a13790-ad63-4a4c-b2ec-2002264761fc
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/81a13790-ad63-4a4c-b2ec-2002264761fc.pid.haproxy
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID 81a13790-ad63-4a4c-b2ec-2002264761fc
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:26:57 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:26:57.579 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-81a13790-ad63-4a4c-b2ec-2002264761fc', 'env', 'PROCESS_TAG=haproxy-81a13790-ad63-4a4c-b2ec-2002264761fc', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/81a13790-ad63-4a4c-b2ec-2002264761fc.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:26:57 np0005588920 podman[303535]: 2026-01-20 15:26:57.915766072 +0000 UTC m=+0.046177992 container create 72b0f2972fd0c539e6acb980f7d9d492e2bfdb32a3cf3002a264a4c1cb8aac9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-81a13790-ad63-4a4c-b2ec-2002264761fc, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 20 10:26:57 np0005588920 systemd[1]: Started libpod-conmon-72b0f2972fd0c539e6acb980f7d9d492e2bfdb32a3cf3002a264a4c1cb8aac9d.scope.
Jan 20 10:26:57 np0005588920 systemd[1]: Started libcrun container.
Jan 20 10:26:57 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff6a786459fe06a498eb8f2c5190c27396da5494f60fe26c4969576902ee0d9b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:26:57 np0005588920 podman[303535]: 2026-01-20 15:26:57.890246438 +0000 UTC m=+0.020658398 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:26:57 np0005588920 podman[303535]: 2026-01-20 15:26:57.99288772 +0000 UTC m=+0.123299680 container init 72b0f2972fd0c539e6acb980f7d9d492e2bfdb32a3cf3002a264a4c1cb8aac9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-81a13790-ad63-4a4c-b2ec-2002264761fc, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 20 10:26:57 np0005588920 podman[303535]: 2026-01-20 15:26:57.998578942 +0000 UTC m=+0.128990872 container start 72b0f2972fd0c539e6acb980f7d9d492e2bfdb32a3cf3002a264a4c1cb8aac9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-81a13790-ad63-4a4c-b2ec-2002264761fc, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 20 10:26:58 np0005588920 neutron-haproxy-ovnmeta-81a13790-ad63-4a4c-b2ec-2002264761fc[303551]: [NOTICE]   (303555) : New worker (303557) forked
Jan 20 10:26:58 np0005588920 neutron-haproxy-ovnmeta-81a13790-ad63-4a4c-b2ec-2002264761fc[303551]: [NOTICE]   (303555) : Loading success.
Jan 20 10:26:58 np0005588920 nova_compute[226886]: 2026-01-20 15:26:58.499 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:58 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:26:59 np0005588920 nova_compute[226886]: 2026-01-20 15:26:59.069 226890 DEBUG nova.compute.manager [req-b7aeff41-7841-4c46-8b61-617ed6d16cb5 req-80912721-8b2a-4b19-bd64-aca43f1b7f1c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Received event network-vif-plugged-37bd0485-2332-4502-a378-fe29d66faf07 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:26:59 np0005588920 nova_compute[226886]: 2026-01-20 15:26:59.069 226890 DEBUG oslo_concurrency.lockutils [req-b7aeff41-7841-4c46-8b61-617ed6d16cb5 req-80912721-8b2a-4b19-bd64-aca43f1b7f1c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "544c12e1-c0df-4fe7-b50c-a6cc7bc56a51-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:26:59 np0005588920 nova_compute[226886]: 2026-01-20 15:26:59.070 226890 DEBUG oslo_concurrency.lockutils [req-b7aeff41-7841-4c46-8b61-617ed6d16cb5 req-80912721-8b2a-4b19-bd64-aca43f1b7f1c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "544c12e1-c0df-4fe7-b50c-a6cc7bc56a51-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:26:59 np0005588920 nova_compute[226886]: 2026-01-20 15:26:59.070 226890 DEBUG oslo_concurrency.lockutils [req-b7aeff41-7841-4c46-8b61-617ed6d16cb5 req-80912721-8b2a-4b19-bd64-aca43f1b7f1c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "544c12e1-c0df-4fe7-b50c-a6cc7bc56a51-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:26:59 np0005588920 nova_compute[226886]: 2026-01-20 15:26:59.070 226890 DEBUG nova.compute.manager [req-b7aeff41-7841-4c46-8b61-617ed6d16cb5 req-80912721-8b2a-4b19-bd64-aca43f1b7f1c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Processing event network-vif-plugged-37bd0485-2332-4502-a378-fe29d66faf07 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 10:26:59 np0005588920 nova_compute[226886]: 2026-01-20 15:26:59.071 226890 DEBUG nova.compute.manager [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:26:59 np0005588920 nova_compute[226886]: 2026-01-20 15:26:59.076 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768922819.075817, 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:26:59 np0005588920 nova_compute[226886]: 2026-01-20 15:26:59.076 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:26:59 np0005588920 nova_compute[226886]: 2026-01-20 15:26:59.078 226890 DEBUG nova.virt.libvirt.driver [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:26:59 np0005588920 nova_compute[226886]: 2026-01-20 15:26:59.081 226890 INFO nova.virt.libvirt.driver [-] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Instance spawned successfully.#033[00m
Jan 20 10:26:59 np0005588920 nova_compute[226886]: 2026-01-20 15:26:59.082 226890 DEBUG nova.virt.libvirt.driver [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 10:26:59 np0005588920 nova_compute[226886]: 2026-01-20 15:26:59.137 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:26:59 np0005588920 nova_compute[226886]: 2026-01-20 15:26:59.141 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:26:59 np0005588920 nova_compute[226886]: 2026-01-20 15:26:59.161 226890 DEBUG nova.virt.libvirt.driver [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:26:59 np0005588920 nova_compute[226886]: 2026-01-20 15:26:59.161 226890 DEBUG nova.virt.libvirt.driver [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:26:59 np0005588920 nova_compute[226886]: 2026-01-20 15:26:59.162 226890 DEBUG nova.virt.libvirt.driver [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:26:59 np0005588920 nova_compute[226886]: 2026-01-20 15:26:59.163 226890 DEBUG nova.virt.libvirt.driver [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:26:59 np0005588920 nova_compute[226886]: 2026-01-20 15:26:59.163 226890 DEBUG nova.virt.libvirt.driver [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:26:59 np0005588920 nova_compute[226886]: 2026-01-20 15:26:59.164 226890 DEBUG nova.virt.libvirt.driver [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:26:59 np0005588920 nova_compute[226886]: 2026-01-20 15:26:59.170 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:26:59 np0005588920 nova_compute[226886]: 2026-01-20 15:26:59.288 226890 INFO nova.compute.manager [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Took 16.17 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 10:26:59 np0005588920 nova_compute[226886]: 2026-01-20 15:26:59.289 226890 DEBUG nova.compute.manager [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:26:59 np0005588920 nova_compute[226886]: 2026-01-20 15:26:59.424 226890 INFO nova.compute.manager [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Took 17.35 seconds to build instance.#033[00m
Jan 20 10:26:59 np0005588920 nova_compute[226886]: 2026-01-20 15:26:59.440 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:59 np0005588920 nova_compute[226886]: 2026-01-20 15:26:59.464 226890 DEBUG oslo_concurrency.lockutils [None req-7a34531e-ca84-47d0-b9f8-6d262d80bb45 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "544c12e1-c0df-4fe7-b50c-a6cc7bc56a51" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.460s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:26:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:59.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:26:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:59.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:01 np0005588920 nova_compute[226886]: 2026-01-20 15:27:01.225 226890 DEBUG nova.compute.manager [req-92983a41-6f3b-4d8f-9fc6-98f2b49a84ea req-ce9fd0e3-cbf6-4b9b-ba4c-751b169ec93f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Received event network-vif-plugged-37bd0485-2332-4502-a378-fe29d66faf07 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:27:01 np0005588920 nova_compute[226886]: 2026-01-20 15:27:01.226 226890 DEBUG oslo_concurrency.lockutils [req-92983a41-6f3b-4d8f-9fc6-98f2b49a84ea req-ce9fd0e3-cbf6-4b9b-ba4c-751b169ec93f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "544c12e1-c0df-4fe7-b50c-a6cc7bc56a51-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:27:01 np0005588920 nova_compute[226886]: 2026-01-20 15:27:01.226 226890 DEBUG oslo_concurrency.lockutils [req-92983a41-6f3b-4d8f-9fc6-98f2b49a84ea req-ce9fd0e3-cbf6-4b9b-ba4c-751b169ec93f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "544c12e1-c0df-4fe7-b50c-a6cc7bc56a51-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:27:01 np0005588920 nova_compute[226886]: 2026-01-20 15:27:01.226 226890 DEBUG oslo_concurrency.lockutils [req-92983a41-6f3b-4d8f-9fc6-98f2b49a84ea req-ce9fd0e3-cbf6-4b9b-ba4c-751b169ec93f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "544c12e1-c0df-4fe7-b50c-a6cc7bc56a51-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:27:01 np0005588920 nova_compute[226886]: 2026-01-20 15:27:01.226 226890 DEBUG nova.compute.manager [req-92983a41-6f3b-4d8f-9fc6-98f2b49a84ea req-ce9fd0e3-cbf6-4b9b-ba4c-751b169ec93f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] No waiting events found dispatching network-vif-plugged-37bd0485-2332-4502-a378-fe29d66faf07 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:27:01 np0005588920 nova_compute[226886]: 2026-01-20 15:27:01.226 226890 WARNING nova.compute.manager [req-92983a41-6f3b-4d8f-9fc6-98f2b49a84ea req-ce9fd0e3-cbf6-4b9b-ba4c-751b169ec93f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Received unexpected event network-vif-plugged-37bd0485-2332-4502-a378-fe29d66faf07 for instance with vm_state active and task_state None.#033[00m
Jan 20 10:27:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:01.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:01.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:03.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:03 np0005588920 nova_compute[226886]: 2026-01-20 15:27:03.500 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:03.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:03 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:27:04 np0005588920 nova_compute[226886]: 2026-01-20 15:27:04.442 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:05 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:27:05 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:27:05 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:27:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:05.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:05.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:27:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:07.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:27:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:07.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:07 np0005588920 nova_compute[226886]: 2026-01-20 15:27:07.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:27:07 np0005588920 nova_compute[226886]: 2026-01-20 15:27:07.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:27:07 np0005588920 nova_compute[226886]: 2026-01-20 15:27:07.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:27:07 np0005588920 NetworkManager[49076]: <info>  [1768922827.7531] manager: (patch-br-int-to-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/442)
Jan 20 10:27:07 np0005588920 NetworkManager[49076]: <info>  [1768922827.7541] manager: (patch-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/443)
Jan 20 10:27:07 np0005588920 nova_compute[226886]: 2026-01-20 15:27:07.756 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:07 np0005588920 ovn_controller[133971]: 2026-01-20T15:27:07Z|00944|binding|INFO|Releasing lport 43f52608-03fc-4a66-8415-e9b9829d078c from this chassis (sb_readonly=0)
Jan 20 10:27:07 np0005588920 ovn_controller[133971]: 2026-01-20T15:27:07Z|00945|binding|INFO|Releasing lport 43f52608-03fc-4a66-8415-e9b9829d078c from this chassis (sb_readonly=0)
Jan 20 10:27:07 np0005588920 nova_compute[226886]: 2026-01-20 15:27:07.829 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:08 np0005588920 nova_compute[226886]: 2026-01-20 15:27:08.286 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "refresh_cache-544c12e1-c0df-4fe7-b50c-a6cc7bc56a51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:27:08 np0005588920 nova_compute[226886]: 2026-01-20 15:27:08.286 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquired lock "refresh_cache-544c12e1-c0df-4fe7-b50c-a6cc7bc56a51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:27:08 np0005588920 nova_compute[226886]: 2026-01-20 15:27:08.287 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 10:27:08 np0005588920 nova_compute[226886]: 2026-01-20 15:27:08.287 226890 DEBUG nova.objects.instance [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:27:08 np0005588920 nova_compute[226886]: 2026-01-20 15:27:08.502 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:08 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:27:09 np0005588920 nova_compute[226886]: 2026-01-20 15:27:09.444 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:09.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:27:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:09.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:27:10 np0005588920 nova_compute[226886]: 2026-01-20 15:27:10.347 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Updating instance_info_cache with network_info: [{"id": "37bd0485-2332-4502-a378-fe29d66faf07", "address": "fa:16:3e:d9:38:5a", "network": {"id": "81a13790-ad63-4a4c-b2ec-2002264761fc", "bridge": "br-int", "label": "tempest-network-smoke--1617823073", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37bd0485-23", "ovs_interfaceid": "37bd0485-2332-4502-a378-fe29d66faf07", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:27:10 np0005588920 nova_compute[226886]: 2026-01-20 15:27:10.376 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Releasing lock "refresh_cache-544c12e1-c0df-4fe7-b50c-a6cc7bc56a51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:27:10 np0005588920 nova_compute[226886]: 2026-01-20 15:27:10.376 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 10:27:10 np0005588920 nova_compute[226886]: 2026-01-20 15:27:10.376 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:27:10 np0005588920 nova_compute[226886]: 2026-01-20 15:27:10.402 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:27:10 np0005588920 nova_compute[226886]: 2026-01-20 15:27:10.403 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:27:10 np0005588920 nova_compute[226886]: 2026-01-20 15:27:10.403 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:27:10 np0005588920 nova_compute[226886]: 2026-01-20 15:27:10.404 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:27:10 np0005588920 nova_compute[226886]: 2026-01-20 15:27:10.404 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:27:10 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:27:10 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:27:10 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:27:10 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1964989034' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:27:10 np0005588920 nova_compute[226886]: 2026-01-20 15:27:10.906 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.502s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:27:10 np0005588920 nova_compute[226886]: 2026-01-20 15:27:10.990 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-000000cb as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:27:10 np0005588920 nova_compute[226886]: 2026-01-20 15:27:10.991 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-000000cb as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:27:11 np0005588920 nova_compute[226886]: 2026-01-20 15:27:11.162 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:27:11 np0005588920 nova_compute[226886]: 2026-01-20 15:27:11.163 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3972MB free_disk=20.92181396484375GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:27:11 np0005588920 nova_compute[226886]: 2026-01-20 15:27:11.163 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:27:11 np0005588920 nova_compute[226886]: 2026-01-20 15:27:11.163 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:27:11 np0005588920 nova_compute[226886]: 2026-01-20 15:27:11.381 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:27:11 np0005588920 nova_compute[226886]: 2026-01-20 15:27:11.381 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:27:11 np0005588920 nova_compute[226886]: 2026-01-20 15:27:11.382 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:27:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:27:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:11.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:27:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:11.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:11 np0005588920 nova_compute[226886]: 2026-01-20 15:27:11.563 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:27:11 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:27:11 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3229525351' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:27:12 np0005588920 nova_compute[226886]: 2026-01-20 15:27:12.014 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:27:12 np0005588920 nova_compute[226886]: 2026-01-20 15:27:12.019 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:27:12 np0005588920 nova_compute[226886]: 2026-01-20 15:27:12.048 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:27:12 np0005588920 nova_compute[226886]: 2026-01-20 15:27:12.073 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:27:12 np0005588920 nova_compute[226886]: 2026-01-20 15:27:12.073 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.910s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:27:12 np0005588920 ovn_controller[133971]: 2026-01-20T15:27:12Z|00130|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d9:38:5a 10.100.0.28
Jan 20 10:27:12 np0005588920 ovn_controller[133971]: 2026-01-20T15:27:12Z|00131|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d9:38:5a 10.100.0.28
Jan 20 10:27:12 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #154. Immutable memtables: 0.
Jan 20 10:27:12 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:27:12.906658) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:27:12 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 97] Flushing memtable with next log file: 154
Jan 20 10:27:12 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922832906747, "job": 97, "event": "flush_started", "num_memtables": 1, "num_entries": 1383, "num_deletes": 251, "total_data_size": 3165251, "memory_usage": 3223584, "flush_reason": "Manual Compaction"}
Jan 20 10:27:12 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 97] Level-0 flush table #155: started
Jan 20 10:27:12 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922832920996, "cf_name": "default", "job": 97, "event": "table_file_creation", "file_number": 155, "file_size": 2078124, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 75412, "largest_seqno": 76790, "table_properties": {"data_size": 2072208, "index_size": 3246, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1605, "raw_key_size": 12612, "raw_average_key_size": 19, "raw_value_size": 2060353, "raw_average_value_size": 3254, "num_data_blocks": 145, "num_entries": 633, "num_filter_entries": 633, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768922713, "oldest_key_time": 1768922713, "file_creation_time": 1768922832, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 155, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:27:12 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 97] Flush lasted 14391 microseconds, and 5403 cpu microseconds.
Jan 20 10:27:12 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:27:12 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:27:12.921054) [db/flush_job.cc:967] [default] [JOB 97] Level-0 flush table #155: 2078124 bytes OK
Jan 20 10:27:12 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:27:12.921077) [db/memtable_list.cc:519] [default] Level-0 commit table #155 started
Jan 20 10:27:12 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:27:12.922859) [db/memtable_list.cc:722] [default] Level-0 commit table #155: memtable #1 done
Jan 20 10:27:12 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:27:12.922876) EVENT_LOG_v1 {"time_micros": 1768922832922871, "job": 97, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:27:12 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:27:12.922892) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:27:12 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 97] Try to delete WAL files size 3158775, prev total WAL file size 3158775, number of live WAL files 2.
Jan 20 10:27:12 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000151.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:27:12 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:27:12.923710) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730036353236' seq:72057594037927935, type:22 .. '7061786F730036373738' seq:0, type:0; will stop at (end)
Jan 20 10:27:12 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 98] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:27:12 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 97 Base level 0, inputs: [155(2029KB)], [153(12MB)]
Jan 20 10:27:12 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922832923745, "job": 98, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [155], "files_L6": [153], "score": -1, "input_data_size": 15005364, "oldest_snapshot_seqno": -1}
Jan 20 10:27:13 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 98] Generated table #156: 9850 keys, 13117015 bytes, temperature: kUnknown
Jan 20 10:27:13 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922833041439, "cf_name": "default", "job": 98, "event": "table_file_creation", "file_number": 156, "file_size": 13117015, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13052476, "index_size": 38826, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 24645, "raw_key_size": 259864, "raw_average_key_size": 26, "raw_value_size": 12878830, "raw_average_value_size": 1307, "num_data_blocks": 1480, "num_entries": 9850, "num_filter_entries": 9850, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768922832, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 156, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:27:13 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:27:13 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:27:13.041732) [db/compaction/compaction_job.cc:1663] [default] [JOB 98] Compacted 1@0 + 1@6 files to L6 => 13117015 bytes
Jan 20 10:27:13 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:27:13.044733) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 127.4 rd, 111.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.0, 12.3 +0.0 blob) out(12.5 +0.0 blob), read-write-amplify(13.5) write-amplify(6.3) OK, records in: 10365, records dropped: 515 output_compression: NoCompression
Jan 20 10:27:13 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:27:13.044758) EVENT_LOG_v1 {"time_micros": 1768922833044748, "job": 98, "event": "compaction_finished", "compaction_time_micros": 117807, "compaction_time_cpu_micros": 31361, "output_level": 6, "num_output_files": 1, "total_output_size": 13117015, "num_input_records": 10365, "num_output_records": 9850, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:27:13 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000155.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:27:13 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922833045527, "job": 98, "event": "table_file_deletion", "file_number": 155}
Jan 20 10:27:13 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000153.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:27:13 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922833048343, "job": 98, "event": "table_file_deletion", "file_number": 153}
Jan 20 10:27:13 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:27:12.923636) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:27:13 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:27:13.048393) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:27:13 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:27:13.048396) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:27:13 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:27:13.048398) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:27:13 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:27:13.048399) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:27:13 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:27:13.048401) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:27:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:13.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:13 np0005588920 nova_compute[226886]: 2026-01-20 15:27:13.504 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:27:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:13.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:27:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:27:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/523948789' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:27:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:27:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/523948789' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:27:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:27:14 np0005588920 nova_compute[226886]: 2026-01-20 15:27:14.422 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:27:14 np0005588920 nova_compute[226886]: 2026-01-20 15:27:14.446 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:15.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:27:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:15.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:27:15 np0005588920 nova_compute[226886]: 2026-01-20 15:27:15.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:27:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:27:16.489 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:27:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:27:16.490 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:27:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:27:16.490 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:27:16 np0005588920 nova_compute[226886]: 2026-01-20 15:27:16.722 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:27:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:17.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:17.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:17 np0005588920 nova_compute[226886]: 2026-01-20 15:27:17.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:27:18 np0005588920 podman[303797]: 2026-01-20 15:27:18.040375088 +0000 UTC m=+0.122309062 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 20 10:27:18 np0005588920 nova_compute[226886]: 2026-01-20 15:27:18.508 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:18 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:27:19 np0005588920 nova_compute[226886]: 2026-01-20 15:27:19.449 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:19.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:19.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:19 np0005588920 nova_compute[226886]: 2026-01-20 15:27:19.727 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:27:20 np0005588920 nova_compute[226886]: 2026-01-20 15:27:20.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:27:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:21.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:21.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:21 np0005588920 nova_compute[226886]: 2026-01-20 15:27:21.721 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:27:22 np0005588920 nova_compute[226886]: 2026-01-20 15:27:22.808 226890 DEBUG oslo_concurrency.lockutils [None req-51cb304a-e1d5-4787-9702-06a4f0244180 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "544c12e1-c0df-4fe7-b50c-a6cc7bc56a51" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:27:22 np0005588920 nova_compute[226886]: 2026-01-20 15:27:22.808 226890 DEBUG oslo_concurrency.lockutils [None req-51cb304a-e1d5-4787-9702-06a4f0244180 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "544c12e1-c0df-4fe7-b50c-a6cc7bc56a51" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:27:22 np0005588920 nova_compute[226886]: 2026-01-20 15:27:22.809 226890 DEBUG oslo_concurrency.lockutils [None req-51cb304a-e1d5-4787-9702-06a4f0244180 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "544c12e1-c0df-4fe7-b50c-a6cc7bc56a51-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:27:22 np0005588920 nova_compute[226886]: 2026-01-20 15:27:22.809 226890 DEBUG oslo_concurrency.lockutils [None req-51cb304a-e1d5-4787-9702-06a4f0244180 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "544c12e1-c0df-4fe7-b50c-a6cc7bc56a51-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:27:22 np0005588920 nova_compute[226886]: 2026-01-20 15:27:22.809 226890 DEBUG oslo_concurrency.lockutils [None req-51cb304a-e1d5-4787-9702-06a4f0244180 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "544c12e1-c0df-4fe7-b50c-a6cc7bc56a51-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:27:22 np0005588920 nova_compute[226886]: 2026-01-20 15:27:22.811 226890 INFO nova.compute.manager [None req-51cb304a-e1d5-4787-9702-06a4f0244180 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Terminating instance#033[00m
Jan 20 10:27:22 np0005588920 nova_compute[226886]: 2026-01-20 15:27:22.812 226890 DEBUG nova.compute.manager [None req-51cb304a-e1d5-4787-9702-06a4f0244180 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:27:22 np0005588920 kernel: tap37bd0485-23 (unregistering): left promiscuous mode
Jan 20 10:27:22 np0005588920 NetworkManager[49076]: <info>  [1768922842.8721] device (tap37bd0485-23): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:27:22 np0005588920 ovn_controller[133971]: 2026-01-20T15:27:22Z|00946|binding|INFO|Releasing lport 37bd0485-2332-4502-a378-fe29d66faf07 from this chassis (sb_readonly=0)
Jan 20 10:27:22 np0005588920 ovn_controller[133971]: 2026-01-20T15:27:22Z|00947|binding|INFO|Setting lport 37bd0485-2332-4502-a378-fe29d66faf07 down in Southbound
Jan 20 10:27:22 np0005588920 nova_compute[226886]: 2026-01-20 15:27:22.925 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:22 np0005588920 ovn_controller[133971]: 2026-01-20T15:27:22Z|00948|binding|INFO|Removing iface tap37bd0485-23 ovn-installed in OVS
Jan 20 10:27:22 np0005588920 nova_compute[226886]: 2026-01-20 15:27:22.928 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:27:22.934 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d9:38:5a 10.100.0.28'], port_security=['fa:16:3e:d9:38:5a 10.100.0.28'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.28/28', 'neutron:device_id': '544c12e1-c0df-4fe7-b50c-a6cc7bc56a51', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-81a13790-ad63-4a4c-b2ec-2002264761fc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3168f57421fb49bfb94b85daedd1fe7d', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a020ed12-71c4-4c9b-893f-46ee2c801b03', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fbbbd1dd-5c1d-4952-9676-2ad19dc96404, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=37bd0485-2332-4502-a378-fe29d66faf07) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:27:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:27:22.935 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 37bd0485-2332-4502-a378-fe29d66faf07 in datapath 81a13790-ad63-4a4c-b2ec-2002264761fc unbound from our chassis#033[00m
Jan 20 10:27:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:27:22.936 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 81a13790-ad63-4a4c-b2ec-2002264761fc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:27:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:27:22.938 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[3c094a3d-0b3f-45cd-a14e-8a8a7161cd97]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:27:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:27:22.939 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-81a13790-ad63-4a4c-b2ec-2002264761fc namespace which is not needed anymore#033[00m
Jan 20 10:27:22 np0005588920 nova_compute[226886]: 2026-01-20 15:27:22.941 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:22 np0005588920 systemd[1]: machine-qemu\x2d97\x2dinstance\x2d000000cb.scope: Deactivated successfully.
Jan 20 10:27:22 np0005588920 systemd[1]: machine-qemu\x2d97\x2dinstance\x2d000000cb.scope: Consumed 13.472s CPU time.
Jan 20 10:27:22 np0005588920 systemd-machined[196121]: Machine qemu-97-instance-000000cb terminated.
Jan 20 10:27:23 np0005588920 nova_compute[226886]: 2026-01-20 15:27:23.048 226890 INFO nova.virt.libvirt.driver [-] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Instance destroyed successfully.#033[00m
Jan 20 10:27:23 np0005588920 nova_compute[226886]: 2026-01-20 15:27:23.048 226890 DEBUG nova.objects.instance [None req-51cb304a-e1d5-4787-9702-06a4f0244180 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lazy-loading 'resources' on Instance uuid 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:27:23 np0005588920 nova_compute[226886]: 2026-01-20 15:27:23.074 226890 DEBUG nova.virt.libvirt.vif [None req-51cb304a-e1d5-4787-9702-06a4f0244180 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:26:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-297427244',display_name='tempest-TestNetworkBasicOps-server-297427244',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-297427244',id=203,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBImYL0mCTNXW9ftBLYa85Wvk+l0iQ+nYUVyh0yE4uBC6ByE8kDB1WkMLdWBDGHH4oB7LsNBDEEmeX4CksGmjggSnyUHHyEpNaCOGpF4SYQ2i//PkmIvcz6OlB4/jRmkfZw==',key_name='tempest-TestNetworkBasicOps-1921746038',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:26:59Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3168f57421fb49bfb94b85daedd1fe7d',ramdisk_id='',reservation_id='r-3va01c3q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-807695970',owner_user_name='tempest-TestNetworkBasicOps-807695970-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:26:59Z,user_data=None,user_id='5338aa65dc0e4326a66ce79053787f14',uuid=544c12e1-c0df-4fe7-b50c-a6cc7bc56a51,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "37bd0485-2332-4502-a378-fe29d66faf07", "address": "fa:16:3e:d9:38:5a", "network": {"id": "81a13790-ad63-4a4c-b2ec-2002264761fc", "bridge": "br-int", "label": "tempest-network-smoke--1617823073", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37bd0485-23", "ovs_interfaceid": "37bd0485-2332-4502-a378-fe29d66faf07", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:27:23 np0005588920 nova_compute[226886]: 2026-01-20 15:27:23.075 226890 DEBUG nova.network.os_vif_util [None req-51cb304a-e1d5-4787-9702-06a4f0244180 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converting VIF {"id": "37bd0485-2332-4502-a378-fe29d66faf07", "address": "fa:16:3e:d9:38:5a", "network": {"id": "81a13790-ad63-4a4c-b2ec-2002264761fc", "bridge": "br-int", "label": "tempest-network-smoke--1617823073", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37bd0485-23", "ovs_interfaceid": "37bd0485-2332-4502-a378-fe29d66faf07", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:27:23 np0005588920 nova_compute[226886]: 2026-01-20 15:27:23.077 226890 DEBUG nova.network.os_vif_util [None req-51cb304a-e1d5-4787-9702-06a4f0244180 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d9:38:5a,bridge_name='br-int',has_traffic_filtering=True,id=37bd0485-2332-4502-a378-fe29d66faf07,network=Network(81a13790-ad63-4a4c-b2ec-2002264761fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37bd0485-23') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:27:23 np0005588920 nova_compute[226886]: 2026-01-20 15:27:23.078 226890 DEBUG os_vif [None req-51cb304a-e1d5-4787-9702-06a4f0244180 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d9:38:5a,bridge_name='br-int',has_traffic_filtering=True,id=37bd0485-2332-4502-a378-fe29d66faf07,network=Network(81a13790-ad63-4a4c-b2ec-2002264761fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37bd0485-23') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:27:23 np0005588920 nova_compute[226886]: 2026-01-20 15:27:23.080 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:23 np0005588920 nova_compute[226886]: 2026-01-20 15:27:23.081 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap37bd0485-23, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:27:23 np0005588920 nova_compute[226886]: 2026-01-20 15:27:23.083 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:23 np0005588920 nova_compute[226886]: 2026-01-20 15:27:23.084 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:23 np0005588920 nova_compute[226886]: 2026-01-20 15:27:23.087 226890 INFO os_vif [None req-51cb304a-e1d5-4787-9702-06a4f0244180 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d9:38:5a,bridge_name='br-int',has_traffic_filtering=True,id=37bd0485-2332-4502-a378-fe29d66faf07,network=Network(81a13790-ad63-4a4c-b2ec-2002264761fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37bd0485-23')#033[00m
Jan 20 10:27:23 np0005588920 neutron-haproxy-ovnmeta-81a13790-ad63-4a4c-b2ec-2002264761fc[303551]: [NOTICE]   (303555) : haproxy version is 2.8.14-c23fe91
Jan 20 10:27:23 np0005588920 neutron-haproxy-ovnmeta-81a13790-ad63-4a4c-b2ec-2002264761fc[303551]: [NOTICE]   (303555) : path to executable is /usr/sbin/haproxy
Jan 20 10:27:23 np0005588920 neutron-haproxy-ovnmeta-81a13790-ad63-4a4c-b2ec-2002264761fc[303551]: [WARNING]  (303555) : Exiting Master process...
Jan 20 10:27:23 np0005588920 neutron-haproxy-ovnmeta-81a13790-ad63-4a4c-b2ec-2002264761fc[303551]: [ALERT]    (303555) : Current worker (303557) exited with code 143 (Terminated)
Jan 20 10:27:23 np0005588920 neutron-haproxy-ovnmeta-81a13790-ad63-4a4c-b2ec-2002264761fc[303551]: [WARNING]  (303555) : All workers exited. Exiting... (0)
Jan 20 10:27:23 np0005588920 systemd[1]: libpod-72b0f2972fd0c539e6acb980f7d9d492e2bfdb32a3cf3002a264a4c1cb8aac9d.scope: Deactivated successfully.
Jan 20 10:27:23 np0005588920 podman[303847]: 2026-01-20 15:27:23.141470475 +0000 UTC m=+0.117762583 container died 72b0f2972fd0c539e6acb980f7d9d492e2bfdb32a3cf3002a264a4c1cb8aac9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-81a13790-ad63-4a4c-b2ec-2002264761fc, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 20 10:27:23 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-72b0f2972fd0c539e6acb980f7d9d492e2bfdb32a3cf3002a264a4c1cb8aac9d-userdata-shm.mount: Deactivated successfully.
Jan 20 10:27:23 np0005588920 systemd[1]: var-lib-containers-storage-overlay-ff6a786459fe06a498eb8f2c5190c27396da5494f60fe26c4969576902ee0d9b-merged.mount: Deactivated successfully.
Jan 20 10:27:23 np0005588920 podman[303847]: 2026-01-20 15:27:23.186585295 +0000 UTC m=+0.162877393 container cleanup 72b0f2972fd0c539e6acb980f7d9d492e2bfdb32a3cf3002a264a4c1cb8aac9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-81a13790-ad63-4a4c-b2ec-2002264761fc, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 20 10:27:23 np0005588920 systemd[1]: libpod-conmon-72b0f2972fd0c539e6acb980f7d9d492e2bfdb32a3cf3002a264a4c1cb8aac9d.scope: Deactivated successfully.
Jan 20 10:27:23 np0005588920 podman[303908]: 2026-01-20 15:27:23.298424488 +0000 UTC m=+0.087364200 container remove 72b0f2972fd0c539e6acb980f7d9d492e2bfdb32a3cf3002a264a4c1cb8aac9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-81a13790-ad63-4a4c-b2ec-2002264761fc, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 20 10:27:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:27:23.303 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2f420d1c-3319-4cde-9828-73b2da767d78]: (4, ('Tue Jan 20 03:27:23 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-81a13790-ad63-4a4c-b2ec-2002264761fc (72b0f2972fd0c539e6acb980f7d9d492e2bfdb32a3cf3002a264a4c1cb8aac9d)\n72b0f2972fd0c539e6acb980f7d9d492e2bfdb32a3cf3002a264a4c1cb8aac9d\nTue Jan 20 03:27:23 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-81a13790-ad63-4a4c-b2ec-2002264761fc (72b0f2972fd0c539e6acb980f7d9d492e2bfdb32a3cf3002a264a4c1cb8aac9d)\n72b0f2972fd0c539e6acb980f7d9d492e2bfdb32a3cf3002a264a4c1cb8aac9d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:27:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:27:23.307 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5ce56a20-13cb-4bc4-949b-b84216587d81]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:27:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:27:23.308 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap81a13790-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:27:23 np0005588920 nova_compute[226886]: 2026-01-20 15:27:23.310 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:23 np0005588920 kernel: tap81a13790-a0: left promiscuous mode
Jan 20 10:27:23 np0005588920 nova_compute[226886]: 2026-01-20 15:27:23.312 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:27:23.315 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[deb14a59-86c4-4cd1-ad67-006b0e77deaa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:27:23 np0005588920 nova_compute[226886]: 2026-01-20 15:27:23.323 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:27:23.335 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2ff3848f-7ea7-42c4-8886-fd1386ddf9f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:27:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:27:23.336 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[26c884f6-b8bf-4da1-b0ad-1c030f93345c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:27:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:27:23.353 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e60438fc-0c79-4620-b2d6-1bcee4986b03]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 785903, 'reachable_time': 42227, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 303923, 'error': None, 'target': 'ovnmeta-81a13790-ad63-4a4c-b2ec-2002264761fc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:27:23 np0005588920 systemd[1]: run-netns-ovnmeta\x2d81a13790\x2dad63\x2d4a4c\x2db2ec\x2d2002264761fc.mount: Deactivated successfully.
Jan 20 10:27:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:27:23.357 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-81a13790-ad63-4a4c-b2ec-2002264761fc deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:27:23 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:27:23.357 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[9e638275-a9dc-474e-a288-1cc4c543af7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:27:23 np0005588920 nova_compute[226886]: 2026-01-20 15:27:23.510 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:23.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:23.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:23 np0005588920 nova_compute[226886]: 2026-01-20 15:27:23.598 226890 DEBUG nova.compute.manager [req-8c7b021a-9609-4329-ab8f-8796165bb372 req-adfcdd89-32c7-4b37-8256-1a59e18196c1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Received event network-vif-unplugged-37bd0485-2332-4502-a378-fe29d66faf07 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:27:23 np0005588920 nova_compute[226886]: 2026-01-20 15:27:23.598 226890 DEBUG oslo_concurrency.lockutils [req-8c7b021a-9609-4329-ab8f-8796165bb372 req-adfcdd89-32c7-4b37-8256-1a59e18196c1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "544c12e1-c0df-4fe7-b50c-a6cc7bc56a51-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:27:23 np0005588920 nova_compute[226886]: 2026-01-20 15:27:23.598 226890 DEBUG oslo_concurrency.lockutils [req-8c7b021a-9609-4329-ab8f-8796165bb372 req-adfcdd89-32c7-4b37-8256-1a59e18196c1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "544c12e1-c0df-4fe7-b50c-a6cc7bc56a51-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:27:23 np0005588920 nova_compute[226886]: 2026-01-20 15:27:23.599 226890 DEBUG oslo_concurrency.lockutils [req-8c7b021a-9609-4329-ab8f-8796165bb372 req-adfcdd89-32c7-4b37-8256-1a59e18196c1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "544c12e1-c0df-4fe7-b50c-a6cc7bc56a51-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:27:23 np0005588920 nova_compute[226886]: 2026-01-20 15:27:23.599 226890 DEBUG nova.compute.manager [req-8c7b021a-9609-4329-ab8f-8796165bb372 req-adfcdd89-32c7-4b37-8256-1a59e18196c1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] No waiting events found dispatching network-vif-unplugged-37bd0485-2332-4502-a378-fe29d66faf07 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:27:23 np0005588920 nova_compute[226886]: 2026-01-20 15:27:23.599 226890 DEBUG nova.compute.manager [req-8c7b021a-9609-4329-ab8f-8796165bb372 req-adfcdd89-32c7-4b37-8256-1a59e18196c1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Received event network-vif-unplugged-37bd0485-2332-4502-a378-fe29d66faf07 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:27:23 np0005588920 nova_compute[226886]: 2026-01-20 15:27:23.718 226890 INFO nova.virt.libvirt.driver [None req-51cb304a-e1d5-4787-9702-06a4f0244180 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Deleting instance files /var/lib/nova/instances/544c12e1-c0df-4fe7-b50c-a6cc7bc56a51_del#033[00m
Jan 20 10:27:23 np0005588920 nova_compute[226886]: 2026-01-20 15:27:23.719 226890 INFO nova.virt.libvirt.driver [None req-51cb304a-e1d5-4787-9702-06a4f0244180 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Deletion of /var/lib/nova/instances/544c12e1-c0df-4fe7-b50c-a6cc7bc56a51_del complete#033[00m
Jan 20 10:27:23 np0005588920 nova_compute[226886]: 2026-01-20 15:27:23.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:27:23 np0005588920 nova_compute[226886]: 2026-01-20 15:27:23.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:27:23 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:27:23 np0005588920 nova_compute[226886]: 2026-01-20 15:27:23.791 226890 INFO nova.compute.manager [None req-51cb304a-e1d5-4787-9702-06a4f0244180 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Took 0.98 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:27:23 np0005588920 nova_compute[226886]: 2026-01-20 15:27:23.791 226890 DEBUG oslo.service.loopingcall [None req-51cb304a-e1d5-4787-9702-06a4f0244180 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:27:23 np0005588920 nova_compute[226886]: 2026-01-20 15:27:23.791 226890 DEBUG nova.compute.manager [-] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:27:23 np0005588920 nova_compute[226886]: 2026-01-20 15:27:23.792 226890 DEBUG nova.network.neutron [-] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:27:24 np0005588920 nova_compute[226886]: 2026-01-20 15:27:24.815 226890 DEBUG nova.network.neutron [-] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:27:24 np0005588920 nova_compute[226886]: 2026-01-20 15:27:24.829 226890 INFO nova.compute.manager [-] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Took 1.04 seconds to deallocate network for instance.#033[00m
Jan 20 10:27:24 np0005588920 nova_compute[226886]: 2026-01-20 15:27:24.885 226890 DEBUG oslo_concurrency.lockutils [None req-51cb304a-e1d5-4787-9702-06a4f0244180 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:27:24 np0005588920 nova_compute[226886]: 2026-01-20 15:27:24.885 226890 DEBUG oslo_concurrency.lockutils [None req-51cb304a-e1d5-4787-9702-06a4f0244180 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:27:24 np0005588920 nova_compute[226886]: 2026-01-20 15:27:24.948 226890 DEBUG nova.compute.manager [req-d4f7656b-e9ea-4e2b-a887-8828996ebaf6 req-1c6360e0-ecf0-4153-8c8b-2ae124dfb1eb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Received event network-vif-deleted-37bd0485-2332-4502-a378-fe29d66faf07 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:27:24 np0005588920 nova_compute[226886]: 2026-01-20 15:27:24.997 226890 DEBUG oslo_concurrency.processutils [None req-51cb304a-e1d5-4787-9702-06a4f0244180 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:27:25 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:27:25 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3885040948' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:27:25 np0005588920 nova_compute[226886]: 2026-01-20 15:27:25.473 226890 DEBUG oslo_concurrency.processutils [None req-51cb304a-e1d5-4787-9702-06a4f0244180 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:27:25 np0005588920 nova_compute[226886]: 2026-01-20 15:27:25.478 226890 DEBUG nova.compute.provider_tree [None req-51cb304a-e1d5-4787-9702-06a4f0244180 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:27:25 np0005588920 nova_compute[226886]: 2026-01-20 15:27:25.493 226890 DEBUG nova.scheduler.client.report [None req-51cb304a-e1d5-4787-9702-06a4f0244180 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:27:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:27:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:25.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:27:25 np0005588920 nova_compute[226886]: 2026-01-20 15:27:25.524 226890 DEBUG oslo_concurrency.lockutils [None req-51cb304a-e1d5-4787-9702-06a4f0244180 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.639s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:27:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:25.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:25 np0005588920 nova_compute[226886]: 2026-01-20 15:27:25.620 226890 INFO nova.scheduler.client.report [None req-51cb304a-e1d5-4787-9702-06a4f0244180 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Deleted allocations for instance 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51#033[00m
Jan 20 10:27:25 np0005588920 nova_compute[226886]: 2026-01-20 15:27:25.741 226890 DEBUG nova.compute.manager [req-e833f86b-2380-4ca2-ac8a-f55d5e92a525 req-f1751b60-da6e-43a3-a4b3-50ac5f76943c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Received event network-vif-plugged-37bd0485-2332-4502-a378-fe29d66faf07 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:27:25 np0005588920 nova_compute[226886]: 2026-01-20 15:27:25.742 226890 DEBUG oslo_concurrency.lockutils [req-e833f86b-2380-4ca2-ac8a-f55d5e92a525 req-f1751b60-da6e-43a3-a4b3-50ac5f76943c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "544c12e1-c0df-4fe7-b50c-a6cc7bc56a51-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:27:25 np0005588920 nova_compute[226886]: 2026-01-20 15:27:25.742 226890 DEBUG oslo_concurrency.lockutils [req-e833f86b-2380-4ca2-ac8a-f55d5e92a525 req-f1751b60-da6e-43a3-a4b3-50ac5f76943c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "544c12e1-c0df-4fe7-b50c-a6cc7bc56a51-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:27:25 np0005588920 nova_compute[226886]: 2026-01-20 15:27:25.742 226890 DEBUG oslo_concurrency.lockutils [req-e833f86b-2380-4ca2-ac8a-f55d5e92a525 req-f1751b60-da6e-43a3-a4b3-50ac5f76943c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "544c12e1-c0df-4fe7-b50c-a6cc7bc56a51-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:27:25 np0005588920 nova_compute[226886]: 2026-01-20 15:27:25.742 226890 DEBUG nova.compute.manager [req-e833f86b-2380-4ca2-ac8a-f55d5e92a525 req-f1751b60-da6e-43a3-a4b3-50ac5f76943c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] No waiting events found dispatching network-vif-plugged-37bd0485-2332-4502-a378-fe29d66faf07 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:27:25 np0005588920 nova_compute[226886]: 2026-01-20 15:27:25.743 226890 WARNING nova.compute.manager [req-e833f86b-2380-4ca2-ac8a-f55d5e92a525 req-f1751b60-da6e-43a3-a4b3-50ac5f76943c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Received unexpected event network-vif-plugged-37bd0485-2332-4502-a378-fe29d66faf07 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 10:27:25 np0005588920 nova_compute[226886]: 2026-01-20 15:27:25.787 226890 DEBUG oslo_concurrency.lockutils [None req-51cb304a-e1d5-4787-9702-06a4f0244180 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "544c12e1-c0df-4fe7-b50c-a6cc7bc56a51" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.978s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:27:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:27.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:27.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:27 np0005588920 podman[303947]: 2026-01-20 15:27:27.954028773 +0000 UTC m=+0.043878097 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:27:28 np0005588920 nova_compute[226886]: 2026-01-20 15:27:28.106 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:28 np0005588920 nova_compute[226886]: 2026-01-20 15:27:28.513 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:28 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:27:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:29.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:29.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:27:30.090 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=73, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=72) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:27:30 np0005588920 nova_compute[226886]: 2026-01-20 15:27:30.090 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:27:30.091 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:27:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:31.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:31.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:27:32.093 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '73'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:27:33 np0005588920 nova_compute[226886]: 2026-01-20 15:27:33.108 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:33 np0005588920 nova_compute[226886]: 2026-01-20 15:27:33.346 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:33 np0005588920 nova_compute[226886]: 2026-01-20 15:27:33.441 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:33 np0005588920 nova_compute[226886]: 2026-01-20 15:27:33.515 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:33.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:33.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:33 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:27:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:35.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:27:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:35.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:27:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:37.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:27:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:37.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:27:38 np0005588920 nova_compute[226886]: 2026-01-20 15:27:38.047 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768922843.0456681, 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:27:38 np0005588920 nova_compute[226886]: 2026-01-20 15:27:38.047 226890 INFO nova.compute.manager [-] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:27:38 np0005588920 nova_compute[226886]: 2026-01-20 15:27:38.071 226890 DEBUG nova.compute.manager [None req-acb7a491-ae11-4b07-8102-809a626234cc - - - - - -] [instance: 544c12e1-c0df-4fe7-b50c-a6cc7bc56a51] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:27:38 np0005588920 nova_compute[226886]: 2026-01-20 15:27:38.111 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:38 np0005588920 nova_compute[226886]: 2026-01-20 15:27:38.517 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:38 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:27:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:27:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:27:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:27:39 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:39.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:27:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:39.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:27:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:41.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:41.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:43 np0005588920 nova_compute[226886]: 2026-01-20 15:27:43.114 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:43 np0005588920 nova_compute[226886]: 2026-01-20 15:27:43.519 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:27:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:43.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:27:43 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:43.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:27:43 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:27:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:45.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:45.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:27:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:47.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:27:47 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:47.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:27:48 np0005588920 nova_compute[226886]: 2026-01-20 15:27:48.117 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:48 np0005588920 nova_compute[226886]: 2026-01-20 15:27:48.520 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:48 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:27:48 np0005588920 podman[303968]: 2026-01-20 15:27:48.987372215 +0000 UTC m=+0.081371580 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 20 10:27:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:27:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:27:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:49.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:27:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:27:49 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:49.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:27:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:27:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:51.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:27:51 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:51.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:27:52 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 10:27:52 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 5400.0 total, 600.0 interval#012Cumulative writes: 15K writes, 77K keys, 15K commit groups, 1.0 writes per commit group, ingest: 0.15 GB, 0.03 MB/s#012Cumulative WAL: 15K writes, 15K syncs, 1.00 writes per sync, written: 0.15 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1508 writes, 7468 keys, 1508 commit groups, 1.0 writes per commit group, ingest: 15.29 MB, 0.03 MB/s#012Interval WAL: 1508 writes, 1508 syncs, 1.00 writes per sync, written: 0.01 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     83.1      1.13              0.35        49    0.023       0      0       0.0       0.0#012  L6      1/0   12.51 MB   0.0      0.6     0.1      0.5       0.5      0.0       0.0   5.1    108.6     92.9      5.19              1.49        48    0.108    353K    26K       0.0       0.0#012 Sum      1/0   12.51 MB   0.0      0.6     0.1      0.5       0.6      0.1       0.0   6.1     89.2     91.2      6.32              1.84        97    0.065    353K    26K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   7.9     88.4     90.0      0.91              0.25        12    0.076     60K   3110       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.6     0.1      0.5       0.5      0.0       0.0   0.0    108.6     92.9      5.19              1.49        48    0.108    353K    26K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     83.2      1.13              0.35        48    0.023       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 5400.0 total, 600.0 interval#012Flush(GB): cumulative 0.092, interval 0.010#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.56 GB write, 0.11 MB/s write, 0.55 GB read, 0.10 MB/s read, 6.3 seconds#012Interval compaction: 0.08 GB write, 0.14 MB/s write, 0.08 GB read, 0.13 MB/s read, 0.9 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x564a2f9711f0#2 capacity: 304.00 MB usage: 62.48 MB table_size: 0 occupancy: 18446744073709551615 collections: 10 last_copies: 0 last_secs: 0.000374 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3576,59.92 MB,19.7111%) FilterBlock(97,977.48 KB,0.314005%) IndexBlock(97,1.60 MB,0.527126%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 20 10:27:53 np0005588920 nova_compute[226886]: 2026-01-20 15:27:53.120 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:53 np0005588920 nova_compute[226886]: 2026-01-20 15:27:53.522 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:27:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:53.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:27:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:53.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:53 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:27:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:55.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:55.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:57.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:57.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:58 np0005588920 nova_compute[226886]: 2026-01-20 15:27:58.123 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:58 np0005588920 nova_compute[226886]: 2026-01-20 15:27:58.524 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:58 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:27:58 np0005588920 podman[303996]: 2026-01-20 15:27:58.959100208 +0000 UTC m=+0.047735165 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible)
Jan 20 10:27:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:59.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:27:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:59.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:28:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:01.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:28:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:01.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:03 np0005588920 nova_compute[226886]: 2026-01-20 15:28:03.128 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:03 np0005588920 nova_compute[226886]: 2026-01-20 15:28:03.525 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:03.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:28:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:03.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:28:03 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:28:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:05.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:28:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:05.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:28:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:07.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:07 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:28:07 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/185543081' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:28:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:07.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:08 np0005588920 nova_compute[226886]: 2026-01-20 15:28:08.131 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:08 np0005588920 nova_compute[226886]: 2026-01-20 15:28:08.527 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:08 np0005588920 nova_compute[226886]: 2026-01-20 15:28:08.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:28:08 np0005588920 nova_compute[226886]: 2026-01-20 15:28:08.780 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:28:08 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:28:08 np0005588920 nova_compute[226886]: 2026-01-20 15:28:08.781 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:28:08 np0005588920 nova_compute[226886]: 2026-01-20 15:28:08.781 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:28:08 np0005588920 nova_compute[226886]: 2026-01-20 15:28:08.781 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 20 10:28:08 np0005588920 nova_compute[226886]: 2026-01-20 15:28:08.782 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:28:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:28:09 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/566218198' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:28:09 np0005588920 nova_compute[226886]: 2026-01-20 15:28:09.288 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.506s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:28:09 np0005588920 nova_compute[226886]: 2026-01-20 15:28:09.438 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 20 10:28:09 np0005588920 nova_compute[226886]: 2026-01-20 15:28:09.439 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4170MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 20 10:28:09 np0005588920 nova_compute[226886]: 2026-01-20 15:28:09.439 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:28:09 np0005588920 nova_compute[226886]: 2026-01-20 15:28:09.440 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:28:09 np0005588920 nova_compute[226886]: 2026-01-20 15:28:09.516 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 20 10:28:09 np0005588920 nova_compute[226886]: 2026-01-20 15:28:09.516 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 20 10:28:09 np0005588920 nova_compute[226886]: 2026-01-20 15:28:09.541 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:28:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:28:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:09.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:28:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:09.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:28:09 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1912177260' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:28:09 np0005588920 nova_compute[226886]: 2026-01-20 15:28:09.990 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:28:09 np0005588920 nova_compute[226886]: 2026-01-20 15:28:09.997 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 20 10:28:10 np0005588920 nova_compute[226886]: 2026-01-20 15:28:10.014 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 20 10:28:10 np0005588920 nova_compute[226886]: 2026-01-20 15:28:10.040 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 20 10:28:10 np0005588920 nova_compute[226886]: 2026-01-20 15:28:10.040 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.600s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:28:11 np0005588920 nova_compute[226886]: 2026-01-20 15:28:11.040 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:28:11 np0005588920 nova_compute[226886]: 2026-01-20 15:28:11.041 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 20 10:28:11 np0005588920 nova_compute[226886]: 2026-01-20 15:28:11.041 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 20 10:28:11 np0005588920 nova_compute[226886]: 2026-01-20 15:28:11.065 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 20 10:28:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:11.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:11.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:12 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:28:12 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:28:12 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:28:12 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:28:12 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:28:12 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:28:12 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:28:13 np0005588920 nova_compute[226886]: 2026-01-20 15:28:13.135 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:28:13 np0005588920 nova_compute[226886]: 2026-01-20 15:28:13.565 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:28:13 np0005588920 nova_compute[226886]: 2026-01-20 15:28:13.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:28:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:28:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:13.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:13.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:15.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:15.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:28:16.490 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:28:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:28:16.491 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:28:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:28:16.491 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:28:16 np0005588920 nova_compute[226886]: 2026-01-20 15:28:16.720 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:28:16 np0005588920 nova_compute[226886]: 2026-01-20 15:28:16.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:28:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:17.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:17.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:18 np0005588920 nova_compute[226886]: 2026-01-20 15:28:18.139 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:28:18 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:28:18 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:28:18 np0005588920 nova_compute[226886]: 2026-01-20 15:28:18.567 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:28:18 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:28:19 np0005588920 nova_compute[226886]: 2026-01-20 15:28:19.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:28:19 np0005588920 nova_compute[226886]: 2026-01-20 15:28:19.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:28:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:28:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:19.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:28:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:28:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:19.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:28:20 np0005588920 podman[304244]: 2026-01-20 15:28:19.999797118 +0000 UTC m=+0.088168582 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 20 10:28:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:21.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:21.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:22 np0005588920 ovn_controller[133971]: 2026-01-20T15:28:22Z|00949|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Jan 20 10:28:22 np0005588920 nova_compute[226886]: 2026-01-20 15:28:22.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:28:23 np0005588920 nova_compute[226886]: 2026-01-20 15:28:23.144 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:28:23 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 10:28:23 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 5400.1 total, 600.0 interval#012Cumulative writes: 73K writes, 294K keys, 73K commit groups, 1.0 writes per commit group, ingest: 0.30 GB, 0.06 MB/s#012Cumulative WAL: 73K writes, 27K syncs, 2.70 writes per sync, written: 0.30 GB, 0.06 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 4424 writes, 18K keys, 4424 commit groups, 1.0 writes per commit group, ingest: 21.42 MB, 0.04 MB/s#012Interval WAL: 4424 writes, 1655 syncs, 2.67 writes per sync, written: 0.02 GB, 0.04 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 20 10:28:23 np0005588920 nova_compute[226886]: 2026-01-20 15:28:23.633 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:28:23 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:28:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:23.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:23.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:25 np0005588920 nova_compute[226886]: 2026-01-20 15:28:25.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:28:25 np0005588920 nova_compute[226886]: 2026-01-20 15:28:25.726 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 20 10:28:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:25.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:25.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:28:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:27.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:28:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:27.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:28 np0005588920 nova_compute[226886]: 2026-01-20 15:28:28.148 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:28:28 np0005588920 nova_compute[226886]: 2026-01-20 15:28:28.635 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:28:28 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:28:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:29.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:29.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:29 np0005588920 podman[304270]: 2026-01-20 15:28:29.954075385 +0000 UTC m=+0.045741769 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 20 10:28:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:31.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:31.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:28:32.527 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=74, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=73) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:28:32 np0005588920 nova_compute[226886]: 2026-01-20 15:28:32.527 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:28:32.528 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:28:33 np0005588920 nova_compute[226886]: 2026-01-20 15:28:33.151 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:33 np0005588920 nova_compute[226886]: 2026-01-20 15:28:33.638 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:33 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:28:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:33.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:33.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:28:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:35.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:35 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:35.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:36 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:28:36.530 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '74'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:28:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:28:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:28:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:37.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:28:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:37 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:37.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:38 np0005588920 nova_compute[226886]: 2026-01-20 15:28:38.155 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:38 np0005588920 nova_compute[226886]: 2026-01-20 15:28:38.639 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:38 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:28:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:39.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:39.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:28:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:41.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:28:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:28:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:41.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:28:43 np0005588920 nova_compute[226886]: 2026-01-20 15:28:43.159 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:43 np0005588920 nova_compute[226886]: 2026-01-20 15:28:43.640 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:43 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:28:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:43.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:43.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:45.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:45.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:47.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:47.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:48 np0005588920 nova_compute[226886]: 2026-01-20 15:28:48.161 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:48 np0005588920 nova_compute[226886]: 2026-01-20 15:28:48.641 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:48 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:28:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:28:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:49.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:28:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:49.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:50 np0005588920 radosgw[83324]: INFO: RGWReshardLock::lock found lock on reshard.0000000001 to be held by another RGW process; skipping for now
Jan 20 10:28:50 np0005588920 radosgw[83324]: INFO: RGWReshardLock::lock found lock on reshard.0000000003 to be held by another RGW process; skipping for now
Jan 20 10:28:50 np0005588920 radosgw[83324]: INFO: RGWReshardLock::lock found lock on reshard.0000000004 to be held by another RGW process; skipping for now
Jan 20 10:28:50 np0005588920 radosgw[83324]: INFO: RGWReshardLock::lock found lock on reshard.0000000006 to be held by another RGW process; skipping for now
Jan 20 10:28:50 np0005588920 radosgw[83324]: INFO: RGWReshardLock::lock found lock on reshard.0000000008 to be held by another RGW process; skipping for now
Jan 20 10:28:50 np0005588920 radosgw[83324]: INFO: RGWReshardLock::lock found lock on reshard.0000000009 to be held by another RGW process; skipping for now
Jan 20 10:28:50 np0005588920 radosgw[83324]: INFO: RGWReshardLock::lock found lock on reshard.0000000011 to be held by another RGW process; skipping for now
Jan 20 10:28:50 np0005588920 radosgw[83324]: INFO: RGWReshardLock::lock found lock on reshard.0000000012 to be held by another RGW process; skipping for now
Jan 20 10:28:50 np0005588920 radosgw[83324]: INFO: RGWReshardLock::lock found lock on reshard.0000000014 to be held by another RGW process; skipping for now
Jan 20 10:28:50 np0005588920 radosgw[83324]: INFO: RGWReshardLock::lock found lock on reshard.0000000015 to be held by another RGW process; skipping for now
Jan 20 10:28:50 np0005588920 podman[304291]: 2026-01-20 15:28:50.996928358 +0000 UTC m=+0.083635594 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 20 10:28:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:51.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:51.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:53 np0005588920 nova_compute[226886]: 2026-01-20 15:28:53.165 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:53 np0005588920 nova_compute[226886]: 2026-01-20 15:28:53.643 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:53 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:28:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:28:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:53.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:28:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:53.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:28:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:55.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:28:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:55.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.003000085s ======
Jan 20 10:28:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:57.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000085s
Jan 20 10:28:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:28:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:57.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:28:58 np0005588920 nova_compute[226886]: 2026-01-20 15:28:58.169 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:58 np0005588920 nova_compute[226886]: 2026-01-20 15:28:58.692 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:58 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:28:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:28:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:59.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:28:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:28:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:59.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:00 np0005588920 podman[304319]: 2026-01-20 15:29:00.954139438 +0000 UTC m=+0.047545440 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:29:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:01.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:01.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:03 np0005588920 nova_compute[226886]: 2026-01-20 15:29:03.173 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:03 np0005588920 nova_compute[226886]: 2026-01-20 15:29:03.693 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:03 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:29:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:03.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:03.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:05.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:05.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:07.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:07.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:08 np0005588920 nova_compute[226886]: 2026-01-20 15:29:08.177 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:08 np0005588920 nova_compute[226886]: 2026-01-20 15:29:08.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:29:08 np0005588920 nova_compute[226886]: 2026-01-20 15:29:08.745 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:08 np0005588920 nova_compute[226886]: 2026-01-20 15:29:08.759 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:29:08 np0005588920 nova_compute[226886]: 2026-01-20 15:29:08.759 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:29:08 np0005588920 nova_compute[226886]: 2026-01-20 15:29:08.759 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:29:08 np0005588920 nova_compute[226886]: 2026-01-20 15:29:08.760 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:29:08 np0005588920 nova_compute[226886]: 2026-01-20 15:29:08.760 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:29:08 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:29:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:29:09 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4223010889' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:29:09 np0005588920 nova_compute[226886]: 2026-01-20 15:29:09.207 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:29:09 np0005588920 nova_compute[226886]: 2026-01-20 15:29:09.389 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:29:09 np0005588920 nova_compute[226886]: 2026-01-20 15:29:09.391 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4179MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:29:09 np0005588920 nova_compute[226886]: 2026-01-20 15:29:09.391 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:29:09 np0005588920 nova_compute[226886]: 2026-01-20 15:29:09.392 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:29:09 np0005588920 nova_compute[226886]: 2026-01-20 15:29:09.463 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:29:09 np0005588920 nova_compute[226886]: 2026-01-20 15:29:09.463 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:29:09 np0005588920 nova_compute[226886]: 2026-01-20 15:29:09.475 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Refreshing inventories for resource provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 20 10:29:09 np0005588920 nova_compute[226886]: 2026-01-20 15:29:09.490 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Updating ProviderTree inventory for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 20 10:29:09 np0005588920 nova_compute[226886]: 2026-01-20 15:29:09.490 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Updating inventory in ProviderTree for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 20 10:29:09 np0005588920 nova_compute[226886]: 2026-01-20 15:29:09.505 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Refreshing aggregate associations for resource provider ff38e91c-3320-4831-90ac-bcffc89ba7b6, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 20 10:29:09 np0005588920 nova_compute[226886]: 2026-01-20 15:29:09.524 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Refreshing trait associations for resource provider ff38e91c-3320-4831-90ac-bcffc89ba7b6, traits: COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_DEVICE_TAGGING,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 20 10:29:09 np0005588920 nova_compute[226886]: 2026-01-20 15:29:09.549 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:29:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:29:09 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4249767932' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:29:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:09.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:09.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:10 np0005588920 nova_compute[226886]: 2026-01-20 15:29:10.001 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:29:10 np0005588920 nova_compute[226886]: 2026-01-20 15:29:10.008 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:29:10 np0005588920 nova_compute[226886]: 2026-01-20 15:29:10.034 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:29:10 np0005588920 nova_compute[226886]: 2026-01-20 15:29:10.037 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:29:10 np0005588920 nova_compute[226886]: 2026-01-20 15:29:10.038 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.646s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:29:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:11.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:11.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:12 np0005588920 nova_compute[226886]: 2026-01-20 15:29:12.038 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:29:12 np0005588920 nova_compute[226886]: 2026-01-20 15:29:12.039 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:29:12 np0005588920 nova_compute[226886]: 2026-01-20 15:29:12.039 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:29:12 np0005588920 nova_compute[226886]: 2026-01-20 15:29:12.058 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:29:13 np0005588920 nova_compute[226886]: 2026-01-20 15:29:13.191 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:13 np0005588920 nova_compute[226886]: 2026-01-20 15:29:13.747 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:29:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:13.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:13.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:14 np0005588920 nova_compute[226886]: 2026-01-20 15:29:14.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:29:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:15.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:15.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:29:16.491 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:29:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:29:16.492 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:29:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:29:16.492 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:29:16 np0005588920 nova_compute[226886]: 2026-01-20 15:29:16.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:29:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:17.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:17.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:18 np0005588920 nova_compute[226886]: 2026-01-20 15:29:18.194 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:18 np0005588920 nova_compute[226886]: 2026-01-20 15:29:18.721 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:29:18 np0005588920 nova_compute[226886]: 2026-01-20 15:29:18.748 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:18 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:29:19 np0005588920 nova_compute[226886]: 2026-01-20 15:29:19.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:29:19 np0005588920 nova_compute[226886]: 2026-01-20 15:29:19.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:29:19 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:29:19 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:29:19 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:29:19 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:29:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:29:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:19.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:29:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:19.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:21.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:29:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:21.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:29:22 np0005588920 podman[304634]: 2026-01-20 15:29:22.006031006 +0000 UTC m=+0.084456767 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Jan 20 10:29:22 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:29:22 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:29:22 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:29:22 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:29:22 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:29:22 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:29:22 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:29:23 np0005588920 nova_compute[226886]: 2026-01-20 15:29:23.244 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:23 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #157. Immutable memtables: 0.
Jan 20 10:29:23 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:29:23.260962) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:29:23 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 99] Flushing memtable with next log file: 157
Jan 20 10:29:23 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922963261014, "job": 99, "event": "flush_started", "num_memtables": 1, "num_entries": 1552, "num_deletes": 250, "total_data_size": 3741479, "memory_usage": 3796984, "flush_reason": "Manual Compaction"}
Jan 20 10:29:23 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 99] Level-0 flush table #158: started
Jan 20 10:29:23 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922963272438, "cf_name": "default", "job": 99, "event": "table_file_creation", "file_number": 158, "file_size": 1517641, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 76795, "largest_seqno": 78342, "table_properties": {"data_size": 1512548, "index_size": 2424, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1605, "raw_key_size": 13383, "raw_average_key_size": 21, "raw_value_size": 1501453, "raw_average_value_size": 2357, "num_data_blocks": 108, "num_entries": 637, "num_filter_entries": 637, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768922833, "oldest_key_time": 1768922833, "file_creation_time": 1768922963, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 158, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:29:23 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 99] Flush lasted 11510 microseconds, and 4335 cpu microseconds.
Jan 20 10:29:23 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:29:23 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:29:23.272478) [db/flush_job.cc:967] [default] [JOB 99] Level-0 flush table #158: 1517641 bytes OK
Jan 20 10:29:23 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:29:23.272494) [db/memtable_list.cc:519] [default] Level-0 commit table #158 started
Jan 20 10:29:23 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:29:23.274894) [db/memtable_list.cc:722] [default] Level-0 commit table #158: memtable #1 done
Jan 20 10:29:23 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:29:23.274940) EVENT_LOG_v1 {"time_micros": 1768922963274931, "job": 99, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:29:23 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:29:23.274965) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:29:23 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 99] Try to delete WAL files size 3734289, prev total WAL file size 3734289, number of live WAL files 2.
Jan 20 10:29:23 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000154.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:29:23 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:29:23.275994) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740032353037' seq:72057594037927935, type:22 .. '6D6772737461740032373538' seq:0, type:0; will stop at (end)
Jan 20 10:29:23 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 100] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:29:23 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 99 Base level 0, inputs: [158(1482KB)], [156(12MB)]
Jan 20 10:29:23 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922963276031, "job": 100, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [158], "files_L6": [156], "score": -1, "input_data_size": 14634656, "oldest_snapshot_seqno": -1}
Jan 20 10:29:23 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 100] Generated table #159: 10029 keys, 11680142 bytes, temperature: kUnknown
Jan 20 10:29:23 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922963349841, "cf_name": "default", "job": 100, "event": "table_file_creation", "file_number": 159, "file_size": 11680142, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11617413, "index_size": 36568, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25093, "raw_key_size": 263783, "raw_average_key_size": 26, "raw_value_size": 11443639, "raw_average_value_size": 1141, "num_data_blocks": 1392, "num_entries": 10029, "num_filter_entries": 10029, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768922963, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 159, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:29:23 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:29:23 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:29:23.350124) [db/compaction/compaction_job.cc:1663] [default] [JOB 100] Compacted 1@0 + 1@6 files to L6 => 11680142 bytes
Jan 20 10:29:23 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:29:23.356302) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 198.1 rd, 158.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.4, 12.5 +0.0 blob) out(11.1 +0.0 blob), read-write-amplify(17.3) write-amplify(7.7) OK, records in: 10487, records dropped: 458 output_compression: NoCompression
Jan 20 10:29:23 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:29:23.356332) EVENT_LOG_v1 {"time_micros": 1768922963356318, "job": 100, "event": "compaction_finished", "compaction_time_micros": 73892, "compaction_time_cpu_micros": 28156, "output_level": 6, "num_output_files": 1, "total_output_size": 11680142, "num_input_records": 10487, "num_output_records": 10029, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:29:23 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000158.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:29:23 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922963356769, "job": 100, "event": "table_file_deletion", "file_number": 158}
Jan 20 10:29:23 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000156.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:29:23 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922963359651, "job": 100, "event": "table_file_deletion", "file_number": 156}
Jan 20 10:29:23 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:29:23.275930) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:29:23 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:29:23.359756) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:29:23 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:29:23.359763) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:29:23 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:29:23.359765) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:29:23 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:29:23.359767) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:29:23 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:29:23.359769) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:29:23 np0005588920 nova_compute[226886]: 2026-01-20 15:29:23.750 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:23 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:29:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:29:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:23.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:29:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:29:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:24.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:29:24 np0005588920 nova_compute[226886]: 2026-01-20 15:29:24.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:29:25 np0005588920 nova_compute[226886]: 2026-01-20 15:29:25.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:29:25 np0005588920 nova_compute[226886]: 2026-01-20 15:29:25.726 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:29:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:25.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:26.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:26 np0005588920 nova_compute[226886]: 2026-01-20 15:29:26.721 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:29:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:28.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:28.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:28 np0005588920 nova_compute[226886]: 2026-01-20 15:29:28.249 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:28 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:29:28 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:29:28 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:29:28 np0005588920 nova_compute[226886]: 2026-01-20 15:29:28.868 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:30.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:30.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:31 np0005588920 podman[304712]: 2026-01-20 15:29:31.953899412 +0000 UTC m=+0.046115479 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 20 10:29:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:32.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:32.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:33 np0005588920 nova_compute[226886]: 2026-01-20 15:29:33.310 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:33 np0005588920 nova_compute[226886]: 2026-01-20 15:29:33.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:29:33 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:29:33 np0005588920 nova_compute[226886]: 2026-01-20 15:29:33.869 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:34.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:29:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:34.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:29:34 np0005588920 nova_compute[226886]: 2026-01-20 15:29:34.748 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:29:34 np0005588920 nova_compute[226886]: 2026-01-20 15:29:34.748 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 20 10:29:34 np0005588920 nova_compute[226886]: 2026-01-20 15:29:34.766 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 20 10:29:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:36.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:36.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:38.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:38.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:38 np0005588920 nova_compute[226886]: 2026-01-20 15:29:38.313 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:38 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:29:38 np0005588920 nova_compute[226886]: 2026-01-20 15:29:38.871 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:40.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:40.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:29:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:42.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:29:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:42.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:43 np0005588920 nova_compute[226886]: 2026-01-20 15:29:43.317 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:43 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:29:43 np0005588920 nova_compute[226886]: 2026-01-20 15:29:43.925 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:44.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:29:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:44.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:29:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:29:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:46.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:29:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:46.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:48.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:48.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:48 np0005588920 nova_compute[226886]: 2026-01-20 15:29:48.320 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:48 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:29:48 np0005588920 nova_compute[226886]: 2026-01-20 15:29:48.926 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:49 np0005588920 nova_compute[226886]: 2026-01-20 15:29:49.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:29:49 np0005588920 nova_compute[226886]: 2026-01-20 15:29:49.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 20 10:29:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:50.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:50.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:29:50.439 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=75, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=74) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:29:50 np0005588920 nova_compute[226886]: 2026-01-20 15:29:50.440 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:29:50.440 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:29:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:52.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:52.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:52 np0005588920 podman[304733]: 2026-01-20 15:29:52.988786328 +0000 UTC m=+0.079800175 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 20 10:29:53 np0005588920 nova_compute[226886]: 2026-01-20 15:29:53.323 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:53 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:29:53 np0005588920 nova_compute[226886]: 2026-01-20 15:29:53.928 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:29:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:54.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:29:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:54.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:29:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:56.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:29:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:56.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:58.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:29:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:58.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:58 np0005588920 nova_compute[226886]: 2026-01-20 15:29:58.327 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:58 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:29:58.441 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '75'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:29:58 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:29:58 np0005588920 nova_compute[226886]: 2026-01-20 15:29:58.929 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:30:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:30:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:00.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:30:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:00.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:00 np0005588920 ceph-mon[77148]: overall HEALTH_OK
Jan 20 10:30:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:02.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:02.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:02 np0005588920 podman[304759]: 2026-01-20 15:30:02.958174664 +0000 UTC m=+0.050129103 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:30:03 np0005588920 nova_compute[226886]: 2026-01-20 15:30:03.331 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:30:03 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:30:03 np0005588920 nova_compute[226886]: 2026-01-20 15:30:03.930 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:30:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:04.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:30:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:04.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:30:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:06.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:30:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:06.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:30:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:08.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:08.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:08 np0005588920 nova_compute[226886]: 2026-01-20 15:30:08.334 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:30:08 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:30:08 np0005588920 nova_compute[226886]: 2026-01-20 15:30:08.931 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:30:09 np0005588920 nova_compute[226886]: 2026-01-20 15:30:09.748 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:30:09 np0005588920 nova_compute[226886]: 2026-01-20 15:30:09.778 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:30:09 np0005588920 nova_compute[226886]: 2026-01-20 15:30:09.778 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:30:09 np0005588920 nova_compute[226886]: 2026-01-20 15:30:09.778 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:30:09 np0005588920 nova_compute[226886]: 2026-01-20 15:30:09.778 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 20 10:30:09 np0005588920 nova_compute[226886]: 2026-01-20 15:30:09.779 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:30:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:10.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:10.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:10 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:30:10 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3785469698' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:30:10 np0005588920 nova_compute[226886]: 2026-01-20 15:30:10.207 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:30:10 np0005588920 nova_compute[226886]: 2026-01-20 15:30:10.358 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 20 10:30:10 np0005588920 nova_compute[226886]: 2026-01-20 15:30:10.360 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4186MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 20 10:30:10 np0005588920 nova_compute[226886]: 2026-01-20 15:30:10.360 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:30:10 np0005588920 nova_compute[226886]: 2026-01-20 15:30:10.360 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:30:10 np0005588920 nova_compute[226886]: 2026-01-20 15:30:10.478 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 20 10:30:10 np0005588920 nova_compute[226886]: 2026-01-20 15:30:10.479 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 20 10:30:10 np0005588920 nova_compute[226886]: 2026-01-20 15:30:10.508 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:30:10 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:30:10 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1052730623' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:30:10 np0005588920 nova_compute[226886]: 2026-01-20 15:30:10.928 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.421s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:30:10 np0005588920 nova_compute[226886]: 2026-01-20 15:30:10.936 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 20 10:30:10 np0005588920 nova_compute[226886]: 2026-01-20 15:30:10.960 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 20 10:30:10 np0005588920 nova_compute[226886]: 2026-01-20 15:30:10.961 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 20 10:30:10 np0005588920 nova_compute[226886]: 2026-01-20 15:30:10.961 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.601s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:30:11 np0005588920 nova_compute[226886]: 2026-01-20 15:30:11.940 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:30:11 np0005588920 nova_compute[226886]: 2026-01-20 15:30:11.941 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 20 10:30:11 np0005588920 nova_compute[226886]: 2026-01-20 15:30:11.941 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 20 10:30:11 np0005588920 nova_compute[226886]: 2026-01-20 15:30:11.964 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 20 10:30:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:12.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:30:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:12.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:30:13 np0005588920 nova_compute[226886]: 2026-01-20 15:30:13.338 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:30:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:30:13 np0005588920 nova_compute[226886]: 2026-01-20 15:30:13.933 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:30:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:14.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:14.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.002000057s ======
Jan 20 10:30:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:16.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Jan 20 10:30:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:16.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:30:16.493 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:30:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:30:16.494 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:30:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:30:16.494 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:30:16 np0005588920 nova_compute[226886]: 2026-01-20 15:30:16.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:30:17 np0005588920 nova_compute[226886]: 2026-01-20 15:30:17.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:30:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:18.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:30:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:18.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:30:18 np0005588920 nova_compute[226886]: 2026-01-20 15:30:18.343 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:30:18 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:30:18 np0005588920 nova_compute[226886]: 2026-01-20 15:30:18.935 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:30:19 np0005588920 nova_compute[226886]: 2026-01-20 15:30:19.721 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:30:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:20.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:20.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:20 np0005588920 nova_compute[226886]: 2026-01-20 15:30:20.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:30:21 np0005588920 nova_compute[226886]: 2026-01-20 15:30:21.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:30:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:30:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:22.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:30:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:22.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:23 np0005588920 nova_compute[226886]: 2026-01-20 15:30:23.371 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:30:23 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:30:23 np0005588920 nova_compute[226886]: 2026-01-20 15:30:23.936 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:30:23 np0005588920 podman[304824]: 2026-01-20 15:30:23.981771169 +0000 UTC m=+0.072426885 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 20 10:30:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:24.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:24.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:25 np0005588920 nova_compute[226886]: 2026-01-20 15:30:25.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:30:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:26.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:26.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:26 np0005588920 nova_compute[226886]: 2026-01-20 15:30:26.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:30:26 np0005588920 nova_compute[226886]: 2026-01-20 15:30:26.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:30:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:30:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:28.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:30:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:28.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:28 np0005588920 nova_compute[226886]: 2026-01-20 15:30:28.376 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:30:28 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:30:28 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:30:28 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:30:28 np0005588920 nova_compute[226886]: 2026-01-20 15:30:28.937 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:30:29 np0005588920 podman[305242]: 2026-01-20 15:30:29.743752788 +0000 UTC m=+0.020723699 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 10:30:30 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:30:30 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:30:30 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 20 10:30:30 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 20 10:30:30 np0005588920 podman[305242]: 2026-01-20 15:30:30.01974821 +0000 UTC m=+0.296719091 container create 5260fb463fa76e1a19e0bb0a935c9046e352f5eb64786e085624162ce9e9eca9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=awesome_bassi, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 20 10:30:30 np0005588920 systemd[1]: Started libpod-conmon-5260fb463fa76e1a19e0bb0a935c9046e352f5eb64786e085624162ce9e9eca9.scope.
Jan 20 10:30:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:30.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:30 np0005588920 systemd[1]: Started libcrun container.
Jan 20 10:30:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:30.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:30 np0005588920 podman[305242]: 2026-01-20 15:30:30.477014976 +0000 UTC m=+0.753985957 container init 5260fb463fa76e1a19e0bb0a935c9046e352f5eb64786e085624162ce9e9eca9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=awesome_bassi, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 20 10:30:30 np0005588920 podman[305242]: 2026-01-20 15:30:30.485592499 +0000 UTC m=+0.762563380 container start 5260fb463fa76e1a19e0bb0a935c9046e352f5eb64786e085624162ce9e9eca9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=awesome_bassi, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_REF=reef)
Jan 20 10:30:30 np0005588920 awesome_bassi[305258]: 167 167
Jan 20 10:30:30 np0005588920 systemd[1]: libpod-5260fb463fa76e1a19e0bb0a935c9046e352f5eb64786e085624162ce9e9eca9.scope: Deactivated successfully.
Jan 20 10:30:30 np0005588920 podman[305242]: 2026-01-20 15:30:30.513046308 +0000 UTC m=+0.790017289 container attach 5260fb463fa76e1a19e0bb0a935c9046e352f5eb64786e085624162ce9e9eca9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=awesome_bassi, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 20 10:30:30 np0005588920 podman[305242]: 2026-01-20 15:30:30.514625913 +0000 UTC m=+0.791596804 container died 5260fb463fa76e1a19e0bb0a935c9046e352f5eb64786e085624162ce9e9eca9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=awesome_bassi, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 20 10:30:30 np0005588920 systemd[1]: var-lib-containers-storage-overlay-c89c9140ef0075a3aaefc26b896d6ebe07adbd39d89d6526d5a49db482ad4638-merged.mount: Deactivated successfully.
Jan 20 10:30:30 np0005588920 podman[305242]: 2026-01-20 15:30:30.849940387 +0000 UTC m=+1.126911268 container remove 5260fb463fa76e1a19e0bb0a935c9046e352f5eb64786e085624162ce9e9eca9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=awesome_bassi, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, ceph=True, org.label-schema.build-date=20250507, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2)
Jan 20 10:30:30 np0005588920 systemd[1]: libpod-conmon-5260fb463fa76e1a19e0bb0a935c9046e352f5eb64786e085624162ce9e9eca9.scope: Deactivated successfully.
Jan 20 10:30:31 np0005588920 podman[305282]: 2026-01-20 15:30:31.019577251 +0000 UTC m=+0.022938542 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 10:30:31 np0005588920 podman[305282]: 2026-01-20 15:30:31.269945265 +0000 UTC m=+0.273306546 container create 58607c4007337f7cc7c579adeb662d5a290fd379d6c7802dbb0e44eae2d741a8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=kind_mahavira, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 20 10:30:31 np0005588920 systemd[1]: Started libpod-conmon-58607c4007337f7cc7c579adeb662d5a290fd379d6c7802dbb0e44eae2d741a8.scope.
Jan 20 10:30:31 np0005588920 systemd[1]: Started libcrun container.
Jan 20 10:30:31 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d2b53e76deb6246cd616bb8d596a64ee1b7c07e483f21dd132ab6643452933fc/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 20 10:30:31 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d2b53e76deb6246cd616bb8d596a64ee1b7c07e483f21dd132ab6643452933fc/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 20 10:30:31 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d2b53e76deb6246cd616bb8d596a64ee1b7c07e483f21dd132ab6643452933fc/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 20 10:30:31 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d2b53e76deb6246cd616bb8d596a64ee1b7c07e483f21dd132ab6643452933fc/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 20 10:30:31 np0005588920 podman[305282]: 2026-01-20 15:30:31.764614242 +0000 UTC m=+0.767975533 container init 58607c4007337f7cc7c579adeb662d5a290fd379d6c7802dbb0e44eae2d741a8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=kind_mahavira, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, io.buildah.version=1.39.3, OSD_FLAVOR=default)
Jan 20 10:30:31 np0005588920 podman[305282]: 2026-01-20 15:30:31.771030234 +0000 UTC m=+0.774391535 container start 58607c4007337f7cc7c579adeb662d5a290fd379d6c7802dbb0e44eae2d741a8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=kind_mahavira, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2)
Jan 20 10:30:31 np0005588920 podman[305282]: 2026-01-20 15:30:31.924725225 +0000 UTC m=+0.928086496 container attach 58607c4007337f7cc7c579adeb662d5a290fd379d6c7802dbb0e44eae2d741a8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=kind_mahavira, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 20 10:30:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:32.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:32.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:32 np0005588920 kind_mahavira[305298]: [
Jan 20 10:30:32 np0005588920 kind_mahavira[305298]:    {
Jan 20 10:30:32 np0005588920 kind_mahavira[305298]:        "available": false,
Jan 20 10:30:32 np0005588920 kind_mahavira[305298]:        "ceph_device": false,
Jan 20 10:30:32 np0005588920 kind_mahavira[305298]:        "device_id": "QEMU_DVD-ROM_QM00001",
Jan 20 10:30:32 np0005588920 kind_mahavira[305298]:        "lsm_data": {},
Jan 20 10:30:32 np0005588920 kind_mahavira[305298]:        "lvs": [],
Jan 20 10:30:32 np0005588920 kind_mahavira[305298]:        "path": "/dev/sr0",
Jan 20 10:30:32 np0005588920 kind_mahavira[305298]:        "rejected_reasons": [
Jan 20 10:30:32 np0005588920 kind_mahavira[305298]:            "Insufficient space (<5GB)",
Jan 20 10:30:32 np0005588920 kind_mahavira[305298]:            "Has a FileSystem"
Jan 20 10:30:32 np0005588920 kind_mahavira[305298]:        ],
Jan 20 10:30:32 np0005588920 kind_mahavira[305298]:        "sys_api": {
Jan 20 10:30:32 np0005588920 kind_mahavira[305298]:            "actuators": null,
Jan 20 10:30:32 np0005588920 kind_mahavira[305298]:            "device_nodes": "sr0",
Jan 20 10:30:32 np0005588920 kind_mahavira[305298]:            "devname": "sr0",
Jan 20 10:30:32 np0005588920 kind_mahavira[305298]:            "human_readable_size": "482.00 KB",
Jan 20 10:30:32 np0005588920 kind_mahavira[305298]:            "id_bus": "ata",
Jan 20 10:30:32 np0005588920 kind_mahavira[305298]:            "model": "QEMU DVD-ROM",
Jan 20 10:30:32 np0005588920 kind_mahavira[305298]:            "nr_requests": "2",
Jan 20 10:30:32 np0005588920 kind_mahavira[305298]:            "parent": "/dev/sr0",
Jan 20 10:30:32 np0005588920 kind_mahavira[305298]:            "partitions": {},
Jan 20 10:30:32 np0005588920 kind_mahavira[305298]:            "path": "/dev/sr0",
Jan 20 10:30:32 np0005588920 kind_mahavira[305298]:            "removable": "1",
Jan 20 10:30:32 np0005588920 kind_mahavira[305298]:            "rev": "2.5+",
Jan 20 10:30:32 np0005588920 kind_mahavira[305298]:            "ro": "0",
Jan 20 10:30:32 np0005588920 kind_mahavira[305298]:            "rotational": "1",
Jan 20 10:30:32 np0005588920 kind_mahavira[305298]:            "sas_address": "",
Jan 20 10:30:32 np0005588920 kind_mahavira[305298]:            "sas_device_handle": "",
Jan 20 10:30:32 np0005588920 kind_mahavira[305298]:            "scheduler_mode": "mq-deadline",
Jan 20 10:30:32 np0005588920 kind_mahavira[305298]:            "sectors": 0,
Jan 20 10:30:32 np0005588920 kind_mahavira[305298]:            "sectorsize": "2048",
Jan 20 10:30:32 np0005588920 kind_mahavira[305298]:            "size": 493568.0,
Jan 20 10:30:32 np0005588920 kind_mahavira[305298]:            "support_discard": "2048",
Jan 20 10:30:32 np0005588920 kind_mahavira[305298]:            "type": "disk",
Jan 20 10:30:32 np0005588920 kind_mahavira[305298]:            "vendor": "QEMU"
Jan 20 10:30:32 np0005588920 kind_mahavira[305298]:        }
Jan 20 10:30:32 np0005588920 kind_mahavira[305298]:    }
Jan 20 10:30:32 np0005588920 kind_mahavira[305298]: ]
Jan 20 10:30:32 np0005588920 systemd[1]: libpod-58607c4007337f7cc7c579adeb662d5a290fd379d6c7802dbb0e44eae2d741a8.scope: Deactivated successfully.
Jan 20 10:30:32 np0005588920 podman[305282]: 2026-01-20 15:30:32.917663501 +0000 UTC m=+1.921024782 container died 58607c4007337f7cc7c579adeb662d5a290fd379d6c7802dbb0e44eae2d741a8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=kind_mahavira, io.buildah.version=1.39.3, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 20 10:30:32 np0005588920 systemd[1]: libpod-58607c4007337f7cc7c579adeb662d5a290fd379d6c7802dbb0e44eae2d741a8.scope: Consumed 1.157s CPU time.
Jan 20 10:30:33 np0005588920 systemd[1]: var-lib-containers-storage-overlay-d2b53e76deb6246cd616bb8d596a64ee1b7c07e483f21dd132ab6643452933fc-merged.mount: Deactivated successfully.
Jan 20 10:30:33 np0005588920 podman[305282]: 2026-01-20 15:30:33.251302717 +0000 UTC m=+2.254663988 container remove 58607c4007337f7cc7c579adeb662d5a290fd379d6c7802dbb0e44eae2d741a8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=kind_mahavira, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, ceph=True)
Jan 20 10:30:33 np0005588920 systemd[1]: libpod-conmon-58607c4007337f7cc7c579adeb662d5a290fd379d6c7802dbb0e44eae2d741a8.scope: Deactivated successfully.
Jan 20 10:30:33 np0005588920 podman[306568]: 2026-01-20 15:30:33.293253927 +0000 UTC m=+0.152574310 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 20 10:30:33 np0005588920 nova_compute[226886]: 2026-01-20 15:30:33.381 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:30:33 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:30:33 np0005588920 nova_compute[226886]: 2026-01-20 15:30:33.939 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:30:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:34.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:34.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:34 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:30:34 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:30:34 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:30:34 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:30:34 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:30:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:36.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:30:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:36.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:30:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:38.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:38.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:38 np0005588920 nova_compute[226886]: 2026-01-20 15:30:38.385 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:30:38 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:30:38 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:30:38 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:30:38 np0005588920 nova_compute[226886]: 2026-01-20 15:30:38.941 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:30:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:40.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:40.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:42.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:42.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:43 np0005588920 nova_compute[226886]: 2026-01-20 15:30:42.998 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:30:43 np0005588920 nova_compute[226886]: 2026-01-20 15:30:43.389 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:30:43 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:30:43 np0005588920 nova_compute[226886]: 2026-01-20 15:30:43.987 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:30:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:44.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:30:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:44.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:30:44 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #160. Immutable memtables: 0.
Jan 20 10:30:44 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:30:44.737140) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:30:44 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 101] Flushing memtable with next log file: 160
Jan 20 10:30:44 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923044737234, "job": 101, "event": "flush_started", "num_memtables": 1, "num_entries": 1070, "num_deletes": 251, "total_data_size": 2334552, "memory_usage": 2373920, "flush_reason": "Manual Compaction"}
Jan 20 10:30:44 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 101] Level-0 flush table #161: started
Jan 20 10:30:44 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923044763598, "cf_name": "default", "job": 101, "event": "table_file_creation", "file_number": 161, "file_size": 1521192, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 78347, "largest_seqno": 79412, "table_properties": {"data_size": 1516323, "index_size": 2456, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1349, "raw_key_size": 10746, "raw_average_key_size": 20, "raw_value_size": 1506527, "raw_average_value_size": 2805, "num_data_blocks": 107, "num_entries": 537, "num_filter_entries": 537, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768922964, "oldest_key_time": 1768922964, "file_creation_time": 1768923044, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 161, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:30:44 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 101] Flush lasted 26503 microseconds, and 4854 cpu microseconds.
Jan 20 10:30:44 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:30:44 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:30:44.763650) [db/flush_job.cc:967] [default] [JOB 101] Level-0 flush table #161: 1521192 bytes OK
Jan 20 10:30:44 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:30:44.763673) [db/memtable_list.cc:519] [default] Level-0 commit table #161 started
Jan 20 10:30:44 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:30:44.764995) [db/memtable_list.cc:722] [default] Level-0 commit table #161: memtable #1 done
Jan 20 10:30:44 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:30:44.765011) EVENT_LOG_v1 {"time_micros": 1768923044765005, "job": 101, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:30:44 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:30:44.765030) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:30:44 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 101] Try to delete WAL files size 2329283, prev total WAL file size 2329283, number of live WAL files 2.
Jan 20 10:30:44 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000157.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:30:44 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:30:44.765801) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730036373737' seq:72057594037927935, type:22 .. '7061786F730037303239' seq:0, type:0; will stop at (end)
Jan 20 10:30:44 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 102] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:30:44 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 101 Base level 0, inputs: [161(1485KB)], [159(11MB)]
Jan 20 10:30:44 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923044765865, "job": 102, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [161], "files_L6": [159], "score": -1, "input_data_size": 13201334, "oldest_snapshot_seqno": -1}
Jan 20 10:30:44 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 102] Generated table #162: 10047 keys, 11280749 bytes, temperature: kUnknown
Jan 20 10:30:44 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923044838453, "cf_name": "default", "job": 102, "event": "table_file_creation", "file_number": 162, "file_size": 11280749, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11218195, "index_size": 36329, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25157, "raw_key_size": 264851, "raw_average_key_size": 26, "raw_value_size": 11044427, "raw_average_value_size": 1099, "num_data_blocks": 1377, "num_entries": 10047, "num_filter_entries": 10047, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768923044, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 162, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:30:44 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:30:44 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:30:44.838726) [db/compaction/compaction_job.cc:1663] [default] [JOB 102] Compacted 1@0 + 1@6 files to L6 => 11280749 bytes
Jan 20 10:30:44 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:30:44.840106) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 181.6 rd, 155.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.5, 11.1 +0.0 blob) out(10.8 +0.0 blob), read-write-amplify(16.1) write-amplify(7.4) OK, records in: 10566, records dropped: 519 output_compression: NoCompression
Jan 20 10:30:44 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:30:44.840122) EVENT_LOG_v1 {"time_micros": 1768923044840115, "job": 102, "event": "compaction_finished", "compaction_time_micros": 72680, "compaction_time_cpu_micros": 25415, "output_level": 6, "num_output_files": 1, "total_output_size": 11280749, "num_input_records": 10566, "num_output_records": 10047, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:30:44 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000161.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:30:44 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923044840446, "job": 102, "event": "table_file_deletion", "file_number": 161}
Jan 20 10:30:44 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000159.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:30:44 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923044842605, "job": 102, "event": "table_file_deletion", "file_number": 159}
Jan 20 10:30:44 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:30:44.765729) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:30:44 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:30:44.842673) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:30:44 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:30:44.842677) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:30:44 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:30:44.842679) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:30:44 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:30:44.842680) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:30:44 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:30:44.842681) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:30:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:30:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:46.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:30:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:46.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:48.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:48.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:48 np0005588920 nova_compute[226886]: 2026-01-20 15:30:48.392 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:30:48 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:30:48 np0005588920 nova_compute[226886]: 2026-01-20 15:30:48.989 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:30:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:50.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:30:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:50.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:30:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:30:50.869 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=76, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=75) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:30:50 np0005588920 nova_compute[226886]: 2026-01-20 15:30:50.869 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:30:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:30:50.870 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:30:51 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:30:51.871 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '76'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:30:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:52.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:52.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:53 np0005588920 nova_compute[226886]: 2026-01-20 15:30:53.396 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:30:53 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:30:53 np0005588920 nova_compute[226886]: 2026-01-20 15:30:53.991 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:30:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:54.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:54.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:55 np0005588920 podman[306639]: 2026-01-20 15:30:55.04272797 +0000 UTC m=+0.116766504 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:30:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:56.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:56.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:30:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:58.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:30:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:30:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:58.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:58 np0005588920 nova_compute[226886]: 2026-01-20 15:30:58.445 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:30:58 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:30:58 np0005588920 nova_compute[226886]: 2026-01-20 15:30:58.992 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:00.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:00.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:02.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:02.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:03 np0005588920 nova_compute[226886]: 2026-01-20 15:31:03.449 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:03 np0005588920 podman[306667]: 2026-01-20 15:31:03.531957676 +0000 UTC m=+0.049210207 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 20 10:31:03 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:31:03 np0005588920 nova_compute[226886]: 2026-01-20 15:31:03.996 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:04.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:31:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:04.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:31:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:06.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:06.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:08.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:08.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:08 np0005588920 nova_compute[226886]: 2026-01-20 15:31:08.453 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:08 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:31:09 np0005588920 nova_compute[226886]: 2026-01-20 15:31:09.000 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:10.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:10.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:10 np0005588920 nova_compute[226886]: 2026-01-20 15:31:10.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:31:10 np0005588920 nova_compute[226886]: 2026-01-20 15:31:10.763 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:31:10 np0005588920 nova_compute[226886]: 2026-01-20 15:31:10.763 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:31:10 np0005588920 nova_compute[226886]: 2026-01-20 15:31:10.763 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:31:10 np0005588920 nova_compute[226886]: 2026-01-20 15:31:10.764 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:31:10 np0005588920 nova_compute[226886]: 2026-01-20 15:31:10.764 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:31:11 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:31:11 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/165161131' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:31:11 np0005588920 nova_compute[226886]: 2026-01-20 15:31:11.186 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.422s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:31:11 np0005588920 nova_compute[226886]: 2026-01-20 15:31:11.320 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:31:11 np0005588920 nova_compute[226886]: 2026-01-20 15:31:11.321 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4166MB free_disk=20.92182159423828GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:31:11 np0005588920 nova_compute[226886]: 2026-01-20 15:31:11.321 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:31:11 np0005588920 nova_compute[226886]: 2026-01-20 15:31:11.322 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:31:11 np0005588920 nova_compute[226886]: 2026-01-20 15:31:11.387 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:31:11 np0005588920 nova_compute[226886]: 2026-01-20 15:31:11.387 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:31:11 np0005588920 nova_compute[226886]: 2026-01-20 15:31:11.412 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:31:11 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:31:11 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3566983214' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:31:11 np0005588920 nova_compute[226886]: 2026-01-20 15:31:11.829 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.416s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:31:11 np0005588920 nova_compute[226886]: 2026-01-20 15:31:11.834 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:31:11 np0005588920 nova_compute[226886]: 2026-01-20 15:31:11.865 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:31:11 np0005588920 nova_compute[226886]: 2026-01-20 15:31:11.866 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:31:11 np0005588920 nova_compute[226886]: 2026-01-20 15:31:11.867 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.545s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:31:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:31:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:12.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:31:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:12.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:13 np0005588920 nova_compute[226886]: 2026-01-20 15:31:13.454 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:31:13 np0005588920 nova_compute[226886]: 2026-01-20 15:31:13.867 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:31:13 np0005588920 nova_compute[226886]: 2026-01-20 15:31:13.867 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:31:13 np0005588920 nova_compute[226886]: 2026-01-20 15:31:13.868 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:31:13 np0005588920 nova_compute[226886]: 2026-01-20 15:31:13.890 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:31:14 np0005588920 nova_compute[226886]: 2026-01-20 15:31:14.001 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:14.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:14.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:31:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:16.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:31:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:16.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:31:16.494 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:31:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:31:16.494 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:31:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:31:16.495 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:31:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:18.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:18.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:18 np0005588920 nova_compute[226886]: 2026-01-20 15:31:18.500 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:18 np0005588920 nova_compute[226886]: 2026-01-20 15:31:18.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:31:18 np0005588920 nova_compute[226886]: 2026-01-20 15:31:18.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:31:18 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:31:19 np0005588920 nova_compute[226886]: 2026-01-20 15:31:19.004 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:31:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:20.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:31:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:20.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:20 np0005588920 nova_compute[226886]: 2026-01-20 15:31:20.721 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:31:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:22.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:31:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:22.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:31:22 np0005588920 nova_compute[226886]: 2026-01-20 15:31:22.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:31:23 np0005588920 nova_compute[226886]: 2026-01-20 15:31:23.505 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:23 np0005588920 nova_compute[226886]: 2026-01-20 15:31:23.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:31:23 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:31:24 np0005588920 nova_compute[226886]: 2026-01-20 15:31:24.007 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:24.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:24.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:26 np0005588920 podman[306732]: 2026-01-20 15:31:26.015075316 +0000 UTC m=+0.104901437 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, 
io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:31:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:26.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:26.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:27 np0005588920 nova_compute[226886]: 2026-01-20 15:31:27.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:31:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:28.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:28.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:28 np0005588920 nova_compute[226886]: 2026-01-20 15:31:28.508 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:28 np0005588920 nova_compute[226886]: 2026-01-20 15:31:28.721 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:31:28 np0005588920 nova_compute[226886]: 2026-01-20 15:31:28.738 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:31:28 np0005588920 nova_compute[226886]: 2026-01-20 15:31:28.739 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:31:28 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:31:29 np0005588920 nova_compute[226886]: 2026-01-20 15:31:29.008 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:30.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:30.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:31:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:32.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:31:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:31:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:32.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:31:33 np0005588920 nova_compute[226886]: 2026-01-20 15:31:33.512 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:33 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:31:33 np0005588920 podman[306758]: 2026-01-20 15:31:33.959013748 +0000 UTC m=+0.047673412 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 20 10:31:34 np0005588920 nova_compute[226886]: 2026-01-20 15:31:34.010 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:34.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:34.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:31:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:36.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:31:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:31:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:36.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:31:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:31:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:38.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:31:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:38.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:38 np0005588920 nova_compute[226886]: 2026-01-20 15:31:38.515 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:38 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:31:39 np0005588920 nova_compute[226886]: 2026-01-20 15:31:39.012 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:40.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:40.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:40 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 20 10:31:40 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:31:40 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:31:40 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:31:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:31:41.687 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=77, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=76) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:31:41 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:31:41.687 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:31:41 np0005588920 nova_compute[226886]: 2026-01-20 15:31:41.745 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:42.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:31:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:42.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:31:43 np0005588920 nova_compute[226886]: 2026-01-20 15:31:43.519 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:43 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:31:44 np0005588920 nova_compute[226886]: 2026-01-20 15:31:44.067 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:31:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:44.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:31:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:31:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:44.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:31:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:46.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:46.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:46 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:31:46 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:31:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:48.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:31:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:48.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:31:48 np0005588920 nova_compute[226886]: 2026-01-20 15:31:48.522 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:48 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:31:49 np0005588920 nova_compute[226886]: 2026-01-20 15:31:49.069 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:49 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:31:49.690 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '77'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:31:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:50.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:50.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:52.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:31:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:52.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:31:53 np0005588920 nova_compute[226886]: 2026-01-20 15:31:53.525 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:53 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:31:54 np0005588920 nova_compute[226886]: 2026-01-20 15:31:54.071 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:54.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:54.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:56.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:31:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:56.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:31:56 np0005588920 podman[306963]: 2026-01-20 15:31:56.98314342 +0000 UTC m=+0.073595749 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 20 10:31:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:31:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:58.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:31:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:31:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:31:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:58.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:31:58 np0005588920 nova_compute[226886]: 2026-01-20 15:31:58.574 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:58 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:31:59 np0005588920 nova_compute[226886]: 2026-01-20 15:31:59.075 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:32:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:32:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:00.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:32:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:00.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:02.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:02.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:03 np0005588920 nova_compute[226886]: 2026-01-20 15:32:03.578 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:32:03 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:32:04 np0005588920 nova_compute[226886]: 2026-01-20 15:32:04.076 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:32:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:04.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:32:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:04.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:32:04 np0005588920 podman[306989]: 2026-01-20 15:32:04.958414603 +0000 UTC m=+0.046820779 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 20 10:32:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:06.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:06.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:08.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:08.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:08 np0005588920 nova_compute[226886]: 2026-01-20 15:32:08.582 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:32:08 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:32:09 np0005588920 nova_compute[226886]: 2026-01-20 15:32:09.133 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:32:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:10.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:10.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:12.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:12.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:12 np0005588920 nova_compute[226886]: 2026-01-20 15:32:12.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:32:12 np0005588920 nova_compute[226886]: 2026-01-20 15:32:12.753 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:32:12 np0005588920 nova_compute[226886]: 2026-01-20 15:32:12.754 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:32:12 np0005588920 nova_compute[226886]: 2026-01-20 15:32:12.754 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:32:12 np0005588920 nova_compute[226886]: 2026-01-20 15:32:12.754 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:32:12 np0005588920 nova_compute[226886]: 2026-01-20 15:32:12.754 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:32:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:32:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/779206332' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:32:13 np0005588920 nova_compute[226886]: 2026-01-20 15:32:13.179 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.425s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:32:13 np0005588920 nova_compute[226886]: 2026-01-20 15:32:13.328 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:32:13 np0005588920 nova_compute[226886]: 2026-01-20 15:32:13.329 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4171MB free_disk=20.96738052368164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:32:13 np0005588920 nova_compute[226886]: 2026-01-20 15:32:13.329 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:32:13 np0005588920 nova_compute[226886]: 2026-01-20 15:32:13.330 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:32:13 np0005588920 nova_compute[226886]: 2026-01-20 15:32:13.434 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:32:13 np0005588920 nova_compute[226886]: 2026-01-20 15:32:13.434 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:32:13 np0005588920 nova_compute[226886]: 2026-01-20 15:32:13.515 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:32:13 np0005588920 nova_compute[226886]: 2026-01-20 15:32:13.585 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:32:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:32:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2050766060' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:32:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:32:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2050766060' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:32:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:32:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:32:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1817760916' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:32:13 np0005588920 nova_compute[226886]: 2026-01-20 15:32:13.936 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.421s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:32:13 np0005588920 nova_compute[226886]: 2026-01-20 15:32:13.942 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:32:13 np0005588920 nova_compute[226886]: 2026-01-20 15:32:13.968 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:32:13 np0005588920 nova_compute[226886]: 2026-01-20 15:32:13.970 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:32:13 np0005588920 nova_compute[226886]: 2026-01-20 15:32:13.970 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.640s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:32:14 np0005588920 nova_compute[226886]: 2026-01-20 15:32:14.135 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:32:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:14.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:14.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:15 np0005588920 nova_compute[226886]: 2026-01-20 15:32:15.970 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:32:15 np0005588920 nova_compute[226886]: 2026-01-20 15:32:15.971 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:32:15 np0005588920 nova_compute[226886]: 2026-01-20 15:32:15.971 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:32:15 np0005588920 nova_compute[226886]: 2026-01-20 15:32:15.996 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:32:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:16.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:16.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:32:16.496 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:32:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:32:16.496 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:32:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:32:16.496 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:32:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:18.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:32:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:18.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:32:18 np0005588920 nova_compute[226886]: 2026-01-20 15:32:18.588 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:32:18 np0005588920 nova_compute[226886]: 2026-01-20 15:32:18.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:32:18 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:32:19 np0005588920 nova_compute[226886]: 2026-01-20 15:32:19.136 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:32:19 np0005588920 nova_compute[226886]: 2026-01-20 15:32:19.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:32:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:20.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:20.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:20 np0005588920 nova_compute[226886]: 2026-01-20 15:32:20.721 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:32:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:22.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:22.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:22 np0005588920 nova_compute[226886]: 2026-01-20 15:32:22.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:32:23 np0005588920 nova_compute[226886]: 2026-01-20 15:32:23.636 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:32:23 np0005588920 nova_compute[226886]: 2026-01-20 15:32:23.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:32:23 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:32:24 np0005588920 nova_compute[226886]: 2026-01-20 15:32:24.139 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:32:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:32:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:24.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:32:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:24.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:26.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:26.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:27 np0005588920 podman[307054]: 2026-01-20 15:32:27.985989753 +0000 UTC m=+0.077169341 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Jan 20 10:32:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:32:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:28.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:32:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:28.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:28 np0005588920 nova_compute[226886]: 2026-01-20 15:32:28.638 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:32:28 np0005588920 nova_compute[226886]: 2026-01-20 15:32:28.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:32:28 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:32:29 np0005588920 nova_compute[226886]: 2026-01-20 15:32:29.140 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:32:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:32:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:30.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:32:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:30.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:30 np0005588920 nova_compute[226886]: 2026-01-20 15:32:30.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:32:30 np0005588920 nova_compute[226886]: 2026-01-20 15:32:30.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:32:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:32:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:32.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:32:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:32:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:32.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:32:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:32:32.510 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=78, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=77) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:32:32 np0005588920 nova_compute[226886]: 2026-01-20 15:32:32.510 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:32:32 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:32:32.511 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:32:33 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #163. Immutable memtables: 0.
Jan 20 10:32:33 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:32:33.313758) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:32:33 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 103] Flushing memtable with next log file: 163
Jan 20 10:32:33 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923153313814, "job": 103, "event": "flush_started", "num_memtables": 1, "num_entries": 1258, "num_deletes": 256, "total_data_size": 2757196, "memory_usage": 2790384, "flush_reason": "Manual Compaction"}
Jan 20 10:32:33 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 103] Level-0 flush table #164: started
Jan 20 10:32:33 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923153330594, "cf_name": "default", "job": 103, "event": "table_file_creation", "file_number": 164, "file_size": 1819587, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 79417, "largest_seqno": 80670, "table_properties": {"data_size": 1814121, "index_size": 2861, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 11617, "raw_average_key_size": 19, "raw_value_size": 1803194, "raw_average_value_size": 3025, "num_data_blocks": 127, "num_entries": 596, "num_filter_entries": 596, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768923045, "oldest_key_time": 1768923045, "file_creation_time": 1768923153, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 164, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:32:33 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 103] Flush lasted 16873 microseconds, and 4467 cpu microseconds.
Jan 20 10:32:33 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:32:33 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:32:33.330635) [db/flush_job.cc:967] [default] [JOB 103] Level-0 flush table #164: 1819587 bytes OK
Jan 20 10:32:33 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:32:33.330651) [db/memtable_list.cc:519] [default] Level-0 commit table #164 started
Jan 20 10:32:33 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:32:33.332559) [db/memtable_list.cc:722] [default] Level-0 commit table #164: memtable #1 done
Jan 20 10:32:33 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:32:33.332600) EVENT_LOG_v1 {"time_micros": 1768923153332592, "job": 103, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:32:33 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:32:33.332623) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:32:33 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 103] Try to delete WAL files size 2751218, prev total WAL file size 2751218, number of live WAL files 2.
Jan 20 10:32:33 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000160.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:32:33 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:32:33.333477) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033303232' seq:72057594037927935, type:22 .. '6C6F676D0033323734' seq:0, type:0; will stop at (end)
Jan 20 10:32:33 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 104] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:32:33 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 103 Base level 0, inputs: [164(1776KB)], [162(10MB)]
Jan 20 10:32:33 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923153333533, "job": 104, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [164], "files_L6": [162], "score": -1, "input_data_size": 13100336, "oldest_snapshot_seqno": -1}
Jan 20 10:32:33 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 104] Generated table #165: 10118 keys, 12969066 bytes, temperature: kUnknown
Jan 20 10:32:33 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923153482584, "cf_name": "default", "job": 104, "event": "table_file_creation", "file_number": 165, "file_size": 12969066, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12903989, "index_size": 38660, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25349, "raw_key_size": 267289, "raw_average_key_size": 26, "raw_value_size": 12726957, "raw_average_value_size": 1257, "num_data_blocks": 1473, "num_entries": 10118, "num_filter_entries": 10118, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768923153, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 165, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:32:33 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:32:33 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:32:33.482833) [db/compaction/compaction_job.cc:1663] [default] [JOB 104] Compacted 1@0 + 1@6 files to L6 => 12969066 bytes
Jan 20 10:32:33 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:32:33.485552) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 87.9 rd, 87.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 10.8 +0.0 blob) out(12.4 +0.0 blob), read-write-amplify(14.3) write-amplify(7.1) OK, records in: 10643, records dropped: 525 output_compression: NoCompression
Jan 20 10:32:33 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:32:33.485568) EVENT_LOG_v1 {"time_micros": 1768923153485560, "job": 104, "event": "compaction_finished", "compaction_time_micros": 149118, "compaction_time_cpu_micros": 30860, "output_level": 6, "num_output_files": 1, "total_output_size": 12969066, "num_input_records": 10643, "num_output_records": 10118, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:32:33 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000164.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:32:33 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923153485947, "job": 104, "event": "table_file_deletion", "file_number": 164}
Jan 20 10:32:33 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000162.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:32:33 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923153487893, "job": 104, "event": "table_file_deletion", "file_number": 162}
Jan 20 10:32:33 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:32:33.333394) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:32:33 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:32:33.487934) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:32:33 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:32:33.487938) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:32:33 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:32:33.487939) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:32:33 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:32:33.487940) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:32:33 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:32:33.487942) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:32:33 np0005588920 nova_compute[226886]: 2026-01-20 15:32:33.640 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:32:33 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:32:34 np0005588920 nova_compute[226886]: 2026-01-20 15:32:34.139 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:32:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:32:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:34.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:32:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:34.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:35 np0005588920 podman[307082]: 2026-01-20 15:32:35.988014985 +0000 UTC m=+0.073093935 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:32:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:36.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:36.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:38.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:38.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:38 np0005588920 nova_compute[226886]: 2026-01-20 15:32:38.644 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:32:38 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:32:39 np0005588920 nova_compute[226886]: 2026-01-20 15:32:39.140 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:32:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:40.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:40.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:32:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:42.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:32:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:42.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:42 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:32:42.513 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '78'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:32:43 np0005588920 nova_compute[226886]: 2026-01-20 15:32:43.689 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:32:43 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:32:44 np0005588920 nova_compute[226886]: 2026-01-20 15:32:44.142 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:32:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:44.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:44.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:32:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:46.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:32:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:46.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:47 np0005588920 podman[307276]: 2026-01-20 15:32:47.230604019 +0000 UTC m=+0.072468817 container exec 6d4eaf8659f13902200468a4ae46f22a0824452b9353ff1491898f5d3b0130c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mon-compute-2, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3)
Jan 20 10:32:47 np0005588920 podman[307276]: 2026-01-20 15:32:47.329068623 +0000 UTC m=+0.170933411 container exec_died 6d4eaf8659f13902200468a4ae46f22a0824452b9353ff1491898f5d3b0130c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mon-compute-2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 20 10:32:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:48.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:48.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:48 np0005588920 nova_compute[226886]: 2026-01-20 15:32:48.693 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:32:48 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:32:48 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:32:48 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:32:48 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:32:48 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:32:48 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:32:48 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:32:49 np0005588920 nova_compute[226886]: 2026-01-20 15:32:49.144 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:32:49 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:32:49 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:32:49 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:32:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:32:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:50.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:32:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:50.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:32:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:52.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:32:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:52.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:53 np0005588920 nova_compute[226886]: 2026-01-20 15:32:53.696 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:32:53 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:32:54 np0005588920 nova_compute[226886]: 2026-01-20 15:32:54.145 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:32:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:54.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:54.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:32:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:56.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:32:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:56.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:56 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:32:56 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:32:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:32:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:58.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:32:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:32:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:32:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:58.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:32:58 np0005588920 nova_compute[226886]: 2026-01-20 15:32:58.700 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:32:58 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:32:58 np0005588920 podman[307576]: 2026-01-20 15:32:58.990378689 +0000 UTC m=+0.074145824 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, 
org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Jan 20 10:32:59 np0005588920 nova_compute[226886]: 2026-01-20 15:32:59.147 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:33:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:00.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:00.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:33:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:02.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:33:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:02.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:03 np0005588920 nova_compute[226886]: 2026-01-20 15:33:03.703 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:33:03 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:33:04 np0005588920 nova_compute[226886]: 2026-01-20 15:33:04.150 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:33:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:33:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:04.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:33:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:33:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:04.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:33:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:06.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:06.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:06 np0005588920 podman[307602]: 2026-01-20 15:33:06.957459649 +0000 UTC m=+0.049998359 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 20 10:33:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:08.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:08.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:08 np0005588920 nova_compute[226886]: 2026-01-20 15:33:08.707 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:33:08 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:33:09 np0005588920 nova_compute[226886]: 2026-01-20 15:33:09.152 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:33:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:10.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:10.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:12.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:12.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:13 np0005588920 nova_compute[226886]: 2026-01-20 15:33:13.711 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:33:13 np0005588920 nova_compute[226886]: 2026-01-20 15:33:13.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:33:13 np0005588920 nova_compute[226886]: 2026-01-20 15:33:13.762 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:33:13 np0005588920 nova_compute[226886]: 2026-01-20 15:33:13.763 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:33:13 np0005588920 nova_compute[226886]: 2026-01-20 15:33:13.763 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:33:13 np0005588920 nova_compute[226886]: 2026-01-20 15:33:13.763 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:33:13 np0005588920 nova_compute[226886]: 2026-01-20 15:33:13.764 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:33:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:33:14 np0005588920 nova_compute[226886]: 2026-01-20 15:33:14.154 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:33:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:33:14 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2273594203' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:33:14 np0005588920 nova_compute[226886]: 2026-01-20 15:33:14.198 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:33:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:33:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:14.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:33:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:33:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:14.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:33:14 np0005588920 nova_compute[226886]: 2026-01-20 15:33:14.397 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:33:14 np0005588920 nova_compute[226886]: 2026-01-20 15:33:14.398 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4136MB free_disk=20.897228240966797GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:33:14 np0005588920 nova_compute[226886]: 2026-01-20 15:33:14.399 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:33:14 np0005588920 nova_compute[226886]: 2026-01-20 15:33:14.399 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:33:14 np0005588920 nova_compute[226886]: 2026-01-20 15:33:14.511 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:33:14 np0005588920 nova_compute[226886]: 2026-01-20 15:33:14.512 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:33:14 np0005588920 nova_compute[226886]: 2026-01-20 15:33:14.548 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:33:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:33:14 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/959369576' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:33:14 np0005588920 nova_compute[226886]: 2026-01-20 15:33:14.966 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.418s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:33:14 np0005588920 nova_compute[226886]: 2026-01-20 15:33:14.972 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:33:14 np0005588920 nova_compute[226886]: 2026-01-20 15:33:14.993 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:33:14 np0005588920 nova_compute[226886]: 2026-01-20 15:33:14.995 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:33:14 np0005588920 nova_compute[226886]: 2026-01-20 15:33:14.995 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.597s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:33:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:16.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:16.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:33:16.497 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:33:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:33:16.497 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:33:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:33:16.497 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:33:17 np0005588920 nova_compute[226886]: 2026-01-20 15:33:17.997 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:33:17 np0005588920 nova_compute[226886]: 2026-01-20 15:33:17.997 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:33:17 np0005588920 nova_compute[226886]: 2026-01-20 15:33:17.997 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:33:18 np0005588920 nova_compute[226886]: 2026-01-20 15:33:18.014 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:33:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:18.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:18.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:18 np0005588920 nova_compute[226886]: 2026-01-20 15:33:18.714 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:33:18 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:33:19 np0005588920 nova_compute[226886]: 2026-01-20 15:33:19.156 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:33:19 np0005588920 nova_compute[226886]: 2026-01-20 15:33:19.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:33:19 np0005588920 nova_compute[226886]: 2026-01-20 15:33:19.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:33:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:20.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:20.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:20 np0005588920 nova_compute[226886]: 2026-01-20 15:33:20.721 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:33:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:33:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:22.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:33:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:33:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:22.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:33:22 np0005588920 nova_compute[226886]: 2026-01-20 15:33:22.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:33:23 np0005588920 nova_compute[226886]: 2026-01-20 15:33:23.718 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:33:23 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:33:24 np0005588920 nova_compute[226886]: 2026-01-20 15:33:24.157 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:33:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:24.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:24.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:24 np0005588920 nova_compute[226886]: 2026-01-20 15:33:24.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:33:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:33:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:26.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:33:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:26.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:27 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:33:27.747 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=79, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=78) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:33:27 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:33:27.748 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:33:27 np0005588920 nova_compute[226886]: 2026-01-20 15:33:27.749 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:33:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:33:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:28.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:33:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:33:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:28.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:33:28 np0005588920 nova_compute[226886]: 2026-01-20 15:33:28.720 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:33:28 np0005588920 nova_compute[226886]: 2026-01-20 15:33:28.721 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:33:28 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:33:29 np0005588920 nova_compute[226886]: 2026-01-20 15:33:29.159 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:33:29 np0005588920 nova_compute[226886]: 2026-01-20 15:33:29.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:33:29 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:33:29.750 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '79'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:33:29 np0005588920 podman[307667]: 2026-01-20 15:33:29.984116062 +0000 UTC m=+0.075170704 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 20 10:33:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:33:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:30.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:33:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:30.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:31 np0005588920 nova_compute[226886]: 2026-01-20 15:33:31.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:33:31 np0005588920 nova_compute[226886]: 2026-01-20 15:33:31.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:33:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:33:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:32.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:33:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:32.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:33 np0005588920 nova_compute[226886]: 2026-01-20 15:33:33.724 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:33:33 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:33:34 np0005588920 nova_compute[226886]: 2026-01-20 15:33:34.162 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:33:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:33:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:34.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:33:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:34.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:36.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:33:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:36.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:33:37 np0005588920 podman[307693]: 2026-01-20 15:33:37.990467447 +0000 UTC m=+0.074101883 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 20 10:33:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:38.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:38.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:38 np0005588920 nova_compute[226886]: 2026-01-20 15:33:38.727 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:33:38 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:33:39 np0005588920 nova_compute[226886]: 2026-01-20 15:33:39.163 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:33:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:40.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:40.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:42.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:42.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:43 np0005588920 nova_compute[226886]: 2026-01-20 15:33:43.730 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:33:43 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:33:44 np0005588920 nova_compute[226886]: 2026-01-20 15:33:44.164 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:33:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:44.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:44.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:46.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:46.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:48.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:48.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:48 np0005588920 nova_compute[226886]: 2026-01-20 15:33:48.735 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:33:48 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:33:49 np0005588920 nova_compute[226886]: 2026-01-20 15:33:49.166 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:33:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:50.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:50.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:52.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:52.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:53 np0005588920 nova_compute[226886]: 2026-01-20 15:33:53.739 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:33:53 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:33:54 np0005588920 nova_compute[226886]: 2026-01-20 15:33:54.168 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:33:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:33:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:54.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:33:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:33:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:54.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:33:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:56.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:56.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:57 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:33:57 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:33:57 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:33:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:58.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:33:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:58.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:58 np0005588920 nova_compute[226886]: 2026-01-20 15:33:58.742 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:33:58 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:33:59 np0005588920 nova_compute[226886]: 2026-01-20 15:33:59.169 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:00.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:00.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:00 np0005588920 podman[307844]: 2026-01-20 15:34:00.990407642 +0000 UTC m=+0.078224990 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:34:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:02.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:02.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:03 np0005588920 nova_compute[226886]: 2026-01-20 15:34:03.746 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:03 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:34:04 np0005588920 nova_compute[226886]: 2026-01-20 15:34:04.170 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:34:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:04.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:34:04 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:34:04 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:34:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:04.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:06.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:34:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:06.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:34:07 np0005588920 nova_compute[226886]: 2026-01-20 15:34:07.943 226890 DEBUG oslo_concurrency.lockutils [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:34:07 np0005588920 nova_compute[226886]: 2026-01-20 15:34:07.943 226890 DEBUG oslo_concurrency.lockutils [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:34:07 np0005588920 nova_compute[226886]: 2026-01-20 15:34:07.964 226890 DEBUG nova.compute.manager [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 10:34:08 np0005588920 nova_compute[226886]: 2026-01-20 15:34:08.054 226890 DEBUG oslo_concurrency.lockutils [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:34:08 np0005588920 nova_compute[226886]: 2026-01-20 15:34:08.054 226890 DEBUG oslo_concurrency.lockutils [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:34:08 np0005588920 nova_compute[226886]: 2026-01-20 15:34:08.063 226890 DEBUG nova.virt.hardware [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 10:34:08 np0005588920 nova_compute[226886]: 2026-01-20 15:34:08.064 226890 INFO nova.compute.claims [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 10:34:08 np0005588920 nova_compute[226886]: 2026-01-20 15:34:08.238 226890 DEBUG oslo_concurrency.processutils [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:34:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:08.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:08.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:08 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:34:08 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1657604232' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:34:08 np0005588920 nova_compute[226886]: 2026-01-20 15:34:08.671 226890 DEBUG oslo_concurrency.processutils [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:34:08 np0005588920 nova_compute[226886]: 2026-01-20 15:34:08.677 226890 DEBUG nova.compute.provider_tree [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:34:08 np0005588920 nova_compute[226886]: 2026-01-20 15:34:08.697 226890 DEBUG nova.scheduler.client.report [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:34:08 np0005588920 nova_compute[226886]: 2026-01-20 15:34:08.739 226890 DEBUG oslo_concurrency.lockutils [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.684s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:34:08 np0005588920 nova_compute[226886]: 2026-01-20 15:34:08.740 226890 DEBUG nova.compute.manager [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 10:34:08 np0005588920 nova_compute[226886]: 2026-01-20 15:34:08.750 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:08 np0005588920 nova_compute[226886]: 2026-01-20 15:34:08.854 226890 DEBUG nova.compute.manager [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 10:34:08 np0005588920 nova_compute[226886]: 2026-01-20 15:34:08.855 226890 DEBUG nova.network.neutron [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 10:34:08 np0005588920 nova_compute[226886]: 2026-01-20 15:34:08.874 226890 INFO nova.virt.libvirt.driver [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 10:34:08 np0005588920 nova_compute[226886]: 2026-01-20 15:34:08.891 226890 DEBUG nova.compute.manager [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 10:34:08 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:34:08 np0005588920 podman[307944]: 2026-01-20 15:34:08.952503161 +0000 UTC m=+0.042499276 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:34:08 np0005588920 nova_compute[226886]: 2026-01-20 15:34:08.990 226890 DEBUG nova.compute.manager [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 10:34:08 np0005588920 nova_compute[226886]: 2026-01-20 15:34:08.991 226890 DEBUG nova.virt.libvirt.driver [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 10:34:08 np0005588920 nova_compute[226886]: 2026-01-20 15:34:08.992 226890 INFO nova.virt.libvirt.driver [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Creating image(s)#033[00m
Jan 20 10:34:09 np0005588920 nova_compute[226886]: 2026-01-20 15:34:09.012 226890 DEBUG nova.storage.rbd_utils [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:34:09 np0005588920 nova_compute[226886]: 2026-01-20 15:34:09.034 226890 DEBUG nova.storage.rbd_utils [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:34:09 np0005588920 nova_compute[226886]: 2026-01-20 15:34:09.057 226890 DEBUG nova.storage.rbd_utils [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:34:09 np0005588920 nova_compute[226886]: 2026-01-20 15:34:09.060 226890 DEBUG oslo_concurrency.processutils [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:34:09 np0005588920 nova_compute[226886]: 2026-01-20 15:34:09.142 226890 DEBUG oslo_concurrency.processutils [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:34:09 np0005588920 nova_compute[226886]: 2026-01-20 15:34:09.144 226890 DEBUG oslo_concurrency.lockutils [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:34:09 np0005588920 nova_compute[226886]: 2026-01-20 15:34:09.145 226890 DEBUG oslo_concurrency.lockutils [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:34:09 np0005588920 nova_compute[226886]: 2026-01-20 15:34:09.145 226890 DEBUG oslo_concurrency.lockutils [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:34:09 np0005588920 nova_compute[226886]: 2026-01-20 15:34:09.168 226890 DEBUG nova.storage.rbd_utils [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:34:09 np0005588920 nova_compute[226886]: 2026-01-20 15:34:09.174 226890 DEBUG oslo_concurrency.processutils [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:34:09 np0005588920 nova_compute[226886]: 2026-01-20 15:34:09.201 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:09 np0005588920 nova_compute[226886]: 2026-01-20 15:34:09.879 226890 DEBUG nova.policy [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5338aa65dc0e4326a66ce79053787f14', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3168f57421fb49bfb94b85daedd1fe7d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 10:34:09 np0005588920 nova_compute[226886]: 2026-01-20 15:34:09.975 226890 DEBUG oslo_concurrency.processutils [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.802s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:34:10 np0005588920 nova_compute[226886]: 2026-01-20 15:34:10.037 226890 DEBUG nova.storage.rbd_utils [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] resizing rbd image bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 10:34:10 np0005588920 nova_compute[226886]: 2026-01-20 15:34:10.146 226890 DEBUG nova.objects.instance [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lazy-loading 'migration_context' on Instance uuid bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:34:10 np0005588920 nova_compute[226886]: 2026-01-20 15:34:10.177 226890 DEBUG nova.virt.libvirt.driver [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 10:34:10 np0005588920 nova_compute[226886]: 2026-01-20 15:34:10.178 226890 DEBUG nova.virt.libvirt.driver [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Ensure instance console log exists: /var/lib/nova/instances/bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:34:10 np0005588920 nova_compute[226886]: 2026-01-20 15:34:10.178 226890 DEBUG oslo_concurrency.lockutils [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:34:10 np0005588920 nova_compute[226886]: 2026-01-20 15:34:10.179 226890 DEBUG oslo_concurrency.lockutils [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:34:10 np0005588920 nova_compute[226886]: 2026-01-20 15:34:10.179 226890 DEBUG oslo_concurrency.lockutils [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:34:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:10.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:34:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:10.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:34:11 np0005588920 nova_compute[226886]: 2026-01-20 15:34:11.013 226890 DEBUG nova.network.neutron [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Successfully updated port: 1dee9c67-fb01-4fcd-8f35-805a326ee235 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 10:34:11 np0005588920 nova_compute[226886]: 2026-01-20 15:34:11.032 226890 DEBUG oslo_concurrency.lockutils [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "refresh_cache-bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:34:11 np0005588920 nova_compute[226886]: 2026-01-20 15:34:11.033 226890 DEBUG oslo_concurrency.lockutils [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquired lock "refresh_cache-bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:34:11 np0005588920 nova_compute[226886]: 2026-01-20 15:34:11.033 226890 DEBUG nova.network.neutron [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:34:11 np0005588920 nova_compute[226886]: 2026-01-20 15:34:11.172 226890 DEBUG nova.compute.manager [req-869dfcfb-22eb-48f1-879c-3a7713dbbeba req-f71b5844-f711-479c-bec3-8a3adab06758 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Received event network-changed-1dee9c67-fb01-4fcd-8f35-805a326ee235 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:34:11 np0005588920 nova_compute[226886]: 2026-01-20 15:34:11.172 226890 DEBUG nova.compute.manager [req-869dfcfb-22eb-48f1-879c-3a7713dbbeba req-f71b5844-f711-479c-bec3-8a3adab06758 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Refreshing instance network info cache due to event network-changed-1dee9c67-fb01-4fcd-8f35-805a326ee235. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:34:11 np0005588920 nova_compute[226886]: 2026-01-20 15:34:11.172 226890 DEBUG oslo_concurrency.lockutils [req-869dfcfb-22eb-48f1-879c-3a7713dbbeba req-f71b5844-f711-479c-bec3-8a3adab06758 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:34:11 np0005588920 nova_compute[226886]: 2026-01-20 15:34:11.619 226890 DEBUG nova.network.neutron [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:34:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:12.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:12.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:13 np0005588920 nova_compute[226886]: 2026-01-20 15:34:13.058 226890 DEBUG nova.network.neutron [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Updating instance_info_cache with network_info: [{"id": "1dee9c67-fb01-4fcd-8f35-805a326ee235", "address": "fa:16:3e:ea:3a:75", "network": {"id": "99dd5684-1685-443e-9373-f548d80784f6", "bridge": "br-int", "label": "tempest-network-smoke--802350272", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1dee9c67-fb", "ovs_interfaceid": "1dee9c67-fb01-4fcd-8f35-805a326ee235", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:34:13 np0005588920 nova_compute[226886]: 2026-01-20 15:34:13.096 226890 DEBUG oslo_concurrency.lockutils [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Releasing lock "refresh_cache-bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:34:13 np0005588920 nova_compute[226886]: 2026-01-20 15:34:13.096 226890 DEBUG nova.compute.manager [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Instance network_info: |[{"id": "1dee9c67-fb01-4fcd-8f35-805a326ee235", "address": "fa:16:3e:ea:3a:75", "network": {"id": "99dd5684-1685-443e-9373-f548d80784f6", "bridge": "br-int", "label": "tempest-network-smoke--802350272", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1dee9c67-fb", "ovs_interfaceid": "1dee9c67-fb01-4fcd-8f35-805a326ee235", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 10:34:13 np0005588920 nova_compute[226886]: 2026-01-20 15:34:13.097 226890 DEBUG oslo_concurrency.lockutils [req-869dfcfb-22eb-48f1-879c-3a7713dbbeba req-f71b5844-f711-479c-bec3-8a3adab06758 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:34:13 np0005588920 nova_compute[226886]: 2026-01-20 15:34:13.097 226890 DEBUG nova.network.neutron [req-869dfcfb-22eb-48f1-879c-3a7713dbbeba req-f71b5844-f711-479c-bec3-8a3adab06758 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Refreshing network info cache for port 1dee9c67-fb01-4fcd-8f35-805a326ee235 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:34:13 np0005588920 nova_compute[226886]: 2026-01-20 15:34:13.100 226890 DEBUG nova.virt.libvirt.driver [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Start _get_guest_xml network_info=[{"id": "1dee9c67-fb01-4fcd-8f35-805a326ee235", "address": "fa:16:3e:ea:3a:75", "network": {"id": "99dd5684-1685-443e-9373-f548d80784f6", "bridge": "br-int", "label": "tempest-network-smoke--802350272", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1dee9c67-fb", "ovs_interfaceid": "1dee9c67-fb01-4fcd-8f35-805a326ee235", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:34:13 np0005588920 nova_compute[226886]: 2026-01-20 15:34:13.104 226890 WARNING nova.virt.libvirt.driver [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:34:13 np0005588920 nova_compute[226886]: 2026-01-20 15:34:13.111 226890 DEBUG nova.virt.libvirt.host [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:34:13 np0005588920 nova_compute[226886]: 2026-01-20 15:34:13.111 226890 DEBUG nova.virt.libvirt.host [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:34:13 np0005588920 nova_compute[226886]: 2026-01-20 15:34:13.121 226890 DEBUG nova.virt.libvirt.host [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:34:13 np0005588920 nova_compute[226886]: 2026-01-20 15:34:13.121 226890 DEBUG nova.virt.libvirt.host [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:34:13 np0005588920 nova_compute[226886]: 2026-01-20 15:34:13.122 226890 DEBUG nova.virt.libvirt.driver [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:34:13 np0005588920 nova_compute[226886]: 2026-01-20 15:34:13.122 226890 DEBUG nova.virt.hardware [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:34:13 np0005588920 nova_compute[226886]: 2026-01-20 15:34:13.123 226890 DEBUG nova.virt.hardware [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:34:13 np0005588920 nova_compute[226886]: 2026-01-20 15:34:13.123 226890 DEBUG nova.virt.hardware [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:34:13 np0005588920 nova_compute[226886]: 2026-01-20 15:34:13.123 226890 DEBUG nova.virt.hardware [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:34:13 np0005588920 nova_compute[226886]: 2026-01-20 15:34:13.124 226890 DEBUG nova.virt.hardware [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:34:13 np0005588920 nova_compute[226886]: 2026-01-20 15:34:13.124 226890 DEBUG nova.virt.hardware [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:34:13 np0005588920 nova_compute[226886]: 2026-01-20 15:34:13.124 226890 DEBUG nova.virt.hardware [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:34:13 np0005588920 nova_compute[226886]: 2026-01-20 15:34:13.124 226890 DEBUG nova.virt.hardware [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:34:13 np0005588920 nova_compute[226886]: 2026-01-20 15:34:13.125 226890 DEBUG nova.virt.hardware [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:34:13 np0005588920 nova_compute[226886]: 2026-01-20 15:34:13.125 226890 DEBUG nova.virt.hardware [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:34:13 np0005588920 nova_compute[226886]: 2026-01-20 15:34:13.125 226890 DEBUG nova.virt.hardware [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:34:13 np0005588920 nova_compute[226886]: 2026-01-20 15:34:13.128 226890 DEBUG oslo_concurrency.processutils [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:34:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:34:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4278523668' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:34:13 np0005588920 nova_compute[226886]: 2026-01-20 15:34:13.564 226890 DEBUG oslo_concurrency.processutils [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:34:13 np0005588920 nova_compute[226886]: 2026-01-20 15:34:13.590 226890 DEBUG nova.storage.rbd_utils [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:34:13 np0005588920 nova_compute[226886]: 2026-01-20 15:34:13.594 226890 DEBUG oslo_concurrency.processutils [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:34:13 np0005588920 nova_compute[226886]: 2026-01-20 15:34:13.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:34:13 np0005588920 nova_compute[226886]: 2026-01-20 15:34:13.750 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:34:13 np0005588920 nova_compute[226886]: 2026-01-20 15:34:13.750 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:34:13 np0005588920 nova_compute[226886]: 2026-01-20 15:34:13.750 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:34:13 np0005588920 nova_compute[226886]: 2026-01-20 15:34:13.751 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:34:13 np0005588920 nova_compute[226886]: 2026-01-20 15:34:13.751 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:34:13 np0005588920 nova_compute[226886]: 2026-01-20 15:34:13.778 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:34:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:34:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2140012008' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:34:14 np0005588920 nova_compute[226886]: 2026-01-20 15:34:14.012 226890 DEBUG oslo_concurrency.processutils [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.418s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:34:14 np0005588920 nova_compute[226886]: 2026-01-20 15:34:14.014 226890 DEBUG nova.virt.libvirt.vif [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:34:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-385886117',display_name='tempest-TestNetworkBasicOps-server-385886117',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-385886117',id=210,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG4egywmUgiowaHLTJn1nErdfaP0oMdhQmIgUXn8uXQUfQZSuklJDV6MWPtY7LPTEUS5qVJRUYQY1UGkGkHNZraRamn/90IKm/HwKOMLNLsEuUfRLsBa0wZUmsHoCCyjuQ==',key_name='tempest-TestNetworkBasicOps-841590218',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3168f57421fb49bfb94b85daedd1fe7d',ramdisk_id='',reservation_id='r-a4liy8zm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-807695970',owner_user_name='tempest-TestNetworkBasicOps-807695970-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:34:08Z,user_data=None,user_id='5338aa65dc0e4326a66ce79053787f14',uuid=bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1dee9c67-fb01-4fcd-8f35-805a326ee235", "address": "fa:16:3e:ea:3a:75", "network": {"id": "99dd5684-1685-443e-9373-f548d80784f6", "bridge": "br-int", "label": "tempest-network-smoke--802350272", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1dee9c67-fb", "ovs_interfaceid": "1dee9c67-fb01-4fcd-8f35-805a326ee235", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:34:14 np0005588920 nova_compute[226886]: 2026-01-20 15:34:14.014 226890 DEBUG nova.network.os_vif_util [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converting VIF {"id": "1dee9c67-fb01-4fcd-8f35-805a326ee235", "address": "fa:16:3e:ea:3a:75", "network": {"id": "99dd5684-1685-443e-9373-f548d80784f6", "bridge": "br-int", "label": "tempest-network-smoke--802350272", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1dee9c67-fb", "ovs_interfaceid": "1dee9c67-fb01-4fcd-8f35-805a326ee235", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:34:14 np0005588920 nova_compute[226886]: 2026-01-20 15:34:14.015 226890 DEBUG nova.network.os_vif_util [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ea:3a:75,bridge_name='br-int',has_traffic_filtering=True,id=1dee9c67-fb01-4fcd-8f35-805a326ee235,network=Network(99dd5684-1685-443e-9373-f548d80784f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1dee9c67-fb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:34:14 np0005588920 nova_compute[226886]: 2026-01-20 15:34:14.016 226890 DEBUG nova.objects.instance [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lazy-loading 'pci_devices' on Instance uuid bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:34:14 np0005588920 nova_compute[226886]: 2026-01-20 15:34:14.038 226890 DEBUG nova.virt.libvirt.driver [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:34:14 np0005588920 nova_compute[226886]:  <uuid>bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf</uuid>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:  <name>instance-000000d2</name>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:34:14 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:      <nova:name>tempest-TestNetworkBasicOps-server-385886117</nova:name>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 15:34:13</nova:creationTime>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 10:34:14 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:        <nova:user uuid="5338aa65dc0e4326a66ce79053787f14">tempest-TestNetworkBasicOps-807695970-project-member</nova:user>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:        <nova:project uuid="3168f57421fb49bfb94b85daedd1fe7d">tempest-TestNetworkBasicOps-807695970</nova:project>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:        <nova:port uuid="1dee9c67-fb01-4fcd-8f35-805a326ee235">
Jan 20 10:34:14 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    <system>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:      <entry name="serial">bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf</entry>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:      <entry name="uuid">bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf</entry>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    </system>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:  <os>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:  </os>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:  <features>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:  </features>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:  </clock>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:  <devices>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 10:34:14 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf_disk">
Jan 20 10:34:14 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:34:14 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 10:34:14 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf_disk.config">
Jan 20 10:34:14 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:34:14 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 10:34:14 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:ea:3a:75"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:      <target dev="tap1dee9c67-fb"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    </interface>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 10:34:14 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf/console.log" append="off"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    </serial>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    <video>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    </video>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 10:34:14 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    </rng>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 10:34:14 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 10:34:14 np0005588920 nova_compute[226886]:  </devices>
Jan 20 10:34:14 np0005588920 nova_compute[226886]: </domain>
Jan 20 10:34:14 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:34:14 np0005588920 nova_compute[226886]: 2026-01-20 15:34:14.040 226890 DEBUG nova.compute.manager [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Preparing to wait for external event network-vif-plugged-1dee9c67-fb01-4fcd-8f35-805a326ee235 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:34:14 np0005588920 nova_compute[226886]: 2026-01-20 15:34:14.041 226890 DEBUG oslo_concurrency.lockutils [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:34:14 np0005588920 nova_compute[226886]: 2026-01-20 15:34:14.041 226890 DEBUG oslo_concurrency.lockutils [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:34:14 np0005588920 nova_compute[226886]: 2026-01-20 15:34:14.041 226890 DEBUG oslo_concurrency.lockutils [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:34:14 np0005588920 nova_compute[226886]: 2026-01-20 15:34:14.042 226890 DEBUG nova.virt.libvirt.vif [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:34:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-385886117',display_name='tempest-TestNetworkBasicOps-server-385886117',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-385886117',id=210,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG4egywmUgiowaHLTJn1nErdfaP0oMdhQmIgUXn8uXQUfQZSuklJDV6MWPtY7LPTEUS5qVJRUYQY1UGkGkHNZraRamn/90IKm/HwKOMLNLsEuUfRLsBa0wZUmsHoCCyjuQ==',key_name='tempest-TestNetworkBasicOps-841590218',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3168f57421fb49bfb94b85daedd1fe7d',ramdisk_id='',reservation_id='r-a4liy8zm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-807695970',owner_user_name='tempest-TestNetworkBasicOps-807695970-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:34:08Z,user_data=None,user_id='5338aa65dc0e4326a66ce79053787f14',uuid=bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1dee9c67-fb01-4fcd-8f35-805a326ee235", "address": "fa:16:3e:ea:3a:75", "network": {"id": "99dd5684-1685-443e-9373-f548d80784f6", "bridge": "br-int", "label": "tempest-network-smoke--802350272", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1dee9c67-fb", "ovs_interfaceid": "1dee9c67-fb01-4fcd-8f35-805a326ee235", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:34:14 np0005588920 nova_compute[226886]: 2026-01-20 15:34:14.042 226890 DEBUG nova.network.os_vif_util [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converting VIF {"id": "1dee9c67-fb01-4fcd-8f35-805a326ee235", "address": "fa:16:3e:ea:3a:75", "network": {"id": "99dd5684-1685-443e-9373-f548d80784f6", "bridge": "br-int", "label": "tempest-network-smoke--802350272", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1dee9c67-fb", "ovs_interfaceid": "1dee9c67-fb01-4fcd-8f35-805a326ee235", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:34:14 np0005588920 nova_compute[226886]: 2026-01-20 15:34:14.043 226890 DEBUG nova.network.os_vif_util [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ea:3a:75,bridge_name='br-int',has_traffic_filtering=True,id=1dee9c67-fb01-4fcd-8f35-805a326ee235,network=Network(99dd5684-1685-443e-9373-f548d80784f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1dee9c67-fb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:34:14 np0005588920 nova_compute[226886]: 2026-01-20 15:34:14.044 226890 DEBUG os_vif [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ea:3a:75,bridge_name='br-int',has_traffic_filtering=True,id=1dee9c67-fb01-4fcd-8f35-805a326ee235,network=Network(99dd5684-1685-443e-9373-f548d80784f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1dee9c67-fb') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:34:14 np0005588920 nova_compute[226886]: 2026-01-20 15:34:14.045 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:14 np0005588920 nova_compute[226886]: 2026-01-20 15:34:14.045 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:34:14 np0005588920 nova_compute[226886]: 2026-01-20 15:34:14.046 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:34:14 np0005588920 nova_compute[226886]: 2026-01-20 15:34:14.049 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:14 np0005588920 nova_compute[226886]: 2026-01-20 15:34:14.049 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1dee9c67-fb, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:34:14 np0005588920 nova_compute[226886]: 2026-01-20 15:34:14.050 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1dee9c67-fb, col_values=(('external_ids', {'iface-id': '1dee9c67-fb01-4fcd-8f35-805a326ee235', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ea:3a:75', 'vm-uuid': 'bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:34:14 np0005588920 NetworkManager[49076]: <info>  [1768923254.0532] manager: (tap1dee9c67-fb): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/444)
Jan 20 10:34:14 np0005588920 nova_compute[226886]: 2026-01-20 15:34:14.054 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:34:14 np0005588920 nova_compute[226886]: 2026-01-20 15:34:14.060 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:14 np0005588920 nova_compute[226886]: 2026-01-20 15:34:14.062 226890 INFO os_vif [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ea:3a:75,bridge_name='br-int',has_traffic_filtering=True,id=1dee9c67-fb01-4fcd-8f35-805a326ee235,network=Network(99dd5684-1685-443e-9373-f548d80784f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1dee9c67-fb')#033[00m
Jan 20 10:34:14 np0005588920 nova_compute[226886]: 2026-01-20 15:34:14.150 226890 DEBUG nova.virt.libvirt.driver [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:34:14 np0005588920 nova_compute[226886]: 2026-01-20 15:34:14.150 226890 DEBUG nova.virt.libvirt.driver [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:34:14 np0005588920 nova_compute[226886]: 2026-01-20 15:34:14.150 226890 DEBUG nova.virt.libvirt.driver [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] No VIF found with MAC fa:16:3e:ea:3a:75, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:34:14 np0005588920 nova_compute[226886]: 2026-01-20 15:34:14.151 226890 INFO nova.virt.libvirt.driver [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Using config drive#033[00m
Jan 20 10:34:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:34:14 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/650000176' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:34:14 np0005588920 nova_compute[226886]: 2026-01-20 15:34:14.176 226890 DEBUG nova.storage.rbd_utils [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:34:14 np0005588920 nova_compute[226886]: 2026-01-20 15:34:14.182 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:14 np0005588920 nova_compute[226886]: 2026-01-20 15:34:14.185 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:34:14 np0005588920 nova_compute[226886]: 2026-01-20 15:34:14.322 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-000000d2 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:34:14 np0005588920 nova_compute[226886]: 2026-01-20 15:34:14.323 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-000000d2 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:34:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:14.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:14 np0005588920 nova_compute[226886]: 2026-01-20 15:34:14.454 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:34:14 np0005588920 nova_compute[226886]: 2026-01-20 15:34:14.455 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4107MB free_disk=20.968605041503906GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:34:14 np0005588920 nova_compute[226886]: 2026-01-20 15:34:14.455 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:34:14 np0005588920 nova_compute[226886]: 2026-01-20 15:34:14.456 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:34:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:14.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:14 np0005588920 nova_compute[226886]: 2026-01-20 15:34:14.544 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:34:14 np0005588920 nova_compute[226886]: 2026-01-20 15:34:14.544 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:34:14 np0005588920 nova_compute[226886]: 2026-01-20 15:34:14.544 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:34:14 np0005588920 nova_compute[226886]: 2026-01-20 15:34:14.569 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Refreshing inventories for resource provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 20 10:34:14 np0005588920 nova_compute[226886]: 2026-01-20 15:34:14.618 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Updating ProviderTree inventory for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 20 10:34:14 np0005588920 nova_compute[226886]: 2026-01-20 15:34:14.619 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Updating inventory in ProviderTree for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 20 10:34:14 np0005588920 nova_compute[226886]: 2026-01-20 15:34:14.649 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Refreshing aggregate associations for resource provider ff38e91c-3320-4831-90ac-bcffc89ba7b6, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 20 10:34:14 np0005588920 nova_compute[226886]: 2026-01-20 15:34:14.695 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Refreshing trait associations for resource provider ff38e91c-3320-4831-90ac-bcffc89ba7b6, traits: COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_DEVICE_TAGGING,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 20 10:34:14 np0005588920 nova_compute[226886]: 2026-01-20 15:34:14.790 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:34:14 np0005588920 nova_compute[226886]: 2026-01-20 15:34:14.930 226890 INFO nova.virt.libvirt.driver [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Creating config drive at /var/lib/nova/instances/bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf/disk.config#033[00m
Jan 20 10:34:14 np0005588920 nova_compute[226886]: 2026-01-20 15:34:14.936 226890 DEBUG oslo_concurrency.processutils [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8vqbzdkk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:34:15 np0005588920 nova_compute[226886]: 2026-01-20 15:34:15.090 226890 DEBUG oslo_concurrency.processutils [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8vqbzdkk" returned: 0 in 0.154s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:34:15 np0005588920 nova_compute[226886]: 2026-01-20 15:34:15.125 226890 DEBUG nova.storage.rbd_utils [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:34:15 np0005588920 nova_compute[226886]: 2026-01-20 15:34:15.129 226890 DEBUG oslo_concurrency.processutils [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf/disk.config bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:34:15 np0005588920 nova_compute[226886]: 2026-01-20 15:34:15.165 226890 DEBUG nova.network.neutron [req-869dfcfb-22eb-48f1-879c-3a7713dbbeba req-f71b5844-f711-479c-bec3-8a3adab06758 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Updated VIF entry in instance network info cache for port 1dee9c67-fb01-4fcd-8f35-805a326ee235. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:34:15 np0005588920 nova_compute[226886]: 2026-01-20 15:34:15.166 226890 DEBUG nova.network.neutron [req-869dfcfb-22eb-48f1-879c-3a7713dbbeba req-f71b5844-f711-479c-bec3-8a3adab06758 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Updating instance_info_cache with network_info: [{"id": "1dee9c67-fb01-4fcd-8f35-805a326ee235", "address": "fa:16:3e:ea:3a:75", "network": {"id": "99dd5684-1685-443e-9373-f548d80784f6", "bridge": "br-int", "label": "tempest-network-smoke--802350272", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1dee9c67-fb", "ovs_interfaceid": "1dee9c67-fb01-4fcd-8f35-805a326ee235", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:34:15 np0005588920 nova_compute[226886]: 2026-01-20 15:34:15.192 226890 DEBUG oslo_concurrency.lockutils [req-869dfcfb-22eb-48f1-879c-3a7713dbbeba req-f71b5844-f711-479c-bec3-8a3adab06758 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:34:15 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:34:15 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1244062530' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:34:15 np0005588920 nova_compute[226886]: 2026-01-20 15:34:15.241 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:34:15 np0005588920 nova_compute[226886]: 2026-01-20 15:34:15.246 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:34:15 np0005588920 nova_compute[226886]: 2026-01-20 15:34:15.283 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:34:15 np0005588920 nova_compute[226886]: 2026-01-20 15:34:15.330 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:34:15 np0005588920 nova_compute[226886]: 2026-01-20 15:34:15.331 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.875s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:34:15 np0005588920 nova_compute[226886]: 2026-01-20 15:34:15.440 226890 DEBUG oslo_concurrency.processutils [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf/disk.config bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.312s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:34:15 np0005588920 nova_compute[226886]: 2026-01-20 15:34:15.441 226890 INFO nova.virt.libvirt.driver [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Deleting local config drive /var/lib/nova/instances/bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf/disk.config because it was imported into RBD.#033[00m
Jan 20 10:34:15 np0005588920 kernel: tap1dee9c67-fb: entered promiscuous mode
Jan 20 10:34:15 np0005588920 NetworkManager[49076]: <info>  [1768923255.4972] manager: (tap1dee9c67-fb): new Tun device (/org/freedesktop/NetworkManager/Devices/445)
Jan 20 10:34:15 np0005588920 nova_compute[226886]: 2026-01-20 15:34:15.496 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:15 np0005588920 ovn_controller[133971]: 2026-01-20T15:34:15Z|00950|binding|INFO|Claiming lport 1dee9c67-fb01-4fcd-8f35-805a326ee235 for this chassis.
Jan 20 10:34:15 np0005588920 ovn_controller[133971]: 2026-01-20T15:34:15Z|00951|binding|INFO|1dee9c67-fb01-4fcd-8f35-805a326ee235: Claiming fa:16:3e:ea:3a:75 10.100.0.12
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:34:15.513 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ea:3a:75 10.100.0.12'], port_security=['fa:16:3e:ea:3a:75 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-703015767', 'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-99dd5684-1685-443e-9373-f548d80784f6', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-703015767', 'neutron:project_id': '3168f57421fb49bfb94b85daedd1fe7d', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'aaa69ba6-9a27-441e-877e-2cd188322a42', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=44d2010b-16ff-4152-8c6b-d6e8ffb1b3ca, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=1dee9c67-fb01-4fcd-8f35-805a326ee235) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:34:15.514 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 1dee9c67-fb01-4fcd-8f35-805a326ee235 in datapath 99dd5684-1685-443e-9373-f548d80784f6 bound to our chassis#033[00m
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:34:15.515 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 99dd5684-1685-443e-9373-f548d80784f6#033[00m
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:34:15.524 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[248c3457-f1d3-4547-a443-9b038ffaa6ff]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:34:15.525 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap99dd5684-11 in ovnmeta-99dd5684-1685-443e-9373-f548d80784f6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:34:15.526 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap99dd5684-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:34:15.527 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[726cd485-ea22-438a-81dd-f73033bf777b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:34:15.527 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[45520fdd-4679-4f4d-bd9d-067b48429fc5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:34:15 np0005588920 systemd-udevd[308309]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:34:15.539 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[8b04c8bc-24b0-42d4-893c-da02b853ccc9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:34:15 np0005588920 systemd-machined[196121]: New machine qemu-98-instance-000000d2.
Jan 20 10:34:15 np0005588920 NetworkManager[49076]: <info>  [1768923255.5492] device (tap1dee9c67-fb): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:34:15 np0005588920 NetworkManager[49076]: <info>  [1768923255.5503] device (tap1dee9c67-fb): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:34:15 np0005588920 systemd[1]: Started Virtual Machine qemu-98-instance-000000d2.
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:34:15.564 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[78a54b75-ec2a-4905-bfcd-2286cb2d451f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:34:15 np0005588920 nova_compute[226886]: 2026-01-20 15:34:15.593 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:34:15.598 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[f7f8fc38-1159-4447-9515-6366a1458056]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:34:15 np0005588920 ovn_controller[133971]: 2026-01-20T15:34:15Z|00952|binding|INFO|Setting lport 1dee9c67-fb01-4fcd-8f35-805a326ee235 ovn-installed in OVS
Jan 20 10:34:15 np0005588920 ovn_controller[133971]: 2026-01-20T15:34:15Z|00953|binding|INFO|Setting lport 1dee9c67-fb01-4fcd-8f35-805a326ee235 up in Southbound
Jan 20 10:34:15 np0005588920 nova_compute[226886]: 2026-01-20 15:34:15.604 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:15 np0005588920 NetworkManager[49076]: <info>  [1768923255.6062] manager: (tap99dd5684-10): new Veth device (/org/freedesktop/NetworkManager/Devices/446)
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:34:15.605 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[7e8c2415-c7da-45f4-a922-d116ab376614]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:34:15.643 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[e984221c-7a84-484e-8061-9fb28591969b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:34:15.646 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[3e84ef15-335d-4ee2-913f-7e58dc56f16c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:34:15 np0005588920 NetworkManager[49076]: <info>  [1768923255.6677] device (tap99dd5684-10): carrier: link connected
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:34:15.673 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[84d919ab-7006-4a6b-899d-3647e440dea2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:34:15.690 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[686bf61f-a517-4e63-92b1-7d51bec36dfc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap99dd5684-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b4:88:96'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 298], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 829735, 'reachable_time': 26007, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 308342, 'error': None, 'target': 'ovnmeta-99dd5684-1685-443e-9373-f548d80784f6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:34:15.705 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[04202ca8-b6fb-41d5-b867-749a62abb315]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb4:8896'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 829735, 'tstamp': 829735}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 308343, 'error': None, 'target': 'ovnmeta-99dd5684-1685-443e-9373-f548d80784f6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:34:15.722 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a78fbfe5-5004-40c2-b05d-bb1fa5b48f89]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap99dd5684-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b4:88:96'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 298], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 829735, 'reachable_time': 26007, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 308344, 'error': None, 'target': 'ovnmeta-99dd5684-1685-443e-9373-f548d80784f6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:34:15.748 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[6f92cc4b-a80e-471f-b241-5ea23a0b4002]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:34:15.806 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[8bcb14f5-b939-4ed6-9593-7fc0ef3b29be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:34:15.808 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap99dd5684-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:34:15.808 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:34:15.809 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap99dd5684-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:34:15 np0005588920 nova_compute[226886]: 2026-01-20 15:34:15.810 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:15 np0005588920 NetworkManager[49076]: <info>  [1768923255.8114] manager: (tap99dd5684-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/447)
Jan 20 10:34:15 np0005588920 kernel: tap99dd5684-10: entered promiscuous mode
Jan 20 10:34:15 np0005588920 nova_compute[226886]: 2026-01-20 15:34:15.812 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:34:15.817 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap99dd5684-10, col_values=(('external_ids', {'iface-id': 'b36be382-7937-4c5c-b0f7-fc4a6e68a050'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:34:15 np0005588920 nova_compute[226886]: 2026-01-20 15:34:15.818 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:15 np0005588920 ovn_controller[133971]: 2026-01-20T15:34:15Z|00954|binding|INFO|Releasing lport b36be382-7937-4c5c-b0f7-fc4a6e68a050 from this chassis (sb_readonly=0)
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:34:15.819 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/99dd5684-1685-443e-9373-f548d80784f6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/99dd5684-1685-443e-9373-f548d80784f6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:34:15.820 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[db2f1ec8-866c-47ee-90fa-e3794c0a850b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:34:15.821 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-99dd5684-1685-443e-9373-f548d80784f6
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/99dd5684-1685-443e-9373-f548d80784f6.pid.haproxy
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID 99dd5684-1685-443e-9373-f548d80784f6
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:34:15 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:34:15.821 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-99dd5684-1685-443e-9373-f548d80784f6', 'env', 'PROCESS_TAG=haproxy-99dd5684-1685-443e-9373-f548d80784f6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/99dd5684-1685-443e-9373-f548d80784f6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:34:15 np0005588920 nova_compute[226886]: 2026-01-20 15:34:15.830 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:15 np0005588920 nova_compute[226886]: 2026-01-20 15:34:15.957 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768923255.9565227, bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:34:15 np0005588920 nova_compute[226886]: 2026-01-20 15:34:15.957 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] VM Started (Lifecycle Event)#033[00m
Jan 20 10:34:15 np0005588920 nova_compute[226886]: 2026-01-20 15:34:15.976 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:34:15 np0005588920 nova_compute[226886]: 2026-01-20 15:34:15.981 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768923255.956744, bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:34:15 np0005588920 nova_compute[226886]: 2026-01-20 15:34:15.981 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:34:16 np0005588920 nova_compute[226886]: 2026-01-20 15:34:16.002 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:34:16 np0005588920 nova_compute[226886]: 2026-01-20 15:34:16.006 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:34:16 np0005588920 nova_compute[226886]: 2026-01-20 15:34:16.026 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:34:16 np0005588920 podman[308419]: 2026-01-20 15:34:16.163347412 +0000 UTC m=+0.019648098 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:34:16 np0005588920 podman[308419]: 2026-01-20 15:34:16.268382683 +0000 UTC m=+0.124683339 container create 10a54a94f87011935c703f8ddfbb6a6272b3c600713eeee8bfc0a865bd36377f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-99dd5684-1685-443e-9373-f548d80784f6, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 20 10:34:16 np0005588920 systemd[1]: Started libpod-conmon-10a54a94f87011935c703f8ddfbb6a6272b3c600713eeee8bfc0a865bd36377f.scope.
Jan 20 10:34:16 np0005588920 systemd[1]: Started libcrun container.
Jan 20 10:34:16 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bf80396d8a3c738552098e4cf208194b0ca28c3298ac8f1bb18aded9ee83566d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:34:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:34:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:16.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:34:16 np0005588920 podman[308419]: 2026-01-20 15:34:16.465122636 +0000 UTC m=+0.321423342 container init 10a54a94f87011935c703f8ddfbb6a6272b3c600713eeee8bfc0a865bd36377f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-99dd5684-1685-443e-9373-f548d80784f6, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:34:16 np0005588920 podman[308419]: 2026-01-20 15:34:16.470786356 +0000 UTC m=+0.327087022 container start 10a54a94f87011935c703f8ddfbb6a6272b3c600713eeee8bfc0a865bd36377f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-99dd5684-1685-443e-9373-f548d80784f6, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:34:16 np0005588920 neutron-haproxy-ovnmeta-99dd5684-1685-443e-9373-f548d80784f6[308435]: [NOTICE]   (308439) : New worker (308441) forked
Jan 20 10:34:16 np0005588920 neutron-haproxy-ovnmeta-99dd5684-1685-443e-9373-f548d80784f6[308435]: [NOTICE]   (308439) : Loading success.
Jan 20 10:34:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:34:16.499 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:34:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:34:16.500 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:34:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:34:16.500 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:34:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:34:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:16.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:34:17 np0005588920 nova_compute[226886]: 2026-01-20 15:34:17.647 226890 DEBUG nova.compute.manager [req-e7fcc701-35c2-45a8-975f-449cdcf49838 req-731e1bb6-78f0-486c-bec7-9dc2c5f1ea71 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Received event network-vif-plugged-1dee9c67-fb01-4fcd-8f35-805a326ee235 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:34:17 np0005588920 nova_compute[226886]: 2026-01-20 15:34:17.648 226890 DEBUG oslo_concurrency.lockutils [req-e7fcc701-35c2-45a8-975f-449cdcf49838 req-731e1bb6-78f0-486c-bec7-9dc2c5f1ea71 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:34:17 np0005588920 nova_compute[226886]: 2026-01-20 15:34:17.648 226890 DEBUG oslo_concurrency.lockutils [req-e7fcc701-35c2-45a8-975f-449cdcf49838 req-731e1bb6-78f0-486c-bec7-9dc2c5f1ea71 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:34:17 np0005588920 nova_compute[226886]: 2026-01-20 15:34:17.648 226890 DEBUG oslo_concurrency.lockutils [req-e7fcc701-35c2-45a8-975f-449cdcf49838 req-731e1bb6-78f0-486c-bec7-9dc2c5f1ea71 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:34:17 np0005588920 nova_compute[226886]: 2026-01-20 15:34:17.649 226890 DEBUG nova.compute.manager [req-e7fcc701-35c2-45a8-975f-449cdcf49838 req-731e1bb6-78f0-486c-bec7-9dc2c5f1ea71 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Processing event network-vif-plugged-1dee9c67-fb01-4fcd-8f35-805a326ee235 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 10:34:17 np0005588920 nova_compute[226886]: 2026-01-20 15:34:17.649 226890 DEBUG nova.compute.manager [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 20 10:34:17 np0005588920 nova_compute[226886]: 2026-01-20 15:34:17.653 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768923257.6535695, bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 20 10:34:17 np0005588920 nova_compute[226886]: 2026-01-20 15:34:17.654 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] VM Resumed (Lifecycle Event)
Jan 20 10:34:17 np0005588920 nova_compute[226886]: 2026-01-20 15:34:17.655 226890 DEBUG nova.virt.libvirt.driver [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 20 10:34:17 np0005588920 nova_compute[226886]: 2026-01-20 15:34:17.659 226890 INFO nova.virt.libvirt.driver [-] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Instance spawned successfully.
Jan 20 10:34:17 np0005588920 nova_compute[226886]: 2026-01-20 15:34:17.659 226890 DEBUG nova.virt.libvirt.driver [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 20 10:34:17 np0005588920 nova_compute[226886]: 2026-01-20 15:34:17.684 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 10:34:17 np0005588920 nova_compute[226886]: 2026-01-20 15:34:17.689 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 20 10:34:17 np0005588920 nova_compute[226886]: 2026-01-20 15:34:17.693 226890 DEBUG nova.virt.libvirt.driver [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 10:34:17 np0005588920 nova_compute[226886]: 2026-01-20 15:34:17.694 226890 DEBUG nova.virt.libvirt.driver [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 10:34:17 np0005588920 nova_compute[226886]: 2026-01-20 15:34:17.694 226890 DEBUG nova.virt.libvirt.driver [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 10:34:17 np0005588920 nova_compute[226886]: 2026-01-20 15:34:17.694 226890 DEBUG nova.virt.libvirt.driver [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 10:34:17 np0005588920 nova_compute[226886]: 2026-01-20 15:34:17.695 226890 DEBUG nova.virt.libvirt.driver [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 10:34:17 np0005588920 nova_compute[226886]: 2026-01-20 15:34:17.695 226890 DEBUG nova.virt.libvirt.driver [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 10:34:17 np0005588920 nova_compute[226886]: 2026-01-20 15:34:17.729 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 20 10:34:17 np0005588920 nova_compute[226886]: 2026-01-20 15:34:17.798 226890 INFO nova.compute.manager [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Took 8.81 seconds to spawn the instance on the hypervisor.
Jan 20 10:34:17 np0005588920 nova_compute[226886]: 2026-01-20 15:34:17.799 226890 DEBUG nova.compute.manager [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 10:34:17 np0005588920 nova_compute[226886]: 2026-01-20 15:34:17.888 226890 INFO nova.compute.manager [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Took 9.88 seconds to build instance.
Jan 20 10:34:17 np0005588920 nova_compute[226886]: 2026-01-20 15:34:17.923 226890 DEBUG oslo_concurrency.lockutils [None req-23d84119-5805-4df8-9eb3-62b01df5d1db 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.980s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:34:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:18.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:18.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:19 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:34:19 np0005588920 nova_compute[226886]: 2026-01-20 15:34:19.052 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:34:19 np0005588920 nova_compute[226886]: 2026-01-20 15:34:19.176 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:34:19 np0005588920 nova_compute[226886]: 2026-01-20 15:34:19.331 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:34:19 np0005588920 nova_compute[226886]: 2026-01-20 15:34:19.331 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 20 10:34:19 np0005588920 nova_compute[226886]: 2026-01-20 15:34:19.331 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 20 10:34:19 np0005588920 nova_compute[226886]: 2026-01-20 15:34:19.620 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "refresh_cache-bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 20 10:34:19 np0005588920 nova_compute[226886]: 2026-01-20 15:34:19.621 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquired lock "refresh_cache-bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 20 10:34:19 np0005588920 nova_compute[226886]: 2026-01-20 15:34:19.621 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 20 10:34:19 np0005588920 nova_compute[226886]: 2026-01-20 15:34:19.621 226890 DEBUG nova.objects.instance [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 10:34:19 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #166. Immutable memtables: 0.
Jan 20 10:34:19 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:34:19.658725) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:34:19 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 105] Flushing memtable with next log file: 166
Jan 20 10:34:19 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923259658783, "job": 105, "event": "flush_started", "num_memtables": 1, "num_entries": 1355, "num_deletes": 251, "total_data_size": 3006932, "memory_usage": 3054184, "flush_reason": "Manual Compaction"}
Jan 20 10:34:19 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 105] Level-0 flush table #167: started
Jan 20 10:34:19 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923259814526, "cf_name": "default", "job": 105, "event": "table_file_creation", "file_number": 167, "file_size": 1962624, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 80675, "largest_seqno": 82025, "table_properties": {"data_size": 1956773, "index_size": 3181, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1605, "raw_key_size": 12789, "raw_average_key_size": 20, "raw_value_size": 1944902, "raw_average_value_size": 3062, "num_data_blocks": 140, "num_entries": 635, "num_filter_entries": 635, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768923153, "oldest_key_time": 1768923153, "file_creation_time": 1768923259, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 167, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:34:19 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 105] Flush lasted 155838 microseconds, and 4710 cpu microseconds.
Jan 20 10:34:19 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:34:19 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:34:19.814571) [db/flush_job.cc:967] [default] [JOB 105] Level-0 flush table #167: 1962624 bytes OK
Jan 20 10:34:19 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:34:19.814589) [db/memtable_list.cc:519] [default] Level-0 commit table #167 started
Jan 20 10:34:19 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:34:19.818912) [db/memtable_list.cc:722] [default] Level-0 commit table #167: memtable #1 done
Jan 20 10:34:19 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:34:19.818928) EVENT_LOG_v1 {"time_micros": 1768923259818923, "job": 105, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:34:19 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:34:19.818943) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:34:19 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 105] Try to delete WAL files size 3000532, prev total WAL file size 3000532, number of live WAL files 2.
Jan 20 10:34:19 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000163.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:34:19 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:34:19.819751) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730037303238' seq:72057594037927935, type:22 .. '7061786F730037323830' seq:0, type:0; will stop at (end)
Jan 20 10:34:19 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 106] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:34:19 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 105 Base level 0, inputs: [167(1916KB)], [165(12MB)]
Jan 20 10:34:19 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923259819800, "job": 106, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [167], "files_L6": [165], "score": -1, "input_data_size": 14931690, "oldest_snapshot_seqno": -1}
Jan 20 10:34:20 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 106] Generated table #168: 10230 keys, 12957000 bytes, temperature: kUnknown
Jan 20 10:34:20 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923260241841, "cf_name": "default", "job": 106, "event": "table_file_creation", "file_number": 168, "file_size": 12957000, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12891210, "index_size": 39117, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25605, "raw_key_size": 270303, "raw_average_key_size": 26, "raw_value_size": 12712280, "raw_average_value_size": 1242, "num_data_blocks": 1487, "num_entries": 10230, "num_filter_entries": 10230, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768923259, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 168, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:34:20 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:34:20 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:34:20.242257) [db/compaction/compaction_job.cc:1663] [default] [JOB 106] Compacted 1@0 + 1@6 files to L6 => 12957000 bytes
Jan 20 10:34:20 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:34:20.401566) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 35.4 rd, 30.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.9, 12.4 +0.0 blob) out(12.4 +0.0 blob), read-write-amplify(14.2) write-amplify(6.6) OK, records in: 10753, records dropped: 523 output_compression: NoCompression
Jan 20 10:34:20 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:34:20.401610) EVENT_LOG_v1 {"time_micros": 1768923260401593, "job": 106, "event": "compaction_finished", "compaction_time_micros": 422178, "compaction_time_cpu_micros": 31381, "output_level": 6, "num_output_files": 1, "total_output_size": 12957000, "num_input_records": 10753, "num_output_records": 10230, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:34:20 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000167.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:34:20 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923260402393, "job": 106, "event": "table_file_deletion", "file_number": 167}
Jan 20 10:34:20 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000165.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:34:20 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923260405425, "job": 106, "event": "table_file_deletion", "file_number": 165}
Jan 20 10:34:20 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:34:19.819659) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:34:20 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:34:20.405558) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:34:20 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:34:20.405567) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:34:20 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:34:20.405570) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:34:20 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:34:20.405572) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:34:20 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:34:20.405575) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:34:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:20.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:20.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:20 np0005588920 nova_compute[226886]: 2026-01-20 15:34:20.535 226890 DEBUG nova.compute.manager [req-3a6d69bb-7b83-4bb3-90dd-3f48928e6bf5 req-8d97c2a6-8c33-4598-bbe5-dbaf0cf61f35 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Received event network-vif-plugged-1dee9c67-fb01-4fcd-8f35-805a326ee235 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 10:34:20 np0005588920 nova_compute[226886]: 2026-01-20 15:34:20.535 226890 DEBUG oslo_concurrency.lockutils [req-3a6d69bb-7b83-4bb3-90dd-3f48928e6bf5 req-8d97c2a6-8c33-4598-bbe5-dbaf0cf61f35 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:34:20 np0005588920 nova_compute[226886]: 2026-01-20 15:34:20.536 226890 DEBUG oslo_concurrency.lockutils [req-3a6d69bb-7b83-4bb3-90dd-3f48928e6bf5 req-8d97c2a6-8c33-4598-bbe5-dbaf0cf61f35 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:34:20 np0005588920 nova_compute[226886]: 2026-01-20 15:34:20.536 226890 DEBUG oslo_concurrency.lockutils [req-3a6d69bb-7b83-4bb3-90dd-3f48928e6bf5 req-8d97c2a6-8c33-4598-bbe5-dbaf0cf61f35 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:34:20 np0005588920 nova_compute[226886]: 2026-01-20 15:34:20.536 226890 DEBUG nova.compute.manager [req-3a6d69bb-7b83-4bb3-90dd-3f48928e6bf5 req-8d97c2a6-8c33-4598-bbe5-dbaf0cf61f35 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] No waiting events found dispatching network-vif-plugged-1dee9c67-fb01-4fcd-8f35-805a326ee235 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 20 10:34:20 np0005588920 nova_compute[226886]: 2026-01-20 15:34:20.536 226890 WARNING nova.compute.manager [req-3a6d69bb-7b83-4bb3-90dd-3f48928e6bf5 req-8d97c2a6-8c33-4598-bbe5-dbaf0cf61f35 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Received unexpected event network-vif-plugged-1dee9c67-fb01-4fcd-8f35-805a326ee235 for instance with vm_state active and task_state None.
Jan 20 10:34:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:34:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:22.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:34:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:22.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:23 np0005588920 nova_compute[226886]: 2026-01-20 15:34:23.364 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Updating instance_info_cache with network_info: [{"id": "1dee9c67-fb01-4fcd-8f35-805a326ee235", "address": "fa:16:3e:ea:3a:75", "network": {"id": "99dd5684-1685-443e-9373-f548d80784f6", "bridge": "br-int", "label": "tempest-network-smoke--802350272", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1dee9c67-fb", "ovs_interfaceid": "1dee9c67-fb01-4fcd-8f35-805a326ee235", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 20 10:34:23 np0005588920 nova_compute[226886]: 2026-01-20 15:34:23.387 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Releasing lock "refresh_cache-bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 20 10:34:23 np0005588920 nova_compute[226886]: 2026-01-20 15:34:23.388 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 20 10:34:23 np0005588920 nova_compute[226886]: 2026-01-20 15:34:23.389 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:34:23 np0005588920 nova_compute[226886]: 2026-01-20 15:34:23.389 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:34:23 np0005588920 nova_compute[226886]: 2026-01-20 15:34:23.390 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:34:24 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:34:24 np0005588920 nova_compute[226886]: 2026-01-20 15:34:24.054 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:34:24 np0005588920 nova_compute[226886]: 2026-01-20 15:34:24.181 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:34:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:24.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:24.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:24 np0005588920 NetworkManager[49076]: <info>  [1768923264.6619] manager: (patch-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/448)
Jan 20 10:34:24 np0005588920 NetworkManager[49076]: <info>  [1768923264.6636] manager: (patch-br-int-to-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/449)
Jan 20 10:34:24 np0005588920 ovn_controller[133971]: 2026-01-20T15:34:24Z|00955|binding|INFO|Releasing lport b36be382-7937-4c5c-b0f7-fc4a6e68a050 from this chassis (sb_readonly=0)
Jan 20 10:34:24 np0005588920 nova_compute[226886]: 2026-01-20 15:34:24.674 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:34:24 np0005588920 nova_compute[226886]: 2026-01-20 15:34:24.695 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:34:24 np0005588920 ovn_controller[133971]: 2026-01-20T15:34:24Z|00956|binding|INFO|Releasing lport b36be382-7937-4c5c-b0f7-fc4a6e68a050 from this chassis (sb_readonly=0)
Jan 20 10:34:24 np0005588920 nova_compute[226886]: 2026-01-20 15:34:24.699 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:34:25 np0005588920 nova_compute[226886]: 2026-01-20 15:34:25.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:34:25 np0005588920 nova_compute[226886]: 2026-01-20 15:34:25.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:34:25 np0005588920 nova_compute[226886]: 2026-01-20 15:34:25.754 226890 DEBUG nova.compute.manager [req-e6f76a77-e86e-462e-b6d6-7a134e5ab59a req-bd211b87-084f-4022-bf13-cc27aaa0be5a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Received event network-changed-1dee9c67-fb01-4fcd-8f35-805a326ee235 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 10:34:25 np0005588920 nova_compute[226886]: 2026-01-20 15:34:25.755 226890 DEBUG nova.compute.manager [req-e6f76a77-e86e-462e-b6d6-7a134e5ab59a req-bd211b87-084f-4022-bf13-cc27aaa0be5a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Refreshing instance network info cache due to event network-changed-1dee9c67-fb01-4fcd-8f35-805a326ee235. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 20 10:34:25 np0005588920 nova_compute[226886]: 2026-01-20 15:34:25.755 226890 DEBUG oslo_concurrency.lockutils [req-e6f76a77-e86e-462e-b6d6-7a134e5ab59a req-bd211b87-084f-4022-bf13-cc27aaa0be5a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 20 10:34:25 np0005588920 nova_compute[226886]: 2026-01-20 15:34:25.755 226890 DEBUG oslo_concurrency.lockutils [req-e6f76a77-e86e-462e-b6d6-7a134e5ab59a req-bd211b87-084f-4022-bf13-cc27aaa0be5a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 20 10:34:25 np0005588920 nova_compute[226886]: 2026-01-20 15:34:25.755 226890 DEBUG nova.network.neutron [req-e6f76a77-e86e-462e-b6d6-7a134e5ab59a req-bd211b87-084f-4022-bf13-cc27aaa0be5a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Refreshing network info cache for port 1dee9c67-fb01-4fcd-8f35-805a326ee235 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 20 10:34:26 np0005588920 nova_compute[226886]: 2026-01-20 15:34:26.002 226890 DEBUG oslo_concurrency.lockutils [None req-93aada35-fc79-44b3-ab97-ed1a1181174b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:34:26 np0005588920 nova_compute[226886]: 2026-01-20 15:34:26.003 226890 DEBUG oslo_concurrency.lockutils [None req-93aada35-fc79-44b3-ab97-ed1a1181174b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:34:26 np0005588920 nova_compute[226886]: 2026-01-20 15:34:26.003 226890 DEBUG oslo_concurrency.lockutils [None req-93aada35-fc79-44b3-ab97-ed1a1181174b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:34:26 np0005588920 nova_compute[226886]: 2026-01-20 15:34:26.003 226890 DEBUG oslo_concurrency.lockutils [None req-93aada35-fc79-44b3-ab97-ed1a1181174b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:34:26 np0005588920 nova_compute[226886]: 2026-01-20 15:34:26.003 226890 DEBUG oslo_concurrency.lockutils [None req-93aada35-fc79-44b3-ab97-ed1a1181174b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:34:26 np0005588920 nova_compute[226886]: 2026-01-20 15:34:26.005 226890 INFO nova.compute.manager [None req-93aada35-fc79-44b3-ab97-ed1a1181174b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Terminating instance#033[00m
Jan 20 10:34:26 np0005588920 nova_compute[226886]: 2026-01-20 15:34:26.005 226890 DEBUG nova.compute.manager [None req-93aada35-fc79-44b3-ab97-ed1a1181174b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:34:26 np0005588920 kernel: tap1dee9c67-fb (unregistering): left promiscuous mode
Jan 20 10:34:26 np0005588920 NetworkManager[49076]: <info>  [1768923266.0501] device (tap1dee9c67-fb): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:34:26 np0005588920 nova_compute[226886]: 2026-01-20 15:34:26.091 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:26 np0005588920 ovn_controller[133971]: 2026-01-20T15:34:26Z|00957|binding|INFO|Releasing lport 1dee9c67-fb01-4fcd-8f35-805a326ee235 from this chassis (sb_readonly=0)
Jan 20 10:34:26 np0005588920 ovn_controller[133971]: 2026-01-20T15:34:26Z|00958|binding|INFO|Setting lport 1dee9c67-fb01-4fcd-8f35-805a326ee235 down in Southbound
Jan 20 10:34:26 np0005588920 ovn_controller[133971]: 2026-01-20T15:34:26Z|00959|binding|INFO|Removing iface tap1dee9c67-fb ovn-installed in OVS
Jan 20 10:34:26 np0005588920 nova_compute[226886]: 2026-01-20 15:34:26.093 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:34:26.099 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ea:3a:75 10.100.0.12'], port_security=['fa:16:3e:ea:3a:75 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-703015767', 'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-99dd5684-1685-443e-9373-f548d80784f6', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-703015767', 'neutron:project_id': '3168f57421fb49bfb94b85daedd1fe7d', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'aaa69ba6-9a27-441e-877e-2cd188322a42', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.217'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=44d2010b-16ff-4152-8c6b-d6e8ffb1b3ca, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=1dee9c67-fb01-4fcd-8f35-805a326ee235) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:34:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:34:26.100 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 1dee9c67-fb01-4fcd-8f35-805a326ee235 in datapath 99dd5684-1685-443e-9373-f548d80784f6 unbound from our chassis#033[00m
Jan 20 10:34:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:34:26.101 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 99dd5684-1685-443e-9373-f548d80784f6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:34:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:34:26.102 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[24534284-79d3-4013-b2ce-1eb2aae09e41]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:34:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:34:26.103 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-99dd5684-1685-443e-9373-f548d80784f6 namespace which is not needed anymore#033[00m
Jan 20 10:34:26 np0005588920 nova_compute[226886]: 2026-01-20 15:34:26.105 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:26 np0005588920 systemd[1]: machine-qemu\x2d98\x2dinstance\x2d000000d2.scope: Deactivated successfully.
Jan 20 10:34:26 np0005588920 systemd[1]: machine-qemu\x2d98\x2dinstance\x2d000000d2.scope: Consumed 8.963s CPU time.
Jan 20 10:34:26 np0005588920 systemd-machined[196121]: Machine qemu-98-instance-000000d2 terminated.
Jan 20 10:34:26 np0005588920 nova_compute[226886]: 2026-01-20 15:34:26.225 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:26 np0005588920 nova_compute[226886]: 2026-01-20 15:34:26.229 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:26 np0005588920 nova_compute[226886]: 2026-01-20 15:34:26.242 226890 INFO nova.virt.libvirt.driver [-] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Instance destroyed successfully.#033[00m
Jan 20 10:34:26 np0005588920 nova_compute[226886]: 2026-01-20 15:34:26.242 226890 DEBUG nova.objects.instance [None req-93aada35-fc79-44b3-ab97-ed1a1181174b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lazy-loading 'resources' on Instance uuid bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:34:26 np0005588920 neutron-haproxy-ovnmeta-99dd5684-1685-443e-9373-f548d80784f6[308435]: [NOTICE]   (308439) : haproxy version is 2.8.14-c23fe91
Jan 20 10:34:26 np0005588920 neutron-haproxy-ovnmeta-99dd5684-1685-443e-9373-f548d80784f6[308435]: [NOTICE]   (308439) : path to executable is /usr/sbin/haproxy
Jan 20 10:34:26 np0005588920 neutron-haproxy-ovnmeta-99dd5684-1685-443e-9373-f548d80784f6[308435]: [WARNING]  (308439) : Exiting Master process...
Jan 20 10:34:26 np0005588920 neutron-haproxy-ovnmeta-99dd5684-1685-443e-9373-f548d80784f6[308435]: [ALERT]    (308439) : Current worker (308441) exited with code 143 (Terminated)
Jan 20 10:34:26 np0005588920 neutron-haproxy-ovnmeta-99dd5684-1685-443e-9373-f548d80784f6[308435]: [WARNING]  (308439) : All workers exited. Exiting... (0)
Jan 20 10:34:26 np0005588920 systemd[1]: libpod-10a54a94f87011935c703f8ddfbb6a6272b3c600713eeee8bfc0a865bd36377f.scope: Deactivated successfully.
Jan 20 10:34:26 np0005588920 podman[308473]: 2026-01-20 15:34:26.259854265 +0000 UTC m=+0.060887919 container died 10a54a94f87011935c703f8ddfbb6a6272b3c600713eeee8bfc0a865bd36377f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-99dd5684-1685-443e-9373-f548d80784f6, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 20 10:34:26 np0005588920 nova_compute[226886]: 2026-01-20 15:34:26.262 226890 DEBUG nova.virt.libvirt.vif [None req-93aada35-fc79-44b3-ab97-ed1a1181174b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:34:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-385886117',display_name='tempest-TestNetworkBasicOps-server-385886117',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-385886117',id=210,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG4egywmUgiowaHLTJn1nErdfaP0oMdhQmIgUXn8uXQUfQZSuklJDV6MWPtY7LPTEUS5qVJRUYQY1UGkGkHNZraRamn/90IKm/HwKOMLNLsEuUfRLsBa0wZUmsHoCCyjuQ==',key_name='tempest-TestNetworkBasicOps-841590218',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:34:17Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3168f57421fb49bfb94b85daedd1fe7d',ramdisk_id='',reservation_id='r-a4liy8zm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-807695970',owner_user_name='tempest-TestNetworkBasicOps-807695970-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:34:17Z,user_data=None,user_id='5338aa65dc0e4326a66ce79053787f14',uuid=bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1dee9c67-fb01-4fcd-8f35-805a326ee235", "address": "fa:16:3e:ea:3a:75", "network": {"id": "99dd5684-1685-443e-9373-f548d80784f6", "bridge": "br-int", "label": "tempest-network-smoke--802350272", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1dee9c67-fb", "ovs_interfaceid": "1dee9c67-fb01-4fcd-8f35-805a326ee235", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:34:26 np0005588920 nova_compute[226886]: 2026-01-20 15:34:26.263 226890 DEBUG nova.network.os_vif_util [None req-93aada35-fc79-44b3-ab97-ed1a1181174b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converting VIF {"id": "1dee9c67-fb01-4fcd-8f35-805a326ee235", "address": "fa:16:3e:ea:3a:75", "network": {"id": "99dd5684-1685-443e-9373-f548d80784f6", "bridge": "br-int", "label": "tempest-network-smoke--802350272", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1dee9c67-fb", "ovs_interfaceid": "1dee9c67-fb01-4fcd-8f35-805a326ee235", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:34:26 np0005588920 nova_compute[226886]: 2026-01-20 15:34:26.264 226890 DEBUG nova.network.os_vif_util [None req-93aada35-fc79-44b3-ab97-ed1a1181174b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ea:3a:75,bridge_name='br-int',has_traffic_filtering=True,id=1dee9c67-fb01-4fcd-8f35-805a326ee235,network=Network(99dd5684-1685-443e-9373-f548d80784f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1dee9c67-fb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:34:26 np0005588920 nova_compute[226886]: 2026-01-20 15:34:26.264 226890 DEBUG os_vif [None req-93aada35-fc79-44b3-ab97-ed1a1181174b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ea:3a:75,bridge_name='br-int',has_traffic_filtering=True,id=1dee9c67-fb01-4fcd-8f35-805a326ee235,network=Network(99dd5684-1685-443e-9373-f548d80784f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1dee9c67-fb') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:34:26 np0005588920 nova_compute[226886]: 2026-01-20 15:34:26.266 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:26 np0005588920 nova_compute[226886]: 2026-01-20 15:34:26.266 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1dee9c67-fb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:34:26 np0005588920 nova_compute[226886]: 2026-01-20 15:34:26.269 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:34:26 np0005588920 nova_compute[226886]: 2026-01-20 15:34:26.272 226890 INFO os_vif [None req-93aada35-fc79-44b3-ab97-ed1a1181174b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ea:3a:75,bridge_name='br-int',has_traffic_filtering=True,id=1dee9c67-fb01-4fcd-8f35-805a326ee235,network=Network(99dd5684-1685-443e-9373-f548d80784f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1dee9c67-fb')#033[00m
Jan 20 10:34:26 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-10a54a94f87011935c703f8ddfbb6a6272b3c600713eeee8bfc0a865bd36377f-userdata-shm.mount: Deactivated successfully.
Jan 20 10:34:26 np0005588920 systemd[1]: var-lib-containers-storage-overlay-bf80396d8a3c738552098e4cf208194b0ca28c3298ac8f1bb18aded9ee83566d-merged.mount: Deactivated successfully.
Jan 20 10:34:26 np0005588920 podman[308473]: 2026-01-20 15:34:26.339961408 +0000 UTC m=+0.140995062 container cleanup 10a54a94f87011935c703f8ddfbb6a6272b3c600713eeee8bfc0a865bd36377f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-99dd5684-1685-443e-9373-f548d80784f6, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:34:26 np0005588920 systemd[1]: libpod-conmon-10a54a94f87011935c703f8ddfbb6a6272b3c600713eeee8bfc0a865bd36377f.scope: Deactivated successfully.
Jan 20 10:34:26 np0005588920 podman[308530]: 2026-01-20 15:34:26.414494343 +0000 UTC m=+0.051713458 container remove 10a54a94f87011935c703f8ddfbb6a6272b3c600713eeee8bfc0a865bd36377f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-99dd5684-1685-443e-9373-f548d80784f6, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 20 10:34:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:26.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:34:26.420 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[4a91b0ab-aeeb-48f0-8aa0-06c819b84d4f]: (4, ('Tue Jan 20 03:34:26 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-99dd5684-1685-443e-9373-f548d80784f6 (10a54a94f87011935c703f8ddfbb6a6272b3c600713eeee8bfc0a865bd36377f)\n10a54a94f87011935c703f8ddfbb6a6272b3c600713eeee8bfc0a865bd36377f\nTue Jan 20 03:34:26 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-99dd5684-1685-443e-9373-f548d80784f6 (10a54a94f87011935c703f8ddfbb6a6272b3c600713eeee8bfc0a865bd36377f)\n10a54a94f87011935c703f8ddfbb6a6272b3c600713eeee8bfc0a865bd36377f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:34:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:34:26.421 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[83b2fbcf-4371-4e85-ac72-09eb7b137c99]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:34:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:34:26.422 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap99dd5684-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:34:26 np0005588920 nova_compute[226886]: 2026-01-20 15:34:26.424 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:26 np0005588920 kernel: tap99dd5684-10: left promiscuous mode
Jan 20 10:34:26 np0005588920 nova_compute[226886]: 2026-01-20 15:34:26.436 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:34:26.438 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a974cd32-1d93-424a-8528-f6215e41904e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:34:26 np0005588920 nova_compute[226886]: 2026-01-20 15:34:26.442 226890 DEBUG nova.compute.manager [req-acdfb32b-cbef-45bd-ad96-6adcf7176ac4 req-2c29d552-24af-4c12-8582-292c0033ba22 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Received event network-vif-unplugged-1dee9c67-fb01-4fcd-8f35-805a326ee235 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:34:26 np0005588920 nova_compute[226886]: 2026-01-20 15:34:26.443 226890 DEBUG oslo_concurrency.lockutils [req-acdfb32b-cbef-45bd-ad96-6adcf7176ac4 req-2c29d552-24af-4c12-8582-292c0033ba22 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:34:26 np0005588920 nova_compute[226886]: 2026-01-20 15:34:26.443 226890 DEBUG oslo_concurrency.lockutils [req-acdfb32b-cbef-45bd-ad96-6adcf7176ac4 req-2c29d552-24af-4c12-8582-292c0033ba22 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:34:26 np0005588920 nova_compute[226886]: 2026-01-20 15:34:26.443 226890 DEBUG oslo_concurrency.lockutils [req-acdfb32b-cbef-45bd-ad96-6adcf7176ac4 req-2c29d552-24af-4c12-8582-292c0033ba22 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:34:26 np0005588920 nova_compute[226886]: 2026-01-20 15:34:26.443 226890 DEBUG nova.compute.manager [req-acdfb32b-cbef-45bd-ad96-6adcf7176ac4 req-2c29d552-24af-4c12-8582-292c0033ba22 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] No waiting events found dispatching network-vif-unplugged-1dee9c67-fb01-4fcd-8f35-805a326ee235 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:34:26 np0005588920 nova_compute[226886]: 2026-01-20 15:34:26.444 226890 DEBUG nova.compute.manager [req-acdfb32b-cbef-45bd-ad96-6adcf7176ac4 req-2c29d552-24af-4c12-8582-292c0033ba22 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Received event network-vif-unplugged-1dee9c67-fb01-4fcd-8f35-805a326ee235 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:34:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:34:26.459 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[fea2d22f-e259-4d0e-a98e-ba3b2de20942]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:34:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:34:26.460 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[7e08ca8c-30ae-41a3-a635-deee0e5ce31e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:34:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:34:26.476 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a9b55586-2324-4c03-b9f6-12e7ed021f55]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 829727, 'reachable_time': 34362, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 308543, 'error': None, 'target': 'ovnmeta-99dd5684-1685-443e-9373-f548d80784f6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:34:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:34:26.481 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-99dd5684-1685-443e-9373-f548d80784f6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:34:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:34:26.481 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[42af695b-e063-45a2-8422-d0a67d4daff2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:34:26 np0005588920 systemd[1]: run-netns-ovnmeta\x2d99dd5684\x2d1685\x2d443e\x2d9373\x2df548d80784f6.mount: Deactivated successfully.
Jan 20 10:34:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:34:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:26.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:34:27 np0005588920 nova_compute[226886]: 2026-01-20 15:34:27.822 226890 DEBUG nova.network.neutron [req-e6f76a77-e86e-462e-b6d6-7a134e5ab59a req-bd211b87-084f-4022-bf13-cc27aaa0be5a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Updated VIF entry in instance network info cache for port 1dee9c67-fb01-4fcd-8f35-805a326ee235. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:34:27 np0005588920 nova_compute[226886]: 2026-01-20 15:34:27.822 226890 DEBUG nova.network.neutron [req-e6f76a77-e86e-462e-b6d6-7a134e5ab59a req-bd211b87-084f-4022-bf13-cc27aaa0be5a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Updating instance_info_cache with network_info: [{"id": "1dee9c67-fb01-4fcd-8f35-805a326ee235", "address": "fa:16:3e:ea:3a:75", "network": {"id": "99dd5684-1685-443e-9373-f548d80784f6", "bridge": "br-int", "label": "tempest-network-smoke--802350272", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1dee9c67-fb", "ovs_interfaceid": "1dee9c67-fb01-4fcd-8f35-805a326ee235", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:34:27 np0005588920 nova_compute[226886]: 2026-01-20 15:34:27.847 226890 DEBUG oslo_concurrency.lockutils [req-e6f76a77-e86e-462e-b6d6-7a134e5ab59a req-bd211b87-084f-4022-bf13-cc27aaa0be5a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:34:28 np0005588920 nova_compute[226886]: 2026-01-20 15:34:28.250 226890 INFO nova.virt.libvirt.driver [None req-93aada35-fc79-44b3-ab97-ed1a1181174b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Deleting instance files /var/lib/nova/instances/bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf_del#033[00m
Jan 20 10:34:28 np0005588920 nova_compute[226886]: 2026-01-20 15:34:28.251 226890 INFO nova.virt.libvirt.driver [None req-93aada35-fc79-44b3-ab97-ed1a1181174b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Deletion of /var/lib/nova/instances/bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf_del complete#033[00m
Jan 20 10:34:28 np0005588920 nova_compute[226886]: 2026-01-20 15:34:28.312 226890 INFO nova.compute.manager [None req-93aada35-fc79-44b3-ab97-ed1a1181174b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Took 2.31 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:34:28 np0005588920 nova_compute[226886]: 2026-01-20 15:34:28.312 226890 DEBUG oslo.service.loopingcall [None req-93aada35-fc79-44b3-ab97-ed1a1181174b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:34:28 np0005588920 nova_compute[226886]: 2026-01-20 15:34:28.312 226890 DEBUG nova.compute.manager [-] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:34:28 np0005588920 nova_compute[226886]: 2026-01-20 15:34:28.313 226890 DEBUG nova.network.neutron [-] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:34:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:28.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:28.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:28 np0005588920 nova_compute[226886]: 2026-01-20 15:34:28.535 226890 DEBUG nova.compute.manager [req-b4313b96-a96f-4f0f-9553-ab0717b6845c req-2b15de9d-b8a7-4319-86e7-068acd6835d9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Received event network-vif-plugged-1dee9c67-fb01-4fcd-8f35-805a326ee235 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:34:28 np0005588920 nova_compute[226886]: 2026-01-20 15:34:28.535 226890 DEBUG oslo_concurrency.lockutils [req-b4313b96-a96f-4f0f-9553-ab0717b6845c req-2b15de9d-b8a7-4319-86e7-068acd6835d9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:34:28 np0005588920 nova_compute[226886]: 2026-01-20 15:34:28.535 226890 DEBUG oslo_concurrency.lockutils [req-b4313b96-a96f-4f0f-9553-ab0717b6845c req-2b15de9d-b8a7-4319-86e7-068acd6835d9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:34:28 np0005588920 nova_compute[226886]: 2026-01-20 15:34:28.536 226890 DEBUG oslo_concurrency.lockutils [req-b4313b96-a96f-4f0f-9553-ab0717b6845c req-2b15de9d-b8a7-4319-86e7-068acd6835d9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:34:28 np0005588920 nova_compute[226886]: 2026-01-20 15:34:28.536 226890 DEBUG nova.compute.manager [req-b4313b96-a96f-4f0f-9553-ab0717b6845c req-2b15de9d-b8a7-4319-86e7-068acd6835d9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] No waiting events found dispatching network-vif-plugged-1dee9c67-fb01-4fcd-8f35-805a326ee235 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:34:28 np0005588920 nova_compute[226886]: 2026-01-20 15:34:28.536 226890 WARNING nova.compute.manager [req-b4313b96-a96f-4f0f-9553-ab0717b6845c req-2b15de9d-b8a7-4319-86e7-068acd6835d9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Received unexpected event network-vif-plugged-1dee9c67-fb01-4fcd-8f35-805a326ee235 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 10:34:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:34:29 np0005588920 nova_compute[226886]: 2026-01-20 15:34:29.184 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:34:30.216 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=80, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=79) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:34:30 np0005588920 nova_compute[226886]: 2026-01-20 15:34:30.216 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:30 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:34:30.217 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:34:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:30.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:30 np0005588920 nova_compute[226886]: 2026-01-20 15:34:30.423 226890 DEBUG nova.network.neutron [-] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:34:30 np0005588920 nova_compute[226886]: 2026-01-20 15:34:30.443 226890 INFO nova.compute.manager [-] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Took 2.13 seconds to deallocate network for instance.#033[00m
Jan 20 10:34:30 np0005588920 nova_compute[226886]: 2026-01-20 15:34:30.494 226890 DEBUG oslo_concurrency.lockutils [None req-93aada35-fc79-44b3-ab97-ed1a1181174b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:34:30 np0005588920 nova_compute[226886]: 2026-01-20 15:34:30.494 226890 DEBUG oslo_concurrency.lockutils [None req-93aada35-fc79-44b3-ab97-ed1a1181174b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:34:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:34:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:30.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:34:30 np0005588920 nova_compute[226886]: 2026-01-20 15:34:30.554 226890 DEBUG oslo_concurrency.processutils [None req-93aada35-fc79-44b3-ab97-ed1a1181174b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:34:30 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:34:30 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/22013695' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:34:30 np0005588920 nova_compute[226886]: 2026-01-20 15:34:30.985 226890 DEBUG oslo_concurrency.processutils [None req-93aada35-fc79-44b3-ab97-ed1a1181174b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.430s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:34:30 np0005588920 nova_compute[226886]: 2026-01-20 15:34:30.991 226890 DEBUG nova.compute.provider_tree [None req-93aada35-fc79-44b3-ab97-ed1a1181174b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:34:31 np0005588920 nova_compute[226886]: 2026-01-20 15:34:31.015 226890 DEBUG nova.scheduler.client.report [None req-93aada35-fc79-44b3-ab97-ed1a1181174b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:34:31 np0005588920 nova_compute[226886]: 2026-01-20 15:34:31.037 226890 DEBUG oslo_concurrency.lockutils [None req-93aada35-fc79-44b3-ab97-ed1a1181174b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.543s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:34:31 np0005588920 nova_compute[226886]: 2026-01-20 15:34:31.075 226890 INFO nova.scheduler.client.report [None req-93aada35-fc79-44b3-ab97-ed1a1181174b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Deleted allocations for instance bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf#033[00m
Jan 20 10:34:31 np0005588920 nova_compute[226886]: 2026-01-20 15:34:31.182 226890 DEBUG oslo_concurrency.lockutils [None req-93aada35-fc79-44b3-ab97-ed1a1181174b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.179s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:34:31 np0005588920 nova_compute[226886]: 2026-01-20 15:34:31.267 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:31 np0005588920 nova_compute[226886]: 2026-01-20 15:34:31.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:34:32 np0005588920 podman[308570]: 2026-01-20 15:34:32.001748735 +0000 UTC m=+0.088494052 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Jan 20 10:34:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:34:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:32.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:34:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:32.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:32 np0005588920 nova_compute[226886]: 2026-01-20 15:34:32.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:34:32 np0005588920 nova_compute[226886]: 2026-01-20 15:34:32.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:34:34 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:34:34 np0005588920 nova_compute[226886]: 2026-01-20 15:34:34.186 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:34:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:34.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:34:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:34:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:34.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:34:36 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:34:36.218 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '80'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:34:36 np0005588920 nova_compute[226886]: 2026-01-20 15:34:36.286 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:36.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:34:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:36.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:34:37 np0005588920 nova_compute[226886]: 2026-01-20 15:34:37.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:34:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:38.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:38.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:39 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:34:39 np0005588920 nova_compute[226886]: 2026-01-20 15:34:39.188 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:39 np0005588920 podman[308596]: 2026-01-20 15:34:39.949934529 +0000 UTC m=+0.042592040 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 20 10:34:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:40.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:34:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:40.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:34:41 np0005588920 nova_compute[226886]: 2026-01-20 15:34:41.240 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768923266.2390018, bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:34:41 np0005588920 nova_compute[226886]: 2026-01-20 15:34:41.241 226890 INFO nova.compute.manager [-] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:34:41 np0005588920 nova_compute[226886]: 2026-01-20 15:34:41.350 226890 DEBUG nova.compute.manager [None req-9ceb876f-952f-47bb-9dcb-3ac9530534fa - - - - - -] [instance: bfdde0ba-77cb-47b2-9ca0-4d8f0398beaf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:34:41 np0005588920 nova_compute[226886]: 2026-01-20 15:34:41.351 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:42.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:42.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:34:44 np0005588920 nova_compute[226886]: 2026-01-20 15:34:44.191 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:34:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:44.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:34:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:44.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:46 np0005588920 nova_compute[226886]: 2026-01-20 15:34:46.352 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:34:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:46.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:34:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:46.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:47 np0005588920 nova_compute[226886]: 2026-01-20 15:34:47.739 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:34:47 np0005588920 nova_compute[226886]: 2026-01-20 15:34:47.739 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 20 10:34:47 np0005588920 nova_compute[226886]: 2026-01-20 15:34:47.756 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 20 10:34:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:48.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:48.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:49 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:34:49 np0005588920 nova_compute[226886]: 2026-01-20 15:34:49.193 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:50.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:34:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:50.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:34:51 np0005588920 nova_compute[226886]: 2026-01-20 15:34:51.355 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:52.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:52.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:53 np0005588920 nova_compute[226886]: 2026-01-20 15:34:53.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:34:53 np0005588920 nova_compute[226886]: 2026-01-20 15:34:53.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 20 10:34:54 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:34:54 np0005588920 nova_compute[226886]: 2026-01-20 15:34:54.195 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:54.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:54.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:56 np0005588920 nova_compute[226886]: 2026-01-20 15:34:56.400 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:56.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:34:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:56.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:34:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:58.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:34:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:58.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:59 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:34:59 np0005588920 nova_compute[226886]: 2026-01-20 15:34:59.198 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:35:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:00.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:00.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:01 np0005588920 nova_compute[226886]: 2026-01-20 15:35:01.404 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:35:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:02.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:02.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:02 np0005588920 podman[308617]: 2026-01-20 15:35:02.984589578 +0000 UTC m=+0.076583013 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 20 10:35:04 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:35:04 np0005588920 nova_compute[226886]: 2026-01-20 15:35:04.200 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:35:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:04.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:35:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:04.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:35:05 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:35:05 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:35:05 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:35:06 np0005588920 nova_compute[226886]: 2026-01-20 15:35:06.407 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:35:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:06.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:06.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:35:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:08.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:35:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:35:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:08.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:35:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:35:09 np0005588920 nova_compute[226886]: 2026-01-20 15:35:09.201 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:35:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:10.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:10 np0005588920 nova_compute[226886]: 2026-01-20 15:35:10.526 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:35:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:35:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:10.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:35:10 np0005588920 nova_compute[226886]: 2026-01-20 15:35:10.601 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:35:10 np0005588920 podman[308777]: 2026-01-20 15:35:10.964000208 +0000 UTC m=+0.054524058 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Jan 20 10:35:11 np0005588920 nova_compute[226886]: 2026-01-20 15:35:11.409 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:35:11 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:35:11 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:35:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:12.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:12.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:35:14 np0005588920 nova_compute[226886]: 2026-01-20 15:35:14.202 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:35:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:14.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.002000057s ======
Jan 20 10:35:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:14.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Jan 20 10:35:14 np0005588920 nova_compute[226886]: 2026-01-20 15:35:14.749 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:35:14 np0005588920 nova_compute[226886]: 2026-01-20 15:35:14.777 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:35:14 np0005588920 nova_compute[226886]: 2026-01-20 15:35:14.778 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:35:14 np0005588920 nova_compute[226886]: 2026-01-20 15:35:14.778 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:35:14 np0005588920 nova_compute[226886]: 2026-01-20 15:35:14.778 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:35:14 np0005588920 nova_compute[226886]: 2026-01-20 15:35:14.778 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:35:15 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:35:15 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/858350277' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:35:15 np0005588920 nova_compute[226886]: 2026-01-20 15:35:15.213 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:35:15 np0005588920 nova_compute[226886]: 2026-01-20 15:35:15.368 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:35:15 np0005588920 nova_compute[226886]: 2026-01-20 15:35:15.370 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4148MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:35:15 np0005588920 nova_compute[226886]: 2026-01-20 15:35:15.370 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:35:15 np0005588920 nova_compute[226886]: 2026-01-20 15:35:15.370 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:35:15 np0005588920 nova_compute[226886]: 2026-01-20 15:35:15.439 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:35:15 np0005588920 nova_compute[226886]: 2026-01-20 15:35:15.439 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:35:15 np0005588920 nova_compute[226886]: 2026-01-20 15:35:15.513 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:35:15 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:35:15 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1905477531' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:35:15 np0005588920 nova_compute[226886]: 2026-01-20 15:35:15.971 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:35:15 np0005588920 nova_compute[226886]: 2026-01-20 15:35:15.978 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:35:16 np0005588920 nova_compute[226886]: 2026-01-20 15:35:16.000 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:35:16 np0005588920 nova_compute[226886]: 2026-01-20 15:35:16.036 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:35:16 np0005588920 nova_compute[226886]: 2026-01-20 15:35:16.036 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.666s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:35:16 np0005588920 nova_compute[226886]: 2026-01-20 15:35:16.418 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:35:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:16.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:35:16.499 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:35:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:35:16.500 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:35:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:35:16.501 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:35:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:35:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:16.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:35:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:18.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:18.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:19 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:35:19 np0005588920 nova_compute[226886]: 2026-01-20 15:35:19.206 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:35:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:35:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:20.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:35:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:35:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:20.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:35:21 np0005588920 nova_compute[226886]: 2026-01-20 15:35:21.014 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:35:21 np0005588920 nova_compute[226886]: 2026-01-20 15:35:21.014 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:35:21 np0005588920 nova_compute[226886]: 2026-01-20 15:35:21.015 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:35:21 np0005588920 nova_compute[226886]: 2026-01-20 15:35:21.032 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:35:21 np0005588920 nova_compute[226886]: 2026-01-20 15:35:21.423 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:35:21 np0005588920 nova_compute[226886]: 2026-01-20 15:35:21.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:35:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:22.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:22.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:22 np0005588920 nova_compute[226886]: 2026-01-20 15:35:22.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:35:23 np0005588920 nova_compute[226886]: 2026-01-20 15:35:23.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:35:24 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:35:24 np0005588920 nova_compute[226886]: 2026-01-20 15:35:24.208 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:35:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:35:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:24.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:35:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:35:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:24.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:35:24 np0005588920 nova_compute[226886]: 2026-01-20 15:35:24.721 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:35:25 np0005588920 nova_compute[226886]: 2026-01-20 15:35:25.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:35:26 np0005588920 nova_compute[226886]: 2026-01-20 15:35:26.427 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:35:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:26.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:26.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:28.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:28.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:35:29 np0005588920 nova_compute[226886]: 2026-01-20 15:35:29.210 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:35:29 np0005588920 nova_compute[226886]: 2026-01-20 15:35:29.721 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:35:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:30.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:30.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:31 np0005588920 nova_compute[226886]: 2026-01-20 15:35:31.431 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:35:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:32.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:35:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:32.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:35:32 np0005588920 nova_compute[226886]: 2026-01-20 15:35:32.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:35:32 np0005588920 nova_compute[226886]: 2026-01-20 15:35:32.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:35:33 np0005588920 nova_compute[226886]: 2026-01-20 15:35:33.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:35:33 np0005588920 podman[308892]: 2026-01-20 15:35:33.830437664 +0000 UTC m=+0.074041001 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:35:34 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:35:34 np0005588920 nova_compute[226886]: 2026-01-20 15:35:34.212 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:35:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:34.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:34.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:36 np0005588920 nova_compute[226886]: 2026-01-20 15:35:36.434 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:35:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:36.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:36.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:37 np0005588920 nova_compute[226886]: 2026-01-20 15:35:37.212 226890 DEBUG oslo_concurrency.lockutils [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "98535402-ae8c-46cb-bfbc-6011e34adc25" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:35:37 np0005588920 nova_compute[226886]: 2026-01-20 15:35:37.213 226890 DEBUG oslo_concurrency.lockutils [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "98535402-ae8c-46cb-bfbc-6011e34adc25" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:35:37 np0005588920 nova_compute[226886]: 2026-01-20 15:35:37.230 226890 DEBUG nova.compute.manager [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 10:35:37 np0005588920 nova_compute[226886]: 2026-01-20 15:35:37.332 226890 DEBUG oslo_concurrency.lockutils [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:35:37 np0005588920 nova_compute[226886]: 2026-01-20 15:35:37.333 226890 DEBUG oslo_concurrency.lockutils [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:35:37 np0005588920 nova_compute[226886]: 2026-01-20 15:35:37.340 226890 DEBUG nova.virt.hardware [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 10:35:37 np0005588920 nova_compute[226886]: 2026-01-20 15:35:37.341 226890 INFO nova.compute.claims [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 10:35:37 np0005588920 nova_compute[226886]: 2026-01-20 15:35:37.445 226890 DEBUG oslo_concurrency.processutils [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:35:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:35:37 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2682010573' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:35:37 np0005588920 nova_compute[226886]: 2026-01-20 15:35:37.884 226890 DEBUG oslo_concurrency.processutils [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:35:37 np0005588920 nova_compute[226886]: 2026-01-20 15:35:37.889 226890 DEBUG nova.compute.provider_tree [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:35:37 np0005588920 nova_compute[226886]: 2026-01-20 15:35:37.908 226890 DEBUG nova.scheduler.client.report [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:35:37 np0005588920 nova_compute[226886]: 2026-01-20 15:35:37.934 226890 DEBUG oslo_concurrency.lockutils [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.602s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:35:37 np0005588920 nova_compute[226886]: 2026-01-20 15:35:37.935 226890 DEBUG nova.compute.manager [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 10:35:38 np0005588920 nova_compute[226886]: 2026-01-20 15:35:38.027 226890 DEBUG nova.compute.manager [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 10:35:38 np0005588920 nova_compute[226886]: 2026-01-20 15:35:38.027 226890 DEBUG nova.network.neutron [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 10:35:38 np0005588920 nova_compute[226886]: 2026-01-20 15:35:38.051 226890 INFO nova.virt.libvirt.driver [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 10:35:38 np0005588920 nova_compute[226886]: 2026-01-20 15:35:38.087 226890 DEBUG nova.compute.manager [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 10:35:38 np0005588920 nova_compute[226886]: 2026-01-20 15:35:38.198 226890 DEBUG nova.compute.manager [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 10:35:38 np0005588920 nova_compute[226886]: 2026-01-20 15:35:38.199 226890 DEBUG nova.virt.libvirt.driver [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 10:35:38 np0005588920 nova_compute[226886]: 2026-01-20 15:35:38.200 226890 INFO nova.virt.libvirt.driver [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Creating image(s)#033[00m
Jan 20 10:35:38 np0005588920 nova_compute[226886]: 2026-01-20 15:35:38.226 226890 DEBUG nova.storage.rbd_utils [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image 98535402-ae8c-46cb-bfbc-6011e34adc25_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:35:38 np0005588920 nova_compute[226886]: 2026-01-20 15:35:38.256 226890 DEBUG nova.storage.rbd_utils [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image 98535402-ae8c-46cb-bfbc-6011e34adc25_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:35:38 np0005588920 nova_compute[226886]: 2026-01-20 15:35:38.284 226890 DEBUG nova.storage.rbd_utils [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image 98535402-ae8c-46cb-bfbc-6011e34adc25_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:35:38 np0005588920 nova_compute[226886]: 2026-01-20 15:35:38.287 226890 DEBUG oslo_concurrency.processutils [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:35:38 np0005588920 nova_compute[226886]: 2026-01-20 15:35:38.360 226890 DEBUG oslo_concurrency.processutils [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:35:38 np0005588920 nova_compute[226886]: 2026-01-20 15:35:38.361 226890 DEBUG oslo_concurrency.lockutils [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:35:38 np0005588920 nova_compute[226886]: 2026-01-20 15:35:38.361 226890 DEBUG oslo_concurrency.lockutils [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:35:38 np0005588920 nova_compute[226886]: 2026-01-20 15:35:38.362 226890 DEBUG oslo_concurrency.lockutils [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:35:38 np0005588920 nova_compute[226886]: 2026-01-20 15:35:38.410 226890 DEBUG nova.storage.rbd_utils [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image 98535402-ae8c-46cb-bfbc-6011e34adc25_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:35:38 np0005588920 nova_compute[226886]: 2026-01-20 15:35:38.413 226890 DEBUG oslo_concurrency.processutils [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 98535402-ae8c-46cb-bfbc-6011e34adc25_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:35:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:38.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:38.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:38 np0005588920 nova_compute[226886]: 2026-01-20 15:35:38.783 226890 DEBUG nova.policy [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5338aa65dc0e4326a66ce79053787f14', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3168f57421fb49bfb94b85daedd1fe7d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 10:35:39 np0005588920 nova_compute[226886]: 2026-01-20 15:35:39.027 226890 DEBUG oslo_concurrency.processutils [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 98535402-ae8c-46cb-bfbc-6011e34adc25_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.614s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:35:39 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:35:39 np0005588920 nova_compute[226886]: 2026-01-20 15:35:39.103 226890 DEBUG nova.storage.rbd_utils [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] resizing rbd image 98535402-ae8c-46cb-bfbc-6011e34adc25_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 10:35:39 np0005588920 nova_compute[226886]: 2026-01-20 15:35:39.214 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:35:39 np0005588920 nova_compute[226886]: 2026-01-20 15:35:39.552 226890 DEBUG nova.objects.instance [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lazy-loading 'migration_context' on Instance uuid 98535402-ae8c-46cb-bfbc-6011e34adc25 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:35:39 np0005588920 nova_compute[226886]: 2026-01-20 15:35:39.576 226890 DEBUG nova.virt.libvirt.driver [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 10:35:39 np0005588920 nova_compute[226886]: 2026-01-20 15:35:39.576 226890 DEBUG nova.virt.libvirt.driver [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Ensure instance console log exists: /var/lib/nova/instances/98535402-ae8c-46cb-bfbc-6011e34adc25/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:35:39 np0005588920 nova_compute[226886]: 2026-01-20 15:35:39.577 226890 DEBUG oslo_concurrency.lockutils [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:35:39 np0005588920 nova_compute[226886]: 2026-01-20 15:35:39.577 226890 DEBUG oslo_concurrency.lockutils [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:35:39 np0005588920 nova_compute[226886]: 2026-01-20 15:35:39.578 226890 DEBUG oslo_concurrency.lockutils [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:35:40 np0005588920 nova_compute[226886]: 2026-01-20 15:35:40.282 226890 DEBUG nova.network.neutron [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Successfully created port: 23586d88-de8c-4135-b0a0-1a967326da06 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 10:35:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:40.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:35:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:40.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:35:40 np0005588920 nova_compute[226886]: 2026-01-20 15:35:40.974 226890 DEBUG nova.network.neutron [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Successfully updated port: 23586d88-de8c-4135-b0a0-1a967326da06 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 10:35:40 np0005588920 nova_compute[226886]: 2026-01-20 15:35:40.996 226890 DEBUG oslo_concurrency.lockutils [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "refresh_cache-98535402-ae8c-46cb-bfbc-6011e34adc25" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:35:40 np0005588920 nova_compute[226886]: 2026-01-20 15:35:40.996 226890 DEBUG oslo_concurrency.lockutils [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquired lock "refresh_cache-98535402-ae8c-46cb-bfbc-6011e34adc25" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:35:40 np0005588920 nova_compute[226886]: 2026-01-20 15:35:40.996 226890 DEBUG nova.network.neutron [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:35:41 np0005588920 nova_compute[226886]: 2026-01-20 15:35:41.076 226890 DEBUG nova.compute.manager [req-c5ed9390-7db1-4606-86df-0e78308381c7 req-b491f347-c04f-4e7a-bac1-e6505c5458c8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Received event network-changed-23586d88-de8c-4135-b0a0-1a967326da06 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:35:41 np0005588920 nova_compute[226886]: 2026-01-20 15:35:41.076 226890 DEBUG nova.compute.manager [req-c5ed9390-7db1-4606-86df-0e78308381c7 req-b491f347-c04f-4e7a-bac1-e6505c5458c8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Refreshing instance network info cache due to event network-changed-23586d88-de8c-4135-b0a0-1a967326da06. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:35:41 np0005588920 nova_compute[226886]: 2026-01-20 15:35:41.077 226890 DEBUG oslo_concurrency.lockutils [req-c5ed9390-7db1-4606-86df-0e78308381c7 req-b491f347-c04f-4e7a-bac1-e6505c5458c8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-98535402-ae8c-46cb-bfbc-6011e34adc25" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:35:41 np0005588920 nova_compute[226886]: 2026-01-20 15:35:41.439 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:35:41 np0005588920 nova_compute[226886]: 2026-01-20 15:35:41.746 226890 DEBUG nova.network.neutron [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:35:41 np0005588920 podman[309108]: 2026-01-20 15:35:41.962250539 +0000 UTC m=+0.051268646 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:35:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:42.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:42.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:43 np0005588920 nova_compute[226886]: 2026-01-20 15:35:43.814 226890 DEBUG nova.network.neutron [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Updating instance_info_cache with network_info: [{"id": "23586d88-de8c-4135-b0a0-1a967326da06", "address": "fa:16:3e:29:04:15", "network": {"id": "094fc5a0-d658-4e68-b305-244ff77454a4", "bridge": "br-int", "label": "tempest-network-smoke--1321578468", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23586d88-de", "ovs_interfaceid": "23586d88-de8c-4135-b0a0-1a967326da06", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:35:43 np0005588920 nova_compute[226886]: 2026-01-20 15:35:43.840 226890 DEBUG oslo_concurrency.lockutils [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Releasing lock "refresh_cache-98535402-ae8c-46cb-bfbc-6011e34adc25" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:35:43 np0005588920 nova_compute[226886]: 2026-01-20 15:35:43.840 226890 DEBUG nova.compute.manager [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Instance network_info: |[{"id": "23586d88-de8c-4135-b0a0-1a967326da06", "address": "fa:16:3e:29:04:15", "network": {"id": "094fc5a0-d658-4e68-b305-244ff77454a4", "bridge": "br-int", "label": "tempest-network-smoke--1321578468", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23586d88-de", "ovs_interfaceid": "23586d88-de8c-4135-b0a0-1a967326da06", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 10:35:43 np0005588920 nova_compute[226886]: 2026-01-20 15:35:43.841 226890 DEBUG oslo_concurrency.lockutils [req-c5ed9390-7db1-4606-86df-0e78308381c7 req-b491f347-c04f-4e7a-bac1-e6505c5458c8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-98535402-ae8c-46cb-bfbc-6011e34adc25" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:35:43 np0005588920 nova_compute[226886]: 2026-01-20 15:35:43.841 226890 DEBUG nova.network.neutron [req-c5ed9390-7db1-4606-86df-0e78308381c7 req-b491f347-c04f-4e7a-bac1-e6505c5458c8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Refreshing network info cache for port 23586d88-de8c-4135-b0a0-1a967326da06 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:35:43 np0005588920 nova_compute[226886]: 2026-01-20 15:35:43.844 226890 DEBUG nova.virt.libvirt.driver [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Start _get_guest_xml network_info=[{"id": "23586d88-de8c-4135-b0a0-1a967326da06", "address": "fa:16:3e:29:04:15", "network": {"id": "094fc5a0-d658-4e68-b305-244ff77454a4", "bridge": "br-int", "label": "tempest-network-smoke--1321578468", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23586d88-de", "ovs_interfaceid": "23586d88-de8c-4135-b0a0-1a967326da06", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:35:43 np0005588920 nova_compute[226886]: 2026-01-20 15:35:43.847 226890 WARNING nova.virt.libvirt.driver [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:35:43 np0005588920 nova_compute[226886]: 2026-01-20 15:35:43.853 226890 DEBUG nova.virt.libvirt.host [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:35:43 np0005588920 nova_compute[226886]: 2026-01-20 15:35:43.853 226890 DEBUG nova.virt.libvirt.host [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:35:43 np0005588920 nova_compute[226886]: 2026-01-20 15:35:43.856 226890 DEBUG nova.virt.libvirt.host [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:35:43 np0005588920 nova_compute[226886]: 2026-01-20 15:35:43.857 226890 DEBUG nova.virt.libvirt.host [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:35:43 np0005588920 nova_compute[226886]: 2026-01-20 15:35:43.858 226890 DEBUG nova.virt.libvirt.driver [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:35:43 np0005588920 nova_compute[226886]: 2026-01-20 15:35:43.858 226890 DEBUG nova.virt.hardware [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:35:43 np0005588920 nova_compute[226886]: 2026-01-20 15:35:43.859 226890 DEBUG nova.virt.hardware [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:35:43 np0005588920 nova_compute[226886]: 2026-01-20 15:35:43.859 226890 DEBUG nova.virt.hardware [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:35:43 np0005588920 nova_compute[226886]: 2026-01-20 15:35:43.859 226890 DEBUG nova.virt.hardware [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:35:43 np0005588920 nova_compute[226886]: 2026-01-20 15:35:43.859 226890 DEBUG nova.virt.hardware [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:35:43 np0005588920 nova_compute[226886]: 2026-01-20 15:35:43.860 226890 DEBUG nova.virt.hardware [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:35:43 np0005588920 nova_compute[226886]: 2026-01-20 15:35:43.860 226890 DEBUG nova.virt.hardware [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:35:43 np0005588920 nova_compute[226886]: 2026-01-20 15:35:43.860 226890 DEBUG nova.virt.hardware [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:35:43 np0005588920 nova_compute[226886]: 2026-01-20 15:35:43.860 226890 DEBUG nova.virt.hardware [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:35:43 np0005588920 nova_compute[226886]: 2026-01-20 15:35:43.860 226890 DEBUG nova.virt.hardware [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:35:43 np0005588920 nova_compute[226886]: 2026-01-20 15:35:43.861 226890 DEBUG nova.virt.hardware [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:35:43 np0005588920 nova_compute[226886]: 2026-01-20 15:35:43.863 226890 DEBUG oslo_concurrency.processutils [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:35:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:35:44 np0005588920 nova_compute[226886]: 2026-01-20 15:35:44.241 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:35:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:35:44 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3378826245' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:35:44 np0005588920 nova_compute[226886]: 2026-01-20 15:35:44.287 226890 DEBUG oslo_concurrency.processutils [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:35:44 np0005588920 nova_compute[226886]: 2026-01-20 15:35:44.309 226890 DEBUG nova.storage.rbd_utils [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image 98535402-ae8c-46cb-bfbc-6011e34adc25_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:35:44 np0005588920 nova_compute[226886]: 2026-01-20 15:35:44.312 226890 DEBUG oslo_concurrency.processutils [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:35:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:44.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:44.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:35:44 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3772005823' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:35:44 np0005588920 nova_compute[226886]: 2026-01-20 15:35:44.732 226890 DEBUG oslo_concurrency.processutils [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.420s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:35:44 np0005588920 nova_compute[226886]: 2026-01-20 15:35:44.733 226890 DEBUG nova.virt.libvirt.vif [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:35:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1804450175',display_name='tempest-TestNetworkBasicOps-server-1804450175',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1804450175',id=212,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGRX5fexjKokkqYVj7oYMeVR7UpAmEjzQ0Da/LKxdp1Jn1nuSpc0DyYWAA1zZBYz5eAdoUdLB2IX5GQ1ZxTpArlXLJtT+sN4hs4XZ912Qon8Z3FCLdE6+x3CUSzZ/IqClA==',key_name='tempest-TestNetworkBasicOps-662885633',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3168f57421fb49bfb94b85daedd1fe7d',ramdisk_id='',reservation_id='r-ds3vvvoa',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-807695970',owner_user_name='tempest-TestNetworkBasicOps-807695970-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:35:38Z,user_data=None,user_id='5338aa65dc0e4326a66ce79053787f14',uuid=98535402-ae8c-46cb-bfbc-6011e34adc25,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "23586d88-de8c-4135-b0a0-1a967326da06", "address": "fa:16:3e:29:04:15", "network": {"id": "094fc5a0-d658-4e68-b305-244ff77454a4", "bridge": "br-int", "label": "tempest-network-smoke--1321578468", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], 
"version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23586d88-de", "ovs_interfaceid": "23586d88-de8c-4135-b0a0-1a967326da06", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:35:44 np0005588920 nova_compute[226886]: 2026-01-20 15:35:44.734 226890 DEBUG nova.network.os_vif_util [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converting VIF {"id": "23586d88-de8c-4135-b0a0-1a967326da06", "address": "fa:16:3e:29:04:15", "network": {"id": "094fc5a0-d658-4e68-b305-244ff77454a4", "bridge": "br-int", "label": "tempest-network-smoke--1321578468", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23586d88-de", "ovs_interfaceid": "23586d88-de8c-4135-b0a0-1a967326da06", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:35:44 np0005588920 nova_compute[226886]: 2026-01-20 15:35:44.735 226890 DEBUG nova.network.os_vif_util [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:29:04:15,bridge_name='br-int',has_traffic_filtering=True,id=23586d88-de8c-4135-b0a0-1a967326da06,network=Network(094fc5a0-d658-4e68-b305-244ff77454a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap23586d88-de') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:35:44 np0005588920 nova_compute[226886]: 2026-01-20 15:35:44.736 226890 DEBUG nova.objects.instance [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lazy-loading 'pci_devices' on Instance uuid 98535402-ae8c-46cb-bfbc-6011e34adc25 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:35:44 np0005588920 nova_compute[226886]: 2026-01-20 15:35:44.772 226890 DEBUG nova.virt.libvirt.driver [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:35:44 np0005588920 nova_compute[226886]:  <uuid>98535402-ae8c-46cb-bfbc-6011e34adc25</uuid>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:  <name>instance-000000d4</name>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:35:44 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:      <nova:name>tempest-TestNetworkBasicOps-server-1804450175</nova:name>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 15:35:43</nova:creationTime>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 10:35:44 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:        <nova:user uuid="5338aa65dc0e4326a66ce79053787f14">tempest-TestNetworkBasicOps-807695970-project-member</nova:user>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:        <nova:project uuid="3168f57421fb49bfb94b85daedd1fe7d">tempest-TestNetworkBasicOps-807695970</nova:project>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:        <nova:port uuid="23586d88-de8c-4135-b0a0-1a967326da06">
Jan 20 10:35:44 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    <system>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:      <entry name="serial">98535402-ae8c-46cb-bfbc-6011e34adc25</entry>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:      <entry name="uuid">98535402-ae8c-46cb-bfbc-6011e34adc25</entry>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    </system>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:  <os>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:  </os>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:  <features>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:  </features>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:  </clock>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:  <devices>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 10:35:44 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/98535402-ae8c-46cb-bfbc-6011e34adc25_disk">
Jan 20 10:35:44 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:35:44 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 10:35:44 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/98535402-ae8c-46cb-bfbc-6011e34adc25_disk.config">
Jan 20 10:35:44 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:35:44 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 10:35:44 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:29:04:15"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:      <target dev="tap23586d88-de"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    </interface>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 10:35:44 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/98535402-ae8c-46cb-bfbc-6011e34adc25/console.log" append="off"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    </serial>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    <video>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    </video>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 10:35:44 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    </rng>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 10:35:44 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 10:35:44 np0005588920 nova_compute[226886]:  </devices>
Jan 20 10:35:44 np0005588920 nova_compute[226886]: </domain>
Jan 20 10:35:44 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:35:44 np0005588920 nova_compute[226886]: 2026-01-20 15:35:44.774 226890 DEBUG nova.compute.manager [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Preparing to wait for external event network-vif-plugged-23586d88-de8c-4135-b0a0-1a967326da06 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:35:44 np0005588920 nova_compute[226886]: 2026-01-20 15:35:44.774 226890 DEBUG oslo_concurrency.lockutils [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "98535402-ae8c-46cb-bfbc-6011e34adc25-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:35:44 np0005588920 nova_compute[226886]: 2026-01-20 15:35:44.774 226890 DEBUG oslo_concurrency.lockutils [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "98535402-ae8c-46cb-bfbc-6011e34adc25-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:35:44 np0005588920 nova_compute[226886]: 2026-01-20 15:35:44.775 226890 DEBUG oslo_concurrency.lockutils [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "98535402-ae8c-46cb-bfbc-6011e34adc25-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:35:44 np0005588920 nova_compute[226886]: 2026-01-20 15:35:44.775 226890 DEBUG nova.virt.libvirt.vif [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:35:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1804450175',display_name='tempest-TestNetworkBasicOps-server-1804450175',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1804450175',id=212,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGRX5fexjKokkqYVj7oYMeVR7UpAmEjzQ0Da/LKxdp1Jn1nuSpc0DyYWAA1zZBYz5eAdoUdLB2IX5GQ1ZxTpArlXLJtT+sN4hs4XZ912Qon8Z3FCLdE6+x3CUSzZ/IqClA==',key_name='tempest-TestNetworkBasicOps-662885633',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3168f57421fb49bfb94b85daedd1fe7d',ramdisk_id='',reservation_id='r-ds3vvvoa',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-807695970',owner_user_name='tempest-TestNetworkBasicOps-807695970-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:35:38Z,user_data=None,user_id='5338aa65dc0e4326a66ce79053787f14',uuid=98535402-ae8c-46cb-bfbc-6011e34adc25,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "23586d88-de8c-4135-b0a0-1a967326da06", "address": "fa:16:3e:29:04:15", "network": {"id": "094fc5a0-d658-4e68-b305-244ff77454a4", "bridge": "br-int", "label": "tempest-network-smoke--1321578468", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": 
[], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23586d88-de", "ovs_interfaceid": "23586d88-de8c-4135-b0a0-1a967326da06", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:35:44 np0005588920 nova_compute[226886]: 2026-01-20 15:35:44.776 226890 DEBUG nova.network.os_vif_util [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converting VIF {"id": "23586d88-de8c-4135-b0a0-1a967326da06", "address": "fa:16:3e:29:04:15", "network": {"id": "094fc5a0-d658-4e68-b305-244ff77454a4", "bridge": "br-int", "label": "tempest-network-smoke--1321578468", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23586d88-de", "ovs_interfaceid": "23586d88-de8c-4135-b0a0-1a967326da06", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:35:44 np0005588920 nova_compute[226886]: 2026-01-20 15:35:44.777 226890 DEBUG nova.network.os_vif_util [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:29:04:15,bridge_name='br-int',has_traffic_filtering=True,id=23586d88-de8c-4135-b0a0-1a967326da06,network=Network(094fc5a0-d658-4e68-b305-244ff77454a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap23586d88-de') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:35:44 np0005588920 nova_compute[226886]: 2026-01-20 15:35:44.777 226890 DEBUG os_vif [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:29:04:15,bridge_name='br-int',has_traffic_filtering=True,id=23586d88-de8c-4135-b0a0-1a967326da06,network=Network(094fc5a0-d658-4e68-b305-244ff77454a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap23586d88-de') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:35:44 np0005588920 nova_compute[226886]: 2026-01-20 15:35:44.778 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:35:44 np0005588920 nova_compute[226886]: 2026-01-20 15:35:44.778 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:35:44 np0005588920 nova_compute[226886]: 2026-01-20 15:35:44.779 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:35:44 np0005588920 nova_compute[226886]: 2026-01-20 15:35:44.782 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:35:44 np0005588920 nova_compute[226886]: 2026-01-20 15:35:44.782 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap23586d88-de, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:35:44 np0005588920 nova_compute[226886]: 2026-01-20 15:35:44.782 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap23586d88-de, col_values=(('external_ids', {'iface-id': '23586d88-de8c-4135-b0a0-1a967326da06', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:29:04:15', 'vm-uuid': '98535402-ae8c-46cb-bfbc-6011e34adc25'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:35:44 np0005588920 nova_compute[226886]: 2026-01-20 15:35:44.784 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:35:44 np0005588920 NetworkManager[49076]: <info>  [1768923344.7853] manager: (tap23586d88-de): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/450)
Jan 20 10:35:44 np0005588920 nova_compute[226886]: 2026-01-20 15:35:44.786 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:35:44 np0005588920 nova_compute[226886]: 2026-01-20 15:35:44.790 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:35:44 np0005588920 nova_compute[226886]: 2026-01-20 15:35:44.790 226890 INFO os_vif [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:29:04:15,bridge_name='br-int',has_traffic_filtering=True,id=23586d88-de8c-4135-b0a0-1a967326da06,network=Network(094fc5a0-d658-4e68-b305-244ff77454a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap23586d88-de')#033[00m
Jan 20 10:35:44 np0005588920 nova_compute[226886]: 2026-01-20 15:35:44.843 226890 DEBUG nova.virt.libvirt.driver [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:35:44 np0005588920 nova_compute[226886]: 2026-01-20 15:35:44.844 226890 DEBUG nova.virt.libvirt.driver [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:35:44 np0005588920 nova_compute[226886]: 2026-01-20 15:35:44.844 226890 DEBUG nova.virt.libvirt.driver [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] No VIF found with MAC fa:16:3e:29:04:15, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:35:44 np0005588920 nova_compute[226886]: 2026-01-20 15:35:44.844 226890 INFO nova.virt.libvirt.driver [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Using config drive#033[00m
Jan 20 10:35:44 np0005588920 nova_compute[226886]: 2026-01-20 15:35:44.866 226890 DEBUG nova.storage.rbd_utils [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image 98535402-ae8c-46cb-bfbc-6011e34adc25_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:35:45 np0005588920 nova_compute[226886]: 2026-01-20 15:35:45.091 226890 DEBUG nova.network.neutron [req-c5ed9390-7db1-4606-86df-0e78308381c7 req-b491f347-c04f-4e7a-bac1-e6505c5458c8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Updated VIF entry in instance network info cache for port 23586d88-de8c-4135-b0a0-1a967326da06. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:35:45 np0005588920 nova_compute[226886]: 2026-01-20 15:35:45.091 226890 DEBUG nova.network.neutron [req-c5ed9390-7db1-4606-86df-0e78308381c7 req-b491f347-c04f-4e7a-bac1-e6505c5458c8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Updating instance_info_cache with network_info: [{"id": "23586d88-de8c-4135-b0a0-1a967326da06", "address": "fa:16:3e:29:04:15", "network": {"id": "094fc5a0-d658-4e68-b305-244ff77454a4", "bridge": "br-int", "label": "tempest-network-smoke--1321578468", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23586d88-de", "ovs_interfaceid": "23586d88-de8c-4135-b0a0-1a967326da06", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:35:45 np0005588920 nova_compute[226886]: 2026-01-20 15:35:45.110 226890 DEBUG oslo_concurrency.lockutils [req-c5ed9390-7db1-4606-86df-0e78308381c7 req-b491f347-c04f-4e7a-bac1-e6505c5458c8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-98535402-ae8c-46cb-bfbc-6011e34adc25" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:35:45 np0005588920 nova_compute[226886]: 2026-01-20 15:35:45.269 226890 INFO nova.virt.libvirt.driver [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Creating config drive at /var/lib/nova/instances/98535402-ae8c-46cb-bfbc-6011e34adc25/disk.config#033[00m
Jan 20 10:35:45 np0005588920 nova_compute[226886]: 2026-01-20 15:35:45.274 226890 DEBUG oslo_concurrency.processutils [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/98535402-ae8c-46cb-bfbc-6011e34adc25/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp70her4ic execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:35:45 np0005588920 nova_compute[226886]: 2026-01-20 15:35:45.407 226890 DEBUG oslo_concurrency.processutils [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/98535402-ae8c-46cb-bfbc-6011e34adc25/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp70her4ic" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:35:45 np0005588920 nova_compute[226886]: 2026-01-20 15:35:45.433 226890 DEBUG nova.storage.rbd_utils [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image 98535402-ae8c-46cb-bfbc-6011e34adc25_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:35:45 np0005588920 nova_compute[226886]: 2026-01-20 15:35:45.436 226890 DEBUG oslo_concurrency.processutils [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/98535402-ae8c-46cb-bfbc-6011e34adc25/disk.config 98535402-ae8c-46cb-bfbc-6011e34adc25_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:35:45 np0005588920 nova_compute[226886]: 2026-01-20 15:35:45.583 226890 DEBUG oslo_concurrency.processutils [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/98535402-ae8c-46cb-bfbc-6011e34adc25/disk.config 98535402-ae8c-46cb-bfbc-6011e34adc25_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.146s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:35:45 np0005588920 nova_compute[226886]: 2026-01-20 15:35:45.584 226890 INFO nova.virt.libvirt.driver [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Deleting local config drive /var/lib/nova/instances/98535402-ae8c-46cb-bfbc-6011e34adc25/disk.config because it was imported into RBD.#033[00m
Jan 20 10:35:45 np0005588920 virtqemud[226436]: End of file while reading data: Input/output error
Jan 20 10:35:45 np0005588920 virtqemud[226436]: End of file while reading data: Input/output error
Jan 20 10:35:45 np0005588920 kernel: tap23586d88-de: entered promiscuous mode
Jan 20 10:35:45 np0005588920 NetworkManager[49076]: <info>  [1768923345.6343] manager: (tap23586d88-de): new Tun device (/org/freedesktop/NetworkManager/Devices/451)
Jan 20 10:35:45 np0005588920 ovn_controller[133971]: 2026-01-20T15:35:45Z|00960|binding|INFO|Claiming lport 23586d88-de8c-4135-b0a0-1a967326da06 for this chassis.
Jan 20 10:35:45 np0005588920 ovn_controller[133971]: 2026-01-20T15:35:45Z|00961|binding|INFO|23586d88-de8c-4135-b0a0-1a967326da06: Claiming fa:16:3e:29:04:15 10.100.0.8
Jan 20 10:35:45 np0005588920 nova_compute[226886]: 2026-01-20 15:35:45.635 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:35:45 np0005588920 nova_compute[226886]: 2026-01-20 15:35:45.640 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:35:45.648 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:29:04:15 10.100.0.8'], port_security=['fa:16:3e:29:04:15 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '98535402-ae8c-46cb-bfbc-6011e34adc25', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-094fc5a0-d658-4e68-b305-244ff77454a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3168f57421fb49bfb94b85daedd1fe7d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9c1da830-14cd-4941-ab68-988ee83acbc7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c34c0b90-e1fd-429a-8077-3d0666ec2c89, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=23586d88-de8c-4135-b0a0-1a967326da06) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:35:45.649 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 23586d88-de8c-4135-b0a0-1a967326da06 in datapath 094fc5a0-d658-4e68-b305-244ff77454a4 bound to our chassis#033[00m
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:35:45.650 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 094fc5a0-d658-4e68-b305-244ff77454a4#033[00m
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:35:45.663 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0568d64d-80bc-424d-a3fa-de5543266eff]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:35:45.664 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap094fc5a0-d1 in ovnmeta-094fc5a0-d658-4e68-b305-244ff77454a4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:35:45 np0005588920 systemd-machined[196121]: New machine qemu-99-instance-000000d4.
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:35:45.665 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap094fc5a0-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:35:45.666 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2773702f-8296-4693-ad66-0d6d8d01b7b3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:35:45 np0005588920 systemd-udevd[309262]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:35:45.668 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2375187f-bef9-467c-afac-156486af6433]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:35:45 np0005588920 NetworkManager[49076]: <info>  [1768923345.6791] device (tap23586d88-de): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:35:45 np0005588920 NetworkManager[49076]: <info>  [1768923345.6799] device (tap23586d88-de): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:35:45.678 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[70c04d19-de97-4ead-a484-a5f26ebb5370]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:35:45 np0005588920 nova_compute[226886]: 2026-01-20 15:35:45.699 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:35:45 np0005588920 ovn_controller[133971]: 2026-01-20T15:35:45Z|00962|binding|INFO|Setting lport 23586d88-de8c-4135-b0a0-1a967326da06 ovn-installed in OVS
Jan 20 10:35:45 np0005588920 ovn_controller[133971]: 2026-01-20T15:35:45Z|00963|binding|INFO|Setting lport 23586d88-de8c-4135-b0a0-1a967326da06 up in Southbound
Jan 20 10:35:45 np0005588920 systemd[1]: Started Virtual Machine qemu-99-instance-000000d4.
Jan 20 10:35:45 np0005588920 nova_compute[226886]: 2026-01-20 15:35:45.706 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:35:45.705 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[9fa9a40b-bea8-4f28-82ff-64375771ef2e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:35:45.737 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[cad5b247-0c13-4cf5-95db-0ef42789e376]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:35:45.742 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[bb1e2507-844d-4a8d-84df-961bfe6d53bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:35:45 np0005588920 NetworkManager[49076]: <info>  [1768923345.7436] manager: (tap094fc5a0-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/452)
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:35:45.773 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[8bea64b9-1b49-4123-98dd-e68edbfe49ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:35:45.776 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[c4ec104b-bb29-44de-927b-c6a6c6107307]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:35:45 np0005588920 NetworkManager[49076]: <info>  [1768923345.7974] device (tap094fc5a0-d0): carrier: link connected
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:35:45.802 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[d53594ef-e53c-4c13-be6f-ed860b312a2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:35:45.818 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[37b72134-a63f-4a4b-a2e8-18c90fbb7c1a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap094fc5a0-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fd:f3:e2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 301], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 838748, 'reachable_time': 27592, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 309296, 'error': None, 'target': 'ovnmeta-094fc5a0-d658-4e68-b305-244ff77454a4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:35:45.830 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[82a28de7-9079-4658-b5ee-80aa8662b70b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefd:f3e2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 838748, 'tstamp': 838748}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 309297, 'error': None, 'target': 'ovnmeta-094fc5a0-d658-4e68-b305-244ff77454a4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:35:45.845 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[65e4cd9b-82f5-4862-be77-60d5406aec26]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap094fc5a0-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fd:f3:e2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 301], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 838748, 'reachable_time': 27592, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 309298, 'error': None, 'target': 'ovnmeta-094fc5a0-d658-4e68-b305-244ff77454a4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:35:45.877 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a3494909-394e-4ee6-8d62-3e2eda9f5da3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:35:45.941 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f156c41a-7a5b-4fad-9ded-b51d7cfca6c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:35:45.942 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap094fc5a0-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:35:45.942 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:35:45.943 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap094fc5a0-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:35:45 np0005588920 nova_compute[226886]: 2026-01-20 15:35:45.945 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:35:45 np0005588920 NetworkManager[49076]: <info>  [1768923345.9458] manager: (tap094fc5a0-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/453)
Jan 20 10:35:45 np0005588920 kernel: tap094fc5a0-d0: entered promiscuous mode
Jan 20 10:35:45 np0005588920 nova_compute[226886]: 2026-01-20 15:35:45.948 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:35:45.949 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap094fc5a0-d0, col_values=(('external_ids', {'iface-id': '6293a5ec-22f0-43da-ab7b-03b7f61f6313'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:35:45 np0005588920 nova_compute[226886]: 2026-01-20 15:35:45.950 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:35:45 np0005588920 ovn_controller[133971]: 2026-01-20T15:35:45Z|00964|binding|INFO|Releasing lport 6293a5ec-22f0-43da-ab7b-03b7f61f6313 from this chassis (sb_readonly=0)
Jan 20 10:35:45 np0005588920 nova_compute[226886]: 2026-01-20 15:35:45.964 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:35:45.965 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/094fc5a0-d658-4e68-b305-244ff77454a4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/094fc5a0-d658-4e68-b305-244ff77454a4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:35:45.966 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[419f1dc7-580e-48f7-ae73-28bde0aad966]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:35:45.966 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-094fc5a0-d658-4e68-b305-244ff77454a4
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/094fc5a0-d658-4e68-b305-244ff77454a4.pid.haproxy
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID 094fc5a0-d658-4e68-b305-244ff77454a4
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:35:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:35:45.967 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-094fc5a0-d658-4e68-b305-244ff77454a4', 'env', 'PROCESS_TAG=haproxy-094fc5a0-d658-4e68-b305-244ff77454a4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/094fc5a0-d658-4e68-b305-244ff77454a4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:35:46 np0005588920 podman[309330]: 2026-01-20 15:35:46.326124827 +0000 UTC m=+0.050022621 container create d436465a9fcc1711544a1dfc41a9816a45ee22662a6bbf50847e4cc91d34cdb0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-094fc5a0-d658-4e68-b305-244ff77454a4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 20 10:35:46 np0005588920 systemd[1]: Started libpod-conmon-d436465a9fcc1711544a1dfc41a9816a45ee22662a6bbf50847e4cc91d34cdb0.scope.
Jan 20 10:35:46 np0005588920 podman[309330]: 2026-01-20 15:35:46.301334183 +0000 UTC m=+0.025231997 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:35:46 np0005588920 systemd[1]: Started libcrun container.
Jan 20 10:35:46 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/922180ae746749d245930194d1e2facdc7fa2fd1efef422be623d8d462d75092/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:35:46 np0005588920 podman[309330]: 2026-01-20 15:35:46.415571305 +0000 UTC m=+0.139469119 container init d436465a9fcc1711544a1dfc41a9816a45ee22662a6bbf50847e4cc91d34cdb0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-094fc5a0-d658-4e68-b305-244ff77454a4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:35:46 np0005588920 podman[309330]: 2026-01-20 15:35:46.420516185 +0000 UTC m=+0.144413979 container start d436465a9fcc1711544a1dfc41a9816a45ee22662a6bbf50847e4cc91d34cdb0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-094fc5a0-d658-4e68-b305-244ff77454a4, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:35:46 np0005588920 neutron-haproxy-ovnmeta-094fc5a0-d658-4e68-b305-244ff77454a4[309346]: [NOTICE]   (309350) : New worker (309352) forked
Jan 20 10:35:46 np0005588920 neutron-haproxy-ovnmeta-094fc5a0-d658-4e68-b305-244ff77454a4[309346]: [NOTICE]   (309350) : Loading success.
Jan 20 10:35:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:35:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:46.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:35:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:46.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:46 np0005588920 nova_compute[226886]: 2026-01-20 15:35:46.699 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768923346.6992188, 98535402-ae8c-46cb-bfbc-6011e34adc25 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:35:46 np0005588920 nova_compute[226886]: 2026-01-20 15:35:46.700 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] VM Started (Lifecycle Event)#033[00m
Jan 20 10:35:46 np0005588920 nova_compute[226886]: 2026-01-20 15:35:46.742 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:35:46 np0005588920 nova_compute[226886]: 2026-01-20 15:35:46.746 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768923346.700271, 98535402-ae8c-46cb-bfbc-6011e34adc25 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:35:46 np0005588920 nova_compute[226886]: 2026-01-20 15:35:46.746 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:35:46 np0005588920 nova_compute[226886]: 2026-01-20 15:35:46.977 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:35:46 np0005588920 nova_compute[226886]: 2026-01-20 15:35:46.981 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:35:47 np0005588920 nova_compute[226886]: 2026-01-20 15:35:47.001 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:35:47 np0005588920 nova_compute[226886]: 2026-01-20 15:35:47.691 226890 DEBUG nova.compute.manager [req-cbd21dd6-64b7-42ec-b9fd-be9ac80d66ad req-2aab5f35-88cf-4d2f-b81e-c741df8ab440 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Received event network-vif-plugged-23586d88-de8c-4135-b0a0-1a967326da06 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:35:47 np0005588920 nova_compute[226886]: 2026-01-20 15:35:47.691 226890 DEBUG oslo_concurrency.lockutils [req-cbd21dd6-64b7-42ec-b9fd-be9ac80d66ad req-2aab5f35-88cf-4d2f-b81e-c741df8ab440 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "98535402-ae8c-46cb-bfbc-6011e34adc25-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:35:47 np0005588920 nova_compute[226886]: 2026-01-20 15:35:47.692 226890 DEBUG oslo_concurrency.lockutils [req-cbd21dd6-64b7-42ec-b9fd-be9ac80d66ad req-2aab5f35-88cf-4d2f-b81e-c741df8ab440 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "98535402-ae8c-46cb-bfbc-6011e34adc25-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:35:47 np0005588920 nova_compute[226886]: 2026-01-20 15:35:47.692 226890 DEBUG oslo_concurrency.lockutils [req-cbd21dd6-64b7-42ec-b9fd-be9ac80d66ad req-2aab5f35-88cf-4d2f-b81e-c741df8ab440 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "98535402-ae8c-46cb-bfbc-6011e34adc25-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:35:47 np0005588920 nova_compute[226886]: 2026-01-20 15:35:47.692 226890 DEBUG nova.compute.manager [req-cbd21dd6-64b7-42ec-b9fd-be9ac80d66ad req-2aab5f35-88cf-4d2f-b81e-c741df8ab440 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Processing event network-vif-plugged-23586d88-de8c-4135-b0a0-1a967326da06 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 10:35:47 np0005588920 nova_compute[226886]: 2026-01-20 15:35:47.693 226890 DEBUG nova.compute.manager [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:35:47 np0005588920 nova_compute[226886]: 2026-01-20 15:35:47.695 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768923347.6956258, 98535402-ae8c-46cb-bfbc-6011e34adc25 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:35:47 np0005588920 nova_compute[226886]: 2026-01-20 15:35:47.696 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:35:47 np0005588920 nova_compute[226886]: 2026-01-20 15:35:47.697 226890 DEBUG nova.virt.libvirt.driver [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:35:47 np0005588920 nova_compute[226886]: 2026-01-20 15:35:47.700 226890 INFO nova.virt.libvirt.driver [-] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Instance spawned successfully.#033[00m
Jan 20 10:35:47 np0005588920 nova_compute[226886]: 2026-01-20 15:35:47.700 226890 DEBUG nova.virt.libvirt.driver [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 10:35:47 np0005588920 nova_compute[226886]: 2026-01-20 15:35:47.724 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:35:47 np0005588920 nova_compute[226886]: 2026-01-20 15:35:47.727 226890 DEBUG nova.virt.libvirt.driver [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:35:47 np0005588920 nova_compute[226886]: 2026-01-20 15:35:47.728 226890 DEBUG nova.virt.libvirt.driver [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:35:47 np0005588920 nova_compute[226886]: 2026-01-20 15:35:47.728 226890 DEBUG nova.virt.libvirt.driver [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:35:47 np0005588920 nova_compute[226886]: 2026-01-20 15:35:47.729 226890 DEBUG nova.virt.libvirt.driver [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:35:47 np0005588920 nova_compute[226886]: 2026-01-20 15:35:47.729 226890 DEBUG nova.virt.libvirt.driver [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:35:47 np0005588920 nova_compute[226886]: 2026-01-20 15:35:47.730 226890 DEBUG nova.virt.libvirt.driver [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:35:47 np0005588920 nova_compute[226886]: 2026-01-20 15:35:47.733 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:35:47 np0005588920 nova_compute[226886]: 2026-01-20 15:35:47.795 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:35:47 np0005588920 nova_compute[226886]: 2026-01-20 15:35:47.834 226890 INFO nova.compute.manager [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Took 9.64 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 10:35:47 np0005588920 nova_compute[226886]: 2026-01-20 15:35:47.835 226890 DEBUG nova.compute.manager [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:35:47 np0005588920 nova_compute[226886]: 2026-01-20 15:35:47.933 226890 INFO nova.compute.manager [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Took 10.63 seconds to build instance.#033[00m
Jan 20 10:35:47 np0005588920 nova_compute[226886]: 2026-01-20 15:35:47.956 226890 DEBUG oslo_concurrency.lockutils [None req-2fa335de-b2d6-4688-b3f6-184b12b5418b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "98535402-ae8c-46cb-bfbc-6011e34adc25" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.743s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:35:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:48.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:48.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:48 np0005588920 nova_compute[226886]: 2026-01-20 15:35:48.747 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:35:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:35:48.747 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=81, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=80) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:35:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:35:48.748 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:35:49 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:35:49 np0005588920 nova_compute[226886]: 2026-01-20 15:35:49.243 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:35:49 np0005588920 nova_compute[226886]: 2026-01-20 15:35:49.784 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:35:49 np0005588920 nova_compute[226886]: 2026-01-20 15:35:49.883 226890 DEBUG nova.compute.manager [req-e35e72a7-a974-4a79-b233-05f02e2bb452 req-8750cb5e-6219-47b0-b6b9-bbedfca896f0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Received event network-vif-plugged-23586d88-de8c-4135-b0a0-1a967326da06 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:35:49 np0005588920 nova_compute[226886]: 2026-01-20 15:35:49.884 226890 DEBUG oslo_concurrency.lockutils [req-e35e72a7-a974-4a79-b233-05f02e2bb452 req-8750cb5e-6219-47b0-b6b9-bbedfca896f0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "98535402-ae8c-46cb-bfbc-6011e34adc25-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:35:49 np0005588920 nova_compute[226886]: 2026-01-20 15:35:49.884 226890 DEBUG oslo_concurrency.lockutils [req-e35e72a7-a974-4a79-b233-05f02e2bb452 req-8750cb5e-6219-47b0-b6b9-bbedfca896f0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "98535402-ae8c-46cb-bfbc-6011e34adc25-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:35:49 np0005588920 nova_compute[226886]: 2026-01-20 15:35:49.885 226890 DEBUG oslo_concurrency.lockutils [req-e35e72a7-a974-4a79-b233-05f02e2bb452 req-8750cb5e-6219-47b0-b6b9-bbedfca896f0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "98535402-ae8c-46cb-bfbc-6011e34adc25-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:35:49 np0005588920 nova_compute[226886]: 2026-01-20 15:35:49.885 226890 DEBUG nova.compute.manager [req-e35e72a7-a974-4a79-b233-05f02e2bb452 req-8750cb5e-6219-47b0-b6b9-bbedfca896f0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] No waiting events found dispatching network-vif-plugged-23586d88-de8c-4135-b0a0-1a967326da06 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:35:49 np0005588920 nova_compute[226886]: 2026-01-20 15:35:49.885 226890 WARNING nova.compute.manager [req-e35e72a7-a974-4a79-b233-05f02e2bb452 req-8750cb5e-6219-47b0-b6b9-bbedfca896f0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Received unexpected event network-vif-plugged-23586d88-de8c-4135-b0a0-1a967326da06 for instance with vm_state active and task_state None.#033[00m
Jan 20 10:35:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:50.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:50.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:50 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:35:50.749 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '81'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:35:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:52.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:52.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:53 np0005588920 ovn_controller[133971]: 2026-01-20T15:35:53Z|00965|binding|INFO|Releasing lport 6293a5ec-22f0-43da-ab7b-03b7f61f6313 from this chassis (sb_readonly=0)
Jan 20 10:35:53 np0005588920 nova_compute[226886]: 2026-01-20 15:35:53.758 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:35:53 np0005588920 NetworkManager[49076]: <info>  [1768923353.7590] manager: (patch-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/454)
Jan 20 10:35:53 np0005588920 NetworkManager[49076]: <info>  [1768923353.7601] manager: (patch-br-int-to-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/455)
Jan 20 10:35:53 np0005588920 ovn_controller[133971]: 2026-01-20T15:35:53Z|00966|binding|INFO|Releasing lport 6293a5ec-22f0-43da-ab7b-03b7f61f6313 from this chassis (sb_readonly=0)
Jan 20 10:35:53 np0005588920 nova_compute[226886]: 2026-01-20 15:35:53.792 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:35:53 np0005588920 nova_compute[226886]: 2026-01-20 15:35:53.797 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:35:54 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:35:54 np0005588920 nova_compute[226886]: 2026-01-20 15:35:54.126 226890 DEBUG nova.compute.manager [req-d92c5356-f6c7-4405-89df-960ddfa0ff0f req-3e4215a5-e8af-4106-8dc0-edbd2ba26a19 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Received event network-changed-23586d88-de8c-4135-b0a0-1a967326da06 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:35:54 np0005588920 nova_compute[226886]: 2026-01-20 15:35:54.127 226890 DEBUG nova.compute.manager [req-d92c5356-f6c7-4405-89df-960ddfa0ff0f req-3e4215a5-e8af-4106-8dc0-edbd2ba26a19 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Refreshing instance network info cache due to event network-changed-23586d88-de8c-4135-b0a0-1a967326da06. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:35:54 np0005588920 nova_compute[226886]: 2026-01-20 15:35:54.127 226890 DEBUG oslo_concurrency.lockutils [req-d92c5356-f6c7-4405-89df-960ddfa0ff0f req-3e4215a5-e8af-4106-8dc0-edbd2ba26a19 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-98535402-ae8c-46cb-bfbc-6011e34adc25" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:35:54 np0005588920 nova_compute[226886]: 2026-01-20 15:35:54.127 226890 DEBUG oslo_concurrency.lockutils [req-d92c5356-f6c7-4405-89df-960ddfa0ff0f req-3e4215a5-e8af-4106-8dc0-edbd2ba26a19 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-98535402-ae8c-46cb-bfbc-6011e34adc25" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:35:54 np0005588920 nova_compute[226886]: 2026-01-20 15:35:54.127 226890 DEBUG nova.network.neutron [req-d92c5356-f6c7-4405-89df-960ddfa0ff0f req-3e4215a5-e8af-4106-8dc0-edbd2ba26a19 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Refreshing network info cache for port 23586d88-de8c-4135-b0a0-1a967326da06 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:35:54 np0005588920 nova_compute[226886]: 2026-01-20 15:35:54.245 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:35:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:35:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:54.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:35:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:54.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:54 np0005588920 nova_compute[226886]: 2026-01-20 15:35:54.787 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:35:55 np0005588920 nova_compute[226886]: 2026-01-20 15:35:55.767 226890 DEBUG nova.network.neutron [req-d92c5356-f6c7-4405-89df-960ddfa0ff0f req-3e4215a5-e8af-4106-8dc0-edbd2ba26a19 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Updated VIF entry in instance network info cache for port 23586d88-de8c-4135-b0a0-1a967326da06. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:35:55 np0005588920 nova_compute[226886]: 2026-01-20 15:35:55.767 226890 DEBUG nova.network.neutron [req-d92c5356-f6c7-4405-89df-960ddfa0ff0f req-3e4215a5-e8af-4106-8dc0-edbd2ba26a19 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Updating instance_info_cache with network_info: [{"id": "23586d88-de8c-4135-b0a0-1a967326da06", "address": "fa:16:3e:29:04:15", "network": {"id": "094fc5a0-d658-4e68-b305-244ff77454a4", "bridge": "br-int", "label": "tempest-network-smoke--1321578468", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23586d88-de", "ovs_interfaceid": "23586d88-de8c-4135-b0a0-1a967326da06", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:35:55 np0005588920 nova_compute[226886]: 2026-01-20 15:35:55.798 226890 DEBUG oslo_concurrency.lockutils [req-d92c5356-f6c7-4405-89df-960ddfa0ff0f req-3e4215a5-e8af-4106-8dc0-edbd2ba26a19 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-98535402-ae8c-46cb-bfbc-6011e34adc25" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:35:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:56.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:56.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:58.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:35:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:58.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:59 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:35:59 np0005588920 nova_compute[226886]: 2026-01-20 15:35:59.247 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:35:59 np0005588920 nova_compute[226886]: 2026-01-20 15:35:59.789 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:00.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:00.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:01 np0005588920 ovn_controller[133971]: 2026-01-20T15:36:01Z|00132|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:29:04:15 10.100.0.8
Jan 20 10:36:01 np0005588920 ovn_controller[133971]: 2026-01-20T15:36:01Z|00133|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:29:04:15 10.100.0.8
Jan 20 10:36:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:02.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:02.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:03 np0005588920 podman[309406]: 2026-01-20 15:36:03.984965646 +0000 UTC m=+0.075837683 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 20 10:36:04 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:36:04 np0005588920 nova_compute[226886]: 2026-01-20 15:36:04.249 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:04.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:36:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:04.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:36:04 np0005588920 nova_compute[226886]: 2026-01-20 15:36:04.790 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:36:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:06.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:36:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:06.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:08.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:36:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:08.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:36:08 np0005588920 nova_compute[226886]: 2026-01-20 15:36:08.908 226890 INFO nova.compute.manager [None req-18614846-80ac-441b-a124-dcbdddab4169 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Get console output#033[00m
Jan 20 10:36:08 np0005588920 nova_compute[226886]: 2026-01-20 15:36:08.912 260344 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 20 10:36:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:36:09 np0005588920 nova_compute[226886]: 2026-01-20 15:36:09.251 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:09 np0005588920 ovn_controller[133971]: 2026-01-20T15:36:09Z|00134|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:29:04:15 10.100.0.8
Jan 20 10:36:09 np0005588920 nova_compute[226886]: 2026-01-20 15:36:09.829 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:10.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:10.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:11 np0005588920 ovn_controller[133971]: 2026-01-20T15:36:11Z|00135|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:29:04:15 10.100.0.8
Jan 20 10:36:12 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:36:12 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:36:12 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:36:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:36:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:12.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:36:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:36:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:12.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:36:12 np0005588920 podman[309565]: 2026-01-20 15:36:12.958927338 +0000 UTC m=+0.046320425 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 20 10:36:13 np0005588920 ovn_controller[133971]: 2026-01-20T15:36:13Z|00136|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:29:04:15 10.100.0.8
Jan 20 10:36:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:36:14 np0005588920 nova_compute[226886]: 2026-01-20 15:36:14.165 226890 DEBUG nova.compute.manager [req-c4011156-1676-4627-ba87-28ee36575841 req-0ebb1abc-360b-4bdf-ae55-924593d141d2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Received event network-changed-23586d88-de8c-4135-b0a0-1a967326da06 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:36:14 np0005588920 nova_compute[226886]: 2026-01-20 15:36:14.166 226890 DEBUG nova.compute.manager [req-c4011156-1676-4627-ba87-28ee36575841 req-0ebb1abc-360b-4bdf-ae55-924593d141d2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Refreshing instance network info cache due to event network-changed-23586d88-de8c-4135-b0a0-1a967326da06. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:36:14 np0005588920 nova_compute[226886]: 2026-01-20 15:36:14.166 226890 DEBUG oslo_concurrency.lockutils [req-c4011156-1676-4627-ba87-28ee36575841 req-0ebb1abc-360b-4bdf-ae55-924593d141d2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-98535402-ae8c-46cb-bfbc-6011e34adc25" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:36:14 np0005588920 nova_compute[226886]: 2026-01-20 15:36:14.167 226890 DEBUG oslo_concurrency.lockutils [req-c4011156-1676-4627-ba87-28ee36575841 req-0ebb1abc-360b-4bdf-ae55-924593d141d2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-98535402-ae8c-46cb-bfbc-6011e34adc25" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:36:14 np0005588920 nova_compute[226886]: 2026-01-20 15:36:14.167 226890 DEBUG nova.network.neutron [req-c4011156-1676-4627-ba87-28ee36575841 req-0ebb1abc-360b-4bdf-ae55-924593d141d2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Refreshing network info cache for port 23586d88-de8c-4135-b0a0-1a967326da06 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:36:14 np0005588920 nova_compute[226886]: 2026-01-20 15:36:14.255 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:14 np0005588920 nova_compute[226886]: 2026-01-20 15:36:14.282 226890 DEBUG oslo_concurrency.lockutils [None req-adb6c959-2d19-4549-9621-228687c515d4 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "98535402-ae8c-46cb-bfbc-6011e34adc25" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:36:14 np0005588920 nova_compute[226886]: 2026-01-20 15:36:14.282 226890 DEBUG oslo_concurrency.lockutils [None req-adb6c959-2d19-4549-9621-228687c515d4 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "98535402-ae8c-46cb-bfbc-6011e34adc25" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:36:14 np0005588920 nova_compute[226886]: 2026-01-20 15:36:14.283 226890 DEBUG oslo_concurrency.lockutils [None req-adb6c959-2d19-4549-9621-228687c515d4 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "98535402-ae8c-46cb-bfbc-6011e34adc25-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:36:14 np0005588920 nova_compute[226886]: 2026-01-20 15:36:14.283 226890 DEBUG oslo_concurrency.lockutils [None req-adb6c959-2d19-4549-9621-228687c515d4 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "98535402-ae8c-46cb-bfbc-6011e34adc25-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:36:14 np0005588920 nova_compute[226886]: 2026-01-20 15:36:14.283 226890 DEBUG oslo_concurrency.lockutils [None req-adb6c959-2d19-4549-9621-228687c515d4 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "98535402-ae8c-46cb-bfbc-6011e34adc25-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:36:14 np0005588920 nova_compute[226886]: 2026-01-20 15:36:14.284 226890 INFO nova.compute.manager [None req-adb6c959-2d19-4549-9621-228687c515d4 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Terminating instance#033[00m
Jan 20 10:36:14 np0005588920 nova_compute[226886]: 2026-01-20 15:36:14.285 226890 DEBUG nova.compute.manager [None req-adb6c959-2d19-4549-9621-228687c515d4 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:36:14 np0005588920 kernel: tap23586d88-de (unregistering): left promiscuous mode
Jan 20 10:36:14 np0005588920 NetworkManager[49076]: <info>  [1768923374.3338] device (tap23586d88-de): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:36:14 np0005588920 nova_compute[226886]: 2026-01-20 15:36:14.342 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:14 np0005588920 ovn_controller[133971]: 2026-01-20T15:36:14Z|00967|binding|INFO|Releasing lport 23586d88-de8c-4135-b0a0-1a967326da06 from this chassis (sb_readonly=0)
Jan 20 10:36:14 np0005588920 ovn_controller[133971]: 2026-01-20T15:36:14Z|00968|binding|INFO|Setting lport 23586d88-de8c-4135-b0a0-1a967326da06 down in Southbound
Jan 20 10:36:14 np0005588920 ovn_controller[133971]: 2026-01-20T15:36:14Z|00969|binding|INFO|Removing iface tap23586d88-de ovn-installed in OVS
Jan 20 10:36:14 np0005588920 nova_compute[226886]: 2026-01-20 15:36:14.344 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:36:14.353 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:29:04:15 10.100.0.8'], port_security=['fa:16:3e:29:04:15 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '98535402-ae8c-46cb-bfbc-6011e34adc25', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-094fc5a0-d658-4e68-b305-244ff77454a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3168f57421fb49bfb94b85daedd1fe7d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9c1da830-14cd-4941-ab68-988ee83acbc7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c34c0b90-e1fd-429a-8077-3d0666ec2c89, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=23586d88-de8c-4135-b0a0-1a967326da06) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:36:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:36:14.355 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 23586d88-de8c-4135-b0a0-1a967326da06 in datapath 094fc5a0-d658-4e68-b305-244ff77454a4 unbound from our chassis#033[00m
Jan 20 10:36:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:36:14.356 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 094fc5a0-d658-4e68-b305-244ff77454a4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:36:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:36:14.357 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[9db51a55-000f-4ef4-97bc-52803ae24e4d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:36:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:36:14.358 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-094fc5a0-d658-4e68-b305-244ff77454a4 namespace which is not needed anymore#033[00m
Jan 20 10:36:14 np0005588920 nova_compute[226886]: 2026-01-20 15:36:14.361 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:14 np0005588920 systemd[1]: machine-qemu\x2d99\x2dinstance\x2d000000d4.scope: Deactivated successfully.
Jan 20 10:36:14 np0005588920 systemd[1]: machine-qemu\x2d99\x2dinstance\x2d000000d4.scope: Consumed 14.278s CPU time.
Jan 20 10:36:14 np0005588920 systemd-machined[196121]: Machine qemu-99-instance-000000d4 terminated.
Jan 20 10:36:14 np0005588920 neutron-haproxy-ovnmeta-094fc5a0-d658-4e68-b305-244ff77454a4[309346]: [NOTICE]   (309350) : haproxy version is 2.8.14-c23fe91
Jan 20 10:36:14 np0005588920 neutron-haproxy-ovnmeta-094fc5a0-d658-4e68-b305-244ff77454a4[309346]: [NOTICE]   (309350) : path to executable is /usr/sbin/haproxy
Jan 20 10:36:14 np0005588920 neutron-haproxy-ovnmeta-094fc5a0-d658-4e68-b305-244ff77454a4[309346]: [WARNING]  (309350) : Exiting Master process...
Jan 20 10:36:14 np0005588920 neutron-haproxy-ovnmeta-094fc5a0-d658-4e68-b305-244ff77454a4[309346]: [ALERT]    (309350) : Current worker (309352) exited with code 143 (Terminated)
Jan 20 10:36:14 np0005588920 neutron-haproxy-ovnmeta-094fc5a0-d658-4e68-b305-244ff77454a4[309346]: [WARNING]  (309350) : All workers exited. Exiting... (0)
Jan 20 10:36:14 np0005588920 systemd[1]: libpod-d436465a9fcc1711544a1dfc41a9816a45ee22662a6bbf50847e4cc91d34cdb0.scope: Deactivated successfully.
Jan 20 10:36:14 np0005588920 podman[309606]: 2026-01-20 15:36:14.482407658 +0000 UTC m=+0.042034124 container died d436465a9fcc1711544a1dfc41a9816a45ee22662a6bbf50847e4cc91d34cdb0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-094fc5a0-d658-4e68-b305-244ff77454a4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:36:14 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d436465a9fcc1711544a1dfc41a9816a45ee22662a6bbf50847e4cc91d34cdb0-userdata-shm.mount: Deactivated successfully.
Jan 20 10:36:14 np0005588920 systemd[1]: var-lib-containers-storage-overlay-922180ae746749d245930194d1e2facdc7fa2fd1efef422be623d8d462d75092-merged.mount: Deactivated successfully.
Jan 20 10:36:14 np0005588920 nova_compute[226886]: 2026-01-20 15:36:14.515 226890 INFO nova.virt.libvirt.driver [-] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Instance destroyed successfully.#033[00m
Jan 20 10:36:14 np0005588920 nova_compute[226886]: 2026-01-20 15:36:14.516 226890 DEBUG nova.objects.instance [None req-adb6c959-2d19-4549-9621-228687c515d4 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lazy-loading 'resources' on Instance uuid 98535402-ae8c-46cb-bfbc-6011e34adc25 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:36:14 np0005588920 podman[309606]: 2026-01-20 15:36:14.523256477 +0000 UTC m=+0.082882943 container cleanup d436465a9fcc1711544a1dfc41a9816a45ee22662a6bbf50847e4cc91d34cdb0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-094fc5a0-d658-4e68-b305-244ff77454a4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 20 10:36:14 np0005588920 systemd[1]: libpod-conmon-d436465a9fcc1711544a1dfc41a9816a45ee22662a6bbf50847e4cc91d34cdb0.scope: Deactivated successfully.
Jan 20 10:36:14 np0005588920 nova_compute[226886]: 2026-01-20 15:36:14.530 226890 DEBUG nova.virt.libvirt.vif [None req-adb6c959-2d19-4549-9621-228687c515d4 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:35:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1804450175',display_name='tempest-TestNetworkBasicOps-server-1804450175',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1804450175',id=212,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGRX5fexjKokkqYVj7oYMeVR7UpAmEjzQ0Da/LKxdp1Jn1nuSpc0DyYWAA1zZBYz5eAdoUdLB2IX5GQ1ZxTpArlXLJtT+sN4hs4XZ912Qon8Z3FCLdE6+x3CUSzZ/IqClA==',key_name='tempest-TestNetworkBasicOps-662885633',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:35:47Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3168f57421fb49bfb94b85daedd1fe7d',ramdisk_id='',reservation_id='r-ds3vvvoa',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-807695970',owner_user_name='tempest-TestNetworkBasicOps-807695970-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:35:47Z,user_data=None,user_id='5338aa65dc0e4326a66ce79053787f14',uuid=98535402-ae8c-46cb-bfbc-6011e34adc25,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "23586d88-de8c-4135-b0a0-1a967326da06", "address": "fa:16:3e:29:04:15", "network": {"id": "094fc5a0-d658-4e68-b305-244ff77454a4", "bridge": "br-int", "label": "tempest-network-smoke--1321578468", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23586d88-de", "ovs_interfaceid": "23586d88-de8c-4135-b0a0-1a967326da06", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:36:14 np0005588920 nova_compute[226886]: 2026-01-20 15:36:14.531 226890 DEBUG nova.network.os_vif_util [None req-adb6c959-2d19-4549-9621-228687c515d4 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converting VIF {"id": "23586d88-de8c-4135-b0a0-1a967326da06", "address": "fa:16:3e:29:04:15", "network": {"id": "094fc5a0-d658-4e68-b305-244ff77454a4", "bridge": "br-int", "label": "tempest-network-smoke--1321578468", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23586d88-de", "ovs_interfaceid": "23586d88-de8c-4135-b0a0-1a967326da06", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:36:14 np0005588920 nova_compute[226886]: 2026-01-20 15:36:14.533 226890 DEBUG nova.network.os_vif_util [None req-adb6c959-2d19-4549-9621-228687c515d4 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:29:04:15,bridge_name='br-int',has_traffic_filtering=True,id=23586d88-de8c-4135-b0a0-1a967326da06,network=Network(094fc5a0-d658-4e68-b305-244ff77454a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap23586d88-de') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:36:14 np0005588920 nova_compute[226886]: 2026-01-20 15:36:14.534 226890 DEBUG os_vif [None req-adb6c959-2d19-4549-9621-228687c515d4 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:29:04:15,bridge_name='br-int',has_traffic_filtering=True,id=23586d88-de8c-4135-b0a0-1a967326da06,network=Network(094fc5a0-d658-4e68-b305-244ff77454a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap23586d88-de') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:36:14 np0005588920 nova_compute[226886]: 2026-01-20 15:36:14.537 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:14 np0005588920 nova_compute[226886]: 2026-01-20 15:36:14.538 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap23586d88-de, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:36:14 np0005588920 nova_compute[226886]: 2026-01-20 15:36:14.541 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:14 np0005588920 nova_compute[226886]: 2026-01-20 15:36:14.543 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:36:14 np0005588920 nova_compute[226886]: 2026-01-20 15:36:14.545 226890 INFO os_vif [None req-adb6c959-2d19-4549-9621-228687c515d4 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:29:04:15,bridge_name='br-int',has_traffic_filtering=True,id=23586d88-de8c-4135-b0a0-1a967326da06,network=Network(094fc5a0-d658-4e68-b305-244ff77454a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap23586d88-de')#033[00m
Jan 20 10:36:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:14.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:14 np0005588920 podman[309648]: 2026-01-20 15:36:14.589351742 +0000 UTC m=+0.043254708 container remove d436465a9fcc1711544a1dfc41a9816a45ee22662a6bbf50847e4cc91d34cdb0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-094fc5a0-d658-4e68-b305-244ff77454a4, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:36:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:36:14.595 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5b8db843-b9c2-4f73-9337-e30886dd05e4]: (4, ('Tue Jan 20 03:36:14 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-094fc5a0-d658-4e68-b305-244ff77454a4 (d436465a9fcc1711544a1dfc41a9816a45ee22662a6bbf50847e4cc91d34cdb0)\nd436465a9fcc1711544a1dfc41a9816a45ee22662a6bbf50847e4cc91d34cdb0\nTue Jan 20 03:36:14 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-094fc5a0-d658-4e68-b305-244ff77454a4 (d436465a9fcc1711544a1dfc41a9816a45ee22662a6bbf50847e4cc91d34cdb0)\nd436465a9fcc1711544a1dfc41a9816a45ee22662a6bbf50847e4cc91d34cdb0\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:36:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:36:14.596 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[1d61ebfe-afbd-4605-9498-0ed16cadd67a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:36:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:36:14.597 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap094fc5a0-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:36:14 np0005588920 nova_compute[226886]: 2026-01-20 15:36:14.599 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:14 np0005588920 kernel: tap094fc5a0-d0: left promiscuous mode
Jan 20 10:36:14 np0005588920 nova_compute[226886]: 2026-01-20 15:36:14.611 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:36:14.613 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[9e2d5bac-a3df-4f85-8b1c-ca7bdd5c8806]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:36:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:36:14.633 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d965d29e-513f-4f7b-947d-33a70a84a289]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:36:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:36:14.635 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ee4d16db-618c-40fa-9621-73fcd8c87bc6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:36:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:36:14.652 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[76e596f7-ffe5-4ad9-8f37-564ef9981418]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 838741, 'reachable_time': 40866, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 309678, 'error': None, 'target': 'ovnmeta-094fc5a0-d658-4e68-b305-244ff77454a4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:36:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:36:14.655 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-094fc5a0-d658-4e68-b305-244ff77454a4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:36:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:36:14.655 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[6e63366a-d192-4681-baa1-40068adbde3e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:36:14 np0005588920 systemd[1]: run-netns-ovnmeta\x2d094fc5a0\x2dd658\x2d4e68\x2db305\x2d244ff77454a4.mount: Deactivated successfully.
Jan 20 10:36:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:14.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:14 np0005588920 nova_compute[226886]: 2026-01-20 15:36:14.920 226890 INFO nova.virt.libvirt.driver [None req-adb6c959-2d19-4549-9621-228687c515d4 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Deleting instance files /var/lib/nova/instances/98535402-ae8c-46cb-bfbc-6011e34adc25_del#033[00m
Jan 20 10:36:14 np0005588920 nova_compute[226886]: 2026-01-20 15:36:14.921 226890 INFO nova.virt.libvirt.driver [None req-adb6c959-2d19-4549-9621-228687c515d4 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Deletion of /var/lib/nova/instances/98535402-ae8c-46cb-bfbc-6011e34adc25_del complete#033[00m
Jan 20 10:36:15 np0005588920 nova_compute[226886]: 2026-01-20 15:36:15.010 226890 INFO nova.compute.manager [None req-adb6c959-2d19-4549-9621-228687c515d4 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Took 0.72 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:36:15 np0005588920 nova_compute[226886]: 2026-01-20 15:36:15.011 226890 DEBUG oslo.service.loopingcall [None req-adb6c959-2d19-4549-9621-228687c515d4 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:36:15 np0005588920 nova_compute[226886]: 2026-01-20 15:36:15.011 226890 DEBUG nova.compute.manager [-] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:36:15 np0005588920 nova_compute[226886]: 2026-01-20 15:36:15.011 226890 DEBUG nova.network.neutron [-] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:36:15 np0005588920 nova_compute[226886]: 2026-01-20 15:36:15.038 226890 DEBUG nova.compute.manager [req-bffb11e9-6f7d-48de-b531-63be00050e52 req-9cc5d7f2-5f27-43ce-b106-dbd9ab21a4fb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Received event network-vif-unplugged-23586d88-de8c-4135-b0a0-1a967326da06 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:36:15 np0005588920 nova_compute[226886]: 2026-01-20 15:36:15.038 226890 DEBUG oslo_concurrency.lockutils [req-bffb11e9-6f7d-48de-b531-63be00050e52 req-9cc5d7f2-5f27-43ce-b106-dbd9ab21a4fb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "98535402-ae8c-46cb-bfbc-6011e34adc25-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:36:15 np0005588920 nova_compute[226886]: 2026-01-20 15:36:15.039 226890 DEBUG oslo_concurrency.lockutils [req-bffb11e9-6f7d-48de-b531-63be00050e52 req-9cc5d7f2-5f27-43ce-b106-dbd9ab21a4fb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "98535402-ae8c-46cb-bfbc-6011e34adc25-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:36:15 np0005588920 nova_compute[226886]: 2026-01-20 15:36:15.039 226890 DEBUG oslo_concurrency.lockutils [req-bffb11e9-6f7d-48de-b531-63be00050e52 req-9cc5d7f2-5f27-43ce-b106-dbd9ab21a4fb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "98535402-ae8c-46cb-bfbc-6011e34adc25-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:36:15 np0005588920 nova_compute[226886]: 2026-01-20 15:36:15.039 226890 DEBUG nova.compute.manager [req-bffb11e9-6f7d-48de-b531-63be00050e52 req-9cc5d7f2-5f27-43ce-b106-dbd9ab21a4fb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] No waiting events found dispatching network-vif-unplugged-23586d88-de8c-4135-b0a0-1a967326da06 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:36:15 np0005588920 nova_compute[226886]: 2026-01-20 15:36:15.039 226890 DEBUG nova.compute.manager [req-bffb11e9-6f7d-48de-b531-63be00050e52 req-9cc5d7f2-5f27-43ce-b106-dbd9ab21a4fb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Received event network-vif-unplugged-23586d88-de8c-4135-b0a0-1a967326da06 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:36:16 np0005588920 nova_compute[226886]: 2026-01-20 15:36:16.039 226890 DEBUG nova.network.neutron [req-c4011156-1676-4627-ba87-28ee36575841 req-0ebb1abc-360b-4bdf-ae55-924593d141d2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Updated VIF entry in instance network info cache for port 23586d88-de8c-4135-b0a0-1a967326da06. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:36:16 np0005588920 nova_compute[226886]: 2026-01-20 15:36:16.039 226890 DEBUG nova.network.neutron [req-c4011156-1676-4627-ba87-28ee36575841 req-0ebb1abc-360b-4bdf-ae55-924593d141d2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Updating instance_info_cache with network_info: [{"id": "23586d88-de8c-4135-b0a0-1a967326da06", "address": "fa:16:3e:29:04:15", "network": {"id": "094fc5a0-d658-4e68-b305-244ff77454a4", "bridge": "br-int", "label": "tempest-network-smoke--1321578468", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "9.8.7.6", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23586d88-de", "ovs_interfaceid": "23586d88-de8c-4135-b0a0-1a967326da06", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:36:16 np0005588920 nova_compute[226886]: 2026-01-20 15:36:16.060 226890 DEBUG oslo_concurrency.lockutils [req-c4011156-1676-4627-ba87-28ee36575841 req-0ebb1abc-360b-4bdf-ae55-924593d141d2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-98535402-ae8c-46cb-bfbc-6011e34adc25" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:36:16 np0005588920 nova_compute[226886]: 2026-01-20 15:36:16.132 226890 DEBUG nova.network.neutron [-] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:36:16 np0005588920 nova_compute[226886]: 2026-01-20 15:36:16.158 226890 INFO nova.compute.manager [-] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Took 1.15 seconds to deallocate network for instance.#033[00m
Jan 20 10:36:16 np0005588920 nova_compute[226886]: 2026-01-20 15:36:16.203 226890 DEBUG oslo_concurrency.lockutils [None req-adb6c959-2d19-4549-9621-228687c515d4 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:36:16 np0005588920 nova_compute[226886]: 2026-01-20 15:36:16.204 226890 DEBUG oslo_concurrency.lockutils [None req-adb6c959-2d19-4549-9621-228687c515d4 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:36:16 np0005588920 nova_compute[226886]: 2026-01-20 15:36:16.271 226890 DEBUG oslo_concurrency.processutils [None req-adb6c959-2d19-4549-9621-228687c515d4 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:36:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:36:16.500 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:36:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:36:16.501 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:36:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:36:16.501 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:36:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:36:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:16.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:36:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:16.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:16 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:36:16 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3107279431' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:36:16 np0005588920 nova_compute[226886]: 2026-01-20 15:36:16.721 226890 DEBUG oslo_concurrency.processutils [None req-adb6c959-2d19-4549-9621-228687c515d4 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:36:16 np0005588920 nova_compute[226886]: 2026-01-20 15:36:16.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:36:16 np0005588920 nova_compute[226886]: 2026-01-20 15:36:16.727 226890 DEBUG nova.compute.provider_tree [None req-adb6c959-2d19-4549-9621-228687c515d4 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:36:16 np0005588920 nova_compute[226886]: 2026-01-20 15:36:16.747 226890 DEBUG nova.scheduler.client.report [None req-adb6c959-2d19-4549-9621-228687c515d4 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:36:16 np0005588920 nova_compute[226886]: 2026-01-20 15:36:16.754 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:36:16 np0005588920 nova_compute[226886]: 2026-01-20 15:36:16.772 226890 DEBUG oslo_concurrency.lockutils [None req-adb6c959-2d19-4549-9621-228687c515d4 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.568s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:36:16 np0005588920 nova_compute[226886]: 2026-01-20 15:36:16.775 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.021s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:36:16 np0005588920 nova_compute[226886]: 2026-01-20 15:36:16.775 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:36:16 np0005588920 nova_compute[226886]: 2026-01-20 15:36:16.775 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:36:16 np0005588920 nova_compute[226886]: 2026-01-20 15:36:16.776 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:36:16 np0005588920 nova_compute[226886]: 2026-01-20 15:36:16.839 226890 INFO nova.scheduler.client.report [None req-adb6c959-2d19-4549-9621-228687c515d4 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Deleted allocations for instance 98535402-ae8c-46cb-bfbc-6011e34adc25#033[00m
Jan 20 10:36:16 np0005588920 nova_compute[226886]: 2026-01-20 15:36:16.917 226890 DEBUG oslo_concurrency.lockutils [None req-adb6c959-2d19-4549-9621-228687c515d4 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "98535402-ae8c-46cb-bfbc-6011e34adc25" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.635s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:36:17 np0005588920 nova_compute[226886]: 2026-01-20 15:36:17.159 226890 DEBUG nova.compute.manager [req-1173b98b-307d-4713-a23a-5c038e097aac req-eadf4973-1edf-40c4-8e37-2b6da2dde30d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Received event network-vif-plugged-23586d88-de8c-4135-b0a0-1a967326da06 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:36:17 np0005588920 nova_compute[226886]: 2026-01-20 15:36:17.159 226890 DEBUG oslo_concurrency.lockutils [req-1173b98b-307d-4713-a23a-5c038e097aac req-eadf4973-1edf-40c4-8e37-2b6da2dde30d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "98535402-ae8c-46cb-bfbc-6011e34adc25-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:36:17 np0005588920 nova_compute[226886]: 2026-01-20 15:36:17.159 226890 DEBUG oslo_concurrency.lockutils [req-1173b98b-307d-4713-a23a-5c038e097aac req-eadf4973-1edf-40c4-8e37-2b6da2dde30d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "98535402-ae8c-46cb-bfbc-6011e34adc25-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:36:17 np0005588920 nova_compute[226886]: 2026-01-20 15:36:17.160 226890 DEBUG oslo_concurrency.lockutils [req-1173b98b-307d-4713-a23a-5c038e097aac req-eadf4973-1edf-40c4-8e37-2b6da2dde30d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "98535402-ae8c-46cb-bfbc-6011e34adc25-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:36:17 np0005588920 nova_compute[226886]: 2026-01-20 15:36:17.160 226890 DEBUG nova.compute.manager [req-1173b98b-307d-4713-a23a-5c038e097aac req-eadf4973-1edf-40c4-8e37-2b6da2dde30d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] No waiting events found dispatching network-vif-plugged-23586d88-de8c-4135-b0a0-1a967326da06 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:36:17 np0005588920 nova_compute[226886]: 2026-01-20 15:36:17.160 226890 WARNING nova.compute.manager [req-1173b98b-307d-4713-a23a-5c038e097aac req-eadf4973-1edf-40c4-8e37-2b6da2dde30d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Received unexpected event network-vif-plugged-23586d88-de8c-4135-b0a0-1a967326da06 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 10:36:17 np0005588920 nova_compute[226886]: 2026-01-20 15:36:17.160 226890 DEBUG nova.compute.manager [req-1173b98b-307d-4713-a23a-5c038e097aac req-eadf4973-1edf-40c4-8e37-2b6da2dde30d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Received event network-vif-deleted-23586d88-de8c-4135-b0a0-1a967326da06 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:36:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:36:17 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1875694725' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:36:17 np0005588920 nova_compute[226886]: 2026-01-20 15:36:17.236 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:36:17 np0005588920 nova_compute[226886]: 2026-01-20 15:36:17.405 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:36:17 np0005588920 nova_compute[226886]: 2026-01-20 15:36:17.406 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4092MB free_disk=20.952808380126953GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:36:17 np0005588920 nova_compute[226886]: 2026-01-20 15:36:17.406 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:36:17 np0005588920 nova_compute[226886]: 2026-01-20 15:36:17.406 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:36:17 np0005588920 nova_compute[226886]: 2026-01-20 15:36:17.454 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:36:17 np0005588920 nova_compute[226886]: 2026-01-20 15:36:17.455 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:36:17 np0005588920 nova_compute[226886]: 2026-01-20 15:36:17.478 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:36:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:36:17 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1689596933' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:36:17 np0005588920 nova_compute[226886]: 2026-01-20 15:36:17.881 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.403s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:36:17 np0005588920 nova_compute[226886]: 2026-01-20 15:36:17.888 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:36:17 np0005588920 nova_compute[226886]: 2026-01-20 15:36:17.909 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:36:17 np0005588920 nova_compute[226886]: 2026-01-20 15:36:17.934 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:36:17 np0005588920 nova_compute[226886]: 2026-01-20 15:36:17.934 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.528s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:36:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:18.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:18 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:36:18 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:36:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:36:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:18.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:36:19 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:36:19 np0005588920 nova_compute[226886]: 2026-01-20 15:36:19.255 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:19 np0005588920 nova_compute[226886]: 2026-01-20 15:36:19.540 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:20 np0005588920 nova_compute[226886]: 2026-01-20 15:36:20.172 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:20 np0005588920 nova_compute[226886]: 2026-01-20 15:36:20.250 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:20.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:36:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:20.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:36:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:22.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:36:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:22.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:36:22 np0005588920 nova_compute[226886]: 2026-01-20 15:36:22.935 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:36:22 np0005588920 nova_compute[226886]: 2026-01-20 15:36:22.935 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:36:22 np0005588920 nova_compute[226886]: 2026-01-20 15:36:22.935 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:36:23 np0005588920 nova_compute[226886]: 2026-01-20 15:36:23.003 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:36:23 np0005588920 nova_compute[226886]: 2026-01-20 15:36:23.003 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:36:23 np0005588920 nova_compute[226886]: 2026-01-20 15:36:23.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:36:24 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:36:24 np0005588920 nova_compute[226886]: 2026-01-20 15:36:24.256 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:24 np0005588920 nova_compute[226886]: 2026-01-20 15:36:24.541 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:36:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:24.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:36:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:24.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:25 np0005588920 nova_compute[226886]: 2026-01-20 15:36:25.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:36:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:26.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:26.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:26 np0005588920 nova_compute[226886]: 2026-01-20 15:36:26.720 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:36:26 np0005588920 nova_compute[226886]: 2026-01-20 15:36:26.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:36:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:28.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:28.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:36:29 np0005588920 nova_compute[226886]: 2026-01-20 15:36:29.258 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:29 np0005588920 nova_compute[226886]: 2026-01-20 15:36:29.514 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768923374.5130968, 98535402-ae8c-46cb-bfbc-6011e34adc25 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:36:29 np0005588920 nova_compute[226886]: 2026-01-20 15:36:29.515 226890 INFO nova.compute.manager [-] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:36:29 np0005588920 nova_compute[226886]: 2026-01-20 15:36:29.542 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:29 np0005588920 nova_compute[226886]: 2026-01-20 15:36:29.584 226890 DEBUG nova.compute.manager [None req-0ff82380-35be-49ea-a9dd-3dd346cfb13e - - - - - -] [instance: 98535402-ae8c-46cb-bfbc-6011e34adc25] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:36:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:30.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:30.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:32.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:36:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:32.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:36:32 np0005588920 nova_compute[226886]: 2026-01-20 15:36:32.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:36:32 np0005588920 nova_compute[226886]: 2026-01-20 15:36:32.724 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:36:34 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:36:34 np0005588920 nova_compute[226886]: 2026-01-20 15:36:34.260 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:34 np0005588920 nova_compute[226886]: 2026-01-20 15:36:34.545 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:34.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:34.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:34 np0005588920 podman[309801]: 2026-01-20 15:36:34.990817194 +0000 UTC m=+0.078006445 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:36:35 np0005588920 nova_compute[226886]: 2026-01-20 15:36:35.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:36:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:36:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:36.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:36:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:36.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:36:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:38.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:36:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:38.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:39 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:36:39 np0005588920 nova_compute[226886]: 2026-01-20 15:36:39.261 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:39 np0005588920 nova_compute[226886]: 2026-01-20 15:36:39.546 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:40.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:40.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:42.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:42.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:43 np0005588920 nova_compute[226886]: 2026-01-20 15:36:43.371 226890 DEBUG oslo_concurrency.lockutils [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "448068ae-e12d-44db-be1e-aab18ec6bf69" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:36:43 np0005588920 nova_compute[226886]: 2026-01-20 15:36:43.371 226890 DEBUG oslo_concurrency.lockutils [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "448068ae-e12d-44db-be1e-aab18ec6bf69" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:36:43 np0005588920 nova_compute[226886]: 2026-01-20 15:36:43.385 226890 DEBUG nova.compute.manager [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 10:36:43 np0005588920 nova_compute[226886]: 2026-01-20 15:36:43.462 226890 DEBUG oslo_concurrency.lockutils [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:36:43 np0005588920 nova_compute[226886]: 2026-01-20 15:36:43.463 226890 DEBUG oslo_concurrency.lockutils [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:36:43 np0005588920 nova_compute[226886]: 2026-01-20 15:36:43.514 226890 DEBUG nova.virt.hardware [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 10:36:43 np0005588920 nova_compute[226886]: 2026-01-20 15:36:43.514 226890 INFO nova.compute.claims [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 10:36:43 np0005588920 nova_compute[226886]: 2026-01-20 15:36:43.745 226890 DEBUG oslo_concurrency.processutils [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:36:43 np0005588920 podman[309848]: 2026-01-20 15:36:43.960826843 +0000 UTC m=+0.047192600 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 20 10:36:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:36:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:36:44 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2033270823' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:36:44 np0005588920 nova_compute[226886]: 2026-01-20 15:36:44.181 226890 DEBUG oslo_concurrency.processutils [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:36:44 np0005588920 nova_compute[226886]: 2026-01-20 15:36:44.188 226890 DEBUG nova.compute.provider_tree [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:36:44 np0005588920 nova_compute[226886]: 2026-01-20 15:36:44.214 226890 DEBUG nova.scheduler.client.report [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:36:44 np0005588920 nova_compute[226886]: 2026-01-20 15:36:44.256 226890 DEBUG oslo_concurrency.lockutils [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.793s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:36:44 np0005588920 nova_compute[226886]: 2026-01-20 15:36:44.257 226890 DEBUG nova.compute.manager [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 10:36:44 np0005588920 nova_compute[226886]: 2026-01-20 15:36:44.262 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:44 np0005588920 nova_compute[226886]: 2026-01-20 15:36:44.323 226890 DEBUG nova.compute.manager [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 10:36:44 np0005588920 nova_compute[226886]: 2026-01-20 15:36:44.323 226890 DEBUG nova.network.neutron [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 10:36:44 np0005588920 nova_compute[226886]: 2026-01-20 15:36:44.344 226890 INFO nova.virt.libvirt.driver [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 10:36:44 np0005588920 nova_compute[226886]: 2026-01-20 15:36:44.368 226890 DEBUG nova.compute.manager [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 10:36:44 np0005588920 nova_compute[226886]: 2026-01-20 15:36:44.486 226890 DEBUG nova.compute.manager [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 10:36:44 np0005588920 nova_compute[226886]: 2026-01-20 15:36:44.488 226890 DEBUG nova.virt.libvirt.driver [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 10:36:44 np0005588920 nova_compute[226886]: 2026-01-20 15:36:44.488 226890 INFO nova.virt.libvirt.driver [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Creating image(s)#033[00m
Jan 20 10:36:44 np0005588920 nova_compute[226886]: 2026-01-20 15:36:44.512 226890 DEBUG nova.storage.rbd_utils [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image 448068ae-e12d-44db-be1e-aab18ec6bf69_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:36:44 np0005588920 nova_compute[226886]: 2026-01-20 15:36:44.532 226890 DEBUG nova.storage.rbd_utils [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image 448068ae-e12d-44db-be1e-aab18ec6bf69_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:36:44 np0005588920 nova_compute[226886]: 2026-01-20 15:36:44.602 226890 DEBUG nova.storage.rbd_utils [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image 448068ae-e12d-44db-be1e-aab18ec6bf69_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:36:44 np0005588920 nova_compute[226886]: 2026-01-20 15:36:44.606 226890 DEBUG oslo_concurrency.processutils [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:36:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:44.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:44 np0005588920 nova_compute[226886]: 2026-01-20 15:36:44.638 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:44 np0005588920 nova_compute[226886]: 2026-01-20 15:36:44.642 226890 DEBUG nova.policy [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5338aa65dc0e4326a66ce79053787f14', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3168f57421fb49bfb94b85daedd1fe7d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 10:36:44 np0005588920 nova_compute[226886]: 2026-01-20 15:36:44.680 226890 DEBUG oslo_concurrency.processutils [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:36:44 np0005588920 nova_compute[226886]: 2026-01-20 15:36:44.681 226890 DEBUG oslo_concurrency.lockutils [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:36:44 np0005588920 nova_compute[226886]: 2026-01-20 15:36:44.682 226890 DEBUG oslo_concurrency.lockutils [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:36:44 np0005588920 nova_compute[226886]: 2026-01-20 15:36:44.682 226890 DEBUG oslo_concurrency.lockutils [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:36:44 np0005588920 nova_compute[226886]: 2026-01-20 15:36:44.705 226890 DEBUG nova.storage.rbd_utils [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image 448068ae-e12d-44db-be1e-aab18ec6bf69_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:36:44 np0005588920 nova_compute[226886]: 2026-01-20 15:36:44.708 226890 DEBUG oslo_concurrency.processutils [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 448068ae-e12d-44db-be1e-aab18ec6bf69_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:36:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:36:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:44.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:36:44 np0005588920 nova_compute[226886]: 2026-01-20 15:36:44.971 226890 DEBUG oslo_concurrency.processutils [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 448068ae-e12d-44db-be1e-aab18ec6bf69_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.263s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:36:45 np0005588920 nova_compute[226886]: 2026-01-20 15:36:45.046 226890 DEBUG nova.storage.rbd_utils [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] resizing rbd image 448068ae-e12d-44db-be1e-aab18ec6bf69_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 10:36:45 np0005588920 nova_compute[226886]: 2026-01-20 15:36:45.142 226890 DEBUG nova.objects.instance [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lazy-loading 'migration_context' on Instance uuid 448068ae-e12d-44db-be1e-aab18ec6bf69 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:36:45 np0005588920 nova_compute[226886]: 2026-01-20 15:36:45.171 226890 DEBUG nova.virt.libvirt.driver [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 10:36:45 np0005588920 nova_compute[226886]: 2026-01-20 15:36:45.172 226890 DEBUG nova.virt.libvirt.driver [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Ensure instance console log exists: /var/lib/nova/instances/448068ae-e12d-44db-be1e-aab18ec6bf69/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:36:45 np0005588920 nova_compute[226886]: 2026-01-20 15:36:45.172 226890 DEBUG oslo_concurrency.lockutils [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:36:45 np0005588920 nova_compute[226886]: 2026-01-20 15:36:45.172 226890 DEBUG oslo_concurrency.lockutils [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:36:45 np0005588920 nova_compute[226886]: 2026-01-20 15:36:45.173 226890 DEBUG oslo_concurrency.lockutils [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:36:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:36:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:46.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:36:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:46.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:46 np0005588920 nova_compute[226886]: 2026-01-20 15:36:46.859 226890 DEBUG nova.network.neutron [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Successfully created port: 5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 10:36:47 np0005588920 nova_compute[226886]: 2026-01-20 15:36:47.570 226890 DEBUG nova.network.neutron [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Successfully updated port: 5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 10:36:47 np0005588920 nova_compute[226886]: 2026-01-20 15:36:47.588 226890 DEBUG oslo_concurrency.lockutils [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "refresh_cache-448068ae-e12d-44db-be1e-aab18ec6bf69" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:36:47 np0005588920 nova_compute[226886]: 2026-01-20 15:36:47.589 226890 DEBUG oslo_concurrency.lockutils [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquired lock "refresh_cache-448068ae-e12d-44db-be1e-aab18ec6bf69" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:36:47 np0005588920 nova_compute[226886]: 2026-01-20 15:36:47.589 226890 DEBUG nova.network.neutron [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:36:47 np0005588920 nova_compute[226886]: 2026-01-20 15:36:47.670 226890 DEBUG nova.compute.manager [req-4d634d1e-d1d6-4ddc-b8df-51edec374617 req-d0acbd89-2528-4bd5-8e6c-7900bee554f0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Received event network-changed-5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:36:47 np0005588920 nova_compute[226886]: 2026-01-20 15:36:47.670 226890 DEBUG nova.compute.manager [req-4d634d1e-d1d6-4ddc-b8df-51edec374617 req-d0acbd89-2528-4bd5-8e6c-7900bee554f0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Refreshing instance network info cache due to event network-changed-5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:36:47 np0005588920 nova_compute[226886]: 2026-01-20 15:36:47.670 226890 DEBUG oslo_concurrency.lockutils [req-4d634d1e-d1d6-4ddc-b8df-51edec374617 req-d0acbd89-2528-4bd5-8e6c-7900bee554f0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-448068ae-e12d-44db-be1e-aab18ec6bf69" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:36:47 np0005588920 nova_compute[226886]: 2026-01-20 15:36:47.754 226890 DEBUG nova.network.neutron [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:36:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:48.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:48.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:49 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:36:49 np0005588920 nova_compute[226886]: 2026-01-20 15:36:49.118 226890 DEBUG nova.network.neutron [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Updating instance_info_cache with network_info: [{"id": "5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140", "address": "fa:16:3e:bf:54:11", "network": {"id": "ce71b376-fc91-4f6b-9838-8ea300ca70de", "bridge": "br-int", "label": "tempest-network-smoke--315983280", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d9bd6b8-e9", "ovs_interfaceid": "5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:36:49 np0005588920 nova_compute[226886]: 2026-01-20 15:36:49.156 226890 DEBUG oslo_concurrency.lockutils [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Releasing lock "refresh_cache-448068ae-e12d-44db-be1e-aab18ec6bf69" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:36:49 np0005588920 nova_compute[226886]: 2026-01-20 15:36:49.156 226890 DEBUG nova.compute.manager [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Instance network_info: |[{"id": "5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140", "address": "fa:16:3e:bf:54:11", "network": {"id": "ce71b376-fc91-4f6b-9838-8ea300ca70de", "bridge": "br-int", "label": "tempest-network-smoke--315983280", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d9bd6b8-e9", "ovs_interfaceid": "5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 10:36:49 np0005588920 nova_compute[226886]: 2026-01-20 15:36:49.157 226890 DEBUG oslo_concurrency.lockutils [req-4d634d1e-d1d6-4ddc-b8df-51edec374617 req-d0acbd89-2528-4bd5-8e6c-7900bee554f0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-448068ae-e12d-44db-be1e-aab18ec6bf69" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:36:49 np0005588920 nova_compute[226886]: 2026-01-20 15:36:49.157 226890 DEBUG nova.network.neutron [req-4d634d1e-d1d6-4ddc-b8df-51edec374617 req-d0acbd89-2528-4bd5-8e6c-7900bee554f0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Refreshing network info cache for port 5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:36:49 np0005588920 nova_compute[226886]: 2026-01-20 15:36:49.160 226890 DEBUG nova.virt.libvirt.driver [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Start _get_guest_xml network_info=[{"id": "5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140", "address": "fa:16:3e:bf:54:11", "network": {"id": "ce71b376-fc91-4f6b-9838-8ea300ca70de", "bridge": "br-int", "label": "tempest-network-smoke--315983280", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d9bd6b8-e9", "ovs_interfaceid": "5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:36:49 np0005588920 nova_compute[226886]: 2026-01-20 15:36:49.168 226890 WARNING nova.virt.libvirt.driver [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:36:49 np0005588920 nova_compute[226886]: 2026-01-20 15:36:49.175 226890 DEBUG nova.virt.libvirt.host [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:36:49 np0005588920 nova_compute[226886]: 2026-01-20 15:36:49.176 226890 DEBUG nova.virt.libvirt.host [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:36:49 np0005588920 nova_compute[226886]: 2026-01-20 15:36:49.184 226890 DEBUG nova.virt.libvirt.host [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:36:49 np0005588920 nova_compute[226886]: 2026-01-20 15:36:49.185 226890 DEBUG nova.virt.libvirt.host [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:36:49 np0005588920 nova_compute[226886]: 2026-01-20 15:36:49.186 226890 DEBUG nova.virt.libvirt.driver [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:36:49 np0005588920 nova_compute[226886]: 2026-01-20 15:36:49.186 226890 DEBUG nova.virt.hardware [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:36:49 np0005588920 nova_compute[226886]: 2026-01-20 15:36:49.187 226890 DEBUG nova.virt.hardware [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:36:49 np0005588920 nova_compute[226886]: 2026-01-20 15:36:49.187 226890 DEBUG nova.virt.hardware [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:36:49 np0005588920 nova_compute[226886]: 2026-01-20 15:36:49.187 226890 DEBUG nova.virt.hardware [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:36:49 np0005588920 nova_compute[226886]: 2026-01-20 15:36:49.187 226890 DEBUG nova.virt.hardware [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:36:49 np0005588920 nova_compute[226886]: 2026-01-20 15:36:49.187 226890 DEBUG nova.virt.hardware [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:36:49 np0005588920 nova_compute[226886]: 2026-01-20 15:36:49.188 226890 DEBUG nova.virt.hardware [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:36:49 np0005588920 nova_compute[226886]: 2026-01-20 15:36:49.188 226890 DEBUG nova.virt.hardware [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:36:49 np0005588920 nova_compute[226886]: 2026-01-20 15:36:49.188 226890 DEBUG nova.virt.hardware [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:36:49 np0005588920 nova_compute[226886]: 2026-01-20 15:36:49.188 226890 DEBUG nova.virt.hardware [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:36:49 np0005588920 nova_compute[226886]: 2026-01-20 15:36:49.188 226890 DEBUG nova.virt.hardware [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:36:49 np0005588920 nova_compute[226886]: 2026-01-20 15:36:49.191 226890 DEBUG oslo_concurrency.processutils [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:36:49 np0005588920 nova_compute[226886]: 2026-01-20 15:36:49.264 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:49 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:36:49 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/521345695' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:36:49 np0005588920 nova_compute[226886]: 2026-01-20 15:36:49.649 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:49 np0005588920 nova_compute[226886]: 2026-01-20 15:36:49.672 226890 DEBUG oslo_concurrency.processutils [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:36:49 np0005588920 nova_compute[226886]: 2026-01-20 15:36:49.709 226890 DEBUG nova.storage.rbd_utils [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image 448068ae-e12d-44db-be1e-aab18ec6bf69_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:36:49 np0005588920 nova_compute[226886]: 2026-01-20 15:36:49.716 226890 DEBUG oslo_concurrency.processutils [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:36:50 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:36:50 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/940201291' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:36:50 np0005588920 nova_compute[226886]: 2026-01-20 15:36:50.164 226890 DEBUG oslo_concurrency.processutils [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:36:50 np0005588920 nova_compute[226886]: 2026-01-20 15:36:50.167 226890 DEBUG nova.virt.libvirt.vif [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:36:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-36434333',display_name='tempest-TestNetworkBasicOps-server-36434333',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-36434333',id=213,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMmXNz5iC21rSw9frG/tYMEZfAaZZQMophlhhWlqfNanOEERbqiQdgrmdDphltOag9NUoEg9YTEbCYJogCyo1wy+ArBGraFEWTtl6g8+Am3Ib6bk6goIdDCUuYmAe70jlw==',key_name='tempest-TestNetworkBasicOps-2119238938',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3168f57421fb49bfb94b85daedd1fe7d',ramdisk_id='',reservation_id='r-s6bjvw22',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-807695970',owner_user_name='tempest-TestNetworkBasicOps-807695970-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:36:44Z,user_data=None,user_id='5338aa65dc0e4326a66ce79053787f14',uuid=448068ae-e12d-44db-be1e-aab18ec6bf69,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140", "address": "fa:16:3e:bf:54:11", "network": {"id": "ce71b376-fc91-4f6b-9838-8ea300ca70de", "bridge": "br-int", "label": "tempest-network-smoke--315983280", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d9bd6b8-e9", "ovs_interfaceid": "5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:36:50 np0005588920 nova_compute[226886]: 2026-01-20 15:36:50.167 226890 DEBUG nova.network.os_vif_util [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converting VIF {"id": "5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140", "address": "fa:16:3e:bf:54:11", "network": {"id": "ce71b376-fc91-4f6b-9838-8ea300ca70de", "bridge": "br-int", "label": "tempest-network-smoke--315983280", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d9bd6b8-e9", "ovs_interfaceid": "5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:36:50 np0005588920 nova_compute[226886]: 2026-01-20 15:36:50.169 226890 DEBUG nova.network.os_vif_util [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bf:54:11,bridge_name='br-int',has_traffic_filtering=True,id=5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140,network=Network(ce71b376-fc91-4f6b-9838-8ea300ca70de),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d9bd6b8-e9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:36:50 np0005588920 nova_compute[226886]: 2026-01-20 15:36:50.171 226890 DEBUG nova.objects.instance [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lazy-loading 'pci_devices' on Instance uuid 448068ae-e12d-44db-be1e-aab18ec6bf69 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:36:50 np0005588920 nova_compute[226886]: 2026-01-20 15:36:50.221 226890 DEBUG nova.virt.libvirt.driver [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:36:50 np0005588920 nova_compute[226886]:  <uuid>448068ae-e12d-44db-be1e-aab18ec6bf69</uuid>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:  <name>instance-000000d5</name>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:36:50 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:      <nova:name>tempest-TestNetworkBasicOps-server-36434333</nova:name>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 15:36:49</nova:creationTime>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 10:36:50 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:        <nova:user uuid="5338aa65dc0e4326a66ce79053787f14">tempest-TestNetworkBasicOps-807695970-project-member</nova:user>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:        <nova:project uuid="3168f57421fb49bfb94b85daedd1fe7d">tempest-TestNetworkBasicOps-807695970</nova:project>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:        <nova:port uuid="5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140">
Jan 20 10:36:50 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    <system>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:      <entry name="serial">448068ae-e12d-44db-be1e-aab18ec6bf69</entry>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:      <entry name="uuid">448068ae-e12d-44db-be1e-aab18ec6bf69</entry>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    </system>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:  <os>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:  </os>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:  <features>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:  </features>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:  </clock>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:  <devices>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 10:36:50 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/448068ae-e12d-44db-be1e-aab18ec6bf69_disk">
Jan 20 10:36:50 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:36:50 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 10:36:50 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/448068ae-e12d-44db-be1e-aab18ec6bf69_disk.config">
Jan 20 10:36:50 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:36:50 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 10:36:50 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:bf:54:11"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:      <target dev="tap5d9bd6b8-e9"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    </interface>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 10:36:50 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/448068ae-e12d-44db-be1e-aab18ec6bf69/console.log" append="off"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    </serial>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    <video>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    </video>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 10:36:50 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    </rng>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 10:36:50 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 10:36:50 np0005588920 nova_compute[226886]:  </devices>
Jan 20 10:36:50 np0005588920 nova_compute[226886]: </domain>
Jan 20 10:36:50 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:36:50 np0005588920 nova_compute[226886]: 2026-01-20 15:36:50.223 226890 DEBUG nova.compute.manager [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Preparing to wait for external event network-vif-plugged-5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:36:50 np0005588920 nova_compute[226886]: 2026-01-20 15:36:50.223 226890 DEBUG oslo_concurrency.lockutils [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "448068ae-e12d-44db-be1e-aab18ec6bf69-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:36:50 np0005588920 nova_compute[226886]: 2026-01-20 15:36:50.224 226890 DEBUG oslo_concurrency.lockutils [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "448068ae-e12d-44db-be1e-aab18ec6bf69-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:36:50 np0005588920 nova_compute[226886]: 2026-01-20 15:36:50.224 226890 DEBUG oslo_concurrency.lockutils [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "448068ae-e12d-44db-be1e-aab18ec6bf69-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:36:50 np0005588920 nova_compute[226886]: 2026-01-20 15:36:50.225 226890 DEBUG nova.virt.libvirt.vif [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:36:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-36434333',display_name='tempest-TestNetworkBasicOps-server-36434333',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-36434333',id=213,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMmXNz5iC21rSw9frG/tYMEZfAaZZQMophlhhWlqfNanOEERbqiQdgrmdDphltOag9NUoEg9YTEbCYJogCyo1wy+ArBGraFEWTtl6g8+Am3Ib6bk6goIdDCUuYmAe70jlw==',key_name='tempest-TestNetworkBasicOps-2119238938',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3168f57421fb49bfb94b85daedd1fe7d',ramdisk_id='',reservation_id='r-s6bjvw22',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-807695970',owner_user_name='tempest-TestNetworkBasicOps-807695970-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:36:44Z,user_data=None,user_id='5338aa65dc0e4326a66ce79053787f14',uuid=448068ae-e12d-44db-be1e-aab18ec6bf69,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140", "address": "fa:16:3e:bf:54:11", "network": {"id": "ce71b376-fc91-4f6b-9838-8ea300ca70de", "bridge": "br-int", "label": "tempest-network-smoke--315983280", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d9bd6b8-e9", "ovs_interfaceid": "5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:36:50 np0005588920 nova_compute[226886]: 2026-01-20 15:36:50.226 226890 DEBUG nova.network.os_vif_util [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converting VIF {"id": "5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140", "address": "fa:16:3e:bf:54:11", "network": {"id": "ce71b376-fc91-4f6b-9838-8ea300ca70de", "bridge": "br-int", "label": "tempest-network-smoke--315983280", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d9bd6b8-e9", "ovs_interfaceid": "5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:36:50 np0005588920 nova_compute[226886]: 2026-01-20 15:36:50.227 226890 DEBUG nova.network.os_vif_util [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bf:54:11,bridge_name='br-int',has_traffic_filtering=True,id=5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140,network=Network(ce71b376-fc91-4f6b-9838-8ea300ca70de),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d9bd6b8-e9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:36:50 np0005588920 nova_compute[226886]: 2026-01-20 15:36:50.227 226890 DEBUG os_vif [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bf:54:11,bridge_name='br-int',has_traffic_filtering=True,id=5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140,network=Network(ce71b376-fc91-4f6b-9838-8ea300ca70de),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d9bd6b8-e9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:36:50 np0005588920 nova_compute[226886]: 2026-01-20 15:36:50.228 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:50 np0005588920 nova_compute[226886]: 2026-01-20 15:36:50.229 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:36:50 np0005588920 nova_compute[226886]: 2026-01-20 15:36:50.229 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:36:50 np0005588920 nova_compute[226886]: 2026-01-20 15:36:50.234 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:50 np0005588920 nova_compute[226886]: 2026-01-20 15:36:50.234 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5d9bd6b8-e9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:36:50 np0005588920 nova_compute[226886]: 2026-01-20 15:36:50.235 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5d9bd6b8-e9, col_values=(('external_ids', {'iface-id': '5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:bf:54:11', 'vm-uuid': '448068ae-e12d-44db-be1e-aab18ec6bf69'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:36:50 np0005588920 NetworkManager[49076]: <info>  [1768923410.2380] manager: (tap5d9bd6b8-e9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/456)
Jan 20 10:36:50 np0005588920 nova_compute[226886]: 2026-01-20 15:36:50.239 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:36:50 np0005588920 nova_compute[226886]: 2026-01-20 15:36:50.244 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:50 np0005588920 nova_compute[226886]: 2026-01-20 15:36:50.245 226890 INFO os_vif [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bf:54:11,bridge_name='br-int',has_traffic_filtering=True,id=5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140,network=Network(ce71b376-fc91-4f6b-9838-8ea300ca70de),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d9bd6b8-e9')#033[00m
Jan 20 10:36:50 np0005588920 nova_compute[226886]: 2026-01-20 15:36:50.317 226890 DEBUG nova.virt.libvirt.driver [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:36:50 np0005588920 nova_compute[226886]: 2026-01-20 15:36:50.318 226890 DEBUG nova.virt.libvirt.driver [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:36:50 np0005588920 nova_compute[226886]: 2026-01-20 15:36:50.319 226890 DEBUG nova.virt.libvirt.driver [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] No VIF found with MAC fa:16:3e:bf:54:11, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:36:50 np0005588920 nova_compute[226886]: 2026-01-20 15:36:50.320 226890 INFO nova.virt.libvirt.driver [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Using config drive#033[00m
Jan 20 10:36:50 np0005588920 nova_compute[226886]: 2026-01-20 15:36:50.358 226890 DEBUG nova.storage.rbd_utils [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image 448068ae-e12d-44db-be1e-aab18ec6bf69_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:36:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:50.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:50.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:50 np0005588920 nova_compute[226886]: 2026-01-20 15:36:50.958 226890 INFO nova.virt.libvirt.driver [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Creating config drive at /var/lib/nova/instances/448068ae-e12d-44db-be1e-aab18ec6bf69/disk.config#033[00m
Jan 20 10:36:50 np0005588920 nova_compute[226886]: 2026-01-20 15:36:50.964 226890 DEBUG oslo_concurrency.processutils [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/448068ae-e12d-44db-be1e-aab18ec6bf69/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpffh99jg9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:36:51 np0005588920 nova_compute[226886]: 2026-01-20 15:36:51.112 226890 DEBUG oslo_concurrency.processutils [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/448068ae-e12d-44db-be1e-aab18ec6bf69/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpffh99jg9" returned: 0 in 0.148s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:36:51 np0005588920 nova_compute[226886]: 2026-01-20 15:36:51.148 226890 DEBUG nova.storage.rbd_utils [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image 448068ae-e12d-44db-be1e-aab18ec6bf69_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:36:51 np0005588920 nova_compute[226886]: 2026-01-20 15:36:51.154 226890 DEBUG oslo_concurrency.processutils [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/448068ae-e12d-44db-be1e-aab18ec6bf69/disk.config 448068ae-e12d-44db-be1e-aab18ec6bf69_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:36:51 np0005588920 nova_compute[226886]: 2026-01-20 15:36:51.200 226890 DEBUG nova.network.neutron [req-4d634d1e-d1d6-4ddc-b8df-51edec374617 req-d0acbd89-2528-4bd5-8e6c-7900bee554f0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Updated VIF entry in instance network info cache for port 5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:36:51 np0005588920 nova_compute[226886]: 2026-01-20 15:36:51.201 226890 DEBUG nova.network.neutron [req-4d634d1e-d1d6-4ddc-b8df-51edec374617 req-d0acbd89-2528-4bd5-8e6c-7900bee554f0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Updating instance_info_cache with network_info: [{"id": "5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140", "address": "fa:16:3e:bf:54:11", "network": {"id": "ce71b376-fc91-4f6b-9838-8ea300ca70de", "bridge": "br-int", "label": "tempest-network-smoke--315983280", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d9bd6b8-e9", "ovs_interfaceid": "5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:36:51 np0005588920 nova_compute[226886]: 2026-01-20 15:36:51.220 226890 DEBUG oslo_concurrency.lockutils [req-4d634d1e-d1d6-4ddc-b8df-51edec374617 req-d0acbd89-2528-4bd5-8e6c-7900bee554f0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-448068ae-e12d-44db-be1e-aab18ec6bf69" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:36:51 np0005588920 nova_compute[226886]: 2026-01-20 15:36:51.353 226890 DEBUG oslo_concurrency.processutils [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/448068ae-e12d-44db-be1e-aab18ec6bf69/disk.config 448068ae-e12d-44db-be1e-aab18ec6bf69_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.200s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:36:51 np0005588920 nova_compute[226886]: 2026-01-20 15:36:51.354 226890 INFO nova.virt.libvirt.driver [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Deleting local config drive /var/lib/nova/instances/448068ae-e12d-44db-be1e-aab18ec6bf69/disk.config because it was imported into RBD.#033[00m
Jan 20 10:36:51 np0005588920 kernel: tap5d9bd6b8-e9: entered promiscuous mode
Jan 20 10:36:51 np0005588920 NetworkManager[49076]: <info>  [1768923411.4120] manager: (tap5d9bd6b8-e9): new Tun device (/org/freedesktop/NetworkManager/Devices/457)
Jan 20 10:36:51 np0005588920 ovn_controller[133971]: 2026-01-20T15:36:51Z|00970|binding|INFO|Claiming lport 5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140 for this chassis.
Jan 20 10:36:51 np0005588920 ovn_controller[133971]: 2026-01-20T15:36:51Z|00971|binding|INFO|5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140: Claiming fa:16:3e:bf:54:11 10.100.0.13
Jan 20 10:36:51 np0005588920 nova_compute[226886]: 2026-01-20 15:36:51.411 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:51 np0005588920 nova_compute[226886]: 2026-01-20 15:36:51.416 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:36:51.425 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bf:54:11 10.100.0.13'], port_security=['fa:16:3e:bf:54:11 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '448068ae-e12d-44db-be1e-aab18ec6bf69', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ce71b376-fc91-4f6b-9838-8ea300ca70de', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3168f57421fb49bfb94b85daedd1fe7d', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ff5c7c17-408f-4158-a3de-418e7321dde0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=10fabbbe-46a8-4773-85b5-859f8d94e243, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:36:51.426 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140 in datapath ce71b376-fc91-4f6b-9838-8ea300ca70de bound to our chassis#033[00m
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:36:51.427 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ce71b376-fc91-4f6b-9838-8ea300ca70de#033[00m
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:36:51.437 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b0d9dcaa-24de-42f8-8d44-ed66ac33f42a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:36:51.438 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapce71b376-f1 in ovnmeta-ce71b376-fc91-4f6b-9838-8ea300ca70de namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:36:51.440 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapce71b376-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:36:51.440 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[b4468214-9f34-489b-8aaf-e7816037796e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:36:51.441 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[3844404f-55ab-4db5-b6d2-5bdb8ca59355]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:36:51 np0005588920 systemd-machined[196121]: New machine qemu-100-instance-000000d5.
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:36:51.452 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[caca5ab1-9c62-46eb-a975-2e55b8c72dc9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:36:51 np0005588920 systemd[1]: Started Virtual Machine qemu-100-instance-000000d5.
Jan 20 10:36:51 np0005588920 nova_compute[226886]: 2026-01-20 15:36:51.476 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:36:51.476 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[3f46edd1-d4c4-4324-81eb-6279a73e9088]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:36:51 np0005588920 systemd-udevd[310172]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:36:51 np0005588920 ovn_controller[133971]: 2026-01-20T15:36:51Z|00972|binding|INFO|Setting lport 5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140 ovn-installed in OVS
Jan 20 10:36:51 np0005588920 ovn_controller[133971]: 2026-01-20T15:36:51Z|00973|binding|INFO|Setting lport 5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140 up in Southbound
Jan 20 10:36:51 np0005588920 nova_compute[226886]: 2026-01-20 15:36:51.485 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:51 np0005588920 NetworkManager[49076]: <info>  [1768923411.4990] device (tap5d9bd6b8-e9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:36:51 np0005588920 NetworkManager[49076]: <info>  [1768923411.5000] device (tap5d9bd6b8-e9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:36:51.506 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[22221313-7507-4d88-bf2b-118cd7c3d10b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:36:51 np0005588920 systemd-udevd[310177]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:36:51.511 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0b459a69-2621-4aa3-9537-92380b9b46f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:36:51 np0005588920 NetworkManager[49076]: <info>  [1768923411.5131] manager: (tapce71b376-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/458)
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:36:51.545 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[b06f0958-7ffa-4c52-8055-7a9d7a829b40]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:36:51.549 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[5614d0bb-05a0-4c6a-8118-0769102cba1f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:36:51 np0005588920 NetworkManager[49076]: <info>  [1768923411.5745] device (tapce71b376-f0): carrier: link connected
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:36:51.580 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[619dc96e-ced3-4a48-9556-6547202a8a0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:36:51.602 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[8015fe16-6b9a-44b7-90d3-9ffcbb9d63df]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapce71b376-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:92:63:ca'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 304], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 845326, 'reachable_time': 40212, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 310202, 'error': None, 'target': 'ovnmeta-ce71b376-fc91-4f6b-9838-8ea300ca70de', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:36:51.620 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0e079797-1075-4619-ac15-bf9c9d52ce95]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe92:63ca'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 845326, 'tstamp': 845326}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 310203, 'error': None, 'target': 'ovnmeta-ce71b376-fc91-4f6b-9838-8ea300ca70de', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:36:51.642 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c2ad40ff-5417-4211-9038-ca3f14ad04e5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapce71b376-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:92:63:ca'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 304], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 845326, 'reachable_time': 40212, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 310204, 'error': None, 'target': 'ovnmeta-ce71b376-fc91-4f6b-9838-8ea300ca70de', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:36:51.680 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[27b66906-30f8-48bd-8951-38ff1781213b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:36:51.739 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[1591595a-aec7-4885-b648-f65ece864ff9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:36:51.742 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapce71b376-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:36:51.742 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:36:51.742 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapce71b376-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:36:51 np0005588920 nova_compute[226886]: 2026-01-20 15:36:51.745 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:51 np0005588920 kernel: tapce71b376-f0: entered promiscuous mode
Jan 20 10:36:51 np0005588920 NetworkManager[49076]: <info>  [1768923411.7468] manager: (tapce71b376-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/459)
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:36:51.750 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapce71b376-f0, col_values=(('external_ids', {'iface-id': 'e939c0cd-c70f-4392-99e6-adb0b7314e89'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:36:51 np0005588920 nova_compute[226886]: 2026-01-20 15:36:51.752 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:51 np0005588920 ovn_controller[133971]: 2026-01-20T15:36:51Z|00974|binding|INFO|Releasing lport e939c0cd-c70f-4392-99e6-adb0b7314e89 from this chassis (sb_readonly=0)
Jan 20 10:36:51 np0005588920 nova_compute[226886]: 2026-01-20 15:36:51.753 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:36:51.754 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ce71b376-fc91-4f6b-9838-8ea300ca70de.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ce71b376-fc91-4f6b-9838-8ea300ca70de.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:36:51.756 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[50f6db5b-1ac0-47c9-8bab-b4490de9cc10]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:36:51.757 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-ce71b376-fc91-4f6b-9838-8ea300ca70de
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/ce71b376-fc91-4f6b-9838-8ea300ca70de.pid.haproxy
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID ce71b376-fc91-4f6b-9838-8ea300ca70de
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:36:51 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:36:51.758 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ce71b376-fc91-4f6b-9838-8ea300ca70de', 'env', 'PROCESS_TAG=haproxy-ce71b376-fc91-4f6b-9838-8ea300ca70de', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ce71b376-fc91-4f6b-9838-8ea300ca70de.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:36:51 np0005588920 nova_compute[226886]: 2026-01-20 15:36:51.766 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:51 np0005588920 nova_compute[226886]: 2026-01-20 15:36:51.928 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768923411.9275436, 448068ae-e12d-44db-be1e-aab18ec6bf69 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:36:51 np0005588920 nova_compute[226886]: 2026-01-20 15:36:51.928 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] VM Started (Lifecycle Event)#033[00m
Jan 20 10:36:51 np0005588920 nova_compute[226886]: 2026-01-20 15:36:51.955 226890 DEBUG nova.compute.manager [req-ebdb8e1d-0f50-4cdb-8d87-a3f4216e0198 req-f24f557c-e0cf-4b71-8178-469eeb7bbe04 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Received event network-vif-plugged-5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:36:51 np0005588920 nova_compute[226886]: 2026-01-20 15:36:51.956 226890 DEBUG oslo_concurrency.lockutils [req-ebdb8e1d-0f50-4cdb-8d87-a3f4216e0198 req-f24f557c-e0cf-4b71-8178-469eeb7bbe04 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "448068ae-e12d-44db-be1e-aab18ec6bf69-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:36:51 np0005588920 nova_compute[226886]: 2026-01-20 15:36:51.956 226890 DEBUG oslo_concurrency.lockutils [req-ebdb8e1d-0f50-4cdb-8d87-a3f4216e0198 req-f24f557c-e0cf-4b71-8178-469eeb7bbe04 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "448068ae-e12d-44db-be1e-aab18ec6bf69-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:36:51 np0005588920 nova_compute[226886]: 2026-01-20 15:36:51.956 226890 DEBUG oslo_concurrency.lockutils [req-ebdb8e1d-0f50-4cdb-8d87-a3f4216e0198 req-f24f557c-e0cf-4b71-8178-469eeb7bbe04 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "448068ae-e12d-44db-be1e-aab18ec6bf69-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:36:51 np0005588920 nova_compute[226886]: 2026-01-20 15:36:51.956 226890 DEBUG nova.compute.manager [req-ebdb8e1d-0f50-4cdb-8d87-a3f4216e0198 req-f24f557c-e0cf-4b71-8178-469eeb7bbe04 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Processing event network-vif-plugged-5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 10:36:51 np0005588920 nova_compute[226886]: 2026-01-20 15:36:51.957 226890 DEBUG nova.compute.manager [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:36:51 np0005588920 nova_compute[226886]: 2026-01-20 15:36:51.965 226890 DEBUG nova.virt.libvirt.driver [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:36:51 np0005588920 nova_compute[226886]: 2026-01-20 15:36:51.969 226890 INFO nova.virt.libvirt.driver [-] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Instance spawned successfully.#033[00m
Jan 20 10:36:51 np0005588920 nova_compute[226886]: 2026-01-20 15:36:51.970 226890 DEBUG nova.virt.libvirt.driver [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 10:36:51 np0005588920 nova_compute[226886]: 2026-01-20 15:36:51.972 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:36:51 np0005588920 nova_compute[226886]: 2026-01-20 15:36:51.979 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:36:52 np0005588920 nova_compute[226886]: 2026-01-20 15:36:52.003 226890 DEBUG nova.virt.libvirt.driver [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:36:52 np0005588920 nova_compute[226886]: 2026-01-20 15:36:52.004 226890 DEBUG nova.virt.libvirt.driver [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:36:52 np0005588920 nova_compute[226886]: 2026-01-20 15:36:52.004 226890 DEBUG nova.virt.libvirt.driver [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:36:52 np0005588920 nova_compute[226886]: 2026-01-20 15:36:52.005 226890 DEBUG nova.virt.libvirt.driver [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:36:52 np0005588920 nova_compute[226886]: 2026-01-20 15:36:52.005 226890 DEBUG nova.virt.libvirt.driver [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:36:52 np0005588920 nova_compute[226886]: 2026-01-20 15:36:52.005 226890 DEBUG nova.virt.libvirt.driver [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:36:52 np0005588920 nova_compute[226886]: 2026-01-20 15:36:52.010 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:36:52 np0005588920 nova_compute[226886]: 2026-01-20 15:36:52.010 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768923411.927641, 448068ae-e12d-44db-be1e-aab18ec6bf69 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:36:52 np0005588920 nova_compute[226886]: 2026-01-20 15:36:52.010 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:36:52 np0005588920 nova_compute[226886]: 2026-01-20 15:36:52.049 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:36:52 np0005588920 nova_compute[226886]: 2026-01-20 15:36:52.052 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768923411.9651344, 448068ae-e12d-44db-be1e-aab18ec6bf69 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:36:52 np0005588920 nova_compute[226886]: 2026-01-20 15:36:52.052 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:36:52 np0005588920 nova_compute[226886]: 2026-01-20 15:36:52.074 226890 INFO nova.compute.manager [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Took 7.59 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 10:36:52 np0005588920 nova_compute[226886]: 2026-01-20 15:36:52.075 226890 DEBUG nova.compute.manager [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:36:52 np0005588920 nova_compute[226886]: 2026-01-20 15:36:52.076 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:36:52 np0005588920 nova_compute[226886]: 2026-01-20 15:36:52.082 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:36:52 np0005588920 podman[310275]: 2026-01-20 15:36:52.123528584 +0000 UTC m=+0.045889563 container create 5630d8f7e3ab2318e932c0765668f740c4ee4fe84d759770a04b66c5caa6a9b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce71b376-fc91-4f6b-9838-8ea300ca70de, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 20 10:36:52 np0005588920 nova_compute[226886]: 2026-01-20 15:36:52.134 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:36:52 np0005588920 nova_compute[226886]: 2026-01-20 15:36:52.169 226890 INFO nova.compute.manager [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Took 8.74 seconds to build instance.#033[00m
Jan 20 10:36:52 np0005588920 systemd[1]: Started libpod-conmon-5630d8f7e3ab2318e932c0765668f740c4ee4fe84d759770a04b66c5caa6a9b4.scope.
Jan 20 10:36:52 np0005588920 nova_compute[226886]: 2026-01-20 15:36:52.195 226890 DEBUG oslo_concurrency.lockutils [None req-50fa340c-7f89-47ce-96b0-f15a2ac3a91b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "448068ae-e12d-44db-be1e-aab18ec6bf69" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.823s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:36:52 np0005588920 podman[310275]: 2026-01-20 15:36:52.099955475 +0000 UTC m=+0.022316564 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:36:52 np0005588920 systemd[1]: Started libcrun container.
Jan 20 10:36:52 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9a2676f2af25b8f8516cc73ff1c3f6c57fb339e96e92a35e46127b62ebdf7cb0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:36:52 np0005588920 podman[310275]: 2026-01-20 15:36:52.233253377 +0000 UTC m=+0.155614376 container init 5630d8f7e3ab2318e932c0765668f740c4ee4fe84d759770a04b66c5caa6a9b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce71b376-fc91-4f6b-9838-8ea300ca70de, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 20 10:36:52 np0005588920 podman[310275]: 2026-01-20 15:36:52.2386332 +0000 UTC m=+0.160994179 container start 5630d8f7e3ab2318e932c0765668f740c4ee4fe84d759770a04b66c5caa6a9b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce71b376-fc91-4f6b-9838-8ea300ca70de, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:36:52 np0005588920 neutron-haproxy-ovnmeta-ce71b376-fc91-4f6b-9838-8ea300ca70de[310290]: [NOTICE]   (310294) : New worker (310296) forked
Jan 20 10:36:52 np0005588920 neutron-haproxy-ovnmeta-ce71b376-fc91-4f6b-9838-8ea300ca70de[310290]: [NOTICE]   (310294) : Loading success.
Jan 20 10:36:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:52.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:36:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:52.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:36:54 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:36:54 np0005588920 nova_compute[226886]: 2026-01-20 15:36:54.062 226890 DEBUG nova.compute.manager [req-f6b55cf6-6585-4e31-bd10-d6663235541f req-16bfa7b5-6c7f-4b34-998a-d4e45a79e38e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Received event network-vif-plugged-5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:36:54 np0005588920 nova_compute[226886]: 2026-01-20 15:36:54.063 226890 DEBUG oslo_concurrency.lockutils [req-f6b55cf6-6585-4e31-bd10-d6663235541f req-16bfa7b5-6c7f-4b34-998a-d4e45a79e38e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "448068ae-e12d-44db-be1e-aab18ec6bf69-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:36:54 np0005588920 nova_compute[226886]: 2026-01-20 15:36:54.063 226890 DEBUG oslo_concurrency.lockutils [req-f6b55cf6-6585-4e31-bd10-d6663235541f req-16bfa7b5-6c7f-4b34-998a-d4e45a79e38e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "448068ae-e12d-44db-be1e-aab18ec6bf69-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:36:54 np0005588920 nova_compute[226886]: 2026-01-20 15:36:54.063 226890 DEBUG oslo_concurrency.lockutils [req-f6b55cf6-6585-4e31-bd10-d6663235541f req-16bfa7b5-6c7f-4b34-998a-d4e45a79e38e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "448068ae-e12d-44db-be1e-aab18ec6bf69-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:36:54 np0005588920 nova_compute[226886]: 2026-01-20 15:36:54.064 226890 DEBUG nova.compute.manager [req-f6b55cf6-6585-4e31-bd10-d6663235541f req-16bfa7b5-6c7f-4b34-998a-d4e45a79e38e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] No waiting events found dispatching network-vif-plugged-5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:36:54 np0005588920 nova_compute[226886]: 2026-01-20 15:36:54.064 226890 WARNING nova.compute.manager [req-f6b55cf6-6585-4e31-bd10-d6663235541f req-16bfa7b5-6c7f-4b34-998a-d4e45a79e38e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Received unexpected event network-vif-plugged-5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140 for instance with vm_state active and task_state None.#033[00m
Jan 20 10:36:54 np0005588920 nova_compute[226886]: 2026-01-20 15:36:54.266 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:36:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:54.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:36:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:54.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:55 np0005588920 nova_compute[226886]: 2026-01-20 15:36:55.238 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:55 np0005588920 nova_compute[226886]: 2026-01-20 15:36:55.839 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:55 np0005588920 NetworkManager[49076]: <info>  [1768923415.8397] manager: (patch-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/460)
Jan 20 10:36:55 np0005588920 NetworkManager[49076]: <info>  [1768923415.8413] manager: (patch-br-int-to-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/461)
Jan 20 10:36:55 np0005588920 nova_compute[226886]: 2026-01-20 15:36:55.842 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:55 np0005588920 nova_compute[226886]: 2026-01-20 15:36:55.844 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:55 np0005588920 ovn_controller[133971]: 2026-01-20T15:36:55Z|00975|binding|INFO|Releasing lport e939c0cd-c70f-4392-99e6-adb0b7314e89 from this chassis (sb_readonly=0)
Jan 20 10:36:55 np0005588920 ovn_controller[133971]: 2026-01-20T15:36:55Z|00976|binding|INFO|Releasing lport e939c0cd-c70f-4392-99e6-adb0b7314e89 from this chassis (sb_readonly=0)
Jan 20 10:36:56 np0005588920 nova_compute[226886]: 2026-01-20 15:36:56.203 226890 DEBUG nova.compute.manager [req-1160a94e-2e8f-4051-9615-9b6be4ac1ed6 req-656c4857-ca08-49c6-97f6-c00c8f8a7d98 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Received event network-changed-5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:36:56 np0005588920 nova_compute[226886]: 2026-01-20 15:36:56.204 226890 DEBUG nova.compute.manager [req-1160a94e-2e8f-4051-9615-9b6be4ac1ed6 req-656c4857-ca08-49c6-97f6-c00c8f8a7d98 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Refreshing instance network info cache due to event network-changed-5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:36:56 np0005588920 nova_compute[226886]: 2026-01-20 15:36:56.204 226890 DEBUG oslo_concurrency.lockutils [req-1160a94e-2e8f-4051-9615-9b6be4ac1ed6 req-656c4857-ca08-49c6-97f6-c00c8f8a7d98 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-448068ae-e12d-44db-be1e-aab18ec6bf69" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:36:56 np0005588920 nova_compute[226886]: 2026-01-20 15:36:56.204 226890 DEBUG oslo_concurrency.lockutils [req-1160a94e-2e8f-4051-9615-9b6be4ac1ed6 req-656c4857-ca08-49c6-97f6-c00c8f8a7d98 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-448068ae-e12d-44db-be1e-aab18ec6bf69" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:36:56 np0005588920 nova_compute[226886]: 2026-01-20 15:36:56.205 226890 DEBUG nova.network.neutron [req-1160a94e-2e8f-4051-9615-9b6be4ac1ed6 req-656c4857-ca08-49c6-97f6-c00c8f8a7d98 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Refreshing network info cache for port 5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:36:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:56.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:56.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:57 np0005588920 nova_compute[226886]: 2026-01-20 15:36:57.428 226890 DEBUG nova.network.neutron [req-1160a94e-2e8f-4051-9615-9b6be4ac1ed6 req-656c4857-ca08-49c6-97f6-c00c8f8a7d98 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Updated VIF entry in instance network info cache for port 5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:36:57 np0005588920 nova_compute[226886]: 2026-01-20 15:36:57.429 226890 DEBUG nova.network.neutron [req-1160a94e-2e8f-4051-9615-9b6be4ac1ed6 req-656c4857-ca08-49c6-97f6-c00c8f8a7d98 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Updating instance_info_cache with network_info: [{"id": "5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140", "address": "fa:16:3e:bf:54:11", "network": {"id": "ce71b376-fc91-4f6b-9838-8ea300ca70de", "bridge": "br-int", "label": "tempest-network-smoke--315983280", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d9bd6b8-e9", "ovs_interfaceid": "5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:36:57 np0005588920 nova_compute[226886]: 2026-01-20 15:36:57.455 226890 DEBUG oslo_concurrency.lockutils [req-1160a94e-2e8f-4051-9615-9b6be4ac1ed6 req-656c4857-ca08-49c6-97f6-c00c8f8a7d98 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-448068ae-e12d-44db-be1e-aab18ec6bf69" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:36:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:36:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:58.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:36:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:36:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:36:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:58.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:36:59 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:36:59 np0005588920 nova_compute[226886]: 2026-01-20 15:36:59.268 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:00 np0005588920 nova_compute[226886]: 2026-01-20 15:37:00.240 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:00.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:00.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:37:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:02.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:37:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:37:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:02.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:37:04 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:37:04 np0005588920 nova_compute[226886]: 2026-01-20 15:37:04.271 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:04.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:37:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:04.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:37:05 np0005588920 nova_compute[226886]: 2026-01-20 15:37:05.243 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:05 np0005588920 ovn_controller[133971]: 2026-01-20T15:37:05Z|00137|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:bf:54:11 10.100.0.13
Jan 20 10:37:05 np0005588920 ovn_controller[133971]: 2026-01-20T15:37:05Z|00138|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:bf:54:11 10.100.0.13
Jan 20 10:37:06 np0005588920 podman[310306]: 2026-01-20 15:37:06.073277665 +0000 UTC m=+0.148107563 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:37:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:37:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:06.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:37:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:37:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:06.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:37:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:08.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:37:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:08.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:37:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:37:09 np0005588920 nova_compute[226886]: 2026-01-20 15:37:09.274 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:10 np0005588920 nova_compute[226886]: 2026-01-20 15:37:10.245 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:37:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:10.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:37:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:10.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:37:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:12.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:37:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:37:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:12.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:37:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:37:14 np0005588920 nova_compute[226886]: 2026-01-20 15:37:14.277 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:14.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:14.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:14 np0005588920 podman[310335]: 2026-01-20 15:37:14.970061296 +0000 UTC m=+0.053994523 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 20 10:37:15 np0005588920 nova_compute[226886]: 2026-01-20 15:37:15.288 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:37:16.501 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:37:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:37:16.502 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:37:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:37:16.502 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:37:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:37:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:16.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:37:16 np0005588920 nova_compute[226886]: 2026-01-20 15:37:16.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:37:16 np0005588920 nova_compute[226886]: 2026-01-20 15:37:16.747 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:37:16 np0005588920 nova_compute[226886]: 2026-01-20 15:37:16.747 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:37:16 np0005588920 nova_compute[226886]: 2026-01-20 15:37:16.747 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:37:16 np0005588920 nova_compute[226886]: 2026-01-20 15:37:16.747 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:37:16 np0005588920 nova_compute[226886]: 2026-01-20 15:37:16.748 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:37:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:16.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:37:17 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2535838283' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:37:17 np0005588920 nova_compute[226886]: 2026-01-20 15:37:17.209 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:37:17 np0005588920 nova_compute[226886]: 2026-01-20 15:37:17.319 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-000000d5 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:37:17 np0005588920 nova_compute[226886]: 2026-01-20 15:37:17.319 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-000000d5 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:37:17 np0005588920 nova_compute[226886]: 2026-01-20 15:37:17.456 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:37:17 np0005588920 nova_compute[226886]: 2026-01-20 15:37:17.457 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3901MB free_disk=20.921852111816406GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:37:17 np0005588920 nova_compute[226886]: 2026-01-20 15:37:17.457 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:37:17 np0005588920 nova_compute[226886]: 2026-01-20 15:37:17.457 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:37:17 np0005588920 nova_compute[226886]: 2026-01-20 15:37:17.551 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance 448068ae-e12d-44db-be1e-aab18ec6bf69 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:37:17 np0005588920 nova_compute[226886]: 2026-01-20 15:37:17.551 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:37:17 np0005588920 nova_compute[226886]: 2026-01-20 15:37:17.551 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:37:17 np0005588920 nova_compute[226886]: 2026-01-20 15:37:17.665 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:37:18 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:37:18 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1014966892' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:37:18 np0005588920 nova_compute[226886]: 2026-01-20 15:37:18.090 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.426s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:37:18 np0005588920 nova_compute[226886]: 2026-01-20 15:37:18.097 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:37:18 np0005588920 nova_compute[226886]: 2026-01-20 15:37:18.117 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:37:18 np0005588920 nova_compute[226886]: 2026-01-20 15:37:18.134 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:37:18 np0005588920 nova_compute[226886]: 2026-01-20 15:37:18.135 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.677s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:37:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:37:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:18.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:37:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:18.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:19 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:37:19 np0005588920 nova_compute[226886]: 2026-01-20 15:37:19.278 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:19 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:37:19 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:37:19 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:37:20 np0005588920 nova_compute[226886]: 2026-01-20 15:37:20.290 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:20.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:20.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:22.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:22.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:23 np0005588920 nova_compute[226886]: 2026-01-20 15:37:23.135 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:37:23 np0005588920 nova_compute[226886]: 2026-01-20 15:37:23.135 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:37:23 np0005588920 nova_compute[226886]: 2026-01-20 15:37:23.135 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:37:23 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #59. Immutable memtables: 15.
Jan 20 10:37:23 np0005588920 nova_compute[226886]: 2026-01-20 15:37:23.668 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "refresh_cache-448068ae-e12d-44db-be1e-aab18ec6bf69" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:37:23 np0005588920 nova_compute[226886]: 2026-01-20 15:37:23.669 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquired lock "refresh_cache-448068ae-e12d-44db-be1e-aab18ec6bf69" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:37:23 np0005588920 nova_compute[226886]: 2026-01-20 15:37:23.669 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 10:37:23 np0005588920 nova_compute[226886]: 2026-01-20 15:37:23.669 226890 DEBUG nova.objects.instance [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 448068ae-e12d-44db-be1e-aab18ec6bf69 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:37:24 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:37:24 np0005588920 nova_compute[226886]: 2026-01-20 15:37:24.279 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:24.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:24.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:25 np0005588920 nova_compute[226886]: 2026-01-20 15:37:25.129 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Updating instance_info_cache with network_info: [{"id": "5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140", "address": "fa:16:3e:bf:54:11", "network": {"id": "ce71b376-fc91-4f6b-9838-8ea300ca70de", "bridge": "br-int", "label": "tempest-network-smoke--315983280", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d9bd6b8-e9", "ovs_interfaceid": "5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:37:25 np0005588920 nova_compute[226886]: 2026-01-20 15:37:25.157 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Releasing lock "refresh_cache-448068ae-e12d-44db-be1e-aab18ec6bf69" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:37:25 np0005588920 nova_compute[226886]: 2026-01-20 15:37:25.157 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 10:37:25 np0005588920 nova_compute[226886]: 2026-01-20 15:37:25.158 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:37:25 np0005588920 nova_compute[226886]: 2026-01-20 15:37:25.158 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:37:25 np0005588920 nova_compute[226886]: 2026-01-20 15:37:25.293 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:26 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:37:26 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:37:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:26.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:26.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:27 np0005588920 nova_compute[226886]: 2026-01-20 15:37:27.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:37:27 np0005588920 nova_compute[226886]: 2026-01-20 15:37:27.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:37:27 np0005588920 nova_compute[226886]: 2026-01-20 15:37:27.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:37:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:28.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:28.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:37:29 np0005588920 nova_compute[226886]: 2026-01-20 15:37:29.281 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:30 np0005588920 nova_compute[226886]: 2026-01-20 15:37:30.340 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:30.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:30 np0005588920 nova_compute[226886]: 2026-01-20 15:37:30.721 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:37:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:30.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:31 np0005588920 nova_compute[226886]: 2026-01-20 15:37:31.040 226890 INFO nova.compute.manager [None req-e883c4fe-1403-40ba-a6d3-8b7ca0e3e053 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Get console output#033[00m
Jan 20 10:37:31 np0005588920 nova_compute[226886]: 2026-01-20 15:37:31.046 260344 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 20 10:37:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:37:31.896 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=82, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=81) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:37:31 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:37:31.897 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:37:31 np0005588920 nova_compute[226886]: 2026-01-20 15:37:31.898 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:32 np0005588920 nova_compute[226886]: 2026-01-20 15:37:32.209 226890 DEBUG nova.compute.manager [req-1d90485a-9df2-4f0f-b436-f99c2b544205 req-bbe3793a-c68e-4d76-9ebd-c7600da411f9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Received event network-changed-5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:37:32 np0005588920 nova_compute[226886]: 2026-01-20 15:37:32.209 226890 DEBUG nova.compute.manager [req-1d90485a-9df2-4f0f-b436-f99c2b544205 req-bbe3793a-c68e-4d76-9ebd-c7600da411f9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Refreshing instance network info cache due to event network-changed-5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:37:32 np0005588920 nova_compute[226886]: 2026-01-20 15:37:32.209 226890 DEBUG oslo_concurrency.lockutils [req-1d90485a-9df2-4f0f-b436-f99c2b544205 req-bbe3793a-c68e-4d76-9ebd-c7600da411f9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-448068ae-e12d-44db-be1e-aab18ec6bf69" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:37:32 np0005588920 nova_compute[226886]: 2026-01-20 15:37:32.210 226890 DEBUG oslo_concurrency.lockutils [req-1d90485a-9df2-4f0f-b436-f99c2b544205 req-bbe3793a-c68e-4d76-9ebd-c7600da411f9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-448068ae-e12d-44db-be1e-aab18ec6bf69" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:37:32 np0005588920 nova_compute[226886]: 2026-01-20 15:37:32.210 226890 DEBUG nova.network.neutron [req-1d90485a-9df2-4f0f-b436-f99c2b544205 req-bbe3793a-c68e-4d76-9ebd-c7600da411f9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Refreshing network info cache for port 5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:37:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:32.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:37:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:32.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:37:33 np0005588920 nova_compute[226886]: 2026-01-20 15:37:33.239 226890 INFO nova.compute.manager [None req-16a1d5a9-43d1-4a21-b79c-a604ab229a9a 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Get console output#033[00m
Jan 20 10:37:33 np0005588920 nova_compute[226886]: 2026-01-20 15:37:33.244 260344 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 20 10:37:33 np0005588920 nova_compute[226886]: 2026-01-20 15:37:33.485 226890 DEBUG nova.network.neutron [req-1d90485a-9df2-4f0f-b436-f99c2b544205 req-bbe3793a-c68e-4d76-9ebd-c7600da411f9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Updated VIF entry in instance network info cache for port 5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:37:33 np0005588920 nova_compute[226886]: 2026-01-20 15:37:33.486 226890 DEBUG nova.network.neutron [req-1d90485a-9df2-4f0f-b436-f99c2b544205 req-bbe3793a-c68e-4d76-9ebd-c7600da411f9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Updating instance_info_cache with network_info: [{"id": "5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140", "address": "fa:16:3e:bf:54:11", "network": {"id": "ce71b376-fc91-4f6b-9838-8ea300ca70de", "bridge": "br-int", "label": "tempest-network-smoke--315983280", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d9bd6b8-e9", "ovs_interfaceid": "5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:37:33 np0005588920 nova_compute[226886]: 2026-01-20 15:37:33.506 226890 DEBUG oslo_concurrency.lockutils [req-1d90485a-9df2-4f0f-b436-f99c2b544205 req-bbe3793a-c68e-4d76-9ebd-c7600da411f9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-448068ae-e12d-44db-be1e-aab18ec6bf69" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:37:33 np0005588920 nova_compute[226886]: 2026-01-20 15:37:33.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:37:33 np0005588920 nova_compute[226886]: 2026-01-20 15:37:33.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:37:34 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:37:34 np0005588920 nova_compute[226886]: 2026-01-20 15:37:34.283 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:34 np0005588920 nova_compute[226886]: 2026-01-20 15:37:34.337 226890 DEBUG nova.compute.manager [req-07818fa8-8746-4928-89e7-a93a01c1f2d3 req-8b543608-6930-4eda-b9ea-28d03873d08f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Received event network-vif-unplugged-5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:37:34 np0005588920 nova_compute[226886]: 2026-01-20 15:37:34.337 226890 DEBUG oslo_concurrency.lockutils [req-07818fa8-8746-4928-89e7-a93a01c1f2d3 req-8b543608-6930-4eda-b9ea-28d03873d08f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "448068ae-e12d-44db-be1e-aab18ec6bf69-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:37:34 np0005588920 nova_compute[226886]: 2026-01-20 15:37:34.337 226890 DEBUG oslo_concurrency.lockutils [req-07818fa8-8746-4928-89e7-a93a01c1f2d3 req-8b543608-6930-4eda-b9ea-28d03873d08f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "448068ae-e12d-44db-be1e-aab18ec6bf69-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:37:34 np0005588920 nova_compute[226886]: 2026-01-20 15:37:34.338 226890 DEBUG oslo_concurrency.lockutils [req-07818fa8-8746-4928-89e7-a93a01c1f2d3 req-8b543608-6930-4eda-b9ea-28d03873d08f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "448068ae-e12d-44db-be1e-aab18ec6bf69-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:37:34 np0005588920 nova_compute[226886]: 2026-01-20 15:37:34.338 226890 DEBUG nova.compute.manager [req-07818fa8-8746-4928-89e7-a93a01c1f2d3 req-8b543608-6930-4eda-b9ea-28d03873d08f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] No waiting events found dispatching network-vif-unplugged-5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:37:34 np0005588920 nova_compute[226886]: 2026-01-20 15:37:34.338 226890 WARNING nova.compute.manager [req-07818fa8-8746-4928-89e7-a93a01c1f2d3 req-8b543608-6930-4eda-b9ea-28d03873d08f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Received unexpected event network-vif-unplugged-5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140 for instance with vm_state active and task_state None.#033[00m
Jan 20 10:37:34 np0005588920 nova_compute[226886]: 2026-01-20 15:37:34.338 226890 DEBUG nova.compute.manager [req-07818fa8-8746-4928-89e7-a93a01c1f2d3 req-8b543608-6930-4eda-b9ea-28d03873d08f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Received event network-vif-plugged-5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:37:34 np0005588920 nova_compute[226886]: 2026-01-20 15:37:34.338 226890 DEBUG oslo_concurrency.lockutils [req-07818fa8-8746-4928-89e7-a93a01c1f2d3 req-8b543608-6930-4eda-b9ea-28d03873d08f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "448068ae-e12d-44db-be1e-aab18ec6bf69-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:37:34 np0005588920 nova_compute[226886]: 2026-01-20 15:37:34.339 226890 DEBUG oslo_concurrency.lockutils [req-07818fa8-8746-4928-89e7-a93a01c1f2d3 req-8b543608-6930-4eda-b9ea-28d03873d08f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "448068ae-e12d-44db-be1e-aab18ec6bf69-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:37:34 np0005588920 nova_compute[226886]: 2026-01-20 15:37:34.339 226890 DEBUG oslo_concurrency.lockutils [req-07818fa8-8746-4928-89e7-a93a01c1f2d3 req-8b543608-6930-4eda-b9ea-28d03873d08f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "448068ae-e12d-44db-be1e-aab18ec6bf69-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:37:34 np0005588920 nova_compute[226886]: 2026-01-20 15:37:34.339 226890 DEBUG nova.compute.manager [req-07818fa8-8746-4928-89e7-a93a01c1f2d3 req-8b543608-6930-4eda-b9ea-28d03873d08f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] No waiting events found dispatching network-vif-plugged-5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:37:34 np0005588920 nova_compute[226886]: 2026-01-20 15:37:34.339 226890 WARNING nova.compute.manager [req-07818fa8-8746-4928-89e7-a93a01c1f2d3 req-8b543608-6930-4eda-b9ea-28d03873d08f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Received unexpected event network-vif-plugged-5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140 for instance with vm_state active and task_state None.#033[00m
Jan 20 10:37:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:34.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:34.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:35 np0005588920 nova_compute[226886]: 2026-01-20 15:37:35.344 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:35 np0005588920 nova_compute[226886]: 2026-01-20 15:37:35.756 226890 DEBUG nova.compute.manager [req-a7144ec6-e362-42cc-8b31-30e011131d19 req-b9cd1f30-d933-438f-986c-d8a4aca4ee13 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Received event network-changed-5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:37:35 np0005588920 nova_compute[226886]: 2026-01-20 15:37:35.756 226890 DEBUG nova.compute.manager [req-a7144ec6-e362-42cc-8b31-30e011131d19 req-b9cd1f30-d933-438f-986c-d8a4aca4ee13 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Refreshing instance network info cache due to event network-changed-5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:37:35 np0005588920 nova_compute[226886]: 2026-01-20 15:37:35.756 226890 DEBUG oslo_concurrency.lockutils [req-a7144ec6-e362-42cc-8b31-30e011131d19 req-b9cd1f30-d933-438f-986c-d8a4aca4ee13 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-448068ae-e12d-44db-be1e-aab18ec6bf69" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:37:35 np0005588920 nova_compute[226886]: 2026-01-20 15:37:35.756 226890 DEBUG oslo_concurrency.lockutils [req-a7144ec6-e362-42cc-8b31-30e011131d19 req-b9cd1f30-d933-438f-986c-d8a4aca4ee13 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-448068ae-e12d-44db-be1e-aab18ec6bf69" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:37:35 np0005588920 nova_compute[226886]: 2026-01-20 15:37:35.757 226890 DEBUG nova.network.neutron [req-a7144ec6-e362-42cc-8b31-30e011131d19 req-b9cd1f30-d933-438f-986c-d8a4aca4ee13 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Refreshing network info cache for port 5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:37:35 np0005588920 nova_compute[226886]: 2026-01-20 15:37:35.859 226890 INFO nova.compute.manager [None req-ec1f4f79-43b7-46c9-961e-f05f2741f8d0 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Get console output#033[00m
Jan 20 10:37:35 np0005588920 nova_compute[226886]: 2026-01-20 15:37:35.864 260344 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 20 10:37:36 np0005588920 nova_compute[226886]: 2026-01-20 15:37:36.450 226890 DEBUG nova.compute.manager [req-98f29dbd-6271-40fd-8822-4a399ce21efb req-17b98549-b41a-45bf-a76b-4aef28600f2f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Received event network-vif-plugged-5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:37:36 np0005588920 nova_compute[226886]: 2026-01-20 15:37:36.450 226890 DEBUG oslo_concurrency.lockutils [req-98f29dbd-6271-40fd-8822-4a399ce21efb req-17b98549-b41a-45bf-a76b-4aef28600f2f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "448068ae-e12d-44db-be1e-aab18ec6bf69-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:37:36 np0005588920 nova_compute[226886]: 2026-01-20 15:37:36.450 226890 DEBUG oslo_concurrency.lockutils [req-98f29dbd-6271-40fd-8822-4a399ce21efb req-17b98549-b41a-45bf-a76b-4aef28600f2f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "448068ae-e12d-44db-be1e-aab18ec6bf69-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:37:36 np0005588920 nova_compute[226886]: 2026-01-20 15:37:36.450 226890 DEBUG oslo_concurrency.lockutils [req-98f29dbd-6271-40fd-8822-4a399ce21efb req-17b98549-b41a-45bf-a76b-4aef28600f2f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "448068ae-e12d-44db-be1e-aab18ec6bf69-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:37:36 np0005588920 nova_compute[226886]: 2026-01-20 15:37:36.451 226890 DEBUG nova.compute.manager [req-98f29dbd-6271-40fd-8822-4a399ce21efb req-17b98549-b41a-45bf-a76b-4aef28600f2f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] No waiting events found dispatching network-vif-plugged-5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:37:36 np0005588920 nova_compute[226886]: 2026-01-20 15:37:36.451 226890 WARNING nova.compute.manager [req-98f29dbd-6271-40fd-8822-4a399ce21efb req-17b98549-b41a-45bf-a76b-4aef28600f2f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Received unexpected event network-vif-plugged-5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140 for instance with vm_state active and task_state None.#033[00m
Jan 20 10:37:36 np0005588920 nova_compute[226886]: 2026-01-20 15:37:36.451 226890 DEBUG nova.compute.manager [req-98f29dbd-6271-40fd-8822-4a399ce21efb req-17b98549-b41a-45bf-a76b-4aef28600f2f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Received event network-vif-plugged-5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:37:36 np0005588920 nova_compute[226886]: 2026-01-20 15:37:36.451 226890 DEBUG oslo_concurrency.lockutils [req-98f29dbd-6271-40fd-8822-4a399ce21efb req-17b98549-b41a-45bf-a76b-4aef28600f2f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "448068ae-e12d-44db-be1e-aab18ec6bf69-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:37:36 np0005588920 nova_compute[226886]: 2026-01-20 15:37:36.451 226890 DEBUG oslo_concurrency.lockutils [req-98f29dbd-6271-40fd-8822-4a399ce21efb req-17b98549-b41a-45bf-a76b-4aef28600f2f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "448068ae-e12d-44db-be1e-aab18ec6bf69-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:37:36 np0005588920 nova_compute[226886]: 2026-01-20 15:37:36.452 226890 DEBUG oslo_concurrency.lockutils [req-98f29dbd-6271-40fd-8822-4a399ce21efb req-17b98549-b41a-45bf-a76b-4aef28600f2f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "448068ae-e12d-44db-be1e-aab18ec6bf69-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:37:36 np0005588920 nova_compute[226886]: 2026-01-20 15:37:36.452 226890 DEBUG nova.compute.manager [req-98f29dbd-6271-40fd-8822-4a399ce21efb req-17b98549-b41a-45bf-a76b-4aef28600f2f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] No waiting events found dispatching network-vif-plugged-5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:37:36 np0005588920 nova_compute[226886]: 2026-01-20 15:37:36.452 226890 WARNING nova.compute.manager [req-98f29dbd-6271-40fd-8822-4a399ce21efb req-17b98549-b41a-45bf-a76b-4aef28600f2f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Received unexpected event network-vif-plugged-5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140 for instance with vm_state active and task_state None.#033[00m
Jan 20 10:37:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:36.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:36 np0005588920 nova_compute[226886]: 2026-01-20 15:37:36.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:37:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:36.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:36 np0005588920 podman[310582]: 2026-01-20 15:37:36.994418128 +0000 UTC m=+0.083651064 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Jan 20 10:37:38 np0005588920 nova_compute[226886]: 2026-01-20 15:37:38.549 226890 DEBUG nova.network.neutron [req-a7144ec6-e362-42cc-8b31-30e011131d19 req-b9cd1f30-d933-438f-986c-d8a4aca4ee13 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Updated VIF entry in instance network info cache for port 5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:37:38 np0005588920 nova_compute[226886]: 2026-01-20 15:37:38.549 226890 DEBUG nova.network.neutron [req-a7144ec6-e362-42cc-8b31-30e011131d19 req-b9cd1f30-d933-438f-986c-d8a4aca4ee13 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Updating instance_info_cache with network_info: [{"id": "5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140", "address": "fa:16:3e:bf:54:11", "network": {"id": "ce71b376-fc91-4f6b-9838-8ea300ca70de", "bridge": "br-int", "label": "tempest-network-smoke--315983280", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d9bd6b8-e9", "ovs_interfaceid": "5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:37:38 np0005588920 nova_compute[226886]: 2026-01-20 15:37:38.581 226890 DEBUG oslo_concurrency.lockutils [req-a7144ec6-e362-42cc-8b31-30e011131d19 req-b9cd1f30-d933-438f-986c-d8a4aca4ee13 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-448068ae-e12d-44db-be1e-aab18ec6bf69" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:37:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:37:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:38.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:37:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:38.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:39 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:37:39 np0005588920 nova_compute[226886]: 2026-01-20 15:37:39.284 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:40 np0005588920 nova_compute[226886]: 2026-01-20 15:37:40.345 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:40.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:40.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:40 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:37:40.899 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '82'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:37:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:42.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:42.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:43 np0005588920 nova_compute[226886]: 2026-01-20 15:37:43.258 226890 DEBUG nova.compute.manager [req-e26fb9fb-8cd3-44fb-a7bb-2ef829e19048 req-ebd2a376-71d7-47c2-a847-62956d4ba546 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Received event network-changed-5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:37:43 np0005588920 nova_compute[226886]: 2026-01-20 15:37:43.259 226890 DEBUG nova.compute.manager [req-e26fb9fb-8cd3-44fb-a7bb-2ef829e19048 req-ebd2a376-71d7-47c2-a847-62956d4ba546 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Refreshing instance network info cache due to event network-changed-5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:37:43 np0005588920 nova_compute[226886]: 2026-01-20 15:37:43.259 226890 DEBUG oslo_concurrency.lockutils [req-e26fb9fb-8cd3-44fb-a7bb-2ef829e19048 req-ebd2a376-71d7-47c2-a847-62956d4ba546 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-448068ae-e12d-44db-be1e-aab18ec6bf69" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:37:43 np0005588920 nova_compute[226886]: 2026-01-20 15:37:43.259 226890 DEBUG oslo_concurrency.lockutils [req-e26fb9fb-8cd3-44fb-a7bb-2ef829e19048 req-ebd2a376-71d7-47c2-a847-62956d4ba546 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-448068ae-e12d-44db-be1e-aab18ec6bf69" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:37:43 np0005588920 nova_compute[226886]: 2026-01-20 15:37:43.259 226890 DEBUG nova.network.neutron [req-e26fb9fb-8cd3-44fb-a7bb-2ef829e19048 req-ebd2a376-71d7-47c2-a847-62956d4ba546 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Refreshing network info cache for port 5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:37:43 np0005588920 nova_compute[226886]: 2026-01-20 15:37:43.375 226890 DEBUG oslo_concurrency.lockutils [None req-feb998bb-d6cc-4c3a-80c9-5d7f5e3a8b97 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "448068ae-e12d-44db-be1e-aab18ec6bf69" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:37:43 np0005588920 nova_compute[226886]: 2026-01-20 15:37:43.376 226890 DEBUG oslo_concurrency.lockutils [None req-feb998bb-d6cc-4c3a-80c9-5d7f5e3a8b97 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "448068ae-e12d-44db-be1e-aab18ec6bf69" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:37:43 np0005588920 nova_compute[226886]: 2026-01-20 15:37:43.376 226890 DEBUG oslo_concurrency.lockutils [None req-feb998bb-d6cc-4c3a-80c9-5d7f5e3a8b97 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "448068ae-e12d-44db-be1e-aab18ec6bf69-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:37:43 np0005588920 nova_compute[226886]: 2026-01-20 15:37:43.376 226890 DEBUG oslo_concurrency.lockutils [None req-feb998bb-d6cc-4c3a-80c9-5d7f5e3a8b97 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "448068ae-e12d-44db-be1e-aab18ec6bf69-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:37:43 np0005588920 nova_compute[226886]: 2026-01-20 15:37:43.376 226890 DEBUG oslo_concurrency.lockutils [None req-feb998bb-d6cc-4c3a-80c9-5d7f5e3a8b97 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "448068ae-e12d-44db-be1e-aab18ec6bf69-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:37:43 np0005588920 nova_compute[226886]: 2026-01-20 15:37:43.377 226890 INFO nova.compute.manager [None req-feb998bb-d6cc-4c3a-80c9-5d7f5e3a8b97 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Terminating instance#033[00m
Jan 20 10:37:43 np0005588920 nova_compute[226886]: 2026-01-20 15:37:43.378 226890 DEBUG nova.compute.manager [None req-feb998bb-d6cc-4c3a-80c9-5d7f5e3a8b97 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:37:43 np0005588920 kernel: tap5d9bd6b8-e9 (unregistering): left promiscuous mode
Jan 20 10:37:43 np0005588920 NetworkManager[49076]: <info>  [1768923463.4563] device (tap5d9bd6b8-e9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:37:43 np0005588920 nova_compute[226886]: 2026-01-20 15:37:43.471 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:43 np0005588920 ovn_controller[133971]: 2026-01-20T15:37:43Z|00977|binding|INFO|Releasing lport 5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140 from this chassis (sb_readonly=0)
Jan 20 10:37:43 np0005588920 ovn_controller[133971]: 2026-01-20T15:37:43Z|00978|binding|INFO|Setting lport 5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140 down in Southbound
Jan 20 10:37:43 np0005588920 ovn_controller[133971]: 2026-01-20T15:37:43Z|00979|binding|INFO|Removing iface tap5d9bd6b8-e9 ovn-installed in OVS
Jan 20 10:37:43 np0005588920 nova_compute[226886]: 2026-01-20 15:37:43.476 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:37:43.478 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bf:54:11 10.100.0.13'], port_security=['fa:16:3e:bf:54:11 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '448068ae-e12d-44db-be1e-aab18ec6bf69', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ce71b376-fc91-4f6b-9838-8ea300ca70de', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3168f57421fb49bfb94b85daedd1fe7d', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'ff5c7c17-408f-4158-a3de-418e7321dde0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=10fabbbe-46a8-4773-85b5-859f8d94e243, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:37:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:37:43.480 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140 in datapath ce71b376-fc91-4f6b-9838-8ea300ca70de unbound from our chassis#033[00m
Jan 20 10:37:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:37:43.481 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ce71b376-fc91-4f6b-9838-8ea300ca70de, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:37:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:37:43.482 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0e0d14c0-bb4c-4a60-b63e-3851340cbff5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:37:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:37:43.483 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ce71b376-fc91-4f6b-9838-8ea300ca70de namespace which is not needed anymore#033[00m
Jan 20 10:37:43 np0005588920 nova_compute[226886]: 2026-01-20 15:37:43.492 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:43 np0005588920 systemd[1]: machine-qemu\x2d100\x2dinstance\x2d000000d5.scope: Deactivated successfully.
Jan 20 10:37:43 np0005588920 systemd[1]: machine-qemu\x2d100\x2dinstance\x2d000000d5.scope: Consumed 14.115s CPU time.
Jan 20 10:37:43 np0005588920 systemd-machined[196121]: Machine qemu-100-instance-000000d5 terminated.
Jan 20 10:37:43 np0005588920 nova_compute[226886]: 2026-01-20 15:37:43.599 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:43 np0005588920 nova_compute[226886]: 2026-01-20 15:37:43.605 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:43 np0005588920 nova_compute[226886]: 2026-01-20 15:37:43.612 226890 INFO nova.virt.libvirt.driver [-] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Instance destroyed successfully.#033[00m
Jan 20 10:37:43 np0005588920 nova_compute[226886]: 2026-01-20 15:37:43.613 226890 DEBUG nova.objects.instance [None req-feb998bb-d6cc-4c3a-80c9-5d7f5e3a8b97 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lazy-loading 'resources' on Instance uuid 448068ae-e12d-44db-be1e-aab18ec6bf69 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:37:43 np0005588920 neutron-haproxy-ovnmeta-ce71b376-fc91-4f6b-9838-8ea300ca70de[310290]: [NOTICE]   (310294) : haproxy version is 2.8.14-c23fe91
Jan 20 10:37:43 np0005588920 neutron-haproxy-ovnmeta-ce71b376-fc91-4f6b-9838-8ea300ca70de[310290]: [NOTICE]   (310294) : path to executable is /usr/sbin/haproxy
Jan 20 10:37:43 np0005588920 neutron-haproxy-ovnmeta-ce71b376-fc91-4f6b-9838-8ea300ca70de[310290]: [WARNING]  (310294) : Exiting Master process...
Jan 20 10:37:43 np0005588920 neutron-haproxy-ovnmeta-ce71b376-fc91-4f6b-9838-8ea300ca70de[310290]: [WARNING]  (310294) : Exiting Master process...
Jan 20 10:37:43 np0005588920 neutron-haproxy-ovnmeta-ce71b376-fc91-4f6b-9838-8ea300ca70de[310290]: [ALERT]    (310294) : Current worker (310296) exited with code 143 (Terminated)
Jan 20 10:37:43 np0005588920 neutron-haproxy-ovnmeta-ce71b376-fc91-4f6b-9838-8ea300ca70de[310290]: [WARNING]  (310294) : All workers exited. Exiting... (0)
Jan 20 10:37:43 np0005588920 systemd[1]: libpod-5630d8f7e3ab2318e932c0765668f740c4ee4fe84d759770a04b66c5caa6a9b4.scope: Deactivated successfully.
Jan 20 10:37:43 np0005588920 nova_compute[226886]: 2026-01-20 15:37:43.635 226890 DEBUG nova.virt.libvirt.vif [None req-feb998bb-d6cc-4c3a-80c9-5d7f5e3a8b97 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:36:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-36434333',display_name='tempest-TestNetworkBasicOps-server-36434333',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-36434333',id=213,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMmXNz5iC21rSw9frG/tYMEZfAaZZQMophlhhWlqfNanOEERbqiQdgrmdDphltOag9NUoEg9YTEbCYJogCyo1wy+ArBGraFEWTtl6g8+Am3Ib6bk6goIdDCUuYmAe70jlw==',key_name='tempest-TestNetworkBasicOps-2119238938',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:36:52Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3168f57421fb49bfb94b85daedd1fe7d',ramdisk_id='',reservation_id='r-s6bjvw22',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-807695970',owner_user_name='tempest-TestNetworkBasicOps-807695970-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:36:52Z,user_data=None,user_id='5338aa65dc0e4326a66ce79053787f14',uuid=448068ae-e12d-44db-be1e-aab18ec6bf69,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140", "address": "fa:16:3e:bf:54:11", "network": {"id": "ce71b376-fc91-4f6b-9838-8ea300ca70de", "bridge": "br-int", "label": "tempest-network-smoke--315983280", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d9bd6b8-e9", "ovs_interfaceid": "5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:37:43 np0005588920 nova_compute[226886]: 2026-01-20 15:37:43.635 226890 DEBUG nova.network.os_vif_util [None req-feb998bb-d6cc-4c3a-80c9-5d7f5e3a8b97 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converting VIF {"id": "5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140", "address": "fa:16:3e:bf:54:11", "network": {"id": "ce71b376-fc91-4f6b-9838-8ea300ca70de", "bridge": "br-int", "label": "tempest-network-smoke--315983280", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d9bd6b8-e9", "ovs_interfaceid": "5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:37:43 np0005588920 podman[310636]: 2026-01-20 15:37:43.636394708 +0000 UTC m=+0.047219141 container died 5630d8f7e3ab2318e932c0765668f740c4ee4fe84d759770a04b66c5caa6a9b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce71b376-fc91-4f6b-9838-8ea300ca70de, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 20 10:37:43 np0005588920 nova_compute[226886]: 2026-01-20 15:37:43.637 226890 DEBUG nova.network.os_vif_util [None req-feb998bb-d6cc-4c3a-80c9-5d7f5e3a8b97 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:bf:54:11,bridge_name='br-int',has_traffic_filtering=True,id=5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140,network=Network(ce71b376-fc91-4f6b-9838-8ea300ca70de),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d9bd6b8-e9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:37:43 np0005588920 nova_compute[226886]: 2026-01-20 15:37:43.638 226890 DEBUG os_vif [None req-feb998bb-d6cc-4c3a-80c9-5d7f5e3a8b97 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:bf:54:11,bridge_name='br-int',has_traffic_filtering=True,id=5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140,network=Network(ce71b376-fc91-4f6b-9838-8ea300ca70de),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d9bd6b8-e9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:37:43 np0005588920 nova_compute[226886]: 2026-01-20 15:37:43.640 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:43 np0005588920 nova_compute[226886]: 2026-01-20 15:37:43.640 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5d9bd6b8-e9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:37:43 np0005588920 nova_compute[226886]: 2026-01-20 15:37:43.642 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:43 np0005588920 nova_compute[226886]: 2026-01-20 15:37:43.643 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:43 np0005588920 nova_compute[226886]: 2026-01-20 15:37:43.645 226890 INFO os_vif [None req-feb998bb-d6cc-4c3a-80c9-5d7f5e3a8b97 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:bf:54:11,bridge_name='br-int',has_traffic_filtering=True,id=5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140,network=Network(ce71b376-fc91-4f6b-9838-8ea300ca70de),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d9bd6b8-e9')#033[00m
Jan 20 10:37:43 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5630d8f7e3ab2318e932c0765668f740c4ee4fe84d759770a04b66c5caa6a9b4-userdata-shm.mount: Deactivated successfully.
Jan 20 10:37:43 np0005588920 systemd[1]: var-lib-containers-storage-overlay-9a2676f2af25b8f8516cc73ff1c3f6c57fb339e96e92a35e46127b62ebdf7cb0-merged.mount: Deactivated successfully.
Jan 20 10:37:43 np0005588920 podman[310636]: 2026-01-20 15:37:43.672467232 +0000 UTC m=+0.083291665 container cleanup 5630d8f7e3ab2318e932c0765668f740c4ee4fe84d759770a04b66c5caa6a9b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce71b376-fc91-4f6b-9838-8ea300ca70de, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:37:43 np0005588920 systemd[1]: libpod-conmon-5630d8f7e3ab2318e932c0765668f740c4ee4fe84d759770a04b66c5caa6a9b4.scope: Deactivated successfully.
Jan 20 10:37:43 np0005588920 podman[310690]: 2026-01-20 15:37:43.744948219 +0000 UTC m=+0.047632033 container remove 5630d8f7e3ab2318e932c0765668f740c4ee4fe84d759770a04b66c5caa6a9b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce71b376-fc91-4f6b-9838-8ea300ca70de, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 20 10:37:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:37:43.751 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[63e77073-8ffe-4572-9459-9ef4840aae23]: (4, ('Tue Jan 20 03:37:43 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ce71b376-fc91-4f6b-9838-8ea300ca70de (5630d8f7e3ab2318e932c0765668f740c4ee4fe84d759770a04b66c5caa6a9b4)\n5630d8f7e3ab2318e932c0765668f740c4ee4fe84d759770a04b66c5caa6a9b4\nTue Jan 20 03:37:43 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ce71b376-fc91-4f6b-9838-8ea300ca70de (5630d8f7e3ab2318e932c0765668f740c4ee4fe84d759770a04b66c5caa6a9b4)\n5630d8f7e3ab2318e932c0765668f740c4ee4fe84d759770a04b66c5caa6a9b4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:37:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:37:43.753 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[4b8b6a69-a847-41b0-bbe9-487ae4136163]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:37:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:37:43.754 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapce71b376-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:37:43 np0005588920 nova_compute[226886]: 2026-01-20 15:37:43.756 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:43 np0005588920 kernel: tapce71b376-f0: left promiscuous mode
Jan 20 10:37:43 np0005588920 nova_compute[226886]: 2026-01-20 15:37:43.783 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:37:43.786 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[bf31dfea-8bbe-4ec3-9bab-46f746d50766]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:37:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:37:43.806 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[51eb1b9e-2009-4f46-94ba-5a85f277adce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:37:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:37:43.807 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[908ea3f8-e8e7-48ee-979d-e180de31eafe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:37:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:37:43.825 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[832474ea-cbac-45ad-9354-3fc7d03c04f6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 845318, 'reachable_time': 17840, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 310707, 'error': None, 'target': 'ovnmeta-ce71b376-fc91-4f6b-9838-8ea300ca70de', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:37:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:37:43.828 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ce71b376-fc91-4f6b-9838-8ea300ca70de deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:37:43 np0005588920 systemd[1]: run-netns-ovnmeta\x2dce71b376\x2dfc91\x2d4f6b\x2d9838\x2d8ea300ca70de.mount: Deactivated successfully.
Jan 20 10:37:43 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:37:43.829 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[66502720-e0e8-43f9-820d-3f9502deeba8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:37:43 np0005588920 nova_compute[226886]: 2026-01-20 15:37:43.913 226890 DEBUG nova.compute.manager [req-07b35917-76b6-4c73-a9fe-487f0f3be96b req-9e125094-b5ad-47e3-93af-43a631783673 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Received event network-vif-unplugged-5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:37:43 np0005588920 nova_compute[226886]: 2026-01-20 15:37:43.914 226890 DEBUG oslo_concurrency.lockutils [req-07b35917-76b6-4c73-a9fe-487f0f3be96b req-9e125094-b5ad-47e3-93af-43a631783673 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "448068ae-e12d-44db-be1e-aab18ec6bf69-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:37:43 np0005588920 nova_compute[226886]: 2026-01-20 15:37:43.914 226890 DEBUG oslo_concurrency.lockutils [req-07b35917-76b6-4c73-a9fe-487f0f3be96b req-9e125094-b5ad-47e3-93af-43a631783673 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "448068ae-e12d-44db-be1e-aab18ec6bf69-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:37:43 np0005588920 nova_compute[226886]: 2026-01-20 15:37:43.915 226890 DEBUG oslo_concurrency.lockutils [req-07b35917-76b6-4c73-a9fe-487f0f3be96b req-9e125094-b5ad-47e3-93af-43a631783673 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "448068ae-e12d-44db-be1e-aab18ec6bf69-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:37:43 np0005588920 nova_compute[226886]: 2026-01-20 15:37:43.915 226890 DEBUG nova.compute.manager [req-07b35917-76b6-4c73-a9fe-487f0f3be96b req-9e125094-b5ad-47e3-93af-43a631783673 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] No waiting events found dispatching network-vif-unplugged-5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:37:43 np0005588920 nova_compute[226886]: 2026-01-20 15:37:43.915 226890 DEBUG nova.compute.manager [req-07b35917-76b6-4c73-a9fe-487f0f3be96b req-9e125094-b5ad-47e3-93af-43a631783673 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Received event network-vif-unplugged-5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:37:44 np0005588920 nova_compute[226886]: 2026-01-20 15:37:44.061 226890 INFO nova.virt.libvirt.driver [None req-feb998bb-d6cc-4c3a-80c9-5d7f5e3a8b97 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Deleting instance files /var/lib/nova/instances/448068ae-e12d-44db-be1e-aab18ec6bf69_del#033[00m
Jan 20 10:37:44 np0005588920 nova_compute[226886]: 2026-01-20 15:37:44.061 226890 INFO nova.virt.libvirt.driver [None req-feb998bb-d6cc-4c3a-80c9-5d7f5e3a8b97 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Deletion of /var/lib/nova/instances/448068ae-e12d-44db-be1e-aab18ec6bf69_del complete#033[00m
Jan 20 10:37:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:37:44 np0005588920 nova_compute[226886]: 2026-01-20 15:37:44.137 226890 INFO nova.compute.manager [None req-feb998bb-d6cc-4c3a-80c9-5d7f5e3a8b97 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Took 0.76 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:37:44 np0005588920 nova_compute[226886]: 2026-01-20 15:37:44.137 226890 DEBUG oslo.service.loopingcall [None req-feb998bb-d6cc-4c3a-80c9-5d7f5e3a8b97 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:37:44 np0005588920 nova_compute[226886]: 2026-01-20 15:37:44.138 226890 DEBUG nova.compute.manager [-] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:37:44 np0005588920 nova_compute[226886]: 2026-01-20 15:37:44.138 226890 DEBUG nova.network.neutron [-] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:37:44 np0005588920 nova_compute[226886]: 2026-01-20 15:37:44.287 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:44.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:44.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:45 np0005588920 nova_compute[226886]: 2026-01-20 15:37:45.140 226890 DEBUG nova.network.neutron [-] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:37:45 np0005588920 nova_compute[226886]: 2026-01-20 15:37:45.165 226890 INFO nova.compute.manager [-] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Took 1.03 seconds to deallocate network for instance.#033[00m
Jan 20 10:37:45 np0005588920 nova_compute[226886]: 2026-01-20 15:37:45.228 226890 DEBUG oslo_concurrency.lockutils [None req-feb998bb-d6cc-4c3a-80c9-5d7f5e3a8b97 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:37:45 np0005588920 nova_compute[226886]: 2026-01-20 15:37:45.229 226890 DEBUG oslo_concurrency.lockutils [None req-feb998bb-d6cc-4c3a-80c9-5d7f5e3a8b97 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:37:45 np0005588920 nova_compute[226886]: 2026-01-20 15:37:45.375 226890 DEBUG oslo_concurrency.processutils [None req-feb998bb-d6cc-4c3a-80c9-5d7f5e3a8b97 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:37:45 np0005588920 nova_compute[226886]: 2026-01-20 15:37:45.406 226890 DEBUG nova.compute.manager [req-ae7d884c-78b7-4a2a-8ccd-70ba8aad826f req-8005e68f-bec8-46b5-8fc0-f8d6c91b63da 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Received event network-vif-deleted-5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:37:45 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:37:45 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/496827110' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:37:45 np0005588920 nova_compute[226886]: 2026-01-20 15:37:45.803 226890 DEBUG oslo_concurrency.processutils [None req-feb998bb-d6cc-4c3a-80c9-5d7f5e3a8b97 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:37:45 np0005588920 nova_compute[226886]: 2026-01-20 15:37:45.809 226890 DEBUG nova.compute.provider_tree [None req-feb998bb-d6cc-4c3a-80c9-5d7f5e3a8b97 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:37:45 np0005588920 nova_compute[226886]: 2026-01-20 15:37:45.825 226890 DEBUG nova.scheduler.client.report [None req-feb998bb-d6cc-4c3a-80c9-5d7f5e3a8b97 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:37:45 np0005588920 nova_compute[226886]: 2026-01-20 15:37:45.848 226890 DEBUG oslo_concurrency.lockutils [None req-feb998bb-d6cc-4c3a-80c9-5d7f5e3a8b97 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.619s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:37:45 np0005588920 nova_compute[226886]: 2026-01-20 15:37:45.860 226890 DEBUG nova.network.neutron [req-e26fb9fb-8cd3-44fb-a7bb-2ef829e19048 req-ebd2a376-71d7-47c2-a847-62956d4ba546 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Updated VIF entry in instance network info cache for port 5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:37:45 np0005588920 nova_compute[226886]: 2026-01-20 15:37:45.861 226890 DEBUG nova.network.neutron [req-e26fb9fb-8cd3-44fb-a7bb-2ef829e19048 req-ebd2a376-71d7-47c2-a847-62956d4ba546 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Updating instance_info_cache with network_info: [{"id": "5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140", "address": "fa:16:3e:bf:54:11", "network": {"id": "ce71b376-fc91-4f6b-9838-8ea300ca70de", "bridge": "br-int", "label": "tempest-network-smoke--315983280", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d9bd6b8-e9", "ovs_interfaceid": "5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:37:45 np0005588920 nova_compute[226886]: 2026-01-20 15:37:45.875 226890 INFO nova.scheduler.client.report [None req-feb998bb-d6cc-4c3a-80c9-5d7f5e3a8b97 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Deleted allocations for instance 448068ae-e12d-44db-be1e-aab18ec6bf69#033[00m
Jan 20 10:37:45 np0005588920 nova_compute[226886]: 2026-01-20 15:37:45.892 226890 DEBUG oslo_concurrency.lockutils [req-e26fb9fb-8cd3-44fb-a7bb-2ef829e19048 req-ebd2a376-71d7-47c2-a847-62956d4ba546 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-448068ae-e12d-44db-be1e-aab18ec6bf69" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:37:45 np0005588920 nova_compute[226886]: 2026-01-20 15:37:45.967 226890 DEBUG oslo_concurrency.lockutils [None req-feb998bb-d6cc-4c3a-80c9-5d7f5e3a8b97 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "448068ae-e12d-44db-be1e-aab18ec6bf69" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.592s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:37:45 np0005588920 podman[310731]: 2026-01-20 15:37:45.991069824 +0000 UTC m=+0.075017940 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Jan 20 10:37:46 np0005588920 nova_compute[226886]: 2026-01-20 15:37:46.042 226890 DEBUG nova.compute.manager [req-d6b4dbc4-57fb-4294-ae19-a830bfbef0fa req-15e8c433-3bf6-42dd-a84d-ba8b4eb78155 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Received event network-vif-plugged-5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:37:46 np0005588920 nova_compute[226886]: 2026-01-20 15:37:46.042 226890 DEBUG oslo_concurrency.lockutils [req-d6b4dbc4-57fb-4294-ae19-a830bfbef0fa req-15e8c433-3bf6-42dd-a84d-ba8b4eb78155 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "448068ae-e12d-44db-be1e-aab18ec6bf69-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:37:46 np0005588920 nova_compute[226886]: 2026-01-20 15:37:46.042 226890 DEBUG oslo_concurrency.lockutils [req-d6b4dbc4-57fb-4294-ae19-a830bfbef0fa req-15e8c433-3bf6-42dd-a84d-ba8b4eb78155 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "448068ae-e12d-44db-be1e-aab18ec6bf69-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:37:46 np0005588920 nova_compute[226886]: 2026-01-20 15:37:46.042 226890 DEBUG oslo_concurrency.lockutils [req-d6b4dbc4-57fb-4294-ae19-a830bfbef0fa req-15e8c433-3bf6-42dd-a84d-ba8b4eb78155 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "448068ae-e12d-44db-be1e-aab18ec6bf69-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:37:46 np0005588920 nova_compute[226886]: 2026-01-20 15:37:46.042 226890 DEBUG nova.compute.manager [req-d6b4dbc4-57fb-4294-ae19-a830bfbef0fa req-15e8c433-3bf6-42dd-a84d-ba8b4eb78155 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] No waiting events found dispatching network-vif-plugged-5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:37:46 np0005588920 nova_compute[226886]: 2026-01-20 15:37:46.043 226890 WARNING nova.compute.manager [req-d6b4dbc4-57fb-4294-ae19-a830bfbef0fa req-15e8c433-3bf6-42dd-a84d-ba8b4eb78155 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Received unexpected event network-vif-plugged-5d9bd6b8-e9ed-4d50-bc5e-ed9ac2aea140 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 10:37:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:37:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:46.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:37:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:46.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:48 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #169. Immutable memtables: 0.
Jan 20 10:37:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:37:48.420582) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:37:48 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 107] Flushing memtable with next log file: 169
Jan 20 10:37:48 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923468420643, "job": 107, "event": "flush_started", "num_memtables": 1, "num_entries": 2200, "num_deletes": 252, "total_data_size": 5409694, "memory_usage": 5485640, "flush_reason": "Manual Compaction"}
Jan 20 10:37:48 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 107] Level-0 flush table #170: started
Jan 20 10:37:48 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923468441951, "cf_name": "default", "job": 107, "event": "table_file_creation", "file_number": 170, "file_size": 2036286, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 82030, "largest_seqno": 84225, "table_properties": {"data_size": 2029966, "index_size": 3201, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2053, "raw_key_size": 16729, "raw_average_key_size": 20, "raw_value_size": 2015882, "raw_average_value_size": 2513, "num_data_blocks": 145, "num_entries": 802, "num_filter_entries": 802, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768923260, "oldest_key_time": 1768923260, "file_creation_time": 1768923468, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 170, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:37:48 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 107] Flush lasted 21428 microseconds, and 4941 cpu microseconds.
Jan 20 10:37:48 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:37:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:37:48.442004) [db/flush_job.cc:967] [default] [JOB 107] Level-0 flush table #170: 2036286 bytes OK
Jan 20 10:37:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:37:48.442026) [db/memtable_list.cc:519] [default] Level-0 commit table #170 started
Jan 20 10:37:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:37:48.444517) [db/memtable_list.cc:722] [default] Level-0 commit table #170: memtable #1 done
Jan 20 10:37:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:37:48.444545) EVENT_LOG_v1 {"time_micros": 1768923468444536, "job": 107, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:37:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:37:48.444570) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:37:48 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 107] Try to delete WAL files size 5400006, prev total WAL file size 5400006, number of live WAL files 2.
Jan 20 10:37:48 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000166.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:37:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:37:48.446186) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740032373537' seq:72057594037927935, type:22 .. '6D6772737461740033303130' seq:0, type:0; will stop at (end)
Jan 20 10:37:48 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 108] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:37:48 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 107 Base level 0, inputs: [170(1988KB)], [168(12MB)]
Jan 20 10:37:48 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923468446269, "job": 108, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [170], "files_L6": [168], "score": -1, "input_data_size": 14993286, "oldest_snapshot_seqno": -1}
Jan 20 10:37:48 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 108] Generated table #171: 10614 keys, 12636649 bytes, temperature: kUnknown
Jan 20 10:37:48 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923468589680, "cf_name": "default", "job": 108, "event": "table_file_creation", "file_number": 171, "file_size": 12636649, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12570261, "index_size": 38755, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 26565, "raw_key_size": 278504, "raw_average_key_size": 26, "raw_value_size": 12386652, "raw_average_value_size": 1167, "num_data_blocks": 1478, "num_entries": 10614, "num_filter_entries": 10614, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768923468, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 171, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:37:48 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:37:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:37:48.589997) [db/compaction/compaction_job.cc:1663] [default] [JOB 108] Compacted 1@0 + 1@6 files to L6 => 12636649 bytes
Jan 20 10:37:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:37:48.593450) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 104.5 rd, 88.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.9, 12.4 +0.0 blob) out(12.1 +0.0 blob), read-write-amplify(13.6) write-amplify(6.2) OK, records in: 11032, records dropped: 418 output_compression: NoCompression
Jan 20 10:37:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:37:48.593470) EVENT_LOG_v1 {"time_micros": 1768923468593462, "job": 108, "event": "compaction_finished", "compaction_time_micros": 143529, "compaction_time_cpu_micros": 41208, "output_level": 6, "num_output_files": 1, "total_output_size": 12636649, "num_input_records": 11032, "num_output_records": 10614, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:37:48 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000170.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:37:48 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923468594078, "job": 108, "event": "table_file_deletion", "file_number": 170}
Jan 20 10:37:48 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000168.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:37:48 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923468596859, "job": 108, "event": "table_file_deletion", "file_number": 168}
Jan 20 10:37:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:37:48.446112) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:37:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:37:48.597028) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:37:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:37:48.597038) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:37:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:37:48.597041) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:37:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:37:48.597044) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:37:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:37:48.597046) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:37:48 np0005588920 nova_compute[226886]: 2026-01-20 15:37:48.644 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:48.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:48.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:49 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:37:49 np0005588920 nova_compute[226886]: 2026-01-20 15:37:49.288 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:37:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:50.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:37:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:50.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:52 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 10:37:52 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6000.0 total, 600.0 interval#012Cumulative writes: 16K writes, 84K keys, 16K commit groups, 1.0 writes per commit group, ingest: 0.16 GB, 0.03 MB/s#012Cumulative WAL: 16K writes, 16K syncs, 1.00 writes per sync, written: 0.16 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1488 writes, 7137 keys, 1488 commit groups, 1.0 writes per commit group, ingest: 15.62 MB, 0.03 MB/s#012Interval WAL: 1488 writes, 1488 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     75.2      1.36              0.37        54    0.025       0      0       0.0       0.0#012  L6      1/0   12.05 MB   0.0      0.6     0.1      0.5       0.5      0.0       0.0   5.3    104.3     89.4      6.05              1.65        53    0.114    407K    28K       0.0       0.0#012 Sum      1/0   12.05 MB   0.0      0.6     0.1      0.5       0.6      0.1       0.0   6.3     85.2     86.8      7.42              2.02       107    0.069    407K    28K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   7.9     61.8     61.4      1.09              0.18        10    0.109     53K   2443       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.6     0.1      0.5       0.5      0.0       0.0   0.0    104.3     89.4      6.05              1.65        53    0.114    407K    28K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     75.2      1.36              0.37        53    0.026       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.0 total, 600.0 interval#012Flush(GB): cumulative 0.100, interval 0.008#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.63 GB write, 0.11 MB/s write, 0.62 GB read, 0.11 MB/s read, 7.4 seconds#012Interval compaction: 0.07 GB write, 0.11 MB/s write, 0.07 GB read, 0.11 MB/s read, 1.1 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x564a2f9711f0#2 capacity: 304.00 MB usage: 68.20 MB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 0 last_secs: 0.000391 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3895,65.30 MB,21.4794%) FilterBlock(107,1.09 MB,0.358717%) IndexBlock(107,1.81 MB,0.596282%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 20 10:37:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:52.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:52.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:53 np0005588920 nova_compute[226886]: 2026-01-20 15:37:53.648 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:54 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:37:54 np0005588920 nova_compute[226886]: 2026-01-20 15:37:54.291 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:54.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:54.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:37:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:56.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:37:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:37:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:56.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:37:58 np0005588920 nova_compute[226886]: 2026-01-20 15:37:58.612 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768923463.6106763, 448068ae-e12d-44db-be1e-aab18ec6bf69 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:37:58 np0005588920 nova_compute[226886]: 2026-01-20 15:37:58.613 226890 INFO nova.compute.manager [-] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:37:58 np0005588920 nova_compute[226886]: 2026-01-20 15:37:58.652 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:58.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:37:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:37:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:58.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:37:59 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:37:59 np0005588920 nova_compute[226886]: 2026-01-20 15:37:59.235 226890 DEBUG nova.compute.manager [None req-5d714d4e-eba4-4147-957a-7a1196c41d25 - - - - - -] [instance: 448068ae-e12d-44db-be1e-aab18ec6bf69] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:37:59 np0005588920 nova_compute[226886]: 2026-01-20 15:37:59.294 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:38:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:38:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:00.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:38:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:00.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:02.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:38:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:02.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:38:03 np0005588920 nova_compute[226886]: 2026-01-20 15:38:03.657 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:38:04 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:38:04 np0005588920 nova_compute[226886]: 2026-01-20 15:38:04.295 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:38:04 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #172. Immutable memtables: 0.
Jan 20 10:38:04 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:38:04.687288) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:38:04 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 109] Flushing memtable with next log file: 172
Jan 20 10:38:04 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923484687597, "job": 109, "event": "flush_started", "num_memtables": 1, "num_entries": 395, "num_deletes": 251, "total_data_size": 408881, "memory_usage": 417032, "flush_reason": "Manual Compaction"}
Jan 20 10:38:04 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 109] Level-0 flush table #173: started
Jan 20 10:38:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:04.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:04 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923484809499, "cf_name": "default", "job": 109, "event": "table_file_creation", "file_number": 173, "file_size": 269559, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 84230, "largest_seqno": 84620, "table_properties": {"data_size": 267271, "index_size": 451, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 773, "raw_key_size": 5634, "raw_average_key_size": 18, "raw_value_size": 262744, "raw_average_value_size": 867, "num_data_blocks": 20, "num_entries": 303, "num_filter_entries": 303, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768923468, "oldest_key_time": 1768923468, "file_creation_time": 1768923484, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 173, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:38:04 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 109] Flush lasted 122020 microseconds, and 1878 cpu microseconds.
Jan 20 10:38:04 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:38:04 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:38:04.809546) [db/flush_job.cc:967] [default] [JOB 109] Level-0 flush table #173: 269559 bytes OK
Jan 20 10:38:04 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:38:04.809567) [db/memtable_list.cc:519] [default] Level-0 commit table #173 started
Jan 20 10:38:04 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:38:04.811511) [db/memtable_list.cc:722] [default] Level-0 commit table #173: memtable #1 done
Jan 20 10:38:04 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:38:04.811524) EVENT_LOG_v1 {"time_micros": 1768923484811520, "job": 109, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:38:04 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:38:04.811540) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:38:04 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 109] Try to delete WAL files size 406321, prev total WAL file size 406321, number of live WAL files 2.
Jan 20 10:38:04 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000169.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:38:04 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:38:04.811905) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730037323739' seq:72057594037927935, type:22 .. '7061786F730037353331' seq:0, type:0; will stop at (end)
Jan 20 10:38:04 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 110] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:38:04 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 109 Base level 0, inputs: [173(263KB)], [171(12MB)]
Jan 20 10:38:04 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923484811927, "job": 110, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [173], "files_L6": [171], "score": -1, "input_data_size": 12906208, "oldest_snapshot_seqno": -1}
Jan 20 10:38:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:04.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:04 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 110] Generated table #174: 10407 keys, 10873640 bytes, temperature: kUnknown
Jan 20 10:38:04 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923484990467, "cf_name": "default", "job": 110, "event": "table_file_creation", "file_number": 174, "file_size": 10873640, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10810121, "index_size": 36398, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 26053, "raw_key_size": 274884, "raw_average_key_size": 26, "raw_value_size": 10631512, "raw_average_value_size": 1021, "num_data_blocks": 1371, "num_entries": 10407, "num_filter_entries": 10407, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768923484, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 174, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:38:04 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:38:04 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:38:04.990823) [db/compaction/compaction_job.cc:1663] [default] [JOB 110] Compacted 1@0 + 1@6 files to L6 => 10873640 bytes
Jan 20 10:38:04 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:38:04.994562) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 72.2 rd, 60.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.3, 12.1 +0.0 blob) out(10.4 +0.0 blob), read-write-amplify(88.2) write-amplify(40.3) OK, records in: 10917, records dropped: 510 output_compression: NoCompression
Jan 20 10:38:04 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:38:04.994592) EVENT_LOG_v1 {"time_micros": 1768923484994578, "job": 110, "event": "compaction_finished", "compaction_time_micros": 178660, "compaction_time_cpu_micros": 25786, "output_level": 6, "num_output_files": 1, "total_output_size": 10873640, "num_input_records": 10917, "num_output_records": 10407, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:38:04 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000173.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:38:04 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923484994831, "job": 110, "event": "table_file_deletion", "file_number": 173}
Jan 20 10:38:04 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000171.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:38:04 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923484998930, "job": 110, "event": "table_file_deletion", "file_number": 171}
Jan 20 10:38:04 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:38:04.811873) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:38:04 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:38:04.999018) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:38:04 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:38:04.999024) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:38:04 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:38:04.999025) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:38:04 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:38:04.999027) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:38:04 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:38:04.999028) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:38:05 np0005588920 nova_compute[226886]: 2026-01-20 15:38:05.074 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:38:05 np0005588920 nova_compute[226886]: 2026-01-20 15:38:05.151 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:38:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:38:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:06.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:38:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:06.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:07 np0005588920 podman[310754]: 2026-01-20 15:38:07.997774653 +0000 UTC m=+0.084538489 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 20 10:38:08 np0005588920 nova_compute[226886]: 2026-01-20 15:38:08.696 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:38:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:08.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:08.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:38:09 np0005588920 nova_compute[226886]: 2026-01-20 15:38:09.297 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:38:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:38:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:10.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:38:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:10.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:38:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:12.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:38:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:12.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:13 np0005588920 nova_compute[226886]: 2026-01-20 15:38:13.700 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:38:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:38:14 np0005588920 nova_compute[226886]: 2026-01-20 15:38:14.299 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:38:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:14.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:14.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:38:16.502 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:38:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:38:16.503 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:38:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:38:16.503 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:38:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:16.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:38:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:16.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:38:16 np0005588920 podman[310780]: 2026-01-20 15:38:16.993112512 +0000 UTC m=+0.068184836 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:38:18 np0005588920 nova_compute[226886]: 2026-01-20 15:38:18.704 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:38:18 np0005588920 nova_compute[226886]: 2026-01-20 15:38:18.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:38:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:38:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:18.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:38:18 np0005588920 nova_compute[226886]: 2026-01-20 15:38:18.843 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:38:18 np0005588920 nova_compute[226886]: 2026-01-20 15:38:18.844 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:38:18 np0005588920 nova_compute[226886]: 2026-01-20 15:38:18.844 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:38:18 np0005588920 nova_compute[226886]: 2026-01-20 15:38:18.844 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:38:18 np0005588920 nova_compute[226886]: 2026-01-20 15:38:18.845 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:38:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:38:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:18.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:38:19 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:38:19 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:38:19 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/420021600' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:38:19 np0005588920 nova_compute[226886]: 2026-01-20 15:38:19.287 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:38:19 np0005588920 nova_compute[226886]: 2026-01-20 15:38:19.301 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:38:19 np0005588920 nova_compute[226886]: 2026-01-20 15:38:19.429 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:38:19 np0005588920 nova_compute[226886]: 2026-01-20 15:38:19.431 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4124MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:38:19 np0005588920 nova_compute[226886]: 2026-01-20 15:38:19.431 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:38:19 np0005588920 nova_compute[226886]: 2026-01-20 15:38:19.431 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:38:19 np0005588920 nova_compute[226886]: 2026-01-20 15:38:19.523 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:38:19 np0005588920 nova_compute[226886]: 2026-01-20 15:38:19.523 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:38:19 np0005588920 nova_compute[226886]: 2026-01-20 15:38:19.617 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:38:20 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:38:20 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2588081493' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:38:20 np0005588920 nova_compute[226886]: 2026-01-20 15:38:20.050 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:38:20 np0005588920 nova_compute[226886]: 2026-01-20 15:38:20.057 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:38:20 np0005588920 nova_compute[226886]: 2026-01-20 15:38:20.075 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:38:20 np0005588920 nova_compute[226886]: 2026-01-20 15:38:20.609 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:38:20 np0005588920 nova_compute[226886]: 2026-01-20 15:38:20.609 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.178s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:38:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:20.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:20.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:38:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:22.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:38:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:22.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:23 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 10:38:23 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6000.1 total, 600.0 interval#012Cumulative writes: 76K writes, 307K keys, 76K commit groups, 1.0 writes per commit group, ingest: 0.31 GB, 0.05 MB/s#012Cumulative WAL: 76K writes, 28K syncs, 2.69 writes per sync, written: 0.31 GB, 0.05 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 3357 writes, 13K keys, 3357 commit groups, 1.0 writes per commit group, ingest: 15.59 MB, 0.03 MB/s#012Interval WAL: 3357 writes, 1321 syncs, 2.54 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.1 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x562f882274b0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 6.1e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.1 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x562f882274b0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 6.1e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.1 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 me
Jan 20 10:38:23 np0005588920 nova_compute[226886]: 2026-01-20 15:38:23.609 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:38:23 np0005588920 nova_compute[226886]: 2026-01-20 15:38:23.610 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:38:23 np0005588920 nova_compute[226886]: 2026-01-20 15:38:23.610 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:38:23 np0005588920 nova_compute[226886]: 2026-01-20 15:38:23.706 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:38:23 np0005588920 nova_compute[226886]: 2026-01-20 15:38:23.768 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:38:24 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:38:24 np0005588920 nova_compute[226886]: 2026-01-20 15:38:24.304 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:38:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:38:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:24.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:38:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:24.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:25 np0005588920 nova_compute[226886]: 2026-01-20 15:38:25.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:38:25 np0005588920 nova_compute[226886]: 2026-01-20 15:38:25.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:38:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:26.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:26.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:27 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:38:27 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:38:27 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:38:27 np0005588920 nova_compute[226886]: 2026-01-20 15:38:27.720 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:38:28 np0005588920 nova_compute[226886]: 2026-01-20 15:38:28.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:38:28 np0005588920 nova_compute[226886]: 2026-01-20 15:38:28.745 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:38:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:28.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:28.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:38:29 np0005588920 nova_compute[226886]: 2026-01-20 15:38:29.306 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:38:29 np0005588920 nova_compute[226886]: 2026-01-20 15:38:29.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:38:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:30.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:30.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:32 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:38:32 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:38:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:32.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:32.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:33 np0005588920 nova_compute[226886]: 2026-01-20 15:38:33.749 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:38:34 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:38:34 np0005588920 nova_compute[226886]: 2026-01-20 15:38:34.332 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:38:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:34.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:34.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:35 np0005588920 nova_compute[226886]: 2026-01-20 15:38:35.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:38:35 np0005588920 nova_compute[226886]: 2026-01-20 15:38:35.726 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:38:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:38:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:36.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:38:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:38:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:36.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:38:38 np0005588920 nova_compute[226886]: 2026-01-20 15:38:38.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:38:38 np0005588920 nova_compute[226886]: 2026-01-20 15:38:38.754 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:38:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:38.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:38.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:38 np0005588920 podman[311028]: 2026-01-20 15:38:38.995906293 +0000 UTC m=+0.079417874 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:38:39 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:38:39 np0005588920 nova_compute[226886]: 2026-01-20 15:38:39.334 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:38:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:40.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:38:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:40.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:38:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:42.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:38:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:42.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:38:43 np0005588920 nova_compute[226886]: 2026-01-20 15:38:43.758 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:38:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:38:44 np0005588920 nova_compute[226886]: 2026-01-20 15:38:44.337 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:38:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:44.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:38:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:44.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:38:45 np0005588920 nova_compute[226886]: 2026-01-20 15:38:45.043 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:38:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:38:45.043 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=83, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=82) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:38:45 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:38:45.046 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:38:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:46.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:46.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:38:47.049 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '83'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:38:47 np0005588920 podman[311055]: 2026-01-20 15:38:47.96168725 +0000 UTC m=+0.045157363 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 20 10:38:48 np0005588920 nova_compute[226886]: 2026-01-20 15:38:48.763 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:38:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:38:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:48.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:38:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:48.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:49 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:38:49 np0005588920 nova_compute[226886]: 2026-01-20 15:38:49.337 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:38:50 np0005588920 radosgw[83324]: INFO: RGWReshardLock::lock found lock on reshard.0000000001 to be held by another RGW process; skipping for now
Jan 20 10:38:50 np0005588920 radosgw[83324]: INFO: RGWReshardLock::lock found lock on reshard.0000000002 to be held by another RGW process; skipping for now
Jan 20 10:38:50 np0005588920 radosgw[83324]: INFO: RGWReshardLock::lock found lock on reshard.0000000004 to be held by another RGW process; skipping for now
Jan 20 10:38:50 np0005588920 radosgw[83324]: INFO: RGWReshardLock::lock found lock on reshard.0000000006 to be held by another RGW process; skipping for now
Jan 20 10:38:50 np0005588920 radosgw[83324]: INFO: RGWReshardLock::lock found lock on reshard.0000000008 to be held by another RGW process; skipping for now
Jan 20 10:38:50 np0005588920 radosgw[83324]: INFO: RGWReshardLock::lock found lock on reshard.0000000009 to be held by another RGW process; skipping for now
Jan 20 10:38:50 np0005588920 radosgw[83324]: INFO: RGWReshardLock::lock found lock on reshard.0000000010 to be held by another RGW process; skipping for now
Jan 20 10:38:50 np0005588920 radosgw[83324]: INFO: RGWReshardLock::lock found lock on reshard.0000000012 to be held by another RGW process; skipping for now
Jan 20 10:38:50 np0005588920 radosgw[83324]: INFO: RGWReshardLock::lock found lock on reshard.0000000014 to be held by another RGW process; skipping for now
Jan 20 10:38:50 np0005588920 radosgw[83324]: INFO: RGWReshardLock::lock found lock on reshard.0000000015 to be held by another RGW process; skipping for now
Jan 20 10:38:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:50.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:50.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:52.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:52.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:53 np0005588920 nova_compute[226886]: 2026-01-20 15:38:53.767 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:38:54 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:38:54 np0005588920 nova_compute[226886]: 2026-01-20 15:38:54.341 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:38:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:38:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:54.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:38:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:54.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:55 np0005588920 ovn_controller[133971]: 2026-01-20T15:38:55Z|00980|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Jan 20 10:38:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:56.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:56.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:58 np0005588920 nova_compute[226886]: 2026-01-20 15:38:58.770 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:38:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:38:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:58.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:38:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:38:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:58.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:59 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:38:59 np0005588920 nova_compute[226886]: 2026-01-20 15:38:59.382 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:39:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:00.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:00.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:02.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:02.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:03 np0005588920 nova_compute[226886]: 2026-01-20 15:39:03.922 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:39:04 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:39:04 np0005588920 nova_compute[226886]: 2026-01-20 15:39:04.383 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:39:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:39:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:04.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:39:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:04.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:06.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:06.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:08.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:08.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:08 np0005588920 nova_compute[226886]: 2026-01-20 15:39:08.970 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:39:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:39:09 np0005588920 nova_compute[226886]: 2026-01-20 15:39:09.384 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:39:09 np0005588920 podman[311074]: 2026-01-20 15:39:09.995976554 +0000 UTC m=+0.081008829 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 
9 Base Image, org.label-schema.schema-version=1.0)
Jan 20 10:39:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:39:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:10.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:39:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:10.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:12.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:12.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:13 np0005588920 nova_compute[226886]: 2026-01-20 15:39:13.973 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:39:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:39:14 np0005588920 nova_compute[226886]: 2026-01-20 15:39:14.387 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:39:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:39:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:14.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:39:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:14.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:39:16.503 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:39:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:39:16.504 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:39:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:39:16.504 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:39:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:39:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:16.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:39:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:39:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:16.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:39:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:39:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:18.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:39:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:18.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:18 np0005588920 podman[311102]: 2026-01-20 15:39:18.974152055 +0000 UTC m=+0.053748456 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:39:19 np0005588920 nova_compute[226886]: 2026-01-20 15:39:19.045 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:39:19 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:39:19 np0005588920 nova_compute[226886]: 2026-01-20 15:39:19.389 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:39:19 np0005588920 nova_compute[226886]: 2026-01-20 15:39:19.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:39:19 np0005588920 nova_compute[226886]: 2026-01-20 15:39:19.756 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:39:19 np0005588920 nova_compute[226886]: 2026-01-20 15:39:19.757 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:39:19 np0005588920 nova_compute[226886]: 2026-01-20 15:39:19.757 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:39:19 np0005588920 nova_compute[226886]: 2026-01-20 15:39:19.758 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:39:19 np0005588920 nova_compute[226886]: 2026-01-20 15:39:19.758 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:39:20 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:39:20 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/366126376' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:39:20 np0005588920 nova_compute[226886]: 2026-01-20 15:39:20.232 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:39:20 np0005588920 nova_compute[226886]: 2026-01-20 15:39:20.371 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:39:20 np0005588920 nova_compute[226886]: 2026-01-20 15:39:20.372 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4159MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:39:20 np0005588920 nova_compute[226886]: 2026-01-20 15:39:20.372 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:39:20 np0005588920 nova_compute[226886]: 2026-01-20 15:39:20.373 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:39:20 np0005588920 nova_compute[226886]: 2026-01-20 15:39:20.445 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:39:20 np0005588920 nova_compute[226886]: 2026-01-20 15:39:20.445 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:39:20 np0005588920 nova_compute[226886]: 2026-01-20 15:39:20.466 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Refreshing inventories for resource provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 20 10:39:20 np0005588920 nova_compute[226886]: 2026-01-20 15:39:20.489 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Updating ProviderTree inventory for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 20 10:39:20 np0005588920 nova_compute[226886]: 2026-01-20 15:39:20.490 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Updating inventory in ProviderTree for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 20 10:39:20 np0005588920 nova_compute[226886]: 2026-01-20 15:39:20.504 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Refreshing aggregate associations for resource provider ff38e91c-3320-4831-90ac-bcffc89ba7b6, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 20 10:39:20 np0005588920 nova_compute[226886]: 2026-01-20 15:39:20.530 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Refreshing trait associations for resource provider ff38e91c-3320-4831-90ac-bcffc89ba7b6, traits: COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_DEVICE_TAGGING,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 20 10:39:20 np0005588920 nova_compute[226886]: 2026-01-20 15:39:20.551 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:39:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:20.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:20.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:20 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:39:20 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3640117576' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:39:21 np0005588920 nova_compute[226886]: 2026-01-20 15:39:21.012 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:39:21 np0005588920 nova_compute[226886]: 2026-01-20 15:39:21.017 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:39:21 np0005588920 nova_compute[226886]: 2026-01-20 15:39:21.046 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:39:21 np0005588920 nova_compute[226886]: 2026-01-20 15:39:21.047 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:39:21 np0005588920 nova_compute[226886]: 2026-01-20 15:39:21.048 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.675s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:39:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:22.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:22.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:24 np0005588920 nova_compute[226886]: 2026-01-20 15:39:24.047 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:39:24 np0005588920 nova_compute[226886]: 2026-01-20 15:39:24.048 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:39:24 np0005588920 nova_compute[226886]: 2026-01-20 15:39:24.049 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:39:24 np0005588920 nova_compute[226886]: 2026-01-20 15:39:24.049 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:39:24 np0005588920 nova_compute[226886]: 2026-01-20 15:39:24.075 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:39:24 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:39:24 np0005588920 nova_compute[226886]: 2026-01-20 15:39:24.390 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:39:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:39:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:24.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:39:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:39:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:24.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:39:25 np0005588920 nova_compute[226886]: 2026-01-20 15:39:25.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:39:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:26.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:26.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:27 np0005588920 nova_compute[226886]: 2026-01-20 15:39:27.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:39:28 np0005588920 nova_compute[226886]: 2026-01-20 15:39:28.721 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:39:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:39:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:28.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:39:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:28.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:29 np0005588920 nova_compute[226886]: 2026-01-20 15:39:29.051 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:39:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:39:29 np0005588920 nova_compute[226886]: 2026-01-20 15:39:29.392 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:39:29 np0005588920 nova_compute[226886]: 2026-01-20 15:39:29.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:39:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:30.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:30.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:31 np0005588920 nova_compute[226886]: 2026-01-20 15:39:31.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:39:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:32.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:32.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:33 np0005588920 nova_compute[226886]: 2026-01-20 15:39:33.722 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:39:34 np0005588920 nova_compute[226886]: 2026-01-20 15:39:34.054 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:39:34 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:39:34 np0005588920 nova_compute[226886]: 2026-01-20 15:39:34.409 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:39:34 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:39:34 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:39:34 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:39:34 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:39:34 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:39:34 np0005588920 nova_compute[226886]: 2026-01-20 15:39:34.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:39:34 np0005588920 nova_compute[226886]: 2026-01-20 15:39:34.726 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:39:34 np0005588920 nova_compute[226886]: 2026-01-20 15:39:34.726 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:39:34 np0005588920 nova_compute[226886]: 2026-01-20 15:39:34.727 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:39:34 np0005588920 nova_compute[226886]: 2026-01-20 15:39:34.728 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:39:34 np0005588920 nova_compute[226886]: 2026-01-20 15:39:34.728 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:39:34 np0005588920 nova_compute[226886]: 2026-01-20 15:39:34.728 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:39:34 np0005588920 nova_compute[226886]: 2026-01-20 15:39:34.761 226890 DEBUG nova.virt.libvirt.imagecache [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Adding ephemeral_1_0706d66 into backend ephemeral images _store_ephemeral_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:100#033[00m
Jan 20 10:39:34 np0005588920 nova_compute[226886]: 2026-01-20 15:39:34.773 226890 DEBUG nova.virt.libvirt.imagecache [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314#033[00m
Jan 20 10:39:34 np0005588920 nova_compute[226886]: 2026-01-20 15:39:34.773 226890 WARNING nova.virt.libvirt.imagecache [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462#033[00m
Jan 20 10:39:34 np0005588920 nova_compute[226886]: 2026-01-20 15:39:34.773 226890 WARNING nova.virt.libvirt.imagecache [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c#033[00m
Jan 20 10:39:34 np0005588920 nova_compute[226886]: 2026-01-20 15:39:34.773 226890 INFO nova.virt.libvirt.imagecache [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Removable base files: /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c#033[00m
Jan 20 10:39:34 np0005588920 nova_compute[226886]: 2026-01-20 15:39:34.774 226890 INFO nova.virt.libvirt.imagecache [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462#033[00m
Jan 20 10:39:34 np0005588920 nova_compute[226886]: 2026-01-20 15:39:34.774 226890 INFO nova.virt.libvirt.imagecache [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c#033[00m
Jan 20 10:39:34 np0005588920 nova_compute[226886]: 2026-01-20 15:39:34.774 226890 DEBUG nova.virt.libvirt.imagecache [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350#033[00m
Jan 20 10:39:34 np0005588920 nova_compute[226886]: 2026-01-20 15:39:34.774 226890 DEBUG nova.virt.libvirt.imagecache [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299#033[00m
Jan 20 10:39:34 np0005588920 nova_compute[226886]: 2026-01-20 15:39:34.774 226890 DEBUG nova.virt.libvirt.imagecache [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284#033[00m
Jan 20 10:39:34 np0005588920 nova_compute[226886]: 2026-01-20 15:39:34.775 226890 INFO nova.virt.libvirt.imagecache [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/ephemeral_1_0706d66#033[00m
Jan 20 10:39:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:34.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:34.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:36.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:36.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:37 np0005588920 nova_compute[226886]: 2026-01-20 15:39:37.774 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:39:37 np0005588920 nova_compute[226886]: 2026-01-20 15:39:37.774 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:39:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:38.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:38.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:39 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:39:39 np0005588920 nova_compute[226886]: 2026-01-20 15:39:39.102 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:39:39 np0005588920 nova_compute[226886]: 2026-01-20 15:39:39.411 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:39:39 np0005588920 nova_compute[226886]: 2026-01-20 15:39:39.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:39:40 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:39:40 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:39:40 np0005588920 podman[311322]: 2026-01-20 15:39:40.524925619 +0000 UTC m=+0.072654262 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller)
Jan 20 10:39:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:40.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:40.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:42.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:42.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:39:44 np0005588920 nova_compute[226886]: 2026-01-20 15:39:44.145 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:39:44 np0005588920 nova_compute[226886]: 2026-01-20 15:39:44.412 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:39:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:44.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:39:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:44.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:39:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:46.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:46 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:39:46.956 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=84, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=83) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:39:46 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:39:46.957 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:39:46 np0005588920 nova_compute[226886]: 2026-01-20 15:39:46.957 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:39:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:46.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:48 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #175. Immutable memtables: 0.
Jan 20 10:39:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:39:48.475702) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:39:48 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 111] Flushing memtable with next log file: 175
Jan 20 10:39:48 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923588475731, "job": 111, "event": "flush_started", "num_memtables": 1, "num_entries": 1262, "num_deletes": 255, "total_data_size": 2799260, "memory_usage": 2841936, "flush_reason": "Manual Compaction"}
Jan 20 10:39:48 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 111] Level-0 flush table #176: started
Jan 20 10:39:48 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923588514735, "cf_name": "default", "job": 111, "event": "table_file_creation", "file_number": 176, "file_size": 1825659, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 84625, "largest_seqno": 85882, "table_properties": {"data_size": 1820203, "index_size": 2851, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 11654, "raw_average_key_size": 19, "raw_value_size": 1809208, "raw_average_value_size": 3025, "num_data_blocks": 127, "num_entries": 598, "num_filter_entries": 598, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768923485, "oldest_key_time": 1768923485, "file_creation_time": 1768923588, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 176, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:39:48 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 111] Flush lasted 39097 microseconds, and 4456 cpu microseconds.
Jan 20 10:39:48 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:39:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:39:48.514796) [db/flush_job.cc:967] [default] [JOB 111] Level-0 flush table #176: 1825659 bytes OK
Jan 20 10:39:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:39:48.514812) [db/memtable_list.cc:519] [default] Level-0 commit table #176 started
Jan 20 10:39:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:39:48.516852) [db/memtable_list.cc:722] [default] Level-0 commit table #176: memtable #1 done
Jan 20 10:39:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:39:48.516902) EVENT_LOG_v1 {"time_micros": 1768923588516893, "job": 111, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:39:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:39:48.516927) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:39:48 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 111] Try to delete WAL files size 2793243, prev total WAL file size 2793243, number of live WAL files 2.
Jan 20 10:39:48 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000172.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:39:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:39:48.517882) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033323733' seq:72057594037927935, type:22 .. '6C6F676D0033353234' seq:0, type:0; will stop at (end)
Jan 20 10:39:48 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 112] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:39:48 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 111 Base level 0, inputs: [176(1782KB)], [174(10MB)]
Jan 20 10:39:48 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923588517915, "job": 112, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [176], "files_L6": [174], "score": -1, "input_data_size": 12699299, "oldest_snapshot_seqno": -1}
Jan 20 10:39:48 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 112] Generated table #177: 10482 keys, 12578277 bytes, temperature: kUnknown
Jan 20 10:39:48 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923588640944, "cf_name": "default", "job": 112, "event": "table_file_creation", "file_number": 177, "file_size": 12578277, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12512223, "index_size": 38741, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 26245, "raw_key_size": 277393, "raw_average_key_size": 26, "raw_value_size": 12330286, "raw_average_value_size": 1176, "num_data_blocks": 1471, "num_entries": 10482, "num_filter_entries": 10482, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768923588, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 177, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:39:48 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:39:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:39:48.641218) [db/compaction/compaction_job.cc:1663] [default] [JOB 112] Compacted 1@0 + 1@6 files to L6 => 12578277 bytes
Jan 20 10:39:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:39:48.643170) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 103.2 rd, 102.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 10.4 +0.0 blob) out(12.0 +0.0 blob), read-write-amplify(13.8) write-amplify(6.9) OK, records in: 11005, records dropped: 523 output_compression: NoCompression
Jan 20 10:39:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:39:48.643207) EVENT_LOG_v1 {"time_micros": 1768923588643182, "job": 112, "event": "compaction_finished", "compaction_time_micros": 123100, "compaction_time_cpu_micros": 31125, "output_level": 6, "num_output_files": 1, "total_output_size": 12578277, "num_input_records": 11005, "num_output_records": 10482, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:39:48 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000176.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:39:48 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923588643611, "job": 112, "event": "table_file_deletion", "file_number": 176}
Jan 20 10:39:48 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000174.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:39:48 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923588645655, "job": 112, "event": "table_file_deletion", "file_number": 174}
Jan 20 10:39:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:39:48.517804) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:39:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:39:48.645686) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:39:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:39:48.645690) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:39:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:39:48.645691) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:39:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:39:48.645693) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:39:48 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:39:48.645695) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:39:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:48.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:39:48.958 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '84'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:39:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:48.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:49 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:39:49 np0005588920 nova_compute[226886]: 2026-01-20 15:39:49.148 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:39:49 np0005588920 nova_compute[226886]: 2026-01-20 15:39:49.414 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:39:49 np0005588920 nova_compute[226886]: 2026-01-20 15:39:49.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:39:49 np0005588920 podman[311375]: 2026-01-20 15:39:49.968418704 +0000 UTC m=+0.059775717 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 20 10:39:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:39:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:50.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:39:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:50.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:52.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:52.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:54 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:39:54 np0005588920 nova_compute[226886]: 2026-01-20 15:39:54.150 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:39:54 np0005588920 nova_compute[226886]: 2026-01-20 15:39:54.415 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:39:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:39:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:54.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:39:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:54.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:39:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:56.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:39:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:56.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:58.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:39:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:59.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:59 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:39:59 np0005588920 nova_compute[226886]: 2026-01-20 15:39:59.192 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:39:59 np0005588920 nova_compute[226886]: 2026-01-20 15:39:59.416 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:00 np0005588920 nova_compute[226886]: 2026-01-20 15:40:00.752 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:40:00 np0005588920 nova_compute[226886]: 2026-01-20 15:40:00.753 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 20 10:40:00 np0005588920 nova_compute[226886]: 2026-01-20 15:40:00.778 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 20 10:40:00 np0005588920 ceph-mon[77148]: overall HEALTH_OK
Jan 20 10:40:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:40:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:00.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:40:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:40:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:01.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:40:02 np0005588920 nova_compute[226886]: 2026-01-20 15:40:02.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:40:02 np0005588920 nova_compute[226886]: 2026-01-20 15:40:02.726 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 20 10:40:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:40:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:40:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:02.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:40:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:40:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:40:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:03.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:40:04 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:40:04 np0005588920 nova_compute[226886]: 2026-01-20 15:40:04.241 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:04 np0005588920 nova_compute[226886]: 2026-01-20 15:40:04.419 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:40:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:04.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:40:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:40:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:05.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:40:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:40:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:06.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:40:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:40:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:07.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:40:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:40:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:08.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:40:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:40:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:09.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:40:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:40:09 np0005588920 nova_compute[226886]: 2026-01-20 15:40:09.244 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:09 np0005588920 nova_compute[226886]: 2026-01-20 15:40:09.421 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:40:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:10.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:40:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:11.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:11 np0005588920 podman[311396]: 2026-01-20 15:40:11.037109428 +0000 UTC m=+0.114529081 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:40:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:40:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:40:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:12.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:40:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:40:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:13.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:40:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4116331290' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:40:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:40:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4116331290' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:40:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:40:14 np0005588920 nova_compute[226886]: 2026-01-20 15:40:14.293 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:14 np0005588920 nova_compute[226886]: 2026-01-20 15:40:14.422 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:40:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:14.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:40:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:15.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:40:16.504 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:40:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:40:16.505 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:40:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:40:16.505 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:40:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:40:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:16.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:40:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:17.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:40:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:40:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:18.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:40:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:40:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:19.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:19 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:40:19 np0005588920 nova_compute[226886]: 2026-01-20 15:40:19.296 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:19 np0005588920 nova_compute[226886]: 2026-01-20 15:40:19.424 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:20 np0005588920 nova_compute[226886]: 2026-01-20 15:40:20.762 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:40:20 np0005588920 nova_compute[226886]: 2026-01-20 15:40:20.807 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:40:20 np0005588920 nova_compute[226886]: 2026-01-20 15:40:20.808 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:40:20 np0005588920 nova_compute[226886]: 2026-01-20 15:40:20.808 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:40:20 np0005588920 nova_compute[226886]: 2026-01-20 15:40:20.809 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:40:20 np0005588920 nova_compute[226886]: 2026-01-20 15:40:20.809 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:40:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:40:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:40:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:20.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:40:20 np0005588920 podman[311423]: 2026-01-20 15:40:20.982697698 +0000 UTC m=+0.061772054 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 20 10:40:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:40:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:21.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:21 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:40:21 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3019188951' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:40:21 np0005588920 nova_compute[226886]: 2026-01-20 15:40:21.300 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.490s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:40:21 np0005588920 nova_compute[226886]: 2026-01-20 15:40:21.456 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:40:21 np0005588920 nova_compute[226886]: 2026-01-20 15:40:21.458 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4149MB free_disk=20.94287872314453GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:40:21 np0005588920 nova_compute[226886]: 2026-01-20 15:40:21.458 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:40:21 np0005588920 nova_compute[226886]: 2026-01-20 15:40:21.458 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:40:21 np0005588920 nova_compute[226886]: 2026-01-20 15:40:21.636 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:40:21 np0005588920 nova_compute[226886]: 2026-01-20 15:40:21.637 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:40:21 np0005588920 nova_compute[226886]: 2026-01-20 15:40:21.653 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:40:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:40:22 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4221299174' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:40:22 np0005588920 nova_compute[226886]: 2026-01-20 15:40:22.200 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.547s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:40:22 np0005588920 nova_compute[226886]: 2026-01-20 15:40:22.206 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:40:22 np0005588920 nova_compute[226886]: 2026-01-20 15:40:22.223 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:40:22 np0005588920 nova_compute[226886]: 2026-01-20 15:40:22.225 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:40:22 np0005588920 nova_compute[226886]: 2026-01-20 15:40:22.225 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.767s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:40:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:40:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:22.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:40:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:40:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:23.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:40:24 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:40:24 np0005588920 nova_compute[226886]: 2026-01-20 15:40:24.336 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:24 np0005588920 nova_compute[226886]: 2026-01-20 15:40:24.426 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:40:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:40:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:24.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:40:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:40:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:25.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:26 np0005588920 nova_compute[226886]: 2026-01-20 15:40:26.188 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:40:26 np0005588920 nova_compute[226886]: 2026-01-20 15:40:26.189 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:40:26 np0005588920 nova_compute[226886]: 2026-01-20 15:40:26.189 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:40:26 np0005588920 nova_compute[226886]: 2026-01-20 15:40:26.205 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:40:26 np0005588920 nova_compute[226886]: 2026-01-20 15:40:26.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:40:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:40:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:26.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:40:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:27.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:40:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:40:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:28.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:40:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:40:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:40:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:29.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:40:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:40:29 np0005588920 nova_compute[226886]: 2026-01-20 15:40:29.339 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:29 np0005588920 nova_compute[226886]: 2026-01-20 15:40:29.428 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:29 np0005588920 nova_compute[226886]: 2026-01-20 15:40:29.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:40:30 np0005588920 nova_compute[226886]: 2026-01-20 15:40:30.721 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:40:30 np0005588920 nova_compute[226886]: 2026-01-20 15:40:30.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:40:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:40:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:30.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:40:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:31.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:31 np0005588920 nova_compute[226886]: 2026-01-20 15:40:31.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:40:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:40:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:32.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:40:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:33.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:34 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:40:34 np0005588920 nova_compute[226886]: 2026-01-20 15:40:34.343 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:34 np0005588920 nova_compute[226886]: 2026-01-20 15:40:34.432 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:40:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:40:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:35.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:40:35 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:35.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:40:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:40:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:37.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:40:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:37.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:38 np0005588920 nova_compute[226886]: 2026-01-20 15:40:38.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:40:38 np0005588920 nova_compute[226886]: 2026-01-20 15:40:38.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:40:39 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:40:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:40:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:39.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:40:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:39.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:39 np0005588920 nova_compute[226886]: 2026-01-20 15:40:39.346 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:39 np0005588920 nova_compute[226886]: 2026-01-20 15:40:39.435 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:39 np0005588920 nova_compute[226886]: 2026-01-20 15:40:39.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:40:41 np0005588920 podman[311631]: 2026-01-20 15:40:41.162139081 +0000 UTC m=+0.085355002 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:40:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:40:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:40:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:40:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:41.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:40:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:41 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:41.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:41 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:40:41 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:40:41 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:40:41 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:40:41 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 20 10:40:41 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 20 10:40:41 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:40:41 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:40:41 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:40:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:40:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:40:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:40:43 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:43.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:40:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:40:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:43.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:40:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:40:44 np0005588920 nova_compute[226886]: 2026-01-20 15:40:44.349 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:44 np0005588920 nova_compute[226886]: 2026-01-20 15:40:44.436 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:40:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:45.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:40:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:45.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:40:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:47.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:40:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:47.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:48 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:40:48 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:40:49 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:40:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:40:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:49.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:40:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:49.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:49 np0005588920 nova_compute[226886]: 2026-01-20 15:40:49.353 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:49 np0005588920 nova_compute[226886]: 2026-01-20 15:40:49.438 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:40:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:51.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:40:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:51 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:51.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:51 np0005588920 nova_compute[226886]: 2026-01-20 15:40:51.349 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:51 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:40:51.349 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=85, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=84) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:40:51 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:40:51.350 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:40:51 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:40:51.350 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '85'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:40:51 np0005588920 podman[311813]: 2026-01-20 15:40:51.958869635 +0000 UTC m=+0.047176060 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 20 10:40:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:40:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:40:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:53.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:53 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:53.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:54 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:40:54 np0005588920 nova_compute[226886]: 2026-01-20 15:40:54.413 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:54 np0005588920 nova_compute[226886]: 2026-01-20 15:40:54.440 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:54 np0005588920 nova_compute[226886]: 2026-01-20 15:40:54.997 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:40:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:40:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:40:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:55.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:55 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:55.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:40:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:40:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:57.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:40:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:40:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:40:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:57.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:40:59 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:40:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:40:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:59.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:40:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:59 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:59.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:59 np0005588920 nova_compute[226886]: 2026-01-20 15:40:59.416 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:59 np0005588920 nova_compute[226886]: 2026-01-20 15:40:59.442 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:41:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:01.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:01.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:41:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:03.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:03 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:03.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:04 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:41:04 np0005588920 nova_compute[226886]: 2026-01-20 15:41:04.419 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:41:04 np0005588920 nova_compute[226886]: 2026-01-20 15:41:04.443 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:41:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:41:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:41:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:05.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:41:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:05 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:05.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:41:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:41:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:07.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:41:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:07 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:07.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:41:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:41:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:09.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:09 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:09.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:09 np0005588920 nova_compute[226886]: 2026-01-20 15:41:09.445 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:41:09 np0005588920 nova_compute[226886]: 2026-01-20 15:41:09.447 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:41:09 np0005588920 nova_compute[226886]: 2026-01-20 15:41:09.447 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 20 10:41:09 np0005588920 nova_compute[226886]: 2026-01-20 15:41:09.447 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 20 10:41:09 np0005588920 nova_compute[226886]: 2026-01-20 15:41:09.466 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:41:09 np0005588920 nova_compute[226886]: 2026-01-20 15:41:09.466 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 20 10:41:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:41:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:11.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:41:11 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:11.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:41:11 np0005588920 podman[311832]: 2026-01-20 15:41:11.991342227 +0000 UTC m=+0.078906880 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, 
managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 20 10:41:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:41:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:13 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:13.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:13.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:41:14 np0005588920 nova_compute[226886]: 2026-01-20 15:41:14.467 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:41:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:41:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:15.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:41:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:41:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:15.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:41:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:41:16.505 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:41:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:41:16.505 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:41:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:41:16.506 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:41:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:17.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:17.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:19 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:41:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:41:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:19.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:41:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:19.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:19 np0005588920 nova_compute[226886]: 2026-01-20 15:41:19.469 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:41:20 np0005588920 nova_compute[226886]: 2026-01-20 15:41:20.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:41:20 np0005588920 nova_compute[226886]: 2026-01-20 15:41:20.758 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:41:20 np0005588920 nova_compute[226886]: 2026-01-20 15:41:20.759 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:41:20 np0005588920 nova_compute[226886]: 2026-01-20 15:41:20.759 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:41:20 np0005588920 nova_compute[226886]: 2026-01-20 15:41:20.759 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:41:20 np0005588920 nova_compute[226886]: 2026-01-20 15:41:20.760 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:41:21 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:41:21 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/552745251' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:41:21 np0005588920 nova_compute[226886]: 2026-01-20 15:41:21.205 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:41:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:21.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:21.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:21 np0005588920 nova_compute[226886]: 2026-01-20 15:41:21.366 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:41:21 np0005588920 nova_compute[226886]: 2026-01-20 15:41:21.367 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4132MB free_disk=20.916587829589844GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:41:21 np0005588920 nova_compute[226886]: 2026-01-20 15:41:21.367 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:41:21 np0005588920 nova_compute[226886]: 2026-01-20 15:41:21.367 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:41:21 np0005588920 nova_compute[226886]: 2026-01-20 15:41:21.443 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:41:21 np0005588920 nova_compute[226886]: 2026-01-20 15:41:21.444 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:41:21 np0005588920 nova_compute[226886]: 2026-01-20 15:41:21.464 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:41:21 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:41:21 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1289347087' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:41:21 np0005588920 nova_compute[226886]: 2026-01-20 15:41:21.938 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:41:21 np0005588920 nova_compute[226886]: 2026-01-20 15:41:21.944 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:41:21 np0005588920 nova_compute[226886]: 2026-01-20 15:41:21.967 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:41:21 np0005588920 nova_compute[226886]: 2026-01-20 15:41:21.969 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:41:21 np0005588920 nova_compute[226886]: 2026-01-20 15:41:21.969 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.602s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:41:22 np0005588920 podman[311904]: 2026-01-20 15:41:22.973963934 +0000 UTC m=+0.052055198 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 20 10:41:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:23.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:23.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:24 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:41:24 np0005588920 nova_compute[226886]: 2026-01-20 15:41:24.472 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:41:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:25.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:25.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:25 np0005588920 nova_compute[226886]: 2026-01-20 15:41:25.969 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:41:25 np0005588920 nova_compute[226886]: 2026-01-20 15:41:25.970 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:41:25 np0005588920 nova_compute[226886]: 2026-01-20 15:41:25.970 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:41:25 np0005588920 nova_compute[226886]: 2026-01-20 15:41:25.994 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:41:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:27.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:27.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:28 np0005588920 nova_compute[226886]: 2026-01-20 15:41:28.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:41:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:41:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:29.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:29.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:29 np0005588920 nova_compute[226886]: 2026-01-20 15:41:29.473 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:41:29 np0005588920 nova_compute[226886]: 2026-01-20 15:41:29.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:41:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:31.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:31.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:31 np0005588920 nova_compute[226886]: 2026-01-20 15:41:31.722 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:41:31 np0005588920 nova_compute[226886]: 2026-01-20 15:41:31.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:41:31 np0005588920 nova_compute[226886]: 2026-01-20 15:41:31.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:41:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:33.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:33.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:34 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:41:34 np0005588920 nova_compute[226886]: 2026-01-20 15:41:34.476 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:41:34 np0005588920 nova_compute[226886]: 2026-01-20 15:41:34.478 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:41:34 np0005588920 nova_compute[226886]: 2026-01-20 15:41:34.478 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 20 10:41:34 np0005588920 nova_compute[226886]: 2026-01-20 15:41:34.478 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 20 10:41:34 np0005588920 nova_compute[226886]: 2026-01-20 15:41:34.513 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:41:34 np0005588920 nova_compute[226886]: 2026-01-20 15:41:34.514 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 20 10:41:34 np0005588920 nova_compute[226886]: 2026-01-20 15:41:34.721 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:41:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:41:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:35.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:41:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:35.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:37.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:37.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:39 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:41:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:41:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:39.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:41:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:39.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:39 np0005588920 nova_compute[226886]: 2026-01-20 15:41:39.513 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:41:39 np0005588920 nova_compute[226886]: 2026-01-20 15:41:39.517 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:41:39 np0005588920 nova_compute[226886]: 2026-01-20 15:41:39.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:41:39 np0005588920 nova_compute[226886]: 2026-01-20 15:41:39.726 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:41:40 np0005588920 nova_compute[226886]: 2026-01-20 15:41:40.727 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:41:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:41.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:41.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:41 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #178. Immutable memtables: 0.
Jan 20 10:41:41 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:41:41.519584) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:41:41 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 113] Flushing memtable with next log file: 178
Jan 20 10:41:41 np0005588920 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 20 10:41:41 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923701519614, "job": 113, "event": "flush_started", "num_memtables": 1, "num_entries": 1347, "num_deletes": 251, "total_data_size": 3039612, "memory_usage": 3068112, "flush_reason": "Manual Compaction"}
Jan 20 10:41:41 np0005588920 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 20 10:41:41 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 113] Level-0 flush table #179: started
Jan 20 10:41:41 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923701529957, "cf_name": "default", "job": 113, "event": "table_file_creation", "file_number": 179, "file_size": 2006340, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 85887, "largest_seqno": 87229, "table_properties": {"data_size": 2000478, "index_size": 3192, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1605, "raw_key_size": 12603, "raw_average_key_size": 20, "raw_value_size": 1988788, "raw_average_value_size": 3176, "num_data_blocks": 139, "num_entries": 626, "num_filter_entries": 626, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768923588, "oldest_key_time": 1768923588, "file_creation_time": 1768923701, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 179, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:41:41 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 113] Flush lasted 10409 microseconds, and 4514 cpu microseconds.
Jan 20 10:41:41 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:41:41 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:41:41.529992) [db/flush_job.cc:967] [default] [JOB 113] Level-0 flush table #179: 2006340 bytes OK
Jan 20 10:41:41 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:41:41.530006) [db/memtable_list.cc:519] [default] Level-0 commit table #179 started
Jan 20 10:41:41 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:41:41.531429) [db/memtable_list.cc:722] [default] Level-0 commit table #179: memtable #1 done
Jan 20 10:41:41 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:41:41.531439) EVENT_LOG_v1 {"time_micros": 1768923701531436, "job": 113, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:41:41 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:41:41.531453) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:41:41 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 113] Try to delete WAL files size 3033280, prev total WAL file size 3033280, number of live WAL files 2.
Jan 20 10:41:41 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000175.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:41:41 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:41:41.532087) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730037353330' seq:72057594037927935, type:22 .. '7061786F730037373832' seq:0, type:0; will stop at (end)
Jan 20 10:41:41 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 114] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:41:41 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 113 Base level 0, inputs: [179(1959KB)], [177(11MB)]
Jan 20 10:41:41 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923701532112, "job": 114, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [179], "files_L6": [177], "score": -1, "input_data_size": 14584617, "oldest_snapshot_seqno": -1}
Jan 20 10:41:41 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 114] Generated table #180: 10589 keys, 12624631 bytes, temperature: kUnknown
Jan 20 10:41:41 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923701595494, "cf_name": "default", "job": 114, "event": "table_file_creation", "file_number": 180, "file_size": 12624631, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12557841, "index_size": 39221, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 26501, "raw_key_size": 280318, "raw_average_key_size": 26, "raw_value_size": 12374026, "raw_average_value_size": 1168, "num_data_blocks": 1485, "num_entries": 10589, "num_filter_entries": 10589, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768923701, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 180, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:41:41 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:41:41 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:41:41.595788) [db/compaction/compaction_job.cc:1663] [default] [JOB 114] Compacted 1@0 + 1@6 files to L6 => 12624631 bytes
Jan 20 10:41:41 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:41:41.616432) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 229.7 rd, 198.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.9, 12.0 +0.0 blob) out(12.0 +0.0 blob), read-write-amplify(13.6) write-amplify(6.3) OK, records in: 11108, records dropped: 519 output_compression: NoCompression
Jan 20 10:41:41 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:41:41.616462) EVENT_LOG_v1 {"time_micros": 1768923701616450, "job": 114, "event": "compaction_finished", "compaction_time_micros": 63497, "compaction_time_cpu_micros": 28388, "output_level": 6, "num_output_files": 1, "total_output_size": 12624631, "num_input_records": 11108, "num_output_records": 10589, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:41:41 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000179.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:41:41 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923701616846, "job": 114, "event": "table_file_deletion", "file_number": 179}
Jan 20 10:41:41 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000177.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:41:41 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923701619018, "job": 114, "event": "table_file_deletion", "file_number": 177}
Jan 20 10:41:41 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:41:41.532058) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:41:41 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:41:41.619045) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:41:41 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:41:41.619048) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:41:41 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:41:41.619050) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:41:41 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:41:41.619051) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:41:41 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:41:41.619052) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:41:42 np0005588920 podman[311925]: 2026-01-20 15:41:42.997088171 +0000 UTC m=+0.085702503 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:41:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:43.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:43.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:41:44 np0005588920 nova_compute[226886]: 2026-01-20 15:41:44.518 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:41:44 np0005588920 nova_compute[226886]: 2026-01-20 15:41:44.519 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:41:44 np0005588920 nova_compute[226886]: 2026-01-20 15:41:44.520 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 20 10:41:44 np0005588920 nova_compute[226886]: 2026-01-20 15:41:44.520 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 20 10:41:44 np0005588920 nova_compute[226886]: 2026-01-20 15:41:44.520 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 20 10:41:44 np0005588920 nova_compute[226886]: 2026-01-20 15:41:44.521 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:41:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:45.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:45.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:41:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:47.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:41:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:41:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:47.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:41:49 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:41:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:41:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:49.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:41:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:49.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:49 np0005588920 nova_compute[226886]: 2026-01-20 15:41:49.522 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:41:49 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 20 10:41:49 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:41:49 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:41:49 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:41:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:41:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:51.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:41:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:51.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:53.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:41:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:53.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:41:54 np0005588920 podman[312083]: 2026-01-20 15:41:54.015026779 +0000 UTC m=+0.091707073 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:41:54 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:41:54 np0005588920 nova_compute[226886]: 2026-01-20 15:41:54.523 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:41:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:41:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:55.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:41:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:55.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:55 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:41:55 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:41:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:41:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:57.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:41:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:57.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:57 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:41:57.543 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=86, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=85) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:41:57 np0005588920 nova_compute[226886]: 2026-01-20 15:41:57.544 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:41:57 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:41:57.545 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:41:57 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:41:57.546 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '86'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:41:59 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:41:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:41:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:59.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:41:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:41:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:59.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:59 np0005588920 nova_compute[226886]: 2026-01-20 15:41:59.526 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:42:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:42:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:01.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:42:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:01.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:42:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:03.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:42:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:03.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:04 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:42:04 np0005588920 nova_compute[226886]: 2026-01-20 15:42:04.527 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:42:04 np0005588920 nova_compute[226886]: 2026-01-20 15:42:04.529 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:42:04 np0005588920 nova_compute[226886]: 2026-01-20 15:42:04.529 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 20 10:42:04 np0005588920 nova_compute[226886]: 2026-01-20 15:42:04.529 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 20 10:42:04 np0005588920 nova_compute[226886]: 2026-01-20 15:42:04.558 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:42:04 np0005588920 nova_compute[226886]: 2026-01-20 15:42:04.559 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 20 10:42:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:42:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:05.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:42:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:42:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:05.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:42:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:42:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:07.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:42:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:42:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:07.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:42:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:42:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:42:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:09.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:42:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:42:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:09.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:42:09 np0005588920 nova_compute[226886]: 2026-01-20 15:42:09.560 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:42:09 np0005588920 nova_compute[226886]: 2026-01-20 15:42:09.560 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:42:09 np0005588920 nova_compute[226886]: 2026-01-20 15:42:09.560 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 20 10:42:09 np0005588920 nova_compute[226886]: 2026-01-20 15:42:09.561 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 20 10:42:09 np0005588920 nova_compute[226886]: 2026-01-20 15:42:09.561 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 20 10:42:09 np0005588920 nova_compute[226886]: 2026-01-20 15:42:09.562 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:42:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:42:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:11.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:42:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:11.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:42:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:13.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:42:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:13.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:14 np0005588920 podman[312154]: 2026-01-20 15:42:14.024771625 +0000 UTC m=+0.116579579 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:42:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:42:14 np0005588920 nova_compute[226886]: 2026-01-20 15:42:14.561 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:42:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:42:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:15.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:42:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:42:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:15.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:42:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:42:16.506 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:42:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:42:16.507 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:42:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:42:16.507 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:42:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:42:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:17.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:42:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:17.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:19 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:42:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:42:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:19.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:42:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:42:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:19.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:42:19 np0005588920 nova_compute[226886]: 2026-01-20 15:42:19.564 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:42:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:42:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:21.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:42:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:21.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:22 np0005588920 nova_compute[226886]: 2026-01-20 15:42:22.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:42:22 np0005588920 nova_compute[226886]: 2026-01-20 15:42:22.850 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:42:22 np0005588920 nova_compute[226886]: 2026-01-20 15:42:22.851 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:42:22 np0005588920 nova_compute[226886]: 2026-01-20 15:42:22.851 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:42:22 np0005588920 nova_compute[226886]: 2026-01-20 15:42:22.852 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:42:22 np0005588920 nova_compute[226886]: 2026-01-20 15:42:22.852 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:42:23 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:42:23 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1871014464' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:42:23 np0005588920 nova_compute[226886]: 2026-01-20 15:42:23.287 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:42:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:42:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:23.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:42:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:42:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:23.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:42:23 np0005588920 nova_compute[226886]: 2026-01-20 15:42:23.486 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:42:23 np0005588920 nova_compute[226886]: 2026-01-20 15:42:23.487 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4135MB free_disk=20.942916870117188GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:42:23 np0005588920 nova_compute[226886]: 2026-01-20 15:42:23.488 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:42:23 np0005588920 nova_compute[226886]: 2026-01-20 15:42:23.488 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:42:23 np0005588920 nova_compute[226886]: 2026-01-20 15:42:23.653 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:42:23 np0005588920 nova_compute[226886]: 2026-01-20 15:42:23.654 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:42:23 np0005588920 nova_compute[226886]: 2026-01-20 15:42:23.723 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:42:24 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:42:24 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:42:24 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1603594978' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:42:24 np0005588920 nova_compute[226886]: 2026-01-20 15:42:24.187 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:42:24 np0005588920 nova_compute[226886]: 2026-01-20 15:42:24.195 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:42:24 np0005588920 nova_compute[226886]: 2026-01-20 15:42:24.218 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:42:24 np0005588920 nova_compute[226886]: 2026-01-20 15:42:24.220 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:42:24 np0005588920 nova_compute[226886]: 2026-01-20 15:42:24.220 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.732s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:42:24 np0005588920 nova_compute[226886]: 2026-01-20 15:42:24.565 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:42:24 np0005588920 podman[312225]: 2026-01-20 15:42:24.961699286 +0000 UTC m=+0.051456511 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_managed=true)
Jan 20 10:42:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:42:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:42:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:42:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:25.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:42:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:25 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:25.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:26 np0005588920 nova_compute[226886]: 2026-01-20 15:42:26.220 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:42:26 np0005588920 nova_compute[226886]: 2026-01-20 15:42:26.221 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:42:26 np0005588920 nova_compute[226886]: 2026-01-20 15:42:26.222 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:42:26 np0005588920 nova_compute[226886]: 2026-01-20 15:42:26.549 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:42:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:42:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:42:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:27.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:42:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:42:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:27.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:28 np0005588920 nova_compute[226886]: 2026-01-20 15:42:28.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:42:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:42:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:42:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:42:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:42:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:29.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:42:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:29 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:29.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:29 np0005588920 nova_compute[226886]: 2026-01-20 15:42:29.568 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:42:29 np0005588920 nova_compute[226886]: 2026-01-20 15:42:29.569 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:42:29 np0005588920 nova_compute[226886]: 2026-01-20 15:42:29.569 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 20 10:42:29 np0005588920 nova_compute[226886]: 2026-01-20 15:42:29.570 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 20 10:42:29 np0005588920 nova_compute[226886]: 2026-01-20 15:42:29.570 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 20 10:42:29 np0005588920 nova_compute[226886]: 2026-01-20 15:42:29.572 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:42:29 np0005588920 nova_compute[226886]: 2026-01-20 15:42:29.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:42:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:42:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:42:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:31 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:31.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:31.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:31 np0005588920 nova_compute[226886]: 2026-01-20 15:42:31.720 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:42:31 np0005588920 nova_compute[226886]: 2026-01-20 15:42:31.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:42:32 np0005588920 nova_compute[226886]: 2026-01-20 15:42:32.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:42:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:42:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:33.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:42:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:42:33 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:33.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:42:34 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:42:34 np0005588920 nova_compute[226886]: 2026-01-20 15:42:34.572 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:42:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:42:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:35.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:42:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:35.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:42:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:42:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:37.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:37 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:37.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:39 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:42:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:42:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:42:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:42:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:39.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:42:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:39 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:39.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:39 np0005588920 nova_compute[226886]: 2026-01-20 15:42:39.574 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:42:39 np0005588920 nova_compute[226886]: 2026-01-20 15:42:39.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:42:39 np0005588920 nova_compute[226886]: 2026-01-20 15:42:39.724 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:42:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:42:39.792 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=87, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=86) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:42:39 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:42:39.792 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:42:39 np0005588920 nova_compute[226886]: 2026-01-20 15:42:39.793 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:42:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:42:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:41.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:42:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:42:41 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:41.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:42:41 np0005588920 nova_compute[226886]: 2026-01-20 15:42:41.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:42:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:42:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:42:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:42:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:43.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:42:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:42:43 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:43.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:42:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:42:44 np0005588920 nova_compute[226886]: 2026-01-20 15:42:44.575 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:42:44 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:42:44.795 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '87'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:42:44 np0005588920 podman[312246]: 2026-01-20 15:42:44.990219265 +0000 UTC m=+0.079695662 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 
9 Base Image, org.label-schema.schema-version=1.0)
Jan 20 10:42:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:42:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:45.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:42:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:45.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:42:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:47.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:42:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:47.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:49 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:42:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:42:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:49.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:42:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:49.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:49 np0005588920 nova_compute[226886]: 2026-01-20 15:42:49.577 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:42:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:42:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:51.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:42:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:51.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:42:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:42:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:53.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:42:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:42:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:42:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:53.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:42:54 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:42:54 np0005588920 nova_compute[226886]: 2026-01-20 15:42:54.578 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:42:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:42:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:42:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:55.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:42:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:42:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:55.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:55 np0005588920 podman[312297]: 2026-01-20 15:42:55.506314686 +0000 UTC m=+0.048899969 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 20 10:42:56 np0005588920 podman[312463]: 2026-01-20 15:42:56.110315174 +0000 UTC m=+0.056008910 container exec 6d4eaf8659f13902200468a4ae46f22a0824452b9353ff1491898f5d3b0130c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mon-compute-2, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef)
Jan 20 10:42:56 np0005588920 podman[312463]: 2026-01-20 15:42:56.201694477 +0000 UTC m=+0.147388193 container exec_died 6d4eaf8659f13902200468a4ae46f22a0824452b9353ff1491898f5d3b0130c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mon-compute-2, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, CEPH_REF=reef, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 20 10:42:56 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:42:56 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:42:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:42:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:57.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:42:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:42:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:57.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:42:58 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:42:58 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:42:58 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:42:58 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:42:58 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:42:58 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:42:58 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:42:59 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:42:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:42:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:42:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:59.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:42:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:42:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:59 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:59.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:59 np0005588920 nova_compute[226886]: 2026-01-20 15:42:59.581 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:43:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:43:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:01.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:43:01 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:01.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:43:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:43:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:43:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:03.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:03 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:03.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:04 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:43:04 np0005588920 nova_compute[226886]: 2026-01-20 15:43:04.584 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:43:04 np0005588920 nova_compute[226886]: 2026-01-20 15:43:04.585 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:04 np0005588920 nova_compute[226886]: 2026-01-20 15:43:04.585 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 20 10:43:04 np0005588920 nova_compute[226886]: 2026-01-20 15:43:04.585 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 20 10:43:04 np0005588920 nova_compute[226886]: 2026-01-20 15:43:04.586 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 20 10:43:04 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:43:04 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:43:04 np0005588920 nova_compute[226886]: 2026-01-20 15:43:04.588 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:43:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:05.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:43:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:43:05 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:05.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:43:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:43:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:43:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:07.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:43:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:43:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:43:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:07.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:43:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:43:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:43:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:09.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:43:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:09.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:09 np0005588920 nova_compute[226886]: 2026-01-20 15:43:09.586 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:43:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:43:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:11.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:11 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:11.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:43:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:13.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:43:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:13.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:43:14 np0005588920 nova_compute[226886]: 2026-01-20 15:43:14.587 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:43:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:15.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:43:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:15.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:15 np0005588920 podman[312767]: 2026-01-20 15:43:15.994500425 +0000 UTC m=+0.081316520 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.build-date=20251202)
Jan 20 10:43:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:43:16.508 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:43:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:43:16.508 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:43:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:43:16.508 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:43:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:43:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:43:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:17.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:43:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:43:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:17.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:19 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:43:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:43:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:43:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:19.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:19 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:19.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:19 np0005588920 nova_compute[226886]: 2026-01-20 15:43:19.590 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:43:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:43:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:21.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:43:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:43:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:43:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:21.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:43:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:43:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:23.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:43:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:23.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:23 np0005588920 nova_compute[226886]: 2026-01-20 15:43:23.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:43:23 np0005588920 nova_compute[226886]: 2026-01-20 15:43:23.749 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:43:23 np0005588920 nova_compute[226886]: 2026-01-20 15:43:23.749 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:43:23 np0005588920 nova_compute[226886]: 2026-01-20 15:43:23.749 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:43:23 np0005588920 nova_compute[226886]: 2026-01-20 15:43:23.749 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:43:23 np0005588920 nova_compute[226886]: 2026-01-20 15:43:23.750 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:43:24 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:43:24 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:43:24 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1142949718' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:43:24 np0005588920 nova_compute[226886]: 2026-01-20 15:43:24.165 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.416s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:43:24 np0005588920 nova_compute[226886]: 2026-01-20 15:43:24.299 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:43:24 np0005588920 nova_compute[226886]: 2026-01-20 15:43:24.301 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4120MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:43:24 np0005588920 nova_compute[226886]: 2026-01-20 15:43:24.301 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:43:24 np0005588920 nova_compute[226886]: 2026-01-20 15:43:24.301 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:43:24 np0005588920 nova_compute[226886]: 2026-01-20 15:43:24.513 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:43:24 np0005588920 nova_compute[226886]: 2026-01-20 15:43:24.514 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:43:24 np0005588920 nova_compute[226886]: 2026-01-20 15:43:24.592 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:43:24 np0005588920 nova_compute[226886]: 2026-01-20 15:43:24.624 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:25 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:43:25 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/578300199' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:43:25 np0005588920 nova_compute[226886]: 2026-01-20 15:43:25.114 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.523s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:43:25 np0005588920 nova_compute[226886]: 2026-01-20 15:43:25.120 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:43:25 np0005588920 nova_compute[226886]: 2026-01-20 15:43:25.146 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:43:25 np0005588920 nova_compute[226886]: 2026-01-20 15:43:25.147 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:43:25 np0005588920 nova_compute[226886]: 2026-01-20 15:43:25.148 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.847s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:43:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:43:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:43:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:25.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:25 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:25.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:25 np0005588920 ceph-mgr[77507]: client.0 ms_handle_reset on v2:192.168.122.100:6800/2542147622
Jan 20 10:43:26 np0005588920 podman[312840]: 2026-01-20 15:43:26.019076527 +0000 UTC m=+0.095722121 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Jan 20 10:43:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:43:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:27.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:43:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:27 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:27.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:28 np0005588920 nova_compute[226886]: 2026-01-20 15:43:28.148 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:43:28 np0005588920 nova_compute[226886]: 2026-01-20 15:43:28.148 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:43:28 np0005588920 nova_compute[226886]: 2026-01-20 15:43:28.148 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:43:28 np0005588920 nova_compute[226886]: 2026-01-20 15:43:28.181 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:43:28 np0005588920 nova_compute[226886]: 2026-01-20 15:43:28.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:43:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:43:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:43:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:43:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:29 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:29.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:29.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:29 np0005588920 nova_compute[226886]: 2026-01-20 15:43:29.595 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:43:29 np0005588920 nova_compute[226886]: 2026-01-20 15:43:29.627 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:43:30 np0005588920 nova_compute[226886]: 2026-01-20 15:43:30.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:43:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:43:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:43:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:31 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:31.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:43:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:31.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:43:31 np0005588920 nova_compute[226886]: 2026-01-20 15:43:31.721 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:43:32 np0005588920 nova_compute[226886]: 2026-01-20 15:43:32.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:43:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:43:33.221 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=88, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=87) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 20 10:43:33 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:43:33.222 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 20 10:43:33 np0005588920 nova_compute[226886]: 2026-01-20 15:43:33.222 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:43:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:43:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:33.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:43:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:33.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:34 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:43:34 np0005588920 nova_compute[226886]: 2026-01-20 15:43:34.597 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:43:34 np0005588920 nova_compute[226886]: 2026-01-20 15:43:34.628 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:43:34 np0005588920 nova_compute[226886]: 2026-01-20 15:43:34.720 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:43:34 np0005588920 nova_compute[226886]: 2026-01-20 15:43:34.736 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:43:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:43:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:43:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:35.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:35 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:35.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:37 np0005588920 nova_compute[226886]: 2026-01-20 15:43:37.459 226890 DEBUG oslo_concurrency.lockutils [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Acquiring lock "0dd4c943-fb9b-42db-93ef-7199a7deaf1d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:43:37 np0005588920 nova_compute[226886]: 2026-01-20 15:43:37.460 226890 DEBUG oslo_concurrency.lockutils [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "0dd4c943-fb9b-42db-93ef-7199a7deaf1d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:43:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:43:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:43:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:37.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:37 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:37.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:37 np0005588920 nova_compute[226886]: 2026-01-20 15:43:37.477 226890 DEBUG nova.compute.manager [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 20 10:43:37 np0005588920 nova_compute[226886]: 2026-01-20 15:43:37.587 226890 DEBUG oslo_concurrency.lockutils [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:43:37 np0005588920 nova_compute[226886]: 2026-01-20 15:43:37.588 226890 DEBUG oslo_concurrency.lockutils [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:43:37 np0005588920 nova_compute[226886]: 2026-01-20 15:43:37.593 226890 DEBUG nova.virt.hardware [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 20 10:43:37 np0005588920 nova_compute[226886]: 2026-01-20 15:43:37.594 226890 INFO nova.compute.claims [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Claim successful on node compute-2.ctlplane.example.com
Jan 20 10:43:37 np0005588920 nova_compute[226886]: 2026-01-20 15:43:37.680 226890 DEBUG oslo_concurrency.processutils [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:43:38 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:43:38 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1438658562' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:43:38 np0005588920 nova_compute[226886]: 2026-01-20 15:43:38.103 226890 DEBUG oslo_concurrency.processutils [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:43:38 np0005588920 nova_compute[226886]: 2026-01-20 15:43:38.109 226890 DEBUG nova.compute.provider_tree [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 20 10:43:38 np0005588920 nova_compute[226886]: 2026-01-20 15:43:38.132 226890 DEBUG nova.scheduler.client.report [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 20 10:43:38 np0005588920 nova_compute[226886]: 2026-01-20 15:43:38.153 226890 DEBUG oslo_concurrency.lockutils [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.565s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:43:38 np0005588920 nova_compute[226886]: 2026-01-20 15:43:38.154 226890 DEBUG nova.compute.manager [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 20 10:43:38 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:43:38.224 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '88'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 10:43:38 np0005588920 nova_compute[226886]: 2026-01-20 15:43:38.257 226890 DEBUG nova.compute.manager [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 20 10:43:38 np0005588920 nova_compute[226886]: 2026-01-20 15:43:38.257 226890 DEBUG nova.network.neutron [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 20 10:43:38 np0005588920 nova_compute[226886]: 2026-01-20 15:43:38.277 226890 INFO nova.virt.libvirt.driver [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 20 10:43:38 np0005588920 nova_compute[226886]: 2026-01-20 15:43:38.295 226890 DEBUG nova.compute.manager [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 20 10:43:38 np0005588920 nova_compute[226886]: 2026-01-20 15:43:38.387 226890 DEBUG nova.compute.manager [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 20 10:43:38 np0005588920 nova_compute[226886]: 2026-01-20 15:43:38.389 226890 DEBUG nova.virt.libvirt.driver [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 20 10:43:38 np0005588920 nova_compute[226886]: 2026-01-20 15:43:38.389 226890 INFO nova.virt.libvirt.driver [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Creating image(s)
Jan 20 10:43:38 np0005588920 nova_compute[226886]: 2026-01-20 15:43:38.417 226890 DEBUG nova.storage.rbd_utils [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] rbd image 0dd4c943-fb9b-42db-93ef-7199a7deaf1d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:43:38 np0005588920 nova_compute[226886]: 2026-01-20 15:43:38.441 226890 DEBUG nova.storage.rbd_utils [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] rbd image 0dd4c943-fb9b-42db-93ef-7199a7deaf1d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:43:38 np0005588920 nova_compute[226886]: 2026-01-20 15:43:38.465 226890 DEBUG nova.storage.rbd_utils [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] rbd image 0dd4c943-fb9b-42db-93ef-7199a7deaf1d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:43:38 np0005588920 nova_compute[226886]: 2026-01-20 15:43:38.469 226890 DEBUG oslo_concurrency.processutils [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:43:38 np0005588920 nova_compute[226886]: 2026-01-20 15:43:38.531 226890 DEBUG oslo_concurrency.processutils [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:43:38 np0005588920 nova_compute[226886]: 2026-01-20 15:43:38.532 226890 DEBUG oslo_concurrency.lockutils [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:43:38 np0005588920 nova_compute[226886]: 2026-01-20 15:43:38.533 226890 DEBUG oslo_concurrency.lockutils [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:43:38 np0005588920 nova_compute[226886]: 2026-01-20 15:43:38.533 226890 DEBUG oslo_concurrency.lockutils [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:43:38 np0005588920 nova_compute[226886]: 2026-01-20 15:43:38.560 226890 DEBUG nova.storage.rbd_utils [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] rbd image 0dd4c943-fb9b-42db-93ef-7199a7deaf1d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:43:38 np0005588920 nova_compute[226886]: 2026-01-20 15:43:38.564 226890 DEBUG oslo_concurrency.processutils [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 0dd4c943-fb9b-42db-93ef-7199a7deaf1d_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:43:38 np0005588920 nova_compute[226886]: 2026-01-20 15:43:38.893 226890 DEBUG oslo_concurrency.processutils [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 0dd4c943-fb9b-42db-93ef-7199a7deaf1d_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.329s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:43:38 np0005588920 nova_compute[226886]: 2026-01-20 15:43:38.968 226890 DEBUG nova.storage.rbd_utils [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] resizing rbd image 0dd4c943-fb9b-42db-93ef-7199a7deaf1d_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 20 10:43:39 np0005588920 nova_compute[226886]: 2026-01-20 15:43:39.078 226890 DEBUG nova.objects.instance [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lazy-loading 'migration_context' on Instance uuid 0dd4c943-fb9b-42db-93ef-7199a7deaf1d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 10:43:39 np0005588920 nova_compute[226886]: 2026-01-20 15:43:39.096 226890 DEBUG nova.virt.libvirt.driver [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 20 10:43:39 np0005588920 nova_compute[226886]: 2026-01-20 15:43:39.096 226890 DEBUG nova.virt.libvirt.driver [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Ensure instance console log exists: /var/lib/nova/instances/0dd4c943-fb9b-42db-93ef-7199a7deaf1d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 20 10:43:39 np0005588920 nova_compute[226886]: 2026-01-20 15:43:39.097 226890 DEBUG oslo_concurrency.lockutils [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:43:39 np0005588920 nova_compute[226886]: 2026-01-20 15:43:39.097 226890 DEBUG oslo_concurrency.lockutils [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:43:39 np0005588920 nova_compute[226886]: 2026-01-20 15:43:39.097 226890 DEBUG oslo_concurrency.lockutils [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:43:39 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:43:39 np0005588920 nova_compute[226886]: 2026-01-20 15:43:39.142 226890 DEBUG nova.policy [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5985ef736503499a9f1d734cabc33ce5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '728662ec7f654a3fb2e53a90b8707d7e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 20 10:43:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:43:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:39.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:43:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:43:39 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:39.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:43:39 np0005588920 nova_compute[226886]: 2026-01-20 15:43:39.600 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:43:39 np0005588920 nova_compute[226886]: 2026-01-20 15:43:39.629 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:43:40 np0005588920 nova_compute[226886]: 2026-01-20 15:43:40.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:43:40 np0005588920 nova_compute[226886]: 2026-01-20 15:43:40.726 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 20 10:43:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:43:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:41.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:43:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:41.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:42 np0005588920 nova_compute[226886]: 2026-01-20 15:43:42.252 226890 DEBUG nova.network.neutron [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Successfully created port: 0d47bc74-c077-42be-be2e-f197e1e4b5ab _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 20 10:43:42 np0005588920 nova_compute[226886]: 2026-01-20 15:43:42.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:43:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:43:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:43.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:43:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:43.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:43:44 np0005588920 nova_compute[226886]: 2026-01-20 15:43:44.602 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:43:44 np0005588920 nova_compute[226886]: 2026-01-20 15:43:44.630 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:43:44 np0005588920 nova_compute[226886]: 2026-01-20 15:43:44.744 226890 DEBUG nova.network.neutron [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Successfully updated port: 0d47bc74-c077-42be-be2e-f197e1e4b5ab _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 20 10:43:44 np0005588920 nova_compute[226886]: 2026-01-20 15:43:44.767 226890 DEBUG oslo_concurrency.lockutils [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Acquiring lock "refresh_cache-0dd4c943-fb9b-42db-93ef-7199a7deaf1d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 20 10:43:44 np0005588920 nova_compute[226886]: 2026-01-20 15:43:44.768 226890 DEBUG oslo_concurrency.lockutils [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Acquired lock "refresh_cache-0dd4c943-fb9b-42db-93ef-7199a7deaf1d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 20 10:43:44 np0005588920 nova_compute[226886]: 2026-01-20 15:43:44.768 226890 DEBUG nova.network.neutron [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 20 10:43:44 np0005588920 nova_compute[226886]: 2026-01-20 15:43:44.838 226890 DEBUG nova.compute.manager [req-e766761a-d26b-41ba-a647-a613acca2a86 req-ced7ea14-beaa-4883-b2c2-633fcb8f8396 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Received event network-changed-0d47bc74-c077-42be-be2e-f197e1e4b5ab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 10:43:44 np0005588920 nova_compute[226886]: 2026-01-20 15:43:44.839 226890 DEBUG nova.compute.manager [req-e766761a-d26b-41ba-a647-a613acca2a86 req-ced7ea14-beaa-4883-b2c2-633fcb8f8396 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Refreshing instance network info cache due to event network-changed-0d47bc74-c077-42be-be2e-f197e1e4b5ab. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 20 10:43:44 np0005588920 nova_compute[226886]: 2026-01-20 15:43:44.839 226890 DEBUG oslo_concurrency.lockutils [req-e766761a-d26b-41ba-a647-a613acca2a86 req-ced7ea14-beaa-4883-b2c2-633fcb8f8396 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-0dd4c943-fb9b-42db-93ef-7199a7deaf1d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:43:45 np0005588920 nova_compute[226886]: 2026-01-20 15:43:45.122 226890 DEBUG nova.network.neutron [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:43:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:43:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:43:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:45.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:45 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:45.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:45 np0005588920 nova_compute[226886]: 2026-01-20 15:43:45.890 226890 DEBUG nova.network.neutron [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Updating instance_info_cache with network_info: [{"id": "0d47bc74-c077-42be-be2e-f197e1e4b5ab", "address": "fa:16:3e:1f:89:ea", "network": {"id": "b3454bfb-211a-4d3c-9cdd-2add870f0bc5", "bridge": "br-int", "label": "tempest-network-smoke--1056002555", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "728662ec7f654a3fb2e53a90b8707d7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d47bc74-c0", "ovs_interfaceid": "0d47bc74-c077-42be-be2e-f197e1e4b5ab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:43:45 np0005588920 nova_compute[226886]: 2026-01-20 15:43:45.915 226890 DEBUG oslo_concurrency.lockutils [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Releasing lock "refresh_cache-0dd4c943-fb9b-42db-93ef-7199a7deaf1d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:43:45 np0005588920 nova_compute[226886]: 2026-01-20 15:43:45.916 226890 DEBUG nova.compute.manager [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Instance network_info: |[{"id": "0d47bc74-c077-42be-be2e-f197e1e4b5ab", "address": "fa:16:3e:1f:89:ea", "network": {"id": "b3454bfb-211a-4d3c-9cdd-2add870f0bc5", "bridge": "br-int", "label": "tempest-network-smoke--1056002555", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "728662ec7f654a3fb2e53a90b8707d7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d47bc74-c0", "ovs_interfaceid": "0d47bc74-c077-42be-be2e-f197e1e4b5ab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 10:43:45 np0005588920 nova_compute[226886]: 2026-01-20 15:43:45.917 226890 DEBUG oslo_concurrency.lockutils [req-e766761a-d26b-41ba-a647-a613acca2a86 req-ced7ea14-beaa-4883-b2c2-633fcb8f8396 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-0dd4c943-fb9b-42db-93ef-7199a7deaf1d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:43:45 np0005588920 nova_compute[226886]: 2026-01-20 15:43:45.918 226890 DEBUG nova.network.neutron [req-e766761a-d26b-41ba-a647-a613acca2a86 req-ced7ea14-beaa-4883-b2c2-633fcb8f8396 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Refreshing network info cache for port 0d47bc74-c077-42be-be2e-f197e1e4b5ab _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:43:45 np0005588920 nova_compute[226886]: 2026-01-20 15:43:45.922 226890 DEBUG nova.virt.libvirt.driver [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Start _get_guest_xml network_info=[{"id": "0d47bc74-c077-42be-be2e-f197e1e4b5ab", "address": "fa:16:3e:1f:89:ea", "network": {"id": "b3454bfb-211a-4d3c-9cdd-2add870f0bc5", "bridge": "br-int", "label": "tempest-network-smoke--1056002555", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "728662ec7f654a3fb2e53a90b8707d7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d47bc74-c0", "ovs_interfaceid": "0d47bc74-c077-42be-be2e-f197e1e4b5ab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:43:45 np0005588920 nova_compute[226886]: 2026-01-20 15:43:45.929 226890 WARNING nova.virt.libvirt.driver [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:43:45 np0005588920 nova_compute[226886]: 2026-01-20 15:43:45.934 226890 DEBUG nova.virt.libvirt.host [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:43:45 np0005588920 nova_compute[226886]: 2026-01-20 15:43:45.934 226890 DEBUG nova.virt.libvirt.host [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:43:45 np0005588920 nova_compute[226886]: 2026-01-20 15:43:45.942 226890 DEBUG nova.virt.libvirt.host [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:43:45 np0005588920 nova_compute[226886]: 2026-01-20 15:43:45.943 226890 DEBUG nova.virt.libvirt.host [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:43:45 np0005588920 nova_compute[226886]: 2026-01-20 15:43:45.945 226890 DEBUG nova.virt.libvirt.driver [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:43:45 np0005588920 nova_compute[226886]: 2026-01-20 15:43:45.945 226890 DEBUG nova.virt.hardware [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:43:45 np0005588920 nova_compute[226886]: 2026-01-20 15:43:45.946 226890 DEBUG nova.virt.hardware [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:43:45 np0005588920 nova_compute[226886]: 2026-01-20 15:43:45.946 226890 DEBUG nova.virt.hardware [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:43:45 np0005588920 nova_compute[226886]: 2026-01-20 15:43:45.946 226890 DEBUG nova.virt.hardware [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:43:45 np0005588920 nova_compute[226886]: 2026-01-20 15:43:45.947 226890 DEBUG nova.virt.hardware [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:43:45 np0005588920 nova_compute[226886]: 2026-01-20 15:43:45.947 226890 DEBUG nova.virt.hardware [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:43:45 np0005588920 nova_compute[226886]: 2026-01-20 15:43:45.948 226890 DEBUG nova.virt.hardware [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:43:45 np0005588920 nova_compute[226886]: 2026-01-20 15:43:45.948 226890 DEBUG nova.virt.hardware [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:43:45 np0005588920 nova_compute[226886]: 2026-01-20 15:43:45.948 226890 DEBUG nova.virt.hardware [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:43:45 np0005588920 nova_compute[226886]: 2026-01-20 15:43:45.949 226890 DEBUG nova.virt.hardware [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:43:45 np0005588920 nova_compute[226886]: 2026-01-20 15:43:45.949 226890 DEBUG nova.virt.hardware [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:43:45 np0005588920 nova_compute[226886]: 2026-01-20 15:43:45.953 226890 DEBUG oslo_concurrency.processutils [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:43:46 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:43:46 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/697914505' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:43:46 np0005588920 nova_compute[226886]: 2026-01-20 15:43:46.420 226890 DEBUG oslo_concurrency.processutils [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:43:46 np0005588920 nova_compute[226886]: 2026-01-20 15:43:46.444 226890 DEBUG nova.storage.rbd_utils [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] rbd image 0dd4c943-fb9b-42db-93ef-7199a7deaf1d_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:43:46 np0005588920 nova_compute[226886]: 2026-01-20 15:43:46.449 226890 DEBUG oslo_concurrency.processutils [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:43:46 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:43:46 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3464045524' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:43:46 np0005588920 nova_compute[226886]: 2026-01-20 15:43:46.887 226890 DEBUG oslo_concurrency.processutils [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:43:46 np0005588920 nova_compute[226886]: 2026-01-20 15:43:46.889 226890 DEBUG nova.virt.libvirt.vif [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:43:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-342561427-access_point-1709505667',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-342561427-access_point-1709505667',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-342561427-acc',id=220,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLoz2GbOw8ow3DYb4rLFEuMWflz3gP9xv+U5mMCoX7iaUGmWyARnqdL7NG+heU2Zm874PRTc/Yh5PA/F6qY/4DIo5Ys6G6QNvcpeG1KGrWoDYICd41OXW09TeySSA+suuw==',key_name='tempest-TestSecurityGroupsBasicOps-990929955',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='728662ec7f654a3fb2e53a90b8707d7e',ramdisk_id='',reservation_id='r-xg0r85nd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-342561427',owner_user_name='tempest-TestSecurityGroupsBasicOps-342561427-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:43:38Z,user_data=None,user_id='5985ef736503499a9f1d734cabc33ce5',uuid=0dd4c943-fb9b-42db-93ef-7199a7deaf1d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0d47bc74-c077-42be-be2e-f197e1e4b5ab", "address": "fa:16:3e:1f:89:ea", "network": {"id": "b3454bfb-211a-4d3c-9cdd-2add870f0bc5", "bridge": "br-int", "label": "tempest-network-smoke--1056002555", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "728662ec7f654a3fb2e53a90b8707d7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d47bc74-c0", "ovs_interfaceid": "0d47bc74-c077-42be-be2e-f197e1e4b5ab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:43:46 np0005588920 nova_compute[226886]: 2026-01-20 15:43:46.889 226890 DEBUG nova.network.os_vif_util [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Converting VIF {"id": "0d47bc74-c077-42be-be2e-f197e1e4b5ab", "address": "fa:16:3e:1f:89:ea", "network": {"id": "b3454bfb-211a-4d3c-9cdd-2add870f0bc5", "bridge": "br-int", "label": "tempest-network-smoke--1056002555", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "728662ec7f654a3fb2e53a90b8707d7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d47bc74-c0", "ovs_interfaceid": "0d47bc74-c077-42be-be2e-f197e1e4b5ab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:43:46 np0005588920 nova_compute[226886]: 2026-01-20 15:43:46.890 226890 DEBUG nova.network.os_vif_util [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1f:89:ea,bridge_name='br-int',has_traffic_filtering=True,id=0d47bc74-c077-42be-be2e-f197e1e4b5ab,network=Network(b3454bfb-211a-4d3c-9cdd-2add870f0bc5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d47bc74-c0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:43:46 np0005588920 nova_compute[226886]: 2026-01-20 15:43:46.890 226890 DEBUG nova.objects.instance [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lazy-loading 'pci_devices' on Instance uuid 0dd4c943-fb9b-42db-93ef-7199a7deaf1d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:43:46 np0005588920 nova_compute[226886]: 2026-01-20 15:43:46.908 226890 DEBUG nova.virt.libvirt.driver [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:43:46 np0005588920 nova_compute[226886]:  <uuid>0dd4c943-fb9b-42db-93ef-7199a7deaf1d</uuid>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:  <name>instance-000000dc</name>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:43:46 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-342561427-access_point-1709505667</nova:name>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 15:43:45</nova:creationTime>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 10:43:46 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:        <nova:user uuid="5985ef736503499a9f1d734cabc33ce5">tempest-TestSecurityGroupsBasicOps-342561427-project-member</nova:user>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:        <nova:project uuid="728662ec7f654a3fb2e53a90b8707d7e">tempest-TestSecurityGroupsBasicOps-342561427</nova:project>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:        <nova:port uuid="0d47bc74-c077-42be-be2e-f197e1e4b5ab">
Jan 20 10:43:46 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    <system>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:      <entry name="serial">0dd4c943-fb9b-42db-93ef-7199a7deaf1d</entry>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:      <entry name="uuid">0dd4c943-fb9b-42db-93ef-7199a7deaf1d</entry>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    </system>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:  <os>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:  </os>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:  <features>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:  </features>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:  </clock>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:  <devices>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 10:43:46 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/0dd4c943-fb9b-42db-93ef-7199a7deaf1d_disk">
Jan 20 10:43:46 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:43:46 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 10:43:46 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/0dd4c943-fb9b-42db-93ef-7199a7deaf1d_disk.config">
Jan 20 10:43:46 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:43:46 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 10:43:46 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:1f:89:ea"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:      <target dev="tap0d47bc74-c0"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    </interface>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 10:43:46 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/0dd4c943-fb9b-42db-93ef-7199a7deaf1d/console.log" append="off"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    </serial>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    <video>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    </video>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 10:43:46 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    </rng>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 10:43:46 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 10:43:46 np0005588920 nova_compute[226886]:  </devices>
Jan 20 10:43:46 np0005588920 nova_compute[226886]: </domain>
Jan 20 10:43:46 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:43:46 np0005588920 nova_compute[226886]: 2026-01-20 15:43:46.908 226890 DEBUG nova.compute.manager [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Preparing to wait for external event network-vif-plugged-0d47bc74-c077-42be-be2e-f197e1e4b5ab prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:43:46 np0005588920 nova_compute[226886]: 2026-01-20 15:43:46.909 226890 DEBUG oslo_concurrency.lockutils [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Acquiring lock "0dd4c943-fb9b-42db-93ef-7199a7deaf1d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:43:46 np0005588920 nova_compute[226886]: 2026-01-20 15:43:46.909 226890 DEBUG oslo_concurrency.lockutils [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "0dd4c943-fb9b-42db-93ef-7199a7deaf1d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:43:46 np0005588920 nova_compute[226886]: 2026-01-20 15:43:46.909 226890 DEBUG oslo_concurrency.lockutils [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "0dd4c943-fb9b-42db-93ef-7199a7deaf1d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:43:46 np0005588920 nova_compute[226886]: 2026-01-20 15:43:46.910 226890 DEBUG nova.virt.libvirt.vif [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:43:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-342561427-access_point-1709505667',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-342561427-access_point-1709505667',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-342561427-acc',id=220,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLoz2GbOw8ow3DYb4rLFEuMWflz3gP9xv+U5mMCoX7iaUGmWyARnqdL7NG+heU2Zm874PRTc/Yh5PA/F6qY/4DIo5Ys6G6QNvcpeG1KGrWoDYICd41OXW09TeySSA+suuw==',key_name='tempest-TestSecurityGroupsBasicOps-990929955',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='728662ec7f654a3fb2e53a90b8707d7e',ramdisk_id='',reservation_id='r-xg0r85nd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-342561427',owner_user_name='tempest-TestSecurityGroupsBasicOps-342561427-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:43:38Z,user_data=None,user_id='5985ef736503499a9f1d734cabc33ce5',uuid=0dd4c943-fb9b-42db-93ef-7199a7deaf1d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0d47bc74-c077-42be-be2e-f197e1e4b5ab", "address": "fa:16:3e:1f:89:ea", "network": {"id": "b3454bfb-211a-4d3c-9cdd-2add870f0bc5", "bridge": "br-int", "label": "tempest-network-smoke--1056002555", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "728662ec7f654a3fb2e53a90b8707d7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d47bc74-c0", "ovs_interfaceid": "0d47bc74-c077-42be-be2e-f197e1e4b5ab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:43:46 np0005588920 nova_compute[226886]: 2026-01-20 15:43:46.910 226890 DEBUG nova.network.os_vif_util [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Converting VIF {"id": "0d47bc74-c077-42be-be2e-f197e1e4b5ab", "address": "fa:16:3e:1f:89:ea", "network": {"id": "b3454bfb-211a-4d3c-9cdd-2add870f0bc5", "bridge": "br-int", "label": "tempest-network-smoke--1056002555", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "728662ec7f654a3fb2e53a90b8707d7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d47bc74-c0", "ovs_interfaceid": "0d47bc74-c077-42be-be2e-f197e1e4b5ab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:43:46 np0005588920 nova_compute[226886]: 2026-01-20 15:43:46.910 226890 DEBUG nova.network.os_vif_util [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1f:89:ea,bridge_name='br-int',has_traffic_filtering=True,id=0d47bc74-c077-42be-be2e-f197e1e4b5ab,network=Network(b3454bfb-211a-4d3c-9cdd-2add870f0bc5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d47bc74-c0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:43:46 np0005588920 nova_compute[226886]: 2026-01-20 15:43:46.911 226890 DEBUG os_vif [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1f:89:ea,bridge_name='br-int',has_traffic_filtering=True,id=0d47bc74-c077-42be-be2e-f197e1e4b5ab,network=Network(b3454bfb-211a-4d3c-9cdd-2add870f0bc5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d47bc74-c0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:43:46 np0005588920 nova_compute[226886]: 2026-01-20 15:43:46.911 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:46 np0005588920 nova_compute[226886]: 2026-01-20 15:43:46.911 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:43:46 np0005588920 nova_compute[226886]: 2026-01-20 15:43:46.912 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:43:46 np0005588920 nova_compute[226886]: 2026-01-20 15:43:46.915 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:46 np0005588920 nova_compute[226886]: 2026-01-20 15:43:46.916 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0d47bc74-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:43:46 np0005588920 nova_compute[226886]: 2026-01-20 15:43:46.917 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0d47bc74-c0, col_values=(('external_ids', {'iface-id': '0d47bc74-c077-42be-be2e-f197e1e4b5ab', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1f:89:ea', 'vm-uuid': '0dd4c943-fb9b-42db-93ef-7199a7deaf1d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:43:46 np0005588920 NetworkManager[49076]: <info>  [1768923826.9759] manager: (tap0d47bc74-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/462)
Jan 20 10:43:46 np0005588920 nova_compute[226886]: 2026-01-20 15:43:46.975 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:46 np0005588920 nova_compute[226886]: 2026-01-20 15:43:46.978 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:43:46 np0005588920 nova_compute[226886]: 2026-01-20 15:43:46.983 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:46 np0005588920 nova_compute[226886]: 2026-01-20 15:43:46.984 226890 INFO os_vif [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1f:89:ea,bridge_name='br-int',has_traffic_filtering=True,id=0d47bc74-c077-42be-be2e-f197e1e4b5ab,network=Network(b3454bfb-211a-4d3c-9cdd-2add870f0bc5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d47bc74-c0')#033[00m
Jan 20 10:43:47 np0005588920 podman[313111]: 2026-01-20 15:43:47.038280724 +0000 UTC m=+0.130887744 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 20 10:43:47 np0005588920 nova_compute[226886]: 2026-01-20 15:43:47.061 226890 DEBUG nova.virt.libvirt.driver [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:43:47 np0005588920 nova_compute[226886]: 2026-01-20 15:43:47.061 226890 DEBUG nova.virt.libvirt.driver [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:43:47 np0005588920 nova_compute[226886]: 2026-01-20 15:43:47.062 226890 DEBUG nova.virt.libvirt.driver [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] No VIF found with MAC fa:16:3e:1f:89:ea, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:43:47 np0005588920 nova_compute[226886]: 2026-01-20 15:43:47.062 226890 INFO nova.virt.libvirt.driver [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Using config drive#033[00m
Jan 20 10:43:47 np0005588920 nova_compute[226886]: 2026-01-20 15:43:47.089 226890 DEBUG nova.storage.rbd_utils [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] rbd image 0dd4c943-fb9b-42db-93ef-7199a7deaf1d_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:43:47 np0005588920 nova_compute[226886]: 2026-01-20 15:43:47.095 226890 DEBUG nova.network.neutron [req-e766761a-d26b-41ba-a647-a613acca2a86 req-ced7ea14-beaa-4883-b2c2-633fcb8f8396 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Updated VIF entry in instance network info cache for port 0d47bc74-c077-42be-be2e-f197e1e4b5ab. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:43:47 np0005588920 nova_compute[226886]: 2026-01-20 15:43:47.095 226890 DEBUG nova.network.neutron [req-e766761a-d26b-41ba-a647-a613acca2a86 req-ced7ea14-beaa-4883-b2c2-633fcb8f8396 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Updating instance_info_cache with network_info: [{"id": "0d47bc74-c077-42be-be2e-f197e1e4b5ab", "address": "fa:16:3e:1f:89:ea", "network": {"id": "b3454bfb-211a-4d3c-9cdd-2add870f0bc5", "bridge": "br-int", "label": "tempest-network-smoke--1056002555", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "728662ec7f654a3fb2e53a90b8707d7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d47bc74-c0", "ovs_interfaceid": "0d47bc74-c077-42be-be2e-f197e1e4b5ab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:43:47 np0005588920 nova_compute[226886]: 2026-01-20 15:43:47.129 226890 DEBUG oslo_concurrency.lockutils [req-e766761a-d26b-41ba-a647-a613acca2a86 req-ced7ea14-beaa-4883-b2c2-633fcb8f8396 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-0dd4c943-fb9b-42db-93ef-7199a7deaf1d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:43:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:43:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:47.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:43:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:47.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:47 np0005588920 nova_compute[226886]: 2026-01-20 15:43:47.764 226890 INFO nova.virt.libvirt.driver [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Creating config drive at /var/lib/nova/instances/0dd4c943-fb9b-42db-93ef-7199a7deaf1d/disk.config#033[00m
Jan 20 10:43:47 np0005588920 nova_compute[226886]: 2026-01-20 15:43:47.769 226890 DEBUG oslo_concurrency.processutils [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0dd4c943-fb9b-42db-93ef-7199a7deaf1d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2hynfmi5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:43:47 np0005588920 nova_compute[226886]: 2026-01-20 15:43:47.904 226890 DEBUG oslo_concurrency.processutils [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0dd4c943-fb9b-42db-93ef-7199a7deaf1d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2hynfmi5" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:43:47 np0005588920 nova_compute[226886]: 2026-01-20 15:43:47.929 226890 DEBUG nova.storage.rbd_utils [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] rbd image 0dd4c943-fb9b-42db-93ef-7199a7deaf1d_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:43:47 np0005588920 nova_compute[226886]: 2026-01-20 15:43:47.932 226890 DEBUG oslo_concurrency.processutils [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/0dd4c943-fb9b-42db-93ef-7199a7deaf1d/disk.config 0dd4c943-fb9b-42db-93ef-7199a7deaf1d_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:43:48 np0005588920 nova_compute[226886]: 2026-01-20 15:43:48.186 226890 DEBUG oslo_concurrency.processutils [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/0dd4c943-fb9b-42db-93ef-7199a7deaf1d/disk.config 0dd4c943-fb9b-42db-93ef-7199a7deaf1d_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.254s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:43:48 np0005588920 nova_compute[226886]: 2026-01-20 15:43:48.187 226890 INFO nova.virt.libvirt.driver [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Deleting local config drive /var/lib/nova/instances/0dd4c943-fb9b-42db-93ef-7199a7deaf1d/disk.config because it was imported into RBD.#033[00m
Jan 20 10:43:48 np0005588920 kernel: tap0d47bc74-c0: entered promiscuous mode
Jan 20 10:43:48 np0005588920 NetworkManager[49076]: <info>  [1768923828.2313] manager: (tap0d47bc74-c0): new Tun device (/org/freedesktop/NetworkManager/Devices/463)
Jan 20 10:43:48 np0005588920 ovn_controller[133971]: 2026-01-20T15:43:48Z|00981|binding|INFO|Claiming lport 0d47bc74-c077-42be-be2e-f197e1e4b5ab for this chassis.
Jan 20 10:43:48 np0005588920 ovn_controller[133971]: 2026-01-20T15:43:48Z|00982|binding|INFO|0d47bc74-c077-42be-be2e-f197e1e4b5ab: Claiming fa:16:3e:1f:89:ea 10.100.0.13
Jan 20 10:43:48 np0005588920 nova_compute[226886]: 2026-01-20 15:43:48.246 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:48 np0005588920 nova_compute[226886]: 2026-01-20 15:43:48.253 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:43:48.258 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1f:89:ea 10.100.0.13'], port_security=['fa:16:3e:1f:89:ea 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '0dd4c943-fb9b-42db-93ef-7199a7deaf1d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b3454bfb-211a-4d3c-9cdd-2add870f0bc5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '728662ec7f654a3fb2e53a90b8707d7e', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6ab77576-28bb-49fd-83cd-f932fc142f3b a5ab9ca2-6ab4-4076-a615-8e84fc304af3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1ca4ccda-af2d-4a1b-9226-78b0c8cd522f, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=0d47bc74-c077-42be-be2e-f197e1e4b5ab) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:43:48.259 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 0d47bc74-c077-42be-be2e-f197e1e4b5ab in datapath b3454bfb-211a-4d3c-9cdd-2add870f0bc5 bound to our chassis#033[00m
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:43:48.260 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b3454bfb-211a-4d3c-9cdd-2add870f0bc5#033[00m
Jan 20 10:43:48 np0005588920 systemd-udevd[313210]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:43:48.270 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[962ac324-de7b-4c23-813e-22ec9155dbb3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:43:48.271 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb3454bfb-21 in ovnmeta-b3454bfb-211a-4d3c-9cdd-2add870f0bc5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:43:48.273 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb3454bfb-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:43:48.273 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e355256f-b667-4a3b-a72c-1c2d8731aa76]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:43:48.274 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[8e8521b1-57ba-4399-957f-acd58757903f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:43:48 np0005588920 NetworkManager[49076]: <info>  [1768923828.2790] device (tap0d47bc74-c0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:43:48 np0005588920 NetworkManager[49076]: <info>  [1768923828.2802] device (tap0d47bc74-c0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:43:48 np0005588920 systemd-machined[196121]: New machine qemu-101-instance-000000dc.
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:43:48.285 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[80c98f16-5343-4aef-86eb-3d178a2911ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:43:48.310 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[131ea366-87cd-41ad-b1a3-14d7be70250f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:43:48 np0005588920 systemd[1]: Started Virtual Machine qemu-101-instance-000000dc.
Jan 20 10:43:48 np0005588920 ovn_controller[133971]: 2026-01-20T15:43:48Z|00983|binding|INFO|Setting lport 0d47bc74-c077-42be-be2e-f197e1e4b5ab ovn-installed in OVS
Jan 20 10:43:48 np0005588920 ovn_controller[133971]: 2026-01-20T15:43:48Z|00984|binding|INFO|Setting lport 0d47bc74-c077-42be-be2e-f197e1e4b5ab up in Southbound
Jan 20 10:43:48 np0005588920 nova_compute[226886]: 2026-01-20 15:43:48.328 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:43:48.338 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[27b0dc8a-886d-44fb-809e-db9f5f3dac64]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:43:48 np0005588920 NetworkManager[49076]: <info>  [1768923828.3446] manager: (tapb3454bfb-20): new Veth device (/org/freedesktop/NetworkManager/Devices/464)
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:43:48.343 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[0d26c82d-c417-432d-b127-3e022a6dc4df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:43:48.375 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[664adf82-b4ef-43f8-b39e-2e79a4001a93]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:43:48.378 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[1d57afc7-a676-401d-8e0c-9591596fb3cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:43:48 np0005588920 NetworkManager[49076]: <info>  [1768923828.3998] device (tapb3454bfb-20): carrier: link connected
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:43:48.406 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[8a297c01-ba06-4d24-9823-3c766b610499]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:43:48.421 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[840d7470-3d3c-49b1-bfca-6aa39fe2e427]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb3454bfb-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ee:ca:2e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 307], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 887008, 'reachable_time': 21834, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 313244, 'error': None, 'target': 'ovnmeta-b3454bfb-211a-4d3c-9cdd-2add870f0bc5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:43:48.435 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[463345ad-b8d1-4dad-a77d-9e138aa297a5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feee:ca2e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 887008, 'tstamp': 887008}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 313245, 'error': None, 'target': 'ovnmeta-b3454bfb-211a-4d3c-9cdd-2add870f0bc5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:43:48.450 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e0484365-ebb3-4bbc-8580-9496ea1b4a10]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb3454bfb-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ee:ca:2e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 307], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 887008, 'reachable_time': 21834, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 313246, 'error': None, 'target': 'ovnmeta-b3454bfb-211a-4d3c-9cdd-2add870f0bc5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:43:48.479 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e6781dd5-a77c-4adc-b36e-cc3bb3920a40]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:43:48.531 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[87cbf036-08d1-4cd2-81b8-bdb63ec1a26a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:43:48.532 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb3454bfb-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:43:48.533 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:43:48.533 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb3454bfb-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:43:48 np0005588920 NetworkManager[49076]: <info>  [1768923828.5357] manager: (tapb3454bfb-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/465)
Jan 20 10:43:48 np0005588920 kernel: tapb3454bfb-20: entered promiscuous mode
Jan 20 10:43:48 np0005588920 nova_compute[226886]: 2026-01-20 15:43:48.536 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:43:48.537 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb3454bfb-20, col_values=(('external_ids', {'iface-id': '9d7a020f-3805-4580-8fce-cdd48408641a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:43:48 np0005588920 ovn_controller[133971]: 2026-01-20T15:43:48Z|00985|binding|INFO|Releasing lport 9d7a020f-3805-4580-8fce-cdd48408641a from this chassis (sb_readonly=0)
Jan 20 10:43:48 np0005588920 nova_compute[226886]: 2026-01-20 15:43:48.539 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:48 np0005588920 nova_compute[226886]: 2026-01-20 15:43:48.552 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:43:48.552 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b3454bfb-211a-4d3c-9cdd-2add870f0bc5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b3454bfb-211a-4d3c-9cdd-2add870f0bc5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:43:48.553 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[3e19af68-b07d-4e99-a33a-f83fe142be10]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:43:48.554 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-b3454bfb-211a-4d3c-9cdd-2add870f0bc5
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/b3454bfb-211a-4d3c-9cdd-2add870f0bc5.pid.haproxy
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID b3454bfb-211a-4d3c-9cdd-2add870f0bc5
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:43:48 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:43:48.556 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b3454bfb-211a-4d3c-9cdd-2add870f0bc5', 'env', 'PROCESS_TAG=haproxy-b3454bfb-211a-4d3c-9cdd-2add870f0bc5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b3454bfb-211a-4d3c-9cdd-2add870f0bc5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:43:48 np0005588920 nova_compute[226886]: 2026-01-20 15:43:48.810 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768923828.810104, 0dd4c943-fb9b-42db-93ef-7199a7deaf1d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:43:48 np0005588920 nova_compute[226886]: 2026-01-20 15:43:48.811 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] VM Started (Lifecycle Event)#033[00m
Jan 20 10:43:48 np0005588920 nova_compute[226886]: 2026-01-20 15:43:48.830 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:43:48 np0005588920 nova_compute[226886]: 2026-01-20 15:43:48.834 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768923828.810318, 0dd4c943-fb9b-42db-93ef-7199a7deaf1d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:43:48 np0005588920 nova_compute[226886]: 2026-01-20 15:43:48.834 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:43:48 np0005588920 nova_compute[226886]: 2026-01-20 15:43:48.856 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:43:48 np0005588920 nova_compute[226886]: 2026-01-20 15:43:48.859 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:43:48 np0005588920 nova_compute[226886]: 2026-01-20 15:43:48.877 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:43:48 np0005588920 podman[313319]: 2026-01-20 15:43:48.90543208 +0000 UTC m=+0.051982374 container create 81482c90885c4e7ac27906f88cbe54c7dfc3255354ad4909ddae24300c9e061d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b3454bfb-211a-4d3c-9cdd-2add870f0bc5, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:43:48 np0005588920 systemd[1]: Started libpod-conmon-81482c90885c4e7ac27906f88cbe54c7dfc3255354ad4909ddae24300c9e061d.scope.
Jan 20 10:43:48 np0005588920 systemd[1]: Started libcrun container.
Jan 20 10:43:48 np0005588920 podman[313319]: 2026-01-20 15:43:48.875823075 +0000 UTC m=+0.022373419 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:43:48 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/582c3dc9400aa1947847fd719984c55c76db3a04746e2dc2ffa11ae707d1ffd9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:43:48 np0005588920 podman[313319]: 2026-01-20 15:43:48.997013112 +0000 UTC m=+0.143563426 container init 81482c90885c4e7ac27906f88cbe54c7dfc3255354ad4909ddae24300c9e061d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b3454bfb-211a-4d3c-9cdd-2add870f0bc5, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 20 10:43:49 np0005588920 podman[313319]: 2026-01-20 15:43:49.002755866 +0000 UTC m=+0.149306160 container start 81482c90885c4e7ac27906f88cbe54c7dfc3255354ad4909ddae24300c9e061d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b3454bfb-211a-4d3c-9cdd-2add870f0bc5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 20 10:43:49 np0005588920 neutron-haproxy-ovnmeta-b3454bfb-211a-4d3c-9cdd-2add870f0bc5[313334]: [NOTICE]   (313338) : New worker (313340) forked
Jan 20 10:43:49 np0005588920 neutron-haproxy-ovnmeta-b3454bfb-211a-4d3c-9cdd-2add870f0bc5[313334]: [NOTICE]   (313338) : Loading success.
Jan 20 10:43:49 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:43:49 np0005588920 nova_compute[226886]: 2026-01-20 15:43:49.397 226890 DEBUG nova.compute.manager [req-6ed5ca7b-bf54-4bf4-86aa-c0fe1f2184af req-17519a4d-2321-4836-a080-6ba4ea4b0d74 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Received event network-vif-plugged-0d47bc74-c077-42be-be2e-f197e1e4b5ab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:43:49 np0005588920 nova_compute[226886]: 2026-01-20 15:43:49.398 226890 DEBUG oslo_concurrency.lockutils [req-6ed5ca7b-bf54-4bf4-86aa-c0fe1f2184af req-17519a4d-2321-4836-a080-6ba4ea4b0d74 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "0dd4c943-fb9b-42db-93ef-7199a7deaf1d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:43:49 np0005588920 nova_compute[226886]: 2026-01-20 15:43:49.398 226890 DEBUG oslo_concurrency.lockutils [req-6ed5ca7b-bf54-4bf4-86aa-c0fe1f2184af req-17519a4d-2321-4836-a080-6ba4ea4b0d74 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "0dd4c943-fb9b-42db-93ef-7199a7deaf1d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:43:49 np0005588920 nova_compute[226886]: 2026-01-20 15:43:49.399 226890 DEBUG oslo_concurrency.lockutils [req-6ed5ca7b-bf54-4bf4-86aa-c0fe1f2184af req-17519a4d-2321-4836-a080-6ba4ea4b0d74 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "0dd4c943-fb9b-42db-93ef-7199a7deaf1d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:43:49 np0005588920 nova_compute[226886]: 2026-01-20 15:43:49.399 226890 DEBUG nova.compute.manager [req-6ed5ca7b-bf54-4bf4-86aa-c0fe1f2184af req-17519a4d-2321-4836-a080-6ba4ea4b0d74 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Processing event network-vif-plugged-0d47bc74-c077-42be-be2e-f197e1e4b5ab _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 10:43:49 np0005588920 nova_compute[226886]: 2026-01-20 15:43:49.401 226890 DEBUG nova.compute.manager [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:43:49 np0005588920 nova_compute[226886]: 2026-01-20 15:43:49.405 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768923829.4056249, 0dd4c943-fb9b-42db-93ef-7199a7deaf1d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:43:49 np0005588920 nova_compute[226886]: 2026-01-20 15:43:49.406 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:43:49 np0005588920 nova_compute[226886]: 2026-01-20 15:43:49.408 226890 DEBUG nova.virt.libvirt.driver [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:43:49 np0005588920 nova_compute[226886]: 2026-01-20 15:43:49.411 226890 INFO nova.virt.libvirt.driver [-] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Instance spawned successfully.#033[00m
Jan 20 10:43:49 np0005588920 nova_compute[226886]: 2026-01-20 15:43:49.412 226890 DEBUG nova.virt.libvirt.driver [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 10:43:49 np0005588920 nova_compute[226886]: 2026-01-20 15:43:49.424 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:43:49 np0005588920 nova_compute[226886]: 2026-01-20 15:43:49.430 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:43:49 np0005588920 nova_compute[226886]: 2026-01-20 15:43:49.436 226890 DEBUG nova.virt.libvirt.driver [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:43:49 np0005588920 nova_compute[226886]: 2026-01-20 15:43:49.436 226890 DEBUG nova.virt.libvirt.driver [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:43:49 np0005588920 nova_compute[226886]: 2026-01-20 15:43:49.437 226890 DEBUG nova.virt.libvirt.driver [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:43:49 np0005588920 nova_compute[226886]: 2026-01-20 15:43:49.437 226890 DEBUG nova.virt.libvirt.driver [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:43:49 np0005588920 nova_compute[226886]: 2026-01-20 15:43:49.438 226890 DEBUG nova.virt.libvirt.driver [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:43:49 np0005588920 nova_compute[226886]: 2026-01-20 15:43:49.438 226890 DEBUG nova.virt.libvirt.driver [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:43:49 np0005588920 nova_compute[226886]: 2026-01-20 15:43:49.450 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:43:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:43:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:49.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:43:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:49.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:49 np0005588920 nova_compute[226886]: 2026-01-20 15:43:49.487 226890 INFO nova.compute.manager [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Took 11.10 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 10:43:49 np0005588920 nova_compute[226886]: 2026-01-20 15:43:49.488 226890 DEBUG nova.compute.manager [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:43:49 np0005588920 nova_compute[226886]: 2026-01-20 15:43:49.558 226890 INFO nova.compute.manager [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Took 12.00 seconds to build instance.#033[00m
Jan 20 10:43:49 np0005588920 nova_compute[226886]: 2026-01-20 15:43:49.575 226890 DEBUG oslo_concurrency.lockutils [None req-94bcd176-4e3a-4b22-ad8e-ae38caedc27a 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "0dd4c943-fb9b-42db-93ef-7199a7deaf1d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:43:49 np0005588920 nova_compute[226886]: 2026-01-20 15:43:49.642 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:51 np0005588920 nova_compute[226886]: 2026-01-20 15:43:51.476 226890 DEBUG nova.compute.manager [req-9f1b75c4-c169-424d-b454-4f324f717a3f req-39098f5b-8bd0-4765-91f3-bc5c2c323f4d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Received event network-vif-plugged-0d47bc74-c077-42be-be2e-f197e1e4b5ab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:43:51 np0005588920 nova_compute[226886]: 2026-01-20 15:43:51.477 226890 DEBUG oslo_concurrency.lockutils [req-9f1b75c4-c169-424d-b454-4f324f717a3f req-39098f5b-8bd0-4765-91f3-bc5c2c323f4d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "0dd4c943-fb9b-42db-93ef-7199a7deaf1d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:43:51 np0005588920 nova_compute[226886]: 2026-01-20 15:43:51.477 226890 DEBUG oslo_concurrency.lockutils [req-9f1b75c4-c169-424d-b454-4f324f717a3f req-39098f5b-8bd0-4765-91f3-bc5c2c323f4d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "0dd4c943-fb9b-42db-93ef-7199a7deaf1d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:43:51 np0005588920 nova_compute[226886]: 2026-01-20 15:43:51.477 226890 DEBUG oslo_concurrency.lockutils [req-9f1b75c4-c169-424d-b454-4f324f717a3f req-39098f5b-8bd0-4765-91f3-bc5c2c323f4d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "0dd4c943-fb9b-42db-93ef-7199a7deaf1d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:43:51 np0005588920 nova_compute[226886]: 2026-01-20 15:43:51.477 226890 DEBUG nova.compute.manager [req-9f1b75c4-c169-424d-b454-4f324f717a3f req-39098f5b-8bd0-4765-91f3-bc5c2c323f4d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] No waiting events found dispatching network-vif-plugged-0d47bc74-c077-42be-be2e-f197e1e4b5ab pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:43:51 np0005588920 nova_compute[226886]: 2026-01-20 15:43:51.478 226890 WARNING nova.compute.manager [req-9f1b75c4-c169-424d-b454-4f324f717a3f req-39098f5b-8bd0-4765-91f3-bc5c2c323f4d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Received unexpected event network-vif-plugged-0d47bc74-c077-42be-be2e-f197e1e4b5ab for instance with vm_state active and task_state None.#033[00m
Jan 20 10:43:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:43:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:51.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:43:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:51.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:52 np0005588920 nova_compute[226886]: 2026-01-20 15:43:52.020 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:43:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:43:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:43:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:53.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:43:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:43:53 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:53.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:43:54 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:43:54 np0005588920 ovn_controller[133971]: 2026-01-20T15:43:54Z|00986|binding|INFO|Releasing lport 9d7a020f-3805-4580-8fce-cdd48408641a from this chassis (sb_readonly=0)
Jan 20 10:43:54 np0005588920 NetworkManager[49076]: <info>  [1768923834.5641] manager: (patch-br-int-to-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/466)
Jan 20 10:43:54 np0005588920 nova_compute[226886]: 2026-01-20 15:43:54.563 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:54 np0005588920 NetworkManager[49076]: <info>  [1768923834.5658] manager: (patch-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/467)
Jan 20 10:43:54 np0005588920 ovn_controller[133971]: 2026-01-20T15:43:54Z|00987|binding|INFO|Releasing lport 9d7a020f-3805-4580-8fce-cdd48408641a from this chassis (sb_readonly=0)
Jan 20 10:43:54 np0005588920 nova_compute[226886]: 2026-01-20 15:43:54.567 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:54 np0005588920 nova_compute[226886]: 2026-01-20 15:43:54.567 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:54 np0005588920 nova_compute[226886]: 2026-01-20 15:43:54.645 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:54 np0005588920 nova_compute[226886]: 2026-01-20 15:43:54.819 226890 DEBUG nova.compute.manager [req-fd7ee5af-c1f8-4b85-81ad-4851792b3adc req-31c6044d-2777-4817-b36e-bb8a339efce0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Received event network-changed-0d47bc74-c077-42be-be2e-f197e1e4b5ab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:43:54 np0005588920 nova_compute[226886]: 2026-01-20 15:43:54.819 226890 DEBUG nova.compute.manager [req-fd7ee5af-c1f8-4b85-81ad-4851792b3adc req-31c6044d-2777-4817-b36e-bb8a339efce0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Refreshing instance network info cache due to event network-changed-0d47bc74-c077-42be-be2e-f197e1e4b5ab. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:43:54 np0005588920 nova_compute[226886]: 2026-01-20 15:43:54.819 226890 DEBUG oslo_concurrency.lockutils [req-fd7ee5af-c1f8-4b85-81ad-4851792b3adc req-31c6044d-2777-4817-b36e-bb8a339efce0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-0dd4c943-fb9b-42db-93ef-7199a7deaf1d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:43:54 np0005588920 nova_compute[226886]: 2026-01-20 15:43:54.819 226890 DEBUG oslo_concurrency.lockutils [req-fd7ee5af-c1f8-4b85-81ad-4851792b3adc req-31c6044d-2777-4817-b36e-bb8a339efce0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-0dd4c943-fb9b-42db-93ef-7199a7deaf1d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:43:54 np0005588920 nova_compute[226886]: 2026-01-20 15:43:54.820 226890 DEBUG nova.network.neutron [req-fd7ee5af-c1f8-4b85-81ad-4851792b3adc req-31c6044d-2777-4817-b36e-bb8a339efce0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Refreshing network info cache for port 0d47bc74-c077-42be-be2e-f197e1e4b5ab _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:43:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:43:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:55.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:43:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:55.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:56 np0005588920 nova_compute[226886]: 2026-01-20 15:43:56.522 226890 DEBUG nova.network.neutron [req-fd7ee5af-c1f8-4b85-81ad-4851792b3adc req-31c6044d-2777-4817-b36e-bb8a339efce0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Updated VIF entry in instance network info cache for port 0d47bc74-c077-42be-be2e-f197e1e4b5ab. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:43:56 np0005588920 nova_compute[226886]: 2026-01-20 15:43:56.523 226890 DEBUG nova.network.neutron [req-fd7ee5af-c1f8-4b85-81ad-4851792b3adc req-31c6044d-2777-4817-b36e-bb8a339efce0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Updating instance_info_cache with network_info: [{"id": "0d47bc74-c077-42be-be2e-f197e1e4b5ab", "address": "fa:16:3e:1f:89:ea", "network": {"id": "b3454bfb-211a-4d3c-9cdd-2add870f0bc5", "bridge": "br-int", "label": "tempest-network-smoke--1056002555", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "728662ec7f654a3fb2e53a90b8707d7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d47bc74-c0", "ovs_interfaceid": "0d47bc74-c077-42be-be2e-f197e1e4b5ab", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:43:56 np0005588920 nova_compute[226886]: 2026-01-20 15:43:56.551 226890 DEBUG oslo_concurrency.lockutils [req-fd7ee5af-c1f8-4b85-81ad-4851792b3adc req-31c6044d-2777-4817-b36e-bb8a339efce0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-0dd4c943-fb9b-42db-93ef-7199a7deaf1d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:43:56 np0005588920 podman[313350]: 2026-01-20 15:43:56.994293608 +0000 UTC m=+0.068799074 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 20 10:43:57 np0005588920 nova_compute[226886]: 2026-01-20 15:43:57.064 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:43:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:43:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:57 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:57.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:57.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:59 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:43:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:43:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:43:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.002000057s ======
Jan 20 10:43:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:59.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Jan 20 10:43:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.002000057s ======
Jan 20 10:43:59 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:59.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Jan 20 10:43:59 np0005588920 nova_compute[226886]: 2026-01-20 15:43:59.691 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:01.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:01.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:02 np0005588920 nova_compute[226886]: 2026-01-20 15:44:02.125 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:02 np0005588920 ovn_controller[133971]: 2026-01-20T15:44:02Z|00139|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:1f:89:ea 10.100.0.13
Jan 20 10:44:02 np0005588920 ovn_controller[133971]: 2026-01-20T15:44:02Z|00140|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1f:89:ea 10.100.0.13
Jan 20 10:44:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:44:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:03 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:03.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:03.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:04 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:44:04 np0005588920 nova_compute[226886]: 2026-01-20 15:44:04.730 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:05.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:44:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:05 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:05.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:05 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:44:05 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:44:05 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:44:07 np0005588920 nova_compute[226886]: 2026-01-20 15:44:07.160 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:07.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:07.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:44:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:09.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:44:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:09.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:44:09 np0005588920 nova_compute[226886]: 2026-01-20 15:44:09.734 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:44:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:11.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:44:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:11.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:11 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:44:11 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:44:12 np0005588920 nova_compute[226886]: 2026-01-20 15:44:12.164 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:44:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:13.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:44:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:13.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:44:14 np0005588920 nova_compute[226886]: 2026-01-20 15:44:14.371 226890 DEBUG nova.compute.manager [req-fae57697-f62c-43a6-abbf-703430fb5f76 req-be9fde80-5927-47ec-b81b-6c39e4a418e2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Received event network-changed-0d47bc74-c077-42be-be2e-f197e1e4b5ab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:44:14 np0005588920 nova_compute[226886]: 2026-01-20 15:44:14.372 226890 DEBUG nova.compute.manager [req-fae57697-f62c-43a6-abbf-703430fb5f76 req-be9fde80-5927-47ec-b81b-6c39e4a418e2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Refreshing instance network info cache due to event network-changed-0d47bc74-c077-42be-be2e-f197e1e4b5ab. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:44:14 np0005588920 nova_compute[226886]: 2026-01-20 15:44:14.372 226890 DEBUG oslo_concurrency.lockutils [req-fae57697-f62c-43a6-abbf-703430fb5f76 req-be9fde80-5927-47ec-b81b-6c39e4a418e2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-0dd4c943-fb9b-42db-93ef-7199a7deaf1d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:44:14 np0005588920 nova_compute[226886]: 2026-01-20 15:44:14.372 226890 DEBUG oslo_concurrency.lockutils [req-fae57697-f62c-43a6-abbf-703430fb5f76 req-be9fde80-5927-47ec-b81b-6c39e4a418e2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-0dd4c943-fb9b-42db-93ef-7199a7deaf1d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:44:14 np0005588920 nova_compute[226886]: 2026-01-20 15:44:14.372 226890 DEBUG nova.network.neutron [req-fae57697-f62c-43a6-abbf-703430fb5f76 req-be9fde80-5927-47ec-b81b-6c39e4a418e2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Refreshing network info cache for port 0d47bc74-c077-42be-be2e-f197e1e4b5ab _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:44:14 np0005588920 nova_compute[226886]: 2026-01-20 15:44:14.551 226890 DEBUG oslo_concurrency.lockutils [None req-ce5e50cf-7f18-417c-af42-0562c4e0fa24 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Acquiring lock "0dd4c943-fb9b-42db-93ef-7199a7deaf1d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:44:14 np0005588920 nova_compute[226886]: 2026-01-20 15:44:14.552 226890 DEBUG oslo_concurrency.lockutils [None req-ce5e50cf-7f18-417c-af42-0562c4e0fa24 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "0dd4c943-fb9b-42db-93ef-7199a7deaf1d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:44:14 np0005588920 nova_compute[226886]: 2026-01-20 15:44:14.552 226890 DEBUG oslo_concurrency.lockutils [None req-ce5e50cf-7f18-417c-af42-0562c4e0fa24 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Acquiring lock "0dd4c943-fb9b-42db-93ef-7199a7deaf1d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:44:14 np0005588920 nova_compute[226886]: 2026-01-20 15:44:14.552 226890 DEBUG oslo_concurrency.lockutils [None req-ce5e50cf-7f18-417c-af42-0562c4e0fa24 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "0dd4c943-fb9b-42db-93ef-7199a7deaf1d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:44:14 np0005588920 nova_compute[226886]: 2026-01-20 15:44:14.553 226890 DEBUG oslo_concurrency.lockutils [None req-ce5e50cf-7f18-417c-af42-0562c4e0fa24 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "0dd4c943-fb9b-42db-93ef-7199a7deaf1d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:44:14 np0005588920 nova_compute[226886]: 2026-01-20 15:44:14.554 226890 INFO nova.compute.manager [None req-ce5e50cf-7f18-417c-af42-0562c4e0fa24 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Terminating instance#033[00m
Jan 20 10:44:14 np0005588920 nova_compute[226886]: 2026-01-20 15:44:14.555 226890 DEBUG nova.compute.manager [None req-ce5e50cf-7f18-417c-af42-0562c4e0fa24 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:44:14 np0005588920 kernel: tap0d47bc74-c0 (unregistering): left promiscuous mode
Jan 20 10:44:14 np0005588920 NetworkManager[49076]: <info>  [1768923854.6106] device (tap0d47bc74-c0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:44:14 np0005588920 nova_compute[226886]: 2026-01-20 15:44:14.649 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:14 np0005588920 ovn_controller[133971]: 2026-01-20T15:44:14Z|00988|binding|INFO|Releasing lport 0d47bc74-c077-42be-be2e-f197e1e4b5ab from this chassis (sb_readonly=0)
Jan 20 10:44:14 np0005588920 ovn_controller[133971]: 2026-01-20T15:44:14Z|00989|binding|INFO|Setting lport 0d47bc74-c077-42be-be2e-f197e1e4b5ab down in Southbound
Jan 20 10:44:14 np0005588920 ovn_controller[133971]: 2026-01-20T15:44:14Z|00990|binding|INFO|Removing iface tap0d47bc74-c0 ovn-installed in OVS
Jan 20 10:44:14 np0005588920 nova_compute[226886]: 2026-01-20 15:44:14.652 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:44:14.660 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1f:89:ea 10.100.0.13'], port_security=['fa:16:3e:1f:89:ea 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '0dd4c943-fb9b-42db-93ef-7199a7deaf1d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b3454bfb-211a-4d3c-9cdd-2add870f0bc5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '728662ec7f654a3fb2e53a90b8707d7e', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6ab77576-28bb-49fd-83cd-f932fc142f3b a5ab9ca2-6ab4-4076-a615-8e84fc304af3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1ca4ccda-af2d-4a1b-9226-78b0c8cd522f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=0d47bc74-c077-42be-be2e-f197e1e4b5ab) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:44:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:44:14.661 144128 INFO neutron.agent.ovn.metadata.agent [-] Port 0d47bc74-c077-42be-be2e-f197e1e4b5ab in datapath b3454bfb-211a-4d3c-9cdd-2add870f0bc5 unbound from our chassis#033[00m
Jan 20 10:44:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:44:14.663 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b3454bfb-211a-4d3c-9cdd-2add870f0bc5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:44:14 np0005588920 nova_compute[226886]: 2026-01-20 15:44:14.664 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:44:14.664 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[3675f6cc-b0b5-42dc-9a8a-c628c258a458]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:44:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:44:14.665 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b3454bfb-211a-4d3c-9cdd-2add870f0bc5 namespace which is not needed anymore#033[00m
Jan 20 10:44:14 np0005588920 systemd[1]: machine-qemu\x2d101\x2dinstance\x2d000000dc.scope: Deactivated successfully.
Jan 20 10:44:14 np0005588920 systemd[1]: machine-qemu\x2d101\x2dinstance\x2d000000dc.scope: Consumed 13.578s CPU time.
Jan 20 10:44:14 np0005588920 systemd-machined[196121]: Machine qemu-101-instance-000000dc terminated.
Jan 20 10:44:14 np0005588920 nova_compute[226886]: 2026-01-20 15:44:14.735 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:14 np0005588920 nova_compute[226886]: 2026-01-20 15:44:14.780 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:14 np0005588920 nova_compute[226886]: 2026-01-20 15:44:14.787 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:14 np0005588920 neutron-haproxy-ovnmeta-b3454bfb-211a-4d3c-9cdd-2add870f0bc5[313334]: [NOTICE]   (313338) : haproxy version is 2.8.14-c23fe91
Jan 20 10:44:14 np0005588920 neutron-haproxy-ovnmeta-b3454bfb-211a-4d3c-9cdd-2add870f0bc5[313334]: [NOTICE]   (313338) : path to executable is /usr/sbin/haproxy
Jan 20 10:44:14 np0005588920 neutron-haproxy-ovnmeta-b3454bfb-211a-4d3c-9cdd-2add870f0bc5[313334]: [WARNING]  (313338) : Exiting Master process...
Jan 20 10:44:14 np0005588920 nova_compute[226886]: 2026-01-20 15:44:14.794 226890 INFO nova.virt.libvirt.driver [-] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Instance destroyed successfully.#033[00m
Jan 20 10:44:14 np0005588920 nova_compute[226886]: 2026-01-20 15:44:14.795 226890 DEBUG nova.objects.instance [None req-ce5e50cf-7f18-417c-af42-0562c4e0fa24 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lazy-loading 'resources' on Instance uuid 0dd4c943-fb9b-42db-93ef-7199a7deaf1d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:44:14 np0005588920 neutron-haproxy-ovnmeta-b3454bfb-211a-4d3c-9cdd-2add870f0bc5[313334]: [ALERT]    (313338) : Current worker (313340) exited with code 143 (Terminated)
Jan 20 10:44:14 np0005588920 neutron-haproxy-ovnmeta-b3454bfb-211a-4d3c-9cdd-2add870f0bc5[313334]: [WARNING]  (313338) : All workers exited. Exiting... (0)
Jan 20 10:44:14 np0005588920 systemd[1]: libpod-81482c90885c4e7ac27906f88cbe54c7dfc3255354ad4909ddae24300c9e061d.scope: Deactivated successfully.
Jan 20 10:44:14 np0005588920 podman[313578]: 2026-01-20 15:44:14.805282048 +0000 UTC m=+0.046585410 container died 81482c90885c4e7ac27906f88cbe54c7dfc3255354ad4909ddae24300c9e061d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b3454bfb-211a-4d3c-9cdd-2add870f0bc5, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:44:14 np0005588920 nova_compute[226886]: 2026-01-20 15:44:14.814 226890 DEBUG nova.virt.libvirt.vif [None req-ce5e50cf-7f18-417c-af42-0562c4e0fa24 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:43:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-342561427-access_point-1709505667',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-342561427-access_point-1709505667',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-342561427-acc',id=220,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLoz2GbOw8ow3DYb4rLFEuMWflz3gP9xv+U5mMCoX7iaUGmWyARnqdL7NG+heU2Zm874PRTc/Yh5PA/F6qY/4DIo5Ys6G6QNvcpeG1KGrWoDYICd41OXW09TeySSA+suuw==',key_name='tempest-TestSecurityGroupsBasicOps-990929955',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:43:49Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='728662ec7f654a3fb2e53a90b8707d7e',ramdisk_id='',reservation_id='r-xg0r85nd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-342561427',owner_user_name='tempest-TestSecurityGroupsBasicOps-342561427-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:43:49Z,user_data=None,user_id='5985ef736503499a9f1d734cabc33ce5',uuid=0dd4c943-fb9b-42db-93ef-7199a7deaf1d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0d47bc74-c077-42be-be2e-f197e1e4b5ab", "address": "fa:16:3e:1f:89:ea", "network": {"id": "b3454bfb-211a-4d3c-9cdd-2add870f0bc5", "bridge": "br-int", "label": "tempest-network-smoke--1056002555", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "728662ec7f654a3fb2e53a90b8707d7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d47bc74-c0", "ovs_interfaceid": "0d47bc74-c077-42be-be2e-f197e1e4b5ab", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:44:14 np0005588920 nova_compute[226886]: 2026-01-20 15:44:14.814 226890 DEBUG nova.network.os_vif_util [None req-ce5e50cf-7f18-417c-af42-0562c4e0fa24 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Converting VIF {"id": "0d47bc74-c077-42be-be2e-f197e1e4b5ab", "address": "fa:16:3e:1f:89:ea", "network": {"id": "b3454bfb-211a-4d3c-9cdd-2add870f0bc5", "bridge": "br-int", "label": "tempest-network-smoke--1056002555", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "728662ec7f654a3fb2e53a90b8707d7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d47bc74-c0", "ovs_interfaceid": "0d47bc74-c077-42be-be2e-f197e1e4b5ab", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:44:14 np0005588920 nova_compute[226886]: 2026-01-20 15:44:14.815 226890 DEBUG nova.network.os_vif_util [None req-ce5e50cf-7f18-417c-af42-0562c4e0fa24 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1f:89:ea,bridge_name='br-int',has_traffic_filtering=True,id=0d47bc74-c077-42be-be2e-f197e1e4b5ab,network=Network(b3454bfb-211a-4d3c-9cdd-2add870f0bc5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d47bc74-c0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:44:14 np0005588920 nova_compute[226886]: 2026-01-20 15:44:14.815 226890 DEBUG os_vif [None req-ce5e50cf-7f18-417c-af42-0562c4e0fa24 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1f:89:ea,bridge_name='br-int',has_traffic_filtering=True,id=0d47bc74-c077-42be-be2e-f197e1e4b5ab,network=Network(b3454bfb-211a-4d3c-9cdd-2add870f0bc5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d47bc74-c0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:44:14 np0005588920 nova_compute[226886]: 2026-01-20 15:44:14.817 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:14 np0005588920 nova_compute[226886]: 2026-01-20 15:44:14.818 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0d47bc74-c0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:44:14 np0005588920 nova_compute[226886]: 2026-01-20 15:44:14.819 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:14 np0005588920 nova_compute[226886]: 2026-01-20 15:44:14.820 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:14 np0005588920 nova_compute[226886]: 2026-01-20 15:44:14.825 226890 INFO os_vif [None req-ce5e50cf-7f18-417c-af42-0562c4e0fa24 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1f:89:ea,bridge_name='br-int',has_traffic_filtering=True,id=0d47bc74-c077-42be-be2e-f197e1e4b5ab,network=Network(b3454bfb-211a-4d3c-9cdd-2add870f0bc5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d47bc74-c0')#033[00m
Jan 20 10:44:14 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-81482c90885c4e7ac27906f88cbe54c7dfc3255354ad4909ddae24300c9e061d-userdata-shm.mount: Deactivated successfully.
Jan 20 10:44:14 np0005588920 systemd[1]: var-lib-containers-storage-overlay-582c3dc9400aa1947847fd719984c55c76db3a04746e2dc2ffa11ae707d1ffd9-merged.mount: Deactivated successfully.
Jan 20 10:44:14 np0005588920 podman[313578]: 2026-01-20 15:44:14.845541566 +0000 UTC m=+0.086844938 container cleanup 81482c90885c4e7ac27906f88cbe54c7dfc3255354ad4909ddae24300c9e061d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b3454bfb-211a-4d3c-9cdd-2add870f0bc5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 20 10:44:14 np0005588920 systemd[1]: libpod-conmon-81482c90885c4e7ac27906f88cbe54c7dfc3255354ad4909ddae24300c9e061d.scope: Deactivated successfully.
Jan 20 10:44:14 np0005588920 podman[313632]: 2026-01-20 15:44:14.907604946 +0000 UTC m=+0.040736103 container remove 81482c90885c4e7ac27906f88cbe54c7dfc3255354ad4909ddae24300c9e061d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b3454bfb-211a-4d3c-9cdd-2add870f0bc5, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 20 10:44:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:44:14.912 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[39d9e3ba-bde0-4e03-a0aa-0372acdbae0d]: (4, ('Tue Jan 20 03:44:14 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b3454bfb-211a-4d3c-9cdd-2add870f0bc5 (81482c90885c4e7ac27906f88cbe54c7dfc3255354ad4909ddae24300c9e061d)\n81482c90885c4e7ac27906f88cbe54c7dfc3255354ad4909ddae24300c9e061d\nTue Jan 20 03:44:14 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b3454bfb-211a-4d3c-9cdd-2add870f0bc5 (81482c90885c4e7ac27906f88cbe54c7dfc3255354ad4909ddae24300c9e061d)\n81482c90885c4e7ac27906f88cbe54c7dfc3255354ad4909ddae24300c9e061d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:44:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:44:14.914 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e04c233c-26da-4e2c-9fba-d0fccaddd755]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:44:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:44:14.915 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb3454bfb-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:44:14 np0005588920 nova_compute[226886]: 2026-01-20 15:44:14.917 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:14 np0005588920 kernel: tapb3454bfb-20: left promiscuous mode
Jan 20 10:44:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:44:14.920 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[5ffb6c4a-6e0b-4019-a96f-67c85d3878dd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:44:14 np0005588920 nova_compute[226886]: 2026-01-20 15:44:14.929 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:44:14.935 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f547ff84-e2ea-4fe1-8ad1-75cdb3020f42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:44:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:44:14.936 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[3a7e8a89-7de5-44e2-af7f-150f8cd094fb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:44:14 np0005588920 nova_compute[226886]: 2026-01-20 15:44:14.942 226890 DEBUG nova.compute.manager [req-92269c86-9090-44cd-a100-10e3c1c12e0d req-bcf87a5c-26d1-4060-8a11-75d2080e4354 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Received event network-vif-unplugged-0d47bc74-c077-42be-be2e-f197e1e4b5ab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:44:14 np0005588920 nova_compute[226886]: 2026-01-20 15:44:14.942 226890 DEBUG oslo_concurrency.lockutils [req-92269c86-9090-44cd-a100-10e3c1c12e0d req-bcf87a5c-26d1-4060-8a11-75d2080e4354 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "0dd4c943-fb9b-42db-93ef-7199a7deaf1d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:44:14 np0005588920 nova_compute[226886]: 2026-01-20 15:44:14.942 226890 DEBUG oslo_concurrency.lockutils [req-92269c86-9090-44cd-a100-10e3c1c12e0d req-bcf87a5c-26d1-4060-8a11-75d2080e4354 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "0dd4c943-fb9b-42db-93ef-7199a7deaf1d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:44:14 np0005588920 nova_compute[226886]: 2026-01-20 15:44:14.942 226890 DEBUG oslo_concurrency.lockutils [req-92269c86-9090-44cd-a100-10e3c1c12e0d req-bcf87a5c-26d1-4060-8a11-75d2080e4354 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "0dd4c943-fb9b-42db-93ef-7199a7deaf1d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:44:14 np0005588920 nova_compute[226886]: 2026-01-20 15:44:14.943 226890 DEBUG nova.compute.manager [req-92269c86-9090-44cd-a100-10e3c1c12e0d req-bcf87a5c-26d1-4060-8a11-75d2080e4354 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] No waiting events found dispatching network-vif-unplugged-0d47bc74-c077-42be-be2e-f197e1e4b5ab pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:44:14 np0005588920 nova_compute[226886]: 2026-01-20 15:44:14.943 226890 DEBUG nova.compute.manager [req-92269c86-9090-44cd-a100-10e3c1c12e0d req-bcf87a5c-26d1-4060-8a11-75d2080e4354 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Received event network-vif-unplugged-0d47bc74-c077-42be-be2e-f197e1e4b5ab for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:44:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:44:14.950 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[de92cf63-68a3-4850-ab8a-63a033e86a8d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 887001, 'reachable_time': 41776, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 313650, 'error': None, 'target': 'ovnmeta-b3454bfb-211a-4d3c-9cdd-2add870f0bc5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:44:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:44:14.953 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b3454bfb-211a-4d3c-9cdd-2add870f0bc5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:44:14 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:44:14.953 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[110b33d2-05ff-408c-a8e6-f7db0b1b2763]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:44:14 np0005588920 systemd[1]: run-netns-ovnmeta\x2db3454bfb\x2d211a\x2d4d3c\x2d9cdd\x2d2add870f0bc5.mount: Deactivated successfully.
Jan 20 10:44:15 np0005588920 nova_compute[226886]: 2026-01-20 15:44:15.185 226890 INFO nova.virt.libvirt.driver [None req-ce5e50cf-7f18-417c-af42-0562c4e0fa24 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Deleting instance files /var/lib/nova/instances/0dd4c943-fb9b-42db-93ef-7199a7deaf1d_del#033[00m
Jan 20 10:44:15 np0005588920 nova_compute[226886]: 2026-01-20 15:44:15.186 226890 INFO nova.virt.libvirt.driver [None req-ce5e50cf-7f18-417c-af42-0562c4e0fa24 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Deletion of /var/lib/nova/instances/0dd4c943-fb9b-42db-93ef-7199a7deaf1d_del complete#033[00m
Jan 20 10:44:15 np0005588920 nova_compute[226886]: 2026-01-20 15:44:15.238 226890 INFO nova.compute.manager [None req-ce5e50cf-7f18-417c-af42-0562c4e0fa24 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Took 0.68 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:44:15 np0005588920 nova_compute[226886]: 2026-01-20 15:44:15.239 226890 DEBUG oslo.service.loopingcall [None req-ce5e50cf-7f18-417c-af42-0562c4e0fa24 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:44:15 np0005588920 nova_compute[226886]: 2026-01-20 15:44:15.239 226890 DEBUG nova.compute.manager [-] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:44:15 np0005588920 nova_compute[226886]: 2026-01-20 15:44:15.240 226890 DEBUG nova.network.neutron [-] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:44:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:15.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:15.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:16 np0005588920 nova_compute[226886]: 2026-01-20 15:44:16.441 226890 DEBUG nova.network.neutron [req-fae57697-f62c-43a6-abbf-703430fb5f76 req-be9fde80-5927-47ec-b81b-6c39e4a418e2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Updated VIF entry in instance network info cache for port 0d47bc74-c077-42be-be2e-f197e1e4b5ab. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:44:16 np0005588920 nova_compute[226886]: 2026-01-20 15:44:16.443 226890 DEBUG nova.network.neutron [req-fae57697-f62c-43a6-abbf-703430fb5f76 req-be9fde80-5927-47ec-b81b-6c39e4a418e2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Updating instance_info_cache with network_info: [{"id": "0d47bc74-c077-42be-be2e-f197e1e4b5ab", "address": "fa:16:3e:1f:89:ea", "network": {"id": "b3454bfb-211a-4d3c-9cdd-2add870f0bc5", "bridge": "br-int", "label": "tempest-network-smoke--1056002555", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "728662ec7f654a3fb2e53a90b8707d7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d47bc74-c0", "ovs_interfaceid": "0d47bc74-c077-42be-be2e-f197e1e4b5ab", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:44:16 np0005588920 nova_compute[226886]: 2026-01-20 15:44:16.472 226890 DEBUG oslo_concurrency.lockutils [req-fae57697-f62c-43a6-abbf-703430fb5f76 req-be9fde80-5927-47ec-b81b-6c39e4a418e2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-0dd4c943-fb9b-42db-93ef-7199a7deaf1d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:44:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:44:16.508 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:44:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:44:16.509 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:44:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:44:16.509 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:44:17 np0005588920 nova_compute[226886]: 2026-01-20 15:44:17.040 226890 DEBUG nova.compute.manager [req-a9f03a35-22cd-4cd7-bb35-892b315b1391 req-1bb3eebb-506e-4925-8423-36e4000ab45d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Received event network-vif-plugged-0d47bc74-c077-42be-be2e-f197e1e4b5ab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:44:17 np0005588920 nova_compute[226886]: 2026-01-20 15:44:17.040 226890 DEBUG oslo_concurrency.lockutils [req-a9f03a35-22cd-4cd7-bb35-892b315b1391 req-1bb3eebb-506e-4925-8423-36e4000ab45d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "0dd4c943-fb9b-42db-93ef-7199a7deaf1d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:44:17 np0005588920 nova_compute[226886]: 2026-01-20 15:44:17.041 226890 DEBUG oslo_concurrency.lockutils [req-a9f03a35-22cd-4cd7-bb35-892b315b1391 req-1bb3eebb-506e-4925-8423-36e4000ab45d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "0dd4c943-fb9b-42db-93ef-7199a7deaf1d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:44:17 np0005588920 nova_compute[226886]: 2026-01-20 15:44:17.041 226890 DEBUG oslo_concurrency.lockutils [req-a9f03a35-22cd-4cd7-bb35-892b315b1391 req-1bb3eebb-506e-4925-8423-36e4000ab45d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "0dd4c943-fb9b-42db-93ef-7199a7deaf1d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:44:17 np0005588920 nova_compute[226886]: 2026-01-20 15:44:17.041 226890 DEBUG nova.compute.manager [req-a9f03a35-22cd-4cd7-bb35-892b315b1391 req-1bb3eebb-506e-4925-8423-36e4000ab45d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] No waiting events found dispatching network-vif-plugged-0d47bc74-c077-42be-be2e-f197e1e4b5ab pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:44:17 np0005588920 nova_compute[226886]: 2026-01-20 15:44:17.041 226890 WARNING nova.compute.manager [req-a9f03a35-22cd-4cd7-bb35-892b315b1391 req-1bb3eebb-506e-4925-8423-36e4000ab45d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Received unexpected event network-vif-plugged-0d47bc74-c077-42be-be2e-f197e1e4b5ab for instance with vm_state active and task_state deleting.#033[00m
Jan 20 10:44:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:44:17.186 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=89, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=88) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:44:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:44:17.187 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:44:17 np0005588920 nova_compute[226886]: 2026-01-20 15:44:17.187 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:17 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:44:17.188 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '89'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:44:17 np0005588920 nova_compute[226886]: 2026-01-20 15:44:17.209 226890 DEBUG nova.network.neutron [-] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:44:17 np0005588920 nova_compute[226886]: 2026-01-20 15:44:17.254 226890 INFO nova.compute.manager [-] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Took 2.01 seconds to deallocate network for instance.#033[00m
Jan 20 10:44:17 np0005588920 nova_compute[226886]: 2026-01-20 15:44:17.302 226890 DEBUG oslo_concurrency.lockutils [None req-ce5e50cf-7f18-417c-af42-0562c4e0fa24 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:44:17 np0005588920 nova_compute[226886]: 2026-01-20 15:44:17.303 226890 DEBUG oslo_concurrency.lockutils [None req-ce5e50cf-7f18-417c-af42-0562c4e0fa24 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:44:17 np0005588920 nova_compute[226886]: 2026-01-20 15:44:17.365 226890 DEBUG oslo_concurrency.processutils [None req-ce5e50cf-7f18-417c-af42-0562c4e0fa24 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:44:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:17.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:44:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:17.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:44:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:44:17 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3037305259' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:44:17 np0005588920 nova_compute[226886]: 2026-01-20 15:44:17.812 226890 DEBUG oslo_concurrency.processutils [None req-ce5e50cf-7f18-417c-af42-0562c4e0fa24 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:44:17 np0005588920 nova_compute[226886]: 2026-01-20 15:44:17.823 226890 DEBUG nova.compute.provider_tree [None req-ce5e50cf-7f18-417c-af42-0562c4e0fa24 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:44:17 np0005588920 nova_compute[226886]: 2026-01-20 15:44:17.859 226890 DEBUG nova.scheduler.client.report [None req-ce5e50cf-7f18-417c-af42-0562c4e0fa24 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:44:17 np0005588920 nova_compute[226886]: 2026-01-20 15:44:17.890 226890 DEBUG oslo_concurrency.lockutils [None req-ce5e50cf-7f18-417c-af42-0562c4e0fa24 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.587s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:44:17 np0005588920 nova_compute[226886]: 2026-01-20 15:44:17.925 226890 INFO nova.scheduler.client.report [None req-ce5e50cf-7f18-417c-af42-0562c4e0fa24 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Deleted allocations for instance 0dd4c943-fb9b-42db-93ef-7199a7deaf1d#033[00m
Jan 20 10:44:18 np0005588920 nova_compute[226886]: 2026-01-20 15:44:18.021 226890 DEBUG oslo_concurrency.lockutils [None req-ce5e50cf-7f18-417c-af42-0562c4e0fa24 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "0dd4c943-fb9b-42db-93ef-7199a7deaf1d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.470s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:44:18 np0005588920 podman[313675]: 2026-01-20 15:44:18.03399175 +0000 UTC m=+0.121844596 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:44:19 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:44:19 np0005588920 nova_compute[226886]: 2026-01-20 15:44:19.179 226890 DEBUG nova.compute.manager [req-224d4f9a-9c94-4653-b22d-bbb18049387d req-9f5d05b8-7b2e-45e2-965d-099ff768b9b0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Received event network-vif-deleted-0d47bc74-c077-42be-be2e-f197e1e4b5ab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:44:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:19.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:19.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:19 np0005588920 nova_compute[226886]: 2026-01-20 15:44:19.737 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:19 np0005588920 nova_compute[226886]: 2026-01-20 15:44:19.819 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:21.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:21.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:23.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:23.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:23 np0005588920 nova_compute[226886]: 2026-01-20 15:44:23.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:44:23 np0005588920 nova_compute[226886]: 2026-01-20 15:44:23.754 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:44:23 np0005588920 nova_compute[226886]: 2026-01-20 15:44:23.754 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:44:23 np0005588920 nova_compute[226886]: 2026-01-20 15:44:23.754 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:44:23 np0005588920 nova_compute[226886]: 2026-01-20 15:44:23.755 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:44:23 np0005588920 nova_compute[226886]: 2026-01-20 15:44:23.755 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:44:24 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:44:24 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:44:24 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2793886615' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:44:24 np0005588920 nova_compute[226886]: 2026-01-20 15:44:24.167 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.413s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:44:24 np0005588920 nova_compute[226886]: 2026-01-20 15:44:24.343 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:44:24 np0005588920 nova_compute[226886]: 2026-01-20 15:44:24.344 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4082MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:44:24 np0005588920 nova_compute[226886]: 2026-01-20 15:44:24.344 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:44:24 np0005588920 nova_compute[226886]: 2026-01-20 15:44:24.344 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:44:24 np0005588920 nova_compute[226886]: 2026-01-20 15:44:24.409 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:44:24 np0005588920 nova_compute[226886]: 2026-01-20 15:44:24.409 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:44:24 np0005588920 nova_compute[226886]: 2026-01-20 15:44:24.426 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Refreshing inventories for resource provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 20 10:44:24 np0005588920 nova_compute[226886]: 2026-01-20 15:44:24.431 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:24 np0005588920 nova_compute[226886]: 2026-01-20 15:44:24.450 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Updating ProviderTree inventory for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 20 10:44:24 np0005588920 nova_compute[226886]: 2026-01-20 15:44:24.451 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Updating inventory in ProviderTree for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 20 10:44:24 np0005588920 nova_compute[226886]: 2026-01-20 15:44:24.469 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Refreshing aggregate associations for resource provider ff38e91c-3320-4831-90ac-bcffc89ba7b6, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 20 10:44:24 np0005588920 nova_compute[226886]: 2026-01-20 15:44:24.489 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Refreshing trait associations for resource provider ff38e91c-3320-4831-90ac-bcffc89ba7b6, traits: COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_DEVICE_TAGGING,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 20 10:44:24 np0005588920 nova_compute[226886]: 2026-01-20 15:44:24.500 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:24 np0005588920 nova_compute[226886]: 2026-01-20 15:44:24.505 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:44:24 np0005588920 nova_compute[226886]: 2026-01-20 15:44:24.739 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:24 np0005588920 nova_compute[226886]: 2026-01-20 15:44:24.819 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:24 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:44:24 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2855771777' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:44:24 np0005588920 nova_compute[226886]: 2026-01-20 15:44:24.967 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:44:24 np0005588920 nova_compute[226886]: 2026-01-20 15:44:24.975 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:44:24 np0005588920 nova_compute[226886]: 2026-01-20 15:44:24.999 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:44:25 np0005588920 nova_compute[226886]: 2026-01-20 15:44:25.035 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:44:25 np0005588920 nova_compute[226886]: 2026-01-20 15:44:25.036 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.692s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:44:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:44:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:25.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:44:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:44:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:44:25 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:25.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:44:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:27.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:27.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:27 np0005588920 podman[313748]: 2026-01-20 15:44:27.964019734 +0000 UTC m=+0.052853059 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 20 10:44:29 np0005588920 nova_compute[226886]: 2026-01-20 15:44:29.038 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:44:29 np0005588920 nova_compute[226886]: 2026-01-20 15:44:29.038 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:44:29 np0005588920 nova_compute[226886]: 2026-01-20 15:44:29.039 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:44:29 np0005588920 nova_compute[226886]: 2026-01-20 15:44:29.057 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:44:29 np0005588920 nova_compute[226886]: 2026-01-20 15:44:29.058 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:44:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:44:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:29.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:29.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:29 np0005588920 nova_compute[226886]: 2026-01-20 15:44:29.740 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:29 np0005588920 nova_compute[226886]: 2026-01-20 15:44:29.793 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768923854.7918596, 0dd4c943-fb9b-42db-93ef-7199a7deaf1d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:44:29 np0005588920 nova_compute[226886]: 2026-01-20 15:44:29.794 226890 INFO nova.compute.manager [-] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:44:29 np0005588920 nova_compute[226886]: 2026-01-20 15:44:29.817 226890 DEBUG nova.compute.manager [None req-af6e0c3e-1edf-4330-8a4d-89612cefbab8 - - - - - -] [instance: 0dd4c943-fb9b-42db-93ef-7199a7deaf1d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:44:29 np0005588920 nova_compute[226886]: 2026-01-20 15:44:29.820 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18d786f0 =====
Jan 20 10:44:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:44:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18d786f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:44:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:31.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:44:31 np0005588920 radosgw[83324]: beast: 0x7f0a18d786f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:31.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:44:31 np0005588920 nova_compute[226886]: 2026-01-20 15:44:31.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:44:32 np0005588920 nova_compute[226886]: 2026-01-20 15:44:32.721 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:44:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:33.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:44:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:33.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:44:33 np0005588920 nova_compute[226886]: 2026-01-20 15:44:33.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:44:34 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:44:34 np0005588920 nova_compute[226886]: 2026-01-20 15:44:34.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:44:34 np0005588920 nova_compute[226886]: 2026-01-20 15:44:34.743 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:34 np0005588920 nova_compute[226886]: 2026-01-20 15:44:34.822 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:35.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:35.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:36 np0005588920 nova_compute[226886]: 2026-01-20 15:44:36.500 226890 DEBUG oslo_concurrency.lockutils [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Acquiring lock "ef848e46-b111-4794-8f38-0d8226550fc3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:44:36 np0005588920 nova_compute[226886]: 2026-01-20 15:44:36.501 226890 DEBUG oslo_concurrency.lockutils [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "ef848e46-b111-4794-8f38-0d8226550fc3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:44:36 np0005588920 nova_compute[226886]: 2026-01-20 15:44:36.517 226890 DEBUG nova.compute.manager [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 10:44:36 np0005588920 nova_compute[226886]: 2026-01-20 15:44:36.601 226890 DEBUG oslo_concurrency.lockutils [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:44:36 np0005588920 nova_compute[226886]: 2026-01-20 15:44:36.602 226890 DEBUG oslo_concurrency.lockutils [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:44:36 np0005588920 nova_compute[226886]: 2026-01-20 15:44:36.609 226890 DEBUG nova.virt.hardware [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 10:44:36 np0005588920 nova_compute[226886]: 2026-01-20 15:44:36.610 226890 INFO nova.compute.claims [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 20 10:44:36 np0005588920 nova_compute[226886]: 2026-01-20 15:44:36.752 226890 DEBUG oslo_concurrency.processutils [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:44:37 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:44:37 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3380038031' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:44:37 np0005588920 nova_compute[226886]: 2026-01-20 15:44:37.213 226890 DEBUG oslo_concurrency.processutils [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:44:37 np0005588920 nova_compute[226886]: 2026-01-20 15:44:37.220 226890 DEBUG nova.compute.provider_tree [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:44:37 np0005588920 nova_compute[226886]: 2026-01-20 15:44:37.235 226890 DEBUG nova.scheduler.client.report [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:44:37 np0005588920 nova_compute[226886]: 2026-01-20 15:44:37.259 226890 DEBUG oslo_concurrency.lockutils [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.658s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:44:37 np0005588920 nova_compute[226886]: 2026-01-20 15:44:37.261 226890 DEBUG nova.compute.manager [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 10:44:37 np0005588920 nova_compute[226886]: 2026-01-20 15:44:37.319 226890 DEBUG nova.compute.manager [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 10:44:37 np0005588920 nova_compute[226886]: 2026-01-20 15:44:37.319 226890 DEBUG nova.network.neutron [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 10:44:37 np0005588920 nova_compute[226886]: 2026-01-20 15:44:37.339 226890 INFO nova.virt.libvirt.driver [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 10:44:37 np0005588920 nova_compute[226886]: 2026-01-20 15:44:37.353 226890 DEBUG nova.compute.manager [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 10:44:37 np0005588920 nova_compute[226886]: 2026-01-20 15:44:37.455 226890 DEBUG nova.compute.manager [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 10:44:37 np0005588920 nova_compute[226886]: 2026-01-20 15:44:37.456 226890 DEBUG nova.virt.libvirt.driver [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 10:44:37 np0005588920 nova_compute[226886]: 2026-01-20 15:44:37.457 226890 INFO nova.virt.libvirt.driver [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Creating image(s)#033[00m
Jan 20 10:44:37 np0005588920 nova_compute[226886]: 2026-01-20 15:44:37.483 226890 DEBUG nova.storage.rbd_utils [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] rbd image ef848e46-b111-4794-8f38-0d8226550fc3_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:44:37 np0005588920 nova_compute[226886]: 2026-01-20 15:44:37.510 226890 DEBUG nova.storage.rbd_utils [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] rbd image ef848e46-b111-4794-8f38-0d8226550fc3_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:44:37 np0005588920 nova_compute[226886]: 2026-01-20 15:44:37.538 226890 DEBUG nova.storage.rbd_utils [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] rbd image ef848e46-b111-4794-8f38-0d8226550fc3_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:44:37 np0005588920 nova_compute[226886]: 2026-01-20 15:44:37.542 226890 DEBUG oslo_concurrency.processutils [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:44:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:44:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:37.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:44:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:37.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:37 np0005588920 nova_compute[226886]: 2026-01-20 15:44:37.574 226890 DEBUG nova.policy [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5985ef736503499a9f1d734cabc33ce5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '728662ec7f654a3fb2e53a90b8707d7e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 10:44:37 np0005588920 nova_compute[226886]: 2026-01-20 15:44:37.609 226890 DEBUG oslo_concurrency.processutils [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:44:37 np0005588920 nova_compute[226886]: 2026-01-20 15:44:37.609 226890 DEBUG oslo_concurrency.lockutils [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:44:37 np0005588920 nova_compute[226886]: 2026-01-20 15:44:37.610 226890 DEBUG oslo_concurrency.lockutils [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:44:37 np0005588920 nova_compute[226886]: 2026-01-20 15:44:37.610 226890 DEBUG oslo_concurrency.lockutils [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:44:37 np0005588920 nova_compute[226886]: 2026-01-20 15:44:37.639 226890 DEBUG nova.storage.rbd_utils [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] rbd image ef848e46-b111-4794-8f38-0d8226550fc3_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:44:37 np0005588920 nova_compute[226886]: 2026-01-20 15:44:37.643 226890 DEBUG oslo_concurrency.processutils [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 ef848e46-b111-4794-8f38-0d8226550fc3_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:44:37 np0005588920 nova_compute[226886]: 2026-01-20 15:44:37.886 226890 DEBUG oslo_concurrency.processutils [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 ef848e46-b111-4794-8f38-0d8226550fc3_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.244s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:44:37 np0005588920 nova_compute[226886]: 2026-01-20 15:44:37.957 226890 DEBUG nova.storage.rbd_utils [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] resizing rbd image ef848e46-b111-4794-8f38-0d8226550fc3_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 10:44:38 np0005588920 nova_compute[226886]: 2026-01-20 15:44:38.061 226890 DEBUG nova.objects.instance [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lazy-loading 'migration_context' on Instance uuid ef848e46-b111-4794-8f38-0d8226550fc3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:44:38 np0005588920 nova_compute[226886]: 2026-01-20 15:44:38.078 226890 DEBUG nova.virt.libvirt.driver [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 10:44:38 np0005588920 nova_compute[226886]: 2026-01-20 15:44:38.079 226890 DEBUG nova.virt.libvirt.driver [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Ensure instance console log exists: /var/lib/nova/instances/ef848e46-b111-4794-8f38-0d8226550fc3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:44:38 np0005588920 nova_compute[226886]: 2026-01-20 15:44:38.079 226890 DEBUG oslo_concurrency.lockutils [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:44:38 np0005588920 nova_compute[226886]: 2026-01-20 15:44:38.079 226890 DEBUG oslo_concurrency.lockutils [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:44:38 np0005588920 nova_compute[226886]: 2026-01-20 15:44:38.080 226890 DEBUG oslo_concurrency.lockutils [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:44:38 np0005588920 nova_compute[226886]: 2026-01-20 15:44:38.274 226890 DEBUG nova.network.neutron [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Successfully created port: f0cf81b3-1627-4877-a0ef-eebb8c5a8d98 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 10:44:39 np0005588920 nova_compute[226886]: 2026-01-20 15:44:39.006 226890 DEBUG nova.network.neutron [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Successfully updated port: f0cf81b3-1627-4877-a0ef-eebb8c5a8d98 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 10:44:39 np0005588920 nova_compute[226886]: 2026-01-20 15:44:39.023 226890 DEBUG oslo_concurrency.lockutils [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Acquiring lock "refresh_cache-ef848e46-b111-4794-8f38-0d8226550fc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:44:39 np0005588920 nova_compute[226886]: 2026-01-20 15:44:39.024 226890 DEBUG oslo_concurrency.lockutils [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Acquired lock "refresh_cache-ef848e46-b111-4794-8f38-0d8226550fc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:44:39 np0005588920 nova_compute[226886]: 2026-01-20 15:44:39.024 226890 DEBUG nova.network.neutron [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:44:39 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:44:39 np0005588920 nova_compute[226886]: 2026-01-20 15:44:39.147 226890 DEBUG nova.compute.manager [req-c677baff-feaa-49a5-b830-9ccbf6c507c5 req-109b317f-ae66-4214-9fa3-e523365a2621 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Received event network-changed-f0cf81b3-1627-4877-a0ef-eebb8c5a8d98 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:44:39 np0005588920 nova_compute[226886]: 2026-01-20 15:44:39.148 226890 DEBUG nova.compute.manager [req-c677baff-feaa-49a5-b830-9ccbf6c507c5 req-109b317f-ae66-4214-9fa3-e523365a2621 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Refreshing instance network info cache due to event network-changed-f0cf81b3-1627-4877-a0ef-eebb8c5a8d98. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:44:39 np0005588920 nova_compute[226886]: 2026-01-20 15:44:39.148 226890 DEBUG oslo_concurrency.lockutils [req-c677baff-feaa-49a5-b830-9ccbf6c507c5 req-109b317f-ae66-4214-9fa3-e523365a2621 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-ef848e46-b111-4794-8f38-0d8226550fc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:44:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:44:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:39.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:44:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:39.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:39 np0005588920 nova_compute[226886]: 2026-01-20 15:44:39.746 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:39 np0005588920 nova_compute[226886]: 2026-01-20 15:44:39.823 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:40 np0005588920 nova_compute[226886]: 2026-01-20 15:44:40.157 226890 DEBUG nova.network.neutron [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:44:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:41.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:41.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:41 np0005588920 nova_compute[226886]: 2026-01-20 15:44:41.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:44:41 np0005588920 nova_compute[226886]: 2026-01-20 15:44:41.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:44:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:43.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:43.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:43 np0005588920 nova_compute[226886]: 2026-01-20 15:44:43.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:44:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:44:44 np0005588920 nova_compute[226886]: 2026-01-20 15:44:44.197 226890 DEBUG nova.network.neutron [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Updating instance_info_cache with network_info: [{"id": "f0cf81b3-1627-4877-a0ef-eebb8c5a8d98", "address": "fa:16:3e:85:f3:9a", "network": {"id": "6567de92-725d-4dcc-97c2-0fec6d9bda84", "bridge": "br-int", "label": "tempest-network-smoke--623656421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "728662ec7f654a3fb2e53a90b8707d7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0cf81b3-16", "ovs_interfaceid": "f0cf81b3-1627-4877-a0ef-eebb8c5a8d98", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:44:44 np0005588920 nova_compute[226886]: 2026-01-20 15:44:44.348 226890 DEBUG oslo_concurrency.lockutils [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Releasing lock "refresh_cache-ef848e46-b111-4794-8f38-0d8226550fc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:44:44 np0005588920 nova_compute[226886]: 2026-01-20 15:44:44.349 226890 DEBUG nova.compute.manager [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Instance network_info: |[{"id": "f0cf81b3-1627-4877-a0ef-eebb8c5a8d98", "address": "fa:16:3e:85:f3:9a", "network": {"id": "6567de92-725d-4dcc-97c2-0fec6d9bda84", "bridge": "br-int", "label": "tempest-network-smoke--623656421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "728662ec7f654a3fb2e53a90b8707d7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0cf81b3-16", "ovs_interfaceid": "f0cf81b3-1627-4877-a0ef-eebb8c5a8d98", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 10:44:44 np0005588920 nova_compute[226886]: 2026-01-20 15:44:44.349 226890 DEBUG oslo_concurrency.lockutils [req-c677baff-feaa-49a5-b830-9ccbf6c507c5 req-109b317f-ae66-4214-9fa3-e523365a2621 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-ef848e46-b111-4794-8f38-0d8226550fc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:44:44 np0005588920 nova_compute[226886]: 2026-01-20 15:44:44.349 226890 DEBUG nova.network.neutron [req-c677baff-feaa-49a5-b830-9ccbf6c507c5 req-109b317f-ae66-4214-9fa3-e523365a2621 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Refreshing network info cache for port f0cf81b3-1627-4877-a0ef-eebb8c5a8d98 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:44:44 np0005588920 nova_compute[226886]: 2026-01-20 15:44:44.351 226890 DEBUG nova.virt.libvirt.driver [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Start _get_guest_xml network_info=[{"id": "f0cf81b3-1627-4877-a0ef-eebb8c5a8d98", "address": "fa:16:3e:85:f3:9a", "network": {"id": "6567de92-725d-4dcc-97c2-0fec6d9bda84", "bridge": "br-int", "label": "tempest-network-smoke--623656421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "728662ec7f654a3fb2e53a90b8707d7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0cf81b3-16", "ovs_interfaceid": "f0cf81b3-1627-4877-a0ef-eebb8c5a8d98", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:44:44 np0005588920 nova_compute[226886]: 2026-01-20 15:44:44.355 226890 WARNING nova.virt.libvirt.driver [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:44:44 np0005588920 nova_compute[226886]: 2026-01-20 15:44:44.366 226890 DEBUG nova.virt.libvirt.host [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:44:44 np0005588920 nova_compute[226886]: 2026-01-20 15:44:44.366 226890 DEBUG nova.virt.libvirt.host [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:44:44 np0005588920 nova_compute[226886]: 2026-01-20 15:44:44.371 226890 DEBUG nova.virt.libvirt.host [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:44:44 np0005588920 nova_compute[226886]: 2026-01-20 15:44:44.371 226890 DEBUG nova.virt.libvirt.host [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:44:44 np0005588920 nova_compute[226886]: 2026-01-20 15:44:44.372 226890 DEBUG nova.virt.libvirt.driver [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:44:44 np0005588920 nova_compute[226886]: 2026-01-20 15:44:44.372 226890 DEBUG nova.virt.hardware [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:44:44 np0005588920 nova_compute[226886]: 2026-01-20 15:44:44.372 226890 DEBUG nova.virt.hardware [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:44:44 np0005588920 nova_compute[226886]: 2026-01-20 15:44:44.373 226890 DEBUG nova.virt.hardware [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:44:44 np0005588920 nova_compute[226886]: 2026-01-20 15:44:44.373 226890 DEBUG nova.virt.hardware [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:44:44 np0005588920 nova_compute[226886]: 2026-01-20 15:44:44.373 226890 DEBUG nova.virt.hardware [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:44:44 np0005588920 nova_compute[226886]: 2026-01-20 15:44:44.373 226890 DEBUG nova.virt.hardware [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:44:44 np0005588920 nova_compute[226886]: 2026-01-20 15:44:44.373 226890 DEBUG nova.virt.hardware [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:44:44 np0005588920 nova_compute[226886]: 2026-01-20 15:44:44.373 226890 DEBUG nova.virt.hardware [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:44:44 np0005588920 nova_compute[226886]: 2026-01-20 15:44:44.374 226890 DEBUG nova.virt.hardware [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:44:44 np0005588920 nova_compute[226886]: 2026-01-20 15:44:44.374 226890 DEBUG nova.virt.hardware [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:44:44 np0005588920 nova_compute[226886]: 2026-01-20 15:44:44.374 226890 DEBUG nova.virt.hardware [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:44:44 np0005588920 nova_compute[226886]: 2026-01-20 15:44:44.376 226890 DEBUG oslo_concurrency.processutils [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:44:44 np0005588920 nova_compute[226886]: 2026-01-20 15:44:44.779 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:44:44 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2712563561' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:44:44 np0005588920 nova_compute[226886]: 2026-01-20 15:44:44.824 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:44 np0005588920 nova_compute[226886]: 2026-01-20 15:44:44.842 226890 DEBUG oslo_concurrency.processutils [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:44:44 np0005588920 nova_compute[226886]: 2026-01-20 15:44:44.868 226890 DEBUG nova.storage.rbd_utils [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] rbd image ef848e46-b111-4794-8f38-0d8226550fc3_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:44:44 np0005588920 nova_compute[226886]: 2026-01-20 15:44:44.871 226890 DEBUG oslo_concurrency.processutils [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:44:45 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:44:45 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3294861214' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:44:45 np0005588920 nova_compute[226886]: 2026-01-20 15:44:45.336 226890 DEBUG oslo_concurrency.processutils [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:44:45 np0005588920 nova_compute[226886]: 2026-01-20 15:44:45.340 226890 DEBUG nova.virt.libvirt.vif [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:44:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-342561427-access_point-2085009324',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-342561427-access_point-2085009324',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-342561427-acc',id=221,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJW0RDNBQ7KP8Mqzxdg2i8X8upMhqABnnonEiTjmMv4W9RTdXxd1b3Z8QL9swZ0e0+6po4+8oM5PFrC0tn+WmJ7twYzqOI2QMeaFZC9+Q35AVwNQsKxl3WWPGvw1iSa1jA==',key_name='tempest-TestSecurityGroupsBasicOps-1661471182',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='728662ec7f654a3fb2e53a90b8707d7e',ramdisk_id='',reservation_id='r-pdzuahhw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-342561427',owner_user_name='tempest-TestSecurityGroupsBasicOps-342561427-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:44:37Z,user_data=None,user_id='5985ef736503499a9f1d734cabc33ce5',uuid=ef848e46-b111-4794-8f38-0d8226550fc3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f0cf81b3-1627-4877-a0ef-eebb8c5a8d98", "address": "fa:16:3e:85:f3:9a", "network": {"id": "6567de92-725d-4dcc-97c2-0fec6d9bda84", "bridge": "br-int", "label": "tempest-network-smoke--623656421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "728662ec7f654a3fb2e53a90b8707d7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0cf81b3-16", "ovs_interfaceid": "f0cf81b3-1627-4877-a0ef-eebb8c5a8d98", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:44:45 np0005588920 nova_compute[226886]: 2026-01-20 15:44:45.341 226890 DEBUG nova.network.os_vif_util [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Converting VIF {"id": "f0cf81b3-1627-4877-a0ef-eebb8c5a8d98", "address": "fa:16:3e:85:f3:9a", "network": {"id": "6567de92-725d-4dcc-97c2-0fec6d9bda84", "bridge": "br-int", "label": "tempest-network-smoke--623656421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "728662ec7f654a3fb2e53a90b8707d7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0cf81b3-16", "ovs_interfaceid": "f0cf81b3-1627-4877-a0ef-eebb8c5a8d98", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:44:45 np0005588920 nova_compute[226886]: 2026-01-20 15:44:45.343 226890 DEBUG nova.network.os_vif_util [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:85:f3:9a,bridge_name='br-int',has_traffic_filtering=True,id=f0cf81b3-1627-4877-a0ef-eebb8c5a8d98,network=Network(6567de92-725d-4dcc-97c2-0fec6d9bda84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0cf81b3-16') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:44:45 np0005588920 nova_compute[226886]: 2026-01-20 15:44:45.345 226890 DEBUG nova.objects.instance [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lazy-loading 'pci_devices' on Instance uuid ef848e46-b111-4794-8f38-0d8226550fc3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:44:45 np0005588920 nova_compute[226886]: 2026-01-20 15:44:45.364 226890 DEBUG nova.virt.libvirt.driver [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:44:45 np0005588920 nova_compute[226886]:  <uuid>ef848e46-b111-4794-8f38-0d8226550fc3</uuid>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:  <name>instance-000000dd</name>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:  <memory>131072</memory>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:  <vcpu>1</vcpu>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:  <metadata>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:44:45 np0005588920 nova_compute[226886]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-342561427-access_point-2085009324</nova:name>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:      <nova:creationTime>2026-01-20 15:44:44</nova:creationTime>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:      <nova:flavor name="m1.nano">
Jan 20 10:44:45 np0005588920 nova_compute[226886]:        <nova:memory>128</nova:memory>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:        <nova:disk>1</nova:disk>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:        <nova:swap>0</nova:swap>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:      </nova:flavor>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:      <nova:owner>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:        <nova:user uuid="5985ef736503499a9f1d734cabc33ce5">tempest-TestSecurityGroupsBasicOps-342561427-project-member</nova:user>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:        <nova:project uuid="728662ec7f654a3fb2e53a90b8707d7e">tempest-TestSecurityGroupsBasicOps-342561427</nova:project>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:      </nova:owner>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:      <nova:ports>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:        <nova:port uuid="f0cf81b3-1627-4877-a0ef-eebb8c5a8d98">
Jan 20 10:44:45 np0005588920 nova_compute[226886]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:        </nova:port>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:      </nova:ports>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    </nova:instance>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:  </metadata>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:  <sysinfo type="smbios">
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    <system>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:      <entry name="serial">ef848e46-b111-4794-8f38-0d8226550fc3</entry>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:      <entry name="uuid">ef848e46-b111-4794-8f38-0d8226550fc3</entry>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    </system>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:  </sysinfo>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:  <os>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    <boot dev="hd"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    <smbios mode="sysinfo"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:  </os>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:  <features>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    <acpi/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    <apic/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    <vmcoreinfo/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:  </features>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:  <clock offset="utc">
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    <timer name="hpet" present="no"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:  </clock>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:  <cpu mode="custom" match="exact">
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    <model>Nehalem</model>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:  </cpu>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:  <devices>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    <disk type="network" device="disk">
Jan 20 10:44:45 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/ef848e46-b111-4794-8f38-0d8226550fc3_disk">
Jan 20 10:44:45 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:44:45 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:      <target dev="vda" bus="virtio"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    <disk type="network" device="cdrom">
Jan 20 10:44:45 np0005588920 nova_compute[226886]:      <driver type="raw" cache="none"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:      <source protocol="rbd" name="vms/ef848e46-b111-4794-8f38-0d8226550fc3_disk.config">
Jan 20 10:44:45 np0005588920 nova_compute[226886]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:      </source>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:      <auth username="openstack">
Jan 20 10:44:45 np0005588920 nova_compute[226886]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:      </auth>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:      <target dev="sda" bus="sata"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    </disk>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    <interface type="ethernet">
Jan 20 10:44:45 np0005588920 nova_compute[226886]:      <mac address="fa:16:3e:85:f3:9a"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:      <mtu size="1442"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:      <target dev="tapf0cf81b3-16"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    </interface>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    <serial type="pty">
Jan 20 10:44:45 np0005588920 nova_compute[226886]:      <log file="/var/lib/nova/instances/ef848e46-b111-4794-8f38-0d8226550fc3/console.log" append="off"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    </serial>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    <video>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:      <model type="virtio"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    </video>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    <input type="tablet" bus="usb"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    <rng model="virtio">
Jan 20 10:44:45 np0005588920 nova_compute[226886]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    </rng>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    <controller type="usb" index="0"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    <memballoon model="virtio">
Jan 20 10:44:45 np0005588920 nova_compute[226886]:      <stats period="10"/>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:    </memballoon>
Jan 20 10:44:45 np0005588920 nova_compute[226886]:  </devices>
Jan 20 10:44:45 np0005588920 nova_compute[226886]: </domain>
Jan 20 10:44:45 np0005588920 nova_compute[226886]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:44:45 np0005588920 nova_compute[226886]: 2026-01-20 15:44:45.366 226890 DEBUG nova.compute.manager [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Preparing to wait for external event network-vif-plugged-f0cf81b3-1627-4877-a0ef-eebb8c5a8d98 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:44:45 np0005588920 nova_compute[226886]: 2026-01-20 15:44:45.367 226890 DEBUG oslo_concurrency.lockutils [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Acquiring lock "ef848e46-b111-4794-8f38-0d8226550fc3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:44:45 np0005588920 nova_compute[226886]: 2026-01-20 15:44:45.367 226890 DEBUG oslo_concurrency.lockutils [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "ef848e46-b111-4794-8f38-0d8226550fc3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:44:45 np0005588920 nova_compute[226886]: 2026-01-20 15:44:45.367 226890 DEBUG oslo_concurrency.lockutils [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "ef848e46-b111-4794-8f38-0d8226550fc3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:44:45 np0005588920 nova_compute[226886]: 2026-01-20 15:44:45.368 226890 DEBUG nova.virt.libvirt.vif [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:44:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-342561427-access_point-2085009324',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-342561427-access_point-2085009324',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-342561427-acc',id=221,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJW0RDNBQ7KP8Mqzxdg2i8X8upMhqABnnonEiTjmMv4W9RTdXxd1b3Z8QL9swZ0e0+6po4+8oM5PFrC0tn+WmJ7twYzqOI2QMeaFZC9+Q35AVwNQsKxl3WWPGvw1iSa1jA==',key_name='tempest-TestSecurityGroupsBasicOps-1661471182',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='728662ec7f654a3fb2e53a90b8707d7e',ramdisk_id='',reservation_id='r-pdzuahhw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-342561427',owner_user_name='tempest-TestSecurityGroupsBasicOps-342561427-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:44:37Z,user_data=None,user_id='5985ef736503499a9f1d734cabc33ce5',uuid=ef848e46-b111-4794-8f38-0d8226550fc3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f0cf81b3-1627-4877-a0ef-eebb8c5a8d98", "address": "fa:16:3e:85:f3:9a", "network": {"id": "6567de92-725d-4dcc-97c2-0fec6d9bda84", "bridge": "br-int", "label": "tempest-network-smoke--623656421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "728662ec7f654a3fb2e53a90b8707d7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0cf81b3-16", "ovs_interfaceid": "f0cf81b3-1627-4877-a0ef-eebb8c5a8d98", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:44:45 np0005588920 nova_compute[226886]: 2026-01-20 15:44:45.368 226890 DEBUG nova.network.os_vif_util [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Converting VIF {"id": "f0cf81b3-1627-4877-a0ef-eebb8c5a8d98", "address": "fa:16:3e:85:f3:9a", "network": {"id": "6567de92-725d-4dcc-97c2-0fec6d9bda84", "bridge": "br-int", "label": "tempest-network-smoke--623656421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "728662ec7f654a3fb2e53a90b8707d7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0cf81b3-16", "ovs_interfaceid": "f0cf81b3-1627-4877-a0ef-eebb8c5a8d98", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:44:45 np0005588920 nova_compute[226886]: 2026-01-20 15:44:45.369 226890 DEBUG nova.network.os_vif_util [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:85:f3:9a,bridge_name='br-int',has_traffic_filtering=True,id=f0cf81b3-1627-4877-a0ef-eebb8c5a8d98,network=Network(6567de92-725d-4dcc-97c2-0fec6d9bda84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0cf81b3-16') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:44:45 np0005588920 nova_compute[226886]: 2026-01-20 15:44:45.369 226890 DEBUG os_vif [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:85:f3:9a,bridge_name='br-int',has_traffic_filtering=True,id=f0cf81b3-1627-4877-a0ef-eebb8c5a8d98,network=Network(6567de92-725d-4dcc-97c2-0fec6d9bda84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0cf81b3-16') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:44:45 np0005588920 nova_compute[226886]: 2026-01-20 15:44:45.370 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:45 np0005588920 nova_compute[226886]: 2026-01-20 15:44:45.371 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:44:45 np0005588920 nova_compute[226886]: 2026-01-20 15:44:45.371 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:44:45 np0005588920 nova_compute[226886]: 2026-01-20 15:44:45.375 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:45 np0005588920 nova_compute[226886]: 2026-01-20 15:44:45.375 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf0cf81b3-16, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:44:45 np0005588920 nova_compute[226886]: 2026-01-20 15:44:45.376 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf0cf81b3-16, col_values=(('external_ids', {'iface-id': 'f0cf81b3-1627-4877-a0ef-eebb8c5a8d98', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:85:f3:9a', 'vm-uuid': 'ef848e46-b111-4794-8f38-0d8226550fc3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:44:45 np0005588920 nova_compute[226886]: 2026-01-20 15:44:45.377 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:45 np0005588920 NetworkManager[49076]: <info>  [1768923885.3786] manager: (tapf0cf81b3-16): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/468)
Jan 20 10:44:45 np0005588920 nova_compute[226886]: 2026-01-20 15:44:45.382 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:44:45 np0005588920 nova_compute[226886]: 2026-01-20 15:44:45.387 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:45 np0005588920 nova_compute[226886]: 2026-01-20 15:44:45.388 226890 INFO os_vif [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:85:f3:9a,bridge_name='br-int',has_traffic_filtering=True,id=f0cf81b3-1627-4877-a0ef-eebb8c5a8d98,network=Network(6567de92-725d-4dcc-97c2-0fec6d9bda84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0cf81b3-16')#033[00m
Jan 20 10:44:45 np0005588920 nova_compute[226886]: 2026-01-20 15:44:45.442 226890 DEBUG nova.virt.libvirt.driver [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:44:45 np0005588920 nova_compute[226886]: 2026-01-20 15:44:45.442 226890 DEBUG nova.virt.libvirt.driver [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:44:45 np0005588920 nova_compute[226886]: 2026-01-20 15:44:45.443 226890 DEBUG nova.virt.libvirt.driver [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] No VIF found with MAC fa:16:3e:85:f3:9a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:44:45 np0005588920 nova_compute[226886]: 2026-01-20 15:44:45.444 226890 INFO nova.virt.libvirt.driver [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Using config drive#033[00m
Jan 20 10:44:45 np0005588920 nova_compute[226886]: 2026-01-20 15:44:45.482 226890 DEBUG nova.storage.rbd_utils [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] rbd image ef848e46-b111-4794-8f38-0d8226550fc3_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:44:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:45.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:45.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:46 np0005588920 nova_compute[226886]: 2026-01-20 15:44:46.436 226890 INFO nova.virt.libvirt.driver [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Creating config drive at /var/lib/nova/instances/ef848e46-b111-4794-8f38-0d8226550fc3/disk.config#033[00m
Jan 20 10:44:46 np0005588920 nova_compute[226886]: 2026-01-20 15:44:46.447 226890 DEBUG oslo_concurrency.processutils [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ef848e46-b111-4794-8f38-0d8226550fc3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3wz73t2e execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:44:46 np0005588920 nova_compute[226886]: 2026-01-20 15:44:46.539 226890 DEBUG nova.network.neutron [req-c677baff-feaa-49a5-b830-9ccbf6c507c5 req-109b317f-ae66-4214-9fa3-e523365a2621 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Updated VIF entry in instance network info cache for port f0cf81b3-1627-4877-a0ef-eebb8c5a8d98. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:44:46 np0005588920 nova_compute[226886]: 2026-01-20 15:44:46.541 226890 DEBUG nova.network.neutron [req-c677baff-feaa-49a5-b830-9ccbf6c507c5 req-109b317f-ae66-4214-9fa3-e523365a2621 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Updating instance_info_cache with network_info: [{"id": "f0cf81b3-1627-4877-a0ef-eebb8c5a8d98", "address": "fa:16:3e:85:f3:9a", "network": {"id": "6567de92-725d-4dcc-97c2-0fec6d9bda84", "bridge": "br-int", "label": "tempest-network-smoke--623656421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "728662ec7f654a3fb2e53a90b8707d7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0cf81b3-16", "ovs_interfaceid": "f0cf81b3-1627-4877-a0ef-eebb8c5a8d98", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:44:46 np0005588920 nova_compute[226886]: 2026-01-20 15:44:46.558 226890 DEBUG oslo_concurrency.lockutils [req-c677baff-feaa-49a5-b830-9ccbf6c507c5 req-109b317f-ae66-4214-9fa3-e523365a2621 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-ef848e46-b111-4794-8f38-0d8226550fc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:44:46 np0005588920 nova_compute[226886]: 2026-01-20 15:44:46.602 226890 DEBUG oslo_concurrency.processutils [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ef848e46-b111-4794-8f38-0d8226550fc3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3wz73t2e" returned: 0 in 0.155s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:44:46 np0005588920 nova_compute[226886]: 2026-01-20 15:44:46.652 226890 DEBUG nova.storage.rbd_utils [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] rbd image ef848e46-b111-4794-8f38-0d8226550fc3_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:44:46 np0005588920 nova_compute[226886]: 2026-01-20 15:44:46.658 226890 DEBUG oslo_concurrency.processutils [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/ef848e46-b111-4794-8f38-0d8226550fc3/disk.config ef848e46-b111-4794-8f38-0d8226550fc3_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:44:46 np0005588920 nova_compute[226886]: 2026-01-20 15:44:46.853 226890 DEBUG oslo_concurrency.processutils [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/ef848e46-b111-4794-8f38-0d8226550fc3/disk.config ef848e46-b111-4794-8f38-0d8226550fc3_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.194s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:44:46 np0005588920 nova_compute[226886]: 2026-01-20 15:44:46.854 226890 INFO nova.virt.libvirt.driver [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Deleting local config drive /var/lib/nova/instances/ef848e46-b111-4794-8f38-0d8226550fc3/disk.config because it was imported into RBD.#033[00m
Jan 20 10:44:46 np0005588920 kernel: tapf0cf81b3-16: entered promiscuous mode
Jan 20 10:44:46 np0005588920 NetworkManager[49076]: <info>  [1768923886.9102] manager: (tapf0cf81b3-16): new Tun device (/org/freedesktop/NetworkManager/Devices/469)
Jan 20 10:44:46 np0005588920 nova_compute[226886]: 2026-01-20 15:44:46.909 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:46 np0005588920 ovn_controller[133971]: 2026-01-20T15:44:46Z|00991|binding|INFO|Claiming lport f0cf81b3-1627-4877-a0ef-eebb8c5a8d98 for this chassis.
Jan 20 10:44:46 np0005588920 ovn_controller[133971]: 2026-01-20T15:44:46Z|00992|binding|INFO|f0cf81b3-1627-4877-a0ef-eebb8c5a8d98: Claiming fa:16:3e:85:f3:9a 10.100.0.4
Jan 20 10:44:46 np0005588920 nova_compute[226886]: 2026-01-20 15:44:46.916 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:46 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:44:46.924 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:85:f3:9a 10.100.0.4'], port_security=['fa:16:3e:85:f3:9a 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'ef848e46-b111-4794-8f38-0d8226550fc3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6567de92-725d-4dcc-97c2-0fec6d9bda84', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '728662ec7f654a3fb2e53a90b8707d7e', 'neutron:revision_number': '2', 'neutron:security_group_ids': '133c0593-3211-4540-bb4e-2efa6f05d67f 204734f0-d45b-4b06-9def-83db7a4e110a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=92670689-434c-4ed8-a2e4-6278a7d19616, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=f0cf81b3-1627-4877-a0ef-eebb8c5a8d98) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:44:46 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:44:46.926 144128 INFO neutron.agent.ovn.metadata.agent [-] Port f0cf81b3-1627-4877-a0ef-eebb8c5a8d98 in datapath 6567de92-725d-4dcc-97c2-0fec6d9bda84 bound to our chassis#033[00m
Jan 20 10:44:46 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:44:46.927 144128 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6567de92-725d-4dcc-97c2-0fec6d9bda84#033[00m
Jan 20 10:44:46 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:44:46.936 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[7fda8b00-a26d-470c-ad0b-49c79f07c578]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:44:46 np0005588920 systemd-machined[196121]: New machine qemu-102-instance-000000dd.
Jan 20 10:44:46 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:44:46.937 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6567de92-71 in ovnmeta-6567de92-725d-4dcc-97c2-0fec6d9bda84 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:44:46 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:44:46.938 229672 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6567de92-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:44:46 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:44:46.939 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d28e15fd-8f35-444b-879c-ddeca23ce5e2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:44:46 np0005588920 systemd-udevd[314095]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:44:46 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:44:46.939 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[f5e2c688-a1bb-429e-abc8-3284dd71e3d4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:44:46 np0005588920 NetworkManager[49076]: <info>  [1768923886.9529] device (tapf0cf81b3-16): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:44:46 np0005588920 NetworkManager[49076]: <info>  [1768923886.9534] device (tapf0cf81b3-16): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:44:46 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:44:46.953 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[292c5e1f-45e1-4729-9559-fa91a60a3691]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:44:46 np0005588920 systemd[1]: Started Virtual Machine qemu-102-instance-000000dd.
Jan 20 10:44:46 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:44:46.979 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[e76318ee-1067-421a-95f5-ae6c87fb3e4f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:44:46 np0005588920 nova_compute[226886]: 2026-01-20 15:44:46.987 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:46 np0005588920 ovn_controller[133971]: 2026-01-20T15:44:46Z|00993|binding|INFO|Setting lport f0cf81b3-1627-4877-a0ef-eebb8c5a8d98 ovn-installed in OVS
Jan 20 10:44:46 np0005588920 ovn_controller[133971]: 2026-01-20T15:44:46Z|00994|binding|INFO|Setting lport f0cf81b3-1627-4877-a0ef-eebb8c5a8d98 up in Southbound
Jan 20 10:44:46 np0005588920 nova_compute[226886]: 2026-01-20 15:44:46.991 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:44:47.007 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[789ac7ee-29ec-4275-b2a0-2ac3fa426d4f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:44:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:44:47.013 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2f23810d-5961-4bd0-ab7b-fc601791b881]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:44:47 np0005588920 NetworkManager[49076]: <info>  [1768923887.0140] manager: (tap6567de92-70): new Veth device (/org/freedesktop/NetworkManager/Devices/470)
Jan 20 10:44:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:44:47.044 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[6242b36a-1f18-4d34-a9c2-21681eba47c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:44:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:44:47.047 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[982ff349-42c1-4e2f-bbc4-48794db28e72]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:44:47 np0005588920 NetworkManager[49076]: <info>  [1768923887.0691] device (tap6567de92-70): carrier: link connected
Jan 20 10:44:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:44:47.074 229728 DEBUG oslo.privsep.daemon [-] privsep: reply[0170db83-44a2-4f57-a220-1480fb812350]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:44:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:44:47.091 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[c542f9cb-e816-4626-a0c5-cf8cc7a595ea]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6567de92-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0d:26:65'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 310], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 892875, 'reachable_time': 42118, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 314127, 'error': None, 'target': 'ovnmeta-6567de92-725d-4dcc-97c2-0fec6d9bda84', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:44:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:44:47.106 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a04414a5-6ac5-4c18-8292-80dd40d5e053]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0d:2665'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 892875, 'tstamp': 892875}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 314128, 'error': None, 'target': 'ovnmeta-6567de92-725d-4dcc-97c2-0fec6d9bda84', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:44:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:44:47.123 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2e05294f-599f-4f42-82da-00c848026cc6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6567de92-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0d:26:65'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 310], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 892875, 'reachable_time': 42118, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 314129, 'error': None, 'target': 'ovnmeta-6567de92-725d-4dcc-97c2-0fec6d9bda84', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:44:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:44:47.163 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a93fad5a-7334-479c-916c-691826c333ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:44:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:44:47.228 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[a3fa45cf-4fa8-4500-965f-a2d9a87fd224]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:44:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:44:47.230 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6567de92-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:44:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:44:47.231 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:44:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:44:47.231 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6567de92-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:44:47 np0005588920 kernel: tap6567de92-70: entered promiscuous mode
Jan 20 10:44:47 np0005588920 NetworkManager[49076]: <info>  [1768923887.2337] manager: (tap6567de92-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/471)
Jan 20 10:44:47 np0005588920 nova_compute[226886]: 2026-01-20 15:44:47.234 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:44:47.235 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6567de92-70, col_values=(('external_ids', {'iface-id': '49b3575c-9e2a-4ac6-bd85-e7d639dfd6e3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:44:47 np0005588920 nova_compute[226886]: 2026-01-20 15:44:47.236 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:47 np0005588920 ovn_controller[133971]: 2026-01-20T15:44:47Z|00995|binding|INFO|Releasing lport 49b3575c-9e2a-4ac6-bd85-e7d639dfd6e3 from this chassis (sb_readonly=1)
Jan 20 10:44:47 np0005588920 nova_compute[226886]: 2026-01-20 15:44:47.249 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:44:47.250 144128 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6567de92-725d-4dcc-97c2-0fec6d9bda84.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6567de92-725d-4dcc-97c2-0fec6d9bda84.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:44:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:44:47.252 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2234cac1-9a10-4f35-974e-236b194611b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:44:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:44:47.253 144128 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:44:47 np0005588920 ovn_metadata_agent[144123]: global
Jan 20 10:44:47 np0005588920 ovn_metadata_agent[144123]:    log         /dev/log local0 debug
Jan 20 10:44:47 np0005588920 ovn_metadata_agent[144123]:    log-tag     haproxy-metadata-proxy-6567de92-725d-4dcc-97c2-0fec6d9bda84
Jan 20 10:44:47 np0005588920 ovn_metadata_agent[144123]:    user        root
Jan 20 10:44:47 np0005588920 ovn_metadata_agent[144123]:    group       root
Jan 20 10:44:47 np0005588920 ovn_metadata_agent[144123]:    maxconn     1024
Jan 20 10:44:47 np0005588920 ovn_metadata_agent[144123]:    pidfile     /var/lib/neutron/external/pids/6567de92-725d-4dcc-97c2-0fec6d9bda84.pid.haproxy
Jan 20 10:44:47 np0005588920 ovn_metadata_agent[144123]:    daemon
Jan 20 10:44:47 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:44:47 np0005588920 ovn_metadata_agent[144123]: defaults
Jan 20 10:44:47 np0005588920 ovn_metadata_agent[144123]:    log global
Jan 20 10:44:47 np0005588920 ovn_metadata_agent[144123]:    mode http
Jan 20 10:44:47 np0005588920 ovn_metadata_agent[144123]:    option httplog
Jan 20 10:44:47 np0005588920 ovn_metadata_agent[144123]:    option dontlognull
Jan 20 10:44:47 np0005588920 ovn_metadata_agent[144123]:    option http-server-close
Jan 20 10:44:47 np0005588920 ovn_metadata_agent[144123]:    option forwardfor
Jan 20 10:44:47 np0005588920 ovn_metadata_agent[144123]:    retries                 3
Jan 20 10:44:47 np0005588920 ovn_metadata_agent[144123]:    timeout http-request    30s
Jan 20 10:44:47 np0005588920 ovn_metadata_agent[144123]:    timeout connect         30s
Jan 20 10:44:47 np0005588920 ovn_metadata_agent[144123]:    timeout client          32s
Jan 20 10:44:47 np0005588920 ovn_metadata_agent[144123]:    timeout server          32s
Jan 20 10:44:47 np0005588920 ovn_metadata_agent[144123]:    timeout http-keep-alive 30s
Jan 20 10:44:47 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:44:47 np0005588920 ovn_metadata_agent[144123]: 
Jan 20 10:44:47 np0005588920 ovn_metadata_agent[144123]: listen listener
Jan 20 10:44:47 np0005588920 ovn_metadata_agent[144123]:    bind 169.254.169.254:80
Jan 20 10:44:47 np0005588920 ovn_metadata_agent[144123]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:44:47 np0005588920 ovn_metadata_agent[144123]:    http-request add-header X-OVN-Network-ID 6567de92-725d-4dcc-97c2-0fec6d9bda84
Jan 20 10:44:47 np0005588920 ovn_metadata_agent[144123]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:44:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:44:47.253 144128 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6567de92-725d-4dcc-97c2-0fec6d9bda84', 'env', 'PROCESS_TAG=haproxy-6567de92-725d-4dcc-97c2-0fec6d9bda84', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6567de92-725d-4dcc-97c2-0fec6d9bda84.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:44:47 np0005588920 nova_compute[226886]: 2026-01-20 15:44:47.320 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768923887.3197222, ef848e46-b111-4794-8f38-0d8226550fc3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:44:47 np0005588920 nova_compute[226886]: 2026-01-20 15:44:47.320 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] VM Started (Lifecycle Event)#033[00m
Jan 20 10:44:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:47.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:47.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:47 np0005588920 podman[314203]: 2026-01-20 15:44:47.615927336 +0000 UTC m=+0.047005972 container create ec4bd4873492864080959b4f3ff59a0f712b89155989101b87c499f852f6181e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6567de92-725d-4dcc-97c2-0fec6d9bda84, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 20 10:44:47 np0005588920 systemd[1]: Started libpod-conmon-ec4bd4873492864080959b4f3ff59a0f712b89155989101b87c499f852f6181e.scope.
Jan 20 10:44:47 np0005588920 systemd[1]: Started libcrun container.
Jan 20 10:44:47 np0005588920 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a4f9f97adc10a61e1f369d4b32d1cf3311c2714f59fe66f871e29064931d7527/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:44:47 np0005588920 podman[314203]: 2026-01-20 15:44:47.592683103 +0000 UTC m=+0.023761769 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:44:47 np0005588920 podman[314203]: 2026-01-20 15:44:47.695072283 +0000 UTC m=+0.126150939 container init ec4bd4873492864080959b4f3ff59a0f712b89155989101b87c499f852f6181e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6567de92-725d-4dcc-97c2-0fec6d9bda84, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 20 10:44:47 np0005588920 podman[314203]: 2026-01-20 15:44:47.699859 +0000 UTC m=+0.130937646 container start ec4bd4873492864080959b4f3ff59a0f712b89155989101b87c499f852f6181e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6567de92-725d-4dcc-97c2-0fec6d9bda84, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 20 10:44:47 np0005588920 neutron-haproxy-ovnmeta-6567de92-725d-4dcc-97c2-0fec6d9bda84[314219]: [NOTICE]   (314223) : New worker (314225) forked
Jan 20 10:44:47 np0005588920 neutron-haproxy-ovnmeta-6567de92-725d-4dcc-97c2-0fec6d9bda84[314219]: [NOTICE]   (314223) : Loading success.
Jan 20 10:44:49 np0005588920 podman[314234]: 2026-01-20 15:44:49.059107399 +0000 UTC m=+0.135640210 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:44:49 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:44:49 np0005588920 nova_compute[226886]: 2026-01-20 15:44:49.227 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:44:49 np0005588920 nova_compute[226886]: 2026-01-20 15:44:49.235 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768923887.320808, ef848e46-b111-4794-8f38-0d8226550fc3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:44:49 np0005588920 nova_compute[226886]: 2026-01-20 15:44:49.235 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:44:49 np0005588920 nova_compute[226886]: 2026-01-20 15:44:49.275 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:44:49 np0005588920 nova_compute[226886]: 2026-01-20 15:44:49.280 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:44:49 np0005588920 nova_compute[226886]: 2026-01-20 15:44:49.321 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:44:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:49.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:49.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:49 np0005588920 nova_compute[226886]: 2026-01-20 15:44:49.781 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:49 np0005588920 nova_compute[226886]: 2026-01-20 15:44:49.879 226890 DEBUG nova.compute.manager [req-5035556f-3703-44b3-8e87-0067551dabc4 req-8cfedc67-0aa4-46a1-97ff-db3dc33af8ad 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Received event network-vif-plugged-f0cf81b3-1627-4877-a0ef-eebb8c5a8d98 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:44:49 np0005588920 nova_compute[226886]: 2026-01-20 15:44:49.880 226890 DEBUG oslo_concurrency.lockutils [req-5035556f-3703-44b3-8e87-0067551dabc4 req-8cfedc67-0aa4-46a1-97ff-db3dc33af8ad 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "ef848e46-b111-4794-8f38-0d8226550fc3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:44:49 np0005588920 nova_compute[226886]: 2026-01-20 15:44:49.880 226890 DEBUG oslo_concurrency.lockutils [req-5035556f-3703-44b3-8e87-0067551dabc4 req-8cfedc67-0aa4-46a1-97ff-db3dc33af8ad 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "ef848e46-b111-4794-8f38-0d8226550fc3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:44:49 np0005588920 nova_compute[226886]: 2026-01-20 15:44:49.880 226890 DEBUG oslo_concurrency.lockutils [req-5035556f-3703-44b3-8e87-0067551dabc4 req-8cfedc67-0aa4-46a1-97ff-db3dc33af8ad 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "ef848e46-b111-4794-8f38-0d8226550fc3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:44:49 np0005588920 nova_compute[226886]: 2026-01-20 15:44:49.880 226890 DEBUG nova.compute.manager [req-5035556f-3703-44b3-8e87-0067551dabc4 req-8cfedc67-0aa4-46a1-97ff-db3dc33af8ad 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Processing event network-vif-plugged-f0cf81b3-1627-4877-a0ef-eebb8c5a8d98 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 10:44:49 np0005588920 nova_compute[226886]: 2026-01-20 15:44:49.881 226890 DEBUG nova.compute.manager [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:44:49 np0005588920 nova_compute[226886]: 2026-01-20 15:44:49.884 226890 DEBUG nova.virt.driver [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] Emitting event <LifecycleEvent: 1768923889.8842938, ef848e46-b111-4794-8f38-0d8226550fc3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:44:49 np0005588920 nova_compute[226886]: 2026-01-20 15:44:49.885 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:44:49 np0005588920 nova_compute[226886]: 2026-01-20 15:44:49.888 226890 DEBUG nova.virt.libvirt.driver [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:44:49 np0005588920 nova_compute[226886]: 2026-01-20 15:44:49.892 226890 INFO nova.virt.libvirt.driver [-] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Instance spawned successfully.#033[00m
Jan 20 10:44:49 np0005588920 nova_compute[226886]: 2026-01-20 15:44:49.893 226890 DEBUG nova.virt.libvirt.driver [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 10:44:50 np0005588920 nova_compute[226886]: 2026-01-20 15:44:50.379 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:50 np0005588920 nova_compute[226886]: 2026-01-20 15:44:50.797 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:44:50 np0005588920 nova_compute[226886]: 2026-01-20 15:44:50.804 226890 DEBUG nova.virt.libvirt.driver [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:44:50 np0005588920 nova_compute[226886]: 2026-01-20 15:44:50.804 226890 DEBUG nova.virt.libvirt.driver [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:44:50 np0005588920 nova_compute[226886]: 2026-01-20 15:44:50.805 226890 DEBUG nova.virt.libvirt.driver [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:44:50 np0005588920 nova_compute[226886]: 2026-01-20 15:44:50.806 226890 DEBUG nova.virt.libvirt.driver [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:44:50 np0005588920 nova_compute[226886]: 2026-01-20 15:44:50.807 226890 DEBUG nova.virt.libvirt.driver [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:44:50 np0005588920 nova_compute[226886]: 2026-01-20 15:44:50.808 226890 DEBUG nova.virt.libvirt.driver [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:44:50 np0005588920 nova_compute[226886]: 2026-01-20 15:44:50.814 226890 DEBUG nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:44:50 np0005588920 nova_compute[226886]: 2026-01-20 15:44:50.985 226890 INFO nova.compute.manager [None req-5f5ca392-53d3-458e-b35e-ca8caf59cc0b - - - - - -] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:44:51 np0005588920 nova_compute[226886]: 2026-01-20 15:44:51.449 226890 INFO nova.compute.manager [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Took 13.99 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 10:44:51 np0005588920 nova_compute[226886]: 2026-01-20 15:44:51.450 226890 DEBUG nova.compute.manager [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:44:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:51.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:51.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:51 np0005588920 nova_compute[226886]: 2026-01-20 15:44:51.971 226890 DEBUG nova.compute.manager [req-2a353d0b-726b-4ceb-898b-a278a4f4e51a req-87fc3cd9-fe8d-40d7-9d93-b7104a021c4f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Received event network-vif-plugged-f0cf81b3-1627-4877-a0ef-eebb8c5a8d98 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:44:51 np0005588920 nova_compute[226886]: 2026-01-20 15:44:51.971 226890 DEBUG oslo_concurrency.lockutils [req-2a353d0b-726b-4ceb-898b-a278a4f4e51a req-87fc3cd9-fe8d-40d7-9d93-b7104a021c4f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "ef848e46-b111-4794-8f38-0d8226550fc3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:44:51 np0005588920 nova_compute[226886]: 2026-01-20 15:44:51.971 226890 DEBUG oslo_concurrency.lockutils [req-2a353d0b-726b-4ceb-898b-a278a4f4e51a req-87fc3cd9-fe8d-40d7-9d93-b7104a021c4f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "ef848e46-b111-4794-8f38-0d8226550fc3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:44:51 np0005588920 nova_compute[226886]: 2026-01-20 15:44:51.971 226890 DEBUG oslo_concurrency.lockutils [req-2a353d0b-726b-4ceb-898b-a278a4f4e51a req-87fc3cd9-fe8d-40d7-9d93-b7104a021c4f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "ef848e46-b111-4794-8f38-0d8226550fc3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:44:51 np0005588920 nova_compute[226886]: 2026-01-20 15:44:51.972 226890 DEBUG nova.compute.manager [req-2a353d0b-726b-4ceb-898b-a278a4f4e51a req-87fc3cd9-fe8d-40d7-9d93-b7104a021c4f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] No waiting events found dispatching network-vif-plugged-f0cf81b3-1627-4877-a0ef-eebb8c5a8d98 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:44:51 np0005588920 nova_compute[226886]: 2026-01-20 15:44:51.972 226890 WARNING nova.compute.manager [req-2a353d0b-726b-4ceb-898b-a278a4f4e51a req-87fc3cd9-fe8d-40d7-9d93-b7104a021c4f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Received unexpected event network-vif-plugged-f0cf81b3-1627-4877-a0ef-eebb8c5a8d98 for instance with vm_state active and task_state None.#033[00m
Jan 20 10:44:51 np0005588920 nova_compute[226886]: 2026-01-20 15:44:51.973 226890 INFO nova.compute.manager [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Took 15.40 seconds to build instance.#033[00m
Jan 20 10:44:51 np0005588920 nova_compute[226886]: 2026-01-20 15:44:51.994 226890 DEBUG oslo_concurrency.lockutils [None req-31096d12-7515-4430-b09e-3730d1471db0 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "ef848e46-b111-4794-8f38-0d8226550fc3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.493s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:44:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:53.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:53.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:53 np0005588920 nova_compute[226886]: 2026-01-20 15:44:53.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:44:54 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:44:54 np0005588920 nova_compute[226886]: 2026-01-20 15:44:54.782 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:55 np0005588920 nova_compute[226886]: 2026-01-20 15:44:55.381 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.002000057s ======
Jan 20 10:44:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:55.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Jan 20 10:44:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:55.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:44:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:57.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:44:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:57.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:58 np0005588920 podman[314259]: 2026-01-20 15:44:58.965034775 +0000 UTC m=+0.051503200 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:44:59 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:44:59 np0005588920 nova_compute[226886]: 2026-01-20 15:44:59.550 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:59 np0005588920 nova_compute[226886]: 2026-01-20 15:44:59.554 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:59 np0005588920 ovn_controller[133971]: 2026-01-20T15:44:59Z|00996|binding|INFO|Releasing lport 49b3575c-9e2a-4ac6-bd85-e7d639dfd6e3 from this chassis (sb_readonly=0)
Jan 20 10:44:59 np0005588920 NetworkManager[49076]: <info>  [1768923899.5565] manager: (patch-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/472)
Jan 20 10:44:59 np0005588920 NetworkManager[49076]: <info>  [1768923899.5576] manager: (patch-br-int-to-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/473)
Jan 20 10:44:59 np0005588920 ovn_controller[133971]: 2026-01-20T15:44:59Z|00997|binding|INFO|Releasing lport 49b3575c-9e2a-4ac6-bd85-e7d639dfd6e3 from this chassis (sb_readonly=0)
Jan 20 10:44:59 np0005588920 nova_compute[226886]: 2026-01-20 15:44:59.558 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:59.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:44:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:59.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:59 np0005588920 nova_compute[226886]: 2026-01-20 15:44:59.784 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:00 np0005588920 nova_compute[226886]: 2026-01-20 15:45:00.383 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:01 np0005588920 nova_compute[226886]: 2026-01-20 15:45:01.398 226890 DEBUG nova.compute.manager [req-150c1013-3226-4cd0-92ab-72cb1388af18 req-33dfdabb-82c6-4206-9c5e-adac6df417b1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Received event network-changed-f0cf81b3-1627-4877-a0ef-eebb8c5a8d98 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:45:01 np0005588920 nova_compute[226886]: 2026-01-20 15:45:01.398 226890 DEBUG nova.compute.manager [req-150c1013-3226-4cd0-92ab-72cb1388af18 req-33dfdabb-82c6-4206-9c5e-adac6df417b1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Refreshing instance network info cache due to event network-changed-f0cf81b3-1627-4877-a0ef-eebb8c5a8d98. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:45:01 np0005588920 nova_compute[226886]: 2026-01-20 15:45:01.398 226890 DEBUG oslo_concurrency.lockutils [req-150c1013-3226-4cd0-92ab-72cb1388af18 req-33dfdabb-82c6-4206-9c5e-adac6df417b1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-ef848e46-b111-4794-8f38-0d8226550fc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:45:01 np0005588920 nova_compute[226886]: 2026-01-20 15:45:01.399 226890 DEBUG oslo_concurrency.lockutils [req-150c1013-3226-4cd0-92ab-72cb1388af18 req-33dfdabb-82c6-4206-9c5e-adac6df417b1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-ef848e46-b111-4794-8f38-0d8226550fc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:45:01 np0005588920 nova_compute[226886]: 2026-01-20 15:45:01.399 226890 DEBUG nova.network.neutron [req-150c1013-3226-4cd0-92ab-72cb1388af18 req-33dfdabb-82c6-4206-9c5e-adac6df417b1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Refreshing network info cache for port f0cf81b3-1627-4877-a0ef-eebb8c5a8d98 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:45:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:01.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:45:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:01.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:45:02 np0005588920 ovn_controller[133971]: 2026-01-20T15:45:02Z|00141|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:85:f3:9a 10.100.0.4
Jan 20 10:45:02 np0005588920 ovn_controller[133971]: 2026-01-20T15:45:02Z|00142|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:85:f3:9a 10.100.0.4
Jan 20 10:45:03 np0005588920 nova_compute[226886]: 2026-01-20 15:45:03.530 226890 DEBUG nova.network.neutron [req-150c1013-3226-4cd0-92ab-72cb1388af18 req-33dfdabb-82c6-4206-9c5e-adac6df417b1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Updated VIF entry in instance network info cache for port f0cf81b3-1627-4877-a0ef-eebb8c5a8d98. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:45:03 np0005588920 nova_compute[226886]: 2026-01-20 15:45:03.531 226890 DEBUG nova.network.neutron [req-150c1013-3226-4cd0-92ab-72cb1388af18 req-33dfdabb-82c6-4206-9c5e-adac6df417b1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Updating instance_info_cache with network_info: [{"id": "f0cf81b3-1627-4877-a0ef-eebb8c5a8d98", "address": "fa:16:3e:85:f3:9a", "network": {"id": "6567de92-725d-4dcc-97c2-0fec6d9bda84", "bridge": "br-int", "label": "tempest-network-smoke--623656421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "728662ec7f654a3fb2e53a90b8707d7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0cf81b3-16", "ovs_interfaceid": "f0cf81b3-1627-4877-a0ef-eebb8c5a8d98", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:45:03 np0005588920 nova_compute[226886]: 2026-01-20 15:45:03.551 226890 DEBUG oslo_concurrency.lockutils [req-150c1013-3226-4cd0-92ab-72cb1388af18 req-33dfdabb-82c6-4206-9c5e-adac6df417b1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-ef848e46-b111-4794-8f38-0d8226550fc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:45:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:45:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:03.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:45:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:03.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:03 np0005588920 nova_compute[226886]: 2026-01-20 15:45:03.739 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:45:03 np0005588920 nova_compute[226886]: 2026-01-20 15:45:03.740 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 20 10:45:04 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:45:04 np0005588920 nova_compute[226886]: 2026-01-20 15:45:04.787 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:05 np0005588920 nova_compute[226886]: 2026-01-20 15:45:05.385 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:05.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:05.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:06 np0005588920 nova_compute[226886]: 2026-01-20 15:45:06.737 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:45:06 np0005588920 nova_compute[226886]: 2026-01-20 15:45:06.738 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 20 10:45:06 np0005588920 nova_compute[226886]: 2026-01-20 15:45:06.775 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 20 10:45:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:45:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:07.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:45:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:07.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:45:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:09.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:45:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:09.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:45:09 np0005588920 nova_compute[226886]: 2026-01-20 15:45:09.789 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:10 np0005588920 nova_compute[226886]: 2026-01-20 15:45:10.387 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:11.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:11.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:12 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:45:12 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:45:12 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:45:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:13.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:13.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:45:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1310749465' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:45:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:45:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1310749465' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:45:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:45:14 np0005588920 nova_compute[226886]: 2026-01-20 15:45:14.791 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:15 np0005588920 nova_compute[226886]: 2026-01-20 15:45:15.390 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:45:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:15.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:45:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:15.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:45:16.509 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:45:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:45:16.510 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:45:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:45:16.511 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:45:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:17.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:17.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:18 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:45:18 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:45:18 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #181. Immutable memtables: 0.
Jan 20 10:45:18 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:45:18.676603) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:45:18 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 115] Flushing memtable with next log file: 181
Jan 20 10:45:18 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923918676666, "job": 115, "event": "flush_started", "num_memtables": 1, "num_entries": 2361, "num_deletes": 251, "total_data_size": 5876771, "memory_usage": 5943520, "flush_reason": "Manual Compaction"}
Jan 20 10:45:18 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 115] Level-0 flush table #182: started
Jan 20 10:45:18 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923918705443, "cf_name": "default", "job": 115, "event": "table_file_creation", "file_number": 182, "file_size": 3835390, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 87234, "largest_seqno": 89590, "table_properties": {"data_size": 3825792, "index_size": 6091, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 19644, "raw_average_key_size": 20, "raw_value_size": 3806783, "raw_average_value_size": 3953, "num_data_blocks": 265, "num_entries": 963, "num_filter_entries": 963, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768923702, "oldest_key_time": 1768923702, "file_creation_time": 1768923918, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 182, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:45:18 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 115] Flush lasted 28899 microseconds, and 9497 cpu microseconds.
Jan 20 10:45:18 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:45:18 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:45:18.705496) [db/flush_job.cc:967] [default] [JOB 115] Level-0 flush table #182: 3835390 bytes OK
Jan 20 10:45:18 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:45:18.705519) [db/memtable_list.cc:519] [default] Level-0 commit table #182 started
Jan 20 10:45:18 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:45:18.707875) [db/memtable_list.cc:722] [default] Level-0 commit table #182: memtable #1 done
Jan 20 10:45:18 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:45:18.707895) EVENT_LOG_v1 {"time_micros": 1768923918707889, "job": 115, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:45:18 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:45:18.707913) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:45:18 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 115] Try to delete WAL files size 5866401, prev total WAL file size 5866401, number of live WAL files 2.
Jan 20 10:45:18 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000178.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:45:18 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:45:18.709206) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730037373831' seq:72057594037927935, type:22 .. '7061786F730038303333' seq:0, type:0; will stop at (end)
Jan 20 10:45:18 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 116] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:45:18 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 115 Base level 0, inputs: [182(3745KB)], [180(12MB)]
Jan 20 10:45:18 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923918709271, "job": 116, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [182], "files_L6": [180], "score": -1, "input_data_size": 16460021, "oldest_snapshot_seqno": -1}
Jan 20 10:45:18 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 116] Generated table #183: 11033 keys, 14481962 bytes, temperature: kUnknown
Jan 20 10:45:18 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923918827951, "cf_name": "default", "job": 116, "event": "table_file_creation", "file_number": 183, "file_size": 14481962, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14410693, "index_size": 42612, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27589, "raw_key_size": 290284, "raw_average_key_size": 26, "raw_value_size": 14217676, "raw_average_value_size": 1288, "num_data_blocks": 1625, "num_entries": 11033, "num_filter_entries": 11033, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768923918, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 183, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:45:18 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:45:18 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:45:18.828239) [db/compaction/compaction_job.cc:1663] [default] [JOB 116] Compacted 1@0 + 1@6 files to L6 => 14481962 bytes
Jan 20 10:45:18 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:45:18.829853) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 138.6 rd, 121.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.7, 12.0 +0.0 blob) out(13.8 +0.0 blob), read-write-amplify(8.1) write-amplify(3.8) OK, records in: 11552, records dropped: 519 output_compression: NoCompression
Jan 20 10:45:18 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:45:18.829873) EVENT_LOG_v1 {"time_micros": 1768923918829864, "job": 116, "event": "compaction_finished", "compaction_time_micros": 118780, "compaction_time_cpu_micros": 34026, "output_level": 6, "num_output_files": 1, "total_output_size": 14481962, "num_input_records": 11552, "num_output_records": 11033, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:45:18 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000182.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:45:18 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923918830643, "job": 116, "event": "table_file_deletion", "file_number": 182}
Jan 20 10:45:18 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000180.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:45:18 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923918833088, "job": 116, "event": "table_file_deletion", "file_number": 180}
Jan 20 10:45:18 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:45:18.709072) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:45:18 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:45:18.833162) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:45:18 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:45:18.833168) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:45:18 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:45:18.833170) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:45:18 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:45:18.833172) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:45:18 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:45:18.833174) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:45:19 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:45:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:19.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:19.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:19 np0005588920 nova_compute[226886]: 2026-01-20 15:45:19.838 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:20 np0005588920 podman[314462]: 2026-01-20 15:45:20.010276304 +0000 UTC m=+0.086354844 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, managed_by=edpm_ansible)
Jan 20 10:45:20 np0005588920 nova_compute[226886]: 2026-01-20 15:45:20.391 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:45:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:21.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:45:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:21.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:22 np0005588920 nova_compute[226886]: 2026-01-20 15:45:22.506 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:45:22.507 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=90, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=89) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:45:22 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:45:22.507 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:45:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:45:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:23.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:45:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:45:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:23.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:45:23 np0005588920 nova_compute[226886]: 2026-01-20 15:45:23.763 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:45:23 np0005588920 nova_compute[226886]: 2026-01-20 15:45:23.873 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:45:23 np0005588920 nova_compute[226886]: 2026-01-20 15:45:23.873 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:45:23 np0005588920 nova_compute[226886]: 2026-01-20 15:45:23.874 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:45:23 np0005588920 nova_compute[226886]: 2026-01-20 15:45:23.874 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:45:23 np0005588920 nova_compute[226886]: 2026-01-20 15:45:23.874 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:45:24 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:45:24 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:45:24 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3839987835' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:45:24 np0005588920 nova_compute[226886]: 2026-01-20 15:45:24.298 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:45:24 np0005588920 nova_compute[226886]: 2026-01-20 15:45:24.376 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-000000dd as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:45:24 np0005588920 nova_compute[226886]: 2026-01-20 15:45:24.377 226890 DEBUG nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] skipping disk for instance-000000dd as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:45:24 np0005588920 nova_compute[226886]: 2026-01-20 15:45:24.525 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:45:24 np0005588920 nova_compute[226886]: 2026-01-20 15:45:24.526 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3932MB free_disk=20.921974182128906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:45:24 np0005588920 nova_compute[226886]: 2026-01-20 15:45:24.526 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:45:24 np0005588920 nova_compute[226886]: 2026-01-20 15:45:24.526 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:45:24 np0005588920 nova_compute[226886]: 2026-01-20 15:45:24.616 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Instance ef848e46-b111-4794-8f38-0d8226550fc3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:45:24 np0005588920 nova_compute[226886]: 2026-01-20 15:45:24.617 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:45:24 np0005588920 nova_compute[226886]: 2026-01-20 15:45:24.617 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:45:24 np0005588920 nova_compute[226886]: 2026-01-20 15:45:24.655 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:45:24 np0005588920 nova_compute[226886]: 2026-01-20 15:45:24.901 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:25 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:45:25 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1092852698' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:45:25 np0005588920 nova_compute[226886]: 2026-01-20 15:45:25.135 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:45:25 np0005588920 nova_compute[226886]: 2026-01-20 15:45:25.141 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:45:25 np0005588920 nova_compute[226886]: 2026-01-20 15:45:25.177 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:45:25 np0005588920 nova_compute[226886]: 2026-01-20 15:45:25.207 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:45:25 np0005588920 nova_compute[226886]: 2026-01-20 15:45:25.208 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.682s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:45:25 np0005588920 nova_compute[226886]: 2026-01-20 15:45:25.392 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:25.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:25.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:26 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:45:26.509 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '90'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:45:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:45:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:27.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:45:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:45:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:27.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:45:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:45:29 np0005588920 nova_compute[226886]: 2026-01-20 15:45:29.169 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:45:29 np0005588920 nova_compute[226886]: 2026-01-20 15:45:29.170 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:45:29 np0005588920 nova_compute[226886]: 2026-01-20 15:45:29.170 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:45:29 np0005588920 nova_compute[226886]: 2026-01-20 15:45:29.442 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "refresh_cache-ef848e46-b111-4794-8f38-0d8226550fc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:45:29 np0005588920 nova_compute[226886]: 2026-01-20 15:45:29.443 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquired lock "refresh_cache-ef848e46-b111-4794-8f38-0d8226550fc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:45:29 np0005588920 nova_compute[226886]: 2026-01-20 15:45:29.443 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 10:45:29 np0005588920 nova_compute[226886]: 2026-01-20 15:45:29.444 226890 DEBUG nova.objects.instance [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid ef848e46-b111-4794-8f38-0d8226550fc3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:45:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:45:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:29.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:45:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:29.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:29 np0005588920 nova_compute[226886]: 2026-01-20 15:45:29.902 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:29 np0005588920 podman[314535]: 2026-01-20 15:45:29.975975015 +0000 UTC m=+0.055196405 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 20 10:45:30 np0005588920 nova_compute[226886]: 2026-01-20 15:45:30.394 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:31.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:31.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:31 np0005588920 nova_compute[226886]: 2026-01-20 15:45:31.819 226890 DEBUG nova.network.neutron [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Updating instance_info_cache with network_info: [{"id": "f0cf81b3-1627-4877-a0ef-eebb8c5a8d98", "address": "fa:16:3e:85:f3:9a", "network": {"id": "6567de92-725d-4dcc-97c2-0fec6d9bda84", "bridge": "br-int", "label": "tempest-network-smoke--623656421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "728662ec7f654a3fb2e53a90b8707d7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0cf81b3-16", "ovs_interfaceid": "f0cf81b3-1627-4877-a0ef-eebb8c5a8d98", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:45:31 np0005588920 nova_compute[226886]: 2026-01-20 15:45:31.840 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Releasing lock "refresh_cache-ef848e46-b111-4794-8f38-0d8226550fc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:45:31 np0005588920 nova_compute[226886]: 2026-01-20 15:45:31.840 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 10:45:31 np0005588920 nova_compute[226886]: 2026-01-20 15:45:31.841 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:45:32 np0005588920 nova_compute[226886]: 2026-01-20 15:45:32.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:45:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:33.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:33.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:34 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:45:34 np0005588920 nova_compute[226886]: 2026-01-20 15:45:34.721 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:45:34 np0005588920 nova_compute[226886]: 2026-01-20 15:45:34.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:45:34 np0005588920 nova_compute[226886]: 2026-01-20 15:45:34.942 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:35 np0005588920 nova_compute[226886]: 2026-01-20 15:45:35.396 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:35.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:35.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:35 np0005588920 nova_compute[226886]: 2026-01-20 15:45:35.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:45:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:37.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:45:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:37.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:45:39 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:45:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:39.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:39.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:39 np0005588920 nova_compute[226886]: 2026-01-20 15:45:39.720 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:45:39 np0005588920 nova_compute[226886]: 2026-01-20 15:45:39.949 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:40 np0005588920 nova_compute[226886]: 2026-01-20 15:45:40.398 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:41.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:45:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:41.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:45:41 np0005588920 nova_compute[226886]: 2026-01-20 15:45:41.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:45:41 np0005588920 nova_compute[226886]: 2026-01-20 15:45:41.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:45:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:43.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:45:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:43.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:45:43 np0005588920 nova_compute[226886]: 2026-01-20 15:45:43.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:45:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:45:44 np0005588920 nova_compute[226886]: 2026-01-20 15:45:44.953 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:45 np0005588920 nova_compute[226886]: 2026-01-20 15:45:45.401 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:45:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:45.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:45:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:45:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:45.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:45:47 np0005588920 nova_compute[226886]: 2026-01-20 15:45:47.312 226890 DEBUG nova.compute.manager [req-85a72e41-4a75-49c1-8e62-922ce9603745 req-97e9fd7a-fde6-4323-8968-9184d2b98421 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Received event network-changed-f0cf81b3-1627-4877-a0ef-eebb8c5a8d98 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:45:47 np0005588920 nova_compute[226886]: 2026-01-20 15:45:47.312 226890 DEBUG nova.compute.manager [req-85a72e41-4a75-49c1-8e62-922ce9603745 req-97e9fd7a-fde6-4323-8968-9184d2b98421 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Refreshing instance network info cache due to event network-changed-f0cf81b3-1627-4877-a0ef-eebb8c5a8d98. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:45:47 np0005588920 nova_compute[226886]: 2026-01-20 15:45:47.313 226890 DEBUG oslo_concurrency.lockutils [req-85a72e41-4a75-49c1-8e62-922ce9603745 req-97e9fd7a-fde6-4323-8968-9184d2b98421 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-ef848e46-b111-4794-8f38-0d8226550fc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:45:47 np0005588920 nova_compute[226886]: 2026-01-20 15:45:47.313 226890 DEBUG oslo_concurrency.lockutils [req-85a72e41-4a75-49c1-8e62-922ce9603745 req-97e9fd7a-fde6-4323-8968-9184d2b98421 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-ef848e46-b111-4794-8f38-0d8226550fc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:45:47 np0005588920 nova_compute[226886]: 2026-01-20 15:45:47.313 226890 DEBUG nova.network.neutron [req-85a72e41-4a75-49c1-8e62-922ce9603745 req-97e9fd7a-fde6-4323-8968-9184d2b98421 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Refreshing network info cache for port f0cf81b3-1627-4877-a0ef-eebb8c5a8d98 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:45:47 np0005588920 nova_compute[226886]: 2026-01-20 15:45:47.437 226890 DEBUG oslo_concurrency.lockutils [None req-fa3e1558-40f4-4b7b-a5e8-d36e179c4f80 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Acquiring lock "ef848e46-b111-4794-8f38-0d8226550fc3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:45:47 np0005588920 nova_compute[226886]: 2026-01-20 15:45:47.438 226890 DEBUG oslo_concurrency.lockutils [None req-fa3e1558-40f4-4b7b-a5e8-d36e179c4f80 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "ef848e46-b111-4794-8f38-0d8226550fc3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:45:47 np0005588920 nova_compute[226886]: 2026-01-20 15:45:47.438 226890 DEBUG oslo_concurrency.lockutils [None req-fa3e1558-40f4-4b7b-a5e8-d36e179c4f80 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Acquiring lock "ef848e46-b111-4794-8f38-0d8226550fc3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:45:47 np0005588920 nova_compute[226886]: 2026-01-20 15:45:47.438 226890 DEBUG oslo_concurrency.lockutils [None req-fa3e1558-40f4-4b7b-a5e8-d36e179c4f80 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "ef848e46-b111-4794-8f38-0d8226550fc3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:45:47 np0005588920 nova_compute[226886]: 2026-01-20 15:45:47.439 226890 DEBUG oslo_concurrency.lockutils [None req-fa3e1558-40f4-4b7b-a5e8-d36e179c4f80 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "ef848e46-b111-4794-8f38-0d8226550fc3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:45:47 np0005588920 nova_compute[226886]: 2026-01-20 15:45:47.440 226890 INFO nova.compute.manager [None req-fa3e1558-40f4-4b7b-a5e8-d36e179c4f80 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Terminating instance#033[00m
Jan 20 10:45:47 np0005588920 nova_compute[226886]: 2026-01-20 15:45:47.440 226890 DEBUG nova.compute.manager [None req-fa3e1558-40f4-4b7b-a5e8-d36e179c4f80 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:45:47 np0005588920 kernel: tapf0cf81b3-16 (unregistering): left promiscuous mode
Jan 20 10:45:47 np0005588920 NetworkManager[49076]: <info>  [1768923947.5030] device (tapf0cf81b3-16): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:45:47 np0005588920 nova_compute[226886]: 2026-01-20 15:45:47.521 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:47 np0005588920 ovn_controller[133971]: 2026-01-20T15:45:47Z|00998|binding|INFO|Releasing lport f0cf81b3-1627-4877-a0ef-eebb8c5a8d98 from this chassis (sb_readonly=0)
Jan 20 10:45:47 np0005588920 ovn_controller[133971]: 2026-01-20T15:45:47Z|00999|binding|INFO|Setting lport f0cf81b3-1627-4877-a0ef-eebb8c5a8d98 down in Southbound
Jan 20 10:45:47 np0005588920 ovn_controller[133971]: 2026-01-20T15:45:47Z|01000|binding|INFO|Removing iface tapf0cf81b3-16 ovn-installed in OVS
Jan 20 10:45:47 np0005588920 nova_compute[226886]: 2026-01-20 15:45:47.523 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:45:47.529 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:85:f3:9a 10.100.0.4'], port_security=['fa:16:3e:85:f3:9a 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'ef848e46-b111-4794-8f38-0d8226550fc3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6567de92-725d-4dcc-97c2-0fec6d9bda84', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '728662ec7f654a3fb2e53a90b8707d7e', 'neutron:revision_number': '4', 'neutron:security_group_ids': '133c0593-3211-4540-bb4e-2efa6f05d67f 204734f0-d45b-4b06-9def-83db7a4e110a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=92670689-434c-4ed8-a2e4-6278a7d19616, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=f0cf81b3-1627-4877-a0ef-eebb8c5a8d98) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:45:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:45:47.530 144128 INFO neutron.agent.ovn.metadata.agent [-] Port f0cf81b3-1627-4877-a0ef-eebb8c5a8d98 in datapath 6567de92-725d-4dcc-97c2-0fec6d9bda84 unbound from our chassis#033[00m
Jan 20 10:45:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:45:47.531 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6567de92-725d-4dcc-97c2-0fec6d9bda84, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:45:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:45:47.532 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[4a4938c4-bd66-4865-ba1e-2529dd2eed50]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:45:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:45:47.533 144128 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6567de92-725d-4dcc-97c2-0fec6d9bda84 namespace which is not needed anymore#033[00m
Jan 20 10:45:47 np0005588920 nova_compute[226886]: 2026-01-20 15:45:47.563 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:47 np0005588920 systemd[1]: machine-qemu\x2d102\x2dinstance\x2d000000dd.scope: Deactivated successfully.
Jan 20 10:45:47 np0005588920 systemd[1]: machine-qemu\x2d102\x2dinstance\x2d000000dd.scope: Consumed 15.024s CPU time.
Jan 20 10:45:47 np0005588920 systemd-machined[196121]: Machine qemu-102-instance-000000dd terminated.
Jan 20 10:45:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:45:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:47.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:45:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:47.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:47 np0005588920 kernel: tapf0cf81b3-16: entered promiscuous mode
Jan 20 10:45:47 np0005588920 systemd-udevd[314559]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:45:47 np0005588920 NetworkManager[49076]: <info>  [1768923947.6704] manager: (tapf0cf81b3-16): new Tun device (/org/freedesktop/NetworkManager/Devices/474)
Jan 20 10:45:47 np0005588920 kernel: tapf0cf81b3-16 (unregistering): left promiscuous mode
Jan 20 10:45:47 np0005588920 nova_compute[226886]: 2026-01-20 15:45:47.673 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:47 np0005588920 ovn_controller[133971]: 2026-01-20T15:45:47Z|01001|binding|INFO|Claiming lport f0cf81b3-1627-4877-a0ef-eebb8c5a8d98 for this chassis.
Jan 20 10:45:47 np0005588920 ovn_controller[133971]: 2026-01-20T15:45:47Z|01002|binding|INFO|f0cf81b3-1627-4877-a0ef-eebb8c5a8d98: Claiming fa:16:3e:85:f3:9a 10.100.0.4
Jan 20 10:45:47 np0005588920 neutron-haproxy-ovnmeta-6567de92-725d-4dcc-97c2-0fec6d9bda84[314219]: [NOTICE]   (314223) : haproxy version is 2.8.14-c23fe91
Jan 20 10:45:47 np0005588920 neutron-haproxy-ovnmeta-6567de92-725d-4dcc-97c2-0fec6d9bda84[314219]: [NOTICE]   (314223) : path to executable is /usr/sbin/haproxy
Jan 20 10:45:47 np0005588920 neutron-haproxy-ovnmeta-6567de92-725d-4dcc-97c2-0fec6d9bda84[314219]: [WARNING]  (314223) : Exiting Master process...
Jan 20 10:45:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:45:47.681 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:85:f3:9a 10.100.0.4'], port_security=['fa:16:3e:85:f3:9a 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'ef848e46-b111-4794-8f38-0d8226550fc3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6567de92-725d-4dcc-97c2-0fec6d9bda84', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '728662ec7f654a3fb2e53a90b8707d7e', 'neutron:revision_number': '4', 'neutron:security_group_ids': '133c0593-3211-4540-bb4e-2efa6f05d67f 204734f0-d45b-4b06-9def-83db7a4e110a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=92670689-434c-4ed8-a2e4-6278a7d19616, chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=f0cf81b3-1627-4877-a0ef-eebb8c5a8d98) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:45:47 np0005588920 neutron-haproxy-ovnmeta-6567de92-725d-4dcc-97c2-0fec6d9bda84[314219]: [ALERT]    (314223) : Current worker (314225) exited with code 143 (Terminated)
Jan 20 10:45:47 np0005588920 neutron-haproxy-ovnmeta-6567de92-725d-4dcc-97c2-0fec6d9bda84[314219]: [WARNING]  (314223) : All workers exited. Exiting... (0)
Jan 20 10:45:47 np0005588920 systemd[1]: libpod-ec4bd4873492864080959b4f3ff59a0f712b89155989101b87c499f852f6181e.scope: Deactivated successfully.
Jan 20 10:45:47 np0005588920 conmon[314219]: conmon ec4bd487349286408095 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-ec4bd4873492864080959b4f3ff59a0f712b89155989101b87c499f852f6181e.scope/container/memory.events
Jan 20 10:45:47 np0005588920 nova_compute[226886]: 2026-01-20 15:45:47.695 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:47 np0005588920 ovn_controller[133971]: 2026-01-20T15:45:47Z|01003|binding|INFO|Setting lport f0cf81b3-1627-4877-a0ef-eebb8c5a8d98 ovn-installed in OVS
Jan 20 10:45:47 np0005588920 ovn_controller[133971]: 2026-01-20T15:45:47Z|01004|binding|INFO|Setting lport f0cf81b3-1627-4877-a0ef-eebb8c5a8d98 up in Southbound
Jan 20 10:45:47 np0005588920 podman[314580]: 2026-01-20 15:45:47.697823056 +0000 UTC m=+0.063162443 container died ec4bd4873492864080959b4f3ff59a0f712b89155989101b87c499f852f6181e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6567de92-725d-4dcc-97c2-0fec6d9bda84, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 20 10:45:47 np0005588920 ovn_controller[133971]: 2026-01-20T15:45:47Z|01005|binding|INFO|Releasing lport f0cf81b3-1627-4877-a0ef-eebb8c5a8d98 from this chassis (sb_readonly=1)
Jan 20 10:45:47 np0005588920 ovn_controller[133971]: 2026-01-20T15:45:47Z|01006|if_status|INFO|Dropped 2 log messages in last 1707 seconds (most recently, 1707 seconds ago) due to excessive rate
Jan 20 10:45:47 np0005588920 ovn_controller[133971]: 2026-01-20T15:45:47Z|01007|if_status|INFO|Not setting lport f0cf81b3-1627-4877-a0ef-eebb8c5a8d98 down as sb is readonly
Jan 20 10:45:47 np0005588920 nova_compute[226886]: 2026-01-20 15:45:47.701 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:47 np0005588920 ovn_controller[133971]: 2026-01-20T15:45:47Z|01008|binding|INFO|Removing iface tapf0cf81b3-16 ovn-installed in OVS
Jan 20 10:45:47 np0005588920 nova_compute[226886]: 2026-01-20 15:45:47.705 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:47 np0005588920 nova_compute[226886]: 2026-01-20 15:45:47.707 226890 INFO nova.virt.libvirt.driver [-] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Instance destroyed successfully.#033[00m
Jan 20 10:45:47 np0005588920 nova_compute[226886]: 2026-01-20 15:45:47.707 226890 DEBUG nova.objects.instance [None req-fa3e1558-40f4-4b7b-a5e8-d36e179c4f80 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lazy-loading 'resources' on Instance uuid ef848e46-b111-4794-8f38-0d8226550fc3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:45:47 np0005588920 ovn_controller[133971]: 2026-01-20T15:45:47Z|01009|binding|INFO|Releasing lport f0cf81b3-1627-4877-a0ef-eebb8c5a8d98 from this chassis (sb_readonly=0)
Jan 20 10:45:47 np0005588920 ovn_controller[133971]: 2026-01-20T15:45:47Z|01010|binding|INFO|Setting lport f0cf81b3-1627-4877-a0ef-eebb8c5a8d98 down in Southbound
Jan 20 10:45:47 np0005588920 nova_compute[226886]: 2026-01-20 15:45:47.714 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:45:47.719 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:85:f3:9a 10.100.0.4'], port_security=['fa:16:3e:85:f3:9a 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'ef848e46-b111-4794-8f38-0d8226550fc3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6567de92-725d-4dcc-97c2-0fec6d9bda84', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '728662ec7f654a3fb2e53a90b8707d7e', 'neutron:revision_number': '4', 'neutron:security_group_ids': '133c0593-3211-4540-bb4e-2efa6f05d67f 204734f0-d45b-4b06-9def-83db7a4e110a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=92670689-434c-4ed8-a2e4-6278a7d19616, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>], logical_port=f0cf81b3-1627-4877-a0ef-eebb8c5a8d98) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f79259b3880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:45:47 np0005588920 nova_compute[226886]: 2026-01-20 15:45:47.724 226890 DEBUG nova.virt.libvirt.vif [None req-fa3e1558-40f4-4b7b-a5e8-d36e179c4f80 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:44:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-342561427-access_point-2085009324',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-342561427-access_point-2085009324',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-342561427-acc',id=221,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJW0RDNBQ7KP8Mqzxdg2i8X8upMhqABnnonEiTjmMv4W9RTdXxd1b3Z8QL9swZ0e0+6po4+8oM5PFrC0tn+WmJ7twYzqOI2QMeaFZC9+Q35AVwNQsKxl3WWPGvw1iSa1jA==',key_name='tempest-TestSecurityGroupsBasicOps-1661471182',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:44:51Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='728662ec7f654a3fb2e53a90b8707d7e',ramdisk_id='',reservation_id='r-pdzuahhw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-342561427',owner_user_name='tempest-TestSecurityGroupsBasicOps-342561427-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:44:51Z,user_data=None,user_id='5985ef736503499a9f1d734cabc33ce5',uuid=ef848e46-b111-4794-8f38-0d8226550fc3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f0cf81b3-1627-4877-a0ef-eebb8c5a8d98", "address": "fa:16:3e:85:f3:9a", "network": {"id": "6567de92-725d-4dcc-97c2-0fec6d9bda84", "bridge": "br-int", "label": "tempest-network-smoke--623656421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "728662ec7f654a3fb2e53a90b8707d7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0cf81b3-16", "ovs_interfaceid": "f0cf81b3-1627-4877-a0ef-eebb8c5a8d98", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:45:47 np0005588920 nova_compute[226886]: 2026-01-20 15:45:47.725 226890 DEBUG nova.network.os_vif_util [None req-fa3e1558-40f4-4b7b-a5e8-d36e179c4f80 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Converting VIF {"id": "f0cf81b3-1627-4877-a0ef-eebb8c5a8d98", "address": "fa:16:3e:85:f3:9a", "network": {"id": "6567de92-725d-4dcc-97c2-0fec6d9bda84", "bridge": "br-int", "label": "tempest-network-smoke--623656421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "728662ec7f654a3fb2e53a90b8707d7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0cf81b3-16", "ovs_interfaceid": "f0cf81b3-1627-4877-a0ef-eebb8c5a8d98", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:45:47 np0005588920 nova_compute[226886]: 2026-01-20 15:45:47.725 226890 DEBUG nova.network.os_vif_util [None req-fa3e1558-40f4-4b7b-a5e8-d36e179c4f80 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:85:f3:9a,bridge_name='br-int',has_traffic_filtering=True,id=f0cf81b3-1627-4877-a0ef-eebb8c5a8d98,network=Network(6567de92-725d-4dcc-97c2-0fec6d9bda84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0cf81b3-16') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:45:47 np0005588920 nova_compute[226886]: 2026-01-20 15:45:47.726 226890 DEBUG os_vif [None req-fa3e1558-40f4-4b7b-a5e8-d36e179c4f80 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:85:f3:9a,bridge_name='br-int',has_traffic_filtering=True,id=f0cf81b3-1627-4877-a0ef-eebb8c5a8d98,network=Network(6567de92-725d-4dcc-97c2-0fec6d9bda84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0cf81b3-16') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:45:47 np0005588920 nova_compute[226886]: 2026-01-20 15:45:47.729 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:47 np0005588920 nova_compute[226886]: 2026-01-20 15:45:47.730 226890 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf0cf81b3-16, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:45:47 np0005588920 nova_compute[226886]: 2026-01-20 15:45:47.731 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:47 np0005588920 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ec4bd4873492864080959b4f3ff59a0f712b89155989101b87c499f852f6181e-userdata-shm.mount: Deactivated successfully.
Jan 20 10:45:47 np0005588920 nova_compute[226886]: 2026-01-20 15:45:47.733 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:47 np0005588920 systemd[1]: var-lib-containers-storage-overlay-a4f9f97adc10a61e1f369d4b32d1cf3311c2714f59fe66f871e29064931d7527-merged.mount: Deactivated successfully.
Jan 20 10:45:47 np0005588920 nova_compute[226886]: 2026-01-20 15:45:47.736 226890 INFO os_vif [None req-fa3e1558-40f4-4b7b-a5e8-d36e179c4f80 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:85:f3:9a,bridge_name='br-int',has_traffic_filtering=True,id=f0cf81b3-1627-4877-a0ef-eebb8c5a8d98,network=Network(6567de92-725d-4dcc-97c2-0fec6d9bda84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0cf81b3-16')#033[00m
Jan 20 10:45:47 np0005588920 podman[314580]: 2026-01-20 15:45:47.740519373 +0000 UTC m=+0.105858750 container cleanup ec4bd4873492864080959b4f3ff59a0f712b89155989101b87c499f852f6181e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6567de92-725d-4dcc-97c2-0fec6d9bda84, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:45:47 np0005588920 systemd[1]: libpod-conmon-ec4bd4873492864080959b4f3ff59a0f712b89155989101b87c499f852f6181e.scope: Deactivated successfully.
Jan 20 10:45:47 np0005588920 podman[314626]: 2026-01-20 15:45:47.80492041 +0000 UTC m=+0.041353460 container remove ec4bd4873492864080959b4f3ff59a0f712b89155989101b87c499f852f6181e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6567de92-725d-4dcc-97c2-0fec6d9bda84, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 20 10:45:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:45:47.810 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[866e10bc-1dd8-4c33-82da-634ba5d40606]: (4, ('Tue Jan 20 03:45:47 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-6567de92-725d-4dcc-97c2-0fec6d9bda84 (ec4bd4873492864080959b4f3ff59a0f712b89155989101b87c499f852f6181e)\nec4bd4873492864080959b4f3ff59a0f712b89155989101b87c499f852f6181e\nTue Jan 20 03:45:47 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-6567de92-725d-4dcc-97c2-0fec6d9bda84 (ec4bd4873492864080959b4f3ff59a0f712b89155989101b87c499f852f6181e)\nec4bd4873492864080959b4f3ff59a0f712b89155989101b87c499f852f6181e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:45:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:45:47.812 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d752564f-ceed-4b95-a59c-9c7aa34efd23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:45:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:45:47.813 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6567de92-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:45:47 np0005588920 kernel: tap6567de92-70: left promiscuous mode
Jan 20 10:45:47 np0005588920 nova_compute[226886]: 2026-01-20 15:45:47.815 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:45:47.821 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[cde9a730-491e-4b19-8df4-c03778edc141]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:45:47 np0005588920 nova_compute[226886]: 2026-01-20 15:45:47.832 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:45:47.846 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[ae23907d-3977-4acf-b08a-b9581772a393]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:45:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:45:47.848 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[2fb50ab5-4327-46bc-94d8-92df5a7f64f2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:45:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:45:47.867 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[d6ef4475-7998-4c13-8337-392cc455dafa]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 892868, 'reachable_time': 21360, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 314645, 'error': None, 'target': 'ovnmeta-6567de92-725d-4dcc-97c2-0fec6d9bda84', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:45:47 np0005588920 systemd[1]: run-netns-ovnmeta\x2d6567de92\x2d725d\x2d4dcc\x2d97c2\x2d0fec6d9bda84.mount: Deactivated successfully.
Jan 20 10:45:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:45:47.872 144293 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6567de92-725d-4dcc-97c2-0fec6d9bda84 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:45:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:45:47.872 144293 DEBUG oslo.privsep.daemon [-] privsep: reply[a4f1dc73-c164-4179-b827-80afc5fe04c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:45:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:45:47.874 144128 INFO neutron.agent.ovn.metadata.agent [-] Port f0cf81b3-1627-4877-a0ef-eebb8c5a8d98 in datapath 6567de92-725d-4dcc-97c2-0fec6d9bda84 unbound from our chassis#033[00m
Jan 20 10:45:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:45:47.876 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6567de92-725d-4dcc-97c2-0fec6d9bda84, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:45:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:45:47.878 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[19acd295-b680-4ae1-be52-903f1c35fd68]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:45:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:45:47.879 144128 INFO neutron.agent.ovn.metadata.agent [-] Port f0cf81b3-1627-4877-a0ef-eebb8c5a8d98 in datapath 6567de92-725d-4dcc-97c2-0fec6d9bda84 unbound from our chassis#033[00m
Jan 20 10:45:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:45:47.880 144128 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6567de92-725d-4dcc-97c2-0fec6d9bda84, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:45:47 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:45:47.881 229672 DEBUG oslo.privsep.daemon [-] privsep: reply[8a8571ab-d9fd-4009-8102-f5c0ea01bcca]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:45:48 np0005588920 nova_compute[226886]: 2026-01-20 15:45:48.129 226890 INFO nova.virt.libvirt.driver [None req-fa3e1558-40f4-4b7b-a5e8-d36e179c4f80 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Deleting instance files /var/lib/nova/instances/ef848e46-b111-4794-8f38-0d8226550fc3_del#033[00m
Jan 20 10:45:48 np0005588920 nova_compute[226886]: 2026-01-20 15:45:48.130 226890 INFO nova.virt.libvirt.driver [None req-fa3e1558-40f4-4b7b-a5e8-d36e179c4f80 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Deletion of /var/lib/nova/instances/ef848e46-b111-4794-8f38-0d8226550fc3_del complete#033[00m
Jan 20 10:45:48 np0005588920 nova_compute[226886]: 2026-01-20 15:45:48.179 226890 INFO nova.compute.manager [None req-fa3e1558-40f4-4b7b-a5e8-d36e179c4f80 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Took 0.74 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:45:48 np0005588920 nova_compute[226886]: 2026-01-20 15:45:48.179 226890 DEBUG oslo.service.loopingcall [None req-fa3e1558-40f4-4b7b-a5e8-d36e179c4f80 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:45:48 np0005588920 nova_compute[226886]: 2026-01-20 15:45:48.180 226890 DEBUG nova.compute.manager [-] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:45:48 np0005588920 nova_compute[226886]: 2026-01-20 15:45:48.180 226890 DEBUG nova.network.neutron [-] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:45:49 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:45:49 np0005588920 nova_compute[226886]: 2026-01-20 15:45:49.249 226890 DEBUG nova.network.neutron [req-85a72e41-4a75-49c1-8e62-922ce9603745 req-97e9fd7a-fde6-4323-8968-9184d2b98421 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Updated VIF entry in instance network info cache for port f0cf81b3-1627-4877-a0ef-eebb8c5a8d98. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:45:49 np0005588920 nova_compute[226886]: 2026-01-20 15:45:49.250 226890 DEBUG nova.network.neutron [req-85a72e41-4a75-49c1-8e62-922ce9603745 req-97e9fd7a-fde6-4323-8968-9184d2b98421 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Updating instance_info_cache with network_info: [{"id": "f0cf81b3-1627-4877-a0ef-eebb8c5a8d98", "address": "fa:16:3e:85:f3:9a", "network": {"id": "6567de92-725d-4dcc-97c2-0fec6d9bda84", "bridge": "br-int", "label": "tempest-network-smoke--623656421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "728662ec7f654a3fb2e53a90b8707d7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0cf81b3-16", "ovs_interfaceid": "f0cf81b3-1627-4877-a0ef-eebb8c5a8d98", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:45:49 np0005588920 nova_compute[226886]: 2026-01-20 15:45:49.435 226890 DEBUG nova.compute.manager [req-1e2d59c1-95d5-427b-bc08-e72451fbf502 req-c1f99f12-df8c-40c9-a5df-8f0d0fd69d1f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Received event network-vif-unplugged-f0cf81b3-1627-4877-a0ef-eebb8c5a8d98 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:45:49 np0005588920 nova_compute[226886]: 2026-01-20 15:45:49.435 226890 DEBUG oslo_concurrency.lockutils [req-1e2d59c1-95d5-427b-bc08-e72451fbf502 req-c1f99f12-df8c-40c9-a5df-8f0d0fd69d1f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "ef848e46-b111-4794-8f38-0d8226550fc3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:45:49 np0005588920 nova_compute[226886]: 2026-01-20 15:45:49.436 226890 DEBUG oslo_concurrency.lockutils [req-1e2d59c1-95d5-427b-bc08-e72451fbf502 req-c1f99f12-df8c-40c9-a5df-8f0d0fd69d1f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "ef848e46-b111-4794-8f38-0d8226550fc3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:45:49 np0005588920 nova_compute[226886]: 2026-01-20 15:45:49.436 226890 DEBUG oslo_concurrency.lockutils [req-1e2d59c1-95d5-427b-bc08-e72451fbf502 req-c1f99f12-df8c-40c9-a5df-8f0d0fd69d1f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "ef848e46-b111-4794-8f38-0d8226550fc3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:45:49 np0005588920 nova_compute[226886]: 2026-01-20 15:45:49.436 226890 DEBUG nova.compute.manager [req-1e2d59c1-95d5-427b-bc08-e72451fbf502 req-c1f99f12-df8c-40c9-a5df-8f0d0fd69d1f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] No waiting events found dispatching network-vif-unplugged-f0cf81b3-1627-4877-a0ef-eebb8c5a8d98 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:45:49 np0005588920 nova_compute[226886]: 2026-01-20 15:45:49.436 226890 DEBUG nova.compute.manager [req-1e2d59c1-95d5-427b-bc08-e72451fbf502 req-c1f99f12-df8c-40c9-a5df-8f0d0fd69d1f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Received event network-vif-unplugged-f0cf81b3-1627-4877-a0ef-eebb8c5a8d98 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:45:49 np0005588920 nova_compute[226886]: 2026-01-20 15:45:49.436 226890 DEBUG nova.compute.manager [req-1e2d59c1-95d5-427b-bc08-e72451fbf502 req-c1f99f12-df8c-40c9-a5df-8f0d0fd69d1f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Received event network-vif-plugged-f0cf81b3-1627-4877-a0ef-eebb8c5a8d98 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:45:49 np0005588920 nova_compute[226886]: 2026-01-20 15:45:49.436 226890 DEBUG oslo_concurrency.lockutils [req-1e2d59c1-95d5-427b-bc08-e72451fbf502 req-c1f99f12-df8c-40c9-a5df-8f0d0fd69d1f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "ef848e46-b111-4794-8f38-0d8226550fc3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:45:49 np0005588920 nova_compute[226886]: 2026-01-20 15:45:49.437 226890 DEBUG oslo_concurrency.lockutils [req-1e2d59c1-95d5-427b-bc08-e72451fbf502 req-c1f99f12-df8c-40c9-a5df-8f0d0fd69d1f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "ef848e46-b111-4794-8f38-0d8226550fc3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:45:49 np0005588920 nova_compute[226886]: 2026-01-20 15:45:49.437 226890 DEBUG oslo_concurrency.lockutils [req-1e2d59c1-95d5-427b-bc08-e72451fbf502 req-c1f99f12-df8c-40c9-a5df-8f0d0fd69d1f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "ef848e46-b111-4794-8f38-0d8226550fc3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:45:49 np0005588920 nova_compute[226886]: 2026-01-20 15:45:49.437 226890 DEBUG nova.compute.manager [req-1e2d59c1-95d5-427b-bc08-e72451fbf502 req-c1f99f12-df8c-40c9-a5df-8f0d0fd69d1f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] No waiting events found dispatching network-vif-plugged-f0cf81b3-1627-4877-a0ef-eebb8c5a8d98 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:45:49 np0005588920 nova_compute[226886]: 2026-01-20 15:45:49.437 226890 WARNING nova.compute.manager [req-1e2d59c1-95d5-427b-bc08-e72451fbf502 req-c1f99f12-df8c-40c9-a5df-8f0d0fd69d1f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Received unexpected event network-vif-plugged-f0cf81b3-1627-4877-a0ef-eebb8c5a8d98 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 10:45:49 np0005588920 nova_compute[226886]: 2026-01-20 15:45:49.437 226890 DEBUG nova.compute.manager [req-1e2d59c1-95d5-427b-bc08-e72451fbf502 req-c1f99f12-df8c-40c9-a5df-8f0d0fd69d1f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Received event network-vif-plugged-f0cf81b3-1627-4877-a0ef-eebb8c5a8d98 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:45:49 np0005588920 nova_compute[226886]: 2026-01-20 15:45:49.437 226890 DEBUG oslo_concurrency.lockutils [req-1e2d59c1-95d5-427b-bc08-e72451fbf502 req-c1f99f12-df8c-40c9-a5df-8f0d0fd69d1f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "ef848e46-b111-4794-8f38-0d8226550fc3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:45:49 np0005588920 nova_compute[226886]: 2026-01-20 15:45:49.438 226890 DEBUG oslo_concurrency.lockutils [req-1e2d59c1-95d5-427b-bc08-e72451fbf502 req-c1f99f12-df8c-40c9-a5df-8f0d0fd69d1f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "ef848e46-b111-4794-8f38-0d8226550fc3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:45:49 np0005588920 nova_compute[226886]: 2026-01-20 15:45:49.438 226890 DEBUG oslo_concurrency.lockutils [req-1e2d59c1-95d5-427b-bc08-e72451fbf502 req-c1f99f12-df8c-40c9-a5df-8f0d0fd69d1f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "ef848e46-b111-4794-8f38-0d8226550fc3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:45:49 np0005588920 nova_compute[226886]: 2026-01-20 15:45:49.438 226890 DEBUG nova.compute.manager [req-1e2d59c1-95d5-427b-bc08-e72451fbf502 req-c1f99f12-df8c-40c9-a5df-8f0d0fd69d1f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] No waiting events found dispatching network-vif-plugged-f0cf81b3-1627-4877-a0ef-eebb8c5a8d98 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:45:49 np0005588920 nova_compute[226886]: 2026-01-20 15:45:49.438 226890 WARNING nova.compute.manager [req-1e2d59c1-95d5-427b-bc08-e72451fbf502 req-c1f99f12-df8c-40c9-a5df-8f0d0fd69d1f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Received unexpected event network-vif-plugged-f0cf81b3-1627-4877-a0ef-eebb8c5a8d98 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 10:45:49 np0005588920 nova_compute[226886]: 2026-01-20 15:45:49.438 226890 DEBUG nova.compute.manager [req-1e2d59c1-95d5-427b-bc08-e72451fbf502 req-c1f99f12-df8c-40c9-a5df-8f0d0fd69d1f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Received event network-vif-plugged-f0cf81b3-1627-4877-a0ef-eebb8c5a8d98 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:45:49 np0005588920 nova_compute[226886]: 2026-01-20 15:45:49.439 226890 DEBUG oslo_concurrency.lockutils [req-1e2d59c1-95d5-427b-bc08-e72451fbf502 req-c1f99f12-df8c-40c9-a5df-8f0d0fd69d1f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "ef848e46-b111-4794-8f38-0d8226550fc3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:45:49 np0005588920 nova_compute[226886]: 2026-01-20 15:45:49.439 226890 DEBUG oslo_concurrency.lockutils [req-1e2d59c1-95d5-427b-bc08-e72451fbf502 req-c1f99f12-df8c-40c9-a5df-8f0d0fd69d1f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "ef848e46-b111-4794-8f38-0d8226550fc3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:45:49 np0005588920 nova_compute[226886]: 2026-01-20 15:45:49.439 226890 DEBUG oslo_concurrency.lockutils [req-1e2d59c1-95d5-427b-bc08-e72451fbf502 req-c1f99f12-df8c-40c9-a5df-8f0d0fd69d1f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "ef848e46-b111-4794-8f38-0d8226550fc3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:45:49 np0005588920 nova_compute[226886]: 2026-01-20 15:45:49.439 226890 DEBUG nova.compute.manager [req-1e2d59c1-95d5-427b-bc08-e72451fbf502 req-c1f99f12-df8c-40c9-a5df-8f0d0fd69d1f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] No waiting events found dispatching network-vif-plugged-f0cf81b3-1627-4877-a0ef-eebb8c5a8d98 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 20 10:45:49 np0005588920 nova_compute[226886]: 2026-01-20 15:45:49.439 226890 WARNING nova.compute.manager [req-1e2d59c1-95d5-427b-bc08-e72451fbf502 req-c1f99f12-df8c-40c9-a5df-8f0d0fd69d1f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Received unexpected event network-vif-plugged-f0cf81b3-1627-4877-a0ef-eebb8c5a8d98 for instance with vm_state active and task_state deleting.
Jan 20 10:45:49 np0005588920 nova_compute[226886]: 2026-01-20 15:45:49.458 226890 DEBUG oslo_concurrency.lockutils [req-85a72e41-4a75-49c1-8e62-922ce9603745 req-97e9fd7a-fde6-4323-8968-9184d2b98421 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-ef848e46-b111-4794-8f38-0d8226550fc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 20 10:45:49 np0005588920 nova_compute[226886]: 2026-01-20 15:45:49.560 226890 DEBUG nova.network.neutron [-] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 20 10:45:49 np0005588920 nova_compute[226886]: 2026-01-20 15:45:49.580 226890 INFO nova.compute.manager [-] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Took 1.40 seconds to deallocate network for instance.
Jan 20 10:45:49 np0005588920 nova_compute[226886]: 2026-01-20 15:45:49.622 226890 DEBUG oslo_concurrency.lockutils [None req-fa3e1558-40f4-4b7b-a5e8-d36e179c4f80 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:45:49 np0005588920 nova_compute[226886]: 2026-01-20 15:45:49.622 226890 DEBUG oslo_concurrency.lockutils [None req-fa3e1558-40f4-4b7b-a5e8-d36e179c4f80 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:45:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:45:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:49.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:45:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:49.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:49 np0005588920 nova_compute[226886]: 2026-01-20 15:45:49.668 226890 DEBUG oslo_concurrency.processutils [None req-fa3e1558-40f4-4b7b-a5e8-d36e179c4f80 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:45:49 np0005588920 nova_compute[226886]: 2026-01-20 15:45:49.955 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:45:50 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:45:50 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3862295260' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:45:50 np0005588920 nova_compute[226886]: 2026-01-20 15:45:50.150 226890 DEBUG oslo_concurrency.processutils [None req-fa3e1558-40f4-4b7b-a5e8-d36e179c4f80 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:45:50 np0005588920 nova_compute[226886]: 2026-01-20 15:45:50.155 226890 DEBUG nova.compute.provider_tree [None req-fa3e1558-40f4-4b7b-a5e8-d36e179c4f80 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 20 10:45:50 np0005588920 nova_compute[226886]: 2026-01-20 15:45:50.172 226890 DEBUG nova.scheduler.client.report [None req-fa3e1558-40f4-4b7b-a5e8-d36e179c4f80 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 20 10:45:50 np0005588920 nova_compute[226886]: 2026-01-20 15:45:50.194 226890 DEBUG oslo_concurrency.lockutils [None req-fa3e1558-40f4-4b7b-a5e8-d36e179c4f80 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.572s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:45:50 np0005588920 nova_compute[226886]: 2026-01-20 15:45:50.226 226890 INFO nova.scheduler.client.report [None req-fa3e1558-40f4-4b7b-a5e8-d36e179c4f80 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Deleted allocations for instance ef848e46-b111-4794-8f38-0d8226550fc3
Jan 20 10:45:50 np0005588920 nova_compute[226886]: 2026-01-20 15:45:50.322 226890 DEBUG oslo_concurrency.lockutils [None req-fa3e1558-40f4-4b7b-a5e8-d36e179c4f80 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "ef848e46-b111-4794-8f38-0d8226550fc3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.884s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:45:51 np0005588920 podman[314669]: 2026-01-20 15:45:51.049045222 +0000 UTC m=+0.132557072 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 20 10:45:51 np0005588920 nova_compute[226886]: 2026-01-20 15:45:51.552 226890 DEBUG nova.compute.manager [req-e494b347-85e3-4a24-a490-de89cfaed080 req-88192877-e98f-49ff-9cc1-8d1f8f861304 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Received event network-vif-plugged-f0cf81b3-1627-4877-a0ef-eebb8c5a8d98 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 10:45:51 np0005588920 nova_compute[226886]: 2026-01-20 15:45:51.553 226890 DEBUG oslo_concurrency.lockutils [req-e494b347-85e3-4a24-a490-de89cfaed080 req-88192877-e98f-49ff-9cc1-8d1f8f861304 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "ef848e46-b111-4794-8f38-0d8226550fc3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:45:51 np0005588920 nova_compute[226886]: 2026-01-20 15:45:51.553 226890 DEBUG oslo_concurrency.lockutils [req-e494b347-85e3-4a24-a490-de89cfaed080 req-88192877-e98f-49ff-9cc1-8d1f8f861304 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "ef848e46-b111-4794-8f38-0d8226550fc3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:45:51 np0005588920 nova_compute[226886]: 2026-01-20 15:45:51.553 226890 DEBUG oslo_concurrency.lockutils [req-e494b347-85e3-4a24-a490-de89cfaed080 req-88192877-e98f-49ff-9cc1-8d1f8f861304 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "ef848e46-b111-4794-8f38-0d8226550fc3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:45:51 np0005588920 nova_compute[226886]: 2026-01-20 15:45:51.553 226890 DEBUG nova.compute.manager [req-e494b347-85e3-4a24-a490-de89cfaed080 req-88192877-e98f-49ff-9cc1-8d1f8f861304 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] No waiting events found dispatching network-vif-plugged-f0cf81b3-1627-4877-a0ef-eebb8c5a8d98 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 20 10:45:51 np0005588920 nova_compute[226886]: 2026-01-20 15:45:51.554 226890 WARNING nova.compute.manager [req-e494b347-85e3-4a24-a490-de89cfaed080 req-88192877-e98f-49ff-9cc1-8d1f8f861304 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Received unexpected event network-vif-plugged-f0cf81b3-1627-4877-a0ef-eebb8c5a8d98 for instance with vm_state deleted and task_state None.
Jan 20 10:45:51 np0005588920 nova_compute[226886]: 2026-01-20 15:45:51.554 226890 DEBUG nova.compute.manager [req-e494b347-85e3-4a24-a490-de89cfaed080 req-88192877-e98f-49ff-9cc1-8d1f8f861304 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Received event network-vif-deleted-f0cf81b3-1627-4877-a0ef-eebb8c5a8d98 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 10:45:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:51.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:51.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:52 np0005588920 nova_compute[226886]: 2026-01-20 15:45:52.732 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:45:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:53.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:53.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:54 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:45:54 np0005588920 nova_compute[226886]: 2026-01-20 15:45:54.956 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:45:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:55.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:45:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:55.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:45:55 np0005588920 nova_compute[226886]: 2026-01-20 15:45:55.968 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:45:55 np0005588920 nova_compute[226886]: 2026-01-20 15:45:55.980 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:45:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:57.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:57.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:57 np0005588920 nova_compute[226886]: 2026-01-20 15:45:57.733 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:45:59 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:45:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:59.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:45:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:59.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:59 np0005588920 nova_compute[226886]: 2026-01-20 15:45:59.959 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:46:00 np0005588920 podman[314697]: 2026-01-20 15:46:00.962387029 +0000 UTC m=+0.054664411 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Jan 20 10:46:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:01.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:46:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:01.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:46:02 np0005588920 nova_compute[226886]: 2026-01-20 15:46:02.702 226890 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768923947.6997662, ef848e46-b111-4794-8f38-0d8226550fc3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 20 10:46:02 np0005588920 nova_compute[226886]: 2026-01-20 15:46:02.703 226890 INFO nova.compute.manager [-] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] VM Stopped (Lifecycle Event)
Jan 20 10:46:02 np0005588920 nova_compute[226886]: 2026-01-20 15:46:02.735 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:46:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:03.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:03.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:04 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:46:04 np0005588920 nova_compute[226886]: 2026-01-20 15:46:04.962 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:46:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:46:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:05.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:46:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:05.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:46:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:07.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:46:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:07.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:07 np0005588920 nova_compute[226886]: 2026-01-20 15:46:07.738 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:46:08 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #184. Immutable memtables: 0.
Jan 20 10:46:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:46:08.556401) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:46:08 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 117] Flushing memtable with next log file: 184
Jan 20 10:46:08 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923968556433, "job": 117, "event": "flush_started", "num_memtables": 1, "num_entries": 687, "num_deletes": 250, "total_data_size": 1245853, "memory_usage": 1266808, "flush_reason": "Manual Compaction"}
Jan 20 10:46:08 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 117] Level-0 flush table #185: started
Jan 20 10:46:08 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923968560980, "cf_name": "default", "job": 117, "event": "table_file_creation", "file_number": 185, "file_size": 531405, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 89595, "largest_seqno": 90277, "table_properties": {"data_size": 528539, "index_size": 837, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 965, "raw_key_size": 7711, "raw_average_key_size": 20, "raw_value_size": 522605, "raw_average_value_size": 1382, "num_data_blocks": 38, "num_entries": 378, "num_filter_entries": 378, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768923919, "oldest_key_time": 1768923919, "file_creation_time": 1768923968, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 185, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:46:08 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 117] Flush lasted 4609 microseconds, and 1936 cpu microseconds.
Jan 20 10:46:08 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:46:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:46:08.561010) [db/flush_job.cc:967] [default] [JOB 117] Level-0 flush table #185: 531405 bytes OK
Jan 20 10:46:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:46:08.561023) [db/memtable_list.cc:519] [default] Level-0 commit table #185 started
Jan 20 10:46:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:46:08.562568) [db/memtable_list.cc:722] [default] Level-0 commit table #185: memtable #1 done
Jan 20 10:46:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:46:08.562579) EVENT_LOG_v1 {"time_micros": 1768923968562576, "job": 117, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:46:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:46:08.562593) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:46:08 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 117] Try to delete WAL files size 1242162, prev total WAL file size 1242162, number of live WAL files 2.
Jan 20 10:46:08 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000181.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:46:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:46:08.563101) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740033303039' seq:72057594037927935, type:22 .. '6D6772737461740033323630' seq:0, type:0; will stop at (end)
Jan 20 10:46:08 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 118] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:46:08 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 117 Base level 0, inputs: [185(518KB)], [183(13MB)]
Jan 20 10:46:08 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923968563179, "job": 118, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [185], "files_L6": [183], "score": -1, "input_data_size": 15013367, "oldest_snapshot_seqno": -1}
Jan 20 10:46:08 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 118] Generated table #186: 10921 keys, 11470520 bytes, temperature: kUnknown
Jan 20 10:46:08 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923968694050, "cf_name": "default", "job": 118, "event": "table_file_creation", "file_number": 186, "file_size": 11470520, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11404205, "index_size": 37914, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27333, "raw_key_size": 288131, "raw_average_key_size": 26, "raw_value_size": 11217286, "raw_average_value_size": 1027, "num_data_blocks": 1430, "num_entries": 10921, "num_filter_entries": 10921, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768923968, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 186, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:46:08 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:46:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:46:08.694577) [db/compaction/compaction_job.cc:1663] [default] [JOB 118] Compacted 1@0 + 1@6 files to L6 => 11470520 bytes
Jan 20 10:46:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:46:08.699447) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 114.7 rd, 87.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.5, 13.8 +0.0 blob) out(10.9 +0.0 blob), read-write-amplify(49.8) write-amplify(21.6) OK, records in: 11411, records dropped: 490 output_compression: NoCompression
Jan 20 10:46:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:46:08.699468) EVENT_LOG_v1 {"time_micros": 1768923968699458, "job": 118, "event": "compaction_finished", "compaction_time_micros": 130907, "compaction_time_cpu_micros": 56202, "output_level": 6, "num_output_files": 1, "total_output_size": 11470520, "num_input_records": 11411, "num_output_records": 10921, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:46:08 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000185.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:46:08 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923968699718, "job": 118, "event": "table_file_deletion", "file_number": 185}
Jan 20 10:46:08 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000183.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:46:08 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923968702967, "job": 118, "event": "table_file_deletion", "file_number": 183}
Jan 20 10:46:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:46:08.562982) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:46:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:46:08.703069) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:46:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:46:08.703076) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:46:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:46:08.703079) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:46:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:46:08.703081) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:46:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:46:08.703084) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:46:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:46:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:09.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:09.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:09 np0005588920 nova_compute[226886]: 2026-01-20 15:46:09.964 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:46:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:11.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:11.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:12 np0005588920 nova_compute[226886]: 2026-01-20 15:46:12.740 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:46:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:13.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:46:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2810297471' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:46:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:46:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2810297471' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:46:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:46:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:13.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:46:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:46:14 np0005588920 nova_compute[226886]: 2026-01-20 15:46:14.966 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:46:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:46:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:15.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:46:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:15.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:46:16.510 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:46:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:46:16.510 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:46:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:46:16.510 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:46:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:46:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:17.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:46:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:17.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:17 np0005588920 nova_compute[226886]: 2026-01-20 15:46:17.741 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:46:19 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:46:19 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:46:19 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:46:19 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:46:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:19.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:19.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:19 np0005588920 nova_compute[226886]: 2026-01-20 15:46:19.969 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:46:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:21.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:21.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:22 np0005588920 podman[314850]: 2026-01-20 15:46:22.037696016 +0000 UTC m=+0.113714915 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller)
Jan 20 10:46:22 np0005588920 nova_compute[226886]: 2026-01-20 15:46:22.743 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:46:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:46:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:23.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:46:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:46:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:23.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:46:24 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:46:24 np0005588920 nova_compute[226886]: 2026-01-20 15:46:24.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:46:24 np0005588920 nova_compute[226886]: 2026-01-20 15:46:24.971 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:46:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:25.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:25.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:26 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:46:26 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:46:27 np0005588920 ovn_controller[133971]: 2026-01-20T15:46:27Z|01011|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 20 10:46:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:46:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:27.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:46:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:27.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:27 np0005588920 nova_compute[226886]: 2026-01-20 15:46:27.745 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:46:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:46:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:29.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:29.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:29 np0005588920 nova_compute[226886]: 2026-01-20 15:46:29.973 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:46:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:31.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:31.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:32 np0005588920 podman[314926]: 2026-01-20 15:46:32.004125107 +0000 UTC m=+0.079061516 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 20 10:46:32 np0005588920 nova_compute[226886]: 2026-01-20 15:46:32.747 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:46:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:46:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:33.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:46:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:46:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:33.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:46:34 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:46:34 np0005588920 nova_compute[226886]: 2026-01-20 15:46:34.975 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:46:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:35.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:35.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:46:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:37.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:46:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:37.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:37 np0005588920 nova_compute[226886]: 2026-01-20 15:46:37.749 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:46:39 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:46:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:39.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:39.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:40 np0005588920 nova_compute[226886]: 2026-01-20 15:46:40.014 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:46:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:41.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:41.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:42 np0005588920 nova_compute[226886]: 2026-01-20 15:46:42.751 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:46:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:43.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:43.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:46:45 np0005588920 nova_compute[226886]: 2026-01-20 15:46:45.019 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:46:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:46:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:45.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:46:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:46:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:45.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:46:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:47.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:47.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:47 np0005588920 nova_compute[226886]: 2026-01-20 15:46:47.763 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:46:49 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:46:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:46:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:49.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:46:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:49.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:50 np0005588920 nova_compute[226886]: 2026-01-20 15:46:50.020 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:46:50 np0005588920 nova_compute[226886]: 2026-01-20 15:46:50.923 226890 DEBUG nova.compute.manager [None req-8565af87-c261-4378-8da1-4b625e17a8c2 - - - - - -] [instance: ef848e46-b111-4794-8f38-0d8226550fc3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:46:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:51.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:46:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:51.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:46:52 np0005588920 nova_compute[226886]: 2026-01-20 15:46:52.813 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:46:52 np0005588920 podman[314947]: 2026-01-20 15:46:52.993142675 +0000 UTC m=+0.083907714 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible)
Jan 20 10:46:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:53.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:53.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:54 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:46:55 np0005588920 nova_compute[226886]: 2026-01-20 15:46:55.023 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:46:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:55.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:55.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:46:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:57.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:46:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:57.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:57 np0005588920 nova_compute[226886]: 2026-01-20 15:46:57.816 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:46:59 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:46:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:46:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:59.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:46:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:46:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:59.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:00 np0005588920 nova_compute[226886]: 2026-01-20 15:47:00.025 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:47:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:01.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:47:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:01.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:47:02 np0005588920 nova_compute[226886]: 2026-01-20 15:47:02.859 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:47:02 np0005588920 podman[314973]: 2026-01-20 15:47:02.97444675 +0000 UTC m=+0.058484369 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 20 10:47:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:03.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:03.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:04 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:47:05 np0005588920 nova_compute[226886]: 2026-01-20 15:47:05.068 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:47:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:05.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:47:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:05.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:47:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:07.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:07.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:07 np0005588920 nova_compute[226886]: 2026-01-20 15:47:07.862 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:47:08 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #187. Immutable memtables: 0.
Jan 20 10:47:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:47:08.577992) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:47:08 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 119] Flushing memtable with next log file: 187
Jan 20 10:47:08 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768924028578038, "job": 119, "event": "flush_started", "num_memtables": 1, "num_entries": 805, "num_deletes": 254, "total_data_size": 1549312, "memory_usage": 1577056, "flush_reason": "Manual Compaction"}
Jan 20 10:47:08 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 119] Level-0 flush table #188: started
Jan 20 10:47:08 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768924028588801, "cf_name": "default", "job": 119, "event": "table_file_creation", "file_number": 188, "file_size": 1022693, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 90282, "largest_seqno": 91082, "table_properties": {"data_size": 1018873, "index_size": 1599, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1157, "raw_key_size": 8317, "raw_average_key_size": 18, "raw_value_size": 1011251, "raw_average_value_size": 2298, "num_data_blocks": 71, "num_entries": 440, "num_filter_entries": 440, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768923968, "oldest_key_time": 1768923968, "file_creation_time": 1768924028, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 188, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:47:08 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 119] Flush lasted 10874 microseconds, and 3278 cpu microseconds.
Jan 20 10:47:08 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:47:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:47:08.588864) [db/flush_job.cc:967] [default] [JOB 119] Level-0 flush table #188: 1022693 bytes OK
Jan 20 10:47:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:47:08.588884) [db/memtable_list.cc:519] [default] Level-0 commit table #188 started
Jan 20 10:47:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:47:08.591170) [db/memtable_list.cc:722] [default] Level-0 commit table #188: memtable #1 done
Jan 20 10:47:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:47:08.591185) EVENT_LOG_v1 {"time_micros": 1768924028591181, "job": 119, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:47:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:47:08.591217) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:47:08 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 119] Try to delete WAL files size 1545139, prev total WAL file size 1545139, number of live WAL files 2.
Jan 20 10:47:08 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000184.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:47:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:47:08.591788) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033353233' seq:72057594037927935, type:22 .. '6C6F676D0033373734' seq:0, type:0; will stop at (end)
Jan 20 10:47:08 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 120] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:47:08 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 119 Base level 0, inputs: [188(998KB)], [186(10MB)]
Jan 20 10:47:08 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768924028591863, "job": 120, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [188], "files_L6": [186], "score": -1, "input_data_size": 12493213, "oldest_snapshot_seqno": -1}
Jan 20 10:47:08 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 120] Generated table #189: 10841 keys, 12373371 bytes, temperature: kUnknown
Jan 20 10:47:08 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768924028670630, "cf_name": "default", "job": 120, "event": "table_file_creation", "file_number": 189, "file_size": 12373371, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12306145, "index_size": 39017, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27141, "raw_key_size": 287371, "raw_average_key_size": 26, "raw_value_size": 12119272, "raw_average_value_size": 1117, "num_data_blocks": 1475, "num_entries": 10841, "num_filter_entries": 10841, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768924028, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 189, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:47:08 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:47:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:47:08.671556) [db/compaction/compaction_job.cc:1663] [default] [JOB 120] Compacted 1@0 + 1@6 files to L6 => 12373371 bytes
Jan 20 10:47:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:47:08.673119) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 158.4 rd, 156.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.0, 10.9 +0.0 blob) out(11.8 +0.0 blob), read-write-amplify(24.3) write-amplify(12.1) OK, records in: 11361, records dropped: 520 output_compression: NoCompression
Jan 20 10:47:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:47:08.673146) EVENT_LOG_v1 {"time_micros": 1768924028673134, "job": 120, "event": "compaction_finished", "compaction_time_micros": 78891, "compaction_time_cpu_micros": 30144, "output_level": 6, "num_output_files": 1, "total_output_size": 12373371, "num_input_records": 11361, "num_output_records": 10841, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:47:08 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000188.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:47:08 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768924028673756, "job": 120, "event": "table_file_deletion", "file_number": 188}
Jan 20 10:47:08 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000186.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:47:08 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768924028676592, "job": 120, "event": "table_file_deletion", "file_number": 186}
Jan 20 10:47:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:47:08.591680) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:47:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:47:08.676801) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:47:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:47:08.676809) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:47:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:47:08.676811) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:47:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:47:08.676813) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:47:08 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:47:08.676820) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:47:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:47:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:47:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:09.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:47:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:47:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:09.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:47:10 np0005588920 nova_compute[226886]: 2026-01-20 15:47:10.070 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:47:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:47:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:11.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:47:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:11.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:12 np0005588920 nova_compute[226886]: 2026-01-20 15:47:12.898 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:47:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:47:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/559030029' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:47:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:47:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/559030029' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:47:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:13.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:47:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:13.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:47:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:47:15 np0005588920 nova_compute[226886]: 2026-01-20 15:47:15.072 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:47:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:15.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:15.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:16 np0005588920 nova_compute[226886]: 2026-01-20 15:47:16.017 226890 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 59.02 sec#033[00m
Jan 20 10:47:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:47:16.510 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:47:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:47:16.510 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:47:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:47:16.510 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:47:17 np0005588920 nova_compute[226886]: 2026-01-20 15:47:17.022 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:47:17 np0005588920 nova_compute[226886]: 2026-01-20 15:47:17.022 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:47:17 np0005588920 nova_compute[226886]: 2026-01-20 15:47:17.022 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:47:17 np0005588920 nova_compute[226886]: 2026-01-20 15:47:17.022 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:47:17 np0005588920 nova_compute[226886]: 2026-01-20 15:47:17.023 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:47:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:47:17 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/536885463' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:47:17 np0005588920 nova_compute[226886]: 2026-01-20 15:47:17.465 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:47:17 np0005588920 nova_compute[226886]: 2026-01-20 15:47:17.666 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:47:17 np0005588920 nova_compute[226886]: 2026-01-20 15:47:17.667 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4138MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:47:17 np0005588920 nova_compute[226886]: 2026-01-20 15:47:17.667 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:47:17 np0005588920 nova_compute[226886]: 2026-01-20 15:47:17.668 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:47:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:17.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:47:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:17.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:47:17 np0005588920 nova_compute[226886]: 2026-01-20 15:47:17.869 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:47:17 np0005588920 nova_compute[226886]: 2026-01-20 15:47:17.870 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:47:17 np0005588920 nova_compute[226886]: 2026-01-20 15:47:17.930 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:47:18 np0005588920 nova_compute[226886]: 2026-01-20 15:47:18.100 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:47:18 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:47:18.419 144128 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=91, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=90) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:47:18 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:47:18.421 144128 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:47:18 np0005588920 nova_compute[226886]: 2026-01-20 15:47:18.421 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:47:18 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:47:18 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1097814839' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:47:18 np0005588920 nova_compute[226886]: 2026-01-20 15:47:18.547 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:47:18 np0005588920 nova_compute[226886]: 2026-01-20 15:47:18.553 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:47:18 np0005588920 nova_compute[226886]: 2026-01-20 15:47:18.593 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:47:18 np0005588920 nova_compute[226886]: 2026-01-20 15:47:18.667 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:47:18 np0005588920 nova_compute[226886]: 2026-01-20 15:47:18.667 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:47:19 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:47:19 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:47:19.424 144128 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c9bfe4c-7684-437c-a64a-33562743d048, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '91'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:47:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:19.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:19.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:20 np0005588920 nova_compute[226886]: 2026-01-20 15:47:20.073 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:47:21 np0005588920 nova_compute[226886]: 2026-01-20 15:47:21.667 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:47:21 np0005588920 nova_compute[226886]: 2026-01-20 15:47:21.668 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:47:21 np0005588920 nova_compute[226886]: 2026-01-20 15:47:21.668 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:47:21 np0005588920 nova_compute[226886]: 2026-01-20 15:47:21.668 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:47:21 np0005588920 nova_compute[226886]: 2026-01-20 15:47:21.688 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:47:21 np0005588920 nova_compute[226886]: 2026-01-20 15:47:21.689 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:47:21 np0005588920 nova_compute[226886]: 2026-01-20 15:47:21.689 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:47:21 np0005588920 nova_compute[226886]: 2026-01-20 15:47:21.690 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:47:21 np0005588920 nova_compute[226886]: 2026-01-20 15:47:21.691 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:47:21 np0005588920 nova_compute[226886]: 2026-01-20 15:47:21.691 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:47:21 np0005588920 nova_compute[226886]: 2026-01-20 15:47:21.691 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:47:21 np0005588920 nova_compute[226886]: 2026-01-20 15:47:21.692 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:47:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:21.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:21.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:22 np0005588920 nova_compute[226886]: 2026-01-20 15:47:22.979 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:47:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:23.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:47:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:23.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:47:23 np0005588920 podman[315037]: 2026-01-20 15:47:23.98990373 +0000 UTC m=+0.074190927 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 20 10:47:24 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:47:25 np0005588920 nova_compute[226886]: 2026-01-20 15:47:25.075 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:47:25 np0005588920 nova_compute[226886]: 2026-01-20 15:47:25.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:47:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:25.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:25.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:25 np0005588920 nova_compute[226886]: 2026-01-20 15:47:25.822 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:47:25 np0005588920 nova_compute[226886]: 2026-01-20 15:47:25.823 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:47:25 np0005588920 nova_compute[226886]: 2026-01-20 15:47:25.823 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:47:25 np0005588920 nova_compute[226886]: 2026-01-20 15:47:25.823 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:47:25 np0005588920 nova_compute[226886]: 2026-01-20 15:47:25.824 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:47:26 np0005588920 nova_compute[226886]: 2026-01-20 15:47:26.287 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:47:26 np0005588920 nova_compute[226886]: 2026-01-20 15:47:26.459 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:47:26 np0005588920 nova_compute[226886]: 2026-01-20 15:47:26.461 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4101MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:47:26 np0005588920 nova_compute[226886]: 2026-01-20 15:47:26.462 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:47:26 np0005588920 nova_compute[226886]: 2026-01-20 15:47:26.462 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:47:26 np0005588920 nova_compute[226886]: 2026-01-20 15:47:26.927 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:47:26 np0005588920 nova_compute[226886]: 2026-01-20 15:47:26.927 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:47:27 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:47:27 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:47:27 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:47:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:47:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:27.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:47:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:27.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:27 np0005588920 nova_compute[226886]: 2026-01-20 15:47:27.932 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:47:27 np0005588920 nova_compute[226886]: 2026-01-20 15:47:27.982 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:47:28 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:47:28 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4258930174' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:47:28 np0005588920 nova_compute[226886]: 2026-01-20 15:47:28.386 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:47:28 np0005588920 nova_compute[226886]: 2026-01-20 15:47:28.390 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:47:28 np0005588920 nova_compute[226886]: 2026-01-20 15:47:28.409 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:47:28 np0005588920 nova_compute[226886]: 2026-01-20 15:47:28.411 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:47:28 np0005588920 nova_compute[226886]: 2026-01-20 15:47:28.411 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.949s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:47:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:47:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:47:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:29.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:47:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:47:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:29.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:47:30 np0005588920 nova_compute[226886]: 2026-01-20 15:47:30.076 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:47:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:31.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:31.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:32 np0005588920 nova_compute[226886]: 2026-01-20 15:47:32.411 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:47:32 np0005588920 nova_compute[226886]: 2026-01-20 15:47:32.411 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:47:32 np0005588920 nova_compute[226886]: 2026-01-20 15:47:32.411 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:47:32 np0005588920 nova_compute[226886]: 2026-01-20 15:47:32.987 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:47:33 np0005588920 podman[315264]: 2026-01-20 15:47:33.232043021 +0000 UTC m=+0.059058186 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 20 10:47:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:47:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:33.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:47:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:33.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:33 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:47:33 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:47:34 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:47:34 np0005588920 nova_compute[226886]: 2026-01-20 15:47:34.551 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:47:34 np0005588920 nova_compute[226886]: 2026-01-20 15:47:34.552 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:47:34 np0005588920 nova_compute[226886]: 2026-01-20 15:47:34.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:47:35 np0005588920 nova_compute[226886]: 2026-01-20 15:47:35.077 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:47:35 np0005588920 nova_compute[226886]: 2026-01-20 15:47:35.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:47:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:47:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:35.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:47:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:35.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:37 np0005588920 nova_compute[226886]: 2026-01-20 15:47:37.720 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:47:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:37.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:37.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:37 np0005588920 nova_compute[226886]: 2026-01-20 15:47:37.989 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:47:39 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:47:39 np0005588920 nova_compute[226886]: 2026-01-20 15:47:39.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:47:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:39.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:39.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:40 np0005588920 nova_compute[226886]: 2026-01-20 15:47:40.078 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:47:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:41.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:41.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:42 np0005588920 nova_compute[226886]: 2026-01-20 15:47:42.720 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:47:43 np0005588920 nova_compute[226886]: 2026-01-20 15:47:43.026 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:47:43 np0005588920 nova_compute[226886]: 2026-01-20 15:47:43.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:47:43 np0005588920 nova_compute[226886]: 2026-01-20 15:47:43.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:47:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:47:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:43.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:47:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:43.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:47:45 np0005588920 nova_compute[226886]: 2026-01-20 15:47:45.080 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:47:45 np0005588920 nova_compute[226886]: 2026-01-20 15:47:45.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:47:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:47:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:45.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:47:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:45.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:47:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:47.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:47:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:47.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:48 np0005588920 nova_compute[226886]: 2026-01-20 15:47:48.031 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:47:49 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:47:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:47:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:49.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:47:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:49.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:50 np0005588920 nova_compute[226886]: 2026-01-20 15:47:50.084 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:47:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:51.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:51.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:52 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 10:47:52 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6600.0 total, 600.0 interval#012Cumulative writes: 18K writes, 91K keys, 18K commit groups, 1.0 writes per commit group, ingest: 0.18 GB, 0.03 MB/s#012Cumulative WAL: 18K writes, 18K syncs, 1.00 writes per sync, written: 0.18 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1464 writes, 7258 keys, 1464 commit groups, 1.0 writes per commit group, ingest: 15.22 MB, 0.03 MB/s#012Interval WAL: 1464 writes, 1464 syncs, 1.00 writes per sync, written: 0.01 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     70.6      1.58              0.40        60    0.026       0      0       0.0       0.0#012  L6      1/0   11.80 MB   0.0      0.7     0.1      0.6       0.6      0.0       0.0   5.5    105.5     90.7      6.75              1.86        59    0.114    474K    31K       0.0       0.0#012 Sum      1/0   11.80 MB   0.0      0.7     0.1      0.6       0.7      0.1       0.0   6.5     85.5     86.9      8.33              2.25       119    0.070    474K    31K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   8.8     88.2     87.9      0.91              0.23        12    0.076     67K   3081       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) 
Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.7     0.1      0.6       0.6      0.0       0.0   0.0    105.5     90.7      6.75              1.86        59    0.114    474K    31K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     70.7      1.58              0.40        59    0.027       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6600.0 total, 600.0 interval#012Flush(GB): cumulative 0.109, interval 0.009#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.71 GB write, 0.11 MB/s write, 0.70 GB read, 0.11 MB/s read, 8.3 seconds#012Interval compaction: 0.08 GB write, 0.13 MB/s write, 0.08 GB read, 0.13 MB/s read, 0.9 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x564a2f9711f0#2 capacity: 304.00 MB usage: 76.42 MB table_size: 0 occupancy: 18446744073709551615 collections: 12 last_copies: 0 last_secs: 0.000507 secs_since: 0#012Block cache entry stats(count,size,portion): 
DataBlock(4356,73.08 MB,24.0381%) FilterBlock(119,1.27 MB,0.417664%) IndexBlock(119,2.07 MB,0.681179%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 20 10:47:53 np0005588920 nova_compute[226886]: 2026-01-20 15:47:53.090 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:47:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:53.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:53.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:54 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:47:55 np0005588920 podman[315311]: 2026-01-20 15:47:55.034110634 +0000 UTC m=+0.103070071 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 20 10:47:55 np0005588920 nova_compute[226886]: 2026-01-20 15:47:55.086 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:47:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:55.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:55.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:57.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:57.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:58 np0005588920 nova_compute[226886]: 2026-01-20 15:47:58.094 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:47:59 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:47:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:59.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:47:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:59.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:00 np0005588920 nova_compute[226886]: 2026-01-20 15:48:00.088 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:48:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:01.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:01.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:03 np0005588920 nova_compute[226886]: 2026-01-20 15:48:03.102 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:48:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:48:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:03.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:48:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:03.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:03 np0005588920 podman[315341]: 2026-01-20 15:48:03.993593845 +0000 UTC m=+0.074825475 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:48:04 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:48:05 np0005588920 nova_compute[226886]: 2026-01-20 15:48:05.091 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:48:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:05.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:48:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:05.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:48:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:48:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:07.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:48:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:07.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:08 np0005588920 nova_compute[226886]: 2026-01-20 15:48:08.106 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:48:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:48:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:09.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:09.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:10 np0005588920 nova_compute[226886]: 2026-01-20 15:48:10.092 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:48:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:48:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:11.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:48:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:11.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:13 np0005588920 nova_compute[226886]: 2026-01-20 15:48:13.108 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:48:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:48:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/576372207' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:48:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:48:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/576372207' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:48:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:48:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:13.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:48:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:13.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:48:15 np0005588920 nova_compute[226886]: 2026-01-20 15:48:15.094 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:48:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:48:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:15.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:48:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:15.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:48:16.512 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:48:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:48:16.512 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:48:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:48:16.513 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:48:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:48:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:17.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:48:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:48:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:17.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:48:18 np0005588920 nova_compute[226886]: 2026-01-20 15:48:18.111 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:48:19 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:48:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:19.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:19.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:20 np0005588920 nova_compute[226886]: 2026-01-20 15:48:20.095 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:48:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:21.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:48:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:21.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:48:23 np0005588920 nova_compute[226886]: 2026-01-20 15:48:23.113 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:48:23 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 10:48:23 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6600.1 total, 600.0 interval#012Cumulative writes: 79K writes, 317K keys, 79K commit groups, 1.0 writes per commit group, ingest: 0.32 GB, 0.05 MB/s#012Cumulative WAL: 79K writes, 29K syncs, 2.68 writes per sync, written: 0.32 GB, 0.05 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2654 writes, 10K keys, 2654 commit groups, 1.0 writes per commit group, ingest: 11.22 MB, 0.02 MB/s#012Interval WAL: 2654 writes, 1071 syncs, 2.48 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 20 10:48:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:23.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:48:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:23.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:48:24 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:48:25 np0005588920 nova_compute[226886]: 2026-01-20 15:48:25.097 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:48:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:48:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:25.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:48:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:25.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:25 np0005588920 podman[315361]: 2026-01-20 15:48:25.99749563 +0000 UTC m=+0.075964138 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 20 10:48:27 np0005588920 nova_compute[226886]: 2026-01-20 15:48:27.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:48:27 np0005588920 nova_compute[226886]: 2026-01-20 15:48:27.760 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:48:27 np0005588920 nova_compute[226886]: 2026-01-20 15:48:27.760 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:48:27 np0005588920 nova_compute[226886]: 2026-01-20 15:48:27.760 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:48:27 np0005588920 nova_compute[226886]: 2026-01-20 15:48:27.760 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:48:27 np0005588920 nova_compute[226886]: 2026-01-20 15:48:27.761 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:48:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:27.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:48:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:27.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:48:28 np0005588920 nova_compute[226886]: 2026-01-20 15:48:28.117 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:48:28 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:48:28 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4229310605' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:48:28 np0005588920 nova_compute[226886]: 2026-01-20 15:48:28.224 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:48:28 np0005588920 nova_compute[226886]: 2026-01-20 15:48:28.375 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:48:28 np0005588920 nova_compute[226886]: 2026-01-20 15:48:28.377 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4159MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:48:28 np0005588920 nova_compute[226886]: 2026-01-20 15:48:28.377 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:48:28 np0005588920 nova_compute[226886]: 2026-01-20 15:48:28.377 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:48:28 np0005588920 nova_compute[226886]: 2026-01-20 15:48:28.606 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:48:28 np0005588920 nova_compute[226886]: 2026-01-20 15:48:28.607 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:48:28 np0005588920 nova_compute[226886]: 2026-01-20 15:48:28.630 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:48:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:48:29 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/281312359' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:48:29 np0005588920 nova_compute[226886]: 2026-01-20 15:48:29.069 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:48:29 np0005588920 nova_compute[226886]: 2026-01-20 15:48:29.076 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:48:29 np0005588920 nova_compute[226886]: 2026-01-20 15:48:29.116 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:48:29 np0005588920 nova_compute[226886]: 2026-01-20 15:48:29.120 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:48:29 np0005588920 nova_compute[226886]: 2026-01-20 15:48:29.121 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.743s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:48:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:48:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:29.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:29.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:30 np0005588920 nova_compute[226886]: 2026-01-20 15:48:30.099 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:48:31 np0005588920 nova_compute[226886]: 2026-01-20 15:48:31.122 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:48:31 np0005588920 nova_compute[226886]: 2026-01-20 15:48:31.123 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 20 10:48:31 np0005588920 nova_compute[226886]: 2026-01-20 15:48:31.123 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 20 10:48:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:31.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:31.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:33 np0005588920 nova_compute[226886]: 2026-01-20 15:48:33.121 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:48:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:33.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:33.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:34 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:48:34 np0005588920 nova_compute[226886]: 2026-01-20 15:48:34.347 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 20 10:48:34 np0005588920 nova_compute[226886]: 2026-01-20 15:48:34.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:48:34 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:48:34 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:48:34 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:48:34 np0005588920 podman[315563]: 2026-01-20 15:48:34.967260203 +0000 UTC m=+0.054219468 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:48:35 np0005588920 nova_compute[226886]: 2026-01-20 15:48:35.101 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:48:35 np0005588920 nova_compute[226886]: 2026-01-20 15:48:35.788 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:48:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:35.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:35.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:36 np0005588920 nova_compute[226886]: 2026-01-20 15:48:36.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:48:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:37.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:37.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:38 np0005588920 nova_compute[226886]: 2026-01-20 15:48:38.125 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:48:38 np0005588920 nova_compute[226886]: 2026-01-20 15:48:38.721 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:48:39 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:48:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:39.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:39.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:40 np0005588920 nova_compute[226886]: 2026-01-20 15:48:40.103 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:48:40 np0005588920 nova_compute[226886]: 2026-01-20 15:48:40.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:48:40 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:48:40 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:48:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:41.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:41.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:43 np0005588920 nova_compute[226886]: 2026-01-20 15:48:43.126 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:48:43 np0005588920 nova_compute[226886]: 2026-01-20 15:48:43.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:48:43 np0005588920 nova_compute[226886]: 2026-01-20 15:48:43.727 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 20 10:48:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:43.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:43.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:48:45 np0005588920 nova_compute[226886]: 2026-01-20 15:48:45.106 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:48:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:45.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:48:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:45.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:48:46 np0005588920 nova_compute[226886]: 2026-01-20 15:48:46.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:48:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:48:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:47.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:48:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:48:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:47.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:48:48 np0005588920 nova_compute[226886]: 2026-01-20 15:48:48.132 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:48:49 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:48:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:48:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:49.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:48:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:49.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:50 np0005588920 nova_compute[226886]: 2026-01-20 15:48:50.141 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:48:50 np0005588920 radosgw[83324]: INFO: RGWReshardLock::lock found lock on reshard.0000000001 to be held by another RGW process; skipping for now
Jan 20 10:48:50 np0005588920 radosgw[83324]: INFO: RGWReshardLock::lock found lock on reshard.0000000003 to be held by another RGW process; skipping for now
Jan 20 10:48:50 np0005588920 radosgw[83324]: INFO: RGWReshardLock::lock found lock on reshard.0000000005 to be held by another RGW process; skipping for now
Jan 20 10:48:50 np0005588920 radosgw[83324]: INFO: RGWReshardLock::lock found lock on reshard.0000000007 to be held by another RGW process; skipping for now
Jan 20 10:48:50 np0005588920 radosgw[83324]: INFO: RGWReshardLock::lock found lock on reshard.0000000009 to be held by another RGW process; skipping for now
Jan 20 10:48:50 np0005588920 radosgw[83324]: INFO: RGWReshardLock::lock found lock on reshard.0000000011 to be held by another RGW process; skipping for now
Jan 20 10:48:50 np0005588920 radosgw[83324]: INFO: RGWReshardLock::lock found lock on reshard.0000000013 to be held by another RGW process; skipping for now
Jan 20 10:48:50 np0005588920 radosgw[83324]: INFO: RGWReshardLock::lock found lock on reshard.0000000015 to be held by another RGW process; skipping for now
Jan 20 10:48:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:51.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:51.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:53 np0005588920 nova_compute[226886]: 2026-01-20 15:48:53.177 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:48:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:53.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:53.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:54 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:48:55 np0005588920 nova_compute[226886]: 2026-01-20 15:48:55.174 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:48:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:48:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:55.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:48:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:48:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:55.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:48:57 np0005588920 podman[315635]: 2026-01-20 15:48:57.005023324 +0000 UTC m=+0.086604581 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:48:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:48:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:57.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:48:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:57.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:58 np0005588920 nova_compute[226886]: 2026-01-20 15:48:58.180 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:48:59 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:48:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:48:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:59.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:48:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:48:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:59.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:00 np0005588920 nova_compute[226886]: 2026-01-20 15:49:00.175 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:49:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:01.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:01.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:03 np0005588920 nova_compute[226886]: 2026-01-20 15:49:03.184 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:49:03 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #190. Immutable memtables: 0.
Jan 20 10:49:03 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:49:03.839118) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:49:03 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 121] Flushing memtable with next log file: 190
Jan 20 10:49:03 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768924143839187, "job": 121, "event": "flush_started", "num_memtables": 1, "num_entries": 1348, "num_deletes": 251, "total_data_size": 3019922, "memory_usage": 3056920, "flush_reason": "Manual Compaction"}
Jan 20 10:49:03 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 121] Level-0 flush table #191: started
Jan 20 10:49:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:03.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:03 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768924143862225, "cf_name": "default", "job": 121, "event": "table_file_creation", "file_number": 191, "file_size": 1982070, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 91087, "largest_seqno": 92430, "table_properties": {"data_size": 1976282, "index_size": 3118, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1605, "raw_key_size": 12420, "raw_average_key_size": 19, "raw_value_size": 1964668, "raw_average_value_size": 3153, "num_data_blocks": 139, "num_entries": 623, "num_filter_entries": 623, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768924029, "oldest_key_time": 1768924029, "file_creation_time": 1768924143, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 191, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:49:03 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 121] Flush lasted 23220 microseconds, and 4778 cpu microseconds.
Jan 20 10:49:03 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:49:03 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:49:03.862349) [db/flush_job.cc:967] [default] [JOB 121] Level-0 flush table #191: 1982070 bytes OK
Jan 20 10:49:03 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:49:03.862392) [db/memtable_list.cc:519] [default] Level-0 commit table #191 started
Jan 20 10:49:03 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:49:03.864434) [db/memtable_list.cc:722] [default] Level-0 commit table #191: memtable #1 done
Jan 20 10:49:03 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:49:03.864451) EVENT_LOG_v1 {"time_micros": 1768924143864446, "job": 121, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:49:03 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:49:03.864470) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:49:03 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 121] Try to delete WAL files size 3013583, prev total WAL file size 3013583, number of live WAL files 2.
Jan 20 10:49:03 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000187.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:49:03 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:49:03.865729) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730038303332' seq:72057594037927935, type:22 .. '7061786F730038323834' seq:0, type:0; will stop at (end)
Jan 20 10:49:03 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 122] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:49:03 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 121 Base level 0, inputs: [191(1935KB)], [189(11MB)]
Jan 20 10:49:03 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768924143865800, "job": 122, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [191], "files_L6": [189], "score": -1, "input_data_size": 14355441, "oldest_snapshot_seqno": -1}
Jan 20 10:49:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:03.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:03 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 122] Generated table #192: 10949 keys, 12391265 bytes, temperature: kUnknown
Jan 20 10:49:03 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768924143960401, "cf_name": "default", "job": 122, "event": "table_file_creation", "file_number": 192, "file_size": 12391265, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12323333, "index_size": 39467, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27397, "raw_key_size": 290329, "raw_average_key_size": 26, "raw_value_size": 12134350, "raw_average_value_size": 1108, "num_data_blocks": 1490, "num_entries": 10949, "num_filter_entries": 10949, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768924143, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 192, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:49:03 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:49:03 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:49:03.960655) [db/compaction/compaction_job.cc:1663] [default] [JOB 122] Compacted 1@0 + 1@6 files to L6 => 12391265 bytes
Jan 20 10:49:03 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:49:03.961909) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 151.6 rd, 130.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.9, 11.8 +0.0 blob) out(11.8 +0.0 blob), read-write-amplify(13.5) write-amplify(6.3) OK, records in: 11464, records dropped: 515 output_compression: NoCompression
Jan 20 10:49:03 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:49:03.961928) EVENT_LOG_v1 {"time_micros": 1768924143961919, "job": 122, "event": "compaction_finished", "compaction_time_micros": 94676, "compaction_time_cpu_micros": 28040, "output_level": 6, "num_output_files": 1, "total_output_size": 12391265, "num_input_records": 11464, "num_output_records": 10949, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:49:03 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000191.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:49:03 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768924143962437, "job": 122, "event": "table_file_deletion", "file_number": 191}
Jan 20 10:49:03 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000189.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:49:03 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768924143964755, "job": 122, "event": "table_file_deletion", "file_number": 189}
Jan 20 10:49:03 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:49:03.865565) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:49:03 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:49:03.964837) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:49:03 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:49:03.964843) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:49:03 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:49:03.964845) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:49:03 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:49:03.964847) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:49:03 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:49:03.964849) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:49:04 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:49:05 np0005588920 nova_compute[226886]: 2026-01-20 15:49:05.178 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:49:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:05.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:05.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:05 np0005588920 podman[315662]: 2026-01-20 15:49:05.965230144 +0000 UTC m=+0.048455333 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 20 10:49:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:49:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:07.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:49:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:49:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:07.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:49:08 np0005588920 nova_compute[226886]: 2026-01-20 15:49:08.188 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:49:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:49:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:09.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:09.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:10 np0005588920 nova_compute[226886]: 2026-01-20 15:49:10.179 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:49:11 np0005588920 nova_compute[226886]: 2026-01-20 15:49:11.232 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:49:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:11.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:49:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:11.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:49:13 np0005588920 nova_compute[226886]: 2026-01-20 15:49:13.191 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:49:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:49:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:13.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:49:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:49:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:13.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:49:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:49:15 np0005588920 nova_compute[226886]: 2026-01-20 15:49:15.183 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:49:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:15.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:15.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:49:16.513 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:49:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:49:16.514 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:49:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:49:16.514 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:49:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:17.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:49:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:17.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:49:18 np0005588920 nova_compute[226886]: 2026-01-20 15:49:18.193 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:49:19 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:49:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:19.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:49:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:19.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:49:20 np0005588920 nova_compute[226886]: 2026-01-20 15:49:20.184 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:49:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:21.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:49:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:21.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:49:23 np0005588920 nova_compute[226886]: 2026-01-20 15:49:23.196 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:49:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:23.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:23.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:24 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:49:25 np0005588920 nova_compute[226886]: 2026-01-20 15:49:25.186 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:49:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:49:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:25.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:49:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:49:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:25.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:49:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:49:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:27.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:49:27 np0005588920 podman[315683]: 2026-01-20 15:49:27.973994499 +0000 UTC m=+0.067706492 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 20 10:49:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:49:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:27.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:49:28 np0005588920 nova_compute[226886]: 2026-01-20 15:49:28.197 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:49:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:49:29 np0005588920 nova_compute[226886]: 2026-01-20 15:49:29.727 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:49:29 np0005588920 nova_compute[226886]: 2026-01-20 15:49:29.727 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:49:29 np0005588920 nova_compute[226886]: 2026-01-20 15:49:29.727 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:49:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:29.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:29 np0005588920 nova_compute[226886]: 2026-01-20 15:49:29.930 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:49:29 np0005588920 nova_compute[226886]: 2026-01-20 15:49:29.930 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:49:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:29.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:30 np0005588920 nova_compute[226886]: 2026-01-20 15:49:30.001 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:49:30 np0005588920 nova_compute[226886]: 2026-01-20 15:49:30.002 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:49:30 np0005588920 nova_compute[226886]: 2026-01-20 15:49:30.002 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:49:30 np0005588920 nova_compute[226886]: 2026-01-20 15:49:30.002 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:49:30 np0005588920 nova_compute[226886]: 2026-01-20 15:49:30.003 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:49:30 np0005588920 nova_compute[226886]: 2026-01-20 15:49:30.188 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:49:30 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:49:30 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2906927452' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:49:30 np0005588920 nova_compute[226886]: 2026-01-20 15:49:30.465 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:49:30 np0005588920 nova_compute[226886]: 2026-01-20 15:49:30.601 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:49:30 np0005588920 nova_compute[226886]: 2026-01-20 15:49:30.602 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4148MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:49:30 np0005588920 nova_compute[226886]: 2026-01-20 15:49:30.602 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:49:30 np0005588920 nova_compute[226886]: 2026-01-20 15:49:30.603 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:49:30 np0005588920 nova_compute[226886]: 2026-01-20 15:49:30.759 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:49:30 np0005588920 nova_compute[226886]: 2026-01-20 15:49:30.759 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:49:30 np0005588920 nova_compute[226886]: 2026-01-20 15:49:30.778 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Refreshing inventories for resource provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 20 10:49:30 np0005588920 nova_compute[226886]: 2026-01-20 15:49:30.885 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Updating ProviderTree inventory for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 20 10:49:30 np0005588920 nova_compute[226886]: 2026-01-20 15:49:30.885 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Updating inventory in ProviderTree for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 20 10:49:30 np0005588920 nova_compute[226886]: 2026-01-20 15:49:30.901 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Refreshing aggregate associations for resource provider ff38e91c-3320-4831-90ac-bcffc89ba7b6, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 20 10:49:30 np0005588920 nova_compute[226886]: 2026-01-20 15:49:30.923 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Refreshing trait associations for resource provider ff38e91c-3320-4831-90ac-bcffc89ba7b6, traits: COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_DEVICE_TAGGING,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 20 10:49:30 np0005588920 nova_compute[226886]: 2026-01-20 15:49:30.946 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:49:31 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:49:31 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3255063098' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:49:31 np0005588920 nova_compute[226886]: 2026-01-20 15:49:31.355 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.409s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:49:31 np0005588920 nova_compute[226886]: 2026-01-20 15:49:31.360 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:49:31 np0005588920 nova_compute[226886]: 2026-01-20 15:49:31.448 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:49:31 np0005588920 nova_compute[226886]: 2026-01-20 15:49:31.450 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:49:31 np0005588920 nova_compute[226886]: 2026-01-20 15:49:31.450 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.847s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:49:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:49:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:31.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:49:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:31.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:33 np0005588920 nova_compute[226886]: 2026-01-20 15:49:33.200 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:49:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:33.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:33.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:34 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:49:35 np0005588920 nova_compute[226886]: 2026-01-20 15:49:35.191 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:49:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:35.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:36.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:36 np0005588920 podman[315754]: 2026-01-20 15:49:36.993721557 +0000 UTC m=+0.077053709 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Jan 20 10:49:37 np0005588920 nova_compute[226886]: 2026-01-20 15:49:37.245 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:49:37 np0005588920 nova_compute[226886]: 2026-01-20 15:49:37.246 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:49:37 np0005588920 nova_compute[226886]: 2026-01-20 15:49:37.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:49:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:37.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:38 np0005588920 nova_compute[226886]: 2026-01-20 15:49:38.203 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:49:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:38.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:39 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:49:39 np0005588920 nova_compute[226886]: 2026-01-20 15:49:39.720 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:49:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:39.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:40 np0005588920 nova_compute[226886]: 2026-01-20 15:49:40.192 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:49:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:40.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:41.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:42 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:49:42 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:49:42 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:49:42 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:49:42 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:49:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:42.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:42 np0005588920 nova_compute[226886]: 2026-01-20 15:49:42.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:49:43 np0005588920 nova_compute[226886]: 2026-01-20 15:49:43.206 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:49:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:43.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:49:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:44.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:45 np0005588920 nova_compute[226886]: 2026-01-20 15:49:45.198 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:49:45 np0005588920 nova_compute[226886]: 2026-01-20 15:49:45.721 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:49:45 np0005588920 nova_compute[226886]: 2026-01-20 15:49:45.747 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:49:45 np0005588920 nova_compute[226886]: 2026-01-20 15:49:45.747 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:49:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:45.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:46.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:47 np0005588920 nova_compute[226886]: 2026-01-20 15:49:47.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:49:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:49:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:47.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:49:48 np0005588920 nova_compute[226886]: 2026-01-20 15:49:48.210 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:49:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:48.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:49 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:49:49 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:49:49 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:49:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:49.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:50 np0005588920 nova_compute[226886]: 2026-01-20 15:49:50.198 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:49:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:50.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:51.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:52.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:53 np0005588920 nova_compute[226886]: 2026-01-20 15:49:53.212 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:49:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:53.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:54 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:49:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:54.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:55 np0005588920 nova_compute[226886]: 2026-01-20 15:49:55.199 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:49:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:55.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:56.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:49:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:57.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:49:58 np0005588920 nova_compute[226886]: 2026-01-20 15:49:58.216 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:49:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:58.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:58 np0005588920 nova_compute[226886]: 2026-01-20 15:49:58.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:49:59 np0005588920 podman[315955]: 2026-01-20 15:49:59.076216813 +0000 UTC m=+0.157918406 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 20 10:49:59 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:49:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:49:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:59.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:00 np0005588920 nova_compute[226886]: 2026-01-20 15:50:00.201 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:50:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:00.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:00 np0005588920 ceph-mon[77148]: overall HEALTH_OK
Jan 20 10:50:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:50:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:01.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:50:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:02.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:03 np0005588920 nova_compute[226886]: 2026-01-20 15:50:03.217 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:50:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:50:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:03.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:50:04 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:50:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:04.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:05 np0005588920 nova_compute[226886]: 2026-01-20 15:50:05.202 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:50:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:05.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:06.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:07.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:07 np0005588920 podman[315986]: 2026-01-20 15:50:07.979006015 +0000 UTC m=+0.069867384 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:50:08 np0005588920 nova_compute[226886]: 2026-01-20 15:50:08.220 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:50:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:08.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:50:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:50:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:09.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:50:10 np0005588920 nova_compute[226886]: 2026-01-20 15:50:10.204 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:50:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:10.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:11.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:12.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:13 np0005588920 nova_compute[226886]: 2026-01-20 15:50:13.222 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:50:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:13.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:50:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:14.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:15 np0005588920 nova_compute[226886]: 2026-01-20 15:50:15.207 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:50:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:15.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:50:16.515 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:50:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:50:16.515 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:50:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:50:16.515 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:50:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:16.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:17 np0005588920 nova_compute[226886]: 2026-01-20 15:50:17.740 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:50:17 np0005588920 nova_compute[226886]: 2026-01-20 15:50:17.741 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 20 10:50:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:50:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:17.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:50:18 np0005588920 nova_compute[226886]: 2026-01-20 15:50:18.226 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:50:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:18.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:19 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:50:19 np0005588920 nova_compute[226886]: 2026-01-20 15:50:19.744 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:50:19 np0005588920 nova_compute[226886]: 2026-01-20 15:50:19.745 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 20 10:50:19 np0005588920 nova_compute[226886]: 2026-01-20 15:50:19.772 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 20 10:50:19 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:19 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:19 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:19.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:20 np0005588920 nova_compute[226886]: 2026-01-20 15:50:20.209 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:50:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:20.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:21 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:21 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:21 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:21.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:22.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:23 np0005588920 nova_compute[226886]: 2026-01-20 15:50:23.229 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:50:23 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:23 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:23 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:23.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:24 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:50:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:24.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:25 np0005588920 nova_compute[226886]: 2026-01-20 15:50:25.211 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:50:25 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:25 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:25 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:25.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:50:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:26.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:50:27 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:27 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:27 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:27.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:28 np0005588920 nova_compute[226886]: 2026-01-20 15:50:28.232 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:50:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:28.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:50:29 np0005588920 nova_compute[226886]: 2026-01-20 15:50:29.753 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:50:29 np0005588920 nova_compute[226886]: 2026-01-20 15:50:29.754 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:50:29 np0005588920 nova_compute[226886]: 2026-01-20 15:50:29.754 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:50:29 np0005588920 nova_compute[226886]: 2026-01-20 15:50:29.789 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:50:29 np0005588920 nova_compute[226886]: 2026-01-20 15:50:29.789 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:50:29 np0005588920 nova_compute[226886]: 2026-01-20 15:50:29.828 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:50:29 np0005588920 nova_compute[226886]: 2026-01-20 15:50:29.829 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:50:29 np0005588920 nova_compute[226886]: 2026-01-20 15:50:29.829 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:50:29 np0005588920 nova_compute[226886]: 2026-01-20 15:50:29.829 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:50:29 np0005588920 nova_compute[226886]: 2026-01-20 15:50:29.829 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:50:29 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:29 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:29 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:29.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:30 np0005588920 podman[316006]: 2026-01-20 15:50:30.014016775 +0000 UTC m=+0.098109679 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 20 10:50:30 np0005588920 nova_compute[226886]: 2026-01-20 15:50:30.211 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:50:30 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:50:30 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3445212458' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:50:30 np0005588920 nova_compute[226886]: 2026-01-20 15:50:30.298 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:50:30 np0005588920 nova_compute[226886]: 2026-01-20 15:50:30.449 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:50:30 np0005588920 nova_compute[226886]: 2026-01-20 15:50:30.450 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4150MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:50:30 np0005588920 nova_compute[226886]: 2026-01-20 15:50:30.450 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:50:30 np0005588920 nova_compute[226886]: 2026-01-20 15:50:30.450 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:50:30 np0005588920 nova_compute[226886]: 2026-01-20 15:50:30.557 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:50:30 np0005588920 nova_compute[226886]: 2026-01-20 15:50:30.558 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:50:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:30.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:30 np0005588920 nova_compute[226886]: 2026-01-20 15:50:30.601 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:50:30 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:50:30 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2651699864' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:50:31 np0005588920 nova_compute[226886]: 2026-01-20 15:50:31.016 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.415s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:50:31 np0005588920 nova_compute[226886]: 2026-01-20 15:50:31.022 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:50:31 np0005588920 nova_compute[226886]: 2026-01-20 15:50:31.039 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:50:31 np0005588920 nova_compute[226886]: 2026-01-20 15:50:31.041 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:50:31 np0005588920 nova_compute[226886]: 2026-01-20 15:50:31.041 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.591s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:50:31 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:31 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:50:31 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:31.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:50:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:32.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:33 np0005588920 nova_compute[226886]: 2026-01-20 15:50:33.267 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:50:33 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:33 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:33 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:33.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:34 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:50:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:34.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:35 np0005588920 nova_compute[226886]: 2026-01-20 15:50:35.214 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:50:35 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:35 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:35 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:35.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:50:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:36.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:50:37 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:37 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:50:37 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:37.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:50:38 np0005588920 nova_compute[226886]: 2026-01-20 15:50:38.270 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:50:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:50:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:38.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:50:38 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:50:38 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1285300892' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:50:38 np0005588920 podman[316075]: 2026-01-20 15:50:38.974863446 +0000 UTC m=+0.061736662 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 20 10:50:38 np0005588920 nova_compute[226886]: 2026-01-20 15:50:38.977 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:50:38 np0005588920 nova_compute[226886]: 2026-01-20 15:50:38.977 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:50:38 np0005588920 nova_compute[226886]: 2026-01-20 15:50:38.978 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:50:39 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:50:39 np0005588920 nova_compute[226886]: 2026-01-20 15:50:39.721 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:50:39 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:39 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:39 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:39.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:40 np0005588920 nova_compute[226886]: 2026-01-20 15:50:40.216 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:50:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:40.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:41 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:41 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:50:41 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:41.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:50:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:42.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:42 np0005588920 nova_compute[226886]: 2026-01-20 15:50:42.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:50:43 np0005588920 nova_compute[226886]: 2026-01-20 15:50:43.320 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:50:43 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:43 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:43 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:43.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:50:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:44.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:45 np0005588920 nova_compute[226886]: 2026-01-20 15:50:45.219 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:50:45 np0005588920 nova_compute[226886]: 2026-01-20 15:50:45.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:50:45 np0005588920 nova_compute[226886]: 2026-01-20 15:50:45.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:50:45 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:45 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:45 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:45.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:46.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:47 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:47 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:47 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:47.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:48 np0005588920 nova_compute[226886]: 2026-01-20 15:50:48.339 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:50:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:48.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:48 np0005588920 nova_compute[226886]: 2026-01-20 15:50:48.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:50:49 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:50:49 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:49 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:50:49 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:49.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:50:50 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:50:50 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:50:50 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:50:50 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:50:50 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 20 10:50:50 np0005588920 nova_compute[226886]: 2026-01-20 15:50:50.220 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:50:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:50.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:51 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 20 10:50:51 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:50:51 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:50:51 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:50:51 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:51 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:51 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:51.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:52.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:53 np0005588920 nova_compute[226886]: 2026-01-20 15:50:53.377 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:50:53 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:53 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:50:53 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:53.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:50:54 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:50:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:50:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:54.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:50:55 np0005588920 nova_compute[226886]: 2026-01-20 15:50:55.222 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:50:55 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:55 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:50:55 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:55.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:50:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:56.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:56 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:50:56 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:50:57 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:57 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:57 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:57.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:58 np0005588920 nova_compute[226886]: 2026-01-20 15:50:58.410 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:50:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:58.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:59 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:50:59 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:50:59 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:50:59 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:59.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:51:00 np0005588920 nova_compute[226886]: 2026-01-20 15:51:00.225 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:51:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:00.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:01 np0005588920 podman[316399]: 2026-01-20 15:51:01.063896827 +0000 UTC m=+0.142779033 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 20 10:51:01 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:01 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:01 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:01.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:01 np0005588920 nova_compute[226886]: 2026-01-20 15:51:01.997 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:51:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:51:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:02.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:51:03 np0005588920 nova_compute[226886]: 2026-01-20 15:51:03.414 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:51:03 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:03 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:03 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:03.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:04 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:51:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:51:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:04.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:51:05 np0005588920 nova_compute[226886]: 2026-01-20 15:51:05.226 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:51:05 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:05 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:05 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:05.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:06.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:07 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:07 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:07 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:07.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:08 np0005588920 nova_compute[226886]: 2026-01-20 15:51:08.454 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:51:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:08.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:51:09 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:09 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:51:09 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:09.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:51:10 np0005588920 podman[316425]: 2026-01-20 15:51:10.028394101 +0000 UTC m=+0.102996879 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 20 10:51:10 np0005588920 nova_compute[226886]: 2026-01-20 15:51:10.227 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:51:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:51:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:10.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:51:11 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:11 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:51:11 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:11.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:51:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:12.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:13 np0005588920 nova_compute[226886]: 2026-01-20 15:51:13.510 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:51:13 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:13 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:51:13 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:13.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:51:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:51:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:51:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:14.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:51:15 np0005588920 nova_compute[226886]: 2026-01-20 15:51:15.230 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:51:15 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:15 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:15 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:15.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:51:16.517 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:51:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:51:16.517 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:51:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:51:16.517 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:51:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:16.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:17 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:17 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:17 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:17.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:18 np0005588920 nova_compute[226886]: 2026-01-20 15:51:18.513 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:51:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:18.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:19 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:51:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:20.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:20 np0005588920 nova_compute[226886]: 2026-01-20 15:51:20.234 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:51:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:51:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:20.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:51:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:22.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:22.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:23 np0005588920 nova_compute[226886]: 2026-01-20 15:51:23.545 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:51:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:24.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:24 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:51:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:24.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:25 np0005588920 nova_compute[226886]: 2026-01-20 15:51:25.237 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:51:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:26.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:51:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:26.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:51:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:28.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:28 np0005588920 nova_compute[226886]: 2026-01-20 15:51:28.550 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:51:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:28.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:51:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:30.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:30 np0005588920 nova_compute[226886]: 2026-01-20 15:51:30.239 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:51:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:30.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:31 np0005588920 nova_compute[226886]: 2026-01-20 15:51:31.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:51:31 np0005588920 nova_compute[226886]: 2026-01-20 15:51:31.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:51:31 np0005588920 nova_compute[226886]: 2026-01-20 15:51:31.725 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:51:31 np0005588920 nova_compute[226886]: 2026-01-20 15:51:31.744 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:51:31 np0005588920 nova_compute[226886]: 2026-01-20 15:51:31.744 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:51:31 np0005588920 nova_compute[226886]: 2026-01-20 15:51:31.779 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:51:31 np0005588920 nova_compute[226886]: 2026-01-20 15:51:31.780 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:51:31 np0005588920 nova_compute[226886]: 2026-01-20 15:51:31.780 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:51:31 np0005588920 nova_compute[226886]: 2026-01-20 15:51:31.780 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:51:31 np0005588920 nova_compute[226886]: 2026-01-20 15:51:31.780 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:51:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:32.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:32 np0005588920 podman[316447]: 2026-01-20 15:51:32.033328694 +0000 UTC m=+0.109995719 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 20 10:51:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:51:32 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3335256768' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:51:32 np0005588920 nova_compute[226886]: 2026-01-20 15:51:32.291 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.510s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:51:32 np0005588920 nova_compute[226886]: 2026-01-20 15:51:32.500 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:51:32 np0005588920 nova_compute[226886]: 2026-01-20 15:51:32.501 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4140MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:51:32 np0005588920 nova_compute[226886]: 2026-01-20 15:51:32.501 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:51:32 np0005588920 nova_compute[226886]: 2026-01-20 15:51:32.502 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:51:32 np0005588920 nova_compute[226886]: 2026-01-20 15:51:32.568 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:51:32 np0005588920 nova_compute[226886]: 2026-01-20 15:51:32.568 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:51:32 np0005588920 nova_compute[226886]: 2026-01-20 15:51:32.581 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:51:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:32.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:33 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:51:33 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/814707366' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:51:33 np0005588920 nova_compute[226886]: 2026-01-20 15:51:33.025 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:51:33 np0005588920 nova_compute[226886]: 2026-01-20 15:51:33.031 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:51:33 np0005588920 nova_compute[226886]: 2026-01-20 15:51:33.046 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:51:33 np0005588920 nova_compute[226886]: 2026-01-20 15:51:33.048 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:51:33 np0005588920 nova_compute[226886]: 2026-01-20 15:51:33.048 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.547s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:51:33 np0005588920 nova_compute[226886]: 2026-01-20 15:51:33.553 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:51:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:34.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:34 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:51:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:51:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:34.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:51:35 np0005588920 nova_compute[226886]: 2026-01-20 15:51:35.241 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:51:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:36.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:36.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:38.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:38 np0005588920 nova_compute[226886]: 2026-01-20 15:51:38.556 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:51:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:38.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:39 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:51:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:40.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:40 np0005588920 nova_compute[226886]: 2026-01-20 15:51:40.029 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:51:40 np0005588920 nova_compute[226886]: 2026-01-20 15:51:40.030 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:51:40 np0005588920 nova_compute[226886]: 2026-01-20 15:51:40.243 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:51:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:51:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:40.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:51:40 np0005588920 nova_compute[226886]: 2026-01-20 15:51:40.722 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:51:40 np0005588920 nova_compute[226886]: 2026-01-20 15:51:40.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:51:40 np0005588920 podman[316517]: 2026-01-20 15:51:40.994920684 +0000 UTC m=+0.074937119 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Jan 20 10:51:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:42.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:42.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:43 np0005588920 nova_compute[226886]: 2026-01-20 15:51:43.559 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:51:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:51:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:44.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:51:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:51:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:44.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:44 np0005588920 nova_compute[226886]: 2026-01-20 15:51:44.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:51:45 np0005588920 nova_compute[226886]: 2026-01-20 15:51:45.245 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:51:45 np0005588920 nova_compute[226886]: 2026-01-20 15:51:45.720 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:51:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:51:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:46.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:51:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:46.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:47 np0005588920 nova_compute[226886]: 2026-01-20 15:51:47.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:51:47 np0005588920 nova_compute[226886]: 2026-01-20 15:51:47.726 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:51:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:48.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:48 np0005588920 nova_compute[226886]: 2026-01-20 15:51:48.562 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:51:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:51:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:48.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:51:48 np0005588920 nova_compute[226886]: 2026-01-20 15:51:48.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:51:49 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:51:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:50.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:50 np0005588920 nova_compute[226886]: 2026-01-20 15:51:50.246 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:51:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:50.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:52.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:51:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:52.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:51:53 np0005588920 nova_compute[226886]: 2026-01-20 15:51:53.566 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:51:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:54.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:54 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:51:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:54.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:55 np0005588920 nova_compute[226886]: 2026-01-20 15:51:55.248 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:51:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:51:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:56.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:51:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:56.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:58.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:58 np0005588920 nova_compute[226886]: 2026-01-20 15:51:58.570 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:51:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:51:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:58.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:58 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 20 10:51:58 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:51:58 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:51:58 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:51:59 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:52:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:52:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:00.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:52:00 np0005588920 nova_compute[226886]: 2026-01-20 15:52:00.251 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:52:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:00.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:02.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:02.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:03 np0005588920 podman[316667]: 2026-01-20 15:52:03.011259911 +0000 UTC m=+0.095175785 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, 
org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 20 10:52:03 np0005588920 nova_compute[226886]: 2026-01-20 15:52:03.581 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:52:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:52:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:04.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:52:04 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:52:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:04.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:04 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:52:04 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:52:05 np0005588920 nova_compute[226886]: 2026-01-20 15:52:05.253 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:52:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:06.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:52:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:06.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:52:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.002000057s ======
Jan 20 10:52:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:08.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Jan 20 10:52:08 np0005588920 nova_compute[226886]: 2026-01-20 15:52:08.584 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:52:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:08.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:52:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:10.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:10 np0005588920 nova_compute[226886]: 2026-01-20 15:52:10.254 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:52:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:10.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:11 np0005588920 podman[316746]: 2026-01-20 15:52:11.988980502 +0000 UTC m=+0.070140122 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 20 10:52:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:12.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:12.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:13 np0005588920 nova_compute[226886]: 2026-01-20 15:52:13.587 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:52:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:14.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:52:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:14.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:15 np0005588920 nova_compute[226886]: 2026-01-20 15:52:15.256 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:52:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:16.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:52:16.518 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:52:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:52:16.519 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:52:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:52:16.519 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:52:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:16.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:18.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:18 np0005588920 nova_compute[226886]: 2026-01-20 15:52:18.590 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:52:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:52:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:18.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:52:19 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:52:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:20.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:20 np0005588920 nova_compute[226886]: 2026-01-20 15:52:20.258 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:52:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:20.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:22.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:22.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:23 np0005588920 nova_compute[226886]: 2026-01-20 15:52:23.594 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:52:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:24.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:24 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:52:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:24.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:25 np0005588920 nova_compute[226886]: 2026-01-20 15:52:25.260 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:52:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:26.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:26.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:52:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:28.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:52:28 np0005588920 nova_compute[226886]: 2026-01-20 15:52:28.598 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:52:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:28.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:29 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:52:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:30.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:30 np0005588920 nova_compute[226886]: 2026-01-20 15:52:30.260 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:52:30 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:30 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:30 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:30.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:31 np0005588920 nova_compute[226886]: 2026-01-20 15:52:31.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:52:31 np0005588920 nova_compute[226886]: 2026-01-20 15:52:31.762 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:52:31 np0005588920 nova_compute[226886]: 2026-01-20 15:52:31.762 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:52:31 np0005588920 nova_compute[226886]: 2026-01-20 15:52:31.762 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:52:31 np0005588920 nova_compute[226886]: 2026-01-20 15:52:31.762 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:52:31 np0005588920 nova_compute[226886]: 2026-01-20 15:52:31.762 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:52:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:32.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:32 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:52:32 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2680723541' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:52:32 np0005588920 nova_compute[226886]: 2026-01-20 15:52:32.214 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:52:32 np0005588920 nova_compute[226886]: 2026-01-20 15:52:32.370 226890 WARNING nova.virt.libvirt.driver [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:52:32 np0005588920 nova_compute[226886]: 2026-01-20 15:52:32.372 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4129MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:52:32 np0005588920 nova_compute[226886]: 2026-01-20 15:52:32.372 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:52:32 np0005588920 nova_compute[226886]: 2026-01-20 15:52:32.372 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:52:32 np0005588920 nova_compute[226886]: 2026-01-20 15:52:32.441 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:52:32 np0005588920 nova_compute[226886]: 2026-01-20 15:52:32.442 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:52:32 np0005588920 nova_compute[226886]: 2026-01-20 15:52:32.599 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:52:32 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:32 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:52:32 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:32.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:52:33 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:52:33 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3402365577' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:52:33 np0005588920 nova_compute[226886]: 2026-01-20 15:52:33.113 226890 DEBUG oslo_concurrency.processutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.515s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:52:33 np0005588920 nova_compute[226886]: 2026-01-20 15:52:33.121 226890 DEBUG nova.compute.provider_tree [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed in ProviderTree for provider: ff38e91c-3320-4831-90ac-bcffc89ba7b6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:52:33 np0005588920 nova_compute[226886]: 2026-01-20 15:52:33.144 226890 DEBUG nova.scheduler.client.report [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Inventory has not changed for provider ff38e91c-3320-4831-90ac-bcffc89ba7b6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:52:33 np0005588920 nova_compute[226886]: 2026-01-20 15:52:33.145 226890 DEBUG nova.compute.resource_tracker [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:52:33 np0005588920 nova_compute[226886]: 2026-01-20 15:52:33.146 226890 DEBUG oslo_concurrency.lockutils [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.773s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:52:33 np0005588920 nova_compute[226886]: 2026-01-20 15:52:33.600 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:52:33 np0005588920 podman[316809]: 2026-01-20 15:52:33.992178696 +0000 UTC m=+0.082474093 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:52:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:34.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:34 np0005588920 nova_compute[226886]: 2026-01-20 15:52:34.145 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:52:34 np0005588920 nova_compute[226886]: 2026-01-20 15:52:34.146 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:52:34 np0005588920 nova_compute[226886]: 2026-01-20 15:52:34.146 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:52:34 np0005588920 nova_compute[226886]: 2026-01-20 15:52:34.164 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:52:34 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:52:34 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:34 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:34 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:34.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:35 np0005588920 nova_compute[226886]: 2026-01-20 15:52:35.297 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:52:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:36.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:36 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:36 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:36 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:36.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:38.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:38 np0005588920 nova_compute[226886]: 2026-01-20 15:52:38.604 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:52:38 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:38 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:38 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:38.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:39 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:52:39 np0005588920 nova_compute[226886]: 2026-01-20 15:52:39.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:52:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:52:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:40.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:52:40 np0005588920 nova_compute[226886]: 2026-01-20 15:52:40.298 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:52:40 np0005588920 nova_compute[226886]: 2026-01-20 15:52:40.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:52:40 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:40 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:40 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:40.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:42.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:42 np0005588920 nova_compute[226886]: 2026-01-20 15:52:42.721 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:52:42 np0005588920 nova_compute[226886]: 2026-01-20 15:52:42.724 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:52:42 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:42 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:42 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:42.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:43 np0005588920 podman[316836]: 2026-01-20 15:52:43.002481077 +0000 UTC m=+0.076594115 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 20 10:52:43 np0005588920 nova_compute[226886]: 2026-01-20 15:52:43.608 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:52:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:52:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:44.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:52:44 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:52:44 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #193. Immutable memtables: 0.
Jan 20 10:52:44 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:52:44.459086) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:52:44 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:856] [default] [JOB 123] Flushing memtable with next log file: 193
Jan 20 10:52:44 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768924364459143, "job": 123, "event": "flush_started", "num_memtables": 1, "num_entries": 2347, "num_deletes": 251, "total_data_size": 5826467, "memory_usage": 5895088, "flush_reason": "Manual Compaction"}
Jan 20 10:52:44 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:885] [default] [JOB 123] Level-0 flush table #194: started
Jan 20 10:52:44 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768924364499463, "cf_name": "default", "job": 123, "event": "table_file_creation", "file_number": 194, "file_size": 3812952, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 92435, "largest_seqno": 94777, "table_properties": {"data_size": 3803442, "index_size": 6003, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 19248, "raw_average_key_size": 20, "raw_value_size": 3784588, "raw_average_value_size": 3992, "num_data_blocks": 262, "num_entries": 948, "num_filter_entries": 948, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768924145, "oldest_key_time": 1768924145, "file_creation_time": 1768924364, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 194, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:52:44 np0005588920 ceph-mon[77148]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 123] Flush lasted 40429 microseconds, and 11491 cpu microseconds.
Jan 20 10:52:44 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:52:44 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:52:44.499515) [db/flush_job.cc:967] [default] [JOB 123] Level-0 flush table #194: 3812952 bytes OK
Jan 20 10:52:44 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:52:44.499537) [db/memtable_list.cc:519] [default] Level-0 commit table #194 started
Jan 20 10:52:44 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:52:44.501819) [db/memtable_list.cc:722] [default] Level-0 commit table #194: memtable #1 done
Jan 20 10:52:44 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:52:44.501870) EVENT_LOG_v1 {"time_micros": 1768924364501859, "job": 123, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:52:44 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:52:44.501899) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:52:44 np0005588920 ceph-mon[77148]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 123] Try to delete WAL files size 5816207, prev total WAL file size 5816207, number of live WAL files 2.
Jan 20 10:52:44 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000190.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:52:44 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:52:44.504262) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730038323833' seq:72057594037927935, type:22 .. '7061786F730038353335' seq:0, type:0; will stop at (end)
Jan 20 10:52:44 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 124] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:52:44 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 123 Base level 0, inputs: [194(3723KB)], [192(11MB)]
Jan 20 10:52:44 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768924364504303, "job": 124, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [194], "files_L6": [192], "score": -1, "input_data_size": 16204217, "oldest_snapshot_seqno": -1}
Jan 20 10:52:44 np0005588920 ceph-mon[77148]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 124] Generated table #195: 11378 keys, 14207606 bytes, temperature: kUnknown
Jan 20 10:52:44 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768924364668935, "cf_name": "default", "job": 124, "event": "table_file_creation", "file_number": 195, "file_size": 14207606, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14135258, "index_size": 42795, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 28485, "raw_key_size": 299899, "raw_average_key_size": 26, "raw_value_size": 13937284, "raw_average_value_size": 1224, "num_data_blocks": 1628, "num_entries": 11378, "num_filter_entries": 11378, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917472, "oldest_key_time": 0, "file_creation_time": 1768924364, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3345eece-ed87-47a3-81a4-4a6b71655d31", "db_session_id": "2UAGHK3PX46HCRM7QXU4", "orig_file_number": 195, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:52:44 np0005588920 ceph-mon[77148]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:52:44 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:52:44.669256) [db/compaction/compaction_job.cc:1663] [default] [JOB 124] Compacted 1@0 + 1@6 files to L6 => 14207606 bytes
Jan 20 10:52:44 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:52:44.670942) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 98.4 rd, 86.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.6, 11.8 +0.0 blob) out(13.5 +0.0 blob), read-write-amplify(8.0) write-amplify(3.7) OK, records in: 11897, records dropped: 519 output_compression: NoCompression
Jan 20 10:52:44 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:52:44.670956) EVENT_LOG_v1 {"time_micros": 1768924364670949, "job": 124, "event": "compaction_finished", "compaction_time_micros": 164692, "compaction_time_cpu_micros": 57124, "output_level": 6, "num_output_files": 1, "total_output_size": 14207606, "num_input_records": 11897, "num_output_records": 11378, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:52:44 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000194.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:52:44 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768924364671729, "job": 124, "event": "table_file_deletion", "file_number": 194}
Jan 20 10:52:44 np0005588920 ceph-mon[77148]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000192.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:52:44 np0005588920 ceph-mon[77148]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768924364673843, "job": 124, "event": "table_file_deletion", "file_number": 192}
Jan 20 10:52:44 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:52:44.504138) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:52:44 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:52:44.673950) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:52:44 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:52:44.673958) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:52:44 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:52:44.673961) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:52:44 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:52:44.673964) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:52:44 np0005588920 ceph-mon[77148]: rocksdb: (Original Log Time 2026/01/20-15:52:44.673967) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:52:44 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:44 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:44 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:44.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:45 np0005588920 nova_compute[226886]: 2026-01-20 15:52:45.299 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:52:45 np0005588920 nova_compute[226886]: 2026-01-20 15:52:45.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:52:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:52:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:46.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:52:46 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:46 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:46 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:46.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:47 np0005588920 nova_compute[226886]: 2026-01-20 15:52:47.725 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:52:47 np0005588920 nova_compute[226886]: 2026-01-20 15:52:47.726 226890 DEBUG nova.compute.manager [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:52:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:48.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:48 np0005588920 nova_compute[226886]: 2026-01-20 15:52:48.612 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:52:48 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:48 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:52:48 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:48.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:52:49 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:52:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:52:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:50.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:52:50 np0005588920 nova_compute[226886]: 2026-01-20 15:52:50.302 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:52:50 np0005588920 nova_compute[226886]: 2026-01-20 15:52:50.726 226890 DEBUG oslo_service.periodic_task [None req-17081a2b-ea44-416e-8009-79ec2b746ed6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:52:50 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:50 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:50 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:50.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:52.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:52 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:52 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:52:52 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:52.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:52:53 np0005588920 nova_compute[226886]: 2026-01-20 15:52:53.615 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:52:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:54.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:54 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:52:54 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:54 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:52:54 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:54.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:52:55 np0005588920 nova_compute[226886]: 2026-01-20 15:52:55.305 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:52:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:52:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:56.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:52:56 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:56 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:52:56 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:56.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:52:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:58.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:58 np0005588920 nova_compute[226886]: 2026-01-20 15:52:58.620 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:52:58 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:52:58 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:58 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:58.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:59 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:53:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:53:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:53:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:53:00.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:53:00 np0005588920 nova_compute[226886]: 2026-01-20 15:53:00.307 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:53:00 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:53:00 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:53:00 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:53:00.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:53:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:53:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:53:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:53:02.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:53:02 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:53:02 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:53:02 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:53:02.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:53:03 np0005588920 systemd-logind[783]: New session 60 of user zuul.
Jan 20 10:53:03 np0005588920 systemd[1]: Started Session 60 of User zuul.
Jan 20 10:53:03 np0005588920 nova_compute[226886]: 2026-01-20 15:53:03.621 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:53:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:53:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:53:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:53:04.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:53:04 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:53:04 np0005588920 podman[316919]: 2026-01-20 15:53:04.517024402 +0000 UTC m=+0.150144654 container health_status 29fd8016df067555c2beca3a6a22d9a2f4a25504a0674b6f7a9f82b021a199bc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 20 10:53:04 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:53:04 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:53:04 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:53:04.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:53:05 np0005588920 podman[317156]: 2026-01-20 15:53:05.1602797 +0000 UTC m=+0.070167073 container exec 6d4eaf8659f13902200468a4ae46f22a0824452b9353ff1491898f5d3b0130c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mon-compute-2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 20 10:53:05 np0005588920 podman[317156]: 2026-01-20 15:53:05.265647135 +0000 UTC m=+0.175534488 container exec_died 6d4eaf8659f13902200468a4ae46f22a0824452b9353ff1491898f5d3b0130c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mon-compute-2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, CEPH_REF=reef)
Jan 20 10:53:05 np0005588920 nova_compute[226886]: 2026-01-20 15:53:05.309 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:53:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:53:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:53:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:53:06.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:53:06 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:53:06 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:53:06 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:53:06 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:53:06.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:53:06 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "status"} v 0) v1
Jan 20 10:53:06 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3700717351' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 20 10:53:07 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:53:07 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:53:07 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:53:07 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:53:07 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:53:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:53:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:53:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:53:08.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:53:08 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:53:08 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:53:08 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:53:08 np0005588920 nova_compute[226886]: 2026-01-20 15:53:08.669 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:53:08 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:53:08 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:53:08 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:53:08.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:53:09 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:53:09 np0005588920 ovs-vsctl[317592]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Jan 20 10:53:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:53:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:53:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:53:10.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:53:10 np0005588920 nova_compute[226886]: 2026-01-20 15:53:10.309 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:53:10 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:53:10 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:53:10 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:53:10.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:53:10 np0005588920 virtqemud[226436]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Jan 20 10:53:10 np0005588920 virtqemud[226436]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Jan 20 10:53:11 np0005588920 virtqemud[226436]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 20 10:53:11 np0005588920 ceph-mds[83715]: mds.cephfs.compute-2.jyxktq asok_command: cache status {prefix=cache status} (starting...)
Jan 20 10:53:11 np0005588920 lvm[317924]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 20 10:53:11 np0005588920 lvm[317924]: VG ceph_vg0 finished
Jan 20 10:53:11 np0005588920 ceph-mds[83715]: mds.cephfs.compute-2.jyxktq asok_command: client ls {prefix=client ls} (starting...)
Jan 20 10:53:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:53:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:53:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:53:12.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:53:12 np0005588920 ceph-mds[83715]: mds.cephfs.compute-2.jyxktq asok_command: damage ls {prefix=damage ls} (starting...)
Jan 20 10:53:12 np0005588920 ceph-mds[83715]: mds.cephfs.compute-2.jyxktq asok_command: dump loads {prefix=dump loads} (starting...)
Jan 20 10:53:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "report"} v 0) v1
Jan 20 10:53:12 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4190772805' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 20 10:53:12 np0005588920 ceph-mds[83715]: mds.cephfs.compute-2.jyxktq asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Jan 20 10:53:12 np0005588920 ceph-mds[83715]: mds.cephfs.compute-2.jyxktq asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Jan 20 10:53:12 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:53:12 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:53:12 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:53:12.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:53:12 np0005588920 ceph-mds[83715]: mds.cephfs.compute-2.jyxktq asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Jan 20 10:53:12 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Jan 20 10:53:12 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1906101247' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 20 10:53:12 np0005588920 ceph-mds[83715]: mds.cephfs.compute-2.jyxktq asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Jan 20 10:53:13 np0005588920 ceph-mds[83715]: mds.cephfs.compute-2.jyxktq asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Jan 20 10:53:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config log"} v 0) v1
Jan 20 10:53:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/713696760' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Jan 20 10:53:13 np0005588920 ceph-mds[83715]: mds.cephfs.compute-2.jyxktq asok_command: get subtrees {prefix=get subtrees} (starting...)
Jan 20 10:53:13 np0005588920 ceph-mds[83715]: mds.cephfs.compute-2.jyxktq asok_command: ops {prefix=ops} (starting...)
Jan 20 10:53:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0) v1
Jan 20 10:53:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4057764480' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Jan 20 10:53:13 np0005588920 nova_compute[226886]: 2026-01-20 15:53:13.671 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:53:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:53:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1050385293' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:53:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:53:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1050385293' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:53:13 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config-key dump"} v 0) v1
Jan 20 10:53:13 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1706995726' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Jan 20 10:53:13 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:53:13 np0005588920 ceph-mon[77148]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:53:14 np0005588920 podman[318259]: 2026-01-20 15:53:14.005000527 +0000 UTC m=+0.081548447 container health_status 04596e5beba505d4224673cb35c5d5496c847a64018f2b0d4273154ed3fe4882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-b64204a0cae6cdfc9699c0e376b26d8bf3abd80364f7d47f9b5f1870b84d6499-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent)
Jan 20 10:53:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0) v1
Jan 20 10:53:14 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2698181100' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 20 10:53:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:53:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:53:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:53:14.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:53:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:53:14 np0005588920 ceph-mds[83715]: mds.cephfs.compute-2.jyxktq asok_command: session ls {prefix=session ls} (starting...)
Jan 20 10:53:14 np0005588920 ceph-mds[83715]: mds.cephfs.compute-2.jyxktq asok_command: status {prefix=status} (starting...)
Jan 20 10:53:14 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0) v1
Jan 20 10:53:14 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2058637716' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 20 10:53:14 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:53:14 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:53:14 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:53:14.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:53:15 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0) v1
Jan 20 10:53:15 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1991284178' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 20 10:53:15 np0005588920 nova_compute[226886]: 2026-01-20 15:53:15.310 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:53:15 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0) v1
Jan 20 10:53:15 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3585713236' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Jan 20 10:53:15 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0) v1
Jan 20 10:53:15 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2989774423' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 20 10:53:15 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat"} v 0) v1
Jan 20 10:53:15 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1406338621' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Jan 20 10:53:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:53:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:53:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:53:16.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:53:16 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0) v1
Jan 20 10:53:16 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1611874348' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Jan 20 10:53:16 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0) v1
Jan 20 10:53:16 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/878803266' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 20 10:53:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:53:16.519 144128 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:53:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:53:16.519 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:53:16 np0005588920 ovn_metadata_agent[144123]: 2026-01-20 15:53:16.520 144128 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:53:16 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0) v1
Jan 20 10:53:16 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/692267894' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Jan 20 10:53:16 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:53:16 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:53:16 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:53:16.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:53:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0) v1
Jan 20 10:53:17 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2103643461' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 20 10:53:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0) v1
Jan 20 10:53:17 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2779943835' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 20 10:53:17 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0) v1
Jan 20 10:53:17 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2247268192' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 521756672 unmapped: 67452928 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 521756672 unmapped: 67452928 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 521756672 unmapped: 67452928 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.113175392s of 13.902588844s, submitted: 313
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 521781248 unmapped: 67428352 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 415 heartbeat osd_stat(store_statfs(0x199471000/0x0/0x1bfc00000, data 0x43d2adb/0x45fd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 521781248 unmapped: 67428352 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5637923 data_alloc: 251658240 data_used: 33759232
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 415 ms_handle_reset con 0x562f8b70f400 session 0x562f8900eb40
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 415 ms_handle_reset con 0x562f8b710800 session 0x562f8967d680
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 521789440 unmapped: 67420160 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 521789440 unmapped: 67420160 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 521797632 unmapped: 67411968 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 415 heartbeat osd_stat(store_statfs(0x199471000/0x0/0x1bfc00000, data 0x43d2adb/0x45fd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 415 heartbeat osd_stat(store_statfs(0x199471000/0x0/0x1bfc00000, data 0x43d2adb/0x45fd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 521797632 unmapped: 67411968 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 521797632 unmapped: 67411968 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5633127 data_alloc: 251658240 data_used: 33759232
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 521797632 unmapped: 67411968 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 521797632 unmapped: 67411968 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 521805824 unmapped: 67403776 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.978057861s of 10.021944046s, submitted: 28
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 415 heartbeat osd_stat(store_statfs(0x199471000/0x0/0x1bfc00000, data 0x43d2adb/0x45fd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [0,0,0,0,0,0,0,1])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 415 ms_handle_reset con 0x562f8bb2ac00 session 0x562f8bc09c20
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 415 handle_osd_map epochs [415,416], i have 415, src has [1,416]
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 416 ms_handle_reset con 0x562f8b70fc00 session 0x562f8b7c3c20
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 521814016 unmapped: 67395584 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 416 ms_handle_reset con 0x562f8df52c00 session 0x562f8a592b40
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 416 ms_handle_reset con 0x562f89d2e000 session 0x562f8c202b40
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 521814016 unmapped: 67395584 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5640939 data_alloc: 251658240 data_used: 33771520
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 521822208 unmapped: 67387392 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 521822208 unmapped: 67387392 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 521822208 unmapped: 67387392 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 416 ms_handle_reset con 0x562f8b70f400 session 0x562f8c5645a0
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 521822208 unmapped: 67387392 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 416 heartbeat osd_stat(store_statfs(0x19946c000/0x0/0x1bfc00000, data 0x43d4744/0x4601000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 416 ms_handle_reset con 0x562f8a2f8800 session 0x562f8a8a14a0
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 521822208 unmapped: 67387392 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5642167 data_alloc: 251658240 data_used: 34025472
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 416 ms_handle_reset con 0x562f8bb2b000 session 0x562f8ed2ba40
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 521822208 unmapped: 67387392 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 416 heartbeat osd_stat(store_statfs(0x19aa27000/0x0/0x1bfc00000, data 0x2e166bf/0x3041000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 521830400 unmapped: 67379200 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 416 handle_osd_map epochs [416,417], i have 416, src has [1,417]
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 417 ms_handle_reset con 0x562f8dfc9000 session 0x562f8ed2b4a0
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 521838592 unmapped: 67371008 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 521838592 unmapped: 67371008 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 417 heartbeat osd_stat(store_statfs(0x19aa29000/0x0/0x1bfc00000, data 0x2e1836c/0x3044000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.846511841s of 11.424500465s, submitted: 60
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 417 ms_handle_reset con 0x562f8b70d800 session 0x562f8c2352c0
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 417 ms_handle_reset con 0x562f89d2c400 session 0x562f8a8a03c0
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 521863168 unmapped: 67346432 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5401782 data_alloc: 234881024 data_used: 23015424
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 417 ms_handle_reset con 0x562f8a8a6000 session 0x562f8a3914a0
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 417 heartbeat osd_stat(store_statfs(0x19aa2a000/0x0/0x1bfc00000, data 0x2e1835c/0x3043000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 521863168 unmapped: 67346432 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 417 ms_handle_reset con 0x562f963cac00 session 0x562f8c564b40
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 521863168 unmapped: 67346432 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 417 ms_handle_reset con 0x562f8e936c00 session 0x562f8f7f3e00
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 417 ms_handle_reset con 0x562f8a56fc00 session 0x562f89687680
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 521871360 unmapped: 67338240 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 521871360 unmapped: 67338240 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 521871360 unmapped: 67338240 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 417 heartbeat osd_stat(store_statfs(0x19aa2b000/0x0/0x1bfc00000, data 0x2e1835c/0x3043000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5420302 data_alloc: 234881024 data_used: 24420352
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 521871360 unmapped: 67338240 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 521871360 unmapped: 67338240 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 417 heartbeat osd_stat(store_statfs(0x19aa2b000/0x0/0x1bfc00000, data 0x2e1835c/0x3043000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 521871360 unmapped: 67338240 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 417 heartbeat osd_stat(store_statfs(0x19aa2b000/0x0/0x1bfc00000, data 0x2e1835c/0x3043000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 521879552 unmapped: 67330048 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 417 handle_osd_map epochs [417,418], i have 417, src has [1,418]
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 522928128 unmapped: 66281472 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5421292 data_alloc: 234881024 data_used: 24424448
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 522928128 unmapped: 66281472 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 522928128 unmapped: 66281472 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.803892136s of 12.894905090s, submitted: 56
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8b70fc00 session 0x562f8bb392c0
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8b710800 session 0x562f8c3bde00
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 510795776 unmapped: 78413824 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19aa28000/0x0/0x1bfc00000, data 0x2e19e9b/0x3046000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [0,0,1])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8b70c000 session 0x562f89bf81e0
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 510812160 unmapped: 78397440 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 510812160 unmapped: 78397440 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015712 data_alloc: 218103808 data_used: 2912256
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 510812160 unmapped: 78397440 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 510812160 unmapped: 78397440 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 510812160 unmapped: 78397440 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19cb6f000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 510812160 unmapped: 78397440 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 510812160 unmapped: 78397440 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015712 data_alloc: 218103808 data_used: 2912256
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 510812160 unmapped: 78397440 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 510812160 unmapped: 78397440 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19cb6f000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 510812160 unmapped: 78397440 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19cb6f000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 510812160 unmapped: 78397440 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19cb6f000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 510812160 unmapped: 78397440 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015872 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 510812160 unmapped: 78397440 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 510812160 unmapped: 78397440 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 510812160 unmapped: 78397440 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19cb6f000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 510812160 unmapped: 78397440 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 510812160 unmapped: 78397440 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015872 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 510812160 unmapped: 78397440 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 510812160 unmapped: 78397440 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19cb6f000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 510812160 unmapped: 78397440 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 510812160 unmapped: 78397440 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19cb6f000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 510812160 unmapped: 78397440 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015872 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 510812160 unmapped: 78397440 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 510812160 unmapped: 78397440 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 510812160 unmapped: 78397440 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 510812160 unmapped: 78397440 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19cb6f000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 510812160 unmapped: 78397440 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015872 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 28.509115219s of 28.711914062s, submitted: 53
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 510812160 unmapped: 78397440 heap: 589209600 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8e472c00 session 0x562f89c445a0
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 510828544 unmapped: 90447872 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 510828544 unmapped: 90447872 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 510836736 unmapped: 90439680 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 510836736 unmapped: 90439680 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5104700 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19bfa5000/0x0/0x1bfc00000, data 0x189ee16/0x1ac9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 510836736 unmapped: 90439680 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 510836736 unmapped: 90439680 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 510836736 unmapped: 90439680 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 510836736 unmapped: 90439680 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19bfa5000/0x0/0x1bfc00000, data 0x189ee16/0x1ac9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 510844928 unmapped: 90431488 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5111366 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8e936c00 session 0x562f89af2780
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511016960 unmapped: 90259456 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511016960 unmapped: 90259456 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511016960 unmapped: 90259456 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511565824 unmapped: 89710592 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511565824 unmapped: 89710592 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19bf80000/0x0/0x1bfc00000, data 0x18c2e39/0x1aee000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5199042 data_alloc: 234881024 data_used: 15290368
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511565824 unmapped: 89710592 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511565824 unmapped: 89710592 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19bf80000/0x0/0x1bfc00000, data 0x18c2e39/0x1aee000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511565824 unmapped: 89710592 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511565824 unmapped: 89710592 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19bf80000/0x0/0x1bfc00000, data 0x18c2e39/0x1aee000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19bf80000/0x0/0x1bfc00000, data 0x18c2e39/0x1aee000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511574016 unmapped: 89702400 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5199042 data_alloc: 234881024 data_used: 15290368
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511574016 unmapped: 89702400 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511574016 unmapped: 89702400 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511574016 unmapped: 89702400 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19bf80000/0x0/0x1bfc00000, data 0x18c2e39/0x1aee000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511574016 unmapped: 89702400 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 23.296699524s of 23.424533844s, submitted: 21
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514310144 unmapped: 86966272 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5265662 data_alloc: 234881024 data_used: 15900672
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b985000/0x0/0x1bfc00000, data 0x1ebde39/0x20e9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514310144 unmapped: 86966272 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514310144 unmapped: 86966272 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514310144 unmapped: 86966272 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514310144 unmapped: 86966272 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514310144 unmapped: 86966272 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5268632 data_alloc: 234881024 data_used: 15966208
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514310144 unmapped: 86966272 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b759000/0x0/0x1bfc00000, data 0x20e9e39/0x2315000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514310144 unmapped: 86966272 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514310144 unmapped: 86966272 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514310144 unmapped: 86966272 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f97976c00 session 0x562f8b694000
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.949796677s of 10.103276253s, submitted: 54
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f94c53800 session 0x562f89b70d20
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514310144 unmapped: 86966272 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5268500 data_alloc: 234881024 data_used: 15966208
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514310144 unmapped: 86966272 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b759000/0x0/0x1bfc00000, data 0x20e9e39/0x2315000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514310144 unmapped: 86966272 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514310144 unmapped: 86966272 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514310144 unmapped: 86966272 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514310144 unmapped: 86966272 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5268500 data_alloc: 234881024 data_used: 15966208
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f95e20000 session 0x562f8e186f00
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514310144 unmapped: 86966272 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514310144 unmapped: 86966272 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b759000/0x0/0x1bfc00000, data 0x20e9e39/0x2315000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514326528 unmapped: 86949888 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514326528 unmapped: 86949888 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b759000/0x0/0x1bfc00000, data 0x20e9e39/0x2315000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514326528 unmapped: 86949888 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5265448 data_alloc: 234881024 data_used: 15986688
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b759000/0x0/0x1bfc00000, data 0x20e9e39/0x2315000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514326528 unmapped: 86949888 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514326528 unmapped: 86949888 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514326528 unmapped: 86949888 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b759000/0x0/0x1bfc00000, data 0x20e9e39/0x2315000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514326528 unmapped: 86949888 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514326528 unmapped: 86949888 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5265448 data_alloc: 234881024 data_used: 15986688
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514326528 unmapped: 86949888 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514326528 unmapped: 86949888 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b759000/0x0/0x1bfc00000, data 0x20e9e39/0x2315000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514326528 unmapped: 86949888 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514326528 unmapped: 86949888 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514326528 unmapped: 86949888 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5269768 data_alloc: 234881024 data_used: 16355328
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b759000/0x0/0x1bfc00000, data 0x20e9e39/0x2315000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514326528 unmapped: 86949888 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514326528 unmapped: 86949888 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b759000/0x0/0x1bfc00000, data 0x20e9e39/0x2315000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514326528 unmapped: 86949888 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b759000/0x0/0x1bfc00000, data 0x20e9e39/0x2315000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514334720 unmapped: 86941696 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514334720 unmapped: 86941696 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5269768 data_alloc: 234881024 data_used: 16355328
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b759000/0x0/0x1bfc00000, data 0x20e9e39/0x2315000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514334720 unmapped: 86941696 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514334720 unmapped: 86941696 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b759000/0x0/0x1bfc00000, data 0x20e9e39/0x2315000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514334720 unmapped: 86941696 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f89c57000 session 0x562f8c564000
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 28.983682632s of 29.022367477s, submitted: 2
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a2f8400 session 0x562f8c564d20
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514334720 unmapped: 86941696 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514334720 unmapped: 86941696 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5269636 data_alloc: 234881024 data_used: 16355328
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 506961920 unmapped: 94314496 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8b70f800 session 0x562f8c03a5a0
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 506961920 unmapped: 94314496 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19cb70000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 506961920 unmapped: 94314496 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 506961920 unmapped: 94314496 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 506961920 unmapped: 94314496 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5025750 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 506961920 unmapped: 94314496 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 506961920 unmapped: 94314496 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19cb70000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 506961920 unmapped: 94314496 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 506961920 unmapped: 94314496 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 506961920 unmapped: 94314496 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5025750 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 506961920 unmapped: 94314496 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 506961920 unmapped: 94314496 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 506961920 unmapped: 94314496 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19cb70000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 506961920 unmapped: 94314496 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 506961920 unmapped: 94314496 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5025750 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 506961920 unmapped: 94314496 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19cb70000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 506970112 unmapped: 94306304 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 506970112 unmapped: 94306304 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 506970112 unmapped: 94306304 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 506970112 unmapped: 94306304 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5025750 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 506970112 unmapped: 94306304 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19cb70000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 506970112 unmapped: 94306304 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 506970112 unmapped: 94306304 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 506970112 unmapped: 94306304 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 506978304 unmapped: 94298112 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5025750 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19cb70000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a3cd800 session 0x562f89c44b40
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a89e000 session 0x562f8be88960
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f89c57000 session 0x562f8bb390e0
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a2f8400 session 0x562f89687680
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 27.276323318s of 27.673217773s, submitted: 34
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 506986496 unmapped: 94289920 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a3cd800 session 0x562f8a593680
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8b70f800 session 0x562f89c44d20
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f90b99c00 session 0x562f8a3523c0
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f89c57000 session 0x562f8c3e41e0
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a2f8400 session 0x562f8d286960
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 507322368 unmapped: 93954048 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19c721000/0x0/0x1bfc00000, data 0x1120e88/0x134d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 507322368 unmapped: 93954048 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 507322368 unmapped: 93954048 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 507322368 unmapped: 93954048 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5074441 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19c721000/0x0/0x1bfc00000, data 0x1120e88/0x134d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 507322368 unmapped: 93954048 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 507322368 unmapped: 93954048 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f89d2e800 session 0x562f8c3e50e0
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f963cb800 session 0x562f8bc08f00
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 507322368 unmapped: 93954048 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8951e400 session 0x562f8b7a41e0
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f89c57000 session 0x562f8f7f25a0
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 507281408 unmapped: 93995008 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 507289600 unmapped: 93986816 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5094214 data_alloc: 218103808 data_used: 5652480
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19c721000/0x0/0x1bfc00000, data 0x1120e88/0x134d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 507297792 unmapped: 93978624 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 507297792 unmapped: 93978624 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 507297792 unmapped: 93978624 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19c721000/0x0/0x1bfc00000, data 0x1120e88/0x134d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 507297792 unmapped: 93978624 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 507297792 unmapped: 93978624 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5104454 data_alloc: 218103808 data_used: 7045120
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 507297792 unmapped: 93978624 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19c721000/0x0/0x1bfc00000, data 0x1120e88/0x134d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19c721000/0x0/0x1bfc00000, data 0x1120e88/0x134d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 507297792 unmapped: 93978624 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19c721000/0x0/0x1bfc00000, data 0x1120e88/0x134d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 507297792 unmapped: 93978624 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 507297792 unmapped: 93978624 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 507297792 unmapped: 93978624 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5104454 data_alloc: 218103808 data_used: 7045120
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19c721000/0x0/0x1bfc00000, data 0x1120e88/0x134d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2218f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 507297792 unmapped: 93978624 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 20.610776901s of 20.730430603s, submitted: 44
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #58. Immutable memtables: 14.
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513081344 unmapped: 88195072 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513081344 unmapped: 88195072 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513630208 unmapped: 87646208 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a8a1000/0x0/0x1bfc00000, data 0x1e00e88/0x202d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513630208 unmapped: 87646208 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5222298 data_alloc: 218103808 data_used: 8384512
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a8a1000/0x0/0x1bfc00000, data 0x1e00e88/0x202d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513630208 unmapped: 87646208 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513630208 unmapped: 87646208 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513630208 unmapped: 87646208 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513630208 unmapped: 87646208 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a87d000/0x0/0x1bfc00000, data 0x1e24e88/0x2051000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513630208 unmapped: 87646208 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5217374 data_alloc: 218103808 data_used: 8388608
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f89d2e800 session 0x562f89c443c0
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a2f8400 session 0x562f8c202960
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513630208 unmapped: 87646208 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513630208 unmapped: 87646208 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a87d000/0x0/0x1bfc00000, data 0x1e24e88/0x2051000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513630208 unmapped: 87646208 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a87d000/0x0/0x1bfc00000, data 0x1e24e88/0x2051000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513630208 unmapped: 87646208 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513630208 unmapped: 87646208 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5217170 data_alloc: 218103808 data_used: 8388608
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513630208 unmapped: 87646208 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a87d000/0x0/0x1bfc00000, data 0x1e24e88/0x2051000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.061242104s of 15.429950714s, submitted: 142
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513638400 unmapped: 87638016 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a87d000/0x0/0x1bfc00000, data 0x1e24e88/0x2051000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513638400 unmapped: 87638016 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a87d000/0x0/0x1bfc00000, data 0x1e24e88/0x2051000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513638400 unmapped: 87638016 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a87d000/0x0/0x1bfc00000, data 0x1e24e88/0x2051000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f90d91c00 session 0x562f8bb385a0
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f89d2f800 session 0x562f8f7f34a0
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513622016 unmapped: 87654400 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5216466 data_alloc: 218103808 data_used: 8388608
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f89c57000 session 0x562f89b70b40
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511811584 unmapped: 89464832 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511811584 unmapped: 89464832 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b9cb000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511811584 unmapped: 89464832 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511811584 unmapped: 89464832 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511811584 unmapped: 89464832 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5041950 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511811584 unmapped: 89464832 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511811584 unmapped: 89464832 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b9cb000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b9cb000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511811584 unmapped: 89464832 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b9cb000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511811584 unmapped: 89464832 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511811584 unmapped: 89464832 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5041950 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511811584 unmapped: 89464832 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511811584 unmapped: 89464832 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511819776 unmapped: 89456640 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b9cb000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511819776 unmapped: 89456640 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511819776 unmapped: 89456640 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5041950 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511819776 unmapped: 89456640 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511827968 unmapped: 89448448 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511827968 unmapped: 89448448 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511827968 unmapped: 89448448 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b9cb000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511827968 unmapped: 89448448 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5041950 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511827968 unmapped: 89448448 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511827968 unmapped: 89448448 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511827968 unmapped: 89448448 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b9cb000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511836160 unmapped: 89440256 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511836160 unmapped: 89440256 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5041950 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511836160 unmapped: 89440256 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b9cb000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511836160 unmapped: 89440256 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511836160 unmapped: 89440256 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511836160 unmapped: 89440256 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511836160 unmapped: 89440256 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5041950 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511844352 unmapped: 89432064 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511844352 unmapped: 89432064 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b9cb000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511844352 unmapped: 89432064 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511844352 unmapped: 89432064 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b9cb000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511844352 unmapped: 89432064 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5041950 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511844352 unmapped: 89432064 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511844352 unmapped: 89432064 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b9cb000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511844352 unmapped: 89432064 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511844352 unmapped: 89432064 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511852544 unmapped: 89423872 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5041950 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511852544 unmapped: 89423872 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511852544 unmapped: 89423872 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 46.220451355s of 46.318683624s, submitted: 42
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511852544 unmapped: 89423872 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b9cb000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [0,0,0,2])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8c7d2c00 session 0x562f8d2870e0
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a2f8400 session 0x562f8f7f2d20
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f95e23800 session 0x562f8bb39a40
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8b70cc00 session 0x562f8c03be00
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f89c57000 session 0x562f8c3bd4a0
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b3cb000/0x0/0x1bfc00000, data 0x12d8e16/0x1503000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 512114688 unmapped: 89161728 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 512114688 unmapped: 89161728 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5096345 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 512114688 unmapped: 89161728 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b3cb000/0x0/0x1bfc00000, data 0x12d8e16/0x1503000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 512114688 unmapped: 89161728 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 512114688 unmapped: 89161728 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8c1d5000 session 0x562f8ed2b4a0
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 512114688 unmapped: 89161728 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f898bfc00 session 0x562f8bb39a40
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a3c5c00 session 0x562f8f7f2d20
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a2f8400 session 0x562f89b70b40
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 512024576 unmapped: 89251840 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5098800 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 512032768 unmapped: 89243648 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b3ca000/0x0/0x1bfc00000, data 0x12d8e26/0x1504000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 512049152 unmapped: 89227264 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b3ca000/0x0/0x1bfc00000, data 0x12d8e26/0x1504000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 512049152 unmapped: 89227264 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 512049152 unmapped: 89227264 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 512049152 unmapped: 89227264 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5135384 data_alloc: 218103808 data_used: 8007680
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 512049152 unmapped: 89227264 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b3ca000/0x0/0x1bfc00000, data 0x12d8e26/0x1504000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 512049152 unmapped: 89227264 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 512049152 unmapped: 89227264 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b3ca000/0x0/0x1bfc00000, data 0x12d8e26/0x1504000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 512057344 unmapped: 89219072 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 512057344 unmapped: 89219072 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5135384 data_alloc: 218103808 data_used: 8007680
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b3ca000/0x0/0x1bfc00000, data 0x12d8e26/0x1504000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 512057344 unmapped: 89219072 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 512057344 unmapped: 89219072 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b3ca000/0x0/0x1bfc00000, data 0x12d8e26/0x1504000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.673820496s of 19.795513153s, submitted: 35
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 516939776 unmapped: 84336640 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b3ca000/0x0/0x1bfc00000, data 0x12d8e26/0x1504000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a88c000/0x0/0x1bfc00000, data 0x1e16e26/0x2042000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [1])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 516947968 unmapped: 84328448 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 516947968 unmapped: 84328448 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5234996 data_alloc: 218103808 data_used: 9166848
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a85e000/0x0/0x1bfc00000, data 0x1e43e26/0x206f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 516947968 unmapped: 84328448 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 516947968 unmapped: 84328448 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 516947968 unmapped: 84328448 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 516947968 unmapped: 84328448 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 516907008 unmapped: 84369408 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5231716 data_alloc: 218103808 data_used: 9166848
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 516907008 unmapped: 84369408 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a85c000/0x0/0x1bfc00000, data 0x1e46e26/0x2072000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 516907008 unmapped: 84369408 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 516907008 unmapped: 84369408 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 516907008 unmapped: 84369408 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 516907008 unmapped: 84369408 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5232516 data_alloc: 218103808 data_used: 9232384
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a85c000/0x0/0x1bfc00000, data 0x1e46e26/0x2072000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 516907008 unmapped: 84369408 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.459045410s of 13.756692886s, submitted: 106
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 516915200 unmapped: 84361216 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 516915200 unmapped: 84361216 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a85b000/0x0/0x1bfc00000, data 0x1e47e26/0x2073000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 516915200 unmapped: 84361216 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 516915200 unmapped: 84361216 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5232744 data_alloc: 218103808 data_used: 9232384
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 516915200 unmapped: 84361216 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a85b000/0x0/0x1bfc00000, data 0x1e47e26/0x2073000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 516915200 unmapped: 84361216 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 516915200 unmapped: 84361216 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a85b000/0x0/0x1bfc00000, data 0x1e47e26/0x2073000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 516915200 unmapped: 84361216 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 516915200 unmapped: 84361216 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5232744 data_alloc: 218103808 data_used: 9232384
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 516915200 unmapped: 84361216 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 516923392 unmapped: 84353024 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 516923392 unmapped: 84353024 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 516923392 unmapped: 84353024 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a85b000/0x0/0x1bfc00000, data 0x1e47e26/0x2073000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 516931584 unmapped: 84344832 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5232744 data_alloc: 218103808 data_used: 9232384
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 516939776 unmapped: 84336640 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 516939776 unmapped: 84336640 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 516939776 unmapped: 84336640 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 516939776 unmapped: 84336640 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a85b000/0x0/0x1bfc00000, data 0x1e47e26/0x2073000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.199949265s of 18.204195023s, submitted: 1
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 516947968 unmapped: 84328448 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5232876 data_alloc: 218103808 data_used: 9232384
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 516947968 unmapped: 84328448 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 516956160 unmapped: 84320256 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a859000/0x0/0x1bfc00000, data 0x1e48e26/0x2074000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 516956160 unmapped: 84320256 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f90abc800 session 0x562f8c03b0e0
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8fa34000 session 0x562f89d1bc20
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f90abd800 session 0x562f8bc08d20
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 516956160 unmapped: 84320256 heap: 601276416 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a41d000 session 0x562f8a558960
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f963cb400 session 0x562f8a558f00
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a41d000 session 0x562f8b695860
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8fa34000 session 0x562f8b7c3860
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f90abc800 session 0x562f8be892c0
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f90abd800 session 0x562f8bc08000
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 517054464 unmapped: 91570176 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5314074 data_alloc: 218103808 data_used: 9232384
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 517054464 unmapped: 91570176 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199e97000/0x0/0x1bfc00000, data 0x280ae88/0x2a37000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 517054464 unmapped: 91570176 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 517054464 unmapped: 91570176 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 517054464 unmapped: 91570176 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 517054464 unmapped: 91570176 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5314074 data_alloc: 218103808 data_used: 9232384
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 517054464 unmapped: 91570176 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199e97000/0x0/0x1bfc00000, data 0x280ae88/0x2a37000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 517054464 unmapped: 91570176 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 517054464 unmapped: 91570176 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 517054464 unmapped: 91570176 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199e97000/0x0/0x1bfc00000, data 0x280ae88/0x2a37000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8e936800 session 0x562f899630e0
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199e97000/0x0/0x1bfc00000, data 0x280ae88/0x2a37000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 517021696 unmapped: 91602944 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5314074 data_alloc: 218103808 data_used: 9232384
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a41d000 session 0x562f8967da40
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 517021696 unmapped: 91602944 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 517021696 unmapped: 91602944 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8e936400 session 0x562f8b7a4960
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.695632935s of 17.816280365s, submitted: 47
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8b70e800 session 0x562f8c03bc20
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 517177344 unmapped: 91447296 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 517177344 unmapped: 91447296 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199e71000/0x0/0x1bfc00000, data 0x282eebb/0x2a5d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 517177344 unmapped: 91447296 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5320898 data_alloc: 218103808 data_used: 9232384
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199e71000/0x0/0x1bfc00000, data 0x282eebb/0x2a5d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518455296 unmapped: 90169344 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518455296 unmapped: 90169344 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518455296 unmapped: 90169344 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518455296 unmapped: 90169344 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518455296 unmapped: 90169344 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5392258 data_alloc: 234881024 data_used: 19177472
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199e71000/0x0/0x1bfc00000, data 0x282eebb/0x2a5d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518463488 unmapped: 90161152 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518463488 unmapped: 90161152 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518463488 unmapped: 90161152 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.073493004s of 11.112250328s, submitted: 14
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518463488 unmapped: 90161152 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199e71000/0x0/0x1bfc00000, data 0x282eebb/0x2a5d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518463488 unmapped: 90161152 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5392258 data_alloc: 234881024 data_used: 19177472
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199e71000/0x0/0x1bfc00000, data 0x282eebb/0x2a5d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2332f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518463488 unmapped: 90161152 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 521166848 unmapped: 87457792 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199133000/0x0/0x1bfc00000, data 0x3154ebb/0x3383000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 521232384 unmapped: 87392256 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 521232384 unmapped: 87392256 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1990b5000/0x0/0x1bfc00000, data 0x31d2ebb/0x3401000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 521232384 unmapped: 87392256 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5484314 data_alloc: 234881024 data_used: 19664896
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1990b5000/0x0/0x1bfc00000, data 0x31d2ebb/0x3401000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 521232384 unmapped: 87392256 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 521232384 unmapped: 87392256 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1990b5000/0x0/0x1bfc00000, data 0x31d2ebb/0x3401000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 521232384 unmapped: 87392256 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 520560640 unmapped: 88064000 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5477022 data_alloc: 234881024 data_used: 19668992
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 520560640 unmapped: 88064000 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 520560640 unmapped: 88064000 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19909c000/0x0/0x1bfc00000, data 0x31f3ebb/0x3422000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 520560640 unmapped: 88064000 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 520560640 unmapped: 88064000 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.448692322s of 15.031543732s, submitted: 110
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8b711400 session 0x562f8c03b860
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f89ad3000 session 0x562f8900eb40
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513196032 unmapped: 95428608 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f89ad3000 session 0x562f8e186f00
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5248945 data_alloc: 218103808 data_used: 9232384
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513212416 unmapped: 95412224 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a295000/0x0/0x1bfc00000, data 0x1e48e26/0x2074000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513212416 unmapped: 95412224 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513212416 unmapped: 95412224 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513220608 unmapped: 95404032 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513220608 unmapped: 95404032 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a295000/0x0/0x1bfc00000, data 0x1e48e26/0x2074000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5248945 data_alloc: 218103808 data_used: 9232384
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513220608 unmapped: 95404032 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513220608 unmapped: 95404032 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a295000/0x0/0x1bfc00000, data 0x1e48e26/0x2074000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513220608 unmapped: 95404032 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513220608 unmapped: 95404032 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513220608 unmapped: 95404032 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5248945 data_alloc: 218103808 data_used: 9232384
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513220608 unmapped: 95404032 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.371462822s of 11.798146248s, submitted: 65
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f898bfc00 session 0x562f8f7f25a0
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f89c57000 session 0x562f8c3e50e0
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513228800 unmapped: 95395840 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f90d91400 session 0x562f89687680
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b5c1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511238144 unmapped: 97386496 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511238144 unmapped: 97386496 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511238144 unmapped: 97386496 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5069846 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511238144 unmapped: 97386496 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b5c1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511238144 unmapped: 97386496 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b5c1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511238144 unmapped: 97386496 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511238144 unmapped: 97386496 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511238144 unmapped: 97386496 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5069846 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511238144 unmapped: 97386496 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b5c1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511238144 unmapped: 97386496 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511238144 unmapped: 97386496 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511238144 unmapped: 97386496 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b5c1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511238144 unmapped: 97386496 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5069846 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511238144 unmapped: 97386496 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511238144 unmapped: 97386496 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b5c1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511238144 unmapped: 97386496 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511238144 unmapped: 97386496 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511238144 unmapped: 97386496 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5069846 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511238144 unmapped: 97386496 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b5c1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511238144 unmapped: 97386496 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511238144 unmapped: 97386496 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511246336 unmapped: 97378304 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511246336 unmapped: 97378304 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5069846 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b5c1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511246336 unmapped: 97378304 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511246336 unmapped: 97378304 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b5c1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511246336 unmapped: 97378304 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511246336 unmapped: 97378304 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511246336 unmapped: 97378304 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5069846 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511254528 unmapped: 97370112 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511254528 unmapped: 97370112 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511254528 unmapped: 97370112 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b5c1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511262720 unmapped: 97361920 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 32.895744324s of 33.712432861s, submitted: 43
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f97976400 session 0x562f8c03a1e0
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511459328 unmapped: 97165312 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5103274 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511467520 unmapped: 97157120 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511467520 unmapped: 97157120 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511467520 unmapped: 97157120 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b258000/0x0/0x1bfc00000, data 0x103be16/0x1266000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b258000/0x0/0x1bfc00000, data 0x103be16/0x1266000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511467520 unmapped: 97157120 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8bbfe800 session 0x562f8a592d20
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511467520 unmapped: 97157120 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b258000/0x0/0x1bfc00000, data 0x103be16/0x1266000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5106634 data_alloc: 218103808 data_used: 3440640
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511467520 unmapped: 97157120 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511467520 unmapped: 97157120 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511467520 unmapped: 97157120 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511467520 unmapped: 97157120 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511467520 unmapped: 97157120 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b258000/0x0/0x1bfc00000, data 0x103be16/0x1266000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5117994 data_alloc: 218103808 data_used: 5054464
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511467520 unmapped: 97157120 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511467520 unmapped: 97157120 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511467520 unmapped: 97157120 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511467520 unmapped: 97157120 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 5400.1 total, 600.0 interval#012Cumulative writes: 73K writes, 294K keys, 73K commit groups, 1.0 writes per commit group, ingest: 0.30 GB, 0.06 MB/s#012Cumulative WAL: 73K writes, 27K syncs, 2.70 writes per sync, written: 0.30 GB, 0.06 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 4424 writes, 18K keys, 4424 commit groups, 1.0 writes per commit group, ingest: 21.42 MB, 0.04 MB/s#012Interval WAL: 4424 writes, 1655 syncs, 2.67 writes per sync, written: 0.02 GB, 0.04 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511467520 unmapped: 97157120 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5117994 data_alloc: 218103808 data_used: 5054464
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b258000/0x0/0x1bfc00000, data 0x103be16/0x1266000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 511467520 unmapped: 97157120 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.769996643s of 16.804569244s, submitted: 15
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514301952 unmapped: 94322688 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514056192 unmapped: 94568448 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514326528 unmapped: 94298112 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514326528 unmapped: 94298112 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a55e000/0x0/0x1bfc00000, data 0x1d34e16/0x1f5f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5237896 data_alloc: 218103808 data_used: 5861376
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514326528 unmapped: 94298112 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: mgrc ms_handle_reset ms_handle_reset con 0x562f89c57c00
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/2542147622
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/2542147622,v1:192.168.122.100:6801/2542147622]
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: mgrc handle_mgr_configure stats_period=5
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 512892928 unmapped: 95731712 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a55e000/0x0/0x1bfc00000, data 0x1d34e16/0x1f5f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 512892928 unmapped: 95731712 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 512892928 unmapped: 95731712 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 512892928 unmapped: 95731712 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a53d000/0x0/0x1bfc00000, data 0x1d56e16/0x1f81000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5236048 data_alloc: 218103808 data_used: 5869568
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 512892928 unmapped: 95731712 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 512892928 unmapped: 95731712 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 512892928 unmapped: 95731712 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 512892928 unmapped: 95731712 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.613012314s of 12.861742020s, submitted: 94
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 512892928 unmapped: 95731712 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5236268 data_alloc: 218103808 data_used: 5869568
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a53a000/0x0/0x1bfc00000, data 0x1d59e16/0x1f84000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 512892928 unmapped: 95731712 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 512901120 unmapped: 95723520 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a53a000/0x0/0x1bfc00000, data 0x1d59e16/0x1f84000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a53a000/0x0/0x1bfc00000, data 0x1d59e16/0x1f84000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 512901120 unmapped: 95723520 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 512901120 unmapped: 95723520 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 512901120 unmapped: 95723520 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5237868 data_alloc: 218103808 data_used: 5955584
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 512901120 unmapped: 95723520 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 512901120 unmapped: 95723520 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8e935c00 session 0x562f8996e000
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 512901120 unmapped: 95723520 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a52f000/0x0/0x1bfc00000, data 0x1d64e16/0x1f8f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 512901120 unmapped: 95723520 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8e934000 session 0x562f8a8a03c0
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 512901120 unmapped: 95723520 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.376052856s of 11.388293266s, submitted: 4
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8e935c00 session 0x562f8f7f2b40
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5082382 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 512925696 unmapped: 95698944 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 512942080 unmapped: 95682560 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513024000 unmapped: 95600640 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b5c1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513015808 unmapped: 95608832 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513015808 unmapped: 95608832 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5082382 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513015808 unmapped: 95608832 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513015808 unmapped: 95608832 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513015808 unmapped: 95608832 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b5c1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513015808 unmapped: 95608832 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513015808 unmapped: 95608832 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5082382 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513015808 unmapped: 95608832 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513015808 unmapped: 95608832 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513015808 unmapped: 95608832 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513015808 unmapped: 95608832 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b5c1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b5c1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513015808 unmapped: 95608832 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b5c1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5082382 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513024000 unmapped: 95600640 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b5c1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513024000 unmapped: 95600640 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513024000 unmapped: 95600640 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513024000 unmapped: 95600640 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513024000 unmapped: 95600640 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5082382 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513024000 unmapped: 95600640 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b5c1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513024000 unmapped: 95600640 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513032192 unmapped: 95592448 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513032192 unmapped: 95592448 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513032192 unmapped: 95592448 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5082382 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513032192 unmapped: 95592448 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b5c1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513032192 unmapped: 95592448 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513032192 unmapped: 95592448 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513032192 unmapped: 95592448 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513032192 unmapped: 95592448 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5082382 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513040384 unmapped: 95584256 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f95e22c00 session 0x562f8a374d20
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f89a4f400 session 0x562f8a374f00
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8bbfe800 session 0x562f8ed2b860
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8e934000 session 0x562f89c66b40
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 31.095418930s of 31.685581207s, submitted: 229
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513138688 unmapped: 95485952 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8e935c00 session 0x562f8be88000
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f95e22c00 session 0x562f8d286b40
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f89d2f000 session 0x562f8a390780
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f89d2f000 session 0x562f8bb38960
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8bbfe800 session 0x562f8e186b40
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19ad87000/0x0/0x1bfc00000, data 0x150ae4f/0x1737000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513114112 unmapped: 95510528 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513114112 unmapped: 95510528 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513114112 unmapped: 95510528 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5151021 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513114112 unmapped: 95510528 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f92c2b400 session 0x562f8a592960
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19ad87000/0x0/0x1bfc00000, data 0x150ae88/0x1737000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513114112 unmapped: 95510528 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f89ad3000 session 0x562f8a353860
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a8a5800 session 0x562f89c04d20
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513105920 unmapped: 95518720 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f89ad3000 session 0x562f8b6f6f00
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513122304 unmapped: 95502336 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513097728 unmapped: 95526912 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19ad85000/0x0/0x1bfc00000, data 0x150aebb/0x1739000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5207581 data_alloc: 218103808 data_used: 10321920
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513097728 unmapped: 95526912 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513097728 unmapped: 95526912 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513097728 unmapped: 95526912 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513097728 unmapped: 95526912 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513097728 unmapped: 95526912 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19ad85000/0x0/0x1bfc00000, data 0x150aebb/0x1739000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5207581 data_alloc: 218103808 data_used: 10321920
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513097728 unmapped: 95526912 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513097728 unmapped: 95526912 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513097728 unmapped: 95526912 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513097728 unmapped: 95526912 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19ad85000/0x0/0x1bfc00000, data 0x150aebb/0x1739000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513097728 unmapped: 95526912 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19ad85000/0x0/0x1bfc00000, data 0x150aebb/0x1739000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5207581 data_alloc: 218103808 data_used: 10321920
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.558675766s of 18.995031357s, submitted: 46
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 513105920 unmapped: 95518720 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 515162112 unmapped: 93462528 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514473984 unmapped: 94150656 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514473984 unmapped: 94150656 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a9b1000/0x0/0x1bfc00000, data 0x18d6ebb/0x1b05000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514473984 unmapped: 94150656 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5247011 data_alloc: 218103808 data_used: 10493952
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514473984 unmapped: 94150656 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514473984 unmapped: 94150656 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a9b1000/0x0/0x1bfc00000, data 0x18d6ebb/0x1b05000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514473984 unmapped: 94150656 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514473984 unmapped: 94150656 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a9b1000/0x0/0x1bfc00000, data 0x18d6ebb/0x1b05000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514473984 unmapped: 94150656 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5247027 data_alloc: 218103808 data_used: 10493952
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514473984 unmapped: 94150656 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.547131538s of 10.736725807s, submitted: 68
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f89d2f000 session 0x562f8d2874a0
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8bbfe800 session 0x562f89d1a5a0
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514457600 unmapped: 94167040 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b1be000/0x0/0x1bfc00000, data 0xcd2e49/0xeff000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [0,1])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514457600 unmapped: 94167040 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a8a4400 session 0x562f89af3680
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514465792 unmapped: 94158848 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514465792 unmapped: 94158848 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5095068 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514465792 unmapped: 94158848 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514473984 unmapped: 94150656 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b1bf000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514473984 unmapped: 94150656 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514473984 unmapped: 94150656 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514473984 unmapped: 94150656 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5095068 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514473984 unmapped: 94150656 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b1bf000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514473984 unmapped: 94150656 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514473984 unmapped: 94150656 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514473984 unmapped: 94150656 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b1bf000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514473984 unmapped: 94150656 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5095068 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514473984 unmapped: 94150656 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514473984 unmapped: 94150656 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514482176 unmapped: 94142464 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b1bf000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514482176 unmapped: 94142464 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514482176 unmapped: 94142464 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5095068 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514482176 unmapped: 94142464 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514482176 unmapped: 94142464 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b1bf000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514482176 unmapped: 94142464 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514482176 unmapped: 94142464 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514482176 unmapped: 94142464 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5095068 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514490368 unmapped: 94134272 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514490368 unmapped: 94134272 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514490368 unmapped: 94134272 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b1bf000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514490368 unmapped: 94134272 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514498560 unmapped: 94126080 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5095068 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514498560 unmapped: 94126080 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514498560 unmapped: 94126080 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514498560 unmapped: 94126080 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 30.954259872s of 31.842222214s, submitted: 46
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b1bf000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8b70f000 session 0x562f8c564960
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514834432 unmapped: 93790208 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514834432 unmapped: 93790208 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5114140 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514842624 unmapped: 93782016 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514842624 unmapped: 93782016 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b480000/0x0/0x1bfc00000, data 0xe13e16/0x103e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514842624 unmapped: 93782016 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514842624 unmapped: 93782016 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f92c2b400 session 0x562f899621e0
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514990080 unmapped: 93634560 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5118416 data_alloc: 218103808 data_used: 3129344
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514899968 unmapped: 93724672 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b45c000/0x0/0x1bfc00000, data 0xe37e16/0x1062000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514899968 unmapped: 93724672 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514899968 unmapped: 93724672 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514899968 unmapped: 93724672 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b45c000/0x0/0x1bfc00000, data 0xe37e16/0x1062000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514899968 unmapped: 93724672 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5124016 data_alloc: 218103808 data_used: 3858432
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514899968 unmapped: 93724672 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514899968 unmapped: 93724672 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514899968 unmapped: 93724672 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514899968 unmapped: 93724672 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514908160 unmapped: 93716480 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b45c000/0x0/0x1bfc00000, data 0xe37e16/0x1062000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5124016 data_alloc: 218103808 data_used: 3858432
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 514908160 unmapped: 93716480 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.417503357s of 18.450187683s, submitted: 15
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518537216 unmapped: 90087424 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 520077312 unmapped: 88547328 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 520093696 unmapped: 88530944 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 520093696 unmapped: 88530944 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5234242 data_alloc: 218103808 data_used: 5107712
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 520093696 unmapped: 88530944 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a89b000/0x0/0x1bfc00000, data 0x19f0e16/0x1c1b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 520093696 unmapped: 88530944 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 520093696 unmapped: 88530944 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 519913472 unmapped: 88711168 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 519913472 unmapped: 88711168 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5227526 data_alloc: 218103808 data_used: 5107712
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 519913472 unmapped: 88711168 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a882000/0x0/0x1bfc00000, data 0x1a11e16/0x1c3c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 519913472 unmapped: 88711168 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 519913472 unmapped: 88711168 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a882000/0x0/0x1bfc00000, data 0x1a11e16/0x1c3c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 519913472 unmapped: 88711168 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 519913472 unmapped: 88711168 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.553206444s of 13.942697525s, submitted: 128
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5227594 data_alloc: 218103808 data_used: 5107712
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 519864320 unmapped: 88760320 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 519864320 unmapped: 88760320 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 519864320 unmapped: 88760320 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a879000/0x0/0x1bfc00000, data 0x1a1ae16/0x1c45000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 519864320 unmapped: 88760320 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 519864320 unmapped: 88760320 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5227594 data_alloc: 218103808 data_used: 5107712
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 519864320 unmapped: 88760320 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a879000/0x0/0x1bfc00000, data 0x1a1ae16/0x1c45000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f89d2dc00 session 0x562f8e1863c0
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a41c400 session 0x562f8e186000
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8c7d3c00 session 0x562f8c3bda40
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f97976000 session 0x562f8c3bd2c0
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 519864320 unmapped: 88760320 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f89d2dc00 session 0x562f8996ed20
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a41c400 session 0x562f8c03a000
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8c7d3c00 session 0x562f8c03af00
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f92c2b400 session 0x562f8a5932c0
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f97976000 session 0x562f8967cd20
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 519938048 unmapped: 88686592 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 519938048 unmapped: 88686592 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a16a000/0x0/0x1bfc00000, data 0x2127e88/0x2354000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 519938048 unmapped: 88686592 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5292327 data_alloc: 218103808 data_used: 5107712
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 519946240 unmapped: 88678400 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f898bfc00 session 0x562f89c452c0
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 519946240 unmapped: 88678400 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f963cb800 session 0x562f89b71680
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 519946240 unmapped: 88678400 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f89d2e400 session 0x562f8b695c20
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a16a000/0x0/0x1bfc00000, data 0x2127e88/0x2354000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.305614471s of 12.882215500s, submitted: 41
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f963cb000 session 0x562f8c5652c0
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 519897088 unmapped: 88727552 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 519905280 unmapped: 88719360 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5334501 data_alloc: 218103808 data_used: 11075584
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 520626176 unmapped: 87998464 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a169000/0x0/0x1bfc00000, data 0x2127e98/0x2355000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 520626176 unmapped: 87998464 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a169000/0x0/0x1bfc00000, data 0x2127e98/0x2355000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 520626176 unmapped: 87998464 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a169000/0x0/0x1bfc00000, data 0x2127e98/0x2355000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 520626176 unmapped: 87998464 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a167000/0x0/0x1bfc00000, data 0x2128e98/0x2356000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 520626176 unmapped: 87998464 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5335337 data_alloc: 218103808 data_used: 11075584
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 520626176 unmapped: 87998464 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 520626176 unmapped: 87998464 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 520626176 unmapped: 87998464 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a167000/0x0/0x1bfc00000, data 0x2128e98/0x2356000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 520626176 unmapped: 87998464 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 520626176 unmapped: 87998464 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5335817 data_alloc: 218103808 data_used: 11087872
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a167000/0x0/0x1bfc00000, data 0x2128e98/0x2356000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 520626176 unmapped: 87998464 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.940837860s of 12.958176613s, submitted: 7
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 522444800 unmapped: 86179840 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 523837440 unmapped: 84787200 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19975a000/0x0/0x1bfc00000, data 0x2b2de98/0x2d5b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 523837440 unmapped: 84787200 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 523837440 unmapped: 84787200 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5425901 data_alloc: 218103808 data_used: 11247616
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 523837440 unmapped: 84787200 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 523837440 unmapped: 84787200 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 523837440 unmapped: 84787200 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19973f000/0x0/0x1bfc00000, data 0x2b51e98/0x2d7f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [0,0,1])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8d290c00 session 0x562f8a5590e0
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8c168400 session 0x562f8e187680
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 523173888 unmapped: 85450752 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19973f000/0x0/0x1bfc00000, data 0x2b51e98/0x2d7f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [3])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f898bfc00 session 0x562f8f7f23c0
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 523182080 unmapped: 85442560 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5243934 data_alloc: 218103808 data_used: 5095424
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 523182080 unmapped: 85442560 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 523182080 unmapped: 85442560 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 523182080 unmapped: 85442560 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 523182080 unmapped: 85442560 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199e8e000/0x0/0x1bfc00000, data 0x1a1be16/0x1c46000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a3ccc00 session 0x562f89d1a780
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a41d000 session 0x562f8c3e4960
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.983770370s of 13.347075462s, submitted: 148
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f90abbc00 session 0x562f89c67680
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518569984 unmapped: 90054656 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5120644 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518569984 unmapped: 90054656 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518569984 unmapped: 90054656 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518569984 unmapped: 90054656 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b5c1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518569984 unmapped: 90054656 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518569984 unmapped: 90054656 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5120644 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518569984 unmapped: 90054656 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b5c1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518569984 unmapped: 90054656 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518569984 unmapped: 90054656 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518569984 unmapped: 90054656 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518569984 unmapped: 90054656 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b5c1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5120644 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518569984 unmapped: 90054656 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b5c1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518569984 unmapped: 90054656 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518569984 unmapped: 90054656 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b5c1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518569984 unmapped: 90054656 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b5c1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518569984 unmapped: 90054656 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5120644 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518569984 unmapped: 90054656 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518578176 unmapped: 90046464 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b5c1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518578176 unmapped: 90046464 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518586368 unmapped: 90038272 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518586368 unmapped: 90038272 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5120644 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518586368 unmapped: 90038272 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b5c1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518586368 unmapped: 90038272 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518586368 unmapped: 90038272 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518586368 unmapped: 90038272 heap: 608624640 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 24.599365234s of 24.624221802s, submitted: 15
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a89e400 session 0x562f8bb39e00
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f95e20000 session 0x562f89d1b2c0
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f89d2dc00 session 0x562f8c03a5a0
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f9456d400 session 0x562f8be881e0
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a3cc000 session 0x562f8a592000
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518619136 unmapped: 94208000 heap: 612827136 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a642000/0x0/0x1bfc00000, data 0x1c50e78/0x1e7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5240176 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518619136 unmapped: 94208000 heap: 612827136 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518619136 unmapped: 94208000 heap: 612827136 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518619136 unmapped: 94208000 heap: 612827136 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518619136 unmapped: 94208000 heap: 612827136 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518619136 unmapped: 94208000 heap: 612827136 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a642000/0x0/0x1bfc00000, data 0x1c50e78/0x1e7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5240176 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518619136 unmapped: 94208000 heap: 612827136 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518619136 unmapped: 94208000 heap: 612827136 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a3ccc00 session 0x562f89b4f680
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518635520 unmapped: 94191616 heap: 612827136 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518643712 unmapped: 94183424 heap: 612827136 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518668288 unmapped: 94158848 heap: 612827136 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5347961 data_alloc: 234881024 data_used: 18034688
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518668288 unmapped: 94158848 heap: 612827136 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a641000/0x0/0x1bfc00000, data 0x1c50e9b/0x1e7d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518668288 unmapped: 94158848 heap: 612827136 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a641000/0x0/0x1bfc00000, data 0x1c50e9b/0x1e7d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518668288 unmapped: 94158848 heap: 612827136 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518668288 unmapped: 94158848 heap: 612827136 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a641000/0x0/0x1bfc00000, data 0x1c50e9b/0x1e7d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518668288 unmapped: 94158848 heap: 612827136 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a641000/0x0/0x1bfc00000, data 0x1c50e9b/0x1e7d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5347961 data_alloc: 234881024 data_used: 18034688
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518668288 unmapped: 94158848 heap: 612827136 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a641000/0x0/0x1bfc00000, data 0x1c50e9b/0x1e7d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518668288 unmapped: 94158848 heap: 612827136 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518668288 unmapped: 94158848 heap: 612827136 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a641000/0x0/0x1bfc00000, data 0x1c50e9b/0x1e7d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518668288 unmapped: 94158848 heap: 612827136 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 518668288 unmapped: 94158848 heap: 612827136 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 21.098247528s of 21.288000107s, submitted: 53
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5458831 data_alloc: 234881024 data_used: 18767872
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 521428992 unmapped: 91398144 heap: 612827136 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 520060928 unmapped: 92766208 heap: 612827136 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1998ba000/0x0/0x1bfc00000, data 0x29cee9b/0x2bfb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 520060928 unmapped: 92766208 heap: 612827136 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 520060928 unmapped: 92766208 heap: 612827136 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 520060928 unmapped: 92766208 heap: 612827136 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5478087 data_alloc: 234881024 data_used: 19460096
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 520060928 unmapped: 92766208 heap: 612827136 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 520060928 unmapped: 92766208 heap: 612827136 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 520060928 unmapped: 92766208 heap: 612827136 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1998c0000/0x0/0x1bfc00000, data 0x29d1e9b/0x2bfe000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 520060928 unmapped: 92766208 heap: 612827136 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 520060928 unmapped: 92766208 heap: 612827136 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5472583 data_alloc: 234881024 data_used: 19472384
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 520060928 unmapped: 92766208 heap: 612827136 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 520060928 unmapped: 92766208 heap: 612827136 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 520060928 unmapped: 92766208 heap: 612827136 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1998c0000/0x0/0x1bfc00000, data 0x29d1e9b/0x2bfe000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 520060928 unmapped: 92766208 heap: 612827136 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 520060928 unmapped: 92766208 heap: 612827136 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5472583 data_alloc: 234881024 data_used: 19472384
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 520060928 unmapped: 92766208 heap: 612827136 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 520060928 unmapped: 92766208 heap: 612827136 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1998c0000/0x0/0x1bfc00000, data 0x29d1e9b/0x2bfe000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 520060928 unmapped: 92766208 heap: 612827136 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1998c0000/0x0/0x1bfc00000, data 0x29d1e9b/0x2bfe000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1998c0000/0x0/0x1bfc00000, data 0x29d1e9b/0x2bfe000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 520060928 unmapped: 92766208 heap: 612827136 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 520060928 unmapped: 92766208 heap: 612827136 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5473223 data_alloc: 234881024 data_used: 19488768
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 520060928 unmapped: 92766208 heap: 612827136 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 520060928 unmapped: 92766208 heap: 612827136 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 520060928 unmapped: 92766208 heap: 612827136 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 520060928 unmapped: 92766208 heap: 612827136 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1998c0000/0x0/0x1bfc00000, data 0x29d1e9b/0x2bfe000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 520060928 unmapped: 92766208 heap: 612827136 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5473223 data_alloc: 234881024 data_used: 19488768
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 25.534927368s of 25.809513092s, submitted: 125
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 520060928 unmapped: 92766208 heap: 612827136 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 520060928 unmapped: 92766208 heap: 612827136 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 520060928 unmapped: 92766208 heap: 612827136 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8b710c00 session 0x562f89c672c0
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f89ad3800 session 0x562f8a375a40
Jan 20 10:53:17 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8b710400 session 0x562f8bb381e0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a56e800 session 0x562f8b695860
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f89ad3800 session 0x562f8967d680
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a3ccc00 session 0x562f8be88000
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8b710400 session 0x562f8928c5a0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 521314304 unmapped: 95191040 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8b710c00 session 0x562f8f7f25a0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8b70c400 session 0x562f8ed2af00
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1998be000/0x0/0x1bfc00000, data 0x29d3e9b/0x2c00000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 521314304 unmapped: 95191040 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5592423 data_alloc: 234881024 data_used: 19488768
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 521314304 unmapped: 95191040 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1989eb000/0x0/0x1bfc00000, data 0x38a5eab/0x3ad3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 521314304 unmapped: 95191040 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 521314304 unmapped: 95191040 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a3ca000 session 0x562f8c235a40
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 521314304 unmapped: 95191040 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1989ea000/0x0/0x1bfc00000, data 0x38a5ece/0x3ad4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 521314304 unmapped: 95191040 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5656764 data_alloc: 234881024 data_used: 27885568
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527032320 unmapped: 89473024 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527032320 unmapped: 89473024 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527032320 unmapped: 89473024 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527032320 unmapped: 89473024 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.679993629s of 13.823144913s, submitted: 29
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1989ea000/0x0/0x1bfc00000, data 0x38a5ece/0x3ad4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527032320 unmapped: 89473024 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5699324 data_alloc: 251658240 data_used: 32649216
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527032320 unmapped: 89473024 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527032320 unmapped: 89473024 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527032320 unmapped: 89473024 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527032320 unmapped: 89473024 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1989ea000/0x0/0x1bfc00000, data 0x38a5ece/0x3ad4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527032320 unmapped: 89473024 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5699584 data_alloc: 251658240 data_used: 32665600
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527032320 unmapped: 89473024 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1989e9000/0x0/0x1bfc00000, data 0x38a6ece/0x3ad5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [0,0,2])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 533118976 unmapped: 83386368 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 533225472 unmapped: 83279872 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x197b89000/0x0/0x1bfc00000, data 0x46feece/0x492d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 533250048 unmapped: 83255296 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 533250048 unmapped: 83255296 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5825714 data_alloc: 251658240 data_used: 34033664
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x197b7a000/0x0/0x1bfc00000, data 0x470dece/0x493c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 533250048 unmapped: 83255296 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 533258240 unmapped: 83247104 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 533258240 unmapped: 83247104 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 533258240 unmapped: 83247104 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.970212936s of 14.434096336s, submitted: 115
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 533258240 unmapped: 83247104 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5825730 data_alloc: 251658240 data_used: 34033664
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 533258240 unmapped: 83247104 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x197b7a000/0x0/0x1bfc00000, data 0x470dece/0x493c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 533258240 unmapped: 83247104 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 532586496 unmapped: 83918848 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x197b81000/0x0/0x1bfc00000, data 0x470eece/0x493d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x197b81000/0x0/0x1bfc00000, data 0x470eece/0x493d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 532586496 unmapped: 83918848 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x197b81000/0x0/0x1bfc00000, data 0x470eece/0x493d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 532586496 unmapped: 83918848 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5815078 data_alloc: 251658240 data_used: 34033664
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 532594688 unmapped: 83910656 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 532594688 unmapped: 83910656 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a8a5400 session 0x562f8bc08b40
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 532234240 unmapped: 84271104 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8b70f000 session 0x562f8996ed20
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 532234240 unmapped: 84271104 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1994ef000/0x0/0x1bfc00000, data 0x29d5e9b/0x2c02000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1994ef000/0x0/0x1bfc00000, data 0x29d5e9b/0x2c02000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 532234240 unmapped: 84271104 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5487174 data_alloc: 234881024 data_used: 17276928
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 532234240 unmapped: 84271104 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1994ef000/0x0/0x1bfc00000, data 0x29d5e9b/0x2c02000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 532234240 unmapped: 84271104 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.652659416s of 13.726318359s, submitted: 42
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 532234240 unmapped: 84271104 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 532250624 unmapped: 84254720 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 532250624 unmapped: 84254720 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5484918 data_alloc: 234881024 data_used: 17276928
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 532250624 unmapped: 84254720 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 532250624 unmapped: 84254720 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1998ba000/0x0/0x1bfc00000, data 0x29d5e9b/0x2c02000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [0,0,2])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8c7d2000 session 0x562f8b7c2d20
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f898bf800 session 0x562f8c3bd2c0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527638528 unmapped: 88866816 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1998bc000/0x0/0x1bfc00000, data 0x29d5e9b/0x2c02000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527638528 unmapped: 88866816 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527638528 unmapped: 88866816 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5149518 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527638528 unmapped: 88866816 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527638528 unmapped: 88866816 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527638528 unmapped: 88866816 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b215000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527638528 unmapped: 88866816 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527638528 unmapped: 88866816 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5149518 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527638528 unmapped: 88866816 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b215000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527638528 unmapped: 88866816 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527638528 unmapped: 88866816 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527638528 unmapped: 88866816 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527638528 unmapped: 88866816 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b215000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5149518 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527638528 unmapped: 88866816 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b215000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527646720 unmapped: 88858624 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527646720 unmapped: 88858624 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527646720 unmapped: 88858624 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527646720 unmapped: 88858624 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5149518 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b215000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527646720 unmapped: 88858624 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b215000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527646720 unmapped: 88858624 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b215000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527654912 unmapped: 88850432 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b215000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527654912 unmapped: 88850432 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527654912 unmapped: 88850432 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5149518 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527654912 unmapped: 88850432 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b215000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527654912 unmapped: 88850432 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b215000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527654912 unmapped: 88850432 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527654912 unmapped: 88850432 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527663104 unmapped: 88842240 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5149518 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527663104 unmapped: 88842240 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527663104 unmapped: 88842240 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b215000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527663104 unmapped: 88842240 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b215000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527663104 unmapped: 88842240 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527663104 unmapped: 88842240 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5149518 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527663104 unmapped: 88842240 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527671296 unmapped: 88834048 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527671296 unmapped: 88834048 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b215000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8dfc9000 session 0x562f8a3de3c0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8dfc8800 session 0x562f8967cb40
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8e934000 session 0x562f8a3de1e0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f89a4e800 session 0x562f8e187a40
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 41.192298889s of 41.324481964s, submitted: 54
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527687680 unmapped: 88817664 heap: 616505344 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b215000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [0,0,0,0,0,1])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a3cc400 session 0x562f8ed2a1e0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f89a4e800 session 0x562f8ed2b860
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a3cc400 session 0x562f8bc09e00
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f90abd800 session 0x562f8c3bcb40
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8dfc8800 session 0x562f8c3e4000
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527704064 unmapped: 98295808 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5274847 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527704064 unmapped: 98295808 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a569000/0x0/0x1bfc00000, data 0x1d29e26/0x1f55000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527704064 unmapped: 98295808 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527704064 unmapped: 98295808 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527704064 unmapped: 98295808 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527712256 unmapped: 98287616 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a3cd800 session 0x562f8c03a000
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5277845 data_alloc: 218103808 data_used: 2924544
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a568000/0x0/0x1bfc00000, data 0x1d29e49/0x1f56000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527712256 unmapped: 98287616 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527712256 unmapped: 98287616 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a568000/0x0/0x1bfc00000, data 0x1d29e49/0x1f56000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a568000/0x0/0x1bfc00000, data 0x1d29e49/0x1f56000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527712256 unmapped: 98287616 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a568000/0x0/0x1bfc00000, data 0x1d29e49/0x1f56000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 531300352 unmapped: 94699520 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a568000/0x0/0x1bfc00000, data 0x1d29e49/0x1f56000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 531300352 unmapped: 94699520 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5398805 data_alloc: 234881024 data_used: 20045824
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 531300352 unmapped: 94699520 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 531300352 unmapped: 94699520 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 531300352 unmapped: 94699520 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a568000/0x0/0x1bfc00000, data 0x1d29e49/0x1f56000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 531300352 unmapped: 94699520 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a568000/0x0/0x1bfc00000, data 0x1d29e49/0x1f56000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 531300352 unmapped: 94699520 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.986478806s of 16.756948471s, submitted: 37
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f89a4e800 session 0x562f8c202f00
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a3cc400 session 0x562f8c3bc000
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5398541 data_alloc: 234881024 data_used: 20045824
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 531300352 unmapped: 94699520 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526696448 unmapped: 99303424 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8df52400 session 0x562f89c05a40
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526704640 unmapped: 99295232 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526704640 unmapped: 99295232 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b5c0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526704640 unmapped: 99295232 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5160055 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526704640 unmapped: 99295232 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526704640 unmapped: 99295232 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526704640 unmapped: 99295232 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526704640 unmapped: 99295232 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b5c0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526704640 unmapped: 99295232 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5160055 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526704640 unmapped: 99295232 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526704640 unmapped: 99295232 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526704640 unmapped: 99295232 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526704640 unmapped: 99295232 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526704640 unmapped: 99295232 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b5c0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5160055 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526704640 unmapped: 99295232 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526704640 unmapped: 99295232 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526704640 unmapped: 99295232 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b5c0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526704640 unmapped: 99295232 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.553043365s of 18.949422836s, submitted: 49
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 529850368 unmapped: 96149504 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8e472c00 session 0x562f8bb39e00
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a89e400 session 0x562f8996ed20
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f89a4e800 session 0x562f8a559860
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5184994 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b2bb000/0x0/0x1bfc00000, data 0xfd8e16/0x1203000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526721024 unmapped: 99278848 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a3cc400 session 0x562f8bc08f00
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8df52400 session 0x562f896861e0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526721024 unmapped: 99278848 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8e472c00 session 0x562f8f7f2b40
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b2bb000/0x0/0x1bfc00000, data 0xfd8e16/0x1203000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2373f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f898be000 session 0x562f8c3e5a40
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526729216 unmapped: 99270656 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526729216 unmapped: 99270656 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f97976c00 session 0x562f8c565a40
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a3cac00 session 0x562f8a374f00
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526729216 unmapped: 99270656 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19ae85000/0x0/0x1bfc00000, data 0xffce49/0x1229000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x23b4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5193594 data_alloc: 218103808 data_used: 2945024
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526745600 unmapped: 99254272 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526745600 unmapped: 99254272 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19ae85000/0x0/0x1bfc00000, data 0xffce49/0x1229000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x23b4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8e937c00 session 0x562f8e187e00
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f95e23800 session 0x562f89b4e1e0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526745600 unmapped: 99254272 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19ae85000/0x0/0x1bfc00000, data 0xffce49/0x1229000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x23b4f9c6), peers [0,1] op hist [0,0,0,0,0,0,0,1])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a8a4400 session 0x562f8e187a40
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526786560 unmapped: 99213312 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526786560 unmapped: 99213312 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5166599 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526786560 unmapped: 99213312 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526786560 unmapped: 99213312 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526786560 unmapped: 99213312 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b1b0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x23b4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526786560 unmapped: 99213312 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526786560 unmapped: 99213312 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b1b0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x23b4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5166599 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526794752 unmapped: 99205120 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526794752 unmapped: 99205120 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b1b0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x23b4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526794752 unmapped: 99205120 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526794752 unmapped: 99205120 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526794752 unmapped: 99205120 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5166599 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526794752 unmapped: 99205120 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526794752 unmapped: 99205120 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b1b0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x23b4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526794752 unmapped: 99205120 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526794752 unmapped: 99205120 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526794752 unmapped: 99205120 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5166599 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526794752 unmapped: 99205120 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526794752 unmapped: 99205120 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526794752 unmapped: 99205120 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b1b0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x23b4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526794752 unmapped: 99205120 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526794752 unmapped: 99205120 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5166599 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526794752 unmapped: 99205120 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b1b0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x23b4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526794752 unmapped: 99205120 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526794752 unmapped: 99205120 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526794752 unmapped: 99205120 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526794752 unmapped: 99205120 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5166599 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526794752 unmapped: 99205120 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b1b0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x23b4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526802944 unmapped: 99196928 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526802944 unmapped: 99196928 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526802944 unmapped: 99196928 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b1b0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x23b4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526802944 unmapped: 99196928 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5166599 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526802944 unmapped: 99196928 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526811136 unmapped: 99188736 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526811136 unmapped: 99188736 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b1b0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x23b4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526811136 unmapped: 99188736 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526811136 unmapped: 99188736 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5166599 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526811136 unmapped: 99188736 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526811136 unmapped: 99188736 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526811136 unmapped: 99188736 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b1b0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x23b4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526811136 unmapped: 99188736 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526819328 unmapped: 99180544 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5166599 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526819328 unmapped: 99180544 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b1b0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x23b4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526827520 unmapped: 99172352 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 52.271442413s of 53.535400391s, submitted: 73
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 526827520 unmapped: 99172352 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8bbfe800 session 0x562f89c67a40
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f92c2a000 session 0x562f89c452c0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527056896 unmapped: 98942976 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19ad2d000/0x0/0x1bfc00000, data 0x1155e78/0x1381000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x23b4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527065088 unmapped: 98934784 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5210032 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527065088 unmapped: 98934784 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527065088 unmapped: 98934784 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527065088 unmapped: 98934784 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f90d90400 session 0x562f89b71860
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527065088 unmapped: 98934784 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19ad2d000/0x0/0x1bfc00000, data 0x1155e78/0x1381000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x23b4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8e934000 session 0x562f89b705a0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8c1d5800 session 0x562f8b6f65a0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8bbfe800 session 0x562f8b6f72c0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 528048128 unmapped: 97951744 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5212179 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 528048128 unmapped: 97951744 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 528048128 unmapped: 97951744 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19ad2b000/0x0/0x1bfc00000, data 0x1155eab/0x1383000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x23b4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 528064512 unmapped: 97935360 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19ad2b000/0x0/0x1bfc00000, data 0x1155eab/0x1383000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x23b4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 528064512 unmapped: 97935360 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 528064512 unmapped: 97935360 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5243539 data_alloc: 218103808 data_used: 7344128
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 528064512 unmapped: 97935360 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 528064512 unmapped: 97935360 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 528064512 unmapped: 97935360 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19ad2b000/0x0/0x1bfc00000, data 0x1155eab/0x1383000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x23b4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 528064512 unmapped: 97935360 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 528064512 unmapped: 97935360 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19ad2b000/0x0/0x1bfc00000, data 0x1155eab/0x1383000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x23b4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5243539 data_alloc: 218103808 data_used: 7344128
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 528064512 unmapped: 97935360 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 528064512 unmapped: 97935360 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 528064512 unmapped: 97935360 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.965538025s of 20.741956711s, submitted: 37
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 529907712 unmapped: 96092160 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 530014208 unmapped: 95985664 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5303259 data_alloc: 218103808 data_used: 8224768
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a7f8000/0x0/0x1bfc00000, data 0x1680eab/0x18ae000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x23b4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 529358848 unmapped: 96641024 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 529358848 unmapped: 96641024 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 529358848 unmapped: 96641024 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 529358848 unmapped: 96641024 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 529358848 unmapped: 96641024 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5303739 data_alloc: 218103808 data_used: 8237056
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 529358848 unmapped: 96641024 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a763000/0x0/0x1bfc00000, data 0x171deab/0x194b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x23b4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 529326080 unmapped: 96673792 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 529326080 unmapped: 96673792 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 529334272 unmapped: 96665600 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 529334272 unmapped: 96665600 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a763000/0x0/0x1bfc00000, data 0x171deab/0x194b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x23b4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5303999 data_alloc: 218103808 data_used: 8273920
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 529334272 unmapped: 96665600 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a763000/0x0/0x1bfc00000, data 0x171deab/0x194b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x23b4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.305525780s of 12.984498024s, submitted: 79
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 529309696 unmapped: 96690176 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 529309696 unmapped: 96690176 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f90d90400 session 0x562f8f7f3860
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8e934000 session 0x562f8bc094a0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 529317888 unmapped: 96681984 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a8a6400 session 0x562f8a5932c0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b1af000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x23b4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527622144 unmapped: 98377728 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5176354 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b1af000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x23b4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527630336 unmapped: 98369536 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527630336 unmapped: 98369536 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527630336 unmapped: 98369536 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527630336 unmapped: 98369536 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527630336 unmapped: 98369536 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5176354 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527630336 unmapped: 98369536 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b1af000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x23b4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527630336 unmapped: 98369536 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527638528 unmapped: 98361344 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527638528 unmapped: 98361344 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b1af000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x23b4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527646720 unmapped: 98353152 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5176354 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527646720 unmapped: 98353152 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527646720 unmapped: 98353152 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527646720 unmapped: 98353152 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527646720 unmapped: 98353152 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527646720 unmapped: 98353152 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b1af000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x23b4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5176354 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527654912 unmapped: 98344960 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527654912 unmapped: 98344960 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527663104 unmapped: 98336768 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527663104 unmapped: 98336768 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527663104 unmapped: 98336768 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b1af000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x23b4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5176354 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527663104 unmapped: 98336768 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527663104 unmapped: 98336768 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527663104 unmapped: 98336768 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527671296 unmapped: 98328576 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527671296 unmapped: 98328576 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19b1af000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x23b4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5176354 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527671296 unmapped: 98328576 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527671296 unmapped: 98328576 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527671296 unmapped: 98328576 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8e937c00 session 0x562f8c203a40
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a8a6400 session 0x562f89d1bc20
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8bbfe800 session 0x562f8c564000
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8e934000 session 0x562f8bc08b40
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 32.486190796s of 32.563770294s, submitted: 31
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527622144 unmapped: 98377728 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f90d90400 session 0x562f8c235a40
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f95e21c00 session 0x562f8967da40
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a8a6400 session 0x562f8c3e4b40
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8bbfe800 session 0x562f8c235a40
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8e934000 session 0x562f8bc08b40
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527753216 unmapped: 98246656 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5279354 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527753216 unmapped: 98246656 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a50b000/0x0/0x1bfc00000, data 0x1977e26/0x1ba3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x23b4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527761408 unmapped: 98238464 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527761408 unmapped: 98238464 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f90d90400 session 0x562f8c564000
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527761408 unmapped: 98238464 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8e936c00 session 0x562f89d1bc20
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527761408 unmapped: 98238464 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a8a6400 session 0x562f8c203a40
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8bbfe800 session 0x562f8a5932c0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5281448 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527712256 unmapped: 98287616 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a50a000/0x0/0x1bfc00000, data 0x1977e49/0x1ba4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x23b4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527564800 unmapped: 98435072 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527564800 unmapped: 98435072 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527564800 unmapped: 98435072 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527564800 unmapped: 98435072 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a50a000/0x0/0x1bfc00000, data 0x1977e49/0x1ba4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x23b4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5374220 data_alloc: 234881024 data_used: 15802368
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527572992 unmapped: 98426880 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527572992 unmapped: 98426880 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527572992 unmapped: 98426880 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527572992 unmapped: 98426880 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527572992 unmapped: 98426880 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19a50a000/0x0/0x1bfc00000, data 0x1977e49/0x1ba4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x23b4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5374220 data_alloc: 234881024 data_used: 15802368
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527572992 unmapped: 98426880 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f89d2e800 session 0x562f8a8a03c0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 527572992 unmapped: 98426880 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f9456d800 session 0x562f8c203860
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f90b99400 session 0x562f8c2030e0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f89d2e800 session 0x562f8bb385a0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.176801682s of 18.334997177s, submitted: 31
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a8a6400 session 0x562f8d287c20
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8bbfe800 session 0x562f8b694960
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f9456d800 session 0x562f89b4f860
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f89d2c000 session 0x562f8b694f00
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f89d2e800 session 0x562f8be88f00
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 534405120 unmapped: 91594752 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19892a000/0x0/0x1bfc00000, data 0x33f7e49/0x3624000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x23caf9c6), peers [0,1] op hist [3])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 534052864 unmapped: 91947008 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 534052864 unmapped: 91947008 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5618217 data_alloc: 234881024 data_used: 17883136
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 534052864 unmapped: 91947008 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x198870000/0x0/0x1bfc00000, data 0x34a9e49/0x36d6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x23caf9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 534052864 unmapped: 91947008 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8b70cc00 session 0x562f8be88b40
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 534052864 unmapped: 91947008 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f963cb000 session 0x562f8c03a960
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 534052864 unmapped: 91947008 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8b7a8000 session 0x562f8c202000
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x198870000/0x0/0x1bfc00000, data 0x34a9e49/0x36d6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x23caf9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a2f8400 session 0x562f8f7f34a0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 534208512 unmapped: 91791360 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5624997 data_alloc: 234881024 data_used: 18481152
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 534216704 unmapped: 91783168 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538124288 unmapped: 87875584 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538124288 unmapped: 87875584 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19882e000/0x0/0x1bfc00000, data 0x34f1e7c/0x3720000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x23caf9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538124288 unmapped: 87875584 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538124288 unmapped: 87875584 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19882e000/0x0/0x1bfc00000, data 0x34f1e7c/0x3720000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x23caf9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5697157 data_alloc: 234881024 data_used: 28712960
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538124288 unmapped: 87875584 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538124288 unmapped: 87875584 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538124288 unmapped: 87875584 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538124288 unmapped: 87875584 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538124288 unmapped: 87875584 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19882e000/0x0/0x1bfc00000, data 0x34f1e7c/0x3720000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x23caf9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5697157 data_alloc: 234881024 data_used: 28712960
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538124288 unmapped: 87875584 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.579536438s of 19.038967133s, submitted: 210
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538124288 unmapped: 87875584 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #59. Immutable memtables: 15.
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 545955840 unmapped: 80044032 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 546242560 unmapped: 79757312 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19693e000/0x0/0x1bfc00000, data 0x4238e7c/0x4467000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 546242560 unmapped: 79757312 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5829515 data_alloc: 251658240 data_used: 30580736
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 546242560 unmapped: 79757312 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 546242560 unmapped: 79757312 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 546242560 unmapped: 79757312 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19690f000/0x0/0x1bfc00000, data 0x4267e7c/0x4496000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 546242560 unmapped: 79757312 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 545570816 unmapped: 80429056 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19690f000/0x0/0x1bfc00000, data 0x4267e7c/0x4496000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5819467 data_alloc: 251658240 data_used: 30658560
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 545570816 unmapped: 80429056 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 545570816 unmapped: 80429056 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 545570816 unmapped: 80429056 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.536319733s of 12.800119400s, submitted: 123
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 545570816 unmapped: 80429056 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x196915000/0x0/0x1bfc00000, data 0x426ae7c/0x4499000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [0,0,0,0,0,0,0,0,5])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 545570816 unmapped: 80429056 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5820259 data_alloc: 251658240 data_used: 30658560
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 545570816 unmapped: 80429056 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 545570816 unmapped: 80429056 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f89d2e800 session 0x562f8c3e4000
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8b70cc00 session 0x562f8c5645a0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f89d2d800 session 0x562f8b7c3c20
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19690f000/0x0/0x1bfc00000, data 0x4270e7c/0x449f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541548544 unmapped: 84451328 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541548544 unmapped: 84451328 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541548544 unmapped: 84451328 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5532308 data_alloc: 234881024 data_used: 17891328
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541548544 unmapped: 84451328 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541548544 unmapped: 84451328 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8e934000 session 0x562f89b705a0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8e936c00 session 0x562f89c452c0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19823e000/0x0/0x1bfc00000, data 0x2942e49/0x2b6f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [1])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539287552 unmapped: 86712320 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f89d2d800 session 0x562f8c203c20
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539287552 unmapped: 86712320 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539287552 unmapped: 86712320 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5212114 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539287552 unmapped: 86712320 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199eb0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199eb0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539287552 unmapped: 86712320 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539287552 unmapped: 86712320 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199eb0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539287552 unmapped: 86712320 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539287552 unmapped: 86712320 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5212114 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539287552 unmapped: 86712320 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539287552 unmapped: 86712320 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539287552 unmapped: 86712320 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199eb0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539287552 unmapped: 86712320 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539287552 unmapped: 86712320 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5212114 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539287552 unmapped: 86712320 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539287552 unmapped: 86712320 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539287552 unmapped: 86712320 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199eb0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539287552 unmapped: 86712320 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539287552 unmapped: 86712320 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5212114 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539287552 unmapped: 86712320 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539287552 unmapped: 86712320 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539287552 unmapped: 86712320 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539287552 unmapped: 86712320 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199eb0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539287552 unmapped: 86712320 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5212114 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539287552 unmapped: 86712320 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539287552 unmapped: 86712320 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539295744 unmapped: 86704128 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539295744 unmapped: 86704128 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539295744 unmapped: 86704128 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199eb0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5212114 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539295744 unmapped: 86704128 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199eb0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539295744 unmapped: 86704128 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539295744 unmapped: 86704128 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539295744 unmapped: 86704128 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539303936 unmapped: 86695936 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5212114 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539303936 unmapped: 86695936 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199eb0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539303936 unmapped: 86695936 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539303936 unmapped: 86695936 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539303936 unmapped: 86695936 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539303936 unmapped: 86695936 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5212114 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539303936 unmapped: 86695936 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539303936 unmapped: 86695936 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 6000.1 total, 600.0 interval
Cumulative writes: 76K writes, 307K keys, 76K commit groups, 1.0 writes per commit group, ingest: 0.31 GB, 0.05 MB/s
Cumulative WAL: 76K writes, 28K syncs, 2.69 writes per sync, written: 0.31 GB, 0.05 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 3357 writes, 13K keys, 3357 commit groups, 1.0 writes per commit group, ingest: 15.59 MB, 0.03 MB/s
Interval WAL: 3357 writes, 1321 syncs, 2.54 writes per sync, written: 0.02 GB, 0.03 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 6000.1 total, 4800.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x562f882274b0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 6.1e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-0] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 6000.1 total, 4800.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x562f882274b0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 6.1e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-1] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 6000.1 total, 4800.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 me
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199eb0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539312128 unmapped: 86687744 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539320320 unmapped: 86679552 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199eb0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539320320 unmapped: 86679552 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5212114 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539320320 unmapped: 86679552 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539320320 unmapped: 86679552 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539320320 unmapped: 86679552 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199eb0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539320320 unmapped: 86679552 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539320320 unmapped: 86679552 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5212114 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539320320 unmapped: 86679552 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539328512 unmapped: 86671360 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539328512 unmapped: 86671360 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199eb0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539328512 unmapped: 86671360 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539328512 unmapped: 86671360 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5212114 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539328512 unmapped: 86671360 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 61.782196045s of 62.848583221s, submitted: 107
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199eb0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [0,0,0,0,1])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539328512 unmapped: 86671360 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8b70d800 session 0x562f8e186b40
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539344896 unmapped: 86654976 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539344896 unmapped: 86654976 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1997bd000/0x0/0x1bfc00000, data 0x13c6e16/0x15f1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1997bd000/0x0/0x1bfc00000, data 0x13c6e16/0x15f1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539344896 unmapped: 86654976 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5272046 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539344896 unmapped: 86654976 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8bbff000 session 0x562f89962f00
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a8a6000 session 0x562f8c565860
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539344896 unmapped: 86654976 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8b70c800 session 0x562f8be89860
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1997bd000/0x0/0x1bfc00000, data 0x13c6e16/0x15f1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f89d2d800 session 0x562f8a5592c0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539287552 unmapped: 86712320 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1997bd000/0x0/0x1bfc00000, data 0x13c6e16/0x15f1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539295744 unmapped: 86704128 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1997bd000/0x0/0x1bfc00000, data 0x13c6e16/0x15f1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539303936 unmapped: 86695936 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5322697 data_alloc: 218103808 data_used: 9908224
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539303936 unmapped: 86695936 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539303936 unmapped: 86695936 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539303936 unmapped: 86695936 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.000275612s of 12.390037537s, submitted: 24
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539320320 unmapped: 86679552 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1997bd000/0x0/0x1bfc00000, data 0x13c6e16/0x15f1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539320320 unmapped: 86679552 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5322697 data_alloc: 218103808 data_used: 9908224
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539328512 unmapped: 86671360 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1997bd000/0x0/0x1bfc00000, data 0x13c6e16/0x15f1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539426816 unmapped: 86573056 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539426816 unmapped: 86573056 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539426816 unmapped: 86573056 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1997bd000/0x0/0x1bfc00000, data 0x13c6e16/0x15f1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539426816 unmapped: 86573056 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5355903 data_alloc: 218103808 data_used: 9920512
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 544669696 unmapped: 81330176 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542212096 unmapped: 83787776 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542310400 unmapped: 83689472 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x198c72000/0x0/0x1bfc00000, data 0x1f11e16/0x213c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542310400 unmapped: 83689472 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542310400 unmapped: 83689472 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5419215 data_alloc: 218103808 data_used: 11083776
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542310400 unmapped: 83689472 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542310400 unmapped: 83689472 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542310400 unmapped: 83689472 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.413876534s of 14.327191353s, submitted: 328
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542310400 unmapped: 83689472 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x198c4e000/0x0/0x1bfc00000, data 0x1f35e16/0x2160000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542310400 unmapped: 83689472 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5417283 data_alloc: 218103808 data_used: 11083776
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542310400 unmapped: 83689472 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542310400 unmapped: 83689472 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a8a5400 session 0x562f89c66b40
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a8a6000 session 0x562f89c67680
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8bbfe400 session 0x562f8bc08d20
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538189824 unmapped: 87810048 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199eb1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538189824 unmapped: 87810048 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538189824 unmapped: 87810048 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5221794 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538189824 unmapped: 87810048 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538189824 unmapped: 87810048 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199eb1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538189824 unmapped: 87810048 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538189824 unmapped: 87810048 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538189824 unmapped: 87810048 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5221794 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538189824 unmapped: 87810048 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538189824 unmapped: 87810048 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538189824 unmapped: 87810048 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199eb1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538189824 unmapped: 87810048 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538189824 unmapped: 87810048 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5221794 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538189824 unmapped: 87810048 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199eb1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538189824 unmapped: 87810048 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538189824 unmapped: 87810048 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538189824 unmapped: 87810048 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538189824 unmapped: 87810048 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5221794 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538189824 unmapped: 87810048 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199eb1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538189824 unmapped: 87810048 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538189824 unmapped: 87810048 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538189824 unmapped: 87810048 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538189824 unmapped: 87810048 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0) v1
Jan 20 10:53:18 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2298780932' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5221794 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538189824 unmapped: 87810048 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199eb1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538189824 unmapped: 87810048 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538189824 unmapped: 87810048 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199eb1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538189824 unmapped: 87810048 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538198016 unmapped: 87801856 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5221794 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538198016 unmapped: 87801856 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538198016 unmapped: 87801856 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538198016 unmapped: 87801856 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199eb1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538198016 unmapped: 87801856 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199eb1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538206208 unmapped: 87793664 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5221794 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199eb1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538206208 unmapped: 87793664 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538206208 unmapped: 87793664 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f97975400 session 0x562f8a3df0e0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f89d2d800 session 0x562f89c66b40
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a8a5400 session 0x562f8a5592c0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199eb1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a8a6000 session 0x562f8c565860
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 39.662048340s of 39.759681702s, submitted: 30
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8bbfe400 session 0x562f89962f00
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538451968 unmapped: 87547904 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f97975400 session 0x562f8b7c3c20
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f89d2d800 session 0x562f8c5645a0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a8a5400 session 0x562f8f7f34a0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a8a6000 session 0x562f8c202000
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538451968 unmapped: 87547904 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538451968 unmapped: 87547904 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5259828 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538451968 unmapped: 87547904 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538451968 unmapped: 87547904 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199bd9000/0x0/0x1bfc00000, data 0xfa9e26/0x11d5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538451968 unmapped: 87547904 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538451968 unmapped: 87547904 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538451968 unmapped: 87547904 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199bd9000/0x0/0x1bfc00000, data 0xfa9e26/0x11d5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5259828 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8c1d5000 session 0x562f8be88b40
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538460160 unmapped: 87539712 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a56fc00 session 0x562f8be88f00
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538460160 unmapped: 87539712 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f89d2d800 session 0x562f8b694f00
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a56fc00 session 0x562f89b4f860
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538460160 unmapped: 87539712 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.156903267s of 10.520124435s, submitted: 31
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538460160 unmapped: 87539712 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538386432 unmapped: 87613440 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199bd8000/0x0/0x1bfc00000, data 0xfa9e49/0x11d6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5273686 data_alloc: 218103808 data_used: 4395008
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538394624 unmapped: 87605248 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538394624 unmapped: 87605248 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538394624 unmapped: 87605248 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199bd8000/0x0/0x1bfc00000, data 0xfa9e49/0x11d6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538394624 unmapped: 87605248 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538394624 unmapped: 87605248 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199bd8000/0x0/0x1bfc00000, data 0xfa9e49/0x11d6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5273686 data_alloc: 218103808 data_used: 4395008
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538394624 unmapped: 87605248 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538394624 unmapped: 87605248 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538394624 unmapped: 87605248 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538394624 unmapped: 87605248 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538394624 unmapped: 87605248 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199bd8000/0x0/0x1bfc00000, data 0xfa9e49/0x11d6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5273686 data_alloc: 218103808 data_used: 4395008
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.699173927s of 12.701519966s, submitted: 1
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 538861568 unmapped: 87138304 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540844032 unmapped: 85155840 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540991488 unmapped: 85008384 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540991488 unmapped: 85008384 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540991488 unmapped: 85008384 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199155000/0x0/0x1bfc00000, data 0x1a15e49/0x1c42000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5371084 data_alloc: 218103808 data_used: 5799936
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540991488 unmapped: 85008384 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540991488 unmapped: 85008384 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540991488 unmapped: 85008384 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540000256 unmapped: 85999616 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19914b000/0x0/0x1bfc00000, data 0x1a36e49/0x1c63000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540000256 unmapped: 85999616 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5359504 data_alloc: 218103808 data_used: 5799936
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540000256 unmapped: 85999616 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539983872 unmapped: 86016000 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539983872 unmapped: 86016000 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19914b000/0x0/0x1bfc00000, data 0x1a36e49/0x1c63000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539983872 unmapped: 86016000 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.561217308s of 13.832423210s, submitted: 127
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539992064 unmapped: 86007808 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5359560 data_alloc: 218103808 data_used: 5799936
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539992064 unmapped: 86007808 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539992064 unmapped: 86007808 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199145000/0x0/0x1bfc00000, data 0x1a3ce49/0x1c69000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539992064 unmapped: 86007808 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199145000/0x0/0x1bfc00000, data 0x1a3ce49/0x1c69000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539992064 unmapped: 86007808 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539992064 unmapped: 86007808 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5359636 data_alloc: 218103808 data_used: 5799936
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 539992064 unmapped: 86007808 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540000256 unmapped: 85999616 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540000256 unmapped: 85999616 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540000256 unmapped: 85999616 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199142000/0x0/0x1bfc00000, data 0x1a3fe49/0x1c6c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540000256 unmapped: 85999616 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5359636 data_alloc: 218103808 data_used: 5799936
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199142000/0x0/0x1bfc00000, data 0x1a3fe49/0x1c6c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540008448 unmapped: 85991424 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199142000/0x0/0x1bfc00000, data 0x1a3fe49/0x1c6c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8f860c00 session 0x562f8a374f00
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f89d2e400 session 0x562f89c445a0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f92c2a000 session 0x562f89c67e00
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f89d2d800 session 0x562f8a593860
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540008448 unmapped: 85991424 heap: 625999872 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.154982567s of 12.480561256s, submitted: 4
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f89d2e400 session 0x562f8b7a41e0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a56fc00 session 0x562f8b6f61e0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8f860c00 session 0x562f8c203e00
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a301400 session 0x562f8c234960
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f89d2d800 session 0x562f8b695860
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199142000/0x0/0x1bfc00000, data 0x1a3fe49/0x1c6c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541032448 unmapped: 93364224 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541032448 unmapped: 93364224 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541032448 unmapped: 93364224 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5480873 data_alloc: 218103808 data_used: 5799936
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541032448 unmapped: 93364224 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541032448 unmapped: 93364224 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x198291000/0x0/0x1bfc00000, data 0x28eeeba/0x2b1d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541032448 unmapped: 93364224 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541040640 unmapped: 93356032 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541040640 unmapped: 93356032 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5479353 data_alloc: 218103808 data_used: 5799936
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a3ca800 session 0x562f8c565c20
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541040640 unmapped: 93356032 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f97974400 session 0x562f8c5641e0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541040640 unmapped: 93356032 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a3c8400 session 0x562f8bb39a40
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.000612259s of 10.176171303s, submitted: 45
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f97977800 session 0x562f8e186f00
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x198291000/0x0/0x1bfc00000, data 0x28eeeba/0x2b1d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541040640 unmapped: 93356032 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541040640 unmapped: 93356032 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541253632 unmapped: 93143040 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x198290000/0x0/0x1bfc00000, data 0x28eeedd/0x2b1e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5581918 data_alloc: 234881024 data_used: 20062208
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541777920 unmapped: 92618752 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541777920 unmapped: 92618752 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541777920 unmapped: 92618752 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541777920 unmapped: 92618752 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541777920 unmapped: 92618752 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x198290000/0x0/0x1bfc00000, data 0x28eeedd/0x2b1e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5581918 data_alloc: 234881024 data_used: 20062208
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541777920 unmapped: 92618752 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541777920 unmapped: 92618752 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541777920 unmapped: 92618752 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x198290000/0x0/0x1bfc00000, data 0x28eeedd/0x2b1e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.377536774s of 11.390765190s, submitted: 5
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541794304 unmapped: 92602368 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x198290000/0x0/0x1bfc00000, data 0x28eeedd/0x2b1e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,12])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x198290000/0x0/0x1bfc00000, data 0x28eeedd/0x2b1e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541802496 unmapped: 92594176 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5585774 data_alloc: 234881024 data_used: 20090880
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 543850496 unmapped: 90546176 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1976dd000/0x0/0x1bfc00000, data 0x349bedd/0x36cb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [0,0,0,1])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 545021952 unmapped: 89374720 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 546127872 unmapped: 88268800 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 546127872 unmapped: 88268800 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 546127872 unmapped: 88268800 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19769b000/0x0/0x1bfc00000, data 0x34d4edd/0x3704000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5685084 data_alloc: 234881024 data_used: 20451328
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 546127872 unmapped: 88268800 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19769b000/0x0/0x1bfc00000, data 0x34d4edd/0x3704000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 546127872 unmapped: 88268800 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 546127872 unmapped: 88268800 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 546127872 unmapped: 88268800 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 546127872 unmapped: 88268800 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5678588 data_alloc: 234881024 data_used: 20455424
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 546127872 unmapped: 88268800 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1976a7000/0x0/0x1bfc00000, data 0x34d7edd/0x3707000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 546127872 unmapped: 88268800 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 546127872 unmapped: 88268800 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 546127872 unmapped: 88268800 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 546136064 unmapped: 88260608 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5678748 data_alloc: 234881024 data_used: 20459520
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 546136064 unmapped: 88260608 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1976a7000/0x0/0x1bfc00000, data 0x34d7edd/0x3707000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 546144256 unmapped: 88252416 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.632635117s of 18.787807465s, submitted: 141
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1976a7000/0x0/0x1bfc00000, data 0x34d7edd/0x3707000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 546144256 unmapped: 88252416 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a3ccc00 session 0x562f8b6f6f00
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1976a6000/0x0/0x1bfc00000, data 0x34d8edd/0x3708000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542359552 unmapped: 92037120 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f97976800 session 0x562f89bf92c0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542359552 unmapped: 92037120 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5376585 data_alloc: 218103808 data_used: 5791744
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542359552 unmapped: 92037120 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542359552 unmapped: 92037120 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542359552 unmapped: 92037120 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19913b000/0x0/0x1bfc00000, data 0x1a45e49/0x1c72000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542367744 unmapped: 92028928 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542367744 unmapped: 92028928 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5376585 data_alloc: 218103808 data_used: 5791744
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542367744 unmapped: 92028928 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542367744 unmapped: 92028928 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19913b000/0x0/0x1bfc00000, data 0x1a45e49/0x1c72000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x24e4f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542367744 unmapped: 92028928 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542367744 unmapped: 92028928 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542367744 unmapped: 92028928 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a8a5400 session 0x562f8a8a03c0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.260348320s of 13.347884178s, submitted: 39
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a8a6000 session 0x562f8a5932c0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5376453 data_alloc: 218103808 data_used: 5791744
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542392320 unmapped: 92004352 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8c7d2c00 session 0x562f8928c780
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542392320 unmapped: 92004352 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542392320 unmapped: 92004352 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542392320 unmapped: 92004352 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542392320 unmapped: 92004352 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5251031 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542392320 unmapped: 92004352 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542392320 unmapped: 92004352 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542392320 unmapped: 92004352 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542392320 unmapped: 92004352 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542400512 unmapped: 91996160 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5251031 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542400512 unmapped: 91996160 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542400512 unmapped: 91996160 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542400512 unmapped: 91996160 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542400512 unmapped: 91996160 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542400512 unmapped: 91996160 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5251031 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542400512 unmapped: 91996160 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542400512 unmapped: 91996160 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542408704 unmapped: 91987968 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542408704 unmapped: 91987968 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542408704 unmapped: 91987968 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5251031 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542408704 unmapped: 91987968 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542408704 unmapped: 91987968 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542416896 unmapped: 91979776 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542416896 unmapped: 91979776 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542425088 unmapped: 91971584 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5251031 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542425088 unmapped: 91971584 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f89c57c00 session 0x562f89c445a0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a3ccc00 session 0x562f8a374f00
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542425088 unmapped: 91971584 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a5b8800 session 0x562f8b694f00
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f89d2e400 session 0x562f8be88f00
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 26.689945221s of 26.771379471s, submitted: 37
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8b710800 session 0x562f8be88b40
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f89c57c00 session 0x562f89962f00
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f89d2e400 session 0x562f8c565860
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a3ccc00 session 0x562f8a5592c0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a5b8800 session 0x562f89c66b40
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542613504 unmapped: 91783168 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1994fd000/0x0/0x1bfc00000, data 0x1275e26/0x14a1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542613504 unmapped: 91783168 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542613504 unmapped: 91783168 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5303049 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542613504 unmapped: 91783168 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542613504 unmapped: 91783168 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542613504 unmapped: 91783168 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542621696 unmapped: 91774976 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1994fd000/0x0/0x1bfc00000, data 0x1275e26/0x14a1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542621696 unmapped: 91774976 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5303049 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542621696 unmapped: 91774976 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a8a5400 session 0x562f8a3df0e0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542621696 unmapped: 91774976 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f89c57c00 session 0x562f8bc08d20
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f89d2e400 session 0x562f89c67680
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.614043236s of 10.666964531s, submitted: 15
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a3ccc00 session 0x562f8b6f6f00
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542621696 unmapped: 91774976 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542621696 unmapped: 91774976 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1994fd000/0x0/0x1bfc00000, data 0x1275e26/0x14a1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542621696 unmapped: 91774976 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5344742 data_alloc: 218103808 data_used: 8544256
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 543219712 unmapped: 91176960 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1994fd000/0x0/0x1bfc00000, data 0x1275e26/0x14a1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 543219712 unmapped: 91176960 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 543219712 unmapped: 91176960 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 543219712 unmapped: 91176960 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 543219712 unmapped: 91176960 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1994fd000/0x0/0x1bfc00000, data 0x1275e26/0x14a1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5344742 data_alloc: 218103808 data_used: 8544256
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 543219712 unmapped: 91176960 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 543219712 unmapped: 91176960 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 543219712 unmapped: 91176960 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 543219712 unmapped: 91176960 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 543219712 unmapped: 91176960 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5344742 data_alloc: 218103808 data_used: 8544256
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1994fd000/0x0/0x1bfc00000, data 0x1275e26/0x14a1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 543219712 unmapped: 91176960 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.321745872s of 13.344098091s, submitted: 7
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 545284096 unmapped: 89112576 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 545611776 unmapped: 88784896 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x198c43000/0x0/0x1bfc00000, data 0x1b27e26/0x1d53000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 545611776 unmapped: 88784896 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 545611776 unmapped: 88784896 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x198c43000/0x0/0x1bfc00000, data 0x1b27e26/0x1d53000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5427056 data_alloc: 218103808 data_used: 9846784
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 545611776 unmapped: 88784896 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 545619968 unmapped: 88776704 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 545619968 unmapped: 88776704 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 545619968 unmapped: 88776704 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x198c2c000/0x0/0x1bfc00000, data 0x1b46e26/0x1d72000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 545619968 unmapped: 88776704 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x198c2c000/0x0/0x1bfc00000, data 0x1b46e26/0x1d72000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5420732 data_alloc: 218103808 data_used: 9846784
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 545619968 unmapped: 88776704 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 545619968 unmapped: 88776704 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 545619968 unmapped: 88776704 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.572469711s of 12.788298607s, submitted: 96
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 545619968 unmapped: 88776704 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x198c22000/0x0/0x1bfc00000, data 0x1b50e26/0x1d7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 545619968 unmapped: 88776704 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5420980 data_alloc: 218103808 data_used: 9846784
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 545619968 unmapped: 88776704 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 545619968 unmapped: 88776704 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x198c22000/0x0/0x1bfc00000, data 0x1b50e26/0x1d7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 545619968 unmapped: 88776704 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 545619968 unmapped: 88776704 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 545619968 unmapped: 88776704 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5421548 data_alloc: 218103808 data_used: 9846784
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 545619968 unmapped: 88776704 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 545824768 unmapped: 88571904 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8dfc9000 session 0x562f8e187e00
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8fa35800 session 0x562f8c564780
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f89c57c00 session 0x562f8f7f23c0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f89d2e400 session 0x562f8a3754a0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a3ccc00 session 0x562f8a390b40
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x198c1c000/0x0/0x1bfc00000, data 0x1b56e26/0x1d82000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 545792000 unmapped: 88604672 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 545800192 unmapped: 88596480 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x198644000/0x0/0x1bfc00000, data 0x212ee26/0x235a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 545800192 unmapped: 88596480 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8b70e000 session 0x562f8f7f21e0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5469447 data_alloc: 218103808 data_used: 9846784
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 545800192 unmapped: 88596480 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f97974c00 session 0x562f8ed2af00
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8dfc8800 session 0x562f8c5641e0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.948197365s of 13.037006378s, submitted: 22
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f89c57c00 session 0x562f8c202000
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 545947648 unmapped: 88449024 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 545947648 unmapped: 88449024 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 545161216 unmapped: 89235456 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x198620000/0x0/0x1bfc00000, data 0x2152e26/0x237e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 545161216 unmapped: 89235456 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x198620000/0x0/0x1bfc00000, data 0x2152e26/0x237e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5514367 data_alloc: 234881024 data_used: 15777792
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 545161216 unmapped: 89235456 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 545161216 unmapped: 89235456 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x198620000/0x0/0x1bfc00000, data 0x2152e26/0x237e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 545161216 unmapped: 89235456 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 545161216 unmapped: 89235456 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 545161216 unmapped: 89235456 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5515231 data_alloc: 234881024 data_used: 15781888
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 545161216 unmapped: 89235456 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 545161216 unmapped: 89235456 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x198620000/0x0/0x1bfc00000, data 0x2152e26/0x237e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 545161216 unmapped: 89235456 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 545161216 unmapped: 89235456 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.393094063s of 12.421332359s, submitted: 9
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 548339712 unmapped: 86056960 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5599347 data_alloc: 234881024 data_used: 16134144
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 548339712 unmapped: 86056960 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 549396480 unmapped: 85000192 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x197ae9000/0x0/0x1bfc00000, data 0x2c89e26/0x2eb5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 549396480 unmapped: 85000192 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x197ae6000/0x0/0x1bfc00000, data 0x2c8ce26/0x2eb8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 549396480 unmapped: 85000192 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 549396480 unmapped: 85000192 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5609553 data_alloc: 234881024 data_used: 16125952
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 549396480 unmapped: 85000192 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 549396480 unmapped: 85000192 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x197ae6000/0x0/0x1bfc00000, data 0x2c8ce26/0x2eb8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 549396480 unmapped: 85000192 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f89d2e400 session 0x562f8bb39a40
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a3ccc00 session 0x562f8b7c3c20
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8c1d4c00 session 0x562f89b4f860
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 549412864 unmapped: 84983808 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 549412864 unmapped: 84983808 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5433522 data_alloc: 218103808 data_used: 9850880
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 549412864 unmapped: 84983808 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x198a5f000/0x0/0x1bfc00000, data 0x1b62e26/0x1d8e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x198a5f000/0x0/0x1bfc00000, data 0x1b62e26/0x1d8e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 549412864 unmapped: 84983808 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 549412864 unmapped: 84983808 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.215131760s of 14.481391907s, submitted: 108
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 549412864 unmapped: 84983808 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a8a4c00 session 0x562f8bc09e00
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f91448800 session 0x562f8bb394a0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f963ca800 session 0x562f8c3e5680
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 549421056 unmapped: 84975616 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5275296 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 549421056 unmapped: 84975616 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 549421056 unmapped: 84975616 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1998d0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 549421056 unmapped: 84975616 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 549421056 unmapped: 84975616 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 549421056 unmapped: 84975616 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1998d0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5275296 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 549421056 unmapped: 84975616 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 549421056 unmapped: 84975616 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 549421056 unmapped: 84975616 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1998d0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 549429248 unmapped: 84967424 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 549429248 unmapped: 84967424 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5275296 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 549429248 unmapped: 84967424 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1998d0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1998d0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 549429248 unmapped: 84967424 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 549429248 unmapped: 84967424 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 549429248 unmapped: 84967424 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 549429248 unmapped: 84967424 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5275296 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 549429248 unmapped: 84967424 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 549437440 unmapped: 84959232 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1998d0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 549437440 unmapped: 84959232 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 549437440 unmapped: 84959232 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 549437440 unmapped: 84959232 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1998d0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5275296 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 549437440 unmapped: 84959232 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1998d0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 549437440 unmapped: 84959232 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1998d0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1998d0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 549437440 unmapped: 84959232 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 549445632 unmapped: 84951040 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 549445632 unmapped: 84951040 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5275296 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 549445632 unmapped: 84951040 heap: 634396672 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f90b98000 session 0x562f89bf8960
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a8a5000 session 0x562f89c04d20
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a8a4c00 session 0x562f8928c5a0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8d290000 session 0x562f8c202f00
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 27.458549500s of 27.611721039s, submitted: 31
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1998d0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f90b98000 session 0x562f8ed2ad20
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f91448800 session 0x562f8bc08960
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f963ca800 session 0x562f8928c5a0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a8a4c00 session 0x562f89c04d20
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8d290000 session 0x562f8c3e5680
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 549486592 unmapped: 92782592 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 549486592 unmapped: 92782592 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 549486592 unmapped: 92782592 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 549486592 unmapped: 92782592 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5407403 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 549486592 unmapped: 92782592 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x198999000/0x0/0x1bfc00000, data 0x1dd8e88/0x2005000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 549494784 unmapped: 92774400 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x198999000/0x0/0x1bfc00000, data 0x1dd8e88/0x2005000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 549494784 unmapped: 92774400 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 549494784 unmapped: 92774400 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a2f8800 session 0x562f898c81e0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 549494784 unmapped: 92774400 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x198999000/0x0/0x1bfc00000, data 0x1dd8e88/0x2005000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5407604 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8bbffc00 session 0x562f89c67860
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 549527552 unmapped: 92741632 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.839284897s of 10.062349319s, submitted: 44
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 549527552 unmapped: 92741632 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 549535744 unmapped: 92733440 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 550322176 unmapped: 91947008 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 550322176 unmapped: 91947008 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x198999000/0x0/0x1bfc00000, data 0x1dd8e88/0x2005000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5525752 data_alloc: 234881024 data_used: 19234816
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 550322176 unmapped: 91947008 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 550322176 unmapped: 91947008 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 550322176 unmapped: 91947008 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x198999000/0x0/0x1bfc00000, data 0x1dd8e88/0x2005000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 550322176 unmapped: 91947008 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 550322176 unmapped: 91947008 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5525752 data_alloc: 234881024 data_used: 19234816
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 550322176 unmapped: 91947008 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 550322176 unmapped: 91947008 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x198999000/0x0/0x1bfc00000, data 0x1dd8e88/0x2005000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.764149666s of 11.766985893s, submitted: 1
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 550322176 unmapped: 91947008 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x198999000/0x0/0x1bfc00000, data 0x1dd8e88/0x2005000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 554901504 unmapped: 87367680 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 554246144 unmapped: 88023040 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5612894 data_alloc: 234881024 data_used: 20729856
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 554246144 unmapped: 88023040 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 554246144 unmapped: 88023040 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 554246144 unmapped: 88023040 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x198079000/0x0/0x1bfc00000, data 0x26f7e88/0x2924000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 554246144 unmapped: 88023040 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 554246144 unmapped: 88023040 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5612910 data_alloc: 234881024 data_used: 20729856
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 554246144 unmapped: 88023040 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x198079000/0x0/0x1bfc00000, data 0x26f7e88/0x2924000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x198079000/0x0/0x1bfc00000, data 0x26f7e88/0x2924000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 554246144 unmapped: 88023040 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 554246144 unmapped: 88023040 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x198079000/0x0/0x1bfc00000, data 0x26f7e88/0x2924000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 554246144 unmapped: 88023040 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x198079000/0x0/0x1bfc00000, data 0x26f7e88/0x2924000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 554246144 unmapped: 88023040 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5612910 data_alloc: 234881024 data_used: 20729856
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 554246144 unmapped: 88023040 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 554246144 unmapped: 88023040 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.853179932s of 14.085871696s, submitted: 93
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8bb6a800 session 0x562f8bc09680
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f90abc400 session 0x562f8bb39a40
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a8a4c00 session 0x562f898c8d20
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541548544 unmapped: 100720640 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541548544 unmapped: 100720640 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541548544 unmapped: 100720640 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199761000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5290976 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541548544 unmapped: 100720640 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541548544 unmapped: 100720640 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541548544 unmapped: 100720640 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541548544 unmapped: 100720640 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199761000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541548544 unmapped: 100720640 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5290976 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541548544 unmapped: 100720640 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541548544 unmapped: 100720640 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541548544 unmapped: 100720640 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541548544 unmapped: 100720640 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199761000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541548544 unmapped: 100720640 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5290976 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541548544 unmapped: 100720640 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541548544 unmapped: 100720640 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541548544 unmapped: 100720640 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199761000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541548544 unmapped: 100720640 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541548544 unmapped: 100720640 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5290976 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541548544 unmapped: 100720640 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541548544 unmapped: 100720640 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541548544 unmapped: 100720640 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199761000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541548544 unmapped: 100720640 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199761000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8c1d5400 session 0x562f8c234960
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f90abc400 session 0x562f89c45680
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f898bec00 session 0x562f8e186f00
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541556736 unmapped: 100712448 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8b70d400 session 0x562f8be885a0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 23.086126328s of 23.166633606s, submitted: 41
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f898bec00 session 0x562f8a3743c0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a8a4c00 session 0x562f8c3e4780
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8c1d5400 session 0x562f8a592000
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f90abc400 session 0x562f8f7f30e0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f94c52400 session 0x562f8e187a40
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5334313 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1995f9000/0x0/0x1bfc00000, data 0x1179e26/0x13a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541581312 unmapped: 100687872 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1995f9000/0x0/0x1bfc00000, data 0x1179e26/0x13a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541581312 unmapped: 100687872 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1995f9000/0x0/0x1bfc00000, data 0x1179e26/0x13a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541581312 unmapped: 100687872 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541581312 unmapped: 100687872 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1995f9000/0x0/0x1bfc00000, data 0x1179e26/0x13a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541581312 unmapped: 100687872 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5334313 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541581312 unmapped: 100687872 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541581312 unmapped: 100687872 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541581312 unmapped: 100687872 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1995f9000/0x0/0x1bfc00000, data 0x1179e26/0x13a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541581312 unmapped: 100687872 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f963ca800 session 0x562f8c5641e0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1995d4000/0x0/0x1bfc00000, data 0x119de49/0x13ca000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541736960 unmapped: 100532224 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5338470 data_alloc: 218103808 data_used: 2920448
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541736960 unmapped: 100532224 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541736960 unmapped: 100532224 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541900800 unmapped: 100368384 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542367744 unmapped: 99901440 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1995d4000/0x0/0x1bfc00000, data 0x119de49/0x13ca000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542367744 unmapped: 99901440 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5372390 data_alloc: 218103808 data_used: 7700480
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542367744 unmapped: 99901440 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542367744 unmapped: 99901440 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542367744 unmapped: 99901440 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1995d4000/0x0/0x1bfc00000, data 0x119de49/0x13ca000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542367744 unmapped: 99901440 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542367744 unmapped: 99901440 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5372390 data_alloc: 218103808 data_used: 7700480
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542367744 unmapped: 99901440 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1995d4000/0x0/0x1bfc00000, data 0x119de49/0x13ca000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542367744 unmapped: 99901440 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 542367744 unmapped: 99901440 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 23.212116241s of 23.299909592s, submitted: 26
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 544325632 unmapped: 97943552 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 543703040 unmapped: 98566144 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5397568 data_alloc: 218103808 data_used: 7839744
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1994a6000/0x0/0x1bfc00000, data 0x12c3e49/0x14f0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 543703040 unmapped: 98566144 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 543703040 unmapped: 98566144 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 543703040 unmapped: 98566144 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 543703040 unmapped: 98566144 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 543703040 unmapped: 98566144 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5397584 data_alloc: 218103808 data_used: 7839744
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 543703040 unmapped: 98566144 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1994a6000/0x0/0x1bfc00000, data 0x12c3e49/0x14f0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 543703040 unmapped: 98566144 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 543703040 unmapped: 98566144 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1994a6000/0x0/0x1bfc00000, data 0x12c3e49/0x14f0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 543703040 unmapped: 98566144 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 543703040 unmapped: 98566144 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5397584 data_alloc: 218103808 data_used: 7839744
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 543703040 unmapped: 98566144 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a2f9800 session 0x562f8b6954a0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f97975400 session 0x562f89b70f00
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8f860800 session 0x562f8b7a4d20
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f898be800 session 0x562f89b71860
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.446071625s of 13.547875404s, submitted: 43
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a2f9800 session 0x562f89c66960
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 544940032 unmapped: 97329152 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8f860800 session 0x562f8900eb40
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f963ca800 session 0x562f8a558f00
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f97975400 session 0x562f89af23c0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f97977000 session 0x562f8ed2af00
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 544956416 unmapped: 97312768 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x198c46000/0x0/0x1bfc00000, data 0x1b2aeab/0x1d58000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 544964608 unmapped: 97304576 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 544964608 unmapped: 97304576 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8904fc00 session 0x562f8c3e4d20
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5464102 data_alloc: 218103808 data_used: 7839744
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 544964608 unmapped: 97304576 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8a5b8400 session 0x562f8c234d20
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x198c46000/0x0/0x1bfc00000, data 0x1b2aeab/0x1d58000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 544964608 unmapped: 97304576 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8e934000 session 0x562f8ed2af00
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8b70c000 session 0x562f89af23c0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 544964608 unmapped: 97304576 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 544964608 unmapped: 97304576 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x198c45000/0x0/0x1bfc00000, data 0x1b2aebb/0x1d59000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 547971072 unmapped: 94298112 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5520053 data_alloc: 234881024 data_used: 15511552
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 547971072 unmapped: 94298112 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x198c45000/0x0/0x1bfc00000, data 0x1b2aebb/0x1d59000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 547971072 unmapped: 94298112 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x198c45000/0x0/0x1bfc00000, data 0x1b2aebb/0x1d59000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 547971072 unmapped: 94298112 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 547971072 unmapped: 94298112 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 547971072 unmapped: 94298112 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5520053 data_alloc: 234881024 data_used: 15511552
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 547971072 unmapped: 94298112 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 547971072 unmapped: 94298112 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 547971072 unmapped: 94298112 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x198c45000/0x0/0x1bfc00000, data 0x1b2aebb/0x1d59000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 547971072 unmapped: 94298112 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 547971072 unmapped: 94298112 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.229217529s of 18.420061111s, submitted: 56
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5566369 data_alloc: 234881024 data_used: 15765504
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 550273024 unmapped: 91996160 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 550281216 unmapped: 91987968 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 550322176 unmapped: 91947008 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x198697000/0x0/0x1bfc00000, data 0x20d0ebb/0x22ff000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 550322176 unmapped: 91947008 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 550322176 unmapped: 91947008 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5579593 data_alloc: 234881024 data_used: 15585280
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 550322176 unmapped: 91947008 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x198697000/0x0/0x1bfc00000, data 0x20d0ebb/0x22ff000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x198697000/0x0/0x1bfc00000, data 0x20d0ebb/0x22ff000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 550322176 unmapped: 91947008 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x198697000/0x0/0x1bfc00000, data 0x20d0ebb/0x22ff000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 550322176 unmapped: 91947008 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x19869c000/0x0/0x1bfc00000, data 0x20d3ebb/0x2302000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 550322176 unmapped: 91947008 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8b7a9800 session 0x562f8b7a4d20
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8bb2a400 session 0x562f896865a0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 546406400 unmapped: 95862784 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8904fc00 session 0x562f8bb39680
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.600634575s of 10.286421776s, submitted: 142
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5406880 data_alloc: 218103808 data_used: 7839744
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 546406400 unmapped: 95862784 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 546406400 unmapped: 95862784 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 546406400 unmapped: 95862784 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1990e2000/0x0/0x1bfc00000, data 0x12c3e49/0x14f0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 546406400 unmapped: 95862784 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x1990e2000/0x0/0x1bfc00000, data 0x12c3e49/0x14f0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8f7de000 session 0x562f8c3e41e0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8dfc8c00 session 0x562f8f7f23c0
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 546414592 unmapped: 95854592 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 ms_handle_reset con 0x562f8b7a8400 session 0x562f8c03a960
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315496 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540844032 unmapped: 101425152 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540844032 unmapped: 101425152 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540844032 unmapped: 101425152 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540844032 unmapped: 101425152 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540844032 unmapped: 101425152 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315496 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540844032 unmapped: 101425152 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540844032 unmapped: 101425152 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540844032 unmapped: 101425152 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540844032 unmapped: 101425152 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540844032 unmapped: 101425152 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315496 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540844032 unmapped: 101425152 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540844032 unmapped: 101425152 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540844032 unmapped: 101425152 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540844032 unmapped: 101425152 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540844032 unmapped: 101425152 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315496 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540844032 unmapped: 101425152 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540844032 unmapped: 101425152 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540844032 unmapped: 101425152 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540844032 unmapped: 101425152 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540844032 unmapped: 101425152 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315496 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540844032 unmapped: 101425152 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540844032 unmapped: 101425152 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540844032 unmapped: 101425152 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540844032 unmapped: 101425152 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540844032 unmapped: 101425152 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315496 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540844032 unmapped: 101425152 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540844032 unmapped: 101425152 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540852224 unmapped: 101416960 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540852224 unmapped: 101416960 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540852224 unmapped: 101416960 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315496 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540852224 unmapped: 101416960 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540852224 unmapped: 101416960 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540852224 unmapped: 101416960 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540852224 unmapped: 101416960 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540852224 unmapped: 101416960 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315496 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540860416 unmapped: 101408768 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540860416 unmapped: 101408768 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540860416 unmapped: 101408768 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540860416 unmapped: 101408768 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540860416 unmapped: 101408768 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315496 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540860416 unmapped: 101408768 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540860416 unmapped: 101408768 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540860416 unmapped: 101408768 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540868608 unmapped: 101400576 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540868608 unmapped: 101400576 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315496 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540868608 unmapped: 101400576 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540876800 unmapped: 101392384 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540876800 unmapped: 101392384 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540876800 unmapped: 101392384 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540876800 unmapped: 101392384 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315496 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540876800 unmapped: 101392384 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540884992 unmapped: 101384192 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540884992 unmapped: 101384192 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540884992 unmapped: 101384192 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540884992 unmapped: 101384192 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315496 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540884992 unmapped: 101384192 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540884992 unmapped: 101384192 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540884992 unmapped: 101384192 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540884992 unmapped: 101384192 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540893184 unmapped: 101376000 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315496 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540901376 unmapped: 101367808 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540901376 unmapped: 101367808 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540901376 unmapped: 101367808 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540901376 unmapped: 101367808 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540901376 unmapped: 101367808 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315496 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540901376 unmapped: 101367808 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540901376 unmapped: 101367808 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540901376 unmapped: 101367808 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540909568 unmapped: 101359616 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540909568 unmapped: 101359616 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315496 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540909568 unmapped: 101359616 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540909568 unmapped: 101359616 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540909568 unmapped: 101359616 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540909568 unmapped: 101359616 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540909568 unmapped: 101359616 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315496 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540917760 unmapped: 101351424 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540917760 unmapped: 101351424 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540917760 unmapped: 101351424 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540917760 unmapped: 101351424 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540917760 unmapped: 101351424 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315496 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540925952 unmapped: 101343232 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540925952 unmapped: 101343232 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540925952 unmapped: 101343232 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540934144 unmapped: 101335040 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540950528 unmapped: 101318656 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315496 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540950528 unmapped: 101318656 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540950528 unmapped: 101318656 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540950528 unmapped: 101318656 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540950528 unmapped: 101318656 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540950528 unmapped: 101318656 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315496 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540950528 unmapped: 101318656 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540950528 unmapped: 101318656 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540958720 unmapped: 101310464 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540958720 unmapped: 101310464 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540958720 unmapped: 101310464 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315496 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540958720 unmapped: 101310464 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540958720 unmapped: 101310464 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540958720 unmapped: 101310464 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540958720 unmapped: 101310464 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540958720 unmapped: 101310464 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315496 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540966912 unmapped: 101302272 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540966912 unmapped: 101302272 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540966912 unmapped: 101302272 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540966912 unmapped: 101302272 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540966912 unmapped: 101302272 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315496 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540966912 unmapped: 101302272 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540983296 unmapped: 101285888 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540983296 unmapped: 101285888 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540983296 unmapped: 101285888 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540983296 unmapped: 101285888 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315496 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540983296 unmapped: 101285888 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540983296 unmapped: 101285888 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540983296 unmapped: 101285888 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540991488 unmapped: 101277696 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540999680 unmapped: 101269504 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315496 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540999680 unmapped: 101269504 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541007872 unmapped: 101261312 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541007872 unmapped: 101261312 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541007872 unmapped: 101261312 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541007872 unmapped: 101261312 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315496 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541007872 unmapped: 101261312 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541007872 unmapped: 101261312 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541007872 unmapped: 101261312 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541007872 unmapped: 101261312 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541016064 unmapped: 101253120 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315496 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541016064 unmapped: 101253120 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541016064 unmapped: 101253120 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541016064 unmapped: 101253120 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541016064 unmapped: 101253120 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541016064 unmapped: 101253120 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315496 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541032448 unmapped: 101236736 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541032448 unmapped: 101236736 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541032448 unmapped: 101236736 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541032448 unmapped: 101236736 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541032448 unmapped: 101236736 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315496 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541032448 unmapped: 101236736 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541032448 unmapped: 101236736 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541032448 unmapped: 101236736 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541040640 unmapped: 101228544 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541040640 unmapped: 101228544 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315496 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541048832 unmapped: 101220352 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541048832 unmapped: 101220352 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541048832 unmapped: 101220352 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541048832 unmapped: 101220352 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541048832 unmapped: 101220352 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315496 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541048832 unmapped: 101220352 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541057024 unmapped: 101212160 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541057024 unmapped: 101212160 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541057024 unmapped: 101212160 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541057024 unmapped: 101212160 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315496 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541057024 unmapped: 101212160 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541057024 unmapped: 101212160 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541057024 unmapped: 101212160 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541057024 unmapped: 101212160 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6600.1 total, 600.0 interval#012Cumulative writes: 79K writes, 317K keys, 79K commit groups, 1.0 writes per commit group, ingest: 0.32 GB, 0.05 MB/s#012Cumulative WAL: 79K writes, 29K syncs, 2.68 writes per sync, written: 0.32 GB, 0.05 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2654 writes, 10K keys, 2654 commit groups, 1.0 writes per commit group, ingest: 11.22 MB, 0.02 MB/s#012Interval WAL: 2654 writes, 1071 syncs, 2.48 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541073408 unmapped: 101195776 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315496 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541073408 unmapped: 101195776 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541081600 unmapped: 101187584 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541081600 unmapped: 101187584 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541081600 unmapped: 101187584 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541089792 unmapped: 101179392 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315496 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541089792 unmapped: 101179392 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541097984 unmapped: 101171200 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541097984 unmapped: 101171200 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541106176 unmapped: 101163008 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541106176 unmapped: 101163008 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315496 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541106176 unmapped: 101163008 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541106176 unmapped: 101163008 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541106176 unmapped: 101163008 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541106176 unmapped: 101163008 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541106176 unmapped: 101163008 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315496 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541114368 unmapped: 101154816 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541122560 unmapped: 101146624 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541122560 unmapped: 101146624 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541122560 unmapped: 101146624 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541122560 unmapped: 101146624 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315496 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541122560 unmapped: 101146624 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541122560 unmapped: 101146624 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541122560 unmapped: 101146624 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541130752 unmapped: 101138432 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541138944 unmapped: 101130240 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa0000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315496 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541138944 unmapped: 101130240 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 186.972534180s of 187.068466187s, submitted: 37
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541155328 unmapped: 101113856 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541155328 unmapped: 101113856 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541245440 unmapped: 101023744 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541286400 unmapped: 100982784 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315320 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541286400 unmapped: 100982784 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541286400 unmapped: 100982784 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541286400 unmapped: 100982784 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541286400 unmapped: 100982784 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541286400 unmapped: 100982784 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315320 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541286400 unmapped: 100982784 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541286400 unmapped: 100982784 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541286400 unmapped: 100982784 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541286400 unmapped: 100982784 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541286400 unmapped: 100982784 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315320 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541286400 unmapped: 100982784 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541286400 unmapped: 100982784 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541286400 unmapped: 100982784 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541286400 unmapped: 100982784 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541286400 unmapped: 100982784 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315320 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541286400 unmapped: 100982784 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541286400 unmapped: 100982784 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541286400 unmapped: 100982784 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541286400 unmapped: 100982784 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541286400 unmapped: 100982784 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315320 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541286400 unmapped: 100982784 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541286400 unmapped: 100982784 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541286400 unmapped: 100982784 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541286400 unmapped: 100982784 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541286400 unmapped: 100982784 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315320 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541302784 unmapped: 100966400 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541302784 unmapped: 100966400 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541302784 unmapped: 100966400 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541302784 unmapped: 100966400 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541302784 unmapped: 100966400 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315320 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541302784 unmapped: 100966400 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541302784 unmapped: 100966400 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541302784 unmapped: 100966400 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541310976 unmapped: 100958208 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541310976 unmapped: 100958208 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315320 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541310976 unmapped: 100958208 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541310976 unmapped: 100958208 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541310976 unmapped: 100958208 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541310976 unmapped: 100958208 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541310976 unmapped: 100958208 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315320 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541310976 unmapped: 100958208 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541319168 unmapped: 100950016 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541319168 unmapped: 100950016 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541335552 unmapped: 100933632 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541335552 unmapped: 100933632 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315320 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541335552 unmapped: 100933632 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541335552 unmapped: 100933632 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541335552 unmapped: 100933632 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541335552 unmapped: 100933632 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:53:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:53:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:53:18.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541335552 unmapped: 100933632 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315320 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541343744 unmapped: 100925440 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541343744 unmapped: 100925440 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541343744 unmapped: 100925440 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541343744 unmapped: 100925440 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541343744 unmapped: 100925440 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315320 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541351936 unmapped: 100917248 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541351936 unmapped: 100917248 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541360128 unmapped: 100909056 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541360128 unmapped: 100909056 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541360128 unmapped: 100909056 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315320 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541360128 unmapped: 100909056 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541360128 unmapped: 100909056 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541360128 unmapped: 100909056 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541360128 unmapped: 100909056 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541360128 unmapped: 100909056 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315320 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541368320 unmapped: 100900864 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541368320 unmapped: 100900864 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541368320 unmapped: 100900864 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541368320 unmapped: 100900864 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541368320 unmapped: 100900864 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315320 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541376512 unmapped: 100892672 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541376512 unmapped: 100892672 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541376512 unmapped: 100892672 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541376512 unmapped: 100892672 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541384704 unmapped: 100884480 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315320 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541384704 unmapped: 100884480 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541392896 unmapped: 100876288 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541392896 unmapped: 100876288 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541392896 unmapped: 100876288 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541392896 unmapped: 100876288 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315320 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541392896 unmapped: 100876288 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541401088 unmapped: 100868096 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541409280 unmapped: 100859904 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541409280 unmapped: 100859904 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541409280 unmapped: 100859904 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315320 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541409280 unmapped: 100859904 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541409280 unmapped: 100859904 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541409280 unmapped: 100859904 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541409280 unmapped: 100859904 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541409280 unmapped: 100859904 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315320 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541417472 unmapped: 100851712 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541417472 unmapped: 100851712 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541417472 unmapped: 100851712 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541425664 unmapped: 100843520 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541425664 unmapped: 100843520 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315320 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541425664 unmapped: 100843520 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541425664 unmapped: 100843520 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541433856 unmapped: 100835328 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541433856 unmapped: 100835328 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541433856 unmapped: 100835328 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315320 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541433856 unmapped: 100835328 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541433856 unmapped: 100835328 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541433856 unmapped: 100835328 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541433856 unmapped: 100835328 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541433856 unmapped: 100835328 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315320 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541442048 unmapped: 100827136 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541450240 unmapped: 100818944 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541458432 unmapped: 100810752 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541458432 unmapped: 100810752 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541458432 unmapped: 100810752 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315320 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541458432 unmapped: 100810752 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541458432 unmapped: 100810752 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541458432 unmapped: 100810752 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541458432 unmapped: 100810752 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541466624 unmapped: 100802560 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315320 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541466624 unmapped: 100802560 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541466624 unmapped: 100802560 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541466624 unmapped: 100802560 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541466624 unmapped: 100802560 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541466624 unmapped: 100802560 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315320 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541466624 unmapped: 100802560 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541483008 unmapped: 100786176 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541491200 unmapped: 100777984 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541499392 unmapped: 100769792 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541499392 unmapped: 100769792 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315320 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541507584 unmapped: 100761600 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541507584 unmapped: 100761600 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541507584 unmapped: 100761600 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541507584 unmapped: 100761600 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541507584 unmapped: 100761600 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315320 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541515776 unmapped: 100753408 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541515776 unmapped: 100753408 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541515776 unmapped: 100753408 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541515776 unmapped: 100753408 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541515776 unmapped: 100753408 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315320 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541515776 unmapped: 100753408 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541523968 unmapped: 100745216 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541532160 unmapped: 100737024 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541540352 unmapped: 100728832 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541540352 unmapped: 100728832 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315320 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541540352 unmapped: 100728832 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541540352 unmapped: 100728832 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541540352 unmapped: 100728832 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541548544 unmapped: 100720640 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541548544 unmapped: 100720640 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315320 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541556736 unmapped: 100712448 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541556736 unmapped: 100712448 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541556736 unmapped: 100712448 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541556736 unmapped: 100712448 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541556736 unmapped: 100712448 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315320 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541556736 unmapped: 100712448 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541556736 unmapped: 100712448 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541556736 unmapped: 100712448 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541564928 unmapped: 100704256 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541564928 unmapped: 100704256 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315320 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541564928 unmapped: 100704256 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541564928 unmapped: 100704256 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541564928 unmapped: 100704256 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541564928 unmapped: 100704256 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541564928 unmapped: 100704256 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315320 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541564928 unmapped: 100704256 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541581312 unmapped: 100687872 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541597696 unmapped: 100671488 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541597696 unmapped: 100671488 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541597696 unmapped: 100671488 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315320 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541597696 unmapped: 100671488 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541597696 unmapped: 100671488 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541597696 unmapped: 100671488 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541597696 unmapped: 100671488 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541597696 unmapped: 100671488 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315320 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541605888 unmapped: 100663296 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541614080 unmapped: 100655104 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541614080 unmapped: 100655104 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541614080 unmapped: 100655104 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541614080 unmapped: 100655104 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315320 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541614080 unmapped: 100655104 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541614080 unmapped: 100655104 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541630464 unmapped: 100638720 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541630464 unmapped: 100638720 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541630464 unmapped: 100638720 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315320 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541638656 unmapped: 100630528 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541638656 unmapped: 100630528 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541638656 unmapped: 100630528 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541638656 unmapped: 100630528 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541638656 unmapped: 100630528 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315320 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541638656 unmapped: 100630528 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541638656 unmapped: 100630528 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541638656 unmapped: 100630528 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541638656 unmapped: 100630528 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541638656 unmapped: 100630528 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315320 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541646848 unmapped: 100622336 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541646848 unmapped: 100622336 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541663232 unmapped: 100605952 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541663232 unmapped: 100605952 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541671424 unmapped: 100597760 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315320 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541671424 unmapped: 100597760 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541671424 unmapped: 100597760 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541671424 unmapped: 100597760 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541671424 unmapped: 100597760 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541671424 unmapped: 100597760 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315320 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541679616 unmapped: 100589568 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541679616 unmapped: 100589568 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541679616 unmapped: 100589568 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541687808 unmapped: 100581376 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541687808 unmapped: 100581376 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315320 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541687808 unmapped: 100581376 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541687808 unmapped: 100581376 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541687808 unmapped: 100581376 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541704192 unmapped: 100564992 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541704192 unmapped: 100564992 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315320 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541704192 unmapped: 100564992 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541704192 unmapped: 100564992 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541704192 unmapped: 100564992 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541704192 unmapped: 100564992 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541704192 unmapped: 100564992 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315320 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541704192 unmapped: 100564992 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541712384 unmapped: 100556800 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541712384 unmapped: 100556800 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541712384 unmapped: 100556800 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541712384 unmapped: 100556800 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315320 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541712384 unmapped: 100556800 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541720576 unmapped: 100548608 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541720576 unmapped: 100548608 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541720576 unmapped: 100548608 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541728768 unmapped: 100540416 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315320 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541728768 unmapped: 100540416 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541728768 unmapped: 100540416 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541728768 unmapped: 100540416 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541736960 unmapped: 100532224 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541745152 unmapped: 100524032 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315320 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541745152 unmapped: 100524032 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541745152 unmapped: 100524032 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541745152 unmapped: 100524032 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541753344 unmapped: 100515840 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541753344 unmapped: 100515840 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315320 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541753344 unmapped: 100515840 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541753344 unmapped: 100515840 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541753344 unmapped: 100515840 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541753344 unmapped: 100515840 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541753344 unmapped: 100515840 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315320 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541753344 unmapped: 100515840 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541761536 unmapped: 100507648 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541769728 unmapped: 100499456 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541777920 unmapped: 100491264 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541777920 unmapped: 100491264 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315320 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541777920 unmapped: 100491264 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541777920 unmapped: 100491264 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541777920 unmapped: 100491264 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541777920 unmapped: 100491264 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541786112 unmapped: 100483072 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315320 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541786112 unmapped: 100483072 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541786112 unmapped: 100483072 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541794304 unmapped: 100474880 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541794304 unmapped: 100474880 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541794304 unmapped: 100474880 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315320 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541794304 unmapped: 100474880 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541810688 unmapped: 100458496 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541810688 unmapped: 100458496 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541818880 unmapped: 100450304 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: osd.2 418 heartbeat osd_stat(store_statfs(0x199aa1000/0x0/0x1bfc00000, data 0xcd2e16/0xefd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2525f9c6), peers [0,1] op hist [])
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 541859840 unmapped: 100409344 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: bluestore.MempoolThread(0x562f88305b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315320 data_alloc: 218103808 data_used: 2916352
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: do_command 'config diff' '{prefix=config diff}'
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: do_command 'config show' '{prefix=config show}'
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: do_command 'counter dump' '{prefix=counter dump}'
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: do_command 'counter schema' '{prefix=counter schema}'
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540966912 unmapped: 101302272 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: prioritycache tune_memory target: 4294967296 mapped: 540876800 unmapped: 101392384 heap: 642269184 old mem: 2845415832 new mem: 2845415832
Jan 20 10:53:18 np0005588920 ceph-osd[79820]: do_command 'log dump' '{prefix=log dump}'
Jan 20 10:53:18 np0005588920 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 20 10:53:18 np0005588920 nova_compute[226886]: 2026-01-20 15:53:18.674 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:53:18 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:53:18 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:53:18 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:53:18.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:53:18 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0) v1
Jan 20 10:53:18 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/149186993' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 20 10:53:19 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0) v1
Jan 20 10:53:19 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/440145304' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 20 10:53:19 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:53:19 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0) v1
Jan 20 10:53:19 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3403921456' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Jan 20 10:53:19 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush dump"} v 0) v1
Jan 20 10:53:19 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/947707734' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Jan 20 10:53:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:53:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:53:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:53:20.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:53:20 np0005588920 nova_compute[226886]: 2026-01-20 15:53:20.311 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:53:20 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "node ls"} v 0) v1
Jan 20 10:53:20 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4011182796' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Jan 20 10:53:20 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:53:20 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:53:20 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:53:20.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:53:20 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0) v1
Jan 20 10:53:20 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/685726112' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Jan 20 10:53:21 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush dump"} v 0) v1
Jan 20 10:53:21 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/952246884' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Jan 20 10:53:21 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0) v1
Jan 20 10:53:21 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4162090687' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Jan 20 10:53:21 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush rule ls"} v 0) v1
Jan 20 10:53:21 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1793993722' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Jan 20 10:53:21 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0) v1
Jan 20 10:53:21 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/728917619' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Jan 20 10:53:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0) v1
Jan 20 10:53:22 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3519584347' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Jan 20 10:53:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:53:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:53:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:53:22.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:53:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0) v1
Jan 20 10:53:22 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2141415677' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Jan 20 10:53:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0) v1
Jan 20 10:53:22 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/454954369' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Jan 20 10:53:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0) v1
Jan 20 10:53:22 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2492913358' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Jan 20 10:53:22 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:53:22 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:53:22 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:53:22.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:53:22 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0) v1
Jan 20 10:53:22 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/722851636' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Jan 20 10:53:22 np0005588920 systemd[1]: Starting Hostname Service...
Jan 20 10:53:23 np0005588920 systemd[1]: Started Hostname Service.
Jan 20 10:53:23 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0) v1
Jan 20 10:53:23 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3368371534' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Jan 20 10:53:23 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd metadata"} v 0) v1
Jan 20 10:53:23 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/708267248' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Jan 20 10:53:23 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0) v1
Jan 20 10:53:23 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2239306666' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Jan 20 10:53:23 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd utilization"} v 0) v1
Jan 20 10:53:23 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4113140527' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Jan 20 10:53:23 np0005588920 nova_compute[226886]: 2026-01-20 15:53:23.677 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:53:23 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0) v1
Jan 20 10:53:23 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3271178705' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Jan 20 10:53:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:53:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:53:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:53:24.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:53:24 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:53:24 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0) v1
Jan 20 10:53:24 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/134214396' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Jan 20 10:53:24 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:53:24 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:53:24 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:53:24.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:53:25 np0005588920 nova_compute[226886]: 2026-01-20 15:53:25.313 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:53:25 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 20 10:53:25 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 20 10:53:25 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "quorum_status"} v 0) v1
Jan 20 10:53:25 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1694053069' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Jan 20 10:53:25 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 20 10:53:25 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 20 10:53:26 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "versions"} v 0) v1
Jan 20 10:53:26 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/539377997' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Jan 20 10:53:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:53:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:53:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:53:26.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:53:26 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0) v1
Jan 20 10:53:26 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1811606035' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 20 10:53:26 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0) v1
Jan 20 10:53:26 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/331790073' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Jan 20 10:53:26 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:53:26 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:53:26 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:53:26.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:53:27 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 20 10:53:27 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 20 10:53:27 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 20 10:53:27 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 20 10:53:28 np0005588920 ceph-mon[77148]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config dump"} v 0) v1
Jan 20 10:53:28 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1290319706' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Jan 20 10:53:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:53:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:53:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.101 - anonymous [20/Jan/2026:15:53:28.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:53:28 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 20 10:53:28 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 20 10:53:28 np0005588920 nova_compute[226886]: 2026-01-20 15:53:28.721 226890 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:53:28 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 20 10:53:28 np0005588920 ceph-mon[77148]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 20 10:53:28 np0005588920 radosgw[83324]: ====== starting new request req=0x7f0a18df96f0 =====
Jan 20 10:53:28 np0005588920 radosgw[83324]: ====== req done req=0x7f0a18df96f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:53:28 np0005588920 radosgw[83324]: beast: 0x7f0a18df96f0: 192.168.122.100 - anonymous [20/Jan/2026:15:53:28.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
